Book Description

Deep learning is a branch of machine learning based on algorithms that learn multiple levels of abstraction. Neural networks, the core of deep learning, are used for predictive analytics, computer vision, natural language processing, time series forecasting, and many other complex tasks.

This book is aimed at developers, data analysts, machine learning practitioners, and deep learning enthusiasts who want to harness the power of TensorFlow, together with other open source Python libraries, to build powerful, robust, and accurate predictive models.

In this book, you will learn how to develop deep learning applications for machine learning systems using feed-forward neural networks, convolutional neural networks, recurrent neural networks, autoencoders, and factorization machines, and how to carry out deep learning programming on GPUs in a distributed manner. By the end, you will have a solid understanding of machine learning techniques and the skills to apply them to real-world projects.

Table of Contents

Preface
Chapter 1: Getting Started with Deep Learning
    A soft introduction to machine learning
        Supervised learning
        Unbalanced data
        Unsupervised learning
        Reinforcement learning
    What is deep learning?
    Artificial neural networks
        The biological neurons
        The artificial neuron
    How does an ANN learn?
        ANNs and the backpropagation algorithm
        Weight optimization
        Stochastic gradient descent
    Neural network architectures
        Deep Neural Networks (DNNs)
        Multilayer perceptron
        Deep Belief Networks (DBNs)
        Convolutional Neural Networks (CNNs)
        AutoEncoders
        Recurrent Neural Networks (RNNs)
        Emergent architectures
    Deep learning frameworks
    Summary
Chapter 2: A First Look at TensorFlow
    A general overview of TensorFlow
    What's new in TensorFlow v1.6?
        Nvidia GPU support optimized
        Introducing TensorFlow Lite
        Eager execution
        Optimized Accelerated Linear Algebra (XLA)
    Installing and configuring TensorFlow
    TensorFlow computational graph
    TensorFlow code structure
        Eager execution with TensorFlow
    Data model in TensorFlow
        Tensor
        Rank and shape
        Data type
        Variables
        Fetches
        Feeds and placeholders
    Visualizing computations through TensorBoard
        How does TensorBoard work?
    Linear regression and beyond
        Linear regression revisited for a real dataset
    Summary
Chapter 3: Feed-Forward Neural Networks with TensorFlow
    Feed-forward neural networks (FFNNs)
        Feed-forward and backpropagation
        Weights and biases
        Activation functions
            Using sigmoid
            Using tanh
            Using ReLU
            Using softmax
    Implementing a feed-forward neural network
        Exploring the MNIST dataset
        Softmax classifier
    Implementing a multilayer perceptron (MLP)
        Training an MLP
        Using MLPs
        Dataset description
        Preprocessing
        A TensorFlow implementation of MLP for client-subscription assessment
Chapter 4: Convolutional Neural Networks
Chapter 5: Optimizing TensorFlow Autoencoders
Chapter 6: Recurrent Neural Networks
Chapter 7: Heterogeneous and Distributed Computing
Chapter 8: Advanced TensorFlow Programming
Chapter 9: Recommendation Systems Using Factorization Machines
Chapter 10: Reinforcement Learning
Other Books You May Enjoy
Index