About This Book

Machine learning is eating the software world. In Sebastian Raschka's bestselling Python Machine Learning (Second Edition), you will learn about the cutting edge of machine learning, neural networks, and deep learning.

Written by Sebastian Raschka and Vahid Mirjalili, this edition has been updated and expanded to cover the latest open source technologies, including scikit-learn, Keras, and TensorFlow. It provides the practical knowledge and techniques you need to build effective machine learning and deep learning applications in Python.

Sebastian Raschka and Vahid Mirjalili draw on their unique insight and expertise to introduce you to machine learning and deep learning algorithms before moving on to advanced topics in data analysis. The book combines the theoretical principles of machine learning with a hands-on coding approach, giving you a thorough grasp of machine learning theory and its implementation in Python.

Contents

Preface
Chapter 1: Giving Computers the Ability to Learn from Data
    Building intelligent machines to transform data into knowledge
    The three different types of machine learning
        Making predictions about the future with supervised learning
            Classification for predicting class labels
            Regression for predicting continuous outcomes
        Solving interactive problems with reinforcement learning
        Discovering hidden structures with unsupervised learning
            Finding subgroups with clustering
            Dimensionality reduction for data compression
    Introduction to the basic terminology and notations
    A roadmap for building machine learning systems
        Preprocessing - getting data into shape
        Training and selecting a predictive model
        Evaluating models and predicting unseen data instances
    Using Python for machine learning
        Installing Python and packages from the Python Package Index
        Using the Anaconda Python distribution and package manager
        Packages for scientific computing, data science, and machine learning
    Summary
Chapter 2: Training Simple Machine Learning Algorithms for Classification
    Artificial neurons - a brief glimpse into the early history of machine learning
        The formal definition of an artificial neuron
        The perceptron learning rule
    Implementing a perceptron learning algorithm in Python
        An object-oriented perceptron API
        Training a perceptron model on the Iris dataset
    Adaptive linear neurons and the convergence of learning
        Minimizing cost functions with gradient descent
        Implementing Adaline in Python
        Improving gradient descent through feature scaling
        Large-scale machine learning and stochastic gradient descent
    Summary
Chapter 3: A Tour of Machine Learning Classifiers Using scikit-learn
    Choosing a classification algorithm
    First steps with scikit-learn - training a perceptron
    Modeling class probabilities via logistic regression
        Logistic regression intuition and conditional probabilities
        Learning the weights of the logistic cost function
        Converting an Adaline implementation into an algorithm for logistic regression
        Training a logistic regression model with scikit-learn
        Tackling overfitting via regularization
    Maximum margin classification with support vector machines
        Maximum margin intuition
        Dealing with a nonlinearly separable case using slack variables
        Alternative implementations in scikit-learn
    Solving nonlinear problems using a kernel SVM
        Kernel methods for linearly inseparable data
        Using the kernel trick to find separating hyperplanes in high-dimensional space
    Decision tree learning
        Maximizing information gain - getting the most bang for your buck
        Building a decision tree
        Combining multiple decision trees via random forests
    K-nearest neighbors - a lazy learning algorithm
    Summary
Chapter 4: Building Good Training Sets - Data Preprocessing
    Dealing with missing data
        Identifying missing values in tabular data
        Eliminating samples or features with missing values
        Imputing missing values
        Understanding the scikit-learn estimator API
……