Title | Information-Theoretic Methods for Machine Learning (full-color English edition; Shannon Information Science Classics) |
Category | |
Author | José Príncipe (US) |
Publisher | World Publishing Corporation |
Description | Overview: Information theory and machine learning are both branches that span computer science and applied mathematics, and the two interdisciplinary fields share many similarities in origin and application. Information theory has the more mature theoretical framework; machine learning has grown popular in recent years, with its theory and applications expanding far faster and showing no sign of saturation. The two fields overlap, but the traffic runs mainly one way: machine learning borrows information-theoretic methods to extend its theory and applications, most typically by drawing on information theory to create and improve learning algorithms. The book comprises eleven chapters.

Contents

1 Information Theory, Machine Learning, and Reproducing Kernel Hilbert Spaces
  1.1 Introduction
  1.2 Information Theory
  1.3 Entropy
  1.4 Mutual Information
  1.5 Relative Entropy and Kullback-Leibler Divergence
  1.6 Information Theory beyond Communications
  1.7 Adaptive Model Building
  1.8 Information-Theoretic Learning
  1.9 ITL as a Unifying Learning Paradigm
  1.10 Reproducing Kernel Hilbert Spaces
  1.11 RKHS and ITL
  1.12 Conclusions
2 Renyi's Entropy, Divergence and Their Nonparametric Estimators (Chapter Coauthors: Dongxin Xu and Deniz Erdogmus)
  2.1 Introduction
  2.2 Definition and Interpretation of Renyi's Entropy
  2.3 Quadratic Renyi's Entropy Estimator
  2.4 Properties of Renyi's Nonparametric Entropy Estimators
  2.5 Bias and Variance of the Information Potential Estimator
  2.6 Physical Interpretation of Renyi's Entropy Kernel Estimators
  2.7 Extension to α-Information Potential with Arbitrary Kernels
  2.8 Renyi's Divergence and Mutual Information
  2.9 Quadratic Divergences and Mutual Information
  2.10 Information Potentials and Forces in the Joint Space
  2.11 Fast Computation of IP and CIP
  2.12 Conclusion
3 Adaptive Information Filtering with Error Entropy and Error Correntropy Criteria (Chapter Coauthors: Deniz Erdogmus and Weifeng Liu)
  3.1 Introduction
  3.2 The Error Entropy Criterion (EEC) for Adaptation
  3.3 Understanding the Error Entropy Criterion
  3.4 Minimum Error Entropy Algorithm
  3.5 Analysis of MEE Performance Surface
  3.6 Error Entropy, Correntropy, and M Estimation
  3.7 Correntropy Induced Metric and M-Estimation
  3.8 Normalized Information Potential as a Pseudometric
  3.9 Adaptation of the Kernel Size in Adaptive Filtering
  3.10 Conclusions
4 Algorithms for Entropy and Correntropy Adaptation with Applications to Linear Systems (Chapter Coauthors: Deniz Erdogmus, Seungju Han and Abhishek Singh)
  4.1 Introduction
  4.2 Recursive Information Potential for MEE (MEE-RIP)
  4.3 Stochastic Information Gradient for MEE (MEE-SIG)
  4.4 Self-Adjusting Stepsize for MEE (MEE-SAS)
  4.5 Normalized MEE (NMEE)
  4.6 Fixed-Point MEE (MEE-FP)
  4.7 Fast Gauss Transform in MEE Adaptation
  4.8 Incomplete Cholesky Decomposition for MEE
  4.9 Linear Filter Adaptation with MSE, MEE and MCC
  4.10 Conclusion
5 Nonlinear Adaptive Filtering with MEE, MCC and Applications (Chapter Coauthors: Deniz Erdogmus, Rodney Morejon and Weifeng Liu)
  5.1 Introduction
  5.2 Backpropagation of Information Forces in MLP Training
  5.3 Advanced Search Methods for Nonlinear Systems
  5.4 ITL Advanced Search Algorithms
  5.5 Application: Prediction of the Mackey-Glass Chaotic Time Series
  5.6 Application: Nonlinear Channel Equalization
  5.7 Error Correntropy Criterion (ECC) in Regression
  5.8 Adaptive Kernel Size in System Identification and Tracking
  5.9 Conclusions
6 Classification with EEC, Divergence Measures and Error Bounds (Chapter Coauthors: Deniz Erdogmus, Dongxin Xu and Kenneth Hild II)
  6.1 Introduction
  6.2 Brief Review of Classification
  6.3 Error Entropy Criterion in Classification
  6.4 Nonparametric Classifiers
  6.5 Classification with Information Divergences
  6.6 ITL Algorithms for Divergence and Mutual Information
    6.6.1 Case Study: Automatic Target Recognition (ATR) with ITL
  6.7 The Role of ITL Feature Extraction in Classification
  6.8 Error Bounds for Classification
  6.9 Conclusions
7 Clustering with ITL Principles (Chapter Coauthors: Robert Jenssen and Sudhir Rao)
  7.1 Introduction
  7.2 Information-Theoretic Clustering
  7.3 Differential Clustering Using Renyi's Entropy
  7.4 The Clustering Evaluation Function
  7.5 A Gradient Algorithm for Clustering with DCS
  7.6 Mean Shift Algorithms and Renyi's Entropy
  7.7 Graph-Theoretic Clustering with ITL
  7.8 Information Cut for Clustering
  7.9 Conclusion
8 Se |
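The description's central claim — that learning algorithms can be built from information-theoretic quantities — can be made concrete with a short sketch. The following stdlib-only Python is an illustration, not code from the book (the function names are ours): Shannon entropy and Kullback-Leibler divergence as defined in chapter 1, and the pairwise Gaussian-kernel estimator of Rényi's quadratic entropy from chapter 2, whose inner double sum is what the book calls the information potential.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i, in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Relative entropy D(p||q) = sum_i p_i log(p_i / q_i), in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def renyi2_entropy_estimate(samples, sigma=1.0):
    """Nonparametric estimate of Renyi's quadratic entropy from samples:
    H2 = -log(IP), where the information potential IP is the average of
    Gaussian kernels of width sigma*sqrt(2) over all sample pairs."""
    n = len(samples)
    var = 2.0 * sigma * sigma              # variance of the pairwise kernel
    norm = 1.0 / math.sqrt(2.0 * math.pi * var)
    ip = sum(norm * math.exp(-(xi - xj) ** 2 / (2.0 * var))
             for xi in samples for xj in samples) / n ** 2
    return -math.log(ip)

# A fair coin has maximal entropy; a biased one has less.
print(shannon_entropy([0.5, 0.5]))   # = log 2 nats
print(shannon_entropy([0.9, 0.1]))   # smaller
# Spread-out samples yield a higher quadratic-entropy estimate
# than tightly clustered ones.
print(renyi2_entropy_estimate([-5.0, 0.0, 5.0]))
print(renyi2_entropy_estimate([-0.1, 0.0, 0.1]))
```

The quadratic (α = 2) case is the one the book emphasizes because its kernel estimator needs no explicit density estimate — only pairwise interactions between samples, which is what makes the "information forces" interpretation of chapter 2 possible.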