Title | Support Vector Machines (English edition, Shannon Information Science Classics) |
Author | Ingo Steinwart, Andreas Christmann (Germany) |
Publisher | 世界图书出版公司 (World Publishing Corporation) |
Synopsis

This book explains the principles that make support vector machines (SVMs) a successful modeling and prediction tool for a variety of applications. It does so by presenting the basic concepts of SVMs together with their latest developments and current research questions. The book discusses at least three reasons for the success of SVMs: their ability to learn well with only a small number of free parameters, their robustness against several types of model violations and outliers, and their computational efficiency compared with several other methods (a minimal code sketch of the free parameters follows the table of contents below).

Contents

Preface
Reading Guide
1 Introduction
1.1 Statistical Learning
1.2 Support Vector Machines: An Overview
1.3 History of SVMs and Geometrical Interpretation
1.4 Alternatives to SVMs
2 Loss Functions and Their Risks
2.1 Loss Functions: Definition and Examples
2.2 Basic Properties of Loss Functions and Their Risks
2.3 Margin-Based Losses for Classification Problems
2.4 Distance-Based Losses for Regression Problems
2.5 Further Reading and Advanced Topics
2.6 Summary
2.7 Exercises
3 Surrogate Loss Functions (*)
3.1 Inner Risks and the Calibration Function
3.2 Asymptotic Theory of Surrogate Losses
3.3 Inequalities between Excess Risks
3.4 Surrogates for Unweighted Binary Classification
3.5 Surrogates for Weighted Binary Classification
3.6 Template Loss Functions
3.7 Surrogate Losses for Regression Problems
3.8 Surrogate Losses for the Density Level Problem
3.9 Self-Calibrated Loss Functions
3.10 Further Reading and Advanced Topics
3.11 Summary
3.12 Exercises
4 Kernels and Reproducing Kernel Hilbert Spaces
4.1 Basic Properties and Examples of Kernels
4.2 The Reproducing Kernel Hilbert Space of a Kernel
4.3 Properties of RKHSs
4.4 Gaussian Kernels and Their RKHSs
4.5 Mercer's Theorem (*)
4.6 Large Reproducing Kernel Hilbert Spaces
4.7 Further Reading and Advanced Topics
4.8 Summary
4.9 Exercises
5 Infinite-Sample Versions of Support Vector Machines
5.1 Existence and Uniqueness of SVM Solutions
5.2 A General Representer Theorem
5.3 Stability of Infinite-Sample SVMs
5.4 Behavior for Small Regularization Parameters
5.5 Approximation Error of RKHSs
5.6 Further Reading and Advanced Topics
5.7 Summary
5.8 Exercises
6 Basic Statistical Analysis of SVMs
6.1 Notions of Statistical Learning
6.2 Basic Concentration Inequalities
6.3 Statistical Analysis of Empirical Risk Minimization
6.4 Basic Oracle Inequalities for SVMs
6.5 Data-Dependent Parameter Selection for SVMs
6.6 Further Reading and Advanced Topics
6.7 Summary
6.8 Exercises
7 Advanced Statistical Analysis of SVMs (*)
7.1 Why Do We Need a Refined Analysis?
7.2 A Refined Oracle Inequality for ERM
7.3 Some Advanced Machinery
7.4 Refined Oracle Inequalities for SVMs
7.5 Some Bounds on Average Entropy Numbers
7.6 Further Reading and Advanced Topics
7.7 Summary
7.8 Exercises
8 Support Vector Machines for Classification
8.1 Basic Oracle Inequalities for Classifying with SVMs
8.2 Classifying with SVMs Using Gaussian Kernels
8.3 Advanced Concentration Results for SVMs (*)
8.4 Sparseness of SVMs Using the Hinge Loss
8.5 Classifying with Other Margin-Based Losses (*)
8.6 Further Reading and Advanced Topics
8.7 Summary
8.8 Exercises
9 Support Vector Machines for Regression
9.1 Introduction
9.2 Consistency
9.3 SVMs for Quantile Regression
9.4 Numerical Results for Quantile Regression
9.5 Median Regression with the ε-Insensitive Loss (*)
9.6 Further Reading and Advanced Topics
9.7 Summary
9.8 Exercises
10 Robustness
10.1 Motivation
10.2 Approaches to Robust Statistics
10.3 Robustness of SVMs for Classification
10.4 Robustness of SVMs for Regression (*)
10.5 Robust Learning from Bites (*)
10.6 Further Reading and Advanced Topics
10.7 Summary
10.8 Exercises
11 Computational Aspects
11.1 SVMs, Convex Programs, and Duality
11.2 Implementation Techniques
11.3 Determination of Hyperparameters
11.4 Software Packages
11.5 Further Reading and Advanced Topics
11.6 Summary
11.7 Exercises
12 Data Mining
12.1 Introduction
12.2 CRISP-DM Strategy
12.3 Role of SVMs in Data Mining
12.4 Software Tools for Data Mining
12.5 Further Reading and Advanced Topics
12.6 Summary
12.7 Exercises
Appendix
A.1 Basic Equations, Ineq…
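The synopsis' claim about learning with only a few free parameters can be made concrete with a minimal sketch (not from the book): a Gaussian-kernel SVM classifier is fully specified by the regularization constant C and the kernel width gamma. The scikit-learn calls, the synthetic dataset, and the parameter values below are illustrative assumptions, not the authors' recommendations.

```python
# Minimal illustrative sketch: a Gaussian (RBF) kernel SVM has only two
# free parameters, the regularization constant C and the kernel width gamma.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic binary classification data (hypothetical, for demonstration only).
X, y = make_classification(n_samples=300, n_features=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Hinge-loss SVM with Gaussian kernel; C and gamma are the only
# hyperparameters that have to be chosen.
clf = SVC(kernel="rbf", C=1.0, gamma=0.2)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```

In practice, C and gamma would be chosen in a data-dependent way, for example by the cross-validation-style parameter selection discussed in Section 6.5 and Chapter 11.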