Title | Statistical Learning from a Regression Perspective (English edition)
Category | Science and Technology - Natural Sciences - Mathematics
Author | R. A. Berk (USA)
Publisher | World Publishing Corporation
Description | R. A. Berk, the author of Statistical Learning from a Regression Perspective, is a professor of statistics at the University of Pennsylvania. His research interests are wide-ranging, and he has deep expertise in both the social and the natural sciences. The book concentrates on the applied side of statistical learning, with practical examples in every chapter, and is suitable as a reference for graduate students and researchers in statistics, the social sciences, the life sciences, and related fields. (An illustrative code sketch of some of the methods covered follows the table of contents below.)

Table of Contents

Preface
1 Statistical Learning as a Regression Problem
  1.1 Getting Started
  1.2 Setting the Regression Context
  1.3 The Transition to Statistical Learning
    1.3.1 Some Goals of Statistical Learning
    1.3.2 Statistical Inference
    1.3.3 Some Initial Cautions
    1.3.4 A Cartoon Illustration
    1.3.5 A Taste of Things to Come
  1.4 Some Initial Concepts and Definitions
    1.4.1 Overall Goals
    1.4.2 Loss Functions and Related Concepts
    1.4.3 Linear Estimators
    1.4.4 Degrees of Freedom
    1.4.5 Model Evaluation
    1.4.6 Model Selection
    1.4.7 Basis Functions
  1.5 Some Common Themes
  1.6 Summary and Conclusions
2 Regression Splines and Regression Smoothers
  2.1 Introduction
  2.2 Regression Splines
    2.2.1 Applying a Piecewise Linear Basis
    2.2.2 Polynomial Regression Splines
    2.2.3 Natural Cubic Splines
    2.2.4 B-Splines
  2.3 Penalized Smoothing
    2.3.1 Shrinkage
    2.3.2 Shrinkage and Statistical Inference
    2.3.3 Shrinkage: So What?
  2.4 Smoothing Splines
    2.4.1 An Illustration
  2.5 Locally Weighted Regression as a Smoother
    2.5.1 Nearest Neighbor Methods
    2.5.2 Locally Weighted Regression
  2.6 Smoothers for Multiple Predictors
    2.6.1 Smoothing in Two Dimensions
    2.6.2 The Generalized Additive Model
  2.7 Smoothers with Categorical Variables
    2.7.1 An Illustration
  2.8 Locally Adaptive Smoothers
  2.9 The Role of Statistical Inference
    2.9.1 Some Apparent Prerequisites
    2.9.2 Confidence Intervals
    2.9.3 Statistical Tests
    2.9.4 Can Asymptotics Help?
  2.10 Software Issues
  2.11 Summary and Conclusions
3 Classification and Regression Trees (CART)
  3.1 Introduction
  3.2 An Overview of Recursive Partitioning with CART
    3.2.1 Tree Diagrams
    3.2.2 Classification and Forecasting with CART
    3.2.3 Confusion Tables
    3.2.4 CART as an Adaptive Nearest Neighbor Method
    3.2.5 What CART Needs to Do
  3.3 Splitting a Node
  3.4 More on Classification
    3.4.1 Fitted Values and Related Terms
    3.4.2 An Example
  3.5 Classification Errors and Costs
    3.5.1 Default Costs in CART
    3.5.2 Prior Probabilities and Costs
  3.6 Pruning
    3.6.1 Impurity Versus Rα(T)
  3.7 Missing Data
    3.7.1 Missing Data with CART
  3.8 Statistical Inference with CART
  3.9 Classification Versus Forecasting
  3.10 Varying the Prior, Costs, and the Complexity Penalty
  3.11 An Example with Three Response Categories
  3.12 CART with Highly Skewed Response Distributions
  3.13 Some Cautions in Interpreting CART Results
    3.13.1 Model Bias
    3.13.2 Model Variance
  3.14 Regression Trees
    3.14.1 An Illustration
    3.14.2 Some Extensions
    3.14.3 Multivariate Adaptive Regression Splines (MARS)
  3.15 Software Issues
  3.16 Summary and Conclusions
4 Bagging
  4.1 Introduction
  4.2 Overfitting and Cross-Validation
  4.3 Bagging as an Algorithm
    4.3.1 Margins
    4.3.2 Out-Of-Bag Observations
  4.4 Some Thinking on Why Bagging Works
    4.4.1 More on Instability in CART
    4.4.2 How Bagging Can Help
    4.4.3 A Somewhat More Formal Explanation
  4.5 Some Limitations of Bagging
    4.5.1 Sometimes Bagging Does Not Help
    4.5.2 Sometimes Bagging Can Make the Bias Worse
    4.5.3 Sometimes Bagging Can Make the Variance Worse
    4.5.4 Losing the Trees for the Forest
    4.5.5 Bagging Is Only an Algorithm
  4.6 An Example
  4.7 Bagging a Quantitative Response Variable
  4.8 Software Considerations
  4.9 Summary and Conclusions
5 Random Forests
  5.1 Introduction and Overview
    5.1.1 Unpacking How Random Forests Works
  5.2 An Initial Illustration
  5.3 A Few Formalities
    5.3.1 What Is a Random Forest?
    5.3.2 Margins and Generalization Error for Classifiers in General
    5.3.3 Generalization Error for Random Forests
    5.3.4 The Stren…
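The chapters listed above cover regression splines and smoothers, CART, bagging, and random forests. As a rough illustration only, the sketch below fits a penalized regression spline and a random forest to synthetic data using scikit-learn in Python; the book itself works in R, and the data, function choices, and parameter values here are my own assumptions rather than examples from the book.

```python
# Minimal sketch, assuming scikit-learn is installed: a penalized regression
# spline (cf. Sections 2.2-2.4 on splines, shrinkage, and smoothing) and a
# random forest (cf. Chapters 3-5 on CART, bagging, and random forests),
# both fit to synthetic one-predictor data.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import SplineTransformer
from sklearn.linear_model import Ridge
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Synthetic regression problem: a smooth signal plus noise.
X = rng.uniform(0, 10, size=(500, 1))
y = np.sin(X[:, 0]) + 0.1 * X[:, 0] + rng.normal(scale=0.3, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Penalized regression spline: a rich B-spline basis expansion kept in
# check by a ridge (shrinkage) penalty.
spline_model = make_pipeline(
    SplineTransformer(degree=3, n_knots=20),
    Ridge(alpha=1.0),
)
spline_model.fit(X_train, y_train)

# Random forest: an ensemble of trees grown on bootstrap samples, averaged
# to reduce the variance of any single tree.
forest = RandomForestRegressor(n_estimators=300, random_state=0)
forest.fit(X_train, y_train)

for name, model in [("penalized spline", spline_model), ("random forest", forest)]:
    mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"{name}: test MSE = {mse:.3f}")
```

The spline pipeline mirrors the shrinkage idea of Chapter 2 (a flexible basis restrained by a penalty), while the forest averages many bootstrap-grown trees in the spirit of Chapters 4 and 5.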