Title: Neural Networks (Reprint Edition) / Renowned Foreign Textbooks for University Computer Education series
Author: Kumar (India)
Publisher: Tsinghua University Press

Description
Editor's Recommendation

Starting from both theory and practical application, this book gives a comprehensive and systematic introduction to the basic models, methods and techniques of neural networks, covering neuroscience, statistical pattern recognition, support vector machines, fuzzy systems, soft computing and dynamical systems. For every model it not only gives practical application examples but also provides detailed MATLAB code, making it an excellent neural networks textbook.

The book is suitable as a textbook for graduate students or senior undergraduates in related disciplines, and also serves as a reference for researchers working on neural networks.

Content Summary

Starting from both theory and practical application, this book gives a comprehensive and systematic introduction to the basic models, methods and techniques of neural networks, covering neuroscience, statistical pattern recognition, support vector machines, fuzzy systems, soft computing and dynamical systems. It examines the fundamental neural network models in depth and surveys the latest development trends and main research directions in the field. Every chapter contains numerous worked examples and exercises, and for every model the book not only gives practical application examples but also provides detailed MATLAB code, making it an excellent neural networks textbook.

The book is suitable as a textbook for graduate students or senior undergraduates in related disciplines, and also serves as a reference for researchers working on neural networks.

Table of Contents

Foreword xi

Preface xiii

More Acknowledgements xxi

Part I Traces of History and A Neuroscience Briefer

1. Brain Style Computing: Origins and Issues 3

1.1 From the Greeks to the Renaissance 3

1.2 The Advent of Modern Neuroscience 6

1.3 On the Road to Artificial Intelligence 9

1.4 Classical AI and Neural Networks 12

1.5 Hybrid Intelligent Systems 14

Chapter Summary 15

Bibliographic Remarks 16

2. Lessons from Neuroscience 17

2.1 The Human Brain 17

2.2 Biological Neurons 23

Chapter Summary 37

Bibliographic Remarks 38

Part II Feedforward Neural Networks and Supervised Learning

3. Artificial Neurons, Neural Networks and Architectures 41

3.1 Neuron Abstraction 41

3.2 Neuron Signal Functions 44

3.3 Mathematical Preliminaries 53

3.4 Neural Networks Defined 61

3.5 Architectures: Feedforward and Feedback 62

3.6 Salient Properties and Application Domains of Neural Networks 65

Chapter Summary 68

Bibliographic Remarks 69

Review Questions 69

4. Geometry of Binary Threshold Neurons and Their Networks 72

4.1 Pattern Recognition and Data Classification 72

4.2 Convex Sets, Convex Hulls and Linear Separability 76

4.3 Space of Boolean Functions 78

4.4 Binary Neurons are Pattern Dichotomizers 80

4.5 Non-linearly Separable Problems 83

4.6 Capacity of a Simple Threshold Logic Neuron 87

4.7 Revisiting the XOR Problem 92

4.8 Multilayer Networks 95

4.9 How Many Hidden Nodes are Enough? 97

Chapter Summary 99

Bibliographic Remarks 100

Review Questions 100

5. Supervised Learning I: Perceptrons and LMS 104

5.1 Learning and Memory 104

5.2 From Synapses to Behaviour: The Case of Aplysia 106

5.3 Learning Algorithms 110

5.4 Error Correction and Gradient Descent Rules 114

5.5 The Learning Objective for TLNs 115

5.6 Pattern Space and Weight Space 117

5.7 Perceptron Learning Algorithm 119

5.8 Perceptron Convergence Theorem 122

5.9 A Handworked Example and MATLAB Simulation 125

5.10 Perceptron Learning and Non-separable Sets 128

5.11 Handling Linearly Non-separable Sets 130

5.12 α-Least Mean Square Learning 132

5.13 MSE Error Surface and its Geometry 137

5.14 Steepest Descent Search with Exact Gradient Information 143

5.15 μ-LMS: Approximate Gradient Descent 147

5.16 Application of LMS to Noise Cancellation 152

Chapter Summary 156

Bibliographic Remarks 157

Review Questions 158

6. Supervised Learning II: Backpropagation and Beyond 164

6.1 Multilayered Network Architectures 164

6.2 Backpropagation Learning Algorithm 167

6.3 Handworked Example 177

6.4 MATLAB Simulation Examples 181

6.5 Practical Considerations in Implementing the BP Algorithm 187

6.6 Structure Growing Algorithms 196

6.7 Fast Relatives of Backpropagation 198

6.8 Universal Function Approximation and Neural Networks 199

6.9 Applications of Feedforward Neural Networks 201

6.10 Reinforcement Learning: A Brief Review 205

Chapter Summary 212

Bibliographic Remarks 213

Review Questions 214

7. Neural Networks: A Statistical Pattern Recognition Perspective 218

7.1 Introduction 218

7.2 Bayes’ Theorem 219

7.3 Two Instructive MATLAB Simulations 222

7.4 Implementing Classification Decisions with Bayes’ Theorem 227

7.5 Probabilistic Interpretation of a Neuron Discriminant Function 230

7.6 MATLAB Simulation: Plotting Bayesian Decision Boundaries 232

7.7 Interpreting Neuron Signals as Probabilities 236

7.8 Multilayered Networks, Error Functions and Posterior Probabilities 239

7.9 Error Functions for Classification Problems 245

Chapter Summary 254

Bibliographic Remarks 255

Review Questions 255

8. Focussing on Generalization: Support Vector Machines and Radial Basis Function Networks 259

8.1 Learning From Examples and Generalization 259

8.2 Statistical Learning Theory Briefer 264

8.3 Support Vector Machines 273

8.4 Radial Basis Function Networks 304

8.5 Regularization Theory Route to RBFNs 314

8.6 Generalized Radial Basis Function Network 323

8.7 Learning in RBFNs 326

8.8 Image Classification Application 329

8.9 Other Models For Valid Generalization 334

Chapter Summary 339

Bibliographic Remarks 341

Review Questions 341

Part III Recurrent Neurodynamical Systems

9. Dynamical Systems Review 347

9.1 States, State Vectors and Dynamics 347

9.2 State Equations 350

9.3 Attractors and Stability 352

9.4 Linear Dynamical Systems 354

9.5 Non-linear Dynamical Systems 358

9.6 Lyapunov Stability 363

9.7 Neurodynamical Systems 369

9.8 The Cohen-Grossberg Theorem 373

Chapter Summary 375

Bibliographic Remarks 376

Review Questions 376

10. Attractor Neural Networks 378

10.1 Introduction 378

10.2 Associative Learning 379

10.3 Attractor Neural Network Associative Memory 382

10.4 Linear Associative Memory 386

10.5 Hopfield Network 389

10.6 Content Addressable Memory 397

10.7 Two Handworked Examples 400

10.8 Example of Recall of Memories in Continuous Time 404

10.9 Spurious Attractors 405

10.10 Error Correction with Bipolar Encoding 407

10.11 Error Performance of Hopfield Networks 409

10.12 Applications of Hopfield Networks 412

10.13 Brain-State-in-a-Box Neural Network 419

10.14 Simulated Annealing 426

10.15 Boltzmann Machine 431

10.16 Bidirectional Associative Memory 440

10.17 Handworked Example 443

10.18 BAM Stability Analysis 447

10.19 Error Correction in BAMs 448

10.20 Memory Annihilation of Structured Maps in BAMs 450

10.21 Continuous BAMs 457

10.22 Adaptive BAMs 458

10.23 Application: Pattern Association 461

Chapter Summary 462

Bibliographic Remarks 464

Review Questions 464

11. Adaptive Resonance Theory 469

11.1 Noise-Saturation Dilemma 469

11.2 Solving the Noise-Saturation Dilemma 471

11.3 Recurrent On-center–Off-surround Networks 477

11.4 Building Blocks of Adaptive Resonance 482

11.5 Substrate of Resonance 487

11.6 Structural Details of the Resonance Model 489

11.7 Adaptive Resonance Theory I (ART I) 491

11.8 Handworked Example 502

11.9 MATLAB Code Description 504

11.10 A Breezy Review of ART Operating Principles 506

11.11 Neurophysiological Evidence for ART Mechanisms 507

11.12 Applications 511

Chapter Summary 516

Bibliographic Remarks 517

Review Questions 518

12. Towards the Self-organizing Feature Map 521

12.1 Self-organization 521

12.2 Maximal Eigenvector Filtering 522

12.3 Extracting Principal Components: Sanger’s Rule 530

12.4 Generalized Learning Laws 532

12.5 Competitive Learning Revisited 537

12.6 Vector Quantization 540

12.7 Mexican Hat Networks 546

12.8 Self-organizing Feature Maps 552

12.9 Applications of the Self-organizing Map 563

Chapter Summary 569

Bibliographic Remarks 570

Review Questions 571

Part IV Contemporary Topics

13. Pulsed Neuron Models: The New Generation 577

13.1 Introduction 577

13.2 Spiking Neuron Model 578

13.3 Integrate-and-Fire Neurons 586

13.4 Conductance Based Models 594

13.5 Computing with Spiking Neurons 608

13.6 Reflections 616

Chapter Summary 617

Bibliographic Remarks 618

14. Fuzzy Sets, Fuzzy Systems and Applications 620

14.1 Need for Numeric and Linguistic Processing 620

14.2 Fuzzy Uncertainty and the Linguistic Variable 621

14.3 Fuzzy Set 622

14.4 Membership Functions 624

14.5 Geometry of Fuzzy Sets 627

14.6 Simple Operations on Fuzzy Sets 628

14.7 Fuzzy Rules for Approximate Reasoning 632

14.8 Rule Composition and Defuzzification 634

14.9 Fuzzy Engineering 638

14.10 Applications 644

Chapter Summary 649

Bibliographic Remarks 650

Review Questions 650

15. Neural Networks and the Soft Computing Paradigm 652

15.1 Soft Computing = Neural + Fuzzy + Evolutionary 652

15.2 Neural Networks: A Summary 654

15.3 Fuzzy Sets and Systems: A Summary 656

15.4 Genetic Algorithms 658

15.5 Neural Networks and Fuzzy Logic 662

15.6 Neuro-Fuzzy-Genetic Integration 671

15.7 Integration Example: Subsethood-Product Based Fuzzy–Neural Inference System 675

15.8 A Concluding Note 683

Chapter Summary 684

Bibliographic Remarks 685

Appendix A: Neural Network Hardware 686

A.1 Motivation and Issues 686

A.2 Analog Building Blocks for Neuromorphic Networks 687

A.3 Digital Techniques 691

A.4 Bibliographic Remarks 692

Appendix B: Web Pointers 694

Bibliography 697

Index 729
