

Title: Numerical Optimization (2nd Edition, English reprint, hardcover) / Series of Foreign Mathematics Classics
Category: Science & Technology - Natural Science - Mathematics
Authors: Jorge Nocedal, Stephen J. Wright (USA)
Publisher: Science Press
Synopsis
Description
Drawing on their experience in teaching, research, and consulting, the authors have written a book suited to both students and practitioners. It offers a comprehensive, up-to-date treatment of most of the effective methods in continuous optimization. Each chapter starts from basic concepts and builds gradually toward the techniques in current use.
The book emphasizes practical methods and contains many illustrations and exercises. It is accessible to a wide readership: it can serve as a graduate textbook in engineering, operations research, mathematics, computer science, and business, and as a handbook for researchers and practitioners in the field.
Contents
Preface
Preface to the Second Edition
1 Introduction
Mathematical Formulation
Example:A Transportation Problem
Continuous versus Discrete Optimization
Constrained and Unconstrained Optimization
Global and Local Optimization
Stochastic and Deterministic Optimization
Convexity
Optimization Algorithms
Notes and References
2 Fundamentals of Unconstrained Optimization
2.1 What Is a Solution?
Recognizing a Local Minimum
Nonsmooth Problems
2.2 Overview of Algorithms
Two Strategies:Line Search and Trust Region
Search Directions for Line Search Methods
Models for Trust-Region Methods
Scaling
Exercises
3 Line Search Methods
3.1 Step Length
The Wolfe Conditions
The Goldstein Conditions
Sufficient Decrease and Backtracking
3.2 Convergence of Line Search Methods
3.3 Rate of Convergence
Convergence Rate of Steepest Descent
Newton's Method
Quasi-Newton Methods
3.4 Newton's Method with Hessian Modification
Eigenvalue Modification
Adding a Multiple of the Identity
Modified Cholesky Factorization
Modified Symmetric Indefinite Factorization
3.5 Step-Length Selection Algorithms
Interpolation
Initial Step Length
A Line Search Algorithm for the Wolfe Conditions
Notes and References
Exercises
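To give a flavor of the sufficient-decrease and backtracking material covered in Chapter 3, here is a minimal Python sketch (not taken from the book; function and variable names are my own) of a backtracking line search enforcing the Armijo condition, driven by steepest descent on a small quadratic:

```python
import numpy as np

def backtracking_line_search(f, grad_f, x, p, alpha0=1.0, rho=0.5, c=1e-4):
    """Shrink the step until the sufficient-decrease (Armijo) condition
    f(x + a p) <= f(x) + c * a * grad_f(x)^T p holds."""
    alpha = alpha0
    fx = f(x)
    slope = grad_f(x) @ p          # directional derivative; negative for a descent direction
    while f(x + alpha * p) > fx + c * alpha * slope:
        alpha *= rho               # backtrack
    return alpha

# Demo: steepest descent on f(x) = 0.5 x^T A x, whose minimizer is x* = 0.
A = np.array([[3.0, 0.0], [0.0, 1.0]])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x

x = np.array([1.0, 1.0])
for _ in range(50):
    p = -grad(x)                   # steepest-descent direction
    a = backtracking_line_search(f, grad, x, p)
    x = x + a * p
```

After the loop, `x` is very close to the minimizer at the origin. The constants `rho` and `c` here are conventional illustrative choices, not prescriptions from the text.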
4 Trust-Region Methods
Outline of the Trust-Region Approach
4.1 Algorithms Based on the Cauchy Point
The Cauchy Point
Improving on the Cauchy Point
The Dogleg Method
Two-Dimensional Subspace Minimization
4.2 Global Convergence
Reduction Obtained by the Cauchy Point
Convergence to Stationary Points
4.3 Iterative Solution of the Subproblem
The Hard Case
Proof of Theorem 4.
Convergence of Algorithms Based on Nearly Exact Solutions
4.4 Local Convergence of Trust-Region Newton Methods
4.5 Other Enhancements
Scaling
Trust Regions in Other Norms
Notes and References
Exercises
5 Conjugate Gradient Methods
5.1 The Linear Conjugate Gradient Method
Conjugate Direction Methods
Basic Properties of the Conjugate Gradient Method
A Practical Form of the Conjugate Gradient Method
Rate of Convergence
Preconditioning
Practical Preconditioners
5.2 Nonlinear Conjugate Gradient Methods
The Fletcher-Reeves Method
The Polak-Ribière Method and Variants
Quadratic Termination and Restarts
Behavior of the Fletcher-Reeves Method
Global Convergence
Numerical Performance
Notes and References
Exercises
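The practical form of the linear conjugate gradient method listed in Chapter 5 can be sketched as follows (a standard textbook-style CG iteration in Python, with names of my own choosing, not code from the book):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Solve A x = b for symmetric positive definite A by the
    linear conjugate gradient iteration."""
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b - A @ x                  # residual
    p = r.copy()                   # first search direction is the residual
    rs_old = r @ r
    for _ in range(max_iter):
        if np.sqrt(rs_old) < tol:
            break
        Ap = A @ p
        alpha = rs_old / (p @ Ap)  # exact step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        p = r + (rs_new / rs_old) * p  # new A-conjugate direction
        rs_old = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

In exact arithmetic CG terminates in at most n iterations, which is why `max_iter` defaults to the problem dimension.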
6 Quasi-Newton Methods
6.1 The BFGS Method
Properties of the BFGS Method
Implementation
6.2 The SR1 Method
Properties of SR1 Updating
6.3 The Broyden Class
6.4 Convergence Analysis
Global Convergence of the BFGS Method
Superlinear Convergence of the BFGS Method
Convergence Analysis of the SR1 Method
Notes and References
Exercises
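As an illustration of the BFGS updating covered in Chapter 6, here is a small Python sketch (my own illustrative code, not the book's) of the inverse-Hessian update applied to a convex quadratic, where an exact line search is available in closed form:

```python
import numpy as np

def bfgs_update(H, s, y):
    """BFGS update of the inverse-Hessian approximation:
    H+ = (I - rho s y^T) H (I - rho y s^T) + rho s s^T,  rho = 1 / (y^T s)."""
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# Demo: minimize f(x) = 0.5 x^T A x (minimizer x* = 0).
A = np.array([[2.0, 0.5], [0.5, 1.0]])
grad = lambda x: A @ x

x = np.array([1.0, 1.0])
H = np.eye(2)                        # initial inverse-Hessian approximation
for _ in range(20):
    g = grad(x)
    if np.linalg.norm(g) < 1e-10:
        break
    p = -H @ g                       # quasi-Newton search direction
    alpha = -(g @ p) / (p @ A @ p)   # exact minimizing step for this quadratic
    s = alpha * p
    y = grad(x + s) - g
    H = bfgs_update(H, s, y)
    x = x + s
```

The exact line search guarantees the curvature condition y^T s > 0, so the update stays well defined and H remains positive definite.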
7 Large-Scale Unconstrained Optimization
7.1 Inexact Newton Methods
Local Convergence of Inexact Newton Methods
Line Search Newton-CG Method
Trust-Region Newton-CG Method
Preconditioning the Trust-Region Newton-CG Method
Trust-Region Newton-Lanczos Method
7.2 Limited-Memory Quasi-Newton Methods
Limited-Memory BFGS
Relationship with Conjugate Gradient Methods
General Limited-Memory Updating
Compact Representation of BFGS Updating
Unrolling the Update
7.3 Sparse Quasi-Newton Updates
7.4 Algorithms for Partially Separable Functions
7.5 Perspectives and Software
Notes and References
Exercises
8 Calculating Derivatives
8.1 Finite-Difference Derivative Approximations
Approximating the Gradient
Approximating a Sparse Jacobian
Approximating the Hessian
Approximating a Sparse Hessian
8.2 Automatic Differentiation
An Example
The Forward Mode
The Reverse Mode
Ve