实用迭代分析 (Contents)
Preface to the Series in Information and Computational Science
Preface

Chapter 1  Introduction
  1.1  Background in Linear Algebra
    1.1.1  Basic Symbols, Notations, and Definitions
    1.1.2  Vector Norm
    1.1.3  Matrix Norm
    1.1.4  Spectral Radii
  1.2  Spectral Results of Matrix
  1.3  Special Matrices
    1.3.1  Reducible and Irreducible Matrices
    1.3.2  Diagonally Dominant Matrices
    1.3.3  Nonnegative Matrices
    1.3.4  p-Cyclic Matrices
    1.3.5  Toeplitz, Hankel, Cauchy, Cauchy-like and Hessenberg Matrices
  1.4  Matrix Decomposition
    1.4.1  LU Decomposition
    1.4.2  Singular Value Decomposition
    1.4.3  Conjugate Decomposition
    1.4.4  QZ Decomposition
    1.4.5  S & T Decomposition
  1.5  Exercises

Chapter 2  Basic Methods and Convergence
  2.1  Basic Concepts
  2.2  The Jacobi Method
  2.3  The Gauss-Seidel Method
  2.4  The SOR Method
  2.5  M-Matrices and Splitting Methods
    2.5.1  M-Matrix
    2.5.2  Splitting Methods
    2.5.3  Comparison Theorems
    2.5.4  Multi-Splitting Methods
    2.5.5  Generalized Ostrowski-Reich Theorem
  2.6  Error Analysis of Iterative Methods
  2.7  Iterative Refinement
  2.8  Exercises

Chapter 3  Non-Stationary Methods
  3.1  Conjugate Gradient Methods
    3.1.1  Steepest Descent Method
    3.1.2  Conjugate Gradient Method
    3.1.3  Preconditioned Conjugate Gradient Method
    3.1.4  Generalized Conjugate Gradient Method
    3.1.5  Theoretical Results on the Conjugate Gradient Method
    3.1.6  Generalized Product-Type Methods Based on Bi-CG
    3.1.7  Inexact Preconditioned Conjugate Gradient Method
  3.2  Lanczos Method
  3.3  GMRES Method and QMR Method
    3.3.1  GMRES Method
    3.3.2  QMR Method
    3.3.3  Variants of the QMR Method
  3.4  Direct Projection Method
    3.4.1  Theory of the Direct Projection Method
    3.4.2  Direct Projection Algorithms
  3.5  Semi-Conjugate Direction Method
    3.5.1  Semi-Conjugate Vectors
    3.5.2  Left Conjugate Direction Method
    3.5.3  One Possible Way to Find Left Conjugate Vector Set
    3.5.4  Remedy for Breakdown
    3.5.5  Relation with Gaussian Elimination
  3.6  Krylov Subspace Methods
  3.7  Exercises

Chapter 4  Iterative Methods for Least Squares Problems
  4.1  Introduction
  4.2  Basic Iterative Methods
  4.3  Block SOR Methods
    4.3.1  Block SOR Algorithms
    4.3.2  Convergence and Optimal Factors
    4.3.3  Example
  4.4  Preconditioned Conjugate Gradient Methods
  4.5  Generalized Least Squares Problems
    4.5.1  Block SOR Methods
    4.5.2  Preconditioned Conjugate Gradient Method
    4.5.3  Comparison
    4.5.4  SOR-like Methods
  4.6  Rank Deficient Problems
    4.6.1  Augmented System of Normal Equation
    4.6.2  Block SOR Algorithms
    4.6.3  Convergence and Optimal Factor
    4.6.4  Preconditioned Conjugate Gradient Method
    4.6.5  Comparison Results
  4.7  Exercises

Chapter 5  Preconditioners
  5.1  LU Decomposition and Orthogonal Transformations
    5.1.1  Gilbert and Peierls Algorithm for LU Decomposition
    5.1.2  Orthogonal Transformations
  5.2  Stationary Preconditioners
    5.2.1  Jacobi Preconditioner
    5.2.2  SSOR Preconditioner
  5.3  Incomplete Factorization
    5.3.1  Point Incomplete Factorization
    5.3.2  Modified Incomplete Factorization
    5.3.3  Block Incomplete Factorization
  5.4  Diagonally Dominant Preconditioner
  5.5  Preconditioner for Least Squares Problems
    5.5.1  Preconditioner by LU Decomposition
    5.5.2  Preconditioner by Direct Projection Method
    5.5.3  Preconditioner by QR Decomposition
  5.6  Exercises

Chapter 6  Singular Linear Systems
  6.1  Introduction
  6.2  Properties of Singular Systems
  6.3  Splitting Methods for Singular Systems
  6.4  Nonstationary Methods for Singular Systems
    6.4.1  Symmetric and Positive Semidefinite Systems
    6.4.2  General Systems
  6.5  Exercises

Bibliography
Index
《信息与计算科学丛书》 (Series in Information and Computational Science)