Applied Linear Statistical Models, Volume 1 (English Reprint of the Original 5th Edition)

Author: Kutner

Format: 16mo (16开)

ISBN: 9787111490685

List price:

Publication date: 2016-04-01

Publisher: China Machine Press (机械工业出版社)

Applied Linear Statistical Models, Volume 1 (English Reprint of the Original 5th Edition): Highlights

The book is organized in three parts. Part One, Simple Linear Regression, covers linear regression with a single predictor variable, inferences in regression and correlation analysis, diagnostics and remedial measures, simultaneous inferences and other topics in regression analysis, and the matrix approach to simple linear regression. Part Two, Multiple Linear Regression, covers Multiple Regression I and II, regression models for quantitative and qualitative predictors, Building the Regression Model I, II, and III, and autocorrelation in time series data. Part Three, Nonlinear Regression, covers an introduction to nonlinear regression and neural networks, logistic regression, Poisson regression, and generalized linear models. The book is of moderate length, draws its examples from a wide range of application areas, is particularly strong in presenting statistical ideas, and is rich in data. It is suitable as a textbook for undergraduate and graduate students in statistics and in science and engineering programs.
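To give a sense of the core technique covered at the start of Part One, the following is a minimal illustrative sketch (not taken from the book; the data values are made up) of fitting a simple linear regression with one predictor variable by ordinary least squares, then computing the fitted values, residuals, and the MSE estimate of the error variance:

    # Minimal sketch of ordinary least squares for one predictor (made-up data)
    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # predictor variable X
    y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])   # response variable Y

    x_bar, y_bar = x.mean(), y.mean()
    b1 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)  # slope estimate
    b0 = y_bar - b1 * x_bar                                            # intercept estimate

    y_hat = b0 + b1 * x                          # fitted values
    residuals = y - y_hat                        # residuals e = Y - Y_hat
    mse = np.sum(residuals ** 2) / (len(x) - 2)  # estimates the error variance sigma^2

    print(f"b0 = {b0:.3f}, b1 = {b1:.3f}, MSE = {mse:.4f}")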

Applied Linear Statistical Models, Volume 1 (English Reprint of the Original 5th Edition): Description

This book is a widely used textbook at universities in the United States. Now in its fifth edition, it is popular with instructors and students, has been highly influential, and has gradually become a classic. Because of its length, the English reprint edition is divided into two volumes. The book covers nearly all of the key material of a course on applied linear statistical models in depth, yet it is not hard to read: concepts are explained in an accessible, step-by-step way, and a large number of worked examples, exercises, and real case studies help students master each topic. The book is also notably effective at teaching students to solve practical problems on their own. It is well illustrated, many of its examples and exercises are carefully chosen from everyday life and engineering practice, and its abundant data sets come from real cases. It is therefore suitable not only for statistics majors but also as a reference for students of business, econometrics, and related fields. Because its treatment is detailed, its content richer than that of domestic Chinese textbooks, and its length considerable, instructors using it as a textbook can select the main topics for lectures and leave the rest for students' self-study.

Applied Linear Statistical Models, Volume 1 (English Reprint of the Original 5th Edition): Table of Contents

Preface

Part One: Simple Linear Regression 1

Chapter 1. Linear Regression with One Predictor Variable 2
1.1 Relations between Variables 2
1.2 Regression Models and Their Uses 5
1.3 Simple Linear Regression Model with Distribution of Error Terms Unspecified 9
1.4 Data for Regression Analysis 12
1.5 Overview of Steps in Regression Analysis 13
1.6 Estimation of Regression Function 15
1.7 Estimation of Error Terms Variance σ² 24
1.8 Normal Error Regression Model 26
Cited References 33; Problems 33; Exercises 37; Projects 38

Chapter 2. Inferences in Regression and Correlation Analysis 40
2.1 Inferences Concerning β1 40
2.2 Inferences Concerning β0 48
2.3 Some Considerations on Making Inferences Concerning β0 and β1 50
2.4 Interval Estimation of E{Yh} 52
2.5 Prediction of New Observation 55
2.6 Confidence Band for Regression Line 61
2.7 Analysis of Variance Approach to Regression Analysis 63
2.8 General Linear Test Approach 72
2.9 Descriptive Measures of Linear Association between X and Y 74
2.10 Considerations in Applying Regression Analysis 77
2.11 Normal Correlation Models 78
Cited References 89; Problems 89; Exercises 97; Projects 98

Chapter 3. Diagnostics and Remedial Measures 100
3.1 Diagnostics for Predictor Variable 100
3.2 Residuals 102
3.3 Diagnostics for Residuals 103
3.4 Overview of Tests Involving Residuals 114
3.5 Correlation Test for Normality 115
3.6 Tests for Constancy of Error Variance 116
3.7 F Test for Lack of Fit 119
3.8 Overview of Remedial Measures 127
3.9 Transformations 129
3.10 Exploration of Shape of Regression Function 137
3.11 Case Example—Plutonium Measurement 141
Cited References 146; Problems 146; Exercises 151; Projects 152; Case Studies 153

Chapter 4. Simultaneous Inferences and Other Topics in Regression Analysis 154
4.1 Joint Estimation of β0 and β1 154
4.2 Simultaneous Estimation of Mean Responses 157
4.3 Simultaneous Prediction Intervals for New Observations 160
4.4 Regression through Origin 161
4.5 Effects of Measurement Errors 165
4.6 Inverse Predictions 168
4.7 Choice of X Levels 170
Cited References 172; Problems 172; Exercises 175; Projects 175

Chapter 5. Matrix Approach to Simple Linear Regression Analysis 176
5.1 Matrices 176
5.2 Matrix Addition and Subtraction 180
5.3 Matrix Multiplication 182
5.4 Special Types of Matrices 185
5.5 Linear Dependence and Rank of Matrix 188
5.6 Inverse of a Matrix 189
5.7 Some Basic Results for Matrices 193
5.8 Random Vectors and Matrices 193
5.9 Simple Linear Regression Model in Matrix Terms 197
5.10 Least Squares Estimation of Regression Parameters 199
5.11 Fitted Values and Residuals 202
5.12 Analysis of Variance Results 204
5.13 Inferences in Regression Analysis 206
Cited Reference 209; Problems 209; Exercises 212

Part Two: Multiple Linear Regression 213

Chapter 6. Multiple Regression I 214
6.1 Multiple Regression Models 214
6.2 General Linear Regression Model in Matrix Terms 222
6.3 Estimation of Regression Coefficients 223
6.4 Fitted Values and Residuals 224
6.5 Analysis of Variance Results 225
6.6 Inferences about Regression Parameters 227
6.7 Estimation of Mean Response and Prediction of New Observation 229
6.8 Diagnostics and Remedial Measures 232
6.9 An Example—Multiple Regression with Two Predictor Variables 236
Cited Reference 248; Problems 248; Exercises 253; Projects 254

Chapter 7. Multiple Regression II 256
7.1 Extra Sums of Squares 256
7.2 Uses of Extra Sums of Squares in Tests for Regression Coefficients 263
7.3 Summary of Tests Concerning Regression Coefficients 266
7.4 Coefficients of Partial Determination 268
7.5 Standardized Multiple Regression Model 271
7.6 Multicollinearity and Its Effects 278
Cited Reference 289; Problems 289; Exercise 292; Projects 293

Chapter 8. Regression Models for Quantitative and Qualitative Predictors 294
8.1 Polynomial Regression Models 294
8.2 Interaction Regression Models 306
8.3 Qualitative Predictors 313
8.4 Some Considerations in Using Indicator Variables 321
8.5 Modeling Interactions between Quantitative and Qualitative Predictors 324
8.6 More Complex Models 327
8.7 Comparison of Two or More Regression Functions 329
Cited Reference 335; Problems 335; Exercises 340; Projects 341; Case Study 342

Chapter 9. Building the Regression Model I: Model Selection and Validation 343
9.1 Overview of Model-Building Process 343
9.2 Surgical Unit Example 350
9.3 Criteria for Model Selection 353
9.4 Automatic Search Procedures for Model Selection 361
9.5 Some Final Comments on Automatic Model Selection Procedures 368
9.6 Model Validation 369
Cited References 375; Problems 376; Exercise 380; Projects 381; Case Studies 382

Chapter 10. Building the Regression Model II: Diagnostics 384
10.1 Model Adequacy for a Predictor Variable—Added-Variable Plots 384
10.2 Identifying Outlying Y Observations—Studentized Deleted Residuals 390
10.3 Identifying Outlying X Observations—Hat Matrix Leverage Values 398
10.4 Identifying Influential Cases—DFFITS, Cook's Distance, and DFBETAS Measures 400
10.5 Multicollinearity Diagnostics—Variance Inflation Factor 406
10.6 Surgical Unit Example—Continued 410
Cited References 414; Problems 414; Exercises 419; Projects 419; Case Studies 420

Chapter 11. Building the Regression Model III: Remedial Measures 421
11.1 Unequal Error Variances Remedial Measures—Weighted Least Squares 421
11.2 Multicollinearity Remedial Measures—Ridge Regression 431
11.3 Remedial Measures for Influential Cases—Robust Regression 437
11.4 Nonparametric Regression: Lowess Method and Regression Trees 449
11.5 Remedial Measures for Evaluating Precision in Nonstandard Situations—Bootstrapping 458
11.6 Case Example—MNDOT Traffic Estimation 464
Cited References 471; Problems 472; Exercises 476; Projects 476; Case Studies 480

Chapter 12. Autocorrelation in Time Series Data 481
12.1 Problems of Autocorrelation 481
12.2 First-Order Autoregressive Error Model 484
12.3 Durbin-Watson Test for Autocorrelation 487
12.4 Remedial Measures for Autocorrelation 490
12.5 Forecasting with Autocorrelated Error Terms 499
Cited References 502; Problems 502; Exercises 507; Projects 508; Case Studies 508

Part Three: Nonlinear Regression 509

Chapter 13. Introduction to Nonlinear Regression and Neural Networks 510
13.1 Linear and Nonlinear Regression Models 510
13.2 Least Squares Estimation in Nonlinear Regression 515
13.3 Model Building and Diagnostics 526
13.4 Inferences about Nonlinear Regression Parameters 527
13.5 Learning Curve Example 533
13.6 Introduction to Neural Network Modeling 537
Cited References 547; Problems 548; Exercises 552; Projects 552; Case Studies 554

Chapter 14. Logistic Regression, Poisson Regression, and Generalized Linear Models 555
14.1 Regression Models with Binary Response Variable 555
14.2 Sigmoidal Response Functions for Binary Responses 559
14.3 Simple Logistic Regression 563
14.4 Multiple Logistic Regression 570
14.5 Inferences about Regression Parameters 577
14.6 Automatic Model Selection Methods 582
14.7 Tests for Goodness of Fit 586
14.8 Logistic Regression Diagnostics 591
14.9 Inferences about Mean Response 602
14.10 Prediction of a New Observation 604
14.11 Polytomous Logistic Regression for Nominal Response 608
14.12 Polytomous Logistic Regression for Ordinal Response 614
14.13 Poisson Regression 618
14.14 Generalized Linear Models 623
Cited References 624; Problems 625; Exercises 634; Projects 635; Case Studies 640

Appendix A. Some Basic Results in Probability and Statistics
Appendix B. Tables
Appendix C. Data Sets
Appendix D. Selected Bibliography
Index
