An Introduction to the Bootstrap

Author: Bradley Efron, R.J. Tibshirani
Publisher: CRC Press
ISBN: 9780412042317
Category: Mathematics
Page: 456
View: 5050

Statistics is a subject of many uses and surprisingly few effective practitioners. The traditional road to statistical knowledge is blocked, for most, by a formidable wall of mathematics. The approach in An Introduction to the Bootstrap avoids that wall. It arms scientists and engineers, as well as statisticians, with the computational techniques they need to analyze and understand complicated data sets.
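The computational idea the book is built on can be sketched in a few lines of Python (a generic illustration, not code from the book): resample the data with replacement many times, recompute the statistic of interest on each resample, and read the standard error off the spread of the replicates.

```python
import random
import statistics

def bootstrap_se(data, stat=statistics.mean, n_boot=2000, seed=0):
    """Bootstrap standard error: spread of the statistic over resamples."""
    rng = random.Random(seed)
    n = len(data)
    reps = []
    for _ in range(n_boot):
        resample = [data[rng.randrange(n)] for _ in range(n)]  # sample WITH replacement
        reps.append(stat(resample))
    return statistics.stdev(reps)

data = [8.1, 7.4, 9.9, 6.2, 8.8, 7.7, 10.3, 9.0, 6.9, 8.5]  # made-up sample
se = bootstrap_se(data)
```

For the sample mean this lands near the classical formula s/√n, but the same loop works unchanged for medians, correlations, or any statistic with no textbook formula, which is the point of the method.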

Generalized Additive Models

Author: T.J. Hastie, R.J. Tibshirani
Publisher: CRC Press
ISBN: 9780412343902
Category: Mathematics
Page: 352
View: 9738

This book describes an array of power tools for data analysis that are based on nonparametric regression and smoothing techniques. These methods relax the linear assumption of many standard models and allow analysts to uncover structure in the data that might otherwise have been missed. While McCullagh and Nelder's Generalized Linear Models shows how to extend the usual linear methodology to cover analysis of a range of data types, Generalized Additive Models enhances this methodology even further by incorporating the flexibility of nonparametric regression. Clear prose, exercises in each chapter, and case studies enhance this popular text.
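The additive-model idea can be made concrete with a bare-bones backfitting loop (a schematic sketch, not the authors' algorithm; a crude local-average smoother stands in for the spline smoothers the book actually develops):

```python
def smooth(x, r, bandwidth):
    """Crude local-average smoother (stand-in for a spline smoother)."""
    out = []
    for xi in x:
        vals = [rj for xj, rj in zip(x, r) if abs(xj - xi) <= bandwidth]
        out.append(sum(vals) / len(vals))
    return out

def backfit(y, x1, x2, bandwidth=0.3, n_iter=25):
    """Fit y ~ alpha + f1(x1) + f2(x2) by cycling over the smoothers."""
    n = len(y)
    alpha = sum(y) / n
    f1 = [0.0] * n
    f2 = [0.0] * n
    for _ in range(n_iter):
        # smooth the partial residuals against each predictor in turn
        f1 = smooth(x1, [y[i] - alpha - f2[i] for i in range(n)], bandwidth)
        c1 = sum(f1) / n
        f1 = [v - c1 for v in f1]  # center so alpha stays identifiable
        f2 = smooth(x2, [y[i] - alpha - f1[i] for i in range(n)], bandwidth)
        c2 = sum(f2) / n
        f2 = [v - c2 for v in f2]
    return alpha, f1, f2
```

Each coordinate function is fitted nonparametrically to the residuals left by the others, which is how the linear assumption is relaxed while the additive structure keeps the fit interpretable.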

The Bootstrap and Edgeworth Expansion

Author: Peter Hall
Publisher: Springer Science & Business Media
ISBN: 146124384X
Category: Mathematics
Page: 354
View: 2347

This monograph addresses two quite different topics, each being able to shed light on the other. Firstly, it lays the foundation for a particular view of the bootstrap. Secondly, it gives an account of Edgeworth expansion. The first two chapters deal with the bootstrap and Edgeworth expansion respectively, while chapters 3 and 4 bring these two themes together, using Edgeworth expansion to explore and develop the properties of the bootstrap. The book is aimed at graduate level for those with some exposure to the methods of theoretical statistics. However, technical details are delayed until the last chapter such that mathematically able readers without knowledge of the rigorous theory of probability will have no trouble understanding most of the book.

Bootstrap Methods and Their Application

Author: A. C. Davison, D. V. Hinkley
Publisher: Cambridge University Press
ISBN: 9780521574716
Category: Computers
Page: 582
View: 4894

This book gives broad and up-to-date coverage of bootstrap methods, with numerous applied examples, developed in a coherent way with the necessary theoretical basis. Applications include stratified data; finite populations; censored and missing data; linear, nonlinear, and smooth regression models; classification; and time series and spatial problems. Special features of the book include: extensive discussion of significance tests and confidence intervals; material on various diagnostic methods; and methods for efficient computation, including improved Monte Carlo simulation. Each chapter includes both practical and theoretical exercises. Included with the book is a disk of purpose-written S-Plus programs for implementing the methods described in the text. Computer algorithms are clearly described, and the code is supplied on a 3.5-inch, 1.44 MB disk for use with IBM computers and compatible machines. Users must have the S-Plus computer application.

Local Polynomial Modelling and Its Applications

Monographs on Statistics and Applied Probability 66
Author: Jianqing Fan
Publisher: Routledge
ISBN: 1351434802
Category: Mathematics
Page: 360
View: 1356

Data-analytic approaches to regression problems arising in many scientific disciplines are described in this book. The aim of these nonparametric methods is to relax assumptions on the form of a regression function and to let the data search for a suitable function that describes the data well. The use of these nonparametric functions with parametric techniques can yield very powerful data analysis tools. Local Polynomial Modelling and Its Applications provides an up-to-date picture of state-of-the-art nonparametric regression techniques. The emphasis of the book is on methodologies rather than on theory, with a particular focus on applications of nonparametric techniques to various statistical problems. High-dimensional data-analytic tools are presented, and the book includes a variety of examples. This will be a valuable reference for research and applied statisticians, and will serve as a textbook for graduate students and others interested in nonparametric regression.
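The core computation in local polynomial regression is just a weighted least-squares fit around each target point. A minimal degree-one (local linear) version with a Gaussian kernel might look like this (an illustrative sketch, not code from the book):

```python
import math

def local_linear(x, y, x0, h):
    """Local linear fit at x0: weighted least squares with Gaussian weights."""
    w = [math.exp(-0.5 * ((xi - x0) / h) ** 2) for xi in x]
    d = [xi - x0 for xi in x]  # design centered at the target point
    s0 = sum(w)
    s1 = sum(wi * di for wi, di in zip(w, d))
    s2 = sum(wi * di * di for wi, di in zip(w, d))
    t0 = sum(wi * yi for wi, yi in zip(w, y))
    t1 = sum(wi * di * yi for wi, di, yi in zip(w, d, y))
    # intercept of the weighted fit = estimated regression function at x0
    return (s2 * t0 - s1 * t1) / (s0 * s2 - s1 * s1)

x = [i / 20 for i in range(21)]
y = [2 * xi + 1 for xi in x]  # exactly linear toy data
```

A useful sanity check: on exactly linear data the local linear fit reproduces the line for any bandwidth, even at the boundary, which is one of the properties that motivates the local polynomial approach.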

Kernel Smoothing

Author: M.P. Wand,M.C. Jones
Publisher: CRC Press
ISBN: 9780412552700
Category: Mathematics
Page: 224
View: 3025

Kernel smoothing refers to a general methodology for recovery of underlying structure in data sets. The basic principle is that local averaging or smoothing is performed with respect to a kernel function. This book provides uninitiated readers with a feeling for the principles, applications, and analysis of kernel smoothers. This is facilitated by the authors' focus on the simplest settings, namely density estimation and nonparametric regression. They pay particular attention to the problem of choosing the smoothing parameter of a kernel smoother, and also treat the multivariate case in detail. Kernel Smoothing is self-contained and assumes only a basic knowledge of statistics, calculus, and matrix algebra. It is an invaluable introduction to the main ideas of kernel estimation for students and researchers from other disciplines and provides a comprehensive reference for those familiar with the topic.
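The "local averaging with respect to a kernel" principle can be written down directly; the Nadaraya–Watson estimator with a Gaussian kernel is the simplest instance (a generic illustration, not code from the book):

```python
import math

def nadaraya_watson(x, y, x0, h):
    """Kernel-weighted local average of y at the point x0; h is the bandwidth."""
    w = [math.exp(-0.5 * ((xi - x0) / h) ** 2) for xi in x]
    return sum(wi * yi for wi, yi in zip(w, y)) / sum(w)

xs = [0.1 * i for i in range(10)]
ys = [0.0] * 5 + [1.0] * 5  # a noiseless step, for illustration
```

The bandwidth h is the smoothing parameter the authors devote so much attention to: small h tracks the data closely (low bias, high variance), large h averages over wide neighborhoods (high bias, low variance).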

Density Estimation for Statistics and Data Analysis

Author: Bernard W. Silverman
Publisher: CRC Press
ISBN: 9780412246203
Category: Mathematics
Page: 176
View: 7557

Although there has been a surge of interest in density estimation in recent years, much of the published research has been concerned with purely technical matters, with insufficient emphasis given to the technique's practical value. Furthermore, the subject has been rather inaccessible to the general statistician. The account presented in this book places emphasis on topics of methodological importance, in the hope that this will facilitate broader practical application of density estimation and also encourage research into relevant theoretical work. The book also provides an introduction to the subject for those with general interests in statistics. The important role of density estimation as a graphical technique is reflected by the inclusion of more than 50 graphs and figures throughout the text. Several contexts in which density estimation can be used are discussed, including the exploration and presentation of data, nonparametric discriminant analysis, cluster analysis, simulation and the bootstrap, bump hunting, projection pursuit, and the estimation of hazard rates and other quantities that depend on the density. The book includes a general survey of methods available for density estimation. The kernel method, both for univariate and multivariate data, is discussed in detail, with particular emphasis on ways of deciding how much to smooth and on computational aspects. Attention is also given to adaptive methods, which smooth to a greater degree in the tails of the distribution, and to methods based on the idea of penalized likelihood.
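The kernel method and the "how much to smooth" question mentioned above fit in a few lines of Python (an illustrative sketch, not the book's code; the bandwidth rule is the widely used normal-reference rule of thumb):

```python
import math
import statistics

def silverman_bandwidth(data):
    """Normal-reference rule of thumb: h = 0.9 * min(sd, IQR/1.34) * n^(-1/5)."""
    sd = statistics.stdev(data)
    q1, _, q3 = statistics.quantiles(data, n=4)
    return 0.9 * min(sd, (q3 - q1) / 1.34) * len(data) ** -0.2

def kde(data, x, h):
    """Gaussian kernel density estimate at the point x."""
    norm = len(data) * h * math.sqrt(2 * math.pi)
    return sum(math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in data) / norm

data = [1.1, 1.9, 2.4, 2.7, 3.0, 3.2, 3.6, 4.1, 4.8, 5.5]  # made-up sample
h = silverman_bandwidth(data)
```

Taking the minimum of the standard deviation and the scaled interquartile range makes the rule somewhat robust to heavy tails and mild multimodality, at the cost of oversmoothing strongly multimodal data.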

The Jackknife and Bootstrap

Author: Jun Shao, Dongsheng Tu
Publisher: Springer Science & Business Media
ISBN: 1461207959
Category: Mathematics
Page: 517
View: 6134

The jackknife and bootstrap are the most popular data-resampling methods used in statistical analysis. The resampling methods replace theoretical derivations required in applying traditional methods (such as substitution and linearization) in statistical analysis by repeatedly resampling the original data and making inferences from the resamples. Because of the availability of inexpensive and fast computing, these computer-intensive methods have caught on very rapidly in recent years and are particularly appreciated by applied statisticians. The primary aims of this book are (1) to provide a systematic introduction to the theory of the jackknife, the bootstrap, and other resampling methods developed in the last twenty years; (2) to provide a guide for applied statisticians: practitioners often use (or misuse) the resampling methods in situations where no theoretical confirmation has been made; and (3) to stimulate the use of the jackknife and bootstrap and further developments of the resampling methods. The theoretical properties of the jackknife and bootstrap methods are studied in this book in an asymptotic framework. Theorems are illustrated by examples. Finite sample properties of the jackknife and bootstrap are mostly investigated by examples and/or empirical simulation studies. In addition to the theory for the jackknife and bootstrap methods in problems with independent and identically distributed (i.i.d.) data, we try to cover, as much as we can, the applications of the jackknife and bootstrap in various complicated non-i.i.d. data problems.
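The jackknife half of the pairing is easy to state concretely: recompute the statistic with each observation left out in turn, then scale the spread of those leave-one-out values (a generic textbook sketch, not the authors' code):

```python
import math

def jackknife_se(data, stat):
    """Jackknife standard error from the n leave-one-out replicates."""
    n = len(data)
    loo = [stat(data[:i] + data[i + 1:]) for i in range(n)]
    loo_bar = sum(loo) / n
    return math.sqrt((n - 1) / n * sum((v - loo_bar) ** 2 for v in loo))

def mean(xs):
    return sum(xs) / len(xs)
```

For the sample mean the jackknife standard error agrees exactly with the classical s/√n, which is a standard check on any implementation; its advantage, like the bootstrap's, shows up for statistics with no simple variance formula.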

Statistics and Finance

An Introduction
Author: David Ruppert
Publisher: Springer
ISBN: 1441968768
Category: Business & Economics
Page: 474
View: 9407

This book emphasizes the applications of statistics and probability to finance. The basics of these subjects are reviewed and more advanced topics in statistics, such as regression, ARMA and GARCH models, the bootstrap, and nonparametric regression using splines, are introduced as needed. The book covers the classical methods of finance and it introduces the newer area of behavioral finance. Applications and use of MATLAB and SAS software are stressed. The book will serve as a text in courses aimed at advanced undergraduates and masters students. Those in the finance industry can use it for self-study.

Statistical Learning with Sparsity

The Lasso and Generalizations
Author: Trevor Hastie, Robert Tibshirani, Martin Wainwright
Publisher: CRC Press
ISBN: 1498712177
Category: Business & Economics
Page: 367
View: 4551

A sparse statistical model has only a small number of nonzero parameters or weights; it is therefore much easier to estimate and interpret than a dense model. Statistical Learning with Sparsity: The Lasso and Generalizations presents methods that exploit sparsity to help recover the underlying signal in a set of data. Top experts in this rapidly evolving field, the authors describe the lasso for linear regression and a simple coordinate descent algorithm for its computation. They discuss the application of ℓ1 penalties to generalized linear models and support vector machines, cover generalized penalties such as the elastic net and group lasso, and review numerical methods for optimization. They also present statistical inference methods for fitted (lasso) models, including the bootstrap, Bayesian methods, and recently developed approaches. In addition, the book examines matrix decomposition, sparse multivariate analysis, graphical models, and compressed sensing. It concludes with a survey of theoretical results for the lasso. In this age of big data, the number of features measured on a person or object can be large, and might be larger than the number of observations. This book shows how the sparsity assumption allows us to tackle these problems and extract useful and reproducible patterns from big datasets. Data analysts, computer scientists, and theorists will appreciate this thorough and up-to-date treatment of sparse statistical modeling.
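The coordinate descent algorithm mentioned above reduces, for each coefficient in turn, to a univariate soft-thresholding update; a minimal sketch (illustrative only, omitting the intercept and standardization a real implementation would include):

```python
def soft_threshold(rho, lam):
    """Shrink rho toward zero by lam; set it to zero inside [-lam, lam]."""
    if rho < -lam:
        return rho + lam
    if rho > lam:
        return rho - lam
    return 0.0

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate descent for (1/2)||y - Xb||^2 + lam * ||b||_1 (no intercept)."""
    n, p = len(X), len(X[0])
    b = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # correlation of feature j with the partial residual (b_j excluded)
            rho = sum(X[i][j] * (y[i] - sum(X[i][k] * b[k]
                      for k in range(p) if k != j)) for i in range(n))
            z = sum(X[i][j] ** 2 for i in range(n))
            b[j] = soft_threshold(rho, lam) / z
    return b
```

The soft-thresholding step is where sparsity comes from: any coefficient whose partial correlation with the residual falls inside [-lam, lam] is set exactly to zero rather than merely shrunk.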


Bootstrapping

A Nonparametric Approach to Statistical Inference
Author: Christopher Z. Mooney, Robert D. Duval
Publisher: SAGE
ISBN: 9780803953819
Category: Social Science
Page: 73
View: 1016

Bootstrapping, a computational nonparametric technique for "re-sampling," enables researchers to draw a conclusion about the characteristics of a population strictly from the existing sample rather than by making parametric assumptions about the estimator. Using real data examples from per capita personal income to median preference differences between legislative committee members and the entire legislature, Mooney and Duval discuss how to apply bootstrapping when the underlying sampling distribution of the statistics cannot be assumed normal, as well as when the sampling distribution has no analytic solution. In addition, they show the advantages and limitations of four bootstrap confidence interval methods: normal approximation, percentile, …
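The percentile method named above is the simplest of these intervals: resample, recompute the statistic, and take empirical quantiles of the replicates (a generic Python sketch, not the authors' code; the income figures are hypothetical):

```python
import random

def percentile_ci(data, stat, alpha=0.05, n_boot=4000, seed=1):
    """Bootstrap percentile interval: empirical quantiles of the replicates."""
    rng = random.Random(seed)
    n = len(data)
    reps = sorted(stat([data[rng.randrange(n)] for _ in range(n)])
                  for _ in range(n_boot))
    return reps[int(alpha / 2 * n_boot)], reps[int((1 - alpha / 2) * n_boot) - 1]

def mean(xs):
    return sum(xs) / len(xs)

incomes = [14, 18, 11, 22, 16, 19, 13, 17, 20, 15]  # hypothetical sample
lo, hi = percentile_ci(incomes, mean)
```

No normality assumption enters anywhere, which is exactly the selling point when the sampling distribution is skewed or has no analytic form.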

Missing Data in Longitudinal Studies

Strategies for Bayesian Modeling and Sensitivity Analysis
Author: Michael J. Daniels, Joseph W. Hogan
Publisher: CRC Press
ISBN: 9781420011180
Category: Mathematics
Page: 328
View: 1821

Drawing from the authors’ own work and from the most recent developments in the field, Missing Data in Longitudinal Studies: Strategies for Bayesian Modeling and Sensitivity Analysis describes a comprehensive Bayesian approach for drawing inference from incomplete data in longitudinal studies. To illustrate these methods, the authors employ several data sets throughout that cover a range of study designs, variable types, and missing data issues. The book first reviews modern approaches to formulating and interpreting regression models for longitudinal data. It then discusses key ideas in Bayesian inference, including specifying prior distributions, computing posterior distributions, and assessing model fit. The book carefully describes the assumptions needed to make inferences about a full-data distribution from incompletely observed data. For settings with ignorable dropout, it emphasizes the importance of covariance models for inference about the mean, while for nonignorable dropout, the book studies a variety of models in detail. It concludes with three case studies that highlight important features of the Bayesian approach for handling nonignorable missingness. With suggestions for further reading at the end of most chapters as well as many applications to the health sciences, this resource offers a unified Bayesian approach to handle missing data in longitudinal studies.

Analysis of Variance for Functional Data

Author: Jin-Ting Zhang
Publisher: CRC Press
ISBN: 1439862745
Category: Mathematics
Page: 412
View: 444

Despite research interest in functional data analysis in the last three decades, few books are available on the subject. Filling this gap, Analysis of Variance for Functional Data presents up-to-date hypothesis testing methods for functional data analysis. The book covers the reconstruction of functional observations, functional ANOVA, functional linear models with functional responses, ill-conditioned functional linear models, diagnostics of functional observations, heteroscedastic ANOVA for functional data, and testing equality of covariance functions. Although the methodologies presented are designed for curve data, they can be extended to surface data. Useful for statistical researchers and practitioners analyzing functional data, this self-contained book gives both a theoretical and applied treatment of functional data analysis supported by easy-to-use MATLAB® code. The author provides a number of simple methods for functional hypothesis testing. He discusses pointwise, L2-norm-based, F-type, and bootstrap tests. Assuming only basic knowledge of statistics, calculus, and matrix algebra, the book explains the key ideas at a relatively low technical level using real data examples. Each chapter also includes bibliographical notes and exercises. Real functional data sets from the text and MATLAB codes for analyzing the data examples are available for download from the author’s website.

Computer Age Statistical Inference

Algorithms, Evidence, and Data Science
Author: Bradley Efron, Trevor Hastie
Publisher: Cambridge University Press
ISBN: 1108107958
Category: Mathematics
Page: N.A
View: 3432

The twenty-first century has seen a breathtaking expansion of statistical methodology, both in scope and in influence. 'Big data', 'data science', and 'machine learning' have become familiar terms in the news, as statistical methods are brought to bear upon the enormous data sets of modern science and commerce. How did we get here? And where are we going? This book takes us on an exhilarating journey through the revolution in data analysis following the introduction of electronic computation in the 1950s. Beginning with classical inferential theories - Bayesian, frequentist, Fisherian - individual chapters take up a series of influential topics: survival analysis, logistic regression, empirical Bayes, the jackknife and bootstrap, random forests, neural networks, Markov chain Monte Carlo, inference after model selection, and dozens more. The distinctly modern approach integrates methodology and algorithms with statistical inference. The book ends with speculation on the future direction of statistics and data science.

Large-Scale Inference

Empirical Bayes Methods for Estimation, Testing, and Prediction
Author: Bradley Efron
Publisher: Cambridge University Press
ISBN: 1139492136
Category: Mathematics
Page: N.A
View: 8850

We live in a new age for statistical inference, where modern scientific technology such as microarrays and fMRI machines routinely produce thousands and sometimes millions of parallel data sets, each with its own estimation or testing problem. Doing thousands of problems at once is more than repeated application of classical methods. Taking an empirical Bayes approach, Bradley Efron, inventor of the bootstrap, shows how information accrues across problems in a way that combines Bayesian and frequentist ideas. Estimation, testing and prediction blend in this framework, producing opportunities for new methodologies of increased power. New difficulties also arise, easily leading to flawed inferences. This book takes a careful look at both the promise and pitfalls of large-scale statistical inference, with particular attention to false discovery rates, the most successful of the new statistical techniques. Emphasis is on the inferential ideas underlying technical developments, illustrated using a large number of real examples.
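The false-discovery-rate control highlighted here is usually operationalized via the Benjamini–Hochberg step-up rule; a compact sketch (a standard textbook construction, not code from this book):

```python
def benjamini_hochberg(pvals, q=0.05):
    """Indices rejected at FDR level q by the Benjamini-Hochberg step-up rule."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # indices by ascending p
    k = 0
    for rank, i in enumerate(order, start=1):
        # compare the rank-th smallest p-value with its stepped threshold
        if pvals[i] <= q * rank / m:
            k = rank  # largest rank whose p-value clears its threshold
    return sorted(order[:k])
```

Unlike a Bonferroni correction, the threshold grows with the rank, so in a problem with thousands of parallel tests many genuine signals can survive while the expected fraction of false discoveries stays near q.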

Asymptotic Theory of Statistics and Probability

Author: Anirban DasGupta
Publisher: Springer Science & Business Media
ISBN: 0387759700
Category: Mathematics
Page: 722
View: 9587

This unique book delivers an encyclopedic treatment of classic as well as contemporary large sample theory, dealing with both statistical problems and probabilistic issues and tools. The book is unique in its detailed coverage of fundamental topics. It is written in an extremely lucid style, with an emphasis on the conceptual discussion of the importance of a problem and the impact and relevance of the theorems. There is no other book in large sample theory that matches this book in coverage, exercises and examples, bibliography, and lucid conceptual discussion of issues and theorems.

Resampling Methods for Dependent Data

Author: S. N. Lahiri
Publisher: Springer Science & Business Media
ISBN: 147573803X
Category: Mathematics
Page: 374
View: 7477

This book gives a detailed account of bootstrap methods and their properties for dependent data, with illustrative numerical examples throughout. It fills a gap in the literature on resampling methods for dependent data, a field that has seen vigorous growth over the last two decades but whose results remain scattered across statistics and econometrics journals. The book can be used as a graduate-level text and as a research monograph for statisticians and econometricians.
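The central device for dependent data is to resample blocks of consecutive observations rather than individual points, so that short-range dependence survives inside each block. A minimal moving-block-bootstrap sketch (illustrative only, not the author's code):

```python
import random

def moving_block_bootstrap(series, block_len, seed=0):
    """One moving-block-bootstrap resample of a time series."""
    rng = random.Random(seed)
    n = len(series)
    # all overlapping blocks of consecutive observations
    blocks = [series[i:i + block_len] for i in range(n - block_len + 1)]
    out = []
    while len(out) < n:
        out.extend(rng.choice(blocks))  # sample blocks with replacement
    return out[:n]

series = list(range(30))  # toy series; real data would be autocorrelated
bs = moving_block_bootstrap(series, block_len=5)
```

Choosing the block length is the dependent-data analogue of choosing a smoothing bandwidth: blocks must be long enough to capture the dependence yet short enough that many distinct blocks exist.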

An Introduction to Bootstrap Methods with Applications to R

Author: Michael R. Chernick, Robert A. LaBudde
Publisher: John Wiley & Sons
ISBN: 1118625412
Category: Mathematics
Page: 240
View: 7188

A comprehensive introduction to bootstrap methods in the R programming environment Bootstrap methods provide a powerful approach to statistical data analysis, as they have more general applications than standard parametric methods. An Introduction to Bootstrap Methods with Applications to R explores the practicality of this approach and successfully utilizes R to illustrate applications for the bootstrap and other resampling methods. This book provides a modern introduction to bootstrap methods for readers who do not have an extensive background in advanced mathematics. Emphasis throughout is on the use of bootstrap methods as an exploratory tool, including their value in variable selection and other modeling environments. The authors begin with a description of bootstrap methods and their relationship to other resampling methods, along with an overview of the wide variety of applications of the approach. Subsequent chapters offer coverage of improved confidence set estimation, estimation of error rates in discriminant analysis, and applications to a wide variety of hypothesis testing and estimation problems, including problems in pharmaceuticals, genomics, and economics. To inform readers of the limitations of the method, the book also exhibits counterexamples to the consistency of bootstrap methods. An introduction to R programming provides the needed preparation to work with the numerous exercises and applications presented throughout the book. A related website houses the book's R subroutines, and an extensive listing of references provides resources for further study. Discussing the topic at a remarkably practical and accessible level, An Introduction to Bootstrap Methods with Applications to R is an excellent book for introductory courses on bootstrap and resampling methods at the upper-undergraduate and graduate levels. It also serves as an insightful reference for practitioners working with data in engineering, medicine, and the social sciences who would like to acquire a basic understanding of bootstrap methods.

Subsampling


Author: Dimitris N. Politis, Joseph P. Romano, Michael Wolf
Publisher: Springer Science & Business Media
ISBN: 1461215544
Category: Mathematics
Page: 348
View: 1296

Since Efron's profound paper on the bootstrap, an enormous amount of effort has been spent on the development of bootstrap, jackknife, and other resampling methods. The primary goal of these computer-intensive methods has been to provide statistical tools that work in complex situations without imposing unrealistic or unverifiable assumptions about the data generating mechanism. This book sets out to lay some of the foundations for subsampling methodology and related methods.
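The contrast with the bootstrap is that subsampling draws subsets of size b < n without replacement, so every subsample is a genuine sample from the true underlying distribution, which is what makes the method valid under weaker assumptions. A bare-bones sketch (illustrative only, not the authors' code):

```python
import random
import statistics

def subsample_dist(data, b, stat=statistics.mean, n_sub=500, seed=0):
    """Replicates of `stat` on size-b subsamples drawn WITHOUT replacement."""
    rng = random.Random(seed)
    return [stat(rng.sample(data, b)) for _ in range(n_sub)]

data = list(range(1, 21))       # toy data
reps = subsample_dist(data, b=8)
```

In practice the spread of these replicates is rescaled by the known rate of convergence (e.g. √b versus √n for the mean) to approximate the sampling distribution of the full-sample statistic.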