This text presents the rigorous theory of probability and statistical inference using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Beginning with the basic ideas and techniques of probability theory and progressing to more rigorous topics, the author covers all of the topics typically addressed in a two-semester graduate or upper-level undergraduate course in probability and statistical inference, including hypothesis testing, Bayesian analysis, and sample-size determination. He reinforces important ideas and special techniques with drills and boxed summaries.
List of contents
Notions of probability
Expectations of functions of random variables
Multivariate random variables
Transformations and sampling distributions
Notions of stochastic convergence
Sufficiency, completeness and ancillarity
Point estimation
Tests of hypotheses
Confidence interval estimation
Bayesian methods
Likelihood ratio and other tests
Large-sample inference
Sample size determination - two-stage procedures
Appendices: abbreviations and notation; celebration of statistics - selected biographical notes; selected statistical tables
Summary
Priced very competitively compared with other textbooks at this level!
This gracefully organized textbook reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, numerous figures and tables, and computer simulations to develop and illustrate concepts.
Beginning with an introduction to the basic ideas and techniques in probability theory and progressing to more rigorous topics, Probability and Statistical Inference:
studies the Helmert transformation for normal distributions and the waiting time between failures for exponential distributions
develops notions of convergence in probability and distribution
spotlights the central limit theorem (CLT) for the sample variance
introduces sampling distributions and the Cornish-Fisher expansions
concentrates on the fundamentals of sufficiency, information, completeness, and ancillarity
explains Basu's Theorem as well as location, scale, and location-scale families of distributions
covers moment estimators, maximum likelihood estimators (MLE), Rao-Blackwellization, and the Cramér-Rao inequality
discusses uniformly minimum variance unbiased estimators (UMVUE) and Lehmann-Scheffé Theorems
focuses on the Neyman-Pearson theory of most powerful (MP) and uniformly most powerful (UMP) tests of hypotheses, as well as confidence intervals
includes the likelihood ratio (LR) tests for the mean, variance, and correlation coefficient
summarizes Bayesian methods
describes the monotone likelihood ratio (MLR) property
handles variance stabilizing transformations
provides a historical context for statistics and statistical discoveries
showcases great statisticians through biographical notes
Employing over 1400 equations to reinforce its subject matter, Probability and Statistical Inference is a groundbreaking text for first-year graduate and upper-level undergraduate courses in probability and statistical inference whose students have completed a calculus prerequisite, as well as a supplemental text for classes in Advanced Statistical Inference or Decision Theory.
Additional text
Praise for the First Edition
...the book contains unique features throughout. Examples are the moment problem, which is clarified through a nice example, the role of the probability generating functions, and the central limit theorem for the sample variance. Techniques and concepts are typically illustrated through a series of examples. Within a box is routinely summarized what has been accomplished or where to go from that point. At the end of each chapter, a long list of exercises is arranged according to the sections. —Zentralblatt MATH, 2000
...a marvelous book for students. —Statistical Papers
...a handy reference as well as a good textbook. —ISI Short Book Reviews