Fr. 109.00

Introduction to Statistical Inference

English · Paperback / Softback

Shipping usually within 1 to 2 weeks (title will be printed to order)

Description


This book is based upon lecture notes developed by Jack Kiefer for a course in statistical inference he taught at Cornell University. The notes were distributed to the class in lieu of a textbook, and the problems were used for homework assignments. Relying only on modest prerequisites of probability theory and calculus, Kiefer's approach to a first course in statistics is to present the central ideas of the modern mathematical theory with a minimum of fuss and formality. He is able to do this by using a rich mixture of examples, pictures, and mathematical derivations to complement a clear and logical discussion of the important ideas in plain English.

The straightforwardness of Kiefer's presentation is remarkable in view of the sophistication and depth of his examination of the major theme: How should an intelligent person formulate a statistical problem and choose a statistical procedure to apply to it? Kiefer's view, in the same spirit as Neyman and Wald, is that one should try to assess the consequences of a statistical choice in some quantitative (frequentist) formulation and ought to choose a course of action that is verifiably optimal (or nearly so), without regard to the perceived "attractiveness" of certain dogmas and methods.

List of contents

1 Introduction to Statistical Inference
2 Specification of a Statistical Problem
2.1 Additional Remarks on the Loss Function
3 Classifications of Statistical Problems
4 Some Criteria for Choosing a Procedure
4.1 The Bayes Criterion
4.2 Minimax Criterion
4.3 Randomized Statistical Procedures
4.4 Admissibility: The Geometry of Risk Points
4.5 Computation of Minimax Procedures
4.6 Unbiased Estimation
4.7 The Method of Maximum Likelihood
4.8 Sample Functionals: The Method of Moments
4.9 Other Criteria
5 Linear Unbiased Estimation
5.1 Linear Unbiased Estimation in Simple Settings
5.2 General Linear Models: The Method of Least Squares
5.3 Orthogonalization
5.4 Analysis of the General Linear Model
6 Sufficiency
6.1 On the Meaning of Sufficiency
6.2 Recognizing Sufficient Statistics
6.3 Reconstruction of the Sample
6.4 Sufficiency: "No Loss of Information"
6.5 Convex Loss
7 Point Estimation
7.1 Completeness and Unbiasedness
7.2 The "Information Inequality"
7.3 Invariance
7.4 Computation of Minimax Procedures (Continued)
7.5 The Method of Maximum Likelihood
7.6 Asymptotic Theory
8 Hypothesis Testing
8.1 Introductory Notions
8.2 Testing Between Simple Hypotheses
8.3 Composite Hypotheses: UMP Tests; Unbiased Tests
8.4 Likelihood Ratio (LR) Tests
8.5 Problems Where n Is to Be Found
8.6 Invariance
8.7 Summary of Common "Normal Theory" Tests
9 Confidence Intervals
Appendix A Some Notation, Terminology, and Background Material
Appendix B Conditional Probability and Expectation, Bayes Computations
Appendix C Some Inequalities and Some Minimization Methods
C.1 Inequalities
C.2 Methods of Minimization
References

Product details

Authors Jack C. Kiefer
Assisted by Gary Lorden (Editor)
Publisher Springer, Berlin
 
Languages English
Product format Paperback / Softback
Released 25.07.2012
 
EAN 9781461395805
ISBN 978-1-4613-9580-5
No. of pages 334
Dimensions 155 mm x 236 mm x 20 mm
Illustrations VIII, 334 p.
Series Springer Texts in Statistics
Subject Natural sciences, medicine, IT, technology > Mathematics > Miscellaneous

Customer reviews

No reviews have been written for this item yet.
