Fr. 142.00
Ilya Narsky, Frank C. Porter
Statistical Analysis Techniques in Particle Physics - Fits, Density Estimation and Supervised Learning
English · Paperback
Usually ships within 3 to 5 weeks
Description
The first book on analysis techniques in particle physics written specifically with physicists in mind, with an emphasis on machine learning.
Based on lectures given by the authors at Stanford and Caltech, this practical text shows by means of worked analysis examples how observables are extracted from data, how signal and background are estimated, and how accurate error estimates are obtained, exploiting uni- and multivariate techniques such as non-parametric density estimation, likelihood fits, neural networks, support vector machines, decision trees, and ensembles of classifiers. It includes simple code snippets that run on popular software suites such as ROOT and Matlab; these either include the code for generating data or use publicly available data that can be downloaded from the Web.
Primarily aimed at master's and advanced undergraduate students, this text is also well suited for self-study and research.
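The book's own snippets target ROOT and Matlab; as a flavor of the kind of technique it covers (kernel density estimation, Chapter 5), here is a minimal NumPy sketch of a Gaussian kernel estimate on synthetic data. This is an illustration only, not code from the book, and the bandwidth value is an arbitrary choice:

```python
import numpy as np

# Synthetic "data": 1000 draws from a standard normal distribution
rng = np.random.default_rng(0)
data = rng.normal(loc=0.0, scale=1.0, size=1000)

def kde(x, sample, bandwidth):
    """Gaussian kernel density estimate of `sample`, evaluated at points `x`."""
    u = (x[:, None] - sample[None, :]) / bandwidth
    kernels = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)
    return kernels.sum(axis=1) / (len(sample) * bandwidth)

grid = np.linspace(-4.0, 4.0, 81)
density = kde(grid, data, bandwidth=0.3)

# A density estimate should integrate to roughly 1 over a wide enough range
dx = grid[1] - grid[0]
print(density.sum() * dx)  # close to 1.0
```

In practice the bandwidth is the critical tuning parameter, which is exactly the kind of optimization question (Sections 5.6 and 5.9) the book treats in detail.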
Table of Contents
1 Why We Wrote This Book and How You Should Read It
2 Parametric Likelihood Fits
2.1 Preliminaries
2.2 Parametric Likelihood Fits
2.3 Fits for Small Statistics
2.4 Results Near the Boundary of a Physical Region
2.5 Likelihood Ratio Test for Presence of Signal
2.6 sPlots
2.7 Exercises
3 Goodness of Fit
3.1 Binned Goodness of Fit Tests
3.2 Statistics Converging to Chi-Square
3.3 Univariate Unbinned Goodness of Fit Tests
3.4 Multivariate Tests
3.5 Exercises
4 Resampling Techniques
4.1 Permutation Sampling
4.2 Bootstrap
4.3 Jackknife
4.4 BCa Confidence Intervals
4.5 Cross-Validation
4.6 Resampling Weighted Observations
4.7 Exercises
5 Density Estimation
5.1 Empirical Density Estimate
5.2 Histograms
5.3 Kernel Estimation
5.4 Ideogram
5.5 Parametric vs. Nonparametric Density Estimation
5.6 Optimization
5.7 Estimating Errors
5.8 The Curse of Dimensionality
5.9 Adaptive Kernel Estimation
5.10 Naive Bayes Classification
5.11 Multivariate Kernel Estimation
5.12 Estimation Using Orthogonal Series
5.13 Using Monte Carlo Models
5.14 Unfolding
5.14.1 Unfolding: Regularization
6 Basic Concepts and Definitions of Machine Learning
6.1 Supervised, Unsupervised, and Semi-Supervised
6.2 Tall and Wide Data
6.3 Batch and Online Learning
6.4 Parallel Learning
6.5 Classification and Regression
7 Data Preprocessing
7.1 Categorical Variables
7.2 Missing Values
7.3 Outliers
7.4 Exercises
8 Linear Transformations and Dimensionality Reduction
8.1 Centering, Scaling, Reflection and Rotation
8.2 Rotation and Dimensionality Reduction
8.3 Principal Component Analysis (PCA)
8.4 Independent Component Analysis (ICA)
8.4.1 Theory
8.5 Exercises
9 Introduction to Classification
9.1 Loss Functions: Hard Labels and Soft Scores
9.2 Bias, Variance, and Noise
9.3 Training, Validating and Testing: The Optimal Splitting Rule
9.4 Resampling Techniques: Cross-Validation and Bootstrap
9.5 Data with Unbalanced Classes
9.6 Learning with Cost
9.7 Exercises
10 Assessing Classifier Performance
10.1 Classification Error and Other Measures of Predictive Power
10.2 Receiver Operating Characteristic (ROC) and Other Curves
10.3 Testing Equivalence of Two Classification Models
10.4 Comparing Several Classifiers
10.5 Exercises
11 Linear and Quadratic Discriminant Analysis, Logistic Regression, and Partial Least Squares Regression
11.1 Discriminant Analysis
11.2 Logistic Regression
11.3 Classification by Linear Regression
11.4 Partial Least Squares Regression
11.5 Example: Linear Models for MAGIC Telescope Data
11.6 Choosing a Linear Classifier for Your Analysis
11.7 Exercises
12 Neural Networks
12.1 Perceptrons
12.2 The Feed-Forward Neural Network
12.3 Backpropagation
12.4 Bayes Neural Networks
12.5 Genetic Algorithms
12.6 Exercises
13 Local Learning and Kernel Expansion
13.1 From Input Variables to the Feature Space
13.2 Regularization
13.3 Making and Choosing Kernels
13.4 Radial Basis Functions
13.5 Support Vector Machines (SVM)
13.6 Empirical Local Methods
13.7 Kernel Methods: The Good, the Bad and the Curse of Dimensionality
13.8 Exercises
14 Decision Trees
14.1 Growing Trees
14.2 Predicting by Decision Trees
14.3 Stopping Rules
14.4 Pruning Trees
14.5 Trees for Multiple Classes
14.6 Splits on Categorical Variables
14.7 Surrogate Splits
14.8 Missing Values
14.9 Variable Importance
14.10 Why Are Decision Trees Good (or Bad)?
14.11 Exercises
15 Ensemble Learning
15.1 Boosting
15.2 Diversifying the Weak Learner: Bagging, Random Subspace and Random Forest
15.3 Choosing an Ensemble for Your Analysis
15.4 Exercises
16 Reducing Multiclass to Binary
16.1 Encoding
16.2 Decoding
16.3 Summary: Choosing the Right Design
17 How to Cho
About the Authors
The authors are experts in the use of statistics in particle physics data analysis. Frank C. Porter is Professor of Physics at the California Institute of Technology and has lectured extensively at Caltech, the SLAC Laboratory at Stanford, and elsewhere. Ilya Narsky is Senior Matlab Developer at The MathWorks, a leading developer of technical computing software for engineers and scientists, and the initiator of StatPatternRecognition, a C++ package for statistical analysis of HEP data. Together, they have taught courses for graduate students and postdocs.
Product Details
Authors | Ilya Narsky, Frank C. Porter |
Publisher | Wiley-VCH |
Language | English |
Format | Paperback |
Published | 01.11.2013 |
EAN | 9783527410866 |
ISBN | 978-3-527-41086-6 |
Pages | 459 |
Dimensions | 172 mm x 25 mm x 240 mm |
Weight | 866 g |
Illustrations | 100 b/w illustrations, 70 tables |
Categories |
Natural sciences, medicine, computer science, technology
> Physics, astronomy
> Atomic and nuclear physics
Statistics, Physics, Mathematics, Data Analysis, Statistical Analysis, Particle Physics, Nuclear & High Energy Physics, Applied Mathematics in Science |