Hastie, Trevor; Tibshirani, Robert; Wainwright, Martin

Statistical Learning with Sparsity: The Lasso and Generalizations. Boca Raton: CRC Press, 2015. xiii, 351 p. (Chapman & Hall/CRC Monographs on Statistics & Applied Probability).

Table of Contents:

1. Introduction

2. The Lasso for Linear Models
The Lasso Estimator
Cross-Validation and Inference
Computation of the Lasso Solution
Degrees of Freedom
Uniqueness of the Lasso Solutions
A Glimpse at the Theory
The Nonnegative Garrote
ℓq Penalties and Bayes Estimates
Some Perspective

3. Generalized Linear Models
Logistic Regression
Multiclass Logistic Regression
Log-Linear Models and the Poisson GLM
Cox Proportional Hazards Models
Support Vector Machines
Computational Details and glmnet

4. Generalizations of the Lasso Penalty
The Elastic Net
The Group Lasso
Sparse Additive Models and the Group Lasso
The Fused Lasso
Nonconvex Penalties

5. Optimization Methods
Convex Optimality Conditions
Gradient Descent
Coordinate Descent
A Simulation Study
Least Angle Regression
Alternating Direction Method of Multipliers
Minorization-Maximization Algorithms
Biconvexity and Alternating Minimization
Screening Rules

6. Statistical Inference
The Bayesian Lasso
The Bootstrap
Post-Selection Inference for the Lasso
Inference via a Debiased Lasso
Other Proposals for Post-Selection Inference

7. Matrix Decompositions, Approximations, and Completion
The Singular Value Decomposition
Missing Data and Matrix Completion
Reduced-Rank Regression
A General Matrix Regression Framework
Penalized Matrix Decomposition
Additive Matrix Decomposition

8. Sparse Multivariate Methods
Sparse Principal Components Analysis
Sparse Canonical Correlation Analysis
Sparse Linear Discriminant Analysis
Sparse Clustering

9. Graphs and Model Selection
Basics of Graphical Models
Graph Selection via Penalized Likelihood
Graph Selection via Conditional Inference
Graphical Models with Hidden Variables

10. Signal Approximation and Compressed Sensing
Signals and Sparse Representations
Random Projection and Approximation
Equivalence between ℓ0 and ℓ1 Recovery

11. Theoretical Results for the Lasso
Bounds on Lasso ℓ2-error
Bounds on Prediction Error
Support Recovery in Linear Regression
Beyond the Basic Lasso

Discover New Methods for Dealing with High-Dimensional Data

A sparse statistical model has only a small number of nonzero parameters or weights; therefore, it is much easier to estimate and interpret than a dense model. Statistical Learning with Sparsity: The Lasso and Generalizations presents methods that exploit sparsity to help recover the underlying signal in a set of data.
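To make "sparse" concrete, the lasso (the book's central method) estimates a linear model under an ℓ1 penalty, which drives many coefficients exactly to zero. The standard formulation is:

```latex
\hat{\beta} = \arg\min_{\beta \in \mathbb{R}^p}
\left\{ \frac{1}{2N} \sum_{i=1}^{N} (y_i - x_i^\top \beta)^2
      + \lambda \|\beta\|_1 \right\},
\qquad \|\beta\|_1 = \sum_{j=1}^{p} |\beta_j|.
```

Larger values of the tuning parameter λ yield sparser solutions; λ is typically chosen by cross-validation.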

Top experts in this rapidly evolving field, the authors describe the lasso for linear regression and a simple coordinate descent algorithm for its computation. They discuss the application of ℓ1 penalties to generalized linear models and support vector machines, cover generalized penalties such as the elastic net and group lasso, and review numerical methods for optimization. They also present statistical inference methods for fitted (lasso) models, including the bootstrap, Bayesian methods, and recently developed approaches. In addition, the book examines matrix decomposition, sparse multivariate analysis, graphical models, and compressed sensing. It concludes with a survey of theoretical results for the lasso.
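The coordinate descent algorithm mentioned above can be sketched in a few lines: cycle over coordinates, and update each one via the soft-thresholding operator, which solves the one-dimensional lasso problem in closed form. This is a minimal illustration of the general idea, not the glmnet implementation (which adds standardization, warm starts, and screening rules); all names here are hypothetical.

```python
import numpy as np

def soft_threshold(z, gamma):
    # Soft-thresholding: closed-form minimizer of the 1-D lasso problem.
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def lasso_coordinate_descent(X, y, lam, n_iter=100):
    """Cyclic coordinate descent for
        (1/2n) * ||y - X b||^2 + lam * ||b||_1.
    A bare-bones sketch: fixed iteration count, no convergence check."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual with coordinate j removed from the fit.
            r_j = y - X @ beta + X[:, j] * beta[j]
            z = X[:, j] @ r_j / n
            # Soft-threshold, then rescale by the coordinate's curvature.
            beta[j] = soft_threshold(z, lam) / (X[:, j] @ X[:, j] / n)
    return beta
```

On noiseless data with a single active feature, a small λ recovers a slightly shrunken version of the true coefficient while the irrelevant coefficients stay at zero, matching the sparsity behavior the blurb describes.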

In this age of big data, the number of features measured on a person or object can be large and might be larger than the number of observations. This book shows how the sparsity assumption allows us to tackle these problems and extract useful and reproducible patterns from big datasets. Data analysts, computer scientists, and theorists will appreciate this thorough and up-to-date treatment of sparse statistical modeling.



Mathematical statistics
Least squares
Linear models (Statistics)
Proof theory

519.5 / H2S8
