Extremely efficient procedures for fitting the entire lasso or elastic-net regularization path for linear regression, logistic and multinomial regression models, Poisson regression, the Cox model, multiple-response Gaussian models, and grouped multinomial regression; see <doi:10.18637/jss.v033.i01> and <doi:10.18637/jss.v039.i05>. There are two important new additions. The family argument can be a GLM family object, which opens the door to any programmed family (<doi:10.18637/jss.v106.i01>). This comes with a modest computational cost, so when the built-in families suffice, they should be used instead. The other novelty is the relax option, which refits each of the active sets in the path unpenalized. The algorithm uses cyclical coordinate descent in a path-wise fashion, as described in the papers cited.
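The core idea behind the algorithm described above can be sketched in a few lines: cyclic coordinate descent with soft-thresholding, run path-wise over a decreasing lambda grid with warm starts. This is a minimal, self-contained illustration in pure Python, not the glmnet implementation itself (glmnet is an R package with a Fortran core); the data, helper names, and lambda grid here are made up for demonstration.

```python
# Hedged sketch of path-wise cyclic coordinate descent for the lasso.
# Objective per lambda: (1/2n) * sum_i (y_i - x_i . beta)^2 + lambda * sum_j |beta_j|

def soft_threshold(z, g):
    """Soft-thresholding operator S(z, g) = sign(z) * max(|z| - g, 0)."""
    if z > g:
        return z - g
    if z < -g:
        return z + g
    return 0.0

def lasso_cd(X, y, lam, beta, n_iter=200):
    """One lasso fit at penalty lam, warm-started at beta (updated in place)."""
    n, p = len(X), len(X[0])
    resid = [y[i] - sum(X[i][j] * beta[j] for j in range(p)) for i in range(n)]
    for _ in range(n_iter):
        for j in range(p):
            # correlation of feature j with the partial residual
            rho = sum(X[i][j] * (resid[i] + X[i][j] * beta[j]) for i in range(n)) / n
            norm_j = sum(X[i][j] ** 2 for i in range(n)) / n
            new_bj = soft_threshold(rho, lam) / norm_j
            if new_bj != beta[j]:
                for i in range(n):  # update residuals incrementally
                    resid[i] += X[i][j] * (beta[j] - new_bj)
                beta[j] = new_bj
    return beta

def lasso_path(X, y, lambdas):
    """Fit the whole path, reusing each solution as the next warm start."""
    beta = [0.0] * len(X[0])
    return [list(lasso_cd(X, y, lam, beta)) for lam in sorted(lambdas, reverse=True)]

# Toy usage: y depends only on the first of two features.
X = [[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]]
y = [2.0, -2.0, 0.0, 0.0]
path = lasso_path(X, y, [10.0, 0.5, 0.0])
```

At a large lambda every coefficient is shrunk to exactly zero; as lambda decreases the solution moves toward the least-squares fit, and each fit starts from the previous one, which is why computing the whole path is cheap relative to a single cold-start fit.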
- Bayesian inference for high-dimensional linear regression under mnet priors
- A maximum entropy copula model for mixed data: representation, estimation and applications
- Adjusted regularized estimation in the accelerated failure time model with high dimensional covariates
- Separating variables to accelerate non-convex regularized optimization
- Nuclear penalized multinomial regression with an application to predicting at bat outcomes in baseball
- Adaptive and reversed penalty for analysis of high-dimensional correlated data
- Two symmetrized coordinate descent methods can be \(O(n^2)\) times slower than the randomized version
- Generalised joint regression for count data: a penalty extension for competitive settings
- Rare feature selection in high dimensions
- On the choice of high-dimensional regression parameters in Gaussian random tomography
- Local Linear Forests
- Estimating time-varying networks
- A cubic spline penalty for sparse approximation under tight frame balanced model
- Discrete-time survival forests with Hellinger distance decision trees
- Sparsity-promoting elastic net method with rotations for high-dimensional nonlinear inverse problem
- Condition estimation for regression and feature selection
- PLS for Big Data: a unified parallel algorithm for regularised group PLS
- Multiple choice from competing regression models under multicollinearity based on standardized update
- Sparse Bayesian imaging of solar flares
- Sparse identification of posynomial models
- Measuring the Stability of Results From Supervised Statistical Learning
- Comparing different propensity score estimation methods for estimating the marginal causal effect through standardization to propensity scores
- Variable selection for semiparametric random-effects conditional density models with longitudinal data
- Model selection for high-dimensional quadratic regression via regularization
- Financial, macro and micro econometrics using R
- Variable selection and forecasting via automated methods for linear models: LASSO/adaLASSO and autometrics
- Robust sparse regression by modeling noise as a mixture of Gaussians
- Fast best subset selection: coordinate descent and local combinatorial optimization algorithms
- An iterative approach to distance correlation-based sure independence screening
- Group descent algorithms for nonconvex penalized linear and logistic regression models with grouped predictors
- Learning Oncogenic Pathways from Binary Genomic Instability Data
- Hierarchical inference for genome-wide association studies: a view on methodology with software
- A distributed algorithm for fitting generalized additive models
- Continuous-time discrete-space models for animal movement
- Wavelet-based scalar-on-function finite mixture regression models
- Lassoing the determinants of retirement
- Recovery of partly sparse and dense signals
- Sparse estimation: an MMSE approach
- Bootstrapping the out-of-sample predictions for efficient and accurate cross-validation
- Multinomial regression with elastic net penalty and its grouping effect in gene selection
- Prediction with a flexible finite mixture-of-regressions
- Single stage prediction with embedded topic modeling of online reviews for mobile app management
- Penalized and ridge-type shrinkage estimators in Poisson regression model
- Structured regularization for conditional Gaussian graphical models
- Sparse estimation via nonconcave penalized likelihood in factor analysis model
- On regularization of generalized maximum entropy for linear models
- Combined \(\ell_1\) and greedy \(\ell_0\) penalized least squares for linear model selection
- A penalty approach to differential item functioning in Rasch models
- Robust variable selection based on the random quantile LASSO
- A statistical mechanics approach to de-biasing and uncertainty estimation in Lasso for random measurements
- A statistical framework for pathway and gene identification from integrative analysis
- UPS delivers optimal phase diagram in high-dimensional variable selection
- Multivariate sparse Laplacian shrinkage for joint estimation of two graphical structures
- Semi-parametric Bayes regression with network-valued covariates
- A phase transition for finding needles in nonlinear haystacks with LASSO artificial neural networks
- Kernel methods in system identification, machine learning and function estimation: a survey
- Spectrally Sparse Nonparametric Regression via Elastic Net Regularized Smoothers
- The variable selection by the Dantzig selector for Cox's proportional hazards model
- Oblique random survival forests
- Sparse high-dimensional regression: exact scalable algorithms and phase transitions
- A nonparametric empirical Bayes approach to large-scale multivariate regression
- Generalized co-sparse factor regression
- Hybrid safe-strong rules for efficient optimization in Lasso-type problems
- Model-free variable selection for conditional mean in regression
- Tuning-free ridge estimators for high-dimensional generalized linear models
- Penalized logspline density estimation using total variation penalty
- Evaluating time series forecasting models: an empirical study on performance estimation methods
- Variance prior forms for high-dimensional Bayesian variable selection
- Sure independence screening for real medical Poisson data
- Identifying soccer players on Facebook through predictive analytics
- Cholesky-based model averaging for covariance matrix estimation
- Orthogonal subsampling for big data linear regression
- A consistent and numerically efficient variable selection method for sparse Poisson regression with applications to learning and signal recovery
- High-dimensional consistency in score-based and hybrid structure learning
- Usage of the GO estimator in high dimensional linear models
- Sparse regression: scalable algorithms and empirical performance
- A Pliable Lasso
- A variable selection approach in the multivariate linear model: an application to LC-MS metabolomics data
- Simple expressions of the Lasso and SLOPE estimators in low-dimension
- Shrinkage priors for Bayesian penalized regression
- Smooth LASSO estimator for the function-on-function linear regression model
- Penalized interaction estimation for ultrahigh dimensional quadratic regression
- Primal path algorithm for compositional data analysis
- An efficient approach for discriminant analysis based on adaptive feature augmentation
- A parallel line search subspace correction method for composite convex optimization
- Data shared Lasso: a novel tool to discover uplift
- A non-convex regularization approach for stable estimation of loss development factors
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- adass
- Discussion: "A significance test for the lasso"
- Discussion: "A significance test for the lasso"
- Discussion: "A significance test for the lasso"
- Discussion: "A significance test for the lasso"
- Regularized estimation in sparse high-dimensional multivariate regression, with application to a DNA methylation study
- Fusion learning algorithm to combine partially heterogeneous Cox models
- A supervised clustering approach for fMRI-based inference of brain states
- How can lenders prosper? Comparing machine learning approaches to identify profitable peer-to-peer loan investments
- A note on the adaptive Lasso for zero-inflated Poisson regression
- IPF-LASSO: integrative \(L_1\)-penalized regression with penalty factors for prediction based on multi-omics data
- Majorization-minimization algorithms for nonsmoothly penalized objective functions
This page was built for software: glmnet