Functional gradient ascent for probit regression
DOI: 10.1016/j.patcog.2012.06.006
zbMath: 1248.68428
OpenAlex: W1990158361
MaRDI QID: Q454437
Publication date: 5 October 2012
Published in: Pattern Recognition
Full work available at URL: https://doi.org/10.1016/j.patcog.2012.06.006
Classification (MSC):
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- Greedy function approximation: A gradient boosting machine
- Boosting algorithms: regularization, prediction and model fitting
- Bayesian binary kernel probit model for microarray based cancer classification and gene selection
- The lower bound method in probit regression
- A decision-theoretic generalization of on-line learning and an application to boosting
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Bayesian Variable Selection in Multinomial Probit Models to Identify Molecular Signatures of Disease Stage
- Parameter expansion to accelerate EM: the PX-EM algorithm
- Boosting with the L2 loss
- Parameter Expansion for Data Augmentation