Kernel negative \(\varepsilon\) dragging linear regression for pattern classification (Q1693779)
From MaRDI portal
Cites:
- The Collinearity Problem in Linear Regression. The Partial Least Squares (PLS) Approach to Generalized Inverses
- Learning a Nonnegative Sparse Graph for Linear Regression
- Correlated Logistic Model With Elastic Net Regularization for Multilabel Image Classification
- Kernel nearest-neighbor algorithm
- Kernel Sparse Representation-Based Classifier
- Local Rademacher Complexity for Multi-Label Learning
scientific article
Language | Label | Description | Also known as
---|---|---|---
English | Kernel negative \(\varepsilon\) dragging linear regression for pattern classification | scientific article |
Statements
Kernel negative \(\varepsilon\) dragging linear regression for pattern classification (English)
31 January 2018
Summary: Linear Regression (LR) and its variants have been widely used for classification problems. However, they usually predefine a strict binary label matrix that has no freedom to fit the samples, and they cannot handle complex real-world applications such as face recognition, where samples may not be linearly separable owing to varying poses, expressions, and illumination conditions. Therefore, in this paper, we propose the Kernel Negative \(\varepsilon\) Dragging Linear Regression (KNDLR) method for robust classification of noisy, nonlinear data. First, a technique called negative \(\varepsilon\) dragging is introduced to relax the class labels; it is integrated into the LR classification model so that the class margins of conventional linear regression are treated properly and robust results are obtained. Then, the data are implicitly mapped into a high-dimensional kernel space, via the nonlinear mapping determined by a kernel function, to make them more nearly linearly separable. Finally, the resulting KNDLR method partially alleviates overfitting and performs classification well on noisy and deformable data. Experimental results show that the KNDLR algorithm achieves better generalization and yields more robust classification decisions.
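The summary combines two ingredients: a label-relaxation step (ε-dragging, which lets the regression targets move away from the rigid 0/1 label matrix) and an implicit kernel mapping. The paper's exact objective function is not reproduced in this record, so the following NumPy sketch is only a hypothetical reconstruction of that general scheme; the function names, the RBF kernel choice, the ridge penalty `lam`, and the cap of the dragging amounts at `eps` are all assumptions, not the authors' formulation.

```python
import numpy as np

def kndlr_fit(X, y, n_classes, gamma=1.0, lam=1e-2, eps=0.1, n_iter=10):
    """Sketch: kernel regression with epsilon-dragging label relaxation.

    Hypothetical reconstruction of the scheme described in the abstract;
    not the paper's exact objective. Alternates between solving a kernel
    ridge regression toward relaxed targets and updating nonnegative
    dragging amounts (assumed capped at `eps`).
    """
    n = X.shape[0]
    # RBF kernel matrix on the training samples (implicit nonlinear mapping)
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))
    # Strict one-hot label matrix Y and dragging-direction signs B:
    # +1 for the true class (drag target upward), -1 otherwise (downward)
    Y = np.zeros((n, n_classes))
    Y[np.arange(n), y] = 1.0
    B = np.where(Y == 1.0, 1.0, -1.0)
    M = np.zeros_like(Y)  # nonnegative dragging amounts, learned below
    for _ in range(n_iter):
        T = Y + B * M                                   # relaxed targets
        A = np.linalg.solve(K + lam * np.eye(n), T)     # kernel ridge solve
        P = K @ A                                       # training predictions
        # Drag each target only in its allowed direction, by at most eps
        M = np.clip(B * (P - Y), 0.0, eps)
    return A

def kndlr_predict(A, X_train, X_test, gamma=1.0):
    """Classify test points by the largest regressed class score."""
    sq_tr = np.sum(X_train ** 2, axis=1)
    sq_te = np.sum(X_test ** 2, axis=1)
    K = np.exp(-gamma * (sq_te[:, None] + sq_tr[None, :]
                         - 2.0 * X_test @ X_train.T))
    return np.argmax(K @ A, axis=1)

# Minimal usage on two synthetic, well-separated clusters
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(3.0, 1.0, (20, 2)), rng.normal(-3.0, 1.0, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
A = kndlr_fit(X, y, n_classes=2)
pred = kndlr_predict(A, X, X)
```

The alternation mirrors the usual ε-dragging idea from discriminative least-squares regression: the regression is refit toward targets that are allowed to drift outward from 0/1, which enlarges class margins without changing the predicted class.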
linear regression
pattern classification
complex real-world applications
kernel negative \(\varepsilon\) dragging linear regression