Comparison between various regression depth methods and the support vector machine to approximate the minimum number of misclassifications (Q1855638)

From MaRDI portal
 

Language: English
Label: Comparison between various regression depth methods and the support vector machine to approximate the minimum number of misclassifications
Description: scientific article

    Statements

    Comparison between various regression depth methods and the support vector machine to approximate the minimum number of misclassifications (English)
    6 February 2003
    This paper deals with the following problem: given a set of observations \(z_{n}=\{(x_{i,1},\dots,x_{i,p-1},y_{i}):\;i=1,\dots,n\}\subset R^{p}\), where \(x_{i}=(x_{i,1},\dots,x_{i,p-1})\in R^{p-1}\) are the covariate vectors and \(y_{i}\in\{0,1\}\), \(i=1,\dots,n\), are binary responses, find an affine hyperplane, defined via \(\theta\in R^{p}\), that allows a good classification of the responses. A central role for the existence and quality of such estimates is played by the quantity \(n_{co}\), defined as the minimum number of misclassifications that any affine hyperplane must incur on the given data set \(z_{n}\). The authors introduce two new methods that provide reasonable approximations of this minimum number of misclassifications. The proposed methods are modifications of the regression depth method; they employ the support vector machine in a way specially adapted to the regression depth problem. None of the considered approximation algorithms outperforms all the others, but the authors give recommendations on how to choose a reasonable algorithm. Pseudocode for a heuristic method is presented.
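    The following short sketch is meant only to make the quantity \(n_{co}\) concrete; it is not the authors' algorithm. It counts the misclassifications that a given affine hyperplane \(\theta=(b,w)\in R^{p}\) incurs on a binary data set and, as one possible heuristic, uses a linear support vector machine (here scikit-learn's LinearSVC, an assumed dependency) to obtain an upper bound on \(n_{co}\); the function names and the synthetic data are illustrative.

    import numpy as np
    from sklearn.svm import LinearSVC  # assumption: scikit-learn is available

    def misclassifications(theta, X, y):
        """Count points misclassified by the affine hyperplane theta = (b, w):
        a point x is labelled 1 if b + w @ x >= 0 and 0 otherwise."""
        b, w = theta[0], theta[1:]
        pred = (X @ w + b >= 0).astype(int)
        return int(np.sum(pred != y))

    def svm_upper_bound_n_co(X, y, C=1.0):
        """Heuristic upper bound on n_co: the misclassification count of the
        hyperplane returned by a linear support vector machine."""
        clf = LinearSVC(C=C).fit(X, y)
        theta = np.concatenate(([clf.intercept_[0]], clf.coef_[0]))
        return misclassifications(theta, X, y), theta

    # Tiny usage example with synthetic, almost linearly separable data.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] + 0.5 * X[:, 1] + 0.2 * rng.normal(size=200) >= 0).astype(int)
    count, theta = svm_upper_bound_n_co(X, y)
    print("SVM-based upper bound on n_co:", count)

    Any fixed hyperplane only yields an upper bound on \(n_{co}\); computing the exact minimum is a hard combinatorial problem, which is why heuristic approximations such as those compared in the paper are of interest.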
    Keywords: regression depth methods; support vector machines; approximations; minimum number of misclassifications