Learning distributions by their density levels: A paradigm for learning without a teacher
Publication: 1370866
DOI: 10.1006/jcss.1997.1507
zbMath: 0880.68106
OpenAlex: W2005530305
MaRDI QID: Q1370866
Shai Ben-David, Michael Lindenbaum
Publication date: 7 January 1998
Published in: Journal of Computer and System Sciences
Full work available at URL: https://semanticscholar.org/paper/cda0324c7f815d53d75149ea97b122f15e5b2f1c
Mathematics Subject Classification: Learning and adaptive systems in artificial intelligence (68T05); Parallel algorithms in computer science (68W10)
Related Items
- Fully adaptive density-based clustering
- Learning rates for density level detection
- Estimating the Support of a High-Dimensional Distribution
- Support vector machines for default prediction of SMEs based on technology credit
- The computational complexity of densest region detection
Cites Work
- Lower bounds for sampling algorithms for estimating the average
- Learnability with respect to fixed distributions
- A general lower bound on the number of examples needed for learning
- Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers
- General bounds on the number of examples needed for learning probabilistic concepts
- On the learnability of discrete distributions
- Probably Approximate Learning of Sets and Functions
- Learnability and the Vapnik-Chervonenkis dimension
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities