Pages that link to "Item:Q1122259"
From MaRDI portal
The following pages link to Akaike's information criterion and Kullback-Leibler loss for histogram density estimation (Q1122259):
Displaying 16 items.
- Hellinger distance and Akaike's information criterion for the histogram (Q689552)
- Bin width selection in multivariate histograms by the combinatorial method (Q882926)
- Contrast-based information criterion for ergodic diffusion processes from discrete observations (Q904080)
- On the estimation of entropy (Q1260701)
- Hellinger distance and Kullback-Leibler loss for the kernel density estimator (Q1314730)
- Detecting conditional independence for modeling non-Gaussian time series (Q2131924)
- Optimizing Time Histograms for Non-Poissonian Spike Trains (Q2887006)
- How many bins should be put in a regular histogram (Q3373752)
- Maximum smoothed likelihood density estimation (Q3432407)
- Estimation of the Entropy Functional from Dependent Samples (Q3593576)
- A Method for Selecting the Bin Size of a Time Histogram (Q3593960)
- Cross-validated density estimates based on Kullback–Leibler information (Q4831088)
- A comparison of automatic histogram constructions (Q5851017)
- Information measures of kernel estimation (Q5860908)
- Uncertainty, information, and disagreement of economic forecasters (Q5864649)
- Fast and fully-automated histograms for large-scale data sets (Q6167056)