The following pages link to Leandro Pardo (Q356492):
Displaying 50 items.
- (Q215859) (redirect page) (← links)
- \(\phi\)-divergence based procedure for parametric change-point problems (Q267862) (← links)
- Influence analysis of robust Wald-type tests (Q272063) (← links)
- Poisson loglinear modeling with linear constraints on the expected cell frequencies (Q356493) (← links)
- The power divergence and the density power divergence families: the mathematical connection (Q361226) (← links)
- Julio Angel Pardo Llorente, 1960--2013 (Associate Editor of Statistical Papers) (Q379939) (← links)
- Change-point detection in multinomial data using phi-divergence test statistics (Q391620) (← links)
- Divergence-based tests of homogeneity for spatial data (Q465634) (← links)
- New improved estimators for overdispersion in models with clustered multinomial data and unequal cluster sizes (Q517401) (← links)
- Rényi statistics for testing equality of autocorrelation coefficients (Q537378) (← links)
- Minimum disparity inference and the empty cell penalty: asymptotic results (Q541774) (← links)
- On the comparison of the pre-test and shrinkage phi-divergence test estimators for the symmetry model of categorical data (Q609203) (← links)
- Testing the order of Markov dependence in DNA sequences (Q631473) (← links)
- On testing homogeneity of variances for nonnormal models using entropy (Q635882) (← links)
- The Jensen-Shannon divergence (Q677507) (← links)
- Robust Wald-type tests for non-homogeneous observations based on the minimum density power divergence estimator (Q723446) (← links)
- The chi-square divergence measure in random sampling with Dirichlet process priors (Q750026) (← links)
- Divergence-based estimation and testing with misclassified data (Q816556) (← links)
- Minimum \(\phi\)-divergence estimator in logistic regression models (Q819428) (← links)
- Preliminary phi-divergence test estimators for linear restrictions in a logistic regression model (Q840971) (← links)
- Ordering and selecting extreme populations by means of entropies and divergences (Q843140) (← links)
- Hypothesis testing for two discrete populations based on the Hellinger distance (Q844873) (← links)
- Residual analysis and outliers in loglinear models based on phi-divergence statistics (Q872095) (← links)
- The moment-corrected phi-divergence test statistics for symmetry (Q875375) (← links)
- On Christensen's conjecture (Q882903) (← links)
- Connections between some criteria to compare fuzzy information systems (Q918519) (← links)
- New families of estimators and test statistics in log-linear models (Q943599) (← links)
- Minimum phi-divergence estimators for loglinear models with linear constraints and multinomial sampling (Q949420) (← links)
- Robust median estimator in logistic regression (Q951041) (← links)
- On tests of homogeneity based on minimum \(\varphi\)-divergence estimator with constraints (Q951919) (← links)
- Preliminary test estimators and phi-divergence measures in generalized linear models with binary data (Q957310) (← links)
- Preliminary Phi-divergence test estimator for multinomial probabilities (Q959268) (← links)
- Cressie and Read power-divergences as influence measures for logistic regression models (Q959404) (← links)
- A new family of BAN estimators for polytomous logistic regression models based on \(\varphi\)-divergence measures (Q1001725) (← links)
- On tests of independence based on minimum \(\varphi\)-divergence estimator with constraints: An application to modeling DNA (Q1010450) (← links)
- An extension of likelihood-ratio-test for testing linear hypotheses in the baseline-category logit model (Q1023473) (← links)
- Order-\(\alpha\) weighted information energy (Q1094386) (← links)
- A sequential selection method of a fixed number of fuzzy information systems based on the information energy gain (Q1101423) (← links)
- Sufficient fuzzy information systems (Q1121856) (← links)
- A test for homogeneity of variances based on Shannon's entropy (Q1126046) (← links)
- Some bounds on probability of error in fuzzy discrimination problems (Q1178531) (← links)
- \((R,S)\)-information radius of type \(t\) and comparison of experiments (Q1184942) (← links)
- Fuzziness in the experimental outcomes: Comparing experiments and removing the loss of information (Q1193805) (← links)
- Informational energy in the sequential design of experiments in a Bayesian context (Q1196697) (← links)
- \((h,\Psi)\)-entropy differential metric (Q1265614) (← links)
- A characterization of monotone and regular divergences (Q1280559) (← links)
- Informational distances and related statistics in mixed continuous and categorical variables (Q1299007) (← links)
- Statistical inference for finite Markov chains based on divergences (Q1304078) (← links)
- The generalized entropy measure to the design and comparison of regression experiment in a Bayesian context (Q1310934) (← links)
- Some statistical applications of generalized Jensen difference divergence measures for fuzzy information systems (Q1311716) (← links)