The following pages link to Oliver Johnson (Q215737):
Displayed 46 items.
- Discrete versions of the transport equation and the Shepp-Olkin conjecture (Q272953)
- Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures (Q385153)
- Compound Poisson approximation via information functionals (Q638324)
- Central limit theorem and convergence to stable laws in Mallows distance (Q817972)
- Acknowledgement of priority: "Central limit theorem and convergence to stable laws in Mallows distance" (Q850739)
- Log-concavity and the maximum entropy property of the Poisson distribution (Q885265)
- Some results concerning maximum Rényi entropy distributions (Q885278)
- Entropy and convergence on compact groups (Q1592279)
- An extremal property of the normal distribution, with a discrete analog (Q1726784)
- Entropy inequalities and the central limit theorem (Q1877517)
- Fisher information inequalities and the central limit theorem (Q1881640)
- Bounds on the Poincaré constant under negative dependence (Q1950656)
- Theoretical properties of Cook's PFC dimension reduction algorithm for linear regression (Q1951774)
- A proof of the Shepp-Olkin entropy monotonicity conjecture (Q2279321)
- A proof of the Shepp-Olkin entropy concavity conjecture (Q2405168)
- Entropy and thinning of discrete random variables (Q2406336)
- Group Testing Algorithms: Bounds and Simulations (Q2986358)
- (Q3158591)
- A Conditional Entropy Power Inequality for Dependent Variables (Q3547090)
- Entropy and the Law of Small Numbers (Q3547166)
- Information inequalities and a dependent Central Limit Theorem (Q4330955)
- Entropy and a generalisation of “Poincaré's Observation” (Q4461545)
- Relaxation of monotone coupling conditions: Poisson approximation and beyond (Q4555288)
- Strong converses for group testing from finite blocklength results (Q4589415)
- Performance of Group Testing Algorithms With Near-Constant Tests Per Item (Q4615335)
- Noisy Non-Adaptive Group Testing: A (Near-)Definite Defectives Approach (Q5124399)
- A natural derivative on [0, n] and a binomial Poincaré inequality (Q5174376)
- Group Testing: An Information Theory Perspective (Q5213209)
- Interference Alignment-Based Sum Capacity Bounds for Random Dense Gaussian Interference Networks (Q5281126)
- Monotonicity, Thinning, and Discrete Versions of the Entropy Power Inequality (Q5281202)
- Thinning, Entropy, and the Law of Thin Numbers (Q5281286)
- Preservation of log-concavity on summation (Q5429575)
- An Information-Theoretic Central Limit Theorem for Finitely Susceptible FKG Systems (Q5472374)
- A Central Limit Theorem for Non-Overlapping Return Times (Q5488986)
- Entropy and random vectors (Q5949315)
- A de Bruijn identity for symmetric stable laws (Q6245438)
- The capacity of non-identical adaptive group testing (Q6255099)
- Thinning and Information Projections (Q6269380)
- A de Bruijn identity for discrete random variables (Q6282351)
- A proof of the Shepp-Olkin entropy monotonicity conjecture (Q6308643)
- A negative binomial approximation in group testing (Q6393739)
- Small error algorithms for tropical group testing (Q6451031)
- The von Neumann entropy and information rate for ideal quantum Gibbs ensembles (Q6468608)
- The von Neumann entropy and information rate for integrable quantum Gibbs ensembles, 2 (Q6468928)
- Convergence of the empirical process in Mallows distance, with an application to bootstrap performance (Q6474185)
- The empirical process in Mallows distance, with application to goodness-of-fit tests (Q6475428)