Algorithms for approximate linear regression design with application to a first order model with heteroscedasticity
From MaRDI portal
Publication:1621395
DOI: 10.1016/j.csda.2013.07.029
zbMath: 1471.62069
MaRDI QID: Q1621395
Rainer Schwabe, Norbert Gaffke, Ulrike Graßhoff
Publication date: 8 November 2018
Published in: Computational Statistics and Data Analysis
Full work available at: https://doi.org/10.1016/j.csda.2013.07.029
Keywords: information matrix; efficiency; steepest descent; quasi-Newton method; optimality criterion; invariant design
MSC classes:
62-08: Computational methods for problems pertaining to statistics
62J05: Linear regression; mixed models
62K05: Optimal statistical designs
Related Items
- Empirical likelihood based diagnostics for heteroscedasticity in semiparametric varying-coefficient partially linear errors-in-variables models
- Special issue on algorithms for design of experiments
- Quasi-Newton algorithm for optimal approximate linear regression design: optimization in matrix space
- Approximate optimal designs for multivariate polynomial regression
- Locally optimal designs for gamma models
Cites Work
- Optimal design for linear regression models in the presence of heteroscedasticity caused by random coefficients
- Active constraints, indefinite quadratic test problems, and complexity
- Minimizing pseudoconvex functions on convex compact sets
- Convergent design sequences, for sufficiently regular optimality criteria
- Some algorithmic aspects of the theory of optimal designs
- Polynomial time algorithms for some classes of constrained nonconvex quadratic problems
- On a class of algorithms from experimental design theory
- Second order methods for solving extremum problems from optimal linear regression design
- Optimal Design of Experiments
- State Constraints in Convex Control Problems of Bolza
- The Sequential Generation of $D$-Optimum Experimental Designs