Scientific article

From MaRDI portal
Publication:3995624

zbMath: 0800.68508
MaRDI QID: Q3995624

Jorma Rissanen

Publication date: 17 September 1992

Title: zbMATH Open Web Interface contents unavailable due to conflicting licenses.

Related Items

The polyharmonic local sine transform: a new tool for local image analysis and synthesis without edge effect
Law of iterated logarithm and consistent model selection criterion in logistic regression
Minimum entropy of error principle in estimation
Selecting optimal multistep predictors for autoregressive processes of unknown order.
On Rissanen's predictive stochastic complexity for stationary ARMA processes
Bounded rationality, neural network and folk theorem in repeated games with discounting
Optimal embedding parameters: a modelling paradigm
On consistency of minimum description length model selection for piecewise autoregressions
Detecting abrupt changes in the spectra of high-energy astrophysical sources
Recognition
Statistical optimization for geometric fitting: theoretical accuracy bound and high order error analysis
On universal prediction and Bayesian confirmation
'Ideal learning' of natural language: positive results about learning from positive evidence
Suboptimal behavior of Bayes and MDL in classification under misspecification
Unsupervised learning of arbitrarily shaped clusters using ensembles of Gaussian models
Nonlinear dynamical system identification with dynamic noise and observational noise
Local discriminant bases and their applications
Modeling chaotic motions of a string from experimental data
The discovery of algorithmic probability
Stochastic complexity in learning
Learning about the parameter of the Bernoulli model
On-line maximum likelihood prediction with respect to general loss functions
Open problems in universal induction & intelligence
Divergence rates of Markov order estimators and their application to statistical estimation of stationary ergodic processes
Classification of binary vectors by stochastic complexity.
A network of autoregressive processing units for time series modeling
From \(\varepsilon\)-entropy to KL-entropy: analysis of minimum information complexity density estimation
The ambiguity of simplicity in quantum and classical simulation
Locally uniform prior distributions
An entropy criterion for assessing the number of clusters in a mixture model
Averaging over decision trees
Test for homogeneity of several populations by stochastic complexity
Fast nonparametric active contour adapted to quadratic inhomogeneous intensity fluctuations
A unified perspective and new results on RHT computing, mixture based learning, and multi-learner based problem solving
Three-objective subgraph mining using multiobjective evolutionary programming
Effective complexity of stationary process realizations
Predicting a binary sequence almost as well as the optimal biased coin
Scalability of the Bayesian optimization algorithm.
The generalized universal law of generalization.
Study and development of the DTD generation system for XML documents
Image transforms for determining fit-for-purpose complexity of geostatistical models in flow modeling
Local bandwidth selection via second derivative segmentation
Structure detection and parameter estimation for NARX models in a unified EM framework
Using the minimum description length to discover the intrinsic cardinality and dimensionality of time series
On image segmentation using information theoretic criteria
Towards long-term prediction
Automatic identification of rock fracture sets using finite mixture models
The decomposed normalized maximum likelihood code-length criterion for selecting hierarchical latent variable models
Embedding as a modeling problem
Measures of statistical complexity: why?
Comparisons of new nonlinear modeling techniques with applications to infant respiration.
Variable length Markov chains
Strong approximation of vector-valued stochastic integrals
A two-stage information criterion for stochastic systems revisited
Akaike's information criterion and recent developments in information complexity
Model selection based on minimum description length
Law of iterated logarithm and model selection consistency for generalized linear models with independent and dependent responses
Parameter uncertainties in models of equivariant dynamical systems
Minimum message length estimation using EM methods: a case study
Data compression and histograms
Generalized threshold latent variable model
Counterexamples to parsimony and BIC
The polynomial Fourier transform with minimum mean square error for noisy data
Improving optical Fourier pattern recognition by accommodating the missing information
An MDL approach to the climate segmentation problem
Discovery of time-series motif from multi-dimensional data based on MDL principle
Machine learning problems from optimization perspective
Thermodynamics of natural selection. III: Landauer's principle in computation and chemistry
Clustering of fuzzy data and simultaneous feature selection: a model selection approach
Minimum message length inference of the Poisson and geometric models using heavy-tailed prior distributions
Data mining based Bayesian networks for best classification
Model selection and mixed-effects modeling of HIV infection dynamics
Model selection and prediction: Normal regression
Spatial sampling design based on stochastic complexity.
Minimum message length estimation of mixtures of multivariate Gaussian and von Mises-Fisher distributions
Models of knowing and the investigation of dynamical systems
Minimum message length shrinkage estimation
The calculi of emergence: Computation, dynamics and induction
On some properties of the NML estimator for Bernoulli strings
Multiple changepoint detection with partial information on changepoint times
Regression spline smoothing using the minimum description length principle
Brownian warps for non-rigid registration
Modeling nonlinear dynamics and chaos: a review
On the computation of entropy prior complexity and marginal prior distribution for the Bernoulli model
Classification using proximity catch digraphs
Applying MDL to learn best model granularity
On selecting models for nonlinear time series
Structural break estimation of noisy sinusoidal signals
Failure-time prediction
Stochastic complexity and model selection from incomplete data
On model selection via stochastic complexity in robust linear regression
Information and complexity, or: where is the information?
The consistency of the BIC Markov order estimator.
Asymptotically minimax regret procedures in regression model selection and the magnitude of the dimension penalty.
A maximum likelihood method for latent class regression involving a censored dependent variable
On the complexity of additive clustering models
Strong approximation of the recursive prediction error estimator of the parameters of an ARMA process
Efficient data reconciliation
On the martingale approximation of the estimation error of ARMA parameters
A wavelet regularization method for diffusion radar-target imaging and speckle noise reduction
Heuristic, systematic, and informational regularization for process monitoring
The Whole and the Parts: The Minimum Description Length Principle and the A-Contrario Framework
On model selection for dense stochastic block models
Quantum information criteria for model selection in quantum state estimation
Determining the number of terms in a trigonometric regression
Real patterns and indispensability
Data driven versions of Pearson's chi-square test for uniformity
Entropy concentration and the empirical coding game
Robust image segmentation via Bayesian type criterion
Machine Learning: Deepest Learning as Statistical Data Assimilation Problems
Robust growing neural gas algorithm with application in cluster analysis
A unified view on clustering binary data
Selecting nonlinear stochastic process rate models using information criteria
Structure-selection techniques applied to continuous-time nonlinear models
An algebra of human concept learning
Evaluating the performance of cost-based discretization versus entropy- and error-based discretization
Complexity through nonextensivity
Network tomography: recent developments
Accumulative prediction error and the selection of time series models
Model selection by normalized maximum likelihood
An empirical study of minimum description length model selection with infinite parametric complexity
Stochastic motion and the level set method in computer vision: stochastic active contours
Flexible scan statistic test to detect disease clusters in hierarchical trees
A BYY scale-incremental EM algorithm for Gaussian mixture learning
Exploring Validity Indices for Clustering Textual Data
Subset Selection in Linear Regression using Sequentially Normalized Least Squares: Asymptotic Theory
Structural changes estimation for strongly dependent processes
Estimation of General Stationary Processes by Variable Length Markov Chains
Minimum description length modelling of musical structure
Varieties of Helmholtz machine
Minimum description length revisited
Simultaneous modeling of nonlinear deterministic and stochastic dynamics
Consistency of the BIC order estimator
A comparison of automatic histogram constructions
Statistical learning theory, model identification and system information content