Reasoning, nonmonotonicity and learning in connectionist networks that capture propositional knowledge
From MaRDI portal
Publication:1855237
DOI: 10.1016/0004-3702(94)00032-V
zbMath: 1013.68505
MaRDI QID: Q1855237
Publication date: 4 February 2003
Published in: Artificial Intelligence
68T05: Learning and adaptive systems in artificial intelligence
68T27: Logic in artificial intelligence
03B70: Logic in computer science
Related Items
- Artificial nonmonotonic neural networks
- \(\text{DA}^2\) merging operators
- Compiling propositional weighted bases
- How to decide what to do?
- Nonmonotonic inferences and neural networks
- Sequential inference with reliable observations: Learning to construct force-dynamic models
- Recurrent neural networks with backtrack-points and negative reinforcement applied to cost-based abduction
Cites Work
- A mathematical treatment of defeasible reasoning and its implementation.
- "Neural computation of decisions in optimization problems"
- On the stability of the travelling salesman problem algorithm of Hopfield and Tank
- A logical framework for default reasoning
- A logic for default reasoning
- Circumscription - a form of non-monotonic reasoning
- Erratum to: "What does a conditional knowledge base entail?"
- Some results on the computational complexity of symmetric connectionist networks.
- On inference from inconsistent premisses
- Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images
- A theory of the learnable
- Neural networks and physical systems with emergent collective computational abilities.
- Neurons with graded response have collective computational properties like those of two-state neurons.
- A Machine-Oriented Logic Based on the Resolution Principle
- A Computing Procedure for Quantification Theory