Refining neural network predictions using background knowledge
DOI: 10.1007/s10994-023-06310-3 · zbMATH Open: 1518.68295 · arXiv: 2206.04976 · MaRDI QID: Q6176232
Authors: Alessandro Daniele, Emile van Krieken, Luciano Serafini, Frank van Harmelen
Publication date: 22 August 2023
Published in: Machine Learning
Full work available at URL: https://arxiv.org/abs/2206.04976
MSC classification
- Learning and adaptive systems in artificial intelligence (68T05)
- Problem solving in the context of artificial intelligence (heuristics, search strategies, etc.) (68T20)
- Reasoning under uncertainty in the context of artificial intelligence (68T37)
Cites Work
- SATLIB: An online resource for research on SAT
- Inequalities: theory of majorization and its applications
- Fuzzy implications
- Triangular norms. Position paper II: General constructions and parameterized families
- Subgradient Criteria for Monotonicity, The Lipschitz Condition, and Convexity
- Semantic-based regularization for learning and inference
- Analyzing differentiable fuzzy logic operators
- Cone monotonicity: structure theorem, properties, and comparisons to other notions of monotonicity
- Schur-concave triangular norms: Characterization and application in pFCSP
- Logic tensor networks
- Multi-Label Classification Neural Networks with Hard Logical Constraints