Sentence entailment in compositional distributional semantics
From MaRDI portal
Abstract: Distributional semantic models provide vector representations for words by gathering co-occurrence frequencies from corpora of text. Compositional distributional models extend these from words to phrases and sentences. In categorical compositional distributional semantics, phrase and sentence representations are functions of their grammatical structure and of the representations of the words therein. In this setting, grammatical structures are formalised by morphisms of a compact closed category and meanings of words by objects of the same category. These can be instantiated in the form of vectors or density matrices. This paper concerns the application of this model to phrase- and sentence-level entailment. We argue that entropy-based distances of vectors and density matrices provide a good candidate for measuring word-level entailment, show the advantage of density matrices over vectors for word-level entailment, and prove that these distances extend compositionally from words to phrases and sentences. We exemplify our theoretical constructions on real data and a toy entailment dataset and provide preliminary experimental evidence.
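The entropy-based distance mentioned in the abstract can be illustrated with a minimal sketch. Assuming the quantum relative entropy D(ρ‖σ) = Tr[ρ(log ρ − log σ)] as the graded entailment measure (the paper discusses entropy-based distances in general; the helper names and the toy "dog"/"animal" states below are purely illustrative, not the paper's actual construction or data):

```python
import numpy as np

def density_matrix(vecs, weights):
    """Build a density matrix as a weighted mixture of outer products
    of normalised context vectors: rho = sum_i w_i |v_i><v_i|."""
    dim = vecs.shape[1]
    rho = np.zeros((dim, dim))
    for w, v in zip(weights, vecs):
        v = v / np.linalg.norm(v)
        rho += w * np.outer(v, v)
    return rho / np.trace(rho)

def relative_entropy(rho, sigma, eps=1e-12):
    """Quantum relative entropy D(rho || sigma) = Tr[rho (log rho - log sigma)].
    Smaller values suggest rho is 'contained in' sigma, i.e. graded entailment.
    eps clips zero eigenvalues so the matrix logarithm stays finite."""
    def logm(m):
        vals, vecs = np.linalg.eigh(m)
        vals = np.clip(vals, eps, None)
        return vecs @ np.diag(np.log(vals)) @ vecs.T
    return float(np.trace(rho @ (logm(rho) - logm(sigma))))

# Illustrative toy states: "dog" as a pure state along one basis direction,
# "animal" as an even mixture over both directions (it subsumes "dog").
dog = density_matrix(np.array([[1.0, 0.0]]), [1.0])
animal = density_matrix(np.array([[1.0, 0.0], [0.0, 1.0]]), [0.5, 0.5])

print(relative_entropy(dog, animal))  # small: "dog entails animal" is plausible
print(relative_entropy(animal, dog))  # large: "animal entails dog" is not
```

The asymmetry of the relative entropy is what makes it a candidate for entailment rather than mere similarity: a hyponym's state is supported inside its hypernym's mixture, but not conversely.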
Cites work
- scientific article, zbMATH DE number 2133295 (no title available)
- scientific article, zbMATH DE number 195199 (no title available)
- A Compositional Distributional Inclusion Hypothesis
- A Frobenius model of information structure in categorical compositional distributional semantics
- A generalised quantifier theory of natural language in categorical compositional distributional semantics with bialgebras
- A vector space model for automatic indexing
- Categories for the Practising Physicist
- Coherence for compact closed categories
- Dagger compact closed categories and completely positive maps (extended abstract)
- Distributional sentence entailment using density matrices
- Open system categorical quantum semantics in natural language processing
- Sentence entailment in compositional distributional semantics
- Similarity of semantic relations
- Similarity-based models of word cooccurrence probabilities
- The Frobenius anatomy of word meanings. I: Subject and object relative pronouns
- The Frobenius anatomy of word meanings. II: Possessive relative pronouns
- Type grammars as pregroups
Cited in (20)
- Quantum computations for disambiguation and question answering
- A framework for distributional formal semantics
- scientific article, zbMATH DE number 7649886 (no title available)
- Gaussianity and typicality in matrix distributional semantics
- Linguistic matrix theory
- Sentence entailment in compositional distributional semantics
- Semantic composition inspired by quantum measurement
- Quantum Mathematics in Artificial Intelligence
- Towards logical negation for compositional distributional semantics
- Distributional formal semantics
- Categorical vector space semantics for Lambek calculus with a relevant modality (extended abstract)
- Compositionality for recursive neural networks
- Distributional sentence entailment using density matrices
- Concrete sentence spaces for compositional distributional models of meaning
- scientific article, zbMATH DE number 7453972 (no title available)
- Putting a spin on language: a quantum interpretation of unary connectives for linguistic applications
- Quantum natural language processing on near-term quantum computers
- Composing conversational negation
- A Compositional Distributional Inclusion Hypothesis
- QNLP in Practice: Running Compositional Models of Meaning on a Quantum Computer
This page was built for publication: Sentence entailment in compositional distributional semantics (MaRDI item Q722091)