Title

Minimal distributional semantics

Author Alexandre Kabbach
Director Aurélie Herbelot
Co-director Jacques Moeschler
Thesis abstract We propose to explore the benefits for artificial intelligence and natural language processing of a model of language where linguistic utterances are modeled as projections of vector representations of concepts along the dimension of a given language. We intend to demonstrate that such a model can successfully account for the fact that concepts are speaker-specific while lexicalized concepts are shared across speakers of the same language. We argue that, for an artificial intelligence to develop, reason must precede communication, and that a machine must first acquire a rich set of de-lexicalized and multimodal conceptual representations before it can learn to project them in any given language. We propose to learn such conceptual representations by building on traditional distributional models of lexical meaning, which provide vector representations of words grounded in the statistical analysis of the contexts in which words are used. We propose to evaluate our model in two ways: first, we evaluate the compositional power of the model by quantifying its contribution to the concrete NLP problem of adjective-noun composition. Second, we evaluate the cognitive plausibility of the model by testing its potential to learn rich representations from tiny data, in a way similar to humans' fast mapping.
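To make the distributional approach mentioned in the abstract concrete, the following is a minimal toy sketch (not the thesis model itself): it builds count-based word vectors from co-occurrence statistics over a tiny invented corpus, and composes an adjective-noun phrase by vector addition, one simple baseline for adjective-noun composition. All corpus sentences and variable names here are illustrative assumptions.

```python
import numpy as np
from collections import defaultdict

# Toy corpus; a real distributional model would use large text collections.
corpus = [
    "the red apple fell from the tree",
    "she ate a red apple for lunch",
    "the green tree stood by the river",
    "a small bird sat in the green tree",
]

window = 2  # symmetric context window size

# Co-occurrence counts: each word is represented by the statistical
# profile of the contexts in which it is used.
counts = defaultdict(lambda: defaultdict(int))
for sentence in corpus:
    tokens = sentence.split()
    for i, word in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if i != j:
                counts[word][tokens[j]] += 1

vocab = sorted(counts)
index = {w: k for k, w in enumerate(vocab)}

# Dense word-by-context matrix built from the raw counts.
vectors = np.zeros((len(vocab), len(vocab)))
for word, contexts in counts.items():
    for ctx, n in contexts.items():
        vectors[index[word], index[ctx]] = n

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Additive adjective-noun composition: the phrase vector is the
# sum of the adjective vector and the noun vector.
red_apple = vectors[index["red"]] + vectors[index["apple"]]

# The composed phrase remains similar to its head noun.
sim = cosine(red_apple, vectors[index["apple"]])
print(f"cosine(red+apple, apple) = {sim:.3f}")
```

Additive composition is only one of several composition functions studied in the literature (alongside multiplicative and learned functions); the thesis proposes to quantify its model's contribution on this task rather than assume any one operator.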