The technological revolution of human language analysis

Discover how the blend of social science, linguistics, and technology opens up a field of possibilities in artificial intelligence

A few words about the technology: it is a language-analysis technology built on a linguistic-universals approach. This positioning differs significantly from the two most common approaches in NLU today: the very popular statistical approach (training artificial neural networks) and the more classical grammatical approach. Both approaches have their strengths and their weaknesses.

Two main approaches

By virtue of this positioning, the technology is both easily multilingual and free of any need for training. The AI already embeds an understanding of linguistics common to all human languages. Configuration consists only of describing the format of the expected elements (the purposes of action or interpretation, in the given context) and providing the specific business vocabulary. This technology was developed over many years of experimentation, to find the simplest and most efficient way to configure an NLU AI.
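To make the idea concrete, here is a minimal sketch of what "describing expected elements and providing business vocabulary" could look like. The actual configuration format of the product is not public, so every key, name, and function below is a hypothetical illustration, not the real API:

```python
# Illustrative sketch only: the product's real configuration format is not
# public. All names here (EXPECTED_ELEMENTS, BUSINESS_VOCABULARY, interpret)
# are hypothetical examples of the configuration style described in the text.

# "Expected elements": the purposes of action or interpretation in context.
EXPECTED_ELEMENTS = {
    "open_ticket": {"description": "Customer wants to report an issue"},
    "track_order": {"description": "Customer asks about a delivery"},
}

# Business-specific vocabulary, mapped to the element each term signals.
BUSINESS_VOCABULARY = {
    "complaint": "open_ticket",
    "broken": "open_ticket",
    "delivery": "track_order",
    "parcel": "track_order",
}

def interpret(utterance):
    """Toy matcher: return the first expected element whose business
    vocabulary appears in the utterance. This stands in for the real
    linguistic engine, which the configuration merely parameterizes."""
    for word in utterance.lower().split():
        if word in BUSINESS_VOCABULARY:
            return BUSINESS_VOCABULARY[word]
    return None

print(interpret("My parcel never arrived"))  # → track_order
```

The point of the sketch is the division of labor: the integrator only declares domain elements and vocabulary, while the language understanding itself comes built in.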

The universal-linguistics approach

Noam Chomsky

This approach is inspired in particular by the ideas of Noam Chomsky, one of the greatest thinkers in linguistics, who worked to highlight the structures inherent in human languages, across all languages, through generative and transformational grammar. One of the great ideas corollary to this perspective is that language is a reflection of our cognitive structuring, and that an important part of language would therefore be innate rather than acquired.


UG [universal grammar] may be regarded as a characterization of the genetically determined language faculty. One may think of this faculty as a ‘language acquisition device,’ an innate component of the human mind that yields a particular language through interaction with present experience, a device that converts experience into a system of knowledge attained: knowledge of one or another language.

– Noam Chomsky, American professor, philosopher, linguist and cognitive scientist.

The parallel with AI is striking. Statistical training on a given language necessarily yields recurrent patterns, and recurring patterns also appear across languages. This suggests that, when trained massively and over very long periods, artificial neural networks for NLU converge: independently of one another, they each repeat a work of approximate abstraction of language that is, in the end, largely the same.

The linguistic-universals approach reinforces the SaaS model, because the absence of training leaves room for what is essential in NLU technologies: continuous improvement and personalization. Indeed, we have seen a wave of disappointment over chatbots, which, in the popular mind, serve as a reductive standard for the whole NLU domain. This shortfall has several causes, one of them being the high cost of NLU technologies. The good chatbots are those whose teams spend time on the user experience, monitoring and improving the solution. Chatbots are far from representing all of NLU, and these lessons apply to every revolution brought about by the rise of solutions that use NLU.

This is how the platform aims to offer an experience that gives developers and businesses more time to focus on what lies at the heart of innovation, by freeing them from the hassle of configuring the NLU itself.

Some related articles

Research: toward a more frugal artificial intelligence

Aware of the limits of deep learning, more and more researchers are working on learning systems that use less data and energy

Nouveau monde. Today's artificial intelligence is a "black box", according to the startup

The French startup advocates the use of an "ethical" intelligence based on controllable algorithms.

Initial analysis time has been divided by four

Engie Entreprises et Collectivités (Engie E&C) uses the technology to extract qualification data from calls for tenders, and thus save time in processing them.