Relex network
The theoretical background found to be the most suitable for the common project is Z. S. Harris' theory of transformations. The level of representation used by this theory has been the subject of experimental work for over twenty years, both by theoreticians and by computer scientists who have built parsers (the string analyzers of N. Sager for English and of M. Salkoff for French) that are probably the most complete built so far. The transformational level is based on the postulate that the basic unit of meaning is the elementary sentence rather than the word.
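To make this postulate concrete, the following minimal sketch (not part of the project, with purely illustrative names and argument labels) contrasts an elementary sentence, i.e. a predicate with its argument slots, with a mere sequence of words:

    # Minimal sketch of "elementary sentence as unit of meaning":
    # a predicate together with its argument slots, not an isolated word.
    # The class name and the N0/N1/N2 labels are illustrative assumptions.
    from dataclasses import dataclass, field
    from typing import Dict

    @dataclass
    class ElementarySentence:
        predicate: str                                            # e.g. the verb "give"
        arguments: Dict[str, str] = field(default_factory=dict)   # role -> filler

        def __str__(self) -> str:
            args = ", ".join(f"{role}={filler}" for role, filler in self.arguments.items())
            return f"{self.predicate}({args})"

    # "Max gave a book to Lea" treated as one elementary sentence rather than five words:
    s = ElementarySentence("give", {"N0": "Max", "N1": "a book", "N2": "to Lea"})
    print(s)   # give(N0=Max, N1=a book, N2=to Lea)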
Lexicon-Grammars of verbs and of other predicative elements (adjectives, nouns and adverbs) have been built for each of the languages of the LEXNET project; examples are provided in annex 4. The matrix representations shown in this annex exist in significant numbers (at least in the thousands for each language). They result from the application of a set of common linguistic principles.
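As a rough illustration of what such a matrix representation looks like, the following miniature table (my own toy example, not taken from annex 4; verbs, property labels and judgements are illustrative only) records, for each lexical entry, a plus/minus value per syntactic property:

    # Hypothetical miniature of a Lexicon-Grammar matrix: one row per verb,
    # one column per syntactic property, one +/- judgement per cell.
    # Real tables hold thousands of rows per language; everything below is illustrative.
    PROPERTIES = ["N0 V N1", "N1 =: que P", "[passive]"]

    TABLE = {
        #           N0 V N1   N1 =: que P   [passive]
        "admire":  [True,     False,        True],
        "know":    [True,     True,         True],
        "sleep":   [False,    False,        False],
    }

    def accepts(verb: str, prop: str) -> bool:
        """Look up the +/- value recorded for a verb/property pair."""
        return TABLE[verb][PROPERTIES.index(prop)]

    print(accepts("know", "N1 =: que P"))   # True: a sentential complement is accepted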
When such general principles are respected, experience over the past ten years has shown that different individuals or teams reach identical evaluations, thus ensuring that the descriptions are cumulative. This point is crucial for the LEXNET project and for the future of NLP: the amount of data that must be accumulated and represented in a coherent model is such that many research and development teams will have to cooperate, and their results must be mergeable without large parts of the grammar and of the lexicon of each language having to be rewritten. This requirement is by no means trivial to meet: current experience with phrase-structure grammars tends to show that their construction is not cumulative, and there is no example of a phrase-structure grammar of significant size that is not the work of a single specialist.
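The merging constraint can be pictured with the following sketch (an assumption of mine, not the project's tooling): two teams' tables can only be combined mechanically if, wherever they overlap, they recorded the same judgements, so any disagreement must surface as an explicit conflict rather than a silent rewrite.

    # Sketch of merging two teams' tables: union the entries and report every
    # cell where the recorded +/- judgements disagree. Data are illustrative.
    from typing import Dict, List, Tuple

    Table = Dict[str, Dict[str, bool]]   # verb -> property -> +/- judgement

    def merge(a: Table, b: Table) -> Tuple[Table, List[str]]:
        """Union two tables; list every cell on which the teams disagree."""
        merged: Table = {verb: dict(props) for verb, props in a.items()}
        conflicts: List[str] = []
        for verb, props in b.items():
            row = merged.setdefault(verb, {})
            for prop, value in props.items():
                if prop in row and row[prop] != value:
                    conflicts.append(f"{verb} / {prop}: {row[prop]} vs {value}")
                else:
                    row[prop] = value
        return merged, conflicts

    team_a = {"know": {"N0 V N1": True, "[passive]": True}}
    team_b = {"know": {"N0 V N1": True, "[passive]": False}, "sleep": {"N0 V N1": False}}
    merged, conflicts = merge(team_a, team_b)
    print(conflicts)   # ['know / [passive]: True vs False'] -- must be resolved by hand

If the common linguistic principles are applied consistently, the conflict list stays empty and the descriptions simply accumulate; otherwise each merge forces manual reconciliation, which is precisely the non-cumulative situation observed with phrase-structure grammars.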