Abstract Wikipedia/Related and previous work/Natural language generation
Abstract Wikipedia will generate natural language text from an abstract representation. This is not a novel idea, and it has been tried a number of times before.
This page aims to collect different existing approaches. It summarizes the core ideas of each approach, their advantages and disadvantages, and points to existing implementations. This page, by and for the community, should help in choosing which approach to focus on first.
- Wikipedia: Arria NLG [ de ] [ en ] [ nn ]
- Website: https://www.arria.com/
- Implementation link:
- License: Proprietary, 30 patents apply
- Supported languages: English
GenI
- Website: http://kowey.github.io/GenI/
GoPHi
- Website: https://github.com/rali-udem/gophi
- Wikipedia: Grammatical Framework [ en ] [ nn ]
- Website: https://www.grammaticalframework.org
- Implementation link:
- License: GNU General Public License (see text)
- Supported languages: Afrikaans, Amharic (partial), Arabic (partial), Basque (partial), Bulgarian, Catalan, Chinese, Czech (partial), Danish, Dutch, English, Estonian, Finnish, French, German, Greek ancient (partial), Greek modern, Hebrew (fragments), Hindi, Hungarian (partial), Interlingua, Italian, Japanese, Korean (partial), Latin (partial), Latvian, Maltese, Mongolian, Nepali, Norwegian bokmål, Norwegian nynorsk, Persian, Polish, Punjabi, Romanian, Russian, Sindhi, Slovak (partial), Slovene (partial), Somali (partial), Spanish, Swahili (fragments), Swedish, Thai, Turkish (fragments), and Urdu.
KPML
- Website: http://www.fb10.uni-bremen.de/anglistik/langpro/kpml/README.html
- Supported languages (2014):
- More advanced: Czech, English, German?, Spanish
- Prototype: Bulgarian, Chinese, Dutch, Portuguese, Russian
- Less advanced: French, Greek, Japanese
Linguistic Knowledge Builder
- Website: http://moin.delph-in.net/LkbTop
Multimodal Unification Grammar
NaturalOWL
- Implementation link: http://www.aueb.gr/users/ion/software/NaturalOWL1.1.tar.gz
NLGen and NLGen2
OpenCCG
- Website: http://openccg.sourceforge.net/
RosaeNLG
- Website: https://rosaenlg.org/
- Supported languages: English, French, German and Italian
Semantic Web Authoring Tool (SWAT)
- Wikipedia: WYSIWYM [ en ] [ nn ]
- A SWAT implements the WYSIWYM ("what you see is what you meant") interaction technique, in which formal representations are developed through successive refinement (by humans) of NLG outputs.
- Website: http://mcs.open.ac.uk/nlg/SWAT/
- Implementation link: SWAT Ontology Editor (Java)
- Implementation link: SWAT Ontology Verbaliser (Prolog)
- Implementation link: SWAT dialogue-based Ontology Editor (Java)
- Supported languages: OWL Simplified English
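The WYSIWYM interaction technique that SWAT implements can be sketched as a loop: the system verbalises a partially specified formal model, showing unfilled slots as editable anchors in the feedback text, and each human refinement fills a slot before the text is regenerated. The sketch below is purely illustrative; all names and the feedback template are invented, not taken from SWAT.

```python
# Toy sketch of the WYSIWYM feedback loop (hypothetical names throughout).

def verbalise(model):
    """Render the current formal model as feedback text with [anchors]."""
    def slot(name):
        # An unfilled slot is shown as a clickable [anchor] in WYSIWYM.
        return model.get(name) or f"[{name}]"
    return f'{slot("drug")} is used to treat {slot("condition")}.'

model = {"drug": None, "condition": None}
print(verbalise(model))          # [drug] is used to treat [condition].
model["drug"] = "Aspirin"        # the user refines an anchor via the NLG output
print(verbalise(model))          # Aspirin is used to treat [condition].
model["condition"] = "headache"
print(verbalise(model))          # Aspirin is used to treat headache.
```

The point of the technique is that the user only ever manipulates the formal model; the natural-language text is regenerated feedback, never edited directly.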
SimpleNLG
- Website: https://github.com/simplenlg/simplenlg
- Supported languages: English, French
Suregen
- Website: http://www.suregen.de/index.html
- Supported languages: German, English
syntaxmaker
- Website: https://github.com/mikahama/syntaxmaker
- Supported languages: Finnish
TGen
- Website: https://github.com/UFAL-DSG/tgen
Universal Networking Language
- Wikipedia: Universal Networking Language [ de ] [ en ] [ es ] [ fr ] [ 日本語 ] [ nn ]
- Implementation link:
- Supported languages:
UralicNLP
- Website: https://uralicnlp.com/
- Website: https://github.com/mikahama/uralicNLP
- Supported languages: Finnish, Russian, German, English, Norwegian, Swedish, Arabic, Ingrian, Meadow & Eastern Mari, Votic, Olonets-Karelian, Erzya, Moksha, Hill Mari, Udmurt, Tundra Nenets, Komi-Permyak, North Sami, South Sami and Skolt Sami
- Implementation link:
- Supported languages:
In their 2018 survey, Gatt and Krahmer begin by describing natural language generation as the "task of generating text or speech from non-linguistic input." They identify six sub-problems (after Reiter & Dale 1997, 2000) [section 2, NLG Tasks, pp. 70–82]:
- Content determination (Content determination (Q5165077))
- Text structuring (Document structuring (Q5287648))
- Sentence aggregation (Aggregation (Q4692263))
- Lexicalisation (Lexical choice (Q6537688))
- Referring expression generation (Referring expression generation (Q7307185))
- Linguistic realisation (Realization (Q7301282))
Please note that, as of 24 July 2020, the six topics listed above have articles only in the English Wikipedia.
These six sub-problems can be seen as a segmentation of the "pipeline": the "early" tasks are aligned to the purpose of the linguistic output, while the "late" tasks are aligned to its final linguistic form. A summary might be: "What (1), ordered (2) and segmented (3) how, with what words (4 & 5), in which forms (6)". Lexicalisation (4) is not clearly distinguished from referring expression generation (REG) (5) in this summary. The key idea in REG is avoiding repetition and ambiguity, or rather managing the tension between those conflicting aims. This corresponds to the Gricean maxim (Grice, 1975) that "speakers should make sure that their contributions are sufficiently informative for the purposes of the exchange, but not more so" (or, as Roger Sessions said (1950) after Albert Einstein (1933): "everything should be as simple as it can be, but not simpler!").
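The six-stage pipeline summarised above can be illustrated with a toy end-to-end example. Everything here (the facts, rules, and function names) is a hypothetical simplification for exposition, not code from any system listed on this page.

```python
# Toy six-stage NLG pipeline after Reiter & Dale; all rules are invented.

FACTS = [
    {"subj": "Marie Curie", "rel": "award", "obj": "the Nobel Prize"},
    {"subj": "Marie Curie", "rel": "field", "obj": "physics"},
    {"subj": "Marie Curie", "rel": "pet",   "obj": "a cat"},
]

def content_determination(facts):
    """1. Decide which facts are worth expressing."""
    return [f for f in facts if f["rel"] in ("field", "award")]

def text_structuring(msgs):
    """2. Order the selected messages."""
    order = {"field": 0, "award": 1}
    return sorted(msgs, key=lambda m: order[m["rel"]])

def aggregation(msgs):
    """3. Merge messages sharing a subject into one sentence plan."""
    plan = {}
    for m in msgs:
        plan.setdefault(m["subj"], []).append(m)
    return plan

def lexicalisation(msg):
    """4. Choose content words for a message."""
    verbs = {"field": "worked in", "award": "won"}
    return f'{verbs[msg["rel"]]} {msg["obj"]}'

def referring_expression(subj, mentioned):
    """5. Pick a referring expression: pronoun on repeated mention."""
    return "she" if subj in mentioned else subj

def realisation(plan):
    """6. Produce surface text: conjunction, capitalisation, punctuation."""
    sentences, mentioned = [], set()
    for subj, msgs in plan.items():
        np = referring_expression(subj, mentioned)
        mentioned.add(subj)
        vps = " and ".join(lexicalisation(m) for m in msgs)
        sentences.append(f"{np[0].upper()}{np[1:]} {vps}.")
    return " ".join(sentences)

text = realisation(aggregation(text_structuring(content_determination(FACTS))))
print(text)  # Marie Curie worked in physics and won the Nobel Prize.
```

Real systems differ in how many of these stages they implement and whether the stages run as a pipeline at all, but the decomposition gives a vocabulary for comparing them.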
Linguistic realisation
"In linguistics, realization is the process by which some kind of surface representation is derived from its underlying representation; that is, the way in which some abstract object of linguistic analysis comes to be produced in actual language. Phonemes are often said to be realized by speech sounds. The different sounds that can realize a particular phoneme are called its allophones.
"Realization is also a subtask of natural language generation, which involves creating an actual text in a human language (English, French, etc.) from a syntactic representation."
- (Wikipedia contributors, 'Realization', Wikipedia, The Free Encyclopedia, 26 May 2020, 02:46 UTC, <https://en.wikipedia.org/w/index.php?title=Realization&oldid=958866516> [accessed 31 August 2020].)
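As a toy illustration of realisation in the NLG sense quoted above, the sketch below derives a surface English clause from an abstract clause representation, applying a very rough subject-verb agreement rule plus capitalisation and punctuation. The clause format and rules are invented for this sketch, not drawn from any realiser listed here.

```python
# Toy surface realiser: abstract clause dict -> English sentence string.

def realise(clause):
    """Map an abstract syntactic representation to a surface string."""
    subj = clause["subject"]
    verb = clause["verb"]
    # Very rough English agreement: present-tense 3rd person singular adds -s.
    if clause.get("number") == "singular" and clause.get("tense", "present") == "present":
        verb += "es" if verb.endswith(("s", "sh", "ch", "o")) else "s"
    words = [subj, verb] + ([clause["object"]] if "object" in clause else [])
    sentence = " ".join(words)
    return sentence[0].upper() + sentence[1:] + "."

print(realise({"subject": "the cat", "verb": "chase",
               "number": "singular", "object": "the mouse"}))
# The cat chases the mouse.
print(realise({"subject": "the cats", "verb": "sleep", "number": "plural"}))
# The cats sleep.
```

A production realiser (SimpleNLG, OpenCCG, the Grammatical Framework runtime, etc.) handles far more than this: morphological irregularity, word order, auxiliaries, and language-specific orthography.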
In a later survey, Gârbacea and Mei suggest "neural language generation" as an emerging sub-field of NLG. Eleven of the papers cited in their survey have titles containing "neural language", the earliest from 2016 (Edouard Grave, Armand Joulin, and Nicolas Usunier. 2016. Improving neural language models with a continuous cache). The earliest citation in which "neural language generation" appears is from 2017 (Jessica Ficler and Yoav Goldberg. 2017. Controlling linguistic style aspects in neural language generation. In Proceedings of the Workshop on Stylistic Variation, pages 94–104). Published slightly earlier that year was Van-Khanh Tran and Le-Minh Nguyen. 2017. Semantic Refinement GRU-based Neural Language Generation for Spoken Dialogue Systems. As of mid-2020, "neural language generation" is not mature enough to be used to generate natural-language renditions of language-neutral content.
- Gârbacea and Mei, 2020
- Gardent et al. 2017
- Gatt & Krahmer, 2018
- Reiter & Dale, 2000 (PDF ends at the end of the first section.)
- ACL Special Interest Group on Natural Language Generation ACL is the Association for Computational Linguistics.
- Ehud Reiter's Blog Ehud Reiter has no English Wikipedia page (apart from his user page).
- Natural Language Generation (CLAN Group), School of Natural and Computing Sciences, The University of Aberdeen.
- Institute for Language, Cognition and Computation (ILCC), School of Informatics, The University of Edinburgh.
- Harvard NLP, Harvard University.
- The Interaction Lab, School of Mathematical and Computer Sciences, Heriot-Watt University.
- Institute of Linguistics and Language Technology, University of Malta (Albert Gatt, Director).
- The Open University Natural Language Generation Group.
- TALN Research Group, Department of Information and Communication Technologies, Universitat Pompeu Fabra, Barcelona.
- The Natural Language Processing Group, The University of Sheffield.
- The Natural Language Group, Information Sciences Institute, University of Southern California.
- SyNaLP (Symbolic and statistical NLP), Laboratoire Lorrain d'Informatique et ses Applications (LORIA).
- Paul G. Allen School of Computer Science and Engineering, University of Washington.
- The Scholia view on natural-language generation lacked the standard sources and leading authors as of 27 July 2020; see Google Scholar instead.
- Gatt, Albert; Krahmer, Emiel (January 2018), "Survey of the State of the Art in Natural Language Generation: Core tasks, applications and evaluation", Journal of Artificial Intelligence Research 61: 65–170, archived from the original on 2020-06-23, retrieved 2020-07-24
- Gatt's publications
- Emiel Krahmer (Q51689943) selected publications
- Reiter, EB; Dale, R (2000), Building Natural-Language Generation Systems (PDF), Cambridge University Press, archived from the original (PDF) on 2019-07-11, retrieved 2020-07-27
- Grice, H. Paul (1975), Logic and conversation (PDF), retrieved 2020-08-10
- Gârbacea, Cristina; Mei, Qiaozhu, Neural Language Generation: Formulation, Methods, and Evaluation (PDF), pp. 1–70, retrieved 2020-08-08,
Compared to the survey of (Gatt and Krahmer, 2018), our overview is a more comprehensive and updated coverage of neural network methods and evaluation centered around the novel problem definitions and task formulations.
- Gardent, Claire; Shimorina, Anastasia; Narayan, Shashi; Perez-Beltrachini, Laura (2017), "The WebNLG Challenge: Generating Text from RDF data." (PDF), Proceedings of the 10th International Conference on Natural Language Generation: 124–133