Theses

Text Generation with and without Retrieval

Angela Fan
SYNALP - Natural Language Processing: representations, inference and semantics
LORIA - NLPKD - Department of Natural Language Processing & Knowledge Discovery
Abstract : Every day we write: from sending a quick text to our mothers to drafting a scientific article such as this thesis. The writing we do often goes hand-in-hand with automated assistance. For example, modern instant messaging software often suggests what word to write next, emails can be started with an autocomposer, and essays are improved with machine-suggested edits. These technologies are powered by years of research on text generation, a natural language processing field whose goal is to automatically produce fluent, human-readable natural language. At a small scale, text generation systems generate individual words or sentences, but they have wide-reaching applications beyond that. For instance, systems for summarization, dialogue, and even the writing of entire Wikipedia articles are grounded in foundational text generation technology.

Producing fluent, accurate, and useful natural language faces numerous challenges. Recent advances in text generation, principally from training neural network architectures on large datasets, have significantly improved the surface-level readability of machine-generated text. However, current systems still need improvement along numerous axes, including generation beyond English and the writing of increasingly longer texts. While the field has seen rapid progress, much research focus has been directed towards the English language, where large-scale training and evaluation datasets for various tasks are readily available. Nevertheless, applications from autocorrect to autocomposition of text should be available universally. After all, by population, the majority of the world does not write in English. In this work, we create text generation systems for various tasks with the capability of incorporating languages beyond English, either as algorithms that extend easily to new languages or as multilingual models encompassing up to 20 languages in one model.

Beyond our work in multilingual text generation, we focus on a critical piece of generation systems: knowledge. A prerequisite to writing well is knowing what to write. This concept of knowledge is incredibly important in text generation systems. For example, automatically writing an entire Wikipedia article requires extensive research on that article's topic. The instinct to research is intuitive: decades ago, people would have gone to a library; today, that role is filled by the information available on the World Wide Web. However, for automated systems, the question is not only what knowledge to use to generate text, but also how to retrieve that knowledge and how best to utilize it to achieve the intended communication goal.

We address the challenge of retrieval-based text generation. We present several techniques for identifying relevant knowledge at different scales: from local knowledge available in a paragraph, to sifting through Wikipedia, and finally to identifying the needle in the haystack on the scale of the full web. We describe neural network architectures that can perform large-scale retrieval efficiently, utilizing pre-computation and caching mechanisms. Beyond how to retrieve knowledge, we further investigate the form the knowledge should take, from natural language such as Wikipedia articles or text on the web to structured inputs in the form of knowledge graphs.
Finally, we apply these architectures to novel, more challenging tasks that push the boundaries of where text generation models work well today: tasks that not only require knowledge but also require models to produce long, structured natural language output, such as answering complex questions or writing full Wikipedia articles.
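To make the retrieval component described above concrete, the following is a minimal sketch (not the thesis implementation) of retrieval-then-generation with pre-computation and caching: document vectors are computed once and stored, so each query costs only a single encoding pass plus a nearest-neighbour search. The encode function, the CachedRetriever class, and the toy corpus are illustrative placeholders, not the trained neural encoders or generators used in the thesis.

import numpy as np

def encode(texts):
    # Placeholder encoder: a real system would use a trained neural encoder
    # (e.g. a Transformer). Here each text is hashed into a fixed random vector.
    vectors = []
    for text in texts:
        rng = np.random.default_rng(abs(hash(text)) % (2**32))
        vectors.append(rng.standard_normal(128))
    return np.stack(vectors)

class CachedRetriever:
    def __init__(self, documents):
        self.documents = documents
        # Pre-computation and caching: embed and normalise the whole
        # collection once, up front.
        self.doc_vectors = encode(documents)
        self.doc_vectors /= np.linalg.norm(self.doc_vectors, axis=1, keepdims=True)

    def retrieve(self, query, k=2):
        # Each query needs only one encoding pass plus a similarity search
        # against the cached document vectors.
        q = encode([query])[0]
        q /= np.linalg.norm(q)
        scores = self.doc_vectors @ q            # cosine similarity
        top = np.argsort(-scores)[:k]
        return [self.documents[i] for i in top]

# Usage: retrieved passages are prepended to the question and handed to a
# generator (a sequence-to-sequence model in practice; printed here instead).
retriever = CachedRetriever([
    "Passage about large-scale retrieval.",
    "Passage about Wikipedia article generation.",
    "Passage about multilingual text generation.",
])
passages = retriever.retrieve("How can retrieval support text generation?")
prompt = " ".join(passages) + " Question: How can retrieval support text generation?"
print(prompt)

In a full system, the brute-force dot product would be replaced by an approximate nearest-neighbour index (e.g. FAISS) to scale to web-sized collections, and the assembled prompt would be fed to a trained generator rather than printed.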

https://hal.univ-lorraine.fr/tel-03542634
Contributor : Thèses UL
Submitted on : Tuesday, January 25, 2022 - 2:43:09 PM
Last modification on : Thursday, January 27, 2022 - 3:37:36 AM
Long-term archiving on : Tuesday, April 26, 2022 - 6:55:11 PM

File

DDOC_T_2021_0164_FAN.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : tel-03542634, version 1

Citation

Angela Fan. Text Generation with and without Retrieval. Computer Science [cs]. Université de Lorraine, 2021. English. ⟨NNT : 2021LORR0164⟩. ⟨tel-03542634⟩
