Summary – Natural Language Generation (INFOMNLG)

Pages: 101
Uploaded on: 24-03-2024
Written in: 2023/2024

This document includes a summary of all lectures, lecture notes, screenshots of important lecture slides and extra notes to help understand the contents and concepts better.


Natural Language Generation
Lecture 1 – General Introduction
Introduction




What is Natural Language Generation?
• Natural Language Generation: Automatic generation of text in any natural language
• This can take place in different settings
o Text-to-text (e.g. automatic summarisation, machine translation: something in
language A as input, something in language B as output)
o Data-to-text (e.g. summarising tables of sports or weather data, summarising
patient data)
o Media-to-text (e.g. captioning images, describing videos)
o Open-ended (“creative”?) generation (e.g. generating stories based on
prompts: tell me a story about xyz)
• Current state of the art: Deep neural networks (Transformers) offer a unified
framework in which to deal with all of these.
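The data-to-text setting above can be illustrated with a minimal sketch: a structured record goes in, a sentence comes out. The function and field names here are illustrative, not from the lecture; classical systems used hand-written templates or grammars like this before neural models unified the settings.

```python
# Minimal data-to-text sketch: map a structured weather record to a sentence
# using a hand-written template. Field names are illustrative.

def verbalise_weather(record: dict) -> str:
    """Turn one structured data record into a natural-language sentence."""
    return (f"On {record['day']}, expect {record['condition']} "
            f"with a high of {record['max_temp']} degrees.")

print(verbalise_weather({"day": "Monday", "condition": "showers", "max_temp": 12}))
# On Monday, expect showers with a high of 12 degrees.
```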





• There is a classic distinction, which is sometimes left implicit:
• Strategic choices: what to say (street, organ, people)
o Based on the input
o Based on additional knowledge (what you already know)
o Based on the target language
• Tactical choices: how to say it → Highly dependent on language (A street organ on a city
street/ Een traditioneel draaiorgel in Utrecht)
• Originally proposed by Thompson, this distinction features in several architectures for
(human) production and (automatic) generation.
• The same football match can be described entirely differently depending on whose side
you’re on/ the perspective
• Hallucination: when the model generates something not supported by the input, e.g.
predicting hail because the data contains passages about showers and comparable
weather conditions
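The strategy/tactics split can be sketched in a few lines: one step picks which facts to mention, the other picks the wording for a target language (echoing the street-organ example above). All function names, the audience parameter, and the templates are illustrative assumptions, not part of the lecture material.

```python
# Sketch of strategic vs tactical choices. Strategic: what to say (which facts,
# possibly depending on the audience). Tactical: how to say it (wording is
# highly language-dependent). Everything here is a toy illustration.

EVENT = {"object": "street organ", "location": "city street", "city": "Utrecht"}

def strategic_choice(event, audience="tourist"):
    # What to say: a local may not need the generic location spelled out.
    keys = ["object", "city"] if audience == "tourist" else ["object", "location"]
    return {k: event[k] for k in keys}

def tactical_choice(content, language="en"):
    # How to say it: the same content, realised differently per language.
    if language == "nl":
        return f"Een draaiorgel in {content.get('city', 'de stad')}"
    return f"A {content['object']} on a {content.get('location', 'street')}"

print(tactical_choice(strategic_choice(EVENT, audience="local"), language="en"))
# A street organ on a city street
```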

3 dimensions to consider when generating text





Lecture 2 - What are the subtasks involved in generating text?
The classic pipeline architecture for NLG and its sub-tasks
• What is involved in NLG? It’s all about choices.
• Modular versus end-to-end
o A modular architecture breaks down the main task into sub-tasks, modelling
each one separately. This was the dominant approach in “classical” (pre-neural)
NLG systems.
▪ i.e. the process moves from the input in steps, breaking big tasks into subtasks
o In end-to-end models, there might be no (or fewer) explicit subtasks. This does
not mean that the choices are not made.
o A classic approach to NLG involves breaking down the generation process into
stages, such as content selection, rhetorical structuring, ordering, lexicalization,
aggregation, referring expressions, and syntactic planning. These stages can be
implemented using either modular architectures, where each sub-task is
modeled separately, or end-to-end models, which integrate multiple tasks into a
single framework. Both approaches have their advantages and trade-offs.
• The early “consensus”
o Reiter and Reiter and Dale argued that the various tasks can be grouped in a
three-stage pipeline. Their architecture represented a “consensus” view.




o [Figure: the three-stage pipeline: document planner → microplanner → surface realiser]
o Pipeline: you start with an input and some communicative goal: many systems are
designed to inform people about something, but the goal could also be to entertain.
The document planner chooses what to say and structures those messages, which are
not yet linguistic, into a document plan, along with the rhetorical relationships
between them. In the microplanning stage, the document plan begins to be fleshed
out in a more linguistic way, and the surface realiser then produces the actual text.
o Domain knowledge is important; how you structure a document to report about
e.g. a football match is governed by knowledge of conventions
o Also, who you are generating for (doctor vs nurse vs family member) → what
lexical/ grammatical knowledge do you assume?




o Strategic tasks (what to say):
▪ What information to include (e.g. what players are wearing at a football
match might not be important), depending on how much you assume
your user knows
▪ Rhetorical structuring
▪ Ordering
▪ Segmentation: deciding which messages belong together (this person scored a
goal, and if there was a tackle just before it, you also include that part)
o Tactical tasks:
▪ What words to use
▪ How to refer to things
▪ Aggregation: some sentences are merged to help the narrative flow
▪ Syntactic structure
▪ Morphological rules: rules at the level of the word (e.g. changing the form of a verb)
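The three stages above can be strung together as a toy pipeline: a document planner selects and orders messages, a microplanner chooses words and aggregates, and a surface realiser emits the final text. The event data, player names, and templates are all invented for illustration; real systems make each of these choices with far richer knowledge.

```python
# Toy version of the consensus three-stage pipeline:
# document planning -> microplanning -> surface realisation.

EVENTS = [
    {"minute": 43, "type": "tackle", "player": "De Vries"},
    {"minute": 44, "type": "goal", "player": "Jansen"},
]

def document_planner(events):
    # Strategic: decide what to say (keep goals and the tackles leading up
    # to them) and in what order (by time).
    keep = [e for e in events if e["type"] in ("goal", "tackle")]
    return sorted(keep, key=lambda e: e["minute"])

def microplanner(plan):
    # Tactical: lexical choice per message, then aggregate adjacent messages.
    phrases = []
    for e in plan:
        verb = "scored" if e["type"] == "goal" else "won a tackle"
        phrases.append(f"{e['player']} {verb} in minute {e['minute']}")
    return "; then ".join(phrases)

def surface_realiser(proto_text):
    # Final orthography: capitalisation and punctuation.
    return proto_text[0].upper() + proto_text[1:] + "."

print(surface_realiser(microplanner(document_planner(EVENTS))))
# De Vries won a tackle in minute 43; then Jansen scored in minute 44.
```

Note how the "what to say" choices never touch words, and the "how to say it" choices never touch the input data: that separation is exactly what the modular architecture buys you.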
• The case of raw input data
o Some NLG systems have to deal with raw, unstructured data. This means that
prior to generating text, the data has to be analysed in order to:
1. Identify the important things and filter out noise
2. Map the data to appropriate input representations
3. Perform some reasoning on these representations
o Example: image captioning starts from raw pixels; pre-processing is needed to
figure out what the objects in the image are
• Extending the original architecture to handle data pre-processing
o Reiter (2007) proposed to extend the “consensus” architecture
to deal with preliminary stages of:
1. Signal analysis: to extract patterns and trends from
unstructured input data;
2. Data interpretation: to perform reasoning on the
results
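These two preliminary stages can be sketched as follows, assuming a simple time series as raw input. The threshold and the trend labels are illustrative assumptions, not from Reiter (2007); the point is only that pattern extraction and reasoning happen before any text is generated.

```python
# Sketch of the two preliminary stages Reiter (2007) adds before the pipeline:
# signal analysis extracts a pattern from raw readings, and data interpretation
# reasons about it, producing a message a generator could later verbalise.

def signal_analysis(readings):
    # Extract a simple pattern: the overall change across the series.
    return readings[-1] - readings[0]

def data_interpretation(delta, threshold=2.0):
    # Reason about the extracted pattern before any text exists.
    if delta > threshold:
        return {"trend": "rising"}
    if delta < -threshold:
        return {"trend": "falling"}
    return {"trend": "stable"}

temps = [11.0, 11.5, 12.0, 14.5]  # raw sensor input
print(data_interpretation(signal_analysis(temps)))
# {'trend': 'rising'}
```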




