Search-Based Unsupervised Text Generation
Lili Mou
Dept. Computing Science, University of Alberta
Alberta Machine Intelligence Institute (Amii)
Outline
• Introduction
• General framework
• Applications
- Paraphrasing
- Summarization
- Text simplification
• Conclusion & Future Work
• Of how I learned natural language processing (NLP):
NLP = NLU + NLG
- NLU was the main focus of NLP research.
- NLG was relatively easy, as we can generate sentences by rules, templates, etc.
• Why might this NOT be correct?
- Rules and templates are not natural language.
- How can we represent meaning? — Almost the same question as NLU.
A fading memory …
[Slide graphic: “Understanding” vs. “Generation”]
Why is NLG interesting?
• Industrial applications
- Machine translation (e.g., https://translate.google.com/)
- Headline generation for news
- Grammarly: grammatical error correction
• Scientific questions
- Non-linear dynamics for long-text generation
- Discrete “multi-modal” distribution
Supervised Text Generation
Sequence-to-sequence training
Training data = {(x^(m), y^(m))}_{m=1}^{M}, known as a parallel corpus
[Figure: an encoder reads the input x1 x2 x3 x4; a decoder predicts the sentence y1 y2 y3, which is compared against the reference/target sentence under a sequence-aggregated cross-entropy loss]
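The sequence-aggregated cross-entropy loss can be sketched as follows; a minimal sketch assuming the per-step predicted distributions are given as plain dictionaries (a real system would produce them with a trained decoder):

```python
import math

def sequence_cross_entropy(pred_dists, reference):
    """Sequence-aggregated cross-entropy: sum over time steps t of
    -log P(y_t = reference token), given each step's predicted distribution."""
    return sum(-math.log(dist[tok]) for dist, tok in zip(pred_dists, reference))

# Toy decoder output over a two-word vocabulary, scored against reference ["a", "b"]:
dists = [{"a": 0.5, "b": 0.5}, {"a": 0.9, "b": 0.1}]
loss = sequence_cross_entropy(dists, ["a", "b"])  # -log 0.5 - log 0.1
```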
Unsupervised Text Generation
• Training data = {x^(m)}_{m=1}^{M}
- Not even training (we did it by searching)
• Important to industrial applications
- Startup: no data
- Minimum viable product
• Scientific interest
- How can AI agents go beyond NLU to NLG?
- Unique search problems
General Framework
• Search objective: a scoring function measuring text quality
• Search algorithm: currently, stochastic local search
Scoring Function
• Search objective: a scoring function measuring text quality
s(y) = sLM(y) ⋅ sSemantic(y)^α ⋅ sTask(y)^β
• Language fluency: a language model estimates the “probability” of a sentence
sLM(y) = PPL(y)^(−1)
• Semantic coherence: cosine similarity between the input and output embeddings
sSemantic(y) = cos(e(x), e(y))
• Task-specific constraints
- Paraphrasing: lexical dissimilarity with the input
- Summarization: length budget
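The composite objective can be sketched as below; a minimal sketch in which the component scorers are toys (a unigram "language model" for fluency and bag-of-words cosine similarity in place of sentence embeddings e(x), e(y)), with the task score left as a caller-supplied function:

```python
import math
from collections import Counter

def s_lm(y, unigram_probs, eps=1e-8):
    # Toy fluency score: inverse perplexity, s_LM(y) = PPL(y)^(-1).
    # A real system would use a trained language model instead.
    logp = sum(math.log(unigram_probs.get(w, eps)) for w in y)
    ppl = math.exp(-logp / max(len(y), 1))
    return 1.0 / ppl

def s_semantic(x, y):
    # Toy semantic score: cosine similarity of bag-of-words count vectors,
    # standing in for embeddings cos(e(x), e(y)).
    cx, cy = Counter(x), Counter(y)
    dot = sum(cx[w] * cy[w] for w in cx)
    nx = math.sqrt(sum(v * v for v in cx.values()))
    ny = math.sqrt(sum(v * v for v in cy.values()))
    return dot / (nx * ny) if nx and ny else 0.0

def score(x, y, unigram_probs, s_task, alpha=1.0, beta=1.0):
    # s(y) = s_LM(y) * s_semantic(x, y)^alpha * s_task(y)^beta
    return s_lm(y, unigram_probs) * (s_semantic(x, y) ** alpha) * (s_task(y) ** beta)
```

The weights alpha and beta trade off semantic preservation and the task constraint against fluency; their values here are placeholders.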
Search Algorithm
• Observations:
- The output closely resembles the input
- Edits are mostly local
- May have hard constraints
• Thus, we mainly used local stochastic search
Search Algorithm (stochastic local search)
Start with y0  # an initial candidate sentence
Loop within budget, at step t:
    y′ ∼ Neighbor(yt)  # a new candidate in the neighborhood
    Either reject or accept y′
    If accepted, yt = y′; otherwise yt = yt−1
Return the best-scored y*
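The loop above can be sketched as a generic procedure, parameterized by a neighborhood function and an acceptance policy (both placeholders here); varying the acceptance policy yields the concrete algorithms discussed next:

```python
import random

def local_search(y0, neighbors, score, accept, budget=100, rng=None):
    """Stochastic local search: propose y' ~ Neighbor(y_t), accept or reject
    it according to `accept`, and return the best-scored candidate seen."""
    rng = rng or random.Random(0)
    y, best = y0, y0
    for t in range(1, budget + 1):
        y_new = rng.choice(neighbors(y))        # y' ~ Neighbor(y_t)
        if accept(score(y_new), score(y), t, rng):
            y = y_new                           # y_t = y'
        # otherwise y_t = y_{t-1}: keep y unchanged
        if score(y) > score(best):
            best = y
    return best
```

As a toy usage, maximizing -(n - 7)^2 over the integers with a greedy accept rule climbs to n = 7.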
Search Algorithm
Local edits for y′ ∼ Neighbor(yt):
• General edits (cf. Gibbs sampling within Metropolis-Hastings)
- Word deletion
- Word insertion
- Word replacement
• Task-specific edits
- Reordering, swap for word selection, etc.
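The word-level neighborhood can be sketched as follows; a minimal sketch in which sentences are token lists and the vocabulary is a small placeholder set (a practical system would sample one random edit position and type rather than enumerate the whole neighborhood):

```python
def word_edit_neighbors(sentence, vocab):
    """All word-level edits of a tokenized sentence: deletions, replacements,
    and insertions drawn from an (illustrative) vocabulary."""
    out = []
    for i in range(len(sentence)):
        out.append(sentence[:i] + sentence[i + 1:])               # deletion
        for w in vocab:
            if w != sentence[i]:
                out.append(sentence[:i] + [w] + sentence[i + 1:]) # replacement
    for i in range(len(sentence) + 1):
        for w in vocab:
            out.append(sentence[:i] + [w] + sentence[i:])         # insertion
    return out
```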
Search Algorithm
Example: Metropolis-Hastings sampling
Start with y0  # an initial candidate sentence
Loop within your budget, at step t:
    y′ ∼ Neighbor(yt)  # a new candidate in the neighborhood
    Either reject or accept y′ (per the Metropolis-Hastings acceptance probability)
    If accepted, yt = y′; otherwise yt = yt−1
Return the best-scored y*
Search Algorithm
Example: Simulated annealing
Start with y0  # an initial candidate sentence
Loop within your budget, at step t:
    y′ ∼ Neighbor(yt)  # a new candidate in the neighborhood
    Either reject or accept y′ (per a temperature-annealed acceptance probability)
    If accepted, yt = y′; otherwise yt = yt−1
Return the best-scored y*
Search Algorithm
Example: Hill climbing
Start with y0  # an initial candidate sentence
Loop within your budget, at step t:
    y′ ∼ Neighbor(yt)  # a new candidate in the neighborhood
    Accept y′ whenever it scores better than yt−1
    If accepted, yt = y′; otherwise yt = yt−1
Return the best-scored y*
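The three variants differ only in the acceptance policy, which can be sketched as below; a minimal sketch assuming scores are positive, with the annealing schedule (T0, decay) purely illustrative:

```python
import math
import random

def accept_hill_climbing(s_new, s_old, t, rng):
    # Hill climbing: accept y' whenever it scores better than y_{t-1}.
    return s_new > s_old

def accept_metropolis(s_new, s_old, t, rng):
    # Metropolis-Hastings with a symmetric proposal: always accept
    # improvements; accept worse candidates with probability s_new / s_old.
    return rng.random() < min(1.0, s_new / max(s_old, 1e-12))

def accept_annealing(s_new, s_old, t, rng, T0=1.0, decay=0.95):
    # Simulated annealing: a decaying temperature makes the search greedier
    # over time; worse moves pass with probability exp((s_new - s_old) / T).
    T = max(T0 * decay ** t, 1e-6)
    return s_new > s_old or rng.random() < math.exp((s_new - s_old) / T)
```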
Applications
Paraphrase Generation
Could be useful for various NLP applications
- E.g., query expansion, data augmentation
Input: Which is the best training institute in Pune for digital marketing?
Reference: Which is the best digital marketing training institute in Pune?
Paraphrase Generation
• Search objective
- Fluency
- Semantic preservation
- Expression diversity: the paraphrase should be different from the input
  sexp(y*, y0) = 1 − BLEU(y*, y0), where BLEU measures the n-gram overlap
• Search algorithm: simulated annealing
• Search space: the entire sentence space, with y0 = input
• Search neighbors
- Generic word deletion, insertion, and replacement
- Copying words from the input sentence
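The diversity term can be sketched as below; a minimal sketch that uses clipped n-gram precision as a crude stand-in for BLEU (real BLEU also combines several n-gram orders and a brevity penalty):

```python
from collections import Counter

def ngram_precision(y, y0, n=2):
    """Clipped n-gram precision of y against y0, a crude stand-in for BLEU."""
    def grams(s):
        return Counter(tuple(s[i:i + n]) for i in range(len(s) - n + 1))
    gy, g0 = grams(y), grams(y0)
    clipped = sum(min(c, g0[g]) for g, c in gy.items())
    return clipped / max(sum(gy.values()), 1)

def s_expression(y, y0, n=2):
    """Expression diversity s_exp(y, y0) = 1 - BLEU(y, y0): higher when the
    candidate paraphrase shares fewer n-grams with the input y0."""
    return 1.0 - ngram_precision(y, y0, n)
```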
Text Simplification
Input: In 2016 alone, American developers had spent 12 billion dollars on constructing theme parks, according to a Seattle-based reporter.
Reference: American developers had spent 12 billion dollars in 2016 alone on building theme parks.
Could be useful for
- education purposes (e.g., kids, foreigners)
- those with dyslexia
Key observations
- Dropping phrases and clauses
- Phrase re-ordering
- Dictionary-guided lexicon substitution
Text Simplification
Search objective
- Language model fluency (discounted by word frequency)
- Cosine similarity
- Entity matching
- Length penalty
- Flesch Reading Ease (FRE) score [Kincaid et al., 1975]
Search operations
- Dictionary-guided substitution (e.g., WordNet)
- Phrase removal
- Re-ordering with parse trees
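The FRE term of the objective follows the standard readability formula, sketched below; counting syllables requires a heuristic or a pronunciation dictionary, which is omitted here:

```python
def flesch_reading_ease(n_words, n_sentences, n_syllables):
    """Flesch Reading Ease: higher scores mean easier-to-read text.
    FRE = 206.835 - 1.015 * (words/sentence) - 84.6 * (syllables/word)."""
    return (206.835
            - 1.015 * (n_words / n_sentences)
            - 84.6 * (n_syllables / n_words))
```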
Text Summarization
Key observations
- Words in the summary mostly come from the input
- If we generate the summary by selecting input words, the search space shrinks dramatically
Input: The world’s biggest miner bhp billiton announced tuesday it was dropping its controversial hostile takeover bid for rival rio tinto due to the state of the global economy
Reference: bhp billiton drops rio tinto takeover bid
Output (word selection): bhp billiton dropping hostile bid for rio tinto
Text Summarization
• Search objective
- Fluency
- Semantic preservation
- A hard length constraint (explicitly controlling length is not feasible in previous work)
• Search space: only feasible solutions, shrinking from |V|^|y| to C(|x|, s) summaries of s words selected from the input x
• Search neighbor: swap only
• Search algorithm: hill climbing
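The swap neighborhood and the search-space reduction can be sketched as below; a minimal sketch representing a candidate summary as a 0/1 selection mask over the input words (the mask encoding is an assumption for illustration):

```python
import random
from math import comb

def swap_neighbor(mask, rng):
    """One 'swap' edit for summarization by word selection: the summary is the
    input words where mask == 1; swapping a selected position with an
    unselected one keeps the summary length s fixed (the hard constraint)."""
    ones = [i for i, m in enumerate(mask) if m]
    zeros = [i for i, m in enumerate(mask) if not m]
    i, j = rng.choice(ones), rng.choice(zeros)
    new = list(mask)
    new[i], new[j] = 0, 1
    return new

# Restricting to feasible solutions shrinks the space from |V|^|y| to C(|x|, s),
# e.g. choosing an 8-word summary from a 30-word input:
n_candidates = comb(30, 8)  # 5852925
```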
Experimental Results
Research Questions
• General performance
• Greediness vs. Stochasticity
• Search objective vs. Measure of success
General Performance: Paraphrase Generation
BLEU and ROUGE scores are automatic evaluation metrics based on references
General Performance: Text Summarization
General Performance: Text Simplification
General Performance: Human Evaluation on Paraphrase Generation
General Performance: Examples
Main conclusions
• Search-based unsupervised text generation works in a variety of applications.
• Surprisingly, it does yield fluent sentences.
Greediness vs. Stochasticity: Paraphrase Generation
Findings:
• Greedy search ≺ simulated annealing
• Sampling ≺ stochastic search
Search Objective vs. Measure of Success
Experiment: summarization by word selection
Comparing hill climbing (w/ restart) and exhaustive search:
• Exhaustive search does yield higher scores s(y)
• Exhaustive search does NOT yield a higher measure of success (ROUGE)
Conclusion & Future Work
Search-based unsupervised text generation
General framework
• Search objective
- Fluency, semantic coherence, etc.
• Search space
- Word generation from the vocabulary; word selection
• Search algorithm
- Local search with word-based edits
- MH, SA, and hill climbing
Applications
- Paraphrasing, summarization, simplification
Future Work: Defining the Search Neighborhood
Input: What would you do if given the power to become invisible?
Output: What would you do when you have the power to be invisible?
Current progress
- Large edits do occur, likely because SA is less greedy, but they are rare
Future work
- Phrase-based edits (combining discrete sampling with a VAE)
- Syntax-based edits (making use of a probabilistic CFG)
Future Work: Initial State of the Local Search
Current applications
- Paraphrasing, summarization, text simplification, grammatical error correction
- The input and desired output closely resemble each other
Future work
- Dialogue systems, machine translation, etc.
- Designing the initial search state for general-purpose text generation
- Combining with retrieval-based methods
Future Work: Combining Search and Learning
Disadvantages of search-only approaches
- Efficiency: 1-2 seconds per sample
- A heuristically defined objective may be systematically wrong
Future work
- MCTS (currently exploring)
- Difficulties: large branching factor, noisy reward
References
Ning Miao, Hao Zhou, Lili Mou, Rui Yan, and Lei Li. CGMH: Constrained sentence generation by Metropolis-Hastings sampling. In AAAI, 2019.
Xianggen Liu, Lili Mou, Fandong Meng, Hao Zhou, Jie Zhou, and Sen Song. Unsupervised paraphrasing by simulated annealing. In ACL, 2020.
Raphael Schumann, Lili Mou, Yao Lu, Olga Vechtomova, and Katja Markert. Discrete optimization for unsupervised sentence summarization with word-level extraction. In ACL, 2020.
Dhruv Kumar, Lili Mou, Lukasz Golab, and Olga Vechtomova. Iterative edit-based unsupervised sentence simplification. In ACL, 2020.
Acknowledgments
Lili Mou is supported by AltaML, Amii Fellow Program, and Canadian CIFAR AI Chair Program.
Q&A
Thanks for listening!