
Talking Robots
Introducing Tarot (1)

Geert-Jan M. Kruijff

Monday, January 16, 2012

Talking Robots @ German Research Center for AI

Geert-Jan M. Kruijff © 2011 http://talkingrobots.dfki.de

Tarot: The Talking Robots Toolkit

http://talkingrobots.dfki.de/?page_id=443

Monday, January 16, 2012

Talking Robots @ German Research Center for AI

Geert-Jan M. Kruijff © 2012 http://talkingrobots.dfki.de

Goal and overview

Goal

Introduce Tarot's aims and structure; introduce one of its core concepts (Logical Forms); and discuss parsing, content planning, and utterance realization.

Overview

1. Tarot: Aims and overview

2. Logical forms

3. Using tccg to parse with MOLOKO

4. Content planning: From proto-logical form to full form

5. Exercises


Tarot: Aims

Tarot: Talking Robots Toolkit

Provide a comprehensive toolkit for building systems for situated dialogue processing for human-robot interaction, including the connection to robot middleware.

What would Tarot enable you to do?

Build a spoken dialogue system

Build connections to extra-linguistic representations, to influence comprehension & production

Build connections to robot middleware, to “control” a robot


Tarot: Background

Situated dialogue as collaborative activity

Intention-based view on situated dialogue processing

Dialogue comprehension and management based on an intention / intension / denotation cycle

Ontological mediation to situate meaning

Ontologies and relational structure are considered core to establishing connections between different levels of representation, whether within dialogue processing or between dialogue and other (extra-linguistic) processes.


Tarot: Overview

Core

API: ASR, parsing, dialogue interpretation & management, content planning, realization, synthesis

Resources: CCG grammar (MOLOKO)

All API packages and resources come with documentation

Additional

Bridges to commercial ASR packages (Nuance, Loquendo)

R&D prototypes for different levels of functionality

Middleware

ROS, NAO


Logical forms

[Architecture diagram: Situated Beliefs & Intentions sit between External Processes and the dialogue system; Dialogue Understanding handles the input side and Dialogue Production the output side, both connected to Dialogue Management.]



Getting to logical forms

[Processing diagram, built up across slides:]

robust spoken dialogue processing

(most probable) logical form of linguistic meaning

(ranked) reference hypotheses given binding, inference

(weighted) abduction: given uncertainty, proof to the "best" explanation of why something was said or done, or how best to achieve a goal, using assumptions & assertions

multi-agent beliefs, some bound (with a probability), as basis for assumptions & assertions

subarchitectures may trigger actions, e.g. questions, and help verify assertions

context- and content-determination for utterances to be realized, including assignment of contextually appropriate intonation


Building up linguistic meaning with CCG

Combinatory Categorial Grammar

In the lexicon, words are assigned categories

A category specifies the grammatical use (syntactic behavior) of the word

A category can be atomic, or a function using “slashes” \ (left) and / (right)

Functional categories are written “result first”: RES |ARG1 ... |ARGn

Examples

Nouns: N

Adjectives: N / N

Determiners: NP / N

Transitive verbs: (S \ NP) / NP, also written as S \ NP / NP
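The example categories above can be encoded directly as nested data structures. The sketch below is a toy encoding for illustration (atoms as strings, functions as result-first triples); it is not the OpenCCG/Tarot notation.

```python
# Toy encoding of CCG categories: atoms are strings,
# functions are (result, slash, argument) triples, result first.

N, NP, S = "n", "np", "s"
ADJ = (N, "/", N)                    # adjectives: N/N
DET = (NP, "/", N)                   # determiners: NP/N
TV  = ((S, "\\", NP), "/", NP)       # transitive verbs: (S\NP)/NP

def show(cat):
    """Render a category in slash notation, parenthesizing complex parts."""
    if isinstance(cat, str):
        return cat
    res, slash, arg = cat
    res_s = show(res) if isinstance(res, str) else "(" + show(res) + ")"
    arg_s = show(arg) if isinstance(arg, str) else "(" + show(arg) + ")"
    return res_s + slash + arg_s

print(show(TV))   # -> (s\np)/np
```

The result-first convention means the outermost tuple always names what you get once all arguments have been consumed.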



Application

Basic rules of combination

Forward application (>): X/Y Y ⇒ X

Backward application (<): Y X\Y ⇒ X

Example derivation

    Ed      saw       Ann
    np   (s\np)/np    np
         -------------- >
             s\np
    ------------------- <
             s

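The two application rules can be sketched as functions over the toy tuple categories (atoms as strings, functions as result-first triples). This is an illustrative sketch, not the OpenCCG/Tarot API.

```python
def fwd_apply(f, a):
    """Forward application (>): X/Y  Y  =>  X"""
    if isinstance(f, tuple) and f[1] == "/" and f[2] == a:
        return f[0]
    return None

def bwd_apply(a, f):
    """Backward application (<): Y  X\\Y  =>  X"""
    if isinstance(f, tuple) and f[1] == "\\" and f[2] == a:
        return f[0]
    return None

# "Ed saw Ann": np, (s\np)/np, np
saw = (("s", "\\", "np"), "/", "np")
vp = fwd_apply(saw, "np")   # saw Ann      =>  s\np
s  = bwd_apply("np", vp)    # Ed [saw Ann] =>  s
print(s)                    # -> s
```

Note how the derivation above is replayed exactly: the verb first consumes its object to the right, then the resulting s\np consumes the subject to the left.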


Combinatory CG

CCG adds further rules, based on the combinators of Curry & Feys' combinatory logic (Steedman 1996, 2000)

Forward type raising (>T): X ⇒ Y/(Y\X)

Forward harmonic composition (>B): X/Y Y/Z ⇒ X/Z

These rules induce associativity:

    Ed         saw       Ann
    np      (s\np)/np    np
    -------- >T
    s/(s\np)
    ------------------- >B
           s/np
    ------------------------ >
              s

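Type raising and composition can likewise be sketched over the toy tuple categories, showing why "Ed saw" becomes a constituent of category s/np. Illustrative only; not the OpenCCG/Tarot API.

```python
def type_raise(x):
    """Forward type raising (>T): X => Y/(Y\\X), instantiated here with Y = s."""
    return ("s", "/", ("s", "\\", x))

def fwd_compose(f, g):
    """Forward harmonic composition (>B): X/Y  Y/Z  =>  X/Z"""
    if f[1] == "/" and g[1] == "/" and f[2] == g[0]:
        return (f[0], "/", g[2])
    return None

saw = (("s", "\\", "np"), "/", "np")   # (s\np)/np

ed_raised = type_raise("np")           # >T:  np  =>  s/(s\np)
ed_saw = fwd_compose(ed_raised, saw)   # >B:  s/(s\np) (s\np)/np  =>  s/np
print(ed_saw)                          # "Ed saw" now expects an np to its right
```

This associativity is exactly what makes left-to-right incremental parsing possible later in the lecture: every prefix of the sentence gets a category.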


Representing meaning

[Graph: read:event (nominal r) with <Actor> to Kathy:person (k), <Patient> to book:object (b), and <Locative> to park:region (p).]

As a logical form:

    @{r:event} read & @r<Actor>k & @{k:person} Kathy
    & @r<Patient>b & @{b:object} Book
    & @r<Locative>p & @{p:region} Park


Syntax/semantics interface

The basic idea follows older proposals by Zeevat (1988), Hoffman (1995), ...

Co-index an argument in the meaning with the category that provides it

Constraints only get triggered when an index on a category gets unified

Derivation over categories is the only source for triggering constraints

How do we build meaning?

    read ⊢ (S:h \ NP:a) / NP:b
    @h read & @h<ACT>a & @h<PAT>b

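The co-indexation idea can be sketched as follows: each lexical entry pairs a category carrying nominal variables with a list of elementary predications over those variables, and each application step substitutes the unified index. This is a hypothetical mini-model for illustration, not the Tarot implementation.

```python
# Lexical entries: category (with index variables) + elementary predications.
ed    = {"cat": ("NP", "e"), "lf": [("@", "e", "Ed")]}
reads = {"cat": ((("S", "r"), "\\", ("NP", "a")), "/", ("NP", "b")),
         "lf": [("@", "r", "read"), ("ACT", "r", "a"), ("PAT", "r", "b")]}
mad   = {"cat": ("NP", "m"), "lf": [("@", "m", "Mad")]}

def substitute(lf, var, val):
    """Replace index variable `var` by `val` throughout the predications."""
    return [tuple(val if t == var else t for t in p) for p in lf]

# Forward application: the verb consumes its object NP, unifying b := m
obj_var = reads["cat"][2][1]                       # 'b'
vp_lf = substitute(reads["lf"], obj_var, "m") + mad["lf"]

# Backward application: the subject NP is consumed, unifying a := e
subj_var = reads["cat"][0][2][1]                   # 'a'
s_lf = substitute(vp_lf, subj_var, "e") + ed["lf"]
print(s_lf)
```

The final list of predications corresponds to the conjunction @r read & @r<ACT>e & @e Ed & @r<PAT>m & @m Mad, built purely by index unification during the derivation.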


Example

    Ed ⊢ NP:e
    @e Ed

    reads ⊢ (S:r \● NP:a) /● NP:b
    @r read & @r<ACT>a & @r<PAT>b

    Mad ⊢ NP:m
    @m Mad

    reads Mad (>) ⊢ S:r \● NP:a
    @r read & @r<ACT>a & @r<PAT>m & @m Mad

    Ed reads Mad (<) ⊢ S:r
    @r read & @r<ACT>e & @e Ed & @r<PAT>m & @m Mad


Arguments and adjuncts

Arguments and adjuncts have different behavior

The way the meaning of an argument contributes to that of the head is specified by the head

Adjuncts use a specification that enables them to insert their meaning into that of the head they modify, extending the meaning of the head

Adjunct meaning

Adjuncts no longer act like "predicates" (as in yesterday(read(j,c))); we obtain (retain) a proper relational structure:

    yesterday ⊢ S:h \• S:h
    @h p & @h<TWH>(y & yesterday)



Example

    yesterday ⊢ S:h \• S:h
    @h p & @h<TWH>(y & yesterday)

    Ed ⊢ NP:e
    @e Ed

    read ⊢ (S:r \● NP:a) /● NP:b
    @r read & @r<ACT>a & @r<PAT>b

    Mad ⊢ NP:m
    @m Mad

    read Mad (>) ⊢ S:r \● NP:a
    @r read & @r<ACT>a & @r<PAT>(m & Mad)

    Ed read Mad (<) ⊢ S:r
    @r read & @r<ACT>(e & Ed) & @r<PAT>(m & Mad)

    Ed read Mad yesterday (<) ⊢ S:r
    @r read & @r<ACT>(e & Ed) & @r<PAT>(m & Mad) & @r<TWH>(y & yesterday)


Incremental processing with CCG

Building up meaning step-by-step

CCG enables “left-to-right” parsing

The rightward frontier shows the expected category and meaning

Linguistic meaning is built up part by part, using the possibility of viewing linguistic meaning as a conjunction of elementary predications
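Left-to-right processing can be sketched by keeping a single "frontier" category and combining it with each incoming word via type raising, composition, or application. A toy sketch over tuple categories, not the Tarot implementation:

```python
def type_raise(x):                      # >T: X => s/(s\X)
    return ("s", "/", ("s", "\\", x))

def fwd_apply(f, a):                    # >: X/Y Y => X
    if isinstance(f, tuple) and f[1] == "/" and f[2] == a:
        return f[0]
    return None

def fwd_compose(f, g):                  # >B: X/Y Y/Z => X/Z
    if f[1] == "/" and isinstance(g, tuple) and g[1] == "/" and f[2] == g[0]:
        return (f[0], "/", g[2])
    return None

def parse_left_to_right(words, lexicon):
    """After each word, the frontier category shows what is still expected."""
    frontier = None
    for w in words:
        cat = lexicon[w]
        if frontier is None:
            # Type-raise an initial atomic category so it can compose rightward.
            frontier = type_raise(cat) if isinstance(cat, str) else cat
            continue
        combined = fwd_apply(frontier, cat)
        if combined is None:
            combined = fwd_compose(frontier, cat)
        frontier = combined
    return frontier

lexicon = {"Ed": "np", "saw": (("s", "\\", "np"), "/", "np"), "Ann": "np"}
print(parse_left_to_right(["Ed", "saw", "Ann"], lexicon))  # -> s
```

After "Ed" the frontier is s/(s\np), after "Ed saw" it is s/np, and after the full sentence it collapses to s: every prefix has a category, which is what makes incremental interpretation possible.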



Grammatical interpretation

Syntax: Combinatory categorial grammar

Semantics: Ontologically richly sorted, relational structures

(Syntax reflects semantics reflects extra-linguistic categorical, situated meaning)
(Possibility of probabilistically ranking grammatical derivations)

Derivation of "take the mug":

take :- s:e / np:p
  @e:action-motion(take ^ <Actor>r1:robot ^ <Patient>p)

the :- np:x / n:x
  @x:thing(<Delimitation>unique ^ <Quantification>singular)

mug :- n:x
  @x:thing(mug)

take the :- s:e / n:x
  @e:action-motion(take ^ <Actor>r1:robot ^ <Patient>(x:thing ^ <Delimitation>unique ^ <Quantification>singular))

take the mug :- s:e
  @e:action-motion(take ^ <Actor>r1:robot ^ <Patient>(x:thing ^ mug ^ <Delimitation>unique ^ <Quantification>singular))
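The function application steps of this derivation can be caricatured in a few lines of Python. This is a hypothetical dict-based stand-in for CCG signs and logical forms, not Tarot's or OpenCCG's actual API; the field names ("cat", "sem", "hole") are invented for illustration.

```python
# Minimal sketch of CCG forward application (X/Y  Y  =>  X), with plain
# dicts standing in for signs. "hole" names the open semantic slot that
# the argument's LF fills; all names here are hypothetical.
def forward_apply(functor, argument):
    result_cat, arg_cat = functor["cat"]       # ("s", "np") stands for s/np
    if argument["cat"] != arg_cat:
        return None                            # categories do not combine
    sem = dict(functor["sem"])
    sem[functor["hole"]] = argument["sem"]     # fill the open slot
    return {"cat": result_cat, "sem": sem}

take = {"cat": ("s", "np"), "hole": "Patient",
        "sem": {"sort": "action-motion", "prop": "take", "Actor": "r1:robot"}}
the_mug = {"cat": "np",
           "sem": {"sort": "thing", "prop": "mug",
                   "Delimitation": "unique", "Quantification": "singular"}}

s = forward_apply(take, the_mug)   # s["sem"] mirrors the final LF above
```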


Logical forms in Tarot

MOLOKO

CCG grammar in DotCCG format

Large collection of (SVO) grammatical families, and a large English dictionary

Parsing & realization

OpenCCG-based (API, tccg) parsing & realization support

Tarot API packages for parsing and realization extend OpenCCG


Setting things up

Get OpenCCG running

Download OpenCCG from http://openccg.sf.net (v.0.9.4)

Install OpenCCG (cf. README)

Get MOLOKO v6

Download MOLOKO from http://talkingrobots.dfki.de/tarot

Unzip the library ...


The MOLOKO resource

Resource organization

Grammar provided in *.ccg format, under ccg-files

.ccg files organized (largely) by grammatical type

Resource compilation

OpenCCG works with XML format

Compilation from .ccg to XML

gj:[~/devel/moloko.v6]% ccg2xml moloko_FAST.ccg
ccg2xml: Processing moloko_FAST.ccg
Outputting XML file: ./moloko_FAST-lexicon.xml
Outputting XML file: ./moloko_FAST-grammar.xml
Outputting XML file: ./moloko_FAST-morph.xml
Outputting XML file: ./moloko_FAST-rules.xml
Outputting XML file: ./moloko_FAST-testbed.xml
Outputting XML file: ./moloko_FAST-types.xml
gj:[~/devel/moloko.v6]% mkdir moloko_FAST
gj:[~/devel/moloko.v6]% mv moloko_FAST-*.xml moloko_FAST


What compilation uses

ccg2xml

Python program to transform .ccg into XML

Different OpenCCG files are being generated (grammar, lexicon, morph, etc.)

Requires a single .ccg file to be transformed

moloko*.ccg

A single .ccg file: moloko, moloko-FAST, moloko-FULL

These files can be created by merge-ccg.pl

Merging grammar files

merge-ccg.pl runs over all .ccg files in a given directory

What files are merged is thus determined by what is there


What compilation creates

ccg2xml X.ccg

Creates XML files named X-abc.xml

Copy/move these to an individual subdirectory

OpenCCG

Within the individual subdirectory,

before running tccg,

rename the X-grammar.xml file to grammar.xml

gj:[~/devel/moloko.v6]% mv moloko_FAST-*.xml moloko_FAST

gj:[~/devel/moloko.v6]% cd moloko_FAST
gj:[~/devel/moloko.v6/moloko_FAST]% mv moloko_FAST-grammar.xml grammar.xml


And now let’s get things going!



Starting up tccg

gj:[~/devel/moloko.v6/moloko_FAST]% tccg
Unable to find a $JAVA_HOME at "/usr/", continuing with system-provided Java...
Loading grammar from URL: file:/Users/gj/devel/moloko.v6/moloko_FAST/grammar.xml
Grammar 'moloko_FAST.ccg' loaded.

Enter strings to parse.
Type ':r' to realize selected reading of previous parse.
Type ':h' for help on display options and ':q' to quit.
You can use the tab key for command completion, Ctrl-P (prev) and Ctrl-N (next) to access the command history, and emacs-style control keys to edit the line.

tccg> :h

Many options!

:sh          show current preference settings
:v           verbose output
:reset       reset options to defaults
:feats (L)   show features (or just show features in list L)
:nofeats     don't show features
:sem         show semantics
:nosem       don't show semantics
:all         show all parse results
:notall      don't show all parse results
:derivs      show derivations
:noderivs    don't show derivations
:vison (FN)  turn visualization on (saving to file with name FN)
:visoff      turn visualization off



:noderivs :sem

tccg> :noderivs
tccg> put the box near the red ball
17 parses found.
:
Parse 7: s : @w0:action-non-motion(put ^ <Mood>imp ^ <Actor>(a1:entity ^ addressee) ^ <Patient>(w2:thing ^ box ^ <Delimitation>unique ^ <Num>sg ^ <Quantification>specific) ^ <Result>(w3:m-whereto ^ near ^ <Anchor>(w6:thing ^ ball ^ <Delimitation>unique ^ <Num>sg ^ <Quantification>specific ^ <Modifier>(w5:q-color ^ red))) ^ <Subject>a1:entity)

Tells you what you get: the meaning (with a syntactic category, of course).

:derivs :nosem

Parse 7: s
------------------------------
(lex) put :- s\!np/pp/np
(lex) the :- np/^n
(lex) box :- n
(>) the box :- np
(>) put the box :- s\!np/pp
(lex) near :- pp/^np
(lex) the :- np/^n
(lex) red :- adj
(gram) typechange-5: adj$1 => n/n$1
(typechange-5) red :- n/n
(lex) ball :- n
(>) red ball :- n
(>) the red ball :- np
(>) near the red ball :- pp
(>) put the box near the red ball :- s\!np
(gram) typechange-17: s\!np => s
(typechange-17) put the box near the red ball :- s

Tells you how you get there: the derivation, with all the lexical entries and rules that have been applied.



Trying things out: Single words for entities

tccg> :derivs
tccg> :sem
tccg> office
3 parses found.

Parse 1: n : @w0:e-place(office)
------------------------------
(lex) office :- n : @T_0:e-place(office)

Parse 2: du : @w0:e-place(office)
------------------------------
(lex) office :- n : @T_0:e-place(office)
(gram) typechange-53: n => du
(typechange-53) office :- du : @T_0:e-place(office)

Parse 3: n/*n : @x1:entity(<Compound>(w0:e-place ^ office))
------------------------------
(lex) office :- n : @T_0:e-place(office)
(gram) typechange-4: n => n/*n : @T_0:entity(<Compound>T2_0)
(typechange-4) office :- n/*n : (@T_0:e-place(office) ^ @T_4:entity(<Compound>T_0:e-place))

Parse 1 shows the basic lexical entry; the entity semantics specifies the (lexical) index variable of the entity (T_0), instantiated in the "utterance" as w0, its sort (e-place), and the proposition (office).

In Parse 2, the major change is in the syntactic category assigned to the expression. Rather than a noun (n) we now have a discourse unit (du), which can be combined with other such units to form possibly incomplete or discontinuous utterances.

Finally, in Parse 3 both the syntactic category and the semantic structure are changed, to create a meaning in which "office" functions as part of a compound or multi-word expression, as e.g. in "the edge of GJ's office door".
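The unary type-changing rules behind Parses 2 and 3 can be caricatured as functions over signs. A minimal sketch with tuples and dicts, assuming a (category, semantics) pair as the sign representation; this is not OpenCCG's implementation.

```python
# Sketch of a unary type-changing rule like typechange-53 (n => du):
# only the syntactic category changes; the semantics is carried over.
def typechange_n_to_du(sign):
    cat, sem = sign
    return ("du", sem) if cat == "n" else None

office = ("n", {"id": "w0", "sort": "e-place", "prop": "office"})
du_sign = typechange_n_to_du(office)   # category becomes du, LF unchanged
```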



Trying things out: Entity descriptions

tccg> :derivs
tccg> :sem
tccg> the
6 parses found.

Parse 1: np/^n : @x1(<Delimitation>unique ^ <Num>sg ^ <Quantification>specific)
Parse 2: np/^n : @x1(<Delimitation>unique ^ <Num>pl ^ <Quantification>specific)
Parse 3: np/^n : @x1(<Delimitation>unique ^ <Num>pl ^ <Quantification>unspecific)
::

tccg> the office
3 parses found.

Parse 1: np : @w1:e-place(office ^ <Delimitation>unique ^ <Num>sg ^ <Quantification>specific)
::

Forward projection: The determiner projects structure "forward," in this case a nominal structure, for which it then provides number, delimitation and quantification information.

Lexical ambiguity: Parse 1 is based on a lexical item for a determiner modifying a noun in singular, Parse 2 provides the plural case. Parse 3 provides the generic use of the definite determiner (in English), as e.g. in "The door is the typical entrance to an office."

Backward integration: complete integration of the current expression-under-analysis with what already preceded. This makes it possible for the analysis to proceed incrementally.
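Forward projection followed by backward integration can be caricatured as a feature merge. A sketch with dicts standing in for Tarot's logical-form objects (hypothetical encoding, not the actual implementation):

```python
# Sketch: the determiner projects a feature bundle "forward"; combining it
# with the noun merges those features into the noun's LF. Plain dicts are
# a hypothetical stand-in for Tarot's LF objects.
def apply_determiner(det_feats, noun_lf):
    merged = dict(noun_lf)          # keep the noun's index, sort, proposition
    merged.update(det_feats)        # integrate the projected features
    return merged

the = {"Delimitation": "unique", "Num": "sg", "Quantification": "specific"}
office = {"id": "w1", "sort": "e-place", "prop": "office"}

np = apply_determiner(the, office)
# np corresponds to:
# @w1:e-place(office ^ <Delimitation>unique ^ <Num>sg ^ <Quantification>specific)
```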



Trying things out: Entity descriptions with modifiers

tccg> :derivs
tccg> :sem
tccg> GJ 's big office
3 parses found.

Parse 1: np : @w3:e-place(office ^ <Delimitation>unique ^ <Num>sg ^ <Quantification>specific ^ <Modifier>(w2:q-size ^ big) ^ <Owner>(w0:person ^ GJ))
Parse 2: du : @w3:e-place(office ^ <Delimitation>unique ^ <Num>sg ^ <Quantification>specific ^ <Modifier>(w2:q-size ^ big) ^ <Owner>(w0:person ^ GJ))
:

tccg> GJ 's big office at the end of the hall
3 parses found.

Parse 1: np : @w3:e-place(office ^ <Delimitation>unique ^ <Num>sg ^ <Quantification>specific ^ <Modifier>(w2:q-size ^ big) ^ <Modifier>(w4:m-location ^ at ^ <Anchor>(w6:e-region ^ end ^ <Delimitation>unique ^ <Num>sg ^ <Quantification>specific ^ <Owner>(w9:e-place ^ hall ^ <Delimitation>unique ^ <Num>sg ^ <Quantification>specific))) ^ <Owner>(w0:person ^ GJ))

Modifiers: Particularly for entity descriptions, modifiers are (mostly) modeled by Modifier relations. Modifiers are distinguished by the type of information they provide (e.g. q-size, meaning size).


Verbs

Verbal structure

Grammatical subject

Verb frame-specific roles: Actor, Patient, Beneficiary, ...

Features indicating Tense, Aspect, Mood; (Modality is a Modifier)

Specific case: Copula verb “to be”

Ascription use: “The mug is green,” “Where is the mug”

Presentation use: “There is a mug on the table”

Relational structure using Cop-Restr and Cop-Scope, based on Partee’s tripartite structure

Design principles

Again, facilitate backward integration and forward projection


Trying things out: basic action verb structure

tccg> :noderivs
tccg> go
15 parses found.

Parse 1: adv : @w0:m-manner(go)
:
Parse 12: s : @w0:action-motion(go ^ <Mood>imp ^ <Actor>(a1:entity ^ addressee) ^ <Subject>a1:entity)
:
tccg>



tccg> go to the office
4 parses found.

Parse 1: s : @w0:action-motion(go ^ <Mood>imp ^ <Actor>(a1:entity ^ addressee) ^ <Modifier>(w1:m-whereto ^ to ^ <Anchor>(w3:e-place ^ office ^ <Delimitation>unique ^ <Num>sg ^ <Quantification>specific)) ^ <Subject>a1:entity)

Imperative use of the verb. The actor (and the grammatical subject) of the construction is defined contextually, as the addressee of the utterance.



tccg> could you go to the office
7 parses found.

Parse 1: s : @w2:action-motion(go ^ <Mood>int ^ <Actor>(w1:person ^ you ^ <Num>sg) ^ <Modifier>(w0:modal ^ could) ^ <Modifier>(w3:m-whereto ^ to ^ <Anchor>(w5:e-place ^ office ^ <Delimitation>unique ^ <Num>sg ^ <Quantification>specific)) ^ <Subject>w1:person)

tccg> can you go to the office
7 parses found.

Parse 1: s : @w2:action-motion(go ^ <Mood>int ^ <Tense>pres ^ <Actor>(w1:person ^ you ^ <Num>sg) ^ <Modifier>(w0:modal ^ can) ^ <Modifier>(w3:m-whereto ^ to ^ <Anchor>(w5:e-place ^ office ^ <Delimitation>unique ^ <Num>sg ^ <Quantification>specific)) ^ <Subject>w1:person)

Interrogative use of the verb. The actor (and the grammatical subject) of the construction is given explicitly in the utterance. Observe that with "can" the Tense is made explicit, whereas with "could" it is left implicit.


Trying things out: ambiguity in verb structure

tccg> put the ball near the box to the left of the computer
44 parses found.

Parse 19: s : @w0:action-non-motion(put ^ <Mood>imp ^ <Actor>(a1:entity ^ addressee) ^ <Patient>(w2:thing ^ ball ^ <Delimitation>unique ^ <Num>sg ^ <Quantification>specific) ^ <Result>(w3:m-whereto ^ near ^ <Anchor>(w5:thing ^ box ^ <Delimitation>unique ^ <Num>sg ^ <Quantification>specific ^ <Modifier>(w6:m-location ^ to ^ <Anchor>(w8:e-region ^ left ^ <Delimitation>unique ^ <Num>sg ^ <Quantification>specific ^ <Owner>(w11:thing ^ computer ^ <Delimitation>unique ^ <Num>sg ^ <Quantification>specific))))) ^ <Subject>a1:entity)

Parse 20: s : @w0:action-non-motion(put ^ <Mood>imp ^ <Actor>(a1:entity ^ addressee) ^ <Modifier>(w6:m-location ^ to ^ <Anchor>(w8:e-region ^ left ^ <Delimitation>unique ^ <Num>sg ^ <Quantification>specific ^ <Owner>(w11:thing ^ computer ^ <Delimitation>unique ^ <Num>sg ^ <Quantification>specific))) ^ <Patient>(w2:thing ^ ball ^ <Delimitation>unique ^ <Num>sg ^ <Quantification>specific) ^ <Result>(w3:m-whereto ^ near ^ <Anchor>(w5:thing ^ box ^ <Delimitation>unique ^ <Num>sg ^ <Quantification>specific)) ^ <Subject>a1:entity)

Parse 19 attaches "to the left of the computer" inside the description of the box; Parse 20 attaches it to the putting action itself.



Trying things out: Copula constructions

Copula "to be"

Something is predicated (Cop-Scope) over something (Cop-Restr)

Ascription: Ascribing a property to an entity, "the mug is green"

Presentation: Situating an entity, "there is a mug there"

tccg> the mug is blue
10 parses found.

Parse 2: s : @w2:ascription(be ^ <Mood>ind ^ <Tense>pres ^ <Cop-Restr>(w1:thing ^ mug ^ <Delimitation>unique ^ <Num>sg ^ <Quantification>specific) ^ <Cop-Scope>(w3:q-color ^ blue) ^ <Subject>w1:thing)

tccg> there is a mug there
31 parses found.

Parse 1: s : @w1:presentational(be ^ <Mood>ind ^ <Tense>pres ^ <Modifier>(w0:m-location ^ context ^ <Proximity>m-distal) ^ <Presented>(w3:thing ^ mug ^ <Delimitation>existential ^ <Num>sg ^ <Quantification>specific ^ <Modifier>(w4:m-location ^ context ^ <Proximity>m-distal)))
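As a toy illustration of the Cop-Restr / Cop-Scope split, the ascription reading of "the mug is blue" can be rendered as a nested dict and queried for what is talked about and what is predicated. The encoding is hypothetical, not Tarot's representation.

```python
# Toy dict rendering of the ascription LF for "the mug is blue"
# (hypothetical encoding; field names mirror the relations above).
lf = {
    "id": "w2", "sort": "ascription", "prop": "be",
    "Mood": "ind", "Tense": "pres",
    "Cop-Restr": {"id": "w1", "sort": "thing", "prop": "mug"},
    "Cop-Scope": {"id": "w3", "sort": "q-color", "prop": "blue"},
}

def predication(lf):
    """For an ascription, return (restrictor, predicated property)."""
    if lf["sort"] == "ascription":
        return lf["Cop-Restr"]["prop"], lf["Cop-Scope"]["prop"]
    return None   # presentational readings use <Presented> instead

pair = predication(lf)   # ("mug", "blue")
```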


Exercises

Testsuite

To add an expression to a test suite (in tccg): :2tb XYZ.xml

The number of parses obtained in tccg is stored as the target in the test suite

To run all the examples in a test suite (cmd line): % ccg-test tb.xml

Exercises for MOLOKO

Extend the dictionary with entries for “staircase”, “stairs”, “victim”

Extend the dictionary with entries for “drive”, “enter”, “shall”

Create an OpenCCG XML distribution using moloko-FULL

Create a test suite for the utterances from “talking+acting” (slide#45)

Devise a (simple!) decision structure for modeling how to distinguish between (almost all of) the different readings


Content planning

What is content planning?

Determining what content to convey,

in a sub-dialogue, possibly just an utterance

(Content aggregation and distribution)

What does content planning start from?

Planning starts from a goal,

specifying what is to be conveyed.

What is a goal?

We specify a goal as a proto-logical form,

including the basic structure and contextual information for which the full logical form is to be generated

(This does not exclude access to external sources!)


CPLAN in Tarot

Logical Forms as Graphs

A CCG logical form can be interpreted as an acyclic graph:

@d1:dvp(<Content>(c1:ascription ^ <Cop-Scope>(cs1:type ^ <Questioned>true)) ^
        <Wh-Restr>(:specifier ^ what ^ <Scope>(cs1:type)))
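As an illustration (not Tarot code), such a logical form can be encoded in Python as a dictionary of nodes, each carrying a TYPE, optional propositions or features, and labelled edges to other node ids. The node id "w1" for the specifier is invented here for the sketch; the source leaves it unnamed:

```python
# Hypothetical encoding of the logical form above as a graph.
# Each node has a TYPE, optional feature values (PROP, Questioned),
# and labelled edges pointing at other node ids.
lf = {
    "d1":  {"TYPE": "dvp",
            "edges": {"Content": "c1", "Wh-Restr": "w1"}},
    "c1":  {"TYPE": "ascription",
            "edges": {"Cop-Scope": "cs1"}},
    "cs1": {"TYPE": "type", "Questioned": "true", "edges": {}},
    "w1":  {"TYPE": "specifier", "PROP": "what",   # id "w1" is invented
            "edges": {"Scope": "cs1"}},
}

# The graph is acyclic: following edges from the root never revisits
# a node on the same path (sharing cs1 via two paths is still a DAG).
def is_acyclic(graph, node, seen=()):
    if node in seen:
        return False
    return all(is_acyclic(graph, t, seen + (node,))
               for t in graph[node]["edges"].values())

print(is_acyclic(lf, "d1"))  # True
```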

[Figure: the logical form drawn as an acyclic graph. Each node carries an ID and a TYPE (d1:dvp, c1:ascription, cs1:type, plus a specifier node with PROP what); the edges are the relations Content, Wh-Restr, Cop-Scope, Scope, and the feature Questioned with value true.]

Slide courtesy of Bernd Kiefer. Full presentation at Tarot.


CPLAN in Tarot

The content planner as a rule system

The planner uses a set of transformation rules which specify local modifications (to a logical form as graph)

The planner processes these rules with an engine that applies them to all sub-parts of the logical form (following a specific strategy)

Transformation rules: Antecedent and consequent

The match part of a rule specifies what shape some substructure of the graph must exhibit to have the rule apply

The action part specifies the modifications to parts of the substructure
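A minimal sketch of such a rule engine in Python (an illustration of the idea, not the actual CPLAN implementation): a rule pairs a match predicate (antecedent) with an action (consequent), and the engine applies every rule to every node of the graph. The node names and feature values are invented for the example:

```python
# Hypothetical rule engine: a rule is (match, action); the engine
# walks every node and applies the action wherever the match succeeds.
def apply_rules(graph, rules):
    """Apply each rule's action to every node whose shape matches."""
    for node_id, node in graph.items():
        for match, action in rules:
            if match(node):
                action(graph, node_id)
    return graph

# Toy logical form and a single rule that overwrites the TYPE of
# ascription nodes (node ids c1/cs1 are illustrative).
graph = {"c1":  {"TYPE": "ascription", "edges": {}},
         "cs1": {"TYPE": "type", "edges": {}}}

rules = [
    (lambda n: n.get("TYPE") == "ascription",   # match: required shape
     lambda g, nid: g[nid].update(TYPE="be")),  # action: local modification
]

apply_rules(graph, rules)
print(graph["c1"]["TYPE"])  # be
```

A real engine would also control rule ordering and re-application (the "specific strategy" mentioned above); this sketch does a single pass.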


Step 1: Match Patterns

Specify patterns where certain actions should be applied

<TYPE> (activate all nodes that have an outgoing TYPE edge)
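Assuming a simple dictionary encoding of the graph (hypothetical, for illustration; a TYPE key stands in for the outgoing TYPE edge), the <TYPE> pattern selects every node that carries one:

```python
# Hypothetical: activate all nodes that have an outgoing TYPE edge.
graph = {
    "d1":  {"TYPE": "dvp"},
    "c1":  {"TYPE": "ascription"},
    "cs1": {"TYPE": "type"},
    "w4":  {"PROP": "what"},   # no TYPE edge: not activated
}

active = [nid for nid, node in graph.items() if "TYPE" in node]
print(sorted(active))  # ['c1', 'cs1', 'd1']
```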

[Figure: the example graph, with every node that has an outgoing TYPE edge highlighted as active.]


Step 1: Match Patterns

Specify patterns where certain actions should be applied

<TYPE> ascription (alternatively: :ascription)

[Figure: the example graph, with the single node whose TYPE edge points to ascription (c1) highlighted as active.]


Step 2: Actions

How should the active node be modified

# ^ :newtype overwrites the type (shorthand for # ^ <TYPE>newtype)
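What the action does, sketched on a hypothetical dictionary encoding of the graph (`#` stands for the currently active node; the function name and node id are invented):

```python
# Hypothetical: the action "# ^ :newtype" overwrites the TYPE
# of the active (matched) node.
def overwrite_type(graph, node_id, new_type):
    graph[node_id]["TYPE"] = new_type

graph = {"c1": {"TYPE": "ascription"}}
overwrite_type(graph, "c1", "newtype")
print(graph["c1"]["TYPE"])  # newtype
```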

[Figure: the example graph before the action applies; the matched node's TYPE edge still points to ascription.]


Step 2: Actions

How should the active node be modified

# ^ :newtype overwrites the type (shorthand for # ^ <TYPE>newtype)

[Figure: the example graph after the action; the matched node's TYPE edge now points to newtype instead of ascription.]


Content planning: Proto logical forms

Proto-LFs specify intent and content

Some of the content provided ends up in the result,

other content is just there to drive the “decision making process” (captured by the transformation rules)

Between planning and realization

When writing rules for obtaining a particular outcome,

make sure that the resulting logical form exactly describes the content needed for the realizer: Not more, not less!

(Over- or underspecified logical forms are not realized by the OpenCCG realizer)

Therefore, make sure rules “clean up” meta-level content
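For instance (a hypothetical sketch, not CPLAN rule syntax; the feature names in META_FEATURES are invented), a final cleanup pass could strip the meta-level features that drove the planning decisions before the logical form goes to the realizer:

```python
# Hypothetical cleanup: drop meta-level features from every node
# so the resulting LF contains exactly what the realizer needs.
META_FEATURES = {"Intention", "SpeechFunction"}   # illustrative names

def clean_up(graph):
    for node in graph.values():
        for feat in META_FEATURES:
            node.pop(feat, None)
    return graph

lf = {"d1": {"TYPE": "dvp", "Intention": "request"},
      "c1": {"TYPE": "action-motion"}}
clean_up(lf)
print("Intention" in lf["d1"])  # False
```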


Exercises

Read the documentation

Content planner: General documentation for the GUI, and more detailed technical tutorial slides

Implement planning rules for the “enter the office” suite

Try to cover at least 6 different examples (statements and requests)

Start from a proto logical form for the action,

with a feature to indicate the (surface) intention (e.g. statement vs. request)

and possibly any other meta-level features you need to differentiate between the LFs

Have (simple) rules for simple expansions (e.g. subject, actor, ...) to create common structure

Design rules to introduce intention/function-specific content, to obtain the selection of different examples

Keep the number of meta-features to a minimum; vary in feature-values instead
