jhu mt class: semantics-preserving machine translation
TRANSCRIPT
Semantics-Preserving Machine Translation
The diagram that will not die
(Vauquois, 1968)
Do we really need semantics?
(Jones et al. 2012)
Source: Anna fehlt ihrem Kater
MT: Anna is missing her cat
Reference: Anna’s cat is missing her
"Fehlen" means "ARG1 to be missing to ARG0". There's a slew of German (active voice) verbs that behave like this---dative NP translating to subject NP in English---including certain uses of "sein" (to be). "Passen", for example, can mean "ARG1 to be acceptable to ARG0". "Mir ist kalt" -- "I am feeling cold" or literally "to me is cold"... None of these are idiomatic. They are just forms that the other Germanic languages have (I think) but English lost at the Battle of Hastings. (Asad Sayeed, personal communication)
Semantic transfer
Anna fehlt ihrem Kater
[Meaning graph: a MISS instance with agent CAT and patient ANNA; an owner edge links CAT to ANNA; instance edges link each node to its concept]
Anna’s cat is missing her
Figure 1: A string to meaning graph to string translation pipeline.
Experimental results demonstrate that our system is capable of learning semantic abstractions, and more specifically, to both analyse text into these abstractions and decode them back into text in multiple languages.
The need to manipulate graph structures adds an additional level of complexity to the standard MT task. While the problems of parsing and rule-extraction are well-studied for strings and trees, there has been considerably less work within the NLP community on the equivalent algorithms for graphs. In this paper, we use hyperedge replacement grammars (HRGs) (Drewes et al., 1997) for the basic machinery of graph manipulation; in particular, we use a synchronous HRG (SHRG) to relate graph and string derivations.
We provide the following contributions:
1. Introduction of string↔graph transduction with HRGs to NLP
2. Efficient algorithms for
   • string–graph alignment
   • inference of graph grammars from aligned graph/string pairs
3. Empirical results from a working machine translation system, and analysis of that system’s performance on the subproblems of semantic parsing and generation.
We proceed as follows: Section 2 explains the SHRG formalism and shows how it is used to derive graph-structured meaning representations. Section 3 introduces two algorithms for learning SHRG rules automatically from semantically-annotated corpora. Section 4 describes the details of our machine translation system, and explains how a SHRG is used to transform a natural language sentence into a meaning representation and vice-versa. Section 6 discusses related work and Section 7 summarizes the main results of the paper.
2 Synchronous Hyperedge Replacement Grammars
Hyperedge replacement grammars (Drewes et al., 1997) are an intuitive generalization of context free grammars (CFGs) from strings to hypergraphs. Where in CFGs strings are built up by successive rewriting of nonterminal tokens, in hyperedge replacement grammars (HRGs), nonterminals are hyperedges, and rewriting steps replace these nonterminal hyperedges with subgraphs rather than strings.
A hypergraph is a generalization of a graph in which edges may link an arbitrary number of nodes. Formally, a hypergraph over a set of edge labels C is a tuple H = ⟨V, E, l, X⟩, where V is a finite set of nodes, E is a finite set of edges, where each edge is a subset of V, and l : E → C is a labeling function. |e| ∈ ℕ denotes the type of a hyperedge e ∈ E (the number of nodes connected by the edge). For the directed hypergraphs we are concerned with, each edge contains a distinguished source node and one or more target nodes.
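This definition translates almost directly into code. A sketch with illustrative encoding choices of my own (edges as node tuples whose first element is the distinguished source), not an implementation from the paper:

```python
# Minimal sketch of a directed hypergraph H = <V, E, l, X>.
from dataclasses import dataclass

@dataclass(frozen=True)
class Hyperedge:
    nodes: tuple  # first element is the distinguished source node

    @property
    def type(self):
        """|e|: the number of nodes connected by the edge."""
        return len(self.nodes)

@dataclass
class Hypergraph:
    nodes: set       # V
    edges: frozenset # E
    labels: dict     # l : E -> C
    external: tuple = ()  # distinguished external nodes X

# The "Anna misses her cat" meaning graph: ordinary (binary) edges only.
a0   = Hyperedge(("miss", "anna"))
a1   = Hyperedge(("miss", "cat"))
poss = Hyperedge(("cat", "anna"))
mr = Hypergraph(nodes={"miss", "anna", "cat"},
                edges=frozenset({a0, a1, poss}),
                labels={a0: "A0", a1: "A1", poss: "poss"})
```

An ordinary graph is the special case where every edge has type 2.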
∃x1, x2, x3 . instance(x1, MISS) ∧ agent(x1, x2) ∧ patient(x1, x3) ∧ instance(x2, CAT) ∧ instance(x3, ANNA) ∧ owner(x2, x3)
Problems we must solve
• Where do we get data that looks like this?
• How do we go from sentences to graphs (analysis)?
• How do we go from graphs to sentences (generation)?
• How do we do this efficiently?
Note: generation from arbitrary conjunctions is NP-complete
(Moore, ENLG 2002)
AMRbank
(s / say-01
   :ARG0 (g / organization
            :name (n / name :op1 "UN"))
   :ARG1 (f / flee-01
            :ARG0 (p / person
                     :quant (a / about :op1 14000))
            :ARG1 (h / home :poss p)
            :time (w / weekend)
            :time (a2 / after
                      :op1 (w2 / warn-01
                               :ARG1 (t / tsunami)
                               :location (l / local))))
   :medium (s2 / site :poss g :mod (w3 / web)))
http://amr.isi.edu
About 14,000 people fled their homes at the weekend after a local tsunami warning was issued, the UN said on its Web site
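The PENMAN-style notation above can be read with a small recursive-descent parser. A simplified sketch written for this example (not a full AMR toolkit), shown on a fragment of the graph above:

```python
import re

def parse_amr(s):
    """Parse a PENMAN-style AMR string into nested (var, concept, children)
    tuples, where children is a list of (role, value) pairs.  A simplified
    reader for illustration; it ignores reentrancy resolution."""
    tokens = re.findall(r'\(|\)|"[^"]*"|[^\s()]+', s)
    pos = 0

    def node():
        nonlocal pos
        assert tokens[pos] == '('; pos += 1
        var = tokens[pos]; pos += 1
        assert tokens[pos] == '/'; pos += 1
        concept = tokens[pos]; pos += 1
        children = []
        while tokens[pos] != ')':
            role = tokens[pos]; pos += 1
            if tokens[pos] == '(':
                children.append((role, node()))
            else:  # constant or variable reference
                children.append((role, tokens[pos])); pos += 1
        pos += 1
        return (var, concept, children)

    return node()

amr = parse_amr('(s / say-01 :ARG0 (g / organization :name (n / name :op1 "UN")) '
                ':medium (s2 / site :poss g :mod (w3 / web)))')
```

Note that the reentrant variable `g` under `:poss` is left as a bare token; a full reader would resolve it back to the organization node.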
Synchronous Hyperedge Replacement Grammar
A HRG over a set of labels C is a rewriting system G = ⟨N, T, P, S⟩, where N and T ⊆ C are the finite sets of nonterminal and terminal labels (T ∩ N = ∅), and S ∈ N is the start symbol. P is a finite set of productions of the form A → R, where A ∈ N and R is a hypergraph over C, with a set of distinguished external nodes, X_R.
To describe the rewriting mechanism, let H[e/R] be the hypergraph obtained by replacing the edge e = (v1 ··· vn) with the hypergraph R. The external nodes of R “fuse” to the nodes of e, (v1 ··· vn), so that R connects to H[e/R] at the same nodes that e does to H. Note that H[e/R] is undefined if |e| ≠ |X_R|. Given some hypergraph H with an edge e, if there is a production p : l_H(e) → R ∈ G_P and |X_R| = |e|, we write H ⇒_p H[e/R] to indicate that p can derive H[e/R] from H in a single step. We write H ⇒*_G R to mean that R is derivable from H by G in some finite number of rewriting steps. The grammars we use in this paper do not contain terminal hyperedges, thus the yield of each complete derivation is a graph (but note that intermediate steps in the derivation may contain hyperedges).
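The H[e/R] operation can be sketched directly, assuming a dictionary encoding of hypergraphs (all names here are illustrative, not from the paper):

```python
import itertools

_fresh = itertools.count()

def replace(H, e_id, R):
    """Compute H[e/R]: replace hyperedge e with hypergraph R, fusing R's
    external nodes X_R to e's nodes.  Undefined when |e| != |X_R|."""
    _, e_nodes = H['edges'][e_id]          # (label, node tuple); the label is consumed
    if len(e_nodes) != len(R['external']):
        raise ValueError("H[e/R] undefined: |e| != |X_R|")
    # external nodes of R fuse to e's nodes; internal nodes get fresh names
    m = dict(zip(R['external'], e_nodes))
    for v in R['nodes']:
        if v not in m:
            m[v] = f"n{next(_fresh)}"
    nodes = H['nodes'] | {m[v] for v in R['nodes']}
    edges = {k: v for k, v in H['edges'].items() if k != e_id}
    for lab, vs in R['edges'].values():
        edges[f"e{next(_fresh)}"] = (lab, tuple(m[v] for v in vs))
    return {'nodes': nodes, 'edges': edges, 'external': H['external']}

# one rewriting step: a nonterminal X-edge is replaced by a two-edge path
H = {'nodes': {'a', 'b'}, 'edges': {'e0': ('X', ('a', 'b'))}, 'external': ()}
R = {'nodes': {0, 1, 2}, 'external': (0, 2),
     'edges': {'r1': ('A', (0, 1)), 'r2': ('B', (1, 2))}}
G = replace(H, 'e0', R)
```

The replacement lands on the same two nodes `a` and `b` that the X-edge connected, exactly as the fusing condition requires.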
A Synchronous Hyperedge Replacement Grammar (SHRG) is a HRG whose productions have pairs of right hand sides. Productions have the form (A → ⟨R, Q⟩, ∼), where A ∈ N and R and Q are hypergraphs over N ∪ T. ∼ is a bijection linking nonterminal mentions in R and Q. We call the R side of a rule the source and the Q side the target. Isolating each side produces a projection HRG of the SHRG. In general the target representation can be any hypergraph, or even a string, since strings can be represented as monadic (non-branching) graphs. Because we are interested in translation between MRs and natural language we focus on graph-string SHRGs. The target projection of such a SHRG is a context free string grammar. To ensure that source and target projections allow the same derivations, we constrain the relation ∼ such that every linked pair of nonterminals has the same label in R and Q.
Figure 2 shows an example SHRG with start symbol ROOT_S. External nodes are shaded black.
R1: A0_NNP → ⟨ A0:anna , Anna ⟩
R2: ROOT_VB → ⟨ ROOT:miss , misses ⟩
R3: POSS_PRP$ → ⟨ poss:anna , her ⟩
R4: A1_NN → ⟨ A1:cat , cat ⟩
R5: A0_NP → ⟨ A0_NNP , A0_NNP ⟩
R6: A1_NP → ⟨ A1_NN POSS_PRP$ , POSS_PRP$ A1_NN ⟩
R7: ROOT_VP → ⟨ ROOT_VB A1_NP , ROOT_VB A1_NP ⟩
R8: ROOT_S → ⟨ A0_NP ROOT_VP , A0_NP ROOT_VP ⟩
Figure 2: A graph-string SHRG automatically extracted from the meaning representation graph in Figure 3a using the SYNSEM algorithm. Note the hyperedge in rule R8.
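Because the target projection of the SHRG is a context-free string grammar, the string side of Figure 2 can be replayed as an ordinary CFG expansion. A sketch (nonterminal names flatten the semantic/syntactic symbol pairs; the graph side is omitted):

```python
# String-side (target) projection of the Figure 2 SHRG, as a plain CFG.
rules = {
    "A0_NNP":   ["Anna"],
    "ROOT_VB":  ["misses"],
    "POSS_PRP$": ["her"],
    "A1_NN":    ["cat"],
    "A0_NP":    ["A0_NNP"],
    "A1_NP":    ["POSS_PRP$", "A1_NN"],
    "ROOT_VP":  ["ROOT_VB", "A1_NP"],
    "ROOT_S":   ["A0_NP", "ROOT_VP"],
}

def yield_of(symbol):
    """Recursively expand a symbol; anything without a rule is a terminal."""
    if symbol not in rules:
        return [symbol]
    return [w for child in rules[symbol] for w in yield_of(child)]

sentence = " ".join(yield_of("ROOT_S"))  # -> "Anna misses her cat"
```

A synchronous derivation would apply the same rule sequence to the graph side, producing the meaning graph of Figure 3a in lockstep with this string.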
The graph language captures a type of meaning representation in which semantic predicates and concepts are connected to their semantic arguments by directed edges. The edges are labeled with PropBank-style semantic roles (A0, A1, poss). Nonterminal symbols in this SHRG are complex symbols consisting of a semantic and a syntactic part, notated with the former above the latter.
Since HRG derivations are context free, we can represent them as trees. As an example, Figure 3c shows a derivation tree using the grammar in Figure 2, Figure 3a shows the resulting graph and Figure 3b the corresponding string. Describing graphs as their SHRG derivation trees allows us to use a number of standard algorithms from the NLP literature.
Finally, an Adaptive Synchronous Hyperedge Replacement Grammar (ASHRG) is a SHRG G = ⟨N, T, P′, S, V⟩, where V is a finite set of variables. ASHRG production templates are of the same form as SHRG productions, (A → ⟨R, Q⟩, ∼), but A ∈ N ∪ V and Q, R are hypergraphs over N ∪ T ∪ V. A production template p′ ∈ P′ is realised as a set of rules P by substituting all variables v for any symbol s ∈ N ∪ T: P = {⋃_{v∈V} ⋃_{s∈N∪T} p′[v/s]}. ASHRGs are a useful formalism for defining canonical grammars over the structure of graphs, with production templates describing graph structure transformations without regard to edge labels. We make use of this formalism in the production template R∗ in Figure 4a.
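One way to read the template-realisation formula (each variable slot instantiated independently with each symbol in N ∪ T) can be sketched as follows; the flat tuple encoding of rules is an assumption of this sketch, not the paper's representation:

```python
from itertools import product

def realise(template, variables, symbols):
    """Instantiate every variable slot in a tuple-encoded production template
    with every available symbol, yielding the set of concrete rules."""
    slots = [i for i, t in enumerate(template) if t in variables]
    rules = set()
    for choice in product(symbols, repeat=len(slots)):
        rule = list(template)
        for i, s in zip(slots, choice):
            rule[i] = s
        rules.add(tuple(rule))
    return rules

# template NT -> <(role):(concept), (string)> with three variable slots
rules = realise(("NT", "->", "role", ":", "concept", ",", "string"),
                {"role", "concept", "string"},
                {"A0", "A1", "poss", "anna", "cat", "miss"})
```

With 6 candidate symbols and 3 independent slots this yields 6³ = 216 concrete rules, which is why canonical grammars stay compact as templates even when their realised rule sets are large.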
[Figure 3 content: (a) meaning graph with node root:miss₁ and edges A0 → anna₀, A1 → cat₃, and poss from cat₃ to anna₂; (b) syntax tree (S (NP (NNP Anna₀)) (VP (VB misses₁) (NP (PRP$ her₂) (NN cat₃)))); (c) derivation tree R8(R5(R1), R7(R2, R6(R3, R4)))]

Figure 3: (a) an example meaning representation graph for the sentence ‘Anna misses her cat.’, (b) the corresponding syntax tree. Subscripts indicate which words align to which graph edges. (c) a SHRG derivation tree for (a) using the grammar in Figure 2.
R∗: NT → ⟨ (role):(concept) , (string) ⟩
R1: NT → ⟨ NT NT , — ⟩
R2: NT → ⟨ NT NT , — ⟩
(In the original figure R1 and R2 differ in how the two NT hyperedges attach to nodes, which this flat rendering cannot show.)

[Figure 4b: a derivation tree over R1, R2, and R∗ applications for the MR graph in Figure 3a]

Figure 4: (a) The canonical grammar of width 2. R∗ is a production template and values in parentheses denote variables as defined by the ASHRG formalism. (b) A SHRG derivation tree for the MR graph in Figure 3a using the canonical grammar in (a), as created by the CANSEM algorithm.
More problems
Natural language is not context-free:
wa^m b^n x c^m d^n y
Swiss-German, under string homomorphism, intersected with wa*b*xc*d*y (Shieber 1985)
More problems
It’s (arguably) hard to make SCFG translation models efficient.
Chiang 2005; Huang & Chiang 2007; Venugopal et al. 2007; Petrov et al. 2008; Zhang & Gildea 2008; Hopkins & Langmead 2009; Iglesias et al. 2009, 2011; Huang & Mi 2010; Rush & Collins 2011; Gesmundo et al. 2012
Combinatory categorial grammar
Ajdukiewicz, 1935. Die syntaktische Konnexität (Syntactic Connexity)
within Poor John sleeps with respect to this derivation, but John sleeps is not connex within Poor John sleeps with respect to this derivation.
We now define 'm1 is connex within m2' as short for 'm1 is connex within m2 with respect to all proper derivations of m2', and 'm1 is thoroughly connex' as short for 'm1 is connex within all mi of which it is a (proper or improper) part'.
Clearly, not every connex string has to be also thoroughly connex. In English, John sleeps is connex but not thoroughly connex since it is not connex within Poor John sleeps. That a language should exhibit this character may be deplored, since it introduces complications into its description and into the analyses carried out on the basis of such a description. We shall take up this point again at a later stage.
The complications mentioned are not such as to cause, by necessity, any major ambiguities. The knowledge that a string is thoroughly connex would indeed dispense with the task of testing whether this string is connex within some given context. That this knowledge is not at our disposal might necessitate more complex checking procedures, but the outcome of these procedures can still be unique. Knowing that John sleeps, though connex, is not thoroughly connex, we might be interested in finding out whether it is connex within Paul thinks that John sleeps, or at least whether it is connex within this larger string with respect to some of its proper derivations. This last question can indeed be answered in the affirmative by exhibiting the following proper derivation:
[Derivation (14) of 'Paul thinks that John sleeps', with the John sleeps subderivation framed]
The relevant subderivation is framed. But is John sleeps also connex with respect to all other proper derivations of Paul thinks that John sleeps? The derivation given above is the only proper one with (14) as the original index-sequence. But (14) is only one out of many other possible original sequences. Thinks may also have at least the indexes s/(n) and s/(n)[s] (as in Paul thinks and Paul thinks John is sleeping, waiving possible sophistications) and that also has the indexes n and n/[n] (as in Paul believes that and Paul likes that girl). Disregarding other possible indexes, we have therefore before us at least nine original index-sequences for the given string, which we might arrange in the following way:
[Table of nine candidate original index-sequences for 'Paul thinks that John sleeps']
By systematic testing we can find that only one other original index-sequence
Bar-Hillel, 1953. A Quasi-Arithmetical Notation for Syntactic Description
Steedman, 2000. The Syntactic Process
Steedman, 2011. Taking Scope
Categorial grammar

A set of terminals: {we, helped, Hans, paint, the house}

A set of atomic categories (nonterminals): {NP, S, VP}

The complete set of categories: if A and B are categories, then A/B and A\B are also categories. In a functional category such as S\NP\NP/VP, S is the target category and NP, NP, and VP are the argument categories.

A lexicon: a subset of terminals × categories × lambda terms.

Swiss-German lexicon:
mer ⊢ NP : we′
em Hans ⊢ NP : Hans′
es huus ⊢ NP : house′
halfed ⊢ S\NP\NP/VP : λf.λx.λy.helped′fxy
aastriiche ⊢ VP\NP : λx.paint′x

English lexicon:
we ⊢ NP : we′
helped ⊢ S\NP/VP/NP : λx.λf.λy.helped′fxy
Hans ⊢ NP : Hans′
paint ⊢ VP/NP : λx.paint′x
the house ⊢ NP : house′
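The recursive category definition can be encoded as a tiny datatype. A sketch (the representation, names, and string rendering are my own choices, not from the slides):

```python
# Atomic categories are plain strings; if A and B are categories,
# so are A/B (argument to the right) and A\B (argument to the left).
from dataclasses import dataclass
from typing import Union

Category = Union[str, "Slash"]

@dataclass(frozen=True)
class Slash:
    result: Category     # A
    direction: str       # "/" or "\\"
    arg: Category        # B

    def __str__(self):
        return f"({self.result}{self.direction}{self.arg})"

# paint : VP/NP, and halfed : S\NP\NP/VP (slashes associate to the left)
paint_cat  = Slash("VP", "/", "NP")
halfed_cat = Slash(Slash(Slash("S", "\\", "NP"), "\\", "NP"), "/", "VP")
```

Left-associativity is what lets `halfed_cat` read as "a function that takes a VP to the right, then two NPs to the left, and yields S".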
Categorial grammar

English derivation of "we helped Hans paint the house":
helped Hans ⇒ (>) S\NP/VP : λf.λy.helped′f Hans′ y
paint the house ⇒ (>) VP : paint′house′
helped Hans paint the house ⇒ (>) S\NP : λy.helped′(paint′house′)Hans′y
we helped Hans paint the house ⇒ (<) S : helped′(paint′house′)Hans′we′

Swiss-German derivation of "mer em Hans es huus halfed aastriiche":
halfed aastriiche ⇒ (>B×) S\NP\NP\NP : λz.λx.λy.helped′(paint′z)xy
es huus halfed aastriiche ⇒ (<) S\NP\NP : λx.λy.helped′(paint′house′)xy
em Hans es huus halfed aastriiche ⇒ (<) S\NP : λy.helped′(paint′house′)Hans′y
mer em Hans es huus halfed aastriiche ⇒ (<) S : helped′(paint′house′)Hans′we′

Both derivations yield the same logical form, helped′(paint′house′)Hans′we′, despite the cross-serial word order of the Swiss-German clause.
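The English derivation can be replayed with plain Python lambdas standing in for the lambda terms (an illustrative sketch; nested tuples stand in for the semantic constants helped′ and paint′):

```python
# Lexical semantics as curried functions, mirroring the lambda terms:
# helped : S\NP/VP/NP : λx.λf.λy.helped′fxy
# paint  : VP/NP      : λx.paint′x
helped = lambda x: lambda f: lambda y: ("helped", f, x, y)
paint  = lambda x: ("paint", x)

step1 = helped("Hans")    # > application: S\NP/VP : λf.λy.helped′f Hans′ y
step2 = paint("house")    # > application: VP : paint′house′
step3 = step1(step2)      # > application: S\NP : λy.helped′(paint′house′)Hans′y
s     = step3("we")       # < application: S : helped′(paint′house′)Hans′we′
```

The final value is the tuple rendering of helped′(paint′house′)Hans′we′, matching the logical form at the root of the derivation.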
forward application backward applicationB : g A\B : f ⇒ A : fgA/B : f B : g ⇒ A : fg
primary premise
secondary premise
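The two application rules are easy to make executable. The following is a minimal sketch of my own (not code from the slides): categories are nested tuples of the form ("/" or "\\", result, argument), the λ-terms are ordinary Python closures, and the English derivation above is replayed step by step.

```python
# Minimal categorial-grammar application, an illustrative sketch.
# Categories: atomic strings ("NP", "VP", "S") or ("/"|"\\", result, arg).
NP, VP, S = "NP", "VP", "S"

def forward(left, right):
    """Forward application (>): A/B : f  +  B : g  =>  A : f(g)."""
    cat, sem = left
    if isinstance(cat, tuple) and cat[0] == "/" and cat[2] == right[0]:
        return (cat[1], sem(right[1]))
    return None

def backward(left, right):
    """Backward application (<): B : g  +  A\\B : f  =>  A : f(g)."""
    cat, sem = right
    if isinstance(cat, tuple) and cat[0] == "\\" and cat[2] == left[0]:
        return (cat[1], sem(left[1]))
    return None

# English lexicon from the slides: helped : S\NP/VP/NP, paint : VP/NP.
helped_cat = ("/", ("/", ("\\", S, NP), VP), NP)
helped = (helped_cat, lambda x: lambda f: lambda y: ("helped", f, x, y))
paint = (("/", VP, NP), lambda x: ("paint", x))
we, hans, house = (NP, "we"), (NP, "Hans"), (NP, "house")

# we helped Hans paint the house
helped_hans = forward(helped, hans)      # S\NP/VP : λf.λy.helped′ f Hans′ y
paint_house = forward(paint, house)      # VP : paint′ house′
vp = forward(helped_hans, paint_house)   # S\NP
sentence = backward(we, vp)              # S
print(sentence)  # ('S', ('helped', ('paint', 'house'), 'Hans', 'we'))
```

The final tuple mirrors the slide's logical form helped′ (paint′ house′) Hans′ we′.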
CG is context-free (Bar-Hillel et al., 1964)
NP → we
S → NP helped NP VP
NP → Hans
VP → paint NP
NP → house
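These productions are ordinary context-free rules, so a few lines of Python can enumerate the finite language they generate. A sketch of my own, where the terminal "house" stands in for the slides' NP "the house":

```python
# Enumerate every string derivable from the slide's five CFG productions.
from itertools import product

RULES = {
    "S":  [["NP", "helped", "NP", "VP"]],
    "NP": [["we"], ["Hans"], ["house"]],
    "VP": [["paint", "NP"]],
}

def expand(symbol):
    """Return all terminal sequences derivable from `symbol`."""
    if symbol not in RULES:                       # terminal: yields itself
        return [[symbol]]
    results = []
    for rhs in RULES[symbol]:
        for parts in product(*(expand(s) for s in rhs)):
            results.append([w for part in parts for w in part])
    return results

sentences = {" ".join(ws) for ws in expand("S")}
print(len(sentences))                                # 3 * 3 * 3 = 27 strings
print("we helped Hans paint house" in sentences)     # True
```

Three NP choices in each of three slots give a 27-string language containing the example sentence.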
it is also a projective dependency grammar (Hays, 1964; Gaifman, 1965)
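The dependency reading can be checked mechanically. Taking the functor of each application as the head, "helped" governs "we", "Hans", and "paint", and "paint" governs "house"; the arc sets and the no-crossing test below are my own illustration, not from the slides.

```python
# Check projectivity (no crossing arcs) of the dependency reading.
words = ["we", "helped", "Hans", "paint", "house"]
head = {0: 1, 2: 1, 4: 3, 3: 1}   # dependent index -> head index; 1 is root

def projective(head):
    """True iff no two dependency arcs cross (strictly interleaved spans)."""
    arcs = [(min(h, d), max(h, d)) for d, h in head.items()]
    for (a, b) in arcs:
        for (c, d) in arcs:
            if a < c < b < d:      # spans interleave => arcs cross
                return False
    return True

print(projective(head))  # True: English order is projective

# Swiss German "mer em Hans es huus halfed aastriiche": halfed governs
# mer, em Hans, and aastriiche; aastriiche governs es huus.
swiss_head = {0: 3, 1: 3, 2: 4, 4: 3}
print(projective(swiss_head))  # False: cross-serial dependencies
```

The English order is projective, while the Swiss German cross-serial order is not, which is exactly what pushes the analysis beyond plain CG.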
Combinatory categorial grammar
mer ⊢ NP : we′
em Hans ⊢ NP : Hans′
es huus ⊢ NP : house′
halfed ⊢ S\NP\NP/VP : λf.λx.λy.helped′ f x y
aastriiche ⊢ VP\NP : λx.paint′ x
Derivation of "mer em Hans es huus halfed aastriiche":

halfed aastriiche ⊢ S\NP\NP\NP : λz.λx.λy.helped′ (paint′ z) x y                 (>B×)
es huus halfed aastriiche ⊢ S\NP\NP : λx.λy.helped′ (paint′ house′) x y          (<)
em Hans es huus halfed aastriiche ⊢ S\NP : λy.helped′ (paint′ house′) Hans′ y    (<)
mer em Hans es huus halfed aastriiche ⊢ S : helped′ (paint′ house′) Hans′ we′    (<)
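Extending the application sketch with crossed composition reproduces the Swiss German derivation end to end. This is my own illustration: `compose_bx` implements X/Y : f + Y\Z : g ⇒ X\Z : λz.f(g z), and the resulting logical form comes out identical to the English one.

```python
# CCG crossed composition for the Swiss German verb cluster, a sketch.
NP, VP, S = "NP", "VP", "S"

def backward(left, right):
    """Backward application (<): B : g  +  A\\B : f  =>  A : f(g)."""
    cat, sem = right
    if isinstance(cat, tuple) and cat[0] == "\\" and cat[2] == left[0]:
        return (cat[1], sem(left[1]))
    return None

def compose_bx(left, right):
    """Crossed composition (>Bx): X/Y : f  +  Y\\Z : g  =>  X\\Z : λz.f(g z)."""
    (lcat, f), (rcat, g) = left, right
    if (isinstance(lcat, tuple) and lcat[0] == "/" and
            isinstance(rcat, tuple) and rcat[0] == "\\" and lcat[2] == rcat[1]):
        return (("\\", lcat[1], rcat[2]), lambda z: f(g(z)))
    return None

# Swiss German lexicon from the slides.
halfed_cat = ("/", ("\\", ("\\", S, NP), NP), VP)      # S\NP\NP/VP
halfed = (halfed_cat, lambda fv: lambda x: lambda y: ("helped", fv, x, y))
aastriiche = (("\\", VP, NP), lambda x: ("paint", x))  # VP\NP
mer, em_hans, es_huus = (NP, "we"), (NP, "Hans"), (NP, "house")

cluster = compose_bx(halfed, aastriiche)   # S\NP\NP\NP (the verb cluster)
step1 = backward(es_huus, cluster)         # S\NP\NP
step2 = backward(em_hans, step1)           # S\NP
sentence = backward(mer, step2)            # S
print(sentence)  # ('S', ('helped', ('paint', 'house'), 'Hans', 'we'))
```

The same logical form as the English derivation falls out, which is the semantics-preservation point the slides are making.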
�: > >: S\NP/VP : λf.λy.helped
�fHans
�y VP : paint
�house
�>
NP : we� S\NP : λy.helped
�(paint�house
�)Hans�y
<S : helped
�(paint�house
�)Hans�we
�
NP : we� S\NP/NP/NP : λx.λf.λy.helped
�fxy NP : Hans
� VP/NP : λx.paint�x NP : house
�
Combinatory categorial grammar
mer � NP : we�
em Hans � NP : Hans�
es huus � NP : house�
halfed � S\NP\NP/VP : λf.λx.λy.helped�fxy
aastriiche � VP\NP : λx.paint�x
Forward composition (degree n):

  A/B : f    B|1C1 ... |nCn : λxn. ... λx1. g(x1, ..., xn)   ⇒   A|1C1 ... |nCn : λxn. ... λx1. f(g(x1, ..., xn))
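The degree-1 instance of this schema is what combines halfed with aastriiche in the Swiss German derivation: because the secondary category's slash is preserved, composing S\NP\NP/VP with VP\NP is crossed composition (>B×). A sketch of that one step, in my own tuple encoding of categories (not from the slides):

```python
# Sketch of forward composition, degree 1, over (result, slash, arg) category
# tuples. The secondary slash carries over, so with '\\' this is crossed >Bx.

def fcompose(fn, g):
    """X/Y : f    Y|Z : g   =>   X|Z : lambda z. f(g(z))."""
    (res, slash, want), f = fn
    (gres, gslash, gwant), gsem = g
    assert slash == '/' and want == gres
    return ((res, gslash, gwant), lambda z: f(gsem(z)))

# Swiss German lexicon from the slide.
halfed = (((('S', '\\', 'NP'), '\\', 'NP'), '/', 'VP'),
          lambda f: lambda x: lambda y: ('helped', f, x, y))
aastriiche = (('VP', '\\', 'NP'), lambda x: ('paint', x))

cat, sem = fcompose(halfed, aastriiche)
print(cat)                         # S\NP\NP\NP, encoded as nested tuples
print(sem('house')('Hans')('we'))  # ('helped', ('paint', 'house'), 'Hans', 'we')
```

Feeding in house′, Hans′, we′ yields the same logical form as the English derivation, which is the point of the example.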
CCG is mildly non-projective dependency grammar (Kuhlmann, 2013)
CCG is not context-free
{we, helped, Hans, paint, the house}
{NP, S, VP}

a ⊢ A
b ⊢ S\A/C
b ⊢ S\A/C/S
c ⊢ C

a  a  a  b        b        b      c  c  c
A  A  A  S\A/C/S  S\A/C/S  S\A/C  C  C  C

  b b (2nd, 3rd)   >B   S\A/C\A/C
  b + that (1st)   >B   S\A/C\A/C\A/C
  + c              >    S\A/C\A/C\A
  a +              <    S\A/C\A/C
  + c              >    S\A/C\A
  a +              <    S\A/C
  + c              >    S\A
  a +              <    S
mer ⊢ NP : we′                                   ↔   we ⊢ NP : we′
em Hans ⊢ NP : Hans′                             ↔   Hans ⊢ NP : Hans′
es huus ⊢ NP : house′                            ↔   the house ⊢ NP : house′
halfed ⊢ S\NP\NP/VP : λf.λx.λy.helped′ f x y     ↔   helped ⊢ S\NP/VP/NP : λx.λf.λy.helped′ f x y
aastriiche ⊢ VP\NP : λx.paint′ x                 ↔   paint ⊢ VP/NP : λx.paint′ x
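Since the paired entries share their logical-form constants, a toy generator can read the shared logical form back out as English. This sketch is my own and not from the slides; the `linearize` helper is hypothetical, with the English word order hard-coded per operator (in a real system it would follow from the English categories).

```python
# Toy sketch: generate English from the shared logical form of the paired
# lexicon. Entity constants are mapped through a small surface table.
en_surface = {'we': 'we', 'Hans': 'Hans', 'house': 'the house'}

def linearize(lf):
    if isinstance(lf, str):        # an entity constant
        return en_surface[lf]
    if lf[0] == 'helped':          # helped' f x y  ->  "y helped x f"
        _, f, x, y = lf
        return f'{linearize(y)} helped {linearize(x)} {linearize(f)}'
    if lf[0] == 'paint':           # paint' z  ->  "paint z"
        return f'paint {linearize(lf[1])}'
    raise ValueError(lf)

# The logical form produced by either the Swiss German or English derivation.
lf = ('helped', ('paint', 'house'), 'Hans', 'we')
print(linearize(lf))   # we helped Hans paint the house
```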
a*b*c*    {aⁿbⁿcⁿ}

[Derivation residue: a second example over the lexical categories S/B/S, S/S/B/S, S/B, and B, assigned to the string a a a b b a a b b b and reducing to S by repeated forward rules; the column layout is not fully recoverable.]
4 Mildly Context-Sensitive Languages

Combinatory categorial grammar can generate mildly context-sensitive languages. It is sufficient to include composition rules. Here's an example that generates the context-sensitive language {aⁿbⁿcⁿ : n ≥ 1} using cross-composition:

a := A
b := S\A/C
b := S\A/C/S
c := C

a  a  a  b        b        b      c  c  c
A  A  A  S\A/C/S  S\A/C/S  S\A/C  C  C  C

  b b (1st, 2nd)   >B   S\A/C\A/C/S
  + b (3rd)        >B   S\A/C\A/C\A/C
  + c              >    S\A/C\A/C\A
  a +              <    S\A/C\A/C
  + c              >    S\A/C\A
  a +              <    S\A/C
  + c              >    S\A
  a +              <    S

The idea here is that the category S\A/C/S can wrap arguments around an adjacent secondary category with target S, simulating adjunction onto a node dominating the foot node in a tree-adjoining grammar. I haven't figured out if there's a way to do this without cross-composition (the dependencies come out slightly differently than in the TAG example). There's a construction from TAG in David Weir's thesis.

Note that if the rule schema is restricted to a finite number of classes (i.e. if there is a fixed, finite rule grammar as in some practical implementations) then the CCG is context-free (Fowler & Penn).
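The construction can be spot-checked mechanically. The following CKY-style recognizer is my own sketch (not from the notes): it uses only forward/backward application plus generalized forward composition over a tuple encoding of categories, with the lexicon above, and confirms that a few aⁿbⁿcⁿ strings are derivable while near-misses are not.

```python
# Tiny CKY recognizer for the {a^n b^n c^n} lexicon. Atomic categories are
# strings; complex ones are (result, slash, arg) tuples. My own sketch.
SAC  = (('S', '\\', 'A'), '/', 'C')    # b := S\A/C
SACS = (SAC, '/', 'S')                 # b := S\A/C/S
LEX  = {'a': ['A'], 'b': [SAC, SACS], 'c': ['C']}

def combine(x, y):
    out = set()
    if isinstance(x, tuple) and x[1] == '/' and x[2] == y:
        out.add(x[0])                  # forward application
    if isinstance(y, tuple) and y[1] == '\\' and y[2] == x:
        out.add(y[0])                  # backward application
    if isinstance(x, tuple) and x[1] == '/':
        res, _, target = x             # generalized forward composition:
        peeled, cur = [], y            # X/Y  Y|1C1..|nCn  ->  X|1C1..|nCn
        while isinstance(cur, tuple):
            peeled.append((cur[1], cur[2]))
            cur = cur[0]
            if cur == target:
                new = res
                for slash, arg in reversed(peeled):
                    new = (new, slash, arg)
                out.add(new)
                break
    return out

def derivable(s):
    w = list(s)
    n = len(w)
    chart = {(i, i + 1): set(LEX[w[i]]) for i in range(n)}
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            k = i + span
            cell = set()
            for j in range(i + 1, k):
                for x in chart[(i, j)]:
                    for y in chart[(j, k)]:
                        cell |= combine(x, y)
            chart[(i, k)] = cell
    return 'S' in chart[(0, n)]

print(derivable('abc'), derivable('aabbcc'), derivable('aaabbbccc'))  # True True True
print(derivable('aabcc'), derivable('aabbc'))                         # False False
```

Without the composition branch of `combine`, only n = 1 goes through, which is exactly why composition is what pushes CCG past context-free.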
intersected with a*b*c* = {aⁿbⁿcⁿ}
CCG parsing is fast and accurate

[Plot: parsing speed (sentences/second, 0 to 70) vs. accuracy (87 to 90), with arrows marked "Faster" and "Better Accuracy". Points: Clark & Curran 2007; belief propagation or dual decomposition; softmax-margin training (Auli & Lopez 2011, ACL & EMNLP).]
SCFGs with categorial labels
• CCG labels shoehorned into a context-free framework.
• Trades accuracy for speed and compactness, compared to a popular SCFG model.
Weese, Callison-Burch, & Lopez 2012
[Diagram, shown incrementally: the hierarchy regular, context-free, context-sensitive, indexed, with combinatory categorial sitting between context-free and indexed. String-relation devices annotated per level: fst: L1 × L2; L3 × L4; pa: combinatory categorial; ???: L5 × L6.]
CCG and graph grammars

[Slide shows a page from Clark et al. 2002:]

Figure 1: Relative clause derivation [CCG derivation of "The company that Marks wants to buy" with head co-indexing on the categories; derivation layout not recoverable.]

... with co-indexing of heads, mediates transmission of the head of the NP the company onto the object of buy. The corresponding dependencies are given in the following figure, with the convention that arcs point away from arguments. The relevant argument slot in the functor category labels the arcs.

[Dependency graph over "The company that Marks wants to buy", arcs labeled with argument-slot numbers 1 and 2.]

Note that we encode the subject argument of the to category as a dependency relation (Marks is a "subject" of to), since our philosophy at this stage is to encode every argument as a dependency, where possible. The number of dependency types may be reduced in future work.

3 The Probability Model

The DAG-like nature of the dependency structures makes it difficult to apply generative modelling techniques (Abney, 1997; Johnson et al., 1999), so we have defined a conditional model, similar to the model of Collins (1996) (see also the conditional model in Eisner (1996b)). While the model of Collins (1996) is technically unsound (Collins, 1999), our aim at this stage is to demonstrate that accurate, efficient wide-coverage parsing is possible with CCG, even with an over-simplified statistical model. Future work will look at alternative models.⁴

⁴ The reentrancies creating the DAG-like structures are fairly limited, and moreover determined by the lexical categories. We conjecture that it is possible to define a generative model that includes the deep dependencies.

The parse selection component must choose the most probable dependency structure, given the sentence S. A sentence S = ⟨(w_1, t_1), (w_2, t_2), ..., (w_n, t_n)⟩ is assumed to be a sequence of word, pos-tag pairs. For our purposes, a dependency structure π is a ⟨C, D⟩ pair, where C = ⟨c_1, c_2, ..., c_n⟩ is the sequence of categories assigned to the words, and D = {⟨hf_i, f_i, s_i, ha_i⟩ | i = 1, ..., m} is the set of dependencies. The probability of a dependency structure can be written as follows:

(7) P(π|S) = P(C, D|S) = P(C|S) P(D|C, S)

The probability P(C|S) can be approximated as follows:

(8) P(C|S) = ∏_{i=1}^{n} P(c_i|X_i)

where X_i is the local context for the ith word. We have explained elsewhere (Clark, 2002) how suitable features can be defined in terms of the word, pos-tag pairs in the context, and how maximum entropy techniques can be used to estimate the probabilities, following Ratnaparkhi (1996). We assume that each argument slot in the category sequence is filled independently, and write P(D|C, S) as follows:

(9) P(D|C, S) = ∏_{i=1}^{m} P(ha_i|C, S)

where ha_i is the head word filling the argument slot of the ith dependency, and m is the number of dependencies entailed by the category sequence C.

3.1 Estimating the dependency probabilities
The estimation method is based on Collins (1996). We assume that the probability of a dependency only depends on those words involved in the dependency, together with their categories. We follow Collins and base the estimate of a dependency probability on the following intuition: given a pair of words, with a pair of categories, which are in the same sen...
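The factored model in (7), (8), and (9) can be made concrete with a toy computation. Everything below (the category assignments, the simplified dependency triples, and all probabilities) is invented for illustration and is not from the paper; in particular, the context X_i is collapsed to the word itself, and dependencies are simplified from the paper's 4-tuples to (functor head, slot, argument head).

```python
# Toy sketch of the conditional model: P(pi|S) = P(C|S) * P(D|C,S),
# each factor a product of independently scored terms. Numbers invented.

def p_categories(words, cats, p_cat):
    """Eq. (8): P(C|S) ~ prod_i P(c_i | X_i), with X_i collapsed to w_i."""
    p = 1.0
    for w, c in zip(words, cats):
        p *= p_cat[(w, c)]
    return p

def p_dependencies(deps, p_dep):
    """Eq. (9): P(D|C,S) = prod_i P(ha_i | C, S), one term per dependency."""
    p = 1.0
    for d in deps:
        p *= p_dep[d]
    return p

words = ['Marks', 'wants', 'to', 'buy']
cats  = ['NP', 'S\\NP/VP', 'VP/VP', 'VP']          # invented assignments
p_cat = {('Marks', 'NP'): 0.9, ('wants', 'S\\NP/VP'): 0.6,
         ('to', 'VP/VP'): 0.8, ('buy', 'VP'): 0.5}

# Dependencies simplified to (functor head, slot, argument head) triples.
deps  = [('wants', 1, 'Marks'), ('wants', 2, 'buy'), ('to', 1, 'Marks')]
p_dep = {deps[0]: 0.7, deps[1]: 0.6, deps[2]: 0.4}

p_pi = p_categories(words, cats, p_cat) * p_dependencies(deps, p_dep)  # Eq. (7)
print(round(p_pi, 6))   # 0.036288
```

The point of the factorization is visible in the code: the category product and the dependency product never interact except through the final multiplication.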
If bound variables appear more than once (Clark et al. 2002) ...
...Result is a dependency graph:
CCG and graph grammars
The company that Marks wants to buyNPx Nx,1 Nc NPx NPx,1 S2 NPx NPm Sw NPx,1 S2 NPx Sy NPx,1 Sy,2 NPx Sb NP1 NP2
NPc Sx Sx NPm Sb NP NPSw NP NP
Sw NPNPx NPx
NPc
Figure 1: Relative clause derivation
with co-indexing of heads, mediate transmission ofthe head of the NP the company onto the object ofbuy. The corresponding dependencies are given inthe following figure, with the convention that arcspoint away from arguments. The relevant argumentslot in the functor category labels the arcs.
1
22 2
2
111 1
The towantscompany Marksthat buy
Note that we encode the subject argument of theto category as a dependency relation (Marks is a“subject” of to), since our philosophy at this stageis to encode every argument as a dependency, wherepossible. The number of dependency types may bereduced in future work.
3 The Probability Model
The DAG-like nature of the dependency structuresmakes it difficult to apply generative modelling tech-niques (Abney, 1997; Johnson et al., 1999), sowe have defined a conditional model, similar tothe model of Collins (1996) (see also the condi-tional model in Eisner (1996b)). While the modelof Collins (1996) is technically unsound (Collins,1999), our aim at this stage is to demonstrate thataccurate, efficient wide-coverage parsing is possiblewith CCG, even with an over-simplified statisticalmodel. Future work will look at alternative models.4
4The reentrancies creating the DAG-like structures are fairlylimited, and moreover determined by the lexical categories. Weconjecture that it is possible to define a generative model thatincludes the deep dependencies.
The parse selection component must choose themost probable dependency structure, given the sen-tence S. A sentence S w1 t1 w2 t2 wn tnis assumed to be a sequence of word, pos-tagpairs. For our purposes, a dependency structure !is a C D pair, where C c1 c2 cn is the se-quence of categories assigned to the words, andD hfi fi si hai i 1 m is the set of de-pendencies. The probability of a dependency struc-ture can be written as follows:
(7) P ! P C D S P C S P D C S
The probability P C S can be approximated asfollows:
(8) P C S "ni 1P ci Xi
where Xi is the local context for the ith word. Wehave explained elsewhere (Clark, 2002) how suit-able features can be defined in terms of the word,pos-tag pairs in the context, and how maximum en-tropy techniques can be used to estimate the proba-bilities, following Ratnaparkhi (1996).We assume that each argument slot in the cat-
egory sequence is filled independently, and writeP D C S as follows:
(9) P D C S "mi 1P hai C S
where hai is the head word filling the argument slotof the ith dependency, and m is the number of de-pendencies entailed by the category sequenceC.
3.1 Estimating the dependency probabilitiesThe estimation method is based on Collins (1996).We assume that the probability of a dependency onlydepends on those words involved in the dependency,together with their categories. We follow Collinsand base the estimate of a dependency probabilityon the following intuition: given a pair of words,with a pair of categories, which are in the same sen-
If bound variables appear more than once (Clark et al. 2002) ...
The company that Marks wants to buyNPx Nx,1 Nc NPx NPx,1 S2 NPx NPm Sw NPx,1 S2 NPx Sy NPx,1 Sy,2 NPx Sb NP1 NP2
NPc Sx Sx NPm Sb NP NPSw NP NP
Sw NPNPx NPx
NPc
Figure 1: Relative clause derivation
with co-indexing of heads, mediate transmission ofthe head of the NP the company onto the object ofbuy. The corresponding dependencies are given inthe following figure, with the convention that arcspoint away from arguments. The relevant argumentslot in the functor category labels the arcs.
1
22 2
2
111 1
The towantscompany Marksthat buy
Note that we encode the subject argument of theto category as a dependency relation (Marks is a“subject” of to), since our philosophy at this stageis to encode every argument as a dependency, wherepossible. The number of dependency types may bereduced in future work.
3 The Probability Model
The DAG-like nature of the dependency structuresmakes it difficult to apply generative modelling tech-niques (Abney, 1997; Johnson et al., 1999), sowe have defined a conditional model, similar tothe model of Collins (1996) (see also the condi-tional model in Eisner (1996b)). While the modelof Collins (1996) is technically unsound (Collins,1999), our aim at this stage is to demonstrate thataccurate, efficient wide-coverage parsing is possiblewith CCG, even with an over-simplified statisticalmodel. Future work will look at alternative models.4
⁴ The reentrancies creating the DAG-like structures are fairly limited, and moreover determined by the lexical categories. We conjecture that it is possible to define a generative model that includes the deep dependencies.
The parse selection component must choose the most probable dependency structure, given the sentence S. A sentence S = ⟨(w_1, t_1), (w_2, t_2), ..., (w_n, t_n)⟩ is assumed to be a sequence of word, pos-tag pairs. For our purposes, a dependency structure π is a ⟨C, D⟩ pair, where C = ⟨c_1, c_2, ..., c_n⟩ is the sequence of categories assigned to the words, and D = {⟨hf_i, f_i, s_i, ha_i⟩ | 1 ≤ i ≤ m} is the set of dependencies. The probability of a dependency structure can be written as follows:
(7) P(π | S) = P(C, D | S) = P(C | S) P(D | C, S)
The probability P(C | S) can be approximated as follows:
(8) P(C | S) ≈ ∏_{i=1}^{n} P(c_i | X_i)
where X_i is the local context for the ith word. We have explained elsewhere (Clark, 2002) how suitable features can be defined in terms of the word, pos-tag pairs in the context, and how maximum entropy techniques can be used to estimate the probabilities, following Ratnaparkhi (1996). We assume that each argument slot in the category sequence is filled independently, and write P(D | C, S) as follows:
(9) P(D | C, S) = ∏_{i=1}^{m} P(ha_i | C, S)
where ha_i is the head word filling the argument slot of the ith dependency, and m is the number of dependencies entailed by the category sequence C.
3.1 Estimating the dependency probabilities
The estimation method is based on Collins (1996). We assume that the probability of a dependency only depends on those words involved in the dependency, together with their categories. We follow Collins and base the estimate of a dependency probability on the following intuition: given a pair of words, with a pair of categories, which are in the same sentence ... [excerpt truncated]
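Equations (7)-(9) factor the probability of a dependency structure into independent per-category and per-slot terms, so scoring a candidate structure reduces to a sum of log probabilities. A minimal sketch of that factorization (the probability tables, contexts, and names here are hypothetical, not the paper's implementation):

```python
import math

# Hypothetical probability tables, as would be estimated from a corpus:
# p_cat[(category, context)] stands in for P(c_i | X_i) in equation (8);
# p_dep[(head_word, category, slot)] stands in for P(ha_i | C, S) in (9).
p_cat = {("NP", "company|NN"): 0.9, ("(S\\NP)/NP", "buy|VB"): 0.8}
p_dep = {("company", "(S\\NP)/NP", 2): 0.3}

def log_prob_structure(categories, dependencies, eps=1e-9):
    """log P(pi | S) = log P(C | S) + log P(D | C, S), assuming each
    category and each argument slot is filled independently."""
    lp = 0.0
    for cat, context in categories:        # product in equation (8)
        lp += math.log(p_cat.get((cat, context), eps))
    for head, cat, slot in dependencies:   # product in equation (9)
        lp += math.log(p_dep.get((head, cat, slot), eps))
    return lp
```

Unseen events fall back to a small floor probability here; the paper instead estimates these probabilities from counts, following Collins (1996).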
...Result is a dependency graph. Same type of graph?
Anna fehlt ihrem Kater
[Meaning graph: MISS, CAT, and ANNA instance nodes, with agent and patient edges from MISS and an owner edge from CAT to ANNA.]
Anna’s cat is missing her
Figure 1: A string to meaning graph to string translation pipeline.
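The meaning graph in Figure 1 can be written down directly as a set of labelled edges. A hypothetical Python encoding of the "Anna fehlt ihrem Kater" graph, assuming the cat fills the agent role of MISS (the one doing the missing) and Anna the patient:

```python
# Nodes are graph variables (m, c, a); "instance" edges name concepts,
# the other edges name semantic roles. The role assignment is illustrative.
edges = [
    ("m", "instance", "MISS"),
    ("c", "instance", "CAT"),
    ("a", "instance", "ANNA"),
    ("m", "agent", "c"),    # the cat does the missing
    ("m", "patient", "a"),  # Anna is the one missed
    ("c", "owner", "a"),    # Anna's cat: this reentrancy on "a" makes
]                           # the structure a graph rather than a tree

def concept(var, edges):
    """Concept that a graph variable is an instance of."""
    return next(lbl for src, role, lbl in edges
                if src == var and role == "instance")
```

The reentrant ANNA node is exactly what distinguishes this representation from a syntax tree, and why graph (rather than tree) grammars are needed below.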
Experimental results demonstrate that our system is capable of learning semantic abstractions, and more specifically, to both analyse text into these abstractions and decode them back into text in multiple languages.
The need to manipulate graph structures adds an additional level of complexity to the standard MT task. While the problems of parsing and rule-extraction are well-studied for strings and trees, there has been considerably less work within the NLP community on the equivalent algorithms for graphs. In this paper, we use hyperedge replacement grammars (HRGs) (Drewes et al., 1997) for the basic machinery of graph manipulation; in particular, we use a synchronous HRG (SHRG) to relate graph and string derivations.
We provide the following contributions:
1. Introduction of string↔graph transduction with HRGs to NLP
2. Efficient algorithms for
• string–graph alignment
• inference of graph grammars from aligned graph/string pairs
3. Empirical results from a working machine translation system, and analysis of that system’s performance on the subproblems of semantic parsing and generation.
We proceed as follows: Section 2 explains the SHRG formalism and shows how it is used to derive graph-structured meaning representations. Section 3 introduces two algorithms for learning SHRG rules automatically from semantically-annotated corpora. Section 4 describes the details of our machine translation system, and explains how a SHRG is used to transform a natural language sentence into a meaning representation and vice-versa. Section 6 discusses related work and Section 7 summarizes the main results of the paper.
2 Synchronous Hyperedge Replacement Grammars
Hyperedge replacement grammars (Drewes et al., 1997) are an intuitive generalization of context free grammars (CFGs) from strings to hypergraphs. Where in CFGs strings are built up by successive rewriting of nonterminal tokens, in hyperedge replacement grammars (HRGs), nonterminals are hyperedges, and rewriting steps replace these nonterminal hyperedges with subgraphs rather than strings.
A hypergraph is a generalization of a graph in which edges may link an arbitrary number of nodes. Formally, a hypergraph over a set of edge labels C is a tuple H = ⟨V, E, l, X⟩, where V is a finite set of nodes, E is a finite set of edges, where each edge is a subset of V, and l : E → C is a labeling function. |e| ∈ ℕ denotes the type of a hyperedge e ∈ E (the number of nodes connected by the edge). For the directed hypergraphs we are concerned with, each edge contains a distinguished source node and one or more target nodes.
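The tuple H = ⟨V, E, l, X⟩ translates directly into a small data structure. A sketch under the definition above (class and field names are my own, not from the paper); each directed hyperedge keeps a distinguished source and ordered targets, and its type |e| is the number of nodes it connects:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Hyperedge:
    source: str      # distinguished source node
    targets: tuple   # one or more target nodes
    label: str       # l(e), drawn from the edge-label set C

    @property
    def type(self) -> int:
        # |e|: the number of nodes the edge connects
        return 1 + len(self.targets)

@dataclass
class Hypergraph:
    nodes: set             # V
    edges: list            # E, a list of Hyperedge values
    external: tuple = ()   # X: external nodes, used when this graph
                           # replaces a nonterminal edge in an HRG step

# An ordinary directed edge is just the type-2 special case:
e = Hyperedge("m", ("c",), "agent")
g = Hypergraph({"m", "c"}, [e])
```

The `external` tuple is what an HRG rewriting step uses to fuse a replacement subgraph onto the nodes of the nonterminal hyperedge being replaced.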
Other formal models
• Lexical functional grammar
• Minimal recursion semantics
• (Synchronous) tree-adjoining grammar
• How (dis)similar are these?
Bottom line
• Semantics is a hot topic in NLP right now.
• Why use semantics for MT? Pretty clear.
• How do we use semantics for MT? Less clear.
• Will it actually be useful for MT? Totally unclear.
We are building a program, called Aristo, that seeks to understand science at the level of a fourth-grader and prove it by taking a standardized science test (that it hasn't seen before) and acing it. That problem forces us to study fundamental problems in AI in understanding language, reasoning, and much more.
Final project poster session
• In this room, 2-5pm on May 9
• Project reports due at poster session
• Email to Yuan