
Modeling and theory: Insights from the Syntagmatic-Paradigmatic Learner

Barend Beekhuizen

Leiden University & University of Amsterdam

22 September 2015


1 Overview of SPL: Representations, Processing, Learning

2 Main findings: Modeling issues, theoretical puzzles; Comprehension; Production

3 Competence and performance: Gaps in the theory; Comprehension; Representation; Production

4 Wrap-up



Overview

The general problem

Mapping form to meaning

Acquisition:

Arriving at adult state

Explaining developmental waypoints

Starting point: Usage-Based framework

Computational cognitive model as method


Overview

The Syntagmatic-Paradigmatic Learner

Flow of the model:

1 Model receives input item: pair of an utterance and a number of situations

2 Model tries to analyze using processing mechanisms and existing representations

3 Model updates grammar using learning mechanisms and best analysis

4 goto 1
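In rough Python pseudocode (not from the thesis; `analyze` and `update_grammar` are hypothetical placeholders standing in for the processing and learning mechanisms described on the following slides), the flow is:

```python
from dataclasses import dataclass, field

@dataclass
class InputItem:
    utterance: list      # e.g. ["ball", "go", "there"]
    situations: list     # a number of candidate situation representations

@dataclass
class Grammar:
    constructions: list = field(default_factory=list)

def analyze(item, grammar):
    """Step 2 (placeholder): analyze the utterance with the known
    constructions and the processing mechanisms; return the best analysis."""
    return {"item": item, "used": []}

def update_grammar(grammar, analysis):
    """Step 3 (placeholder): let the best analysis leave a trace in memory
    via the learning mechanisms."""
    grammar.constructions.extend(analysis["used"])

def run_spl(input_stream):
    grammar = Grammar()
    for item in input_stream:                # step 1: receive an input item
        analysis = analyze(item, grammar)    # step 2: analyze it
        update_grammar(grammar, analysis)    # step 3: learn from the best analysis
    return grammar                           # "goto 1" is the loop over the stream
```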


Overview Representations

Representations

Constructions: pairings of signifiers and signifieds, both for ‘grammar’ and ‘lexicon’
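As an illustration only (the slides give no implementation), such a construction could be represented as a form sequence paired with a semantic frame and a usage count; the field names below are assumptions:

```python
from dataclasses import dataclass

@dataclass
class Construction:
    """A pairing of a signifier (form) and a signified (meaning)."""
    form: tuple      # e.g. ("ball",) or ("", "go"); "" marks an open slot (ε)
    meaning: dict    # e.g. {"sd": {"move"}, "sr1": {"agent", "mover", "animate"}, ...}
    count: int = 0   # how often the construction has been encountered/used

# Roughly the two constructions in the figure below:
ball = Construction(("ball",), {"sd": {"object", "entity", "ball"}}, count=4)
x_go = Construction(("", "go"),
                    {"sd": {"move"},
                     "sr1": {"agent", "mover", "animate"},
                     "sr2": {"location", "goal", "surface"}},
                    count=12)
```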


Overview Representations

Figure: Constructions. Two examples: a lexical construction pairing the form 'ball' with the meaning {object, entity, ball} (count = 4), and a grammatical construction pairing the form 'ε go' with a {move} event whose semantic roles are an {agent, mover} restricted to {animate} and a {location, goal} restricted to {surface} (count = 12).


1 [ ball / ball ]

2 [ [ animate ] [ move / go ] ] | move(agent(animate), location(surface))


Overview Processing

Representations

Constructions: pairings of signifiers and signifieds, both for ‘grammar’ and ‘lexicon’

Processing

An utterance in a situational context is analyzed using the set of known constructions and processing mechanisms


Overview Processing

Figure: Combine. Two partial analyses of an utterance containing 'Adam', an empty element ε, 'put', and 'it' are combined: one carries a {move} frame with {agent, mover} and {location, goal} roles, the other a {move} frame with {patient, moved} and {location, goal} roles. The leftmost open constituent of the combining construction points to the {move} node of the meaning, which is matched against the observed situation O.


Overview Processing

Figure: Concatenate, bootstrap, ignore. Schematic of the three operations over words (w1, w2, w3, w4), constituents (c1, c2), and a meaning: concatenating units under a shared meaning, bootstrapping a constituent from the meaning, and ignoring a word.


Overview Processing


Processing

An utterance in a situational context is analyzed using the set of known constructions and processing mechanisms. Often many analyses are possible, so find the best one:

Most frequently encountered constructions

With fewest concatenate, bootstrap, and ignore operations.
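The slides do not give the scoring formula; one hedged way to combine the two criteria (high construction frequency, few concatenate/bootstrap/ignore operations) might look like this, where the simple subtraction is an assumption rather than SPL's actual measure:

```python
def analysis_score(construction_counts, operations):
    """Score a candidate analysis: higher usage counts of the constructions
    involved are better; each concatenate, bootstrap, or ignore operation
    counts against it. The relative weighting is an assumption."""
    frequency = sum(construction_counts)
    repairs = sum(op in ("concatenate", "bootstrap", "ignore") for op in operations)
    return frequency - repairs

def best_analysis(candidates):
    """Pick the best among candidate analyses, each given as a pair
    (construction_counts, operations)."""
    return max(candidates, key=lambda cand: analysis_score(*cand))

# e.g. best_analysis([([4, 12], ["combine"]),
#                     ([4], ["bootstrap", "ignore"])]) returns the first candidate
```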


Overview Learning


Learning

Best analysis leaves trace in memory: 5 learning mechanisms.


Overview Learning

Figure: Adding most concrete constructions (mcc). From the analysis of 'ball go there', fully concrete constructions are added to the grammar: 'ball' paired with {entity, object, ball}, and 'go' paired with an {event, move} frame with {role, patient, moved} and {role, location, goal} roles.


Overview Learning

Figure: Updating the most concrete used constructions (mcuc). The counts of the most concrete constructions used in the analysis of 'ball go there' are updated: here a pattern pairing 'ε go' (with an open {entity} slot) with the {event, move} frame, alongside the lexical constructions 'ball' and 'go'.


Overview Learning

Figure: Syntagmatization (csyn). The constructions 'ball' and 'go' used side by side in the analysis of 'ball go there' are joined into a single syntagmatic construction covering both, pairing the sequence with {entity, object, ball} and the {event, move} frame and its {role, patient, moved} and {role, location, goal} roles.


Overview Learning

Figure: Paradigmatization (cpara). Two constructions c and c' that share material ('ball go', whose meaning is {event, move} with a {role, patient, moved} argument, and 'ball fall', whose meaning is {event, fall} with a {role, patient, move} argument) are generalized into a more abstract construction 'ball ε' whose meaning retains only the shared features: {event} with a {role, patient} argument.
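A minimal sketch of the idea behind paradigmatization (not SPL's actual algorithm), reusing the `Construction` class sketched earlier: keep the shared form material, open a slot where the forms differ, and intersect the semantic features at each position.

```python
from dataclasses import dataclass

@dataclass
class Construction:   # as in the earlier sketch
    form: tuple       # "" marks an open slot (ε)
    meaning: dict     # position label -> set of semantic features
    count: int = 0

def paradigmatize(c, c_prime):
    """Illustration only: generalize two aligned constructions by keeping
    shared form elements (opening a slot where they differ) and intersecting
    the semantic features of corresponding meaning positions."""
    form = tuple(a if a == b else "" for a, b in zip(c.form, c_prime.form))
    meaning = {key: c.meaning.get(key, set()) & c_prime.meaning.get(key, set())
               for key in set(c.meaning) | set(c_prime.meaning)}
    return Construction(form, meaning)

# With the figure's c ('ball go', event {event, move}) and c' ('ball fall',
# event {event, fall}), this yields 'ball ε' with event meaning {event} and a
# {role, patient} argument, as in the figure (the counts here are made up):
c = Construction(("ball", "go"),
                 {"event": {"event", "move"}, "arg": {"role", "patient", "moved"}}, 5)
c_prime = Construction(("ball", "fall"),
                       {"event": {"event", "fall"}, "arg": {"role", "patient", "move"}}, 3)
abstracted = paradigmatize(c, c_prime)
```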


Overview Learning

Figure: Cross-situational learning (cxsl). Two utterance–situation pairs are compared: the current pair, 'ball go there' with an {event, move} situation (a {role, patient, moved} ball and a {role, location, goal} box), and the previous pair, 'cup lies there' with an {event, state} situation (a {role, theme} cup and a {role, location, goal} floor). The recurring word 'there' is paired with the meaning shared across the situations, {role, location, goal}, yielding a new construction.



Main findings Modeling issues, theoretical puzzles

SPL resolves some a priori issues (chapter 2)

Comprehensiveness: comprehension and production

Simultaneity: lexical and grammatical constructions

Reappraisal of the starting-small approach

Learning as by-product of processing (immanence)

Reappraisal of the competence-performance distinction



Main findings Comprehension

Comprehension (chapters 5, 6)

Robustness: making sense of an utterance despite knowing little (using concatenation, bootstrapping)

Increasing coverage of utterance and situation

Increasing accuracy of picking out the situation from 6 candidates

Varying mechanisms: XSL precedes bootstrapping; bootstrapping dominates.
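A sketch of the situation-selection task (the slides describe the task, not its implementation; `analyze` and `score` are hypothetical stand-ins for the model's analysis and scoring machinery):

```python
def pick_situation(utterance, candidate_situations, grammar, analyze, score):
    """Comprehension test: among the candidate situations (six per trial in
    the experiment), return the one whose best analysis of the utterance
    scores highest."""
    return max(candidate_situations,
               key=lambda situation: score(analyze(utterance, situation, grammar)))
```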


Main findings Production

Production (chapter 7)

Experiment: give model situation, ask to produce utterance

Increasing length of produced utterance

Hardly any errors of commission
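A very rough sketch of the production task, under the assumption (not stated on the slides) that the model picks the most frequent construction whose meaning covers the situation and realizes its form, leaving uncoverable material unexpressed, hence omissions rather than commissions:

```python
def produce(situation, constructions, covers):
    """Production test sketch. `constructions` are Construction objects as in
    the earlier sketch; `covers` is a hypothetical predicate saying whether a
    construction's meaning matches (part of) the situation."""
    usable = [c for c in constructions if covers(c.meaning, situation)]
    if not usable:
        return []                                  # nothing known: say nothing
    best = max(usable, key=lambda c: c.count)      # prefer frequent constructions
    return [word for word in best.form if word]    # drop open slots ("")
```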



Competence and performance Gaps in the theory

Linguistic knowledge grounded in language use

So we can reason from the child’s productions to its knowledge of language

However:

The sample may not contain a reflection of the full potential

Other reasons for not producing some linguistic item

Interactivity of the components invalidates this line of reasoning

So: we need to account for linguistic competence and performance within the Usage-Based framework.

And show its explanatory value.

Not unique to SPL: all UB computational models face this. However, the interaction of lexical and grammatical acquisition gives interesting effects.



Competence and performance Comprehension

Comprehension

Early abstraction

Increasing use of more concrete constructions

Does not entail loss of abstraction (to the contrary)

Figure: Abstraction of used length-2 and 3 constructions. Two panels plot, against time (0 to 10,000 input items), the proportion of used constructions by number of abstract slots: 0, 1, or 2 slots for length-2 constructions and 0, 1, 2, or 3 slots for length-3 constructions.


Competence and performance Representation

Reflections of use on representations

Look under the hood to obtain a fuller understanding of representational potential

Abstractions are there, but not used so much


Competence and performance Representation

!"!"!!"#$#!"!%#&$#!#

"!%#&$&'()$#

"!%#&$$#

!"!"!*+,+'$#!"!%#&$#!#

!"!"!$%&'()$#!"!%#&$#!#

Figure: After 100 input items


Competence and performance Representation

!"!"!!"#$#!"!%#&$#!#

"!%#&$&'()$#

"!%#&$$#

!"!"!*+,+'$#!"!%#&$#!#

!"!"!$%&'()$#!"!%#&$#!#

!"!"!-$#!"!%#&$#!"!&'()$#!#

!"!"!!"#$#!"!&+.($#!"!/&$#!#

!"!"!!"#$#!"!%#&$#!"!/&$#!#

!"!"!$%&'()$#!"!%#&$#!"!%)*+*,$#!#

!"!"!!"#$#!"!-./'%00(1%$#!"!/&$#!#

!"!"!$%&'()$#!"!-./'%00(1%$#!"!%)*+*,$#!#

Figure: After 500 input items


Competence and performance Representation

!"!"!!"#$#!"!%#&$#!#

"!%#&$&'()$#

"!%#&$$#

!"!"!*+,+'$#!"!%#&$#!#

!"!"!$%&'()$#!"!%#&$#!#

!"!"!-$#!"!%#&$#!"!&'()$#!#

!"!"!!"#$#!"!&+.($#!"!/&$#!#

!"!"!!"#$#!"!%#&$#!"!/&$#!#

!"!"!$%&'()$#!"!%#&$#!"!%)*+*,$#!#

!"!"!!"#$#!"!-./'%00(1%$#!"!/&$#!#

!"!"!$%&'()$#!"!-./'%00(1%$#!"!%)*+*,$#!#

!"!"!$%&'()$#!"!%#&$#!"!(23%-*$#!"!4(-5&(4%!#!#

!"!"!$%&'()$#!"!%#&$#!"!(23%-*$#!"!/1!#!#!"!"!!"#$#!"!%#&$#!

"!&'()!$#!"!/1!#!#

!"!"!!"#$#!"!%#&$#!"!(23%-*!$#!"!/1!#!#

!"!"!*+,+'$#!"!%#&$#!"!&'()!$#!"!/1!#!#

!"!"!!"#$#!"!%#&$#!"!(23%-*!$#!"!4(-5&(4%!#!#

!"!"!!"#$#!"!%#&$#!"!/&!$#!"!/1!#!"!/&$#!#

!"!"!!"#$#!"!%#&$#!"!%)*+*,$#!"!/1!#!"!%)*+*,!#!#

!"!"!$%&'()$#!"!%#&$#!"!%)*+*,$#

!"!4(-5&(4%!#!"!%)*+*,!#!#

!"!"!$%&'()$#!"!%#&$#!"!(23%-*!$#!"!&'(,(!#!#

!"!"!$%&'()$#!"!-./'%50(1%$#!"!%)*+*,$#

!"!4(-5&(4%!#!"!%)*+*,!#!#

Figure: After 10000 input items


Competence and performance Production

Linguistic competence in production

WYSIWYG?

No:

Lexical items may be known but not produced because the grammatical constructions are not known yet

Interaction: competition between grammatical constructions


Competence and performance Production

The unexpressed expressables

Words that are known but nonetheless not produced, because (1) an erroneous word outcompetes them or (2) there is no grammatical construction to ‘host’ them.
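One way to operationalize these categories per production trial (an illustration of the distinction, not the thesis's actual procedure; `lexicon` is assumed to be the set of words the model knows, and "unexpressable" is taken to mean the word itself is not yet known):

```python
def classify_second_argument(target_word, produced_words, lexicon):
    """Classify one target 'second argument' in a production trial:
    expressed; unexpressed but expressable (the word is known, yet it was
    outcompeted by an erroneous word or has no grammatical construction to
    host it); or unexpressed and unexpressable (the word itself is unknown)."""
    if target_word in produced_words:
        return "expressed"
    if target_word in lexicon:
        return "unexpressed expressable"
    return "unexpressed unexpressable"
```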

Figure: The expression of ‘second arguments’ over time. Counts (0 to 600) over time (0 to 10,000 input items) of second arguments that are expressed, unexpressed but expressable, and unexpressed and unexpressable.


Wrap-up

Wrap-up

Why we need to focus on competence/performance

Corpora hide potential for abstraction

Simultaneity effects hide potential

Lexical knowledge hidden (unexpressed expressables)

Paradoxical blocking effects


Wrap-up

Thank you

