ASSOCIATIVE BROWSING: Evaluating by Simulation
Jin Y. Kim / W. Bruce Croft / David Smith

Uploaded by moshe on 22-Feb-2016


TRANSCRIPT

Page 1: ASSOCIATIVE BROWSING

ASSOCIATIVE BROWSING: Evaluating by Simulation

Jin Y. Kim / W. Bruce Croft / David Smith

Page 2: ASSOCIATIVE BROWSING


*What do you remember about your documents?

[Screenshot: example documents labeled "Registration" and "James"]

Use search if you recall keywords!

Page 3: ASSOCIATIVE BROWSING


*What if keyword search is not enough?


Associative browsing to the rescue!

Page 4: ASSOCIATIVE BROWSING


*Probabilistic User Modeling

• Query generation model
  • Term selection from a target document [Kim&Croft09]

• State transition model
  • Use browsing when the result looks marginally relevant

• Link selection model
  • Click on browsing suggestions based on perceived relevance
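The three model components above can be sketched as small functions. This is a minimal illustration, not the authors' implementation: the function names, the two-term query length, and the rank thresholds (taken from the simulation flow on slide 5) are assumptions.

```python
import random

def generate_query(target_doc_terms, num_terms=2):
    """Query generation model: sample query terms from the target document."""
    return random.sample(target_doc_terms, min(num_terms, len(target_doc_terms)))

def decide_to_browse(target_rank):
    """State transition model: switch to browsing when the result list
    looks only marginally relevant (target ranked between 11 and 50)."""
    return 11 < target_rank < 50

def select_link(suggestions):
    """Link selection model: click the suggestion with the highest
    perceived relevance."""
    return max(suggestions, key=lambda s: s["perceived_relevance"])

query = generate_query(["two", "dollar", "registration", "james"])
print(decide_to_browse(20))   # True: marginally relevant, so browse
print(decide_to_browse(80))   # False: not relevant, so reformulate instead
```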

Page 5: ASSOCIATIVE BROWSING


*Simulating Interaction using Probabilistic User Model

[Flowchart of one simulated session]
Initial query: "James Registration" → Search
• Target doc at top 10: click on a result ("1. Two Dollar Regist…") → End
• Marginally relevant (11 < RankD < 50): switch to browsing toward the target doc
• Not relevant (RankD > 50): reformulated query ("Two Dollar Registration") → Search again
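The session flow above can be written as a short loop. This is a sketch under stated assumptions: the helper callbacks (`run_search`, `reformulate`, `browse`), the step cap, and the outcome labels are all illustrative, not from the paper.

```python
def simulate_session(run_search, reformulate, browse, initial_query, max_steps=10):
    """One simulated known-item session following the slide's flowchart."""
    query = initial_query
    for _ in range(max_steps):
        target_rank = run_search(query)      # rank of the target document
        if target_rank <= 10:                # target in top 10: click and end
            return "found_by_search"
        if 11 < target_rank < 50:            # marginally relevant: try browsing
            if browse():
                return "found_by_browsing"
        query = reformulate(query)           # not relevant: reformulate
    return "abandoned"

# Toy run: the first query misses, the reformulated one surfaces the target.
ranks = iter([60, 3])
result = simulate_session(
    run_search=lambda q: next(ranks),
    reformulate=lambda q: q + " registration",
    browse=lambda: False,
    initial_query="james",
)
print(result)  # found_by_search
```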

Page 6: ASSOCIATIVE BROWSING

*A User Model for Link Selection
• User's browsing behavior [Smucker&Allan06]
  • Fan-out 1~3: the number of clicks per ranked list
  • BFS vs. DFS: the order in which documents are visited
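Fan-out and traversal order can be sketched as a bounded graph walk. The toy link graph, the visit cap, and the function name are assumptions for illustration only.

```python
from collections import deque

def browse(links, start, fan_out, order="BFS", max_visits=6):
    """Visit documents reachable from `start`, clicking at most
    `fan_out` suggestions per ranked list, in BFS or DFS order."""
    frontier = deque([start])
    visited = []
    while frontier and len(visited) < max_visits:
        doc = frontier.popleft() if order == "BFS" else frontier.pop()
        visited.append(doc)
        clicks = links.get(doc, [])[:fan_out]   # top fan_out suggestions
        frontier.extend(clicks)
    return visited

# Toy link graph: each document's ranked list of browsing suggestions.
links = {"A": ["B", "C", "D"], "B": ["E"], "C": ["F"]}
print(browse(links, "A", fan_out=2, order="BFS"))  # ['A', 'B', 'C', 'E', 'F']
print(browse(links, "A", fan_out=2, order="DFS"))  # ['A', 'C', 'F', 'B', 'E']
```

BFS exhausts each ranked list's clicks before going deeper, while DFS follows the most recent click first, so the same fan-out yields different exploration patterns.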

Page 7: ASSOCIATIVE BROWSING

*A User Model for Link Selection
• User's level of knowledge
  • Random: randomly click on a ranked list
  • Informed: more likely to click on a more relevant item
  • Oracle: always click on the most relevant item

• Relevance estimated using the position of the target item
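The three knowledge levels can be sketched as click policies over a ranked list. The weighting scheme for the informed user is an assumption; the paper estimates relevance from the position of the target item.

```python
import random

def choose_click(ranked_list, relevance, level, rng=random):
    """Pick one item to click. `relevance` maps item -> estimated score."""
    if level == "random":
        return rng.choice(ranked_list)            # uniform over the list
    if level == "oracle":
        return max(ranked_list, key=relevance)    # always the best item
    if level == "informed":
        # More likely, but not certain, to pick more relevant items.
        weights = [relevance(item) for item in ranked_list]
        return rng.choices(ranked_list, weights=weights, k=1)[0]
    raise ValueError(level)

rel = {"d1": 0.1, "d2": 0.9, "d3": 0.3}.get
print(choose_click(["d1", "d2", "d3"], rel, "oracle"))  # d2
```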

[Figure: three example ranked lists (positions 1-5) illustrating the random, informed, and oracle click models]

Page 8: ASSOCIATIVE BROWSING

*Evaluation Results
• Simulated interaction was generated using the CS collection
  • 63,260 known-item finding sessions in total

• The value of browsing
  • Browsing was used in 15% of all sessions
  • Browsing saved 42% of sessions when used

• Comparison with user study results
  • Roughly matches in terms of overall usage and success ratio

Evaluation Type | Total  | Browsing Used | Successful
Simulation      | 63,260 | 9,410 (14.8%) | 3,957 (42.0%)
User Study      | 290    | 42 (14.5%)    | 15 (35.7%)
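As a sanity check, the percentages in the table follow directly from the raw counts; for the user-study row:

```python
# Usage = sessions where browsing was used / total sessions;
# success = successful browsing sessions / sessions where browsing was used.
us_total, us_browse, us_success = 290, 42, 15
usage = 100 * us_browse / us_total       # 14.48...
success = 100 * us_success / us_browse   # 35.71...
print(f"{usage:.1f}% {success:.1f}%")    # 14.5% 35.7%
```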

Page 9: ASSOCIATIVE BROWSING

*Evaluation Results
• Success ratio of browsing

[Chart: success ratio (0.30-0.48) by fan-out FO1-FO3 for random, informed, and oracle users; higher fan-out means more exploration]

Page 10: ASSOCIATIVE BROWSING


*Summary: Associative Browsing Model Evaluation by Simulation

• Simulated evaluation showed statistics very similar to the user study in when, and how successfully, associative browsing is used

• Simulated evaluation reveals a subtle interaction between the user's level of knowledge and the degree of exploration

Any questions?
Jin Y. Kim / W. Bruce Croft / David Smith

Page 11: ASSOCIATIVE BROWSING


*Simulation of Known-item Finding using a Memory Model

• Build a model of the user's memory
  • Model how the memory degrades over time

• Generate search and browsing behavior from the model
  • Query-term selection from the memory model
  • Use information scent to guide browsing choices [Pirolli, Fu, Chi]

• Update the memory model during the interaction
  • New terms and associations are learned
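A memory model with decay and learning can be sketched as follows. The exponential decay rate, the additive update rule, and the class interface are assumptions for illustration, not the authors' formulation.

```python
import math

class MemoryModel:
    """Toy user-memory model: terms have strengths that decay over time
    and are reinforced when observed during interaction."""

    def __init__(self, decay=0.1):
        self.strength = {}          # term -> memory strength
        self.decay = decay

    def observe(self, term, weight=1.0):
        """Learning: seeing a term during interaction reinforces it."""
        self.strength[term] = self.strength.get(term, 0.0) + weight

    def step(self, dt=1.0):
        """Degradation: strengths decay exponentially as time passes."""
        for term in self.strength:
            self.strength[term] *= math.exp(-self.decay * dt)

    def recall_terms(self, k=2):
        """Query-term selection: pick the k best-remembered terms."""
        return sorted(self.strength, key=self.strength.get, reverse=True)[:k]

m = MemoryModel()
for t in ["registration", "james", "dollar"]:
    m.observe(t)
m.observe("registration")     # seen twice, so remembered better
m.step(dt=5.0)                # memory fades before the next session
print(m.recall_terms(k=2))    # ['registration', ...]
```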

[Diagram: memory model over terms t1-t5 with learned associations]

Page 12: ASSOCIATIVE BROWSING


OPTIONAL SLIDES

Page 13: ASSOCIATIVE BROWSING

*Evaluation Results
• Lengths of successful sessions

[Two charts: average session length (0-2.5) for random, informed, and oracle users; left: FO1, FO2-BFS, FO3-BFS; right: FO1, FO2-DFS, FO3-DFS]

Page 14: ASSOCIATIVE BROWSING


*Summary of Previous Evaluation

• User study by the DocTrack Game [Kim&Croft11]
  • Collect public documents in the UMass CS department
  • Build a web interface by which participants can find documents
  • Department people were asked to join and compete

• Limitations
  • Fixed collection, with a small set of target tasks
  • Hard to evaluate with varying system parameters

• Simulated evaluation as a solution
  • Build a model of user behavior
  • Generate simulated interaction logs

If search accuracy improves by X%, how will it affect user behavior?
How would its effectiveness vary for diverse groups of users?

Page 15: ASSOCIATIVE BROWSING


*Building the Associative Browsing Model

1. Document Collection
2. Concept Extraction
3. Link Extraction
4. Link Refinement

Link evidence: Term Similarity / Temporal Similarity / Co-occurrence
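One simple way to combine the three evidence types when scoring candidate links is a weighted sum. The weights, feature values, and function names below are assumptions for illustration; the refinement step would tune such weights.

```python
def link_score(term_sim, temporal_sim, cooccurrence, weights=(0.5, 0.3, 0.2)):
    """Score a candidate link as a weighted combination of its evidence."""
    w_term, w_time, w_cooc = weights
    return w_term * term_sim + w_time * temporal_sim + w_cooc * cooccurrence

# Candidate links from one document: (term sim, temporal sim, co-occurrence).
candidates = {
    "doc_b": (0.8, 0.2, 0.5),
    "doc_c": (0.4, 0.9, 0.1),
}
ranked = sorted(candidates, key=lambda d: link_score(*candidates[d]), reverse=True)
print(ranked)  # ['doc_b', 'doc_c']
```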

Page 16: ASSOCIATIVE BROWSING


*DocTrack Game

(Presenter note by Jinyoung Kim: "How can we present this briefly?")
Page 17: ASSOCIATIVE BROWSING


*Community Efforts based on the Datasets