Collaborative Information Retrieval: Concepts, Models and Evaluation


Page 1: Collaborative Information Retrieval: Concepts, Models and Evaluation


Collaborative Information Retrieval: Concepts, Models and Evaluation

Lynda Tamine, Paul Sabatier University, IRIT, Toulouse, France

Laure Soulier, Pierre and Marie Curie University, LIP6, Paris, France

April 10, 2016

Page 2: Collaborative Information Retrieval: Concepts, Models and Evaluation


OVERVIEW OF THE RESEARCH AREA

© [Shah, 2012]

• Publications
  - Papers in several conferences (SIGIR, CIKM, ECIR, CHI, CSCW, ...) and journals (IP&M, JASIST, JIR, IEEE, ...)
  - Books on "Collaborative Information Seeking" [Morris and Teevan, 2009, Shah, 2012, Hansen et al., 2015]
  - Special issues on "Collaborative Information Seeking" (IP&M, 2010; IEEE, 2014)
• Workshops and Tutorials
  - Collaborative Information Behavior: GROUP 2009
  - Collaborative Information Seeking: GROUP 2010, CSCW 2010, ASIST 2011 and CSCW 2013
  - Collaborative Information Retrieval: JCDL 2008 and CIKM 2011
  - Evaluation in Collaborative Information Retrieval: CIKM 2015

Page 3: Collaborative Information Retrieval: Concepts, Models and Evaluation

COLLABORATION IN A FEW NUMBERS [MORRIS, 2008, MORRIS, 2013]

• On which occasions do you collaborate?
  - Collaboration purposes:

    Task                   Frequency
    Travel planning        27.5%
    Online shopping        25.7%
    Bibliographic search   20.2%
    Technical search       16.5%
    Fact-finding           16.5%
    Social event planning  12.8%
    Health search          6.4%

  - Application domains:

    Domain           Example
    Medical          Physician/Patient, Physician/Nurse
    Digital library  Librarians/Customers
    E-Discovery      Fee-earners/Customers, Contract reviewer/Lead counsel
    Academic         Groups of students


Page 7: Collaborative Information Retrieval: Concepts, Models and Evaluation

COLLABORATION IN A FEW NUMBERS [MORRIS, 2008, MORRIS, 2013]

• How do you collaborate?
  - How often?
  - Group size?
  - Collaborative settings?

[Figure: reported frequencies 22%, 11.9%, 66.1%]


Page 13: Collaborative Information Retrieval: Concepts, Models and Evaluation


OUTLINE

1. Collaboration and Information Retrieval

2. Collaborative IR techniques and models

3. Evaluation

4. Challenges ahead

5. Discussion


Page 14: Collaborative Information Retrieval: Concepts, Models and Evaluation


PLAN

1. Collaboration and Information Retrieval
   - Users and Information Retrieval
   - The notion of collaboration
   - Collaboration paradigms
   - Collaborative search approaches
   - Collaborative search interfaces

2. Collaborative IR techniques and models

3. Evaluation

4. Challenges ahead

5. Discussion


Page 15: Collaborative Information Retrieval: Concepts, Models and Evaluation

AD-HOC INFORMATION RETRIEVAL
LET'S START BY WHAT YOU ALREADY KNOW...

• Ranking documents with respect to a query
• How?
  - Term weighting / document scoring [Robertson and Walker, 1994, Salton, 1971]
  - Query expansion/reformulation [Rocchio, 1971]

Page 16: Collaborative Information Retrieval: Concepts, Models and Evaluation

USERS AND INFORMATION RETRIEVAL
LET'S START BY WHAT YOU ALREADY KNOW...

• Personalized IR [Kraft et al., 2005, Gauch et al., 2003, Liu et al., 2004]
  - Personalizing search results to the user's context, preferences and interests
  - How?
    - Modeling the user's profile
    - Integrating the user's context and preferences within the document scoring
• Collaborative filtering [Resnick et al., 1994]
  - Recommending search results using the ratings/preferences of other users
  - How?
    - Inferring a user's own preferences from other users' preferences
    - Personalizing search results
• Social Information Retrieval [Amer-Yahia et al., 2007, Pal and Counts, 2011]
  - Exploiting social media platforms to retrieve documents/users...
  - How?
    - Social network analysis (graph structure, information diffusion, ...)
    - Integrating social-based features within the document relevance scoring

Let's have a more in-depth look at...

Collaborative Information Retrieval


Page 20: Collaborative Information Retrieval: Concepts, Models and Evaluation

THE NOTION OF COLLABORATION
DEFINITION

Definition

"A process through which parties who see different aspects of a problem can constructively explore their differences and search for solutions that go beyond their own limited vision of what is possible." [Gray, 1989]

Definition

"Collaboration is a process in which autonomous actors interact through formal and informal negotiation, jointly creating rules and structures governing their relationships and ways to act or decide on the issues that brought them together; it is a process involving shared norms and mutually beneficial interactions." [Thomson and Perry, 2006]

Page 21: Collaborative Information Retrieval: Concepts, Models and Evaluation

THE NOTION OF COLLABORATION
THE 5 WS OF COLLABORATION [MORRIS AND TEEVAN, 2009, SHAH, 2010]

What?
  Tasks: complex, exploratory or fact-finding tasks, ...
  Application domains: bibliographic, medical, e-Discovery, academic search

Why?
  Shared interests
  Insufficient knowledge
  Mutually beneficial goals
  Division of labor

Who?
  Groups vs. communities

When?
  Synchronous vs. asynchronous

Where?
  Colocated vs. remote

How?
  Crowdsourcing
  Implicit vs. explicit intent
  User mediation
  System mediation


Page 27: Collaborative Information Retrieval: Concepts, Models and Evaluation

THE NOTION OF COLLABORATION
COLLABORATIVE INFORMATION RETRIEVAL (CIR) [FOSTER, 2006, GOLOVCHINSKY ET AL., 2009]

Page 28: Collaborative Information Retrieval: Concepts, Models and Evaluation

THE NOTION OF COLLABORATION
COMPARING CIR WITH OTHER IR APPROACHES

Exercise

How do you think CIR differs from Personalized IR, Collaborative Filtering, or Social IR?

• User (unique/group)

• Personalization (yes/no)

• Collaboration (implicit/explicit)

• Concurrency (synchronous/asynchronous)

• Collaboration benefit (symmetric/asymmetric)

• Communication (yes/no)

• ...


Page 29: Collaborative Information Retrieval: Concepts, Models and Evaluation

THE NOTION OF COLLABORATION
COMPARING CIR WITH OTHER IR APPROACHES

Exercise

How do you think CIR differs from Personalized IR, Collaborative Filtering, or Social IR?

[Comparison table across Perso. IR, Collab. Filtering, Social IR and Collab. IR, with check marks per approach: user (unique/group), personalization (no/yes), collaboration (implicit/explicit), concurrency (synchronous/asynchronous), benefit (symmetric/asymmetric), communication (no/yes), information usage (information exchange, information retrieval, information synthesis, sensemaking)]

Page 30: Collaborative Information Retrieval: Concepts, Models and Evaluation

COLLABORATION PARADIGMS [FOLEY AND SMEATON, 2010, KELLY AND PAYNE, 2013, SHAH AND MARCHIONINI, 2010]

Division of labor
  • Role-based division of labor
  • Document-based division of labor

Sharing of knowledge
  • Communication and shared workspace
  • Ranking based on relevance judgements

Awareness
  • Collaborators' actions
  • Collaborators' context


Page 33: Collaborative Information Retrieval: Concepts, Models and Evaluation

COLLABORATIVE INFORMATION RETRIEVAL
COLLABORATIVE SEARCH SESSION


Page 34: Collaborative Information Retrieval: Concepts, Models and Evaluation

STRUCTURE OF THE COLLABORATIVE SEARCH SESSIONS

• The 3 phases of the social search model [Evans and Chi, 2010]

Page 35: Collaborative Information Retrieval: Concepts, Models and Evaluation

STRUCTURE OF THE COLLABORATIVE SEARCH SESSIONS

• The 3 phases of the collaborators' behavioral model [Karunakaran et al., 2013]

Page 36: Collaborative Information Retrieval: Concepts, Models and Evaluation


COLLABORATIVE SEARCH APPROACHES [JOHO ET AL., 2009]

• "Development of new IR models that can take collaboration into account in retrieval."
• "Leverage IR techniques such as relevance feedback, clustering, profiling, and data fusion to support collaborative search while using conventional IR models."
• "Develop search interfaces that allow people to perform search tasks in collaboration."

Page 37: Collaborative Information Retrieval: Concepts, Models and Evaluation


COLLABORATIVE SEARCH INTERFACES

What could be collaborative in search interfaces [Shah, 2012, Thomson and Perry, 2006]:

• Communication tools for defining search strategies and users' roles, as well as sharing relevant information [Golovchinsky et al., 2011, Kelly and Payne, 2013]
• Awareness tools for reporting collaborators' actions [Diriye and Golovchinsky, 2012, Rodriguez Perez et al., 2011]
• Individual and shared workspaces to ensure mutually beneficial goals
• Algorithmic mediation to monitor collaborators' actions

• User-driven collaborative interfaces
  - Collaborators fully active
  - Collaboration support through devices (interactive tabletops) or tools (web interfaces)
• System-mediated collaborative interfaces
  - Collaborators partially active
  - Collaboration support through algorithmic mediation (e.g., document distribution according to roles or not)


Page 39: Collaborative Information Retrieval: Concepts, Models and Evaluation

COLLABORATIVE SEARCH INTERFACES
USER-DRIVEN COLLABORATIVE INTERFACES

• Coagmento [Shah and Gonzalez-Ibanez, 2011a]


Page 40: Collaborative Information Retrieval: Concepts, Models and Evaluation

COLLABORATIVE SEARCH INTERFACES
USER-DRIVEN COLLABORATIVE INTERFACES

• CoFox [Rodriguez Perez et al., 2011]

Other interfaces: [Erickson, 2010], [Vivian and Dinet, 2008]...

Page 41: Collaborative Information Retrieval: Concepts, Models and Evaluation

COLLABORATIVE SEARCH INTERFACES
USER-DRIVEN COLLABORATIVE INTERFACES

• TeamSearch [Morris et al., 2006]

Other interfaces: Fischlar-DiamondTouch [Smeaton et al., 2006], WeSearch [Morris et al., 2010]...

Page 42: Collaborative Information Retrieval: Concepts, Models and Evaluation

COLLABORATIVE SEARCH INTERFACES
SYSTEM-MEDIATED COLLABORATIVE INTERFACES

• Cerchiamo [Golovchinsky et al., 2008]


Page 43: Collaborative Information Retrieval: Concepts, Models and Evaluation

COLLABORATIVE SEARCH INTERFACES
SYSTEM-MEDIATED COLLABORATIVE INTERFACES

• Querium [Diriye and Golovchinsky, 2012]


Page 44: Collaborative Information Retrieval: Concepts, Models and Evaluation


PLAN

1. Collaboration and Information Retrieval

2. Collaborative IR techniques and models
   - Challenges and issues
   - Understanding Collaborative IR
   - Overview
   - System-mediated CIR models
   - User-driven system-mediated CIR models
   - Roadmap

3. Evaluation

4. Challenges ahead

5. Discussion


Page 45: Collaborative Information Retrieval: Concepts, Models and Evaluation


CHALLENGES

• Conceptual models of IR:
  - Static IR: system-based IR, does not learn from users
    e.g., VSM [Salton, 1971], BM25 [Robertson et al., 1995], LM [Ponte and Croft, 1998], PageRank and HITS [Brin and Page, 1998]
  - Interactive IR: exploiting feedback from users
    e.g., Rocchio [Rocchio, 1971], relevance-based LM [Lavrenko and Croft, 2001]
  - Dynamic IR: learning dynamically from past user-system interactions and predicting future ones
    e.g., iPRP [Fuhr, 2008], interactive exploratory search [Jin et al., 2013]


Page 51: Collaborative Information Retrieval: Concepts, Models and Evaluation


CHALLENGES

1 Learning from user and user-user past interactions

2 Adaptation to multi-faceted and multi-user contexts: skills, expertise, role, etc.

3 Aggregating relevant information nuggets

4 Supporting synchronous vs. asynchronous coordination

5 Modeling collaboration paradigms: division of labor, sharing of knowledge

6 Optimizing the search cost: balance in work (search) and group benefit (task outcome)


Page 52: Collaborative Information Retrieval: Concepts, Models and Evaluation


EMPIRICAL UNDERSTANDING OF CIR

Objectives

1 Investigating user behavior and search patterns
  - Search processes [Shah and Gonzalez-Ibanez, 2010, Yue et al., 2014]
  - Search tactics and practices [Hansen and Jarvelin, 2005, Morris, 2008, Morris, 2013, Amershi and Morris, 2008, Tao and Tombros, 2013, Capra, 2013]
  - Role assignment [Imazu et al., 2011, Tamine and Soulier, 2015]

2 Studying the impact of collaborative search settings on performance
  - Impact of collaboration on search performance [Shah and Gonzalez-Ibanez, 2011b, Gonzalez-Ibanez et al., 2013]

Page 53: Collaborative Information Retrieval: Concepts, Models and Evaluation

EMPIRICAL UNDERSTANDING OF CIR
GOAL: EXPLORING COLLABORATIVE SEARCH PROCESSES

• Study objective: Testing the feasibility of Kuhlthau's model of the information seeking process in a collaborative information seeking situation [Shah and Gonzalez-Ibanez, 2010]

Stage        | Feeling (Affective)                    | Thoughts (Cognitive) | Actions
Initiation   | Uncertainty                            | General/Vague        |
Selection    | Optimism                               |                      |
Exploration  | Confusion, Frustration, Doubt          |                      | Seeking relevant information
Formulation  | Clarity                                | Narrowed, Clearer    |
Collection   | Sense of direction, Confidence         | Increased interest   | Seeking relevant or focused information
Presentation | Relief, Satisfaction or disappointment | Clearer or focused   |

Page 54: Collaborative Information Retrieval: Concepts, Models and Evaluation

EMPIRICAL UNDERSTANDING OF CIR
GOAL: EXPLORING COLLABORATIVE SEARCH PROCESSES

• Study objective: Testing the feasibility of Kuhlthau's model in collaborative information seeking situations [Shah and Gonzalez-Ibanez, 2010]
  - Participants: 42 dyads, students or university employees who had already done collaborative work together
  - System: Coagmento (1)
  - Sessions: two sessions (S1, S2) running in 7 main phases: (1) tutorial on the system, (2) demographic questionnaire, (3) task description, (4) timely-bounded task achievement, (5) post-questionnaire, (6) report compilation, (7) questionnaire and interview
  - Tasks: simulated work tasks.
    e.g., Task 1: Economic recession
    "A leading newspaper has hired your team to create a comprehensive report on the causes and consequences of the current economic recession in the US. As a part of your contract, you are required to collect all the relevant information from any available online sources that you can find. ... Your report on this topic should address the following issues: reasons behind this recession; effects on some major areas such as health-care, home ownership, and the financial sector (stock market); unemployment statistics over a period of time; proposal execution and effects of the economy stimulation plan; and people's opinions and reactions to the economy's downfall"

1. http://www.coagmento.org/

Page 55: Collaborative Information Retrieval: Concepts, Models and Evaluation

EMPIRICAL UNDERSTANDING OF CIR
GOAL: EXPLORING COLLABORATIVE SEARCH PROCESSES

• (Main) Study results:
  - The stages of Kuhlthau's model map onto collaborative tasks:
    • Initiation: number of chat messages at the stage and between stages
    • Selection: number of chat messages discussing the strategy
    • Exploration: number of search queries
    • Formulation: number of visited webpages
    • Collection: number of collected webpages
    • Presentation: number of moving actions for organizing collected snippets


Page 57: Collaborative Information Retrieval: Concepts, Models and Evaluation

EMPIRICAL UNDERSTANDING OF CIR
GOAL: EXPLORING SEARCH TACTICS AND PRACTICES

• Study objective: Analyzing query (re)formulations and related term sources based on participants' actions [Yue et al., 2014]
  - Participants: 20 dyads, students who already knew each other in advance
  - System: Collabsearch
  - Session: one session running in 7 main phases: (1) tutorial on the system, (2) demographic questionnaire, (3) task description, (4) timely-bounded task achievement, (5) post-questionnaire, (6) report compilation, (7) questionnaire and interview
  - Tasks: (T1) academic literature search, (T2) travel planning

Page 58: Collaborative Information Retrieval: Concepts, Models and Evaluation

EMPIRICAL UNDERSTANDING OF CIR
GOAL: EXPLORING SEARCH TACTICS AND PRACTICES

• (Main) Study results:
  - Individual action-based query reformulation (V, S, Q): no (significant) new findings
  - Collaborative action-based query reformulation (SP, QP, C):
    • The influence of communication (C) is task-dependent.
    • The influence of collaborators' queries (QP) is significantly higher than that of one's own previous queries (Q).
    • Collaborators' workspace (SP) has less influence than one's own workspace (S).

Measures:
  • V: percentage of queries for which participants viewed results and at least one term originated from a viewed page
  • S: percentage of queries for which participants saved results and at least one term originated from a saved page
  • Q: percentage of queries with at least one term overlapping with previous queries
  • SP: percentage of queries for which at least one term originated from collaborators' workspace
  • QP: percentage of queries for which at least one term originated from collaborators' previous queries
  • C: percentage of queries for which at least one term originated from collaborators' communication

Page 59: Collaborative Information Retrieval: Concepts, Models and Evaluation

EMPIRICAL UNDERSTANDING OF CIR
GOAL: STUDYING ROLE ASSIGNMENT

• Study objective: Understanding differences in users' behavior in role-oriented and non-role-oriented collaborative search sessions
  - Participants: 75 dyads, students who already knew each other
  - Settings: 25 dyads without roles, 50 dyads with roles (25 PM roles, 25 GS roles)
  - System: open-source Coagmento plugin
  - Session: one session running in 7 main phases: (1) tutorial on the system, (2) demographic questionnaire, (3) task description, (4) timely-bounded task achievement, (5) post-questionnaire, (6) report compilation, (7) questionnaire and interview
  - Tasks: three exploratory search tasks, topics from the Interactive TREC track (2)

Tamine, L. and Soulier, L. (2015). Understanding the impact of the role factor in collaborative information retrieval. In Proceedings of the ACM International Conference on Information and Knowledge Management, CIKM '15, pages 43-52.

2. http://trec.nist.gov/data/t8i/t8i.html

Page 60: Collaborative Information Retrieval: Concepts, Models and Evaluation

EMPIRICAL UNDERSTANDING OF CIR
GOAL: STUDYING ROLE ASSIGNMENT

• (Main) Study results
  - Users with assigned roles behave significantly differently from users without roles

[Table: mean (s.d.) behavioral measures (npq, dt, nf, qn, ql, qo, nbm) for the GS-role, PM-role and no-role groups, with per-group differences and significance levels (*, **, ***), plus ANOVA p-values for with-role vs. without-role]

Page 61: Collaborative Information Retrieval: Concepts, Models and Evaluation

EMPIRICAL UNDERSTANDING OF CIR
GOAL: STUDYING ROLE ASSIGNMENT

• (Main) Study results
  - Early and high level of coordination for participants without roles
  - Role drift for participants with the PM role

Page 62: Collaborative Information Retrieval: Concepts, Models and Evaluation

EMPIRICAL UNDERSTANDING OF CIR
GOAL: EVALUATING THE IMPACT OF COLLABORATION ON SEARCH PERFORMANCE

• Study objective: Evaluating the synergic effect of collaboration in information seeking [Shah and Gonzalez-Ibanez, 2011b]
  - Participants: 70 participants, 10 as single users, 30 as dyads
  - Settings: C1 (single users), C2 (artificially formed teams), C3 (co-located teams, different computers), C4 (co-located teams, same computer), C5 (remotely located teams)
  - System: Coagmento
  - Session: one session running in 7 main phases: (1) tutorial on the system, (2) demographic questionnaire, (3) task description, (4) timely-bounded task achievement, (5) post-questionnaire, (6) report compilation, (7) questionnaire and interview
  - Tasks: one exploratory search task, topic "gulf oil spill"

Page 63: Collaborative Information Retrieval: Concepts, Models and Evaluation

EMPIRICAL UNDERSTANDING OF CIR
GOAL: EVALUATING THE IMPACT OF COLLABORATION ON SEARCH PERFORMANCE

• (Main) Study results
  - Value of remote collaboration when the task has clear independent components
  - Remotely located teams are able to leverage real interactions, leading to synergic collaboration
  - Cognitive load in a collaborative setting is not significantly higher than in an individual one

Page 64: Collaborative Information Retrieval: Concepts, Models and Evaluation


EMPIRICAL UNDERSTANDING OF CIR

Lessons learned

• Small-group (critical mass) collaborative search is a common practice despite the lack of specific tools
• The whole is greater than the sum of its parts
• Collaborative search behavior differs from individual search behavior, while some phases of theoretical models of individual search remain valid for collaborative search
• Algorithmic mediation lowers the coordination cost
• Roles structure the collaboration but do not guarantee performance improvement compared to no roles

Design implications: revisit IR models and techniques

• Back to the axiomatic relevance hypothesis (Fang et al., 2011)
• Role as a novel variable in IR models?
• Learning to rank from user-system and user-user interactions within multi-session search tasks?


Page 73: Collaborative Information Retrieval: Concepts, Models and Evaluation

OVERVIEW OF IR MODELS AND TECHNIQUES
DESIGNING COLLABORATIVE IR MODELS: A YOUNG RESEARCH AREA


Page 75: Collaborative Information Retrieval: Concepts, Models and Evaluation


OVERVIEW OF IR MODELS AND TECHNIQUES

Collaborative IR models are based on algorithmic mediation: systems re-use users' search activity data to mediate the search.

• Data?
  - Click-through data, queries, viewed results, result rankings, ...
  - User-user communication
• Mediation?
  - Routing/suggesting/enhancing queries
  - Building personalized document rankings
  - Automatically setting up the division of labor (a minimal sketch follows)
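Below is a minimal, hypothetical Python sketch of the last mediation point: a mediator object tracks which documents have already been allocated to a collaborator and filters each user's ranking so browsing does not overlap. The class and method names are illustrative, not taken from any system cited in this tutorial.

```python
from collections import defaultdict

class DivisionOfLaborMediator:
    """Illustrative mediator enforcing non-overlapping browsing:
    a document allocated to one collaborator is withheld from the others."""

    def __init__(self):
        self.allocated = set()                 # doc ids already assigned to someone
        self.seen_by_user = defaultdict(set)   # doc ids shown to each user

    def mediate(self, user, ranking, k=10):
        """Return up to k documents of `ranking` for `user`, skipping
        documents already allocated to another collaborator."""
        page = []
        for doc_id in ranking:
            if doc_id in self.allocated and doc_id not in self.seen_by_user[user]:
                continue  # a teammate is already covering this document
            page.append(doc_id)
            self.allocated.add(doc_id)
            self.seen_by_user[user].add(doc_id)
            if len(page) == k:
                break
        return page

mediator = DivisionOfLaborMediator()
print(mediator.mediate("u1", ["d1", "d2", "d3", "d4"], k=2))  # ['d1', 'd2']
print(mediator.mediate("u2", ["d1", "d2", "d3", "d4"], k=2))  # ['d3', 'd4']
```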


Page 77: Collaborative Information Retrieval: Concepts, Models and Evaluation


OVERVIEW OF IR MODELS AND TECHNIQUES

Notations

Notation      Description
d             Document
q             Query
u_j           User j
g             Collaborative group
t_i           Term i
RSV(d, q)     Relevance Status Value given (d, q)
N             Document collection size
n_i           Number of documents in the collection in which term t_i occurs
R             Number of relevant documents in the collection
R_{u_j}       Number of relevant documents in the collection for user u_j
r_i^{u_j}     Number of relevant documents of user u_j in which term t_i occurs

Page 78: Collaborative Information Retrieval: Concepts, Models and Evaluation

SYSTEM-MEDIATED CIR MODELS
USER/GROUP-BASED MEDIATION

• Enhancing collaborative search with users' context [Morris et al., 2008, Foley and Smeaton, 2009a, Han et al., 2016]
  - Division of labor: dividing the work by non-overlapping browsing
  - Sharing of knowledge: exploiting personal relevance judgments and users' authority

Page 79: Collaborative Information Retrieval: Concepts, Models and Evaluation

SYSTEM-MEDIATED CIR MODELS
USER/GROUP-BASED MEDIATION: GROUPIZATION, SMART SPLITTING, GROUP-HIGHLIGHTING [MORRIS ET AL., 2008]

• Hypothesis setting: one or a few synchronous search queries
• 3 approaches
  - Smart splitting: splitting top-ranked web results using a round-robin technique, personalized splitting of the remaining results (document ranking level)
  - Groupization: reusing individual personalization techniques for groups (document ranking level)
  - Hit highlighting: highlighting users' keywords (document browsing level)

Page 80: Collaborative Information Retrieval: Concepts, Models and Evaluation

SYSTEM-MEDIATED CIR MODELS
USER/GROUP-BASED MEDIATION: SMART-SPLITTING [MORRIS ET AL., 2008]

Personalizing the document ranking: use the revisited BM25 weighting scheme [Teevan et al., 2005]:

$$RSV(d, q, u_j) = \sum_{t_i \in d \cap q} w_{BM25}(t_i, u_j) \quad (1)$$

$$w_{BM25}(t_i, u_j) = \log \frac{(r_i^{u_j} + 0.5)\,(N' - n'_i - R_{u_j} + r_i^{u_j} + 0.5)}{(n'_i - r_i^{u_j} + 0.5)\,(R_{u_j} - r_i^{u_j} + 0.5)} \quad (2)$$

$$N' = N + R_{u_j} \quad (3)$$

$$n'_i = n_i + r_i^{u_j} \quad (4)$$
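As a worked companion to Eqs. (1)-(4), here is a minimal Python sketch, assuming the per-user counts (r_i^{u_j}, n_i) are available; the `stats` and `score` containers and all function names are illustrative, not part of [Morris et al., 2008]:

```python
import math

def personalized_bm25_weight(r_i, n_i, N, R_u):
    """Eqs. (2)-(4): BM25-style relevance weight with corpus statistics
    augmented by the user's R_u relevant documents, of which r_i
    contain the term (counts assumed consistent and non-degenerate)."""
    N_prime = N + R_u       # Eq. (3)
    n_prime = n_i + r_i     # Eq. (4)
    return math.log((r_i + 0.5) * (N_prime - n_prime - R_u + r_i + 0.5)
                    / ((n_prime - r_i + 0.5) * (R_u - r_i + 0.5)))

def rsv(doc_terms, query_terms, stats, N, R_u):
    """Eq. (1): sum personalized weights over the terms (sets) shared by
    d and q; stats[t] holds this user's (r_i, n_i) counts for term t."""
    return sum(personalized_bm25_weight(*stats[t], N, R_u)
               for t in doc_terms & query_terms)

def smart_split(results, users, score, top_k=10):
    """Sketch of the splitting policy: round-robin the top-k results,
    then give each remaining document to the user with the highest
    personalized score(doc, user)."""
    assignment = {u: [] for u in users}
    for rank, d in enumerate(results[:top_k]):
        assignment[users[rank % len(users)]].append(d)
    for d in results[top_k:]:
        assignment[max(users, key=lambda u: score(d, u))].append(d)
    return assignment
```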

Page 81: Collaborative Information Retrieval: Concepts, Models and Evaluation

SYSTEM-MEDIATED CIR MODELS
USER/GROUP-BASED MEDIATION: SMART-SPLITTING [MORRIS ET AL., 2008]

Example

Smart-splitting according to personalized scores.


Page 82: Collaborative Information Retrieval: Concepts, Models and Evaluation

SYSTEM-MEDIATED CIR MODELS
USER/GROUP-BASED MEDIATION: COLLABORATIVE RELEVANCE FEEDBACK [FOLEY ET AL., 2008, FOLEY AND SMEATON, 2009B]

• Hypothesis setting: multiple independent synchronous search queries
• Collaborative relevance feedback: sharing collaborators' explicit relevance judgments
  - Aggregate the partial user relevance scores
  - Compute each user's authority weighting

Page 83: Collaborative Information Retrieval: Concepts, Models and Evaluation

1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion

SYSTEM-MEDIATED CIR MODELS
USER/GROUP-BASED MEDIATION: COLLABORATIVE RELEVANCE FEEDBACK [FOLEY ET AL., 2008, FOLEY AND SMEATON, 2009B]

• A: Combining inputs of the RF process

puwo(t_i) = \sum_{u=0}^{U-1} r_i^u \, w_{BM25}(t_i)    (5)

w_{BM25}(t_i) = \log \frac{\left(\sum_{u=0}^{U-1} \alpha_u \frac{r_i^u}{R_u}\right) \left(1 - \sum_{u=0}^{U-1} \alpha_u \frac{n_i - r_i^u}{N - R_u}\right)}{\left(\sum_{u=0}^{U-1} \alpha_u \frac{n_i - r_i^u}{N - R_u}\right) \left(1 - \sum_{u=0}^{U-1} \alpha_u \frac{r_i^u}{R_u}\right)}    (6)

\sum_{u=0}^{U-1} \alpha_u = 1    (7)


SYSTEM-MEDIATED CIR MODELS
USER/GROUP-BASED MEDIATION: COLLABORATIVE RELEVANCE FEEDBACK [FOLEY ET AL., 2008, FOLEY AND SMEATON, 2009B]

• B: Combining outputs of the RF process

crwo(t_i) = \sum_{u=0}^{U-1} \alpha_u \, w_{BM25}(t_i, u)    (8)

w_{BM25}(t_i, u) = \log \frac{\frac{r_i^u}{R_u} \left(1 - \frac{n_i - r_i^u}{N - R_u}\right)}{\frac{n_i - r_i^u}{N - R_u} \left(1 - \frac{r_i^u}{R_u}\right)}    (9)

• C: Combining outputs of the ranking process

RSV(d, q) = \sum_{u=0}^{U-1} \alpha_u \, RSV(d, q, u)    (10)

RSV(d, q, u) = \sum_{t_i \in d \cap q} w_{BM25}(t_i, u)    (11)
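A sketch of variants B and C under the same notation (per-user relevance statistics assumed available; all names are ours):

    import math

    def w_bm25_user(r_ui, R_u, n_i, N):
        # Eq. (9): relevance-feedback weight computed from user u's judgments only.
        p = r_ui / R_u                  # estimated P(t_i | relevant for u)
        q = (n_i - r_ui) / (N - R_u)    # estimated P(t_i | non-relevant)
        return math.log((p * (1 - q)) / (q * (1 - p)))

    def fused_rsv(rsv_per_user, alphas):
        # Eq. (10): authority-weighted fusion of the per-user ranking scores,
        # with the alphas summing to 1 as required by eq. (7).
        return sum(a * s for a, s in zip(alphas, rsv_per_user))

    # A document scored 2.4 by u0 and 1.1 by u1, with authority weights 0.7/0.3:
    print(fused_rsv([2.4, 1.1], [0.7, 0.3]))  # 2.01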


SYSTEM-MEDIATED CIR MODELS
USER/GROUP-BASED MEDIATION: CONTEXT-BASED COLLABORATIVE SEARCH [HAN ET AL., 2016]

• Exploit a 3-dimensional context:
  - Individual search history H_QU: the user's own queries, results, bookmarks, etc.
  - Collaborative group H_CL: the collaborators' search history (queries, results, bookmarks, etc.)
  - Collaboration H_CH: collaboration behavior, i.e., the chat (communication) channel


SYSTEM-MEDIATED CIR MODELS
USER/GROUP-BASED MEDIATION: CONTEXT-BASED COLLABORATIVE SEARCH [HAN ET AL., 2016]

1. Building a document ranking RSV(q, d) and generating Rank(d)

2. Building the document language model θ_d

3. Building the context language model θ_{H_x}:

p(t_i | H_x) = \frac{1}{K} \sum_{k=1}^{K} p(t_i | X_k)    (12)

p(t_i | X_k) = \frac{n_{t_i}^{X_k}}{|X_k|}    (13)

where n_{t_i}^{X_k} is the number of occurrences of t_i in context item X_k.

4. Computing the KL-divergence between θ_{H_x} and θ_d:

D(θ_d, θ_{H_x}) = - \sum_{t_i} p(t_i | θ_d) \log p(t_i | H_x)    (14)

5. Learning to rank using pairwise features (Rank(d), D(θ_d, θ_{H_x}))
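Steps 3-4 admit a compact sketch, taking eq. (13) as a maximum-likelihood estimate; the smoothing constant and all names are ours:

    import math
    from collections import Counter

    def context_lm(history_items):
        # Eqs. (12)-(13): average the maximum-likelihood term distributions of
        # the K items (queries, snippets, chat messages, ...) of one dimension.
        models = []
        for item in history_items:
            counts = Counter(item)
            total = sum(counts.values())
            models.append({t: c / total for t, c in counts.items()})
        vocab = set().union(*models)
        return {t: sum(m.get(t, 0.0) for m in models) / len(models) for t in vocab}

    def divergence(theta_d, theta_h, eps=1e-9):
        # Eq. (14): cross-entropy-style divergence between the document model
        # and the context model (eps guards terms unseen in the context).
        return -sum(p * math.log(theta_h.get(t, eps)) for t, p in theta_d.items())

    h = context_lm([["global", "warming"], ["warming", "causes", "warming"]])
    print(divergence({"warming": 0.6, "ice": 0.4}, h))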


SYSTEM-MEDIATED CIR MODELS
ROLE-BASED MEDIATION

Enhancing collaborative search with users' roles [Pickens et al., 2008, Shah et al., 2010, Soulier et al., 2014b]
• Division of labor: dividing the work based on users' role peculiarities
• Sharing of knowledge: splitting the search results


SYSTEM-MEDIATED CIR MODELS
ROLE-BASED MEDIATION: PROSPECTOR AND MINER [PICKENS ET AL., 2008]

• Prospector/Miner as functional roles supported by algorithms:
  - Prospector: "..opens new fields for exploration into a data collection..". → Draws ideas from algorithmically suggested query terms
  - Miner: "..ensures that rich veins of information are explored...". → Refines the search by judging highly ranked (unseen) documents
• Collaborative system architecture:
  - Algorithmic layer: functions combining the users' search activities to produce outcomes fitted to the roles (queries, document rankings)
  - Regulator layer: captures inputs (search activities), calls the appropriate functions of the algorithmic layer, and routes the outputs of the algorithmic layer to the appropriate role (user)


SYSTEM-MEDIATED CIR MODELS
ROLE-BASED MEDIATION: PROSPECTOR AND MINER [PICKENS ET AL., 2008]

• Prospector function: highly-relevant terms are suggested based on:

Score(t_i) = \sum_{L_q \in L} w_r(L_q) \, w_f(L_q) \, rlf(t_i; L_q)    (15)

where rlf(t_i; L_q) is the number of documents in L_q in which t_i occurs.

• Miner function: unseen documents are queued according to:

RSV(q, d) = \sum_{L_q \in L} w_r(L_q) \, w_f(L_q) \, borda(d; L_q)    (16)

w_r(L_q) = \frac{|unseen \in L_q|}{|seen \in L_q|}    (17)

w_f(L_q) = \frac{|rel \in L_q|}{|seen \in L_q|}    (18)
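A sketch of the miner's queueing under this reading of (15)-(18): the numerator of (17) is garbled in the source and is taken here as the unseen count, and all names are ours:

    def list_weights(lst, seen, rel):
        # Eqs. (17)-(18): weight each result list L_q by how many of its
        # documents are still unseen and how many of its seen documents were
        # judged relevant (both relative to the list's seen documents).
        n_seen = sum(1 for d in lst if d in seen) or 1
        w_r = sum(1 for d in lst if d not in seen) / n_seen
        w_f = sum(1 for d in lst if d in rel) / n_seen
        return w_r, w_f

    def miner_queue(lists, seen, rel):
        # Eq. (16): accumulate a weighted Borda count over the session's result
        # lists and queue the not-yet-seen documents for the miner to judge.
        scores = {}
        for lst in lists:
            w_r, w_f = list_weights(lst, seen, rel)
            for rank, d in enumerate(lst):
                if d not in seen:
                    scores[d] = scores.get(d, 0.0) + w_r * w_f * (len(lst) - rank)
        return sorted(scores, key=scores.get, reverse=True)

    print(miner_queue([["d1", "d2", "d3"], ["d2", "d4"]], seen={"d1"}, rel={"d1"}))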


SYSTEM-MEDIATED CIR MODELS
ROLE-BASED MEDIATION: GATHERER AND SURVEYOR [SHAH ET AL., 2010]

• Gatherer/Surveyor as functional roles supported by algorithms:
  - Gatherer: "..scan results of joint search activity to discover most immediately relevant documents..".
  - Surveyor: "..browse a wider diversity of information to get a better understanding of the collection being searched...".
• Main functions:
  - Merging: merging (e.g., CombSUM) the document rankings of the collaborators
  - Splitting: routing the appropriate documents according to the roles (e.g., k-means clustering). High precision for the Gatherer, high diversity for the Surveyor


SYSTEM-MEDIATED CIR MODELS
ROLE-BASED MEDIATION: DOMAIN EXPERT AND DOMAIN NOVICE

Domain expert/Domain novice as knowledge-based roles supported by algorithms:
• Domain expert: "..represent problems at deep structural levels and are generally interested in discovering new associations among different aspects of items, or in delineating the advances in a research focus surrounding the query topic..".
• Domain novice: "..represent problems in terms of surface or superficial aspects and are generally interested in enhancing their learning about the general query topic..".

Soulier, L., Tamine, L., and Bahsoun, W. (2014b). On domain expertise-based roles in collaborative information retrieval. Information Processing & Management (IP&M), 50(5):752–774.


SYSTEM-MEDIATED CIR MODELS
ROLE-BASED MEDIATION: DOMAIN EXPERT AND DOMAIN NOVICE [SOULIER ET AL., IP&M 2014B]

A two-step algorithm:

1. Role-based document relevance scoring

P_k(d | u_j, q) ∝ P_k(u_j | d) · P_k(d | q)    (19)

P(q | θ_d) ∝ \prod_{(t_i, w_{iq}) \in q} [λ P(t_i | θ_d) + (1 - λ) P(t_i | θ_C)]^{w_{iq}}    (20)

P_k(u_j | d) ∝ P(π(u_j)^k | θ_d) ∝ \prod_{(t_i, w_{ij}^k) \in π(u_j)^k} [λ_{dj}^k P(t_i | θ_d) + (1 - λ_{dj}^k) P(t_i | θ_C)]^{w_{ij}^k}    (21)
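Both (20) and (21) are Jelinek-Mercer-smoothed likelihoods of a weighted term bag against the document model; a compact sketch in the log domain (all names are ours):

    import math

    def log_p_text_given_doc(text, theta_d, theta_c, lam, eps=1e-9):
        # Shared form of eqs. (20)-(21): `text` is a list of (term, weight)
        # pairs (the query q or the profile pi(u_j)^k), smoothed against the
        # collection model theta_c.
        return sum(w * math.log(lam * theta_d.get(t, 0.0)
                                + (1 - lam) * theta_c.get(t, eps))
                   for t, w in text)

    def log_role_relevance(query, profile, theta_d, theta_c, lam_q, lam_dj):
        # Eq. (19) in the log domain: P_k(d | u_j, q) ∝ P_k(d | q) · P_k(u_j | d).
        return (log_p_text_given_doc(query, theta_d, theta_c, lam_q)
                + log_p_text_given_doc(profile, theta_d, theta_c, lam_dj))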


SYSTEM-MEDIATED CIR MODELS
ROLE-BASED MEDIATION: DOMAIN EXPERT AND DOMAIN NOVICE [SOULIER ET AL., IP&M 2014B]

A two-step algorithm:

1. Role-based document relevance scoring: parameter smoothing using evidence from novelty and specificity

λ_{dj}^k = \frac{Nov(d, D(u_j)^k) \cdot Spec(d)^β}{\max_{d' \in D} Nov(d', D(u_j)^k) \cdot Spec(d')^β}    (22)

with β = 1 if u_j is an expert, β = -1 if u_j is a novice.

  - Novelty:

Nov(d, D(u_j)^k) = \min_{d' \in D(u_j)^k} d(d, d')    (23)

  - Specificity:

Spec(d) = avg_{t_i \in d} \, spec(t_i) = avg_{t_i \in d} \left(-\log \frac{n_{t_i}}{N}\right)^α    (24)
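A sketch of the smoothing parameter under this reading of (22)-(24), where df[t] is term t's document frequency and dist is any document distance; the names and function signatures are ours:

    import math

    def spec(doc_terms, df, N, alpha=3):
        # Eq. (24): average term specificity; rare terms (low df) score high.
        return sum((-math.log(df[t] / N)) ** alpha for t in doc_terms) / len(doc_terms)

    def nov(d, selected, dist):
        # Eq. (23): distance to the closest document the user already selected.
        return min(dist(d, dp) for dp in selected)

    def smoothing_lambda(d, selected, beta, corpus, dist, spec_of):
        # Eq. (22): beta = +1 pushes specific documents towards the expert,
        # beta = -1 pushes general documents towards the novice.
        score = lambda x: nov(x, selected, dist) * spec_of(x) ** beta
        return score(d) / max(score(dp) for dp in corpus)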


SYSTEM-MEDIATED CIR MODELS
ROLE-BASED MEDIATION: DOMAIN EXPERT AND DOMAIN NOVICE [SOULIER ET AL., IP&M 2014B]

A two-step algorithm:

2. Document allocation to collaborators
  - Classification based on the Expectation-Maximization (EM) algorithm
  - E-step: probability that the document belongs to the collaborator's relevance class:

P(R_j = 1 | x_{dj}^k) = \frac{α_j^k \cdot φ_j^k(x_{dj}^k)}{α_j^k \cdot φ_j^k(x_{dj}^k) + (1 - α_j^k) \cdot ψ_j^k(x_{dj}^k)}    (25)

  - M-step: parameter updating and likelihood estimation
  - Document allocation to collaborators by comparison of document ranks within the collaborators' lists:

r_{jj'}^k(d, δ_j^k, δ_{j'}^k) = \begin{cases} 1 & \text{if } rank(d, δ_j^k) < rank(d, δ_{j'}^k) \\ 0 & \text{otherwise} \end{cases}    (26)

  - Division of labor: displaying distinct document lists to the collaborators
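The E-step (25) is the usual two-component mixture responsibility; a toy sketch (the triangular densities below are illustrative only, not from the source):

    def e_step(x, alpha, phi, psi):
        # Eq. (25): posterior that features x of a document belong to the
        # collaborator's class, with mixture weight alpha and class-conditional
        # densities phi (in-class) and psi (out-of-class).
        num = alpha * phi(x)
        return num / (num + (1 - alpha) * psi(x))

    phi = lambda x: max(1e-9, 1 - abs(x - 1.0))  # density peaked at 1
    psi = lambda x: max(1e-9, 1 - abs(x))        # density peaked at 0
    print(round(e_step(0.8, 0.5, phi, psi), 2))  # 0.8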


SYSTEM-MEDIATED CIR MODELS
ROLE-BASED MEDIATION: DOMAIN EXPERT AND DOMAIN NOVICE [SOULIER ET AL., IP&M 2014B]

Example: Applying the Expert/Novice CIR model

Let's consider:
• A collaborative search session with two users u1 (expert) and u2 (novice).
• A shared information need I modeled through a query q.
• A collection of 10 documents and their associated relevance scores with respect to the shared information need I.

Term frequencies:

      t1  t2  t3  t4
q      1   0   1   0
d1     2   3   1   1
d2     0   0   5   3
d3     2   1   7   6
d4     4   1   0   0
d5     2   0   0   0
d6     3   0   0   0
d7     7   1   1   1
d8     3   3   3   3
d9     1   4   5   0
d10    0   0   4   0

Weighting vectors of the documents and the query:
q = (0.5, 0, 0.5, 0)
d1 = (0.29, 0.43, 0.14, 0.14)
d2 = (0, 0, 0.63, 0.37)
d3 = (0.12, 0.06, 0.44, 0.28)
d4 = (0.8, 0.2, 0, 0)
d5 = (1, 0, 0, 0)
d6 = (0.3, 0, 0, 0.7)
d7 = (0.7, 0.1, 0.1, 0.1)
d8 = (0.25, 0.25, 0.25, 0.25)
d9 = (0.1, 0.4, 0.5, 0)
d10 = (0, 0, 1, 0)

Users' profiles: π(u1)^0 = π(u2)^0 = q


SYSTEM-MEDIATED CIR MODELS
ROLE-BASED MEDIATION: DOMAIN EXPERT AND DOMAIN NOVICE [SOULIER ET AL., IP&M 2014B]

Example: Applying the Expert/Novice CIR model

      RSV(q, d)  rank(d)  Spec(d)
d1    0.24       2        0.19
d2    0.02       7        0.23
d3    0.17       3        0.19
d4    0.03       6        0.15
d5    0.01       9        0.1
d6    0.02       8        0.1
d7    0.10       4        0.19
d8    0.31       1        0.19
d9    0.09       5        0.16
d10   0.01       10       0.15

• The document specificity is estimated with α = 3 (if a term has a collection frequency equal to 1, -log(1/10) = 2.30):

Spec(d1) = \frac{(-\log \frac{8}{10})^3 + (-\log \frac{6}{10})^3 + (-\log \frac{7}{10})^3 + (-\log \frac{5}{10})^3}{4} = 0.19

Spec(d2) = 0.23, Spec(d3) = 0.19, Spec(d4) = 0.15, Spec(d5) = 0.01, Spec(d6) = 0.1, Spec(d7) = 0.19, Spec(d8) = 0.19, Spec(d9) = 0.16, Spec(d10) = 0.15

• Iteration 0: distributing the top 6 documents to the users: the 3 most specific to the expert and the 3 least specific to the novice.
  - Expert u1: l^0(u1, D^0_ns) = {d8, d1, d3}
  - Novice u2: l^0(u2, D^0_ns) = {d7, d9, d4}


SYSTEM-MEDIATED CIR MODELS
ROLE-BASED MEDIATION: DOMAIN EXPERT AND DOMAIN NOVICE [SOULIER ET AL., IP&M 2014B]

Example: Applying the Expert/Novice CIR model

• Iteration 1. Let's consider that user u2 selected document d4 (D(u1)^1 = {d4, d5}).
  - Building the users' profiles:
    π(u1)^1 = (0.5, 0, 0.5, 0)
    π(u2)^1 = ((0.5 + 0.8)/2, 0.2/2, 0.5/2, 0) = (0.65, 0.1, 0.25, 0)
  - Estimating the document relevance with respect to the collaborators.
    For user u1: P1(d1 | u1) = P1(d1 | q) * P1(u1 | d1) = 0.24 * 0.22 = 0.05
    P1(d1 | q) = 0.24, since u1's profile has not evolved.
    λ^1_11 = (1 * 0.19) / 0.23 = 0.85, where 0.19 is the specificity of document d1, 1 its novelty score, and 0.23 the normalization score.
    P1(u1 | d1) = (0.85 * 2/7 + 0.15 * 24/84)^{0.5} * (0.85 * 3/7 + 0.15 * 13/84)^{0} * (0.85 * 1/7 + 0.15 * 26/84)^{0.5} * (0.85 * 1/7 + 0.15 * 21/84)^{0} = 0.22

The normalized document scores for each collaborator are the following:

      P1(d | u1)  P2(d | u2)
d1    0.23        0.28
d2    0           0.03
d3    0.16        0.11
d5    0.01        0.01
d6    0.03        0.02
d7    0.12        0.14
d8    0.34        0.34
d9    0.10        0.06
d10   0.01        0.01



USER-DRIVEN SYSTEM-MEDIATED CIR MODELS
MINE USERS' ROLES THEN PERSONALIZE THE SEARCH

Soulier, L., Shah, C., and Tamine, L. (2014a). User-driven System-mediated Collaborative Information Retrieval. In Proceedings of the Annual International SIGIR Conference on Research and Development in Information Retrieval, SIGIR '14, pages 485–494. ACM.


USER-DRIVEN SYSTEM-MEDIATED CIR MODELS
MINE USERS' ROLES THEN PERSONALIZE THE SEARCH [SOULIER ET AL., SIGIR 2014A]

• Identifying users’ search behavior differences: estimating significance of differencesusing the Kolmogrov-Smirnov test

• Characterizing users’ role


USER-DRIVEN SYSTEM-MEDIATED CIR MODELS
MINE USERS' ROLES THEN PERSONALIZE THE SEARCH [SOULIER ET AL., SIGIR 2014A]

• Categorizing users' roles R_u:

\operatorname{argmin}_{R_{1,2}} \; \| F_{R_{1,2}} \ominus C^{(t_l)}_{u_1,u_2} \|    (27)

subject to: \forall (f_j, f_k) \in K_{R_{1,2}}, \; F_{R_{1,2}}(f_j, f_k) - C^{(t_l)}_{u_1,u_2}(f_j, f_k) > -1

where ⊖ is defined as:

(F_{R_{1,2}} \ominus C^{(t_l)}_{u_1,u_2})(f_j, f_k) = \begin{cases} F_{R_{1,2}}(f_j, f_k) - C^{(t_l)}_{u_1,u_2}(f_j, f_k) & \text{if } F_{R_{1,2}}(f_j, f_k) \in \{-1, 1\} \\ 0 & \text{otherwise} \end{cases}

• Personalizing the search: [Pickens et al., 2008, Shah, 2011], ...


USER-DRIVEN SYSTEM-MEDIATED CIR MODELS
MINE USERS' ROLES THEN PERSONALIZE THE SEARCH [SOULIER ET AL., SIGIR 2014A]

• User’s roles modeled through patternsI Intuition

Number of visited documents

Number of submitted queries

Negative correlation

I Role pattern PR1,2

I Search feature kernel KR1,2

I Search feature-based correlation matrix FR1,2

FR1,2=

1 if positively correlated−1 if negatively correlated0 otherwise


USER-DRIVEN SYSTEM-MEDIATED CIR MODELS
MINE USERS' ROLES THEN PERSONALIZE THE SEARCH [SOULIER ET AL., SIGIR 2014A]

Example: Mining the roles of collaborators

A collaborative search session involves two users u1 and u2 aiming at identifying information dealing with "global warming". We present the collaborators' search actions for the first 5 minutes of the session.

u    t    action                                             additional information
u2   0    submitted query "global warming"
u1   1    submitted query "global warming"
u2   8    document d1: visited                               comment: "interesting"
u2   12   document d2: visited
u2   17   document d3: visited                               rated: 4/5
u2   19   document d4: visited
u1   30   submitted query "greenhouse effect"
u1   60   submitted query "global warming definition"
u1   63   document d20: visited                              rated: 3/5
u1   70   submitted query "global warming protection"
u1   75   document d21: visited
u2   100  document d5: visited                               rated: 5/5
u2   110  document d6: visited                               rated: 4/5
u2   120  document d7: visited
u1   130  submitted query "gas emission"
u1   132  document d22: visited                              rated: 4/5
u2   150  document d8: visited
u2   160  document d9: visited
u2   170  document d10: visited
u2   200  document d11: visited                              comment: "great"
u2   220  document d12: visited
u2   240  document d13: visited
u1   245  submitted query "global warming world protection"
u1   250  submitted query "causes temperature changes"
u1   298  submitted query "global warming world politics"


USER-DRIVEN SYSTEM-MEDIATED CIR MODELS
MINE USERS' ROLES THEN PERSONALIZE THE SEARCH [SOULIER ET AL., SIGIR 2014A]

Example: Mining the roles of collaborators: matching with role patterns

• Role patterns
  - Roles of reader-querier:

F_{R_{read,querier}} = \begin{pmatrix} 1 & -1 \\ -1 & 1 \end{pmatrix}, \quad K_{R_{read,querier}} = \{(N_q, N_p)\}

Role: (S^{(t_l)}_{u_1}, S^{(t_l)}_{u_2}, R_{read,querier}) → {(reader, querier), (querier, reader)}

(S^{(t_l)}_{u_1}, S^{(t_l)}_{u_2}, R_{read,querier}) ↦ (reader, querier) if S^{(t_l)}_{u_1}(t_l, N_p) > S^{(t_l)}_{u_2}(t_l, N_p), and (querier, reader) otherwise

  - Roles of judge-querier:

F_{R_{judge,querier}} = \begin{pmatrix} 1 & -1 \\ -1 & 1 \end{pmatrix}, \quad K_{R_{judge,querier}} = \{(N_q, N_c)\}

Role: (S^{(t_l)}_{u_1}, S^{(t_l)}_{u_2}, R_{judge,querier}) → {(judge, querier), (querier, judge)}

(S^{(t_l)}_{u_1}, S^{(t_l)}_{u_2}, R_{judge,querier}) ↦ (judge, querier) if S^{(t_l)}_{u_1}(t_l, N_c) > S^{(t_l)}_{u_2}(t_l, N_c), and (querier, judge) otherwise


USER-DRIVEN SYSTEM-MEDIATED CIR MODELS
MINE USERS' ROLES THEN PERSONALIZE THE SEARCH [SOULIER ET AL., SIGIR 2014A]

Example: Mining the roles of collaborators

• Track the users' behavior every 60 seconds
• F = {N_q, N_d, N_c, N_r}: respectively, the number of queries, visited documents, comments, and ratings
• Users' search behavior (one row per minute):

S^{(300)}_{u_1} = \begin{pmatrix} 3 & 0 & 0 & 0 \\ 4 & 2 & 0 & 1 \\ 5 & 3 & 0 & 2 \\ 5 & 3 & 0 & 2 \\ 8 & 3 & 0 & 2 \end{pmatrix} \qquad S^{(300)}_{u_2} = \begin{pmatrix} 1 & 4 & 1 & 1 \\ 1 & 7 & 1 & 3 \\ 1 & 10 & 1 & 3 \\ 1 & 13 & 2 & 3 \\ 1 & 13 & 2 & 3 \end{pmatrix}

• Collaborators' search differences (difference matrix and Kolmogorov-Smirnov tests):

Δ^{(300)}_{u_1,u_2} = \begin{pmatrix} 2 & -4 & -1 & -1 \\ 3 & -5 & -1 & -2 \\ 4 & -7 & -1 & -1 \\ 4 & -10 & -2 & -1 \\ 7 & -10 & -2 & -1 \end{pmatrix}

- Number of queries: p^{(t_l)}_{u_1,u_2}(N_q) = 0.01348
- Number of pages: p^{(t_l)}_{u_1,u_2}(N_d) = 0.01348
- Number of comments: p^{(t_l)}_{u_1,u_2}(N_c) = 0.01348
- Number of ratings: p^{(t_l)}_{u_1,u_2}(N_r) = 0.08152
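One way to run the significance test on the matrices above is scipy's two-sample Kolmogorov-Smirnov test; the exact p-values depend on the KS variant and implementation, so they may differ from the slide's:

    from scipy.stats import ks_2samp

    # Cumulative per-minute feature counts (Nq, Nd, Nc, Nr) from the example.
    S_u1 = [[3, 0, 0, 0], [4, 2, 0, 1], [5, 3, 0, 2], [5, 3, 0, 2], [8, 3, 0, 2]]
    S_u2 = [[1, 4, 1, 1], [1, 7, 1, 3], [1, 10, 1, 3], [1, 13, 2, 3], [1, 13, 2, 3]]

    for j, name in enumerate(["Nq", "Nd", "Nc", "Nr"]):
        stat, p = ks_2samp([row[j] for row in S_u1], [row[j] for row in S_u2])
        print(name, stat, round(p, 5))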


USER-DRIVEN SYSTEM-MEDIATED CIR MODELS
MINE USERS' ROLES THEN PERSONALIZE THE SEARCH [SOULIER ET AL., SIGIR 2014A]

Example: Mining the roles of collaborators: matching with role patterns

• Collaborators' search action complementarity: correlation matrix between the search differences:

C^{(300)}_{u_1,u_2} = \begin{pmatrix} 1 & -0.8186713 & -0.731925 & 0 \\ -0.8186713 & 1 & 0.9211324 & 0 \\ -0.731925 & 0.9211324 & 1 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}

• Role mining: comparing the role pattern with the sub-matrix of the collaborators' behaviors
  - Role of reader-querier:

F_{R_{read,querier}} \ominus C^{(300)}_{u_1,u_2} = \begin{pmatrix} 0 & -1 - (-0.8186713) \\ -1 - (-0.8186713) & 0 \end{pmatrix} = \begin{pmatrix} 0 & -0.1813287 \\ -0.1813287 & 0 \end{pmatrix}

The Frobenius norm equals \sqrt{0.1813287^2} = 0.1813287.

  - Role of judge-querier:

F_{R_{judge,querier}} \ominus C^{(300)}_{u_1,u_2} = \begin{pmatrix} 0 & -1 - (-0.731925) \\ -1 - (-0.731925) & 0 \end{pmatrix} = \begin{pmatrix} 0 & -0.268075 \\ -0.268075 & 0 \end{pmatrix}

The Frobenius norm equals \sqrt{0.268075^2} = 0.268075.

→ The collaborators act as reader/querier (the smaller norm), with u1 labeled as querier and u2 as reader (highest N_p).
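A sketch of the pattern-matching step, restricted to the kernel cell of the example (numpy; function and variable names are ours):

    import numpy as np

    def pattern_distance(F, C, kernel):
        # Eq. (27): apply the ⊖ operator on the kernel cells where the pattern
        # is +1/-1, then take the Frobenius norm of the resulting difference.
        D = np.zeros_like(F, dtype=float)
        for (j, k) in kernel:
            if F[j, k] in (-1, 1):
                D[j, k] = F[j, k] - C[j, k]
        return np.linalg.norm(D)

    C = np.array([[1.0, -0.8186713], [-0.8186713, 1.0]])  # (Nq, Nd) sub-matrix
    F = np.array([[1.0, -1.0], [-1.0, 1.0]])              # reader/querier pattern
    print(pattern_distance(F, C, [(0, 1)]))               # 0.1813287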


OVERVIEW OF IR MODELS AND TECHNIQUES

[Comparison table: the CIR models above ([Foley and Smeaton, 2009a], [Morris et al., 2008] "smart-splitting", [Morris et al., 2008] "groupization", [Pickens et al., 2008], [Shah et al., 2010], [Soulier et al., IP&M 2014b], [Soulier et al., SIGIR 2014a]) are contrasted along three axes: relevance (collective vs. individual), evidence source (feedback, interest, expertise, behavior, role), and paradigm (division of labor vs. sharing of knowledge). The per-model check marks were lost in extraction.]


PLAN

1. Collaboration and Information Retrieval

2. Collaborative IR techniques and models

3. Evaluation
   Evaluation challenges
   Protocols
   Metrics and ground truth
   Baselines
   Tools and datasets

4. Challenges ahead

5. Discussion


EVALUATION CHALLENGES

• Learning from user and user-user pastinteractions

• Adaptation to multi-faceted and multi-usercontexts: skills, expertise, role, etc

• Aggregating relevant information nuggets

Evaluating the collective relevance

• Supporting synchronous vs. asynchronouscoordination

• Modeling collaboration paradigms: division oflabor, sharing of knowledge

• Optimizing search cost: balance in work (search)and group benefit (task outcome)

Measuring the collaborativeeffectiveness


PROTOCOLS
CATEGORIES OF PROTOCOLS

• Standard evaluation frameworks
  - Without humans: batch-based evaluation (TREC, INEX, CLEF, ...)
  - With humans in the process (recommended) © [Dumais, 2014]
• CIR-adapted evaluation frameworks

79 / 111

Page 113: Collaborative Information Retrieval: Concepts, Models and Evaluation

1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models Evaluation 4. Challenges ahead 5. Discussion

PROTOCOLS
BATCH: COLLABORATION SIMULATION [MORRIS ET AL., 2008, SHAH ET AL., 2010]

• Real users formulate queries w.r.t. the shared information need
  - 15 individual users were asked to list the queries they would associate to 10 TREC topics; pairs of collaborators were then built randomly [Shah et al., 2010]
  - 10 groups of 3 participants were asked to collaboratively list 6 queries related to the information need [Morris et al., 2008]
• The collaborative rankings are simulated on the participants' queries

Advantages:
• Larger number of experimental tests (parameter tuning, more baselines, ...)
• Less costly and less time-consuming than user studies

Limitations:
• Small manifestation of the collaborative aspects
• No span of the collaborative search session
• Difficult to evaluate the generalization of findings


PROTOCOLS
LOG-STUDY: COLLABORATION SIMULATION [FOLEY AND SMEATON, 2009A, SOULIER ET AL., 2014B]

• Individual search logs (from user studies or official benchmarks)
• Chronological synchronization of the individual search actions
• The collaborative rankings are simulated on the users' queries

Advantages:
• Modeling of a collaborative session
• Larger number of experimental tests (parameter tuning, more baselines, ...)
• Less costly and less time-consuming than user studies

Limitations:
• No manifestation of the collaborative aspects
• Difficult to evaluate the generalization of findings


PROTOCOLS
LOG-STUDIES: COLLABORATIVE SEARCH LOGS [SOULIER ET AL., 2014A]

• Real logs of collaborative search sessions
• The CIR ranking model is launched on the participants' queries

Advantages:
• A step forward to realistic collaborative scenarios
• Queries resulting from a collaborative search process

Limitations:
• Costly and time-consuming, unless data are already available
• Only implicit feedback on the retrieved document lists


PROTOCOLS
USER-STUDIES [PICKENS ET AL., 2008]

• Real users perform the collaborative task
• CIR models are launched in real time in response to the users' actions

Advantages:
• One of the most realistic scenarios (instead of panels)

Limitations:
• Costly and time-consuming
• Controlled laboratory tasks


METRICS
CATEGORIES OF METRICS

Evaluation objectives in collaborative search:
• Measuring the retrieval effectiveness of the ranking models
• Measuring the search effectiveness of the collaborative groups
• Measuring collaborators' satisfaction and cognitive effort
• Analyzing collaborators' behavior

• User-driven metrics/indicators aiming at evaluating:
  - The collaborators' awareness and satisfaction [Aneiros and Morris, 2003, Smyth et al., 2005]
  - The cognitive effort
  - The search outcomes
• System-oriented metrics/indicators aiming at evaluating:
  - The retrieval effectiveness of the ranking models
  - The enforcement of the collaborative paradigms by the ranking models (division of labor)
  - The collaborative relevance of documents (→ ground truth)


METRICS
USER-DRIVEN METRICS

• Search log analysis
  - Behavioral analysis: collaborators' actions [Tamine and Soulier, 2015]

Feature   Description
npq       Average number of visited pages per query
dt        Average time spent between two visited pages
nf        Average number of relevance feedback items (snippets, annotations & bookmarks)
qn        Average number of submitted queries
ql        Average number of query tokens
qo        Average ratio of shared tokens among successive queries
nbm       Average number of exchanged messages within the search group

  - Behavioral analysis: communication channels [Gonzalez-Ibanez et al., 2013, Strijbos et al., 2004]


METRICS
USER-DRIVEN METRICS

• Search log analysis
  - Behavioral analysis: collaborators' actions and communication channels
  - Search outcomes [Shah, 2014]; each metric draws on some of four evidence sources (visited documents, relevant documents, dwell-time, number of visits):

Metric                    Description
(Unique) Coverage         (unique) visited webpages
Likelihood of discovery   number-of-visits-based IDF metric
(Unique) Useful pages     (unique) number of useful pages (visited more than 30 seconds)
Precision                 number of distinct relevant and visited pages over the number of distinct visited pages
Recall                    number of distinct relevant and visited pages over the number of distinct relevant pages
F-measure                 combination of precision and recall


METRICS
USER-DRIVEN METRICS

Exercise

Estimating the search outcome effectiveness of a collaborative search session (Coverage, Relevant Coverage, Precision, Recall, F-measure).

• Let's consider:
  - a collaborative search session involving two users u1 and u2 aiming at solving an information need I,
  - during the session, u1 selected the following documents: {d1, d2, d6, d9, d17, d20},
  - during the session, u2 selected the following documents: {d3, d4, d5, d6, d7},
  - a collection of 20 documents D = {d_i ; i = 1, ..., 20},
  - a ground truth for the information need I: GT_I = {d2, d6, d15}.
• Evaluation metrics:
  - UniqueCoverage(g) = {d1, d2, d3, d4, d5, d6, d7, d9, d17, d20}
  - RelevantCoverage(g) = {d2, d6}
  - Precision(g) = 2/10 = 0.2
  - Recall(g) = 2/3 ≈ 0.67
  - F-measure(g) = (2 · 0.2 · 0.67) / (0.2 + 0.67) ≈ 0.31
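The same computation as a runnable sketch (names are ours; set print order may vary):

    def group_outcome(selected_by_user, ground_truth):
        # Unique coverage, relevant coverage, precision, recall and F-measure
        # of a group session, following the exercise above.
        coverage = set().union(*selected_by_user.values())
        rel = coverage & ground_truth
        p = len(rel) / len(coverage)
        r = len(rel) / len(ground_truth)
        f = 2 * p * r / (p + r) if p + r else 0.0
        return coverage, rel, p, r, f

    cov, rel, p, r, f = group_outcome(
        {"u1": {"d1", "d2", "d6", "d9", "d17", "d20"},
         "u2": {"d3", "d4", "d5", "d6", "d7"}},
        {"d2", "d6", "d15"})
    print(len(cov), rel, p, round(r, 2), round(f, 2))  # 10, {d2, d6}, 0.2, 0.67, 0.31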


METRICS
USER-DRIVEN METRICS

• Questionnaires and interviews
  - The "TLX instrument form": measuring the cognitive effort
  - Satisfaction interviews [Shah and Gonzalez-Ibanez, 2011a, Tamine and Soulier, 2015]

Question                                                            Answer type
Have you already participated in such a user study?                 Free answer
If yes, please describe it.
What do you think about this collaborative manner of                Free answer
seeking information?
What was the level of difficulty of the task?                       a) Easy (not difficult) b) Moderately difficult c) Difficult
What was task difficulty related to?                                Free answer
Could you say that the collaborative system supports your search?   a) Yes b) Not totally c) Not at all
How could we improve this system?                                   Free answer


METRICS
SYSTEM-ORIENTED METRICS [SOULIER ET AL., 2014A]

• The precision Prec@R(g) at rank R of a collaborative group g:

Prec@R(g) = \frac{1}{|T(g)|} \sum_{t=1}^{|T(g)|} Prec@R(g)(t) = \frac{1}{|T(g)|} \sum_{t=1}^{|T(g)|} \frac{RelCov@R(g)(t)}{Cov@R(g)(t)}    (28)

• The recall Recall@R(g) at rank R of group g:

Recall@R(g) = \frac{1}{|T(g)|} \sum_{t=1}^{|T(g)|} Recall@R(g)(t) = \frac{1}{|T(g)|} \sum_{t=1}^{|T(g)|} \frac{RelCov@R(g)(t)}{|RelDoc|}    (29)

• The F-measure F@R(g) at rank R of a collaborative group g:

F@R(g) = \frac{1}{|T(g)|} \sum_{t=1}^{|T(g)|} \frac{2 \cdot Prec@R(g)(t) \cdot Recall@R(g)(t)}{Prec@R(g)(t) + Recall@R(g)(t)}    (30)
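A sketch of (28)-(30), reading Cov@R as the pooled top-R results of the collaborators at step t and RelCov@R as its intersection with the ground truth (this reading and all names are ours):

    def group_metrics_at_r(rankings_per_step, relevant, R=30):
        # Eqs. (28)-(30): per-step pooled precision/recall/F, averaged over
        # the session steps T(g); rankings_per_step[t] holds one ranked list
        # per collaborator.
        precs, recs, fs = [], [], []
        for user_rankings in rankings_per_step:
            cov = set().union(*(set(rk[:R]) for rk in user_rankings))
            rel = cov & relevant
            p = len(rel) / len(cov) if cov else 0.0
            r = len(rel) / len(relevant)
            precs.append(p)
            recs.append(r)
            fs.append(2 * p * r / (p + r) if p + r else 0.0)
        n = len(rankings_per_step)
        return sum(precs) / n, sum(recs) / n, sum(fs) / n

    # Reproduces the q1-q2 row of the example below: (0.2, 0.33..., 0.25).
    print(group_metrics_at_r([[["d1", "d2", "d3"], ["d2", "d8", "d14"]]],
                             {"d2", "d6", "d15"}, R=3))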


METRICS
SYSTEM-ORIENTED METRICS AND GROUND TRUTH

Example: Estimating the retrieval effectiveness of the rankings of CIR models (Coverage, Relevant Coverage, Precision, Recall, F-measure).

Ground truth: GT_I = {d2, d6, d15}

Query   Document ranking
q1      d1, d2, d3
q2      d2, d8, d14
q3      d17, d3, d8
q4      d9, d15, d2
q5      d1, d5, d3
q6      d20, d3, d1
q7      d5, d2, d4


METRICS
SYSTEM-ORIENTED METRICS AND GROUND TRUTH

Example: Estimating the retrieval effectiveness of the rankings of CIR models.

Evaluation metrics:

Query pairs   Coverage                   Relevant Coverage   Precision   Recall   F-measure
q1-q2         d1, d2, d3, d8, d14        d2                  1/5         1/3      0.25
q2-q3         d2, d8, d14, d17, d3       d2                  1/5         1/3      0.25
q3-q4         d17, d3, d8, d9, d15       d15                 1/5         1/3      0.25
q3-q7         d17, d3, d8, d5, d2, d4    d2                  1/6         1/3      0.22
q5-q7         d1, d3, d5, d2, d4         -                   0           0        0
q6-q7         d20, d3, d1, d5, d2, d4    d2                  1/6         1/3      0.22
Average                                                      0.16        0.28     0.20


METRICS
GROUND TRUTH

• Evidence sources:
  - From relevance assessments [Morris et al., 2008]
  - From individual search logs [Foley and Smeaton, 2009b, Soulier et al., 2014b]
  - From collaborative search logs [Shah and Gonzalez-Ibanez, 2011b, Soulier et al., 2014a]
• Importance of considering an agreement level of at least two users (belonging to different groups?) [Shah and Gonzalez-Ibanez, 2011b, Soulier et al., 2014a]


BASELINES

• Benefit of the collaboration
  - Individual models: BM25, LM, ...
  - Search logs of individual searches
• Collaboration optimization through algorithmic mediation
  - User-driven approach with collaborative interfaces
• Benefit of roles
  - Role-based vs. no-role CIR models [Foley and Smeaton, 2009b, Morris et al., 2008]
  - Dynamic vs. predefined CIR models [Pickens et al., 2008, Shah et al., 2010]
• ...


TOOLS AND DATASETS

• Simulation-based evaluation
  - TREC Interactive dataset [Over, 2001]
  - Other available search logs (TREC, CLEF, proprietary, ...)
• Log-studies
  - Collaborative dataset [Tamine and Soulier, 2015]
• User-studies
  - Open-source Coagmento plugin [Shah and Gonzalez-Ibanez, 2011a]: http://www.coagmento.org/collaboraty.php


PLAN

1. Collaboration and Information Retrieval

2. Collaborative IR techniques and models

3. Evaluation

4. Challenges ahead
   Theoretical foundations of CIR
   Empirical evaluation of CIR
   Open ideas

5. Discussion


THEORETICAL FOUNDATIONS OF CIR

• Towards a novel probabilistic framework of relevance for CIR
  - What is a "good ranking" with regard to the expected synergic effect of collaboration?
• Dynamic IR models for CIR
  - How to optimize long-term gains over multiple users, user-user interactions, user-system interactions and multi-session searches?
  - How to formalize the division of labor through the evolution of users' information needs over time?
• Towards an axiomatic approach of relevance for CIR
  - Are IR heuristics similar to CIR heuristics?
  - Can relevance towards a group be modeled by a set of formally defined constraints on a retrieval function?


EVALUATION OF CIR

• Multiple facets of system performance
  - Should we measure performance in terms of gain per time, effort gain per user, effectiveness of outcomes, or all of these as a whole?
  - How do we delineate the performance of the system from the performance and interaction of the users?
• Robust experiments for CIR
  - Should the experimental evaluation protocol be task-dependent?
  - Are the simulated work tasks used in IIR reasonable scenarios for evaluating CIR?
  - How to build data collections allowing reproducible experiments and supporting robust statistical tests?


OPEN IDEAS

• Multi-level CIR [Htun et al., 2015]
  - Non-uniform information access within the group
  - Application domains: legal, military, ...
• Collaborative group building
  - Task-based group building (information search, synthesis, sense-making, question-answering, ...)
  - Leveraging users' knowledge, collaboration abilities, and perception of the information need
• Socio-collaborative IR [Morris, 2013]
  - Web search vs. social networking [Oeldorf-Hirsch et al., 2014]
  - Leveraging the crowd to solve a user's information need


PLAN

1. Collaboration and Information Retrieval

2. Collaborative IR techniques and models

3. Evaluation

4. Challenges ahead

5. Discussion


DISCUSSION


REFERENCES I

Amer-Yahia, S., Benedikt, M., and Bohannon, P. (2007). Challenges in Searching Online Communities. IEEE Data Engineering Bulletin, 30(2):23–31.

Amershi, S. and Morris, M. R. (2008). CoSearch: a system for co-located collaborative web search. In Proceedings of the Conference on Human Factors in Computing Systems, CHI '08, pages 1647–1656. ACM.

Aneiros, M. and Morris, M. R. (2003). Foundation of unconstrained collaborative web browsing with awareness. In Proceedings of the International Conference on Web Intelligence, WI '02, pages 8–25. ACM/IEEE.

Brin, S. and Page, L. (1998). The Anatomy of a Large-scale Hypertextual Web Search Engine. Computer Networks and ISDN Systems, 30(1-7):107–117.

Capra, R. (2013). Information Seeking and Sharing in Design Teams. In Proceedings of the ASIS&T Annual Meeting, ASIS&T '13, pages 239–247. American Society for Information Science.

Diriye, A. and Golovchinsky, G. (2012). Querium: A session-based collaborative search system. In Proceedings of the European Conference on Advances in Information Retrieval, ECIR '12, pages 583–584. Springer.

Dumais, S. T. (2014). Putting searchers into search. In Proceedings of the International ACM SIGIR Conference on Research and Development in Information Retrieval, pages 1–2.


REFERENCES II

Erickson, T. (2010). A Social Proxy for Collective Search. In Proceedings of the International Workshop on Collaborative Information Seeking, CSCW '10. ACM.

Evans, B. M. and Chi, E. H. (2010). An elaborated model of social search. Information Processing & Management (IP&M), 46(6):656–678.

Foley, C. and Smeaton, A. F. (2009a). Evaluation of Coordination Techniques in Synchronous Collaborative Information Retrieval. CoRR, abs/0908.0.

Foley, C. and Smeaton, A. F. (2009b). Synchronous Collaborative Information Retrieval: Techniques and Evaluation. In Proceedings of the European Conference on Advances in Information Retrieval, ECIR '09, pages 42–53. Springer.

Foley, C. and Smeaton, A. F. (2010). Division of Labour and Sharing of Knowledge for Synchronous Collaborative Information Retrieval. Information Processing & Management (IP&M), 46(6):762–772.

Foley, C., Smeaton, A. F., and Jones, G. (2008). Collaborative and Social Information Retrieval and Access: Techniques for Improved User Modeling, chapter Combining. IGI Global.

Foster, J. (2006). Collaborative information seeking and retrieval. Annual Review of Information Science & Technology (ARIST), 40(1):329–356.


REFERENCES III

Fuhr, N. (2008). A probability ranking principle for interactive information retrieval. Information Retrieval, 11(3):251–265.

Gauch, S., Chaffee, J., and Pretschner, A. (2003). Ontology-based Personalized Search and Browsing. Web Intelligence and Agent Systems (WIAS), 1(3-4):219–234.

Golovchinsky, G., Adcock, J., Pickens, J., Qvarfordt, P., and Back, M. (2008). Cerchiamo: a collaborative exploratory search tool. Proceedings of the Demo in Computer Supported Cooperative Work.

Golovchinsky, G., Diriye, A., and Pickens, J. (2011). Designing for Collaboration in Information Seeking. Proceedings of the ASIS&T Annual Meeting.

Golovchinsky, G., Pickens, J., and Back, M. (2009). A Taxonomy of Collaboration in Online Information Seeking. In Proceedings of the International Workshop on Collaborative Information Retrieval, CIR '09.

Gonzalez-Ibanez, R., Haseki, M., and Shah, C. (2013). Let's search together, but not too close! An analysis of communication and performance in collaborative information seeking. Information Processing & Management (IP&M), 49(5):1165–1179.

Gray, B. (1989). Collaborating: finding common ground for multiparty problems. Jossey Bass Business and Management Series. Jossey-Bass.


REFERENCES IV

Han, S., He, D., Yue, Z., and Jiang, J. (2016). Contextual support for collaborative information retrieval. In Proceedings of the International ACM SIGIR Conference on Human Information Interaction and Retrieval.

Hansen, P. and Järvelin, K. (2005). Collaborative information retrieval in an information-intensive domain. Information Processing & Management (IP&M), 41(5):1101–1119.

Hansen, P., Shah, C., and Klas, C.-P. (2015). Collaborative Information Seeking: Best Practices, New Domains and New Thoughts.

Htun, N. N., Halvey, M., and Baillie, L. (2015). Towards quantifying the impact of non-uniform information access in collaborative information retrieval. In Proceedings of the International ACM SIGIR Conference on Research and Development in Information Retrieval, pages 843–846.

Imazu, M., Nakayama, S.-i., and Joho, H. (2011). Effect of Explicit Roles on Collaborative Search in Travel Planning Task. In Proceedings of the Asia Information Retrieval Societies Conference, AIRS '11, pages 205–214. Springer.

Jin, X., Sloan, M., and Wang, J. (2013). Interactive Exploratory Search for Multi Page Search Results. In Proceedings of the International Conference on World Wide Web, WWW '13, pages 655–666. ACM.

Joho, H., Hannah, D., and Jose, J. (2009). Revisiting IR Techniques for Collaborative Search Strategies. In Proceedings of the European Conference on Advances in Information Retrieval, ECIR '09, pages 66–77. Springer.

Karunakaran, A., Reddy, M. C., and Spence, P. R. (2013). Toward a model of collaborative information behavior in organizations. Journal of the Association for Information Science and Technology (JASIST), 64(12):2437–2451.


REFERENCES V

Kelly, R. and Payne, S. J. (2013). Division of labour in collaborative information seeking: Current approaches and future directions. In Proceedings of the International Workshop on Collaborative Information Seeking, CSCW '13. ACM.

Kraft, R., Maghoul, F., and Chang, C. C. (2005). Y!Q: Contextual Search at the Point of Inspiration. In Proceedings of the Conference on Information and Knowledge Management, CIKM '05, pages 816–823. ACM.

Lavrenko, V. and Croft, W. B. (2001). Relevance based language models. In Proceedings of the Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR '01, pages 120–127. ACM.

Liu, F., Yu, C., and Meng, W. (2004). Personalized Web Search For Improving Retrieval Effectiveness. IEEE Transactions on Knowledge and Data Engineering (TKDE), 16(1):28–40.

Morris, M. R. (2008). A survey of collaborative web search practices. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '08, pages 1657–1660. ACM.

Morris, M. R. (2013). Collaborative Search Revisited. In Proceedings of the Conference on Computer Supported Cooperative Work, CSCW '13, pages 1181–1192. ACM.

Morris, M. R., Lombardo, J., and Wigdor, D. (2010). WeSearch: supporting collaborative search and sensemaking on a tabletop display. In Proceedings of the Conference on Computer Supported Cooperative Work, CSCW '10, pages 401–410. ACM.


REFERENCES VI

Morris, M. R., Paepcke, A., and Winograd, T. (2006). TeamSearch: Comparing Techniques for Co-Present Collaborative Search of Digital Media. In Proceedings of the International Workshop on Horizontal Interactive Human-Computer Systems, Tabletop '06, pages 97–104. IEEE Computer Society.

Morris, M. R. and Teevan, J. (2009). Collaborative Web Search: Who, What, Where, When, and Why. Synthesis Lectures on Information Concepts, Retrieval, and Services. Morgan & Claypool Publishers.

Morris, M. R., Teevan, J., and Bush, S. (2008). Collaborative Web Search with Personalization: Groupization, Smart Splitting, and Group Hit-highlighting. In Proceedings of the Conference on Computer Supported Cooperative Work, CSCW '08, pages 481–484. ACM.

Oeldorf-Hirsch, A., Hecht, B., Morris, M. R., Teevan, J., and Gergle, D. (2014). To Search or to Ask: The Routing of Information Needs Between Traditional Search Engines and Social Networks. In Proceedings of the Conference on Computer Supported Cooperative Work, CSCW '14, pages 16–27. ACM.

Over, P. (2001). The TREC interactive track: an annotated bibliography. Information Processing & Management (IP&M), 37(3):369–381.

Pal, A. and Counts, S. (2011). Identifying topical authorities in microblogs. In Proceedings of the Conference on Web Search and Data Mining, WSDM '11, pages 45–54. ACM.

Pickens, J., Golovchinsky, G., Shah, C., Qvarfordt, P., and Back, M. (2008). Algorithmic Mediation for Collaborative Exploratory Search. In Proceedings of the Annual International SIGIR Conference on Research and Development in Information Retrieval, SIGIR '08, pages 315–322. ACM.


REFERENCES VII

Ponte, J. M. and Croft, W. B. (1998).

A language modeling approach to information retrieval.In Proceedings of the Annual International SIGIR Conference on Research and Development in Information Retrieval, SIGIR ’98, pages 275–281.ACM.

Resnick, P., Iacovou, N., Suchak, M., Bergstrom, P., and Riedl, J. (1994).

GroupLens: An Open Architecture for Collaborative Filtering of Netnews.In Proceedings of the Conference on Computer Supported Cooperative Work, CSCW ’94, pages 175–186. ACM.

Robertson, S. E. and Walker, S. (1994).

Some simple effective approximations to the 2-Poisson model for probabilistic weighted retrieval.In Proceedings of the Annual International SIGIR Conference on Research and Development in Information Retrieval, SIGIR ’94, pages 232–241.ACM.

Robertson, S. E., Walker, S., Jones, S., Hancock-Beaulieu, M., and Gatford, M. (1995).

Okapi at TREC-3.In Proceedings of the Text retrieval conference-3 (TREC-3), TREC ’95, pages 109–126.

Rocchio, J. J., editor (1971).

Relevance Feedback in Information Retrieval.Prentice Hall.

Rodriguez Perez, J. A., Whiting, S., and Jose, J. M. (2011).

CoFox: A visual collaborative browser. In Proceedings of the International Workshop on Collaborative Information Retrieval, CIKM ’11. ACM.

Salton, G. (1971).

A comparison between manual and automatic indexing methods. American Documentation, 20(1):61–71.

REFERENCES VIII

Shah, C. (2010).

Working in Collaboration - What, Why, and How? In Proceedings of the International Workshop on Collaborative Information Seeking, CSCW ’10. ACM.

Shah, C. (2011).

A framework for supporting user-centric collaborative information seeking. SIGIR Forum, 45(2):88. ACM.

Shah, C. (2012).

Collaborative Information Seeking - The Art and Science of Making the Whole Greater than the Sum of All. The Information Retrieval Series. Springer, pages I–XXI, 1–185.

Shah, C. (2014).

Evaluating collaborative information seeking - synthesis, suggestions, and structure. Journal of Information Science (JIS), 40(4):460–475.

Shah, C. and Gonzalez-Ibanez, R. (2010).

Exploring Information Seeking Processes in Collaborative Search Tasks. In Proceedings of the ASIS&T Annual Meeting, ASIS&T ’10, pages 60:1–60:10. American Society for Information Science.

Shah, C. and Gonzalez-Ibanez, R. (2011a).

Coagmento - A System for Supporting Collaborative Information Seeking. Demo in Proceedings of the Association for Information Science and Technology Annual Meeting, ASIST ’11, pages 9–12.

Shah, C. and Gonzalez-Ibanez, R. (2011b).

Evaluating the Synergic Effect of Collaboration in Information Seeking. In Proceedings of the Annual International SIGIR Conference on Research and Development in Information Retrieval, SIGIR ’11, pages 913–922. ACM.

REFERENCES IX

Shah, C. and Marchionini, G. (2010).

Awareness in collaborative information seeking. Journal of the American Society for Information Science and Technology (JASIST), 61(10):1970–1986.

Shah, C., Pickens, J., and Golovchinsky, G. (2010).

Role-based results redistribution for collaborative information retrieval. Information Processing & Management (IP&M), 46(6):773–781.

Smeaton, A. F., Foley, C., Gurrin, C., Lee, H., and McGivney, S. (2006).

Collaborative Searching for Video Using the Fischlar System and a DiamondTouch Table. In Proceedings of the International Workshop on Horizontal Interactive Human-Computer Systems, Tabletop ’06, pages 151–159. IEEE Computer Society.

Smyth, B., Balfe, E., Boydell, O., Bradley, K., Briggs, P., Coyle, M., and Freyne, J. (2005).

A live-user evaluation of collaborative web search. In Proceedings of the International Joint Conference on Artificial Intelligence, IJCAI ’05, pages 1419–1424.

Soulier, L., Shah, C., and Tamine, L. (2014a).

User-driven System-mediated Collaborative Information Retrieval. In Proceedings of the Annual International SIGIR Conference on Research and Development in Information Retrieval, SIGIR ’14, pages 485–494. ACM.

Soulier, L., Tamine, L., and Bahsoun, W. (2014b).

On domain expertise-based roles in collaborative information retrieval. Information Processing & Management (IP&M), 50(5):752–774.

Strijbos, J.-W., Martens, R. L., Jochems, W. M. G., and Broers, N. J. (2004).

The Effect of Functional Roles on Group Efficiency: Using Multilevel Modeling and Content Analysis to Investigate Computer-Supported Collaboration in Small Groups. Small Group Research, 35(2):195–229.

REFERENCES X

Tamine, L. and Soulier, L. (2015).

Understanding the impact of the role factor in collaborative information retrieval. In Proceedings of the ACM International Conference on Information and Knowledge Management, CIKM ’15, pages 43–52. ACM.

Tao, Y. and Tombros, A. (2013).

An Exploratory Study of Sensemaking in Collaborative Information Seeking. In Proceedings of the European Conference on Advances in Information Retrieval, ECIR ’13, pages 26–37. Springer.

Teevan, J., Dumais, S. T., and Horvitz, E. (2005).

Personalizing Search via Automated Analysis of Interests and Activities. In Proceedings of the Annual International SIGIR Conference on Research and Development in Information Retrieval, SIGIR ’05, pages 449–456. ACM.

Thomson, A. M. and Perry, J. L. (2006).

Collaboration Processes: Inside the Black Box. Public Administration Review, 66:20–32.

Vivian, R. and Dinet, J. (2008).

RCI WEB : un système collaboratif de recherche d’information centré utilisateur [RCI WEB: a user-centered collaborative information retrieval system]. Revue des Interactions Humaines Médiatisées, 9(2):85–110.

Yue, Z., Han, S., He, D., and Jiang, J. (2014).

Influences on Query Reformulation in Collaborative Web Search. IEEE Computer, 47(3):46–53.
