why aren't evaluators using digital media analytics?

Why aren’t evaluators taking advantage of Digital Media Analytics? From social media to open web sources.

Giles Crouch, CEO, [email protected]

Tasha Truant, Consultant Manager, [email protected]

Uploaded by cestoronto on 19 May 2015


DESCRIPTION

Whether it’s through blogs, tweets, or even the comments section of an online newspaper, the world is increasingly talking online. However, the potential uses for the massive amounts of information available on the internet remain largely untapped in the sphere of evaluation. This presentation will explore innovative methods to extract these insights from the large and complex collections of digital data publicly available online. In particular, we will examine the unprecedented uses, and potential limitations, of digital media analytics to:

• Measure the outcomes of public outreach, advocacy, communications, and information sharing programs;

• Establish current and retroactive baselines;

• Conduct “borderless” data collection to gain insights from other countries, as well as diaspora communities in Canada;

• Identify unknown stakeholder groups and create detailed stakeholder maps; and,

• Provide context and insight to inform further data collection.

TRANSCRIPT

Page 1: Why aren't Evaluators using Digital Media Analytics?

Why aren’t evaluators taking advantage of Digital Media Analytics? From social media to open web sources

Giles Crouch, CEO, [email protected]

Tasha Truant, Consultant Manager, [email protected]

Page 2: Why aren't Evaluators using Digital Media Analytics?

agenda

The Internet | Breaking Assumptions

What is Cyber Analytics?

Examples

Strengths and Challenges

Applications and Opportunities for Evaluation

Case Studies

MediaBadger + Goss Gilroy

Page 3: Why aren't Evaluators using Digital Media Analytics?

the internet

Page 4: Why aren't Evaluators using Digital Media Analytics?

dispelling some myths

Average age of social media users is 36 (USA, Canada, UK, EU)

The 55+ demographic is the fastest growing

25% of all content uploaded now comes from a mobile device

Connecting to social media & the web is no longer done only at home from a desktop

Approximately 40% of the developing world is on social media

Pages 5-7: Why aren't Evaluators using Digital Media Analytics? (image-only slides)

what social media can tell us

Sentiment on issues (societal, political, etc.); see the sketch after this list

Key influencers and their networks

Cultural behaviours and actions (e.g. Kenya: local social media versus Facebook)

Myths and narratives on issues, e.g. “tar sands are acidic” (in reality, bitumen is less acidic than conventional oil and gas). It is important to understand myths and how they turn into narratives, because narratives are very difficult to change.
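To make the sentiment point concrete, here is a minimal Python sketch of issue-level sentiment scoring. It is illustrative only, not MediaBadger's actual method: the word lists, the sample posts, and the +1/-1 scoring rule are all assumptions.

```python
# Minimal sentiment-scoring sketch (illustrative; not MediaBadger's actual method).
# The lexicon and the sample posts below are invented assumptions.

POSITIVE = {"good", "great", "support", "love", "helpful"}
NEGATIVE = {"bad", "toxic", "acidic", "oppose", "harmful"}

def score_post(text: str) -> int:
    """Return +1, -1, or 0 for one post based on simple word counts."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    pos = len(words & POSITIVE)
    neg = len(words & NEGATIVE)
    return (pos > neg) - (neg > pos)  # +1 positive, -1 negative, 0 neutral/mixed

posts = [  # hypothetical public posts on an issue
    "Tar sands are acidic and harmful",
    "Great to see support for local monitoring",
    "The report was helpful, but the rollout was bad",
]

scores = [score_post(p) for p in posts]
print("net sentiment:", sum(scores) / len(scores))  # crude issue-level signal
```

Real systems use far richer models, but the aggregation step (many small judgements rolled up into an issue-level trend) is the same idea.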

Page 8: Why aren't Evaluators using Digital Media Analytics?

what social media can tell us

Demographics: all social media apps record certain metadata, such as location (geo), time, and IP address

Keywords: when gender isn’t identified, face recognition on profile pictures is used

Over 60% of posts contain a person mentioning where they are from, including items such as tribe (e.g. for First Nations) and often city/province

A combination of these identifiers is used to create a user profile (like the Target example from yesterday); a rough sketch follows below

Margin of error is between 4% and 6%
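As a rough illustration of how scattered identifiers can be combined into a profile, here is a small Python sketch. The field names, sample metadata, and matching rules are hypothetical assumptions, not the actual profiling pipeline described in the talk.

```python
# Sketch: combining post-level identifiers into a coarse user profile.
# All field names and sample values are hypothetical assumptions.

from collections import Counter

posts = [  # invented metadata drawn from public posts by a single account
    {"geo": "Fredericton, NB", "time": "2010-12-13T08:10",
     "text": "Proud member of my First Nation, back home in Fredericton"},
    {"geo": "Fredericton, NB", "time": "2010-12-14T19:55",
     "text": "River is rising fast near the bridge"},
    {"geo": None, "time": "2010-12-15T07:30",
     "text": "Staying with family in Moncton tonight"},
]

def build_profile(posts):
    profile = {"n_posts": len(posts)}
    # Location: take the most frequently reported geo tag, if any.
    geos = Counter(p["geo"] for p in posts if p["geo"])
    if geos:
        profile["likely_location"] = geos.most_common(1)[0][0]
    # Self-identification: keep the first simple self-description found in the text.
    for p in posts:
        if "member of" in p["text"].lower():
            profile["self_identification"] = p["text"]
            break
    return profile

print(build_profile(posts))
```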

Page 9: Why aren't Evaluators using Digital Media Analytics?

how we make sense of all this data

Page 10: Why aren't Evaluators using Digital Media Analytics?

examples of use | applications

Page 11: Why aren't Evaluators using Digital Media Analytics?

2010 New Brunswick Flood Event

Historical Trends | Communications Patterns | Citizen Behaviours

• Examined how people used social media immediately before, during, and after the flood event (a rough sketch of this kind of before/during/after bucketing follows below)

• Findings: citizens paid very little attention to crisis communications from authorities

• Instead, they relied on each other, posting pictures and videos and warning people in forums, etc., about where to go or not go

• The results of the research led to an overhaul of emergency communications and of how authorities use social media
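A rough sketch of the before/during/after bucketing used in this kind of analysis is below. The event window, the sample posts, and the "cites an official advisory" flag are assumptions for illustration, not the study's actual data.

```python
# Sketch: bucketing public posts around an event window (illustrative data only).
from datetime import date

EVENT_START, EVENT_END = date(2010, 12, 13), date(2010, 12, 16)  # assumed flood window

posts = [  # (post date, does it reference an official advisory?) - invented
    (date(2010, 12, 11), False),
    (date(2010, 12, 14), False),
    (date(2010, 12, 14), True),
    (date(2010, 12, 18), False),
]

def bucket(d: date) -> str:
    if d < EVENT_START:
        return "before"
    if d <= EVENT_END:
        return "during"
    return "after"

counts = {}
for posted_on, cites_official in posts:
    key = (bucket(posted_on), cites_official)
    counts[key] = counts.get(key, 0) + 1

# Compare how often posts in each period reference official communications.
print(counts)
```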

Page 12: Why aren't Evaluators using Digital Media Analytics?

a connected society

Haiti

n=3000 (!)

Page 13: Why aren't Evaluators using Digital Media Analytics?

strengths | challenges going forward

Page 14: Why aren't Evaluators using Digital Media Analytics?

difference of approach

Traditional (survey):
Limited data
Time snapshot (a specific point)
Curated & pristine (must be clean)
n=xxxx (defined, limited)
Active bias (bias in asking the questions)

Cyber research:
Massive data
Historical data (1985 onward)
Chaotic accuracy (messiness is OK)
N=ALL (exponential)
Passive bias (listening bias)

Page 15: Why aren't Evaluators using Digital Media Analytics?

premise for use in evaluations

Evaluators spend a lot of time and money trying to find out how people perceive, and are affected by, programs, policies, and organizations

The world is increasingly voicing its opinions online

Methods exist to extract insights from the large and complex collections of digital data openly available online

Page 16: Why aren't Evaluators using Digital Media Analytics?

possible applications

Impact of public policy programmes + projects: large data sets, near real-time, historical trending & sentiment

Inform research design: supports field research, focus groups & interviews

Use alongside traditional lines of evidence

Establish baselines for analysis & future monitoring (a rough sketch follows below)
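To illustrate the baseline idea, here is a minimal Python sketch. The monthly mention counts are invented placeholders standing in for queried archive data, and the two-standard-deviation rule is an assumed convention, not a prescribed method.

```python
# Sketch: a simple volume baseline from historical monthly mention counts.
# The numbers below are invented placeholders for queried archive data.

from statistics import mean, pstdev

monthly_mentions = [120, 135, 110, 150, 140, 125]  # hypothetical pre-program months

baseline_mean = mean(monthly_mentions)   # 130.0
baseline_sd = pstdev(monthly_mentions)

def is_notable(new_month_count: int, threshold_sd: float = 2.0) -> bool:
    """Flag a post-program month that departs clearly from the baseline."""
    return abs(new_month_count - baseline_mean) > threshold_sd * baseline_sd

print(is_notable(210), is_notable(138))  # True, False
```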

Page 17: Why aren't Evaluators using Digital Media Analytics?

opportunities: measurement & context

Measure the outcomes of public outreach, communication, advocacy, and information sharing programs

Create detailed stakeholder maps: maps of social networks that capture relationships, provide insight into their nature, and identify unknown stakeholders and influencers (see the sketch after this list)

Provide context and insight to inform further data collection, e.g. country profiles of internet/social media usage

Historical trends: take a snapshot of these data at several points in time
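A stakeholder map can be thought of as a graph of who mentions or replies to whom. The sketch below uses an invented edge list and treats inbound mentions as a rough influence signal; it is a simplified stand-in, not the mapping method used in practice.

```python
# Sketch: a crude stakeholder map from a hypothetical mention/reply edge list.
# Accounts mentioned by many others are treated as likely influencers.

from collections import Counter, defaultdict

mentions = [  # (source account, mentioned account) - invented data
    ("resident_a", "ngo_riverwatch"),
    ("resident_b", "ngo_riverwatch"),
    ("resident_b", "city_hall"),
    ("journalist_x", "ngo_riverwatch"),
    ("ngo_riverwatch", "city_hall"),
]

inbound = Counter(target for _, target in mentions)
mentioned_by = defaultdict(set)
for source, target in mentions:
    mentioned_by[target].add(source)

# Rank accounts by how often others point at them, and show who does the pointing.
for account, count in inbound.most_common():
    print(account, "mentioned", count, "times by", sorted(mentioned_by[account]))
```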

Page 18: Why aren't Evaluators using Digital Media Analytics?

opportunities: respondent groups

Traditional methodologies have a hard time reaching certain respondent groups

Useful for gathering unvarnished views from groups that have the access and means to get online but benefit from anonymity. Useful for:

Data collection in sensitive environments (e.g. post-conflict zones);

Obtaining views on issues people are quiet about in person (e.g. racism);

Gathering perceptions from beneficiaries averse to authority (vulnerable and marginalized populations, criminal offenders, youth)

Page 19: Why aren't Evaluators using Digital Media Analytics?

When there could have been Sentiment Analysis in Evaluation

Arts Promotion: Program aims to build stronger citizen engagement in communities through the performing and visual arts and in the expression, celebration and preservation of local historical heritage.

“Limited evidence gathered to fully assess the ultimate outcomes of the program, stemming from … the fact that the evaluation team could not gather direct views from a representative sample of volunteers and the general public.”

Strengthen Civil Society: Program’s goal is to promote resilient, healthy and just communities and support processes that strengthen civil society.

“Determining the extent to which [the program] has increased the knowledge or actions taken by Canadians in food, environmental and biodiversity issues was not possible within the scope and budget of this evaluation in the absence of survey data or a baseline”.

Page 20: Why aren't Evaluators using Digital Media Analytics?

When there could have been Sentiment Analysis in Evaluation

Canadian Culture: Program aims to develop Canadian writers, and to publish and disseminate their books effectively in Canada and abroad. The ultimate program outcome is “Increased access to a diverse range of Canadian-authored books in Canada and abroad”.

“Certain program outcomes could not be directly measured during the evaluation… For example, in order to measure indicators such as “Increased Awareness” and “Increased Access”, the evaluation has had to use proxy indicators (e.g. sales) to infer awareness and access.”

Immigration/Settlement: Program aims to contribute to improving labour market integration outcomes of foreign-trained individuals in targeted occupations and sectors.

In most areas of anticipated outcomes, there was not a baseline measure of these outcomes at the point of implementation of the program. As a result, measurements of change or improvements rely on the recall and opinion of current respondents.

Page 21: Why aren't Evaluators using Digital Media Analytics?

biases + challenges

survey:
Trying to locate users/beneficiaries
With small amounts of data, accuracy is key
Locating hard-to-reach populations
Sampling biases (whom to survey?)
Selection biases (people who respond to surveys)

sentiment analysis:
Production of intentionally misleading content (i.e. astro-turfing); a rough sketch of one countermeasure follows this list
Overly aggressive behaviours (troll tendencies)
Difficulty detecting sarcasm and irony
Selection biases (people who post opinions online)
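One simple countermeasure to astro-turfing is flagging near-identical posts coming from different accounts. The sketch below uses plain token overlap on invented posts; it is only a rough stand-in for the more robust coordinated-behaviour detection an analytics provider would use.

```python
# Sketch: flag near-duplicate posts from different accounts (possible astro-turfing).
# The posts and the 0.8 similarity threshold are illustrative assumptions.

from itertools import combinations

posts = [
    ("acct_1", "This program is a total waste of taxpayer money"),
    ("acct_2", "this program is a total waste of taxpayer money!"),
    ("acct_3", "Loved the workshop in my community last week"),
]

def tokens(text: str) -> set:
    return {w.strip(".,!?").lower() for w in text.split()}

def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b)

for (user1, text1), (user2, text2) in combinations(posts, 2):
    if user1 != user2 and jaccard(tokens(text1), tokens(text2)) > 0.8:
        print("possible coordinated posting:", user1, user2)
```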

Page 22: Why aren't Evaluators using Digital Media Analytics?

Privacy and Ethics

All data collected are publicly accessible; no private data are accessed.

MediaBadger is in full compliance with Canada’s privacy law (PIPEDA) at all times.

MediaBadger is reviewing membership in industry associations that provide clear, ethical guidelines in line with our values.

Digital Media Analytics and Sentiment Analysis maintain the confidentiality of all respondents; responses are aggregated in ways similar to traditional lines of evidence (i.e. key informant interviews).
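As a concrete illustration of reporting only aggregates, here is a small sketch that suppresses groups below a minimum size before anything is published. The minimum group size of five is an assumed policy for illustration, not a PIPEDA requirement or MediaBadger's actual rule.

```python
# Sketch: report aggregated counts only, suppressing groups too small to stay anonymous.
# The minimum group size (5) is an assumed policy, not a regulatory threshold.

from collections import Counter

post_regions = ["NB"] * 6 + ["NS"] * 3 + ["PE"]  # hypothetical region tags from public posts

MIN_GROUP = 5

counts = Counter(post_regions)
report = {region: (n if n >= MIN_GROUP else "suppressed (<5)") for region, n in counts.items()}
print(report)  # {'NB': 6, 'NS': 'suppressed (<5)', 'PE': 'suppressed (<5)'}
```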

Page 23: Why aren't Evaluators using Digital Media Analytics?

partnering

Exploring opportunities for digital media analytics as a line of evidence in evaluation; this is difficult when ToRs are established and budgets are fixed

Outreach to the designers of evaluations (many of you!)

Looking for opportunities to further the field of evaluation through this work: bringing advanced cyber analytics to program evaluation

Page 24: Why aren't Evaluators using Digital Media Analytics?

Giles Crouch, CEO, [email protected]

Tasha Truant, Consultant Manager, [email protected]