
Jen Sweet Office for Teaching, Learning & Assessment DePaul University Shannon Milligan Faculty Center for Ignatian Pedagogy Loyola University Chicago ADVANCED SURVEY DESIGN Tuesday, December 8, 2015 2:00 – 3:30pm


Page 1: Jen Sweet Office for Teaching, Learning & Assessment DePaul University Shannon Milligan Faculty Center for Ignatian Pedagogy Loyola University Chicago

Jen Sweet
Office for Teaching, Learning & Assessment

DePaul University

Shannon Milligan
Faculty Center for Ignatian Pedagogy

Loyola University Chicago

ADVANCED SURVEY DESIGN

Tuesday, December 8, 2015
2:00 – 3:30pm

Page 2

Workshop Outline

• Introductions
• Conflicting Recommendations for Survey Design?
• Cognition and Survey Design
• Recommendations to Reduce Cognitive Load
ACTIVITY!
• Affective Assessment and Surveys
ACTIVITY!
• Survey Design & Distribution Tools
• Analyses of your Survey Instrument

Page 3

Workshop Outcomes

By the end of this workshop, participants will be able to:

• Apply knowledge of the cognitive processes students use to respond to surveys to design effective survey items and instruments.

• Identify non-cognitive variables that can be assessed with surveys.

• Use the tools available to them at their respective institutions to design and distribute surveys.

• Identify a variety of methods available for the analysis of survey data.

Page 4

Conflicting Recommendations for Survey Design?

Page 5

Recommendations for Survey Design that seem to Conflict

Examples:

Neutral Point
• Always include?
• Never use?
• Sometimes yes; sometimes no?

Number of Scale Points to Include
• 2?
• 3?
• 4?
• 5?
• 6?
• 7?
• 9?
• The more the better?

All of This is in the Literature!

Page 6

So, What's Up with the Literature?

All of these recommendations may be appropriate depending on the specific context:
• respondent attributes
• nature of the items in the survey
• length of the survey
• etc.

Generally looking at things like:
• Reliability
• Validity
• Survey Outcomes
• Response Rate (high)
• Use of Response Sets (low)

Survey Design is as Much Art as Science!

"There is always a well-known solution to every human problem - neat, plausible, and wrong." – H. L. Mencken

Page 7

Cognition and Survey Design

Page 8

Cognitive Load (Paas & Van Merrienboer, 1994)

The amount of cognitive effort (or thinking) students need to exert to respond to a survey item.

If cognitive load exceeds students' working memory capacity, they will take some sort of shortcut (Paas & Van Merrienboer, 1994), or satisfice (Krosnick, 1991):

• Read questions less carefully (skim)

• Use a response set

• Give same response for all questions, regardless of content

• Overuse neutral or N/A response option

• Skip the question (provide no response)

• Respond randomly

• Decide not to complete the survey
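One of these shortcuts, the response set, is easy to screen for once data are collected. A minimal sketch in Python with pandas; the item names and responses are invented for illustration:

```python
import pandas as pd

# Toy Likert responses: rows are respondents, columns are survey items
# (invented data, not from the workshop).
responses = pd.DataFrame({
    "q1": [5, 3, 4, 2],
    "q2": [5, 4, 2, 2],
    "q3": [5, 2, 5, 2],
})

# A respondent who gives the identical answer to every item may be using a
# response set (straight-lining), one of the satisficing shortcuts above.
straight_lined = responses.nunique(axis=1) == 1

print(straight_lined.tolist())
```

Flagged respondents can then be reviewed, or their rows examined more closely, before analysis.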

Page 9

Cognitive Processes Used to Respond to Surveys (Tourangeau, 1984)

1. Interpretation
2. Retrieval
3. Judgment
4. Response

Page 10

Recommendations to Reduce Cognitive Load

Page 11

Step 1: Interpretation
• Use language that is clear and familiar to survey respondents.
• Avoid cognitively taxing wording.
• Avoid unfamiliar words and phrasing.
• Avoid jargon and acronyms.
• Ensure that question stems are clear and explicit.
• Do not use concepts that are unclear or unfamiliar to respondents.
• Avoid complex sentence structures.
• Ask about only one concept in each stem; avoid double-barreled questions.
• Use questions that do not make assumptions.
• Ask for information in a direct manner by avoiding double negatives.
• Ensure question stems are succinct, including only as much information as is necessary for respondents to properly interpret what is being requested of them.

Page 12

Interpretation (Continued)
• Include clear instructions that clarify the purpose of the survey instrument, and provide respondents with expected procedures for responding to it.
• Ensure that every portion of a survey instrument is visible without the need for additional action by the respondent.
• Use radio buttons instead of drop-down boxes to display response options.
• Do not "hide" definitions respondents may need to interpret and respond to survey items.
• Use easy-to-read font size and type.
• Use high-contrast font and background colors.

Page 13

Step 2: Retrieval
• Use stems that request information with which respondents have primary experience, and avoid asking for second-hand information (i.e., information that the respondent has heard about but not experienced personally) or hypothetical information.
• Group conceptually similar items together.

Page 14

Step 3: Judgment
• Use the smallest number of response options necessary to encompass all meaningful divisions of what you are asking about.
• General guideline: four or five response options, depending on whether or not there will be a neutral option.
• Include a neutral option if you reasonably expect participants to have no opinion; otherwise, it should be avoided.
• Neutral responses can be difficult to interpret.
• Offering a neutral option may encourage satisficing.

Page 15

Step 4: Response
• Use the smallest number of response options necessary to encompass all meaningful divisions of what you are asking about.
• Label the scale options.
• You may only need to label the most extreme options.
• Include a neutral option if you reasonably expect participants to have no opinion; otherwise, it should be avoided.
• Neutral responses can be difficult to interpret.
• Offering a neutral option may encourage satisficing.

Page 16

Activity!

Page 17

Practice Evaluating Survey Items!

Individually:
• Complete the worksheet

In Groups:
• Compare responses
• Did everyone identify the same items for improvement?
• Are there differences in the ways you edited items?

Page 18

Surveys and the Rise of Affective Assessment

Page 19

Growing Emphasis on Non-Cognitive Abilities
• Grit
• Growth
• Social-emotional development
• Self-awareness/management/efficacy
• General affect
• Engagement
• Mattering
• Climate

• Research shows strong relationships between these variables and overall success

From NPR: “Nonacademic Skills Are Key to Success. But What Should We Call Them?” (May 28, 2015)

Page 20

The Role of Surveys

From NPR: "To Measure What Tests Can't, Some Schools Turn to Surveys" (December 2, 2015)

From that article: “A growing battery of school leaders, researchers and policymakers think surveys are the best tool available right now to measure important social and emotional goals for schools and students”

Why?
• Easy to administer
• Easier to collect and analyze than reflection papers (and the like)
• Many surveys/survey questions already exist
• Faster data sharing = faster decision-making/implementation (maybe)

Page 21

Activity!

Page 22

Non-Cognitive Assessment and You

Individually:
• What non-cognitive variables might be of interest to your program?
• Are you already assessing any of these variables?

In Groups:
• Share what you are assessing and what you might be interested in assessing.
• What variables might be important to examine at an institutional level?

Page 23

Survey Design/Distribution Tools

Page 24

Three Main Tools Used at DePaul

1. Qualtrics

2. Google Forms

3. Survey Monkey

Page 25

Qualtrics

Advantages:
• Free to DePaul faculty and staff
• Supported by Information Services
• Very flexible and comprehensive system
• Lots of features

Disadvantages:
• Reporting features aren't great
• Steeper learning curve than other systems

Page 26

Google Forms

Advantages:
• Free to everyone
• Data is collected in Excel format
• Easier to learn than Qualtrics

Disadvantage:
• Not nearly as many features as Qualtrics

Page 27

Survey Monkey

Advantages:
• ?

Disadvantages:
• Most advanced features cost $
• To get all the features available in Qualtrics, the cost is $780/year
• Limited to 10 questions and 100 responses on the free version

Page 28

Overview of Survey Analysis

Page 29

Survey Analysis

For when you want to:
• Group survey respondents
• Group survey items
• Make predictions/observe relationships
• Analyze "fit" of respondents and items

Which to use based on:
1. Purpose
2. Audience

http://www.edmeasurement.net/5244/SPSS%20survey%20data.pdf

Page 30

Basic Survey Analysis

Sometimes all you need are:

1. Means

2. Correlations

3. Frequency tables/graphs
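All three basics can be produced in a few lines. A sketch using pandas, with invented Likert data (the column names and values are assumptions, not from the workshop):

```python
import pandas as pd

# Invented Likert responses for two items; names are illustrative only.
df = pd.DataFrame({
    "satisfaction": [4, 5, 3, 4, 2],
    "engagement":   [3, 5, 2, 4, 1],
})

means = df.mean()                                             # 1. item means
correlations = df.corr()                                      # 2. Pearson correlations
frequencies = df["satisfaction"].value_counts().sort_index()  # 3. frequency table

print(means["satisfaction"])   # 3.6
print(frequencies.loc[4])      # two respondents chose 4
```

Frequency tables are often the most honest summary for ordinal scales; means treat the scale as interval, which is a judgment call.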

Page 31

Survey Analysis-Group Respondents

• Cluster Analysis
• What it does: Creates groups or "clusters" of respondents based on similar responses to a set of survey questions
• Kind of intuitive because: Clustering is part of organizing (e.g. organization of produce section, medical symptoms)
• What it answers: What do members of each cluster have in common?
• Ex. First-generation students tend to have less familiarity with research opportunities
• Useful for: Marketing, outreach, cohort creation
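As a hedged illustration of the clustering idea, here is a k-means sketch using scikit-learn; the responses, the choice of two clusters, and the settings are all invented for demonstration:

```python
import numpy as np
from sklearn.cluster import KMeans

# Rows = respondents, columns = Likert responses to three items (invented).
X = np.array([
    [5, 5, 4],
    [4, 5, 5],
    [1, 2, 1],
    [2, 1, 2],
])

# Ask for two clusters; the first two respondents answer high, the last two low,
# so they should be separated.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

print(labels)
```

Once respondents are labeled, you would profile each cluster against known attributes (e.g., first-generation status, as in the slide's example).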

Page 32

Survey Analysis-Group Survey Items

• Factor Analysis
• What it does: Groups statistically related survey items into a number of "factors"
• Kind of intuitive because: Groupings can be done based on preconceived ideas or based on the data analysis. Also, largely correlation-based
• What it answers: Can items be removed from the survey? Do the factors relate to constructs that make sense (e.g. identified learning outcomes)?
• Useful for: Survey refinement, naming constructs

Example: Do our questions aimed at measuring a certain outcome actually seem to measure that outcome?
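A small simulated example of this idea, using scikit-learn's FactorAnalysis; the data are synthetic, built so that three items all reflect one underlying construct:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Simulate three survey items driven by one latent construct plus noise.
rng = np.random.default_rng(0)
construct = rng.normal(size=(200, 1))
items = construct @ np.ones((1, 3)) + rng.normal(scale=0.3, size=(200, 3))

# Fit a single factor; loadings show how strongly each item taps the construct.
loadings = FactorAnalysis(n_components=1).fit(items).components_[0]

print(loadings)
```

All three loadings come out with the same sign and sizable magnitude, which is the pattern you would hope to see when items written for one outcome really do measure it.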

Page 33

Survey Analysis-Make Predictions

• Regression
• What it does: Determines the relationship between multiple predictor variables and a dependent variable
• Kind of intuitive because: Similar to correlation (determining the relationship between 2 variables), but with the ability to control for other variables
• What it answers: What item(s) are significant predictors of an outcome? Is there a positive or negative relationship between the predictor variable and outcome variable? How well do our variables explain the observed outcome?
• Useful for: Looking at relationships between responses to certain questions (or factors) and outcomes. Determination of resource allocation. Possible survey refinement

Example: Do students who report greater library use have a higher GPA?
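The library-use example can be sketched as a one-predictor least-squares fit; the numbers below are fabricated for illustration, and a real analysis with several predictors would use multiple regression rather than a single slope:

```python
import numpy as np

# Fabricated data: self-reported weekly library hours and GPA for five students.
library_hours = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
gpa = np.array([2.6, 2.9, 3.1, 3.4, 3.6])

# Ordinary least squares fit of gpa ~ slope * hours + intercept.
slope, intercept = np.polyfit(library_hours, gpa, 1)

print(round(slope, 2))   # 0.25: each extra hour associates with ~0.25 GPA points
```

A positive slope answers the slide's question in this toy data, but as with any regression it shows association, not causation.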

Page 34

Survey Analysis-Analyze “Fit” of Respondents and Items

• Rasch Analysis and Item Response Theory (IRT)
• What it does: Provides information about "ability" of respondents and "difficulty" of survey items
• But they differ in: Rasch only considers respondent ability and item difficulty. IRT can also account for guessing and greater differences between high/low ability respondents
• What it answers: What item(s) are too easy or too difficult? Is the survey measuring a single variable? Are there trends in responses between respondent groups?
• Useful for: Survey refinement, detecting bias in survey items, determining the number of levels needed in a rating scale

Example: Is a survey item written in such a way that it is interpreted differently across student groups?
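For orientation, the one-parameter Rasch model itself is just a logistic function of the gap between respondent ability and item difficulty; a minimal sketch (the values are arbitrary examples):

```python
import math

def rasch_probability(theta, b):
    """Probability of endorsing/answering an item under the Rasch model,
    given respondent ability theta and item difficulty b (logit scale)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability equals difficulty, the model predicts a 50/50 response.
print(rasch_probability(0.0, 0.0))   # 0.5
```

Bias detection (differential item functioning) amounts to checking whether this curve differs across student groups for the same item, which is the slide's example question.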

Page 35

Contact Information

Jen Sweet
DePaul University
Office for Teaching, Learning & Assessment
[email protected]

Shannon Milligan
Loyola University Chicago
Faculty Center for Ignatian Pedagogy
[email protected]