Staying Ahead of the Curve: Keeping Our Assessment Skills Sharp
International Assessment & Retention Conference
St. Louis, Missouri
June 9, 2007
Ted Elling, Ed.D.
Associate Vice Chancellor for Student Affairs, Research & Systems Development
University of North Carolina at Charlotte
[email protected]
Carrie Zelna, Ph.D.
Director of Student Affairs Research and Assessment
North Carolina State University
[email protected]
Goals
• Share findings from the recent national NASPA survey (August 2006)
– How we use assessment
– Where our pockets of expertise lie
– What our training needs & desired forums are
– How we differ from our generalist colleagues
• Group discussion
– How are we keeping current?
– How can we get & keep others current?
– How do we sustain these efforts?
• Joint NASPA/ACPA Assessment Contact Effort (summer 2007)
SAAER Membership Survey
• Administered to all NASPA Student Affairs Assessment, Evaluation, and Research Knowledge Community Members – August/September 2006
• 413 respondents
– 147 coordinate assessment at the SA division level
– 73 coordinate assessment at the SA department level
– 112 do not formally coordinate assessment efforts
– 81 other (mainly academic affairs and other divisions)
• What Are Our Data Gathering Methods?
• What Are Our Training Needs & Desired Formats?
• Where Is Our Presentation Expertise?
What Are Our Data Gathering Methods? (Division Level Coordinators)
[Bar chart: percent yes for each data gathering method (participant numbers, surveys, using institutional data, focus groups, document analysis, observations, course assessment, interviews, reflection/rubrics, portfolios, case studies), broken out by frequency of use: infrequently/never, once every 3 years, once every 2 years, once a year, once a semester, and more than once a semester.]
What Skills Are We Looking to Enhance & How? (Division Level Coordinators)
[Bar chart: percent yes for each skill area (basic assessment overview, focus groups, interviews, surveys, observations, case studies, learning outcomes, collection methods, writing instruments, use of reflection, document analysis, portfolios, data analysis, using national surveys, rubrics at department level, interpretation of data, data integration), broken out by desired format: nothing needed, overview presentation, detailed presentation, or intensive training.]
Skill Enhancement Needs (All Groups)
[Bar chart: the same skill areas as above, percent yes by desired format (nothing needed, overview presentation, detailed presentation, or intensive training), for all respondent groups combined.]
What Do We Need the Most & Least? (By Coordination Level)
[Bar chart: percent answering "nothing needed" for each skill area, comparing respondents with no division/department assessment role against division-level assessment coordinators.]
Who Is Interested in Presenting?
[Bar chart: percent willing to present on each topic, from basic assessment overview through document analysis, comparing respondents with no division/department assessment role against division-level assessment coordinators.]
What We Need Versus Available Presenters: A Training Needs Gap?
[Bar chart: for each topic, from data integration through basic assessment overview, the percent of members requesting training versus the percent of available presenters.]
Group Discussion
• How are we keeping current?
• How can we get & keep others current?
• How can we sustain these efforts?
Assessment Survey Process
• Joint NASPA/ACPA Assessment Summer 07 Contact Effort
– NASPA SAAER Knowledge Community: support assessment curriculum effort
– ACPA Commission for Assessment for Student Development: support Assessment Skills and Knowledge (ASK) Standards
• Email to all ACPA Senior Student Affairs Officers
• Multi-modal email contact to NASPA SAAER KC & IARC attendees
– Forward email to dedicated assessment staff (75 to 100% FT)
– Request to complete survey via embedded link
• Main purpose
– Who & where are our dedicated assessment staff?
– Develop updated web-based resource list
– Identify our trainers
– Encourage focused presentations in conference & workshop settings
Survey Areas
• Institution & Location
• Title
• Education & Experience
• Level of Coordination
• Home Department/Division
• Research Findings Website (URL)
• Areas of Skill Development Needs & Expertise
• Desire to Present
• Contact Data
• Others – What are we missing?
Purpose of a “Curriculum”
• Provide a framework for the KC to use when choosing programs for national and regional conferences and workshops.
• Serve as a guide to help professionals choose the educational opportunities that will give them the most useful information for engaging in assessment.
• Give those in Student Affairs who did not receive assessment education through an academic program an opportunity to learn assessment skills.
• A curriculum format may allow us to implement programs so that completing them provides a resume item and documentation of training.
Assessment 100: Beginning Concepts and Overview of Process
Assessment for beginners: the big picture of assessment and the show-stoppers you need to know first.
I. Basic purposes and uses for assessment
II. Definitions of terms
i. Research vs. assessment vs. program evaluation
ii. Types of data: strategic, dashboard, program (aggregate vs. individual data)
iii. Direct vs. indirect
iv. Summative vs. formative
III. Levels of assessment: institution, division, department, project
IV. Types of assessment: needs, demographics, satisfaction, outcomes, climate, benchmarking, and national assessments
V. Other issues: pre- and post-testing, causality, longitudinal designs
VI. Ethical considerations and the politics of assessment and accreditation

Assessment 110: Articulating Purpose and Mapping Activities
Writing outcomes and unit planning.

Assessment 200: Overview of Methods and Basic Implementation Issues
Overview of methods and basic implementation issues, including sampling; how to determine the best method for the outcome.

Assessment 201: Home-Grown Surveys for Assessment

Assessment 202: Observations and Case Studies in Assessment

Assessment 203: Case Studies in Assessment

Assessment 204: Reflection/Rubrics in Assessment

Assessment 205: Interviews/Focus Groups in Assessment

Assessment 206: Portfolios for Assessment

Assessment 300: Data Analysis Techniques

Assessment 400: Use of Data
Data integration; internal and external uses of data; using national data and campus IR data; etc.

Assessment 600: Basic Consultation
30 to 60 minute sessions with a professional at various NASPA organization meetings.

Assessment 700: Special Topics
Course assessment, technology and assessment, assessing student success, using university data, using national data, and the like.

Assessment 800: Creating a Student Affairs Assessment Process
Process models; infrastructure (staffing, budget, and space issues); motivation techniques; training for the division; etc.

Program Evaluation 100: Beginning Concepts and Resources for Program Evaluation

Research 100: Beginning Concepts and Resources for Research