Trudy Banta, AAGLO Forum, Melbourne, May 2012
Discipline-Based Assessment
to Provide Convincing Evidence of
Graduate Learning Outcomes
Presented in
Australia
May 2012
by
Trudy W. Banta
Professor of Higher Education
and
Senior Advisor to the Chancellor for
Academic Planning and Evaluation
Indiana University-Purdue University Indianapolis
355 N. Lansing St., AO 140
Indianapolis, Indiana 46202-2896
tbanta@iupui.edu
http://www.planning.iupui.edu
© TWBANTA-IUPUI
My History
• Educational psychology
• Program evaluation & measurement
• Performance funding in Tennessee
• 1990 USDOE effort to build a national test
• 1992 Initiated evidence-based culture at
IUPUI
ASSESSMENT
Is like a dancer’s mirror.
It improves one’s ability to see and
improve one’s performance.
Alexander Astin
1993
ASSESSMENT OF INDIVIDUAL
STUDENT DEVELOPMENT
•Assessment of basic skills for use in advising
•Placement
•Counseling
•Periodic review of performance with detailed
feedback
•End-of-program certification of competence
•Licensing exams
•External examiners
KEY RESULTS OF INDIVIDUAL
ASSESSMENT
•Faculty can assign grades
•Students learn their own
strengths and weaknesses
•Students become self-assessors
A SECOND LOOK
•Across students
•Across sections
•Across courses
•Where is learning satisfactory?
•What needs to be retaught?
•Which approaches produce the most
learning for which students?
GROUP ASSESSMENT ACTIVITIES
•Classroom assignments, tests, projects
•Questionnaires for students, graduates, employers
•Interviews, focus groups
•Program completion and placement
•Awards/recognition for graduates
•Monitoring of success in graduate school
•Monitoring of success on the job
ASSESSMENT . . .
“a rich conversation
about student learning
informed by data.”
-- Ted Marchese --
AAHE
USE OF RESULTS OF GROUP
ASSESSMENT
•Program improvement
•Institutional and / or state peer
review
•Regional and / or national
accreditation
ORGANIZATIONAL LEVELS FOR ASSESSMENT
National
Regional
State
Campus
College
Discipline
Classroom
Student
GROUP ASSESSMENT REQUIRES
COLLABORATION
In setting expected program outcomes
In developing sequence of learning experiences (curriculum)
In choosing measures
In interpreting assessment findings
In making responsive improvements
BARRIERS TO COLLABORATION
IN THE ACADEMY
1. Graduate schools prepare specialists
2. Departments hire specialists
3. Much of our scholarship is
conducted alone
4. Promotion and tenure favor
individual achievements --
interdisciplinary work is harder to
evaluate
TO FOSTER COLLABORATION
•Name interdisciplinary committees
•Read and discuss current literature on
learning/assessment
•Attend conferences together
•Bring experts to campus
•Share good practices
•Work together on learning communities
MOST FACULTY ARE NOT TRAINED AS
TEACHERS
Faculty Development
Can Help Instructors:
•Write clear objectives (outcomes) for student learning in courses and curricula
•Connect learning outcomes to assignments in courses.
•Develop assessment tools that test higher order intellectual skills
Taxonomy of Educational Objectives
(Bloom and Others, 1956)
Cognitive domain categories, with sample verbs for outcomes:
Knowledge: Identifies, defines, describes
Comprehension: Explains, summarizes, classifies
Application: Demonstrates, computes, solves
Analysis: Differentiates, diagrams, estimates
Synthesis: Creates, formulates, revises
Evaluation: Criticizes, compares, concludes
SOME GENERIC LEARNING OBJECTIVES
•Differentiate between fact and opinion
•Gather, analyze, and interpret data
•Apply ethical principles to local,
national, global issues
•Communicate ideas in writing effectively
PROFESSIONAL PROGRAM
OBJECTIVES
Program Graduates will Demonstrate
1. Professional commitment
2. Communication skills
3. Administrative and managerial skills
4. Information technology competence
5. Research and analytic competence
To Ensure That Concepts Are Taught
(e.g., time management)
ALVERNO COLLEGE 8 ABILITIES
Communication
Analysis
Problem Solving
Valuing in Decision-Making
Interacting
Global Perspectives
Effective Citizenship
Aesthetic Responsiveness
PRINCIPLES OF UNDERGRADUATE
LEARNING (PULs)
1. Core communication and quantitative
skills
2. Critical thinking
3. Integration and application of knowledge
4. Intellectual depth, breadth, and
adaptiveness
5. Understanding society and culture
6. Values and ethics
Approved by IUPUI Faculty Council
May 1998
PUL #1
CORE COMMUNICATION & QUANTITATIVE
SKILLS
Demonstrated by student’s ability to:
•Express ideas and facts to others effectively in a variety
of formats, particularly written, oral, and visual formats
•Communicate effectively in a range of settings
•Identify and propose solutions for problems using
quantitative tools and reasoning
•Make effective use of information resources and
technology
PRINCIPLES OF UNDERGRADUATE
LEARNING
•A distinctive feature of education at IUPUI
•Permeate the entire undergraduate
curriculum
•Are enacted differently in each discipline
PUL HISTORY AT IUPUI
1990 – Study group of faculty and staff
1992-98 – Series of task forces
1998 – Adoption by Faculty Council
2007 – Adoption of revised version
Standardized tests
CAN
initiate conversation
IN USING STANDARDIZED TESTS
• Match test with curriculum
•Set expected scores on subscales
•Discuss results
•Determine what is missing
Limitations of standardized tests of generic skills
•Cannot cover all a student knows
•Narrow coverage; need to supplement
•Difficult to motivate students to take them!
•What are they actually measuring?
VOLUNTARY SYSTEM OF ACCOUNTABILITY
Report Scores in
critical thinking, written communication,
analytic reasoning
using
•Collegiate Assessment of Academic Proficiency (CAAP)
•Measure of Academic Proficiency and Progress (MAPP)
•Collegiate Learning Assessment (CLA)
TN = MOST PRESCRIPTIVE (5.45% OF
BUDGET FOR INSTRUCTION)
1. Accredit all accreditable programs (25)
2. Test all seniors in general education (25)
3. Test seniors in 20% of majors (20)
4. Give an alumni survey (15)
5. Demonstrate use of data to improve (15)
___
100
AT THE UNIVERSITY OF TENNESSEE
CAAP
Academic Profile (now MAPP)
COMP (like CLA and withdrawn
by 1990)
College BASE
IN TN WE LEARNED
1. No test measured 30% of gen ed skills
2. Tests of generic skills measure primarily
prior learning
3. Reliability of value added = .1
4. Test scores give few clues to guide
improvement actions
AN INCONVENIENT TRUTH
.9 = the correlation between SAT
and CLA scores of institutions
thus
81% of the variance in institutions’
scores is due to prior learning
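The arithmetic behind this slide can be checked directly: the proportion of variance explained is the square of the correlation coefficient, and what is left over is the share the following slide partitions among motivation, sampling error, and college effects. A minimal sketch in Python:

```python
# Proportion of variance in institutions' CLA scores associated with
# prior learning, given the reported SAT-CLA correlation of r = .9.
r = 0.9                     # correlation between institutional SAT and CLA scores
explained = r ** 2          # coefficient of determination (r-squared)
remaining = 1 - explained   # share left for college effects, error, motivation, etc.
print(f"explained: {explained:.0%}, remaining: {remaining:.0%}")
# prints "explained: 81%, remaining: 19%"
```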
HOW MUCH OF THE VARIANCE IN SENIOR
SCORES IS DUE TO COLLEGE IMPACT?
• Student motivation to attend that institution (mission differences)
• Student mix based on
• age, gender
• socioeconomic status
• race/ethnicity
• transfer status
• college major
HOW MUCH OF THE VARIANCE IN SENIOR
SCORES IS DUE TO COLLEGE IMPACT?
(CONTINUED)
•Student motivation to do well
•Sampling error
•Measurement error
•Test anxiety
•College effects
______
19 %
STUDENT MOTIVATION
• Samples of students are being tested
• Extrinsic motivators (cash, prizes) are used
We have learned:
• Only a requirement and intrinsic motivation
will bring seniors in to do their best
CONCERNS ABOUT VALUE ADDED
•Student attrition
•Proportion of transfer students
•Different methods of calculating
•Unreliability
•Confounding effects of maturation
Recent University of Texas Experience
30 – 40% of seniors at flagships earn
highest CLA score (ceiling effect)
flagship campuses have lowest value
added scores
WORD FROM MEASUREMENT EXPERTS
Given the complexity of
educational settings, we may never be
satisfied that value added models can be
used to appropriately partition the causal
effects of teacher, school, and student on
measured changes in standardized test
scores.
- Henry Braun & Howard Wainer
Handbook of Statistics, Vol. 26: Psychometrics
Elsevier 2007
Employing currently available
standardized tests of generic
skills to compare the quality
of institutions is not a valid use of
those tests.
OECD’S AHELO
COMPARING HEIS X NATIONS
1. Generic skills (CLA)
2. Disciplines (Engineering and Economics)
3. Value added
4. Contextual information indicators
2012
K-12 standardized test scores are used to
evaluate and compare schools
assign grades to schools
take over failing schools
evaluate, compare, and fail teachers
Yet NAEP scores have stagnated
IN FINLAND AND SINGAPORE
•No annual testing of students
•No high-stakes accountability measures for
teachers/schools
•Scholarships for best and brightest
•Starting pay like a doctor
•Must complete master’s degree
•Teachers are respected professionals
SHORT-TERM PERSPECTIVE
•Limit degrees to 120 SCH
•Penalize students who go beyond a SCH
cap
•Reward graduation in 4 years
•Consider earning potential in setting tuition
DE-PROFESSIONALIZATION –
IMMEDIATE PAYOFF
•Teacher education is first
•Industry certifications
•Partnerships to fill employers’ needs
Does apprenticeship model prepare us for
global leadership in the future?
BETTER WAYS TO DEMONSTRATE
ACCOUNTABILITY
Performance Indicators
1.Access (to promote social mobility)
2.Engaging student experience
3.Workforce development
4.Economic development
5.Civic contribution of students, faculty,
staff, graduates
IF WE MUST MEASURE LEARNING
LET’S USE:
1. Standardized tests in major fields:
licensure and certification tests
ETS Major Field Tests
2. Internship performance
3. Senior projects
4. Study abroad performance
5. Electronic portfolios
6. External examiners
START WITH MEASURES YOU
HAVE
•Assignments in courses
•Course exams
•Work performance
•Records of progress through the
curriculum
METHODS OF ASSESSMENT
Paper and pencil tests
Individual or group projects
Portfolios
Observation of practice
Observation of simulated practice
Analysis of case studies
Attitude or belief inventories
Interviews and focus groups
Surveys
Direct Measures of Learning
Assignments, exams, projects, papers
Indirect Measures
Questionnaires, inventories, interviews
- Did the course cover these objectives?
- How much did your knowledge increase?
- Did the teaching method(s) help you learn?
- Did the assignments help you learn?
GOOD ASSESSMENT INCLUDES BOTH
NILOA SURVEY: PROGRAM-LEVEL APPROACHES
1. Portfolios (80% in at least 1 area)
2. Performance assessments
3. Rubrics
4. External judges
5. Student interviews
6. Employer surveys
STUDENT ELECTRONIC PORTFOLIO
•Students take responsibility for demonstrating core skills
•Unique individual skills and achievements can be emphasized
•Multi-media opportunities extend possibilities
•Metacognitive thinking is enhanced through reflection on contents
- Sharon J. Hamilton
IUPUI
More use of RUBRICS
locally developed
VALUE from AAC&U
VALUE RUBRICS
•Critical thinking
•Written communication
•Oral communication
•Information literacy
•Teamwork
•Intercultural knowledge
•Ethical reasoning
ACCOUNTABILITY REPORT
•85% achieve Outstanding ratings in writing
as defined . . .
•78% are Outstanding in applying knowledge
and skills in internships
•75% are Outstanding in delivering an oral
presentation
FOR EXTERNAL CREDIBILITY
Collaborate on rubrics
Use employers as examiners
Conduct process audits
E-PORT CHALLENGES
•Reliability of rubrics
•Student motivation if used for assessment
(Barrett, 2009)
•Differences in topics for products to be
evaluated
(Sekolsky & Wentland, 2010)
OBSTACLES TO USING
PERFORMANCE-BASED MEASURES
•Defining domains and constructs
•Obtaining agreement on what to measure
and definitions
•Defining reliability and validity
•Creating good measures
- Tom Zane
WGU
WILL IT TAKE 80 YEARS . . . ?
3 Promising Alternatives
E-portfolios
Rubrics
Assessment
communities
- Banta, Griffin, Flateby,
Kahn
NILOA Paper #2 (2009)
TEAGLE ASSESSMENT SCHOLARS
•study assessment data
•visit campuses
•talk with 3-4 groups of students
•talk with faculty about their campus
assessment data
- Charles Blaich
Wabash College
NATIONAL SURVEY OF STUDENT ENGAGEMENT
AT
~ HOPE COLLEGE ~
% STUDENTS STUDYING LESS THAN
10 HOURS/WEEK
Freshmen Seniors
2003 38% 39%
2010 21% 28%
HOPE COLLEGE
•Considered data over supper
•Proposed solutions
•Conducted student focus groups
•Shared all data with all faculty
•Departments dedicated a meeting to prepare
strategies to increase rigor
NATIONAL INSTITUTE FOR
LEARNING OUTCOMES ASSESSMENT
•Surveys
2009 CAOs
2011 Departments
•Occasional Papers
•Website review, standards
•Quick comments (monthly)
•Calendar of events
LUMINA
Degree Qualifications Profile
- Linked student learning outcomes, AA to MA levels
- Suggests transfer based on assessment of learning outcomes
NEW LEADERSHIP ALLIANCE
FOR STUDENT LEARNING AND ACCOUNTABILITY
- Presidents’ Alliance
- Certification Process
Set ambitious goals for learning
Gather evidence of learning
Use evidence to improve learning
Report evidence and results
BUILD ASSESSMENT INTO VALUED
PROCESSES
1. Assessment of learning
2. Curriculum review and revision
3. Survey research
4. Program review
5. Scholarship of Teaching & Learning
6. Evaluation of initiatives
7. Faculty development
8. Promotion & tenure
9. Rewards and recognition