
From 0 to 60 in Three Years – Student Learning Assessment Turnaround

August C. Delbert, MA, M.Div, Director of Assessment

Anne M. Candreva, MEd, Assistant VP of Institutional Research, Effectiveness & Planning

December 3, 2015

Intended Outcomes

• Develop strategies for building and fostering a body of faculty support for the implementation and sustainable practice of assessment.

• Learn methods for planning, tracking, and ushering in the implementation of institution-wide learning assessment processes.

• Acquire practical tips for building organizational consensus around change initiatives related to assessment.

The Carlow Context

• Private master’s comprehensive university

• 1,400 undergraduate
• 875 graduate
• 100 full-time faculty
• 170 part-time faculty

The Carlow Context

Carlow University, rooted in its Catholic identity and embodying the heritage and values of the Sisters of Mercy, offers transformational educational opportunities for a diverse community of learners and empowers them to excel in their chosen work as compassionate, responsible leaders in the creation of a just and merciful world.

A History of Inconsistency Prior to Fall 2012:

• Waves of assessment initiatives; inconsistent and unsustained

• Responsibility for assessment was a “tack-on” for a single person, in addition to other duties

• Pockets of excellence, but no common standards across departments

• Lots of dusty data; little impact to show for it

• Complexity

Spring - Fall 2012

• Spurred on by our Periodic Review Report and a new focus on institutional effectiveness

• Identified key players and secured funds for professional development

• Hired a full-time assessment coordinator (since promoted to Director of Assessment)

• Established an assessment committee
• Developed a project plan

Planning Considerations

The “Why”
• Establish goals and rationale
• Make your case
• The importance of institutional accountability

Political/Interpersonal
• Secure support from administrative leadership
• Build a team
• Establish relationships
• Educate the community

Project Management
• Start with the end in sight
• Start with what you have (the “low-hanging fruit”)
• Gap analysis
• Define the time frame and milestones
• Manage and organize information
• Execute and then adapt

Quality Control
• Assessing our assessment process

The “Why?”

Establish A Rationale For Your Assessment Goals

• Why are we doing this? Why is it important? Why should people care?

• It must be anchored in your mission.
– The goals of assessment should be aligned with the things that people care about.

• It can’t just be: “Middle States said so…”
– It may facilitate compliance, but not buy-in.

Making The Case For Assessment

The Angles We Used:
• Anchored in an important aspect of the mission and/or vision
• Professionalism
• Ethical imperative
• Student-centeredness
• Best practice: it’s a characteristic of a thriving institution
• Institutional accountability

…and accreditation expectations

The Importance of Institutional Accountability

• Increasingly limited resources
• Increasing public scrutiny
• Responsible stewardship is the new norm.
– A big part of this is data-informed decisions…at all levels
– Most institutions can no longer afford to be anecdotal
• The entire institution has a shared responsibility. This isn’t one person’s job. This isn’t a silo. If your institution thinks it is, your assessment efforts are doomed.

Political/Interpersonal Considerations

Secure Support from Administrative Leadership

• Not just lip service. There must be a true commitment from the highest levels.
– Make assessment someone’s primary responsibility
• The Director of Institutional Research became Senior Director of Institutional Research and Effectiveness (and eventually a member of the President’s Cabinet)
• Hired a full-time assessment coordinator
– Identify infrastructure and information-management needs
• Carlow purchased an assessment information management system
– Build assessment initiatives into the strategic plan

Secure Support from Administrative Leadership
• Not just lip service. There must be a true commitment from the highest levels.
– Carve out calendar space
• “Summer Assessment Team” meetings
• “Close the Loop Day” in May and “Open the Loop Day” in August
– Consistent messaging about the importance of assessment from the highest levels
• At Opening Convocation
• At University Faculty Assemblies
• In meetings with Directors
– Include assessment as a responsibility within job descriptions
• Assessment is NOT a bureaucratic task. It is a professional and ethical responsibility.

Build a team

• “Team Empowering Student Learning Assessment” (TESLA)
– Representative cross-section of respected faculty
– Development and education to ensure understanding of best practices in assessment
– Clear charge (link to TESLA’s charge statement)
– Honest conversation about the goals. Build consensus and buy-in.

Establish Relationships

• Identify “Point People” in every key functional area
– A point person charged with oversight of each core learning outcome
– A point person charged as the primary liaison for each program of study
– A clear role description, expectations, and responsibilities
• CLO point person role description
• PoS point person role description
– One-on-one meetings at least once a semester with the assessment coordinator

Project Management

Start With The End In Sight
• X number of years from now, where do we need to be and what do we need to have achieved? What’s the intended outcome?
• Do you have an assessment vision? What would constitute meaningful, useful, sustainable, pervasive assessment at your institution?
• Define the outcomes of the initiative
– Will all programs be assessing all outcomes every year?
– Will a subset of programs be assessing a subset of outcomes every year?
– How will assessment be connected to decision-making processes at the institution?
– How do we ensure quality in our assessment efforts?

Begin With What You Have

• Identify the “low-hanging fruit”
– Areas of excellence that could serve as exemplars
• Perhaps you already have such a program
– Programs that have experience with external accreditation
• Nursing, Social Work, Psychology, Business, Education, etc.
– Infrastructure that already exists
• Perhaps one or two institutional learning outcomes are already pervasively integrated throughout the curriculum

Define the Time Frame, Milestones, and Expectations

• What needs to be accomplished, by when, and by whom?

• What are the individual, discrete steps that must be taken to reach these goals?

• At what rate can these individual steps be realistically achieved?

Define the Timeline

[Timeline graphic: August 2012 through the MSCHE team site visit in April 2016, with “Today” marking December 2015. Parallel tracks show Group A and Group B programs of study moving through assessment Cycles 1-3 by fall and spring semester, alongside Core Learning and General Education (GE/PoS) assessment.]

Program of Study Assessment Expectations: Each Program of Study Will Develop and Assess 5-7 Student Learning Outcomes*

• 2012-2013: Group A assessing and closing the loop with all possible outcomes
• 2013-2014: Group A assessing and closing the loop with all outcomes; Group B assessing and closing the loop with all possible outcomes
• 2014-2015: Group A and Group B assessing and closing the loop with all outcomes
• 2015-2016: Group A and Group B assessing and closing the loop with all outcomes (self-study team visit: Spring 2016)

*Applies to both undergraduate and graduate programs.

Assessment Process Updates: Programs of Study (Group A)

Art, Accounting, Business Management, HR Management & Technology, Management in Health Services, Nursing, Communication, Political Science, Psychology, Social Work, Fraud & Forensics, MBA, Professional Counseling, PsyD, Nursing FNP, Nursing Leadership, Nursing DNP

Assessment Process Updates

[Progress chart for the A1 Undergraduate and A1 Graduate cohorts, last updated 5/7/13, tracking each program through the following steps: introduced to Carlow’s assessment process expectations; overview presentation given; developed 5-7 (or appropriate) outcomes; finalized assessment methods; finalized criteria; TracDat training; outcomes information entered into TracDat; assessment method info entered into TracDat; assessment results collected; results entered into TracDat; results reviewed to determine action steps; action plan developed and entered into TracDat. Target dates ran from Fall 2012 through 28-Feb, 31-Mar, and April-May.]

Step by step breakdown:

Assessment Process Updates

[The same progress chart, last updated 2/5/14, repeats on this and the following slides, highlighting the steps called out below.]

• Introduced to Carlow’s Assessment Process Expectations

• Overview presentation given

Assessment Process Updates


• PoS developed 5-7 outcomes (or another appropriately agreed-upon number)

• PoS has finalized assessment methods for each outcome

• PoS has finalized criteria for each outcome

Assessment Process Updates


• Representatives in the program have received TracDat training.

• Outcomes info has been entered into TracDat

• Assessment method info has been entered into TracDat

Assessment Process Updates


• Assessment results collected.

• Results entered into TracDat

• Results reviewed to determine action steps.

• Action plan developed and entered into TracDat.

Assessment Process Updates


• 2012-13 results entered in TracDat

• 2012-13 action steps entered in TracDat

• Action step follow-ups entered into TracDat
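The progress charts behind these slides reduce to a simple matrix: each program’s status on each process step. A minimal sketch of such a tracker follows; all names and statuses are hypothetical illustrations, not how Carlow actually maintained the charts:

```python
# Minimal sketch of the progress tracking shown in the charts above:
# each program's status per process step. Names and data are hypothetical.
from enum import Enum

class Status(Enum):
    COMPLETED = "Completed"
    IN_PROGRESS = "Work in progress"
    NOT_STARTED = "Not started"

STEPS = [
    "Introduced to assessment process expectations",
    "Developed 5-7 outcomes",
    "Finalized assessment methods",
    "Finalized criteria",
    "TracDat training",
    "Outcomes entered into TracDat",
    "Results entered into TracDat",
    "Action plan entered into TracDat",
]

# program -> {step: status}; anything unlisted defaults to NOT_STARTED
progress: dict[str, dict[str, Status]] = {
    "Nursing": {step: Status.COMPLETED for step in STEPS},
    "Art": {STEPS[0]: Status.COMPLETED, STEPS[1]: Status.IN_PROGRESS},
}

def report(program: str) -> None:
    """Print one program's status on every step, in chart order."""
    for step in STEPS:
        status = progress.get(program, {}).get(step, Status.NOT_STARTED)
        print(f"{program}: {step}: {status.value}")

report("Art")
```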

Assessment Process Updates: Programs of Study (Group B)

Art Education, Art Therapy, Biology, Chemistry, Creative Writing, Criminal Justice, Early Childhood Education, English, History, Liberal Studies, Mathematics, Middle Level Education, Nursing RN to BSN, Philosophy, Sociology, Theology, Secondary Education, Special Education 7-12, Special Education PreK-8

[Progress chart for the Group B undergraduate programs, last updated 2/13/14, tracking the same steps as Group A with target dates of 25-Oct, 13-Dec, 31-Jan, and April-June 2014.]

General Education and Carlow Outcomes

[Tracking grid. Columns (outcomes): Oral Communication, Writing, Critical Reasoning, Quantitative Reasoning, Scientific Reasoning, LAI, Service Learning, Technology, Information Literacy. Rows (tasks), grouped by step:
• Define Outcomes: behavior; method; criteria
• Implement the Process: determine where the assessment is applied; determine who collects the results; determine who evaluates the results; communicate plans to all relevant parties
• Evaluate Results: collect and organize results; review and evaluate results; determine action steps based on results
• Act: implement action steps; review the effectiveness of action step implementation
Color key: completed / work in progress.]

Developing and Embedding General Education and Carlow Outcomes (2012-2013), last updated 11-22-13

[Tracking grid. Columns (outcomes): Oral Communication, Writing, Critical Reasoning, Quantitative Reasoning, Scientific Reasoning, LAI, Service Learning, Technology, Information Literacy, Creative Reasoning, Ethical Reasoning, Catholic Intellectual Tradition, Women-Centeredness(?). Rows (tasks), grouped by step:
• Define Outcomes: behavior; method; criteria
• Gen Ed Curriculum Design: identify/create learning opportunities; identify/create demonstration opportunities
• Implement the Process: determine where the assessment is applied; determine who collects the results; determine who evaluates the results; communicate plans to all relevant parties
• Evaluate Results: collect and organize results; review and evaluate results; determine action steps based on results; document results and action steps in TracDat
• Act: implement action steps; review the effectiveness of action step implementation
Key: complete and finalized / progress but not finalized.]

Map Your Gaps

Information Literacy: Students will demonstrate information literacy by formulating research questions, designing a search strategy, and critically analyzing sources and content in order to evaluate the information gathered and use it effectively.

Can you identify any specific assignments wherein your students demonstrate this IL outcome?

Yes / No

If Yes, please describe the assignment(s), the courses in which the assignment(s) occur(s), and when each course is offered:

Do you have any suggestions or considerations about information literacy assessment you think we should be aware of? If yes, please describe below:

Managing and Organizing Information

• Develop a system to manage information flow
– What are the information channels?
– Who reports to whom?

• Invest in infrastructure to support the massive amounts of information
– Where and when in university decision-making processes will assessment information be integrated?
– Consider an information management system.
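To make the scale of that information concrete, here is a minimal sketch of the record chain such a system tracks: outcome, assessment method, criterion, results, action steps, follow-ups. All type and field names are hypothetical illustrations; this is not TracDat’s actual schema.

```python
# A minimal sketch (not TracDat's actual schema) of the record chain the
# deck describes: outcome -> assessment method -> criterion -> results ->
# action steps -> follow-ups. All names here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class FollowUp:
    date: str
    note: str          # what happened after the action step was implemented

@dataclass
class ActionStep:
    description: str   # the improvement to make, based on results
    target_date: str
    follow_ups: list[FollowUp] = field(default_factory=list)

@dataclass
class Result:
    term: str          # e.g. "2012-13"
    summary: str
    met_criterion: bool
    action_steps: list[ActionStep] = field(default_factory=list)

@dataclass
class AssessmentMethod:
    demonstration: str  # how students demonstrate the outcome
    evaluation: str     # how the demonstration is evaluated (e.g. a rubric)
    criterion: str      # target for success, e.g. "80% score 3+ on the rubric"
    results: list[Result] = field(default_factory=list)

@dataclass
class Outcome:
    program: str        # program of study, e.g. "Nursing"
    statement: str      # the student learning outcome itself
    methods: list[AssessmentMethod] = field(default_factory=list)
```

Even a modest institution multiplies this quickly: dozens of programs, 5-7 outcomes each, several methods per outcome, results every cycle, which is the argument for dedicated infrastructure rather than scattered spreadsheets.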

Execute and Adapt

• This is an iterative, ongoing process
• Give yourself permission to adapt
• Chunk the project into stages
• Do not let the perfect be the enemy of the good
• Assessment is not research
• You have to start somewhere

Quality Control

Assessing Our Assessment

• “Assessment Process Evaluation Rubric”
– Each program’s assessment process is evaluated by the Director of Assessment every academic year.
– Feedback is provided based on the rubric.
– A summary of all evaluations is reviewed by the assessment committee, the deans, and the Provost.

• External review
– Bring in assessment experts from other organizations at regular intervals to review your processes and provide feedback.

• Practice what we preach
– IRE&P has its own divisional assessment plan
• Goals, assessment methods, and criteria defined for the success of our assessment processes
• Establish criteria for what constitutes effective assessment

Assessment Process Evaluation Rubric
Levels: Effectively Proficient (2 points), Minimally Proficient (1 point), Developing (0 points). To achieve a level of proficiency, a program must demonstrate a majority of items in that level’s checklist.

Student Learning Outcomes
Effectively Proficient:
– The assessment plan has defined an appropriate number of outcomes that are robust in breadth and depth.
– SLOs reflect attributes specific to the program of study (no ambiguity with Gen Ed outcomes).
– SLO verbiage is appropriate for the highest level of proficiency that the program expects of its students (see Bloom’s taxonomy).
– All tracks and specialties within the program are accounted for in a robust fashion.
– All SLOs are clear and distinct.
Minimally Proficient:
– A full set of SLOs has been developed; however, it may be lacking in some aspect of breadth or depth.
– SLOs reflect attributes specific to the program of study; however, a few may have a small degree of ambiguity with Gen Ed outcomes.
– SLO verbiage is mostly appropriate; however, several SLOs may need to be reframed in higher-level terms (see Bloom’s taxonomy).
– All tracks and specialties within the program have at least one outcome; however, it may be limited in depth or narrow in scope.
– The intent of all SLOs is clear; however, a few may be lacking in focus.
Developing:
– The assessment plan has fewer than 5 SLOs OR more than 7 unsupportable SLOs (“unsupportable” meaning prohibitive to complete assessment of the full plan).
– A significant number of outcomes are not program-specific; they may be focused more on general education skills than program-specific skills.
– Most SLO verbiage is inappropriate or does not reflect high-level skill sets (see Bloom’s taxonomy).
– There are tracks and specialties within the program that have not yet been addressed.
– SLOs are vague, unclear, or unfocused.

Assessment Methods
Effectively Proficient:
– Each outcome has a complete and precise method of assessment that describes both 1) how students will demonstrate the outcome and 2) how the students’ demonstration of the outcome is going to be evaluated.
– Most outcomes have multiple assessment methods, with at least one direct method.
– All rubrics and relevant supporting documents are “Related” in TracDat, and the rubrics and documents are appropriately descriptive.
Minimally Proficient:
– Each outcome has a complete method of assessment, but several assessment methods may be lacking in precision or neglect to describe either 1) how students will demonstrate the outcome or 2) how the students’ demonstration of the outcome is going to be evaluated.
– Each outcome has at least one direct assessment method.
– All rubrics and relevant supporting documents are uploaded to TracDat.
Developing:
– The description of a significant number of assessment methods is lacking either 1) a description of the source “demonstration opportunity” or 2) a description of the tool or technique that will be used to evaluate the “demonstration opportunity.”
– Direct assessment methods have not been established for every outcome.
– Rubrics and supporting materials have not yet been uploaded to TracDat.

Goals/Criteria for Success
Effectively Proficient:
– Each SLO has at least one value-based or performance-based criterion (e.g., a definition of good performance vs. inadequate performance).
– Each criterion includes a reasonable target, aim, or goal.
Minimally Proficient:
– Each criterion includes a reasonable target/aim/goal OR a value/performance-based standard for success.
Developing:
– Criteria for success have not yet been articulated for a significant number of assessment methods.

Results
Effectively Proficient: 100% of outcome results entered in TracDat.
Minimally Proficient: 66-99% of outcome results entered in TracDat.
Developing: Less than 66% of outcome results entered in TracDat.

Action and Follow-up
Effectively Proficient:
– Specific action steps for improvement have been determined, based on the assessment results, for all results that did not meet criteria.
– Timelines/milestones/target dates for implementation of action steps have been established where appropriate.
– Appropriate follow-ups entered for all actions.
Minimally Proficient:
– Specific action steps for improvement have been determined, based on the assessment results, for most results that did not meet criteria.
– Timelines and/or methods for implementation of the action steps have been established for most applicable outcomes and actions.
– Appropriate follow-ups entered for most actions.
Developing:
– Action steps are either missing where needed or too vague to be meaningful or effective.
– Timelines and/or methods for implementation of the action steps have not been sufficiently established.
– Appropriate follow-ups have not yet been entered for actions.
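The “majority of items” rule is easy to misread, so here is a minimal sketch of it as a scoring routine: each rubric dimension earns the highest level for which a strict majority of that level’s checklist items are demonstrated. The data structures and names are hypothetical illustrations, not an official Carlow or TracDat tool.

```python
# Minimal sketch of the rubric's scoring rule: a dimension earns the highest
# proficiency level for which a majority of that level's checklist items are
# demonstrated. Hypothetical data structures; not an official tool.

# Levels in descending order of proficiency, with their point values.
LEVELS = [("Effectively Proficient", 2), ("Minimally Proficient", 1),
          ("Developing", 0)]

def score_dimension(checklists: dict[str, list[bool]]) -> tuple[str, int]:
    """Return (level name, points) for one rubric dimension.

    checklists maps each level name to a list of booleans, one per
    checklist item, marking whether the program demonstrated that item.
    """
    for name, points in LEVELS:  # try the highest level first
        items = checklists[name]
        if sum(items) > len(items) / 2:  # strict majority demonstrated
            return name, points
    return "Developing", 0

# Example: an SLO dimension where 4 of 5 "Effectively Proficient" items hold.
slo = {
    "Effectively Proficient": [True, True, True, True, False],
    "Minimally Proficient":   [True, True, True, True, True],
    "Developing":             [False, False, False, False, False],
}
print(score_dimension(slo))  # -> ('Effectively Proficient', 2)
```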

Open Discussion and Questions

Appendix

• Assessment Committee Charge Statement
• Point Person Role Descriptions for:
– Programs of Study
– Core Learning Outcomes

Team Empowering Student Learning Assessment Charge Statement

• TESLA is committed to quality assessment of student learning at Carlow University. More specifically, TESLA is charged with the following tasks:
– To coordinate and advise on the planning and implementation of student learning assessment;
– To prepare an overall University Student Learning Assessment Plan which meets the requirements of the Middle States Commission on Higher Education;
– To develop, implement, teach, and advocate for Carlow’s Principles of Good Assessment Practice;
– To discern best practices in assessment and work to integrate these practices with the university’s decision-making processes;
– To assist programs of study as they identify critical issues and assessment opportunities;
– To support programs of study as they, in turn, seek to implement Carlow’s Principles of Good Assessment Practice;
– To assess and continually improve Carlow’s Student Learning Assessment processes.
• Delivery of charge: 09/13/2012

Return Link

Program of Study “Point Person” Role Description

One “Point Person” will be identified for each department or program of study (PoS) to assist and facilitate consistency within the assessment process. The Point Person will do this by:
• Communicating with colleagues to ensure assessment information is entered into the TracDat assessment management system at the appropriate times.
• Working with colleagues to develop a timeline for the assessment of each of their department/PoS’s outcomes.
• Reviewing agreed-upon action plans at the beginning of each fall semester and then communicating with colleagues to learn whether they are prepared to implement their delegated action steps.
• Sharing updates about the assessment process with their colleagues.

At the end of each academic year, IR&E will send each Point Person the following information:
– A review of their department/PoS’s plan, with recommendations for continued improvement.
– An evaluation of their department/PoS’s assessment process in terms of sustainability and proficiency.
– Instructions for future assessment reports.

Return Link

Core Learning Outcome Point-Person Role Description

• Will work with the Core Director to collect artifacts;
• Will develop and update rubrics as needed;
• Will identify opportunities for students to demonstrate the core learning outcomes within the curriculum;
• Will identify gaps within the curriculum wherein students do not have an opportunity to clearly demonstrate the core learning outcome;
• Will work with Programs of Study to develop opportunities for students to demonstrate the Core Learning Outcome in the curriculum;
• Will communicate the expectations and steps necessary for Programs of Study to assist with and facilitate the Core Learning Assessment Process;
• Will collaborate with all relevant stakeholders to analyze CLO assessment results and develop appropriate action steps for improvement;
• Will ensure results and any action steps and/or follow-ups are entered into TracDat for the particular Core Learning Outcome.

Return Link