EMU: Office of Institutional Effectiveness and Accountability Page 1
STUDENT LEARNING OUTCOMES (SLOs) ASSESSMENT PLAN WORKSHOP
Conducted by Peggy Liggit and Lisa Klopfer
109 Bruce T. Halle Library Building

Workshop sessions:
Tuesday, September 14, 1:00–2:30 PM
Friday, September 17, 10:30 AM–12:00 PM
Monday, September 20, 9:00–10:30 AM
Tuesday, September 21, 2:30–4:00 PM
Monday, September 27, 1:00–2:30 PM
Friday, October 1, 1:00–2:30 PM
Wednesday, October 6, 2:30–4:00 PM
Thursday, October 7, 9:00–10:30 AM
Monday, October 11, 10:30 AM–12:00 PM
Thursday, October 14, 1:00–2:30 PM
Tuesday, October 19, 9:00–10:30 AM
Wednesday, October 20, 3:00–4:30 PM
Wednesday, October 27, 10:30 AM–12:00 PM
Thursday, October 28, 3:00–4:30 PM

Contacts:

Peggy Liggit
Peggy.Liggit@emich.edu
734-487-0199
Director of Academic Assessment (I)
Office of Institutional Effectiveness, 234 McKenny

Lisa Klopfer
Lisa.Klopfer@emich.edu
734-487-0020, ext. 2114
Director, Bruce K. Nelson Faculty Development Center (I)
109 Bruce T. Halle Library Building
Table of Contents
Workshop Goals
Project Timeline
Notes and Encouragement from Peggy
Overview of Embedded Assessment
Plan for Program-Level Assessment of Student Learning
Methodology Plan Templates
Self-Evaluation Questions: Program-Level Assessment Plans for Student Learning
Submitting Your Program-Level Assessment Plan
Electronic Resources
Appendix: Supplementary Material
WORKSHOP GOALS
Welcome to a new academic year. Remember that the purpose of this project is to share with others your genuine interest in evaluating student performance as a means to improve the learning experience for the students enrolled in the programs associated with your department or school. The workshop facilitators and their office teams are here to support your efforts, particularly:

- forming and implementing plans for assessing student learning, and
- successfully reporting assessment plans and activities within the Academic Review and Continuous Improvement process for 2010-2011.

During this workshop you will practice:

- comparing and contrasting multiple methods for writing effective program-level assessment plans,
- writing student learning outcomes (or revising outcomes that are already written),
- curriculum mapping, that is, mapping program outcomes to learning experiences conducted throughout the curriculum, and
- writing a methodology plan that describes how the academic program is going to improve student learning for at least one outcome in your program.

Additional activities that will be covered:

- describing how embedded assessment can be used to identify student misconceptions, and
- explaining how, when the embedded assessment process is written as a narrative (the Teach, Assess, Analyze, and Adjust steps), that narrative can serve as one example of an accountability record documenting how you are improving student learning in the program area.
IMPORTANT NOTE: This methodology plan is due by October 30, 2010.
PROJECT TIMELINE

October 30, 2010: Assessment plans due for Departments/Schools completing Annual Planning
November 12 and December 3, 2010: Internal Review Retreats
January 14, 2011: Assessment plans for Departments/Schools completing Full Review due for submission to the College Deans
January 31, 2011: Deans' comments submitted
February 1, 2011: Campus Comment opened
February 11 and February 18, 2011: Internal Review Retreats
March 1 – May 2, 2011: Roundtable Discussions (scheduled on Fridays, due to limited space)
NOTES AND ENCOURAGEMENT FROM PEGGY
“The connections made by good teachers are held not in their methods but in their hearts – meaning heart in its ancient sense,
as a place where intellect and emotion and spirit and will converge in the human self.” --Parker Palmer, Courage to Teach.
After analyzing over 100 hours of faculty interviews in the last year, it is clear to me that faculty at our institution teach
with heart.
“You should never worry about your good ideas being stolen in educational reform, because even when people are sincerely
motivated to learn from you, they have a devil of a time doing so.” --Michael Fullan, Change Forces: The Sequel
Why There Are Few "Good" Assessment Models:
- Michael Fullan (1999, 2006): innovative ideas are "difficult to disseminate and replicate."
- They are not easily transferable; it is difficult to capture the "subtleties of the reform practice."
- Attempts to replicate another's model often end up "replicating the wrong thing": "the reform itself, instead of the conditions which spawned the success."
- Problems of scale: what works on a small scale may not work well on a wider scale.

After returning from the Higher Learning Commission Conference and reading many articles in the current assessment literature, I have found no assessment model/template that one can simply take and fill in to meet the needs of an individual program. The best plan is to look at several examples of assessment models/templates and modify/revise what can work in your own discipline.
I suggest beginning this work with what you already know and practice…
EMBEDDED ASSESSMENT
Take a Moment Here:
Reflect and write about a misconception/problem in student learning that you did something about.
What/how were you teaching at the time?
What activity were you doing to determine students' thinking/reasoning?
What did students understand/not understand about what you were teaching?
What did you do to help students better understand?
You have just documented what was, presumably, an unconsciously competent activity.
Teach + Assess + Analyze + Adjust = Embedded Assessment
PLAN FOR PROGRAM-LEVEL ASSESSMENT OF STUDENT LEARNING*
1. Generate Program-Level Student Learning Outcomes: At the end of the program, students will be able to <<action verb>> <<something>>.

2. Identify Learning Activities Associated with Program Outcomes: Describe what student performance of an outcome looks like. Identify learning activities, or create them if none exist, to provide opportunities for students to move closer toward mastery of the program's outcomes. How well a student performs on a particular outcome can be demonstrated through a learning object (or artifact): for example, exam questions, projects, research papers, presentations, portfolios, and exhibitions directly offer evidence of students' performance in relation to a particular outcome. Some type of "assessment instrument" to evaluate how well students master the learning activity accompanies the learning object, such as a scoring guideline, rubric, or answer key.

3. Curriculum Mapping: Identify courses spanning the curriculum that provide learning opportunities for students to achieve program outcomes. Students do not master an outcome with one or two experiences; they need multiple opportunities from the beginning to the end of their program, with increasing difficulty and challenge as they developmentally advance.

4. Capture Student Performance of Outcomes: Collect information about student work from learning objects (artifacts) associated with particular program outcomes, so you can learn about students' progress with those outcomes. How you learn about student performance depends on the kinds of artifacts you collect. For instance, if you test students on important concepts, you might look at their test scores; if you ask students to create a performance, write a report, etc., you might use those artifacts. Whatever you choose, make sure that:
   a) it represents learning outcomes that are important for your discipline; and
   b) you have a way to assess the artifacts that you collect. If you are looking at test scores, you have the scores themselves; however, if you are looking at something like a paper, portfolio, or performance, you will need to create a method to analyze these artifacts.
It is not necessary to collect data in every course every semester. For each program outcome, it is reasonable to collect 3-5 scores over the number of years a student is in the program, documenting that students have multiple opportunities that bring them closer to "mastering" each outcome.

5. Interpret Data: Analyze trends and patterns in student performance to determine to what extent students are achieving program outcomes. What do you see that could be improved upon with regard to students learning these outcomes?

6. Take Action to Improve Your Program: If students are not meeting your program outcomes to the extent or in the manner you intended, what changes in the program need to be made? Implement an improvement plan; such actions might include, but are not limited to: improving advising, implementing prerequisites, changing the curriculum, mentoring faculty to improve instructional delivery or revise learning activities and objects, acquiring different or updated program resources and technologies, and creating community partnerships. Repeat steps 4-6 as necessary.

7. Share Your Progress with Program Stakeholders: Our internal and external communities want to know to what extent students are learning and mastering the content, skills, and attitudes of the disciplines that constitute your programs.

8. Revisit, Revise, Repeat the steps of this process as often as necessary to improve student learning.

*Although this looks like a clean and regimented process written here as text, it is really a much more fluid and, at times, even "messy" process. Within this framework, every program must create an assessment system that works for it, with the understanding that this system will look a little different from program to program. To create such a system requires meaningful dialog and collaboration, identification and prioritization of our educational values, and the understanding and patience that this is a dynamic and human process.
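As a concrete, hypothetical illustration of steps 4 and 5, the handful of scores collected for each outcome over several years can be tallied with a short script. All outcome names, courses, years, and rubric scores below are invented for illustration; they are not from any real program.

```python
# Hypothetical illustration of steps 4-5: tallying the handful of artifact
# scores collected for each outcome across courses and years.
from statistics import mean

# Each record: (outcome, course, year, rubric score on a 1-4 scale).
records = [
    ("Outcome 1", "INTRO 101", 2008, 2.1),
    ("Outcome 1", "MID 250",   2009, 2.8),
    ("Outcome 1", "CAP 400",   2010, 3.4),
    ("Outcome 2", "INTRO 101", 2008, 1.9),
    ("Outcome 2", "CAP 400",   2010, 2.5),
]

# Group the (year, score) pairs by outcome.
by_outcome = {}
for outcome, course, year, score in records:
    by_outcome.setdefault(outcome, []).append((year, score))

# For each outcome, report the mean score and the change from the earliest
# to the latest measurement -- a simple "pattern of consistency" over time.
summary = {}
for outcome, pairs in sorted(by_outcome.items()):
    pairs.sort()  # chronological order
    scores = [s for _, s in pairs]
    summary[outcome] = (round(mean(scores), 2), round(scores[-1] - scores[0], 2))
    print(outcome, summary[outcome])
```

Even a sketch this small makes the point of step 4b: once artifacts are scored consistently, spotting rising or flat outcomes is trivial; the hard work is agreeing on the rubric that produces the scores.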
TEMPLATES TO HELP YOU GET STARTED WITH YOUR METHODOLOGY PLAN
(Remember, these are only graphic organizers to help you with your thinking and planning. As we know, models rarely transfer from one program to the next, from one department to the next, or from one institution to the next. Take in the information that helps, leave behind what doesn't, and create your own model and plan.)
1. List of Program-Level Student Learning Outcomes:
When writing these outcomes consider the following:
– Write outcomes that are measurable and specify definite, observable behaviors. (Remember, data will be collected on these outcomes.)
• Students will <<action verb>> <<something>>.
– Program-level outcomes should also be written to communicate mastery and the highest levels of learning (see Bloom's Taxonomy). What skills, knowledge, or behaviors do you want students to master by the end of the program?
Outcome 1 –
Outcome 2 –
Outcome 3 –
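A rough, hypothetical sketch of the format advice above: a small helper that flags a draft outcome if it does not follow the "Students will be able to <<action verb>> <<something>>" pattern or does not open with a higher-level action verb. The verb list is a small sample drawn from the upper Bloom's levels, not the full taxonomy, and the function itself is an invented illustration, not part of any EMU tool.

```python
# Hypothetical helper for drafting outcomes. The verb list is a small
# sample from the Synthesis/Evaluation levels of Bloom's Taxonomy.
HIGHER_LEVEL_VERBS = {
    "analyze", "appraise", "construct", "create", "critique",
    "design", "evaluate", "formulate", "synthesize",
}
STEM = "students will be able to "

def check_outcome(outcome):
    """Return a list of warnings for a draft outcome (empty list = no flags)."""
    warnings = []
    text = outcome.strip().lower()
    if not text.startswith(STEM):
        warnings.append("does not follow the 'Students will be able to ...' format")
        return warnings
    words = text[len(STEM):].split()
    verb = words[0] if words else ""
    if verb not in HIGHER_LEVEL_VERBS:
        warnings.append("leading verb %r is not in the higher-level sample list" % verb)
    return warnings

# The design-oriented example from this workbook passes; a vaguer draft is flagged.
ok = check_outcome("Students will be able to design solutions based upon customer requirements.")
vague = check_outcome("Students will understand key concepts.")
```

No script can judge whether an outcome is meaningful for your discipline; a check like this only catches format slips before faculty discussion, which is where the real agreement on wording happens.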
2. List of assessments, also known as learning objects, from course or program assignments or exams used to evaluate student
learning outcomes.
Outcome 1: Assessment/learning object
Outcome 2: Assessment/learning object
3. Alignment of course and program assessments with student learning outcomes – Curriculum Map
List of Student Learning Outcomes (Students can:):
1.
2.
3.
4.
5.
6.
7.

Curriculum map grid: list courses/assignments in the rows and Student Learning Outcomes 1–7 in the columns, then mark each cell where a course or assignment gives students practice with that outcome.
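One way to sanity-check a curriculum map is to record it as data and look for unsupported outcomes. The sketch below is hypothetical: the course names and the mapping are invented for illustration, and the grid rendering simply mirrors the paper template above.

```python
# Hypothetical curriculum map: which courses give students practice with
# which program-level outcomes. Courses and outcome numbers are invented.
curriculum_map = {
    "INTRO 101": {1, 2},
    "MID 250":   {2, 3, 4},
    "CAP 400":   {1, 3, 4},
}
outcomes = range(1, 5)

# Render the map as a simple text grid ("X" = outcome addressed).
header = "Course/Assignment".ljust(18) + " ".join(str(o) for o in outcomes)
rows = [header]
for course, hits in curriculum_map.items():
    cells = " ".join("X" if o in hits else "." for o in outcomes)
    rows.append(course.ljust(18) + cells)
print("\n".join(rows))

# Flag any outcome no course supports -- a "blind spot" in the map.
covered = set().union(*curriculum_map.values())
gaps = [o for o in outcomes if o not in covered]
```

If `gaps` is non-empty, the map has revealed an outcome that students never get to practice, which is exactly the kind of blind spot the self-evaluation questions later in this workbook ask about.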
4. Design a methodology plan to show evidence that students are meeting the outcomes (What data will be collected, how often will
it be collected, how will it be summarized and analyzed?)
Suggestion: Design a small pilot plan you can implement and manage throughout 2010/2011. Start with one or two outcomes and assess them in beginning, mid-program, and capstone classes. Ask questions about student learning: What do students struggle with, and what can you test to see what is helping? Collect data to answer your questions. What worked, and what will you do next?
5. Use of assessment results to improve the program: In this section, present evidence that assessment results have been analyzed and have been used (or will be used) to improve student performance and strengthen the program. This description should not link improvements to individual assessments; rather, it should summarize principal findings from the evidence, the faculty's interpretations of those findings, and changes made in (or planned for) the program as a result. Describe the steps program faculty have taken to use information from assessment to improve both student performance and the program.
EMBEDDED ASSESSMENT METHODOLOGY FOR PROGRAM-LEVEL ASSESSMENT (ANNUAL PLANNING)
Teach/Implement - (What activity/concept/skill are you going to be teaching/implementing?)
Assess – (What activity/exam/project are you going to have students do to determine their thinking/reasoning/performance?)
Analyze – (What are you going to look for to understand what students know and don't know about the concept/skill/task/idea/reasoning they are supposed to be learning?)
Adjust – (What did you do to improve student learning based on your findings?)
SELF-EVALUATION QUESTIONS – PROGRAM-LEVEL ASSESSMENT PLANS FOR STUDENT LEARNING
ORGANIZATION AND FORMAT

Who is responsible for the assessment plan in our program?
- Effective: Assessment is a collective effort involving both tenured and non-tenured faculty, with clear direction and leadership by a "point person" or small group.
- Somewhat effective: A large portion of faculty participate in the assessment process, although some are not yet included; direction and leadership may be divided among too many individuals or groups to be effective.
- Ineffective: Little to no participation from faculty; no directing "point person," and roles and responsibilities are not clearly defined.
Do we use a common language in our assessment plan?
- Effective: Program faculty have agreed on common terminology, and the program's terminology coincides with that of the University as much as possible.
- Somewhat effective: For the most part, there is agreement on terminology among faculty, but inconsistencies remain, and there is little agreement on terminology with the University as a whole.
- Ineffective: Essentially no agreement on terminology at any institutional level; numerous different terminologies and standards of measure are employed.
How many Student Learning Outcomes are included in our assessment program?
- Effective: A short list, of appropriate length for your program, of clearly defined program-level student learning outcomes.
- Somewhat effective: The list is of manageable size, but not all SLOs are necessary (e.g., some similar SLOs could be rephrased and combined into a single outcome, increasing the efficiency of the assessment process).
- Ineffective: Excessively complex outcomes; a "laundry list" of numerous program outcomes; numerous SLOs overlap with each other and are often redundant.
For similar programs that are grouped together for convenience of reporting: Did we include at least one unique outcome to distinguish one program from another?
- Effective: At least one and, ideally, more than one unique outcome has been included for each program.
- Somewhat effective: Some programs in the group have unique learning outcomes, but not all.
- Ineffective: No unique outcomes to distinguish programs; all programs are covered by a general set of outcomes.
Are our Student Learning Outcomes written in an appropriate format?
- Effective: SLOs address issues that are concise and measurable through the format "Students will be able to <<action verb>> <<something>>" (for example, "Students will be able to design solutions based upon customer requirements"); SLOs assess advanced skills of students graduating from the program; program faculty understand and agree upon the definition of each written outcome.
- Somewhat effective: Complexity and measurability of outcomes are somewhat inconsistent (e.g., some are concise and clearly measurable, but others are excessively lengthy and/or lack empirical standards of measurement); some SLOs do not address the most advanced skills of students graduating from the program; although there is consensus on the majority of outcomes, faculty disagree over the definitions of some.
- Ineffective: SLOs are excessively complex, and issues are not measurable or are difficult or impossible to assess (for example, "Students will be able to identify, define, and analyze the major causes, effects, and implications of $150-a-barrel oil prices on the transportation, food, and housing industries"); SLOs assess basic understandings that develop early in the curriculum; little to no agreement among program faculty on the definitions of written outcomes.
Are our outcomes supported by core courses?
- Effective: Core curriculum courses are designed to ensure students are given the opportunity to develop competence in program-level SLOs; faculty have systematically examined the curriculum and created a "curriculum map" to identify inadequately supported outcomes, areas of overlap, or overlooked outcomes.
- Somewhat effective: Some outcomes are not adequately supported by the program's core courses, but faculty are aware of these blind spots thanks to curriculum mapping.
- Ineffective: Some or all of the SLOs are unsupported by the program's core curriculum courses; no "curriculum map" has been created, and there is little awareness of the relation between the core curriculum and the SLOs.
DATA COLLECTION, METHODOLOGY, AND ANALYSIS

Does our assessment plan rely on direct measures of student learning?
- Effective: Assessment is based upon direct examination of work completed by students, either formatively or summatively; possible vehicles include essays, problem sets, exams, etc., with the work itself directly tied to one or more SLOs.
- Somewhat effective: The majority of assessment is based upon direct examination of student work, although conjecture and opinion surveys are sometimes used to shore up the gaps.
- Ineffective: Assessment depends upon indirect measures of student outcomes, such as surveys asking students how much they have learned, or extrapolation from student satisfaction data.
How do we assess, collect, and organize data?
- Effective: Assessment methods are chosen to address the specific characteristics of the program; assessment plans are implemented in a systematic, ongoing manner; faculty regularly meet and exchange student work for assessment.
- Somewhat effective: Data collection, assessment, and analysis use a hybrid of specific and nonspecific methods; there is at least some degree of planning prior to undertaking assessment; faculty meetings for the exchange of student work do take place, although they may not include a large enough portion of the faculty or occur with enough regularity.
- Ineffective: Data is collected and assessed primarily from nonspecific sources, such as standardized tests; assessment plans are implemented "at the last minute," shortly before an accreditation or re-accreditation visit; little to no inter-faculty exchange of student work.
What do we do with our data after it has been collected?
- Effective: Data is retained and studied over time to identify different patterns of evidence, including "patterns of consistency" (developed by studying data from the same outcome over a period of time) and "patterns of consensus" (developed by disaggregating data to compare achievement levels).
- Somewhat effective: Either data is retained over time but not enough is done to analyze and act on the information, or data is analyzed upon collection but not retained for comparison with future data to identify patterns.
- Ineffective: Little to no action is taken after data collection; data is stored and forgotten; little to no analysis of data to identify trends over time.
How have we used our collected data to improve learning and teaching?
- Effective: Faculty, staff, and student development activities (such as workshops, presentations, and discussions) have been organized to ensure that everyone in the program fully understands the program assessment plan; when necessary and appropriate, program policies and procedures are revised to take into account the findings of assessment (e.g., revising the criteria for admission to the program); the program curriculum is revised to address any gaps or deficiencies that may have become apparent during assessment.
- Somewhat effective: Some development activities have been prepared, but they may not include a large enough portion of the program's faculty, staff, and student populations; some program policies and procedures have been revised; the program curriculum has been revised, but some gaps may remain unaddressed.
- Ineffective: There has been little to no preparation of development activities for the program's faculty, students, and staff; little to no revision of program policies and procedures to take into account findings of assessment; numerous gaps in the program's curriculum remain largely unaddressed.
SUBMITTING YOUR PROGRAM-LEVEL ASSESSMENT PLAN
In my.emich, go to the “Reports” tab. Under
“IRIM Reports,” select “Academic Review and
Continuous Improvement.” After entering your
my.emich ID and password, you will be taken to
the Modules page.
Select the appropriate Department or School from
the drop-down menu at the top of the page, then
click the “Switch Dept/Schl” button. If your
department/school is in Full Review, click on
“Review Module.” If your department/school is in
Annual Planning, click on “Planning Module.”
(Screenshots: the my.emich "Reports" tab and the Modules page.)
SUBMITTING PLANS IN FULL REVIEW
Once you have clicked on the Review Module, you will be taken to
the Department/School Information page. From there, you can
use the link on the left-hand sidebar to navigate to the Program
Review Page. Be sure that the box at the bottom of this first page
– the one that acknowledges that the list of programs is correct –
is checked; otherwise you will not be able to move forward. Make
sure to click “Save” after checking the box.
Use the “Save,” “Save and Continue,” and “Continue” buttons at the
bottom of each page to navigate your way through the various
pages.
Navigating away from a page without clicking “Save” or “Save and
Continue” will erase any information you have entered.
(Screenshots: the Department/School Information page and the Program Review page.)
Selecting “Program Review” will first take you to
the “Program Overview” section.
From there, you can use the left-hand
sidebar to navigate (make sure to save
each time you navigate to a new page!)
through the different sections of Program
Review.
“Summary of Past Activities/Improvement
Status” takes you to the page where you
can enter information on the status of past
program improvement efforts.
“Assurance of Quality (Assessment of
Student Learning Outcomes)” takes you to
the page where you can submit
information on your program-level Student
Learning Outcomes.
Always remember to Save!
SUBMITTING PLANS IN ANNUAL PLANNING
After selecting the “Planning Module” you will be prompted to select the year
for planning. Select the current year in the annual planning cycle and click
“Continue.”
This will take you to the “Department/School Information” Page.
Click “Program Planning” on the left-hand sidebar and select the
appropriate program from the list.
This will take you to the page where you can submit your
program’s Annual Planning information on Student Learning
Outcomes.
Before navigating away from this page, or any others, be
sure to click either “Save” or “Save and Continue” at the
bottom of the page.
ELECTRONIC RESOURCES
Electronic copies of the resources used in this workbook (listed in the bibliography), along with further information on the assessment process, including articles, sample assessment plans, and links to helpful outside resources can be found on the Student Learning Outcomes Assessment site on eFolio. To access the site, follow these steps:
1. Go to http://tinyurl.com/emuassess. This will take you to the main page for Student Learning Outcomes Assessment, where you can find guides and resources to help you with assessment, as well as a schedule of Assessment Plan Workshops.
2. On the left-hand sidebar, you will see several links underneath “Student Learning Outcomes Assessment.”
“Resource Library” – this includes links to electronic copies to articles/resources that were used in the development of this workbook, and that may be helpful to you during your assessment process.
“Samples” – here you can find sample assessment plans from programs that have previously undergone assessment. These illustrate a variety of different approaches to the assessment process.
“Finding and Fixing Misconceptions” – this section houses resources and a bibliography from a different workshop, but feel free to look! There are some interesting articles.
3. In order to access these pages, you may be prompted to log in. If so, use the following information: Username: SLO Password: SLO
Be aware that both the username and password are case sensitive.
APPENDIX: SUPPLEMENTARY MATERIAL
Bloom’s Taxonomy
“Assessing Your Program-Level Assessment Plan,” Susan Hatfield
Bloom's Taxonomy: Action Verbs by Level

Knowledge: Cite, Count, Define, Draw, Identify, List, Name, Point, Quote, Read, Recite, Record, Repeat, Select, State, Tabulate, Tell, Trace, Underline

Comprehension: Associate, Classify, Compare, Compute, Contrast, Differentiate, Discuss, Distinguish, Estimate, Explain, Express, Extrapolate, Interpolate, Locate, Predict, Report, Restate, Review, Tell, Translate

Application: Apply, Calculate, Classify, Demonstrate, Determine, Dramatize, Employ, Examine, Illustrate, Interpret, Locate, Operate, Order, Practice, Report, Restructure, Schedule, Sketch, Solve, Translate, Use, Write

Analysis: Analyze, Appraise, Calculate, Categorize, Classify, Compare, Debate, Diagram, Differentiate, Distinguish, Examine, Experiment, Inspect, Inventory, Question, Separate, Summarize, Test

Synthesis: Arrange, Assemble, Collect, Compose, Construct, Create, Design, Formulate, Integrate, Manage, Organize, Plan, Prepare, Prescribe, Produce, Propose, Specify, Synthesize, Write

Evaluation: Appraise, Assess, Choose, Compare, Criticize, Determine, Estimate, Evaluate, Grade, Judge, Measure, Rank, Rate, Recommend, Revise, Score, Select, Standardize, Test, Validate
Consider using the revised ("new") Bloom's Taxonomy, whose levels are Remember, Understand, Apply, Analyze, Evaluate, and Create.