
Page 1: Developing Academic Program Assessment Plans UAA Faculty Senate Academic Assessment Committee 1


Developing Academic Program Assessment Plans

UAA Faculty Senate Academic Assessment Committee

Page 2

What is an assessment plan?

• Start with your program student learning outcomes – this is what needs to be assessed

• Decide which evaluation tools you will use for each outcome and how often you'll collect the data – this is your assessment plan

• Collect the data, then get together and figure out what the results mean and what you want to do about them – this is your assessment

Page 3

Good assessment plan characteristics

• Is your process systematic (as opposed to ad hoc)?

• Is it sustainable? Can it continue to run if the person in charge leaves?

• Is it robust, or is it met with faculty apathy?

Page 4

Where to start

• Program goals/mission statement
• Student learning outcomes answer the question "what should students be able to do upon completion of your program?"
– SLOs relate to the knowledge and skills that students acquire as they progress through your program
– If you are externally accredited, these have probably been given to you

Page 5

Ways of gathering assessment data

Formative vs. Summative

Formative – undertaken while student learning is taking place; its purpose is to improve teaching and learning; designed to capture students' progress

Summative – obtained at the end of a course or program; its purpose is to document student learning; designed to capture students' achievement at the end of their program of study

Direct vs. Indirect

Direct – evidence of student learning that is tangible, visible, and self-explanatory. Examples: performances, creations, results of research, responses to questions or prompts

Indirect – evidence that provides signs that students are learning, but exactly what they are learning is less clear and convincing. Examples: student satisfaction surveys, alumni surveys

Source: http://assessment.uconn.edu/primer/how1.html

Page 6

Direct and indirect assessment

Direct assessment methods
• Published/standardized tests
• Locally-developed tests
• Embedded assignments and course activities
• Competence interviews/practica
• Portfolios

Indirect assessment methods
• Surveys
• Interviews
• Focus groups
• Reflective essays

Source: http://assessment.uconn.edu/primer/how1.html

Page 7

Commonly used tools at UAA

• Embedded course-level assessment
• Standardized tests (if your discipline has one)
• Alumni/employer surveys
• Professional portfolios/e-portfolios
• Field instructor assessments/practical examinations

Source: www.uaa.alaska.edu

Page 8

When forming your plan

• Capitalize on what you are already doing
• More data are not necessarily better
– Do not try to assess every course every semester
– It is generally not considered good practice to try to assess every outcome every year
• Don't wait for perfection – it generally takes 2-3 full assessment cycles to get your process nailed down

Page 9

When and how to collect data

• Student learning is cumulative over time
– What students learn in one course, they use, practice and develop in other courses
• We are not assessing individual students, faculty or courses; we are evaluating programs
• Data collection in program assessment should focus on the cumulative effect of student learning
• With this in mind, you can determine
– When to collect data, and how often
– From whom to collect data
– How to interpret results

Source: ABET Advanced Program Assessment Workshop

Page 10

Course-level assessment

• Course-embedded assessments that look at actual work produced by students in our courses
• May be separate from graded work in the course (but often are not)
• The purpose is to assess the particular learning outcome, not to grade the student (although this work can contribute to the student's grade)
• Each student may be evaluated by assigning a grade, but can additionally be evaluated for the purpose of assessing the outcome

Source: www.mbaschoolauthority.com

Page 11

Page 12

Performance Indicators

• Not required, but considered a best practice
• PIs are specific, measurable statements identifying the student performance(s) required to meet the SLO, confirmable through evidence
• Three characteristics of good PIs
– Subject content that is the focus of instruction
– One action verb (indicates level, e.g. Bloom's taxonomy)
– Value free (don't use descriptors like "few" or "many") – we can add value by creating rubrics
• Rule of thumb: each SLO should have no fewer than 2 PIs and no more than 4

Page 13

Example: writing PIs

• SLO: an ability to communicate effectively
– Communicates information in a logical, well-organized manner
– Uses graphics effectively to illustrate concepts
– Presents material that is factually correct, supported with evidence, explained in sufficient detail and properly documented
– Listens and responds appropriately to questions (for oral communication)

Source: UAA ME Department

Page 14

Creating rubrics for PIs

Outcome g: an ability to communicate effectively

1. Communicates information in a logical, well-organized manner
– Poor: Communication is particularly poorly organized, or grammar and usage is particularly poor
– Developing: Organization of communication is limited
– Satisfactory: Communicates information in a way that is satisfactorily well-organized
– Excellent: Communicates information in an exceptionally well-organized manner

2. Uses graphics effectively to illustrate concepts
– Poor: Does not attempt to clarify ideas with graphics, or graphics are inappropriate to the idea being expressed
– Developing: Limited attempts to clarify ideas with graphics, or graphics are of limited effectiveness
– Satisfactory: Makes satisfactory use of graphics to illustrate concepts
– Excellent: Makes exceptional use of graphics to illustrate concepts

3. Presents material that is factually correct, supported with evidence, explained in sufficient detail and properly documented
– Poor: Much of the material presented is factually incorrect, poorly supported and/or documented incorrectly
– Developing: Some of the material presented is factually incorrect, poorly supported and/or documented incorrectly
– Satisfactory: Factually correct material is satisfactorily supported with evidence, explained in sufficient detail and properly documented
– Excellent: Factually correct material is supported with an exceptional amount of evidence or explained particularly well

4. Listens and responds appropriately to questions (for oral communication)
– Poor: Does not respond to questions appropriately or does not listen to questions
– Developing: Makes limited attempts to respond to questions
– Satisfactory: Provides satisfactory response to questions
– Excellent: Provides exceptional response to questions

Source: UAA ME Department
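A rubric like the one above is easy to capture as a plain data structure, so every assessor scores against the same descriptors. A minimal sketch in Python (the structure and helper function are illustrative, not part of the UAA materials; descriptors abbreviated):

```python
# Achievement levels in ascending order, as used on the slide.
LEVELS = ["Poor", "Developing", "Satisfactory", "Excellent"]

# Map each performance indicator to its four level descriptors
# (abbreviated here; the full wording appears in the rubric above).
rubric = {
    "Communicates information in a logical, well-organized manner": [
        "particularly poorly organized",
        "organization is limited",
        "satisfactorily well-organized",
        "exceptionally well-organized",
    ],
    "Uses graphics effectively to illustrate concepts": [
        "no or inappropriate graphics",
        "graphics of limited effectiveness",
        "satisfactory use of graphics",
        "exceptional use of graphics",
    ],
}

def level_score(level: str) -> int:
    """Convert a level name to the 1-4 score used in the later tally slides."""
    return LEVELS.index(level) + 1

print(level_score("Satisfactory"))  # 3
```

Keeping the descriptors in one shared structure is what makes the later "counts per level" tallies comparable across courses and assessors.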

Page 15

Streamlining the process

– Course
– Semester, Year
– Course Outcome
– Criterion
– Evidence: not too much; substantive; rubric-based; counts (not means)

Source: James Allert, Department of Computer Science, University of Minnesota Duluth

Page 16

Course title: ME 313   Instructor: Brock
Number of students: 24   Semester: Spring 2012

Outcome e: an ability to identify, formulate, and solve engineering problems

1. Identifies relevant known and unknown factors
– Poor: Does not demonstrate understanding of known and unknown factors
– Developing: Demonstrates limited understanding of known and unknown factors
– Satisfactory: Identifies expected known and unknown factors
– Excellent: Demonstrates exceptional insight in identifying known and unknown factors

2. Provides appropriate analysis of elements of the solution
– Poor: Is unable to provide analysis of the problem
– Developing: Provides limited analysis of the problem
– Satisfactory: Provides satisfactory analysis of the problem
– Excellent: Provides analysis of the problem which exceeds expectations

3. Assesses the validity of the solution based on mathematical or engineering insight
– Poor: Makes no attempt to validate the solution, or validation method is completely incorrect
– Developing: Makes limited attempts to validate the solution
– Satisfactory: Assesses the validity of the solution using an appropriate technique
– Excellent: Uses multiple techniques to assess validity of solution

Number of Students Achieving this Level

PI | Assessment method | Poor (1) | Developing (2) | Satisfactory (3) | Excellent (4) | % Students scoring 3 or 4
1  | Project           | 1        | 2              | 4                | 5             | 75%
2  | Project           | 0        | 3              | 3                | 6             | 75%
3  | Project           | 0        | 5              | 6                | 1             | 58%

Direct Assessment Action: Students were assigned one of three design problems in which they were asked to optimize a thermodynamic cycle for either refrigeration or power generation. They worked in groups of two, and their project reports were assessed.

Comments and Proposed Improvement:

Source: UAA ME Department
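The percentages in the table above are simply the share of assessed work that scored Satisfactory or Excellent. A minimal sketch of that arithmetic (the function name is my own, not from the slides):

```python
def pct_satisfactory_or_better(counts):
    """counts = [poor, developing, satisfactory, excellent] tallies for one PI."""
    return round(100 * (counts[2] + counts[3]) / sum(counts))

# Tallies from the ME 313 example (12 two-person project reports):
print(pct_satisfactory_or_better([1, 2, 4, 5]))  # PI 1 -> 75
print(pct_satisfactory_or_better([0, 3, 3, 6]))  # PI 2 -> 75
print(pct_satisfactory_or_better([0, 5, 6, 1]))  # PI 3 -> 58
```

Note that the denominator is the number of assessed artifacts (12 group reports here), not the 24 enrolled students, which is why the counts in each row sum to 12.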

Page 17

Example results

Direct Assessment Measures

Course   | Measure              | PI Assessed | Attainment Level
ME A414  | Project report       | 1           | 100%
ME A414  | Project presentation | 2           | 88%
ME A414  | Project report       | 3           | 100%
ME A414  | Project presentation | 4           | 88%
ME A441  | Lab report           | 1           | 54%
ME A441  | Lab report           | 2           | 100%
ME A441  | Lab report           | 3           | 38%
ES A341L | Lab reports          | 1           | 100%
ES A341L | Lab reports          | 2           | 100%
ES A341L | Lab reports          | 3           | 100%
ME A438  | Final presentation   | 1-4         | 94%
ME A438  | Final report         | 1           | 100%
ME A438  | Final report         | 2           | 82%
ME A438  | Final report         | 3           | 100%

Indirect Assessment Measures

Measure            | Attainment Level
Senior exit survey | 100%

Overall attainment level (80/20 weight factor for direct vs. indirect measures): 91%

Source: UAA ME Department
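The 91% overall figure is an 80/20 weighted combination of the direct and indirect measures. A minimal sketch, assuming the individual direct measures are averaged with equal weight (the slide does not spell that step out):

```python
def overall_attainment(direct, indirect, direct_weight=0.8):
    """Combine direct and indirect attainment percentages with a weight factor."""
    avg_direct = sum(direct) / len(direct)
    avg_indirect = sum(indirect) / len(indirect)
    return round(direct_weight * avg_direct + (1 - direct_weight) * avg_indirect)

# Attainment levels from the slide above:
direct = [100, 88, 100, 88, 54, 100, 38, 100, 100, 100, 94, 100, 82, 100]
indirect = [100]  # senior exit survey
print(overall_attainment(direct, indirect))  # -> 91
```

Weighting direct evidence more heavily reflects the earlier point that direct measures are tangible and self-explanatory, while indirect measures (surveys) are less conclusive.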

Page 18

Example results

[Chart: Direct CLA results for Outcome g – percentage of students attaining each level (Unsatisfactory, Developing, Satisfactory, Excellent) on each of the four performance indicators]

[Chart: Overall student attainment of Outcome g, by performance indicator]
– Communicates in a well-organized manner: 85%
– Uses graphics effectively: 96%
– Presents factually correct material supported with documentation: 79%
– Listens and responds appropriately to questions (oral communication): 88%

Source: UAA ME Department

Page 19

Example results

Measures reported per outcome: CLA (direct), CLA (indirect), ME A438 Capstone Design, Senior Exit Survey, FE Exam; the final figure for each outcome is the overall attainment.

(a) an ability to apply knowledge of mathematics, science and engineering: 66%, 100%, 100%; overall 77%

(b) an ability to design and conduct experiments, as well as analyze and interpret data: 76%, 91%, 100%; overall 82%

(c) an ability to design a system, component, or process to meet desired needs within realistic constraints such as economic, environmental, social, political, ethical, health and safety, manufacturability, and sustainability: 75%, 88%, 78%; overall 82%

(d) an ability to function on multi-disciplinary teams: 90%, 100%; overall 92%

(e) an ability to identify, formulate, and solve engineering problems: 67%, 89%, 100%; overall 81%

(f) an understanding of professional and ethical responsibilities: 72%, 80%, 100%, 100%; overall 79%

(g) an ability to communicate effectively: 87%, 94%, 100%; overall 91%

(h) the broad education necessary to understand the impact of engineering solutions in a global, economic, environmental and societal context: 85%, 100%; overall 88%

(i) a recognition of the need for, and the ability to engage in, life-long learning: 59%, 100%; overall 67%

(j) a knowledge of contemporary issues: 75%, 100%; overall 80%

(k) an ability to use the techniques, skills, and modern engineering tools necessary for engineering practice: 87%, 93%, 89%; overall 89%

Source: UAA ME Department

Page 20

Available resources

• Upcoming workshops in the assessment series
– Norming Your Academic Assessment Rubrics: Friday, March 20, 10:30 – 11:30am, RH 303
– ePortfolios and Academic Assessment: Friday, April 3, 10:30 – 11:30am, RH 303
• UAA Academic Assessment Committee webpage: http://www.uaa.alaska.edu/governance/academic_assessment_committee/index.cfm
• UConn Assessment Primer, available at http://assessment.uconn.edu/primer/