
Jim Julius
SDSU Course Design Institute
May 27, 2009

Guiding Questions

Why collect formative feedback on course design?

How should one decide what kind of feedback to seek?

What tools are available to collect feedback?

What do I do with the data?

What (and why) are you measuring?

Outcomes tell you what you got, not how or why.

Looking at inputs and processes supports seeking continuous improvement and approaching course design from an inquiry mindset.

Outcomes
Satisfaction
Retention
Success
Achievement
External proficiencies
Real-world performance

Inputs
Learner characteristics
Context
Design
Learning resources
Faculty development

Processes
Pedagogies
Presentation media
Assignments/assessments
Student use of technologies
Community of Inquiry model (social, cognitive, teaching presence)
Interactions (content, peers, instructor, technology itself)

Community of Inquiry Model

CoI - Interactions

Narrowing Your Inquiry

Do you want to evaluate your course according to "best practices", i.e. standard course design quality criteria?

Do you want to know more about your learners in general: needs, preferences, motivation, satisfaction?

Do you want to focus on student achievement?

Do you want feedback on your facilitation of learning?

Do you want feedback on specific course elements and/or technologies?

Course Design Quality Criteria

Chico rubric
Quality Matters
Rubrics related to Chickering and Gamson's "7 Principles for Good Practice in Undergraduate Education": from Indiana University (2001) and from VCU (2009)
Paid tool: Flashlight

Learning about Learners

Direct:
Learning styles surveys
Parallel faculty-student surveys:
- ELI student and faculty surveys
- SDSU's LRS faculty and student surveys, adapted from LITRE (NC State)
- Distance Education Learning Environment faculty and student surveys

Indirect:
National and institutional data (aggregate)
Institutional data (for your learners)
LMS data

Student Achievement

Direct:
Low-stakes checks: muddiest point, minute papers, clickers, discussion boards
Pre- and post-tests

Indirect:
Grade data
Attendance/participation
Outcome comparisons: different technology/pedagogy and same outcome, or same technology/pedagogy and different outcomes (a sketch follows below)
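
As a concrete sketch of the outcome-comparison idea, the snippet below tallies letter-grade distributions for two parallel sections so they can be compared side by side. It is illustrative only: the file name grades.csv and its "section" and "grade" columns are assumptions, not anything from this presentation.

    import csv
    from collections import Counter, defaultdict

    # Hypothetical roster export: one row per student, with a "section"
    # column (e.g., "traditional" or "blended") and a letter "grade" column.
    by_section = defaultdict(Counter)
    with open("grades.csv", newline="") as f:
        for row in csv.DictReader(f):
            by_section[row["section"]][row["grade"].strip().upper()] += 1

    # Report the percentage of students in each grade category per section.
    for section, counts in sorted(by_section.items()):
        total = sum(counts.values())
        print(section)
        for grade in "ABCDF":
            pct = 100 * counts[grade] / total if total else 0.0
            print(f"  {grade}: {pct:.1f}%")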

Teacher Behaviors/Overall

Direct:
Community of Inquiry Survey
Small Group Analysis
Mid-semester surveys
End of course evaluations
Assessing online facilitation
Paid: IDEA survey of student ratings of instruction

Indirect:
Observation Protocols

Course Elements

Direct:
Student Assessment of Learning Gains (SALG)
Clicker opinions survey

Indirect:
Examine usage data from Blackboard (a sketch follows below)
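
For the indirect option, most LMSs (including Blackboard) can export course activity logs. Below is a minimal sketch of summarizing such an export; the file name course_activity.csv and its "user" and "tool" columns are hypothetical, since export formats vary by system and version.

    import csv
    from collections import Counter, defaultdict

    # Hypothetical activity export: one row per logged event, with a "user"
    # column and a "tool" column (discussion board, quiz, content area, ...).
    events = Counter()
    users = defaultdict(set)
    with open("course_activity.csv", newline="") as f:
        for row in csv.DictReader(f):
            events[row["tool"]] += 1
            users[row["tool"]].add(row["user"])

    # Which course elements are actually used, and by how many students?
    for tool, n in events.most_common():
        print(f"{tool}: {n} events from {len(users[tool])} distinct students")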

Data from M. Laumakis

pICT fellow in 2005

Began teaching parallel 500-student sections of PSYCH 101 in 2006, one traditional and one hybrid

First fully online PSYCH 101, Summer 2008

Evaluating the Face-to-Face Class

Evaluated Fall 2005 innovations via the Student Assessment of Learning Gains (SALG)

How much did the following aspects of the class help your learning?

Rated from 1 (no help) to 5 (great help)

Evaluating the Face-to-Face Class: What did the data show?

Question                   MWF Section   TTH Section
ConceptCheck Questions     4.1           4.1
Discussion Boards          2.9           3.1

Evaluation Findings: IDEA Diagnostic Survey

                         Fall 2006   Fall 2006     Spring 2007   Spring 2007
                         Blended     Traditional   Blended       Traditional
Progress on objectives   70          73            77            77
Excellent teacher        65          68            69            68
Excellent course         62          72            73            71

Note: Top 10% = 63 or more

Evaluation Findings: Departmental Course Evaluations

Evaluation Findings: Course Grades, Fall 2007

[Bar chart: Fall 2007 course grades, showing the percentage of students in each grade category (A, B, C, D, F) for the Blended and Traditional sections]

Clicker Data: Spring 2007 (% who agree or strongly agree)

Class clicker usage makes me more likely to attend class: 93%
Class clicker usage helps me to feel more involved in class: 84%
Class clicker usage makes it more likely for me to respond to a question from the professor: 91%
I understand why my professor is using clickers in this course: 90%
My professor asks clicker questions which are important to my learning: 90%

Summer 2008 Fully Online: SALG Data

How much did the following aspects of the class help your learning?

Rated from 1 (no help) to 5 (great help)

Summer 2008 Fully Online: SALG Data

Question                                                      Summer 2008 Online
Taking the test online                                        4.27
Discussion Forums                                             3.00
Introduction e-mail that explained the basics of the course   4.50

SALG Data over time

Question                                       Fall 2007  Fall 2007  Spring 2008  Spring 2008  Summer 2008
                                               Blended    F2F        Blended      F2F          Online
Questions, answers, and discussions in class   3.96       4.04       4.10         4.01         4.36
Live online class sessions                     3.39       n/a        4.20         n/a          4.15
Archives of live online class sessions         4.15       n/a        4.50         n/a          4.44
Quality of contact with the teacher            3.41       3.48       3.94         3.90         4.26
Working with peers outside of class/online     3.12       3.22       3.31         3.39         3.82

(n/a: item not applicable to that section format)

Summer 2008: Community of Inquiry Survey

Statements rated from 1 (strongly disagree) to 5 (strongly agree)

Based on the Community of Inquiry framework's three elements:
1. Social Presence
2. Cognitive Presence
3. Teaching Presence

Summer 2008: Community of Inquiry Survey

CoI Dimension                 Student Rating
Social Presence               3.94
  Affective Expression        3.56
  Open Communication          4.29
  Group Cohesion              3.97
Cognitive Presence            3.96
  Triggering Event            3.91
  Exploration                 3.73
  Integration                 4.09
  Resolution                  4.10
Teaching Presence             4.38
  Design and Organization     4.50
  Facilitation                4.38
  Direct Instruction          4.23
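
A minimal sketch of how item-level ratings roll up into subscale and dimension means like those above, assuming each dimension's mean pools all ratings of its subscales. The item-to-subscale mapping and the sample responses are illustrative stand-ins, not the actual CoI instrument data.

    from statistics import mean

    # Illustrative grouping of subscales under each CoI dimension.
    DIMENSIONS = {
        "Social Presence": ["Affective Expression", "Open Communication",
                            "Group Cohesion"],
        "Cognitive Presence": ["Triggering Event", "Exploration",
                               "Integration", "Resolution"],
        "Teaching Presence": ["Design and Organization", "Facilitation",
                              "Direct Instruction"],
    }

    # Hypothetical pooled 1-5 ratings per subscale (across students and items).
    responses = {
        "Affective Expression": [3, 4, 4],
        "Open Communication": [4, 5, 4],
        "Group Cohesion": [4, 4, 4],
        "Triggering Event": [4, 4, 4],
        "Exploration": [4, 3, 4],
        "Integration": [4, 4, 4],
        "Resolution": [4, 4, 5],
        "Design and Organization": [5, 4, 5],
        "Facilitation": [4, 5, 4],
        "Direct Instruction": [4, 4, 4],
    }

    # A dimension's mean pools every rating from its subscales.
    for dimension, subscales in DIMENSIONS.items():
        pooled = [r for s in subscales for r in responses[s]]
        print(f"{dimension}: {mean(pooled):.2f}")
        for s in subscales:
            print(f"  {s}: {mean(responses[s]):.2f}")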

So, what would you like to further explore?
