
“Creating A More Educated Georgia”

Using What You Have: Observational Data and the

Scholarship of Teaching

Catherine Finnegan, Board of Regents of the University System of Georgia

Agenda

• Introductions and Definitions

• Sources of Data in CMS

• Study Examples
  – Engagement
  – Retention
  – Instruction

University System of Georgia

35 public colleges and universities:
– 4 Research Universities
– 15 Regional/State Universities
– 4 State Colleges
– 12 Associate Colleges
– 253,552 students
– 9,553 full-time faculty

Office of Information and Instructional Technologies

• Supports and coordinates the delivery of innovative technology resources, services, and solutions.

• Establishes a communications conduit among executive management for the university system about information and instructional technology.

Advanced Learning Technologies

• Provides academic enterprise systems and services for USG institutions.

• Fosters the development and implementation of collaborative online degree programs and training materials.

• Conducts research and evaluations to influence policy making, instructional practice and technology development.

Technology Use in Courses

[Chart: percentage of courses using e-mail, Internet resources, and course web pages, 1994-2004. Adapted from Campus Computing Study, 2002-2004.]

Rising Use of IT in Instruction

[Chart: percentage of courses using course management tools, by sector (public and private universities, public and private 4-year colleges, community colleges), 2000-2004. Adapted from Campus Computing Study, 2002-2004.]

USG Faculty Use of CMS 2005

• Nearly half (46.3%) of all USG faculty currently use a CMS in their instruction.

• Almost two-thirds of users have increased their usage over time.

• Over two-thirds of users believe that a CMS has provided important advantages in improving student engagement in learning.

• Over two-fifths of non-users would use a CMS if their concerns were addressed.

What CMS was Used For

• 90.6% enhance their face-to-face instruction

• 43.8% deliver fully online instruction

• 43.8% deliver hybrid courses

* Based on 46.3% of respondents who were currently using a CMS.

CMS and Student Engagement

• Increased amount of contact with their students (55.6%)

• Increased student engagement with the course materials (63.5%)

• Allowed for inclusion of more interactive activities in their class (54.2%)

• Allowed them to accommodate more diverse learning styles (67.6%)

* Based on 46.3% of respondents who were currently using a CMS.

Evaluation

• Measures the effectiveness of an ongoing program in achieving its objectives

• Aims at program improvement through a modification of current operations

• Two types of evaluations:
  – Project
  – Program

Assessment

• Systematic collection, review, and use of information about educational programs undertaken for the purpose of improving learning and development

• Two types of audience:
  – Accreditation
  – Accountability

Scholarship of Teaching

• Sustained inquiry into teaching practices and students’ learning in ways that allow other educators to build on one’s findings

• Directed toward other instructors in one’s field and beyond

Now Tell Me

• What are you interested in learning about your teaching practices and your students’ learning?

• What projects are you now conducting?

• What data are you using to investigate?

CMS in Scholarship of Teaching

[Diagram: data sources feeding the scholarship of teaching - surveys, e-portfolio, course content, and the student information system (SIS) - alongside the e-learning system (CMS).]

Student Online Activity

[Diagram: a sample clickstream - logon, read message, reply to message, create new message, re-read lecture notes, logoff.]

Emergence of a New Data Set

[Diagram: individual students' activity streams combine into one large data set.]

How is this data different from other inputs to pedagogical research?

• It’s what the students actually did (compared to self-reporting)

• It captures the steps of the process (rather than the outcome alone)

• It’s quantitative

• It’s easy to collect this data across a large number of students.
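The ease-of-collection point can be made concrete: once exported, a CMS access log is just a stream of (student, action) events that aggregates in a few lines. The event tuples below are invented for illustration, not real log data.

```python
from collections import defaultdict

# Invented CMS access-log events: (student_id, action).
# A real export would also carry timestamps and session IDs.
events = [
    ("s1", "READ MESSAGE"),
    ("s1", "REPLY TO MESSAGE"),
    ("s1", "READ MESSAGE"),
    ("s2", "READ MESSAGE"),
    ("s2", "CREATE NEW MESSAGE"),
]

# Frequency of each action, per student.
freq = defaultdict(int)
for student, action in events:
    freq[(student, action)] += 1

print(freq[("s1", "READ MESSAGE")])  # 2
```

The same loop runs unchanged over hundreds of thousands of events, which is what makes this data set attractive for large-N studies.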

How can CMS data be used?

• See patterns and trends

• Tell a story that explains the results

• Identify areas of improvement and targeted change

• Evaluate impact of changes

Patterns of Movement in Courses

Comparison of Withdrawing, Non-Successful and Successful Student Access Logs

[Chart: page accessed vs. access order, comparing the access logs of a withdrawer, a non-successful completer, and a successful completer.]

New evidence for…?

• Course level inquiry

• Cross course and programmatic research

• College-wide policy review

Typical Sources of Data

• Student course evaluations and surveys

• Content analysis

• Grade distributions

• Interviews

• Portfolio review

CMS Data Sources

• Individual, course, group and institutional activity reports

• Assessment reports

• Survey reports

• Discussions

• Assignments

• Content analysis

Advantages of CMS Data

• Data captured automatically as students interact with software

• Reports available at each level (course, group, institution)

• Time parameters of reports allow more timely and granular review

• Consistency of data across time and course

• Instructor control of tools

Disadvantages of CMS Data

• Only reports actions – doesn’t explain them

• Access to data based on role

• “Canned” report data limited

• Data collection dependent on proper formatting of content and assessment

Activity Data Reports Available to Instructors

• Summary of Activity

• Tool Usage

• Components Usage

• Content File Usage

• Entry and Exit Pages

• Student Tracking

Entry Into Reports and Tracking

Available from TEACH only.

List of Available Reports

Date and time parameters can be set.

Summary of Activity Reports
Provides a general overview of student and auditor activity.

Information contained:
• Total number of sessions
• Average session length
• Average sessions/day
  – by weekday
  – by weekend
• Most active day
• Least active day
• Most active hour of day
• Least active hour of day

Example Summary of Activity Report

Tool Usage Reports
Provides an overview of how often tools are used.

Tools available:
• Assessments
• Assignments
• Bookmarks
• Calendar
• Chat/Whiteboard
• Content File
• Discussions
• Mail
• Media Library
• Notes
• PowerLinks Proxy Tool
• SCORM Module
• Organizer Page
• URL

Information contained:
• Total number of sessions for each tool
• Average time per session
• Total time for all tool sessions
• Percent time for each tool compared with total time

Example Tool Usage Report

Component Usage Reports
Provides an overview of how often students use each component of a course.

• Component – which component the student has accessed

• Visits – total number of times students have visited a component

• Average time/visit – average time students spend per visit

• Total time – total amount of time students spent for all components

• Percent of total visits – relates time spent in a given component to total time spent for all components

ExampleComponent Usage Report

Entry and Exit Page Reports
Provides an overview of pages used most frequently for course entry and exit.

• Page Name – which page the student entered or exited

• Tool Used – which tool was used to enter or exit

• Page Usage – total number of times students entered or exited from the page

• Percent of Total Usage – relates the number of times a page is used to enter or exit to the total number of entries or exits

Example Entry and Exit Page Reports

Content Usage Reports
Provides an overview of the content files viewed by students.

• Content file – the content file that students have accessed

• Sessions – the total number of content file sessions

• Percent of Total Sessions – relates the number of content file sessions to the total number of sessions for all content files

Content File Usage Report

Content File Usage Graph

Student Tracking Reports
Provides an overview of student activities in the course, displaying both general and detailed statistics.

For each tool, reports show First Access, Last Access, Sessions, and Total Time:
• Mail – read messages, sent messages
• Discussion – read messages, sent messages
• Calendar
• Chat and Whiteboard
• Assessments
• Assignments
• URL
• Media Library
• Content Files

Aggregate Student Tracking

Individual Student Tracking

Data from Quizzes and Surveys

• Performance – displays student scores for quiz submissions

• Item Statistics – displays performance statistics for individual questions; compares the performance of selected students with the entire class

• Summary Statistics – compares all students’ results in one table

• Class Statistics – displays class performance for individual questions

Performance
Displays student scores for quiz and survey submissions.

Item Statistics
Displays performance statistics for individual questions. Compares the performance of selected students with the entire class.

Summary Statistics
Compares all students’ results in one table.

Class Statistics
Displays class performance for individual questions.

Additional Data Sources

• Discussions and Mail

• Assignments

• Course Evaluations and Surveys

• Student Information Systems

Now Tell Me

• Considering the projects that you outlined earlier:
  – What data found in a CMS might be used to investigate your theories?
  – How would you collect this data?
  – Would you triangulate this data with other sources?

Typical Statistical Methods

• Frequency Distributions and Trends

• Measures of Central Tendency

• ANOVA

• Regression
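The first two methods on this list need nothing beyond Python's standard library; the per-student session counts below are invented for illustration.

```python
import statistics
from collections import Counter

# Invented number of CMS sessions per student over one term.
sessions = [12, 30, 45, 30, 51, 18, 30, 42, 12, 27]

# Frequency distribution: how many students had each session count.
dist = Counter(sessions)
print(dist[30])  # 3

# Measures of central tendency.
print(statistics.mean(sessions))    # 29.7
print(statistics.median(sessions))  # 30.0
```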

Want to play with some data?

• Go to http://www.statcrunch.com

• Create an account

• Upload data file: ExampleData.xls

• Run Summary Statistics

“Creating A More Educated Georgia”

Studies on Student Persistence and Achievement

Research Setting: eCore®

• Fully online, collaboratively developed core curriculum courses, offered jointly by institutions in the University System of Georgia and supported by the System.

• Courses include the humanities, social sciences, mathematics, and sciences.

• Over 25 courses and 2,000 enrollments in the Spring semester

• http://www.gactr.uga.edu/ecore/

Underlying Problem: Student Retention
Overall Course Retention: Fall 2000-Spring 2003

[Chart: stacked percentages of withdrawing and completing students by term, Fall 2000 through Spring 2003, with counts labeled.]

Findings from Four Studies

• Predicting Student Retention & Withdrawal

• Tracking Student Behavior & Achievement Online

• Examining Student Persistence and Satisfaction

• Perspectives and Activities of Faculty Teaching Online

Study 1: Predicting Student Retention & Withdrawal

• Purpose: to investigate student withdrawal and retention in eCore courses.

• How well can a student’s group membership (completion & withdrawal) be predicted?

• A two-group Predictive Discriminant Analysis (PDA) was used to predict students’ withdrawals and completions in online courses.

• Authors: Morris, Wu, Finnegan (2005).

Variables

• Two grouping variables:
  – student completers
  – student withdrawers

• Nine predictor variables:
  – gender, age, verbal ability, math ability, current credit hours, high school GPA, institutional GPA, locus of control, and financial aid
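As a sketch of how a two-group PDA separates completers from withdrawers, the following implements Fisher's linear discriminant on just two of the nine predictors (high school GPA and SAT-math). All values are invented toy data; the study itself used all nine predictors on real student records.

```python
# Fisher's two-group linear discriminant on two invented predictors:
# x = (high school GPA, SAT-math). Toy data only -- the study used
# nine predictors and real student records.

def mean(xs):
    return sum(xs) / len(xs)

completers = [(3.5, 600), (3.8, 650), (3.2, 580)]
withdrawers = [(2.4, 450), (2.8, 500), (2.6, 470)]

def group_mean(g):
    return [mean([r[0] for r in g]), mean([r[1] for r in g])]

m_c, m_w = group_mean(completers), group_mean(withdrawers)

def pooled_cov(g1, m1, g2, m2):
    # 2x2 pooled within-group covariance matrix
    s = [[0.0, 0.0], [0.0, 0.0]]
    for g, m in ((g1, m1), (g2, m2)):
        for r in g:
            d = [r[0] - m[0], r[1] - m[1]]
            for i in range(2):
                for j in range(2):
                    s[i][j] += d[i] * d[j]
    n = len(g1) + len(g2) - 2
    return [[v / n for v in row] for row in s]

S = pooled_cov(completers, m_c, withdrawers, m_w)
det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
S_inv = [[S[1][1] / det, -S[0][1] / det],
         [-S[1][0] / det, S[0][0] / det]]

# Discriminant weights w = S^-1 (m_c - m_w)
d = [m_c[0] - m_w[0], m_c[1] - m_w[1]]
w = [S_inv[0][0] * d[0] + S_inv[0][1] * d[1],
     S_inv[1][0] * d[0] + S_inv[1][1] * d[1]]

def classify(x):
    # Assign to the group whose projected mean the score falls nearer to.
    score = w[0] * x[0] + w[1] * x[1]
    cutoff = (w[0] * (m_c[0] + m_w[0]) + w[1] * (m_c[1] + m_w[1])) / 2
    return "complete" if score >= cutoff else "withdraw"

print(classify((3.6, 620)))  # "complete" for these toy numbers
```

Prediction accuracy would then be estimated, as in the study, as the percentage of known cases the rule classifies correctly.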

Model A: Two-group PDA Predictive Model, Spring 2002

[Diagram: predictors - gender, age, SAT-Verbal, SAT-Math, high school GPA, institutional cumulative GPA, and institutional cumulative credit hours - mapped to the grouping variable (withdraw vs. complete).]

Model A: Findings

• The most important predictors in Model A are high school GPA and mathematics ability (SAT-Math).

• Model A predicted with 62.8% accuracy.

Model B: Two-group PDA Predictive Model, Fall 2002

[Diagram: predictors - financial aid (FA) and locus of control - mapped to the grouping variable (withdraw vs. complete).]

Model B: Findings

• Financial aid showed significant differences between withdrawers and completers (χ² = 4.84, df = 1, p < .05). Completers were more likely to receive financial aid than withdrawers.

• Locus of control showed significant differences between withdrawers and completers (χ² = 4.205, df = 1, p < .05). Completers were more likely to have internal motivation than withdrawers.

• Model B predicted with 74.5% accuracy
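The financial-aid and locus-of-control results above come from χ² tests on 2×2 tables of group membership against a binary predictor. For a 2×2 table the statistic has a closed form, shown below with invented counts (not the study's data):

```python
# Pearson chi-square for a 2x2 contingency table [[a, b], [c, d]].
# Invented counts: rows = completers / withdrawers,
# columns = received financial aid / did not.
#                aid   no aid
# completers      60      40
# withdrawers     25      35

def chi_square_2x2(a, b, c, d):
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

chi2 = chi_square_2x2(60, 40, 25, 35)
print(round(chi2, 2))  # 5.06 -- above the 3.84 critical value for df=1, p < .05
```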

Study 1: Summary

• Students withdraw for a variety of reasons.

• Primary instructional reasons for withdrawing included too much work in the online course, a preference for the classroom environment, and a dislike of online instruction.

• High school grade point average and mathematics SAT were related to retention in the online courses.

• Students who completed courses were more likely to have received financial aid.

• Students who completed courses were more likely to have a higher internal locus of control.

Study 2: Tracking Student Behavior & Achievement Online

• Purpose: to examine student behavior by tracking what students do online and how long they spend on each activity.

• Data: analyzed student access tracking logs.

• Coded over 300,000 student activities.

• Frequency: number of times student did a behavior

• Duration: time spent on the behavior

• Authors: Morris, Finnegan, Wu (2005)

Research Questions

• What are the differences and similarities between completers and withdrawers in various measures of student behavior online?

• How accurately can achievement be predicted from student participation measures in online learning courses?

Variables (n=8)

Frequency and duration of:
– viewing course content
– viewing discussions
– creating new discussion posts
– responding to discussion posts

• Over 400 students across 13 sections of 3 courses

Frequency of Learning Activities

[Charts: average counts over the term of content pages viewed, discussion posts viewed, original posts created, and follow-up posts created, for withdrawers, non-successful completers, and successful completers in English, Geology, and History.]

Duration of Learning Activities (N=423)

| Group | Total Time During Term | Viewing Content | Viewing Discussions | Creating Original Posts | Creating Follow-up Posts | Average Overall Time Per Week |
|---|---|---|---|---|---|---|
| Withdrawers (n=137) | 10 hours | 2.6 hours | 3 hours | <1 hour | <1 hour | <1 hour |
| Non-Successful Completers (n=72) | 18 hours | 9 hours | 6 hours | <1 hour | <1 hour | 1.2 hours |
| Successful Completers (n=214) | 54 hours | 19 hours | 19 hours | 1 hour | 1.5 hours | 3.75 hours |

Findings: Completers & Withdrawers

• Completers had more frequent activity and spent more time on task on all 4 measures than unsuccessful completers and withdrawers.

• Withdrawers spent significantly less time and had less frequent activity than completers on all 4 measures (p < .001). Expected.

• Significant differences in participation also existed between successful and unsuccessful completers.

Multiple Regression Model for Impact of Participation on Achievement

Successful and Non-Successful Completers (n = 286)

Findings: Successful and Unsuccessful Completers

• The participation model explained 31% of the variability in achievement.

• 3 of 8 variables were significant at the p < .05 level and good predictors of successful completion (achievement/grades):
  – number of content pages viewed
  – number of discussion posts viewed
  – seconds viewing discussions

Summary: Study 2

• Time-on-task matters; withdrawers did not engage significantly in number or duration of activities at the online site.

• Successful completers engaged significantly with the online course:
  – going repeatedly to content pages (frequency)
  – going repeatedly to discussion posts (frequency)
  – spending significant time reading discussion posts (duration)

Study 3: Understanding Student Persistence and Satisfaction

• Purpose: To investigate issues that affect course completion, course withdrawals and satisfaction with online courses.

• Survey (n=505, 22% response rate)

• In-depth interviews:
  – 8 withdrawers
  – 8 completers

• Authors: Boop, Morris, Finnegan (2005)

Successful completers

• Felt “membership” in the course.

• Understood course layout, expectations, assignments.

• Faculty feedback was important.

• Clarity about course was important.

• Used words indicating “drive” and “persistence” to succeed; could overcome course-related problems.

Withdrawers/Unsuccessful Students

• Spoke of being “lost” & “confused” in the course.

• Needed more direction & help from faculty to understand the course goals, expectations, assignments & design.

• Needed more explicit help with discussions and understanding involvement.

• Needed more managerial and navigational help.

Study 4: Perspectives and Activities of Faculty Teaching Online

• Purpose: To explore the activities and perspectives of faculty teaching online

• Interviews (n=13)

• Analysis of archived courses (10)

• Authors: Morris, Xu, Finnegan (2005)

Classification of Faculty Roles (N=10)

[Chart: number of pedagogical, social, and managerial postings for each of 10 instructors, labeled N (novice) or E (experienced).]

Summary: Study 4

• Novice instructors are far less engaged with students online.

• Experienced faculty posted with a ratio of 1:6 faculty-to-student posts.

• Experienced faculty interchanged pedagogical, managerial, and social roles online.

• Students in courses with experienced faculty engaged more often in discussions

• Faculty visibility is important to student participation.

• Novice faculty need extensive assistance to understand online instruction.

Best Practices: Students

• Students should be advised that for online courses:
  – time on task matters for successful achievement;
  – online courses may be activity- and time-intensive;
  – success requires pro-active, engaged students;
  – online courses will not be easier for academically marginal students;
  – they should directly (and as needed) seek instructor help to understand course structure and course-related objects and objectives.

Best Practices: Faculty 1

• Faculty should:
  – understand low participation early in the term as an indicator of withdrawal or unsuccessful completion;
  – monitor/track all students early in the course term to spot lags in participation;
  – understand the role of student expectations and attitudes in persistence;
  – understand the role of locus of control in withdrawal and unsuccessful completion.

Best Practices: Faculty 2

– engage managerial functions to explain course layout and assignment expectations (this may be more important than the pedagogical function at times);

– understand that course layout and instructions are not necessarily intuitive to students;

– seek to understand students’ previous academic preparation and make adjustments accordingly.

Comparing Student Performance to Programmatic Learning Outcomes

• Link graded activities within courses to eCore® common student learning outcomes

• Determine achievement of learning outcomes based on trends in grades

• Identify additional means of documenting student achievement of learning outcomes

Benefits of CMS Data

• New quantitative evidence
  – complements survey, grade, and portfolio data
  – provides very detailed information about engagement and the learning process

• Reduced burden on faculty and staff
  – evidence is collected automatically
  – leverages tools already in use

Opportunities for Studies

• Increase awareness of data sources available to study pedagogy and outcomes

• Encourage systematic analysis of existing data for pedagogical improvement

• Identify additional data elements within CMS and other data sources

Challenges for Studies

• Use of CMS is neither widespread nor extensive

• Essential tools not used (e.g., gradebook)

• Siloed data sources (Green’s ERP Turtle)

Conclusions

• Data collected in a CMS and other systems can be used to inform the scholarship of teaching
  – systematically and on an ongoing basis

• New sources of data offer opportunities to study perennial questions from different perspectives.

Thank You!

Catherine Finnegan

Catherine.finnegan@usg.edu

Presentations and Citations Available at: http://alt.usg.edu
