Agenda Introduction Benchmarks Benchmarking Survey data and benchmarking


Page 1: Agenda

Introduction

Benchmarks

Benchmarking

Survey data and benchmarking

Page 2: Benchmarks

Benchmark:

Used to establish “industry standards” based on external and internal comparisons

Comparisons to similar institutions establish benchmarks

CCSSE measures what students are doing

Page 3: Benchmarks

An example from the corporate world

An interview with Jeff Immelt, CEO of General Electric, identified Toyota, Dell, and Procter & Gamble as the three companies that GE has benchmarked the most

GE looks at Toyota and Dell to learn from their process excellence

GE looks at Procter & Gamble to learn from their marketing and commercial excellence

Page 4: Benchmarks

Opportunities abound in educational research

Corporate benchmarking tends to involve sharing between partners

Corporate benchmarking is limited to a few competitors; education, in contrast, has a much larger universe of institutions with which to compare

Are these approaches really that different?

Should institutions attempt to find similar or different partners for benchmarking?

Page 5: Benchmarks

What are the most important characteristics of a benchmark?

Page 6: Benchmarks

Elements:

Credibility—reliability and validity

Comparative—results examined relative to peers

Comprehensive—measures key elements according to the experts

Performance & Importance—what has the greatest impact

Confidentiality—it takes courage to assess oneself

Continuous—it takes time for improvement

Source: Joseph A. Pica, CEO of Educational Benchmarking Inc., in About Campus

Page 7: Benchmarks

The CCSSE survey:

is administered directly to community college students at CCSSE member colleges in randomly selected classes.

is based on research, asking questions about institutional practices and student behaviors demonstrated to promote student learning and retention.

uses a sampling methodology that is consistent across all participating colleges.

Page 8: Benchmarks

The five CCSSE benchmarks:

Active and Collaborative Learning

Student-Faculty Interaction

Academic Challenge

Support for Learners

Student Effort

Page 9: Benchmarks

Active and Collaborative Learning:

Worked with other students on projects during class

Worked with classmates outside of class to prepare class assignments

Tutored or taught other students (paid or voluntary)

Participated in a community-based project as a part of a regular course

Made a class presentation

Asked questions in class or contributed to class discussions.

Discussed ideas from your readings or classes with others outside of class (students, family members, co-workers, etc.)

Page 10: Criterion Benchmarking

How do you determine which measure to use when comparing yourself with other institutions?

Page 11: Benchmarking

Normative: compare your college with the mean

Criterion: compare your college with a predetermined value
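The distinction can be sketched in a few lines of Python; the peer scores, criterion value, and function names below are invented for illustration:

```python
# Sketch of the two comparisons. All scores here are invented.

def normative_gap(college_score, peer_scores):
    """Normative: compare a college's benchmark score with the peer-group mean."""
    peer_mean = sum(peer_scores) / len(peer_scores)
    return college_score - peer_mean

def criterion_gap(college_score, criterion):
    """Criterion: compare a college's benchmark score with a predetermined value."""
    return college_score - criterion

peers = [48.0, 50.0, 52.0, 55.0]      # benchmark scores at comparison colleges
print(normative_gap(53.0, peers))      # 1.75 -> above the peer mean
print(criterion_gap(53.0, 55.0))       # -2.0 -> below the chosen target
```

A positive gap under one definition can coexist with a negative gap under the other, which is exactly why the choice of comparison matters.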

Page 12: Benchmarking

Normative benchmarks provide context: determine which mean you would like to be compared with

Normative benchmarks situate your results: what does it mean to have 80% of your students satisfied?

A good place to start, but not necessarily the end point

Page 13: Benchmarking

Normative Benchmarking with CCSSE

Look for differences of 5 points (a standardized effect size of .2)

Is .2 a noteworthy difference?
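The 5-point rule of thumb can be made explicit: if a 5-point difference equals a standardized effect size of .2, the implied standard deviation is 25. A minimal sketch (the scores are invented):

```python
# The slide's rule of thumb: a 5-point difference on a CCSSE benchmark is a
# standardized effect size of 0.2, which implies a standard deviation of 25.

def effect_size(college_score, comparison_mean, sd=25.0):
    """Standardized difference between a college score and a comparison mean."""
    return (college_score - comparison_mean) / sd

print(effect_size(55.0, 50.0))  # 0.2 -- the slide's threshold
```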

Page 14: Criterion Benchmarking

Criterion Benchmarking with CCSSE

What is the college mission?

What are the college’s accreditation goals?

Are all students equally engaged?

Page 15: Benchmarking

Five ways colleges can reach for excellence using CCSSE benchmarks:

Compare themselves to the national average

Compare themselves to high-performing colleges

Measure their performance against their least-engaged group

Gauge work in areas most strongly valued

Compare now to where they want to be

Page 16: Benchmarking

Comparing yourself with high performers

Page 17: Benchmarking

Measure performance against least-engaged group

Break out by race, gender, enrollment status, parental education, and traditional vs. non-traditional age

At-risk students vs. other students

Define “at risk” at your college
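A breakout like this amounts to grouping students and comparing subgroup means. A minimal sketch in plain Python, with invented student records and an invented enrollment-status grouping:

```python
# Hypothetical sketch: find the least-engaged subgroup on a benchmark.
# The records, field names, and scores are invented for illustration.
from collections import defaultdict

def least_engaged(records, group_key, score_key):
    """Return (group, mean score) for the lowest-scoring subgroup."""
    sums = defaultdict(lambda: [0.0, 0])
    for r in records:
        s = sums[r[group_key]]
        s[0] += r[score_key]   # running total of scores in this group
        s[1] += 1              # count of students in this group
    means = {g: total / n for g, (total, n) in sums.items()}
    group = min(means, key=means.get)
    return group, means[group]

students = [
    {"status": "full-time", "effort": 54.0},
    {"status": "full-time", "effort": 50.0},
    {"status": "part-time", "effort": 47.0},
    {"status": "part-time", "effort": 45.0},
]
print(least_engaged(students, "status", "effort"))  # ('part-time', 46.0)
```

The same function works for any grouping (race, gender, parental education, a locally defined at-risk flag) by changing `group_key`.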

Page 18: Benchmarking

Density curves of student vs. institution benchmarks (within-institution vs. between-institution variation)

Page 19: Benchmarking

Gauge work in areas most strongly valued

Focus On Current College Initiatives

Sharing thoughts about how to use CCSSE data to evaluate different programs

Examine institutional mission, vision, and values

Page 20: Benchmarking

Understand values by sharing results

Share results with others to determine what is most strongly valued

Faculty, students, and administrators will likely have different opinions about what accounts for particular phenomena

Ideally, the questions generated from benchmarks are more focused than the original questions

Page 21: Benchmarking

Compare now to where you want to be

Page 22: Survey Data as Benchmarks

Should benchmarks derived from surveys be used to rank colleges?

Are there fundamental differences in how census benchmarks (e.g., graduation rates) and benchmarks derived from surveys should be used?

Page 23: Survey Data as Benchmarks

Ewell distinguishes between ‘hard’ statistics and ‘second-order’ statistics

‘Hard’ statistics are clearly enumerated and based on census-type data, such as numbers of students, graduates, and degrees awarded

‘Second-order’ statistics measure phenomena that cannot be directly counted, such as student satisfaction and students’ self-assessments of their behavior, and as such contain some statistical instability

‘Hard’ statistics are preferable for performance funding because they are more statistically stable than ‘second-order’ statistics

Source: Ewell, P. T. (1999). Linking performance measures to resource allocation: Exploring unmapped terrain. Quality in Higher Education, 5 (3), 191-209.

Page 24: Survey Data as Benchmarks

Texas Community Colleges Performance Measures:

1. The rate at which students completed courses attempted.

2. The number and types of degrees and certificates awarded.

3. The percentage of graduates who passed licensing exams related to the degree or certificate awarded, to the extent the information can be determined.

4. The number of students or graduates who transfer to or are admitted to a public university.

5. The passing rates for students required to be tested under Section 51.306.

6. The percentage of students enrolled who are academically disadvantaged.

7. The percentage of students enrolled who are economically disadvantaged.

8. The racial and ethnic composition of the district’s student body.

9. The percentage of student contact hours taught by full-time faculty.

Source: http://www.thecb.state.tx.us/reports/DOC/1197.DOC
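Several of these measures are simple ratios. As a hypothetical illustration of measure 1 (the course completion rate), with invented counts:

```python
# Hypothetical sketch of measure 1 above: the rate at which students
# completed courses attempted. The counts are invented for illustration.

def completion_rate(courses_completed, courses_attempted):
    """Share of attempted courses that were completed."""
    return courses_completed / courses_attempted

print(completion_rate(8400, 10000))  # 0.84
```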

Page 25: Survey Data as Benchmarks

Texas State-Level Benchmarks for Higher Education:

• Percent of recent high school graduates enrolled in a Texas public college or university

• Percent of first-time, full-time freshmen returning after one academic year

• Percent of first-time, full-time freshmen who graduate within four years

• Percent of first-time, full-time freshmen who graduate within six years

• Percent of two-year college students who transfer to four-year institutions

• Percent of two-year transfer students who graduate from four-year institutions

• Percent of population age 24 and older with vocational/technical certificate as highest level of educational attainment

• Percent of population age 24 and older with two-year college degree as highest level of educational attainment

Source: http://www.thecb.state.tx.us/reports/DOC/1197.DOC

Page 26: Survey Data as Benchmarks

If performance funding is based on verifiable hard statistics, what role does survey data have in benchmarking?

Or, why should we concern ourselves with the student experience?

Input and outcome versus process measures

To achieve outcomes, we need to understand the process by which they are obtained

Page 27: Survey Data as Benchmarks

“We can tell people almost anything about education except how well students are learning.”

Patrick M. Callan, President, National Center for Public Policy and Higher Education

Page 28: Survey Data as Benchmarks

Input -> Process -> Outcome Model

Inputs include costs, numbers admitted, etc.

Outcomes include graduation rates, retention, and graduate satisfaction

How do we measure the Process component?

Page 29: Survey Data as Benchmarks

Input and Ranking

Inputs are heavily emphasized in media rankings and can serve to maintain the established order

Inputs and outputs are naturally correlated

The challenge for institutions is to maximize process so as to improve on what their inputs alone would predict about outputs

Page 30: Survey Data as Benchmarks

Input -> Process -> Outcome Model

Inputs include costs, numbers admitted, etc.

Outcomes include graduation rates, retention, and graduate satisfaction

How do we measure the Process component?

Page 31: Survey Data as Benchmarks

Is there a danger of impacting results by raising the stakes?

Increasing the outcome without increasing the process

To achieve outcomes, we need to understand the process by which they are obtained

Increasing the outcome may not reflect improvement, but increasing the process won’t hurt

Page 32: Survey Data as Benchmarks

CCSSE does not rank

There is no single criterion or set of criteria that can be applied universally

Institutional characteristics matter

Institutional missions differ

Benchmarking with CCSSE data is best when presented in a non-threatening manner

Improvement requires an understanding of the process

Understanding the process in an institution will require hearing different voices and different perspectives on the same issues

Page 33: Survey Data as Benchmarks

Stability of CCSSE Benchmarks

Correlate 2005 and 2006 benchmarks for colleges that participated both years

45 institutions

55,903 students

Page 34: Survey Data as Benchmarks

Correlations among 2005 and 2006 benchmark scores for the 45 colleges that participated in both years (AC = Academic Challenge, ACL = Active & Collaborative Learning, SE = Student Effort, SFI = Student-Faculty Interaction, SL = Support for Learners; 05/06 = 2005/2006):

       AC06  ACL05 ACL06 SE05  SE06  SFI05 SFI06 SL05  SL06
AC05   0.80  0.55  0.44  0.65  0.56  0.59  0.36  0.24  0.10
AC06         0.49  0.58  0.56  0.67  0.57  0.53  0.34  0.21
ACL05              0.78  0.58  0.55  0.44  0.30  0.15  0.16
ACL06                    0.47  0.61  0.49  0.61  0.28  0.27
SE05                           0.74  0.39  0.29  0.53  0.46
SE06                                 0.44  0.42  0.49  0.48
SFI05                                      0.77  0.47  0.33
SFI06                                            0.52  0.41
SL05                                                   0.90

The same-benchmark, cross-year correlations (0.80, 0.78, 0.74, 0.77, and 0.90) are the stability estimates.
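Year-to-year stability of this kind is a Pearson correlation over college-level benchmark scores. A minimal sketch, using invented scores for five hypothetical colleges:

```python
# Minimal sketch of the stability check: Pearson correlation between a
# benchmark's college-level scores in 2005 and 2006. The scores below are
# invented for illustration; a value near 1 indicates a stable benchmark.
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

scores_2005 = [48.0, 50.0, 52.0, 55.0, 60.0]  # one benchmark score per college
scores_2006 = [47.0, 51.0, 53.0, 54.0, 61.0]
print(round(pearson(scores_2005, scores_2006), 2))  # about 0.98
```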

Page 35: Summary

Survey results are ‘second-order’ data, not ideal for performance funding.

Understanding ‘hard’ statistics naturally leads to a discussion of processes.

Survey data present an opportunity to understand processes and thereby affect hard outcome measures.

As such, survey data present opportunities for non-threatening discussions of processes.