
Assessment 101

Carteret Community College
Learning Outcomes Workshop
June 18-20, 2007

Purpose

Understand your options.
Use more than one method.
Create tools or processes that help you obtain the data you need.

We need outcome data, but also information on:
Strengths and weaknesses
Needs for change
Future issues for the field and program

Assessment can be used to rate performance, but also as a needs assessment tool.

Methods of Interest

Focus groups
Nominal groups
Classroom assessments
Surveys/questionnaires
Pre-test/post-test assessments
Document evaluation/work sample evaluation
Grading rubrics

Focus Group - Advantages

Inexpensive method of exploration
Can be done quickly
Can use multiple groups
Supplies a wide range of opinions, ideas, information, etc.
Moderator can ask for clarification
Free exchange of open, honest expression of ideas
Can deviate from the original format if needed

Focus Group - Disadvantages

Qualitative in nature – no tabulating, coding, or quantifying
Small sample size
Heavy reliance on the skills of the moderator
Results cannot stand alone – do not make decisions based on focus groups alone
Recruitment is hard
Some participants may not have an equal opportunity to participate (group dynamics, skill of the moderator)

How to Prepare

1. Develop an interview guide

2. Enlist a well-trained moderator

3. Determine the number and makeup of the group

4. Select participants

5. Arrange the facilities

Conducting the Assessment

1. Allow participants to get acquainted and relax

2. Introduce the process and topic

3. Moderator guides the discussion

4. Interject when necessary

5. Bring the discussion to a close

6. “Thank you” follow-ups

How to Write Questions

How many is too many? Usually a maximum of 6-8.
Never use "closed" questions.
Always open-ended – items participants can list.
Refrain from items that are too controversial or ill-defined. Compare "What should the college's policy be on allowing illegal immigrants to attend at in-state rates?" with "What are the major factors that contribute to retention?"

Focus Groups

Look at your worksheet.
Create 3-4 focus group questions you might ask a group of students or employers of graduates/advisory committee members.

What’s a Nominal Group Process?

A means to solicit information from interested parties in a manner where all opinions matter. The result is typically a prioritized list of items.

Example: The college has just received $1 million from a private donor to enhance technology skills among faculty, staff, and students. What five things should be our top priority?

Let’s look at the handout.

When might you use this process?
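The prioritization step of a nominal group can be tallied mechanically. Below is a minimal sketch, assuming each participant submits a ranked top-three list and higher ranks earn more points; the ballot items and scoring scheme are hypothetical examples, not part of the workshop handout.

```python
# Sketch of nominal-group scoring: each participant ranks their top
# choices, and items are prioritized by total points (higher = better).
# Ballots and items here are hypothetical.
from collections import defaultdict

ballots = [
    ["smart classrooms", "faculty training", "student laptops"],
    ["faculty training", "smart classrooms", "wifi upgrades"],
    ["faculty training", "student laptops", "wifi upgrades"],
]

scores = defaultdict(int)
for ballot in ballots:
    # First choice on a top-3 ballot earns 3 points, second 2, third 1.
    for points, item in zip(range(len(ballot), 0, -1), ballot):
        scores[item] += points

prioritized = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
for item, total in prioritized:
    print(item, total)
```

The output is the prioritized list the process is designed to produce, with ties broken arbitrarily.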

Classroom Assessments

Tests: essay, short-answer essay, multiple choice
Assignments
Portfolios
Projects

Uses

To determine how students are progressing; the main purpose is to grade students on their progress.

Allows the faculty multiple pieces of information on student progress – to determine overall grades.

If students do poorly, they must make changes based on our grading policies, requirements of the assignment, etc.

Outcome Assessment

Using classroom assessments:
Must have clearly defined goals – the purpose of the assignment.
Must be a valid assessment of the outcome.
Remember: we are assessing student progress as a means of assessing ourselves.

If students do poorly – who needs to make changes? We do.

Some Common Problems

When faculty use classroom assessments to evaluate student learning outcomes and the results are not as good as expected, common explanations include:
Something is wrong with the test – it didn't measure what we thought it measured (measurement error).
We aren't teaching the topic in all classes (not all faculty on board).
Our adjuncts don't know what we are doing.
It has been too long since the students had the material.
We need to revise it.
We forgot to do it (process error).

Surveys

Also called questionnaires, opinionnaires, tests

Designed to be descriptive in nature - of people’s attitudes, behaviors, opinions about a construct

Surveys are done when there is an absence of data to sufficiently describe conditions within a population

Some Critical Factors Worth Thinking About

Reliability
Validity
Sensitivity
Appropriateness

Reliability

Dependability or trustworthiness.
A perfect instrument, when given twice under the same circumstances, will yield identical results.
The degree to which a test consistently measures whatever it measures (over time).

Validity

The degree to which a test measures what it is supposed to measure

Does the test provide data on the phenomena under consideration so that I can answer the research question?

Sensitivity

Can the test make the fine distinctions required to detect true differences among subjects?

This is difficult with Likert scales.

Appropriateness

Is the examinee capable of meeting the requirements of the test (e.g., reading level, physical format, age appropriateness, administration setting)?

Designing a Survey or Questionnaire

Avoid the following pitfalls:

1. Phrase questions to be comprehended by all those in the target population.
2. Avoid double-barreled questions.
3. Be careful of double negatives.
4. Define terms that could be easily misinterpreted (see the confusing adjectives checklist).

Designing a Survey or Questionnaire

5. Underline or boldface a word if special emphasis is demanded.
6. Watch for inadequate alternatives to a question.
7. Do not use adjectives that fail to have an agreed-upon meaning.
8. Be sure questions are not leading questions.
9. There should be no ambiguity in the questions.

Some general rules to follow…..

Sensitive questions, as well as open-ended ones, should be near the end of the questionnaire.
Place questions in a logical order where possible.
Simpler questions should come before more difficult ones.
Avoid establishing a response set.
Request information needed for subsequent questions first.

Some general rules to follow…..

Vary questions by length and type.
Define the purpose and scope of the survey in explicit terms – avoid "fishing expeditions" and rambling, redundant, ill-conceived approaches.

Avoid using an existing survey, if it was designed for a different purpose, population or circumstance. Although they may serve as a point of departure, surveys usually have aims or situational factors that are specific to each application.

Some general rules to follow…..

In designing questionnaires or interviews, it is often helpful to sit down with a group of potential respondents and explore what is meaningful or important to them, and how best to phrase questions to reflect their attitudes or opinions.

Field test instruments to spot ambiguous or redundant items and to arrive at a format leading to ease of data tabulation and analysis.

Examine the merits of using machine scored answer sheets to facilitate tabulation and analysis.

Some general rules to follow…..

As often as possible, use structured questions as opposed to unstructured and open ended ones for uniformity of results and ease of analysis.

Do not ask questions out of idle curiosity - this approach will overtax the respondents. Avoid questions that are redundant or have obvious answers.

Some general rules to follow…..

Avoid loaded or biased questions (usually by involving others in the wording process and by field testing) and be watchful of biased sampling.

Keep the final product as brief, simple, clear, and straightforward as possible. Complex instruments, while justified under special circumstances, generally will be resisted or rejected by most respondents, and cloud analysis of the data.

Brainstorm the analysis needs to ensure the clarity and comprehensiveness of the instrument.

Some general rules to follow…..

Consider the necessary and sufficient characteristics of the respondent that must be collected at the time the survey is administered and on which data analysis will be based (gender, age, race, occupation, education, background, life history, demographic variables). Keep them to a minimum – these often invade the privacy of respondents.

Imagine various outcomes that might result from the survey, including surprising ones. Anticipate gaps and shortcomings.

Types of Questions

Dichotomous (circle one):
I am:  Male  Female

Multiple choice:
In what year of college are you?
1. Freshman  2. Sophomore  3. Junior  4. Senior

Types of Questions

Likert scales:

Overall, I am satisfied with library services.
1 = strongly disagree  2 = disagree  3 = somewhat agree  4 = agree  5 = strongly agree

Overall, how satisfied are you with the quality of academic programs at the college?
1 = very dissatisfied  2 = dissatisfied  3 = somewhat satisfied  4 = satisfied  5 = very satisfied

Types of Questions (cont.)

Ranking:
In selecting a college to attend, the following influenced my decision (rank from greatest influence [5] to least influence [1]):

Nearness to my home               5 4 3 2 1
Availability of student services  5 4 3 2 1
Quality of instruction            5 4 3 2 1
Transportation to that college    5 4 3 2 1
Availability of parking           5 4 3 2 1

Types of Questions (cont.)

Sentence completion:
The most important factor to me when selecting a college to attend is: ________________________

Open-ended response:
What factors are most important to you in selecting a college to attend?

Avoid the following pitfalls:

1. Phrase questions to be comprehended by all those in the target population.
2. Avoid double-barreled questions.
3. Be careful of double negatives.
4. Define terms that could be easily misinterpreted (see the confusing adjectives checklist).
5. Underline or boldface a word if special emphasis is demanded.

Avoid the following pitfalls:

6. Watch for inadequate alternatives to a question.
7. Do not use adjectives that fail to have an agreed-upon meaning.
8. Be sure questions are not leading questions.
9. There should be no ambiguity in the questions.

Confusing Adjectives Checklist

Respondents were asked: out of 100 of anything – pennies, apples, people – what do the following quantifying phrases mean? Respondents indicated the number they envisioned when they heard each phrase:

Quantifying Phrase    Most Frequent Answer
Almost none           1
A couple              2
Damn few              3
Several               3
Hardly any            5
Few                   10
A small number of     10
Not very many         10

Confusing Adjectives Checklist

Quantifying Phrase         Most Frequent Answer
Lots                       40
Many                       40
A significant number       40
A considerable number of   40
Numerous                   40
Minority                   49
Majority                   51
Consensus                  60

Confusing Adjectives Checklist

Quantifying Phrase     Most Frequent Answer
Substantial majority   75
Large proportion       75
Clear mandate          75
Most                   90
Nearly all             90
Almost all             95
Virtually all          95


Letter of Transmittal Should Include the Following:

Introduction (who you are, your position, etc.)
General characteristics: a clear, brief, yet adequate statement of the purpose and value of the questionnaire (this will elicit a maximum number of returned questionnaires).
Effectiveness: it must provide good reason for the subject to respond. It should involve him or her in a constructive and appealing way. A sense of professional responsibility, intellectual curiosity, and personal worth are typical response appeals.

Letter of Transmittal Should Include the Following:

Effectiveness (cont.):
It should establish a reasonable, but firm, return date.
An offer to send respondents a report of the findings is often effective, though it carries with it the ethical responsibility to honor such a pledge.
If possible, use a letterhead and a signature that will lend prestige and official status to the letter of transmittal.

Pre-test/Post-test

Apply what we just discussed about surveys.
Must pre-test and post-test the exact same individuals – so you need to be able to track them.
More time consuming (a longer commitment).
Pre-test before they get a dosage of the program.
How long must someone stay to be considered a program participant?
How long do you wait before you post-test?
What about pre-test/treatment interaction or pre-test/post-test interaction?

Pre-test/Post-test

You are in essence comparing the group against themselves – looking for changes over time.

If you divide people into two groups, give an intervention to one and test them – you are comparing groups – looking for differences in treatment

When you pre-test/post-test one group, you are looking for changes within subjects

Devil in the Details

Must have names or codes on surveys to make sure you match them up correctly

Must match Jane Doe to Jane Doe, not the entire pretest group to the entire posttest group

Someone has to be the responsible party for collecting and putting together these matched surveys
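The matching step above amounts to a simple keyed join: pair each person's pre-test with their own post-test by name or code, and set aside anyone who appears only once. A minimal sketch, with hypothetical IDs and scores:

```python
# Sketch: match each person's pre-test to their own post-test by ID so
# change is measured within subjects. IDs and scores are hypothetical.
pre  = {"JD001": 54, "AB002": 61, "CC003": 70}
post = {"JD001": 71, "AB002": 65, "XY999": 80}  # XY999 has no pre-test

# Keep only people present in both administrations.
matched = {pid: (pre[pid], post[pid]) for pid in pre.keys() & post.keys()}
changes = {pid: after - before for pid, (before, after) in matched.items()}

# Anyone appearing in only one administration must be dropped.
unmatched = (pre.keys() | post.keys()) - matched.keys()
print("change scores:", changes)
print("dropped (no match):", sorted(unmatched))
```

Note that the analysis is done on the per-person change scores, not on a comparison of the whole pre-test group against the whole post-test group.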

Document Evaluation/Work Sample Evaluation

Portfolio assessment
Reflective journal evaluation
Specific activity analysis (interviewing, writing résumés, etc.)
Grading:
  According to criteria
  Grading rubrics
  Completion vs. non-completion

Creating Rubrics

What is a rubric?

Rubrics specify the performance expected for several levels of quality. They provide an objective and consistent way to assess subjective tasks, indicate what is expected, and highlight how performance will be evaluated. (http://bhardemon.tripod.com/id6.html)

Rubrics are scales in which the criteria used for grading or assessment are clearly spelled out along a continuum.  Rubrics can be used to assess a wide range of assignments and activities in the classroom, from oral presentations to term papers to class participation. http://gsi.berkeley.edu/resources/grading/rubricsIntro.html 

Two Types of Rubrics

Analytic Rubrics: Separate scales for each trait, or learning outcome, being assessed within the assignment (e.g., separate scales for "Argument,” “Organization,” “Use of Evidence,” etc.)

Holistic Rubrics: One scale for the assignment considered as a whole.  (e.g., one scale describing the characteristics of an “A” paper, a “B” paper, or a “C” paper, etc.) 
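The analytic type can be illustrated with a small scoring sketch. The trait names come from the slide above, but the 1-4 scale, the weights, and the function name are hypothetical choices, not part of the workshop materials:

```python
# Sketch of analytic-rubric scoring: a separate 1-4 scale per trait,
# combined into a weighted total. Weights here are hypothetical.
rubric_weights = {"Argument": 0.4, "Organization": 0.3, "Use of Evidence": 0.3}

def score_paper(trait_scores, weights=rubric_weights):
    """Weighted average of per-trait scores on a 1-4 scale."""
    assert set(trait_scores) == set(weights), "every trait must be scored"
    return sum(weights[t] * s for t, s in trait_scores.items())

total = score_paper({"Argument": 4, "Organization": 3, "Use of Evidence": 3})
print(f"weighted score: {total:.1f} / 4")
```

A holistic rubric, by contrast, would assign one overall level per paper with no per-trait arithmetic.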

Rubrics enhance student learning by:

Anchoring grading to specific learning objectives, rather than more subjective, distracting considerations of rank or effort

Improving assignment design by clarifying desired learning outcomes

Contributing to fairness and consistency across sections.

Reducing student anxieties about the subjectivity of grading

Rubrics help faculty save time by:

Narrowing the field of evaluation to desired learning outcomes

Facilitating constructive written comments

Reducing grade challenges
Reducing graders' anxieties about grade inflation and the subjectivity of grading

How to Create a Rubric

1. Think through your learning objectives. Identify the various traits, or learning outcomes, you want the assignment to assess.

2. Decide what kind of scale you will use. Decide whether the traits you have identified should be assessed separately or holistically.

3. Describe the characteristics of student work at each point on your scale. Once you have defined the learning outcomes being assessed and the scale you want to employ, create a table to think through the characteristics of student work at every point or grade on your scale.

How to Create a Rubric

4. Test your rubric on student work. It is essential to try your rubric out and make sure it accurately reflects your grading expectations (as well as those of other instructors).

5. Use your rubric to give constructive feedback to students. Consider handing the rubric out with students' returned work. You can use the rubric to facilitate the process of justifying grades and to provide students with clear instructions about how they can do better next time.

6. Use your rubric to clarify your assignments and to improve your teaching. The process of creating a rubric can help you create assignments tailored to clear and specific learning objectives.

Let’s Look at the Handout

Rubric handout
Rubric worksheet
Rubric practice worksheet

Create a grading rubric for the example (using your worksheet as a guide).

Matrix Sheet

What are your outcomes?
What will be your means of assessment (outcome indicators)?
What will be your outcome targets (e.g., 70% of students will pass with a minimum score of 75%)?

How will the results be used?
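An outcome target of the form "70% of students will pass with a minimum of 75%" can be checked mechanically once results are collected. A minimal sketch, with hypothetical scores:

```python
# Sketch: checking an outcome target of the form "70% of students will
# pass with a minimum score of 75%". Scores are hypothetical.
scores = [82, 91, 74, 88, 67, 95, 78, 80, 72, 85]

PASSING_SCORE = 75   # minimum score counted as passing
TARGET_RATE = 0.70   # required proportion of passing students

pass_rate = sum(s >= PASSING_SCORE for s in scores) / len(scores)
met = pass_rate >= TARGET_RATE
print(f"pass rate: {pass_rate:.0%} -> target {'met' if met else 'not met'}")
```

Whether the target is met (and by how much) then feeds the last question on the matrix sheet: how the results will be used.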