
Report 2016-2017 Page 1 of 36

General Education Outcomes Assessment

General Education Outcomes Assessment 2016-2017: Written Communication, Oral Communication, Personal Responsibility, and Social Responsibility

Executive Summary

In the 2016-2017 academic year, the San Jacinto College Office of Accreditation and Assessment conducted general education outcomes assessment with a focus on student attainment of written communication, oral communication, personal responsibility, and social responsibility.

To assess student outcomes attainment, the Office of Accreditation and Assessment collected samples of student work that were assessed against selected rubrics. Overall, in fall 2016 there were 1,785 sections of 79 different courses with 44,688 enrollments. From those course sections, the process collected 52,973 student documents, of which 1,120 were randomly sampled for assessment. A total of 104 faculty evaluators each assessed from 22 to 33 student documents, depending on the segment. Each document was assessed by three different evaluators, with the exception of oral communication.

This year marked the first implementation of the newly revised personal responsibility and social responsibility rubrics. The Speech faculty also used a new rubric to assess oral communication. In each sample, a student was determined to have successfully attained an outcome if a Level 2 or better was achieved on all rubric criteria according to two of the three faculty evaluators. The same standard (Level 2) applied to oral communication, but with only one faculty member assessing. All Speech courses were sampled for this outcome, and faculty assessed in real time as students delivered live presentations. Faculty analysis of the results during focus groups produced recommendations for continuous improvement.
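The attainment rule above (Level 2 or better on every rubric criterion, according to at least two of the three evaluators) can be sketched in a few lines of Python; the function names, scores, and 0-4 scale below are illustrative, not taken from the report:

```python
# Decide outcome attainment for one student document under the 2-of-3 rule:
# a document "attains" the outcome if at least `required` evaluators scored
# Level 2 or better on EVERY rubric criterion.

def evaluator_passes(scores, threshold=2):
    """One evaluator's ratings across all rubric criteria."""
    return all(level >= threshold for level in scores)

def attained(ratings_by_evaluator, required=2):
    """ratings_by_evaluator: one per-criterion score list per evaluator."""
    passing = sum(evaluator_passes(s) for s in ratings_by_evaluator)
    return passing >= required

# Example: three evaluators scoring five criteria on a 0-4 scale.
ratings = [
    [3, 2, 2, 4, 3],  # evaluator A: passes (all criteria at Level 2+)
    [2, 2, 1, 3, 2],  # evaluator B: fails (one criterion below Level 2)
    [4, 3, 2, 2, 2],  # evaluator C: passes
]
print(attained(ratings))  # → True (2 of 3 evaluators pass)
```

For oral communication, where a single evaluator assessed live presentations, the same check applies with `required=1`.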

When interpreting the results, several factors should be considered that limit how representative the results are of college-wide student performance:

1. The sample included students across the core regardless of semester credit hours completed. The results may be generalized to the entire student population, but not to specific subgroups of students (e.g. graduates). This may change with the next assessment cycle: Blackboard has announced that, with its next upgrade, Outcomes will be able to sample specific groups of students, such as recent graduates or those about to graduate.

2. Inter-rater reliability does not yet meet institutional standards and suggests that some portion of the results may be attributed to differences among raters. The College is working to improve the inter-rater reliability. However, the inter-rater reliability may also be considered a measure of the extent to which an institutional standard has been established regarding general education outcomes.

3. The number of documents collected and the attrition rate in the random sample were higher than desired or expected. The error bars included on each chart of overall results account for the reduced sample size. Measures are being taken collaboratively, college-wide, to reduce the sample attrition rate.
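The "+/-" figures reported alongside each success rate behave like 95% confidence half-widths for a sample proportion. A minimal sketch under the usual normal approximation follows; the sample size below is illustrative (back-solved, not taken from the report), and the report's own method is documented in Appendix G:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% normal-approximation half-width for sample proportion p on n samples."""
    return z * math.sqrt(p * (1 - p) / n)

# Illustrative: a 58.3% success rate observed on roughly 260 usable samples
# yields a half-width near the +/- 6.0% reported for written communication.
print(round(margin_of_error(0.583, 260), 3))  # → 0.06
```

Smaller usable samples widen these intervals, which is why sample attrition directly affects how precisely the results can be stated.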

Written Communication

Core-wide, 58.3% (+/- 6.0%) of students were successful on all five criteria of the Written Communication rubric, which falls short of the College's goal of a 75% success rate.

Notably, the results indicated that students continue to struggle with Sources and Evidence (the current year's data were similar to the last cycle's). A disconnect between the assignments and the rubric needs to be addressed. The problem may also reside in students' inability to locate, evaluate, and utilize quality sources.

Visual Communication

Starting with the 2017-2018 academic year, visual communication and oral communication need to be taught and assessed core-wide. To determine if and where visual communication was being taught, it was assessed alongside the most likely candidate, written communication. Surprisingly, there was a 20.7% success rate (+/- 6.4%) in this area, with over 51% of the successful samples coming from Math, Science, and PHED courses.

Oral Communication, Speech Communication Courses

There were 65 sections among the speech communication courses assessed, with approximately 1,390 enrollments. 761 students delivered in-class oral presentations that were assessed by the faculty of record against the oral communication rubric. 629 students did not complete the assessment (a 45% attrition rate): 209 were reported as not having completed the assignment or lacked complete data, and the remaining 420 were not reported by the faculty of record for the course. The results are representative of only 55% of students within speech communication courses and may not be validly generalized to the entire college student population. 60.1% of students were successful.
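The enrollment, completion, and attrition figures above are internally consistent, as a quick arithmetic check with the reported numbers shows:

```python
enrollments = 1390   # approximate speech-course enrollments
assessed = 761       # students whose live presentations were assessed
incomplete = 209     # did not complete the assignment / incomplete data
unreported = 420     # not reported by the faculty of record

missing = incomplete + unreported
assert missing == 629  # matches the reported count of missing students

attrition = missing / enrollments
coverage = assessed / enrollments
print(f"attrition {attrition:.0%}, coverage {coverage:.0%}")
# → attrition 45%, coverage 55%
```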

Personal Responsibility

Core-wide, there was only a 40% (+/- 5.8%) success rate. Results indicate that many of the instances in which a student was marked unsuccessful occurred because the student was not asked to perform one or more of the required criteria. It is highly possible that many faculty members did not revise their common assignments after this rubric was revised, so alignment between the assignments and the rubric remains a major issue that needs to be addressed.

Social Responsibility

The success rate for social responsibility was 47.3% (+/- 5.8%) core-wide. Again, this is below the College's goal of a 75% success rate. Similar to personal responsibility, it is highly likely that common assignments were not revised in light of the newly revised rubric. This placed students at a disadvantage and may have negatively impacted their potential success.

Faculty Recommendations:

Focus groups for personal and social responsibility offered similar thoughts and recommendations:

(a) reviewing assignments to improve alignment with the selected rubrics;
(b) taking a completed, revised common assignment and elevating it to a signature assignment, especially for part-time faculty use;
(c) allowing faculty members to develop their own signature assignment(s) once a course-specific assignment has been vetted and approved;
(d) hosting workshops to assist faculty with assignment re-design that addresses alignment and rubric issues, as well as continuing to develop the institutional interpretations for all rubrics; and
(e) allowing faculty members to decide whether they will teach oral, visual, or written communication in their courses (a minimum of two is required).

All focus group discussions were recorded as abbreviated transcripts with a summary in Appendix A.

Executive Summary ............................................................................................................................1

Overview ...........................................................................................................................................4

General Education Outcomes ..........................................................................................................4

Assessment Philosophy ...................................................................................................................6

Assessment Plan .............................................................................................................................6

Procedures .........................................................................................................................................7

Identification of Courses..................................................................................................................7

Development of Common Assignment .............................................................................................7

Instruments ....................................................................................................................................7

Submission and Sampling of Artifacts ............................................................................................. 10

Evaluator Training ......................................................................................................................... 11

Sample Attrition............................................................................................................................ 12

Evaluation Period .......................................................................................................................... 12

Results ............................................................................................................................................. 12

Written Communication (with Visual Communication) .................................................................... 13

Oral Communication ..................................................................................................................... 14

Personal Responsibility.................................................................................................................. 15

Social Responsibility ..................................................................................................... 16

Continuous Improvement ................................................................................................................. 16

Instructional ................................................................................................................................. 17

Procedural .................................................................................................................................... 17

References ....................................................................................................................................... 19

Appendix A. Focus Group Summary ................................................................................................... 20

Appendix B. Focus Group Presentation .............................................................................................. 22

Appendix C. Disaggregated Results ................................................................................................... 90

Appendix D. Indirect Assessment Data ............................................................................................... 94

Appendix E. Rubrics and Institutional Interpretations.......................................................................... 101

Appendix F. Inter-Rater Reliability Estimates ...................................................................................... 118

Appendix G. Sample Size Calculation.................................................................................................. 122

Overview

General education outcomes are the core of the undergraduate curriculum for all students, regardless of major, and represent essential knowledge, skills, and attitudes necessary for academic disciplines and workforce careers. To evaluate institutional effectiveness and to facilitate continuous improvement efforts, general education outcomes are assessed by faculty at the course level, and the Office of Accreditation and Assessment coordinates institutional assessment and reporting efforts.

General Education Outcomes

Each institution of higher education that offers an undergraduate academic degree program develops its core curriculum by using the Texas Higher Education Coordinating Board-approved purpose, core objectives, and foundational component areas of the Texas Core Curriculum. By completing the Texas Core Curriculum, students will be prepared for contemporary challenges by meeting the following Core Objectives and accompanying definitions:

(A) Critical Thinking Skills: to include creative thinking, innovation, inquiry, and analysis, evaluation, and synthesis of information;

(B) Communication Skills: to include effective development, interpretation, and expression of ideas through written, oral, and visual communication;

(C) Empirical and Quantitative Skills: to include the manipulation and analysis of numerical data or observable facts resulting in informed conclusions;

(D) Teamwork: to include the ability to consider different points of view and to work effectively with others to support a shared purpose or goal;

(E) Personal Responsibility: to include the ability to connect choices, actions, and consequences to ethical decision-making; and

(F) Social Responsibility: to include intercultural competence, knowledge of civic responsibility, and the ability to engage effectively in regional, national, and global communities.

Each institution's core curriculum is composed of courses that adhere to the content description, core objectives, and semester credit hour requirements for a specific Foundational Component Area. The Foundational Component Areas are:

1) Communication
2) Mathematics
3) Life and Physical Science
4) Language, Philosophy, and Culture
5) Creative Arts
6) U.S. History
7) Government/Political Science
8) Social/Behavioral Science
9) Component Area Option

The Core Objectives for the Foundation Component Areas are as follows (CT = Critical Thinking, COM = Communication Skills, EQS = Empirical and Quantitative Skills, TW = Teamwork, SR = Social Responsibility, PR = Personal Responsibility):

Foundation Component Area             CT   COM  EQS  TW   SR   PR
Communication                         X    X         X         X
Mathematics                           X    X    X
Life and Physical Sciences            X    X    X    X
Language, Philosophy and Culture      X    X              X    X
Creative Arts                         X    X         X    X
U.S. History                          X    X              X    X
Government/Political Science          X    X              X    X
Social/Behavioral Science             X    X    X         X
Component Area Option:
  Language and Communication          X    X         X         X
  Physical Education                  X    X         X         X

The faculty-developed San Jacinto College District general education outcomes are:

1. Communication Skills – Students will communicate ideas, express feelings, and support conclusions effectively in written, oral, and visual formats.
2. Critical Thinking Skills – Students will develop habits of mind, allowing them to appreciate the processes by which scholars in various disciplines organize and evaluate data and use the methodologies of each discipline to understand the human experience.
3. Empirical and Quantitative Skills – Students will develop quantitative and empirical skills to understand, analyze, and explain natural, physical, and social realms.
4. Teamwork – Students will consider different points of view and work interdependently to achieve a shared purpose or goal.
5. Personal Responsibility – Students will develop habits of intellectual exploration, personal responsibility, and physical wellbeing.
6. Social Responsibility – Students will demonstrate a global perspective toward issues of culture, society, politics, environment, and sustainability.

Since there is no single course that measures all Core Objectives or Foundation Component Areas, student attainment of core objectives follows from students completing core curriculum requirements for respective degrees.

Assessment Philosophy

To guide assessment of instructional outcomes, a philosophical framework of assessment has been developed. San Jacinto College instructional assessment is faculty centered, uses direct and indirect methods, embeds assessment of student learning in courses, and adapts proven and research-based methods to meet college needs and requirements.

Instructional assessment at San Jacinto College is faculty centered. Faculty are responsible for assessing student attainment of all outcomes, including content learning outcomes, program outcomes, and general education outcomes. Thus, faculty remain at the center of institutional efforts to define assessment procedures and guidelines. The Institutional Effectiveness Council and the Assessment Committee include faculty representatives, and both groups regularly consult with faculty to develop and refine institutional procedures.

Instructional assessment at San Jacinto College uses direct and indirect methods of assessment. Direct assessment methods focus on samples of student work: student performance on objective assessment instruments may be aggregated, or products of student work may be assessed using a performance rubric or other "authentic" assessment instrument. Indirect assessment relies on secondary reports regarding student attainment of learning outcomes, often in the form of surveys or focus groups in which students, employers, or faculty are asked how well they believe a student performed on a particular outcome.

Instructional assessment at San Jacinto College embeds assessment of student learning in courses. In many areas, standardized exams have been developed to assess student attainment of knowledge and skills. Many career fields require licensure or certification earned by satisfactory performance on a standardized exam, and several of the college's technical/workforce programs culminate with a student preparing for and taking a licensure or certification exam (e.g. paralegal, nursing, and health information management). While results of licensure and certification exams may be considered, San Jacinto College's institutional assessment of learning outcomes focuses on course-embedded assessment: the college assesses student attainment of course, program, and general education outcomes by assessing student work produced to meet requirements for a course.

Instructional assessment at San Jacinto College adapts proven and research-based methods to meet college assessment needs and requirements. Faculty, department chairs, deans, provosts, and assessment professionals at the college are expected to maintain awareness of trends and issues related to assessment of student learning in higher education; this is accomplished through professional development opportunities, attendance at assessment and discipline-specific conferences, and networking with colleagues at other institutions. The college actively incorporates and adapts proven assessment methods used and reported by other colleagues and institutions.

Assessment Plan

San Jacinto College established a two-year cycle for reporting results of general education outcomes assessment. The two-year cycle was agreed upon by the Assessment Committee given the College's assessment needs, existing continuous improvement cycles, and accreditation and statutory reporting requirements. This year, assessment results for four general education outcomes are reported: written communication (with visual communication), oral communication, personal responsibility, and social responsibility.

Procedures

Identification of Courses

Given that no single course measures all Core Objectives or Foundation Component Areas and that student attainment of core objectives occurs through completion of core curriculum requirements for respective degrees, assessment regarding student attainment of general education outcomes should focus on the entire core curriculum.

Thus, sampling and collection of student work included all courses in the core curriculum. A limitation of this sampling method is that students at all points of their academic careers may be included in the sample; ENGL 1301, for example, typically enrolls many first-semester students. This limitation must be considered when interpreting the results.

With Blackboard’s next upgrade, it will be possible for the College to only assess graduates or students about to graduate through Blackboard Outcomes.

San Jacinto College has 27,205 students enrolled in credit courses, with approximately 1,884 students graduating in the spring semester. Given institutional standards for sampling power (α = 0.05, CI = 95%), appropriate samples were drawn within the scope of each construct as defined by the outcomes and rubrics; each sample size was derived from the institutional sample size calculation table (Appendix G).
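The institutional sample size table in Appendix G is not reproduced here, but a standard calculation (Cochran's formula with a finite-population correction) illustrates the general approach; the parameters below (maximum variance p = 0.5, a ±5% margin) are common defaults and assumptions, not necessarily those used in Appendix G:

```python
import math

def sample_size(N, margin=0.05, z=1.96, p=0.5):
    """Cochran's sample size with finite-population correction for population N."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population size (~384)
    return math.ceil(n0 / (1 + (n0 - 1) / N))   # corrected for finite N

print(sample_size(27205))  # population of credit students → 379
```

Tighter margins or subgroup sampling (e.g. only graduating students) would change the required sample accordingly.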

Development of Common Assignment

Prior to the beginning of the fall semester, faculty teaching a core curriculum course convened to develop or revise a "common assignment" aligned to the general education outcomes and to the LEAP VALUE rubrics with which student work would be assessed. Common assignments developed by faculty for each course are available on the network drive at G:\Assessment\General Education (Institutional) Level Assessment\2016 - 2017 Common Assignments.

Instruments

Direct, Course-Embedded Assessment via LEAP VALUE Rubrics

The American Association of Colleges and Universities' (AAC&U) "Valid Assessment of Learning in Undergraduate Education" (VALUE) project developed institutional-level rubrics for sixteen areas of learning related to general education outcomes. Each rubric "represents a distillation and synthesis of the core elements of learning for each outcome" (Rhodes & Finley, 2013, p. 1).

The faculty Assessment Committee elected to assess College general education outcomes using the VALUE rubrics as the primary direct assessment instrument. That decision was based on (a) the broad, research-based methodology by which the rubrics were developed, (b) the widespread adoption among comparable higher education institutions, (c) regional and state-wide interest in the rubrics among Texas institutions, and (d) the immediate availability of and liberal licensing for the rubrics.

The Assessment Committee, in collaboration with faculty teaching in respective disciplines, identified the following rubrics to be used to assess samples of student work in the noted component areas of the core curriculum.

Rubrics relevant to Written Communication (with Visual Communication), Oral Communication, Personal Responsibility, and Social Responsibility are available in Appendix E. Rubrics and Institutional Interpretations. During the previous assessment cycle, College faculty who taught courses aligned to Social Responsibility and Personal Responsibility were not satisfied with the AAC&U rubrics. They participated in a card sort of terms and ideas drawn from other rubrics, including the AAC&U rubrics, to create rubrics unique to the College. The new rubrics were then approved by the Assessment Committee and distributed.

The VALUE rubrics have also been adapted slightly to meet institutional needs. One notable modification, the Usability of Sample criterion, is described below:

SJC Modifications to LEAP VALUE Rubrics

Beginning with 2012-2013, a Level 0 column was added to every criterion of all rubrics. Level 0 adheres to the instructions of the original AAC&U rubrics and indicates the student did not attain a Level 1 performance on the criterion.

Beginning with 2014-2015, a Usability of Sample criterion was added to all rubrics. The criterion was designed to give faculty evaluators a means of providing feedback on the quality of each sample they assessed:

Level 0 – The sample was unusable: the submission was blank, could not be opened, was the wrong file format, was not the work of one student, or contained an instance of plagiarism.

Level 1 – The assignment information presented to students in Blackboard conflicted with the common assignment intended for the course.

Level 2 – The nature of the student's response interfered with the evaluator's ability to reliably assess the outcome (e.g. the student wrote only a few sentences rather than the assigned pages, or the student's written communication skills interfered with assessing another outcome).

Level 3 – The evaluator believed the student's performance on the outcome was negatively influenced by misalignment of the rubric and the assignment.

Level 4 – None of the aforementioned issues were present, and the sample provided an opportunity, from the evaluator's perspective, to reliably assess the given outcome.
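One analysis step implied by the Usability of Sample criterion is excluding unusable documents before computing success rates. The following is a hypothetical sketch: the report does not state which usability levels were excluded, so dropping only Level 0 samples is an assumption, and the data are invented for illustration:

```python
# Each record pairs a usability level (0-4, per the criterion above) with
# whether the student attained the outcome.
samples = [
    (4, True), (4, False), (3, True), (2, False), (0, False), (1, True),
]

MIN_USABILITY = 1  # assumption: drop only Level 0 (unusable) samples

usable = [attained for level, attained in samples if level >= MIN_USABILITY]
success_rate = sum(usable) / len(usable)
print(f"{len(usable)} usable of {len(samples)}; success {success_rate:.0%}")
# → 5 usable of 6; success 60%
```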

Written Communication (with Visual Communication)
  Component areas: Communication; Mathematics; Life and Physical Science; Language, Philosophy, and Culture; Creative Arts; U.S. History; Government/Political Science; Social/Behavioral Science; Component Area Option
  Rubric: LEAP VALUE Written Communication

Oral Communication
  Courses: SPCH 1311, SPCH 1315, SPCH 1321
  Rubric: LEAP VALUE Oral Communication

Personal Responsibility
  Component areas: Communication; Language, Philosophy, and Culture; U.S. History; Government/Political Science; Speech, Language, Physical Education
  Rubric: SAN JACINTO COLLEGE Personal Responsibility

Social Responsibility
  Component areas: Language, Philosophy, and Culture; Creative Arts; U.S. History; Government/Political Science; Social/Behavioral Science
  Rubric: SAN JACINTO COLLEGE Social Responsibility

Indirect Assessment of General Education Outcomes

The college uses three different instruments as indirect assessment methods regarding student attainment of general education outcomes: the End of Course Survey, the Graduate Exit Survey, and the Community College Survey of Student Engagement.

End of Course Survey. In the standard, college-wide End of Course Survey issued to each student in each core curriculum course section, seven questions address the general education outcomes. Students are asked if the course "increased [their] ability to" perform against each of the general education outcomes included in the course outcomes and syllabus. For the current year, the questions included:

This course increased my ability to communicate in written, oral, or visual formats.

This course encouraged me to take responsibility for my own learning and well-being.

This course increased my ability to understand people with different backgrounds or perspectives.

The answer choices were a five-point Likert scale from Strongly Agree to Strongly Disagree with a midpoint of Neutral; Uncertain was offered as a sixth option.

Graduate Exit Survey. In the standard, college-wide Graduate Exit Survey issued to each student during the last two weeks of the semester in which they have applied for graduation, seven questions address the general education outcomes. Students are asked if their "experience at San Jacinto College improved [their] ability to" perform against each of the general education outcomes. For the current year, the relevant questions were:

write clearly and concisely for different audiences and purposes,

communicate verbally with others and to speak to groups of people,

take responsibility for my own learning, well-being, and decisions,

understand people with different backgrounds or perspectives

The answer choices were a six point Likert scale from Agree Strongly to Disagree Strongly.

Community College Survey of Student Engagement (CCSSE). In the biennial administration of the survey, questions relevant to general education outcomes assessment were identified. For the current year, results from the 2013 and 2015 administrations of the survey were available, including the following questions:

o In your experiences at this College during the current school year, about how often have you done each of the following? (Very Often, Often, Sometimes, Never)
    o asked questions in class or contributed to class discussions (COMM)
    o made a class presentation (COMM)
    o prepared two or more drafts of a paper or assignment before turning it in (COMM)
    o worked on a paper or project that required integrating ideas or information from various sources (COMM)
    o came to class without completing readings or assignments (PR)
    o participated in a community-based project as part of a regular course (SR)
    o had serious conversations with students of a race or ethnicity other than your own (SR)
    o had serious conversations with students who differ from you in terms of their religious beliefs, political opinions, or personal values (SR)
o During the current school year, about how much reading and writing have you done at this college? (None, 1 to 4, 5 to 10, 11 to 20, More than 20)
    o number of written papers or reports of any length (COMM)


o How much does this college emphasize each of the following? (Very Much, Quite a Bit, Some, Very Little)
    o encouraging contact among students from different economic, social, or ethnic backgrounds (SR)
o How much has your experience at this college contributed to your knowledge, skills, and personal development in the following areas? (Very Much, Quite a Bit, Some, Very Little)
    o acquiring a broad general education (COMM) (PR) (SR)
    o writing clearly and effectively (COMM)
    o speaking clearly and effectively (COMM)
    o understanding people of other racial and ethnic backgrounds (SR)
    o developing a personal code of values and ethics (PR)
    o contributing to the welfare of your community (SR)
o (Additional COMM):
    o The assignments and activities in this course encouraged discussion and interaction in class.
    o The professor encouraged me to participate and share ideas or experiences during class.
    o I worked on a paper or project for this course that required two or more drafts before I turned it in.
    o This course increased my ability to communicate in written, oral, or visual formats.
    o This course encouraged me to take responsibility for my own learning and well-being. (PR)
    o This course increased my ability to understand people with different backgrounds or perspectives. (SR)

Submission and Sampling of Artifacts

During the fall 2016 semester, students submitted their common assignments to an Assignment item in Blackboard Learn. The intention was that faculty graded the assignment as part of the course requirements. Faculty used the Add Alignment feature in Blackboard Learn to indicate that the assignment was aligned to the general education outcomes relevant to the course and the assignment. The Office of Accreditation and Assessment used Blackboard Outcomes to electronically collect the documents students submitted to assignments aligned to the respective outcomes.

Across all courses and outcomes included in the project, there were 1,785 course sections with 44,688 enrollments; 52,973 documents were collected, and 1,120 were included in the sample. Blackboard Outcomes collected a copy of the entire population of student artifacts stored in Blackboard Learn. In contrast to the prior year’s collection, which had a 48.3% overall attrition rate from population to collection, the population-to-collection attrition rate this year was 39.6%. That attrition rate is the combined effect of two factors: (a) students not submitting a response to an assignment, and (b) faculty not properly aligning the assignment to a general education outcome. Alignment reports, which indicate whether course sections have the expected assignment-to-outcome alignments, showed that 28.8 percentage points of the attrition were due to students not submitting a response and 10.8 points were due to faculty not properly aligning the assignment. In the prior year, the rates were 25.8% and 22.4%, respectively.
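As a sanity check, the 39.6% population-to-collection attrition rate can be reproduced from the per-outcome figures reported in the sections that follow. The sketch below is illustrative arithmetic only; the variable names are assumptions, and the numbers are those stated elsewhere in this report:

```python
# Per-outcome enrollments and documents collected, as reported in the
# outcome sections of this report (fall 2016).
enrollments = {"written": 44_688, "personal": 22_624, "social": 20_377}
collected = {"written": 26_661, "personal": 14_175, "social": 12_137}

population = sum(enrollments.values())  # 87,689 enrollments across outcomes
documents = sum(collected.values())     # 52,973 documents collected

# Population-to-collection attrition: the share of enrollments for which
# no document reached the collection.
attrition = 1 - documents / population
print(f"{attrition:.1%}")  # 39.6%
```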


Written Communication

There were 1,785 sections core-wide with approximately 44,688 enrollments; 26,661 documents were collected. Of the 375 documents sampled, only 276 were usable (a 26.4% attrition rate). Fifty-one faculty evaluators participated, and each document was assessed three times against the LEAP VALUE rubric.

Visual Communication

Starting with the 2017-2018 academic year, visual communication and oral communication must be taught and assessed core-wide. To determine if and where visual communication was already being taught, it was assessed alongside the most likely candidate, written communication. Surprisingly, there was a 20.7% success rate (+/- 6.4%) in this area, with over 51% of the successful samples coming from Math, Science, and PHED courses.

Oral Communication

There were 65 sections among the speech communication courses assessed. In total, 761 students delivered in-class oral presentations that the faculty of record assessed against the oral communication rubric. Another 629 students did not complete the assessment (a 45% attrition rate): 209 were reported as not having completed the assignment or as lacking complete data, and the remaining 420 were not reported by the faculty of record for the course.

Personal Responsibility

There were 948 sections among the courses with approximately 22,624 enrollments; 14,175 documents were collected. Of the 381 documents sampled, only 300 were usable (a 21.3% attrition rate). Thirty-seven faculty evaluators participated, and each document was assessed three times against the criteria of the newly revised rubric.

Social Responsibility

There were 765 sections among the courses with approximately 20,377 enrollments; 12,137 documents were collected. Of the 364 documents sampled, only 294 were usable (a 19.2% attrition rate). Thirty-three faculty evaluators participated, and each document was assessed three times against the criteria of the newly revised rubric.

Evaluator Training

Faculty Only

In mid-August 2016, faculty were asked by provosts, deans, and department chairs to participate in the general education outcomes assessment process as evaluators. Of the 118 faculty members initially identified as evaluators, 101 individual faculty attended 200 (duplicated) two-hour training and calibration sessions regarding one of the three LEAP VALUE rubrics. The calibration sessions were offered in two rounds: one during the first two weeks of November and one during the last week of January and first weeks of February. Faculty were asked to attend a session during each round; 85 faculty attended a session during each round, and the remaining 16 faculty attended a session during one of the two rounds.

All Evaluators

In mid-August 2016, faculty were asked by provosts, deans, and department chairs to participate in the general education outcomes assessment process as evaluators. Of the 123 employees initially identified as evaluators, 106 individual employees attended 210 (duplicated) two-hour training and calibration sessions regarding one of the three LEAP VALUE rubrics. The calibration sessions were offered in two rounds: one during the first two weeks of November and one during the last week of January and first weeks of February. Employees were asked to attend a session during each round; 89 employees attended a session during each round, and the remaining 17 employees attended a session during one of the two rounds.

During the calibration sessions, a brief overview of the outcomes assessment process and procedures was provided. Ten sample student artifacts were provided to faculty, who independently applied the respective rubric to 2-3 sample artifacts at a time; after each set, results were discussed with an emphasis on the sources of differences in ratings between faculty. The calibration sessions focused on establishing an institutional interpretation of each rubric and criterion; decisions made within calibration sessions were compiled and distributed to faculty evaluators prior to the evaluation periods. The institutional interpretations of the rubrics will evolve and be continuously updated.

Sample Attrition

In addition to the enrollment-to-collection attrition described previously, additional attrition occurred during the evaluation session because of (a) errors by evaluators in completing the evaluation and (b) anomalies in student submissions. Of the 1,120 documents sampled initially, 870 (78% of the sample) were evaluated by three evaluators each and included in the data analysis. The attrition in the sample limits the generalizability of the results and should be considered when interpreting them.

Evaluation Period

The evaluation period began on February 16-17 with two evaluation kick-off meetings to which all faculty evaluators who had completed at least one calibration session were invited. Seventy-one faculty attended these meetings, during which the institutional interpretation of each rubric was reviewed and discussed in small groups and the procedures for the evaluation session were described; faculty later accessed the evaluation session via Blackboard Outcomes. Three evaluators were assigned to each student artifact for inter-rater reliability purposes. Faculty completed their evaluations between February 17 and March 19.

Results

While San Jacinto College would like to see all of its students achieve a standard of 100% on the core objectives, the Assessment Committee established a standard whereby 75% of students will score at least a ‘2’ on each criterion of each respective LEAP VALUE rubric. Student criterion-level success required a ‘2’ or better as assessed by 2 of 3 evaluators; composite success required a student to be assessed as successful on each criterion of the rubric.
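The success rule above can be expressed compactly in code. The following is a minimal sketch under stated assumptions (the criterion names and the score layout are hypothetical, not the College’s actual data structures; scores follow the rubric’s 0-4 levels):

```python
def criterion_success(scores):
    """One criterion is attained if at least 2 of the 3 evaluators
    assigned a score of '2' or better (scores run 0-4 on the rubric)."""
    return sum(score >= 2 for score in scores) >= 2

def composite_success(ratings):
    """Composite success requires attainment on every rubric criterion.
    ratings maps each criterion to the three evaluators' scores."""
    return all(criterion_success(scores) for scores in ratings.values())

# Hypothetical student: 2 of 3 evaluators scored '2' or better on each
# criterion, so the student counts as successful overall.
student = {
    "Criterion A": [3, 2, 1],
    "Criterion B": [2, 2, 3],
    "Criterion C": [1, 2, 2],
}
print(composite_success(student))  # True
```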

Inter-rater reliability was calculated (a) for success and not success overall and (b) for each criterion using Krippendorff’s alpha, applied in a manner that accommodated the sampling and evaluation design, in which only three of the many evaluators in a session were assigned to each student document. In short, with 51 evaluators in a session, each individual document was evaluated by 3 evaluators and had 48 “missing values.” Krippendorff’s alpha was applied in a manner that addresses such missing data.
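To illustrate how the coefficient accommodates those missing values, the following is a from-scratch sketch of nominal-level Krippendorff’s alpha (for illustration only; the evaluator data shown are hypothetical, and the College’s actual computation is not reproduced here):

```python
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Nominal Krippendorff's alpha. Each unit is one student document;
    each entry is one evaluator's code, with None where an evaluator
    was not assigned to that document (a "missing value")."""
    coincidence = {}  # pairwise value-value coincidence counts
    for unit in units:
        codes = [c for c in unit if c is not None]
        m = len(codes)
        if m < 2:
            continue  # units with a single rating carry no information
        for a, b in permutations(codes, 2):
            coincidence[(a, b)] = coincidence.get((a, b), 0.0) + 1.0 / (m - 1)
    totals = {}
    for (a, _b), v in coincidence.items():
        totals[a] = totals.get(a, 0.0) + v
    n = sum(totals.values())
    # Observed vs. expected disagreement; under the nominal metric,
    # two codes either match exactly or they do not.
    d_o = sum(v for (a, b), v in coincidence.items() if a != b) / n
    d_e = sum(totals[a] * totals[b]
              for a in totals for b in totals if a != b) / (n * (n - 1))
    return 1.0 - d_o / d_e

# Four hypothetical documents coded success ("S") / not success ("N") by
# 3 of 4 evaluators each; None marks the unassigned evaluator.
docs = [["S", "S", "N", None],
        ["S", "S", "S", None],
        [None, "N", "N", "N"],
        ["S", None, "N", "N"]]
print(round(krippendorff_alpha_nominal(docs), 3))  # 0.389
```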

Across all three outcomes and rubrics, inter-rater reliability is a critical concern. In a purely academic research setting, the results might be discarded pending improvement of inter-rater reliability and of the application of the rubric to student work. However, given that the nature and purpose of the research is institutional improvement, the results must be taken very cautiously, with a commitment to improve them both instructionally and procedurally.

Written Communication (with visual communication)

Direct Assessment

The overall success rate for written communication across the core was 58.3%. Results were also disaggregated by criterion for all students and for students who were “not successful” overall. The disaggregated criterion-level results were presented to focus group participants to inform discussions regarding continuous improvement efforts. The disaggregated results are available in Appendix C.

Notable results include the high percentage of unsuccessful students who were unsuccessful on Sources and Evidence: 85.2% of unsuccessful students were not successful on that criterion.

A limitation of the results exists in the sampling method; the results may not be generalized to a statement regarding the success of graduates because the sampling is across the core and likely includes assessed work of students early in their academic careers.

Inter-rater reliability by criterion did not meet institutional research standards (α > 0.60). For the LEAP VALUE written communication rubric for this sample, the inter-rater reliability coefficients for scores dichotomized into not success (0, 1) and success (2, 3, 4) ranged from 0.14 to 0.26. The inter-rater reliability for the Usability of Sample criterion was α = 0.44.

Visual Communication

The overall success rate for visual communication across the core was 20.7%. Not all courses necessarily included an explicit visual element as part of the assignment; thus, the success rate is very low. This reveals the extent to which visual communication must be increased across the core. It will be assessed along with oral communication during the next assessment cycle. Nevertheless, these results were also disaggregated by criterion for all students and for students who were “not successful” overall. The results were presented to focus group participants to inform discussions regarding continuous improvement efforts, especially how to better integrate visual communication into the core. The disaggregated results are available in Appendix C.


Inter-rater reliability by criterion did not meet institutional research standards (α > 0.60). For the LEAP VALUE visual communication criteria appended to the written communication rubric for this sample, the inter-rater reliability coefficients for scores dichotomized into not success (0, 1) and success (2, 3, 4) ranged from 0.21 to 0.30 (only 2 items assessed).

Indirect Assessment

The observed results from the Fall 2016 and Spring 2017 End of Course Survey and the two most recent administrations of the Graduate Exit Survey (Fall 2016 and Spring 2017) did not reveal notable results; students typically reported strong agreement regarding college coursework improving their ability to write clearly and concisely for different audiences and purposes. Relevant results are available in Appendix D.

The 2013 and 2015 administrations of CCSSE included four questions deemed relevant to written communication. Notable results were observed on one of the questions: from 2013 to 2015, there was an observable increase in the number of students who indicated that their experience at the College contributed to their knowledge, skills, and personal development in terms of writing clearly and effectively. Overall, the number of students indicating Very Much increased by 5%.

Oral Communication

Direct Assessment

The overall success rate for oral communication was 60.1%. Results were also disaggregated by criterion for all students and for students who were “not successful” overall. The disaggregated criterion-level results were presented to focus group participants to inform discussions regarding continuous improvement efforts. The disaggregated results are available in Appendix C.

Notable results include the nearly identical success rates across all three criteria (organization: 61.3% successful/38.7% unsuccessful; content: 61.9%/38.1%; delivery: 61.9%/38.1%).

These results are descriptive; faculty teaching sections of a speech course included in the core assessed students within the course and reported the results. The data represent the population and not a sample.

Indirect Assessment

The observed results from the Fall 2016 and Spring 2017 End of Course Survey and the two most recent administrations of the Graduate Exit Survey (Fall 2016 and Spring 2017) did not reveal notable results; students typically reported strong agreement regarding college coursework improving their ability to communicate verbally with others and to speak to groups of people. Relevant results are available in Appendix D.

The 2013 and 2015 administrations of CCSSE included three questions deemed relevant to oral communication. Notable results were observed on one of the questions: from 2013 to 2015, there was a slight increase in the number of students who indicated that their experience at the College contributed to their knowledge, skills, and personal development in terms of speaking clearly and effectively. Overall, the number of students indicating Very Much increased by nearly 5%.

Personal Responsibility

Direct Assessment

The overall success rate for personal responsibility was 40.0%. Results were also disaggregated by criterion for all students and for students who were “not successful” overall. The disaggregated criterion-level results were presented to focus group participants to inform discussions regarding continuous improvement efforts. The disaggregated results are available in Appendix C.

The lower success rate, 40.0%, is notable. However, the results regarding the Usability of Sample likely explain that low success rate. For 14.5% of unsuccessful students, 2 of 3 evaluators indicated a usability Level 3, which suggests student performance was negatively influenced by misalignment between the rubric and the assignment. An additional 15.1% of samples were listed as “inconclusive,” meaning that at least 1 of 3 evaluators believed there to be a misalignment issue.

A limitation of the results exists in the sampling method; the results may not be generalized to a statement regarding the success of graduates because the sampling is across the core and likely includes assessed work of students early in their academic careers.

Inter-rater reliability by criterion did not meet institutional research standards (α > 0.60). For the LEAP VALUE composite personal responsibility rubric for this sample, the inter-rater reliability coefficients for scores dichotomized into not success (0, 1) and success (2, 3, 4) ranged from 0.13 to 0.33. The inter-rater reliability for the Usability of Sample criterion was α = 0.40.

Indirect Assessment

The observed results from the Fall 2016 and Spring 2017 End of Course Survey and the two most recent administrations of the Graduate Exit Survey (Fall 2016 and Spring 2017) did not reveal notable results; students typically reported strong agreement regarding college coursework improving their ability to take responsibility for their own learning, well-being, and decisions. Relevant results are available in Appendix D.

The 2013 and 2015 administrations of CCSSE included two questions deemed relevant to personal responsibility. No notable results were observed.


Social Responsibility

Direct Assessment

The overall success rate for social responsibility was 47.3%, a significant improvement over last cycle’s 19.3%. Results were also disaggregated by criterion for all students and for students who were “not successful” overall. The disaggregated criterion-level results were presented to focus group participants to inform discussions regarding continuous improvement efforts. The disaggregated results are available in Appendix C.

Although an improvement, this is still a very low success rate. However, the results regarding the Usability of Sample likely explain that low success rate. For 21.6% of unsuccessful students, 2 of 3 evaluators indicated a usability Level 3, which suggests student performance was negatively influenced by misalignment between the rubric and the assignment. An additional 17.0% of samples were listed as “inconclusive,” meaning that at least 1 of 3 evaluators believed there to be a misalignment issue.

A limitation of the results exists in the sampling method; the results may not be generalized to a statement regarding the success of graduates because the sampling is across the core and likely includes assessed work of students early in their academic careers.

Inter-rater reliability by criterion did not meet institutional research standards (α > 0.60). For the LEAP VALUE composite social responsibility rubric for this sample, the inter-rater reliability coefficients for scores dichotomized into not success (0, 1) and success (2, 3, 4) ranged from 0.07 to 0.16. The inter-rater reliability for the Usability of Sample criterion was α = 0.46.

Indirect Assessment

The observed results from the Fall 2016 and Spring 2017 End of Course Survey and the two most recent administrations of the Graduate Exit Survey (Fall 2016 and Spring 2017) did not reveal notable results; students typically reported strong agreement regarding college coursework improving their ability to understand people with different backgrounds or perspectives. Relevant results are available in Appendix D.

The 2013 and 2015 administrations of CCSSE included six questions deemed relevant to social responsibility. Notable results were observed on one of the questions: from 2013 to 2015, there was an observable increase in the number of students who indicated that the College emphasized encouraging contact among students from different economic, social, and racial or ethnic backgrounds. Overall, the number of students indicating Very Much increased by 5%.

Continuous Improvement

To develop recommendations for continuous improvement of student attainment of general education outcomes, all full-time faculty, instructional leaders, and professional personnel were invited to participate in outcomes assessment results focus groups. Two focus group sessions for each outcome were scheduled on each campus and for one day at the District office. A total of 21 focus group sessions were conducted between April 11 and April 20. Across the three outcomes, 44 participants attended the focus groups: 24 in written communication (which included oral and visual communication discussions), 9 in personal responsibility, and 17 in social responsibility. Some participants attended more than one session. Focus group sessions presented results for the outcome by rubric and then centered on two primary questions:

What do the results mean? What is our interpretation of the results?

What do we need to do, as an institution, to improve future results?

Additional questions were explored and discussed based upon the interest of and responses by the participants within the respective sessions. Specific questions were asked with a focus on (a) how to improve the rate at which common assignments are properly aligned to outcomes in Blackboard; (b) how to transition from common assignments to signature assignments; (c) how to strengthen the institutional interpretations for each outcome; and (d) how to address the continuing issue with Sources and Evidence in Written Communication.

All discussions were recorded as abbreviated transcripts; all focus group discussions quickly reached a point of saturation around three recommendations. A summary of these discussions is available in Appendix A.

The Assessment Committee will review, revise, and approve recommendations, including accountability methods and timelines for implementation, for both instructional improvements and procedural improvements.

Instructional

Recommendation: Collaborate with subject matter expert faculty to develop discipline-agnostic teaching resources focused on formative assessment and scaffolding of written communication skills; the initial focus should be on Sources and Evidence. Make resources available directly and/or through professional development opportunities for faculty; incorporate promotion of collaborative technologies into common resources.

Owner(s): Office of Accreditation & Assessment

Method/Date: Teaching resources developed by May 1, 2018. Professional development opportunities to be made available beginning Fall 2018.

Procedural

Recommendation: Collaborate with faculty members on revisions to all institutional interpretations of rubrics by examining the rubrics criterion by criterion; produce final, edited versions of all institutional interpretations.

Owner(s): Deans, Department Chairs, and Faculty with support from the Office of Accreditation & Assessment

Method/Date: Complete by May 1, 2018

Recommendation: The College should transition from “common assignments” to “signature assignments.” A signature assignment is different from a common assignment in that:

o an assignment design worksheet has been completed by the faculty designing the assignment for each outcome the assignment is intended to assess;
o a team of faculty has reviewed/vetted the assignment and the assignment design worksheets for all outcomes the assignment is to assess; and
o the assignment has then been added to an institutional repository of signature assignments.

Owner(s): Deans, Department Chairs, and Faculty with support from the Office of Accreditation & Assessment

Method/Date: Complete by May 1, 2018


References

Rhodes, T. L., & Finley, A. (2013). Using the VALUE rubrics for improvement of learning and authentic assessment. Washington, DC: Association of American Colleges and Universities.


Appendix A. Focus Group Summary

Communication (Written, Visual, and Oral)

Discussions addressed the need to include visual or oral communication in all areas of the core for the next assessment cycle. Faculty will need to work together to determine which outcome is best suited to a particular course. For example, Math would be a great place to include visual communication but a difficult place to address oral communication.

o This creates an issue. For example, if Speech wants to assess oral and visual communication, can they now omit written? How will the other outcomes be assessed if there are no written documents submitted/collected?

Evaluators expressed concern about assessing work outside their field of expertise. How can an evaluator know all the conventions of multiple disciplines?

Sources and Evidence continues to be a difficult criterion for students and faculty. We need to look at what we are trying to accomplish and work toward it. Is it about the mechanics of citations, or is it about students integrating sources into their writing? It may be beneficial to hold institutional interpretation workshops to help better define key terms and concepts.

o A special comment was made regarding North campus students: three of the poorest zip codes in the Houston metro area are in the campus’s service area. These students may not receive as much encouragement to read or to pursue other “intellectual” activities.

o Students district-wide also seem to struggle with equating “beliefs” to evidence and with social media serving as their primary source of information, thereby exposing themselves only to ideas with which they already agree.

Personal Responsibility

An evaluator commented that the rubric requires a higher level of thinking than most assignments prompt for; this again indicates a need for a workshop or session that focuses on aligning common assignments to the rubric.

It was also suggested that faculty scaffold assignments dealing with personal responsibility throughout the semester, as this is most likely students’ first time addressing this outcome.

Reflection continues to be a difficult criterion with low scores. Most likely, students are not being asked to address it in their assignments.

Reacquaint faculty with this rubric.

Social Responsibility

Assignments do not match well with the rubric.

Assignments need better definitions. For example, how is “culture” viewed or defined in a given discipline?

Students struggle with two things: looking past their own previously held viewpoints and being unable to take a stand on an issue (“what do you want me to think?”). This is difficult to work through in only two years of instruction.

One evaluator suggested that we should be scaffolding ALL of this in GUST 1301. Has there been an improvement in student general education results since GUST 1301 was implemented?

Students performed well when the assignment asked for the criteria to be addressed; conversely, students performed poorly when they were not asked to do something.


General Recommendations

All common assignments need to be reviewed in order to ensure proper alignment with the

rubrics used to assess samples of student work. These newly revised common assignments will then become signature assignments and will be distributed to all faculty teaching a particular course. Many current common assignments do not require students to perform in capacities

expected by the rubrics and it has a substantial, negative effect on their success across all outcomes.

o “Validity Worksheets” might be valuable in assisting faculty with demonstrating how an assignment is expected to elicit a response from a student relevant to a particular outcome. These could also increase accountability.

o Faculty who choose to develop their own signature assignment(s) would be encouraged to do so only after a signature assignment for a course has been developed and vetted.

They may also be required to have assessed general education outcomes previously and/or attend a workshop.

o Could signature assignments be more flat/generic in nature? That way they can be pulled from the repository for a variety of courses.
o Evaluators know the rubrics better than anyone. Could we put a mentorship program in place? Could they assist with workshops?

Results and focus group recommendations should be presented to all full-time faculty teaching in academic areas. Faculty need to be more aware of the results and the rubrics. This may be appropriate during college community week meetings in August, in combination with general education information included in faculty packets. However, the fall focus this year needs to be on program outcomes assessment.

Consider posting rubrics to SOS (faculty access the most up-to-date syllabi here). Should the general education outcomes be listed and defined in all syllabi?

o Add the gen-ed-faculty link on the internal website?

Evaluators recommend that faculty include full details in the assignment in Blackboard and not just a “submit here” type statement. This is where “validity worksheets” could provide some additional context and guidance for those who are not familiar with a particular discipline.

Continue efforts to improve the rate at which common assignments are properly aligned to outcomes in Blackboard (provide Department Chairs with alignment reports every 3 weeks).

Institutional interpretation workshops may help better define what each criterion means and how to require it in an assignment. Perhaps assign 2-3 faculty to a particular criterion and have them workshop that criterion?

o Need to consider disciplines and how each one may define key rubric terms differently.

Some evaluators expressed concern about how much we are trying to do in a semester, warning that if more assignments continue to be added it starts to influence teaching and academic freedom. Perhaps some of these faculty will approve of creating their own signature assignment(s) once a signature assignment has been developed and vetted for a particular course.

o Maybe use a word other than “assignment,” like “guidelines”?

Faculty communication may also be an issue. If faculty have not been directly informed about issues with the common assignments and the rubrics, why would they revise them?

o Also, a faculty member requested that a timeline be provided to all faculty, not just evaluators, so that when it comes time to revise an assignment (or participate in some other aspect of the process) they will be informed about it.

Could we provide an online location for students to access general education information?


Appendix B. Focus Group Presentation


OUTCOMES RESULTS

FOCUS GROUPS

General Education Outcomes Assessment, 2016-2017


• Common assignments implemented via Bb

• Student work collected via Bb Outcomes

Process


• Communication: Written, Oral, and Visual

• Personal Responsibility

• Social Responsibility

Outcomes Evaluated


• 6th year of LEAP-based assessment

• 79 courses (80 last year)

• 1,785 sections (1,677)

• 44,688 enrollments (42,275)

• 52,973 documents collected (34,765)

Scope of this Project

52% increase!
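As a quick arithmetic check (not part of the original report), the year-over-year changes implied by the figures on this slide can be verified in a few lines; the document count is the source of the “52% increase” called out above.

```python
# Scope figures from the slide: (fall 2016, prior year).
scope = {
    "sections":    (1785, 1677),
    "enrollments": (44688, 42275),
    "documents":   (52973, 34765),
}

for name, (current, prior) in scope.items():
    change = (current - prior) / prior
    print(f"{name}: {change:+.1%}")
# The documents figure works out to about +52.4%, consistent with
# the "52% increase!" noted on the slide.
```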


Correct Alignment Counts, Three-Year Summary

Campus            F14 Aligned Properly   F15 Aligned Properly   F16 Aligned Properly
Central           454 (72%)              483 (82%)              580 (90%)
North             272 (73%)              299 (75%)              368 (84%)
South             407 (59%)              443 (71%)              537 (75%)
District Totals   1,133 (67%)            1,225 (76%)            1,485 (83%)

Alignments Are Improving


• Student work sampled

• Evaluators completed calibration sessions

• Evaluators assessed student work

Process


• 52,973 documents collected (34,765)

• 1,120 documents sampled (2,039)

• 104 evaluators (172)

Scope of this Project


• Danielle Bible

• Paul Bounds

• Barbara Brown

• Michael Brown

• Joseph Brysch

• Chad Clark

• Steven Davidson

• Kim DeLauro

• Madhu Gyawali

• Karyn Jones

• Karen Hattaway

• Lesley Kauffman

• Brittany Moore

• Tina Mougouris

• Carla Ruffins

• Joseph Stromberg

• Chris Wild

• Monica Yancey

Thank You to our Evaluators

AND Clean-Up Crew Members!


• Amy Austin

• Tim Bell

• Anna Bennett

• Pam Betts

• Mary Ann Blake

• Stephen Bonnette

• Liana Boop

• Regan Boudra

• Greg Brutsche

• Cristina Cardenas

• Angelina Cavallo

• Sarah Chaudhary

• Debra Clarke

• Laura Cole

• Tonja Conerly

• Christina Crawford

• Chris Duke

• Woody Dunseith

• Susan Eason

• Dawn Eaton

• Art Fitzgerald

• Sharon George

• Shari Goldstein

• Sharada Gollapudi

• Catherine Gragg

• Julie Groesch

• Barbara Guillory

• Kevin Hale

• Guillermo Hernandez

• Kerri Hines

• Merrily Hoffman

• Kevin Holden

• Yuri Horner

• Carrie Hughes

• Veronica Jammer

• Bennie Jenkins

• Yuli Kainer

• Carrol LaRowe

• Eric Late

• Jeanette Liberty

• Stephen Lopez

• Aaron Love

Thank you, Evaluators!


• Pamela Maack

• Tanya Madrigal

• Allyson Marceau

• Shelley McCaul

• Rebecca McDonald

• Paul McRee

• Bobby Mixon

• Kevin Morris

• Bryan Moss

• Cheryl Mott

• Ernie Murray

• Mark Myers

• Bret Nelson

• Ryan Newman

• Lambrini Nicopoulos

• Farran Norris-Sands

• Alexander Okwonna

• Joy Pasini

• Elida Petkovich

• Louis Pitre

• Casey Prince

• Sherilyn Reynolds

• Robyn Ring

• Katherine Ryan

• Patricia Sayles

• Terri Seiver

• Michelle Selk

• Kelly Simons

• Sharon Sledge

• Beverly Smith

• Anton Solovyov

• Karen Springer

• Susan Starr

• Patricia Steinke

• Kristen Taylor

• Cayman Tirado

• Marcus Turner

• Dave Turnquist

• Michael Unger

• Roger Watkins

• Alan Wiederhold

• Mary Wisgirda

Thank you, Evaluators!


• Results aggregated

• What’s next?

Process


Success vs. Not Success

             Criterion 1   Criterion 2   Criterion 3   Criterion 4   Criterion 5
Evaluator 1       2             2             3             2             2
Evaluator 2       2             2             2             1             2
Evaluator 3       2             3             2             1             3
Result        Success       Success       Success    Not Success      Success

             Criterion 1   Criterion 2   Criterion 3   Criterion 4   Criterion 5
Evaluator 1       3             2             3             1             3
Evaluator 2       2             1             3             2             3
Evaluator 3       3             2             3             3             3
Result        Success       Success       Success       Success       Success

GOAL: 75% Success
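The decision rule illustrated above (a student attains the outcome only if, on every criterion, at least 2 of the 3 evaluators award Level 2 or better) can be sketched as a short script. The two score matrices below are the hypothetical examples from this slide, not real student data.

```python
def criterion_passes(scores, level=2, agreement=2):
    """True if at least `agreement` evaluators scored this criterion at `level` or better."""
    return sum(s >= level for s in scores) >= agreement

def student_successful(ratings):
    """ratings: one list of criterion scores per evaluator (rows = evaluators)."""
    per_criterion = zip(*ratings)  # transpose to per-criterion score tuples
    return all(criterion_passes(scores) for scores in per_criterion)

# First example from the slide: Criterion 4 (scores 2, 1, 1) fails.
doc_a = [[2, 2, 3, 2, 2],
         [2, 2, 2, 1, 2],
         [2, 3, 2, 1, 3]]

# Second example: every criterion passes, including Criterion 4 (1, 2, 3).
doc_b = [[3, 2, 3, 1, 3],
         [2, 1, 3, 2, 3],
         [3, 2, 3, 3, 3]]

print(student_successful(doc_a))  # False
print(student_successful(doc_b))  # True
```

For oral communication, where a single evaluator assessed each presentation live, the same rule reduces to requiring Level 2 or better from that one evaluator on every criterion.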


• Samples include students across the core regardless of credit hours completed; results may be generalized to the entire student population and not to specific subgroups of students, e.g. graduates.
• Good news on the horizon for Fall 2017.
• Inter-rater reliability does not yet meet institutional standards (a measure of institutional effectiveness).
• Attrition rate in the designed samples increases standard error and limits the generalizability.

Limitations


• What do the results mean? What is our interpretation of the results?
• What do we need to do, as an institution, to improve future results?

Focus Groups—Two Questions


• Visual, Oral, and Written

Communication

• Personal Responsibility

• Social Responsibility

Outcomes Evaluated


Communication, Visual


• How do we increase or scale up teaching and assessment of visual communication within the core?

Question – Visual Communication


Oral Communication

Successful: 60.1% (675)
Not Successful: 39.9% (449)

Error Rate: +/- 2.88%
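The report does not state how the ±2.88% error rate was computed. One common choice, assumed here, is the normal-approximation margin of error for a sample proportion at 95% confidence, which lands very close to the reported figure for 675 successes out of the 1,124 presentations assessed:

```python
import math

def margin_of_error(successes, n, z=1.96):
    """95% normal-approximation margin of error for a sample proportion."""
    p = successes / n
    return z * math.sqrt(p * (1 - p) / n)

n = 449 + 675  # not successful + successful, oral communication
print(f"p = {675 / n:.1%}, margin of error = +/-{margin_of_error(675, n):.2%}")
# Yields roughly +/-2.86%, close to the +/-2.88% reported on the slide
# (the small gap suggests a slightly different formula or rounding).
```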


Performance by Criterion, All Students, Oral Communication

                  Organization   Content       Delivery
Successful        61.3% (690)    61.9% (730)   61.9% (730)
Not Successful    38.7% (435)    38.1% (449)   38.1% (449)


Performance by Criterion, Students Not Successful Overall, Oral Communication

                  Organization   Content       Delivery
Not Successful    96.9% (435)    96.7% (434)   97.6% (438)
Successful        3.1% (14)      3.3% (15)     2.4% (11)


• Students are either successful on all of the criteria or successful on none of them.
• The criteria are not discerning success independently; shouldn’t there be more instances of the following?
  • Success on Organization but NOT Content or Delivery
  • Success on Content but NOT Organization or Delivery
  • Success on Delivery but NOT Organization or Content
• How do we get there? What needs to happen?

Observation & Question—

Oral Communication


• Anecdotally, faculty are cross-walking from individual grading rubrics to the general education rubric, after the fact.
• Could that be contributing to the lack of discernment between the criteria?

Observation & Question—

Oral Communication


• Anecdotally, faculty are cross-walking from individual grading rubrics to the general education rubric, after the fact.
• Could we leverage that to develop the institutional interpretation for oral communication? Are faculty willing to analyze and describe their cross-walk?

Observation & Question—

Oral Communication


• How do we increase or scale up teaching and assessment of oral and visual communication within the core?

Question –

Oral & Visual Communication


Communication, Written


• Sections: 1,785

• Enrollment: 44,688

• Documents collected: 26,661

• Sampled: 375

• Faculty evaluators: 51

• Usable documents: 276

• Attrition rate: 26.4%

Written Communication: By the Numbers…


• What do the results mean? What is our interpretation of the results?
• What do we need to do, as an institution, to improve future results?

So Once Again…


Personal Responsibility


• Sections: 948

• Enrollment: 22,624

• Documents collected: 14,175

• Sampled: 381

• Faculty evaluators: 37

• Usable documents: 300

• Attrition rate: 21.3%

Personal Responsibility: By the Numbers…


In 25% of instances in which a student was not successful, one or more evaluators indicated a disconnect between the rubric and the assignment.

That potentially changes the success rate by 15%.


• Ensure alignments are appropriate.

• Align the common assignment with the rubric.

• Align the assignment with the appropriate rubric.

Evaluator, Usability Comments


• Assignment does not take into consideration the revised Personal Responsibility rubric.
• Needs to be redesigned to account for changes to the rubric.

Evaluator, Usability Comments


• The criteria for the common assignment do not specifically address Reflection & Self-Assessment, nor do they assess Solving Problems. Those two criteria should have been stated more explicitly within the common assignment’s instructions.
• The assignment does not appear to expect students, in any manner, to perform in a way expected by *at least* Reflection and Self-Assessment or Solving Problems.

Evaluator, Usability Comments


• How do we as an institution reliably address the assignment-to-rubric alignment issue? How do we get the assignments aligned to the rubric?

Recommendations?


• How do we as an institution reliably address the assignment-to-rubric alignment issue? How do we get the assignments aligned to the rubric?

Recommendations?


• What do the results mean? What is our interpretation of the results?
• What do we need to do, as an institution, to improve future results?

Focus Groups—Personal Responsibility


Social Responsibility


• Sections: 765

• Enrollment: 20,377

• Documents collected: 12,137

• Sampled: 364

• Faculty evaluators: 33

• Usable documents: 294

• Attrition rate: 19.2%

Social Responsibility: By the Numbers…
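The attrition rates quoted in the three “By the Numbers” slides are consistent with (sampled - usable) / sampled; a quick verification sketch using the slide figures:

```python
# (sampled, usable) document counts from the three "By the Numbers" slides.
segments = {
    "Written Communication":   (375, 276),
    "Personal Responsibility": (381, 300),
    "Social Responsibility":   (364, 294),
}

for name, (sampled, usable) in segments.items():
    attrition = (sampled - usable) / sampled
    print(f"{name}: attrition {attrition:.1%}")
# 26.4%, 21.3%, and 19.2% respectively, matching the reported rates.
```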


In 33% of instances in which a student was not successful, one or more evaluators indicated a disconnect between the rubric and the assignment.

That potentially changes the success rate by 18-20%.


• The parameters for the assignment are too open-ended . . . There need to be more specific requirements that intentionally require the student to perform in a way that the social responsibility rubric requires.
• The student sample I evaluated did an incredible job of addressing the assignment, and it would perform very well against the written communication and critical thinking rubrics. However, there’s just nothing in the assignment that requires a student to “explain and connect two or more cultures” in any manner (Cultural Diversity).

Evaluator, Usability Comments


• How do we as an institution reliably address the assignment-to-rubric alignment issue? How do we get the assignments aligned to the rubric?

Recommendations?


• How do we as an institution reliably address the assignment-to-rubric alignment issue? How do we get the assignments aligned to the rubric?

Recommendations?


• What do the results mean? What is our interpretation of the results?
• What do we need to do, as an institution, to improve future results?

Focus Groups—Social Responsibility


Appendix C. Disaggregated Results

Written Communication (with Visual Communication)


Oral Communication


Personal Responsibility


Social Responsibility


Appendix D. Indirect Assessment Data


Community College Survey of Student Engagement (CCSSE), 2013 and 2015

Written Communication, Oral Communication, Personal Responsibility, Social Responsibility


Appendix E. Rubrics and Institutional Interpretations


Written Communication Institutional Interpretation

As of February 15, 2017

Definitions & Rubric
• Communication Skills (SJC): “Students will communicate ideas, express feelings, and support conclusions effectively in written, oral, and visual formats.”
• Communication Skills (THECB): “to include effective development, interpretation, and expression of ideas through written, oral, and visual communication.”

General Issues & Discussion
• Setting a standard for our graduates and core complete students. The question addressed through this assessment process is, “Are we, as an institution, effective in educating our students to attain the general education outcomes?” That question focuses on students who graduate (with an AA, AS, or AAT) or complete the core. As we assess, we should not make allowances (e.g. assess more easily) for students we believe may be earlier in their college career; we acknowledge that the technology does not yet support ideal sampling methods (work sampled only from recent graduates), and we consider that limitation when analyzing the results. Our standard for and expectations of our graduates and core complete students should be applied consistently.

• Grading Perspective. The levels of the rubric do not correspond to a particular grade (e.g. Level 4 does not equate to an “A”) or to a particular level of academic experience (freshman, sophomore, etc.). More importantly, we are not assessing student work for the purpose of assigning a grade; the purpose is to identify the student’s relative level of written communication in order to evaluate how well San Jacinto College is helping students attain that skill. Thus, if a student only performs at Level 1, even though they are addressing the assignment extremely well, it is still Level 1 performance.

• Frame analytic (criterion-by-criterion) assessment within a holistic assessment framework. Once all criteria have been evaluated, consider the student’s performance more holistically. Is the student’s work representative of work you would be confident labeling “successful written communication” by a San Jacinto College graduate or core complete student? Alternatively, is the student’s work representative of work you believe should *not* be labeled “successful written communication” by a San Jacinto College graduate or core complete student? Use your answers to those questions to reconsider your criterion-by-criterion assessment.

• Generalize behaviors (reading holistically). It is valid to treat the examples given within the criteria or within the institutional interpretation as a general description of that level of written communication rather than the only valid examples of that skill level. The work presented by the student may represent a comparable level of skill. In other words, given another opportunity to complete the assignment, might a student performing at this level exhibit the specific examples offered by the rubric or institutional interpretation?

• Accuracy of content-specific knowledge. The written communication rubric does not assess content-specific knowledge. It is possible for a student to complete an assignment based on an incorrect knowledge schema while still demonstrating adequate written communication. Certainly, very specific content errors that do not affect the meaning of the document should not unduly lower the assessment on the rubric.

Page 103: General Education Outcomes Assessment 2016-2017: Written ...stufiles.sanjac.edu/GeneralEducation/2017-2018-GenEd/other_docs/Report General... · 2016-2017: Written Communication,

Institutional Interpretation of Criteria

Usability of Sample

• See separate document “Determining Usability of Sample”

Context of and Purpose for Writing

• Successful students exhibit awareness of audience and purpose for writing beyond the scope and purposes of the course; they consider an audience other than self or the instructor. In addition, successful students are able to place an issue or topic within a clear context.

• Unsuccessful students write with little attention given to context, audience, or purpose.

• This criterion includes regulating tone, e.g. writing an argument in favor of gun control should attend to concerns of gun advocates. Successful students are able to weigh all relevant sides and come to an informed conclusion.

Content Development

• Successful students use relevant content (examples, evidence, etc.) to develop and explore ideas through most of their work. Unsuccessful students may have well-developed, simple ideas in only one area, section, or paragraph, or may not develop any ideas particularly well.

• Successful content development (Level 2) requires the student to include multiple main supporting ideas that are developed with examples and details. Ineffective content development may include only a single main supporting idea developed with examples and details (Level 1), or it may rely heavily on declarative statements with circular or declarative supporting statements (Level 0).

Genre and Disciplinary Conventions

• Successful students organize, present, and write in a way that is consistent with basic, common practices of the discipline. Unsuccessful students may organize and present information in a generically acceptable manner but do not attend to specific disciplinary expectations, or may not organize their writing in any meaningful way.

• Assessment of citation mechanics and style specific to a discipline or genre is appropriate within this criterion, e.g. formatting, presentation, organization, and layout.

• A student that adheres to fundamental expectations of the genre or discipline, perhaps by meeting and fulfilling the requirements of the assignment, is performing at least at a Level 2. A student that excludes expected key elements or features performs at a lower level (Level 1 or 0), e.g. organizational strategies for an essay or research paper do not follow convention; sections of a lab report are omitted or are completed with very low proficiency.

Sources and Evidence

• Successful students attempt to use credible, relevant sources to support their ideas in a discipline-appropriate manner. Unsuccessful students may not use sources or may use sources without considering their credibility or relevance to the task.

• This criterion considers the use of evidence and sources within the paper: how are they integrated into the student’s writing? Some in-text work must be evident in order to receive a Level 2; this means that students are able to place referenced material into the context of their own work. The criterion also considers the quality and relevance of sources given the context of the subject matter. It does not consider the mechanics of citing sources; that is considered within the Genre and Disciplinary Conventions criterion.

• To receive a Level 2, students must support their claims with appropriate referenced material within the text of their work. Inadequate or very little use of source materials and/or the use of sources with questionable credibility is indicative of a Level 1.

• In more quantitatively or empirically centered disciplines, using, integrating, or referencing quantitative or empirical results in the written sample may be considered the use of evidence or sources in lieu of primary or secondary sources, e.g. writing in Math should reference completed calculations.

Control of Syntax and Mechanics

• Successful students use clear language that readers are able to understand easily, even with some errors. Unsuccessful students write with language that may be difficult to understand because of a lack of clarity or errors in usage, or they simply may not be able to write effectively enough to convey meaning.

• Occasionally having to read and re-read or to reinterpret a student’s writing (due to issues such as grammar, sentence structure, or typos) to make meaning of it is indicative of a Level 1. Having to do so more frequently, or being unable to determine the meaning of a passage, suggests a Level 0 assessment. Some errors that do not interfere with meaning (e.g. omission of a comma) may still be considered successful at a Level 2.

Readability of Visual Elements

• Successful students include visual elements that may be reliably interpreted by different readers. Unsuccessful students may not use visual elements at all, or the visual elements are difficult to read or to interpret.

• Students that do not include a visual element should be assigned a Level 0 on this criterion. This criterion SHOULD NOT be considered when determining Usability of Sample.

• Outlining or formatting of text is not currently considered to be a “Visual Element.”

Relevance, Accuracy, and Integration of Visual Elements

• Successful students include visual elements that are intentionally integrated into the message; the student uses the visual element to enhance their communication in some manner, and the visual element does not appear to include unintended distortions or bias. Unsuccessful students may only use a visual element decoratively in a manner that does not add specific value to the communication, or the element may include misleading or distorted information.

• Students that do not include a visual element should be assigned a Level 0 on this criterion. This criterion SHOULD NOT be considered when determining Usability of Sample.

WRITTEN COMMUNICATION VALUE RUBRIC

The VALUE rubrics were developed by teams of faculty experts representing colleges and universities across the United States through a process that examined many existing campus rubrics and related documents for each learning outcome and incorporated additional feedback from faculty. The rubrics articulate fundamental criteria for each learning outcome, with performance descriptors demonstrating progressively more sophisticated levels of attainment. The rubrics are intended for institutional-level use in evaluating and discussing student learning, not for grading. The core expectations articulated in all 15 of the VALUE rubrics can and should be translated into the language of individual campuses, disciplines, and even courses. The utility of the VALUE rubrics is to position learning at all undergraduate levels within a basic framework of expectations such that evidence of learning can be shared nationally through a common dialog and understanding of student success.

Definition: Written communication is the development and expression of ideas in writing. Written communication involves learning to work in many genres and styles. It can involve working with many different writing technologies, and mixing texts, data, and images. Written communication abilities develop through iterative experiences across the curriculum.

Framing Language: This writing rubric is designed for use in a wide variety of educational institutions. The clearest finding to emerge from decades of research on writing assessment is that the best writing assessments are locally determined and sensitive to local context and mission. Users of this rubric should, in the end, consider making adaptations and additions that clearly link the language of the rubric to individual campus contexts. This rubric focuses assessment on how specific written work samples or collections of work respond to specific contexts. The central question guiding the rubric is "How well does writing respond to the needs of audience(s) for the work?" In focusing on this question, the rubric does not attend to other aspects of writing that are equally important: issues of writing process, writing strategies, writers' fluency with different modes of textual production or publication, or writers' growing engagement with writing and disciplinarity through the process of writing.

Evaluators using this rubric must have information about the assignments or purposes for writing guiding writers' work. Also recommended is including reflective work samples or collections of work that address such questions as: What decisions did the writer make about audience, purpose, and genre as s/he compiled the work in the portfolio? How are those choices evident in the writing -- in the content, organization and structure, reasoning, evidence, mechanical and surface conventions, and citational systems used in the writing? This will enable evaluators to have a clear sense of how writers understand the assignments and to take that understanding into consideration as they evaluate.

The first section of this rubric addresses the context and purpose for writing. A work sample or collection of work can convey the context and purpose for the writing tasks it showcases by including the writing assignments associated with work samples. But writers may also convey the context and purpose for their writing within the texts. It is important for faculty and institutions to include directions for students about how they should represent their writing contexts and purposes.

Faculty interested in the research on writing assessment that has guided our work here can consult the National Council of Teachers of English/Council of Writing Program Administrators' White Paper on Writing Assessment (2008; www.wpacouncil.org/whitepaper) and the Conference on College Composition and Communication's Writing Assessment: A Position Statement (2008; www.ncte.org/cccc/resources/positions/123784.htm).

Glossary

• Content Development: The ways in which the text explores and represents its topic in relation to its audience and purpose.

• Context of and purpose for writing: The context of writing is the situation surrounding a text: who is reading it? who is writing it? Under what circumstances will the text be shared or circulated? What social or political factors might affect how the text is composed or interpreted? The purpose for writing is the writer's intended effect on an audience. Writers might want to persuade or inform; they might want to report or summarize information; they might want to work through complexity or confusion.

• Disciplinary conventions: Formal and informal rules that constitute what is seen generally as appropriate within different academic fields, e.g. introductory strategies, use of passive voice or first person point of view, expectations for thesis or hypothesis, expectations for kinds of evidence and support that are appropriate to the task at hand, use of primary and secondary sources to provide evidence and support arguments and to document critical perspectives on the topic. Through increasingly sophisticated use of sources, writers develop an ability to differentiate between their own ideas and the ideas of others, credit and build upon work already accomplished in the field or issue they are addressing, and provide meaningful examples to readers.

• Evidence: Source material that is used to extend, in purposeful ways, writers' ideas in a text.

• Genre conventions: Formal and informal rules for particular kinds of texts and/or media that guide formatting, organization, and stylistic choices, e.g. lab reports, academic papers, poetry, webpages, or personal essays.

• Sources: Texts (written, oral, behavioral, visual, or other) that writers draw on as they work for a variety of purposes -- to extend, argue with, develop, define, or shape their ideas, for example.

Written Communication VALUE Rubric

Definition: Written communication is the development and expression of ideas in writing. Written communication involves learning to work in many genres and styles. It can involve working with many different writing technologies, and mixing texts, data, and images. Written communication abilities develop through iterative experiences across the curriculum. Evaluators are encouraged to assign a zero to any work sample or collection of work that does not meet the Level 1 performance.

Criteria and Performance Levels (Levels 4, 3, 2, 1, 0)

Usability of Sample (Levels for this criterion are separate and distinct categories, not a scale.)

• Level 4: No issues encountered.
• Level 3: There was a disconnect between the rubric and the assignment, e.g. the student may or may not have performed well, but the assignment did not ask the student to perform in a manner expected by the rubric.
• Level 2: The student responded in a manner that interfered with reliable assessment of this outcome, e.g. the student wrote much less than the assignment required; or, poor writing skills interfere with assessing a non-writing outcome.
• Level 1: The assignment was not the common assignment expected for the course, e.g. an entirely different assignment, an excessively modified common assignment, mistaken alignment, or presented as not part of the regular course grade.
• Level 0: The document was not accessible or not assessable: wrong file format, unable to open the file, illegible/unreadable, unexpected teamwork, or instance of plagiarism.

Context of and Purpose for Writing (Includes considerations of audience, purpose, and the circumstances surrounding the writing task(s).)

• Level 4: Demonstrates a thorough understanding of context, audience, and purpose that is responsive to the assigned task(s) and focuses all elements of the work.
• Level 3: Demonstrates adequate consideration of context, audience, and purpose and a clear focus on the assigned task(s) (e.g., the task aligns with audience, purpose, and context).
• Level 2: Demonstrates awareness of context, audience, purpose, and the assigned task(s) (e.g., begins to show awareness of audience's perceptions and assumptions).
• Level 1: Demonstrates minimal attention to context, audience, purpose, and the assigned task(s) (e.g., expectation of instructor or self as audience).
• Level 0: Does not meet “Level 1” standards.

Content Development

• Level 4: Uses appropriate, relevant, and compelling content to illustrate mastery of the subject, conveying the writer's understanding, and shaping the whole work.
• Level 3: Uses appropriate, relevant, and compelling content to explore ideas within the context of the discipline and shape the whole work.
• Level 2: Uses appropriate and relevant content to develop and explore ideas through most of the work.
• Level 1: Uses appropriate and relevant content to develop simple ideas in some parts of the work.
• Level 0: Does not meet “Level 1” standards.

Genre and Disciplinary Conventions (Formal and informal rules inherent in the expectations for writing in particular forms and/or academic fields; please see glossary.)

• Level 4: Demonstrates detailed attention to and successful execution of a wide range of conventions particular to a specific discipline and/or writing task(s), including organization, content, presentation, formatting, and stylistic choices.
• Level 3: Demonstrates consistent use of important conventions particular to a specific discipline and/or writing task(s), including organization, content, presentation, and stylistic choices.
• Level 2: Follows expectations appropriate to a specific discipline and/or writing task(s) for basic organization, content, and presentation.
• Level 1: Attempts to use a consistent system for basic organization and presentation.
• Level 0: Does not meet “Level 1” standards.

Sources and Evidence

• Level 4: Demonstrates skillful use of high-quality, credible, relevant sources to develop ideas that are appropriate for the discipline and genre of the writing.
• Level 3: Demonstrates consistent use of credible, relevant sources to support ideas that are situated within the discipline and genre of the writing.
• Level 2: Demonstrates an attempt to use credible and/or relevant sources to support ideas that are appropriate for the discipline and genre of the writing.
• Level 1: Demonstrates an attempt to use sources to support ideas in the writing.
• Level 0: Does not meet “Level 1” standards.

Control of Syntax and Mechanics

• Level 4: Uses graceful language that skillfully communicates meaning to readers with clarity and fluency, and is virtually error-free.
• Level 3: Uses straightforward language that generally conveys meaning to readers. The language in the portfolio has few errors.
• Level 2: Uses language that generally conveys meaning to readers with clarity, although writing may include some errors.
• Level 1: Uses language that sometimes impedes meaning because of errors in usage.
• Level 0: Does not meet “Level 1” standards.

Readability of Visual Elements

• Level 4: Visual characteristics are strikingly effective, intentionally designed to enhance readability and interpretation. May clearly address accessibility concerns (e.g., multiple modes are used to communicate message: color and shape).
• Level 3: Visual characteristics are clearly readable and can be easily and consistently interpreted; design elements do not interfere in any manner with interpretation.
• Level 2: Visual characteristics allow visual elements to be read and interpreted consistently by most readers despite needed improvements.
• Level 1: Visual characteristics (e.g. color, contrast, font style/size, or layout and use of space) make visual elements difficult to read or interpret.
• Level 0: Visual elements NOT included or do not meet “Level 1” standards.

Relevance, Accuracy, and Integration of Visual Elements

• Level 4: Is clearly relevant to and appropriate for subject matter. Integrated well into, augments understanding of, or strengthens the message. Any distorted representations of information are clearly intentional and used for persuasive purposes.
• Level 3: Is clearly relevant to and appropriate for subject matter. Reinforces and is integrated well into the message. Absent of unintentional bias; any distorted representations of information may be used intentionally though the purpose could be clearer.
• Level 2: Is clearly relevant to and appropriate for the subject matter. Reinforces but may be more intentionally integrated into the message. May contain subtle bias or distorted representations of information that appear to be unintentional.
• Level 1: Is decorative only or is not appropriate for the subject matter. Does not add meaning, understanding, or value to the message. May contain clearly misleading, biased, or distorted representations of information that appear to be unintentional.
• Level 0: Visual elements NOT included or do not meet “Level 1” standards.

Adapted with permission from Assessing Outcomes and Improving Achievement: Tips and tools for Using Rubrics, edited by Terrel L. Rhodes. Copyright 2010 by the Association of American Colleges and Universities.

Oral Communication Institutional Interpretation

As of January 8, 2016

Definitions & Rubric

• Communication Skills (SJC), “Students will communicate ideas, express feelings, and support conclusions effectively in written, oral, and visual formats.”

• Communication Skills (THECB), “to include effective development, interpretation, and expression of ideas through written, oral, and visual communication.”

General Issues & Discussion

• Setting a standard for our graduates and core complete students. The question addressed through this assessment process is, “Are we, as an institution, effective in educating our students to attain the general education outcomes?” That question focuses on students that graduate (with an AA, AS, or AAT) or complete the core. As we assess, we should not make allowances (e.g. assess more easily) for students we believe may be earlier in their college career; we acknowledge that the technology does not yet support ideal sampling methods (work sampled only from recent graduates), and we consider that limitation when analyzing the results. Our standard for and expectations of our graduates and core complete students should be applied consistently.

• Grading Perspective. The levels of the rubric do not correspond to a particular grade (e.g. Level 4 does not equate to an “A”) or to a particular level of academic experience (freshman, sophomore, etc.). More importantly, we are not assessing student work for the purpose of assigning a grade; the purpose is to identify the relative level of oral communication in order to evaluate how well San Jacinto College is helping students attain that skill. Thus, if a student performs at only a Level 1, even while addressing the assignment extremely well, it is still a Level 1 performance.

• Frame analytic (criterion-by-criterion) assessment within a holistic assessment framework. Once all criteria have been evaluated, consider the student’s performance more holistically. Is the student’s work representative of work you would be confident labeling “successful oral communication” by a San Jacinto College graduate or core complete student? Alternatively, is the student’s work representative of work you believe should *not* be labeled as “successful oral communication” by a San Jacinto College graduate or core complete student? Use your answers to those questions to inform your criterion-by-criterion assessment.

• Generalize behaviors (assessing holistically). It is valid to consider the examples given within the criteria or within the institutional interpretation to be a general description of that level of oral communication rather than the only valid examples of that skill level. The work presented by the student may represent a comparable level of skill. In other words, would a student performing at this level, given another opportunity to complete the assignment, plausibly exhibit the specific examples offered by the rubric or institutional interpretation?

• Explicit Examples. This rubric seeks to observe explicit examples of behaviors relevant to oral communication skills. For that to happen, we rely on students to provide those examples. If a student focuses entirely on the end product, only provides declarative statements without examples, or does not answer a question, a lower assessment on the rubric may be appropriate: a “1” or a “0” on the rubric.

Institutional Interpretation of Criteria

Organization

• Successful students at least have an organizational pattern intermittently observable throughout the presentation/discussion. Unsuccessful students have a minimally observable or no organizational pattern.

Content

• Successful students appropriately cite content that at least partially supports the presentation/discussion or establishes their credibility on the topic. Unsuccessful students do not adequately cite content or use content that makes minimal contributions to the presentation/discussion.

Delivery

• Successful students may be tentative but use delivery techniques that make the presentation/discussion understandable by the audience. Unsuccessful students appear uncomfortable and have delivery techniques that interfere with the presentation/discussion.

Readability of Visual Elements

Relevance, Accuracy, and Integration of Visual Elements

ORAL COMMUNICATION VALUE RUBRIC

For more information, please contact [email protected]

The VALUE rubrics were developed by teams of faculty experts representing colleges and universities across the United States through a process that examined many existing campus rubrics and related documents for each learning outcome and incorporated additional feedback from faculty. This version of the Oral Communication rubric was adapted with modifications by faculty at San Jacinto College. The type of oral communication most likely to be included in a collection of student work is an oral presentation or discussion and is therefore the focus for the application of this rubric.

Definition: Oral communication is a purposeful presentation or discussion designed to increase knowledge, to foster understanding, or to promote change in the listeners' attitudes, values, beliefs, or behaviors.

Framing Language: Oral communication takes many forms. This rubric is specifically designed to evaluate an oral presentation by, or participation in a discussion by, a single speaker and is best applied to live or video-recorded presentations or discussions. For panel presentations, group presentations, or discussions, it is recommended that each speaker be evaluated separately. This rubric best applies to presentations or discussions of sufficient length that a central message is conveyed, supported by one or more forms of content, and structured with a purposeful organizational pattern. An oral answer to a single question not designed to be structured into a presentation or discussion does not readily apply to this rubric.

Glossary: The definitions that follow were developed to clarify terms and concepts used in this rubric only.

• Delivery techniques: Posture, gestures, eye contact, the use of the voice, and appropriate attire. Delivery techniques enhance the effectiveness of the presentation when the speaker stands and moves with authority, looks more often at the audience than at his/her speaking materials/notes, uses the voice expressively, and uses few vocal fillers ("um," "uh," "like," "you know," etc.).

• Organization: The grouping and sequencing of ideas and content in a presentation or discussion. An organizational pattern that enhances the effectiveness of the presentation reflects a purposeful choice among possible alternatives, such as a chronological pattern, a problem-solution pattern, an analysis-of-parts pattern, etc., that makes the content of the presentation or discussion easier to follow and more likely to accomplish its purpose.

• Content: Explanations, examples, illustrations, statistics, analogies, quotations from relevant authorities with appropriate citations, and other kinds of information or analysis that support the principal ideas of the presentation or discussion. Content is generally credible when it is relevant and is derived and cited from reliable and appropriate sources. Content is highly credible when it is also vivid and varied across the types listed above (e.g., a mix of examples, statistics, and references to authorities). Content may also serve the purpose of establishing the speaker's credibility. For example, in presenting a creative work such as a dramatic reading of Shakespeare, supporting evidence may not advance the ideas of Shakespeare, but rather serve to establish the speaker as a credible Shakespearean actor.

Oral Communication VALUE Rubric (adapted by SJC)

Definition: Oral communication is a purposeful presentation or discussion designed to increase knowledge, to foster understanding, or to promote change in the listeners' attitudes, values, beliefs, or behaviors. Evaluators are encouraged to assign a zero to any work sample or collection of work that does not meet benchmark (cell one) level performance.

Criteria and Performance Levels (Levels 4, 3, 2, 1, 0)

Organization

• Level 4: Organizational pattern is clearly and consistently observable, is skillful, and makes the content of the presentation/discussion cohesive.
• Level 3: Organizational pattern is clearly and consistently observable within the presentation/discussion.
• Level 2: Organizational pattern is intermittently observable within the presentation/discussion.
• Level 1: Organizational pattern is minimally observable within the presentation/discussion.
• Level 0: Does not meet “Level 1” standards.

Content

• Level 4: A variety of types of appropriately cited content (explanations, examples, illustrations, statistics, analogies, quotations from relevant authorities) makes appropriate reference to information or analysis that significantly supports the presentation/discussion or establishes the presenter's credibility/authority on the topic.
• Level 3: Appropriately cited content (explanations, examples, illustrations, statistics, analogies, quotations from relevant authorities) makes appropriate reference to information or analysis that generally supports the presentation/discussion or establishes the presenter's credibility/authority on the topic.
• Level 2: Appropriately cited content (explanations, examples, illustrations, statistics, analogies, quotations from relevant authorities) makes appropriate reference to information or analysis that partially supports the presentation/discussion or establishes the presenter's credibility/authority on the topic.
• Level 1: Insufficient and/or inadequately cited content (explanations, examples, illustrations, statistics, analogies, quotations from relevant authorities) makes reference to information or analysis that minimally supports the presentation/discussion or establishes the presenter's credibility/authority on the topic.
• Level 0: Does not meet “Level 1” standards.

Delivery

• Level 4: Delivery techniques (posture, gesture, eye contact, vocal expressiveness, and appropriate attire) make the presentation/discussion compelling, and speaker appears polished and confident.
• Level 3: Delivery techniques (posture, gesture, eye contact, vocal expressiveness, and appropriate attire) make the presentation/discussion interesting, and speaker appears comfortable.
• Level 2: Delivery techniques (posture, gesture, eye contact, vocal expressiveness, and appropriate attire) make the presentation/discussion understandable, and speaker appears tentative.
• Level 1: Delivery techniques (posture, gesture, eye contact, vocal expressiveness, and appropriate attire) detract from the understandability of the presentation/discussion, and speaker appears uncomfortable.
• Level 0: Does not meet “Level 1” standards.

Readability of Visual Elements

• Level 4: Visual characteristics are strikingly effective, intentionally designed to enhance readability and interpretation. May clearly address accessibility concerns (e.g., multiple modes are used to communicate message: color and shape).
• Level 3: Visual characteristics are clearly readable and can be easily and consistently interpreted; design elements do not interfere in any manner with interpretation.
• Level 2: Visual characteristics allow visual elements to be read and interpreted consistently by most readers despite needed improvements.
• Level 1: Visual characteristics (e.g. color, contrast, font style/size, or layout and use of space) make visual elements difficult to read or interpret.
• Level 0: Visual elements NOT included or do not meet “Level 1” standards.

Relevance, Accuracy, and Integration of Visual Elements

• Level 4: Is clearly relevant to and appropriate for subject matter. Integrated well into, augments understanding of, or strengthens the message. Any distorted representations of information are clearly intentional and used for persuasive purposes.
• Level 3: Is clearly relevant to and appropriate for subject matter. Reinforces and is integrated well into the message. Absent of unintentional bias; any distorted representations of information may be used intentionally though the purpose could be clearer.
• Level 2: Is clearly relevant to and appropriate for the subject matter. Reinforces but may be more intentionally integrated into the message. May contain subtle bias or distorted representations of information that appear to be unintentional.
• Level 1: Is decorative only or is not appropriate for the subject matter. Does not add meaning, understanding, or value to the message. May contain clearly misleading, biased, or distorted representations of information that appear to be unintentional.
• Level 0: Visual elements NOT included or do not meet “Level 1” standards.

Source: stufiles.sanjac.edu/GeneralEducation/2017-2018-GenEd/other_docs/Report General... (General Education Outcomes Assessment 2016-2017, page 112)

Personal Responsibility Institutional Interpretation

As of February 15, 2017

Definitions & Rubric
• Personal Responsibility (SJC): “Students will develop habits of intellectual exploration, ethical decision-making, and physical wellbeing.”
• Personal Responsibility (THECB): “to include the ability to connect choices, actions, and consequences to ethical decision-making.”

General Issues & Discussion
• Setting a standard for our graduates and core complete students. The question addressed through this assessment process is, “Are we, as an institution, effective in educating our students to attain the general education outcomes?” That question focuses on students who graduate (with an AA, AS, or AAT) or complete the core. As we assess, we should not make allowances (e.g. assess more easily) for students we believe may be earlier in their college career; we acknowledge that the technology does not yet support ideal sampling methods (work sampled only from recent graduates), and we consider that limitation when analyzing the results. Our standard for and expectations of our graduates and core complete students should be applied consistently.

• Grading Perspective. The levels of the rubric do not correspond to a particular grade (e.g. Level 4 does not equate to an “A”) or to a particular level of academic experience (freshman, sophomore, etc.). More importantly, we are not assessing student work for the purpose of assigning a grade; the purpose is to identify their relative level of personal responsibility to evaluate how well San Jacinto College is helping students attain that skill. Thus, if a student only performs at a Level 1 even though they are addressing the assignment extremely well, it is still a Level 1 performance.

• Frame analytic (criterion-by-criterion) assessment within a holistic assessment framework. Once all criteria have been evaluated, consider the student’s performance more holistically. Is the student’s work representative of work you would be confident labeling “successful personal responsibility” by a San Jacinto College graduate or core complete student? Alternatively, is the student’s work representative of work you believe should *not* be labeled as “successful personal responsibility” by a San Jacinto College graduate or core complete student? Use your answers to those questions to reconsider your criterion-by-criterion assessment.

• Generalize behaviors (reading holistically). It is valid to consider the examples given within the criteria or within the institutional interpretation to be a general description of that level of personal responsibility rather than the only valid examples of that skill level. The work presented by the student may represent a comparable level of skill. In other words, given another opportunity to complete the assignment, might a student performing at the level shown in this response exhibit the specific examples offered by the rubric or institutional interpretation?

• Accuracy of content-specific knowledge. The personal responsibility rubric does not assess content-specific knowledge. It is possible for a student to complete an assignment based on incorrect knowledge schema while still demonstrating adequate personal responsibility. Certainly, very specific content errors that do not affect the meaning of the document should not unduly lower the assessment on the rubric.


Institutional Interpretation of Criteria Usability of Sample

• See separate document “Determining Usability of Sample”

Ethical Issue Recognition
• Successful students are able to recognize basic and obvious ethical issues and begin to address the complexities or interrelationships among the issues. Unsuccessful students may not recognize basic/obvious ethical issues or are unable to address the issues beyond describing them.
• Addressing ethical issues in an oversimplified manner – e.g. only describing the issue without more in-depth analysis or discussion – is indicative of a Level 1.

Curiosity
• Successful students exhibit curiosity in learning by exploring and writing about a topic in some depth and in a manner indicating at least mild interest. Unsuccessful students explore a topic at a surface level and with only limited, basic facts.
• The level of content development exhibited by the student or the complexity of prior learning transferred into the assignment may overlap with and be indicative of the student’s curiosity. For example, a student that provides multiple supporting ideas, each developed with examples and details, is likely exhibiting at least mild interest in the subject (e.g. Level 2). A student that makes an effort to include more novel or insightful content suggests higher levels of curiosity. A student that does the bare minimum or does not appear to make the effort to develop ideas may be performing lower (Level 1 or 0) on this criterion.

Reflection and Self-Assessment
• Successful students articulate personal strengths and challenges from one context to increase their ability to perform in another. Unsuccessful students are only able to describe their own performance in basic terms (success or failure).
• To reflect and self-assess, a student must express introspection; they must explicitly describe and evaluate their own thought process, learning process, or performance. To qualify as Reflection and Self-Assessment, a student expressing an opinion on an issue should discuss how or why their opinion has changed or not; if they only express an opinion or agreement/disagreement on an issue, that is not relevant to this criterion. This may yield a very significant number of Level 0 assessments on this criterion; if the assignment does not ask a student to reflect in this manner, the usability of sample may be a Level 3.

Solving Problems
• Successful students are able to at least consider and reject less acceptable alternatives; they may be able to select from multiple alternatives, provide a rationale for that selection, and explain any consequences of the solution. Unsuccessful students may only be able to consider a single approach to solve a problem.
• For common assignments that do not take a problem-solution approach: Successful students are able to view an issue or topic from different positions and perspectives, such as cultural, political, and/or anthropological. Successful students are able to demonstrate sympathy toward these positions/perspectives. They are able to select and defend a conclusion. Unsuccessful students may respond from only one position or perspective (Level 1). If that position/perspective is their own, it is a Level 0.

Evaluation of Different Ethical Perspectives or Concepts
• Successful students are able to at least state a position and describe objections to, assumptions of, and implications of different perspectives; they may be able to respond to those objections, assumptions, and implications. Unsuccessful students may be able to state a position, but they are unable to identify objections, assumptions, or limitations of different perspectives.


Personal Responsibility Rubric Definition: Students will develop habits of intellectual exploration, ethical decision-making, and physical well-being.

Criteria | Level 4 | Level 3 | Level 2 | Level 1 | Level 0

Usability of Sample (Levels for this criterion are separate and distinct categories – not a scale.)

Level 4: No issues encountered.

Level 3: There was a disconnect between the rubric and the assignment, e.g. student may or may not have performed well, but the assignment did not ask the student to perform in a manner expected by the rubric.

Level 2: The student responded in a manner that interfered with reliable assessment of this outcome, e.g. student wrote much less than assignment required; or, poor writing skills interfere with assessing a non-writing outcome.

Level 1: The assignment was not the common assignment expected for the course, e.g. entirely different assignment, excessively modified common assignment, mistaken alignment, or presented as not part of regular course grade.

Level 0: This document was not accessible or not assessable: wrong file format, unable to open the file, illegible/unreadable, unexpected teamwork, or instance of plagiarism.

Ethical Issue Recognition

Level 4: Student can recognize ethical issues when presented in a complex, multilayered (gray) context AND can recognize cross-relationships among the issues.

Level 3: Student can recognize ethical issues when issues are presented in a complex, multilayered (gray) context OR can grasp cross-relationships among the issues.

Level 2: Student can recognize basic and obvious ethical issues and grasp (incompletely) the complexities or interrelationships among the issues.

Level 1: Student can recognize basic and obvious ethical issues but fails to grasp complexity or interrelationships.

Level 0: Does not meet “Level 1” standards.

Curiosity

Level 4: Explores a topic in depth, yielding a rich awareness and/or little-known information indicating intense interest in the subject.

Level 3: Explores a topic in depth, yielding insight and/or information indicating interest in the subject.

Level 2: Explores a topic with some evidence of depth, providing occasional insight and/or information indicating mild interest in the subject.

Level 1: Explores a topic at a surface level, providing little insight and/or information beyond the very basic facts indicating low interest in the subject.

Level 0: Does not meet “Level 1” standards.

Reflection and Self-Assessment

Level 4: Envisions a future self (and possibly makes plans that build on past experiences that have occurred across multiple and diverse contexts).

Level 3: Evaluates changes in own learning over time, recognizing complex contextual factors (e.g., works with ambiguity and risk, deals with frustration, considers ethical frameworks).

Level 2: Articulates strengths and challenges (within specific performances or events) to increase effectiveness in different contexts (through increased self-awareness).

Level 1: Describes own performances with general descriptors of success and failure.

Level 0: Does not meet “Level 1” standards.

Solving Problems

Level 4: Not only develops a logical, consistent plan to solve problem, but recognizes consequences of solution and can articulate reason for choosing solution.

Level 3: Having selected from among alternatives, develops a logical, consistent plan to solve the problem.

Level 2: Considers and rejects less acceptable approaches to solving problem.

Level 1: Only a single approach is considered and is used to solve the problem.

Level 0: Does not meet “Level 1” standards.

Evaluation of Different Ethical Perspectives or Concepts

Level 4: States a position and can state the objections to, assumptions and implications of, and can reasonably defend against the objections to, assumptions and implications of different ethical perspectives/concepts, and the defense is adequate and effective.

Level 3: States a position and can state the objections to, assumptions and implications of, and respond to the objections to, assumptions and implications of different ethical perspectives/concepts, but the student's response is inadequate.

Level 2: States a position and can state the objections to, assumptions and implications of different ethical perspectives/concepts but does not respond to them (objections, assumptions, and implications are compartmentalized by and do not affect position.)

Level 1: States a position but cannot state the objections to and assumptions and limitations of the different perspectives/concepts.

Level 0: Does not meet “Level 1” standards.

San Jacinto College, General Education effective Fall 2016 Adapted from AAC&U LEAP VALUE Rubrics


Social Responsibility Institutional Interpretation

As of February 14, 2017

Definitions & Rubric
• Social Responsibility (SJC): “Students will demonstrate a global perspective toward issues of culture, society, politics, environment, and sustainability.”
• Social Responsibility (THECB): “to include intercultural competence, knowledge of civic responsibility, and the ability to engage effectively in regional, national, and global communities.”

General Issues & Discussion
• Setting a standard for our graduates and core complete students. The question addressed through this assessment process is, “Are we, as an institution, effective in educating our students to attain the general education outcomes?” That question focuses on students who graduate (with an AA, AS, or AAT) or complete the core. As we assess, we should not make allowances (e.g. assess more easily) for students we believe may be earlier in their college career; we acknowledge that the technology does not yet support ideal sampling methods (work sampled only from recent graduates), and we consider that limitation when analyzing the results. Our standard for and expectations of our graduates and core complete students should be applied consistently.

• Grading Perspective. The levels of the rubric do not correspond to a particular grade (e.g. Level 4 does not equate to an “A”) or to a particular level of academic experience (freshman, sophomore, etc.). More importantly, we are not assessing student work for the purpose of assigning a grade; the purpose is to identify their relative level of social responsibility to evaluate how well San Jacinto College is helping students attain that skill. Thus, if a student only performs at a Level 1 even though they are addressing the assignment extremely well, it is still a Level 1 performance.

• Frame analytic (criterion-by-criterion) assessment within a holistic assessment framework. Once all criteria have been evaluated, consider the student’s performance more holistically. Is the student’s work representative of work you would be confident labeling “successful social responsibility” by a San Jacinto College graduate or core complete student? Alternatively, is the student’s work representative of work you believe should *not* be labeled as “successful social responsibility” by a San Jacinto College graduate or core complete student? Use your answers to those questions to reconsider your criterion-by-criterion assessment.

• Generalize behaviors (reading holistically). It is valid to consider the examples given within the criteria or within the institutional interpretation to be a general description of that level of social responsibility rather than the only valid examples of that skill level. The work presented by the student may represent a comparable level of skill. In other words, given another opportunity to complete the assignment, might a student performing at the level shown in this response exhibit the specific examples offered by the rubric or institutional interpretation?

• Accuracy of content-specific knowledge. The social responsibility rubric does not assess content-specific knowledge. It is possible for a student to complete an assignment based on incorrect knowledge schema while still demonstrating adequate social responsibility. Certainly, very specific content errors that do not affect the meaning of the document should not unduly lower the assessment on the rubric.


Institutional Interpretation of Criteria

The College definition of “Social Responsibility” includes “culture, society, politics, environment, and sustainability.” The rubric criteria focus on culture, but they should be interpreted to include the other elements of the College definition. When the rubric uses the word “culture,” it should be interpreted to also include perspectives regarding society, politics, environment, or sustainability.

The College interprets the use of the word “culture” as including a range of personal characteristics: race, ethnicity, nationality, gender, age, sexual orientation, gender identity, religion, disability, socioeconomic status, political affiliation, or veteran status.

References to “global” suggest perspectives other than the student’s own; they do not necessarily suggest a truly global worldview. They may also suggest aspects of civic responsibility.

Usability of Sample

• See separate document “Determining Usability of Sample”

Cultural Diversity

• Successful students at least connect two or more cultures with an acknowledgement of power structures and demonstrate respectful interaction. Unsuccessful students may describe the experiences of others through one cultural perspective though perhaps demonstrating openness to varied cultures.

Knowledge (of cultural worldview frameworks)
• Successful students at least partially understand the complexity of important elements of another culture’s history, values, politics, beliefs, or practices. Unsuccessful students at best superficially understand elements of another culture.
• Describing or listing ideas, history, values, etc. that are important to another culture – without further explanation or analysis – is indicative of a Level 1.

Skills (empathy)
• Successful students at least identify components of other cultural perspectives though they may respond from their own worldview; they may interpret experiences from another worldview and recognize the feelings of another cultural group. Unsuccessful students only view the experiences of others through their own cultural worldview.

Attitudes (openness)
• Successful students at least express openness to cultural differences, are aware of their own cultural values within their judgments though they may have difficulty suspending that judgment, and are open to or are willing to change. Unsuccessful students are unaware of the cultural influences in their own judgment or worldview though they may be receptive to interacting with culturally different others.

Global Self-Awareness
• Successful students analyze ways that human actions influence the natural and human world and may be able to articulate their own identity in and effect on a global context. Unsuccessful students are only able to identify some connections between individual decision-making and local or global issues.


Social Responsibility Rubric Definition: Students will demonstrate a global perspective toward issues of culture, society, politics, environment, and sustainability.

Criteria | Level 4 | Level 3 | Level 2 | Level 1 | Level 0

Usability of Sample (Levels for this criterion are separate and distinct categories – not a scale.)

Level 4: No issues encountered.

Level 3: There was a disconnect between the rubric and the assignment, e.g. student may or may not have performed well, but the assignment did not ask the student to perform in a manner expected by the rubric.

Level 2: The student responded in a manner that interfered with reliable assessment of this outcome, e.g. student wrote much less than assignment required; or, poor writing skills interfere with assessing a non-writing outcome.

Level 1: The assignment was not the common assignment expected for the course, e.g. entirely different assignment, excessively modified common assignment, mistaken alignment, or presented as not part of regular course grade.

Level 0: This document was not accessible or not assessable: wrong file format, unable to open the file, illegible/unreadable, unexpected teamwork, or instance of plagiarism.

Cultural Diversity

Level 4: Adapts and applies a deep understanding of multiple worldviews, experiences, and power structures while initiating meaningful interaction with other cultures to address significant global problems.

Level 3: Analyzes substantial connections between the worldviews, power structures, and experiences of multiple cultures historically or in contemporary contexts, incorporating respectful interactions with other cultures.

Level 2: Explains and connects two or more cultures historically or in contemporary contexts with some acknowledgement of power structures, demonstrating respectful interaction with varied cultures and worldviews.

Level 1: Describes the experiences of others historically or in contemporary contexts primarily through one cultural perspective, demonstrating some openness to varied cultures and worldviews.

Level 0: Does not meet “Level 1” standards.

Knowledge (of cultural worldview frameworks)

Level 4: Demonstrates sophisticated understanding of the complexity of elements important to members of another culture in relation to its history, values, politics, communication styles, economy, or beliefs and practices.

Level 3: Demonstrates adequate understanding of the complexity of elements important to members of another culture in relation to its history, values, politics, communication styles, economy, or beliefs and practices.

Level 2: Demonstrates partial understanding of the complexity of elements important to members of another culture in relation to its history, values, politics, communication styles, economy, or beliefs and practices.

Level 1: Demonstrates surface understanding of the complexity of elements important to members of another culture in relation to its history, values, politics, communication styles, economy, or beliefs and practices.

Level 0: Does not meet “Level 1” standards.

Skills (Empathy)

Level 4: Interprets intercultural experience from the perspectives of own and more than one worldview and demonstrates ability to act in a supportive manner that recognizes the feelings of another cultural group.

Level 3: Recognizes intellectual and emotional dimensions of more than one worldview and sometimes uses more than one worldview in interactions.

Level 2: Identifies components of other cultural perspectives but responds in all situations with own worldview.

Level 1: Views the experience of others but does so through own cultural worldview.

Level 0: Does not meet “Level 1” standards.

Attitudes (Openness)

Level 4: Initiates and develops interactions with culturally different others. Suspends judgment in valuing her/his interactions with culturally different others.

Level 3: Begins to initiate and develop interactions with culturally different others. Begins to suspend judgment in valuing her/his interactions with culturally different others.

Level 2: Expresses openness to most, if not all, interactions with culturally different others. Has difficulty suspending any judgment in her/his interactions with culturally different others, and is aware of own judgment and expresses a willingness to change.

Level 1: Receptive to interacting with culturally different others. Has difficulty suspending any judgment in her/his interactions with culturally different others, but is unaware of own judgment.

Level 0: Does not meet “Level 1” standards.

Global Self-Awareness

Level 4: Effectively addresses significant issues in the natural and human world based on articulating one’s identity in a global context.

Level 3: Evaluates the global impact of one’s own and others’ specific local actions on the natural and human world.

Level 2: Analyzes ways that human actions influence the natural and human world.

Level 1: Identifies some connections between an individual’s personal decision-making and certain local and global issues.

Level 0: Does not meet “Level 1” standards.


Appendix F. Inter-Rater Reliability Estimates


Written Communication

Nominal (0,1)
Criteria | Krippendorff’s Alpha | Probability of Failure
Usability of Sample | 0.4403 | 0.5454
Context and Purpose | 0.1905 | 0.4458
Content Development | 0.2573 | 0.5242
Genre and Disciplinary Conventions | 0.2612 | 0.4514
Sources and Evidence | 0.2208 | 0.4604
Control of Syntax and Mechanics | 0.1370 | 0.5276

Ordinal (0,1,2,3,4)
Criteria | Krippendorff’s Alpha | Probability of Failure
Usability of Sample | 0.3650 | 0.5082
Context and Purpose | 0.2656 | 0.5000
Content Development | 0.2995 | 0.4960
Genre and Disciplinary Conventions | 0.2793 | 0.5016
Sources and Evidence | 0.2951 | 0.4982
Control of Syntax and Mechanics | 0.1833 | 0.4930


Personal Responsibility

Nominal (0,1)
Criteria | Krippendorff’s Alpha | Probability of Failure
Usability of Sample | 0.3998 | 0.5494
Ethical Issue Recognition | 0.2639 | 0.5168
Curiosity | 0.3344 | 0.4832
Reflection | 0.1304 | 0.4904
Solving Problems | 0.2173 | 0.4954
Different Ethical Perspectives | 0.2120 | 0.5076

Ordinal (0,1,2,3,4)
Criteria | Krippendorff’s Alpha | Probability of Failure
Usability of Sample | 0.2836 | 0.4950
Ethical Issue Recognition | 0.3192 | 0.4906
Curiosity | 0.3652 | 0.4818
Reflection | 0.1558 | 0.4898
Solving Problems | 0.2660 | 0.4890
Different Ethical Perspectives | 0.3224 | 0.5044


Social Responsibility

Nominal (0,1)
Criteria | Krippendorff’s Alpha | Probability of Failure
Usability of Sample | 0.4574 | 0.4516
Cultural Diversity | 0.1600 | 0.4698
Cultural Worldview | 0.0728 | 0.5002
Empathy | 0.1077 | 0.5120
Openness | 0.1280 | 0.4840
Global Self-Awareness | 0.1262 | 0.5118

Ordinal (0,1,2,3,4)
Criteria | Krippendorff’s Alpha | Probability of Failure
Usability of Sample | 0.3194 | 0.4988
Cultural Diversity | 0.1676 | 0.5008
Cultural Worldview | 0.1272 | 0.4948
Empathy | 0.1711 | 0.5108
Openness | 0.1414 | 0.4866
Global Self-Awareness | 0.1764 | 0.4978
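For readers unfamiliar with the statistic, the alpha values in this appendix measure agreement among the three faculty evaluators per document. A minimal sketch of how nominal Krippendorff's alpha can be computed from a list of per-document ratings is shown below; the data and function name are illustrative only and are not the report's actual computation pipeline.

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Nominal Krippendorff's alpha.

    `units` is a list of rating lists, one per student document; each
    inner list holds the scores assigned by the raters who saw it.
    """
    # Coincidence matrix: every ordered pair of ratings within a unit,
    # weighted by 1/(m - 1) for a unit with m ratings.
    coincidence = Counter()
    for ratings in units:
        m = len(ratings)
        if m < 2:
            continue  # a single rating carries no pairable information
        for a, b in permutations(ratings, 2):
            coincidence[(a, b)] += 1.0 / (m - 1)

    totals = Counter()  # marginal total for each rating value
    for (a, _b), weight in coincidence.items():
        totals[a] += weight
    n = sum(totals.values())

    # alpha = 1 - observed disagreement / expected disagreement
    observed = sum(w for (a, b), w in coincidence.items() if a != b)
    expected = sum(totals[a] * totals[b]
                   for a in totals for b in totals if a != b) / (n - 1)
    return 1.0 - observed / expected if expected else 1.0

# Three raters in full agreement on two documents yields alpha = 1.0.
print(krippendorff_alpha_nominal([[1, 1, 1], [0, 0, 0]]))  # 1.0
```

The alpha values reported above (mostly below 0.4) indicate modest inter-rater agreement, which is why each document was scored by three evaluators with a 2-of-3 decision rule.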


Appendix G. Sample Size Calculation


Sample Size Calculation

Population Minimum Population Maximum Required Sample Size

31,286 35,000 380

25,690 31,285 379

21,775 25,689 378

18,883 21,774 377

16,658 18,882 376

14,894 16,657 375

13,461 14,893 374

12,273 13,460 373

11,273 12,272 372

10,420 11,272 371

9,683 10,419 370

9,040 9,682 369

8,475 9,039 368

7,973 8,474 367

7,525 7,972 366

7,123 7,524 365

6,760 7,122 364

6,430 6,759 363

6,129 6,429 362

5,854 6,128 361

5,601 5,853 360

5,368 5,600 359

5,152 5,367 358

4,952 5,151 357

4,766 4,951 356

4,592 4,765 355

4,430 4,591 354

4,278 4,429 353

4,135 4,277 352

4,001 4,134 351

3,874 4,000 350

3,755 3,873 349

3,642 3,754 348

3,535 3,641 347

3,434 3,534 346

3,337 3,433 345

3,246 3,336 344

3,159 3,245 343

3,076 3,158 342

2,997 3,075 341

2,921 2,996 340

2,848 2,920 339

2,779 2,847 338

2,713 2,778 337

2,649 2,712 336

2,588 2,648 335

2,529 2,587 334

2,473 2,528 333

2,419 2,472 332

To determine the sample size needed for your analysis, identify the range containing your identified population. The sample size indicated is sufficient for you to be 95% confident that your results (give or take 5%) are representative of the population.
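The table's values appear consistent with the standard sample-size formula for a 95% confidence level and a ±5% margin of error, applied with a finite-population correction. The sketch below is an assumption about how the table was generated; rounding near range boundaries may differ by one.

```python
def required_sample_size(population, z=1.96, margin=0.05, p=0.5):
    """Sample size at 95% confidence, +/-5% margin, worst-case p=0.5."""
    # Infinite-population sample size (Cochran's formula): 384.16
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    # Finite-population correction shrinks n for smaller populations.
    n = n0 / (1 + (n0 - 1) / population)
    return round(n)

print(required_sample_size(35000))  # 380, matching the 31,286-35,000 row
```

For example, a population of 10,000 yields 370 and a population of 1,000 yields 278, matching the corresponding rows of the table.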




2,367 2,418 331

2,316 2,366 330

2,268 2,315 329

2,221 2,267 328

2,176 2,220 327

2,132 2,175 326

2,090 2,131 325

2,049 2,089 324

2,010 2,048 323

1,972 2,009 322

1,935 1,971 321

1,899 1,934 320

1,864 1,898 319

1,830 1,863 318

1,798 1,829 317

1,766 1,797 316

1,735 1,765 315

1,705 1,734 314

1,676 1,704 313

1,647 1,675 312

1,620 1,646 311

1,593 1,619 310

1,567 1,592 309

1,541 1,566 308

1,517 1,540 307

1,493 1,516 306

1,469 1,492 305

1,446 1,468 304

1,424 1,445 303

1,402 1,423 302

1,380 1,401 301

1,360 1,379 300

1,339 1,359 299

1,319 1,338 298

1,300 1,318 297

1,281 1,299 296

1,262 1,280 295

1,244 1,261 294

1,226 1,243 293

1,209 1,225 292

1,192 1,208 291

1,175 1,191 290

1,159 1,174 289

1,143 1,158 288

1,127 1,142 287

1,112 1,126 286

1,097 1,111 285

1,082 1,096 284

1,068 1,081 283




1,054 1,067 282

1,040 1,053 281

1,026 1,039 280

1,013 1,025 279

1,000 1,012 278

987 999 277

975 986 276

962 974 275

950 961 274

938 949 273

926 937 272

915 925 271

903 914 270

892 902 269

881 891 268

871 880 267

860 870 266

850 859 265

839 849 264

829 838 263

819 828 262

810 818 261

800 809 260

791 799 259

781 790 258

772 780 257

763 771 256

755 762 255

746 754 254

737 745 253

729 736 252

720 728 251

712 719 250

704 711 249

696 703 248

688 695 247

681 687 246

673 680 245

666 672 244

658 665 243

651 657 242

644 650 241

637 643 240

630 636 239

623 629 238

616 622 237

609 615 236

602 608 235

596 601 234




589 595 233

583 588 232

577 582 231

571 576 230

564 570 229

558 563 228

552 557 227

546 551 226

541 545 225

535 540 224

529 534 223

524 528 222

518 523 221

513 517 220

507 512 219

502 506 218

497 501 217

491 496 216

486 490 215

481 485 214

476 480 213

471 475 212

466 470 211

461 465 210

456 460 209

452 455 208

447 451 207

442 446 206

438 441 205

433 437 204

429 432 203

424 428 202

420 423 201

416 419 200

411 415 199

407 410 198

403 406 197

399 402 196

394 398 195

390 393 194

386 389 193

382 385 192

378 381 191

374 377 190

371 373 189

367 370 188

363 366 187

359 362 186

355 358 185

352 354 184

348 351 183

345 347 182

341 344 181

337 340 180

334 336 179

330 333 178

327 329 177

324 326 176

320 323 175

317 319 174

314 316 173

310 313 172

307 309 171

304 306 170

301 303 169

297 300 168

294 296 167

291 293 166

288 290 165

285 287 164

282 284 163

279 281 162

276 278 161

273 275 160

270 272 159

267 269 158

265 266 157

262 264 156

259 261 155

256 258 154

253 255 153

251 252 152

248 250 151

245 247 150

243 244 149

240 242 148

237 239 147

235 236 146

232 234 145

230 231 144

227 229 143

225 226 142

222 224 141

220 221 140

217 219 139

215 216 138

212 214 137

210 211 136

207 209 135

205 206 134

203 204 133

200 202 132

198 199 131

196 197 130

194 195 129

191 193 128

189 190 127

187 188 126

185 186 125

183 184 124

180 182 123

178 179 122

176 177 121

174 175 120

172 173 119

170 171 118

168 169 117

166 167 116

164 165 115

162 163 114

160 161 113

158 159 112

156 157 111

154 155 110

152 153 109

150 151 108

148 149 107

146 147 106

144 145 105

142 143 104

140 141 103

138 139 102

137 137 101

135 136 100

133 134 99

131 132 98

129 130 97

128 128 96

126 127 95

124 125 94

122 123 93

121 121 92

119 120 91

117 118 90

115 116 89

114 114 88

112 113 87

110 111 86

109 109 85

107 108 84

106 106 83

104 105 82

102 103 81

101 101 80

99 100 79

98 98 78

96 97 77

94 95 76

93 93 75

91 92 74

90 90 73

88 89 72

87 87 71

85 86 70

84 84 69

82 83 68

81 81 67

79 80 66

78 78 65

77 77 64

75 76 63

74 74 62

72 73 61

71 71 60

70 70 59

68 69 58

67 67 57

65 66 56

64 64 55

63 63 54

61 62 53

60 60 52

59 59 51

57 58 50

56 56 49

55 55 48

53 54 47

52 52 46

51 51 45

50 50 44

48 49 43

47 47 42

46 46 41

45 45 40

43 44 39

42 42 38

41 41 37

40 40 36

38 39 35

37 37 34

36 36 33

35 35 32

34 34 31

32 33 30

31 31 29

30 30 28

29 29 27

28 28 26

27 27 25

26 26 24

24 25 23

23 23 22

22 22 21

21 21 20

20 20 19

19 19 18

18 18 17

17 17 16

16 16 15

14 15 14

13 13 13

12 12 12

11 11 11

10 10 10

9 9 9

8 8 8

7 7 7

6 6 6

5 5 5

4 4 4

3 3 3

2 2 2

1 1 1

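The report does not state how the table above was generated, but the values are consistent with the standard sample-size calculation (Cochran's formula at a 95% confidence level with a ±5% margin of error, using the most conservative proportion p = 0.5) adjusted with a finite population correction and rounded to the nearest integer. A minimal sketch under that assumption:

```python
# Sample size for 95% confidence, +/-5% margin of error.
# ASSUMPTION: the report does not document its method; this formula
# (Cochran's formula with a finite population correction) reproduces
# the table's values.
Z = 1.96   # z-score for a 95% confidence level
E = 0.05   # margin of error (+/-5%)
P = 0.5    # assumed proportion; p = 0.5 maximizes the required sample

def required_sample_size(population: int) -> int:
    """Required sample size for the given population, 95% / +/-5%."""
    n0 = (Z ** 2) * P * (1 - P) / (E ** 2)  # infinite-population size, ~384.16
    n = n0 / (1 + n0 / population)          # finite population correction
    return int(n + 0.5)                     # round to nearest integer

print(required_sample_size(1067))  # → 282, matching the table's first row
```

For example, a population of 1,067 yields a required sample of 282, and the correction has no practical effect at the bottom of the table, where very small populations must be sampled in full (a population of 10 requires all 10).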