Annual Institutional Effectiveness Report

2017-2018

Prepared by the Office for Institutional Effectiveness and Strategic Planning

OIEP.COFC.EDU


The Assessment Process

The College of Charleston follows an annual assessment model for the systematic submission and review of academic program and administrative unit assessment reports (plans and results). Key elements of this model include articulated student learning/operational outcomes, multiple assessment measures, performance targets, peer mentoring and review, and broad-based participation. Academic programs and administrative units use the assessment template to guide the required structure of the assessment reports, which are housed in Compliance Assist (an assessment planning and management system). The College of Charleston Institutional Effectiveness (IE) assessment model defines two broad categories: academic programs and administrative units. Academic programs include undergraduate and graduate educational programs, stand-alone minors, certificates, and the general education program. Administrative units include administrative support services, academic and student support services, and centers and institutes.

Assessment Calendar with Deadlines

Academic Programs

• September 1: Data entry for assessment reports (plans and results) completed by all academic programs in Compliance Assist, along with submission of self-completed rubrics for the plans and results. For example, on September 1, 2018, assessment results for 2017-18 and assessment plans for 2018-19 are due.

• September 1-15: Independent reviews conducted and rubrics completed by the respective Deans Assessment Committees (DACs) and returned to the respective assessment coordinator(s).

• September 30: Final edits to plans and results as well as final rubrics uploaded to Compliance Assist.

Administrative Units

• June 30: Data entry for assessment reports (plans and results) completed by all administrative units in Compliance Assist, along with submission of self-completed rubrics for the plans and results. For example, on June 30, 2018, assessment results for 2017-18 and assessment plans for 2018-19 are due.

• July 1-15: Independent reviews conducted and rubrics completed by the respective Administrative Assessment Committees (AACs) and returned to the respective assessment coordinator(s).

• July 30: Final edits to plans and results as well as final rubrics uploaded to Compliance Assist.

Roles and Responsibilities

The College of Charleston IE assessment model engages broad-based participation and encompasses several key faculty, staff, and administrator roles. The IE assessment model is an ongoing, broad-based process that involves collaboration among assessment coordinators; members of the Deans Assessment Committees (DACs) at the school level; members of the Administrative Assessment Committees (AACs) at the division level; the chairs of the DACs and AACs, who comprise the Institutional Assessment Committee (IAC); the Provost and Executive Vice Presidents; the President; and the Office for Institutional Effectiveness and Strategic Planning (OIEP).


Assessment coordinators (faculty and staff members) work collaboratively with colleagues in their programs or units to develop an assessment report (plans and results) and coordinate their program's or unit's ongoing assessment process. The Deans Assessment Committees (DACs) are school-level assessment committees that exist for each school or college and the General Education Program; the DACs consist of faculty from across the varying disciplines. The Administrative Assessment Committees (AACs) are division-level assessment committees that exist for each of the nine divisions and consist of staff members and administrators from the respective divisions. These committees serve as mentors and work collaboratively with their programs/units to assist the assessment coordinators in their assessment efforts and to review the quality of the assessment reports based on established criteria provided in the Institutional Assessment Rubrics. The committees use the IE rubrics to focus discussions on the rubric indicators so as to increase the quality of assessment reports. The chair of each DAC and AAC serves on the Institutional Assessment Committee (IAC). The IAC is an institution-level committee consisting of the chairs of the DACs and AACs. It oversees the implementation of the IE assessment process and facilitates campus discussion and reflection on the use of results to make improvements in student learning and operations. The IAC also ensures the quality of the reviews conducted by the DACs and AACs through its oversight of the review process. Annually, each member of the IAC presents a DAC or AAC report to the IAC about the quality of the results and plans. Committee rosters and meeting minutes are archived on the OIEP website. Additional responsibilities of the IAC include:

• helping to create and maintain a culture of assessment at the College of Charleston,

• ensuring the use of assessment results to make improvements,

• motivating faculty and staff participation in all steps of the assessment process,

• providing feedback on assessments to promote continuous improvement,

• involving students by promoting awareness of institutional measures,

• coordinating assessment efforts,

• generating ways to involve external stakeholders in meaningful assessment activities,

• working with other campus entities to incorporate institutional data,

• coordinating and collaborating to provide faculty/professional development, and

• ensuring that new faculty and staff receive information about assessment.

The Executive Vice Presidents (EVPs) and the President review a random sample of completed rubrics for programs and units and provide additional feedback, if necessary. OIEP serves as a support office for assessment coordinators, the DACs, the AACs, the IAC, the EVPs, and the President.

Summary of Completion Rates in the Assessment Cycle

As a part of the assessment process, assessment coordinators (faculty and staff members) from each program or unit work collaboratively with their colleagues to develop the outcomes, select and implement measures, analyze results, and plan for improvements based on the results. There are two phases to this collaborative process, representing the two parts of an assessment report: planning and results. Assessment coordinators 1) report results from the previous year's assessment plan based on data analysis, use the results to make changes in curriculum, pedagogy, or operations, and record those changes in the use-of-results and assessment summary sections in Compliance Assist; and 2) develop an assessment plan for the current year, which includes measurement of the effect of changes implemented based on the previous year's results.


The assessment plan includes a mission statement, assessment process, outcomes, measures, and a curriculum or functional map. To demonstrate compliance with both the appropriate accrediting standards related to assessment and the College of Charleston's assessment procedures, OIEP staff conduct an audit of the completion status of all assessment reports, presenting the planning and results phases separately. The audit uses the following criteria as its primary indicators of completion: complete, partial, and missing. A rating of "complete" means that 100% of the required fields within the IE template contain content (text). A rating of "partial" means that more than 0% but less than 100% of the required fields contain content. A rating of "missing" means that 0% of the required fields contain content. (A short illustrative code sketch of this rating logic appears after Table 1.) The required fields included within the planning phase of the audit are program information, mission statement, assessment process, functional map, outcomes, assessment measures, research & service, and the relating of outcomes to strategic initiatives. The required fields included within the results phase of the audit are assessment results, meet targets, use of assessment results, impact on budget, and the assessment summary report.

Note: The data presented throughout this report were collected from Compliance Assist within two weeks after the institutional deadlines, so these data may not reflect the current completion status of any given program/school or unit/division.

Tables 1-3 show the completion rates for academic programs and schools, and Tables 4-6 show the completion rates for administrative units.

Table 1. Completion Rates for 2017-2018 Academic Assessment Reports

School Number of Programs

Plans Results

Complete Partial Missing Complete Partial Missing

SOTA 8 38% 63% 0% 38% 13% 50%

SB 14 71% 21% 7% 50% 43% 7%

EHHP 15 20% 73% 7% 100% 0% 0%

HSS 17 82% 12% 6% 53% 35% 12%

LCWA 24 46% 54% 0% 67% 33% 0%

SSM 23 78% 17% 4% 52% 39% 9%

SPS 3 0% 100% 0% 0% 67% 33%

Interdisciplinary Minors 5 100% 0% 0% 60% 40% 0%

Gen Ed* 1 100% 0% 0% 100% 0% 0%

HONS 1 100% 0% 0% 0% 100% 0%

UCSC 33 33% 55% 12% 27% 18% 24%

Total 144 53% 41% 6% 50% 34% 15%

*General Education has 7 distribution areas; however, it is counted as 1 program for the assessment audit.
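To make the audit's rating criteria concrete, the classification can be expressed as a short script. The following sketch (in Python) is a hypothetical illustration only; the report structure and field names are assumptions made for the example, not OIEP's actual Compliance Assist data model.

def completion_status(report, required_fields):
    """Classify an assessment report as complete, partial, or missing.

    'report' is assumed to be a dict mapping field names to text content.
    """
    # Count required fields that contain any text content.
    filled = sum(1 for field in required_fields if report.get(field, "").strip())
    if filled == len(required_fields):
        return "complete"  # 100% of required fields contain content
    if filled > 0:
        return "partial"   # more than 0% but less than 100% of fields filled
    return "missing"       # 0% of required fields contain content

# Hypothetical planning-phase fields, following the list in the audit description.
PLAN_FIELDS = [
    "program information", "mission statement", "assessment process",
    "functional map", "outcomes", "assessment measures",
    "research & service", "strategic initiatives",
]

Under these criteria, for example, a plan with text in every field except the functional map would be rated "partial".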


Table 2. Comparison of Completion Rates for Academic Programs

Year Number of Programs

Plans Results

Complete Partial Missing Complete Partial Missing

2015-2016 140 91% 8% 1% 81% 11% 8%

2016-2017 144 92% 8% 1% 54% 35% 12%

2017-2018 144 53% 41% 6% 50% 34% 15%

Table 3. Comparison of Complete Academic Assessment Reports

School Plans Results

2015-2016 2016-2017 2017-2018 2015-2016 2016-2017 2017-2018

SOTA 80% 88% 38% 50% 63% 38%

SB 100% 100% 64% 100% 77% 50%

EHHP 92% 87% 20% 88% 53% 100%**

HSS 96% 82% 82% 83% 77% 53%

LCWA 100% 96% 46% 68% 25% 67%

SSM 86% 91% 78% 90% 57% 52%

SPS 0% 100% 0% 0% 0% 0%

Interdisciplinary Minors 86% 100% 100% 86% 100% 60%

Gen Ed* 100% 100% 100% 100% 100% 100%

HONS 100% 100% 100% 100% 100% 0%

UCSC 100% 90% 33% 100% 33% 27%

Total 91% 92% 53% 81% 54% 50%

*General Education has 7 distribution areas; however, it is counted as 1 program for the assessment audit.
** EHHP was given an extension by the President to report past the institutional deadline.

Table 4. Completion Rates for 2017-2018 Administrative Assessment Reports

Division Number of Units

Plans Results

Complete Partial Missing Complete Partial Missing

Academic Affairs 18 6% 94% 0% 33% 50% 17%

Business Affairs 7 14% 86% 0% 0% 86% 14%

Enrollment Planning 2 0% 100% 0% 0% 100% 0%

Facilities Management 3 0% 100% 0% 0% 67% 33%

Information Technology 2 0% 100% 0% 0% 0% 100%

Institutional Advancement 1 0% 100% 0% 0% 0% 100%

Marketing and Communication 1 0% 100% 0% 0% 0% 100%

President’s Division 11 9% 91% 0% 0% 64% 36%


Student Affairs 14 0% 100% 0% 36% 64% 0%

Centers and Institutes 26 23% 77% 0% 12% 35% 4%

Total 85 11% 89% 0% 16% 52% 16%

Table 5. Comparison of Completion Rates for Administrative Units

Year Number of Units

Plans Results

Complete Partial Missing Complete Partial Missing

2015-2016 85 73% 27% 0% 49% 31% 19%

2016-2017 85 66% 34% 0% 27% 67% 6%

2017-2018 85 11% 89% 0% 16% 52% 16%

Table 6. Comparison of Complete Administrative Assessment Reports

Division Plans Results

2015-2016 2016-2017 2017-2018 2015-2016 2016-2017 2017-2018

Academic Affairs 63% 53% 6% 21% 21% 33%

Business Affairs 92% 42% 14% 67% 0% 0%

Enrollment Planning ** ** 0% ** ** 0%

Facilities Management ** ** 0% ** ** 0%

Information Technology ** ** 0% ** ** 0%

Institutional Advancement 100% 100% 0% 100% 0% 0%

Marketing and Communication 100% 100% 0% 0% 100% 0%

Presidents Division 100% 82% 9% 0% 0% 0%

Student Affairs 80% 73% 0% 100% 27% 36%

Centers and Institutes 52% 73% 23% 57% 54% 12%

Total 73% 66% 11% 49% 27% 16%

** Enrollment Planning, Facilities Management, and Information Technology were not formal divisions prior to 2017-2018 and thus were not reported as such for assessment.

Due to low completion rates among many academic programs and administrative units, the Office for Institutional Effectiveness and Strategic Planning implemented several strategies to increase completion. The initial strategy used emails to DAC and AAC chairs to encourage completion within their respective schools and divisions. The second strategy used a similar email structure but focused directly on the assessment coordinators. The third strategy involved a thorough review of the data to determine which programs and units were missing which sections of the assessment template; the resulting analysis identified the programs and units that required one-on-one consultation with OIEP. The fourth strategy was to hold meetings with the Deans and EVPs of the respective schools and divisions to highlight missing components. The fifth strategy involved the Provost sending memos to programs and units with missing reports.


Quality Assurance Process

The assessment coordinators submit the assessment reports (plans and results) for review to the assigned DAC or AAC. The assigned DAC or AAC chair and/or mentor in each school or division reviews the quality of the assessment reports (plans and results) based on established criteria defined in the IE assessment rubrics: the Assessment Plan Rubric and the Assessment Results Rubric. These rubrics are a tool for providing specific feedback to improve the quality and increase the rigor of the assessment reports by setting expectations and promoting discussion, and they enhance the collaborative process that deepens the culture of assessment. Based on feedback from the DACs and AACs, assessment coordinators have the opportunity to revise and improve the quality of their plans and results. Table 7 provides the coding scheme for the rubrics. Table 8 presents the aggregated plan rubric ratings for programs and units, and Table 9 presents the aggregated results rubric ratings. Table 10 shows the average rubric rating for both academic programs and administrative units. See Appendix A for plan and results rubric ratings for individual programs and units.

Table 7. Codes for Assessment Rubrics

Establishing (1) Emerging (2) Developing (3) Proficient (4) Exemplary (5)

Plans | 0-3 indicators met | 4-5 indicators met | 6 indicators met | ALL Developing indicators + 1-2 of the Proficient indicators | 9 indicators met

Results | 0-3 indicators met | 4-5 indicators met | 6 indicators met | ALL Developing indicators + #7 of the Proficient indicators | 8 indicators met
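Read as an algorithm, Table 7 maps the number of indicators met to a level from 1 to 5. The sketch below (Python, with illustrative function and parameter names that are not part of the official rubrics; indicator groupings follow the Appendix B rubrics) shows one way to express that mapping.

def plan_rubric_level(developing_met, proficient_met, exemplary_met):
    """Plans rubric: Developing indicators #1-6, Proficient #7-8, Exemplary #9."""
    if developing_met == 6 and proficient_met == 2 and exemplary_met:
        return 5  # Exemplary: all nine indicators met
    if developing_met == 6 and proficient_met >= 1:
        return 4  # Proficient: all Developing + 1-2 Proficient indicators
    if developing_met == 6:
        return 3  # Developing: all six Developing indicators met
    if developing_met >= 4:
        return 2  # Emerging: 4-5 Developing indicators met
    return 1      # Establishing: 0-3 Developing indicators met

def results_rubric_level(developing_met, indicator_7_met, indicator_8_met):
    """Results rubric: Developing indicators #1-6, Proficient #7, Exemplary #8."""
    if developing_met == 6 and indicator_7_met and indicator_8_met:
        return 5  # Exemplary: all eight indicators met
    if developing_met == 6 and indicator_7_met:
        return 4  # Proficient: all Developing indicators plus #7
    if developing_met == 6:
        return 3  # Developing
    if developing_met >= 4:
        return 2  # Emerging
    return 1      # Establishing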

Table 8. 2017-2018 Summary of Plans Assessment Rubric Ratings*

School/Program Rubric Rating (Plans)

Establishing (1) Emerging (2) Developing (3) Proficient (4) Exemplary (5)

Arts (N=8, n=8) 25% 25% 25% 13% 13%

Business (N=14, n=13) 0% 31% 8% 46% 15%

Education, Health and Human Performance (N=15, n=13) 8% 69% 23% 0% 0%

Humanities and Social Sciences (N=17, n=16) 0% 0% 0% 0% 100%

Languages, Cultures and World Affairs (N=24, n=24) 4% 17% 8% 42% 29%

Sciences and Mathematics (N=23, n=22) 5% 18% 23% 55% 0%

Professional Studies (N=3, n=2) 0% 100% 0% 0% 0%

Interdisciplinary Programs (N=5, n=5) 0% 0% 0% 40% 60%

Graduate School (N=33, n=28) 18% 29% 14% 29% 11%

Honors College (N=1, n=1) 0% 0% 0% 0% 100%

General Education (N=1, n=1) 0% 0% 0% 0% 100%

TOTAL (N=144, n=133) 8% 25% 13% 29% 26%

Division/Unit Rubric Rating (Plans)

Academic Affairs (N=18, n=9) 0% 0% 67% 22% 11%

Business Affairs (N=7, n=6) 17% 0% 0% 67% 17%


Enrollment Management (N=2, n=1) 0% 0% 0% 100% 0%

Facilities Management (N=3, n=1) 0% 0% 0% 100% 0%

Information Technology (N=2, n=1) 0% 0% 0% 0% 100%

Institutional Advancement (N=1, n=1) 0% 0% 0% 100% 0%

Marketing and Communications (N=1, n=1) 0% 0% 0% 0% 100%

President’s Division (N=11, n=4) 0% 0% 0% 100% 0%

Student Affairs (N=14, n=14) 36% 43% 7% 14% 0%

Centers, Institutes, Offices, and Programs within Schools (N=26, n=22) 5% 45% 27% 18% 5%

TOTAL (N=85, n=60) 12% 27% 22% 32% 8%

*Percentages based on those programs/units that completed rubrics.

Table 9. 2017-2018 Summary of Results Assessment Rubric Ratings*

School/Area Rubric Rating (Results)

Establishing (1) Emerging (2) Developing (3) Proficient (4) Exemplary (5)

Arts (N=8, n=0) Missing Missing Missing Missing Missing

Business (N=14, n=13) 8% 0% 15% 23% 54%

Education, Health and Human Performance (N=15, n=13) 0% 31% 15% 38% 15%

Humanities and Social Sciences (N=17, n=15) 0% 0% 0% 27% 73%

Languages, Cultures and World Affairs (N=24, n=23) 0% 9% 35% 35% 22%

Sciences and Mathematics (N=23, n=20) 15% 20% 20% 25% 20%

Professional Studies (N=3, n=0) Missing Missing Missing Missing Missing

Interdisciplinary Programs (N=5, n=5) 0% 20% 0% 0% 80%

Graduate School (N=33, n=11) 0% 18% 18% 27% 36%

Honors College (N=1, n=1) 0% 0% 0% 0% 100%

General Education (N=1, n=1) 0% 0% 0% 0% 100%

TOTAL (N=144, n=102) 4% 13% 18% 27% 38%

Division Rubric Rating (Results)

Academic Affairs (N=18, n=10) 13% 0% 75% 13% 0%

Business Affairs (N=7, n=0) Missing Missing Missing Missing Missing

Enrollment Planning (N=2, n=2) 0% 0% 50% 50% 0%

Facilities Management (N=3, n=0) Missing Missing Missing Missing Missing

Information Technology (N=2, n=1) 0% 0% 0% 0% 100%

Institutional Advancement (N=1, n=0) Missing Missing Missing Missing Missing

Marketing and Communications (N=1) Missing Missing Missing Missing Missing

President’s Division (N=11, n=1) 0% 0% 0% 0% 100%

Student Affairs (N=14, n=5) 0% 0% 0% 0% 100%

Centers, Institutes, Offices, and Programs within Schools (N=26, n=23) 4% 22% 17% 26% 30%

TOTAL (N=85, n=40) 5% 13% 28% 20% 35%

*Percentages based on those programs/units that completed rubrics.

Table 10. Year-to-Year Comparison of Average Rubric Ratings*

Plans Results

Academic Administrative Total Academic Administrative Total

2015-2016 2.80 2.81 2.80 3.32 2.94 3.18

2016-2017 3.75 3.70 3.72 3.44 3.23 3.42

2017-2018 3.42 2.95 3.27 3.83 3.68 3.79

*Averages based on those programs/units that completed rubrics.

Evidence of Continuous Improvement

The primary purpose of IE assessment is to collect data to identify gaps in student learning and operations. This purpose is served when assessment data present an opportunity for improvement and a new strategy is implemented to remove the gap and, ultimately, enhance student learning and/or advance operational effectiveness. Programs and units then collect data to evaluate the impact of an implemented change on student learning and operations. The use of the prior year's results to improve student learning and operations demonstrates a "closed loop" process. As specified by indicator 9 on the plan assessment rubric and indicators 7 and 8 on the results assessment rubric (see Appendix B), closing the loop is essential to an exemplary assessment report. Table 11 summarizes the share of programs and units in each school and division that received exemplary ratings on their assessment plan and results rubrics, thus closing the loop.

Table 11. Closing the Loop*

School/Area**

Rubric Rating (Exemplary)

Plans Results

2015-2016 2016-2017 2017-2018 2015-2016 2016-2017 2017-2018

Arts 0% 0% 13% 0% 50% Missing

Business 0% 8% 15% 17% 33% 54%

Education, Health and Human Performance 0% 0% 0% 0% 7% 15%

Humanities and Social Sciences 88% 100% 100% 94% 81% 73%

Languages, Cultures and World Affairs 4% 17% 29% 33% 41% 22%

Sciences and Mathematics 0% 18% 0% 9% 29% 20%

Professional Studies Missing 0% 0% Missing 0% Missing

Interdisciplinary Programs 75% 75% 60% 75% 50% 80%

Graduate School 4% 8% 11% 10% 11% 36%

Honors College 0% 100% 100% 100% 100% 100%

General Education 100% 100% 100% 100% 100% 100%

TOTAL 16% 25% 26% 26% 35% 38%

Division/Area

Academic Affairs 6% 0% 11% 11% 30% 0%

Business Affairs 0% 58% 17% 0% 20% Missing

Enrollment Planning *** *** 0% *** *** 0%

Facilities Management *** *** 0% *** *** Missing

Information Technology *** *** 100% *** *** 100%

Institutional Advancement 0% 0% 0% 0% 0% Missing

Marketing and Communications 0% 100% 100% 0% 0% Missing

President’s Division 0% 0% 0% 8% 0% 100%

Student Affairs 0% 40% 0% 7% 31% 100%

Centers, Institutes, Offices, and Programs within Schools 5% 4% 5% 5% 9% 30%

TOTAL 3% 19% 8% 6% 16% 35%

*Percentages based on those programs/units that completed rubrics.
** Prior to 2016-2017, First-Year Experience was counted with the academic programs/schools; from 2016-2017 forward, it is counted with the administrative units.
*** Enrollment Planning, Facilities Management, and Information Technology were not formal divisions prior to 2017-2018 and thus were not reported as such for assessment.

Institutional Measures: Surveys

The College of Charleston utilizes a variety of survey datasets to assess student learning, student engagement, and student success. OIEP plans, coordinates, administers, and publishes results from several national and institutional surveys conducted at the College of Charleston. The national surveys are the CIRP Freshman Survey, the Your First College Year Survey (YFCY), the College Senior Survey (CSS), the National Survey of Student Engagement (NSSE), the Beginning College Survey of Student Engagement (BCSSE), and the Faculty Survey of Student Engagement (FSSE). In addition, the ETS Proficiency Profile standardized test is administered every three years at the College of Charleston. Surveys at the institutional level are the Senior Exit Survey (SES) and the Post-Graduation Survey (administered six months, one year, three years, and five years post-graduation). Results from these surveys are communicated via emails and presentations, and previous and current survey analytical reports are published on the OIEP website. The results of these enterprise-level surveys are used to evaluate student learning. In the 2017-2018 academic year, OIEP administered the Senior Exit Survey, the Post-Graduation Survey (six months, one year, three years, and five years post-graduation), and the ETS Proficiency Profile test.


Assessing the College’s Mission

Academic programs and administrative units aligned their outcomes with strategic initiative(s) from the College’s Strategic Plan to assess the institution’s mission. Table 12 summarizes the number of outcomes supporting each strategic initiative for the past three years, while Figure 1 presents the same comparison over the past six years.

Table 12. Strategic Initiatives Supported by Assessment Reports

Strategic Initiative 2015-2016 2016-2017 2017-2018

Acad. Admin. Acad. Admin. Acad. Admin.

1: Enhance the undergraduate academic core. 343 59 333 79 329 32

2: Develop nationally recognized academic programs at the graduate level. 52 13 76 23 80 7

3: Develop and support a highly qualified, diverse and stable base of faculty and staff. 6 41 1 49 3 16

4: Identify, attract, recruit, enroll and retain academically distinguished, well-prepared, diverse students. 20 75 38 95 34 41

5: Enhance and support co-curricular and extracurricular programs and facilities to promote and sustain an integrated, campus-wide approach to holistic education of students. 22 94 7 94 5 45

6: Align all aspects of the administrative and academic policies, procedures and practices to support the College’s purpose and achieve its envisioned future. 3 71 4 66 1 40

7: Provide appropriate, up-to-date facilities and infrastructure to support and enhance academic programs and co-curricular opportunities for students. 0 41 0 25 1 10

8: Engage with local, national and international constituents to leverage higher education for a stronger South Carolina. 22 39 22 68 24 38

9: Establish campus wide policies and practices aimed at creating enhanced non-state resources and promoting greater fiscal responsibility and self-sufficiency. 1 35 0 33 0 16

10: Brand the College of Charleston so that it is nationally and internationally recognized for a personalized liberal arts education with specific areas of distinction at the undergraduate and graduate level. 17 28 26 37 0 1


Figure 1. Number of Outcomes Aligned to the College’s Strategic Initiatives

[Figure 1 is a bar chart showing the number of outcomes (y-axis) aligned to Strategic Initiatives 1-10 (x-axis), by year, for 2012-2013 through 2017-2018.]

Institutional Effectiveness Consultations and Workshops

Throughout the fall and spring semesters, OIEP employed numerous assessment strategies (sessions and workshops) to improve the quality of assessment at the College of Charleston. There were 229 units/programs eligible to attend these sessions and workshops in the 2017-2018 academic year. One strategy was the "Connecting the Dots" sessions, created to increase collaboration between units and enhance the overall quality of assessment. Each session was structured as a two-hour workshop involving 2-5 units, and the process included a Session 1 and a Session 2 for each unit (10 units received a combined Session 1 and Session 2 in one sitting). Unit leaders were encouraged to invite as many contributing faculty/staff members as possible. On the administrative side, 70 units were represented in at least one of the 3 offerings, with 27 units represented at both Session 1 and Session 2. For the academic programs, 49 programs, comprising 52 faculty members, participated in Session 1 of the Connecting the Dots series. Another strategy this academic year was a workshop series (3 workshops) crafted for Student Affairs around rubrics, both their design and application. Attendance for these workshops was as follows: 13 participants in the first, 12 in the second, and 6 in the third; 4 participants attended all 3 workshops. An additional four workshops were offered to train faculty and staff on the use of the IE rubrics; 7 faculty/staff attended those workshops.



Appendix A

Detailed Summary of Plan and Results Rubric Ratings

As part of the annual review process, each academic program’s and administrative unit’s assessment report is rated using two developmental rubrics, created by OIEP, for the two report sections (plans and results; see Appendix B). The plan rubric has 9 indicators and the results rubric has 8 indicators. As described in Table 13 and on each rubric, the program or unit receives a rubric rating based on the number of indicators successfully completed. Tables 14-34 provide each program’s and unit’s rubric scores.

Table 13. Codes for Assessment Rubrics

Establishing (1) Emerging (2) Developing (3) Proficient (4) Exemplary (5)

Plans | 0-3 indicators met | 4-5 indicators met | 6 indicators met | ALL Developing indicators + 1-2 of the Proficient indicators | 9 indicators met

Results | 0-3 indicators met | 4-5 indicators met | 6 indicators met | ALL Developing indicators + #7 of the Proficient indicators | 8 indicators met

Table 14. School of the Arts Assessment Rubrics

School of the Arts (N=8 Academic Programs)

2015-2016 2016-2017 2017-2018

Plans Results Plans Results Plans Results

School of the Arts 1 3 4 5 5 Missing

Art History BA 2 4 4 2 1 Missing

Arts Management BA 1 2 1 3 1 Missing

Dance BA 2 3 4 3 3 Missing

Historic Preservation and Community Planning BA 2 2 4 3 3 Missing

Music BA 2 4 4 5 2 Missing

Studio Arts BA 2 3 2 5 4 Missing

Theatre BA 2 4 4 5 2 Missing


Table 15. School of Business Assessment Rubrics

School of Business (N=14 Academic Programs)

2015-2016 2016-2017 2017-2018

Plans Results Plans Results Plans Results

School of Business Missing Missing Missing Missing Missing Missing

Accounting BS 4 5 2 1 2 5

Business Administration BS 3 3 3 4 4 5

Commercial Real Estate Finance NA NA NA NA 2 5

Economics BS 4 3 4 5 4 5

Entrepreneurship Minor 2 3 4 1 3 3

Finance BS 4 4 4 5 4 5

Global Logistics and Transportation Minor 2 3 4 4 4 5

Hospitality and Tourism Management BS 1 5 5 5 5 5

Information Management Minor NA NA NA NA 4 3

International Business BS 1 4 4 1 2 1

Leadership, Change, and Social Responsibility Minor 2 3 4 5 4 4

Marketing BS 2 3 4 2 5 4

Real Estate Minor 1 3 3 4 NA NA

Supply Chain Management BS 4 3 4 1 2 4


Table 16. School of Education, Health, and Human Performance Assessment Rubrics

School of Education, Health, and Human Performance

(N=15 Academic Programs)

2015-2016 2016-2017 2017-2018

Plans Results Plans Results Plans Results

School of Education, Health, and Human Performance 1 1 1 Missing Missing Missing

Athletic Training BS 2 3 4 3 2 5

Coaching Minor 1 2 4 1 2 2

Early Childhood Education BS 2 3 4 2 2 5

Elementary Education BS 2 2 4 4 3 3

Exercise Science BS 1 4 2 1 1 2

Health Minor 1 4 3 2 Missing Missing

Middle Grades Education BS 3 3 4 3 3 4

Physical Education BS 4 4 4 5 2 4

Public Health BS 1 4 4 3 2 3

Secondary Education English BS 1 3 3 1 2 2

Secondary Education Mathematics BS 2 4 3 3 2 4

Secondary Education Science BS 1 4 4 4 2 4

Secondary Education Social Studies BS 2 4 4 2 3 2

Special Education BS 2 2 4 2 2 4


Table 17. School of Humanities and Social Sciences Assessment Rubrics

School of Humanities and Social Sciences

(N=17 Academic Programs)

2015-2016 2016-2017 2017-2018

Plans Results Plans Results Plans Results

School of Humanities and Social Sciences 4 Missing Missing Missing Missing 4

Anthropology BS 5 5 5 5 5 5

Communication BA 5 5 5 5 5 4

Creative Writing Minor 5 5 5 5 5 5

English BA 5 5 5 5 5 5

Geography Minor 5 5 5 3 5 Missing

History BA 5 5 5 5 5 5

Irish and Irish American Studies Minor 3 5 5 4 5 Missing

Philosophy BA 5 5 5 5 5 5

Political Science BA 5 5 5 5 5 5

Psychology BA 5 5 5 5 5 5

Psychology BS 5 5 5 5 5 5

Public Health BA 5 5 5 5 5 4

Religious Studies BA 5 5 5 5 5 5

Sociology BS 5 5 5 5 5 5

Urban Studies BA 5 5 5 5 5 4

Women's and Gender Studies BA 5 5 5 4 5 5


Table 18. School of Languages, Cultures, and World Affairs Assessment Rubrics

School of Languages, Cultures, and World Affairs

(N=24 Academic Programs)

2015-2016 2016-2017 2017-2018

Plans Results Plans Results Plans Results

School of Languages, Cultures, and World Affairs 5 4 5 4 4 4

African American Studies BA 4 2 4 4 4 2

African Studies Minor 4 3 4 Missing 5 4

Archaeology BA 4 5 Missing 4 4 5

Asian Studies Minor 4 2 4 Missing 3 4

British Studies Minor 4 2 4 1 4 Missing

Business Language in French Minor 4 2 4 Missing 4 4

Business Language in Spanish Minor 4 3 4 5 5 3

Classics BA 4 5 5 5 5 5

Comparative Literature Minor 4 2 4 3 2 3

European Studies Minor 2 4 4 5 2 3

Foreign Language Education Cognate BS 4 4 4 5 5 4

French and Francophone BA 4 5 5 5 5 5

German BA 4 5 4 4 4 5

German Studies Minor 4 5 4 Missing 4 4

International Studies BA 4 3 4 3 2 4

Italian Studies Minor 4 3 4 3 4 3

Japanese Studies Minor 4 2 4 Missing 4 3

Jewish Studies BA 4 3 4 5 5 3

Latin American and Caribbean Studies BA 4 3 4 5 4 3

Linguistics Minor 4 5 5 Missing 3 4

Middle East and the Islamic World Minor 4 2 4 1 1 2

Russian Studies Minor 4 5 4 2 2 3

Spanish BA 4 5 4 Missing 5 5


Table 19. School of Sciences and Mathematics Assessment Rubrics

School of Sciences and Mathematics (N=23 Academic Programs)

2015-2016 2016-2017 2017-2018

Plans Results Plans Results Plans Results

School of Sciences and Mathematics 4 5 4 5 4 Missing

Astronomy BA 2 1 2 Missing 4 1

Astrophysics BS 2 1 2 3 4 2

Biochemistry BS 2 5 5 4 4 4

Biology BA 2 3 3 3 4 3

Biology BS 2 3 3 3 4 3

Biomedical Physics Minor 2 3 4 4 3 2

Chemistry BA 2 4 5 3 4 4

Chemistry BS 2 4 5 4 4 4

Computational Thinking Minor* 1 1 1 1 Missing Missing

Computer Information Systems BS 2 3 5 3 2 5

Computer Science BA 2 4 3 5 2 5

Computer Science BS 2 4 1 5 3 5

Computing in the Arts BA 1 3 3 3 3 2

Data Science BS 2 3 3 2 2 3

Geology BA 2 1 2 5 3 4

Geology BS 2 1 2 5 3 4

Marine Biology BS 2 2 3 3 4 3

Mathematics BA 2 3 3 1 1 Missing

Mathematics BS 2 3 Missing 5 4 5

Meteorology BA NA NA 2 Missing 2 1

Meteorology Minor 2 1 NA NA NA NA

Physics BA 2 3 3 1 4 2

Physics BS 2 3 3 3 4 1

* No students enrolled.


Table 20. School of Professional Studies Assessment Rubrics

School of Professional Studies (N=3 Academic Programs)

2015-2016 2016-2017 2017-2018

Plans Results Plans Results Plans Results

School of Professional Studies Missing Missing Missing Missing 2 Missing

Professional Studies BPS Missing Missing Missing Missing 2 Missing

Project Management Certificate NA NA NA NA Missing Missing

Table 21. Interdisciplinary Programs Assessment Rubrics

Interdisciplinary Programs (N=5 Academic Programs)

2015-2016 2016-2017 2017-2018

Plans Results Plans Results Plans Results

Crime, Law, and Society Minor 5 5 5 5 5 5

Environmental Studies Minor 5 5 5 4 4 5

Film Studies Minor 5 5 5 5 5 5

Neuroscience Minor 2 3 4 3 4 2

Southern Studies NA NA NA NA 5 5

Table 22. Honors College Assessment Rubrics

Honors College (N=1 Academic Unit)

2015-2016 2016-2017 2017-2018

Plans Results Plans Results Plans Results

Honors College 1 5 5 5 5 5

Table 23. General Education Assessment Rubrics

General Education (N=1 Academic Unit)

2015-2016 2016-2017 2017-2018

Plans Results Plans Results Plans Results

General Education 5 5 5 5 5 5


Table 24. Graduate School Assessment Rubrics

Graduate School (N=33 Academic Programs)

2015-2016 2016-2017 2017-2018

Plans Results Plans Results Plans Results

Graduate School Missing 4 4 3 5 4

Accountancy MS 4 4 5 1 2 5

Arts Management Certificate 1 2 3 3 3 Missing

Business Administration MBA 2 1 5 5 5 5

Child Life MS 1 3 Missing Missing 2 Missing

Communication MA 1 5 2 5 5 Missing

Community Planning, Policy, and Design MA NA NA NA NA Missing Missing

Computer and Information Sciences MS 2 3 2 1 1 Missing

Creative Writing MFA NA NA 2 1 2 Missing

Cybersecurity Certificate 1 1 2 1 1 Missing

Early Childhood Education MAT 2 4 4 2 2 5

Elementary Education MAT 1 2 4 4 3 3

English MA 2 3 2 3 4 Missing

Environmental Studies MS 1 1 4 2 3 Missing

ESOL 1 Certificate 1 1 3 2 2 3

ESOL 2 Certificate Missing Missing NA NA NA NA

Gifted and Talented Education Certificate 1 2 2 2 1 2

Historic Preservation MS 1 1 1 1 1 Missing

History MA 1 2 2 3 Missing Missing

Information Systems Certificate NA NA NA NA Missing Missing

Languages MEd 4 5 3 5 4 5

Marine Biology MS 4 3 4 4 4 Missing

Mathematics MS 4 2 4 3 4 Missing

Middle Grades MAT 1 2 4 3 Missing 4

Operations Research Certificate 4 1 4 3 4 Missing

Performing Arts MAT 2 1 Missing Missing 2 4

Public Administration MPA 5 4 Missing 3 3 Missing

Sciences and Mathematics for Teachers MEd 1 4 3 1 1 Missing

Software Engineering Certificate NA NA NA NA Missing Missing

Special Education Certificate 2 1 4 Missing 2 Missing

Special Education MAT 2 3 4 2 2 2


Statistics Certificate 2 1 4 3 4 Missing

Teaching, Learning, and Advocacy MEd 2 4 Missing 4 4 Missing

Urban and Regional Planning Certificate 1 2 3 3 4 Missing

Table 25. Division of Academic Affairs Assessment Rubrics

Division of Academic Affairs (N=18 Administrative Units)

2015-2016 2016-2017 2017-2018

Plans Results Plans Results Plans Results

Academic Advising and Planning 4 3 4 Missing 5 4

Academic Experience 4 3 4 4 3 Missing

Center for Academic Performance and Persistence 4 5 4 5 4 3

Center for Excellence in Peer Education 4 3 4 5 3 3

Center for International Education 2 2 2 Missing Missing Missing

Center for Student Learning 4 3 4 3 3 3

Educational Programs and Services 4 3 4 Missing NA NA

First-Year Experience 4 3 1 5 3 3

Library 3 3 1 Missing Missing Missing

New Student Programs 4 3 4 3 3 3

Office of Research and Grants Administration 1 2 1 Missing Missing Missing

Provost’s Office Missing Missing Missing Missing Missing Missing

Quality Enhancement Plan NA NA NA NA Missing Missing

REACH 4 3 4 3 4 1

Registrar 1 3 4 Missing Missing Missing

ROAR 3 1 NA NA NA NA

Summer Sessions 1 1 4 Missing Missing Missing

Sustainability Literacy Institute NA NA NA NA Missing Missing

Undergraduate Research and Creative Activities 5 5 4 Missing Missing Missing

Veteran and Military Student Services 3 1 3 2 3 3


Table 26. Division of Business Affairs Assessment Rubrics

Division of Business Affairs (N=7 Administrative Units)

2015-2016 2016-2017 2017-2018

Plans Results Plans Results Plans Results

Business and Auxiliary Services 4 3 5 5 5 Missing

Fiscal Services 2 3 5 3 4 Missing

Governmental Relations 2 3 4 1 Missing Missing

Human Resources 4 3 5 4 4 Missing

Internal Auditor 4 3 5 3 4 Missing

Office of Sustainability 2 3 2 1 1 Missing

Procurement and Supply Services 1 3 4 4 4 Missing

Table 27. Division of Enrollment Planning Assessment Rubrics

Division of Enrollment Planning (N=2 Administrative Units)

2015-2016 2016-2017 2017-2018

Plans Results Plans Results Plans Results

Admissions 1 3 4 4 4 4

Financial Assistance and Veterans Affairs 3 2 3 3 Missing 3

Table 28. Division of Facilities Management Assessment Rubrics

Division of Facilities Management (N=3 Administrative Units)

2015-2016 2016-2017 2017-2018

Plans Results Plans Results Plans Results

Environmental Health and Safety 2 3 4 1 Missing Missing

Facilities Operations 4 3 4 Missing 4 Missing

Facilities Planning, Architecture, and Engineering 4 3 5 Missing Missing Missing

Table 29. Division of Information Technology Assessment Rubrics

Division of Information Technology (N=2 Administrative Units)

2015-2016 2016-2017 2017-2018

Plans Results Plans Results Plans Results

Information Technology 2 2 5 3 Missing Missing

Teaching, Learning, and Technology 2 3 5 5 5 5


Table 30. Division of Institutional Advancement Assessment Rubrics

Division of Institutional Advancement (N=1 Administrative Unit)

2015-2016 2016-2017 2017-2018

Plans Results Plans Results Plans Results

Institutional Advancement 4 4 4 4 4 Missing

Table 31. Division of Marketing and Communications Assessment Rubrics

Division of Marketing and Communications

(N=1 Administrative Unit)

2015-2016 2016-2017 2017-2018

Plans Results Plans Results Plans Results

Marketing and Communication 4 3 5 4 5 Missing

Table 32. Division of the President Assessment Rubrics

Division of the President (N=11 Administrative Units)

2015-2016 2016-2017 2017-2018

Plans Results Plans Results Plans Results

Athletics Department 4 4 4 4 Missing Missing

Economic Development 4 3 NA NA NA NA

Community Relations 4 3 4 4 4 Missing

Fire and EMS Missing 3 4 3 Missing Missing

Institutional Diversity 4 4 4 4 4 Missing

Institutional Effectiveness and Strategic Planning 4 4 4 Missing Missing Missing

Institutional Events 4 3 4 4 NA NA

Institutional Research and Planning 4 5 4 4 4 Missing

Legal Affairs 4 3 4 4 4 Missing

Ombudsperson 4 3 4 4 Missing Missing

President’s Office 4 3 Missing 4 Missing 5

Public Safety 4 3 4 4 Missing Missing

Special Projects and the Board of Trustees NA NA NA NA Missing Missing


Table 33. Division of Student Affairs Assessment Rubrics

Division of Student Affairs (N=14 Administrative Units)

2015-2016 2016-2017 2017-2018

Plans Results Plans Results Plans Results

Campus Recreation Services 3 3 5 5 1 Missing

Career Center 4 3 4 4 1 Missing

Center for Civic Engagement 3 3 4 5 4 5

Center for Disability Services 2 4 5 Missing 2 5

Collegiate Recovery Program NA NA NA NA 2 Missing

Counseling and Substance Abuse Services 3 4 2 3 2 Missing

Dean of Students 1 5 5 4 4 5

Fraternity and Sorority Life 1 3 2 1 NA NA

Higdon Student Leadership Center 4 2 5 5 1 Missing

Multicultural Student Programs and Services 4 3 5 4 1 Missing

Residence Life and Housing 4 3 5 4 2 Missing

Student Health Services 3 3 4 5 2 5

Student Life 4 4 3 4 3 5

Technical Support 3 3 4 Missing NA NA

Upward Bound 3 3 4 4 2 Missing

Victim Services 3 3 3 1 1 Missing


Table 34. Centers, Institutes, Offices, and Programs within Academic Schools Assessment Rubrics

Centers, Institutes, Offices, and Programs

(N=26 Administrative Units)

2015-2016 2016-2017 2017-2018

Plans Results Plans Results Plans Results

A Talent Development USDE 1 3 3 Missing 2 2

Afterschool and Summer Learning Resources Center NA NA 3 4 3 4

Art Attack 1 1 4 3 2 3

Autism Project 2 2 4 4 3 3

Call Me MISTER 1 4 3 2 2 2

Carter Real Estate Center 3 3 4 2 2 5

Center for Coastal Environmental and Human Health NA NA NA NA 2 3

Center for Continuing and Professional Education Missing Missing Missing Missing Missing Missing

Center for Entrepreneurship 2 2 2 3 4 5

Center for Partnerships to Improve Education 2 3 3 5 3 5

Center for Public Choice and Market Process 2 3 4 4 4 4

FitCatz 1 1 2 1 3 5

Global Business Resource Center 2 2 2 1 2 2

Grice Marine Laboratory NA NA 2 1 2 5

Halsey Institute of Contemporary Art 2 3 4 1 1 Missing

Lowcountry Graduate Center Missing Missing Missing Missing Missing Missing

N.E. Miles Early Childhood Development Center 2 4 4 4 2 5

Office of Economic Analysis 1 2 2 4 2 4

Office of Professional Development in Education 3 3 4 2 3 4

Office of Student Services and Credentialing 1 2 4 4 2 3

Office of Tourism Analysis 2 3 4 4 4 4

Riley Center for Livable Communities 5 5 5 5 5 5

SCDOE grant 1 1 2 2 Missing 2

Student Success Center 2 4 4 3 4 4

Teacher Leader Program 1 3 4 Missing Missing 2

Teaching Fellows 1 2 3 1 3 1

Tech Fit 1 1 1 4 NA NA


Appendix B

College of Charleston Institutional Effectiveness Assessment Plan Rubric

Academic/Administrative Unit: ____________________________
Rubric Completed By: _____________________
Date Completed: _____________
Reviewed with Assessment Coordinator (Initial/Date): _________________
Rating: _____________

Instructions: Please review the assessment plan and check which indicators are met, as well as provide necessary comments. Based on the number of indicators met, please identify the rating (Level) of the assessment plan and indicate it above.

Levels Indicators Comments

Establishing (Level 1)

Three or fewer indicators from the Developing category are met.

Emerging (Level 2)

Five or fewer indicators from the Developing category are met.

Developing (Level 3) ALL of the Developing indicators (#1-6) are met.

1. Program/Unit's mission statement concisely defines the purpose, functions, and key constituents. [The Program/Unit's mission is aligned to the School/Division/College Strategic Plan.]

2. The assessment process describes:
• Strategies to assess the outcomes. A strategy is a plan of action intended to accomplish a specific outcome/measure.
• A plan to use the data for improving student learning and/or operations.
• How the data will be shared within the Program/Unit and the College.
[The assessment process describes how evidence-based decision making leads to improvement for the Program/Unit and how the plan evolves over time. The assessment process description should present a clear understanding of how the Program/Unit utilizes assessment data for continuous quality improvement.]

3. Number of outcomes:
• Administrative Units - minimum of three outcomes.
• Academic Programs (undergraduate, graduate, stand-alone minors, certificates) - minimum of three student learning outcomes.
[The outcomes are specific, measurable, attainable, results oriented, and time bound. The outcomes are clearly related to the mission and focus on activities of the Program/Unit.]

4. Measures and Performance Targets: a minimum of two appropriate, quantitative measures, with at least one being a direct measure, per outcome. Measures for the outcomes define specific performance targets and strategies to achieve the targets. [The measure matches the outcome, uses appropriate direct and indirect methods, indicates the desired level of performance, helps identify what to improve, and is based on tested, known methods. The performance target is meaningful, based on existing benchmarks, previous results, and existing standards. Grades and/or GPA should not be used as measures.]

5. The assessment plan directly links outcomes to the School/Division/College Strategic Plan.

6. Relevant assessment instruments (e.g., rubrics, survey instruments, logs, reports, etc.) are uploaded in Compliance Assist (e.g., via URL, as attachments, etc., if not proprietary). [If an instrument is proprietary, please state so in the report.]

Proficient (Level 4) ALL of the Developing indicators plus at least one of the Proficient Indicators (#7 & 8) are met.

7. Clearly defined curriculum or functional map is provided. [Courses/functions are listed and linked to outcomes. Clear levels of learning are defined for all outcomes at all levels (Introduce, Enhance, Reinforce).]

8. The assessment plan promotes continuous quality improvement by having outcomes and measures that are formative in nature. Formative assessments provide ongoing feedback that can be used to improve student learning and operations. [The primary purpose of IE assessment is to collect data to identify gaps in student learning and operations. This is demonstrated when assessment data presents an opportunity for improvement and a new strategy is implemented to remove the gap. For best practices, when a measure has a performance target of 100%, or is constant for 2-3 assessment cycles, it is advisable to conduct a granular (disaggregated) analysis to identify gaps in learning and/or operations.]

Exemplary (Level 5) ALL nine indicators are met.

9. The assessment plan “closes the loop” by linking new strategies (changes) to previous assessment results. [Program/Unit collects data to evaluate the impact of an implemented change to improve student learning and operations. The use of prior year’s results to improve student learning and operations demonstrates a “closed loop” process.]


College of Charleston Institutional Effectiveness Assessment Results Rubric

Academic/Administrative Unit: ____________________________
Rubric Completed By: ______________________
Date Completed: _____________
Reviewed with Assessment Coordinator (Initial/Date): _______________
Rating: _____________

Instructions: Please review the assessment results and check which indicators are met, as well as provide necessary comments. Based on the number of indicators met, please identify the rating (Level) of the assessment results and indicate it above.

Levels Indicators Comments

Establishing (Level 1)

Three or fewer indicators from the Developing category are met.

Emerging (Level 2)

Five or fewer indicators from the Developing category are met.

Developing (Level 3)

ALL of the Developing indicators (#1-6) are met.

1. Complete, aggregated, and relevant data are provided for each measure. [If there are extenuating circumstances that lead to missing data, an explanation must be provided. Missing data for extenuating circumstances is only permitted for one assessment cycle. If appropriate, data should be disaggregated by distance learning, off-site locations, and mode of delivery.]

2. Data reporting is complete, concise, and well-presented. [Reported data are aligned and appropriate to the outcome and the corresponding measure. Sampling methodology, population size (N), and sample size (n) must be provided.]

3. Results clearly specify whether the performance target (performance expectations) for each measure has been met. [Assessment results are used for comparison of actual vs. expected performance targets. Data provide evidence of performance targets met, partially met, or not met.]

4. Results provide evidence that the assessment findings informed discussion and improvements in the Program/Unit.

5. Results include at least one applied and/or planned change based on the assessment data to improve student learning, program quality, or unit operations. If no changes are provided, results should identify an area of improvement for the next cycle. [The discussion of the results should specifically identify any curricular/operational/budget changes made as a result of assessment.]

6. Relevant assessment instruments (e.g., rubrics, survey instruments, etc.) are uploaded in Compliance Assist (e.g., via URL, as attachments, etc., if not proprietary).

Proficient (Level 4)

ALL of the Developing indicators plus indicator #7 are met.

7. The assessment report demonstrates how data analysis “closes the loop” by assessing the impact of applied changes. [Current year's results are compared to the previous year's results to evaluate the impact of a previously reported change to demonstrate use of results to improve student learning and operations.]

Exemplary (Level 5)

ALL eight indicators are met.

8. The impact of “closing the loop” with an improvement is demonstrated by analyzing follow-up data. [Examples of improvement(s) in student learning, program quality, or unit operations are provided and are directly linked to assessment data. The primary purpose of IE assessment is to assess the impact of an implemented change.]