15-16 QEP Assessment Report Final - East Carolina University



QEP Assessment Report 2015-16

1. Introduction

Academic year 2015-16 was the third full year of QEP implementation. Following a summary of QEP initiative-related activities (Section 2), the report provides descriptions of formative and summative assessments (direct and indirect) undertaken within each of the three main QEP initiative areas: Curriculum Enhancement, Student Support, and Faculty Support.

Within each initiative-area section of the report, actions taken as a result of last year’s QEP assessments (see QEP Assessment Report 2014-15) are reviewed and their impact, as determined through assessments this year, is discussed.

Also noted within each initiative-area section of the report are areas for improvement, as revealed through 2015-16 assessments, and actions planned in response to the discovery of these areas for improvement. The report concludes with a summary of trends regarding the impact of the QEP based on assessments completed to date and an overview of actions and assessments planned for the 2016-17 year, year four of the QEP.

2. Summary of Major QEP Activities

This year's activities in each of the three primary QEP initiative areas—Curriculum Enhancement, Student Support, and Faculty Support—are briefly discussed below.

2.1 Summary of Curriculum Enhancement Activities

2.1.1 Full implementation of ENGL 2201: The 2015-16 academic year saw the full implementation of ENGL 2201: Writing About the Disciplines. Students who entered as first-year students in the fall of 2014 took ENGL 1100 that year and then took ENGL 2201 in 2015-16. In addition to offering “Multi-disciplinary” sections of 2201, several discipline-themed sections (Writing About the Health Sciences; Writing About the Arts and Humanities; Writing About Engineering and Technology; etc.) were offered each semester. Assessment of the implementation is included below in section 5.2.

2.1.2 Tenured/Tenure-track Faculty 2201 Orientation: At the conclusion of the spring 2016 semester, seven tenured and tenure-track faculty from the Department of English participated in an orientation session designed to introduce them to the ENGL 2201 curriculum. As part of this orientation, the Director of Writing Foundations provided participants with an overview of the goals and course outcomes for the new course, explained how the course differs from the previous course, ENGL 1200, and distributed a sample syllabus and textbooks that several instructors and GTAs teaching the course in the fall had chosen to adopt.

2.1.3 University Writing Portfolio (UWPort) Implementation: In fall 2014, the University Writing Portfolio became a requirement for all WI courses. In an effort to ensure the success of the implementation, the QEP Director, with help from two graduate assistants, offered several faculty orientation sessions in both the fall and spring semesters. Dedicated UWPort graduate assistants were available for 20 hours per week to help students with the uploading process in the University Writing Center, and all writing center consultants were familiarized with the upload process so that they were also able to assist students. Video and PDF instructions were available for all steps of the UWPort process for both students and faculty on the QEP website. To facilitate access to these materials, University Writing Portfolio bookmarks, complete with contact information, a web address, and a QR code, were provided to any instructor who wanted to distribute them to students. Submission rates were also compiled for the fall 2015 semester by a QEP graduate assistant and for spring 2016 by the Web Coordinator. Results of those compilations are included below in 5.1.2.

2.2 Summary of Student Support Activities


2.2.1 Writing Mentors: Writing Mentors are writing consultants who work exclusively with students in particular WI courses. Because of graduations and low enrollment (only six students) in ENGL 3875: Peer Tutoring, the spring 2015 course required of undergraduates who wish to serve as writing mentors, the number of Writing Mentors was considerably lower than in 2014-15.

In fall 2015, four Writing Mentors worked in WI classes in the following programs:

• Health Services and Information Management (College of Allied Health)
• Spanish (Harriot College of Arts and Sciences)
• Hospitality Leadership (College of Business)
• Psychology (Harriot College of Arts and Sciences)

In spring 2016, seven Writing Mentors worked in WI classes in the following programs:

• Communication Sciences and Disorders (College of Allied Health)
• Theater and Dance (College of Fine Arts and Communication)
• Political Science (Harriot College of Arts and Sciences)
• Interior Design and Merchandising (College of Health and Human Performance)
• Art (College of Fine Arts and Communication)
• Nursing (College of Nursing)
• Criminal Justice (Harriot College of Arts and Sciences)

2.2.2 University Writing Center Expansion: The third year of the QEP saw further growth for the UWC, with the number of appointments again hitting an all-time high. Information about use of the UWC services/staff, as well as information about student satisfaction with those services, can be found later in this report.

2.2.3 Writing@ECU Website: A new Web Coordinator was hired in fall 2015 after the previous coordinator left to return to graduate school full time. In her time in the position, the current Web Coordinator has, among other things, worked with professionals from ITCS to redesign the website; begun reorganizing the resource materials to make them more accessible to site visitors; conducted user-testing sessions on the website with both faculty (Writing Liaisons) and students (Writing Center Consultants); updated staff and writing center information on the site; added a calendar and a “featured resources” section; developed a new primary navigation bar; started the process of making all pages ADA compliant; gathered and reviewed analytics on the site; and led a process of selecting a new content management system (CMS) that will enable the site to be easily organized, searched, and updated while also integrating more effectively with the resources available through Academic Library Services.

2.3 Summary of Faculty Support Activities

2.3.1 Writing Liaisons: Three times each semester, the QEP Writing Liaisons—faculty from undergraduate programs across campus—met with the QEP Director and other writing program administrators (Director of Writing Foundations, Director of the University Writing Program, Director of the University Writing Center). Meetings for the 2015-16 academic year focused on the following:

• Updating Liaisons on various ongoing QEP initiatives and securing their participation in several of those activities (particularly the Writing and the Metacognition Workshop Series, Writing Mentors Program, and the Writing and Learning Communities);

• Gathering feedback on multiple texts, including the 2014-15 QEP Assessment Report and the latest version of the Writing@ECU website (Liaisons participated in user-testing of the site, as detailed below).

• Sharing information about enrollment trends and content of discipline-themed ENGL 2201 sections. Two meetings in the spring 2016 semester provided a space for instructors of


different discipline-themed sections to share syllabi and assignments with Writing Liaisons in order to raise awareness of these sections with the hope of increasing enrollment rates among students whose intended majors are best suited to the sections’ themes.

• Discussing best practices in and scholarship on using student peer review in writing-intensive courses. The January 2016 meetings aimed to help faculty incorporate the practice of peer review into their courses and advance student achievement in QEP SLO 3 (Demonstrate that they understand writing as a process that can be made more effective through drafting and revision).

2.3.2 Metacognition and Writing Workshop Series: In both fall 2015 and spring 2016, the QEP Director collaborated with the Assistant Director of the University Writing Program and the Office for Faculty Excellence to offer a three-part workshop series on integrating metacognitive activities into writing-intensive courses. This professional development initiative aims to increase instructors' ability to promote QEP SLO #5: students will be able to assess and explain the major choices that they make in their writing. This academic year, 17 faculty members participated in the workshop series.

2.3.3 Writing and Learning Communities (WLCs): In fall of 2015, two groups of 5-7 faculty volunteers formed two Writing and Learning Communities. One WLC investigated how students select and use secondary sources in their writing, developed assignments/activities to help students improve in these areas, and shared their findings and ideas in a workshop for faculty across the university. The second WLC, composed of faculty from the College of Nursing, worked to develop a language to talk about writing with their students within the RN-BSN curriculum.

2.3.4 Summer Writing Across the Curriculum (WAC) Academy: During the first summer session of 2016, a group of 5 faculty members (one each from English, Music, Social Work, Computer Science, and Foreign Languages and Literatures) participated in a weeklong, intensive study of the transfer of writing skills and knowledge within and across disciplinary contexts. Each faculty participant created a project to help other teachers promote active learning transfer. These projects will be made available to other faculty at ECU and beyond on the Writing@ECU website.

2.3.5 Biennial Eastern NC Writing Symposium: In August of 2015, the QEP initiative—the “Biennial K-12/Community College Writing Symposium”— was held on ECU’s campus for the first time. In developing marketing materials for the event, the QEP Director and the Director of Writing Foundations recognized that the name proposed for the event in the QEP was quite cumbersome; thus, the name was shortened to the Biennial Eastern NC Writing Symposium. The event brought together educators from K-12, community college, and university sectors to talk about the teaching of writing, with a focus on introducing students to writing across the curriculum (WAC). To better ensure the engagement of these three sectors, three keynote speakers—one each from K-12, community college, and university sectors—addressed participants and helped to guide two breakout sessions. The first of these breakout sessions mixed participants from the different sectors and asked participants to consider similarities and differences between writing expectations as articulated in the Council of Writing Program Administrators’ “Framework for Success in Postsecondary Writing” and in the Common Core’s “College and Career Readiness Anchor Standards for Writing.” The second breakout session involved small groups of instructors from similar institutions developing WAC-related activities and assignments that they might use in their courses.

3. Direct Assessment of Student Writing

Baseline assessment of pre-QEP writing samples—writing samples collected from major-area writing-intensive courses prior to fall 2013—began in the summer of 2013 and was completed in the summer of 2016. These assessment results serve as "baseline" data against which we will compare scores from student writing samples completed after the full implementation of the QEP. Three programs offering WI courses were assessed in this final summer of baseline assessment: French, Physics, and Math.


Assessment procedures and scores are discussed in this section. In keeping with previous baseline results reporting, scores are reported by college. Also in keeping with previous reporting practices, individual reports were provided separately to the departments of Foreign Languages and Literatures (for French), Physics, and Math. When providing these reports to department leaders, the QEP Director and the Director of the University Writing Program offered to meet with departmental faculty to discuss additional, program-specific actions that might be taken beyond the actions taken through the QEP to improve student writing among majors.

3.1 Baseline Assessment of Writing Samples from WI Courses

During summer 2016, writing samples submitted by students or faculty from Math, Physics, and French were assessed using a rubric derived from the QEP SLOs. Those SLOs are as follows:

At the conclusion of their undergraduate degree programs, ECU graduates will be able to

SLO 1. Use writing to investigate complex, relevant topics and address significant questions through engagement with and effective use of credible sources.

SLO 2. Produce writing that reflects an awareness of context, purpose, and audience, particularly within the written genres (including genres that integrate writing with visuals, audio or other multimodal components) of their major disciplines and/or career fields.

SLO 3. Demonstrate that they understand writing as a process that can be made more effective through drafting and revision.

SLO 4. Proofread and edit their own writing, avoiding grammatical and mechanical errors.

SLO 5. Assess and explain the major choices that they make in their writing.

A copy of the rubric derived from these SLOs and used in the scoring of writing samples during the summer of 2016 is available in Appendix A.

The Process

Assessment for French and Math samples began with norming sessions, followed by scoring of the samples themselves. Each sample was read and scored by two readers. If their scores differed by more than one point in any of the six outcome areas, the scorers discussed that outcome and adjusted their scores to within one point of each other.

Samples from the Department of Physics were read and scored by a faculty member with over 10 years of experience teaching WI courses in the Physics program, after a meeting with the QEP Director to review the rubric and discuss the scoring criteria.

Scoring Results

Results—by college—are included in table form below. The results table includes the percentage of scores at 3 or higher and the percentage of scores at 1 because the criterion for success for post-QEP implementation, as articulated in the QEP document approved by SACS, stipulates that "80 percent of scores on a four-point scale will be a 3 or 4 and no more than 5 percent will be at 1."
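Stated as a computation, the criterion amounts to two percentage checks over a set of rubric scores. The following Python sketch is illustrative only; the function name and the example scores are hypothetical, not data from this assessment:

```python
def meets_qep_criterion(scores):
    """Apply the SACS-approved success criterion to a list of rubric
    scores on a four-point scale: at least 80 percent of scores must
    be 3 or 4, and no more than 5 percent may be at 1."""
    n = len(scores)
    pct_3_or_4 = sum(1 for s in scores if s in (3, 4)) / n
    pct_at_1 = sum(1 for s in scores if s == 1) / n
    return pct_3_or_4 >= 0.80 and pct_at_1 <= 0.05

# Hypothetical example: 8 of 10 scores are 3 or 4, and none are 1.
print(meets_qep_criterion([3, 4, 3, 3, 4, 2, 3, 4, 3, 4]))  # True
```

Both conditions must hold at once, so a program could clear the 80-percent bar and still fall short if more than 5 percent of its scores sit at 1.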

Individual programs will receive separate results for their programs. Departments will have the option to have the Director of the University Writing Program and/or the QEP Director meet with faculty to discuss results and to review QEP initiatives that are now in place to help raise scores.

Limitations

In addition to the low number of scorers involved in this year’s baseline work, other limitations to the assessment include the following:

• The number of samples assessed varied across colleges and programs. Ideally, the number of writing samples included from each program would be proportional to the number of graduates from that program; however, this was not possible for several reasons. First, smaller programs and programs with only 1 or 2 WI courses could not provide as many writing samples as programs with multiple WI courses. Second, faculty and students in certain programs were not as consistent in responding to requests for writing samples. Third, samples submitted varied considerably in length. Finally, assessments had to be completed for multiple programs in the limited time frame of a summer session.

• Because a number of the samples were gathered during semesters prior to the full development of the QEP, they did not all include writing self-analysis documents or assignment descriptions. At the time when these samples were gathered, SLOs 3 and 5, along with the writing self-analysis questions and the list of items to be submitted for assessment (the writing project, the writing self-analysis, and the assignment description), were not finalized and thus were either not integrated into the sample collection process or were integrated in "draft" form (instructions for students were not worded as effectively as they could have been, for example). As a result, it is difficult to draw conclusions about students' abilities in the areas addressed in SLOs 3 and 5 in some of the programs assessed. Additionally, the occasional absence of assignment descriptions created difficulty in assessing SLO 2, particularly with respect to "context, purpose, and audience."

3.1.1 Baseline Results by College

3.1.1.1 Thomas Harriot College of Arts and Sciences (partial)

The assessment included 45 writing samples randomly selected from WI courses in French, Math, and Physics.

SLO       Mean   Standard Deviation   Percent ≥ 3   Percent @ 1
SLO 1a    2.7    0.80                 53.3%         6.7%
SLO 1b*   2.3    1.10                 40.0%         30.0%
SLO 2     2.6    0.81                 42.2%         8.9%
SLO 3**   2.3    0.67                 29.2%         4.2%
SLO 4     2.6    0.76                 48.9%         4.4%
SLO 5**   2.4    0.74                 33.3%         4.2%

* 35 of the samples did not require the use of outside/secondary sources.
** 21 of the samples did not include writing self-analyses.

Table 1: Harriot College Results from Departments Assessed in Summer 2016

3.1.2 Summary and Interpretation of Assessment Data

The lowest percentages of students scoring a “3” or higher are on SLO 3 and SLO 5. Our hope is that the continuation of the Metacognition and Writing Workshop Series, along with implementation of the University Writing Portfolio requirement that asks students to think and write about their writing choices in every WI course that they take, will help improve these scores.

3.1.3 Assessment of the WI Assessment Process—Internal Assessors’ Questionnaire

As in past baseline assessments, scorers were asked to respond to questions upon completion of the assessment process. Due to the structure this year, in which scorers only reviewed samples from their own disciplines, only three questions were asked:

1. What difference, if any, did you notice between how you read student writing when you are “grading” a piece of writing and when you are doing “assessment”?

2. What, if anything, did you learn from this experience that might impact your own teaching of writing in the future?


3. Is there anything else that you would like us to know for the next time we do this project?

As in previous baseline assessment feedback, responses to these questions reveal how beneficial the process of participating in the assessment can be for faculty. A sampling of comments from their responses to questions 2 and 3 appears below:

“The aspect of the assessment that I found to be most interesting was the self-evaluation part of each writing sample. It was quite fascinating to read about the process the students used to create the work and to compare it with the sample itself… As is to be expected, those students who thought that drafting and editing are important did a better job than those who just quickly put the assignment together.”

“I found the student analyses interesting. They confirmed a suspicion I’ve had: our students confuse writing in a foreign language (language proficiency, grammar, vocabulary, etc.) with the writing process. I will keep this in mind when preparing future classes and activities.”

“It was really interesting to see/read other kinds of writing assignments than the ones which I assign, which are either close readings or research papers. I was impressed with the quality of “personal” writing assignments where students were able to express their feelings. [I will] think about incorporating different writing genres in my classes. It was also a good reminder about having a good, detailed and organized rubric.”

3.1.4 Assessment of the WI Assessment Process—External Scoring of Baseline Samples

In an effort to ensure the reliability of internal assessment scores, a random sampling of 10% of internally assessed baseline WI samples from summer 2013, 2014, and 2015 was distributed, along with the QEP rubric, to faculty experts at other institutions (UNC-Charlotte; NCSU; Appalachian State; University of Maryland). These faculty experts had at least five years of experience teaching writing at the college level and had participated in Writing Across the Curriculum programs at their universities. Each assessor received a stipend of $800 for the work.

Prior to scoring, each group of external assessors participated in an online video meeting for norming purposes. Scoring by external readers followed a process similar to that used for internal assessment: each writing sample was read by two different external scorers. If scores on any SLO for a given sample differed by more than one point, the sample was scored by a third external reader for that SLO.
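The adjudication trigger described above (a third reading whenever two external scores on an SLO differ by more than one point) can be sketched as a small helper. This is a hypothetical illustration in Python, not part of the actual scoring apparatus:

```python
def needs_third_reader(score_a, score_b):
    """Return True when two external readers' scores for a sample on a
    given SLO differ by more than one point, in which case the sample
    is sent to a third external reader for that SLO."""
    return abs(score_a - score_b) > 1

# Scores of 2 and 4 differ by two points, triggering a third reading;
# scores of 3 and 4 are within one point and stand as assigned.
print(needs_third_reader(2, 4))  # True
print(needs_third_reader(3, 4))  # False
```

Note that the source process specifies only when a third reading occurs, not how the three scores are then combined, so that step is deliberately left out of the sketch.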

The scores of external assessors differed from those of internal scorers as summarized in the tables below:

External Scores Relative to Internal Scores                  # of instances (out of 673 SLO scores)   % of total scored SLOs
External score 1 point or more ABOVE internal score for SLO  170                                      25%
External score 1 point or more BELOW internal score for SLO  77                                       11%
External score within .5 of internal score for SLO           425                                      63%

Table 2: Rates of score differences between internal and external assessors

The fact that 25% of SLO scores from the external assessors were 1 or more points above the scores determined by internal assessors for the same writing samples suggests that some internal scorers may have assessed several samples too critically, perhaps responding to prior experiences with student writers (frustration with students’ struggles in courses taught, for example) or to confusion over what distinguishes “good” (3) from “fair” (2) and “good” from “excellent” (4). In response to these differences, future norming sessions for internal assessors will focus particular attention on samples that fall into these “middle” ranges on different SLOs and on strategies for applying rubric criteria evenly, regardless of prior experiences with student writers. Of particular importance will be attention to SLO 1A: “Writer uses writing to investigate complex, relevant topics and address significant questions.” As the table below shows, the greatest number of external scores “1 or more points ABOVE internal score” occurred with this SLO:

                                     SLO 1A   SLO 1B   SLO 3   SLO 4   SLO 5   Total
Number of external scores 1 point
or more ABOVE internal score for SLO   48       31       34      32      25     170

Table 3: Rates of external scores one point or more above internal scores by SLO

3.2 Assessment of Writing Samples from ENGL 2201

In fall 2015, assessment of the Writing Foundations courses, including ENGL 2201, became course-embedded, meaning that instructors were responsible for using the ENGL 2201 rubric to score writing by students in their classes. In support of the transition to course-embedded assessment, workshops were offered over fall 2015 and spring 2016 for all Writing Foundations instructors. Many of these workshops were calibration sessions that addressed all the SLOs of ENGL 2201.

In reviewing results below, it should be kept in mind that, because the Writing Foundations courses have their own course objectives (objectives that are necessarily more detailed than the 5 QEP SLOs), the rubric used to score the samples differed from that used for other WI courses, but many parallels exist between the two rubrics. It is also important to note that the 2201 rubric uses a five-point scale (the QEP/WI rubric uses a 4-point scale). On the 2201 rubric (see Appendix B), 1=insufficient, 2=developing, 3=adequate, 4=very good, and 5=excellent.

3.2.1 ENGL 2201 Assessment Results

During AY 2015-16, instructors of ENGL 2201 conducted course-embedded assessment, scoring student writing samples from their courses on the ENGL 2201 rubric (see Appendix B). The table below provides average scores by SLO on that rubric.

2201 Rubric Outcome Category    Average Score Fall 2015   Average Score Spring 2016
Self-analysis                   3.5                       4.2
Inquiry                         3.4                       4.2
Source Use                      3.2                       4.1
Purpose, Audience, Context      3.4                       4.2
Disciplinary Conventions        3.3                       4.1
Formatting and Citation         3.3                       4.1
Expression and Organization     3.3                       4.3

Table 4: English 2201 Average Scores by SLO


To validate instructors’ assessments, almost a third of the portfolios were also scored by course-external assessors (trained graduate teaching assistants who were not instructors of the students whose portfolios were sampled) in summer 2016 in the selected outcome areas of “Self-analysis,” “Disciplinary Conventions,” and “Inquiry.” Highlights from both course-embedded instructor scoring and course-external assessor scoring in these three areas are included in the table below.

2201 Rubric Outcome Category    Course-embedded Scoring Results          Course-External Scoring Results
Self-analysis                   98% of students scored a 2 or higher.    92% of students scored a 2 or higher.
Disciplinary Conventions        99% of students scored a 2 or higher.    84% of students scored a 2 or higher.
Inquiry                         99% of students scored a 2 or higher.    84% of students scored a 2 or higher.

Table 5: English 2201 Assessment Internal and External Scores Comparisons

3.2.2 Interpretation of Assessment Data and Action Taken

While it appears that students are generally successful in the outcomes of ENGL 2201, with scores in all areas of the course-embedded assessment at 3 (“adequate”) or higher in fall 2015 and increasing to 4 (“very good”) or higher in spring 2016, the differences between course-embedded scores and those assigned by course-external assessors, as reported in Table 5, suggest that instructors may be scoring the work of their own students too high. In response to these concerns and in the interest of ensuring valid assessment results, the Director of Writing Foundations and members of the Composition Committee in the Department of English have taken up assessment scoring responsibilities. Course-embedded assessment will not be part of the assessment plan for ENGL 2201 moving forward.

4. Indirect Assessments of Student Writing: Baseline Results

4.1 Student Focus Groups

In the fall of 2015 and again in early spring 2016, focus group sessions were held with several upper-division ECU students to gather detailed information about their experiences with writing and writing instruction in the pre-ENGL 2201 curriculum. In total, six students participated, one each from the following major areas: Engineering, Psychology, Public Health, Political Science, Biology, and Geography. As is the nature of focus groups, these conversations did not produce generalizable results but did provide useful insights into some of the struggles students faced in completing their writing-intensive courses and learning to write effectively in their major areas prior to QEP implementation. Questions asked during the focus group sessions were designed to align with QEP initiatives and student learning outcomes as follows:

SLO/Initiative

1. ThinkbacktoyourEnglishcompositioncourses(atECU,thatwouldhave

beenENGL1100and1200;ifyoutransferredcreditfromotherschools,the

coursenumberswouldvary).Whatisyoursenseofhoweffectiveor

ineffectivethesecompositioncourseswereinpreparingyouforfuture

Curricularrevision

(ENGL2201)

SLO2

Page 9: 15-16 QEP Assessment Report Final - East Carolina Universityrelated activities (Section 2), the report provides descriptions of formative and summative assessments (direct and

9

writing?Whatdoyouthinkhelpedthemost?Whatdoyoufeelwaseffective

andineffective?Whatwouldhaveimprovedyourlearningorunderstanding

ofwritingingeneralandwritinginyourfield?

2. Drawyourwritingprocess(thinkaboutthelastmajorprojectyouwrote).

Whatquestionswereyoutryingtoanswerinyourproject?Whatwasyour

firstlineofactionasfarasfindingsources?Howdidyoudecidewhatwas

relevanttoyourtopic?Whatstrategiesdoyouusewhenyouareprewriting

orplanning?Howdidyouconductresearch?Howdoyouprogressthrough

yourwritingasyourevisetowardyourfinaldraft?

SLOs1&3

3. Giveeachparticipantthesameparagraphtoedit,thendiscusstheirchoices-

discusswheredidtheydeveloptheserevisionskills-Whathashelpedyouto

learnhowtoreviseandedit?

SLO4

4. Thinkaboutwhatisdistinctoruniqueaboutwritinginyourfield:Doyoufeel

confidenttowriteinyourfieldwhenmovingintogradschoolorcareerafter

ECU?

SLO2

5. Whatadvicewouldyougivetoyourselfaboutwritingeffectivelyifyoucould

gobackandtalkwithyourselfjustbeforeyoustartedyourcourseworkat

ECU?

SLO5

Table6:StudentFocusGroupQuestionsAlignmentwithQEPSLOs

Focus groups were recorded, transcribed, and reviewed by graduate assistants to ascertain common responses and perspectives that emerged during the conversations.

4.1.1 Focus Group Question 1 Responses—ENGL 1100 and 1200

Student responses to question 1 reflected some common sources of frustration. The most commonly mentioned negative aspect of the 1100/1200 experience was that it did not seem to provide a foundation for the citation styles, formatting conventions, and genres of writing used in different disciplines. Some examples of these comments include:

“[I]n my class we had Nursing, Psychology, all these kinds of majors and we were all working on different things and different paths, but we all learned the same style that we may not ever use.”

“It did not prepare me for the APA writing that I would have to do for the next 4 years or so.”

“We were like drilled with MLA in those classes but now it switched to APA.”

“I learned MLA one year, then did APA, and now I’m not using either one of those – [it was] aggravating to use styles not relevant to my field.”

“[I]n Engineering, a lot of the beginning English classes didn’t help us a lot. Like, they are refreshers, but the type of writing…we do a lot of technical writing [but in English 1100 and 1200] we did a lot of essays and stuff which is completely different, so, you get the basics but as writing papers it didn’t help a whole lot.”

“I don’t remember [an] emphasis on technical writing [in 1100 and 1200], and I am never going to write a narrative…it isn’t going to be as relevant as writing to the point, concisely.”

4.1.2 Focus Group Question 2 Responses—Writing Processes

After drawing their writing processes, students discussed what their illustrations reflected in more detail. Commonalities across responses are listed below:


• Research occurs throughout the writing process. Several focus group participants talked about how they returned to the process of finding information as they drafted and revised their work. Research was not a separate step in the writing process, but rather seemed to flow throughout:

“I draft the paper then continue researching on the topic and make edits along the way.”

“I brainstorm possible topics that I find interesting and then I research them a little bit. Then I narrow it down to a few potentials, then I research a little bit more. And then I choose the one that is most interesting or the one that I can write the most about.”

“Me personally, I just go find something I like or whatever and start writing and write as much as I can, with whatever feasible sources I’ve found. But then once I’ve kinda run out of information I’ll continue researching. [I] research and write almost at the same exact time and then review all at once.”

• Disciplinary conventions and expectations are particularly important when students are trying to determine what information should be included in a written project and how that information should be organized:

Public Health Major: “Since it is healthcare … they all want to talk about treatment, prevention, or whatever.”

Engineering Major: "The lab reports is pretty simplistic. You start out with detailing what you are going to do, tell them what you did, or what you're doing, and then what you did; it is like an intro, body, and then conclusion."

Psychology Major: "I organize my paper by topic, usually you have to define what you are talking about, then subtopics, like signs and symptoms. And then [include] details of each subtopic and related topics like comparing to international people."

Biology Major: “There is a typical structure to follow, with abstract, introduction, materials and methods, discussion, conclusion.”

• Students reported using Wikipedia, Google, Google Scholar, and Joyner Library's OneSearch for much of their research, with two participants explaining that they had only very recently started using the library's discipline-focused databases. One senior participant indicated that he wished he had known earlier that the library databases can help in locating and citing disciplinary sources:

“It would have helped if I had learned previously how to use the library website. I found most of my journals on Google, to be honest with you, but there were a lot of resources on the library website apparently and I didn’t know it could build citations for you. I just found that out like a few days ago, after I finished [a major course project]. Knowing this earlier would have made it easier to navigate and find journals.”

4.1.3 Focus Group Prompt 3 Responses—Editing for Grammar and Mechanics

Focus group participants did not seem to struggle with identifying and correcting issues in grammar and mechanics in the brief passage provided to them. When asked how they developed the ability to identify and correct these errors, several attributed it to high school education or to "trial and error," learning from feedback given by instructors or peer reviewers in classes. The importance of feedback from the instructor and from peers to the development of editing abilities was mentioned multiple times by different students. Other students mentioned the value of "just slowing down [to] actually think about what I'm writing more" and the benefits of reading a draft aloud when trying to identify errors.

4.1.4 Focus Group Question 4 Responses—Writing in the Disciplines

Student responses to the fourth question in the focus group reflected that they had developed an awareness of the valued attributes of writing in their major areas. Traits identified with major areas represented in the focus groups are included in the table below:

Major               Unique Features/Valued Attributes
Engineering         Conciseness; technical language; no personal pronouns
Psychology          Accuracy; currency; statistics/graphs/charts
Public Health       Persuasion: convincing people to support a course of action
Biology             Typical structure with abstract, introduction, materials and
                    methods, discussion, conclusion
Geography           Varies depending on area within the field: "it combines the
                    natural sciences with the social sciences, so there is a lot
                    of scientific, technical writing dealing with quantitative
                    data, but it also has human geography, which is more
                    philosophical."
Political Science   Historical writing is important because writing is often
                    based on cases from the past.

Table 7: Focus Group Student Majors and Identified Attributes of Writing in Majors

Further, several responses revealed that these students recognized some of the reasons for certain disciplinary writing features. As one student majoring in Psychology explained, “With Psychology the most important thing is accuracy, because you know, you have psychologists doing research over and over again, and you want to get the most recent research you can find. You don’t want to be putting old news in your paper; it just doesn’t make sense with things that are accurate today.” Another student, a Public Health major, clarified why persuasive writing is so critical to her field: “We are doing a lot of papers … creating programs, health programs, so it’s about why this program is the best, so our papers are more about persuading the audience.” Similarly, the participant majoring in Geography, with a focus on Human Geography, explained why writing in his disciplinary specialty cannot rely just on quantitative evidence: “I’ve had to do a lot of readings that are focused on adaptation, like what makes people vulnerable-- social factors and economic factors--and those are things that you can’t describe with numbers. [You need to] be able to elaborate and explain an issue in detail.”

4.1.5 Focus Group Question 5 Responses—Reflection and Advice

The final question for focus group participants asked them to consider what advice they would give to themselves as first-year students about succeeding in college writing. The question was intended in part to gauge students' awareness of their successes and struggles as writers, in keeping with SLO 5 of the QEP. The most common pieces of advice included:

• Don't procrastinate/start sooner (this piece of advice was offered or agreed to by all participants in both focus groups).
• Read assignment prompts more fully and carefully.
• Take more time to read and analyze your sources.
• Revise more.
• Ask your teacher more questions about writing.
• Use the writing center.

4.1.6 Insights from Student Focus Groups

The two focus groups reported on here highlight some of the struggles that student writers faced before the implementation of QEP initiatives. Of particular note is the frustration students expressed with regard to the disconnect between the forms and formats of writing that they were asked to complete in ENGL 1100 and 1200 and the forms and formats that they were subsequently asked to complete within their major-area courses. In post-implementation focus groups, QEP leadership hopes to see that ENGL 2201, with its focus on disciplinary writing, has resulted in more coordination across Writing Foundations courses and WI courses in the majors in terms of citation styles, formatting, and genres.

It was encouraging to see that participants had developed the ability both to identify and explain unique features of writing in their major areas. In post-implementation focus groups, we hope to see this awareness expressed by most, if not all, of the participants, and we hope to see ENGL 2201 credited as a place that fostered such awareness. The implementation of the “writing self-analysis” as part of all WI courses might also contribute to increased awareness among future focus group participants of the reasons for disciplinary writing features. Indeed, the increased focus on metacognition that is part of the QEP (through the writing self-analysis and the Writing and Metacognition Workshop Series) might lead to students more regularly slowing down and actually thinking about what they are doing (the advice provided by one focus group participant).

Finally, we hope to see more comments in future focus groups about the benefits of regular use of the writing center (the advice of another focus group participant).

4.2 UNC/ECU Sophomore Survey—Spring 2016

Six writing-related questions previously included on the Sophomore Survey were accidentally dropped in the process of migrating the survey to the Qualtrics platform. The questions will be added back for the spring 2017 administration. For results from previous years of the QEP, see the assessment reports from 2012-13 and 2014-15, both available on the QEP website (www.ecu.edu/QEP).

4.3 UNC/ECU Graduating Senior Survey—Spring 2016

The Graduating Senior Survey included six writing-related questions that align with the QEP SLOs. Response data for these six questions are included in the table below, along with data from the spring 2015 administration of the survey for comparison.


Please indicate the extent of your agreement or disagreement with the statements below. (Means use a scale of 1 = Strongly disagree through 5 = Strongly agree; percentages are those responding "Strongly agree" or "Agree.")

                                      Spring 2015          Spring 2016
Statement                             Mean    % Agree      Mean    % Agree
Writing about complicated or
tricky topics and situations
helps me to think about them          3.9     73.3%        3.9     71.4%
I am well prepared to write
effectively in the styles and
formats of my career field            4.2     87.1%        4.1     84.2%
When composing important
documents, I often write
multiple drafts                       3.7     64.9%        3.7     67.2%
I regularly take time to
proofread my writing before
giving it to others                   4.1     83.2%        4.1     85.3%
I am confident in my ability to
avoid grammatical errors in my
writing                               4.1     81.9%        4.1     81.6%
I am confident in my ability to
evaluate the quality of my own
writing                               4.2     85.2%        4.1     84.0%

Table 8: Graduating Senior Survey Responses on Writing Questions

While little change is apparent across mean scores in these two years of results, changes may be visible in the spring of 2017, when, for the first time, a substantial number of graduating seniors will have had the full experience of QEP implementation (including ENGL 2201). Accordingly, we hope to see the percentages of those who agree and strongly agree go up in the areas of writing multiple drafts and using writing to think about complex topics and situations. Additionally, while the percentages of agreement and strong agreement were above 80% in response to four of the statements, we hope to see an increase in the mean scores in these areas (indicating stronger agreement than pre-QEP).

5. Curriculum Enhancement Initiatives Implementation Assessment Results and Actions Planned

Several measures were taken during AY 2015-16 to assess the effectiveness of the University Writing Portfolio and ENGL 2201 implementation processes.

5.1 Assessment of University Writing Portfolio Implementation

5.1.1 WI Syllabi Review

Academic year 15-16 also continued assessment of the implementation of the University Writing Portfolio through a review of syllabi and through an inventory of portfolio submission rates mid-way through the year.

As they had done in spring of 2015, the QEP team reviewed WI course syllabi in fall 2015 and spring 2016. Syllabi were reviewed to see if they mentioned the UWPort and, if they did, if the portfolio was a requirement or was strongly encouraged. The syllabi review revealed the following trends:

                                           Spring 2015   Fall 2015   Spring 2016
WI Syllabus Portfolio Mention              (N=179)       (N=97)      (N=124)
UWPort required for credit/grade/
course completion                          24%           56%         64%
UWPort mentioned, but NOT REQUIRED
(strongly encouraged, for extra
credit, etc.)                              44%           1%          1%
NO UWPort mention                          32%           43%         35%

Table 9: WI Syllabus Review Results for UWPort Mention

While the continued increase in percentage of syllabi that mention an upload requirement is encouraging, and while it is possible that information about the UWPort is being distributed to students through venues other than the course syllabus, the fact that upwards of 30% of syllabi reviewed from the most recent semester do not mention the portfolio is troubling.

In an attempt to lower the percentage of syllabi that do not mention the UWPort, multiple emails and announcements will continue to be circulated to faculty who teach WI courses, and additional reminders will be provided throughout the semester.

Additionally, the QEP Director met with Department Chairs and Writing Liaisons from programs in which one or more WI sections had no students uploading materials. The complete lack of participation from students in these sections suggests that the faculty members teaching those sections did not sufficiently discuss the upload process. Thus, as part of these meetings, Chairs and Liaisons were asked to publicize the requirement to all faculty in their programs and to offer reminders throughout the semester.

5.1.2 UWPort Upload Review

The QEP Director, with help from a graduate assistant and the Web Coordinator, also reviewed UWPort submission rates for fall 2015 and spring 2016 writing-intensive courses.

Those rates, along with rates for the fall of 2014, are included below:

WI Course Submission Rates     Fall 2014   Fall 2015   Spring 2016
> ½ of students submitting     41%         53%         46%
< ½ of students submitting     31%         28%         36%
NO students submitting         28%         19%         18%

Table 10: UWPort Upload Rates, Fall 2014 through Spring 2016

In light of the fact that last year’s efforts to raise submission rates did not result in as sharp an increase in submission rates as desired (see QEP Assessment Report for 2014-15), this year, the QEP Director met with Chairs and Liaisons for 16 programs in which multiple sections had below 50% upload rates and/or in which one or more sections had no students uploading materials.

These meetings yielded insights into potential causes for the lower-than-anticipated upload rates:


1) In multiple cases, the sections with no students or fewer than half of students uploading were taught by faculty hired near the beginning of the semester on a part-time basis. It is quite possible that these faculty, because of the hiring timeline, were not fully aware of the upload requirement or of how to help students complete the upload successfully.

2) Online/DE sections also accounted for several of the low/no upload sections. These sections tended to be taught by part-time, non-local faculty.

3) The growth of upload rates this past year was also hampered by ongoing technology glitches. While most of the Wi-Fi connectivity issues reported in the 14-15 academic year were addressed thanks to ITCS, server capacity at Nuventive, the company that hosts iWebfolio, was an issue. When classes of 20-25 students attempted to upload materials at the same time, the servers were sometimes unable to process the requests without long delays or crashes.

4) In 6 instances, a course that was tagged as WI in Banner was not, in fact, supposed to be. Thus, the course should not have been included in the review.

In response to the issues raised through these meetings, several actions have been taken to increase upload rates:

1) Nuventive has increased server capacity near the end of both fall and spring semesters.

2) The QEP Director has implemented the practice of contacting Department Chairs and Writing Liaisons near the beginning of the semester, and again shortly after the beginning of the semester, to remind them to alert new hires that, if teaching a WI course, those new hires need to comply with the UWPort upload requirement.

3) The QEP Director has implemented the practice of contacting instructors of DE WI courses directly to provide them with links to student checklists for the uploading process. These links can easily be shared with students via email and/or Blackboard.

4) In August 2016, the QEP Director reported to academic Deans and Directors regarding upload rates and the findings from meetings with Department Chairs and Liaisons. Deans and Directors were asked to include reminders about the upload in their regular communications with College/School faculty.

5) A review of submission rates will be conducted at the conclusion of the fall 2016 semester in order to monitor progress. While the submission rates reported here for AY 2015-16 will not have a significant impact on the ability to conduct post-implementation QEP assessment, it is important that rates increase over the final two years of the QEP so that a good sample of writing from students who have completed the ENGL 2201 curriculum, and have thus been at ECU during the time of full QEP implementation, can be gathered for post-implementation assessment.

5.1.3 Revised Goals for University Writing Portfolio Uploads

In hindsight, it appears that our initial goals (as articulated in the QEP document) for University Writing Portfolio upload rates were likely too optimistic. In the QEP, we indicated an 80% submission rate as our goal. Given the number of writing-intensive course sections offered each semester (a number in the hundreds); the large, diverse group of faculty who teach these courses; and seemingly ever-present technical difficulties, reaching 80% may not be possible, at least not within the limited time frame of the QEP period.

5.2 Enrollment in Discipline-themed 2201 Sections and Actions Planned

The QEP Director reviewed the intended and declared majors of students enrolled in discipline-themed sections of ENGL 2201 in both fall 2015 and spring 2016. Results of those reviews are reported in the table below. Note that majors that might fit into multiple disciplinary areas (for example, Psychology may be considered a social science or a health science, depending on what the student focuses on) were considered "in" the disciplinary theme of the section.


                            Fall 2015                        Spring 2016
                            # of     Total # of  % in        # of     Total # of  % in
Discipline Theme            Sections Students    Discipline  Sections Students    Discipline
Arts & Humanities           3        67          22%         1        24          17%
Business                    4        96          68%         2        48          52%
Communication               3        69          35%         3        75          5%
Engineering & Technology    2        48          61%         2        50          86%
Education                   4        90          39%         1        25          16%
Health Sciences             3        67          91%         4        98          60%
Natural Sciences            2        45          62%         2        50          38%
Social Sciences             3        70          47%         2        50          50%

Table 11: Enrollment in Discipline-themed ENGL 2201 Sections

Keeping in mind that students change majors and that they can major in areas that do not have an obvious connection to their intended career paths (in other words, they may enroll in a themed section that is more aligned with their career path than with their major), it is still clear that some themed sections have been more successful than others at enrolling students from the targeted disciplines.

Several measures were taken prior to registration for fall 2016 with the goal of increasing the numbers of students majoring in the areas around which selected ENGL 2201 sections are themed:

1) Writing Liaisons were asked to provide the QEP Director and/or the Director of Writing Foundations with a schedule of when required courses for majors would be offered (to the best of their knowledge—teaching schedules change frequently before actual registration begins). The information was compiled and used by the Director of Writing Foundations in scheduling 2201 themed sections for the fall 2016 semester.

2) Writing Liaisons were instructed to ensure that advisors working with their programs are aware of the related discipline-themed sections.

3) The names of the course sections as they appear in Banner were changed so that the theme is more visible in the listing. The listings used to say "ENGL 2201: Writing About the Disciplines (Arts and Humanities)" but have been changed to read "ENGL 2201: Writing About the Arts and Humanities."

4) The course search widget in Pirate Port was revised so that the theme of the sections is visible in search results.

5) Registration restrictions were piloted on certain discipline-themed sections to make it more likely that the most appropriate students would enroll. These restrictions were placed on sections themed for Business; Education; and Engineering and Technology.

6) An assistant to the Director of Writing Foundations was hired in June 2016 to help with, among other things, the scheduling and registration processes for ENGL 2201. This individual has added additional major-appropriate restrictions for other discipline-themed sections.

Enrollment rates will be reviewed again at the conclusion of the fall 2016 and spring 2017 semesters.

5.3 ENGL 2201 Instructor Survey Results and Actions Planned

5.3.1 Student Engagement with ENGL 2201 Material and Major/Career Connections

At the conclusion of the spring 2016 semester, faculty teaching ENGL 2201 were asked to complete a brief survey about their perceptions of the effectiveness of the 2201 implementation process and the effectiveness of the course in engaging and helping students move into writing in their major/career areas.

When asked to indicate their agreement with the statement "The majority of students in my 2201 section(s) seemed engaged in the assignments," 12 of 20 respondents (60%) said either "Somewhat agree" (7), "Agree" (5), or "Strongly agree" (1). Six respondents (30%) indicated that they either "Somewhat disagree" (4), "Disagree" (1), or "Strongly disagree" (1).

When asked to indicate agreement with the statement "The majority of students in 2201 seemed to be able to make connections between what we did in the class and what they will do in their majors/career areas," 11 of 21 respondents (52%) said either "Somewhat agree" (2), "Agree" (7), or "Strongly agree" (2). Five respondents (24%) indicated that they either "Somewhat disagree" (1), "Disagree" (2), or "Strongly disagree" (2). The remaining five instructors (24%) indicated that they neither agreed nor disagreed with the statement.

The fact that 48% of respondents either did not have a strong sense of how well students were able to make connections or did not agree that students were able to make them suggests that more effort might be needed to demonstrate to students in 2201 that writing will be important to their future academic and/or professional lives. In light of these concerns, more effort will be made in academic year 2016-17 to promote use of resources on the Writing@ECU website, including videos of faculty from different disciplines talking about the importance of writing to success in their areas.

It is also possible that, due to the difficulties detailed above in getting students enrolled in the most appropriate discipline-themed sections, students could not easily see connections because they were not part of the disciplinary area that was the focus of the section they had registered for.

With the various efforts (discussed above in 5.2) to increase the percentage of students appropriately enrolled in the discipline-themed sections, we hope to see an increase in instructors' perceptions that students are able to make connections between ENGL 2201 work and writing in their future majors and/or career paths.

5.3.2 Contribution to QEP SLOs

Instructors were also asked how much they felt that the ENGL 2201 curriculum had helped their students in moving toward the QEP SLOs. As the table below reflects, the majority of the 20 respondents felt that the course helped “A lot” or a “Moderate Amount” for each of the five outcomes. Note that, similar to the QEP WI rubric, the survey broke SLO 1 into two parts.

                                       A lot   Moderate   A little   Not at
Question                                       amount                all
Using writing to investigate
complex topics and address
significant questions                  5       9          5          1
Locating and integrating credible
research sources into their writing    10      9          1          0
Producing writing that effectively
addresses contexts, purposes, and
audiences                              6       10         4          0
Using drafting and revision to
improve their writing                  6       8          5          1
Proofreading and editing to avoid
grammatical and mechanical errors      5       7          5          3
Explaining and assessing the major
choices that they make in their
writing                                5       9          4          2

Table 12: Instructor Perception of Student Performance in ENGL 2201 on QEP SLOs

Responses suggest that proofreading and editing continue to be a challenge for many students and instructors. As a result, greater effort will be made to promote the resources available on the Writing@ECU website to assist students in this area. Additionally, a QEP/UWP Writing and Learning Community focused on proofreading and editing will be created to develop additional resources in the 2016-2017 academic year.

5.3.3 Comparison to ENGL 1200

Ten instructors responding to the survey indicated that they had taught ENGL 1200 in the past. These instructors were asked to compare the performance of students in ENGL 2201 to the performance of students in ENGL 1200. Results are summarized in the table below.

                         Much     Somewhat   About the   Somewhat   Much
Performance Area         better   better     same        worse      worse
Engagement with
course material          2        2          4           1          1
Effort exerted on
assignments              3        0          5           1          1
Contributions to
peer review              1        0          7           1          1
Participation in
class activities         1        0          7           0          2
Understanding
course readings          2        0          5           2          1

Table 13: Instructor Perception of Student Performance in ENGL 1200 versus ENGL 2201

The fact that the majority of instructors with previous experience teaching ENGL 1200 did not see dramatic improvements in the areas listed is not terribly surprising given that the course is still writing-intensive and still introduces difficult, new material. Furthermore, the target audience of students for 2201 is not much more advanced in terms of academic experience. It is interesting to note that 4 of 10 respondents indicated improvement in “engagement with course material”: a response that suggests some benefits from connecting the course with students’ major/career interests. This increase in engagement may be related to the 3 responses that indicated that students’ effort on assignments in 2201 is “Much Better” than the effort witnessed in ENGL 1200. Research consistently points to a connection between engagement and effort, and some of the open-ended responses to the question “what aspects of ENGL 2201 do you believe are most effective and why?” reflect such a connection:

“The focus on disciplinary conventions and shifting audiences foregrounds more clearly what we (or, at least, most of us) were trying to accomplish in 1200, and by assuring that all assignments deal with the students' disciplines directly, in some way, the students themselves seem more engaged and invested in their work.”


"I feel that a discipline-specific focus on researching and writing is highly relevant and significant to students' understandings of writing and researching in their disciplines."

"My students appreciated the chance to write in their disciplines and many commented that they thought they would use these skills again."

"Discipline-specific sections -- shared perspectives and more closely related content. It's not easy to create a course that highlights these possibilities, but it's an excellent challenge and one that tends to increase student investment."

5.4 ENGL 2201 Student Survey Results and Actions Planned

Students in ENGL 2201 were surveyed during the final two weeks of the spring 2016 semester. Faculty teaching 2201 were asked to encourage students to take the brief survey. Of the approximately 1300 students in the 56 sections of ENGL 2201 offered in spring 2016, a total of 157 responded, for a response rate of 12%. Not all students answered all questions. Additional efforts will be made this coming academic year, in both fall and spring semesters, to achieve higher response rates. In addition to providing more time to complete the survey by distributing it a full month before the conclusion of the semester and sending additional survey reminders to faculty teaching 2201, the QEP Director will investigate survey incentives (randomly selected respondents receiving Student Stores gift cards, for example).
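For transparency, the response rate reported above is simply respondents divided by enrollment; a minimal sketch of the arithmetic (the enrollment figure of 1300 is, as noted, approximate):

```python
# Response-rate arithmetic for the spring 2016 ENGL 2201 student survey.
# The enrollment figure (~1300 across 56 sections) is approximate, per the report.
respondents = 157
enrollment = 1300

response_rate = respondents / enrollment
print(f"Response rate: {response_rate:.0%}")  # Response rate: 12%
```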

5.4.1 Section Selection

Of 153 respondents who answered the question about class standing, 128, or 84%, were sophomores. Responses came from themed sections as follows:

2201 Section Type                                                          Respondents

Multi-disciplinary (not focused on writing in any particular major area)    90
Health Sciences                                                             19
Business                                                                    14
Social Sciences                                                              8
Not sure/Don't know                                                          4
Education                                                                    4
Communications                                                               4
Arts and Humanities                                                          4
Natural Sciences                                                             3
Engineering and Technology                                                   2
Total                                                                      152

Table 14: ENGL 2201 Survey Responses by Section Type

Seventy-two percent of students responding reported that they were aware that there were discipline-themed sections of ENGL 2201 when they signed up for the course. Respondents who did not take a discipline-themed section were asked to indicate why they had not taken one (checking all options that apply). Response options receiving more than 20 selections were as follows:

Answer                                                                     # of respondents indicating

I didn't specifically decide not to take a discipline-themed section:
I just took the section that was best for my schedule.                     35
The sections for my discipline were not offered when I could take them.    28
I didn’t know they were offered.                                           26

Table 15: Most Popular Reasons for Taking Multi-disciplinary Section of ENGL 2201

Similarly, those who DID take a discipline-themed section were asked to indicate why they had taken the section (selecting all options that apply). Only two options received more than 10 selections:

Answer                                                                     Count

I didn’t specifically decide to take a discipline-themed section:
I just took the section that was best for my schedule.                     33
My advisor suggested it would be a good idea.                              16

Table 16: Most Popular Reasons for Taking Discipline-themed Section of ENGL 2201

Not surprisingly, the timing of section offerings proved to be the most significant issue for many students. When asked to identify the most important factors in selecting their section of 2201, 60% of students selected “the day/time of the section.”

While it will never be possible to meet all students’ scheduling needs and desires, the increased coordination of scheduling through the Writing Liaisons, as discussed in 5.1 above, may reduce the number of students who find that the sections for their disciplines are not offered at times compatible with their schedules and increase the number of students who can find a discipline-themed section that fits well with their schedules.

Enrollment trends in discipline-themed sections will continue to be monitored throughout the rest of the QEP. If, after a number of semesters, it appears that certain discipline-themed sections are consistently populated by students who do not have a clear interest in the disciplinary focus of the section, those discipline-themed options may be removed from 2201 offerings so that both students and faculty can move toward the course outcomes most effectively.

5.4.2 Connection to Major/Career Area

To better understand students’ perceptions of the connections between the work done in 2201 and the writing they may encounter in future courses and their careers, the survey asked respondents to indicate their level of agreement (strongly agree, agree, somewhat agree, neither agree nor disagree, somewhat disagree, disagree, strongly disagree) with three statements about these connections. The number and percentage of respondents indicating agreement with these statements are included below:

Question                                  Somewhat agree   Agree      Strongly agree   Total combined agreement

ENGL 2201 has helped me better
understand how writing works in my
major/intended career.                    25% (35)         39% (55)   20% (28)         84% (118)

The assignments we have done in
ENGL 2201 will apply to my major
area.                                     28% (40)         30% (42)   15% (21)         73% (103)

The things we have done in ENGL
2201 will apply to my career.             36% (51)         22% (31)   14% (20)         72% (102)

Table 17: Student Perception of Connections between ENGL 2201 and Major Area

Interestingly, while only slightly more than half of the ENGL 2201 instructors who responded to the survey agreed with the statement that students in the course were able to make connections between what they did in the class and what they will do in their majors/career areas (see 5.3.1 above), students’ self-reporting, as reflected in responses to the three statements in the table above, suggests that most of them are, in fact, able to make connections.

6. Student Support Initiatives Assessment Results and Actions Planned

6.1 Writing Mentors Program Survey Results and Actions Planned

Surveys to measure the impact of the Writing Mentors program were distributed in fall 2015 and spring 2016 to faculty who worked with Mentors in their WI courses, to the students in those courses, and to the Mentors themselves.

6.1.1 Faculty Survey Results

In fall of 2015, all 4 participating faculty members completed the survey. In spring 2016, all 7 participating faculty completed the survey.

The survey provided a series of statements about the helpfulness of Mentors in different areas of writing and asked faculty to rate, on a scale of 1 (strongly disagree) to 5 (strongly agree), how much they agreed or disagreed with each statement. Mean responses are indicated in the table below, along with means from the previous semester, spring 2015, for comparison.

How much do you agree or disagree with the following statements? (1-5 scale, 1 = strongly disagree and 5 = strongly agree)

The Writing Mentor in my class helped students to...           Spring 2015   Fall 2015   Spring 2016
                                                               Mean          Mean        Mean

Understand writing assignments                                 4.2           4.7         3.9
Develop ideas for writing                                      4.2           4.5         4.6
Establish and maintain a thesis/focus                          4.3           4.7         4.6
Find good outside sources (books, articles, web sites,
etc.) for writing                                              3.3           3.5         3.7
Understand audience when writing                               3.7           4.2         4.0
Write multiple drafts of assignments                           4.2           4.2         4.0
Revise (make substantive changes to) writing                   4.5           4.5         4.4
Edit/proofread writing                                         4.0           4.7         4.3
Recognize strengths and weaknesses in their own writing        3.8           4.7         4.3

Table 18: Instructor Perceptions of Writing Mentor Impact

The decline in mean scores between spring/fall 2015 and spring 2016 for “Understand writing assignments” is initially concerning; however, it should be noted that responses to this statement also had the highest standard deviation (.99), with one faculty member out of 7 indicating that she disagreed with the statement. The same faculty member indicated that she would recommend the program to other faculty and commented that she believed it had benefited the majority of her students. It is difficult to reconcile this low score on one statement with the respondent’s praise of the program. Given the high margin of error that accompanies any survey of participants in a program of this limited scope (even one, like this, with a 100% response rate), conclusions are hard to draw on a semester-by-semester basis. The QEP five-year impact report, in which faculty survey responses will be aggregated from multiple semesters, may provide a better basis for drawing conclusions.

Faculty were also asked to comment on the value added by the Mentor Program by comparing students' writing performance in classes with Mentors versus classes without Mentors. Mean scores for the fall 2015 and spring 2016 semesters, along with mean scores from the spring 2015 semester (for comparison), are included in the table below.

When compared to students in other/previous sections of this course that did not have a Writing Mentor, how did students in this section perform in the following areas? (1-5 scale, 1 = Much worse and 5 = Much better)

                                                               Spring 2015   Fall 2015   Spring 2016
                                                               Mean          Mean        Mean

Using writing to investigate complex, relevant topics
and address significant questions                              3.7           4.2         4.2
Identifying credible sources to use in writing                 3.7           3.5         3.7
Incorporating evidence and outside sources into writing        3.8           4.5         4.2
Producing writing that reflects an awareness of context,
purpose, and audience                                          3.7           4.5         4.0
Revising                                                       4.3           4.7         4.3
Proofreading and editing                                       4.0           4.5         4.3

Table 19: Instructor Perceptions of Value Added by Writing Mentor

Here again, means at 4 or higher in all areas except “Identifying credible sources to use in writing” in fall 2015 and spring 2016 suggest that, overall, the Writing Mentors Program has contributed to improved student performance. Identifying credible sources remains an area for improvement: librarians who specialize in different disciplines would be a valuable resource to bring into Mentor training. Additionally, training for faculty in how to bring Mentors into the process of finding and evaluating sources (as opposed to having the Mentors work primarily or exclusively on drafts composed after research is already complete) will be added to future Writing Mentor Program faculty orientations.

While the quantitative data from the faculty surveys are instructive, the details provided in open-ended responses provide specific ideas for what works well in the program and what might be improved. Benefits of the Writing Mentors Program highlighted in survey responses from fall 2015 and spring 2016 include

• It provides a supportive, approachable peer for students:

“The biggest benefit of the writing mentor program is the peer-to-peer interaction between the graduate and undergraduate students. I noticed that the undergraduate students sometimes appeared more comfortable expressing their writing concerns to the writing mentor before communicating their issues with me.”

“The students were more willing to meet with him than they are to meet with me. Since he is one of their peers, they listened to his advice.”

“Students feel more comfortable speaking with a fellow student and get extra reinforcement in their writing skills.”

• It provides consistent feedback that is aligned with the instructor’s goals for the course:

“In the past, students have used the Writing Center as a required component of drafting a final paper.... While this was helpful to students, I found the writing mentor program improved the consistency of feedback provided to the students.”

“I also have a lot more confidence in what [the students] are getting than if they went to a regular tutor. Not that the tutors are bad, but I feel better knowing they are hearing feedback that is aligned with my goals for their assignments.”

“It was also helpful to be able to point out to the writing mentor specific issues that I had with the writing assignments so that he could work on those when he met with them.”

• It increases students’ engagement with the writing process:

“Students with a writing mentor have thought more about both the content and what THEY want to say, not just what an article says. They have much more of a voice in their papers.”

“I think that having to go to meetings and be held accountable for doing some work before the due date really helps them to see that they shouldn't procrastinate. But more than that, it encourages seeing writing as a process and not just an opportunity to display what they ‘know.’”

• It helps the instructor improve the course:

“Having a designated mentor in the course allowed for planning and implementation conversations between the faculty and the mentor. These conversations facilitated a strategic effort to enhance writing competencies while meeting the QEP outcomes.”

“Getting the check-in reports from the mentor helps me adjust instruction too. I also appreciate being able to ask questions of the mentor and get feedback when I notice a pattern in the writing that I need to address.”

Suggestions for improvement from the fall 2015 and spring 2016 surveys were somewhat limited: in spring 2016, for example, 3 of the 5 total responses to the question about what could be improved indicated that there were “no areas for improvement at this time” and that we should “keep up the good work!” Two areas for improvement were, however, offered across the two sets of faculty surveys:

• Provide faculty with additional ideas for how to use Mentors:

“Perhaps a meeting with other faculty in the writing mentor program to hear more about what others are doing and how they are utilizing the mentor in their class.”

• Improve processes for scheduling meetings/interactions between students and Mentors:

“The main difficulty is trying to find times for the mentor and students to meet. Each person has a complex schedule and setting up times to get together is difficult.”

“Scheduling is the hardest aspect. I am not sure what can be done. The students all have different schedules that are very full so sometimes it legitimately is hard for them to schedule appointments.”

“My class met twice a week and for one of those days the writing mentor was scheduled to work at the writing center…. This made scheduling times for peer reviews and other activities difficult since he could only come one day a week.”

6.1.2 Mentor Program Student Survey Results

In fall 2015, a total of 23 students responded to the survey distributed to classes with a Writing Mentor assigned (a response rate of approximately 23%). In spring 2016, a total of 74 students responded (a response rate of approximately 42%).

Students were asked how often they communicated with the Writing Mentor during the course. Results, by percentage of respondents, are included below, along with results from spring 2015 for comparison:

How often did you communicate with the Writing Mentor in this class?

                              Spring 2015 %   Fall 2015 %   Spring 2016 %

Never                         3%              0%            0%
Less than Once a Month        40%             0%            8%
Once a Month                  23%             52%           34%
2-3 Times a Month             19%             35%           24%
Once a Week                   13%             4%            16%
2-3 Times a Week              1%              9%            18%
More than 2-3 Times a Week    0%              0%            0%

Table 20: Student Reporting of Communication with Writing Mentor

In last year’s (2014-2015) QEP assessment report, we indicated that actions would be taken to decrease the percentage of students reporting that they communicated with the Mentor less often than once a month. Those efforts appear to have been successful: no students reported communicating with the Mentor less often than once per month in fall 2015, and only 8% did so in spring 2016. This is a substantial decrease from the 43% of respondents who reported communicating with the Mentor less often than once per month (or never) in spring 2015.

Students were also asked to indicate how much they felt the Mentor(s) working with their classes had helped them with various writing tasks by indicating their agreement or disagreement with a set of statements. Means for responses for spring 2016, fall 2015, and spring 2015 (for comparison) are summarized below.

How much do you agree or disagree with the following statements? (1-5 scale, 1 = strongly disagree and 5 = strongly agree)

The Writing Mentor helped me to...                             Spring 2015   Fall 2015   Spring 2016
                                                               Mean          Mean        Mean

Understand writing assignments                                 3.6           4.2         4.2
Develop ideas for writing                                      3.9           4.4         4.3
Establish and maintain a thesis/focus for my writing           3.8           4.3         4.3
Find good outside sources (books, articles, websites,
etc.) for my writing                                           3.3           4.0         3.8
Understand my audience when writing                            3.5           4.2         4.2
Write multiple drafts of my assignments                        3.5           3.6         4.1
Revise (make substantive changes to) my writing                4.0           4.0         4.3
Edit/proofread my writing                                      4.1           4.2         4.3
Recognize strengths and weaknesses in my own writing           3.8           4.1         4.3
Address weaknesses in my own writing                           3.7           4.1         4.3

Table 21: Student Perceptions of Writing Mentor Impact

In the QEP assessment report for 2014-2015, when comparing mean scores for these statements from spring 2014, fall 2014, and spring 2015, it was noted that “the highest means in all areas occurred in spring 2015. In terms of students’ perceptions, it appears that the Mentor Program continues to improve.” Given that the mean scores for all statements have increased even further across AY 2015-2016, with several means 0.5 or 0.6 higher than those reported in AY 2014-2015, the Mentor Program appears to have continued improving and, from students’ perspectives, to be quite successful at present.

As was the case with the faculty survey, students’ open-ended comments provide useful information about the benefits and areas for improvement in the Writing Mentors Program.

Many benefits were mentioned, but three themes emerged prominently across multiple open-ended responses:

1. The Mentor Program provides students with a consistently supportive and approachable peer:

“Communicating with someone who is also a student because then you do not feel as intimidated when asking for help. If it was a professor or someone of higher authority then I would probably have felt nervous to get an opinion or help about my own writing.”

“[The program] gives students the opportunity to work with someone else one the writing besides the professor. I know for me it is easier to relate to someone around my age then having to go to the professor for help.”

“Having the writing mentor was also good because it removed bias that is normally associated with submitting work to a professor.”

“I think the biggest benefit is that they are students that help you, so I don't feel as pressured and nervous because they are also my peers.”

“I really liked having someone other than the professor to help with the writing assignments and answer questions.”

“You have a peer that you are able to reach with questions that you may not feel comfortable asking your instructor.”

“It was nice having her in class to put a face with a name.”

“It is nice to have a set person that you can use as a resource for your class.”

“I liked having another person to talk to about my writing. I felt like I had constant support while writing my papers and always had someone I could ask questions and help me with my writing.”

“It allows you to have someone to go to with ANY questions, and having the same person assures that you are getting the same feedback.”

2. The program helps to clarify assignments and expectations for students:

“I think it is beneficial because you have someone assigned to your class specifically, they understand what is being assigned. Instead of going to someone who doesn't know what is going on in your class they know exactly what is being asked of the students.”

“[The mentor] was very knowledgeable as to what my professor expected for the assignment.”

“Having someone who has seen multiple papers on the same topic and suggest things based on the theme of other people's papers so that you're not off-field”

“I can go in not knowing if my paper meets the criteria the professor wants, and the writing mentor can help me meet it.”

“[The biggest benefit is] gaining insight from a trained personal that knows the direction the instructor is aiming.”

“It was nice to have someone devoted to helping a class. She fully understood each assignment and was very helpful!!!”

3. The program raises student writers’ confidence:

“The writing mentor program serves as a confidence booster.”

“It helps in the development of strong arguments and helps build confidence in your writing because you have support.”

“It was very nice to have someone one on one that was working specifically with my class and understood the assignments. It made everything so much easier while I also felt more confident in my writing.”

“As a rusty 2nd student, my writing is not at its best. However, the mentor increased my confidence and helped me identify my strengths.”

“I have never been a confident writer, but the writing mentor offered encouragement as well as suggestions of ways to refine my writing skills.”

One recommendation for improvement appeared in multiple responses: improve scheduling and increase the amount of time available to meet with the Mentor.

6.1.3 Writing Mentor Program Mentor Survey Results

In fall 2015, 3 Writing Mentors responded to the survey about their experiences. In spring 2016, 7 Mentors responded.

Mentors were asked how much of an impact they felt they had on student writers by indicating agreement or disagreement with a series of statements. Results are below, along with data from spring 2015 for comparison:

How much do you agree or disagree with the following statements? (1-5 scale, 1 = strongly disagree and 5 = strongly agree)

You were able to help the students to...                       Spring 2015   Fall 2015   Spring 2016
                                                               Mean          Mean        Mean

Understand writing assignments                                 4.8           4.0         4.6
Develop ideas for writing                                      4.8           4.3         4.6
Establish and maintain a thesis/focus for their writing        4.5           4.3         4.7
Find good outside sources (books, articles, web sites,
etc.) for their writing                                        4.3           3.7         4.3
Understand the audience when writing                           4.8           4.3         4.4
Write multiple drafts of their assignments                     4.3           4.3         4.1
Revise (make substantive changes to) their writing             4.8           5.0         4.9
Edit/proofread their writing                                   4.5           5.0         4.1
Recognize strengths and weaknesses in their writing            4.3           4.3         4.6
Address weaknesses in their own writing                        4.5           4.7         4.6

Table 22: Writing Mentors' Perceptions of Their Own Impact

Scores from the Mentors across the past two academic years are consistently high. The scores for “Revise (make substantive changes to) their writing” are particularly impressive, with means of 4.8 or above for the past three semesters. One area of student writing that the Mentors’ responses suggest is ripe for further attention is the practice of composing multiple drafts. It is also quite possible that, with revision occurring in electronic files, discrete, identifiable “drafts” are harder to delineate and thus harder to count as “multiple.” Rather than producing multiple drafts, writers have a single file (a draft) that they continually revise. The very high scores in response to the statement “Revise their writing” suggest that this model, in which one “draft” is constantly under revision, may account for the lower scores in response to “Write multiple drafts of their assignments.”

6.1.4 Comparison of Writing Mentor Program Perceptions of Impact

It would be difficult to draw meaningful conclusions from a comparison of mean scores among Mentors, faculty, and students in fall 2015 because of the small number of Mentors placed that semester. Comparison of perceptions from these three groups in spring 2016, however, may reveal areas of success and areas for improvement.

Spring 2016 Comparison

How much do you agree or disagree with the following statements? (1-5 scale, 1 = strongly disagree and 5 = strongly agree): The Writing Mentor helped me/students to...

                                                               Mentor Mean   Faculty Mean   Student Mean

Understand writing assignments                                 4.6           3.9            4.2
Develop ideas for writing                                      4.6           4.6            4.3
Establish and maintain a thesis/focus                          4.7           4.6            4.3
Find good outside sources (books, articles, websites,
etc.) for writing                                              4.3           3.7            3.8
Understand audience when writing                               4.4           4.0            4.2
Write multiple drafts                                          4.1           4.0            4.1
Revise (make substantive changes to) writing                   4.9           4.4            4.3
Edit/proofread writing                                         4.1           4.3            4.3
Recognize strengths and weaknesses in writing                  4.6           4.3            4.3
Address weaknesses in writing                                  4.6           3.9            4.3

Table 23: Comparison of Faculty, Student, and Writing Mentor Perceptions of Impact

It is worth noting that, in spring 2016, there were no areas in which mean scores for perceptions differed by one point or more. This was not the case in the data reported in the AY 2014-15 QEP Assessment Report, suggesting that the various trainings and orientations for faculty and Mentors are having a positive impact on the program.

In one area, finding good outside sources for writing, faculty and student mean scores fell below 4 while, in the eyes of the Mentors, their work in the area was more successful, as reflected in their mean score of 4.3. As mentioned above, Mentor training for AY 2016-17 will incorporate more work with librarians who specialize in research practices and database resources in different disciplines. Through this training, we hope to see scores from students and faculty move closer to the self-assessment scores reported by Mentors.

6.1.5 Writing Mentors Program Areas for Improvement, Actions Planned or Taken, Impact of Actions Taken

Assessment of the Writing Mentors Program for 2015-16 revealed several areas for improvement:

1) Area: Enrollment in ENGL 3875: Peer Tutoring

Low numbers of Mentors available for placement in fall 2015 reflected the problematic nature of requiring that a specific course be taken before undergraduate students can serve as Mentors. While many students expressed interest in the course and the opportunity to become a Writing Mentor, very few actually took ENGL 3875: Peer Tutoring in spring 2015. The day and time at which ENGL 3875 was scheduled in spring 2015 was likely a major factor in the low enrollment.

Actions Taken: Greater focus on scheduling. In response to this problem, greater emphasis was placed on setting up the English Department course schedule so that the course was offered on a day and at a time when more students would be able to take it without conflicts with other required courses in their programs.

Impact: The adjustment, along with additional advertising for the course, helped grow enrollment to 19 students in spring 2016.

2) Area: Students’ ability to recognize strengths and weaknesses in their own writing. Faculty indicated in their spring 2015 survey responses that students, even with the help of a Mentor in the course, struggled to recognize strengths and weaknesses in their own writing.

Actions Taken: Faculty Development. In the faculty orientation sessions for the Writing Mentor Program in both fall 2015 and spring 2016, faculty were encouraged to involve Writing Mentors more directly in helping students to evaluate their own writing and were provided with strategies to do this, including the incorporation of a “feedback memo” assignment in which students were required to summarize the feedback they were given by the Mentor and to describe how they planned to use (or not use) the feedback and why. These kinds of memos further attune students to the “whys” of employing certain writing strategies and practices within particular disciplinary and course contexts.

Impact: The increases in mean agreement scores between spring 2015 and both semesters of the 2015-16 academic year for the statement that the Writing Mentor helped students “Recognize strengths and weaknesses in their own writing” (see 6.1.1 above) suggest that the emphasis in faculty orientation helped improve student performance via the Mentors.

3) Area: Schedule alignments between faculty and Mentors and between Mentors and students. In fall 2015, 37% of students who responded (n=16) suggested that scheduling time with the Mentor was difficult. In spring 2016, 21% (n=38) of respondents also mentioned scheduling as an area for improvement. Scheduling/increasing time for meeting with Mentors was also the most common suggestion for improvement from students last AY (2014-15).

Actions Taken: Increasing enrollment in ENGL 3875. By increasing the number of students who completed ENGL 3875 in spring 2016, we have expanded options for pairing Writing Mentors with faculty and thus have a greater chance of identifying Writing Mentors who can, if desired by the instructor, attend class meetings and be available at certain times to assist students when an instructor includes such assistance in the course schedule.

Impact: Responses to faculty, student, and Mentor surveys from AY 2016-17 will be reviewed to determine impacts, if any.

6.2 University Writing Center Usage Data and Actions Planned

Usage data for the University Writing Centers (at Joyner and Laupus Libraries and through the Online Writing Lab) reflects the impact the QEP expansion has had, with increasing numbers of students assisted each year. The table below provides usage data from this academic year as well as the past three academic years:

UWC Location          2012-2013 (pre-QEP)   2013-2014 (QEP Yr. 1)   2014-2015 (QEP Yr. 2)   2015-2016 (QEP Yr. 3)

Face-to-face sites    1,918                 3,755                   3,809                   5,115
Online Writing Lab    561                   1,022                   1,275                   1,946
Total                 2,479                 4,777                   5,084                   7,061

Table 24: University Writing Center Usage, 2012-2016

These numbers represent a 185% increase in total UWC use between the pre-QEP 2012-2013 year and 2015-2016.
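The 185% figure is the growth in total sessions from the pre-QEP baseline (2,479 in 2012-2013) to QEP Year 3 (7,061 in 2015-2016); a minimal sketch of the arithmetic, using the totals from Table 24:

```python
# Percentage increase in total University Writing Center sessions,
# pre-QEP (2012-2013) to QEP Year 3 (2015-2016), from Table 24.
baseline_total = 2479    # total sessions, 2012-2013 (pre-QEP)
year_three_total = 7061  # total sessions, 2015-2016 (QEP Yr. 3)

increase = (year_three_total - baseline_total) / baseline_total
print(f"Increase: {increase:.0%}")  # Increase: 185%
```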

Of course, usage should not be the only measure of the impact of the UWC expansion. As noted in the assessment plan for the QEP, UWC exit surveys were administered to all ECU community members who received assistance beginning in the 2012-2013 academic year. Thus, pre-QEP implementation response data can be compared with post-QEP implementation response data to determine if satisfaction with the UWC's services changes at all with the expansion. Data for spring 2013 (pre-QEP implementation) is not available due to technical problems that arose during the process of moving into a temporary space while the new UWC space was under construction:

Level of Satisfaction   Fall 2012   Fall 2013   Spring 2014   Fall 2014   Spring 2015   Fall 2015   Spring 2016

Very Satisfied          78%         77%         81%           73%         81%           79%         81%
Satisfied               22%         22%         19%           26%         19%           20%         18%
Dissatisfied            0%          1%          0%            1%          1%            1%          0.5%
Very Dissatisfied       0%          0%          0%            0%          0%            0%          0%

Table 25: University Writing Center Exit Survey “Satisfaction” Question Responses, 2012-2016

Satisfaction levels among users of the UWC remained consistently high from pre-to-post QEP implementation, suggesting that the expansion of the center has not had a negative impact on the quality of the help it provides for students.

7. Faculty Support Initiatives Assessment Results and Actions Planned

7.1 Summer 2016 WAC Academy Survey Results and Actions Planned

Five participants—representing programs in English, Criminal Justice, Computer Science, Foreign Languages and Literatures, and Social Work—responded to a Qualtrics survey about the summer 2016 Advanced WAC Academy. This is a 100% response rate: numbers in the academy are kept purposefully low to enable extensive interaction and collaboration.

Respondents were asked to indicate level of agreement on a number of statements about their experience in the academy. Responses are summarized below.


Counts are given as Strongly disagree (1) / Disagree (2) / Agree (3) / Strongly agree (4), followed by the mean:

• I have a better understanding of how to help my students become stronger writers now than I did before the Academy. (0 / 0 / 2 / 3; mean 3.6)

• The readings and activities will help inform my planning and instruction to encourage the transfer of skills and knowledge. (0 / 0 / 2 / 3; mean 3.6)

• The group discussions throughout the Academy were useful for informing my instruction. (0 / 0 / 1 / 4; mean 3.8)

• I will use the knowledge and materials I gained in the Academy in my classroom. (0 / 0 / 0 / 5; mean 4.0)

• I plan to share the knowledge and materials gained in the Academy with my colleagues. (0 / 0 / 2 / 3; mean 3.6)

• I would recommend the WAC Academy: Transfer of Writing Skills and Knowledge to someone else in my department. (0 / 1 / 1 / 3; mean 3.4)

Table 26: Responses to 2016 Summer WAC Academy Survey

Comparing mean scores to those from the 2015 Summer WAC Academy reveals strong similarity. While the mean score for “I would recommend the WAC Academy: Transfer of Writing Skills and Knowledge to someone else in my department” dropped by 0.4 and that for “The readings and activities will help inform my planning and instruction to encourage the transfer of skills and knowledge” dropped by 0.2, the score for “I will use the knowledge and materials I gained in the Academy in my classroom” rose by 0.2, to the top possible score of 4.0.
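The mean scores reported in Table 26 are weighted averages over the four-point agreement scale. As a quick illustration using counts from the table (the helper function name is ours, purely for demonstration):

```python
def likert_mean(counts):
    """Weighted mean for a 1-4 Likert scale.

    counts: number of responses for (Strongly disagree=1, Disagree=2,
            Agree=3, Strongly agree=4), in that order.
    """
    total = sum(counts)
    return sum(score * n for score, n in zip(range(1, 5), counts)) / total

# Counts from Table 26:
print(likert_mean([0, 0, 2, 3]))  # "better understanding" item -> 3.6
print(likert_mean([0, 1, 1, 3]))  # "would recommend" item -> 3.4
```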

Other statements in the survey functioned as part of QEP formative assessment and were intended to inform the structure and content of future summer WAC Academies. Responses to some of these statements are included below.

Counts are given as Strongly disagree (1) / Disagree (2) / Agree (3) / Strongly agree (4), followed by the mean:

• The Academy Facilitator was helpful. (0 / 0 / 1 / 4; mean 3.80)

• The time allotted for activities and discussion was appropriate. (0 / 1 / 2 / 2; mean 3.20)

• I found the project worked on throughout the Academy to be a helpful way to learn about student learning and writing. (0 / 0 / 1 / 4; mean 3.80)

Table 27: Results from 2016 Summer WAC Academy Formative Assessment Questions

Mean scores of 3.80 in response to the statements about the facilitator and the value of the project are the same as or higher than those reported for the summer 2015 WAC Academy. The mean score regarding the time allotted, 3.20, remained the same as last year, despite efforts to address this issue (as mentioned in the QEP report for 2014-15). Based on one of the open-ended comments from participants, the demands of the project in the short time frame still seem to be a problem for some participants:

I wish there had been more time to work on the pieces for the final project. It was difficult to find time outside of the Academy to accomplish some items (just because of other commitments) … If more than 3 hours a day could be allocated that would be helpful.

In response to these continued concerns about the time limits of the academy, the Assistant Director of the UWP and the QEP Director will explore options for extending the hours and/or days during which the summer academy is held.

7.2 Writing Liaisons 2016 Survey Results and Actions Planned

A survey was distributed to Writing Liaisons at the conclusion of the spring 2016 semester. Twenty-six Liaisons completed the survey (a 63% response rate). Of those who responded, 92% rated their involvement with the Writing Liaisons program as "Good" or "Very Good."

Given that a primary goal of the Liaisons program is to increase communication across campus about writing, writing instruction, and writing support, Liaisons were asked to indicate how often they discuss information that they have received through the Liaisons program with others in their departments, programs, and/or colleges. Data from the 2014-2015 Liaisons’ survey is included for comparison.

How frequently during your time as a Writing Liaison have you talked with colleagues in your program, department, or college about issues related to the QEP, the University Writing Program, the University Writing Center, or the Writing Foundations program?

Response                 2014-15 % of Responses   2015-16 % of Responses
Never                    0%                       4%
Less than Once a Month   40%                      27%
Once a Month             32%                      46%
2-3 Times a Month        28%                      23%
Once a Week              0%                       0%
2-3 Times a Week         0%                       0%
Daily                    0%                       0%

Table 28: Comparison of Writing Liaisons’ Reports of Time Spent Communicating with Colleagues about Writing-related Programs

While one Liaison reported that s/he “never” spoke with colleagues, the percentage of Liaisons reporting that they shared information once per month or more often rose by 9 percentage points between 2014-15 and 2015-16, to a total of 69%. In an effort to maintain and, ideally, increase this rate, the QEP Director will continue the practice, instituted after a review of the 2014-2015 QEP Assessment Report, of sharing “talking points” with Liaisons after monthly meetings and will also send reminder emails several days after circulating these talking points.
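The 9-point figure is the change in the combined share of Liaisons reporting contact once a month or more often; a quick check against the percentages in Table 28:

```python
# Shares of Liaisons reporting each contact frequency (Table 28), ordered:
# Never, Less than Once a Month, Once a Month, 2-3 Times a Month,
# Once a Week, 2-3 Times a Week, Daily.
freq_2014_15 = [0, 40, 32, 28, 0, 0, 0]
freq_2015_16 = [4, 27, 46, 23, 0, 0, 0]

# "Once a month or more" aggregates everything from "Once a Month" onward.
monthly_plus_14 = sum(freq_2014_15[2:])  # 60
monthly_plus_15 = sum(freq_2015_16[2:])  # 69
print(monthly_plus_15 - monthly_plus_14)  # 9-percentage-point increase
```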

Accuracy of the information that Liaisons share is as important as the frequency of discussion. Thus, Liaisons were asked to rate their ability to explain various aspects of the QEP and/or the University Writing Program to their colleagues. Percentages of respondents who indicated each rating are indicated below, along with data from the 2014-2015 Liaison survey for comparison:


Percentages are given as Poor or Very Poor (1 or 2) / Fair (3) / Good or Very Good (4 or 5), for 2014-15 and then 2015-16:

• The services of the University Writing Center (2014-15: 0 / 21% / 79%; 2015-16: 0 / 11% / 89%)

• The Writing Mentors Program (2014-15: 0 / 16% / 84%; 2015-16: 0 / 20% / 80%)

• The structure and goals of the WAC (Writing-Intensive) program (2014-15: 4% / 25% / 71%; 2015-16: 4% / 31% / 65%)

• The University Writing Portfolio (2014-15: 8% / 29% / 63%; 2015-16: 8% / 27% / 65%)

• Writing and Learning Communities (2014-15: 17% / 46% / 37%; 2015-16: 19% / 38% / 42%)

• The QEP Student Learning Outcomes (2014-15: 4% / 25% / 71%; 2015-16: 0% / 23% / 77%)

• The QEP assessment process (2014-15: 12% / 33% / 55%; 2015-16: 8% / 31% / 61%)

• The goals and rationale for the revised Writing Foundations (English 1100 and 2201) sequence (2014-15: 0 / 25% / 75%; 2015-16: 0 / 20% / 80%)

Table 29: Writing Liaisons’ Survey Results for Perception of Knowledge about Writing Programs at ECU

While these results show a 5-percentage-point increase in the proportion of Liaisons who feel that their ability to help their colleagues understand the “Writing and Learning Communities” is “good” or “very good,” the fact that the percentage remains below 50% shows that more detail about these groups needs to be provided. As a result, participants from past writing and learning communities will be asked to attend a meeting of the Liaisons this year to share what they learned from the groups and to explain how those groups worked. This may be more effective than having the QEP Director simply summarize what has been done.

The survey provided additional formative insights. Respondents were asked to rate their satisfaction with a number of different aspects of the Liaisons program:


Aspect                                                     2014-15 Mean (Scale of 1-5)   2015-16 Mean
Format/structure                                           3.92                          4.23
Amount of work                                             4.00                          4.36
Readings & Assignments                                     3.84                          3.96
Meetings/interaction (face-to-face)                        4.12                          4.31
Meetings/interaction (online)                              3.88                          4.09
Stipend for participation                                  3.84                          4.23
Opportunities for discussion, interaction, and questions
  with QEP/Writing Program leaders                         4.24                          4.35
Opportunities for discussion, interaction, and questions
  with other Liaisons                                      4.12                          4.54

Table 30: Writing Liaisons’ Satisfaction with Aspects of Program, 2014-2016

Mean scores for all areas went up between AY 2014-15 and AY 2015-16.

7.3 Metacognition Workshop Series Survey Results and Actions Planned

A total of 17 faculty members from various disciplines completed the three-session workshop series during AY 2015-16.

Workshop participants were asked to complete a survey about the workshop series. Percentages of respondents answering “Excellent” and “Good” are provided below, along with percentages from AY 2014-15.

Percentages are given as 2014-15 Excellent / 2014-15 Good, then 2015-16 Excellent / 2015-16 Good:

• How organized, knowledgeable, and prepared was the presenter? (91% / 9%; 94% / 6%)

• How effective was the presentation of content (interesting and engaging)? (80% / 20%; 100% / 0%)

• To what extent do you think the workshop content will be useful? (82% / 8%; 94% / 6%)

• The quality of the materials and resources offered was… (80% / 20%; 94% / 6%)

• To what extent did the presenters provide opportunities for discussion, interaction, and questions? (100% / 0%; 100% / 0%)

• How appropriate was the amount of time allotted for this workshop? (64% / 36%; 75% / 25%)

• Overall, I found this series to be… (82% / 8%; 94% / 6%)

Table 31: Writing and Metacognition Workshop Series Participant Survey Results

Feedback for the series continues to be positive. Changes made to the structure of the workshops to allow for more time appear to have improved participants’ experiences. In AY 2016-17, the workshop series leaders will continue to focus on adjusting activities to ensure they all fit into the one-hour slot. Many faculty can only stay for one hour due to teaching schedules, so extending the workshop time is not a feasible option.

7.4 Writing and Learning Communities Survey Results and Actions Planned

After the first two years of WLCs, the QEP Director and the Assistant Director of the University Writing Program revised the structure of the WLCs based on participant feedback. These revisions included a more specific timeline that incorporated benchmarks to ensure timely completion of work stages and a QEP/UWP facilitator (rather than a WLC member facilitator) to alleviate the pressure previous WLC member-facilitators had faced. Using this revised structure, two WLCs were initiated during 2015-16. One group comprised 5 faculty members who responded to a call for participants in a WLC focused on “Students Using Sources.” These 5 faculty came from multiple disciplines (Foreign Languages and Literatures, English, Social Work, Engineering, and Technology Systems) and worked over the course of two semesters to investigate common struggles that student writers encounter when locating strong secondary sources and integrating them into their written projects. The work culminated in spring 2016 with a workshop in which all five participants presented specific teaching strategies that they had developed to help students. These strategies included using Bloom's taxonomy to teach source integration, analyzing books to develop source-use awareness, and applying the I-BEAM framework (Instance, Background, Exhibit, Argument, Method) to understand how sources can be used. Members of the WLC plan to repeat the workshop session in fall 2016 so that even more faculty have a chance to benefit from the group's work. Materials from the WLC will also be made available on the Writing@ECU website.

Another WLC formed within the College of Nursing, where six faculty members decided to use the QEP's WLC as a platform to continue work begun during a day-long UWP workshop in August 2015. More specifically, the group developed a shared language for talking about writing with students within the RN-to-BSN curriculum. The group determined definitions for key rhetorical concepts (context, voice, audience, format, topic, and purpose), discussed examples of each term in the broader contexts of their students' writing and learning, and integrated the new language into several key documents (syllabi, writing assignments, and rubrics) to be used throughout the RN-BSN curriculum. The WLC shared their experiences and products with the entire RN-BSN faculty during the program's April 2016 meeting, including discussion of the documents they revised and the Weebly website (http://piratenurseswriting.weebly.com) that contains common tools, information, and language that can be used throughout the program. In the future, the WLC plans to expand to other areas in the College of Nursing, including representatives from the RN-BSN option, the traditional undergraduate program, and the master's and PhD programs.

7.4.1 WLCs Feedback Survey

The 8 faculty members who completed the WLC Feedback Survey reported satisfaction with the experience: 75% rated their involvement “Very Good,” and at least 50% said they were “Very Pleased” with 8 of the 10 facets of the WLCs surveyed. As indicated in the table below, all participants who responded to the survey indicated that they were either “Pleased” or “Very Pleased” with all 10 facets:


How pleased were you with the following aspects of your WLC? If an aspect in the list does not apply, simply leave it blank. Responses on a 1-5 scale (1=Very Displeased, 5=Very Pleased)

Aspect                                          Mean   % of Respondents “Pleased” or “Very Pleased”
Format/Structure                                4.4    100%
Amount of work                                  4.5    100%
Meetings/Interaction (face-to-face)             4.6    100%
Meetings/Interaction (online)                   4.5    100%
Reading                                         4.6    100%
Quality of materials and projects developed     4.5    100%
Timeline for work                               4.5    100%
Stipend for participation                       4.1    100%
Process of developing materials and projects    4.5    100%
Opportunities for discussion, interaction,
  and questions                                 4.6    100%

Table 32: WLC Feedback Survey Results

Additionally, all respondents indicated that it was either “Very Likely” (75%) or “Likely” (25%) that participation in a WLC would influence their approach to writing instruction. In response to a request on the survey for ideas to improve future WLCs, respondents provided the following recommendations:

• Start earlier in the semester
• Keep the groups small (5) for maximum participation
• Provide more administrative support to assist faculty in their participation
• Hold follow-up meetings [and] create a blog as a group

The WLC facilitators will work to make sure each of these suggestions is taken into account when planning and leading future WLCs.

7.5 Eastern NC Writing Symposium

As described in section 2.3.5 above, this QEP initiative was held on ECU’s campus in August of 2015. The event brought together educators from K-12, community college, and university sectors to talk about the teaching of writing, with a focus on introducing students to writing across the curriculum (WAC).

7.5.1 Symposium Feedback Survey Forty participants completed a feedback survey that was designed and distributed by the Lifelong Learning Program. Several questions on the survey asked respondents about their level of agreement or disagreement with various statements about the Symposium. The table below includes percentages of responses in each category:


Percentages are given as Strongly Agree / Agree / Neutral / Disagree Somewhat:

• This program fills a need in the education system(s) of eastern NC. (60% / 40% / 0% / 0%)

• This program met my expectations. (49% / 44% / 5% / 2%)

• The speakers were knowledgeable and helpful. (77% / 20% / 3% / 0%)

• Breakout sessions were well organized and helpful. (52.5% / 37.5% / 10% / 0%)

• I learned a lot on the topic of the symposium. (42.5% / 50% / 7.5% / 0%)

• I plan to apply what I have learned today to my teaching. (70% / 25% / 5% / 0%)

• Time allowed for discussion with speakers and other participants was adequate. (79% / 13% / 5% / 3%)

• The visual aids and handouts helped make the sessions more useful. (47.5% / 42.5% / 7.5% / 2.5%)

• The registration process went smoothly. (72% / 13% / 5% / 10%)

• I would recommend this program to a friend. (72.5% / 22.5% / 5% / 0%)

• The date for the symposium was convenient for me. (68% / 29% / 0% / 3%)

Table 33: Eastern NC Writing Symposium Feedback Survey Results

Overall feedback was positive, with “Strongly Agree” or “Agree” receiving the most responses for each aspect of the symposium. In planning for the next symposium in 2017, the QEP Director and the Director of Writing Foundations, in cooperation with the Lifelong Learning Program, will focus on improving in the four areas in which agreement was the weakest according to the survey: meeting the expectations of attendees, providing attendees with learning opportunities, designing useful breakout sessions, and providing helpful visual aids and handouts.

Feedback from participants in response to some open-ended questions on the survey provides direction for these endeavors.

Expectations: Open-ended comments from several respondents mentioned that the symposium exceeded their expectations or that they didn't have expectations going into the event. In light of these responses, it seems possible that the wording of the statement may have influenced the percentage of “strongly agree” responses. Only two open-ended comments provided an idea of how expectations had not been met:

“I was hoping for more strategies that I can use in the classroom.”
“I would like to see more actual classroom examples.”

For future symposia, we will aim to send attendees home with class activity ideas (or to make such resources available electronically).

Breakout Sessions


Many respondents indicated that the breakout sessions were the most productive aspect of the symposium, with 17 respondents mentioning them in response to the open-ended question, “What did you find most helpful about the program?” A few respondents, however, provided some important critiques of the sessions:

“Breakout sessions…could have used more concrete content”
“We had some characters and very little was accomplished in the first session.”
“I enjoyed the breakouts but think they need more structure.”

These three responses suggest that more direction and guidance would be helpful in breakout sessions at future conferences, and we will plan to provide it.

Learning

One open-ended response suggested that the symposium, while not necessarily presenting a lot of new knowledge, was important in supporting previous knowledge:

“My knowledge was reinforced, my suspicions were clarified, my challenges commiserated, but I did not learn "a lot". I appreciate the opportunity immensely! Because it supported many challenges I face and gave me an opportunity to share and inquire.”

It is possible that other respondents felt similarly, and this affected their response to the item on the survey. Another open-ended response, which indicated that “The QEP seminar last year covered much of this material,” suggests that ECU faculty may not always find the symposium as useful as faculty from other programs do. Because of the QEP, conversations about many aspects of writing are ongoing on the ECU campus, and, as a result, some of the conversations at the symposium can seem redundant. In future symposia, it may be advantageous to involve more ECU faculty in leading sessions and delivering keynote addresses, given that many of them are already deeply involved in professional development around the teaching of writing.

8. Additional/New Assessment Activities Planned for 2016-2017

In addition to continuing many of the assessments discussed in this report, several additional assessment activities will be undertaken in AY 2016-2017. These include the following:

• Post-implementation assessments of WI writing samples for approximately half of all programs.

• Continued review of submission rates for the University Writing Portfolio.

• Continued assessment of course enrollment practices for ENGL 2201.

• Follow-up surveys for past participants in the Writing and Metacognition Workshop Series and the summer WAC Academy.

• Follow-up surveys for participants in the 2015 Eastern NC Writing Symposium.


Appendix A: WI Course Rubric

QEP SLO 1a: Inquiry and Focus
Writer uses writing to investigate complex, relevant topics and address significant questions through engagement with and effective use of credible sources.
4 Excellent: Projects demonstrate the writer's ability to identify and fully engage significant questions relevant to the course. Writing maintains a consistent focus of inquiry.
3 Good: Projects largely demonstrate the writer's ability to engage meaningful questions relevant to the course, and the writing maintains a focus with only occasional lapses.
2 Fair (Shows promise): Projects demonstrate the writer's ability to engage questions relevant to the course, but in limited ways. There is a focus of inquiry, but the writing strays from that focus on several occasions.
1 Poor: Projects largely fail to demonstrate engagement with questions relevant to the course. The writing appears to have little focus.

QEP SLO 1b: Source Use
Writer uses writing to investigate complex, relevant topics and address significant questions through engagement with and effective use of credible sources.
4 Excellent: Projects consistently draw on credible sources to support the points the writer makes and to help the writer achieve his or her purpose.
3 Good: Projects draw on credible sources to support the writer's points and purposes, with only occasional lapses.
2 Fair (Shows promise): Projects draw on credible sources to support the writer's points and purposes, but do so inconsistently.
1 Poor: Projects largely fail to draw on credible sources to support the writer's points and purposes.

QEP SLO 2: Context, Genre, Audience
Writer produces writing that reflects awareness of audience, genre, and conventions of their major disciplines and/or career fields.
4 Excellent: Projects consistently demonstrate a keen awareness of audience, genre, and conventions of the discipline/course.
3 Good: Projects demonstrate an awareness of audience, genre, and conventions of the discipline/course with only occasional lapses.
2 Fair (Shows promise): Projects demonstrate an uneven awareness of audience, genre, and conventions of the discipline/course.
1 Poor: Projects largely fail to demonstrate an awareness of audience, genre, and conventions of the discipline/course.

QEP SLO 3: Writing Process
Writer demonstrates that he/she understands writing as a process that can be made more effective through drafting and revision.
4 Excellent: The analysis clearly reflects that the writer has planned the project in multiple steps and revised thoroughly and carefully between each draft.
3 Good: The analysis suggests that the writer recognizes the importance of planning and revising and has engaged to some degree in these processes, making some substantive revisions between drafts.
2 Fair (Shows promise): The analysis suggests that the writer recognizes some benefits to planning and revising but that she/he has not fully engaged in these processes.
1 Poor: The analysis suggests that the writer composed the work largely in one draft, dedicating little, if any, time to planning or revising.

QEP SLO 4: Proofreading and Editing
Writer proofreads and edits his or her own writing, avoiding grammatical and mechanical errors.
4 Excellent: Projects consistently display careful proofreading and are largely free of surface-level errors.
3 Good: Projects reflect the proofreading efforts of the writer and include only occasional surface-level errors.
2 Fair (Shows promise): Projects evidence some proofreading and editing, but several surface-level errors remain.
1 Poor: Projects reflect minimal or ineffective proofreading and editing strategies. Numerous surface-level errors remain.

QEP SLO 5: Writing Awareness
Writer assesses and explains the major choices that he/she makes in his/her writing.
4 Excellent: The analysis clearly demonstrates the writer's ability to identify and explain writing choices and strategies used in projects.
3 Good: The analysis demonstrates the writer's ability to identify and explain writing strategies used in the projects, with only occasional areas that are confusing or incomplete.
2 Fair (Shows promise): The analysis demonstrates that the writer is sometimes able to identify and/or explain writing strategies used in the projects, but there are several areas that are confusing or incomplete.
1 Poor: The analysis largely fails to demonstrate an ability to identify and explain writing strategies in the projects.



Appendix B: English 2201 Assessment Rubrics

Two rubrics were used for assessing the portfolios for English 2201: one for the writing self-analysis and one for the projects included in the portfolio.

ENGL 2201: Self-Analytical Rubric

Excellent (5)

The self-analytical writing demonstrates the writer’s exceptional ability to identify and explain the writing strategies (i.e., argument, organization, evidence, style, tone, etc.) used in the documents included in the portfolio.

Very Good (4)

The self-analytical writing demonstrates, with only minor lapses, the writer’s ability to identify and explain the writing strategies used in the documents included in the portfolio.

Adequate (3)

The self-analytical writing demonstrates the writer’s inconsistent ability to identify and explain the writing strategies used in the documents included in the portfolio.

Developing (2)

The self-analytical writing demonstrates the writer’s limited ability to identify and explain the writing strategies used in the documents included in the portfolio.

Insufficient (1)

The self-analytical writing completely fails to demonstrate an ability to identify and explain the writing strategies the writer used in the documents included in the portfolio, or one is not provided.


English 2201 Portfolio Rubric Excellent (5) Very Good (4) Adequate (3) Developing (2) Insufficient (1) N/A Inquiry The projects

demonstrate an exceptional ability to create, identify, and engage in significant research questions.

The projects demonstrate, with only minor lapses, a strong ability to create, identify, and engage in significant research questions.

The projects demonstrate an inconsistent ability to create, identify, and engage in research questions.

The projects demonstrate a limited ability to create, identify, and engage in research questions.

The projects do not demonstrate a college-level ability to create, identify, and engage in research questions.

Not assessed in this Project.

Critical Engagement with and Use of Evidence

The projects demonstrate an exceptional ability to rhetorically engage a variety of appropriate sources to support the central claims.

The projects demonstrate, with only minor lapses, a strong ability to rhetorically engage a variety of appropriate sources to support the central claims.

The projects demonstrate an inconsistent ability to rhetorically engage a limited number of appropriate sources support the central claims.

The projects demonstrate a limited ability to rhetorically engage sources to support the central claim.

The projects do not demonstrate a college-level ability to rhetorically engage sources to support the central claims.

Not assessed in this Project.

Purpose, Audience, and Context

The projects demonstrate exceptional awareness of purposes, audiences, and contexts.

The projects demonstrate, with only minor lapses, steady awareness of purposes, audiences, and contexts.

The projects demonstrate an inconsistent awareness of purposes, audiences, and contexts.

The projects demonstrate a limited awareness of purposes, audiences, and contexts.

The projects do not demonstrate a college-level awareness of purposes, audiences, and contexts.

Not assessed in this Project.

Disciplinary Conventions

The projects demonstrate the writer’s exceptional understanding of methods of inquiry and rhetorical strategies, including form, media, and style, relevant to the discipline.

The projects demonstrate, with only minor lapses, the writer’s strong understanding of methods of inquiry and rhetorical strategies, including form, media, and style, relevant to the discipline.

The projects demonstrate the writer’s uneven understanding of methods of inquiry and rhetorical strategies, including form, media, and style, relevant to the discipline.

The projects demonstrate the writer’s limited understanding of methods of inquiry and rhetorical strategies, including form, media, and style, relevant to the discipline.

The projects do not demonstrate a college-level understanding of methods of inquiry and rhetorical strategies, including form, media, and style, relevant to the discipline.

Not assessed in this Project.

Formatting & Citation

The projects follow standard formatting and documentation guidelines. Attributions are complete and meet the appropriate style guidelines (APA, MLA, Chicago, or CSE).

The projects generally follow formatting and documentation guidelines. Errors in the appropriate style guidelines (APA, MLA, Chicago, or CSE) are negligible and do not affect the integrity of the work.

The projects inconsistently follow formatting and documentation guidelines. Errors in the appropriate style guidelines (APA, MLA, Chicago, or CSE) occur regularly.

The projects only sporadically follow formatting and documentation guidelines. Errors in the appropriate style guidelines (APA, MLA, Chicago, or CSE) compromise the integrity and honesty of the projects.

The projects show little to no adherence to formatting and documentation guidelines. Plagiarism is evident.

Not assessed in this Project.

Expression and Organization

The projects are clearly organized to develop the central points. Sentences and paragraphs are logically connected with a minimum of grammar and punctuation errors.

The projects are organized to develop the central points. Sentences and paragraphs are connected with few lapses in transition and explanation. Grammar and punctuation errors are rare but noticeable.

The projects are somewhat organized to develop the central points. Sentences and paragraphs inconsistently develop clear logical connections. Grammar and punctuation errors occur regularly and interfere with transitions and explanations.

The projects lack clear organization and development of central points. Sentences and paragraphs are not clearly developed or logically connected. Grammar and punctuation errors are regular and impede understanding of the text.

The projects do not demonstrate college-level organization and development. Sentences and paragraphs lack academic development.

Not assessed in this Project.