
DIRECT MEASUREMENT OF STUDENT OUTCOMES WITH A PIPE NETWORK DESIGN PROGRAM

John Finnie & Neil Fennessey
Department of Civil & Environmental Engineering
University of Massachusetts Dartmouth
[email protected]

INTRODUCTION

Direct measurement has become an essential part of the assessment and evaluation of ABET Student Outcomes. One of the possible ways to accomplish a direct measurement is to document student performance on specific parts of an assignment. Student performance on specific parts of the assignment can be compared over time and conclusions drawn regarding attainment of various student outcomes. The objective of this paper is to show how a Communication Rubric for grading student design projects could be used to provide some direct measurements of attainment of student outcomes.

DIRECT MEASUREMENT OF STUDENT OUTCOMES

The ABET web site (WWW.ABET.ORG) provides information about accreditation, including the documents entitled “Criteria for Accrediting Engineering Programs” for specific Accreditation Cycles (school years). A comparison of these criteria reveals that “direct measures” for assessment are not specifically mentioned until 2011-2012. The definition section for the 2010-2011 criteria defines Assessment as follows [1].

“Assessment is one or more processes that identify, collect, and prepare data to evaluate the achievement of program outcomes and program educational objectives.”

However, the 2011-2012 criteria document provides the following definition [2].

“Assessment is one or more processes that identify, collect, and prepare data to evaluate the attainment of student outcomes and program educational objectives. Effective assessment uses relevant direct, indirect, quantitative and qualitative measures as appropriate to the objective or outcome being measured. Appropriate sampling methods may be used as part of an assessment process.”


This latter definition was proposed for the “EAC Harmonized General Criteria” on November 1, 2008. It was formally adopted for the 2011-2012 accreditation cycle at the ABET meeting of October 30, 2010 [3].

ABET does not provide a specific definition of a direct measure, nor does it publish procedures for the use of direct measures as assessment tools. However, ABET does provide a number of venues for learning about the assessment process and preparing for an accreditation visit. These include an ongoing series of webinars, workshops, conferences, and newsletters, as well as its website. Some of the webinars are available at no cost, as are the newsletters and the website.

One definition of a direct measure was provided by Community Matters, a monthly newsletter from ABET, which began a column about assessment in August 2006. This series addressed a number of assessment topics, including direct measures of learning. In one of these columns, Gloria Rogers presented a list of direct and indirect assessment techniques [4]. The list of indirect measures included the familiar techniques of written surveys, questionnaires, and interviews. The list of direct measures included portfolios, local exams, oral exams, standardized exams, and external examiners. In this paper, a different measure will be explored: a rubric for grading technical reports will be used to directly measure attainment of a student outcome.

THE COMMUNICATION RUBRIC

A previous accreditation visit criticized our process for evaluating technical communications. In response to this criticism, our Communication Rubric was developed as a systematic way of grading student reports. It has been utilized to grade student reports in three specific undergraduate courses: freshman computer graphics, our second course in water resources engineering, and our second course in environmental engineering. A section of the Communication Rubric is given below. The full text of our Communication Rubric is available on our web site at: http://www.umassd.edu/engineering/cen/undergraduate/programoutcomes/

The Communication Rubric includes sections for Written Content, Technical Content, and Oral Presentation. For example, the section on written content is presented below.

Written Content Grade ____

Content and integration of information from sources (journals, manuals, etc.) ( %)

______ 1. All ideas presented support and develop the topic.
______ 2. Project reflects insight into and understanding of the subject matter.


______ 3. Ideas are stated clearly and are developed fully with specific supporting details from the specifications or technical literature.

______ 4. Effectively uses examples, paraphrases, or summaries from the literature concerning the subject matter, not just quotations.

______ 5. Work reflects a sufficient review of the applicable Codes, specifications and/or technical literature.

Structure and Form ( %)

______ 1. Abstract is succinct and clear.
______ 2. Table of Contents is correct and logical.
______ 3. Introduction engages reader, explains project and gives clear sense of direction.
______ 4. Logical, structured body guides reader through ideas, using topic sentences, etc.
______ 5. Conclusion gives sense of rounding off and wrapping up without feeling repetitive, rushed, or unfamiliar.
______ 6. Demonstrates proper and effective paragraphing.
______ 7. Uses appropriate transitional words and phrases between paragraphs and sentences.
______ 8. Meets required length, if specified.

Grammar, Usage, and Mechanics ( %)

______ 1. Contains few or no errors in grammar and usage.
______ 2. Word choice is appropriate to professional writing.
______ 3. Contains few or no errors in spelling, capitalization, and punctuation.
______ 4. Shows clear evidence of proofreading and use of a spellchecker.

Format ( %)

______ 1. Typed – black ink in 12-point standard font (Times New Roman or similar).
______ 2. Follows specified line spacing (e.g., single, 1.5 or double-spaced).
______ 3. Follows specified page margins (e.g., 1-inch margins all around).
______ 4. Pages numbered at page bottom, center.
______ 5. Follows other formatting requirements specific to course/project (i.e., title page, etc.)
______ 6. Citation of facts, tables, figures, quotations, etc.
        Quotations: lengthy quotations block-style indented 1 inch and single-spaced; source and page number provided for quotations.
        Source citation in correct format: e.g., Fennessey (2004)
______ 7. Citation/Reference list is complete, accurate, and in specified format (ASCE, TRB, etc.)

In a similar fashion, the section on technical content addresses Technical Approach, Design Calculations, and Drawings and Supporting Graphics. The section on oral presentation (if applicable) addresses Appearance of Presenters, Oral Presentation, Subject Matter Presentation, and Post-Presentation Questions and Answers. Under each of these categories, a list of positive attributes helps the instructor to decide upon a numerical score for that category.

Our Communication Rubric has been utilized to evaluate technical communication projects since 2004 in at least one of our courses. However, its use as a direct measurement of student outcomes was not considered until recently.

THE DESIGN PROJECT

The team design project used for this example requires students to increase the capacity of a domestic water pipeline system for a fictional city. Students are required to enlarge some of the pipes to increase capacity for fighting fires, to specify dimensions, volume, and elevations for a new city water tank, and to specify horsepower, flow rate, and pressure (i.e., a “pump curve”) for a new pump to supply the new system. A copy of the specific design project is presented in the appendix.

STUDENT OUTCOMES

Our program has adopted the “a through k” student outcomes as presented in the 2010-2011 criteria [1], and has added an outcome that addresses the application of codes and regulations. For this particular course, we wish to assess attainment of the following student outcome: (g) an ability to communicate effectively.

RESULTS OF THE COMMUNICATION RUBRIC

Table 1 presents an example using scores for thirteen teams on the Written Content section of the Communication Rubric. This section accounts for 40% of the project grade. We have assumed the following weights for each category of this section.

Content and integration of information from sources (journals, manuals, etc.) (8%)
Structure and Form (12%)
Grammar, Usage, and Mechanics (12%)
Format (8%)


Figure 1 presents the results of the Written Content section of the Communication Rubric. A rating of “Exceeds Criteria” requires a score of 90% or above. A score below 90% but at least 80% receives a rating of “Meets Criteria”. A score below 80% but at least 70% receives a rating of “Progressing to Criteria”. Any score below 70% receives a rating of “Below Expectations”. These limits and their descriptions were suggested by Gloria Rogers in a column in Community Matters [5]. The number of rating levels, their limits, and their names are not prescribed by ABET; they should be decided by the program.

To determine the team ratings shown in Figure 1, the numerical scores in Table 1 were first divided by the total possible points for each category. For example, in the category “Content & Sources”, Team 1 scored 5 points out of a possible 8, a fraction of 0.625. Since this is below 70%, the team received a rating of “Below Expectations” in this category. Figure 1 summarizes the performance of the teams in each of the four categories. In the category “Content & Sources”, one team received an evaluation of “Exceeds Criteria”, three teams received an evaluation of “Meets Criteria”, five teams received an evaluation of “Progressing to Criteria”, and four teams received an evaluation of “Below Expectations”.
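
As a concrete illustration of this step, the short Python sketch below divides a team's category scores by the assumed category weights and maps the result onto the four rating levels. It is only a sketch of the bookkeeping described in this section, not the authors' actual tool; the variable names are invented, and the scores shown are Team 1's row from Table 1.

```python
# Minimal sketch of the bucketing step described above (not the authors' actual tool).
# Weights are the assumed category maxima (8%, 12%, 12%, 8% of the project grade);
# the scores are Team 1's row from Table 1.

WEIGHTS = {
    "Content & Sources": 0.08,
    "Structure & Form": 0.12,
    "Grammar, Usage & Mechanics": 0.12,
    "Format": 0.08,
}

def rating(score, weight):
    """Map a category score to one of the four ratings used in Figure 1."""
    fraction = score / weight
    if fraction >= 0.90:
        return "Exceeds Criteria"
    if fraction >= 0.80:
        return "Meets Criteria"
    if fraction >= 0.70:
        return "Progressing to Criteria"
    return "Below Expectations"

team_1 = {
    "Content & Sources": 0.05,
    "Structure & Form": 0.09,
    "Grammar, Usage & Mechanics": 0.12,
    "Format": 0.07,
}

for category, weight in WEIGHTS.items():
    print(category, "->", rating(team_1[category], weight))
# "Content & Sources" -> 0.05/0.08 = 0.625, i.e. "Below Expectations", as in the text.
```

Repeating this calculation for all thirteen teams and tallying the ratings in each category yields the counts summarized in Figure 1.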

Table 1 Summary of grades by category

Team           Content &   Structure   Grammar, Usage   Format   Sum
               Sources     & Form      & Mechanics
Weight (max.)  0.08        0.12        0.12             0.08     0.40
1              0.05        0.09        0.12             0.07     0.33
2              0.05        0.11        0.10             0.08     0.34
3              0.06        0.09        0.09             0.08     0.32
4              0.07        0.10        0.12             0.07     0.36
5              0.06        0.09        0.11             0.06     0.32
6              0.07        0.10        0.11             0.08     0.36
7              0.05        0.11        0.11             0.07     0.34
8              0.06        0.10        0.12             0.07     0.35
9              0.05        0.08        0.12             0.08     0.33
10             0.06        0.11        0.10             0.06     0.33
11             0.08        0.12        0.11             0.08     0.39
12             0.06        0.11        0.12             0.06     0.35
13             0.07        0.08        0.11             0.08     0.34

Figure 1 Results of Communication Rubric for Written Content

It should be noted that the Technical Content section of the Communication Rubric could also be used to assess other Program Outcomes. These outcomes could include

(a) an ability to apply knowledge of mathematics, science, and engineering
(c) an ability to design a system, component, or process to meet desired needs within realistic constraints such as economic, environmental, social, political, ethical, health and safety, manufacturability, and sustainability
(e) an ability to identify, formulate, and solve engineering problems

EVALUATION

Figure 1 identifies topic and skill areas where students need extra work. An initial evaluation would be that this class needs to improve its performance in the first two categories (Content & Sources and Structure & Form).


However, ABET has a specific definition of evaluation which is presented in its criteria [1].

“Evaluation is one or more processes for interpreting the data and evidence accumulated through assessment processes. Evaluation determines the extent to which student outcomes and program educational objectives are being attained. Evaluation results in decisions and actions regarding program improvement.”

Figure 1 can help the program evaluate the level of attainment of Outcome “g” (an ability to communicate effectively). To do this, the program must specify how the ratings in Figure 1 correspond to levels of attainment of an outcome. For example, the program could decide that acceptable attainment of Outcome “g” means that every student team receives a rating of at least “Meets Criteria” in three of the four categories.
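
For illustration only, this example rule can be written as a short check. The rule, threshold, and function name below are the hypothetical ones just described (treating “Exceeds Criteria” as satisfying “Meets Criteria”); they are not an ABET requirement.

```python
# Sketch of the example attainment rule described above (hypothetical rule, not ABET policy):
# Outcome "g" is judged attained only if every team earns "Meets Criteria" or better
# in at least three of the four Written Content categories.
MEETS_OR_BETTER = {"Meets Criteria", "Exceeds Criteria"}

def outcome_g_attained(team_ratings):
    """team_ratings: one dict per team, mapping category name -> rating string."""
    return all(
        sum(rating in MEETS_OR_BETTER for rating in ratings.values()) >= 3
        for ratings in team_ratings
    )
```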

However, this evaluation must also integrate Figure 1 with other assessment tools and with the results of the Communication Rubric from other courses. In our case, these other assessment tools include embedded exam questions and student presentations in front of our Industrial Advisory Board. Evaluating outcomes with data from multiple assessment tools can be approached in a number of ways. The program could develop a formula that combines all assessments into a single number (dimension), for example by converting the results of all assessment tools to a number between 1 and 5 and averaging them. Alternatively, the program could form a qualitative statement based on multiple data sources and data dimensions. ABET does not specify a procedure for formulating an evaluation from assessment tools.
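
A minimal sketch of the single-number approach is given below. The tool names and scores are invented for illustration; the only assumption is the simple rescaling-and-averaging scheme just described.

```python
# Hypothetical example of collapsing several assessment tools into one number.
# Each tool's result is assumed to have already been converted to a common 1-to-5 scale.
tool_scores = {
    "Communication Rubric (this course)": 3.2,      # invented values for illustration
    "Embedded exam questions": 4.0,
    "Industrial Advisory Board presentation": 3.5,
}

overall = sum(tool_scores.values()) / len(tool_scores)
print(f"Combined score for Outcome 'g': {overall:.2f} out of 5")
```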

Collecting the same data for future classes would also allow the program to track performance trends.

CONCLUSIONS

A Communication Rubric has been presented and applied to a team design project in a civil engineering course. Scores in specific categories of the Communication Rubric were tabulated. Weights were applied to each category, and a four-part rating system developed. These ratings were summarized and graphed for each category in the Communication Rubric. Procedures were presented for using these ratings to form an evaluation of attainment of a student outcome.

The results and recommendations of this paper are those of the authors, and have not been reviewed or approved by ABET.

REFERENCES


[1] ABET, “Criteria for Accrediting Engineering Programs, Effective for Evaluations during the 2010-2011 Accreditation Cycle”, ABET.

[2] ABET, “Criteria for Accrediting Engineering Programs, Effective for Evaluations during the 2011-2012 Accreditation Cycle”, ABET.

[3] ABET, “Criteria for Accrediting Engineering Programs, Effective for Evaluations during the 2009-2010 Accreditation Cycle”, ABET.

[4] Gloria Rogers, “Assessment 101, Direct and Indirect Assessments: What Are They Good For?”, Community Matters Newsletter, (August 2006), ABET Inc.

[5] Gloria Rogers, “Assessment 101, Rubrics: What Are They Good For?”, Community Matters Newsletter, (September 2006), ABET Inc.

APPENDIX

CEN 325 Water Resource Engineering

City of San Roberto Light Water Distribution System (Version S)
Project 1 of 2 Design Projects for a Team

March 31, 2009

Introduction. The City of San Roberto Light (population 2736) has an existing drinking water distribution system which meets its current minimum needs. It wants to upgrade its existing system to handle fire flows. It has hired you to advise it on alternatives and to design alterations to the pipe system, pump, and storage tank.

Description of the system. Treated water enters the pipe network at node 1 (see Figure 1) and is pumped to an elevated storage tank. The storage tank has a volume of 20,000 gallons, with a diameter of 16 feet and a depth of 13 feet. The ground elevation at the storage tank is 112 feet, and the bottom of the tank is at elevation 170 feet. The pumps at the water treatment plant can deliver 500 gallons per minute to the storage tank when its water surface is at elevation 190 feet.
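
As a quick consistency check on the stated tank geometry (an illustrative sketch only, using the dimensions given above and a nominal 7.48 gal/ft³ conversion):

```python
import math

# Verify that a 16 ft diameter x 13 ft deep cylindrical tank holds roughly 20,000 gallons.
GAL_PER_CUBIC_FT = 7.48

volume_ft3 = math.pi * (16.0 / 2) ** 2 * 13.0   # ~2,614 ft^3
volume_gal = volume_ft3 * GAL_PER_CUBIC_FT      # ~19,600 gallons, i.e. about 20,000
print(f"Tank volume: {volume_gal:,.0f} gallons")
```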

Design Specifications. The goal of the city is to provide water at 40 to 65 psi at any node during maximum flow conditions (2.5 times average demand). The city would also like to provide fire flows of 1000 gpm at each of the following nodes, one at a time: 1, 4, 7, 8, and 10, with a minimum pressure of 20 psi at all nodes (except at the fire flow node, where the pressure must not be negative).


During fire flows, the other nodes would experience 150% of average demands. Pumps would be turned OFF during the fire flows. That is, all flows would come from the storage tank. Confirmation of these fire flows would reduce the cost of fire insurance for every home owner in San Roberto Light.

The water tank must meet the following criteria. Design a new tank, if necessary, including overall dimensions and elevations.

1) Its water surface elevation must not go below 172 feet during a 24-hour period, under the following conditions:

• The normal demand cycle (Figure 2) with a 1000 gpm fire flow for 4 hours,
• The tank starts from a full condition,
• The water treatment pumps are providing 500 gpm.

2) In addition, over the 24-hour normal demand cycle, the water surface elevation must be stable (i.e., it returns to full).

3) After 1 and 2 are done, select a new pump for the water treatment plant. That is, determine its water horsepower and design head (or three pairs of head and Q on its pump curve). The replacement pump should provide 1000 gpm to the elevated tank when the water surface elevation is 190 feet (no other flows in the network). Assume that the water surface elevation at the water treatment plant is 80 feet. Verify by pipe network calculation that your selected pump provides the desired flow rate and head. (An illustrative sketch of the tank and pump calculations follows this list.)
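
The Python sketch below, added for illustration, walks through the two calculations that criteria 1 and 3 call for: an hourly mass balance on the tank and the standard water-horsepower relation WHP = Q x H / 3960 (Q in gpm, H in feet). The average demand, the hourly demand multipliers standing in for the Figure 2 curve, and the friction-loss figure are all invented placeholders; a real submission would use the actual demand cycle and the head losses reported by the pipe network program.

```python
import math

# Illustrative sketch only (placeholder data), not an assignment solution.

GAL_PER_CUBIC_FT = 7.48

# --- Criterion 1: hourly mass balance on the storage tank ------------------
TANK_DIAMETER_FT = 16.0            # existing tank; a new design would change this
TANK_BOTTOM_EL_FT = 170.0
TANK_DEPTH_FT = 13.0
AREA_FT2 = math.pi * (TANK_DIAMETER_FT / 2) ** 2

AVG_DEMAND_GPM = 300.0             # placeholder average system demand
INFLOW_GPM = 500.0                 # treatment plant pumps, per the project statement
FIRE_FLOW_GPM = 1000.0             # applied during the first 4 hours (assumption)
# Hourly multipliers standing in for the Figure 2 demand curve (placeholders, mean = 1.0).
demand_multiplier = [0.6] * 6 + [1.3] * 6 + [1.0] * 6 + [1.1] * 6

depth_ft = TANK_DEPTH_FT           # start from a full tank
for hour, mult in enumerate(demand_multiplier):
    outflow_gpm = AVG_DEMAND_GPM * mult + (FIRE_FLOW_GPM if hour < 4 else 0.0)
    net_gal = (INFLOW_GPM - outflow_gpm) * 60.0          # net volume change this hour
    depth_ft += net_gal / GAL_PER_CUBIC_FT / AREA_FT2
    depth_ft = min(max(depth_ft, 0.0), TANK_DEPTH_FT)    # tank cannot be less than empty or more than full
    water_surface_el = TANK_BOTTOM_EL_FT + depth_ft
    print(f"hour {hour:2d}: water surface elevation {water_surface_el:5.1f} ft")
# With these placeholder numbers the existing 20,000 gallon tank is drawn down below
# elevation 172 ft within the first hour of fire flow, which is why a larger tank
# (and new elevations) may be needed.

# --- Criterion 3: water horsepower for the replacement pump ----------------
Q_GPM = 1000.0
STATIC_LIFT_FT = 190.0 - 80.0      # tank water surface minus plant water surface
FRICTION_LOSS_FT = 25.0            # placeholder; take this from the network model
total_head_ft = STATIC_LIFT_FT + FRICTION_LOSS_FT
whp = Q_GPM * total_head_ft / 3960.0
print(f"Design point: {Q_GPM:.0f} gpm at {total_head_ft:.0f} ft -> {whp:.1f} water horsepower")
```

The same structure applies to criterion 2: run the 24-hour cycle without the fire flow and check that the ending water surface elevation equals the starting (full) elevation.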

It is assumed that you will use a commercial pipe network computer program.

Deliverables. Provide plans and a parts list for the changes to accommodate fire flows in the existing system. Provide a plan view map. Provide plan and cross-section views for any changes to the water tower. Use a CAD system for any plans not provided by the pipe network software.

Submit a written report which presents your design. State assumptions and conclusions. Include the pipe, node, and tank tables for every alternative, including pressures. Include any hand calculations. Do not make me dig for information. Use the node numbers shown on Figure 1.

Grading. The grade for the project will be determined as follows: technical quality (50%), written report (40%) and drawings (10%).

I have posted the Project Evaluation Checklist in the glass case opposite room 110 in our building.


You must clearly identify the individual author of all work being submitted, so that individual work can be recognized. Place initials of the person who did the work and the person who checked the work on 1) every sheet of hand calculations, 2) every drawing (CAD and other), and on all printed output from computer programs. In the appendix to the report, provide a table which identifies the group members and lists their individual tasks separately. If I can’t tell who did the work, I will reduce the grade by one letter.

Due Date. The due date is ___________________


Figure 1 – Existing Pipe Network

Figure 2 – Normal Demand Cycle

Proceedings of the 2011 ASEE Northeast Section Annual Conference, University of Hartford
Copyright © 2011, American Society for Engineering Education