
Science Educator’s Guide to Laboratory Assessment

By

Rodney Doran
Fred Chan
Pinchas Tamir
Carol Lenhardt

Copyright © 2002 NSTA. All rights reserved. For more information, go to www.nsta.org/permissions.

Claire Reinburg, Director
Judy Cusick, Associate Editor
Carol Duval, Associate Editor
Betty Smith, Associate Editor

NATIONAL SCIENCE TEACHERS ASSOCIATION

Gerald F. Wheeler, Executive Director
David Beacom, Publisher

Science Educator’s Guide to Laboratory Assessment
NSTA Stock Number: PB145X2
ISBN: 0-87355-210-5
Library of Congress Catalog Card Number: 98-84914
Printed in the USA by Victor Graphics
Printed on recycled paper

Copyright © 2002 by the National Science Teachers Association.

Permission is granted in advance for reproduction for purpose of classroom or workshop instruction. To request permission for other uses, send specific requests to:

NSTA PRESS

1840 Wilson Boulevard
Arlington, Virginia 22201-3000
www.nsta.org

NSTA is committed to publishing quality materials that promote the best in inquiry-based science education. However, conditions of actual use may vary and the safety procedures and practices described in this book are intended to serve only as a guide. Additional precautionary measures may be required. NSTA and the author(s) do not warrant or represent that the procedures and practices in this book meet any safety code or standard or federal, state, or local regulations. NSTA and the author(s) disclaim any liability for personal injury or damage to property arising out of or relating to the use of this book including any of the recommendations, instructions, or materials contained therein.

ART AND DESIGN Linda Olliver, Director
Cover image, Photodisc

NSTA WEB Tim Weber, Webmaster
PERIODICALS PUBLISHING Shelley Carey, Director
PRINTING AND PRODUCTION Catherine Lorrain-Hale, Director
PUBLICATIONS OPERATIONS Erin Miller, Manager
sciLINKS Tyson Brown, Manager


About the Authors

Rodney Doran has been a professor of science education at the State University of New York at Buffalo’s Graduate School of Education since 1969. He previously taught physics, Earth science, and math at a Minnesota high school. He holds degrees from the University of Minnesota (BS, 1961), Cornell University (MST, 1966), and the University of Wisconsin (PhD, 1969). Dr. Doran was U.S. Associate Coordinator for the Second International Science Study, and received the State University of New York Chancellor’s Award for Excellence in Teaching in 1998. He has published articles on assessment in Science and Children, Science Scope, and The Science Teacher, and was the author of the National Science Teachers Association 1981 book, Basic Measurement and Evaluation in Science Instruction.

Fred Chan is a vice principal with the Toronto, Ontario, District School Board. He previously taught physics, chemistry, and biology in Guyana and the Caribbean. He holds degrees from McMaster University (BS, 1973), the University of Toronto (BEd, 1975), Niagara University (MS in Ed, 1984), and the University of Bridgeport (MS, 1989), and he expects to receive his PhD from the University at Buffalo. He is a 1996 recipient of the Prime Minister of Canada’s Award for Excellence in Science Teaching. He has written articles on assessment for The Science Teacher and serves on the manuscript review panel for that journal. He has also served on the National Science Teachers Association’s Committee on International Science Education.

Pinchas Tamir is professor emeritus at the School of Education at Hebrew University of Jerusalem in Israel. He was also director of the Israel Science Teacher Center, where he helped introduce BSCS biology to Israel and develop the Israel Matriculation Exam for high school biology. He was Israel’s research coordinator for the Second International Science Study and the Third International Science Study, and he is a visiting scholar at dozens of universities around the world. He holds degrees from Hebrew University of Jerusalem (MS, 1966) and Cornell University (PhD, 1968).

Carol Lenhardt is a retired middle school science teacher. She also was an adjunct faculty member at Buffalo State College and the University of Buffalo. She holds degrees from the University of Rochester (BS, 1956), Syracuse University (MS, 1960), and the University of Buffalo (PhD, 1994). She is a past president and fellow of the Science Teachers Association of New York State (STANYS) and coauthor of Alternative Assessment in Science, Grade 8.


Contents

ABOUT THE AUTHORS .......................................................... v
PREFACE ........................................................................ viii
    Background: The Importance of Assessment ..................... viii
    The Meaning of Inquiry ................................................... ix
    Diagnostic, Formative, and Summative Assessment .............. x
    High-Stakes Tests .......................................................... xi
    Professional Development ............................................... xi
    Organization and Use of This Book ................................. xii
ACKNOWLEDGMENTS ....................................................... xv

CHAPTER 1: A Rationale for Assessment ............................... 2
    The Present State of Assessment ...................................... 2
    The Constructivist Paradigm ............................................ 3
    Assessment’s Changing Nature ......................................... 4
    The Multifaceted Assessment System ................................ 6
    Using Assessment Results—The New Paradigm ................... 8

CHAPTER 2: Developing New Assessments .......................... 12
    Toward Effective Assessment ......................................... 12
    An Assessment Development Model ................................ 12
        State the Purpose .................................................... 13
        Select the Appropriate Task Format ............................ 14
        Write the Task ........................................................ 15
        Modify an Existing Task ............................................ 17
        Use Clear Directions and Questions ............................ 19
        Consider Equity ...................................................... 20
        Clarify Administrative Procedures .............................. 20
        Develop the Scoring Rubric ....................................... 22
        Trial Test the Task ................................................... 23
        Analyze Results ...................................................... 23
        Revise Tasks .......................................................... 24
    Structure ................................................................... 24
    Sequence ................................................................... 25
    Novelty ..................................................................... 26

CHAPTER 3: Alternative Assessment Formats ...................... 28
    What Is “Alternative”? .................................................. 28
    Performance-Based Assessment Formats ......................... 28
        Skills Tasks ............................................................ 28
        Investigations ........................................................ 31
        Extended Investigations ........................................... 32
    Student-Focused Assessment Formats ............................. 35
        Graphic Organizers ................................................. 35
        Concept Maps ........................................................ 35
        Venn Diagrams ....................................................... 38
        Vee Diagramming, or Vee Heuristic ............................ 39
        Portfolios .............................................................. 42
        Oral Presentations and Debate .................................. 43
        Interviews and Conferences ...................................... 46
        Lab Skills Checklists ................................................ 47
        Self, Pair, and Peer Evaluations ................................. 49
        Technological Applications ....................................... 53
    Teacher-Directed Assessment Formats ............................ 53
        Demonstrations ...................................................... 53
        Group Visuals ........................................................ 54

CHAPTER 4: Using Performance Assessment Results ............ 58
    Assessment and Evaluation ........................................... 58
    Norm- and Criterion-Referenced Evaluations ................... 58
    Using Assessment Data ................................................ 60
    Scoring Performance Assessment ................................... 60
    Developing a Scoring Team ........................................... 69
    Scorer Agreement ....................................................... 70
    Reliability ................................................................. 71
    Aggregating Assessment Data and Assigning Grades .......... 72
    Assessment Data Management ....................................... 74
    Using Results of Performance Assessment ....................... 74
    Test Validity .............................................................. 76
    Interpreting and Describing Results ............................... 77
    Program Evaluation .................................................... 79
    Annual Plan .............................................................. 81

CHAPTER 5: Illustrative Assessment Tasks for Biology ......... 84
    Biology Skills Tasks .................................................... 84
    Biology Investigation Tasks ......................................... 107
    Biology Extended Investigation Tasks ............................ 129

CHAPTER 6: Illustrative Assessment Tasks for Chemistry ... 144
    Chemistry Skills Tasks ................................................ 145
    Chemistry Investigation Tasks ..................................... 164
    Chemistry Extended Investigation Task ......................... 172

CHAPTER 7: Illustrative Assessment Tasks for Earth Science ... 176
    Earth Science Skills Tasks ........................................... 176
    Earth Science Investigation Tasks ................................. 191
    Earth Science Extended Investigation Task ..................... 199

CHAPTER 8: Illustrative Assessment Tasks for Physics ........ 204
    Physics Skills Tasks ................................................... 205
    Physics Investigation Tasks ......................................... 236
    Physics Extended Investigation Task ............................. 247

APPENDICES
    Glossary of Assessment Terminology ............................ 254
    National Science Education Standards for Assessment ......... 257
    Complete Bibliography ............................................... 258

INDEX ......................................................................... 265


Preface

Welcome to the second, enlarged edition of Science Educator’s Guide to Laboratory Assessment. This version contains fifteen new assessment tasks, and like the first edition, it presents multiple assessment formats, strategies, models, and templates appropriate for inquiry activities in the grades 7–12 science classroom and laboratory, as well as outdoors. These assessment formats and strategies are based on the most recent research on assessment, instruction, and learning and include many practical examples you can adapt for use in your classroom.

Background: The Importance of Assessment

As science teachers, we face a continual challenge of assessing what students know, are able to do, and value in learning science. Assessment provides insights into students’ rates of progress in conceptual understanding, reinforces productive learning habits, and validates learning activities. Students need recurring, systematic, and regular feedback to understand their own strengths and capabilities in learning and to identify areas for improvement. We now are aware that increased use of formative assessment in science classrooms to modify teaching and to provide feedback to students has powerful positive effects on student learning. A well-designed assessment program, by providing regular and systematic feedback, goes a long way in helping students reflect on their learning. Hence the importance of assessment reform.

The assessment phase of the teaching-learning process is our primary way of “keeping score.” Teachers measure how well students learn new concepts and skills, administrators and policymakers measure the effectiveness of teaching strategies and educational and program policies, and parents use grades and marks to monitor their children’s progress in school. Also, as a society we use data from assessments to compare our national progress in education with that of other nations.

There is a growing tension between the rich, authentic assessments that the science standards suggest and the increased use of large-scale, high-stakes testing. Science teachers need to come to grips with how much we teach “to the test,” and in so doing, how much we narrow the curriculum. We need to balance the requirements of high-stakes testing with designing assessments that provide students with varied opportunities to develop competencies in science and to demonstrate what they know and can do.

Assessment has become increasingly important during the past decade, as educators and policymakers seek reforms to our educational system in response to national and international priorities and challenges. Educators concerned with weak science achievement, low levels of science literacy, and poor international test scores have undertaken major reforms in science instruction. Increased international economic competition has reinforced the importance of excellence in science education as a fundamental priority for every nation to maintain its competitiveness. New insights into how children learn and advances in learning theory, such as constructivism and the identification of alternative and prior conceptions of learning science, have added impetus to assessment reform. As a result, there is a call for widespread use of alternative assessments, and a shift away from textbook- and teacher-centered approaches to instruction.

These reform efforts, embodied in the National Science Education Standards (NRC 1996) and in reform documents such as Project 2061: Benchmarks for Science Literacy (AAAS 1993), call for widespread reform in science instruction and assessment. Old teaching strategies and assessment formats based on behaviorist theories, such as rote memorization and paper-and-pencil examinations, are being replaced with holistic, constructivist approaches that promote problem solving and higher-level thinking. These sophisticated assessments demand the use of a variety of teaching strategies to help students develop their ability to learn and to solve problems in “real-world” situations and contexts.

The Meaning of Inquiry

Inquiry has been and continues to be a concept near and dear to the hearts of science teachers. Bybee (2000) traces the long history of inquiry at least back to John Dewey in the early 1900s. Inquiry has been in and out of favor since then, depending on the reform efforts popular at a particular time.

One source of confusion about inquiry is that it is both a methodology of how scientists investigate natural phenomena and a methodology espoused for facilitating the engagement of students with materials and questions. To add to the confusion, process goals (to include inquiry) have been cited as “content outcomes” since the 1960s (Parker and Rubin 1966). This view is continued with the National Science Education Standards (NRC 1996), which uses inquiry in two ways: as abilities students should develop to be able to design and conduct scientific investigations and as the understandings they should gain about the nature of professional scientific inquiry.

Although inquiry is a mode of gathering information in many academic/scholarly fields, there are some unique aspects of scientific inquiry. In many of the science curriculum projects from the 1960s, inquiry was largely accepted as a collection of science processes (e.g., observing, measuring, predicting, hypothesizing). Currently it is viewed as one set of tools to further the development of scientific explanation. For instance, the Learning Standards for Mathematics, Science, and Technology (New York State Education Department 1996) identifies three key ideas of scientific inquiry:

• The central purpose of scientific inquiry is to develop explanations of natural phenomena in a continuing, creative process.

• Beyond the use of reasoning and consensus, scientific inquiry involves the testing of proposed explanations involving the use of conventional techniques and procedures and usually requiring considerable ingenuity.

• The observations made while testing proposed explanations, when analyzed using conventional and invented methods, provide new insights into phenomena.

Other educators have treated inquiry as virtually synonymous with problem solving and/or critical thinking. Although there is much overlap among these concepts, it may be helpful to make the following distinctions: inquiry tends to focus on developing new information (relationships, concepts, principles); problem solving focuses on finding solutions to problems and is linked with technology; and critical thinking, also described as “rational reasoning,” can be considered to be a set of cognitive strategies that include, for example, deduction and induction.

In this volume, when we refer to inquiry we mean scientific inquiry. One of the clearest descriptions of the term is from the National Science Education Standards:

Scientific inquiry refers to the diverse ways in which scientists study the natural world and propose explanations based on the evidence derived from their work. Inquiry also refers to the activities of students in which they develop knowledge and understanding of scientific ideas, as well as an understanding of how scientists study the natural world. (NRC 1996, 23)

Diagnostic, Formative, and Summative Assessment

The current view is that every assessment consists of three interconnected elements—observation, interpretation, and cognition—that form a triangle. Each element is connected to and dependent on the others. Assessment tasks are designed around cognition, or theories of learning. Student accomplishments provide observations and evidence for an interpretation of how much they know and can do (NRC 2001).

As we design assessment based on current theories of learning, it is important to clarify the meanings of diagnostic, formative, and summative assessment. The National Research Council’s Committee on Classroom Assessment and the National Science Education Standards (NRC 2001) suggest that we ask the following questions to determine what type of assessment we are using:

• Where are we presently? (diagnostic assessment)

• How can we get there? (formative assessment)

• Have we arrived? (summative assessment)

Diagnostic assessment is the use of qualitative and quantitative data and information to determine where students are in terms of their knowledge and skills. The use of this assessment information tells students which areas they are strong in and which areas need academic intervention. This kind of assessment can be informal—for example, interviews, paper-and-pencil tests, and previous academic records. Diagnostic assessment is “low stakes” and answers the question “Where are we presently?”

Formative assessment is also “low stakes” and gives feedback to students about where they are in terms of their knowledge and skills. These assessments are informal and ongoing. The feedback to students should provide a roadmap for “How can we get there?” Using the roadmap, students try new ideas, look at problems differently, and discuss problems with peers and teachers. The roadmap takes us to our destination, which is the standards set forth by your state or school district.

Of our destination, we naturally ask, “Have we arrived?” That is where summative assessments enter the picture. These are culminating assessment tasks that occur at the end of a unit, topic, or course. They are considered “high stakes” (more about this term below) because decisions regarding further study, jobs, and academic standing are based on them. Summative assessments can be paper-and-pencil format or can be a collection of student work collected over time using a portfolio format. Summative assessments are of the highest stakes when the assessment data are used for credentialing purposes such as the awarding of a high school diploma.

The key distinction among these terms is the use and timing of the assessment data. Diagnostic and formative assessments are intended to support student learning. Summative assessment data are used to certify student accomplishments in terms of their knowledge and skills.

High-Stakes Tests

A few more comments on high-stakes tests are appropriate here. Just what are such tests (or assessments) from the point of view of a classroom science teacher? The key to answering this question is to determine the purpose of the assessment. When assessment results are used to give rewards to those students who obtain high test scores, then such assessments (tests) are “high stakes.” (An unwelcome result may be that those students who have low test scores are denied educational opportunities.) Examples of common high-stakes tests are the SAT and the ACT. A recent trend in high-stakes testing is the use of state tests for graduation decisions, such as the awarding of high school diplomas. It is important that these tests satisfy test measurement principles of reliability, validity, and fairness (National Research Council 1999; AERA, APA, and NCME 1999) and that appropriate accommodations be made for English language learners and students with disabilities.

The classroom science teacher’s inclination can be to “teach to the test” in order to maximize students’ opportunities to obtain a high test score and prevent any sanctions against the school or the teacher’s performance. When the majority of class time is spent practicing and reviewing sections of previous tests, however, the curriculum will tend to narrow. In the context of high-stakes testing, good teachers know they can facilitate student learning in a variety of engaging ways (including through the use of the assessment tasks in this book), while familiarizing students with the item format and cognitive demands of the tests. In this way students are provided with the “opportunity to learn” in preparation for the tests.

It should be noted that high-stakes tests are subject to legal challenges when the test scores are used inappropriately. Test results should not be used for purposes for which the test was not designed. For example, the use of tests designed for program evaluation may be inappropriate for making decisions regarding student accountability. Increasingly, test results are being used for more than one purpose. Such use imposes limits on the consequential validity of the test. In addition, the use of the results of a single test as the sole criterion for a high-stakes decision is problematic (AERA, APA, and NCME 1999).

Professional Development

The authors share the belief that the ongoing professional development of teachers is a priority to bring alive the National Science Education Standards. We believe that teachers must be well grounded in their assessment knowledge and be able to use this knowledge in their classroom practice.


In the past, professional development has largely consisted of the one-day workshop where experts use “show and tell” methods to inform teachers about the latest teaching trend. We believe that lasting change in assessment practices will not come about using that disjointed approach. Teachers are professionals; they are active learners who know best what they want to know; and they see their professional development as continuous and ongoing. Our vision for effective assessment-focused professional development for science teachers is that its design must be consistent with appropriate learning theories for adults and must involve the professional’s construction of meaning and knowledge (Loucks-Horsley et al. 1998). School districts and school administrators need to provide support in the form of time and opportunity for science teachers to meet and collaborate in ways to inform and improve classroom assessment practice.

We offer this book, now in its second edition, as a resource to assist science teachers in their ongoing professional development. Many of the ideas will challenge fundamental philosophical beliefs about learning and education. We hope that our colleagues will engage in collaborative discussions to advance their assessment practices. We envisage science teachers working with colleagues in their own schools, school districts, and professional organizations to gain expertise in assessment practices that work with their students.

Organization and Use of This Book

This book has two sections, followed by three appendices. Chapters 1–4 discuss assessment theory, research, and use, and Chapters 5–8 contain model assessments grouped by science discipline. The following provides a brief description of what you will find in each of the book’s chapters.

Chapter 1 discusses the National Science Education Standards and recent research suggesting that instruction move from a primarily behaviorist approach toward constructivist models of learning and instruction. Chapter 2 addresses practical issues related to designing performance assessments that are aligned with the National Standards. Chapter 3 discusses the benefits and drawbacks of various assessment formats, ranging from short, focused tasks to extended investigations. Chapter 4 provides suggestions for using rubrics to establish reliable and consistent scoring of assessments, and for using data to improve both the overall science program and the performance of students.

Chapters 5–8 are disciplinary chapters that provide model assessment examples from biology, chemistry, Earth science, and physics. Most of these examples are complete tasks with information about measuring the skills appropriate for each task, time requirements, and preparing materials and equipment. There are also directions and answer sheets for students, a list of required materials and equipment, and scoring guidelines for evaluating student responses.

The disciplinary assessment tasks are grouped into three sections:

• Skills Tasks: relatively short, and focus on a few specific process skills.

• Investigations: focus on a wide variety of skills. They typically require one or two 40–45-minute class periods for completion. Students can plan and design an investigation, conduct an experiment, and communicate their findings and conclusions.

• Extended Investigations: last for several 40–45-minute class periods and can require several weeks for completion. Extended investigations are examples of curriculum-embedded assessments that align closely with instruction.

You are invited to use these assessments as is, modify them for specific instructional programs or purposes, or use them as models or templates to design entirely new and innovative assessments.

Although the book’s primary focus is on assessing student achievement in the classroom and laboratory, we also include suggestions and examples on using these assessments for program evaluation. Many of the examples also include suggestions for revisions, depending on the uses of the assessment and the availability of materials and equipment.

There are three Appendices: a glossary, the National Science Education Standards for assessment, and a complete bibliography consisting of works cited and other relevant assessment resources—especially those that emphasize hands-on inquiry activities.

You can use this book à la carte by taking as much or as little as you desire to assist you with your assessments. You may first wish to reacquaint or familiarize yourself with the National Science Education Standards, principles of assessment design, and the rationale for new formats of assessment that interface with your evolving instructional pedagogy. Chapters 1–4 and the Appendices are appropriate for these purposes. Once you are comfortable with these concepts, go to Chapters 5–8 and examine the specific assessment examples that are relevant to the science disciplines you teach.

This book is practical in its approach to assessment reform. The assessments with their scoring rubrics have been field-tested by “real” teachers in “real” science classrooms. We hope you find the book useful as a resource as you continue to implement the assessment standards. We also hope you try the assessments with your students, and suggest you modify and revise the tasks to fit your needs. Involving your students at appropriate times in peer and self-reflection will help to embed your assessments in instructional practices.

Works Cited

American Association for the Advancement of Science (AAAS). 1993. Project 2061: Benchmarks for Science Literacy. New York: Oxford University Press.

American Educational Research Association (AERA), American Psychological Association (APA), and National Council on Measurement in Education (NCME). 1999. Standards for Educational and Psychological Testing. Washington, DC: American Educational Research Association.

Bybee, R. 2000. Teaching Science as Inquiry. In Minstrell, J., and Van Zee, E., eds. Inquiring into Inquiry Learning and Teaching in Science. Washington, DC: American Association for the Advancement of Science.

Loucks-Horsley, S., Hewson, P. W., Love, N., and Stiles, K. E. 1998. Designing Professional Development for Teachers of Science and Mathematics. Thousand Oaks, CA: Corwin Press.

National Research Council (NRC). 1996. National Science Education Standards. Washington, DC: National Academy Press.

———. 1999. High Stakes: Testing for Tracking, Promotion, and Graduation. Board on Testing and Assessment, Commission on Behavioral and Social Sciences and Education. Washington, DC: National Academy Press.

———. 2001. Classroom Assessment and the National Science Education Standards. Washington, DC: National Academy Press.

New York State Education Department, University of the State of New York. 1996. Learning Standards for Mathematics, Science, and Technology. Albany: New York State Education Department.

Parker, J. C., and Rubin, L. J. 1966. Process as Content: Curriculum Design and the Application of Knowledge. Chicago: Rand McNally.

U.S. Department of Education, Office for Civil Rights. 2000. The Use of Tests as Part of High-Stakes Decision-Making for Students: A Resource Guide for Educators and Policy-Makers. Available at www.ed.gov/offices/OCR.

Suggested Readings

American Association for the Advancement of Science (AAAS). 1989. Project 2061: Science for All Americans. New York: Oxford University Press.

Black, P., and Wiliam, D. 1998a. Inside the Black Box: Raising Standards through Classroom Assessment. Phi Delta Kappan 80 (2): 139–48.

———. 1998b. Assessment and Classroom Learning. Assessment in Education 5 (1): 7–74.

National Education Goals Panel. 1996. The National Education Goals Report: Executive Summary—Commonly Asked Questions About Standards and Assessment. Washington, DC: National Education Goals Panel.

National Research Council (NRC). 2001. Knowing What Students Know: The Science and Design of Educational Assessment. Committee on the Foundations of Assessment. J. Pellegrino, N. Chudowsky, and R. Glaser, eds. Washington, DC: National Academy Press.

National Science Teachers Association. 1992. Scope, Sequence, and Coordination of Secondary School Science, Volume II: Relevant Research. Arlington, VA: National Science Teachers Association.


Acknowledgments

When a project of this scope is completed, it is important to recognize the many people who have contributed to it. Without the support and acceptance of our families, this book would never have happened. The help and encouragement of Shirley Watt Ireton, who was director of NSTA’s Special Publications division at the time the first edition was published, was crucial at every stage. Her initial suggestions for the book’s focus were both wise and timely, and her continued involvement during all phases was essential. Thanks also to Chris Findlay and the entire Special Publications staff for all their hard work on the first edition, and to Judy Cusick, who was the NSTA Press’s project manager for the second edition. Linda Olliver designed the cover and drew the illustrations for this later edition. Catherine Lorrain-Hale coordinated production and printing for, and laid out, both the first and second editions.

Two funded projects provided many of the assessment tasks in this book. The University at Buffalo/National Opinion Research Center (UB/NORC) joint project developed prototype exams for high school science courses. Darrell Bock and Michelle Zimowski of the University of Chicago’s National Opinion Research Center developed the multiple-choice and open-ended items, while Rod Doran coordinated the laboratory tests at the University at Buffalo. Joan Boorman, Fred Chan, Nicholas Hejaily, and Diana Anderson focused on assessment tasks in separate fields. The New York Alternative Assessment in Science Project was a joint effort of the New York State Education Department and the University at Buffalo, producing a Teachers Guide and Collection of Tasks for grade 4, grade 8, Earth science, and biology. Douglas Reynolds, Robert Allers, and Susan Agruso were co-investigators. Dozens of teachers from western New York State helped revise and trial-test the tasks presented here.

Audrey Champagne, Florence Juillerat, Cornelia Munroe, Mildred Barry, Burt Voss, Mary Kalen Romjue, Angelo Collins, Bill Williams, Lawrence Gilbert, Dwaine Eubanks, and Tony Bartley reviewed the manuscript as it progressed. Gouranga Saha and Jane Anzalone contributed in numerous ways. Special thanks to Lauri Di Matteo for single-handedly entering almost the entire contents of the book onto computer disks.


CHAPTER 1

A Rationale for Assessment

The Present State of Assessment

The roots of our current education system lie in the mass public school programs of the Industrial Revolution. The mechanized assembly lines and standardized processes that dominated that era found their way into education, where they remain deeply embedded today.

Over most of this century, school has been conceived as a manufacturing process in which raw materials (youngsters) are operated upon by the educational process (machinery), some for a longer period than others, and turned into finished products. Youngsters learn in lockstep or not at all (frequently not at all) in an assembly line of workers (teachers) who run the instructional machinery. A curriculum of mostly factual knowledge is poured into the products to the degree they can absorb it, using mostly expository teaching methods. The bosses (school administrators) tell the workers how to make the products under rigid work rules that give them little or no stake in the process. (Rubba et al. 1991)

This assembly-line approach relies heavily upon behaviorist learning theory, which is based on three main concepts: that complex learning can be broken into discrete bits of information; that students learn by making associations between different kinds of perceptions and experiences; and that knowledge is an accumulation of discrete facts and basic skills.

Under behaviorist learning, knowledge is “decomposable” and can be broken into its component parts without jeopardizing understanding or applicability. These decomposable skills can be learned separately using stimulus-response associations. In addition, students can learn knowledge out of context. In other words, if students demonstrate a skill in one context, they should then be able to demonstrate it in different contexts or situations. However, behaviorist learning theory does not address how discrete pieces of information are integrated into a coherent whole. Teachers must assume that students integrate this information elsewhere.

The behaviorist approach still plays a dominant role in schools, and results in learning that relies heavily on the memorization of factual information. In science education, the behaviorist legacy takes the form of teaching and learning that relies heavily on using textbooks as curriculum surrogates, and on having students memorize discrete bits of often unrelated science “factoids.” Assessments aligned with these approaches use formats made up primarily of multiple-choice, true/false, and short-answer questions. Students focus on identifying the “right” answer, as opposed to developing inquiry skills and conceptual understanding.

As a result, our education system has fallen behind in preparing students to cope successfully with the challenges of an increasingly complex and sophisticated world, a world where scientific and technological skills have become significant avenues to success. Students need opportunities to develop problem-solving and interpersonal skills if they are to succeed in this global yet “smaller” world, where many diverse interest groups compete for increasingly scarce resources.

Science teachers are making these necessary “shifts” by implementing changes suggested by reform documents. We now use current findings from research in learning and research in science education to inform ourselves about exemplary practice. Our shifts are coupled with a move away from stimulus-response learning toward learning that is inquiry based and that focuses on previously learned science concepts, alternative conceptions, and conceptual change. Successful learning is context dependent, and is facilitated by interaction among peers. Our assessment reforms must be aligned with these instructional reforms.

The Constructivist Paradigm

A crucial aspect of this shift is to move toward “constructivist” paradigms in our design of science programs and assessments. The constructivist approach begins with a focus on what students already know about the world around them and on their understanding of this world. Using this as a base, educators work to help students develop methods for further educating themselves about the world. The end result is that students come away not only with scientific information but with an analytical way of thinking that they can apply to any number of situations in life.

Recent work in cognitive psychology suggests that meaningful learning occurs in context, and that some skills used in one context do not necessarily transfer to other contexts. Some cognitive skills are general and are used in a wide variety of academic and “real-world” tasks. On the other hand, other cognitive skills are context dependent, and apply to domain-specific knowledge and skills. There is an interface between the learning of cognitive skills and context: some cognitive skills are transferable while others are domain specific (Perkins and Salomon 1989).

Constructivism underlies the National Science Education Standards, published by the National Research Council in 1996. The result of years of deliberations by educators, scientists, government officials, and a wide range of other participants, the National Science Education Standards view science as a process “in which students learn skills, such as observation, inference, and experimentation.” Through inquiry-based learning, “students develop understanding of scientific concepts; an appreciation of the ‘how we know’ what we know in science; understanding of the nature of science; skills necessary to become independent inquirers about the natural world; [and] the dispositions to use the skills, abilities, and attitudes associated with science.” Figure 1.1 provides an outline of standards for assessing a student’s ability to inquire or undertake scientific inquiry.

Figure 1.1: Assessing the Ability to Inquire or the Ability to Do Scientific Inquiry. National Science Education Standards, NRC, 1996.

Identify Questions and Concepts That Guide Scientific Investigations
• formulate a testable hypothesis
• demonstrate the logical connections between the scientific concepts guiding a hypothesis and the design of the experiment

Design and Conduct Scientific Investigations
• formulate a question to investigate
• develop a preliminary plan
• choose appropriate equipment
• take appropriate safety precautions
• clarify controls and variables
• organize and display data
• use evidence, apply logic, and construct arguments for proposed explanations

Use Technology and Mathematics to Improve Investigations and Communications
• use a variety of measuring instruments and calculators in scientific investigations
• use formulas, charts, and graphs for communicating results

Formulate and Revise Scientific Explanations and Models Using Logic and Evidence
• formulate models based upon physical, conceptual, and mathematical concepts
• use logic and evidence from investigations to explain arguments

Communicate and Defend Scientific Arguments
• use accurate and effective means of communication, including writing, following procedures, expressing concepts, and summarizing data
• use diagrams and charts to construct reasoned arguments

Figure 1.2 shows the changing emphases needed to promote inquiry-based learning. As you can see, the National Standards focus on giving students a much greater role in defining problems, designing experiments, and analyzing results. Through this process, students gain the same exhilaration of discovery that practicing scientists experience in their work when they plan and conduct investigations.

Assessment’s Changing Nature

As the nature of science education changes, so must our assessments. In general, assessment becomes a more integral part of the learning process, growing both broader and deeper to probe student understanding. It becomes broader in the sense that it encompasses more varied formats of assessment; it is deeper in terms of measuring more complex skills. As students carry out laboratory investigations that challenge them to increase their conceptual understanding, the distinction between assessment and instruction blurs into a seamless whole, and there is near perfect alignment with standards (outcomes and expectations), programs (instruction), and assessments. As we assess scientific thinking, science inquiry, and problem-solving skills, we must change our instruction to provide students with opportunities to learn and practice these skills.

Figure 1.2: Changing Emphases. National Science Education Standards, NRC, 1996.

The National Science Education Standards envision change throughout the system. The assessment standards encompass the following changes in emphases:

Less Emphasis On → More Emphasis On
• Assessing what is easily measured → Assessing what is most highly valued
• Assessing discrete knowledge → Assessing rich, well-structured knowledge
• Assessing scientific knowledge → Assessing scientific understanding and reasoning
• Assessing to learn what students do not know → Assessing to learn what students do understand
• Assessing only achievement → Assessing achievement and opportunity to learn
• End-of-term assessments by teachers → Students engaged in ongoing assessment of their work and that of others
• Development of external assessments by measurement experts alone → Teachers involved in the development of external assessments

Figure 1.3 depicts a congruence triangle where standards, instruction, and assessment interact in the planning and implementation of successful science programs. If any of the three dimensions does not clearly link or interface with the other dimensions, then we compromise the fairness, credibility, validity, and utility of the assessment. Figure 1.4 provides a checklist that teachers and school administrators can apply to evaluate their assessment programs.


Figure 1.3: Congruence Triangle. Reynolds et al., 1996. Three interacting dimensions, linked by alignment, correlation, and validity:

• Curriculum Standards: frameworks, syllabi, guides, blueprints, benchmarks
• Instructional Program: instructional styles, print materials, equipment, facilities, technologies, communities
• Assessment-Evaluation System: objective tests, performance assessments, portfolios, teacher observations, program evaluations

Figure 1.4: Assessment Checklist. (Answer Yes or No for each question.)

• Do the school, district, or state curriculum guides and assessment frameworks incorporate the National Science Education Standards?
• Are the assessment standards relevant to local perspectives and issues?
• Are the assessment standards developmentally appropriate for the age of students?
• Are the assessment standards challenging to the academic capabilities of students?
• Are the instructional activities of teachers aligned with the assessment standards in use by the school or district?
• Can students distinguish between instruction and assessment?
• Are adequate materials available for student use in the laboratory?
• Are students informed of the criteria for success?
• Are students involved in the development of criteria for success?
• Are the science process skills and content outcomes being measured consistent with the standards in use?
• Do the assessment instruments reflect a variety of formats? Is the assessment system multifaceted?


The Multifaceted Assessment System

Educators have traditionally made wide use of paper-and-pencil examinations, which have typically included multiple-choice, true/false, short-answer, and essay questions. Often these assessments are used primarily at the end of a course or instructional unit as a way of measuring overall student understanding of facts and concepts. The large majority of questions in these examinations or assessment formats tend to measure low-level cognitive skills.

With recent reforms, these assessments are being supplemented with a broad range of assessment tools designed to measure higher-level cognitive skills, such as problem-solving, inquiry, communication, and interpersonal skills. These multifaceted tools can include a variety of assessment formats, as depicted in Figure 1.5.

These varying assessment formats are discussed in greater depth in Chapter 3. They can be used throughout the instructional process to promote student learning.

Figure 1.5: Multifaceted Assessment System. Adapted from Reynolds et al., 1996. The multifaceted assessment system combines student written formats, teacher involved formats, and performance formats, including:

• Multiple choice
• Short answer
• Open/free response
• Essay/journals
• Papers/reports
• Group visuals
• Teacher observations
• Interviews
• Portfolios
• Skills checklist
• Manipulative skills
• Laboratory performance
• Extended investigations
• Projects
• Concept mapping
• Vee heuristic
• Venn diagram
• Presentations


Most of these methods share a common benefit. As you measure student progress during implementation of your science program, you can use the data to adjust instruction and provide assistance to individual students as necessary. The data you collect can also help you adjust overall instructional strategies for use in future science classes.

Teachers can select the teaching strategies best suited to helping students learn new concepts within the confines of their classroom environment. You can also use the most appropriate assessment formats and techniques to determine whether students have mastered new skills and understandings. Just as no one teaching strategy will cover every learning situation, no single assessment format can measure every aspect of student learning.

The assessment formats depicted in Figure 1.5, for example, are contained within neat little cells. While these formats do provide important data about student learning, in reality a given test might fit into more than one category, or even provide information that supports data gathered by several assessment methods.

This book focuses on performance assessments, and how these assessments connect and interface with the National Science Education Standards. Its focus is on performance-based assessments that use the science classroom and laboratory as major contexts for inquiry. Performance-based assessment is by definition “authentic” in nature, because it allows students to demonstrate their science inquiry, reasoning, and understanding skills when challenged with relevant, “real-world” problems. The science laboratory, traditionally under-used as a context for assessment, is an ideal setting for teachers to implement many of the reforms suggested by the National Science Education Standards, state assessment frameworks, and other standards documents, such as the New Standards Project (1997a, 1997b). Figure 1.6 provides an outline of important aspects of performance-based assessment for the science laboratory.

Figure 1.6: Important Aspects of Laboratory Performance-Based Assessment.

The laboratory is an important component of science instruction.

• Certain features are common to all models of laboratory performance-based assessment: a Planning and Designing phase, a Performing or Doing phase, an Analysis and Interpretation of Data phase, and a Conclusions and Making Projections for Future Study phase. The phases are placed in sequence for discussion purposes; in reality, the phases or steps are interrelated, and students can revisit or retrace their thinking at any time to modify their work or investigation.
• The laboratory provides an appropriate context for students to engage in problem-based learning, where they practice and use science process and problem-solving skills.
• Laboratory investigations and tasks by their nature allow students to produce a product and generate, rather than select, responses to questions.
• If appropriately designed, laboratory investigations allow students to generate multiple solutions to novel problems.
• As students produce a product and generate multiple responses to questions, laboratory investigations fit the criteria for being performance based.
• As laboratory performance-based assessment becomes an integral part of science learning, instruction and the nature of what goes on in science classrooms come closer to the vision of assessment laid out in the National Standards. Instruction moves from a “transmission of information” approach to a hands-on, problem-based approach that allows students to integrate new knowledge and skills into their existing cognitive structure.
• The laboratory or practical science is a “holistic activity” (Woolnough 1991) in which students do a task rather than write about something: in essence, a performance-based activity for a limited or extended period of time. This approach is in agreement with the National Science Education Standards for assessment.
• Laboratory investigations, while an exemplar of performance-based assessment, are also an excellent approach to problem-based learning, in which students inquire, debate, and engage in discussion of open-ended problems that have multiple solutions. The entire investigation can focus on a single problem.

This conceptualization of science inquiry and its interface with laboratory performance-based assessment is consistent with the assessment standards provided in the National Science Education Standards, and forms the basic framework for designing performance assessments.

Many traditional assessments have been large-group oriented—that is, a single teacher administering tests to a class. The new assessment formats supplement these formats by focusing on individuals and small groups. Portfolios, interviews, journals, and other assessment formats reinforce individualized instruction, and also accommodate different learning styles, exceptional students, and students with Limited English Proficiency.

Presentations, group and peer evaluations, and projects tap into students’ creativity and planning and speaking skills by providing them with the opportunity to do the same things adults do every day. Life is not a series of true/false or multiple-choice tests. In most “real-world” decision-making and problem-solving situations, adults gather appropriate information, interpret that information using their own experiences and knowledge, and reach appropriate conclusions. In many cases, their decisions have important consequences. In the process, adults discard irrelevant information, search for additional data, and anticipate the consequences of their actions.

They also communicate their decisions, along with their rationale, to others.

A significant component of our current teaching and assessment is based on words—transmitting information to students verbally and through print, and then requiring students to repeat or replicate that information verbally and through writing. But many students learn best by receiving information through visual tools such as charts, data tables, graphs, and sketches. For such students, these kinds of visual stimuli can produce more effective learning. Several of these student performance-based assessment formats—including concept maps, Venn diagrams, and the Vee heuristic (see pages 35–42 for examples of all three)—emphasize visual stimuli.

Alternative response formats offer significant assistance to learners with Limited English Proficiency and other exceptionalities. As teachers, we must be willing to accept many kinds of evidence given by students to demonstrate their understanding of a concept or principle. As there are many ways to demonstrate understanding, we need to go beyond paper-and-pencil assessment formats and embrace alternative assessment formats that reflect a variety of learning styles, cooperative learning in small groups, and the nurturing of multiple intelligences.

Using Assessment Results—The New Paradigm

Science classroom and laboratory assessments are the foundation of a sophisticated process designed to evaluate and improve the science education system. Everyone—from students, teachers, and parents to government officials—uses assessment data to evaluate how well the education system is performing. It’s all part of a growing emphasis on making the education system accountable for its progress. According to the National Science Education Standards:

Assessment is the primary feedback mechanism in the science education system. For example, assessment data provide students with feedback on how well they are meeting the expectations of their teachers and parents, teachers with feedback on how well their students are learning, districts with feedback on the effectiveness of their teachers and programs, and policymakers with feedback on how well policies are working. Feedback leads to changes in the science education system by stimulating changes in policy, guiding teacher professional development, and encouraging students to improve their understanding of science.

Figure 1.7 depicts some of the components in the four-part assessment data collection process designated in the National Standards, and highlights the complexity of assessment and how its different parts all work together to provide a basis for important decisions.

Figure 1.7: Components in the Assessment Data Collection Process. National Science Education Standards, NRC, 1996.

The four components can be combined in numerous ways. For example, teachers use student achievement data to plan and modify teaching practices, and business leaders use per capita educational expenditures to locate businesses. The variety of uses, users, methods, and data contributes to the complexity and importance of the assessment process.

Data Use: plan teaching; guide learning; calculate grades; make comparisons; credential and license; determine access to special or advanced education; develop education theory; inform policy formulation; monitor effects of policies; allocate resources; evaluate quality of curricula, programs, and teaching practices.

Data Collection (to describe and quantify): student achievement and attitude; teacher preparation and quality; program characteristics; resource allocation; policy instruments.

Collection Methods: paper-and-pencil testing; performance testing; interviews; portfolios; performances; observing programs, students, and teachers in the classroom; transcript analysis; expert reviews of education materials.

Data Users: teachers; students; parents; the public; policymakers; institutions of higher education; business and industry; government.


Conclusion

It is clear that assessment is an important, integral part of science education that promotes learning for all students. Teachers use a variety of high-quality assessment instruments to provide feedback to students, parents, administrators, and policymakers. There is no single assessment format that works best for everyone; you must refine your assessments through trial and error to develop a system that works best for your particular situation. Different assessment formats provide different kinds of information used for different purposes. Classroom and laboratory assessments focus on improving student learning by providing feedback to students, while international and national assessments provide data for system accountability.

The next three chapters of this book focus on developing performance assessment tasks, alternative forms of assessment, and the analysis and use of assessment data. These chapters will give you a practical primer on how to improve the assessment process in your classroom or school.

Works Cited

National Research Council (NRC). 1996. National Science Education Standards. Washington, DC: National Academy Press.

New Standards Project. 1997a. Performance Standards. Volume 2: Middle School. Washington, DC: National Center for Education and the Economy (Tel. 202-783-3668).

———. 1997b. Middle School Science Portfolio. Washington, DC: National Center for Education and the Economy (Tel. 202-783-3668).

Perkins, D., and Salomon, G. 1989. Are Cognitive Skills Context-Bound? Educational Researcher 18 (1): 16–25.

Reynolds, D., Doran, R., Allers, R., and Agruso, S. 1996. Alternative Assessment in Science: A Teacher’s Guide. Buffalo: University at Buffalo.

Rubba, P., Miller, E., Schmalz, R., Rosenfeld, L., and Shyamal, K. 1991. Science Education in the United States: Editors’ Reflections. In Science Education in the United States: Issues, Crises and Priorities. Easton, PA: Pennsylvania Academy of Science.

Suggested Readings

Carr, M., Barker, M., Bell, B., Biddulph, F., Jones, A., Kirkwood, V., Pearson, J., and Symington, D. 1994. The Constructivist Paradigm and Some Implications for Science Content and Pedagogy. In The Content of Science—A Constructivist Approach to Its Teaching and Learning, Fensham, P., Gunstone, R., and White, R., eds. Bristol, PA: Falmer Press.

Duit, R., and Treagust, D. 1995. Students’ Conceptions and Constructivist Teaching Approaches. In Improving Science Education, Fraser, B., and Walberg, H., eds. Chicago: National Society for the Study of Education.

National Center on Education and the Economy, University of Pittsburgh. 1997. Performance Standards, Volumes I, II, and III. Washington, DC: National Center on Education and the Economy.

National Research Council. 2000. Inquiry and the National Science Education Standards: A Guide for Teaching and Learning. Washington, DC: National Academy Press.

New York State Education Department, University of the State of New York. 1996. Learning Standards for Mathematics, Science, and Technology. Albany: New York State Education Department.

Woolnough, B. 1991. Practical Science as a Holistic Activity. In Practical Science, Woolnough, B., ed. Bristol, PA: Open University Press.


Yager, R. 1995. Constructivism and the Learning of Science. In Learning Science in the Schools: Research Reforming Practice, Glynn, S., and Duit, R., eds. Mahwah, NJ: Lawrence Erlbaum Associates.
