From State-Wide to State Consortium Assessment Systems: Test Administration Lessons Learned From a Consortium Pilot
National Conference on Student Assessment
June 22, 2013


Page 1

From State-Wide to State Consortium Assessment Systems: Test Administration Lessons Learned From a Consortium Pilot

National Conference on Student Assessment
June 22, 2013

Page 2

PRESENTERS:
Norma Sinclair, Psychometrician and Education Consultant, Connecticut State Department of Education
Juan D’Brot, Executive Director, West Virginia Department of Education
Beth Fultz, Program Consultant, NAEP, Kansas State Department of Education
Paula Hutton, Maine Department of Education
Deborah Matthews, SWD Educational Program Consultant, Kansas State Department of Education

MODERATOR:
Jennifer Paul, EL Assessment Consultant, Michigan Department of Education

Page 3

Overview

– Try out 10,000 new CB Math/ELA test items
– Try out new item types
– Try out test delivery system
– Secure data to analyze the stability of reporting scales
– Secure data to build CAT system

Page 4

Before the Pilot: Expected Scope

– 25 member states
– 1.2 million students
– 9,000 schools
– February to May 2013 testing window
– Untimed pilot administrations
– Desktops, laptops, notebooks, tablet support
– Support multiple operating systems

Page 5

Test Administration Lessons From a Consortium Pilot

Paula Hutton

Maine Department of Education

Norma Sinclair
Connecticut State Department of Education

Paper presented at the NCSA 2013, Washington, D.C.

Page 6

Without standardized test administrations and testing conditions, the accuracy and comparability of score interpretations, as well as student opportunity to demonstrate skills and abilities, could be diminished (AERA, APA, & NCME, 1999).

Page 7

Our Goals

• Describe resources to foster standardization.

• Record pilot participant experiences.
• Implications for transitioning to multi-state SBAC Assessments.

Page 8

Standardizing the Pilot Administration

• Downloadable administration manuals
• Webinars and training modules
• Practice tests
• Help-desk support
• Student instructions (test navigation, tools)
• Recruitment materials, pilot updates

Page 9

Data Collection

• On-site observations:
  – Connecticut
  – Maine

• Email surveys

• Observation notes

Page 10

Pilot Participant Comments: Kudos

• Manuals/modules: comprehensive coverage of fundamentals

• TIDE (Test Information Distribution Engine): efficient for managing students and teachers

• Secure browser easy to install

• Help desk support

Page 11

Standardization Challenges: Highlights

Challenges based on Standard 5 of the test administration standards.

• Administration materials, procedures, resources (5.1)
  – Difficulties with operating systems/internet providers
  – Voluminous/inconsistent manuals and resource materials
  – Reduced performance and ease of use of software (due to multiple TDS technical difficulties)
  – Online tools and student supports: inconsistent quality

• Pilot administration disruptions/modifications (5.2)
  – Untimed pilot testing (non-standardized test lengths)
  – Technical difficulties (arbitrary log-offs, computer freezes, error messages, volume control)

Page 12

Pilot Standardization Challenges Contd.

• Distraction-free testing environment (5.4)
  – Testing in open areas in libraries
  – Technical difficulties (repeated log-offs, freezes, volume control)

• Student test instructions (5.5)
  – Misleading instructions
  – Quality of videos and audio prompts
  – Incorrect pilot component assignments

Page 13

Pilot Standardization Challenges Contd.

• Unfamiliar test equipment and tools (5.5)
  – Incomplete/missing instructions for online tools and student supports
  – Practice items unavailable at the start of the session

• Responding to test items using unfamiliar equipment (5.5)
  – Limited opportunity to learn keyboarding
  – No opportunity to practice using navigation and test tools

Page 14

Pilot Participant Wish List

• Quick start administration guide

• In-person training

• Top-Down Communications

• Have states upload student information

• Minimize technical difficulties

• Reduce the length of field test

• Reduce keyboarding requirements

Page 15

SBAC Field Test Administration: Implications

• Attend to differing needs of CBT and Non-CBT states and their students

• Implement top-down communication system for test administration

• Institute a top-down approach in TIDE to manage students and test administrators

• Improve TDS to reduce technical issues

• More responsive and accurate help desk

• Clarify roles and responsibilities of test administrators (the difference between test administration facilitation and cheating)

Page 16

Test Delivery
Beth Fultz

Kansas Department of Education

National Conference on Student Assessment
Saturday, June 22, 2013

Page 17

Technology Readiness

Page 18

Data

My school was prepared with the technology required to administer the SBAC computer-based Pilot test:

Strongly Agree: 26.4%
Agree: 48.8%
Neither Agree nor Disagree: 6.8%
Disagree: 13.6%
Strongly Disagree: 4.3%

Page 19

Common Issues

• Lack of understanding around the length of the test
• Volume control
• Headphones
• Ability to move to the next question
• Programming issues
• Skipped questions
• Split screens

Page 20

Computer Delivery States

• Assumption: SBAC computer test technology would work just like the state assessment
• Required closer coordination with district IT staff
• Multiple test takers using the same computer on the same day (access point/reboot)

Page 21

Paper-Pencil States

• Bandwidth and wireless connectivity
• Age of school computers
• Unexpected “internal system updates”
• Students not as familiar with online matching, drag-and-drop, and calculator tools
• Lack of experience in dealing with small problems
  – Students getting logged out
  – Browser getting stuck

Page 22

Browser/Delivery Systems

Page 23

Data

How easy or difficult did you find the SBAC delivery system to use:

Very Easy: 8.7%
Easy: 39.8%
Neither Easy nor Difficult: 28.9%
Difficult: 18.8%
Very Difficult: 3.7%

Page 24

Data

Platform   SBAC     Kansas
Windows    81.29%   74.89%
iPad        1.89%   10.15%
Mac        14.74%   14.92%
Other       2.08%    0.04%

Page 25

Common Issues

• Audio
  – Film clips: often audio would not play
  – Re-listen to entire section, not a selected section
• Drag & drop didn’t always work
  – Issue: test would not allow the student to go on to the next question

Page 26

iPad and Tablets

• Keyboards
• Difficulty seeing the entire question/scrolling
• Software would only work if all other programs were closed
• Overall, very successful pilot
• Fewer technology issues

Page 27

Technical Difficulties

Page 28

Data

Did you, or any of the test takers, experience any technical difficulties during the administration of the test:

Yes 78.6%

No 21.4%

Page 29

What technical difficulties did you, or any of the test takers, experience?

Reasons/Responses

Problems with school computer equipment 36.0%

Issues with the testing platform/ the test itself 51.5%

Lack of resources to conduct testing 10.1%

Overall the system was difficult to use 14.2%

Training materials were inadequate 15.0%

Insufficient time allowed for introducing students to the test environment 21.3%

Other 22.1%

Page 30

Conclusions

Page 31

Comment from a Kansas Testing Coordinator:

“Most problems are of the sort that will be resolved before the final test is operational. In other words, the pilot is serving its purpose.”

Page 32

Accessibility and Accommodation for SBAC Pilot Assessment

National Conference on Student Assessment
June 22, 2013

Page 33

Pilot Test

• Universal Design: increased accessibility
  – To include 2% students
  – Only looking to exempt ALT 1%
• Work Group Members
• Contract Work
  – State Practices
  – Literature Reviews
  – Policy Recommendations
• Advisory Committees
  – ELL Advisory Committee
  – SWD Advisory Committee
  – Cross-consortia ELL Advisory Committee

Page 34

Pilot Test

• ELA Writing Tools for Performance Tasks: available to all students

• General tools for Math and ELA: most available to all students

• Accessibility Pilot Studies: content and grades specified

Page 35

Digital Tools – All Grades
Writing ELA Performance Tasks

Universal Digital Tool   All Students
Bold                     Yes
Italics                  Yes
Underline                Yes
Indent                   Yes
Cut                      Yes
Copy                     Yes
Paste                    Yes
Spell Check              Yes
Undo/Redo                Yes

Page 36

Additional Tools and Resources – All Grades

Universal Digital Tool               ELA   Math   All Students
Tab-Enter Navigation                 Yes   Yes    Yes
Font Background Color Alternatives   Yes   Yes    Yes
Breaks                               Yes   Yes    Yes
Additional Time                      Yes   Yes    Yes
Calculator*                          N/A   Yes    Yes

* Note: Calculators available on items when they do not interfere with intended construct.

Page 37

Accessibility Pilot Studies

Features                          Math Grades   ELA Grades
Full Spanish Translation          3, 7, 11      ---
Customized Spanish glossaries     3, 7, 11      ---
Online refreshable Braille*       3, 7, 11      4, 7, 11
Text to speech                    3, 7, 11      4, 7, 11 (items only)
Customized English glossaries     3             4 (items only)

* Note: Special equipment provided locally.

Page 38

English Language Learners and the Pilot Test

• Full Spanish translation
• Customized Spanish glossaries

Page 39

Students with Disabilities and the Pilot Test

• Technology
• Students and Technology
• Braille
• Text to Speech

Page 40

Item Types & Content Areas

• ELA and Math
  – Multiple-choice
  – Constructed Response
  – Performance Tasks
  – Classroom Activities
  – Technology Enhanced

• What new accessibility issues does the item create?

*Special considerations for SWDs and ELLs

Page 41

Technology Enhanced Items

• Unnecessary item format (increased cognitive load)

• Unclear/confusing design and layout
• Appropriateness for visually impaired
• Embedded identification of tools and commands for use
• All included tools are necessary

Page 42

SBAC Efforts

• For Pilot Test:
  – All items went through bias, sensitivity, and accessibility vendor review
  – SBAC A & A workgroup reviewed items

• For Field Test:
  – All items to go through vendor review
  – SBAC A & A workgroup quality check
  – Large, coordinated SBAC review by state members
  – Strengthen training

Page 43

Pilot Studies Results

• Gather information about the process of providing accommodations and results of offering them.

• Provide feedback to states, work groups, experts.

• Incorporate what we learn into field test development work.

• Continue to develop materials and resources that are state-friendly.

Page 44

Test Security
From State to Consortium and Back

National Conference on Student Assessment
June 22, 2013

Page 45

Purpose of the Pilot

– Test thousands of new CB Math/ELA test items
– Test new item types
– Test a new delivery system
– Analyze the stability of reporting scales
– Secure data to build CAT system
– Unnamed: to identify gaps in policy, process, and procedure

Page 46

Timeline and Challenges at Two Levels

• Pilot Test (SY 2012-2013) & Field Test (SY 2013-2014)

• 2 opportunities:
  – Smarter Balanced:
    • System readiness
  – Smarter Balanced & States:
    • Field readiness
    • Policy availability
    • Defined processes and procedures

Page 47

Test Security

• TILSA Test Security Guidebook

• Three Key Areas
  1. Prevention
  2. Detection
  3. Follow-up Investigation

• Primary Goal:
  – Ensure the reliability of results and the validity of student responses
  – Validity: the consequential kind

• Accountability for Schools and Teachers

Page 48

Prevention

• Program Management
  – Security Plans and Staffing
• Test Design and Deployment
  – Item pools mitigating over-exposure
• Test Administration and Scoring
  – Procedures, rules, documentation
• Quality Control
  – Security of items and materials
  – Web monitoring
  – Training and security awareness

Page 49

Detection

• Reporting Protocols
• Investigation Protocols
• Data Forensics (illustrated in the sketch below)
  – Erasure
  – Person-fit
  – Answer change
  – Gains and Losses
  – Similarity/collusion
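To make two of the forensics listed above concrete, here is a minimal, hypothetical sketch: a person-fit screen based on Guttman errors and a crude similarity/collusion screen that counts identical incorrect answers between pairs of students. The data, difficulty ordering, and statistics are invented for illustration and are not SBAC's or TILSA's actual procedures.

```python
# Hypothetical sketch of two data-forensics screens named on this slide:
# a simple person-fit check (Guttman errors) and a pairwise count of
# identical incorrect answers as a rough similarity/collusion signal.
# All data, names, and thresholds are invented for illustration.

from itertools import combinations

# Toy data: each student's selected options and right/wrong scores on 6 items.
responses = {
    "S1": ["A", "C", "B", "D", "A", "C"],
    "S2": ["A", "C", "B", "D", "A", "C"],
    "S3": ["B", "C", "A", "D", "B", "A"],
}
scores = {
    "S1": [1, 1, 0, 1, 0, 0],
    "S2": [1, 1, 0, 1, 0, 0],
    "S3": [0, 1, 1, 1, 0, 1],
}
item_difficulty_order = [1, 3, 0, 2, 5, 4]  # item indices, easiest to hardest (assumed known)

def guttman_errors(score_vec, order):
    """Count cases where an easier item is missed but a harder one is answered correctly."""
    ordered = [score_vec[i] for i in order]
    return sum(1 for e in range(len(ordered)) for h in range(e + 1, len(ordered))
               if ordered[e] == 0 and ordered[h] == 1)

def identical_incorrect(a, b):
    """Count items where two students chose the same wrong option."""
    return sum(1 for ra, rb, sa, sb in zip(responses[a], responses[b], scores[a], scores[b])
               if ra == rb and sa == 0 and sb == 0)

for student, vec in scores.items():
    print(student, "Guttman errors:", guttman_errors(vec, item_difficulty_order))

for a, b in combinations(responses, 2):
    print(a, b, "identical incorrect answers:", identical_incorrect(a, b))
```

Operational forensics would replace these raw counts with formal indices (for example, standardized person-fit statistics and answer-copying statistics) and defensible flagging criteria.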

Page 50

Follow-up Investigations

• Evidence guidelines and criteria
  – What kinds, how much, and from whom?
• Roles and Responsibilities
  – State staff
  – Vendor staff
  – LEA and school staff
• Investigation Tool Kit for SEAs and LEAs
  – Expectations, requirements, roles, responsibilities, and types of information
• Timelines
  – A dedicated and transparent plan

Page 51

The Pilot Test Experience

• States may have been unprepared beyond their current policies

• Pilot test administration manual:
  – Informed administrators about security of the assessment
  – If everyone read it
• Consortium data
  – What kinds of analyses will be conducted?
• State feedback (WV):
  – General adherence to security requirements of state policy
  – Multiple cases of “breach” events
  – A few cases of test impropriety

Page 52

The Pilot Test Experience (cont’d)

• State feedback (WV):
  – A few cases of test impropriety
• Recourse
  – Little, due to lack of a paper trail
  – Non-state-directed process led to a degree of disconnect between LEA and SEA, which affected prevention
  – Lessons learned for Field Test and Operational administrations

Page 53

A Potential State Example

• What if an administrator posed as a student?
  – Threats to:
    • Response validity
    • Item over-exposure
    • Field test scaling
  – What recourse does a state have?
    • Against what state policy?
    • Against what LEA and school required process?
    • Depends on documentation…

Page 54

What’s Needed for the Field Test

• Processes (co-chairs present…)
  – More support from the Smarter Balanced test administration standpoint
  – Increased integration with state-specific best practices
  – States requiring training from the consortium
  – Leads to:
• Documentation of Evidence
  – Signed agreements
  – Training verification
  – Creating a paper trail

Page 55

What States Need for Operational Administration

• Solid Policy and Guidelines
  – Consortium-sponsored guidelines, potentially policy
  – State-defined policies, requirements, and guidance
  – LEA-focused toolkits, plans, and requirements
  – Signed user agreements
  – Minimum standards for states to engage in the consortium

Page 56

What States Need for Operational Administration

• Agreements
  – Vendor agreements to get data/reports
• Data Themselves
  – Student-level responses
    • Answer changing (think erasure)
    • Pattern analyses
    • Person-fit analyses
  – Latency data (see the latency-flagging sketch after this list)
  – Time-stamped data
  – New flagging criteria
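As a rough illustration of how latency and time-stamped data could feed new flagging criteria, the sketch below scans a toy submission log and flags responses entered implausibly soon after a student's previous submission. The log format, field names, and the 5-second cut point are assumptions for illustration only; real criteria would be derived from empirical item-level latency distributions and consortium/state policy.

```python
# Hypothetical sketch of latency-based flagging from time-stamped response
# data, one of the data needs listed above. The log format and the 5-second
# threshold are invented for illustration.

from datetime import datetime

# Toy event log: (student, item, answer, timestamp of submission).
events = [
    ("S1", "item1", "A", "2013-03-05T09:00:12"),
    ("S1", "item2", "C", "2013-03-05T09:00:14"),  # 2 seconds after item1
    ("S1", "item3", "B", "2013-03-05T09:01:40"),
]

FAST_RESPONSE_SECONDS = 5  # assumed cut point for "too fast to have read the item"

def flag_fast_responses(log, threshold=FAST_RESPONSE_SECONDS):
    """Return (student, item, seconds) tuples where the gap since the prior
    submission by the same student is below the threshold."""
    flags = []
    last_seen = {}
    for student, item, _answer, ts in log:
        t = datetime.fromisoformat(ts)
        if student in last_seen:
            gap = (t - last_seen[student]).total_seconds()
            if gap < threshold:
                flags.append((student, item, gap))
        last_seen[student] = t
    return flags

print(flag_fast_responses(events))  # [('S1', 'item2', 2.0)]
```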

Page 57

General Questions?