
TECHNICAL MANUAL


ACT endorses the Code of Fair Testing Practices in Education and the Code of Professional Responsibilities in Educational Measurement, guides to the conduct of those involved in educational testing. ACT is committed to ensuring that each of its testing programs upholds the guidelines in each Code.

A copy of each Code may be obtained free of charge from ACT Customer Services (68), P.O. Box 1008, Iowa City, IA 52243-1008, 319/337-1429.

Visit ACT’s website at www.act.org.

© 2007 by ACT, Inc. All rights reserved.

10421


Contents

Figures . . . vi
Tables . . . vii
Preface . . . viii

Chapter 1  The PLAN Program . . . 1
    Overview and Purpose of the PLAN Program . . . 1
    Code of Fair Testing Practices in Education and Code of Professional Responsibilities in Educational Measurement . . . 2
    Philosophical Basis for the Tests of Educational Development . . . 2
    Administering the PLAN Program . . . 3
        Participation Procedures . . . 3
            Special Testing for Students With Disabilities . . . 3
        Administration Schedules . . . 4
        PLAN Support Materials . . . 4
        ACT’s Standards for Test Administration and Security . . . 4

Chapter 2  The PLAN Tests of Educational Achievement . . . 5
    Description of the PLAN Tests . . . 5
        The English Test . . . 5
        The Mathematics Test . . . 5
        The Reading Test . . . 6
        The Science Test . . . 6
    Test Development Procedures . . . 6
        Reviewing Test Specifications . . . 6
            Content Specifications . . . 6
            Statistical Specifications . . . 6
        Selection of Item Writers . . . 9
        Item Construction . . . 9
        Review of Items . . . 10
        Item Tryouts . . . 10
        Item Analysis of Tryout Units . . . 10
        Assembly of New Forms . . . 10
        Content and Fairness Review of Test Items . . . 11
        Review Following Operational Administration . . . 11
    PLAN Scoring Procedures . . . 12

Chapter 3  ACT’s College Readiness Standards and College Readiness Benchmarks . . . 13
    ACT’s College Readiness Standards . . . 13
        Description of the College Readiness Standards . . . 13
        Determining the Score Ranges for the College Readiness Standards (1997) . . . 13
        Developing the College Readiness Standards . . . 14
            The Process . . . 14
            Conducting an Independent Review of the College Readiness Standards . . . 16
        Refining the College Readiness Standards for EXPLORE and PLAN (2001) . . . 17
        Periodic Review of the College Readiness Standards . . . 17
    Interpreting and Using the College Readiness Standards . . . 18
    ACT’s College Readiness Benchmarks . . . 18
        Description of the College Readiness Benchmarks . . . 18
        Data Used to Establish the Benchmarks for the ACT . . . 18
        Procedures Used to Establish the Benchmarks for EXPLORE and PLAN . . . 19


        Interpreting the EXPLORE and PLAN Benchmarks for Students Who Test With EXPLORE in Grade 9 and PLAN in Grade 11 . . . 19
    Intended Uses of the Benchmarks for Students, Schools, Districts, and States . . . 19
    Interpreting EPAS Test Scores With Respect to Both ACT’s College Readiness Standards and ACT’s College Readiness Benchmarks . . . 19

Chapter 4  Technical Characteristics of the PLAN Tests of Educational Achievement . . . 20
    The Score Scale and Norms . . . 20
        Scaling . . . 20
            The Score Scale . . . 20
            Scaling Process . . . 21
        PLAN National Norming Study . . . 22
            Sample . . . 22
            Weighting . . . 23
            Sample Representativeness . . . 23
            Obtained Precision . . . 23
        Norms for the 2005 National Sample . . . 23
        Scale Score Statistics for the 2005 National Sample . . . 25
        Cumulative Percents for the 2005 National Sample . . . 25
        Estimated ACT Composite Score Interval . . . 30
    Equating . . . 33
    Reliability, Measurement Error, and Effective Weights . . . 33
    Validity . . . 40
        Measuring Educational Achievement . . . 40
            Content Validity Argument for PLAN Scores . . . 40
            PLAN Test Scores . . . 40
            Statistical Relationships Between EXPLORE, PLAN, and ACT Scores . . . 41
            EXPLORE and PLAN College Readiness Benchmarks . . . 41
        Statistical Relationships Between PLAN Scores and High School Course Work and Grades . . . 43
            PLAN Scores and High School Course Work . . . 43
            PLAN Scores and Course Grades . . . 45
            Course Work Associated With Longitudinal Educational Achievement, as Measured by EXPLORE and PLAN Scores . . . 46
                Data and Method . . . 46
                Results . . . 46
                Summary . . . 46
            Course Work Associated With Longitudinal Educational Achievement, as Measured by PLAN and ACT Scores . . . 50
                Data and Method . . . 50
                Results . . . 50
                Summary . . . 51
            High School Course Work Associated With ACT College Readiness Benchmarks . . . 52
                Data and Method . . . 52
                Results . . . 52
                Summary . . . 52
            Growth From Grade 8 to Grade 12 . . . 54
        Using ACT and PLAN Scores for Program Evaluation . . . 55


Chapter 5  The PLAN Interest Inventory, Student Information Section, and the High School Course Information Section . . . 56
    Interest Inventory . . . 56
        Overview . . . 56
        Score Reporting Procedures . . . 56
        Norms . . . 56
            Sampling . . . 56
            Weighting . . . 56
            Precision . . . 56
            Representativeness . . . 57
        Psychometric Support for UNIACT . . . 58
    Student Information, Needs Assessment, and High School Course Information Sections . . . 61
        Student Information and Needs Assessment Sections . . . 61
        High School Course Information Section . . . 61

Chapter 6  Reporting and Research Services . . . 62
    Student Report . . . 62
    Student Score Label . . . 64
    High School List Report . . . 64
    School Profile Summary Report . . . 64
    Optional Supplemental Research and Reporting Services . . . 64
        High Schools . . . 66
        Districts . . . 66
        AIM . . . 66

References . . . 67

Figures

3.1 Score Ranges for EXPLORE, PLAN, and the ACT . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13

4.1 Conditional Standard Error of Measurement for the PLAN Tests . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34

4.2 Conditional Standard Error of Measurement for the PLAN Subscores . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35

4.3 Conditional Standard Error of Measurement for the PLAN Composite Scores . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38

4.4 Conditional Probability of Meeting/Exceeding an ACT English Score = 18, Given Students’ EXPLORE or PLAN English Score . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42

4.5 2005–2006 National PLAN-Tested Students Likely to Be Ready for College-Level Work (In Percent). . . . . . . . . . . . . . 43

4.6 Typical Chances of Meeting the College Readiness Benchmark for College Algebra by Specific Mathematics Course Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53

4.7 EPAS Growth Trajectories Calculated at Grades 8 through 12 for Expected Within-School Average E(Yij), High School A, and High School B. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54

5.1 World-of-Work Map. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59

5.2 UNIACT Technical Manual Table of Contents . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60

6.1 Front of PLAN Student Score Report. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63

6.2 Back of PLAN Student Score Report . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63

6.3 PLAN Student Score Label . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64

6.4 PLAN High School List Report (With Coding Key). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65

Tables

1.1 Components of EPAS. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2

2.1 Content Specifications for the PLAN English Test . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7

2.2 Content Specifications for the PLAN Mathematics Test. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8

2.3 Content Specifications for the PLAN Reading Test . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8

2.4 Content Specifications for the PLAN Science Test . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9

2.5 Difficulty Distributions, Mean Discrimination Indices, and Average Completion Rates for the 2003–2006 PLAN Operational Administrations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11

3.1 Illustrative Listing of Mathematics Items by Score Range . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14

3.2 Number of Items Reviewed During 1997 National Review . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15

3.3 Percentage of Agreement of 1997 National Expert Review. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16

3.4 Percentage of Agreement of 2000 National Expert Review. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17

3.5 ACT’s College Readiness Benchmarks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18

4.1 Properties of the Score Scale for PLAN. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20

4.2 Selected Demographic and Educational Characteristics of Grade 10 Students for the 2005 PLAN Norm Group . . . . . 24

4.3 Scale Score Summary Statistics for a National Sample . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25

4.4 Scale Score Summary Statistics for a College-Bound Sample . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25

4.5 PLAN 2005 National Norms for Fall Grade 10. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26

4.6 PLAN 2005 National Norms for Spring Grade 10 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27

4.7 PLAN 2005 College-Bound Norms for Fall Grade 10 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28

4.8 PLAN 2005 College-Bound Norms for Spring Grade 10 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29

4.9 Estimated ACT Composite Score Intervals Using PLAN 2002–ACT 2004 Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31

4.10 Estimated ACT Composite Score Intervals for PLAN Composite Scores . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32

4.11 Estimated Reliabilities and Standard Errors of Measurement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36

4.12 Scale Score Covariances, and Effective Weights . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39

4.13 Observed Correlations (On and Above Diagonal) and Disattenuated Correlations (Below Diagonal) Between PLAN Test Scores and Subscores . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40

4.14 Correlations, Observed and (Disattenuated), Between EXPLORE, PLAN, and ACT Test Scale Scores . . . . . . . . . . . . 41

4.15 Average PLAN Scores by Courses Taken in High School . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44

4.16 Correlations Between PLAN Scores and Course Grades . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45

4.17 Means and Correlations for PLAN Score and High School Grade Point Average. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46

4.18 Percentages and Zero-Order Correlation Coefficients for Blocks of Independent Variables That Met the Criteria of Selection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47

4.19 Distributions, Across Schools, of Regression Statistics for Modeling PLAN Mathematics, Science, and Composite Scores . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48

4.20 Distributions, Across Schools, of Regression Statistics for Modeling PLAN English and Reading Scores. . . . . . . . . . . 49

4.21 Hierarchical Linear Regression Coefficients for Modeling ACT Scores . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51

4.22 Hierarchical Logistic Regression Coefficients for Modeling the Probability of Students’ Meeting or Exceeding ACT College Readiness Benchmarks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53

5.1 Selected Characteristics of Grade 10 UNIACT Norm Group Students and Schools. . . . . . . . . . . . . . . . . . . . . . . . . . . . 57


Preface

This manual contains information, primarily of a technical nature, about the PLAN® program. The principal focus of this manual is to document the PLAN program’s technical adequacy in light of its intended purposes.

The content of this manual responds to requirements of the testing industry as established in the Code of Professional Responsibilities in Educational Measurement (NCME Ad Hoc Committee on the Development of a Code of Ethics, 1995), the Standards for Educational and Psychological Testing (AERA, APA, & NCME, 1999), and the Code of Fair Testing Practices in Education (Joint Committee on Testing Practices, 2004).

ACT regularly conducts research studies as part of the ongoing formative evaluation of its programs. These studies are aimed at ensuring that the programs remain technically sound. Information gleaned from these studies is also used to identify aspects of the programs that might be improved or enhanced. The information reported in this manual was derived from studies that have been conducted for the PLAN program since its inception. ACT will continue to conduct research on the PLAN program and will report future findings in updated versions of this manual. Those who wish to receive more detailed information on a topic discussed in this manual, or on a related topic, are encouraged to contact ACT.

Qualified researchers wishing to access data in order to conduct research designed to advance knowledge about the PLAN program are also encouraged to contact ACT. As part of its involvement in the testing process, ACT is committed to continuing and supporting such research, documenting and disseminating its outcomes, and encouraging others to engage in research that sheds light on the PLAN program and its uses. Please direct comments or inquiries to Elementary and Secondary School Programs, ACT Development Area, P.O. Box 168, Iowa City, Iowa 52243-0168.

Iowa City, Iowa
December 2007



Chapter 1
The PLAN® Program

Overview and Purpose of the PLAN Program

The comprehensive PLAN® program from ACT helps 10th-grade students make the most of their opportunities and helps guide them in future educational and career planning.

Like all of ACT’s assessment programs, PLAN is based on the belief that young people—and their parents, teachers, counselors, and school administrators—will make more productive plans and decisions if they have organized, relevant information available when they need it most.

PLAN is designed to be an every-student program administered in Grade 10 to provide students with an early indication of their educational progress in the context of the post–high school educational and career options they are considering. The results from PLAN can be used to help students make adjustments in their course work to help ensure that they are prepared for what they want to do after high school. High schools use the data in academic advising and counseling.

PLAN includes four multiple-choice tests—English, Mathematics, Reading, and Science. PLAN also collects information about student interests, needs, plans, and selected background characteristics that can be useful in guidance and planning activities.

ACT makes available to PLAN test takers and prospective PLAN test takers various materials about test preparation and the interpretation of test results. An overview of the test and a selection of sample test questions are available to students online at www.actstudent.org/plan. The Student Score Report each examinee receives after testing contains sections about the student’s scores, plans, career possibilities, and skills. The report is accompanied by a booklet, Using Your PLAN Results, which provides interpretive information about the test results and provides suggestions for making educational plans, for building academic skills, and for exploring occupations.

PLAN functions as a stand-alone program or as the midpoint of the secondary-school level of ACT’s Educational Planning and Assessment System (EPAS™)—an integrated series of assessment programs that includes EXPLORE®, PLAN, and the ACT®. When used together, the assessments in the EPAS system give educators at the middle-school and secondary-school levels a powerful, interrelated sequence of instruments to measure student development from Grade 8 through Grade 12.

The EXPLORE, PLAN, and ACT programs are scored along a common scale extending from 1 to 36; the maximum score on EXPLORE is 25, the maximum PLAN score is 32, and the maximum ACT score is 36. Because they are reported on the same score scale, EPAS assessment results inform students, parents, teachers, and counselors about individual student strengths and weaknesses while there is still time to address them.

The EXPLORE, PLAN, and ACT assessments provide information about how well a student performs compared to other students. They also provide standards-based interpretations through the EPAS College Readiness Standards™—statements that describe students’ performance in terms of the knowledge and skills they have acquired. Because the College Readiness Standards focus on the integrated, higher-order thinking skills that students develop in Grades K–12 and that are important for success both during and after high school, the Standards provide a common language for secondary and postsecondary educators.

Using the College Readiness Standards, secondary educators can pinpoint the skills students have and those they are ready to learn next. The Standards clarify college expectations in terms that high school teachers understand. The Standards also offer teachers guidance for improving instruction to help correct student deficiencies in specific areas. EPAS results can be used to identify students who are on track to being ready for college. ACT’s College Readiness Benchmark Scores—for English Composition, Algebra, Social Sciences, and Biology—were developed to help identify EPAS examinees who would likely be ready to do college-level work in these courses or course areas. Chapter 3 gives details about the College Readiness Standards and Benchmarks.

EPAS is designed to help students plan for further education and explore careers, based on their own skills, interests, and aspirations. EPAS results give schools a way to get students engaged in planning their own futures. When they know what colleges expect, in terms they can understand, students can take ownership and control of their information, and they can use it to help make a smooth transition to postsecondary education or training. Table 1.1 summarizes the EPAS components.


Code of Fair Testing Practices in Education and Code of Professional Responsibilities in Educational Measurement

Since publication of the original edition in 1988, ACT has endorsed the Code of Fair Testing Practices in Education (Joint Committee on Testing Practices, 2004), a statement of the obligations to test takers of those who develop, administer, or use educational tests and test data. The development of the Code was sponsored by a joint committee of the American Association for Counseling and Development, Association for Measurement and Evaluation in Counseling and Development, American Educational Research Association, American Psychological Association, American Speech-Language-Hearing Association, and National Council on Measurement in Education to advance, in the public interest, the quality of testing practices.

The Code sets forth fairness criteria in four areas: developing and selecting appropriate tests, administering and scoring tests, reporting and interpreting test results, and informing test takers. Separate standards are provided for test developers and for test users in each of these four areas.

ACT’s endorsement of the Code represents a commitment to vigorously safeguard the rights of individuals participating in its testing programs. ACT employs an ongoing review process whereby each of its testing programs is routinely reviewed to ensure that it upholds the standards set forth in the Code for appropriate test development practice and test use.

Similarly, ACT endorses and is committed to complying with the Code of Professional Responsibilities in Educational Measurement (NCME Ad Hoc Committee on the Development of a Code of Ethics, 1995), a statement of professional responsibilities for those who develop assessments; market and sell assessments; select assessments; administer assessments; interpret, use, and communicate assessment results; educate about assessment; and evaluate programs and conduct research on assessments.

A copy of each Code may be obtained free of charge from ACT Customer Services (68), P.O. Box 1008, Iowa City, Iowa 52243-1008, 319/337-1429.

Table 1.1
Components of EPAS

Career and education planning
    Grades 8/9 (EXPLORE): Interest Inventory; Needs Assessment
    Grade 10 (PLAN): Interest Inventory; Course Taking; Needs Assessment
    Grades 11/12 (ACT): Interest Inventory; Course Taking and Grades; Student Profile

Objective assessments
    Grades 8/9 (EXPLORE): English; Mathematics; Reading; Science
    Grade 10 (PLAN): English; Mathematics; Reading; Science
    Grades 11/12 (ACT): English; Mathematics; Reading; Science; Writing (optional)

Instructional support
    Grades 8/9 (EXPLORE): College Readiness Standards; College Readiness Standards Information Services
    Grade 10 (PLAN): College Readiness Standards; College Readiness Standards Information Services
    Grades 11/12 (ACT): College Readiness Standards; College Readiness Standards Information Services

Evaluation
    Grades 8/9 (EXPLORE): Summary Reports; EXPLORE/PLAN Linkage Reports
    Grade 10 (PLAN): Summary Reports; EXPLORE/PLAN Linkage Reports; PLAN/ACT Linkage Reports
    Grades 11/12 (ACT): Summary Reports; PLAN/ACT Linkage Reports

Philosophical Basis for the Tests of Educational Development

The PLAN multiple-choice tests of educational development share a common philosophical basis with the EXPLORE and ACT tests. The three EPAS testing programs measure student development in the same curriculum areas of English, mathematics, reading, and science.


In simplest terms, the principal difference between the three testing programs is that they focus on knowledge and skills typically attained at different times in students’ secondary-school experience. The ACT, for college-bound 11th and 12th graders, focuses on knowledge and skills attained as the cumulative effect of school experience. PLAN, intended for all 10th graders, focuses on knowledge and skills typically attained early in students’ secondary school experience (by Grade 10), and EXPLORE, intended for all students in eighth and ninth grades, focuses on knowledge and skills usually attained by Grade 8.

Because the content of the PLAN tests of educational development is linked to the ACT framework, understanding the philosophical basis of the PLAN tests requires an appreciation of the philosophical basis of the ACT.

The ACT tests of educational development are designed to measure how prepared students are to achieve the general academic goals of college. The principal philosophical basis for the ACT is that college preparedness is best assessed by measuring, as directly as possible, the academic skills that students will need in order to perform college-level work. Complexity is certainly a characteristic of such skills. Thus, the ACT tests are designed to determine how skillful students are in solving problems, grasping implied meanings, drawing inferences, evaluating ideas, and making judgments. In addition, the ACT tests of educational development are oriented toward the general content areas of college and high school instructional programs. The test questions require students to integrate the knowledge and skills they possess in major curriculum areas with the stimulus material provided by the test. Briefly, then, the philosophical basis for the ACT tests rests on two pillars: (a) the tests should measure academic skills necessary for education and work after high school and (b) the content of the tests should be related to major curriculum areas.

Tests of general educational development are used in the ACT, PLAN, and EXPLORE because, when compared to other types of tests, it was judged that they better satisfy the diverse requirements of tests intended to facilitate the transition to high school, college, or work. By contrast, measures of examinee knowledge of specific course content (as opposed to curriculum areas) often do not provide a common baseline for comparisons of students, because courses vary so much across schools, and even within schools. In addition, course-specific tests may not measure students’ skills in problem solving and in the integration of knowledge from a variety of courses.

Tests of educational development can also be contrasted with tests of academic aptitude. The stimuli and test questions for aptitude tests are often purposefully chosen to be dissimilar to instructional materials, and each test within a battery of aptitude tests is usually designed to be homogeneous in psychological structure. Consequently, aptitude tests are often not designed to reflect the complexity of course work or the interactions among the skills measured.

Also, because tests of educational development measure many of the same skills that are taught in school, the best preparation for such tests should be course work. Thus, tests of educational development should send students a clear message that high test scores are not simply a matter of innate ability—they reflect a level of achievement that has been earned as a result of hard work and dedication in school.

Finally, the ACT, PLAN, and EXPLORE tests are intended to reflect educational goals that are widely accepted and judged by educators to be important for success in college and work. As such, the content of the tests is designed with educational considerations, rather than statistical and empirical techniques, given paramount importance. For example, content representativeness of the tests is more important than choosing the most highly discriminating items.

Administering the PLAN Program

The PLAN program is available for administration September through June each year. Consult the PLAN Administrator’s Handbook for further instructions about scheduling your testing and ordering materials.

Participation Procedures

Each spring ACT activates its online ordering system for PLAN test materials. Schools are notified of this activation and asked to go online and place orders for the next academic year, or to contact PLAN Customer Services to place their order. Schools that order online can then track their order through the shipping process approximately 4–5 weeks prior to their test date.

Schools may choose to test all of their students in a given grade (census testing) or to make the testing optional.

Special Testing for Students With Disabilities. Special provisions are available for administering PLAN to students who have diagnosed learning or physical disabilities that require extended time or special materials. Special testing materials available include large-type and Braille test books for visually impaired students, audio recordings of test books on audio cassette or CD, and reader’s scripts for oral presentation of the test items. Order forms for special format materials are provided to schools with the letter confirming PLAN materials orders. Schools are encouraged to administer special tests on the same day as the timed testing session. However, if necessary, special testing can be conducted on any of the PLAN testing dates.

Norms (cumulative percents of examinees who scored at or below the earned scale score of the examinee) reported for students testing under special conditions are based on the same group as the norms reported for examinees testing under standard conditions. The sample of examinees (described in detail in chapter 4 of this manual) used in developing these norms does not include students who required extended time or special materials. This fact should be taken into consideration when interpreting test results of these students.
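To make the cumulative-percent definition of a norm concrete, the short sketch below computes, for a set of scale scores, the percent of examinees scoring at or below each score. It is only an illustrative calculation: the function name and the toy scores are invented for the example, and operational PLAN norms are derived from the weighted national sample described in chapter 4.

from collections import Counter

def cumulative_percent_norms(scale_scores):
    """Return {scale score: percent of examinees at or below that score}.

    A minimal sketch of the cumulative-percent norm described above;
    real PLAN norms are built from a weighted national sample.
    """
    counts = Counter(scale_scores)
    total = len(scale_scores)
    norms = {}
    running = 0
    for score in sorted(counts):
        running += counts[score]
        norms[score] = round(100.0 * running / total, 1)
    return norms

# Toy example: five examinees with PLAN-style scale scores (1 to 32)
print(cumulative_percent_norms([14, 17, 17, 20, 25]))
# -> {14: 20.0, 17: 60.0, 20: 80.0, 25: 100.0}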


Administration Schedules

The PLAN program has been designed to be administered within a half day during school-supervised sessions. It takes about 3 hours and 15 minutes to complete the entire program: approximately 60–70 minutes for the non-test sections and 2 hours and 10 minutes for the four tests of educational development. The PLAN procedures and materials have been designed to allow schools the option of dividing the administration over two or more days. The non-test sections (student plans and background information, Interest Inventory, and course/grade information) may be administered in a nonsecure, supervised school setting on or before the test day. The four tests of educational development must be administered in a single session on the designated test day. Consult the PLAN Administrator’s Handbook for information about makeup testing.

PLAN Support Materials

PLAN includes a coordinated set of support materials to help students, parents, teachers, counselors, and administrators understand the purposes of the program and the information provided.

• The PLAN Administrator’s Handbook, designed to be used by the administrators of PLAN, shows how the PLAN program can be used to help students build a solid foundation for future academic and career success.

• An introductory brochure for students and their parents, Why Take PLAN?, provides a brief overview of the PLAN program and tips to help students do their best.

• Test materials are composed of Student Assessment Sets—the test booklet, an answer folder, and Instructions for Completing Your Answer Folder. One Administrator’s folder is shipped with each order, consisting of an Administrator’s Handbook, a copy of Directions for Testing, a combination of large and small posters to promote the test, and an order form for Pre-ID labels for test forms. Reports are routinely shipped from ACT about three weeks from the receipt of the completed answer folders.

• Each student who participates in PLAN will receive Using Your PLAN Results, which includes sections on interpreting the Student Score Report, planning for high school and beyond, career possibilities, building academic skills, and coursework planning.

• The Item-Response Summary Report provides tables describing, item by item, the performance of your PLAN examinees. These results are categorized by test (e.g., English), by subscore (e.g., Usage/Mechanics), and by content area (e.g., Punctuation) and provide comparisons to the performance of other students taking the same test form.

• College Readiness Standards help students, teachers, counselors, and others to more fully understand what students who score in various score ranges are likely to know and to be able to do in each academic area assessed: English, mathematics, reading, and science. The Standards are complemented, in ACT’s College Readiness Standards Information Services, by “ideas for progress,” which are suggestions for learning experiences that students might benefit from if they wish to progress to higher levels of achievement. The PLAN College Readiness Standards are discussed in chapter 3.

• Linkage Reports assist counselors in evaluating student academic development and progress as students move through EXPLORE, PLAN, and the ACT. EXPLORE/PLAN Linkage Reports are based on a process of matching EXPLORE and PLAN student records and analyzing the changes in student performance between Grade 8 or 9 and Grade 10. The match process reflected in PLAN/ACT Linkage Reports between PLAN and ACT records allows analysis of changes between Grade 10 and Grade 12.

ACT’s Standards for Test Administration and Security

ACT provides specific guidelines for test administration and materials handling in order to maintain testing conditions as uniform as possible across all schools. Test supervisors are provided with a copy of the PLAN Administrator’s Handbook and Directions for Testing. These documents provide detailed instructions about all aspects of test administration. Among other standard procedures, Directions for Testing includes instructions to be read aloud to the examinees. The instructions are to be read without departure from the specified text in order to maintain standard testing conditions.


Chapter 2
The PLAN Tests of Educational Achievement

Description of the PLAN Tests

PLAN contains four multiple-choice tests—English, Mathematics, Reading, and Science. These tests are designed to measure students’ curriculum-related knowledge and the complex cognitive skills important for future education and careers. PLAN results provide 10th-grade students with the information they need to continue making plans for high school and beyond.

The fundamental idea underlying the development and use of these tests is that the best way to determine how well prepared students are for further education and for work is to measure as directly as possible the knowledge and skills needed in those settings. ACT conducted a detailed analysis of three sources of information to determine which knowledge and skills should be measured by PLAN. First, we studied the objectives for instruction in Grades 7 through 12 for all states that had published them. Second, we reviewed textbooks on state-approved lists for courses in Grades 7 through 12. Third, we consulted educators at grade levels 7 through 12 and at the postsecondary level to determine the knowledge and skills taught in Grades 7 through 12 that are prerequisite to successful performance in high school and beyond. Information from these sources helped to define a scope and sequence for each of the areas measured by PLAN.

Curriculum study is ongoing at ACT. Curricula in each content area (English, mathematics, reading, and science) in the PLAN tests are reviewed on a periodic basis. ACT’s analyses include reviews of tests, curriculum guides, and national standards; surveys of current instructional practice; and meetings with content experts (see ACT, ACT National Curriculum Survey® 2005–2006, 2007).

The PLAN tests are designed to be developmentally and conceptually linked to those of EXPLORE and the ACT. To reflect that continuity, names of the multiple-choice tests are the same across the three programs. The programs are similar in their focus on critical thinking skills and in their common curriculum base. Specifications for the PLAN program are consistent with, and should be seen as a logical extension of, the content and skills measured in the EXPLORE and ACT programs.

The English Test

The PLAN English Test (50 items, 30 minutes) measures the student’s understanding of the conventions of standard written English (punctuation, grammar and usage, and sentence structure) and of rhetorical skills (strategy, organization, and style). The test stresses the analysis of the kinds of prose that students are required to read and write in most secondary and postsecondary programs, rather than the rote recall of rules of grammar. The test consists of several prose passages, each accompanied by a number of multiple-choice test items. Different passage types are employed to provide a variety of rhetorical situations.

Some items refer to underlined portions of the passage and offer several alternatives to the portion underlined. The student must decide which choice is most appropriate in the context of the passage. Some items ask about an underlined portion, a section of the passage, or the passage as a whole. The student must decide which choice best answers the question posed. Many items offer “NO CHANGE” to the passage as one of the choices.

Two subscores are reported for this test: a Usage/Mechanics subscore based on 30 items and a Rhetorical Skills subscore based on 20 items.

The Mathematics Test

The PLAN Mathematics Test (40 items, 40 minutes) measures the student’s mathematical reasoning. The test emphasizes quantitative reasoning rather than memorization of formulas or computational skills. In particular, it emphasizes the ability to solve practical quantitative problems that are encountered in many first- and second-year high school courses (pre-algebra, first-year algebra, and plane geometry). While some material from second-year courses is included on the test, most items, including the geometry items, emphasize content presented before the second year of high school. The items included in the Mathematics Test cover four skill areas: knowledge and skills, direct application, understanding concepts, and integrating conceptual understanding.

Calculators, although not required, are permitted for use on the Mathematics Test. Almost any four-function, scientific, or graphing calculator may be used on the Mathematics Test. A few restrictions do apply to the calculator used. These restrictions can be found in the current year’s PLAN Administrator’s Handbook or on ACT’s website at www.act.org.

The items in the Mathematics Test are classified according to four content categories: Pre-Algebra, Elementary Algebra, Coordinate Geometry, and Plane Geometry. Two subscores are reported for the Mathematics Test: Pre-Algebra/Algebra and Geometry, based on 22 and 18 items, respectively.


The Reading Test

The PLAN Reading Test (25 items, 20 minutes) measures the student’s level of reading comprehension as a product of skill in referring and reasoning. That is, the test items require students to derive meaning from several texts by: (a) referring to what is explicitly stated and (b) reasoning to determine implicit meanings. Specifically, items will ask the student to use referring and reasoning skills to determine main ideas; locate and interpret significant details; understand sequences of events; make comparisons; comprehend cause-effect relationships; determine the meaning of context-dependent words, phrases, and statements; draw generalizations; and analyze the author’s or narrator’s voice or method. The test comprises three prose passages that are representative of the level and kinds of text commonly encountered in 10th-grade curricula; passages on topics in the social sciences, prose fiction, and the humanities are included. Each passage is preceded by a heading that identifies what type of passage it is (e.g., “Prose Fiction”), names the author, and may include a brief note that helps in understanding the passage. Each passage is accompanied by a set of multiple-choice test items. These items do not test the rote recall of facts from outside the passage, isolated vocabulary questions, or rules of formal logic. Rather, the test focuses upon the complex of complementary and mutually supportive skills that readers must bring to bear in studying written materials across a range of subject areas.

The Science Test

The PLAN Science Test (30 items, 25 minutes) measures scientific reasoning skills acquired through Grade 10. The test presents five sets of scientific information, each followed by a number of multiple-choice test items. The scientific information is conveyed in one of three different formats: data representation (graphs, tables, and other schematic forms), research summaries (descriptions of several related experiments), or conflicting viewpoints (expressions of several related hypotheses or views that are inconsistent with one another). The items require students to recognize and understand the basic features of, and concepts related to, the information provided; to examine critically the relationships between the information provided and the conclusions drawn or hypotheses developed; and to generalize from given information to gain new information, draw conclusions, or make predictions.

The Science Test is based on materials drawn from thecontent areas of biology, the Earth/space sciences, chem-istry, and physics. The test emphasizes scientific reasoningskills over recall of scientific content, skill in mathematics, orskill in reading.

Test Development Procedures

This section describes the procedures that are used in developing the four multiple-choice tests described above. The test development cycle required to produce each new form of the PLAN tests takes as long as two and one-half years and involves several stages, beginning with a review of the test specifications.

Reviewing Test Specifications

Two types of test specifications are used in developing the PLAN tests: content specifications and statistical specifications.

Content specifications. Content specifications for the PLAN tests were developed through the curricular analysis discussed above. While care is taken to ensure that the basic structure of the PLAN tests remains the same from year to year so that the scale scores are comparable, the specific characteristics of the test items used in each specification category are reviewed regularly. Consultant panels are convened to review the new forms of the test in order to verify their content accuracy and the match of the content of the tests to the content specifications. At this time, the characteristics of the items that fulfill the content specifications are also reviewed. While the general content of the test remains constant, the particular kinds of items in a specification category may change slightly. The basic structure of the content specifications for each of the PLAN multiple-choice tests is provided in Tables 2.1 through 2.4.

Statistical specifications. Statistical specifications for the tests indicate the level of difficulty (proportion correct) and minimum acceptable level of discrimination (biserial correlation) of the test items to be used.

The tests are constructed to have a mean item difficulty of about .58 for the PLAN national population and a range of difficulties from about .20 to .89. The distribution of item difficulties was selected so that the tests will effectively differentiate among students who vary widely in their level of achievement.

The items in the PLAN tests are selected to have a biserial correlation of at least 0.20 with scores on a test measuring comparable content. For example, examinees' performance on each English item should have a biserial correlation of 0.20 or higher with their performance on the English Test.
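To make these two statistical indices concrete, the following minimal Python sketch computes an item's difficulty (proportion correct) and its biserial correlation with the total score on the relevant test. It is an illustration only, not ACT's operational software; the function names and input shapes are assumed for the example, and the biserial is computed in its usual normal-ordinate form.

```python
import numpy as np
from scipy.stats import norm

def item_difficulty(item_scores):
    """Proportion of examinees answering the item correctly (item_scores are 0/1)."""
    return float(np.mean(item_scores))

def biserial(item_scores, total_scores):
    """Biserial correlation between a dichotomous item score and a total test score."""
    item_scores = np.asarray(item_scores, dtype=float)
    total_scores = np.asarray(total_scores, dtype=float)
    p = item_scores.mean()                      # item difficulty (proportion correct)
    q = 1.0 - p
    mean_right = total_scores[item_scores == 1].mean()
    mean_wrong = total_scores[item_scores == 0].mean()
    sd = total_scores.std()                     # standard deviation of total scores
    y = norm.pdf(norm.ppf(p))                   # normal ordinate at the p-quantile
    return (mean_right - mean_wrong) / sd * (p * q) / y
```

Under the specifications above, an item whose difficulty falls outside roughly .20 to .89, or whose biserial with the relevant test score falls below 0.20, would not be used.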


Table 2.1
Content Specifications for the PLAN English Test

Six elements of effective writing are included in the English Test. These elements and the approximate proportion of the test devoted to each are given below.

Content/Skills              Proportion of test    Number of items
Usage/Mechanics                    .60                  30
  Punctuation                      .14                   7
  Grammar and Usage                .18                   9
  Sentence Structure               .28                  14
Rhetorical Skills                  .40                  20
  Strategy                         .12                   6
  Organization                     .14                   7
  Style                            .14                   7
Total                             1.00                  50

a. Punctuation. The items in this category test the student's knowledge of the conventions of internal and end-of-sentence punctuation, with emphasis on the relationship of punctuation to meaning (e.g., avoiding ambiguity, identifying appositives).

b. Grammar and Usage. The items in this category test the student's understanding of agreement between subject and verb, between pronoun and antecedent, and between modifiers and the words modified; verb formation; pronoun case; formation of comparative and superlative adjectives and adverbs; and idiomatic usage.

c. Sentence Structure. The items in this category test the student's understanding of relationships between and among clauses, placement of modifiers, and shifts in construction.

d. Strategy. The items in this category test the student's ability to develop a given topic by choosing expressions appropriate to an essay's audience and purpose; to judge the effect of adding, revising, or deleting supporting material; and to judge the relevance of statements in context.

e. Organization. The items in this category test the student's ability to organize ideas and to choose effective opening, transitional, and closing sentences.

f. Style. The items in this category test the student's ability to select precise and appropriate words and images, to maintain the level of style and tone in an essay, to manage sentence elements for rhetorical effectiveness, and to avoid ambiguous pronoun references, wordiness, and redundancy.


Table 2.2
Content Specifications for the PLAN Mathematics Test

The items in the Mathematics Test are classified according to four content areas. These areas and the approximate proportion of the test devoted to each are given below.

Mathematics content area    Proportion of test    Number of items
Pre-Algebra/Algebra                .55                  22
  Pre-Algebra                      .35                  14
  Elementary Algebra               .20                   8
Geometry                           .45                  18
  Coordinate Geometry              .18                   7
  Plane Geometry                   .27                  11
Total                             1.00                  40

a. Pre-Algebra. Items in this category are based on basic operations using whole numbers, decimals, fractions, and integers; place value; square roots and approximations; the concept of exponents; scientific notation; factors; ratio, proportion, and percent; linear equations in one variable; absolute value and ordering numbers by value; elementary counting techniques and simple probability; data collection, representation, and interpretation; and understanding simple descriptive statistics.

b. Elementary Algebra. The items in this category are based on properties of exponents and square roots; evaluation of algebraic expressions through substitution; simplification of algebraic expressions; addition, subtraction, and multiplication of polynomials; factorization of polynomials; and solving quadratic equations by factoring.

c. Coordinate Geometry. Items in this category are based on graphing and the relations between equations and graphs, including points and lines; graphing inequalities; slope; parallel and perpendicular lines; distance; and midpoints.

d. Plane Geometry. Items in this category are based on the properties and relations of plane figures, including angles and relations among perpendicular and parallel lines; properties of circles, triangles, rectangles, parallelograms, and trapezoids; transformations; and volume.

Table 2.3
Content Specifications for the PLAN Reading Test

The items in the Reading Test are based on prose passages that are representative of the kinds of writing commonly encountered in secondary curricula, including the social sciences, prose fiction, and the humanities. The three content areas and the approximate proportion of the test devoted to each are given below.

Reading passage content    Proportion of test    Number of items
Prose Fiction                     .32                   8
Humanities                        .36                   9
Social Sciences                   .32                   8
Total                            1.00                  25

a. Prose Fiction. The items in this category are based on short stories or excerpts from short stories or novels.

b. Humanities. The items in this category are based on passages from memoirs and personal essays, and in the content areas of architecture, art, dance, ethics, film, language, literary criticism, music, philosophy, radio, television, or theater.

c. Social Sciences. The items in this category are based on passages in anthropology, archaeology, biography, business, economics, education, geography, history, political science, psychology, or sociology.


Table 2.4
Content Specifications for the PLAN Science Test

The Science Test is based on the type of content that is typically covered in early high school general science courses. Materials are drawn from the biological sciences, the earth/space sciences, physics, and chemistry. The test emphasizes scientific reasoning skills over recall of specific scientific content, skill in mathematics, or skill in reading. Minimal arithmetic and algebraic computations may be required to answer some questions. The three formats and the approximate proportion of the test devoted to each are given below.

Format                     Proportion of test    Number of items
Data Representation               .33                  10
Research Summaries                .47                  14
Conflicting Viewpoints            .20                   6
Total                            1.00                  30

Note: All four content areas (biology, the Earth/space sciences, physics, and chemistry) are represented in the test; the content areas are distributed over the different formats.

a. Data Representation. This format presents students with graphic and tabular material similar to that found in science journals and texts. The items associated with this format measure skills such as graph reading, interpretation of scatterplots, and interpretation of information presented in tables. The graphic or tabular material may be taken from published materials; the items are composed expressly for the Science Test.

b. Research Summaries. This format provides students with descriptions of one or more related experiments. The items focus on the design of experiments and the interpretation of experimental results. The stimulus and items are written expressly for the Science Test.

c. Conflicting Viewpoints. This format presents students with expressions of several hypotheses or views that, being based on differing premises or on incomplete data, are inconsistent with one another. The items focus on the understanding, analysis, and comparison of alternative viewpoints or hypotheses. Both the stimulus and the items are written expressly for the Science Test.

Selection of Item Writers

Each year, ACT contracts with item writers to construct items for PLAN. The item writers are content specialists in the disciplines measured by the PLAN tests. Most are actively engaged in teaching at various levels, from high school to university, and at a variety of institutions, from small private schools to large public institutions. ACT makes every attempt to include item writers who represent the diversity of the population of the United States with respect to ethnic background, gender, and geographic location.

Before being asked to write items for the PLAN tests, potential item writers are required to submit a sample set of materials for review. Each item writer receives an item writer's guide that is specific to the content area. The guides include examples of items and provide item writers with the test specifications and ACT's requirements for content and style. Included are specifications for fair portrayal of all groups of individuals, avoidance of subject matter that may be unfamiliar to members of certain groups within society, and nonsexist use of language.

Each sample set submitted by a potential item writer is evaluated by ACT Test Development staff. A decision concerning whether to contract with the item writer is made on the basis of that evaluation.

Each item writer under contract is given an assignment to produce a small number of multiple-choice items. The small size of the assignment ensures production of a diversity of material and maintenance of the security of the testing program, since any item writer will know only a small proportion of the items produced. Item writers work closely with ACT test specialists, who assist them in producing items of high quality that meet the test specifications.

Item Construction

The item writers must create items that are educationally important and psychometrically sound. A large number of items must be constructed because, even with good writers, many items fail to meet ACT's standards.

Each item writer submits a set of items, called a unit, in a given content area. Most Mathematics Test items are discrete (not passage-based), but occasionally some may belong to sets composed of several items based on the same paragraph or chart. All items on the English and Reading Tests are related to prose passages. All items on the Science Test are related to passages and/or other stimulus material (such as graphs and tables).


Review of Items

After a unit is accepted, it is edited to meet ACT's specifications for content accuracy, word count, item classification, item format, and language. During the editing process, all test materials are reviewed for fair portrayal and balanced representation of groups within society and for nonsexist use of language. The unit is reviewed several times by ACT staff to ensure that it meets all of ACT's standards.

Copies of each unit are then submitted to content and fairness experts for external reviews prior to the pretest administration of these units. The content reviewers are high school teachers, curriculum specialists, and college and university faculty members. The content experts review the unit for content accuracy, educational importance, and grade-level appropriateness. The fairness reviewers are experts in diverse educational areas who represent both genders and a variety of racial and ethnic backgrounds. These reviewers help ensure fairness to all examinees.

Any comments on the units by the content consultants are discussed in a panel meeting with all the content consultants and ACT staff, and appropriate changes are made to the unit(s). All fairness consultants' comments are reviewed and discussed, and appropriate changes are made to the unit(s).

Item Tryouts

The items that are judged to be acceptable in the review process are assembled into tryout units for pretesting on samples from the national examinee population. These samples are carefully selected to be representative of the total examinee population. Each sample is administered a tryout unit from one of the four academic areas covered by the PLAN tests. The time limits for the tryout units permit the majority of students to respond to all items.

Item Analysis of Tryout Units

Item analyses are performed on the tryout units. For a given booklet the sample is divided into low, medium, and high groups by total tryout test score. The cutoff scores for the three groups are the 27th and the 73rd percentile points in the distribution of those scores.

Proportions of students in each of the groups correctly answering each tryout item are tabulated, as well as the proportion in each group selecting each of the incorrect options. Biserial and point-biserial correlation coefficients between each item score (correct/incorrect) and the total score on the tryout unit are also computed.

The item analyses serve to identify statistically effective test questions. Items that are either too difficult or too easy, and those that fail to discriminate between students of high and low educational development as measured by their tryout test scores, are eliminated or revised. The biserial and point-biserial correlation coefficients, as well as the differences between proportions of students answering the item correctly in each of the three groups, are used as indices of the discriminating power of the tryout items.
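The following Python sketch illustrates this kind of tryout item analysis: examinees are split into low, medium, and high groups at the 27th and 73rd percentile points of the tryout total score, and the proportion correctly answering each item is tabulated within each group. It is a simplified illustration under assumed input shapes (a response matrix and an answer key), not ACT's operational analysis code.

```python
import numpy as np

def tryout_item_analysis(responses, key):
    """responses: (n_examinees, n_items) array of selected options;
    key: length n_items array of correct options."""
    correct = (responses == key).astype(float)        # 0/1 item scores
    total = correct.sum(axis=1)                       # total score on the tryout unit
    lo_cut, hi_cut = np.percentile(total, [27, 73])   # cutoffs for the three groups
    groups = {
        "low": total <= lo_cut,
        "medium": (total > lo_cut) & (total < hi_cut),
        "high": total >= hi_cut,
    }
    # Proportion correctly answering each item, within each group
    p_by_group = {name: correct[mask].mean(axis=0) for name, mask in groups.items()}
    # One simple discrimination index: high-group minus low-group proportion correct
    discrimination = p_by_group["high"] - p_by_group["low"]
    return p_by_group, discrimination
```

In practice the biserial and point-biserial correlations described above would be computed alongside these group proportions before items are flagged for revision or elimination.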

Each item is reviewed following the item analysis. ACT staff scrutinizes items determined to be of poor quality in order to identify possible causes. Some items are revised and placed in new tryout units following further review. The review process also provides feedback that helps decrease the incidence of poor-quality items in the future.

Assembly of New Forms

Items that are judged acceptable in the review process are placed in an item pool. Preliminary forms of the PLAN tests are constructed by selecting from this pool items that match the content, cognitive, and statistical specifications for the tests.

For each test in a battery form, items are selected to match the content distribution for the test shown in Tables 2.1 through 2.4. Items are also selected to comply with statistical specifications as discussed in an earlier section. The distributions of item difficulty levels obtained on recent forms of the four tests are displayed in Table 2.5. The data in the table are taken from random samples of approximately 2,000 students from the operational administrations of the tests in 2003 through 2006. In addition to the item difficulty distributions, item discrimination indices in the form of observed mean biserial correlations and completion rates are reported.

The completion rate is an indication of how speeded a test is for a group of students. A test is considered to be speeded if most students do not have sufficient time to answer the items in the time allotted. The completion rate reported in Table 2.5 for each test is the average completion rate for the 2003–2006 PLAN operational administrations. The completion rate for each test is computed as the average proportion of examinees who answered each of the last five items.
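As a worked illustration of the completion-rate computation, the sketch below treats omitted items as missing values and averages, over the last five items, the proportion of examinees who responded to each. The data layout (NaN marking an omitted response) is an assumption made for the example.

```python
import numpy as np

def completion_rate(responses):
    """responses: (n_examinees, n_items) array with np.nan marking omitted items.
    Returns the mean proportion of examinees responding to each of the last five items."""
    answered = ~np.isnan(responses)        # True wherever a response was given
    return float(answered[:, -5:].mean())  # average response rate over the last 5 items
```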


Content and Fairness Review of Test Items

The preliminary versions of the test forms are subjected to several reviews to ensure that the items are accurate and that the overall test forms are fair and conform to good test construction practice. The first review is performed by ACT staff. Items are checked for content accuracy and conformity to ACT style. The items are also reviewed to ensure that they are free of clues that could allow testwise students to answer the item correctly even though they lack knowledge in the subject area or the required skills.

The preliminary versions of the test forms are then submitted to content and fairness experts for external review prior to the operational administration of the test forms. These experts are not the same individuals consulted for the content and fairness reviews of tryout units.

The content consultants are high school teachers, curriculum specialists, and college and university faculty members. The content consultants review the forms for content accuracy, educational importance, and grade-level appropriateness. The fairness consultants are diversity experts in education who represent both genders and a variety of racial and ethnic backgrounds. The fairness consultants review the forms to help ensure fairness to all examinees.

After the external content and fairness reviews, ACT summarizes the results from the reviews. Comments from the consultants are then reviewed by ACT staff members, and any necessary changes are made to the test forms. Whenever significant changes are made, the revised components are again reviewed by the appropriate consultants and by ACT staff. If no further corrections are needed, the test forms are prepared for printing.

In all, at least sixteen independent reviews are made of each test item before it appears on a national form of PLAN. The many reviews are performed to help ensure that each student's level of achievement is accurately and fairly evaluated.

Review Following Operational Administration

After each operational administration, item analysis results are reviewed for any abnormality such as substantial changes in item difficulty and discrimination indices between tryout and national administrations. Only after all anomalies have been thoroughly checked and the final scoring key approved are score reports produced. Examinees are encouraged to challenge any items that they feel are questionable in correctness. Once a challenge to an item is raised and reported, the item is reviewed by the experts in the content area the item is assessing. In the event that a problem is found with an item, necessary actions will be taken to eliminate or minimize the influence of the problem item. In all cases, the person who challenges an item is sent a letter indicating the results of the review.


Table 2.5
Difficulty^a Distributions, Mean Discrimination^b Indices, and Average Completion Rates^c
for the 2003–2006 PLAN Operational Administrations

Observed difficulty distributions (frequencies)

Difficulty range      English    Mathematics    Reading    Science
.00–.09                   0           0             0          0
.10–.19                   0           8             2          0
.20–.29                  12          16             8         11
.30–.39                  18          18            17         10
.40–.49                  27          25            19         20
.50–.59                  34          29            21         27
.60–.69                  41          27            19         25
.70–.79                  44          26             9         20
.80–.89                  24          11             5          7
.90–1.00                  0           0             0          0

Number of items^d       200         160           100        120
Mean difficulty^a        .59         .53           .51        .55
Mean discrimination^b    0.51        0.51          0.52       0.56
Avg. completion rate^c   .89         .88           .88        .94

^a Difficulty is the proportion of examinees correctly answering the item.
^b Discrimination is the item–total score biserial correlation coefficient.
^c Average completion rate for all four forms, with the completion rate for any one form being the mean proportion of students responding to each of the last 5 items.
^d Four forms consisting of the following number of items per test: English 50, Mathematics 40, Reading 25, Science 30.



Also, after each operational administration, differential item functioning (DIF) analysis is conducted on the test data. DIF can be described as a statistical difference between the probability of the specific population subgroup (the "focal" group) getting the item right and the comparison population subgroup (the "base" group) getting the item right, given that both groups have the same level of expertise with respect to the content being tested. The procedures currently used for the analysis include the standardized difference in proportion-correct (STD) procedure and the Mantel-Haenszel common odds-ratio (MH) procedure.

In ACT's experience, the MH and STD procedures are useful techniques in detecting DIF. Both techniques are designed for use with multiple-choice items, and both require data from significant numbers of examinees to provide reliable results. For a description of these statistics and their performance overall in detecting DIF, see the ACT Research Report entitled Performance of Three Conditional DIF Statistics in Detecting Differential Item Functioning on Simulated Tests (Spray, 1989). In the analysis of items in a PLAN form, large samples representing examinee groups of interest, e.g., males and females, are selected from the total number of examinees taking the test. The examinees' responses to each item on the test are analyzed using the STD and MH procedures. Compared with preestablished criteria, the items with MH and/or STD values exceeding the tolerance level are flagged. The flagged items are then further reviewed by the content specialists for possible explanations of the unusual MH and/or STD results of the items. In the event that a problem is found with an item, necessary actions will be taken to eliminate or minimize the influence of the problem item.

PLAN Scoring Procedures

For each of the four tests in PLAN (English, Mathematics, Reading, Science), the raw scores (number of correct responses) are converted to scale scores ranging from 1 to 32. The score scale is discussed further on pages 20–22 of this manual.

The Composite score is the average of the four scale scores rounded to the nearest whole number (0.5 rounds up). The minimum Composite score is 1; the maximum is 32.
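A small sketch of the Composite computation follows, with the .5-rounds-up rule applied explicitly (Python's built-in round would use banker's rounding instead). The function name is illustrative.

```python
import math

def composite_score(english, mathematics, reading, science):
    """Average of the four scale scores, rounded to the nearest whole number;
    a fractional part of exactly .5 rounds up."""
    average = (english + mathematics + reading + science) / 4.0
    return int(math.floor(average + 0.5))

# Example: scale scores of 17, 18, 20, and 19 average to 18.5, which rounds up to 19.
assert composite_score(17, 18, 20, 19) == 19
```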

In addition to the four PLAN test scores and Composite score, two subscores are reported for both the English Test and the Mathematics Test. As for each of the four tests, the raw scores for the subscore items are converted to scale scores. These subscores are reported on a score scale ranging from 1 to 16. The score scale is discussed further on pages 20–22 of this manual.

National percentile rank, based on a national normative group, is reported as percent-at-or-below for the four PLAN test scores, four subscores, and Composite score. PLAN norms are intended to be representative of the performance of all 10th graders in the nation. All norms are based on a fall 2005 administration of PLAN to classes of 10th graders in public and private schools throughout the United States.


ACT’s College Readiness Standards

Description of the College Readiness Standards

In 1997, ACT began an effort to make EPAS test results more informative and useful. This effort yielded College Readiness Standards for each of the EPAS programs. The EPAS College Readiness Standards are statements that describe what students who score in various score ranges on the EPAS tests are likely to know and to be able to do. For example, students who score in the 16–19 range on the PLAN English Test typically are able "to select the most logical place to add a sentence in a paragraph," while students who score in the 28–32 score range are able "to add a sentence to introduce or conclude a fairly complex paragraph." The Standards reflect a progression of skills in each of the five tests: English, Reading, Mathematics, Science, and Writing. ACT has organized the Standards by strands (related areas of knowledge and skill within each test) for ease of use by teachers and curriculum specialists. The complete College Readiness Standards are posted on ACT's website: www.act.org. They also are available in poster format from ACT Educational Services at 319/337-1040. ACT also offers College Readiness Standards Information Services, a supplemental reporting service based on the Standards.

College Readiness Standards for EXPLORE, PLAN, and the ACT are provided for six score ranges (13–15, 16–19, 20–23, 24–27, 28–32, and 33–36) along a score scale that is common to EXPLORE (1–25), PLAN (1–32), and the ACT (1–36). Students who score in the 1–12 range are most likely beginning to develop the skills and knowledge described in the 13–15 score range. The Standards are cumulative, which means that if students score, for example, in the 20–23 range on the English Test, they are likely able to demonstrate most or all of the skills and understandings in the 13–15, 16–19, and 20–23 score ranges.

College Readiness Standards for Writing, which ACT developed in 2005, are available only for the ACT and are provided for five score ranges (3–4, 5–6, 7–8, 9–10, and 11–12) based on ACT Writing Test scores obtained (the sum of two readers' ratings using the six-point holistic scoring rubric for the ACT Writing Test). Scores below 3 do not permit useful generalizations about students' writing abilities.

Since the three EPAS testing programs (EXPLORE, PLAN, and the ACT) are designed to measure students' progressive development of knowledge and skills in the same four academic areas through Grades 8–12, the Standards are correlated across programs as much as possible. The Standards in the 13–15, 16–19, 20–23, and 24–27 score ranges apply to scores for all three programs. The Standards in the 28–32 score range are specific to PLAN and the ACT, and the Standards in the 33–36 score range are specific to the ACT. Figure 3.1 illustrates the score-range overlap among the three programs.

Determining the Score Ranges for the College Readiness Standards (1997)

When ACT began work on the College Readiness Standards in 1997, the first step was to determine the number of score ranges and the width of each score range. To do this, ACT staff reviewed EPAS normative data and considered the relationships among EXPLORE, PLAN, and the ACT. This information was considered within the context of how the test scores are used (for example, the use of ACT scores in college admissions and course-placement decisions).

In reviewing the normative data, ACT staff analyzed the distribution of student scores across the respective EPAS score scales (EXPLORE 1–25, PLAN 1–32, and ACT 1–36). The staff also considered course placement research that ACT has conducted over the last forty years. ACT's Course Placement Service provides colleges and universities with cutoff scores that are used for placement into appropriate entry-level college courses. Cutoff scores based on admissions and course placement criteria were used to help define the score ranges of all three EPAS programs.


Figure 3.1. Score ranges for EXPLORE, PLAN, and the ACT. (The common score ranges 13–15, 16–19, 20–23, and 24–27 apply to all three programs; the 28–32 range extends to PLAN and the ACT; the 33–36 range applies to the ACT only.)


After analyzing all the data and reviewing different possible score ranges, ACT staff concluded that the score ranges 1–12, 13–15, 16–19, 20–23, 24–27, 28–32, and 33–36 would best distinguish students' levels of achievement so as to assist teachers, administrators, and others in relating EPAS test scores to students' skills and understandings.

Developing the College Readiness Standards

After reviewing the normative data, college admissions criteria, and information obtained through ACT's Course Placement Service, content area test specialists wrote the College Readiness Standards based on their analysis of the skills and knowledge students need in order to respond successfully to test items that were answered correctly by 80% or more of the examinees who scored within each score range. Content specialists analyzed test items taken from dozens of test forms. The 80% criterion was chosen because it offers those who use the College Readiness Standards a high degree of confidence that students scoring in a given score range will most likely be able to demonstrate the skills and knowledge described in that range.

The Process. Four ACT content teams were identified, one for each of the tests (English, Mathematics, Reading, and Science) included in the three EPAS programs. Each content team was provided with numerous EPAS test forms along with tables that showed the percentages of students in each score range who answered each test item correctly (the item difficulties). Item difficulties were computed separately based on groups of students whose scores fell within each of the defined score ranges.

The College Readiness Standards were identified by test, by program, beginning with the ACT. Each content team was provided with 10 forms of the ACT and the item difficulties computed separately for each score range for each of the items on the forms. For example, the mathematics content team reviewed 10 forms of the ACT Mathematics Test. There are 60 items in each ACT Mathematics Test form, so 600 ACT Mathematics items were reviewed in all. An illustrative table displaying the information provided to the mathematics content team for one ACT Mathematics Test form is shown in Table 3.1.

The shaded areas in Table 3.1 show the items that met the 0.80 or above item difficulty criterion for each of the score ranges. As illustrated in Table 3.1, a cumulative effect can be noted: the items that are correctly answered by 80% of the students in Score Range 16–19 also appear in Score Range 20–23; the items that are correctly answered by 80% of the students in Score Range 20–23 also appear in Score Range 24–27; and so on. By using this information, the content teams were able to isolate and review the items by score ranges across test forms.
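The selection step just described can be sketched in a few lines of Python: given item difficulties computed separately for each score range, keep the items answered correctly by at least 80% of the examinees scoring in that range. The dictionary layout and function name are assumed for the example.

```python
import numpy as np

def items_meeting_criterion(difficulty_by_range, criterion=0.80):
    """difficulty_by_range maps a score-range label (e.g., '16-19') to an array of
    item difficulties computed from examinees scoring in that range. Returns, for
    each score range, the indices of items meeting the criterion."""
    return {label: np.flatnonzero(np.asarray(diffs) >= criterion)
            for label, diffs in difficulty_by_range.items()}
```

Because difficulties rise with the score range, the set of items returned for a higher range typically contains the set returned for the range below it, which is the cumulative effect described above.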


Table 3.1
Illustrative Listing of Mathematics Items by Score Range

Item    Item difficulties for students scoring in the score range of:
no.      13–15    16–19    20–23    24–27    28–32    33–36
 1        .62      .89      .98      .99     1.00     1.00
 2                 .87      .98      .99      .99     1.00
 6        .60      .86      .94      .97      .99      .99
 7        .65      .92      .98      .99      .99     1.00
20                 .84      .94      .97      .98      .99
27                 .85      .97      .99      .99      .99
 4                          .92      .97      .99     1.00
 5                          .94      .97      .99      .99
 .                           .        .        .        .
 8                          .82      .95      .98      .99
 9                          .80      .89      .96      .99
21                          .82      .92      .97      .99
13                                   .90      .97      .99
15                                   .90      .97      .99
17                                   .87      .98     1.00
18                                   .83      .93      .98
22                                   .81      .91      .98
24                                   .83      .96      .98
29                                   .87      .98     1.00
34                                   .86      .95      .99
36                                   .82      .93      .99
39                                   .85      .96      .99
44                                   .84      .96      .99
25                                            .95      .99
28                                            .97     1.00
 .                                             .        .
35                                            .86      .96
47                                            .86      .97
32                                                      .95
33                                                      .92
46                                                      .90
49                                                      .95
51                                                      .98
52                                                      .98
53                                                      .92
56                                                      .98
57                                                      .86
58                                                      .95
59                                                      .86
60                                                      .96


The procedures described allowed the content teams to conceptualize what is measured by each of the academic tests. Each content team followed the same basic process as they reviewed the test items in each academic test in the three assessment programs, EXPLORE, PLAN, and the ACT:

1. Multiple forms of each academic test were distributed.

2. The knowledge, skills, and understandings that are necessary to answer the test items in each score range were identified.

3. The additional knowledge, skills, and understandings that are necessary to answer the test items in the next score range were identified. This process was repeated for all the score ranges.

4. All the lists of statements identified by each content specialist were merged into a composite list. The composite list was distributed to a larger group of content specialists.

5. The composite list was reviewed by each content specialist and ways to generalize and to consolidate the various skills and understandings were identified.

6. The content specialists met as a group to discuss the individual, consolidated lists and prepared a master list of skills and understandings, organized by score ranges.

7. The master list was used to review at least three additional test forms, and adjustments and refinements were made as necessary.

8. The adjustments were reviewed by the content specialists and "final" revisions were made.

9. The "final" list of skills and understandings was used to review additional test forms. The purpose of this review was to determine whether the College Readiness Standards adequately and accurately described the skills and understandings measured by the items, by score range.

10. The College Readiness Standards were once again refined.

These steps were used to review test items for all four multiple-choice academic tests in all three testing programs. As work began on the PLAN and EXPLORE test items, the College Readiness Standards developed for the ACT were used as a baseline, and modifications or revisions were made as necessary.

Table 3.2 reports the total number of test items reviewed for each content area and for each testing program.


Table 3.2
Number of Items Reviewed During 1997 National Review

                                       Number of items for each testing program
Content area                           EXPLORE      PLAN       ACT
English                                   40          50        75
Mathematics                               30          40        60
Reading                                   30          25        40
Science                                   28          30        40
Number of items per form                 128         145       215
Total number of test forms reviewed        4           9        10
Total number of items reviewed           512       1,305     2,150


Conducting an Independent Review of the College Readiness Standards. As a means of gathering content validity evidence, ACT invited nationally recognized scholars from high school and university English, mathematics, reading, science, and education departments to review the College Readiness Standards. These teachers and researchers were asked to provide ACT with independent, authoritative reviews of the College Readiness Standards.

The content area experts were selected from among candidates having experience with and an understanding of the academic tests on EXPLORE, PLAN, and the ACT. The selection process sought and achieved a diverse representation by gender, ethnic background, and geographic location. Each participant had extensive and current knowledge of his or her field, and many had acquired national recognition for their professional accomplishments.

The reviewers were asked to evaluate whether the College Readiness Standards (a) accurately reflected the skills and knowledge needed to correctly respond to test items (in specific score ranges) in EXPLORE, PLAN, and the ACT and (b) represented a continuum of increasingly sophisticated skills and understandings across the score ranges. Each national content area team consisted of three college faculty members currently teaching courses in curriculum and instruction, and three classroom teachers, one each from Grades 8, 10, and 12. The reviewers were provided with the complete set of College Readiness Standards and a sample of test items falling in each of the score ranges, by academic test and program.

The samples of items to be reviewed by the consultants were randomly selected for each score range in all four academic tests for all three assessment programs. ACT believed that a random selection of items would ensure a more objective outcome than would preselected items. Ultimately, 17 items for each score range were selected (85 items per testing program, or a total of 255 items for all three programs). Before identifying the number of items that would comprise each set of items in each score range, it was first necessary to determine the target criterion for the level of agreement among the consultants. ACT decided upon a target criterion of 70%. It was deemed most desirable for the percentage of matches to be estimated with an accuracy of plus or minus 0.05. That is, the standard error of the estimated percent of matches to the Standards should be no greater than 0.05. To estimate a percentage around 70% with that level of accuracy, 85 observations were needed. Since there were five score ranges, the number of items per score range to be reviewed was 17 (85 ÷ 5 = 17).
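The sample-size reasoning can be checked directly: the standard error of a proportion estimated from n independent ratings is the square root of p(1 - p)/n, and for p near .70 it drops to the .05 target at n = 85. The short check below is only an illustration of that arithmetic.

```python
import math

p, n = 0.70, 85
standard_error = math.sqrt(p * (1 - p) / n)
print(round(standard_error, 3))   # approximately 0.050
```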

The consultants had two weeks to review the College Readiness Standards. Each reviewer received a packet of materials that contained the College Readiness Standards, sets of randomly selected items (17 per score range), introductory material about the College Readiness Standards, a detailed set of instructions, and two evaluation forms.

The sets of materials submitted for the experts' review were drawn from 13 ACT forms, 8 PLAN forms, and 4 EXPLORE forms. The consultants were asked to perform two main tasks in their area of expertise: Task 1, judge the consistency between the Standards and the corresponding sample items provided for each score range; Task 2, judge the degree to which the Standards represent a cumulative progression of increasingly sophisticated skills and understandings from the lowest score range to the highest score range. The reviewers were asked to record their ratings using a five-point Likert scale that ranged from Strongly Agree to Strongly Disagree. They were also asked to suggest revisions to the language of the Standards that would help the Standards better reflect the skills and knowledge measured by the sample items.

ACT collated the consultants' ratings and comments as they were received. The consultants' reviews in all but two cases reached ACT's target criterion, as shown in Table 3.3.


Table 3.3
Percentage of Agreement of 1997 National Expert Review

                       EXPLORE                     PLAN/ACT
               Task 1     Task 2     Task 1 (PLAN)   Task 1 (ACT)   Task 2
English          65%        80%           75%             75%         86%
Mathematics      80%       100%           70%             95%        100%
Reading          75%        75%           75%             60%        100%
Science          95%       100%          100%             70%         80%


That is, 70% or more of the consultants' ratings were Agree or Strongly Agree when judging whether the Standards adequately described the skills required by the test items and whether the Standards adequately represented the cumulative progression of increasingly sophisticated skills from the lowest to the highest score ranges. The two exceptions were the EXPLORE English Test and the ACT Reading Test, where the degree of agreement was 65% and 60%, respectively. Each ACT staff content area team met to review all comments made by all the national consultants. The teams reviewed all suggestions and adopted a number of helpful clarifications in the language of the Standards, particularly in the language of the EXPLORE English Test Standards and the ACT Reading Test Standards, the two cases in which the original language had failed to meet the target criterion.

Refining the College Readiness Standards for EXPLORE and PLAN (2001)

In 2001, the score scale for EXPLORE and PLAN was refined. This required that the College Readiness Standards for EXPLORE and PLAN be reexamined.

The approach used in 1997 to develop the Standards was used to reexamine the Standards for EXPLORE and PLAN in 2000. Staff reviewed items, at each EXPLORE and PLAN score interval, that were answered correctly by 80% or more of the EXPLORE and PLAN examinees. Using the PLAN College Readiness Standards as a baseline, EXPLORE test items were reviewed to ensure that the PLAN College Readiness Standards adequately described the skills and understandings students were being asked to demonstrate in each score range.

As in the 1997 study, a national independent panel of content experts was convened in each of the four multiple-choice academic tests to ensure that the refined EXPLORE/PLAN Standards (a) accurately reflected the skills and knowledge needed to correctly respond to test items in the common score ranges and (b) represented a continuum of increasingly sophisticated skills and understandings across the entire score range. As was the case in 1997, content area experts were identified in the areas of English, mathematics, reading, and science. Each content area team consisted of three reviewers, one each from middle school/junior high, high school, and college/university.

For each academic test, the consultants were asked to review sets of test items, arranged by score range, and the corresponding College Readiness Standards. The PLAN reviewers received two sets of test items, an EXPLORE set and a PLAN set, along with the corresponding Standards. A criterion of 17 items per score range was chosen.

As was the case in 1997, the reviewers were asked to record their ratings using a five-point Likert scale that ranged from Strongly Agree to Strongly Disagree. They were also asked to suggest revisions to the language of the Standards that would help the Standards better reflect the skills and knowledge measured by the sample items. A target criterion of 70% agreement was again identified. The consultants' reviews in all cases reached ACT's target criterion, as shown in Table 3.4.

Periodic Review of the College Readiness Standards

In addition to the regularly scheduled independent reviews conducted by national panels of subject matter experts, ACT also periodically conducts internal reviews of the College Readiness Standards. ACT identifies three to four new forms of the ACT, PLAN, and EXPLORE (for EXPLORE, fewer forms are available) and then analyzes the data and the corresponding test items, by score range. The purposes of these reviews are to ensure that (a) the Standards reflect the knowledge and skills being measured by the items in each score range and (b) the Standards reflect a cumulative progression of increasingly sophisticated skills and understandings from the lowest score range to the highest. Minor refinements intended to clarify the language of the Standards have resulted from these reviews.


Table 3.4
Percentage of Agreement of 2000 National Expert Review

                       EXPLORE                 PLAN
               Task 1     Task 2       Task 1     Task 2
English          90%       100%          73%       100%
Mathematics      75%       100%         100%       100%
Reading         100%       100%          87%       100%
Science          75%       100%          90%       100%


Interpreting and Using the College Readiness Standards

Because new EPAS test forms are developed at regular intervals and because no one test form measures all of the skills and knowledge included in any particular Standard, the College Readiness Standards must be interpreted as skills and knowledge that most students who score in a particular score range are likely to be able to demonstrate. Since there were relatively few test items that were answered correctly by 80% or more of the students who scored in the lower score ranges, the Standards in these ranges should be interpreted cautiously.

It is important to recognize that the EPAS tests do not measure everything students have learned, nor does any test measure everything necessary for students to know to be successful in their next level of learning. The EPAS tests include questions from a large domain of skills and from areas of knowledge that have been judged important for success in high school, college, and beyond. Thus, the College Readiness Standards should be interpreted in a responsible way that will help students, parents, teachers, and administrators to:

• Identify skill areas in which students might benefit from further instruction

• Monitor student progress and modify instruction to accommodate learners' needs

• Encourage discussion among principals, curriculum coordinators, and classroom teachers as they evaluate their academic programs

• Enhance discussions between educators and parents to ensure that students' course selections are appropriate and consistent with their post–high school plans

• Enhance the communication between secondary and postsecondary institutions

• Identify the knowledge and skills students entering their first year of postsecondary education should know and be able to do in the academic areas of language arts, mathematics, and science

• Assist students as they identify skill areas they need to master in preparation for college-level course work

ACT’s College Readiness Benchmarks

Description of the College Readiness Benchmarks

The ACT College Readiness Benchmarks (see Table 3.5) are the minimum ACT test scores required for students to have a high probability of success in first-year, credit-bearing college courses: English Composition, social sciences courses, Algebra, or Biology. In addition to the Benchmarks for the ACT, there are corresponding EXPLORE and PLAN Benchmarks for use by students who take these programs to gauge their progress in becoming college ready in the eighth and tenth grades, respectively. Students who meet a Benchmark on the ACT have approximately a 50% chance of earning a B or better and approximately a 75–80% chance of earning a C or better in the corresponding college course or courses. Students who meet a Benchmark on EXPLORE or PLAN are likely to have approximately this same chance of earning such a grade in the corresponding college course(s) by the time they graduate high school.

Data Used to Establish the Benchmarks for the ACT

The ACT College Readiness Benchmarks are empirically derived based on the actual performance of students in college. As part of its Course Placement Service, ACT provides research services to colleges to help them place students in entry-level courses as accurately as possible. In providing these research services, ACT has an extensive database consisting of course grade and test score data from a large number of first-year students and across a wide range of postsecondary institutions. These data provide an overall measure of what it takes to be successful in selected first-year college courses. Data from 98 institutions and over 90,000 students were used to establish the Benchmarks. For each course, all colleges that supplied data for that course were included. If a college sent data from more than a single year, only data from the most recent year were included. The numbers and types of college varied by course. Because the sample of colleges in this study is a "convenience" sample (that is, based on data from colleges that chose to participate in ACT's Course Placement Service), there is no guarantee that it is representative of all colleges in the U.S. Therefore, ACT weighted the sample so that it would be representative of the variety of schools in terms of their selectivity.


Table 3.5ACT’s College Readiness Benchmarks

EPAS subject test College courseEXPLORE test score

PLAN test score

ACT test score

English English Composition 13 15 18

Mathematics College Algebra 17 19 22

Reading College Social Sciences 15 17 21

Science College Biology 20 21 24


in ACT’s Course Placement Service), there is no guaranteethat it is representative of all colleges in the U.S. Therefore,ACT weighted the sample so that it would be representativeof the variety of schools in terms of their selectivity.

Procedures Used to Establish the Benchmarks for EXPLORE and PLAN

The College Readiness Benchmarks for EXPLORE and PLAN were developed using about 150,000 records of students who had taken EXPLORE in the fall of Grade 8, PLAN in the fall of Grade 10, and the ACT in Grade 11 or 12. First, the probability of meeting the appropriate ACT Benchmark was estimated at each EXPLORE and PLAN test score point. Next, the EXPLORE and PLAN test scores in English, Reading, Mathematics, and Science that corresponded most closely to a 50% probability of success at meeting each of the four Benchmarks established for the ACT were identified.
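A simplified sketch of that two-step procedure is shown below: estimate, at each PLAN score point, the proportion of students who later met the corresponding ACT Benchmark, then take the score point whose estimated probability is closest to .50. The actual analysis may have smoothed or modeled these probabilities rather than using raw proportions; the function name and inputs are assumptions for the example.

```python
import numpy as np

def benchmark_score(plan_scores, met_act_benchmark):
    """plan_scores: PLAN scale scores for a longitudinal cohort; met_act_benchmark:
    1 if the student later met the corresponding ACT Benchmark, else 0. Returns the
    PLAN score point whose estimated probability of success is closest to .50."""
    plan_scores = np.asarray(plan_scores)
    met = np.asarray(met_act_benchmark, dtype=float)
    points = np.unique(plan_scores)
    probs = np.array([met[plan_scores == s].mean() for s in points])
    return int(points[np.argmin(np.abs(probs - 0.5))])
```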

Interpreting the EXPLORE and PLAN Benchmarks for Students Who Test With EXPLORE in Grade 9 and PLAN in Grade 11

The EXPLORE and PLAN Benchmarks were established using scores of Grade 8 (EXPLORE) and Grade 10 (PLAN) students. Students who take EXPLORE in Grade 9 or PLAN in Grade 11 are generally more academically prepared than students who take them in Grade 8 or Grade 10, respectively. Benchmark results for Grade 9 and 11 students who achieve scores near the Benchmarks should therefore be interpreted with caution.

Intended Uses of the Benchmarks for Students, Schools, Districts, and States

ACT, PLAN, and EXPLORE results give students an indication of how likely they are to be ready for college-level work. The results let students know if they have developed or are developing the foundation for the skills they will need by the time they finish high school. PLAN and EXPLORE results provide an early indication of the student's college readiness. Students who score at or above the College Readiness Benchmarks in English, mathematics, and science are likely to be on track to do well in entry-level college courses in these subjects. Students scoring at or above the reading Benchmark are likely to be developing the level of reading skills they will need in all of their college courses. For students taking EXPLORE and PLAN, this assumes that these students will continue to work hard and take challenging courses throughout high school.

States can use the Benchmarks as a tool for establishing minimum standards for high school graduation in statewide assessment contexts that are aimed at preparing high school graduates for postsecondary education. Junior high and high schools can use the Benchmarks for EXPLORE and PLAN as a means of evaluating students' early progress toward college readiness so that timely interventions can be made when necessary, or as an educational counseling or career planning tool.

Interpreting EPAS Test Scores With Respect to Both ACT's College Readiness Standards and ACT's College Readiness Benchmarks

The performance levels on ACT's EPAS tests necessary for students to be ready to succeed in college-level work are defined in ACT's College Readiness Benchmarks. Meanwhile, the skills and knowledge a student currently has (and areas for improvement) can be identified by examining the student's EPAS test scores with respect to ACT's empirically derived College Readiness Standards. These two empirically derived tools are designed to help students translate their EPAS test scores into a clear indicator of their current level of college readiness and to identify the key knowledge and skill areas they need to improve in order to increase their likelihood of college success.


Chapter 4
Technical Characteristics of the PLAN Tests of Educational Achievement

This chapter discusses the technical characteristics (the score scale, norms, equating, and reliability) of the PLAN tests of educational achievement. The scales were constructed in a special study with data collected in the fall of 1988; these data included both 10th- and 12th-grade students. The scaling process placed the PLAN and ACT tests on the same scale. The norms were developed in a separate special study in the fall of 2005, with data obtained from samples of 10th- and 11th-grade students. Other results are based on samples of PLAN user data, collected in the course of normal testing rather than in special studies. The special studies, user data, and results are described below.

The Score Scale and Norms

Scaling

Scale scores are reported for the PLAN English, Mathematics, Reading, and Science Tests, and for the Usage/Mechanics, Rhetorical Skills, Pre-Algebra/Algebra, and Geometry subscores. A Composite score, calculated by rounding the unweighted average of the four test scores, is also reported. Because subscores and test scores were scaled separately, there is no arithmetic relationship between subscores and the test score. For example, the Usage/Mechanics and Rhetorical Skills subscores will not necessarily sum to the English Test score.

The Score Scale. The range of the test and Composite scores on PLAN is 1 to 32; the range of the subscores is 1 to 16. The maximum scores for some forms may be less than 32 for tests and 16 for subscores (see the Equating section, which begins on page 33, for details). Properties of the score scale for PLAN are summarized in Table 4.1.

The scores reported for the four PLAN tests are approximately "on the same scale" as the scores on the corresponding tests of the ACT. The ACT is intended for use by 11th and 12th graders, whereas PLAN is intended for use by 10th graders, and both testing programs have similar (although not identical) content specifications. As such, PLAN is intended to be a shorter and less difficult version of the ACT.

Therefore, to facilitate longitudinal comparisons between PLAN scores for 10th graders and ACT scores for 12th graders, the score scales for PLAN were constructed with the consideration that they be approximately "on the same scale" as the ACT scores. Being "on the same scale" means that the PLAN test score obtained by an examinee can be interpreted as approximately the ACT test score the examinee would obtain if that examinee had taken the ACT at the time of the PLAN testing. If this property were exactly achieved, the mean PLAN and ACT scale scores would be the same for any group of examinees. The PLAN score scales were constructed such that the means of the PLAN tests and the Composite would be approximately the same as the corresponding means on the ACT among students at the beginning of 12th grade, nationwide, who reported that they plan to attend a two- or four-year college (12th-grade college-bound examinees). The PLAN score scale was constructed such that the mean for 12th-grade college-bound examinees would be approximately 18 for each of the four tests and the Composite and 9 for the four subscores.

Table 4.1
Properties of the Score Scale for PLAN

• Scores reported for the four PLAN tests are approximately "on the same scale" as the scores reported for the corresponding tests of the ACT.

• For PLAN, the maximum range of scores is 1–32 on all tests and the Composite and 1–16 for subscores.

• PLAN test means are approximately 18 and subscore means are approximately 9 for fall 10th-grade U.S. students who report they plan to attend college.

• The average standard error of measurement is approximately 2 points for each test score and 1 point for the subscores and the Composite.

• The conditional standard error of measurement is approximately equal along the score scale.

• The occurrence of gaps (unused scale score points) and of multiple raw scores converting to the same scale score was minimized in constructing the raw-to-scale score transformation.



The rationale for the maximum scale score on the PLANtests being 32, rather than 36 as it is for the ACT tests, wasto leave room in the scale for assessment of educationaldevelopment that occurs between the 10th and 12th grades.The ACT tests are intended to assess skills typicallyachieved through the 11th and 12th grades—skills notassessed in the PLAN tests. Note that the requirement thatthe maximum scale score on the PLAN tests be 32, ratherthan 36 as for the ACT tests, ensures that the property of thePLAN and ACT scores being on the same scale will not beattained for very high values of the ACT scale.

The scales for PLAN were constructed using a method described by Kolen (1988) to produce score scales with approximately equal conditional standard errors of measurement along the entire range of scores. Without nearly equal conditional standard errors of measurement, standard errors of measurement at different score levels would need to be presented and considered in score interpretation (see American Psychological Association, 1985, p. 22). The scales for PLAN were constructed such that the average standard error of measurement is approximately 2 scale score points for each of the test scores and 1 scale score point for the subscores and the Composite.

Based on the properties of the standard error of measurement just described, if the distribution of measurement error is approximated by a normal distribution, an approximate 68% confidence interval can be constructed for any examinee by adding 2 points to and subtracting 2 points from his or her scale score for any of the PLAN tests. An analogous interval for the subscores or the Composite can be constructed by adding 1 point to and subtracting 1 point from the score.
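A minimal sketch of the interval construction just described, using the approximate average standard errors of measurement quoted above (the scores themselves are hypothetical):

```python
# Approximate 68% interval: score plus or minus the average SEM
# (about 2 points for a test score, about 1 point for a subscore or the Composite),
# truncated to the reportable score range.
def interval_68(score, sem, low=1, high=32):
    return max(low, score - sem), min(high, score + sem)

print(interval_68(19, 2))            # PLAN Mathematics score of 19 -> (17, 21)
print(interval_68(9, 1, high=16))    # subscore of 9 -> (8, 10)
```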

In thinking about standard errors and their use, note that the reported scale score (i.e., the obtained score) for an examinee is only an estimate of that examinee's true score, where the true score can be interpreted as the average obtained score over repeated testings under identical conditions. If 1 standard error of measurement were added to and subtracted from each of these obtained scores, about 68% of the resulting intervals would contain the examinee's true score. Technically, this is how a 68% interval for an examinee's scale score should be interpreted. These statements make normal distribution assumptions.

Another way to view 68% intervals is in terms of groups of examinees. Specifically, if 1 standard error of measurement were added to and subtracted from the obtained score of each examinee in a group of examinees, the resulting intervals would contain the true score for approximately 68% of the examinees. To put it another way, about 68% of the examinees would be mismeasured by less than 1 standard error of measurement. Again, such statements make normal distribution assumptions. Also, these statements assume a constant standard error of measurement, which is (approximately) a characteristic of the PLAN score scales. Consequently, it is relatively straightforward to interpret scale scores in relation to measurement error.

Scaling Process. The data used in the scaling process were collected in the fall of 1988 as part of the Academic Skills Study, which provided nationally representative samples of examinees for the scaling of the ACT and PLAN. Data from 12th-grade college-bound examinees in the Academic Skills Study were used in scaling PLAN. Twelfth-grade data were used to help obtain a desired property of the PLAN score scale: that test score means be approximately 18 and subscore means be approximately 9 for 12th-grade college-bound examinees. A detailed discussion of the data used in scaling PLAN is given in Hanson (1989).


The first step in constructing the score scales was to produce initial scale scores with a specified mean and a specified standard error of measurement that is approximately equal for all examinees. The means and standard errors of measurement specified for each test score and subscore are noted in Table 4.1. The process used was based on Kolen (1988) and is presented in detail in Hanson (1989). These initial scale scores were rounded to integers ranging from 1 to 32 for the tests and 1 to 16 for the subscores. Some adjustments of the rounded scale scores were performed to attempt to meet the specified mean and standard error of measurement, as well as to avoid gaps in the score scale (scale score values that were not used) or to avoid having too many raw scores convert to a single scale score. This process resulted in the final raw-to-scale score conversions.

The final score scales for the four PLAN tests were evaluated to determine the degree to which the reported PLAN scores would be "on the same scale" as the scores reported for the corresponding ACT tests. Details are given in Hanson (1989). The condition of the PLAN and ACT scores being on the same scale was approximately met, except for examinees scoring very high on the ACT, who would have lower PLAN than ACT scores. This is due to the lower maximum score on PLAN.

The score scales constructed based on data from the Academic Skills Study are different from the score scales used for the 1987 and 1988 administrations of PLAN (which at that time was named P-ACT+). Scores reported for the PLAN administered in 1987 and 1988 are not interchangeable with scores reported for the PLAN administered in 1989 or later. A concordance table relating scores on the original score scale (reported in the 1987 and 1988 administrations) to the current score scale (reported in the 1989 and later administrations) is available from ACT.

The 1991 and later administrations of PLAN used forms with slightly different statistical specifications than forms used for administrations prior to 1991. The change in statistical specifications was made to have the PLAN statistical specifications better match the ACT statistical specifications. This change resulted in forms used for the 1991 and later administrations being, in general, less difficult than forms used in earlier administrations. Scores on forms used in the 1991 and later administrations are equated back to the score scale developed using data from the 1988 Academic Skills Study. Because of the change in the statistical specifications, the highest achievable test scores and subscores may be less than 32 and 16, respectively, for forms used in the 1991 and later administrations (for further information regarding the maximum scale score, see the section in this manual on equating, which begins on page 33).

PLAN National Norming Study

In the fall of 2005, ACT conducted a research study to provide a new set of nationally representative norms for students taking PLAN during and after the fall of 2006. In this study, test score data were collected on students throughout the United States.

Sample. The composition of the sample obtained to compute the nationally representative norms for the PLAN tests is presented in Table 4.2. The numbers of fall 10th- and 11th-grade examinees used to compute the nationally representative norms for the PLAN tests were 4,356 and 2,556, respectively. The sample selected for the study included schools chosen to represent various regions of the United States and sizes of high school sophomore classes.


Weighting. For the sampling and norming process, individual examinee records were multiplied by weights to achieve representativeness with respect to the explicit stratification variables. The weight for an examinee was inversely proportional to the probability of the examinee's being chosen for the sample, given the sampling plan. The weight for each case was

WGT = W1 · W2 · K,

where
W1 = N/n,
W2 = M/m, and
K = a constant that scales down the total weighted sample size so that it is equal to the size of a simple random sample of equal precision.

The variables N, n, M, and m are defined as:
N = number of schools, in the population, from the stratum to which the school belongs;
n = number of schools, in the sample, from the stratum to which the school belongs;
M = number of students enrolled in the school associated with the student; and
m = number of students sampled from the school associated with the student.
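A minimal sketch of the weight computation defined above; the school and student counts, and the scaling constant K, are hypothetical.

```python
# WGT = W1 * W2 * K, with W1 = N/n (school-level weight) and W2 = M/m (within-school weight).
N, n = 1200, 40    # schools in the stratum: population count and sampled count (hypothetical)
M, m = 300, 60     # students in the examinee's school: enrolled and sampled (hypothetical)
K = 0.5            # constant scaling the weighted total down to an equally precise simple random sample

WGT = (N / n) * (M / m) * K
print(WGT)         # weight attached to this examinee's record
```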

Sample Representativeness. One way to determine the character and extent of sample bias is to compare the demographic characteristics of the sample of examinees with the U.S. statistics for various educational and demographic variables presented in Table 4.2. Precisely comparable U.S. data for the population of interest were not available. However, the data shown allow for a general examination of the representativeness of the sample with respect to the demographic variables. As indicated in Table 4.2, the weighted sample appears to be reasonably representative of the national population of interest.

Obtained Precision. The targeted precision level was to estimate any proportion to within .05 with probability .95. The actual obtained level of precision for the norms was such that any proportion is estimated to within .02 with probability .95.

Norms for the 2005 National Sample

The norms for PLAN are intended to represent the national population of 10th-grade students and the national subpopulation of 10th-grade students who reported that their educational plans include two-year postsecondary, four-year postsecondary, or graduate/professional studies (college bound). Norms are reported for students tested both in the fall and in the spring of 10th grade. The norms for PLAN were obtained from the 2005 nationally representative sample using the following sampling and weighting process.

The numbers of fall Grade 10 and fall Grade 11 examinees used to compute the nationally representative norms for the PLAN tests were 4,356 and 2,556, respectively. The sampling of schools was stratified by school size, and weights were applied so that the sample was representative of the population. In particular, a weight was applied within each school so that the students tested represented the total student body. A weight was also applied to each school so that the regional representation and sizes of schools were representative of schools nationwide. The calculated margin of error is approximately .02, indicating that, for any proportion, the population value should be within .02 of the calculated sample value with 95% probability.
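As a rough check on a precision statement of this kind, the 95% margin of error for an estimated proportion is largest at p = .5. The sketch below treats the reported fall Grade 10 count as if it were the effective (simple-random-sample-equivalent) size, which is an approximation.

```python
import math

# Worst-case 95% margin of error for an estimated proportion.
def margin_of_error(n_effective, p=0.5, z=1.96):
    return z * math.sqrt(p * (1 - p) / n_effective)

print(round(margin_of_error(4356), 3))  # roughly .015, consistent with "within .02"
```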


Table 4.2
Selected Demographic and Educational Characteristics of Grade 10 Students for the 2005 PLAN Norm Group

                                                   Grade 10                       Grade 11
Category identifier used in study         Weighted sample     U.S.       Weighted sample     U.S.
                                            proportion     proportion      proportion     proportion

Gender
  Female                                       .51            .49             .53            .49
  Male                                         .49            .51             .47            .51

Racial/Ethnic Origin
  African American                             .18            .17             .22            .17
  Native American                              .02            .01             .02            .01
  White                                        .57            .60             .52            .60
  Hispanic                                     .11            .17             .11            .17
  Asian American                               .02            .04             .05            .04
  Multiracial/Other, Prefer Not to Respond     .10            —               .11            —

School Affiliation
  Private                                      .06            .06             .06            .06
  Public                                       .94            .94             .94            .94

Geographic Region
  East                                         .39            .39             .39            .39
  Midwest                                      .23            .23             .24            .23
  Southwest                                    .14            .14             .14            .14
  West                                         .24            .24             .24            .24

Size of 10th Grade
  Small                                        .68            .64             .69            .64
  Medium                                       .22            .21             .18            .21
  Large                                        .10            .15             .13            .15


Scale Score Statistics for the 2005 National Sample

Scale score summary statistics for all examinees taking the two forms in the 2005 nationally representative sample are given in Table 4.3. Scale score statistics for college-bound examinees are given in Table 4.4. The data used for the results in Tables 4.3 and 4.4 were weighted using the weighting procedure described in the sampling section, on page 23. The score scale of PLAN was constructed using an early form of PLAN with data collected in fall 1988. The results presented in Tables 4.3 and 4.4 were obtained by equating the two newer forms to earlier forms using the procedures discussed in the equating section of this manual (page 33).

Cumulative Percents for the 2005 National Sample

Data from the national sample were used to develop cumulative percents (percents-at-or-below) for each PLAN test and the Composite as well as for the subscores. The percent-at-or-below corresponding to a scale score is defined as the percent of examinees with scores equal to or less than that scale score. Tables 4.5 through 4.8 give the norms (percents-at-or-below) for the four PLAN test scale scores, the four PLAN subscores, and the Composite score, for national and college-bound examinees, tested either in the fall or the spring of Grade 10.
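A minimal sketch of the percent-at-or-below computation for weighted data; the scores and weights below are invented for illustration.

```python
# Percent at or below a scale score: the weighted percent of examinees
# whose score is equal to or less than that scale score.
def percent_at_or_below(scores, weights, scale_score):
    total = sum(weights)
    mass = sum(w for s, w in zip(scores, weights) if s <= scale_score)
    return round(100 * mass / total)

scores  = [14, 17, 17, 19, 22, 25]        # hypothetical PLAN English scale scores
weights = [1.2, 0.8, 1.0, 1.0, 0.9, 1.1]  # hypothetical sampling weights
print(percent_at_or_below(scores, weights, 17))
```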

An examinee's standing on different tests should be compared by using the percents-at-or-below shown in the norms tables, not by using scale scores. The reason for preferring percents-at-or-below for such comparisons is that the scales were not constructed to ensure that, for example, a scale score of 21 on the English Test is comparable to a 21 on the Mathematics, Reading, or Science Tests. In contrast, examinee percents-at-or-below on different tests indicate standings relative to the same comparison group.

Even comparisons of percents-at-or-below do not permit comparison of standing in different skill areas in any absolute sense. The question of whether a particular examinee is stronger in science reasoning than in mathematics, as assessed by the corresponding tests, can be answered only in relation to reference groups of other examinees. Whether the answer is "yes" or "no" can depend on the group.


Table 4.3
Scale Score Summary Statistics for a National Sample

Statistic   English  Mathematics  Reading  Science  Composite  Usage/Mechanics  Rhetorical Skills  Pre-Algebra/Algebra  Geometry
Mean          16.9      17.4        16.9     18.2      17.5          8.3               8.4                 8.2            8.7
SD             4.6       4.6         4.6      3.5       3.8          3.1               3.1                 3.5            2.9
Skewness       0.3       0.5         0.3      0.6       0.5          0.2               0.2                 0.4            0.5
Kurtosis       3.0       3.7         2.8      4.3       3.1          2.8               2.5                 2.5            3.1

Table 4.4
Scale Score Summary Statistics for a College-Bound Sample

Statistic   English  Mathematics  Reading  Science  Composite  Usage/Mechanics  Rhetorical Skills  Pre-Algebra/Algebra  Geometry
Mean          17.6      17.9        17.4     18.5      18.0          8.7               8.8                 8.6            9.0
SD             4.5       4.5         4.5      3.5       3.8          3.0               3.1                 3.5            2.9
Skewness       0.3       0.6         0.2      0.5       0.4          0.2               0.1                 0.4            0.5
Kurtosis       3.1       3.5         2.8      4.3       3.1          2.8               2.5                 2.4            2.9


Table 4.5
PLAN 2005 National Norms for Fall Grade 10

Percent at or below
Scale score   English   Mathematics   Reading   Science   Composite   Usage/Mechanics   Rhetorical Skills   Pre-Algebra/Algebra   Geometry
(Subscore columns apply to scale scores 16 and below.)

32 100 100 100 100 100

31 99 99 99 99 99

30 99 99 99 99 99

29 99 99 99 99 99

28 99 98 99 99 99

27 98 96 99 98 99

26 97 95 98 98 98

25 96 93 96 97 97

24 94 91 94 96 95

23 92 89 91 93 93

22 88 86 87 90 89

21 84 83 83 85 85

20 79 79 78 78 79

19 72 73 73 69 72

18 65 66 66 57 64

17 57 57 58 45 54

16 49 48 50 32 44 100 100 100 100

15 40 37 41 21 33 98 99 96 98

14 32 27 32 13 23 97 97 93 95

13 24 18 24 7 14 94 94 90 92

12 17 11 17 4 8 90 89 86 88

11 12 6 11 2 4 85 82 81 83

10 7 3 7 1 2 77 74 76 77

9 4 2 4 1 1 67 64 68 67

8 2 1 2 1 1 55 53 58 53

7 1 1 1 1 1 41 41 47 37

6 1 1 1 1 1 28 29 35 22

5 1 1 1 1 1 18 19 24 10

4 1 1 1 1 1 10 10 14 4

3 1 1 1 1 1 5 5 7 2

2 1 1 1 1 1 2 2 3 1

1 1 1 1 1 1 1 1 1 1

Mean 16.9 17.4 16.9 18.2 17.5 8.3 8.4 8.2 8.7

SD 4.6 4.6 4.6 3.5 3.8 3.1 3.1 3.5 2.9


Table 4.6
PLAN 2005 National Norms for Spring Grade 10

Percent at or below
Scale score   English   Mathematics   Reading   Science   Composite   Usage/Mechanics   Rhetorical Skills   Pre-Algebra/Algebra   Geometry
(Subscore columns apply to scale scores 16 and below.)

32 100 100 100 100 100

31 99 99 99 99 99

30 99 99 99 99 99

29 99 98 99 99 99

28 98 97 99 99 99

27 98 95 98 98 99

26 96 94 97 97 98

25 95 92 95 96 96

24 92 90 92 95 94

23 89 87 89 93 91

22 85 84 85 89 87

21 81 81 80 84 82

20 75 76 75 76 76

19 68 70 70 67 69

18 61 63 63 55 61

17 53 54 56 43 51

16 45 44 48 31 41 100 100 100 100

15 36 34 40 21 30 98 99 95 97

14 28 24 31 13 21 96 96 92 94

13 21 15 23 7 13 92 92 88 90

12 15 9 16 4 7 88 87 84 86

11 10 5 10 2 3 82 79 79 81

10 6 3 6 1 1 73 70 73 74

9 3 1 3 1 1 63 60 65 64

8 2 1 2 1 1 51 49 55 50

7 1 1 1 1 1 37 38 44 34

6 1 1 1 1 1 25 27 32 19

5 1 1 1 1 1 15 17 21 8

4 1 1 1 1 1 8 9 12 3

3 1 1 1 1 1 4 4 6 1

2 1 1 1 1 1 2 1 2 1

1 1 1 1 1 1 1 1 1 1

Mean 17.4 17.8 17.2 18.3 17.8 8.7 8.7 8.5 9.0

SD 4.7 4.7 4.7 3.6 3.9 3.1 3.2 3.6 2.9


Table 4.7
PLAN 2005 College-Bound Norms for Fall Grade 10
(Educational plans include 2-year postsecondary, 4-year postsecondary, or graduate/professional studies)

Percent at or below
Scale score   English   Mathematics   Reading   Science   Composite   Usage/Mechanics   Rhetorical Skills   Pre-Algebra/Algebra   Geometry
(Subscore columns apply to scale scores 16 and below.)

32 100 100 100 100 100

31 99 99 99 99 99

30 99 99 99 99 99

29 99 99 99 99 99

28 98 97 99 99 99

27 98 96 98 98 99

26 97 94 97 98 98

25 95 93 95 97 96

24 93 90 93 95 94

23 90 88 89 93 92

22 86 85 86 89 88

21 82 81 81 83 83

20 76 76 75 75 76

19 68 70 69 65 68

18 60 62 62 52 59

17 52 53 53 40 49

16 43 43 45 28 38 100 100 100 100

15 34 32 36 18 27 98 99 96 97

14 26 22 27 10 18 96 96 92 94

13 19 14 20 6 10 93 93 89 91

12 13 8 13 3 5 89 87 84 87

11 8 4 8 1 3 82 80 79 82

10 5 2 5 1 1 73 71 72 74

9 3 1 3 1 1 62 60 63 63

8 1 1 1 1 1 49 48 53 49

7 1 1 1 1 1 35 36 42 33

6 1 1 1 1 1 23 24 31 18

5 1 1 1 1 1 13 15 20 8

4 1 1 1 1 1 7 8 11 3

3 1 1 1 1 1 3 3 5 1

2 1 1 1 1 1 1 1 2 1

1 1 1 1 1 1 1 1 1 1

Mean 17.6 17.9 17.4 18.5 18.0 8.7 8.8 8.6 9.0

SD 4.5 4.5 4.5 3.5 3.8 3.0 3.1 3.5 2.9


Table 4.8
PLAN 2005 College-Bound Norms for Spring Grade 10
(Educational plans include 2-year postsecondary, 4-year postsecondary, or graduate/professional studies)

Percent at or below
Scale score   English   Mathematics   Reading   Science   Composite   Usage/Mechanics   Rhetorical Skills   Pre-Algebra/Algebra   Geometry
(Subscore columns apply to scale scores 16 and below.)

32 100 100 100 100 100

31 99 99 99 99 99

30 99 99 99 99 99

29 99 98 99 99 99

28 98 96 99 98 99

27 97 95 98 98 98

26 95 93 96 97 97

25 93 90 94 96 95

24 90 88 90 94 92

23 87 85 87 91 89

22 83 82 82 87 84

21 77 78 77 81 79

20 71 72 71 72 72

19 63 66 65 61 64

18 55 58 58 49 55

17 47 49 50 37 44

16 38 39 42 26 34 100 100 100 100

15 30 28 34 17 24 98 98 94 97

14 22 19 26 10 15 95 96 90 93

13 15 12 19 6 9 91 91 86 89

12 10 6 13 3 4 86 85 81 84

11 6 3 8 2 2 78 76 76 78

10 4 2 5 1 1 69 66 68 70

9 2 1 2 1 1 57 54 60 60

8 1 1 1 1 1 44 43 49 45

7 1 1 1 1 1 31 32 38 29

6 1 1 1 1 1 19 22 27 15

5 1 1 1 1 1 11 13 17 7

4 1 1 1 1 1 6 7 9 3

3 1 1 1 1 1 3 3 4 1

2 1 1 1 1 1 1 1 1 1

1 1 1 1 1 1 1 1 1 1

Mean 18.2 18.4 17.8 18.8 18.4 9.1 9.1 9.0 9.3

SD 4.6 4.7 4.7 3.6 3.9 3.0 3.1 3.6 2.9


Estimated ACT Composite Score Interval

For all 10th-grade examinees who are administered PLAN operationally and receive a PLAN Composite score, an estimated ACT Composite score interval is reported. For any particular examinee, the Composite score obtained by the examinee on the ACT two years later may or may not fall in the interval reported on the PLAN score report.

The data used to construct the intervals consist of scores for examinees who were administered PLAN in the fall of 2002 as sophomores and who took the ACT in the fall of 2004 as seniors. This data set contained 168,808 examinees. Of the examinees who took PLAN in the fall of 2002, only examinees who took the ACT in fall 2004 as seniors were included, because the estimated ACT Composite score interval is defined to refer to the score that an examinee would obtain in the fall of his or her senior year.

Table 4.9 contains the bivariate frequencies of PLAN and ACT Composite scores for the 168,808 examinees. The rows of Table 4.9 give PLAN Composite scores and the columns give ACT Composite scores. For example, the cell corresponding to a PLAN Composite score of 15 and an ACT Composite score of 17 contains the number 2,024. This means that, of the 168,808 examinees, 2,024 received a PLAN Composite score of 15 in the fall of 2002 and an ACT Composite score of 17 in the fall of 2004.

The estimated ACT Composite score intervals were constructed based on the bivariate distribution of PLAN and ACT Composite scores. The goal was to obtain an overall probability of ACT Composite scores falling within the estimated ACT Composite score interval of around .75 (the same overall probability attained for the original intervals used prior to the fall of 1995). For each PLAN Composite score, the smallest interval possible was constructed such that approximately .75 of the conditional distribution of ACT Composite scores for that PLAN Composite score fell within the interval.
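The interval-selection step can be sketched as follows: for one PLAN Composite score, find the shortest ACT-score interval covering roughly 75% of the conditional distribution. The frequencies below are invented and are not the Table 4.9 counts.

```python
# Narrowest ACT Composite score interval whose conditional relative frequency
# reaches the target coverage (about .75) for a given PLAN Composite score.
def shortest_interval(freqs, coverage=0.75):
    scores, total, best = sorted(freqs), sum(freqs.values()), None
    for i, low in enumerate(scores):
        for high in scores[i:]:
            mass = sum(freqs[s] for s in scores if low <= s <= high)
            if mass / total >= coverage:
                if best is None or high - low < best[1] - best[0]:
                    best = (low, high)
                break
    return best

# Hypothetical conditional frequencies of ACT Composite scores for one PLAN score.
freqs = {15: 300, 16: 800, 17: 2000, 18: 1900, 19: 1200, 20: 500, 21: 150}
print(shortest_interval(freqs))
```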

The resulting intervals are reported in Table 4.10. For example, Table 4.10 indicates that, for a PLAN Composite score of 15, the lower limit of the estimated ACT Composite score interval is 15 and the upper limit is 19. Consequently, an estimated ACT Composite score interval of 15 to 19 is reported for examinees with PLAN Composite scores of 15. The width of the intervals varies with the PLAN Composite score. The intervals are wider in the middle of the PLAN score scale than at the extremes.

The score intervals are presented in Table 4.9 as shaded cells. The numbers in the column labeled "Row Total" give the total number of examinees for the PLAN Composite score corresponding to that row. The column labeled "Row Hits" gives the number of examinees whose obtained ACT Composite score was within the estimated score interval in that row (the sum of the numbers in the shaded cells). The last column, labeled "Prob. Cov.", is the proportion of the examinees in each row whose obtained ACT Composite score was within their estimated ACT Composite score interval (this is referred to as the probability of coverage). These probabilities of coverage are reasonably constant across PLAN Composite scores (the major discrepancies occur at PLAN Composite scores where there were few examinees). As reported at the bottom of Table 4.9, the proportion of all examinees whose estimated ACT Composite score interval contained their obtained ACT Composite score (the overall probability of coverage) was .75.



Table 4.9
Estimated ACT Composite Score Intervals Using PLAN 2002–ACT 2004 Data

(Bivariate frequency table. Rows: PLAN Composite scores 1–32. Columns: ACT Composite scores 1–36. Shaded cells mark the estimated ACT Composite score interval for each PLAN Composite score. The final three columns give, for each row, the Row Total, the Row Hits, and the Prob. Cov.; the overall probability of coverage reported at the bottom of the table is .75.)


Table 4.10
Estimated ACT Composite Score Intervals for PLAN Composite Scores

PLAN score   ACT interval low   ACT interval high   Width
     1              8                 10               3
     2              8                 10               3
     3              8                 10               3
     4              8                 11               4
     5              8                 11               4
     6              9                 12               4
     7             10                 13               4
     8             11                 14               4
     9             11                 14               4
    10             11                 15               5
    11             12                 15               4
    12             13                 17               5
    13             13                 17               5
    14             14                 18               5
    15             15                 19               5
    16             16                 20               5
    17             17                 21               5
    18             19                 23               5
    19             20                 24               5
    20             21                 25               5
    21             22                 26               5
    22             23                 27               5
    23             24                 28               5
    24             26                 30               5
    25             26                 30               5
    26             27                 31               5
    27             28                 32               5
    28             29                 33               5
    29             30                 33               4
    30             31                 34               4
    31             33                 35               3
    32             33                 35               3


Equating

Even though each new form is constructed to adhere to the current content and statistical specifications, the forms may be slightly different in difficulty. To control for these differences, subsequent forms are equated to earlier forms, and the scores reported to examinees are scale scores that have the same meaning regardless of the particular form administered to examinees. Thus, scale scores are comparable across test forms and test dates. (Please note the exception for PLAN forms administered prior to 1991, as discussed on page 22 of this manual.)

Equating is conducted using a special sample of students from schools that volunteer to participate in an equating study. The examinees in equating samples are administered a spiraled set of forms: the new forms ("n – 1" of them) and one anchor form that has already been equated to previous forms. (The initial anchor form was the form used to establish the score scale.) This spiraling technique, in which every nth examinee receives the same form of the test, results in randomly equivalent groups taking the forms. The use of randomly equivalent groups is an important feature of the equating procedure and provides a basis for confidence in the continuity of the scales.

Scores on the alternate forms are equated to the score scale using equipercentile equating methodology. In equipercentile equating, a score on Form X of a test and a score on Form Y are considered to be equivalent if they have the same percentile rank in a given group of examinees. The equipercentile equating results are subsequently smoothed using an analytic method described by Kolen (1984) to establish a smooth curve, and the equivalents are rounded to integers. The conversion tables that result from this process are used to transform raw scores on the new forms to scale scores.
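The core equipercentile step can be sketched as follows: each raw score on the new form is mapped to the anchor-form raw score having (at least) the same percentile rank. This simplified sketch omits the operational smoothing (Kolen, 1984) and continuization details, and the score distributions are invented.

```python
import bisect

def percent_at_or_below_each(freqs):
    """Cumulative percent of examinees at or below each raw score (index = raw score)."""
    total, running, out = sum(freqs), 0, []
    for f in freqs:
        running += f
        out.append(100 * running / total)
    return out

def equipercentile(new_form, anchor_form):
    """Map each new-form raw score to the anchor-form raw score with at least the same percentile rank."""
    new_pr = percent_at_or_below_each(new_form)
    old_pr = percent_at_or_below_each(anchor_form)
    return [bisect.bisect_left(old_pr, pr) for pr in new_pr]

# Hypothetical raw score frequencies (raw scores 0..5) on the new and anchor forms.
print(equipercentile([2, 10, 30, 40, 15, 3], [1, 8, 25, 45, 18, 3]))
```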

The equipercentile equating technique is applied to the raw scores of each of the four tests for each form separately. The Composite score is not directly equated across forms. Instead, the Composite is calculated by rounding the unweighted average of the scale scores for the four equated tests. The subscores are also separately equated using the equipercentile method. Note, in particular, that the equating process does not lead to the English and Mathematics Test scores being a sum of their respective subscores.

Prior to fall 1991, answering all of the items correctly on a test assured an examinee of obtaining a scale score of 32. Similarly, answering all of the items correctly within a subscore area assured an examinee of obtaining a scale subscore of 16. Beginning in fall 1991, the equating was conducted such that it is possible that an examinee answering all of the items correctly on a particular test will receive a scale score less than 32. Similarly, an examinee answering all of the items correctly within a subscore area may obtain a scale subscore less than 16.

The reason for allowing the possibility of the maximum scale score being unattainable for some forms is a change in the statistical specifications for forms administered in fall 1991 and later. PLAN (formerly P-ACT+) was introduced in 1987, prior to the introduction of the enhanced ACT in 1989. Subsequently, it was decided to modify the statistical specifications of the PLAN tests to better match the ACT specifications. This change in specifications has resulted in PLAN forms administered in the fall of 1991 and later being, in general, less difficult than earlier PLAN forms. The change in test difficulty could create gaps at the top of the score scale if an all-correct raw score were forced to equal a scale score of 32 on all tests. Consequently, there would be a discrepancy between the level of achievement conveyed by a scale score of 32 on later versus earlier forms. By allowing an all-correct raw score to convert to a scale score less than 32, scale scores from different PLAN forms are kept more comparable.

Reliability, Measurement Error, and Effective Weights

Some degree of inconsistency or error is potentially contained in the measurement of any cognitive characteristic. An examinee administered one form of a test on one occasion and a second, parallel form on another occasion would likely earn somewhat different scores on the two administrations. These differences might be due to the examinee or the testing situation, such as differential motivation or differential levels of distraction on the two testings. Alternatively, these differences might result from attempting to infer the examinee's level of skill from a relatively small sample of items.

Reliability coefficients are estimates of the consistency of test scores. They typically range from zero to one, with values near one indicating greater consistency and those near zero indicating little or no consistency.

The standard error of measurement (SEM) is closely related to test reliability. The standard error of measurement summarizes the amount of error or inconsistency in scores on a test. As noted previously, the score scales for PLAN were developed to have approximately constant standard errors of measurement for all true scale scores (i.e., the conditional standard error of measurement as a function of true scale score is constant). This implies, for example, that the standard error of measurement for any particular PLAN test score or subscore is approximately the same for low-scoring examinees as it is for high-scoring examinees. As discussed more fully in the score scale section on pages 20–22, if the distribution of measurement error is approximated by a normal distribution, about two thirds of the examinees can be expected to be mismeasured by less than 1 standard error of measurement.


Figure 4.1 presents the conditional standard errors of measurement for the four tests as a function of true scale score. Conditional standard errors of measurement for the subscores are presented in Figure 4.2. Data from the 2005 norming study were used in producing the plots. The conditional standard error of measurement functions were computed using methods discussed in Kolen, Hanson, and Brennan (1992). The minimum true scale score plotted is around 10 for each test and around 4 for each subscore because the probability of a true score lower than 10 for each test, or lower than 4 for each subscore, was very small.


Figure 4.1. Conditional standard error of measurement for the PLAN tests. (Four panels: English, Mathematics, Reading, Science. Each panel plots the standard error of measurement, 0.0 to 3.0, against true scale score, 10 to 30, with separate curves for the National and College-Bound samples.)


Figure 4.2. Conditional standard error of measurement for the PLAN subscores. (Four panels: Usage/Mechanics, Rhetorical Skills, Pre-Algebra/Algebra, Geometry. Each panel plots the standard error of measurement, 0.0 to 2.5, against true scale score, 4 to 16, with separate curves for the National and College-Bound samples.)


For most of the true scale score range, the scale score standard error of measurement is reasonably constant for the English and Reading Tests. For the Mathematics and Science Tests the conditional standard error of measurement is somewhat lower near the middle of the score range than it is for moderately low and moderately high scores. For all tests the standard error of measurement is smaller at very high scores. This is expected, since the conditional standard error of measurement must be zero for the maximum true scale score and near zero for true scale scores near the maximum (note that for some forms for each test the equating may result in a maximum scale score less than 32). The method used to produce the score scales, therefore, cannot guarantee a completely constant standard error of measurement for all true scale scores.

The proportion of examinees with true scale scores at the extremes of the score scale, where the deviations from a constant conditional standard error of measurement are most apparent in Figures 4.1 and 4.2, is small. For example, the average standard error of measurement for the Mathematics Test is 1.98. The average standard error of measurement, which is the average of the conditional standard errors of measurement given in Figure 4.1 over the distribution of true scale scores, is approximately equal to the corresponding conditional standard error of measurement at true scale scores in the middle of the scale. This reflects the fact that most of the true scores fall in the middle of the score range, and very few of the true scale scores fall in the extremes of the score range where the conditional standard errors of measurement deviate from the average. It is concluded that the constant conditional standard error of measurement property is, for practical purposes, reasonably well met for these forms.

PLAN reliability coefficients and average standard errors of measurement for all examinees in the 2005 norming study are given in Table 4.11. Kuder-Richardson formula 20 (KR-20) internal consistency reliability coefficients of raw scores are listed first. Scale score reliability coefficients and standard errors of measurement are reported next. Scale score average standard errors of measurement were estimated using a four-parameter beta compound binomial model as described in Kolen, Hanson, and Brennan (1992). The estimated scale score reliability for test i (REL_i) was calculated as

REL_i = 1 - SEM_i^2 / S_i^2,

where SEM_i is the estimated scale score average standard error of measurement and S_i^2 is the observed scale score variance for test i.

The estimated average standard error of measurement for the Composite (SEM_c) was calculated as

SEM_c = sqrt( Σ_i SEM_i^2 ) / 4,

where the summation is over the four tests. The estimated reliability of the Composite (REL_c) was calculated as

REL_c = 1 - SEM_c^2 / S_c^2,

where S_c^2 is the estimated variance of scores on the Composite.
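A minimal sketch applying these formulas to values reported later in this chapter (the average SEMs from Table 4.11 and the English scale score variance from Table 4.12, National Sample):

```python
import math

# Average scale score SEMs for the four tests (Table 4.11, National Sample).
sems = {"English": 1.71, "Mathematics": 1.98, "Reading": 2.03, "Science": 1.45}

# REL_i = 1 - SEM_i^2 / S_i^2
def scale_score_reliability(sem, observed_variance):
    return 1 - sem**2 / observed_variance

# SEM_c = sqrt(sum of squared test SEMs) / 4, because the Composite is the average of four scores.
sem_composite = math.sqrt(sum(s**2 for s in sems.values())) / 4

print(round(sem_composite, 2))                          # about 0.90, as in Table 4.11
print(round(scale_score_reliability(1.71, 21.13), 2))   # English: about .86, as in Table 4.11
```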



Table 4.11
Estimated Reliabilities and Standard Errors of Measurement

Statistic                    English  Usage/Mechanics  Rhetorical Skills  Mathematics  Pre-Algebra/Algebra  Geometry  Reading  Science  Composite

National Sample
  Raw score reliability        .88        .83               .74              .86             .80              .72       .81      .86       —
  Scale score reliability      .86        .82               .73              .81             .78              .71       .80      .83      .94
  Scale score SEM             1.71       1.27              1.63             1.98            1.62             1.58      2.03     1.45      .90

College-Bound Sample
  Raw score reliability        .88        .82               .72              .86             .78              .71       .81      .86       —
  Scale score reliability      .85        .81               .72              .80             .76              .70       .80      .83      .94
  Scale score SEM             1.73       1.28              1.63             2.01            1.65             1.59      2.01     1.44      .91


Assuming the measurement errors on the four tests are uncorrelated, the conditional standard error of measurement of the unrounded Composite scale score is

s_c(τ_e, τ_m, τ_r, τ_s) = sqrt[ s_e^2(τ_e) + s_m^2(τ_m) + s_r^2(τ_r) + s_s^2(τ_s) ] / 4,

where s_i(τ_i) is the conditional standard error of measurement for test i at true scale score τ_i, and i = e, m, r, s for English, Mathematics, Reading, and Science, respectively. The functions s_i(τ_i) are plotted in Figure 4.1. The conditional standard error of measurement for the Composite, instead of depending on a single variable as the conditional standard errors of measurement for the four tests do, depends on four variables: the true scale scores for the four tests. To facilitate presentation of the conditional standard errors of measurement for the Composite, the conditional standard errors are plotted as a function of the average of the true scale scores for the four tests. In other words, s_c(τ_e, τ_m, τ_r, τ_s) is plotted as a function of

(τ_e + τ_m + τ_r + τ_s) / 4.

A particular true Composite score value can be obtained in a variety of ways (i.e., different combinations of true scale scores on the individual tests could produce the same true Composite score). Consequently, each true Composite score value may correspond to several different values of the conditional standard error of measurement, depending on the combination of true scores on the four tests that produced the true Composite score value.
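A minimal sketch of this combination rule, using hypothetical conditional standard errors for the four tests at one set of true scale scores:

```python
import math

# s_c = sqrt(s_e^2 + s_m^2 + s_r^2 + s_s^2) / 4, assuming uncorrelated measurement errors.
def composite_csem(test_csems):
    return math.sqrt(sum(s**2 for s in test_csems)) / 4

# Hypothetical conditional SEMs for English, Mathematics, Reading, and Science.
print(round(composite_csem([1.7, 2.0, 2.0, 1.5]), 2))  # roughly 0.9
```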

To produce plots of the conditional standard errors of measurement of the Composite, the observed proportion-correct scores (the number of items correct divided by the total number of items) of the examinees on the four tests were treated as the true proportion-correct scores at which the conditional standard errors were calculated. For each test the conditional standard error of measurement was computed for each examinee using the observed proportion-correct score as the true proportion-correct score in the formula for the conditional standard error of measurement (Equation 8 in Kolen, Hanson, & Brennan, 1992). In addition, for each test the true scale score corresponding to the observed proportion-correct score (treated as a true proportion-correct score) was computed (Equation 7 in Kolen, Hanson, & Brennan, 1992). The resulting conditional standard errors of measurement for the four tests were substituted into the equation given above to compute a value of the conditional standard error of measurement of the Composite, which is plotted as a function of the average of the true scale scores across the four tests. This procedure was repeated for examinees in the 2005 norming study. Figure 4.3 presents a plot of these calculated conditional Composite standard errors of measurement versus the averages of the true scale scores over the four tests. Values for examinees who received proportion-correct scores of 0 or 1 on any of the four tests are not plotted in Figure 4.3, since while observed proportion-correct scores of 0 and 1 are possible, true proportion-correct scores of 0 and 1 are unrealistic.

The conditional standard errors of measurement, as presented in Figure 4.3, vary not only across average scale scores but also within each average scale score. Different standard errors of measurement are possible for each particular value of the average scale score because more than one combination of the four test scores can produce the same average score. The general trend in the plots is for the conditional standard errors in the middle of the scale to be lower than the conditional standard errors for moderately low and moderately high scores. This trend is similar to the trend in Figure 4.1 for the conditional standard errors of measurement for the Mathematics and Science Tests. The degree to which the conditional standard errors of measurement vary with true scale score is greater for the Mathematics and Science Tests than it is for the Composite. As with the four test scores, it is concluded that the conditional standard error of measurement of the Composite is, for practical purposes, reasonably constant across the score scale.

A limitation of the approach used in producing estimates of the conditional standard error of measurement of the Composite is that standard errors of measurement of the unrounded average of the four test scores are computed, rather than the standard errors of measurement of the rounded average of the four test scores (the rounded average is the score reported to examinees).

It is not a problem that the observed scores of the examinees are used in producing the plots, because it is standard errors conditional on average true scale score that are being plotted, and the observed scores for the examinees are only used to determine the specific average true scale scores at which to plot the standard errors. One effect of using observed scores as the true score values at which to plot the conditional standard errors of measurement is that many points at the extremes of the scale may not represent realistically obtainable average true scale scores (the probability of observing examinees with these values of average true scale score is extremely small).

Scale scores from the four tests are summed and divided by 4 in the process of calculating the Composite score. This process suggests that, in a sense, each test contributes equally to the Composite. The weights used (0.25, in this case) are often referred to as nominal weights.



Figure 4.3. Conditional standard error of measurement for the PLAN Composite scores. (Two panels: National and College-Bound. Each panel plots the standard error of measurement, 0.4 to 1.4, against average true scale score, 0 to 35.)


Other definitions of the contribution of a test to a Composite may be more useful. Wang and Stanley (1970) described effective weights as an index of the contribution of a test to a Composite. Specifically, the effective weights are defined as the covariance between a test score and the score on a Composite. These covariances can be summed over tests, and then each covariance divided by their sum (i.e., the Composite variance) to arrive at proportional effective weights. Proportional effective weights are referred to as effective weights in the remainder of this discussion.

The covariances and effective weights are shown in Table 4.12 for the norming study. The values in the diagonals that are not in brackets are the observed scale score variances; the diagonal values in brackets are the true scale score variances. With nominal weights of 0.25 for each test, the effective weight for a test can be calculated by summing the values in the appropriate row that are not in brackets and dividing the resulting value by the sum of all covariances among the four tests, using the formula

(effective weight)_i = Σ_j cov_ij / Σ_i Σ_j cov_ij,

where cov_ij is the observed covariance of test scores corresponding to row i and column j. Effective weights for true scores, shown in brackets, are calculated similarly, with the true score variance [S_i^2 · REL_i] used in place of the observed score variance.
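A minimal sketch of this computation, using the National Sample observed covariances reported in Table 4.12:

```python
# Observed scale score covariance matrix (Table 4.12, National Sample);
# row/column order: English, Mathematics, Reading, Science.
cov = [
    [21.13, 14.69, 15.57, 11.30],
    [14.69, 20.91, 13.00, 11.65],
    [15.57, 13.00, 20.81, 10.89],
    [11.30, 11.65, 10.89, 12.18],
]

total = sum(sum(row) for row in cov)             # proportional to the Composite variance
effective_weights = [sum(row) / total for row in cov]
print([round(w, 2) for w in effective_weights])  # about [.27, .26, .26, .20], matching Table 4.12
```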

The effective weights for English, Mathematics, and Reading are the largest of the effective weights. They are relatively high because these tests had the largest scale score variances and because their covariances with the other measures tended to be the highest. These effective weights imply that these tests are more heavily weighted (relative to Composite variance) in forming the Composite than is Science. Note that these effective weights are for the norming study sample and that the weights might differ for other examinee groups.


Table 4.12
Scale Score Covariances and Effective Weights

                                  English   Mathematics   Reading   Science
Number of items                     50          40           25        30
Proportion of total PLAN items     .34         .28          .17       .21

National Sample
English            21.13 [18.20]   14.69          15.57          11.30
Mathematics        14.69           20.91 [16.99]  13.00          11.65
Reading            15.57           13.00          20.81 [16.71]  10.89
Science            11.30           11.65          10.89          12.18 [10.08]
Effective weight     .27 [.28]       .26 [.26]      .26 [.26]      .20 [.20]
Reliability          .86             .81            .80            .83

College-Bound Sample
English            20.20 [17.02]   13.83          14.70          10.73
Mathematics        13.83           20.66 [16.60]  12.34          11.40
Reading            14.70           12.34          20.15 [16.10]  10.46
Science            10.73           11.40          10.46          11.95 [9.87]
Effective weight     .27 [.27]       .27 [.26]      .26 [.26]      .20 [.21]
Reliability          .85             .80            .80            .83


Validity

Validity refers to the degree to which particular uses of scores on a test are appropriate. For example, PLAN scores are intended to be used as measures of college-bound and non-college-bound students' academic development in early high school, and to provide an estimate of the students' future performance on the ACT test.

Measuring Educational Achievement

Content Validity Argument for PLAN Scores. The PLAN tests are designed to measure students' problem-solving skills and knowledge in particular subject areas. The usefulness of PLAN scores for this purpose provides the foundation for validity arguments for more specific uses (e.g., program evaluation).

The fundamental idea underlying the development and use of the PLAN tests is that the best way to determine student preparedness for further education and careers is to measure as directly as possible the knowledge and skills students will need in those settings. Tasks presented in the tests must therefore be representative of scholastic tasks. They must be intricate in structure, comprehensive in scope, and significant in their own right, rather than narrow or artificial tasks that can be defended for inclusion in the tests solely on the basis of their statistical correlation with a criterion. In this context, content-related validity is particularly significant.

The PLAN tests place a proportionately heavy emphasis on complex problem-solving skills. The tests are oriented toward major areas of high school instructional programs, rather than toward a factorial definition of various aspects of intelligence. Thus, PLAN scores, subscores, and skill statements based on the ACT College Readiness Standards are directly related to student educational progress and are easily interpreted by instructional staff, parents, and students.

As described earlier in this chapter, the specific knowledge and skills selected for inclusion in PLAN were determined through a detailed analysis of three sources of information. First, the objectives for instruction for Grades 7 through 12 were obtained for all states in the United States that had published such objectives. Second, textbooks on state-approved lists for courses in Grades 7 through 12 were reviewed. Third, educators at the secondary (Grades 7 through 12) and postsecondary levels were consulted to determine the knowledge and skills taught in Grades 7 through 12 that are prerequisite to successful performance in high school and beyond. These three sources of information were analyzed to define a scope and sequence (i.e., test content specifications) for each of the areas measured by PLAN. These detailed test content specifications have been developed to ensure that the test content is representative of current high school curricula. All forms are reviewed to ensure that they match these specifications. Throughout the item development process there is an ongoing assessment of the content validity of the tests.

PLAN Test Scores. This section provides evidence that the PLAN tests and subtests measure separate and distinct skills. The data included all 10th-grade 2005–2006 PLAN test takers who took the test under standard conditions (N = 881,976). Correlations were computed for all possible pairs of tests; disattenuated correlations adjust the observed correlations for measurement error associated with each test. As shown in Table 4.13, the scale scores on the four tests have observed correlations in the range of .63 to .74, indicating that examinees who score well on one test also tend to score well on another. The values in the table also show that the disattenuated correlation between any two PLAN tests is greater than the observed correlation. The disattenuated correlations among the tests are sufficiently below 1.0 to suggest that the tests are measuring skills that are at least somewhat distinct, statistically. In general, the disattenuated correlations between English and Reading scores and between Reading and Science scores are the highest of the disattenuated correlations between tests shown in the table, and the disattenuated correlations between Mathematics and Reading scores are the lowest.
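The disattenuation adjustment referred to here divides an observed correlation by the square root of the product of the two scale score reliabilities; a minimal sketch using the English-Reading observed correlation from Table 4.13 and the reliabilities from Table 4.11:

```python
import math

# Correct an observed correlation for measurement error in both scores.
def disattenuate(r_observed, reliability_x, reliability_y):
    return r_observed / math.sqrt(reliability_x * reliability_y)

print(round(disattenuate(0.74, 0.86, 0.80), 2))  # English-Reading, corrected for unreliability
```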


Table 4.13
Observed Correlations (On and Above Diagonal) and Disattenuated Correlations (Below Diagonal) Between PLAN Test Scores and Subscores

Test score            English  Usage/Mech.  Rhet. Skills  Math  Pre-Alg./Alg.  Geometry  Reading  Science  Comp.
English                1.00       .96           .90        .70       .68          .64       .74      .71    .90
Usage/Mechanics                  1.00           .76        .67       .65          .61       .70      .67    .85
Rhetorical Skills                              1.00        .65       .64          .59       .70      .65    .83
Mathematics                                                1.00       .93          .90       .63      .73    .87
Pre-Algebra/Algebra                                                  1.00          .73       .61      .70    .83
Geometry                                                                          1.00       .57      .66    .79
Reading                                                                                     1.00      .69    .87
Science                                                                                              1.00    .87


Statistical Relationships Between EXPLORE®, PLAN, and ACT Scores. The EXPLORE, PLAN, and ACT tests all measure student educational development in the same curricular areas of English, mathematics, reading, and science. The EXPLORE scale ranges from 1 (lowest) to 25, PLAN from 1 to 32, and the ACT from 1 to 36. Each battery includes a computed Composite score equal to the average of the four test scores in the four curriculum areas (English, Mathematics, Reading, and Science). The three programs focus on knowledge and skills typically attained within these curriculum areas at different times in students' secondary-school experience. Thus, performance on PLAN should be directly related to performance on EXPLORE and the ACT.

Table 4.14 shows the correlations between EXPLORE, PLAN, and ACT scale scores for 481,996 students who took EXPLORE, PLAN, and the ACT in Grades 8, 10, and 11–12, respectively. The table shows observed correlations for all test scores and disattenuated correlations (shown in parentheses) for corresponding test scores across EXPLORE, PLAN, and the ACT. The observed correlations among the four subject area tests are in the range of .53 to .80, and the disattenuated correlations are in the range of .77 to .94. The observed correlations between tests suggest that performance on the three test batteries is related.

Table 4.14
Correlations, Observed and (Disattenuated), Between EXPLORE, PLAN, and ACT Test Scale Scores

                                    EXPLORE
PLAN          English      Mathematics   Reading      Science      Composite
English       .75 (.88)    .61           .67          .64          .77
Mathematics   .60          .73 (.90)     .57          .63          .73
Reading       .62          .53           .63 (.77)    .59          .69
Science       .59          .61           .59          .63 (.78)    .69
Composite     .75          .72           .72          .72          .84 (.89)

                                      ACT
PLAN          English      Mathematics   Reading      Science      Composite
English       .80 (.82)    .64           .72          .65          .80
Mathematics   .66          .81 (.94)     .60          .70          .77
Reading       .67          .55           .70 (.85)    .60          .71
Science       .64          .67           .63          .68 (.83)    .73
Composite     .81          .77           .77          .76          .88 (.93)

EXPLORE and PLAN College Readiness Benchmarks. As described in Chapter 3, ACT has identified College Readiness Benchmarks for the ACT. These Benchmarks (ACT English = 18, ACT Mathematics = 22, ACT Reading = 21, and ACT Science = 24) reflect a 50% chance of a B or higher grade, or a 75–80% chance of a C or higher grade, in entry-level, credit-bearing college English Composition, College Algebra, Social Sciences, and Biology courses. Subsequently, corresponding College Readiness Benchmarks were developed for EXPLORE and PLAN to reflect a student's probable readiness for college-level work in these same courses by the time he or she graduates from high school.

The EXPLORE and PLAN College Readiness Benchmarks were developed using approximately 150,000 records of students who had taken EXPLORE, PLAN, and the ACT. Using each of the EXPLORE subject area scores and PLAN subject area scores, we estimated the conditional probabilities associated with meeting or exceeding the corresponding ACT College Readiness Benchmark. Thus, each EXPLORE (1–25) or PLAN (1–32) score was associated with an estimated probability of meeting or exceeding the relevant ACT Benchmark (see Figure 4.4 for English). We then identified the EXPLORE and PLAN scores that came the closest to a .5 probability of meeting or exceeding the ACT Benchmark, by subject area. These scores were selected as the EXPLORE and PLAN Benchmarks.
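A sketch of the benchmark-selection step: pick the PLAN score whose estimated probability of meeting the ACT Benchmark is closest to .5. The probabilities below are invented for illustration.

```python
# Choose the score whose conditional probability of meeting or exceeding the
# corresponding ACT College Readiness Benchmark is closest to .5.
def pick_benchmark(prob_by_score):
    return min(prob_by_score, key=lambda score: abs(prob_by_score[score] - 0.5))

# Hypothetical probabilities of meeting the ACT English Benchmark (18), by PLAN English score.
prob_by_score = {13: 0.31, 14: 0.41, 15: 0.52, 16: 0.63, 17: 0.72}
print(pick_benchmark(prob_by_score))  # 15, consistent with the PLAN English Benchmark shown in Figure 4.5
```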

Page 52: 10421 PLAN Manual 1-34successlineinc.com/actinfo/plantechnicalmanual.pdf · ACT endorses the Code of Fair Testing Practices in Educationand the Code of Professional Responsibilities

42

The resulting EXPLORE and PLAN Benchmarks, with thecorresponding ACT Benchmarks, are given in Table 3.5.

Figure 4.5 shows the percentages of PLAN-tested stu-dents in 2005–2006 who met the PLAN Benchmarks.

1.00

.90

.80

.70

.60

.50

.40

.30

.20

.10

.00

EXPLORE or PLAN English score

1 3 5 7 9 11 13 15 17 19 21 23 25 27 29 31

Pro

bab

ility

PLANEXPLORE

Figure 4.4. Conditional probability of meeting/exceeding an ACT English score = 18, given students’ EXPLORE or PLAN English score.

Page 53: 10421 PLAN Manual 1-34successlineinc.com/actinfo/plantechnicalmanual.pdf · ACT endorses the Code of Fair Testing Practices in Educationand the Code of Professional Responsibilities

43

Figure 4.5. 2005–2006 national PLAN-tested students likely to be ready for college-level work (in percent).

100

90

80

70

60

50

40

30

20

10

0

EnglishComposition

PLANEnglish

BenchmarkScore = 15

Per

cen

t R

ead

y

Algebra

PLANMath

BenchmarkScore = 19

SocialSciences

PLANReading

BenchmarkScore = 17

Biology

PLANScience

BenchmarkScore = 21

StudentsMeeting All 4

PLANBenchmark

Scores

75

35

57

2721

Statistical Relationships Between PLAN Scores andHigh School Course Work and Grades

The PLAN tests are oriented toward the general contentareas of high school and college curricula. Students’ per-formance on PLAN should therefore be related to the highschool courses they have taken and to their performance inthese courses.

The PLAN Course/Grade Information Section (CGIS) collects information about 38 high school courses in English,mathematics, social studies, and natural sciences. Many ofthese courses form the basis of a high school core curriculum and are also frequently required for collegeadmission or placement. For each of the 38 courses, stu-dents indicate whether they have taken or are currently tak-ing the course, whether they plan to take it, or do not plan totake it. If they have taken the course, they indicate the gradethey received (A–F).

PLAN Scores and High School Course Work.Table 4.15 includes average PLAN scores by course work10th-grade students nationally reported they had taken orwere currently taking at the time of PLAN testing (ACT,2006). Students who had taken additional courses, espe-cially upper-level courses, within each subject area,achieved higher average PLAN subject area and Compositescores than students taking fewer courses. Moreover, stu-dents who had taken sufficient course work by Grade 10 tobe on track to complete the ACT-recommended core curricu-lum (four years of English and three years each of mathe-matics, social studies, and science) by high schoolgraduation had higher average PLAN scores than those whohad not completed these courses.

Page 54: 10421 PLAN Manual 1-34successlineinc.com/actinfo/plantechnicalmanual.pdf · ACT endorses the Code of Fair Testing Practices in Educationand the Code of Professional Responsibilities

44

Course workNo. of

students

Mean PLAN score

English Mathematics Reading Science Composite

English

English 9 101756 16.3 16.9

English 9 & English 10 567279 18.1 18.5

Other combinations of one or more years of English 102144 17.3 17.9No English course work information reported 105127 16.2 17.0

Mathematics

Algebra 1 153907 15.4 15.8

Algebra 1 & Algebra 2 39384 17.1 17.4

Algebra 1 & Geometry 284452 17.8 18.0

Algebra 1, Geometry & Algebra 2 161552 21.3 20.6

Other combinations of one or more years of math 106465 20.3 19.8No math course work information reported 118221 16.6 16.7

Social Studies

U.S. History 38037 16.1 16.7

World History 62462 17.5 18.1

World History & U.S. History 93806 17.6 18.2

Other combinations of one year of social studies 153444 17.3 18.0Other combinations of two or more years of social studies 389979 18.1 18.7No social studies course work information reported 95996 16.0 16.7

Natural Science

General Science 104308 17.3 16.4

Biology 61290 17.9 17.2

Chemistry 3124 19.0 18.4

General Science & Biology 367402 18.6 18.0

General Science & Chemistry 9440 20.1 19.6

Biology & Chemistry 86499 20.9 20.8

Other combinations of one or more years of natural science 125695 19.7 19.3No natural science course work information reported 114641 17.1 16.8

On track for core course work

English 10, Algebra 1 plus oneother math course, any social studies course, and Biology 437179 18.9 19.4 18.7 19.6 19.3

Not taken/not taking these courses 441868 16.2 16.8 16.3 17.7 16.9

Table 4.15Average PLAN Scores by Courses Taken in High School

Page 55: 10421 PLAN Manual 1-34successlineinc.com/actinfo/plantechnicalmanual.pdf · ACT endorses the Code of Fair Testing Practices in Educationand the Code of Professional Responsibilities

45

PLAN Scores and Course Grades. The results shown in Table 4.16 are based on the PLAN scores and self-reported course grades of students who took PLAN inGrade 10 in 2005–2006. A total of 883,470 10th-grade stu-dents had valid PLAN Composite scores and took the testunder standard conditions.

Across individual courses, correlations between subjectarea PLAN scores and associated course grades rangedfrom .27 (Geography) to .47 (Algebra 1). However, correla-tions with subject area GPAs and overall high school GPAwere generally somewhat higher, ranging from .38 to .48 forsubject area GPAs and from .48 to .53 for overall high schoolGPA.

In general, correlations between test scores and coursegrades are smaller than those between test scores due tothe lower reliabilities of course grades. For example, theintercorrelations among course grades could be consideredan estimate of the reliabilities of individual course grades.For these courses, the median correlation among pairs ofgrades was .51. Using this value as a reliability estimate forindividual course grades, disattenuated correlations amongtest scores and individual course grades ranged from .27(Geography) to .47 (Algebra 1; see page 40 for further dis-cussion of disattenuated correlations).

Course grade/grade average

No. of students Mean

PLAN score

English Mathematics Reading Science Composite

English

English 9 716552 3.05 .44 .48

English 10 375957 3.09 .41 .44

English GPA 733190 3.04 .45 .49

Mathematics

Algebra 1 642462 3.03 .47 .46

Geometry 360847 3.16 .44 .45

Algebra 2 162739 3.26 .41 .41

Math GPA 686199 2.98 .48 .47

Social Studies

U.S. History 274000 3.15 .39 .45

World History 329993 3.18 .39 .45

Government/Civics 135686 3.18 .40 .47

World Cultures/Global Studies 76537 3.25 .40 .46

Geography 297644 3.34 .27 .33

Economics 35937 3.16 .38 .46

Social Studies GPA 708395 3.20 .38 .44

Science

Physical/Earth/General Science 503041 3.08 .41 .46

Biology 1 419838 3.10 .42 .43

Chemistry 1 90665 3.25 .39 .46

Science GPA 691247 3.06 .44 .49

Overall

Overall GPA 609620 3.12 .51 .53 .48 .51 .58

Table 4.16Correlations Between PLAN Scores and Course Grades

Page 56: 10421 PLAN Manual 1-34successlineinc.com/actinfo/plantechnicalmanual.pdf · ACT endorses the Code of Fair Testing Practices in Educationand the Code of Professional Responsibilities

46

Table 4.17Means and Correlations for PLAN Score and High School Grade Point Average

EXPLORE NPLAN means

HSGPA means Correlations

English 221,805 19.0 3.30 .46

Mathematics 210,651 18.9 3.12 .49

Reading 210,666 18.4 3.40 .39

Science 210,493 19.2 3.25 .39

Composite 211,603 19.0 3.27 .55

The results showed a moderate relationship betweenHSGPA and PLAN test scores, even though the time spanbetween PLAN testing and HSGPA was about two years.The largest correlation was between the PLAN Compositescore and HSGPA.

Course Work Associated With LongitudinalEducational Achievement, as Measured by EXPLOREand PLAN Scores. Roberts and Noble (2004) examined theextent to which the courses students take or plan to take inhigh school explain PLAN performance at Grade 10, afterstatistically controlling for prior achievement at Grade 8. Thecontribution of nonacademic variables, including post–highschool plans and needs for help, was also examined.

Data and Method. The sample for this study was basedon all students (over 175,000 students) who took EXPLOREin Grade 8 in 1997–1998 and PLAN two years later in1999–2000. Students who were given extended time, highschools with less than 25 student records, and students withmissing data on one or more variables were excluded fromthe study. Initial data screening resulted in a longitudinalsample of 42,193 student records from 488 high schools.

Multiple linear regression was used to examine theeffects of taking specific high school courses on students’PLAN scores. Simple correlations and regression modelsbased on pooled data were used to identify the final regres-sion models for each PLAN score. Within-school regressionmodels were then developed using final regression models.Regression statistics were then summarized across schoolsusing minimum, median, and maximum values. EXPLOREscores were entered into each regression equation first tocontrol for prior achievement before entering five sets ofindependent variables. Independent variables were required

to share a logical relationship with PLAN scores, have azero-order correlation greater than .1 or higher with PLANscores, and regression coefficients for course work variableswere required to be > 0.5.

Results. Zero-order correlations between PLAN scoresand selected independent variables that met the criteria formodel selection are shown in Table 4.18 (see Roberts &Noble, 2004 for further information). Mathematics coursework taken (Algebra 2, Geometry) and planned (e.g.,Trigonometry, Calculus) had the highest correlations with allPLAN scores. English and social studies course work takenor planned had low correlations with PLAN English andReading scores; this was typically due to limited variability inthe course work variables (e.g., virtually all students takeEnglish 9 and 10).

Summary. Results of this study showed that studentswho take or plan to take rigorous mathematics and sciencecourses (e.g., Algebra 2, Geometry, Trigonometry, andChemistry), on average, achieve higher PLAN Mathematics,Science, and Composite scores than students who do nottake these courses, regardless of prior achievement, per-ceived educational needs, educational plans, educationalbackground, and personal characteristics of the student.Given the benchmarks set for this study, few relationshipswere detected between PLAN English and Reading scoresand courses taken or planned in English and social studies.It should not be concluded, however, that these types ofcourses are not important to PLAN performance. Exclusionof English and social studies course work occurred largelybecause of insufficient variability and problems of highlyredundant relationships with other independent variables.

Additional ACT research examined the relationshipbetween high school grade point average (HSGPA) at thetime students took the ACT and their PLAN test scores atGrade 10. The data included high school students who grad-uated in 2002, 2003 or 2004 and took EXPLORE, PLAN and

the ACT. Self-reported high school course grades from theACT CGIS were used to calculate an overall HSGPA. PLANmeans, HSPGA means, and correlations for each of the fourPLAN test scores and the Composite are presented in Table 4.17.

Page 57: 10421 PLAN Manual 1-34successlineinc.com/actinfo/plantechnicalmanual.pdf · ACT endorses the Code of Fair Testing Practices in Educationand the Code of Professional Responsibilities

47

Independent variable %Correlations with PLAN Score

English Mathematics Reading Science Composite

Courses taken

Algebra 1 (not pre) 95 0.18 0.18 0.16 0.16 0.20Algebra 2 42 0.34 0.44 0.30 0.34 0.41Geometry 72 0.38 0.43 0.33 0.35 0.43Trigonometry 6 0.20 0.28 0.19 0.22 0.25Other math beyond Algebra 3 0.07 0.10 0.06 0.08 0.09General Science 82 –0.06 –0.10 –0.07 –0.07 –0.08Biology 89 0.10 0.09 0.08 0.08 0.10Chemistry 22 0.22 0.29 0.22 0.23 0.27Art 58 0.10 0.10 0.09 0.08 0.11German 43 0.12 0.06 0.09 0.07 0.10Courses taken or plannedEnglish 11 98 0.10 0.10 0.09 0.09 0.11English 12 96 0.11 0.10 0.10 0.09 0.12Algebra 2 94 0.17 0.17 0.15 0.15 0.18Geometry 95 0.18 0.18 0.16 0.16 0.20Trigonometry 66 0.26 0.30 0.23 0.26 0.30Calculus (not pre) 51 0.25 0.34 0.23 0.27 0.31Other math beyond Algebra 57 0.19 0.25 0.18 0.20 0.23General Science 87 –0.10 –0.14 –0.10 –0.10 –0.12Chemistry 91 0.16 0.16 0.14 0.14 0.17Physics 73 0.15 0.19 0.14 0.17 0.19Psychology 45 0.12 0.06 0.12 0.07 0.11French 57 –0.09 –0.09 –0.08 –0.09 –0.10Educational needExpressing ideas in writing 46 0.14 0.02 0.13 0.06 0.10Increasing reading speed 48 0.23 0.07 0.25 0.15 0.21Increasing reading understanding 45 0.23 0.15 0.27 0.21 0.25Developing math skills 36 0.15 0.37 0.10 0.22 0.24Developing study skills 28 0.14 0.15 0.12 0.14 0.16Developing test-taking skills 35 0.26 0.28 0.24 0.27 0.30Investigating my options after high school 27 –0.11 –0.11 –0.10 –0.10 –0.12

College-bound/non-college-bound 94 0.22 0.19 0.19 0.18 0.22

Parents’ level of educationa 0.28 0.30 0.27 0.26 0.32Student characteristicsGender 55 0.15 –0.08 0.10 –0.03 0.05Black vs. white comparison 8 –0.23 –0.25 –0.21 –0.23 –0.26Hispanic vs. white comparison 5 –0.13 –0.09 –0.10 –0.10 –0.12Asian vs. white comparison 2 0.05 0.10 0.05 0.06 0.07Other vs. white comparison 6 –0.08 –0.08 –0.07 –0.07 –0.08Note. N = 42,193 students with no missing data on all variables. aFrom EXPLORE; Mean = 3.79 and standard deviation = 1.43 for parents’ level of education.

Table 4.18Percentages and Zero-Order Correlation Coefficients for Blocks of

Independent Variables That Met the Criteria of Selection

Page 58: 10421 PLAN Manual 1-34successlineinc.com/actinfo/plantechnicalmanual.pdf · ACT endorses the Code of Fair Testing Practices in Educationand the Code of Professional Responsibilities

48

Table 4.19Distributions, Across Schools, of Regression Statistics for Modeling

PLAN Mathematics, Science, and Composite Scores

Test/Subtest

PLAN Mathematics PLAN Science PLAN Composite

Median Minimum Maximum Median Minimum Maximum Median Minimum Maximum

R .83 .63 .95 .73 .31 .90 .87 .58 .97

SEE 2.37 1.20 4.60 2.20 1.24 3.55 1.78 0.97 2.56

Intercept 4.66 –7.40 15.66 6.82 –1.91 18.36 1.83 –6.78 14.32

Regression Coefficients

EXPLORE

Mathematics/Composite 0.64 –0.06 1.38 0.68 0.11 1.14 0.92 0.23 1.35

Taken

Algebra 2 1.53 –6.93 9.33 — — — — — —

Geometry 1.37 –8.52 7.85 0.52 –5.18 4.75 0.60 –3.05 3.55

Taken or Planned

Trigonometry 0.35 –3.19 4.80 0.33 –3.44 3.48 0.31 –3.41 3.57

Chemistry 0.10 –7.52 8.57 — — — — — —

Needs help developing test-taking skills 0.64 –2.46 3.35 — — — — — —

Educational plans 0.36 –7.09 8.73 — — — 0.44 –5.53 5.73

Parents’ education 0.10 –1.01 1.33 0.07 –0.58 1.31 0.10 –0.51 0.74

Gender –0.53 –5.63 4.48 –0.43 –3.30 1.93 –0.08 –1.81 2.85

Majority/Minority 0.53 –12.70 6.60 0.22 –6.98 5.02 0.33 –5.11 3.78

Median, minimum, and maximum regression statistics foreach of the five PLAN models are displayed in Tables 4.19and 4.20. A separate table was constructed for PLANEnglish and Reading, since no course work variables metthe criteria for selection for these tests. The typical number

of students for each model was 61, and ranged from 25 to320 across schools. The multiple R medians ranged from.73 for PLAN Science to .87 for PLAN Composite.

Page 59: 10421 PLAN Manual 1-34successlineinc.com/actinfo/plantechnicalmanual.pdf · ACT endorses the Code of Fair Testing Practices in Educationand the Code of Professional Responsibilities

49

Table 4.20Distributions, Across Schools, of Regression Statistics

for Modeling PLAN English and Reading Scores

Regression coefficients for the course work variablesreflect statistically adjusted mean test score differencesbetween students who took the course (coded as one) andthose who did not take the course (coded as zero). Forexample, as shown in Table 4.19, positive high schoolmedian regression coefficients for the PLAN Mathematicstest were associated with taking Algebra 2, given all othervariables in the model. Adjusted mean PLAN Mathematicsscores of students taking Algebra 2 were typically 1.53 scalescore units higher than those of students who did not takeAlgebra 2. In addition, adjusted mean PLAN Mathematicsscores of students taking Geometry were, on average, 1.37

scale score units higher than those of students not takingGeometry. Smaller adjusted mean differences were associ-ated with taking or planning to take Trigonometry (.35) orChemistry (.10). Positive regression coefficients on thePLAN Science Test were, on average, associated with tak-ing Geometry (0.52) or taking or planning to takeTrigonometry (0.33). Results also indicated, as shown inTables 4.19 and 4.20, that positive regression coefficientswere, on average, associated with planning to attend collegefor PLAN English (0.68), Mathematics (0.36), Reading(0.60), and Composite (0.44) scores.

Statistic

PLAN English PLAN Reading

Median Minimum Maximum Median Minimum Maximum

R .81 .45 .93 .76 .37 .93

SEE 2.67 1.73 3.91 3.06 1.03 4.35

Intercept –1.89 –14.59 13.34 –0.55 –14.33 15.32

Regression coefficients

EXPLORE Composite 1.11 0.36 1.69 1.00 0.15 1.97

Increasing reading speed — — — 0.72 –3.94 4.66

Increasing reading understanding — — — 0.70 –2.55 6.23

Educational plans 0.68 –5.73 7.00 0.60 –6.82 6.83

Parents’ education 0.16 –1.93 1.23 0.13 –0.94 1.55

Gender 0.85 –2.40 3.97 0.36 –3.10 4.44

Majority/Minority 0.46 –5.53 6.61 0.23 –7.94 9.19

Page 60: 10421 PLAN Manual 1-34successlineinc.com/actinfo/plantechnicalmanual.pdf · ACT endorses the Code of Fair Testing Practices in Educationand the Code of Professional Responsibilities

50

The regression coefficients for gender in Tables 4.19 and4.20 indicated that for the typical high school, PLANadjusted means for females were higher than those formales for PLAN English and Reading (median regressioncoefficients = 0.85 and 0.36), whereas adjusted means forfemales were typically lower than those for males for PLANMathematics and Science (median regression coefficients =–0.53 and –0.43). After statistically controlling for priorachievement and the other independent variables, genderdifferences decreased, on average, by 53% (English), 12%(Mathematics), 70% (Reading), and 80% (Composite).Mathematics continued to show higher average PLANscores for males. To a much lesser degree, English andReading continued to show higher average scores forfemales. Although Composite median adjusted means formales were slightly higher than those for females (medianregression coefficient = –0.08), the gender difference inunadjusted means was substantially reduced (92%) by sta-tistically controlling for prior achievement and the other inde-pendent variables.

For Science, statistically controlling for prior achievementand the other independent variables resulted in an increasein gender differences. Regressing EXPLORE Composite ongender alone showed that within most schools, femalesscored half a point higher than males (0.56) at Grade 8.Regressing PLAN Science on gender after controlling forEXPLORE Composite scores only resulted in an averageadjusted mean difference of –0.45 across schools, which isonly slightly higher than the result shown in Table 4.19.Within most schools, females score higher than males atGrade 8 on EXPLORE Composite, but by Grade 10 maleshave caught up and score slightly higher than females onPLAN Science. Other research into relationships betweengender and achievement provides evidence that males tendto outperform females in mathematics and science over timeon the Iowa Test of Basic Skills from Grades 3 through 8 andon the Iowa Test of Educational Development from Grades 9through 12 (Becker & Forsyth, 1990; Martin & Hoover,1987). It is possible females begin to lose interest in scienceby Grade 10 and focus their attention on other subject areas.

After statistically controlling for prior achievement and theother independent variables in this study, PLAN scores con-tinued to show higher averages for the majority group. Meandifferences were reduced by 81% (English), 77%(Mathematics), 90% (Reading), 87% (Science), and 86%(Composite) by statistically controlling for these variables.

Course Work Associated With LongitudinalEducational Achievement, as Measured by PLAN andACT Scores. ACT research has shown that taking rigorous,college-preparatory mathematics courses is associated withhigher ACT Mathematics and Composite scores. (e.g., ACT,2005a; Noble, Davenport, & Sawyer, 2001; Noble, Roberts,& Sawyer, 2006). Schiel, Pommerich, and Noble (1996) sta-tistically controlled for prior achievement using PLAN scoresand found substantive increases in average ACTMathematics and Science scores associated with taking

upper-level mathematics and science courses. In a recentstudy (Noble & Schnelker, 2007; ACT, 2005b) researchersexamined the effects of taking specific high school coursesequences on students’ ACT performance in English,Mathematics, and Science based on data for students whohad taken both PLAN and the ACT.

Data and Method. Data for 403,381 students represent-ing 10,792 high schools were analyzed. The PLAN/ACTcohort file for the 2003 graduating class contained matchedrecords of students who completed PLAN during their soph-omore year (2000–2001) and the ACT during their junior orsenior year, prior to graduating in 2003. If students took theACT more than once, only the most recent ACT record wasused. Each record included PLAN and ACT scores (inEnglish, Mathematics, and Science), race/ethnicity, gradelevel at the time of taking the ACT, self-reported course workinformation from the CGIS, and high school attended.Dummy variables were used to represent specific coursesequences; the course sequences were based on previousresearch (ACT, 2004; Noble et al., 1999) and were con-structed such that the incremental benefit of specific coursescould be determined.

Hierarchical regression modeling was used to examinethe effects of taking specific high school course sequenceson students’ ACT scores. Hierarchical regression modelsaccount for variability in regression coefficients acrossschools in order to draw correct conclusions about predictor-outcome relationships. In these analyses, student-levelregression coefficients were allowed to vary across highschools.

All effects were examined in the context of the highschools students attended, and prior achievement (i.e.,PLAN scores) and students’ grade level at the time of ACTtesting were statistically controlled in the models. For a moredetailed discussion concerning the data and methods used,including a more in-depth discussion of hierarchical regres-sion, see Noble and Schnelker (2007).

Results. The results of the hierarchical linear regressionmodels are shown in Table 4.21. The table includes theunstandardized regression coefficients for each variable ineach model; all regression coefficients were statistically sig-nificant (p < .01) unless otherwise noted. Overall, about 0.60of the variance in students’ ACT English scores, between0.50 to 0.60 of the variance in students’ ACT Mathematicsscores, and between 0.30 to 0.50 of the variance in students’ACT Science scores were explained by the models. Highschool attended explained 0.16 to 0.25 of the varianceacross ACT scores (intraclass correlations; see Noble &Schnelker, 2007).

For all models, PLAN scores were positively related toACT scores. A 1-point increase in PLAN English score cor-responded to about a 1-point increase in ACT English score,and a 1-point increase in PLAN Mathematics or Sciencescore corresponded to about a 0.8-point increase in ACTMathematics or Science score, respectively. Moreover, highschool seniors, on average, scored about 0.3 points higher

Page 61: 10421 PLAN Manual 1-34successlineinc.com/actinfo/plantechnicalmanual.pdf · ACT endorses the Code of Fair Testing Practices in Educationand the Code of Professional Responsibilities

51

Table 4.21Hierarchical Linear Regression Coefficients for Modeling ACT Scores

on ACT English, about 0.5 to 0.7 points lower on ACTMathematics, and about 0.1 to 0.5 points lower on ACTScience than did juniors.

Taking one or more foreign languages, over and aboveEnglish 9–11, increased students’ ACT English score, onaverage, by 1.1 score points, compared to taking onlyEnglish 9–11. Taking Algebra 1, Algebra 2, and Geometrywas associated with an average ACT Mathematics scoreincrease of about 1.1 score points, compared with takingless than these three courses. Taking either Trigonometry orOther Advanced Mathematics, over and above these threecourses, resulted in an average increase in ACTMathematics score of 1.0 to 1.5 score points. Taking OtherAdvanced Mathematics and Trigonometry, or Trigonometryand Calculus, increased ACT Mathematics scores, on aver-age, by more than 2.0 score points. The greatest averagescore increase associated with mathematics course workresulted from taking Other Advanced Mathematics,Trigonometry, and Calculus, in addition to Algebra 1,Geometry, and Algebra 2 (3.2 score points).

Compared with taking General Science only, takingGeneral Science and Biology, or Biology alone, resulted inan average ACT Science score increase of about 0.5 points.Taking Biology and Chemistry, or Biology, Chemistry, andPhysics, was associated with an average ACT Sciencescore increase of 1.3 and 2.4 score points, respectively,compared to taking Biology only.

Summary. These results indicate that, in a typical highschool, students who take upper-level mathematics or sci-ence courses (e.g., Trigonometry, Calculus, Chemistry, orPhysics) can expect, on average, to earn meaningfullyhigher ACT Mathematics and Science scores than studentswho do not take these courses. The benefits of course worktaken in high school for increasing ACT performance dependon the high school students attend, regardless of priorachievement and grade level at testing. The relationshipsbetween course work taken and ACT performance are alsoinfluenced by the characteristics of schools. For a detaileddescription of these results, see Noble and Schnelker(2007).

Model Course work comparison

Regression coefficient

Level 1R2Intercept

PLANscore

Gradelevel

Coursework

ACT English score

1 English 9–11 & 1 or more foreign languages vs. English 9–11 1.33 0.99 0.32 1.12 0.60

ACT Mathematics score

1 Algebra 1, Algebra 2, and Geometry vs. less than these courses 5.03 0.75 –0.45 1.07 0.52

Algebra 1, Algebra 2, and Geometry vs.

2 Algebra 1, Algebra 2, Geometry & Other Advanced Math 5.65 0.79 –0.66 1.01 0.52

3 Algebra 1, Algebra 2, Geometry & Trig 5.63 0.79 –0.70 1.52 0.59

4 Algebra 1, Algebra 2, Geometry, Trig & Other Advanced Math 5.91 0.78 –0.72 2.02 0.60

5 Algebra 1, Algebra 2, Geometry, Trig & Calculus only 5.84 0.78 –0.62 2.91 0.60

6 Algebra 1, Algebra 2, Geometry, Other Advanced Math, Trig & Calculus

5.90 0.77 –0.62 3.16 0.63

ACT Science score

1 Biology vs. General Science 4.70 0.78 –0.07* 0.46 0.28

Biology vs.

2 Biology & Chemistry 4.26 0.83 –0.43 1.29 0.37

3 Biology, Chemistry & Physics 4.23 0.84 –0.48 2.41 0.47

*p > .01

Page 62: 10421 PLAN Manual 1-34successlineinc.com/actinfo/plantechnicalmanual.pdf · ACT endorses the Code of Fair Testing Practices in Educationand the Code of Professional Responsibilities

52

High School Course Work Associated With ACTCollege Readiness Benchmarks. Noble and Schnelker(2007; ACT, 2005b) also examined the contribution of spe-cific high school course sequences to college readiness inEnglish Composition, College Algebra, and Biology.

Data and Method. Students’ readiness for collegecourse work in a subject area was defined by whether therelevant ACT Benchmark (see pp. 18–19 for a description ofthe PLAN Benchmarks) had been met or not. Hierarchicallogistic regression was used to model the probability of astudent meeting or exceeding the English Composition,Algebra, or Biology College Readiness Benchmark as afunction of courses taken in high school, while statisticallycontrolling for the relevant PLAN score (as a measure of stu-dents’ prior achievement) and student grade level at the timeof taking the ACT (junior or senior). High school attendedwas also accounted for in the models by allowing the stu-dent-level regression coefficients to vary across highschools.

Results. In this study, 74% of the students met the ACTEnglish Benchmark, 44% met the ACT MathematicsBenchmark, and 30% met the ACT Science Benchmark.Table 4.22 gives the unstandardized logistic regression coef-ficients for each variable from each model; all regressioncoefficients were statistically significant (p < .01) unless oth-erwise noted. The odds ratios for the course work compar-isons are also reported in Table 4.22. Compared to takingonly English 9–11, the odds of meeting the ACT EnglishBenchmark for students also taking one or more foreign lan-guages was 2 times greater. Moreover, taking at least oneforeign language was typically associated with a 9%increase in students’ chances of meeting the Benchmark,compared to taking only English 9–11.

Figure 4.6 illustrates students’ chances of meeting theCollege Algebra Benchmark associated with taking variousmathematics course sequences. Taking Algebra 1,Geometry, and Algebra 2 was typically associated with a22% chance of meeting the Benchmark (an increase of 12%over that for students taking less than Algebra 1, Geometry,and Algebra 2). Taking upper-level mathematics coursesbeyond Algebra 2 was associated with substantial increasesin students’ chances of meeting the College AlgebraBenchmark, compared to taking less than Algebra 1,Geometry, and Algebra 2. Chances ranged from 34% (otheradvanced mathematics) to 58% (Other AdvancedMathematics, Trigonometry, and Calculus), compared to10% for those taking less than Algebra 1, Geometry, andAlgebra 2.

Compared to students taking Biology only, the odds ofmeeting the ACT Science Benchmark were 2 times greaterfor students taking Biology and Chemistry and were nearly 4times greater for students taking Biology, Chemistry, andPhysics. Taking Biology and Chemistry was typically associ-ated with a 19% chance of meeting the College BiologyBenchmark, compared to a 10% chance for students takingBiology only. Students taking Biology, Chemistry, andPhysics typically had a 29% chance of meeting theBenchmark.

Summary. The findings from this study indicate thatsome courses and course sequences better prepare stu-dents for postsecondary-level work than others. Each incre-mental college-preparatory course taken, particularly inmathematics and science (e.g., Trigonometry beyondAlgebra 2, Physics beyond Chemistry), added to readinessmore than did the number of courses in a discipline alone. Amore detailed description of these results is provided in thefull ACT Research Report (Noble & Schnelker, 2007).

Page 63: 10421 PLAN Manual 1-34successlineinc.com/actinfo/plantechnicalmanual.pdf · ACT endorses the Code of Fair Testing Practices in Educationand the Code of Professional Responsibilities

53

Table 4.22Hierarchical Logistic Regression Coefficients for Modeling the Probability of

Students’ Meeting or Exceeding ACT College Readiness Benchmarks

Model Course work comparison

Regression coefficient

OddsratioIntercept

PLANscore

Gradelevel

Coursework

College English Benchmark

1 English 9–11 & 1 or more foreign languages vs. English 9–11 –8.04 0.49 0.02* 0.68 1.97

College Algebra Benchmark

1 Algebra 1, Algebra 2, and Geometry vs. less than these courses –10.29 0.47 –0.37 0.91 2.48

Algebra 1, Algebra 2, and Geometry vs.

2 Algebra 1, Algebra 2, Geometry, & Other Advanced Math only –9.18 0.46 –0.40 0.63 1.88

3 Algebra 1, Algebra 2, Geometry, & Trig only –8.91 0.44 –0.43 0.90 2.46

4 Algebra 1, Algebra 2, Geometry, Trig & Other Advanced Math only –8.86 0.44 –0.42 1.15 3.16

5 Algebra 1, Algebra 2, Geometry, Trig, & Calculus only –9.01 0.45 –0.40 1.66 5.26

6 Algebra 1, Algebra 2, Geometry, Other Advanced Math, Trig, & Calculus

–8.96 0.44 –0.40 1.76 5.81

College Biology Benchmark

Biology vs.

1 Biology & Chemistry –10.97 0.48 –0.29 0.71 2.03

2 Biology, Chemistry, & Physics –10.24 0.44 –0.30 1.31 3.71

*p > .01

Figure 4.6. Typical chances of meeting the College Readiness Benchmark for College Algebra by specific mathematics course work.

70%

60%

50%

40%

30%

20%

10%

0%

Less thanAlg 1,

Geom, &Alg 2

Typ

ical

Ch

ance

Alg 1,Geom, &

Alg 2

Alg 1,Geom,Alg 2, &

Trig

Alg 1,Geom,Alg 2,

Other AdvMath, &

Trig

Alg 1,Geom,

Alg 2, OtherAdv Math,

Trig, &Calc

Alg 1,Geom,Alg 2, &Other

Adv Math

Alg 1,Geom,Alg 2,Trig, &Calc

Mathematics Course Sequence

Page 64: 10421 PLAN Manual 1-34successlineinc.com/actinfo/plantechnicalmanual.pdf · ACT endorses the Code of Fair Testing Practices in Educationand the Code of Professional Responsibilities

54

Growth From Grade 8 to Grade 12. EXPLORE, PLAN,and the ACT can be used to measure growth in educationalachievement across Grades 8, 10, and 12. Roberts andBassiri (2005) investigated the relationship between the ini-tial academic achievement status of students on EXPLOREat Grade 8, and their rate of change in academic achieve-ment at Grade 10 on PLAN and Grades 11/12 on the ACT.The longitudinal achievement data for this study consisted of34,500 students from 621 high schools who had takenEXPLORE (1998–1999), PLAN (1999–2000), and the ACT(2001–2002). Linear growth over time for students withinschools was measured using EXPLORE, PLAN, and ACTComposite scores. A multilevel growth model was used totest the extent of the relationship between the initial achieve-ment of students (measured by EXPLORE) and their rate ofeducational growth (measured by PLAN and the ACT,respectively). The multilevel growth model for the study wasspecified, where the time of measure was nested within stu-dents and students were nested within high schools toaccount for variation at both levels.

The unconditional model showed an expected between-school grand mean equal to 16.35 for the EXPLOREComposite and achievement rate of growth equal to 1.16points per year. A strong correlation (r = .90) was foundbetween where students start on EXPLORE and their rate ofacademic achievement growth through high school, asmeasured by PLAN and the ACT. Within-school rates ofchange regressed on students’ EXPLORE scores explained79% of the variation in student-level growth trajectories.These results showed that, on average, students’ initial levelof academic achievement is an important predictor ofchange in academic growth for students within schools.Although variation between schools was observed (Figure4.7), a student might be expected to increase his or her rateof change in academic growth by 0.19 scale score points, onaverage, for each point increase on the EXPLOREComposite.

36

31

26

21

16

11

6

1

Grade8 10 11 12

Wit

hin

-Sch

oo

l EPA

S T

est

Mea

ns

School AE(Yij)School B

Figure 4.7. EPAS growth trajectories calculated at Grades 8 through 12 for expected within-school average E(Yij), high school A, and high school B.

Page 65: 10421 PLAN Manual 1-34successlineinc.com/actinfo/plantechnicalmanual.pdf · ACT endorses the Code of Fair Testing Practices in Educationand the Code of Professional Responsibilities

55

The Roberts and Bassiri study was extended by Bassiriand Roberts (2005) using the same sample and employingthe same growth model statistical methods to examine therelationship between high school core courses (four years ofEnglish and three years each of mathematics, science, andsocial studies) taken and student growth over time meas-ured with EXPLORE, PLAN, and the ACT subject tests inEnglish, Mathematics, Reading, and Science. Statisticallysignificant initial status and rate of change variances showedthat students attending different schools do not start at thesame level on EXPLORE or change at the same rate of aca-demic achievement over time. Yet, on average, studentswithin a school who take EXPLORE can be expected tomove to higher Mathematics, English, Reading, and Sciencescores over time as assessed by PLAN and the ACT. Aftercontrolling for school-level characteristics (i.e., metropolitanarea, proportion of ACT-tested students in a school, degreeof integration of race/ethnicity in a school, and initialachievement status of the school), and student-level vari-ables (i.e., gender and race/ethnicity), EXPLORE-tested stu-dents who took the high school core curriculum showedstatistically significantly faster rates of change to higherachievement scores on English, Mathematics, Reading, andScience, compared to EXPLORE-tested students who didnot take core (regression coefficients for each of the fourtests equal 1.56, 1.37, 1.32, and 0.96, respectively). Basedon these results, regardless of where students start onEXPLORE, the rate of change in EPAS achievement isfastest for students who have taken the high school core cur-riculum.

Using ACT and PLAN Scores for Program Evaluation

ACT scores may be used in concert with PLAN scores forprogram evaluation. PLAN includes academic tests in thesame subject areas as the ACT—English, Mathematics,Reading, and Science. Content specifications for the PLANtests were developed using procedures comparable to thoseused for the ACT (ACT, PLAN Technical Manual, 1999).However, results based on both PLAN and ACT scoresshould not be used as sole indicators in program evaluation.They should be considered with other indicators of programeffectiveness.

The PLAN and ACT tests were designed to be similar intheir content and in their focus on higher-order thinkingskills. Their scores are reported on a common scale. ThePLAN and ACT tests, then, are conceptually and psychome-trically linked. As shown earlier in this chapter, ACT andPLAN scores are highly correlated with each other and withhigh school course work and grades. They are thereforeappropriate for measuring student academic achievementover time.

Student progress within a school can be examined usingthe percentage of students meeting College ReadinessBenchmark Scores on PLAN and the ACT (see chapter 3 fora discussion of the PLAN College Readiness Benchmarks).The PLAN College Readiness Benchmark Scores are basedon the ACT College Readiness Benchmark Scores. Theyreflect students’ expected growth from PLAN to the ACT andassume sustained academic effort throughout high school.The PLAN Benchmarks are 15 in English, 19 inMathematics, 17 in Reading, and 21 in Science. ACT’sPLAN/ACT Linkage Reports provide this information for stu-dents who have taken both PLAN and the ACT.

Page 66: 10421 PLAN Manual 1-34successlineinc.com/actinfo/plantechnicalmanual.pdf · ACT endorses the Code of Fair Testing Practices in Educationand the Code of Professional Responsibilities

Interest Inventory

Overview

The Unisex Edition of the ACT Interest Inventory (UNI-ACT) helps students explore personally relevant career(occupational and educational) options. Using their UNIACTresults, students can explore occupations and academiccourses in line with their preferences for common, everydayactivities involving data, ideas, people, and things. UNIACTprovides scores on six scales paralleling Holland’s (1997) sixtypes of interests and occupations (see also Holland,Whitney, Cole, & Richards, 1969). Scale names (and corre-sponding Holland types) are Science & Technology(Investigative), Arts (Artistic), Social Service (Social),Administration & Sales (Enterprising), Business Operations(Conventional), and Technical (Realistic). Each scale con-sists of work-relevant activities (e.g., fix a toy, help settle anargument between friends, sketch and draw pictures) thatare familiar to students, either through participation or obser-vation. The activities have been carefully chosen to assessbasic interests while minimizing the effects of sex-role con-notations. Since males and females obtain similar distribu-tions of scores on the UNIACT scales, combined-sex normsare used to obtain sex-balanced scores.

Score Reporting Procedures

The PLAN student score report suggests 2–3 regions onthe World-of-Work Map (Figure 5.1), the primary procedureused to link UNIACT scores to career options (Prediger &Swaney, 1995). Holland’s hexagonal model of interests andoccupations (Holland, 1997; Holland et al., 1969), and theunderlying Data/Ideas and Things/People work task dimen-sions (Prediger, 1982), form the core of the map. Hollandtypes, and corresponding ACT career clusters, appear onthe periphery. The map is populated by 26 career areas(groups of occupations). Each career area consists of manyoccupations sharing similar work tasks. A simpler version ofthe map, without Holland types or ACT career clusters, isused on the PLAN student score report.

The student guidebook Using Your PLAN Resultsdescribes how World-of-Work Map regions are used as abasis for career exploration. Students are encouraged toexplore occupations in career areas suggested by their UNI-ACT results and their self-reported career plans. Studentsare encouraged to visit www.actstudent.org/plan to gatheroccupational information, such as salary, growth, and entryrequirements.

Norms

Data for Grade 10 UNIACT norms were obtained fromPLAN program files. The target population consisted of stu-dents enrolled in Grade 10 in the United States. Although thePLAN program tests a sizable percentage of U.S. highschool students, some sample bias is inevitable. To improvethe national representativeness of the sample, individualrecords were weighted to more closely match the character-istics of the target population with respect to gender, ethnic-ity, school enrollment, school affiliation (public/private), andregion of the country.

Sampling. Development of the norming sample beganwith schools that participated in PLAN testing during the2003–2004 academic year. Based on Market Data Retrieval(MDR; 2003) data, we retained schools in the U.S. with pub-lic, private, Catholic, or Bureau of Indian Affairs affiliation. Inaddition, we retained schools that contained a 10th gradeand had at least 10 PLAN-tested students during the2003–2004 academic year. Within retained schools, weretained students who reported a valid career choice andhad a complete set of valid interest inventory responses. Oursample consisted of 407,325 students from 4,030 schools.In general, schools use PLAN to test all Grade 10 students.The median proportion of students tested was .78 for thegroup of 4,030 schools.

Weighting. As noted above, the sample was weighted to make it more representative of the population of 10thgraders in the U.S. The proportion of 10th graders in theU.S. in each gender/ethnicity category was approximatedusing population counts for the 15–19 age group from the2000 Census (United States Census Bureau, 2001). Theproportion of U.S. 10th graders in each enrollment size/affil-iation/region category was calculated using MDR (2003)data. Each student was assigned a weight as WGT =(N1/n1) • (N2/n2) where:

N1 = the number of students, in the population, from thegender/ethnicity category to which the student belongs,n1 = the number of students, in the sample, from the gen-der/ethnicity category to which the category belongs,N2 = the number of students, in the population, from theenrollment size/affiliation/region category to which thestudent belongs, andn2 = the number of students, in the sample, from theenrollment size/affiliation/region category to which thestudent belongs.Precision. By obtaining data from PLAN program files,

we were able to make the norming sample quite large so as

56

Chapter 5The PLAN Interest Inventory, Student Information Section,

and the High School Course Information Section

Page 67: 10421 PLAN Manual 1-34successlineinc.com/actinfo/plantechnicalmanual.pdf · ACT endorses the Code of Fair Testing Practices in Educationand the Code of Professional Responsibilities

to allow a precise estimate of percentile ranks. For a simplerandom sample of 16,587 student scores, there would be a99% chance that the 50th percentile of the scores in thesample was within one percentile rank of the 50th percentileof the scores in the target population. Although our samplewas not a simple random sample, the norming sample con-sisted of more than 400,000 students, permitting preciseestimation of percentiles.

Representativeness. One way to determine the typeand extent of sample bias is to compare demographic char-acteristics of the norming sample with national statistics for

various educational and demographic variables. The sampleweights described above were used to obtain the weightedsample proportions in Table 5.1. This table compares demo-graphic characteristics of the norming sample to nationalstatistics, permitting a general examination of the represen-tativeness of the norming sample. As can be seen, the norm-ing sample appears to be reasonably representative of thenational population. For example, the weighted sample isvery similar to the national population with respect to geo-graphic region—within 1 percentage point in each region.

57

Table 5.1Selected Characteristics of Grade 10 UNIACT Norm Group Students and Schools

Weighted sampleCharacteristic proportion U.S. proportiona U.S. category

GenderFemale .48 .48 FemaleMale .52 .52 Male

Racial/EthnicAfrican American/Black .13 .13 African American/BlackAmerican Indian, Alaska Native .01 .01 American Indian/Alaska NativeAsian American, Pacific Islander .03 .03 Asian/Native Hawaiian/Other

Pacific IslanderCaucasian American/White .57 .60 WhiteHispanicb .12 .13 Hispanic/Latino EthnicityOther, Prefer Not to Respond, Blank .12 c —Multiracial .02 .03 Two or more races

Estimated Enrollment<170 .24 .25170–336 .25 .25337–505 .25 .25>505 .26 .25

School AffiliationPublic .92 .92 PublicPrivate .08 .08 Private

Geographic RegionEast .41 .42 EastMidwest .21 .22 MidwestSouthwest .13 .12 SouthwestWest .25 .24 West

a U.S. proportion for gender and ethnicity estimated from the 2000 Census (2001) age 15–19 group. U.S. proportions for enrollment and region obtained from the Market Data Retrieval Educational Database (2003).

b Combination of two racial/ethnic categories: “Mexican American/Chicano” and “Puerto Rican, Cuban, OtherHispanic Origin.”

c U.S. proportion not available.

Page 68: 10421 PLAN Manual 1-34successlineinc.com/actinfo/plantechnicalmanual.pdf · ACT endorses the Code of Fair Testing Practices in Educationand the Code of Professional Responsibilities

Psychometric Support for UNIACT

UNIACT’s 128-page Technical Manual (Swaney, 1995)describes UNIACT’s rationale, interpretive aids, develop-ment, norms, reliability, and validity. To provide readers withan overview of the information available, the table of con-tents of the UNIACT Technical Manual is listed in Figure 5.2.Four of the nine chapters summarize validity evidence,including score profiles for 648 career groups (N = 79,040).

Internal consistency reliability coefficients for the six scales,based on Grade 10 data, ranged from .84 to .92 (median =.87). ACT invites readers to examine the full scope of infor-mation available on UNIACT. Single copies of the UNIACTTechnical Manual are available at no charge from ACTCareer Transitions Department (65), P.O. Box 168, IowaCity, IA 52243-0168.

58

Page 69: 10421 PLAN Manual 1-34successlineinc.com/actinfo/plantechnicalmanual.pdf · ACT endorses the Code of Fair Testing Practices in Educationand the Code of Professional Responsibilities

59

About the Map• The World-of-Work Map arranges 26 career areas (groups of similar jobs) into 12 regions. Together, the career areas cover

all U.S. jobs. Most jobs in a career area are located near the point shown. However, some may be in adjacent Map regions.

• A career area’s location is based on its primary work tasks. The four primary work tasks are working with—DATA: Facts, numbers, files, accounts, business procedures.IDEAS: Insights, theories, new ways of saying or doing something—for example, with words, equations, or music.PEOPLE: People you help, serve, inform, care for, or sell things to.THINGS: Machines, tools, living things, and materials such as food, wood, or metal.

• Six general types of work (“career clusters”) and related Holland types (RIASEC) are shown around the edge of the Map. The overlapping career cluster arrows indicate overlap in the occupational content of adjacent career clusters.

• Because of their People rather than Things orientation, the following two career areas in the Science & Technology Clusterare located toward the left side of the Map (Region 10): Medical Diagnosis & Treatment and Social Science.

Arts

Science & Technology

Soci

alSe

rvic

e

Administr

ation & Sales Business Operations

Technical

H. Transport Operation& Related

A. Employment-Related Services

B. Marketing& Sales D. Regulation

&Protection

E. Communi-cations &Records

F. FinancialTransactions

G. Distribution and Dispatching

K. Construction &Maintenance

L. Crafts &Related

M.Manufacturing& Processing

N. Mechanical& ElectricalSpecialties

O. Engineering & Technologies

P. Natural Science& TechnologiesT. Applied Arts

(Visual)

V. Applied Arts(Written &Spoken)

U. Creative &Performing Arts

W. Health Care

S R

I

E

A

C

Q. MedicalTech-nologies

X. Education

Y. Community Services

Z. PersonalServices

C. Manage-ment

12

2

1

3 4

5

6

7

8

910

11R.

MedicalDiagnosis

& Treatment

S.Social

Science

J. Computer/Info Specialties

I. Ag/Forestry& Related

ID E A S

THIN

GS

DATA

PEO

PLE

World-of-Work Map(Third Edition — COUNSELOR Version)

Figure 5.1. World-of-Work Map.

© 2000 by ACT, Inc. All rights reserved.

Page 70: 10421 PLAN Manual 1-34successlineinc.com/actinfo/plantechnicalmanual.pdf · ACT endorses the Code of Fair Testing Practices in Educationand the Code of Professional Responsibilities

60

Chapter 1. Overview of The Unisex Edition of the ACT InterestInventory

ACT Programs Involving UNIACTDescription of UNIACT-R [1989 revision]

Basic Interest ScalesThe Data/Ideas and Things/People Summary Scales

Interpretive AidsACT Occupational Classification SystemThe World-of-Work MapBasic Interest Scale Profile

Historical Basis of UNIACTEditions of the ACT Interest InventoryEmpirical Relationships Among the Editions

Chapter 2. Rationale for Unisex Interest Scales

The Origin of Sex-restrictive ScoresIncidence of Sex RestrictivenessComparative Validity of Sex-restrictive and Sex-balanced Interest

ScoresSex-balanced Unisex Scales as an AlternativeSummary

Chapter 3. UNIACT Development and Revision

Summary of UNIACT Development: 1975-76Review of UNIACT Item Functioning: 1987UNIACT-R Item Development

Basic Interest ScalesSummary Scales

Correlations with Previous Editions of the ACT Interest InventoryDegree of Sex Balance

Male-Female Score OverlapCareer Options Suggested to Males and Females

Chapter 4. Scale Reliability and Stability

Reliability EstimatesUNIACTUNIACT-R

Stability EstimatesShort-term Stability

Intermediate-term StabilityLong-term Stability

Chapter 5. Norms

Norms SamplesWeightingRepresentativeness of Norms

Norms Distributions

Chapter 6. Convergent and Discriminant Validity

UNIACT-R Scale IntercorrelationsUNIACT-R Scale Structure

Theory-based Dimensions Underlying UNIACT-RResponse Set FactorEvidence from Other Measures of Holland’s Types

Correlations with Other MeasuresOther Measures of Holland’s TypesOther Measures of InterestsCareer-related Experiences

Measures of Academic and Career-related AbilitySummary

Chapter 7. Criterion-Related Validity: Group Profiles, HexagonLocations, and Hit Rates

Agreement Between Criterion Group Type and the PredominantInterests of Groups

Criterion Group Profile Validity: Qualitative EvaluationCriterion Group Profile Validity: Quantitative Evaluation

Mapping Criterion Groups on Holland’s HexagonMap of College MajorsHexagon Locations Based on Score ProfilesHexagon Location Validity: Qualitative EvaluationHexagon Location Validity: Quantitative Evaluation

Agreement Between Criterion Group Type and the PredominantInterests of Individuals

Validity Determined Via High-point CodeValidity Determined Via Career Counseling Decision Rules

Summary

Chapter 8. Other Indicators of Criterion-Related Validity

Analyses Based on Strong’s PropositionsSample and VariablesResults

Satisfaction and Persistence CriteriaSatisfaction with College MajorSatisfaction with OccupationPersistence in College MajorPersistence in Occupation

Experience as a Moderator of Interest Score ValiditySample and VariablesResults

Chapter 9. Appropriateness for Racial/Ethnic Minority GroupMembers

ReliabilitySex BalanceScale StructureCriterion-related Validity

Qualitative EvaluationQuantitative Evaluation

References

Appendix A. The ACT Vocational Research Program: List of ReportsAppendix B. List of Non-ACT-Sponsored Reports Relevant to

UNIACT-RAppendix C. ACT Interest Inventory Summary Profiles for 648

Educational and Occupational GroupsAppendix D. UNIACT-R ItemsAppendix E. UNIACT-R NormsAppendix F. UNIACT-R Scoring Procedures

Figure 5.2. UNIACT Technical Manual table of contents.

Page 71: 10421 PLAN Manual 1-34successlineinc.com/actinfo/plantechnicalmanual.pdf · ACT endorses the Code of Fair Testing Practices in Educationand the Code of Professional Responsibilities

Student Information, Needs Assessment, and HighSchool Course Information Sections

Student Information Needs Assessment Sections

The Student Information section of the answer folder col-lects information about educational plans and religious pref-erence for each student who takes PLAN. This sectioncollects information useful in discussing future plans with thestudent: high school course work and future educational andcareer plans. It is helpful for counselors to consider each stu-dent’s responses to these items and the High School CourseInformation section as a unit; for instance, a student whoplans an engineering career but expects to take only a yearof high school mathematics would likely benefit from a dis-cussion of the skills needed to enter training as an engineer.

The Needs Assessment section of the answer folder pro-vides a place in which the student can indicate a need forassistance in ten selected academic areas and enablingskills. Students are asked to indicate whether they need a lotof help, some help, or little or no help in the following areas:

• Expressing my ideas in writing• Developing my public speaking skills• Increasing my reading speed• Increasing my understanding of what I read• Developing my math skills• Developing my study skills and study habits• Developing my test-taking skills• Understanding and using computers• Choosing a college or technical school to attend after

high school• Selecting a career/job that is right for me

High School Course Information Section

The High School Course Information section of theanswer folder collects information about the courses stu-dents have taken and plan to take before completing highschool. Descriptors of courses that constitute the typical highschool core curriculum are included to help students relateeach of the 48 courses listed to courses offered in their ownschools.

This kind of information is useful to school counselors,faculty, and administrators. If students are not taking or plan-ning to take the specific courses appropriate to their careerarea of interest, counselors can guide them into courses thatwill best prepare them for further training or allow them to getrelevant experience in a particular occupational area. Forteachers, curriculum coordinators, and administrators, thecourse information can be used in conjunction with PLANscores and ACT scores to study the relationship between thecurriculum and student performance.

61

Page 72: 10421 PLAN Manual 1-34successlineinc.com/actinfo/plantechnicalmanual.pdf · ACT endorses the Code of Fair Testing Practices in Educationand the Code of Professional Responsibilities

Student Report

PLAN Student Reports provide valuable information tohelp students begin to consider their future plans at an earlyage. PLAN school reports provide information that princi-pals, teachers, counselors, and district superintendents canuse to monitor and evaluate the academic achievement oftheir students.

Two copies of the PLAN Student Report (Figures 6.1 and6.2) are provided to the school. One is intended for the stu-dent, the other for the school file or for use with parents.

The front of the Student Report includes the followinginformation:

a. Test Scores for the four academic tests (English,Mathematics, Reading, and Science), the Compositescore, and subscores for the English and MathematicsTests. The Composite score is the arithmetic average ofthe 4 test scores.

b. Cumulative Percents. The column labeled “In the US”shows how students’ scores compared with those of stu-dents in the appropriate national norm group (Fall Grade10 or Spring Grade 10) based on a national study con-ducted by ACT in the fall of 2005. Because no test meas-ures with absolute precision, percentile bands areprovided for each test score and subscore and for theComposite score, to illustrate the amount of errorinvolved. Because scale scores are not comparableacross subject areas, percentile bands also allow stu-dents and schools to make comparisons between thefour PLAN test scores or subscores. The columnslabeled “In Your School,” “In Your District,” and “In YourState” may show how students’ scores compare to thoseof students in the respective group. The column labeled“College-Bound 10th” shows how students’ scores com-pared with those of students in the national study whoindicated they planned to attend a two-year or four-yearcollege.

c. Your Estimated ACT Composite Score Range is the range within which the student's ACT Composite score would be expected to fall when he or she takes the ACT in two years. This estimate is reported only if the student indicates that he/she is in 10th grade. Score ranges are currently based on data for examinees who took PLAN in the fall of 1989 and the ACT in October 1991, and examinees who took PLAN in the fall of 1990 and the ACT in October 1992.

d. Your High School Course Plan Compared to Core. Students' self-reported plans for high school course work in a set of core courses are compared to the course of study in English, math, social studies, and science recommended by ACT as the minimum necessary for students to be prepared for entry-level college courses.

e. Your College Readiness. Students' test scores are compared with those established by extensive ACT research as the Benchmark in each area, indicating whether students are on track to be ready for college-level work upon graduation from high school.

f. Your Educational Plans for After High School. These are the student's self-reported plans for post–high school education or training.

g. Admission Standards. This area shows the range of typical ACT scores of entering first-year students at postsecondary institutions with various admission standards.

h. Profile for Success. The student can note for comparison the average ACT score achieved by high school students who went on to college and were successful in a major related to the preferred career area he or she reported when taking PLAN.

i. Your Reported Needs. These are the student's self-reported needs for assistance in seven areas.

j. Your Career Possibilities. The lower portion of the front of the Student Report provides the information and guidance students need to begin to narrow down the range of occupational possibilities to consider. Color highlights display Interest Inventory results, expressed both as World-of-Work Map regions and career areas. Students are guided through steps to identify at least three career areas containing occupations in line with self-reported career plans and/or work-relevant interests, and are directed to the booklet Using Your PLAN Results and www.planstudent.org to continue the exploration process. Steps in the booklet encourage students to explore specific occupations in the identified career areas. The information on the Student Report, coupled with the steps in Using Your PLAN Results and on the Web, encourages students to think about and act upon the following suggestions:
• Explore careers compatible with work-relevant interests and plans.
• Identify the level of preparation required after high school for careers under consideration.
• Identify high school courses related to careers under consideration.
• Develop plans for high school course work.




k. Your Skills. On the back of the Student Report, the correct response to each test item in the PLAN test is listed. To the right are the student's responses. If the response was correct, a "+" is listed; if incorrect, the letter of the incorrect response chosen by the student is indicated. If the item was omitted or the student marked more than one answer for the item, a zero ("0") appears. The third column shown for the English and Mathematics Tests indicates the content area to which each item in the tests belongs (U = Usage/Mechanics; R = Rhetorical Skills; A = Pre-Algebra/Algebra; G = Geometry). For each content area, the report also provides suggestions for students to improve their skills. These statements are based on the score or subscore achieved by the student on each test. (A brief sketch of how this item-by-item display can be derived appears at the end of this section.) Note: Schools should retain PLAN test booklets and return them to students with their PLAN Student Reports. Students can then examine their responses to individual test questions.

This information can help students better understand their performance on each PLAN test. For instance, students might:

• identify and reexamine the items missed in a test to understand why each item was answered incorrectly.

• identify those areas in the English and Mathematics Tests that were particularly difficult by referring to the subscores.

• review content areas they found difficult and look carefully at the skills described for improvement.
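As a concrete illustration of the score summaries described in items a, b, and e above, the sketch below shows how a Composite score, a cumulative percent, and a College Readiness Benchmark comparison could be computed. It is only an illustration, not ACT's scoring code: the rounding rule and the small norm-group sample are assumptions, the function names are invented, and the Benchmark values used are simply those shown on the sample report in Figure 6.1.

```python
# Illustrative sketch only; not ACT's scoring code.
# Assumptions: the Composite is the average of the four test scores rounded to the
# nearest whole number (halves rounded up); the norm sample below is invented;
# Benchmark values are those shown on the sample report in Figure 6.1.

PLAN_BENCHMARKS = {"English": 15, "Mathematics": 19, "Reading": 17, "Science": 21}

def composite(english, mathematics, reading, science):
    """Arithmetic average of the four test scores, rounded half up."""
    return int((english + mathematics + reading + science) / 4 + 0.5)

def percent_at_or_below(score, norm_scores):
    """Cumulative percent: share of the norm group scoring at or below `score`."""
    return round(100 * sum(1 for s in norm_scores if s <= score) / len(norm_scores))

def benchmark_status(subject, score):
    """Report whether a score is Below, At, or Above the subject Benchmark."""
    benchmark = PLAN_BENCHMARKS[subject]
    return "Below" if score < benchmark else ("At" if score == benchmark else "Above")

# Examples with made-up scores:
print(composite(18, 20, 17, 20))                                           # 19 (average 18.75)
print(percent_at_or_below(17, [12, 14, 15, 16, 17, 18, 19, 20, 21, 23]))   # 50
print(benchmark_status("Mathematics", 20))                                 # Above
```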

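The item-by-item display described in item k amounts to a simple comparison of each response against the answer key. The sketch below is hypothetical (the key, the responses, and the function name are invented) and simply reproduces the "+", letter, and "0" conventions used on the report.

```python
# Hypothetical sketch of the "Your Skills" display conventions described in item k:
# "+" for a correct response, the marked letter for an incorrect response,
# and "0" for an item that was omitted or had more than one answer marked.

def skills_display(answer_key, responses):
    symbols = []
    for correct, marked in zip(answer_key, responses):
        if marked is None or marked == "*":   # omitted or more than one answer marked
            symbols.append("0")
        elif marked == correct:
            symbols.append("+")
        else:
            symbols.append(marked)            # letter of the incorrect choice
    return symbols

# Invented five-item example:
print(skills_display(["A", "C", "A", "D", "B"], ["A", "C", "B", None, "B"]))
# ['+', '+', 'B', '0', '+']
```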

Figure 6.1. Front of PLAN Student Score Report.
[Sample front page for a fictional student: test scores and subscores with cumulative percents in the U.S., school, district, and state, and for college-bound 10th graders; estimated ACT Composite score range; high school course plans compared to core; College Readiness Benchmark comparisons; educational plans after high school; admission standards; Profile for Success; reported needs; and Your Career Possibilities, with Interest Inventory results shown on the World-of-Work Map and the Career Area List.]

Figure 6.2. Back of PLAN Student Score Report.
[Sample back page ("Your Skills"): for each test, the correct answer, the student's answer, and the subscore/content area for every item, with counts of correct, omitted, and incorrect items and suggestions for improving skills in each content area.]


Student Score Label

ACT provides schools with two self-adhesive labels (Figure 6.3) for each student participating in the PLAN program. This label includes student name, Social Security number (if reported by the student), date tested, scores and national cumulative percent at or below for each of the four academic tests and PLAN Composite score, and subscores for the English and Mathematics Tests.

High School List Report

In addition to the PLAN Student Report, each participating school receives an alphabetical list of students who took part in PLAN (Figure 6.4) with a summary of each student's academic tests, Interest Inventory results, career and educational plans, estimated range of ACT Composite scores, and self-reported needs for help.

On the list report, students from the same grade level (as reported on their answer folders) are grouped together. For example, all 9th-grade students are listed alphabetically, then all 10th-grade students, etc. Students who did not indicate a grade are listed separately. When scores are achieved under extended time, "TE" will be printed following the student's name.

School Profile Summary Report

Every school testing 25 or more students in the same grade will receive a School Profile Summary Report, which aggregates and summarizes PLAN results. The PLAN School Profile Summary Report consists of a series of tables to address the following issues (a brief illustrative sketch of one such summary follows the list):

• What are the content area strengths and weaknesses of our students relative to the national norms?
• Does our students' performance on PLAN differ by gender and/or ethnic group?
• What courses are our students taking or planning to take?
• What are our students' career preferences?
• What are our students' educational aspirations?
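As a rough illustration of the second question, a school analyzing its own student records (for example, records obtained through the Data Service described later in this chapter) might summarize mean Composite scores by subgroup along the following lines. The record layout and field names here are invented for the example.

```python
# Hypothetical sketch: mean PLAN Composite score by gender from local student records.
# The record structure and field names are invented for illustration.
from collections import defaultdict

def mean_composite_by_group(records, group_field):
    totals = defaultdict(lambda: [0, 0])               # group -> [sum, count]
    for record in records:
        running = totals[record[group_field]]
        running[0] += record["composite"]
        running[1] += 1
    return {group: round(total / count, 1) for group, (total, count) in totals.items()}

students = [
    {"gender": "F", "composite": 19},
    {"gender": "F", "composite": 21},
    {"gender": "M", "composite": 18},
]
print(mean_composite_by_group(students, "gender"))     # {'F': 20.0, 'M': 18.0}
```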


Optional Supplemental Research and Reporting Services

In addition to the basic reporting services described above, which are included in the PLAN fee, schools/districts may purchase the Reporting Services described below. Information on procedures, prices, and deadlines for ordering these reporting services will be distributed to all participating schools and districts in the fall. The aggregate reports described below can be prepared only for groups of 25 or more students. (Requires a minimum of 25 students in the same grade, with complete test scores [four scores plus Composite], testing under standard-time conditions [no extended-time accommodations].)
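The eligibility rule just quoted (at least 25 students in the same grade, each with all four test scores plus a Composite, tested under standard time) can be expressed as a simple filter. The sketch below is only illustrative; the record fields ("grade", "scores", "extended_time") are invented.

```python
# Hypothetical sketch of the aggregate-report eligibility rule described above.
# Field names are invented for illustration; they do not describe ACT's data files.

REQUIRED_SCORES = ("English", "Mathematics", "Reading", "Science", "Composite")

def eligible_for_aggregate_report(records, grade, minimum=25):
    """True if at least `minimum` same-grade, standard-time records have complete scores."""
    qualifying = [
        r for r in records
        if r["grade"] == grade
        and all(r["scores"].get(test) is not None for test in REQUIRED_SCORES)
        and not r["extended_time"]
    ]
    return len(qualifying) >= minimum
```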

Figure 6.3. PLAN Student Score Label.

[Sample label for TAYLOR ANN C (test date 10/23/07): scale scores ENG 27, MATH 32, READ 30, SCI 32, COMP 30; national percent at or below 99, 100, 99, 100, 99; English subscores U/M 15, RS 13; Math subscores ALG 16, GEOM 16.]



Figure 6.4. PLAN High School List Report (with coding key).

[Column headings of the High School List Report: name of student; special status (*a); accommodation code (*b); grade; test scores with national 10th-grade percent at or below; English and Mathematics subscores; Composite score with national 10th-grade and College-Bound percents at or below; estimated ACT Composite score range; career preference (*c); Interest Inventory map regions (*d); educational plans after high school (*e); and Needs Assessment items 1–7 (*f).]

*a The following code is used to denote the Special Status code reported by your school for this student:
H = Homebound
S = Special Education
L = Limited English Proficiency
F = Free/Reduced Lunch
ME = Migrant Education
M = Title 1 Math
R = Title 1 Reading
X, Y, Z = Locally defined

*b The Accommodation Code that was reported for this student by your school:
1 = Standard print materials with extended time limits (no other assistance)
2 = Large-print test book with standard time limits
3 = Large-print test book with extended time limits
4 = Oral presentation from audio CD with extended time limits
5 = Oral presentation from reader script with extended time limits
6 = Braille test book with extended time limits
7 = Scribe to transfer answers to answer folder with standard time limits
8 = Scribe to transfer answers to answer folder with extended time limits
9 = Assistive communication device (e.g., FM audio system) with extended time limits
0 = Oral presentation from cassette with extended time limits

*c The following code is used to denote the student's career choice:
A = Employment-Related Services
B = Marketing & Sales
C = Management
D = Regulation & Protection
E = Communications & Records
F = Financial Transactions
G = Distribution & Dispatching
H = Transport Operation & Related
I = Agriculture, Forestry & Related
J = Computer & Information Specialties
K = Construction & Maintenance
L = Crafts & Related
M = Manufacturing & Processing
N = Mechanical & Electrical Specialties
O = Engineering & Technologies
P = Natural Science & Technologies
Q = Medical Technologies (also see Area W)
R = Medical Diagnosis & Treatment (also see Area W)
S = Social Science
T = Applied Arts (Visual)
U = Creative & Performing Arts
V = Applied Arts (Written & Spoken)
W = Health Care (also see Areas Q and R)
X = Education
Y = Community Services
Z = Personal Services

*d Up to three World-of-Work Map Regions are listed here. In counseling students, refer to the individual student's report for more complete information.

*e The following code is used to denote the student's future educational plans:
A = Not planning to complete high school
B = No education or other training planned after high school
C = Job-related training offered through military service
D = Apprenticeship or other on-the-job training
E = Career/technical school
F = 2-year community college or junior college
G = 4-year college or university
H = Graduate or professional studies after a 4-year degree (law school, medical school, master's degree, etc.)
I = Undecided
J = Other

*f The seven items in the "Needs Assessment" section are denoted by the following numeric codes:
1. Making plans for my education, career, and work after high school
2. Improving my writing skills
3. Improving my reading speed or comprehension
4. Improving my study skills
5. Improving my mathematical skills
6. Improving my computer skills
7. Improving my public speaking skills
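Schools or districts that work with the underlying student records (for example, through the Data Service described below) may find it convenient to decode these single-character fields programmatically. The sketch below shows only the *e (educational plans) codes from the key above; the dictionary and variable names are invented, and the other keys could be handled the same way.

```python
# Lookup for the *e (educational plans) codes in the Figure 6.4 coding key.
# Only one code set is shown for illustration; names are invented.

EDUCATIONAL_PLANS = {
    "A": "Not planning to complete high school",
    "B": "No education or other training planned after high school",
    "C": "Job-related training offered through military service",
    "D": "Apprenticeship or other on-the-job training",
    "E": "Career/technical school",
    "F": "2-year community college or junior college",
    "G": "4-year college or university",
    "H": "Graduate or professional studies after a 4-year degree",
    "I": "Undecided",
    "J": "Other",
}

print(EDUCATIONAL_PLANS.get("G", "Unknown code"))  # 4-year college or university
```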


High Schools

The Item-Response Summary Report provides tables for each academic test, describing item by item the performance of a school's PLAN examinees. Item-response results are categorized by test (e.g., English), by subscore area (e.g., Usage/Mechanics), and by category (e.g., Punctuation).

The Customized School Profile Summary Report summarizes PLAN results by specific subgroups selected by the school (e.g., females, African American males, college-bound, or non-college-bound). The results are presented in the same format as the School Profile Summary Report.

The Data Service provides a school's PLAN student records on CD-ROM. These records can be integrated into a school's database to study important local questions not addressed by PLAN standard and special research services.

The College Readiness Standards Information Service describes the skills and knowledge of PLAN-tested students by score range and summarizes their progressive skill development by PLAN content area. The service package includes a set of instructional guides, Connecting College Readiness Standards to the Classroom (one guide for each content area and one overall administrator's guide).

Districts

The Item-Response Summary Report provides tables for each academic test, describing item by item the performance of a district's PLAN examinees. Item-response results are categorized by test (e.g., English), by subscore area (e.g., Usage/Mechanics), and by category (e.g., Punctuation).

The District Profile Summary Report consists of the same series of tables found in the School Profile Summary Report. These reports summarize selected PLAN information about a district's tested population.

The Customized District Profile Summary Report summarizes PLAN results by specific groups selected by the district (e.g., females, African American males, college-bound, or non-college-bound). The results are presented in the same format as the District Profile Summary Report.

The Data Service provides a district's PLAN student records on CD-ROM. These records can be integrated into a district's database to study important local questions not addressed by PLAN standard and special research services.

Multiple-school districts that administer PLAN as a required testing program on a district-wide basis are eligible to receive a free Special Research and Reporting Service. Districts qualifying for a free service will be notified of their eligibility after testing is completed.

AIM

AIM is a reporting component of ACT's Educational Planning and Assessment System (EPAS). Within the EPAS system, students' progress can be monitored between Grades 8 and 12 utilizing information obtained from assessments at the various grade levels. A complete AIM school report package consists of five separate documents: one for principals, one for guidance and counseling staff, and one each for the English/reading, the mathematics, and the science faculties. Information used to create AIM reports is based upon data from students who have taken both PLAN and the ACT or both PLAN and EXPLORE (ACT's eighth-grade program), depending on the report requested.




References

ACT. (1999). PLAN technical manual. Iowa City, IA: Author.

ACT. (2004). Crisis at the core: Preparing all students for college and work. Iowa City, IA: Author.

ACT. (2005a). The ACT national high school profile report. Iowa City, IA: Author.

ACT. (2005b). Courses count: Preparing students for postsecondary success. Iowa City, IA: Author.

ACT. (2006). The PLAN national high school profile report. Iowa City, IA: Author.

ACT. (2007). ACT National Curriculum Survey® 2005–2006. Iowa City, IA: Author.

ACT. (2007). Directions for testing. Iowa City, IA: Author.

ACT. (2007). PLAN administrator's handbook. Iowa City, IA: Author.

ACT. (2007). Using your PLAN results. Iowa City, IA: Author.

American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (1999). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.

American Psychological Association. (1985). Standards for educational and psychological testing. Washington, DC: Author.

Bassiri, D., & Roberts, W. L. (2005). Using latent growth analysis to investigate relationships between EPAS academic development and student/school characteristics. Paper presented at the annual meeting of the American Educational Research Association, Montreal, Quebec, Canada.

Becker, D. F., & Forsyth, R. A. (1990, April). Gender differences in academic achievement in Grades 3 through 12: A longitudinal analysis. Paper presented at the meeting of the American Educational Research Association, Boston, MA.

Hanson, B. A. (1989). Scaling the P-ACT+. In R. L. Brennan (Ed.), Methodology used in scaling the ACT Assessment and P-ACT+. Iowa City, IA: American College Testing Program.

Holland, J. L. (1997). Making vocational choices: A theory of vocational personalities and work environments (3rd ed.). Odessa, FL: Psychological Assessment Resources.

Holland, J. L., Whitney, D. R., Cole, N. S., & Richards, J. M. (1969). An empirical occupational classification derived from a theory of personality and intended for practice and research (ACT Research Report No. 29). Iowa City, IA: ACT.

Joint Committee on Testing Practices. (2004). Code of fair testing practices in education. Washington, DC: Author.

Kolen, M. J. (1984). Effectiveness of analytic smoothing in equipercentile equating. Journal of Educational Statistics, 9, 25–44.

Kolen, M. J. (1988). Defining score scales in relation to measurement error. Journal of Educational Measurement, 25, 97–110.

Kolen, M. J., Hanson, B. A., & Brennan, R. L. (1992). Conditional standard errors of measurement for scale scores. Journal of Educational Measurement, 29, 285–307.

Market Data Retrieval Educational Database [Electronic data type]. (2003). Chicago: Market Data Retrieval.

Martin, D. J., & Hoover, H. D. (1987). Sex differences in educational achievement: A longitudinal study. Journal of Early Adolescence, 7, 65–83.

NCME Ad Hoc Committee on the Development of a Code of Ethics. (1995). Code of professional responsibilities in educational measurement. Washington, DC: National Council on Measurement in Education.

Noble, J., Davenport, M., & Sawyer, R. (2001). Relationships between noncognitive characteristics, high school course work and grades, and performance on a college admissions test. Paper presented at the annual meeting of the American Educational Research Association, Seattle, WA.

Noble, J., Davenport, M., Schiel, J., & Pommerich, M. (1999). High school academic and noncognitive variables related to the ACT scores of racial/ethnic and gender groups (ACT Research Report No. 99-6). Iowa City, IA: ACT.

Noble, J., Roberts, B., & Sawyer, R. L. (2006). Student achievement, behavior, perceptions and other factors affecting ACT scores (ACT Research Report No. 2006-1). Iowa City, IA: ACT.

(Please note that in 1996 the corporate name “The American College Testing Program” was changed to “ACT.”)



Noble, J. P., & Schnelker, D. (2007). Using hierarchical modeling to examine course work and ACT score relationships across high schools (ACT Research Report No. 2007-2). Iowa City, IA: ACT.

Prediger, D. J. (1982). Dimensions underlying Holland's hexagon: Missing link between interests and occupations? Journal of Vocational Behavior, 21, 259–287.

Prediger, D. J., & Swaney, K. B. (1995). Using UNIACT in a comprehensive approach to assessment for career planning. Journal of Career Assessment, 3, 429–451.

Roberts, W. L., & Bassiri, D. (2005). Assessing the relationship between students' initial achievement level and rate of change in academic growth within and between high schools. Paper presented at the annual meeting of the American Educational Research Association, Montreal, Quebec, Canada.

Roberts, W. L., & Noble, J. P. (2004). Academic and noncognitive variables related to PLAN scores (ACT Research Report No. 2004-1). Iowa City, IA: ACT.

Schiel, J., Pommerich, M., & Noble, J. (1996). Factors associated with longitudinal educational achievement, as measured by PLAN and ACT Assessment scores (ACT Research Report No. 96-5). Iowa City, IA: ACT.

Spray, J. A. (1989). Performance of three conditional DIF statistics in detecting differential item functioning on simulated tests (ACT Research Report No. 89-7). Iowa City, IA: American College Testing Program.

Swaney, K. B. (1995). Technical manual: Revised Unisex Edition of the ACT Interest Inventory (UNIACT). Iowa City, IA: ACT.

United States Census Bureau. (2001). Census 2000 Summary File 1. Retrieved using 2000 American FactFinder, July 15, 2004, from http://factfinder.census.gov

Wang, M. W., & Stanley, J. L. (1970). Differential weighting: A review of methods and empirical studies. Review of Educational Research, 40, 663–705.