PARCC Releases Field Test Report


The Partnership for Assessment of Readiness for College and Careers (PARCC) released a report, Field Test: Lessons Learned, earlier this week. The report shows that the spring 2014 field test of the PARCC assessments went well, that PARCC states used feedback to make adjustments and improvements, and that the testing experience for students was largely positive.

The report also found that school and district test administrators and coordinators thought the test manuals should be streamlined, and that the test delivery platform worked well but could use adjustments to simplify some procedures.

Background

More than 1 million students in nearly 16,000 schools participated in the spring 2014 PARCC field test. Fourteen states and the District of Columbia administered the test.

The primary purposes of the PARCC field test were to:

  • Examine the quality of test questions and tasks;
  • Evaluate assessment training materials and administration procedures;
  • Evaluate the computer-based delivery platform; and
  • Conduct research on a range of topics, including those that will inform the reporting of results from the first round of full testing.

The ultimate goal of the field test was to confirm that PARCC is a quality assessment program and to make improvements based on the experience prior to the 2014–15 administration, in which an estimated 5 million students will participate.

Findings

According to PARCC, with minor exceptions, the field test went very smoothly. Most students had sufficient time to complete the field test and were comfortable with the computer-based items, especially those students who practiced first with sample questions and tools designed to familiarize them with features of the online system. Likewise, schools that used the various tools, manuals, and training modules provided by the PARCC consortium reported fewer issues than those that did not, particularly with regard to technology.

Test Items & Student Experience

Item-level data from the field test indicate that a large majority of the items developed for the PARCC assessments over the past several years performed well. Students understood the questions and responded appropriately. Approximately 89 percent of the English language arts/literacy (ELA/L) questions and 78 percent of the mathematics questions were found eligible for the 2014–15 administration, mirroring results for other new assessment programs and providing ample test items for the administration.

In addition, students across grades, even those in elementary school, were able to successfully use the computer-based test (CBT) delivery system, including keyboarding their answers to short and extended questions, scrolling through reading passages, and moving from one question to the next. Students also told observing researchers that they found the computer-based assessments engaging.

Other findings related to the student experience were:

  • More than 90 percent of students said that they had sufficient time to complete the test.
  • Approximately 90 percent of students said that they understood the directions on the ELA/L tests (both the computer-based and paper-based formats) and the mathematics computer-based test. Eighty-three percent of students said they understood the instructions on the paper-based mathematics test.
  • Two-thirds of students said that they entered mathematics symbols and numbers with ease on the CBT.
  • Approximately half of students said that it was easy to use the online calculator (which, nonetheless, is being revised for the 2014–15 administration).

Tech Preparedness

There were no system-wide technology issues during the field test. Most technology issues that did occur were local (e.g., firewall settings needed to be changed, computer settings needed to be adjusted, or students needed help logging in) — an expected result when school districts introduce computer testing for the first time, as was the case in most PARCC states. Most issues were also quickly and easily resolved.

Training & Test Administration Materials

Test administrators did report that some directions, such as how to close out of online test sessions, needed clarification and that the test manuals could be shorter and clearer. Students, test administrators, and coordinators requested some improvements to the equation editor, a computer-based feature that allows students to solve mathematics problems and explain their reasoning.

Responsiveness

In light of the field test, the PARCC states are taking steps to help ensure the success of the 2014–15 administration and all future administrations, including:

  • Revising manuals and training modules;
  • Revising general directions on the tests, especially the mathematics tests, to make them clearer;
  • Upgrading PearsonAccess and TestNav8 (the CBT delivery platform);
  • Conducting a third-party verification and validation of TestNav8 performance;
  • Revising tutorials to include a full array of tools, accessibility features, and item-computer interactions; and
  • Expanding practice tests to include paper-based tests and additional components (performance-based and end-of-year assessments in both content areas).


The Report