4 Test Administration

Chapter 4 of the Dynamic Learning Maps® (DLM®) Alternate Assessment System 2014–2015 Technical Manual—Year-End Model (Dynamic Learning Maps Consortium, 2016b) describes general test administration and monitoring procedures. This chapter describes updated procedures and data collected in 2019–2020.

For a complete description of test administration for DLM assessments, including information on available resources and materials and information on monitoring assessment administration, see the 2014–2015 Technical Manual—Year-End Model (Dynamic Learning Maps Consortium, 2016b).

4.1 Overview of Key Administration Features

This section describes DLM test administration for 2019–2020. For a complete description of key administration features, including information on assessment delivery, Kite® Student Portal, and linkage level selection, see Chapter 4 of the 2014–2015 Technical Manual—Year-End Model (Dynamic Learning Maps Consortium, 2016b). Additional information about administration can also be found in the Test Administration Manual 2019–2020 (Dynamic Learning Maps Consortium, 2019d) and the Educator Portal User Guide (Dynamic Learning Maps Consortium, 2019c).

4.1.1 Change to Administration Model

Instructionally embedded assessments were available for teachers to optionally administer between September 9 and December 20, 2019, and between January 1 and February 26, 2020. During the consortium-wide spring testing window, which occurred between March 9 and June 5, 2020, students were assessed on each Essential Element (EE) on the blueprint. Each state education agency set its own testing window within the larger consortium-wide spring window.

The COVID-19 pandemic significantly impacted the spring 2020 window, resulting in all states ending assessment administration earlier than planned. State education agencies were given the option to continue administering assessments later in the year, through the end of June; however, no state education agency used this option.

4.1.2 The Instruction and Assessment Planner

In 2019–2020, the Instructional Tools Interface that was used for administering the optional fall instructionally embedded assessments was replaced with the Instruction and Assessment Planner (“planner” hereafter). The planner is designed to facilitate a cycle of instruction and assessment throughout the instructionally embedded windows.

The planner includes information to track the lifecycle of an instructionally embedded assessment, from the creation of an instructional plan to the completion of the testlet. Student performance on instructionally embedded testlets is also reported within the planner to assist teachers in monitoring student progress and planning future instruction. For a complete description of the planner, including its development process, see Chapter 4 of the 2019–2020 Technical Manual Update—Instructionally Embedded Model (Dynamic Learning Maps Consortium, 2020a).

4.2 Administration Evidence

This section describes evidence collected during the spring 2020 operational administration of the DLM alternate assessment. The categories of evidence include data relating to the length of the assessment and the adaptive delivery of testlets in the spring window.

4.2.1 Assessment Blueprint

In 2019–2020, state education agencies participating in the year-end assessment model voted to update the assessment blueprints for English language arts (ELA) and mathematics. For a full description of the blueprint changes, see Chapter 3 of this manual. As part of this change, each testlet administered to students now assesses a single EE, whereas the previous assessment blueprint assessed multiple EEs on a single testlet. The administration features of the testlets were unchanged; for a complete description of DLM testlet administration features, see Chapter 4 of the 2014–2015 Technical Manual—Year-End Model (Dynamic Learning Maps Consortium, 2016b). The decision to update the assessment blueprint followed multiple rounds of input from state education agency members and the Technical Advisory Committee. The updates to the assessment blueprint were expected to have limited impact on administration, and because the testlets remain short, estimated administration time was not expected to be substantially affected. Scoring remained the same and continued to be based on all available evidence throughout the year. See Chapter 5 of this manual for details on how the assessments are scored.

Due to these administration changes, the number of testlets each student was required to complete changed in 2019–2020 from previous assessment administrations. Table 4.1 displays the number of testlets required to meet blueprint coverage under the previous blueprint and the updated 2019–2020 blueprint. Overall, the changes to the assessment blueprint resulted in an increase in the number of testlets administered. However, because each testlet under the new assessment model assesses a single EE, each EE is assessed with more items (i.e., 3–9 items), allowing for more fine-grained reporting of results to inform future instruction. See Chapter 7 of this manual for more information on the results provided in 2019–2020.

Table 4.1: Number of Testlets Required to Cover the Blueprint
Grade   Previous blueprint   2019–2020 blueprint
English language arts
   3           6                    9
   4           6                    9
   5           6                    9
   6           5                    9
   7           5                    9
   8           5                    9
   9           5                    9
  10           5                    9
  11           4                    9
Mathematics
   3           6                    8
   4           7                    8
   5           6                    8
   6           6                    7
   7           6                    7
   8           6                    8
   9           6                    7
  10           6                    8
  11           6                    6

4.2.2 Adaptive Delivery

During the spring 2020 test administration, the ELA and mathematics assessments were adaptive between testlets, following the same routing rules applied in prior years. That is, the linkage level associated with the next testlet a student received was based on the student’s performance on the most recently administered testlet, with the specific goal of maximizing the match of student knowledge and skill to the appropriate linkage level content.

  • The system adapted up one linkage level if the student responded correctly to at least 80% of the items measuring the previously tested EE. If the previous testlet was at the highest linkage level (i.e., Successor), the student remained at that level.
  • The system adapted down one linkage level if the student responded correctly to less than 35% of the items measuring the previously tested EE. If the previous testlet was at the lowest linkage level (i.e., Initial Precursor), the student remained at that level.
  • Testlets remained at the same linkage level if the student responded correctly to at least 35% but less than 80% of the items on the previously tested EE.

In prior years, because testlets assessed multiple EEs, the percentage of items answered correctly was calculated separately for each group of items measuring the same EE, and the minimum of these values was used to determine the next linkage level based on the thresholds above. Because each testlet under the updated blueprint assesses a single EE, it is no longer necessary to calculate a percentage correct for each EE on the testlet; the percentage correct for the testlet overall is equivalent to the EE percentage correct. Thus, beginning with the spring 2020 operational administration, adaptive delivery is determined by all items on a testlet, rather than by a subset of items for each EE.
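To make the routing rules concrete, the following is a minimal sketch of the between-testlet adaptation logic described above. The function name, the ordered list of linkage levels, and the item-count inputs are illustrative assumptions, not DLM system code.

```python
# Minimal sketch of the between-testlet routing rules described above.
# All names here are illustrative assumptions, not DLM system code.

LINKAGE_LEVELS = [
    "Initial Precursor",
    "Distal Precursor",
    "Proximal Precursor",
    "Target",
    "Successor",
]

def next_linkage_level(current_level: str, items_correct: int, items_total: int) -> str:
    """Return the linkage level of the next testlet.

    With single-EE testlets (spring 2020 onward), the percentage correct
    is computed over all items on the testlet; previously, the minimum
    EE-level percentage on a multi-EE testlet was used instead.
    """
    pct_correct = items_correct / items_total
    index = LINKAGE_LEVELS.index(current_level)

    if pct_correct >= 0.80:
        # Adapt up one level, capped at the highest level (Successor).
        index = min(index + 1, len(LINKAGE_LEVELS) - 1)
    elif pct_correct < 0.35:
        # Adapt down one level, floored at the lowest level (Initial Precursor).
        index = max(index - 1, 0)
    # Otherwise (at least 35% but less than 80%), remain at the same level.

    return LINKAGE_LEVELS[index]
```

For example, a student answering 4 of 5 items correctly (80%) on a Target testlet would be routed to a Successor testlet next, while a student answering 2 of 5 items correctly (40%) would remain at the Target level.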

The linkage level of the first testlet assigned to a student was based on First Contact survey responses. The correspondence between the First Contact complexity bands and first assigned linkage level is shown in Table 4.2.

Table 4.2: Correspondence of Complexity Bands and Linkage Levels
First Contact complexity band   Linkage level
Foundational                    Initial Precursor
1                               Distal Precursor
2                               Proximal Precursor
3                               Target
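Because the first-testlet assignment is a direct lookup from Table 4.2, it can be represented as a simple mapping; the key types used here are an assumption for illustration.

```python
# Illustrative lookup from First Contact complexity band to the linkage
# level of the student's first testlet (Table 4.2); key types are assumed.
FIRST_TESTLET_LINKAGE_LEVEL = {
    "Foundational": "Initial Precursor",
    1: "Distal Precursor",
    2: "Proximal Precursor",
    3: "Target",
}
```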

For a complete description of adaptive delivery procedures, see Chapter 4 of the 2014–2015 Technical Manual—Year-End Model (Dynamic Learning Maps Consortium, 2016b).

4.2.3 Administration Incidents

As in all previous years, testlet assignment during the spring 2020 assessment window was monitored to ensure students were correctly assigned to testlets. Administration incidents that have the potential to affect scoring are reported to state education agencies in a supplemental Incident File. No incidents were observed during the spring 2020 assessment window. Assignment of testlets will continue to be monitored in subsequent years to track any potential incidents and report them to state education agencies.

4.3 Implementation Evidence

This section describes additional resources that were made available during the spring 2020 operational implementation of the DLM alternate assessment. For evidence relating to user experience and accessibility, see the 2018–2019 Technical Manual Update—Year-End Model (Dynamic Learning Maps Consortium, 2019b).

4.3.1 Data Forensics Monitoring

Beginning with the spring 2020 administration, two data forensics monitoring reports were made available in Educator Portal. The first report includes information about testlets completed outside of normal business hours. The second report includes information about testlets that were completed within a short period of time.

The Testing Outside of Hours report allows state education agencies to specify the days and hours within a day during which testlets are expected to be completed (e.g., Monday through Friday from 6:00 a.m. to 5:00 p.m.). Each state selects its own days and hours for setting expectations. The Testing Outside of Hours report then identifies students who completed assessments outside of the defined expected hours. The report includes the student's first and last name, district, school, name of the completed testlet, and the times the testlet was started and completed. Information in the report is updated at approximately noon and midnight each day, and the report can be viewed by state education agencies in Educator Portal or downloaded as a CSV file.
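The underlying check reduces to comparing a completion timestamp against a state-defined schedule. Below is a minimal sketch assuming a simple weekday-to-hours configuration; the structure and names are illustrative, not the Educator Portal implementation.

```python
from datetime import datetime

# Expected testing hours per weekday (Monday=0), using the example above:
# Monday through Friday, 6:00 a.m. to 5:00 p.m. Each state would supply
# its own configuration; this structure is an assumption for illustration.
EXPECTED_HOURS = {weekday: (6, 17) for weekday in range(5)}

def outside_expected_hours(completed_at: datetime) -> bool:
    """Flag a testlet completed outside the state-defined expected hours."""
    window = EXPECTED_HOURS.get(completed_at.weekday())
    if window is None:
        return True  # completed on a day with no expected testing hours
    start_hour, end_hour = window
    return not (start_hour <= completed_at.hour < end_hour)
```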

The Testing Completed in a Short Period of Time report identifies students who completed a testlet within an unexpectedly short period of time. The threshold for inclusion in the report was a testlet completion time of less than 30 seconds in mathematics or less than 60 seconds in English language arts. The report includes the student's first name, last name, grade, and state student identifier. Also included are the district, school, teacher, name of the completed testlet, number of items on the testlet, an indicator of whether all items were answered correctly, the number of seconds taken to complete the testlet, and the starting and completion times. Information in the report is updated at approximately noon and midnight each day, and the report can be viewed by state assessment administrators in Educator Portal or downloaded as a CSV file.
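The short-completion screen amounts to a subject-specific threshold comparison. A minimal sketch using the thresholds given above, with assumed names, follows.

```python
# Thresholds (in seconds) from the text: a completion time below these
# values places the testlet on the report. Names are illustrative.
SHORT_TIME_THRESHOLD_SECONDS = {
    "mathematics": 30,
    "English language arts": 60,
}

def completed_too_quickly(subject: str, seconds_to_complete: float) -> bool:
    """Flag a testlet completed faster than the subject's threshold."""
    return seconds_to_complete < SHORT_TIME_THRESHOLD_SECONDS[subject]
```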

4.3.2 Released Testlets

The DLM Alternate Assessment System provides educators and students with the opportunity to preview assessments by using released testlets. A released testlet is a publicly available sample DLM assessment. Released testlets cover the same content and are in the same format as operational DLM testlets. Students and educators can use released testlets as examples or opportunities for practice. Released testlets are developed using the same standards and methods used to develop testlets for the DLM operational assessments. New released testlets are added on a yearly basis.

In response to state inquiries about supplemental assessment resources to address the increase in remote or disrupted instruction due to COVID-19, the DLM team published additional English language arts, mathematics, and science released testlets during the spring 2020 window. Across all subjects, nearly 50 new released testlets were selected and made available through Kite Student Portal. To help parents and educators better review the available options for released testlets, the DLM team also provided tables for each subject that display the Essential Elements and linkage levels for which released testlets are available.

The test development team selected new released testlets that would have the greatest impact for remote or disrupted instruction. The team prioritized testlets at the Initial Precursor, Distal Precursor, and Proximal Precursor linkage levels, as those linkage levels are used by the greatest number of students. The test development team selected testlets written to Essential Elements that covered common instructional ground, taking previously released testlets into consideration to minimize overlap between the testlets that were already available and the new released testlets. The test development team also aimed to provide at least one new released testlet per grade level, where possible.

4.4 Conclusion

During the spring 2020 administration, state education agencies adopting the year-end model adjusted the blueprint and transitioned to administration of single-EE testlets. The Instruction and Assessment Planner was introduced to better support learning, instruction, and the process of administering DLM testlets. Additionally, new data forensics monitoring reports were made available to state education agencies in Educator Portal. Finally, DLM published additional English language arts, mathematics, and science released testlets during the spring 2020 window to support remote or disrupted instruction resulting from COVID-19. Updated results for administration time, linkage level selection, user experience with the DLM system, and accessibility supports are not provided in this chapter due to the limited samples in the spring 2020 window, which may not be representative of the full DLM student population. For a summary of these administration features in 2018–2019, see Chapter 4 of the 2018–2019 Technical Manual Update—Year-End Model (Dynamic Learning Maps Consortium, 2019b).