Key Policy Letters Signed by the Education Secretary or Deputy Secretary

November 20, 2023

Dear Chief State School Officer:

We're at a pivotal moment in our nation's educational progress. Students are now back in classrooms after three years of schooling disrupted by the worldwide pandemic. We have an opportunity to improve education in America to levels never before seen as we collectively Raise the Bar in education. Part of this work is ensuring that students are on a path to academic success and mastery of grade-level standards and that we are engaging in best practices to provide rich teaching and learning experiences.

High-quality assessment systems are a critical component of a rich, rigorous instructional program for every child. Assessment data give insight into student learning and help guide instruction to meet students' needs. They can also drive resources and strategies to address general underperformance and disparities in opportunities and outcomes for students. Unfortunately, as we've heard too often from educators, parents, and students, our collective approach to assessment has not always met that mark. However, individual States are making progress to innovate and improve the quality of their assessments, including by supporting the development of interim assessments and diagnostic tools that can provide teachers, instructional leaders, and parents with a timely and more complete picture of a student's academic growth and – as importantly – help tailor tiered instruction, supports, and interventions to accelerate student growth. Other States are piloting competency-based assessment item types, which can provide important information on how students apply what they learn. The U.S. Department of Education (Department) has been pleased to work with individual States to advance their goals in innovation and quality, and we will continue to do so. This partnership is crucial to lifting up best practices and guiding our policy direction.

As I have stated on numerous occasions, we cannot expect innovation from the field of education while protecting the status quo from Washington, D.C. We believe it is time to innovate around high-quality assessment systems, and we have a plan to do so. As you will see in the attachment to this letter, we are making improvements to our implementation of the Innovative Assessment Demonstration Authority (IADA) that we hope will encourage more States to make use of it. We are doing so by clarifying how States can demonstrate comparability, supporting and incentivizing planning, and announcing that the cap on IADA has been lifted, signaling that we are open to hearing from as many States as are interested. We have crafted the attached guidance to clarify our existing authorities, outline how we will use them to support innovation and quality, and provide every State with running room to pilot and adopt innovative approaches that can lead to more authentic student learning.

When used well, assessments give us important guideposts for how to scaffold instructional approaches and supports, adjust teacher practice and development, and better target interventions to meet student needs. Strong assessment practices allow us to maximize the quality of and returns on instructional time. Parents deserve better information about how their child is progressing toward grade-level standards. Students deserve instruction that is better tailored to their needs and enables them to more quickly reach mastery of academic standards. Educators need assessments that inform intervention and curricular modifications in real time. The actions we take now – in this school year, and in the school years to come – will have an indelible, long-term impact on our students. Innovative approaches to high-quality assessment systems can help accelerate learning for all kids, especially those who are the furthest from opportunity. I invite you to join me in innovating your approach.

Please reach out to my team with any questions. We are eager to work with you, and I hope you will consider applying for the IADA.

Sincerely,
/s/
Miguel A. Cardona, Ed.D.
U.S. Secretary of Education


cc: State Assessment Directors
State Title I Directors

ATTACHMENTS:

  • Appendix A: Technical Guidance on Innovative Approaches to Assessment, November 2023
  • Appendix B: Comparability Resources

APPENDIX A: TECHNICAL GUIDANCE
Innovative Assessment Demonstration Authority, November 2023

The U.S. Department of Education (Department) is taking action to support more States in using innovative approaches to high-quality assessment. Earlier this year, we asked for help from States, school districts, experts, and other interested stakeholders in improving the Innovative Assessment Demonstration Authority (IADA) authorized by the Elementary and Secondary Education Act of 1965 (ESEA) through a public request for information (RFI).1 We appreciate the public input in response to the RFI. Based on this feedback and the Institute of Education Sciences (IES) report about the first five IADA systems through the 2020-2021 school year, the Department is instituting several clarifications and process improvements for IADA. Specifically, we are:

  • Lifting the cap on States that can participate in IADA;
  • Clarifying methods States can use to demonstrate comparability;
  • Clarifying IADA timelines, including planning periods, standardized review windows for State applications, and extension and waiver options;
  • Recognizing the importance of funding opportunities that support IADA work;
  • Emphasizing educator and family engagement and clarifying the role of external partners; and
  • Inviting all interested experts, particularly those with expertise in innovative assessment, to apply to serve as assessment peer reviewers.

IADA provides the opportunity for an approved State educational agency (SEA) to pilot new assessment approaches and to scale up over time. While an SEA is working with some local educational agencies (LEAs) or schools in using and evaluating the innovative assessment, students in other schools continue taking the existing State assessment.

Since 2016, the Department has approved five States (Georgia, Louisiana, Massachusetts, New Hampshire, and North Carolina) to participate in IADA, with each State pursuing a different innovative assessment design. For example, Louisiana is developing through-year, curriculum-embedded assessments, while Massachusetts is building technology-enhanced science performance tasks. More information about IADA, including the approved IADA applications and each State's report on implementation progress, can be found at https://oese.ed.gov/offices/office-of-formula-grants/school-support-and-accountability/iada/.

1. Lifting the Cap on States that Can Participate in IADA

While the ESEA originally limited the number of States that could be approved under IADA to seven during an "initial demonstration period," the cap has since been lifted consistent with section 1204(d) of the ESEA. Given the time since the first State was approved and the publication of a progress report by IES last spring,2 the "initial demonstration period" as specified in the statute has ended. All interested States may now apply for approval for IADA, either individually or as part of a consortium.

2. Clarifying the Methods for the Evaluation of Comparability

To promote transparency and allow the State to continue making school accountability determinations, the ESEA requires an SEA approved for IADA to demonstrate the comparability between its IADA assessment and the existing statewide assessment, in terms of alignment to the State's academic content standards and the academic achievement standards (i.e., proficiency levels). A State has flexibility in the method it uses to demonstrate comparability.

Crucially, evaluating comparability does not require that the proficiency results be exactly the same between the two assessments; because the assessment designs will differ, the assessments' resulting estimates of student performance will differ. IADA does not require that a State demonstrate comparability at the individual test score level or comparability from scale score to scale score across the performance distribution. Moreover, the Department does not require data to support the demonstration of comparability when a State submits its application under IADA. Rather, a State must provide us with a plan for how it will demonstrate comparability – and there are several paths to that goal, as outlined below.

In 2016, when the Department issued IADA regulations, we provided four methods for how an SEA could demonstrate comparability and a fifth option for an SEA to propose "an alternative method" for demonstrating comparability (34 CFR 200.105(b)(4)(i)(E)). A common theme in the RFI comments was a request that the Department provide additional information on the "alternative method" option. In response, we are offering some additional information for States to consider in developing IADA comparability evaluation designs. For example, a State could potentially demonstrate comparability by providing:

    • Evidence of the alignment of both the innovative assessment and the statewide assessment to the content standards, and
    • Evidence of the consistency of achievement classifications across the two systems.

Below are two ways that SEAs have used alternative approaches to evaluate comparability between different assessments:

    • One State approved for IADA3 compared results of students taking the grade 3 reading/language arts (R/LA) statewide assessment in 2017-18 with results for the same students taking the grade 4 IADA R/LA pilot assessment in 2018-19. The State also compared students taking the grade 4 IADA mathematics pilot assessment in 2017-18 with results for the same students taking the statewide grade 5 mathematics assessment in 2018-19. The State asserted that this non-concurrent comparability approach "demonstrated remarkable consistency of expectations for the same students as we would expect some growth to proficiency from one year to the next."
    • Three States4 have been approved to implement the flexibility in ESEA section 1111(b)(2)(H) to permit LEAs to administer a nationally recognized, locally selected high school assessment in place of the statewide assessment. Each demonstrated comparability between the statewide assessment and the nationally recognized assessment using the approach noted above (i.e., demonstrating the alignment of both assessments to the State's content standards and evaluating the consistency of achievement classifications across the two systems). As further described in critical element 7.3 of the Department's assessment peer review guide, evidence to demonstrate this approach can include empirical analyses of the rigor and quality of the assessments (e.g., technical reports), studies, summaries of reviews, or samples of reports.5
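To make the "consistency of achievement classifications" analysis described above more concrete, the sketch below computes two commonly used agreement statistics – exact agreement and Cohen's kappa – for a matched sample of students classified by both assessments. All data, achievement-level labels, and function names here are hypothetical and for illustration only; an actual comparability study would use matched student records and should be designed with the State's technical advisory committee.

```python
from collections import Counter

# Hypothetical achievement-level classifications for the SAME students on
# the statewide assessment and on an innovative (IADA) pilot assessment.
# Labels and values are invented for illustration only.
statewide = ["Below", "Approaching", "Proficient", "Proficient", "Advanced",
             "Approaching", "Proficient", "Below", "Advanced", "Proficient"]
innovative = ["Below", "Proficient", "Proficient", "Approaching", "Advanced",
              "Approaching", "Proficient", "Below", "Proficient", "Proficient"]

def exact_agreement(a, b):
    """Fraction of students placed in the same achievement level by both tests."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Agreement between two sets of classifications, corrected for chance."""
    n = len(a)
    p_obs = exact_agreement(a, b)
    counts_a, counts_b = Counter(a), Counter(b)
    # Expected agreement if the two tests classified students independently.
    p_exp = sum((counts_a[lbl] / n) * (counts_b[lbl] / n)
                for lbl in set(a) | set(b))
    return (p_obs - p_exp) / (1 - p_exp)

print(f"Exact agreement: {exact_agreement(statewide, innovative):.2f}")
print(f"Cohen's kappa:   {cohens_kappa(statewide, innovative):.2f}")
```

On this invented ten-student sample, exact agreement is 0.70 and kappa is about 0.57; in practice a State would report such statistics alongside alignment evidence, as described in critical element 7.3 of the peer review guide.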

We have also received feedback that States may want to create assessments with academic achievement standards that more completely describe student achievement expectations in light of innovative assessment approaches. A State in this situation might consider establishing a set of academic achievement standards on the innovative assessment that are comparable to the statewide assessment for use during the IADA period while also building academic achievement standards on the innovative assessment to which the State could transition once all students in the State are taking the innovative assessment. In this way, at any given time, the State would be using a single set of comparable academic achievement standards, but it would not be constrained by the previous test design in building toward the desired innovative approach.

See Appendix B, "Comparability Resources," for additional resources6 that a State might consider when planning how to evaluate comparability of the IADA pilot and statewide assessments.

3. IADA Timelines

A. Planning

A common theme in the RFI comments was that the timeline for IADA implementation is too aggressive and that States need time to prepare an IADA design and discuss it with partners, including parents, educators, and school and district leaders. States and stakeholders have also noted that receiving feedback early in the development process (before an IADA assessment is ready to administer in schools) would help States in their internal discussions.

Although we cannot provide early approval, to help address these concerns the Department is offering an initial "planning status" phase so that any interested State may benefit from early feedback from the Department as it develops its application. A State interested in entering planning status would notify the Department of its intent to plan and develop a full IADA application, and may submit that request whenever it would like to do so. Planning status will not replace the existing peer review of a full application, which still must occur before the State administers its IADA pilot in lieu of the statewide assessment.

At this stage, the Department does not expect that the State is ready to submit a full application with a fully developed IADA design. Rather, the State would submit a short summary or overview of its initial approach or design to IADA, its goals for the IADA pilot, and its proposed timeline for when it expects to submit a complete IADA application for the Department to review.7 In turn, the Department would then provide early feedback on the State's approach as the State continues its development work for a future IADA application. While States are not required to participate in the planning phase, doing so would give them the benefit of early technical assistance from the Department on their suggested approach.

B. Overall IADA Timeframe

To further address the feedback that the timeline for IADA implementation is too aggressive, we want to clarify the available options. By statute, the Department can approve States for IADA for five years; we also are permitted to provide a two-year renewal period after the initial five years. After the renewal period, the Department has waiver authority to grant "the time necessary to implement the innovative assessment system statewide" (ESEA section 1204(j)(3)). We anticipate working closely with States to support successful innovation; we understand this may take more than five years and are committed to working with each IADA State, provided it continues making progress.

C. Standardized IADA Submission Windows

When a State is ready to submit an IADA application (whether or not the State previously participated in the planning status phase, which is not a prerequisite), it must receive approval from the Department prior to administering the IADA pilot in LEAs or schools. To better accommodate State planning, the Department will offer two application opportunities per calendar year: on the first Friday in May (for review and approval prior to the upcoming school year—e.g., submission in May 2024 will be for approval to implement in the 2024-2025 school year) and the first Friday in December (for review and approval in the spring prior to the upcoming school year—e.g., submission in December 2024 will be for the 2025-2026 school year). To appropriately plan for peer review of IADA applications, the Department will ask that States provide a notice of intent to submit an IADA application no later than 45 days prior to the application deadline for that period. The notice of intent will not be binding or required but will allow the Department time to plan for the logistics for the peer review.

4. Recognizing the Importance of Funding Opportunities

Many of the RFI comments observed that bold change in assessment requires a significant commitment of public resources over a multi-year transition. The Department agrees and is committed to finding new ways to support States in this work. Allowable uses of funds under the ESEA's Competitive Grants for State Assessments (CGSA) program are well aligned with IADA, and the Department intends to incentivize innovation in assessments through this program in fiscal year 2024, pending appropriations. More information will be provided as it becomes available.

5. Educator and Family Engagement and Collaboration with Outside Partners

Successful innovation in schools builds on deep educator and family engagement. We want to underscore the importance of educator and family engagement throughout the process of conceiving, planning, building, and implementing innovative assessments. Continuous family engagement in assessment design, specifically, can help ensure that assessments reflect students' lived experiences and are free from bias. As assessment transitions take place, clear and ongoing communication between educators and families about how an IADA assessment relates to other tests students have taken is critical. Communicating assessment results with families in easily understandable and actionable ways is also an essential aspect of IADA.

External partnerships are allowed, can be valuable, and are encouraged from the beginning. Please note, however, that a State is not required to secure a contract with an external vendor to apply for IADA.

6. Assessment Peer Reviewers

Finally, the Department is always seeking additional experts to serve as peer reviewers of State assessment systems, both for IADA applications and for the Department's peer review of State assessment systems, including alternate assessment systems and English language proficiency assessments. An application for prospective assessment peer reviewers is available online.8 Please consider serving as a peer reviewer and forwarding this open invitation to others in your network, particularly experts in innovative assessments. Completed applications, along with a copy of the applicant's resume, may be submitted to ESEA.Assessment@ed.gov.

The Department is committed to supporting you in building and using assessment approaches that measure what matters, support students, educators, and families, and continuously improve. We encourage you to reach out to us at ESEA.Assessment@ed.gov to discuss your plans for revising your assessments or pursuing IADA. We also encourage you to reach out to the Comprehensive Centers, which are available to support your efforts.

APPENDIX B: COMPARABILITY RESOURCES

A variety of resources published over the past 15 years could inform States and their assessment partners as they consider plans for comparability evaluations in their IADA applications.9 The list below is not intended to be exhaustive. The Department encourages States to review these and other relevant publications when considering and developing comparability evaluation plans. States should also discuss their plans with their technical advisory committees and others.

  1. Forte, E. (2017). Evaluating Alignment in Large-Scale Assessment Systems. Council of Chief State School Officers. Online at https://ccsso.org/sites/default/files/2018-07/TILSA%20Evaluating%20Alignment%20in%20Large-Scale%20Standards-Based%20Assessment%20Systems.pdf.
  2. Lyons, S. & Marion, S. F. (2016). Comparability options for states applying for the Innovative Assessment and Accountability Demonstration Authority: Comments submitted to the United States Department of Education regarding proposed ESSA regulations. Online at www.nciea.org.
  3. Marion, S. & Perie, M. (2011). Some Thoughts about Comparability Issues with "Common" and Uncommon Assessments. Online at www.nciea.org.
  4. Perie, M., in Berman, A. I., Haertel, E. H., & Pellegrino, J. W. (Eds.). (2020). Comparability of Large-Scale Educational Assessments. National Academy of Education. Online at https://naeducation.org/wp-content/uploads/2020/06/Comparability-of-Large-Scale-Educational-Assessments.pdf.
  5. Winter, P. (Ed.). (2010). Evaluating the Comparability of Scores from Achievement Test Variations, pages 1-13. Council of Chief State School Officers. Online at https://files.eric.ed.gov/fulltext/ED543067.pdf.

1 See https://www.federalregister.gov/documents/2023/03/31/2023-06697/request-for-information-regarding-the-innovative-assessment-demonstration-authority.

2 See https://ies.ed.gov/ncee/pubs/2023004/ for the full report from IES.

3 See https://www2.ed.gov/admins/lead/account/iada/nh-annual-perf-rpt1819.pdf, page 117, for presentation of non-concurrent validity comparisons of New Hampshire's PACE pilot with the New Hampshire statewide assessment.

4 Currently, MS, ND, and OK have been approved to permit a district to administer a nationally recognized, locally selected high school assessment in lieu of the statewide assessment.

5 See pages 76-77 in https://www2.ed.gov/admins/lead/account/saa/assessmentpeerreview.pdf.

6 This document contains resources that are provided for the user's convenience. The inclusion of these materials is not intended to reflect their importance, nor is it intended to endorse any views expressed, or products or services offered. These materials may contain the views and recommendations of various subject matter experts as well as hypertext links, contact addresses and websites to information created and maintained by other public and private organizations. The opinions expressed in any of these materials do not necessarily reflect the positions or policies of the U.S. Department of Education. The U.S. Department of Education does not control or guarantee the accuracy, relevance, timeliness, or completeness of any outside information included in these materials.

7 States can submit to esea.assessment@ed.gov.

8 See https://oese.ed.gov/files/2023/09/Assessment-Peer-Reviewer-Checklist-2023-24.docx

9 See the disclaimer in footnote 6.



Last Modified: 03/05/2024