Comments to the HELP Committee on Academic Growth Measures
Michelle Dimino
February 11, 2026
Senator Bill Cassidy, M.D.
Chair, Senate Health, Education, Labor, and Pensions Committee
428 Dirksen Senate Office Building
Washington, DC 20510
Dear Senator Cassidy:
Thank you for the opportunity to respond to the HELP Committee’s request for information on measuring school-level academic growth. Third Way shares the Committee’s concerns about declining academic achievement in American K-12 schools and its commitment to using comprehensive growth measures to enhance our understanding of school quality and equip parents and families with meaningful information about student performance and progress.
While federal law does not mandate that states measure and report on student growth, nearly all have adopted the practice as part of their accountability plans under the Every Student Succeeds Act (ESSA), establishing a variety of effective approaches from which the HELP Committee can draw useful data. As the Committee’s request for information notes, these methodologies include growth percentiles, value-added growth measures, and growth tables—each of which uses the year-over-year gains of individual students to encapsulate school-level progress.
Both growth and proficiency provide essential pieces of information for educators, families, and policymakers. Proficiency, often assessed through standardized tests, measures current performance in relation to grade-level expectations. Growth indicators capture a student’s progress over time, providing distinct insights into the effectiveness of instructional practices and school improvement efforts. Considering growth in students’ academic achievement alongside measures of proficiency therefore offers a more complete picture of student performance and school quality.
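To illustrate the distinction in rough terms, the sketch below is a simplified, hypothetical Python example (not any state’s actual methodology): it computes a crude student growth percentile, that is, the percentile rank of a student’s current-year score among peers with similar prior-year scores, which can then be aggregated to the school level. The cohort data, band width, and function names are illustrative assumptions only.

```python
# Minimal sketch (hypothetical data, not any state's official model):
# a simplified student growth percentile (SGP). A student's growth
# percentile is the percentile rank of their current-year score among
# peers who had similar prior-year scores, so it reflects progress
# rather than absolute proficiency.
from statistics import mean

# Hypothetical (score_last_year, score_this_year) pairs for a cohort.
cohort = [
    (410, 430), (412, 415), (408, 440), (411, 422), (409, 418),
    (413, 435), (410, 425), (412, 428), (407, 420), (411, 433),
]

def simple_growth_percentile(prior, current, cohort, band=5):
    """Percentile rank of `current` among students whose prior-year
    score was within `band` points of `prior` (a crude stand-in for
    the conditional distributions that real SGP models estimate)."""
    peers = [cur for (pri, cur) in cohort if abs(pri - prior) <= band]
    below = sum(1 for cur in peers if cur < current)
    return round(100 * below / len(peers)) if peers else None

# A student scoring 430 this year after 410 last year outgrew most
# similar-scoring peers, even if 430 remains below "proficient."
print(simple_growth_percentile(410, 430, cohort))

# School-level growth can then be summarized, for example, as the
# mean of student growth percentiles across the building.
sgps = [simple_growth_percentile(p, c, cohort) for (p, c) in cohort]
print(mean(sgps))
```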
Are there any kinds of federal support that would be useful to states seeking to implement new growth measures or revise existing ones?
State longitudinal data systems (SLDSs)—which link individual-level statewide data across early childhood, K-12 education, postsecondary education, and the workforce—are vital to enabling the construction and application of student growth measures. Over many years, states and the federal government have made coordinated investments to build secure data systems, improve data quality, and use the data to address key questions and inform policy decisions. Sustained federal funding will be essential to continuing this work and to empowering state and local education leaders to strengthen student growth measures. To this end, Congress should increase annual appropriations for the SLDS Grant Program and work to pass the Committee’s Advancing Research in Education Act, which contains important provisions to modernize SLDS grants and has bipartisan support.
How can the federal government support cross-state learning about communicating information to families?
Enabling states to more easily share best practices for communicating school-level growth to families is a constructive federal goal. Under ESSA, states are tasked with designing accountability systems—but it is critical that families understand the data produced by those systems. To support cross-state learning, the federal government could:
- Organize a national convening of stakeholders leading communications efforts at the state level to facilitate cross-state dialogue and idea-sharing.
- Develop and maintain a repository of effective messaging materials curated from states—for example, plain-language explanations of growth vs. proficiency measures, subgroup results, and achievement standards. In alignment with ESSA, any such effort should be strictly voluntary in nature and designed to provide resources for consideration and customization, rather than mandated templates.
- Fund user testing of states’ family-facing communications materials. Testing the clarity and usefulness of messages disseminated to families about school or state report cards, growth measures, visual dashboards, and ESSA-required accountability information could inform best practices and help states develop clear, consistent, and meaningful explanations of student and school results. The federal government could provide competitive grants to support such testing (and/or match state dollars) and require the submission of an evaluation report. Alternatively, to reduce identification concerns and facilitate pattern detection, the National Center for Education Statistics (NCES) could solicit voluntary submissions of public-facing communications materials to be analyzed in the aggregate and compiled into best-practice guidance accessible to all states.
What changes to the National Assessment of Educational Progress or other federal data collection efforts would support a national focus on student growth?
NAEP, often called “the nation’s report card,” is a major driver of the national narrative on K-12 education. Its results provide valuable, standardized data that are typically presented as a point-in-time snapshot of student achievement. As the Committee notes, NAEP is limited in its ability to provide insight into student growth at the state or national level. Still, intentional framing of the results can support a broader national dialogue about student growth, and federal policymakers can invite that conversation. In convening hearings on NAEP results, for example, the Committee could invite state and local education agency representatives and education researchers to contextualize national results with state-level growth indicators, directing the narrative toward a comprehensive focus on both proficiency and progress.
How can federal policy incentivize states to focus on growth and remove any barriers to state innovation?
To spur state innovation in academic assessment, including assessment of individual student growth, Congress established the Innovative Assessment Demonstration Authority (IADA) in 2015. Through the program, up to seven states could initially be granted federal approval to experiment with innovative assessment models—such as competency-based tests that focus on skill mastery, interim assessments administered multiple times during the school year to provide timely access to student performance data, adaptive assessments that adjust question difficulty based on students’ prior responses, or digital assessments that integrate multimedia or interactive components. The expectation was that approved states would be on track to implement their piloted assessments statewide within five years. To support their progress in scaling up the new models, IADA states receive temporary flexibility from double-testing requirements, so students participating in the pilot assessments do not also have to take the existing statewide assessment.
Since the creation of IADA over a decade ago, six states have been approved for participation: Louisiana, New Hampshire, Georgia, North Carolina, Massachusetts, and Missouri. Missouri, approved in July 2025, is the only state to apply and receive approval within the last five years; however, innovative assessment planning and development have also been supported in other states through the Competitive Grants for State Assessments (CGSA) program. In 2023, the Department of Education solicited input from states and districts about IADA and subsequently issued guidance lifting the cap on the number of states able to participate in the program and clarifying how states can demonstrate that their IADA assessment meets comparability requirements relative to the current state assessment. Nonetheless, barriers to participation and to the successful scaling of piloted models persist. A congressionally required report evaluating the initial five IADA systems emphasized the significant disruptions to testing caused by the pandemic, but it also identified state and district capacity limitations, tight timelines, and stakeholder engagement as challenges to system development and implementation.
With these lessons in mind, IADA and CGSA offer established avenues through which federal policymakers can provide the funding, guidance, and interim flexibility needed to build state capacity and incentivize more states to pursue innovative growth assessments. To this end, Congress should consider:
- Pursuing additional channels to raise states’ awareness of expanded IADA eligibility, the benefits of participation, and acceptable coordination between projects developed under CGSA and subsequent IADA implementation.
- Meeting with states whose IADA applications have not been successful in the past to learn more about the challenges encountered in prior rounds and how policymakers can lower barriers to future participation.
- Appropriating robust sums for the CGSA program in the Fiscal Year 2027 budget process and including report language to compel the Department of Education to run a new competition and allocate funding for additional state education agencies (SEAs) to engage in the planning and development processes for innovative assessment systems.
- Publishing a Frequently Asked Questions document or guidance memo to enhance clarity around allowable uses of artificial intelligence in developing, reviewing, and/or scoring innovative assessments. Rapid advances in AI capabilities have introduced broad new opportunities for the field of assessment, and SEAs with an interest in exploring the potential of these new tools may have reasonable concerns about their ability to do so through CGSA and/or IADA. Clear guidance around acceptable and legally compliant use cases of AI in innovative testing could serve to encourage more applications to these programs and support technological experimentation while safeguarding student privacy, ensuring assessment validity and auditability, and integrating appropriate levels of human oversight.
Thank you for your time and consideration of these comments, and please do not hesitate to contact us should you have further questions.
Sincerely,
Michelle Dimino
Director of Education, Third Way
[email protected]