
Exploring Opportunities to Increase the Practical Usage of NAEP Results: Ideas for Improving Data Literacy

By Daniel Frederking & Bobbi Newman - February 15, 2023

The National Assessment of Educational Progress (NAEP) 2022 results were released in October, and they confirmed what many of us already knew: the pandemic had a significant negative effect on student learning, and the gap between higher-performing and lower-performing students continued to widen. The 2022 NAEP results in mathematics saw the largest decline from the previous administration since the current test’s inception in 1990. NAEP is the closest thing we have to a consistent learning metric across this country, and it is showing concerning data about the state of American education.

These drops prompt us to ask how states can carve a path forward to improve student achievement. The question has no clear-cut answer, because states vary in whether and how they leverage NAEP data to make policy decisions and to create a vision for teaching and learning. Whatever their current uses, more can be done with NAEP results if NAEP data literacy is improved.

NAEP Data Literacy

Increasing data literacy is a key element of leveraging NAEP data to inform teaching and learning policies. Being data literate starts with understanding what NAEP data are intended to measure. NAEP monitors the condition of education in the United States and in each individual state; it evaluates and reports trends in student achievement over time and across student demographics; and it makes comparisons among states, with data reported at the national and state levels rather than at the district, school, or student level.

Through capacity building around data literacy, states can help educators, school and district leaders, and other stakeholders understand how the data are reported and how to leverage the findings. Being data literate means understanding the types of data that NAEP can produce and the ways that data can inform policy and practice. Resources do exist to support states in building NAEP data literacy. Below are a few considerations, organized around questions states might ask:

How can I provide context to the test results?

A relatively untapped NAEP resource is the contextual data that accompany the test. Surveys are given to the students, teachers, and school administrators participating in the NAEP assessment. These surveys cover a wide range of variables, including demographics, curriculum decisions, and educator information. Pairing this information with the performance data can yield findings worth investigating at a deeper level. For example, if the percentage of students who report using math in everyday life outside of school is declining over time on the statewide survey, this may signal that the state should take a deeper dive into the overall math achievement of students across different student populations. The NAEP survey data provide an opportunity to consider the authentic contexts of instruction or curriculum, or how the language of the standards might promote greater authentic learning experiences.
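To make the pairing concrete, the sketch below shows the kind of summary described above: tracking, year over year, the share of students who report using math in everyday life alongside the group's average score. All records, field names, and figures are invented for illustration; they are not actual NAEP results.

```python
# Hypothetical illustration of pairing NAEP-style survey responses with
# performance data. Every record below is fabricated, not actual NAEP data.

records = [
    # (year, reports_everyday_math_use, scale_score) -- invented values
    (2017, True, 242), (2017, True, 238), (2017, False, 230), (2017, True, 245),
    (2019, True, 240), (2019, False, 232), (2019, False, 228), (2019, True, 241),
    (2022, False, 226), (2022, False, 224), (2022, True, 236), (2022, False, 229),
]

def summarize_by_year(rows):
    """For each year, report the percentage of students who say they use
    math in everyday life and the group's average scale score."""
    summary = {}
    for year in sorted({r[0] for r in rows}):
        year_rows = [r for r in rows if r[0] == year]
        pct_everyday = 100 * sum(r[1] for r in year_rows) / len(year_rows)
        avg_score = sum(r[2] for r in year_rows) / len(year_rows)
        summary[year] = (round(pct_everyday), round(avg_score, 1))
    return summary

print(summarize_by_year(records))
```

In this fabricated example, the share of students reporting everyday math use falls from 75% to 25% across administrations while average scores drift down, the kind of joint pattern that might prompt the deeper dive described above.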

What digital tools do students access when taking the NAEP exams?

Another resource for states is the process data generated by the online NAEP tests. In 2017, the NAEP mathematics assessment was administered for the first time as a digitally based assessment at grades 4 and 8. Student actions during testing are logged and coded with their associated timestamps: if a student uses the highlighter function or zooms in on a particular resource, there is now a record of it. These recordings are known as process data. Our ability to understand how students approach digital assessments, including the “electronic NAEP,” or eNAEP, system has greatly expanded over the last decade.

In a recent article, American Institutes for Research’s (AIR) Juanita Hicks explained how process data can be used to better understand student interactions with digital platforms. The data can provide new insight into how students interact with the test and how they use the tools and accommodations available to them. If states were to take a closer look at these data, they could uncover more about the impacts of technology on learning and identify ways their districts might leverage those insights.
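The sketch below illustrates the idea of timestamped action logs described above. NAEP's actual process-data format is not public in this form, so the event names, fields, and values here are invented; the point is only to show the kinds of questions such logs can answer, such as how often a tool is used and how long a student spends on an item.

```python
# Hypothetical sketch of process-data analysis. The log format and event
# names below are invented for illustration, not NAEP's actual schema.
from collections import Counter

events = [
    # (student_id, timestamp_seconds, action) -- fabricated log entries
    ("S1", 3.2, "open_item"), ("S1", 10.5, "use_highlighter"),
    ("S1", 41.0, "zoom_in"), ("S1", 75.9, "submit_answer"),
    ("S2", 2.8, "open_item"), ("S2", 30.1, "submit_answer"),
]

def tool_usage(log):
    """Count how often each logged action occurs across all students."""
    return Counter(action for _, _, action in log)

def time_on_item(log, student):
    """Seconds between a student's open_item and submit_answer events."""
    times = {action: ts for sid, ts, action in log if sid == student}
    return times["submit_answer"] - times["open_item"]

print(tool_usage(events))
print(time_on_item(events, "S1"))
```

Even this toy version shows how a record of highlighter use or zooming, paired with timing, could help a state ask how students engage the tools and accommodations available to them.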

What common misconceptions do students have when engaging with the content?

Many state education agencies provide educators with released assessment items from NAEP so teachers can develop focused instruction to address common misconceptions, errors, and misunderstandings of foundational knowledge taught in earlier grades. Expanding practices such as this could have a significant impact on student learning. There is great potential to focus on misconceptions in the NAEP results, similar to the way AIR’s standards and assessment team is developing ways for educators to analyze student misconceptions in the Trends in International Mathematics and Science Study (TIMSS). The AIR Educator Tool, a proof of concept, supports K–12 mathematics and science learning by making large-scale assessment resources accessible, relevant, and useful for U.S. teachers who want to learn more about the errors students make and the misconceptions they hold. Though schools don’t receive student-level NAEP results, opportunities like this could still provide value for using items to improve classroom instruction.
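One common way to surface a misconception from a released item is distractor analysis: tallying which wrong answer students choose most often. The sketch below is a generic illustration of that technique, not the AIR Educator Tool's method; the item, option letters, and response counts are all invented.

```python
# Hypothetical distractor analysis on a released multiple-choice item.
# Responses and the misconception mapping are fabricated for illustration.
from collections import Counter

# Suppose option "B" on a fraction-comparison item corresponds to the
# misconception "a bigger denominator means a bigger fraction."
responses = ["A", "B", "B", "C", "B", "A", "B", "D", "B", "B"]
correct = "A"

def distractor_analysis(responses, correct):
    """Return the most-chosen wrong option and the percentage of all
    students who selected it."""
    counts = Counter(responses)
    total = len(responses)
    wrong = {opt: n / total for opt, n in counts.items() if opt != correct}
    dominant = max(wrong, key=wrong.get)
    return dominant, round(100 * wrong[dominant])

print(distractor_analysis(responses, correct))  # → ('B', 60)
```

A dominant distractor like this, chosen by well over half the students, is the signal a teacher would use to target instruction at the underlying misconception rather than at the item itself.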

A Promising Opportunity

The opportunity to use NAEP results as evidence to support strategic planning and continuous improvement is promising. To seize it, however, NAEP data literacy must be improved. States can convene a task force of state and district leaders to identify specific objectives for analyzing and leveraging NAEP scores to improve teaching and learning. A task force can begin by getting to know the data, using the free tools on the NAEP Data Explorer webpage to create statistical tables, charts, and maps from NAEP results. Other efforts could involve documenting how states use the data to inform policy and resource allocation through brief case studies, exemplars, and data visualizations tied to targeted policies. Framing the data in these ways showcases the results in a real-world context where their importance to districts and schools can be fully understood. Data literacy and leveraging NAEP data are key components of a standards-aligned system. If you’re looking for a thought partner in this process, we encourage you to reach out to your regional comprehensive center for help.

 

Daniel Frederking is a senior technical assistance consultant at AIR. His work includes serving as a partnership facilitator for the Midwest Regional Educational Laboratory (REL Midwest), as well as developing and leading various REL Midwest technical assistance projects. He serves as an instructional coach for school districts across the country and participates in a STEM research-practice partnership in Milwaukee. He specializes in improving educator data literacy, working with educators at all levels to improve their use of data to make instructional decisions. Prior to joining AIR, Frederking was a consultant and coordinator in the student assessment division of the Illinois State Board of Education as well as a high school language arts teacher in two different Illinois districts. 

Bobbi Newman is a principal researcher and director of AIR’s practice area for standards and assessments. She has more than 15 years of experience in research and evaluation of school reform efforts. Dr. Newman has worked at every level of the education system, from the classroom to the state. As a researcher, she has worked with state and local education agencies to study reform efforts such as school turnaround, effective leadership programs, assessment, and standards implementation. At the state level, Dr. Newman worked with New Jersey and Pennsylvania on their teacher and principal effectiveness systems and contributed to federal waiver applications. As a researcher at the University of Pennsylvania, she created a virtual space to unite policymakers, practitioners, and researchers around problems of education practice. She has expertise in strategic education partnerships and in facilitating stakeholders to accomplish shared goals.

Photo by Allison Shelley for EDUimages