
Making ARP ESSER investments sustainable through evaluation

By Jessica Newman

November 11, 2021

The American Rescue Plan Elementary and Secondary School Emergency Relief (ARP ESSER) fund has made billions of dollars available to state and local education agencies to support students' social, emotional, and academic needs. These supports include implementing COVID-19 safety and prevention strategies, offering comprehensive afterschool and summer enrichment programs, hiring staff and providing other supports for mental health and wellness, implementing models of integrated supports (like the full-service community schools model), and ensuring that students have access to high-quality home internet and devices.

But ARP ESSER funding is limited in both scope and duration. It is a one-time opportunity meant to support recovery now. How can you ensure that the investments you make today continue to yield benefits into the future?

The first answer is an obvious one, and one that we wrote about earlier this summer: don’t panic or make rash decisions. Rather, make choices strategically based on existing plans, data, and lessons learned, and with sustainability in mind. A second answer is to follow the evidence and make sure that the choices you are making support the whole person (both young people and adults) by attending to their social, emotional, physical, spiritual, and academic needs. The U.S. Department of Education’s COVID-19 Handbook (Volume 2) highlights practices for creating a safe and supportive school environment for students and staff alike. 

A third answer, and one that we propose is critically important for sustainability, is to make sure that you set yourself up to learn from your efforts through data-informed learning and continuous improvement. As Matt Soldner (Commissioner of the National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences) put it in his open letter to superintendents this past spring, “Please use this summer as a chance to build evidence about ‘what works’ to improve outcomes for your students. In a word: evaluate!” Well, we couldn’t agree more!

This is a time of opportunity. We strongly encourage state and local education agencies to set aside ARP ESSER funds to evaluate implementation (definitely) and outcomes (where appropriate and feasible). ARP ESSER not only provides the resources to recover from the challenges of COVID-19 in support of the whole child; it also offers an opportunity to learn from what we are doing, if we are strategic and intentional. Putting evaluation systems and processes in place makes learning from ARP ESSER investments possible. Integrating those systems and processes early and intentionally, with the resources that are now available, makes ARP ESSER investments sustainable.

Below are three intentional steps you can (and should) take now to prioritize evaluation for data-informed learning and continuous improvement:  

1. First, what do you need to know? Start with a few key questions: What do you absolutely need to know? If you are at the state level, do you have a mechanism for understanding how local dollars are being spent? At the local level, who is being served (and who is not), how, and why? Are ARP ESSER dollars (and other funding) being used as intended? Understanding implementation is a critical first step before you can measure or make meaning from anything else.

Here we ask the question about what you absolutely need to know because we have observed how quickly assessment can go wild. Now, more than ever, is the time to take a critical look at the burden we place on educators, young people, and their families through intensive, sometimes invasive, data-gathering efforts. Focus only on what you need to know, consider the least burdensome way to collect data, and ensure that your practices are equitable, culturally competent, and responsive, and come from a place of learning rather than compliance.  

2. Second, what information do you need to collect? Consider what data will allow you to answer your questions. Gathering specific information is most useful when you can link it to other existing data. For example, can you capture attendance or participation in specific services or activities? If so, can you link it to existing demographic information? Now you have a more robust picture of who is being served and how.

State and local education agencies are required to set aside 1% of their funding for comprehensive afterschool and summer enrichment programs. If you are new to out-of-school-time programming, you might be wondering what data are essential. Start with attendance and participation: Who is showing up to programs, and what are they doing while they are there? Incorporate staffing data: Who is leading program activities, and where do they work (e.g., school, community partner)? Consider embedding measures of program implementation, quality, or youth engagement: How are programs creating safe and supportive spaces for learning and development, and are young people engaged in their experience? What outcomes can you expect from these programs? Are they realistic? Are there existing data that would help you understand progress toward these goals?

3. Third, how will you collect the data? Identify existing systems and processes that you can leverage or build from to capture the information you need. For example, how can you use existing data systems to track whether and how local education agencies are using ARP ESSER dollars, especially dollars intended for a specific purpose? Are there ways for local data management systems to connect young people’s records with the services and activities they are now engaging in? If you have existing school-day attendance and course-tracking systems, can you expand them to track what is happening during the hours that young people are out of school participating in afterschool or summer programs? Expanding what already exists also helps ensure that these processes are sustained in the future.

Taking these steps is important, but data are of little use if they are not accessed and used regularly and with purpose. Remember those key questions from step 1? Answer them! It is essential to identify and plan for ways to make meaning from your data formatively (that is, in an ongoing way that drives learning and improvement) and collaboratively. Meaning-making is not something to do alone. Your regional comprehensive center or another trusted research or technical assistance partner, for example, can help you (and your colleagues!) dive into your data, elevate lessons learned, and make data-informed decisions. We often use Plan-Do-Study-Act (PDSA) cycles, an effective approach to continuous improvement in which groups work together to create plans that address specific needs, make changes to address those needs, measure the changes that occur, and refine their plans to keep improving. Ultimately, evaluation is the key to learning, and learning is the key to improvement. As we continue to rebuild and recover, we can integrate more and better practices that foster equity and strengthen how young people and adults learn and develop.