IES Blog | Institute of Education Sciences

Measuring the Homelife of Students During the Pandemic

As part of the IES-funded Prekindergarten Through Grade 12 Recovery Research Network, the Georgia Policy Labs has been working to gauge the effects of economic insecurity and health stressors on student engagement and achievement during and after COVID-19-era remote learning. In this guest blog, Dr. Tim Sass, principal investigator of the IES-funded project and Distinguished Professor at Georgia State University, discusses a challenge his research team faced in linking students' home environments to their academic success: an obstacle to collecting home background survey data from parents/legal guardians. Dr. Sass shares his team's practical solution to this challenge.

The Challenge: Difficulty Collecting Data about Student Home Situation

A major challenge to studying out-of-school factors that contribute to student academic success is the lack of information about a student's home situation. Standard administrative data barely scratch the surface, providing information on language spoken at home, eligibility for subsidized meals (an admittedly crude measure of poverty), and little else. This lack of information became even more problematic during the COVID-19 pandemic, when students were learning from home and many families' health and economic well-being were severely affected.

As part of our project, we focus on identifying the factors associated with student engagement and achievement growth in remote instruction. Our initial strategy was to gather survey data in three of our partner school districts in metro Atlanta. We created a questionnaire to measure student engagement as well as collect information on economic insecurity, health stressors, and protective factors like adult monitoring of student learning at home or hiring a tutor. However, we faced two major challenges that made it difficult for us to collect the information we needed.

The first challenge was creating a process for identifying respondents.  Our partners agreed to send out the survey on our behalf, using their established systems for communicating with parents/guardians. Our intent was to make the responses identifiable so we could directly link information gathered from the survey to outcomes for specific students. While one of our partner districts was not comfortable with identifying individual respondents, it agreed to identify respondents’ school of enrollment. A second district agreed to identify respondents, but due to miscommunication within the district, their survey team made the survey anonymous. Finally, the third district agreed to allow linking individual responses to student ID numbers but left it up to parents/guardians to identify their student in the survey, and only about half of respondents filled in their student’s ID number.   

The second challenge was the very low response rates: 192 respondents from District A (0.4% response rate), 1,171 respondents from District B (1.2% response rate), and 80 respondents from District C (0.1% response rate). While disappointing, the low response rates are not unique to our study. Other researchers have struggled to get parents/guardians to respond to surveys conducted during the pandemic or shortly after the resumption of in-person instruction.

The Solution: Using Non-School-Based Administrative Data from Multiple Agencies to Complement Survey Data

Given the low response rates and non-identifiable responses, we considered how we could use additional non-school-based administrative data to complement the survey evidence. Through a partnership with researchers from the Atlanta Regional Commission, the Federal Reserve Bank of Atlanta, and the Georgia Institute of Technology, we obtained data from court records on evictions in the Atlanta Metro Area. The data cover four of the five “core” counties in the Atlanta Metro Area, spanning calendar years 2019-21. Over this period, there were approximately 300,000 unique eviction filings across the four-county area, including both households with and without school-aged children. The court records contain a property address and filing date, which were used to match yearly eviction filings to students based on district student address files.
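
To make the linkage concrete, here is a minimal sketch of how filings might be matched to student records on a normalized address and calendar year. It is an illustration only: the file names, column names, and normalization rule are assumptions for this example, not the project's actual pipeline.

```python
# Minimal sketch of an address-based link between eviction filings and
# district student address files. File and column names are hypothetical.
import pandas as pd

def normalize_address(addr: str) -> str:
    """Lightly standardize an address string for exact matching."""
    return " ".join(str(addr).upper().replace(".", "").replace(",", " ").split())

# Hypothetical inputs: court filings (property address + filing date) and
# yearly student address records (student ID + address + calendar year).
evictions = pd.read_csv("eviction_filings.csv", parse_dates=["filing_date"])
students = pd.read_csv("student_addresses.csv")  # one row per student-year

evictions["match_key"] = evictions["property_address"].map(normalize_address)
evictions["year"] = evictions["filing_date"].dt.year
students["match_key"] = students["address"].map(normalize_address)

# Flag each student-year record whose address matches a filing in that year.
linked = students.merge(
    evictions[["match_key", "year"]].drop_duplicates(),
    on=["match_key", "year"],
    how="left",
    indicator=True,
)
linked["matched_to_filing"] = (linked["_merge"] == "both").astype(int)
```

In practice, unit numbers, misspellings, and multi-family buildings make the match far messier, which is exactly what motivates the "Experienced" versus "Exposed" distinction described below.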

The following table provides counts of students whom we successfully linked to eviction filings, by district and year. “Experienced Eviction” refers to cases where we directly matched an individual dwelling unit to an eviction filing. In many cases, however, the street address in the eviction filing is for a multi-family structure, and there is not enough information in the filing or in the student address files to directly match a student living in the complex to the specific unit named in the filing. When this type of building-level match occurs, we designate the student as “Exposed to Eviction.” The “Exposed to Eviction” counts include the instances of “Experienced Eviction.” The “Eviction Rate” is the ratio of “Experienced Eviction” to the total number of students in the district.

 

| Year | District | Experienced Eviction | Exposed to Eviction | Eviction Rate |
|------|----------|----------------------|---------------------|---------------|
| 2019 | A | 2,608 | 12,555 | 0.043 |
| 2019 | B | 3,249 | 22,520 | 0.030 |
| 2019 | C | 32 | 321 | <0.001 |
| 2020 | A | 3,412 | 13,408 | 0.057 |
| 2020 | B | 4,467 | 25,503 | 0.042 |
| 2020 | C | 1,246 | 10,520 | 0.013 |
| 2021 | A | 2,251 | 10,789 | 0.041 |
| 2021 | B | 2,929 | 19,514 | 0.029 |
| 2021 | C | 2,323 | 9,842 | 0.024 |

 

While an eviction filing does not mean that a family was removed from their dwelling, it does indicate that they were at risk of losing their home. Moving forward, we plan to use the eviction filing information as a measure of housing insecurity and economic stress. We will incorporate this metric when estimating models of student achievement growth during the pandemic and models of absenteeism and student behavior after students returned to in-person learning. This will give us a sense of the degree to which external factors affected student performance in remote learning, as well as the influence of housing and economic insecurity on achievement, engagement, and behavior once students returned to classrooms. Our findings will provide school districts and social-service providers with information on student exposure to economic stress so that in-school supports and "wraparound" services can be offered to the students who need them most.
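
As a rough illustration of how such an indicator could enter an achievement-growth model, the sketch below fits a simple regression of current scores on prior scores plus eviction-exposure flags. The variable names, controls, and clustering choice are assumptions made for the example, not the specification the project will actually estimate.

```python
# Illustrative only: variable names, controls, and the model form are
# assumptions for this sketch, not the project's actual specification.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("student_panel.csv")  # hypothetical analysis file

# Growth-style model: current test score conditional on the prior-year score,
# with indicators for eviction exposure and basic fixed effects.
model = smf.ols(
    "score ~ prior_score + experienced_eviction + exposed_to_eviction"
    " + C(grade) + C(district)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})

print(model.summary())
```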


This blog was produced by Haigen Huang (Haigen.Huang@ed.gov), program officer at NCER.

Unlocking College and Career Success: How the RELs are Making a Difference in Access, Enrollment, and Completion


Removing barriers to college access and success begins well before the first college application is submitted. It starts with high schools offering advanced courses, work-based learning (WBL), and career programs, giving students a clear roadmap toward higher education and career readiness. While recent data from the National Center for Education Statistics show that 73 percent of public high schools offer some type of advanced academic course and 86 percent offer career and technical education (CTE), not all students, especially those from historically underserved backgrounds, have equal access to these resources and opportunities. Improving college access, enrollment, and completion can help address the inequities we see in higher education and the workforce and give all students an equal opportunity to achieve economic stability.

Many policymakers and educators are focused on ensuring that students are ready for college and careers when they graduate from high school. IES's Regional Educational Laboratories (RELs) work in partnership with states and districts to 1) conduct original, high-quality research; 2) provide training, coaching, and technical support; and 3) disseminate high-quality research findings on college and career readiness.

In this fourth installment of our blog series, we share how two RELs are making a difference in college and career success.

REL Northeast & Islands: Preparing Students for Success after High School

REL Northeast & Islands is partnering with education leaders in Vermont, Rhode Island, and New York to support college and career readiness initiatives.

Facing an aging workforce and stagnant postsecondary enrollment, Vermont has launched a statewide initiative to offer students multiple pathways to achieve education and career success. With support from REL Northeast & Islands through the Partnership to Strengthen Flexible Pathways for College and Career Success, Vermont is assessing the quality and completeness of the data it currently collects about program access, participation, and success and exploring how those data can be used to identify inequities and barriers. REL Northeast & Islands supported a series of meetings that brought together Vermont Agency of Education staff from the divisions of Data and Analysis and Student Pathways, along with work-based learning coordinators from comprehensive high schools and regional Career and Technical Education Centers, to identify and address opportunity gaps in access to and participation in these pathways, particularly for students from historically underserved groups and those in rural locations. REL Northeast & Islands also supported partners as they considered developing policies and guidance to improve data collection on student CTE and WBL participation and success.

In Rhode Island, REL Northeast & Islands’ Partnership to Support Early College Opportunities is studying whether and how early college opportunities help bring the state closer to achieving its postsecondary enrollment goals. The REL Northeast & Islands team supported school and postsecondary leaders and counselors through coaching sessions aimed at increasing school and district teams’ understanding of their early college data, setting goals for improvement, and supporting the use of the Rhode Island Department of Education’s data dashboards. Through 2024, REL Northeast & Islands is also conducting an applied research study to investigate the cost-effectiveness of three state programs that help students earn college credits during high school: dual enrollment, concurrent enrollment, and Advanced Placement. “Our Rhode Island partners are very interested in understanding the results of these programs and whether they work for all students,” explains REL research scientist and partnership co-lead Dr. Katherine Shields. “So in this study, we are looking at whether the effects of the programs differ for students who started high school academically proficient and those who did not.”

And in New York, REL Northeast & Islands recently established a new partnership with state education leaders, the Partnership to Support Equity in Early College Programs, to address a persistent decline in postsecondary enrollment and support equity in early college programs. A study with the New York Department of Education will help New York policymakers and education leaders better understand inequities in access, enrollment, and the outcomes experienced by participating students.

REL West: Using Evidence-Based Strategies to Reengage and Support Adults with Some College but No Credential

In addition to identifying evidence to support college and career readiness through collaboration with K–12 agencies, the RELs are working with postsecondary institutions to extend this support. For example, to re-engage Californians who have some college experience but did not complete their credentials, REL West has established the California Adult College Completion Partnership, comprising six higher education institutions in northern California. Using a continuous improvement model, REL West is helping these institutions implement strategies to re-engage these students and encourage them to return to college and complete credentialing. The partnership identified strategies that fall under three main buckets: communications and outreach, reenrollment and onboarding, and student supports. REL West is providing tailored coaching support to help each of the six participating institutions identify, implement, and test at least one of these strategies.

“The work of the REL allowed us to refocus the efforts to identify and re-engage students at Shasta College who completed some courses but have no credentials. We were also able to add capacity to the efforts with other stakeholders on our campus. This resulted in an increase in enrollment by near completers. Also, involvement in the cohort has strengthened our partnerships with other colleges in the region, and we look forward to continuing our joint efforts after the completion of the project.”
—Kate Mahar, Associate Vice President of Strategic Initiatives at Shasta College and Executive Director of the Shasta College Attainment and Innovation Lab for Equity (SCAILE)

How RELs are Contributing to the Research Base

RELs collaborate with school districts, state departments of education, and other education partners to help generate evidence and contribute to the research base through rigorous inquiry and data analysis. The two studies highlighted below focus on college and career readiness and both meet the What Works Clearinghouse standards with reservations, with at least one statistically significant finding and moderate evidence of effectiveness.

REL Central: The Impact of Career and Technical Education on Postsecondary Outcomes in Nebraska and South Dakota

Education leaders in Nebraska and South Dakota partnered with REL Central to examine how completing a sequence of career and technical education courses in high school affects students' rates of on-time high school graduation and their rates of postsecondary education enrollment and completion within two and five years.

REL Northeast & Islands: The Effects of Accelerated College Credit Programs on Educational Attainment in Rhode Island

This study examined participation in accelerated college credit programs (dual enrollment, concurrent enrollment, and Advanced Placement courses) in Rhode Island high schools to understand their effects on educational attainment. The video What are the effects of taking college-level courses in high school? shares findings from the report.

Learn More about the College and Career Work of the RELs

The examples shared here illustrate the varied support RELs can provide across data systems, access, and analysis; cost-effectiveness; and support for research and development. In addition to the work highlighted in this blog, multiple RELs across the program are working hard to support college and career readiness and success in their regions. Learn more about this work by visiting:

REL Appalachia

Strengthening Students’ Preparation for College and Careers

Developing Resilient and Supportive Community Colleges

REL Central

Supporting Postsecondary and Workforce Readiness of Students in Kansas

REL Mid-Atlantic

Improving Post-High School Transitions for Students with Disabilities in Maryland

REL Midwest

Employability Skills Partnership

REL Northwest

Portland High School Graduation

REL Southeast

Diversifying the Teacher Pipeline with Historically Black Colleges and Universities


The Regional Educational Laboratory (REL) program, operated by the U.S. Department of Education’s Institute of Education Sciences (IES), supports state education agencies, schools and school districts, and institutions of higher education nationwide in using data and evidence-based practice to improve opportunities and outcomes for learners. Operating in all 50 states, the District of Columbia, Puerto Rico, and the U.S. Territories and Freely Associated States of the Pacific region, the REL program brings together the expertise of local communities, top-tier education researchers, and education scientists at IES’s National Center for Education Evaluation and Regional Assistance (NCEE) to address the most vexing problems of education policy and practice in states and regions—on demand and free of charge.

This blog was written by Laura Dyer, a member of the NCEE Knowledge Use Dissemination contract team.

Navigating the ESSER Funding Cliff: A Toolkit for Evidence-Based Financial Decisions

As the federal Elementary and Secondary School Emergency Relief (ESSER) funds approach their expiration date in September 2024, schools and districts across the nation are facing a budgeting season like no other. ESSER funding has played a pivotal role in supporting schools in recovery from the COVID-19 pandemic, but with the deadline looming, districts and charters must take stock of their investments and ensure that programs that are making a positive impact for students continue in a post-ESSER world.

A team at the North Carolina Department of Public Instruction (NCDPI) has been examining COVID-19 learning recovery in the state as part of their Using Longitudinal Data to Support State Education Policymaking project, which is part of the IES-funded RESTART network. In addition to the research, members of the NCDPI team developed a toolkit to help local leaders decide which programs to continue or discontinue as the federal funds that have supported post-pandemic learning recovery expire. Through their work in the Office of Learning and Research (OLR) and the Division of Innovation at NCDPI, Rachel Wright-Junio, Jeni Corn, and Andrew Smith are responsible for managing statewide programs, conducting research on innovative teaching practices, and sharing insights using modern data analysis and visualization techniques. In this guest blog, they describe the need for the toolkit, how they developed it, how it is being used, and next steps.

The ESSER Funding Cliff Toolkit: A Data-Driven Approach

To help district and school leaders navigate the financial uncertainties following the end of ESSER funding, the OLR team created a Funding Cliff Toolkit as a starting point for data-driven decision-making grounded in unique local contexts. The toolkit provides a comprehensive set of resources, including a Return on Investment (ROI) Framework and Calculator that combines detailed data on ESSER expenditures with the impacts of various investments on student outcomes. By using the toolkit, schools and districts can assess what worked during the ESSER funding period, identify areas for improvement, and create sustainable financial plans that keep effective programs running regardless of funding source.
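
The Return on Investment Framework and Calculator is NCDPI's own resource; the sketch below is only a generic illustration of the kind of per-student cost versus outcome comparison such a calculator supports. The program names, costs, and effect sizes are made up for the example.

```python
# Generic sketch of a per-student cost vs. outcome comparison of the kind a
# return-on-investment calculator can support. All names and numbers below
# are hypothetical and are not drawn from the NCDPI toolkit.
from dataclasses import dataclass

@dataclass
class Program:
    name: str
    total_cost: float      # total ESSER-funded spending on the program
    students_served: int
    outcome_gain: float    # estimated gain per student (e.g., test-score SDs)

    @property
    def cost_per_student(self) -> float:
        return self.total_cost / self.students_served

    @property
    def gain_per_dollar(self) -> float:
        """Outcome gain per dollar of per-student spending."""
        return self.outcome_gain / self.cost_per_student

programs = [
    Program("High-dosage tutoring", 750_000, 500, 0.20),
    Program("Summer learning camp", 400_000, 800, 0.05),
]

# Rank programs by how much outcome improvement each per-student dollar buys.
for p in sorted(programs, key=lambda x: x.gain_per_dollar, reverse=True):
    print(f"{p.name}: ${p.cost_per_student:,.0f} per student, "
          f"{p.gain_per_dollar:.6f} SD per dollar")
```

A leader could extend the same comparison with sustainability questions, for example whether a program's per-student cost can be absorbed by Title I or state funds once ESSER dollars are gone.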

Knowing the far-reaching implications of this type of tool, the OLR team worked with federal programs and finance leaders across NCDPI. They also consulted leaders, including superintendents and chief financial officers of North Carolina school districts and charter schools, during the design process to ensure that the tool met their immediate needs. Finally, Drs. Brooks Bowden, associate professor at the Graduate School of Education at the University of Pennsylvania, and Nora Gordon, professor at the Georgetown University McCourt School of Public Policy, served as collaborators on the design of the ROI toolkit to ensure its validity.

Rolling Out the Toolkit: Engaging Leaders Across the State

In rolling out the toolkit, the OLR team intentionally invited diverse stakeholders to the table, including district staff from finance, federal programs, academics, and cabinet-level leadership. It was crucial to bring together the financial, compliance, and programmatic pieces of the "ESSER puzzle" so these stakeholders could work collaboratively to take stock of their ESSER-funded investments and explore academic progress post-pandemic.

To ensure that the ESSER Funding Cliff Toolkit reached as many district leaders as possible, the OLR team organized a comprehensive rollout plan, which began with a series of introductory webinars that provided an overview of the toolkit and its components. These webinars were followed by nine in-person sessions held across the eight State Board of Education regions in North Carolina and attended by more than 400 leaders. Building on the initial learning from the informational webinars, the in-person sessions featured interactive presentations that allowed district teams to practice using the tool with simulated data as well as their own. By the end of each session, participants left with new, personalized data sets and tools to tackle the impending ESSER funding cliff. After each session, the team collected feedback that improved the toolkit and subsequent learning sessions. This process laid the groundwork for continued support and collaboration among district and school leaders.

What's Next: Expanding the Toolkit's Reach

Next steps for the OLR Team include expanding the use of the toolkit and working with district and charter schools to apply the ROI framework to help districts make evidence-based financial decisions across all funding sources. Districts are already using the toolkit beyond ESSER-funded programs. One district shared how they applied the ROI framework to their afterschool tutoring programs. Other districts have shared how they plan to use the ROI framework and funding cliff toolkit to guide conversations with principals who receive Title I funds in their schools to determine potential tradeoffs in the upcoming budget year.

As North Carolina schools inch closer to the end of ESSER, the goal is to continue to encourage districts and charters to incorporate evidence-based decision-making into their budgeting and program planning processes. This ensures that districts and schools are prioritizing those programs and initiatives that deliver the most significant impact for students.

In addition to expanding support to North Carolina districts and schools, we hope that this approach can be replicated by other state education agencies (SEAs) across the nation. We are honored to have our toolkit featured in the National Comprehensive Center’s upcoming Community of Practice (CoP), Strategic Planning for Continued Recovery (SPCR), and believe that cross-SEA collaboration in this CoP will improve the usefulness of the toolkit.


Rachel Wright-Junio is the director of the Office of Learning and Research (OLR) at the North Carolina Department of Public Instruction (NCDPI); Jeni Corn is the director of research and evaluation in OLR; and Andrew Smith is the deputy state superintendent in the NCDPI Division of Innovation.

Contact Rachel Wright-Junio at Rachel.WrightJunio@dpi.nc.gov for the webinar recording or copies of the slide deck from the in-person sessions.

This guest blog was produced by Corinne Alfeld (Corinne.Alfeld@ed.gov), NCER Program Officer.

Introducing NCER’s Federation of American Scientists Fellows

We are excited to have Katherine McEldoon and Alexandra Resch, two Federation of American Scientists (FAS) Impact Fellows, who joined the center in December 2023 to support the Accelerate, Transform, Scale (ATS) Initiative. The ATS Initiative supports advanced education research and development (R&D) to create scalable solutions to improve education outcomes for all learners and eliminate persistent achievement and attainment gaps.

Both of our FAS Fellows have experiences that reinforce the need to start with the science and to use the right methods at the right time to build solutions. They’ve observed that while researchers are great at producing insights about education and learning, and developers are great at building education solutions and technologies, the broader field isn’t yet great at doing the two together. Through their careers, they’ve come to see rigorous research and development happening together as the path forward for building effective, evidence-based solutions.

In this blog, Alex and Katherine share their career paths and explain how their unique experiences and perspectives position them to help grow the ATS Initiative.

Alexandra Resch

I’ve always been driven by an urge to try to improve our education systems. I often felt bored in school and could see huge gaps in resources and opportunities among classrooms in my school and between my district and others nearby. I studied economics because the quantitative and analytic tools came naturally to me and because I could see the importance that incentives and resource constraints play in understanding how our systems work and how to improve them. I love the lens that economics provides to make sense of the world.

When I finished my PhD in 2008, I got my dream job as a researcher at Mathematica. Among other things, I worked on the What Works Clearinghouse, interesting methods papers, and national studies. I enjoyed these projects, but I started to worry that while I was doing great research, it wasn’t answering the questions practitioners had and wasn’t timely enough to inform their decisions. I gradually started shifting my work to be closer to decisions and decision makers, eventually building out a portfolio of work on rapid cycle evaluation and ways to be opportunistic about generating strong evidence. I also started thinking about how we talk about evidence and whether we’re framing questions and findings to privilege the status quo. I’ve come to believe the questions we ask, the methods we use, and how we describe our results all need to be different if we want to affect how the education system works and make a difference for student learning.

Over the last decade, I’ve developed expertise in R&D, learning about and applying tools and processes for human centered design, continuous improvement, product development, and product management. I haven’t put aside the tools I had from economics, but I have a bigger toolbox and am better able to use the right tool at the right time. I’ve seen progress in recent years in bringing more rigor to product development and more speed and agility to education research. I’m excited to support the work that the ATS initiative is doing to bring researchers and developers closer together into productive partnerships in the service of solving genuine problems for educators and students. 

Katherine McEldoon

Early in my career, I set connecting scientific insights with education practice as my north star, and I haven’t looked back since. I was intrigued by what the cognitive sciences could unlock: clear explanatory mechanisms of certain behaviors and beliefs—empirically validated, no less! There were so many insights ripe for the classroom, but why weren’t they being used?

Through my doctoral work at Vanderbilt University and the IES-funded Experimental Education Research Training (ExpERT) program, I grounded myself in cognitive theories of learning and designed instruction using those insights while measuring impact. This cross-training equipped me with the skillset I’d need to conduct a range of efficacy studies and honed my ability to speak multiple academic dialects—a skill that became more important as I grew in my career.

Next, I set my sights on scale-up: first at Arizona State University, where we incorporated a theory of active learning into teacher practice; then by running a state-level evaluation study for an EdTech start-up company; and finally by supporting a networked improvement community with the Tennessee Department of Education. I learned firsthand how many layers we had to work through to bring the “active ingredients” into the learner experience. I also developed an appreciation for the multifaceted collaborations it takes to bring these efforts together.

In 2019, I joined Pearson’s Efficacy and Learning division, where we collaborated with product development teams, providing research-based insights to inform learning design and outcome measurement. We started with insights from the learning sciences and conducted iterative R&D with end-users from ideation, to prototypes and designs, to mature product evaluations. The research perspective kept our eye on conducting development work in a careful, measured, and learning outcomes-focused way. The development perspective kept us centered on researching applied and immediate problems and keeping practical significance at the fore. When done well, the balance of research and development hummed into harmony, and resulted in effective, enjoyable experiences that really worked.

Through my career, I’ve learned that instead of asking how we connect research to practice, the better question is how we intertwine the research and development process. Not only should we start with research-based insights, but we should also integrate research methods and development processes to build a high-quality, useful solution from the start. That’s precisely what we’re working to achieve with the ATS Initiative.


This blog was written by Alex Resch and Katherine McEldoon, Accelerate, Transform, Scale Initiative, NCER.

Evidence on CTE: A Convening of Consequence

In 2018, NCER funded a research network to build the evidence base for career and technical education (CTE). As with other research networks, the CTE Research Network comprises a set of research teams and a lead team, which is responsible for collaboration and dissemination among the research teams. On March 20, 2024, the Network held a final convening to present its findings to the field. In this blog, Network Lead Member Tara Smith, Network Director Kathy Hughes, and NCER Program Officer Corinne Alfeld reflect on the success of the final convening and share thoughts about future directions.

Insights From the Convening

An audience of CTE Research Network members, joined by educators, administrators, policymakers and other CTE stakeholders, gathered for a one-day convening to hear about the Network’s findings. Several aspects of the meeting contributed to its significance.

  • The presentations highlighted an important focus of the Network – making research accessible to and usable by practitioners. The agenda included presentations from four Network member researchers and their district or state partners from New York City and North Carolina. Each presentation highlighted positive impacts of CTE participation, but more importantly, the presentations demonstrated the value of translating research findings into action. Translation involves collaboration between researchers and education agency staff to develop joint research questions and to discuss how findings can be used to improve programs to better serve students or to take an innovative practice and scale it to other pathways and schools.
  • Brand-new and much-anticipated research was released at the convening. The Network lead announced a systematic review of all of the rigorous causal research on secondary-level CTE from the last 20 years. This is an exciting advance for building the evidence base for CTE, which was the purpose of the Network. The meta-analysis found that CTE has statistically significant positive impacts on several high school outcomes, such as academic achievement, high school completion, employability skills, and college readiness. The review found no statistically significant negative impacts of CTE participation. The evidence points to the power of CTE to transform lives, although more research is needed. To guide future research, the review provided a "gap analysis" of where causal research is lacking, such as the impact of high school CTE participation on academic achievement in college or attainment of a postsecondary degree.
  • National CTE leaders and experts put the research findings into a policy context and broadcast their importance. These speakers commented on the value of research for CTE advocacy on Capitol Hill and in states, and for informing decisions about how to target resources. Luke Rhine, deputy assistant secretary of the Office of Career, Technical, and Adult Education (OCTAE), said, “The best policy is informed by practice [...] and the best practice is informed by research.” Kate Kreamer, executive director of Advance CTE, emphasized the importance of research in dispelling myths, saying that “if the data are not there, that allows people to fill the gaps with their assumptions.” However, she noted, as research increasingly shows the effectiveness of CTE, we must also guard against CTE programs becoming selective and thus limiting equitable access.

New Directions

In addition to filling the critical gaps identified by the Network lead’s review, other future research questions suggested by researchers, practitioners, and policymakers at the convening include:

  • How can we factor in the varied contexts of CTE programs and the wide range of experiences of CTE students to understand which components of CTE really matter? What does it look like when those components are done well? What does it take to do them well? Where is that happening?
  • How can we learn more about why students decide to participate in CTE generally and in their chosen pathway? What are the key components of useful supports that schools can provide to help them make these decisions?
  • How do we engage employers more deeply and actively in CTE programs and implement high quality work-based learning to ensure that students are acquiring skills and credentials that are valued in the labor market?
  • What are evidence-based practices for supporting special student populations, such as students with disabilities or English language learners?
  • How can we harness state longitudinal data systems that link education and employment data to examine the long-term labor market outcomes of individuals from various backgrounds who participated in different career clusters or who had access to multiple CTE experiences?

While IES alone will not be able to fund all the needed research, state agencies, school districts, and even individual CTE programs can partner with researchers to study what works in their context and identify where more innovation and investment is needed. The work of the CTE Research Network has provided a good evidence base with which to start, and a good model for additional research that improves practice and policy. Fortunately, the CTE research field will continue to grow via the support of a new CTE Research Network – stay tuned for more information!


This blog was co-written by CTE Network Lead Member Tara Smith of Jobs for the Future, CTE Network Director Kathy Hughes of AIR, and NCER Program Officer Corinne Alfeld.

Questions can be addressed to Corinne.Alfeld@ed.gov.