This guide shares five strategies federal agency staff can use to help build grantees' data and evaluation capacity, offering concrete steps for each strategy:
1. Design the review process to prioritize applications that propose evidence-based practices.
OMB's implementing memo on its 2024 Uniform Grants Guidance (M-24-11, Reducing Burden in the Administration of Federal Financial Assistance) specifically instructs agencies to reassess merit review processes as they make changes to NOFOs and to “consider prioritizing Federal awards to applicants that propose evidence-based practices.” Results for America has identified 98 federal grant programs at 11 federal agencies that invest in what works by defining and prioritizing evidence of effectiveness. Moreover, in many cases, capacity for collecting and using data plays a key role in successfully implementing evidence-based practices and achieving key outcomes.
● Include the use of evidence-based practices in merit review criteria (see the scoring sketch after this list).
○ The U.S. Department of Education guidance uses an evidence framework that defines levels of evidence for evidence-based interventions: strong evidence, moderate evidence, promising evidence, and evidence that demonstrates a rationale. These definitions help competitive grant programs distinguish and prioritize different evidence levels among applicants. U.S. Department of Education regulations further provide general selection criteria to apply these evidence definitions in NOFOs, specifically “the extent to which the proposed project is supported by promising evidence (as defined in 34 CFR 77.1(c)).”
○ The Substance Abuse and Mental Health Services Administration’s (SAMHSA) Mental Health Block Grant (MHBG) includes a 10 percent set-aside of allocated funding for evidence-based interventions to address the needs of individuals with early serious mental illness.
● Include a commitment to invest in and use data infrastructure, analysis, and/or evaluation in merit review criteria.
○ The U.S. Department of Labor's (DOL) Strengthening Community Colleges Training Grants competitions require grantees to conduct an evaluation of their projects using grant funds. In some years, DOL offered additional, competitive funding for rigorous project evaluations.
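To make this concrete, the following is a minimal sketch of how a merit review rubric might encode these ideas. The point values, tier names, and 100-point scale are hypothetical assumptions for illustration, not any agency's actual rubric; the evidence tiers mirror the Department of Education definitions cited above.

```python
# Hypothetical merit review rubric sketch. Point values, tier names, and the
# 100-point scale are illustrative assumptions, not any agency's actual rubric.

# Points awarded by evidence tier, strongest to weakest (tiers mirror the
# ED framework: strong, moderate, promising, demonstrates a rationale).
EVIDENCE_TIER_POINTS = {
    "strong": 20,
    "moderate": 15,
    "promising": 10,
    "demonstrates_rationale": 5,
    "none": 0,
}

DATA_EVALUATION_PLAN_POINTS = 10  # commitment to data infrastructure/evaluation


def score_application(evidence_tier: str, has_data_evaluation_plan: bool,
                      other_criteria_points: int) -> int:
    """Combine the evidence tier, a data/evaluation commitment, and all other
    criteria into a single merit review score, capped at 100 points."""
    score = EVIDENCE_TIER_POINTS.get(evidence_tier, 0)
    if has_data_evaluation_plan:
        score += DATA_EVALUATION_PLAN_POINTS
    return min(score + other_criteria_points, 100)


# Example: an applicant proposing a moderately evidenced intervention with an
# evaluation plan, scoring 60 on the remaining criteria, earns 85 of 100 points.
print(score_application("moderate", True, 60))  # -> 85
```

Giving evidence its own weighted line item in the rubric, as in this sketch, is what turns the evidence definitions into a signal applicants can plan around.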
2. Articulate clear support for data and evaluation in notices of funding opportunities (NOFOs).
Using the OMB Uniform Grants Guidance NOFO template in Appendix I, look for ways to highlight opportunities to invest in integrated data systems, data infrastructure, data analysis, evidence-based practices, evaluation activities, and related costs, for example in the executive summary, program description, and application review criteria.
The U.S. Department of Health and Human Services’ Maternal, Infant, and Early Childhood Home Visiting (MIECHV) Program FY2024 NOFO states that at least 75 percent of service delivery funding must be allocated to one or more of the home visiting models that meet the evidence criteria developed by ACF’s Home Visiting Evidence of Effectiveness (HomVEE) review. In addition, awardees must report on how their program performs across six benchmark areas, and they must show improvement in at least four of the six.
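As a minimal sketch of how those two conditions could be checked, assuming hypothetical field names and data shapes (only the 75 percent threshold and the four-of-six benchmark rule come from the NOFO as described above):

```python
# Hypothetical compliance check for the two MIECHV conditions described above.
# Field names and data shapes are assumptions for illustration only.

def meets_miechv_conditions(evidence_based_funding: float,
                            total_service_delivery_funding: float,
                            benchmark_improvements: list[bool]) -> bool:
    """Return True if at least 75% of service delivery funding goes to models
    meeting the HomVEE evidence criteria and the program improved in at least
    four of the six benchmark areas."""
    funding_share = evidence_based_funding / total_service_delivery_funding
    improved_areas = sum(benchmark_improvements)  # each True counts as 1
    return funding_share >= 0.75 and improved_areas >= 4


# Example: 80% of funding to evidence-based models, improvement in 4 of 6 areas.
print(meets_miechv_conditions(800_000, 1_000_000,
                              [True, True, False, True, False, True]))  # -> True
```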
● Executive Summary. This plain-language description of the program summarizes the goals and objectives, target audience, and eligible recipients in 500 words or fewer. Use the executive summary to acknowledge that integrated data systems, data infrastructure, data analysis, evidence-based practices, evaluation activities, and/or related costs will play important roles in meeting program goals and objectives, and clarify up front that grant funds can support these functions. See Appendix I, Section (b)(1)(i)(H).
● Program Description. This section contains the full program description of the funding opportunity, including several subsections where data and evaluation can play a key role. The following required program description items may be appropriate places to address data and/or evaluation. See Appendix I, Section (b)(3)(i), paragraphs (B) through (G).
○ Agency funding priorities or focus areas. Would evidence-based practices, data infrastructure (including integrated data systems), analysis, and/or evaluation be an effective emphasis?
○ Program goals and objectives. Is supporting grantee data infrastructure, analysis, or evaluation among the goals and objectives?
○ Description of how the award will contribute to achieving the program’s goals. How can evidence-based practices, data infrastructure, analysis, and/or evaluation play a key enabling role in program goals, including priorities identified in consultation with impacted communities during program design?
○ Expected performance goals, indicators, targets, baseline data, data collection, and other outcomes the federal agency expects recipients to achieve. A grantee's ability to collect, store, and use data effectively is critical to tracking and understanding meaningful outcomes. How does the agency expect grantee data infrastructure (including integrated data systems), analysis, or evaluation to play a role in performance, outcomes, and data-related activities? How can funds be used in this way?
○ Unallowable costs. Should applicants know of any costs related to data infrastructure, analysis, and/or evaluation that are specifically unallowable as they propose their project?
In addition, the following optional program description items may be appropriate places to address data infrastructure, analysis, and/or evaluation. See Appendix I, Section (b)(3)(ii), paragraphs (A) and (B).
● Program history. Have evidence-based practices, data infrastructure, analysis, and/or evaluation played a meaningful role in the development, evolution, or priorities of the program?
● Past project examples. How have evidence-based practices, data infrastructure, analysis, and/or evaluation played an important role in the success of past projects, or how have grantees used program funds to support these functions and activities in key ways?
● Application Review Criteria. This section addresses how the federal agency will conduct the review process for competitive awards. Because review criteria help determine which applications are funded, they send one of the strongest signals about what the program and agency value. See Appendix I, Section (b)(6)(ii). Based on the priorities, goals, and objectives of the program, should investments in evidence-based practices and in supporting or using data infrastructure, analysis, and/or evaluation be included in the criteria that determine awards?
The AmeriCorps State and National Competitive Grants NOFO assigns 20 of 100 points in its selection criteria for the strength of the evidence base supporting an applicant’s proposed program design.
The System of Care Program for children with serious emotional disturbances awards 25 of 100 points in its selection criteria based on the evidence for the proposed approach and the associated plan for monitoring and fidelity.
The Safe Streets and Roads for All (SS4A) grant program under the U.S. Department of Transportation funds “demonstration activities” to support grantees in testing and evaluating strategies to improve road safety. Data collection and evaluation activities are among the application review criteria.
3. Offer technical assistance — both pre-award and post-award — that highlights evidence-based practices, data infrastructure, analysis, and/or evaluation.
The Temporary Assistance for Needy Families (TANF) Data Collaborative Equity Analysis Awards from the U.S. Department of Health and Human Services, Administration for Children and Families support state, territory, or county TANF agencies in conducting equity-focused analyses of their TANF and other human services data. ACF will provide intensive training and technical assistance to build the capacity of grantee agencies.
The 2019 WIC Special Project Innovation grant from the U.S. Department of Agriculture’s Food and Nutrition Service offered pre- and post-award technical assistance. Post-award technical assistance included support for grantees to develop evaluation plans, create data collection procedures and timelines, and access analytic support.
● Describe how data and evaluation capacity support program goals, including priorities identified in consultations with impacted communities during program design.
● Explain how grantees can/must use funds to support evidence-based practices.
● Explain whether and how grantees can use funds — including braiding funds across federal agency programs — to support integrated data and evaluation capacity.
● Give examples of how current or past grantees have used grant dollars to support integrated data systems, evaluation and customer service.
● Offer examples — whether actual or aspirational — of when and how your agency might waive performance reporting requirements made unnecessary by innovative approaches, higher-quality outcomes data, or other circumstances.
4. Develop reporting requirements that build on evidence-based practices, data infrastructure, analysis, and/or evaluation activities.
OMB mandates reporting requirements for the effective monitoring or evaluation of a grant and authorizes waivers of reporting requirements that are not necessary. See § 200.329(b) and § 200.329(g).
● Focus reporting requirements on information that, in addition to supporting federal oversight, will also add value to the practical implementation of grantees' projects.
● Allow applicants to propose reporting and accountability elements that best leverage their data infrastructure, other critical analysis, and/or evaluation of their project (see the sketch after this list).
● Work with grantees to identify reporting requirements that could be waived in favor of more insightful information that their data and analytic capacity or evaluation activities could provide with less burden.
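As a sketch of what grantee-proposed reporting elements might look like in structured form (the field names, indicator shape, and waiver logic below are hypothetical assumptions, not any agency's actual schema):

```python
# Hypothetical structure for a grantee-proposed reporting plan. Field names,
# the waiver rule, and the indicator shape are illustrative assumptions.
from dataclasses import dataclass, field


@dataclass
class ProposedIndicator:
    name: str                      # e.g., "job placements within 90 days"
    data_source: str               # grantee system that produces the measure
    replaces_standard_report: str  # standard requirement this could waive, if any
    collection_frequency: str      # e.g., "quarterly"


@dataclass
class ReportingPlan:
    grantee: str
    indicators: list[ProposedIndicator] = field(default_factory=list)

    def waiver_candidates(self) -> list[str]:
        """Standard reporting requirements the grantee's own data could replace,
        for agency review under the waiver authority in § 200.329(g)."""
        return [i.replaces_standard_report for i in self.indicators
                if i.replaces_standard_report]


plan = ReportingPlan("Example College", [
    ProposedIndicator("credential completions", "student information system",
                      "annual enrollment narrative", "quarterly"),
])
print(plan.waiver_candidates())  # -> ['annual enrollment narrative']
```

A structure like this lets the agency review each proposed indicator against the standard requirement it would replace, rather than negotiating waivers ad hoc.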
5. Coordinate across federal agencies during program planning and design.
OMB encourages agencies to work together on program design, particularly where the goals and objectives of programs or projects align. See § 200.202(b).
● Use existing interagency committees, coordinating structures, or initiatives (including input from applicants) to identify partner agencies and programs that serve similar populations and/or have aligned goals and objectives that would be best served by grantees having integrated data systems.
● Work with partner agencies to identify complementary data, evidence, and evaluation needs or requirements — or work actively to align needs and requirements — to help grantees focus on collecting and using data for greatest impact across programs.
● Align NOFO priorities, language, timing, and technical assistance across programs to help applicants identify and braid funding that can play a complementary role in supporting data infrastructure, evaluation, and analytical capacity.