Public administrators have always been interested in identifying cost-effective strategies for managing their programs. As government agencies invest in data warehouses and business intelligence capabilities, it becomes feasible to employ analytic techniques used more commonly in the private sector. Predictive analytics and rapid-cycle evaluation are analytical approaches that do more than describe the current status of programs: in both the public and private sectors, these approaches provide decision makers with guidance on what to do next.
Predictive analytics refers to a broad range of methods used to anticipate an outcome. For many types of government programs, predictive analytics can be used to anticipate how individuals will respond to interventions, including new services, targeted prompts to participants, and even automated actions by transactional systems. With information from predictive analytics, administrators can identify who is likely to benefit from an intervention and find ways to formulate better interventions. Predictive analytics can also be embedded in agency operational systems to guide real-time decision making. For instance, predictive analytics could be embedded in intake and eligibility determination systems, prompting frontline workers to review suspect client applications more closely to determine whether income or assets may be understated or deductions underclaimed.
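To make the intake example concrete, the following is a minimal sketch of how a predictive model might score applications for closer review. Everything here is a hypothetical illustration: the features (count of prior discrepancies, self-employment status), the synthetic data-generating process, and the review threshold are all assumptions, not drawn from any actual program. The model is a hand-rolled logistic regression using only the Python standard library.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Synthetic training data: each record is (features, understated_income_flag).
# Features (hypothetical): intercept, prior-discrepancy count, self-employment.
def make_record():
    prior = random.randint(0, 3)
    self_emp = random.randint(0, 1)
    # In this toy data-generating process, true risk rises with both features.
    p = sigmoid(-2.0 + 0.8 * prior + 1.0 * self_emp)
    return ([1.0, prior, self_emp], 1 if random.random() < p else 0)

train = [make_record() for _ in range(2000)]

# Fit weights by batch gradient descent on the logistic log-loss.
w = [0.0, 0.0, 0.0]
for _ in range(300):
    grad = [0.0, 0.0, 0.0]
    for x, y in train:
        err = sigmoid(sum(wi * xi for wi, xi in zip(w, x))) - y
        for j in range(3):
            grad[j] += err * x[j]
    for j in range(3):
        w[j] -= 0.5 * grad[j] / len(train)

def risk_score(prior_discrepancies, self_employed):
    """Predicted probability that income is understated (illustrative only)."""
    x = [1.0, prior_discrepancies, self_employed]
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)))

# An intake system might route high-scoring applications to manual review.
needs_review = risk_score(3, 1) > 0.5
```

In an operational deployment, the scoring step would run inside the eligibility system as each application arrives, with the threshold chosen to balance caseworker workload against the cost of missed errors.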
Rapid-cycle evaluation, another decision-support approach, uses evaluation research methods to quickly determine whether an intervention is effective, and enables program administrators to continuously improve their programs by experimenting with different interventions. Like predictive analytics, rapid-cycle evaluation leverages the data available in administrative records. It can be used to assess large program changes, such as providing clients with a new set of services, as well as small program changes, such as rewording letters that encourage clients to take some action. This type of formative evaluation can be contrasted with the summative program evaluations familiar to many in the policy community. Summative program evaluations often assess whether a program has an impact by comparing program participants with nonparticipants. Rapid-cycle evaluation uses similar techniques, but does not examine the overall impact of the program. Instead, it assesses the impacts of changes to the program by comparing some program participants (with the change) to other program participants (without the change). For example, rapid-cycle evaluation can determine whether text message prompts from an employment training program encourage more clients to successfully complete program activities. In this way, rapid-cycle evaluation can identify incremental changes that make the program more effective for its clients, increasing the likelihood that a subsequent summative evaluation would identify large impacts relative to individuals not in the program. We believe that these techniques can be used to help government programs—including social service programs serving low-income individuals—to improve program services while efficiently allocating limited resources.
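The text-message example can be sketched as a simple randomized comparison between participants who receive prompts and participants who do not. The sketch below is illustrative: the group sizes and completion rates are invented assumptions, and the simulation stands in for real administrative records. It tests the difference in completion rates with a standard two-proportion z-test, implemented with only the Python standard library.

```python
import math
import random

random.seed(1)

def simulate_group(n, completion_rate):
    """Stand-in for administrative data: count of n participants who completed."""
    return sum(1 for _ in range(n) if random.random() < completion_rate)

# Hypothetical randomized groups: control (no prompts) vs. treatment (prompts).
# The 50% and 56% completion rates are assumptions for illustration.
n = 1000
completions_control = simulate_group(n, 0.50)
completions_treatment = simulate_group(n, 0.56)

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided z-test for a difference between two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # Two-sided p-value from the standard normal CDF (via math.erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

z, p = two_proportion_z(completions_control, n, completions_treatment, n)
lift = completions_treatment / n - completions_control / n
print(f"estimated lift: {lift:+.3f}, p = {p:.4f}")
```

In practice the cycle repeats: if the prompts show a meaningful lift, they become the new default and the next candidate change is tested against them, so each round of the experiment builds on the last.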
We believe that the use of predictive analytics and rapid-cycle evaluation—both individually and together—holds significant promise to improve programs in an increasingly fast-paced policy and political environment.
We propose that social service agencies take two actions. First, agency departments with planning and oversight responsibilities should encourage the staff of individual programs to conduct a thorough needs assessment. This assessment should identify where predictive analytics and rapid-cycle evaluation can be used to improve service delivery and program management. The assessment should also evaluate whether the benefits of adopting these tools outweigh the costs, resulting in a recommendation of whether and how these tools should be deployed. Second, federal agencies should take broad steps to promote the use of predictive analytics and rapid-cycle evaluation across multiple programs. These steps include investments in data quality and data linkage, as well as measures to support and promote innovation among agency staff.