by Nate Talley, Budget, Revenue, and Policy Analyst
Imagine that someone you know is unemployed. Then suppose this individual enrolls in the Department of Workforce Services’ labor exchange system, becomes connected with a hiring employer, and subsequently secures employment. Now answer the question, “Was the DWS labor exchange system effective?” The intuitive and seemingly obvious answer is “yes” because in this hypothetical scenario, the person you imagined received job-matching services and found work. However, because public resources are required to operate the labor exchange system, what policy-makers, administrators, and stakeholders really need to know is whether the program was effective in relative terms. That is, would the person you imagined have found an equally well-paying and well-fitting job had they not used the labor exchange system? How would you know?
The thought experiment above illustrates how challenging it can be to evaluate the efficacy of some government program operations, particularly those found in social service environments. Fortunately, the Governor’s Office of Management and Budget is committed to using every tool in the project, budget, and policy analysis toolkit to ensure that sound policies are adopted and the return on taxpayer investment is maximized. Beyond the use of business-process and operational improvement modalities such as QT/OE, Critical Chain Project Management, Aggregate Buffering, and Lean Project Management (to name a few), GOMB is heavily engaged in efforts to promote the use of evidence-based practices across state agencies. We have more to do, but we are making progress.
While an evidence-based approach for shaping program selection and implementation can take a number of forms (as summarized by Dr. Eva Witesman’s work “Evidence-based Innovation: Creating a Statewide Initiative”), perhaps the two most basic avenues for evidence-based adherence are as follows:
- Look externally. To look externally means to gather and review information from sources outside of in-house program operations, with the expectation that such information can help clarify how a program should operate to be most effective. Examples include conducting a meta-analysis of published literature on a program, inquiring with associations or consortiums of states conducting the same or similar programs, extrapolating trends from public-use data, or deriving relevant information from academic research on different but related programs. Creating and studying a robust set of external evidence that relates to a program increases the probability that the program will be effective when implemented locally.
- Look internally. To look internally means to use self-generated information that specifically relates to the program of interest. Examples include conducting pilot programs that are narrow in scope or scale, mining administrative data, or conducting program impact assessments using experimental or quasi-experimental methods. Looking internally usually requires an investment of additional resources, since labor and expertise are needed to execute these forms of program evaluation. However, the return on that investment can be quite high, as internal program evaluations lead to more effective and efficient programs in the long run.
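To make the internal-evaluation idea above concrete, the sketch below compares employment outcomes between a hypothetical group of enrolled participants and a hypothetical comparison group using a standard two-proportion z-test. All counts and the function name are invented for illustration; a real impact assessment would also need a credible experimental or quasi-experimental design (for example, random assignment or a matched comparison group) to rule out selection effects.

```python
import math

def two_proportion_z_test(success_a, n_a, success_b, n_b):
    """Two-sided z-test for the difference between two proportions.

    success_a / n_a: outcomes and sample size for the treatment group
    success_b / n_b: outcomes and sample size for the comparison group
    Returns (difference in rates, z statistic, two-sided p-value).
    """
    p_a = success_a / n_a
    p_b = success_b / n_b
    # Pooled rate under the null hypothesis of no difference
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal distribution
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return p_a - p_b, z, p_value

# Hypothetical data: 640 of 1,000 enrolled job seekers found work,
# versus 550 of 1,000 in a comparison group.
diff, z, p = two_proportion_z_test(640, 1000, 550, 1000)
```

With these made-up numbers, the 9-percentage-point difference is statistically significant, which is the form of evidence an internal evaluation of this kind would report.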
By using an evidence-based approach for designing, implementing, and evaluating state government program interventions, we can eliminate some of the uncertainty around whether the public funds used for state programs would be better spent elsewhere. We can ensure that these programs are positively impacting the citizens of Utah.
As for the DWS labor exchange system, an internal evaluation found that participants who enrolled in job-matching services with DWS were significantly more likely to secure employment. In other words, the person you imagined was better off by electing to receive DWS services.