Evaluate the Program

During the early planning stages of your program, before you start serving older adults, it is imperative to develop an evaluation plan that reflects your program’s vision and mission. By collecting data systematically from the beginning of your program and continuously throughout implementation, you can both evaluate its impact (e.g., progress toward the Triple Aim of improved care, improved health, and lower per capita costs) and demonstrate to potential funders that the program merits additional public and private investment. In this way, evaluation that begins before program launch can help ensure your program’s long-term sustainability and keep oral health a priority in your community. Systematic evaluation also offers you the opportunity to identify areas for improvement, determine what works and what doesn’t, and initiate a continuous quality improvement plan to strengthen your program.

The four primary activities for evaluating your program are as follows:

  1. Develop an evaluation plan
  2. Collect data for evaluation
  3. Analyze the collected data
  4. Conduct a quality improvement process

Depending on your program’s size, scope, and capacity, you might spend more time on some of these activities than on others as you conduct your evaluation and quality improvement.

Develop an Evaluation Plan

  • Involve key stakeholders in developing the evaluation plan. Stakeholders can help inform and enhance the evaluation, and involving stakeholders can increase their buy-in to the program as a whole.
  • Create evaluation questions that—when answered—demonstrate all of your program’s potential effects. These questions might relate to implementation, effectiveness, efficiency, timeliness, access, safety, and continuity. See the Mobile-Portable Dental Manual for more details on these evaluation topic areas. Consider both process metrics (e.g., number of clients served, number of procedures completed, cost of services provided) and outcome metrics (e.g., number of clients reporting satisfaction or dissatisfaction with services). A minimal sketch after this list shows one way to record these questions and metrics in a structured form.
  • Develop methods for how you’ll answer your evaluation questions and measure the success of your program. Such methods might include written surveys, interviews, focus groups, chart reviews, observations, or tracking forms.
  • Select an appropriate research design to demonstrate reliable, valid findings. Depending on your program’s size and capacity, this approach might include pre- and post-intervention designs, interrupted time series designs, or control group designs, in which people who don’t participate in the program serve as a benchmark, or point of reference, from which to measure program impact.
  • Set up a timeline for your evaluation activities; start them when the program launches, and outline dates for all major milestones (e.g., data collection, reporting). Note that some methods are ongoing whereas others occur periodically (e.g., quarterly, yearly).
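
A structured record of your evaluation questions makes the plan easier to execute and track. The following minimal sketch shows one way to pair each question with its metric, metric type, data collection method, and frequency; the class, field names, and example entries are hypothetical, not drawn from any particular program’s plan.

```python
from dataclasses import dataclass

# Illustrative sketch only: the fields and example entries below are
# hypothetical, not taken from any specific program's evaluation plan.
@dataclass
class EvaluationQuestion:
    question: str     # what you want to learn
    metric: str       # how you'll measure it
    metric_type: str  # "process" or "outcome"
    method: str       # survey, interview, chart review, tracking form, etc.
    frequency: str    # ongoing, quarterly, yearly

plan = [
    EvaluationQuestion(
        question="How many older adults did the program serve?",
        metric="Unique clients served",
        metric_type="process",
        method="tracking form",
        frequency="quarterly",
    ),
    EvaluationQuestion(
        question="Are clients satisfied with the services they receive?",
        metric="Share of surveyed clients reporting satisfaction",
        metric_type="outcome",
        method="written survey",
        frequency="yearly",
    ),
]

for item in plan:
    print(f"[{item.metric_type}] {item.question} -> {item.metric} ({item.frequency})")
```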

Collect Data for Evaluation

  • Gather information on your planned methods and research design, including how other programs have gathered and analyzed data using the same approach.
  • Develop and pilot test the data collection tools for the method or methods you selected in your evaluation plan. Testing allows you to refine and finalize the tools before collecting real data.
  • Develop a plan to collect success stories, which you can use to attract additional sustainability funding (see the Ensure Sustainability section for more information). Real-life anecdotes add a human element to your program, capture the attention of others in your community, and supplement the data that demonstrate program success. The CDC’s success story workbook (PDF) contains tips on collecting information for success stories.
  • Deploy staff to collect data and prepare the data for analysis. Consider reaching out to volunteers, dental students, and other stakeholders for assistance.
    • Consolidate data from multiple sources, standardizing measurements so records can be combined.
    • Compile the data into a single location and organize it, as necessary.
    • Clean the data (i.e., review it and correct or remove inaccurate records) to ensure accuracy, completeness, and standardization. A minimal sketch of these preparation steps follows this list.
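
As a concrete illustration of consolidating, compiling, and cleaning, the pandas sketch below assumes two hypothetical service sites that export client records with different column names, one duplicated record, and one unusable date. All column names and values are invented for illustration.

```python
import pandas as pd

# Hypothetical exports from two service sites; all column names and
# values are illustrative only.
site_a = pd.DataFrame({
    "client_id": [101, 102, 102],  # note the duplicated record
    "visit_date": ["2024-01-05", "2024-01-12", "2024-01-12"],
    "procedures": [2, 1, 1],
})
site_b = pd.DataFrame({
    "ClientID": [201, 202],
    "VisitDate": ["2024-01-15", "not recorded"],  # one unusable date
    "ProcedureCount": [3, 2],
})

# Standardize column names so the two sources can be combined.
site_b = site_b.rename(columns={
    "ClientID": "client_id",
    "VisitDate": "visit_date",
    "ProcedureCount": "procedures",
})

# Compile the data into a single table.
combined = pd.concat([site_a, site_b], ignore_index=True)

# Clean: parse dates (invalid entries become NaT), then drop exact
# duplicates and records without a valid date.
combined["visit_date"] = pd.to_datetime(combined["visit_date"], errors="coerce")
combined = combined.drop_duplicates().dropna(subset=["visit_date"])
print(combined)
```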

Analyze the Collected Data

  • Perform simple counting, graphing, and visual inspection of data over time or across different groups (e.g., comparing mean percentages at the beginning versus the end of the program) to reveal preliminary trends that might suggest program effects. A simple pre/post comparison is sketched after this list.
  • Analyze qualitative interviews, focus groups, or observations to find patterns that can demonstrate program impact. Develop anecdotes and success stories from individual client interviews to strengthen your findings.
  • Conduct more advanced analysis (e.g., subgroup variation) based on your available data and research design. Consider reaching out to volunteers with a statistics background for assistance (one source is Senior Corps volunteers).
  • Document interesting or important findings, such as differences within or among groups or correlations that might shed light on your program’s effectiveness.
  • Interpret the results of your evaluation and determine whether the program had positive effects (i.e., met the objectives), no effect, or a negative effect. Identify unintended consequences of the program that could prompt program modifications or raise additional evaluation questions.
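
To make the pre/post comparison concrete, the sketch below computes means and an independent-samples t-test on hypothetical monthly no-show rates. The data and the choice of test are assumptions for illustration; your own design and sample will dictate the appropriate analysis.

```python
from statistics import mean
from scipy import stats

# Hypothetical monthly no-show rates (percent) before and after program
# launch; the numbers are illustrative only.
pre = [22, 25, 19, 24, 21, 23]
post = [15, 14, 17, 12, 16, 13]

print(f"Mean no-show rate before: {mean(pre):.1f}%")
print(f"Mean no-show rate after:  {mean(post):.1f}%")

# A simple pre/post comparison using an independent-samples t-test.
# (If the same clients were measured twice, a paired test such as
# stats.ttest_rel would be more appropriate.)
t_stat, p_value = stats.ttest_ind(pre, post)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```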

Conduct a Quality Improvement Process

  • Create a formal, written policy for conducting quality improvement. You should conduct quality improvement after structured evaluations and routinely as the program evolves over time. As described in the University of Kansas’s Community Tool Box, consider instituting a formal public reporting process as part of your quality improvement process.
  • Share the results of your evaluation with involved staff, participants, and stakeholder partners, and collaborate to brainstorm ideas for improvement (and for continuing successful activities). Present the evaluation results as a positive step toward success, even if not all of the results are positive at the outset.
  • Use quality improvement tools to encourage discussion and create actionable next steps. Specific tools include flowcharts, cause-and-effect diagrams, and Pareto diagrams (see the National Maternal and Child Oral Health Resource Center’s Safety Net Dental Clinic Manual for other tools, and the sketch after this list for the counting behind a Pareto diagram). Consider all of the potential reasons for failing to achieve desired outcomes when identifying root causes for challenges.
  • Consider pursuing accreditation through the National Committee for Quality Assurance and other such organizations as you conduct quality improvement. Accreditation might help you garner recognition and validate your program’s approach to service delivery as well as help your program improve its quality.
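
To illustrate the counting behind a Pareto diagram, the sketch below ranks a hypothetical set of missed-appointment reasons from most to least frequent and tracks their cumulative share, which helps identify the few causes responsible for most of a problem. The categories and counts are invented for illustration.

```python
from collections import Counter

# Hypothetical reasons recorded for missed appointments; the categories
# and counts are illustrative only.
reasons = (["no transportation"] * 14 + ["forgot appointment"] * 9 +
           ["felt too ill"] * 4 + ["scheduling conflict"] * 2 +
           ["other"] * 1)

counts = Counter(reasons)
total = sum(counts.values())

# Sort causes from most to least frequent and track the cumulative
# share -- the numbers a Pareto diagram plots.
cumulative = 0
for cause, n in counts.most_common():
    cumulative += n
    print(f"{cause:<22} {n:>3}  {100 * cumulative / total:5.1f}% cumulative")
```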

Program Spotlight: The Dentists’ Partnership

The Dentists’ Partnership in Battle Creek, Michigan, collaborates with volunteer dentists to provide free oral health care to low-income, uninsured people. In exchange for the free services, clients must volunteer at local nonprofit organizations, with the number of hours dependent on the types of oral health services required. This case study provides additional information on how to evaluate a program based on the experiences of the Dentists’ Partnership.

Before program launch, staff collected pre-evaluation data to identify unmet needs in the community and agreed on baseline data that would inform the evaluation. The Dentists’ Partnership evaluated its program through pre- and post-implementation data, including the number of clients served, the value of oral health services provided, and no-show rates. The program also estimated its return on investment and collected anecdotal reports about client satisfaction. The evaluation, which is posted on the U.S. Agency for Healthcare Research and Quality’s website, documented positive impacts, including a 70 percent decrease in the number of low-income people presenting to the local emergency department with oral health complaints and a 322 percent return on investment. These robust evaluation efforts have helped sustain the initiative, generate interest among volunteer dentists, and maintain support from program funders.
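
The case study doesn’t publish the underlying cost and value figures, but a return on investment of this kind is commonly computed as net benefit divided by program cost. The sketch below shows that calculation with hypothetical dollar amounts chosen only so the result matches the reported 322 percent.

```python
# Generic return-on-investment calculation. The dollar figures below are
# hypothetical and are NOT taken from the Dentists' Partnership
# evaluation; they were chosen only to reproduce the reported figure.
program_cost = 50_000      # total program expenses
value_delivered = 211_000  # estimated value of services provided

roi_percent = 100 * (value_delivered - program_cost) / program_cost
print(f"Return on investment: {roi_percent:.0f}%")  # -> 322%
```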

Key Resources

The resources listed below provide additional guidance and support for evaluating your program.

  1. University of Kansas’s Community Tool Box, Chapters 36–41 – This toolbox provides comprehensive information for anyone interested in developing healthier communities. Chapters 36 through 41 focus on evaluating community programs and initiatives, maintaining quality performance, and rewarding accomplishments.
  2. National Maternal and Child Oral Health Resource Center’s Safety Net Dental Clinic Manual, Chapter 5: Quality Assurance and Quality Improvement, and Chapter 6: Program Sustainability – These two chapters provide detailed guidance on quality improvement and program sustainability, focusing specifically on data collection, data analysis, and measurement of program effectiveness.
  3. National Maternal and Child Oral Health Resource Center’s Mobile-Portable Dental Manual, Chapter 5: Quality Assurance and Quality Improvement – This manual’s fifth chapter focuses on measuring effectiveness and outcomes. It summarizes evaluation basics, explains how to create an evaluation plan, and offers guidance for using evaluation information.
  4. Rural Health Information Hub’s Oral Health Toolkit, Module 5: Evaluating Rural Oral Health Programs – This toolkit’s fifth module covers evaluation of rural oral health programs, providing high-level information on the framework for evaluation, methods and considerations, and metrics commonly used.
  5. Centers for Disease Control and Prevention’s Developing an Effective Evaluation Plan (PDF) – This resource aims to help public health program managers, administrators, and evaluators develop a comprehensive and effective evaluation plan. It covers six steps for developing an evaluation plan and contains exercises, worksheets, and tools to assist along the way.
  6. University of Wisconsin Program Development and Evaluation Unit: Planning a Program Evaluation Worksheet (PDF) – This worksheet outlines five steps in program evaluation and poses questions your organization should ask when conducting an evaluation.
  7. Centers for Disease Control and Prevention, Program Performance and Evaluation Office’s Program Evaluation Steps – The CDC outlines six connected steps that can serve as a starting point for tailoring an evaluation to a specific public health effort, along with a checklist of items to consider when developing evaluation reports.