Focus the Evaluation
Practical Use of Program Evaluation among Sexually Transmitted Disease (STD) Programs

STEP 1: ENGAGE STAKEHOLDERS
1.1 Determine how and to what extent to involve stakeholders in program evaluation
STEP 2: DESCRIBE THE PROGRAM
2.1 Understand your program focus and priority areas
2.2 Develop your program goals and measurable (SMART) objectives
2.3 Identify the elements of your program and get familiar with logic models
2.4 Develop logic models to link program activities with outcomes
STEP 3: FOCUS THE EVALUATION
3.1 Tailor the evaluation to your program and stakeholders' needs
3.2 Determine resources and personnel available for your evaluation
3.3 Develop and prioritize evaluation questions
STEP 4: GATHER CREDIBLE EVIDENCE
4.1 Choose appropriate and reliable indicators to answer your evaluation questions
4.2 Determine the data sources and methods to measure indicators
4.3 Establish a clear procedure to collect evaluation information
4.4 Complete an evaluation plan based on program description and evaluation design
STEP 5: JUSTIFY CONCLUSIONS
5.1 Analyze the evaluation data
5.2 Determine what the evaluation findings say about your program
STEP 6: ENSURE USE OF EVALUATION FINDINGS AND SHARE LESSONS LEARNED
6.1 Share with stakeholders the results and lessons learned from the evaluation
6.2 Use evaluation findings to modify, strengthen, and improve your program

SUGGESTED CITATION: Salabarría-Peña, Y., Apt, B.S., Walsh, C.M. Practical Use of Program Evaluation among Sexually Transmitted Disease (STD) Programs. Atlanta (GA): Centers for Disease Control and Prevention; 2007.

Focus the Evaluation
Focusing your evaluation is the third step in CDC's framework for program evaluation. A focused evaluation enables you to answer the questions of greatest concern to program staff and other stakeholders about the STD program. This is based on the assumption that the entire program does not need to be evaluated at any point in time. Rather, the evaluation(s) you conduct will most likely be of a program component or activity, and will focus on what question is being asked, by whom, and what will be done with the information. Step 3 will help you determine the resources and personnel needed to implement the evaluation, the most important questions for the evaluation, and the type of evaluation(s) you will conduct. Step 3 is broken down into three evaluation tools:
Tool 3.1 provides information on types of evaluation and when to conduct them.
Tool 3.2 provides guidance on determining what resources and personnel are available for your evaluation.
Tool 3.3 describes how to develop and prioritize your evaluation questions based on program and stakeholders' needs and resources.

TOOL 3.1: TAILOR YOUR EVALUATION TO YOUR PROGRAM AND STAKEHOLDERS' NEEDS
INTRODUCTION
In Steps 1 and 2 you learned how to engage stakeholders to fully describe your STD program. You developed a logic model that includes activities, outputs, and outcomes. Now it is time to determine which program activities in your logic model should be evaluated, based on your stakeholders' needs and available resources. Once you make that determination, you are in a better position to identify the types of evaluation you can conduct, the purpose(s) of the evaluation, and who will use the evaluation results and for what. The flowchart below provides a description of where these activities fit within the program planning and program evaluation process.
[Flowchart: Understanding of program focus and priority areas]

LEARNING OBJECTIVES
Upon completion of this tool, you will be able to:
1. Define process evaluation and outcome evaluation.
2. Describe reasons for conducting process and outcome evaluations.

WHAT ARE THE MOST COMMON TYPES OF EVALUATION?
There are several types of evaluations that can be conducted. Some of them include the following:
Formative evaluation ensures that a program or program activity is feasible, appropriate, and acceptable before it is fully implemented. It is usually conducted when a new program or activity is being developed or when an existing one is being adapted or modified.
Process/implementation evaluation determines whether program activities have been implemented as intended.
Outcome/effectiveness evaluation measures program effects in the target population by assessing progress toward the outcomes or outcome objectives the program is intended to achieve.
Impact evaluation assesses program effectiveness in achieving its ultimate goals.
This tool will focus on process and outcome evaluations because they are the most common types of evaluations you will probably conduct.

WHAT ARE PROCESS AND OUTCOME EVALUATIONS?
Process evaluation determines whether program activities have been implemented as intended and resulted in certain outputs. You conduct process evaluation periodically throughout the life of your program, starting with a review of the activities and output components of the logic model (i.e., the left side). Results of a process evaluation will strengthen your ability to report on your program and to use that information to improve future activities. It allows you to track program information related to Who, What, When, and Where questions:
To whom did you direct program efforts? Example: What types and how many target population members received STD services?
What has your program done? Examples: Did the program staff distribute the STD screening protocols to clinics? Did medical staff counsel, screen, and appropriately treat clinic patients for STDs? How many professional development workshops were provided for disease intervention specialists (DIS) on protocols for interviewing clients and conducting case management? Did the program staff collaborate with the stakeholders or other partners in designing a screening program?
When did your program activities take place? Example: How many days after interviewing index cases were contacts treated prophylactically?
Where did your program activities take place? Example: Where was outreach conducted to reach the target population(s)?
Suppose you were implementing an initiative to address a syphilis outbreak among men who have sex with men (MSM). Here are some of the questions you could answer with a process evaluation:
How did DIS and STD program management collaborate with community-based organizations (CBOs) or other partners to reach MSM who engage in high-risk behaviors?
How many CBO outreach workers received STD training?
What activities did these partners implement to address the problem?
When were these activities conducted?
Where were these activities conducted?
Was the target population reached?
What were the problems encountered in reaching the target population?
Outcome evaluation measures program effects in the target population by assessing progress toward the outcomes the program is intended to address.
To design an outcome evaluation, begin with a review of the outcome components of your logic model (i.e., the right side of the model), which reflect the intended changes in your target population (knowledge, awareness, attitudes, skills, and behaviors) as well as potential changes in program policies that you hope to achieve. Some questions you may address with an outcome evaluation include:
Were medical providers who received intensive STD training more likely to effectively counsel, screen, and treat patients than those who did not?
Did the implementation of STD counseling in CBOs result in changes in knowledge, attitudes, and skills among members of the target population?
Did the program have any unintended (beneficial or adverse) effects on the target population(s)?
Do the benefits of the STD activity justify a continued allocation of resources?
If you were implementing the initiative noted above (i.e., addressing a syphilis outbreak in the MSM community), the following are some of the questions you could address with an outcome evaluation:
As a result of the syphilis initiative, was there any change in awareness of the syphilis outbreak among MSM who engage in high-risk behaviors?
Was there any change in attitudes toward condom use among MSM who engage in high-risk behaviors?
Was there any change in intention to use condoms among MSM who engage in high-risk behaviors?
Was there any change in syphilis incidence rates among MSM who engage in high-risk behaviors?
Did the benefits of the activity justify continued allocation of resources?
It is important to note the usefulness of conducting process evaluation while you are implementing outcome evaluation. If the outcome evaluation shows that the program did not produce the expected results, this may be due to program implementation issues (e.g., inadequate resources). Therefore, it is recommended that if you conduct an outcome evaluation you also implement a process evaluation.

HOW DO YOU CHOOSE THE FOCUS OF YOUR EVALUATION?
Use the following steps to help program staff and stakeholders focus the evaluation.
1. Decide on the purpose of your evaluation. The purpose of your evaluation is what you intend to get from the evaluation activities. It serves as the basis for the evaluation design, questions, and methods. By thoughtfully conducting this first step, you will prevent premature decision-making about how to carry out the evaluation. Examples of purposes include improving program operations, determining the effects of the program on the target population(s), and gaining knowledge about program activities.
2. Identify the users of your evaluation results. The users of your evaluation are the specific persons who receive and use evaluation findings (e.g., stakeholders). Support from the users increases the likelihood that the evaluation results will be used to improve the program or program activity under evaluation.
3. Identify the uses of your evaluation results. The uses of your evaluation are the specific ways that program staff and stakeholders plan to utilize the evaluation findings. To identify the uses of your evaluation, program staff and the various stakeholders should discuss the different expectations and needs they have for the evaluation (e.g., What aspect of the program am I most interested in? How can I use the evaluation results?).
A clinical supervisor, for example, might expect that an evaluation of the implementation of an activity (process evaluation) would provide feedback on whether staff members are conducting quality STD screening. Therefore, the use of the evaluation might be to improve the quality of screening. Other examples of uses of evaluation findings include: deciding how to allocate resources, deciding whether to expand the locations where a program activity is carried out, identifying program areas that need improvement, documenting the level of success in achieving objectives, and soliciting more funds. Once program staff and stakeholders have identified their expectations for conducting an evaluation, you will need to prioritize the various uses of the evaluation. The following is an example to guide this process based on decisions made by program staff and stakeholders. They decided that the overall purpose of the evaluation should be to determine whether Chlamydia screening (an activity that has been carried out for six months) has been implemented as planned. Table 1 summarizes stakeholder expectations, evaluation users, and evaluation uses for the proposed process evaluation.

Table 1: Stakeholder Expectations, Evaluation Users, and Evaluation Uses for Process Evaluation

Stakeholder needs/expectations: CDC and the local STD program director and manager expect the evaluation to document the scope and quality of activity implementation and to demonstrate that the activity is being implemented as designed.
Evaluation users: CDC. Evaluation uses: Justify that STD program resources are being appropriately used.
Evaluation users: STD program director, STD manager, and staff. Evaluation uses: Identify the extent to which activity plans were implemented; make mid-course adjustments to improve the activity and refine the activity plan.

Stakeholder needs/expectations: Clinical supervisors and staff expect the evaluation to provide feedback on whether staff members are conducting quality screening.
Evaluation users: Clinical supervisors, STD program director, and staff. Evaluation uses: Identify the extent to which Chlamydia screening is being implemented as planned; improve the quality of screening.

Stakeholder needs/expectations: CBO partners expect the evaluation to inform them about whether their outreach efforts are reaching the target population for the activity.
Evaluation users: CBO partners; STD program director, manager, and staff (DIS, field staff supervisors); community leaders. Evaluation uses: Improve outreach strategies to the community; increase Chlamydia screening in the community.

Stakeholder needs/expectations: Representatives from the target population expect the evaluation to inform them whether the activity is culturally and linguistically appropriate and whether their rights are being protected.
Evaluation users: Target population members; STD program director, manager, and staff. Evaluation uses: Enhance the activity's cultural competence; verify that participants' rights are protected.

4. Identify the stage of development of your program to determine whether the focus of your evaluation is realistic. With the implementation of a new STD program activity, consider conducting a process evaluation to determine whether the program is being delivered as planned and whether improvements are needed. An evaluation that included outcomes would make little sense at this stage. With a more mature program, you may want to conduct an outcome evaluation to assess your program's effectiveness and to demonstrate that it is making productive use of resources. For instance, if your STD program has been implemented for 2-3 years, you may want to focus on assessing whether short-term outcomes (e.g., changes in knowledge, attitudes, and skills of the target population) are achieved.
If (1) the program has been implemented for several years, (2) program delivery is going smoothly, and (3) you have evaluated the implementation of its activities, you may want your evaluation to focus on intermediate and/or long-term outcomes (e.g., changes in STD risk behaviors and sexual health status of the target population).

SUMMARY CHECKLIST: Focusing the Evaluation

CONCLUSION AND NEXT STEPS
With this evaluation tool you have learned about process and outcome evaluation and the essentials of focusing your evaluation. You have also learned the importance of having an evaluation that is realistic and feasible. The next tool, Tool 3.2 (Determine resources and personnel available for the evaluation), details what you need to know about determining the resources and personnel available for your evaluation.

ACRONYMS USED IN THIS TOOL
CBO: Community-based organization
DIS: Disease Intervention Specialist
MSM: Men who have sex with men
STD: Sexually transmitted disease

KEY TERMS
Activities: Actual events that take place as part of your program (e.g., developing pamphlets, testing patients).
Effectiveness: Relates to outcome evaluation; refers to the contribution a program makes to producing changes in the target population/organization.
Fidelity: The degree to which your STD program or intervention is implemented as intended.
Inputs: Program resources (e.g., money, staff, materials).
Intermediate outcomes: Intended effects of your program on the target population/organization that take longer than short-term outcomes to occur (e.g., changes in STD-related policy or in behavior of the target population).
Logic model: A picture of how a program/component/activity is supposed to work.
Long-term outcomes: Intended effects of your program on the target population/organization that may take several years to achieve, such as reduced disease transmission and incidence.
Outcome evaluation: A type of evaluation that determines the effects of your program activities on the target population (e.g., changes in knowledge, attitudes, beliefs, skills) or organization. The outcome components of the logic model (the right side) are used to plan an outcome evaluation.
Outputs: The direct products of your program activities or services delivered (e.g., pamphlets developed, patients tested).
Process evaluation: Also referred to as implementation evaluation; a type of evaluation that determines whether your program and its activities are implemented as intended, and why or why not. Information gathered is used to refine or modify these activities and related procedures. The inputs, activities, and outputs of a logic model (the left side) are used to plan a process evaluation.
Purpose of evaluation: General intent of the evaluation (e.g., to fine-tune program operations).
Short-term outcomes: Immediate effects of a program on the target population/organization (e.g., changes in knowledge, attitudes, skills, awareness, or beliefs).
Stage of development: The level of maturity of your program, which influences the type of evaluation to conduct (e.g., planning, implementation, and maintenance stages).
Users of an evaluation: The specific persons/organizations that will employ the evaluation findings in some way (e.g., STD Director, CBO, funder).
Uses of an evaluation: The specific ways that program staff and other stakeholders will apply what is learned from the evaluation (e.g., change STD clinical practice, inform STD prevention policy).
CASE SCENARIO
The STD program of the city of Chancri-La had implemented a syphilis elimination media campaign for three years to address a syphilis outbreak in the MSM community. Program staff and stakeholders used the results of a process evaluation to track their implementation efforts and improve the scope and quality of the campaign delivery to the target population. The next step was for program staff and stakeholders to come together to discuss the possibility of conducting an outcome evaluation. Along with the syphilis elimination team from the health department and other program staff, the stakeholders present at the meeting included representatives of the gay media, owners of bathhouses and other businesses frequented by MSM, CBOs that serve high-risk MSM, and MSM advocating for their community. Discussion topics addressed the following questions:
What will be the purpose of our evaluation?
How realistic is what we plan to do?
What expectations do individuals have for an outcome evaluation?
Who will be the users of the evaluation information?
What will be the uses of the evaluation?
Because the syphilis elimination media campaign had been widely profiled in the mass media, including the gay media, several community stakeholders wanted to measure the effectiveness of the intervention in attaining its objectives. Their process evaluation efforts had led to several improvements in the activity, and they felt it was now mature enough to undergo outcome evaluation. Also, some stakeholders needed to know whether the syphilis outbreak had been effectively addressed by the activity so they could divert scarce resources to other program activities. The stakeholders had differing expectations about the outcome evaluation and differing views about the users and uses of the evaluation information. Table 2 summarizes this information.
Table 2: Stakeholders' Needs/Expectations, Uses, and Users for the Outcome Evaluation

REFERENCES
Centers for Disease Control and Prevention, National Center for Chronic Disease Prevention and Health Promotion, Division of Adolescent and School Health. (2004). Evaluation steps tools. Unpublished manuscript.
Centers for Disease Control and Prevention. (1999). Framework for program evaluation in public health. MMWR Recommendations and Reports, 48(RR-11). Retrieved February 22, 2005, from http://www.cdc.gov/mmwr/preview/mmwrhtml/rr4811a1.htm
Fitzpatrick, J.L., Sanders, J.R., & Worthen, B.R. (2004). Program evaluation: Alternative approaches and practical guidelines (3rd ed.). New York: Allyn and Bacon.
MacDonald, G., Starr, G., Schooley, M., Yee, S.L., Klimowski, K., & Turner, K. (2001, November). Introduction to program evaluation for comprehensive tobacco control programs. Atlanta, GA: Centers for Disease Control and Prevention. Retrieved October 17, 2004, from http://www.cdc.gov/tobacco/evaluation_manual/Evaluation.pdf
Rossi, P.H., Freeman, H.E., & Lipsey, M.W. (1999). Evaluation: A systematic approach (6th ed.). Thousand Oaks, CA: Sage.
Stecher, B.M., & Davis, W.A. (1987). How to focus an evaluation. Newbury Park, CA: Sage.
Thompson, N.J., & McClintock, H.O. (1998). Demonstrating your program's worth: A primer on evaluation for programs to prevent unintentional injury. Atlanta: Centers for Disease Control and Prevention, National Center for Injury Prevention and Control.
U.S. Department of Health and Human Services. (2002). Physical activity evaluation handbook. Atlanta, GA: U.S. Department of Health and Human Services, Centers for Disease Control and Prevention.
W.K. Kellogg Foundation. (1998, January). W.K. Kellogg Foundation evaluation handbook. Retrieved October 17, 2005, from http://www.wkkf.org/Programming/ResourceOverview.aspx?CID=281&ID=770
W.K. Kellogg Foundation. (2001, December). Logic model development guide. Retrieved January 8, 2005, from http://www.wkkf.org/Programming/ResourceOverview.aspx?CID=281&ID=3669

TOOL 3.2: DETERMINE THE RESOURCES AND PERSONNEL AVAILABLE FOR YOUR EVALUATION
INTRODUCTION
In Steps 1 and 2, you learned how to engage stakeholders and to fully describe your STD program. You also developed a logic model that included inputs, activities, outputs, and outcomes. Using Tool 3.1, you learned how to tailor your evaluation to your program and stakeholder needs and to identify the types of evaluations you could conduct. In Tool 3.2 you will learn about the importance of considering the time, skills, and fiscal resources needed to conduct an evaluation.

LEARNING OBJECTIVE
Upon completion of this activity, you will be able to: Determine the amount of financial resources, time, and personnel available/needed to plan a feasible evaluation.

WHEN DO YOU NEED THE RESULTS OF YOUR EVALUATION?
Part of planning an evaluation involves determining when results are needed. You can then work backward by projecting start-up and completion dates of the evaluation. For instance, if you plan to conduct a process evaluation of a relatively new STD activity/intervention, you will want to generate evaluation results at an early stage of implementation in order to improve the activity. Keep in mind that stakeholders may want to use evaluation results to make policy or programmatic decisions concerning this STD activity/intervention. If so, you will need to plan your evaluation activities so that these results are available according to the policy decision-making schedule. Some evaluation activities occur occasionally (e.g., collecting baseline measures of clients' STD knowledge of symptom recognition), while others are ongoing (e.g., monitoring implementation of your syphilis elimination activities over time). You need to create a timeline for every evaluation you conduct of your STD program. Please refer to Appendix A to see a sample evaluation timeline and an accompanying case scenario.

WHO WILL CONDUCT THE EVALUATION?
You will need individuals with evaluation expertise to conduct the evaluation. You should consider recruiting this expertise at the beginning of the evaluation process so that these individuals can provide input early on. Here is some advice to consider in determining how to staff your evaluation.
1. Assess the evaluation skills and needs of the STD program staff. A variety of evaluation skills are needed at different points in an evaluation. These skills range from the ability to communicate well to the ability to analyze data. Assess which evaluation skills the STD program staff has. You may want to use Appendix B as guidance to determine which evaluation skills you have within the STD staff and which skills are still needed to undertake the evaluation.
2. Consider staff professional development or CDC technical assistance. If individuals are available to do the required evaluation work but lack the necessary skills, consider professional development in these skill areas, which in the long run can be very beneficial for the STD program. Training your staff to use these evaluation tools developed by CDC's Division of STD Prevention (DSTDP) is a good place to start.
There are other resources STD programs can contact for training on program evaluation. The National Coalition of STD Directors, the Prevention Training Centers, or local universities may know of resources. You may also want to contact your program consultant and request evaluation technical assistance from DSTDP. See Appendices C and D for program evaluation technical assistance services offered by DSTDP.
3. Explore local evaluation resources and peer-to-peer assistance. Evaluation resources and expertise may exist in several places in your locale. One place to look is in other health department (HD) programs or divisions. You may also locate peer-to-peer assistance by exploring evaluation resources connected with your partners and evaluation stakeholders. Another place to locate evaluation expertise is in colleges and universities, specifically in education, social science, and public health programs. A relatively easy way of garnering evaluation expertise is to work with a skilled graduate student who is interested in your project.
4. Consider hiring an evaluation consultant, if funding is available. If there are evaluation activities that require specific expertise outside the skills of your program staff (e.g., qualitative data analysis), you may want to consider hiring an external evaluation consultant if funds are available. Faculty at local colleges and universities may be interested in serving as consultants. See Appendix E for how to recruit an evaluation consultant. Keep in mind that your STD program is ultimately responsible for any STD program evaluation activities conducted in your project area. Therefore, you need to become familiar with the program evaluation process, be an integral part of the decision process, and monitor the progress of the evaluation activities implemented, even if you are using an external consultant.

WHAT FINANCIAL RESOURCES ARE AVAILABLE FOR YOUR EVALUATION?
Along with time and skill, you need financial resources to conduct your evaluation. Funding an evaluation may seem a bit daunting for your STD program when you are working with limited year-to-year funding and pressing program priorities. However, you can conduct meaningful evaluation activities with limited resources. Part of your decision-making process for using funds for evaluation will be based on the extent to which you, your program staff, and other stakeholders value and prioritize evaluation. STD programs and stakeholders who value the benefits of evaluation, and who are creative, find ways to begin, maintain, and even expand effective program evaluation activities in the midst of severe fiscal constraints. You may want to consider building an evaluation line item directly into your program budget to be used for evaluating priority program activities or interventions. In this regard, based on internal program decision-making, you may want to consider dedicating a proportion or set dollar amount of the annual budget to program evaluation. Another common funding pattern is to take advantage of opportunities for unexpected funding through such avenues as supplemental applications (e.g., from a government agency, private foundation, or nonprofit entity) or rollover funds. Another issue you should consider in identifying financial resources for an evaluation is cost-sharing with other entities that can benefit from the evaluation.
If you are already working with other agencies on the program activity or component being evaluated, they may be willing to cost-share on the evaluation (e.g., sharing personnel, providing space or equipment for evaluation activities). You may also want to consider calling on local businesses and asking for their support in supplying incentives (e.g., food coupons, tickets to entertainment events) for those who participate in the evaluation, as a token of appreciation for their time. However, this is not a requirement. The amount of fiscal resources you will need for an evaluation will vary considerably depending on the design of the evaluation you choose and the scope of activities you plan. See Appendix F for how to develop an evaluation budget. This guidance was adapted from, and can be found at, the CDC website http://www.cdc.gov/od/pgo/funding/budgetguide.htm.

SUMMARY CHECKLIST: Identifying the Resources and Personnel Available for Your Evaluation
Develop a timeline for your evaluation that takes into account the needs of your program and its stakeholders.
Determine how to staff your evaluation.
Assess the evaluation skills and needs of the STD program staff.
Consider staff professional development (training) or CDC technical assistance.
Explore local evaluation resources and peer-to-peer assistance.
Consider hiring an evaluation consultant, if funds are available.
Determine what financial resources you have available to conduct evaluation activities.
Develop a realistic budget for the evaluation.

CONCLUSION AND NEXT STEPS
By using this program evaluation tool, you learned about the factors (i.e., time, staff, and fiscal resources) that you need to consider when planning an evaluation. Next, Tool 3.3 will allow you to further focus your evaluation. It will help you select what you want to evaluate and will assist you in developing and prioritizing your evaluation questions.

ACRONYMS USED IN THIS TOOL
AEA: American Evaluation Association
CDC: Centers for Disease Control and Prevention
DSTDP: Division of STD Prevention
HD: Health Department
HSREB: Health Services Research and Evaluation Branch
MSM: Men who have sex with men
SOW: Statement of work
STD: Sexually transmitted disease
TIG: Topical Interest Group

KEY TERMS
Activities: Actual events that take place as part of your program (e.g., developing pamphlets, testing patients).
Effectiveness: Relates to outcome evaluation; refers to the contribution a program makes to producing change in the target population/organization.
Focus group: A qualitative method used to collect data from a group of people (about 6-11) who meet for 1-2 hours to discuss their insights, ideas, and observations about a particular topic with a trained moderator. Participants are selected because they share certain characteristics (e.g., individuals who have been tested for syphilis, women in detention facilities) relevant to the evaluation.
Qualitative data: Detailed/narrative information that provides an in-depth understanding of a topic/issue/population. An example of qualitative data is the answers representatives of a CBO would provide when asked for their thoughts on how to reach high-risk adolescents.
Quantitative data: Numerical information. An example is data that identify the number of times (e.g., 1, 2, 3, 10) each client has visited your clinic within the last year.
Stakeholders: Individuals or organizations directly or indirectly affected by your program and/or the evaluation results (e.g., STD program staff, family planning staff, representatives of target populations).

REFERENCES
Fitz-Gibbon, C.T., & Morris, L.L. (1987). How to design a program evaluation. Newbury Park, CA: Sage.
Horn, J. (2001). A checklist for developing and evaluating evaluation budgets. Retrieved December 28, 2004, from http://www.wmich.edu/evalctr/checklists/evaluationbudgets.htm
Joint Committee on Standards for Educational Evaluation. (1994). The program evaluation standards (2nd ed.). Thousand Oaks, CA: Sage.
King, J.A., Morris, L.L., & Fitz-Gibbon, C.T. (1987). How to assess program implementation. Newbury Park, CA: Sage.
Langford, L., & DeJong, W. (2002). How to select a program evaluator. Retrieved December 28, 2004, from http://www.edc.org/hec/pubs/prevupdates/progeval.html
Roe, K.M. (1997, January). Evaluation methodologies with limited resources. (Concept paper prepared for the Academy for Educational Development.)
Rossi, P.H., Lipsey, M.W., & Freeman, H.E. (2004). Evaluation: A systematic approach (7th ed.). Thousand Oaks, CA: Sage.
Sarvela, P.D., & McDermott, R.J. (1993). Health education evaluation and measurement: A practitioner's perspective. Madison, WI: Brown and Benchmark.
Windsor, R., Baranowski, T., Clark, N., & Cutter, G. (1999). Evaluation of health promotion, health education, and disease prevention programs (2nd ed.). Mountain View, CA: Mayfield.
Witkin, B.R., & Altschuld, J.W. (1995). Planning and conducting needs assessments. Thousand Oaks, CA: Sage.
Wholey, J.S., Hatry, H.P., & Newcomer, K.E. (Eds.). (1994). Handbook of practical program evaluation. San Francisco: Jossey-Bass.

APPENDIX A: CASE SCENARIO
The STD program in State X has been implementing a Chlamydia screening initiative in family planning clinics for 3 years to address a continued high prevalence of Chlamydia among females ages 14 to 24. The STD Director is evaluating the program to determine its effectiveness and to use the results to solicit future funding for this program, since the present funding will end this fiscal year. All stakeholders have also agreed that the results obtained from the evaluation will be used to improve the program. The director and other stakeholders concluded at a meeting that an evaluation (process and outcome) is feasible at this time because the program goals and objectives are well-defined and can be used to measure performance. The director is also confident that the clinics' internal database will provide the information that is needed, because evaluation was part of planning the initiative, thus allowing the program to collect the appropriate data. First, the director and program staff identified when evaluation results are needed for planning and decision making. They determined that process information is needed at key time intervals to improve the program. They also decided that data collection on clinic patients needs to occur throughout the program and be analyzed at strategic times in order to demonstrate effectiveness to program stakeholders. The timeline for the evaluation is provided below.

Table 1: Projected First-Year Timeline for Evaluation Activities for the State X Chlamydia Screening Initiative
Evaluation activities (scheduled across months 1-12):
Develop evaluation data collection instruments.
Collect baseline measures on female family planning clinic patients ages 14-24.
Monitor implementation of the Chlamydia screening initiative.
Conduct focus groups with female STD clinic patients.
Collect the same measures from patients as were collected at baseline (i.e., follow-up).
Analyze evaluation data.
Report findings.
Revise program, if needed.

The STD director and staff, as well as other stakeholders, also identified the evaluation skills necessary to conduct the evaluation activities. They felt that, as part of the process evaluation, a sample of the female clients who have visited the clinic should be interviewed individually and as a group. In order to obtain the most reliable and valid information from clients, the director and staff felt the evaluator should be an experienced interviewer who has conducted both individual and focus group interviews. In addition, statistical skills are needed to analyze the clinic database on all female clients aged 14 through 24 for the outcome evaluation (effects of the initiative on the target population). Since many of the clients are young females, the evaluator should feel comfortable with this target population. The director developed a budget for the evaluation using the guidance provided in Appendix F and following the fiscal processes dictated at the local level. Along with stakeholders, the director developed a scope of work to recruit an evaluation consultant and distributed it through different channels (e.g., internet, bulletin boards at universities, word of mouth). The program hired an evaluator from a local university.
Note: Keep in mind that you and other stakeholders develop a final timeline and budget once you have agreed on what to evaluate, on staffing, and on the design and methods to be applied in the evaluation. The last two topics are discussed in Tools 4.1, 4.2, 4.3, and 4.4.

APPENDIX B: SKILLS OF A QUALIFIED EVALUATOR
Core Evaluation Skills
Knowledge about program evaluation theory and methodology. In order to conduct any evaluation, your evaluator should be trained in the most current evaluation theories and methodologies (e.g., participatory evaluation, process and outcome evaluation).
Ability to differentiate between research and program evaluation procedures. Program evaluation addresses practical problems, whereas research addresses theoretical issues first, with less focus on practice. In addition, the intent of program evaluation is program improvement, while the intent of research is primarily to test hypotheses. It is helpful if your evaluator can differentiate between the two and has primary experience with program evaluation principles and standards.
Experience and comfort working with diverse populations. Your evaluator should have experience, and feel comfortable, working with the target populations your STD program serves, and with policy makers.
Experience with designing and conducting an evaluation. It is very important that your evaluator be experienced with designing and conducting evaluations. You will be able to assess this skill by reading previous evaluation reports that he/she has written. The reports should clearly communicate evaluation concepts and findings to diverse audiences. The reports should address evaluation questions and discuss both the strengths and limitations of the evaluation design, methods, and data.
Experience with qualitative and/or quantitative data sources and analyses.
If you have a database that contains quantitative information (e.g., demographic information, test results, number of sexual partners), or you need to develop one, your evaluator should understand how that information was collected and be qualified in designing, collecting, and analyzing these types of data. In addition, your evaluation may involve the collection of qualitative data via in-depth interviews, focus groups, or observations, among other methods. Again, your evaluator should have experience collecting and analyzing these types of data, as well as knowledge of the strengths and weaknesses of either kind of data and when to apply them.
Desirable Evaluation Skills
Knowledge about STD content areas. Your evaluator should be familiar with the best practices of STD prevention and control, how STD programs operate, the communities affected by STDs, and how these diseases can impact the community.
Ability to train staff in evaluation concepts and skills. To make your evaluation cost-effective, it is helpful if a qualified evaluator can train program staff in different evaluation skills (e.g., use of data collection methods, how to use qualitative software).

APPENDIX C: PROGRAM EVALUATION TECHNICAL ASSISTANCE (TA) SERVICES AT CDC
Program evaluation technical assistance is available from evaluation specialists located in the Health Services Research and Evaluation Branch (HSREB) at CDC's Division of STD Prevention (DSTDP). We provide evaluation TA to help project areas build evaluation capacity so they can evaluate their own STD program components.
What services do we provide?
Building program evaluation skills by assisting individuals in:
Developing goals and SMART objectives
Developing a logic model
Formulating evaluation questions
Determining data collection methods and sources
Developing an evaluation plan
Developing indicators
Developing evaluation instruments
Developing a timeline for evaluation activities
Providing recommendations about qualities to look for when hiring or contracting evaluators
Developing dissemination/reporting strategies for evaluation results
Evaluating a component/activity of a program
Evaluating implementation of a program or its activities
Evaluating effects of a program in the target population (outcomes)
Conducting empowerment evaluation (coaching others on how to evaluate their programs)
Conducting evaluation workshops
Identifying/providing evaluation materials (e.g., tools, resources)
Who is on the Evaluation TA Team?
The team consists of two evaluation specialists with expertise in the practical and theoretical aspects of program evaluation. Depending on the time and complexity of the request, we will either provide direct evaluation TA or refer project areas to other DSTDP branch staff, other state/local STD program staff with expertise in specific evaluation-related subjects, or local evaluation resources.
How can you request evaluation TA?
Contact your program consultant to let him/her know about the need for evaluation TA.
Fill out the TA Evaluation Form (see Appendix D).
Fax your completed form to (404) 639-8607. TA requests will be processed within 10 working days of receipt.
Where to call for questions?
For more information, call HSREB at (404) 639-8276 or email eval@cdc.gov.
APPENDIX D: TA EVALUATION FORM
PROGRAM EVALUATION TECHNICAL ASSISTANCE SERVICES AT DSTDP
Attention: __________________________________ Date: __________________
Name of Project Area:
Name of Person Requesting Evaluation TA: (e.g., individual from project area, program consultant, other staff member at PDSB/DSTDP)
Address of Contact Person at Project Area (street name/number, city, zip code):
Phone (xxx-xxx-xxxx): _____________________ Email: ________________________
Type of Evaluation TA Requested (please check all that apply):
Conducting evaluation workshop
Evaluating or providing guidance on how to evaluate a program component/activity
Evaluating or providing guidance on how a program or its activities is/are implemented
Evaluating or providing guidance on how to conduct an outcome evaluation (effects of a program in the target population)
Other (specify)
Please respond to the following as completely and clearly as possible. If you need assistance in clarifying your TA request, please contact your Program Consultant.
Briefly describe the issue or problem that prompted this TA request.
Specify your timeline for resolving the issue for which you are requesting TA (e.g., 3 weeks, 1 month, 4 months, 1 year).
Describe the goal(s) or objective(s) of your STD program to be addressed through this request.
Describe how the evaluation TA being requested will help improve your program.
You can either download this form and complete it by computer, or print it out and complete it by hand. Please fax or email the completed form to the Health Services Research and Evaluation Branch at (404) 639-8607 or eval@cdc.gov, respectively.
For CDC use:
Date received: ______________________________________
By: _________________________________________________
Date of Disposition: _________________________________
Disposition: ________________________________________

APPENDIX E: HOW DO YOU RECRUIT AN EVALUATION CONSULTANT?
1. Identify your evaluation needs, and develop a clear scope of work. The first step in recruiting an external evaluation consultant is to know what you will evaluate and to decide exactly what you need from that individual. This may be spelled out in a statement of work (SOW) that describes the intended evaluation task(s), the products or deliverables you would like to see as a result of the evaluator's labor, and the timeframe in which you are operating. The more specific you are in writing this SOW, the easier it will be to locate the individual you need and to negotiate a work contract with her/him. Some of the categories to take into account when constructing a contract include procedural guidelines for conducting the evaluation (e.g., communication protocols); informational requirements (a clear description of the expected product(s), data collection protocols, safeguarding of information such as confidentiality of informants, and data ownership); procedures for data analysis; reporting and timeline demands; client responsibilities; budgetary guidelines; and provisions for reviewing and evaluating the evaluator's work. See the sample SOW provided below.

SOW SAMPLE: EVALUATION CONSULTANT WORK ORDER
WORK ORDER NO. 12345
Period of performance: June 23, 2006, to July 24, 2006
Task Description: Under cooperative agreement X, titled Improving Evaluation Skills (IES), the consultant will draft and finalize the evaluation report for this project according to the instructions specified in the Program Guidance.
Specifically, the consultant will:
Draft, revise as needed, and finalize the Executive Summary; Evaluation Purpose; Evaluation Design and Methods; Results; and Conclusions and Recommendations.
Prepare required appendices (i.e., logic model, evaluation plan).
Draft an abstract if requested by the Project Director.
Deliverables/Milestones and Due Dates:
1. Conference call with Project Director and other stakeholders. June 27, 2006
2. Draft of above-referenced sections. July 9, 2006
3. Revised evaluation report. July 18, 2006
4. Completed evaluation report. July 20, 2006
Payment Schedule: Not to exceed level of effort (5 days), with payments based on days worked and days worked tied to specific progress in achieving milestones/deliverables. Not-to-exceed # days at $____/day; not-to-exceed $____ total.
Technical Direction: Jane Doe
Organization X and Consultant agree that the above services will be provided in accordance with the Organization X Consulting Agreement signed by both parties dated ________.
For (Organization X): Signature, Date, Name, Title
Consultant: Signature, Date, Name, Social Security Number

2. Search for available evaluation consultants. Find out about the availability of evaluation consultants by checking with nearby colleges or universities that have a school or program of public health or departments of education, psychology, sociology, or statistical analysis. Evaluators may also be identified through the website of the American Evaluation Association (AEA) (www.eval.org). By clicking on Affiliates, you can find state or regional evaluation members. By clicking on Find an Evaluator, you can find several evaluation firms located in different parts of the country, with their web addresses and contact information. By clicking on TIGs (Topical Interest Groups), you can find contact information for evaluators located at different universities and organizations, as well as their specific interests. AEA also maintains an online job bank, where you can post a job description for your ideal evaluator (http://www.eval.org/JobBank/jobbank.htm).
3. Develop a set of criteria to identify an appropriate evaluator for your program. Consider using the following checklist as an additional resource to identify a suitable evaluator for your evaluation activities.

CHECKLIST FOR SELECTING AN EVALUATOR
For each item, indicate whether the evaluator appears to be: Well Qualified, Not Well Qualified, or Cannot Determine if Qualified.
1. To what extent does the formal training of the potential evaluator qualify her/him to conduct evaluation studies? (Consider degree specialization and whether the candidate has any previous STD evaluation experience.)
2. To what extent does the previous evaluation experience of the potential evaluator qualify her/him to conduct evaluation studies? (Consider length and relevance of experience.)
3. To what extent does the previous performance of the potential evaluator qualify her/him to meet the needs of your project? What prior experience does she/he have in similar settings? (Look at work samples or contact references.)
4. To what extent are the personal styles and characteristics of the potential evaluator acceptable? (Consider honesty, character, interpersonal communication skills, and ability to resolve conflicts.)
5. Summary: Based on the questions above, to what extent is the potential evaluator qualified and acceptable to conduct your evaluation?

APPENDIX F: HOW DO YOU DEVELOP A BUDGET FOR YOUR EVALUATION?
The costs of your evaluation need to be realistic so that the resources expended can be justified. Stakeholders need to feel confident that the evaluation will be conducted efficiently and will produce information of sufficient value to address their needs. To project a realistic evaluation budget, consider the direct and indirect evaluation costs of your program. Direct costs for program evaluation include dollars spent on evaluation activities, employee benefits, contractors, equipment, supplies, and travel. Indirect costs include overhead expenses (e.g., building rental and utilities, custodial services). As you prepare your budget, the cost for the contractor may appear as a single line item, or it can be itemized by consultant's salary, fringe benefits, and non-personnel costs. The following guidance includes information you need when developing a budget for any evaluation you undertake.
A. Salaries and Wages
For each requested position, provide the following information: name of staff member occupying the position, if available; annual salary; percentage of time budgeted for this program; total months of salary budgeted; and total salary requested. Also, provide a justification and describe the scope of responsibility for each position, relating it to the accomplishment of program objectives.
Sample Justification
The format may vary, but the description of responsibilities should be directly related to specific program objectives.
Job Description: Evaluation Coordinator (Name). This position directs the overall operation of the program evaluation activities and staff performance evaluations; is responsible for overseeing the implementation of evaluation activities, coordination with other agencies, development of data collection instruments, provision of in-service training on program evaluation, and conducting meetings; designs and directs the gathering, tabulating, and interpreting of required data; and is the responsible authority for ensuring that necessary reports/documentation are submitted to CDC.
B. Fringe Benefits
Fringe benefits are usually applicable to direct salaries and wages. Provide information on the rate of fringe benefits used and the basis for their calculation. If a fringe benefit rate is not used, itemize how the fringe benefit amount is computed.
C. Consultant Costs
This category is appropriate when hiring an individual to give professional advice or services (e.g., external evaluator, data collectors, data entry) for a fee, but not as an employee of the grantee organization. When federal funds are used, written approval must be obtained from CDC prior to establishing a written agreement for consultant services. Approval to initiate program activities through the services of a consultant requires submission of the following information to CDC: (1) name of consultant; (2) organizational affiliation (if applicable); (3) nature of services to be rendered; (4) relevance of service to the project; (5) the number of days of consultation (basis for fee); and (6) the expected rate of compensation (travel, per diem, other related expenses). List a subtotal for each consultant in this category. If the above information is unknown for any consultant at the time the application is submitted, the information may be submitted at a later date as a revision to the budget. In the body of the budget request, a summary should be provided of the proposed consultants and amounts for each.
D. Equipment
Provide justification for the use of each item and relate it to specific program objectives. Maintenance or rental fees for equipment should be shown in the "Other" category.
Sample Justification
Provide complete justification for all requested equipment, including a description of how it will be used in the program. For example: (1) One computer workstation is requested for the evaluation coordinator, who will be working at the State Department of Health, STD Program office. At the current time, there are no unused computer workstations available. The coordinator will use this workstation to design all data collection instruments, keep minutes of all stakeholder meetings, design all components of the evaluation plan, and write all evaluation reports. The second computer workstation is requested for the data collector, also to be housed at the State Department of Health, STD Program office. This workstation will be used to collect and analyze all data from STD Program evaluations. (2) The fax machine will be used to receive reports from the field from interviewers of the qualitative and quantitative evaluations; details on these evaluations may be accessed in the most current CSPS progress report.
E. Supplies
Individually list each item requested. Show the unit cost of each item, number needed, and total amount. Provide justification for each item and relate it to specific program objectives. If appropriate, general office supplies may be shown by an estimated amount per month times the number of months in the budget category.
Sample Justification
General office supplies will be used by staff members to carry out daily evaluation activities of the program. Qualitative software will be used to analyze interviews with the target population on their perceptions about STD services, and word processing software will be used to document program activities, process progress reports, etc.
F. Travel
Dollars requested in the travel category should be for staff travel only. Travel for consultants should be shown in the consultant category. Travel for other participants, advisory committees, review panels, etc., should be itemized in the same way specified below and placed in the "Other" category.
In-State Travel
Provide a narrative justification describing the travel staff members will perform. List where travel will be undertaken, number of trips planned, who will be making the trip, and approximate dates. If mileage is to be paid, provide the number of miles and the cost per mile. If travel is by air, provide the estimated cost of airfare. If per diem/lodging is to be paid, indicate the number of days and amount of daily per diem, as well as the number of nights and estimated cost of lodging. Include the cost of ground transportation when applicable.
Out-of-State Travel
Provide a narrative justification describing the same information requested above. Include CDC meetings, conferences, and workshops, if required by CDC. Itemize out-of-state travel in the format described above.
SAMPLE BUDGET: IN-STATE TRAVEL
1 trip x X people x X miles r/t x $.27/mile = $X
2 days per diem x $X/day x X people = $X
1 night's lodging x $X/night x X people = $X
25 trips x X person x X miles avg. x $.27/mile = $X
Total = $X
TRAVEL (IN-STATE AND OUT-OF-STATE) Total: $______
Sample Justification
The Evaluator and STD Program Director will travel to (location) to present the evaluation findings. The Evaluator will make an estimated 25 trips to local STD outreach sites to monitor program implementation.
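Before turning to out-of-state travel, the following is a minimal sketch, in Python, of how the in-state travel line-item arithmetic above might be worked through. The $.27/mile rate comes from the sample budget; every other figure (trips, people, miles, per diem, lodging) is a hypothetical placeholder, not a prescribed amount, and should be replaced with your program's actual values.

# Minimal sketch of the in-state travel arithmetic shown in the sample budget.
# All figures other than the mileage rate are hypothetical placeholders.

MILEAGE_RATE = 0.27  # dollars per mile, as used in the sample budget above

def in_state_travel_cost(trips, people, miles_round_trip,
                         per_diem_days=0, per_diem_rate=0.0,
                         lodging_nights=0, lodging_rate=0.0):
    """Return the total cost of a set of identical in-state trips."""
    mileage = trips * people * miles_round_trip * MILEAGE_RATE
    per_diem = trips * people * per_diem_days * per_diem_rate
    lodging = trips * people * lodging_nights * lodging_rate
    return mileage + per_diem + lodging

# Hypothetical example: one 2-day stakeholder meeting for 3 staff (150 miles
# round trip), plus 25 monitoring trips by one evaluator averaging 40 miles.
meeting = in_state_travel_cost(trips=1, people=3, miles_round_trip=150,
                               per_diem_days=2, per_diem_rate=46.0,
                               lodging_nights=1, lodging_rate=85.0)
monitoring = in_state_travel_cost(trips=25, people=1, miles_round_trip=40)

print(f"Stakeholder meeting travel: ${meeting:,.2f}")
print(f"Site monitoring travel:     ${monitoring:,.2f}")
print(f"In-state travel total:      ${meeting + monitoring:,.2f}")

However you tally the figures, the point is the same as in the sample justification: each line item should be traceable to a named trip, traveler, and purpose.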
SAMPLE BUDGET: OUT-OF-STATE TRAVEL
1 trip x 1 person x $X r/t airfare = $X
3 days per diem x $X/day x 1 person = $X
1 night's lodging x $X/night x 1 person = $X
Ground transportation, 1 person = $X
Total = $X
Sample Justification
The Project Coordinator will travel to CDC, in Atlanta, GA, to attend the CDC Conference.
G. Other
This category contains items not included in the previous budget categories. Individually list each item requested and provide appropriate justification related to the program objectives.
SAMPLE BUDGET
Telephone: ($ per month x months x # staff) = $ subtotal
Postage: ($ per month x months x # staff) = $ subtotal
Printing: ($ per document x # documents) = $ subtotal
Equipment rental (describe): ($ per month x months) = $ subtotal
Internet provider service: ($___ per month x ___ months) = $ subtotal
OTHER Total: $_______
Sample Justification
Some items are self-explanatory (telephone, postage, rent) unless the unit rate or total amount requested is excessive. If so, include additional justification. For printing costs, identify the types and number of copies of documents to be printed (e.g., procedure manuals, annual reports, materials for a media campaign).
H. Contractual Costs
Cooperative agreement recipients must obtain written approval from CDC prior to establishing a third-party contract to perform program activities. Approval to initiate program activities through the services of a contractor requires submission of the following information to CDC: (1) name of contractor; (2) method of selection; (3) period of performance; (4) scope of work; (5) method of accountability; and (6) itemized budget and justification. If the above information is unknown for any contractor at the time the application is submitted, the information may be submitted at a later date as a revision to the budget. Copies of the actual contracts should not be sent to CDC unless specifically requested. In the body of the budget request, a summary should be provided of the proposed contracts and amounts for each.
I. Total Direct Costs $________
Show total direct costs by listing the totals of each category.
J. Indirect Costs $________
To claim indirect costs, the applicant organization must have a current approved indirect cost rate agreement established with the cognizant Federal agency. A copy of the most recent indirect cost rate agreement must be provided with the application.
SAMPLE BUDGET
The rate is ___% and is computed on the following direct cost base of $________:
Personnel $____ + Fringe $____ + Travel $____ + Supplies $____ + Other $____ = Total $____
Total $____ x ___% = Total Indirect Costs
If the applicant organization does not have an approved indirect cost rate agreement, costs normally identified as indirect costs (overhead costs) can be budgeted and identified as direct costs.

TOOL 3.3: DEVELOP AND PRIORITIZE EVALUATION QUESTIONS BASED ON PROGRAM AND STAKEHOLDERS' NEEDS AND RESOURCES
INTRODUCTION
This tool will help you develop the questions you want the evaluation to answer about the program component/activity you are interested in evaluating. The flowchart below provides a description of where development and prioritization of evaluation questions fit within the program planning and program evaluation process.
[Flowchart: Understanding of program focus and priority areas]
LEARNING OBJECTIVES
Upon completion of this tool, you will be able to:
1. Generate process and outcome evaluation questions.
WHAT IS THE PURPOSE OF EVALUATION QUESTIONS?
Evaluation questions help focus your evaluation and should reflect the purpose of the evaluation and the priorities and needs of stakeholders. Therefore, key stakeholders should be involved in identifying, discussing, and coming to agreement on the questions the evaluation will answer. Ultimately, the answers to these questions will assist you in making improvements to your program.
HOW DO YOU DEVELOP AND PRIORITIZE EVALUATION QUESTIONS?
To focus your STD program evaluation, you will need to formulate and prioritize the evaluation questions. Here is some guidance.
1. Involve key stakeholders. Review, along with stakeholders, why you need to do the evaluation, how the evaluation results will be used, and by whom. Make sure that the evaluation can be conducted with the available resources, including funds and personnel with relevant expertise. Review the logic model of the program component/activity you need to evaluate. This will generate discussion and evaluation questions addressing multiple dimensions (e.g., outcomes, operations) of the issue to be evaluated.
2. Brainstorm evaluation questions. Brainstorm a list of questions about the specific component/activity you want to evaluate by posing the following: What are the questions we want the evaluation to answer? As part of the brainstorming process, encourage STD program staff and other stakeholders to voice possible questions they have about program operations, implementation, and outcomes. You need to ensure that your stakeholders' range of interests in, concerns about, and perspectives on your STD program are represented and that they buy into the entire evaluation process. Use your logic model to help you develop questions. For instance, if you decided to evaluate the community screening activity because you suspect the screening is not reaching the right population, look back at the logic model to identify the process and outcome components linked to that activity and generate questions accordingly. Sample evaluation questions on outreach activities targeting at-risk MSM that might result from a brainstorming session include:
Are the prevention materials (e.g., pamphlets, posters) appropriate for MSM at high risk?
Did MSM read the prevention materials?
Were new STD screening and counseling pamphlets distributed to appropriate locations by the disease intervention specialists (DIS) and outreach workers?
Did screening and counseling events reach the target population?
3. Group questions by theme. After you have generated a set of evaluation questions that you would like to answer, you will need to group the questions by theme. Grouping questions by theme will allow you to identify those that measure similar topics and to begin to prioritize. Classify the questions as either process or outcome questions. This will help you decide the type of evaluation (see Tool 3.1) you will apply. Below you will find the evaluation questions identified during the brainstorming activity. These were grouped by theme (e.g., operations, behaviors) and identified as either process or outcome questions. A rationale for each grouping is also provided.
Operational Evaluation Questions
Are the prevention materials (e.g., pamphlets, posters) appropriate for MSM at high risk? (process)
Were new STD screening and counseling pamphlets distributed to appropriate locations by the disease intervention specialists (DIS) and outreach workers? (process)
Did screening and counseling events reach the target population? (process)
Rationale for grouping: Each of these questions relates to the operation of the intervention.
Behavioral and Awareness Evaluation Questions
Did at-risk MSM read the prevention materials? (outcome)
Were at-risk MSM aware of the prevention messages conveyed by the media campaign? (outcome)
To what extent did the outreach activities increase the number of at-risk MSM coming to the STD clinic for testing? (outcome)
Rationale for grouping: Each of these questions relates to behavioral and awareness outcomes associated with exposure to the intervention.
Health Outcome Evaluation Question
How have the P&S syphilis rates decreased among the MSM population? (outcome)
Rationale for grouping: This question deals directly with a health outcome of the target population.
4. Prioritize evaluation questions. You may need to narrow down the number of proposed evaluation questions based on their relevance to the purpose and uses of your evaluation and on your available resources. For example, you may realize that outcome evaluation questions would require substantial resources (e.g., evaluation personnel; time commitments of program staff to collect data) that are not available. (Consider hiring an evaluation consultant, if funds are available.) Therefore, prioritize the list of proposed evaluation questions by asking: Why this question? Make sure that each question is clear to all stakeholders. The following criteria can help you determine the importance, utility, and feasibility of each question as it pertains to your STD program evaluation needs. Ensure that a question:
Is important to your program staff and stakeholders.
Reflects key goals and objectives of your program.
Reflects key elements of your program logic model.
Will provide information you can act upon to make program improvements.
Can be answered using available resources (e.g., budget, personnel) and within the appropriate timeframe.
Will be supported (in terms of resources needed to answer the question) by the STD program.
Assess each of your evaluation questions based on how well it meets these criteria. Questions that do not meet all of the criteria should be considered lower priority. It is particularly important to address how each of the questions, once answered, is likely to provide information with which you can make program improvements. See the Appendix in this tool for a worksheet to help you apply the criteria to your evaluation questions. You may have to negotiate with stakeholders if it is not feasible to include some of their evaluation questions. For example, you may not be able to answer how much jail screening contributes to the reduction in syphilis cases, but you can determine whether jail screening is resulting in the discovery of new cases, and if not, why not.
5. Carefully examine and categorize the prioritized list of questions. Categorize the prioritized list of evaluation questions as either process or outcome evaluation questions. Process evaluation questions are primarily concerned with the implementation, or process, of delivering your STD prevention and control activities. Outcome evaluation questions are concerned with the effects of your STD program delivery and operations on the target population(s). Relate your questions to your logic model.
Make sure your process evaluation questions incorporate key process components of your logic model (i.e., inputs, activities, outputs) and your outcome evaluation questions incorporate key outcome components of the logic model (i.e., short-term, intermediate, and long-term outcomes).
SUMMARY CHECKLIST: Develop and Prioritize Evaluation Questions Based on Program and Stakeholders' Needs and Resources
CONCLUSION AND NEXT STEPS
An important step in focusing your evaluation involves developing and prioritizing evaluation questions that are relevant to your program and stakeholders' needs. In this tool you learned how you and your stakeholders can do this. Now that you have focused your evaluation, it is time to consider the indicators (see Tool 4.1) that will help you answer the evaluation questions.
ACRONYMS USED IN THIS TOOL
CBO Community-based organization
DIS Disease Intervention Specialist
GC Gonorrhea
MSM Men who have sex with men
STD Sexually transmitted disease
KEY TERMS
Outcome evaluation: A type of evaluation that determines the effects of your program activities on the target population (e.g., changes in knowledge, attitudes, beliefs, and skills) or organization. The outcome components of a logic model (the right side) are used to plan an outcome evaluation.
Outcome evaluation questions: Evaluation questions concerned with your program outcomes. Such questions can address whether the desired outcomes of your program were achieved and whether the program produced changes in the target population(s)/organization.
Process evaluation: Also referred to as implementation evaluation, a type of evaluation that determines whether your program and its activities were implemented as intended, and why or why not. The information gathered is used for refining or modifying these activities and related procedures. The inputs, activities, and outputs of a logic model (the left side) are used to plan a process evaluation.
Process evaluation questions: Evaluation questions concerned with the implementation of your program and specific program components/activities. You develop process evaluation questions to examine the development and delivery of your program's services and activities, as well as its operations and administrative functions.
CASE SCENARIO
Two years ago, Cactus City, a medium-sized city located in the American Southwest, experienced an outbreak of reported gonorrhea (GC) among Hispanic/Latino males and females, specifically among Mexican Americans (Americans whose parents are Mexican). Increased GC testing in the two STD clinics and multiple family planning clinics in the city has identified many more infections but has not reduced overall morbidity. The City Health Officer, under pressure from the mayor's office (because of recent local news media coverage of the outbreak), has decided to make a concerted effort to bring GC under control, especially in the Mexican American community. He called a meeting of the city STD, family planning, laboratory, surveillance, and budget office directors to develop a plan of action. The plan of action called for a number of steps to be taken, including: 1) development of new, or revision of existing, STD prevention materials to be disseminated by all city clinics that serve the target population; 2) provision of updated information about the outbreak and reminders to relevant health providers to report all infections; and 3) expanded efforts by Hispanic/Latino/Mexican/Mexican American CBOs to distribute STD prevention awareness materials and condoms to their clients.
At the start of the effort, program staff and other stakeholders brainstormed the following possible evaluation questions:
1. Are the STD prevention materials (e.g., pamphlets, fact sheets, posters) that were developed or revised for use by the city health clinics culturally appropriate for the target population (i.e., Mexican American males and females)? (process)
2. To what extent did the city health clinics disseminate the STD prevention materials to the target population? (process)
3. To what extent was updated information about the outbreak provided to health providers who serve the target population? (process)
4. What were the barriers to the implementation of the action plan? (process)
5. Did health providers who serve the target population consistently report new STD infections according to reporting guidelines? (outcome)
6. Were Mexican American clients satisfied with the GC screening and treatment services they received in city clinics? (process)
7. Did CBOs have appropriately trained staff to adequately reach the target population? (process)
8. To what extent did Hispanic/Latino/Mexican/Mexican American CBOs distribute STD prevention awareness materials (e.g., fact sheets; referral information) to Mexican American clients at risk of infection? (process)
9. To what extent did Hispanic/Latino/Mexican/Mexican American CBOs distribute condoms to Mexican American clients at risk of infection? (process)
10. Did Hispanic/Latino/Mexican/Mexican Americans who received prevention awareness materials find them to be culturally appropriate? (process)
11. Were Hispanic/Latino/Mexican/Mexican Americans who received prevention awareness materials satisfied with the amount of information provided? (process)
12. Did Hispanic/Latino/Mexican/Mexican Americans who received prevention awareness materials share this information with their family and friends? (outcome)
13. Was there an increase in awareness of the GC outbreak in the Mexican American community? (outcome)
14. Was there an increase in the self-reported consistent use of condoms by Mexican American males following implementation of the action plan? (outcome)
15. Did Mexican American females request that their partners use condoms consistently following implementation of the action plan? (outcome)
16. To what extent did the GC plan of action reduce the reported prevalence of GC within the Mexican American population? (outcome)
17. Was there an increase in Mexican Americans requesting to be tested for STDs following implementation of the action plan? (outcome)
After examining these questions, you and your colleagues decided to group your questions into four themes. Your first theme was called operations and included questions 1-4 and 6-9. This theme was chosen because it related to how the action plan operated. The second theme, acceptability, included questions related to the target population's reaction to the materials/activities (questions 10-12). The third theme was called knowledge and behavioral outcomes and included questions 5, 13-15, and 17. This theme was chosen because these questions all deal with knowledge and behavioral outcomes related to the action plan. The fourth theme, health outcome, included question 16, a question related to the expected health outcome of the action plan. After grouping the evaluation questions by theme, you determined which evaluation questions you would attempt to answer.
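If the brainstormed questions are kept in an electronic list, the grouping and process/outcome tagging can be recorded in a simple structure such as the one sketched below. This is only one illustrative way to track the case scenario's groupings; the theme labels and question numbers mirror the text above, and nothing about the structure itself is prescribed by this tool.

from collections import defaultdict

# Illustrative sketch: one way to record the case scenario's grouping of the 17
# brainstormed questions by theme and by type (process vs. outcome). Question
# numbers refer to the list above; the structure itself is not prescribed.
question_tags = {
    1: ("operations", "process"),   2: ("operations", "process"),
    3: ("operations", "process"),   4: ("operations", "process"),
    5: ("knowledge and behavioral outcomes", "outcome"),
    6: ("operations", "process"),   7: ("operations", "process"),
    8: ("operations", "process"),   9: ("operations", "process"),
    10: ("acceptability", "process"), 11: ("acceptability", "process"),
    12: ("acceptability", "outcome"),
    13: ("knowledge and behavioral outcomes", "outcome"),
    14: ("knowledge and behavioral outcomes", "outcome"),
    15: ("knowledge and behavioral outcomes", "outcome"),
    16: ("health outcome", "outcome"),
    17: ("knowledge and behavioral outcomes", "outcome"),
}

by_theme = defaultdict(list)
for number, (theme, question_type) in question_tags.items():
    by_theme[theme].append((number, question_type))

for theme, members in by_theme.items():
    print(f"{theme}: {members}")

A listing like this makes it easy to confirm that every brainstormed question has been assigned to exactly one theme before prioritization begins.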
While the City Health Officer had been under pressure from the mayor's office to make a concerted effort to bring GC under control, you and your colleagues knew that your action plan could not make that happen unless the plan was implemented according to its design. Therefore, you initially decided to address the questions related to the theme of operations and one of the behavioral outcome evaluation questions (question 17, on whether more Mexican Americans requested STD testing following implementation of the action plan). When you and your colleagues reviewed these evaluation questions, you determined that not enough resources were available to address all of the operations questions. You decided to address those questions that: 1) were the most important in assessing the major objective of implementing the action plan according to its design; 2) would provide information that the staff could easily act upon; 3) reflected the key components of the action plan logic model's outputs and intermediate outcomes; and 4) could be supported with the resources of the STD program. You ended up with the following finalized list of process and outcome evaluation questions:
1. Are the STD prevention materials (e.g., pamphlets, fact sheets, posters) that were developed or revised for use by the city health clinics culturally appropriate for the target population (i.e., Mexican American males and females)? (process)
2. To what extent did the city health clinics disseminate the STD prevention materials to the target population? (process)
3. To what extent was updated information about the outbreak provided to health providers who serve the target population? (process)
4. Did CBOs have appropriately trained staff to adequately reach the target population? (process)
5. To what extent did Hispanic/Latino/Mexican/Mexican American CBOs distribute STD prevention awareness materials (e.g., fact sheets; referral information) to Mexican American clients? (process)
6. To what extent did Hispanic/Latino/Mexican/Mexican American CBOs distribute condoms to Mexican American clients? (process)
7. What were the barriers to the implementation of the action plan? (process)
8. Was there an increase in Mexican Americans requesting to be tested for STDs following implementation of the action plan? (outcome)
REFERENCES
King, J. A., Morris, L. L., & Fitz-Gibbon, C. T. (1987). How to assess program implementation. Thousand Oaks, CA: Sage.
Patton, M. Q. (1997). Utilization-focused evaluation: The new century text (3rd ed.). Thousand Oaks, CA: Sage.
Rossi, P. H., Freeman, H. E., & Lipsey, M. W. (1999). Evaluation: A systematic approach (6th ed.). Thousand Oaks, CA: Sage.
APPENDIX
WORKSHEET: Criteria to Use When Prioritizing the Evaluation Questions
List your evaluation questions in the left-hand column, then check whether each question meets the six criteria listed across the remaining columns.
Criteria: 1 = Important to stakeholders. 2 = Reflects key goals and objectives of program. 3 = Reflects key elements of program logic model. 4 = Provides information which can be acted upon to make program improvements. 5 = Can be answered with available resources. 6 = Will be supported by the STD program.
Check the box if the evaluation question meets each criterion.
Evaluation Questions | 1 | 2 | 3 | 4 | 5 | 6
1 | | | | | |
2 | | | | | |
3 | | | | | |
4 | | | | | |
5 | | | | | |
6 | | | | | |
7 | | | | | |
8 | | | | | |
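The prioritization worksheet in the Appendix can also be tallied electronically. The sketch below is illustrative only: the criteria text comes from the worksheet, but the example question and the yes/no judgments are hypothetical placeholders.

# Illustrative sketch: applying the worksheet's six prioritization criteria to a
# single evaluation question. The example question and the True/False judgments
# are hypothetical placeholders; only the criteria text comes from the worksheet.
CRITERIA = [
    "Important to stakeholders",
    "Reflects key goals and objectives of program",
    "Reflects key elements of program logic model",
    "Provides information which can be acted upon to make program improvements",
    "Can be answered with available resources",
    "Will be supported by the STD program",
]

def priority(ratings):
    # Per the tool, a question that does not meet all of the criteria is lower priority.
    return "higher priority" if all(ratings) else "lower priority"

example_question = "To what extent did the city health clinics disseminate the STD prevention materials?"
ratings = [True, True, True, True, False, True]  # hypothetical: criterion 5 (resources) not met

for criterion, met in zip(CRITERIA, ratings):
    print(f"[{'x' if met else ' '}] {criterion}")
print(f"{example_question} -> {priority(ratings)}")

Recording the ratings this way keeps a simple audit trail of why each question was kept or set aside when the prioritized list is shared with stakeholders.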