Assignment: Drafting a Process Evaluation

The steps for process evaluation outlined by Bliss and Emshoff (2002) may seem very similar to those for conducting other types of evaluation that you have learned about in this course; in fact, it is the purpose and timing of a process evaluation that most distinguish it from other types of evaluation. A process evaluation is conducted during the implementation of the program to evaluate whether the program has been implemented as intended and how the delivery of a program can be improved. A process evaluation can also be useful in supporting an outcome evaluation by helping to determine the reason behind program outcomes.

There are several reasons for conducting process evaluation throughout the implementation of a program. Chief among them is to compare the program that is being delivered to the original program plan, in order to identify gaps and make improvements. Therefore, documentation from the planning stage may prove useful when planning a process evaluation.

 

For this Assignment, you either build on the work that you completed in Weeks 6, 7, and 8 related to a support group for caregivers, or draw on your knowledge of a program with which you are familiar. Review the resource “Workbook for Designing a Process Evaluation.”

 

Submit a 4- to 5-page plan for a process evaluation. Include, at minimum, the following information:

A description of the key program elements

A description of the strategies that the program uses to produce change

A description of the needs of the target population

An explanation of why a process evaluation is important for the program

A plan for building relationships with the staff and management

Broad questions to be answered by the process evaluation

Specific questions to be answered by the process evaluation

A plan for gathering and analyzing the information

  • attachment

    USW1_SOCW_6311_Week09_workbookForDesigningAProcessEvaluation.pdf

    Workbook for Designing a Process Evaluation

     

    Produced for the Georgia Department of Human Resources, Division of Public Health

    By Melanie J. Bliss, M.A., and James G. Emshoff, Ph.D.
    Department of Psychology, Georgia State University

     

    July 2002

     

     

    Evaluation Expert Session July 16, 2002 Page 1

     

    What is process evaluation?

    Process evaluation uses empirical data to assess the delivery of programs. In contrast to outcome evaluation, which assesses the impact of the program, process evaluation verifies what the program is and whether it is being implemented as designed. Thus, process evaluation asks “what,” and outcome evaluation asks, “so what?”

    When conducting a process evaluation, keep in mind these three questions:

    1. What is the program intended to be?
    2. What is delivered, in reality?
    3. Where are the gaps between program design and delivery?

    This workbook will serve as a guide for designing your own process evaluation for a program of your choosing. There are many steps involved in the implementation of a process evaluation, and this workbook will attempt to direct you through some of the main stages. It will be helpful to think of a delivery service program that you can use as your example as you complete these activities.

    Why is process evaluation important?

    1. To determine the extent to which the program is being implemented according to plan
    2. To assess and document the degree of fidelity and variability in program implementation, expected or unexpected, planned or unplanned
    3. To compare multiple sites with respect to fidelity
    4. To provide validity for the relationship between the intervention and the outcomes
    5. To provide information on what components of the intervention are responsible for outcomes
    6. To understand the relationship between program context (i.e., setting characteristics) and program processes (i.e., levels of implementation)
    7. To provide managers feedback on the quality of implementation
    8. To refine delivery components
    9. To provide program accountability to sponsors, the public, clients, and funders
    10. To improve the quality of the program, as the act of evaluating is an intervention

     

     


    Stages of Process Evaluation

    1. Form Collaborative Relationships
    2. Determine Program Components
    3. Develop Logic Model*
    4. Determine Evaluation Questions
    5. Determine Methodology
    6. Consider a Management Information System
    7. Implement Data Collection and Analysis
    8. Write Report**

    Also included in this workbook:

    a. Logic Model Template
    b. Pitfalls to Avoid
    c. References

     

    Evaluation can be an exciting, challenging, and fun experience.

    Enjoy!

     

    * Previously covered in Evaluation Planning Workshops.

    ** Will not be covered in this expert session. Please refer to the Evaluation Framework and Evaluation Module of the FHB Best Practice Manual for more details.

     

     

     

     

     


    Forming collaborative relationships

    A strong, collaborative relationship with program delivery staff and management will likely result in the following:

    Feedback regarding evaluation design and implementation
    Ease in conducting the evaluation due to increased cooperation
    Participation in interviews, panel discussions, meetings, etc.
    Increased utilization of findings

    Seek to establish a mutually respectful relationship characterized by trust, commitment, and flexibility.

    Key points in establishing a collaborative relationship:

    Start early. Introduce yourself and the evaluation team to as many delivery staff and management personnel as early as possible.

    Emphasize that THEY are the experts, and you will be utilizing their knowledge and information to inform your evaluation development and implementation.

    Be respectful of their time, both in person and on the telephone. Set up meeting places that are geographically accessible to all parties involved in the evaluation process.

    Remain aware that, even if they have requested the evaluation, it may often appear as an intrusion upon their daily activities. Attempt to be as unobtrusive as possible and request their feedback regarding appropriate times for on-site data collection.

    Involve key policy makers, managers, and staff in a series of meetings throughout the evaluation process. The evaluation should be driven by the questions that are of greatest interest to the stakeholders. Set agendas for meetings and provide an overview of the goals of the meeting before beginning. Obtain their feedback and provide them with updates regarding the evaluation process. You may wish to obtain structured feedback; sample feedback forms appear throughout the workbook.

    Provide feedback regarding evaluation findings to the key policy makers, managers, and staff when and as appropriate. Use visual aids and handouts. Tabulate and summarize information. Make it as interesting as possible.

    Consider establishing a resource or expert “panel” or advisory board that is an official group of people willing to be contacted when you need feedback or have questions.

     

     

     

     

     


    Determining Program Components

    Program components are identified by answering the questions who, what, when, where, and how as they pertain to your program.

    Who: the program clients/recipients and staff
    What: activities, behaviors, materials
    When: frequency and length of the contact or intervention
    Where: the community context and physical setting
    How: strategies for operating the program or intervention

    BRIEF EXAMPLE:
    Who: elementary school students
    What: fire safety intervention
    When: 2 times per year
    Where: in students’ classroom
    How: group-administered intervention, small-group practice

    1. Instruct students what to do in case of fire (stop, drop, and roll).
    2. Educate students on calling 911 and have them practice on play telephones.
    3. Educate students on how to pull a fire alarm, how to test a home fire alarm, and how to change batteries in a home fire alarm. Have students practice each of these activities.
    4. Provide students with written information and have them take it home to share with their parents. Request parental signature to indicate compliance and target a 75% return rate.

    Points to keep in mind when determining program components:

    Specify activities as behaviors that can be observed.

    If you have a logic model, use the “activities” column as a starting point.

    Ensure that each component is separate and distinguishable from others.

    Include all activities and materials intended for use in the intervention.

    Identify the aspects of the intervention that may need to be adapted, and those that should always be delivered as designed.

    Consult with program staff, mission statements, and program materials as needed.

     

     

     

     

     


    Your Program Components

    After you have identified your program components, create a logic model that graphically portrays the link between program components and outcomes expected from these components.

    Now, write out a succinct list of the components of your program.

    WHO:

    WHAT:

    WHEN:

    WHERE:

    HOW:

     

     

     

     

     


    What is a Logic Model?

    A logical series of statements that link the problems your program is attempting to address (conditions), how it will address them (activities), and the expected results (immediate and intermediate outcomes, long-term goals).

    Benefits of the logic model include:

    helps to develop clarity about a project or program
    helps to develop consensus among people
    helps to identify gaps or redundancies in a plan
    helps to identify the core hypothesis
    helps to succinctly communicate what your project or program is about

    When do you use a logic model?

    Use:
    – During any work, to clarify what is being done, why, and with what intended results
    – During project or program planning, to make sure that the project or program is logical and complete
    – During evaluation planning, to focus the evaluation
    – During project or program implementation, as a template for comparing to the actual program and as a filter to determine whether proposed changes fit or not

    This information was extracted from the Logic Models: A Multi-Purpose Tool materials developed by Wellsys Corporation for the Evaluation Planning Workshop Training. Please see the Evaluation Planning Workshop materials for more information. Appendix A has a sample template of the tabular format.

     

     

     

     

     


    Determining Evaluation Questions

    As you design your process evaluation, consider what questions you would like to answer. It is only after your questions are specified that you can begin to develop your methodology. Considering the importance and purpose of each question is critical.

    BROADLY…

    What questions do you hope to answer? You may wish to turn the program components that you have just identified into questions assessing:

    Was the component completed as indicated?
    What were the strengths in implementation?
    What were the barriers or challenges in implementation?
    What were the apparent strengths and weaknesses of each step of the intervention?
    Did the recipient understand the intervention?
    Were resources available to sustain project activities?
    What were staff perceptions?
    What were community perceptions?
    What was the nature of the interaction between staff and clients?

    These are examples. Check off what is applicable to you, and use the space below to write additional broad, overarching questions that you wish to answer.

     

     

     

     

     


    SPECIFICALLY … Now, make a list of all the specific questions you wish to answer, and organize your questions categorically. Your list of questions will likely be much longer than your list of program components. This step of developing your evaluation will inform your methodologies and instrument choice. Remember that you must collect information on what the program is intended to be and what it is in reality, so you may need to ask some questions in 2 formats. For example:

    “How many people are intended to complete this intervention per week?” “How many actually go through the intervention during an average week?”

    Consider what specific questions you have. The questions below are only examples! Some may not be appropriate for your evaluation, and you will most likely need to add additional questions. Check off the questions that are applicable to you, and add your own questions in the space provided.

    WHO (regarding client):
    Who is the target audience, client, or recipient?
    How many people have participated?
    How many people have dropped out?
    How many people have declined participation?
    What are the demographic characteristics of clients?
        Race, Ethnicity, National Origin, Age, Gender, Sexual Orientation, Religion, Marital Status, Employment, Income Sources, Education, Socio-Economic Status
    What factors do the clients have in common?
    What risk factors do clients have?
    Who is eligible for participation?
    How are people referred to the program?
    How are they screened?
    How satisfied are the clients?

    YOUR QUESTIONS:

     

     

     

     

     


    WHO (regarding staff):
    Who delivers the services?
    How are they hired?
    How supportive are staff and management of each other?
    What qualifications do staff have?
    How are staff trained?
    How congruent are staff and recipients with one another?
    What are staff demographics? (See client demographic list for specifics.)

    YOUR QUESTIONS:

    WHAT:
    What happens during the intervention? What is being delivered?
    What are the methods of delivery for each service (e.g., one-on-one, group session, didactic instruction, etc.)?
    What are the standard operating procedures?
    What technologies are in use?
    What types of communication techniques are implemented?
    What type of organization delivers the program?
    How many years has the organization existed?
    How many years has the program been operating?
    What type of reputation does the agency have in the community? What about the program?
    What are the methods of service delivery?
    How is the intervention structured?
    How is confidentiality maintained?

    YOUR QUESTIONS:

    WHEN:
    When is the intervention conducted?
    How frequently is the intervention conducted? At what intervals?
    At what time of day, week, month, year?
    What is the length and/or duration of each service?

     

     

     

     

     

     


    YOUR QUESTIONS:

    WHERE:
    Where does the intervention occur?
    What type of facility is used?
    What is the age and condition of the facility?
    In what part of town is the facility? Is it accessible to the target audience?
    Does public transportation access the facility?
    Is parking available?
    Is child care provided on site?

    YOUR QUESTIONS:

    WHY:
    Why are these activities or strategies implemented, and why not others?
    Why has the intervention varied in ability to maintain interest?
    Why are clients not participating?
    Why is the intervention conducted at a certain time or at a certain frequency?

    YOUR QUESTIONS:

     

     

     

     

     


    Validating Your Evaluation Questions

    Even though all of your questions may be interesting, it is important to narrow your list to questions that will be particularly helpful to the evaluation and that can be answered given your specific resources, staff, and time.

    Go through each of your questions and consider it with respect to the validation questions below, which may be helpful in streamlining your final list of questions. Revise your worksheet or list of questions until you can answer “yes” to all of these questions. If you cannot answer “yes” for a question, consider omitting it from your evaluation.

    Validation checklist (answer Yes or No for each item):

    Will I use the data that will stem from these questions? Yes / No

    Do I know why each question is important and/or valuable? Yes / No

    Is someone interested in each of these questions? Yes / No

    Have I ensured that no questions are omitted that may be important to someone else? Yes / No

    Is the wording of each question sufficiently clear and unambiguous? Yes / No

    Do I have a hypothesis about what the “correct” answer will be for each question? Yes / No

    Is each question specific without inappropriately limiting the scope of the evaluation or probing for a specific response? Yes / No
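    The screening rule behind this checklist is mechanical: a draft question survives only if every validation check can be answered “yes.” As a rough illustration only, it can be sketched in a few lines of Python. The check wording, draft questions, and answers below are hypothetical placeholders, not part of the workbook.

```python
# Sketch of the question-screening rule above: keep a draft question only if
# every validation check is answered "yes". All names and data here are
# hypothetical illustrations, not part of the workbook.

CHECKS = [
    "Will I use the data that will stem from this question?",
    "Do I know why this question is important and/or valuable?",
    "Is someone interested in this question?",
    "Is the wording sufficiently clear and unambiguous?",
]

def keep_question(answers: dict) -> bool:
    """True only if every validation check got a 'yes' (True)."""
    return all(answers.get(check, False) for check in CHECKS)

# Two hypothetical draft questions and their check answers:
drafts = {
    "How many clients complete the intervention per week?":
        {check: True for check in CHECKS},
    "Is the program good?":  # vague wording fails one check
        {**{check: True for check in CHECKS},
         "Is the wording sufficiently clear and unambiguous?": False},
}

kept = [q for q, answers in drafts.items() if keep_question(answers)]
# kept -> ["How many clients complete the intervention per week?"]
```

    In practice the checklist is of course worked through on paper with stakeholders; the sketch only makes explicit that the rule is a strict all-checks-must-pass filter.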