Health Canada
First Nations & Inuit Health

A Guide for First Nations on Evaluating Health Programs

Warning: This content was archived on July 25, 2013.

Archived Content

Information identified as archived on the Web is for reference, research or recordkeeping purposes. It has not been altered or updated after the date of archiving. Web pages that are archived on the Web are not subject to the Government of Canada Web Standards. As per the Communications Policy of the Government of Canada, you can request alternate formats on the "Contact Us" page.

Help on accessing alternative formats, such as Portable Document Format (PDF), Microsoft Word and PowerPoint (PPT) files, can be obtained in the alternate format help section.


Introduction

This guide is published by the Health Funding Arrangements Division, Program Policy Transfer Secretariat and Planning Directorate, First Nations and Inuit Health Branch (FNIHB), Health Canada. It presents basic information on evaluating health programs for First Nations communities that are taking control of their health programs under the Department's Health Transfer Initiative.

This guide is not a complete textbook on evaluating health programs in First Nations communities. Instead, it will help First Nations communities to understand what evaluation is and how it works. It also explains the need for evaluations and the part they can play in helping to conduct better community health programs.

Evaluation involves looking closely at programs to find out whether they did what they were supposed to do. It also means looking to see whether the results of the health program activities were worth the time and money communities spent.

Evaluators' reports are the final product of the evaluation process. Most of the information evaluators need to write their reports must come from communities, not from evaluators, so the active involvement of communities is essential. Communities therefore need to develop and collect program information well before evaluation begins.

For more information on transferring control of health programs to First Nations and Inuit communities please refer to the following handbooks:

Transferring Control of Health Programs to First Nations and Inuit Communities: Handbook 1 - An Introduction to Three Approaches provides an introduction to the transfer of control of health programs and summarizes First Nations and Inuit Health Branch (FNIHB) policies concerning the control of health programs by First Nations and Inuit communities across Canada.

Transferring Control of Health Programs to First Nations and Inuit Communities: Handbook 2 - The Health Services Transfer provides information about the health services transfer process and the procedures and policies for planning under the transfer approach.

Transferring Control of Health Programs to First Nations and Inuit Communities: Handbook 3 - After the Transfer: The New Environment explains what happens after transfer is completed.

Basic Information About Program Evaluation

This chapter answers two questions about program evaluation: What is program evaluation and why is it necessary?

What Is Program Evaluation?

Program evaluation is a way of measuring whether programs are doing what they are supposed to do. Evaluation means collecting information about program activity effectiveness and then comparing expected to actual results. Every program has one or more results that it wants to accomplish. Evaluations help communities to see if their programs are achieving these results.

Why Do Communities Have to Evaluate their Health Programs?

Evaluations help communities to understand how their health programs are working. At any point in the life of a program, communities need to know whether it is doing what it set out to do. The Transfer Framework therefore requires that an evaluation be completed every five years. These evaluations address how effectively community health programs meet their objectives and identify changes in the health status of community members. Communities agree to provide their evaluation reports to the Minister before the end of the five-year transfer period to allow for joint discussion and analysis with FNIHB before renewal of transfer agreements.

Evaluations show other communities what works. For instance, a prenatal program may have had a very positive impact on pregnancies in one community. Other communities will want to know how to make such a program work for them. Evaluations describe what happens in programs, enabling communities to learn from one another's experiences and to adapt successful approaches to their own circumstances.

Evaluations show staff the effects of their work. People working on programs need to see their work in a larger picture, and seeing past day-to-day challenges is sometimes difficult. Evaluations can show people whether their work is making a difference. Done properly, evaluations give communities a wealth of information about their programs and lead to program improvements from which all community members benefit.

Linkage Between Evaluation and Program Management

As we mentioned earlier, evaluation is an essential activity that shows whether health programs are producing good results. Figure 1 below shows how evaluation fits into the Community Health Plan and shows the linkage between evaluation and program management.

Figure 1: Community Health Needs Assessment pie diagram

Community Health Needs Assessment

Needs assessments are part of the planning stage of programs and identify community health priorities. Needs assessments help communities to understand the following:

  • what kinds of health problems communities are experiencing;
  • what causes these health problems;
  • what resources are available to address these health problems;
  • what goals and objectives communities need to write to help solve these health problems;
  • which community members have the most urgent needs; and
  • how best to meet the needs of community members.

Community Health Plan (CHP)

The CHP is the first step in planning what health programs communities want to provide for their members. When developing the CHP, communities have to begin thinking about evaluation. To illustrate, the CHP should specify the following four items that tie in with evaluation:

  • the programs and activities that communities plan to conduct;
  • the goals and objectives for each;
  • the indicators that communities use in their evaluations to measure how well programs meet their objectives; and
  • the day-to-day records and other evaluation information (data) that staff collect when programs are running.

Communities have to plan for evaluation from "day one" in the CHP.

Operating Programs

When communities conduct community health programs, they need to make sure that program staff collect the right kind of information on how well programs are working. Staff members need to collect this information every day. Communities need to identify the resources, people, money, materials and time needed to produce the desired results. If communities do not collect this information daily, they cannot give evaluators the tools they need to judge whether programs are improving community health. In other words - "No information, no evaluation."

Program Evaluation

The following questions form the core of evaluation and apply to all types of community health activities.

Did we do what we said we would do?

The answers to this question describe work done in programs and its relevance in meeting program objectives. Indicators provide the criteria that communities can use to measure program success.

What did we learn about what did and did not work?

The answers to this question give communities reasons for their success in conducting programs. Finding out what worked well in programs encourages participation in evaluation that focuses on success, learning and action. Health care providers, users and other interested parties need to participate actively in all phases of evaluation. Their involvement ensures that evaluations include ways to use the results throughout the life of programs. Participation in evaluation promotes ownership, focuses on community needs and encourages follow-up action.

Did this work make a difference?

The answers to this question measure program successes in changing knowledge, attitudes, skills and behaviour. Program success indicators identify the benefits that communities expect to gain from program work. Also, they provide the criteria against which to measure change throughout the life of programs.

What could we do differently?

Evaluations help communities to learn, and often the best way to learn comes from examining the challenges that programs present.

How do we plan to use evaluation findings for continuous learning?

Community participation in evaluation ensures that evaluation results are used throughout the life of programs.

Seeking answers to these key questions guides the evaluation process throughout the life of programs. By answering the questions, communities learn how to shape current and future programs.

Using the Results of the Evaluation

Communities can use evaluation results to help make decisions about their programs, to show where and how they need to revise community health plans, and to generate new program ideas.

Program Reformulation and Change

Program evaluation gives managers the facts they need to make informed decisions about community programs. Figure 2 below shows that evaluation is an ongoing process throughout the life of programs.

Figure 2: Flowchart of Program Reformulation and Change

How to Prepare for Program Evaluation

Introduction

If communities have developed CHPs, they have already built the foundation for evaluating their health programs. A complete CHP lists the programs and activities, goals and objectives and the indicators for each program and activity. It also lists the information or data about program results.

Communities may want to use the points in the next few pages as a checklist to make sure that their CHPs are complete and updated. If communities have not yet completed CHPs, the information here can help them to put one together.

Program

Evaluators need to know the purpose of each program. For that reason, community members need to ask the following questions for each of their community health programs:

  • Does this program meet a need?
  • What is the purpose of this program?

For example, if teenage pregnancy is a concern in communities, include a teenage pregnancy program in CHPs. If it is not a concern, programs dealing with teenage pregnancy are a waste of time and money. Asking whether programs listed in the CHP are meeting community needs is an important step towards evaluating them.

The following definitions pertain to evaluations.

Goal

Goals are broad statements that describe what programs or activities should achieve. For example, the goal of a diabetes program could be "To help community members who have diabetes to lead more comfortable lives through learning about the benefits of better nutrition and regular exercise."

Objective

Objectives state exactly what programs should do. Objectives are identifiable and measurable actions to be completed by a specific time. When objectives are stated in measurable, time-related terms, evaluators can find out if objectives have been met. The following example shows a program objective:

The Maternal and Child Health Program has the following basic objective: "By December 1999, to reduce by 40 percent the number of fatal and non-fatal injuries to children younger than 10 years of age."
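Because the objective is stated in measurable, time-related terms, checking whether it has been met reduces to simple arithmetic once baseline and follow-up counts are available. The following sketch is a hypothetical illustration (the counts and function name are invented, not part of the guide):

```python
def objective_met(baseline_count, current_count, target_reduction_pct):
    """Return the actual percent reduction and whether it meets the target."""
    reduction_pct = (baseline_count - current_count) / baseline_count * 100
    return reduction_pct, reduction_pct >= target_reduction_pct

# Hypothetical figures: 50 childhood injuries at baseline, 28 at follow-up,
# measured against the 40 percent reduction objective.
reduction, met = objective_met(50, 28, 40)
print(f"{reduction:.0f}% reduction; objective met: {met}")
# 44% reduction; objective met: True
```

A vague objective ("reduce injuries") cannot be checked this way, which is why the guide stresses measurable wording.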

Activity

An activity is something communities do to help meet program objectives. Activities are the building blocks of programs; a program may include one or more activities. The four activities shown below would help to achieve the following objective: "By December 1999, to reduce by 40 percent the number of fatal and non-fatal injuries to children younger than 10 years of age."

  • Conduct parent education classes to instruct parents in preventing childhood poisoning, burns and injuries to occupants of motor vehicles.
  • Start school and playground inspections to identify and remove threats to child safety.
  • Give information and counselling sessions for teachers about teaching safety to school children.
  • Write a report on fatal and non-fatal childhood injuries to the band council recommending that council enact local by-laws to prevent pedestrian injuries to children.

Indicator

Evaluation indicators are signs, events or statistics that measure the success of programs or activities in meeting their objectives. Each program should include one or more success indicators. The greater the number of indicators, the more "rulers" available for measuring the effectiveness of programs or activities in improving the health of community members. For example, a success indicator for a prenatal program might be a decrease in the number of complications during or after delivery for both mothers and babies.

Hard and Soft Indicators

Indicators fall into two categories: "hard" indicators and "soft" indicators. Hard indicators are based on numbers - they are quantitative. For example, a decrease in the number of cases of a particular disease is a hard indicator. Soft indicators are not based on numbers - they are qualitative. An example of a soft indicator could be the satisfaction community members express about a program or activity.

Short-term and Long-term Indicators

Short-term indicators are signs that appear within a few weeks or months after programs or activities start and that show progress toward meeting objectives. Long-term indicators are signs that may take many months or years to show progress.

Data

Data is information about community programs and activities, collected from the time programs start. Sources include statistics, records, surveys, community meetings, and interviews. Examples of collected data are client information, immunization information, and user information gathered through community surveys and questionnaires.

Management Information System

A Management Information System (MIS) is computer software that collects, stores and processes program data efficiently. When collecting data, program managers should consider the following points:

  • what information to collect;
  • who will collect the information and how;
  • what program indicators to measure;
  • what reports to write;
  • how to store data securely;
  • how to restrict access to data; and
  • how staff can use the data system during working hours.

FNIHB, in partnership with First Nations, has developed a computerized information system called the Health Information System. This system helps First Nations to establish their own program priorities and an independent approach to managing community health issues.
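As a purely illustrative sketch of the day-to-day record-keeping described above (the field names and file format are assumptions, not those of the Health Information System), a program's daily activity log might be kept like this:

```python
import csv
import os

# Hypothetical field names for a daily program activity record.
FIELDS = ["date", "program", "activity", "participants"]

def append_record(path, record):
    """Append one daily activity record to a CSV file.

    Writes a header row the first time the file is created, so the
    stored data stays self-describing for later evaluation work.
    """
    new_file = not os.path.exists(path)
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow(record)

append_record("activity_log.csv", {
    "date": "1999-03-15",
    "program": "Childhood Injury Prevention",
    "activity": "Parent education class",
    "participants": 12,
})
```

The design point is simply that records are captured as activities happen, in a consistent structure, so evaluators are not left reconstructing program history from memory.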

The examples below show how programs, goals, objectives, activities, indicators and data relate for a few typical programs.

Program
Childhood Injury Prevention

  • Goal
    Decrease childhood injuries by changing the thinking of community members about childhood injury
  • Objective
    By December 1999, reduce the number of deaths and injuries to children younger than 10 years of age by 40%
  • Activity
    • Parent education classes
    • School and playground inspection
    • Teacher education
  • Indicator
    Reduction in numbers of childhood deaths and injuries
  • Data
    • Number of childhood deaths and injuries
    • Number of people attending parent education classes

Program
Adult Diabetes

  • Goal
    Decrease the incidence of complications due to adult diabetes
  • Objective
    By December 1999, reduce by 60% the number of diabetics presenting at the health clinic or hospital due to low or high blood sugar levels
  • Activity
    Provide information and counselling to community members with diabetes about diet, weight control, exercise and self-care
  • Indicator
    Number of adult diabetics needing treatment at home, in the clinic or in a hospital to stabilize blood-sugar levels
  • Data
    • Number of diabetics receiving treatment
    • Number of people receiving counselling

Program
Immunization

  • Goal
    Eliminate the incidence of childhood vaccine-preventable diseases through immunization
  • Objective
    By December 1999, fully immunize 90% of the infants and preschool children according to provincial immunization protocol
  • Activity
    • Operate regular immunization clinics
    • During pre- and post-natal visits, promote childhood immunization with mothers and caregivers
    • Encourage schools to require up-to-date immunization before school registration

  • Indicator
    • Well-child clinic
    • Number of contacts, content of discussions, increased knowledge, attitude and practice of mother or caregiver
    • Existence of school policy requiring full immunization
    • Percent increase in immunization levels
  • Data
    • Number of children vaccinated in community clinics
    • Number of mothers receiving information about immunization
    • Number of children vaccinated at school

This chapter has discussed the building blocks of evaluation - programs, activities, objectives, indicators and data. The information on how objectives, indicators and data for community programs and activities fit into the evaluation process should give communities enough background to begin working on evaluation plans. The next chapter will cover this topic.

Preparing an Evaluation Plan

Introduction

Within one year of signing Transfer Agreements, communities are responsible for developing evaluation plans. These plans outline a proposed strategy for conducting community evaluations including specific time frames and cost estimates. This chapter outlines how to develop an evaluation plan.

Communities may want to hire consultants to help plan and carry out evaluations; however, communities need to work closely with their consultants. Working with consultants ensures that evaluations meet community needs and tell community members what they most want to know about their programs.

When communities develop evaluation plans, it may be helpful to set up evaluation committees. These committees are responsible for the following activities:

  • developing terms of reference for program evaluations;
  • keeping the program evaluations on target; and
  • ensuring that evaluation plans reflect community goals and objectives.

These committees may include members of the health committee or board and people who use community health programs. They may also include people who know something about planning and evaluation, even if they are not directly involved in community health programs.

Evaluation Plan Components

An evaluation plan should include the following components:

  • terms of reference and evaluation questions;
  • an evaluation approach;
  • indicators and data sources; and
  • an evaluation work plan.

Terms of Reference and Evaluation Questions

Terms of reference are instructions from communities that tell evaluators what programs to examine. Also, terms of reference list the main questions for evaluators to answer. Terms of reference and evaluation questions need to reflect defined community objectives and priorities stated in the Community Health Plan (CHP).

Evaluation Approach

The evaluation approach outlines how to carry out the evaluation. It identifies the tools or methods to use to get the information needed to answer the questions listed in the terms of reference. The evaluation approach includes the model that structures the evaluation study. The selected evaluation model or design needs to be identified, and the methods and procedures clearly stated. (Please see an example of the Logic Model in the Appendix.)

Indicators and Data Sources

CHPs must list the indicators used to measure the effectiveness of programs and activities. Data about each program or activity must be available to answer each question in the terms of reference. CHPs must identify the sources of this data for each question and explain how to collect it.

Evaluation Work Plan

An evaluation work plan is a structured, specific plan for carrying out program evaluation.

Evaluation committees need to work with evaluators to develop work plans for conducting evaluations. Evaluation work plans should provide answers to the following questions:

  • Who will manage evaluations?
  • Who will perform each evaluation task?
  • What is the schedule for completing each evaluation task?
  • What is the schedule for producing the draft and final reports?
  • Who will receive copies of the reports?

Finally, evaluation work plans should state evaluation costs. Evaluation costs are usually about five percent of the costs of programs and activities. The funding guidelines supporting the completion of evaluation reports for communities operating under transfer agreements are available from First Nations and Inuit Health Branch (FNIHB).
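The five-percent rule of thumb above can be applied directly to program budgets when drafting a work plan. A small hypothetical example (the budget figures are invented for illustration):

```python
# Rule of thumb from the guide: evaluation costs are usually about
# five percent of the costs of programs and activities.
def estimate_evaluation_cost(program_costs, rate=0.05):
    """Return an evaluation budget estimated as a share of total program costs."""
    return sum(program_costs.values()) * rate

costs = {  # hypothetical annual program budgets, in dollars
    "Childhood Injury Prevention": 40000,
    "Adult Diabetes": 55000,
    "Immunization": 30000,
}
print(estimate_evaluation_cost(costs))  # 6250.0
```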

Checklist for Assessing Evaluation Plans

Introduction

The purpose of this checklist is to assess evaluation plans. It covers the following items:

  • terms of reference and evaluation questions;
  • evaluation approach;
  • indicators and data sources;
  • evaluation work plan; and
  • dissemination and use of evaluation results.

Terms of Reference and Evaluation Questions

Are the terms of reference for the evaluation stated clearly? Yes __  No __

Are all community health programs and activities clearly identified and described? Yes __  No __

Are the goals and objectives of the community health programs clearly written? Yes __  No __

Evaluation Approach

Has the evaluation model or design been identified? Yes __  No __

Are the methods and procedures clearly stated? Yes __   No __

Indicators and Data Sources

Do all community health programs have success indicators that measure results linked to improvement in the health status of the community? Yes __  No __

Is there a management information system in place for collecting data? Yes __  No __

Are data sources such as statistics, records, questionnaires, interviews, and so on identified? Yes __  No  __

Evaluation Work Plan

Is the manager of the evaluation identified? Yes __   No __

Are the persons who will do each evaluation task identified? Yes __  No __

Is there a timetable for specific activities? Yes __ No __  

Is the budget covering costs of the evaluation included? Yes __  No __

Dissemination and Use of the Results

How will the results of the evaluation be communicated to community members?

How will the results be used to improve community health programs?

Who will receive a copy of the evaluation report?

 

Conducting Evaluations

Introduction

This chapter discusses the evaluation process and the need to communicate evaluation findings to communities. Also, it discusses how to use the information and recommendations in evaluators' reports to improve community programs.

The four main tasks in carrying out evaluations are the following:

  • collecting the data prepared since programs began;
  • answering the evaluation questions asked in the terms of reference and making recommendations;
  • analyzing why a program did or did not meet its objectives or produce the planned results; and
  • writing the draft and final evaluation reports.

The Four Main Evaluation Tasks

Step 1: Collecting the Data

Evaluators need the data or information - statistics, records, questionnaires, and the results of interviews collected since the beginning of programs and activities.

Step 2: Analyzing the Evaluation Data

Evaluators study the available data to gather as many facts as possible about programs or activities. Evaluators call these facts evaluation findings. Examining these findings enables evaluators to answer evaluation questions such as the following example:

Are there good reasons to continue conducting the Maternal Health Program in the same way?

The facts or evaluation findings are as follows:

  • New mothers are attending the prenatal clinics regularly.
  • Community members express positive feelings about the program.
  • Complications during and within two weeks of birth are lower than they were before the program started.
  • There is a waiting list for prenatal classes.

The evaluator concludes that the program is working well: hospital admissions are down 40 percent and costs are lower. The waiting list shows a need to expand the prenatal classes; otherwise, the program should continue as it is.

In this example, the evaluator answered an evaluation question and made a recommendation to expand the program.

Step 3: Explaining the Conclusions

Analysis of evaluation findings and conclusions is among the most useful information evaluation provides. This analysis explains why programs or activities were or were not successful. It is especially important to know why programs did not produce the expected results. For example, programs may have not met their objectives because people were not aware that the programs existed, or because the objectives were unrealistic. Communities can use this kind of information to make the necessary corrections.

Step 4: Writing the Evaluation Report

Final reports are the end products of evaluations. These reports need to contain information that communities can use to improve their programs.

Evaluation committees should discuss draft evaluation findings with evaluators. The committees should agree that any criticisms in the reports are fair and reasonable and present accurate pictures of health programs. They should discuss any problems concerning draft reports with the evaluators, and evaluators should consider committee comments when writing their final reports. Evaluation reports need to be useful to chiefs and councils and health boards because they are responsible to members of the community for meeting health care needs.

Work plans that are part of evaluation plans must state clearly what information the final report must include. The following list shows the essential components of an evaluation report.

The Summary

The summary is at the beginning of the evaluation report and gives the following information:

  • It states the questions the study answers, as requested in the terms of reference.
  • It presents the most important findings and conclusions of the study in two or three sentences.
  • It states the methods or approaches used to get the answers. (Examples are reviewing hospital records, reports from community health representatives, community surveys, and interviews with medical and non-medical staff.)

The Introduction

The introduction briefly describes each program, its objectives and duration, and the number of staff and their positions.

The Approach

The approach lists the indicators for each program or activity. Also, it adds more details on methods and approaches. (For example, how evaluators collected information and the information used for each indicator.)

The Evaluation Findings and Conclusions

The section dealing with evaluation findings and conclusions presents the findings about the program gained from studying the data and states the conclusions.

The following is an example showing how to present findings and conclusions in an evaluation report:

The Maternal Care Program has helped to improve the prenatal health of mothers. Attendance at prenatal clinics is high and growing, as shown by clinical records. Mothers accept the program and state that it has helped them to learn about nutrition and caring for newborn children, as shown by the survey done in June 1998. From the findings, we conclude that the program should continue to operate as it is.

The Analysis

The analysis should state why a program did or did not do what it is supposed to do. Statistical information (hard data) is useful but feedback from the community (soft data) is one of the best ways of finding out why things happened as they did. For this reason, ensure that the Management Information System is set up to receive community feedback. Also, ensure that surveys are included in the evaluation approach.

The following example shows how community feedback can help shape analysis:

Community members state that they need more information and counselling for diabetes, especially on diet, exercise, self-care and controlling weight. Few people, however, are attending quarterly information and counselling sessions. Community surveys show that few have changed their diets and done other things suggested by the counsellor.

Other data shows that the program has not resulted in lower blood sugar levels in the community. The data also shows that the number of diabetics using less insulin has declined only slightly since the program began operating two years ago.

Analysis pointed out three reasons why the program has not worked as expected. First, much of the information distributed to diabetics by the community is too hard to read; many people say they cannot understand it. Second, many recommended foods such as fruits and vegetables are too expensive or unavailable. Third, many counselling sessions were cancelled because the counsellor was unable to travel to the community in bad weather. Note that community feedback was the most important source of information in this example.

Recommendations

Evaluators need to make recommendations to improve programs. For example, in the Adult Diabetes Program, evaluators could recommend the following changes:

The community needs to find or develop easy-to-read pamphlets and other information concerning diabetes.

The community needs to find a local person to run the counselling sessions three times monthly or to establish a drop-in information centre.

Evaluators make recommendations based on conclusions and analysis. Chiefs and councils, health boards, and communities decide which recommendations to follow.

The Three Main Post-Evaluation Tasks

After evaluations are completed, three tasks remain to be done. They are the following:

  • communicating evaluation findings and conclusions to the community;
  • using evaluation results to improve community health programs; and
  • sending a copy of the evaluation report to FNIHB for joint discussion and analysis.

Using Evaluation Results

Managers of community health programs are responsible for correcting problems revealed by evaluations. Good evaluation information gives the facts that enable sound decision-making on how to improve community health programs.

Appendix: An Evaluation Model

Introduction

The literature in the field of evaluation discusses various evaluation models. These models serve mainly to explain evaluation, suggest a framework or plan for use, and define the evaluator's role and responsibilities. This section describes the Logic Model used to evaluate many health programs.

The Logic Model

The following are some practical applications of the Logic Model approach to evaluation:

  • provide program staff with a tool for planning and managing programs and for planning evaluations;
  • provide an efficient and effective way to communicate the essence of programs to program staff, communities, community leaders and other interested parties;
  • provide a flexible tool adaptable to programs of various sizes and levels of complexity - it helps to establish a clear picture of program scope, structure and function; and
  • provide clear program goals (short-term and long-term) which in turn serve to define program boundaries and resource needs. This clarification is particularly useful when programs undergo changes.

Logic Model Format

Health programs often have several activities. For example, as shown below, an injury prevention program for children is likely to have several different activities. The long-term goal is to prevent and reduce fatal and non-fatal injuries to children. To find out if this goal was achieved, it is necessary to collect "before and after" injury data and also follow-up data.

To learn which activities are productive, it is possible, using the Logic Model, to conduct short-term evaluations of each program activity. The Logic Model is a diagram or flow chart that shows the linkages between what a program does and what it should achieve. Evaluators can then follow each activity through its logical stages of use and outcome.

Because evaluators can view activities separately, they can evaluate them separately. Using the Logic Model, evaluators can break down programs into their activities to illustrate how they lead to the overall goals. The following is the format of a basic program Logic Model.
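One way to make this concrete is to record each activity's chain of outputs and outcomes as data, so each chain can be reviewed on its own. This sketch uses invented field names and example entries, not an official Logic Model format:

```python
# A minimal, hypothetical representation of a program Logic Model:
# each activity maps to its output and the short-term outcome it should produce.
logic_model = {
    "goal": "Reduce fatal and non-fatal injuries to children under 10",
    "activities": [
        {"activity": "Parent education classes",
         "output": "Parents trained in injury prevention",
         "short_term_outcome": "Safer practices at home"},
        {"activity": "School and playground inspections",
         "output": "Hazards identified and removed",
         "short_term_outcome": "Fewer playground hazards"},
    ],
}

# Evaluators can then follow each activity through its logical stages
# and evaluate it separately from the others.
for a in logic_model["activities"]:
    print(f'{a["activity"]} -> {a["output"]} -> {a["short_term_outcome"]}')
```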

Logic Model

Figure 5.1: The Logic Model is a framework for evaluation

Logic Model For Childhood Injury Prevention

Figure 5.2: This Logic Model shows four activities of a program designed to reduce the number of fatal and non-fatal injuries to children belonging to a First Nations community

References

The following publications contain information on evaluating various programs, including health programs.

Cole et al. (1997). Evaluating Health-Related Programs through Systematic Planning and Evaluation.

Comptroller General of Canada (1981). Principles for the Evaluation of Programs.

Cunningham (1978). Community Program Evaluation: A Suggested Approach.

Ellis et al. (1990). Keeping on Track: An Evaluation Guide for Community Groups.

Fink and Kosecoff (1978). An Evaluation Primer.

Fisher (1997). A Theory-Based Framework for Intervention and Evaluation in SRD/HIV Prevention.

Graham et al. (1994). The Evaluation Casebook: Using Evaluation Techniques to Enhance Program Quality in Addictions.

Health Canada (1991). A Handbook for First Nations on Evaluating Health Programs.

Health Canada (1996). Canada Prenatal Nutrition Program - First Nations and Inuit Component: Guide to Planning and Evaluation.

Health Canada (1998). Evaluating Your Progress: Reducing Smoking in the Workplace.

Health Canada (1996). Program Consultant's Guide to Project Evaluation.

Holt, J. (1993). How About Evaluation... Health Canada.

McKenzie, James (1993). Implementing and Evaluating Health Promotion Programs.

MMWR (1988). Guidelines for Evaluating Surveillance Systems.

Trueblood, Gordon (1992). Health Education Guidelines. (Unpublished thesis.) Ottawa, ON: Health Canada Library.

Wong-Reiger, Durhane and David, Lindee (1993). A Hands-On Guide to Planning and Evaluation: How to Plan and Evaluate Programs in Community-Based Organizations. Ottawa, ON: Canadian Hemophilia Society.