Evaluation Guidelines
RRF Foundation for Aging promotes the use of evaluation as part of every RRF grant in order to:
- Encourage applicants and grantees to become more effective learning organizations, gathering and systematically analyzing important client and program information about the nature, reach, quality, and efficiency of the services they provide
- Enable the Foundation to better understand the value of the investments we make and the lessons that funded projects can teach us about how to invest our grant dollars more effectively in the future
- Add to knowledge in the field about best practices in services for older adults by supporting, when appropriate, rigorous experimental or quasi-experimental outcome studies
RRF uses three categories of evaluation:
- Implementation – Providing practical lessons that emerge from putting a new project into action
- Process – Generating a blueprint of a program in action
- Outcome – Determining if a program can improve one or more targeted results
Implementation evaluation asks about the practical lessons that emerge from putting a new project into action. Rarely does a project go off without a hitch, and lessons learned during implementation help organizations identify if an approach may need to be modified and what critical next steps are required. In turn, these lessons can help others avoid the same pitfalls. Finally, they teach the Foundation important lessons that can help improve our grantmaking capacity.
Implementation evaluation is the appropriate focus for the evaluation of:
- Demonstration projects for training or service delivery where the intervention/training model is still undergoing development
- Projects that seek to replicate an existing model in one or more new settings, or with a different population
- Planning and seed grants
- Service expansion grants
- Technical assistance grants
- Advocacy and community organizing grants
Key Questions
The following questions may be used to develop effective implementation evaluation. These questions may help applicants create an outline of how they plan to gather information about their project. The list is intended to be illustrative—some questions may not be relevant to all projects and applicants may want to include other questions that are not listed below.
- What is your program model (goals, objectives, activities, resource inputs, short- and long-term outcomes, types of clients/participants targeted, timeframe, budget, etc.)?
- What aspects of your original program model were implemented as planned and what had to be changed?
- Why were revisions needed?
- What changes were made, and why did you select these new approaches and discard other options?
- What aspects of the program seemed to work particularly well, and why?
- Is there evidence that any unintended outcomes occurred, either positive or negative, for the program, its staff, or participants? For example, did you receive unexpected publicity, attract new volunteers, connect to new partner organizations, or identify and meet unexpected client needs? Alternatively, did the project cause stress among or between staff, divert staff from other responsibilities to clients, or cost more than expected?
- How do you explain these unintended outcomes?
- Did you confront any barriers that were not anticipated?
- Will you do things differently now, based on lessons learned to date?
- What next steps will you take and/or do you recommend to further revise the model and why?
- Are there conditions under which you would recommend that this program or service not be used, and why?
Process evaluation documents how a program operates by describing characteristics of clients and staff, the nature of services offered and methods of delivery, and patterns of service use, essentially generating a blueprint of a program in action. Effective process evaluation will allow applicants to:
- Describe how funds were used
- Provide a guide to others wishing to replicate the project and study the outcomes of a model program
- Describe what the “intervention” consisted of in reality, not just as designed
Process evaluation is appropriate for:
- Direct service and training projects
- Conferences
- Model and demonstration projects
Key Questions
The following questions may be used to develop effective process evaluation. These questions may help applicants create an outline of how they plan to gather information about their project. The list is intended to be illustrative—some questions may not be relevant to all projects and applicants may want to include other questions that are not listed below.
- What were the goals and specific objectives of the project?
- For each objective, what specific steps were taken and how were they accomplished?
- For each objective or program component, what resources/inputs were needed (type, numbers, and time commitments of staff, physical space(s), equipment, volunteers, etc.)?
- What type(s) of client(s) did each program element target?
- What were the characteristics of clients actually served (age, gender, health status, living situation, family status, cognitive status, functional status, etc.)?
- Were the characteristics of clients/participants in line with the targeted population? If not, why not? Were any type(s) of clients underrepresented? If so, why do you feel these groups were not reached?
- How many clients/participants received each service? Was this more or less than your goal? Why do you think demand was higher or lower than expected?
- How many units of each type of service/program component did clients receive (e.g., hours, rides, course sessions, friendly visits, days of adult day care, rehabilitation sessions, etc.)?
- How much did the program cost? How did this break down for individual parts of larger projects?
- How satisfied were clients with services provided? Were there any aspects of program operation that clients or staff recommended changing and why?
Outcome evaluation is what most people think of when they hear the term evaluation. It focuses on determining whether a program improves one or more targeted outcomes for those served (e.g., health, mental health, quality of life, risk of falling, re-hospitalization rates, etc.). Outcome evaluation requires that clients served by the program be compared with a control group that is similar to them in every way except that it is not exposed to the program being studied.
Outcome evaluation is often expensive and time-consuming, and it requires the involvement of experts with documented knowledge of and experience with evaluation research and statistics. Generally, the Foundation only funds outcome studies when the proposed project is likely to be replicable and has already been pilot-tested to document that it is feasible to implement. In short, outcome evaluation is relevant for applicants proposing to test the effects of programs that are innovative, replicable, and already shown to be feasible.
RRF has developed two sets of guidelines for outcome evaluation: one for applicants with limited research expertise and a second for experienced researchers. Applicants with limited research and evaluation experience are encouraged to include funds for an expert evaluation consultant in their program budget.
These guidelines are presented to help applicants with limited research experience think about effective outcome evaluations for direct service projects:
- Outcomes: List the specific, measurable outcomes of your planned activities that your evaluation will assess.
- Evaluation design: Discuss the approach that will be used to test whether the project is achieving these outcomes. Will you ask participants to complete surveys following the program? Is there an opportunity to do a pre-assessment or to follow participants over time? Will you conduct interviews or focus groups with a sample of participants to learn about their experiences with your program? You may also consider other creative approaches to evaluating your program's outcomes.
- Measures: Discuss the information or data you will gather and what form it will take. Will you be using a validated survey? Are you collecting additional information from participants such as demographics?
- Data collection & interpretation plan: Describe how you will collect your information and where the data will be stored. Who will collect the data, at what points in time, and how (e.g., via telephone or personal interviews, review of records, mailed survey, ratings by a nurse or social worker, etc.)? How will you analyze the data and how will you share findings with your stakeholders?
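To make the data collection and analysis questions above more concrete, the following is a minimal, hypothetical sketch of how pre/post survey scores might be summarized and compared. The file name, column names, and the choice of a paired t-test are assumptions made only for illustration; an evaluation consultant can help select the analysis that fits your actual design.

```python
# A minimal sketch, assuming a pre/post survey design with one outcome score
# per participant. The file name and column names below are hypothetical.
import pandas as pd
from scipy import stats

# Hypothetical CSV with one row per participant: id, pre_score, post_score
surveys = pd.read_csv("participant_surveys.csv")

# Keep only participants who completed both the pre- and post-program surveys
complete = surveys.dropna(subset=["pre_score", "post_score"])

# Average change in the outcome score from pre to post
mean_change = (complete["post_score"] - complete["pre_score"]).mean()

# Paired t-test: is the average pre-to-post change larger than chance variation
# would plausibly explain?
t_stat, p_value = stats.ttest_rel(complete["post_score"], complete["pre_score"])

print(f"Participants with both surveys: {len(complete)}")
print(f"Average change in score: {mean_change:.2f}")
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```

A design that also includes a comparison group would call for a different analysis (for example, comparing change scores between groups), which is one place an evaluation consultant's guidance is especially valuable.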
The following may serve as a guideline for applicants who have more training in evaluation research as they develop effective outcome evaluations for model or demonstration projects or training programs:
- State your research questions or hypotheses.
- Detail your research design. Will you use an experimental or quasi-experimental design? If quasi-experimental, what type? Will it include a non-equivalent comparison group? If so, who will be included in this group? What potential threats to the internal validity of the design do you anticipate?
- Present your sampling plan. Discuss inclusion/exclusion criteria and sample sizes and, if possible, provide a statistical power calculation (an illustrative calculation follows this list). Address the potential for sample attrition and how it may affect the composition of study groups and, therefore, the validity of conclusions. Discuss generalizability.
- Discuss measurement, including operational definitions for all dependent, antecedent, and intervening variables. Discuss level of measurement for each. If existing scales are to be used, cite the references for each and discuss pros and cons. If new measures are to be used, describe the process by which they are/will be developed and tested.
- Describe your data collection plan: who will collect each set of data, how they will do so (e.g., telephone, review of records, personal interviews, mailed survey), when they will do so, and how you will handle problems such as missing data or non-response to surveys.
- Include a detailed data analysis plan. Discuss the stages you will use to analyze your data. List all statistics you will run, clarify how your design meets the assumptions of each, and discuss their appropriateness given your levels of measurement for different variables and given sample sizes and estimated distribution of responses across categories on nominal or ordinal measures. Address any multivariate techniques designed to assess differential impact of the intervention or to allow you to control for antecedent or intervening variables. Discuss different potential findings that might emerge, what conclusions you would draw from each, and what additional data analyses you would run as a result. Please name any statistical consultants you will rely on and include their resumes with your proposal.
- Discuss next steps. Clarify what you expect to learn, what you will not be able to ascertain with the proposed research, and the future steps you would take as a function of different patterns of outcomes from your work.
- Discuss how you will address issues of confidentiality, informed consent, and other human subjects concerns.
- Include separate budget items for evaluation costs such as data collection (e.g., personnel, copying, transportation, purchase of datasets if applicable, etc.); data management (checking for accuracy and entering data); and data analysis (software purchases, analysis time not included in other staff salary figures, and consultant fees).
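As one illustration of the statistical power calculation mentioned in the sampling plan item above, the standard normal-approximation formula for comparing two group means is sketched below; the significance level, power, and effect size used are assumptions chosen only for this example.

```latex
% Approximate sample size per group for a two-sided test comparing two means,
% where \delta is the minimum detectable difference and \sigma is the common
% standard deviation.
n_{\text{per group}} \approx \frac{2\,(z_{1-\alpha/2} + z_{1-\beta})^{2}\,\sigma^{2}}{\delta^{2}}
```

With α = 0.05 (two-sided), power 1 − β = 0.80, and a minimum detectable difference of half a standard deviation (δ/σ = 0.5), this works out to roughly 2 × (1.96 + 0.84)² / 0.25 ≈ 63 participants per group before any allowance for attrition; statistical software or a consultant can refine the estimate for your actual design and test.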