A sound evaluation plan should guide the project’s data collection activities and it should provide useful evidence about the project’s processes and outcomes. IEPS offers applicant institutions the following suggestions to consider in developing the evaluation and impact section of the NRC/FLAS proposal.
Develop clear goals and objectives
- Limit the evaluation plan to a few clear and specific objectives that can be measured:
- What are the main themes or project goals being addressed? For each theme or project goal, identify a few key questions that will need to be answered.
- What data will need to be collected to demonstrate the project’s progress?
- What does the NRC or FLAS project hope to demonstrate with the data collected? Change, improvement, professional development, learning outcomes, placement, usage? These are examples of impacts that data collection can demonstrate.
Explain the Evaluation Plan
Formulate evaluation questions that are of interest to all stakeholders and audiences related to the NRC and FLAS projects, and align questions with appropriate information gathering techniques.
- Who/what will change?
- When do you expect the change(s) to take place?
- How much change is expected?
For planning data collection, the NRC and FLAS projects should determine whether any baseline data are needed and, if they already exist, where to find them. Instruments need to be developed to collect any data that are not already available. Data collection instruments may include surveys, standardized test scores, exams, focus groups, etc. The NRC may have additional instruments that are specific to the project. NRCs may wish to collaborate on this phase with other NRCs on campus to enrich the process and share costs.
After the data collection plan is in place, a timeline needs to be constructed for the duration of the grant. The timeline should indicate whether a goal extends throughout the four-year cycle or can be achieved within it. Indicate the timeline for designing each measure and data collection instrument. An evaluation timeline may be included in the application narrative or integrated into the four-page timeline appendix referenced in the application instructions.
The next steps are implementing the data collection and reviewing preliminary findings. The NRC will then need to decide whether to modify the project based on the findings; some activities may be modified mid-project if the findings warrant such a change.
Program Evaluation Specialists
A program evaluation specialist should be involved in implementing the evaluation plan and throughout the four-year grant cycle. The specialist should be trained in evaluation and, ideally, have conducted similar evaluations. To maximize the quality of the evaluation plan and its implementation, a team can also be formed: one member who is an evaluation expert and another who is a content expert. The evaluation specialist should be well informed about the proposal, and it may be helpful to have him or her review the draft. The specialist should be involved in all of the steps above and should also advise on disseminating the results of the project. The NRC and the evaluation specialist may wish to develop a plan for wide dissemination of results on the campus, to the local community, to similar institutions, to professional associations, to colleagues, to government officials (at all levels), and to the media. The NRC may wish to collaborate with other NRCs on campus to pool resources and share the cost of a professional evaluator.
Please follow the sub-questions from the evaluation criterion in the technical review form (see below) to assist the reviewers who will be assessing the evaluation plan.
Impact and Evaluation Criterion Questions:
For all NRC applicants, to what extent do the center’s activities and training programs have a significant impact on the university, community, region and the nation as shown through indices such as enrollments, graduate placement data, participation rates for events, and usage of center resources?
For undergraduate NRC applicants, to what extent do students matriculate into advanced language and area or international studies programs or related professional programs?
For all applicants, to what extent will provisions be made for equal access and treatment for eligible students and other participants who are members of groups that have been traditionally under-represented (such as members of racial or ethnic minority groups, women, persons with disabilities, and the elderly)?
For all applicants, does the applicant provide an evaluation plan that is comprehensive and objective and that will produce quantifiable, outcome-measure-oriented data?
For all applicants, to what extent have recent evaluations been used by the applicant to improve its program?
For FLAS applicants, to what extent have the applicant's activities and training programs contributed to an improved supply of specialists on the program's subject as shown through indices such as undergraduate and graduate enrollments and placement data?
For all NRC applicants, to what degree do activities of the center address national needs, and generate information for and disseminate information to the public?
For all NRC applicants, what is the center's record of placing students into post-graduate employment, education, or training in areas of national need, and what are the center's stated efforts to increase the number of students who go into such placements?
For FLAS applicants, to what degree are fellowships awarded by the applicant addressing national needs?
For FLAS applicants, what is the applicant's record of placing students into post-graduate employment, education, or training in areas of national need, and what are the applicant's stated efforts to increase the number of students who go into such placements?