


Dora Marcus
Former FIPSE Evaluation Specialist
October 2001
Introduction. One of the essential components of FIPSE projects is a sound evaluation plan that guides data collection and furnishes solid evidence about grant processes and outcomes.
Three key features should characterize your evaluation plan:
- Formative evaluation will assure the quality of program management by tracking the effectiveness of project development and implementation.
- Summative evaluation, especially one that carefully documents impact on learners, may transform your promising project into a national model of reform.
- Controlled comparisons between program participants and non-participants will clarify the impact of your particular innovation and its potential for benefiting other campuses (a brief sketch of such a comparison follows this list).
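By way of illustration only, here is a minimal Python sketch of one way such a participant/non-participant comparison might be analyzed. The scores are invented placeholders, the groups are assumed to be comparable, and Welch's two-sample t-test (via scipy, assumed to be available) is just one of many defensible methods:

```python
# Hypothetical sketch: compare post-test scores of program participants
# against a non-participant comparison group. All numbers are invented.
from scipy import stats

participants = [78, 85, 90, 72, 88, 81, 93, 76]      # placeholder scores
non_participants = [70, 74, 69, 80, 65, 73, 77, 68]  # placeholder scores

# Welch's t-test: does not assume the two groups have equal variances.
t_stat, p_value = stats.ttest_ind(participants, non_participants, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

A small p-value suggests the difference between group means is unlikely to be chance alone, but only careful group construction (ideally random assignment, or at least statistical controls for pre-existing differences) rules out selection bias.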
Aside from confirming your program's success, a strong evaluation will:
- inform project activities and practices
- justify expenditure of funds
- enhance administrative planning and policy making
- assure that project objectives have been met
- provide evidence for program achievements
- monitor program implementation
- note unintended consequences
- inform allocation of resources
- identify problems and costs
Evaluation Design. To make a convincing case for any reforms brought about by your project, you will need to consider:
- Limiting yourself to a few clear, specific, and measurable objectives
- Selecting measures that specify who will collect the data, and when and how it will be collected, analyzed, and reported
- Building evaluation measures into the routines of program procedures, rather than appending them later
- Using multiple measures, rather than a single measure, whenever possible (convergent results establish credibility)
- Orienting evaluation measures primarily toward behavior, especially student academic performance
- Using project documents and records for ongoing process evaluation
- Consulting with evaluation experts at your institution early in the design of the project's evaluation
- Engaging an independent evaluator who does not stand to gain personally or professionally from the project's results
- Designing an evaluation that takes into account the project's eventual dissemination audiences, its potential adapters, and their data needs
- Collecting information on the project's cost-effectiveness (a simple worked sketch follows this list)
- Providing evidence of the wider impact of your project: how adaptable is the model, and how likely is institutionalization?
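As a hypothetical illustration of the kind of cost-effectiveness summary referred to above, the sketch below computes cost per student served and cost per successful outcome; every figure is a placeholder, not actual project data:

```python
# Hypothetical cost-effectiveness summary; all figures are placeholders.
total_cost = 150_000       # grant funds plus institutional contributions ($)
students_served = 300      # all students reached by the project
students_succeeding = 240  # students meeting the project's success criterion

print(f"Cost per student served: ${total_cost / students_served:,.2f}")
print(f"Cost per successful outcome: ${total_cost / students_succeeding:,.2f}")
```

Reporting cost per outcome alongside cost per participant gives potential adapters a concrete basis for judging what the model would cost to reproduce at their own scale.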