THE PERFORMANCE MEASUREMENT STUDY OF THE TITLE III INSTITUTIONAL AID PROGRAM--SUMMARY
Highlights of Findings--June 2000

C. A Performance Measurement System

A hallmark of the Title III program has been its flexibility, which allows grantees to direct funds toward their most critical areas of need. However, the diversity of the Title III activities, as well as the multiplicity of goals expressed in the authorizing legislation, has made it difficult to aggregate data in a way that could document overall program outcomes. The performance measures appropriate to examining the achievement of the Title III program goals, and the wide range of activities funded to pursue those goals, can be thought of at three different levels.

  • At the broadest level, the Congress and the public wish to know whether the program has strengthened the grantees and made them better able to serve the intended population effectively. Therefore, they are interested in the types of students attending grantee institutions and those students' progress within the institution. The interest in indicators at this broadest level relates to overall program goals rather than the detailed purposes of grant expenditures.
  • At a more specific functional level of program achievement and effectiveness, the program indicators should portray to the Congress and the public what and how program resources are being used to contribute to the broadest goals. These indicators should cover functional areas of activity that are being addressed by specific grant activities.
  • At the most specific level of grantee activities, there is a set of detailed and individualistic expectations and indicators which are of primary interest to the Department's program office and individual grantees. These concern objectives and achievements in particular activities within the various functional areas. Activity objectives can be found in institutional comprehensive development plans, and activity results in annual performance reports, but they are not usually carried forward into broader documents. If a specific activity is of potential use to other grantees, project objectives and accomplishments should be taken from these sources and made available to a wider audience.

The discussion that follows presents a group of recommended indicators for the Title III program, linked to the first two levels described above -- the broad program goals and the functional areas of program activity. It is at these two levels that the Department needs to communicate routinely with the Congress and the public. The first set of indicators presented below consists of broad program goal indicators, and the last three sets are functional area indicators.

This discussion will not address the third level, which includes specific measures appropriate to the planning, review, and approval of individual institutional plans and grants and their subsequent results. Those measures are highly individualistic to particular institutional needs, and they are already embedded in program processes. It will be helpful in the future for the Department and the grantees to monitor actual achievements against initial expectations.

Finally, the findings of this study indicate that fiscal stability is an indirect concern of the institutions, and that relatively little fiscal instability is apparent among the grantees. For those few small private institutions that do face financial difficulties, Department staff should continue to monitor individual situations from indicators already available to the Department through general or programmatic reporting. Accrediting organizations are also alert to such problems, and they have been vigilant about pressing institutions to keep expenses and revenues in line.

Process for Developing the Indicators

Extensive work was conducted to develop these indicators, with input from the grantees, the Advisory Team, and the Department. The process began in December of 1995 when the grantees were asked for suggestions regarding appropriate indicators and indicator types. One hundred eighteen grantees submitted suggestions in response. Next, in July 1996, the grantees were asked to submit comments on a list of proposed indicators. The list was based in part on the indicators suggested by the grantees, and it was reviewed by the Title III Advisory Team. Fifty-four institutions responded to this mailing, and a report was prepared summarizing their comments.

The survey instrument sent to all grantees in March 1997 included questions about the availability of data for a number of potential performance indicators, and it asked whether or not each of the indicators was applicable to the responding institution. The field of indicators used in the survey was developed in consultation with the Survey Task Force of the Advisory Team. A majority of Title III schools reported in the survey that the various student outcome and institutional measures are either available or accessible. The case studies confirmed the findings of the survey, and presented a picture of substantial grantee familiarity with performance measure reporting to IPEDS, state authorities, accreditors and a variety of voluntary associations. The case study protocol included questions about which indicators the host institutions found the most useful. Other case study questions asked which of the indicators were viewed with disfavor, and why, and whether the respondent had any alternative indicators to suggest. The indicators were also discussed at a number of national and regional meetings of the Title III grantee community.

Performance Indicator and Accountability System

The Department's GPRA structure will adequately handle the external reporting requirements for the Title III program. Internally, the information for the recommended indicators should be included in the annual performance reports submitted by the grantees. For ease of handling, it would be desirable for indicator data to be reported on a standardized form that could be transmitted electronically, or easily converted to electronic form. Definitions should be carefully coordinated with the grantees in order to minimize or eliminate conflicting definitions, insofar as possible.

Recommended Indicators - Service Population:

  • Number and percent of total fall enrollment who are minority students (by the five ethnic/racial groups), as reported to the Integrated Postsecondary Education Data System (IPEDS).
  • Number and percent of degrees awarded to minority students (by the five ethnic/racial groups), as reported to IPEDS.
  • Number and percent of total fall enrollment who are first-generation college students (if available).
  • Number and percent of total fall enrollment who are Pell Grant recipients.

How to Be Used: This set of indicators is to be used as a way to describe the extent to which the program serves the intended population. Low percentages for some schools may serve as signals that the Title III funds are not being distributed in accordance with the program legislation.
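For illustration, the following is a minimal sketch (in Python) of how these number-and-percent figures could be tabulated from grantee-reported counts. The function and field names (service_population_indicators, minority_students, pell_recipients) and the values are hypothetical; they are not the Department's reporting system or the IPEDS record layout.

    # Minimal sketch: tabulate number and percent of total fall enrollment
    # for each reported service-population group. Names and figures are
    # hypothetical, for illustration only.

    def service_population_indicators(total_fall_enrollment, counts):
        """Return number and percent of fall enrollment for each reported group."""
        indicators = {}
        for group, number in counts.items():
            percent = 100.0 * number / total_fall_enrollment if total_fall_enrollment else 0.0
            indicators[group] = {"number": number, "percent": round(percent, 1)}
        return indicators

    # Example: one grantee's fall enrollment figures (illustrative values only).
    print(service_population_indicators(
        total_fall_enrollment=2400,
        counts={"minority_students": 1560, "pell_recipients": 1320},
    ))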

Recommended Indicators - Expenditures:

  • Number and percent of Title III schools using Title III funds to establish or expand learning labs or academic skills centers.
  • Number and percent of Title III schools using Title III funds to develop a new course of study, or to strengthen existing courses.
  • Number and percent of Title III schools using Title III funds to establish or expand within-campus electronic linkages.
  • If the school had a previous five-year Title III grant: Percentage of activities funded by the previous Title III grant still in place at the institution.

How to Be Used: The schools will check yes or no to each of the first three indicators; then the Department can sum across the schools for an aggregate measure. This set of indicators will be used as a way to describe how Title III money is being spent. The last indicator will serve as a signal of the level of institutionalization of past Title III-funded activities, if applicable, at the institution.
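As an illustration of that aggregation, the sketch below assumes each grantee's annual report has been reduced to yes/no flags for the three expenditure indicators; the structure and names (reports, learning_labs, and so on) are hypothetical, not an actual reporting format.

    # Minimal sketch: sum yes/no responses across schools and express them
    # as a number and percent of reporting grantees. Data are illustrative.

    reports = [
        {"learning_labs": True, "courses": True, "campus_linkages": False},
        {"learning_labs": False, "courses": True, "campus_linkages": True},
        {"learning_labs": True, "courses": False, "campus_linkages": True},
    ]

    def aggregate(reports):
        """Count 'yes' responses per indicator and convert to percentages."""
        total = len(reports)
        summary = {}
        for indicator in ("learning_labs", "courses", "campus_linkages"):
            count = sum(1 for r in reports if r[indicator])
            summary[indicator] = (count, round(100.0 * count / total, 1))
        return summary

    print(aggregate(reports))  # e.g. {'learning_labs': (2, 66.7), ...}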

Recommended Indicators - Participants:

If the school has student support services activities:

  • Number of students (and percentage of total fall enrollment) estimated to receive student support services developed through Title III, on an annual basis once the projects are fully operational.

If the school has academic programs and resources activities:

  • Number of students (and percentage of total fall enrollment) estimated to enroll in academic courses or programs developed or improved through Title III, on an annual basis once the projects are fully operational.
  • Number and percentage of faculty estimated to participate in Title III-funded academic programs and resources activities, on an annual basis once the projects are fully operational.
  • Number of students (and percentage of total fall enrollment) gaining access to computers due to Title III-funded activities.
  • Number of students (and percentage of total fall enrollment) gaining access to the Internet due to Title III-funded activities.
  • Number of students (and percentage of total fall enrollment) taking courses with computer use due to Title III-funded activities.

If the school has faculty development activities:

  • Number and percentage of faculty participating in Title III-funded faculty development activities.

If the school has funds and administrative management activities:

  • Number and percentage of staff trained in how to use the new systems funded by Title III.

How to Be Used: Again, while the number served is a process measure rather than an outcome indicator, these indicators still function as essential descriptive measures, giving a big-picture view of the number of students, faculty, and staff involved in Title III-funded activities.

The Department might want to follow up on some of these estimates once the grant period is over, to see if the projects indeed became fully operational, and to see if the estimated numbers to be served were realized. With these data, the Department can also calculate costs (grant expenditures on the activity) per student or faculty member served.
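A minimal sketch of that cost-per-participant calculation follows; the function name and figures are illustrative, not drawn from actual grant data.

    # Minimal sketch: grant expenditure on an activity divided by the number
    # of students or faculty served. Figures are hypothetical.

    def cost_per_participant(grant_expenditure, participants_served):
        """Return the expenditure per participant, or None if no one was served."""
        if participants_served == 0:
            return None  # avoid division by zero when no participants are reported
        return grant_expenditure / participants_served

    # Example: a $150,000 activity estimated to serve 500 students annually.
    print(f"${cost_per_participant(150_000, 500):,.2f} per student")  # $300.00 per student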

Recommended Indicators - Outcomes:

  • Cohort graduation rates, as will be reported annually to IPEDS.
  • Course completion rates for courses developed with Title III assistance.
  • Course completion rates for courses enhanced with Title III assistance.
  • Accreditation status.
  • Fiscal balance (operating revenues less expenditures), as reported to IPEDS.
  • Enrollment trends, as reported to IPEDS.
  • Working capital debt.

How to Be Used: This set of indicators can be used as proxies of quality and as signals suggesting whether the Title III money has been well spent. However, these proxies will of necessity be imperfect, since impacts and outcomes arising solely due to the Title III program cannot be determined without an impact evaluation in a multivariate framework. These measures can only suggest outcomes that may be correlated with Title III expenditures.

It would be expected that maintained or improved levels of these seven outcome variables would be looked upon as favorable signals by the Department and Congress (i.e., stable or improved graduation rates; stable or improved completion rates; maintenance or improvement of accreditation status; a positive fiscal balance with operating revenues exceeding expenditures; steady or increasing enrollments; and steady or declining working capital debt). These indicators need to be examined together as a group, since extraneous factors may lead to changes in a single indicator.

Conversely, unexplained declines in these measures might be looked upon with concern (i.e., declining graduation rates; declining completion rates; a downgrading of accreditation status; fiscal imbalance with operating expenditures exceeding revenues; a rapid decline in enrollment; and the growing use of working capital debt, particularly in the last quarter of the academic year).

All such outcome measures need to be examined in conjunction with what year of the grant the school was in, particularly for many of the Part A schools that only receive a single five-year Title III grant. For this study, many reporting colleges were only mid-way or less through their projects.

Beyond their function as proxies, the only other way to use performance indicators to measure institutional quality resulting from Title III would be to disaggregate them to the institutional level. This is because of the wide range of activities being funded and the unique local circumstances faced by postsecondary institutions. In an ideal world, a more precise and uniform set of measures of academic quality across course topics and institutions would provide better information than these proxies, but such measures are not generally in place in academia. Until such measures are developed and accepted, the best way to address the issue of quality is to rely on these proxies, as well as on the accreditation process and discipline-specific standards.
