Archived Information
FY 1999 Annual Plan - Volume 1. Objective Performance Plans and Data Quality - February 27, 1998
Quality of Performance Data: How Data Will Be Verified and Validated
Strategies for Ensuring High Quality Information for Strategic Plan and Program Performance Indicators
- Performance indicator standards
- Timeliness. Ensure performance information is collected on a regular and timely basis (e.g., a minimum of two data collections per indicator over the Strategic Plan period; quick turnaround customer surveys).
- Validity. Ensure performance indicators are valid (e.g., measures align with research-based findings on effective practices). Conduct state-of-the-art reviews of measures by objective and major program area.
- Reliability. Ensure performance indicators are reliable (e.g., measures meet acceptable sample size criteria for confidence levels; samples are appropriately representative; definitions are consistent over time).
- Disaggregation. Ensure performance indicators appropriately disaggregate information down to key operational units (e.g., states for state grant programs).
- Employee training in performance measurement and use
- Materials and tools. Develop guides and training programs on performance management and measurement.
- Tailor training to particular offices' needs and offer ongoing performance measurement assistance.
- Develop certificate courses in performance measurement and use.
- Evaluate employee capabilities in different offices to manage information for performance.
- Offer on-line training.
- Monitor data quality
- Internal monitoring and reviews. Develop an indicator ranking system that assesses indicator quality by objective and major program area.
- Staff performance agreements. Require program managers, as part of their performance agreement, to validate the quality of their program performance information.
- Evaluation of data quality.
- Ensure the independence and objectivity of performance information by using program evaluations to collect and process performance reports and by conducting independent evaluations of performance for major programs.
- Conduct indicator quality evaluations to assess the accuracy of key indicator information.
- Inspector General. Support Inspector General reviews of data quality (see exhibit 5).
- Accountability for data quality
- Attest to data quality. "ED program managers assert that the data used for their program's performance measurement are reliable and valid or will have plans for improvement." (Objective 4.7: Indicator 31)
- External validation. Evaluate program managers' assertions through IG audits and program evaluations.
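As an illustration only, the indicator ranking system described under internal monitoring above might score each indicator against the four data quality standards and rank the results. A minimal sketch follows; the indicator names and scores are hypothetical, not actual Department data.

```python
# Hypothetical sketch of an indicator ranking system: each indicator
# is scored 1-5 against the four performance indicator standards
# (timeliness, validity, reliability, disaggregation) and indicators
# are ranked by their average score. All names and scores are invented.
from statistics import mean

STANDARDS = ("timeliness", "validity", "reliability", "disaggregation")

indicators = {
    "Reading achievement (grade 4)": {"timeliness": 4, "validity": 5,
                                      "reliability": 4, "disaggregation": 3},
    "Loan default rate":             {"timeliness": 5, "validity": 4,
                                      "reliability": 5, "disaggregation": 4},
    "Customer satisfaction":         {"timeliness": 3, "validity": 3,
                                      "reliability": 2, "disaggregation": 2},
}

def rank_indicators(scores):
    """Return (indicator, average score) pairs, highest quality first."""
    averaged = {name: mean(s[k] for k in STANDARDS)
                for name, s in scores.items()}
    return sorted(averaged.items(), key=lambda kv: kv[1], reverse=True)

for name, avg in rank_indicators(indicators):
    print(f"{avg:.2f}  {name}")
```

A ranking of this kind makes weak indicators visible by objective and program area, which is the purpose the monitoring strategy above assigns to it.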
Effectively reporting the performance of the Education Department in achieving its Strategic Plan goals and objectives requires developing some new data systems and fixing old ones. The new systems will seek to redirect data collections toward gathering performance information on the accomplishment of Department-wide and program goals. Existing data systems need to be strengthened to ensure the Department receives high quality and timely data on its programs and their effects.
The Department is undertaking a comprehensive set of data improvement activities built around the following two strategies.
- Strengthening data quality.
- Efficiently reporting and using performance information for improvement.
Actions being taken to achieve these strategies are described below, followed by highlights of efforts to improve elementary/secondary and postsecondary data quality.
Strengthening data quality
The quality of the Department's performance measures can be no better than the quality of the data from which they are generated. Inadequate attention to data quality produces inaccurate information and misleading results. A further impediment to obtaining high quality information is that many program staff lack formal training in information processing, evaluation, and reporting.
To ensure the quality of performance indicator information, the Department is pursuing a four-part improvement strategy (Exhibit 4).
- Develop Department-wide standards for performance indicator measurements. The standards should be consistent with data quality standards developed by the National Forum on Education Statistics and evaluation standards prepared by the American Evaluation Association. These standards would apply to indicators in the Department's multiyear Strategic Plan and annual program plans.
- Develop employee training in the application of the data standards for performance measurement. Training will be ongoing, integral to employee work, and a priority of supervisors. To support performance measurement training, the Department has already developed a performance measurement guide prepared by a noted expert in the field (see Hatry and Kopczynski, 1997). This comprehensive overview of performance measurement will be followed by miniguides on performance management and measurement, often customized to individual agency offices.
- Monitor data quality. Quality control to ensure sound data is essential to the Department, which depends heavily on external State, grantee, and institutional data systems for information. The Department is developing a formal process of rating data quality against data quality standards. The Department will also use program evaluations for data validation. In addition, the Department is working closely with its Office of Inspector General to obtain independent monitoring of data quality in high priority areas. See exhibit 5 on Inspector General activities.
- Managers attest to the reliability and validity of their performance measures or submit plans for data improvement. Making managers accountable for data quality encourages attention to sound data collection from the start of the grantmaking process. Managers attesting to the quality of their data, coupled with periodic external validation of their assertions, is a powerful new incentive to improve program performance information.
During 1998 and 1999, the Department will be strengthening its two major performance indicator systems, one for the elementary and secondary system and a second for student aid. Collectively, these systems account for about three-quarters of Departmental funds.
Developing an integrated data system for elementary and secondary state grant programs
A key component of the Department's information improvement strategy is creating integrated federal/state indicator systems for the elementary and secondary state grant programs. This redesign will cover all federal funds disbursed through the state grant process. These include programs to support challenging standards, at-risk students, professional development, education technology, safe and drug-free schools, and general improvement support.
A joint redesign project with the Council of Chief State School Officers will begin in spring 1998. Initial tasks will include agreement on a set of data standards and key performance indicators, with pilot testing of systems beginning in the fall of school year 1998-99. Our target is to begin working with at least six states on developing aligned, exemplary integrated indicator systems.
Key steps to developing the redesigned system include:
- Identifying critical indicators in the Department's strategic and program plans that are essential to evaluating the performance of state grantees against program goals and objectives.
- Identifying which indicators are appropriate to collect at the national level and which ones should be reported at the individual state level.
- Determining the best measures available to states within an appropriate range of costs and data burden.
- Aligning state performance reporting to indicators and measures, including exploring electronic means for data transmission.
- Agreeing to standards for data reliability and validity for state-reported performance measures.
- Coordinating data collections across different information sources, including state performance reports and general purpose statistical collections. The latter can serve as an independent check on self-reported performance information.
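The last step above, using a general purpose statistical collection as an independent check on self-reported performance figures, can be sketched as a simple discrepancy flag. The state names, rates, and tolerance below are hypothetical and for illustration only.

```python
# Hypothetical sketch of an independent cross-check: state-reported
# performance rates are compared with the same measure from an
# independent statistical collection, and any state whose figures
# diverge by more than a tolerance is flagged for follow-up.
# All states and figures below are invented.

def flag_discrepancies(reported, independent, tolerance=0.05):
    """Return states whose self-reported rate differs from the
    independent estimate by more than `tolerance` (absolute)."""
    return sorted(
        state for state in reported
        if state in independent
        and abs(reported[state] - independent[state]) > tolerance
    )

state_reports      = {"State A": 0.82, "State B": 0.74, "State C": 0.91}
statistical_survey = {"State A": 0.80, "State B": 0.63, "State C": 0.89}

print(flag_discrepancies(state_reports, statistical_survey))  # ['State B']
```

Flagged states would then be candidates for the data quality reviews and evaluations described earlier, rather than being treated automatically as misreporting.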
Inspector General Support for Data Verification and Validation
The Office of Inspector General (OIG) has participated in an advisory capacity in the development of the Department's Strategic Plan and Annual Plan and will continue to provide this service to Department managers.
The OIG is currently performing the first in a series of audits covering the Department's implementation of the Government Performance and Results Act. The objectives of the first audit are: (1) to assess the Department's process for institutionalizing the results-oriented management envisioned by the Act, and (2) to assess the development of the system for the accurate and timely collection and reporting of performance data.
For the Strategic Plan, the OIG recommended that Department program managers assert that the data used for their program's performance measurement are reliable and if not reliable, detail plans for improving the data or finding alternative sources. The Department agreed with the OIG's recommendation and included it as a performance indicator in the Strategic Plan. The OIG plans to perform a series of audits on select performance measurement data to assess the reliability of that data. The OIG also plans to assess how the Department is using the performance data to improve programs.
Improving postsecondary data quality
Validity and accuracy of postsecondary performance measures. Data used to measure progress toward achievement of the performance indicators come from several sources including program data, surveys conducted by the National Center for Education Statistics (NCES), and evaluation studies. Steps being taken in 1998 and 1999 to strengthen the quality of these data include:
- Improving the coordination of data related to postsecondary education through the National Postsecondary Education Cooperative (NPEC) which is sponsored by NCES and whose mission is, "to promote the quality, comparability, and utility of postsecondary data and information that support policy development, implementation, and evaluation." NPEC will help improve the efficiency and usefulness of the data reported on postsecondary education by standardizing definitions of key variables, avoiding duplicate data requests, and increasing the level of communication between the major providers and users of postsecondary data.
- Continuing to support and strengthen NCES's major postsecondary data collection activities including the Integrated Postsecondary Education Data System (IPEDS), the National Postsecondary Student Aid Study (NPSAS), the Beginning Postsecondary Student Study (BPS), the Bachelor's and Beyond Study (B&B), and the National Education Longitudinal Study (NELS). A major area of expected improvement in the quality of these data collections is better linkages with OPE's student aid data files to capture accurate data on the federal aid being received by survey respondents.
- Using evaluation methods and findings for the TRIO and Title III programs to help improve the data collected in the annual program performance reports to provide a more accurate and more complete picture of the activities and outcomes of the two programs.
Accuracy and efficiency of program data systems. In FY 1999 the Department of Education will provide nearly $51 billion in federal student aid funds. To properly distribute and account for these funds, the Department of Education needs to process and store data from over 8 million student aid applications; 93 million individual student loans with a value of more than $150 billion; 6,000 postsecondary institutions; 4,800 lenders; and 36 state guarantee agencies. Ensuring the accurate and efficient collection of these data is a key component in the successful delivery of the student aid programs and achievement of Goal 3 in the Department's Strategic Plan to "ensure access to postsecondary education and lifelong learning."
The student aid delivery system has suffered from data quality problems severe enough that the Department has failed to receive an unqualified audit opinion. Steps being taken to improve the efficiency and quality of the student aid delivery system include:
- Improving data accuracy by:
- continuing or expanding interagency coordination on data matches (with the Internal Revenue Service, the Social Security Administration, the Immigration and Naturalization Service, the Selective Service, the U.S. Postal Service, and the Departments of Defense, Justice, and Housing and Urban Development) to help improve data accuracy and reduce burden on respondents,
- establishing by December 1999 industry-wide standards for data exchanges to stabilize data requirements, improve data integrity, and reduce costly errors, and
- receiving individual student loan data directly from lenders rather than through guarantee agencies and by expanding efforts to verify the data reported to the National Student Loan Data System.
- Strengthening indicators of customer satisfaction to provide early warnings of possible delivery system problems. This will build on the Department's successful on-going evaluations of its institutional and student aid customers.
- Refining a risk management system that encompasses all relevant data regarding postsecondary institutions' operation and management of the student aid programs so that compliance and enforcement activities can be targeted on poorly performing institutions.
- Preparing a system architecture for the delivery of federal student aid by December 1998 that will help integrate the multiple student aid databases based on student-level data in order to improve the availability and quality of information on student aid applicants and recipients.
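The loan data verification effort described above (checking loan data reported to the National Student Loan Data System against records received directly from lenders) amounts to a record-by-record reconciliation. A minimal sketch follows; the loan identifiers, balances, and field layout are hypothetical, not the actual NSLDS record format.

```python
# Hypothetical sketch of loan data verification: loan balances received
# directly from lenders are reconciled against the corresponding NSLDS
# entries, and records that are missing or disagree are reported for
# follow-up. Loan IDs and balances below are invented.

def verify_loans(lender_records, nslds_records):
    """Return (loan_id, issue) pairs for lender-reported loans that are
    missing from NSLDS or whose reported balances disagree."""
    issues = []
    for loan_id, lender_balance in lender_records.items():
        if loan_id not in nslds_records:
            issues.append((loan_id, "missing from NSLDS"))
        elif lender_balance != nslds_records[loan_id]:
            issues.append((loan_id, "balance mismatch"))
    return issues

lender = {"L-001": 5250.00, "L-002": 12000.00, "L-003": 3075.50}
nslds  = {"L-001": 5250.00, "L-002": 11800.00}

print(verify_loans(lender, nslds))
# [('L-002', 'balance mismatch'), ('L-003', 'missing from NSLDS')]
```

At the scale cited above (93 million loans), a reconciliation of this kind would run as a batch process against the integrated databases the planned system architecture is meant to provide, but the logic per record is the same.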
Ensuring the quality of performance information on internal management systems
The Department must have solid information on the performance of its internal management systems if it is to achieve its Goal 4 objective of a high performance organization. This means obtaining adequate coverage of internal system performance and ensuring the reliability of such data.
The Department plans to extend its independent evaluations, now used mainly for program evaluations, to management evaluations. A priority area will be evaluation of the quality of performance data, including data for customer surveys, performance contracting, and employee performance ratings and awards. Also, information technology systems are a mainstay of productivity growth, and evaluations are planned to examine the quality of data on the performance of key information systems and on employee use of these systems.
Hatry, Harry and Kopczynski, Mary. Guide to Program Outcome Measurement for the U.S. Department of Education. Washington, DC: U.S. Department of Education's Planning and Evaluation Service, 1997.