U.S. Department of Education: Promoting Educational Excellence for all Americans

Archived Information

RA: National Institute on Disability and Rehabilitation Research - 2004

CFDA Number: 84.133 - National Institute on Disability and Rehabilitation Research


Program Goal: To conduct high-quality research that leads to high-quality research products.
Objective 8.1 of 3: Conduct high-quality research
Indicator 8.1.1 of 3: The percentage of grantee research that is deemed to be good to excellent as reflected in the appropriateness of the designs used and the rigor with which accepted standards of scientific and/or engineering methods are applied.

Targets and Performance Data

Percentage of grantee research and development activity rated 4 or greater in appropriateness of study designs, the rigor with which accepted standards of scientific and/or engineering methods are applied, and the degree to which the research and development activity builds on and contributes to the level of knowledge in the field, based on a 5-point Likert-type scale.

Year | Actual Performance | Performance Targets
2002 | 54   | 65
2003 | 67   | 70
2004 |      | 70
2005 |      | 75
2006 |      | 75
2007 |      | 80


Assessment of Progress

Progress: To date, only 20 of the 47 formative reviews slated for calendar year 2004 have been conducted, and these were focused exclusively on TBI and Burn Model Systems projects. The remaining reviews involve 13 RERCs and 14 RRTCs and are planned for the fall of 2004. Preliminary data from the first set of reviews indicate that only 53% of the Model Systems projects were deemed by constituent reviewers to be conducting “high-quality” research and demonstration projects. Actual performance on this measure for 2004 will be based on all 47 formative reviews conducted in calendar year 2004 and will be available in March 2005.

Explanation: In 2004 NIDRR changed the assessment of this measure from summative review, which is conducted late in a five-year funding cycle, to formative review, which is typically conducted during the first 15-18 months. This change was made in anticipation of replacing summative review of individual centers with a more comprehensive portfolio assessment process and to better align the review of scientific rigor with a stage in the funding cycle when recommendations can be acted upon more readily. Scores on this measure are based on constituent reviewers' ratings of “good to excellent” on six indicators of scientific rigor taken from NIDRR's “centers of excellence” model. The specific areas rated are: the level and appropriateness of investigators' expertise and their history of relevant publications; evidence of conducting an innovative program of basic or applied R&D; use of appropriate and rigorous methods; appropriateness of research tools; adequacy of sample size; and potential contribution to the advancement of knowledge or product development.
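
As a rough illustration of how a percentage of this kind can be derived from panel ratings, the sketch below aggregates hypothetical reviewer scores across the six rated areas and counts a project as “good to excellent” when its mean score reaches 4 on the 5-point scale. The indicator names, the sample data, the averaging rule, and the threshold are assumptions made only for illustration; they are not NIDRR's actual scoring procedure.

```python
# Illustrative sketch only: the rating data, aggregation rule, and threshold
# below are assumptions, not NIDRR's documented scoring procedure.

INDICATORS = [
    "expertise_and_publication_history",
    "innovative_rd_program",
    "rigorous_methods",
    "appropriate_research_tools",
    "adequate_sample_size",
    "contribution_to_knowledge_or_products",
]

def percent_good_to_excellent(projects, threshold=4.0):
    """Percentage of projects whose mean reviewer rating across the six
    indicators of scientific rigor is at or above the threshold."""
    qualifying = sum(
        1
        for ratings in projects
        if sum(ratings[name] for name in INDICATORS) / len(INDICATORS) >= threshold
    )
    return 100.0 * qualifying / len(projects)

# Hypothetical ratings for three projects on a 1 (poor) to 5 (excellent) scale.
sample_projects = [
    {name: 5 for name in INDICATORS},
    {name: 4 for name in INDICATORS},
    {name: 3 for name in INDICATORS},
]

print(f"{percent_good_to_excellent(sample_projects):.0f}%")  # prints 67%
```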

Sources and Data Quality

Additional Source Information: Program review-type meetings (i.e., reverse site visits) with expert panels representing the following key stakeholder groups: researchers and other scientists, practitioners, service providers, policy analysts, industry representatives, and individuals with disabilities.

Frequency: Annually.
Collection Period: 2004
Data Available: March 2005
Validated By: On-Site Monitoring By ED.

Improvements: Extensive efforts have been made to ensure that the centers being rated and the experts serving as reviewers are conversant with the evidence-based and outcomes-oriented approaches to the review process.

 
Indicator 8.1.2 of 3: A significant percentage of new studies funded by NIDRR assess the effectiveness of interventions using rigorous and appropriate methods.

Targets and Performance Data

Percentage of new studies funded by NIDRR that assess the effectiveness of interventions using rigorous and appropriate methods.

Year | Actual Performance | Performance Targets
2003 |      | 999


Assessment of Progress

Progress: This is a new measure that was added in 2004 in anticipation of the establishment of NIDRR's new portfolio assessment process based on expert panels. Due to delays in implementing the new panels, the measure was revised in the FY 2005 PM plan and renumbered 7.1.1 to give the agency more time to design the portfolio assessment process, which will replace the current system of summative program reviews. The next data collection period for measure 7.1.1 will be 2005, with results available in 2006.

Explanation: In 2004 and 2005 NIDRR will develop and test strategies for deriving this measure using information from the web-based annual project performance reporting (APPR) system and preliminary data from the initial round of portfolio review panels. A baseline will be established in 2007 using data from the previous two years.  

Sources and Data Quality

Additional Source Information: Triangulation of data from the web-based annual project performance reporting (APPR) system and the planned Portfolio Review Expert Panels.

Frequency: Annually.
Collection Period: 2005
Data Available: February 2006
Validated By: On-Site Monitoring By ED.

 
Indicator 8.1.3 of 3: The number of publications based on NIDRR-funded research in refereed journals

Targets and Performance Data

The average number of publications per award based on NIDRR-funded research and development activities in refereed journals.

Year | Actual Performance | Performance Targets
2002 | 2.74 |
2003 | 2.84 | 8
2004 |      | 5
2005 |      | 5
2006 |      | 10
2007 |      | 10


Assessment of Progress

Progress: The average number of peer-reviewed journal articles published in calendar year 2003 by NIDRR-funded RRTCs, RERCs, and Model Systems is 2.84 per award. Although this represents a slight increase over the previous year's average of 2.74, it falls significantly short of the original performance target, which was determined to be ill-founded. In the 2005 PM plan the performance target for 2002 was converted to a baseline to give NIDRR time to work out significant data management problems associated with the web-based annual project performance reporting (APPR) system and to establish a trend line. The data problems were resolved in July 2004, allowing NIDRR to report accurate and verifiable averages for both 2002 and 2003 publications for the three program funding mechanisms required to provide citation data in the existing APPR. NOTE: To capture all of the refereed publications that are published in a given calendar year, but which may not have come out in time to be included in the APPR for that year, the data collection period must span two years of performance reporting (i.e., data on 2004 publications will be based on both the 2004 and 2005 APPRs and will be available in September 2005).

Explanation: The total number of refereed articles published in 2003 by active centers and projects was 253, ranging from a high of 183 for the SCI, TBI and Burn Model Systems (n=37) to 48 for the RRTCs (n=29) and 22 for the RERCs (n=23). The average number of refereed publications per award also varied, from 4.95 for Model Systems to 1.66 for RRTCs and 0.99 for RERCs. The same ordering was observed for 2002 publications, although the numbers differ: average peer-reviewed journal articles per award increased by approximately 1.5 for Model Systems (3.48 to 4.95), declined by almost the same amount for RRTCs (2.89 to 1.66), and remained virtually unchanged for RERCs (1.10 vs. 0.99). Variations in this measure by program type are most likely due to differences in the nature of the research and demonstration activities conducted (i.e., medical/clinical rehabilitation research for Model Systems vs. psychosocial research for RRTCs and engineering design and development for RERCs), whereas differences over time probably have more to do with variations in the topics and the number of awards funded and terminating in a given year. A new baseline will be set in 2005 using data from 2002-2004 publications.
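
The per-award figures above are simple ratios of the total number of refereed articles to the number of active awards for each funding mechanism. The short sketch below shows that arithmetic using the 2003 Model Systems and RRTC figures quoted in this explanation; it is for illustration only and is not part of the APPR system.

```python
# Illustrative arithmetic for this measure: total refereed journal articles
# published in a calendar year divided by the number of active awards.
# The figures are the 2003 Model Systems and RRTC numbers quoted above.

def publications_per_award(total_articles: int, active_awards: int) -> float:
    """Average number of refereed journal articles per active award."""
    return total_articles / active_awards

print(round(publications_per_award(183, 37), 2))  # Model Systems: 4.95
print(round(publications_per_award(48, 29), 2))   # RRTCs: 1.66
```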

Sources and Data Quality

Source: Performance Report
Contractor Performance Report

Program: NIDRR.
Contractor: Research Triangle Institute, North Carolina.

Additional Source Information: The web-based annual project performance reporting (APPR) system.

Frequency: Annually.
Collection Period: 2004 - 2005
Data Available: September 2005
The peer-reviewed status of self-reported journal articles cited in the APPR system by individual grantees is verified by the National Education Library based on the International Scientific Index.

Limitations: (1) Data on peer-reviewed publications are based on self-reported citations by grantees in the web-based annual project performance reporting (APPR) system. Concerns have been raised about the potential for over-reporting, and methods to independently confirm publications are planned. (2) In the current version of the APPR, only three program funding mechanisms are required to report citation data. (3) To date, this measure does not include peer-reviewed journal articles published during the final year of an award.

Improvements: NIDRR is evaluating methods of assessing productivity that fairly represent all parts of the NIDRR grant portfolio.

 

Objective 8.2 of 3: Disseminate and promote use of information on research findings, in accessible formats, to improve rehabilitation services and outcomes.
Indicator 8.2.1 of 1: Grantees deemed to be implementing a plan for widespread dissemination and utilization of validated research findings, developed with stakeholder input and based on measurable objectives, that is producing products and services at sufficient levels and in accessible formats and reaching targeted customers in sufficient numbers, including those from diverse and underserved populations

Targets and Performance Data

The percentage of grantees deemed to be implementing a plan for widespread dissemination and utilization of validated research findings, developed with stakeholder input and based on measurable objectives, that is producing products and services at sufficient levels and in accessible formats and reaching targeted customers in sufficient numbers, including those from diverse and underserved populations.

Year | Actual Performance | Performance Targets
2002 | 68    | 50
2003 | 55.50 | 50
2004 |       | 55
2005 |       | 60
2006 |       | 65
2007 |       | 70


Assessment of Progress

Progress: No data are reported for this measure for 2004 because the decision was made to drop it from NIDRR's set of performance measures.

Explanation: The decision to drop this measure was based on several factors, including: (1) the development of NIDRR's new Draft Logic Model and the changing view of the role of “Dissemination” reflected in the model; and (2) plans to conduct a Comprehensive Evaluation of NIDRR's Knowledge Dissemination and Utilization portfolio in 2005, the results of which will be used to inform strategic planning in this area. A new “developmental” measure has been defined under Goal 7 to replace the deleted one; it reflects NIDRR's new strategic goal for the primary outcome arena of “Knowledge Translation and Dissemination” depicted in the Logic Model. This new measure emphasizes the utility of grantee outputs rather than the quality of dissemination plans.

Sources and Data Quality

Validated By: On-Site Monitoring By ED.

 

Objective 8.3 of 3: Ensure Utility of Research Problems and Products to End-Users
Indicator 8.3.1 of 1: Outcomes-Oriented Measure of Results of R&D Investment: The number of new or improved assistive and universally-designed technologies, devices and systems developed by grantees that are deemed to improve rehabilitation services and outcomes and/or enhance opportunities for full participation, and are successfully transferred to industry for potential commercialization.

Targets and Performance Data

Number of new or improved assistive and universally-designed technologies, devices, and systems developed by grantees that are rated “good to excellent” in ability to improve rehabilitation services and outcomes and/or to enhance opportunities for full participation, and are successfully transferred to industry for potential commercialization.

Year | Actual Performance | Performance Targets
2004 |      | 999
2005 |      | 999


Assessment of Progress

Progress: This is a new measure that was added to NIDRR's set of Goal 8 performance measures in the 2005 PM plan. The wording of the measure was subsequently revised in the 2006 PM plan based on recommendations from the PART review, follow-up negotiations with the Department's Budget and Strategic Accountability Services, and the development of NIDRR's new Draft Logic Model.

Explanation: Preliminary data on this measure will be collected from a sample of NIDRR grantees in the summer of 2005 based on the pilot version of the revised web-based annual project performance reporting (APPR) form for 2004-2005. The first official data will be collected from all grantees in the spring and summer of 2006, based on the 2005-2006 performance period, and will be available in November 2006. A baseline will be established in 2007 using both pilot and official data from 2004-2005 and 2005-2006.

Sources and Data Quality

Source: Performance Report
Grantee Performance Report: 1820-0642 Annual Performance Reporting Forms for NIDRR Grantees (RERCs, RRTCs, DBTACs, DRRPs, Model Systems, Dissemination & Utilization Projects).
Program: National Institute on Disability and Rehabilitation Research.

Additional Source Information: Triangulation of data from the web-based annual project performance reporting (APPR) system and program review-type meetings with expert panels.

Frequency: Annually.
Collection Period: 2005 - 2006
Data Available: November 2006
Validated By: On-Site Monitoring By ED; review by expert panel.

Improvements: To reduce the costs and improve the efficiency of collecting qualitative judgments from expert panels, in 2004 NIDRR will experiment with using Internet-based alternatives to face-to-face program-review-type meetings.

 
