ATTACHMENT A

MAGNET SCHOOLS ASSISTANCE PROGRAM (MSAP) EVALUATION
Performance Work Statement (PWS)

OVERVIEW

This evaluation will provide a comprehensive examination of the Magnet Schools Assistance Program (MSAP). The Department of Education (ED), Office of the Under Secretary (OUS), Planning and Evaluation Service (PES) needs to evaluate the MSAP in order to provide information to Congress, the MSAP, and other policy makers on the uses, successes, and problems associated with federal funding to magnet projects. The evaluation will be based largely on the program's performance indicators and will also include additional questions regarding the federal role in supporting and promoting magnet schools. Specifically, the evaluation will be guided by the following seven research questions:

1. What are the characteristics of MSAP projects?
   - How are students in MSAP districts selected for the magnet projects (e.g., by lottery, academic examination, race/ethnicity)? What is the frequency and extent of waiting lists?
   - How many schools and students do MSAP projects serve? What grade levels do MSAP projects serve? How many schools are targeted for desegregation impact and how many are designated as feeder schools?
   - What is the demographic composition of MSAP project target and feeder school students?
   - What proportion of MSAP-targeted schools are whole-school vs. program-within-a-school?
   - How do MSAP school teachers and principals compare to those in traditional public schools in their district in terms of background and demographic characteristics?
   - How are MSAP projects managed and financed? In addition to MSAP grants, what other funding sources are available to and accessed by MSAP projects? Do magnet schools and districts receive and coordinate other federal funding for which they are eligible?
   - Are magnet projects more autonomous and/or accountable to local education agencies (LEAs) than traditional public schools in their district, and if so, how?

2. What are the characteristics of MSAP districts?
   - What is the demographic composition, size, and urbanicity of MSAP districts? What are the enrollment trends of MSAP districts by racial/ethnic composition?
   - How many districts are operating under a court order vs. implementing a voluntary desegregation plan?
   - What other kinds of school choice are available to students and families in MSAP districts?

3. To what extent are federally funded magnet projects reducing the incidence or degree of minority student isolation in their programs?
   - How many MSAP grantees have explicit desegregation objectives (reduce, eliminate, or prevent minority isolation) and annual benchmarks? Of those grantees with explicit desegregation objectives, how many implemented the strategies they specified to meet those objectives, and how many met those benchmarks/objectives?
   - How do overall district enrollment trends and/or other factors influence grantees' ability to meet their desegregation objectives?
   - For programs-within-a-school, do magnet school program activities reflect a minority/non-minority distribution similar to that of the school as a whole?
   - Are there differences in desegregation outcomes for different minority groups?

4. To what extent are federally funded magnet projects promoting systemic reform efforts consistent with the Goals 2000: Educate America Act?
   - Are magnet projects and schools involved in systemic reform efforts, and if so, what kind, to what extent, and are they national, state, or local efforts?
   - Are magnet projects using/consolidating Title I, Goals 2000, and other federal funds to promote these reforms?
   - To what extent, if any, is there tension between what is required by state standards and the magnet schools' mission, philosophy, and/or curriculum?

5. To what extent do federally funded magnet projects feature innovative educational methods and practices that meet identified student needs and interests?
   - How are student needs and interests gauged, and how are they incorporated into magnet projects?
   - How do magnet projects define "innovative"? What kinds of educational methods/practices/curricula do grantees employ? Are they research-based?
   - To what extent do these innovative practices serve as models for school improvement? Do MSAP projects, LEAs, and state education agencies (SEAs) formally or informally share information about promising reform strategies for public education generally? What can the successes and failures of magnet projects tell policy makers and practitioners about reform strategies?

6. To what extent do federally funded magnet projects strengthen students' knowledge of academic subjects and skills needed for successful careers in the future?
   - Do magnet students meet or exceed the achievement benchmarks/goals set forth in their applications?
   - What kinds of performance data do magnet projects collect, and how often (e.g., standardized assessments, portfolios, coursetaking, attendance)?
   - Do magnet students show achievement gains in core subjects and special skill areas (where applicable)? What accounts for magnet student achievement gains or losses?
   - Are achievement gains of students in magnet projects statistically and substantively significantly different from those of comparable students in traditional public schools in the district?

7. How has the MSAP contributed to the development and implementation of magnet projects?
   - What kinds of planning and implementation activities do MSAP grants support? What did federal funds allow magnet projects to do that they otherwise could not have done?
   - How does the MSAP award grants? What accounts for differences in federal grant amounts? Does the distribution of funds differ by year of implementation, and if so, how? What proportion of annual costs (less transportation) does the MSAP grant cover?
   - What kinds of technical assistance does the MSAP provide? How accessible, useful, and timely is this assistance?
   - Do MSAP grantees plan to continue their programs after their grant expires?

LEGISLATIVE AUTHORIZATION

This evaluation is authorized under the Improving America's Schools Act (IASA), P.L. 103-382, sections 5112(a) and (b) of the Magnet Schools Assistance Program: "EVALUATIONS. (a) RESERVATION. The Secretary may reserve not more than two percent of the funds appropriated under section 5113(a) for any fiscal year to carry out evaluations of projects assisted under this part. (b) CONTENTS. Each evaluation described in subsection (a), at a minimum, shall address: (1) how and the extent to which magnet school programs lead to educational quality and improvement; (2) the extent to which magnet school programs enhance student access to quality education; (3) the extent to which magnet school programs lead to the elimination, reduction, or prevention of minority group isolation in elementary and secondary schools with substantial proportions of minority students; and (4) the extent to which magnet schools programs differ from other school programs in terms of the organizational characteristics and resource allocations of such magnet school programs." 20 U.S.C. 7212.
BACKGROUND

The Magnet Schools Assistance Program

The Magnet Schools Assistance Program is a federal grant program designed to support the development of high quality magnet projects. Local education agencies (LEAs) that are part of a court-ordered or federally approved desegregation plan are eligible for funding, and competitive awards are made on three-year grant cycles. Competition for awards is currently underway for FY 98; the MSAP anticipates that it will fund 60 projects in total, with an estimated average award size of $1,608,000 per year.

MSAP grants assist in the desegregation of public schools by supporting the elimination, reduction, and prevention of minority group isolation in elementary and secondary schools with substantial numbers of minority group students. They also permit the development and implementation of magnet schools that assist in the achievement of systemic reforms and provide all students with the opportunity to meet challenging content and performance standards; the development and design of innovative education methods and practices; and courses of instruction in magnet schools that strengthen students' knowledge of academic subjects and their grasp of tangible and marketable vocational skills.

Magnet Schools Research

The Department of Education (ED) has supported two major studies of magnet schools in the recent past (copies of both reports are available by calling 1-800-USALEARN, and an "analysis and highlights" of the 1994 study is available on the Department of Education website at www.ed.gov/offices/OUS/eval/elem.html#Magnet). Both studies were conducted under ED contract #LC90043001 by the American Institutes for Research.

The first study, completed in 1994, was largely a descriptive study of the growth, nature, and comparability to other school choice programs (such as non-magnet specialty schools) of a nationally representative sample of magnet projects (Educational Innovation in Multiracial Contexts: The Growth of Magnet Schools in American Education, 1994). Key findings from this report include:

- By 1991-92, the number of school districts operating magnet projects had doubled over the previous 10-year period, with over half operating waiting lists, suggesting considerable unmet demand for these programs;
- Magnet school programs tended to be concentrated in districts having minority enrollments of 50 percent or more, and over half were located in large urban districts; and
- Most magnet schools were "whole school magnets," where all students in the school participate in the magnet program.

The second study focused on the extent to which federally funded magnet projects (MSAP grantees) were able to eliminate, reduce, or prevent minority student isolation in targeted schools (Reducing, Eliminating, and Preventing Minority Isolation in American Schools: The Impact of the Magnet Schools Assistance Program, 1996).
Key findings from that report include:

- Desegregation objectives consistent with the statutory goals of reducing, eliminating, or preventing minority isolation could be determined for only three-fifths of the schools targeted for desegregation impact;
- About half of the schools targeted for desegregation impact were able to meet their desegregation objectives within the two-year period covered by the MSAP grants, while increasing minority enrollments in many of the districts constrained efforts to reduce, eliminate, or prevent isolation in many schools; and
- There were modest improvements in racial balance among targeted schools in the grantee districts, but the number of targeted schools that were minority isolated increased slightly during the two-year period covered by the MSAP grants.

In addition to these ED-sponsored reports on magnet schools, the Citizens' Commission on Civil Rights also issued a report, Difficult Choices: Do Magnet Schools Serve Children in Need? (Citizens' Commission on Civil Rights, 1997), which focused on magnet schools in three communities (St. Louis, MO; Cincinnati, OH; and Nashville, TN) where extensive use has been made of magnet schools in meeting obligations to desegregate schools. Key report findings include:

- Magnets in all three communities studied had been successful in creating desegregated schools; in Nashville and Cincinnati, the percentage of black students enrolled in the magnet programs is roughly the same as the total percentage of black students in the district, and in St. Louis, a combination of magnet schools and other choice options also had resulted in a high percentage of resident black students attending desegregated schools.
- Despite efforts to inform and attract students from low income families, the study found that low income children remain more highly concentrated in non-magnet than in magnet schools.
- District data from the three communities suggest that low income students in magnet schools generally do better on measures of academic performance than their counterparts at non-magnet schools. Additional evidence from Cincinnati and St. Louis suggests that this may be the case even when differences in the socio-economic status of students are taken into account.

On the question of the impact of magnet schools on student achievement, an analysis of 1988 and 1990 National Education Longitudinal Study data (Gamoran, "Student achievement in public magnet, public comprehensive, and private city high schools," Educational Evaluation and Policy Analysis, 1996) showed that magnet school students' achievement gains in reading, science, and social studies were statistically and substantively significantly higher than those of students in public comprehensive schools, private nonsectarian schools, or Catholic schools, after controlling for a host of other explanatory variables.

Additionally, in recent months the Department has sponsored a Feasibility Study of Evaluating Magnet Schools Assistance Program Grantee Student Achievement (conducted under ED contract #EA 94052001, task order #EA 980150, by the American Institutes for Research), which outlines key issues that have implications for this evaluation's analysis of magnet student achievement. The purpose of the study was to examine how accessible, reliable, and useful MSAP grantee application, annual project report, and other LEA data are for evaluating the impact of MSAP on student achievement, particularly as student achievement is conceived in the MSAP performance indicators.
This study was a preliminary examination of nine MSAP FY 95 grantees, focusing on issues relevant to an evaluation of the achievement of magnet students and comparable non-magnet public school students (e.g., what kinds of assessments are used for magnet students as compared to students district wide). The study relied on the FY 95 grantee applications and performance reports and on in-depth interviews with magnet project directors and Pupil Personnel Officers/Evaluators to highlight potential problem areas and suggest solutions that will minimize the barriers to conducting a meaningful evaluation of MSAP grantee student achievement.

GENERAL EVALUATION DESIGN

This evaluation of the MSAP shall track the progress of the approximately 60 1998 MSAP grantees in attaining the overall statutory, as well as project-specific, objectives over the course of their three-year grant cycle. It shall involve the collection of longitudinal data from a variety of sources, most importantly MSAP applications and annual performance reports, and the development of reports coinciding with the end of each MSAP project year. Each of the three reports shall address the key research questions, discuss the progress of all MSAP grantees and of the samples of MSAP schools and students followed over time, and be separated into two volumes: one for policy makers and the other for practitioners. The contractor shall implement a dissemination plan which shall ensure that a wide range of interested parties is provided with the findings of the evaluation. The year 2 and year 3 reports shall illuminate trends for key research questions and build upon lessons learned in earlier data collections.

The Framework for the Evaluation

The Government Performance and Results Act of 1993 (GPRA) requires the development of the MSAP performance plan. This plan, like all ED program performance plans, includes a set of performance indicators intended to be used as a tool to assess the extent to which the program is meeting its statutory objectives. As such, it offers a useful framework for evaluation purposes. The research questions for this evaluation have been framed around these indicators so that the evaluation addresses these information needs to the extent possible. Additionally, MSAP grant application materials include the program performance indicators, and potential grantees are instructed to address them throughout their application. Grantees are also required to address the performance indicators in their annual progress reports to ED.

Evaluation Design

The contractor shall develop a thorough and complete design for the project which will guide the evaluation, and shall include, at a minimum:

- Description of the sample selection plan, including the numbers of respondents (e.g., magnet school administrators and teachers) from which data will be collected, the basis for the sample size proposed, and the calculations employed to determine the numbers proposed. At the grantee level, all local MSAP projects shall be examined. At the school and student level, the contractor shall develop a sample design and sample size that maximizes the generalizability of findings to MSAP-funded magnet schools and students. In the sample design, the contractor shall consider stratification variables such as whole school/program-within-school programs, school size, elementary/middle/high schools, and urban/suburban/rural programs (an illustrative allocation sketch follows this item).
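To make the stratification element above concrete, the following is a minimal sketch in Python of a proportional allocation across strata built from the variables just listed. The sampling frame, stratum labels, target sample size, and allocation rule are all assumptions made for illustration; the sample design the contractor proposes would supersede them.

    # Illustrative sketch only: proportional allocation of a school sample across
    # hypothetical strata (school level x urbanicity x whole-school status).
    # The frame, stratum labels, TARGET_N, and allocation rule are assumptions.
    import random
    from collections import Counter, defaultdict

    random.seed(1998)

    # Hypothetical sampling frame: one record per MSAP-funded school.
    levels = ["elementary", "middle", "high"]
    urbanicities = ["urban", "suburban", "rural"]
    frame = [
        {"school_id": i,
         "level": random.choice(levels),
         "urbanicity": random.choice(urbanicities),
         "whole_school": random.random() < 0.7}
        for i in range(400)
    ]

    TARGET_N = 120  # total schools to sample (assumed, not a contract figure)

    def stratum(school):
        """Stratum key built from the stratification variables named above."""
        return (school["level"], school["urbanicity"], school["whole_school"])

    by_stratum = defaultdict(list)
    for s in frame:
        by_stratum[stratum(s)].append(s)

    # Proportional allocation with a floor of one school per non-empty stratum,
    # followed by a simple random draw within each stratum.
    sample = []
    for key, schools in by_stratum.items():
        n_alloc = max(1, round(TARGET_N * len(schools) / len(frame)))
        sample.extend(random.sample(schools, min(n_alloc, len(schools))))

    print("schools sampled:", len(sample))
    print(Counter(stratum(s) for s in sample))

A proportional rule of this kind keeps the school sample representative of the MSAP-funded universe while guaranteeing that small but analytically important strata (e.g., rural high schools) are not dropped entirely.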
- Description of the data collection plan, including the types of data collection instruments that the contractor will use, the specific data that the contractor will collect, and how the data will address each of the research questions. The contractor shall conduct all data collection and reporting in accord with the Standards for Education Data Collection and Reporting developed for the National Center for Education Statistics, U.S. Department of Education, unless otherwise approved by ED. These standards are available on the U.S. Department of Education website at: nces.ed.gov/pubsearch/pubsinfo.asp?pubid=92021XXXXX. This data collection plan shall include a matrix showing all data collection activities, when they will occur, and who will be responsible for implementing them. Data collection sources and instrumentation are described in greater detail below (see "Data Collection" section).

- Data analysis plan that describes how the contractor proposes to analyze the data obtained from all data collection instruments, and any plans for additional analyses in reports for years 2 and/or 3.

- Dissemination plan that includes, at a minimum, a complete description of the procedures and strategies to be used for widely disseminating the findings generated under this contract to both practitioners and policy makers. For example, the dissemination plan may include plans to write articles for Education Week, newspapers, and parent organization newsletters, and/or descriptions of plans for presenting report findings to policy audiences in such arenas as the Magnet Schools of America annual conference.

- Description of the procedures to reduce participant burden, to obtain a participant response rate and an item response rate of at least 90 percent on all surveys and other data collection instruments used in the evaluation, and to maintain confidentiality of all data collected on all participants in this evaluation. The contractor shall maintain information that identifies persons or institutions in files that are separate from other research data and that are accessible only to authorized agency and contractor personnel. The contractor shall explicitly reference Section 3 of the National Center for Education Statistics' Restricted-Use Data Procedures Manual (see nces.ed.gov/NCES/orderinfo.html for information on how to order a free copy).

Data Collection

The contractor shall develop a complete set of data collection instruments designed to provide sufficient information to address the research questions outlined above. These instruments shall consider several different methods of data collection appropriate to the types of information needed. For example:

MSAP applications and annual performance reports. The Department will provide the contractor with grant applications and annual performance reports (as they become available; see tentative schedule below) of all 1998 MSAP grantees. MSAP grantees are required to report every project year (three times, once per data collection/reporting cycle of the contract) on the MSAP performance indicators and the extent to which they are meeting the objectives set forth in their application. Every effort was made to disseminate ED guidance to potential MSAP applicants on how to address the MSAP performance indicators in their applications and performance reports. Each MSAP-funded district will be provided with this guidance for addressing the indicators in its annual performance reports in the rare case that it has not already received this documentation.
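Because the applications and annual performance reports are the evaluation's central data source, the data extraction protocols described below will need to roll grantee-reported indicator values up into project-level summaries. The following is a minimal, purely illustrative sketch in Python of that kind of roll-up; the field names, indicator labels, and records are hypothetical and do not reflect the actual MSAP reporting forms.

    # Illustrative sketch only: aggregating indicator data extracted from annual
    # performance reports. All field names and values are hypothetical; the real
    # extraction protocol is developed under Task 5.
    import csv
    import io
    from collections import defaultdict

    # Pretend extract: one row per targeted school per project year.
    raw = """grantee,project_year,school,minority_pct,met_desegregation_benchmark
    District A,1,Adams ES,78.2,yes
    District A,1,Baker MS,64.5,no
    District B,1,Carver HS,51.0,yes
    District B,1,Dewey ES,47.3,yes
    """

    rows = list(csv.DictReader(io.StringIO(raw.replace("\n    ", "\n"))))

    # Roll up to the grantee (project) level for a given project year.
    summary = defaultdict(lambda: {"schools": 0, "met": 0, "minority_pct_sum": 0.0})
    for r in rows:
        g = summary[(r["grantee"].strip(), r["project_year"])]
        g["schools"] += 1
        g["met"] += r["met_desegregation_benchmark"] == "yes"
        g["minority_pct_sum"] += float(r["minority_pct"])

    for (grantee, year), g in summary.items():
        print(f"{grantee}, year {year}: {g['schools']} targeted schools, "
              f"{g['met']} met benchmark, "
              f"avg minority enrollment {g['minority_pct_sum'] / g['schools']:.1f}%")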
For the purposes of collecting the requisite data for this evaluation, additional data sources shall be targeted to fill the gaps between the research questions and the expected content of the MSAP applications and performance reports. Follow-up calls to MSAP districts on particular elements of their applications and/or performance reports will likely be necessary to ensure all data elements are captured for the evaluation. A tentative schedule for MSAP grantee reporting (individual grantee timelines vary according to their project period) is provided below:

    Project/School Year    Performance Report Due Date(s)
    1998-99                April, June, July 1999
    1999-00                April, June, July 2000
    2000-01                September-December 2001

The contractor shall develop data extraction protocols for aggregating and summarizing the data contained in the MSAP applications and annual performance reports.

Project director survey/interview. To examine important contextual and supplemental information not captured in MSAP applications and annual performance reports, the contractor shall develop a survey/interview to be administered to MSAP project directors. The contractor shall consider including questions in this survey about external factors influencing their ability to meet their desegregation objectives and other goals, the degree to which MSAP projects and LEAs/SEAs share information about promising practices, how MSAP project theme/focus areas were selected and implemented, how the MSAP grant and other federal funding sources have helped to support their magnet program and systemic reform efforts, how MSAP projects are managed and financed, and the accessibility and usefulness of federal technical assistance.

Principal and teacher surveys. The contractor shall consider developing principal and teacher surveys designed to explore topics such as their background characteristics, the autonomy and control afforded to principals and teachers in MSAP-funded schools, and other school-level issues that pertain to the research questions.

LEA/school record review and interview protocols. The contractor shall develop a student achievement/performance data extraction methodology and/or interview protocols for key LEA personnel (e.g., Pupil Personnel Officer) in order to gather the requisite information to address the research questions regarding magnet student and comparable traditional public school student achievement, the sharing of information about and replication of model practices between magnet schools and the LEA, and how the MSAP project fits into the LEA's systemic reform efforts.

Interviews and document requests with MSAP staff. The contractor shall consider developing interview protocols and document requests from key MSAP staff in order to gather information regarding the MSAP grant-making process and technical assistance activities. Existing documentation that will address questions of federal technical assistance activities and grant-making processes will be provided to the contractor at the kick-off meeting.

The contractor shall consider the extent to which site visits may be necessary to obtain the requisite information and the impact that this decision will have on the sample size for the evaluation.

SCOPE OF WORK

The contractor shall conduct a comprehensive evaluation of the impact of the MSAP based on the four primary objectives of the program by conducting a longitudinal study of the 1998 MSAP grantees across the program's objectives and over the three years of the grant cycle.
The contractor shall prepare three reports (keyed to year of grant cycle) summarizing findings, and provide briefings to ED staff and other magnet school stakeholders on the report findings.

TASK 1 Kick-off meeting with ED

The contractor shall meet with the Contracting Officer's Technical Representative (COTR), the Contracting Officer (CO) or his/her designated representative, and other ED staff within 1 week after the effective date of the contract. The purpose of this meeting is to discuss upcoming tasks, preliminary evaluation design issues, the scheduling of activities, and other issues related to the conduct of the work. The contractor should allocate 1 day for this initial meeting. The COTR will identify a time and place for the meeting and will assume responsibility for inviting all Department staff. The contractor shall provide the COTR with meeting minutes via e-mail within 2 weeks of the effective date of the contract.

TASK 2 Utilize outside expertise on an ongoing basis

Subtask 2.1 Establish Technical Work Group (TWG)

The contractor shall form a Technical Work Group (TWG) of 7-10 people to provide the contractor with outside expertise on the conduct of the evaluation including, but not limited to: refinements of the evaluation design; data collection and instrumentation; analysis plans; and the content and format of evaluation reports. The contractor is free to accept or reject any advice or recommendations individual panel members offer. The contractor shall submit to the COTR a list of proposed TWG members for approval by the COTR no later than 2 weeks after the effective date of the contract. The contractor shall modify the list of TWG members it submitted as part of its proposal to incorporate suggestions and comments provided during the proposal review process and the kick-off meeting. In the list, the contractor shall discuss the strengths of each potential advisor and explain the role each will play in helping achieve the objectives of the evaluation. The COTR will provide comments on the list within one week. The contractor shall contact each selected member within 4 weeks after the effective date of the contract and formally invite him or her to serve on the work group. The contractor shall finalize the TWG membership and submit the final list to the COTR via e-mail no later than 5 weeks after the effective date of the contract.

Subtask 2.2 Convene Technical Work Group

The contractor shall convene the first meeting of the TWG within 8 weeks after the effective date of the contract. The contractor, in conjunction with the COTR, shall decide upon the timing and scope of the subsequent meetings after the first meeting. During the course of the contract, the contractor shall convene the TWG for no more than 3 meetings of one day's duration each per project year. Department staff will observe/participate at these meetings. The contractor shall convene all meetings in the Washington, D.C. metropolitan area. The contractor shall develop a schedule for succeeding meetings after the first TWG meeting is held and submit it to the COTR within 11 weeks of the effective date of the contract. Three weeks prior to each meeting, the contractor shall submit a draft agenda and briefing book for review by the COTR. After a three-day review by the COTR, the contractor shall revise the agenda and briefing book as required.
Briefing books shall be sent to the TWG members one week prior to each meeting and shall include at least the following: the agenda, status reports, background information on issues to be discussed, and any draft deliverables to be discussed at the meeting. The contractor shall prepare and submit summary minutes of the TWG meetings via e-mail to the COTR within 1 week after each meeting takes place. After a one-week review by the COTR, the contractor shall revise the minutes and submit a final copy to the COTR no later than one week after receiving the COTR's comments.

TASK 3 REFINE THE BASELINE MANAGEMENT PLAN

The contractor shall revise the baseline management plan submitted in its proposal to reflect topics discussed in the initial meeting with the COTR and items raised in negotiation no later than 4 weeks after the effective date of the contract. The contractor shall include in the revised plan critical path diagrams and Gantt or PERT charts, including person-loading charts by task. In years 2 and 3 of the evaluation, the contractor shall refine and update the plan to incorporate revisions as needed (e.g., due to staffing changes).

TASK 4 REFINE THE EVALUATION DESIGN

The contractor shall refine the evaluation design described in its proposal based on the initial meeting with ED staff and TWG suggestions. The contractor shall include in the evaluation design all elements outlined in the "General Evaluation Design" section above. The contractor shall submit the draft evaluation design to the COTR within 10 weeks after the effective date of the contract. Allowing 3 weeks for Department review and comment, the contractor shall submit the revised evaluation design to the COTR within 15 weeks after the effective date of the contract.

TASK 5 DATA COLLECTION PLAN AND INSTRUMENTATION

Subtask 5.1 Refine data collection plan

The contractor shall refine the overview of all data collection instruments that clearly and succinctly describes the data the contractor will collect, why each particular data element was determined to be important, and how each data element will answer one or more of the research questions (an illustrative crosswalk sketch appears at the end of this task). This refined data collection plan shall be submitted with the data collection instruments described in Subtask 5.2.

Subtask 5.2 Develop data collection instruments

The contractor shall develop data collection instruments that include all questions that the contractor will ask of all respondents for the evaluation. The contractor shall submit the draft data collection instruments within 17 weeks after the effective date of the contract. Allowing 3 weeks for Department review and comment, the contractor shall submit the revised data collection instruments to the COTR within 21 weeks after the effective date of the contract.

Subtask 5.3 Pilot test data collection instruments

The contractor shall conduct a pilot test of the data collection instruments developed and revised under Subtask 5.2. This pilot test shall comply with Office of Management and Budget (OMB) requirements not to ask the same questions of more than nine individuals. The contractor shall select pilot test entities based on discussions with the COTR. The contractor shall administer the pilot test within 23 weeks after the effective date of the contract. Based on feedback from the pilot test, the contractor shall make appropriate revisions and finalize the data collection instruments within 25 weeks after the effective date of the contract.
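One simple way to verify the Subtask 5.1 requirement that every data element answer one or more research questions is to maintain an instrument-to-question crosswalk and check it for coverage. The sketch below, in Python, is illustrative only; the instrument names, item labels, and mappings are hypothetical, and the crosswalk the contractor actually submits would be far more detailed.

    # Illustrative sketch only: a crosswalk between data collection instruments
    # and the seven research questions, with a simple coverage check.
    # Instrument names, item labels, and mappings are hypothetical.

    RESEARCH_QUESTIONS = [f"RQ{i}" for i in range(1, 8)]  # RQ1-RQ7 from the Overview

    # Each entry: (instrument, item label, research questions the item informs).
    crosswalk = [
        ("performance report extraction", "targeted school enrollment by race/ethnicity", ["RQ1", "RQ3"]),
        ("project director interview", "external factors affecting desegregation objectives", ["RQ3"]),
        ("project director interview", "use of Title I / Goals 2000 funds", ["RQ4", "RQ7"]),
        ("principal survey", "autonomy and accountability to the LEA", ["RQ1"]),
        ("teacher survey", "background and demographic characteristics", ["RQ1"]),
        ("LEA record review", "achievement data for magnet and comparison students", ["RQ6"]),
        ("MSAP staff interview", "grant-making process and technical assistance", ["RQ7"]),
    ]

    covered = {rq for _, _, rqs in crosswalk for rq in rqs}
    missing = [rq for rq in RESEARCH_QUESTIONS if rq not in covered]

    print("research questions without a mapped data element:", missing or "none")

Run against a complete crosswalk, the check should report no uncovered research questions; any question it flags indicates a gap to close before the instruments are finalized and submitted for clearance.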
TASK 6 OMB CLEARANCE PACKAGE

The contractor shall develop a forms clearance package which shall cover all three years of the evaluation's data collections. The package shall be developed for the Department to submit to OMB under procedures of the Paperwork Reduction Act and 5 CFR 1320. The contractor shall submit the draft forms clearance package to the COTR within 25 weeks after the effective date of the contract. Allowing 3 weeks for ED review and comment, the contractor shall submit the revised forms clearance package to the COTR within 30 weeks after the effective date of the contract. The contractor shall schedule at least 5 months for review of the clearance package by the Department's Information Management Team (IMT) and by OMB prior to OMB approval. IMT or OMB may require revisions to parts of the clearance package prior to approval. The contractor shall make the required revisions, respond to questions from OMB and the public upon request, and submit the revised materials to the COTR. The contractor shall, if necessary, meet with ED and OMB staff to discuss the clearance package and its revisions and provide other support for the clearance process. Allowing just over 5 months, OMB clearance is expected to be received within 52 weeks after the effective date of the contract.

TASK 7 REFINE ANALYSIS PLAN

The contractor shall refine the preliminary analysis plan submitted with its proposal. The contractor shall include in the refined analysis plan a detailed description of how the contractor will treat the data gathered, specifying the manner in which the contractor will analyze the data over the course of the evaluation, including the techniques to analyze the quantitative and qualitative data elements for each data collection activity/instrument in the evaluation. The contractor shall include in the analysis plan a description of all key variables that it will analyze, how the data analysis addresses the research questions, how the analyses integrate data from various sources, and the types of descriptive and inferential statistics that the contractor will use, along with table shells to illustrate planned analyses for the three annual reports. The contractor shall submit the revised analysis plan to the COTR no later than 24 weeks after the effective date of the contract. Allowing 3 weeks for ED review and comment, the contractor shall submit the final analysis plan to the COTR no later than 30 weeks after the effective date of the contract.

TASK 8 NOTIFICATION OF STUDY PARTICIPANTS

Subtask 8.1 Select sample and prepare notification materials for sample

The contractor shall select districts and schools from within the sampling frame based on the sampling plan included in the final evaluation design. The contractor shall also make initial contacts with all sites to secure their participation. The contractor shall prepare notification packets for the districts and schools selected for the sample as well as for the states in which these sites are located. The contractor shall include in the notification packets a notification letter and a brochure. The contractor shall ensure that the letter includes an overview of the evaluation as well as specific information on the data collection schedule and plans, sample, provisions for maintaining anonymity of survey participants, data security, the organizations and persons involved in the evaluation, and the benefits to be derived from the evaluation.
The contractor shall ensure that the brochure is a non-technical 8½-by-11-inch tri-fold brochure describing the evaluation that is suitable for distribution to a broad audience of policy makers, educators, and managers of education programs. The contractor shall submit draft notification packets (including draft letters and draft brochure) to the COTR no later than 22 weeks after the effective date of the contract. The COTR will obtain the Under Secretary's signature on the notification letters and forward these letters, as well as comments on the brochure, back to the contractor within 25 weeks of the effective date of the contract. The contractor shall revise the notification packets as needed and submit copies of the final letters and brochure to the COTR within 26 weeks of the effective date of contract award.

Subtask 8.2 Notify sample participants

The contractor shall mail the notification packets developed under Subtask 8.1 to relevant personnel no later than 1 week after OMB clearance.

TASK 9 YEAR 1 DATA COLLECTION, ANALYSIS AND REPORTING

Subtask 9.1 Collect year 1 data

The contractor shall make final changes to the data collection instruments and begin to administer the data collection instruments to respondents within 1 week after OMB clearance. The contractor shall implement procedures to ensure that the data collection instruments are administered in a timely and efficient manner. The contractor shall complete the year 1 data collection within 16 weeks of OMB clearance. The contractor shall utilize appropriate techniques to obtain response rates of at least 90 percent (e.g., postcard reminders and telephone follow-up). The contractor shall continue follow-up activities until 16 weeks after OMB clearance.

Subtask 9.2 Analyze and process year 1 data

The contractor shall develop coding materials for entering the data collected and preparing the data for analysis as it is received. The contractor shall place the abstracted data in a computer-accessible format. To ensure accuracy, the contractor shall verify all key data entered, conduct edit and consistency checks, and track response rates. The contractor shall resolve problems identified in this process through phone calls to the respondents. The contractor shall include information on the status of this task in each monthly progress report. The contractor shall analyze the data received from the surveys in accordance with the final data analysis plan. The contractor shall submit to the COTR preliminary tables based on tabulations from the initial data analysis, as well as a brief description of initial findings from these analyses, within 28 weeks after OMB clearance.

Subtask 9.3 Prepare year 1 report

The contractor shall prepare a year 1 report and a non-technical executive summary summarizing the findings of the evaluation. The contractor shall include in the report descriptive and analytic information that addresses the research questions outlined above and as agreed upon by the COTR in any subsequent meetings or correspondence, and that describes the progress that MSAP grantees have made towards the annual benchmarks outlined in their applications. The contractor shall prepare two volumes to comprise this year 1 report: one volume for policy makers and the other for practitioners.
While both volumes shall address key cross-cutting issues and findings that emerge from the evaluation, the policymaker volume shall focus on implications for MSAP administration and technical assistance improvement strategies, and the practitioner volume shall focus on the successful models and strategies ("what works") for replication in other areas. The contractor shall incorporate into the final year 1 report information and findings from other studies conducted or databases maintained for the Department or by independent researchers when they are relevant to the research questions for this evaluation. The contractor shall submit a draft year 1 report and executive summary to the COTR no later than 32 weeks after OMB clearance. After a 1-week review by the COTR, the contractor shall submit a second draft to the COTR no later than 35 weeks after OMB clearance. After a 2-week review by the Department, the contractor shall submit a third draft to the COTR no later than 39 weeks after OMB clearance. After a 3-week review by the Department, the contractor shall submit the final report to the COTR no later than 44 weeks after OMB clearance. The COTR will transmit the year 1 report to Congress after receiving approval from senior Department officials.

TASK 10 YEAR 2 DATA COLLECTION, ANALYSIS AND REPORTING

Subtask 10.1 Collect year 2 data

The contractor shall make any minor changes to the data collection instruments based on experience and information gathered in the year 1 data collection. The contractor shall begin to administer the data collection instruments to respondents within 45 weeks after OMB clearance. The contractor shall implement procedures to ensure that the data collection instruments are administered in a timely and efficient manner. The contractor shall complete the year 2 data collection within 60 weeks after OMB clearance. The contractor shall utilize appropriate techniques to obtain response rates of at least 90 percent (e.g., postcard reminders and telephone follow-up). The contractor shall continue follow-up activities until 60 weeks after OMB clearance.

Subtask 10.2 Analyze and process year 2 data

The contractor shall develop coding materials for entering the data collected and preparing the data for analysis as it is received. The contractor shall place the abstracted data in a computer-accessible format. To ensure accuracy, the contractor shall verify all key data entered, conduct edit and consistency checks, and track response rates. The contractor shall resolve problems identified in this process through phone calls to the respondents. The contractor shall include information on the status of this task in each monthly progress report. The contractor shall analyze the data received from the surveys in accordance with the final data analysis plan. The contractor shall submit to the COTR preliminary tables based on tabulations from the initial data analysis, as well as a brief description of initial findings from these analyses, within 72 weeks after OMB clearance.

Subtask 10.3 Prepare year 2 report

The contractor shall prepare a year 2 report and a non-technical executive summary summarizing the findings of the evaluation. The contractor shall include in the report descriptive and analytic information that addresses the research questions outlined above and as agreed upon by the COTR in any subsequent meetings or correspondence, and that describes the progress that MSAP grantees have made towards the annual benchmarks outlined in their applications.
The contractor shall prepare two volumes to comprise this year 2 report: one volume for policy makers and the other for practitioners. While both volumes shall address key cross-cutting issues and findings that emerge from the evaluation, the policymaker volume shall focus on implications for MSAP administration and technical assistance improvement strategies, and the practitioner volume shall focus on the successful models and strategies ("what works") for replication in other areas. The contractor shall incorporate into the final year 2 report information and findings from other studies or databases conducted or maintained for the Department or by independent researchers where they are relevant to the research questions for this evaluation, as well as trends and patterns observed from year 1 to year 2. The contractor shall submit a draft year 2 report and executive summary to the COTR no later than 76 weeks after OMB clearance. After a 1-week review by the COTR, the contractor shall submit a second draft to the COTR no later than 79 weeks after OMB clearance. After a 2-week review by the Department, the contractor shall submit a third draft to the COTR no later than 83 weeks after OMB clearance. After a 3-week review by the Department, the contractor shall submit the final report to the COTR no later than 88 weeks after OMB clearance. The COTR will transmit the year 2 report to Congress after receiving approval from senior Department officials.

TASK 11 YEAR 3 DATA COLLECTION, ANALYSIS AND REPORTING

Subtask 11.1 Collect year 3 data

The contractor shall make any minor changes to the data collection instruments based on experience and information gained from the year 2 data collection. The contractor shall begin to administer the data collection instruments to respondents within 89 weeks after OMB clearance. The contractor shall implement procedures to ensure that the data collection instruments are administered in a timely and efficient manner. The contractor shall complete the year 3 data collection within 120 weeks after OMB clearance; additional data collection time is needed in year 3 because final performance reports are not due from grantees until 90 days after their grants expire. The contractor shall utilize appropriate techniques to obtain response rates of at least 90 percent (e.g., postcard reminders and telephone follow-up). The contractor shall continue follow-up activities until a response rate of 90 percent has been obtained or until 120 weeks after OMB clearance.

Subtask 11.2 Analyze and process year 3 data

The contractor shall develop coding materials for entering the data collected and preparing the data for analysis as it is received. The contractor shall place the abstracted data in a computer-accessible format. To ensure accuracy, the contractor shall verify all key data entered, conduct edit and consistency checks, and track response rates. The contractor shall resolve problems identified in this process through phone calls to the respondents. The contractor shall include information on the status of this task in each monthly progress report. The contractor shall analyze the data received from the surveys in accordance with the final data analysis plan. The contractor shall submit to the COTR preliminary tables based on tabulations from the initial data analysis, as well as a brief description of initial findings from these analyses, within 132 weeks after OMB clearance.
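Subtasks 9.2, 10.2, and 11.2 all call for the same processing discipline: verify keyed data, run edit and consistency checks, and track response rates against the 90 percent target. The following is a minimal illustrative sketch in Python of such checks; the field names, plausibility ranges, respondent counts, and records are hypothetical and do not represent actual instrument content.

    # Illustrative sketch only: edit/consistency checks and response-rate
    # tracking of the kind called for in subtasks 9.2, 10.2, and 11.2.
    # Field names, ranges, counts, and records are hypothetical.

    surveys_administered = 130  # hypothetical number of instruments fielded

    completed = [
        {"respondent_id": 1, "role": "principal", "years_at_school": 4, "magnet_enrollment": 430},
        {"respondent_id": 2, "role": "teacher", "years_at_school": -1, "magnet_enrollment": 55},
        {"respondent_id": 3, "role": "principal", "years_at_school": 12, "magnet_enrollment": 99999},
    ]

    def edit_checks(record):
        """Return a list of human-readable problems for one completed instrument."""
        problems = []
        if record["role"] not in {"principal", "teacher"}:
            problems.append("role out of range")
        if not 0 <= record["years_at_school"] <= 50:
            problems.append("years_at_school out of range")
        if not 0 <= record["magnet_enrollment"] <= 5000:
            problems.append("magnet_enrollment implausible; confirm by phone")
        return problems

    for rec in completed:
        issues = edit_checks(rec)
        if issues:
            print(f"respondent {rec['respondent_id']}: follow up -> {', '.join(issues)}")

    response_rate = len(completed) / surveys_administered
    print(f"current response rate: {response_rate:.1%} (target: 90%)")

Flagged records feed the telephone follow-up required above, and the running response-rate figure supports the status reporting required in each monthly progress report.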
Subtask 11.3 Prepare year 3 report

The contractor shall prepare a year 3 report and a non-technical executive summary summarizing the findings of the evaluation. The contractor shall include in the report descriptive and analytic information that addresses the research questions outlined above and as agreed upon by the COTR in any subsequent meetings or correspondence, and that describes the progress that MSAP grantees made towards the ultimate goals outlined in their applications. The contractor shall prepare two volumes to comprise this year 3 report: one volume for policy makers and the other for practitioners. While both volumes shall address key cross-cutting issues and findings that emerge from the evaluation, the policymaker volume shall focus on implications for MSAP administration and technical assistance improvement strategies, and the practitioner volume shall focus on the successful models and strategies ("what works") for replication in other areas. The contractor shall incorporate into the final year 3 report information and findings from other studies or databases conducted or maintained for the Department or by independent researchers where they are relevant to the research questions for this evaluation, as well as trends and patterns observed from year 1 to year 3. The contractor shall submit a draft year 3 report and executive summary to the COTR no later than 136 weeks after OMB clearance. After a 1-week review by the COTR, the contractor shall submit a second draft to the COTR no later than 139 weeks after OMB clearance. After a 2-week review by the Department, the contractor shall submit a third draft to the COTR no later than 143 weeks after OMB clearance. After a 3-week review by the Department, the contractor shall submit the final report to the COTR no later than 148 weeks after OMB clearance. The COTR will transmit the report to Congress after receiving approval from senior Department officials.

TASK 12 IMPLEMENT DISSEMINATION PLAN

The contractor shall disseminate the findings of the evaluation as outlined in the dissemination plan. The contractor shall NOT report on any findings that have not yet been transmitted to Congress. The dissemination for year 1 shall be completed within 56 weeks of OMB clearance; the dissemination for year 2 shall be completed within 98 weeks of OMB clearance; and the dissemination for year 3 shall be completed within 158 weeks of OMB clearance.

Subtask 12.2 Provide briefing to IRP

The contractor shall present a briefing of its final year 3 report to Department staff and the Independent Review Panel (IRP) at one of the IRP's biannual meetings in Washington, DC. The contractor shall include in the briefing a brief overview and summary of the evaluation's purpose, methodology, and findings. The briefing shall focus upon addressing the research questions as presented in the contract and agreed upon by the COTR. The COTR will inform the contractor of the time of the meeting and the anticipated number of attendees no later than one week before the scheduled briefing.

Subtask 12.3 Make presentations at professional and practitioner conferences

The contractor shall submit proposals for presentations at professional and/or practitioner conferences, ranging between 1 and 3 presentations per year. To prepare for these presentations, the contractor shall develop briefing materials that are non-technical and appropriate for the general public.
The contractor shall obtain the information on proposal requirements and deadlines from each professional and/or practitioner organization. The contractor shall provide a draft of the proposal and briefing materials to the COTR and receive the COTR's approval at least one week before the submission to each organization/conference planner. Examples of appropriate conferences include but are not limited to:

- Association for Public Policy Analysis and Management
- American Educational Research Association
- Magnet Schools of America

For each presentation the contractor shall submit the material for presentation to the COTR for approval. The contractor shall not present evaluation findings from reports or tabulations that have not been reviewed by the Department and transmitted to Congress. Prior to transmittal of the final report to Congress, the contractor shall present only methodology at any conferences or other public presentations.

TASK 13 ARCHIVING OF DATA

Subtask 13.1 Public Use Data Files

The contractor shall prepare public use data files that are formatted to the National Center for Education Statistics (NCES) Electronic Codebook (ECB). The contractor shall meet with NCES staff and the COTR at the beginning of the contract and prior to developing codebooks in order to determine the most efficient way of recording information so that the Department will not incur additional costs in order to fit the specifications of the ECB. The contractor shall schedule the meeting with NCES no later than 12 weeks after the effective date of the contract.

Subtask 13.2 Transmitting the Data Files to ED

Upon completion of each year of the evaluation and transmission of each report to Congress, the contractor shall provide hard copy and electronic medium copies of the data set, code books, technical reports, and other study materials to an archival site for public dissemination. The contractor shall obtain the COTR's approval for the specific archival site to be used. The contractor shall ensure that the archived materials are in compliance with privacy protection laws. The electronic data sets shall be submitted in ASCII format with documentation of field names and widths. The contractor shall complete this task for year 1 no later than 54 weeks after OMB clearance; for year 2 no later than 96 weeks after OMB clearance; and for year 3 no later than 156 weeks after OMB clearance.

TASK 14 CONTRACTOR MONTHLY REPORTING REQUIREMENTS

The contractor shall establish an internal system with the capacity to:

- Identify problem areas by order of importance;
- Identify anticipated schedule slippage and cost overruns; and
- Provide means of determining where project managers and resources are deployed to assist more critical tasks.

The contractor shall include this information in the monthly progress reports. The progress report shall include both monthly and cumulative contract costs by task for the full evaluation. The contractor shall provide documentation of how this system will work to the COTR within 1 week of the effective date of the contract. The contractor shall submit 2 copies of these reports to the COTR within one week of the end of each reporting month.

TIMELINE AND SCHEDULE OF DELIVERABLES

The due dates associated with the timeline and schedule of deliverables are anchored by the effective date of the award of the contract or by OMB clearance.
Except where noted otherwise, task 1-8 deliverables are due according to the indicated number of weeks after the award of the contract, and task 9-13 deliverables are due according to the indicated number of weeks after OMB clearance. The contractor shall send one copy of the following to the contract specialist: all key deliverables assessed in the Quality Assurance Surveillance Plan, final deliverables, and monthly reports; the remaining copies of all deliverables shall be sent to the COTR. For each task, the deliverables, due dates, and copies are listed below.

CONTRACT AWARD

Task 1: Meetings with ED
  1. Kick-off meeting (due: 1 week; copies: NA)
  2. Minutes (due: 2 weeks; copies: e-mail)

Task 2: Utilize Outside Expertise on Ongoing Basis
  2.1 Establish Technical Work Group
    1. Propose members (due: 2 weeks; copies: e-mail)
    2. Invite members to serve (due: 4 weeks)
    3. Finalize membership (due: 5 weeks; copies: e-mail)
  2.2 Convene Technical Work Group (first meeting)
    1. Draft agenda and briefing book (due: 5 weeks; copies: 3)
    2. Send briefing books to panelists and to ED (due: 7 weeks; copies: 3 to ED)
    3. Meeting (due: 8 weeks)
    4. Draft minutes (due: 9 weeks; copies: e-mail)
    5. Final minutes and schedule for subsequent meetings (due: 11 weeks; copies: e-mail)
  2.2 Convene Technical Work Group (subsequent meetings)
    1. Draft agenda and briefing book (due: 3 weeks before meeting; copies: 3)
    2. Send briefing books to panelists and to ED (due: 1 week before meeting; copies: 3 to ED)
    3. Meeting (due: TBA)
    4. Draft minutes (due: 1 week after meeting; copies: e-mail)
    5. Final minutes and schedule for subsequent meetings (due: 2 weeks after meeting; copies: e-mail)

Task 3: Refine Baseline Management Plan
  1. Revised plan (due: 4 weeks; copies: e-mail)

Task 4: Refine Evaluation Design
  1. Draft revision (due: 10 weeks; copies: 8)
  2. Final (due: 15 weeks; copies: 8)

Task 5: Data Collection Instruments
  5.1 Refine data collection plan
    1. Draft plan (due: 17 weeks; copies: 8)
    2. Final plan (due: 21 weeks; copies: 8)
  5.2 Develop data collection instruments
    1. Draft DCIs (due: 17 weeks; copies: 8)
    2. Final DCIs (due: 21 weeks; copies: 8)
  5.3 Pilot test
    1. Conduct pilot test (due: 23 weeks; copies: NA)
    2. Revise instruments (due: 25 weeks; copies: NA)

Task 6: OMB Clearance Package
  1. Draft clearance package (due: 25 weeks; copies: 5)
  2. Final clearance package (due: 30 weeks; copies: 5)
  3. Receive OMB clearance (due: 52 weeks)

Task 7: Refine Analysis Plan
  1. Draft plan (due: 24 weeks; copies: e-mail)
  2. Final plan (due: 30 weeks; copies: e-mail)

Task 8: Notification of Study Participants
  8.1 Select sample and prepare notification materials
    1. Draft letters and brochure (due: 22 weeks; copies: e-mail)
    2. Final letters and brochure (due: 26 weeks; copies: 20 to ED)
  8.2 Notify sample participants
    1. Mail notification materials (due: 1 week after OMB clearance; copies: N/A)

OMB CLEARANCE (52 weeks after effective date of contract; subsequent deliverable due dates keyed to weeks after clearance)

Task 9: Year 1 data collection, analysis and reporting
  9.1 Collect year 1 data
    1. Collect data (due: 16 weeks; copies: N/A)
  9.2 Analyze and process year 1 data
    1. Preliminary year 1 findings and data tabulations (due: 28 weeks; copies: 1)
  9.3 Prepare year 1 report
    1. First draft year 1 report (due: 32 weeks; copies: 2)
    2. Second draft year 1 report (due: 35 weeks; copies: 10)
    3. Third draft year 1 report (due: 39 weeks; copies: 10)
    4. Final year 1 report (due: 44 weeks; copies: 20)

Task 10: Year 2 data collection, analysis and reporting
  10.1 Collect year 2 data
    1. Collect data (due: 60 weeks; copies: N/A)
  10.2 Analyze and process year 2 data
    1. Preliminary year 2 findings and data tabulations (due: 72 weeks; copies: 1)
  10.3 Prepare year 2 report
    1. First draft year 2 report (due: 76 weeks; copies: 2)
    2. Second draft year 2 report (due: 79 weeks; copies: 10)
    3. Third draft year 2 report (due: 83 weeks; copies: 10)
    4. Final year 2 report (due: 88 weeks; copies: 20)

Task 11: Year 3 data collection, analysis and reporting
  11.1 Collect year 3 data
    1. Collect data (due: 120 weeks; copies: N/A)
  11.2 Analyze and process year 3 data
    1. Preliminary year 3 findings and data tabulations (due: 132 weeks; copies: 1)
  11.3 Prepare year 3 report
    1. First draft year 3 report (due: 136 weeks; copies: 2)
    2. Second draft year 3 report (due: 139 weeks; copies: 10)
    3. Third draft year 3 report (due: 143 weeks; copies: 10)
    4. Final year 3 report (due: 148 weeks; copies: 20)

Task 12: Feedback to Policy Audiences and Participants
  12.1 Implement dissemination plan
    1. Year 1 dissemination complete (due: 56 weeks; copies: NA)
    2. Year 2 dissemination complete (due: 98 weeks; copies: NA)
    3. Year 3 dissemination complete (due: 158 weeks; copies: NA)
  12.2 Provide briefing to IRP
    1. Briefing to IRP (due: as scheduled; copies: NA)
  12.3 Presentations to professional and practitioner conferences
    1. Presentations (due: as scheduled, after transmittal to Congress; copies: NA)

Task 13: Archiving of Data
  13.1 Public Use Data Tapes
    1. Schedule meeting with NCES (due: 12 weeks from award)
  13.2 Transmitting Data Tapes to ED
    1. Year 1 data (due: 54 weeks)
    2. Year 2 data (due: 96 weeks)
    3. Year 3 data (due: 156 weeks)

Task 14: Monthly Reporting Requirements
  1. Reporting system (due: 1 week after award; copies: 1)
  2. Monthly report (due: 1 week after end of each reporting month; copies: 2)

APPENDIX 1: QUALITY ASSURANCE SURVEILLANCE PLAN

Introduction

This Performance-Based Quality Assurance Surveillance Plan (QASP) has been developed pursuant to the requirements of the Performance-Based Statement of Work in Contract No. 98-0040. This plan sets forth procedures and guidelines that the Department of Education will use in evaluating the technical performance of the Contractor. A copy of this plan will be furnished to the Contractor so that the Contractor will be aware of the methods that the Government will employ in evaluating performance on this contract and so that the Government may address any concerns that the Contractor may have prior to initiating work.

Purpose of the QASP

The QASP is intended to accomplish the following:

- Define the roles and responsibilities of participating Government officials and outside experts;
- Define the key deliverables which will be assessed;
- Describe the rating elements and standards of performance against which the Contractor's performance will be assessed for each key deliverable;
- Describe the process of quality assurance assessment; and
- Provide copies of the quality assurance monitoring forms that will be used by the Government in documenting and evaluating the Contractor's performance.

Each of these purposes is discussed in detail below.

Roles and Responsibilities of Participating Government Officials and Experts

The following Government officials and/or experts will participate in assessing the quality of the Contractor's performance. Their roles and responsibilities are described as follows:

- Contracting Officer's Technical Representatives (COTRs). COTRs will be responsible for monitoring, assessing, recording, and reporting on the technical performance of the Contractor on a day-to-day basis. They will also be responsible for assembling a three-member Quality Assurance Review Panel (QARP) to complete the Quality Assurance Monitoring Forms (described in greater detail below and provided in Exhibits A, B, C, and D), which will be used to document the inspection and evaluation of the Contractor's work performance on four key deliverables in Year 1, and two key deliverables in Years 2 and 3 (note: "Year" refers to the year of the MSAP grant for which data are being collected).

- Three additional ED staff and/or outside experts with knowledge and experience in the areas of evaluation design/methodology, magnet schools, desegregation, data analysis, and/or contract management may serve as QARP members.
For each key deliverable assessment, one of these individuals may be selected, based on time availability and appropriateness to the task, to serve with the two COTRs (resulting in a three-person team for each deliverable assessment) in assessing the quality of that deliverable. However, the Department reserves the right to have only one COTR assess each deliverable.

It is extremely important for the COTRs to establish and maintain a team-oriented line of communication with the Contractor's Project Manager (PM) and the PM's office staff in order to perform their monitoring functions. The COTRs, Contracting Officer (CO), and PM must work together as a team to ensure that required work is accomplished in an efficient and proper manner. Meetings should be held on a regular basis to resolve serious problems; less serious problems should be discussed and resolved on an impromptu basis.

- The Contracting Specialist (CS) will have overall responsibility for overseeing the Contractor's performance. The CS will also be responsible for the day-to-day monitoring of the Contractor's performance in the areas of contract compliance, contract administration, cost control, and property control; reviewing the COTRs' assessment of the Contractor's performance; and resolving all differences between the COTRs' version and the Contractor's version. The CS may call upon the expertise of other Government individuals as required.

- The Contracting Officer (CO) has the following procurement authorities:
  - SOLE authority for any decisions which produce an increase or decrease in the scope of the contract;
  - SOLE authority for any actions subject to the "Changes" clause;
  - SOLE authority for any decision to be rendered under the "Disputes" clause;
  - SOLE authority for negotiation and determination of indirect rates to be applied to the contract;
  - SOLE authority to approve the substitution or replacement of the Project Manager and other key personnel;
  - SOLE authority to approve the Contractor's invoices for payment, subject to the Limitation of Costs clause and the Limitation of Funds clause;
  - SOLE authority to monitor and enforce Department of Labor promulgated labor requirements;
  - Authority to arrange for and supervise Quality Assurance activities under this contract;
  - SOLE authority to approve the Contractor's Quality Control Program;
  - Authority to approve all Contractor purchases of equipment, supplies, and materials exceeding $2,500 (such approvals are encouraged even though not required by FAR 13.106); and
  - Signatory authority for the issuance of all modifications to the contract.

Key Deliverables to be Assessed

Even though the Government, through its COTRs, will be monitoring the Contractor's performance on a continuing basis, the volume of tasks performed by the Contractor makes technical inspections of every task and step impractical. Accordingly, the Department of Education will use a quality-assurance review process to monitor the Contractor's performance under this contract. Specifically, the QARP or the COTR will assess the Contractor's performance across a set of tailored rating elements for each of four key deliverables:
- Evaluation design;
- OMB clearance process;
- Yearly reporting; and
- Database.

Rating Elements and Standards of Performance for Key Deliverables

The Contractor's performance shall be evaluated in Year 1 by assessing the draft evaluation design and the OMB clearance process, and in Years 2 and 3 by assessing the yearly report(s) and database(s) only.
Tailored rating elements for each key deliverable have been developed and incorporated into the Quality Assurance Rating Forms (see Exhibits A, B, C, and D). The rating elements and acceptable standards of performance for each key deliverable are described below:

Evaluation Design:
(1) Quality of data collection plan, where acceptable performance would include sound, creditable, comprehensive approaches to collecting data and incorporating secondary data sources (when applicable) in the first draft;
(2) Quality of data analysis plan, where acceptable performance would include sound, creditable, comprehensive approaches to analyzing quantitative and qualitative data, adequately addressing key research questions in the first draft, and providing a clear conceptual model for analysis in the first draft;
(3) Quality of dissemination plan, where acceptable performance would include sound, creditable, comprehensive approaches to disseminating reports and otherwise sharing key evaluation findings with a wide variety of stakeholders in the first draft;
(4) Comprehensiveness, clarity, and organization of design, where acceptable performance would include complete, clear, efficient approaches to addressing the research questions, a clear writing style, proper grammar/spelling, and a clearly organized document format in the first draft;
(5) Responsiveness to ED and other reviewers' comments and suggestions, where acceptable performance would include thoughtful consideration of reviewers' comments and suggestions for revisions throughout the drafting process, including written responses to each reviewer, on request, regarding suggestions not adopted, and timely revisions; and
(6) Timeliness, where acceptable performance would include a deliverable that is received on time or within a reasonable amount of time (i.e., within one day of the date that it is due).

OMB Clearance Process:
(1) Comprehensiveness, clarity, and organization of draft package, where acceptable performance would include a complete OMB clearance package, a clear writing style, proper grammar/spelling, a well-organized document format, and accurate and complete descriptions of data collection and data analysis plans;
(2) Quality of draft data collection instruments, where acceptable performance would include complete, clear, straightforward data collection instruments and sufficient instrumentation to support data collection and analysis plans without being unnecessarily burdensome to respondents; and
(3) Quality of support during clearance process, where acceptable performance would include timely, relevant, complete, and continuous support during the clearance process, including responding to OMB and/or public questions within three days of each request.

Yearly Reporting:
(1) Accuracy and relevance of information provided, where acceptable performance would include complete, clear, logical, appropriate, accurate reporting on data analysis results for key research questions and appropriate context for interpreting results in the first draft;
(2) Usefulness for target audiences, where acceptable performance would include clear, tailored language and results for targeted audiences in the first draft;
(3) Comprehensiveness, clarity, and organization of report, where acceptable performance would include a comprehensive description of key results, a clear writing style, proper grammar/spelling, and a well-organized document format in the first draft;
(4) Responsiveness to ED and other reviewers' comments and suggestions, where acceptable performance would include thoughtful consideration of reviewers' comments and suggestions for revisions throughout the drafting process, including written responses to each reviewer, on request, regarding suggestions not adopted, and timely revisions; and
(5) Timeliness, where acceptable performance would include a deliverable that is received on time or within a reasonable amount of time (i.e., within one day of the date that it is due).

Database:
(1) Cleanliness of data, where acceptable performance would include a database in which an average of 95-97% of records pass all edit, consistency, and outlier checks;
(2) User friendliness of documentation and database structure, where acceptable performance would include complete, clear, accurate documentation of variable names and labels, value labels, codes for missing values, descriptions of procedures used to compute analysis variables, documentation of all edit and consistency checks used to clean the data, and a list of any outliers recoded as part of the cleaning process; and
(3) Timeliness, where acceptable performance would include a deliverable that is received on time or within a reasonable amount of time (i.e., within one day of the date that it is due).

Process of Quality Assurance Assessment

While quality assurance is closely tied to these performance standards for deliverable content, cost is also an important consideration in the assessment of contractor performance. The contractor's cost performance will be evaluated by the Department at the end of the contract. See Section B.2 of the contract for further information.

In the event of an excusable delay (as defined in FAR 52.249-14), the Department and the contractor shall work together to modify the contract with regard to the due dates of the deliverables. If such an event requires a modification to the due dates of the deliverables, the contractor's performance, where applicable in this QASP, shall be measured against the dates agreed upon in the modification.

The QARP or the COTR will use the appropriate key deliverable evaluation forms (Exhibit A: Evaluation Design; Exhibit B: OMB Clearance Process; Exhibit C: Yearly Reporting; and Exhibit D: Database) to document and evaluate the Contractor's performance for each of the key deliverables under this contract. Each form may be completed independently by each of the QARP members selected for a deliverable assessment, or the deliverable may be evaluated solely by the COTR. If a QARP panel is used, each member's rating element scores will be averaged to arrive at that member's "overall" evaluation score; the members' overall scores will then be averaged to generate the final evaluation score for that key deliverable. This final evaluation score will document the QARP's overall evaluation of Contractor performance for that key deliverable. If a QARP panel is not used and only the COTR evaluates the deliverable, the COTR's evaluation of the quality of that deliverable will serve as the overall evaluation score.

Each key deliverable will be evaluated in accordance with the following definitions of contractor performance:
- Unacceptable: a level of performance which is not acceptable and which fails to meet the minimum standards of performance, resulting in the contractor receiving a reduction in targeted fee for that deliverable;
- Acceptable: a level of performance which meets the minimum standards of performance, resulting in the contractor receiving its targeted fee for that deliverable; or
- Superior: a level of performance which exceeds the minimum standards of performance, resulting in a bonus over targeted fee for that deliverable.

Incentive fees for the key deliverables will be assessed as follows:
- Evaluation Design: Superior, target fee plus $5,000; Unacceptable, target fee minus $8,000.
- OMB Clearance: Superior, target fee plus $1,000; Unacceptable, target fee minus $2,000.
- Yearly Reporting (per reporting year): Superior, target fee plus $5,000; Unacceptable, target fee minus $9,000.
- Database (for each year that a database is delivered): Superior, target fee plus $2,000; Unacceptable, target fee minus $3,000.

Total dollar amount of fee increase possible due to superior performance (deliverable quality): $27,000. Total dollar amount of fee decrease possible due to unacceptable performance (deliverable quality): $46,000.

Each review panel member, and/or the COTR, must substantiate, in narrative form, all individual scores which they judge to be indicative of "superior" or "unacceptable" performance. Performance at the "acceptable" level is expected from the Contractor. The COTRs will forward copies of all completed QA monitoring forms (without reviewers' names) and a report of average scores to the CO and the Contractor according to the following schedule:
- Evaluation design assessment: submitted by the close of business 20 working days from the date the final design was received by the COTRs.
- OMB clearance process assessment: submitted by the close of business 20 working days from the date OMB clearance is granted.
- Yearly reporting assessment: submitted by the close of business 20 working days from the date final reports are received by the COTRs.
- Database assessment: submitted by the close of business 20 working days from the date the database was received by the COTRs.

For purposes of documentation, the Contractor may respond in writing to any "unacceptable" final average evaluation score within 5 working days after receipt of the form(s); however, this does not mean that the QARP members will change their scores, nor does it mean that the average final score will be changed. The CO will review each key deliverable evaluation form prepared by the QARP and/or the COTR. When appropriate, the CO may investigate the event further to determine whether all the facts and circumstances surrounding the event were considered in the QARP opinions documented on the forms. The CO will immediately discuss every deliverable receiving an "unacceptable" rating with the Contractor to ensure that corrective action is promptly initiated. Discussion with the contractor of unacceptable performance or deliverables does not negate the Department's right to terminate the contractor for default for poor performance per FAR 52.249-6, Termination (Cost-Reimbursement).
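As an editorial illustration only, the following sketch shows how the averaging and incentive-fee rules described above might be applied. It assumes (since the QASP does not state it explicitly) that the averaged final score is interpreted using the same bands printed on the rating forms (1-4 unacceptable, 5-7 acceptable, 8-10 superior); the member scores shown are hypothetical.

    # Illustrative sketch only (not part of the contract): computing a final
    # evaluation score and fee adjustment from QARP member ratings.
    # Assumption: the averaged final score is banded the same way as individual
    # form scores (below 5 unacceptable, 5 to below 8 acceptable, 8+ superior).

    FEE_ADJUSTMENTS = {  # from the incentive-fee schedule above
        "Evaluation Design": {"Superior": 5_000, "Unacceptable": -8_000},
        "OMB Clearance":     {"Superior": 1_000, "Unacceptable": -2_000},
        "Yearly Reporting":  {"Superior": 5_000, "Unacceptable": -9_000},
        "Database":          {"Superior": 2_000, "Unacceptable": -3_000},
    }

    def final_score(member_scores):
        """member_scores: list of per-member lists of rating-element scores."""
        member_overalls = [sum(s) / len(s) for s in member_scores]
        return sum(member_overalls) / len(member_overalls)

    def rating(score):
        if score < 5:
            return "Unacceptable"
        if score < 8:
            return "Acceptable"
        return "Superior"

    # Example: three QARP members score the six Evaluation Design rating elements.
    scores = [[7, 8, 6, 7, 9, 7], [8, 8, 7, 7, 8, 7], [6, 7, 7, 8, 7, 5]]
    score = final_score(scores)
    level = rating(score)
    adjustment = FEE_ADJUSTMENTS["Evaluation Design"].get(level, 0)
    print(f"Final score {score:.2f} -> {level}; fee adjustment: ${adjustment:,}")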
QUALITY ASSURANCE SURVEILLANCE PLAN
EXHIBIT A: EVALUATION DESIGN EVALUATION FORM

Rating Element 1: Quality of Data Collection Plan
Circle the appropriate number for your rating: 1 2 3 4 5 6 7 8 9 10
where:
Unacceptable performance (1-4) would include missing, illogical, unclear, or inappropriate approaches to collecting data, lacking a strategy for the incorporation of secondary data sources (when applicable), or unnecessarily burdening respondents in the first draft.
Acceptable performance (5-7) would include sound, creditable, comprehensive approaches to collecting data and incorporating secondary data sources (when applicable) in the first draft.
Superior performance (8-10) would meet "acceptable performance" standards for this rating element, AND include innovative, exceptionally skillful approaches and/or methods for collecting data, incorporating secondary data sources (when applicable), and reducing respondent burden in the first draft.
Supporting comments (required for unacceptable or superior performance ratings):

Rating Element 2: Quality of Data Analysis Plan
Circle the appropriate number for your rating: 1 2 3 4 5 6 7 8 9 10
where:
Unacceptable performance (1-4) would include missing, illogical, unclear, or inappropriate approaches to analyzing quantitative and qualitative data, or lacking a theory or conceptual model for analysis in the first draft.
Acceptable performance (5-7) would include sound, creditable, comprehensive approaches to analyzing quantitative and qualitative data, adequately addressing key research questions, and providing a clear conceptual model for analysis in the first draft.
Superior performance (8-10) would meet "acceptable performance" standards for this rating element, AND include innovative, exceptionally skillful approaches and/or methods for analyzing quantitative and qualitative data in the first draft.
Supporting comments (required for unacceptable or superior performance ratings):

Rating Element 3: Quality of Dissemination Plan
Circle the appropriate number for your rating: 1 2 3 4 5 6 7 8 9 10
where:
Unacceptable performance (1-4) would include missing, illogical, unclear, or inappropriate approaches to disseminating reports and otherwise sharing key evaluation findings with a wide variety of stakeholders in the first draft.
Acceptable performance (5-7) would include sound, creditable, comprehensive approaches to disseminating reports and otherwise sharing key evaluation findings with a wide variety of stakeholders in the first draft.
Superior performance (8-10) would meet "acceptable performance" standards for this rating element, AND include innovative, exceptionally skillful, multi-media approaches to disseminating reports and otherwise sharing key evaluation findings with a wide variety of stakeholders in the first draft.
Supporting comments (required for unacceptable or superior performance ratings):

Rating Element 4: Comprehensiveness, Clarity and Organization of Design
Circle the appropriate number for your rating: 1 2 3 4 5 6 7 8 9 10
where:
Unacceptable performance (1-4) would include missing, unclear, or inefficient approaches to addressing the research questions, an unclear writing style, poor grammar/spelling, or a disorganized document format in the first draft.
Acceptable performance (5-7) would include complete, clear, efficient approaches to addressing the research questions, a clear writing style, proper grammar/spelling, and a clearly organized document format in the first draft.
Superior performance (8-10) would meet "acceptable performance" standards for this rating element, AND include innovative, exceptionally skillful approaches to addressing the research questions in the first draft.
Supporting comments (required for unacceptable or superior performance ratings):
Rating Element 5: Responsiveness to Reviewers' Comments and Suggestions
Circle the appropriate number for your rating: 1 2 3 4 5 6 7 8 9 10
where:
Unacceptable performance (1-4) would include unsubstantiated disregard for reviewers' comments and suggestions for revisions throughout the drafting process, refusal to provide written responses to reviewers who request them, and late or untimely revisions (i.e., received after the date that it is due, per the schedule in the contract).
Acceptable performance (5-7) would include thoughtful consideration of reviewers' comments and suggestions for revisions throughout the drafting process, including written responses to reviewers who request them regarding unheeded suggestions, and timely revisions (i.e., within one day of the date that it is due).
Superior performance (8-10) would meet "acceptable performance" standards for this rating element, AND include written responses to all reviewers for all drafts upon submission of all revised drafts, and would include revisions submitted prior to the date that they are due.
Supporting comments (required for unacceptable or superior performance ratings):

Rating Element 6: Timeliness
Circle the appropriate number for your rating: 0 1 2 3 4 5 6 7 8 9 10
where:
Unacceptable performance (0-4) would include late delivery (i.e., received after the date that it is due, per the schedule in the contract) of the first draft of the evaluation design. Four points should be given if the deliverable is received 2 working days late; three points should be given if the deliverable is received 3 working days late; two points should be given if the deliverable is received 4 working days late; one point should be given if the deliverable is received 5 working days late; no points should be given if the deliverable is received 6 or more working days late.
Acceptable performance (5-7) would include timely delivery of the first draft of the evaluation design (i.e., within one day of the date that it is due, per the schedule in the contract). Seven points should be given if the deliverable is received on the date due; five points should be given if the deliverable is received one working day late.
Superior performance (8-10) would include early delivery of the first draft of the evaluation design (i.e., before the date that it is due, per the schedule in the contract). Eight points should be given for any deliverable that is received 1 working day early; nine points should be given for any deliverable that is received 2-3 working days early; 10 points should be given for any deliverable that is received 4 or more days early.
Supporting comments (required for unacceptable or superior performance ratings):
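The timeliness point scale in Rating Element 6 (and repeated in Exhibits C and D) can be expressed compactly as a function of working days late. The sketch below is illustrative only; it assumes lateness is measured in whole working days, with negative values meaning early delivery, and it preserves the gaps in the stated scale (for example, no outcome yields 6 points).

    # Illustrative sketch (not part of the form): the timeliness point scale from
    # Rating Element 6, expressed as a function of working days late.
    # Negative values mean the deliverable arrived early.

    def timeliness_points(working_days_late: int) -> int:
        if working_days_late <= -4:
            return 10                        # 4 or more working days early
        if working_days_late in (-3, -2):
            return 9                         # 2-3 working days early
        if working_days_late == -1:
            return 8                         # 1 working day early
        if working_days_late == 0:
            return 7                         # received on the date due
        if working_days_late == 1:
            return 5                         # one working day late
        if working_days_late <= 5:
            return 6 - working_days_late     # 2-5 days late -> 4, 3, 2, 1 points
        return 0                             # 6 or more working days late

    assert timeliness_points(0) == 7 and timeliness_points(3) == 3
    assert timeliness_points(-5) == 10 and timeliness_points(8) == 0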
QUALITY ASSURANCE SURVEILLANCE PLAN
EXHIBIT B: OMB CLEARANCE PROCESS EVALUATION FORM

Rating Element 1: Comprehensiveness, Clarity and Organization of Draft Package
Circle the appropriate number for your rating: 1 2 3 4 5 6 7 8 9 10
where:
Unacceptable performance (1-4) would include missing OMB clearance package requirements, an unclear writing style, poor grammar/spelling, a disorganized document format, or inaccurate or incomplete descriptions of data collection and data analysis plans, and would require major corrections/edits before submission to OMB.
Acceptable performance (5-7) would include a complete OMB clearance package, a clear writing style, proper grammar/spelling, a well-organized document format, and accurate and complete descriptions of data collection and data analysis plans, and could be submitted for clearance with only minor corrections/edits.
Superior performance (8-10) would meet "acceptable performance" standards for this rating element, AND include submission of the final package to OMB at least one week ahead of schedule.
Supporting comments (required for unacceptable or superior performance ratings):

Rating Element 2: Quality of Draft Data Collection Instruments
Circle the appropriate number for your rating: 1 2 3 4 5 6 7 8 9 10
where:
Unacceptable performance (1-4) would include missing, unclear, or confusing data collection instruments, or insufficient instrumentation to support data collection and analysis plans in the first draft.
Acceptable performance (5-7) would include complete, clear, straightforward data collection instruments and sufficient instrumentation to support data collection and analysis plans without being unnecessarily burdensome to respondents in the first draft.
Superior performance (8-10) would meet "acceptable performance" standards for this rating element, AND include submission of the package to OMB at least one week ahead of schedule.
Supporting comments (required for unacceptable or superior performance ratings):

Rating Element 3: Quality of Support during Clearance Process
Circle the appropriate number for your rating: 1 2 3 4 5 6 7 8 9 10
where:
Unacceptable performance (1-4) would include irrelevant, incomplete, sporadic, or nonexistent support during the clearance process, including not responding to public and/or OMB questions or responding more than 3 days after each request.
Acceptable performance (5-7) would include relevant, complete, and continuous support during the clearance process, including responding to questions from OMB and/or the public within 3 days of each request.
Superior performance (8-10) would meet "acceptable performance" standards for this rating element, AND provide innovative or exceptionally skillful solutions to problems raised during the clearance process, turning around each request from OMB and/or the public within 2 days.
Supporting comments (required for unacceptable or superior performance ratings):

QUALITY ASSURANCE SURVEILLANCE PLAN
EXHIBIT C: YEARLY REPORTING EVALUATION FORM

Rating Element 1: Accuracy and Relevance of Information Provided
Circle the appropriate number for your rating: 1 2 3 4 5 6 7 8 9 10
where:
Unacceptable performance (1-4) would include incomplete, illogical, unclear, inappropriate, or inaccurate reporting on data analysis results for key research questions, or lacking appropriate context for interpreting results in the first draft.
Acceptable performance (5-7) would include complete, logical, clear, appropriate, accurate reporting on data analysis results for key research questions and appropriate context for interpreting results in the first draft.
Superior performance (8-10) would meet "acceptable performance" standards for this rating element, AND include innovative, exceptionally skillful reporting on data analysis results for research questions in the first draft.
Supporting comments (required for unacceptable or superior performance ratings):
Rating Element 2: Usefulness for Target Audiences
Circle the appropriate number for your rating: 1 2 3 4 5 6 7 8 9 10
where:
Unacceptable performance (1-4) would include unclear or inappropriate language and results for targeted audiences in the first draft.
Acceptable performance (5-7) would include clear, tailored language and results for targeted audiences in the first draft.
Superior performance (8-10) would meet "acceptable performance" standards for this rating element, AND include innovative, exceptionally skillful approaches and/or methods for providing tailored information to individual audiences in the first draft.
Supporting comments (required for unacceptable or superior performance ratings):

Rating Element 3: Comprehensiveness, Clarity and Organization of Report
Circle the appropriate number for your rating: 1 2 3 4 5 6 7 8 9 10
where:
Unacceptable performance (1-4) would include missing elements, an unclear writing style, poor grammar/spelling, or a disorganized document format in the first draft.
Acceptable performance (5-7) would include a comprehensive description of key results, a clear writing style, proper grammar/spelling, and a well-organized document format in the first draft.
Superior performance (8-10) would meet "acceptable performance" standards for this rating element, AND include relevant analyses of and reports on extant data sources in the first draft.
Supporting comments (required for unacceptable or superior performance ratings):

Rating Element 4: Responsiveness to Reviewers' Comments and Suggestions
Circle the appropriate number for your rating: 1 2 3 4 5 6 7 8 9 10
where:
Unacceptable performance (1-4) would include unsubstantiated disregard for reviewers' comments and suggestions for revisions throughout the drafting process, refusal to provide written responses to reviewers who request them, and late or untimely revisions (i.e., received after the date that it is due, per the schedule in the contract).
Acceptable performance (5-7) would include thoughtful consideration of reviewers' comments and suggestions for revisions throughout the drafting process, including written responses to reviewers who request them regarding unheeded suggestions, and timely revisions (i.e., within one day of the date that it is due).
Superior performance (8-10) would meet "acceptable performance" standards for this rating element, AND include written responses to all reviewers for all drafts upon submission of all revised drafts, and would include revisions submitted prior to the date that they are due.
Supporting comments (required for unacceptable or superior performance ratings):

Rating Element 5: Timeliness
Circle the appropriate number for your rating: 0 1 2 3 4 5 6 7 8 9 10
where:
Unacceptable performance (0-4) would include late delivery (i.e., received after the date that it is due, per the schedule in the contract) of the first draft of the yearly report. Four points should be given if the deliverable is received 2 working days late; three points should be given if the deliverable is received 3 working days late; two points should be given if the deliverable is received 4 working days late; one point should be given if the deliverable is received 5 working days late; no points should be given if the deliverable is received 6 or more working days late.
Acceptable performance (5-7) would include timely delivery (i.e., within one day of the date that it is due, per the schedule in the contract) of the first draft of the yearly report. Seven points should be given if the deliverable is received on the date due; five points should be given if the deliverable is received one working day late.
Superior performance (8-10) would include early delivery (i.e., before the date that it is due, per the schedule in the contract) of the first draft of the yearly report. Eight points should be given for any deliverable that is received 1 working day early; nine points should be given for any deliverable that is received 2-3 working days early; 10 points should be given for any deliverable that is received 4 or more days early.
Supporting comments (required for unacceptable or superior performance ratings):

QUALITY ASSURANCE SURVEILLANCE PLAN
EXHIBIT D: DATABASE EVALUATION FORM

Rating Element 1: Cleanliness of Data
Circle the appropriate number for your rating: 1 2 3 4 5 6 7 8 9 10
where:
Unacceptable performance (1-4) would include a database in which less than an average of 95% of records pass edit and consistency checks.
Acceptable performance (5-7) would include a database in which an average of 95-97% of records pass edit and consistency checks.
Superior performance (8-10) would include a database in which 98-100% of records pass all edit and consistency checks.
Supporting comments (required for unacceptable or superior performance ratings):

Rating Element 2: User Friendliness of Documentation and Database Structure
Circle the appropriate number for your rating: 1 2 3 4 5 6 7 8 9 10
where:
Unacceptable performance (1-4) would include incomplete, unclear, or inaccurate documentation of variable names and labels, codes for missing values, descriptions of procedures used to compute analysis variables, documentation of all edit and consistency checks used to clean the data, and a list of any outliers recoded as part of the cleaning process.
Acceptable performance (5-7) would include complete, clear, accurate documentation of variable names and labels, value labels, codes for missing values, descriptions of procedures used to compute analysis variables, documentation of all edit and consistency checks used to clean the data, and a list of any outliers recoded as part of the cleaning process.
Superior performance (8-10) would meet "acceptable performance" standards for this rating element, AND include innovative, exceptionally skillful programs or other database user guide features.
Supporting comments (required for unacceptable or superior performance ratings):

Rating Element 3: Timeliness
Circle the appropriate number for your rating: 0 1 2 3 4 5 6 7 8 9 10
where:
Unacceptable performance (0-4) would include any deliverable that is late (i.e., received after the date that it is due, per the schedule in the contract). Four points should be given if the deliverable is received 2 working days late; three points should be given if the deliverable is received 3 working days late; two points should be given if the deliverable is received 4 working days late; one point should be given if the deliverable is received 5 working days late; no points should be given if the deliverable is received 6 or more working days late.
Acceptable performance (5-7) would include any deliverable that is received on time or within a reasonable amount of time (i.e., within one day of the date that it is due, per the schedule in the contract).
Seven points should be given if the deliverable is received on the date due; five points should be given if the deliverable is received one working day late.
Superior performance (8-10) would include any deliverable that is received early (i.e., before the date that it is due, per the schedule in the contract). Eight points should be given for any deliverable that is received 1 working day early; nine points should be given for any deliverable that is received 2-3 working days early; 10 points should be given for any deliverable that is received 4 or more days early.
Supporting comments (required for unacceptable or superior performance ratings):
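As a final editorial illustration, the sketch below applies the cleanliness thresholds from Exhibit D, Rating Element 1, to a pass rate computed over hypothetical records and checks. The specific edit and consistency checks shown are placeholders (the actual checks are defined in the contractor's data cleaning documentation), and the handling of pass rates falling between 97% and 98%, which the form does not address, is an assumption.

    # Illustrative sketch (not part of the form): banding a delivered database by
    # the share of records that pass all edit and consistency checks.
    # Assumption: rates between 97% and 98% fall in the acceptable band.

    def cleanliness_band(pass_rate: float) -> str:
        """pass_rate is the fraction of records passing all checks (0.0-1.0)."""
        if pass_rate >= 0.98:
            return "Superior (8-10)"
        if pass_rate >= 0.95:
            return "Acceptable (5-7)"
        return "Unacceptable (1-4)"

    # Hypothetical records and placeholder checks, for demonstration only.
    records = [
        {"school_id": "A1", "minority_pct": 42.0, "enrollment": 655},
        {"school_id": "A2", "minority_pct": 180.0, "enrollment": 410},  # fails range check
        {"school_id": "A3", "minority_pct": 55.5, "enrollment": 1200},
    ]
    checks = [
        lambda r: r["school_id"] != "",                 # identifier present
        lambda r: 0.0 <= r["minority_pct"] <= 100.0,    # percentage within valid range
        lambda r: r["enrollment"] > 0,                  # positive enrollment count
    ]
    passed = sum(all(check(r) for check in checks) for r in records)
    rate = passed / len(records)
    print(f"{rate:.0%} of records pass -> {cleanliness_band(rate)}")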