RN 990050

ATTACHMENT A

STATEMENT OF WORK

National Household Education Survey 2001 and 2003

Table of Contents

I. INTRODUCTION AND BACKGROUND
   A. PURPOSE AND NATURE OF THE PROCUREMENT
   B. ENABLING LEGISLATION
   C. GENERAL FEATURES OF THE NHES
      C.1 Multiple Components in Each Collection
      C.2 Sampling through Random Digit Dialing
      C.3 Sampling to Produce Greater Minority Representation
      C.4 Sampling with Intent to Assess Change Over Time
      C.5 Use of Computer-Assisted Telephone Interviewing (CATI) Techniques
      C.6 CATI Interviews in Spanish
      C.7 Fast Turnaround
      C.8 Commitment to Data Quality
         C.8a. Cognitive Laboratory Work
         C.8b. Field Tests
         C.8c. Maximizing Response Rates
         C.8d. Reinterviews
      C.9 Technical Review Panels
      C.10 Adherence to NCES Standards
   D. GENERAL PLANS FOR THE NHES:2001
   E. GENERAL PLANS FOR THE NHES:2003
II. SCOPE OF WORK
   A. GENERAL TASKS TO BE PERFORMED
      Task 1. General Project Planning and Management
         Task 1.1 Contractor Review of NHES Background Materials
         Task 1.2 Other Meetings with NCES
         Task 1.3 Briefing Materials and Briefings
         Task 1.4 Project Leaflet
         Task 1.5 Project Bibliography
         Task 1.6 Development of Data Security Plan
         Task 1.7 Initial Meeting with NCES
         Task 1.8 Project Management Plan
      Task 2 Technical Review Panel
         Task 2.1 Develop TRP Plan
         Task 2.2 Solicit Participation in TRP
         Task 2.3 Plan and Arrange TRP Meetings
         Task 2.4 Provide Meeting Summaries
         Task 2.5 Administer Other TRP Activities
      Task 3 Develop Survey Content
         Task 3.1 Hold Meetings to Identify Key Research and Policy Issues
         Task 3.2 Literature Review
         Task 3.3 Prioritize Research and Policy Questions to be Addressed
         Task 3.4 Review Designs and Instruments of Other Surveys
         Task 3.5 Prepare Content Outline
      Task 4 Develop Survey Instruments and Procedures
         Task 4.1 Develop Survey Instruments
         Task 4.2 Develop Data Collection Procedures
         Task 4.3 Conduct Cognitive Laboratory Research
      Task 5 Sample Design
      Task 6 Survey Design Report
      Task 7 IMT/OMB Clearance of the NHES Instruments
      Task 8 Develop CATI System
         Task 8.1 Specification of Range and Logic Checks
         Task 8.2 Programming of Survey Instruments
         Task 8.3 Testing of CATI System
         Task 8.4 Delivery of Testable CATI Instrument
         Task 8.5 Spanish and English Language CATI Screens
      Task 9 Field Test of Survey Instruments and Procedures
         Task 9.1 Submit Field Test Plan
         Task 9.2 Conduct Field Test
         Task 9.3 Submit Field Test Report with Revised CATI Instruments
         Task 9.4 Submit Design/Instrument Changes in IMT/OMB Memo
         Task 9.5 Submit Final Instruments
      Task 10 Hiring and Training CATI Interviewers
         Task 10.1 Develop Interviewer Training Materials
         Task 10.2 Recruitment of Interviewers
         Task 10.3 Interviewer Training Sessions
      Task 11 Data Collection
         Task 11.1 Data Collection Schedule
         Task 11.2 Quality Control Procedures
         Task 11.3 Progress Reports
      Task 12 Data File Preparation and Documentation
         Task 12.1 Data Coding and Editing
         Task 12.2 Data Conversion
         Task 12.3 Creation of Composite and Classification Variables
         Task 12.4 Item Nonresponse
         Task 12.5 Sample Weights
         Task 12.6 Public Release Files and User's Manuals
         Task 12.7 Restricted-use Data Files and User's Manuals
         Task 12.8 Disclosure Review Board
         Task 12.9 Maintenance of Necessary Data Files
      Task 13 Data Analysis and Reporting
         Task 13.1 Analysis Reports
         Task 13.2 Standard Error Calculations
      Task 14 Methodology Report
      EXHIBIT 1. SCHEDULE OF DELIVERABLES FOR NHES:2001
      EXHIBIT 2. SCHEDULE OF DELIVERABLES FOR NHES:2003
   B. REPORTING
   C. EXPECTATIONS CONCERNING QUALITY OF, CORPORATE SUPPORT FOR, AND TIMING OF DELIVERABLES
   D. THE ADJUDICATION PROCESS
   E. FORMS CLEARANCE
   F. ELECTRONIC COMMUNICATIONS
Appendix A Background Information on the NHES
Appendix B NHES Publications List
Appendix C Open-Ended Items

I. INTRODUCTION AND BACKGROUND

A. PURPOSE AND NATURE OF THE PROCUREMENT

The National Center for Education Statistics (NCES) of the Office of Educational Research and Improvement (OERI), United States Department of Education (ED), requires a contract to continue the National Household Education Survey (NHES) program. This procurement includes the design and conduct of two NHES collections: one in 2001 (NHES:2001) and one in 2003 (NHES:2003). The NHES:2001 collection will include Early Childhood Program Participation (last collected in 1995) and Adult Education (also collected in 1995). An additional topic, After School Program Participation and Activities, will also be included in NHES:2001. Proposed topics for the NHES:2003 are School Readiness (last collected in 1993), Parent and Family Involvement in Education (last collected in 1996), and an adult component that will collect detailed information on work-related Adult Education. Each collection requires random digit dialing (RDD) sampling methods and computer-assisted telephone interviewing (CATI) techniques. Large amounts of data must be processed and analyzed, and statistical reports written, under tight time schedules; task schedules overlap, and planning and field operations may be going on simultaneously.

B. ENABLING LEGISLATION

The National Household Education Survey (NHES) program was established by the NCES in response to the immediate and continuing need for quality and timely data pertaining to the condition of education in the United States. Specifically, the NHES is undertaken in compliance with the mandate stated in section 404 of the National Center for Education Statistics Act, P.L. 103-382 (20 U.S.C. 9003): "The duties of the Center are to collect, analyze, and disseminate statistics and other information related to education in the United States and in other nations..."

C. GENERAL FEATURES OF THE NHES

The NHES is a telephone survey of the noninstitutionalized civilian population of the United States. Households are selected for the survey using random digit dialing (RDD) methods. Data are collected using computer-assisted telephone interviewing (CATI) techniques. The methodology for any single fielding of the NHES is linked to the research issues under study, the level of data required to address these issues, and how precise the estimates generated from the survey data need to be in order to meet the objectives of the study. However, while the specifications for each survey will vary, there are general features of the NHES methodology that will stay relatively constant from one survey to the next. The general characteristics of the NHES sample design and the data collection approach are described in the next sections. Although these general features have been used in previous NHES collections, and it is anticipated that they will continue to be part of future NHES collections, the contractor may propose alternative approaches to any aspect of the NHES.
C.1. Multiple Components in Each Collection

Because of the high costs associated with screening large numbers of households in order to meet the sample size requirements of NHES components concerning young children, more than one population and set of issues has been addressed concurrently. Each of the NHES:91, NHES:93, and NHES:95 included two topical components; the NHES:96 included two larger components and one smaller component embedded within the screening interview. The components selected and the final sampling designs have sought to maximize the probability of a household qualifying for inclusion in the survey while limiting the burden placed on each household. (More details about these design features can be found in Overview of the National Household Education Survey 91/93/95/96, NHES Technical Report, and in User's Manuals and working papers for specific collections. These documents are fully referenced in Appendix B.)

C.2. Sampling through Random Digit Dialing

Random digit dialing (RDD) has been used for each NHES collection. However, the particular method of RDD utilized has varied. For the NHES:91 and the NHES:93, a modified Mitofsky-Waksberg method described by Brick and Waksberg (1991) was used. A list-assisted method was used in the NHES:95 and again in the NHES:96. The list-assisted method eliminates the sequential difficulties associated with the modified Mitofsky-Waksberg method in that it is a single-stage sample. It also produces an unclustered and self-weighting sample.

C.3. Sampling to Produce Greater Minority Representation

One of the goals of the NHES program is to produce reliable estimates of the characteristics of children's and adults' educational experiences for totals and for subdomains defined by race and ethnicity. In each NHES, decennial Census information about telephone numbers has been used to increase the likelihood of calling into minority households. In the NHES:95, for example, 100-banks were classified as "high minority" if at least 20 percent of their population was black or at least 20 percent of their population was Hispanic. In the NHES:93, "high minority" was defined as having a population at least 20 percent black, Hispanic, or Asian. In both collections, "high minority" numbers were sampled at twice the rate of "low minority" numbers. "Asian/Pacific Islander" was included in the 1993 definition because design work suggested that a definition based only on blacks and Hispanics tended to depress the number of Asians and Pacific Islanders encountered. Adding "Asian/Pacific Islander" to the definition increased the numbers of blacks and Hispanics without hurting the number of Asians. It was not possible to use this same approach in the NHES:95 or NHES:96 because the sample was purchased from a provider that did not include information about Asians/Pacific Islanders on the sample frame.

C.4. Sampling with Intent to Assess Change Over Time

The NHES program was conceived as a program that would provide trend information on important educational indicators. Thus, each component has been designed with the intent of assessing change over time. Estimates by race and ethnicity are of great interest, especially for monitoring educational trends over time.

C.5. Use of Computer-Assisted Telephone Interviewing (CATI) Techniques

The NHES has several features that require the use of a computer-assisted telephone interviewing (CATI) methodology. First, the topics covered by the NHES are complex, and difficult skip patterns often arise.
Consequently, the types of questionnaires that are developed to collect information on such detailed topics as child care or preschool arrangements for young children would be difficult to administer by telephone without the aid of a CATI system. Second, as noted earlier, the NHES will usually cover more than one topic and require sampling from more than one population. Here again, it seems unrealistic to implement such a design without the assistance of an effective and efficient CATI system. Third, one of the features of the NHES that is important to NCES is the timely dissemination of the data gathered by this system. The use of a CATI system shortens the amount of time devoted to interview followup, data processing, and editing, and also hastens release of public-use data files and statistical reports.

C.6. CATI Interviews in Spanish

Because the NHES oversamples Hispanics, a number of households that are contacted through regular CATI methods require a Spanish-language version in order to complete the interview. In order to follow the complex skip patterns, a Spanish-language version of the CATI instrument must be available for interviewers who can conduct interviews in Spanish.

C.7. Fast Turnaround

Historically, NCES has been able to release an adjudicated "Statistics in Brief" report on each data set prior to the end of September of the data collection year. Thus, data cleaning, imputation, weighting, analysis, report writing, and report adjudication have been accomplished within 5 months of the completion of each data collection.

C.8. Commitment to Data Quality

Since its inception, the NHES program has dedicated resources to both assessing and improving the quality of the data. The sections that follow describe the data quality work that has taken place.

C.8a. Cognitive Laboratory Work

Cognitive laboratory work has been used extensively in the NHES program. Focus groups have been used at even the earliest stages of development to explore general perceptions concerning the topics. As questionnaires have begun to take shape, focus groups and one-on-one think-aloud or debriefing interviews have been used to help refine items as well as the general flow of the questionnaires. Multiple rounds of cognitive lab work have allowed for iterative testing of the instruments. (See Use of Cognitive Laboratories and Recorded Interviews in the National Household Education Survey, NHES Technical Report, as listed in Appendix B.)

C.8b. Field Tests

Field tests have been used to improve the quality of the data in all NHES collections. For the 1989 field test, the NHES:91, and the NHES:93, due to limited planning time, field tests were done in the fall preceding the collection. The field tests primarily served to ensure that the CATI program was operating according to specifications. In both the NHES:95 and the NHES:96, extensive field testing of the topical components was done in the spring prior to the collection year. Both times, three rounds of field tests were performed, though the sample size devoted to each round varied. In the NHES:96, the first phase was small, and its purpose was to test the overall flow of the questions and to determine whether there were any major problems with general wording. The second phase was larger, and more attention was paid to individual items, including the appropriateness of response categories. The third phase was smaller and was done to ensure that changes made as a result of the earlier phases were properly programmed.
Methodological tests, such as efforts to identify means for increasing response rates, have been incorporated into field testing as deemed necessary. The field tests for the NHES:2001 and NHES:2003 will be conducted in the spring and will be similar in size to those conducted in previous NHES collections.

C.8c. Maximizing Response Rates

High response rates are critical to the NHES, as to all surveys, because the potential for nonresponse bias increases as the response rate decreases. The most appropriate way to avoid serious nonresponse bias is to achieve high response rates. Response rates in the NHES are the product of the completion rate for the final stage of the survey and the completion rates for all earlier stages (for example, a 90 percent Screener completion rate combined with an 85 percent topical interview completion rate yields an overall response rate of about 77 percent). From the NHES:91 through the NHES:96, the response rates have ranged from approximately 70 to 82 percent. The Screener completion rates in 1991 and 1993 were about 10 percentage points higher than the Screener completion rates in 1995 and 1996. In each survey year, most nonresponse to the Screener was accounted for by refusals, which comprised 68 to 84 percent of nonresponse.

A screening experiment conducted in the field test for the NHES:96 revealed that a major reason for lower Screener response rates was the screening approach. The screen-out approach (an early eligibility item which allows many noneligible households to "opt out") results in Screener completion rates about 10 percentage points higher than either no screen-out or full household enumeration approaches. (See An Experiment in Random-Digit-Dial Screening, NHES Technical Report, as listed in Appendix B.)

Other approaches to maximize response rates that have been fielded by the NHES include: continuing to recycle telephone numbers known to be residential until a refusal, completed interview, or other final outcome is confirmed; leaving messages on answering machines; sending a letter and brochure explaining the survey to potential nonrespondents; sending a letter to initial refusal cases eliciting their cooperation in a possible interview; and sending out a letter in advance of the data collection (done only in the NHES:96).

It should also be noted that the percentage of telephone numbers yielding residences is lower for the list-assisted method than for the modified Mitofsky-Waksberg method (50-52 percent of numbers dialed are considered to be residential in the list-assisted sampling method versus 60-62 percent for Mitofsky-Waksberg). Thus, for the NHES:95 and NHES:96, more telephone numbers had to be dialed in order to establish a sufficient number of potential respondents in sample than for the Mitofsky-Waksberg method, which clusters the banks of telephone numbers. Achieving high levels of cooperation from the telephone numbers sampled can maximize the efficiency of the sample and reduce the amount of interviewer time. Strategies to maximize response rates are an essential part of future NHES collections. (See An Overview of Response Rates in the National Household Education Survey: 1991, 1993, 1995, and 1996, NCES Technical Report, as listed in Appendix B.)

C.8d. Reinterviews

Each NHES has included a reinterview survey in which a random sample of respondents have been called and re-asked a subsample of the items on the original interview to assess item reliability. The purposes of the reinterview have been to identify survey items that are not reliable and to quantify the magnitude of the response variance for items collected from the same respondent at two different times.
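For illustration only, the reliability measures typically produced from such a reinterview, a gross difference rate and an index of inconsistency for a yes/no item, can be computed as in the sketch below; the data and the item are hypothetical and are not NHES results.

```python
# Illustrative only: hypothetical yes/no answers (1 = yes, 0 = no) given by the
# same respondents in the original interview and in the reinterview.
original    = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
reinterview = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]

n = len(original)

# Gross difference rate: the share of respondents whose answers changed.
gdr = sum(o != r for o, r in zip(original, reinterview)) / n

# Index of inconsistency (simple form for a binary item): the gross difference
# rate divided by p1*(1 - p2) + p2*(1 - p1), where p1 and p2 are the "yes"
# proportions in the original interview and the reinterview.
p1 = sum(original) / n
p2 = sum(reinterview) / n
ioi = gdr / (p1 * (1 - p2) + p2 * (1 - p1))

print(f"gross difference rate = {gdr:.2f}")
print(f"index of inconsistency = {ioi:.2f}")
```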
The NHES:93 included reinterviews on both components; the NHES:91 and NHES:95 have included reinterviews with the early childhood and adult education components, respectively. (See The 1995 National Household Survey: Reinterview Results for the Adult Education Component, NHES Working Paper, as listed in Appendix B). The NHES:96 reinterview included reinterviews on both parent and family involvement in the education of their children ages 3 through 5th grade, and civic involvement of parents and youth. Adults with no children in the specified age/grade range were not reinterviewed, under the assumption that reliability of items on civic involvement for adults would not be significantly different than for parents. C.9. Technical Review Panels Studies of this scope, complexity, and importance require input from a large number of individuals and organizations in order to address, in a technically competent and meaningful way, the data needs of policymakers and of those performing policy studies and educational research. Thus, the contractor for the program has worked with a number of Technical Review Panels (TRPs). Historically, a TRP has been formed for each topical component being covered by the NHES. TRPs have played an active role through reviewing and commenting on overall research priorities, identifying policy and research questions, providing input about questionnaire content, proposing analytical models and methods, reviewing work plans and their implementation, and reviewing and suggesting modifications to draft reports. C.10. Adherence to NCES Standards NCES has developed and implemented a set of standards that set guidelines to insure the quality of NCES' work. These standards are to be followed by NCES staff and their contractors in performing the day-to-day work of the NCES. A copy of the NCES publication NCES Statistical Standards (U.S. Department of Education, Office of Educational Research and Improvement, National Center for Education Statistics, June, 1992 (NCES 92-021)) can be obtained by calling (877) 4ED-PUBS. A list of the relevant standards covered in this publication is provided below. - Standard for Planning of Statistical Surveys - Standard for the Design of a Survey - Standard for Testing Survey Systems - Standard for Managing Survey Operations - Standard for Achieving Acceptable Response Rates - Standard for Imputation of Item Nonresponse - Standard for Computation of Response Rates - Standard for Codes, Abbreviations and Acronyms - Standard for Variance Estimation - Standard for Analysis and Statistical Comparisons - Standard for Maintaining Confidentiality - Standard for Tabular and Graphical Presentations - Standard for Dissemination of Survey Data and Reports - Standard for Timely Processing and Release of Data and Data Tapes - Standard for Maintaining Data Series - Standard for Machine Readable Products - Standard for Evaluation of Survey - Standard for Documenting a Survey System - Standard for Survey Documentation in Center Reports The contractor should be aware that NCES is reviewing its standards and policies. Should any standard change prior to or after contract award, the contractor shall be informed. D. GENERAL PLANS FOR THE NHES:2001 NHES:2001 is proposed as a data collection with three components: Adult Education, Early Childhood Program Participation, and After School Program Participation and Activities. 
Although Adult Education and Early Childhood Education have been fielded before, it is possible that there could be substantial modification of the questionnaires. For example, the adult education component is exploring the development of items in two areas: (1) barriers to participation, and (2) nontraditional providers and media for the provision of adult education. Furthermore, the NHES has collected a limited amount of information on After School Program Participation and Activities in previous collections, but substantial modification will be required on this component.

E. GENERAL PLANS FOR THE NHES:2003

The NHES:2003 will also consist of three components: Adult Education-Work Related, School Readiness, and Parent Involvement in Education. However, input from various sources will be sought prior to confirming this set of topics. The Adult Education-Work Related component may seek to identify the changing provision of adult education courses. Businesses, professional associations, and governments have provided an increasing share of courses to adults at the expense of the more traditional two-year institutions. If there is a shift from academic providers to other types of providers, issues of equity in access to lifelong learning may arise. As with the NHES:2001, the School Readiness component and the Parent and Family Involvement in Education component were previously fielded by the NHES; however, redesign of these components may be necessary.

II. SCOPE OF WORK

A number of activities are required to successfully design and implement the NHES:2001 and NHES:2003 data collections. These activities are described below as a series of tasks and subtasks, all of which are applicable to both the NHES:2001 and the NHES:2003. Specific dates for deliverables are provided in Exhibit 1 (Page 37). Except as noted, the contractor shall allow two weeks for the NCES COTR review of a draft deliverable. One week shall be allowed for the contractor to respond to any requests for revisions.

A. GENERAL TASKS TO BE PERFORMED

Task 1. General Project Planning and Management

Under this task, the contractor shall perform its general project management functions. A routine aspect of this management shall be the development and submission of regular monthly project reports. The monthly progress reports are described in more detail in Section B.

1.1 Contractor Review of NHES Background Materials. Within 5 to 10 days following award, the Contracting Officer's Technical Representative (COTR) will provide project materials from the earlier studies (i.e., the 1989 field test, NHES:91, NHES:93, NHES:95, NHES:96, and NHES:99), including the NHES technical reports and data files. To take advantage of the experience gained during the prior NHES surveys and to ensure that the new studies will be comparable in important ways with previous NHES surveys, the contractor shall review these reports and data files.

1.2 Other Meetings with NCES. Key contractor staff shall be prepared to travel to NCES' offices in Washington, DC, for meetings throughout the contract period. It is anticipated that about three meetings per year (other than the initial meetings described in Task 1.7 and any TRP meetings described in Task 2) will be needed over the course of the contract. Typically, two of the meetings per year are one-day meetings and one would be a two-day meeting. The designation of attendees to each meeting shall be made only after the meeting goals and agenda have been established.
An average of three of the contractor's staff members is typically in attendance at each meeting. The contractor shall participate in the development of the goals and agenda for each meeting and shall be responsible for distributing meeting materials, if any, to about 12 meeting attendees. Materials to be distributed shall be approved, in advance, by the COTR. 1.3 Briefing Materials and Briefings. From time to time throughout the course of the project, the NCES COTR will provide interested individuals and agencies with information about the nature, findings, and progress of the NHES. The contractor shall support these activities by developing up to four sets of briefing materials to be used by the COTR for the survey. Each set of briefing materials shall be no more than 10 typewritten pages of text (double-spaced) with accompanying graphs and/or tables suitable for display (such as transparencies or slides). Within two weeks of a request, a set of briefing materials shall be submitted to the COTR for review and comment. The contractor shall make necessary revisions to the materials based on this review and resubmit the materials within one week of receipt of comments. The contractor shall also be prepared to give up to ten briefings and/or demonstrations about the NHES program. The COTR will give as much notice as possible for these briefings and/or demonstrations. Examples of the kinds of meetings at which the contractor might be asked to demonstrate NHES data are the Annual Meeting of the American Educational Research Association and NCES' Summer Data Conference. All materials prepared for these briefings shall also be approved, prior to use, by the COTR. The contractor shall submit briefing materials for approval no less than 10 working days before the materials are to be presented. The contractor shall make revisions as appropriate and provide final copies to the COTR within 1 week of receipt of comments. 1.4 Project Leaflet. Over the course of the contract, the contractor, in consultation with the COTR, shall prepare three leaflets describing the NHES. The COTR will request a leaflet, and the contractor shall deliver a first draft of the leaflet to the COTR within three weeks of the request. The COTR will review the leaflet and provide comments and suggestions within two weeks. The contractor shall revise the materials based on suggestions from the COTR and submit camera-ready, color-separated copy of the materials no later than two weeks following receipt of the COTR's comments. An electronic version of the leaflet's text and graphics should be delivered with the camera-ready copy; the electronic version should be in a Web-compatible format in accordance with existing NCES guidelines. 1.5 Project Bibliography. Throughout the course of the contract, the contractor shall maintain a bibliography of reports and articles which cite any of the NHES surveys. These bibliographies shall be delivered to the COTR each November (See Exhibits 1 and 2). The COTR will review the bibliography and provide comments within two weeks of receipt. The contractor shall revise the bibliography and submit one camera-ready and ten stapled copies within two weeks of receipt of the COTR's comments. This bibliography shall be updated on an annual basis thereafter to include citations of reports or articles which cite previous NHES surveys or the current NHES surveys (once available). 1.6 Development of Data Security Plan. 
In order to ensure the anonymity of individual respondents, the contractor must comply with Section 408 of the National Education Statistics Act of 1994, P.L. 103-382 (20 U.S.C. 9007). The Act authorizes fines or imprisonment for disclosure of individually identifiable information for any purpose other than statistical purposes. Under no circumstances may the contractor release personally identifiable information. Information which identifies persons must be maintained in files which are physically separate from other research data and which are accessible only to sworn agency and contractor personnel. Individual identifiers used during the course of the project shall be associated with data only for purposes of data gathering, matching new data with old, establishing sample composition, authenticating data collections, editing data based on callbacks, or obtaining missing information. The contractor shall enforce strict procedures for ensuring confidentiality. These procedures shall apply to all phases of the project and should include but not be limited to: 1) data collection in the field; 2) coding and editing phases of data prior to machine processing; and, 3) safeguarding response documents, including CATI records. Any employee needing access to confidential information shall first swear to an Affidavit of Nondisclosure. A copy of the Affidavit is available upon request from ED's Grants and Contracts Service. The contractor shall execute these Affidavits of Nondisclosure and the originals shall be maintained by the contractor at their office. The contractor shall be able to produce these Affidavits within a few hours notice from the COTR. The contractor shall indicate the position in the organization of the person signing the Affidavit of Nondisclosure, and the person's functional relationship to this project in a memorandum provided to the COTR. As new persons are assigned to this project, an Affidavit of Nondisclosure shall be executed for them on the first working day of assignment to the project. Throughout the life of the contract, the memorandum of Affidavits of Nondisclosure for new project staff, as well as interviewers and other short-term personnel shall be submitted on a schedule designated by the COTR or at a minimum of three times a year. All individuals asked to respond to the survey shall be informed of the following: NCES' enabling legislation; the purposes for which the information is needed; uses that may be made of the data; and the methods of reporting the data so that an individual's responses are not revealed. The contractor shall maintain security on the complete set (and deliverable backups) of all master survey files and documentation. The contractor shall present a detailed security plan that expands upon what was presented in the proposal to the NCES COTR for approval. Tasks 1.1 through 1.6 shall be conducted once during the contract, the remaining tasks will be completed for both NHES:2001 and NHES:2003. See Exhibit 1 (p.37) for schedule of deliverable dates. 1.7. Initial Meeting with NCES. The contractor shall meet with NCES staff , and someone from the Department of Education's Grants and Contracts, within ten days of contract award to review the study's tasks and to discuss issues related to the conduct of the work. This conference shall be considered part of the background review and as input for the project management plan. 1.8. Project Management Plan. The contractor shall prepare and submit an updated project management plan to the COTR. 
The COTR will provide comments on the plan within two weeks of receipt. The contractor shall incorporate the changes necessary to respond to the concerns raised by the COTR and submit a revised plan within one week of receipt of the COTR's review comments. Final approval of the revised plan will be provided within two weeks of receipt of the final plan. The plan shall describe and justify any revisions to the original plan specified in the contract. These might include changes in data collection approach, data processing, plans for analysis and reporting, as well as changes in the proposed project organization, staffing plan, and schedule. The plan shall fully describe the rationale for any proposed changes. The plan shall also detail the management techniques that will be employed to insure timely, efficient, and cost-effective data collection, processing, analysis and reporting. Task 2. Technical Review Panel A Technical Review Panel (TRP) shall be established for each component of the NHES. The TRP for each component shall consist of no fewer than six persons who the contractor believes will have expertise to bring to the project, as the topics cross the spectrum of education areas (such as early childhood program participation to work-related adult education). Candidates for the NHES TRPs should be drawn from data users to the extent possible. The role of each TRP shall be to review the technical and substantive issues associated with the NHES. 2.1. Develop TRP Plan. The contractor shall submit a plan to the COTR for the NHES TRPs. This plan shall contain a list of nominees of individuals with the needed expertise for possible membership on the TRP, shall detail a schedule of meetings, and outline the role of the TRP. The COTR must approve the nominations or suggest alternate nominees. The contractor shall recommend a tentative schedule of panel meetings in keeping with the overall plans presented in its proposal. 2.2. Solicit Participation in TRP. Once the COTR has approved an individual for membership, which will take no more than four weeks, the contractor shall take no more than two weeks to contact the individual to solicit participation. 2.3. Plan and Arrange TRP Meetings. All materials and correspondence required for the TRP's consideration and review must be provided to the COTR for review. After the COTR review and approval, those materials should be sent out to TRP members for review and comment. The contractor shall arrange for TRP meetings and pay all associated expenses. It is anticipated that each TRP shall convene for as many as three one-day meetings. Panelists will need one day, prior to each meeting, to review materials and to otherwise prepare for the meeting. All meetings shall be held in Washington, DC at the offices of the NCES. Non-government panelists should be paid an honorarium plus per diem and related expenses. The same rates would apply for other work panel members might perform. 2.4. Provide Meeting Summaries. The contractor shall tape record meetings and submit a written summary to the COTR within two weeks of each meeting. The meeting summary shall not be a verbatim transcript of the meeting. Instead, it shall summarize the discussions and activities that took place during the meeting, highlighting major issues that were raised and decisions made. 2.5. Administer Other TRP Activities. The contractor shall arrange and pay for specific work products not tied to a meeting. For example, the TRP might be asked to comment on a technical report. 
Again, non-government panelists should be paid an honorarium. The contractor shall supply the COTR with copies of all correspondence and other materials exchanged with panelists. Task 3. Develop Survey Content Even though the NHES may be drawn from items that have been fielded previously, additional conceptual work (both substantive and methodological) is required to design the sample, data collection procedures, instruments, and analyses. The contractor shall implement an approach to the design of the NHES that follows the steps outlined in the following subtasks. 3.1. Hold Meetings to Identify Key Research and Policy Issues. The contractor shall consult with the COTR, and through the COTR with the various offices in the U.S. Department of Education, other Federal agencies (including OMB), education policy groups (e.g., heads of state policy analysis centers, and administrators from state departments of education), and members of the NHES TRPs for input on issues germane to the approved topics. A series of meetings with various education policy offices and groups shall be arranged by the COTR and the contractor to obtain their recommendations. At these meetings, the contractor shall outline NCES' plans for the NHES and request the input of the policy offices and groups in identifying important research and policy issues in need of comprehensive national data that are appropriately gathered by the NHES. Following completion of work on this task, the contractor shall prepare and submit an issues report. At a minimum, this report shall contain a written summary of each meeting. It shall also contain a summary of the meetings as a whole. This report should strive to present a balanced view of the research issues presented by all of the various groups. In particular, the summary should highlight recommendations that emerge from more than one source or from consensus. 3.2. Literature Review. For each topical component of the NHES, the contractor shall prepare a literature review of all of the reports, articles and extant surveys available in the appropriate content area. The contractor shall submit a plan for the conduct of each of the literature reviews. The COTR will review each of the plans and provide comments within two weeks of receipt. The contractor shall provide the literature reviews to the COTR. The COTR will review each literature review and provide comments within two weeks. The contractor shall deliver the final literature reviews two weeks after data collection, at which time, they shall contain the most current information in each field. 3.3 Prioritize Research and Policy Questions to be Addressed The contractor shall prepare lists of the research questions to be addressed by each component of each NHES. On each list, the questions shall appear in their order of priority for the collection. The contractor shall submit the first draft of the prioritized research question list with the summary of meetings report mentioned above. Approved lists of research questions shall be provided to the appropriate TRP for review and comment. All comments from the TRP members shall be summarized by the contractor and presented to the NCES COTR. The lists will undergo constant revision. As the instruments are developed, the lists shall also indicate the questionnaire items which will be used to address each research question. Each submission of a questionnaire shall be accompanied by a revised research question list. 
In that way, as items are revised or deleted or as design changes are made, their impact on the research and policy questions to be addressed will be known. The contractor should plan on up to 6 revisions to the research questions lists. 3.4. Review Designs and Instruments of Other Surveys. The contractor shall review the designs and instrumentation of key studies and data collection programs that are related to the topical components of the NHES. The primary objective of this review is to collect information that will help to place the NHES into perspective with respect to past NHES surveys and other extant studies. The contractor shall prepare a summary of these studies emphasizing the areas of comparability and differences between these studies and the NHES. The summary shall show how the NHES complements other studies and extend the information base on the topics being addressed. An outline of the summary shall be submitted to the COTR for review and approval prior to the actual summary being prepared. After approval of the outline, the contractor shall prepare the summary and submit a draft for the COTR review and comment. At a minimum, the summary shall contain detailed information on the purpose of each survey, its sample design, the inference population, the methods used to collect data pertaining to the education issues being addressed, the characteristics for which information is being sought, and the types of estimation/analyses the data collection supports. Following the COTR's review of the draft, the contractor shall make any revisions necessitated by the COTR's comments and submit a final summary of the extant studies/data for the COTR's review and approval. The summary of extant studies/data will provide the necessary background information needed to develop a plan for conducting a comparative analysis of estimates derived from the NHES. As a part of the analysis and reporting requirements associated with each NHES, the contractor shall compare estimates from the NHES with similar estimates derived from other studies (see Task 14). 3.5. Prepare Content Outline. The contractor shall prepare and submit a content outline for each survey component. The content outline shall list the recommended topics for inclusion along with a brief rationale for each. The topics for inclusion in the survey shall be organized by the research and policy issues they address. Following the COTR's review of the drafts, revised outlines which incorporate suggestions made by the COTR shall be submitted. Approved content outlines shall be provided to the appropriate Technical Review Panel for review and comment. All comments from the TRP members shall be summarized by the contractor and presented to the NCES COTR. Task 4. Develop Survey Instruments and Procedures 4.1. Develop Survey Instrument. The survey "instrument" for RDD telephone surveys is the combination of questionnaire items comprising the screener interview and the content interviews. The NHES will be conducted through the use of CATI; therefore, the design of the survey instrument shall take this into consideration (See task 8). The screener interview provides the information about whether the telephone number serves a household (eligible telephone number) and whether any household members are eligible for sampling and interview (eligible HH members). The NHES:95 Basic Screener served that purpose. An alternative to the Basic Screener was tested in NHES:95 and used in full-scale operation for NHES:96. 
The Expanded Screener added two features to the Basic Screener. First, it provided for the collection of more detailed demographic information about every household member after full enumeration. Secondly, it added a third topical component about the household by a household respondent. Current plans for the NHES include a Basic Screener; however, NCES may specify the use of an Expanded Screener. Draft copies of the survey instrument shall be submitted to the COTR for review and comment. The COTR will provide comments on the draft instruments within two weeks. Within one week of receipt of the COTR's review comments, the contractor shall revise the instruments incorporating the comments and suggestions of the COTR, and submit a second draft of the instruments. Following the cognitive laboratory research, discussed below, the contractor shall prepare a third draft of the instruments. Following the COTR's review and approval of this draft (within two weeks of its submittal), the contractor shall make revisions necessitated by the COTR's review. 4.2. Develop Data Collection Procedures. Data collection procedures are the decision rules about how to handle various types of situations encountered in the field, such as, but not limited to, how many Ring No Answers to allow before finalizing the case as a no contact, whether to use Advance Letters as an aid to reducing possible nonresponse, what type of message to leave on answering machines, the use of an "800" number for callbacks, and so forth. These procedures shall be documented by the contractor. An outline of data collection procedures together with a proposal that contains a written description of the activities and decision rules for nonresponse and noncontact cases shall be submitted to the COTR. The COTR will comment on the data collection procedures. Upon completion of the Cognitive Laboratory testing and subsequent revisions to the survey instrument, a revised and more detailed version of the data collection procedures shall be submitted to the COTR. The COTR will provide comments on the revised version. A final version, incorporating the use of CATI programming procedures, shall be delivered to the COTR. Any last-minute changes and corrections due to CATI programming changes should be documented and the field version made available to the COTR at the start of data collection. 4.3. Conduct Cognitive Laboratory Research. Because most of the data items in the NHES will be collected periodically, it is important to insure that the items developed for the NHES are both valid and sensitive measures of an individual's educational experiences. In support of these objectives, the contractor shall design and implement cognitive laboratory research on the NHES instruments. The cognitive laboratory research shall be directed toward an understanding of the extent to which the questionnaire items are understood by respondents and how their answers might be interpreted. This research shall seek to identify any areas of ambiguity and confusion. Prior to conducting any cognitive laboratory research, the contractor shall develop and submit a plan for the conduct of such research. The plan shall describe the contractor's general approach to this work and identify the specific methods and procedures that shall be used. Justification shall be provided for the selection of certain methods over others relative to the objectives of the research. 
The plan shall also describe how the outcomes of the cognitive laboratory research will be used to improve the overall quality of the NHES instruments and procedures. The contractor shall allow two weeks for the COTR to review the plan. Implementation of the plan shall be contingent upon the COTR's approval. A draft of the Cognitive Research Report shall be due to the COTR. The contractor shall allow two weeks for the COTR to review the draft. The report shall be revised in response to the COTR's comments and the final report submitted.

Task 5. Sample Design

During the design and conduct of the NHES, it is important that considerable attention be paid to the reduction of non-sampling errors as well as to the reduction of errors that are introduced through the sample selection process. The contractor shall consider any experiments or special studies that would enhance our knowledge of the effects of different NHES design features, and submit a plan for the conduct of such research, if necessary. The plan shall describe the contractor's general approach to this work and identify the specific methods and procedures that shall be used. Justification shall be provided for the selection of certain methods over others relative to the objectives of the research. The plan shall also describe how the outcomes of this research will be used to improve the overall quality of the NHES instruments and procedures. The contractor shall allow two weeks for the COTR to review the plan. Implementation of the plan shall be contingent upon the COTR's approval.

The contractor shall design an appropriate sampling strategy that will result in a sample sufficiently large to support the types of analyses to which the data will be subjected. In designing the sample, the contractor shall weigh the advantages and disadvantages of different sampling approaches. The contractor shall carefully consider the total number of households that will need to be screened in order to obtain the target sample sizes. The impact of the screening rates on both the precision of the sample estimates and on the costs of the survey shall be evaluated. The contractor shall take the following features of the NHES into account when designing the sample:

o In developing the sample design, the contractor shall give special attention to increasing the minority (black and Hispanic) representation in the sample and to increasing the precision of estimates for different race/ethnic groups. Increasing the number of minority households in the sample may be accomplished in several ways. The simplest method, but not necessarily the most efficient method, is to increase the overall sample size (i.e., the number of households screened). Other methods involve the use of information on neighborhood characteristics associated with different telephone exchanges to identify and oversample high minority exchanges and the use of more screening items at the household level.

o The NHES shall be designed in such a way as to increase the power of detecting changes across time in the key estimates. For estimates of percentages in the 30-60 range, the ability to detect at least a 10-15 percent relative change in key estimates (i.e., relative change as a percent of the estimate p) is desirable.
o The contractor shall consider the merits of including in the sample all household members who meet the eligibility criteria for a topical component (e.g., 3- to 8-year-old children) versus sampling within the household among those members who meet the criteria. NCES may decide to develop state-level estimates for key estimates from each of the different topical components for either the NHES:2001 or the NHES:2003. In the event that state-level estimates are required, the contractor shall work closely with the COTR to develop this capability. The contractor shall prepare a sample design plan for the NHES surveys and submit them to the COTR for review and approval prior to implementation. A draft of the plans shall be submitted to the COTR. Following the COTR review, the contractor shall make any necessary revisions to the plans and submit the revised plans to the COTR for review. The revised plans shall be submitted within one week of receipt of the COTR's comments on the draft plans. The revised plans are expected to be the final sample design. Task 6. Survey Design Report The contractor shall prepare a Survey Design Report. This report shall be in large part a compilation of the materials prepared under Tasks 4 and 5. At a minimum, it shall contain: 1) an introduction to the survey with a comprehensive statement of its intended purpose; 2) a description of the sampling design of the survey; 3) a description of the survey procedures; 4) copies of the NHES instruments along with appropriate justifications and definitions for key variables; and 5) a plan for analyzing the data from the survey. The Survey Design Report shall also contain detailed projected costs for Tasks 8 through 14 based on the same assumptions used in the design. The contractor shall submit an outline of the Survey Design Report to the COTR for approval. Once the outline of the report is approved by the COTR, the contractor shall submit a draft of the Survey Design Report to the COTR. Within two weeks of receipt of the draft report, the COTR will provide the contractor with comments on the draft document. The contractor shall revise the Survey Design Report incorporating the COTR's comments and suggestions. The revised Draft Survey Design Report shall be submitted to the COTR within one week of receipt of the COTR's review comments on the draft report. The contractor shall provide the NHES COTR and members of the NHES TRP with copies of the report for their review and comment. The contractor shall request that the members of the TRP provide written comments and be prepared to discuss the NHES design at a TRP meeting. With the COTR's approval, the contractor shall modify the Survey Design Report (and the NHES survey design) based on the suggestions of the TRP members. The COTR assumes that most of the contents of the Survey Design Report shall be used to prepare the IMT/OMB Forms Clearance Package. The Survey Design Report will continue to be updated throughout the data collection process, to incorporate the field test results and implementation. Task 7. IMT/OMB Clearance of the NHES Instruments The contractor shall prepare a forms clearance package for obtaining IMT (Information Management Team of the Department of Education) and OMB approval of the NHES. The contractor shall follow specifications as provided in Standard Form 83-I. 
The forms clearance package shall include: a supporting statement that describes the reason for the study, detailed justification of all items to be included in the survey, sample design specifications, data collection procedures, analysis plan, estimated response burden, description of the NHES program, other information required by IMT and OMB, and survey materials to be used in the study--survey questionnaires and other materials to be used with the respondents. Following the COTR's review of the draft package, the contractor shall revise the package incorporating the COTR's comments and suggestions and submit a revised package. This version of the package will be distributed more widely within the Department of Education for further review. The contractor shall expect additional comments and suggestions within four weeks. Following this review, the contractor shall have one week to submit a revised package incorporating the COTR's comments and suggestions. Following approval of the revised package, the contractor shall provide the COTR with 10 copies of the IMT/OMB Clearance Package. Submitting the package far in advance of the scheduled start of data collection will help ensure that OMB approval is obtained prior to the conduct of the field tests (see Task 9). The COTR will submit the package to IMT/OMB. Following submission of the package, IMT/OMB has up to 4 months to respond with questions or revisions before granting clearance; the contractor is expected to provide any technical assistance to the COTR as necessary to answer such inquiries.

Task 8. Develop CATI System

All interviews conducted as a part of the NHES shall use a CATI methodology. The CATI system developed for the NHES shall perform the necessary functions associated with:

1. sampling the appropriate households and individuals for interviews;

2. scheduling interviews -- the CATI system shall provide the telephone numbers online to interviewers to attempt initial contacts, maintain a list and schedule of callback appointments, and establish a schedule for attempting to contact numbers when initial attempts at contact fail;

3. guiding the interviewer through the questionnaire based on the data requirements for given classes of respondents -- the programming of skip patterns will reduce the number of interviewer errors and improve the overall quality of the survey data;

4. monitoring and reporting the progress of the data collection on a daily and weekly basis; and

5. coding and editing of the interview data -- most of the coding of the interview data shall be accomplished by the CATI system during the conduct of the interview. However, there may be exceptions to this. Editing of the answers provided by respondents shall be performed during the interview and at a minimum shall include both range and consistency checks.

To the greatest extent possible, the contractor shall design the CATI system in such a way as to minimize disruptions to the system when requests for changes are made. The contractor shall program the approved survey instrument into a CATI system in both English-language and Spanish-language versions.

8.1. Specification of Range and Logic Checks. The contractor shall prepare specifications for every questionnaire item response that detail the allowable range of responses and the internal consistency checks between sets of related items. For example, if the age of the respondent is collected in the demographic items, then other items relating to the person's age should be consistent with the reported age.
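For illustration only, the sketch below shows one way such range and consistency specifications might be expressed and applied; the item names, ranges, and rules are hypothetical and are not drawn from the NHES instruments.

```python
# Hypothetical edit specifications: allowable ranges and a cross-item
# consistency rule, keyed by item name.
RANGES = {
    "CHILD_AGE": (0, 20),        # e.g., a hard range check
    "HOURS_PER_WEEK": (0, 70),   # e.g., a soft range check in practice
}

def range_errors(record):
    """Return items whose values fall outside their allowable range."""
    errors = []
    for item, (low, high) in RANGES.items():
        value = record.get(item)
        if value is not None and not (low <= value <= high):
            errors.append(f"{item}={value} outside [{low}, {high}]")
    return errors

def consistency_errors(record):
    """Return violations of cross-item rules, e.g. reported age vs. grade."""
    errors = []
    age, grade = record.get("CHILD_AGE"), record.get("GRADE")
    # A child reported as enrolled in a given grade should not be far younger
    # than is plausible for that grade.
    if age is not None and grade is not None and grade >= 1 and age < grade + 4:
        errors.append(f"GRADE={grade} inconsistent with CHILD_AGE={age}")
    return errors

record = {"CHILD_AGE": 3, "GRADE": 5, "HOURS_PER_WEEK": 80}
print(range_errors(record) + consistency_errors(record))
```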
The contractor shall develop response categories that distinguish between "Not Applicable" responses (i.e., valid missing data due to skip patterns) and item nonresponse. The contractor shall develop procedures for insuring the overall quality of the data collected in the NHES. Most of the data editing associated with the conduct of the NHES shall take place during the interview session via edits contained in the CATI system. However, even with elaborate edit specifications for the survey instruments, it is assumed that errors in the data still may occur. As a consequence, the contractor shall implement manual editing procedures. Procedures shall be developed and in place for resolving unanticipated problems. At a minimum, editing of the survey data during the conduct of the interview shall include range checks (both hard and soft) and consistency checks. Inappropriate characters (such as words in a numeric field) should be prevented from being entered by the interviewer. Procedures shall be established that permit interviewers to respond appropriately to survey participants who provide out-of-range or inconsistent information. In developing these procedures, the contractor shall weigh the needs for accurate information and limited interviewer interference against the need for completing the interview as efficiently as possible. For each NHES instrument, the contractor shall prepare CATI edit specifications. (Post-CATI edit specifications are covered in Task 12-1, below.) These specifications shall be submitted to the COTR for review and comment. The COTR will provide comments within two weeks, and CATI edit specifications revised in response to the COTR's comments shall be submitted. Revised CATI specifications are expected to be the final version; the draft submitted should be considered as a final version with only minor modifications to be incorporated. Open-ended items (such as an occupation or education provider name) will have to be coded and edited either online (with screens that provide precoded response categories) or handled in post-interview manual edits (See Appendix C). The online capability is most desirable but would need to work seamlessly with the rest of the CATI software. The COTR has some experience with online coding programs used for ICOC and IPEDS. The contractor should propose how to handle such open-ended items. 8.2. Programming of Survey Instruments. Programming the survey instruments into the CATI system is one of the key activities of the project. For the survey instrument to capture the data reported by respondents accurately, the CATI system shall be able to appropriately handle all contingencies associated with the conduct of the interview. Each response shall lead to the next appropriate point in the survey and each response provided by the participants shall be accounted for. Because of the complexity of the CATI programming of a NHES instrument, the contractor shall develop the procedures and materials necessary to insure that all program specifications are correct and complete. The contractor shall propose appropriate procedures for verification of specifications, such as having multiple reviewers of varying levels of expertise test out the screens. 8.3. Testing of the CATI System. Prior to the conduct of the field test (see Task 9), the contractor shall test all features of the CATI system, including those features of the system associated with sampling, scheduling, interview management (e.g., skip patterns), data entry and editing, and case control. 
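For illustration only, the sketch below shows how skip-pattern routing might be exercised systematically with scripted interview scenarios, so that every branch can be verified to be reachable and to lead to a defined item; the routing table, item names, and answer codes are hypothetical and greatly simplified relative to an actual NHES CATI instrument.

```python
# Hypothetical, simplified skip-pattern table: item -> answer -> next item.
# "END" terminates the path. Real CATI routing is far more complex.
SKIPS = {
    "S1_HOUSEHOLD": {"yes": "S2_ANY_CHILDREN", "no": "END"},
    "S2_ANY_CHILDREN": {"yes": "C1_CHILD_AGE", "no": "A1_ADULT_ED"},
    "C1_CHILD_AGE": {"0-8": "C2_CARE_ARRANGEMENT", "9+": "A1_ADULT_ED"},
    "C2_CARE_ARRANGEMENT": {"yes": "END", "no": "END"},
    "A1_ADULT_ED": {"yes": "END", "no": "END"},
}

def run_scenario(answers, start="S1_HOUSEHOLD"):
    """Follow one scripted interview; a KeyError exposes a routing gap."""
    item, path = start, []
    while item != "END":
        path.append(item)
        item = SKIPS[item][answers[item]]
    return path

# Scripted scenarios chosen so that, together, they cover every branch.
scenarios = [
    {"S1_HOUSEHOLD": "no"},
    {"S1_HOUSEHOLD": "yes", "S2_ANY_CHILDREN": "no", "A1_ADULT_ED": "yes"},
    {"S1_HOUSEHOLD": "yes", "S2_ANY_CHILDREN": "yes",
     "C1_CHILD_AGE": "0-8", "C2_CARE_ARRANGEMENT": "no"},
    {"S1_HOUSEHOLD": "yes", "S2_ANY_CHILDREN": "yes",
     "C1_CHILD_AGE": "0-8", "C2_CARE_ARRANGEMENT": "yes"},
    {"S1_HOUSEHOLD": "yes", "S2_ANY_CHILDREN": "yes",
     "C1_CHILD_AGE": "9+", "A1_ADULT_ED": "no"},
]

covered = set()
for s in scenarios:
    covered.update((item, s[item]) for item in run_scenario(s))

all_branches = {(item, a) for item, nxt in SKIPS.items() for a in nxt}
print("untested branches:", all_branches - covered)
```

A check of this kind supplements, but does not replace, review of the programmed screens by project staff.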
Testing of the system shall involve the review of project staff at all levels of the project (e.g., project management, staff with substantive expertise pertaining to the collection, telephone operations staff, and programmers). It shall also involve the use of interview scenarios so that the system can be tested under simulated interview conditions. Every branch of the interview skip patterns should be tested, and corrected if necessary, prior to delivery of the testable CATI instrument, to be sure that all contingencies have been foreseen. The COTR shall be informed of the nature of changes to the CATI instrument due to error resolution, and any differences between the OMB-approved questionnaire and the CATI instrument that is programmed shall be documented and justified with the COTR's knowledge and consent. 8.4. Delivery of Testable CATI Instrument. The COTR shall have a minimum of 10 days to test out the CATI system. The contractor shall propose how to make the CATI system testable to the COTR, whether at the contractor's facilities or at NCES. A testable CATI instrument is one in which the sampling and dialing functions do not have to be demonstrated, but in which all screens are ready to be tested in their entirety. It should be possible for the COTR or other assigned project staff to go through a variety of simulated interviews (everything except dialing and sampling), testing out the responses to various scenarios and types of household respondents. The COTR shall notify the contractor promptly of all questions or problems with the CATI programming. The contractor shall remedy all problems reported during the testing phase. A testable CATI instrument shall be made available to the COTR no later than 3 weeks prior to the Field Test (see Task 9), to allow time for reprogramming as necessary. 8.5. Spanish and English Language CATI Screens. To allow for the review of the Spanish-language version of the CATI instrument, the contractor shall provide hard copies of both the English and the Spanish-language CATI screens. These CATI screens shall be submitted to the NCES COTR for review and comment. The COTR will provide comments within four weeks, and the revised Spanish-language CATI screens shall be submitted. Task 9. Field Test of Survey Instruments and Procedures 9.1. Submit Field Test Plan. The contractor, with the COTR, shall design a field test that tests both the implementation of full-scale procedures (such as sampling, scheduling, and actual interviewing) and the CATI survey questionnaire programming (skip patterns, online edits, etc.). To maximize the utility of the field tests, two rounds of testing are recommended: the first on a small number of households to test out the basic screen functioning and skip patterns, the second on a larger number of households to test out finer detail, such as whether response categories make sense. The number of cases for the first round should be large enough to test out each branch of the skip patterns. The contractor shall also use the field test to test alternative introductions. The goal would be to find the introduction that results in fewer initial hang-ups while still meeting OMB requirements for the introduction. The field test plan shall detail the schedule and work to be conducted for not only the field test, but also the subsequent analysis and revisions to the survey instruments. The contractor shall submit the field test plan to the COTR for review. The COTR will review the field test plan and provide comments within two weeks.
The contractor shall submit a final plan for conduct of a field test to the COTR. 9.2. Conduct Field Test. Once the cognitive laboratory work is complete, and the data collection instruments have been developed and approved (by NCES and OMB), the contractor shall conduct a field test of the instruments and the CATI system. All sampling and data collection procedures scheduled for the full-scale survey (see Survey Design Report under Task 6) shall be used to the extent feasible during the conduct of the field test; however, if necessary, purposive sampling may be substituted in order to guarantee a sufficient number of rare population households. The field test shall be used to test the entire CATI system, including its dialing, scheduling, and interview progress reporting functions. It shall also be used to test the wording and flow of the questionnaire items including appropriate skip patterns. Responses of field test sample members shall be reviewed and analyzed, and results shall be documented. Item response rate tables shall be generated. Special attention shall be given to skip pattern errors, consistency and other edit checks, and the sensitivity of items. 9.3. Submit Field Test Report with Revised CATI Instruments. The contractor shall prepare a short summary of the field test findings for review by the COTR and, at the COTR's request, members of the NHES TRP. Any problems encountered during the conduct of the field test and the contractor's recommendations for overcoming these problems shall be documented and reported. Recommendations for changes to the survey instruments shall be given in detail with justification of any additional costing. 9.4. Submit Design/Instrument Changes in IMT/OMB Memo. Based on field test results, modifications may be called for in the data collection procedures and instruments. Following the COTR's approval of any revisions to the survey instruments and procedures, the contractor shall submit these changes to IMT/OMB through the COTR. Notification of these changes shall be made in the form of a memorandum from the COTR to IMT/OMB. A draft of the memorandum shall be prepared by the contractor and submitted to the NCES COTR. The contractor shall modify the memorandum based on the comments of NCES staff and submit a final memorandum to NCES. 9.5. Submit Final Instruments. The final instruments, reflecting any IMT/OMB revisions, shall be submitted to the NCES COTR in hard copy format for the English language version. Task 10. Hiring and Training CATI Interviewers The success of the NHES is due in large part to the skills and dedication of the CATI interviewers. It is critical that all persons assigned to these positions have the necessary verbal, interpersonal and typing skills required for successful CATI interviewing. These skills are developed through a combination of prior experiences working on other CATI projects and training. 10.1. Develop Interviewer Training Materials. In order to ensure that the CATI interviewers assigned to the NHES have the skill levels necessary to perform successfully the demanding tasks associated with the conduct of the survey, the contractor shall design and implement a CATI interviewer training program. At a minimum, this program shall include both lecture and interactive sessions. The program shall include training related to: 1) the general conduct of interviews, 2) CATI interviewing skills and techniques, and 3) the administration of the NHES screener and topical component survey instruments. 
Prior to developing any training materials, the contractor shall submit an outline of its interviewer training program to the COTR for review. This outline shall include: 1) a training program agenda that identifies the format of the session (lecture, interactive, role-playing, etc.), the topics to be covered (e.g., study background, overview of instruments, survey topical component), and the length of time the session is scheduled to run; 2) an outline of the study materials that interviewers will be provided; and 3) a preliminary training program schedule that identifies when and where each group of interviewers will be trained. Following the COTR's review and approval of the interviewer training program, the contractor shall develop and submit copies of all training materials to the COTR for review. The contractor shall make revisions as necessary based on the COTR's review of the materials and submit 5 copies of the final training materials to the NCES COTR. The contractor shall monitor all data collection activities to assure consistent, high-quality data throughout the collection period. Therefore, as a part of the training plan, the contractor shall include awareness of interviewer responsibilities under the Privacy Act of 1974 (5 U.S.C. 552a), the Privacy Act Regulations (34 CFR Part 5b), and the National Center for Education Statistics Act of 1994 (P.L. 103-382, Section 9007), which details the Center's responsibility for keeping all individual information confidential. The contractor shall also follow NCES Standards and Policies, and design methods of review and checking which will take place during and after training to assure that only high-quality CATI personnel are retained and utilized. In addition to supervisory monitoring, the contractor shall include plans that address issues of quality control such as tracking interviewer performance, maintenance of standards, identification of poor performance, and procedures for correcting such performance. 10.2. Recruitment of Interviewers. The contractor shall recruit and hire all CATI interviewers needed to complete the data collection within the time constraints imposed by the project schedule. Because the NHES requires the conduct of a large number of interviews within a relatively short time period, the contractor will need a large number of trained staff committed to the project. Whenever possible, the contractor shall recruit interviewers with prior CATI interviewing experience. Because the NHES oversamples Hispanics, the contractor shall have staff who are qualified to conduct the interviews in Spanish. 10.3. Interviewer Training Sessions. The contractor shall have responsibility for the conduct of the training sessions. All sessions shall take place at the contractor's facilities or facilities arranged for by the contractor. The contractor shall be responsible for assuring that all interviewers assigned to the project successfully complete the training program as specified in the training plan. The contractor shall have responsibility for the production of all interviewer training materials and shall have sufficient materials available for all staff trained during the conduct of the project. Because of the complex nature of the NHES, the contractor shall develop and implement a multi-day training program which shall take place in the week prior to the start of data collection.
(During the conduct of the previous NHES, approximately 20 to 28 hours of project-specific training was required of each interviewer in addition to basic training in general interviewing techniques and the use of the CATI system.) At a minimum, the training sessions shall consist of lecture, interactive, and role-playing sessions. The last session of each interviewer training program shall involve on-line interviewing with actual respondents under the close supervision of the contractor's staff. Confidentiality requirements in NCES contracts mandate that interviewers shall complete a sworn Affidavit of Nondisclosure. These affidavits shall be signed in the presence of a notary public and must be dated the first day the interviewer is on the payroll. (This requirement applies to all project staff having access to the data.) Task 11. Data Collection The NHES is scheduled to be conducted in the late winter/spring of each year (January through March) with data collection spanning a period of no more than three months (a shorter time period is certainly more desirable). All interviewing associated with the NHES, unless otherwise approved by the COTR, shall be done by telephone using a CATI methodology. Once a telephone number is sampled for the NHES, the contractor shall screen the household (telephone number) for eligibility and, if determined to be eligible under the conditions of the survey, conduct all screener and extended interviews for which the household qualifies. All interviewing shall be conducted at the contractor's facilities and shall be under the supervision of the contractor's staff. 11.1. Data Collection Schedule. As stated above, the data collection period shall be a maximum of three months, beginning no earlier than the second week of January and ending no later than the first of April. All interviews shall be completed within this period. Interviews shall be scheduled so as to maximize the number of completed interviews and minimize the number of non-productive interviewer hours charged to the project. When a telephone number is sampled for the NHES, at least seven attempts (telephone calls) to complete the household screening interview shall be made over a two-week period. These attempts shall be scheduled so that they cover different days of the week and hours of the day. For reporting purposes, Unable to Contact cases (Ring No Answer or FAX/Data line) should be distinguishable from Contact, Noninterview cases (such as only Answering Machine/Voicemail responses). At least 14 attempts to complete all interviews associated with a household shall be made once a household is determined to qualify for the survey. Once again, these attempts shall be staggered over different days of the week and times of day (e.g., morning versus afternoon or evening hours). 11.2. Quality Control Procedures. The contractor shall develop and implement a set of quality control procedures that will assure the collection of high-quality data throughout the data collection period. Project supervisory staff shall closely monitor interviewer activities in order to ensure that all data collection procedures are followed and that all standards are adhered to. Problems that are identified shall be addressed immediately and consistently. Statistics on interviewer response rates should be maintained and checked regularly to pinpoint response rate problems.
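A minimal sketch of how case dispositions might be tallied to monitor interviewer-level response rates is shown below; the disposition codes and the simplified response-rate formula (completed screeners divided by completed cases plus eligible noninterviews) are illustrative assumptions, not the NHES definitions.

```python
# Sketch only: simplified disposition codes and response-rate formula for illustration.
from collections import Counter

# Hypothetical final dispositions for a batch of sampled telephone numbers.
cases = [
    {"case_id": 1, "interviewer": "I01", "disposition": "COMPLETED"},
    {"case_id": 2, "interviewer": "I01", "disposition": "REFUSAL"},
    {"case_id": 3, "interviewer": "I02", "disposition": "RING_NO_ANSWER"},     # unable to contact
    {"case_id": 4, "interviewer": "I02", "disposition": "ANSWERING_MACHINE"},  # contact, noninterview
    {"case_id": 5, "interviewer": "I02", "disposition": "COMPLETED"},
]

ELIGIBLE_NONINTERVIEW = {"REFUSAL", "ANSWERING_MACHINE"}

def response_rate(case_list):
    """Completed screeners divided by completed plus eligible noninterviews (simplified)."""
    counts = Counter(c["disposition"] for c in case_list)
    completed = counts["COMPLETED"]
    eligible = completed + sum(counts[d] for d in ELIGIBLE_NONINTERVIEW)
    return completed / eligible if eligible else 0.0

# Overall and per-interviewer rates, the kind of statistic checked regularly during collection.
print("Overall response rate:", round(response_rate(cases), 3))
for interviewer in sorted({c["interviewer"] for c in cases}):
    subset = [c for c in cases if c["interviewer"] == interviewer]
    print(interviewer, round(response_rate(subset), 3))
```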
Examples of data quality control checks that should be performed during the field data collection phase are: 1. All unusual circumstances are documented and discussed with the COTR, such as unexpected field occurrences that require modification to the normal procedures. 2. Any situation in which unusually low response rates seem to be occurring is investigated and the reasons are discussed with the COTR. Because of the number of staff who will be assigned to the NHES data collection activities (both supervisory and interviewer staff), it is important that any decisions that are made about the conduct of the survey be disseminated immediately and clearly to all staff involved in the project. Procedures shall be in place from the beginning of training until the completion of data collection that will assure that all parties involved with the collection of data use the same solutions to and interpretations of problems that arise during the course of the survey. 11.3. Progress Reports. Throughout the data collection period, the contractor shall provide the NCES COTR with weekly progress reports. The contractor's CATI system shall be designed in such a way as to be able to produce computer-generated reports that show the progress that is being made during the interview phase of the project. These reports shall contain detailed information on the interim and final status of all telephone numbers sampled for the survey. At a minimum, they shall include information on the number of potential households (telephone numbers) screened, the number of eligible telephone numbers sampled, the number of cases for which contact has been attempted and established, the number of cases for which interviews have been completed or which have refused to participate in the survey, the number of cases by reason for all types of noninterview or out-of-scope status (such as: language barrier, phone in residence used for business, entire household not eligible, phone disconnected after initial contact, ring no answer after maximum number of calls, etc.), the number of missed scheduled calls, the number of interviewer hours (aggregated), the number of cases referred to refusal conversion, and the number of initial refusals that were completed during refusal conversion. Where appropriate, this information shall be reported separately for each type of eligible household or respondent. Response rates as well as completion rates for all cases should be calculated and reported to the COTR on a weekly basis, for all phases of the operation, from screening through final interview, with a summary that certifies that progress is at or above target levels, or with an explanation and the actions taken if sufficient progress is not being made. In addition to the computer-generated reports described above, these weekly reports shall identify any problems encountered and either describe how these problems were resolved or recommend alternative ways of resolving these problems. These weekly progress reports are in addition to the monthly reporting requirements described in Section B. Any problems encountered that have consequences for either the project's budget or the time schedule shall be brought to the attention of the NCES COTR in these weekly progress summaries. Task 12. Data File Preparation and Documentation The data collected during the conduct of the NHES shall be entered directly into a computer file through the CATI system. Nevertheless, additional steps will be necessary to prepare the data for public release and analysis.
The contractor shall take the necessary steps to convert the raw data entered by the CATI interviewers to a more useable form. 12.1 Data Coding and Editing. Although most data edits shall occur on-line as the CATI interviewers enter responses into the computer (see Task 8), additional data editing and coding shall be performed at the conclusion of the data collection period. These editing and coding activities shall involve both computer-assisted and manual activities. Respondent data records shall be reviewed for completeness and for any other problems (e.g., inappropriate skips, out-of-range values, input errors) that may have occurred during the conduct of the survey. Problem records shall be identified and appropriate corrective actions taken. A status report on data editing shall be submitted to the COTR, including a description of how any errors (e.g., unexpectedly high edit failure) during this phase of the processing were resolved. Examples of completeness and accuracy in data editing and file preparation include: 1. There are no unreadable fields on any record nor unallowable characters. 2. Valid skips (missing data due to skip patterns) are not imputed and can be distinguished from valid zeros and nonzero survey data. 3. The files carry all of the sampling variables agreed to between the contractor and the NCES COTR, and those variables are documented. 4. The data on the datafile to be delivered for COTR review have been checked and basic tables run. 5. The data are checked against known data for reasonableness (i.e., there aren't twice as many parents of children in the age group being studied as reported on CPS for a roughly comparable year) and discrepancies are resolved. 6. Indications that specified edits are not working as expected (i.e., very high failure rates) are brought quickly to the attention of the COTR and modifications are implemented with COTR's knowledge and summary review if necessary. A plan for post-CATI edit checks shall be submitted to the NCES COTR. The COTR will review the plan and provide comments and suggestions within two weeks. Incorporating the comments and suggestions made by the COTR, the contractor shall revise its post-CATI editing plan and submit a revised plan. The contractor shall maintain for the COTR's inspection the records and results of the data editing and corrective actions. These records shall be delivered to the COTR on request. A status report of the data editing and corrective actions shall be delivered to the COTR. This status report shall include cumulative summary statistics by instrument on the number of problems detected for the various types of items; and the status and method of corrective actions taken. The contractor shall submit specifications for the coding of open-ended items. When appropriate, the coding specifications, which shall be approved in advance by the COTR, shall be consistent with those used in previous surveys (i.e., NHES spring 1995). The contractor shall receive comments from the COTR within two weeks and have one week to revise the specifications based on the COTR's comments and to submit the final coding specifications to the NCES COTR. The contractor shall code responses to all open-ended questions according to the approved coding specifications. Examples of the types and numbers of open-ended questions that the contractor can expect to find in the NHES include industry and occupation items and major field of study. Appendix C lists the open-ended items from NHES:95. 
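The following is a minimal sketch of the kind of post-CATI edit check described above; the reserved codes (-1 for a valid skip, -9 for item nonresponse), the item name, and the allowable range are hypothetical conventions used only for illustration.

```python
# Sketch only: hypothetical reserved codes, item name, and allowable range.
VALID_SKIP = -1        # valid missing data due to skip patterns
ITEM_NONRESPONSE = -9  # respondent did not answer an applicable item

RANGES = {"HOURS_PER_WEEK": (0, 60)}   # hypothetical allowable range

def edit_record(record):
    """Return a list of edit failures for a single respondent record."""
    failures = []
    for item, (low, high) in RANGES.items():
        value = record.get(item)
        if value in (VALID_SKIP, ITEM_NONRESPONSE):
            continue                     # reserved codes are never range-checked
        if value is None or not (low <= value <= high):
            failures.append((record["case_id"], item, value))
    return failures

records = [
    {"case_id": 1, "HOURS_PER_WEEK": 35},
    {"case_id": 2, "HOURS_PER_WEEK": VALID_SKIP},   # item legitimately skipped
    {"case_id": 3, "HOURS_PER_WEEK": 999},          # out-of-range entry to be resolved
]

all_failures = [f for r in records for f in edit_record(r)]
failure_rate = len(all_failures) / len(records)

# Unexpectedly high failure rates would be brought quickly to the COTR's attention.
print("Edit failures:", all_failures)
print("Failure rate:", round(failure_rate, 2))
```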
The contractor shall use common formats and procedures for editing, coding, error resolution, and documentation for each NHES. In addition, for those collections that involve administration of items from prior NHES surveys, the contractor shall use the same formats and procedures that were used during the earlier collections, unless alternative formats and procedures are approved by the NCES COTR. The use of these common formats and procedures will increase the comparability of the data from the different survey administrations and will enable users to compare data from the different collections with a minimum of effort. 12.2 Data Conversion. The contractor shall design, establish, and carry out the procedures necessary to convert the data in the CATI system to CD-ROM with electronic codebook format. The datasets on the CD-ROM shall be made available in three formats: 1) ASCII format; 2) SAS/PC; and 3) SPSS/PC. ASCII-readable input files shall be used by the contractor to create Electronic Codebooks (ECBs) for the separate files. The contractor shall evaluate the previous NHES CD-ROM and propose enhanced features, such as Windows capability or the availability of formats for STATA. The contractor shall deliver a plan for all recommendations and revisions, including a cost estimate. The contractor shall consider the implications that any changes will have on compatibility with the previous NHES CD-ROM. The contractor shall submit a plan to the NCES COTR that describes the proposed structure and specifications for the data files. The COTR will review the plan and provide comments within two weeks. A revised plan shall be submitted. The plan shall not be implemented without the approval of the COTR. 12.3 Creation of Composite and Classification Variables. The contractor shall propose and create composite variables (e.g., SES using income measures, parent's education and occupation) and classification variables (e.g., race/ethnicity, highest level of parent education, family/household composition, count of adults in household, etc.) for use by analysts. A plan for the creation of composite and classification variables shall be submitted. The COTR will review the plan and provide comments and suggestions within two weeks. A revised plan incorporating the changes requested by the COTR shall be submitted within one week of receipt of the COTR's comments. Once approved, these variables shall be included on the public use data files and their definitions and code included in the NHES data file user's manual. 12.4 Item Nonresponse. Since the data from the NHES will be used to monitor the participation of children and adults in a variety of education-related activities over time, it is important that the impact of item nonresponse on survey estimates be considered carefully. The contractor shall analyze the level of item nonresponse for the individual survey items and propose a plan for full imputation of missing item data; this plan shall be presented to the COTR. The imputed data shall be placed on the public-use and restricted-use data files, along with appropriate documentation and imputation flag variables. Following approval or modification of the imputation scheme, the contractor shall prepare imputation specifications; these should be delivered to the COTR upon finalization. The contractor should also prepare a report analyzing item nonresponse as part of the methodology reports. 12.5 Sample Weights.
The contractor shall develop sample weights to apply to the data. The weights are necessary to produce estimates required for various summaries and analyses. The weights shall incorporate a unit (complete interview) nonresponse adjustment for individuals in different "weighting classes," as well as an adjustment for non-telephone households. The contractor should also propose a strategy for dealing with households that have telephones but are not covered by the telephone lists; at a minimum, the contractor shall evaluate the potential bias from excluding this type of household. Weighting classes and adjustment procedures shall be consistent with NCES Standards. The following are a few examples of data quality checks that should be performed after the sampling weights are applied to all cases on the file (an illustrative sketch of such checks appears below): 1. The sum of the weighted interviews is equal to the population totals used in the poststratification process, and all subgroup totals sum to the grand total. 2. All imputed data items have a valid in-range value and are flagged as imputed. 3. Imputed data are checked against the distribution of unimputed data and match that distribution within a reasonable margin of error. The contractor shall document the procedures planned for use in developing the weights and submit these plans to the COTR. Two weeks shall be allowed for the COTR's review. The contractor shall have one week to make any revisions resulting from the COTR's review and submit a revised plan. 12.6. Public Release Files and User's Manuals. The contractor shall prepare (following NCES standards on datafiles and documentation) and deliver to the COTR preliminary or draft copies of the following computer-related products: 1. NHES CD-ROM containing a separate datafile for each topical component. Files on the CD-ROM shall be in ASCII, SAS-PC, and SPSS-PC for Windows formats. 2. User's manual (patterned after NHES:96) for each of the files. The User's Manuals will also include a codebook print file containing well-documented, weighted and unweighted frequency tables for all variables contained in the master files. 3. User's Guides that provide suggestions for using the data sets. These guides will be patterned after the Guides produced for NHES:96. These guides will also include a section that instructs CD-ROM users in the use of the ECB. 4. READ.ME files that will be included on the CD-ROM to provide guidelines for accessing the data files and to describe the data files contained on the CD-ROM. The User's Manuals shall include documentation that is of sufficient detail to enable any user to fully understand the files. The manuals shall include, but not be limited to, the following information: o a description of the NHES sample design o a description of the data collection procedures o a description of the response rates for each stage of the sample and for each topical component o a discussion and evaluation of the design effects of the NHES sample o standard errors for key (representative) statistics o a codebook that contains the unweighted and weighted frequencies and percentages for each survey item The User's Manuals using data from the NHES:96 study shall be used as a model for the User's Manuals developed under this procurement. (See Appendix B.) The COTR's review of data files and User's Manuals will be completed four (4) weeks after receipt of the draft/preliminary deliverables. The contractor shall have two (2) weeks to submit revised copies of the manuals following the COTR's review.
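To illustrate the post-weighting quality checks listed under Task 12.5 above, a small sketch follows; the weights, control total, income values, and imputation flags are entirely hypothetical.

```python
# Sketch only: hypothetical weights, control totals, values, and imputation flags.
cases = [
    {"weight": 12000.0, "income": 45000, "income_imputed": 0},
    {"weight": 15000.0, "income": 52000, "income_imputed": 1},  # imputed value, flagged
    {"weight": 13000.0, "income": 38000, "income_imputed": 0},
]

POSTSTRAT_CONTROL_TOTAL = 40000.0   # hypothetical population control total
INCOME_RANGE = (0, 1_000_000)       # hypothetical allowable range

# Check 1: weighted interviews sum to the poststratification control total.
weighted_total = sum(c["weight"] for c in cases)
assert abs(weighted_total - POSTSTRAT_CONTROL_TOTAL) < 1e-6, "weights do not sum to control total"

# Check 2: every imputed value is in range and carries its imputation flag.
for c in cases:
    if c["income_imputed"] == 1:
        assert INCOME_RANGE[0] <= c["income"] <= INCOME_RANGE[1], "imputed value out of range"

# Check 3: compare the distributions of imputed and reported values (here, simple means).
imputed = [c["income"] for c in cases if c["income_imputed"] == 1]
reported = [c["income"] for c in cases if c["income_imputed"] == 0]
print("Mean reported:", sum(reported) / len(reported))
print("Mean imputed:", sum(imputed) / len(imputed))
```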
The COTR's comments and suggestions shall be taken into consideration when developing these final products. Following the COTR's approval, 10 copies of the User's Manuals shall be sent to the COTR along with the final public use data files. At the same time, the contractor shall deliver two masters of a CD containing the ASCII data files, SAS for Windows data files, and SPSS for Windows data files. The contractor shall produce and deliver an Electronic Codebook (ECB) similar to those produced for previous NHES collections. The COTR shall review the ECB and provide comments within four weeks. 12.7 Restricted-use Data Files and User's Manuals. Following the same steps presented in Task 12.6 above, the contractor shall produce data files that contain restricted-use data. These data files will contain all the variables not included on the public-use files, including potential identifiers of respondents (e.g., State codes, Census ZIP code data, telephone area codes, etc.). These files will be distributed by the government through NCES site-license agreements. 12.8 Disclosure Review Board. NCES standards for release of public-use data require that the data ready for release be submitted to the Disclosure Review Board. This process usually entails having the dataset tested and ready to certify as "clean," meaning that all of the procedures in NCES' Statistical Standard for Machine-Readable Products (IV-06-92) have been followed, and that data suppression or conversion has been performed on individually identifiable data (NCES Statistical Standard for Maintaining Confidentiality, IV-01-92). There should be a data release product, usually a Statistics in Brief, and a User's Manual to submit to the DRB at the same time. The contractor shall submit a memorandum detailing those items that might be considered disclosure risks (generally, a combination of any 2 variables which results in fewer than 3 cases in any cell in a cross-tabulation). This memorandum shall also contain an analysis of the data suppression technique used, such as a distribution of cases "before" and "after" the suppression. Another consideration is that any identifier that links a respondent record to a state (a telephone area code, for example, identifies a geographic area within a state) must be suppressed when the design of the survey is not state-representative (i.e., the sample within a state is insufficient); this includes suppressing a state code that appears as part of an identification code. 12.9 Maintenance of Necessary Data Files. The contractor shall maintain complete data files for all administrations of the NHES and all supplemental files created and obtained in support of this study. Maintenance of data files means keeping data file documentation available for reproduction on request from data users on a cost-reimbursable basis and keeping a machine-readable copy of each data file, so that copies can be supplied to the COTR on request. Certain of the data files that shall be maintained may not be made available to the public because of privacy considerations. In these cases, file maintenance means that the data files shall be kept intact, in a form that can be updated or copied. Data files shall not be allowed to expire, to be released, to be overwritten, or otherwise destroyed. These files shall be maintained with a software system capable of easily interlinking them for analytic (or other) purposes while, at the same time, preserving the necessary confidentiality requirements of the study. Task 13. Data Analysis and Reporting 13.1 Analysis Reports.
The contractor shall perform analyses of the survey data and prepare a Statistics in Brief publication for each data file in both survey years, a Statistical Analysis report for each data file in NHES:2001, and one Statistical Analysis report for NHES:2003. The Statistics in Brief reports shall be prepared first, and shall serve two purposes: 1) to adjudicate the public release data files, and 2) to stimulate interest in the NHES data. A Statistics in Brief report is no more than 16 pages in length and may contain the following information: 1) Short introduction with brief review of relevant literature; 2) Description of NHES survey; 3) Results or highlights section; 4) Conclusions; 5) A number of major tables, with standard errors; 6) Figures from table data (optional); 7) Description of NHES survey design and methods; 8) Characteristics of sample used in report; 9) Generalizability of sample -- potential for bias; 10) Significance testing and sampling errors; 11) Acknowledgments section. A Statistical Analysis report is longer and contains similar information to that described above. For examples of both Statistics in Brief reports and Statistical Analysis reports, see the publication list in Appendix B. The contractor will use as models previous Statistics in Brief and Statistical Analysis reports written using NHES data. The exact format will be determined by the COTR, and all reports prepared by the contractor shall follow NCES publications standards (see publication list in Appendix B). A proposal for the Statistics in Brief and Statistical Analysis reports shall be submitted to the COTR. The COTR will take two weeks for review, and the contractor shall submit the revised outlines, incorporating the COTR's review comments, one week later. Once the outline is approved by the COTR, the contractor shall begin preparation of the reports. The COTR expects that several iterations will be required for each report. The COTR shall require a minimum of two weeks for the review of at least one of the iterations. In addition to hardcopy, all reports shall be submitted on floppy diskette, or by electronic mail if appropriate, using Microsoft Word. The contractor and the NCES COTR will jointly author the reports. No information in a report shall be released prior to NCES release of the report. Examples of completeness and accuracy in reports are as follows: 1. Numbers reported are consistent within a report (i.e., between tables) and with any prior reports based upon the same data. 2. Tables/graphs/figures are clearly labeled and understood, and include all necessary notation. 3. Labels are consistent among tables, graphs, and figures. 4. Suppression rules are followed. 5. Results of tests of significance are reported in the text of drafts. 6. All text follows a logical and coherent progression from introduction to purpose to results to discussion. 7. Statements in the text are supported by test results. 8. Statements in the text are consistent with the meaning/intent of questionnaire items. 9. Ambiguous terms are defined. 10. The contractor shall bring to the attention of the NCES COTR all data which appear unexpected or unusual. 11. Materials meet standards set forth in "NCES Statistical Standards" and the "OERI Publications Guide." All differences cited in the text of a report shall be supported by an appropriate statistical test (e.g., Bonferroni-adjusted t-test, chi-square, etc.). Standard errors of the estimates presented in the brief shall be reported in the technical appendix.
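As a simple illustration of the kind of test intended (a sketch only, using made-up estimates and standard errors and a normal approximation to the t statistic), a Bonferroni adjustment divides the nominal significance level by the number of comparisons being made:

```python
# Sketch only: made-up estimates and standard errors, normal approximation to the t statistic.
from math import sqrt
from statistics import NormalDist

def bonferroni_test(est1, se1, est2, se2, n_comparisons, alpha=0.05):
    """Two-sided test of a difference between two independent estimates,
    with the significance level divided by the number of comparisons."""
    t = (est1 - est2) / sqrt(se1**2 + se2**2)
    p_value = 2 * (1 - NormalDist().cdf(abs(t)))
    return t, p_value, p_value < alpha / n_comparisons

# Hypothetical example: two subgroup participation rates compared as one of 10 contrasts.
t, p, significant = bonferroni_test(0.62, 0.015, 0.55, 0.018, n_comparisons=10)
print(f"t = {t:.2f}, p = {p:.4f}, significant at the Bonferroni-adjusted level: {significant}")
```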
Standard errors for the brief should be calculated according to one of the methods specified in Task 13-2 (see below). Statistics in Brief, Statistical Analysis reports, and most other published products (Working Papers are an exception) go through a peer review process termed "adjudication." This process follows internal review through the Group level, and requires that the publication be sent out to several academic or government reviewers who are chosen for their expertise in the subject area, as well as to reviewers from other parts of the Department of Education. Five weeks should be allowed for circulation of the report to be adjudicated. An adjudication meeting is scheduled at the end of the review period; all members convene or send in their comments, and all comments are discussed. Comments are accepted, modified, or rejected by consensus, and the contractor must incorporate the comments and write a memorandum to the Adjudicator detailing the changes within two weeks following the meeting. At that point, once all parties have given their approval (i.e., Program Director, Associate Commissioner, Technical Reviewer, Chief Statistician), the publication's camera-ready copy or electronic file is prepared. 13.2 Standard Error Calculations. The contractor shall develop and implement procedures (see NCES Statistical Standards) to enable users of the NHES to estimate the sampling errors of survey estimates. The contractor shall calculate estimates of design effect values for a minimum of thirty variables for the total population and for various subgroups of selected respondent characteristics (e.g., sex, race/ethnicity, SES, and at least two other variables recommended by the contractor). Means and standard deviations of the design effect distributions shall be calculated. The same set of procedures shall be followed for each public release file developed under this contract. The data files created by the contractor for each NHES topical component shall contain the elements necessary to support the calculation of standard errors for complex designs using one of three possible methods--balanced repeated replication (BRR), jackknife repeated replication (JRR), or Taylor series procedures (TSP). If the contractor uses a software package other than those available to the COTR to calculate these standard errors, it shall make this package available to the COTR. The contractor shall provide replicate codes that indicate the computing strata and the half-sample to which each sample unit belongs, and the contractor shall provide, for each sample unit, the replicate weights for all replicates that were formed in order to calculate variances. In addition, the contractor shall provide the stratum code and PSU code that identify each sample unit. Regardless of the method used, the contractor shall describe in detail the method used to calculate the standard errors of survey estimates in the User's Manual. This description shall instruct users how to use the data elements and the associated software to calculate sample variances for the NHES. A short example of the method and the use of the available software shall be provided in the manual. The contractor shall propose how to calculate variances in a way that is suitable for users (i.e., using software that is widely available). The contractor shall document how to use those variances in the sample design report and the User's Manual. The contractor shall prepare a plan for measuring sampling errors and submit it to the COTR for review.
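A minimal sketch of one of the three methods named above, jackknife repeated replication (JRR), is given below for a tiny hypothetical file; the weights, replicate weights, and variance factor are illustrative only, since the actual factor depends on how the replicates are formed.

```python
# Sketch only: a tiny hypothetical file with a full-sample weight and two replicate weights per case.
cases = [
    {"y": 1, "weight": 100.0, "repwts": [110.0, 90.0]},
    {"y": 0, "weight": 120.0, "repwts": [115.0, 130.0]},
    {"y": 1, "weight": 80.0,  "repwts": [70.0, 85.0]},
    {"y": 0, "weight": 100.0, "repwts": [105.0, 95.0]},
]
N_REPLICATES = 2

def weighted_mean(values, weights):
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

y = [c["y"] for c in cases]
full_estimate = weighted_mean(y, [c["weight"] for c in cases])

# JRR variance: squared deviations of each replicate estimate from the full-sample estimate,
# scaled by a factor that depends on how the replicates were formed
# (here (R - 1) / R as a simple illustration; the actual factor follows the replication design).
replicate_estimates = [
    weighted_mean(y, [c["repwts"][r] for c in cases]) for r in range(N_REPLICATES)
]
factor = (N_REPLICATES - 1) / N_REPLICATES
variance = factor * sum((est - full_estimate) ** 2 for est in replicate_estimates)

# Design effect: ratio of this variance to the variance expected under simple random sampling.
n = len(cases)
p = full_estimate
srs_variance = p * (1 - p) / n
print("Estimate:", round(full_estimate, 4))
print("JRR standard error:", round(variance ** 0.5, 4))
print("Design effect:", round(variance / srs_variance, 3))
```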
At the same time, a plan for describing nonresponse patterns for each data collection and for estimating nonresponse bias for key variables shall be submitted. The plan will be reviewed within two weeks by the COTR, and the contractor shall submit a revised plan within one week of receipt of comments. Results of the analyses shall be included in the User's Manual (see Task 12.6) and in the Methodology Report (see Task 14). Task 14. Methodology Report The contractor shall submit an outline and production schedule for the methodology reports (one for each administration year) that document the entire project, including: (1) a description of the sample design, weighting, and imputation scheme (sample design report); (2) item nonresponse rates, unit nonresponse rates, and bias analysis (nonresponse and bias report); (3) instrument development and CATI specifications (CATI methodology report); (4) data collection procedures (field methodology report); and (5) a data quality report. It is important that technical reports describe as fully as possible the study design and all procedures used to conduct the survey. Special features of the study shall be described in detail, and any problems encountered during either the design or the implementation of the survey shall be identified. Changes that were made to the design and/or procedures in response to unanticipated problems shall be documented. The reports shall include major findings pertaining to the methods used and/or experimented with during the conduct of the study and discuss their implications for future surveys.
EXHIBIT 1. SCHEDULE OF DELIVERABLES FOR NHES:2001
Deliverable (Task) -- Completion/Due Date
Initial Meeting (1.7) -- April 2, 1999
Draft Project Management Plan (1.8) -- May 14, 1999
TRP Plan (2.1) -- May 14, 1999; June 11, 1999
Final Project Management Plan (1.8) -- June 4, 1999
Draft Research Priority List to TRP (3.3) -- July 16, 1999
Outline of Extant Surveys Research (3.4) -- August 6, 1999
Draft of Extant Surveys Research (3.4) -- August 27, 1999
Outline of Questionnaire Content (3.5) -- August 27, 1999
Draft Cognitive Laboratory Research Plan (4.3) -- August 27, 1999
Revised Extant Surveys Research (3.4) -- September 17, 1999
Revised Questionnaire Content Outline (3.5) -- October 8, 1999
Draft Sample Design Plan (5) -- October 15, 1999
Literature Review Plan (3.2) -- October 15, 1999
Draft Cognitive Research Report (4.3) -- October 29, 1999
Revised Sample Design Plan (5) -- November 5, 1999
Comments on Questionnaire Summarized from TRP (3.3) -- November 12, 1999
Revised Cognitive Research Report (4.3) -- November 19, 1999
Draft Project Bibliography 1 (1.5) -- November 19, 1999
Draft Literature Reviews (3.2) -- November 19, 1999
Draft Copies of Survey Instruments (4.1) -- November 19, 1999
Revised Project Bibliography 1 (1.5) -- December 10, 1999
Second Draft of Survey Instruments (4.1) -- December 10, 1999
Survey Design Report Plan Outline (6) -- December 10, 1999
Third Draft of Survey Instruments (4.1) -- December 31, 1999
Draft Survey Design Report (6) -- January 7, 2000
Draft IMT/OMB Clearance Package (7) -- January 7, 2000
Revised IMT/OMB Clearance Package (7) -- January 28, 2000
Revised Survey Design Report (6) -- January 28, 2000
Final IMT/OMB Clearance Package to Submit (7) -- February 18, 2000
Draft Field Test Plan (9.1) -- February 18, 2000
Field Test Plan (9.1) -- March 10, 2000
Draft CATI Edit Specifications (8.1) -- March 17, 2000
Draft CATI Instrument (8.4) -- April 11, 2000
Revised CATI Edit Specifications (8.1) -- August 11, 2000
Field Test Report & Revised CATI Instrument (9.3) -- September 22, 2000
Draft Spanish and English language CATI Screens (8.5) -- September 30, 2000
IMT/OMB Revised Clearance Package Memo (9.4) [10 copies] -- October 13, 2000
Revised Spanish CATI Screens (8.5) -- October 30, 2000
Interviewer Training Program Outline (10.1) -- November 3, 2000
Draft Project Bibliography 2 (1.5) -- November 10, 2000
Hard Copy of Final Instrument (9.5) -- November 17, 2000
Draft Interviewer Training Materials (10.1) -- November 27, 2000
Revised Project Bibliography 2 (1.5) -- December 1, 2000
Revised Interviewer Training Materials (10.1) -- December 22, 2000
DATA COLLECTION BEGINS -- January 8, 2001
Draft Plan for Post-CATI Edit Specifications (12.1) -- February 19, 2001
Draft Plan for Coding of Open-Ended Items (12.1) -- February 19, 2001
Draft Plan for Data File Specifications (12.2) -- February 19, 2001
Draft Plan for Creation of Composite Variables (12.3) -- February 19, 2001
Draft Plan for Sample Weights (12.5) -- February 19, 2001
Draft Plan for Standard Error Calculations (13.2) -- February 19, 2001
Revised Plan for Post-CATI Editing (12.1) -- March 12, 2001
Revised Plan for Coding of Open-Ended Items (12.1) -- March 12, 2001
Revised Plan for Data File Specifications (12.2) -- March 12, 2001
Revised Plan for Creation of Composite Variables (12.3) -- March 12, 2001
Revised Plan for Sample Weights (12.5) -- March 12, 2001
Revised Plan for Standard Error Calculations (13.2) -- March 12, 2001
DATA COLLECTION ENDS -- April 1, 2001
Revised Project Bibliography (1.5) -- April 23, 2001
Revised Literature Reviews (3.2) -- April 23, 2001
Final Item Nonresponse Specifications (12.4) -- April 30, 2001
Draft/Preliminary Data Files (12.6) -- May 14, 2001
Draft User's Manuals/Guides (12.6) -- May 14, 2001
Final Survey Design Report (6) -- May 28, 2001
Draft Statistics in Briefs (3) (13.1) -- June 8, 2001
Methodology Report Outline (14) -- June 11, 2001
Revised Data Files (12.6) -- June 25, 2001
Revised User's Manuals/Guides (12.6) -- June 25, 2001
Draft ECB (12.6) -- June 25, 2001
Draft Statistical Analysis Report 1 (13.1) -- June 25, 2001
Draft Statistical Analysis Report 2 (13.1) -- July 6, 2001
Draft Statistical Analysis Report 3 (13.1) -- July 20, 2001
Revised Statistical Analysis Report 1 (13.1) -- August 17, 2001
Revised ECB (12.6) -- August 25, 2001
Revised Statistical Analysis Report 2 (13.1) -- August 31, 2001
Revised Statistical Analysis Report 3 (13.1) -- September 14, 2001
Final Statistics in Briefs (3) (13.1) -- September 28, 2001
Draft Project Bibliography 3 (1.5) -- November 2, 2001
Draft Methodology Report (14) -- November 9, 2001
Revised Project Bibliography 3 (1.5) -- November 23, 2001
Final Methodology Report (14) -- November 30, 2001
Final Statistical Analysis Report 1 (13.1) -- March 1, 2002
Final Statistical Analysis Report 2 (13.1) -- March 15, 2002
Final Statistical Analysis Report 3 (13.1) -- March 29, 2002
NOTES: o Unless otherwise indicated, the COTR will review all deliverables and provide feedback within two weeks. o Unless otherwise indicated, five copies of each deliverable are required. o Not covered in this exhibit are the following: 1) Summary Reports and Recordings of all Technical Review Panel Meetings (Task 2); 2) Weekly progress reports throughout data collection (Task 11-3); 3) Briefing materials (Task 1.3); 4) Project Leaflet (Task 1.4); and 5) Monthly letters of progress. o A copy of the monthly letters of progress and the final methodology report shall be delivered to the Contracting Officer (CO). EXHIBIT 2.
SCHEDULE OF DELIVERABLES FOR NHES:2003
Deliverable (Task) -- Completion/Due Date
Initial Meeting (1.7) -- June 2, 2000
Draft Project Management Plan (1.8) -- July 28, 2000
TRP Plan (2.1) -- July 28, 2000
Final Project Management Plan (1.8) -- August 18, 2000
Draft Research Priority List to TRP (3.3) -- September 29, 2000
Outline of Extant Surveys Research (3.4) -- November 3, 2000
Draft of Extant Surveys Research (3.4) -- November 24, 2000
Outline of Questionnaire Content (3.5) -- November 24, 2000
Draft Cognitive Laboratory Research Plan (4.3) -- November 24, 2000
Revised Extant Surveys Research (3.4) -- December 8, 2000
Draft Project Leaflet (1.4) -- December 29, 2000
Literature Review Plan (3.2) -- December 29, 2000
Revised Questionnaire Content Outline (3.5) -- December 29, 2000
Draft Sample Design Plan (5) -- January 12, 2001
Revised Project Leaflet (1.4) -- January 26, 2001
Draft Cognitive Research Report (4.3) -- January 26, 2001
Revised Sample Design Plan (5) -- February 9, 2001
Comments on Questionnaire Summarized from TRP (3.3) -- February 16, 2001
Draft Literature Reviews (3.2) -- March 2, 2001
Revised Cognitive Research Report (4.3) -- March 2, 2001
Draft Copies of Survey Instruments (4.1) -- March 9, 2001
Second Draft of Survey Instruments (4.1) -- March 30, 2001
Survey Design Report Plan Outline (6) -- April 13, 2001
Third Draft of Survey Instruments (4.1) -- April 27, 2001
Draft Survey Design Report (6) -- May 11, 2001
Draft IMT/OMB Clearance Package (7) -- May 25, 2001
Revised IMT/OMB Clearance Package (7) -- June 8, 2001
Revised Survey Design Report (6) -- June 8, 2001
Final IMT/OMB Clearance Package to Submit (7) -- July 20, 2001
Draft Field Test Plan (9.1) -- August 10, 2001
Field Test Plan (9.1) -- December 14, 2001
Draft CATI Edit Specifications (8.1) -- January 11, 2002
Draft CATI Instrument (8.4) -- February 15, 2002
Revised CATI Edit Specifications (8.1) -- June 20, 2002
Field Test Report & Revised CATI Instrument (9.3) -- July 12, 2002
Draft Spanish and English language CATI Screens (8.5) -- October 11, 2002
IMT/OMB Revised Clearance Package Memo (9.4) [10 copies] -- October 11, 2002
Revised Spanish CATI Screens (8.5) -- October 18, 2002
Interviewer Training Program Outline (10.1) -- November 1, 2002
Draft Project Bibliography 4 (1.5) -- November 8, 2002
Hard Copy of Final Instrument (9.5) -- November 15, 2002
Revised Project Bibliography 4 (1.5) -- November 29, 2002
Draft Interviewer Training Materials (10.1) -- November 29, 2002
Revised Interviewer Training Materials (10.1) -- December 20, 2002
DATA COLLECTION BEGINS -- January 3, 2003
Draft Plan for Post-CATI Edit Specifications (12.1) -- February 14, 2003
Draft Plan for Coding of Open-Ended Items (12.1) -- February 14, 2003
Draft Plan for Data File Specifications (12.2) -- February 14, 2003
Draft Plan for Creation of Composite Variables (12.3) -- February 14, 2003
Draft Plan for Sample Weights (12.5) -- February 14, 2003
Draft Plan for Standard Error Calculations (13.2) -- February 14, 2003
Revised Plan for Post-CATI Editing (12.1) -- March 14, 2003
Revised Plan for Coding of Open-Ended Items (12.1) -- March 14, 2003
Revised Plan for Data File Specifications (12.2) -- March 14, 2003
Revised Plan for Creation of Composite Variables (12.3) -- March 14, 2003
Revised Plan for Sample Weights (12.5) -- March 14, 2003
Revised Plan for Standard Error Calculations (13.2) -- March 14, 2003
DATA COLLECTION ENDS -- April 4, 2003
Revised Literature Reviews (3.2) -- April 18, 2003
Final Item Nonresponse Specifications (12.4) -- April 25, 2003
Draft/Preliminary Data Files (12.6) -- May 23, 2003
Draft User's Manuals/Guides (12.6) -- May 23, 2003
Final Survey Design Report (6) -- May 30, 2003
Draft Statistics in Briefs (3) (13.1) -- June 13, 2003
Methodology Report Outline (14) -- June 20, 2003
Revised Data Files (12.6) -- July 7, 2003
Revised User's Manuals/Guides (12.6) -- July 7, 2003
Draft ECB (12.6) -- July 7, 2003
Draft Statistical Analysis Report (13.1) -- July 7, 2003
Revised Statistical Analysis Report (13.1) -- August 8, 2003
Revised ECB (12.6) -- August 29, 2003
Final Statistics in Brief (13.1) -- October 3, 2003
Draft Project Bibliography 5 (1.5) -- November 7, 2003
Draft Methodology Report (14) -- November 21, 2003
Revised Project Bibliography 5 (1.5) -- November 28, 2003
Final Methodology Report (14) -- December 12, 2003
Final Statistical Analysis Report (13.1) -- January 2, 2004
NOTES: o Unless otherwise indicated, the COTR will review all deliverables and provide feedback within two weeks. o Unless otherwise indicated, five copies of each deliverable are required. o Not covered in this exhibit are the following: 1) Summary Reports and Recordings of all Technical Review Panel Meetings (Task 2); 2) Weekly progress reports throughout data collection (Task 11.3); 3) Briefing materials (Task 1.3); 4) Project Leaflet (Task 1.4); and 5) Monthly letters of progress. o A copy of the monthly letters of progress and the final methodology report shall be delivered to the Contracting Officer (CO). B. REPORTING The contractor shall submit 3 copies of monthly letters of progress for the duration of the contract: 1 copy to the Contracting Officer (CO) and 2 copies to the COTR. Progress reports shall describe the work in progress, indicate any problems encountered, and, in the event of delays, shall propose ways of bringing the effort back on schedule. Progress letters shall be submitted by the 15th day of each month. A summary of project expenditures shall also be included in each letter of progress. Project expenditures shall include: 1) a table summarizing, by task, the budgeted cost for the month, the actual cost for the month, the cumulative budgeted cost, the cumulative actual cost, the percent of the contracted amount spent to date, the estimated cost to complete, the total estimated cost, the contracted cost, and the difference between the total estimated cost and the contracted cost; 2) appropriate records and information to permit the COTR to certify that the services that are listed on monthly bills are actually used and are for official purposes; and 3) a manpower report prepared and signed by the project director that summarizes actual personnel assignments for the month just completed, showing for each named individual the hours charged by task. The format chosen shall be consistent from month to month. Also, the same format shall be used to report on any subcontractor activity costs. The contractor shall notify the Contracting Officer if the projected costs of the project are expected to exceed the project budget. The contractor shall communicate regularly (several times each week), and work closely, with the NCES COTR on all aspects of the study. To facilitate this communication, the COTR expects the contractor to set up an electronic system for transferring information via electronic mail and microcomputer. In addition, the contractor shall make at least six visits per year to the COTR to discuss project problems and progress.
To the greatest extent possible, the contractor shall coordinate these meetings with meetings of the Technical Review Panels. C. EXPECTATIONS CONCERNING QUALITY OF, CORPORATE SUPPORT FOR, AND TIMING OF DELIVERABLES Although the term "draft" is used frequently in the Scope of Work, it is important that the contractor understand what NCES means and does not mean by "draft". A draft is a complete product of high quality that the contractor (and the government) would be proud to distribute. A "draft" is not simply whatever is available on the due date, regardless of completeness or quality. NCES anticipates that only minor changes to drafts will be needed for them to become final deliverables. Every deliverable shall be sent with a cover memorandum signed or initialed by the project director or someone of higher corporate stature than the project director. The cover memorandum shall certify that the contractor's organization stands behind the quality of the product. This holds for all products, even those prepared primarily by consultants or subcontractors. At no time will the government take direct delivery of a product from a consultant or subcontractor. All draft deliverables shall be due at the office of the COTR on or before their due date. Draft deliverables of 10 pages or less may be sent by electronic mail to the COTR and other appropriate project staff. Draft deliverables of more than 10 pages must be sent to the COTR such that they arrive on or before their due date. Unless otherwise specified, five copies of each deliverable shall be required. Contractors must justify and obtain prior approval for the use of any delivery service more expensive than USPS priority mail. D. THE ADJUDICATION PROCESS Adjudication of NCES reports is a process of internal review, followed by a peer review meeting. At the adjudication meeting, comments from reviewers both within NCES and from subject-matter experts or technical experts are reconciled. All NCES reports except for Working Papers go through adjudication. Following adjudication, the report must be revised per the agreements reached in the meeting. A post-adjudication memorandum detailing the changes that were agreed upon is a standard operating procedure. Some comments can be answered without necessarily making revisions to the report, provided that such comments are deemed to be beyond the scope of the report. However, all comments received must be addressed. The contractor shall allow sufficient time for NCES and OERI review. E. FORMS CLEARANCE The contractor shall prepare all written materials needed in the formal clearance and approval process according to IMT/OMB requirements. This approval process typically involves the following series of steps: 1) Submission of draft copies of questionnaires to the COTR for review (part of Task 4); 2) Revision of draft copies and presentation to NHES TRP members for comments (Task 2); 3) Revision of drafts and presentation to the NCES Interdivisional Review panel (Task 7); 4) Submission of a draft clearance package, including instruments and justification, for review by NCES, Office of the Commissioner (part of Task 7); 5) Revision of the package and submission to IMT for review (part of Task 7); 6) Revision of the package and submission to OMB for review (part of Task 7); and 7) Revision and submission of the package for final approval based on OMB review and comments (part of Task 7).
For the IMT/OMB clearance package, the contractor shall provide item-by-item justification as well as justification for each major section of the questionnaire (e.g., background, education history, work history). Not only shall each item be justified by itself, it shall be justified as being a part of a major content area. Reasons for including selected items and not including other items shall be provided. Contractor staff shall be prepared to join the COTR in person for a review of the materials with IMT/OMB staff (if necessary). Eighteen (18) weeks (approximately four calendar months) shall be allowed for the IMT/OMB review process following submission of the revised clearance package at 32 weeks prior to the start of data collection. The contractor shall assure that authorization by IMT/OMB has been obtained before field test data collection begins. All hard copies of the data collection forms shall bear the approval number assigned by OMB. CATI interviewers shall have this number available to respond to survey participant inquiries. F. ELECTRONIC COMMUNICATIONS The contractor shall transmit all correspondence directly to the NCES COTR via cc:Mail or another mutually agreed-upon communications package. As of 1998, NCES uses Office '97: Word, PowerPoint, and Excel as its primary software packages for communicating documents, charts, and tables. NCES uses the Department of Education's cc:Mail software as its communications package. All deliverables are to be provided in the versions of Word, PowerPoint, Excel, and/or other software packages used by NCES. In addition, all final versions of reports shall be delivered in both HTML and PDF versions for distribution on the internet. Appendix A Background Information on the NHES NATIONAL HOUSEHOLD EDUCATION SURVEY (NHES) PROGRAM HISTORY The National Center for Education Statistics (NCES) more often collects data through school-based surveys of teachers, students, and schools, and through its administrative records surveys of school districts and state education agencies. Data collected in this manner have been the foundation of the NCES' program to fulfill its legislative mandate to collect and report information on the condition of education in the United States. The collection of data from noninstitutional samples of individuals, particularly household-based data collections, was limited prior to the addition of the NHES program. The NHES is a vehicle for collecting detailed information on educational issues from a relatively large and targeted sample of households in a timely fashion. It fills a need that existing household surveys, such as the Current Population Survey (CPS), sponsored by the Bureau of Labor Statistics and the Bureau of the Census, and the Survey of Income and Program Participation (SIPP), sponsored by the Bureau of the Census, cannot satisfy because such surveys were designed to focus primarily on issues other than education. If data are collected on education issues, it is usually done as a supplement to the main surveys. The level of detail in these supplements is often extremely limited because the sponsoring agency has practical constraints in obtaining the amount of data needed to address the basic issues of the survey as well as any supplementary data. As a result, data collected in this manner have often failed to provide NCES with the level of detail needed for desired analyses.
In addition, because the existing sample designs of the main surveys are established to meet specific objectives, it is not possible to target them to meet the design objectives associated with the educational issues. Consequently, sample sizes are sometimes inadequate for the applications required by NCES.

NCES added the NHES to its repertoire of programs in 1988, with the award of a contract to conduct a field test collection in 1989. An option was exercised under that contract to conduct the first full-scale NHES in 1991 (NHES:91). A second competitively awarded contract covered the conduct of the NHES:93, the NHES:95, and the NHES:96; that contract ended after a 60-month period of performance. The third competitively awarded contract is currently operating to conduct the NHES:99 and covers a 42-month period of performance. A full list of NHES publications can be found in Appendix B. Those publications marked with an asterisk are considered required reading for the purpose of this procurement.

COMPLETED NHES COLLECTIONS

One large-scale field test and four full-scale NHES collections (NHES:91, 93, 95, and 96) have been completed to date. The topics addressed by these collections are discussed in the following sections. Detailed information about the design of these collections can be found in the papers and reports listed in Appendix B.

1989 Field Test

A field test of the NHES as a methodology for collecting education data was designed and conducted in the fall of 1989. Data on two topics of high priority to NCES were collected during the field test: school dropouts and early childhood education. Other than the size of the sample, the field test included all of the design features of a full-scale NHES, including the use of random digit dialing (RDD) and computer assisted telephone interviewing (CATI) methods. The findings from the field test were encouraging for the future of the NHES. The field test demonstrated that many of the concerns surrounding the use of a telephone survey to study education issues could be handled adequately in practice. It also revealed that each survey topic has its own unique set of circumstances that must be explored individually.

NHES:91

NCES implemented the first full-scale NHES in the spring of 1991. The topics selected for the NHES:91 were early childhood education and adult education. The NHES:91 early childhood education component was an expansion of the 1989 early childhood field test topic. The age range of the children covered by the survey was extended from 3- to 5-year-olds to 3- to 8-year-olds. Information on early childhood education experiences, similar to that collected in the field test, was collected for children not yet enrolled in first grade. The focus was on non-parental care and education, characteristics of programs and care arrangements, and activities children engaged in with parents and other family members. For those children enrolled in primary school, the survey focused more heavily on educational experiences to date; delayed entry into kindergarten and first grade; retention; and parental involvement in schooling.

The second topical component of the 1991 survey concerned the educational activities of members of the U.S. adult population. Specifically, the survey collected up-to-date information on the participation of persons 16 years and older in a wide array of adult education activities.
This survey component was based in large part on the Current Population Survey (CPS) supplement on adult education, last conducted in 1984. Data were collected on the numbers and types of courses in which adults had participated over the previous 12 months. For the four most recent courses, information was collected on course content, provider, sources of payment, and reason for taking the course. The NHES adult education component, unlike the adult education supplement to the CPS, also asked questions of nonparticipants. These questions focused on the need for adult education, its availability, and barriers to participation.

In the NHES:91, 120,000 telephone numbers were dialed, and screening interviews were completed in about 60,500 households. One or more topical interviews were completed in about 21,500 of the households. Altogether, about 14,000 early childhood interviews and 12,500 adult education interviews were conducted. The early childhood interviews were conducted with the parent or guardian identified by the screener respondent as being the most knowledgeable about the sample child. The adult education interviews were conducted with the sampled adults.

NHES:93

Topics covered in the NHES:93 were School Readiness and School Safety and Discipline. The population of interest in the School Readiness component was children age 3 through age 7 and older children enrolled up through 2nd grade. Issues addressed were: developmental characteristics of preschoolers; school adjustment and teacher feedback to parents for kindergartners and primary students; center-based program participation; early school experiences; home activities; and health status. Extensive family and child background characteristics, including parent language and education, income, receipt of public assistance, and household composition, were collected to permit the identification of at-risk children.

The population of interest in the School Safety and Discipline component was children enrolled in grades 3 through 12. Parents of these children were interviewed about the school learning environment, discipline policy, safety at school, victimization, the availability and use of alcohol/drugs, and alcohol/drug education. Peer norms for behavior in school and substance use were also included. A subsample of the children in grades 6 through 12 was also interviewed in the School Safety and Discipline component. The youth were asked essentially the same items as their parents; however, parents provided the extensive family and household background information as well as characteristics of the school attended by the child.

In the NHES:93, about 130,000 telephone numbers were dialed, and screening interviews were completed in about 64,000 households. One or more topical interviews were completed in about 19,500 of the households. Altogether, about 11,000 school readiness interviews and 12,500 school safety and discipline interviews were conducted with knowledgeable parents or guardians. Also, about 6,500 school safety and discipline interviews were conducted with 6th through 12th grade youth.

NHES:95

For the most part, topics included in the NHES:95 were a repeat of those included in the NHES:91: early childhood education and adult education. The early childhood component focused somewhat more on characteristics of non-parental care and education arrangements and thus was renamed Early Childhood Program Participation.
For the first time, the population of interest included infants and toddlers, and interviews were conducted with parents or guardians of children from birth through 3rd grade, up to and including age 10.

The NHES:95 adult education questionnaire was different from the questionnaire used in 1991. The earlier questionnaire was closely patterned after that used in the 1984 CPS supplement on adult education. Intervening cognitive laboratory work suggested that a different questionnaire structure might better capture the totality of adult education activities. To preserve the trend measure and to allow for a crosswalk between the 1991 and 1995 estimates of adult education participation, a sample of adults in 1995 received the 1991 version of the adult education participation items. The NHES:95 also included a small methodological study to test the feasibility of expanding the screening interview to allow for collection of data from the complete sample of households.

In the NHES:95, about 120,000 telephone numbers were dialed, and screening interviews were completed in about 45,500 households. One or more topical interviews were completed in about 31,500 of the households. Altogether, about 14,000 early childhood interviews and 19,500 adult education interviews were conducted. The early childhood interviews were conducted with the parent or guardian identified by the screener respondent as being the most knowledgeable about the sample child. The adult education interviews were conducted with the sampled adults.

NHES:96

The NHES:96 covered the topics of family involvement in education; civic and community involvement among adults and youth; and household library use. Children from age 3 through 12th grade were sampled for the parent involvement component. The most knowledgeable parent was interviewed about the child's school experiences, family involvement in school and schoolwork, school practices to involve families, and family involvement outside of school. Questions were also included about the involvement of the non-custodial parent.

For adults, the civic and community involvement component included questions about exposure to national news, participation in community and political activity, and political attitudes and knowledge. These questions were blended with the parent involvement items for the parents of 6th through 12th graders, and some emphasis was given to how the family models these involvement behaviors for the youth. For youth in grades 6 through 12, the civic and community involvement component included many of the same items asked of adults, as well as items that captured involvement in activities that promote or indicate personal responsibility, including service learning activities.

All households contacted in the NHES:96 were asked to respond to a brief set of items about household use of public libraries. In households in which no persons were sampled for either of the two larger topical components, the library items followed the screening items. In households in which someone was sampled for the more extensive topical interviews, the library items appeared as part of the topical interview.

In the NHES:96, about 160,000 telephone numbers were dialed, and screening interviews were completed in about 55,000 households. Altogether, about 21,000 Parent Involvement in Education/Parent Civic Involvement interviews, 8,000 Youth Civic Involvement interviews, and 2,250 Adult Civic Involvement interviews were completed.

NHES:99

The NHES:99 is scheduled to begin data collection in January 1999.
The collection will include end-of-decade estimates of key indicators from the surveys conducted throughout the 1990s. It is expected that approximately 60,000 households will be screened, and that a total of about 40,000 interviews will be conducted with parents of children from birth through 12th grade, youth in 6th through 12th grade, and adults age 16 or older and not enrolled in grade 12 or below. Key indicators are expected to include participation of children in nonparental care and early childhood programs, school experiences, parent/family involvement in education at home and at school, youth community service activities, plans for future education, and adult participation in educational activities and community service.

Appendix B

May 1998 National Household Education Survey Publication List

For copies of most of the National Household Education Survey publications, visit our web site at http://nces.ed.gov/nhes. To order single copies in print, call (877) 4ED-PUBS.

CIVIC INVOLVEMENT
Student Interest in National News and Its Relation to School Courses, July 1997. (10 pages, NCES 97-970)
Student Participation in Community Service Activity, April 1997. (39 pages, NCES 97-331)
National Household Education Survey: Adult Civic Involvement in the United States, February 1997. (19 pages, NCES 97-906)

HOUSEHOLD USE OF PUBLIC LIBRARIES
Use of Public Library Services by Households in the United States: 1996, March 1997. (12 pages, NCES 97-446)

PARENT/FAMILY INVOLVEMENT IN EDUCATION
* Fathers' Involvement in Their Children's Schools, October 1997. (117 pages, NCES 98-091)
Factors Associated with Fathers' and Mothers' Involvement in Their Children's Schools, April 1998. (2 pages, NCES 98-122)
How Involved are Fathers in Their Children's Schools?, April 1998. (2 pages, NCES 98-120)
Students Do Better when Their Fathers are Involved at School, April 1998. (2 pages, NCES 98-121)
Nonresident Fathers can Make a Difference in Children's School Performance, June 1998. (2 pages, NCES 98-117)
Parents' Reports of School Practices to Involve Families, October 1996. (14 pages, NCES 97-327)

EARLY CHILDHOOD/SCHOOL READINESS
* Characteristics of Children's Early Care and Education Programs: Data from the 1995 National Household Education Survey, June 1998. (160 pages, NCES 98-128)
The Elementary School Performance and Adjustment of Children Who Enter Kindergarten Late or Repeat Kindergarten: Findings from National Surveys, December 1997. (70 pages, NCES 98-097)
Approaching Kindergarten: A Look at Preschoolers in the United States, October 1995. (72 pages, NCES 95-280)
* Child Care and Early Education Program Participation of Infants, Toddlers, and Preschoolers, October 1995. (11 pages, NCES 95-824)
Family-Child Engagement in Literacy Activities: Changes in Participation Between 1991 and 1993, December 1994. (9 pages, NCES 95-689)
Access to Early Childhood Programs for Children At Risk, May 1994. (112 pages, NCES 93-372)
Readiness for Kindergarten: Parent and Teacher Beliefs, September 1993. (10 pages, NCES 93-257)
Profile of Preschool Children's Child Care and Early Education Program Participation, February 1993. (38 pages, NCES 93-133)
Home Activities of 3- to 8-year-olds, January 1992. (8 pages, NCES 92-004)
Experiences in Child Care and Early Childhood Programs of First and Second Graders, January 1992. (6 pages, NCES 92-005)

SCHOOL SAFETY AND DISCIPLINE
Student Reports of Availability and Peer Approval of the Use of Tobacco, Alcohol and Other Drugs at School: 1993, June 1997. (19 pages, NCES 97-279)
Student Victimization at School, October 1995. (8 pages, NCES 95-204)
Student Strategies to Avoid Harm at School, October 1995. (7 pages, NCES 95-203)
Gangs and Victimization at School, July 1995. (2 pages, NCES 95-740)
Use of School Choice, June 1995. (2 pages, NCES 95-742R)
Parent and Student Perceptions of the Learning Environment at School, September 1993. (17 pages, NCES 93-281)

ADULT EDUCATION
Adults' Participation in Work-Related Courses: 1994-1995, November 1998. (16 pages, NCES 98-309)
Participation of Adults in English as a Second Language Classes, May 1997. (19 pages, NCES 97-319)
Participation of Adults in Basic Skills Courses, March 1997. (20 pages, NCES 97-325)
* Forty Percent of Adults Participate in Adult Education Activities: 1994-95, November 1995. (12 pages, NCES 95-823)
Adult Education: Employment-Related Training, May 1994. (29 pages, NCES 94-471)
Adult Education: Main Reasons for Participating, June 1993. (8 pages, NCES 93-451)

TECHNICAL REPORTS
An Experiment in Random-Digit-Dial Screening, December 1997. (49 pages, NCES 98-255)
* An Overview of Response Rates in the National Household Education Survey: 1991, 1993, 1995, and 1996, June 1997. (64 pages, NCES 97-948)
Feasibility of Conducting Followup Surveys in the National Household Education Survey, June 1997. (23 pages, NCES 97-335)
* Overview of the National Household Education Survey: 1991, 1993, 1995, and 1996, May 1997. (35 pages, NCES 97-448)
Measuring Participation in Adult Education, May 1997. (50 pages, NCES 97-341)
* Reinterview Results for the School Readiness and School Safety and Discipline Components of the National Household Education Survey 1993, May 1996. (75 pages, NCES 97-339)
Adjusting for Coverage Bias Using Telephone Service Interruption Data from the National Household Education Survey 1993, December 1996. (34 pages, NCES 97-336)
Use of Cognitive Laboratories and Recorded Interviews in the National Household Education Survey, September 1996. (36 pages, NCES 96-332)
Overview of the NHES Field Test, July 1992. (44 pages, NCES 92-099)
Telephone Undercoverage Bias of 14- to 21-year-olds and 3- to 5-year-olds, July 1992. (42 pages, NCES 92-101)
Multiplicity Sampling for Dropouts in the NHES Field Test, July 1992. (23 pages, NCES 92-102)
Proxy Reporting of Dropout Status in the NHES Field Test, July 1992. (29 pages, NCES 92-103)
Effectiveness of Oversampling Blacks and Hispanics in the NHES Field Test, July 1992. (20 pages, NCES 92-104)

WORKING PAPERS
Working papers related to NHES:96
Design, Data Collection, Interview Timing and Data Editing in the 1996 National Household Education Survey, November 1997. (95 pages, NCES Working Paper 97-35)
Reinterview Results for the Parent and Youth Components of the National Household Education Survey, December 1997. (47 pages, NCES Working Paper 97-38)
Undercoverage Bias in Estimates of Characteristics of Households and Adults in the 1996 National Household Education Survey, December 1997. (29 pages, NCES Working Paper 97-39)
Unit and Item Response, Weighting, and Imputation Procedures in the 1996 National Household Education Survey, December 1997. (86 pages, NCES Working Paper 97-40)

Working papers related to NHES:95
* Estimation of Response Bias in the NHES:95 Adult Education Survey, June 1996. (41 pages, NCES Working Paper 96-13)
* The 1995 National Household Education Survey: Reinterview Results for the Adult Education Component, June 1996. (48 pages, NCES Working Paper 96-14)
* Undercoverage Bias in Estimates of Characteristics of Adults and 0- to 2-Year-Olds in the 1995 National Household Education Survey, December 1996. (22 pages, NCES Working Paper 96-29)
* Comparison of Estimates from the 1995 National Household Education Survey, December 1996. (75 pages, NCES Working Paper 96-30)
* Unit and Item Response, Weighting, and Imputation Procedures in the 1995 National Household Education Survey, February 1997. (107 pages, NCES Working Paper 97-06)
* Design, Data Collection, Interview Timing and Data Editing in the 1995 National Household Education Survey, March 1997. (95 pages, NCES Working Paper 97-08)
* Adult Education Course Coding User's Manual, June 1997. (196 pages, NCES Working Paper 97-19)
* Adult Education Course Code Merge Files User's Guide, June 1997. (50 pages, NCES Working Paper 97-20)

Working papers related to NHES:93
Telephone Coverage Bias and Recorded Interviews in the 1993 National Household Education Survey, February 1997. (89 pages, NCES Working Paper 97-02)
Design, Data Collection, Interview Timing and Data Editing in the 1993 National Household Education Survey, February 1997. (53 pages, NCES Working Paper 97-04)
Unit and Item Response Rates, Weighting and Imputation in the 1993 National Household Education Survey, February 1997. (54 pages, NCES Working Paper 97-05)
Comparison of Estimates from the 1993 National Household Education Survey, December 1997. (75 pages, NCES Working Paper 97-34)

QUESTIONNAIRE WORKING PAPERS
* 1996 National Household Education Survey (NHES:96) Questionnaires: Screener, Household and Public Library, Parent and Family Involvement in Education and Civic Involvement, Youth Civic Involvement and Adult Civic Involvement, August 1997. (60 pages, NCES Working Paper 97-25)
1991 and 1995 National Household Education Survey Questionnaires: NHES:91 Screener, NHES:91 Adult Education, NHES:95 Basic Screener, and NHES:95 Adult Education, February 1997. (80 pages, NCES Working Paper 97-03)
* 1995 National Household Education Survey (NHES:95) Questionnaires: Screener, Early Childhood Program Participation, and Adult Education, October 1996. (112 pages, NCES Working Paper 96-22)
* 1993 National Household Education Survey (NHES:93) Questionnaires: Screener, School Readiness, and School Safety and Discipline, October 1996. (66 pages, NCES Working Paper 96-21)
* 1991 National Household Education Survey (NHES:91) Questionnaires: Screener, Early Childhood Education, and Adult Education, October 1996. (58 pages, NCES Working Paper 96-20)

DATA SETS
* NHES:91/93/95/96 CD-ROM. (NCES 97-426)
The NHES:91/93/95/96 CD-ROM contains the following data files: 1991 Adult Education, Adult file; 1991 Adult Education, Course file; 1991 Preprimary file; 1991 Primary file; 1993 School Readiness file; 1993 School Safety and Discipline file; 1995 Adult Education file; 1995 Early Childhood Program Participation file; 1996 Parent/Family Involvement in Education and Parent Civic Involvement file; 1996 Youth Civic Involvement file; 1996 Adult Civic Involvement file; and 1996 Household and Public Library file. The CD-ROM contains an Electronic Codebook (ECB) program that allows researchers to examine the variables in each of the NHES data sets as well as create SAS, SPSS for DOS, and SPSS for Windows programs that generate an extract data file.
In addition, the Data Files User's Manuals for each of the NHES components, A Guide to Using Data from the National Household Education Survey, and the NHES:91/93/95/96 Electronic Codebook User's Guide are included on the CD-ROM.

Appendix C

Examples of open-ended questions drawn from the National Household Education Survey of 1995

D4. What was the major subject or field of study of your (CREDENTIAL)?
CRMAJOR1-CRMAJOR3/R  SPECIFY ______________________________________

D10. Let's talk about courses you took as a part-time student in the past 12 months. What (was/were) the name(s) of the course(s) and what was the general subject matter for each course in (CREDENTIAL) in (SUBJECT)?
CR1CLS1-CR1CLS14/R  NAME __________________  CR1SUB1-CR1SUB14/R  SUBJECT __________________
CR2CLS1-CR2CLS14/R  NAME __________________  CR2SUB1-CR2SUB14/R  SUBJECT __________________
CR3CLS1-CR3CLS14/R  NAME __________________  CR3SUB1-CR3SUB14/R  SUBJECT __________________
Questionnaire items that are the same as D10: F3, G3.

E2. In what trade or craft (are you an/did you) apprentice?
APTRADE/R  SPECIFY _______________________________________

I32. What (is/was) your job title and what (are/were) your most important duties? [JOB PROBE: For example, electrical engineer, stock clerk, typist, or farmer.] [IMPORTANT DUTY PROBE: For example, typing, keeping account books, filing, selling cars, operating printing press, and finishing concrete.]
PROFESS1-PROFESS5/R  JOB TITLE ______________________________
DUTIES1-DUTIES5/R  IMPORTANT DUTY ______________________