Safe, Disciplined and Drug-Free Schools Programs 2001
Safe, Disciplined, and Drug-Free Schools Expert Panel
In 1994, Congress directed the Office of Educational Research and Improvement (OERI), U.S. Department of Education, to establish "panels of appropriate qualified experts and practitioners" to evaluate educational programs and recommend to the Secretary of Education those programs that should be designated as exemplary or promising. Under the Education, Research, Development, Dissemination, and Improvement Act of 1994, each panel, in making this recommendation, was directed to consider 1) whether, based on empirical data, a program was effective and should be designated as exemplary or 2) whether there was sufficient evidence to demonstrate that the program showed promise for improving student achievement and should be designated as promising. The purpose of these panels was and still is to provide teachers, administrators, policymakers, and parents with solid information on the quality and effectiveness of programs and materials so that they can make better-informed decisions in their efforts to improve the quality of student learning. The OERI regulations implementing the statute leave to the judgment of the expert panels a determination of the nature and weight of evidence necessary to designate a program either promising or exemplary.
The Safe and Drug-Free Schools (SDFS) program and OERI established the Safe, Disciplined, and Drug-Free Schools Expert Panel in May 1998. (This panel was one of five established by the Department; the others were in the fields of math, science, gender equity, and educational technology.) The 15-member Expert Panel for Safe, Disciplined, and Drug-Free Schools was composed of educators, researchers, evaluators, program developers, and representatives from local and state education agencies, businesses, institutions of higher education, and medical and legal communities. Its task was to develop and oversee a process for identifying and designating as promising and exemplary programs that promote safe, disciplined, and drug-free schools. The Expert Panel initiative was a way of enhancing prevention programming by making schools and communities aware of programs that have proved their effectiveness when judged against rigorous criteria. The activity was also in keeping with the "Principles of Effectiveness" governing recipients' use of funds received under the Safe and Drug-Free Schools and Communities Act, State Grants Program.
The Review Process
The panel initially met to set up a process for making determinations and to establish the criteria under which programs would be reviewed. The panel drew heavily on the considerable research on "what works" in prevention programming in combating both substance use and violence among youth. The panel developed seven criteria, under the four "criteria categories" provided in the regulations, for judging the efficacy and quality of programs that would be submitted for its review and consideration. These seven criteria follow this Introduction.
The Expert Panel had an open and widely publicized submission process that encouraged applications from any program sponsor who believed that his or her program might meet the review criteria. A total of 124 programs were reviewed under a two-stage field review process established by the panel. In the first stage, 19 individuals with special expertise in research and evaluation, as well as in safe, disciplined, and drug-free schools programming, formed a pool of Criterion 1 field reviewers. They were selected by the U.S. Department of Education (the Department) from a list of individuals nominated by state SDFS coordinators, program staff, and Expert Panel members. These Criterion 1 reviewers met and were trained in the review procedures and became familiar with the criterion--evidence of efficacy--they were to use for reviewing programs. During this first-stage field review, each of the 124 programs was scored for evidence of efficacy by two Criterion 1 field reviewers.
Programs with high scores on the evidence of efficacy criterion (Criterion 1) were then considered by two second-stage field reviewers. For the second-stage field review, the Department selected a separate pool of 40 individuals, none of whom had served in the first-stage field review, as Criteria 2 to 7 field reviewers. These individuals were nominated by state SDFS coordinators and program staff for their expertise in safe, disciplined, and drug-free schools programming. These Criteria 2 to 7 field reviewers met and were trained in the procedures and criteria they were to use when reviewing programs. They reviewed submissions on the criteria categories of quality of program, educational significance, and usefulness to others.
The Expert Panel met and considered field reviewer ratings and comments from both stages of the process for all programs reviewed. The panel identified 33 programs it wished to designate as promising and nine programs it wished to designate as exemplary.
Each of the nine potentially exemplary programs was subsequently sent to a separate Impact Review Panel for further review by at least two of its members according to procedures established by the Department. The Impact Review Panel comprised a group of national experts in evaluation/research design and analysis and was established by the Department to review the strength of evidence of program effects for all five of the Department's Expert Panels. The Expert Panel then considered comments and scores from the Impact Review Panel on the nine potentially exemplary programs and made a final determination about the programs to recommend to the Department as exemplary.
This publication provides descriptions of the nine exemplary and 33 promising programs selected by the Expert Panel in 2001. Contact information for each program is also provided. In the program summaries that follow, the sections "Program Description" and "Professional Development Resources and Program Costs" were prepared based on information provided by the developers at the time they submitted their programs for consideration. At the request of the Department, developers checked each program description for accuracy and added updated information regarding costs as relevant. The remaining sections--"Program Quality" and "Evidence of Efficacy"--are based on the assessments of the reviewers and panelists.