Evaluating and Continually Improving
In this era of accountability, all school districts are struggling with how best to evaluate the effectiveness of their schools and how to use assessment and evaluation results to continually improve the quality of teaching and learning. For districts with magnet schools, which by definition are trying something unique or innovative, evaluation is all the more critical. The resulting data give their schools guidance for achieving their mission and education objectives, help them iron out possible kinks, and identify and correct what's not working. And, according to several districts in this study, as magnet schools continue to improve, they raise the bar for all schools in a district.
To keep their magnet schools both effective and relevant, districts have found it important to use data to guide improvements in teaching and learning, to revisit and evaluate magnet themes over time, and to keep parents and community stakeholders involved in the process of evaluation and improvement.
Use data as a basis for improving teaching and learning
Evaluation of magnet schools and subsequent efforts for improvement must be data driven. Knowing what data to collect and how to utilize that information to improve the magnet schools is essential.
Many factors affect choices about what data should be collected to evaluate magnet programs. Some districts must collect certain data to meet state or funding institution reporting requirements; others have developed comprehensive evaluation plans that outline the data to be collected and used in assessing various aspects of magnet schools. The important thing to remember is that data collected for evaluation purposes should ultimately be utilized to improve teaching and learning.
Districts that receive funding from the Magnet Schools Assistance Program (MSAP) must meet certain reporting requirements, and these often form the foundation for the data collection process. But many districts find it is important to collect other information as well. For example, for MSAP-funded schools, Hamilton's and Wake's magnet school directors collect information about professional development programs, alignment of the curriculum with state and local standards, infusion of the magnet theme into the curriculum, and parental involvement. The data collection process includes surveying parents, students, and teachers at the school to gauge their feelings about such things as the school's student diversity and integration of its magnet theme. For schools not supported by MSAP funds, the director uses a different evaluation process, analyzing data from the five disciplines tested by the district and looking at attendance and graduation rates, which tend to be higher in magnets than in other schools. The director of magnet schools in Hamilton also meets with principals and curriculum facilitators to discuss strengths and weaknesses.
Hot Springs uses state-mandated assessments to evaluate student progress and make adjustments to its academic programs. In addition, the district administers its own standardized tests three times a year, with a straightforward and easily interpretable growth measure to gauge student progress throughout the year. These measures are used in evaluating teacher efficacy and the success or failure of curriculum implementation at the district level. All student achievement and attendance data and other relevant information are collected and stored in a comprehensive student database maintained and utilized by the district's Office of Research and Evaluation. The district uses an outside evaluator to determine annually whether it is meeting its objectives. The evaluator visits every classroom in the elementary magnets and several of the middle- and high-school magnets to assess programs. Additionally, consultants are hired to determine whether themes are being effectively implemented and infused in the curriculum and instruction at each magnet school.
Most districts, even those no longer under a desegregation order, remain committed to a diverse enrollment in their magnet schools and track their student population mix at each site. This allows Duval to report that when it operated under a desegregation order, 46 percent of its magnet schools were racially balanced. Now, even though the district no longer factors race into school assignments, 42 percent of Duval's magnet schools have a student mix reflecting districtwide student demographics. Montclair is still under a compliance order from the 1970s that says its schools must be balanced racially within 10 percent of the population. However, in evaluating its magnet program, the district considers equally important the elimination of the achievement gap between students of color and white students, and the district has managed to reduce that persistent performance discrepancy in the course of improving student achievement overall.
Knowing who is responsible for data collection and dissemination is also important. Having a central office for research and evaluation, such as that in Hot Springs, is one way to ensure that data collection is a priority. Hot Springs' office uses a wide array of reporting and data analysis software to continually monitor district, school, teacher, and student progress. For example, the district reviews the results of benchmark assessments to evaluate schools, teachers, and students, using the information to plan professional development and other interventions around areas of weakness.
Wake has also centralized its data collection efforts. The mission of its Evaluation and Research Department is "to improve the effectiveness of the Wake County Public School System for all students by: providing objective, accurate, and timely information on system, school, and program outcomes, management practices, cost effectiveness, and compliance; collaborating with others to ensure high-quality data collection, interpretation, and data-based decision making; and ensuring grant-based programs are developed, implemented, and managed in ways consistent with district priorities, research findings, and applicable regulations."14 The department consists of four offices: Testing, Program Accountability, School Accountability, and Grants Administration and Compliance Reporting. The Program Accountability office promotes continuous improvement and accountability through evaluations of district programs, including the magnet program. This office also supports individual schools through analysis of local assessment results and interpretation of school performance data. Among the evaluation-related activities of Program Accountability staff are monitoring program participation; monitoring program implementation and effectiveness through surveys, interviews, and site visits; analyzing achievement test results as well as other data on desired outcomes; and developing needs assessments and evaluation plans for grant applications.
The highly decentralized Houston Independent School District also has a district-level Department of Research and Accountability that in 2002-03 conducted a review to capture baseline data to use in evaluating Houston's magnet schools. However, the district magnet office is continually engaged in monitoring and evaluating the overall magnet program, especially on an informal level. For example, the Magnet Department monitors program implementation at both the district and the school level through site visits and technical support. The Magnet Department routinely requests application, transfer, and enrollment information from individual magnet coordinators and conducts informal surveys to monitor magnets' impact on diversity, equity, accessibility, and school improvement.
Other districts without central evaluation and research offices have developed means to collect, disseminate, and review magnet school data. As previously noted, Hot Springs employs an outside evaluator. For its part, Duval has conducted peer reviews in the past, and the director of school choice and pupil assignment recommends this practice to other districts. During this process, a team of peers would visit a school over the course of one to two days, interview everyone involved, and then provide feedback.
Perhaps the most crucial piece of any magnet school evaluation is understanding how to utilize the available data to improve teaching and learning, not just in magnet schools, but across the district. Montclair's assistant superintendent puts it this way: "Getting the data is one thing. Understanding what it is trying to tell you [is another]. And using it as a blueprint for change: that's where [data] can help." With this in mind, Montclair has trained its principals in how to utilize data. Guided by the results from standardized tests, for example, they look to see if certain skills were taught. They also look for patterns in the report card grades that teachers are giving. The superintendent notes that being able to analyze and use data in more sophisticated ways has allowed district and school staff to see that schools are making some progress in addressing the achievement gap. The district has been following a cohort of students between 4th and 8th grade and has seen the gap in reading narrow to 6 percent.
Revisit and reinvent magnet themes to ensure appeal and relevance
As part of the evaluation process, districts must revisit and, as needed, make changes to magnet themes to ensure that themes remain relevant and appealing to the community. The attraction of a technology theme initially implemented at a time when it was still rare to find district teachers and students using technology for teaching and learning may diminish as schools throughout the district improve in integrating technology. It may be time to change the theme entirely or, instead, to ratchet up the curriculum by instituting new, specialized classes, such as computer animation. Thematic review is a necessary step in ensuring the long-term quality and sustainability of magnet schools. One way Duval evaluates its magnet program and schools is by monitoring how parents "vote with their feet." In other words, if a magnet school no longer meets the needs of students, parents will choose other alternatives, such as private schools.
Montclair officials acknowledge that implementing magnet themes that make each school unique is a challenge they must meet if their magnet schools are to remain viable and true to the magnet school philosophy. Broad community input is part of the current effort to revise and improve Montclair's themes. "The changes will not be top down," the superintendent says, "they will start with parents and teachers."
This reinventing process has happened before in Montclair. For example, Bradford was identified as the "back-to-basics" school when it was set up in the 1970s. But back-to-basics eventually lost its appeal, and Bradford enrollment began to drop. For two years the district tried unsuccessfully to re-market the school. Eventually, the theme was changed to communication arts and sciences, and enrollment began climbing. Teachers and parents worked together to develop and enhance the new program. In the reinvention process, says the principal, it was "critical to keep what was working and to hear everyone's voice." Bradford's enrollment has once again begun flagging, and as part of the current revisiting process, its theme is likely to change again. One possibility is that it will become a "university" magnet affiliated with Montclair State University.
Another example of Montclair's emphasis on formative evaluation of its magnet schools is Northeast Elementary School. When this school first became a magnet it had an "international" theme, primarily because much of its enrollment consisted of English-as-a-Second Language students. When the current principal came to Northeast in 2002, she surveyed students, parents, and teachers and found that few seemed to like the theme. "It was one day a week and too focused on the adult experts," the principal reported. "Many students said, 'I hated it. I never got to be a part of it.' Parents were unhappy, too." Using this information, the principal wrote a rationale for a new theme and presented it to district officials. The school now has a Global Studies theme.
Wake also takes steps to assess application numbers for its magnet schools. If numbers are low for a particular magnet school, the magnet office sends an internal survey to the campus, goes to the school, talks to the faculty, and, based on collected information, decides on another theme that might rejuvenate the school. As part of this process, the schools themselves conduct meetings with parents to seek their feedback.
Keep parents and community stakeholders involved in evaluation and improvement
All of the same school stakeholders and community partners that are such an important part of creating, implementing, and promoting magnet schools should be included in their evaluation and improvement. As mentioned, parents are a critical resource for the reinventing or revamping of magnet themes. Paying attention to parents and other stakeholders can inform evaluation and improvement efforts in other ways as well.
Most of the districts in this study collect data from stakeholders to use in their assessment of magnet schools. In Hot Springs, the schools keep records of parental participation in school activities, which is interpreted as an indication of parental satisfaction. Parents are also surveyed about their satisfaction with after-school programs. Montclair currently is developing a parent survey instrument that will allow the district to learn more about what parents think of the school system and what their needs are. Wake surveys parents, students, and staff annually (see figure 11). In addition to these districtwide surveys, each magnet school can survey parents, students, and faculty to get feedback about its program. Hamilton also surveys teachers, students, and parents to gauge their feelings about diversity in the school, alignment of curriculum with state standards, integration of the school's theme, and parent involvement. Teachers are also asked about their professional development experiences. Survey results are analyzed at Brown University. In doing these surveys, the district intends to generate more data to inform school improvement efforts.
Stakeholders from outside the district management structure can also be involved directly in evaluation activities. One of the responsibilities of the Magnet Advisory Council in Duval, a diverse group including members from the community, higher education, private industry, district staff, teachers, and principals, is to "review program evaluations concerning the effectiveness of magnet programs and ... [to make] recommendations to the school system regarding the effectiveness of these programs."
In this district and elsewhere, ongoing review helps stakeholders identify and build on the strengths of their program and understand and address the areas in need of improvement. In this way, their magnet schools remain vital and effective.
Summary for Evaluating and Continually Improving