James-Burdumy, Susanne, Mark Dynarski, Mary Moore, John Deke, Wendy Mansfield, and Carol Pistorino. "When Schools Stay Open Late: The National Evaluation of the 21st Century Community Learning Centers Program: Final Report." U.S. Department of Education, National Center for Education Evaluation and Regional Assistance. Available at http://www.ed.gov/ies/ncee.
Background: The 21st Century Community Learning Centers program has supported after-school programs since 1998. Research on the effects of such programs has been inconclusive, fueling an ongoing debate about their effectiveness.
Purpose: To examine the implementation of the 21st Century Community Learning Centers after-school program and assess its impacts on students. Earlier reports from this study presented findings based on two school years of data for middle school students and one school year of data for elementary school students. Key impact findings from the first report include no improvement in homework completion, limited effects on academic outcomes, no reduction in self-care, no improvements in safety and behavior, higher levels of parental involvement for the treatment group relative to the control group, and few effects on developmental outcomes. Key impact findings from the second report include higher levels of adult supervision and lower levels of sibling supervision for treatment-group students relative to control-group students, no reduction in self-care, few impacts on academic outcomes, improved feelings of safety after school for elementary students in the treatment group, mixed evidence on negative behavior for middle school students, some impacts on parents of elementary students, and few impacts on developmental outcomes. The current report presents impact analyses based on two years of follow-up data for elementary students.
Setting: Twenty-six 21st Century centers in 12 school districts.
Subjects: A total of 2,308 elementary students eligible for and interested in attending a 21st Century Community Learning Center. A total of 973 students applied to 18 centers in fall 2000, and 1,335 applied to 8 centers in fall 2001.
Intervention: 21st Century centers typically offered homework sessions; academic activities; enrichment activities, such as art, drama, or music; and recreational activities.
Research Design: Randomized controlled field trial. Students were randomly assigned either to the 21st Century center group (1,258 students) or to a control group (1,050 students).
Control or Comparison Condition: Control students could participate in any other after-school activities and programs to which they were entitled or eligible, but they were not eligible to participate in 21st Century after-school centers for two years.
Data Collection and Analysis: Data on students' supervision after school, academic achievement, behavior, developmental outcomes, and feelings of safety after school were collected from parents, teachers, students, and school records in fall 2000 (baseline), spring 2001 (first follow-up), and spring 2002 (second follow-up) for the first cohort of students, and one year later for students who applied to centers in fall 2001. The Stanford Achievement Test in reading was administered at baseline and follow-up. Regression-adjusted impact estimates comparing the outcomes of treatment and control students were calculated. We also collected implementation data from program staff and principals and visited each site twice, once during each of the two years of the study.
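The logic of a regression-adjusted impact estimate can be illustrated with a minimal sketch: regress the follow-up outcome on a treatment indicator and baseline covariates, and read the impact off the treatment coefficient. The data below are simulated, and the effect size and variable names are hypothetical, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000
treat = rng.integers(0, 2, size=n)            # random assignment indicator
baseline = rng.normal(600.0, 40.0, size=n)    # hypothetical baseline reading score
# Assume a true program impact of 3 points on the follow-up score
followup = 50.0 + 0.9 * baseline + 3.0 * treat + rng.normal(0.0, 20.0, size=n)

# Regress the follow-up outcome on a constant, the treatment indicator,
# and the baseline covariate; the coefficient on the treatment indicator
# is the regression-adjusted impact estimate.
X = np.column_stack([np.ones(n), treat, baseline])
coef, *_ = np.linalg.lstsq(X, followup, rcond=None)
impact_estimate = coef[1]
```

Because assignment is random, adjusting for the baseline score does not change what the treatment coefficient estimates; it only tightens the estimate by absorbing outcome variation unrelated to treatment.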
Findings: Earlier analyses found few impacts of 21st Century programs. It was hypothesized that an additional year of follow-up data might show positive effects because students had the opportunity to participate for a second school year, and change in some outcomes might require more time than others. Analyses of an additional year of follow-up data do not yield support for this hypothesis. Treatment-group students were less likely than control-group students to be in parent care and more likely to be in the care of other adults, but they were no less likely than control-group students to be in self-care. Treatment-group students did not have higher levels of academic achievement as measured by reading test scores or grades in math, science, social studies, and English relative to control-group students. There was evidence of higher levels of negative behavior among the treatment group relative to the control group on multiple outcomes, including suspensions, teachers calling students' parents about behavior, and students being disciplined by teachers. There were mixed effects on developmental outcomes. Treatment-group students had improved feelings of safety after school relative to control-group students.
Conclusions: This study finds that elementary students randomly assigned to attend the 21st Century Community Learning Centers after-school program were more likely to feel safe after school, no more likely to have higher academic achievement, no less likely to be in self-care, and more likely to engage in some negative behaviors than students randomly assigned to the control group, and that the program had mixed effects on developmental outcomes.
Various studies of after-school programs have reported that programs can increase academic achievement and student safety and reduce negative behaviors, such as drug and alcohol use. However, some studies have reported that after-school programs had no effect on some important outcomes and even worsened others, leading to a debate over whether the research evidence supports increased investment in after-school programs.
In 1994, Congress authorized the 21st Century Community Learning Centers (21st Century) program to open up schools for broader use by their communities. In 1998, the program was refocused on providing academic, enrichment, and recreational activities in public schools during the after-school hours (centers also could be open before school, on weekends, and during the summer). The program grew from an appropriation of $40 million in 1998 to $1 billion in 2002, where it has remained.
In 1999, the U.S. Department of Education (ED) selected Mathematica Policy Research, Inc. (MPR) and Decision Information Resources, Inc. to evaluate the 21st Century program. The challenge facing the national evaluation was to address three key questions about a rapidly growing and popular program at a level of rigor that would support policymakers in their efforts to further develop and enhance the program. The three questions were: (1) Did the program improve student outcomes, such as supervision after school, safety after school, academic achievement, behavior, and social and emotional development? (2) What types of students benefited the most? and (3) What were the features and characteristics of programs? The wide range of outcomes examined in the study was guided by program content and ED's priorities in the 21st Century program grant competitions, which called for programs to include extended learning opportunities, but also allowed them to include enrichment activities, such as recreation, music, and art.
To address these key questions, the evaluation conducted an impact study and an implementation study. Two different designs were used for the elementary and middle school impact studies. The elementary school study was based on random assignment, in which outcomes of students assigned to the centers were compared to outcomes of students not assigned to the centers. The elementary grantees and centers in our study were purposively selected because they could implement random assignment, and the results apply to these grantees and centers. The results should not be interpreted as findings from the universe of 21st Century centers serving elementary school students. The middle school study was based on a nationally representative sample of 21st Century programs serving middle school participants and a matched-comparison design, in which outcomes of students who participated in centers were compared to outcomes of similar students who did not. The results can be interpreted as findings from the universe of 21st Century centers serving middle school students (in the first three cohorts of grantees). Both the elementary and middle school studies followed all students in the treatment and control or comparison groups for two school years, with baseline data collection in the fall and follow-up data collection in the two subsequent springs. Both studies collected implementation data, mainly through visits to centers in both school years.
In its first year of data collection, the evaluation collected baseline and first follow-up data for roughly 1,000 elementary students in 18 centers in 7 school districts, and 4,300 middle school students in 61 schools in 32 school districts (baseline data were collected in fall 2000, and first follow-up data were collected in spring 2001). The evaluation's first report, released in February 2003, includes findings based on these data (Dynarski et al. 2003).
In its second year of data collection, the evaluation added a second cohort of elementary school students from 8 centers (administering the baseline surveys in fall 2001 and first follow-up surveys in spring 2002 to these new students), and conducted the second and final follow-up data collection for middle school students and the first cohort of elementary school students (second follow-up data for these students were collected in spring 2002). First follow-up data from the first and second cohorts of elementary students also were combined. Findings from these data—the full first follow-up sample for elementary school students and the full second follow-up sample for middle school students—were reported in Dynarski et al. (2004) (hereafter, referred to as the second report).
In spring 2003, during the evaluation's third and final year of data collection, the study collected the second and final year of follow-up data for the second cohort of elementary students. The second follow-up data for the two elementary school cohorts were then combined. This report analyzes these second follow-up data on elementary school students, to explore whether outcomes are affected by a second year of being able to attend 21st Century programs in the evaluation.
The report concludes with a synthesis of the evaluation's findings. The synthesis looks comprehensively at implementation and impact findings for elementary schools and middle schools in the context of the program's objectives and goals.
When the national evaluation got under way in October 1999, relatively little was known about the effectiveness of after-school programs, though some research suggested that the programs held promise. This promise was captured in the title Safe and Smart, a report about after-school programs jointly issued in June 1998 by ED and the U.S. Department of Justice, promoting after-school programs as safe places for children to improve their academic skills and enhance other aspects of their development.
At the time of Safe and Smart's release, ED was making its initial 21st Century grants, totaling $40 million to school districts. Within a few months, Congress increased the program's funding to $200 million; the following year, funding more than doubled, to $450 million. When the evaluation began in 1999, funding had increased tenfold in three fiscal years. Program funding continued to increase, rising to $1 billion in 2002, where it has remained. Before the No Child Left Behind Act (NCLB) changed the program, ED had made grants to seven cohorts of school districts, with funds going to almost 1,600 districts and 6,800 schools.
Design of the National Evaluation
The national evaluation of the 21st Century Community Learning Centers program includes an elementary school study and a middle school study.
The elementary school study used random assignment of students to treatment and control groups. Random assignment was conducted separately for each center. The study included 12 school districts and 26 centers, which were able to participate in the evaluation because more students were interested in attending than the centers could serve, a precondition for random assignment. The findings are based on data collected from students, parents, teachers, principals, program staff members, and school records. The evaluation selected students in the fall of the school year and followed them for two school years. Baseline and first follow-up data were collected for 589 treatment-group students and 384 control-group students in seven school districts in the 2000-2001 school year (the first cohort), and for 693 treatment-group students and 666 control-group students in five school districts in the 2001-2002 school year (the second cohort). Second follow-up data were collected in the 2001-2002 school year for the first cohort, and in the 2002-2003 school year for the second cohort. The total elementary school sample was 2,308 students.
The middle school study is based on a nationally representative sample of 21st Century programs serving middle school participants and a matched-comparison group of students who are similar to participants. Similar students were identified in host schools or in other schools in the participating districts. Student data were collected from 32 school districts and 61 centers in those districts. The sample includes 1,782 participants who were matched to 2,482 comparison students. As with the elementary school study, the evaluation selected students in the fall of the school year and followed those students for two school years. Baseline and first follow-up data were collected in the 2000-2001 school year, and second follow-up data were collected in the 2001-2002 school year.
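The core of a matched-comparison design is pairing each participant with a non-participant who looks similar at baseline. A deliberately simplified one-variable nearest-neighbor match is sketched below; the actual study matched on multiple student characteristics, and the scores here are simulated for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
participants = rng.normal(590.0, 35.0, size=200)   # hypothetical baseline scores, participants
pool = rng.normal(600.0, 40.0, size=2_000)         # hypothetical baseline scores, non-participants

# For each participant, select the non-participant with the closest
# baseline score as the comparison student.
matches = np.array([pool[np.argmin(np.abs(pool - p))] for p in participants])

# After matching, the two groups should be nearly identical at baseline.
gap = abs(participants.mean() - matches.mean())
```

Unlike random assignment, matching balances only the characteristics that are observed and used in the match, which is why the elementary (experimental) and middle school (matched) results rest on different assumptions.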
ED funded seven cohorts of 21st Century discretionary grants. The elementary school study includes grantees from the first five cohorts (4 of the 12 districts also received grants in the sixth and seventh cohorts). The middle school study includes grantees from the first three cohorts. When the study began, all grantees were in their second or third year of a three-year grant. In 2001, NCLB changed the program from discretionary grants to state-administered grants. As of October 2004, 6 of the 12 school districts in the study had received grants from their state.
The implementation analysis was based on site visits to all grantees, with visits lasting two to four days. Each center was visited twice, once during each of the two years of the study. The study also conducted surveys of program directors, program staff, and school principals in its first two years. These surveys were not conducted for the second year of the second elementary school cohort.
In 2002, NCLB changed the program's structure by allocating its funds to states in proportion to the state allocation of Title I funds. States operate their own grant competitions to select school districts to receive funding.
Even though the 21st Century program made its first grants in 1998, school districts receiving funding were not necessarily operating after-school programs for the first time. Most districts in the study had experience in running some type of after-school program, though the programs may have been smaller or included fewer services and activities than those offered with the 21st Century grants. When data collection for the evaluation began, the programs were in the second or third year of their 21st Century grant.
The legislation authorizing the 21st Century program did not require programs to focus on academic activities, but ED's priorities in its grant competitions were clear: to be funded, programs needed to provide these types of activities. In the initial grant competition announced in the Federal Register on December 2, 1997, ED indicated it would fund only those applicants that propose "an array of inclusive and supervised services that include extended learning opportunities (such as instructional enrichment programs, tutoring, or homework assistance) but may also include recreational, musical and artistic activities." ED also awarded additional points to applicants (school districts) that proposed activities that assisted students in meeting state and federal standards in core academic subjects.
A Typical Elementary School Center
The center is open five days a week for two and a half to three hours a day. About 85 students come to the center on an average day. The first hour is a snack and a homework session. Certified teachers and aides oversee the homework session. After homework ends, students are grouped by grade level and rotate through various activities, depending on the day of the week. Some students work in the computer lab on their reading and math skills or meet with certified teachers for instruction that complements instruction in the regular school day. Other students participate in enrichment activities, such as martial arts, fitness and dance, art, and music. A mix of teachers, instructional aides, and outside organizations lead the enrichment activities. On Fridays, students participate in special activities or spend time playing board games or basketball.
Characteristics of Elementary School Centers in the Second Year
Combining the information from both cohorts of centers in the study, the two most common objectives of administrators of elementary school centers were to (1) provide students with a safe place after school, and (2) help students improve academically. These goals were similar to those of parents, who said they enrolled their children in the centers to help them do better in school (80 percent of parents) or to provide "a safe place for my child after school" (69 percent of parents).
Generally, centers were open for three hours after school four or five days a week. A typical day included one hour for homework and a snack, one hour for another academic activity, such as a lesson or working in a computer lab, and one hour for recreational or cultural activities. All centers provided academic assistance, mostly as homework sessions (100 percent of centers), instruction in reading and writing (86 percent), and instruction in math (77 percent).
Centers also provided recreational, cultural, and interpersonal development activities. Nearly all centers offered recreational activities, ranging from unstructured free time to organized sports. Centers also offered dance, drama, music, and workshops on development topics, such as developing leadership skills and resolving conflicts with peers.
Students in the treatment group attended an average of 81 days during the two-year follow-up period: 49 days in the first year and 32 in the second. An important reason for the decline in measured attendance in the second year is that one-fourth of the treatment group no longer had access to the program, because they had changed schools and their new school did not operate a 21st Century center (see Figure 1). About 40 percent of treatment-group students attended the program for at least one day in the second year (Figure 2 shows the distribution of attendance for these students). Attendance for these continuing students averaged about 81 days, which translates into roughly 2.7 days a week (centers were open for 30 weeks on average) or 63 percent of the days centers were open (on average, centers were open for 129 days). The study observed large variations in average attendance across districts and students, but additional analysis did not reveal district or student characteristics that explained the variations.
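The attendance arithmetic above can be checked directly from the rounded averages reported in the text:

```python
# All figures are the rounded averages reported in the text.
avg_days_second_year = 81   # average days attended by students who attended in year 2
weeks_open = 30             # centers open about 30 weeks on average
days_open = 129             # centers open about 129 days on average

days_per_week = avg_days_second_year / weeks_open       # about 2.7 days a week
share_of_open_days = avg_days_second_year / days_open   # about 63 percent of open days
```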
Impacts of Elementary School Centers in the Second Year
The experiences and outcomes of control-group students in the evaluation provide a benchmark for measuring impacts. Control-group students may have gone home after school or attended some other after-school program, been supervised by a parent, sibling, or some other adult, worked on their homework in their own home or in an after-school program, and so on. Because the evaluation used an experimental design, these experiences accurately measure what treatment-group students would have experienced in the absence of the 21st Century center in their school. The experimental design ensures that outcome differences between the treatment and control groups are attributable to the program.
Supervision After School. Treatment-group students were more likely than control-group students to be with adults who were not their parents (40 vs. 33 percent) and less likely to be with their parents after school (68 vs. 75 percent). There was no impact of the program on the frequency of self-care (defined as not being with a parent, another adult, or older sibling after school). Just over one percent of students were in self-care three or more days in a typical week (Figure 3).
Academic Achievement. There were no differences between treatment-group students and control-group students on most academic outcomes. Treatment-group students scored no better on reading tests than control-group students and had similar grades in English, mathematics, science, and social studies. There also were no differences in time spent on homework, preparation for class, and absenteeism. However, teachers reported lower levels of effort and achievement for treatment-group students relative to control-group students. (According to teachers, 47 percent of treatment students tried hard in reading vs. 52 percent of control students, and 22 percent of treatment students achieved at an above-average or high level vs. 28 percent of control students.)
Safety After School. Treatment-group students reported feeling safer after school than control-group students; 2.5 percent of treatment-group students, compared with 7.1 percent of control-group students, reported feeling "not at all safe" after school.
Negative Behaviors. Treatment-group students were more likely than control-group students to engage in negative behaviors during the school day. Treatment-group students were more likely than control-group students to have schools contact their parents about behavior problems (28 vs. 23 percent), be suspended (12 vs. 8 percent), miss recess or sit in the hall (22 vs. 17 percent), and have their parents come to school about a problem (22 vs. 17 percent). The outcomes were gathered from different data sources, and higher levels of negative behaviors for the treatment group relative to the control group were evident in most of the 12 school districts. On other measures, such as teacher reports of sending the student to the office for misbehaving or giving the student detention, there were no impacts.
Developmental Outcomes. Teachers were less likely to report that program students got along well with others, relative to control-group students (70 vs. 76 percent), and program students were less likely than control-group students to rate themselves highly on working with others on a team (78 vs. 85 percent). However, program students' own reports showed no difference in getting along with others their age, in contrast with teacher reports. The discrepancy may be attributable to differences in the samples underlying the two measures (student surveys were administered only to third- through sixth-grade students) or to differences in perspective between teachers and students.
Parent Outcomes. There was no impact of the program on parental involvement in school, as measured by attendance at events held after school, parent-teacher organization meetings, or open houses or by the extent to which parents volunteered at school.
Subgroup Impacts. Generally, impacts did not differ significantly across subgroups. However, impacts on negative behaviors appeared to differ significantly for boys relative to girls, and for students with higher versus lower levels of disciplinary problems at baseline. In addition, impacts on grades differed significantly for students with lower test scores at baseline relative to students with higher baseline test scores.
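Subgroup differences in impacts are typically tested with a treatment-by-subgroup interaction term in the impact regression. A minimal sketch on simulated data follows; the subgroup, effect sizes, and variable names are hypothetical, not figures from the study.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20_000
treat = rng.integers(0, 2, size=n)   # random assignment indicator
boy = rng.integers(0, 2, size=n)     # subgroup indicator (e.g., boys)
# Assume the impact is 1.0 for girls and 1.0 + 2.0 = 3.0 for boys
outcome = 1.0 * treat + 2.0 * treat * boy + 0.5 * boy + rng.normal(0.0, 2.0, size=n)

# The coefficient on treat * boy estimates how much larger the impact
# is for the subgroup than for everyone else.
X = np.column_stack([np.ones(n), treat, boy, treat * boy])
coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)
subgroup_difference = coef[3]
```

A statistically significant interaction coefficient is what "significantly different impacts" for a subgroup means in this setting.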
Comparison of Elementary School Findings in the First and Second Years
Some findings are the same in both years. In both years, elementary students in centers were less likely than control-group students to be supervised by parents after school and more likely to be supervised by other adults, more likely to be at school during after-school hours, and less likely to be at home. In both years, the program had no impact on academic outcomes, such as grades, test scores, or homework completion, and treatment-group students reported feeling safer after school than control-group students.
Other findings were found in one year but not the other. In the first year, the study found that elementary school students in the treatment group were more likely than students in the control group to help other students after school, but this impact was not found in the second year. In the second year, treatment-group students were less likely than control-group students to rate themselves highly at working with others on a team, and, according to teachers, were less likely than control-group students to get along with others. A higher percentage of treatment-group students than control-group students had behavior problems in the second year, but the findings were not statistically significant in the first year. In the first year, parents of treatment-group students were more likely than parents of control-group students to attend after-school events, help their child with homework, and ask their child about school, but none of these impacts was found in the second year. Boys and students with higher levels of discipline problems at baseline experienced larger impacts on negative behavior, and students with low test scores at baseline had significantly different impacts on grades than students with high baseline test scores. Neither pattern was evident in the first year.
Synthesis of National Evaluation Findings
The national evaluation is the largest and most rigorous examination to date of school-based after-school programs. Given the large amount of data that have been collected, analyzed, and reported, it is helpful to synthesize the findings presented in the evaluation's three major reports. We first highlight key implementation findings, then turn to impact findings.
The synthesis necessarily focuses on particular findings from the many findings reported by the evaluation. In highlighting the particular findings, the synthesis relied on the three main evaluation questions: (1) What were the features and characteristics of programs? (2) Did programs improve student outcomes? and (3) What types of students benefited the most? It also considered the second impact question in five student domains: supervision and location after school, academic performance, personal and social development, behavior, and safety. In addition, the synthesis touches on several parent outcomes. Generally, impact findings are discussed only if the estimated impact is statistically significant in one or both years. Some findings relate to an absence of impact when it was hypothesized that an impact would be observed.
The synthesis combines both elementary and middle school findings. Middle school centers in the study were nationally representative, but elementary school centers had higher levels of low-income and minority students than the national average for elementary school centers, and the impact estimates are based on different measurement designs. The synthesis focuses on the overall consistency of findings, for which these differences play less of a role.
The study team collected data from program directors, staff, and school principals, and it observed centers to analyze program objectives, activities, staffing, and changes in centers during the two-year follow-up period. The data were the basis for several useful findings about implementation.
National data from program performance reports provide a description of an average 21st Century center. An average center serves about 200 students during a school year (though the number served each day is lower and varies widely across centers) and is open 10 or more hours a week (many are open 20 or more hours a week and on Saturdays). The center employs 12 or 13 staff, many of whom are teachers during the regular school day, to work with students. The center's budget enables it to spend about $1,000 a year per enrolled student, with most of its funds coming from the 21st Century grant.
Most schools hosting centers are elementary and middle schools that enroll a large number of low-income and minority students. Whereas 17 percent of middle schools nationwide are classified as high poverty (based on the proportion of students participating in the free lunch program), 66 percent of middle schools operating 21st Century centers are classified as high poverty. Similarly, 37 percent of students in middle schools nationwide are minorities; in middle schools operating 21st Century centers, 57 percent are minorities.
In both middle and elementary centers, program directors' most important objectives were (1) providing a safe environment after school, and (2) helping students improve academically. These objectives coincide with ED's Safe and Smart theme for the 21st Century program. Nearly all centers provided academic activities in reading, math, and science. Enrichment activities, such as art, music, and technology, also were common.
Program directors of elementary school centers in the study reported that they designed activities mostly to increase academic achievement and to provide opportunities for enrichment and recreation. Directors of middle school centers reported that they designed activities to improve academic achievement but also to appeal to students (most of whom said they attended voluntarily) and to accommodate staff, parent, and teacher views about what students needed to develop and improve. In interviews, program directors noted that they needed to provide interesting and fun activities that attracted students, while also providing academic activities that they viewed as being less attractive to students. Finding the right balance was a continual concern.
The study found wide variability in activities and services delivered across programs. Homework help was the most consistent activity that programs provided, but nearly all other activities and services varied widely across districts (and across centers within districts, to a lesser degree). The variability is consistent with the "model" underlying the program, which is that school districts and community partners should work together and combine local resources and skills to create a menu of services and activities that appeal to students. The authorizing legislation and ED's funding criteria both left program design primarily to the local programs. The variation in activities and services observed by the study is a logical consequence of this feature.
Academic activities, which programs had to provide to be funded, also varied according to local skills and resources. Middle school programs commonly provided homework help, and the evaluation observed that, typically, the help was passive and more like a study hall than a tutoring session. Other academic activities generally focused on smaller numbers of students who needed to work on particular skills or practice for state assessment tests. Coordination with the school-day curriculum was uncommon. Elementary school programs provided a range of academic activities beyond homework, and most were attentive to coordinating the activities with curriculum in the regular school day. Program staff and school-day teachers generally were aware of the need to have information flow between teachers in classrooms and staff in programs, but they had varying degrees of success in facilitating the flow. Coordination was smoother when regular schoolteachers were also program staff and had the same students, which was uncommon. Coordination appeared weak or nonexistent in centers that relied on outside staff, focused on noncognitive activities, or used processes that created a paperwork burden, such as having teachers send homework assignments to programs or share lesson plans with them.
Program leadership was stable during the two-year period in which the study observed implementation: 82 percent of program directors were still working for their programs in the study's second year. However, two-thirds of center staff and one-third of center coordinators from the first year had left the centers by the second year, indicating high turnover. Centers did not pay high wages, which may have contributed to turnover, but the most common reason staff gave for departing was the demands of working after school.
This burnout factor may relate to the fact that many center staff were teachers during the regular school day. Though hiring teachers as staff has advantages—they are familiar with delivering curriculum and instruction and maintaining control of students, and the district knows them—the demands of teaching during the day make it difficult to teach after school as well.
Program attendance averaged about two days a week for elementary school students and about one day a week for middle school students. Weekly attendance for middle school students was higher early in the school year and declined as the year went on, and many did not return to the program in the second year. Weekly attendance for elementary school students was roughly constant throughout the school year, and they were more likely than middle school students to return in the second year. Middle school and elementary school students who returned in the second year had attendance patterns similar to those in the first year. Neither program nor student characteristics appeared to be related to the frequency of attendance, and the study did not find relationships between more frequent attendance and positive outcomes. However, more frequent and steadier attendance would help programs manage service delivery and integrate school-day and after-school instruction.
Key Impact Findings from the National Evaluation

The experiences and outcomes of control- and comparison-group students in the evaluation provide the benchmark for measuring impacts. Control- or comparison-group students may have gone home after school, attended some other after-school program, been supervised by a parent, sibling, or some other adult, worked on their homework at home or in another after-school program, and so on. Because the elementary school evaluation used an experimental design, the study can validly measure what treatment-group students would have experienced in the absence of the 21st Century center in their school or local area. For example, the majority of elementary school students in the control group were at home and with a parent after school, which indicates that the majority of treatment-group students attending 21st Century centers would have been home with a parent if centers did not operate in their schools.
For the middle school study, the evaluation used a matched-comparison group design, which by its nature cannot rule out the possibility that other factors besides the program explain part of the outcome differences. The evaluation used statistical techniques to adjust for a wide range of other variables that could differ between the treatment and comparison groups.
Supervision After School. Students in centers were more likely than control- or comparison-group students to be with adults who were not their parents after school and less likely to be with parents or older siblings. There was no impact of the program on self-care, regardless of how it was defined.
Academic Achievement. Generally, the program had no impact on reading test scores or grades. For elementary school students who had low grades at baseline, the program had a positive impact on English grades. The difference was about 2 points on a 100-point scale. Middle school students in the treatment group also had lower absenteeism than students in the comparison group.
Homework. Homework assistance was the most common academic activity that centers provided, but there was no impact of the program on the extent to which students completed homework or received help with it. The study found that nearly all elementary school students already received homework help. About 90 percent of the elementary students in the control group reported that a parent or some other adult asked them if their homework was complete, and about 80 percent reported that a parent or some other adult checked their homework to see if it was complete. For middle school students in the comparison group, 80 percent reported that a parent or other adult asked them if their homework was complete; about 53 percent reported that a parent or other adult checked that homework was complete.
Feelings of Safety. Elementary school students in the treatment group reported feeling safer after school than students in the control group, even though nearly three-quarters of students in the control group reported feeling "very safe" (the highest of three categories) and only seven percent reported feeling "not at all safe" (the lowest of three categories). Similar findings were not observed for middle school students. Fewer than three percent of middle school students reported feeling "not at all safe."
Developmental Outcomes. The study looked at a range of outcomes related to personal and social development, though it did not collect detailed measures in these domains. Although most outcomes showed no differences, middle school treatment-group students were more likely than comparison-group students to say they expected to graduate from college. The difference was small, about two percentage points. Elementary students in the treatment group were more likely than elementary students in the control group to report helping other students after school in the first year, which may be related to program activities. In the second year, however, treatment-group students were less likely than control-group students to say they worked well in teams, and teachers rated them lower than control-group students in getting along with others.
Parental Outcomes. Parents of elementary students in the treatment group had higher employment levels than parents of elementary students in the control group in the first year but not in the second year. The finding hints that programs may enable parents to participate in the labor market, although the lack of a second-year finding makes the picture unclear. For middle school parents, parental involvement was higher in the first year for the treatment group than for the comparison group: treatment-group parents were more likely than comparison-group parents to attend parent-teacher organization meetings, volunteer at school, and go to after-school events. Elementary school parents in the treatment group were more likely than parents in the control group to participate in after-school events in the first year, but their involvement in other areas was unaffected. In the second year, parents were as involved as in the first year, but the extent of involvement was the same for the treatment and control groups.
Negative Behaviors. Middle school treatment-group students were more likely than comparison-group students to engage in some negative behaviors. A composite variable for five negative behaviors was higher for the treatment group than the comparison group in both years, and the difference was statistically significant. Negative behaviors were also higher among elementary students in the treatment group compared with the control group in the second year but not the first. Treatment-group students were more likely than control-group students to be disciplined by their regular school-day teachers and to be suspended from school (about 12 percent of the treatment group were suspended at least once in the second year, compared with about 8 percent of the control group). Discussions with program directors indicated that, generally, students were not suspended for misbehavior that may have occurred during the after-school program, suggesting that, like the teacher discipline outcome, suspensions were related to negative behavior during the regular school day. Subgroup analyses showed that impacts on negative behaviors were larger for boys (impacts for girls were close to zero and statistically insignificant) and for students who had higher levels of disciplinary problems at baseline, providing some insight into the pathways of the behavior problems.