The problem. As schools have come under increasing pressure to improve student achievement on state and national tests, children's opportunities to play and run around during the school day have dwindled. Those recess periods that do exist often lack the structure needed to support physical activity and positive social development and, instead, can be occasions for bullying and other aggressive behavior. Yet, a growing body of research is finding that reducing recess may have a negative impact on classroom behavior and readiness to learn.
Playworks: Re-introducing play into schools. Playworks is a national nonprofit organization that works to improve the health and well-being of children by increasing opportunities for physical activity and safe, meaningful play. Playworks sends trained, full-time coaches to low-income, urban schools, where they lead organized play and physical activity during recess, guide games, provide leadership development during class times, and run tutoring and physical activity programs after school. Watch this Before and After video for a view of Playworks in action. As of the 2011–2012 school year, Playworks was bringing play to more than 300 schools in 23 cities throughout the country, serving more than 130,000 elementary students each day.
In 2005 and again in 2008, the Robert Wood Johnson Foundation (RWJF) provided funding for Playworks to expand its program into new schools. An evaluation of the Playworks program is part of RWJF's $32.5 million investment in the program.
Does Playworks work? An initial evaluation of the implementation of Playworks in eight schools in the San Francisco Bay area was conducted by Rebecca A. London, PhD, and colleagues at the John W. Gardner Center for Youth and Their Communities at Stanford University. The evaluation showed that each school had implemented the program quickly and with fidelity. "At every single school we went to, the program—with its four or five different components—was in place," said London. "I think that was incredibly validating to Playworks leadership." See the Grantee Profile of London for more information.
By 2009, it was time for a rigorous evaluation of the program and its outcomes. RWJF selected Mathematica Policy Research, Inc., a research and evaluation firm headquartered in Princeton, N.J., to conduct the evaluation.
Susanne N. James-Burdumy, PhD, associate director of research at Mathematica, heads the evaluation team. James-Burdumy has many years of experience evaluating educational interventions, a career focus that does not surprise her. Her mother was a teacher, and from the time James-Burdumy was a small child, she heard a lot "about the different issues my mom was facing at school." This background fostered her interest in how things work in education.
After several years as a consultant with Andersen Consulting post-college, James-Burdumy entered a doctoral program in economics at Johns Hopkins University. As research assistant to her thesis adviser, labor economist Robert A. Moffitt, PhD, she worked on a study conducted by Mathematica on the factors that affect entry into and exit from the food stamp program. This was her first exposure to evaluation. "I really enjoyed the nature of the work as well as its mission—looking at the effectiveness of programs and how they work and operate," she says.
After receiving her PhD in 1999, James-Burdumy joined Mathematica as a researcher and worked on a variety of projects. In time she found that she particularly enjoyed education work and began to focus in that area. At Mathematica, James-Burdumy directs the national evaluation of the U.S. Department of Education's Race to the Top and School Improvement Grants programs.
The Playworks evaluation was a good fit with her career. "The study that RWJF laid out was very relevant to the types of work we do at Mathematica," she says, "in terms of the required methods—random assignment, impact studies, and implementation studies—as well as the topic of the evaluation: a school-based intervention designed to improve student outcomes."
Designing and conducting the evaluation. RWJF evaluation staff wanted a design that included randomly assigning schools to either a treatment (i.e., Playworks) or control group. They also identified broad outcome areas for the evaluation to focus on, such as school climate and conflict resolution. James-Burdumy and her team worked closely with London and her colleagues at Stanford to develop a design for the evaluation, based on the Foundation's criteria, that would include both impact and implementation components.
They also relied on input from Playworks. "We needed to identify the particular questions that we would include in a teacher or student survey that would measure the outcomes Playworks expected to affect," James-Burdumy explains. "We laid out a set of outcome measures that we thought captured the goals and the focus of Playworks and we ran those by them. Playworks was very helpful during this process."
Mathematica is the prime contractor for the random-assignment evaluation and focuses on the impact study. The Gardner Center team focuses on how Playworks was implemented in the study schools.
The evaluators initially recruited 25 schools from five cities to participate in the evaluation. All 25 wanted to implement Playworks. Fourteen were randomly assigned to the treatment group and implemented Playworks for the 2010–2011 school year. The other 11 made up the control group, which would receive Playworks the following year. It is important to note, says Laura Leviton, PhD, RWJF's senior adviser for evaluation, "Playworks is a place-based, randomized experiment, with the schools that did not win the 'lottery' for the first year put on a wait list. This was an ethical decision because there are not enough slots to fill the demand for the program."
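The place-based lottery Leviton describes can be sketched in a few lines of Python. This is an illustrative reconstruction, not the study's actual procedure: the school names, seed, and function name are invented for the example, and the unit of randomization is the whole school, with non-winners placed on a wait list.

```python
import random

def assign_schools(schools, n_treatment, seed=2010):
    """Place-based lottery: whole schools (not individual students)
    are randomized; schools not drawn for treatment go on a wait
    list to receive the program the following year."""
    rng = random.Random(seed)       # fixed seed makes the draw reproducible
    pool = list(schools)
    rng.shuffle(pool)
    treatment = pool[:n_treatment]  # implement the program this year
    control = pool[n_treatment:]    # wait-listed control group
    return treatment, control

# Illustrative: 25 recruited schools, 14 drawn for treatment, 11 wait-listed
schools = [f"School {i:02d}" for i in range(1, 26)]
treatment, control = assign_schools(schools, n_treatment=14)
print(len(treatment), len(control))  # 14 11
```

Because every school wanted the program, the wait-list design lets the lottery "losers" serve as a fair comparison group for one year without being permanently denied the intervention.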
Recruiting schools was a challenge, according to James-Burdumy. "The first year we were recruiting, many districts and schools were facing cutbacks and budgets that were incredibly tight. Since schools were randomly assigned, they had to be prepared to pay for the Playworks program if they were assigned to the treatment group. So recruiting was tough."
Leviton adds, "We planned to recruit across two years from the start because of the economic forces James-Burdumy mentions." And in the second year, she says, Mathematica was able to add a second cohort of three treatment schools and one control school in an additional city during the 2011–2012 school year.
Program impact was addressed by a single overarching research question; three additional research questions focused on program implementation.
Data were collected through teacher and student surveys; interviews with principals, teachers, and Playworks coaches; focus groups with junior coaches (students selected to assist at recess); observations of recess; and review of administrative records.
Throughout the year, regular meetings that included staff from Mathematica, the Gardner Center, RWJF, and Playworks helped to "keep everyone abreast of where we were in the process," says James-Burdumy.
Finding positive impacts. In April 2012, the evaluation teams and RWJF released the project's first research brief, Findings from a Randomized Experiment of Playworks: Selected Results from Cohort 1. The brief notes significant, positive impacts of Playworks in the study schools, and the implementation findings were also positive.
"This is a very important study," says James-Burdumy, "because it is the first rigorous, random-assignment evaluation that has been done of Playworks. The findings showing positive impacts could be helpful for Playworks as it continues to raise funds and recruit new schools."
"The study is also important because it is one of only five randomized experiments on preventing bullying, and the others either show no effect or, in one case, small effects of the program. From teacher perceptions, Playworks' effect on preventing bullying is quite large. And the program itself is such a positive, welcome approach in schools," says Leviton.
Final steps. Data collection from the four cohort 2 schools was completed in spring 2012 and analysis of the data and preparation of reports are ongoing through the end of 2012.
The evaluators plan three more briefs, with release anticipated in spring 2013. These briefs will report on findings from the total sample of 29 schools (17 Playworks and 12 control), and will include findings based on physical-activity data collected from accelerometers (motion-tracking instruments) worn by students, from observations of recess, and from administrative records (such as daily attendance and suspensions). Topics for the briefs are likely to include physical activity outcomes, academic and learning outcomes, and implementation.
James-Burdumy notes that, while the Playworks evaluation, in many ways, reflects the design and challenges of other school-based evaluations, the large-scale use of accelerometers in a random-assignment study to measure the physical activity of elementary school students was a new experience for Mathematica. "There are lots of logistical issues to deal with," she explains. "You have to get them on the students at the beginning of the day, take them off at the end of the day. They must be placed on the students in the right way to make sure the measurements are coming through appropriately. And accelerometers are expensive—it was important to keep track of them all."
The evaluators took away important lessons from this experience. "The knowledge we developed about the use of accelerometers on a large scale is something that we will be able to apply to other studies moving ahead," says James-Burdumy.
James-Burdumy believes that the evaluation makes an important contribution to Playworks as "a rigorous, random-assignment evaluation that shows that the program has positive impacts on some outcomes. That's very powerful information for the Playworks program."
RWJF perspective. "This random-assignment evaluation is strategic for us in order to convincingly demonstrate the positive effects of Playworks," says RWJF's Leviton. "We don't even know what all of them are yet. The study is of a size and rigor that has already drawn the attention of the Department of Education's What Works Clearinghouse, which endorses evidence-based practices. Susie and her team are real pros at doing school-randomized experiments. I have never had a smoother, more informative, and professional experience with an evaluation team."