"If You Have Something People Need ... Give it Away"

March 24, 2010

The Problem. Without active dissemination, most research-tested innovations are never widely used, according to Karen Glanz, PhD, MPH, of the University of Pennsylvania. She and her colleagues were determined that the tool they had created to measure the healthfulness of food choices in restaurants and food stores would not meet the same fate. Collaborators include James Sallis, PhD (San Diego State University), Brian Saelens, PhD (Seattle Children's Hospital Research Institute and the University of Washington), and Lawrence Frank, PhD (University of British Columbia).

The Project. With a grant from the Robert Wood Johnson Foundation (RWJF), Glanz and a research team at Emory University (where Glanz was affiliated at the time) created and tested the Nutrition Environments Measures Survey (NEMS)—a promising new tool that researchers and others could use to record the availability, price and quality of healthful foods in restaurants and food stores. After presenting the tool at several scientific meetings, the team used two additional grants from RWJF to disseminate the tool and train people to use it.

At two-day intensive training sessions—held in Atlanta and other sites around the country—researchers, public health professionals, nutritionists and others received hands-on practice in using the tool in the community. The Emory team provided the training free of charge. Participants were asked to cover their own travel and accommodations, but the training program provided travel scholarships for those who could not afford to.

The team also offered a train-the-trainer workshop so that researchers and others could pass on what they had learned to their colleagues.

Training on the Tool. From 2006 through 2009, some 385 people from 40 states and eight foreign countries received training in using the measures. All training participants went home with a manual of NEMS documents and a CD-ROM of all NEMS materials saved in modifiable formats.

Many of the trainees soon started using the measures in their work. A follow-up survey of 129 people who participated in the training sessions between March 2006 and January 2008 found that two-thirds had used the tool and a third had trained close to 300 others to use it. Twenty-one projects had collected data using the measures. Of these, 10 teams had final results available, and four teams had published NEMS assessments in peer-reviewed journals.

Modifying the NEMS tool to make it more useful. While offering hands-on training was one strategy designed to promote wide uptake of the NEMS tools, the Emory team determined it would not be sufficient. So it decided to grant permission to other researchers to adapt the NEMS tools for their own settings, reasoning that this would broaden the tools' appeal and utility.

For example, researchers at the University of Pennsylvania adapted the NEMS tool to study changes to the federal Women, Infants and Children (WIC) nutrition program for low-income women and children. In October 2009, WIC had begun issuing vouchers to participants to purchase fresh fruit and vegetables, as well as whole grain and low-fat options—"one of the biggest changes in the access that low-income families have to healthy food in a number of years," said Amy Hillier, PhD, assistant professor in the department of city and regional planning.

Hillier and her team modified the NEMS food list to match the new WIC food list and visited two low-income Philadelphia neighborhoods to see if the foods were available in local grocery and food stores. The store raters found that 79 percent of stores had no fresh fruit at all, 46 percent had no fresh vegetables at all, and 50 percent had no fresh vegetables of acceptable quality.

"It was great that we could modify [the NEMS tool] with some support," Hillier said. "[The Emory researchers] were really clear that they wanted to see people use this and use this well. They told us, 'We're not just going to do a fly-by training. We're going to see you through this.'"

Encouraging researchers to adapt a tool to fit their situation is not routine practice, Glanz said, "but if you don't allow that, you really restrict its usability." The Emory team worked with researchers as they adapted the tool to ensure that its reliability and validity, established through rigorous testing, were not compromised. The team also got researchers' permission to share any modifications with others who had received training in using the tool.

Using feedback to make improvements in the tool. The Emory team also paid close attention to user experience and feedback. When some users requested NEMS forms that could be scored manually rather than entered into a computer, the Emory team created them. The team also offered trainees more opportunities to practice rating stores and restaurants in the field, changed the NEMS measures from PDF format to a format that could be modified, and provided more information about how to analyze data.

In response to another change requested by users, a team of researchers led by Stephen Matthews, PhD, at Pennsylvania State University developed an integrated PDA-GPS version of the NEMS tool for food stores. The new version allows users to enter data directly into a PDA, for easy upload to a computer.

The power of flexibility. Glanz believes the broad uptake observed with the NEMS tool can be achieved by researchers in other content areas. What is required, she said, is flexibility in responding to users' needs, a strong commitment to bridging the gap between research and practice, and adequate funding for dissemination.

Wide uptake of the NEMS tool came, in part, because of "RWJF's willingness to stick with us a bit longer in terms of funding," Glanz said. "But it was also my colleagues and their attitude, which is, 'If you have something that people need, let's give it away.'"

In January 2010, the NEMS team launched an online version of the NEMS training program, with the goal of expanding its reach to even more researchers and communities.

For more information on the project, see the related Program Results Report.