September 2006

Grant Results

SUMMARY

In 2002, the OMG Center for Collaborative Learning, Philadelphia, convened representatives of foundations to review the role of evaluation in philanthropy.

As foundation grantmaking has evolved from charity to strategic philanthropy, the role of evaluation within foundations has grown substantially. However, foundation evaluation directors perceive that evaluation is not yet yielding as much as it might. The project continued previous discussions among foundation evaluation directors (ID# 037391).

Researchers interviewed 54 officers and executives from 19 foundations and prepared a "Briefing Note," which served as the basis for discussion at an Evaluation Roundtable, held April 25–26, 2002, and attended by 14 representatives from six of the funding foundations.

Project staff members also developed case studies of evaluation practices in four foundation initiatives. Two of the case studies provided the basis for teaching and discussion at the second Evaluation Roundtable, which took place on July 18, 2002, with 48 people attending, including representatives of 24 foundations.

Key Findings and Recommendations
The "Briefing Note" included numerous findings and recommendations, including:

  • Within foundations, the directive for evaluation is tenuous, fleeting and shape-shifting.
  • Foundations take on accountability in many ways, but staff members are ambivalent and perhaps confused by the concept.
  • Performance information is emerging but limited, collected inconsistently and not strategically focused.
  • Unstated organizational dynamics play a significant role in the relationship between evaluation and program staff, with each competing for resources and operating with different incentives.

Funding
The Robert Wood Johnson Foundation (RWJF) provided $125,090 in partial support of this effort between March 2001 and October 2002.



THE PROBLEM

As foundation grantmaking has evolved from charity to strategic philanthropy, the role of evaluation within foundations has grown substantially. However, foundation evaluation directors perceive that evaluation is not yet yielding as much as it might. In a 1998 study, evaluation directors from 21 large foundations documented a perceived gap between what is expected of evaluations conducted by and for foundations and what they actually achieve.

In 2000, RWJF cofunded an Evaluation Roundtable (ID# 037391) with the W.K. Kellogg Foundation. At the roundtable, evaluation directors from 13 foundations recommended that foundation leadership participate in a discussion about how evaluation can enhance and assure the value of philanthropy.



THE PROJECT

RWJF provided partial support to the OMG Center for Collaborative Learning for continued discussion among foundation leaders aimed at advancing the debate over the appropriate roles for, and uses of, evaluation within foundations. Six other foundations also supported the project (see Appendix 1 for a list).

In preparation for the first meeting convening foundation evaluation directors and program leadership, researchers interviewed 54 representatives of 19 foundations (program officers, evaluation staff and executives) about evaluation practices and the ways in which program and evaluation interact. From these interviews, they prepared a 19-page "Briefing Note" for discussion at the meeting. (See Findings.)

The project's initial Evaluation Roundtable took place April 25–26, 2002, at the James Irvine Foundation in San Francisco; 14 evaluation directors and program heads from six of the funding foundations attended. To inform further discussion, the project staff prepared case studies of evaluation practices in four foundations' projects. These were:

  • Arts Participation Program, Wallace – Reader's Digest Funds. The case examined an evaluation process focused on the funds' arts participation grantmaking: a two-part study, commissioned by the foundation, that investigated the assumptions held by both staff and grantees about enhancing audience participation in the arts. The case study asked whether the activities under study could be considered true evaluation.
  • Central Valley Partnership, James Irvine Foundation. The partnership supported the efforts of neighborhood and regional organizations to improve the quality of life in their communities in California's Central Valley. The foundation's evaluation director and an outside consulting firm conducted a two-phase, in-depth evaluation. The case study raised questions about the appropriate timing of evaluation, whether to evaluate the whole partnership or its parts, and the costs and benefits of evaluation at different levels.
  • Home Visitation, David and Lucile Packard Foundation. A decade-long, evaluation-focused strategy examined home visitation as an approach to delivering a child-development service. To gauge the effectiveness of the approach in one of its efforts, Parents as Teachers, the foundation included a randomized trial (in which participants are randomly assigned to either an experimental group receiving the intervention or a control group not receiving it). The randomized trial results were consistently less positive than results from Parents as Teachers evaluations that did not use random assignment (quasi-experimental designs, in which an experimental group is compared with a nonrandomized comparison group). The discrepancy sparked debate within the field and the policy community.
  • Fighting Back® Initiative, RWJF. This multi-site program employed community-generated strategies and coalitions to reduce the use and abuse of alcohol and illegal drugs. The initiative incorporated a lengthy national evaluation process that surfaced numerous complex issues and differences between evaluators and program staff over the evaluation design and findings. The evaluators concluded that the initiative did not produce significant results; as RWJF program officer Laura Leviton pointed out, the situation illustrated "an all too familiar but central problem for evaluation: what to do with a no-effect conclusion … decision-makers can legitimately ask whether the flaw lies with the original theory … implementation … or the ability of evaluation measurement to sensitively detect relevant change …"

The second project Evaluation Roundtable met at RWJF in Princeton, N.J., on July 18, 2002, with 48 participants representing 24 foundations (see Appendix 2 for a list of the foundations). Two case studies provided the focus of the meeting: Home Visitation from the Packard Foundation and the Central Valley Partnership from the Irvine Foundation.



FINDINGS

The principal investigator, Patricia Patrizi, reported a number of findings in the "Briefing Note," including:

  • "The state of foundation evaluation is not a very happy one — either on the program side or the evaluation side."
  • Within foundations, the directive for evaluation is tenuous, fleeting and shape-shifting. Organizational issues add to confusion about evaluation's purposes and are compounded in light of the inherently difficult nature of giving and receiving potentially critical feedback.
  • Although the concept of accountability carries psychological weight within the philanthropic enterprise, and foundations take on accountability in many ways, staff members are ambivalent and perhaps confused by the concept. In most cases foundation staff members puzzle over accountability to whom and for what end.
  • Performance information is emerging but limited, collected inconsistently, and not strategically focused. Efforts tend to center on foundation processes and grantee outcomes, leaving the linkages between the two undetermined.
  • The organizational learning agenda has failed to link to foundation mission and to the needs of program staff. The relationship between learning and accountability has assumed a polarized tenor, and learning and accountability are sometimes seen as competing functions. This is particularly true when the same personnel, often the evaluation staff, perform both functions; program staff members find it difficult to discern when an evaluator is functioning in the learning mode or the accountability mode.
  • Unstated organizational dynamics play a significant role in the relationship between evaluation and program staff, with each competing for resources and operating with different incentives. Program staff members tend to align more closely with grantees while evaluation staff members are usually more aligned with institutional interests; the dyadic and triadic relationships that emerge can be filled with conflict.

Conclusions and Recommendations

Patrizi drew numerous conclusions and recommendations in the "Briefing Note." Among her recommendations is broadening the concept of foundation effectiveness to include, more explicitly, effects upon target populations, programs and fields. She writes that this would "naturally expand the boundary of the philanthropic endeavor to incorporate the concerns and challenges of the fields [it] supports, thereby providing grounding in the reality of externally validated knowledge needs." This, she suggests, would help dissolve the triangular conflicts that arise within foundations by providing a "bigger-than-ourselves" audience: the field itself.

Communications

The journal Evaluation and Program Planning will publish the Irvine Foundation case study. Project staff disseminated the "Briefing Note" to a total of 70 foundations.



LESSONS LEARNED

  1. Case studies are an excellent way for foundation staff members to learn about evaluation issues, problems and conflicts. The case teaching experience at the July 2002 meeting was highly successful. (Project Director)



AFTER THE GRANT

Teaching of the cases will take place at the Council on Foundations in April 2003, at a Knight Foundation meeting in December 2003, and in university evaluation programs, among other venues. Grantmakers for Effective Organizations is carrying the work forward with the project director as consultant.



GRANT DETAILS & CONTACT INFORMATION

Project

Reviewing the Role of Evaluation in Philanthropy

Grantee

OMG Center for Collaborative Learning (Philadelphia, PA)

  • Amount: $125,090
    Dates: March 2001 to October 2002
    ID#: 041237

Contact

Patricia Patrizi
(215) 572-1647
patti@patriziassociates.com



APPENDICES


Appendix 1

(Current as of the time of the grant; provided by the grantee organization; not verified by RWJF.)

Additional Foundations Providing Support

  • David and Lucile Packard Foundation, $150,000
  • James Irvine Foundation, $50,000
  • John S. and James L. Knight Foundation, $49,990
  • Wallace – Reader's Digest Funds, $41,400
  • Edna McConnell Clark Foundation, $30,000
  • W.K. Kellogg Foundation, $25,000


Appendix 2

(Current as of the time of the grant; provided by the grantee organization; not verified by RWJF.)

Foundations with Representatives Attending the Evaluation Roundtable, July 18, 2002

Atlantic Philanthropies
California Endowment
Annie E. Casey Foundation
The Edna McConnell Clark Foundation
Daniels Fund
Doris Duke Charitable Foundation
Fannie Mae Foundation
Heinz Endowments
Hogg Foundation for Mental Health
James Irvine Foundation
Robert Wood Johnson Foundation
Ewing Marion Kauffman Foundation
W.K. Kellogg Foundation
John S. and James L. Knight Foundation
Lumina Foundation for Education
Northwest Area Foundation
David and Lucile Packard Foundation
William Penn Foundation
Pew Charitable Trusts
Robin Hood Foundation
Charles & Helen Schwab Foundation
Wallace – Reader's Digest Funds
Woods Fund of Chicago



BIBLIOGRAPHY

(Current as of date of this report; as provided by grantee organization; not verified by RWJF; items not available from RWJF.)

Book Chapters

Campbell MS, Patton MQ and Patrizi P. "Evaluation of the Central Valley Partnership of the James Irvine Foundation." In New Directions for Evaluation, 105, Patton MQ and Patrizi P (eds.), 2005.

Reports

Patton MQ. "The Central Valley Partnership: A Case Study of Evaluation at the James Irvine Foundation." Unpublished case study. Available from the project director.

Patrizi P. "Briefing Note." Unpublished. Available from the project director.

Segal A. "The Arts Participation Program: A Foundation's Effort to Establish an Effective Evaluation Strategy for a New Field to Support a Large Return on Investment." Unpublished case study. Available from the project director.

Sherwood KE. "A Case Study of Evaluation at the David and Lucile Packard Foundation." Unpublished case study. Available from the project director.

Sherwood KE. "Evaluation of the Fighting Back Initiative." Unpublished case study. Available from the project director.

Sponsored Conferences

"Evaluation Roundtable," April 25–26, 2002, San Francisco. Attended by 14 evaluation and program officers and executives from six foundations: David and Lucile Packard Foundation, Edna McConnell Clark Foundation, James Irvine Foundation, John S. and James L. Knight Foundation, Wallace – Reader's Digest Funds and The Robert Wood Johnson Foundation. Group and general discussion.

"Evaluation Roundtable." July 18, 2002, Princeton, N.J. Attended by 48 evaluation and program officers and executives from 24 foundations, including those represented at the April 2002 meeting as well as others such as Pew Charitable Trusts, Charles & Helen Schwab Foundation, Ewing Marion Kauffman Foundation, Robin Hood Foundation, Annie E. Casey Foundation and Doris Duke Charitable Foundation. Teaching and discussion of two case studies.



Report prepared by: Mary B. Geisz
Reviewed by: Janet Spencer King
Reviewed by: Molly McKaughan
Program Officer: Laura C. Leviton
