
Foundation Giving

Foundations Embrace Evaluation but Not Transparency, Study Finds

September 20, 2016 | Read Time: 3 minutes

Foundations are embracing program evaluation — but may not interpret study findings correctly or share the results externally, according to a study released Tuesday.

“The picture I get of evaluation at foundations is a good one,” says Julia Coffman, executive director of the Center for Evaluation Innovation, which conducted the study along with the Center for Effective Philanthropy. “The integration of evaluative thinking has increased a lot. I think that the sector is on a good trajectory.”

The Benchmarking Foundation Evaluation Practices report drew on survey data from 127 people who work as the most senior evaluation or program staff members at their foundations. All the institutions give at least $10 million annually or participate in the Evaluation Roundtable, a network of grant-maker leaders from the United States and Canada.

A third of respondents said their foundations have dedicated evaluation departments. Of those, 79 percent have their own grant-making or contracting budget to assist in evaluating programs.

The authors say the study is the first of its kind. Ms. Coffman, who has been observing the field for 20 years, says she believes that in the past few years, foundations have grown significantly more interested in conducting evaluations.


Different Approaches

Evaluation staff perform different functions at different foundations: They direct most evaluation work at 45 percent of foundations, provide coaching to other employees at 21 percent, and hire third parties to do evaluation work at 14 percent.

The report is intended to prompt leaders to reflect on their own practices, not necessarily dictate how they should collect and study data, Ms. Coffman says.

“Every foundation has a different approach to strategy, culture, and the assets available,” she says. “There’s not one right way to do this.”

Regardless of how it’s done, evaluation matters for foundations because they should know whether “what they’re investing in has an impact,” Ms. Coffman says.

Particularly for grant makers that are “investing in untested or innovative approaches to social change,” she says, it’s important to ask, “Are we understanding the problem correctly? Are we approaching it in a useful way?”


But according to the study, only 20 percent of respondents believe their foundation accurately understands what it has accomplished for beneficiaries, and not quite half believe their grant maker knows how its work has affected grantee organizations. Ellie Buteau, vice president for research at the Center for Effective Philanthropy, calls those findings “concerning.”

She was pleasantly surprised, however, that 42 percent of respondents said their foundation has worked to coordinate its evaluations with other grant makers working on the same issues.

The median amount respondents said their foundations spend on evaluation is $200,000, a figure Ms. Buteau calls low, considering the large sums many of the organizations give away every year.

There’s a discrepancy between the high demand for evaluation at foundations and the time and money allocated for that work, Ms. Coffman says: “How to meet that demand effectively with the available resources is something the sector is struggling with.”

Sharing Results

Only 57 percent of respondents disseminate their evaluation findings outside of their institutions. That’s not nearly high enough, according to Ms. Coffman and Ms. Buteau. Indeed, 71 percent of respondents said that their foundations invest too little in sharing evaluation data externally, and being more transparent about evaluation results was one of the top three changes respondents hope to see over the next five years.


The Wallace Foundation makes a point of sharing its evaluation information, and “our grantees say they appreciate it,” says Edward Pauly, director of research and evaluation, noting that the foundation’s formal evaluation and research reports were downloaded nearly 680,000 times last year. “There’s a terrific appetite for learning about the lessons and evidence and results.”

In addition to simply sharing study results, it’s important to make sure grantees have the opportunity to help design evaluations that will answer their own questions as well as those of foundations, Ms. Coffman says.

Before running evaluations, the Wallace Foundation asks grantees “what they don’t know that could advance their work,” Mr. Pauly says. “We’re really finding what kind of information is most useful to the key stakeholders.”
