
Approaches to Learning Amid Crises: Reflections from Philanthropy

March 29, 2021

Early 2020 sparked an urgency for foundations to work equitably and adapt quickly, while also reflecting deeply on their roles in society and the alignment between their operations and values. The COVID-19 pandemic and the murder of George Floyd brought to the forefront long-held assumptions about how foundations should work and be held accountable. Many foundations have begun to explore significant changes to their practices. Along the way, they have also been forced to learn, reflect, and adapt in unprecedented ways. We wanted to know whether developing an organizational learning culture yields benefits in terms of better evaluation outcomes as well as stronger, more adaptive decision-making. In interviews with seven foundations from Canada and the U.S. in the summer and fall of 2020, we applied the lens of organizational learning to how each of these foundations made sense of its reality, asked different kinds of questions to inform its thinking, and acted in new or different ways as a result. In our report, Approaches to Learning Amid Crises: Reflections from Philanthropy, we lift up examples of how foundations have reacted and, specifically, what and how they are learning.

Benchmarking Foundation Evaluation Practices 2020

January 1, 2020

In 2019, the Center for Evaluation Innovation administered a benchmarking survey to collect data on evaluation and learning practices at foundations. This is an ongoing effort (previous surveys were conducted in 2015, 2012, and 2009) to understand evaluation functions and staff roles; the level of investment in and support of evaluation; the specific evaluative activities foundations engage in; the evaluative challenges foundations experience; and the use of evaluation information once it is collected. The survey was sent to 354 independent and community foundations in the US and Canada reporting at least $10M in annual giving during the previous fiscal year, and to foundations that participate in the Evaluation Roundtable network (the vast majority of which meet the annual giving criterion). This report includes survey data from 161 foundations, a 45% response rate. We conduct the survey so that foundations can compare their evaluation and learning structures and practices to those of the broader sector. The results offer a point-in-time assessment of sector practice. They do not necessarily represent "best" or even "good" practice. They do, however, offer valuable inputs on key questions, such as: How should the evaluation and learning function be staffed and resourced? What kinds of evaluative activities should be prioritized? How can evaluation and learning link to strategy?

What Will it Take for Philanthropy to Learn?

October 6, 2019

In collaboration with evaluation and learning leaders in fourteen foundations, we experimented with how to integrate learning into the way philanthropy works. Our "Lab for Learning" revealed several ideas about what it takes.

Benchmarking Foundation Evaluation Practices

September 1, 2016

For foundations, there are many questions to reflect on when thinking about which evaluation practices best align with their strategy, culture, and mission. How much should a foundation invest in evaluation? What can they do to ensure that the information they receive from evaluation is useful to them? With whom should they share what they have learned? Considering these numerous questions in light of benchmarking data about what other foundations are doing can be informative and important. Developed in partnership with the Center for Evaluation Innovation (CEI), Benchmarking Foundation Evaluation Practices is the most comprehensive data collection effort to date on evaluation practices at foundations. The report shares data points and infographics on crucial topics related to evaluation at foundations, such as evaluation staffing and structures, investment in evaluation work, and the usefulness of evaluation information. Findings in the report are based on survey responses from individuals who were either the most senior evaluation or program staff at foundations in the U.S. and Canada giving at least $10 million annually, or members of the Evaluation Roundtable, a network of foundation leaders in evaluation convened by CEI.

Evaluating Networks for Social Change: A Casebook - Part 2 of a Guide to Network Evaluation

July 30, 2014

The casebook is part 2 of a guide to network evaluation. This document provides profiles of nine evaluations that detail key questions, methodologies, implementation, and results while expanding what is known about assessment approaches that fit how networks develop and function. It was written as a complement to a framing paper, The State of Network Evaluation, which offers the field's current thinking on frameworks, approaches, and tools to address practical questions about designing and funding network evaluations.

What will I learn in the guide?

· How an evaluation can help a network function more effectively and promote network health
· Elements of a network that can be evaluated
· Approaches, methods, and tools for evaluating networks
· How to design a network evaluation that fits the network type and investment (e.g., size, stage of development, issue focus)
· Key questions to ask in a network evaluation
· Examples of network evaluations and what has been learned from them

How Shortcuts Cut Us Short: Cognitive Traps in Philanthropic Decision Making

May 1, 2014

Foundations need to build new ways of thinking and interacting that help to combat cognitive traps, support rigorous inquiry, and foster more deliberative decision making. This brief highlights several common cognitive traps that can trip up philanthropic decision making, and suggests straightforward steps that strategists, evaluators, and organizational learning staff can take to address them.

Framework for Planning, Implementing, and Evaluating PreK-3rd Grade Approaches

March 1, 2013

Co-written by Education & Learning National Advisory Committee member Kristie Kauerz, this report helps to address key questions facing those who are developing PreK-3rd grade approaches in their schools, districts, and communities.

Eyes Wide Open: Learning as Strategy Under Conditions of Complexity and Uncertainty

January 1, 2013

Foundations have important but unrealized potential to contribute value to strategy by building, supporting, and engaging in learning. While learning is important for strategic success in most circumstances, it becomes essential when foundations take on large, extraordinarily difficult, and complex concerns: improving food security in Africa, or addressing global warming, poverty, or issues of equity in difficult urban settings. This article identifies three common "traps" that hinder foundation capacity to learn and adapt: 1) linearity and certainty bias; 2) the autopilot effect; and 3) indicator blindness. The authors propose a framework to avoid these traps. Through learning from action, a truly powerful strategy – one with the potential to foster change and better outcomes – can emerge and take hold.

Evaluation for Models and Adaptive Initiatives

March 1, 2012

In this article, we promote the use of evaluation to unlock the power of both models and adaptive initiatives. Our aim is to help grantmakers understand the critical distinctions between models and adaptive initiatives, whether these represent individual grants or broader initiatives, and to identify how evaluation can be used to learn about and assess impacts at different stages of development. We do not set out to teach evaluation skills or techniques, but to empower grantmakers to strengthen the models and adaptive initiatives in their portfolios through informed, strategic choices about the kinds of evaluation that should be funded.

Digital Transitions: Nonprofit Investigative Journalism: Evaluation Report on the Center for Public Integrity

May 23, 2010

Summarizes outcomes of a one-year grant to CPI to transform itself into a leader in digital nonprofit journalism. Examines CPI's track record, use of new tools and methods, capacity as an effective and credible online presence, and areas for improvement.

Early Childhood Systems Building from a Community Perspective

April 16, 2010

Even when children and their families have access to support services from a variety of programs and organizations -- such as early learning centers, nutrition programs, and pediatric, nursing, dental and mental health care providers -- there are challenges in connecting families to these services. The result is that families often have a difficult time learning about, applying for and taking advantage of the services that could benefit their children. This Issue Brief, prepared for The Colorado Trust by Julia Coffman of the Center for Evaluation Innovation and Susan Parker of Clear Thinking Communications, explains systems building as an intentional, organized way to create or improve a system of early care and education services for children.

Evaluating Communication Campaigns

June 11, 2008

Summarizes presentations from a September 2007 conference on evaluating communication campaigns. Discusses the mechanism of effecting change through communication; the principles of advocacy evaluation; the design, methods, and tools; and lessons learned.