GiveWell and Deciding Who to Give To

July 5th, 2010
ea, money
Once one determines that one has a responsibility to help people who are less fortunate, and decides to do this by donating money, an important question is "where will my money be best used?" [1] One organization that tries to answer this question is GiveWell.

When you donate to one charity instead of another, you care about the relative impacts. By far the biggest source of differences in relative impact is how effective the charity's programs are. GiveWell appears to be the only organization publishing efficacy evaluations. [2] Some organizations, like Charity Navigator, attempt to rate charities based on the IRS Form 990, but there is insufficient information in that form to do more than rule out obviously ineffective charities. [3] Pretty much the only usable information on the form, and the only information Charity Navigator displays prominently, is the funding breakdown between fundraising, administration, and programs. When people use these simple statistics to decide between charities, it forces the charities to compete on a bad metric. Julia confirms that Oxfam pays a lot of attention to its program expenses as a percentage of its budget. If charities keep overhead down not because they want more money to spend on programs but because they want to look good on this statistic, they're not helping as many people as they could be. GiveWell understands that spending money on administration can make programs more effective, so this statistic is largely useless at identifying the best charities.

GiveWell's goal is to find the most effective charities, not to come up with a ranking of all charities. They make heavy use of heuristics to filter out charities that are probably not that good or that they would not be able to evaluate effectively. This means that 95% of the charities they evaluate get a 0-star rating. This is primarily because very few charities have done any kind of systematic analysis to determine whether their programs are effective. And it turns out that many programs, when carefully evaluated, are not actually effective. So if you go look at their rating page and see that some charity you think is doing good work has a rating of 0, that's likely because the charity has no good evidence to convince GiveWell that it is effective. How has it convinced you that it is effective? Why are you confident in it? Should you be? Reading about GiveWell has convinced me that we should be pushing the charities we give to either to run more studies of program efficacy or, if they already do so, to make the results public. [4]

[1] The way Julia and I originally figured out where to give our money was pretty simple: Julia did some research online and decided that Oxfam America is an effective charity. Six months or so after deciding to donate (all) our money to them, Julia started working there, and she still agrees with her initial assessment.

[2] Large foundations such as the Gates Foundation must be doing this kind of research to determine where to give, but they don't make this research public.

[3] For example, the Charity Navigator review of the America-Israel Cultural Foundation, which shows them spending only 14.4% of their budget on program expenses, does tell us that they are a very ineffective charity.

[4] A difficult aspect of this is that it is important that studies showing programs to be ineffective be made just as public as ones showing efficacy. This is sure to be unpopular with charities.

Update 2010-07-05: Jonah Sinick writes that GiveWell rewards admitting failures:

In our experience, charities are very rarely willing to share evidence of disappointing impact. We believe that any charity that does so is being unusually honest about the challenges of international aid, and unusually accountable to donors. We expect that charities capable of spotting, documenting and sharing disappointing results are better positioned to improve over time.
...
If your organization is listed on the GiveWell site and you want to improve your ranking, publish a case study of a program you ran that failed. As usual, we're not looking for marketing materials, and we won't accept "weaknesses that are really strengths" (or reports that blame failure entirely on insufficient funding/support from others). But if you share open, honest, unadulterated evidence of failure, you'll join a select group of organizations that have a GiveWell star.

Comment via: facebook
