GiveWell and Deciding Who to Give To

July 5th, 2010
giving, money

Once one determines that one has a responsibility to help people who are less fortunate, and decides to do this by donating money, an important question is "where will my money be best used?" [1] One organization that tries to answer this question is GiveWell.

When you donate to one charity instead of another, you care about the relative impacts. By far the biggest source of differences in relative impact is how effective the charity's programs are. GiveWell appears to be the only organization publishing efficacy evaluations. [2] Some organizations, like Charity Navigator, make an attempt to rate charities based on the IRS Form 990, but there is insufficient information in that form to do more than rule out obviously ineffective charities. [3] Pretty much the only usable information on the form, and the only information that Charity Navigator displays prominently, is the funding breakdown between fund-raising, administration, and programs. When people use these simple statistics to decide between charities, it forces the charities to compete on a bad metric. Julia confirms that Oxfam pays a lot of attention to the program-expenses-as-a-percentage-of-budget statistic. If they're intentionally keeping overhead down not because they want to have more money to spend on programs but because they want to look good on this statistic, they're not helping as many people as they could be. GiveWell understands that spending money on administration can make programs more effective, and so this statistic is largely useless at identifying the best charities.

GiveWell's goal is to find the most effective charities, not to come up with a ranking of all charities. They make heavy use of heuristics in filtering out charities that are probably not that good or that they would not be able to evaluate effectively. This means that 95% of the charities they evaluate get a 0-star rating. This is primarily because very few charities have done any kind of systematic analysis to determine if their programs are effective. And it turns out that many programs, when carefully evaluated, are not actually effective. So if you go look at their rating page and see that some charity you think is doing good work has a rating of 0, that's likely because they have no good evidence to convince GiveWell that they are effective. How have they convinced you that they are effective? Why are you confident in them? Should you be? Reading about GiveWell has convinced me that we should be pushing the charities to which we give to either run more studies of program efficacy or, if they already do so, make the results public. [4]

[1] The way Julia and I originally figured out where to give our money was pretty simple: Julia did some research online and decided that Oxfam America is an effective charity. Six months or so after deciding to donate (all) our money to them, Julia started working there, and she still agrees with her initial assessment.

[2] Large foundations such as the Gates Foundation must be doing this kind of research to determine where to give, but they don't make this research public.

[3] For example, the Charity Navigator review of the America-Israel Cultural Foundation, which shows them spending only 14.4% of their budget on program expenses, does tell us that they are a very ineffective charity.

[4] A difficult aspect of this is that it is important that studies showing programs to be ineffective be made just as public as ones showing efficacy. This is sure to be unpopular with charities.

Update 2010-07-05: Jonah Sinick writes that GiveWell rewards admitting failures:

In our experience, charities are very rarely willing to share evidence of disappointing impact. We believe that any charity that does so is being unusually honest about the challenges of international aid, and unusually accountable to donors. We expect that charities capable of spotting, documenting and sharing disappointing results are better positioned to improve over time.
...
If your organization is listed on the GiveWell site and you want to improve your ranking, publish a case study of a program you ran that failed. As usual, we're not looking for marketing materials, and we won't accept "weaknesses that are really strengths" (or reports that blame failure entirely on insufficient funding/support from others). But if you share open, honest, unadulterated evidence of failure, you'll join a select group of organizations that have a GiveWell star.

Comment via: facebook
