Responsible Transparency Consumption

October 2nd, 2016
ea, transparency
People don't usually volunteer details about why they decided to do something, how they did it, or how it turned out, unless they have another goal in mind. You see people and organizations writing about cases where they've done better than expected, in the hope others will think better of them. You see writing that explains already-public cases of failure, casting it in a more positive light. You see people writing in the hope they'll be seen as an expert, to build up a reputation.

Additionally, while most real decisions are made in people's heads as the output of a complicated process no one really understands, if you look at decision writeups you'll typically see something easy to follow that contains only respectable considerations and generally reflects well on whoever is publishing it. If you're trying to learn from others, or evaluate them, this isn't much to go on.

Efforts to change this generally go under the banner of "transparency", and transparency is an important value within the effective altruism (EA) movement, especially for EA organizations. GiveWell is particularly known for putting it into practice, but most EA organizations value transparency and prioritize it to some extent. Many individual EAs do as well; for example, as someone earning to give I keep a donations page and have posted several spending updates.

This puts members of the EA movement in the position of consumers of transparency: people and organizations are releasing information because it benefits the broader community. This is information they could easily keep to themselves, since as a practical matter everything is private by default and takes effort to make available. Writing a report means taking an enormous amount of detail and deciding what to communicate, which makes it very easy, through selective inclusion, to hide mistakes and present yourself or your organization in an artificially positive light. And not even intentionally! It's very easy to subconsciously shy away from writing things that might make you look bad, or might reflect badly on people you generally think highly of.

So imagine an organization voluntarily reports some kind of failure, something people outside the organization wouldn't otherwise have known about. If people react critically and harshly to this disclosure, the organization becomes much less willing to be so transparent in the future. And not just this organization: others will also see that sharing negative information doesn't go well.

When people react negatively to an organization sharing an instance of failure, one thing they're doing is putting pressure on the norm that organizations should share this sort of thing. If the norm is very strong, that pressure won't keep people from sharing similar things in the future, and it also means that seeing a failure from this organization but not from others is informative. On the other hand, if the norm is weaker, we need to be careful to nourish it, not push it harder than it can stand.

