Computers and Prejudice

April 6th, 2010
ism, nlp
Consider the following fake snippet of news reporting:

Sarah Jones reporting for News Station X. The CEO of Big Powerful Company announced today that BPC was on track to meet its earnings targets for the quarter. She went on to attribute the continuing success of BPC to the quality of its employees.

Imagine I were to tell you that I interpreted the "she" as referring back to "sarah jones" instead of "the ceo of big powerful company". What if I said that this was because I think of "ceo" as a male-gendered word, so I interpret "she" as more likely to refer to "sarah jones"? You'd say I was being sexist, no? The assumption that a ceo is male is a sexist assumption. I'd agree with you.

Now imagine I am working on a computer program that is supposed to learn how to connect pronouns (and other words) back to their antecedents. (I am, in fact, working on such a program.) Imagine it learns, from lots and lots of data, that ceos are much more likely to be male than female. So it makes the mistake I described above, identifying "she" as the reporter instead of the ceo. Is the program being sexist? Were the programmers sexist in designing the program with the capacity to learn the gender of words? Is there sexism here, aside from that which results in there being more male ceos than female?
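To make the mechanism concrete, here is a toy sketch (not the actual system, and with made-up counts) of how a resolver that learns gender statistics from data ends up making exactly this mistake. The `corpus_counts` table and the candidate heads are hypothetical, standing in for whatever the real program would estimate from its training corpus:

```python
# Toy pronoun resolver: learned gender statistics drive antecedent choice.
# Hypothetical counts of how often each head noun was observed
# coreferring with "he" vs. "she" in some training corpus.
corpus_counts = {
    "ceo": (90, 10),       # skewed data: most observed ceos were male
    "reporter": (50, 50),  # roughly balanced
}

def p_female(head):
    """Estimated probability that a mention with this head is female."""
    he, she = corpus_counts.get(head, (1, 1))  # uniform prior for unseen heads
    return she / (he + she)

def resolve(pronoun, candidate_heads):
    """Pick the candidate whose learned gender best matches the pronoun."""
    if pronoun == "she":
        return max(candidate_heads, key=p_female)
    return max(candidate_heads, key=lambda h: 1 - p_female(h))

# The snippet above: "she" could be the reporter or the ceo.
print(resolve("she", ["reporter", "ceo"]))  # picks "reporter" -- the mistake
```

Nothing in the code mentions sexism; the skewed behavior comes entirely from the skewed counts it was trained on.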

