I wrote before about a survey that tries to estimate the effect of
seeing Facebook ads for the Hidden Face of Food on people's eating
habits. I'm following up to look more at sample bias: how
the people taking the survey might be different from the overall
group we're trying to learn about.
Ideally you could sample from the whole population
of people seeing the ads, but instead you can only email people who
left their address in ordering Vegetarian Starter Packs (VSPs) and
advertise to people that 'like'd the page. Depending on how these
emails and ads were set up, you could have quite a bit of sample bias:
imagine if the ads said "Given up eating meat? Let us know why!" To
find out how these requests to take surveys were set up, I wrote to
the author. I'm going to go through this separately for the people
they contacted through email and the people they brought back in with
Facebook ads.
The survey says 7% of page visitors clicked 'like'. To get these
users to fill out a survey, they ran ads set to display only to the
users who had clicked 'like'. The author tells me these were a
"picture of a movie ticket, did not mention group or veg stuff, said
something to the effect of 'Want free movie tickets? Fill out this one
minute survey and you have a one in ten chance of winning free movie
tickets.'" They "did not have to mention the site at all because it
was only shown to those who had 'liked' the site." This sounds good:
the people brought in are biased in that they have free time and want
movie tickets, but I don't think that's a big problem. One thing that
does worry me is that Facebook may have added text to the ads about
them showing up "because you like hidden face of food" or something
similar. Does anyone know whether ads shown because you 'like'
something do that?
Some people, after seeing the survey, may have been embarrassed not
to have changed their behavior and so not filled it out. How many?
When I asked, the author wrote "aside from the 44 that filled out the
survey, only 1 did not complete it". That sounds good, but it's
suspiciously high. Then again, a one in ten chance of tickets is
pretty good, so it could well be that people just wanted their
tickets.
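For concreteness, here's the completion rate those figures imply (a quick sketch; the 44 and 1 are the author's numbers quoted above):

```python
# Completion rate among people who opened the survey:
# the author reports 44 completed it and only 1 did not.
completed = 44
abandoned = 1
started = completed + abandoned
completion_rate = completed / started
print(f"{completion_rate:.1%} completion")  # about 97.8%
```

A 97.8% completion rate is very high for an online survey, which is what makes it suspicious even though a strong incentive could explain it.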
As I wrote last time, 1.5% of the visitors to the site ordered a VSP.
The author tells me "roughly 1/3rd of those who ordered a VSK left
their email, we emailed all who ordered (within a selected time
period), and about 10% replied."  The people who left their email
did so in a field labeled "To receive our E-newsletter, enter
your email (optional)", so these were likely the more
enthusiastic ones. I don't know what these emails looked like or what
fraction failed to complete the survey after clicking through to the
survey page, but if these are like the Facebook ads then they're
probably not an issue.
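Putting the email-channel figures together (a rough sketch; the visitor count is made up purely for illustration, while the percentages are the ones quoted above):

```python
# Funnel for the VSP/email channel: 1.5% of visitors ordered a VSP,
# roughly 1/3 of orderers left an email address, everyone who left
# one was emailed, and about 10% of those emailed replied.
visitors = 100_000              # hypothetical count, for illustration only
ordered = visitors * 0.015      # VSP orders
left_email = ordered / 3        # addresses collected
replied = left_email * 0.10     # survey responses
print(f"responses per visitor: {replied / visitors:.3%}")
```

So only around 0.05% of visitors end up represented through this channel, each step selecting for more enthusiasm than the last, which is part of why the Facebook-ad group carries most of the weight.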
For the most part, sample bias appears not to be a problem beyond the
low rates for liking and VSP-requesting that Alan and I already talked
about. I trust the VSP numbers less, because I know less about them
and because they come only from people who left their email address,
but the Facebook numbers are the larger fraction and seem quite good.
This is roughly consistent with what I calculated last time: that a
"10% response rate indicates they sent surveys to 39% of their
VSPers".