July 10th, 2013
Earlier studies of leafletting had several problems:
- They're too small: the best existing one had only 44 respondents.
- They have sampling bias: the surveys can't capture people who went back to eating animals.
- They don't have control groups.
Both studies are looking at distributing leaflets on college campuses. The first study, run by the Humane League, is trying to measure the immediate effect of receiving a booklet promoting vegetarian eating. Have a look at the survey form.
There are several good things about this survey:
- Control group: some people can get a booklet that doesn't advocate vegetarianism.
- The questions about diet are numeric and aren't pushing towards one end or the other.
- This sample size can be pretty large without too much cost.
There are also some problems:
- It only measures what people immediately say they will do, not what they actually do, or even what they say in a follow-up a month later.
- It's clear that the experimenters are interested in promoting vegetarianism, and participants might overstate their decrease in meat eating.
- Many people are going to look ahead and see the second set of questions before stopping and reading the leaflet.
Again, there are some really good things about the second study's survey:
- The people being surveyed won't know that the surveyors care about animals and want to promote veg eating. (There are many places where people could be tipped off, but I think the design now deals with all of them.)
- They're planning to also survey at a school where they didn't hand out leaflets.
Even if they could perfectly identify which people had taken a leaflet and use that when comparing survey responses, the right control group is actually "people who were asked to take a leaflet". If they later find out that people who took the leaflets are more vegetarian than people who didn't, one possibility is that vegetarians and future vegetarians were more likely to take the flyers. A way to fix this control group issue would be to use places instead of memory. Hand out veg leaflets to people leaving some randomly chosen classes early in the semester, and random leaflets outside other classes, then survey both sets of classes later in the semester.
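The place-based design above can be sketched in a few lines: randomize at the level of classes rather than individual memory, then compare survey responses by which group a student's class was *assigned* to. This is a minimal illustration, not the studies' actual procedure; the class names and survey numbers are made up.

```python
import random

# Hypothetical class list; in practice this would be the real course schedule.
classes = ["bio101", "econ201", "hist150", "chem110", "phil220", "math140"]

rng = random.Random(42)  # fixed seed so the assignment is reproducible

# Shuffle and split: half the classes get the veg leaflet outside the door,
# the other half get a control leaflet on an unrelated topic.
shuffled = classes[:]
rng.shuffle(shuffled)
half = len(shuffled) // 2
assignment = {c: "veg" for c in shuffled[:half]}
assignment.update({c: "control" for c in shuffled[half:]})

def group_mean(responses, assignment, group):
    """Mean reported meat servings/week among students whose class was
    assigned to `group`, regardless of whether they remember a leaflet."""
    vals = [r["servings"] for r in responses if assignment[r["class"]] == group]
    return sum(vals) / len(vals)

# Toy survey data (invented) to show the comparison later in the semester.
responses = [
    {"class": shuffled[0], "servings": 5},
    {"class": shuffled[1], "servings": 7},
    {"class": shuffled[3], "servings": 6},
    {"class": shuffled[4], "servings": 8},
]
effect = group_mean(responses, assignment, "veg") - group_mean(responses, assignment, "control")
```

Because the comparison is by assigned class and not by who remembers taking a flyer, vegetarians being more willing to accept a veg leaflet can't contaminate the estimate.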
Overall, I'm glad to see new work being done here. This is much better than earlier research which just went by the number of people leafletters estimated they were converting. But I'm worried that when the surveys come back they're going to have enough problems that we still won't know very much about the efficacy of leafletting on college campuses.
Update 2013-07-24: EAA has published an updated methodology that resolves some of these issues. They've simplified the study to compare a flyer advocating veg eating to one against puppy mills. The main remaining issue, and it's still a big one, is that we can only compare the people who remember getting flyers.