Wednesday, 25 February 2009

Invading Poll-land

Did you see the BBC's report on how, contrary to what we all thought, the UK actually loves religious values and wants more religious influence on our lives? Now I should start out by saying, for the record, I'm not a big fan of religion and don't think people should have a say in law-making, or how I live my life, simply because they feel more comfortable living by the moral standards of, say, an Iron-Age Middle-Eastern society. I fully understand why people would feel more comfortable with the simple black-and-white morals of an ancient and distant society, which save one from dealing with the scary complexities of a modern pluralist society, but then I also fully understand why other historical re-enactors like to dress up as Vikings at weekends.

Anyway, as I wasn't sure I believed the BBC's conclusions - which claimed the majority of Britons wanted much more religious influence over their lives - I thought I'd delve into the source of the data a little, to see if they would persuade me. It turns out the survey was conducted by a polling organization called Comres. Comres, on their website, boast all sorts of big-name clients and proudly declare they are a member of the British Polling Council and the Association for Qualitative Research. Sounds like a group of researchers who know what they're doing.

Looking at their portfolio of 'Social polls' is interesting. Most of their recent 'social polls' have been about religion, and all these were conducted at the behest of Christian-interest groups. Hmm. Why might Christian-interest groups give so much business to Comres? I wondered.

I then went back and looked at the results of the Comres/BBC poll [PDF link]. Gosh, what detailed analysis! The data are there, broken down in minute detail by gender, age, social grade and region. Big pages full of scary numbers: this looks like a thoroughly rigorous and scientific study!

But let's look at these numbers in a little more detail. Down on page 3 we can see how the sample of 1045 people breaks down into religious groups. Of the 1045 people surveyed, it turns out 639 were Christians and 279 were of no religion. Now this immediately set alarm bells ringing. To show you why, let me digress slightly into some introductory sampling theory...

In human research, an early step is to identify the population of interest. This is the group about which you want to reach a conclusion. For example, if you want to learn something about the opinions of all the people in the UK, your population is 'all the people in the UK'. In an ideal world you would then conduct a census, whereby you speak to every member of this population. At the end of such research you know exactly what its opinions are.

Of course, when you're dealing with really large populations - like the population of the UK - conducting a census becomes logistically difficult, so you instead use a sample. A sample is a subset of your population which you hope will behave exactly like the population whilst being of a manageable size: the population in miniature, a microcosm of it. If it does, that's great: you've learnt something about a big population from studying a convenient number of people. But if your sample is in some way biased, or behaves differently to the population as a whole, you will reach false conclusions about the population. The only information you have about the population comes from your sample, so every effort must be made to ensure that sample isn't biased in some way. Ideally this is done by keeping the sample large, and by using methods such as random sampling to choose the people included.
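If you want to see why random sampling is the safeguard here, a quick simulation makes the point. All the numbers below are hypothetical - a made-up population of a million people, 10% of whom hold some opinion X:

```python
import random

random.seed(1)

# Hypothetical population of 1,000,000 people, 10% of whom hold opinion X.
population = [1] * 100_000 + [0] * 900_000

# A simple random sample of 1,000 people tracks the population closely...
srs = random.sample(population, 1000)
print(sum(srs) / len(srs))  # close to 0.10

# ...but a biased sample - say, recruiting only at a site where opinion X
# is common - can be wildly off, no matter how many people you ask.
biased_pool = [1] * 100_000 + [0] * 50_000  # recruitment site over-represents X
biased = random.sample(biased_pool, 1000)
print(sum(biased) / len(biased))  # around two-thirds, nothing like the population
```

The biased sample's error doesn't shrink as the sample grows, which is why sample size alone is no defence against a skewed recruitment method.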

And this is what first worries me with Comres's sample. This survey is being used to represent the views of the UK population (it certainly is in the BBC article). For it to have any validity, then, the sample has to be a smaller version of the UK population - it has to look just like the UK population, in miniature, or else we can't meaningfully generalize from it. But here's the thing: the UK population isn't 61% Christian and 27% non-religious (I'm ignoring Comres's 'weighted' numbers as they haven't bothered to report what they were weighted by). Nor is the population of the UK 2% Muslim, and nor does it have exactly 10 times more Muslims than Jews. This sample is clearly biased. With the majority self-identifying as Christians, whatever the sample 'says' is simply going to represent Christian views (at least to the extent Christians all agree with one another on things). Strange - you'd expect a member of the British Polling Council to be a bit more careful than that.
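To make that sort of comparison concrete, here's how you'd formally check whether a sample's religious breakdown could plausibly have come from the population: a chi-squared goodness-of-fit test against census-based targets. The observed counts are the unweighted ones from the report (the third group is just the remainder of the 1045); the census proportions are illustrative round numbers of my own, not figures from Comres:

```python
# Sketch: a chi-squared goodness-of-fit test by hand, comparing observed
# sample counts with the counts that census-style population proportions
# would predict. The proportions below are illustrative, not from the report.
observed = {"Christian": 639, "No religion": 279, "Other/unstated": 127}
census_props = {"Christian": 0.72, "No religion": 0.15, "Other/unstated": 0.13}

n = sum(observed.values())  # 1045
chi2 = 0.0
for group, obs in observed.items():
    expected = census_props[group] * n
    chi2 += (obs - expected) ** 2 / expected

print(f"chi-squared = {chi2:.1f} on {len(observed) - 1} degrees of freedom")
# The 5% critical value for 2 degrees of freedom is about 5.99; a statistic
# far above that says the sample's breakdown is very unlikely to have arisen
# by chance from a population with those proportions.
```

Of course, the test is only as good as the population targets you feed it, which is exactly why the survey's definitions of 'Christian' and 'non-religious' matter so much.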

Where did this sample bias come from? Critically, we cannot know. Any sort of proper scientific report would contain full details of how the sample was recruited, so we could read the report fully informed and judge its findings according to the strengths or weaknesses of its methodology. But Comres's report doesn't bother to say how the sample was recruited. Given the massive skew towards representing Christians, I'm tempted to suspect they did a lot of their polling outside churches, at religious group meetings, or something similar. But I don't know, because they don't tell us. Nor do we know how they defined groups like 'Christian', 'non-religious' and so on. This lack of detail really matters: you're going to see very different results if you define 'Christianity' as 'I actively go to church at least once a week, am born-again and believe Jesus Christ is my personal saviour' or if you define it to include all those people who say they're Church of England as a sort of 'default' option because they don't feel very strongly one way or the other (like my mother), or who choose that option because they feel 'spiritual, like there must be something bigger' and so won't choose the non-religious tag. In this case I suspect Christianity was defined somewhat like the first of these options, and the wishy-washy undecided made up the non-religious group. But again, I can't tell because these crucial details aren't reported.

Moving on from the sampling, let's look at the survey itself. It included questions like "The media reports my religion fairly and accurately (agree/disagree/don't know)". From experience of similar surveys I can tell you this is a very strange question to ask someone who is not religious. It simply doesn't make sense - not having a religion is not a religious position, except possibly for some of the more hard-line atheists. Asking someone who isn't religious about their religion is like asking someone who doesn't own a hat about their hat: what are they to answer other than "Huh? I don't have one"?

But it seems Comres have something of a history here. Let's look at their questions in other surveys. How about their "Rescuing Darwin" survey, conducted at the behest (i.e., payment) of Theos, a Christian think-tank? Here we see questions like "Young Earth Creationism is the idea that God* created the world sometime in the last 10,000 years. In your opinion is Young Earth Creationism: definitely true, probably true, probably untrue or definitely untrue". With 11% saying this is definitely true, again alarm bells are ringing about which particular evangelical church they got their sample from (and again, they don't tell us), but let's ignore that for a moment as we're looking at the questions. How about question 3: "Atheistic evolution is the idea that evolution makes belief in God unnecessary and absurd. In your opinion is Atheistic evolution: definitely true, probably true, probably untrue or definitely untrue" with 30% saying 'definitely untrue'. I'm sorry, is that question dispassionate and scientific, carefully designed to elicit opinion, as it should be, or is it emotive and written in the language of fundamentalist Christianity? There's plenty more of this sort of thing in Comres's oeuvre.

(* 'God' you notice. Not '...the idea that a powerful entity created the world' but 'God', with a capital letter.)

So what's the conclusion? Basically, it rather looks as though Comres have established themselves as the polling organization of choice for religious groups wanting to find the 'right answers' in national opinion polls. With dubious questions which only make sense to a subset of those questioned (seriously: go and read the rest of the Rescuing Darwin questions), and apparently biased samples (which we can't even properly evaluate, without information on where they came from), they seem always to support exactly what the paying customer wants to find - which is nice, as that's a good way of getting repeat business.

I'm tempted to call for important organizations such as the General Medical Council to stop giving their business to such a polling organization, but I think the bigger question here is why on earth the supposedly dispassionate BBC News commissioned this particular organization - with their track-record of questionable polling in the interests of religious bodies - to conduct their snapshot survey of religious feeling in the UK. And I'm also curious as to why the BBC didn't notice the rather flagrant sample bias in the data they eventually received. I would be very very interested in knowing the religious background of the individual who commissioned this 'research'. Very interested indeed.

And this is where I turn all Ben Goldacre: this doesn't really bother me because of its religious aspects, but rather because it is the sort of thing which gets proper and effective researchers a bad name. Public opinion polls can play an important role in testing the Zeitgeist, and also contribute a great deal to our modern discourse about society. But for them to have any use they have to be done properly, and reported transparently. This sort of thing not only breeds distrust of opinion polls in general, but is also a classic example of how you can't just believe any sort of research reported in the media but rather need to go back to the source of the data and evaluate where they came from. I know this for a fact: I learnt it from a rigorous survey of me.

EDIT: Here's something I just typed in the comments to this post. Oh, and there's another issue I forgot to mention in the post, which is a shame as it was one of the things that really bothered me. One of the questions was "Our laws should respect and be influenced by UK religious values (agree/disagree)". Surely that's two questions rolled into one! That question was rolling people who think the law should respect religion into saying they also think the law should be influenced by religion. Because those are really quite separate ideas: personally I wouldn't be too worried by the first part of the question (as I think the law should respect our right to believe what we want), but I'd vehemently oppose the second part of the question. Tricksy, I'd say. I do wish I'd remembered to put that into the main article!


Karl McCracken (twitter: @karlonsea) said...

Nicely explained.

I can give you the answers on why the BBC commissioned Comres, and why no-one at the BBC noticed the sampling bias:

1. Imagine the commissioning meeting . . . "Hey! Let's do some research on religion, because lots of people are religious, right? So we can see what proportion of the population to target for Thought For The Day, and be seen to be contributing to public debate. [Flicks through BPC member list]. Look - these Comres chaps do loads of polling on religious themes. They must be the experts on this sort of thing."

2. And when the report is delivered . . . "Gee, thank you Comres. This sure is a thick research report, with lots of numbers in it. I was never much good at sums myself, which makes me laugh now, considering the budget I'm responsible for! Y'know, this report's almost as thick as my MA thesis on the Symbolism Of Late Renaissance Poetry of Venice. I have a copy round here somewhere, if you'd like to have a quick read, while I get your cheque sorted out . . ."

Ian Walker said...


Excellent points. Perhaps - at least on the BBC's side - we should be using Hanlon's Razor to explain things.

Anthony Wells said...

I can't blame you for not finding it, since it isn't where you'd expect to find it on their site (it's under "Poll Digest", not under "what we do"), but ComRes have got the details of their sampling here.

It's a quasi-random telephone sample. It doesn't say it there, but as far as I'm aware they still actually outsource their sampling and fieldwork to ICM. They weight by "sex, age, social class, household tenure, work status, number of cars in the household and whether or not respondent has taken a foreign holiday in the last 3 years" - not by religion, presumably since it would be nigh on impossible to do: people give drastically different answers on things like religion and belief in god depending on what question is asked and how it's asked, so it would be difficult to get good targets for weighting.
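For readers who haven't met demographic weighting before, a minimal sketch of the simplest form (cell weighting) may help - every number here is hypothetical, and real pollsters use more elaborate schemes such as rim weighting across several variables at once:

```python
# Sketch of cell weighting (hypothetical numbers): each respondent gets a
# weight of (population share of their cell) / (sample share of their cell),
# so under-represented groups count for more in the reported figures.
sample = {"male": 400, "female": 600}     # achieved sample counts
targets = {"male": 0.49, "female": 0.51}  # population shares (illustrative)

n = sum(sample.values())
weights = {g: targets[g] / (sample[g] / n) for g in sample}
print(weights)  # men weighted up, women weighted down

# Weighted answers: suppose 50% of men and 60% of women agreed with a question.
agree = {"male": 0.50, "female": 0.60}
weighted_agree = sum(agree[g] * weights[g] * sample[g] for g in sample) / n
print(f"weighted agreement: {weighted_agree:.1%}")
```

The catch Anthony describes is that weighting needs trustworthy population targets, and for religion nobody has any.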

The proportion of people who say they are Christian isn't unusual - in the 2001 census somewhere around 70% of people described themselves as Christian. Depending on how you ask it, you get the same sort of figure in polls. Presumably a large proportion of that 70% are Christian in a rather vague, cultural identity sense, since a large proportion of them also tell pollsters that they don't actually believe in a god. It goes without saying that only a tiny proportion of them, normally a percentage in the high teens if you ask about every week, are regular churchgoers. The ComRes sample was 2% Muslim, I think the figure in the census was about 3%, but I would expect Muslims to be under-represented due to language issues and some cultural difficulties in getting women from some ethnic minority communities to agree to take part in polls.

They do carry out a lot of polls for Christian pressure groups though :)

Ian Walker said...


Many thanks for those useful comments. Telephone polls eh? How many of us would answer the questions of a telephone pollster when they call during our evening meals? Could there be a demographic skew in the people who have BT landlines?

Given what you found, the statistic they REALLY should have reported was how many of the people polled refused to take part, and how the refusals were distributed across religious groups. Again, without those data I don't think we can really interpret their figures.

Here's why: I remember once freezing in the street in Cambridge trying to poll cyclists. I'd approach people who were locking up their bikes and almost every time the conversation would start like this:

ME: Can I ask you a few questions?
THEM: SORRY! Far too busy. Got places to go, people to see.
ME: It's about cycling...
THEM: Oh, in that case fire away!

People are happy to talk in polls when the topic interests them or matters to them but refuse when it doesn't. That's why we'd need to know whether non-religious people were more likely to refuse than the religious.
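You can see how strongly this sort of differential refusal skews a sample with a little simulation - all the numbers here are made up for illustration:

```python
import random

random.seed(42)

# Sketch: suppose the true population is 60% religious, but religious people
# agree to a survey that sounds like it's about religion 50% of the time,
# while the non-religious agree only 20% of the time (all hypothetical).
def agrees_to_take_part(is_religious):
    p_agree = 0.5 if is_religious else 0.2
    return random.random() < p_agree

respondents = []
while len(respondents) < 1000:
    is_religious = random.random() < 0.6  # draw a person from the population
    if agrees_to_take_part(is_religious):
        respondents.append(is_religious)

print(f"religious share of respondents: {sum(respondents) / len(respondents):.0%}")
# Expected share: (0.6 * 0.5) / (0.6 * 0.5 + 0.4 * 0.2) = 0.30 / 0.38, i.e.
# roughly 79% - far above the true 60%, purely from who agreed to answer.
```

The achieved sample ends up dominated by the group that cares about the topic, even though nobody lied to anyone.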

Anthony Wells said...

Landlines aren't particularly skewed - penetration is still well over 90%, though it is falling as more people move to be mobile only. Any sampling skew is far more likely to be attitudinal in terms of people who are at home less, or who hang up on cold callers, or who screen their calls through Caller ID and stuff like that.

No one tells you what their refusal rate is - or at least, it is very rarely mentioned. I think, though it must be going back a few years, it's a response rate of something like 1-in-6. For obvious reasons, no one knows the demographic make up of people who refuse to be interviewed, though it can be inferred sometimes.

In political polls we know raw telephone samples overrepresent Labour voters, for example, and weighting is used to right it. But that's because political polls end up being tested against election results, so get rather more attention paid to them than other polls.

IIRC the MRS guidelines are that people have to be told what the survey is about before they agree to take part, which I've always thought is a bloody silly rule for the reason you give, but pollsters are wise enough to realise this and give only really vague descriptions so it doesn't skew samples (in fact, the MRS guidelines only say "general description", so "current social issues" or something like that will presumably suffice).

These questions would have been done as part of an Omnibus survey anyway, since it doesn't make any economic sense to do a poll of just 4 questions, so respondents would likely have been told it was a survey on "a variety of issues" or something similar.

Ian Walker said...

Interesting, Anthony. I'd never really thought that much about differential refusal rates in phone polls until today, but now I have, I guess this might be quite an issue with opinion polls. In medical or psychological research it is usually quite important to know the demographics of people who (a) took part in and (b) later dropped out of a study, since a systematic pattern in either of these might make the results questionable. Shame such data generally won't be available for surveys like this, especially when, as you point out, there's no later check on the validity of the results, as there is with voting surveys.

This is all great. I wrote a blog post this morning and have learnt useful information from it in the same day! So I guess my big outstanding concern about that survey concerns how Christianity and non-religion were defined (hopefully it wasn't simply a case of asking people what religion they were).

Oh, and there's another issue I forgot to mention in the post, which is a shame as it was one of the things that really bothered me. One of the questions was "Our laws should respect and be influenced by UK religious values (agree/disagree)". Surely that's two questions rolled into one! That question was rolling people who think the law should respect religion into saying they also think the law should be influenced by religion. Because those are really quite separate ideas: personally I wouldn't be too worried by the first part of the question (as I think the law should respect our right to believe what we want), but I'd vehemently oppose the second part of the question. Tricksy, I'd say.

Your point about omnibus surveys is really interesting. I'm trying to imagine how I'd react if someone phoned me up, asked me some questions about my local hospital, then said "Next question. Young Earth Creationism is the idea that God created the world sometime in the last 10,000 years. In your opinion is Young Earth Creationism: definitely true, probably true, probably untrue or definitely untrue?"

I think I'd say anything at that point to get off the line :o)

Anthony Wells said...

(hopefully it wasn't simply a case of asking people what religion they were).


ComRes are obliged to tell you under BPC rules if you ask them nicely. I would be very surprised if it wasn't a simple question asking what religion they are - the answers they've got there are the sort you would expect to get from a straight question, and it would be out of character for the BBC to commission a poll with more than the bare minimum number of questions.

Chris Hutt said...

What looks like the same survey was reported in the Guardian today. They didn't seem to query the sampling methods.

Ian Walker said...


I think that poll is actually different from the earlier one, although given what Anthony has said here about omnibus polls it wouldn't be surprising if it was carried out on much the same people.

I did have a look at Theos's comments on their finding that most people don't agree with them; they make for interesting reading...