Crikey Clarifier: how are polls conducted?
The fortnightly blanket media coverage of the latest political opinion poll does more than reflect public opinion — polls can help shape it. But how do these polls work and are they accurate? Last week, one poll had the two major parties deadlocked on the two-party-preferred count, while another had a 10-point gap. So who do we trust?
How are polls conducted?
Political polling is done over the phone by market research companies such as Roy Morgan Research, Nielsen, Newspoll, Essential Research and Galaxy. Galaxy contacts landline and mobile numbers, while Newspoll is restricted to landlines. Online polls also exist (Essential Research) but phone polling is still preferred. Galaxy managing director David Briggs says that whereas the phone book gives access to about 85% of the Australian population, online databases are generally capped at about 1 million Australians.
However, political commentator Charles Richardson says the number of people you have access to is a red herring: “You only need a couple of thousand, provided they’re representative. Both phone polling and internet polling have problems getting a representative sample; phone tends to exclude the young, internet tends to exclude the old. I think phone polling is pretty clearly still superior to internet polling, but the gap is closing.”
A typical poll on voting intention requires about 1000 responses to be accurate. That could mean burning through 12,000 phone numbers if there's a low response rate, says Briggs. Newspoll CEO Martin O'Shannessy says it has about a one-in-six strike rate. When someone does pick up, they'll find the questionnaire is short: about 8-15 questions, taking no longer than two minutes.
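The dialling arithmetic here is simple multiplication. A minimal sketch (the strike rates are the ones quoted in the story; the function name is ours):

```python
def dials_needed(target_responses, dials_per_interview):
    """Phone numbers a pollster must work through to hit a target
    number of completed interviews, at a given dials-per-interview rate."""
    return target_responses * dials_per_interview

# Newspoll's roughly one-in-six strike rate: 1000 interviews ~ 6000 dials.
print(dials_needed(1000, 6))   # 6000
# At one interview per 12 dials, you'd burn through the 12,000 numbers
# Briggs mentions to get 1000 responses.
print(dials_needed(1000, 12))  # 12000
```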
So how does polling work?
Sampling rests on the mathematical laws of probability. If it's a truly representative sample, 1000 responses is enough to represent the whole population to within about three percentage points. That plus or minus 3% is the "margin of error", and it varies with the sample size: a larger sample gives a smaller margin of error. However, for pollsters to halve that variation from 3% to 1.5%, they'd have to quadruple the sample size from 1000 to 4000, and this is where they hit the law of diminishing returns. Polling is costly, and pollsters say that level of accuracy isn't necessary when a sample of 1000 does the job.
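Those plus-or-minus figures follow from the standard margin-of-error formula for a simple random sample, z * sqrt(p * (1 - p) / n). A quick check, assuming the worst-case 50/50 split and 95% confidence (those assumptions are ours, not the pollsters'), reproduces the 3% and 1.5% quoted above:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a simple random sample of
    size n, at the worst-case proportion p = 0.5."""
    return z * math.sqrt(p * (1 - p) / n)

print(round(margin_of_error(1000) * 100, 1))  # 3.1 -- about three points
print(round(margin_of_error(4000) * 100, 1))  # 1.5 -- quadruple the sample to halve it
```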
How do they make the sample as accurate as possible?
Pollsters control the sample through several techniques. O’Shannessy says they try to make the polling as much a scientific process as they can, with a goal of giving everybody a chance of selection. Besides making sure they’ve got the right numbers in terms of age, gender and area, the sample is also as random as possible. “We might say something like ‘I’d like to talk to the person with the most recent birthday’ hence filling the statistical need of giving everybody a chance of being selected,” he said.
Even then, the sample is rarely a perfect match. Pollsters minimise the degree of variation by "weighting" data against ABS population estimates, running results through statistical software so that they reflect the population at large.
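Weighting of this kind can be sketched in a few lines: respondents in under-represented groups count for more, over-represented groups for less. A minimal illustration (the age brackets and shares below are made up for the example, not real ABS figures):

```python
# Hypothetical population benchmarks and sample composition.
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}
sample_share     = {"18-34": 0.20, "35-54": 0.35, "55+": 0.45}

# Each respondent's weight is their group's population share divided by
# its share of the sample: under-represented 18-34s count for ~1.5 people.
weights = {g: population_share[g] / sample_share[g] for g in population_share}

def weighted_support(responses):
    """Weighted share answering 'yes'. responses: (age_group, said_yes) pairs."""
    total = sum(weights[g] for g, _ in responses)
    yes = sum(weights[g] for g, said_yes in responses if said_yes)
    return yes / total
```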
When gathering data, pollsters also minimise the scope for systematic variation. For example, there's a strict order in how questionnaires are put together — voting intention is always asked first so answers are not "contaminated by any other issues lying around", says Briggs. Consistency of wording is also important. O'Shannessy says Newspoll has been asking its questions in the same way for more than 20 years. Party names may change from time to time, but the question always ends with "which party do you prefer?"
Which polls can be trusted? How often are they wrong?