Survey 3 results now available here.
People often ask me how radio surveys work.
“No one ever surveyed me about what I listen to, so, how can they be accurate?” they ask.
My usual response is: “I’ve never won a lottery either, but I know that about 10 people a week do.”
I bring this up because the validity of polls and surveys has never been more in contention than in the wake of the unexpected result of the recent federal election. Most of the polls forecast Labor to romp in, yet it was the Liberals, led by Scott Morrison, that got the chocolates. Since that result, the media has scrambled to provide cogent arguments as to how and why the polls got it so wrong.
In fact, the polls were well inside the accepted “margin of error”, which is around three percent, give or take a percent or two.
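That three percent figure falls out of basic sampling theory. As a rough sketch (assuming the standard 95% confidence level and a typical political poll sample of around 1,000 respondents, a figure not stated in this article), the worst-case margin of error can be estimated like this:

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Worst-case margin of error for a simple random sample.

    n: sample size
    p: assumed proportion (0.5 gives the widest, i.e. most cautious, margin)
    z: z-score for the confidence level (1.96 ~ 95%)
    """
    return z * math.sqrt(p * (1 - p) / n)

# A typical ~1,000-person political poll: roughly +/- 3 percentage points
print(f"{margin_of_error(1000) * 100:.1f}%")   # about 3.1%

# A year of GfK radio ratings (60,000+ respondents, per the article):
print(f"{margin_of_error(60000) * 100:.1f}%")  # about 0.4%
```

Note the contrast: at the sample sizes the article quotes for the radio ratings, the statistical wobble is a fraction of a percentage point, which is consistent with programmers treating swings of a full point or more with scepticism.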
But radio surveys and political surveys are polls apart in both subject and methodology.
The major difference is, as they say, “A week is a long time in politics,” especially after an election campaign is announced. Swinging voters, who ultimately decide elections, can change their intentions minute by minute based on the last policy announcement or scare tactic they heard. For that reason alone, the last poll before an election can be several days, or perhaps a week, old by the time it is published, counting from when the raw data was gathered and tabulated.
As conventional wisdom has it, a significant number of voters who tell pollsters that they intend to vote for an opposition party are prone to change their minds when they are alone in the polling booth. It is only at the last moment that their instinct tells them to go with the devil they know, i.e. the government of the day – which is usually the party they voted for last time.
On the other hand, a week in radio is not such a long time. Apart from exceptional circumstances, where superstars like Kyle and Jackie O switch stations, it usually takes much longer for audiences to migrate from one station to another.
Nonetheless, every radio survey tends to throw up some anomalies here and there. Most programmers are sceptical of variations of more than a whole percentage point up or down, and instead look at long-term trends over several surveys, over which the peaks and troughs tend to iron out.
Here’s what GfK’s Media Measurement Director, ANZ, Deb Hishon had to say:
The GfK Radio Ratings is an extremely robust and large sample of over 60,000 respondents every year, with a new group of respondents every week providing information on their listening behaviour. The Radio Ratings also employs a mixed recruitment methodology, ensuring a wide range of the population are being given the opportunity to be involved, either recruited face to face, online or via telephone interview in some markets.
The robustness of the radio ratings is seen in the stability of the survey results; over time the results are remarkably consistent.
In terms of the subject matter, an individual’s radio listening preferences and behaviour is very different to their political opinion. You quite rightly say that people’s political views can change day by day depending on the political coverage, while it has been shown in many different pieces of research that radio listeners are consistently loyal and cannot be easily swayed away from their favourite personality or radio station.
At the end of the day the radio ratings are based on a continuous, representative and reliable survey; we are in field for 41 weeks of the year, with over 60,000 respondents each year contributing to the results.
Usually by Survey 3, the trends for the year start to appear.
Join us here on radioinfo this morning at 9:30 for all the results including DAB+, Analysis, Trend Charts, our For and Against Chart and the ever-popular Spin Cycle that celebrates the wondrous art of the network publicists.