Just so you all know, I have run polls in the past and came close to starting a polling company at one point. I have a good background in the mathematical and statistical aspects of polling, though I am no Nate Silver; that is mainly because my life is too short to analyse the data in the degree of detail that he does. Eric Grenier tries, but I think he takes the results of the polling companies at face value.
More and more pollsters are no longer conducting random samples of public opinion but are instead using online panels. Online panels might be a good measure of public opinion, but there is still very little data to show what it takes to build a good online panel or how to interpret its results.
The problem with all the online panels is that people have to opt in to be part of the panel and then choose to take part in each survey. The questions to ask about the panels are the following:
- How many panel members does the company have? Abacus stated in their latest survey that they have 500,000 people in Canada in their online pool, which means I suspect they have between 30,000 and 40,000 in BC since they are not all that active here. That is not nearly a large enough pool to gauge public opinion.
- Are the panels accurate reflections of the province? It is important to remember that the work by Angus Reid, Ipsos, Abacus and Insight West consists of samples of their panels, not random samples of the public such as telephone polling managed to achieve in the past. It also means it is wrong to ever assign a statistical margin of error to an online survey, which is something Angus Reid does all the time.
- How do they make sure the panels are not stacked by the politically active? I am a member of all the online panels and I have been part of more than a few surveys. I am not an average random member of the public.
- How do they know their panel members represent either the public that votes or the census demographics of BC? Since none of the companies give us any details on their panel membership, we just have to trust that it is somehow a reasonable reflection of reality. I suspect all the online pollsters have very low numbers of people aged 65+, and that this small pool drives the opinion results for that group.
- How do they ensure the survey is answered by people other than those that are online 24/7? The people that are online for too many hours of the day are not the ones that should form the core of the respondents to the online panels.
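On the margin-of-error point above: the textbook margin of error is derived for a simple random sample, which is exactly what an opt-in panel is not. As a reference, here is that standard formula sketched in Python (the sample size of 800 is just an illustrative number, not one from any of these polls):

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# A random sample of 800 at 50% support carries roughly a +/- 3.5 point margin.
print(round(margin_of_error(0.5, 800) * 100, 1))
```

The formula depends entirely on the sample being random; quoted against a self-selected panel, the number is meaningless, which is the point of the bullet above.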
Overall I remain skeptical of online panels as an accurate tool for measuring public opinion. I have been experimenting with models that would use social media to give us accurate results, but so far I have not been able to develop one that I have statistical confidence in. I have been reading the academic literature on public opinion sampling methods, and a major open issue is how to poll without the traditional land line that every household once had and everyone answered.
If not online panels, what about telephones? Telephone polling is no longer a very good tool to get a random sample of the public because of the very high refusal rate and the ever increasing number of people with no land lines.
Companies like Ekos and Forum use IVR (Interactive Voice Response) to call people and get results for their polls. I have used IVR polling in the past and used it in this election to get regional results for the Victoria area. IVR has some major flaws.
The recent Forum IVR poll had the NDP and the Liberals only 4 points apart. In one respect I am happy with Forum because they provide a lot more detail in their report than other companies do, but that also means I can see the flaws. The poll reached a lot more older people than younger people. Forum weighted the results to census demographics, but there are many ways they could have weighted the data, and each way has a significant impact on the results.
They asked how people voted in 2009 as well as their voting intention for 2013. The results on page 4 of the report show that only 28.7% of their respondents said they voted for the NDP in 2009, when in fact 42.2% actually did. That means their IVR sample was significantly light in NDP voters, which is why the overall result showed a close race. If you weight by how people voted in 2009, the results would have been more like NDP 45% and Liberals 32%.
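To make the mechanics of that recall weighting concrete, here is a minimal sketch in Python. The 2009 recall shares (28.7% NDP in the sample versus 42.2% in the actual result) come from the discussion above; the 2013-intention percentages inside each recall group are hypothetical numbers of my own, chosen only to illustrate the method, not taken from Forum's crosstab:

```python
# Hypothetical 2013 intention within each recalled-2009-vote group.
intent_2013 = {
    "NDP09": {"NDP": 0.85, "LIB": 0.08, "OTH": 0.07},
    "LIB09": {"NDP": 0.10, "LIB": 0.80, "OTH": 0.10},
    "OTH09": {"NDP": 0.20, "LIB": 0.15, "OTH": 0.65},
}
# Share of each recall group in the poll sample vs the actual 2009 result.
sample_share = {"NDP09": 0.287, "LIB09": 0.450, "OTH09": 0.263}
actual_share = {"NDP09": 0.422, "LIB09": 0.458, "OTH09": 0.120}

def tabulate(shares):
    """Overall 2013 intention, with each recall group weighted by its share."""
    totals = {"NDP": 0.0, "LIB": 0.0, "OTH": 0.0}
    for group, share in shares.items():
        for party, frac in intent_2013[group].items():
            totals[party] += share * frac
    return totals

print(tabulate(sample_share))  # unweighted: understates the NDP
print(tabulate(actual_share))  # reweighted to the real 2009 result
```

Reweighting the light NDP recall group up to its true 2009 share pushes the NDP's 2013 number several points higher, which is the direction of the correction described above.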
Ekos also used IVR and had some results that were not really in sync with the other companies. Does this mean they are more accurate? I do not know.
The problem with IVR polling is that you miss everyone with only a mobile phone, and you get a huge percentage of people refusing to answer. When I have done IVR polling I have had refusal rates between 85% and 90%; others have told me theirs have been as high as 95%. Rates that high mean IVR polls are not random samples of the public.
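The arithmetic behind those refusal rates is worth spelling out. A small sketch (the 800-interview target is an assumption for illustration):

```python
def calls_needed(target_n, refusal_rate):
    """Answered calls needed to complete target_n interviews at a given refusal rate."""
    return target_n / (1 - refusal_rate)

# At a 90% refusal rate you need about 8,000 answered calls for 800 interviews;
# at 95% refusal, about 16,000.
print(round(calls_needed(800, 0.90)))
print(round(calls_needed(800, 0.95)))
```

The deeper problem is not the dialing cost but that the 5% to 15% who do answer are self-selected, so the completed sample is no longer random no matter how many calls are made.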
So in the end I am not convinced that any of the results to date are true representations of public opinion in this election. That said, I do think I can glean some trends from the results:
- The NDP is marginally less popular than it was a few months ago, 44% instead of 46%
- The Liberals are marginally more popular - 33% instead of 31%
- The Greens have a strong concentration of support on Vancouver Island which, if it is concentrated on the South Island, means the Greens are at about 30% in the Victoria area.
- The BC Conservatives are unchanged over the last few months, with no data indicating any region with a consistently stronger result
- The results for "other" are high enough and concentrated enough that four independents could win their seats.
One final thought on the polls: take them all with a grain of salt. That means taking my gleaned observations with that same grain of salt as well.