As a research and data analytics company, we are neither directly involved with, nor do we hold strong opinions about, politics. That said, last week's presidential election shocked the world — and our industry.
Every four years, the U.S. presidential election offers market and consumer research companies around the world an opportunity to showcase the effectiveness of their research methodology. The hope is to accurately predict the election outcome, validating their services and thereby garnering attention and acquiring new clients.
This year's election was, to put it nicely, a bit of a curveball. Only a handful of pollsters were accurate in their predictions, a blow to the reputation not only of the vast majority of polling companies but of the research industry as a whole.
The outcome has left top-tier research firms rigorously, and publicly, reviewing their processes in hopes of finding the cause of their error. Facing the reality of their inaccuracy, many experts are trying to explain what exactly went wrong. While market research certainly has its difficulties, such widespread erroneous predictions are extremely rare.
Discussing the impact on our industry, we at MacKenzie considered areas in which pollsters could have gone wrong. We came up with a few ideas and quickly realized these areas are also common shortcomings when businesses conduct research on their own market or customers.
Every research company making an election prediction started with the same clearly stated research objective: which candidate will earn the most votes?
With this research objective established, we considered how our internal approach would apply. Typically we'd start by defining our target respondent sample: the groups of people that best represent the overall voter population. Rather than consumers, here we have voters, so regional (or preferably state-based) segmentation would be ideal.
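To make the idea of state-based segmentation concrete, here is a minimal sketch of proportional sample allocation across strata. The state names and population figures are purely illustrative, not real data, and the `allocate_sample` helper is a hypothetical function invented for this example.

```python
# Hypothetical sketch: proportional (stratified) sample allocation.
# Each stratum (here, a state) receives interviews in proportion to
# its share of the overall voter population.

def allocate_sample(strata_sizes, total_sample):
    """Allocate survey respondents to each stratum in proportion
    to its share of the total population."""
    total_pop = sum(strata_sizes.values())
    return {
        state: round(total_sample * pop / total_pop)
        for state, pop in strata_sizes.items()
    }

# Illustrative voter populations (in millions) for invented strata.
populations = {"State A": 10.0, "State B": 5.0, "State C": 5.0}
quotas = allocate_sample(populations, total_sample=1000)
print(quotas)  # State A gets half the interviews; B and C a quarter each
```

A real polling design would layer further strata (age, education, urban vs. rural) on top of geography, but the proportional principle is the same.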
We'd then establish the most effective means of communicating with our sample population. Here we immediately saw a challenge: pollsters are expected to have real-time, updated polling numbers on the day of the election, which doesn't allow sufficient time for extensive data collection and analysis.
Moving forward, we would typically personalize communication and survey development to best suit the specific segments of respondents in our target population. In this example, we would ask questions about important regional issues in relation to each respondent's preferred candidate. Again, the quick turnaround wouldn't allow for this effort, thus requiring a weighting method to compensate for potential biases.
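The weighting idea can be sketched simply: if a group is under-represented in the raw sample relative to its share of the electorate, its responses are weighted up so the sample mirrors the population. The sketch below uses invented shares and a hypothetical `poststratification_weights` helper; it is an illustration of the general technique, not any pollster's actual method.

```python
# Hypothetical sketch of post-stratification weighting.
# Weight for each group = population share / sample share, so
# under-sampled groups count for more in the final estimate.

def poststratification_weights(sample_shares, population_shares):
    """Return a per-group weight that rebalances the sample
    toward the known population composition."""
    return {
        group: population_shares[group] / share
        for group, share in sample_shares.items()
    }

# Illustrative shares: suppose rural voters are 30% of the electorate
# but only 15% of respondents (e.g. because they are harder to reach).
sample = {"urban": 0.85, "rural": 0.15}
population = {"urban": 0.70, "rural": 0.30}
weights = poststratification_weights(sample, population)
print(weights)  # each rural response counts double; urban scaled down
```

Note the catch this election exposed: weighting only corrects for biases you know to measure. If a relevant group (say, voters without a college degree) isn't among the weighting variables, no amount of reweighting fixes the gap.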
As if things weren't difficult enough, reaching older voters or those in rural areas can sometimes be impossible, presenting yet another potential bias.
The outcome of our internal discussion, after considering how we would approach this project, was a simple conclusion: we do not envy the pollsters. That said, we can clearly see many opportunities for things to go wrong, which provides some good examples for businesses looking to conduct a research project of their own.
Our ultimate goal is to ask the right people the right questions, at the right time, and to provide reliable insights to guide future decision making. The pollsters were hoping to do the same thing; this time, however, they were unsuccessful.
For more detail on where and how things went wrong, the article below is a great read.
READ HERE:
Why Pollsters Got the Election So Wrong, and What It Means for Marketers
Via AdAge.com