8 Ways to Improve Your Survey for Next Time

by Jenny on May 28, 2014

Sometimes you need a survey that will only be used once, such as an ad concept test. Many surveys, however, are meant to be repeated – either on an ongoing basis, such as the surveys included with product registration materials, or periodically, such as quarterly or annual surveys. Keep in mind that a survey meant to be repeated does not have to be identical each time. Improvement is always a good thing.

Here are eight questions to ask when evaluating a survey after it has been run, in order to improve it for next time:

1. Did the survey meet its objectives? Go back to the original survey objectives to evaluate whether or not they were met. Did you get the metrics you anticipated? Did it measure what you were hoping to measure?

2. Did unexpected themes emerge in the open-ended comments? For example, if you ran a customer satisfaction survey, were there lots of comments about a particular aspect of your product? If so, consider adding questions about that aspect of your product to your next survey.

3. Were the answers to open-ended questions way off track? If the responses were not at all what you were looking for, the problem most likely lies in the wording of the question itself.

4. What new questions did the survey results raise? Is there something else you wish you had asked that would make the results more meaningful? Say your customer survey asked whether your product was a good value for the money, and many people said “no.” It would have been nice to know what people would like to get from your product that would make it a good value for the money. Next time that question can be added to the survey.

5. What questions did not get answered by a large number of respondents? If lots of people are not answering a particular question, be sure to add an “N/A” option next time. For example, an auto dealer might ask recent buyers to “rate the financing options” using a rating scale. Without an “N/A” option, those who paid cash might leave the question blank.

Another solution is to add a diagnostic question before the rating question that asks “how did you pay for your new vehicle?” Those who paid cash would then skip the financing question. This approach also helps quantify the issue. You might find that 80% of those who answered the financing question are unhappy with the financing options. But if the diagnostic question shows that only 10% of buyers financed their purchase, then you know that although it’s a problem, it affects only a small percentage of your customers.
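The arithmetic behind that diagnostic question can be sketched in a few lines. The counts below are made up for illustration; the point is that a high dissatisfaction rate among a small subgroup translates into a much smaller share of all customers.

```python
# Hypothetical survey counts (illustrative numbers, not real data).
total_respondents = 500

# Diagnostic question: "How did you pay for your new vehicle?"
financed = 50  # 10% of buyers financed their purchase

# Rating question, answered only by those who financed.
unhappy_with_financing = 40  # 80% of financers rated the options poorly

share_of_financers_unhappy = unhappy_with_financing / financed
share_of_all_customers = unhappy_with_financing / total_respondents

print(f"{share_of_financers_unhappy:.0%} of financers are unhappy")
print(f"...but only {share_of_all_customers:.0%} of all customers are affected")
```

Without the diagnostic question, the 80% figure alone would overstate how widespread the problem is.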

6. Did the question provide you with useful information? Did you actually use these results? Questions that did not prove to be helpful can be eliminated next time.

7. Did you ask any questions that biased the results? It can be useful to have an expert review your questions to be sure you did not ask any leading ones. For example, when asking respondents to rate the quality of something, phrasing the question as whether or not it is “high quality” can bias the results, because the word “high” suggests an answer.

8. Can you eliminate questions that are highly correlated? Once you have survey data in hand, you can run correlations between the answers to related questions. Questions whose answers are very highly correlated are most likely measuring the same thing, so next time only one question from the group needs to be asked.

A good example of this might be a customer satisfaction survey for a residential painter. After a question about overall satisfaction there might be questions about how the painter was rated on various attributes. Are you satisfied with the painter’s timeliness, cleanliness, the quality of the paint used, etc.? If you run correlations you might find that the ratings for two attributes – such as “knowledgeable about paint” and “helped me select an ideal paint color” – are 100% correlated. Next time you don’t need to ask both questions.

Creating surveys that provide truly useful information is an art. Many organizations find that the survey results answer some questions and raise others. Evaluating the answers you receive can help you craft an even better survey for the next round.

Jenny Dinnen is President of Sales and Marketing at MacKenzie Corporation. Driven to maximize customers’ value and exceed expectations, Jenny carries a can-do attitude wherever she goes. She maintains open communication channels with both her clients and her staff to ensure all goals and objectives are being met in an expeditious manner. Jenny is a big-picture thinker who leads MacKenzie in developing strategies for growth while maintaining a focus on the core services that have made the company a success. Basically, when something needs to get done, go see Jenny. Before joining MacKenzie, Jenny worked at HD Supply as a Marketing Manager and at Household Auto Finance in their marketing department. Jenny received her undergrad degree in Marketing from the University of Colorado (Boulder) and her MBA from the University of Redlands.
