The following article appeared in Total Communication Measurement, April 1999, Melcrum Publishing Ltd., London.

Getting the Most Use out of Research Results
Tips for developing an actionable survey

By Angela D. Sinickas, ABC


All too often, companies conduct a survey and do nothing with the results. This problem can be minimized by developing a highly actionable survey in the first place (the topic of this month's column) and by making sure that management is committed to acting on the findings (to be covered next month). Here are some suggestions for developing a survey that leads to highly actionable results.

Identify the possible changes you might make 

Think about the aspects of your communication program that could be changed, and ask questions that will let you know if your audience thinks they should be. For example, if you have a great deal of control over a publication's content, you could ask questions about the topics covered, the length per article, use of graphics, etc. If, however, senior management has determined that the publication's frequency, number of pages and number of colors are set in concrete, you might skip asking about those topics to avoid setting up false expectations among your audience. On the other hand, if overwhelming audience data contrary to your management's viewpoint could change their opinions, go ahead and ask those questions.

Write action-oriented questions

Asking your audience how satisfied they are with communication may not suggest actionable results. If they're not satisfied, you haven't learned enough to determine how to satisfy them better. While a few broad, "temperature-taking" questions can be useful, if all your questions are at this level, you won't have any idea of what to do to improve the numbers. More action-oriented questions would determine the size of information gaps on various topics, current and preferred sources of information on a topic, the frequency with which management displays various communication skills, and so on.

Avoid multiple issues in one question

Sometimes a single survey question asks about two different issues; for example, "My manager is open and honest." If most respondents disagree, do they disagree with both adjectives or only one? If only one, which one? This problem also occurs with questions that have a built-in assumption. Your respondents may not share that assumption and may disagree with the statement for that reason alone.

Use the right response scale

An "agree/disagree" scale may not be the best choice for many actionable issues. For example, let's say you ask people to agree or disagree with the statement:  "Face-to-face communication has improved in the last year."  For those who disagree with the statement, do they perceive it has become worse or stayed the same? A better scale would be one that asks if communication has become worse, become better or stayed the same.

Be wary of "no opinion" options

Before you offer a neutral, "no opinion" response for a question, be sure that it is a meaningful option. Often, "not applicable" may be a more appropriate choice. At some organizations, a very high percentage of respondents choose "no opinion" on a sensitive issue when in fact they hold a negative opinion but are fearful of being identified individually and punished by their managers.

Conduct qualitative research before developing a survey

By conducting interviews or focus groups with your audience, you will learn what issues are on their minds.  Then you can include their issues on the questionnaire along with those that you want answers to. For example, at one hotel I visited frequently over a three-year period, room service rarely delivered the exact food I had ordered, or the correct condiments and silverware to go with it.  The hotel's customer satisfaction card asked if I was satisfied with the speed of the food delivery and the variety of the menu, but didn't ask about accuracy.  By not asking the right question, they were unlikely to ever fix the real problem.

Pretest the survey

Ask a small cross-section of your survey audience to pretest the survey. Discuss each question with them as they read it.  Ask them how they interpret different words or phrases.  For example, whom do they include in the group of "senior management"?  Ask them if the answer they want to provide is available to them, if the format or instructions are confusing, etc. This will prevent at least two problems: 

  • people interpreting questions differently from how you will interpret their responses;
  • people making errors in how they complete the survey, compromising the validity of your data.

Anticipate possible responses

Once you believe your survey is ready to be administered, look at each question, pretend first that you received a favorable response, and then pretend you received a negative response. Have you learned enough from that question to take a specific action?  If not, the question may need to be more specific, or perhaps you need to use a different type of response scale.


© 1999 Angela D. Sinickas, All rights reserved

Angela Sinickas, ABC, is president of Sinickas Communications, Inc., a communication consultancy specializing in helping corporations achieve business results through targeted diagnostics and practical solutions. You can visit her new website, CommToolbox.com, to see the automated planning, measurement, and benchmarking tools she has developed based on her manual, How to Measure Your Communication Programs.
