How can you best measure attitudes? I’ve recently been learning about using surveys to measure people’s views about products, companies, experiences or ideas, and thought I’d share a summary for SCN readers. These tips should help anyone who creates course evaluation surveys or other kinds of surveys that measure opinions, beliefs or attitudes.
How consistent is an attitude?
A challenge with attitude surveys is that participants may hold fluid attitudes, and how you ask the questions can influence the answers. For example, in one well-reported experiment, two randomly selected groups of participants were asked about the same issue. One group was asked this:
“Do you think the United States should forbid public speeches in favor of communism?”
The other group was asked a slightly different question:
“Do you think the United States should allow public speeches in favor of communism?”
The researchers found that 39 percent of respondents thought that speeches should be forbidden, but 56 percent thought that speeches should not be allowed. The difference in wording between “forbid” and “not allow” made a large difference in the attitude measured.
How participants answer a survey question
It’s helpful to consider the process participants go through when answering a question.
There are four steps in answering a question: understanding what the question means, retrieving the relevant information from memory, forming a judgement, and selecting one of the available responses. Error can creep in at every stage.
Do participants give the best answer they can?
Unfortunately they often do not.
Unlike in tests and exams, where participants have a strong motivation to answer optimally, in a survey, participants often take shortcuts or give an answer they think is satisfactory rather than taking the time and effort to give the optimal answer. This effect is called “satisficing” and can involve skipping any of the steps above – often just selecting a response that seems to make sense without thinking too much about it.
Have you ever answered a survey, just picking the choices that seemed roughly right? Good survey design can reduce the impact of satisficing.
Here are ten tips for improving your attitude surveys.
1. Avoid Agree/Disagree questions
Although Agree/Disagree items are among the most common types of survey question, you should try to avoid asking participants whether they agree with a statement. There is an effect called acquiescence bias: people are more likely to agree than to disagree. Instead, re-word the question to ask directly about the underlying attitude. For more details, see Agree or disagree? 10 tips for better surveys — Part 2
2. Avoid Yes/No questions
For the same reason, avoid Yes/No questions. Participants are more likely to answer “yes”.
3. Each question should address one attitude only
Avoid double-barrelled questions that ask about more than one thing. Suppose you ask “How satisfied are you with your pay and work conditions?” Someone might be satisfied with their pay but dissatisfied with their work conditions, or vice versa. So make it two separate questions.
4. Minimize the difficulty of answering each question
If a question is harder to answer, it is more likely that participants will satisfice. Use few words, and use simple words your audience will understand. Avoid negatives. Keep the cognitive load low.
5. Randomize the responses if order is not important
The order of responses can significantly influence which ones get chosen; for example, the first or last options may be picked more often simply because of their position. If your choices do not have a natural progression or some other reason for being in a particular order, randomize them. Most systems, including SAP Assessment Management by Questionmark, can do this for you, and it removes the effect of response order from your results.
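If you deliver a survey through your own code rather than through a platform that randomizes for you, the principle is easy to apply. Here is a minimal sketch in Python; the function name, the example options and the idea of anchoring a catch-all choice at the end are illustrative assumptions, not features of any particular survey tool.

```python
import random

def randomize_choices(choices, anchored=("Other", "None of the above")):
    """Return the choices in a fresh random order for each participant,
    keeping any catch-all options anchored at the end."""
    substantive = [c for c in choices if c not in anchored]
    random.shuffle(substantive)  # in-place shuffle of the substantive options
    return substantive + [c for c in choices if c in anchored]

# Hypothetical question with no natural ordering among its options.
options = ["Email", "Phone", "Live chat", "In person", "Other"]
print(randomize_choices(options))  # e.g. ['Live chat', 'Email', 'In person', 'Phone', 'Other']
```

Each participant then sees the substantive options in a different order, so no single option benefits from always appearing first or last.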
6. Pre-test your survey
To reduce error, pre-test or pilot your survey before it goes live. A pilot helps you spot questions that participants interpret differently from the way you intended, so you can fix them before collecting real data.
7. Make survey participants realize how useful the survey is
The more motivated participants are, the more likely they are to answer optimally. Make sure you communicate the goal of the survey, and help participants feel that filling it in carefully will benefit something they believe in or value.
8. Avoid a “don’t know” option
Including a “don’t know” option usually does not improve the accuracy of your survey. In most cases it reduces accuracy, partly because it encourages participants to disengage and simply answer “don’t know”, and partly because participants are better at estimating than they think they are.
9. Ask questions about the recent past only
The further back in time participants are asked to remember, the less accurately they will answer your questions. If you can, ask about the last week or the last month, not about the last year or further back.
10. Pay attention to trends
Error can creep into survey results in many ways. Participants can misunderstand the question. They can fail to recall the right information. Their judgement can be influenced by social pressures. And they are limited by the choices available. But if you use the same questions with a similar population over time, you can be reasonably confident that the trends you record will be meaningful.
This post is a summary of a longer series of posts on the same subject on the Questionmark blog. If you are interested in seeing more, including references to the research behind these tips, see part 1, part 2 and part 3 of the series there. I hope these tips help you build better surveys.