Publishing Date: October 2002. © 2002. All rights reserved. Copyright rests with the author. No part of this article may be reproduced without written permission from the author.
Best Practices For On-Line Surveys - Playing Nicely While Using Fancy Features
by Annie Pettit, Ph.D.
We have all learned the hard way how easily e-mail messages can be misinterpreted, but the risks of on-line interaction do not outweigh the fabulous advantages the on-line method offers to research. In this article, I would like to share some best practices for writing and designing on-line surveys. Essentially, it is a meeting of minds between technological advantage and social etiquette.
Consider a hypothetical situation. While shopping for the newest and greatest consumer goods, you agree to participate in a mall intercept survey. "Which are your favourite cookies: peanut butter or oatmeal?" "I really cannot decide," you helpfully offer. To your surprise, the interviewer repeats the question, demands a decision, and then blocks your only exit. What do you do? My experience tells me you would avoid the interviewers at that mall, and probably at all other malls too.
If we would never carry out a mall survey like that, then why would we do an on-line survey?
So, let us begin: how old are you? What a presumptuous way to start a conversation, is it not? Social etiquette has taught us that asking someone how old they are is usually inappropriate. And we certainly would not start a conversation by asking about someone's race or religion. The same principle applies to on-line surveys.
A less invasive, and more useful, way to start a survey is to ask basic questions about the survey topic. For example, a survey about cookies could start by asking "Do you eat cookies?" and "What are your two favourite kinds of cookies?" Not only do these simple questions show respondents how to answer an on-line survey, they also ease respondents into the right frame of mind before the main objective of the survey is reached.
Of course, this means that respondents who do not fit a specific profile cannot be screened out at the door, and profiles for which you already have sufficient quantities cannot be capped early. Although researchers understand that screening for age or employment is an extremely important part of the survey process, respondents simply do not like it. Some go as far as to comment, "I know you are not going to use my answers because I am older than 50."
In general, there are no good reasons why screening and capping questions cannot be postponed to the end of a survey.¹ And, after being asked how different household members use different products, respondents will more easily understand how the demographic information is used to group responses together.
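To make the idea concrete, here is a minimal sketch of end-of-survey screening and capping. The quota structure, group labels, and counts are all hypothetical, not from the article; the point is only that the decision happens after the respondent has finished, so nobody is turned away at the door, and over-quota responses are held rather than discarded.

```python
# Hypothetical quota targets and running counts per demographic group.
QUOTAS = {"18-34": 200, "35-49": 200, "50+": 200}
counts = {"18-34": 200, "35-49": 150, "50+": 90}

def classify(response):
    """Decide how a COMPLETED response is used, once demographics are
    known at the end of the survey, instead of screening up front."""
    group = response["age_group"]
    if group not in QUOTAS:
        return "outside profile - keep for exploratory analysis"
    if counts[group] >= QUOTAS[group]:
        return "quota filled - hold as backup data"
    counts[group] += 1
    return "counted toward quota"

print(classify({"age_group": "35-49"}))  # counted toward quota
print(classify({"age_group": "18-34"}))  # quota filled - hold as backup data
```

The respondent experiences the same polite survey either way; only the back-end bookkeeping differs.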
We have also all seen those thanks-but-no-thanks messages: "Thank you for coming to our survey. However, our quota has been filled." First, do consumers really understand this survey-speak? A more consumer-friendly message might say, "Thank you for coming to our survey. We received so many interesting responses that we have gathered all the information we need." These few extra words are more informative, more personal, and, best of all, they cost nothing.
Second, if respondents were invited to participate and showed interest in participating, then they deserve to participate, even when the sample size for a particular demographic, or for the entire survey, has been reached. Even better than a politely worded thanks-but-no-thanks message is a back-up plan for these respondents. Routing them to an alternate but legitimate survey ensures that they are not disappointed and, as a bonus, yields additional data.
The points made so far clearly reflect the idea that survey development and completion do not take place in a social vacuum. A good on-line survey considers the tone and potential interpretation of not just the questions, but also the introductory messages, error messages, and thank-you messages.
For instance, we have all been told "you are wrong" at least a few times in our lives, and I doubt it ever felt good. Yet survey respondents are routinely told that they are wrong ("Number must be from 1 to 10") or that their answers are not good enough ("Answer all of these questions"). So what happens when 12 is the right answer, when Oreos are not one of the options, or when a question is so offensive that the respondent simply does not want to answer it? It is not unusual to see bright red error messages repeatedly displayed, forcing the respondent either to quit the survey or to lie about their answer. Forcing a 100% completion rate for every single question simply does not improve data quality; valuable data is either lost or miscoded.
Still, a computer's ability to error-check during a survey can be a great time-saving tool. Respondents should not be writing out long numbers when they can simply use the number pad. Error messages can and should be kind and helpful, such as "Please use the number pad to type in your answer." At the same time, if a respondent insists on typing out the number in words, continued pestering is simply not warranted: if they chose not to heed your first message, a second message is not going to work any better. In the end, it is better to get a differently formatted response than an incorrect response, a non-response, or a frustrated and annoyed respondent. This is especially important for panel research, which demands an ongoing, positive relationship.
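This "nudge once, then accept" policy can be sketched in a few lines. The function name, the 1-to-10 range, and the message wording are illustrative assumptions; the logic is simply: validate, warn politely one time, and on a repeat offence record the answer as given rather than reject it.

```python
def parse_rating(raw, already_warned):
    """Try to read a 1-10 rating. Warn politely the first time the
    format is off; after that, keep whatever the respondent typed
    rather than pestering them again or losing the answer."""
    try:
        value = int(raw)
        if 1 <= value <= 10:
            return value, None  # clean numeric answer, no message
    except ValueError:
        pass
    if not already_warned:
        # One kind nudge, not a bright red accusation.
        return None, "Please use the number pad to type in your answer (1-10)."
    # Second attempt: accept the differently formatted response as-is.
    return raw, None

# A rating typed in words triggers one gentle message...
value, message = parse_rating("seven", already_warned=False)
# ...but a repeat is recorded verbatim instead of being rejected.
value2, message2 = parse_rating("seven", already_warned=True)
```

Note that an out-of-range "12" takes the same path: one nudge, then the answer is kept, so the case where 12 really is the right answer is not forced into a lie.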
In some cases, error checking can be handled by choosing the appropriate answer format. For example, radio buttons ensure that when we want only one answer, only one answer can be given. Just because on-line surveys make this possible, though, does not mean it is always a good thing to do. Take a question about family background: where would you slot yourself if your parents were from two very different backgrounds? Which counts more, your mother's background or your father's? A researcher could always include an 'Other' option so that radio buttons remain workable, but that option is neither helpful to the researcher nor respectful to the respondent. In this case, the better solution is a set of checkboxes that allows multiple options, along with a space to write in details.
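The checkbox-plus-write-in format might be recorded along these lines. The option list and field names are hypothetical; the point is that every box the respondent checked is kept, and the write-in detail travels with the 'Other' selection instead of being forced into a single radio-button choice.

```python
# Illustrative option list for a multi-select background question.
OPTIONS = {"European", "African", "Asian", "Latin American", "Other"}

def record_background(selected, other_text=""):
    """Keep every option the respondent checked; attach the write-in
    detail only when 'Other' was among the selections."""
    chosen = [o for o in selected if o in OPTIONS]
    answer = {"selected": chosen}
    if "Other" in chosen and other_text.strip():
        answer["other_detail"] = other_text.strip()
    return answer

# A respondent with parents from two different backgrounds can say so.
ans = record_background(["European", "Other"], "Métis")
```

A radio-button version of the same question would have forced this respondent to discard half of their answer.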
And finally, researchers need to ask whether they are spending more time testing the plethora of available colour schemes than designing clear and friendly questions. Even with friendly questions, fancy colour schemes may end up motivating respondents to seek out and use the opt-out link.
Remember, the research process is a partnership between researcher and respondent, and both should benefit. Treat your respondents with respect and they will reward you with high click-through rates and low opt-out rates.
Annie Pettit is a Research Scientist with ICOM Information and Communications Inc. She can be reached at (416) 297-7887 ext.2466 or apettit@I-com.com.
¹ Legal implications may require age restrictions at the beginning of a survey, but the principle of playing nicely still applies.