DIY Marketing Research – Online Questionnaire Development

Prepared by Dr Brian Monger

Online Marketing Research

Online Marketing Research uses a specific medium to conduct market research.  While this medium requires some different approaches from traditional Marketing Research, the basic concepts of MR will always apply.  I suggest the reader become familiar with these concepts before launching into online research.

Of course, the use of Marketing Research consultants is likely to assist the process, improving the likelihood of success and eliminating or reducing errors that could ruin a good Marketing Research project.

This article is focused on Online Marketing Research.

Online marketing research is the process by which companies use the Internet to gather data from the market, generally consumers, and evaluate it. Good analysis and interpretation of the data provided by MRes can also identify popular trends that can assist a company in creating a strategy that will get better results. When used properly, online marketing research can be an effective tool that a company can use to achieve higher revenues.

Types of online market research

Online Focus Groups

Online focus groups bring participants together in an online environment to discuss topics and share ideas and opinions, facilitated by a Moderator, just as in a traditional focus group session but from the comfort of their own home or office via the Internet. Online focus groups are divided into two main types: the more common chat-based focus groups and the more recent webcam- and audio-based focus groups.

Online Community Discussion Boards

Online community discussion board market research occurs in a secure environment where participants log in and post comments on an online research discussion board in response to a specific topic. Discussion is generated by a Moderator and posted for participants to comment on. A discussion board topic can include questions, images and website or video links.

Online Surveys

Online surveys are perfect for validating ideas, measuring customer satisfaction, gathering employee feedback, and even qualifying people to participate in discussion boards and live groups.

This article focuses on online surveys – and the development of questionnaires.

The general overall process of an online research project

Understanding research needs (the difference between symptoms and actual causes)

Defining the business problem

Developing research objectives

Identifying the respondent / participant requirements

Screening of potential respondents / participants

Conducting Exploratory Research to clarify the research needed

Designing the online survey or online discussion guide

Pilot testing the online focus group or survey

Making adjustments to the online survey or online discussion guide

Scheduling the research

Conducting the research

Performing the research analysis

Interpretation (what MRes collects is “data”; to be useful it needs analysis and interpretation to become “information”)

Reporting

Strategic recommendations & debrief

Research objectives

What is happening that requires research to be conducted?

What is it that you want to know and how will the answers be used?

What decisions will be made from the answers revealed in this research?

Who is the information for and what do the stakeholders have invested in the answer?

Focusing on the questions

It’s important to remember that for most general research you may have only 5 minutes of the respondent’s time.

For targeted research, where you know the respondent has an interest and has agreed to participate, you may have 15 to 20 minutes for an individual to effectively complete an online survey.

It is important to be specific and pinpoint the project’s research objectives. By focusing only on the questions you need answers to, you will improve the quality of the results and maximise the return on your research investment.

Developing Online Questionnaires

When it comes to using Web-based surveys, keep in mind a few simple guidelines:

The Shorter the Better.

Don’t alienate survey takers with long questionnaires. Limit yourself only to what you need to know – not what “might be interesting”.

Put a status bar at the top of each question page so respondents know how close they are to being finished.
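
As a rough illustration, the snippet below shows how such a progress figure might be calculated; a real survey tool would render it as a bar, and the function and variable names here are only illustrative assumptions.

```python
# Minimal sketch: compute a percent-complete figure for a status bar.
# The names and the page-based approach are illustrative assumptions,
# not the behaviour of any particular survey platform.

def progress_percent(current_page: int, total_pages: int) -> int:
    """Return how far through the questionnaire the respondent is, as 0-100."""
    if total_pages <= 0:
        return 0
    return round(100 * current_page / total_pages)

# e.g. on page 3 of 12 the status bar would read 25%
print(progress_percent(3, 12))  # -> 25
```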

Avoid too many Open-Ended Questions.

Don’t include a lot of open-ended questions where respondents have to type out the answers. Closed-ended questions that they can answer by clicking a button (Yes, No, Maybe, Never, Often) work much better.

To get more in-depth responses, use rating scales, which ask a respondent to rate something on some type of scale, such as 1 to 5 or 1 to 10.
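
To make this concrete, here is a minimal sketch of how a closed-ended question and a rating-scale question might be represented as simple data structures; the field names and wording are invented for illustration, not taken from any survey platform.

```python
# Illustrative only: a closed-ended question and a 1-to-5 rating question
# represented as plain data structures.
from dataclasses import dataclass

@dataclass
class Question:
    text: str
    options: list[str]      # respondent clicks one of these
    is_scale: bool = False  # True when the options form a numeric rating scale

closed = Question(
    text="Have you bought from us in the last 12 months?",
    options=["Yes", "No", "Not sure"],
)

rating = Question(
    text="How satisfied are you with our delivery times?",
    options=[str(n) for n in range(1, 6)],  # 1 (very dissatisfied) to 5 (very satisfied)
    is_scale=True,
)
```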

To Get More Responses, Be Politely Persistent. If you’re asking customers or vendors to take a survey, it’s okay to send more than one invitation, especially to people who’ve previously indicated they would be willing to participate. Just make sure you’ve got people’s permission, so they don’t think you’re spamming them.

Be Patient.

Don’t get impatient if you can’t get the results right away. Even though online surveys reduce some of the work, they take time to design and administer, and when the results are in, more time to interpret.

Online Questionnaire Development and formatting

Questionnaire layout

Experienced researchers and questionnaire designers can help design better online questionnaires.

Consider several design options: all questions in one page, related questions grouped on the same page, or one question per page.

Overall structure

Left-align the questions and answers.

Place the response format to the left of each response category.

Vertical alignment has more white space, so it is visually appealing and results in fewer problems with alignment and formatting.

Be cautious about the sense of scale (e.g. the column spacing for a Likert scale) in horizontal alignment.

Keep the answer labels visible for the grid design.

Predict the visual flow and align the questions, answers and navigation buttons accordingly.

Decrease the movement time by decreasing the distance between the selected response check box (or radio button, etc) and the ‘Next/Submit’ button.

Present questions in a hierarchically ordered structure.

Provide cognitive comfort; respondents should not need to go back and forth in their memory to retrieve earlier information.

Pre-program the skip pattern (branching).
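
As a rough sketch of what pre-programmed branching looks like, the snippet below routes respondents past irrelevant questions based on an earlier answer; the question IDs and the rule table are invented for illustration.

```python
# Minimal branching sketch: the next question shown depends on the previous answer.
# Question IDs and rules are invented for illustration.

SKIP_RULES = {
    # (question_id, answer) -> next question_id
    ("Q1_owns_car", "No"): "Q5_public_transport",  # skip the car-related block
    ("Q1_owns_car", "Yes"): "Q2_car_brand",
}

def next_question(question_id: str, answer: str, default_next: str) -> str:
    """Return the next question to display, applying any skip rule."""
    return SKIP_RULES.get((question_id, answer), default_next)

print(next_question("Q1_owns_car", "No", "Q2_car_brand"))  # -> Q5_public_transport
```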

Consider item non-response error.

Decisions about whether to display questions on the same page together or on separate screens can impact the data collected.

A group of questions considered together may convey information about the purpose of the questions and may elicit different responses than if the questions were presented one at a time across multiple pages.

Consider how to guide the user’s mental model by providing appropriate question context. Such context may be conveyed in many ways, including question wording, response wording or introductory explanation, or may be implied by surrounding questions.

The grouping of response options will impact the responses recorded.

If it is significant that responses or questions fall into distinct groups, call attention to that fact in some way, for example by grouping or labelling.

List a few questions per screen: excessive scrolling can become a burden to respondents, and lengthy web pages can give the impression that the survey is too long to complete. There is some evidence that using a single screen or a few screens for short surveys minimises abandonment by respondents, whereas using a single screen and forcing the respondent to scroll down in long surveys increases abandonment.
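
A simple way to hold to this guideline is to split the question list into short pages before rendering; the sketch below uses three questions per page, which is an arbitrary choice for illustration.

```python
# Sketch: split a long question list into short pages so respondents
# never face a long scrolling screen. Three per page is an arbitrary choice.

def paginate(questions: list[str], per_page: int = 3) -> list[list[str]]:
    return [questions[i:i + per_page] for i in range(0, len(questions), per_page)]

pages = paginate([f"Q{n}" for n in range(1, 11)])
print(len(pages), pages[0])  # -> 4 ['Q1', 'Q2', 'Q3']
```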

Eliminate unnecessary questions: Avoid questions that have answers that the computer can determine, such as the date the questionnaire is filled out.
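
Such values can simply be recorded with the response rather than asked; the field name used below is an assumption made for illustration.

```python
# Sketch: record values the computer already knows (such as the completion date)
# alongside the answers instead of asking for them.
from datetime import datetime, timezone

def finalise_response(answers: dict) -> dict:
    answers["submitted_at"] = datetime.now(timezone.utc).isoformat()  # captured, not asked
    return answers

print(finalise_response({"Q1": "Yes"}))
```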

Begin the web questionnaire with a question that is fully visible on the first screen of the questionnaire, and will be easily comprehended and answered by all respondents.

Present each question in a consistent format, including question numbers, instructions, space between lines, and space between question and answers.

Limit line length so that a long line of prose does not extend across the full width of the respondent’s browser screen.

Do not require respondents to provide an answer to each question before being allowed to answer any subsequent ones. It is better to include a “prefer not to answer” and/or “don’t know” category for every item than to force respondents to answer every question.

Organise and partition an online survey to correspond with the organisation of semantic knowledge in memory. How, then, should the survey be partitioned into pages? Should it be item-based, form-based, or somewhere in between?

Paged surveys that are not congruent with sections are to be avoided.

Sectioned surveys that require scrolling should clearly indicate that additional items must be accessed by scrolling to them.

Indexes to sections and pages are of marginal benefit and may sometimes lead to confusion.

Hide inappropriate and irrelevant questions to shorten the apparent length of the questionnaire and make such questions available only if the respondent specifically needs or wishes to view them.

Instructions

Provide easy to follow instructions. Unlike the traditional face-to-face or telephone surveys, web-based surveys are not administered by an interviewer who motivates, clarifies and probes the respondents.

Embedded instructions distract respondents from their main task.

Provide a help button to access the instructions.

Instructions in pop-up windows may disturb respondents.

Enable respondents to report problems: Almost certainly, some respondents will experience some type of problem with the survey. Thus, a “help desk” should be established that respondents can contact easily by e-mail and/or toll-free phone number.

Provide specific instructions on how to take each necessary computer action for responding to the questionnaire. Answering techniques may be obvious for experienced users, but need to be explained to less experienced respondents.

Provide computer operation instructions as part of each question where the action is to be taken, not in a separate section prior to the beginning of the questionnaire.

Reduce the branching instructions to a minimum to reduce reading time, confusion, and perceived difficulty of the questionnaire.

Use colour

Proper use of colour can provide visual cues that simplify the web-based survey process.

Use Multimedia Features

Take advantage of the medium’s presentation capabilities: web-based surveys can include hypertext links to help and additional resources as part of the survey instrument without increasing its apparent length.

Use graphics sparingly: In a web-based survey, graphics can significantly slow the downloading of a web page, especially if users are likely to connect to the Internet using a modem as opposed to a high-speed connection.

Be aware of how respondents may interpret questions in light of accompanying graphics: When graphics or pictures are used, bear in mind that respondents tend to interpret questions in the context of those graphics and pictures.

Always ask whether the multimedia material is an enhancement that solves an information problem or whether it is a distraction (“eye candy”).

Consider Response Format

Use matrix questions sparingly: Matrix questions place an extra burden on the respondent because they require a lot of work to be done within a single screen. In addition, it is impossible to predict exactly how a matrix question will look on a respondent’s web browser. Therefore, they should be used with caution.

Check boxes

When there are too many options, use a matrix format.

Use “check all that apply” only when necessary, since respondents tend merely to satisfice (tick just enough) rather than optimise.

When a “none of the above” option is needed, provide it with a radio button, and prevent it from being checked together with other chosen responses.
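
A check along the following lines, run before the answer is accepted, is one way to enforce that exclusivity; the option labels are illustrative.

```python
# Sketch: for a "check all that apply" item, "None of the above" must be exclusive.
# The option labels are illustrative.

def valid_multi_select(selected: set[str], none_label: str = "None of the above") -> bool:
    """Reject a selection that combines 'None of the above' with other choices."""
    return not (none_label in selected and len(selected) > 1)

print(valid_multi_select({"Email", "Phone"}))              # True
print(valid_multi_select({"None of the above"}))           # True
print(valid_multi_select({"None of the above", "Email"}))  # False -> show feedback
```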

Radio buttons

Appropriate for “select only one” from mutually exclusive items.

The size should not change even when font size changes.

Consider precision in clicking.

Avoid default-filled radio buttons, since they may be misunderstood as an answer when respondents do not choose any answer.

Drop-down boxes

Use sparingly: they are best for a very long list (e.g. state/country of residence, prescription drugs, etc.).

Drop-down boxes require 3 mouse actions whereas other response formats require only one. Thus, use drop-down format only when the increased mouse action is worthwhile.

Not appropriate for items where typing is quicker (e.g. year of birth).

Permit type-ahead look up, since it prevents tedious scrolling.

Do not make the first option visible in drop-down boxes: it may mislead the respondents and it may be misunderstood as an answer when respondents do not choose any answer.

Give visual cues for the task in drop-down boxes (e.g. “select one”).

Avoid multiple selections on drop-down boxes: use check boxes.

Text input

Provide sufficient space for text input; consideration needs to be given to having boxes that will accommodate the amount of information required.

Provide concise and clear input guidance (e.g. MM/DD/YYYY for a birth date).

Response Categories

Pre-program checks for errors in input (e.g. inputs that do not sum to 100%, or response options checked together with the “none of the above” option) and provide feedback if erroneous input is caught.
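
For example, a sum-to-100% check might look something like the sketch below; the wording of the feedback message and the tolerance used are assumptions.

```python
# Sketch: pre-programmed check that allocation answers sum to 100%,
# returning a feedback message instead of silently accepting bad input.

def check_allocation(percentages: dict[str, float]) -> str | None:
    """Return a feedback message if the values do not sum to 100, else None."""
    total = sum(percentages.values())
    if abs(total - 100) > 0.01:  # small tolerance for rounding
        return f"Your answers add up to {total:g}%, not 100%. Please adjust them."
    return None

print(check_allocation({"TV": 40, "Online": 40, "Print": 10}))  # feedback shown
print(check_allocation({"TV": 50, "Online": 40, "Print": 10}))  # None -> accept
```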

Implement category randomisation when needed.
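
A sketch of per-respondent randomisation is shown below, keeping an anchored option such as “Other” at the end of the list; the labels are illustrative.

```python
# Sketch: randomise the order of response categories for each respondent,
# keeping an anchored option such as "Other (please specify)" at the end.
import random

def randomised_options(options: list[str],
                       anchored: str = "Other (please specify)") -> list[str]:
    shuffled = [o for o in options if o != anchored]
    random.shuffle(shuffled)
    return shuffled + ([anchored] if anchored in options else [])

print(randomised_options(["Price", "Quality", "Service", "Other (please specify)"]))
```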

Provide an open-ended field with the “other” option.

When order matters (e.g. a 5-point scale), pay attention to the response labels and spacing.

Reduce response errors by restricting response choices: a predefined response format is helpful to achieve uniformity of data, which will reduce the workload in data cleaning and processing. However, a flexible format may be more respondent-friendly. It is suggested to use radio buttons when the number of choices is relatively small, and drop-down boxes when the number of potential choices is large.

Force answers only on rare occasions: Forcing respondents to answer questions should be used only on rare occasions because the respondent may become annoyed and give an arbitrary or deliberately false answer in order to proceed to the next screen or stop taking the survey altogether.

Automation

Automatically validate input, if possible: Input validation improves data quality and saves time in data preparation. However, such validation should be user-friendly and simply identify the mistake of the user.

Access control

Provide a PIN for limiting access to authorised users only, in order to:

Prevent duplicates and foreign elements (coverage error).

Secure Confidentiality.

Let the researcher take control over access.

Always password protect web-based surveys: User passwords are necessary to restrict access and uniquely identify web-based survey respondents. Of course, restricting access for a convenience sample does not make sense, so passwords would not be issued in that particular case.
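
A minimal sketch of a one-use PIN check is shown below; in practice the PINs would be stored in a database and issued to sampled respondents, so the in-memory sets here are a simplification.

```python
# Sketch: one-use PINs restrict access and prevent duplicate responses.
# The in-memory sets stand in for a real respondent database.

issued_pins = {"K7F2QD", "P9XW41"}   # illustrative PINs issued to sampled respondents
used_pins: set[str] = set()

def admit(pin: str) -> bool:
    """Admit a respondent only if the PIN was issued and has not been used."""
    if pin in issued_pins and pin not in used_pins:
        used_pins.add(pin)
        return True
    return False

print(admit("K7F2QD"))  # True  - first use
print(admit("K7F2QD"))  # False - duplicate attempt is blocked
```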

Ensure that respondents’ privacy and their perception of privacy are protected: Because there is the risk that transmissions sent over the Internet may be observed by unauthorized users, all survey data should be encrypted. A message explaining such procedures in a clear way, and displayed just before respondents leave a secure area can alleviate any privacy or security concerns they may have.

Carefully handle respondents who fail a screening test: Depending on the nature of a survey and the respondent population, access to a web-based questionnaire can be restricted until a respondent has passed the screening questions. Two possible approaches involve excluding respondents from the survey as soon as they fail a screening question, or allowing all respondents to complete the entire survey and eliminating ineligible respondents later on.
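
The first approach might be pre-programmed along the lines of the sketch below, which terminates as soon as a screening rule fails; the screening criteria shown are invented for illustration.

```python
# Sketch of the first approach: terminate as soon as a screening question fails.
# The screening criteria are invented for illustration.

SCREENERS = {
    "age_18_plus": lambda answer: answer == "Yes",
    "bought_category_last_6_months": lambda answer: answer == "Yes",
}

def passes_screening(answers: dict[str, str]) -> bool:
    for question, rule in SCREENERS.items():
        if not rule(answers.get(question, "")):
            return False  # stop here and route to a polite "thank you" page
    return True

print(passes_screening({"age_18_plus": "Yes",
                        "bought_category_last_6_months": "No"}))  # -> False
```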

See also the article DIY Marketing Research – Questionnaire Design here in this group

Brian Monger is the Executive Director of MAANZ International and a Principal Consultant with The Centre for Market Development.  His profile can be found on LinkedIn.

He is available for consulting tasks and speaking engagements

Did you find this article useful?  Please let us know

Also check out other articles on http://smartamarketing2.wordpress.com

MAANZ International website http://www.marketing.org.au

Contact – info@marketing.org.au
