With the ubiquity of mobile devices and smartphones, many researchers are asking how these devices can be successfully incorporated into a research setting. Whether your aim is to improve existing research projects or to try completely new research methodologies, mobile devices can augment other research methods and are also important research tools in their own right.

#1. Add Impact with Video and Picture Libraries

Most smartphones have camera and video capabilities, both of which can greatly enhance a research program. Respondents can film themselves completing a research task, such as describing the products they purchased at a store or how they interact with a new product. It's a great way to communicate the results, too: a montage of the pictures or videos can make the research findings more impactful.

#2. Quick Data Collection via Pulse SMS Surveys

A one- or two-question survey via text message is a viable way to collect data very quickly (usually within minutes). Younger age groups use email less frequently, making an SMS survey a more effective way to reach this demographic.

#3. Feedback via User-Initiated SMS Surveys

A short code can be used to allow potential respondents to initiate a survey using a keyword (e.g., "Text JAVA to 78789 to start the survey"). User-initiated SMS surveys are a useful way to gain feedback on a transactional basis. Using a variety of start words allows you to track where or when a respondent learned about the survey.

#4. Make Reminders More Effective

Text messages are an easy way to remind respondents to complete an online survey or to attend a focus group. Since most people carry their cell phones, these reminders are often more effective than telephone or email reminders. Many respondents will have internet access on their phones and may opt to complete the survey right then and there.

#5. Support Your Mystery Shoppers

Mystery shopping often requires the shopper to notice many different things during the task, such as the time spent in line or the number of people in the store when they enter. Smartphones provide an easy way for mystery shoppers to record the key points in a discreet manner so that they don't have to rely on memory. Using technology in this way provides a more accurate result for the client and makes the mystery shopping task less onerous for respondents.

#6. Aid Auto-ethnography

Ethnographic research can provide holistic, qualitative insight into consumers' lives, but having a researcher in-home is expensive and has the potential to introduce bias into the results. Fortunately, technology can partially automate this involved research process and allow participants to compile much of the information themselves. Rather than have a researcher observe the subject's behaviour, the participant can fill in a diary about his or her daily activities at specific times. Auto-ethnography relies on participants remembering to record their activities at specific times, and since many of us have our phones with us at all times, sending a timed prompt to record the necessary information works well. The smartphone itself can even be used to record that information.

#7. Run Co-research Programs & Spotter Diaries

Empowering research participants as co-researchers can be a viable way to understand complex cultural factors that researchers may not be able to identify on their own, and these methods are nicely augmented by smartphones. Co-researchers can take pictures or record their thoughts on a common topic as they go about their day. These reflections can be used to uncover market gaps and to design new products. Additionally, since marketing campaigns usually encompass executions across various media, it is difficult for marketers to understand the overlap of the various channels. Having respondents record each time they come across an advertisement for a certain brand (creating a spotter diary) can provide a better picture of the whole campaign's reach.

#8. Collect Location-Based Data

Using GPS functionality, researchers can better understand location-based information as it relates to consumers: how far they travel to a store or other location, or their travel patterns within a venue such as a mall or leisure facility. This data can reveal issues with congestion, help optimize within-venue placement, and provide a reference point for advertising metrics.

#9. Use Gamification Methods

Although gamification in research is relatively unexplored, a smartphone may be an ideal venue for research games. Canadians are already playing games on their smartphones: sixty percent do so, according to the 2012 Rogers Innovation Report. If a game is well designed, analyzing how users play it could allow researchers to gain insights into consumer behaviour that could not be measured by a survey.

#10. Get Beneath the Surface with Passive Data

Passively tracked data from participants' smartphones can be used to gain insight that might be impossible with a survey methodology. A user can opt in to provide information about websites visited, health statistics captured via GPS, communications, or any number of other types of data from their smartphone. This data could even be linked to survey responses to compare with or augment the dataset.
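To make the keyword-routing idea in #3 concrete, here is a minimal sketch of how inbound texts might be matched to a survey. The keywords, survey IDs, and sources below are hypothetical, invented purely for illustration; a real deployment would sit behind an SMS gateway and persist responses.

```python
# Hypothetical keyword map: different start words launch the same survey
# but tag the recruitment source, so you can track where a respondent
# learned about the survey (poster vs. receipt, for example).
SURVEY_KEYWORDS = {
    "JAVA": {"survey_id": "coffee_feedback", "source": "in-store poster"},
    "BREW": {"survey_id": "coffee_feedback", "source": "receipt message"},
}

def route_inbound_sms(from_number: str, body: str):
    """Match the first word of an inbound text to a survey and record its source."""
    words = body.strip().split()
    keyword = words[0].upper() if words else ""
    entry = SURVEY_KEYWORDS.get(keyword)
    if entry is None:
        return None  # unrecognized keyword: no survey is started
    return {"respondent": from_number, **entry}

print(route_inbound_sms("+13065550100", "java please"))
```

The same lookup table doubles as the tracking mechanism: counting completed surveys per `source` shows which recruitment channel performed best.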
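As a small illustration of the location-based analysis in #8, the straight-line distance a respondent travels between two GPS fixes (say, home and a store) can be computed with the standard haversine formula. The coordinates in the example are illustrative, not real respondent data.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two GPS fixes."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Illustrative: distance between two points a few kilometres apart in Saskatoon
print(round(haversine_km(52.13, -106.67, 52.16, -106.63), 2))
```

Note this is straight-line distance only; actual travel distance along a road network would be longer, so treat it as a lower bound in any travel-pattern analysis.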
 

By Briana Brownell

Does a Yellow Checkbox Give You Better Brand Equity?

The meaning of colours in branding and marketing is a popular topic: blue means you're trustworthy and yellow means fun. But do yellow checkboxes mean that respondents will give you better scores on brand aspects like approachability or likeability? Maybe... as researchers love to say, more research may be required.

Even though research on the impact of colour in surveys is pretty slim, many survey design guides warn that colour can potentially influence survey responses. Most of the research on the effect of colour in surveys has concerned response rates in mail surveys, and unfortunately, there are few conclusive results about whether colour has a significant effect online. One study found that added elements like pictures don't seem to cause detrimental results in online surveys. It's not all good news, though. Other experiments showed that question types might affect responses to perceptual questions, and in some of these instances, colour has played a role.

How Can Colours Affect Your Research?

Colour can make a survey less clear. A difficult-to-read or difficult-to-complete survey will have lower response rates and potentially misleading results if respondents misunderstand the survey questions or if the answer options are difficult to read.

A coloured mail survey could be conspicuous – or look like junk. Both positive and negative effects have been found for coloured mail surveys. If colour makes a survey more noticeable, it can serve as a reminder to complete it and increase response rates. However, if a survey is mistaken for junk mail, response rates can decrease.

A coloured scale can affect rating questions. Colour can influence the perception of a scale's spread and influence results on perception-based rating questions such as agree-disagree scales or numerical rating scales. When the colour gradient from one end of the scale to the other is amplified, respondents perceive the scale as more severe and give more moderate ratings.

Inventory Questions Are Pretty Safe

Inventory questions such as "Who is your current telecommunications provider?" or "In what year were you born?" do not appear to be affected by the question design or colours used because they have an objectively true answer. As long as the question design and layout are clear and the design doesn't cause confusion, there is no evidence that the survey's design affects the responses.

Perceptual Questions May Be Affected

Perceptual questions, on the other hand, may be affected by various factors concerning the question style. I know what you're thinking: we already know that. Very true. Perceptual questions should always be taken with a grain of salt and considered a comparator rather than an absolute measure, whether they're rainbow coloured or black on white.

Sliders seem to have some interesting effects on survey responses. Both the initial placement and the size of the slider matter: a wide slider discourages respondents from answering at the extremes, and a slider initially placed in the middle discourages a neutral response (respondents prefer to move it rather than leave it where it is). Colouring may also matter in the interpretation of the scale, if the colours used affect the respondent's perception of the measurement.

Overall, using colour and changing design seem to be okay as long as they are consistent. Think of the different question styles as different anchor points and treat them this way in the analysis.

Entering the Era of Grayscale Research Surveys?

Colour and visual elements might be a fun addition to your survey as long as you don't go overboard: clarity is key to collecting quality data. Remember that researchers see far more surveys than respondents do: make sure it's not you who is bored with the formatting. It's a safer bet to keep the wilder stuff for the inventory questions. Consistency should be a key priority in tracking work (I'm sure I'm the first one to ever recommend that!). Questions that are going to be compared should be in the same format. There, I said it: researchers, here's your excuse to feed your addiction and give respondents a few pages of item-bank radio button grids.
 

Different research challenges require different research solutions, and knowing when to use a specific approach can certainly be a daunting task. This overview highlights some instances in which online communities may be preferred over custom ad hoc research.

Combination of Quant and Qual – Online communities offer researchers a solid opportunity to gather both quantitative and qualitative data at the same time and at a lower cost. Because most online community platforms have both quant and qual tools built in, research can be conducted much more quickly and efficiently than in a combined qual-quant ad hoc study.

Demographic Segments – If you are looking to segment individuals based on demographics, online communities work well. Short surveys are used to profile individuals, and then targeted research questions are presented to the entire group to pinpoint where profile differences emerge. Groups can also be formed based on demographics, and targeted research can be conducted with specific subgroups. This approach can be achieved much more easily with an online community than with a long ad hoc questionnaire that uses skip logic to segment groups during the survey.

Regional, National, and International Research – If the research question requires insights from individuals who are geographically dispersed, an online community is an excellent research platform. If a wide scope is required, individuals can be recruited from different regions, provinces/states, and countries. Online communities are borderless, and research can easily be conducted in several languages.

Engaging Research – In place of long and often boring surveys, try an online community to spice up your research questions and increase engagement. If your research topic is dull in survey form, consider an online community to allow for a more open forum for discussion. The community also allows for innovative approaches such as co-moderation, where one or more community members take an active role in conducting the research. Rather than gathering a lot of yes/no and scale answers, you can collect rich, organic data from engaged members whom you can return to for future research questions.
 

This white paper provides an introduction to statistical and significance testing in market research and answers the following questions:

What does statistical testing mean, how is it shown, and how should it be interpreted?
Why are there multiple statistical tests, and how are they different?
What do terms like "margin of error" and "nineteen times out of twenty" mean, and how are they relevant?
Why is a margin of error not reported in online research?

Most of the time when doing marketing research, there is interest in differences between groups. Demographic groups, groups based on psychographics or attitudes, or any number of other slices and dices may be relevant to the researcher. However, for a non-researcher or a new researcher, entering the world of stat testing and interpretation can be daunting. Sometimes researchers forget that most people don't look at research results all day, and that not everyone can eyeball a significant difference!
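To give the "margin of error" and "nineteen times out of twenty" language a concrete form: for a simple random (probability) sample, the margin of error for a reported proportion can be sketched as below, where z = 1.96 is the critical value for 95% confidence, i.e. "nineteen times out of twenty".

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Margin of error for a proportion p observed in a simple random sample
    of size n, at 95% confidence (z = 1.96, 'nineteen times out of twenty').
    p = 0.5 is the worst case and is what polls conventionally report."""
    return z * math.sqrt(p * (1 - p) / n)

# A probability sample of 400 respondents carries a worst-case margin of
# error of about +/- 4.9 percentage points, 19 times out of 20.
print(round(100 * margin_of_error(400), 1))
```

Note that this formula assumes every member of the population had a known chance of being selected; online panels are non-probability samples, which is precisely why a margin of error is not reported for online research.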
 

A new independent online poll conducted by Insightrix Research suggests that residents are divided on whether the new Regina sewage treatment plant should follow a traditional Design, Bid and Build (DBB) approach or a Public-Private Partnership (P3) approach.

Awareness of and Following the Debate

Awareness of the debate regarding the development of a new sewage treatment plant is widespread. Nearly all Regina residents surveyed (96%) report they are aware of the debate taking place regarding whether the City of Regina should use a DBB or P3 approach to building the new sewage treatment plant. Further, 94% are aware that a referendum is being held on September 25th where Regina residents can vote on the issue. Additionally, eight in ten (81%) residents aware of the issue say they are actively following the discussion (22% very closely, 59% somewhat closely), while the remainder (19%) are either not following the issue at all (8%) or are only listening to what their friends or family tell them as the debate unfolds (11%).

Support for P3 vs. DBB

Respondents were presented with the following brief description of the two approaches:

The City of Regina Council unanimously approved using a public-private partnership (P3) for the sewage treatment plant because it believes this to be the best option for the city. They report that a P3 costs less than other options, is less risky and is much more likely to be built on time and on budget. However, there are some who do not support the idea of a P3 approach because they feel it does not provide accountability to citizens, it will cost more than the traditional Design, Bid and Build (DBB) approach, privatization is risky, and Regina's entire water system should be kept public.

After hearing this description, respondents were asked to state which approach they personally support. Four in ten (40%) Regina residents say they support a P3 approach, while three in ten (30%) support a traditional DBB approach. More than one quarter (27%) are unsure, and another 3% are indifferent on the issue. A P3 approach is more strongly supported by males (46% vs. 35% among females), and support for this approach tends to rise with household income. Among those who plan to vote in the upcoming referendum (66% of respondents), 45% are in favour of a P3 approach while 37% prefer a DBB method. Nearly two in ten of those who plan to vote (18%) are unsure as to which approach they support.

Research Details

A total of 400 randomly selected SaskWatch Research™ panel members who live in Regina participated in the online research study from September 11th to 15th, 2013. Quotas were set by age, gender and region to match the general population of the city. As the research was conducted online, it is considered a non-probability sample and, therefore, margins of error are not applicable.

About SaskWatch Research™

Insightrix began developing the SaskWatch Research™ online market research panel in October 2007, using high-quality recruitment techniques including telephone recruitment and referrals from existing panel members. Presently, there are over 14,000 active panel members representing all regions of the province, with distributions that mirror the general population. The panel membership closely matches the 2011 Census based on age, gender, household composition, household income and education. For more information, please visit http://saskwatch.ca.

About Insightrix

Founded in 2001, Insightrix Research Inc. is a full-service market research firm that helps clients develop, administer and manage data collection and information strategies. From its office in Saskatoon, Insightrix offers a comprehensive range of research services.

For further information, contact:
Lang McGilp, Senior Research Executive
Insightrix Research Inc.
Tel: 306.657.5640 Ext. 229
Cell: 306.290.9599
Fax: 306.384.5655
Email: lang.mcgilp@insightrix.com
Web: www.insightrix.com