
Dashboards are a great way to present information, especially when data needs to be shown at a high level. Digital dashboards are collections of key reports, metrics, KPIs, and other data that provide relevant context and highlight the essential elements of a research study. They are a great tool for presenting information to executives who may only have a few minutes to review and make decisions about a project. Here are five key points to consider when developing dashboards for executives.

#1. Dashboards are not scorecards. Scorecards are report cards for your projects: they measure performance against goals, show the success or failure of specific metrics, and are used once a project is complete. Dashboards, on the other hand, are used throughout a project and offer a snapshot of a study’s progress. They are a collection of reports, KPIs, and comments from consumers, all of which provide context for the status of a project.

#2. Looks matter. A dashboard needs to convey information quickly and clearly, so appearance is very important. All elements of a dashboard, including gauges, colour, highlights, and fonts, are critical to ensuring that messages are communicated efficiently.

#3. Dashboards should be actionable. Every dashboard should be created with the goal of making the data actionable. Since organizations collect large amounts of data, dashboards need to provide an overview of the most relevant information in a concise, clear manner. Remember that dashboards are not reports: their function is to assist with the decision-making process.

#4. A one-size-fits-all approach does not work. While dashboards keep track of the relevant information for a project, the same information and style of presentation will not meet the needs of all hierarchical levels.
According to dashboardinsight.com, performance dashboards can be loosely categorized into four levels, and each should include a different number of metrics:

- CEO/board level – about six high-level metrics
- Corporate vice president/director level – between 12 and 20 metrics
- IT strategic level – a range of 12 to 50 metrics
- IT operational level – around 20 metrics

Always begin dashboard design with a clear understanding of the end user and his or her executive level. While different levels of users will require different dashboard views, remember that you can create filters to extract the information required for each type of user.

#5. Focus on simplicity. Poorly designed dashboards cram huge amounts of data onto one screen, preventing clear understanding and slowing down decision making. With more and more web applications adopting minimalist design (a change for the better), dashboards need to be clear and simple. Use clear fonts, appropriate whitespace, and iconography to guide the user through the dashboard.
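The level-based guidance and per-user filters described above could be sketched as a simple role-to-metrics filter. This is a minimal, hypothetical illustration: the metric caps follow the dashboardinsight.com levels quoted above, but the role keys, metric names, and the `filter_metrics` helper are assumptions for the sake of the sketch, not part of any real dashboard platform.

```python
# Hypothetical sketch of filtering one master metric list into
# role-appropriate dashboard views. The caps (6, 20, 50, 20) come from
# the dashboardinsight.com levels quoted above; everything else is
# illustrative.

# Maximum number of metrics each audience should see on one dashboard.
METRIC_CAPS = {
    "ceo_board": 6,        # about six high-level metrics
    "vp_director": 20,     # between 12 and 20 metrics
    "it_strategic": 50,    # a range of 12 to 50 metrics
    "it_operational": 20,  # around 20 metrics
}

def filter_metrics(all_metrics, role):
    """Return the metrics for `role`, trimmed to the recommended cap.

    `all_metrics` is assumed to be ordered from highest-level (executive
    summary) down to most operational, so truncating keeps the
    executive-facing view at the top.
    """
    cap = METRIC_CAPS[role]
    return all_metrics[:cap]

# Illustrative master list of 30 metrics, ordered high-level first.
metrics = [f"metric_{i}" for i in range(1, 31)]

ceo_view = filter_metrics(metrics, "ceo_board")
ops_view = filter_metrics(metrics, "it_operational")
print(len(ceo_view), len(ops_view))  # prints: 6 20
```

The design choice here mirrors the advice in the text: rather than building a separate dashboard per audience, one master metric list is filtered per role, so every level sees a consistent subset of the same underlying data.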
 

By Briana Brownell

Does a Yellow Checkbox Give You Better Brand Equity?

The meaning of colours in branding and marketing is a popular topic: blue means you’re trustworthy and yellow means fun, but do yellow checkboxes mean that respondents will give you better scores on brand aspects like approachability or likeability? Maybe… as researchers love to say, more research may be required.

Even though research on the impact of colour in surveys is pretty slim, many survey design guides warn that colour can potentially influence survey responses. Most of the research on the effect of colour in surveys concerns response rates in mail surveys, and unfortunately, there are few conclusive results about whether colour has a significant effect online. One study found that added elements like pictures don’t seem to cause detrimental results in online surveys. It’s not all good news, though: other experiments showed that question types might affect responses to perceptual questions, and in some of these instances colour has played a role.

How Can Colours Affect Your Research?

Colour can make a survey less clear. A difficult-to-read or difficult-to-complete survey will have lower response rates and potentially misleading results if respondents misunderstand the survey questions or if the answer options are difficult to read.

A coloured mail survey could be conspicuous – or look like junk. Both positive and negative effects have been found for coloured mail surveys. If colour makes a survey more noticeable, it can serve as a reminder to complete it and increase response rates. However, if a survey is mistaken for junk mail, response rates can decrease.

A coloured scale can affect rating questions. Colour can influence the perception of a scale’s spread and influence results on perception-based rating questions such as agree–disagree scales or numerical rating scales.
When the gradient of the colours from one end to the other is amplified, respondents perceive the scale as more severe and give more moderate ratings.

Inventory Questions Are Pretty Safe

Inventory questions such as “Who is your current telecommunications provider?” or “In what year were you born?” do not appear to be affected by the question design or colours used because they have an objectively true answer. As long as the question design and layout are clear and don’t cause confusion, there is no evidence that the survey’s design affects the responses.

Perceptual Questions May Be Affected

Perceptual questions, on the other hand, may be affected by various factors in the question style. I know what you’re thinking: we already know that. Very true. Perceptual questions should always be taken with a grain of salt and treated as a comparator rather than an absolute measure, whether they’re rainbow coloured or black on white.

Sliders seem to have some interesting effects on survey responses. Both the initial placement and the size of the slider matter: a wide slider discourages respondents from answering at the extremes, and a slider initially placed in the middle discourages a neutral response (respondents prefer to move it rather than leave it where it is). Colouring may also matter in the interpretation of the scale, if the colours used affect the respondent’s perception of the measurement.

Overall, using colour and changing design seem to be fine as long as they are consistent. Think of different question styles as different anchor points and treat them that way in the analysis.

Entering the Era of Grayscale Research Surveys?

Colour and visual elements can be a fun addition to your survey as long as you don’t go overboard: clarity is key to collecting quality data. Remember that researchers see far more surveys than respondents do: make sure it’s not you who is bored with the formatting.
It’s a safer bet to keep the wilder stuff for the inventory questions. Consistency should be a key priority in tracking work (I’m sure I’m the first one ever to recommend that!). Questions that will be compared should be in the same format. There, I said it: researchers, here’s your excuse to feed your addiction and give respondents a few pages of item-bank radio button grids.
 

#1. Understanding online community research methodologies: Community research can be quite different from ad hoc research. Research goals may range from answering a single question over a few days to pursuing many questions and topics. The key is to understand the goals of the research in order to moderate properly for the study objectives. Understanding the methodology lets those moderating the community know how much detail to provide, how to phrase questions, and when to probe and follow up.

#2. Engage early and consistently: Prior to the launch of any online community, a plan should be in place for engaging members as soon as they join. This plan should be ready to execute the moment the community launches and should drive initial participation. Throughout the life of the panel, constant feedback and follow-up engagement should also be implemented. Examples include sharing articles or incorporating news feeds relevant to community members and providing quick follow-up feedback on member participation in community research.

#3. Pin the main questions: To make discussions easier to follow and participate in, the key questions the moderator asks should be pinned somewhere in the discussion group, preferably at the top. The name of each discussion group can be the research question itself, which will help community members readily understand the topic being discussed.

#4. Pop-up announcements are your friend: Using the pop-up features of a community software platform will help with engagement, moderation, and management. Rather than having the moderator post details over and over again, create a pop-up announcement to inform community members of details that do not fit directly into a discussion group.

#5. Keep it simple: Depending on the audience, it is generally best practice to avoid language that is too technical, trendy, or ambiguous.
Community members want to provide straightforward insights, which come from being asked clear questions.

#6. Be creative: There are ways to “spice up” a relatively boring topic. Think of all the long, unengaging surveys you may have completed, and turn questions into punchy, insight-focused statements. This may include developing a research game for members to participate in. Prizes help too! Check out this article on 6 Creative Ways to Present Your Market Research Data for more ideas.

#7. Last but not least, have fun: This is definitely a huge benefit of moderating an online community. If you interact with community members and show that you enjoy conducting research with them, odds are members will feel more comfortable opening up and sharing their insights.
 

To celebrate the beginning of summer, Insightrix held its annual staff and family summer barbeque last weekend at the Saskatoon Forestry Farm Park and Zoo. There was a bouncy castle, an inflatable slide, cotton candy, and, of course, more than enough food. Happy summer from everyone at Insightrix!