DIY tools

In a recent webinar, “What’s Hot in Market Research”, consultant Ray Poynter described the growing trend for what he called “DIY research”. This covers scripting tools such as Survey Monkey, but also extends to other specific aspects of the research process, and reporting must surely be one of them. In the same way that people can now easily design their own package holiday by booking their own flights and hotels and specifying exactly what they want, you would think that people who need access to survey research would want to access the data directly and create their own reports.

A growing number of tools can make survey data accessible online in an interactive way, so that those with access can explore it for themselves. Is this trend a good thing or a bad thing?

On the plus side, some research buyers and users will find direct access to the data very convenient. They can see what they want to see, without having to plough through a 150-page PowerPoint report to get to what interests them. Provided the interface is easy to use, they don’t need to request additional analysis: they can pull out what they want themselves, and navigate and use the data when and how they want. Potentially, they can get at the data much more quickly than by waiting for a formal report, which is particularly useful for results that need to be actioned quickly, such as alerts about unhappy customers.

On the minus side, not everyone who commissions market research is particularly interested in the data itself: some only want to know what the results mean for their business or area of responsibility, and have neither the time nor the inclination to explore data sets for themselves. They would expect the research agency, particularly a consultancy-led agency, to do all this for them and tell them what they need to know in a way that points towards action. Some users may not have the technical expertise to interrogate a data set, and could make mistakes with it. And while “real-time reporting” sounds great, for many types of survey you need the fieldwork to complete before you start looking at the results, to ensure that quotas are filled and response rates maximised.

At the end of the day, one of the things researchers are good at is looking at results, interpreting them, and communicating what they mean, and the need for this is not going to end. However, for some types of survey, enabling direct access will definitely add value and allow more use to be made of the research, making it more real to people and more convenient.

Someone outside the industry might ask why we aren’t doing this as standard these days. An interface like Google Analytics, available for free, is easy and intuitive, provides regularly updated information, supports both overview and detailed analysis, and is attractive. Why don’t all our surveys have a facility like this? The answer is that most quantitative surveys are ad hoc and specifically designed, so an online tool would need to be specifically configured each time. That tends to be expensive, particularly if the result is to look good, and often costs more than clients feel it is worth.

Two things about that: firstly, as technology improves, the costs will come down. Secondly, if the online reporting tool is seen as an add-on to everything else, it will look expensive; but most of the tools available can also be used to produce all the offline reporting requirements. Upload respondent-level data into a tool, configure it for online access, create dashboards and graphics appropriate for the different audiences, and use the same tool to create the presentations needed for face-to-face debriefs. This is the most efficient way forward.

I sometimes wonder how much our offline reporting processes cost if we add up all the time we put into creating monster PowerPoint reports. Anecdotally, reporting sometimes appears to be the most time-consuming part of the entire research process: on a large project, just getting the data into graphical format can easily take more time than updating questionnaires and setting up fieldwork. If we did work this out, we might find that using a DIY tool instead is not actually as expensive as it sounds. It would need to be used instead, though, not as well.

In other words, if we are to offer DIY reporting, and to be able to afford to do so, we should also be using it ourselves. Imagine a world in which, when fieldwork is complete, instead of a set of tables the researcher receives a link to a tool where all the data is sitting, configured with weights, nets and so on, ready for her to explore and analyse on behalf of the client. She can then evaluate the results and provide the key insight and interpretation using the tool rather than working from a set of tables. The client, meanwhile, has access to the same data sets, possibly enhanced with dashboard views that illustrate the story the researcher has identified.

Sound like a better way of working?

For my review of online reporting tools, visit