This week I was given a demo by Intellection of their Insight Out software, or not so much a demo as a detailed explanation of their approach to survey research and just how different it is.

We are all aware that there is enormous pressure to conduct and deliver market research insight at breakneck speed, or at least that’s how it seems to anyone who has been in the industry any length of time. The technological environment we are now in has created an expectation of instant results – press a button and the software will deliver the answer. While we have the technology to deliver real-time “results” while a survey is in field, most researchers would say that there does need to be some time after fieldwork closes to process the data and apply weights, aggregate and filter, then peruse the data to find the story, which then needs to be teased out and illustrated.

The first, and non-controversial, Intellection principle is that there ought to be a seamless technological process covering data collection, processing and reporting. The second principle is that instead of waiting while the researcher analyses each data set for insights, all of the knowledge and learning of the research team can be programmed directly into the software itself in advance, so the analysis is instant and delivers a result as soon as the data is in. This means that, together with the speed of online data collection, you can order a report on a particular sector, and if it’s a modestly sized nationally representative sample, get the results back, with insights, within 24 hours. Intellection provides the software to do this – some of the click-and-buy reports are available at zappistore.com.

The idea that research insight can be programmed in this way is intriguing. I have always thought that, while learnings can be carried across sectors (for example about what makes a strong brand or what makes customers loyal), delivering insightful and relevant research will always require some knowledge of the sector, at least some survey-specific analysis and, in some cases, a research methodology tailored specifically to that sector. In terms of research technology, it has always seemed to me that one of the industry’s challenges is to develop software which is accessible and cost-effective, but which is also flexible enough to cope with the vast variety of research approaches and methods that we apply.

Intellection are saying that, if you really look at what we do, there isn’t that much variety after all. Questionnaire design, and what the responses to those questions actually mean for the products and services they are about, is actually quite standardised and can be distilled into a set of rules to be applied programmatically at the push of a button. The extension of this argument is that we are all spending a lot of time creating research solutions which are considered unique but whose differences are, at the end of the day, negligible.
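To make the claim a little more concrete, here is a minimal sketch of what “programming the insight in advance” might look like: an analysis rule written once by the research team and then run automatically over each new aggregated data set. This is purely illustrative and assumes nothing about how Intellection’s software actually works; the measure names, threshold and wording are all invented for the example.

```python
# Hypothetical sketch of a pre-programmed "insight rule" applied to survey data.
# The measure names, the 30-point threshold and the wording template are
# invented for illustration; this is not Intellection's implementation.

def brand_awareness_rule(results: dict) -> str | None:
    """Fire a canned insight if prompted awareness far outstrips spontaneous recall."""
    spontaneous = results.get("spontaneous_awareness")   # e.g. 0.18 (18%)
    prompted = results.get("prompted_awareness")          # e.g. 0.62 (62%)
    if spontaneous is None or prompted is None:
        return None
    if prompted - spontaneous > 0.30:
        return (f"Prompted awareness ({prompted:.0%}) is well ahead of spontaneous "
                f"recall ({spontaneous:.0%}): the brand is recognised but not front "
                f"of mind, so salience rather than reach looks like the issue.")
    return None

RULES = [brand_awareness_rule]  # in practice, a large library built up in advance

def generate_report(results: dict) -> list[str]:
    """Run every pre-programmed rule over the aggregated survey results."""
    return [insight for rule in RULES if (insight := rule(results))]

if __name__ == "__main__":
    sample = {"spontaneous_awareness": 0.18, "prompted_awareness": 0.62}
    for line in generate_report(sample):
        print("-", line)
```

The point of the sketch is simply that, once the rules exist, producing the “insight” is a lookup-and-format exercise that takes milliseconds, which is what makes the 24-hour turnaround plausible.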

Given that many agencies, in particular smaller boutique consultants and independent researchers, market themselves on the basis that they take a fresh approach to every project, this is an idea which is likely to hit considerable opposition. My initial thought is that it could work on certain research products but I’m sceptical about the idea that all research can be “generified” in this way.

The big plus is timing and cost. Ordering a report on Zappistore might cost, say, £5,000 and take 24-48 hours, whereas the same report commissioned through conventional means could cost ten times as much and take two or three weeks. That’s a very attractive prospect for a research buyer, if they can be convinced that they are getting the same level of insight either way.

It’s certainly food for thought, and I’d welcome any views on this topic.