A visualisation that ticks all the boxes

The FT Data Blog published a visualisation of EU unemployment that is a real pleasure. There are a number of things I really like about it:

The amount of data is pretty impressive…

– anyone vaguely interested in the topic will have a great deal to look at here. Unemployment trends amongst the population, young people and the long-term unemployed for all 28 EU countries, over 10 or 11 time periods, together with regional unemployment rates shown on a map and a graph. All of this gives a great overview and puts each country’s issues into fascinating context.

…but the design makes it possible

If I suggested to someone that a one-page visualisation should contain 84 charts plus a map and another chart showing over 100 regions, they would probably say that it was far too much. A typical survey research report would spread that much data across a number of different slides. But you get much more from seeing it all at once, because you get the context as well as the detail. And what makes it possible is great design: a limited number of colours is used, so that the brain isn’t distracted, everything is neatly arranged, and the unnecessary is not there – axis tick marks, data labels, gridlines.

Position, size, shape and colour all convey information

The shape of each mini line chart tells the story of the last ten years in each country. For extra meaning, the position of the charts also shows the ranking of each country. The map shading gives a geographical story for the EU and also highlights interesting stories such as the difference between north and south Italy, and the varying fortunes of the Czech Republic compared with its erstwhile sister state, Slovakia.

Less is more

Minimalism brings out the real stories and avoids distractions. The bar chart displays the unemployment rate in every region, and the fact that all the bars are one colour with no borders or labels makes us focus on its shape, which illustrates the enormous variety in unemployment rates across the EU. It’s really unnecessary for every series in a chart to be a different colour, as in the PowerPoint defaults.

 

There are definitely lessons to learn from this great example which can be applied in survey research. I would like to see research reporting which puts more together on one page for better insight, uses professional-quality design, removes the clutter, and uses shape, size, colour and position to best effect.
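
As a practical footnote, here is a minimal sketch of what that declutter step can look like in PowerPoint VBA, assuming PowerPoint 2013 or later (where charts and their series are part of the PowerPoint object model) and bar or line charts. The macro name and the single fill colour are illustrative assumptions, not a finished tool:

Sub DeclutterChartsOnSlide(sld As Slide)
    ' Strip the chart junk discussed above from every chart on one slide:
    ' gridlines, tick marks, data labels and the legend go; all series share one colour.
    Const xlCategory As Long = 1, xlValue As Long = 2   ' Excel axis constants, declared locally
    Const xlTickMarkNone As Long = -4142
    Dim shp As Shape, ser As Series
    For Each shp In sld.Shapes
        If shp.HasChart Then
            With shp.Chart
                .Axes(xlValue).HasMajorGridlines = False
                .Axes(xlCategory).MajorTickMark = xlTickMarkNone
                .Axes(xlValue).MajorTickMark = xlTickMarkNone
                .HasLegend = False
                For Each ser In .SeriesCollection
                    ser.HasDataLabels = False
                    ser.Format.Fill.ForeColor.RGB = RGB(0, 84, 159)   ' one colour for every series
                    ser.Format.Line.Visible = msoFalse                ' no borders
                Next ser
            End With
        End If
    Next shp
End Sub

Run against a slide once the charts are in place, for example DeclutterChartsOnSlide ActivePresentation.Slides(1), it applies the same minimalist defaults everywhere in one go.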

Challenges faced by the independent researcher

I’ve recently joined the ICG (Independent Consultants Group), an association of around 400 freelance researchers. We had our summer party last night, and what a fun event it was. I found it refreshing to talk to people who were extremely hands-on and provided straightforward, easy-to-describe services like field recruitment and data processing, and great to be part of a group which is so mutually supportive.

The issues around survey data reporting are a little different for independent researchers than for larger agencies. For example, an independent researcher may not know what her or his next project will involve, which makes it much harder to justify the purchase of software tools. Unless you need a tool for every project, as a data processing specialist would, you are probably much more interested in a per-project arrangement, so that the cost can be justified as part of a single project.

Each independent researcher will of course have a limited range of competencies, so will be putting forward proposals and costings based on collaboration with other service providers, for example recruitment and field agencies. For something different to happen at the reporting end, it would need to be included in the research proposal right at the start, and the proposer would need to justify any additional cost. They would need to be clear about what, for example, an interactive online visualisation of the results adds in terms of value to a particular project. Factoring in additional spend once a project has been commissioned is extremely difficult, and of course this is just as true of larger agencies.

As sole operators or small partnerships, independent researchers are vulnerable to workload build-ups, and could benefit from simple processes which reduce the amount of manual time needed for a particular task. Report automation of one kind or another might therefore be attractive on any project where there is likely to be significant charting. But again it would often need to be something which could be costed on a per-project basis: the purchase of a tool is only justifiable for those researchers who can be confident they will need it again. Project-specific solutions such as fast auto-charting or significance formatting can be developed using VBA at fairly low cost, and can save a great deal of time. Having said that, there are some great crosstabbing, analysis and charting tools out there like MarketSight and Q, as well as SPSS of course, which would be exactly the right thing for some researchers.
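
To make the significance-formatting idea concrete, here is a minimal PowerPoint VBA sketch. It assumes the common convention of flagging significant figures in a results table with an asterisk; the macro name, the colour and that convention are illustrative, not a description of any particular product:

Sub HighlightSignificantCells(tbl As Table)
    ' Walk a PowerPoint results table and emphasise any cell flagged as significant.
    Dim r As Long, c As Long
    For r = 2 To tbl.Rows.Count              ' row 1 assumed to hold column headings
        For c = 2 To tbl.Columns.Count       ' column 1 assumed to hold row labels
            With tbl.Cell(r, c).Shape.TextFrame.TextRange
                If InStr(.Text, "*") > 0 Then        ' assumed convention: "*" marks a significant difference
                    .Font.Bold = msoTrue
                    .Font.Color.RGB = RGB(192, 0, 0) ' pick the cell out in dark red
                End If
            End With
        Next c
    Next r
End Sub

Called once per table, for example HighlightSignificantCells ActivePresentation.Slides(2).Shapes("ResultsTable").Table (where the slide and shape names are placeholders), it does in a second or two what would otherwise be a tedious manual formatting job.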

There is a massive range of work carried out by independent researchers, but it’s fair to generalise and say that many independents are qualitative researchers and that the quantitative work tends to be ad hoc rather than the large-scale tracking variety. So the relevant kinds of reporting solutions would often be those which are low-cost enough to be justifiable for single ad hoc projects. This calls to mind options such as a simple online interactive developed with PowerPoint like the examples I have here, which can be entirely customised, are costed on a project-specific basis, and will not break the bank either.

 

Five things you need to know before purchasing an online reporting tool

A review of the tools available for building online reports (collectively known as dashboards) suggests there are five important questions you need to answer before you can make a decision about purchasing a dashboard product.

1. When do you need a dashboard, and when do you not need one? A requirement for a dashboard can sometimes be written into a proposal as something that ought to be asked for, but it’s important to know, for example, what kind of people will be accessing the dashboard and what their requirements are (e.g. are they analytical, or will they want something straightforward and simple?), how many users will need to access it (this may have a strong bearing on cost), and what additional benefit they will get from having access to a dashboard. It may be that other communication tools would achieve just as much if not more, for example a one-off online interactive that tells a story with some eye-catching graphics, or even a nicely produced slide show or video.

2. Following on from question 1, how many dashboards will you need to build in a year? If the answer is more than about five (a rough estimate), it’s probably worth looking at a tool. Fewer than that, and it may be more worthwhile to commission a serviced dashboard each time, which means you can specify exactly what you want and don’t have to go to the trouble of purchasing new software and learning how to use it, the latter potentially costing more than the former in terms of employee time.

3. How customised do the dashboards need to be? Will the clients/end users all be happy with something which reflects their corporate identity in general terms, is secure, provides customisable access, and presents charts or tables with drill-down options and downloadable content, without being too fussed about exactly how significant differences are displayed or how brand names are shown? If so, a tool at the cheaper end will probably suffice. If there is likely to be a need to customise elements such as graphics and text, and to be more creative with layouts, make sure this functionality is part of the tool you are considering.

4. Can you make use of the entire functionality of an online reporting tool? Most products enable respondent-level data to be uploaded, and have a crosstabbing and analysis element, which means you can potentially use them to create the aggregated data you need as well as the dashboard itself. Most of them allow aggregated data to be downloaded into Excel or PowerPoint charts, which could make report production a lot easier. Some of the additional spend on buying the tool and training people to use it could therefore be offset against savings made by moving away from the more traditional data processing workflow.

5. Who will be using the tool? If the research executives who set up the project and know the questionnaire, the client and the sector also use the tool themselves, they have the advantage of being very hands-on, with the data immediately at their fingertips for analysis and finding the story. However, if the tool is complicated, which it may be if it’s at the top end in terms of functionality, this might be a struggle. Another option would be to focus expertise within a small team of people who upload the data and build dashboards on behalf of the research teams. This would allow a build-up of expertise and would mean you could get more out of a high-specification product – but it may not be how you want to work.

There are no right or wrong answers to any of these, but knowing the answers would make a decision on the right online reporting tool much easier.

I will be publishing a detailed features matrix on the products I’ve covered very soon, which will be available as a free download.

 

Instant market research insight – is it possible?

This week I was given a demo by Intellection of their Insight Out software, or not so much a demo as a detailed explanation of their approach to survey research and just how different it is.

We are all aware that there is enormous pressure to conduct and deliver market research insight at breakneck speed, or at least that’s how it seems to anyone who has been in the industry for any length of time. The technological environment we are now in has created an expectation of instant results – press a button and the software will deliver the answer. While we have the technology to deliver real-time “results” while a survey is in field, most researchers would say that there does need to be some time after fieldwork closes to process the data, apply weights, aggregate and filter, and then peruse the data to find the story, which then needs to be teased out and illustrated.

The first, and non-controversial, Intellection principle is that there ought to be a seamless technological process covering data collection, processing and reporting. The second principle is that instead of waiting while the researcher analyses each data set for insights, all of the knowledge and learning of the research team can be programmed directly into the software itself in advance, so the analysis is instant and delivers a result as soon as the data is in. This means that, together with the speed of online data collection, you can order a report on a particular sector, and if it’s a modestly sized nationally representative sample, get the results back, with insights, within 24 hours. Intellection provides the software to do this – some of the click-and-buy reports are available at zappistore.com.

The idea that research insight can be programmed in this way is intriguing. I have always thought that, while learnings can be carried across sectors (for example about what makes a strong brand or what makes customers loyal), delivering insightful and relevant research will always require some knowledge of the sector, at least some survey-specific analysis, and in some cases a research methodology which is particularly tailored to a sector. In terms of research technology, it has always seemed to me that one of the industry’s challenges is to develop software which is accessible and cost-effective, but which is also flexible enough to cope with the vast variety of different research approaches and methods that we apply.

Intellection are saying that if you really look at what we do, there isn’t that much variety after all. Questionnaire design, and what the responses to those questions actually mean for the products and services they are about, is actually quite standardised and can be distilled into a set of rules to be applied programmatically at the push of a button. The extension of this argument is that we are all spending a lot of time creating research solutions which are considered unique but between which the differences are, at the end of the day, negligible.

Given that many agencies, in particular smaller boutique consultants and independent researchers, market themselves on the basis that they take a fresh approach to every project, this is an idea which is likely to hit considerable opposition. My initial thought is that it could work on certain research products but I’m sceptical about the idea that all research can be “generified” in this way.

The big plus is timing and cost. Ordering a report on Zappistore might cost, say, £5,000 and take 24-48 hours, whereas the same report commissioned through conventional means could cost ten times as much and take two or three weeks. That’s a very attractive prospect for a research buyer, if they can be convinced that they are getting the same level of insight either way.

It’s certainly food for thought, and I’d welcome any views on this topic.

 

 

The survey research process: Is the way we work completely outmoded?

When I first started in market research, the main client deliverable on the first tracking study I worked on was sets of tables. Presentations in PowerPoint were an additional extra, occurring at key stages in the reporting cycle. Things have changed. Well, that was a while ago. We had desktop computers and internal email, but not external. It was a good few years before we could email attachments. Most international project co-ordination was done with the aid of a fax machine. I can’t now remember how we got the tables to the client. We probably couriered hard copy, or maybe we were with the programme and put them on a CD…

Anyway, as I said, that was a while back. Certainly the norm for most of my research career has been that the tables produced by the data processing team are for internal use only, designed to facilitate analysis and interpretation of the data and to serve as a basis for copying and pasting aggregated figures into the PowerPoint deck which illustrates the results and brings out the story.

Over the past few weeks I’ve been reviewing online dashboard products. Most of these products do more than provide an online dashboard: they enable respondent-level data to be uploaded, manipulated into crosstabs using intuitive functionality, incorporated into customisable charts and downloaded into (sometimes editable) PowerPoint decks which can themselves be customised. The data, crosstabs and charts are all linked, so data can be uploaded again and everything else is updated. They are essentially tabulation and automation tools in one, and the ones I have reviewed also have online dashboard features which are developing rapidly in terms of flexibility and functionality.

That makes me wonder whether the traditional survey research process, which is still the norm in sizeable agencies, isn’t starting to look a bit out of date. Can we do away with teams of data processors, deploying that resource instead in areas that will enhance reporting capability, such as data exploration, web development and graphic design?

On the one hand, this is quite a compelling argument. Executives like having the data at their fingertips to play with, rather than having to ask a different team when they realise they need to see another cut of the data. On the other hand, there will still be work involved in things such as combining top 2 box scores, grouping multiple response variables, and creating the various derived variables which add so much to the analysis. This will still have to be done, whether by researchers or data processors, so we’d need to be sure that this wasn’t just a transfer of work from one place to another in the organisation. And then there are the teeth-suckingly complex surveys where almost everything you look at is a derived variable of some sort, put together in heinously complicated ways. Having DP experts with hands-on expertise directly manipulating the data is surely a must for projects like those.
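
For readers less familiar with the jargon, a top 2 box score is simply the proportion of respondents choosing one of the two most favourable scale points. A minimal VBA sketch, assuming a 5-point rating scale and an illustrative function name, shows how small a piece of work each individual derived variable is – the point is that a complex survey needs hundreds of them, created and checked by somebody:

Function Top2BoxPercent(ratings As Variant) As Double
    ' Proportion of non-missing respondents giving a 4 or 5 on an assumed 5-point scale.
    Dim i As Long, answered As Long, hits As Long
    For i = LBound(ratings) To UBound(ratings)
        If Not IsEmpty(ratings(i)) Then        ' skip missing responses
            answered = answered + 1
            If ratings(i) >= 4 Then hits = hits + 1
        End If
    Next i
    If answered > 0 Then Top2BoxPercent = hits / answered
End Function

So Top2BoxPercent(Array(5, 4, 3, 2, 5, Empty, 1)) returns 0.5: three of the six people who answered gave a 4 or 5.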

I can, however, foresee more of the process being taken on directly by researchers using these intuitive tools, and they are perfect for small agencies or independent researchers who want to do as much as possible themselves. It seems to me that if we really want to get more out of the data, then the part of the survey research process which needs more expertise is in the exploration of the data (data science is a growing area) and in finding better ways of visualising and presenting it both offline and online.

Up against the clock: data visualisation innovations in the real world

At Research Club on Wednesday, I was explaining to a senior agency researcher that I specialise in thinking up new ways of visualising survey data. That’s all very well, he said, but when I only have two days from the data being finalised to presenting the results, where’s the time for that?

He has a point. Project timings are so tight that even a small hitch earlier in the data collection process can seriously affect the time available to analyse the results and dig out the findings. Agencies have sometimes been marked down by client companies for not providing enough value-added insight with the results, but I sometimes think the reasons for this are misdiagnosed. It’s not that researchers don’t know how to find or present insight; it’s that they don’t have the time to, because timings have slipped earlier in the research process and they are scrabbling to pull findings together right at the end to meet the original deadline.

It is relatively easy, and enjoyable, to develop new ways of presenting and communicating research data visually. The real challenge is being able to apply some of these new approaches on real world projects.

One way of addressing this is to plan ahead and develop ideas away from any specific project, ideas which can then be applied to different projects. That way the “thinking time” has already been spent, and when the live project comes to reporting, researchers can move straight into applying a new data visualisation which has already been developed, tested and tweaked.

Keeping things consistent in terms of software also has its advantages. I spend a lot of time working with PowerPoint VBA, which can usually be applied relatively seamlessly because PowerPoint is, for better or for worse, the main reporting and presenting tool used by researchers. Most outputs are editable, and there’s the option to include custom add-ins using the Microsoft Office ribbon, making the creation of a data visualisation easy and accessible to a researcher under pressure.
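
As an illustration of how little plumbing that last step involves, here is a sketch of the ribbon side of such an add-in. The button and macro names are made up for the example; in a real .ppam add-in the customUI XML lives inside the add-in file and its onAction attribute points at the callback below:

' customUI XML fragment (illustrative):
'   <button id="chartGridButton" label="Insert chart grid" onAction="InsertChartGrid"/>
'
' The matching callback, using the standard ribbon button signature:
Sub InsertChartGrid(control As IRibbonControl)
    ' One-click entry point for a researcher under pressure: hand over to whichever
    ' pre-built, pre-tested visualisation routine is wired up behind the button.
    MsgBox "Ribbon button '" & control.Id & "' clicked - run the prepared charting routine here."
End Sub

The point is that the new visualisation sits on a button next to the familiar PowerPoint ones, rather than in a separate tool the researcher has to learn under deadline pressure.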

Richer data visualisation also ought to mean smaller output, because it’s possible to squeeze more insight into a smaller space and gain more insight by seeing everything at once. This takes some planning and possibly negotiation up front, because it’s tempting from the client side to say yes, great, let’s see something different, but can we also keep seeing all the old stuff as well? With better data visualisation we can, and really should, be moving away from the 200-slide PowerPoint deck into something more attractive and intuitive, possibly interactive, and just plain smaller, something that tells a story and doesn’t somewhere along the way turn into a data dump.

In the long run therefore it ought to take less time to produce better data visualisation than a traditional linear PowerPoint report. But it will take some planning to get there.

 

Using a “Big Data” approach for survey research

Last Thursday, 800 people spent a sunny evening in Brighton packed into the Dome Theatre to see David McCandless present on Information is Beautiful. That’s a fair-sized audience for a subject I might have thought was a bit geeky. It shows there is real broad interest in new ways of looking at all the data we now have access to. Either that, or I live in a really geeky city.

David McCandless’ visualisations often contain very large amounts of data (i.e. “big data”) simplified into patterns that the eye can understand. The size, colour and position of the shapes in front of our eyes become our means of understanding what we are seeing. For example, he attempts to visualise relative sums of money: the budget deficit versus African countries’ debt, and so on. We can look up the numbers, but do we really understand how big they are and how they relate to each other?

These big data visualisations don’t show the numbers themselves; they are simply shapes which only mean something once it is explained what they represent (there are some examples on his home page). It’s the only way of making sense of the quantity of information being shown. I wonder how often we do this with survey data, and what the result would look like if we did. How often do we try to visualise our data as shapes on a page, or visualise large amounts of data just to see what it looks like?

I’m working on ways of using the VBA programming behind PowerPoint to display large amounts of data on a single slide. Here’s an example showing over 8,000 individual data points, on an xy chart. It’s from the GP Patients Survey, which collects data on every GP practice in England. There they all are, on one PowerPoint slide.
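
For anyone curious what this looks like under the bonnet, here is a minimal sketch of the approach, assuming PowerPoint 2013 or later; the random numbers simply stand in for the real survey scores, and the macro name is illustrative:

Sub BuildLargeScatter()
    ' Put thousands of points on a single xy scatter chart on slide 1.
    Const xlXYScatter As Long = -4169          ' chart-type constant, declared locally
    Const nPoints As Long = 8000
    Dim xs(1 To nPoints, 1 To 1) As Double, ys(1 To nPoints, 1 To 1) As Double
    Dim i As Long, sld As Slide, shp As Shape, ws As Object

    For i = 1 To nPoints                       ' stand-in data: two loosely related scores
        xs(i, 1) = Rnd * 100
        ys(i, 1) = xs(i, 1) * 0.6 + Rnd * 40
    Next i

    Set sld = ActivePresentation.Slides(1)
    Set shp = sld.Shapes.AddChart2(-1, xlXYScatter, 20, 20, 680, 480)

    shp.Chart.ChartData.Activate               ' open the chart's embedded data workbook
    Set ws = shp.Chart.ChartData.Workbook.Worksheets(1)
    ws.Range("A1").Value = "x"
    ws.Range("B1").Value = "y"
    ws.Range("A2").Resize(nPoints, 1).Value = xs   ' write each column in one shot, not cell by cell
    ws.Range("B2").Resize(nPoints, 1).Value = ys
    shp.Chart.SetSourceData "Sheet1!$A$1:$B$" & (nPoints + 1)
End Sub

The one trick worth noting is writing each column as a single array rather than looping cell by cell, which is what keeps 8,000 rows quick; the data grid that opens during the run can simply be closed afterwards.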

This is work in progress, but the possibilities are huge: using colour, shape and size to distinguish the points by different characteristics would give us a whole new way of visualising survey research in an understandable way, even down to respondent level. I’ll be developing this idea more in the next few weeks.

 

Will low price reporting dashboards ever be flexible enough?

Tracey Hill investigates low price reporting dashboard options


This week I’ve been looking at a number of online reporting tools offered by E-tabs, Kicktag (the Cosmos product) and Data Liberation. Two out of three of them essentially have a fully serviced dashboard offer as well as a self-service option. E-tabs is planning to launch a self-service option later in the year.

As you’d expect, if you pay a company to build a dashboard for you, you will have many more options in terms of look and feel and every aspect of functionality. With either a fully functional dashboard product behind the scenes and people who really know how to use it, or a team of developers, custom dashboards really are just that.

A custom dashboard, however, will often cost £10,000, and the price can go up to £40,000 or more. That means this kind of tool may only be used on large-scale surveys like global trackers, or research programmes that significant numbers of people will want to access.

For smaller-scale surveys, where these costs simply can’t be justified or absorbed, the options are more limited. A lot of work is being done to develop self-serve low price reporting dashboard tools, and the trade-off is always between price and flexibility, particularly with the visual look and feel. Closing, or at least narrowing, the gap between client-side expectation and supply-side technological and cost constraints is definitely an ongoing challenge.

One possible outcome is that the gap between what these things cost and what people are prepared to pay will always limit the move to online reporting. After all, the internet has been around for a while now, and the vast majority of quantitative surveys are not reported online (in the form of an interactive dashboard). I know researchers in the marketing and retail sectors who do not report online at all.

Some of the products do look promising, though: for example, a dashboard tool being developed as an integral part of the online crosstabbing and charting tools MTab and MTabView promises to offer a lot of flexibility at an affordable price. Kicktag Cosmos and Data Liberation also have some really useful functionality and are continually under development. I would be really happy to see a neat, clever low price reporting dashboard tool take off and render the 200-slide PowerPoint report superfluous.