The Power of One

I was interviewed in a store the other day. I had just bought a piece of electronic kit, and on my way out I was approached by a good old traditional market researcher, complete with clipboard. He went through his questionnaire. I answered as best I could.

The truth is, though, that I felt unable to answer many of the guy’s questions (‘Was I happy with the advice I received in-store?’ How can I possibly know that until I get home, plug in the kit and see whether it meets my needs?). Rather than get into a discussion I just did my best, and between us we picked the least inappropriate pre-coded answer.

The answers I gave presumably form a small part of some report. No doubt conclusions will be drawn and maybe even actions taken. I don’t think I’m that unusual, so it seems likely that others have faced the same dilemma, answered quickly and given answers that couldn’t be entirely accurate even if they wanted them to be. Which rather calls into question the conclusions reached.

This has happened to me before – both online and in person. ‘Am I happy with the way my online complaint of earlier today was handled?’ I don’t know – maybe we should wait and see whether you do what you say you’ll do. But someone somewhere is concluding that the customer service team is doing a fine job because a healthy percentage of those contacted say they’re satisfied.

Questions like these might well be better answered by analysing some data. Take my store interview. Maybe at least as much insight could be generated by looking at the percentage of products returned to the store by category, or even at the number of questions asked about each product category on the store’s discussion forum (a rough sketch of the first of those calculations follows below).
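To make that concrete, here is a rough sketch in Python of the first of those calculations (the product categories, figures and column names are all invented; a real store’s transaction log would obviously look different):

```python
import pandas as pd

# Invented transaction log: one row per item sold, with a flag marking
# whether it was later returned to the store.
sales = pd.DataFrame({
    "category": ["audio", "audio", "laptops", "laptops", "laptops", "cables"],
    "returned": [True,    False,   True,      True,      False,     False],
})

# Percentage of items returned, by product category.
return_rate = (
    sales.groupby("category")["returned"]
         .mean()        # share of items returned in each category
         .mul(100)      # express as a percentage
         .round(1)
         .sort_values(ascending=False)
)
print(return_rate)
```

No questionnaire required: the behaviour is already sitting in the till data.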

Market research companies work themselves into something of a frenzy whenever the phrase ‘big data’ is mentioned. Dark mutterings about the threats posed to the very future of the market research industry are heard; much hand-wringing about the accuracy of some of the analytics is in evidence.

The fact is that data (big, medium-sized, or small) is great at revealing what people do and what they’ve done. It illustrates behaviour (yes, I know I’m simplifying). Used cleverly it can (on a good day) be modelled to indicate the likelihood of someone doing something in the future. Market research, on the other hand, is (or can be) terrific at explaining what people think and how they feel.

Traditional market research is (more often than not) not so great at measuring past behaviour, although of course the answers to questions like ‘Have you visited the John Lewis website in the last week?’ can be relative and indicative. If you really want to know how many people visited the John Lewis website last week (and what for), look at the data.

It’s also not a bad rule of thumb for any client commissioning research to answer their questionnaire themselves before going live. Can they do so accurately? Can they complete the online questionnaire without getting so bored that they just tick boxes to get through it as quickly as possible (as happened to me with a major cross-media survey)? Errors can be avoided. A simple exercise amongst a sample of one can be a valuable step in generating good research.

 

 

1 Comment
  1. What you say is so true about inaccurate quantitative surveys, whether online, f2f or by phone. The main problem is that everything is pre-coded to save money, so you end up searching for the least inappropriate answer, as you say. All surveys should have at least one open-ended question at the end, along the lines of ‘Is there anything else you would like to tell us about this topic?’ Provided the respondent has time to fill it in, what they say could well be an eye-opener.
