November 22nd, 2011
By Darren Breese, Research Director
With all the talk of new technologies and the like in the air, we may overlook the basics. Yet, one of the themes running through the recent Market Research Event was the notion of simply empathizing with consumers. This is nothing new of course; it is the core of what we do every day — understand consumers, bring them to life, connect them to marketers. Every day we put ourselves in the consumer’s shoes, or in some cases actually watch them put on their shoes.
Sometimes it is hard for clients to truly empathize with their consumers, because quite often they aren’t in the same boat. They may be more affluent, live different lifestyles, and have upbringings and life experiences that are poles apart. Despite all of these differences — perhaps because of them — it is the researcher’s job to do as much as possible to connect marketers and their consumers, and to do so in a way that makes the experience as engaging as possible.
A technique used successfully by one researcher at the conference was to force marketers to consume as their consumer does.
- Shop on a strict budget (like many of their customers).
- Shop with children in tow, even if that means “borrowing” kids for a day.
- Immerse marketers with triads of like-minded consumers.
- Engage in other non-shopping activities common to the target consumers.
- And, of course, keep journals to drive their immersion home.
We know and do Immersion extremely well, but Immersion research only works as well as the client wants it to, so we have to constantly look for ways to keep things fresh and fun.
Another way insight managers are using empathy is by bringing together cross-functional teams. We all know how differently right- and left-brained individuals think and process information. It can be extremely difficult for them all to work on the same page. By placing cross-functional teams together in the same room with consumers, and holding immersion sessions that help each team member empathize with their consumers, an insights manager got his team to think similarly—like their consumer. He was then able to hold Ideation sessions that led to productive concept development work.
In other words, walking in someone else’s shoes has the added benefit of forcing marketers to take off their own.
As we strive to provide marketers with actionable insights and help them connect with their consumers, we must also consistently look for new and innovative ways to help them foster empathy for their consumers. Empathy makes insights real.
November 18th, 2011
New Study Examines the TV Consumption Habits of Generations X, Y and i
Adults 18-24 and 25-34 Most Likely to Connect Social Media to TV Viewing; Teens Most Likely to Watch with Friends and Family
(National Harbor, MD—November 9, 2011) – A new study released today, “Watching Gens X, Y & i,” paints a detailed portrait of 13-34 year old consumers and how they watch television: often while taking part in up to four or five activities all at the same time, from eating, cooking and cleaning to texting, surfing the web, emailing, playing games or listening to music.
“Many 13-34 year olds are multi-media multitaskers, but their social media activities vary depending on age group,” said Char Beales, president and CEO, Cable & Telecommunications Association for Marketing (CTAM), and head of the organization that commissioned the study.
Younger generations have been raised in an entertainment world where content is available anytime, anywhere and on numerous platforms. This study exposes what teens and young adults are watching, with whom they’re watching, where, how often and on what devices.
Although about half of 18-24 and 25-34 year olds follow or “like” TV networks/shows, only 38% of those 13-17 do. The leading social networking activities while watching TV are looking up info (31% of 13-34 year olds), discussing shows online (29%), posting updates/tweeting (24%) and visiting a network or show page (22%). However, these activities are almost twice as common among 18-24 and 25-34 year olds as among teens.
This research, conducted by C+R Research, was commissioned by the Cable & Telecommunications Association for Marketing (CTAM) to investigate the effect of lifestyles and life stages on the media and technology usage of younger consumers. It included both qualitative and quantitative online phases in the summer of 2011, and also drew on data from C+R’s comprehensive syndicated YouthBeat study to provide additional context. A total of 2,124 interviews were conducted in the quantitative phase.
November 16th, 2011
By Hillary Stifler, Director
At the Market Research Event last week, one theme was played out in several presentations – categorization. And it hit home, as I’m currently working on a study whose goal is to categorize over 100 products in a way that makes sense to consumers!
Categorization was a major theme discussed by Sheena Iyengar, who spoke about “The Art of Choosing.” People make thousands of choices each day and, as she puts it, face “choice overload.” She offers three solutions:
- Cut duplicate and indistinguishable options.
- Categorize the options.
- Condition the chooser for complexity by offering the easier choices first before working into the more complex choices.
This advice is not only great for product offerings at shelf, but it is also great for business communications.
Ruben Alcaraz from Meijer spoke on data visualization and gave some great advice that I think ties into the power of categorization. He said, “It’s not that people don’t get it, it’s just that we [market researchers/those sharing our data] aren’t good at communicating it.”
Really, categorization is communication. A jumble of data on a page does not tell a story. Humans are visually-oriented and, to be an effective communicator in the visual realm, we must categorize our information in a way that makes sense to our audience.
So, Ms. Iyengar’s three pieces of advice also apply to reporting and data visualization:
- Don’t show duplicate data.
- Section off reports (or even parts of a slide) in a way that makes sense to the audience and supports your story.
- Start with the obvious, more general information and work your way into the deeper, newer information.
Categorization is not a novel concept; we have grouped and framed information forever. However, I think it is a good reminder that information is far less powerful when it’s not organized in a way that speaks to the audience. And, when organized in a meaningful way, it helps people choose where to focus their attention and it helps the author to tell the story efficiently and without ambiguity.
November 3rd, 2011
By Walt Dickie, Executive Vice President
When my two college-age sons were younger they each went through a period of fascination with what I think of as “alternate reality” or “science fiction presented as fact” cable TV programs. Our house was awash for years in aliens hidden in secret government facilities, Sasquatch sightings, and paranormal activities of all kinds.
None of this bothered me too much. Although the sci-fi-as-sci-fact genre hadn’t been so popular when I was a kid, I had loved Erich von Däniken, and devoured everything I could find about Bigfoot – once even giving serious thought (for a week or two) to writing my Ph.D. dissertation about Bigfoot lore before coming back to earth with the realization that my paltry linguistic skills weren’t up to the challenge of the Northwest Coast languages.
But my kids were getting all this via cable at a much younger age than I was, and they didn’t have the grasp on the “science” part yet. I had to have some way of talking to them about what they were seeing that admitted its attraction while warning them about taking it all too literally.
Talking to my older son one day about extraterrestrial visitors, and struggling with the knowledge that the stories of little green men that he found so fascinating certainly had some scientific plausibility, I told him that I thought it would be so cool if it was true. And I saw that what I was trying to say had finally clicked for him. Of course he was fascinated, of course people wrote about this, went to conventions, and searched the skies (and their backyards) for aliens because it would be so cool if it was true. Once the aliens, Yetis, time travel, ghost sightings, and all the rest were seen as powerfully compelling stories, he got the point I had been trying to make about his fascination with them and my parental concern.
That experience had a real effect on me; it was the first time that I fully realized how our entertainment-driven culture has become fascinated by things that are cool to think about, even when we know at some level that they aren’t true. It’s not that we value illusions of truth, it’s that we value cool so highly. This is where internet memes are born – internet flash mobs playing all the variations on a theme that is experiencing its moment of coolness.
I think about that experience a lot these days; only I’m not thinking about possible remnant populations of otherwise extinct giant apes wandering around barefoot in the Himalayas. I’m thinking about the unceasing drumbeat of blog posts and articles hawking the embrace of all-out innovation as the only means of escape from certain commercial death.
“Innovate or die” has been with us since the tech boom of the 90s, and it has evolved from a taunt thrown out by a few hard-charging internet entrepreneurs to well-nigh the accepted gospel of modern business. It seems to be especially popular among bloggers and other bystanders; a hair-trigger response to any commercial stumble, good for a quick post requiring little thought. Lately it’s become the war cry of the marketing research commentariat, who would have us toss off the shackles of hidebound, obsolete research techniques – apparently abandoning clients and revenue streams in our haste to get out of long-standing but soon-extinct business lines.
C+R and I both survived the 90s – by innovating – and I’ve had a lot of time to think about the events of those years and what they mean to our future. Our experience put me firmly in the pro-innovation camp, and I thought I was comfortable there. Then, a few years ago, my college alumni magazine, Technology Review, which has become a fairly successful commercial enterprise itself, announced that “innovation” would henceforth be its theme, its be-all and end-all.
I found that strangely troubling. I’m an MIT grad, but I am not now nor have I ever been an engineer, scientist, or even a technologist in any very serious sense. I’m an anthropologist by professional training, and what technical expertise I have has always been harnessed to business issues.
But when Technology Review started to really harp on “innovation,” I found myself wondering about all of the engineers who had spent their careers improving processes, increasing the accuracy and speed of measurements, making things more efficient, safer, or economical. Certainly those achievements required something reasonably called “innovation,” but that kind of innovation wasn’t what the “innovate or die” crowd were talking about.
You don’t “innovate or die” just by improving something. You invent something “revolutionary.” You “change the world.” You overthrow the old and embrace the new. You cause a “paradigm shift.”
I have a visceral understanding of the allure of paradigm shifts. The voice of Thomas Kuhn, extolling the concept of “paradigm shifts” in the history of science during the 60s, sang as sweet a song in the corridors of academe as The Beatles or The Dead. We wanted to get high, get laid, and cause a paradigm shift that would overthrow the stodgy dogma of whatever we were majoring in – not necessarily in that order. I remember it well. And, like most young academics, I loved the smell of it in the morning.
So why, I found myself asking, does it bother me, 40 or 50 years later, that my alumni magazine has gone all innovation on me? Was I now backing the fuddy-duddies? This question really bothered me, raising the possibility, as it did, that I was not only backing them but also joining them.
In the past few years I think I’ve finally figured it out. Kuhn was writing about big-S Science, and the Tech Review has chosen to concentrate more on Big-T Technology rather than the small-t technologists it used to write about. Paradigms are the right thing to think about when you’re talking about big-S, big-T stuff. I thought so in the 70s, and I think so now.
But if you’re a small-letter scientist, technologist, or marketing researcher, then the issue isn’t as clear. Should you, in fact, stop wasting your time on something that will be displaced by a paradigm shift?
Let’s imagine you live near Lake Michigan, as I do. It’s summer, it’s warm, and the sun is shining. The gorgeous blue waters and fresh breezes beckon. You know, of course, that the paradigm of gorgeous weather will shift, possibly very soon. And, this summer at least, you have every reason to believe that 60 mph winds and 20’ waves may make that shift very unpleasant indeed. For purposes of this little example, you have no access to any weather forecast of any kind. What do you do? Stay on shore or get in a boat?
The fact is that, until the advent of modern computerized weather models, no forecasting tools could consistently beat the prediction that tomorrow’s weather will be very much like today’s. Your best bet, then, in the absence of better information like an online satellite view, is to predict a continuation of more of the same. Shove off, hoist the sails, and enjoy the continuing lovely weather. At least in the short run.
Let’s try another analogy. You’re a marketing research company that has a good-sized survey research business. You’ve been doing well on survey research for many years, you negotiated the transition from phone and mall to the internet, and you’re doing well getting your head and hands around the mobile revolution.
But you see a paradigm shift coming. Big data could shift the paradigm and undermine the market for survey research because, given enough data, what people actually do as reflected in their searching, shopping, purchasing, friending, tweeting, and other choices on millions of web sites is a much more reliable indicator of their future behavior and opinions than anything they might report in a survey. And a paradigm shift in cognitive science says that people don’t even know what they’re thinking when they’re thinking it; that what they think they think and whatever “thought” it is that drives action in their world are two totally different things, often in conflict with one another. Who will care about what people imagine they think in the future? Your survey research could be shifted right into the trash.
Or will your recent good results from survey research continue, like the weather, at least in the short run? For that matter, even if your business tanks in the long run, what exactly does “long” mean? Do you get out of the business today? Next year? Suddenly? Gradually? And what does “out” mean? Quit? Put less emphasis on selling and marketing? Stop developing completely and freeze everything just as it is today?
Since we’re talking here about you, a small-m marketing researcher, not the big-M marketing research industry as a whole, it’s pretty hard to know the right answer. Certainly these, and other equally large, evident paradigm shifts, are going to impact Marketing Research – or Consumer Insights, Decision Guidance, or whatever we wind up calling what we do. But what happens at big-M scale may not be what happens in your immediate neighborhood.
Famously, buggy whips went out a century or more ago, killed by the big paradigm shift to internal combustion engines, automobiles, and global warming. But there are still buggy whips being made and sold, and someone is probably making a nice living off them. Along about the same time that the horse-and-buggy went out, a huge industry based on fancy feathers for ladies’ hats, that had shipped tons upon tons of feathers annually, died out, swamped by a sea change in fashion. But there are still feather merchants to be found, and their industry, once dominated by sales to fly fishermen and Vegas-style costumers, is booming thanks to a fad for hairdos incorporating feathers. I remember reading a story a few years ago about a local firm’s turnaround – they had gotten out of their struggling “modern” electronics business and established themselves as the dominant player in the global market for replacement vacuum tubes for legacy equipment and were making record profits! Vacuum tubes! Who knew? And closer to home, there are still many phone rooms plying their trade for marketing research long after the online paradigm shift moved the industry in a new direction.
Let me be clear: I’m not arguing that it’s better, wiser, or more lucrative for your career or business to live on the backside of a paradigm shift, only that there are a lot more choices and possibilities than just “get with the new paradigm” or “drop dead.” Anyone who tells you otherwise has either never managed a real business or is going for sensationalism over sense. Hooey!
It’s pretty clear why we’re so enamored of “innovate or die” – you don’t have to have paid much attention in any of your lib arts courses to recognize and understand the appeal of a story arc. We humans all crave stories and are fascinated by them, so condensing the complexities of actual life into the simple narrative of “innovate or die” is perfectly understandable, even if pretty much useless as advice. Innovators on one side – successful, profitable, with it, and future-oriented – and dead companies – fogies, fools, and bankrupt sticks-in-the-mud – on the other!
It is, I think, a winner-take-all fable for a winner-take-all age. It would just be so cool if it was true!