Enterprise Marketer - Make Your Marketing Matter.

Is your brand trustworthy? Is there friction between the numbers and your message?

Today's consumer will not believe your product or service is excellent just because you say it is. You must provide survey- or data-based evidence to support your claims. Marrying metrics to your message will boost your engagement rate in a B2B setting, and allowing the facts to shape the narrative lends your marketing scientific integrity.

Proper survey methodology, transparently reported, removes implied bias. The story is in the data, and any peer or journalistic review will see it immediately for what it is: honest.

Data literacy is important across your entire organization, but synthesizing figures into real insight is best done by an expert. Not only will this lend credibility, but you will end up with easily absorbed graphics that plant visual seeds with the consumer. Too often, pretentious design nails the coffin shut on a prospective buyer's interest. Clarity over cleverness.

This week on Explicit Content: Katie Martell sits down with Clare McDermott at the MarketingProfs B2BForum to examine the good, the bad and the ugly in relation to content research.

Original Research Examples mentioned in the show:


Jeff Julian Welcome to the Explicit Content Podcast. This is Truth, Lies and Digital Marketing. Katie Martell, one of our hosts, went to MarketingProfs B2B Forum and she interviewed three folks from there. Over the next few weeks, we'll rotate through some of those shows and also some of the other interviews the Explicit Content Podcast hosts have been conducting.

I also want to make a little announcement that we are launching the Enterprise Marketer conference, March first in Kansas City. For more details, go to the site conference.enterprisemarketer.com, where you can see our lineup of awesome speakers all who will be hosts of the Explicit Content Podcast over the next few months. We hope to see you there. It is a very local event, one day only, but it is our first attempt at doing what we want to continue to do year after year. We hope that you can join us if you're nearby, or you can make the trip.

Let's go ahead and get started. Here's Katie Martell and Clare McDermott talking about research.

Katie Martell Welcome to another episode of the Explicit Content Podcast, Truth, Lies and Digital Marketing. I'm your host Katie Martell. I am coming to you live from the MarketingProfs B2B Forum, held in beautiful but smoky San Francisco. I am so delighted today to be joined by Clare McDermott, the head of research and co-founder of Mantis Research. Hi, Clare.

Clare McDermott Hi, Katie. I'm delighted to be here.

Katie Martell Thanks so much for being here. We are all hunkered in the Marriott Marquis downtown, because the air outside is just not safe for consumption. Our hearts and our thoughts go out to everyone who's affected by these wildfires.

Clare McDermott That's right.

Katie Martell Today, we're going to talk about research, which is one of my favorite topics, because it is so wildly misunderstood in the marketing realm. It is an area of immense opportunity and immense failure all at once. The reason I wanted to have Clare on the show today is because Clare knows the good, the bad and the ugly of using original research in B2B.

Clare, first and foremost, you gave a great talk this morning here at the MarketingProfs B2B Forum about this topic. In it, you really showed some wonderful examples of the kind of research that we're talking about. Can you give us a little just high level view. What are we talking about when we say original research for marketing?

Clare McDermott Yeah. That's a great question. Just to be clear, let's first talk about what we're not talking about. We're not talking about market research. We're not talking about audience research. Instead, we're talking about brands producing original research with the intent of publishing the results. It really is research as content.

There are a few different categories of research that we like to talk about. One is just those traditional surveys that have been around for a long time. You're probably familiar with the McKinseys and the PwCs and the Bains that have been doing benchmark type surveys for a long time.

Aside from that example, there are tech companies sitting on mounds of data, so a second category is analyzing owned data and publishing insights from it. A third is looking at external sources of data, whether government or civic data from data.gov or paid external datasets; analyzing that external data and publishing insights from it is another type of research.

Katie Martell Mantis works with a lot of companies. Do you handle all three?

Clare McDermott Yeah, we do. We do. To be honest, I'd say about three in four people come to us for survey-based research, but the owned data and data analysis side is really growing quickly, as companies better understand that that data is such a valuable asset from a content marketing perspective.

Katie Martell Tell us about, do you have an example you can share of some of this great owned data that is used from a content marketing perspective?

Clare McDermott Yes, absolutely. I talk about a few examples in the presentation. I won't say all of them. Some of them are not safe for work, if you can believe that, but-

Katie Martell Okay, we want to hear about those. This is the Explicit Content Podcast.

Clare McDermott Okay, well then I'll tell you about that one. It's actually the most entertaining example. It's PornHub. PornHub has a micro site called PornHub Insights. On that site, they are slicing and dicing their internal user data to understand how users are affected by breaking events, be it the Olympics or the release of a movie or anything that is topical, and how it relates to their audience.

The example that I use, and it's a hilarious data graphic, shows what users in Hawaii were doing on PornHub when the last ... I don't know if you remember the missile alert that came through, and what users were doing during that time. It's important, when people think about research, sometimes they're thinking about really serious stuff, very authoritative, traditional research, but data can also be so much fun and so entertaining. PornHub is a great example of that.

Katie Martell What did they find in that study?

Clare McDermott Too bad this is a podcast, because I'd love to show the visual.

Katie Martell We'll include it in the show notes.

Clare McDermott Okay, great. Yeah, what it shows is that at the moment of the missile alert, basically people abandoned PornHub en masse, and it's really about an hour until the all clear comes, but they're slowly trickling back in. There's this sense of if I'm going to die, I might as well keep on going.

Katie Martell I love it.

Clare McDermott Anyway, that's a great example. You know, we see a lot of tech companies that now have these data hubs where they're, on a consistent basis, publishing insights from their own data.

Katie Martell Yeah. There's one example that was done by a former PR firm that I love. They're called Version 2.0 Communications, in the South End. The company's Panjiva. I'm going to get what the company does wrong, but they manage shipping data in some way. With that, you can tease out a lot of industry trends based on what's being shipped and when. If there's a major global event happening, how does that affect shipping? If there's a spike in shipping of, say, Ivanka Trump's merchandise, or a dip, what does that say about the national conversation around the Trumps and the brand? It's an interesting way to use a seemingly irrelevant company, like a shipping data company, and really plug it into the narratives of an industry.

Clare McDermott That is such a great example. Exactly.

Katie Martell Isn't that awesome?

Clare McDermott Yes, definitely.

Katie Martell I will include that link in the show notes as well. They have a whole ... Actually, you know what? They were acquired. I wonder if that data's still out there. We will find out. I love these examples. You mentioned a phrase in your talk today that I think really summarizes what this is. We mentioned PR. We're talking about marketing, but you mentioned data journalism.

Clare McDermott Yeah.

Katie Martell Can you tell us what that means?

Clare McDermott That's actually a term that is used in journalism, so it's not a content marketing term; it really is a journalism term. I talk a bit about the history of data journalism, and I won't take you through the whole arc of it, because we don't have time, but really it reflects a growing appreciation in journalism that, number one, great stories, stories that really compel people to act or to change their behavior, need data. They need data to give them authority and credibility.

Even beyond that, there's a growing sense in journalism that it's not just using data to back up an existing story, but rather mining existing data, even unstructured data, to surface stories. It's not data supporting a story; it's actually crawling through data trying to surface interesting stories. The best and most famous example of that is WikiLeaks, and all of the stories that came out of that data. But for marketers, I think that thinking like a data journalist is a useful way of thinking, much like five years ago, people were always saying, "Think like a journalist." Why not think like a data journalist? What are some of the lessons that we can learn from that?

Interestingly enough, if you go to some of the big media sites now and see what jobs they're listing, you'll see, in addition to the reporter jobs, you're going to see jobs such as, titles such as computational journalist or data scientist. It's really this sense that data is such an asset, and there are so many stories buried inside of it.

Katie Martell There are stories in the data.

Clare McDermott Yes.

Katie Martell I love that. I think it speaks to the way that a marketing department is changing in terms of what the purview is and can be across the team. I think a lot of times we look at data in marketing as something that's, you know, primarily just to manage our internal systems. You also mentioned something earlier today that I'd love to talk about, which is the idea of a data scientist playing a role in compiling data for the purposes of marketing.

Clare McDermott Yeah.

Katie Martell A data scientist whose job it is to help with content marketing, which sometimes feel like two very different domains, writing and data. Can you talk about the resources that are needed when you are a company that's ready to try something like this?

Clare McDermott Yeah, that's a good question. Obviously not every organization has a data scientist at the disposal of marketing, especially smaller companies. Does every project need a data scientist? No, but I will say that certain parts of the research process are really critical, and if you don't have the skills internally, you should look for those skills.

I'd say that the places where people get tripped up most often, where I see the most mistakes, are, number one, for survey-based content, survey design. If someone doesn't have experience designing a survey, that can go off the rails pretty quickly. The second is analysis, and especially the big mistake I see people make, which is essentially drawing insights from the data that are not correct. For example-

Katie Martell Wait, marketers would never try to be extrapolating stories just to fit their narratives. What are you talking about, Clare?

Clare McDermott Yeah. I mean, basically there are some statistics involved: ensuring that the data passes a significance test, for example. Or looking at year-over-year data. Well, have you sampled properly, such that you have a close enough sample, year over year, that you can actually make those kinds of statements? Or sometimes I'll even see people confusing correlation and causation. These are all ... You don't have to be a professional statistician to get some of this stuff; a fairly data literate person can. But if you don't have a data literate team, it's really important to look outside. That's one piece.
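To make the significance-test point concrete, here's a minimal sketch of the kind of year-over-year check Clare describes: a two-proportion z-test in plain Python. The survey numbers are hypothetical, and this is an illustration of the general technique, not a description of Mantis Research's actual methodology.

```python
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-proportion z-test: is a year-over-year change in a survey
    percentage larger than sampling noise alone would explain?"""
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    # Pooled proportion under the null hypothesis of no real change.
    p = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the normal CDF, computed via erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical: 250 of 500 respondents said yes this year (50%),
# versus 210 of 500 last year (42%). Is the jump significant?
z, p = two_proportion_z(250, 500, 210, 500)
print(f"z = {z:.2f}, p = {p:.3f}")  # p < 0.05, so the change holds up
```

If the p-value were above 0.05, a headline like "usage jumped 8 points this year" would be exactly the kind of claim the data can't actually support.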

The other completely unrelated piece, but equally important, is design. I think that companies often fall down on the design side, because they rely on an internal designer who perhaps doesn't have experience designing data graphics. There is an art and science to it. That's a really worthwhile investment to invest in a designer who has specific experience with data graphics.

Katie Martell Can you talk more about the mistakes made with the graphical element of these reports?

Clare McDermott Yeah. I would say the ... I don't know if it's the biggest. It's probably my pet peeve. It's my biggest pet peeve. When I open a report and I see this, I just shudder a bit. That is when people try to be super creative with the data graphics. That can come in a few forms. One is just picking oddball charts and graphs to use, when a simple bar chart would suffice, and not just suffice, but would be clearer. Some of these really creative, a little bit out there data graphics are hard to read sometimes. It's hard to understand them, even if you have a fairly high degree of data literacy. You have to sit there for a while to try to figure out what's going on. Simple is better, number one.

Number two, don't use a ton of color if the color doesn't mean anything. My big pet peeve is when I see a bar chart and every bar is a different color, but there's no good reason for it. My eyes are looking for a key. Why is this bar red and that one blue and that one purple? It's decorative, and it's standing in the way of clarity. I always say, emphasize clarity over cleverness. The best resource for people who are interested, the Wall Street Journal has a book that's a guide to infographics. Is it called The Guide to Data Graphics or Infographics? In any case, it is just a really simple primer. I always point people to that book. It's a great, great book, a resource for plain, simple, clear data graphics.
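As a toy illustration of that "clarity over cleverness" principle: a plain, single-"color" bar chart, sorted by value with aligned labels, communicates a ranking faster than any decorative graphic. The survey figures below are made up for the example.

```python
def bar_chart(data, width=40):
    """Render a plain horizontal bar chart: one mark, bars sorted
    by value, labels aligned -- the simple chart that usually beats
    a decorative graphic for readability."""
    longest = max(len(label) for label in data)
    top = max(data.values())
    lines = []
    for label, value in sorted(data.items(), key=lambda kv: -kv[1]):
        bar = "#" * round(width * value / top)
        lines.append(f"{label.ljust(longest)} | {bar} {value}%")
    return "\n".join(lines)

# Hypothetical survey: which channels do respondents use?
survey = {"Email": 81, "Social": 74, "Blog": 55, "Podcast": 23}
print(bar_chart(survey))
```

Every bar uses the same mark, so nothing asks the reader's eye to hunt for a key; the only variation left on the page is the one that carries meaning.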

Katie Martell That's great advice. We'll include a link to that as well in the show notes. Make sure you use your Amazon referral so you get all the referral benefits.

I love that design is an element of this. I think when people are consuming information, the idea is to make the complex simple for presenting this research.

Clare McDermott Yeah.

Katie Martell Just in general. It's the point of it all.

Clare McDermott Research is such a visual tactic. Done well, you can leave these little visual seeds on social. You can share reports. You can share research by just sharing a single data graphic and telling the story of that data graphic. Design is really important. It's often overlooked or it's just done by someone who doesn't have quite the right skills to do it effectively.

Katie Martell Right. If a company's getting started, and they want to do this kind of initiative, I'm not going to ask the question, "Where do you suggest they start?" I want to ask where most companies fall flat on their face when they are doing this for the first time, so beginner rookie mistakes with original research.

Clare McDermott I think one thing is that when we begin client engagements, we always start with strategy. Part of what we're trying to define is what story we're trying to tell. I talk a lot about not just doing research and taking inventory of it, but rather having a story and a through line through your data. Especially on the owned data side, but even on the survey side, you can come up with 200 data points from your research. Really curate that list, decide which are the super important, compelling, interesting stories you're going to tell, the ones that support your brand's message, and then put aside the other stuff.

Maybe you can use it in another way, but the biggest mistake is when companies feel like they have to inventory every single finding. In doing so, no one is going to read through it. There's no story, and there's no "so what." They're just putting facts out there, but it's not clear why they matter or which one is most important. Yeah, I think that's definitely the biggest mistake that I see.

Katie Martell It's so critical, because what you're talking about is marrying the worlds of marketing and data. I think marketers have to have a point of view. They have to have something to say about the industries they operate in. Where do you draw the line, and how do you help clients avoid doing a survey that simply backs their biased point of view? That bias is why people are skeptical about marketers having anything to do with research data. I think it's also what scares marketers away from taking a strong stance and using data to support it: the fear that buyers will look at it and say, "This is inherently biased."

Clare McDermott Yeah, it's definitely a fear. Surprisingly, I find that a survey, or any type of research, that's done well, that has a clear methodology, where you're very transparent about how it was done and what biases could be involved, actually gets a good amount of credibility and authority. Journalists do pick that up. The fear of, "Can my company really talk about this?", we don't run into that so much. I think it would be an issue if it's a survey that's not performed well.

Yeah, aside from that, another thing I tell clients is that whatever the project is, the punchline should not be, "Buy my product or service." You can be talking about things that overlap with or support your business in some way, but there really shouldn't be such a tight, "We've proved that you should invest in my company." That's just something to stay away from. You won't get any attention for that kind of research.

Katie Martell What kinds of punchlines work? For example, I love unexpected correlations, like you talked about with Hawaii and the use of PornHub.com. What other kinds of angles are really effective specifically for PR?

Clare McDermott Yeah. I wish I remembered who said this, because this is actually not my insight. Someone else who works in research says this. He says, "Look for your dream headlines." As you're thinking through what area you're going to study, be it owned data or survey based data, what would be your dream headlines? Sometimes that really helps to crystallize what topics you should cover, what's going to be of interest, how to organize it, how to ask the right questions. That's number one.

Number two, I don't remember what number two was.

Katie Martell I love the idea of thinking your dream headlines. I do. I think if the data doesn't show what you intended it to show, what do you do?

Clare McDermott Yeah.

Katie Martell You've wasted all this money.

Clare McDermott You know, it's funny, someone asked me that before. I actually had a conversation with someone from a big research firm that will remain unnamed, but you would recognize the name. He said that sometimes this does happen, that they have client engagements where the hope is that we'll find A, but in fact we found F. That has not happened to us.

What has happened with us is that perhaps we have a survey of 30 questions, and we think that these 10 questions are going to be the payoff, and for some of those, we get back results that are, you know, not super interesting. There's no clear, decisive insight from some of those questions. What's important when you design a survey is also to have some ... We always have some fun questions in there too, where it's almost like you're throwing a rock in the pond and seeing what comes of it. We might have 25 questions that are really closely tied to the story we're trying to tell, and then five that are, "Let's try this out," a goofy question that could bear fruit.

Katie Martell I love that experimentation mindset.

Clare McDermott Yes.

Katie Martell You never quite know what you're going to find.

Clare McDermott Exactly.

Katie Martell Yeah. That's the risk of working with data. You can't beat it to fit your narrative if the truth isn't there.

Clare McDermott Again, we haven't encountered an issue where all of the findings are unusable, but definitely, there have been times where a chunk of it is just not terribly usable and not as interesting as we thought it would be.

Katie Martell What's the phrase at the end of your deck today about if you beat the data long enough ...

Clare McDermott It will confess. Yeah. It's true that you can massage things in a way where you're really being dishonest with how you report the data. It's hard to tell, I guess, who does it. We certainly would never. I think that some companies use us not just for the expertise, but also for having that outside name attached, so that there's added credibility.

Something else to think about that works really well, is to get a partner. Not necessarily even a research partner, but perhaps an academic partner, or some other organization that is almost going to be vetting the research in some way, and giving you that gold seal of approval.

Katie Martell I think that's so important, especially in a world where nearly half of buyers don't know which companies to trust. If those same companies are the ones promoting research, it comes with that layer of built-in skepticism.

Clare McDermott Absolutely.

Katie Martell So that's good advice.

Clare McDermott You know, it's an issue in the industry. One thing that we're super careful about is that when people buy panel services, it's not always clear that they're buying what they think they're buying. In other words, when they're buying responses, are they actually real people who represent their target market, or are they bots? Are they people who are paid to dissemble in some way? There are absolutely question marks, say, that readers may have about how valid the research is.

Katie Martell Right, right. Is your advice to be as transparent as possible about how research is done?

Clare McDermott Absolutely, yeah. Your methodology should not be a line or two. You should really explain clearly how you sourced respondents and what the implications of sourcing them that way are. We do have clients that, to save money, survey their own audience. What is the implication of doing so? How might your audience look different from the target market you're studying? Just always be very clear about what the weaknesses are.

Katie Martell This stuff has got beautiful potential, and a lot of pitfalls along the way. Our advice is to partner with someone like Clare and Mantis Research to avoid them. Where can people find out more about you and about Mantis?

Clare McDermott You can go to MantisResearch.com. On Twitter, I'm @clare_mcd, so M-C-D.

Katie Martell You have new research that I'd love to link to from these show notes as well that I saw today, right?

Clare McDermott Absolutely.

Katie Martell Research on research.

Clare McDermott Research on research, and we are getting ready to launch the 2019 survey.

Katie Martell Awesome. We will link to the report in the show notes. Clare, thank you so much.

Clare McDermott Thanks, Katie.

Katie Martell Any other last minute advice for the folks listening that are interested in taking on this huge initiative?

Clare McDermott Yeah. I guess my last piece of advice would be, I hope we haven't scared people, and that experimentation is okay. Start small and try out some polls or some shorter surveys. Don't be so fearful of mistakes that you never try it out as a tactic.

Katie Martell I love it. Give it a shot everyone, but listen, do it right. Thank you so much for listening to this episode of Explicit Content. Have a great day.

Speaker 2 Thank you for listening to the Explicit Content Podcast. For more information, check out enterprisemarketer.com.

associated links: Mantis Research

What’s New in the Orbit Media 2018 Blogging Research?
What’s Happening at the Enterprise Marketer Conference?
