Customer Story

Finding the Story: How a global creative agency tapped into data science

Machine learning success is built by working effectively with people, not just machines

Presented by Brandon Shockley, Director of Research, 160over90

In this video, Brandon describes the agency’s data mining journey, from early prototypes to actionable consumer insights. He outlines creative ways to apply machine learning to market research for customer segmentation, messaging, and brand health tracking.

The Problem? Advertising used to be about holding the winning set of traits to get customers to buy. Now, it focuses on memory: linking the right traits so the consumer can recall the brand. 160over90 wants to use data science to find hidden insights and patterns, build truly meaningful customer segmentation, and encode its clients’ products into consumers’ minds.

The Solution? 160over90 implemented machine learning to find the traits that drive consumer recall. RapidMiner made machine learning more approachable, letting the team use it within already-established marketing techniques for an easy transition into a decision support tool. 160over90 can now understand which characteristics really matter when segmenting customers into meaningful groups, and from there tailor its efforts to these more informed subgroups of consumers.

Get the Slides

Watch the full video below to learn how 160over90 brought machine learning into marketing research.

00:05 [music] So we’re 160over90. We’re a marketing branding agency. Really, a creative agency. We’re not in healthcare, even though our name is a reference to elevated blood pressure, but you’ll get a sense about why that’s the case. But I’m going to move pretty quick. I’m from the advertising industry, which means I have too many slides. So I’m going to go through this very quickly. Quick agenda and some things we’ll be talking about. I’m going to walk through some of our process for how we really adopted machine learning, some of the ways we’re using it and how we built buy-in around it internally, and then some of the ways we’re bringing machine learning into the market research workflow. So the space that I’m in and that our team is in is really using consumer insights to try to inform messaging strategy, media strategy, and creative strategy. So we’ll talk about that.

00:47 If there’s one thing you take away from this talk, I’d like it to be this: machine learning success is built by working effectively with people, not just machines. And that seems a little off-kilter, out of sorts. We’ve seen a lot of live demos. We’ve seen a lot of really hands-on, in-the-weeds stuff. And we’ll see cool models and data science this entire conference. But, very much in keeping with things we’ve heard earlier from Ingo and others, data comes from someplace that involves people, and then it has to go somewhere, and it’s also used by people. So the more time you spend with people, the better your models are going to be, and the better your performance is going to be.

01:24 There’s a lot of adjustments you can make to tuning parameters and all those kinds of things, but if you’re doing research on a call center, go sit in the call center. If you’re doing research on a supermarket, go shop in the supermarket. Go actually interact with the data that you’re talking about. And because we come from market research that’s really how we kind of orient ourselves, but that’s not quite as common in the space of working with data. So the only type of model you can do your data analysis on from your desk is building a model about your desk. So it’s important to get out and actually interact with the situation.

01:54 So who’s 160over90? We’re a team of 800. We love selfies. There’s me in the back. And as I mentioned, we’re unusually fond of selfies. And if you could indulge me, because I have to prove to my boss that I was here, if you could, we’ll take a quick selfie on both sides here. We’ll start with side left. Make some noise, folks. There you go. That was weak. The Tableau people are going to laugh at us. If we’re like this. All right. Let’s give it a shot. Make some noise data people. Come on. Show them what you got. There you go. That’s the sound of presenting right before lunch is what you call that phenomenon. [laughter]

02:28 So as far as what we do, has anyone heard of a company called Endeavor? Are you familiar with this company at all? Limited? Okay. Great. So 160over90 is an agency. It’s the 800 people you saw. We do marketing and advertising. Endeavor is the company that owns us. And that’s actually a pretty broad collection of entertainment, marketing, and media companies. So Ultimate Fighting Championship, New York Fashion Week, the Miami Open, professional bull riding, along with that 800-person agency I mentioned, and William Morris, the largest talent agency in the world, are actually all part of one global company of 8,000 people called Endeavor.

03:02 We operate across these seven verticals and actually represent all these people. So people like Justin Timberlake and Mahershala Ali and Serena Williams are all clients of Endeavor, which represents them. Endeavor also owns some of the properties that they compete in or act in, as well as actually doing marketing services around them. The last one there is education. So 160over90, in particular, is the largest marketing agency for higher education in the US, which a lot of people don’t know. But we work with dozens of different colleges and universities.

03:31 Here’s a mix of some of the different clients we work with, to give you a sense of the types of things we do. And it runs the gamut. Particularly with universities, because that’s a very common use case for us, we’re doing the recruitment and enrollment marketing toward the front end and, increasingly and more importantly, the advancement marketing. So if you get a call or email from your school and they’re saying, “Hey, donate to us. Give some money to the annual fund,” we actually work on those campaigns and help to coordinate them and develop strategy and messaging for them. So across all of our clients, we do about $40 billion of fundraising. It ends up being a pretty big number when you aggregate across a lot of different higher ed institutions.

04:05 I wanted to give you a sense of some of the work we create, then explain how this has anything to do with data, and then look at some of our use cases. But I wanted to just highlight some recent work for a company called LightLife. You may have seen this in the supermarket. There’s a big movement toward plant-based foods, plant-based burgers, and things like that. LightLife is one of these companies. They make hot dogs and sausages and hamburgers that aren’t actually hot dogs, sausages, and hamburgers; they’re made from plants. So this is an example of some creative work we did in promoting that brand and its products. [music]

04:34 Shhh. Goldie’s sleeping.

04:37 No baby she’s not asleep.

04:39 She’s dead. [music] There are a lot of theories on how to parent. You’ve got to find what works for you. And what works for us is honesty.

04:51 Brutal honesty.

04:53 So what were you guys doing?

04:54 Well, sometimes when mommy and daddy consensually agree to a romantic encounter, the mommy takes the daddy’s p– [music]… aka coitus.

05:05 Wow. Even drew to scale.

05:12 Are you okay?

05:13 Oh, I’m totally fine, bud. I just hide in here sometimes to get away from you guys. [music] There’s no way a 300-pound man can fit in a chimney.

05:22 They come every month and they’re always unpleasant.

05:24 You’re not good at the game, so we let you win.

05:27 We can’t because daddy had a vasectomy.

05:29 What are you going to do? Call the cops on mom? It’s legal in California.

05:32 Most of it’s not real. [music] The hardest moments are around dinners.

05:39 Mm-mm.

05:40 Are they good?

05:42 Honestly, they taste like *BLEEP*. [laughter]

05:49 But then we discovered LightLife.

05:51 Kids, dinner’s ready.

05:53 Hamburgers!

05:55 Well, actually, they’re LightLife plant-based burgers.

06:00 They taste just like regular burgers, but they’re made from plants, and they’re delicious.

06:04 Honestly?

06:05 Would we lie to you?

06:08 Hun.

06:10 Honey?

06:10 Hun, we haven’t gotten to gestation yet. [laughter]

06:13 LightLife, the plant-based burger that tastes so good, you don’t have to lie about it being plant-based. Honestly.

06:21 So that’s an example of some of the types of creative work we do. And I’m going to get into what role data and market research play in that. Across all the different campaigns we work on, how do you get an understanding of customers to know that people have this trade-off between giving their kids food that’s delicious, that they want to eat and are excited about, and things that are actually healthy? If you can understand that decision-making process and some of the things people are thinking about when they’re trying to solve that problem, that can help you make fun and creative advertising. It also highlights the connection between those different parts of our company I mentioned: Dax Shepard and Kristen Bell, who are married in real life, are also Endeavor clients on the William Morris side. So there’s a kind of through-line that connects a lot of these different areas in creating something like this.

07:05 So, the role of insights. I’m going to do kind of an Advertising 101, very high level. I know we have a mix of people who are all smarter than me, all STEM people making cool algorithms and stuff. So I’m going to do the quick insider take on advertising. Advertising that doesn’t happen at the point of purchase must work through memory. That’s a foundational, fundamental idea. The point of purchase could be at a cash register. It could be signing a contract for something. It could be online when you’re booking a room. That’s the point where you actually take an action, where some persuasion can actually take place. If I’m sitting next to you and saying, “By the way, sign up for the deluxe suite in the hotel,” I can persuade you in the moment. Anything else has to interact with you through your memory. Either implicitly or explicitly, it has to be something that you can have a feeling about or recall. And that’s kind of the fundamental idea about advertising. You need to be memorable. You need people to remember, or at least have some feeling toward, that brand. After showing something like that about LightLife, now in the store, when it’s next to something else that you haven’t seen a funny video about or heard about before, there is some sort of memory, either implicit or explicit, that you can call back.

08:11 We used to think– and this is for the first probably 70 or 80 years of advertising. We used to think advertising worked like this. So you have a hand of cards, and you want to find the winning set of traits that will persuade customers to buy. So I can tell you information about my product. I can give you a value proposition. I can tell you the seven reasons why this headache medicine is going to work and it’s so great. And I’m competing with others, who may have some similar cards in their hand, but ultimately, I’m trying to get that winning combination. And ultimately, what lots and lots of research and lots and lots of time and market experience has shown is that’s not really that important.

08:40 People talk a lot about differentiation. Nobody really cares about differentiation. People worry about paying their taxes and getting to work on time and stuff. It really has a lot more to do with what things are memorable. What things people can remember and connect back to the problem they’re trying to solve. In that ad, the problem they’re trying to solve is selling healthy food to their– or serving healthy food to their kids. And so that’s really why a lot of advertising has funny characters and jokes and bright colors and things that actually are really easy for us to encode in memory. A lot of this is a memory game and having something memorable that we can connect back to a problem.

09:10 So a better analogy for how advertising works is actually the game of Monopoly. In Monopoly, you’re trying to cover as much of the board as possible. You’re not trying to get a winning combination of traits; you’re actually trying to say, “I want to cover as much territory as possible.” In marketing, what that means is linking a brand to as many traits as possible so it comes to mind easily as a solution to a problem. But don’t take my word for it. We’re going to do a quick game. We’ll do a survey. This is exactly what we do with real surveys. You’re going to shout out. We’ll try to synchronize this. We’ll see if this works. And I do this in different settings with varying degrees of success. What is a cool sneaker brand for athletes?

09:46 Nike.

09:46 What is a refreshing beer you can drink at the beach?

09:50 [crosstalk] Bud Light, Corona.

09:52 Okay. [laughter] Did you see? It’s funny. The data folks aren’t drinking as much beer. What is a car insurance company that will save you money?

10:00 GEICO.

10:01 So that ability for something to come to mind in the moment with that prompt is called salience. It’s the ability, when you’re trying to solve that problem, for that product or service to come to mind. That is actually what advertising does and how advertising works in the market. Four out of five brands that grow in market grow by growing salience: that ability to say, “I need something that does X,” and be the brand that comes to mind for that. So ultimately, that’s what we’re trying to do. Where that comes back to data and market research is that this is why we need to understand the memories and feelings people have when they’re trying to solve a problem. So if you’re trying to create this ad about plant-based food, you want to understand what parents are thinking about. What are the feelings they have? What are their frustrations? What is the chain of events they go through when they’re trying to solve that specific problem? Those are all things that get worked into the ad to make it effective and hopefully funny and engaging.

10:52 So that’s just some context about how we operate and what the use case, or at least the goal, is. Knowing that that’s what we’re doing day in, day out (my associate director, Katie, is sitting here in the audience), we have a team of designers and creative people and strategists and coders that we work with, and we have to go to them and say, “Machine learning is valuable. It can help you do more work like this and make your job easier.” So the premise, the business opportunity when we started our machine learning journey, was that consumer insights are a competitive advantage, and machine learning helps us find new and novel patterns in marketing data. This was really how we served it up internally, how we started to build buy-in. And throughout our process, this was our business case.

11:35 To do that, we had to make machine learning approachable. So we’re going to talk about what our onboarding process was. We have a really diverse set of teams, and we’re a big company, as you saw. So we’re going to talk about some of the things that we’ve found. I wouldn’t necessarily say they’re best practices, but they worked for us. They’re things that help engage people who are not totally data-oriented or maybe not as familiar with things like machine learning.

12:00 It starts with getting support from management. For most folks, if you’re in some sort of client service business, or even if you have internal clients, you have to go into a meeting that looks like this, where you’re going to talk to a decision-maker who has to take their pile of jewels and money and give some of it to you, over time, for you to invest and grow a part of their business. One general best practice for how we engage our colleagues: everyone in this room is doing cool things with decision trees and cool models, and we can explain that. We can justify the case based on the data and the insights. But in addition to that, we need to go to our colleagues in finance and justify it based on net present value and return on investment. Being able to speak some of that language is really critical in making the business case. Finding somebody on that other side, in finance or in operations, who can help you organize that business case and say, “This investment today is worth this much in the future,” is really critical. So building that kind of stakeholder engagement is important. Again, look, there’s me.
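
As a rough illustration of the finance framing Brandon describes (not something shown in the talk), a back-of-the-envelope net present value and ROI calculation for a hypothetical machine learning investment might look like this in Python; all figures and the discount rate are invented for the example.

```python
# Hypothetical NPV / ROI sketch for pitching an ML investment; all numbers are made up.
def npv(rate, cash_flows):
    """Net present value, where cash_flows[0] happens today."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

initial_cost = 50_000          # up-front tooling and training spend (assumed)
annual_benefit = 25_000        # estimated incremental value per year (assumed)
years = 3
flows = [-initial_cost] + [annual_benefit] * years

print(f"NPV at 10%: ${npv(0.10, flows):,.0f}")                                      # ~ $12,171
print(f"Simple ROI: {(annual_benefit * years - initial_cost) / initial_cost:.0%}")  # 50%
```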

13:02 There are a few different things we tried out as far as building engagement among our colleagues. One is we created an internal Slack channel. Anyone could post things to it that had to do with data science: a lot of articles about Netflix and Uber and how people are using data science, making it relatable to an actual product. We’ve used the HBO film about Brexit, which, if you haven’t seen it, I highly recommend. Documentary is the wrong term; it’s actually a dramatization about Brexit and about the campaign, and it deals with all of the contemporary issues around predictive modeling, consumer privacy, third-party data, purchase data, and so on. And it’s all in a convenient narrative that people can see in a film.

13:42 One of the first things someone outside of the research and data team posted to the research Slack was this: a data visualization of various rappers organized by their vocabulary. You can even sort it by the members of Wu-Tang Clan if you want. One of the things we found is that data visualizations like this are one of the first ways your non-data colleagues are going to get pulled into things having to do with data. It’s one of the things that’ll help get them excited and make this relatable. So these are, again, some of the tools and content we’re sharing.

14:09 On that point about infographics, we also started creating some of our own. We’re fans of Matt North and the RapidMiner team and community, and what they’ve done in terms of putting out material about the data mining process, workflows, things like that. So we created these assets that could be used in decks and with teams to help illustrate the data mining process, for people who are trying to understand, “Okay, I know there’s data. I know there’s information somewhere. What are you actually doing with it? What are the steps?” This is part of a bigger presentation that explains each of those steps.

14:36 Another thing is briefing books, built around the idea of knowing enough to be dangerous. We’re not giving people the expectation that they have to master machine learning or become an expert in something outside their core area of competency, but they should learn enough to communicate it. That’s especially important for us because we’re a client service organization, so we’re working with clients. This is the rubric of what we include: for any given machine learning application (it could be churn prediction, it could be message testing), what’s the value to clients? What are the use cases? Typical budgets and timelines. FAQs. Top technical terms, which we’ve actually found to be very helpful: what are the top pieces of jargon you’re going to hear and want to be able to understand? What are the deliverables? What are the roadblocks and watch-outs? And again, this can apply whether you’re doing this internally or with external clients. So that’s just one of those approaches.

15:25 Finally, lunch and learns. Anything you do with food is going to be more successful, and if you can buy people pizza or beer, even better. So we would have lunch and learns with account teams, with different teams internally, and roll out these capabilities and do something like this. Ingo already stole my thunder up here and showed the Titanic case study. But we would go and explain the process of making a decision tree, a simple model, using actual characters from the Titanic. The idea is: if you want to learn how predictive modeling works, come to this workshop and we’ll build one together. The easiest way to learn is to build one. So we start with this. The characters are well-known. We say, “With a little help from Jack and Rose, we’ll build a model to predict who survived.” Most people should be familiar with it, but spoiler: it’s not Jack, unfortunately.

16:08 Then, through a longer presentation where we actually show each of the steps in that data mining process, we end up making a real decision tree. We talk about women and children first and that Jack didn’t make it. But lo and behold, Rose does make it. So we can turn this into a fun and creative story that relates back to an actual film people are familiar with, which makes it all a little more approachable.
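
For readers who want to try the same workshop exercise themselves, here is a minimal sketch of a Titanic decision tree. The talk built this in RapidMiner; this version assumes scikit-learn and seaborn instead (seaborn downloads the classic Titanic dataset), so treat it as an approximation of the demo rather than the agency’s actual process.

```python
# A rough stand-in for the Titanic lunch-and-learn demo described above.
# Assumes scikit-learn + seaborn; the agency's own version was built in RapidMiner.
import pandas as pd
import seaborn as sns
from sklearn.tree import DecisionTreeClassifier, export_text

titanic = sns.load_dataset("titanic")                        # fetches the classic dataset
X = pd.get_dummies(titanic[["sex", "pclass", "age"]], columns=["sex"])
X["age"] = X["age"].fillna(X["age"].median())                # simple imputation for missing ages

# A shallow tree keeps the story tellable: "women and children first" shows up in the splits.
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=25, random_state=0)
tree.fit(X, titanic["survived"])

print(export_text(tree, feature_names=list(X.columns)))
```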

16:29 So as far as the nitty-gritty, here’s what we’re actually doing with the tool, how we’re actually starting to use it, and some things you might be able to do from a marketing approach with the data. First, the typical workflow for a market research project. Think about a quantitative survey. We just did a survey of 10,000 grocery shoppers over the course of months and months in a lot of different cities. That’s the type of study we’d be doing. So we work with our client to understand the business objectives. And again, as much as possible, actually go to the source. Actually find out.

16:57 So we shopped in the store. We ate the strange cookies and things that they sell. We actually get a really in-depth, personal familiarity with what’s going on in the data. Then we design a study, a very typical market research study: we’re asking about product uses and basket size, who you go to the store with, things like that. We develop and test the online survey instrument, conduct the fieldwork, and move into analysis and reporting.

17:18 Where more and more external data is starting to come in is things like anonymized CRM data. To take a different example, I mentioned the work we do in alumni fundraising earlier. There are huge troves of data in terms of donor records and things like that. So we’re able to bring in anonymized data, linked to a customer ID or a donor ID, around things like average giving amount and how much you participate in on-campus events, and bring that into those stages of the workflow in terms of how we conduct the fieldwork and our analysis.
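
As a hedged sketch of what bringing that anonymized CRM data into the survey workflow might look like: the talk doesn’t show the actual schema, so the file and column names below are invented purely for illustration.

```python
# Hypothetical sketch: joining anonymized donor/CRM records onto survey responses
# via a shared anonymous ID. File and column names are made up for illustration.
import pandas as pd

survey = pd.read_csv("survey_responses.csv")        # assumed to include a "donor_id" column
crm = pd.read_csv("anonymized_crm.csv")             # donor_id, avg_gift_amount, events_attended, ...

enriched = survey.merge(
    crm[["donor_id", "avg_gift_amount", "events_attended"]],
    on="donor_id",
    how="left",       # keep every respondent, even without a CRM match
    validate="m:1",   # each respondent should match at most one CRM record
)
```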

17:44 And then the data mining workflow I talked about earlier comes in at this analysis and reporting phase. So this is, in a sense, our roadmap for how we’re going to augment the traditional market research process to bring data mining into it and actually be able to bring a lot more tools to bear: a lot more interesting kinds of models and approaches to working with survey data beyond just crosstabs, regression analysis, and things like that.

18:07 So some applications in terms of how we’re actually rolling this out in a couple of places. So we’re going to talk about two main things. One being decision trees for customer segmentation and then message testing as a classification problem. Originally, this slide had a lot more things on it. But like I said, I already have too many slides. So I’m going to be condensed here.

18:26 So this is an example from a donor segmentation study, and it could apply to anything; it’s kind of a generic version of the question. A survey question like this would go out to a few thousand global alumni of a school, for instance, and we’d ask, “How likely are you to make a donation to [insert school here] in the next 12 months?” This is a Likert scale question, so you have a set of boxes. In research lingo, we call this the Top 2 Box, or T2B. People who answer somewhat to very likely are considered our likelies; that’s around one-third of people. Everybody else is considered our unlikelies. And really, that’s not totally accurate, because our unlikelies include a lot of neutral people and so on. From there, we build a decision tree of demographic traits, donor past behavior, attendance at events, and so on, against these two classes, to try to understand who’s likely or unlikely.
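
For concreteness, here is a minimal sketch of that Top 2 Box recode in Python; the real survey data and response labels aren’t shown in the talk, so the values here are stand-ins.

```python
# Hypothetical Top 2 Box (T2B) recode: a 5-point Likert item becomes the binary target.
import pandas as pd

responses = pd.Series(
    ["Very unlikely", "Unlikely", "Neither likely nor unlikely", "Somewhat likely", "Very likely"]
)  # stand-in for the "likelihood to donate" survey column

top_2_box = {"Somewhat likely", "Very likely"}
label = responses.isin(top_2_box).map({True: "likely", False: "unlikely"})
# Note: as in the talk, "unlikely" also absorbs the neutral respondents.
```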

19:08 And you end up with something like this. This is a redacted version for demonstration purposes; if my math is wrong, don’t sue me. But this is an example. In a case like this, you could see something like average giving amount being the leading split, the leading factor that determines which segment people end up in. If your average giving amount is less than $450 (that’s where most people fall, about three-quarters of people or so), the next biggest split would be whether you’re willing to support the university’s fundraising effort: whether the person is willing to make calls on the phone, to go to events, to advocate for the university. From that, you get two segments, the fourth and fifth segments that we found. Look at segment four, for instance, which is two-thirds of the audience. It’s 67% of the audience and 46% of donors. So the concentration of donors there is not high, but in absolute numbers, that is where you would find the most total donors. It’s also where you’d find a lot of people who are on the fence. So that’s a key audience, even though it’s not as concentrated. Contrast that with segment three, which is 12% of the audience but 30% of all donors. So it’s really over-performing, with a really high concentration in terms of responsiveness to donation. You can even go beyond that: once you know the average donation amounts for the different segments, you can actually begin to do a valuation and understand the value at stake in terms of philanthropic giving from these different groups of people.
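
As a hedged sketch of that segmentation-and-valuation step (the real study is redacted and was built in RapidMiner, so every file name, field, and parameter below is invented), one way to read segments off a fitted tree is to treat each leaf as a segment and summarize audience share, donor share, and average gift per leaf:

```python
# Hypothetical sketch: fit the donor decision tree, treat each leaf as a segment,
# then size and roughly value the segments. All names and data are invented.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

df = pd.read_csv("alumni_survey_with_crm.csv")                     # made-up file
df["likely"] = df["donation_likelihood"].isin(["Somewhat likely", "Very likely"]).astype(int)

X = pd.get_dummies(df[["avg_gift_amount", "events_attended", "willing_to_volunteer", "age_band"]])
tree = DecisionTreeClassifier(max_depth=4, min_samples_leaf=100, random_state=0)
tree.fit(X, df["likely"])

df["segment"] = tree.apply(X)                                      # leaf id doubles as a segment id
summary = df.groupby("segment").agg(
    pct_of_audience=("likely", lambda s: len(s) / len(df)),        # share of all respondents in this leaf
    pct_of_likelies=("likely", lambda s: s.sum() / df["likely"].sum()),  # share of all likelies captured here
    avg_gift=("avg_gift_amount", "mean"),
)
# Rough value at stake per segment: likely donors in the segment times its average gift.
summary["value_at_stake"] = summary["pct_of_likelies"] * df["likely"].sum() * summary["avg_gift"]
print(summary)
```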

20:25 So decision trees on their own have shortcomings as a model. You can overfit. You can run into a lot of different challenges with them in handling different types of variables. But they’re really easy to understand. They’re really relatable. And when you show a client that Titanic example and then this, they can get a really intuitive grasp of it.

20:41 And so this is increasingly how we’re using machine learning. We’re not necessarily deploying a model real-time that’s updating and constantly learning. We’re using it as a decision support tool and particularly as a tool to identify segments that we’re then going to create creative briefs for. So our creative teams who are going to then create videos and events and content and messaging are going to then develop distinct messaging for each of these groups based on all the other demographics they share.

21:04 The other application is message testing as a classification problem. Think about typical direct marketing, or any sort of addressable media: something where I know who each message is going to and what their responsiveness to it is. That’s how the problem is structured. If I contact someone, I want to know: will they buy or won’t they buy? That’s roughly what most lead scoring is organized around. So does anybody know the challenge with this, or the missing piece of this model, if I just say, “If I contact this person, will they buy or not?” Don’t all yell at once.

21:39 What if I don’t contact this person? What if someone I didn’t contact was going to buy anyway, or not buy anyway? It’s actually a classification problem with more categories, rather than just “If I contact you, will you buy or not?” So it comes down to this: there are people who won’t buy in either case, who are, in a sense, a lost cause. There are people who will buy in either case, who are sure things to some extent. And there are people who will buy if you don’t contact them but won’t buy if you do. That actually happens in areas like telecommunications, where I call you up and say, “Would you like the new offer for this discount?” And you say, “I still have this account? Let me cancel it, because I moved in with my roommate and I don’t need an extra account.”

22:17 So there are two audiences whose behavior is pretty stable in terms of response to marketing, and only one that generates positive lift. That’s the controversial and surprising thing here: positive lift really comes from the most persuadable group of people, who will not make a purchase if you do not contact them, and who will make a purchase if, and only if, you do contact them. So this works in a lot of different ways.

22:39 Here’s how you could set that up as a classification model. You start with a data warehouse of different customers. You’re going to take a random sample; that’s the keyword here and the critical thing. It has to be a random sample from that data warehouse where you have your whole vector of customer information and all that stuff. Then you’re going to target the exposed group with the ad, and from that, you’ll have an outcome. From your control group of people who didn’t receive the ad, you’ll have a certain group of people who took the offer or didn’t, and from the exposed group, people who took the offer or didn’t. Maybe they’re members of a gym who signed up for the extra spin class or personal training or something. And from there, look, there’s a classification problem. So we can take that data warehouse of customer data we have, try our different direct messaging treatments on them, and then try to understand the profile of someone who will respond if we contact them, who won’t, and the people who are potentially going to change their behavior either way.
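
Here’s a hedged sketch of one common way to set that up in code: a simple “two-model” uplift approach that scores the gap between predicted purchase probability when contacted and when not contacted. The talk describes the framing, not this implementation, and every file and column name below is hypothetical.

```python
# Hypothetical two-model uplift sketch for the control/exposed test described above.
# Assumes a randomized holdout and already-numeric features; all names are invented.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

df = pd.read_csv("campaign_test.csv")    # features + "exposed" (0/1) + "took_offer" (0/1)
features = [c for c in df.columns if c not in ("exposed", "took_offer")]

exposed = df[df["exposed"] == 1]
control = df[df["exposed"] == 0]

model_exposed = RandomForestClassifier(n_estimators=200, random_state=0)
model_exposed.fit(exposed[features], exposed["took_offer"])
model_control = RandomForestClassifier(n_estimators=200, random_state=0)
model_control.fit(control[features], control["took_offer"])

# Estimated uplift = P(buy | contacted) - P(buy | not contacted).
# Large positive scores suggest "persuadables"; strongly negative ones suggest "sleeping dogs".
df["uplift"] = (
    model_exposed.predict_proba(df[features])[:, 1]
    - model_control.predict_proba(df[features])[:, 1]
)
```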

23:31 A final data point to consider, and then we’ll do maybe some quick questions. With one rare exception, the entire population is made up of other people. Not a controversial view, but true, right? So this is why, again, even though we talk about different techniques and different approaches, and there are a lot of esoteric things we can do in tuning models and devising really advanced systems, it comes back to working effectively with people. Go to the source of your problem. If there’s one thing you change coming out of this, it’s to actually spend time with the people who create the data and the people who use it, and your models will be more effective. Thank you. [applause] [music]
