E136 - Elyse Wallnutt, Founder at Agility Lab Consulting, Washington DC

32:18

SUMMARY KEYWORDS

people, data, understand, third party, privacy, happening, advertising, thinking, big, marketing, marketers, advertisers, audience, companies, compliance, debbie, opt, unpaid, touch, trust

SPEAKERS

Elyse Wallnutt, Debbie Reynolds

Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds. They call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a special guest all the way from Washington, DC, Elyse Wallnutt. She is the founder of Agility Lab Consulting. Welcome.

Elyse Wallnutt  00:39

Hi, Debbie. Thank you.

Debbie Reynolds  00:41

Yeah. Well, I like to tell the story of how I met people. So you sent me an email introducing yourself, and I was really fascinated by your work in privacy and how that intersects with the audience acquisition pipeline in marketing, because I think that's such a hot issue right now. And you live in a city that I used to live in, DC; it's kind of funny. But I thought it'd be a great opportunity for us to really chat about this. You know, I feel like anything with advertising, or what advertisers do, touches humans; that's who they really want to reach. And there's been so much turmoil, in a way, around what's happening with ad tech, the tectonic shifts, and how OS makers are kind of forcing changes on other types of companies. But before we start, I want you to give an introduction of yourself and your trajectory or your path into privacy.

Elyse Wallnutt  01:55

Great. Well, I grew up in Colorado, and I graduated from Colorado State University. I entered digital advertising really quickly; after college, I worked at a boutique search engine marketing agency that primarily focused on nonprofits. So I got a lot of great exposure early on to some big brand names and really got to observe how people interacted with messaging through the lens of Google search ads, and on the SEO side as well, for more of that unpaid touch. From there, I dove deeper into the marketing space. At that time, we were investing more lightly in things like display advertising and whatnot; it was before the bigger investments started coming. But I got really intrigued by attribution and understanding how we credit different channels with the ultimate conversion in the advertising space. As I moved along in my career, as you said, I live in DC now and have worked with Amnesty International, The Nature Conservancy, and some larger political campaigns, and I became really concerned with some of the controls that big tech advertisers were putting into place without a lot of guardrails. After the 2016 election in particular, I was working at The Nature Conservancy, which is nonpartisan, and when we would do things like upload our ads into, at that time, Facebook, we would get flagged for political content because we mentioned climate change on our website. It struck me as strange that Meta was able to arbitrarily decide what qualified as US politics and what didn't, and to shut certain advertisers out of the market because of that. Then, on the flip side, for unpaid content that just crops up from their communities, they were saying they didn't want to clamp down on free speech. So we were seeing this dual perspective of advertisers being held out of the marketplace while the platforms were also saying they didn't want to repress any sort of unpaid speech. From there, I really wanted to spend more time with that. I went on to the Center for American Progress and spent some time with their tech policy team. Digging in when we were seeing some of our advertising costs rise really substantially, I was looking to see why that was happening and became aware of some of the trickle-down impact that changes like Apple's iOS update were having on the industry at large. I wanted to find that space to help marketers understand that Data Privacy is of huge value to us as humans, and we shouldn't be expecting our audiences to opt into something that we don't opt into in our personal lives. So how can we find that balance of driving revenue in a way that values explicit, informed consent and still helps us be creative as marketers and achieve our goals? I started consulting just this past fall, and I'm helping organizations understand how they can get ahead of this.

Debbie Reynolds  05:53

Very good, very good. So you've touched on this a bit, and I did a video about this a couple of years ago, if anyone cares to look at it, around Apple's App Tracking Transparency change. Just briefly, to explain to the audience: Apple made a change and actually announced it a couple of months before they were going to make it. That's when I made the video, basically saying, hey, this is going to be big; you're going to lose a lot of money from this. Apple made a huge change in the way that advertisers interact with individuals on their phones and on their Apple devices. Basically, what they did was make a lot of that advertising reach opt-in, as opposed to just opt-out. Before, advertisers had a lot more information about the device, the person, and the stuff that people did on their device, not even just within their own apps, you know, everything like what you bought, how much stuff and how much money you spent, different things. That change has had a significant ripple effect through the way that people do programmatic marketing. It basically deprecated that signal, and it's made marketers have to work harder to get people to opt in or to earn that trust. But it also made this advertising a lot more expensive, because these companies still want to make money, right? Even though they don't have the data that they used to have, they want to be able to really earn that money. So we saw, and I think you and I chatted about this, news that marketers that year had lost something like $10 billion in revenue as a result of that App Tracking Transparency change. And actually, they lost more than that; I think I saw an article around that time, within six months or so, where a bunch of companies that do that type of advertising said they had lost something like $280 billion. So it's definitely a huge change in the way that marketing is done. And, you know, I've seen very savvy, interesting, smart people in marketing trying to find alternative ways to really build trust with audiences, so that people actually want to receive their ads, and to do it in a way that doesn't sound or feel creepy.
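For listeners who want to see what that opt-in looks like in practice, the prompt Debbie describes comes from Apple's App Tracking Transparency framework: since iOS 14.5, an app has to ask before it can read the advertising identifier (IDFA), and the identifier is unavailable unless the user agrees. A minimal sketch in Swift, illustrative only and not something discussed on the show:

    import AppTrackingTransparency
    import AdSupport

    // Show Apple's tracking prompt. Tracking is off unless the user
    // explicitly opts in.
    func requestTrackingConsent() {
        ATTrackingManager.requestTrackingAuthorization { status in
            switch status {
            case .authorized:
                // User opted in: the IDFA is available for ad measurement.
                let idfa = ASIdentifierManager.shared().advertisingIdentifier
                print("Tracking authorized, IDFA: \(idfa)")
            case .denied, .restricted, .notDetermined:
                // No opt-in: the IDFA is returned as all zeros, so the
                // device-level signal marketers relied on is gone.
                print("Tracking not authorized")
            @unknown default:
                print("Unknown authorization status")
            }
        }
    }

That all-zeros identifier in the declined case is, in practical terms, the signal loss the rest of this conversation keeps coming back to.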

Elyse Wallnutt  08:33

Yeah, absolutely. I think that change, together with the web developer component and the deprecation and elimination of third-party cookies, has really forced us to be a lot more creative. I keep calling it marketing like it's 2010 again, because it's so much less reliant on algorithms and third-party learning, and it forces the business community to really rely on our chops and our creative instincts to understand what target audiences care about and make choices that are more autonomous.

Debbie Reynolds  09:13

Yeah, one other thing about opt-in: in the US, we're not an opt-in country, right? So this is very different from what we're accustomed to in the EU. The EU is very much opt-in, whereas here we're very much opt-out, you know, take whatever and then you find your way out, different things. So definitely very different. Tell me a little bit about how you're helping companies really navigate those changes and how that relates to privacy.

Elyse Wallnutt  09:51

Yes, I think the biggest thing is helping people understand what their advertising results have to do with privacy. So I lead workshops and subsequent consulting that help people understand what I call the four-step data autonomy framework. That brings together, first and foremost, the component of compliance, which, coming from my seat on the marketing side, tends to get shoved over to just, you know, the legal team or the IT team, as if it's not the marketing team's problem. But I think marketers need to be very much involved in the compliance conversation and in things like privacy policy updates, so that they're shielded, yes, but also so that we are externally messaging, in a responsible way, how data is being released to third parties. That way, as people are opting in, like you said, they understand what's happening to their data subsequently. And that protects the PR lens as well, which I think is the big thing about this conversation: as marketers, we're responsible for the brand. So how do we, taking cues from Apple, lean into earning audience trust so that we also gain the data that's handed to us in a consented way? So there's the compliance angle; there's the integration angle, when we think about some of the analytics touch points that we're going to lose with things like third-party cookie deprecation; there's the strategic angle of building first-party data acquisition and thinking about the content we can create that gives people a reason to raise their hands and say, yes, I trust you, here's my data; and then there's really thinking about the forward steps of where you want to be as a brand when it comes to earning audience trust. I tend to start with workshops because I like to bring together all those different teams I named, to build buy-in across those teams for how we're moving forward, how we're going to evaluate our third-party partners, whether they're up to snuff, whether we understand their privacy policies and agree with them, and how we assign responsibility across teams for taking care of updates. So this isn't just a moment in time; it's something we're thinking about consistently.

Debbie Reynolds  12:27

So you touched a bit on first-party data, which is kind of the holy grail of marketing; that's what you really, really want. Companies that can build trust with individuals have first-party data; that's kind of the best you could have ever hoped for. Then there's third-party data. One of my other guests, who has been a luminary in the marketing space for over 30 years, called third-party data the third rail of advertising, right? And I was like, oh my God, okay, you're a third party; you have all these new obligations that you didn't have before. So tell me a little bit about how you're talking with people about handling third-party data. I'll just give you my point of view: I have clients that are third parties to bigger companies, and they are shocked when they find out that they have all these new obligations. They're like, well, I don't have customers in California; but you have a customer who has customers in California, so the things that flow down from a lot of these privacy regulations touch you. So tell me a little bit about that third-party story and how you navigate that.

Elyse Wallnutt  13:43

Yeah, so I think there are two big pieces to it in my mind. The first is helping people understand what a third party is, because I think a lot of people don't realize that when you upload a group of email addresses to your Facebook advertising account, Facebook is a third party. It's not just a tool; it's a third party. So you have to understand what happens subsequently. And then, with Google in particular, I think people don't realize some of the trickle-down impacts of what's happening. When they use tools like Google Analytics, Google then subsequently owns their data, and its terms say it can use that data for whatever purposes it wants. That data is what powers Google's advertising insights, and it's really a free-for-all from that perspective. So it's really helping people understand that those tools are third parties, that you have to understand what happens, and that you have to represent what happens to your audience. And then, from there, it's really making sure that people understand that they need to assert really positive sentiment when it comes to making sure people get why they should trust the organization, and how they can get ahead of representing that externally.
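To make the mechanics of that first example concrete: customer-list uploads such as Facebook's Custom Audiences generally expect email addresses to be normalized and SHA-256 hashed before upload, and the platform then matches the hashes against its own users. A minimal sketch in Swift of that normalize-and-hash step; the function name and sample addresses are illustrative assumptions, not any platform's SDK:

    import Foundation
    import CryptoKit

    // Normalize and hash an email address the way customer-list uploads
    // typically expect: trim whitespace, lowercase, then SHA-256.
    func hashedEmail(_ email: String) -> String {
        let normalized = email
            .trimmingCharacters(in: .whitespacesAndNewlines)
            .lowercased()
        let digest = SHA256.hash(data: Data(normalized.utf8))
        return digest.map { String(format: "%02x", $0) }.joined()
    }

    // Hashing hides the raw address in transit, but the platform can still
    // match the hash to its own accounts, so the upload is a disclosure of
    // personal data to a third party and needs consent behind it.
    let audience = ["Jane.Doe@example.org ", "sam@example.com"].map(hashedEmail)
    print(audience)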

Debbie Reynolds  15:12

I want to touch on something you said, too, about people using these kinds of third-party plugins like Google Analytics. I've had conversations with people about this for a while. A lot of times, the way it used to be pitched was, okay, there's this cool third-party plugin, and we can have our site do this cool thing, but they didn't really understand what that thing was doing. And we've seen a lot of people get in trouble, with fines or lawsuits or FTC actions, because they really didn't know what was happening with the data. The issue is that it's your business to know. So you really need to ask those tough questions of third parties, and also have someone who understands, from a technical point of view, what's actually happening when we use these tools. Can you talk a little bit about that?

Elyse Wallnutt  16:11

Yeah, and I think the biggest part of breaking down those team silos is that if your legal team doesn't understand all the ways you're using tools, they can't do that research and provide that protective lens. So the biggest part is making sure the full team is in the loop about how data is being released, so that processes can be built for evaluating and continually updating those policies. And the thing people often miss, especially when they're interpreting privacy law at a glance, is: oh, I don't make $25 million annually, so this doesn't apply to me; or, oh, I'm a nonprofit, so I'm exempted; and therefore they ignore the law. Or, like you said, if they don't have constituents in California, it doesn't apply. What I think is important for people to understand is that it doesn't just matter what constituencies you serve; it matters what constituencies your vendors serve as well, like you said, Debbie. And from the big tech side, when I think about digital advertising, Google's choices, Meta's choices, and Apple's choices are all informed by building their own brand reputations, and we won't get a lot of heads-up about the choices they're going to make, like with the iOS update on Apple's end. Compared to legislation, where we usually have a two-year window to get ourselves into compliance, these things could happen tomorrow. So it's not so much about being behind; it's about taking steps to protect ourselves so that we're not caught off guard and we have an audience file of consented first-party data that we can rely upon in the future as things change.

Debbie Reynolds  18:09

There's a thing that I just read in the news, and I want your thoughts on it. There's a telehealth company called BetterHelp; I don't know if you've heard about this, but they are a direct-to-consumer mental health platform, and they really started booming during the pandemic. They recently paid a fine to the FTC of $7.8 million. What they did is they took something like 7 million email addresses of people who signed up for their service and uploaded them to Facebook to match, and they matched something like 4 million people. But in doing that, they basically shared with Facebook whether those people were getting mental health treatment through this app. So tell me, what is wrong with that whole picture, and why did they get in trouble?

Elyse Wallnutt  19:05

Yeah, so that's a very typical practice in digital advertising, and I think that's really the case study on how people don't understand that Facebook is a third party. You're connecting some dots there when you release that information. It's been really interesting to watch some of the companies that have had to adjust to GDPR already because they serve EU constituencies. They've had to look at their email file and prove that people consented to be on it and consented to have their data released, and a lot of people can't do that. So what we're seeing is that organizations and companies are losing something like 50 to 60% of their email file when they go through that process of evaluating: hey, do we truly have consent? Can we prove it exists? I think that fine likely happened because that consent wasn't there, and when that information was released to Facebook, it came back around. I think we'll see a lot more of that. And that's where, rather than waiting for your legal team to tell you, oh, hey, we have a Federal US privacy law, if that happens someday, and you have to restart your email file, why not start now? Make sure the people you have on file have raised their hands and said, yes, this is okay, and that you're set up in terms of form compliance and opt-ins, so that you don't have to go through that in such big sweeps later.
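One way to make consent "provable" in the sense Elyse describes is to record, alongside each contact, what was agreed to, when, through which form, and under which privacy policy version. A minimal sketch in Swift of what such a record might capture; the structure and field names are illustrative assumptions, not a legal or industry standard:

    import Foundation

    // A minimal record of provable consent for one contact: what was agreed
    // to, when, and through which form. Field names are illustrative only.
    struct ConsentRecord: Codable {
        let emailHash: String        // hashed address, not the raw email
        let purposes: [String]       // e.g. ["email_marketing", "ad_matching"]
        let source: String           // e.g. "newsletter_signup_form_v3"
        let collectedAt: Date        // timestamp of the opt-in
        let policyVersion: String    // privacy policy version shown at the time
    }

    // Before releasing a contact to a third party, check that consent for
    // that specific purpose exists and can be produced on request.
    func canShare(_ record: ConsentRecord, for purpose: String) -> Bool {
        record.purposes.contains(purpose)
    }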

Debbie Reynolds  20:50

Yeah, I would say the message, at a high level, is that if you're going to take data that you've been given and transfer it to any third party, that should raise an alarm for you. There has to be more thought there. I realize that from a technological standpoint it's very easy to do, right? But think about it not just from a legal compliance standpoint but also from a trust standpoint. If someone didn't say that you could share their information with Facebook, you really need to listen there. I think another reason why this company got this particular settlement is that they were dealing with health information, right? Let's say you transfer data to a third party and it isn't health-related or anything; you probably would still get in trouble, but you probably wouldn't be in as much trouble, because health information is considered more protected. And we're seeing State laws really call out things like health, financial, or race information as more sensitive data. So having that data is just more risk. What are your thoughts on that?

Elyse Wallnutt  22:11

I completely agree with that. I think anytime you're entering into a third-party relationship, that's the perfect framework for thinking about it: how much risk do we want to take on? One of the points I've been making is that when it comes to things like Google Analytics, it's positioned well in the marketplace because it's free, so people opt into it for that reason. But when we look at the risk associated with it, is it truly free? Because you're taking on the responsibility of communicating externally, and you're taking on the responsibility of understanding Google's policies on an evergreen basis and updating continually. So is it more worthwhile to think about a paid tool that's already GDPR compliant and less beholden to future change? My opinion is yes; we need to start thinking about ROI in a business framework from a different perspective, one that takes the right risk calculation into account.

Debbie Reynolds  23:14

Yeah. What are your thoughts on zero-party data? First of all, a lot of people haven't even heard of this, but I want your thoughts on it, because I work with people who actually play in that space. First, explain what zero-party data is, and then give me your thoughts on how people can use it.

Elyse Wallnutt  23:35

So zero-party data is data that an individual gives you themselves. They fill out a form, or they've communicated with you in some way; it might be in person. It's really a one-to-one mechanism for them to give you that information directly. Thinking about it from the nonprofit angle, when people fill out an online petition or sign up for email, that's the zero-party data that gives you that information flow. I think the challenge from the business side is always the question of scale: how do we do this in a scalable way? That's where I think it's important to start thinking about your content strategy, first and foremost, as marketers, in a brand new way. And, you know, the word influencer means so many things, and it can get a bad rap based on what we see on social media. But when we think about thought leadership as influence, and about utilizing your community partners as brand ambassadors, we can build some of those zero-party data touch points that create that mechanism for collection. So I do agree that scale is challenging, and that's where I've tried to prepare people that, to your earlier point, Debbie, we probably will see some slowdown as we adapt to these things and new regulations. But that's all right; that's what we need when it comes to showing that we value our audiences and how they want their data treated.

Debbie Reynolds  25:30

So what is happening right now, maybe in the news, that concerns you? Is there something where you're saying, oh, I don't like this new development, something that's happening related to privacy?

Elyse Wallnutt  26:27

I think it's concerning when we make short-term decisions rather than thinking about the long term. When we make choices like uploading audience data to various ad platforms or signing on for that third-party co-op, that might meet our metrics for the year, but it isn't thinking about the long haul of brand reputation and audience trust. And I think what concerns me most is this conception, internally from business teams, that we need to be hitting that 10% year-over-year growth margin regardless, and that the only thing externally impacting that is the economy. That's a fallacy that doesn't take into account all of the technical pieces of what's at play here. I think we're going to need the finance side to become a lot more savvy about digital infrastructure and the ways in which that bigger sphere is changing, so that they understand it's the state of the industry, and not the people in the seats, that's driving slower growth margins. Then we can make better projections that properly estimate what revenue is going to look like, so we don't see what we're seeing all over tech, where that COVID bump and some of those economic upticks resulted in huge influxes in staffing that are now being reduced. The more we have that fuller, more faceted understanding among executives, the better positioned we'll be.

Debbie Reynolds  28:25

Very good. Well, you're in Washington, so I'm going to ask you this question. What do you think about us getting a Federal privacy law in the US?

Elyse Wallnutt  28:42

I'm for it, because I think the value of a Federal law is that, right now, we're seeing five States implementing State-level laws this year, and the last count I read was 22 States considering them, so we're going to have to comply with State laws anyway. The problem with complying with State-level laws instead of a Federal law is that it's so beholden to change, and you have to know each State's perspective. A Federal law would create a lot of efficiency on that front, just speaking from a practicality standpoint. And, you know, rules are never fun, right? But they also create the basis for equity, so that we understand how data is being used and we understand the risks at play when data is treated like something that can be weaponized and there's no penalty. Really, it's been a long time since we've had any oversight in tech, and I think that would be a strong move, so that we're not beholden to the actions of a few rather than the respect that we need for the many.

Debbie Reynolds  30:03

Very good, very good. So if it were the world according to Elyse, what would be your wish for privacy anywhere in the world? Whether it's law, human behavior, technology, what are your thoughts?

Elyse Wallnutt  30:23

My ultimate wish would be, you know, I've always said that when people understood what was happening with their data, we would see some kind of reckoning, and I think that's what we're running into right now. So my wish would be that people take the time to get that grounding and understanding, so that we have that basis to work from and can implement things assuming people get that bigger picture. I think a big piece of all of this is the education that's missing. People don't understand, you know: if I hit Accept All on that cookie banner, what does that actually mean? So it's making sure that, no matter what we legislate or move toward, we create some literacy around that data transfer.

Debbie Reynolds  31:21

Yeah, I'm all for it; I agree with that wholeheartedly. Well, thank you so much for being on the show. It's been a pleasure. Thank you for keeping us up to date on what's happening, especially in ad tech and these tectonic shifts that we're experiencing. I really appreciate you being on the show. Thank you.

Elyse Wallnutt  31:40

Thank you, Debbie. And thanks, everyone for listening.

Debbie Reynolds  31:42

Yeah, talk to you soon.
