E102 - Tom Chavez, CEO and Co-Founder, Ketch
SUMMARY KEYWORDS
privacy, data, debbie, consumers, systems, people, companies, request, problem, business, create, api, brand, ticketing system, happening, trust, deletion, consent, customers, compliance
SPEAKERS
Debbie Reynolds, Tom Chavez
Debbie Reynolds 00:00
Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
Hello, my name is Debbie Reynolds. They call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world, with information that businesses need to know now. So I have a very special guest on the show, Tom Chavez; he is the Co-Founder and CEO of Ketch. Nice to see you here. Thank you for being here.
Tom Chavez 00:41
Absolutely. Debbie, thank you so much for having me. I've been looking forward to this.
Debbie Reynolds 00:45
This is exciting for me because I've liked your tool for quite a while. I like the moves you guys are making in the marketplace. And I thought this would be a great opportunity to take a deep dive into why you decided that you needed to create a company like Ketch and your journey into this privacy world.
Tom Chavez 01:08
Absolutely. It's not been an immediate thing, and in some sense, I feel like this has been in the crock pot on slow for at least 10-15 years. So my journey into all of this dates back to probably at least 2010. Before Ketch, I had co-founded and led another company that was in the Data Management, Marketing Tech arena. And even then, Debbie, my team and I had a lot of passion for matters of privacy. We saw a lot of companies trying to harness the power of consumer data for better content, marketing, and commerce; of course, that was exciting. But we saw companies starting to take liberties and do some things with data that gave us pause. So in that context, we'd actually started to build a companion product that would give consumers the ability to control their data signature, right? We wanted them to have control, to be able to come in and say, hey, my name is Debbie, you can use my location for recommendations, but you can't use my age and my zip code for targeted advertising, say. We wanted to give that kind of granular control, and we just thought it was cool but also important and necessary. So 2010-2011, I can tell you, we built it. We put it on our website. And about, I don't know, 1,200 people globally showed up and poked around at it. I think my mom showed up five or six times just to show support. The point is, in 2011, nobody cared, right? Nobody cared about privacy and matters of data control. But we didn't lose conviction, right? The conviction that this could actually, eventually, take root. Fast forward now to 2016, and the company had been acquired by Salesforce. And suddenly, after a lot of shadowboxing around this thing, the European authority finally comes in and says, no, no, no, we're really doing it, and GDPR happened. Okay, that was a lightning bolt in our business. Because, as engineers, when we started to put the numbers to this and understand its implications, the costs and complexities for us of rewriting all of our data pipelines to adhere to a single deletion request were potentially crippling, right? So forget the Hollywood facade of, hey, Debbie, come in here and exercise your deletion request. If you're actually going to really do it, as we needed to do in that context, making it so was really expensive and complex, right? I've likened it to going into an ocean and finding a particular droplet that corresponds to Debbie, with this much magnesium, this much oxygen, and this much carbon; where can I find this particular droplet in this ocean of data, and pipette it out of the ocean to respond to this deletion request? It's that crazy; it's that complicated, right? So that was the point at which I and others started to look at this and say, you know, there's got to be a better way. If you believe that Data Privacy is a data management problem, as we do, then we need a new set of infrastructure, a new kind of application, and a new kind of conception for how to really nail it. And so that was my journey into all of this. We got going about two years ago, and I can tell you it's very different from 2011, when nobody cared. We're seeing brands, and, of course, the regulations, GDPR and CCPA and so on, are putting a lot more urgency into these questions, but we're also seeing leaders at companies now understand that it's not a nuisance. It's not just a source of cost and complexity; it's an opportunity now with their customers and consumers.
Debbie Reynolds 05:20
I love that story. So I was one of the people like you who cared about privacy when no one was thinking about it. Actually, it was funny. When the GDPR passed into law in May of 2016, I thought, I'm going to wake up on May 25, and everyone's going to care about privacy, and it wasn't in any paper. No one was thinking about it. And I was just crestfallen, right? Like, the world has turned, okay; this is going to turn everything topsy-turvy, upside down. And so I've made it my business to sort of evangelize, hey, this is coming; you need to really think about it. And it's funny, because I've talked to really big corporations, and they just couldn't believe it, because they couldn't wrap their heads around the extraterritorial reach of data regulation and rights. And, you know, how applications are made. They're made to put data in and not have data come out, right?
Tom Chavez 06:21
So it's interesting. Yeah, just the level of oblivion or naivete out there. Right, where otherwise really sophisticated business leaders feel like, well, you know, that's a thing out there, but it's not relevant to me. Or, you know, I'll get to that later. Yeah. Get to it now.
Debbie Reynolds 06:38
Well, I'm glad they are; I'm glad they are. One thing that I like is a slogan that you have: trust by design. And the reason why I like that is it's very simple and very crisp. But I think the thought earlier, let's say 2016, 2018, 2020, whatever, as a lot of companies were trying to grapple with this, was that they saw privacy more as just compliance. And we know that it's more than that, right? It's definitely deeper in terms of what you have to do. In order to have customers want to do business with you, you have to engender trust, and just box-checking is not going to do that. So tell me a little bit about that, the feeling that you guys are trying to impart when you use that phrase.
Tom Chavez 07:34
Yeah, well, I think you are right on the nose, right? The first generation of systems and approaches to this problem are extremely compliance-oriented, right? It's a set of chores and irritating tasks and checklists and surveys to be completed, to demonstrate compliance in this dreary, lawyerly way. What I think you and I and others are seeing now is, look, and this is what's behind trust by design: no, no, elevate this idea, and understand that compliance will be a measure of whether or not you've done the right thing, but do the right thing. The North Star is to demonstrate privacy as a way of engendering trust with your consumers, your customers, and your partners, right? Trust is oxygen for businesses. Trust grows top lines. Trust is what creates confidence for consumers to come back and buy again, and again, and again, for them to engage with your content if you're a publisher, right? I trust this brand because I just feel like they're doing the right thing with my data. So what's behind trust by design for us is this view of privacy and control over data as not a dreary, lawyerly thing that needs to be handled, but an opportunity to be seized, right? Because it promotes top-line growth and business expansion for the companies who take it seriously.
Debbie Reynolds 09:06
I thought it was really important, you know, another inflection point, when Apple came out with their App Tracking Transparency. And people, I know you've heard this before, didn't see privacy as something driving top-line revenue. So, you know, not only is it a good thing that you obviously should comply with the law and try to do things that engender trust, but we're starting to see, and we can put numbers to it, companies that are thriving because they are helping consumers protect themselves. What are your thoughts about that?
Tom Chavez 09:46
Exactly. Well, listen, I mean, I find it so interesting, right? When Facebook raises its hackles and gets all irritated about the iOS move. Listen, if customers and regular consumers didn't have a problem with it, presumably when they see that opt-in or opt-out prompt from iOS, they would just say, oh, no, keep using my data. That hasn't happened, right? The point is that so many people had no idea that data was being collected and used in darkness, without their permission, without even their awareness, right? And so I think it's actually almost unconscionable. I think we're going to come to a time, Debbie, when we look back at this, and we won't believe that it actually happened. I know that sounds crazy today. But we're going to look back like, really, they were just taking your data? And they didn't even ask permission, and nobody knew, right? And now we're shining a bright light on this. And when you shine a bright, bright light, you know, the cockroaches scramble, right? The FTC moves. And you know, my team and I have been in and around these data supply chains for many, many years, decades now, and we've seen some really crazy stuff. Now, I also like to say, look, if you're a consumer and you're happy to make that trade-off between convenience and utility and privacy and security, you should get to make that choice, right? Back to the iOS, Apple thing. Like, if I get a shot, I can choose. There's nothing untoward, right, about taking your data, as long as you've consented to it, to do things that presumably give you more of what you want. There's nothing untoward and nefarious about that. But the premise here has to be that you get to consent and you have to choose.
Debbie Reynolds 11:37
Yeah, yeah, consent can be tricky. Unfortunately, people, as you see, consent to things that aren't in their best interest. And part of that is around transparency, right? Because they don't really understand what's happening.
Tom Chavez 11:50
Yeah, that's right. That's right. But that's where your work comes in, you know, educating people, shining a light, illuminating these things that otherwise seem kind of boring and dark. There's still a lot of work to be done there. We can solve a number of challenges, I think, through technology, but that's absolutely not enough. There's a whole education imperative over here, too.
Debbie Reynolds 12:16
That's true. I agree with that all the way. So I want to talk to you about DSARs. I have had an issue with DSARs, and that is because when these regulations started to come out, especially the GDPR, and a lot of other regulations started to say, oh, you have to do data subject access requests, everyone and their mother decided that they wanted to create, like, a DSAR feature, right? And a lot of what we're seeing is basically a ticketing feature tacked on to stuff that doesn't have anything to do with privacy. It doesn't connect with anything. And so I feel like there's too much noise in the marketplace around organizations or companies that are just tacking these things on. But I like the way that you guys have built your system. You built it not from the outside in, right? I mean, you actually built it from the ground up, as opposed to making it window dressing and not having stuff happening in the background. So tell me a little bit about that. I feel like that's very different from other tools.
Tom Chavez 13:25
Yeah, look, it is, as you say, a funny, funny thing, and it's okay, right? It's developmentally appropriate that we be at this stage where, okay, we have a deletion request, and we have to do something about it. There's a consumer interaction, right? I need to have a way to show up to a brand and say, my name is Tom, I want you to delete me. The trick, or the problem, here is what you call window dressing; I call it the Hollywood facade of privacy. I set up this experience that makes you think I'm capturing your request and actually doing something about it. I'm sure you, like us, have conversations all day long with business leaders who have adopted those first-gen solutions, and they think that they've got it handled. So the shock, and I tell you the scandal, of realizing that what they actually got was a ticketing system, right? It's a system that captures a consent signal, or, I'm sorry, a deletion request, and then it's sort of like Zendesk, right, or one of these ticketing systems. Well, I have a workflow now, and you need to walk that ticket around the company to make sure that Debbie's deletion request is enforced inside the subscription system and then inside the registration system. Wait, wait, we have all of these other third parties; I have an email newsletter system that I use; make sure that we handle it there. There's no actual orchestration of the DSR happening in a machine-based way. So when we come back to Data Privacy being a data management problem, what we mean is, no, no, you can't just have the window dressing; you've got to actually do the thing. And this is where we need to put the machines to work. Having the privacy program manager walk that ticket around the company, in our book, does not count as privacy enactment in any universe, right? But the shock and the scandal, again, that's sort of where we're at, and that's what makes the space so exciting. It's like, okay, let's collapse the costs, because you have companies now throwing bodies at the problem, hiring privacy program managers and engineering teams writing bespoke code to handle it. Oh, well, we had it handled in California, but now Arizona is about to do a thing. Oh, wait, wait, Virginia; I've got to go and sling a bunch of new code to respond to all of these new privacy regs. It's unsupportable. It's just way too expensive. They're understanding, like, okay, I've got to turn this into an API-driven software thing. I can't just have the patina of privacy anymore.
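To make the contrast concrete, here is a minimal sketch of what "machine-based orchestration" of a deletion request could look like, as opposed to walking a ticket around the office. The system names and methods are hypothetical illustrations, not Ketch's actual implementation.

```python
# Hypothetical sketch: fan a single deletion request out to every registered
# data system through one programmatic interface, and keep an auditable result.
from dataclasses import dataclass
from typing import Protocol


@dataclass
class DeletionRequest:
    subject_id: str      # resolved identity of the data subject
    regulation: str      # e.g. "GDPR" or "CCPA", which drives deadlines and scope


class DataSystem(Protocol):
    name: str
    def delete_subject(self, subject_id: str) -> bool: ...


class SubscriptionSystem:
    name = "subscriptions"
    def delete_subject(self, subject_id: str) -> bool:
        print(f"[{self.name}] purging records for {subject_id}")
        return True


class NewsletterVendor:
    name = "newsletter_vendor"
    def delete_subject(self, subject_id: str) -> bool:
        # In practice this would call the vendor's privacy API,
        # not send an email to privacy@vendor.com.
        print(f"[{self.name}] calling vendor deletion API for {subject_id}")
        return True


def orchestrate(request: DeletionRequest, systems: list[DataSystem]) -> dict[str, bool]:
    """Enforce one request across every system; the result map is the audit trail."""
    return {s.name: s.delete_subject(request.subject_id) for s in systems}


results = orchestrate(DeletionRequest("debbie@example.com", "CCPA"),
                      [SubscriptionSystem(), NewsletterVendor()])
print(results)  # {'subscriptions': True, 'newsletter_vendor': True}
```

The point of the sketch is simply that every downstream system sits behind the same programmatic interface, so a new request is a function call that fans out automatically rather than a ticket handed from person to person.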
Debbie Reynolds 16:10
Right? Absolutely. So I'll tell you a horror story, and I want you to react to this, okay? This is actually something happening right now in companies around the country and the world, and it is horrifying. Okay. So let's say they have this ticketing system, they have these DSAR requests; companies may actually create a separate database, like new buckets, new databases of data that they think they want to query for these requests. Then they're trying to go through and redact data from that, and then produce it to people in PDFs.
Tom Chavez 16:51
Holy guacamole. Holy guacamole is my primary reaction to that. Yeah, well, I mean, we've seen some of that, too. And part of the way that we know it's the path to total despair and desolation is because, in earlier times, we were in the bowels of those kinds of systems running similar kinds of moves. They don't scale, right? When you get to any kind of density of consumers, you have tables, like you're saying, with billions of rows and millions of columns, right? Duplicating every single one of those, and then trying to redact or manipulate the data in the derived data set that you created, it's a Tower of Babel, right? And your engineers, by the way, and the CIO start to pay attention and announce, wait a minute, what in the world are you doing? We can't pay that AWS bill if you want to do that. We're going to bankrupt the business if you keep on carrying on in that way. So there we are again. It's not duplicate, redact, and gum-tape and glue it together; you have to have a systematic data management solution to the problem. But, you know, my core reaction there, Debbie, is Holy guacamole. It's crazy.
Debbie Reynolds 18:19
Yeah, you don't have to do that. We have tools, and we have automation, right? So especially if the data is already electronic, a system like Ketch can connect to those systems and do that deletion electronically. So there is no creating a new bucket of information; there is no redaction; there is no PDF creation. Otherwise, you're just creating a mess. So I would love to see people try to do more automation. And I love what you said about scale, right? Because, you know, if you get 10 of these types of requests, you're going to want to jump out the window, because it's hard.
Tom Chavez 19:04
It's akin to, you know, you make an airplane reservation on your favorite airline, right? And now, wait a minute, my plans changed. I got COVID, or I'm not traveling after all; whatever it is, I'm going to cancel. It's akin to United Airlines saying, oh, Debbie canceled a reservation; let's create another data set with all of our records corresponding to all the reservations, but redacting Debbie. I mean, it's absolutely crazy town, right? The way these kinds of duplication and bespoke manual redaction procedures work, it's, again, just totally unscalable.
Debbie Reynolds 19:40
Yeah. Now, let me talk to you about the bucket creation, okay? So this is another way, not as bad as the PDF redaction thing, but still bad, where organizations say, okay, we have this problem, and in order to solve it, we need to take data from everywhere and put it into a new bucket, right? So you're just creating more duplication, and duplicates are a problem within organizations already; they're proliferating. I like the way that you guys are attacking this problem; you're saying, look, we'll find your data where it is. You don't have to move it to different places; we can communicate with all systems. So tell me a little bit about that.
Tom Chavez 20:23
Yeah, well, let's establish the context. What you're referring to is what we call the data everywhere problem, right? I've got data on-prem, I've got it in the cloud, I've got relational, I've got NewSQL, NoSQL; I have new frameworks like Databricks and new platforms. I mean, I've got data everywhere. So just from a first-principles perspective, right? When you think about it, and we did this maybe a decade ago, you could just spin up clusters and create a new data set, because the scale wasn't as extreme as it is today. If you try to do that in 2022 for a business, a large business or even a medium-sized business, anything that is not just a corner-store Mom and Pop, first, just in terms of the volume of data, there's no chance, but also in terms of the variety and the velocity of those data sets coming in, existing in those different formats in those different systems, you can't. The horse is already out of the barn; you can't wrap your head around it and try that duplication technique, which is how a lot of people are still approaching it. So we came at this, Debbie, from a totally different perspective and said, if we're going to crack this, we have to bring control to the data, wherever it lives, however it's used. I was talking to a noted computer scientist who focuses on privacy and security, and when he was explaining how we do it, he, in a slightly dismissive way, said, ha, so really, what you're creating is kind of a control fabric; I would think of it more as an annotation layer for data. And I thought about it for a minute, and I said, I'm okay with that. I mean, if you want to think of it as a way to annotate every column, every row, every cell, right? Bring it back to that earlier example we talked about: I now have the right to show up and say, my name is Tom, you can use my age for recommendations, but not my location, and I don't want to see targeted advertising. And you can use my household income for music recommendations, whatever, right? Every single cell in my row of data can be specified in this hyper-granular way, in a way it couldn't be before. You have to bring control to the data, right? And so that's the approach we take in solving these gnarly problems. And, you know, we like hard technical puzzles over here; we like hard things, and doing it in this way is not for the meek of spirit. But we've also found that there's just no other path, right? You can try that duplication kind of method that you were talking about, and all of the people that we know who are still scampering down that path are meeting with total despair and starting to cast about for alternative solutions.
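One way to picture the "annotation layer" idea is metadata attached to each field that says which purposes its owner has permitted, with every read filtered against the purpose of use. A minimal sketch follows, with invented field names and purposes; it illustrates the concept, not Ketch's actual data model.

```python
# Hypothetical per-field permission annotations: subject -> field -> permitted purposes.
permissions = {
    "tom": {
        "age":              {"recommendations"},
        "location":         set(),                       # no purposes permitted
        "household_income": {"music_recommendations"},
    }
}

record = {"age": 44, "location": "94105", "household_income": 120_000}


def readable_fields(subject: str, row: dict, purpose: str) -> dict:
    """Return only the fields this purpose is allowed to see for this subject."""
    allowed = permissions.get(subject, {})
    return {field: value for field, value in row.items()
            if purpose in allowed.get(field, set())}


print(readable_fields("tom", record, "recommendations"))       # {'age': 44}
print(readable_fields("tom", record, "targeted_advertising"))  # {}
```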
Debbie Reynolds 23:41
That's true. I like what you said about an annotation layer; basically, you're taking metadata and reading that information, right? So you're not getting the full document, but you know what things are and where things are. So, in a way, in order for companies to be able to do things like fulfilling these DSAR requests, they have to solve two problems. One is, what data do I have and where is it? And then, how can I take care of the requests? So I feel like some people say, well, I have this DSAR problem, but really a lot of your problem is that you don't know where your data is. That's the first thing.
Tom Chavez 24:31
Yes, the first thing, and that's where they get you, right? Because think about it with a deletion request. Okay, Debbie wants to be deleted. Well, how do I... I mean, where's Debbie, right? And reaching out across that broad sprawl of complex data systems that we talked about, you have to have a programmatic approach for discovering and surfacing and classifying the types of data there are, such that you can respond to that deletion request whenever it comes. The first generation of those systems was really focused on creating a catalog, right? And CIOs liked it because it gave them a sense of control, thinking, my house is in order; I know where my data is. The problem is a lot of those systems generated results that satisfied curiosity but weren't actionable in a real-time, in-line kind of way. So, in theory, I can go and query this catalog to find Debbie's data, but connecting it to those systems in a programmatic way, the way Google crawls websites to discover what's out there for purposes of search, there's a similar kind of idea at work over here in the way that we attack the discovery problem. The second piece of what I heard you talk about, though, and this is why I say this is not for the meek of spirit, don't try it at home, is you now actually have to enact and enforce across all of those systems. So if we're really going to make it programmatic, right, if we're going to move from privacy programs to programmatic privacy, I now need to have a set of connectors and adapters and turnkey integrations that remove the gore of, okay, if Debbie is going to be deleted, I need a programmatic way of reaching into my internal systems to make it so, without walking that ticket all around the office, all around the company, and calling in favors from IT friends to actually reach into those systems and do it. So that's hard. But on the third-party front now, for us, you know, sending an email to privacy@hubspot.com does not count, right? Because now you just have a privacy program manager on their side walking a ticket around their office. So having an API-based approach, and the connectors and the APIs that can actually enact those privacy instructions, it's really a kind of monolithic undertaking, right? When I think about it, Debbie, I'm tired just talking about the complexity here that we've had to conquer to get it right. But, you know, that's what we're doing over here.
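As a rough illustration of the difference between a static catalog and discovery that is actionable, here is a sketch that classifies which columns look like personal data and then locates a specific subject across systems when a request arrives. All system, column, and value names are made up for the example; this is not how any particular product does it.

```python
# Hypothetical inventory produced by crawling internal and cloud data stores.
PERSONAL_DATA_COLUMNS = {"email", "name", "phone", "ip_address"}

inventory = {
    "crm_postgres": {
        "columns": ["email", "name", "plan"],
        "rows": [{"email": "debbie@example.com", "name": "Debbie", "plan": "pro"}],
    },
    "events_lake": {
        "columns": ["ip_address", "page", "ts"],
        "rows": [{"ip_address": "203.0.113.9", "page": "/pricing", "ts": 1}],
    },
}


def classify(systems: dict) -> dict:
    """Map each system to the columns that look like personal data."""
    return {name: [c for c in meta["columns"] if c in PERSONAL_DATA_COLUMNS]
            for name, meta in systems.items()}


def locate_subject(systems: dict, email: str) -> list:
    """Find every system currently holding records keyed to this subject."""
    return [name for name, meta in systems.items()
            if any(row.get("email") == email for row in meta["rows"])]


print(classify(inventory))                              # which columns hold personal data
print(locate_subject(inventory, "debbie@example.com"))  # ['crm_postgres']
```

The discovery output only matters if it feeds directly into the enforcement step, which is the "actionable, not just a catalog" distinction made above.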
Debbie Reynolds 27:25
So one thing I want to talk to you about is that I help people evaluate tools that they want to bring in-house or use. And one of the hidden costs that comes up, and I know that you have an answer for this, is the connectors, the middleware, the APIs, or whatever. So okay, you get this nice shiny tool, and then on the backend, you have to pay for consulting time to have these connectors created. Now, I actually quizzed you all about this, quite a few of them; I even threw in some esoteric ones, because I thought, well, I wonder if they have it. You guys had all the ones that I thought of. Okay, this is good; this is great. So tell me about the APIs and the connectors that you guys have already built.
Tom Chavez 28:14
Yeah, so I hope you don't quiz me, because I have a hard time keeping track of that monster list myself. It's really remarkable. But now, yes, it's monolithic, and it's hard. But it turns out there's what developers like to call a pattern, right? Once you do it once, it's not to say that the next one is just a snap of the fingers, but you start to pave these paths through the jungle. And you start to understand the way to actualize, for example, a LinkedIn request, right? And it's not an exact replica of what you have to do for HubSpot, to give a couple of examples, but they rhyme, right? And so what we're doing, effectively, for the companies who have well-formed privacy APIs, and companies like HubSpot and Salesforce fall into that category, is creating, if theirs is the right hand, a left-hand privacy API for the customers we serve that is activated through that middleware you talked about on our platform. Okay. There's also an interesting set of companies who kind of sorta have some sockets open, but not really. And so what we have had to do there is what we call materialization. You have the bits and pieces of something that could be turned into a privacy API. By the way, a lot of these are audience-oriented advertising APIs, so they don't exactly sing and dance and speak privacy, right? But the raw materials are in place to effectively materialize an API, a socket for them, to catch the privacy signal programmatically. By the way, it's a great benefit for them; we've had some of those larger platforms thank us profusely. They can't believe it, especially some of the early ones. I'm not going to call them out, because it's not something I want to talk about specifically, but let's just say they're out there, and it's a big benefit for them to have one of those privacy sockets opened up by this thing we call materialization by Ketch. It's a heavy lift, but it's a pattern; it's replicable. And that's what's allowed us to get to, to our knowledge, Debbie, the largest set of programmatic privacy connections to hundreds of those external third-party systems. And then there's infrastructure, it's a geeky word, but we call it the transponder, that allows you to reach into your internal systems and scale those privacy connections for internal systems in that way as well.
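The "pattern" point, that vendor integrations rhyme without being replicas, is roughly the classic adapter idea: one canonical privacy instruction gets translated into each vendor's own request shape. The payload fields below are invented for illustration; they are not real HubSpot, Salesforce, or LinkedIn API calls, and not Ketch's connectors.

```python
# One canonical instruction, translated by small per-vendor adapters.
canonical = {"action": "delete", "subject_email": "tom@example.com"}


def to_vendor_with_privacy_api(req: dict) -> dict:
    # A vendor with a well-formed privacy endpoint: near-direct field mapping.
    return {"operation": req["action"].upper(),
            "user": {"email": req["subject_email"]}}


def to_vendor_materialized(req: dict) -> dict:
    # A vendor with only an audience/advertising API: "materialize" a privacy
    # socket from the pieces that exist, e.g. suppress-and-erase on an identifier.
    return {"suppress": True, "erase_profile": True,
            "identifier": req["subject_email"]}


for adapter in (to_vendor_with_privacy_api, to_vendor_materialized):
    print(adapter.__name__, adapter(canonical))
```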
Debbie Reynolds 31:07
I did put you through your paces, and I was very impressed by you answering all my trick questions as well.
Tom Chavez 31:15
Oh, my goodness, well, keeping us on our toes here, Debbie. Thanks very much.
Debbie Reynolds 31:23
One other thing I'd love to chat about, and this is a gap that I see as well: some companies, or a lot of companies, may have legacy data, embarrassing, backwards stuff that they don't want to talk about, or whatever; it may be old and may not be able to be connected to. But your tool can still manage those datasets, even if you can't connect to them. You can even put a record in there, so that this can be the place where you record all the different types of data that you have, even if those companies have to do a manual process for some data that doesn't have connectivity or is, like, super-duper ancient. Which I think is a plus, because a lot of times, when people do these systems, they have created a whole separate system to track these other things. So I think that's definitely a benefit.
Tom Chavez 32:21
Yeah, thank you. Yeah. But as you point out, there are some companies that aren't ready for fully programmatic privacy. I remember some early customers asked us if we could reach into legacy IBM AS/400 systems. Okay. There's a middling kind of approach that we take, to your point, to handle it. But if we've committed, as we have, to this data everywhere idea, it's on us to figure that out and make it easy for our customers. And we're talking to customers who were on those first-gen systems; you've got it, and I'm sure you see this as well: the frustration and the anger at the thought that they were getting privacy automation and got, like, a ticketing system. They're pissed, right? Right. So, yeah.
Debbie Reynolds 33:21
What I'm finding is that companies may go and buy something, and it ends up not being the right thing, and they end up back in the marketplace within, like, 12 to 18 months, looking for something else. So it's really important to be able to ask the right questions; sometimes companies don't know the right questions to ask, to really get down to the bottom of what they actually, truly need and find a tool that will actually fit what they're trying to do. I think it's really important.
Tom Chavez 33:55
We talk a lot these days about this architectural principle, because trust by design infuses everything we do: deploy once, compliance and control everywhere. That is how we pay it off architecturally, right? And another source of frustration that we've seen with customers on legacy first-gen systems is this feeling of, really? So now Virginia came out with a thing, and that's another module? Are you really going to do it onesie-twosie? You know, it's death by a thousand cuts and a million regulations. Our view is, no, that's crazy town. If you buy a privacy solution from us, it's set it and forget it: configure it according to the policies that you have in mind for your business, and that's important work for you to do. But if and when the privacy demons go bump in the night, on Ketch you're good, because the machines are humming as they ought to be, to handle all of those privacy problems or issues or requests or consents and disclosures and rights, wherever they occur.
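A toy sketch of the "deploy once, comply everywhere" idea as configuration: the business defines its policies once, and a new jurisdiction becomes a mapping onto an existing policy rather than new bespoke code. The rules and jurisdictions below are placeholders for illustration only, not legal guidance and not Ketch's policy model.

```python
# Hypothetical policy definitions, configured once by the business.
POLICIES = {
    "opt_in_consent":  {"targeted_advertising": "require_opt_in"},
    "opt_out_consent": {"targeted_advertising": "honor_opt_out"},
}

# New regulations map onto existing policies instead of triggering new code.
JURISDICTION_TO_POLICY = {
    "EU": "opt_in_consent",    # GDPR-style regime
    "CA": "opt_out_consent",   # CCPA-style regime
    "VA": "opt_out_consent",   # a newer state law reuses an existing policy
}


def rule_for(jurisdiction: str, purpose: str) -> str:
    """Look up how a given purpose must be handled in a given jurisdiction."""
    policy = POLICIES[JURISDICTION_TO_POLICY[jurisdiction]]
    return policy.get(purpose, "allow_with_notice")


print(rule_for("VA", "targeted_advertising"))  # honor_opt_out
```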
Debbie Reynolds 35:06
So I would love for you to chat with me about a study that Ketch did called "The Person Behind the Data: Consumer Privacy Perspective". I would love for you to chat about some of the great insights that you got around what people really want, and about taking their temperature on how they feel about privacy.
Tom Chavez 35:26
Absolutely. You know, when you start one of these studies, and we did this in partnership with a research firm, Magna, you always start with a theory as to what you think you're going to discover, what you're going to hear. I can tell you that this time we were actually quite surprised by some of the results. And back to 2010 or 2011, Debbie, what a difference a decade makes, because right off the bat, there are sort of three key findings that I would call attention to here. The first is that, and this is a study conducted with thousands of participants in various geographies, more people across the board value Data Privacy than sustainability or diversity and inclusion, right? We sensed that the drums were beating and people were climbing onto it, but we could not have expected that this was going to be something that they valued more than other really, really important things like sustainability and diversity. The second key thing, and this one didn't surprise me so much, but it was good to have it affirmed, is that consumers are not zealots when it comes to matters of privacy. They're absolutely okay with the value exchange: I give you my data, and you give me something back. There's a give to get, right? And there's no scandal, there's no worry, there's no moral issue at stake here for consumers; they're very comfortable with that idea. But if they're going to give, they want to see a get, right? And so that was the second key finding. And the third, and this is the biggie: to the earlier point about top-line benefits and seizing opportunity, as opposed to just complying, consumers show a 23% higher propensity to purchase from brands that demonstrate a commitment to transparency and privacy. 23%, right? So if you're a CMO, and we've worked with a lot of CMOs over the years, when you say a 3% improvement in the propensity to purchase, that gets everybody up out of their chairs, right? 23% is tectonic, right? And it's not anecdotal; it's borne out by the numbers. And I think it also explains, look, the study is also just reflecting the broader shifts in the market, because we're seeing CMOs now flocking to these questions of privacy, because they understand it's a way of committing to their brand and reputation and the promise that they made to consumers. But it's also a way, if you have a 23% improvement in the propensity to purchase, that's money. That's growth; that's revenue. So those were the three biggies that we discovered in this study.
Debbie Reynolds 38:54
And just for our listeners, you can go to the Ketch website and download a copy of that study. It's really cool. I love that. I'm glad that you all were able to get those insights, because it's true. I feel like there's a gap where people need help, right? People need more help; they can't help themselves enough. So they need tools or companies like Ketch to help them help themselves. And I feel, as a consumer as well, if I had to choose between two competing products, I would choose the one that was more privacy-focused, because I'm like, okay, that's something I really care about, and they can help me help myself.
Tom Chavez 39:39
Absolutely. Yeah.
Debbie Reynolds 39:41
So if it were the world according to you, Tom, and we did everything you said, what would be your wish for privacy anywhere in the world, whether it be law, technology, or human stuff?
Tom Chavez 39:55
Oh, boy, king for a day, at least in the realm of privacy. Yeah, you know, through a combination of, I guess, legal codes, but ideally and primarily through business adoption, I would like to see a pervasive, always-on sort of privacy fabric that allows every consumer, at every touchpoint, with every company, on every device, to exercise control in a beautiful way. Not like, oh my god, I've got to click through 14 of these pesky things to get the thing I want, but in a beautiful, well-designed way; by the way, there are a lot of companies showing it doesn't have to be cumbersome. I'd love to see this privacy fabric cover the whole world, anytime, and to courteously persist at the moments when I want a thing, to come back and say, hey, Tom, if you'd like to get more of what you want here as you engage with this brand, you can tell them your zip code, and they'll be able to customize something, like, in the moment of truth, not 14 things that I had to crunch through, right? In that little moment of truth, when I'm engaging with the brand, right? That's what I mean by real-time, always-on, everywhere. That, for me, is the dream.
Debbie Reynolds 41:20
Wow, that gives me a lot to think about. You're a philosopher, and I love that. In the moment of truth. I think that is probably a note toward the future, because as we get into things like the Metaverse and different things, I feel like consent will have to be more incremental at some point, right, based on what you want to do. And so getting slammed with 80 pages or 14 checkboxes at the beginning of a thing, I don't think that's going to fly in the future. So being able to have that real-time, in-the-moment opportunity to make a choice, I think, is the way of the future.
Tom Chavez 42:04
Listen, I mean, as a writer once said, the future is already here; it just isn't evenly distributed yet. Think about what's happening in personalization and all the ways in which we take it for granted today, right? When you go to whatever site you like to get your news from, or when you go shopping, hopefully not just on Amazon but other places, we're now all attuned and anchored to the expectation that the brand will come back and offer me something that is sometimes even a little eerie, like, wow, that's really good. How did they know I like that? Well, behind that screen is a set of personalization technologies that they have perfected for the purposes of selling you more stuff and marketing to you more effectively. All of that possibility is ready and absolutely within reach when it comes to privacy. I mean, in a sense, we've already done it. It's not like we're splitting the atom here and starting from scratch; we have the patterns. We've done it in other domains. We just haven't yet brought it to privacy. By the way, if any Apple people are listening here, you know, my challenge to them is, okay, you give us these beautiful experiences and these beautiful devices, and you follow the patterns that your founder established to make it beautiful and elegant for people. Why can't you do the same for privacy? Right? Even with Apple, I mean, I still get these long, ponderous privacy things. Why didn't you make that beautiful? You give me an 18-page thing I'm not going to read. How about you treat me respectfully and bring the same kind of beauty and discipline you bring to the creation of all of your other experiences with me? Bring it to privacy, too.
Debbie Reynolds 43:58
Wow, I love it. Yeah, I'm going to quote you on this. This is great. I agree with that; I agree wholeheartedly. I say you should be able to explain it to an 80-year-old or an eight-year-old. It needs to be simple and plain, in my opinion.
Tom Chavez 44:15
By the way, like in our day jobs, Debbie, a lot of people now are doing these Zooms, and then they record the Zooms, right? We ask courteously at the beginning. I'm on a lot of these; people on Zoom or other systems are just asking, can I get your permission to record the session? That's nice. I like that, right? We all appreciate that, at the beginning, the courtesy, the above-board, hey, that notetaking thing that you see on your screen, we're going to record the call. Is that okay? We have little courteous patterns of that in all other places. And so when I hear people say, oh, it's too hard, and it's clunky, and we don't like it, I'm like, man, you've got it all wrong, because there's every other place we could point to where people gain consent in a courteous fashion. Actually, when people ask me, you know, can I record you? I like them a little more. I trust them more.
Debbie Reynolds 45:16
Oh, wow, deep thoughts. Thank you so much. This is fantastic. Well, thank you so much for being on the show. I really appreciate it. People, definitely go to the Ketch website; check it out. I was very impressed, very impressed with what you guys are doing. And I love that you have that feel-good foundation, just the idea that this is the right thing to do. So it's not just zeros and ones; there's feeling and there's a purpose behind it. And I think you're doing a fantastic job.
Tom Chavez 45:51
Well, thank you, Debbie. Let's keep working together to make privacy cooler and more purposeful and more powerful for everybody.
Debbie Reynolds 46:00
I agree. And we shall do that. Thank you so much.
Tom Chavez 46:04
Thank you, Debbie.
Debbie Reynolds 46:04
Talk to you soon.