Debbie Reynolds Consulting LLC


E154 - Tom Kemp, Silicon Valley-based Entrepreneur, Seed/Angel Investor, Policy Advisor, and Author

Find your Podcast Player of Choice to listen to “The Data Diva” Talks Privacy Podcast Episode Here


SUMMARY KEYWORDS

privacy, california, data, ai, information, happening, consumer, law, states, company, cybersecurity, tech, opt, delete, gpc, brokers, put, great, hipaa, thoughts

SPEAKERS

Debbie Reynolds, Tom Kemp

On this episode of "The Data Diva" Talks Privacy podcast, I talk with Tom Kemp. Tom, in addition to being an author, speaker, and someone who has very deep knowledge of technology and Silicon Valley, was also a co-author of the Delete Act. The Delete Act in the State of California was just passed into law; it was signed on October 11, 2023, by California Governor Gavin Newsom. In this episode, we talk quite a bit about the Delete Act, among other things. Congratulations to Tom, and this is a great exploration of that Act for those of you who don't know much about it. Thank you.

Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a very special guest on the show, Tom Kemp. He is a Silicon Valley-based entrepreneur, seed/angel investor, policy advisor, and author. Welcome.

Tom Kemp  00:41

Thanks for having me on.

Debbie Reynolds  00:43

Yeah, those are a lot of words, right? About all the stuff that you do. But I feel like it doesn't truly encompass everything that you do; you're obviously very deep into tech. You're a Silicon Valley person, you know a lot about what's happening in Silicon Valley, or what has been happening over the years, and you have a deep interest in things like privacy, AI, and emerging tech; that's my space. You and I know each other from LinkedIn, and I also had the pleasure of providing a quote for your book, because I think it's amazing. So thank you for letting me read it. And I think the public agrees; I think it's still on several bestseller lists if I'm not mistaken. So I think you're definitely filling a void that's there in terms of being able to pull all the different ideas together around technology, privacy, and what is happening in Silicon Valley. So why don't you tell me a bit about how you got here? Why does this area, especially around privacy, interest you?

Tom Kemp  01:48

Yeah. So as you said, I'm a Silicon Valley-based entrepreneur, investor, policy advisor, and most recently, an author of this book called Containing Big Tech. Most recently, I was the CEO and founder of a cybersecurity company called Centrify that I grew with a great team to over $100 million in sales. We had 2,000 customers, more than half the Fortune 50. So I definitely got a great appreciation for large amounts of data being collected and then being hacked, right? And after my company was acquired, I said, you know, I've been focusing on data from a cybersecurity perspective; let's look at it more from a governance perspective. And so I really started digging into privacy. Now, at my company, Centrify, we had to be GDPR compliant, so I was very familiar with a lot of the rules and laws that are out there, but I really dug in. And then I got involved in this political campaign in 2020 for Proposition 24, here in California, where I live, which is the California Privacy Rights Act. That led me to advising other people and campaigns and groups and getting involved in legislation. So what I've done is take all this knowledge that I have as an entrepreneur and as a policy advisor and put it together in this book called Containing Big Tech. It's an exciting space; I know for your listeners, privacy is top of mind, right, because it involves data associated with us. And then, when you look at the AI revolution that's happening, there's great concern about personal data being fed into AI. So it's an incredibly topical, very exciting space. And I think it's great to really spend time, like you do on your podcast, looking at this, thinking about it, looking at laws, looking at ethics, the whole nine yards.

Debbie Reynolds  03:40

That's fascinating. I love it. I love to see people who bring so many different experiences and perspectives, because that's what I think we need in order to solve these problems. Something I notice in the US is that there's a lot of confusion around cybersecurity and privacy. A lot of times I hear people talk about cybersecurity, and they start sprinkling in privacy stuff, and it's just not the same; even though they have a symbiotic relationship, they're not the same. So tell me your thoughts about how you explain to people the difference between cybersecurity and privacy.

Tom Kemp  04:14

Yeah, I mean, cybersecurity is about the protection of information or code or other objects from a digital perspective. So it's about securing and protecting. Privacy is about governance, and it's more limited to personal data; it doesn't involve copyrighted material, intellectual property, or source code. Those are very valuable things that can be stolen from a cybersecurity perspective. Privacy is more narrowly focused on personal data, and depending on what part of the world and what laws apply, it will have different definitions; it's more like the governance of that data in terms of what rights consumers have, what obligations you have, and what enforcement bodies may oversee it. So it's kind of a Venn diagram; the two overlap on personal data, securing it versus governing it from a privacy perspective. But also, the core definition of privacy, at least in the United States historically, has been Justice Brandeis's: the right to be left alone. Cybersecurity, by contrast, is probably the right not to be hacked, right? So it's fundamentally different.

Debbie Reynolds  05:31

Yeah, I agree with that. I love that you brought up Brandeis; that's a great reference. I think California is the epicenter of what's happening in privacy in the US, and people all over the world are very interested in California. California has always been very progressive on privacy over the years, and I feel like maybe some States are a bit jealous, perhaps envious, of where California is. But California has been doing this work for decades, building the foundation to get where it is with the CPRA and CCPA and things like that. As a Californian, looking at what's happening in other States, knowing that California is very influential, you've been involved with campaigns and policy and advisory work for things like the CPRA, and also this new bill that you helped co-author, the Delete Act. I'd love to get your thoughts about what is happening in California and how you see it impacting other States, or even other companies.

Tom Kemp  06:39

Yeah, so California has always led the nation in terms of consumer protection. You can go back to automobiles, right; California, for example with emissions, has historically set the standard in the United States. So California has been at the forefront of consumer protection. Specific to privacy, California voters in 1972 were very far-sighted and actually put the word privacy in the State Constitution by a ballot initiative. So in California, privacy is actually an inalienable right. As we all know, that's not in the US Constitution itself. And what happened in California was that in 2018, there was going to be a ballot initiative called the California Consumer Privacy Act, led by a gentleman by the name of Alastair Mactaggart. There was a lot of opposition from the tech community. Then Cambridge Analytica happened, the opposition all kind of slunk away with their tails between their legs, and the legislature said, you know, we're going to look really stupid if something like this is on the ballot; this is going to look really bad if it becomes something the voters have to vote for and we can't deliver. And there's a law in California, which is also very unique to California, this direct democracy thing going back to the Progressive Era with Hiram Johnson, kind of the same time that Louis Brandeis was around, that within 30 days of a ballot initiative qualifying, the legislature can basically say, we can take the ballot initiative and turn it into law, so it doesn't have to show up in November. And that's how it got passed in 2018. What happened then is the tech industry came back and started watering it down in 2019. And Alastair Mactaggart said no to this, so he came out with a ballot initiative in 2020 called the California Privacy Rights Act, which amends the CCPA. What it does is set a floor versus a ceiling, which means that new laws passed in the legislature cannot water it down. It also added an enforcement agency, etcetera. The goal was to try to make California on par with GDPR, for the most part. So that's what we have right now. And then, I'm more than happy to talk about how that has impacted the other States. Do you want me to just go, or do you want to ask a question? I'm kind of monologuing right now.

Debbie Reynolds  09:12

No, no, it's perfect. Yeah. I would love to know your thoughts about how it's impacting other States.

Tom Kemp  09:18

Please, let's hear what you think.

Debbie Reynolds  09:20

Wow. Well, first of all, I feel like California has always played this role of being a forerunner in this legacy of consumer protection. And we have seen, over decades, other States try to put those things in their laws. But I think one of the more interesting things that I noticed, especially when the CCPA first got on the books, is that companies that did not have to comply with the CCPA decided they would do it anyway, because they felt like, instead of carving out California, let's just treat everybody the same, right? Not all companies do this, but that is something I've seen that I did not expect on the business front. What are your thoughts?

Tom Kemp  10:07

Yeah, I think it's a good business decision. Because if you match the gold standard in the United States, then if other State laws come into play, as long as you meet the California standards, it'll be less work to deal with any other States. And so right now, we're actually up to 12 States. First, it was California in 2018; by the end of last year, there were five States, and I think we're now up to 12. I know Texas and Oregon are out there. Now, it turns out that there are some differences, though. One thing is that the tech industry woke up and said, oh, my gosh, there's this thing called the California effect, much like the Brussels effect with GDPR. And so the tech industry started actually writing the laws, like in Virginia, and those are weaker-sauce laws. But there are ones that are stronger, like Connecticut, for example, that may be more closely aligned with California. So that's where we're at right now; we have a hybrid system in place. Between California and Texas having privacy laws, we're talking about a third of the nation now finally having at least some privacy law. The big difference between the weaker laws and the stronger ones probably has to do with universal opt-out capabilities, the Global Privacy Control; that's one great example. And also the area of enforcement: California is still the only State that actually has what they call in Europe a supervisory authority. We have the California Privacy Protection Agency, and the enforcement just started; the enforcement of the regulations has been pushed out, but the CPPA can still enforce the core law itself and what's associated with the CCPA. There seems to be a lot of confusion out there when people read that: oh, we don't have to do anything with CPRA. No, you do have to do something with the core statute, maybe not the finer points of the regulations, et cetera. So look, obviously, the fact that we have 12 States is leading to further debate about whether or not we need a Federal privacy law, and the core issues there are private right of action and preemption. But the States are just plowing ahead in lieu of the Federal government doing something. And California was the first State to have data breach notification laws, and other States started to pick that up until 2019, when all States had them, right? But they're all different. It took 15 years, though; that's the problem. It took like 15 years or so to finally have that happen. But yes, you're right, so we now have a patchwork on the data breach notification side, too.

Debbie Reynolds  13:04

Exactly. And I was saying in 2019, if we don't have a Federal law, then we're going to repeat the same pattern again. And that's exactly what's happening: California passes the first comprehensive law, and other States follow. You touched on quite a few things I want to dig deeper into. One is, I kind of joke that in the US now, we're going to have privacy laws almost like barbecue capitals of the world. So, like Carolina barbecue versus Kansas City barbecue, we're going to have California-type privacy laws versus Virginia-type privacy laws. They seem to be falling into those two camps. What are your thoughts?

Tom Kemp  13:49

Absolutely, yeah, I think there are ones that are better and stronger and provide more consumer protection, and there are ones that are kind of weaker barbecue sauce, to use that analogy. And I think, at the end of the day, I'm very supportive of a Federal privacy law. But the big issue is that the Federal government has historically been very slow-moving in putting together privacy laws. The last major privacy-related laws were in the 90s, with HIPAA and Gramm-Leach-Bliley; that was pre-iPhone, pre-Facebook, pre-everything, right? So it's not like, if they do come up with one, they can keep up with changes in technology. So I fundamentally believe that if we do a Federal privacy law, it should represent a floor, and frankly, that floor should be high enough that maybe 48, 49 States will be below it, and it will raise up their privacy laws. But I do think, especially in the area of AI, there should be room for innovation. We talked about Brandeis; Brandeis famously said that privacy is the right to be left alone. He also said that the States are laboratories of democracy, right? And with HIPAA, States can go above and beyond; in California, we have the CMIA, and there are other examples as well. So I actually believe that States should have the flexibility to potentially go beyond a Federal privacy law. But the bar should be so high that in reality, it may be kind of one or two States doing a little bit more, etc. At least we should have that flexibility, because, unfortunately, the Federal government moves so slowly. And especially when you put AI into the mix; people are complaining about copyrighted material being sucked up by AI, but what about our faces, our genetics? That's going to be sucked up into AI as well. So we do need a Federal privacy law. But barring that, let's keep going down the State path until they finally wake up and do something.

Debbie Reynolds  15:54

I want to toss out a radical idea for you. I'm sure people pull their hair out when they hear this. I think that we should have a Federal privacy law that, if nothing else, all it does is harmonize the language and the definitions, like what personal data is, different things like that, and forget private right of action and forget preemption.

Tom Kemp  16:17

Hey, actually, that'd be good. That's a great idea. I mean, just kind of, here is the core set of rights that people have, right? This is it, just like anti-discrimination laws, right? Boom, this is what it is; this is what it means to discriminate. In this case, this is what personal data means. And so I think that, again, that would cover the vast majority, including the States that don't even have anything defined. I also agree that we should have kind of a baseline data breach notification law, which says, for the same types of personal data, this is what it means to have it be breached. And yeah, there will be one or two States that say, oh, well, our specialized ID card for XYZ should be included as part of a breach; let them do that. But in the end, when you do set a high floor, the vast majority of States will be upgraded to it, and there will only be a smattering beyond it. And we see that with HIPAA. All the other privacy-related laws are not ceilings; they're floors, and we should do the same thing here.

Debbie Reynolds  17:24

Yeah, there are a lot of misconceptions. I hear people talk about how we need preemption because a law like HIPAA preempts. I'm like, HIPAA preempts because it was first, not because it tried to cut off other laws, right? So I think if your law comes out first, and you create a law where the floor is high enough, then other States may not even need to pass laws, right?

Tom Kemp  17:57

Yeah, absolutely.

Debbie Reynolds  17:59

You know, yeah. Interesting. So let's talk a little bit about your book. Your book is Containing Big Tech: How to Protect Our Civil Rights, Economy, and Democracy. I think this is an interesting title, because I feel like a lot of tech folks like us go off on a whole tech tangent and don't really tie these concepts back to society, right? So how is technology, or the way data is managed, impacting not only individuals but us as a society as a whole? What are your thoughts?

Tom Kemp  18:38

Yeah, so look, it's funny, because you and I are kind of in the weeds sometimes, you know, when we're like, what is this definition of sensitive personal information? And we're dealing with, well, what's the definition of precise geolocation? Is that an acre? Is that square miles, and all that stuff? And we're cognizant of all these laws, and we're thinking, oh, well, that's more cybersecurity versus privacy. And the thing is that this is such a big deal for the average person; it's about protecting their data and making sure that they have basic rights. So I kind of stepped back, and I said to myself, you know, we should have a book, not a super geeky book or a book that's only for academics. What I wanted to do is take a comprehensive look at the issues concerning the largest tech providers that you could hand to your Uncle Larry, or an average politician, or someone who just cares about issues, so they can say, aha, I get what AI bias means, right? I understand what digital surveillance is and the online behavioral advertising business model. And not only did I want to say, here are the issues that are directly impacting our democracy, our civil rights, our economy, etc., but I also wanted to provide solutions; I wanted to tell people, here's how you can limit your data footprint. So I talked about third-party trackers, and I talked about mobile SDKs. You and I have both written about the Meta pixel, right? People don't realize that if they fill out a shopping cart, that information is being sent, shuffled off into Meta land. And so I wanted people to be aware that, yeah, you can actually set up EFF's Privacy Badger to block third-party cookies, right? And then there's just so much happening nowadays; there's AI, there's TikTok, etc. A lot of average people don't know what AI is; they think AI is purely just generative AI, right? But AI has a lot of very positive uses, and it can also be used in biased or exploitative ways. They know their kids are using TikTok, but they don't know what TikTok is and how TikTok gathers micro-signals to optimize; they don't know that. And, of course, there are just new laws that have come into this world. So what I wanted to do with the book is give a fresh and updated look at AI, at new companies, etc. And actually, I feel really good that I spent so much time last year really looking at what Big Tech was doing with AI, because when I published the book, all this stuff came out, and I was like, oh, I'm so glad that I was tracking that beforehand, because it's a very fresh look at what's happening with technology.

Debbie Reynolds  21:26

So what's happening now in the world that concerns you most related to privacy? Something you see and think, oh, my goodness, I don't really like that.

Tom Kemp  21:36

Well, look, historically, the business models of the largest tech providers, and Apple's probably the exception, and to a lesser extent Microsoft and Amazon, but definitely the core of Google and Meta, their business models were: hey, I'm going to create these free services, which are really walled gardens, because they don't provide interoperability with other solutions out there, and they did that purposely. I'm going to have people come into this walled garden, and through network effects, it's going to become more valuable. And I'm going to collect as much information about these people as possible and then use that information to serve highly personalized ads. And we all accepted that. We were like, boy, it was kind of creepy that I was looking for red shoes, and then, for the next month, every website I visited had that red shoe floating right there, right? We thought that was creepy, but we accepted the trade-off. The problem is, over the last few years, people have figured out ways to weaponize that information. Here's a great example. It's so cool that as an advertiser, you can say, I want to target young single women with young children and serve them ads about my very economically priced diapers; that's a great job of personalization. But wait a minute; you can use that same mechanism to say, for my rental, I want to exclude a young single woman with children from seeing the rental ads, right? So you can actually flip it on its head. Or, and not wishing to get political here, but we are now in a post-abortion-rights America, where the data of what people search for on specific healthcare-related topics can actually be used against them in multiple States right now. And we're starting to see examples of that, where attorneys general are saying, if someone goes out of State and seeks reproductive health care, I'm going to track them down, I'm going to find them, I'm going to arrest them. How are they going to do that? They're going to look at their Google searches, their maps data, et cetera. So we're in this weird situation where things that were legal before are now illegal, but all the data is still being collected about us. So my biggest concern is the weaponization of data. And my secondary concern is that artists and musicians are freaking out about their content, their intellectual property, being sucked up by AI, but we're already starting to see deepfakes of our faces, and our personal data starting to be mashed up into the AI blender. Pretty soon, there'll be ads of half my face, of my voice talking, but it's not 100% me. Our personal data is our intellectual property, and we should have protection over that. So it goes back to this: to make sure that we can govern AI, we first have to go back to the whole governance of the data, right? And that's why I'm so focused on the fact that we really need more privacy rights, because it's only going to get worse with AI.

Debbie Reynolds  24:43

I agree. I agree. AI just exponentially raises the risk for humans; I think that's definitely an issue. One thing that concerns me, and I want your thoughts, is that I feel as though there's a digital caste system being created, where people who have data and information have more opportunities to manipulate data. So it's not just the haves and the have-nots, but the knows and the know-nots. What are your thoughts?

Tom Kemp  25:15

You know this better than I do, but we need laws not only to give people privacy rights, in terms of the right to know what's being collected, the right to say no to collection and selling, the right to correct, and the right to delete; we also need to have businesses observe purpose limitations and data minimization. And you see it with the big tech companies: for example, Meta, formerly Facebook, was collecting people's phone numbers and saying this is for multifactor authentication, right? But then they turned around and started actually texting people and providing the phone numbers to their advertisers, and that was a violation of the FTC decree that they had; it was one of the things they were fined for. That's an example: if I give you my location data so I can be picked up by a ride-sharing service, that doesn't mean that because I want to be picked up at XYZ point, my location data can then be used to completely track me everywhere else, or to know that I was actually at a cancer clinic. Because you're right, that kind of data could actually be used against people: this person is at a cancer clinic, right? Or this person is at a church or a mosque, etc. So we do have issues here, and we do need the guardrails. I don't think this is asking too much: if I give you my phone number for MFA, I don't want you to spam me with texts and sell my phone number. I mean, that's just basics, right? And that's what a privacy law should have in place.

Debbie Reynolds  27:01

Very good. I agree with that. Let's talk a little bit about the Delete Act. The Delete Act is an act that you co-authored. I find that a lot of people, a lot of my clients in Europe, don't really understand the difference between how deletion happens in the US versus Europe; some people confuse deletion with the right to be forgotten, which it isn't. So tell me about your work on this act, why it's important, and what it is.

Tom Kemp  27:34

Sure, so this is California Senate Bill 362. As we record, it still hasn't passed; it's still open legislation. Hopefully, it'll pass the legislature and go to the governor, so by the time this airs, it will probably be decided whether it's a yay or nay. At the core of it, it involves entities known as data brokers. The companies that we directly interact with, that we have a direct relationship with, like Walgreens, Walmart, Meta, Google, your hospital, et cetera, that's kind of first-party data; I consent to the use of my personal data for the business purposes I signed up for. But in the US, unlike in Europe, we also have this concept of third-party data, where there are these entities known as data brokers that we don't have a direct relationship with and that we don't consent to collecting our information. Through mobile SDKs, third-party trackers, credit card information, and public information that they gather from property records, they're building dossiers on us. And the issue is that data is increasingly being weaponized. There are horrific stories of data brokers selling data about people visiting abortion clinics, or people that have a Muslim prayer app, or tracking people into gay bars and outing them, etc. So there have been a lot of bad headlines about this unfettered collection and selling of our sensitive data, especially in the health care area. However, the bigger issue is that because we don't know who these data brokers are, we can't tell them to please delete my information, right? So that's the first problem: you don't have a relationship with them, so you don't know that they're collecting and selling your data. The second problem is that each of the data brokers has a different process, if you do contact them, to try to get them to delete your information. And pretty soon after you do Aardvark Data Broker, it's just incredibly painful to get through the whole list, even if you had a list of data brokers. And the other issue is that your deletion request is a one-time deletion request. Most of these data brokers sell data amongst each other. So if they know that, hey, Debbie's data was deleted, they call up their buddy data broker and say, send me back Debbie's information, and the consumer has to go back and do a repeat. So it's basically impossible for a consumer to get their data deleted from data brokers. What the California Delete Act, SB 362, does is create a consumer portal, kind of like the FTC's Do Not Call Registry, where you simply have to be verified, so this won't allow any fraud. You go in there, and you say, here's Tom Kemp, here's my email, here's my address, and you hit the go button. The data brokers then have to periodically look at the registry, and they have to delete the information. Now, there are exemptions, you know, if the data brokers have your data for HIPAA reasons, Gramm-Leach-Bliley, etc.; there are the normal data exemptions that we have with the CPRA. But otherwise, this is a big win for consumers because they can finally hit the delete button. It was fully motivated by the weaponization of data happening through data brokers and the fact that it's nearly impossible to get your data deleted from data brokers, assuming you even know who they are.

Debbie Reynolds  30:59

Right. And I feel like, as jurisdictions are writing laws, the data brokers have kind of skirted around these issues, because a lot of the laws are written as if I know the company that has my data and can hold them accountable in some way. If you don't know the company, how can you hold them accountable, right? So I like the fact that this sort of flips things on their head. It's not putting the burden on me to know the company; the company knows me, right? And I'll say, hey, take my data and delete it, right?

Tom Kemp  31:36

Absolutely. And by the way, this is not original. I mean, I'm fully a supporter, I'm fully active in helping, but at the end of the day, it's State Senator Josh Becker who gets all the credit; he's the one who put this forth. I'm not a legislator; it's his team. And there's a broad coalition of 30 groups working on this, including all the major privacy groups. I'll also say it's not an original idea. We've had the FTC's Do Not Call Registry for telemarketers. Tim Cook actually proposed this in 2019; he said, hey, we need a data broker clearinghouse where consumers can delete. At the Federal level, Senators Ossoff and Cassidy made this proposal two years ago; it's called the Federal Delete Act. And even in the ADPPA, which passed the House Energy and Commerce Committee 53-2 last year, Section 206 was explicitly about these third-party entities. So this has been tried and true; it's well known. It's not an original idea; it's been kicked around with the Federal Delete Act, the ADPPA, the FTC Do Not Call Registry, etc. So hopefully, California will really be the first to give consumers the deletion capabilities that are really lacking.

Debbie Reynolds  32:57

Very good. I'm very impressed by all your work. And it makes sense: the Do Not Call Registry is very successful, so why not take advantage of that and do the same thing? I want you to travel with me on the philosophical plane.

Tom Kemp  33:15

Uh-oh.

Debbie Reynolds  33:17

I want to propose something to you, and I want your thoughts about it.

Tom Kemp  33:23

Please.

Debbie Reynolds  33:25

I think it would be interesting if privacy regulations wove in requirements for data accuracy. And the reason why I think that would be interesting is because, in order for data to be accurate, that brings in the consumer, right? There has to be some type of verification with them, and that forces transparency from a company that has data. What are your thoughts?

Tom Kemp  33:54

Well, I actually philosophically believe that privacy should be made easy for the consumer. I wrote an article in Tech Policy Press where I walk through how difficult privacy is, right? If you make privacy easy, like Apple did with App Tracking Transparency, where you can just simply click a button and say I don't want third-party trackers, 96% of consumers actually utilize it. And so I think that one of the problems that we have as policy people is that we focus on, let's give all these rights, but you kind of forget what the consumer interface to them is. Similarly, as privacy practitioners, we also make it difficult; we need to walk a mile in the shoes of the consumer and say, a consumer has a concern, what's the best way to make a happy customer, as opposed to, let's make it difficult for them until they get really, really ticked off. So I think, fundamentally, we do need to make privacy easy. The second thing is that we do need a consistent right to correct. And the third thing is that we do need a way, I call it consumer-facing portals, for consumers to see exactly what information companies have about them. The problem, Debbie, is that oftentimes people have wrong information about them, and because scoring occurs based on that wrong information, people are denied services; they don't get rentals, they don't get jobs, because there was some bad data at some point that said they defaulted on a loan or something like that, and that may not be true, right? So I think the whole transparency issue is critical. And I think it's equally critical in an AI world; we should have transparency to be able to ask, hey, was this human-generated or computer-generated? If we get a bunch of text, we should be able to go back and ask OpenAI, did you generate this text or not? So I think, overall, we do need a move to greater transparency around data that's either about us or has been generated from an AI perspective.

Debbie Reynolds  36:14

Thank you. Thank you for humoring me with that. Do you think the US will ever move towards having more of an opt-in approach as opposed to an opt-out approach for marketing?

Tom Kemp  36:30

No, I don't, because in California, when we came out with the California Consumer Privacy Act, there was a concern about a Supreme Court case, the Sorrell case. And that basically said no to the whole opt-in approach, so we're kind of forced into an opt-out model. But the way you can actually address that from a consumer perspective is by having the Global Privacy Control, where you send an opt-out signal, and that signal is associated with your browser or your device. So as you visit a website, you're sending the signal: I don't want you to sell my data. And that actually works better than an opt-in system. I mean, Europe has an opt-in system, but they have cookie fatigue; people have to click okay everywhere, and that makes privacy just as difficult as the opt-out does here. So I really think, for first-party data, we need GPC, the Global Privacy Control. For third-party data, for data brokers, we need a data broker clearinghouse with the deletion aspect, which, again, is not an original idea: Tim Cook talked about it in 2019, we've seen it with the Federal Delete Act proposal, and we see it now with SB 362, the California Delete Act. So two simple things could fix the problem. The first is GPC and universal opt-out signals; California and Colorado support them, but in the weaker-sauce laws that the tech industry has heavily influenced, they're not required. And that's incredibly disappointing. I think people need to rethink that and think about how consumers can take advantage of rights. Because if you can't easily exercise rights, then do you really have rights?

Debbie Reynolds  38:17

Right, and that goes to the core of the human rights versus consumer rights approach, where, unfortunately, depending on what jurisdiction you're in, you can't exercise your rights unless you're consuming something, but not every human is a consumer. So that's an issue. I'm glad you brought up Global Privacy Control; I really, definitely support it. I think a confusion people have is that they think Global Privacy Control means global. It just means that a person can configure their browser in a certain way so that the websites they go to are supposed to respect the signal, without the person having to say no or opt out on each website, right?

Tom Kemp  39:12

Yes. And actually, that's why there's this free plugin called Privacy Badger from EFF, and I have no relationship with EFF, et cetera. But it does two things: it blocks third-party cookies, and it also sends the global opt-out signal. So it's kind of a twofer. But yeah, I love the concept of GPC. And I know, as you've written about the Sephora fine in California, that California, at least the Attorney General, and I assume the California Privacy Protection Agency, has sent a clear message to businesses that you do need to respect the global opt-out signal. So I would advise privacy practitioners to make sure that you support it, number one. And number two, going back to the whole pixels and Meta pixel topic, oftentimes the marketing departments are kind of off on their own, and they think it's really cool to collect all this information. I think the privacy people need to sit down in an organization, spend a little time with the marketing people, and ask the question: how are you using the pixel? Does this actually transmit sensitive data about the consumer? Because remember, the Meta pixel captures information about anyone, even if they're not actually a customer of the company; if I just randomly go to a website and put in my personal information, it transmits it to Facebook, even if I'm not registered on that system. So I do think that third-party pixels and GPC should be top of mind for privacy practitioners, to make sure that your business is not going to be exposed to what happened with Sephora.
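For readers who want to see what honoring that signal can look like in practice, here is a minimal sketch; it is not from the episode. The Sec-GPC request header and the navigator.globalPrivacyControl browser property come from the GPC specification, while the Express server and the optOutOfSale helper below are hypothetical stand-ins for whatever stack and consent platform a site actually uses.

```typescript
// Minimal sketch (assumptions noted above): honoring the Global Privacy
// Control signal server-side. Per the GPC spec, participating browsers
// send the "Sec-GPC: 1" request header and expose
// navigator.globalPrivacyControl to client-side scripts.
import express, { NextFunction, Request, Response } from "express";

const app = express();

// Hypothetical hook into a consent-management platform: record that this
// visitor has opted out of the sale/sharing of their personal information.
function optOutOfSale(req: Request): void {
  console.log(`Sec-GPC received; treating ${req.ip} as opted out of sale/share`);
}

// Middleware: check the GPC header before any ad or analytics logic runs,
// so the opt-out applies to every request, not just a preferences page.
app.use((req: Request, _res: Response, next: NextFunction) => {
  if (req.header("Sec-GPC") === "1") {
    optOutOfSale(req);
  }
  next();
});

app.get("/", (_req: Request, res: Response) => {
  res.send("ok");
});

app.listen(3000);
```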

Debbie Reynolds  40:56

Absolutely. I 100% agree. I'm advising my clients, regardless of where their customers are, that they need to really make sure their websites are configured to do that handshake with the GPC signal. I'm seeing a lot of companies that are probably not paying much attention to it right now, and they should, especially as we see more enforcement start happening in California and the other States that respect this signal. From a company perspective, I think Global Privacy Control will have a huge impact on businesses; people read the Sephora case and were surprised that GPC showed up on the list of stuff that they should do. Hopefully, those companies will move in that direction so they don't have to be shamed and blamed once these things are fully enforced. So if it were the world according to you, Tom, and we did everything that you said, what would be your wish for privacy anywhere in the world, whether it be regulation, human behavior, or technology? What are your thoughts?

Tom Kemp  42:17

Well, as I talk about in my book Containing Big Tech, my goal is simply, first and foremost, to have people be more conscious that you are emanating digital exhaust, that you should have rights over it, and that you should be careful, because sometimes that data can be used against you. And then, in a perfect world, privacy would be easy, right? We would have the right to say no to people collecting and selling data, and we should have the right to know as well. Look, I'm not trying to propose aircraft safety, where there's never a crash of an airplane, or it's very rare, and there are 100 checks you have to go through even before you turn on the engine of an airplane. My goal and vision is more like car safety, right? We should tell people, you shouldn't put the baby on your lap in the front seat when you're driving; you should actually have a child car seat; there should be airbags, etc. That's my goal and vision. And people may say, but that stifles innovation. My goodness, look at the innovation that's happening in automobile manufacturing, with electric cars and Teslas and all the great things that are happening; now you have the Ford F-150 charging up a house when the power has gone down. We never imagined that stuff. So if we can get to the equivalent of car safety vis-à-vis privacy, then it provides safety, and it also provides a means for innovation. It's about striking the right balance. And that's what I'm trying to get to: can we have a basic privacy law for the whole United States? Can we make it easy for us to delete data from entities that we don't have a relationship with, entities that purposely make it difficult, via dark patterns, for us to delete it? Dark patterns should be banned; that's just a great example as well. So that's my vision. I try to put forth this vision in the book, and I encourage people to check out Containing Big Tech.

Debbie Reynolds  44:24

I agree. I agree. I'm not saying it just because I'm quoted in the book. It is really good.

Tom Kemp  44:31

Well, thank you. I do appreciate you looking at it and saying some very nice things. So it's greatly appreciated.

Debbie Reynolds  44:37

Yeah, you're filling that gap. There's definitely a gap there that you're filling, and it's really important. I think we aren't going to solve privacy problems if we're just at conferences wearing tweed jackets with leather patches on the elbows, right? So understanding that we need to be able to communicate this to someone who's eight or eighty is very important.

Tom Kemp  45:02

Absolutely.

Debbie Reynolds  45:04

Excellent. Well, thank you so much for being on the show. It was fun. I could talk to you for another hour.

Tom Kemp  45:10

Likewise, this has been great. Keep up the amazing work that you do and your evangelism. I'm complimenting what you do with the book, with all the great content you put on LinkedIn and other social media, and, of course, your amazing podcast. So thank you again for having me on as a guest.

Debbie Reynolds  45:26

Aww, that's so sweet. So sweet. Well, we have plenty of ways we can collaborate. That'd be great.

Tom Kemp  45:31

Absolutely.

Debbie Reynolds  45:32

Alright, talk to you soon.

Tom Kemp  45:34

All right. Bye bye.

Debbie Reynolds  45:35

Bye Bye.