E100- Roy Smith, CEO, PrivacyCheq

Find your Podcast Player of Choice to listen to “The Data Diva” Talks Privacy Podcast Episode Here


SUMMARY KEYWORDS

privacy, people, consent, cookies, coppa, company, happening, enforcement, track, privacy laws, called, law, california, consumer, data, revoke, world, created, years, computer

SPEAKERS

Debbie Reynolds, Roy Smith

Debbie Reynolds 00:00

Many thanks to my guest, Roy Smith, for being the 100th guest on "The Data Diva" Talks Privacy Podcast. I'm really thrilled to have Roy on the show. Also, I wanted to note that there were two things Roy said in this podcast that have since come true, so two good predictions from Roy. One is about the state Attorney General in California starting enforcement actions and having people really start to take notice. That happened in August 2022 with the Sephora enforcement action; I think it was $1.2 million against Sephora, for failing to tell consumers that it was selling their data through contracts with online tracking companies. And then also the California Attorney General's focus on Global Privacy Control. So we talked about both of those in this episode. Thank you so much, and enjoy the show.

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.

Hello, my name is Debbie Reynolds. They call me "The Data Diva". This is "The Data Diva" Talks Privacy Podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a special guest on the show, Roy Smith; he's the CEO of PrivacyCheq, and I'm happy to have him on the show. Ah, well, this is going to be fun. So you and I have been connected on LinkedIn for a long time, I believe four or five years; it's been a while. And I've been watching your company grow. You have always been someone who really stands out in the way that you comment on things; you always bring the heat. You bring a lot of knowledge and thought to the things that you comment on, and it's a masterclass in understanding stuff. So I know you have deep knowledge of technology and privacy and of what's happening in the US, so you can tell that story better than most people that I know.

Roy Smith 02:23

I've been told that I have a good BS detector.

Debbie Reynolds 02:26

I think that's true. I think that's true. I would love for you to talk about your journey in tech and what you're doing with your company PrivacyCheq.

Roy Smith 02:39

Okay. Yeah, I'm definitely atypical when it comes to the people who would be on your podcast; I certainly didn't get my start as a privacy person at all. I am a technologist. I started by writing code for the IBM PC before it was actually publicly released in 1981. My brother was involved with a company that partnered with IBM to figure out ways to put the IBM PC into the banking industry. So we were actually writing code in a sealed room with a lock on it, because that was the only way IBM would allow us to have the computer outside their location. We had a dark room that you had to go into. But I am a musician, and shortly after that happened, I got involved in creating things to make sound and music happen more easily on the computer. The IBM PC, when it first came out, only made beeps. It didn't record sound or play back any real sound; the only thing it had was a beep, and you could control the frequency of it. But that was it. And being a musician and having all of the interesting tools that were coming out at that time to let music and computers do stuff, a partner and I founded a company called Turtle Beach, which, if you are a person who plays video games these days, you may know as the number one vendor of gaming headsets for people who play Fortnite. So we founded that company in '85, just the two of us, literally on my kitchen table. And I built that up to a point where we sold it to a chip company in '93. After that, it went through a number of permutations, eventually going public in the mid-2000s, I believe. And at one point, I think it was back in 2011 or '12, it was actually the highest-growth company on the NASDAQ. So I'm kind of a proud papa. I don't have any involvement with that company anymore, except for the fact that, you know, I named it; it was named after a place where my parents live down in Florida. My partner and I were able to build that up, and it's still alive, and they have a very big audience right now. So basically, I'm a tech entrepreneur guy. I was writing code. And my family's very entrepreneurial; my father has a bunch of patents that he got during his career, and my brother's also entrepreneurial. So after I sold Turtle Beach, I got involved in an incubator. This was when the dot-com era was starting, and I was involved with a number of companies that were not successful. The final one was a company called appMobi that created tools for smartphones to allow people to easily make mobile games. And we were fortunate enough to sell that to Intel in 2013. So just as that was finishing up, I was trying to figure out what I was going to do next. And I learned about this law in the US called COPPA; this is the first I had ever heard of it. COPPA stands for the Children's Online Privacy Protection Act. It had been initiated in 1998 because the Internet was growing at that point, and it was a Wild West; there were all kinds of crazy things happening, like websites that would say, now go get your parents' credit card and type the numbers in to me, and, you know, you can get a prize. So Congress created this law, COPPA, and the Federal Trade Commission began to enforce it. Basically, COPPA said, if you're going to gather any private information from a kid under 13, you have to first tell the parent what you're going to do, and the parent has to give their permission. So it makes sense. It's a reasonable law. So that was great for websites.
And it was working well; there were some enforcements throughout the 2000s. And in 2012, because smartphones were now getting very big, you know, the iPhone was introduced in 2007, and so all the kids were playing on smartphones. Smartphones, in terms of being able to gather private data, are probably 100 times better than a web page. So Congress, in their infinite wisdom, updated COPPA, and there was this revision of COPPA that went live in July of 2014. So after selling appMobi, I knew all these game publishers because that had been our business, and then I found out about COPPA. And I said, boy, this is going to be a big pain point for all these publishers. Being an entrepreneurial guy, I said, well, why don't I start a company and build a technology stack that these people can just build into their games? I can't get rid of the pain that this law introduces for them, but I can certainly ease their pain and, at the same time, help people protect their kids' privacy. So I was able to raise some money locally because I was kind of a technology hero; not everybody in my area, you know, builds a company and sells it to Intel. So raising money wasn't that difficult. And I got the best guys that had worked with me at appMobi, and over the next year and a half, we built the core technology, which is now ConsentCheq. That's what we are selling to do consent management. But back then, we were just trying to solve for COPPA. So we built a product that was very, very scalable. It was intended to work for Angry Birds and Candy Crush and games that had hundreds of millions of monthly users, so it was all built on an Amazon AWS back end. And we went through the COPPA law, and we just created technical solutions for each of the weird things that COPPA requires you to do. And there are a lot of weird things in there. I mean, the law was written by people who didn't understand technology, and there are still some loopholes. I just read an article the other day that somebody has figured out that there's a loophole in COPPA; well, all the bad guys have known that for 15 years. So this is why I kind of chuckle when we get into conversations about, oh yes, we're going to have this new Federal privacy law, it's going to be great, you know, they're going to update COPPA. Well, COPPA has some loopholes that are so big you can drive a truck through them. And the reason for that is that there were lobbyists when COPPA was written who said, well, let's just put this thing in that says, if we don't actually know it's a kid, then we don't have to treat them like they're a kid. That's called actual knowledge. That's one of the loopholes. So if there's stuff like that in the new Federal privacy law, you know, we're basically not going to be achieving anything. I'm going to finish this story quick; I can see you falling asleep. So we updated our technology. Oh, okay, at the same time as this COPPA thing was happening, the Europeans were updating their privacy law, which itself had not been updated since before the Internet, so they were badly in need of privacy regulation. And they created GDPR, which, in my opinion, is really the granddaddy that underpins all of the privacy laws we now talk about. Even though COPPA was there first, and GDPR was loosely based on COPPA, GDPR really is the thing that has gotten people's attention all over the world, and now it's being copied in many countries, if not all. So GDPR was a big deal for us.
It gave us a market to sell to that was 10,000 times larger than the market we had been trying to sell to, which was just the top 200 game publishers. Now, you know, with GDPR, we're selling to millions of enterprises all the way across the European Union, and then back into all the other countries that want to comply with it. So we updated the platform to handle all of the little intricacies that are different between GDPR and COPPA, which are maybe 5%, and we launched that product in 2016, actually two years before GDPR went into force; we had a marketable product to comply with it. So, as you can imagine, nobody cared, because there was a two-year grace period. Yeah, I mean, that was a slow period for us, because at the same time in the US, the Federal Trade Commission wasn't enforcing COPPA. And in retrospect, I believe the reason the FTC didn't do that was that they were afraid they would get up against a heavily lawyered, large ad tech or social media company and would actually lose a court battle, which for the FTC would be, you know, a career-ending effort. So the way to avoid that is to never pursue an enforcement action against a Facebook or any big COPPA violator, so to speak. So we created this GDPR thing; we made everybody aware of it. When GDPR came out, we were doing quite well for the first year. But then we noticed, hey, we're not getting any new customers. And basically, people were sitting on their hands, waiting to see if there was going to be any enforcement. So the customers that we did get were the early adopters, the people that wanted to do the right thing, and sadly, in this day and time, there are not all that many of them who want to do the right thing. So at the same time, our friend Alastair Mactaggart in California got fed up with the privacy thing and managed to twist the arm of the California Legislature to adopt CCPA. So this sort of project of forcing companies to tell you what they're going to do with your data and to get your consent or your opt-in is washing over the world now. But back then, you know, CCPA was a big, big deal for us. So that's where we are. I'm not a privacy person; everything I know about privacy, I've learned since we turned that corner in 2013. I do have some CIPT people on our staff, but I'm really a technologist, an entrepreneur, and also a musician, though that doesn't apply to this use case. But yeah, did I overly answer your question by a factor of five?

Debbie Reynolds 13:54

No, no, that was great. That was great. I'm a technologist as well. We have that in common. I'm not a musician.

Roy Smith 14:02

It has been many years since I've written code, but I did write code back in the day.

Debbie Reynolds 14:09

Yeah, that's really interesting. I think COPPA is overlooked a lot, because a lot of companies just say, well, let's just put a checkbox on there and say that you can't use our site if you're under 13.

Roy Smith 14:24

But that gives them plausible deniability. Facebook's response for many, many years was, well, our terms of service say you can't get on if you're less than 13, so we're good. And, you know, there was nobody on the enforcement end saying, no, you are well aware that you have kids under 13; I don't care what your terms of service say. Nobody ever enforced it that way.

Debbie Reynolds 14:49

Yeah, I wonder what will happen with that in the future. There are some rumblings out of Australia that they're trying to really put the pins to organizations about the knowledge of the age of an individual.

Roy Smith 15:01

Well, I think Lina Khan, the new FTC Commissioner, has really tried to telegraph to the market that she's going to go after this. She's said so in a number of different speeches and press releases. I believe that COPPA is actually going to come back to haunt people, because they are actually going to start enforcing it. She actually said, we haven't done much enforcement; we're going to step it up. And I believe the reason for that is, here we are in an election year, talking about child privacy; who's going to advocate against that? I mean, it's like the easiest thing to say, hey, I'm a hero; we're protecting child privacy. So maybe there are going to be some big announcements in the next couple of months, you know, something massive, like TikTok, for example. Remember, there was a period in the Trump administration when he was going to force TikTok to sell to Microsoft because of their child privacy issues. That was sort of a sideswipe of the whole COPPA thing. I agree; I think it's going to come back, and child privacy is going to be a part of whatever we end up with as a Federal privacy law, in the event that the people in Congress are able to actually do anything other than a press release.

Debbie Reynolds 16:25

Right. Absolutely. Well, that's my concern. I have friends who are like, oh, are you going to do videos about the, you know, proposal? And I'm like, well, no, I'll kind of wait until the cake is baked first. A lot can happen between now and, you know, the midterm elections. So I'm just kind of waiting to see what happens.

Roy Smith 16:45

I was not a big fan of government before I got involved with this. But having been involved with the FTC and government activity for eight years now, I'm really not a fan of government.

Debbie Reynolds 16:55

Oh, one thing that you mentioned that I would love to talk about, and I know you know this very in depth, is what the CCPA does for companies when they're dealing with children that are under 16.

Roy Smith 17:16

Right.

Debbie Reynolds 17:16

Between 13 and 16. So people try to kind of skirt what's happening with COPPA; they'll say, okay, well, we'll just say people under 13 can't use it, and then keep going. But now there's this new wrinkle with CCPA, where there are things that companies have to do if they have someone under 16. So can you tell us a little bit about what this is?

Roy Smith 17:45

Yeah, this is very, very poorly understood. And of course, the marketers make it their business not to understand it, so they can say, you know, oh, I had no idea. But CCPA takes the opposite approach to GDPR. GDPR says, if you're going to do stuff with my data, you have to tell me what you're going to do, and I have to give you my consent; I have to take affirmative action to give you my consent. CCPA has the opposite approach, which says, I still have to show you what we're going to do with your data, but I assume I have your consent. So your consent is the default, and if you do not want me to do stuff with your data, you then have to tell me: do not sell my data, or revoke my consent, revoke my opt-in. So the default position of CCPA is that I have your consent. I still have to tell you what I'm going to do, so you still have all the notices; that's another poorly understood aspect. But here's where it gets weird. CCPA says, if the person you are talking about is under 16 years old, in other words, zero to 15, that model gets flipped, and you do not get consent by default; you now have to get their consent. So it becomes, in effect, like GDPR. If I know you're 14 years old, I can't assume that I have your consent; I must get you to click on something that says, I understand what you're going to do with my private data, and I give my consent. That is a big deal for people that address Californians, because think of all the people on social media networks, on Instagram and Facebook, and all the games as well. You have kids that are aged 13 to 15 that now have to get special treatment under CCPA. And it models the same as COPPA, which says if the kid is under 13, the parent has to give their consent; it's the kid themselves who can't understand what they're doing. So you now have to go find the parent, tell the parent what you're going to do, and get the parent's consent. And then the kid can play the game or run the social media or whatever. But if the parent later comes back and says, I revoke my consent, the game has to stop working for the kid. And this is really the functionality that we provide with ConsentCheq: all this back-end plumbing to monitor this stuff and tell you, okay, the parent revoked their consent; you now have to stop this from working. But the big deal that people don't understand is that the ages of 13, 14, and 15 comprise a big chunk of the audience, particularly for entertainment and games. TikTok, for example; imagine how many kids that age are on TikTok. I mean, it's many millions, not just some millions, right? Those kids, when they're signing up, when they're creating their account, have to be shown a notice that they can understand that says, here's what we're going to do with your private data, and they have to affirmatively give their consent. And very few do; I have yet to actually see any California-based website or mobile game that complies with that. And why is that? Well, there hasn't been any enforcement yet.
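
To make the age thresholds Roy describes concrete, here is a minimal Python sketch of how a service might route a user into the right consent model. The names are invented for illustration; this is not PrivacyCheq's actual API.

```python
from enum import Enum

class ConsentModel(Enum):
    PARENTAL_OPT_IN = "verifiable parental consent required (COPPA, under 13)"
    AFFIRMATIVE_OPT_IN = "user's own affirmative opt-in required (CCPA, 13-15)"
    OPT_OUT_DEFAULT = "consent assumed; user may opt out (CCPA default, 16+)"

def required_consent_model(age: int) -> ConsentModel:
    """Map a user's age to the consent model described in the episode."""
    if age < 13:
        # COPPA: the child can't consent; a parent must, and may later revoke.
        return ConsentModel.PARENTAL_OPT_IN
    if age < 16:
        # CCPA flips its usual default for 13- to 15-year-olds: show an
        # understandable notice and collect an affirmative opt-in.
        return ConsentModel.AFFIRMATIVE_OPT_IN
    # CCPA default: notice is still required, but consent is presumed
    # until the consumer opts out ("do not sell my data").
    return ConsentModel.OPT_OUT_DEFAULT

if __name__ == "__main__":
    for age in (9, 14, 30):
        print(age, "->", required_consent_model(age).value)
```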

Debbie Reynolds 21:33

Yeah, this is a tricky one. This is a tricky one. I tell people this all the time, and they get really surprised by it. I think a lot of them felt like, well, why not just say I don't have people under 13, and I'll be done. Then also, I feel like in the US, what's happening on a state level is that we're shifting from kind of a notice regime to states creating obligations for companies to actually do consent. What do you think?

Roy Smith 22:03

Well, yeah, all the different states, I mean, the irony here is ridiculous, because Europe saw fit to create GDPR because they had 29 different member states with completely different privacy laws, and it was driving people nuts. So they created GDPR. Here in the US, we never created a Federal privacy law, and now all the states are creating their own privacy laws. We are moving into a position that is going to exactly mirror the position the European Union was in before they made GDPR, where each state has their own privacy law, which is generally similar, but each one of them has a little different, you know, twist, just because that's how things are. I mean, some of them are radically different, like the Washington State one that's failed three times; if they get that through, that's going to be a big deal, very, very different. But to answer your question, I don't really know how an enterprise that has footprints in many states could ever fully comply with each of the states the way the situation is now. I think, as you said at the beginning, take a shot at being CCPA compliant and GDPR compliant, and you're going to be 90% there. You might get into a situation where one state says, oh, well, you didn't do this, but you have to think the regulators are not going to go after people who make a good faith effort and get 90% of the way there; they're going to go after the companies that aren't doing anything.

Debbie Reynolds 23:43

Yeah, I know. My concern is that the US is going to end up where we were with the data breach notification laws: California started out first, and everyone else kind of got on the bandwagon, so now every state has their own data breach notification law, which is, you know, pull-your-hair-out territory. And then we're seeing that trajectory start again with these new consumer-based privacy laws that states are putting up.

Roy Smith 24:18

Think of how many things California has led on in our country. I mean, we're really very lucky to have California. The whole smoking thing, the car emissions thing, the ADA, you know, for people with a handicap. California really, really leads in terms of those types of initiatives, and in privacy again with CCPA.

Debbie Reynolds 24:43

Yeah, and the privacy notice and privacy policy on websites.

Roy Smith 24:48

That's right. That was exactly right. That didn't come from DC.

Debbie Reynolds 24:53

No, no. I told someone that they had to have a privacy policy on their website, I don't know, probably about 10 or 15 years ago, and they just laughed at me. I'm like, really? You're supposed to have one. Now everyone has one, or should.

Roy Smith 25:08

Well, that's a whole other can of worms, though, because they ended up being a page of legalese defensively crafted so that nobody would ever read it. And that's another one of our crusades, to fix that.

Debbie Reynolds 25:20

Yeah, that's a problem. I like the fact that your tool is trying to help people on this back-end stuff because, I mean, that's so hard, right? Organizations have a hard time knowing what they have, finding what they have, and being able to take action if they have to revoke consent, or even knowing what consents they have. That's really tough. What are your thoughts?

Roy Smith 25:46

Yeah, it gets really weird when you have what we call consent fragmentation. It's easiest to look at it from the perspective of the user. If I'm a user, and I log into a website, and I say, oh, this is interesting, I want to get on the mailing list, they then take me to a mailing list ingestion site for SendGrid, and I say, okay, I'll just type in my stuff. To me, I'm still working with Company A; I don't really know that I got shifted over to another place. Now SendGrid has my consent to receive emails. I start getting emails, and I say, I don't like this anymore. I go to Company A, and I say, hey, unsubscribe me; well, Company A may have three or four different places where I can unsubscribe, but none of those things are connected. I, the consumer, feel that I've unsubscribed; I went to this company's site, and I unsubscribed. I don't know that there's no connection between these different silos. So this is what's called consent fragmentation, where you have all these disparate services, which may or may not even be owned by the company you think you're dealing with, that have your consent for one or more tiny things. So our solution to that is to create what Gartner calls a single source of truth. We have a database that keeps track of Debbie's consent for this, for that, for everything that she gives consent for. The catch is that we require all of these third parties to then come and look up what Debbie's consent is using our database, not theirs. So you get into a lot of what's called middleware, where, when you change your consent, we have to send a note to them and say, hey, SendGrid, Debbie changed her mind; take her off the email list. It makes it more complicated, but it makes the situation exactly what the user expects: if I sign up for email from a Company X, and I change my mind and unsubscribe, I don't want to see any more emails from Company X. And the fragmentation has led to real frustration. A couple of years ago, a guy from Cisco was telling us what a horror show it was; he had a customer that actually unsubscribed five times, each time in a different marketing database. You know, complete fragmentation.
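
As an illustration of the single-source-of-truth pattern Roy describes, here is a minimal Python sketch with invented names; a real deployment would notify downstream services (an email provider, an ad platform, a CRM) over webhooks or APIs rather than in-process callbacks.

```python
from typing import Callable

# A downstream system that holds a copy of the user's consent state.
# Signature: notify(user, purpose, granted).
Subscriber = Callable[[str, str, bool], None]

class ConsentLedger:
    """Single source of truth for per-user, per-purpose consent."""

    def __init__(self) -> None:
        self._grants: dict[tuple[str, str], bool] = {}
        self._subscribers: list[Subscriber] = []

    def subscribe(self, notify: Subscriber) -> None:
        self._subscribers.append(notify)

    def set_consent(self, user: str, purpose: str, granted: bool) -> None:
        self._grants[(user, purpose)] = granted
        # Fan the change out so no silo keeps a stale copy -- the
        # "middleware" plumbing Roy mentions.
        for notify in self._subscribers:
            notify(user, purpose, granted)

    def has_consent(self, user: str, purpose: str) -> bool:
        return self._grants.get((user, purpose), False)

def email_service_sync(user: str, purpose: str, granted: bool) -> None:
    if purpose == "email_marketing" and not granted:
        print(f"email service: removing {user} from the mailing list")

ledger = ConsentLedger()
ledger.subscribe(email_service_sync)
ledger.set_consent("debbie@example.com", "email_marketing", True)
# One revocation propagates everywhere -- no more unsubscribing five times.
ledger.set_consent("debbie@example.com", "email_marketing", False)
```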

Debbie Reynolds 28:11

Yeah, right. It's like Santa's workshop at a lot of companies, right? The data just goes everywhere, and they don't really know how to wrangle it. So I think tools like yours definitely help in that way.

Roy Smith 28:24

In the old days, it was easier, because, like, for terms of service, when you install a Microsoft product, you see the terms of service, you have to scroll down and say yes, and that's a one-time thing; once you give your consent, they register it, and it's never discussed again. But with the advent of all these new privacy laws, you, the user, have the ability to revoke your consent any day of the week. So that means they have to actually verify it any time they're going to process your data. The consent to handle or process your data is now a living thing that has to be checked every time you process it. That's been a huge change, and I'm still educating people on it every day.
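
That last point lends itself to a short sketch: consent moves from a one-time gate at signup to a check on every processing run. A minimal illustration in Python, with invented names; in practice the lookup would query a central consent store like the one sketched above.

```python
class ConsentRevokedError(Exception):
    """Raised when processing is attempted without current consent."""

def process_user_data(user: str, purpose: str, consent_lookup) -> None:
    # Consent may have been revoked since it was first given, so it is
    # re-verified at processing time, not just at signup.
    if not consent_lookup(user, purpose):
        raise ConsentRevokedError(f"{user} has not consented to {purpose}")
    print(f"processing {user}'s data for {purpose}")

# Stand-in lookup; a real one would hit the single source of truth.
consents = {("debbie@example.com", "analytics"): True}
process_user_data("debbie@example.com", "analytics",
                  lambda u, p: consents.get((u, p), False))
```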

Debbie Reynolds 29:10

Sure, I'm sure. Now, I'd love your thoughts about, I guess it's the CPRA now, about what's written in there. And you correct me if I'm wrong: it requires companies to be able to read signals, like opt-out signals, from different technologies. Can you talk a little bit about that?

Roy Smith 29:33

Well, this is the return of Do Not Track. Global Privacy Control is the return of Do Not Track. And Do Not Track is a story of how the ad tech world was able to defeat a very good, well-intentioned initiative because it was going to really screw things up for them. The Do Not Track initiative came out way back, probably in the mid-2000s, when browsers were big. The idea was, okay, we'll just put a thing in the browser that says, every website I go to, send this little signal that says do not track. Makes perfect sense, perfectly logical. Well, the ad guys said, we know that if we do that, the number of people that say do not track will be huge. The largest consumer boycott in the history of man is the number of ad blockers in use; hundreds of millions of people are pursuing a consumer boycott of being tracked by using ad blockers. So these ad guys knew that if they gave people the chance, they wouldn't allow it. So they were successful at hobbling Do Not Track through a variety of methods. Some of them were because the people who owned the ad companies were also the people that owned very popular browsers, such as Chrome; it was sort of like the wolf manning the hen house. But with the advent of CPRA, they've now come back and said, we want to do that again. Now it's called GPC, the Global Privacy Control signal, and the law mandates that you have to listen to it. So it's a short-form way for people to say, don't track me. They don't even want to see a notice: I'm already telling you, I don't need to see a notice, do not track me. So I think that's going to be successful. But I do think that it's going to be a battle. The ad techs are not going to go down gently.
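
For the technically curious, the GPC signal Roy describes rides along on ordinary web requests as an HTTP header, Sec-GPC: 1, set by browsers and extensions with the control enabled. Here is a minimal Flask sketch of a server that listens for it; the handling shown is illustrative, not a compliance recipe.

```python
from flask import Flask, request

app = Flask(__name__)

@app.route("/")
def index():
    # Browsers with Global Privacy Control enabled send "Sec-GPC: 1"
    # on every request. Under the CPRA regulations, sites must treat
    # the signal as a valid request to opt out of sale/sharing.
    if request.headers.get("Sec-GPC") == "1":
        # Record the opt-out and suppress any sale or sharing of this
        # visitor's data before serving the page.
        return "GPC detected: treating you as opted out of data sale."
    return "No GPC signal received."

if __name__ == "__main__":
    app.run()
```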

Debbie Reynolds 31:54

Yeah. And then, you know, I'm already seeing companies saying in their public-facing privacy policies that, you know, we don't recognize these signals, you know, Global Privacy Control. So I think that's going to be a big deal, because, as you say, the laws are going to mandate that companies have the capability to do this.

Roy Smith 32:21

We have high hopes for Ashkan Soltani and the California enforcement agency, that they're going to make one or two examples out of companies that do exactly what you said. Many of them don't explicitly say we don't listen to it; in fact, they just sort of ignore it. And then, you know, a third-party researcher will set his GPC signal, actually test it, and say, well, I just tested this, and this company completely ignored my Global Privacy Control signal, and they'll refer it to an enforcement agency. But as with all of this privacy stuff, until there's enforcement that gets into the heads of CEOs, like, man, I don't want that to happen to me, all we're doing is just moving pieces around on a chessboard. Nothing really is going to happen until people get scared.

Debbie Reynolds 33:12

Yeah, I love talking about the death of cookies.

Roy Smith 33:19

Yeah.

Debbie Reynolds 33:20

I have my own thoughts about the whole cookie thing. My thing with the cookie deal is that, to me, the problem is not cookies; the problem is tracking. You can track without cookies. Some people overly emphasize the cookie, as if, you know, once we take that down, it's going to stop tracking. But we know that tracking still exists; there's probably a lot more of it. It will be more ubiquitous now, and probably harder to spot. What are your thoughts?

Roy Smith 33:54

Well, yeah, the ad tech world is nothing if not extremely resourceful and creative in finding ways to do what they want to do, which is to track you. I mean, the cookie was never, ever meant to be used for what they use it for; it was just convenient for them. The cookie was for things like, what's the size of the screen? How many megabytes of storage does this computer have? What's this person's favorite font? Did they like it zoomed in or zoomed out? A perfectly useful purpose for cookies. But then somebody said, hey, you know what, we could put an ID number there, and then when somebody clicks on an ad, that ID relates to it; now we know that person likes that thing, and we can sell more ads. Oh, that'd be great. So the entire ad tech world built its entire fortune on cookies, because it was a convenient, existing technology. We always recognized that the cookie approach was not workable. Imagine you and I are having a business relationship, and we're going to exchange money, but I store how much money I owe you in an envelope sitting on the top of your desk, and I need to know that information. How ridiculous is that? Well, if I'm storing a person's consent information on their computer, the only time I can access it is when they are online using that exact computer and that exact browser. You could be sitting at home with your computer and your browser, and then you bring up my website on your phone, and to me, you're a completely different person, because it's a different device. And it's not just devices; even within sessions, you could have two different browser windows up, and if the browser doesn't share the cookies, one could have consent and the other could not. So the whole notion of using cookies to manage consent was something the ad world did, and I believe it was kind of a dark pattern: let's put these cookie banners up; this is going to be so annoying that people will hate it, and they'll take it away, just like they did with the Do Not Track thing. And so they did it; the IAB created their open-source TCF thing, the Transparency and Consent Framework, and everybody would put up these banners that had 500 different ad networks, and to stop them from tracking, you'd have to go in and click 500 times: no, you do not get my consent. It was a very cynical approach. So we always knew that it was going to be a fail; it just took time. Now, you know, in February, the Belgians said, look, this is just not legal. Here are the things you don't do in the Transparency and Consent Framework, and unless you fix these in six months, you're out of business, go do something else. And so for us, it's a little bit of, I won't call it Schadenfreude, but we've enjoyed watching what's happened, because the cookie thing has really done damage to us. You know, these people would come along and say, yeah, GDPR, you want to comply with it? Here's this $500 cookie thing, and you're done. Well, that's not true. There are so many aspects of GDPR that cookies could never cover. It's just ridiculous. So now you've got me started. What do I think about the death of cookies? I couldn't be happier. Thanks.
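
To make the "put an ID number there" move concrete, here is a minimal Flask sketch contrasting the cookie's benign, preference-storing use with the tracking repurposing Roy describes. It also shows why consent stored this way is stuck on one browser on one device: the cookie travels only between this server and this browser.

```python
import uuid
from flask import Flask, request, make_response

app = Flask(__name__)

@app.route("/")
def index():
    # Benign use: a preference, readable only by this site in this browser.
    font = request.cookies.get("favorite_font", "default")

    # The repurposing: drop a unique ID once, then recognize the same
    # browser on every later visit. (Third-party cookies extended this
    # to every site that loaded the same ad network's tag.)
    visitor_id = request.cookies.get("visitor_id") or str(uuid.uuid4())

    resp = make_response(f"font={font}, visitor_id={visitor_id}")
    resp.set_cookie("visitor_id", visitor_id)
    return resp

if __name__ == "__main__":
    app.run()
```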

Debbie Reynolds 37:54

Yeah, I don't want people to be so fixated on cookies that they're not thinking about the other ways they're being tracked. "The death of cookies" is actually a misnomer.

Roy Smith 38:06

It's not really the death of cookies; it's the death of cookies being used for tracking and privacy management. Cookies are fine. There's nothing wrong with them; they fulfill a wonderful purpose. It's the perversion of them for this purpose that is the problem. Sorry.

Debbie Reynolds 38:06

It is. Yeah, I agree with that. You know, and then we have things that are coming up now, like browser extensions, fingerprinting, and stuff like that.

Roy Smith 38:37

Right, as I said, the ad tech guys are very clever. And yeah, there's always been fingerprinting; you've always been able to look inside a person's computer and say, okay, well, this is a Dell that has two gigabytes in here, and so on. Well, that's a fingerprint. If I ever see that combination again, I can know that it's that guy, even though I don't have a cookie. So fingerprinting isn't new; that's been around for 10 years, but they just didn't need to use it because the cookies were working so well. Now the cookies are being taken away, so now you have to fingerprint. So what does Apple do? They turn all that stuff off. Apple has done a wonderful job of weaponizing privacy. Several years ago, even back when Jobs was still alive, they kind of realized, hey, let's make privacy a part of our platform. We don't make money from advertising, so let's make privacy a big part of our platform, and the guy we compete with, whose name starts with a G, advertising is their thing, and they make smartphones. So that'll give us a wonderful edge. And they have beaten Google over the head with that edge quite well.
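
A toy illustration of what a fingerprint is: combine enough incidental device attributes and you get a stable identifier that survives cookie deletion. The attribute names below are invented for the example; real fingerprinting scripts use dozens of signals (installed fonts, canvas rendering, audio stack, and so on).

```python
import hashlib

def naive_fingerprint(attributes: dict[str, str]) -> str:
    """Hash a bundle of device attributes into a stable identifier."""
    # Sort keys so the same device always produces the same string.
    blob = "|".join(f"{k}={v}" for k, v in sorted(attributes.items()))
    return hashlib.sha256(blob.encode()).hexdigest()[:16]

device = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "screen": "1920x1080",
    "timezone": "America/New_York",
    "memory_gb": "2",
}
# Same device, same ID -- no cookie needed.
print(naive_fingerprint(device))
```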

Debbie Reynolds 39:50

Yeah. Do you see, when you work with companies, and I don't know, maybe it's still early yet, do you see clients or potential clients seeing privacy as an advantage and not a tax on their business?

Roy Smith 40:09

All right, you sound like one of my many press releases. Over the past few years, I have always said, look, consumers feel more comfortable interacting with a company that they trust. Trust is now a fungible thing in the economy we exist in. But at the end of the day, at the end of the quarter, the numbers have to be made, the sales have to be made. If you're a marketing guy, your compensation is based on how many people went into the top of the funnel and how many came out the bottom of the funnel as conversions. So the happy talk about privacy sort of gets forgotten in that scenario. There are some companies who have taken that approach, Apple being one of them; there are others who have said, you know, we're going to take a privacy-first approach. I think, over time, more and more will, because, over time, people have gotten more and more freaked out about their privacy. I quoted this just this morning, writing an article for somebody else: there was a Pew Research survey done a few years ago in which 91% of the people surveyed in the US felt their privacy was out of control. Think about it; when was the last time you heard a survey of anything where 91% of people agreed on something? Is it sunny? No, no, it's not; it's a little cloudy. Right? 91% of people in the US believe their privacy is out of control. That's got to translate into people making purchasing choices when it's obvious to them, hey, these people care about my privacy; hey, these people don't. So weaponizing privacy, as Apple has done, has been a very, very good thing for Apple to do.

Debbie Reynolds 41:54

Yeah, I agree. So if it were the world according to Roy, and we did everything that you said, what would be your wish for privacy or consumer rights, anything, anywhere in the world?

Roy Smith 42:10

It starts with an E and ends with enforcement. I mean, we can do all the work we do to make these things nice, to make nice notices for people, but the companies that need to do these things must feel some pressure to do them, and to do them the right way. And that comes from enforcement. The other way it comes is much slower, which is consumers actually voting with their pocketbooks and walking to companies that are privacy-focused. The problem, though, is that it's never as simple as, well, I'm going over here across the street because these people are better with my privacy. Consumers don't know how they're being tracked. They don't know what's happening to their data. They just have a feeling: hey, this is creepy. You know, I signed up for this, and a week later I got an email from this other company; because I bought a set of baby booties here, they think I'm pregnant. All this stuff; people are just skeeved out because of all of these connections being made, and they don't know how it's working. So the choice of a privacy-focused company versus a non-privacy-focused company is not easy for the consumer. There's not a flag that flies out front that says, you know, we're CCPA compliant, while the other guy doesn't have one. The way that happens is through enforcement, when some of these big, big companies that do this stuff, you know, the TikToks and the Facebooks and Metas of the world, get hit. I mean, look at the privacy enforcement that Meta has taken; they happily agreed to pay $5 billion to settle a privacy infraction that was made back in 2011. That's how glacial the pace of enforcement is. And you asked me for my one thing: I would love to see effective enforcement happen, in Europe with GDPR, in California, in Washington, DC with COPPA, and with a Federal law when we get one. Putting out press releases and passing laws is one thing, but the hard part is the enforcement, and that's where everybody seems to fall down. If you're not going to enforce it, don't bother passing the law, because you're just twiddling your thumbs.

Debbie Reynolds 44:37

Yeah, I agree with that. I agree wholeheartedly. Well, it's been great to have you on the show. This was great. I'm sure everyone will really love this; a lot of the things you talked about probably aren't well understood, but you explain them very well.

Roy Smith 44:53

Thank you. Thank you, Debbie. It's been wonderful to be here and to get to vent my spleen about the cookies.

Debbie Reynolds 45:00

Oh, no worries, no worries. Well, it's been great, and I'm looking forward to chatting with you soon. Thanks. Thank you.