E112 - Richy Glassberg, Co-Founder and CEO, Safeguard Privacy
54:04
SUMMARY KEYWORDS
debbie, people, data, privacy, companies, marketer, laws, compliance, assessment, complexity, consumer, ad, world, wayne, platform, industry, understand, regulators, brand, advertising
SPEAKERS
Debbie Reynolds, Richy Glassberg
Debbie Reynolds 00:00
Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a special guest on the show, Richy Glassberg. I'm so delighted to have him on the show. This is so funny. So happy to have you here, Richy.
Richy Glassberg 00:39
Debbie, it is great to be on "The Data Diva" podcast. Thank you for inviting me on.
Debbie Reynolds 00:44
So let me just say a few words about Richy. Richy is the Co-Founder and CEO of SafeGuard Privacy. He was also part of the founding team of CNN.com and a Co-Founder of the IAB, so we have tons to talk about. I had the pleasure of meeting Richy; this is so funny. I met Richy at a conference we attended, the Privacy Matters conference in New York for Ketch. And you and I know someone in common, which is your Co-Founder, Wayne Matus. I've known Wayne for a million years; I knew him before he did SafeGuard. We used to always collaborate and do stuff together, and he's so smart and fun. Now that I've met you, everything clicked; you're kind of like the yin and yang, you all fit together. It's pretty cool. So tell me a little bit about yourself. You've had such an illustrious career in media, and I would love to find out about your journey in technology and why you find yourself in privacy now.
Richy Glassberg 01:51
Thank you, Debbie. Funny story, you know, Jules Polonetsky at the FPF?
Debbie Reynolds 01:56
Yes.
Richy Glassberg 01:57
So when I hired Wayne — when I found Wayne to be my Co-Founder, I should say it the right way. We realized after a couple of months that we had somebody in common. Back in the day, when I started CNN.com, I did the first search for an ad server. It was between a company called DoubleClick and another company we were looking at — the two ad servers — and the lawyer was a guy named Jules Polonetsky. At the same time, back in the mid-90s, Jules and Wayne did the first-ever New York City Bar CLE on privacy. So what a small world: Wayne knew Jules back from '99-2000, I knew Jules from then, and then Wayne and I got together in 2018 to try to solve privacy for companies. What a small world, Debbie.
Debbie Reynolds 02:49
It's a totally small world, right? I've been watching Jules for many years, and I've known Wayne, as I said, forever. So it's fun; I can totally see the synergy between you guys. You have so much energy, and I loved the talk you did at the Data Privacy Matters session, because I'm a real-talk person, and I feel like you do that too. It just makes me laugh.
Richy Glassberg 03:21
Thank you. And you know, sometimes I get in trouble for it. But look, I've been in the digital space since it really started in 1995. I was lucky enough to be one of four people brought together — we were all inside Turner Broadcasting — when Ted asked us to start CNN.com. There were four of us; I was the business guy, the revenue guy, in New York, in sales. And frankly, Debbie, nobody knew what they were doing. Back then it was the early days: it was AOL, it was Prodigy, the early days of Yahoo, all this stuff that nobody knew what it was. And we started CNN.com. We took the content that we had from CNN and created the first website for CNN on the Internet, and I was lucky enough to run that for four years. We ran it around the world; we started CNNfn, CNN/SI, and CNN Global. At the same time, because I grew up in the cable industry, we had something called the CAB, the Cable Advertising Bureau. What I learned growing up in the cable industry is that an industry body could help; cable had to compete against broadcast television. There were about 35 of us that got together at CNET — if you remember CNET, back in the day — it was like mid-'96 when a whole bunch of people from early Internet companies got together for a lunch meeting outside of a guy named Halsey Minor's office. And all of us decided we were going to do this, and about eight of us got tasked to do it. There was a bunch of really cool people that I'm really proud to have been part of: Rich LeFurgy, who was at Starwave, which was ESPN.com; Molly Ford from Infoseek; Kate Everthorpe from CNET. There were a lot of great people from back in the day — Jed Savage, Scott Schiller, Doug Weaver. It's just an amazing crew of the original people of Internet advertising. And we created the Internet Advertising Bureau.
And I was lucky enough, with Rich LeFurgy as the Chairman, to be the Vice Chairman for the first six years. We created the first industry ad sizes and the first industry T's and C's, because we realized it was so new that people needed to understand how to buy digital. And the reason I tell the story is that a lot of things were done where we had no idea of the consequences. We took the first cookie — it was a txt file — and I actually took the first tracking pixel from my friends who ran General Motors. General Motors was the first advertiser in digital to put out a tracking pixel, and nobody would take it. I was running CNN.com; we were a big partner of General Motors, as you would expect with CNN and Turner Broadcasting around the world, and they asked us to take it. Nobody knew what that tracking pixel would become. Nobody knew how data was going to become literally the oil of the digital industry, and the impacts of those things. Looking back on it, we could not tell what would happen by taking those technologies that today have really led to the Internet digital ecosystem forgetting the consumer and using that data as oil in a way that has upset consumers so much over the last 10 years. I see that direct line, Debbie, from those initial cookies and tracking pixels to programmatic speeding up advertising. I think GDPR was a direct result of programmatic advertising and digital advertising forgetting the consumer, and I think these privacy laws are really a response to consumers feeling creeped out. Because, you know, Debbie, let's be honest, none of us really had privacy in an offline world. When you moved or I moved, we put in a change-of-address form with the US Postal Service, and that data was sold; when you used your credit card, that data was sold.
And a week later at your new house, you got a card in the mail that said, hey, Debbie, hey, Richy, I'm your local dry cleaner, come on by, I'll give you 10% off, or whatever it was. That took the speed of a month. Right now, on the Internet, I go and look at a pair of Merrell hiking sneakers on Amazon — probably one of the best data companies of these tech companies out there — and I actually buy them on Amazon, Debbie. They don't realize that I bought them, and for the next 30 days, everywhere I go, I see an ad for those same Merrell sneakers that I just bought, and it's in my face. Now substitute Merrell sneakers for something sensitive — a medicine, a therapy, something that you search for — and the algorithm doesn't know whether it's a Merrell sneaker or said medicine or said treatment or whatever it is that you've searched for. I think what we're forgetting is that the digital advertising ecosystem has gone from the speed of a change-of-address form at the US Postal Service to milliseconds, like real-time trading on Wall Street.
Debbie Reynolds 08:30
Yeah, you know, this is a huge issue. I was actually very happy to see some of these regulations, mostly because I felt like companies just store too much data. When you have so much data, keeping someone's data gets you in trouble, because you think, oh, I can do other things with this. So I think having that human element in the mix is part of it. And like you said, around trading data — someone's searching for medicine and different things online — it's like, who are you selling that data to, right? To an insurance company? How's it going to be used in the future?
Richy Glassberg 09:13
Debbie, I think that most companies want to do the right thing. Right? The ad industry and the digital industry are mostly accustomed to standards, not laws. I think part of the problem is the regulators have a hard time understanding the complexity of advertising technology. That nexus is the idea behind why we built SafeGuard Privacy. That collision — the ad industry, companies that want to do the right thing, standards, not laws, and the regulators not understanding — is why I knew I needed a partner, and that's why, as you know, Wayne was the perfect partner for me. The platform we started to build was just that: how do I manage my privacy compliance? That was the first product we did. And we looked at the world, and we saw all these companies in the IAPP Factbook, and frankly, Debbie, none of them came from the digital world. None of them understood the complexity of the digital ecosystem and how the data flows across this world, where to people it just looks like: I logged into a website, I saw an ad for a product or a movie or a food or an entertainment thing or a destination I wanted to go to. I think that was the genesis of our idea to start SafeGuard Privacy.
Debbie Reynolds 10:43
So I guess there's not necessarily friction, but there are two things happening. One is standards. Standards need to be developed and followed to get people on the same page in terms of how they're handling data, and we're seeing a lot of movement in the digital space, especially around the deprecation of the third-party cookie and things like that. But then also, as you say, there's this rise of regulation. So there are kind of two different fronts that things are working on. Give me your thoughts about what's happening there.
Richy Glassberg 11:20
So it's a great question. To me, I don't think advertising is inherently bad, or that they're bad actors. Yes, there are bad actors in everything, and I think that's why these laws exist. Again, I'm going to come back to saying most companies want to do the right thing. I think there are things that you need a standard for. I'll give you an example: when do I drop the pixel or the cookie? And the consent string — did I get it from the consumer? Am I communicating it to the other people upstream and downstream of what I'm doing? I think the IAB Tech Lab is working on some important standards around that, so we can have some consistency. But the law needs to say things like, hey, you've got to have a policy about this, and you've got to define PII like this. Because if there isn't that stringent rulemaking, people are just going to skirt these rules, Debbie. I think we really have to think about how much data you are keeping on your — let's not call it a consumer — your customer, a person. How much data do you keep? Why are you keeping that data? What are you going to use that data for? So I do think you're right; there is some conflict between laws and standards. I do think we need both, because they serve two different parts of what I think is the effort to make digital advertising more palatable and safer for consumers, for your customers, for people. I'm very wary, Debbie, about people and regulators calling it surveillance advertising and using really negative words. But I think it's our own fault. We didn't put limits on what we did with data. And I mean "we" as an industry.
Debbie Reynolds 13:10
Yeah, you know, every year I come out with a prediction about what I think will be the hot issue. My issue this year, which I was right about, is third-party data sharing without consent, which is a third-party vendor issue, or the third-party data issue. A first-party company collects data, and they have to somehow share it with a third party, or a third party gets it. So talk to me a little bit about that problem and how you all help solve it.
Richy Glassberg 13:40
Well, actually — you know, I just had this idea when you said that, Debbie: let's call third-party data the third rail of advertising. Because we never want to step on the third rail, right? You're bringing up an incredibly important point. The complexity of privacy is insane right now; I would argue it's not manually possible to manage it. A lot of the tech that's out there, Debbie, was not built for the evolution of privacy — cookies, data sharing, identity. None of it was built over the last 25 years for privacy. It wasn't, Debbie; as you know, in these complex systems, it's hard to retrofit things into new norms, if you will. I'll give you an example. You're talking about third-party data, right? Let's talk about clean rooms. Right, Debbie? Everybody's running to them; that was the hot word this year. I'm going to put my data in a clean room. Did that just make you laugh when you heard that, Debbie?
Debbie Reynolds 14:35
Yes, yes, it did. It did. Absolutely. It's like a pendulum, isn't it? We go back and forth. Yeah.
Richy Glassberg 14:41
Well, here's why that made me laugh. I think people looked at clean rooms like washing machines: I'm going to put my data in a clean room, throw a Tide pod in, and it's good. That is so wrong, Debbie. I don't want to name names, but say publisher X doesn't get consent — and I signed up for that publisher; everybody knows my email at RichyGlassberg.com, whatever, it just destroyed my inbox. If they didn't get consent to share that data and they put it into the clean room, there is no clean-room Tide pod that's going to wash that away, that absolves them of that sin of not getting my consent to share or sell that data. And that's where I look at people: they want an easy solution; they want an easy button. Remember the Staples easy button, Debbie? That's what clients want. And I feel terrible, but there are no easy solutions in privacy; it's a lot more nuanced and a lot more complex. But we built a platform to help really power these solutions. These are hard technology and hard legal compliance solutions to solve. You can't do it in a spreadsheet anymore, Debbie; you can't do it over email. You can't even hire enough people to handle it for you. And you bring up a really scary point for companies under the new CPRA — I've got to say it slowly, Debbie, because I get confused with CCPA, CPRA, Virginia; there are so many acronyms out there. Under this new law, they've really closed that loophole. I'm not talking legal, because I'm not a lawyer, but they've closed the legal loophole between sharing data and a sale, and they've come out and said you now have to have a contract with all your counterparties. So Debbie, let me give you the example: you now have to have a contract or sub-agreement — a third-party agreement and a service-provider agreement. Before, you just had to ask, hey, is this vendor a service provider? This is how crazy it is.
Debbie, if we were going to go build a house together, we would call a contractor, you'd have a contract with that general contractor, and we'd be covered, right? He, she, they would have a plumber, a roofer, whatever, everything. Under this new law, the analogy I'll give you is: you now have to have a contract with the general contractor, with the roofer that general contractor hires, with the supplier of shingles to that roofer, with the nail supplier, and so on, and so on. And what people don't understand is, if I'm marketer Y, and I'm selling a new kind of shampoo, and I'm putting it in front of Richy on website X, there could be 30 different companies between those two. There are going to be five different, seven different, ten different data infusers — third-party data companies. Did every single one of those companies get the consent to use the data they're using to target that ad, to put that ad for a shampoo in front of me on a website? Right?
Debbie Reynolds 17:44
I know, it's absolutely bananas, right? But you also bring up another point, and that is the disconnect between what happens in digital spaces versus how the laws are written. As you say, it's very difficult for someone to actually achieve all this, but talk to me about that dichotomy.
Richy Glassberg 18:10
So we think one of the very complex, nuanced problems is — especially when you look at some of the big companies around the world — that a lot of them have legacy data systems, Debbie, and they've been using data to contact you in many different channels: you get an email from your favorite airline, you get a letter from your favorite airline, you see an outdoor ad, an online ad, a television ad. All these systems have a lot of complexity. I'm concerned that the regulator, the regulatory machine, doesn't understand the differences and how the data is used. My concern is that if we really think data is the new oil of the economy, we're still acting like this data is a wildcat well in East Texas or West Texas somewhere, just gushing out all over the ground. And we're not sophisticated enough to understand trust-busting. Remember, in the early days of oil, there were one or two companies that controlled it all, and then the government had to break them up. Are we going there? I don't know. There are some people making analogies about that — that there are three or four companies controlling all the data — and how does that get broken up? How is that a fair world? I want to come back to something in this law, and you talked about it. The regulators are being really clear — the California AG — that you now have a legal obligation to assess and do due diligence on all your vendors. If you're a billion-dollar advertiser and marketer, and there are a lot of them in the United States, Debbie, you could have 4,100 vendors. How are you going to do that? How are you going to handle that off of a spreadsheet, and how many people are you going to throw at it? That's one of the things that we, being of the industry, looked at and solved for the largest bank in America.
And we're rolling it out to many advertising agencies, many marketers, because we were so concerned about that third-party risk management, and it didn't exist in the privacy space. And, you know, to your question: the regulators don't understand the complexity of this, and you've got marketers that just want it to be easy, really looking at old-style solutions — a GRC platform that handled whatever other kind of law you had to comply with for some regulator you were under. None of that understands the privacy rules and regulations or the complexity of how the privacy ecosystem works. That's why we built our vendor compliance hub, to help automate that and help manage that. We have a saying, Debbie, that when Wayne and I started this company in 2018, it was a lot of privacy 1.0 companies. I'm pretty excited — and I'd love your opinion on this; I don't know what you're seeing — in the last year, I've seen what I'm calling privacy 3.0: more and more new companies in the tool space that weren't around in 2018 that really understand the complexity. And we don't do tools; we're legal compliance, Debbie. So when we look at these tool companies, I'm excited about the industry. You interviewed Tom Chavez a couple of weeks ago — he was at Krux; we grew up in the digital industry together — and what they're doing at Ketch with data orchestration on the back end is amazing. You know, RadarFirst, what they're doing with breach; there are a lot of really forward-thinking companies out there, Habu in the clean-room space. I'm really excited about companies seeing the complexity and seeing the problem between what a regulator does, what industry standards are, and how the ecosystem actually works. Does that make sense, Debbie?
Debbie Reynolds 22:04
Oh, absolutely. I'm very excited too. I'm a data person, so for me, I want to know what's happening with the data. I'm very excited to see these sophisticated tools, and I'm happy to know these folks are really passionate about solving those problems. You know, the challenge I've seen over the years — it's changed a lot, right? When GDPR came out, people thought, oh my God, now I have compliance — which, obviously, compliance was always there, right? It never went away; there are just more things to be compliant about. But I think we're seeing more of a holistic approach, where people are like, okay, you have data problems and you have compliance problems; you don't have one or the other. So you can't put your blinders on, like, okay, I only do compliance, that's it, or I only do data stuff, that's it. It has to be more of a handshake happening there, and it has to be done in a way where you can break those silos down.
Richy Glassberg 23:04
So let's talk about that handshake. You know what I'm really concerned about? Where that handshake starts. I hate these pop-up consent boxes, Debbie. Isn't that the worst experience on the web you've ever seen?
Debbie Reynolds 23:17
I tell my European friends it's worse than the US by 1,000%. Yeah.
Richy Glassberg 23:23
I mean, let's be honest, Debbie. Why do I have to read an 18-page privacy policy when the same company sends me a one-page, IKEA-like diagram on how to put together a product? Right? It's crazy, Debbie. They give you such minimal tools to put something together. If you ever unbox a new phone — I don't care who it's from, Apple, Samsung, Google — it's simple. It's in a beautiful box, there's a couple of pictures, you push a button, it turns on, right? But if I want to read a privacy policy, I've got to click on a banner; if I want to look at it, I've got to click on something else; I've got to read 80 pages of text. Debbie, it just makes no sense. We're forgetting about the human beings on the other side of the screen. It drives me crazy. There are a couple of great companies in this space for consent — Jesse Redniss, out of Time Warner, is doing great stuff with consent; it's a really cool company. There's lots of cool stuff out there. But you know what I would love? I would love companies to simplify that and say — and maybe not on the first page, Debbie — hey, Richy, I see you've been here for a while. You bought a product from me. We're going to collect this data because you bought a product. We're going to keep this data because you're our customer, and that way we can give you a refund, we can charge you, you can do your service fee, whatever it is. And then I want them to say, hey, Richy, we want to share your data, are you okay with that? You know what? No, I don't want to share my data. Okay, Richy, we won't share the data. Or, hey, we want to sell the data, and this is why: because if we sell it, we can make a better product for you, or we can give you a discount next time. Okay — if I trust you, yeah; if I don't, no. And then let's talk about data exchanges. I give all my data in the world to Google Maps, because Google Maps gives me back what I want.
It tells me where to go, where the Dunkin' Donuts is, where the gas station is, where the police are, and the best route; it's a great exchange of value. But if a website — and I'm just going there to read a news article, a sports article, a lifestyle article — wants to take my data: what are you giving me? Why should you sell my data? Why should I agree to share my data? They just make it too hard to understand. Let me understand the value exchange. I think it starts there, and then everything else flows right once you get that consent. The data — is it a real ID? Is it a masked ID? I kind of like some of the stuff that Apple's doing with masked IDs. I think we've got to get the pendulum back so that I, as a human being, as a person, can control my data and how my data is used.
Debbie Reynolds 26:13
Yeah, I think the change is — businesses have always been good at figuring out what benefits them. But in this new world, they have to figure out: how does it benefit me, the consumer, the human? Right? Because that's just the way things are going. Companies that are still in this business-only mindset, that don't know how to make a person, a human, a stakeholder in how their data is handled — I think they're going to fail. What are your thoughts?
Richy Glassberg 26:50
I couldn't agree with you more. And I think it's just happened so fast. Companies have to think about the person on the other side of the screen, and it's got to make sense for an 80-year-old, a 50-year-old, a 40-year-old, a 30-year-old, and an 8-year-old. We're not thinking that way. Look, that's the DNA of what we built. We actually looked at this problem from the other side and asked, how do companies do the right thing? We realized we do two things: we help a company manage its own compliance, and we help them manage their vendor compliance. Now, I just simplified that; it's highly complex. It's an auditable platform; it meets all the requirements of a compliance platform. It's permission-based, it has keystroke logging, all that stuff that lawyers love. But we simplified it. We tried to simplify the complexity. We wrote it in plain English, we put the law on the page, and we give people guidance — we call it our simplified commentary. We literally say, hey, are you doing this? Great, put your documents here. Are you doing that? Great, tell us why you did that. And we did it in a way that gives people what they like — you know, the US is all score-based, the world has gone score-based — so we give them a really clear score: you're this compliant, and this is your risk. And when we showed that to practitioners, right, Debbie, the people you want to talk to every day, they said, oh my God. We had one Chief Privacy Officer of 20 years say, where have you been my whole life? And it was really Wayne saying we have to simplify this, the same way I complain about simplifying consent. I think we have to make it so that everybody in the company can be part of the privacy team. If it's just the privacy lawyer in a corner, we're not helping the privacy lawyer, because they're fighting against the tide.
We built the product so the privacy lawyer can work with the product people and the IT people. It's a collaboration tool, it's a workflow tool, and people love it because we're trying to break down the barriers. Our belief is that if more people in a company can be part of the privacy journey, that company will do better things for their customers and the people on the other side of that screen. We wrote it in plain English. We made it to help companies — look, in this economic time, Debbie, you've got to reduce your costs, you've got to reduce your time to compliance, you've got to save headcount. We've done all those things. But it's hard, because these laws continue to evolve. More laws get passed, and our country has not figured out how to do a Federal law. We've got, what, five laws, plus COPPA; if you add BIPA, it's six laws. We have 22 more states in committee that may have laws over the next 18 months. This is hard. Companies are trying to reduce their risks, reduce their external costs, and increase their productivity — all things we want to do in a company. But it's so complex, Debbie. And it's not made easier by regulators not really understanding how the world works. I don't mean to pick on regulators today, because I actually think these laws are a great start. I just wish there was a better conversation between actual practitioners and the regulators. I think it's getting better, but I'd like it to get stronger.
Debbie Reynolds 30:28
I agree with that, because regulators are consumers, too, right? A lot of times, we have to make it more personal for them; it can't be all corporate and businesslike. And you made a really good point. If, let's say, a privacy officer is sitting in a corner somewhere, they're looking at the legal issues, but they don't know what's happening operationally within the organization, right? So if people at all levels of the organization can communicate and share that information, then you'll be able to see that risk before it comes up.
Richy Glassberg 31:08
So, Debbie, we talk to everybody out there. And the thing we hear most is that the organization doesn't understand the complexity of privacy. When we talk to lawyers, they say the organization doesn't understand the whole complexity of it; they think we're over here on the side. I'm not saying every organization, Debbie — there's 20% that does, right — but the mass, the 80% of orgs out there, want a tool so that everybody can feel comfortable. They don't want to feel like, oh, I'm the wizard over here, I'm the lawyer, and only I can read this language. That divides a company, right? And I think for companies to embrace privacy, we have to embrace all the different parts of a company that have to touch privacy. That's such a complex, nuanced problem, and we hear it from everybody we talk to. We hear product people saying, oh my God, why do we have to do this? Why can't there be Federal legislation? Why is it like this? Do they not understand? And we hear lawyers saying, gosh, I just need my product team, my business team, to understand, right? A marketer says, I want to buy this piece of data because I think I can get more customers if I do. And then the lawyer is like, oh, no, no, you can't do that — or did you check them out? Did you go through this process? But the marketer just wants to buy that piece of data, because they think they can get an incremental 2% of sales. And guess what? In this market, Debbie, the marketer just wants to go do that. It's really hard.
Debbie Reynolds 32:37
It is quite difficult. And it's only getting more complex, right? Just like you said, different laws. And we've been talking about regulation, but the technology is getting bananas too, right? We're talking about AI, we're talking about Web 3.0, decentralization. I feel like people are already struggling with what they have right now, and now we're going to add, like, a quantum leap of more complexity in terms of how data is being collected and shared. What are your thoughts there?
Richy Glassberg 33:14
That's my biggest worry. I've been doing this since 1995. I try to keep up — my family makes fun of me that I call it the inner tubes. But, Debbie, I don't think all of us have the capacity to understand the implications of what's happening with data. I'm going to go back to my first example: in 1997, we did the first ad server, so we didn't have to hard-code an ad on the page. In '98, we took the first txt file, which was a cookie, to start understanding logins and make the functionality better. At the end of '98, we took the first tracking pixel from General Motors; we had no idea what it would do and how it would affect the entire digital ecosystem. And you just named three more things — all the things you named, Debbie. What is this going to look like in 10 or 15 years? I've been in this from the beginning, deep in the engine room, trying to avoid the third rail, as you brought up before, Debbie; this is hard. I feel really worried about the CMO, the CFO, and the CEO of a company that's got 100 million, 500 million, a billion in advertising. The triggers on these laws are so bad. Look, on January 1, everything changes, right? The new California law goes into effect; the 30-day cure period goes away. We're frantic right now, talking to every marketer who's calling us, because they don't know what to do. I believe this is the year of enforcement; you've got more laws coming in on July 1, and you're going to have more laws passed. I think they're starting to ramp up enforcement even more in Europe — I don't want to name names, but you saw a couple of big things happen this year. And I think you're going to have the FTC and the California Privacy Protection Agency really going at it out there.
I mean, they didn't approve 40 lawyers for the CPPA in the ballot initiative just to sit there and not do anything, Debbie, right? Now, if I'm a marketer; and look, let's be honest, while these laws are for everybody, they really speak to the companies that have data on a consumer. That's a marketer, whether it's the local car dealer or the national car dealer. Debbie, this is a large, looming, growing problem. People think there's a light at the end of the tunnel; I would argue that on January 1, that's not a light. That's a freight train. I think marketers are in trouble. I think we're hearing that there are hundreds of letters out to marketers this fall under that 30-day cure.
Debbie Reynolds 36:25
Now let's talk a little bit about assessments. I looked at your tool, and it is great. I love Wayne, and I know he's super smart; he's always a go-to person for me. So I knew it was going to be solid.
Richy Glassberg 36:45
Secret weapon, Wayne's our secret weapon.
Debbie Reynolds 36:50
Yeah. The thing that is happening, in addition to enforcement, is that a lot of these laws say you have to do assessments. This has been the case in Europe for a while, but we're starting to see it in US laws, and a lot of these laws previously didn't say that. So now there's an actual paper trail that regulators are expecting companies to have, and companies also have to be able to show how they're maturing over time. And you can't do that unless you're actually on top of your game, assessing and collecting information. To me, that's a completely different world from what people have been accustomed to, and I think it will kick off in January. What do you think?
Richy Glassberg 37:39
So Wayne saw this problem. Three years ago, we launched our first product, Manage You. And as I said, we're a privacy company; we try not to name all of our great, wonderful clients that trust us. But one of the largest banks in the United States approached us in January 2020 and said, we really like what you did with Manage You, but we're worried about managing our vendors, and we see it in GDPR. Because, Debbie, as you just said, it's been in GDPR for a while. So we spent 18 months with them looking at vendor compliance and how it's done in the InfoSec world. We talked to a lot of people, and they all told us they hated the spreadsheet; they hated that everybody had their own spreadsheet that they had to fill out over and over again. So we came back to Wayne's original DNA: let's build an assessment to the entirety of the law that is independent and agnostic of any tool. So Debbie, every one of our clients uses the tool companies that you can name out there, all the big names, because we don't compete with them. We're an independent assessment platform, like a legal assessment platform. We looked at what was wrong in InfoSec and in third-party risk management, and what we realized was that with a standardized assessment, if everybody did the same assessment, you could do benchmarks and you could share with each other. So the same platform is being used by one of the leading banks in America, one of the leading pharmacies, two of the leading agencies, the biggest holding companies, and two of the biggest data platforms and credit bureaus. And everybody's realized that if we can all say, hey, we're using SafeGuard Privacy, you can do your California assessment and share it with me, and hey, that's good.
So we looked at this tough problem, and we realized we could automate it if we were truly independent, which we are, and if our assessments were comprehensive to the entirety of the law. So we didn't build a framework: our GDPR assessment is for GDPR, our California assessment is for California, the same for Virginia, and our COPPA assessments are used by leading people for safe harbors now. And we've really been validated in the idea that people want to simplify and standardize vendor compliance. We didn't know that the AG was going to announce, in that second round of the regs, that this was going to be part of the regs. We believed it was the right thing to do, Debbie, when we built it, and we're pretty excited about the fact that in the regs it actually came out as a legal obligation. We have a platform that is live today that many big companies are using, and we're doing our best to get it into as many people's hands as possible. Out of the box, it works; we call it the TurboTax of privacy compliance. It does what you need to do right now, and it's also customizable to you. If you're in banking, you can customize it to banking; I don't mean changing the assessments, but you can ask some additional banking questions, or some pharma questions, or some HIPAA questions. We basically standardized privacy compliance and the ability to say, hey, Debbie "Data Diva", I'm Richy "Privacy Buddha", I'm a good actor, you're a good actor; now we can do business together. We're trying to solve that problem by automating all of this vendor compliance and assessment. And frankly, the other thing we did, four years ago, was build an entire audit back end into this. We had no idea that they were going to require an audit, but our belief was that a real compliance platform allows a third-party audit.
So today, you can use any number of auditors on our platform. We announced at the ANA Masters of Marketing that BBB National Programs is going to offer the first-ever certification of California privacy compliance off of our assessments. That's a huge step forward, Debbie; that means any publisher, any company can use our assessment, and BBB National Programs can give a certification that what they put in was true and valid and meets the law, just like they do today for their COPPA safe harbors. That's a huge leap forward for the industry, to be able to have that independent third-party audit of somebody's assessment.
Debbie Reynolds 42:05
That's amazing. I love that you guys are trying to solve tough problems. I think 2023 is going to be a very challenging year, because a lot of people aren't waking up to what they actually need to do. And I think it's very different, because businesses haven't been accustomed to having individuals be a stakeholder, having a seat at the table, in terms of what they were doing with their data.
Richy Glassberg 42:34
So, Debbie, you talked about the complexity for marketers, and about 2023 being a tough year. Here's one of the things that worries me about 2023: I think the economy's going to be pretty tough. We came out of a pandemic, there is a war going on in the world that's affecting the supply chain, and there are a lot of issues facing our economy and other economies around the world. And why I'm worried is that there's a tremendous amount of tech layoffs, and as the complexity gets worse, with fewer people in the tech industry to deal with it, I'm very concerned about 2023. I'll give you an absolute example. If I'm a marketer, and I've stopped my spending on Twitter, and I still have a Twitter pixel on my ads, I'd be very worried about that today. I think some very large organizations and some journalists are writing about that. There's no trust and safety team; the engineering team has gone. How do you know what's happening with that data? At the end of the day, a regulator is going to say, why was that data from marketer X or advertiser Y abused? And that worries me. So you see, the problem is that the layoffs hit the industry because of the economy, and what's happening at Twitter is a little bit different; it's a change of ownership. That puts the marketer at risk. That worries me a lot in 2023.
Debbie Reynolds 44:08
Yeah, well, since you mentioned that, I would love to touch a bit on brand safety. This is a big topic; this is going to be long.
Richy Glassberg 44:17
This is going to be a long one Debbie.
Debbie Reynolds 44:20
So brand safety is a phrase that's been used in marketing forever, but now I feel like it's burst out; it's normal parlance now, right? Everyone's talking about brand safety. So tell me a little bit about brand safety as it relates to privacy, because I feel like the two are super connected.
Richy Glassberg 44:41
That's a fascinating question. I'm going to get in a lot of trouble here, Debbie, but I think people have given a lot of lip service to brand safety over the last 27 or 28 years. It's because of the complexity. Marketers talk about it, but I don't think tech companies do enough about it, and I don't think we see the damage it's doing: seeing a brand next to something on social media that's hate speech, negative speech, something against a religion, a culture, a country, a person, an ethnicity, anybody. I think that happens all the time. I think marketers are trying, but I don't think tech platforms are doing enough to protect against it. I believe that's a failure of the tech industry to be stronger. They're hiding behind certain regulations and certain acts, saying, I'm not a publisher; they're hiding behind stuff. I know I oversimplify that, Debbie, and it's a broader conversation that's probably an hour with a lot of people smarter than I am. But I think the tech industry is not doing nearly enough to get hate speech and negative things that are really detrimental to society off their platforms. It's not okay to go after ethnic groups; it's not okay to go after religious groups; it's not okay to denigrate people. The rise in teen suicide, the rise in depression: I think a lot of that can be tied to social media, and I think there's a lot of very good peer-reviewed research out there that shows that, which people can look at and find links to. And it worries me; I think it's a much broader issue than brand safety. There are some great brands out there that stopped advertising on the biggest social platform, and they did it because they felt there was no good corporate governance, right? I don't care what reason you do it for.
But I think if I'm a brand, I really have to worry about where my brand is and where my brand is seen, and I think that has to be stronger. So now let me answer your question; I know we went down a side road there, Debbie. When we talk about privacy, the question becomes very important as well, because for privacy, now: how was that data used? How was that hate speech data used? How was that commingled? If I'm a brand, I care very much about the tech companies that touch my advertising dollar. And let's not kid ourselves, Debbie, the entire ecosystem is driven by the advertiser's dollar. If you don't advertise, there is no tech industry. If you don't advertise, there is no Google, there is no Facebook, there are no Amazon ads, there's no Amazon Prime and Amazon store. None of the digital ad business works without advertising. The only people living on subscriptions are, you know, the Journal, the Times; it's a very small handful of people living on subscriptions. So it's all driven by the marketer's dollar. And I actually think the marketers need to be stronger and more forceful about how they call the tech industry to account for brand safety, and maybe they can do it because of privacy, because these privacy laws have actual teeth. There's no law about brand safety; it's reputational risk. Maybe that'll change on January 1, when these laws start to hit and these fines start to hit people. Maybe that's the breakthrough we need, and they'll use privacy to drive brand safety, because I think the two are driven together. If a digital company is not worrying about the data and how they're handling it for a customer, a human, a person, then they don't care about where that ad is shown, and they're not really going to believe in brand safety. So maybe there's a positive benefit of privacy being a law instead of a want-to-have, because brand safety is a want-to-have, a reputational risk.
Maybe the laws around privacy will help drive brand safety forward.
Debbie Reynolds 49:22
So tell me: if it were the world according to you, Richy, and we did everything you said, what would be your wish for privacy anywhere in the world, whether it's human behavior, regulation, data, or marketing? What are your thoughts?
Richy Glassberg 49:39
I wish we could start over. I wish we could simplify this, and I wish we could be much more transparent. We've never told the consumer what we're doing with their data. They had no idea that if they used an app to check in at a restaurant, that app was selling that data to all these people out there, and people were marketing to them based on it. I wish we could have full transparency. Tell the consumer: hey, Richy, I'm using your data; I'm using it to give you a better product. Are you okay with that? This is how I use it. I'm going to share your data. Are you okay with that? No? Okay, great, then I can still give you this product without sharing it. Hey, by the way, I may want to sell it. Are you okay with that? Nope? Great, I won't sell it, but I'll still give you a great product that lets you check in at your favorite restaurant and share with your friends. I think we need that level of transparency, and I think the entire industry has to shift; we have got to stop looking at the consumer, the human, as the product and look at them as our customer. I'll give you an example, and I hate to do this, Debbie: I think Google treats me really well. When I use their maps and their search, I understand what they're doing. They can be better; nobody's perfect. They can be better about their privacy policies and how they present them to me. But they are trying to give me value for my data. I'm frankly off most social media platforms because they don't give me value for it; all they're doing is mining my data, and I think consumers are getting leery of that. So to answer your question: I think we have to change the mindset of how we deal with the human on the other side of the screen. We have to be simple; we have to be direct.
And we have to be transparent about what we're doing with their data and how we're using that data. And we have to clearly let them know what they can and can't do.
Debbie Reynolds 51:48
I love it. Clear and transparent. That's the way to go. Yeah. Excellent. Well, thank you so much for being on the show. I'm so excited to be able to collaborate with you. And you know, I love Wayne. He's amazing. So I like what you guys are doing, and I'm happy to support you any way I can.
Richy Glassberg 52:06
So how did you get the name "The Data Diva"?
Debbie Reynolds 52:11
Actually, I went to a networking event and met a reporter for The Wall Street Journal. When I gave her my elevator pitch, she said, oh, you're like "The Data Diva", and we just laughed about it. I took it because I never had a nickname growing up; I was one of the kids who never got one. So someone gave me a nickname, and I took it.
Richy Glassberg 52:34
Yeah, that's great. That's great. I love that.
Debbie Reynolds 52:38
Well, thank you "Privacy Buddha". I'm so happy to have you on the show. Yeah, this is great.
Richy Glassberg 52:44
Debbie, thank you for having me on your podcast. I've listened to a bunch of the episodes, and I love the depth and breadth of people that you have on this privacy podcast. I think it's great to hear from legislators and regulators and practitioners. I love your work, and we're really happy to support "The Data Diva" any way we can.
Debbie Reynolds 53:06
Thank you so much. That's really sweet, Richy; I very much appreciate it. Well, thank you for being on the show.
Richy Glassberg 53:14
Thank you. It was a blast. I really appreciate it.