E226 - Lisa LeVasseur, Founder of Internet Safety Labs
[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
[00:13] Hello, my name is Debbie Reynolds. They call me The Data Diva. This is the Data Diva Talks Privacy podcast, where we discuss data privacy issues with industry leaders around the world, with information that businesses need to know.
[00:25] Now I have a very special guest on the show, Lisa LeVasseur. She is the founder of Internet Safety Labs. Welcome.
[00:34] Lisa LeVasseur: Thank you, Debbie. I'm so happy to be here.
[00:37] Debbie Reynolds: Well, it's exciting that we finally got a chance to meet and talk. You and I had a great meeting and it's always fun to meet someone that you followed so many years on LinkedIn and looked at your work and wow, I just feel like I'm in the presence of a celebrity.
[00:54] Lisa LeVasseur: I feel the same way.
[00:55] Debbie Reynolds: Well, first of all, you do terribly important work and you have such deep technical knowledge. That's why we enjoy geeking out together. But the mission that you're on is vitally important and I think that we're going to intersect probably in the future with what we're working on.
[01:13] But I would love for you to introduce yourself and tell us your journey in tech and how you became the founder of Internet Safety Labs.
[01:23] Lisa LeVasseur: I'll give you the condensed version because I'm old and the journey's long, so I don't want to put everyone to sleep. But you know, I started off in software development in the early part of my career, back when we were writing Z8000 assembly language code for cellular infrastructure systems.
[01:44] So I had a pretty long rambly path that moved from software development, software engineering as we called it back then, that was the formal name for it. And then I got really interested in industry standards in my career at Motorola.
[02:00] And of course industry standards are a big deal in telecom. And then I got involved in product management because I thought, who are these people making these decisions about technology?
[02:09] They need to be better. So I kind of had a long rambly career in there. And then ultimately I realized that I wanted to do something that was what we now know as safety standards for technology, which wasn't even a thing.
[02:28] Those weren't even words; I hadn't even put them together in my head. When I started this organization, it was the Me2B Alliance, that was the original name of Internet Safety Labs.
[02:39] And we were an industry standards organization to bring consumers, the "Me's," together with the "B's," the vendors, and try to basically come up with mutually agreeable, what we now call, safety standards.
[02:55] Yeah, and so fast forward a little bit, we're now Internet Safety Labs, and we are building safety standards and we're doing independent safety testing on technology, focusing right now on really developing robust testing methods and standards for websites and mobile apps.
[03:15] Debbie Reynolds: So the work that you do fascinates me. I love the way that you describe this. And I think one of the reasons why we ended up having a call is because I had posted something about privacy and safety.
[03:28] And so that wording, I think, caught your attention, and I think it's important.
[03:35] So tell me why the word safety is important when we're talking about digital systems.
[03:41] Lisa LeVasseur: Well, there's a reason we're not Internet Privacy Labs but Internet Safety Labs: we never felt like the word privacy was enough.
[03:52] We didn't feel like it covered the whole array of harms that people can be exposed to through software, like mobile apps and websites, or anything that is really driven by software.
[04:05] And so we've always viewed privacy as a subset of safety. And in fact, I've written a few blog posts about it.
[04:15] I use the parable of the five blind men touching an elephant, and the elephant is safety, but the parts are like the trunk is privacy and the leg is deceptive patterns.
[04:30] And, you know, there's all these other parts to safety, but privacy has been a subset of safety in our view since day one. And if you look on our website, we don't use the word privacy all that much. What we measure right now, what we started off measuring, was really all about data privacy, but we always felt like the word was too narrow.
[04:51] Debbie Reynolds: I agree with that in some ways because it has many different facets. And I think the reason why I'm pivoting and trying to use the word safety a lot more is because I feel like people understand the word safety better than they do privacy.
[05:08] Like, let's say if a child car seat is recalled, they understand that that's a safety thing. So I'm trying to use the vernacular of safety around privacy especially, because we know people are misusing or abusing other people's data, in apps or cars, for, you know, stalking and different things like that.
[05:28] I think being able to have that conversation helps make it more real to people and less fuzzy-wuzzy, less, you know, theoretical. Because I feel like most people would say
[05:41] they like safety; safety is a good thing.
[05:44] Whereas some people feel like privacy is too abstract, too philosophical. What do you think?
[05:53] Lisa LeVasseur: Well, I think it's really interesting. You know, a couple of years ago, maybe it was even just last year, we did a survey with people to measure their sentiments around safety when it comes to physical products versus intangible technical products like websites and other things.
[06:14] And we deliberately structured the research to ask them first about safety concerns with all of the physical product categories in their lives, because we wanted to get them thinking about safety in physical terms first.
[06:27] And the reason we did this was because when we were starting to prepare the survey, we did some initial polling and we were seeing people conflate cyber security with safety.
[06:40] And so they went straight to cyber security, data breaches, identity theft, all of that. And it's like, yeah, that's all important, but that's not really what we're talking about when we talk about safety.
[06:53] We're talking about the innate risks that you're exposed to when you use the technology as it was designed to be used.
[07:05] And that brings us back to those privacy risks, because we're finding technology is so leaky. It's sharing stuff with so many parties, and it's going into this, you know, commercial surveillance ecosystem, directly into data brokers.
[07:25] That's where the real safety risk comes with the privacy risk. But going back to that research that we did, this is my sort of glib way of characterizing the results of the research.
[07:38] People felt like they were going to get killed by their cleaning products.
[07:42] But when it came to technology, it was only privacy. There was a clear, like, devaluing, I would say, or a disconnect. And this was something we pointed out in the report, which is on our website.
[07:57] But they can't connect the dots of how the loss of privacy translates into potential physical, financial, reputational risks.
[08:12] We're not doing a good job of making that threat more immediate. And here we go back to looking at cigarettes and the dangers of cigarettes, because it took decades for people to connect the dots between the act of smoking and the buildup of health risks over time, and to correlate the factors, you know, to build a case that this is harmful.
[08:42] So we've kind of been here before and we're, I guess, like in that early stage of trying to build those causation links and paint a really clear picture to people.
[08:56] Right now it's mostly anecdotal, right? It's the priest from the Grindr app. It's, you know, these anecdotal cases that we have.
[09:06] But really, I mean, we don't know; the weaponization of data that is available from data brokers could potentially be deployed way more systematically. That's a, you know, glass-half-empty kind of point of view.
[09:21] But I think there's a real risk of that.
[09:24] Debbie Reynolds: Yeah, you showed me a presentation and a little bit of the details you go into as you all evaluate software or apps, and it's incredibly detailed.
[09:41] But what is it that people don't know about apps and software that they should know?
[09:49] Lisa LeVasseur: Well, I think what they don't know is that their data is going to so many entities that they have no relationship with whatsoever.
[10:01] And we're trying to expose that. We're saying that accountability starts with transparency, and we don't have that right now. We don't have, you know, that information in the privacy policies or whatever product documentation is there.
[10:19] I was talking to someone earlier today. You know, software is deterministic, but it's not predictable.
[10:25] It's, you know, it's programmatic, but it's not predictable.
[10:29] And we've published a little bit about this. Even the Facebook engineers have said, we don't know where all the data is going. We don't know where all our users' data is going.
[10:38] And that's right, we don't, because the systems that we have are sort of unbounded by design.
[10:46] But we can capture that snapshot, and that's what we do. So we capture the snapshot: we used the app, we ran through a bunch of functionality in the app, we captured all the data, all of the network traffic, and then we have it all.
[11:03] But what we show in our safety label is we organize it by the companies that are receiving the data and then we have a lot more adjunct information about what the companies do and how risky they are and why.
[11:17] Right now it doesn't highlight who the data brokers are in there, but we know who they are. And we are going to highlight those with like a superscript: this is a data broker, this is an identity resolution platform.
[11:31] This is a malware domain. Because we find malware domains, we also find dangling domains in the network traffic.
[11:41] So we're really able to put a very detailed, very geeky lens on this to expose exactly what the risks are.
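To make the audit mechanics above concrete, here is a minimal, hypothetical sketch in Python of how captured network traffic from one scripted app session could be grouped by the companies receiving the data and flagged by category, the way the safety label organizes it. The domains, company names, categories, and function names are all invented for illustration and are not Internet Safety Labs' actual tooling or data.

```python
# Illustrative sketch only: group observed network traffic by recipient and
# annotate each company with hypothetical category flags. Everything below is made up.
from collections import defaultdict

# Hypothetical requests observed while walking through an app's functionality
observed_requests = [
    {"domain": "ads.example-broker.com", "data": ["advertising_id", "location"]},
    {"domain": "cdn.example-app.com", "data": ["device_model"]},
    {"domain": "match.example-idres.net", "data": ["email_hash", "advertising_id"]},
]

# Hypothetical lookup table mapping domains to owning companies and risk categories
domain_registry = {
    "ads.example-broker.com": {"company": "ExampleBroker Inc.", "categories": ["data broker"]},
    "cdn.example-app.com": {"company": "ExampleApp LLC", "categories": []},
    "match.example-idres.net": {"company": "ExampleIDRes Corp.", "categories": ["identity resolution"]},
}

def summarize_by_company(requests, registry):
    """Group captured requests by receiving company, collecting data types and flags."""
    summary = defaultdict(lambda: {"data_types": set(), "categories": set()})
    for req in requests:
        entry = registry.get(req["domain"], {"company": "Unknown", "categories": []})
        summary[entry["company"]]["data_types"].update(req["data"])
        summary[entry["company"]]["categories"].update(entry["categories"])
    return summary

for company, info in summarize_by_company(observed_requests, domain_registry).items():
    flags = ", ".join(sorted(info["categories"])) or "no flags"
    print(f"{company}: {sorted(info['data_types'])} ({flags})")
```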
[11:52] Debbie Reynolds: And this is tremendous. I want you to explain how your labeling works: what you're doing with labeling, how it works, what it means.
[12:03] Lisa LeVasseur: So this is a really ever-changing, evergreen thing, because we're kind of developing it as we go. And I'll say one thing: we do have an open standards panel.
[12:14] It's our software safety standards panel, and that panel is a group of us. It's open to everybody; anybody who wants to can come and join, and we define in there what we believe needs to be included in the label.
[12:31] We also talk about what the thresholds are. So we have adopted a kind of modified stoplight scoring system. We think any finer granularity than about four levels is too much for people to try to understand.
[12:49] It's hard even for us to distinguish one level from the next. We started off with three, so the apps have a three-level stoplight score: green is some risk, yellow is, I think, high risk,
[13:06] though we're in the middle of changing this, and then red is very high risk.
[13:12] We don't have anything that is no risk.
[13:15] The only thing that would be no risk is an app that was not connected to the Internet. Like that would probably be no risk. But that doesn't happen anymore.
[13:26] So the apps have three levels. We're going to move them to four levels.
[13:31] We're putting one more in. And then if you look at the risk scoring later in the label, where we break down the SDKs, the software development kits that are in the app, those get four levels of risk.
[13:48] And the observed domains and subdomains, those also are organized into four buckets of risk.
[13:55] And we do that with very complicated equations.
[14:01] Risk is popularly characterized as impact times likelihood.
[14:07] So what is the potential impact of this potential thing and what's the likelihood that it will happen?
[14:16] So what we look at for impact is usually it orients around what kind of data is involved. So the greater the sensitivity of the data that's available and involved, either at the app level or the SDK level or the domain level, whatever the thing is we're measuring, the greater that is, the greater the impact number is.
[14:42] And then likelihood, eventually, will be based on what we actually observe. But since we're starting, we're building this out of nothing, so we're building the likelihood out of whether or not the company or the app or the service monetizes personal information.
[15:06] And basically we're also looking at things like what the company's data breach history is, what their history of privacy-related fines and actions is, class action lawsuits, any litigation around privacy.
[15:26] We factor all of that in at a company level to figure out what's the risk. Like how good a steward is this company of personal information.
[15:40] Super complicated.
[15:41] We're working on publishing this for everybody, but it's super geeky.
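As a rough illustration of the "impact times likelihood" scoring Lisa describes, here is a toy Python sketch that turns data sensitivity, monetization of personal information, breach history, and privacy fines into one of four coarse risk buckets. The weights, thresholds, and function names are invented for illustration and are not Internet Safety Labs' published equations.

```python
# Illustrative sketch only: a toy version of "risk = impact x likelihood" scoring.
# All weights, inputs, and thresholds here are invented for illustration.

def impact_score(data_sensitivity: int) -> float:
    """Impact driven by sensitivity of the data involved (1 = low ... 5 = very high)."""
    return data_sensitivity / 5.0

def likelihood_score(monetizes_personal_info: bool, breach_count: int, privacy_fines: int) -> float:
    """Proxy for likelihood: monetization of personal data plus the company's track record."""
    score = 0.5 if monetizes_personal_info else 0.1
    score += min(breach_count, 5) * 0.05    # each known breach nudges likelihood up
    score += min(privacy_fines, 5) * 0.05   # so does each privacy-related fine or action
    return min(score, 1.0)

def risk_bucket(impact: float, likelihood: float) -> str:
    """Map impact x likelihood onto four coarse buckets (finer granularity is hard to read)."""
    risk = impact * likelihood
    if risk < 0.15:
        return "some risk"
    if risk < 0.35:
        return "medium risk"
    if risk < 0.6:
        return "high risk"
    return "very high risk"

# Example: a hypothetical SDK handling sensitive data for a company that monetizes it
print(risk_bucket(impact_score(4), likelihood_score(True, breach_count=2, privacy_fines=1)))
```

In this toy example, a sensitive-data SDK from a company that monetizes personal information and has a couple of known breaches lands in the "high risk" bucket.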
[15:46] Debbie Reynolds: So who uses these labels?
[15:49] Lisa LeVasseur: This is a good question. We don't track analytics, so we don't exactly know. Because most of these labels come out of the edtech research that we did in 2022, we hope that teachers and schools and districts, any educator, are using them to have a better understanding of potential risks.
[16:17] We looked at an entire category of apps called community engagement platforms. These are kind of white-labeled apps that are deployed mostly by schools and school districts,
[16:32] and it's the same app engine with different branding on it for each school district.
[16:41] And those were incredibly leaky. That's where we had one app that was sending data to 140-some-odd companies.
[16:51] And again, these can be used by children. But I think that's changing. We have to re-audit these apps, and I have reason to believe, and I don't know if it's because of our work or what, that these apps are getting better, which is a good thing.
[17:11] So I hope the vendors are looking at them. I think they are. I think the vendors are looking at the labels.
[17:17] Debbie Reynolds: Well, that's great. I'm glad that people are definitely looking at it. I just want your opinion on this. Where does regulation play in this, if at all?
[17:26] Lisa LeVasseur: Well, they're actually a key stakeholder for us. We are connected to regulators around the world. We think that regulators are poorly equipped to handle enforcement: maybe not enough resources, maybe not enough funding, maybe not enough clarity in the regulation to support what is needed to really enforce at scale.
[17:57] I mean, you figure that software is changing every single day. How do enforcers keep up with this? There are certain things in our safety label where we try to paint a picture for regulators in particular, around COPPA,
[18:14] so the FTC. In our safety labels, we do have in the header whether or not the app was found in elementary schools, because if it's found in elementary schools, then there are COPPA requirements.
[18:30] So then also in the label we talk about whether or not we saw behavioral ads in there. So we try to kind of paint a bullseye in a way on like hey, here's something to look at.
[18:43] It's used by elementary schools and there's behavioral ads. So we're trying to highlight enforcement things that are of interest to regulatory enforcement.
[18:55] Debbie Reynolds: I want your thoughts about this. I guess I have a complicated view of regulation.
[19:01] One of the things is that regulation isn't everything, first of all. And I think it's challenging to try to navigate all this stuff that we're trying to do, especially in the technology realm,
[19:14] knowing that regulation tends to be reactive and narrow in terms of how it's put together. But then also, I think one of the challenges that we have with regulation is this idea that, okay, I have data, I know this company, I give them the data, and then they're supposed to handle my data in a certain way.
[19:39] But what we also know is that there are companies that we don't know, they have our data and they kind of do whatever they want to with it. But I just want your thoughts about that, because a lot of the work that you're doing, I think, is kind of showing that underneath layer of stuff that's happening that people don't really know about.
[19:57] Because I think people feel like, ooh, this law is going to help us with all this data sharing, and that may help. But if you're peeling back an onion, we are missing some layers there, and those are probably the things that I'm more concerned about.
[20:13] So, what are your thoughts?
[20:14] Lisa LeVasseur: Yeah, I think you and I are in alignment on this. You know, when we started our work five years ago, we deliberately did not take regulation as the baseline of our standard.
[20:28] And we did that because we didn't think it was adequate, none of it, anywhere in the world. I mean, it's a little bit maybe arrogant, but we really just didn't think it was deep enough probably.
[20:40] So we could have very easily become a regulatory compliance organization. We chose not to do that. We really wanted to be a consumer advocacy organization, more like the Environmental Working Group, independently testing and validating what was actually happening.
[21:02] So I'm skeptical about regulation. I think if we really zoom back out of it and think about, like, what has GDPR really done? Well, I don't know that it's really had a major impact so far on making software safer for people.
[21:28] I think it has had some successes. I think it continues to have benefit.
[21:36] But we're still, I mean, we still have this whole commercial surveillance ecosystem that is untouchable. We have data broker laws that, I'm not even really sure what they're accomplishing.
[21:50] You know, I know you had Jeff on too, and Jeff is a friend and a smart person I watch, and Justin Sherman too. But the data broker laws,
[22:04] I don't think they're doing anything right now, and I'm not sure that they ever will.
[22:08] I think that there's too much demand for the ill-gotten gains of all of this data that goes into that, like, nefarious underworld of data.
[22:23] You know, there's too many entities, including law enforcement and government entities that utilize this data.
[22:33] I'm not sure we'll ever see a regulator or regulation that will really address that. One of the things we've been doing in our work, we have these principles for safe software, and we're revisiting them.
[22:49] And one of the ones that we have been kind of tossing around for at least two or three years is a principle that says you cannot share, trade for consideration, or sell personal information.
[23:09] Like, what would happen if we just disallowed this marketplace of personal information, and you could only use personal information for what the person gave you the right to use it for, which is going to be providing a service to that individual?
[23:29] But if we disallowed a market for personal information, what does that world look like? You know, we disallow a market for human organs, and we disallow markets for our votes; we're not allowed to sell votes.
[23:50] There are things that we decide as a society that we will not sell because it devalues and also disproportionately affects people in the community. So we agree we're not going to do that.
[24:08] This is one of the things I continue to say that I think we need to have serious discussions around.
[24:14] The selling of personal information, is that really healthy for people and for society?
[24:23] I don't think it is.
[24:25] Debbie Reynolds: I agree with that. I share your concerns. I want to talk a little bit about a topic you and I chatted about a bit.
[24:33] But the reason why I want to talk about this topic is because I've never heard anyone else besides me talk about it in this space, and that is identity resolution. So you had brought this up, and it triggered something; I was like, oh my God, I think I did a video on that.
[24:48] And I looked it up, and I did do a video on it. But first of all, people don't know what it is, and they don't know the impact of it. So can you please explain what identity resolution is and what the impact is?
[25:00] Lisa LeVasseur: I'm glad you asked that, because they're part of what I mean when I say commercial surveillance infrastructure; that's a hand-wavy term to cover a lot of these entities, including identity resolution. So identity resolution platforms, they come in a few different flavors.
[25:18] But the ones I'm going to highlight are entities like LiveRamp. They build a platform and infrastructure that allows them to ingest a lot of disparate systems of personal information and identifiers and correlate those identifiers into their own sort of universal identification schema.
[25:46] So the way to think about this is that they can resolve identification across platforms, meaning they can determine that user 123 on Facebook is user 567 on LinkedIn.
[26:01] And they're able to correlate that, tie it together, mash all of that data together with some statistical confidence or deterministic confidence. They have different ways of doing this, but the power of this is that they're able to aggregate ever more information across devices, across platforms, across worlds.
[26:25] They pick up data out of the DMV, potentially out of point of sale. So they're picking up things in the brick-and-mortar world and things in the digital world, and they're aggregating all of that.
[26:39] And this identification for commercial purposes is, I want to say, 100% invisible to consumers. This is not us creating a credential and logging in. This is a company, many companies, hundreds of companies, creating unique identifiers behind our backs, under the hood of this technology, tracking us around and constantly correlating.
[27:10] Like LiveRamp has, I think, over a thousand integration partners.
[27:16] And they can be big; an integration partner can be someone like Facebook.
[27:23] So we are really outnumbered in this situation. And the beauty of this architecture, I posted something about this: you know, decentralization is sometimes treated as a code word for safer architecture.
[27:43] It's really not.
[27:46] It's not safer just because it's decentralized, because these infrastructures work together: this identity resolution has a partner, by the way, and the partner is customer data platforms.
[27:56] And they are exactly what they sound like. And they work sort of hand in glove with identity resolution, because identity resolution, for the statistical kind of resolution, needs customer data; it's matching data points about a person to resolve customer 123 over here with customer 456 over there.
[28:22] So these two platforms are architected to be decentralized and to ingest information from disparate sources. That's the beauty of it, in a way. I'm kind of an architect, so in a way I have to admire the architecture of this.
[28:38] These guys have been around for over 10 years, maybe 12 or 15. So this isn't new; this has been happening for years.
[28:48] It's just been growing, growing, growing. And it's very quiet; there's not enough awareness of this.
[28:56] And there needs to be.
[28:58] We're so concerned with data brokers. Well, you need to be concerned with this identity resolution and these customer data platforms because they're collecting so much information.
[29:08] Nobody's governing them, not really. We don't know what's happening in there. We identified 93 identity resolution companies, and that list is available on our website, but that's probably way too small a number.
[29:23] There are a lot that don't even say they're identity resolution.
[29:27] And somewhere between 30 and 40% of those are registered data brokers, so there's a real strong correlation there. And if we think about this as a giant data supply system, this commercial surveillance infrastructure is hugely important to it.
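To make the identity resolution mechanics concrete, here is a minimal, hypothetical Python sketch of how records held by different platforms could be linked on overlapping identifiers with a crude confidence score, the kind of cross-platform matching described above. The records, fields, threshold, and function names are all invented for illustration.

```python
# Illustrative sketch only: a toy version of cross-platform identity resolution.
# Real platforms use far richer signals; the records and matching rule here are invented.
from itertools import combinations

# Hypothetical user records held by different platforms
records = [
    {"platform": "social_a", "user_id": "123", "email_hash": "abc", "device_id": "d-9", "zip": "60601"},
    {"platform": "social_b", "user_id": "567", "email_hash": "abc", "device_id": "d-9", "zip": "60601"},
    {"platform": "retail_c", "user_id": "999", "email_hash": "xyz", "device_id": "d-2", "zip": "94105"},
]

def match_confidence(a: dict, b: dict) -> float:
    """Crude 'statistical' confidence: fraction of shared identifiers that agree."""
    keys = ["email_hash", "device_id", "zip"]
    agreements = sum(1 for k in keys if a.get(k) and a[k] == b.get(k))
    return agreements / len(keys)

# Resolve: declare two records the same person if enough identifiers line up
for a, b in combinations(records, 2):
    confidence = match_confidence(a, b)
    if confidence >= 0.67:
        print(f"{a['platform']} user {a['user_id']} resolves to "
              f"{b['platform']} user {b['user_id']} (confidence {confidence:.2f})")
```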
[29:46] Debbie Reynolds: I think one of the challenges that we have in this issue, beyond the fact that it's kind of like an underground market that people don't know about for their data and then they don't know how deep it goes, right?
[30:00] So it's not just that you bought blue suede shoes on Facebook. It's that you use an accessibility setting on your computer, your computer is in Cincinnati overnight, your phone is near your computer, all this type of stuff.
[30:16] But then also, I think this is a very lucrative business. You know, companies definitely don't want to stop. One of the things that I am seeing, and I want your thoughts about this.
[30:28] I don't think the data collection will stop, right? They'll always want more data, whether or not they can get it for free. I think it's a little bit harder now, maybe because of some regulation, and people are trying to hold back some of that data.
[30:44] And I think to your point about decentralization, I think people think decentralization is like kind of a magic button, which it really isn't.
[30:53] Lisa LeVasseur: The decentralization is what makes the universal profiling possible.
[30:59] I mean, if it were fragmenting, that would be better for us as consumers, but it's not; the way that they've architected this,
[31:09] it allows for, I forget the math term, but it's like multiple foci or multiple maxima or whatever it is. It allows for a lot of universal identity schemas. And that's what I was saying before.
[31:24] Because Google, they have an internal identification system called Gaia. But they're so big,
[31:34] and that is their own kind of universal identification. And they don't have to resolve it, because they're so big, they have so many tendrils out there, that they've just got a universal identification system.
[31:51] But yet it's tricky.
[31:54] Debbie Reynolds: The problem that I have with all these ecosystems, and I know people who work in some of these companies, is that the discussion is always about, well, we need this because we want to sell someone blue suede shoes.
[32:07] Right? But yeah, but I mean, we know that they're using it for other things, whether that be denying people insurance or someone pays more money for a house or things that they don't know about.
[32:18] How can you defend yourself against something that you're not even aware of? Right. Or like, let's say you're getting a score based on the way that you drive or how fast you walk.
[32:28] It's like, if you're taking a test, shouldn't you know that you're being tested? Right. Or scored on that?
[32:34] Lisa LeVasseur: Yeah, and we don't. I mean, people don't even know that this is happening. Well, that's not true; people have a sense it's happening, right? Because people always say, like, oh, my device is listening to me.
[32:49] And after I kind of understood this connected network and ecosystem of marketing surveillance, I realized, oh, it doesn't have to listen to us. It really doesn't. It's connecting all this disparate data, and it's really got an enormous amount of information.
[33:08] It doesn't really need to listen to us at all. It's got it.
[33:14] And you're right, what you mentioned before about there being so much money behind it. I did a back-of-the-napkin calculation of all of the industries that rely on this function,
[33:26] this is in our identity resolution report, and it's law enforcement, it's insurance, it's social media, it's sales and marketing and advertising.
[33:41] All of these massive industries, because they want to predict you and they want to personalize. This is the other thing that I was kind of frustrated with looking into this, because a lot of the rationale in the advertising and marketing space is that customers want personalized information.
[34:05] And if you look at the company's websites, they're like, identify all the visitors on your website, even if they're not your customers.
[34:14] I mean, who asked for that? I don't think any of us asked for that, and that's what this is doing. And also this idea that everybody wants a personalized experience, even if they're not your customer.
[34:29] And I just kind of want to say: if you don't know me, I do not want a personalized experience. That is not okay with me, at all.
[34:40] But there's so much money and momentum around this drive for personalization, because there are some statistics somewhere in some high-powered marketing study that say there's a slightly bigger conversion rate if there's a little bit of personalization on something.
[35:02] And so, because of that data, we are now targeted. We have no safe harbor whatsoever. We are just, I don't know, marketing victims in this very surveilled, invisible world.
[35:26] Debbie Reynolds: What are you working on now that you need help with or what would you like to share for someone who either wants to get more information from you, maybe want to help out in some way?
[35:39] Lisa LeVasseur: Yeah, well, there's a couple of things. You can check out our website: all of our research, all of our tools, all of the databases that we've published, and we try to publish everything.
[35:49] We're a nonprofit organization.
[35:51] We'd like to get everything into everybody's hands so that they can make good use of it. If you're a researcher, civil society, a lawyer, whatever, if you want to get your hands dirty and help create the content that goes in the label, we're defining that now; we've only done one part of the label, and that's privacy.
[36:12] We're working on deceptive patterns, we're working on security risks, we're working on automatic decision making risks. You can come and join us in our software safety standards panel. That's where we do that work.
[36:26] And then the last thing I want to put a little plug out there for is a pilot project we're doing called Safetypedia, where we are teaching people, and certifying anybody who wants to be a certified safety inspector, to do a privacy audit.
[36:43] And you'll get access to our dashboard where you can enter data in for a mobile app and whatever data you put in it will generate a safety label.
[36:55] So that's something we're just piloting with about 20 people, and it's looking pretty fun and exciting. I think people are liking it, and we hope to roll it out.
[37:10] Any interest in that, let me know.
[37:13] Debbie Reynolds: Well, if it were the world according to you, Lisa, and we did everything you said, what would be your wish for privacy anywhere in the world, whether that be regulation, human behavior or technology?
[37:26] Lisa LeVasseur: Just one.
[37:28] Debbie Reynolds: You can have more than one. You can have more than one.
[37:30] Lisa LeVasseur: Well, I think I have to go back to
[37:33] what we were talking about earlier. If I could wave a magic wand, the one thing I would do,
[37:41] I know it's fiction, is disallow a marketplace for personal information.
[37:53] Just disallowing that and agreeing that nothing good can come of that.
[37:58] And then if we do that, we kind of take the steam out of data brokers.
[38:04] But if we disincentivize that whole thing by disallowing it, I think we kind of change the dynamics of that underworld.
[38:16] Debbie Reynolds: I had a thought, and maybe someone can answer this question for me, but I always thought that data brokers should be regulated like credit reporting agencies.
[38:27] Lisa LeVasseur: It's not a bad idea.
[38:30] It's not. And they kind of are, right? Like the regulation that's coming out, I love to see that, and the recent thing, the trade agreement too, that's saying we're going to chip away at those data brokers with every little resource we have.
[38:45] Debbie Reynolds: I think it's a good idea because, I mean, the regulation is there, and I think that they've tried to distance themselves from it, to try to make it seem like they're not that.
[38:55] But the data that they're collecting is being used to make decisions about people. So to me, I think that makes them a credit reporting agency.
[39:03] Lisa LeVasseur: It's currency, right?
[39:04] Debbie Reynolds: Yeah.
[39:05] Lisa LeVasseur: Data is currency.
[39:07] Therefore we should be monitoring it like that. The other thing I was going to say is, like I said, we've been talking about data brokers for several years in our standards panel.
[39:23] And one of my pet peeves about it is the name. They're not brokers. They're not brokers at all. They're fencers; they fence illegal goods. And the fact that we don't call them on that, because a broker is usually doing both sides.
[39:39] Well, they're doing both sides, but you know who's not involved with it? The original owner of the information.
[39:45] It's completely the wrong name; they're fencers. Okay, so the other thing, my other wish, is getting rid of notice and consent as any kind of viable consumer safety mechanism. We've got to agree that notice is mandatory,
[40:05] notice must be correct, and notice must be understandable. But notice does not take the place of addressing safety risks. It does not take the place of a safety notice, because it does not and will not ever,
[40:18] because vendors will never be forthcoming about the safety risks; they can't.
[40:28] Debbie Reynolds: Notice isn't sufficient, and it's not, right? To me, I think notice is the lowest of the low-hanging fruit; it's the minimum that you have to do.
[40:36] And some companies don't even do that very well. That's a concern. But I would love for us to get out of this situation where, especially in the US where we're opting out of stuff, but we can't opt in.
[40:49] So it's like swatting flies, right? Which is terrible.
[40:53] Lisa LeVasseur: Yeah, it really is. Every day there's a new Microsoft thing that you find you've been opted into without your knowledge.
[41:01] I swear, every single day there's a new one.
[41:04] Debbie Reynolds: Oh, my goodness. Well, thank you so much for being on the show. This is tremendous and I love your work. So I'm sure we'll be able to find ways we can collaborate in the future.
[41:15] Lisa LeVasseur: Yeah. Thank you so much. I really am grateful for the opportunity to talk with you and it's such a joy. I mean, I look forward to many more conversations with you.
[41:24] Debbie Reynolds: Absolutely. I feel the same way. Well, thank you so much. I really appreciate it. And we'll talk soon.
[41:29] Lisa LeVasseur: Thank you, Debbie.