E144 - Andowah Newton, Former VP, Legal Affairs, LVMH Moët Hennessy Louis Vuitton
SUMMARY KEYWORDS
data, privacy, issue, companies, claims, accessibility, law, lawsuits, talked, publicity, legal, technology, website, tech, collecting, worked, ai, terms, year, customers
SPEAKERS
Debbie Reynolds, Andowah Newton
Debbie Reynolds 00:00
Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world about information that businesses need to know now. I have a very special guest on the show. Andowah Newton is the former Vice President of Legal Affairs and Head of Litigation for LVMH, which is Moët Hennessy Louis Vuitton; welcome.
Andowah Newton 00:45
Thank you, Debbie. I'm so excited to be here. Thank you for inviting me. And I'm just so happy to participate in this wonderful thing that you're doing, I have so much admiration for what you're doing.
Debbie Reynolds 00:57
So I just had to tell this story; you are so beyond next level in terms of just being an executive and a professional. I saw you on LinkedIn, and I was blown away by the things that you do. I think I contacted you the next week; I saw your picture at the White House with Joe Biden. So you're just all over, doing all types of things. But when I contacted you, you wanted to be on the podcast; you and I chatted, and I sent you some information. We had a meeting, and I sent you a couple of podcasts. You know, listen to these; these would be a couple of good, interesting ones. And what I didn't know is that you were going to listen to all the podcasts.
Andowah Newton 01:43
Yes, I had to. I was just so excited to see one of us not only moving in this space but dominating and thriving in this space. And when I saw all of your accolades, I was just like, I have to listen to all of this. I'm kind of newish to the privacy world, but I've probably been somewhat of a techie all my life, so I was like, I need to educate myself on some of this stuff. So I just started listening to one, and I was like, I should just listen to all of them. And I now have my degree in "The Data Diva" Privacy.
Debbie Reynolds 02:18
You're just too much. I love it. Just an example. There are people like you that are just excellent. You just are, period. So it is no wonder that you reached those heights in your career. And yeah, for many different accolades. So you're amazing. Oh, my God.
Andowah Newton 02:35
Thank you, so are you, Debbie. And I just have to jump in and tell you quickly that as I've been going through all the episodes, I've been connecting with all of your past guests, and they have been so nice and gracious just being in contact with me. So now I feel like I'm well into the privacy community because of you. And yeah, they just sing your praises; everyone from Judge Kennedy to Cameron Kerry, you are amongst the stars. So yeah, thank you.
Debbie Reynolds 02:58
Oh, my goodness, you're about to make me cry on my podcast. Oh, my God. Oh, this is amazing. This is amazing. I loved your background. And I particularly like to talk to people who may not necessarily be directly in privacy, but your role of privacy cuts across horizontally and a lot of work that you do. So tell me about your journey. Tell me your background. So you've lived in many countries, you've traveled all over the world. I mean, you've done so much. But I think your background, your trajectory in terms of your legal career, your personal growth, that all plays into what we want to talk about. So just introduce yourself.
Andowah Newton 03:44
Sure. So I'll start with kind of my journey through tech, and then, maybe during the podcast episode, we'll weave in all the other experiences I've had. As I was listening through your past episodes, I was thinking about, well, how did I get interested in this stuff? How did I get interested in technology? I thought about the fact that when I was a kid, I'm the eldest of five, the only girl, and I was always playing video games with my brothers. And I got really good at video games; no one now would ever guess that I became a gamer in the 80s, but I kind of was; I won the Super Mario Brothers 2 and 3 county-wide competition. And that was my claim to fame for a while. And that really, I think, opened up my mind to tech. And I know people might not think of it as being such an obvious link, but it just opened my mind as to what is possible. And so from then on I took these optional computer classes in junior high, and in high school, I was really good at math and algebra and calculus and physics. And I think I just had an open mind when it came to tech, so when I went to law school, I took this E-law course that was being offered for the first time in the early 2000s, about the digital transformation of law. I was also a business major; I was an accounting major, got my CPA from Georgetown University, and this was in the mid-90s. And back then, the only students that really had access to computers were the business students. And so we were trained on Excel and databases and things like that. And I mention all that because I really think it informs my ability to latch on to technology as a lawyer, which a lot of lawyers struggle with. And when I was an auditor, we worked on databases all the time; I was an auditor at PwC, then at Estée Lauder. And we had to use tech; we had to be really well-versed in all kinds of gadgets and tech as it evolved in the early 2000s.
So by the time I got to law firms, I was involved in some discovery issues, international discovery issues; as you mentioned, I've lived in various countries. France was the longest one that I lived in, for two years; I lived in five other places for shorter amounts of time. But I have an international background; I ended up getting a law degree in French law at the Sorbonne, I did an undergraduate year in Lyon, and then my law school year in Paris. And then I clerked for a judge at the International Criminal Court in The Hague as well before starting at law firms in New York. So all of this international experience gave me a unique perspective when it came to discovery issues; in particular, we were dealing with a large French client that had a huge presence in the United States as well as France. And we had some discovery issues come up, and I had to deal with a Federal statute in the United States that related to transferring data abroad for use in litigation, and also with blocking statutes that exist in Europe. So that was my first foray into the Data Privacy world. And, of course, that continued as I represented several tech clients along the way at law firms. I spent eight years at law firms, and then I went in-house for eight years; I worked for a European company. And we often dealt with issues of Data Privacy because, by the time I got there, around 2015, these issues had become front and center, and we were in the news all the time. So not only did I deal with technology, most people don't know that LVMH actually encompasses about 70 different brands across about six or seven different industries. So I was also dealing with smartwatch litigation, website checkout processes, all kinds of things, luggage technology that you wouldn't necessarily think would relate to a large luxury conglomerate.
So those are the kinds of tech issues that came up, but the Data Privacy issues that came up also involved collection. When I came there, the woman who was my predecessor had just retired, and she had dealt with a slew of demands regarding the collection of personal data at the retail checkout point, such as zip codes. And then later on, about a year after I got there, website accessibility dominated our litigation docket for a long time. As you can imagine, 70 different companies, 70 different websites, all of those had to be dealt with. Other things that came up towards the end of my tenure: there were claims regarding chatbot technology and potential violations of privacy in that arena, as well as session replay technologies. And then, towards the very end, I wasn't responsible for the matters that reached the public eye, but there were a couple of matters regarding virtual try-on and the collection and sale of personal data. I was not involved in those, but those were front and center and, since that time, have kind of exploded on the beauty and fashion front.
Debbie Reynolds 08:58
That's just amazing, staggering, staggering. I have some thoughts about personalization. So companies that have brands selling stuff direct to the consumer are using a lot more advanced technology, especially in fashion. I feel like fashion is going headfirst into the Metaverse; they're doing all these types of things. But two things are happening as a result of this: one is that brands can bring a new level of experience and personalization to people, but that also means that they're collecting more data about the person than they were collecting before, and it becomes more specific to the person. So when you mentioned virtual try-on, I thought about that with makeup brands, maybe ones that match your skin color or different things like that. So tell me about your view on that and how that is really bringing privacy into play when companies are doing these innovative things.
Andowah Newton 10:03
Yeah, well, I think the most fascinating aspect of personalization when it comes to the beauty and fashion space, as compared to other areas, is that in these industries it's largely consumer-driven or consumer-initiated. Unlike in social media or other kinds of online usage, where people are kind of collecting data secretly behind your back without your knowing, and you never asked for it, customers are actually wanting personalization of their makeup products, of their skincare routine, of their clothes. So they are ready and willing to give up so much more of their personal data in order to obtain that personalization. However, the risks are multiplied; now someone's not just tracking your movements online or collecting your name and potentially your address and zip code. Now they're doing things like your facial structure, your bone structure, whether you sweat wearing a certain material, what you would look like if you aged, things like that, what various nail polish colors would look like on new shapes and colors of nails, jewelry try-on features; this is all happening right now in the fashion and beauty industry. The technology is already out there being used, and people are voluntarily doling out their data in order to be able to take advantage of this customization. So I think what's really interesting as well, in the beauty space, is some of this information. I think anyone who has had experience in the consumer products industry is familiar with this, but a lot of the skincare stuff and some of the makeup stuff borders on medicine, right, like treating acne. So now you're delving into a whole new realm of privacy concerns. And we're talking about someone's heart rate or fitness tracker apps and how they track, and using that to customize your clothing or your makeup based on your body shape, all of these things. It's just so much more invasive than what companies are doing in other areas.
So I think about the obvious things that companies had to think about before when it came to data collection: they have to think about why they are collecting this data. Are they collecting it because they absolutely need it to do this personalization? And really looking at that at a very granular level: every piece of data that you're collecting, do you need it? Second, how long are you retaining it? Are you only using it during the try-on process? Are you communicating to your customer that you're only using it for the try-on process? Are people who have customer profiles set up being given more disclosure than those who don't, since more of their data is being saved on a consensual basis than for those who don't have customer profiles? And then after they use it, how long are you going to retain it? If I replace mascara, it's probably every couple of months. If I try out an eyeshadow, I probably buy eyeshadows, I don't know, twice a year or something like that; I buy a whole bunch twice a year. So after six months, does a company need to retain whatever data I used to try it on? Or even after the five minutes that I spent trying it on, do they need to retain that data, and why? And the other thing is all of this information that's being used to recommend other products and what's best for you: what is that based on? We want to know, is it based on my data from when I tried on something? Same thing with eyeglasses and facial structure and shape. So all of the same issues that we're looking at in AI: where did this data come from? These copyright lawsuits by Sarah Silverman and others that we're seeing in the AI space, they want to know, hey, did you use my book as training data for these algorithms? And I think customers are going to want to know, in the end, are you using my face shape, structure, body type, and information about my personal attributes to build these tools that recommend products to other people?
And in the process of preparing for this, like I said, I tend to try to prepare for things to the best extent possible. I went on one of these skin customization websites, filled out their quiz in an anonymous way, and put all the wrong information in there. But I just went through it; there's this company that analyzes and tailors your skincare regimen to 14 different factors about your skin and your habits. And so I went in there, entering all this data just to see what it was like, and one of the questions that I came across was your zip code. Why are they asking me my zip code? Well, it turns out they are using factors such as what the weather is like and whether there's smog and air pollution, which on its face is a great idea, right? They need to take into account your locality, how much sun exposure you might have, things like that. But did they need to ask my zip code in order to obtain that information? Couldn't they have just asked me, do you live in a high-population area with lots of pollution? How often are you exposed to sun? Without asking me my zip code. And then the ripple effects of that, right: are you saving that, are you collecting that, are you using that to target me personally, et cetera, et cetera?
Debbie Reynolds 15:26
That's true. And I think one of the big issues of privacy is secondary use. So just like you said with the zip code, if the only thing they wanted to know is whether there was air pollution in your area, like you said, they could just ask, is there air pollution in your area? But by asking for your zip code, maybe they can repurpose that and say, hey, people who live in this zip code, we should charge them more money or something, you know what I'm saying?
Andowah Newton 15:56
And some do, right? That's what we're finding out. Right?
Debbie Reynolds 16:02
Exactly. But I think the message should be, for companies that want to go into this space: you need to relook at everything in terms of the data that you're collecting; you can't keep doing the things that you've always done as you go into these new areas. You have to take a more detailed look at why you're doing this. So I think there's going to be tons of litigation around these types of areas, because I think companies think, oh, well, that's the new thing, and we don't have to change anything; we've just got to add the stuff, and we're going to collect all this new information. And just like you said about the zip code, I can see a court saying, hey, that's not necessary for you to be able to do skin care. So I think companies need to be really aware of that, and also of those secondary uses.
Andowah Newton 16:47
Yeah, and there are some stats that I came across that kind of surprised me when I was preparing for our episode: that 80% of customers in beauty and fashion say that they want personalized recommendations or are open to them. So that's huge in terms of potential data collection, and also the flip side, the data invasion that could occur here, right? Also, the changes that took place during the pandemic: now a site like Sephora, for example, claims that 70 to 80% of the shopping for some of their products is done online, as compared to in stores, and think about what that involves in terms of data collection and transmission compared to the past. The pandemic also brought about people finding other ways to obtain products, and it made people think about things like hygiene and testers. And so what a lot of these companies who are embracing that personalization technology say is that, well, it's so much cleaner to be able to try on a product virtually rather than using actual product testers in the store. And then it reduces waste, so in terms of sustainability, not throwing out so much. They claim that it reduces returns, and that returns increase waste of a product as well. Another stat that I heard in the fashion and luxury industry is that there's a McKinsey study that said AI could add $150 to $275 billion to profits in the next three to five years. And hyper-personalization is a term that I kept coming across; personalization can reportedly increase those numbers by up to 40%, and some companies are seeing conversion rates two to two and a half times higher using this technology. And what that tells me is, yes, there are huge incentives for profit. But as we've seen in so many of your past episodes, that huge incentive for profit also comes with a dangerous side for consumers in terms of their privacy rights, or what they perceive to be their privacy rights, as well as companies being just sloppy, negligent, or not careful with all of that data.
Debbie Reynolds 19:09
Yeah, in those situations, consent really is key. And I think companies are really good at asking for consent, for the most part. I think the trouble that companies have is when people want to withdraw that consent, right? How do you really do that? So if my skin is a certain shade of brown, and I say I don't want you to store that information, how do those companies actually do that? So I think the challenge will be, not only do they obviously want the consent, but then you have to have a method for people to withdraw consent as well.
Andowah Newton 19:44
Yeah, and I still think there's a ways to go, especially in beauty and fashion, in obtaining consent. And I think there are differences between consent, informed consent, and affirmative disclosure.
Debbie Reynolds 19:59
Excellent. Yeah, those are all great points. I'd love to talk with you a bit about accessibility. So I did a video about this, maybe a year or so ago, about how accessibility features can be personally identifiable to individuals, especially as they do their configuration on websites and things like that. So tell me about accessibility from your point of view, and how privacy issues get entangled there.
Andowah Newton 20:31
Yeah, so website accessibility; when I came to the company, I had no idea what that term meant, actually. It was a huge education process for me, and as I learned, it was a huge education process for everyone I was working with within the company: tech people, marketing people. So people had to get up to speed on what this is and how we use it. As it turns out, we're all familiar with it; we just didn't necessarily know that there were terms for this. And the thing about it from a litigation perspective was that companies underestimated the extent to which existing laws about not discriminating against people with disabilities could be extended to their websites, making a website a place of public accommodation in the same way that retail stores are places of public accommodation. And I think that is an important lesson as we look at Data Privacy and AI going forward, because even though there is no Federal Data Privacy law right now, and even though there are no major regulations about AI, where there's a perceived violation or a perceived right, customers, plaintiffs, the plaintiffs' bar will find a way, I believe, from my perspective, to assert those rights through existing law until that law is there. And what's interesting about website accessibility in particular is that, to date, we still have no national legislation about this issue, or even legislation on a State level. Yet last year alone, there were almost 3,000 lawsuits, and lawsuits are different from the number of demands and claims that are made to companies; a lot of those get resolved in confidential settlements. So I would say, by the time you're looking at lawsuits, we're probably looking at 5 to 10% of the number of claims and demands that went out there from the plaintiffs' bar. And this started as an issue where there were maybe a couple hundred lawsuits in 2017, 2018, and now it's several thousand.
And these are legal claims that are brought on the basis of guidelines issued by the World Wide Web Consortium, an independent nonprofit organization that came together to write a very detailed set of guidelines that have really guided the legal landscape, in the absence of any clearer law on the subject, for almost a decade now; they started writing them before, but the litigation kind of spun out of control in the past maybe eight years. So from a legal perspective, it's really interesting to see how those guidelines have been used to set the standard, and how courts have followed those guidelines in enforcing them and determining whether someone's rights were violated. And the Data Privacy link is that a lot of times, when demand letters are issued, they will loop in Data Privacy claims. In other words, not only was a website not accessible because of its physical features, font size, not including closed captioning, contrast levels, things to make it easier for people who had vision issues or were hearing impaired, but now the theory is that you're also violating their Data Privacy rights, because they are not able to access what their privacy rights are and how their data is being used. Because what preceded that was all these terms and conditions and disclosures that companies were putting on their websites: these are your rights as to how we use your data. But people who were unable to easily access websites suddenly didn't have access to their Data Privacy rights. So it was a dual violation. And working in the luxury industry, depending on the brand, and every brand is different, sometimes there was a lot of resistance to changing the website because there was concern about aesthetics, but to me, it's an issue of creativity.
People in fashion and luxury are very creative, and a lot of the brands rose to the task of finding creative ways to work this into their websites and into the new information and new content that was constantly being added to their websites. And the other thing about accessibility is that I liken it to those buttons in front of doors at stores that you can press, even if you're not a, quote, unquote, disabled person. And I know we're moving away from that terminology. But even if you don't have some kind of permanent disability, you could just have a lot of bags in your hands that make it difficult for you to open the door. So many of us benefit from accessibility modifications that are made, even when we don't have disabilities. And so many of us are temporarily disabled; you break a hand or a wrist or a limb, or twist an ankle, and suddenly you're in need of these things. And I remember I was at a resort a couple of months ago, and there was this gorgeous pool on top of a hill. They had three flights of stairs leading up to this restaurant and pool that overlook this gorgeous beach. And there was also this huge, dramatic, beautiful ramp that kind of swerved around, and you could tell somebody had taken time to conceive how to design it to make it look so beautiful. It just perfectly fit into the landscape of this structure at the resort. And I observed that most of the people took the ramp down; they didn't want to take the steps, because the ramp was easier for them. So I think that's how we should think of accessibility and all of these kinds of technological modifications that we're required to make; they actually facilitate the experience for many people. And if you were to translate that to a computer screen: if you're in a sunny environment, it might be nice to have that contrast; if it's late at night and your eyes get sensitive, even if you don't have a disability per se, you might benefit from the accessibility.
Debbie Reynolds 26:58
That's amazing. I love your example about the resort. I agree with that completely. You touched a bit on the fact that the US does not have a Federal Data Privacy law or legislation. Try as we might, it's not yet happening, and it doesn't look like it's on the horizon anytime soon. But I would love your perspective as to why that's not happening. And I think, because you have so much international experience, you may have a different lens that other people don't have. But give me your thoughts on that.
Andowah Newton 27:35
Yeah, so I have to say that I'm a little bit more encouraged after listening to the Senate hearings on AI that took place in May, because there were several senators who mentioned the fact that we don't yet have a Data Privacy law. I'm like, wow, so this is actually being talked about. And also, the way they were directing questions at the executives from both sides of the aisle actually gave me a little bit of encouragement on that front, that we might actually see some legislation. It might relate to AI first, before Data Privacy, but it was striking to me to see both sides of the aisle being so consistent on an issue; I've testified before Congress on other issues, and even on seemingly obvious issues, it's hard to get both sides to agree on something. But with this stuff, they pretty much all agreed. So that gave me a bit of encouragement. But I do have some thoughts on what has made it take so long to get Federal privacy legislation on the books. I think, broadly speaking, that law is a reflection of culture. And I think that there are major cultural differences between the US and, I would say, Western Europe, places like France, Germany, Switzerland, Northern Europe; I exclude Spain, Italy, and Portugal from some of this because I think they're probably culturally a little bit more similar to the US in some ways that the other countries aren't. But I think that, just as a culture, American culture values privacy less. Not to say that we don't value it at all. But I think we live our lives a bit more out loud than in those European countries that I just mentioned. I think it even comes down to things like, we wear brighter colors. If you've walked down the street in New York, it looks completely different from walking down the street in Brussels or Paris. And there's always this kind of reputation that Americans have for being loud.
And I think that carries over to how we express ourselves and how comfortable we are doing things publicly. And we have this very interesting celebrity culture that I don't think exists to the same extent in some of those European countries. And then you can tie culture to legal habits. When I got my French law degree, when I was interning in France as a lawyer and working at the court, it opened my eyes to the civil law system and how different it is from our system. We are a common law system, along with the UK and other UK-influenced countries; our law is case-based for the most part. We may have statutes, and we may have codes and regulations, but there's such a reliance on, and understanding that, judges will make law, an expectation that judges will make law. And I think a lot of European countries are very code-based; they have these major national codes. In France, they have the Code pénal, the criminal code; they have the civil code; and they have codes about administrative liability. We have bits and pieces that we try to patch together, and we have States that have some of those codes, but we don't have that kind of code culture ingrained in us. And it's a very structural thing that I think you can see throughout the EU and the legislation that it comes up with; they are enacting laws in a way that we don't. I also think, when I went to law school in France, whenever we were told to write a legal memorandum, it always had to be in two parts and two subparts. I noticed that even when I got to law firms and was working there, if I got a memo from somebody else, it was similarly structured; maybe once or twice it might have a third part and a third subpart, but that was the structure that they always worked with. It even comes down to the fact that a lot of French people have similar handwriting, because they use graph paper and are taught to form letters the same way.
So it's very structured; we are wild compared to that, right? We have so much less structure. When you see legal decisions here, it's just paragraph upon paragraph upon paragraph. Over there, the paragraphs are numbered. So when I drafted decisions at the International Criminal Court, we numbered our paragraphs in issuing judicial decisions, and most European legal decisions are like that. So there's all of this desire and need for structure and formality; I think Europeans tend to like forms, certifications, and stamps more, in a way that we are just a little bit more flexible about. All of that contributes, I think, to this. It's not the first thing that comes to mind for us, right? We wait for the problems to accrue, for litigation to happen, and then we're like, okay, we need a law now, we really need a law now; otherwise, someone else will decide it. And as we talked about in the website accessibility context, that law never came, and it wasn't until we had international standards that we were able to do that. I do think that we will get a Federal Data Privacy Act, but I think that even when we do, from the moment it's drafted, there will be no expectation that it's the last word. We issue these laws with the expectation that judges are going to shape and interpret and add their own color to them, and that they'll be interpreted differently in different jurisdictions. So that's my take on it. I hope I didn't offend any European countries or any Americans in making these observations. But having been the only American lawyer when I was at The Hague, and being one of the few people who have this experience, logically, I think, for me, there are so many things in terms of culture and legal practices that go into the gaps, and into why we are so far behind, I'd say, compared to Europe when it comes to so much of this.
Debbie Reynolds 33:31
Yeah. That's fascinating. And I love the fact that you brought up the difference between common law and civil law, because I think that's another reason why many, many big US tech companies have major outposts in Ireland, or do a lot of business in the UK. And also, I agree with your observation about the differences. You went into more detail, but I always felt like Europeans feel about privacy the way that we in the US feel about free speech. So it's very, very different in terms of what our priorities are in that way.
Andowah Newton 34:04
Exactly.
Debbie Reynolds 34:07
Yeah. Very cool. You mentioned Sarah Silverman and her lawsuit against OpenAI. So this is a surprise for you, and it's your fault, you mentioned it, so we're going to bring it up. To me, this is going to be a really interesting case. What I think is happening now is that there's going to be a collision course between privacy and publicity law, because I think the angle she's taking in this case is that this is a violation of her publicity rights in terms of how AI is using her data. What are your thoughts?
Andowah Newton 34:58
Yeah, I think you're spot on that it touches upon so many areas of the law, right? She's claiming copyright infringement. But how do you get there? Right? And I know that in your podcast with your guests, it frequently comes up that we don't have a constitutional right to privacy in the United States. However, I don't know if you used to watch Jay Leno when he would interview people on the street and ask, do you know who the president is, things like that. I think if you were to do that with Americans across the country, 90% of them would tell you that there is a constitutional right to privacy. So even though it does not exist, people would say, yeah, that's on the books, right? So I think it's a perceived right, and back to this issue, these perceived rights are going to get litigated, because somebody feels they have the right. So in these instances, Sarah Silverman and other authors, and I'm sure there are going to be photographers, and there are already musicians, are making similar claims against AI companies like Meta and Microsoft. It's not just an issue of, oh, I think you infringed by copying. It's also a question of, yeah, I had rights to this as an artist, right? And I have a privacy right. But there are also State court claims, very traditional ones such as unfair competition. So many States have deceptive practices statutes on their books. There's a myriad of claims; there are some other ones that I wanted to mention here. But one thing that jumps to mind more generally when we're just talking about data is, as I was listening to your episodes, I thought constantly about this concept of bailment that we learned about in law school. Most people don't use that term, but bailment is the concept of giving someone something to hold on to temporarily for whatever purpose, or a very specific purpose, but you expect it back; you're not transferring ownership.
And the example that comes to mind for me, when I think of that, is sending your clothes to the dry cleaners: here, please clean this, but it's not yours, I want it back. Or having your car valet parked, right? I'm giving you my car to park, but I want it back. And I think data needs to be conceived of in that way. I'm giving you my data for this one specific purpose; you don't own it. So a dry cleaner can't give your clothes to another customer and say, hey, wear this for the next hour or so. The valet driver can't say, oh, I'm going to take a spin, because that wasn't the intended use. And I think we need to start thinking about Data Privacy in that way. But I wanted to highlight some of the things that you mentioned in terms of the way in which claims are being phrased. Here are some of them: not only copyright infringement, but DMCA violations, right? Negligence, that's as basic as you get when it comes to legal theories; that's just saying that you had a duty to take care of something in the right way, you failed that duty, and I suffered as a result. And they're not only asking for monetary damages, right? Injunctive relief, that's like, fix your algorithm or fix your training data. How expensive and pervasive would that be? So I think, yeah, you're exactly right, things are going to start to spin out of control in terms of the bases people are using to bring these lawsuits and the legal theories, and the law may catch up. But if it doesn't catch up, especially when you get celebrities on the front end of these lawsuits, they're going to be saying, oh no, there are about six legal theories under which we could bring these lawsuits, and they're all valid, because we were damaged in one way or another.
Debbie Reynolds 39:15
Yeah. I think the interesting thing about publicity, and I want your thoughts on this, is that we're seeing cases come out that are trying to address this issue: who has the right of publicity? So if you were a person in Ohio who had 100 Facebook followers, could you sue under these publicity statutes, since you're not considered a notable person? And I think social media is bringing in a new wrinkle there. Because obviously, let's say, President Biden has a right of publicity, but a guy who has 100 Facebook followers in Ohio probably doesn't, right?
Andowah Newton 40:02
Who knows? Who are his followers? That's what I want to know. Right? Yeah, and thanks for bringing it back to that issue, because I think that will start to be interpreted in a different light. And curiously enough, that's another issue that I think Europeans view very differently from us, even when it comes to intellectual property and having this moral right; those are tangentially related to the issue that you just pointed out. It isn't only celebrities that have this right; what's a celebrity now, right? Is it somebody with a large following? Suppose their largest following is on LinkedIn, not Instagram. Do we consider them a celebrity? Are they a celebrity in their particular field? Suppose they are a well-known data scientist; just because they're not on the six o'clock news every night, what rights do they have to publicity? And how are we conceiving of that in this new social media landscape? I think that's exactly right. And I was only half joking when I said, who are their followers, because if someone's followers are heads of State or key decision makers, that means they have a lot of influence, even if they don't have a huge number of people following them. And what does that mean? Right, and now with Threads and other platforms that people will become influential on, I think we're going to see different types of people claiming publicity rights in different ways and interpreting existing laws, whether or not there are State statutes on the books relating to publicity. It goes back to that same theme: there's this perception that I have the right to publicize myself and my persona in the way that I want. And I finally found that list of claims I was looking for before; there are also claims being brought on the consumer front: Deceptive Business Practices Acts, the CFAA, the ECPA, BIPA, invasion of privacy, intrusion upon seclusion, which I had never heard of.
But apparently, that exists in some places, and it's being used as the basis of an OpenAI lawsuit right now that just got filed on June 28. Unjust enrichment, I thought, was an interesting way to look at it. It takes me back again to my torts class in my first year of law school; this is just the concept that even if you don't have a contract with somebody, if you benefit from something that they created or did, whether they consented to that or not, and you damage them, they have a right to recuperate those damages. Failure to warn is another basis on which they're filing these lawsuits, and also larceny or receipt of stolen property. Conversion is another old tort claim: you took property and used it for a purpose that it wasn't intended for. So yeah, we just talked about probably a dozen different legal theories under which you can really get in trouble by not being thoughtful about how you're using and collecting people's data, even if, in a lot of these lawsuits, the training or the analysis of past data was done by a third party. And we'll see in the coming months how courts react to that theory. But I don't think they're going to be very favorable to saying, oh well, we hired someone to do it, and that third party did it, so it's not really us; that hasn't worked in other types of Data Privacy lawsuits. So it's not like that's going to get you out. And I think at the end of the day, if a court perceives that a wrong has been done, they're going to find a way to render accountability for that wrong, even if there's no national statute.
Debbie Reynolds 44:01
That's fascinating. Well, wow, this is just going very Wild West. I think this is really interesting; it's fascinating. So if it were the world according to you, Andowah, and we did everything you said, what would be your wish for privacy anywhere in the world, whether it be legislation, human behavior, or technology? What are your thoughts?
Andowah Newton 44:23
Well, I have a bunch. I'll start with tech literacy amongst judges and lawyers. I think this is essential and critical because, as we've talked about, the legal landscape is so important to how the technology gets advanced, the limits that are placed around it, and how far and how fast technology and Data Privacy are allowed to go. Lawyers are the harbingers; they hold a lot of the power in shaping this, and so many of us are not tech-literate, or are uncomfortable with or afraid of anything tech. And listen, for me as a litigator, I was kind of a social media-averse person. But as soon as I had to start dealing with cases alleging infringement on the various platforms, I had to educate myself about those social media platforms and get up to speed. And I heard a judge recently comment at a conference that the attorneys' ethics rules requiring an attorney to be competent in the area they're practicing could be seen to inform how attorneys should be looking at their obligations when it comes to things like Data Privacy and AI and how they're used. So I really think we need to get up to speed. And that will trickle down into things like Congress and judges, so that, quote unquote, good law can be made on these issues. Another thing we've touched on before: I know transparency is a word often used in the field of Data Privacy, but I think we need to move further than just transparency. That's a baseline; we do need transparency, but transparency is a very passive word. It's like, okay, we just have to make it available to you. And I think that companies need to move towards something that's more affirmative, like disclosure or informed consent, borrowing from the medical field that we touched upon earlier.
And I think a lot of the concepts that are currently used in healthcare should be applied more widely to privacy, actually, for any type of interaction where somebody's using your personal data. Cleaner language. I know some of your guests have talked about this before, but all of the meaningfulness of any disclosure is quickly eviscerated when it becomes anything longer than a paragraph for people to read or understand quickly. And I almost wish that there could be some kind of labeling similar to cigarette labeling: if you smoke, this can kill you, basically. Not that it has to be that extreme, but I think it has to be that bullet-pointed when it comes to Data Privacy. We just talked about failure to warn, right? I think there is a duty to warn, now that we know some of the ills of social media and tech and letting too much of your data out there. California even has this language, which I'm sure you've seen, that people are required to insert into their settlement agreements. It's three or four lines, and every settlement in California basically has to have this language in it. And I think we should move towards that type of concept, where it's similar language that everybody uses, simple and widely adopted, so we're not assuming that people are going to click through 80 pages of terms and conditions on a website before purchasing something. For CEOs, I think it's important that they set the tone from the top down throughout the culture, about ethics in particular. As an attorney, of course, I'm constantly thinking about ethics; that's part of what we learned. But I think that needs to apply to so many more industries, and in a meaningful way. Marketers: I don't think people should be able to get a marketing degree without learning ethics, or work at a company without being trained in ethics. And I think that's for CEOs and senior management to implement and trickle down.
Because so much of what concerns us about Data Privacy is related to marketing and how we're targeting people for marketing. But that's not good enough; it also has to be in product development, because as we just talked about in the beauty and fashion industry, it's not even a matter of getting data through marketing anymore. When Sam Altman testified in the hearing before the Senate, his first answer was, I'm not ad-based. But that doesn't mean you don't collect data, right? It might not drive your collection of data, and you might not be profiting from it right now. I noticed that he answered that question by saying, no, we don't profit from it now. And they asked him, do you ever see a model in which you would profit from it in the future? I forget what the precise answer was, but to me, it was a maybe, right? So even products and tools that are conceived of as nonprofitable can, in the future, be the basis of seeking profits. And as we discussed earlier, once there are profits, there's data, and then people get damaged. So I think that infusing ethics into things like product development and marketing is critical, and I don't think we can just leave it to legal and compliance, especially because so many companies see those departments as, oh, we go to them when there's an emergency; they're the fire department, we'll call them when there's a problem. They're often not integrated from the beginning of the process. And yet the people who are creating these products and creating these ads need to have some awareness of that. And I would say, from a consumer perspective: when I was younger, my mom, she's an educator, always used to teach me, or try to teach me, about the concept of delayed gratification. This concept of, I'll give you $1 now, but if you wait two days, I'll give you $5, right, or whatever the example was.
And I think there's so much of a lack of this concept of delayed gratification in our consumer frenzy, especially, I think, here in the United States. I think customers should have a little bit of patience in terms of wanting to use that new tool or that new app or try-on tool, whatever, and wait until they can figure out what they're comfortable with, right? Check and see, are they asking for my permission? Put the onus on companies and say, we're not going to use this until you assure us that you're disclosing how long you're keeping our data in your databases, and whether you're using my facial features to build your algorithms for recommendations of other products. I think we as consumers need to be more patient on those fronts, to demand the things that we need before using those tools, rather than just being the first person to use them. And then, finally, I know this is an issue that's near and dear to your heart, and it's very important to me as I move more into the space: the bias and discrimination that is not only inherent in so many of these technological developments and in ensuring people's Data Privacy rights, but, from my point of view, seems to be amplified and multiplied when it comes to the use of technology. And I think, again, those things need to be thought of at the stage of product development and at the stage of targeted marketing. It needs to occur from the beginning; it's not something that you can go in at the end and correct or fix. This needs to go, in an intentional way, into the thought process behind developing new technology.
Debbie Reynolds 52:16
I agree with all those things. Wow. You've been an excellent student, I guess, of "The Data Diva"?
Andowah Newton 52:23
Yes.
Debbie Reynolds 52:24
I've been an excellent student of you, of all the amazing things that you've said. It's amazing. So yeah, thank you so much for doing this; this is tremendous. You know, I think we're going to hear a ton more about publicity. We're going to hear a lot more about accessibility especially, and about personalization. So people, keep your ears open for those three things over the next year or two. We're going to hear a lot more about them, because I think there's going to be a ton of legal issues that come up as a result.
Andowah Newton 52:52
Exactly.
Debbie Reynolds 52:54
Yeah. Well, thank you. This has been so fun. It's been a long journey for us to get here. This is great.
Andowah Newton 53:02
Thank you, Debbie. Yes, I'm so glad to have done this with you. It's so nice to be in contact with you. And it makes me so proud as a Black woman to see you in this space, getting all the accolades that are so well deserved, and hearing people talk about you the way they do. It just fills me with pride. So thank you for representing us well; thank you for excelling and exuding Black excellence.
Debbie Reynolds 53:31
Well, thank you. You're tremendous. I don't know anyone who outworks you, under any circumstances. Nobody.
Andowah Newton 53:39
That's what my friends say.
Debbie Reynolds 53:43
They're all right; they're all right. Well, we'll definitely stay in touch, and we'll talk soon. And thank you so much again.
Andowah Newton 53:53
Okay, sounds great, Debbie. Thank you. I appreciate it. Take care, okay?