E13 – Gail Gottehrer Law Firm Emerging Technologies
Data Diva Gail Gottehrer
42:44
SUMMARY KEYWORDS
people, quantum computing, technology, data, autonomous vehicles, AI, privacy, cybersecurity, algorithms, law, vehicles, classical computers, pandemic, technologies, world, app, eDiscovery, case, women, robots
SPEAKERS
Gail Gottehrer, Debbie Reynolds
Debbie Reynolds 00:00
The personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, this is Debbie Reynolds of "The Data Diva" Talks Privacy Podcast, where we discuss privacy issues with industry leaders around the world to give businesses information they need to know now. So today I have a very, very special guest on my show. This is Gail Gottehrer. She runs her own practice in New York. She focuses on emerging technologies. She's absolutely brilliant, and I'm super happy to have her today on the show. Gail, would you introduce yourself?
Gail Gottehrer 00:51
Thank you so much for that overly generous and kind introduction, Debbie. It's a pleasure to be here with you. Hi, everybody. My name is Gail Gottehrer. And as Debbie said, I currently have my own firm based in the New York area. My practice now focuses on emerging technologies: such things as autonomous vehicles, drones, robots, biometrics, AI, and how the law is trying to catch up to them. So some of the legal and regulatory issues that apply, or will apply, to those technologies. And also, given that those technologies and devices collect and use such significant amounts of data, a lot of it very sensitive personal data, the cybersecurity, privacy, and ethical issues connected to that data, and by extension to those devices and those technologies. So that's what I'm focused on these days. And it's great to be here with you, Data Diva.
Debbie Reynolds 01:54
That's quite a list, Gail. I wanted to tell the audience how we met. Robert Childress, who's the President of the Masters Conference, asked us to be on a panel together. I had to look this up; I think it was 2017, about a year before GDPR came into full force. And we were like, hey, this is kind of when you need to pay attention. One of the really interesting things about that panel was that it was the first privacy panel I had been on up to that point to have all women. There were five women on it, and three women of color. We had Keikoh Park, who's Asian; Amie Taal, who is of African descent and also from the UK; me, African American (obviously, no one could tell that); and then Jennifer Mailander, and you. So not only was it a fun panel because it was all women, because we typically go into these things and it's all men, very few women, and an extremely low number of women of color, but it was a really dynamic panel, and I felt like we all had such a good synergy; we played off of each other really well. And we've all collectively done other things since then. You and I have been fast friends. We decided many years ago we would collaborate on stuff, and we've done quite a few things. You've collaborated with me on the privacy chapter of eDiscovery for Corporate Counsel, the book we do annually. And you asked me to collaborate with you on a book about GDPR that Amie Taal was editing. And we talk a lot at different panels and conferences. So I feel like we're just always into some trouble.
Gail Gottehrer 03:59
And you were kind enough to join my Cybersecurity Subcommittee of the New York State Bar Association Technology Committee, which I appreciate, and soon our second annual Thought Leadership Conference report will be coming out. I think that group is another really good example of a cybersecurity-focused committee that I specifically wanted to make sure had a good number of women and women of color, and people of color generally. Because in cybersecurity, and data generally, as you mentioned, we so often see women being vastly underrepresented, people of color being underrepresented, and especially women of color being underrepresented. And every time I read something about, oh, we're looking for diverse talent in cyber, we just can't find it, I want to scream. So part of the point of the subcommittee and our publications is really to emphasize: if you're looking, look no further, because we found these women and these attorneys of color for you, and they're brilliant. You're not compromising any kind of expertise, quite the contrary; these are leaders in the field who just need more of a light shined on them, more opportunities for public speaking, more opportunities for publication, which very often, just due to the structure of law firms or other organizations, don't go to younger women, women of color, women attorneys, people of color. And still we see so many panels that are what we like to refer to as manels, right, that are all men. And not to pick on men, because I think there are some tremendous allies. I'm very fortunate that my Co-Chair on the NYSBA tech committee, Ron Hedges, is a huge ally for women and for getting women involved in things. But I think it's incumbent on all of us to really try our best to help.
And we have a law student and a college student on that subcommittee, both of whom are women, and kind of giving them a little, you know, healthy push in the right direction is a tremendous thing to do. And hopefully, more and more people will feel comfortable doing that.
Debbie Reynolds 06:10
Right, and just a shout out to Ron Hedges. He is amazing, awesome. He's a former judge, and he's a senior partner at Dentons now. He actually reached out to me a while back; we're gonna do a panel together in January for a health care organization. So I'm very honored and very thrilled, but had we not crossed paths doing the cybersecurity stuff, he might not have known me as well. So I think that was fun. I mean, your master plan is working.
Gail Gottehrer 06:49
You know, it's interesting, because that's something the three of us have in common: starting out in eDiscovery. Ron wrote some of the really groundbreaking, foundational eDiscovery opinions when he was a magistrate judge in the District of New Jersey years ago, before he retired from the bench and went to Dentons. And we both started out in the eDiscovery world years ago when it was first starting, back when people thought it was a fad and it would never catch on, and, you know, kind of looked at us strangely. And I think it's interesting how that's evolved into bigger and bigger data issues and more complex issues, and privacy and cybersecurity really became a natural outgrowth of eDiscovery.
Debbie Reynolds 07:31
Oh, yeah, definitely. I think you branching out into emerging technology, and me moving more into privacy, was a natural progression; those are things we were naturally interested in, and things we ended up working on anyway, in some way, shape, or form. I want to work on data problems before they become problems on the legal side. I think that's really important. Before we get to these questions, I want to tell you something that happened a couple of weeks ago that I think you'll get a chuckle out of. I was doing some research on automobile telematics, and I've done a video about it. The research involved white papers and articles and different things, and I was reading this one white paper and thought, this is really good. And I look down, I scroll all the way to the end, and there your picture pops up; you were a part of it. It was very well written. You know, I feel like a lot of people, when they're talking about legal issues or technological issues, will do it in complicated ways that make the subject seem distant, or make it seem like people can't really grasp it. So I really liked the way you talked about complicated issues but broke them down in a very accessible way, so that anybody could read that paper and understand it.
Gail Gottehrer 09:03
Excellent. Well, I'm glad that's your takeaway, because that's really my goal. I think that's part of the problem, where we have trouble with attorneys not feeling comfortable delving into technology. And I confess, I'm the first one: I have no math or science background. I went to law school because there was no way I was doing math or science. But certain things you just learn; we learn complicated things as lawyers all the time. Statutes aren't easy in a lot of cases, but we learn them, and I think technology is the same way. And especially now, because it just pervades litigation, M&A, any kind of law that you practice. Back in the day, people said, oh, you don't have to learn eDiscovery; I do medical malpractice, it's one box of documents, I can do it by hand. And it's like, okay, maybe, but today that's not the case. Because everything is email, everything is texting, everything is data. Everything is stored somewhere, everything involves the cloud; there is no case that really doesn't involve data and privacy and cybersecurity. Even in litigation, right, discovery: if you don't know litigation holds, what that entails, and how to assess what data your client has, you can't really help them with an effective litigation hold, which could lead to spoliation and could lead to dispositive sanctions in a case. And in discovery, if you don't understand data, you can't ask for the right data. And then this is something Ron focuses on a lot, and I've learned a lot from him: is it admissible, depending on how you got it and how you preserved it? If you don't know the right way to get it admitted, it's not coming into evidence, and otherwise, you know, you're not crossing the finish line. Right. And that's ultimately what matters.
And then to your point, if you get it introduced, but the jury doesn't understand it, or doesn't care, because you're essentially speaking, you know, legalese to them, you're not helping your client either. And I think that's what modern-day litigation looks like.
Debbie Reynolds 11:01
Yeah, I agree with that. I would love to delve into quantum computing. So I made a video about this recently, about post-quantum cryptography. It interests me because, as I was thinking about quantum computing and learning more about it, I thought this might impact encryption, and, I don't know, crypto is a hot issue and stuff like that. But I want your thoughts about quantum computing, reasonable cybersecurity, things like that.
Gail Gottehrer 11:31
This is another example of us being completely in sync without even knowing it. I recently wrote an article for Thomson Reuters, which I think is coming out in March, on that very topic of quantum computing. And I'll confess to you, the hardest part was that they gave me a 1,500-word limit, and many, many drafts did not comply with that limit, because it was so hard to try to boil it down to something simple. And again, not being a math-science person, I confess it took me a lot of research. But I think the important takeaway, as I understand it, and again this is completely non-scientific, is that quantum computers work with qubits, which exist in superposition. You can think of it as a spinning coin: it's not heads or tails, it's both. A qubit is in multiple states at once, as opposed to the binary zeros and ones that classical computers work with. And then you have the concept of entanglement: when these coins are entangled, when they fall, they're both going to fall the same way, every time, no matter how far apart they are. And that creates tremendous opportunities for programmers, and the ability to handle much, much more data than classical computers can deal with. And then there's the ability to process things in parallel, along parallel paths: if you think of going through a maze, instead of starting at the beginning and walking one path all the way through, with quantum computing you can go all the different ways at once, in parallel. So obviously you're going to reach the end faster, because one of those ways will work, as opposed to going the wrong way and starting again. So this gives quantum computing tremendous extra power and ability.
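The spinning-coin picture of superposition and entanglement described above can be made concrete with a tiny state-vector simulation. This is an illustrative sketch in plain NumPy, not anything from the speakers or from real quantum hardware: a Hadamard gate puts one qubit into superposition (the "spinning coin"), a CNOT gate entangles it with a second, and repeated measurements of the resulting Bell state always show the two coins landing the same way.

```python
import numpy as np

# Single-qubit basis states |0> and |1>
zero = np.array([1.0, 0.0])
one = np.array([0.0, 1.0])

# Hadamard gate: puts a qubit into equal superposition of |0> and |1>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# CNOT gate: flips the second qubit when the first is |1>, entangling them
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

# Start in |00>, apply H to the first qubit, then CNOT -> Bell state
state = np.kron(H @ zero, zero)   # (|00> + |10>)/sqrt(2)
state = CNOT @ state              # (|00> + |11>)/sqrt(2): entangled

probs = state ** 2                # measurement probabilities for 00, 01, 10, 11
rng = np.random.default_rng(0)
samples = rng.choice(["00", "01", "10", "11"], size=1000, p=probs)

# The two "coins" always land the same way: only 00 and 11 ever appear
assert set(samples) <= {"00", "11"}
print(dict(zip(*np.unique(samples, return_counts=True))))
```

Each individual measurement is random, roughly half 00 and half 11, but the two bits are always identical, which is the entanglement property the coin analogy is pointing at.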
So the connection to cybersecurity and cryptography is that much of cybersecurity is built on algorithms, like the RSA algorithm, that rely on prime numbers. Again, we're getting into math that was not good for me once I hit seventh grade, but the idea is that you multiply two very, very large prime numbers together, and to break the encryption you have to factor that product. Finding the factors of that very large number is hard for classical computers to do; the amount of time and computing power it would require is more than they can handle. So those algorithms are secure because classical computers can't break them by solving that math puzzle. Quantum computers, however, since they can work in parallel and much faster, would be very good at cracking these puzzles and doing that factoring. So the idea is that if we get to the point, and projections say maybe 2033 or so we'll get close, where you have a quantum computer that can figure that out quickly, it could break the algorithms that much of our current cybersecurity is based on. So now NIST is focusing on post-quantum cryptography, trying to come up with algorithms and encryption keys and things that will work in a post-quantum-computing world, and maybe working with quantum computing to improve security. But the point of this article is: what does that mean for laws like the CCPA and the New York SHIELD Act, where right now you have an obligation if you collect personal data, which under both laws is very broadly defined? Under New York's law, it's any individual or company that retains the private information of a New York resident; your company doesn't even have to be based in New York to be under that act. So what does that mean in terms of having reasonable cybersecurity now? What does that mean in terms of what safeguards you have to have in place, given that this technology is still evolving? It's nothing you can settle now; who knows if the threat is as grave as predicted.
Maybe it'll be just as serious as people predict. So if you have sensitive information, your banking information, what do you need to be doing now? I think that's a really interesting question for companies to start looking into. Because if you have very secure, very sensitive information, you know you will be targeted by hackers. And there are already predictions that hackers, even now before quantum computing, are trying to steal encrypted data: even though they can't crack the algorithms today, they can hold on to that data until quantum computing gets here, and then use that technology to crack those algorithms and decrypt the information. And when you think about the kind of information a lot of companies have that has long-term value, you could really be a target. So even now, before quantum computing gets here, there may be things you need to be doing to be deemed to be complying with those laws. So I think it's a fascinating question.
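The factoring point above can be illustrated with a toy sketch (the numbers are purely illustrative; real RSA moduli are around 2048 bits, far beyond anything trial division can touch). Trial division, the naive classical attack, takes on the order of the square root of the modulus in steps, which is why scaling the primes up defeats classical machines, while Shor's algorithm on a sufficiently large quantum computer would not face that same wall.

```python
def factor(n: int) -> tuple[int, int]:
    """Naive classical attack on an RSA-style modulus: trial division.

    Takes on the order of sqrt(n) steps, so each extra bit in n
    multiplies the classical work, which is what keeps RSA safe today.
    """
    if n % 2 == 0:
        return 2, n // 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 2
    raise ValueError(f"{n} appears to be prime")

# Toy RSA-style modulus: the product of two small primes.
# Real RSA uses primes of roughly 1024 bits each.
p, q = 10_007, 10_009
n = p * q
print(n, "->", factor(n))  # 100160063 -> (10007, 10009)
```

The same trial division that cracks this toy modulus instantly becomes hopeless at real key sizes; the post-quantum concern is that Shor's algorithm changes that asymptotic picture for the "harvest now, decrypt later" data mentioned above.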
Debbie Reynolds 16:37
Wow, I should have called you up when I was doing my research for this. You basically said the whole thing. The way I think of quantum computing is, let's say you had a privacy fence that was eight feet tall, and it was sufficient: someone walking past your house couldn't see in. Quantum computing is like someone getting a 15-foot ladder and just looking over your fence. The safeguards we had in place before may not be sufficient when quantum computing becomes more popular or more commercially available. You know, with all these big technology changes there are always laggards, but this is one where the companies that lag behind, or aren't thinking about this problem, are going to be in bad shape, I think. So I think the best thing is, just like you say, NIST is working on this problem, and, you know, the current encryption schemes are not the only type of cryptography; there are other ways people can do cryptography that could possibly not be broken by quantum computing. It just remains to be seen how that works out. But I was really fascinated by this. And I actually found an article by a mathematician from NIST about this particular thing. I was waiting to find someone to talk about it with, so now, between you and this guy from NIST who's really interested in it, we'll be talking about that
Gail Gottehrer 18:12
a lot more. I think it's fascinating. And it goes back to something that you and I talk about a lot, which is data minimization. In terms of what's reasonable, even if you can't buy a product now that's going to protect you from whatever quantum computing turns out to be, the answer is always to have less data, especially if it's sensitive data that falls within the CCPA, the CPRA, or the SHIELD Act, and you're going to be in trouble if it's breached even under current law, forget about quantum computing. If you don't need it, and it doesn't have commercial value, just get rid of it. Hopefully, more and more companies are doing that. Because as much as storage may be cheap, the risk is expensive.
Debbie Reynolds 18:54
Yeah, the risk is very expensive. And I always tell people: that server in the back room that no one wants to talk about is, unfortunately, probably connected to the internet, probably not patched, probably out of service or out of warranty or something like that. That is your weak link. So even if it's data you don't know you want to use again, maybe it's on a legacy system, it could be the doorway or the gateway for a hacker to get into your other stuff. So it is risky, even though it doesn't seem that way. I would love to talk about AI and privacy. I know this is a topic you love to talk about, and I've said my opinion. What do you think are the hot privacy topics related to AI right now?
Gail Gottehrer 19:45
Wearables concern me, and now, especially with COVID, we have so many more of these AI products being used. And the technology is very cool, right? The idea that some of these AI products can notice you coughing and know if you have COVID, or can do contact tracing, all sorts of things. I think the potential is tremendous, but the privacy risk is so high. And, you know, it's an interesting thing about the pandemic: I think it's accelerated technology innovation by necessity. If someone had said five years ago, you're going to get to a point where you're going to be running the court system, the school system, and your law practice on Zoom, you'd be like, that's never gonna happen, that's impossible, right? Zoom was something a few of us knew about and used for conferences. You'd never be using it for depositions and court hearings and trials and all that. But here we are. And, you know, not to pick on Zoom, but with these technologies, we almost asked too much from them, or expected too much from them. We rushed to use them because we needed them, and we didn't really think about a lot of the guardrails, or even just: what data are we collecting? Where are we storing it? Who's it being shared with? Is it being anonymized? It was more like, can we get it working so kids can have some sort of an online classroom tomorrow? And I worry that a lot of the wearables are like that. People take for granted that this is collecting really sensitive information about you, in a lot of cases biometric information, which is not like a password: if it gets breached, you can't just change it. Your retina scan, or, with facial recognition technology, your bone structure, the distance between your eyes, that kind of stuff is who you are. And if that's breached, there are really significant consequences for people.
So as much as I think wearables are great, and I know people are very conscious of monitoring their health for good reason, and hopefully that focus on health will be something that postdates the pandemic, it just worries me. Where's all this data going? Have we put enough thought into what we're collecting? Do people really understand? And you talked about the webinar you're going to do with Ron about medical issues, telehealth kinds of issues. Telehealth is another thing that's been really helpful for a lot of people; I think it has a lot of benefits. But again, where's all that data going? You have people doing online therapy, talking about their lives. What kinds of technologies are we using? I think people assume either that this information is HIPAA protected, or that it's being stored in accordance with some law, and in most cases it may not be. If you download an app, you don't get HIPAA protection along with it, because you're not covered by HIPAA. But I think people think medical information, HIPAA, it's safe. And that may not be the case. If you have informed consent, and people are willing to share this information subject to certain contractual agreements, that's fine. But I worry that a lot of people don't think about that, don't understand what they're quote-unquote agreeing to or consenting to, and that there are long-range consequences that people might not be aware of and only recognize later. And that's why we have a lot of backlash about technology: because people don't really understand on the front end what they're giving up, or what that free app is taking and why it's free. And then they find out later, and people kind of reflexively say, oh, technology. We need to do a better job.
I think we need people to understand: this is your biometric information, your health information; you need to ask a few more questions before you just download that app.
Debbie Reynolds 23:46
Yeah, and you make a good point. I tell people this all the time. I think there was some senator or someone in the last few months, talking about COVID, saying that health app information was covered by HIPAA in whatever form the data was shared. I think a few people let him know, no, it's not. It's really not. I mean, take your example about the person coughing and the wearable being able to tell they have COVID. One of the reasons why this is concerning: let's say that data goes to your medical provider. If the app is a medical app that your doctor or your medical group has, then when you're sharing that data with them, it is protected. But let's say, like you say, you download a commercial app, and the only protection you have is consumer protection. There is hardly any ceiling to what people can consent to. It's not legal to sell your body parts on the black market, okay, but you could consent away almost anything online. And that's one of the issues I have with consumer protections for privacy anywhere in the world: you can be made to consent to almost anything. Just to play a game in an app, the terms of service may give the app, like, exclusive worldwide rights to your image forever. That has no balance with, right, I'm gonna play a game for 30 minutes, and my picture's gonna show up on a billboard in another country in 10 years? That doesn't seem right.
Gail Gottehrer 25:35
Yeah, it doesn't seem like people would consent to that if they really understood it, either. And I think another way AI has been used during the pandemic that's concerning is video interviewing. It's interesting, because Illinois came out with a law about that this year. And it's understandable; it's a convenience, like so many of these technologies are. Rather than have people travel, especially for screening interviews, you have people on the screen, you record the video, and you run it through AI. And the best-case scenario, and I think this does motivate some companies, is the idea that when we have people do interviews, people bring their own biases, either intentional or unconscious. Those of us who've done employment law at some point in our careers have always trained the people who do interviews: don't ask about age; don't ask about the year the person graduated high school, because that could be a backdoor way to get at age; don't ask if a woman has children or can work overtime. We train people on all those things, but sometimes we still find out, or it's alleged, that improper questions were asked. So I think running interviews through AI could be a way to try to eliminate that human bias. But I think we overlook that AI has biases too, again hopefully not intentional, hopefully just unconscious biases being programmed in by the programmers, just because people are wired a certain way. The way we design algorithms reflects who we are and how we think, and the data we put into things reflects choices and potential biases that we don't zero out before we use certain data.
But things like facial movements, or accents, or word choice, or facial expressions are, you know, things that some technologies pick up. I think we have to be more cognizant that even in our attempt to make things fairer using AI, AI is a tool. We almost ask too much of it; we ask it to be perfect, and it's not. We have to recognize its limitations and how people can work collaboratively with AI. Because otherwise, when we talk about AI bias and AI ethics, AI in itself can't be ethical, right? We can't write ourselves out of the equation or kind of abdicate responsibility for the results of AI. It's human-designed technology, and we're using it for decisions that affect people. So as much as we'd love to have some perfect, self-effectuating technology that always reaches good, ethical results, we're just not there. We need a healthy balance, back to your point about balance.
Debbie Reynolds 28:20
Yeah, I think you hit the nail on the head with that. I feel like technology doesn't have context, so you need a human to be able to bring the context to the data that you have. And you're right, you can't abdicate your responsibility as a human to make the final judgment, even though a lot of people think computers are magical and perfect and do things right. And especially with algorithms and AI, it's important that those things be transparent; people need to be able to review what's happening, and there needs to be a diverse group of people doing that. I have a friend in New Zealand whose work is reviewing AI algorithms for bias and things like that. You know, when I look at movies about technology, it's always the evil robot, the robot taking over, and it goes right to your point about the abdication of responsibility. It feels like things can get out of hand, and they absolutely can. But we as humans have to be the leaders. We have to be the people making the decisions, and also figuring out what makes sense or what's the best use of that technology.
Gail Gottehrer 29:40
I think we're gonna see more AI audits. That's something we're gonna see more of: how do you check these AI products? How do we make them explainable? How do we understand them? And I think that's important for lawyers, because more and more we're going to see decisions that involve AI, either in the employment discrimination context or in algorithms used for bail or sentencing or recidivism risk, things like that. More and more of these cases are finding their way to court; we're starting to see more and more decisions. And especially in the criminal context, where somebody's life and liberty are at stake, lawyers are going to have to get comfortable asking the right questions, or working with experts like the friend you mentioned, to gain a sufficient understanding to make sure that people's due process rights are being protected.
Debbie Reynolds 30:31
Yeah. And I'd love to talk a bit about what's happening in the autonomous vehicle world, your love and your passion. Let's go; we need to know.
Gail Gottehrer 30:40
I would hope that the pandemic has made people realize the value of autonomous vehicles, which some people call robot cars, a term I hate because, again, it evokes these ideas of Transformers and evil vehicles taking over the world. But I think the pandemic has shown us that there are so many uses for these kinds of vehicles. We've seen an expansion of one vehicle, Nuro; you may have seen them. They're delivery vehicles, so they look like little oval vehicles, kind of like Volkswagens; there's no windshield, there's no driver's seat, they open from the side, almost like a hatch comes up, and they carry groceries or whatever consumer goods you need, food. They've been approved, and there are lots of them on the road now, primarily in California. And I think this is important. I know people have very strong opinions about autonomous vehicles, and we're still not there yet; we're still dealing with all the complications of level three vehicles, some companies are skipping over that and going to level four, and Tesla complicates the whole conversation. But I just think the reality is, this year we've lived through the fact that we may not have people who are able to operate vehicles, right? We had a shortage of people operating ambulances; we've had a shortage of everything, of healthy people. We may really get to a point where we should start thinking about whether we can have certain vehicles operating on their own, especially in a time when we had lockdowns and there were fewer people on the road, so I would argue the safety issues are mitigated. We could set aside certain road areas. There's a lot of benefit to these kinds of vehicles, even just for government use, setting aside individual ownership or their use as robotaxis. And that's another development: just this week, Zoox came out with these robotaxis that they're using.
Which are interesting, because they're not retrofitted from other vehicles; they're designed just to be robotaxis. But every time I see images of people who potentially have COVID driving themselves to wait in line to get a COVID test, it just strikes me as odd. If that person has a really high fever and is really sick, do we want that person driving? It's amazing, thank goodness, that there haven't been accidents. But think of all these sick people driving themselves to get tested, when we could be doing this using autonomous vehicles, and a lot of hospitals have been using them on closed campuses for these purposes, moving around materials. We're just at the point where, when people need to socially distance, these kinds of technologies can be extremely helpful. And I think we've seen more comfort with these autonomous technologies, whether it's the robots riding through airports sanitizing areas, or the sanitizing robots using UV lights that are cleaning offices, or the robots some hotels have had checking people in and doing other tasks that people don't have to do. There are a lot of really positive ways to use them. And, again, nothing is perfect; everything's still in the development phase. But I hope that one thing we take away from the pandemic is that it's an opportunity to open our minds a little bit more, rather than just condemning these technologies because they're not perfect, kind of embracing them more as people work to make them better. Because especially with driving, the reality is people are not good drivers now, right? If we expected perfection as the bar for anyone to drive, few if any of us would be on the road. If people have an accident, we don't take away their driver's license in most cases, unless something egregious happened.
There have been a few autonomous vehicle accidents, which are absolutely tragic, though I would still argue they were extreme situations. But because of those few accidents, a lot of people have called for scrapping the whole idea, and I think that's throwing the baby out with the bathwater. We don't do that in the case of humans. We're applying a different standard of perfection, or reasonableness, to this technology, rather than being a little more rational about it and saying, okay, we have problems: are these quote-unquote safety drivers really effective? Should we have more regulation? And there's the ongoing debate, which I think we'll see again next year: should it be Federal law, or should it continue to be this patchwork of state law? That's what we have with AV law in this country now, sort of like privacy or data security law. Should that change? I think we need to have the conversation, because there's just too much positive potential: people who are disabled and can't use their legs could have mobility, operating these vehicles by telling the car what to do with their voice; people who are legally blind; older people, and we have an aging population. And unfortunately, we now have a population that's going to have long-term health impacts from COVID. We're going to have more and more people who aren't going to be able to drive, or who aren't going to be great drivers, because they're going to have physical limitations. Autonomous vehicles offer them opportunities for mobility. And to the extent that states and localities have to provide transportation like Access-A-Ride, these are huge opportunities.
And there's so much that autonomous vehicle technology can do, even if we never reach Level 5, where you can sleep in the back of the car and everyone just gets where they're going. That's a beautiful world; I would love that. But even if we never get to Level 5, we can still do so much good for society and for people just by getting to Level 4 and improving what we have now. I hope people take that away as one of the positive takeaways from this challenging year we've had so far.
Debbie Reynolds 36:51
Yeah, well, that's a good segue into my question. In the world according to Gail, where you get to choose and do whatever you want, what would you like to see in terms of privacy laws and regulations in the US or anywhere in the world?
Gail Gottehrer 37:07
I would love to see people start to take privacy more seriously, to really value it. We talk about it all the time, and sometimes when you give people examples, like your example with an app, this is what you're giving up, or this is how much information one company has about you, it lands. Even now, certain people will tell you, oh, I was on a website, and you'll never believe this ad popped up, and it was for something I had just been looking at. And you explain to them how that happened, what cookies are, and they're mortified, God bless them. But if more people really focused on that, we could have people being less willing to just give up information without saying, hold on, if I give you this, how are you going to protect it? We'll see how the CPRA, being more like the GDPR, impacts the US, and whether it moves east, as so many laws that start in California do, and whether that really catches on. I don't know what the answer is in terms of regulation, whether it's Federal law or state law. I tend to think a Federal privacy law would be very complicated, and I don't know that it's the answer to all these problems; it's certainly not going to come along fast enough to solve things anytime soon. But I believe in individual empowerment, people recognizing: this is important, this is my identity, this is something about me, so maybe I should think a little more seriously before I make a decision to share everything or keep putting things on Facebook. And that brings us to Clearview AI, since we're talking about AI, and the idea that they had something like 3 billion images.
And that's really scary, even putting aside how they used it and all the web scraping. It's scary enough that it was intended for governmental use, but they also gave access to investors, who had it on their app and were using it to check out their daughter's boyfriend or whatever. That's terrifying. Three billion images, that's a huge amount of information about you. And if what The New York Times article said is true, that they could take a picture of you from an app and search that whole database for things about you, that's terrifying. That's like one of those bad movies you were talking about. I think people don't think that through. And I wouldn't want that to scare people away from technology, because, again, I love technology, and I really think it has tremendous potential to change and improve the world. Even aside from that, it's just a reality we live with, and we have to try to make it better. But I worry that while some of us obsess about these things and can't understand how anyone else doesn't, there are a lot of people in the world who don't think about this even a little of the time. So in my perfect world, I would make everyone really care about privacy as we're driving around in our autonomous vehicles. That's the trick, right? We're helping the environment too. It'll be interesting to see what President-Elect Biden's green energy push looks like and what it does for electric vehicles and technology in general. So I'm optimistic about that.
Debbie Reynolds 40:40
Yeah. Wow. I am happy to listen to this again; this is very educational. Thank you so much. I'm so happy to have you on the show. This was an excellent session, and I'm glad we were able to do it. I'm excited to share this with people, because I think your description of quantum computing is the best one I have heard so far. It is so complicated, so having someone who can really talk about it in ways that even I can understand is valuable. Believe me, it took a lot of reading, because it's definitely not easy to get your mind around, especially for those of us who went to law school because we were not meant to do math or science.
Gail Gottehrer 41:25
You know, for people who know me and know that I am not a science person, if I could get my mind around this, hopefully that will inspire other lawyers to say, I'll roll up my sleeves and embrace technology, rather than saying maybe it'll go away, or we'll hire some first-year associate and that will take care of it. Hopefully the legal profession will realize that we have to embrace things beyond Zoom. As great as Zoom is, I don't think technology is going away post-pandemic. I think it's going to change the way law is practiced and the way courts run. So it's a great time for people who are willing to see the law that way to carve out a different way of practicing it.
Debbie Reynolds 42:09
I agree with that. Wow, we're at the end of our show, and we could probably go on; you and I can talk for hours about this stuff.
Gail Gottehrer 42:19
Oh, yes, we usually do. But it's been so great joining you. And I'm so thrilled that you have this Data Diva series. So happy for you. You're amazing.
Debbie Reynolds 42:28
Thank you so much. We'll talk soon. Sounds good.
Gail Gottehrer 42:31
And thank you again for having me.
Debbie Reynolds 42:32
Oh, you're welcome. Very welcome.