E4 – Leonard Lee of neXt Curve
38:48
SUMMARY KEYWORDS
privacy, people, companies, consumer, privacy rights, talk, thinking, fake, happen, data, personalization, content, important, products, folks, platforms, industry, digital, agree, cloud
SPEAKERS
Leonard Lee, Debbie Reynolds
Debbie Reynolds 00:02
Hello, this is Debbie Reynolds with "The Data Diva Talks" Podcast, where we discuss Data Privacy issues with industry leaders around the world, with insights that businesses need to know now. I am super happy today to have Leonard Lee on my show. He is the managing director and founder of neXt Curve. He's also had a lot of board memberships and publications. He's a futurist with a capital F; that's what I'm gonna say about him. He's a tech advisor, influencer, and market analyst, and I had the pleasure of meeting Lee on LinkedIn. We always comment on each other's posts. And then one day, I think you tagged me on a post, I commented back, and then you called me and said, oh, we have to do something. So I had fun on your show, which was great, because you have a good personality. And I love the fact that you're always looking ahead at things. You look at how the die has been cast right now, and then you look at what's going to happen. A lot of people just don't do that. We had so much fun on that call. I'm happy to have you on the show.
Leonard Lee 01:16
It's wonderful to be on your show. And by the way, I love your logo; the color choices are really awesome. I really like it. So kudos to your graphic designer. And it was a real pleasure having you on my show, which is the "Rethink" podcast for my firm, neXt Curve. We had some fun talking about digital privacy, right? And I'll have to say you are a futurist as well, because you were talking about privacy X number of years ago; you were way ahead of the curve, because we didn't talk about privacy back then. What was really sobering for me on that session we had together, and folks, you have to check it out, because it's really good, and it's not because of me, it's because of your Data Diva here, she's fantastic, is that you really opened up my eyes. And I've been looking at privacy for quite some time, especially as it pertains to digital. I spent a good part of my early career in the dot-com industry, working in Silicon Valley with some of these companies that eventually became the Googles and the Facebooks, right? So very early on I saw what we were doing with the Internet, and how these ad-driven business models, these freemium business models, were looking to start capitalizing on what they could find out about us and to monetize that information, right? So I think you're as much of a Futurist with a capital F as I am. Kudos, you're one of those forward-thinking people.
Debbie Reynolds 03:07
Oh, thank you so much. But the graphic designer is me, by the way; I do all my own graphics. It's my other side hobby. So when I get bored with privacy, I do graphics stuff.
Leonard Lee 03:23
People know now, right? You've got to call Debbie. Exactly. Don't laugh, we're gonna be talking about serious stuff here.
Debbie Reynolds 03:34
Really? Well, you and I exchanged emails; I sent you an article. I'm always looking at stuff, and on the last call we had, we talked about how maybe we'd discuss Deepfakes at some point. And then a few news articles came up, and one that just came up was totally crazy, and I sent it to you. So I would love to talk about Deepfakes. I feel like we just weren't paying attention while all this stuff was going on with Deepfakes; no one was really looking, and now it is out of hand, I think. So what do you think?
Leonard Lee 04:11
I think this is absolutely dangerous. There's some buzz around Deepfakes, but I don't think there's a real level of urgency around how dangerous a digital phenomenon this is. And a lot of it is being driven by not just AI. Everyone says, oh yeah, Deepfakes are some manifestation of AI, and AI is part of the story, but it's also a testament to how far other technologies have gone, in particular graphics rendering technologies, and to being able to produce this fake content very quickly and at scale. Those are the things that are really, really frightening. And when you combine it with social media and other digital channels that can broadcast this content very quickly, it's frightening. You sent me this article in the MIT Technology Review, and what is really freaky is that a lot of this stuff, a lot of this fake content, is coming out of Russia. The article you sent was about Deepfake pornographic content, right? Really targeted at underage girls. This could very easily be weaponized, if it isn't already being weaponized. And I think the threat to truth, and to what is real, is tremendous; it's completely underestimated and not talked about enough. Does it sound like I have an opinion?
Debbie Reynolds 06:17
You have an opinion on many things. It's good. Yeah, I mean, say you have an opponent in a political process; you can mimic their voice really accurately in a video, or make it seem like they're saying something they wouldn't typically say or doing something they wouldn't typically do, and you throw it on the Internet. And they say false information travels like six or seven times faster than true information. So by the time that opponent in a political process even caught it, the damage is already done. And then you have the court process: do you even have a right to sue? Is it your image? Even if it is your image, do you own that image? Does a photographer own that image? Is it a criminal thing? Is it a copyright thing? Is it defamation, a libel thing? It's just all over the map; it's hard to even know where to start with that. But the problem is that harm can happen really fast, and if and when you're able to get any type of redress, it will be very slow. So I think that's always a challenge.
Leonard Lee 07:42
Well, yeah, and think about the cost of remediating the damage these Deepfake attacks can cause, and let's just call them what they are, they are attacks on a person, versus how little it costs for whoever is creating these Deepfakes to damage someone. Oh, yeah. Totally. I do read a lot of research and engage with a lot of folks in the industry talking about how we can use AI to battle this. And it's like, well, once that fake content gets out there, to your point, the damage has already been done, and now there's the effort you need to go through to reverse the damage. Number one, I think you can only recover half of yourself after being attacked in such a way. But it's going to be almost impossible to scale your response in a way that's going to change anybody's impression, and that's the thing: the impression has already been made. And it caters to fantasy. It caters to denial, this whole alternative reality. What's going on nowadays is being fueled not only by fake news; now it's become a visual thing and an audio thing, so you have content that can reinforce fantasy as well as alternative thinking and beliefs, right? We hear about the conspiracy theories. Now this alternative reality can just turn these things into, I mean, alternative facts. Exactly. For a lot of folks, right? And then too, some people are prone to believe conspiracies anyway. So even if you told them it was fake, they'd be like, oh, I think that's true, or whatever. Yeah.
Debbie Reynolds 10:11
I don't know if you've heard of it, I can't remember what it's called exactly, but some people have memories of things that didn't happen. For example, you remember the Tiananmen Square thing, where the guy stood in front of a tank? I heard someone say, well, you know, they killed that guy. No, he wasn't killed; I think he actually went to Harvard a couple of years after that. Literally, people say, oh my God, did you see what happened to the guy at Tiananmen Square? And I'm like, that did not happen. Even if you look it up on Wikipedia or something that says, here's the guy right here, here he is, they still believe it. So that's another challenge, just the way people think. Some people want things to fall into their point of view anyway, so they sort of create these narratives, but it's quite dangerous.
Leonard Lee 11:04
Yeah. And these are just very powerful tools, tools that I think can have a tremendous impact on folks that are more prone to, let's say, subscribing to conspiracy theories. But especially if you're looking at personal attacks, these Deepfakes are just really dangerous, and I think it's gonna be a really difficult thing to counter going forward.
Debbie Reynolds 11:35
I would love to talk about one of your favorite topics, which is the Internet of Things. One thing that you posted, and I think you and I had the same reaction at the same time, was about having drones flying around your house, this new thing. Now, I'm totally a fan of drone delivery, and I'm happy that Amazon got the FAA license to do drone delivery for packages. I was totally down with that; I'm like, yes, I love this. But then I saw this thing about drones being inside your house; they're supposed to fly around, like The Jetsons or something, and look at your doors and stuff like that. So I would love to talk to you about that.
Leonard Lee 12:19
Yeah, you know, I'm not a really big fan of that. I mean, I have some smart home products in my home. They're not entirely smart. But one thing that I know about them is that when I provisioned them, I gave away a lot of personal information. And when you do read the fine print, you're giving away a lot of your privacy rights; you're opting into a lot of stuff that you actually have no option to opt out of. You just have that agree button, or you get rid of the product altogether. Thankfully, most of the stuff either came with the house or came with a Smart Home package. But I'll tell you right now, consumers need to heighten their awareness of the threat that companies pose to their privacy. And I don't think that companies are being thoughtful about how they architect their own products and services to protect your privacy. Debbie, you and I talk a lot about Apple, and how, very early on, they kind of took up the banner for privacy. Not to say that they're perfect, okay; they've had their issues, and they may even have stuff going on that we don't know about, that we might not find so savory from a privacy perspective. But compared to other companies, you have to admit they're pretty much first to market with a lot of new privacy architectures for consumers. And when I see these products just come out of the woodwork, especially IoT products for the home, you just don't see a lot of this mindfulness and consideration for what I call Privacy First Design. It's almost an afterthought, if not just an afterthought.
Debbie Reynolds 14:38
Yeah, I think the problem is companies want to make products that have the widest use or applicability, right? So what they do is give you, the consumer, the choice to be your own Cybersecurity expert. They want to be able to sell to a business or a home or a government or a big corporation, so they add all these features in, and likely for a lot of them the default setting is, you know, do everything. And then you, as a consumer, have to decide what you want to do, if you have any control whatsoever. So I think it's putting consumers, putting people, in a tough position. It's like I tell people, it's like giving a baby a steak: here, take this, do that. And it's like, well, I don't know how to do that. People just follow the prompts and set it up, and the companies make it so hard to edit or change those settings in a way that makes it better for the consumer.
Leonard Lee 15:52
I agree that there really needs to be more of an opt-out default, right? You've opted out by default, and that forces the consumer to educate themselves on what they would opt into, but then that would require people to actually read the fine print. There are other ways you can deal with those kinds of, let's call them, barriers or challenges to adoption, right? If you took a privacy-first approach, and going back to what I mentioned earlier and what you and I have talked about, that means designing your products and services with privacy in mind, and there are ways to do it. We're seeing solution and architecture patterns that lead to that. One great example is the Secure Enclave on an A-series chip in the iPhone, right? They were the first to actually bring that to market. So whenever you're using that kind of functionality in iOS 14, your private information is stored in that enclave. And when you look at all the other Apple products, the whole discipline they've instilled in the way they design products is to keep your private information encrypted and on-device. They don't send it into the cloud or have things in the cloud brokering identity and access. When you look at these other guys, almost all of their identity management and access management is done in the cloud. And they're also storing your biometrics, or a biometric proxy, and honestly, I don't care what kind of excuse you provide about what technology you're using to secure that data. The fact of the matter is, it's not on my device, it's up in the cloud, and unless I really, really trust you and your data center folks, that stuff can be hacked, it can be compromised, and there really isn't any kind of excuse for it. Those are the types of criteria that consumers need to start to develop in their heads: how is your product designed to protect my privacy?
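To make the on-device pattern Leonard describes a little more concrete, here is a minimal Swift sketch: a secret kept in the local keychain, never synced off the device, and readable only after a biometric check. The function name and account string are made up for illustration; this is one way an app might apply the pattern, not Apple's internal implementation.

```swift
import Foundation
import Security

// Minimal sketch: keep a secret on this device only, gated by biometrics.
// `storeSecretOnDevice` and the account name are illustrative, not a real API.
func storeSecretOnDevice(_ secret: Data, account: String) -> OSStatus {
    // The item is usable only while the device is unlocked, is never synced or
    // restored to another device, and requires the current Face ID / Touch ID
    // enrollment to read back.
    guard let accessControl = SecAccessControlCreateWithFlags(
        kCFAllocatorDefault,
        kSecAttrAccessibleWhenUnlockedThisDeviceOnly,
        .biometryCurrentSet,
        nil
    ) else { return errSecParam }

    let query: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrAccount as String: account,
        kSecAttrAccessControl as String: accessControl,
        kSecValueData as String: secret
    ]
    // Store locally; no copy of the secret is sent to any cloud service.
    return SecItemAdd(query as CFDictionary, nil)
}
```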
Debbie Reynolds 18:33
I agree. I think the shift now has to be from a business focus to an individual focus. I like to call it the rise of the individual. Before, a business would say, okay, you're a customer, we have all these fancy charts and stuff, and we don't really care about your rights. Now it's like, oh my God, people can consent, and they can revoke consent, and they can do all these other things. So thinking privacy-first is important, but so is consumer-first or human-first: you need to be thinking about whether what you're doing is going to trample on the rights of the consumer. And it's hard, because that's not the mindset people have had. But Apple is really leading the way among technology companies in trying to make it more of an individual focus, so hopefully that will catch on with other companies.
Leonard Lee 19:33
Here's what it really boils down to: I think companies need to understand what developing relationships with their customers actually means. When you think about the genesis of personalization, especially digital personalization, it kind of originated with this company called BroadVision, which provided, I don't know if you'd really call it a platform, it was more like a bunch of widgets for these new eCommerce concerns to build commerce websites that could personalize content and offers; the product catalog could be profile-driven. That then went on this crazy trajectory, where you had the likes of the Googles and the Yahoos of the world trying to figure out, okay, how do we further personalize the way we target ads using this idea of personalization? Because if you remember, back in the day, Yahoo and Infoseek and all these guys were more like DuckDuckGo. I didn't know much about DuckDuckGo, but I found out about it: these guys just serve up banner ads based on your keyword search, and there's nothing really there. You're not giving personal information; they're not building a profile on you. So this idea that you're developing a relationship with a customer by gathering as much data about them, and more specifically personal information about them, is insane. You really have to balance the immensely creepy aspect of that against actually developing a relationship with customers. And I think we've gone way overboard today pursuing this whole micro-personalization agenda in the industry, and that really has to change. That's where I think privacy awareness, and all this stuff that you're doing, and I read all your stuff, is so important. No, I do. I mean, the reason you and I are talking, and I talk to a lot of people, is because you have something to say that's important. Well, thank you. No, seriously. I'm the one that invited you, right? I reached out to you because I think what you say, what you know, and what you've been researching is vital.
Debbie Reynolds 22:28
Yeah. Thank you so much. That's so sweet. One thing about personalization that concerns me is the way you move through the Internet: everything is filtered before you look at it, right? The analogy I use is a library. You walk into a library and see all these books and all these sections. The Internet is not like that. It's like, here's a section that's walled off, and this is your library. So it gives you the impression that there's nothing else in the library, and the more you go down that funnel, the less you see of everything else. Actually, I don't know, have you seen "The Social Dilemma"? Did you see that?
Leonard Lee 23:17
No, but I've had a lot of people recommend it to me.
Debbie Reynolds 23:21
You have to watch it. It's really about social media, but there's one little thing they talked about that I wish they could have covered in more detail, which is that everyone has the impression that the things they see on the Internet are what other people see as well. And the fact of the matter is, that's not true. We all have different experiences, because there are different filters and different funnels and different choices we've made. The companies are trying to filter each of us down a different path, and we are not seeing the same thing.
Leonard Lee 23:52
Yeah. And you know, I would say you don't even need to be in the library, and that's the other problem: you're being tracked in a very granular and persistent way. That's what's really frightening. And now you have no way of opting out of that. So I think this whole idea of the right to be forgotten is so important. You're very familiar with that from the GDPR research you're doing. But how can you institute that? One of the things I've been talking about, as part of the research we've been doing, including work we've done for Ofcom, the UK regulator, is that we recommended to them that one of the important technology trends they should be mindful of and promote is this idea of trust platforms. There has to be some way to establish the provenance of content and data, so that there can be some sort of consensus system around what is true. And then on top of that, you have to worry about all the mechanisms you need to quickly scale that as a counter to the Deepfakes and all these other fake things out there that challenge truth. I mean, if you really think about it, it's going to be insane. The next decade is going to be crazy. And our lack of consideration of privacy and, in a lot of ways, of what is true is going to be a huge threat to societies.
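As a rough illustration of the provenance idea behind a trust platform, here is a short Swift sketch using CryptoKit: a publisher signs content when it is created, and anyone downstream can check that the bytes really came from that publisher and were not altered. The type and function names are hypothetical, and a real system would still need key distribution, timestamps, and the consensus layer Leonard mentions.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of content provenance: sign at publication, verify downstream.
struct SignedContent {
    let content: Data
    let signature: Data
    let publisherKey: Curve25519.Signing.PublicKey
}

// The publisher signs the exact bytes of the content with its private key.
func publish(_ content: Data, signingKey: Curve25519.Signing.PrivateKey) throws -> SignedContent {
    let signature = try signingKey.signature(for: content)
    return SignedContent(content: content,
                         signature: signature,
                         publisherKey: signingKey.publicKey)
}

// Anyone can verify: a valid signature ties the content to the publisher's key,
// and any edit to the bytes after signing makes verification fail.
func hasValidProvenance(_ item: SignedContent) -> Bool {
    item.publisherKey.isValidSignature(item.signature, for: item.content)
}
```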
Debbie Reynolds 25:37
Oh, absolutely.
Leonard Lee 25:38
And we really need to check our innovations against some of these very important principles that are the foundations of the integrity of our societies.
Debbie Reynolds 25:51
So I think that's a good segue to 5G. I know 5G is something that you're really interested in, and I guess we can geek out about that. But...
Leonard Lee 26:01
Gonna revolutionize the world.
Debbie Reynolds 26:03
So think back. Now we're in 4G, right? When we went from 3G to 4G, no one was trying to take down towers or anything. But now people are attacking cell towers and stuff; I don't know why that is. If you think about it, 4G made it possible for you to use things like Uber, because 3G was not fast enough to get GPS information on the phone and things like that. So ushering in 5G will bring other technologies that aren't possible on 4G, and I'm interested to see what those are going to be. I don't know what's happening right now. Some companies are obviously trying to rush to 5G, some are trying to stop other people from doing 5G, and the US wants to be ahead on 5G; I don't know how that's gonna happen. What's happening? Just tell me, what are your thoughts?
Leonard Lee 27:02
Okay, so first off, that whole thing about the burning of the cell towers and 5G: that was because some genius sat there, took the map of the 5G deployments, looked at the COVID maps of where the virus was spreading, saw they correlated, and said, oh, wait a minute, these maps look the same, so one must have caused the other. Right? That's where this conspiracy came from, and unfortunately, people are acting on those conspiracies. So going back to our first topic on Deepfakes, that's why it's so dangerous that we have people believing in fake stuff. But yeah, 5G is a complex topic. Here's the thing. A lot of people say, well, if it wasn't for 4G, we wouldn't have Uber and Netflix and all that stuff. But the fact of the matter is, yes, we would have had those things. When you think about it, a lot of these applications were originally on your PC. And 4G, quite honestly, kind of followed the demand that Apple, in particular, was pressing on the carriers with applications like FaceTime. Remember, before, FaceTime could only run on WiFi; it couldn't run on a cellular network, right? Yeah. So it's not that 4G or 3G enabled these applications; it was really the applications that generated the demand that pushed things. It was actually the other way around. It's not like the carriers created the capacity and the technology first, even though some folks will argue otherwise on the technology end; the applications were already there, pushing things, before the capacity was built. And with 5G, you're pretty much gonna see a similar dynamic. The thing is, everyone's talking about the industrial applicability of 5G now, but that's a whole greenfield; actually, it's still a highly unknown frontier for the telco operators. And there's a lot happening around 5G that is going to change the role of telcos versus cloud players versus a lot of other players out there in the overall ICT, the Information and Communications Technology industry. It's going to be really interesting. Now, my thinking is that the most transformative element of 5G is really this ultra-reliable low-latency stuff. That's what I think is going to drive a lot of the new types of applications that we haven't seen before. But that stuff is gonna be extremely hard. I think the industry, number one, doesn't understand it as well as they think they do. And they don't think of it in terms of the application or the system; they're thinking merely in terms of the latency between the radio and your handset. And you've got to think beyond that.
Debbie Reynolds 30:50
Yeah, it'll be interesting. Eventually everyone will be on 5G, every country, but I don't know who's gonna be first. Maybe China. Maybe?
Leonard Lee 31:03
You know, actually, South Korea has been first in a lot of stuff, even though, you know, Verizon says, yeah, we were first. Like, yeah, okay, you guys can argue over that. But in terms of penetration, South Korea is going nuts with it. Others are fast or slow following, but I think a lot of markets are still really, really cautious about it.
Debbie Reynolds 31:34
So what are your thoughts? If you had your wish, if they let Lee decide, what should happen with privacy in the next five years? What would be on your wish list?
Leonard Lee 31:45
My wish list? Yeah, you know, my wish list would be: can I have a delete, or restore, or reset button?
Debbie Reynolds 31:56
Oh, that's good. That would be great. That's never gonna happen, but yeah, that'd be amazing.
Leonard Lee 32:04
Well, you know, we haven't really put a lot of energy into that. I mean, the closest we've gotten is GDPR's right to be forgotten. And then, of course, they basically tell the companies, now you figure out how that happens; you just have to make sure that you provide consumers or citizens with the right. Is it the right to be forgotten? Or is it the right to...?
Debbie Reynolds 32:27
Yes, you're right, the right to be forgotten, the right to erasure. That is never going to happen in the US, because even if privacy laws change here, companies have so much data, and they will not give that stuff up, because they can still use it. Just say, Lee likes, I don't know, pomegranates. That probably won't change, right? So even if we say, forget that, they know, we know that's him, he eats pomegranates every week or something like that. So yeah, they're actually hoovering up as much data as possible, because even if the laws pass, it's going to be about data going forward, as opposed to in the past. So the right to be forgotten, people in Europe really are excited about that. Now, we have a little bit of a taste of that in the CCPA, but it doesn't even go back that far.
Leonard Lee 33:23
It all begins with awareness, right? I think folks in the EU are much more aware of privacy rights, and even in the UK, especially in the UK. I mean, I was surprised when I was in London. Is it okay to say that I was drinking at a bar? I was just chatting with a guy,
Debbie Reynolds 33:46
Who is not in a bar in London drinking? I don't think anyone's going to tell you not to do that.
Leonard Lee 33:52
Yeah, yeah. He was telling me that in the UK, a person has a right to refuse to disclose their identity to law enforcement. So you have these cultures where privacy is almost sacred, right? And so you see policies like GDPR come to fruition. Whereas in the US, and this was a big takeaway that I got from our discussion, we just assume we have privacy rights.
Debbie Reynolds 34:28
Oh, absolutely.
Leonard Lee 34:31
Right. And this is why I think what you're doing is so important: you're part of that movement, and you're a source of awareness of the fact that, number one, we don't have what we think we have, and number two, we have to be aware of what we don't have before we start to ask for what we should have, which is...
Debbie Reynolds 34:58
Rightfully ours, yes. I think people assume that Americans don't care about privacy rights. But when you really sit down and talk to someone and explain to them what is private and what isn't, I've never had a conversation where the person said, oh, that's cool. They're like, what? They're okay with it up to a point, and then they're like, oh my God, I had no idea. So, you know, life, liberty, and the pursuit of happiness doesn't include privacy.
Leonard Lee 35:23
Yeah. And that's a revelation for us. But I don't think there are enough people who have their eyes open about this stuff.
Debbie Reynolds 35:31
I feel like we have to have these conversations at every level. You have to be able to explain it to someone who's eight.
Leonard Lee 35:38
Yeah,
Debbie Reynolds 35:38
A lot of us, obviously, have these chats amongst ourselves, or when we're at conferences and stuff, but we have to go beyond that. That's one thing I really liked about "The Social Dilemma": I felt like it really brought the conversation down to a level that a teenager can understand, which is important. We have to cover all those bases. Yeah.
Leonard Lee 36:00
And, you know, this thought dawned on me as we had the lead-in to our discussion on my podcast: one of the reasons we have companies like Google and Facebook and others is because we legally don't have comprehensive privacy. Right? If we did, these companies probably wouldn't exist in the way they exist today. What they would be is DuckDuckGo. And what I think has happened is that, over the years, these guys, through their lawyers, figured out, hey, wait a minute, we can do this, we can do that; oh, I didn't know we could do that, let's do this. And so what you see is not innovation. A lot of the stuff that, say, Google and Ultraseek and all these search engine companies did in the early days with search platforms, as well as their integration with the ad platforms: they progressively evolved these platforms to be more intrusive on our privacy because they discovered that the law permitted it. So now what you're seeing is that they're starting to butt up against the EU and other regions, including China, that are realizing, look, these guys are straight-up stealing people's identities to make ad dollars. I think the rest of the world is waking up to this. And it's going to be a challenge going forward for the likes of Facebook and Google, where you have the majority of the planet not agreeing with our interpretation of privacy rights, and not agreeing with the level of endowment that our government and our laws provide for our privacy. So true. So true.
Debbie Reynolds 38:03
I totally agree. And I think it's no coincidence that the top companies in the world are from the US; that did not happen by accident. And now we see the FTC trying to do these antitrust cases. That's like a whole other episode we could talk about, but we can't go there, we don't have any time. Thank you so much for being on the show. This has all been enlightening. You're always a lot of fun to talk to, and thank you so much.
Leonard Lee 38:32
You are as well, so thank you so, so much for having me.
Debbie Reynolds 38:37
Fantastic.