E148 - Isabella De Michelis, CEO and Founder, ErnieApp, Your Privacy Knowledge Manager

44:40

SUMMARY KEYWORDS

data, companies, privacy, regulation, users, europe, market, industry, microsoft, ai, years, business, consumer, compliance, competition, europeans, laws, part, chinese, give

SPEAKERS

Debbie Reynolds, Isabella De Michelis

Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a special guest on the show all the way from Zurich, Switzerland, one of my favorite cities in the world. Isabella De Michelis is the founder and CEO of ErnieApp. Welcome.

Isabella De Michelis  00:43

Hello to everybody. Thank you very, very much for this invite to be part of your podcast.

Debbie Reynolds  00:49

Well, it's an honor to have you on the podcast; I've been following you for many years. Your commentary is like a bolt of lightning whenever you comment on something, because you're so well-rounded in information, data, privacy, and what's happening. We chatted a little bit about geopolitics; I want to get into that as well. You're a very well-rounded person; I love to hear your commentary because you don't look just at the surface of an issue, you dig deep on those issues, and I love that about you. But introduce yourself to the audience. Give me an idea of your background and your interest in Data Privacy.

Isabella De Michelis  01:32

That's great. Thank you, Debbie. It's very encouraging, because communicating on social media, it's usually very difficult to speak to such a wide audience, and you really never know who that audience is made of. My background is that I have spent 25-plus years working for very large corporations, mostly American, as an executive in charge of regulatory strategy, market access, industry alliances, standardization, and privacy. I used to work for Cisco Systems, for Motorola, and for Qualcomm Technologies, because I was more in the hardware and software industry, but very much in the B2B space. And then, when I left these very large corporations back in 2014, I decided that it was really my time to go into a venture that I felt very strongly motivated to do: this idea to change and twist the angle and think a little bit more as a consumer than as an enterprise. Because I could see that GDPR in Europe, and especially privacy regulations rising with no common standard, could lead to market fragmentation. And companies love economies of scale. They love big markets, and they love low barriers to market. And privacy regulations are a barrier to the market. So I told myself, I'm going to do a software solution. That is actually what has been honed into ErnieApp since I started the company back in 2017. We did the fundraising, we did the product specification, and we did our first pilots. The intent was really to find the right balance between the right of people to determine what they want to do with their data and the companies that depend so much on data to be profitable, in a way that only a dynamic, peer-to-peer relationship system can ensure, where essentially the data would be supplied by the consumers and the companies could use it. But the companies would need to behave well.
And the users would need to be convinced that it's worth it to participate. The idea went back to the years when I used to work for large American companies wanting to grow their revenues and their market presence in Europe with their technologies and products; I wanted to do the opposite and have a European product capable of going out to markets other than Europe and serving companies outside Europe, because those companies are bound by GDPR, and by privacy regulations generally, if they have operations in Europe. The solution had to be scalable, easy, and very hands-on for people to use and for companies to use. And after so many years, it has taken me through all these very different issues. On social media, on blogs and podcasts, I'm talking about competition, privacy, the geopolitics of data, data sovereignty, cloud and data spaces, all those things that come one layer after the other, all part of the same equation: how can we ensure that more data is used in a better way, for the companies to continue to grow and make profits, and for the people to continue to be respected but also be part of the digital economy?

Debbie Reynolds  05:14

Perfect. I encourage people to go out and take a look at ErnieApp; it's very cool, what you're doing. This is going to be a fun conversation, because you talk about data at layers that I feel a lot of people don't talk about. I want your thoughts a bit on geopolitics. When I'm talking with people about data, I try to fill them in on the why of what's happening, and I consider geopolitics kind of the unwritten rules surrounding what's happening in the written rules. But give me your thoughts on how geopolitics plays into privacy.

Isabella De Michelis  05:56

No, I love this question. Thank you, Debbie. A lot of data practitioners, and especially data protection officers, would question what I'm about to say; I am not a data protection officer, so I try to keep what I'm saying balanced. But I think that one of the deepest meanings of the Europeans adopting GDPR was to set a principle that is very, very rare in applied international law: to have the effect of a law apply outside a territory. That's more typical of the United States of America, which has a lot of provisions and laws with that effect, in particular in the tax and fiscal policy area and in other areas, but the Europeans did not have that. And I think they really, really thought about how they could achieve that, with a very unusual regulation that at the time was called the Data Protection Regulation. It became GDPR later on, and somehow they carved into GDPR this foundation of sovereignty and a geopolitical approach to data that is linked to data collection and processing. So yes, I'm totally convinced that privacy has a geopolitical dimension. And it's even wider and deeper than the economic, industrial dimension. It's really the intent of nations to take control of the Internet, where we know that the Internet cannot be controlled; you can't really prevent someone from accessing the Internet as long as they have access to a line, a broadband line, a data line. That was a fundamental wake-up call for everyone back in 2016 to 2018. For people like me, deep in the industry of networking, broadband, and connectivity, the message was really: all right, governments are back on the topic, they want to sit at the table, and they want to have their say.
So for the very early-stage companies that looked at what the scenario was going to be in 10, 15, 20 years, starting in 2018, they knew that they couldn't do what they wanted anymore without somehow having to deal with possible policy or even regulation. And then we had these two different trends that departed. The United States tried to stay away from regulation as much as possible, and we're still seeing that the Federal privacy bill is still up in the air. The Chinese went very straight into regulation; they came to the point of even disregarding the founders' ownership of the companies; look at what happened to Alibaba and to the companies over there. They have twisted the legal scheme to basically define the data as the property of the government. So it is even more than saying "I have a right to say something"; it says the government can dispose of that asset in the way the government thinks is useful, in its own best interest. And then you have the Europeans, who were a little bit unsure whether to go completely the Chinese way or the American way; the American way has proved too difficult to implement in Europe, too much freedom, too flexible, and the Chinese way was too much the opposite. And, as often happens with the Europeans, you have a fragmented decision-making process. You have the Council and Parliament in Brussels, and you have the Commission, which makes a proposal that needs to go through the legislative process, and sometimes proposals get really changed. So in the end, it boiled down to a policy that puts individuals at the center of where the rights are exercisable, but leaves a lot of discretionary power to the public sector and the government to overcome and bypass the rights of individuals.
Because sometimes people tend to forget that privacy has been defined in Europe as a fundamental right, and because it's a fundamental right, it has very little to do with commerce, monetization, and leveraging an asset that can be capitalized. So it's a very fine line. But clearly, the Europeans were not ready to say that the market could take it all, and they wanted to have a say. That was the balance they struck between the American approach and the Chinese approach. I don't know how much you agree with this, or if you have a different idea or opinion.

Debbie Reynolds  10:58

I agree wholeheartedly. I think of it in three different buckets. The Chinese have gone with the government having the say; the Europeans say you have to support the fundamental human rights of people; and the US is very business-focused, very business-to-consumer. The business-to-consumer route leaves a lot of gaps, because not every human is a consumer. So in the US, if you're not consuming things, you don't really have privacy rights, right? In Europe, you have a fundamental right to privacy, and they want to make sure the laws they create around business protect that at a foundational level. So it's very different. I want your thoughts on the US versus Europe, and on why the US is having trouble creating a Federal privacy law. First of all, there's the fragmentation across the States and different things. But also, in my view, the US, when it creates laws, tends to be very prescriptive, whereas I feel like GDPR is a lot less prescriptive. An example: GDPR says, okay, you need to keep data only as long as is necessary for the purpose you're collecting it for. And then you have something like the CCPA in California that says, put a button on your website that says this. What are your thoughts?

Isabella De Michelis  12:46

Well, it's more at the implementation level that these things become comparable somehow, and because there is a different timeline, it's difficult to compare. Pardon the simplicity of how I express it, but it's like comparing an apple and a pear; they'll never really compare. Something the Europeans did, which is a little bit of a departure from the idea of having just a horizontal privacy regulation with GDPR, was to realize that GDPR did not include all the data they wanted protected, or wanted brought under some option for protection, so they had to develop other regulations to cover that. You certainly have heard about the hurdles around approving the new ePrivacy Regulation that applies to metadata, communication data, and essentially cookies. And it drives me nuts that the industry is so dependent on these trackers and identifiers. One school of thought in the industry keeps saying, no, no, this way of using data will not allow anyone to be personally identified; and the other school of thought says, no, these people will still be identifiable, and hence you need to fulfill the stricter requirements. You have also seen that in Europe, Italy, France, Austria, and others have even started popping up with regulation about cookie consent banners, so sites have to introduce a reject option in the banner, which is very similar to the "do not sell" banner in California. But in California, as you rightly said before, they twisted their regulation thinking more of consumers than of end users: you need to have a paying relationship or transaction, you buy something, to actually benefit from it, which in Europe you don't need. However, there are so many new regulations in Europe that are applicable to data.
Last but not least, there is the Data Governance Act, which gives full rights to users to monetize and capitalize on the data they generate on third-party applications, and there is the Data Act, which is still in motion. The two texts are quite convergent in how they identify IoT data. If IoT data are collected through devices that are connected to the Internet and somehow used by a human being, those data qualify for a consent requirement, even though you would presume they would fall outside GDPR. And as you know, there are many legal bases for processing data in GDPR. You could certainly say that if you have a thermostat in your house, it's there to help you do energy saving; the legal basis for getting those data is the performance of a contract, because you bought the sensor basically for measuring the energy savings. But the reality is that the company doing that is also using those data for other purposes, and that is where the Europeans wanted this additional regulation, to say to the market: we don't expect you to give away your right to use performance of contract or legitimate interest, but you need to rely on consent if you're using those data for a different purpose. This is exactly what we had designed in ErnieApp; we wanted a consent interface so dynamic and so granular by purpose that on one side the users know straight away to whom they can say yes and to whom they can say no, and it becomes like a dynamic relationship. You could say yes on Monday and no on Wednesday, because you have a right to change your mind. But at the same time, the moment you execute your confirmatory consent that the company can use the data it collected from the sensor also for programmatic advertising, fine, that's a deal.
That's a contract between the user and the company, and the company is waived from GDPR sanctions or Data Act infringement, because the Data Act is finally giving people in Europe the right to perform contracts about data monetization; that was missing, we didn't have that. Now, I am interested in, but not sufficiently knowledgeable about, the United States situation: whether the current privacy regulations in the US are maybe more for protection purposes than for data capitalization purposes. I mean, it's strange that it's Europe that is now pushing for users to be able to capitalize on their data, and that it's not coming from the US, because you would expect it to come from the US. And again, it's like the apple and the pear: the apple has come to maturity in Europe faster than in the US, because Europe started first with privacy, and now it can go on to data sharing and capitalization. But because the US started so late on privacy, I don't know how capitalization is going to shape up there, because you're still missing a number of laws that would be necessary.
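The dynamic, per-purpose consent model described here, where a user can say yes on Monday and no on Wednesday, can be sketched roughly as follows. This is a hypothetical illustration, not ErnieApp's actual implementation, and all names are invented: the idea is simply that every yes/no is a timestamped event per (user, company, purpose), and the latest event wins.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentLedger:
    """Granular, per-purpose, revocable consent: the newest event wins,
    so consent can be granted and withdrawn at any time."""
    # (user_id, company_id, purpose) -> list of (timestamp, granted) events
    events: dict = field(default_factory=dict)

    def record(self, user_id: str, company_id: str, purpose: str, granted: bool) -> None:
        key = (user_id, company_id, purpose)
        self.events.setdefault(key, []).append((datetime.now(timezone.utc), granted))

    def is_allowed(self, user_id: str, company_id: str, purpose: str) -> bool:
        history = self.events.get((user_id, company_id, purpose), [])
        # No recorded event means no consent: processing for this purpose is denied.
        return history[-1][1] if history else False

ledger = ConsentLedger()
ledger.record("alice", "thermco", "energy_saving", True)
ledger.record("alice", "thermco", "programmatic_ads", True)   # Monday: yes
ledger.record("alice", "thermco", "programmatic_ads", False)  # Wednesday: changed her mind
print(ledger.is_allowed("alice", "thermco", "energy_saving"))     # True
print(ledger.is_allowed("alice", "thermco", "programmatic_ads"))  # False
```

The append-only event list also gives the company a record of exactly when each confirmatory consent was executed, which is what turns the yes into an auditable "deal" per purpose.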

Debbie Reynolds  18:21

I agree with that. Actually, the CCPA in California deals a bit with monetization: a company can give someone something of value in exchange for their data, but it can't penalize someone who doesn't want to participate. So I think that's the way people are looking at it. But in a way, I guess monetization in the US has been slow because companies don't want to share the money that they make, right? So they don't really want to talk about that part.

Isabella De Michelis  19:07

They resist, of course, yes.

Debbie Reynolds  19:09

I think they'll be dragged kicking and screaming into the future, until companies feel like they're going to go out of business or something if they don't go toward monetization; then I think we'll see it in the US. The way it is now, as these companies see all these regulations being passed, some of them are just getting data-hungry, so they're trying as much as they can to get as much data as they can about people. Because in the US, we don't have a right to be forgotten, and a right to deletion and the right to be forgotten are totally different. I don't think we'll ever have a right to be forgotten in the US, because these companies want to keep that data, and a lot of laws are not really addressing that. I guess they're hoping people don't make a distinction between the right to be forgotten and the right to deletion, which is a lesser right.

Isabella De Michelis  20:10

I'm sorry to interrupt you; I have strong views about this, that this has got to change, for one particular reason. Companies have for a very, very long time invested a lot of money in buying technologies, buying software, and training people, because they also needed the in-house skills to run those machines and ingest a lot of data. And after 10 or 15 years, they realized that they had ingested a lot of data which they never used. Basically, not all data that you get is going to be monetizable, or monetizable with a profit and a margin that make it worth doing. But in the beginning, everybody thought: I'm going to jump on every possible piece of data I can collect and put it there, because it's going to be an asset, I can put it on my balance sheet, I can sell the company for a higher value. And then privacy regulation hit, and the cost of compliance hit so massively that companies realized it's sometimes a lot better to externalize the cost of compliance to the user. So sometimes, when we talk with our enterprise customers, we tell them: we understand you are afraid that your users may delete data stored in your system. But how much of that data are you really using if it's older than ten years, or five years? The answer is that not a lot of companies can say; they need to go back and check. And then they come back and say, well, 25 to 30% of those data we really don't use, but we keep them. So why do you keep them? Why wouldn't you clean that up? Oh, because it costs a lot of money; I need to call in a specialized company to do data discovery.
And then I need to inject software into my system, and I'm scared something goes wrong. So I tell them: but what about giving the users a button to click, in your data structure, so they can exercise that right through an interface? The user cleans things up from their own device at home when they have time, your people may be on vacation, and you don't need to buy extra software. Because if you enable your own system with command and control that you put in the hands of the users, then you can recommend to the users what you want them to do for you. Take a company like eBay, which made a public statement that approximately 10% of its revenues go to data protection compliance, and of that 10%, another 10% goes on right-to-be-forgotten execution and data deletion. Now, if eBay externalized that function in their application, under the profile where the user is authenticated with username and password, there would not even be data subject requests, because they would be self-automated. So sometimes technology can really bring a lot of efficiency; even the consumers can be efficient in helping the companies comply. But the companies are still so attached to having a lot of data that I see a very important education effort needed on the mindset of business people, to understand that ultimately cleaning is sometimes better than keeping and complying. It's probably going to be easier to monetize a clean data set, less data but more contextual, than a lot of data you don't even know you have permission to use.
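The self-service deletion idea, putting the erasure command behind the user's own authenticated profile so the request never becomes a manual data subject request, might look like this in miniature. This is a toy in-memory sketch with invented names, not any company's real system: the user picks an age threshold and triggers the cleanup themselves.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical record store: each record tags its owner and when it was collected.
records = [
    {"user": "alice", "kind": "purchase", "collected": datetime(2012, 3, 1, tzinfo=timezone.utc)},
    {"user": "alice", "kind": "purchase", "collected": datetime(2023, 6, 1, tzinfo=timezone.utc)},
    {"user": "bob",   "kind": "search",   "collected": datetime(2011, 1, 1, tzinfo=timezone.utc)},
]

def self_service_delete(store, user, older_than_years, now=None):
    """Erasure command exposed to the authenticated user: the user, not a
    hired data-discovery firm, triggers the cleanup of their own old records."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=365 * older_than_years)
    kept = [r for r in store if not (r["user"] == user and r["collected"] < cutoff)]
    deleted = len(store) - len(kept)
    store[:] = kept          # mutate in place: the old records are gone
    return deleted

# Alice clears everything of hers older than ten years; Bob's data is untouched.
n = self_service_delete(records, "alice", 10, now=datetime(2023, 7, 1, tzinfo=timezone.utc))
print(n)  # 1
```

Because the command only ever touches records belonging to the authenticated user, the company keeps control of the schema while the cost of executing the right shifts to the person who holds it.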

Debbie Reynolds  23:57

Yeah, I agree with that. Also, I feel like companies go overboard on compliance stuff; it doesn't have to be as difficult as people make it. So I try to make things simple, not more difficult. I think part of it is that people who make it difficult get paid more, because they create more issues. So I think that plays in too. I want your thoughts on this, and it's a topic I love to find people to talk about; I don't find a lot of people who know about it. Because of your background in international law and economics, I want to talk about the link between Data Privacy and competition and antitrust. I co-wrote an article a couple of years ago for Bloomberg Law, and I had a very spirited debate, quote-unquote, with a lawyer who at that point did not think antitrust and privacy intersected, and I thought he was crazy. So tell me your thoughts about the connection there; it's definitely connected in my view.

Isabella De Michelis  25:08

So first of all, I can only congratulate you for having spotted so long ago that yes, of course, there is an interlink and an interdependency. There are such strong bonds between privacy and competition that it's almost amazing how long it has taken academia and even the antitrust authorities to start looking into it as a specific area of urgency, because it's really about access to data; that is the point most linked to the antitrust component, rather than abuses of data. How many players in the market can overcome the barriers to collecting data? Collecting data carries such enormous cost; how can that be overcome if only a few players can afford it? And they are so big as to be naturally dominant; they're not even evil. They're just dominant because of their ability to scale and the huge capital they disposed of to begin their venture, their platform. Of course, some of these companies have grown by acquisition; if you compare the model of Google versus Facebook, Facebook had to buy a lot more companies to reach the size to compete against Google. But now look at what happened with Microsoft; everyone knows Microsoft was sanctioned by the European Commission in the Windows interoperability case. It seemed that curbed the dominance of Microsoft in business collaboration software; they're still at 90% market share, still dominant, but they're bound by the commitments taken in that case. All right. What happened back in November is that, after OpenAI launched the tool, ChatGPT was used by 100 million users in less than a month and a half. The time to market was very short, even shorter than Instagram, shorter than Facebook, faster even than TikTok. And I'm not saying it didn't cost a lot of money; it did cost a lot of money.
But think about the implications for the Microsoft business collaboration software suite: Office 365, Teams, OneDrive, and all the services, including SharePoint and every business application that goes with Microsoft's licenses to midsize and large enterprises. Now what happens? Microsoft wants to integrate the ChatGPT engine into their software suite as a default plugin. So basically, in less than a year, they've moved from pure business enterprise software into the consumer space, because ChatGPT is going to be trained by consumers, whether those consumers are employees or users. It doesn't change the fact that it's going to be trained, because their software is already deployed to everyone on Earth who is using PowerPoint, Word, and Excel. So I'm really expecting that to be brought up quite soon as an antitrust case; other players in the market will probably see it as unfair support, an unfair condition to expand market presence by leveraging a pre-existing product that was not even in the same class of product or segment. Even though the European Commission, by devising the Digital Markets Act, thought to have an act that helps regulate gatekeepers' excessive power in the market, and they tried to define the markets, and there is a list of markets defined in the Digital Markets Act, I promise you that the market they should be looking at the fastest, which is the ChatGPT AI engine integrated into business collaboration software, is not even listed as a market. It will probably be dealt with case by case, because they tried to do it with the general regulation and they failed. On the other side, Google is trying with Bard its different way of competing with Bing, which is potentially also linked to ChatGPT.
And clearly, it's interesting that a few months ago, a young startup would not have a lot of advertising budget to spend. If the company went to an advertising agency and asked, which social network can I go to without spending a lot of money to target an audience, the answer would have been Bing; it was the cheapest you could find in the market. Today, it's not the cheapest anymore, because with so many people going to Bing and so many queries being injected into Bing, Bing has improved its accuracy so much that the price of targeting someone through Bing has skyrocketed. So you see that there is a significant effect between how data are used and whether they are protected or not. Because, going back to what I said before, ChatGPT didn't have on its interface any privacy dashboard, any privacy option; it doesn't give people the right to delete their queries, it doesn't let people delete their account, it doesn't let people choose the processing purpose. So that's unfair, because what about all the other companies that are bound by these rules? It cost them to put up a purpose, to put up a privacy dashboard, to do their privacy engineering in a way that is compliant and efficient, and not just for show: you tell your users, I give you an option to opt out from targeting, and then if the user opts out and a measurement is made and the targeting is still happening, that company gets fined. So there are serious implications, with market consequences and legal consequences. And then you've got these huge, gigantic players who come in with a big name and go into the market without a privacy impact assessment and without real consideration for every other rule that already applies across the board to all their competitors.
And, you know, since we're talking about large language models and chatbot-type services: there is this Chinese attempt to compete against the Microsoft OpenAI engine and the Google Bard engine, and there is also the Facebook engine. Interestingly enough, the Chinese one, Baidu, picked the name Ernie. And we now have a big fight to pick with them, because they would like us to abandon Ernie as a name. For us, Ernie has a lot of meaning, not only because we trademarked it, and we also did so in China, in Hong Kong, in Japan, Korea, every market in which Baidu has a presence, not counting the United States, Europe, and the rest of the world. But why the name means so much is that ErnieApp, in the end, is like an assistant for you on privacy, data sharing, and data capitalization. So you could think of us a little bit as the place where, if you are a user or a company, you go to the engine, our engine, the ErnieApp engine, and you ask to be assisted: how do I opt out? How do I delete? How do I transfer my data? How do I monetize my data? Everything that is "how can I do this, and I get assistance". That probably qualifies as at least deserving to keep using our name, which was trademarked years and years ago. But the race to the chatbot shows how strong the geopolitical dimension is, because you have the Chinese, who want to run but are late, also because of their delay in the computing industry and in the chipset industry; they're really not equipped with the supercomputers that you need to train these models. And on the other side, you have the healthy United States market that loves competition, and it can claim that it has Microsoft against Google and maybe even against Facebook.
So for the pure competition antitrust lawyers, that's a sign that the market is working, because you have at least three players who compete. But for me, as a very liberal-minded economist, to have a healthy market you need to not have an insurmountable barrier to entry for a new entrant. The amount of money you need to do what Google, Facebook, and Microsoft did for their chatbots is an insurmountable barrier to the market. So that is why I love so much the idea that in Europe we have this new system, which will create an asymmetric regulation for these big companies. They will have to divest some of their assets and some of their data if they're prompted by new entrants that want to access those data. And I think this is really very, very good, notwithstanding the privacy, compliance, security, respect, and robustness of the solutions that need to be implemented to make it happen. But it's very, very big. It's as big as when we opened and liberalized the telephony industry. The big companies needed to open up their systems to allow smaller companies to terminate calls. And there was this big thing about people having the right to port their numbers from one carrier to another: if you move house to a neighborhood that's close enough, you keep your phone number even if you change your carrier. I think all these things are very good for competition. Very, very good.

Debbie Reynolds  35:54

Wow, that's mind-blowing. I love that you drew the parallel to telecom, because I agree; when people tell me, oh, we can't do this with privacy, I'm like, well, look what different countries have done in telecom in terms of creating standards and making sure there's a somewhat more level playing field. Also, I love that you talked a bit about the asymmetry. The business-to-consumer relationship is asymmetrical by nature, but I think something like AI makes that asymmetry astronomically unfair for people. So in my view, and I want your thoughts on this, AI brings with it more privacy risks, especially as these models are collecting and ingesting more data, and it's being used in ways that maybe people aren't aware of in these particular models. But I want your thoughts about AI and privacy risks and the asymmetry there.

Isabella De Michelis  37:01

So I will say that I have some initial thinking about it, but it's a little bit premature to say if my thinking is going to stay the same over time. At the moment, I am extremely cautious. AI, for me, is just another technology. It's probably in a different framework, but it's just another technology; we have the same asymmetries between the data generators, the data collectors, and the data processors. Not to mention that it's even fewer players that can collect those data and process them. So there is this competition dimension that makes it even a little bit more critical than it used to be with the web or the mobile platform. And it's too premature to know what the Metaverse is going to look like. AI, these are engines, these are tools. So depending on how they're going to be used, they're going to create more imbalances and more asymmetry. But there is one firm point that I believe is not going to be debated a lot between the industries and the governments, because this time, for AI, the governments are already sitting at the table. And it's that the governments want their cut of the profits. And the cut of the profits can only come from one source: the citizens somehow being pulled into the value chain and having a right to recuperate, in some form of cash or non-cash, their part. Because with distributed networks, and with supercomputing happening probably just down in California, there is no jurisdictional lever that can work other than if you twist it the other way around. So what the governments will do is say that if these models, if these companies, want the data from the people, they will need, as they already started, to get the consent of the people.
So consent is not going to be, for much longer, only a compliance thing like a legal basis; it's going to become the gateway, probably the tilting point of who controls the value of what. Because if an LLM needs to collect a lot of data, but the rule is that without consent it cannot collect it, then that entity needs to talk to the users and convince them to hand over the data. And since the people are far away from that company's headquarters, they are spread around the world. They are of any age, any culture, and probably it's going to be an interface that will need to be a common standard for lots of people to use, a little bit like when you take a smartphone and everyone knows how to switch it on, right? Everybody knows how to switch on a smartphone. And then it will become so easy to say yes or no against an incentive that there will be floods of money returning back into countries, which makes the governments happy. So AI will happen, and it will happen in the way people believed it would happen, with one exception: it will cost a lot more money. Not just trivially, a lot more money. It's an indirect tax that is going to be levied.

Debbie Reynolds  40:31

Wow, that's fascinating. We have to talk more about this. This is fascinating. Well, if it were the world according to you, Isabella, and we did everything you said, what would be your wish for privacy, data protection, and technology anywhere in the world? Whether that be regulation, what happens with technology, competition, or human behavior, what are your thoughts?

Isabella De Michelis  40:57

Oh, well, it comes back to the reason why I founded ErnieApp; the idea was to rebalance a little bit the value chain of the digital economy. And rebalancing means making a bigger pie. When you redistribute assets in the market, you create a market that is bigger, so there's more space for more players, and there's less inequality for the users of those services and products. So my vision is to have privacy and competition, two-sided policy, two-sided economics, and two-sided technologies, walk hand in hand to redistribute in a more balanced way the value that is created through the interaction between humans and machines. That would make a lot of positive impact in countries that are less wealthy, and it would also create, for those which are a lot wealthier, a sense of responsibility that we are all part of the same planet. Everyone across the planet has the same rights, and the right to access the Internet, to me, is less relevant than the right of determination over your data. For 20 years, we heard that the only thing that was important for countries in the world was to have proper, good Internet access. I think now what we should give ourselves as a goal, each one doing his part, and our part, is a right of choice for people to say: I have privacy, and I know what it is, and I love it, and I don't want anyone to question that. But I'm also part of this digital equation, and if I can be part of it in my own way, let me be it. So the disintermediation of users, who for 20 years have been called just users, should end. They should be part of the value chain. And within the value chain, privacy and competition should help that happen successfully. Not just happen, but happen successfully.

Debbie Reynolds  43:12

I love that vision. Well, thank you so much, Isabella, for being on the show. This is mind-blowing. I'd love to chat with you more in the future around this because I agree you have your finger on the pulse, and you're watching this really closely. So it'd be interesting to see how this plays out.

Isabella De Michelis  43:28

I hope I didn't bore the future listeners of your podcast too much. You know, I tend to talk a lot, but I'm very passionate, and I love talking to you.

Debbie Reynolds  43:41

Ah, thank you so much. This is great. I think a lot of times, especially on social media, it's hard to talk about bigger things, hard to talk about things at a higher level. And you do that really well. I've been on social media where a lot of people are chasing the ambulance, while you're thinking about these higher, bigger issues. So I think it's good to have these conversations, and it's good for all of us to expand and see the bigger picture.

Isabella De Michelis  44:11

Excellent. And if you stop by Zurich, please ping me. I'll be there.

Debbie Reynolds  44:15

Oh yeah, I love it. I love Zurich. Oh my goodness. Yeah, we'll definitely talk soon. But thank you so much for being on the show.

Isabella De Michelis  44:22

Thank you. Thank you a lot, and have a good day.
