Debbie Reynolds Consulting LLC


E225 - Elizabeth Aguado, Emerging Technologies, Responsible AI Expert (South America)

Find your Podcast Player of Choice to listen to “The Data Diva” Talks Privacy Podcast Episode Here


The Data Diva E225 - Elizabeth Aguado and Debbie Reynolds (28 minutes)

[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.

[00:12] Hello, my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks Privacy podcast where we discuss data privacy issues with industry leaders around the world with information that businesses need to know.

[00:25] Now I have a very special guest on the show, Elizabeth Aguado. She is an emerging technologies and responsible AI professional all the way from Lima, Peru. Welcome.

[00:38] Elizabeth Aguado: Thanks, Debbie.

[00:40] Debbie Reynolds: I'm so happy to have you here. You and I have been connected on LinkedIn for quite some time. We actually had a chance to collaborate with one another on a course that you were working on with Stanford.

[00:54] I want you to introduce yourself to the audience and tell them how you came to this career path.

[01:01] Elizabeth Aguado: Thank you so much for the opportunity, Debbie. And definitely one of the things I love the most is the opportunity to collaborate with people all over the world. And certainly it was such an honor to have you as part of the pilot program with Stanford last year.

[01:19] And I think it's quite an experience to have.

[01:25] So I started in technology in about the end of 2020 when I started a course in AI with TU Delft.

[01:36] And I was by that time in the manufacturing industry.

[01:41] And I knew that automation was a huge thing. So that's why I was already trying to upskill.

[01:49] However, I think it was the beginning of last year that I really started looking into more ways to participate in the making of AI, not just treat it as this nice word that all companies want to jump on, right?

[02:09] But also to know more about the technology. So I started taking courses in AI ethics, which I already knew a little bit about, but not enough. And I found myself more and more interested, given the background I had been studying.

[02:25] And my interest just grew more and more. I became so obsessed, so passionate about this.

[02:34] I knew that responsible AI was something that was worth really putting all your time into. See, in Peru and Chile, countries where I have lived, let's say it's in our culture that we worship big tech.

[02:56] Okay? At a young age, we're always asking kids, who do you admire? And my generation would answer, Bill Gates. And it's not even questioned.

[03:12] Big tech is going to have that same impact on this generation. Now the answer is Elon Musk, he's the person they admire the most, and they see so many qualities in him. So what I found myself asking is, okay:

[03:27] Of course there are many engineers in this country and in Chile, many people working on this, but how many of them are really thinking about the impact and the consequences of these technologies?

[03:38] And that's also where I found professionals in data and other emerging technologies, like you. And it has been a really, really nice journey.

[03:50] Debbie Reynolds: Wow, thank you for that.

[03:52] Explain a little bit more about responsible AI. We see a lot in the news, as you say. Like in your example, the people many look up to are these tech titans who create all these products and services that they use every day.

[04:11] But what we see in the news mostly is very much excitement about the types of things that you can do with AI. And you don't hear a lot of talk about why you need to have responsible AI.

[04:25] I feel like some people are like, oh, it's this cool thing, it's fun, we can make money from it. But then a lot of people are pushing back.

[04:35] They're like, well, if you have guardrails or you're using AI, quote unquote in responsible ways, then you're going to stifle innovation. So can you tell me a little bit about why you think responsible AI is important?

[04:48] Elizabeth Aguado: Well, definitely. The technology is getting better and better. I feel like whatever we're doing isn't even fast enough.

[04:55] I know right now we have many frameworks in responsible AI, we have many organizations, international organizations, and even ISO standards. The way I see it at a global scale is, you don't even have to wait that long to see the effects.

[05:11] I see it in the news here too. We got news about a judge using ChatGPT to make a verdict on a child support case, you know, handling those kinds of cases that really have an impact on people's lives.

[05:28] It's not really about, okay, I'm using generative AI for copy or for advertising.

[05:34] These people are being affected by these technologies. And so responsible AI is not just about feeling like you're more moral than others.

[05:50] Right. Or preaching. But certainly we need to work on critical thinking and the ways that we implement these technologies. I see here that we do need responsible AI. What people see at first is, oh, this is fast, this is going to democratize research and so many things.

[06:14] But the need is not just in the US or in Europe. It's also in the Global South.

[06:22] Debbie Reynolds: Let's talk a little bit about the Global South. Obviously, a lot of times when we hear people talk about data privacy or data protection, you hear a lot about the US and you hear a lot about Europe and you don't hear enough about the Global South.

[06:35] Even though it's very active in data protection, and has been for decades. But give me a sense for the audience of what's happening in the Global South, for people who don't understand what's happening there in data protection.

[06:49] Give us an idea of what's happening.

[06:51] Elizabeth Aguado: I can mostly speak for South America and Latin America, Caribbean. I'm not very aware about the situation in Africa or the Philippines, let's say.

[07:02] But something I also wanted to tell you about is my project. What I'm trying to do is to really connect different professionals in different countries and collaborate with them.

[07:14] I am also interested in making more connections in Brazil. It's one of the first countries that already has laws, let's say kind of like the executive order you had in the US, and it's already being worked on at the congressional level, which is different from Chile or Peru.

[07:37] It depends on the country. For Chile, I'm also collaborating with Kenneth Pugh on building a cybersecurity infrastructure.

[07:47] At least having the goal that by 2030 Chile will have a really robust structure, which would also help with the perception of the country.

[08:02] I think Colombia and other neighboring countries are also trying to follow what UNESCO or other international institutions come up with, like the EU AI Act, right? My concern is that we already have a slow system, and if you tell me, well, wow, it took that long for the EU to finish their first version, then I don't know how long it's going to take here.

[08:35] And like I said, it's already being used by judges. And we had this topic, right, about facial recognition, during our time collaborating on the pilot program at Stanford.

[08:52] But the thing is, here a lot of things are just being thrown at people, like facial recognition. You cannot opt out of its use, for example in apps. All the fintechs require your biometrics, and they really don't follow any regulation.

[09:14] In particular, sometimes they say they subscribe to the GDPR, but then they make you sign that you give consent. So since you gave consent, then it doesn't really matter. So for facial recognition, I see that here it's already happening, and there's less attention on these ethical questions. You also have to look at it from a geopolitical standpoint, because China is growing and is the first partner for a lot of our countries in Latin America, since we have a big mining industry, and the minerals you find here can be used for batteries and the whole semiconductor industry.

[10:09] So it's really a complex problem, and I'm really concerned about society and how we are going to be able to face the many challenges, because at least in the EU, or in the US, you have an executive order.

[10:29] You have, right.

[10:31] You're one step ahead. We are just attending different conferences, but our authorities are really slow. So that's really one major downside, right?

[10:44] Because technology, like I said, is so fast and we can't keep up.

[10:51] Debbie Reynolds: What's happening in the world right now that you're seeing in privacy or data or AI that's concerning you?

[11:02] Elizabeth Aguado: The situation where privacy has a really high price.

[11:09] If I am really honest with myself, I have many privileges, not just because of the access to information that I have, but also resources. Say, I use Proton; they don't pay me.

[11:27] Right. I'm not trying to advertise Proton. But you know, you pay for privacy, because it's true there's privacy by design, but privacy has a cost, and I don't see how people in Latin America can access that.

[11:45] And you see that, say, the price of ChatGPT, which is 20 USD, means more here. So this idea of democratizing education,

it's also tricky, right? So I would say that's also something other countries in the Global South will have a problem with, because just as some communities in the States have schools that don't have access to the same resources, or are underfunded, here it's like 10x that.

[12:24] So that for me is really something to work on. And all I could come up with at the moment, something I'm also working on, is data literacy and collaboration between countries through my project, Ignite Lab.

[12:46] And definitely that's trying to empower the users.

[12:54] Debbie Reynolds: Do you feel, just your sense, that people seem to care or ask more questions about privacy now that we have all these emerging technologies, with facial recognition and things like that?

[13:08] What is your feeling?

[13:10] Elizabeth Aguado: Unfortunately, I don't see a lot of people concerned. Again, from the perspective of people in the Global South, or Latin America, it's all about price and access.

[13:28] People just want to have access. They don't care about privacy.

[13:32] They don't find that their data is that valuable. Right. And they prefer not to know. They don't want to know.

[13:42] Right. But I try to ignite those conversations. I try to really create that space. You have to create the spaces for people to talk about it.

[13:59] It's not something where you can just, you know, run into people and start talking about their data privacy or their practices, right? It takes time. So I'm trying to build safe spaces where you can work and talk about the good things about technology, but also voice your concerns.

[14:19] Debbie Reynolds: Well, let's talk a little bit about the course that you and I collaborated on together. You asked me to speak at one of your sessions, and I believe I spoke about biometrics in the session with you.

[14:32] But tell me a little bit about that project.

[14:34] Elizabeth Aguado: Yes. So that project was the second edition, I believe, of Tech Ethics and Public Policy that Stanford was offering. The cohorts are often in the fall. And as I collaborate with All Tech Is Human, I'm part of the working groups.

[14:58] I decided to apply for the course, and they asked us if we wanted to be leaders and have groups. So I tried to organize our sessions with a group from All Tech Is Human.

[15:19] Of course, the program is made for many other practitioners, not just a certain type. They had guests like Sam Altman, right?

[15:34] They had Jacinda Ardern, right, from New Zealand, and many, many more. And I think a lot of people know about Professor Rob Reich from the book System Error. The other professors were also delivering part of the videos, the sessions.

[15:57] This course was mainly online.

[16:01] I mean, yeah, it was a virtual course. But the guests were obviously live in our sessions. The one that you helped us with was the synchronous pilot, since they were also evaluating that.

[16:18] Well, since this program is so useful, not just in the U.S. but in other countries, it would be much better to have it recorded and also have the sessions, right, with their respective...

[16:30] And we had people join in from California with different time zones. Right.

[16:35] So we were able to meet once a week and work on the readings. And you did facial recognition, which was, I think, week five or six of a seven-week-long program.

[16:55] I know that they are starting to roll out the ads for the new cohort.

[17:03] I got really great feedback from the other organizers, and I cannot be more thankful that we were able to have you, and that you could explain to us more about your consulting experience.

[17:19] Right. You're someone who has hands-on practice, and it's not just the theory. We had access to great material nonetheless, but it's a different feeling, right, to have someone who works in the industry

[17:37] share with you what's happening. And you really guided us through some laws as well, right? Different ones in different states. And that was really enriching. Thank you again.

[17:52] Debbie Reynolds: Aw, thank you. It was my pleasure. People still watch that and contact me about that session because I put it all.

[17:58] Elizabeth Aguado: You should share the link again so they can go over it.

[18:04] Debbie Reynolds: It was very good, very good. Thank you. It was my pleasure to be able to do that.

[18:09] I want your thoughts on something that's happening in South America, something that caught my attention. In Chile, they actually made an addition to their constitution to add neural rights.

[18:28] And I think it's the first country in the world that has done that. But I want your thoughts on this. This is such a big, hot emerging issue. You know, we see a lot of activity, especially with companies that are developing neural technology, human-brain interfaces and things like that.

[18:49] But I was very impressed that Chile did this. They thought it was so important that they added it to their constitution. I just want your thoughts on that and what that means.

[18:59] What do you think?

[19:00] Elizabeth Aguado: Yeah, that's actually one of the reasons I want to keep collaborating a lot with Chile and stay in touch, because they really make it a priority to work on regulations and legislation and everything regarding emerging technologies.

[19:17] They are not waiting.

[19:20] They're already starting those thorny conversations. And I think it also helps that Chile, I mean, it's not a small country, but population-wise it's manageable. So let's say it would be like a little, how can I put it?

[19:43] Like a little Norway, I don't know, right? Where you can also test many scenarios. And that is very useful, right? Because you want to sandbox a lot of things before just jumping in and banning something.

[20:03] Right. And so what I have seen firsthand is that there is a great initiative from not just private, but public universities.

[20:14] They're also working a lot on quantum technologies, and they are trying to already educate, say, the healthcare industry about, like you said, neurotech and all the other emerging technologies.

[20:36] They want to be one of the first countries in the region to lead and to set a good example.

[20:48] Well, those schools, yeah, I think they have really great people working at those roundtables. And so I expect a lot from them, and I really think I benefit from having those connections.

[21:08] I know there is a conference for women and AI in Chile, but it's sponsored by a private university, and they also collaborate with this senator.

[21:24] And so you can see that they really put in the effort to make both the public and the private sector collaborate.

[21:36] And I think that's a good model for Peru and Colombia and other countries.

[21:44] Debbie Reynolds: That's an excellent point you made around the private and public collaboration on these types of things. But I think also one of the things that really struck me around adding neural rights to the Constitution is sort of staking your claim saying, hey, we know that these technologies are out there.

[22:05] We know that companies will continue to develop those things. People will probably want, or will have reasons to want, these types of technologies. But then you also have to keep in mind that people should have rights over their data and their information as it relates to their bodies, right?

[22:25] Elizabeth Aguado: Yes.

[22:27] Chile is also a country that cares a lot about the individual as a whole. If I compare it to others, even Peru, well, unfortunately in Peru we take care of the investors more than the people.

[22:45] And the government really, I mean, will do anything just to keep international investors.

[22:56] I don't find that there's enough of a connection between those sectors.

[23:03] And so I think each country, yes, has its own particularities, kind of like each state in the U.S., right? But I think we need good leaders.

[23:18] Debbie Reynolds: So if it were the world according to you, Elizabeth, and we did everything you said, what would be your wish for privacy or data protection anywhere in the world, whether that be regulation, human behavior or technology?

[23:32] Elizabeth Aguado: I'm in the middle, right? I believe that emerging technologies have the capacity to augment human intelligence and really help solve complex problems. For example, in the Global South we struggle with extreme poverty and, you know, access to basic healthcare, vaccines, right?

[24:04] There are still diseases that have not been eradicated.

[24:08] And particularly in Peru, for example, children are anemic, and that affects how much they learn, right? Because if you go to school and you have a low blood count, what kind of learning are you going to do?

[24:26] Is having ChatGPT do your homework going to solve it, if you know that at the core the problem is that you don't have your basic needs covered? Right.

[24:39] What I would expect in an ideal world is that we don't just base our decisions on maximizing growth, because growth is tricky, right? I understand that governments want to be more productive, but for the reasons I have explained, I think that in this context we need to be more careful.

[25:10] We should also invest in cultivating our humanness. Right. We should not stop investing in the human capacities and the skills that also make us human.

[25:31] I think that's something that right now is lacking.

[25:41] My wish would be that, yes, the objective would not be just growth, or economic growth more specifically, but that they would also think of AI and emerging technologies with the objective of improving people's lives.

[26:01] Even if that could mean one thing in Japan and another in the States.

[26:09] But the human at the center.

[26:13] Debbie Reynolds: Human at the center. And you said very poignantly that those things can mean different things in different regions, and I totally agree with you on that. Well, thank you so much for being on the show.

[26:25] I'm so happy that I've gotten to know you. And thank you for inviting me to participate with you on your Stanford project. It was a lot of fun. And I'm sure we'll find other ways that we can collaborate in the future.

[26:36] For sure.

[26:38] Elizabeth Aguado: For sure. And I would encourage people to watch the recording of that session, because it's really, really great, just in general.

[26:49] Debbie Reynolds: Yeah, I'll provide a link for people. I'll provide a link.

[26:53] Elizabeth Aguado: All right, thanks, Debbie. And y'all can reach out to me via LinkedIn. I'm very active on that platform.

[27:00] Debbie Reynolds: Yes, she is. Definitely connect with Elizabeth on LinkedIn. She's amazing. And I love the things that you post, too. So thank you so much.

[27:08] Elizabeth Aguado: Bye.

[27:08] Debbie Reynolds: Bye. I'll talk to you soon.

[27:09] Elizabeth Aguado: Bye.
