E201 - Angeline Corvaglia, Founder, Data Girl and Friends

Many thanks to “The Data Diva” Talks Privacy Podcast “Privacy Champion” MineOS for sponsoring this episode and supporting the podcast.

With constantly evolving regulatory frameworks and AI systems set to introduce monumental complications, data governance has become an even more difficult challenge. That’s why you need MineOS. The platform helps you control and manage your enterprise data by providing a continuous Single Source of Data Truth. Get yours today with a free personalized demo of MineOS, the industry’s top no-code privacy & data ops solution.

To find out more about MineOS visit their website at https://www.mineos.ai/

32:01

SUMMARY KEYWORDS

kids, parents, data, privacy, adults, ai, children, talk, understand, app, created, find, people, cfo, information, braked, imagine, friends, risks, teenagers

SPEAKERS

Debbie Reynolds, Angeline Corvaglia

Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.

Hello, my name is Debbie Reynolds. They call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a very special guest on the show, Angeline Corvaglia. She is the founder of Data Girl and Friends. Welcome.

Angeline Corvaglia  00:40

Thank you. I'm so excited to be here. It's really great to be on this fantastic podcast.

Debbie Reynolds  00:46

Well, we met on LinkedIn. I think someone shared your Data Girl and Friends content, and I would love for you to talk more about what that is. Basically, Data Girl and Friends is an animated series that you put together that raises awareness with children, and with adults like me, about privacy issues. I absolutely love what you do, and I was happy that we were able to meet on LinkedIn. We were also fortunate that you happened to be in the US. You're from Italy, but you were in the US, and you and I got a chance to actually have lunch in Chicago, and it was a fantastic lunch.

Angeline Corvaglia  01:28

I still dream about that lunch. I can't wait to come back to Chicago and find out what other food places you have up your sleeve.

Debbie Reynolds  01:37

I know, Portillo's is a classic. Well, tell me a little bit about your journey in tech and why you're doing Data Girl and Friends.

Angeline Corvaglia  01:49

Well, my journey in tech started when I was a CFO of a financial institution in Europe, a big multinational bank, and I did a lot of digital transformation initiatives, big ones involving 350 legal entities across 17 countries. I was then CFO of an entity responsible for the data office during this huge digital transformation. So that's how I found my love for data, from that aspect, a non-traditional CFO, but still. Then I worked for a software provider, one that works internationally. After around three years there, I decided it was time for me to do my own thing, and I had a few months to explore what was really best, to follow my dream, so to speak. The topic of Data Girl and Friends actually found me through privacy, especially online security and AI, because I was following some industry leaders, particularly Bill Schmarzo, and he wrote an article to help kids and youth understand the importance of privacy and how they're sharing their data everywhere. I just thought, I bet this would come across better as a video. So I asked him if I could create a video based on his content, and he said yes, so I did, and it went over much better than I expected, because it was just fun on my side. That was the beginning of Data Girl and Friends: I had that background in data, and I'm a passionate parent who wants to make the world a better place for the next generation. So I combined those two experiences and created Data Girl and Friends.

Debbie Reynolds  03:37

Well, I definitely think people should check it out. I absolutely love it. There are a couple of things I really love about it. First of all, there seems to be hardly anything out there for children about privacy and how to protect their data, so just getting that message out there is interesting. But I also think it's challenging to send a message in a way that a younger person would understand or care about. So tell me a little bit about how you manage that in Data Girl and Friends.

Angeline Corvaglia  04:18

Well, I've always had the ability to take complex concepts and simplify them. That was just something I was able to do, and with children, I have a young daughter, so I can see what resonates with her. I take that as an example, and I also watch other parents and other kids and what they're interested in, and I try to recreate that. What's interesting about what you said is that there's not much for children. Something else that I discovered, and I see very often, is that we adults think we have to teach children online safety, privacy, and AI, but actually, they often know it much better than we do because they're the digital generation; what we need to do is fill the gaps in their knowledge. So my whole concept is to help create a link between the different generations: the younger generation, who are digitally savvy, and the adults. I've also seen a lot of research showing that children often don't listen, especially about privacy; I think it's the hardest one. They don't listen because they think they know better, and they often do know better, but there are really important pieces of information they don't know. We adults maybe don't know the digital world so well, but we know the world. We know how to keep ourselves safe. We know the importance of our personal data and what can be done with it. That's the aspect we can bring into it, and that's my whole concept: bringing the generations together and getting them talking. Short, one-to-three-minute videos should trigger conversations where the kids and youth bring in their knowledge and experience, the adults bring in theirs, and we all figure out how to make the digital world a safer place.

Debbie Reynolds  06:20

One thing that you said, which I love and would love for you to talk a little more about: adults understand safety and understand the world, as you were saying, and kids understand the digital world. Especially when we see things involving children, like some of these sextortion cases, you're right, kids do understand the digital world, but in some ways they may not imagine the type of harm that can happen to them in the physical world. What do you think?

Angeline Corvaglia  06:56

Yeah, exactly. I think sextortion is a really good example. It's easy to understand, and I think anyone can think back to when they were teenagers and imagine how it must be, because there have always been predators. We as adults have been living our entire lives first learning to protect ourselves and then learning to protect our children, and often we talk about stranger danger. When I was a kid, it was someone who might pull up in a van, offer you ice cream, and pull you into the van. Obviously, sextortion is something completely different. The kids know it; they are aware of this risk. But what teenagers aren't aware of is their inability, because their brains are still growing, to really think things through, to see the consequences, to understand that it can happen to anyone, and to understand that they are much more likely to follow their impulses. This is what we can give them as adults: helping them understand that, through no weakness of their own, teenagers are programmed to learn to be independent, and learning to be independent means taking risks, following your impulses and instincts without understanding the consequences. That's how sextortion can often happen. For me, it's a completely natural conversation. My daughter's not much online; she doesn't have any contact with people online that she doesn't know in person, but I always tell her my sentence: I know you think you're ready for something, but actually, you need to trust me that you're not ready for certain things. That's the beginning of the conversation: you feel ready to face these risks, and you feel ready to recognize who's safe and who's not online, but you're just not there yet. You have to trust me as an adult; I have experience in keeping myself and others safe, and we need to find a solution together for how to help you do that online. These are mindsets and conversations that need to happen, especially around sextortion. I don't think people who haven't lived through it can even imagine it. I personally can't imagine it; nothing anywhere near it was even possible when I was growing up. And if we can't imagine it, then we can't really keep them safe without giving them the tools to do it themselves.

Debbie Reynolds  09:45

I agree with that. What is happening in the world of privacy that's concerning you now? Something where you see it and think, wow, maybe I need to talk about this, or I just need to think about it more, or just anything. What do you think?

Angeline Corvaglia  10:05

That's a hard one because so much is going on. But what concerns me the most at the moment is AI chatbot companions, because a lot of people, not just children but also adults, are using them as a replacement for human connection. That's not actually what concerns me the most, though. What concerns me the most is that they are telling these chatbots things they would otherwise only tell their most trusted friends, their psychiatrists, and their partners. They're using them for sexual gratification, so to speak; they're using them as therapists, anything you can imagine, and this is all data that is going to a machine and the people behind the machine. These are private companies. You know the privacy landscape better than I do, but I don't know that this data is protected at all, and there's so much opportunity to manipulate and blackmail people based on the most intimate details they're giving to these chatbots, thinking, anyway, it's just a machine. It's not just a machine. It's a bunch of people behind the machine who have created it to collect your most private thoughts, and what can become of that really concerns me.

Debbie Reynolds  11:40

I agree with that. I saw an ad in an app that was like, hey, let so-and-so be your AI boyfriend. They have a picture of a guy that teens like, and it's a chat thing, almost like WhatsApp, where you're chatting with the person. My concern with things like that is always that people will be manipulated by these bots, which are really trying to gather as much information as possible about the person. In addition to not knowing where this data is going or who it's being shared with, you don't know how far the manipulation can go, whether they're trying to get money from people or have them do things they would not otherwise have done. I think especially when you're dealing with things like voice and visuals, even if you think you're smart enough to know this is not real, something happens in your brain that says, yeah, this is real. It's almost like when you go to a movie and see something that startles you; you jump. You're not interacting with anything, but your brain is reacting to what you're seeing or hearing. So that's the concern I have with AI chatbots and children.

Angeline Corvaglia  12:57

I have a workshop that I started recently for parents. I call it AI for Parents, and it basically just teaches parents the basics, the real basics, of what they should know about AI: what it is and where it is. I have one section on what AI can and can't do, and I take a one-and-a-half-minute video from OpenAI's presentation of GPT-4o, where a guy has a conversation with the now-infamous chatbot voice that sounds like Scarlett Johansson. I go through it and pause it to explain: it sounds like a person, but it has actually just learned this language, to really help them understand. I always call AI "it". It's just an it. But when I do that exercise, I keep saying "she", and it seems like a small thing, but it's so convincing. As you said, in my mind, I know it's an it, but when I think about the video and about this guy having a conversation with a seemingly sentient chatbot, I slip into "she". It just seems so real to my mind in the background. That's exactly the risk: even if you know it, something in you is convinced, is fooled, because that's not something our minds are used to having to deal with.

Debbie Reynolds  14:22

That's true. What do you find is the thing that parents or adults find the most challenging when they're talking to children about privacy?

Angeline Corvaglia  14:34

Well, I think the biggest challenge adults have is getting through to the kids, especially teenagers. Obviously, it's hard to get through to teenagers anyway, and as I said earlier, they know a lot more than we think they do and give them credit for. But I think the challenge often comes from needing a different way of parenting compared to the past. What I mean by that is that there are definitely parents who just don't know, and it's hard for parents and adults to admit they don't know things. It's hard in any business situation to be the one in the room who holds their hand up and says, I'm sorry, I don't understand what you're talking about, because a lot of societies treat that as weakness. But that's the problem. I think the difficulty for parents and adults speaking to youth is that we have to say, I don't know. We have to say, I do not know this digital world. Unless, by chance, you're really a deep expert in this, chances are you don't know, and you have to take a step back, say that, and work together, and it's hard to say that to your kids. Another thing related to that is that you also have to not be afraid to talk about things you don't know about, because a lot of parents say, I don't know, so I'm afraid; I don't know what to say, I don't know what to do, so they hesitate to step in. That's another big challenge: giving parents and adults the courage to speak about something they maybe don't even know about and to find solutions together.

Debbie Reynolds  16:26

I knew of someone, and I think I told you this during our lunch, a parent who basically had a rule that all the computers stay in the common areas of the house and are supervised by the parents when the kids are using them. The kids, if they have phones, cannot take them up to their rooms or anything; they have to be in a common area. I would not have imagined anything like that growing up. But growing up, I was very sheltered by my parents, so we didn't have a lot of interaction. I wasn't having slumber parties at other people's houses or going to wild parties; that just was not the way we grew up. So I think I took for granted how sheltered I was then. But for parents now, if their kid is on the phone with someone or chatting, you don't know: are they chatting with someone they know from school? Are they chatting with a stranger? Are they being manipulated by a bot? I think it just makes it more challenging for parents to have that openness and that dialog, because, again, like you said, kids think they know more than they do. I think it's going to be harder, but we have to try to get parents a lot more involved in what kids are involved in.

Angeline Corvaglia  17:58

Yeah, exactly, and what you said about not being able to imagine it is really interesting. I can't imagine it either, and I think that's something we underestimate a little bit, and it's causing strife between the different generations. Parental controls are definitely necessary, but there's a question of how much we can control our kids, or rather, how much we can observe them. As you said, I live in Italy, so maybe we're not so advanced, but the first time I heard about this was my cousin, who has a 16- or 17-year-old. He had a device connected to his son's car, and he knew everything: every time he braked, how fast he was going, how hard he braked. It's what has since been in the news a lot, how much data these cars are collecting. He had it all in an app on his phone, and I was like, wow, and he said, well, the insurance company gives us lower rates if I have this on my phone. I said, wow. Can you imagine being 16 and your parents knowing exactly how fast you drive in every given situation and how much you're braking? How difficult that must be for them? Where is the independence? We didn't have anything like that. We used to do dangerous stuff, and there was no way our parents could have found out unless someone saw it. It's the same online: going online is a way for them to be independent, and they want to reach for independence, so they're going to do things that sometimes they even know their parents don't want. And obviously, if we as parents have the most advanced parental controls, we can see everything they're doing. Well, not quite everything, because the tech companies are quite smart, so there are workarounds built in, and the kids find the workarounds. But this is a level of control and surveillance by parents that is nowhere near what we had as kids, and I can understand that kids would not like that very much and would try to get around it. So it's a very important balancing act: helping them understand why we need to help them, while also giving them some freedom, as I said, through empowerment, so that they are able to help themselves. It's really difficult.

Debbie Reynolds  20:36

Yeah, absolutely, and I think as things get more complex with AI and all these other new things happening, it's going to get tougher for parents. I was working with a company, advising them about an app for children, and it was so funny because I had to talk to the developer, and they couldn't understand why I said, you can't have kids chatting in unsupervised conversations. You just can't do that, and in some jurisdictions there are laws around that, but they just didn't understand. They think, oh, it's so cool, they get into a room and they chat. It's not like being in school, in the hallway, talking with people. These things are being recorded. You don't know who these people are, depending on who's in those rooms, and you're not sure what information a kid may divulge. I've seen things like scammers and people doing criminal activity; they may ask something that seems really innocent, like what's your favorite color, but they really want that information because they're trying to impersonate you and get into your accounts and things like that. What are your thoughts about that?

Angeline Corvaglia  22:02

Those are great points. I think that with privacy, the average person, and kids especially, can't imagine how much information is already out there about them. There are often just little pieces, as you say, seemingly insignificant information, that end up being the last piece or pieces of the puzzle, and most adults aren't able to handle that, much less kids. I always think back and try to remember my childhood, a while back now, and as a kid you don't know what you're allowed to say and what you're not allowed to say. Obviously, the younger the kid, the less they understand what they're allowed to say and what they're not. And kids are online really early, earlier than I thought before I became a parent. They're in online games, and a lot of parents aren't aware of the chat function in the games their kids are playing. It's not to blame anyone; the kids just don't understand that they're not supposed to say certain things, and it's definitely a challenge. It doesn't help if the app providers, as you say, don't really appreciate that, not all of them.

Debbie Reynolds  23:24

Now, that's true. What is the thing that you're able to express with Data Girl and Friends where you feel like, once people see it, it just clicks with them?

Angeline Corvaglia  23:38

Well, I'm going to start with one that was unexpected for me. When I created Data Girl, she happened, by chance, to look like me, which, psychologically, makes sense: I created what looks like me. Then I created a companion, Isla the AI Girl, and I just thought, well, she needs to look different from me. That's all I thought, that I can't have two characters that look the same, so I made Isla with darker skin, just different. And lots of people came back to me and said how important it is for kids to be able to see themselves in the characters I'm creating. Now I'm exploring different characters. I was working with a fantastic woman, Constance, in Nigeria, who was creating privacy exercises for a school, and she asked me to create characters that looked like the kids in the school. I got some help from her, because obviously I've never been to Nigeria, so I didn't know, and she said this is so powerful for getting the message across: they see a character that looks like them, or, in the case of the younger ones, something fun related to something they like. That's also why I created a set of animal friends for Data Girl and Isla. So that's the thing that surprised me. Maybe it shouldn't have, but it did: not only should the messages be short, because kids have shorter attention spans, often thanks to social media and YouTube and those kinds of things, but they also need something that reminds them of themselves or of things they love.

Debbie Reynolds  25:36

Yeah, I like that too. I definitely like the characters, and I like the things you're doing. When I saw it, it was a light bulb moment, because even though we say it's for kids, these things can be very instructional for adults, so that they understand what their kids are learning. But there are also older people who may want to connect or relate more to what the younger generation is doing, so they're learning too. I think it's a multi-tiered learning experience from different angles. What do you think?

Angeline Corvaglia  26:12

Thank you, really. Your opinion means a lot to me. As you were talking, this reminded me of another concept related to what I said before, that people often don't want to admit they don't know things. When I started creating videos, I didn't know these things myself, so I would create the video to teach myself the concepts, and I would just make sure it's in language that kids and youth understand, because these are often concepts you sort of know, like algorithms. You know that algorithms are everywhere, somehow influencing our lives and taking our data, but it's hard for a lot of adults to understand. So that's one thing I also consider with my videos: the adults might not know this either, and they could be learning it together with their kids. I think it's a great way to learn together. That's what I try to make it.

Debbie Reynolds  27:19

I think it's a fantastic way to learn together. So I definitely support your work, and I love the things that you're doing. But if it were the world according to you, Angeline, and we did everything you said, what would be your wish for privacy anywhere in the world, whether that be technology, regulation, or human behavior?

Angeline Corvaglia  27:44

I wish there were more regulation. I just think that tech companies need to be given stricter rules about where the boundaries are to keep people safe. I do see a difference between when I'm in Europe and when I'm in the US, even if it seems small: in Europe, I never open a website without getting to select the cookies, and in the US, I open websites and I can't select the cookies. The other day, I downloaded an app because I needed to go to an amusement park and get the tickets. I was looking for the privacy policy in the app, and I was looking to uncheck some boxes, because obviously there are always too many boxes checked, but there was no privacy policy in the app. I looked everywhere, and I thought, this wouldn't exist in Europe. I just went there for a day, and I deleted the app afterward and hoped it didn't get too much information in that one day. That really was an eye-opener for me. So I wish there were more rules around that, but also different attitudes. I wish people would appreciate this more; we've grown a little too used to sharing our data, and it's hard to get through to people. As I said earlier, I believe privacy is the hardest one to help people understand. Of AI, cybersecurity, and privacy, it's the one where the advantages are just much better sold than the disadvantages. We talked a lot about that when we met for lunch; it's really, really hard to get people to take the risks and the disadvantages as seriously as they take the advantages.

Debbie Reynolds  29:37

Yeah, I think that's true, mostly because sometimes people don't imagine what bad things could happen if they gave out the information, just like the example I gave you about scammers asking, what's your favorite color, or what's your dog's name, things like that. It seems innocent, in a way, because your head thinks, well, no one will ever do anything with that data; that's not important. But we know better, because we know that people can take that data and other things and do different things with it. So maybe it's a lack of imagination about what could possibly happen or what those risks would be. But yeah, we'll work on it together, for sure, and get that message out there.

Angeline Corvaglia  30:25

Stronger together.

Debbie Reynolds  30:27

So if people want to get involved or learn more about Data Girl and Friends, what's the best way for them to reach out to you?

Angeline Corvaglia  30:35

The easiest place to find me is LinkedIn, either my profile or Data Girl and Friends, and there you can find the links to my websites. In theory, I'm on other social media, but I'm not a big social media fan, so you probably won't find me there very often. But yeah, LinkedIn, or there's my website, Data-Girl-and-Friends.com, with dashes between the words, and you can find me there in case you're not on LinkedIn.

Debbie Reynolds  31:03

Excellent. Well, thank you so much. It was a pleasure. I love the work that you're doing, I'm always happy to support it, and it was great to be able to meet you in person in Chicago.

Angeline Corvaglia  31:14

It was, yeah, and I think that every Data Girl should aspire to be the next "Data Diva", because that's the next step. Seriously, I was thinking that before: first there's Data Girl, and then she can become "The Data Diva", not to infringe on your trademark, but it's something to strive for.

Debbie Reynolds  31:35

Oh, that's so sweet. Thank you so much. Well, it was a pleasure to talk to you, and I'm sure we'll find ways we can collaborate in the future.

Angeline Corvaglia  31:43

Thank you. Looking forward to it.

Debbie Reynolds  31:47

All right. Talk to you soon.

Angeline Corvaglia  31:48

Bye.
