E232 - Caroline Lancelot Miltgen, Social & Behavioral Scientist (PhD), Consumer/UX Privacy and Responsible AI Expert (France)
[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
[00:12] Hello, my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks Privacy podcast where we discuss data privacy issues with industry leaders around the world with information that businesses need to know.
[00:25] Now I have a very special guest all the way from France, Caroline Lancelot Miltgen. She is a social and behavioral scientist focused on consumer and UX privacy, AI, and ethics.
[00:41] Welcome.
[00:42] Caroline Lancelot Miltgen: Thank you. Thank you for having me, Debbie. I'm very happy to be with you today.
[00:47] Debbie Reynolds: Well, I'm excited to have you on the show. Your background and experience are so different and unique, and I love your profile. A lot of times I see people online, maybe you've made a comment or something, and it really draws my attention.
[01:04] So I reached out and asked you to be on the show, and you were kind enough to agree. You're an academic, you teach, you're a doctor, a Ph.D.,
[01:16] and I would love for you to give our audience a feel for your career trajectory and why you study the social and behavioral aspects of consumer and UX privacy.
[01:32] Caroline Lancelot Miltgen: Okay, so yeah, maybe I'll start with my studies. I was really interested in mathematics, more specifically statistics.
[01:44] And then I discovered that I wanted to do something more applied, more impactful. And I'm not saying that mathematics is not impactful to the world, but I wanted to do something more practical.
[01:56] So I decided to do studies on management and marketing.
[02:01] Specifically, I was really interested in understanding how consumers make decisions. I was doing a master's degree and had to choose a topic for my master's thesis, and I remember I didn't know what to choose.
[02:15] And at that time I was sure that I wanted to work on something related to technology and the influence of technology on society more generally. But I didn't know what exactly to study and my supervisor advised me to read a book about privacy.
[02:33] And at that time there were not so many books on the topic. And I read the book just in two hours and I decided, okay, that's the topic I want to study.
[02:43] I was really interested in the topic because I think it's a complex topic, and I like things that are complex. You really need to think from a multidisciplinary perspective. You can't address the topic if you only think as a lawyer or as a marketer; you really need a team with you to think about it.
[03:09] So yeah, that was when I started my career on privacy. After my master's thesis, I did a PhD on privacy.
[03:20] And then I became a marketing professor, still studying consumer decision making, but in relation to privacy. Of course I started to teach privacy and to do a lot of research on it, focusing a lot on consumers' decisions in relation to privacy issues: how people decide to give information to companies, to public institutions, to private institutions, what makes them decide to give data or, on the contrary, not to give that data, and so on and so forth.
[03:53] I have never regretted choosing this topic. Since my PhD in 2006, the topic has evolved so much. I was a bit afraid when I decided to choose it.
[04:06] I wanted a topic that, 20 years from now, would still be at the edge of what is of interest to society. And I was lucky with my choice, because we can see that it is still a very important topic for society, for companies, for consumers.
[04:27] Debbie Reynolds: Wow, that's amazing. Thank you so much for that. I am excited that you chose that as a topic. For me, privacy was a personal interest, and as things started to get more digital, I started to see the problems that we would have with privacy.
[04:45] So I think my interest followed along with yours. And I love the fact that you study consumers and consumer behavior.
[04:54] What is it about consumer behavior and privacy that maybe a lot of people don't understand? At your level, having taught it, having studied this really closely, what is an insight that you can give our audience that they may not know about consumers and privacy?
[05:16] Caroline Lancelot Miltgen: Yeah, I think the most interesting insight that you discover when you start studying privacy, and especially the consumer's point of view on privacy, is that most marketers or companies think that consumers don't care about privacy just because consumers disclose a lot of information.
[05:36] And so they make the assumption that, oh, because people give us so much information about themselves, it means that they don't care.
[05:45] And when you start interviewing people, surveying people, and I've done a lot of studies in Europe and in the US, I can tell you that that's not the case.
[05:55] What's happening is that people feel powerless and helpless. They do care about privacy. But regulation, which has been in place in Europe for some time and is now starting in the US, has not solved this problem.
[06:15] The consent paradigm that sits behind regulation, whether in Europe or in the US, is just not working. You know, people have no choice but to consent if they want to use websites or apps, if they want to use digital and online features or applications.
[06:40] So it's not that people don't care, it's just that they don't know how to protect themselves or they don't have the means to protect themselves because it requires a lot of effort.
[06:52] If you take the privacy notices from a legal point of view, companies can say, oh, look at my privacy policy, I'm compliant.
[07:02] Okay, they may be compliant, because they have put up a 40-page text that nobody but a lawyer can understand. But is that really transparency?
[07:11] No. If you interview people, they will say, oh, it's not transparent. I don't understand what it means. I still don't understand what the company will do with my data, how the company will use my data.
[07:25] So I'm consenting to something that I don't understand. I think there is still a lot of improvement to be done in this area. But the insightful part is that people care about privacy; they just don't know how to make better decisions, because there is still a lot of power asymmetry between consumers and the companies or institutions that want to collect data about them.
[07:53] Debbie Reynolds: The power asymmetry is very real. And unfortunately, with a lot of these laws, when you say that the regulation isn't working, here's what I see happening: whenever a new regulation comes out, I think, oh my God, now more work for the consumer.
[08:12] Right? More boxes to check, more things they're going to ask. That's just my opinion. So the company may have one privacy policy, but you as a user may look at a hundred websites.
[08:25] So now you have to make 100 or more different decisions every time you go to a different website. To me, I think that's wrong.
[08:33] Exactly.
[08:35] Caroline Lancelot Miltgen: The effort is asked of the consumer. And as long as we put the effort on the consumer's side, we will not make any progress on privacy.
[08:47] It has to be a common effort by public institutions, by companies, and by consumers. At the moment, it's too much oriented toward the consumer having to protect themselves, to read everything, as you mentioned, and to tick or untick every box, and so on.
[09:05] It's just not doable.
[09:06] Debbie Reynolds: I want your thoughts about opting out.
[09:09] So this is very different in the US versus Europe. In the US, a lot of times we don't opt in, right?
[09:19] So we just get things.
[09:21] But theoretically we can opt out, and opting out is very difficult. Right? The law does not specify that you need to make it easy for people to opt out. In Europe, at least, companies are supposed to make it easy.
[09:35] For example, it's supposed to be as easy to opt in as it is to opt out. But what are your thoughts about the opt out landscape in terms of giving people control?
[09:46] Caroline Lancelot Miltgen: Yeah, I think in Europe the opt-in model has, as you mentioned, in theory been there for a long time.
[09:57] Of course it seems easier or better for the consumer, because by default they don't share their data; they don't have to do anything to protect it.
[10:10] So of course it seems that the opt-in model would be better for the consumer, but it has not really proven to be true. I'm not sure it makes that much of a difference until consumers are better educated about what exactly it means and how exactly it works.
[10:36] Many people don't understand that when they give information about themselves to a company or to a website, it will go through many third parties.
[10:46] And so I don't think the model itself changes much until people really understand what is behind this choice.
[10:57] Debbie Reynolds: I think that's true.
[10:59] Your thoughts about the difference between the analog world and the digital world, and the way that we exchange data. An example that I like to give, one we see being used by lawmakers in the US, is around showing your ID card, right?
[11:20] So in the analog world, if you're going into a bar, you give your ID to show that you're a certain age, right? And some of the laws that we're seeing come up at the state level in the US are saying, well, you know, if you go into a bar, you have to show your ID.
[11:40] So now, online, we want everybody that goes to your website to upload their ID, right? That's totally different from one person looking at your ID, seeing that you're the right age, letting you go into the bar, and then not keeping your information, right?
[11:59] So you still have control of your identity information.
[12:03] You're not in any danger of having your data breached or anything. But then once you give it to another company, it's in a digital system, it gets copied a thousand times, it gets sent around a thousand times, it can be breached.
[12:19] You know, so to me, part of the challenge, especially in my view of regulation, is to really talk about how transacting with your personal information in digital life is different from the analog world.
[12:36] What do you think?
[12:37] Caroline Lancelot Miltgen: Yeah, I fully agree with what you mentioned. The issue with the digital world is that as soon as you have given some information, there may be many copies made and shared with many other players that you are not aware of, and that you will not be able to rein in even with the regulation we have now, because you will not know which companies or other parties now have access to, or have received a copy of, your information.
[13:09] So I agree with you that regulation should take this into account. If you just show your ID card when you are in a bar, they just look at your birth date and see whether you are over 21 or whatever the age is.
[13:24] And that's all.
[13:26] So that's really different. It reminds me of a model that comes from Northern Europe called MyData.
[13:35] At the moment the MyData model is just kind of a theory; it has not really reached the market yet.
[13:46] But the idea is that, instead of giving the information itself, let's say you want to vote online.
[13:54] And so to vote online you need to prove that you are a citizen of this country. You need to prove that you are more than 18 or 21, whatever.
[14:04] But it doesn't mean that you have to hand over this information, just as when you are in a bar, you show your ID card, but five minutes later the bar owner no longer remembers your birth date.
[14:18] So here the idea would be that, instead of giving your birth date when you want to vote online, or instead of handing over proof that you are from this state or this country, there would be a technical mechanism that simply answers the institution that is trying to authenticate you, to confirm that you are from this country and that you can vote.
[14:48] The institution would query the data that you keep in your own folder.
[15:03] So you give this institution access to your birth date just to verify: okay, she is more than 18 years old, she is from this country, she can vote. But the institution will not keep your information.
[15:19] So that's one kind of model, and I'm not saying it will reach the market exactly like this, but that's the kind of reflection we have to do, because as you mentioned, as soon as it's digital, there will be copies of it.
[15:19] And so it means that as soon as you give information, you lose control. And so that's the problem. The problem is the control.
[15:26] And so we need to think about the next regulation, or the next way to tackle this issue, of how we can let people really control their information.
[15:37] Because at the moment, even where there is regulation, this is still not fully addressed.
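To make this attribute-verification idea a little more concrete, here is a minimal sketch under assumed, hypothetical names (the class and functions below are invented for illustration and are not part of any MyData specification): the raw birth date and nationality stay in a store the person controls, and the verifying institution only ever receives a yes/no answer. A real deployment would rely on verifiable credentials or zero-knowledge proofs rather than a trusted local object.

```python
from datetime import date

# Illustrative sketch: the user keeps raw attributes in their own "folder"
# and only answers yes/no predicate questions; the verifier never sees the raw data.
class PersonalDataStore:
    def __init__(self, birth_date: date, country: str):
        self._birth_date = birth_date   # stays local, never transmitted
        self._country = country

    def answer(self, predicate) -> bool:
        # The institution supplies a question; only the boolean result leaves the store.
        return predicate(self._birth_date, self._country)

def eligible_to_vote(birth_date: date, country: str) -> bool:
    today = date.today()
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    return age >= 18 and country == "FR"

store = PersonalDataStore(birth_date=date(1990, 5, 1), country="FR")
print(store.answer(eligible_to_vote))  # True -- the verifier learns only this one bit
```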
[15:44] Debbie Reynolds: I agree. Control is an issue, and deletion is also an issue. Once data gets into digital systems, as we say, it is duplicated, it's spread around, and the organizations that hold it lose control as well.
[16:00] Right. So even when you have something like, at least in Europe, the right to be forgotten. In the US we have a right to delete, which is very different and more limited.
[16:10] Right. But even that doesn't solve the problem. Let's say, for instance, you upload your ID to a company and they make 100 copies. You say, I want to delete it. Maybe they only delete two or three of those hundred copies, or maybe they're just suppressing it in some way.
[16:32] So I agree with you that there should be more control for the consumer, and to me there should also be some type of end-of-life strategy.
[16:42] But in order to solve that, I don't think the answer is: give someone this data, let them make 100 copies, then chase around those hundred copies.
[16:55] I think at some point the information that you've given, like, can this person vote, yes or no, needs to expire or be extinguished in some way so that it doesn't have to be duplicated.
[17:08] Caroline Lancelot Miltgen: Exactly, I agree with you. And even in Europe, with this right to be forgotten, it is very difficult for a person to exercise it. There have been cases, you know, of people who die and the family wants the profile to be deleted.
[17:27] And it's just a nightmare to obtain that. So even with a regulation and a right, it's still difficult to get done. It means, I think, that the answer cannot be regulation alone.
[17:41] That's why, as I mentioned earlier, I think that's what makes this topic so interesting: if you only think of a legal solution, it will not work; if you only think of a technical solution, it will not work.
[17:56] So you need to think of it from a broader view. Okay, what can we do with the law? What can we do with technical ways to protect privacy?
[18:06] We need a broader perspective to start having solutions for these issues.
[18:14] Debbie Reynolds: I agree, and this is something I talk about a lot. It's very frustrating, because some people do think, well, all we need is a new tool, or all we need is more regulation, but that doesn't cover all the issues.
[18:27] And so to me, once you get things into digital systems, that's a challenge. So for me, part of it is what should go into digital systems in the first place. In the paper world, when you think about the way people handled documents, it was very curated, right?
[18:46] Not everything that you thought or everything that you said went into a folder, as you say, or into a file cabinet. Humans were curating what they thought was important, and then they protected it in those ways.
[19:00] And so now we have a lot of data that's important and not important, and it's all being put into systems and decisions are being made about people based on that information.
[19:12] But then it's not secured in the way that you would want it to be, like in the bar scenario. Okay, I want to go into this bar, I have to show my ID.
[19:21] I'm okay with that. That's an exchange where I feel like it's worth it to me to do. But when I'm transacting something and I'm uploading my ID to a website, I don't know what they're going to do with that data.
[19:38] I don't know where it's going to show up, I don't know if it's going to be breached, I don't know if it's going to be sold. And so, to me, the value exchange has changed.
[19:45] It makes it even more asymmetrical in my view. What do you think?
[19:51] Caroline Lancelot Miltgen: No, I totally agree. I have this little example that I give to my students, and it's about personalization, because there is a lot of debate about how companies need data: they need to know you to better personalize their offers, their promotions, and their communications.
[20:11] And companies use this argument to say, okay, look, people like things that are personalized; people prefer personalized ads. Okay, but compared to what? Of course I would prefer that you send me an ad about a product that I might be interested in rather than a product I have no interest in.
[20:34] There's no debate about that. But the question is, what do you need to make this personalization work without being too intrusive into people's privacy? Let's take this example: you go to a bar every day and you order the same thing, let's say a green tea or whatever, the same beverage every day.
[20:56] After several days, the person taking the order knows what you are interested in. But maybe this person doesn't know your name, your surname, your age, whether you have kids, or whatever.
[21:12] And you can imagine, if you come back the next day, you may not have to give your order again, because the person already knows what you'll have.
[21:22] So companies may not need that much information about you to make personalization work. And I think that's an interesting issue, this balance or trade-off between privacy protection and personalization by companies.
[21:40] Companies could do that without being so intrusive into people's lives, without knowing information that they don't need in order to send you offers.
[21:55] Debbie Reynolds: I agree with that, and that's my concern with personalization as well. Because as you say, they know a lot more information about you. They're taking the information you've asked to have personalized, but then they're also adding these other factors that may not even be relevant to what you want.
[22:15] And so to me, again, that's like more around what the company wants and not really what the consumer will want.
[22:22] Caroline Lancelot Miltgen: Exactly. That's why, in my studies, I wanted to really focus on the consumer side, to understand and to give recommendations to marketers: okay, you are going too far here. At the end of the day, it may not be that relevant,
[22:36] or, in business terms, that interesting for you to try to collect so much information about your customers. Because at the end of the day, technology is about trust. Of course, in the short term, people still agree to give information because they don't know how to protect it.
[22:58] But at the end of the day, if you somehow force consumers to give more information than they would like, they will start to be cautious, maybe not distrustful, but at least not fully at ease with you.
[23:15] So even from a business perspective, if you think about trust, companies would be more effective if they really took the consumer's perspective and the consumer's needs into account. I mean, that's what marketing is all about.
[23:29] When I started to study marketing, it was all about: you need to know your customers, you need to focus on their needs and answer those needs.
[23:37] So, yeah, I think there could be some progress here as well.
[23:41] Debbie Reynolds: So what's happening in the digital world that concerns you right now as it relates to privacy?
[23:48] Caroline Lancelot Miltgen: Well, I think the big topic is AI. Of course, AI is everywhere. And as with many technological advances, it's difficult to do without it. And companies are using AI every day now to target us.
[24:06] So even if you as a consumer say, oh no, I don't want to use AI, you are using it without really knowing it, or without intending to.
[24:16] AI has been here for a long time, but it is really progressing a lot. And in terms of privacy, of course it has an influence, because AI works only with data.
[24:32] And so at the moment, many consumers are using AI, ChatGPT, whatever kind of AI it is, or they talk with a chatbot.
[24:43] And these tools are using that data to train models and then to improve them. Not many people know that. Again, you were talking about opt-in and opt-out.
[24:56] You need to go into the settings of the AI to prevent it from using the chats you have with it to further train the model.
[25:10] So again, that puts the effort in the consumer's hands: to say, no, I don't want the AI to use my data to train itself.
[25:19] And there are of course many other privacy-related issues with AI. But again, in Europe there is a regulation coming that will be enforced in a year and a half.
[25:34] But again, that's not the only solution we can think of, or need, to protect consumers and society from the possible negative impacts of AI. We can think of how AI is having an impact on politics, on democracy, on misinformation and disinformation.
[25:58] There are so many issues that need to be solved. We really need more people to think about these issues. And here I may be biased, as AI can be biased, but I think that social scientists, psychologists, people like this are needed.
[26:18] We don't want developers and tech companies to make the decisions for us. We as a society need to have a voice on the use of AI, because how it is changing our lives is just incredible.
[26:36] And so I think it's a really important topic that needs to be addressed by regulation, of course, but also by society in general.
[26:45] Debbie Reynolds: I agree with that. I think there needs to be a multidisciplinary approach to looking at these problems, and doing it in silos isn't working.
[26:56] So I agree that there should be something like that.
[26:59] One thing I'm concerned about, and I want your thoughts, and I'm going to go back to your green tea analogy, relates to inference in data systems.
[27:11] So let's say you order green tea, somehow it gets in the digital system that you like green tea, maybe you order it from a store or something like that, right?
[27:22] And so the inference will be: okay, Caroline loves green tea, right? If you're a green tea lover, that's great. And so they say, okay, we're not going to show her these other beverages, because we know that she likes green tea and we want her to buy green tea. That's fine.
[27:40] Right? The problem is when these tools use inference to make decisions that could be harmful. So instead of green tea, let's say, well, we think that because Caroline goes to this particular part of town, she is an antisemite.
[28:04] Well, I'll give you an example. I give this example a lot, and it happened with Cambridge Analytica. In the Cambridge Analytica case, they had something called the KitKat project.
[28:15] There was a correlation that people who liked KitKat would also, say, give a thumbs up on Facebook to antisemitic messages. So the inference they were drawing, and they were drawing it in jest, is that people who like KitKat bars are antisemites.
[28:37] Right. So this is where inference can go off the rails in data systems. But I want your thoughts on it.
[28:45] Caroline Lancelot Miltgen: Yeah, I mean, the easy answer, and I'm not a data scientist, but the easy answer is that correlation is not causation, and there may be some spurious effect: in fact, there is no direct link between liking KitKats and that other trait.
[29:08] There may be some other variable in between that explains why there seems to be a direct correlation between loving KitKat and the other thing you mentioned.
[29:21] But in fact, it's because it goes through another variable. So that's more of a technical, statistical answer. But of course, that's the very issue with privacy. I don't care that a company knows that I love tea, because that's true and I don't think it's very sensitive.
[29:44] But as you mentioned, the problem is that behind the scenes, behind the fact that I like this specific item or this specific website or whatever, companies will try to use this information to infer what else you might be interested in.
[30:06] And by doing that, first, they may make mistakes, because there is not always a direct correlation. And then, is it really what consumers want, that based on the information companies have, they go beyond it and try to infer information that you consider personal and that you don't want companies to have, or even to guess about you?
[30:36] And the problem here is that whatever regulation we have, we will never solve this problem through regulation alone, or even through technical means. I think it's a matter of what society we want to live in.
[30:51] Do we think it's right that companies try to infer information about you to make you buy a specific product or service? When you interview people, they will tell you no, they don't like it.
[31:05] But that's the part where they really feel powerless and helpless. What can you do? I think the only way to protect yourself is to limit the data as much as you can.
[31:17] The data you give to companies, or the data you give about yourself, even on social media or any apps. That's the most effective way at the moment for consumers to protect their privacy: to give less information about themselves.
[31:33] Because I agree, there is the inference, the aggregation: they have some information about you, but then they buy other data sets to try to know more about you.
[31:43] And I think I understand why they are trying to do it: from a business perspective, they think there is nothing to lose.
[31:55] If you don't like it, you will just ignore the ad or the message or whatever. But I don't think that's true.
[32:03] From a company's perspective, I think that's too narrow a view, and too short-term a view. In the long run, I'm not sure consumers will continue to accept that.
[32:16] But at the moment, there is not much we can do to stop these things from happening.
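To make the "spurious correlation" point concrete, here is a small, purely hypothetical simulation (the variable names are invented for illustration, not drawn from any real dataset): two traits look correlated only because both depend on a hidden third variable, and the association largely disappears once that variable is held fixed.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hidden confounder, e.g. membership in some online community
z = rng.binomial(1, 0.3, n)

# Two traits that both depend on z but have no direct link to each other
likes_snack = rng.binomial(1, 0.2 + 0.5 * z)
endorses_message = rng.binomial(1, 0.1 + 0.6 * z)

# Naive correlation across everyone looks meaningful...
print("overall:", np.corrcoef(likes_snack, endorses_message)[0, 1])

# ...but it roughly vanishes once the confounder is held fixed.
for group in (0, 1):
    mask = z == group
    r = np.corrcoef(likes_snack[mask], endorses_message[mask])[0, 1]
    print(f"within z={group}:", r)
```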
[32:27] Debbie Reynolds: I agree. I'm a great advocate for data minimization, and really for that more curated, thoughtful view about what goes into those systems. So hopefully that catches on with people, because I agree that right now that's really the only viable way, in my view, to limit that exposure.
[32:46] But Caroline, if it were the world according to you and we did everything that you said, what would be your wish for privacy anywhere in the world? Whether that be technology, human behavior or regulation?
[33:02] Caroline Lancelot Miltgen: Yeah, I'm not a technical person, so I won't answer about the technical part, though I think there can be some improvement there to make things easier for the consumer.
[33:13] You were mentioning that we have to consent to cookies, for example, on every website. So here there are some easy solutions to invent and to implement.
[33:27] From a regulatory perspective, as I already mentioned, regulation is needed, but it can't be the only solution. So I think the solution is about human behavior, whether the consumer's behavior or the marketer's behavior.
[33:41] From the consumer's perspective, I think public authorities need to help people be more aware of what's going on. So there is really education that is needed.
[33:52] I think that's the word I want to emphasize: education is needed in this area. And here I'm talking about education for consumers, so that they make better decisions, but also education for marketers and companies, so that they understand that it's not always true that there's no harm done if a consumer just ignores your email.
[34:14] I mean, there may be some aspects, around trust, that marketers don't want to see. And I think that focusing more on consumers' needs and on trust would really be a good way to improve things.
[34:30] So two words, trust and education.
[34:34] Debbie Reynolds: I support that. I think that's true. So it's like we have to fight this battle one person at a time, and have that education be an ongoing thing, because we know the digital world is changing really rapidly.
[34:49] Well, thank you so much for joining me for this session. This was great. I don't think we've ever had a behavioral scientist on, and you have very deep knowledge in this area.
[35:00] And I love the way that you think about these problems, because I think about that too. I'm a technologist, but I still see that gap between technology and law.
[35:10] We're really not capturing that and we're not thinking about the human factor in all of these things.
[35:16] Caroline Lancelot Miltgen: I mean, it was really interesting to have this discussion.
[35:20] Debbie Reynolds: Yeah, it was amazing. Well, I hope we have a chance in the future to be able to collaborate.
[35:25] Caroline Lancelot Miltgen: Yeah, that would be great. Let's see what the future can bring.
[35:30] Debbie Reynolds: Yeah. Thank you so much. I'll talk to you soon.
[35:33] Caroline Lancelot Miltgen: Okay, bye. Bye. Thank you.
[35:35] Debbie Reynolds: Bye-bye.