Debbie Reynolds Consulting LLC


E208 - Jesper Graugaard, Father to the Danish Chromebook Case / Danish GDPR Activist

Find your Podcast Player of Choice to listen to “The Data Diva” Talks Privacy Podcast Episode Here


The Data Diva E208 - Jesper Graugaard and Debbie Reynolds (1:01:30) Debbie Reynolds


[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello. My name is Debbie Reynolds. They call me "The Data Diva." This is "The Data Diva" Talks Privacy podcast, where we discuss data privacy issues with industry leaders around the world, with information that businesses need to know. Now, I have a special guest on the show, all the way from Denmark, Jesper Graugaard. He is the father of the Danish Chromebook case and also a Danish GDPR activist. Welcome.

[00:37] Jesper Graugaard: Thank you, Debbie. Very honored to be here. It really means a lot to me that not only you, but also people in other countries, show interest in the case. So I'm very delighted to talk to you about the subject, and I'm really looking forward to a nice conversation about it. Thank you.

[00:55] Debbie Reynolds: Yeah, so am I. Well, we've been connected on LinkedIn for many years. I really enjoy a lot of your commentary and the things that you post. And as this Chromebook case has continued, you've always updated the network about what's happening. And I thought it'd be really interesting to have you on the show, first of all, just to introduce yourself and how you got into being a GDPR activist. And then after that, we can start with what's happening with the Chromebook case.

[01:25] Jesper Graugaard: Yes. Well, my background for all this is not technical. I'm not a lawyer. I'm not a tech guy. I see myself as part of the generation that gave everything away. I was one of the guys who sent emails to each other to get into Gmail and get out of Microsoft. And all my friends embraced Facebook and social media. And I remember once being upset that we walked naked into the digital age. And that's really what I did. Ignorance was bliss. I knew there was something happening, but I really didn't care. And also, when I started to think about privacy and data and social media, I kind of thought, well, I gave everything away. But when I got children, I thought, maybe I can teach them to act differently, because they will grow up in a totally digitalized age, whereas I still have some references back to when times were analog. So for me, going into this world, I was naked. I had no idea what I was getting into. And it all started for me when my children started public school. At the age of six, they got a Chromebook. And I was very excited. First of all, I was excited that now they could finally learn what I didn't learn in school, which was coding and getting more into the digital age and learning about safety and privacy and all the kinds of things we can do with a computer. But that wasn't the case. It was a Chromebook instead of books. And that really surprised me. So I went to my school board and gave them a PowerPoint about what I had studied about coding, that you can actually introduce subjects like these at a very early age. And they said, well, we don't have the knowledge to do so, so that's not going to happen. And then one year later, my son came up to me and basically told me that somebody else had written a negative comment on YouTube in his full name. And I was like, no, you don't have a YouTube account, and certainly not, because I did not make one.
And I never gave consent that the school could do so. And then he showed me, and I was shocked to see that he actually had a YouTube profile with his full name, the school name and the class. And I was like, I have never had a YouTube account with my full name and, basically, where I live. So I immediately called the school, that was back in 2019, and I said, we have a problem. My son has a social media profile without my consent. And they said, no, no, it's just some filters; they are not supposed to see certain videos. I said, no, no, no. This was back when GDPR was new, and a lot had been written about it in the papers, so I knew a bit about it, and I could see this was a clear GDPR breach. This is not supposed to happen. And I explained to the school director that you have to fix this. And he came back to me a few hours later and said, well, it's not a problem. And I said, well, it is a problem. And if you don't solve this, I'll start calling the media, because this is a clear breach of personal data. And they fixed it within 24 hours. And I thought, well, great, done, we are happy. But I also started looking into the Chromebook, and I realized that there were a lot of other weird things going on. Like, he could share documents with other students outside his classroom. And I also knew that, under Danish law, there is a 72-hour disclosure obligation after a data violation has happened. So the municipality should basically have informed me and all the other parents at the school that this violation had happened, and they didn't do so. So I took this subject up with my school board, and I told them that we are supposed to do something about this. And they said, well, we follow the guidelines of the municipality, so the case is closed. And I said, well, what they are doing is illegal. It was an illegal violation.
I think morally we should at least tell the parents at our school; whatever happens at other schools is not my concern, but at least we could do that. And then a representative from the city council, who is part of the school board, told me that, well, we live in a digital age where mistakes can happen, and either I can live with it or find another school. And that kind of triggered me. Luckily for me, I have enough time to dig into this subject, because we decided to have a more relaxed work-life balance. I come from the music industry and my wife is a doctor, and early on, when we got twins, we realized that one of us needed to stay home. And in the music industry it's much tougher to make money, so I decided, okay, I'll do that. And also because we traveled around a lot, we moved to Greenland at various times, my work in the music industry kind of got dropped. So I certainly had time to dig into this subject. And I started digging into the Chromebook. I started talking to other parents. And what we realized is that Gmail accounts had also been set up with their full names, without consent. And we asked whether they had actually done any risk or impact analysis, which they are supposed to do, especially after GDPR was implemented in 2018. And they told us that because Chromebooks were introduced in public schools in Helsingør before the GDPR, the law didn't apply. So there was no compliance. And that made me understand that it was far more serious. It was not only that 8,000 children in Helsingør public schools had got their full names on YouTube; they didn't really understand what was going on with compliance with the law. So I started asking around about what I should do. And I talked to some friends who knew a bit more about technology and the law, and they said, you should contact the Danish DPA and file a case.
And I called them, and they told me how to do it and how to make sure I could document the process and also give the local DPOs a chance to make things right. And as that didn't happen, I filed the case in December 2019. And with my background in the music industry.


[10:19] Jesper Graugaard: I thought, well, let me use the same strategy: if I have an unknown band and I want to get it into heavy rotation, what do I do? So basically, I started pushing my case to the media as well. And by January 2020, the first articles were published in one of the big national papers, Politiken, and then the case exploded. Then it turned out, as I thought might be the case, that if this mistake had happened here, it might have happened somewhere else, in other schools that use Chromebooks. And within a couple of months, it turned out that 52 municipalities had the same problem. So suddenly it was a big case. And I was shocked to realize that in the Danish public school system, which is dominated primarily by two companies, Microsoft on one side and Google on the other, at least when it comes to the Google product, there was no compliance with the GDPR.

[11:37] Debbie Reynolds: Well, first of all, wow, that is unbelievable. I, like you, would have totally been horrified if I saw that my child had a YouTube account that I didn't know about, with their full information, and was also using email with their full name. I think a lot of times organizations are not really considering the potential harm or the potential risk of doing things like creating social media accounts or putting people's full names or school information online. I think a lot of times that is because they're thinking about how life used to be before the digital age, right? Where maybe you share stuff with your class and things like that. But having something with more of a global reach, that can be found by anyone on the Internet, that's definitely a problem. But what is your thought?

[12:36] Jesper Graugaard: For me, first of all, it was also a very invasive thing to suddenly get free laptops into our homes, because we had stricter rules about technical devices with our children. So I realized that when a child at the age of six gets a device and goes online, you need to educate them really, really fast. So I had, in a way, with a shotgun approach, to explain how to act and interact with the Internet, in a way that I knew they would simply not be capable of understanding. But I didn't have a choice, because suddenly they had this device, and I could not control it. I could not even download and install software to protect them, because that's simply not allowed; you cannot do that. And it was a bit weird and kind of scary. Especially now that I also have a girl, you have to be aware that there's a camera; you're not supposed to take pictures. Everything that is done on this computer goes into a cloud, so you should not use it for anything but schoolwork. And that's difficult to control. And they had to learn really fast about the dangers of being a child online. And therefore, when I filed the case, I also started to approach the politicians, both locally and nationally. I went to the committee of education in the government in 2020. I explained to them that we have this dominant company, known for breaking the law and using bullying methods to keep its monopoly, and we basically gave them free access (actually, we paid them to get access) to one of our key welfare solutions, which is the public school. And we spent years developing the public school; we have a lot of know-how in that. And suddenly I realized, and this is what I told them, that back in the day, the space for education was very intimate. It was a relationship between a teacher and a pupil. It was built on trust and understanding and sharing knowledge. And it was a very controlled environment. Suddenly that was interfered with.
Suddenly the relationship was not only between a student and a teacher; in between, you had not only big tech companies but hundreds of third parties delivering hundreds of different apps, and we have no idea what the data flow is. We don't know what's necessarily going on at a deeper data privacy level. We have been so focused on making Denmark a digitalized country, and the politicians are proud to say that we are the most digitalized country in the world, we are number one. But when it came to privacy and data issues, neither the law nor the education of the people working with it on a daily basis, especially when it comes to children, was really adequate; they had no idea how to really work with it. And in the government, at the committee of education, they didn't want to discuss these issues. They only said, thank you very much, but there is a case running. We cannot talk about a case that is going on with the Danish DPA right now. And I said, well, if not you, who then? I approached the committee of justice as well, and I have also met with the committee of digitization. And because the case has been running for nearly five years, no one in the political system really wanted to discuss the deeper issues, because there was a case running. They wanted to close the case. And that was, for me, very confusing, because I thought, hey, it says in the data protection regulation that children especially need special protection, but we didn't give it to them. I also went to the local committee of children's education, and they didn't actually know about it. And I realized that Google did this transfer of personal data without their knowledge. And I looked back in the history, and I could see that I discovered it in August 2019, but the transfer of personal data and the establishment of YouTube profiles had happened between three and six months earlier.
That made me also think about the fact that you have 8,000 children going to school, and only by accident did one parent discover the issue. So for me, it was kind of a shocker to realize that, as parents, we don't really know what device we have got into our home from the school system. We have no idea where to look for things that we could adjust. And even the municipality didn't know and didn't even check up on it. So if I hadn't discovered it, it wouldn't have come out in public. And during this process, politicians told me that I should go on LinkedIn, because the mayor of Helsingør was writing about the case, and she was in a big debate with DPOs and privacy specialists, and I should look into that. And yes, I had been on LinkedIn for many years, but since I started the music work, I had kind of lost interest in using LinkedIn. So I got back on LinkedIn, and I was like, wow, something is happening here. Maybe I should start writing about it here. And when I did that, my network exploded. And scientists, researchers, privacy specialists, from Denmark at first, they found me, and they were like, oh, you are that guy who started that one? And they were so excited, because what many professors and researchers had been talking about for years suddenly happened. And I very quickly became, you know, the human face behind it all. And I realized that the discussion and the debate about privacy, about data protection, about the value of data, had been going on for years, but it had mainly been going on in very closed academic circles, and it had been very difficult to get that understanding out to the public. But suddenly they were like, what we had tried to get out into the public debate for years, you made happen in six months. How did you do that? And I said, well, maybe because I'm not very well educated in the subject. So maybe I take issues that can very quickly become very technical.
I bring them a bit down to earth, and we put a face on the issue: here we have children who were violated in 52 municipalities, basically half of the Danish school system. The Danish DPA was also shocked to learn about this. But they were happy and thanked me for bringing it up, because we needed more awareness of these issues. Then, for a long time, the DPA worked the case, and it really exploded in the summer of 2022, when there was a ban. They basically imposed a ban for three months, because the municipality in Helsingør did not live up to the risk assessment requirements. So they basically told them that if you don't produce these documents, this work, we cannot allow Chromebooks at the start of the new school year. They produced those documents and sent them in to the DPA, and then they thought they were home free and could cancel the ban. And I said, well, it has to be the DPA who cancels the ban, not the municipality. So I filed a complaint with the local police and said that they were following an illegal procedure here. So in August 2022, they had a very special city council meeting to discuss what would happen if they broke the law the day after. And the lawyers basically said that there hasn't been any case like this, so there isn't any procedure for it, but you might get a fine. And it is not the city council that will have to pay; you will have to pay personally. So of course they said, okay, we live up to the ban, and for three months we didn't have Chromebooks in schools. And then, if we move forward a bit faster, it occurred to these municipalities that it was too difficult for them to handle locally. So they went to their lobby organization, called KL, which I can basically translate as the association of municipalities in Denmark. They took over the case and started the negotiation with Google and the DPA. And then by January, they basically told the DPA that we cannot adjust these issues, we need to change the law.
Because the Chromebook problem with handling private data was only the tiny tip of the iceberg; the whole public sector has a problem with the way we handle private data. But the DPA asked them to negotiate with Google, and basically they got a few suggested solutions: either Google is out, or Google adjusts and promises not to use personal and user data. And the negotiations were finalized here by June. And the status of the case is that KL has stated, on behalf of the 52 municipalities, that after 1 August 2024 the municipalities will no longer pass on personal data for maintenance or improvement of Google Workspace for Education services, ChromeOS, or the Chrome browser. And they are not supposed to measure the performance of ChromeOS and the Chrome browser, and the data is not to be used for development of new features and services in ChromeOS and the Chrome browser. That's a big deal. And many told me, you made Google bend. And I said, I don't know if I did that, but let's see what happens. What the change means is that the affected municipalities must refrain from using, and must shut down, the services and parts of services where personal data is processed in third countries where protection of data subjects' rights and freedoms essentially equivalent to the protection level within the EU cannot be ensured. This also applies to the service and maintenance of infrastructure on the supplier side, where processing of the personal data can take place, for the municipalities that are responsible for handling the data. But there's a tricky issue with that, and that is that the Danish school act regulation actually says that you are allowed to use data for third parties. But the third party, what is that? That is not really defined. It could be the health sector in Denmark, but it could also be a Google third party.
And so right now, I think we are caught in this challenge and dilemma: right now, as a normal human being who is not into law and not into all this technical stuff, I still have no clarity about what has been done. I even requested document access to the case and to the final deal closure between Google and KL. I got that from the DPA, and it is a totally blacked-out document. So we now have the closure of a five-year-old case about protecting children's privacy in public school and their user and personal data. We have a lobby organization that closed a deal with a private tech company, a deal that concerns a public welfare service, the public school, and that concerns minors from the age of six. It's paid for by tax money, but the public is not allowed to know what the deal is. The Danish DPA has also sent the final closure of the deal to the European Data Protection Board for an opinion, so that we know we are living up to European law and the GDPR, and we are waiting for that to come out here in September, if we are lucky. But what has really been the deeper, what can you say, journey for me is that I realized that I live in a society where we should understand the deeper impact of data protection and privacy. Because over the years, I've often been asked, also by journalists, does it really matter? They are using this Chromebook, Google services, and they're giving away data, but the children are also giving away data on Snapchat and TikTok and whenever they are on the Internet. And I said, well, you have to isolate this, because one thing is what you do privately; another thing is what you do when you enter the public school. There, you should be able to trust the government to ensure that the children's personal data is handled not only according to the law. And when I say not only according to the law, I mean that sometimes, or very often, the law is behind the tech.
So we have to be even more careful, especially when it comes to children, because I thought back then, wow, this must be Google's wet dream come true: they can get access to children at the age of six and collect and harvest data, their voice, everything they write, everything they search on, and they can just suck it in and use it for anything they want, and there's nothing I can do about it. And that still frightens me.

[31:35] Debbie Reynolds: I totally understand and agree, it is frightening. Well, I want to understand the scope of this case. You had said that changes were made in Google Workspace for Education. Does that apply just to the schools in this case, or to all of Denmark?

[31:56] Jesper Graugaard: Yeah, it's half of the schools in Denmark. Everybody has also been talking about the Microsoft 365 services, but there has still not been a data subject who has filed a case. But many believe that the same problem that we know from the Chromebook case also exists with Microsoft. I don't know, because I don't use it. But what I do know is that since Wired magazine wrote an article about this in the summer of 2022, the issue has grown across Europe. I now know parents in Holland, in Belgium, in England, in Spain, in Sweden, in Norway who are looking into this issue, who are asking questions, and who understand that when these services are used in public schools, they have no control over it. And since Denmark, compared to the US, is a very small country, we're only 5 million, then when you can get access to a couple of hundred thousand children's personal data, you are pretty much getting into the soul of a country. And US tech has become central infrastructure. Basically, we've given away our digital infrastructure to companies outside Denmark that act outside Danish law. And I think it's wrong. I think when you have a national school system, you should be able to service it with national solutions, because it's not only about the technology and whether it's easy to use; it's also the bias. We have a different culture, we have a different political environment than the US. We have a different way of running schools and educating. And it's a very delicate issue.

[34:21] Debbie Reynolds: It is a delicate issue. I want your thoughts about something. As you were talking, it touched on something I've been thinking about for a couple of months now. Do you think that when we talk about privacy and data protection, we also need to bring safety into the conversation?

[34:44] Jesper Graugaard: Yes, I think that's interesting. I also listen to a lot of your podcasts, and professionals say it very often: data is the new gold, the modern oil. And if it is that, then we should protect it, and we should be able to protect it at the individual level. Because when my children ask me about the case, they say, well, Dad, what is really the problem? Nothing happened. I mean, I wasn't hurt. I don't know anybody else who was hurt. There was no sexual abuse, there was no crime. And I said, well, that's true. But we know a lot about what data can be used for today; we don't know about the future, especially now that AI comes rolling in, also in public schools. We don't know what will happen with this personal data. And maybe it's not necessarily connected to that, but even my oldest child, who's now 14, got a dumb phone last year, and he is already receiving spoofing calls. And he is very worried about that: how did they get my phone number? How do they know who I am? How do they know that I have a phone? I said, I don't know. So we had to educate. I mean, the journey that I have been on is far from over. I was lucky, in a sense, that I opened my eyes. I was lucky that I got on LinkedIn, because within those five years, I have gone from being totally ignorant to getting access to, or getting connected with, people who really know what they are talking about. And it has been a fantastic help to be able to call professors at universities and researchers, getting to talk to, for example, Alan Woods or Pia Tesdorf, who, in your world, is known for, I mean, she has so much integrity in analyzing technical issues. She was the one who pointed me in your direction, for example. And if it wasn't for people like that, I would not have been able to educate myself that fast.
And when I had legal questions, or if I was going to file complaints, I could call up people at a very high level. And they talk about the need to get it out at a level where everyday people can understand the importance of protecting your privacy. Like John Kavanaugh said in one of your podcasts, when you asked him, and I really liked that question, what do you think about when you hear the phrase, I have nothing to hide? And he said, well, why then do you close the door when you go to the bathroom? We need to discuss these problems at a more understandable level, because the main problem discussed in public schools is actually not the Chromebook case. It's not the data issue. It's screen time, it's social media, that children are too much on social media and spend like six hours a day in front of a screen. And I think that's very important. But nobody really addresses the fact that we have two companies that have access to the user and personal data of children from the age of six onward.

[39:16] Debbie Reynolds: I think I'm also horrified about what artificial intelligence will do with this, right? Being able to gather that much information, having people who can spoof people's voices or gather enough information to, like you said, as your child asked, how did these people get my number? How do they know my name? All those things concern me. And from my perspective, I don't want anything bad to happen, right? I don't want anything terrible to happen to someone where we're just trying to address it after the fact. So I think it is very important to raise your voice and be able to say, hey, we don't think this is okay, here's how it impacts us. And I think the importance of your case, and you hit on it as well, is that it is very human. Right? It's not theoretical. It's not something that's just in a paper for an academic, or for someone who's talking from a point of theory. This is something that actually impacted you. And I think that makes the story resonate better. And for legislators as well: when I talk to people who work on legislation and things like that, those personal stories really do help, because I think you're right, being able to make a connection on a human level, so that people can understand what the impact is, is very important. What do you think?

[40:50] Jesper Graugaard: Yes, and I think it's one of the most important subjects that should be addressed in society today in general. And in Denmark, since everything is digitalized now, we need to have a different discussion than we have had so far. Let me give you another really relevant example. In Denmark, we have a travel card that we can use for public transport. And this week an app was published that everybody can download and use. The problem with that app is that it tracks you all the time, even when you don't use it. And that has been up for debate, and the DPA is on the case as well. But when it was debated in the news, one user was saying, well, I don't really care; I have so many apps that are already tracking me and already taking my data, so one more app that does that, I really don't care. And another one was saying, I don't understand why they need to track me when I don't use it, so I don't feel comfortable that it does, and I'm not going to use it. And that is classic. And I was thinking, how can you allow an app that is violating the GDPR to be put out to the public anyway? And then the DPA can run after the case, and they can raise a finger and say, we are looking at it, and you should think about whether you are violating the GDPR; meanwhile, we will look into it and come up with results at a later time. But by then the violation has already happened. And I mentioned it to my children and said, I feel like there's a burglar standing outside my house, and the police are telling him: if you go into that house and you steal Jesper's furniture, you should think about whether you are violating the law. The burglar knows he's violating it. He doesn't really care; he does it anyway.
And then the police can run after him after he has broken the law, even though they were standing outside my house and could see it happening. I don't get it.

[43:44] Debbie Reynolds: That's true. We have to change that, because I feel the future harm is too great; we shouldn't need to wait until something bad happens before we react. So to me, that's part of the reason why laws and regulations aren't a magic button. Obviously we need laws and regulation, right? But we also need people like you, people who are working on this in civil society, so that we can prevent things, so that it doesn't become a problem down the line. What do you think?

[44:22] Jesper Graugaard: But is it possible at all? Because what I see is, like with my case, Google and the municipalities adjusted. They said, okay, we will stop using and collecting that data. We promise not to let it go to third parties. We promise not to use it for developing new services. But so far, nobody is checking up on whether they are doing it or not. Secondly, why did I have to wait nearly five years? They knew it back then. They could have done it with a snap of their fingers. It's not difficult to say, okay, we made a mistake, we're shutting it down, we will not use children's data. So I am afraid that the law is always behind, that the tech is faster. And every time noyb, Max Schrems, files a case against Facebook and they are supposed to pay millions in fines, they never pay. They can drag the case on and on and on. And I thought, five years ago, that this would be done within a couple of months, and I could go back to my garden and my family and live happily ever after. But now I realize that I didn't manage to protect my children. In the early stage of the case, my children asked me, why are you doing this? And I said, well, you will grow up in a digital age where everything runs on devices, and your life will play out on various platforms, and from this behavior, so much data will be collected that when you finish school, they will have a digital copy of you. And it's my job as a parent to protect that digital copy, because that digital copy will be equivalent to your real person. By the time my children finish school, they will know exactly how they think. They will know what they are good at. They will know their academic weaknesses and their strengths. We're not capable of stopping them. I told my children that when I was 18, I did whatever I wanted to do; that was my own choice.
But you are not 18 years, so I have to make the right choices for you until you grow up and get old enough. Then you can go on Facebook, then you can go on Snapchat, and you can give everything away. You can take naked pictures of yourself, and you can do whatever you want. But until then, I am the dad, and I decide, but I cannot do that. I don't have the law to do it. I don't have a government who will help me doing it. And I certainly don't have tech companies who wants to do that, because they make a lot of money and they make a lot of technology where they can use those data. And one thing has been circling around in my head, and that is, how much value does the data that you can collect from a child age of six until they're 18, how much value does that represent in money? And do they get any. I mean, do they, do my children, do they get paid for what they give away for free or without their knowledge? No. They get a lot of technology. They get a lot of apps. Some of them are good, and some of them, they don't even want my children. They don't necessarily want TikTok. They don't necessarily want Snapchat. Maybe they want, but they didn't ask for it. It just came boom into their life. And also with Google products, they didn't ask for a company that could gain access to their papers and whatever they are doing on that computer. I still. I only had heartbeats at home. I don't use cloud systems. I have everything on hot disk. But my children, they don't have that because of the Chromebook. So I think we need to talk on a much bigger scale about privacy issues and how to protect it, but also the value of data. That's, again, an academic debate, whether you can say you own your own data or nothing, but at least have more control on the data you give away, and that you have a deep understanding the consequences on the long term, which is still abstract, especially with AI. I think it's very abstract.

[50:04] Debbie Reynolds: I agree with you on all those points. So if it were the world according to you, Jesper, and we did everything you said, what would be your wish for privacy anywhere in the world, whether that be human behavior, technology or regulation?

[50:18] Jesper Graugaard: We definitely need to educate ourselves when it comes to human behavior. We need to understand that the Internet is not what I grew up with back in the nineties. It was the Wild West, and everybody could communicate safely with everybody, and everything was for free, and knowledge was for free. And it turned out it wasn't. And legislation is the key, one of the keys, to protecting our behavior and data. But it's difficult to implement, I think. Denmark is a little country, and the school system that is buying, for example, Google products is only a little stone, a little corner of the big Google machinery. Why should they comply with what some Danish people in this little country of Hamlet are thinking or feeling, when the economy of Google is, I think, even bigger than the GDP of Denmark? I mean, we are just a little mouse down in the corner. And I don't know whether I can do much about it on a personal level. I find it frustrating, and that's the main problem, I think, right now: the politicians are not completely aware of how fast this is going. And when our business minister says that he is aware that children are cash cows for big tech, he only means it when it comes to social media. And I say, well, it's far more. It's beyond that, because you give them a free pass to your public schools. This is where the issue is. When it comes to social media, I can say to my children, well, you don't have a smartphone, and I have closed that issue. But I cannot do much about it when I send my children to public school, and I find it very frustrating. That's a simple way of saying it. But I was asked very early in this case as well, why don't you just take your children out of public school and put them in a private school where you could have more control over this?
And I said, well, then I've given up. I think that public school is critical welfare that we should protect. This is one of the foundations of our democracy, that everybody can gain access to free education regardless of their social background. And this is how you develop a positive society, where you trust your democracy and you have good citizens who are working collectively to make a better society. Maybe you know more about that than I do, but sometimes it has been very dark for me in moments, because the more you get into the business of data and privacy and how difficult it is on an individual level to make a difference, the more it can feel really depressing. And Alan Woods and Pia Tesdorf both told me to lay low, take rest, step away, go into your garden, have fun with your children, enjoy life, because data protection is a monster.

[54:55] Debbie Reynolds: It is. And I think what I've learned over the years, especially if you're trying to make change, is that it's definitely difficult. It's definitely daunting, but it's literally an inch-by-inch game. So you definitely have to pace yourself and, just like Alan Woods and Pia Tesdorf, whom I adore, by the way, said, you have to be able to have a life and step away and do different things. But it's hard. It's not easy. I thank you so much for your advocacy. I love your story, and I think that's why it resonates so much with people, because some people think, oh, you're just the legal person, or you're just the tech person talking about data. Being able to have your perspective as a parent who is really concerned about protecting your children, and the children in Denmark, I think you're doing amazing work, and I'm so happy that you were able to be on the show today, because I think this is a very important case, and I think it will have a bigger impact way beyond Denmark.

[56:02] Jesper Graugaard: Yes, I do too, and I thank you for showing interest in it. And I still find it very difficult to talk about, because there are so many layers in this case. I always tell people that it is, as you say, the cleanest case of GDPR violation, because it's children, and we should start there and start protecting them. But when you dig deeper into the layers, I get confused. There are many people who want me to give talks about it, and I think, maybe when I listen to this conversation after we have been talking, I might still find that I'm going in too many directions and I get lost. And I think that's one of the key issues, why many normal people don't really dig deeper into this: because they lose direction. And I think that's fair, because it's a very complex world technically, and with legislation it's really, really complex. And you can easily be swayed by arguments that might sound relevant, but then when you go further down into the contract, it says something different. And I hope that I can keep being inspired and learning, and also that my case has inspired other non-tech people, who just live their everyday lives with robot hoovers and smartphones, to start thinking about what is going on and whether you need those services at all. Do you need a hoover that will map your house? Do you need a fridge that can go online and shop for you while you are not at home, and stuff like that? Because a lot of it is convenient, but is it necessary? And at least for future generations, now that we know that privacy and data is big business, and we don't know what the consequences will be in the long term, maybe we should start changing behavior. Not turning anti-tech or pro-tech, but having a discussion that embraces it in a more positive but also critical way.

[59:09] Debbie Reynolds: I agree with that completely. Thank you so much for being on the show, and I applaud what you're doing. I support what you're doing. So anything that I can do to help, please do let me know. And I know that people will really love this episode because it really is a human story. So I love the fact that your takeaway was about humans and how we can change ourselves. We can make decisions ourselves. So thank you so much. Thank you so much.

[59:38] Jesper Graugaard: Thank you so much, Debbie. I'm honored to be part of your podcast, and I look forward to listening to it and also to hearing the responses. It will be really, really interesting. Thank you for your time, and thank you for inspiring me as well over the years; it's been one of the great gifts of this journey. So keep up your good work as well, and I'll talk to you again.

[01:00:10] Debbie Reynolds: Definitely. We definitely want to collaborate in the future. Thank you so much and have a good night.

[01:00:16] Jesper Graugaard: Yeah, you too. Have a good weekend.