E79 - Alexander Hanff, Managing Director at Hanff & Co AB
43:20
SUMMARY KEYWORDS
people, law, privacy, technology, directive, privacy directive, european commission, metaverse, services, eu, fundamental rights, companies, technologies, data, google, respect, consent, manipulated, european, cookies
SPEAKERS
Alexander Hanff, Debbie Reynolds
Debbie Reynolds
Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
Debbie Reynolds 00:00
Hello, my name is Debbie Reynolds. This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world, with information that businesses need to know now. I have a special guest on the show all the way from Sweden, Alexander Hanff; he is the Managing Director of Hanff & Co AB. He is a Data Protection genius extraordinaire and also a great technologist; I have such deep respect for him. Actually, before I started my podcast, I made a list of like 20 people that I had to talk to, and Alexander was on that list. And finally, I have him on the show, so you're fulfilling my dream right now. So hello, and welcome to the show.
Alexander Hanff 01:04
Hi, Debbie; you're making me blush. In fact, I am blushing, as you can see. It's a great pleasure to be here. I've been looking forward to this for some time. As you know, as we just discussed, I was supposed to do it some time ago, but medical issues got in the way. So it's nice to be able to visit now.
Debbie Reynolds 01:19
Absolutely. Well, why don't you give the audience a background of your journey in Privacy and Technology? I'm fascinated by this. For me, you stand out a lot because you have very deep technological chops, right? Because that's my genesis in data, right, around technology, that interests me a lot. So I would love to know how you built your career in technology and data protection.
Alexander Hanff 01:54
I mean, to be honest, it was completely accidental, which seems odd. I worked in technology for a long time, since the very late 80s, early 90s, and studied computer science and psychology for my first degree, a joint major where I was looking specifically at technology and how it impacts us as individuals, which is a really interesting way of looking at it. And I worked in tech for about 15 years before going back to university to look at the impact of technology on society as a sociologist, because I was becoming more and more concerned with technology being used to commoditize and control people, as opposed to democratize and empower people, which is what I love about technology as a technologist. And for me, it was really very personal because I came from a very poor family. When I finally managed to get to university, it was as a mature student, as an adult. I'd been separated from my family from about the age of 12, so I didn't have any of that support to go to university at the age when most people go to university. And it was really technology which empowered me to lift myself out of that social situation that I was in, you know, the disruptive childhood that I had. So it's something very personal to me, and when I discovered the Internet back in the early 90s, it really opened me up to the rest of the world, made me realize how small the world was and how important communication was, and it really transcended borders and prejudice and all these other things. I remember back in 1992, using the Helsinki University Free-Net service to chat with people on IRC for the first time and really being exposed to people and opinions and knowledge that I would never have been exposed to had I not been in that environment. So you know, it was something very personal to me, and it still is. And when I saw technology changing in this way, to being something more of a commoditization, you know, basically monetizing people online, I was deeply concerned. So I went back, as I said, to study the impact of technology on society as a sociologist, and those concerns rang true; I did a great deal of work writing about issues surrounding technology and democracy and privacy, surveillance, and things like that. So looking at things like the TeenScreen project in the US, which you may be familiar with, looking at the Patriot Act and the sunset clauses within the Patriot Act, most of which have since been resolved, and I really got very deep into, you know, issues that we're facing around technology and surveillance and human rights, and so on and so forth. And it was really interesting because my professors gave me the freedom to really scope my degree around these issues instead of following a set curriculum, so I was very lucky in that respect. And then towards the end of my sociology degree, I was looking at the impact of proprietary technology in the education system; how, it seemed to me, we had moved away from teaching people about technology and started teaching people how to use proprietary software, like Microsoft and Lotus at the time, and various other proprietary systems which were becoming dominant within the market.
And the funny thing was, I was looking at Linux terminal services and, you know, certain schools within the UK that had switched to open source technologies for their students, the significantly decreased costs, yet the outstanding results that they were achieving; they really found it very easy to switch from these proprietary technologies to open source technologies. So it was something I was a huge fan of, because I was a big supporter of open source technologies from the mid-90s. And then, by accident, I fell upon a forum post about a technology which was being used in the UK, called Deep Packet Inspection, which was being used by an ad tech company called Phorm. And they were literally putting this technology into the data centers and the communication networks of the biggest ISPs in the UK. And they were using all the browsing data, every single website you visited, to build behavioral profiles and then sell them to advertisers. People were rightfully outraged, and I did some research, as the curious technologist that I am, and fell down this rabbit hole of law. And before I knew it, I'd changed my dissertation, literally a month before it was due, to write a legal analysis of this technology under European law; this caught a lot of attention, was cited at various law conferences, and caught the attention of the European Commission. Because at the same time, I was running a grassroots campaign against this technology, trying to get the European Commission to take action against the UK for not appropriately implementing the European law. It ended up being very successful, one of the most successful campaigns the European Commission has ever seen, in their words. And we did that in a very analog fashion. So we didn't send emails; we sent faxes and letters. And you know, the impact of that was real: people had to process these real paper documents, which had an impact on the resources of the European Commission and got noticed very quickly. All of a sudden, they had five to ten thousand letters coming in; that's something that doesn't go unnoticed. So we got called to Brussels, and we had a meeting with Viviane Reding's cabinet at the time; Viviane Reding was very much responsible for this area of law within the EU. And we ended up having infringement proceedings against the UK that forced them to change their surveillance law. And then that led on to the changes to the ePrivacy Directive, which I know you're very fond of, in relation to the use of cookies and other tracking technologies, in 2009. And then obviously, that led to the development of GDPR in 2011. And it's really been a nonstop journey. Since then, I've been really heavily involved in the development of GDPR and the upcoming ePrivacy Regulation; I helped the European Parliament drafting team looking at these issues. And then I did three years at Privacy International when I finished my degree, specifically managing the ePrivacy portfolio, so dealing with things like the Google Wi-Fi scandal over Street View, other issues of deep packet inspection with other service providers, and so on and so forth. And I say it was accidental because it came out of nowhere. I became passionate about it, did a lot of research on it, wrote a lot about it, and before I knew it, I was swept into this tsunami of European policy and law.
And obviously, working at Privacy International gave me a huge pedestal to shout from as well and gave me access to a lot of resources and a network that I'd never have been able to attain as an individual. So I chose to stick with it. The way I thought of it was: if you have the opportunity to do something, then you have an obligation to do it; otherwise, you're contributing to the problem. So for the last 15 years, almost, my life has been nonstop privacy and data protection, both as a lobbyist in Brussels, but also working with companies, trying to help them overcome some of the issues they may face when it comes to implementing privacy by design and data protection by design within their organizations, working with startup companies who are very focused on privacy enhancing technologies, and obviously, giving a great deal of commentary in Brussels and on online forums like LinkedIn and Twitter. And yeah, it's a nonstop journey, a very personal journey, something that I take very seriously. It is essentially my life's work at this point, and I don't see that changing at any point in the future.
Debbie Reynolds 09:26
Wow, that is amazing. I didn't know that backstory, but it doesn't surprise me at all. People definitely listen to you because you have that deep technical background. For some people, when they think about data protection, it's kind of an ivory tower exercise, right? They don't really talk about actual things that are happening, and we know that those are the things that are most impactful to companies, right? So it's not just what you say that you do, right? It's what you're actually doing or what your technology is doing. I would love for you to talk a bit more about the ePrivacy Directive, and then the pending ePrivacy Regulation in the EU, for people who don't understand it. A lot of people in the US that I know call it the cookie law, and we know that's not really the full spectrum of what the law is, right? That's their shorthand for it, and that's the thing that they're really concerned about. And then people really didn't, in my view, get up in arms about the ePrivacy Directive until the GDPR, which is kind of head-scratching, because that Directive preceded the GDPR. Tell me about ePrivacy.
Alexander Hanff 10:52
The ePrivacy Directive is interesting and frustrating at the same time. As a directive, as you know, it's open to the interpretation of member states when they implement it into member state law, which was the problem we had with the old Data Protection Directive from 1995. And there's a reason why we now have GDPR as a regulation: to try and overcome some of these issues and have an equal state of play across all 27 member states. You know, the Directive came into effect in 2002. And the problem was that, despite the fact that it seemed fairly obvious how it should have been applied, different member states applied it in different ways. So it's been incredibly difficult for supervisory authorities or, you know, the regulators to enforce the law. And in many member states in the EU, it's not even the Data Protection Authority who is responsible or has the competence to enforce the ePrivacy Directive; in many member states, it's down to the telecoms regulator. Telecoms regulators are traditionally very business-friendly, so it's been very difficult to see the law enforced, and I've been lobbying for stronger enforcement of the ePrivacy Directive for as long as I've been working in privacy. My main focus has been on this. And the thing with the Directive is it really relies on the bedrock of European law, which is the principles of proportionality and necessity, right. So Article 5(3), which was the amendment made in 2009 through the Telecoms Reform package, states that you should only access information or store information on end-user terminal equipment if you have their consent, unless it's strictly necessary for the provision of the requested service. And that necessity, again, you need to consider under the framework, the framing, of EU law and what we consider a necessity. So a lot of the focus on this is on cookies, which is problematic, because it tends to divert people's attention away from other technologies which are equally as dangerous from a compliance perspective, and it has left the industry really in a situation now where they're scrambling to try and find solutions. Because we have these new powers under GDPR, you can effectively pursue any ePrivacy complaint on the basis of consent under the GDPR, because the definition of consent comes from GDPR. This empowers the regulators to take steps that they weren't previously able to take from an enforcement perspective. So we recently just heard, for example, that Google Analytics would not be okay to use in the EU, from the Austrians, and from the EDPS in its decision against the European Parliament. And this is a combination of GDPR, because IP addresses were being sent to Google via Google Analytics, but also this comes down to ePrivacy issues under Article 5(3). Analytics are not considered as strictly necessary in order to deliver the service as requested by the end-user. So in order to store that Google Analytics script on the device in the first place, on your mobile phone, or your laptop, or your tablet, whatever technology you may be using, including smart TVs and other smart devices, I should add, you are required to have consent, because the only legal basis under the ePrivacy Directive is consent. There is no legitimate interest or performance of a contract or anything else; consent, unless it's strictly necessary, is the only legal basis.
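To make that one-legal-basis rule concrete, here is a minimal editorial sketch (not from the conversation itself) of the Article 5(3) decision logic as Hanff describes it. The purpose categories are illustrative assumptions, not legal advice.

```python
# A minimal sketch of the Article 5(3) rule described above: storing or
# accessing information on a user's terminal equipment requires consent
# unless it is strictly necessary for the service the user requested.
# The purpose categories here are illustrative assumptions only.
STRICTLY_NECESSARY = {"session-cookie", "load-balancing", "security"}

def lawful_under_article_5_3(purpose: str, has_consent: bool) -> bool:
    """Consent is the only legal basis unless the purpose is strictly necessary."""
    return purpose in STRICTLY_NECESSARY or has_consent

# Analytics is not strictly necessary, so it needs prior consent:
assert not lawful_under_article_5_3("analytics", has_consent=False)
assert lawful_under_article_5_3("analytics", has_consent=True)
# A strictly necessary purpose needs no consent at all:
assert lawful_under_article_5_3("session-cookie", has_consent=False)
```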
So, for example, if we look at things like fonts, which many websites will download, or will force the user to download, from places like Google or Adobe, where they have these huge font resources available for free. Here's the clue: the reason why they're available for free is because every time you embed a link to those fonts in your website, it gives these companies a great deal of information about the people who are using these fonts and the websites that they're visiting. You can see that Google Fonts are on, you know, probably over a billion websites around the world. If Google knows every time you visit a web page which has their fonts on it, they know what that web page is. Therefore, they're able to include this information in the data profile they have of you, this digital copy of you, in order to better target advertising at you and, you know, to make more money off you. So it's an issue; we can't consider it strictly necessary, because the fonts could be stored directly on the host server, and the website could present this content to the end-user. There's no requirement; there's nothing within Google's terms or conditions which prevents you from hosting the fonts first party. So under what we call the necessity principle, if you can do something in a less intrusive way, or a way which is going to have less impact on some of these fundamental rights, then as a matter of law, that's what you're supposed to do. That's what we refer to as the necessity principle. So if you can store these fonts on local servers, first party, then that is the position you should take. So you shouldn't be using any third-party embedded technology unless it's absolutely necessary, and there are some circumstances where it might be, such as edge services in relation to information security, or prevention of DDoS attacks, things like that. That would be justifiable, because security is considered as something which is necessary to provide the service in a safe way. But things like Google Fonts and things like the Facebook pixel, and so on and so forth, they're not considered strictly necessary, and therefore you have to have consent. So there's been a lot of ignorance around this over the past, I would say, even the past 10 years since the changes happened in 2012, where, as I said, the focus has been on cookies. And there's been an almost complete ignorance in relation to other tracking technologies. And we must understand that when you're linking to a third-party free resource, it's only free because they're able to take that data and use it in other ways which are of benefit to them as an organization, which has an impact on their bottom line, right? So it's really important that developers and publishers are aware of the technologies which are embedded within their digital assets. And we're not just talking about websites; we're talking about apps as well, the SDKs and APIs which are embedded in apps. We're talking about, as I said, IoT devices or smart devices, like your smart TV or your smart car, and so on and so forth. So the risk is much wider than it is when we're talking about GDPR, because GDPR is only focused on personal data. Personal data has a fairly broad definition, but it's still defined in a way which can be understood fairly easily. But the ePrivacy Directive covers any information which traverses a public communications network.
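As an editorial illustration of that point, a minimal sketch (not a compliance tool, and not from the conversation) of how one might audit a page for third-party embeds of the kind Hanff describes, using only the Python standard library; the example URL is hypothetical, and any host this flags would still need a human judgment on whether it is strictly necessary.

```python
# Illustrative sketch only: list the third-party hosts that a page's
# <link>, <script>, and <img> tags load from (fonts, scripts, pixels).
# Under Article 5(3) as described above, each such embed needs consent
# unless it is strictly necessary for the requested service.
from html.parser import HTMLParser
from urllib.parse import urlparse
from urllib.request import urlopen

class EmbedScanner(HTMLParser):
    """Collects the hosts referenced by tags that pull in remote resources."""
    def __init__(self):
        super().__init__()
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        if tag not in ("link", "script", "img"):
            return
        for name, value in attrs:
            if name in ("href", "src") and value:
                host = urlparse(value).netloc
                if host:  # relative URLs (first party) have no netloc
                    self.hosts.add(host)

def third_party_hosts(page_url: str) -> set[str]:
    """Return hosts referenced by the page that differ from its own host."""
    own_host = urlparse(page_url).netloc
    html = urlopen(page_url).read().decode("utf-8", errors="replace")
    scanner = EmbedScanner()
    scanner.feed(html)
    return {h for h in scanner.hosts if h != own_host}

if __name__ == "__main__":
    # Hypothetical example URL; an embed served from, say,
    # fonts.googleapis.com would show up here as a third-party host.
    for host in sorted(third_party_hosts("https://example.com/")):
        print(host)
```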
So that means anything that somebody accesses over the Internet falls primarily under the jurisdiction of the ePrivacy Directive, which from a legal perspective is considered lex specialis to GDPR. So it sits as a special piece of law above GDPR and must be taken into account first. And as you know, the GDPR principle of lawfulness in Article 5 states that in order to be able to process personal data, that data must have been collected in a lawful fashion and must be processed in a lawful way. So if you have a law which sits above GDPR, like the ePrivacy Directive, and data is collected from people's devices without that consent, you then fail the lawfulness principle under GDPR and can't process that data any further in any lawful way; there's no legal basis available. So there is an interplay between GDPR and the ePrivacy Directive, but it's often overlooked and often misunderstood. And then fast forward to 2017, and we have the ePrivacy Regulation proposal put forward by the European Commission, which passed very quickly through the European Parliament. It was, I believe, November 2017 that the European Parliament plenary vote took place, with a draft that we'd been working on throughout that year, yet we're still waiting on the Council of the EU. I think the trilogue proceedings have now been formally initiated, but it took a very, very long time to get there; many presidencies have passed since 2017, bearing in mind the presidency changes every six months. And since then, the changes in the multiple drafts that have come out of the Council of Ministers have been particularly, let's just say, disappointing from the perspective of fundamental rights. You know, many would argue that it actually puts the ePrivacy Regulation at a level of protection significantly lower than the existing ePrivacy Directive, which is problematic, because one of the requirements is that it must not reduce the amount of protection that exists in existing law. So, you know, it's been very difficult. We're still going through the trilogue process; there's no indication when that's going to end. We would hope that it would end this year, but, you know, the parties are at loggerheads over issues such as tracking walls and implementing legitimate interest as a legal basis, and of course, as with any piece of EU legislation, the Council of Ministers have tried to slip in something on data retention as well, as they always do. Confidentiality of communications is another problem. So there are some very significant issues to overcome. And to some extent, it doesn't help that we have other legislative files currently, such as the Digital Services Act, which are touching on some of these things as well. So we end up with a problem where we have a lack of consistency across different laws if the trilogue on the ePrivacy Regulation comes through and the Council of Ministers have had their way and managed to strip out some of these protections, for example on end-to-end encryption. And then we have the European Electronic Communications Code, which became law in December 2020, I think it was, and which, in relation to what is considered a communications service provider, has now put over-the-top services like Skype calls, iMessage, WhatsApp, and various other end-to-end encrypted messenger services in the scope of certain issues surrounding lawful intercept, for example.
So we're seeing, there was an article in Romania just some weeks ago, where there had been an attempt by the legislators to implement a new law which forces a backdoor into end-to-end encryption for these types of technologies. So you know, there's this lack of consistency, and the constant battle between the two sides of the European Commission. You've got DG CONNECT, which is very much focused on single market issues and trying to boost the economy through the use of technology and technology-based services, and DG JUST, squarely focused on fundamental rights, and this constant clash between the two of them on how we can implement European law in a way which is consistent, which protects the value of fundamental rights under the European Charter, but still allows business to evolve and to generate revenues. So it's a really complex situation from a political perspective, which has led to issues such as Schrems I on Safe Harbor and then Schrems II on Privacy Shield, and all the action that's been going on around there recently. And it's not likely to go away at any point in the near future, which is frustrating for people like me who have been fighting this, you know, for 15 years; it's kind of a little bit tiring. I'm not a young man anymore. But at the same time, it creates a lot of uncertainty in the rule of law, and it creates a lot of problems for the individuals who live within the European Union, which is 500 million people, and their confidence in the law and what they can do. If we look at Max Schrems and noyb, they just announced today that they pushed a table online about all the cases they've filed and the lack of action on them. You know, this is a huge problem; it's very difficult for supervisory authorities to have meaningful enforcement when there's no consistency, in the case of a directive, across all 27 member states, when there are significant differences in how the law is applied in different member states. The obvious example being the Irish Data Protection Commissioner versus maybe some of the German regulators, or other regulators within the EU who are not necessarily happy with some of the decisions made by some of the other regulators. So it's very difficult for individuals to understand what their rights are, how to enforce those rights, how to protect those rights, etc., which doesn't make it any easier for anybody involved.
Debbie Reynolds 22:56
I recently did a video about the ePrivacy Directive, and I had postponed it for several years because I was hoping it would become a regulation, and then I would talk about it. But I just decided to go ahead and do it, because I'm like, I don't know when this is going to happen. And there's a lot of activity still going on around it, and people are still trying to fight it and figure out what to do with it. What is happening in the world right now in technology or privacy that concerns you the most?
Alexander Hanff 23:28
The Metaverse is obviously very topical at the moment. It's deeply concerning that there are no meaningful open standards currently being significantly supported and developed. There are efforts at some open standards, but they're not the projects which are seeing significant investment; we're seeing a lot of investment in proprietary systems around the Metaverse. To people like me, the Metaverse has been around for a long, long time. Anybody who's played an MMO, a massively multiplayer online role-playing game, such as World of Warcraft or any of the multiple other ones that are out there, understands what the Metaverse is. Second Life is a really good example of the Metaverse from the earlier days of the Internet. It's really this virtual environment or this world within the Internet itself which allows you to interact with people and other services and other things within this space, in a way that is closer to how we behave in the real world. There was a recent demonstration by Walmart of their online Metaverse version of the supermarket, where you go in and you pick things and put them in your trolley, but you don't pick them from a listing; they are 3D objects within that environment. You virtually pick them up, and you virtually put them in your trolley, which is actually a little bit cumbersome compared to the 2D environment that we're so used to when it comes to e-commerce, and it's something that will obviously be refined over time. But you know, in these environments, these virtual spaces, whether it be VR, whether it be AR, whether it just be 2D or 3D within the browser, we're making a much more, let's say, a much more real connection to the technologies that we're using; the lines between what is real and what is the Internet or technology are really starting to blur. And we saw this a long time ago, actually, when we started seeing things like intelligent billboards in, say, the subways in Japan, where they detect people's faces and provide them with an advertisement on the billboard which is specifically targeted towards them. We see it in smart bus shelters using similar technologies, and so forth. But now we're in an environment which is within our home; most of us will be accessing these either within the workplace or within the home. We're certainly not going to be wearing 3D goggles or AR goggles walking down the street, because obviously the risk of injury in those situations is quite significant. So we're mostly going to be using these environments within our home, using new technologies such as AR and VR, which means that automatically we're making ourselves more vulnerable, as anything which we're doing in our home which has got sensors and cameras and so on and so forth is opening us up to potential violations of, or risks to, those privacy and data protection fundamental rights. So we automatically need to be cautious.
And I remember writing an article a couple of years ago about things like the smart cameras in Xbox Kinect and smart TVs, and whether or not these create a vulnerability: for example, if the data is stored in the cloud, could the police have access to that data? And we know from experience that yes, they can, and not just the police, but many other institutions as well, which may not be of benefit to us, or may not be as respectful of fundamental rights, or have the protections in place, which is exactly what the Schrems II argument is based around in relation to US surveillance and intelligence agencies. So when we bring this into our home, we create more vulnerability; we're no longer necessarily in control of our environments, because the sensors in IoT devices, smart devices, give a great deal of information about who you are. And look at some of the patents which have been applied for and granted, such as the ones from Meta, and I believe there were some from Amazon published the other day as well. The types of things that they're looking at are things that people like me have been worried about for some time: things like being able to track eye movements to see what you're looking at, and being able to detect emotion using AI, which is a particularly difficult situation from a fundamental rights perspective, with the dangers that poses towards particularly vulnerable individuals, for example. And all these other technologies that they're now trying to patent to gain more intelligence about who we are and the things that we're interested in, purely for the purpose of being able to throw more advertising at us. It is worrying in itself how much we are manipulated by the advertisers, which we're seeing every single day; how much free choice do we have anymore when we're constantly being nudged by advertising to buy certain products and services? So from a data ethics perspective, from the perspective of sociology and social psychology, anthropology and digital anthropology, we're seeing significant risk to the quality of the self, to autonomy, to the individual being able to make decisions in a way which is meaningful, to self-determination, and all these other risks as a result of being manipulated. We know clear examples: the Cambridge Analytica scandal, where they were using psychographic profiling to manipulate and nudge people, using emotional cues to create an irrational response. And we see that through things like Brexit and the Trump political campaign, the presidential campaign, etc., where we know Cambridge Analytica was involved in manipulating people's opinions and emotions to get them to vote in a specific way or to nudge them in a certain direction, based on things like immigration, for example. So there are huge risks. I remember hosting a couple of events at the European Parliament on Cambridge Analytica, probably a year before the story hit the mainstream headlines. And it was a really interesting moment for me, because, as somebody who had been lobbying in Brussels on privacy and data protection for years, all of a sudden politicians were really interested when I told them that, you know, this company, or other companies like them, can effectively put you out of a job, not because of anything you've done, but purely through manipulating people to make irrational decisions based on emotional triggers.
They could end up voting for one of your competitors or somebody who's aiming to get that position over you. And it could be that you're a perfectly decent politician, you have great plans, your heart's in the right place, and you're working really hard for people's rights. But that doesn't matter if people aren't making rational decisions, if they're being manipulated in this way. And all of a sudden, politicians started taking an interest. And now, as you've seen, probably over the past couple of years, there's been a big focus on online harms, on things like manipulative messaging and fake news, and so on and so forth. So we're really in a difficult time, and I think it's not going to get any easier with the Metaverse if we don't take steps now to prevent giant organizations like Meta, like Google, like Microsoft, from going in there and really taking control of this new evolution of the Internet (I think that's probably the best way to describe it) without taking into consideration these rights and without making these technologies open and equitable to everybody. So I would say, yeah, the Metaverse is definitely a concern. I think manipulation by the likes of Cambridge Analytica continues to be a concern of mine. I think surveillance, and the knee-jerk reactions, you know, the limits or the restrictions which certain governments, particularly Five Eyes nations, are attempting to put on end-to-end encryption, is something which troubles me. But that's nothing new; that's been troubling us for a long time now already. So I think, generally, lack of enforcement of the laws as well. Keep in mind that we have some really good laws when it comes to privacy and data protection. GDPR is a principle-based law; it's a very technology-agnostic law, which makes it very easy to apply as technology evolves. But if that law isn't being enforced appropriately by the regulators, or they don't have the appropriate resources to be able to enforce it, which is the case in many situations, then it's meaningless. And actually, I would say it's even worse than meaningless; it actually damages fundamental rights. Because if organizations understand there are no consequences to their actions, they can just do what they want to do and profit from that. And they may get a small fine, and what's a small fine to a company making billions of dollars a year? Then there's no incentive for them to change what they're doing, there's no incentive for them to behave more appropriately and protect our rights and, you know, act as responsible custodians. And in fact, in certain countries, they'll almost be required to break the law. The US, for example, has this fiduciary duty for publicly traded companies; you know, they're required to maximize profits for shareholders, and often breaking laws which have fairly limited penalties but allow them to profit a significant amount is, you know, often seen as a cost of doing business under these regimes. And that creates huge problems moving forward, because it makes it very difficult to find a consolidated view globally to deal with the problems that arise from that.
Debbie Reynolds 32:16
You touched on something that I wanted to ask you about anyway, which is the cookie litigation you have happening in Europe, right. You mentioned that the GDPR is a principle-based law, which can evolve as technology evolves, and I do like the fact that GDPR is principle-based. In the US, we have a lot of laws being passed that are very prescriptive, right? Put a button on your website, this and that. Some people like that, but those things become kind of outdated pretty fast, right? So that is a concern that I have with some of the cookie litigation, where I feel like people are really caught up on this one thing. And there are so many other ways that this tracking can be achieved, probably even worse than cookies, right? But people are so fixated on that one, what I call a mode of transportation, for people being able to do this tracking. You know, once these cases are over with, companies won't be using cookies; they'll be using biscuits or whatever, some other thing. So what are your thoughts about that?
Alexander Hanff 33:43
I mean, the beauty is that, to a certain extent, the ePrivacy Directive is very much principle-based as well. So having that one legal basis of consent, unless something is strictly necessary, is really quite protective, if it's applied. And the European Commission actually just released their final report on consumer IoT devices, I believe, which again illustrates the importance of Article 5(3) of the ePrivacy Directive, stating that IoT devices are considered this type of terminal equipment under the law. And as such, if something isn't strictly necessary, then it requires consent. We're still not seeing that applied. If I look at my new LG TV, which I bought recently, you know, in order for me to turn on features such as Apple HomeKit, I then have to accept a bunch of terms and conditions which basically give LG permission to send all sorts of data back to their servers and to send me advertising pop-ups and all sorts of other crap on my screen. And the only way for me to get rid of that is to use a technical solution, such as blocking all the traffic leaving my internal home network from that device, which is something that I can do, but it's not something most people will be able to do. It's quite technologically advanced. It's not really something that most people want to do: log into the router or their firewall and start creating rules that block traffic to individual devices on the network. Imagine if you had to do that for dozens of devices or, as we move forward 10 years into the future, maybe hundreds or even thousands of devices, to control who has access to what you're doing within the private sphere of your home; then it becomes problematic. But I don't think necessarily that the ePrivacy Directive is as prescriptive as certain other laws, such as CCPA in the US, or CPRA as it is now. But, again, when we have no enforcement, then it becomes problematic, because the law is effectively meaningless. And the biggest issue we face currently, not just in the EU but, I would argue, globally, certainly in the West, is a lack of enforcement. And part of that is down to a lack of will to take on giant corporations like Google, or Facebook, or Meta as they are now. Part of that is down to a lack of will to disrupt markets, particularly in a time such as a pandemic, where the economy is suffering, as we know. And I'd say part of that, again, is a political lack of will, because obviously governments want to incentivize companies to invest in their countries and employ people and allegedly pay taxes, which, if we're going to be honest about these global corporations, is another matter entirely. And I would say that there's a fourth aspect as well, in relation to the difference in interpretation of the law, as we've seen with the ePrivacy Directive being interpreted differently in pretty much all 27 member states. So I think there's a combined issue there, which isn't limited to one thing. I do think prescriptive laws can be very difficult, particularly in a bloc like the EU, where it takes a long time to pass a new law. GDPR was originally drafted in 2011 by the European Commission, the original proposal, and didn't come into force until 2018; there was an extended period of trilogue. And probably the ePrivacy Regulation will take even longer; they started in 2017, we're already in 2022, and it's not looking like there's a way forward at this point.
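As an editorial aside on the per-device blocking Hanff describes above, here is a minimal sketch of what generating such egress rules might look like. The device names, addresses, and nftables commands are illustrative assumptions (a real setup depends entirely on the router or firewall in use), and the script only prints candidate commands rather than applying anything.

```python
# Illustrative sketch only: generate per-device egress-blocking rules of
# the kind described above. Device names and addresses are hypothetical,
# and the commands assume an existing nftables "inet filter" table with
# a "forward" chain; nothing is applied, the rules are just printed.
BLOCKED_DEVICES = {
    "living-room-tv": "192.168.1.50",  # hypothetical smart TV
    "smart-speaker": "192.168.1.51",   # hypothetical IoT speaker
}
LAN_SUBNET = "192.168.1.0/24"

def egress_block_rules(devices: dict[str, str]) -> list[str]:
    """One drop rule per device: traffic may stay on the LAN but not leave it."""
    return [
        f"nft add rule inet filter forward ip saddr {addr} "
        f'ip daddr != {LAN_SUBNET} drop comment "{name}"'
        for name, addr in devices.items()
    ]

if __name__ == "__main__":
    for rule in egress_block_rules(BLOCKED_DEVICES):
        print(rule)
```

Even this toy version makes his scaling point: one entry per device is manageable for two devices, far less so for the hundreds or thousands he anticipates.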
So we'll be looking at another two to three years before this actually becomes actionable under EU law, because there's always that period after the trilogue is completed and agreement has been made, before the law is implemented across member states, to give organizations time to adjust and change the way they do business to comply with the new regulations. So I think it's really important that we have principle-based law instead of prescriptive law, because with the pace of development of technology, we can't afford laws which take eight to ten years to take into account a new wave of technology or, you know, a new methodology for AI, and so on and so forth. Because then you are always operating in the past, and people's fundamental rights have already been harmed by the point where the new law comes into play, and it's outdated very quickly. So I really like the idea of principle-based law; I'm a big fan of the European model. But at the same time, without that enforcement, as I said, it becomes meaningless.
Debbie Reynolds 38:04
So if it were the world, according to Alexander, and we did everything you said, what would be your wish for privacy anywhere in the world, whether it's technology, law, human stuff? What are your thoughts?
Alexander Hanff 38:18
I think respect is, to me, the most important thing. People say to me all the time, should I stop using Facebook? Or should I stop using Google? Well, you know, you shouldn't have to stop using anything. The law states that they have to behave in a particular way, protect your data, and respect your privacy, and they're not doing that. But you know, the services they provide are, without question, incredibly useful for us to communicate, to make the world a smaller place; these are things which I really value as a technologist, and my reason for moving into privacy in the first place. Why should people have to stop using these services? You know, the correct reaction is that the companies providing these services do so in an ethical and responsible way and respect the rights of the individual. And that's really all it comes down to: respecting the law as an organization. You don't not pay your accountant for filing your accounts, and you don't not file your accounts, as there are penalties that come from that. And the same should be true of all the laws that you're obligated to comply with. You can't cherry-pick which laws you choose to obey and which ones you don't, because then the impact on the rule of law is devastating; it's one rule for them and one rule for everybody else, which goes against the very premise of a democratic society. So I think respect, I think, is the biggest problem: we have so much loyalty to these brands because they give us all this free, great stuff that we can use, that makes our lives maybe a bit easier, a bit more convenient, but they're not respecting you as an individual. So why would you give them the opportunity to make vast sums of money? And bear in mind, these are the wealthiest companies in the world we're talking about here, and they're only wealthy because they're using your information, your data, to make that money; that's where they make their money from. So if they're not willing to respect your rights, you know, that's problematic, and I don't think people should be excluded from digital society just because they want their rights to be upheld. I think there should be stronger enforcement; I think companies should be held to account. And, you know, from an ethics perspective, and this is something which I've seen over the past couple of years now with the emergence of the new generation going into the workforce, there is more focus among these younger people moving into the workforce on ethics, and on working for companies that they feel are ethical companies. So there's more pressure now, not just from the competition perspective, where even companies like Apple are very much focused on privacy in the marketing of their products and services. But now you're also looking at a situation where access to the talent you need to be able to run your company is becoming more and more dependent on respecting people's rights and being more ethical. The Dragonfly project at Google, and the protests against that from Google employees, is a very good example; the facial recognition contract with Amazon for US drones, which caused a massive controversy internally and led to people walking out, is another good example of that. So companies need to understand now that they need to do this. I would like them to do it because it's the ethical thing to do, but this is also about them being able to conduct business nowadays.
And, you know, when you look at US cloud providers and the turmoil they're currently in in the EU, with announcements from the likes of the EDPS and the Austrian DPA on things like Google Analytics, these have sweeping consequences that could potentially impact markets to the tune of hundreds of billions of dollars over a relatively short period of time, if those services are suddenly not able to access the European market, or if companies within the European market are not able to use those services for the benefits and the convenience and the revenues that they might help them bring in. So respect, I think, is the most important thing. I'd like to see companies be more respectful. I'd like to see individuals holding them to account more. But I don't think people should be forced to stop using technologies which are really useful, which allow them to communicate and stay in touch with friends and family and, you know, buy things at the best price and so on and so forth, just because those companies don't respect their privacy. I think the focus needs to be on the controllers, as we would call them under GDPR, as opposed to on the data subjects.
Debbie Reynolds 42:23
Yeah, I love that answer. That's a great one. That's a great one. Well, thank you so much for being on the show. This is a tour de force episode. I love all the stuff that you said, and I very much support your work on the things we're dealing with now. I'm happy to have you on the show, and I'm happy that hopefully we'll have a chance to collaborate on some things.
Alexander Hanff 42:46
It's been a great pleasure. Thank you very much. Thank you so much.