E213 - Bill Buchanan, Professor of Applied Cryptography at Edinburgh Napier University (Scotland)
Many thanks to the Data Diva Talks Privacy Podcast Privacy Visionary, Smartbox AI, for sponsoring this episode and supporting our podcast. Smartbox.ai, named British AI Company of the Year, provides cutting-edge AI that helps privacy and technology experts uniquely master their Data Request challenges, and makes it easier to comply with global data protection requirements, FOIA requests, and various US state privacy regulations. Their technology is a game-changer for anyone needing to sift through complex data, find data, and redact sensitive information. With clients across North America and Europe and a major partnership with Xerox, Smartbox.ai is bringing their data expertise right to our doorstep, offering insights into navigating the complex world of global data laws. For more information about Smartbox AI, visit their website at https://www.smartbox.ai. Enjoy the show.
[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks Privacy podcast where we discuss data privacy issues with industry leaders around the world with information that businesses need to know. Now I have a very special guest all the way from Edinburgh, Scotland, Professor Bill Buchanan, who's the professor of Applied Cryptography at Edinburgh Napier University. Welcome.
[00:40] Bill Buchanan: Nice to see you. Thanks for the invite.
[00:43] Debbie Reynolds: And you also received an award for most innovative teacher of the year. Congrats.
[00:49] Bill Buchanan: Yeah, yeah, that's right. I received that last year. Teaching is obviously highly important in universities and I still adore teaching and it's one of my favorite things to do.
[01:06] Debbie Reynolds: Well, I enjoy the things that you post and that you put on LinkedIn. You and I have met on LinkedIn and I thought it would be great to have you on the show because you have such wisdom and insight that you can share with so many of us. So why don't you tell us about your career trajectory and how you became the professor of Applied Cryptography at Edinburgh Napier University.
[01:29] Bill Buchanan: It goes back a long way. My father and my grandfather were both electrical engineers, or electricians as you would say. So electrons were really very much in my family and I followed that route. I studied electrical engineering and electronics and I worked in a chemical factory. And then I realized that this little three-legged thing was coming along. The three-legged thing was a transistor, an NPN or a PNP transistor. And I knew that this was going to change the world. Though they were quite large things then, and they actually sat on top of the printed circuit boards. But as an electrical engineer, I knew that I wouldn't be involved that much with these transistors. And then I also saw these massive mainframe computers that were used to control plants. And again, I saw these people with white coats going around working this magical machine in these clean areas. And again, I knew that computing and transistors were really going to change the world. So I decided to take a degree in communication engineering and electronics. And I just adored studying wireless networks and communications. And again, I saw the move from analog communications, radio waves and so on, towards digital communications. At that time we had the thing called X.25 packet switching. And packet switching was really just in its infancy then, and it's amazing now, but that was quite a leap from our voice networks, our analog circuit-switched networks, towards a packet-switched network. And obviously the Internet has grown because of that. So I was inspired by some of the lecturers on the course and I decided I wanted to be a lecturer myself. So I started to teach electronics and moved into data communications. Then along came networking with some Cisco networking routers, switches and so on. And then I've generally migrated from that towards security.
Security, I saw, was becoming such a big topic, and really there wasn't a lot around: not a lot of people teaching the subject, not a lot of courses at the time. So I moved into cybersecurity. And then as part of teaching cybersecurity, I got more and more into cryptography. I think I missed the mathematics a little bit, I missed the practical coding and so on. So I'm very happy to be back now reading and understanding cryptography. So that's where I am just now. I'm a professor in a very privileged position just now. Along the way, we've managed to create good impacts. I've led four companies to spin out from the university, each of which has been successful. So it's been a fun and interesting time overall.
[04:47] Debbie Reynolds: Let's talk a little bit about cryptography. One thing that I realize when I talk with people about cryptography is that, well, first of all, some people confuse encryption with cryptography. So encryption is a type of cryptography, but it's not all of cryptography, right? So talk about cryptography, like the umbrella of cryptography.
[05:14] Bill Buchanan: So cryptography is a very large area and, as you say, encryption is really just part of it. So encryption can be what we call symmetric key encryption. If I have a key for the front door of my house, then I can make a copy of it and I can give it to you, and you can also get into my house. So that's symmetric key, where we use the same key to encrypt and then decrypt. It's basically we scramble something and then we unscramble it. We basically reverse everything that we've done. But we need this key to be able to go through that process. Along with that, we get what's called public key encryption. That's where we use what's called a key pair, a private key and a public key. We can encrypt with one key and then decrypt with the other key. But cryptography is so much more than that. That typically relates to confidentiality, to the privacy. But in the CIA triad, we also have integrity. So integrity makes sure that the data that we have is actually correct and hasn't been modified. For that we get hashing functions, which are obviously important to make sure that we get a digital fingerprint of data. But probably fundamentally, the thing that saved the Internet was the usage of digital signatures. A digital signature allows us to verify that something or somebody actually signed. In the same way that you would sign your name on a check, someone has signed for something to prove their identity, and also to make sure there's integrity in the data that has been sent. So that's the area of digital signatures. And really, that's the thing that makes the Internet so much more trustworthy. You know, when you connect to something, there's a digital signature in there to make sure that you are connecting to a trustworthy site. But along with that, there's a whole bag of other things that we can use to be able to create systems which are private, secure, and also resilient.
Other things such as zero-knowledge proofs, homomorphic encryption, and key exchange. A lot of the work that we see now started around the time of Whitfield Diffie and Marty Hellman, who created a key exchange method where we could publicly exchange information but end up with the same secret key. So to me, a lot of it is magic. I love magic, magicians and so on. And I think cryptography is the magic of the Internet. And really, without it, we would be back in the 1970s, where people could spy on our communications, they could change it. And really it was cryptography that was the magic part that made the Internet both private and also trustworthy. So cryptography is so many things, and it isn't just encryption.
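The Diffie-Hellman key exchange Bill mentions can be sketched in a few lines of Python. This is a toy illustration with a deliberately small prime; real systems use standardized 2048-bit groups or elliptic curves such as X25519, and the variable names here are purely illustrative.

```python
import secrets

# Toy Diffie-Hellman key exchange (demo parameters only; real systems
# use standardized large groups or elliptic curves such as X25519).
p = 2**127 - 1   # a Mersenne prime, far too small for real security
g = 5            # public base

# Each side picks a private exponent and publishes g^x mod p.
a = secrets.randbelow(p - 2) + 1          # Alice's secret
b = secrets.randbelow(p - 2) + 1          # Bob's secret
A = pow(g, a, p)                          # Alice sends this in the clear
B = pow(g, b, p)                          # Bob sends this in the clear

# Both sides derive the same shared secret from public values:
# Alice computes B^a, Bob computes A^b, and both equal g^(ab) mod p.
alice_shared = pow(B, a, p)
bob_shared = pow(A, b, p)
assert alice_shared == bob_shared
```

An eavesdropper who sees only p, g, A, and B would have to solve the discrete logarithm problem to recover the shared key, which is what makes the public exchange safe.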
[08:21] Debbie Reynolds: That's true. What are your thoughts about this analogy that I use? I think it speaks to maybe the way that we thought about security in the past. So let's go back to your analogy about a house, and you have a key to a house, right? I feel like some organizations think that the way they should protect their data is that as long as they have this key, this front or back door key to the house, they don't have to worry about anything else being secure in the house, right? So we see, for example, companies that have these data breaches where someone obviously breaches the front or the back door or the window. And then once they get into the house, or the castle, as I like to say, they can do anything, because a lot of the data that's in there is not protected. So what are your thoughts about that?
[09:14] Bill Buchanan: I think one of the fundamental problems in the industry is key management, and you find that most companies struggle to look after the keys. So to use your analogy, we have a front door key, but then we have a key to every room in the house. And then we have a key to a safe that keeps your money and your jewelry and so on. And the more keys that we have, the more safe boxes that we can actually have. To me, every piece of data should be encrypted, possibly even with its own unique key. So if you store in the cloud, your front door is open. And I find that companies are actually better when they're in the cloud, because they need to make sure that they look after their keys, they need to make sure their data is encrypted, and so on. So just because data is behind the firewall and is on premise doesn't mean your data is secure. You have insiders. All the people who work for your company, all the people who have previously worked for your company, could have all those keys and can actually use them. And what's meant to happen when someone leaves your company? You're meant to change all the locks in your company. The same thing happens for your passwords and your encryption keys and so on, but that doesn't quite happen. So an ex-employee could come back into your estate, your infrastructure, and use the encryption keys and have the privilege that they had before. So increasingly, what we need to do is to make sure that all of our data is secured. It's like an onion. You don't actually have just one layer of security. You have an onion, and some of your data isn't that sensitive and doesn't need encryption, and maybe just has simple security applied to it. But for personally identifiable information, credit card details, gender and so on, those must be protected or your company could get fined. So it's extremely important that you understand the encryption.
You need to be adding salt, you need to be adding pepper, you need to be adding lots of different things to make sure that it becomes difficult for someone, and it's not just the same thing that's applied. You have multiple methods, so that once someone gets over one thing, you have an intrusion detection system to detect that they've managed to get over this thing, and then they're now faced with another type of hurdle to go over. So more and more, we need to understand what is our most sensitive data and how we protect it best.
[11:53] Debbie Reynolds: I think what you're talking about is something that companies struggle with, and that's being able to really classify data in a way so that they understand the level of protection that it needs. Just like you say, not everything is a trade secret, top secret, super hot, confidential. And, as your analogy about the onion states, there are different levels of security, or different levels of sensitivity of data, and companies need to really think about that when they're planning their security. And I find, unfortunately, I don't know what your thoughts have been, but unfortunately some companies who aren't as mature in this area, they have maybe not the best level of security, but then they're also not treating different assets with different levels of security. What are your thoughts?
[12:46] Bill Buchanan: Yeah, and I think we've really scaled our on-premise systems into the cloud. So we didn't really understand how we should build our systems properly. We typically migrate; we never say, let's stop doing this, let's stop doing SQL databases and let's now move towards these other, more secure methods. So generally our migration is a piecemeal approach, and rather than re-architecting and understanding how we segment data and networks, we've generally just kept the thing going. If you had this with an oil refinery, you'll find that an oil refinery has multiple design elements in it to make sure that the whole infrastructure is resilient. If this element actually fails, then this thing will kick in and replace it. The same thing goes for data. We've never really understood how we segment data properly, or most companies don't really understand how to properly move the private information away, even physically, from other types of data. So that segmentation of data, that understanding of what elements are involved, and also the key elements of access control, is fundamental. So in the industry we're generally moving towards a zero trust approach. But that's moving from a role-based security approach, where if I log in as this type of person, I now get all these roles. I see it as an analogy like a GP. A GP logs into a healthcare system and obviously has a whole lot of rights to it. But that GP is also a patient of a GP, and if they log in as a patient then they will get a lot fewer rights to be able to get access to data. So I think a challenge for most companies is to understand that zero trust level system, the identification at the gate. So I need to identify myself at that first hurdle, probably coming in with a simple login and password and so on. And then if I want some more access, I now must go through another gate and I must prove myself again. But if I want to get access to this higher level system, I now need my biometrics, and I need to move up again.
I need to probably have a one-time password from a physical device and so on. So that whole identity system needs to be almost looked at in itself and then matched to the rights. But companies need to understand how they can dynamically change that. So if someone does need extra rights to be able to access something, the company should be able to dynamically allow that to happen or revoke it instantly. And I think the way that we've designed our Windows domain infrastructures is we have this hodgepodge of roles and groups and so on, and it gets really messy and difficult to manage how we identify and then how we define the rights of access to data. To me, every single data element, even one byte of data, should have an access control policy defined on it. And that would be the best case. Every little bit of data has a policy associated with it and has rights. And you must prove your identity to be able to get access to that data. And let's keep it simple. We'll have five levels of security on that data. One means simple access, five means physical control: I need to physically prove my location to be able to get access to that data. So I think that understanding of identity and access control is fundamental, and how those two things work together. And I don't think that Windows domains actually work that well, and they can be really messy for those types of things.
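The five-level idea Bill sketches can be modeled in a few lines. This is a toy sketch, not any real product's API: every data element carries a required level, and a subject's effective level is derived from the authentication factors they have presented. All names and level assignments here are illustrative assumptions.

```python
# Toy model of per-data-element access levels (1 = simple access,
# 5 = physically proven location). Purely illustrative names.
FACTOR_LEVELS = {
    "password": 1,
    "otp_device": 2,        # one-time password from a physical device
    "biometric": 3,
    "verified_location": 5, # physically proven location
}

DATA_POLICY = {
    "public_report.pdf": 1,
    "customer_emails.db": 2,
    "payroll.db": 3,
    "master_keys.vault": 5,
}

def effective_level(presented_factors: set[str]) -> int:
    # Strongest factor presented determines the level (a simplification;
    # a real zero trust engine would combine factors and context).
    return max((FACTOR_LEVELS[f] for f in presented_factors), default=0)

def can_access(presented_factors: set[str], resource: str) -> bool:
    return effective_level(presented_factors) >= DATA_POLICY[resource]

assert can_access({"password"}, "public_report.pdf")
assert not can_access({"password"}, "payroll.db")       # needs stronger proof
assert can_access({"password", "biometric"}, "payroll.db")
```

The point of the sketch is that the policy lives with the data element, and access is decided per request from the identity proof presented, rather than from a static role granted at login.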
[17:29] Debbie Reynolds: Yeah, I think the internal architectures are not built for the future. The way that data works and the way that data flows now, it should be dynamic, and it is not dynamic right now, I don't think. And I've seen, unfortunately, a lot of the companies that grew up with data on premise, a lot of their security, or some of it, was kind of security by obscurity. So they thought, oh well, we don't have to protect this; we'll just create a path for people to go to look at different things. But if they went off the path or they did it a different way, they would be able to see basically everything, because not everything was protected on all sides, basically. One thing I want you to talk a little bit about, for people who don't understand or don't know, is homomorphic encryption. It's a big thing that people are talking about now, but our listeners may not fully understand what that is and what it means.
[18:33] Bill Buchanan: Okay, so there are three states that data can be in. There's data at rest. So we would encrypt data either with symmetric key or public key encryption at rest, when it's on a disk or in a storage area. So we can have long-term, short-term storage, we can put things in a nice vault and so on. So that will generally encrypt our data as it's at rest. That's fairly well developed. We will typically be using symmetric key methods for that, and we might protect the symmetric key with public key encryption. Then there is over the air. That's fairly well solved with TLS, SSL and so on, where we encrypt the data. So wireless communications was the best example. I remember when wireless communications first started with wireless local area networks, we used a technique called WEP, Wired Equivalent Privacy. It used a very simple encryption method, RC4 I think it was. And it also had a salt value which meant that after a little while the same salt value came round, and you could just exclusive-OR the two bit streams together and actually decrypt and find out the password. It was a global password for the whole of the wireless network, unbelievably. So that was required because we needed to start to encrypt data over the air, because people could be listening over the wireless network, which was shared, and could actually listen to the communication. Luckily now we have WPA2 and so on. But the core of the Internet when we're transmitting data is TLS. And TLS encrypts data over the air, over the network. The last place that data isn't protected is in memory, or in process. And these days it's not that difficult to peek into the memory, to look at the hardware, to look at data within the process.
So if your encryption key is not obfuscated and is in plain sight within a program, then someone could examine the program to be able to find all of your sensitive data and also the encryption key and so on. So in-memory has always been the problem. And obviously if someone has access to your machines, your processes and so on, they can peek into the memory and discover all the secrets that you would have. A good example was Heartbleed. Heartbleed nearly brought down the Internet. What that was, was there was a special packet that was sent to a server and it pulled back the running memory of the server. So anything that was in the memory would actually be seen in this memory chunk that was sent back to the person. And people discovered passwords and sensitive data because the memory itself had all the sensitive data within it. So this has always been a problem. But now, with homomorphic encryption, we can encrypt data and then we can mathematically operate on it. So if I just need to find the total population of a country, I could encrypt all the cities' populations with my public key, and then I could receive the encrypted values, add all those encrypted values together, and I get a homomorphic summation. Now I can give the data back to you and you can use a private key to be able to decrypt the data and find the summation of the values. And the magic of this is that we can replicate every single function, every single bit of logic, every single electronic circuit in the form of a homomorphic circuit, which would mean that we could set up a highly trusted environment to be able to process data. And in this way, data could be given to a data processor working in the cloud with encrypted values, and they'd be able to process any operation that we wanted, give it back to us, and then we can decrypt again. So it would mean that we'd have encryption in every single state that we would have, right?
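The population-summation example can be demonstrated with the Paillier scheme, which is additively homomorphic: multiplying two ciphertexts yields an encryption of the sum of the plaintexts. This is a toy sketch with tiny demo primes; real deployments use 2048-bit moduli and a hardened library, and the specific numbers below are assumptions for illustration only.

```python
import math
import secrets

# Toy Paillier cryptosystem (additively homomorphic). Demo primes only.
p, q = 293, 433                  # far too small for real security
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)             # simple form valid because g = n + 1

def encrypt(m: int) -> int:
    r = secrets.randbelow(n - 1) + 1
    while math.gcd(r, n) != 1:   # r must be invertible mod n
        r = secrets.randbelow(n - 1) + 1
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# Encrypt each city's population, add the *ciphertexts*, decrypt the total.
cities = [500, 300, 120]         # populations in thousands (illustrative)
total_ct = 1
for pop in cities:
    total_ct = (total_ct * encrypt(pop)) % n2   # ciphertext multiply = plaintext add
assert decrypt(total_ct) == sum(cities)
```

The party doing the summation never sees any individual city's value, only ciphertexts; whoever holds the private key decrypts just the total. Fully homomorphic schemes extend this idea from addition to arbitrary circuits, as described above.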
[23:26] Debbie Reynolds: Yeah. So how does cryptography play into or help privacy or data protection?
[23:35] Bill Buchanan: Well, it's fundamental. I think without cryptography and encryption, we have very little privacy. As most of the data is now in the cloud, we need ways to be able to protect that data. And cryptography, as I said before, is typically used to encrypt our data either with symmetric key or public key encryption. So you can be Alice and I'll be Bob. If I want to send you some encrypted data with symmetric key, then obviously we need to have a way that we create the same encryption key. That was solved by Whitfield Diffie and Marty Hellman with the Diffie-Hellman method. So we can do some maths, we can openly discuss, and at the end of it, we end up with the same symmetric key, the same encryption key. So I can now encrypt the data for you, send it to you, and then you can now decrypt with the key that we've now negotiated. And every single time we talk, we can create a new key. But if I want to use public key encryption, then you create your key pair, your public key and your private key. You will send me your public key. That's a bit of a problem. I'll come back to that. So you send me your public key. I then encrypt the data with your public key and send it back to you, and you use your private key to be able to decrypt that. The problem here is: how do I know, Alice, that I have your proper public key? Because Eve is listening and actually wants to pretend to be you and sends me the wrong public key, and I will encrypt with it, and now she can decrypt it. The magic of this is where we bring in Trent. Trent creates a digital certificate for you, will take your public key and make a certificate, and then they will sign that certificate to say that they are Trent. Trent then sends me the certificate. I can then check that Trent has signed the certificate, and I can take the public key off it. That public key can now be used to encrypt the data for you.
But the magic of this comes in when I digitally sign the data that I send you. So it's almost like I create my signature on an envelope to make sure that you know it was me that sent it. For that, I use my private key. So I have a key pair too, a public key and a private key, and I take a hash of the data and then I sign that with my private key. So I now send you the encrypted data, and I send you the signature for it. When you receive it, you will now take my public key, just like we did before with Trent. You'll get my public key that's signed on a certificate, and you will check the signature against my public key, and you'll know, for one, that I sent it, and for two, that no one has changed the message. And it's unbelievable that in 50 years of the Internet, we're still using email. Email is completely untrusted. You can't tell if anybody in your company has read the email. You can't tell if anybody has changed it, and you can't tell if a spear phishing person has changed it at all. With the method that I've outlined, which I typically define as PGP encryption, we actually have a way to send encrypted email that's also digitally signed. And we would almost cure most of the problems on the Internet almost instantly against spear phishing if we actually started to encrypt our data and also started to digitally sign it. So that's where privacy comes in. I don't want my emails to be seen by anyone, not even by Google. There's nothing that I'm hiding, but I have the right to privacy against snooping. So fundamental in that is cryptography, to be able to make sure that we have the privacy, but also the integrity.
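The hash-then-sign flow Bill describes can be sketched with toy RSA numbers. This is an illustration only: real signatures use 2048-bit keys and a padding scheme such as RSA-PSS, and the demo primes below are assumptions chosen for readability.

```python
import hashlib
import math

# Toy RSA sign/verify: hash the message, "encrypt" the hash with the
# private key, and let anyone check it with the public key.
p, q = 1009, 1013                        # demo primes (far too small)
n = p * q
e = 65537                                # public exponent
d = pow(e, -1, math.lcm(p - 1, q - 1))   # private exponent

def sign(message: bytes) -> int:
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)                  # only the private key holder can do this

def verify(message: bytes, signature: int) -> bool:
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h     # anyone with the public key can check

msg = b"meet at noon"
sig = sign(msg)
assert verify(msg, sig)                  # genuine signature checks out
assert not verify(msg, sig + 1)          # a tampered signature fails
```

This is exactly the property that makes signed email possible: the recipient learns both who sent the message (only the private key could produce the signature) and that nothing was changed in transit (any edit changes the hash).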
[28:16] Debbie Reynolds: I've never heard anyone say that about email. I think it's true.
[28:23] Bill Buchanan: Email is such an old-fashioned thing, and companies love it because they can snoop and they can look at emails and they can filter. Quite rightly, they can see whether there's a virus in an email or there are some bad words and so on. So they quite like that it's not encrypted. But really, for some things, like governments snooping on citizens and so on, email is not a good mechanism.
[28:53] Debbie Reynolds: What's happening in the world right now that's concerning you about privacy or cryptography or security?
[29:01] Bill Buchanan: Well, the biggest problem of course is what's called the crypto wars. The crypto wars have been around since the 1970s, and then in the 1990s came the Clipper chip from the NSA. The NSA wanted to make sure that they could break everyone's encrypted communications, so they created this Clipper chip. If you wanted to encrypt data over the telephone lines and so on, you had to buy this chip. And obviously the NSA had the key in escrow. So a key in escrow means that they have a copy of the key that they can take off the shelf and actually use against your communications. And that leads to mass harvesting of data and lots of problems with your rights to privacy and so on. So obviously law enforcement and government agencies don't quite like encryption. It's not that they dislike it in general, but they obviously do want to be able to read communications and find threats, quite rightly, against terrorists and bad people doing bad things. So it worries them greatly, the use of encryption, where they'll not be able to scan messages and hunt for threats and so on. So they have been looking for backdoors into cryptography, and really there isn't one. All the cryptography we have has been standardized by NIST, it's been peer reviewed, and we've made sure that there isn't any possibility of a backdoor in the mathematics and so on. The implementation can be poor sometimes, but the mathematics is secure. So the EU, the US, the UK, the Five Eyes and so on have all been looking at ways to be able to put some sort of backdoor into the encryption process to make sure that they can actually discover things. And it's worrying, because whenever you put a backdoor in, an adversary could actually find out what that backdoor is and then use it against you. So you see WhatsApp, Signal, Telegram and so on as secure ways of communicating these days.
But again, there's this push by law enforcement and government agencies to really make sure there is a backdoor in there. And it's an argument that can never be completely resolved, because we all know there are bad people doing bad things, but hopefully they are a minority. The Tor network is a good example. Say you're a journalist and you need to keep your sources private. Just because you might use the Tor network for your communications doesn't mean that you're doing bad things. So I think the biggest problem is probably that balance between the rights to privacy and the rights of society to protect itself against bad people.
[32:14] Debbie Reynolds: I share your concerns there. So part of that issue, in my view, is that when you're trying to create these backdoors, what you're essentially saying is that I want to be able to unlock every door instead of certain doors, right? And so it actually creates more risk for everyone, and it doesn't really pinpoint the issue. And so we're having situations with these bulk, what they call bulk data warrants, or geofence warrants, right? Where they're saying, well, we don't know who committed a particular crime, so we're going to ask Google to give us data on everyone who was in this area at that time. And what it creates in that situation is, in my view, more of a guilty until proven innocent type of thing, because you don't really have a specific person that you're targeting. And so when you get people in those digital systems and they're looked at as a suspect, that's a problem. So I feel like on a bigger scale, the backdoor issue is even wider than that, where basically everyone is a suspect, right? So, you know, in my view, you can't find a needle in a haystack when you're creating a bigger haystack. What do you think?
[33:34] Bill Buchanan: Yeah, the fundamental part is the warrants. So if someone requires access to sensitive data, then they will need a warrant, so they'll have to go to a judge and all the information will be put on the table for them to make a decision. So the worry would be if there isn't a warrant involved, and we move towards a world where nation states could listen to all of the communications of their citizens; that would be a worry. I think if you look at the way that some companies go about privacy, you'll find that Apple have got a very good track record of making sure that they keep the data private and sensitive. They've always got the best cryptography. Their hardware is optimized to have secure enclaves and so on. So you'll find that even Apple can't get into your computer, because the encryption keys are actually held inside a special chip on your device. So I think there are approaches, and it is important that we trust the main providers of our data, because any one breach of that trust could bring their whole company down. So Apple, Google, Microsoft are highly dependent upon keeping our data private. And I think if it was ever found that they were selling our data off or compromising it in some way, then they'd be in big trouble. But on the other side of the fence, they have law enforcement at them all the time to be able to get more and more access, and obviously they need to create some resistance there. Some companies are very good at this, but it all depends on where in the world you are. If you're in the U.S., most of the cloud data is stored in the US, so it's easier for law enforcement in the US to get access to data. But if you're in a small country in another part of the world, then it's actually a lengthy process to get access to sensitive data. So I think in the US it can be easier.
In the UK it's less easy, but obviously with the Five Eyes agreement there is that relationship. When the UK left the EU, that is likely to have caused a lot of problems with access to data, and the UK is still developing its data access policies and still needs to fit in with the rest of the world. So they need to fit in with the US and also with the EU and so on. So it's a difficult time just now in certain parts of the world.
[36:31] Debbie Reynolds: Well, I have a two-part question. They may not exactly be related, but one is, I want your thoughts on how artificial intelligence is making cryptography more difficult or easier, I don't know your thoughts. And then I also want you to talk a bit about post-quantum cryptography, like the threat of quantum computing and what's the state of post-quantum cryptography right now.
[36:58] Bill Buchanan: For the first part: you might think that AI has got some magical powers against the laws of mathematics and physics. It doesn't. If something is mathematically sound, then AI doesn't give you any advantage. We typically define this in the form of entropy. So entropy means that I can't tell anything from your data because it looks like random noise, or the ones and zeros are completely jumbled. So there's no AI in the world that can make sense of random data. That would just be complete black magic if AI could actually do that. Where AI can come in is in the form of what are called side channels. So a side channel is that my mobile phone is actually giving off electromagnetic waves that you can't see, of course, but if I have an antenna, then I can pick up these electromagnetic waves. Electromagnetic waves are caused by the processor doing its business, processing a one and then a zero and then a zero and then a one. And with encryption, if the encryption hasn't been designed correctly, then I can measure the amount of time it takes to do a one and a zero. And I can work out, when I see those pulses of radio waves, that that's a one and that's a zero and so on. That's the side channel. We can get things like thermal emissions, light, and so on. These can emit external signals that an AI entity can actually make some sense of eventually, because it might be difficult for us to see the mathematics or the technique behind it, but an AI neural network, for example, will be able to find just that extra little element that allows it to discover something. So for AI, really it will have little impact on the methods that are actually used, but obviously the implementations might give away certain things about the code. So things like reverse engineering code from binary back to C code and so on is something that AI would do quite well.
So it's highly important that developers obfuscate their code as much as they can, to make sure it's so mangled that it's almost impossible for an AI entity to reverse engineer the code back again. In terms of post-quantum cryptography, obviously we have this problem that all the existing public key methods, that's RSA, elliptic curve, and discrete logs, will be broken by a quantum computer. So it's not a difficult problem anymore with a quantum computer to factorize into two prime numbers. RSA is based on taking two large prime numbers and multiplying them together to get what's called the modulus. Computationally, it's very difficult for us to factorize that back into the two prime numbers. If I can get them, then I can break every RSA encryption and signature. But with a quantum computer, because of Shor's algorithm, it was shown that that's not a difficult problem anymore; because of the way that light waves travel and so on, we can tune and find the solution to finding the two prime numbers. So that means that our key exchange methods are going to be broken. In this Zoom call we'll be using a key exchange, Diffie-Hellman with elliptic curves. And unfortunately a quantum computer, by listening to the communication at the start of the conversation, would actually be able to find out the encryption key that we're using. So it will break key exchange. Where it will also break is digital signatures. As I said before, we have the signature process that allows us to identify things; we can identify that we're talking on Zoom here and not a fake Zoom. And again, public key encryption is used for that. RSA, elliptic curve (ECDSA), and EdDSA are the three main signature methods, and all of them are broken with quantum computers. Luckily, NIST has been looking at this for the past, I don't know, four or five years or so, and has been getting academics and researchers to look at new methods which are provably secure.
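The RSA construction described above, a public modulus built from two primes whose factorization is the private trapdoor, can be sketched with textbook-sized numbers. This is a toy only: real RSA uses primes of roughly 1024 bits or more, plus padding schemes, and a quantum computer running Shor's algorithm would recover `p` and `q` from `n` efficiently:

```python
# Toy textbook RSA (tiny primes for illustration only)
p, q = 61, 53
n = p * q                 # the public modulus: 3233
phi = (p - 1) * (q - 1)   # Euler's totient: needs p and q to compute
e = 17                    # public exponent
d = pow(e, -1, phi)       # private exponent: the secret trapdoor

msg = 65
cipher = pow(msg, e, n)           # encrypt with the public key (e, n)
assert pow(cipher, d, n) == msg   # decrypt with the private key d
```

Anyone who can factor `n` back into `p` and `q` can recompute `phi` and hence `d`, which is exactly why Shor's algorithm breaks RSA.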
These include methods such as lattice cryptography, isogenies, multivariate methods, and so on. And even codes, you know, like the error-correcting codes that you get on a CD-ROM; those types of codes are quite good at doing things like key exchange. So we'll be getting rid of all of our old public key methods that have done us for 50 years or so, and done us very well, and then we'll be bringing in new methods, typically based on the standards that NIST is defining. And it's now a FIPS standard. FIPS is, if anybody wants to do cybersecurity properly, they need to show that they have FIPS standardization with it. The methods that are proposed at the current time are Kyber for key exchange, and then for digital signatures it's Dilithium, Falcon, and SPHINCS+. But NIST is worried that the lattice methods might get broken. There was a classic technique that was typically used in AI called learning with errors: I add a little bit of noise into my computing, and it makes it really difficult to determine something from it, as long as the noise doesn't get too great. And that isn't a fully provably secure method; it might be broken by researchers in the near future. So NIST wants to see other methods coming along besides the lattice ones, which are very efficient: they're fast, they don't take up much memory, and they don't take much energy when they're doing the computing. So NIST is looking for alternatives. Although we standardize these four methods, there will be a whole lot of other methods coming along on the back of these that will be replacements, because the nightmare scenario for the Internet is that someone breaks Kyber and all of our communications are broken. The way that we've managed to work with cryptography is that there's always a fallback. So we have AES encryption for symmetric key, but we might also have 3DES. On your credit card, you'll actually find that it's probably 3DES that's used there.
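The learning-with-errors idea mentioned above, adding a little noise so the secret becomes hard to recover, can be illustrated with a toy single-bit scheme. This is a hypothetical classroom sketch, not Kyber itself, and the parameters are far too small to be secure:

```python
import random

q, n = 97, 8  # tiny toy parameters; real schemes use far larger moduli and dimensions

def keygen():
    """Secret vector s, plus public noisy samples (a, b = a.s + e mod q)."""
    s = [random.randrange(q) for _ in range(n)]
    samples = []
    for _ in range(32):
        a = [random.randrange(q) for _ in range(n)]
        e = random.choice([-1, 0, 1])                       # the small added noise
        b = (sum(ai * si for ai, si in zip(a, s)) + e) % q
        samples.append((a, b))
    return s, samples

def encrypt(samples, bit):
    """Sum a random subset of samples; encode the bit as a q/2 offset."""
    subset = random.sample(samples, 4)
    u = [sum(a[i] for a, _ in subset) % q for i in range(n)]
    v = (sum(b for _, b in subset) + bit * (q // 2)) % q
    return u, v

def decrypt(s, u, v):
    """Only s cancels the a.s terms, leaving the bit plus small noise."""
    d = (v - sum(ui * si for ui, si in zip(u, s))) % q
    return 1 if q // 4 < d < 3 * q // 4 else 0

s, samples = keygen()
u, v = encrypt(samples, 1)
assert decrypt(s, u, v) == 1   # the small noise rounds away; the bit survives
```

Without the secret, an attacker faces noisy linear equations, which is believed hard to solve; with it, the noise is small enough to round away. That balance, noise large enough to hide the secret but small enough to decrypt, is the heart of lattice schemes like Kyber.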
So the great advantage is that if one thing gets broken, then we just flip to something else. We just deprecate it and get rid of it. So NIST will be defining new methods that will come along. What's likely to happen is that we won't just switch the Internet off tomorrow, or on any single day, and migrate from our existing public key methods to the new ones. We'll have what's called a hybrid solution. A hybrid solution is that the data packets that you create, say for the key exchange, will have our existing elliptic curve Diffie-Hellman information in them, and they'll also have Kyber information. So our packets will get a bit bigger, but there'll be both types of methods in there. And that allows organizations to migrate from the existing public key methods towards the new ones. Then eventually you drop, or deprecate, the old public key method, and you end up with only the post-quantum cryptography method. This is likely to take three to five years or so. Quantum computers might not be built at scale within the next seven to ten years, so we have this window of opportunity for the next three to five years, I would think, to move and migrate. And we've done it with cryptography all along: we've generally migrated from things that aren't secure to something that is secure in a seamless way, so that you don't have to switch the whole thing off.
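The hybrid approach described above can be sketched as deriving a single session key from both a classical and a post-quantum shared secret, so the session stays safe unless both methods are broken. A minimal sketch using random placeholder secrets (in practice the real values would come from an ECDH exchange and an ML-KEM/Kyber encapsulation, not from `os.urandom`):

```python
import hashlib
import os

# Placeholder secrets standing in for the two key exchanges carried in one packet:
classical_secret = os.urandom(32)   # e.g. an X25519/ECDH shared secret
pq_secret = os.urandom(32)          # e.g. an ML-KEM (Kyber) shared secret

# Combine both through a KDF: breaking only one method reveals nothing,
# because the attacker still lacks the other input to the hash.
session_key = hashlib.sha256(classical_secret + pq_secret).digest()
print(len(session_key))  # 32-byte symmetric key, e.g. for AES-256
```

Real protocols use a proper key-derivation function such as HKDF with labels and transcript hashes, but the principle is the same: the derived key depends on both secrets at once.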
[46:13] Debbie Reynolds: Thank you so much. Yeah, this is very exciting, and I'm happy to see that NIST is really leading on this and that people are really taking these threats seriously. But if it were the world according to you, Bill, and we did everything you said, what would be your wish for either privacy, cryptography, or security anywhere in the world, whether that be human behavior, technology, or regulation?
[46:41] Bill Buchanan: Well, I think cryptography has generally been a technical solution, and I think it needs to be something that changes our society. I know we've moved most of our work home and our whole lives are now online, but we haven't really properly integrated security and trust into these things, and certainly not made sure that it's citizen-focused. You'll find most data systems, most IT systems, are still very much focused on the companies, the organizations, and the roles, and citizens have very few rights to their own data. In the UK we have the NHS. I can't book an appointment online, and I have no access to my data, my healthcare data. Every time I change my GP, I get asked for the same old data with a pen and a piece of paper. So I think we need to start to understand how citizen data should be used; we should give citizens more rights to that data, and they should know how their data is actually being used, too. And we should have the right to extract our data from these data infrastructures. In a world of AI, this becomes worrying, because how do I extract my data from an AI algorithm? Bots come into my website all the time and scrape all of my data, and how do I say that that's mine, that I should have some rights as to how it is actually used? So I worry that there are certain companies in the world, these top companies who are investing in AI, that are creating a monopoly for data. They're hoovering up all the data in the world, feeding it into these algorithms, and making a lot of money from it. I really worry for innovation, for small businesses, that they will have very few opportunities, and for universities too; we'll have very little opportunity to innovate and do new things. So my great worry is that we're moving towards these very large monopolistic companies.
Microsoft, investing in OpenAI; Google; these large and sometimes faceless companies are really making a lot of money. Microsoft is building the biggest data center in the world. They have plans for it with OpenAI; it's going to be full of GPUs that will be implementing the most advanced AI that you've ever seen. And when that happens, they become the all-powerful god of the Internet, in just the same way that Google is almost the sole source of search. And for education, too, you increasingly see that students and pupils are starting to turn towards AI so they don't have to retain knowledge, they don't have to get deep into knowledge anymore, they don't have to learn from books and so on. Everything is there, but it's not real learning. You're not really learning if you're just asking a prompt; anyone can do that. So I really worry about the way that AI is generally going and the way that it's eating up our data, our intellectual property. You can have a recording of Johnny Cash singing "Barbie Girl". That's a rather unfortunate world, where we've short-circuited human spirit and creativity for a machine that knows little about real intelligence. So I suppose it's a worry, but we obviously can't switch it off, and we'll just have to cope with that as a society.
[50:56] Debbie Reynolds: I share your concerns, and I think in the future things will have to be more individual-centered as opposed to company-centered. I'm not sure exactly how we get there, but I think cryptography is going to play a huge part in that, now that there's so much more computing power that people have in their pocket or in their hand. That can possibly be used, for example, for keeping data on device, for giving people more agency in how their data is shared, and for letting people revoke access to certain things. So yeah, this is amazing. Thank you so much. I learned so much from listening to you, and I'm really excited about all the educational content that you post on LinkedIn. So I would love for people to follow you, Bill Buchanan, on LinkedIn. Professor, I really enjoyed having you join us today. Thank you so much.
[51:56] Bill Buchanan: Thank you.
[51:57] Debbie Reynolds: Talk soon. Thank you.
[51:59] Bill Buchanan: Thank you.