E54 - Chris Roberts, Chief Geek, Hillbilly Hit Squad, Cybersecurity
1:05:02
SUMMARY KEYWORDS
people, data, privacy, human, business, cyber, bloody, silos, point, industry, talk, corporations, problem, world, hacker, run, hand, technology, understand, system
SPEAKERS
Debbie Reynolds, Chris Roberts
Debbie Reynolds 00:00
Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds; they call me "The Data Diva." This is "The Data Diva Talks Privacy" podcast, where we discuss Data Privacy issues with industry leaders around the world, information that businesses need to know now. So I have an extraordinarily special guest today, Chris Roberts. He calls himself the Chief Geek at the Hillbilly Hit Squad. I have followed Chris for many years, and you are like a bolt of lightning; the content that you put out is so striking because it's so different from what other people talk about. You talk not only about cyber and things like that, but also about life and living, because we are human, right? Sometimes we forget that, and we all have human things we go through. So I get excited whenever I see that you've written a long post; I flag it, and it's like, okay, I've got to sit over here and actually absorb it, because a lot of times you just can't go past it without making a comment or saying something about it. I'll also say that when I decided I wanted to start a podcast, you were on my wish list of people I wanted to talk to. And I've been extraordinarily fortunate that I've been able to talk to almost all the people on my list, which is great. So you were like a holdout, because you were really at the top of my list. But I think the common thread is that the people I like are people who are talking about things that other people are not talking about, and you do that a lot. So I really like it. And this is so funny: you were like an enigma wrapped in a riddle for me for many years, because I had never met you, but I followed your stuff. At one point on your LinkedIn profile you had a drawing or something, so I didn't know what you looked like. And then I had the pleasure of being invited to do a panel with you for MIT; you and I did that FIDO Alliance thing, Fast Identity Online. And I was like, ah, I've arrived, I'm on a panel with Chris Roberts. It was so cool. So welcome to the show. And I'd love to just let this flow. Let's just chat.
Chris Roberts 02:57
Yeah, no, thank you. Yeah, that profile picture, if I remember rightly, is the one done by Eddie Mize, who's in the industry; in fact, we should get you and him talking as well. He's another one of those just amazing humans in the industry. He's spent a long time both on the corporate side of the world trying to effect change and in other areas outside it, and his artwork is just absolutely amazing. He's almost always at DEF CON and the BSides and the like, and any money he gets from the artwork is typically donated to the likes of the Electronic Frontier Foundation, Hackers for Charity, all sorts of stuff. He's just one of those freaking good human beings.
Debbie Reynolds 03:51
That's so cool. That's so cool. Where should we start? What do you want to talk about today? What is top of mind for you today?
Chris Roberts 04:01
It's interesting. I just came off a pre-call, and I've got a couple of things lined up. It's really interesting because there are so many people who are still talking about technology. So many people are saying, hey, we have to solve it with tech, we have to do this, we have to do that. And I raised some points the other day: somebody put a Google doc out about this cyber event brainstorming, and I'm like, alright, let's take a look at it. And it was, let's talk about risks, let's talk about scaling things, where ransomware is going. And I'm like, how about we talk about humans? How about we talk about Data Privacy? Let's look at the human aspect of it. Let's look at the policies and the control aspect. Let's not keep running off to how do we buy more tech to solve the problem; let's take that step back and look at the human behind it. We seem, for whatever reason, to have left the human behind in so much of what we do these days. And to your point, when I put my posts together, I try to put a human element in there; I try to make it much more personal. Because if I sit down and just say, well, here's the tech, it kind of goes in one ear and out the other. Whereas if you humanize the story, or you weave in humor, and humor is always an interesting one, or something that makes people think; not to be controversial, but it has to be provoking. When I was talking about the blinky lights, when I was talking about Tasers and people, I was getting at the same thing: look, if we apply some logic to this, we'll feel the same way. So let's actually humanize this situation a little bit more. What can we do about it? How can we as humans approach it? And how can we collaborate together, rather than each of us fighting our own bloody fights in our own damn silos?
Debbie Reynolds 06:08
Yeah, I'm glad you mentioned silos. Silos are a big problem, right? When I look at people who are executives, or in corporations, or who work in the business world, a lot of their teaching and a lot of their learning is about how to create silos, right? The problems that we have with Data Privacy and cybersecurity are escalated because of silos. So we need to break those walls down. What are your thoughts?
Chris Roberts 06:49
Yeah, I mean, I had a really good friend who kind of challenged me on a really, really good point, because I talked about blowing silos up, and he's like, well, hold on a second: it's still useful to have them, it's still useful to have everybody in their own streams, no two ways about it. But you need to build more bridges, more doorways, more collaboration between them, so that the silo almost ends up like a Connect Four board, where you can go up and sideways and down and all sorts of other directions. He's like, that, to me, would be more effective, because at the end of the day, you still need people in development to do development, no two ways about it. But you want them to approach a problem with a mindset of not just how to solve it, but how to solve it in a safer and more secure manner than maybe some of them do. So: I don't want you out of your damn silo, but I want you to acknowledge there are other tracks running right alongside you. And maybe every now and again you look left, and you look right, and you go, I've got you covered, and I've got a question for you, and we all do that more effectively. So you end up building this almost latticework. And to me, that made a lot of sense, whichever metaphor you use; at the end of the day, everybody's got to put their head above their cubicle and go, is there anybody else out there?
Debbie Reynolds 08:13
Yeah, I love that you talk about the human, because that's one thing I talk about a lot when I'm talking about Data Privacy. Obviously, I want businesses to be businesses, right, and make money. But you can do that and also respect the rights of individuals. And one thing that I will say to corporations, and I'm sure you have this experience too, is something very simple that is shocking to them, which is: the data that you have doesn't belong to you.
Chris Roberts 08:54
Yeah. I think, again, we've lived and breathed in this world for more years than probably you or I care to admit to sometimes, so we definitely see it. But there's an awareness to look around, a willingness to look at different businesses. You, running your own business, understand what it takes to run a business, more than just the tech and everything else; all the facets of that business. A lot of us understand that, and some of it comes with age, I'd say, more than with a business degree or something like that; it's being in the trenches and understanding what it takes to run it. So when you walk into the leadership discussions and say, hey, by the way, here's not just my perspective, but here's something you can think about, here's something you can look at, they're like, wow, I didn't think about that. And then you're like, well, hang on: how much does the rest of the business not understand? Because at the end of the day, technology and security is, again, just one silo. You look at manufacturing, you look at transportation, you consider all these other areas that make a business actually function. And you're like, okay, if you don't understand that, what else are you missing that either I can help you with, or I can bring in somebody else from another area of the business, and together we can sit down and do this? It always fascinates me. Sometimes you get some amazing leaders who really understand how things work because they care, and you get some who are just chasing that almighty dollar, no matter what it takes.
Debbie Reynolds 10:30
Yeah, I agree with that. I would love to talk about myths, right? That's something you talk about a lot; you kind of pierce the veil of myth. I know you and I have had a spirited discussion about the word hacker. And I'm reminded a lot that in the media, there are a lot of misconceptions about cybersecurity that get perpetuated, right? So two of the older, more familiar ones that we talk about a lot: one is that a hacker is a person who has a hoodie and is in the basement of his mom's house.
Chris Roberts 11:11
The guy with the hoodie; we'll ignore that one.
Debbie Reynolds 11:17
We'll ignore that one, and then there's the misnomer that someone who calls himself a hacker is a criminal.
Chris Roberts 11:27
Yeah. Yeah. I find a level of irony in that first stereotype, mostly because, I mean, I love my hoodies. And again, people tag me on it, like, oh, you know, hoodies. I do, I love my hoodies. What I hate seeing is, oh, you know, a hacker breaks in, and you've got this shit going on, and I'm like, quit that shit. Stop. Stop stereotyping us. And we see it in life all the time, in all walks of life, let's be perfectly honest. But the challenge is, sorry, the camera's gone fuzzy; it'll clear up in a minute once it figures things out on my end. That's fine. The challenge is: the very industry that we are working in was arguably formed with a lot of help and a lot of assistance from those of us who self-identify as hackers. So many of us are part of organizations like 303 and 2600, and all of these others, the BSides, DEF CON, have come about because of that hacker mentality: that thirst for knowledge, that wanting to understand how something works, wanting to understand how to effect change, and hopefully, more often than not, change for the better. But there are times when an organization needs to be, you know, kicked into effecting change. So it's frustrating, because you do get people who go rogue, no two ways about it. And there are people who have come from that side of the world who now run companies, are part of companies, are amazing contributors in this industry; those of us who've trodden that gray area and still do on a regular basis. At the end of the day, it's an ethics and morals conversation. If you're using those skills to do good, you can call yourself a hacker, or a business person, or a pen tester, or whatever you want. But if you cross over and use those skills for bad, at that point you're an adversary, a criminal, or something else. And unfortunately, with the media, I don't know if we'll ever reclaim the word, to be perfectly honest; I'm not entirely certain we will. But even if I can't completely reclaim it, I still want to put it out there, like a lot of us. I mean, there's the hashtag, you know, hacking is not a crime. There's a bunch of us out there doing our best to promote that mentality. We'll see if it happens. It's something I won't ever stop doing, because that's how we identify ourselves.
Debbie Reynolds 14:15
You talk a lot about the future, and that's something we have to talk about. Let's take two things: one, how did we get where we are now? And two, where are we going, or where should we be thinking about going in the future?
Chris Roberts 14:32
Oh, I love that stuff. It's a really fascinating topic. And it's kind of frustrating as well, in some ways, because I'm getting older, no two ways about it. I've crossed that line, you know; I've got my half-century, in cricketing terms, and I'm batting toward the next one now. So it's frustrating, because I figure, if I'm lucky, I've got 10 or 20 years left in the industry. I look at the future of technology, and on the one hand, I'm astounded and amazed at the potential for what we can do. You know, I joke about retiring to an AS/400 in New Zealand; that's one of a couple of things I've talked about. And yet, I look at the technology and go:
We're not too far off, we're down the pathway, from being able to take a human and literally synthesize them into a set of digital signals. We're doing bits of it; we've got some interesting stuff going on behind the scenes that hasn't been public, etc., etc. So you look at where technology is going, and you look at the advances we have in really figuring out what we are as humans. Can I break a human down into a set of signals? If I can, how do I do it? What can I do? Where are we going? Where does it end? Does it end? If I can make us a set of signals, then why do I need to send a mission to Mars when I can just beam the bloody signal there? So there's all of that kind of stuff. And yet, on the other hand, you look at where we are today, and you're like, yeah, we can't get people to remember passwords. Like, whoa.
Debbie Reynolds 16:13
Right? Absolutely. Absolutely. It's like technology is going way far ahead, and we're definitely struggling to keep up. I feel like maybe we're not having the right conversations in some way. We're talking about the basics a lot, right? We want people to do better, right, and not fall for the simple, low-hanging-fruit types of tricks. But I feel like the harm in the future is so concerning, where a person doesn't just want to take your account password; they can literally impersonate you, they can lock you out of your own life. So I feel like there's a bigger picture out there that we need to be thinking about.
Chris Roberts 17:07
100% with you on that. And again, it comes down to the data side of it: the data, the Data Privacy, the use of data, the abuse of data. I mean, look at how much data we're creating. I remember a couple of years ago, somebody said that in the previous two or three years we'd created something like 90 or 95% of the data on the planet. You fast forward to now, and they're saying the same thing about the last two years. We're exponentially increasing how much data we're producing, which, exactly to your point, means there's some cool stuff you can do with it, but unfortunately there's also the nefarious and negative side. A perfect example was sitting on LinkedIn: Stanford University has been doing an investigation into bodily inputs and outputs. Can you measure the health of a human by measuring bodily outputs? Well, yes, you absolutely can. So they're like, we'll make a digital toilet to do it for you. On the one hand, I'm like, cool idea, love the concept. On the other hand, you're going to upload all my data to the cloud? No, no, let's not do that. And by the way, it's probably going to go to Amazon to remind me to order more vegetables and more bloody vitamins. So there are some fascinating and amazing things we can do with technology. There was a story a couple of years ago where one of the big supermarket chains was tracking what people were buying, and they could identify whether somebody had liver problems or heart problems, or whether somebody was potentially pregnant, even before that person knew what was going on. On the one hand, that's fascinating, because you want to help people. On the other hand, you've got to earn people's trust to be allowed to do that, and we are so far from that. We've lost the trust of people. And yet they still hand it over: everybody's on bloody Facebook, everybody's on social media, everybody hands over their email address for a 10% off coupon. And we still complain about everything that's going on, and we still get sold everything. It's a really weird dichotomy as to where we're going. I love the idea of using data to help people, to save people, and I think the more we can bring that data together and ask the right questions, the more amazing things we can do. But the potential for the misuse of all of that data scares the hell out of me.
Debbie Reynolds 19:50
Yeah, I agree with that. Let's talk a bit about corporations and businesses; I can already see you laughing about this. The analogy for me is the movie Titanic: before they got on the boat, they said, oh, the Titanic is the unsinkable ship. And I'm like, I wouldn't have gotten on that thing, because if you think your ship is unsinkable, that tells me you have a huge blind spot. In some ways, I feel like that's how some corporations think: oh, we're good, nothing bad has happened yet, so I don't need to take these pre-emptive measures, I don't need to fund the cyber folks, I don't need to hire these people, because I think we're good. What are your thoughts?
Chris Roberts 20:44
Oh, 100% with you. I gave a talk fairly recently, and I talked about the trifecta; actually, I think I called it the triad. I'm like, if we're going to call it what it is, let's call it what it is: a triad. There are three areas to it. From a corporation's standpoint, they really are taking a step back and going, why do we have to do security? On the one hand, they're being told they've got to put a tick in the box. Or, in this modern day and age, they've got to put lots of ticks in lots of different boxes, and we keep giving them different ticks and boxes, and we keep telling them they're going to have to put more ticks in more boxes. So they're spinning in circles, wondering how many damn ticks they have to put in how many damn boxes. We've called it audit, we've called it compliance; we've got PCI, HIPAA, FISMA, CMMC; we keep giving these boxes new names. So that's one: you've got a corporation sitting there going, well, how the hell do we deal with this?
On the other hand, if they don't put a tick in the box, well, there aren't a huge amount of repercussions. PCI isn't going to yank your ability to take credit cards away; they've done it maybe once in the last, what, 10 or 15 years, and to such a small organization. So it's, well, okay, I've got to spend a couple of million to do this, or I can just say, I'll tell you what, I'm working on it. Same thing with healthcare and a bunch of the others. And then you've got an entire industry, our entire industry, built around putting a tick in the box. We've put a level of complication and craziness around it such that now only certain people can come into your corporation to put those ticks in boxes, and they can only use certain colors of pen to do it with. So you've got the ridiculousness there. And then you've got the poor consumer, who's sitting there going, I just want to buy shit, or I want to get my healthcare, or I want to be able to pump my damn petrol. So you've got this crazy trifecta. And the businesses are now saying, well, hang on: if I don't put the tick in the box, maybe I get breached, maybe I don't. And if I do get breached, all I have to do is go in front of Congress and go, I'm so sorry. I've got my insurance; maybe I get money back, maybe I don't. But no matter what, I'm just going to pass the cost on to the consumer. The consumer, instead of paying $100 for insurance, now pays $125, because that extra $25 just paid for me to get all my stuff back; they get to pay for the ransomware. And if the Feds tell them, don't pay the ransom? Yeah, they'll still pay it. So we haven't gone to the corporations with a solution that's effective and compelling enough for them to go: I get it, I understand why you're doing this, I understand what's going on, I can integrate it effectively, and it's not going to cause me crazy amounts of pain and suffering. We haven't done that, because, again, our industry is just so bloody fragmented.
Debbie Reynolds 23:52
Yeah. Wow. That's interesting.
Chris Roberts 23:56
Yeah, it's crazy. And I don't have a good solution. I really don't have a good solution.
Debbie Reynolds 24:01
Yeah, let's talk a little bit about cybersecurity as a profession and how it fits into business. This is something I've noticed, and to me, you're kind of going against this trend, which I love. I have a dear friend who told me his idea was that when cybersecurity is working, it's supposed to be invisible. And to me, that translated to: I want to be invisible. But when you're invisible, you don't get the funding, right? You don't get the money. You don't get the acknowledgment of what you're doing. And I think it just makes it harder for cyber to really prosper within organizations the way it needs to, because there isn't enough visibility into why what they do is really important. You do all these panels and speaking, and you're renowned for being able to speak out and tell your truth about cyber and the dangers we're facing. What types of things would you say people in cyber need to do to really get more of a foothold, a seat at the table in the boardroom, or at least be heard? Because right now it's like, oh, we make widgets, and that's so important, and we don't care about cyber until something happens; and then, okay, you can't make widgets anymore, so you're out of business, right? It should be more foundational, as opposed to, well, now we have a problem, and then we go to cyber.
Chris Roberts 25:47
Yeah. And that's the human nature aspect of it, by far and away. To your point, it's the soft-skill side of it. How do we perfect that? Again, I was on a talk earlier today, and somebody started talking about SLRs, and I'm like, oh, flipping heck. So I waited for a second to see if they would explain what it meant, because I come out of the military, where an SLR to me is a self-loading rifle. I'm like, I know what one of those is, but I'm not sure how it applies to what you're talking about; but we'll listen in on this one. And I ended up putting my hand up: hey, what the heck is an SLR? And he explained it to me, and I'm like, alright, so why the hell would you say that? Well, you know, it's our industry. I'm like, no, it isn't our industry; you're talking to me and assuming I understand what the hell you mean. Now, exactly to your point, you go to a board, or you go to leadership, and you start talking acronyms, or you go in and say, hey, we stopped all this stuff with the firewall, we've got all this, and we've bought IDS, IPS, DLP, EDR, XDR, MDR, and every other acronym. Leadership is going to look at you like you've got three bloody heads, and 20 seconds later they're going to be scrolling through their text messages. We can't continue to do that. If we want that seat on the board, we need to understand more effectively what business risk is, what makes the business tick, and literally what leadership cares about. And you're probably right: they don't care about cyber. And there is an argument to say it should just work. When you walk into the building, you expect the door to open, you expect the elevator to work, you expect the lights to come on, you expect water to flow out of the taps. There is an argument to say the cyber world, and especially the security side of it, has become so integrated into our everyday lives that, just like water coming out of the taps, it should work. We don't have to test our water every day; I don't go down every morning, put a pH strip in my water, and run a spectrograph through it to make sure the water is okay. I assume it's going to work, I assume the water's going to flow out of the tap properly, and I assume that when I drink the water or feed it to my pets, none of us are going to keel over dead. Yet we expect people in their homes to understand the computer, understand the network, patch their systems, patch their environment, understand what ransomware is, know whether their home systems are compromised, know how to do all of this. Why doesn't it just work?
Debbie Reynolds 28:32
Also, let's talk a bit about artificial intelligence, and I'll tell you what concerns me. Well, I have a lot of concerns about artificial intelligence, right? I love technology, as do you. But the issue that I have with AI, with having systems make these decisions so quickly about people, and with companies or governments taking action against people in a negative way as a result of what a computer told them, is that the harm, in my view, cannot have any adequate redress. Basically, the way we're doing this now is we're saying, oh, well, let's create these regulations. So we create these privacy laws, and we say, hey, you can't do this with someone's data; and if you do, we're going to fine you, and this bad thing is going to happen. The problem with that, in something like AI, where you have algorithms making inferences and companies or organizations making decisions, some of them not great, is that they can harm people. The harm comes fast, right? Or it can come fast and swift, or maybe it can come slow, which is even worse, where it's drawn out. But if someone's life is impacted by that, there isn't any adequate legal redress for it.
Chris Roberts 30:15
Yeah, it's interesting, because at the moment, if you look at artificial intelligence, what we have, not just in the industry but in the world in general, is what we call narrow AI: an AI with a narrow focus. A perfect example: IBM's Deep Blue, amazing at playing chess, amazing at that game show, amazing at small, specific things. But if you turn to it and say, hey, can you make me a cup of tea, it's going to curl up in the corner and whimper to itself, because it won't have a flipping clue what to do, let alone analyze the tea, the human, everything else. So it's good at doing very, very specific tasks. It's the same in our industry, and the same with everything we're dealing with on the military side of things: it has a narrow scope. You have drones out there that, exactly to your point, make a determination as to whether somebody lives or dies, based on a number of characteristics. At the moment, we have a human between that analysis and that decision point, so that live-or-die is still a very, very human decision. How long that continues remains to be seen. Beyond that, my biggest challenge is: how often is that program being trained? Talk to me about the data that you're continually using to train it. Is that data refreshed on a periodic basis? Is it a near-real-time data feed? Or did you upload data six months ago and, yeah, we're all in good shape? And then let's also talk about the quality of that data. We've been doing some interesting and fairly nasty studies on what is officially termed adversarial machine learning. Let's pick a controversial topic: distinguishing between humans. We all hear about how a machine distinguishes between humans, sometimes effectively, sometimes not. If I give the machine your face and my face and say certain characteristics in this face are great, and certain characteristics in that face are not so good, the machine can determine which one of us gets the cup of tea and which one of us gets the thanks-and-goodbye letter. If I'm nasty, and I'm an attacker or an adversary, I go into that data stream and I say, no, no, no: rather than saying red beards are good, let's say red beards are bad. Rather than saying red scarves are a warning sign, let's say red scarves are good. Red in certain placements: my red beard is okay, your red scarf is not okay; now let's flip that around. I start to train the machine on my version of the data rather than your data stream. How are you testing for that? How are you catching it? All of a sudden, you get the thank-you and I get the boot. We see it in medical intelligence systems too: machines making decisions based on characteristics. Well, what's the source of the data the machine made that decision on? How trusted is that source? One plus one equals two is a fairly standard thing, unless somebody's messing around with the mathematics. If I mess around and say one plus one has to equal three, all of a sudden the machine goes, gotcha, and now it makes a whole bunch of decisions based on that. So my concern is several-fold: it's the quality of the data, the type of data, the source of the data.
And then, honestly, the other big one in intelligent systems is whose bloody hands are on the keyboards programming it in the first place. We have an issue with getting enough people from a diverse enough spectrum, and we're not just talking race, religion, color, or region; we're talking diversity of ideas as well. I love the Hit Squad; the team that we have is hilarious. We've got myself, we've got Jesse, who wears overalls in Nebraska, and we've got a guy who came off the bloody reservation. The three of us have crazy, diverse ideas, but we all come together and figure things out, with some amazing results. Yet you look at the intelligence systems, and unfortunately a lot of the coding and programming is being done by a very, shall we say, standard type of human being.
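[Editor's note: a minimal sketch of the label-flipping style of data poisoning Chris describes above, assuming a toy scikit-learn classifier. The "red beard" and "red scarf" features, and the whole pipeline, are invented for illustration and are not from any real facial recognition system.]

```python
# Toy illustration of data poisoning (adversarial machine learning):
# an attacker with access to the training stream flips labels so the
# model learns the inverted rule. All features and labels are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Two illustrative binary features per sample: [has_red_beard, has_red_scarf]
X = rng.integers(0, 2, size=(500, 2)).astype(float)

# The defender's intended rule: a red scarf (feature 1) is the warning sign.
y_clean = X[:, 1].astype(int)
clean_model = LogisticRegression().fit(X, y_clean)

# The attacker flips every label in the training feed.
y_poisoned = 1 - y_clean
poisoned_model = LogisticRegression().fit(X, y_poisoned)

me = np.array([[1.0, 0.0]])   # red beard, no scarf
you = np.array([[0.0, 1.0]])  # red scarf, no beard

print("clean model flags me/you:   ", clean_model.predict(me), clean_model.predict(you))
print("poisoned model flags me/you:", poisoned_model.predict(me), poisoned_model.predict(you))
# Same inputs, opposite decisions: the machine was trained on the
# attacker's version of the data, which is why the provenance and
# freshness of the training feed matter as much as the model itself.
```

Defending against this is exactly the question raised here: validating where training data comes from, how often it is refreshed, and testing the model against deliberately corrupted feeds.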
Debbie Reynolds 34:54
Right, right. Let's talk a little bit about bias. And then also,
Chris Roberts 35:02
Implicit or explicit bias?
Debbie Reynolds 35:04
Yeah, all bias. These things concern me greatly, for obvious reasons, right? Knowing that Black women are very, very often, I think the statistic is 10 times more often, misidentified by facial recognition systems. It's crazy. And there isn't enough diversity, like you said, not just of creed or color, but also diversity of ideas, in these spaces. And it concerns me, because a lot of these technologies are being used for life and liberty decisions, and then how do you fight your way out of that? So I'm just concerned about the over-collection of data, the inferences that are being made without your knowledge, and then what happens next?
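[Editor's note: a minimal sketch of how the disparity Debbie cites is typically measured: per-group error rates from a matching evaluation. The groups and counts below are invented to show a 10x gap, not real benchmark results; demographic audits such as NIST's face recognition vendor tests report this kind of breakdown.]

```python
# Sketch: computing per-group misidentification rates from hypothetical
# evaluation records of a face-matching system.
from collections import defaultdict

# (group, prediction_was_correct) records from a made-up evaluation
records = (
    [("group_a", True)] * 990 + [("group_a", False)] * 10    # 1% error
    + [("group_b", True)] * 900 + [("group_b", False)] * 100  # 10% error
)

totals = defaultdict(int)
errors = defaultdict(int)
for group, correct in records:
    totals[group] += 1
    if not correct:
        errors[group] += 1

for group in sorted(totals):
    print(f"{group}: misidentification rate {errors[group] / totals[group]:.1%}")
# group_b's 10.0% rate is ten times group_a's 1.0%: a system can look
# accurate "on average" while failing one population far more often.
```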
Chris Roberts 35:57
Well, that's where it gets really interesting, because we've gotten to a point, fortunately or unfortunately, where we don't necessarily trust the system, yet we need to collect more data to effectively train the system. And exactly to your point: you go to a population that's already had issues with the system, and you say, hey, I need to collect more data on you. They're like, why? What are you going to do with it? So now you have to look at trust inside the system. You go back to a population and say, hey, I've got to take more data on you. On the one hand, yeah, I'm actually happy to, I want to help train the system; on the other hand, you're like, what the hell are you going to do with this data? Who's going to have it? Who's going to look after it? Who's going to manage it? Who's going to be affected by it? And ultimately, what the hell are you doing with it? We didn't start off on the right foot; that, I think, is the biggest problem. And this is the one thing that drives me nuts with statistics: I can look at statistics and make them say anything I want them to say. We all have. Part of the problem is we've not been consistent; we've not all sung from the same song sheet. You look at society: if it were the same voice being used to describe what we're doing and how we're doing it, and we used the same terminology, we might stand a better chance. But we've got so much bloody infighting that if I come to you and say, hey, look, I want to train my intelligence system to more effectively represent people who aren't old white American males, or old English or European males, on the one hand you'd be like, okay, if you tell me what you're going to do with it; but I've also got to fight off the 20 other companies and try to dispel what they're saying before you even get to a comfort level of going, alright, I have a level of trust with you. So it's another one of those: until we can instill more trust in the system and more trust in the data, and until we can highlight the good. And this is one of the other things: news always likes bad things. We see 100 bad situations where technology is used for the negative, but rarely do we see the stories bubble to the surface where technology has been used for the positive. A perfect example would be when Snowden went off the rails and released everything. The agencies took a lot of stick for plugging into the feeds from Facebook, LinkedIn, and everybody else. Everybody was crying about liberty and privacy and everything else. What did not come out of any of that were the several hundred instances where that data had been used to correctly identify somebody who was one step away from walking into a building with a freaking bomb vest on, or doing another McVeigh under a building. That's the stuff that never came out. Rarely do you hear the positive stories; unfortunately, you always hear the negative ones. And that, I think, is a problem with news and media, and especially the social media aspect, where things get blown up so quickly that you just don't have time to react to them.
Debbie Reynolds 39:39
Well, let me ask you about this; this is interesting. I feel like the future is about your gaze, right? Your eyes, what you look at, what you like, what you love. That's the information that companies want. That's what advertisers want. And I think...
Chris Roberts 39:59
Where on the screen are you looking? Where's your mouse moving? Where are your eyes looking? How long are you hovering? Oh my gosh, there's an entire industry, and it's crazy how big it is, an entire friggin industry that just wants you to click one more time. That's scary.
Debbie Reynolds 40:17
But it brings up privacy issues. And I guess the friction there is, let's say you're in public. The things you look at, maybe that isn't protected, right, because it's in public. But the inferences that are being created from what you look at and what you see: I think there's going to be a fight about who owns that information. That it's being collected, we already know. And then, as you say, like statistics, it can be utilized in all types of different ways that could possibly be harmful to a person, and that may become a privacy issue.
Chris Roberts 40:57
To me, honestly, it's not even that it can be harmful; it is. So, I've got an amazingly good friend, and I absolutely freaking love him to death; however, he has different views than I do. We have a number of times gotten into conversations over different viewpoints on the vaccine, different viewpoints on presidential stuff, different viewpoints on humans, whose lives matter, whose lives don't matter, and why; and we've gotten into some really interesting discussions. So I sat down with him once, and we went through our browser histories. And I'm like, okay, if you look for this, and I look for this, what do you see? If I look for something, I see this; if he looks for something, he sees that. Okay, we both typed in exactly the same thing; why do you think that is? So we start talking about how not just Google but, almost without fail, most of your standard browsers and search engines are like, oh, we'll help you, we'll serve up the content you want to see. But if you're continually looking for, you know, aliens in Area 51, then every single time you look for something even akin to that, it's going to serve you more of the same. So we end up in this funnel, and the funnel gets narrower, and narrower, and narrower. And then, three weeks later, or three months later, we find ourselves with a tinfoil hat on our heads, understanding that under Denver airport we've got the lizard people running the government, the Illuminati is basically at the core of the Hollow Earth, and the former Nazis are actually sitting on the moon base. And you're like, how the hell did I get here? Well, congratulations, welcome to Google search. And then you've got other people who've gone in completely the opposite direction and figured out, you know, how to solve global warming and put everything back in the bloody oceans where it needs to be. And we're all using the same tools, and we're all pulling from the same data sources. But because the system is doing an unfortunately very, very effective job of profiling us, it's almost been too efficient at handing us what it thinks we want, sometimes before we even want it. So it behooves us, as humans, to ask more questions and to look outside of our comfort zone. And unfortunately, most of us are not good at doing that, or we don't like to hear ideas that are contrary to what we believe.
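[Editor's note: a minimal sketch of the feedback loop Chris describes, assuming a naive recommender that multiplies a topic's weight every time it is clicked. The topics and click behavior are invented for illustration; real ranking systems are far more complex, but the narrowing dynamic is the same.]

```python
# Toy model of the personalization funnel: serve topics in proportion to
# their weights, and boost whatever the user clicks. Everything here is
# hypothetical; it only illustrates the self-reinforcing narrowing.
import random

random.seed(42)
TOPICS = ["aliens", "gardening", "cricket", "privacy", "cooking"]
weights = {t: 1.0 for t in TOPICS}  # day one: the whole library, evenly

def recommend():
    total = sum(weights.values())
    return random.choices(list(weights), [w / total for w in weights.values()])[0]

# A user who clicks "aliens" whenever it is shown.
for day in range(50):
    if recommend() == "aliens":
        weights["aliens"] *= 1.5  # each click compounds the bias

total = sum(weights.values())
for topic, w in sorted(weights.items(), key=lambda kv: -kv[1]):
    print(f"{topic}: {w / total:.1%} of the feed")
# After a few weeks of clicks, one topic dominates the feed: same tool,
# same data sources, but the funnel has narrowed around the user.
```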
Debbie Reynolds 43:28
I agree with that. I talk about that a lot. The analogy I give is: when you're on the internet, you think you're in a library, but actually you're in a section of the library that's been chosen for you.
Chris Roberts 43:43
It's been chosen for you. I mean, day one, you walk into the library, and it's like, here's the whole library. Day two, you walk in, and it's like, hey, we saw you in section number four; here's section number four. We saw you on floor two; here's floor two. Rather than the system saying, hey, we saw you on floor two, do me a favor, get out of your own head and go look at section 6.4, or go look at section A.3. And again, as humans, we are not always good at looking in the mirror and questioning what we've been told. Plus, for good or for bad, everything on the internet is true, right? You know, it's tough. And then you take the internet as one source, and you tie in news media and what they're pulling off the internet as well; and I use "news" very, very loosely. I don't care whether you're left side, right side, or right down the middle, they're almost all as bad as each other, because they're pulling from stuff that corroborates their viewpoints, and they're ignoring everything else. It's not that the rest isn't important; it's that they're going to corroborate their viewpoint.
Debbie Reynolds 45:16
What are your thoughts about the importance of Data Privacy? I feel like, I don't know exactly when, but when companies started having a lot more breaches and things like that, people started to pay a little bit more attention to cybersecurity. And privacy was kind of the abandoned stepchild until GDPR came out, and then everyone was like, okay, privacy, privacy, privacy. So where do you think we're going with that, in terms of not only companies trying to figure out how they handle privacy, but also how the cyber and privacy folks can work together?
Chris Roberts 46:02
You know, GDPR is a really, really interesting one. Because you're right: when GDPR came out, everybody was like, yes, privacy for the common man and woman and child and beast. And then what happened is, all of a sudden, the few poor people who were beholden to uphold GDPR suddenly realized that they had more work than they knew what the hell to do with. So now you've got this crazy backlog of companies that need to be assessed and fined and all this other stuff. They've gone after a couple of big ones, but that's about it, and they've missed so many others. So it feels as if GDPR has lost a lot of its momentum. The other challenge, to take the American-centric thing for a second: as much as I love this country, it isn't one country. It's 50-odd different bloody states that are all fighting for their own damn rules and regulations. What California wants, Texas doesn't want; what New Hampshire wants, Colorado doesn't want. I'm like, y'all, get your shit together. Seriously, privacy is privacy; stop debating this shit. Identify the 10 pieces that every single one of your freaking citizens needs to own for themselves, and just figure this out. Stop putting 50 different sets of rules and regulations into place. Because as a company, I'm laughing my socks off watching everybody spin and spin and spin. And it's going to be another one of those: well, do I pay attention? Maybe; maybe I don't. I'll just end up in the queue, we'll just put another couple of lawyers on it, and the customer ends up being paid 20 cents on the dollar. Done.
Debbie Reynolds 47:58
Right. Yeah.
Chris Roberts 48:01
I sound cynical, yeah. But I don't think I'm wrong; it doesn't feel like it. Because you have New Hampshire and California and a bunch of other states that said, we're going to fight for the privacy of our citizens. Okay, now, how's that going? Have you stopped the breaches? No.
Debbie Reynolds 48:18
Well, I like to point out to people the difference in the privacy regulation in the US, which is consumer privacy as opposed to human privacy. The example I give, especially when people say, oh, well, we have privacy as a fundamental human right, which we don't in the US, is this: let's say you're in California, which has the CCPA. If you walk into a grocery store in California, they have to abide by the CCPA, right, because they're a for-profit business. But if you walk across the street to a church, they don't. So what happened? You're still the same person, but your data can be used in totally different ways based on the industry.
Chris Roberts 49:06
And again, I hate to say it, but this is where my dictator hat comes on: I'm done giving people the option. Seriously, put me in charge for a week; we'll have this shit figured out. It won't be pretty, but we'll have it figured out. Because every single one of you sorry sons of bitches that's lost all my data, first and foremost, your asses are going to be tarred and feathered. We'll just start there, and I'm going to make an example of you. Secondly, if you're one of those ones that's putting a lawyer in place to argue, rather than actually putting somebody in place to go, hey, how do we do this, I'm taking you down as well. And then, exactly to your point, it should be one rule for all. I don't care who you are, and I don't care what you do; I care what you do with that person's data. Because until we, and by we I mean corporations, industry, security, can actually show that we care for the very people we're meant to be protecting, we don't have a leg to stand on. I have no business and no right to come to you and go, hey, can I take 25 pictures of your face, because I want to train my system. I have no business doing that, because I cannot guarantee that I can look after those pictures effectively. Until I can do that, I have no right and no place, and that's part of the problem. But okay, here's the counter to that one, and this is where my Big Brother side comes out, because I have a Big Brother mentality as well. I love people; to some degree, I love them for about 10 minutes every single day, and that's about it. The other 23 hours and 50 minutes, I just don't like humans. Oh my god, they're a pain in the ass, especially this last year and a half, two years. Most humans have lost their minds in the last two years, let's be honest. So let's say, taking the US-centric approach for a minute, I'm going to go to every single person in America, and I'm going to knock on the door and go: hello, Mr. and Mrs. Smith, I'd like to inform you that you're now in charge of your privacy. You're now in charge of your data, and your Social Security number, and your credit card, and your information. Have a nice day, ta-ta for now. And I do that at 30, 40, 50 million houses. How many of those people are actually going to be able to look after their data? How many of them are capable? How many of them care? You've only got to go on Facebook or Twitter, not LinkedIn so much, and just search for credit cards: how many people are like, look at me, I got my first credit card, and they put it up online. Oh, hey, I got my driver's license, and they put the darn picture on the site. At what point did you think it was good to show your driver's license and your credit card number to the seven and a half billion other people on the internet? Why do we still have scams? Why do people still fall for the phone call that comes in from the "FBI director" telling them they have to send 20 prepaid Visa gift cards? We as humans arguably can't look after ourselves. This is where my Big Brother side comes in; this is where my agency hat comes on: you can't look after yourselves, so we're going to do it for you.
You know, there is one part of me that wants the robot overlords to take over. Okay, hear me out on this one; let's go all the way on this one. You've got me on a rant. How many years have we had a sign on the packet of cigarettes saying, if you buy these and smoke these, your ass will die? And what do we do? We go buy them, and we smoke them. Whoo, because we're free, we can make our own choices. No. You shouldn't be allowed to be in charge of a kettle, let alone your own personal safety and privacy. I mean, good God: we get handed a hot cup of tea, and then we sue people because it's hot. Seriously. Robot overlords. Done. I'm getting off my soapbox for a second.
Debbie Reynolds 53:51
I know, I know. We need help. And I think it's an extraordinary opportunity, right? I feel like, as consumers, we are helpless. We need help, and we're not getting it right now. So I think there's a huge opportunity for businesses to help consumers in that way.
Chris Roberts 54:09
I think that's it. Again, we've talked about the soft skills; to me, it's about education. It isn't telling people, do not click something, do not send something; we all have to do that anyway. It's: how do I help you ask another question? How do we help you think before you act? Ryan, an amazingly good human being, another one you probably should have on the show, has a good way of looking at it: we're analog humans in a digital world. And if you think about it, and I use this example a lot, as a kid in the kitchen, your guardian or parent or whoever said, don't touch the kettle, don't touch the stove, it's hot. But we still did it; we learned through experience. There is no good way for us to learn the digital lesson; unfortunately, by the time we do, it's too late. And I think that's part of the Data Privacy problem. I hand my credit card over all over the damn place to pay for things, but rarely do I think about it until I get a new one because somebody's turned off the first one: hey, you know, you lost it again.
Debbie Reynolds 55:16
Yeah, once you fall off the cliff, you can't unfall off the cliff, right?
Chris Roberts 55:21
But in the digital world, people keep putting you back on the cliff: hey, you've got another card; hey, you've got another Social Security number; hey, you made this mistake, we'll forgive you that. No, your ass is down there. And I think this is the other challenge that we have in security, and in technology in general. Let's face it: in technology, we literally own the keys to the kingdom. If we turned everything off, humanity would be in a bit of a state, in a bit of a pickle, for the most part. There are some amazing people in the Midwest who'd be like, yeah, great, we can do that for a bit; we're just going to keep farming, we know what the hell we're doing. And I'm like, yes, but most of us would be screwed. I mean, we're done; we might as well just go out into the neighborhood and start eating our friends and neighbors at that point, and it's, am I having you with barbecue sauce or no sauce, because that's about what it's going to come down to pretty quickly. So when you start looking at it that way, it's: how do we help people understand the questions? How do we help people understand what they should be thinking, what they should be doing? And that's where we have a lot of work to do on the communication side of the world.
Debbie Reynolds 56:34
I agree with that. We have to keep pushing for it, that's for sure. What would be your wish, if it were the world according to Chris and we did everything that you said, beyond the robot overlords? What would be your wish for privacy anywhere in the world, whether it's law or technology?
Chris Roberts 56:54
Man, that's a huge one. It's a tough one. You know, I've run into this a number of times, because I've now got a new tattoo, courtesy of Gokhan. So I've got the tattoos, I have the nails, I have no hair, I have all this. What I want is for people not to judge a book by its cover. You want to talk about data, you want to talk about privacy: I want somebody to look at you, and to look at me, and look beyond what they just see. Use all that data. Look at all the data: what's all the good that has happened? What are all the amazing things this person has done? How can we judge them more effectively on who they are, what they do, what they bring to the table, what diverse ideas they bring, their thoughts, their processes, rather than just looking at the very first cover page and going, yep, nope, we're done? That, I think, might be my biggest wish: for people to think twice, maybe listen to an inner voice, rather than just that instinctive "different equals bad."
Debbie Reynolds 58:09
Yeah, I agree with that a lot. I know in past executive roles, sometimes I'd pick people that maybe other people thought were no good, and it's like, well, I know what I'm talking about, and I know talent when I see it, so I can do what I need to do with that person. There's an old saying that I don't like; I heard someone tell me they learned it in graduate school: the best person for a job is the person who's done it before. And I don't agree with it. I think that's a lazy way to think. How do people evolve and grow? I've been really proud to be able to take people who didn't fit in the box; there is no box, as far as I'm concerned. You want to find talented people and have them do their best. I've taken people from different industries, even, and had them blossom in their careers. So I really like that, and I like to foster thinking outside of the box. Unfortunately, not everyone thinks the way we do, but it's been successful for me.
Chris Roberts 59:19
Yeah, I'm totally with you. In the world of mathematics, there was a whole upheaval and uproar, shall we say, not that long ago, and it was exactly the same thing. There was an individual who came out of India who hadn't gone through formal education, hadn't gone through the normal or standard pathway of, I'm going to go to Oxford or Eton or Cambridge or the Ivy League or something like that. They came at it in a different manner, and I remember hearing their initial theories on some of the theoretical mathematics and thinking, that is somebody who came at it from an entirely different viewpoint. And NASA: there was a story I came across on one of the feeds about an intern, a young kid, who came in and discovered a new Earth-type planet within three days of being there, because they approached it differently. That's what we have to have. And the only way you do that is if you are receptive to looking around and going: how do I find somebody different? How do I challenge myself? Because you're right, the easy way out is, oh yeah, they've done it before, they can do it again. No. Hell no; just hell no.
Debbie Reynolds 1:00:37
Exactly. I agree with that, 1,000%. And now, I mean, we're going to have problems in the future that we've never had before, so we need different people and different ways of thinking. It's almost like someone saying, I want to hire a cloud architect who has 50 years of cloud experience or something like that. It's like, oh, wait a minute: these are emerging issues, emerging risks, emerging technologies. There is no one with five years of experience in XYZ on their resume, you know?
Chris Roberts 1:01:15
Well, and as you pointed out, you talk about data, Data Privacy, and data integration. It's one of those things where the more integrated into a business we become, the less I can just have a tech specialist walk down to the manufacturing site and say, hey, this is how we're going to do it. You go to a foreman at the manufacturing site, and she goes, excuse me, that shit ain't gonna work, and here's why. So rather than fighting it, you sit down with that person and say, hey, how do we make this work? We've got to make some changes, that's going to happen, but how do we do it more effectively? That, to me, is where we have to step up as security people. For a long time, unfortunately, we approached this as though we were going to be the special snowflakes, and that doesn't work anymore; it really shouldn't have worked in the first place. We've got to eat some more humble pie, go back to the business, go back to DevSecOps, and say, hey, sorry, how do we make it work?
Debbie Reynolds 1:02:22
I agree with you. This is wonderful. Oh man, I have to invite you back, because I could talk for hours.
Chris Roberts 1:02:29
Well, yeah, we'll have more conversations. We've got plenty of soapboxes you and I can both get on to have some conversations.
Debbie Reynolds 1:02:36
And I want to thank you as well. You put out a post last year about diverse voices within privacy and cybersecurity for speaking, and it was wonderful that you put my name on the list. I thought it was great.
Chris Roberts 1:02:52
Yeah, I need to do another one of those, because I didn't realize Rachel had done another one, Rachel Arnold. Oh my gosh. You need to have Rachel on the show.
Debbie Reynolds 1:03:00
Oh, I have to have her on the show. Yeah, I was looking at her stuff. I love Rachel.
Chris Roberts 1:03:03
She is freaking amazing. So much love for her. She actually hit me up, and she called me on the carpet, right, and I hadn't even realized it. I'm on a panel soon, and I didn't realize until I looked at the pictures; I hadn't given it a thought, and I got pissed off at myself. I looked at the pictures: a bunch of all white guys. And I'm like, damn it. She was right, so I'm like, yeah, my failure, my big one. So no, she's good. I've got a shit-ton of love for her.
Debbie Reynolds 1:03:34
I love her. I love Rachel. She's amazing, and she's so funny; she has such a great sense of humor. She's unbelievable.
Chris Roberts 1:03:42
We were up in Grand Rapids at GrrCON the other week, and it was just great because she was up there. I finally got to meet her man-thing as well, and he's another one of those who's like, I can deal with humans for a certain amount of time, and then I'm done. It's so interesting to see the differences; she is such an outgoing and amazing individual.
Debbie Reynolds 1:04:04
Yeah, she's definitely going to be on the show. She's amazing. I agree. Well, thank you so much for this, and I'm sure we'll talk more soon. I was so excited to have you on the show, and of course we went overtime; I can't stop asking questions. But yeah, we'll do this again soon, for sure. Thank you so much.
Chris Roberts 1:04:29
Thank you. This was fantastic. Thank you for having me on.
Debbie Reynolds 1:04:32
You're welcome.