E110 - Jackie Singh, Former Cyber Threat Expert for the Biden Presidential Campaign
45:29
SUMMARY KEYWORDS
people, data, security, companies, happening, world, privacy, military, hacker, outcomes, called, bit, cybersecurity, iraq, organization, incident, technology, work, safe, breach
SPEAKERS
Debbie Reynolds, Jackie Singh
Debbie Reynolds 00:00
Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
Hello, my name is Debbie Reynolds. They call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a special guest on the show. Jackie Singh. She is the former lead cyber incident responder and threat analyst for the Biden Presidential campaign. She's also an advocate for anti-surveillance or trying to stop mass surveillance of I guess individuals; she is fascinating. We had the pleasure of collaborating on a Business Insider panel around ransomware. And when you are introducing yourself, I basically just want to talk about that; I want to talk about your background. And I thought you'd be a great person to have on the show. You're such a multifaceted, talented person with an excellent, different backstory and very unique way that you've come into technology and cybersecurity and I'd just love to have you introduce yourself to the show.
Jackie Singh 01:28
Thank you so much, Debbie, for having me on your show. I am so excited to be here. It is not a typical thing for me to be recording a podcast. So thank you. I'm so honored.
Debbie Reynolds 01:40
That's fascinating. Thank you, this is great. I would love for you to tell your story or technology story. I know that you've done stuff in the military. You're obviously a very well-sought-after cybersecurity expert. I know of your teenage start as a hacker, you know, just all that. I would love you to tell the audience your story.
Jackie Singh 02:11
Well, it all started when my father got me a computer when I was 11. And I didn't really know what to do with it. But I soon figured out that I could do a lot with it. And I soon found that I could join online communities. And I soon found that some of those online communities had a corollary with the real world. And so that was my introduction to tech; I started going to Linux user groups, which are real-world meetings of technology enthusiasts, when I was about 13 or 14. I definitely vividly remember attending a few meetings the year that I was 14, and there were many older men in the room who were very confused about what I was doing there and why there was a 14-year-old kid talking to them about the Linux operating system and just trying to learn from them. But I soon migrated from those kinds of nerdy technology meetups to hacker meetups. And the type of person in attendance at these monthly hacker meetups at the local mall, where I lived in Florida growing up, it was just like a door opening to a new world. These were people talking about brand new things: how to get systems to do things that they weren't intended to do, how to circumvent systems' rules, how to do things you weren't supposed to do. And you know, as a 15-year-old, that usually took the form of getting free calls from a payphone or trying to confuse a circuit board somewhere to give me something that I wasn't supposed to have, like candy from a candy machine. And over time, I just learned more and more that I could bend computers to my will and tell computers what to do. And they would listen. And that was just so exciting for me as a kid. So that's how I got hooked, and I joined the military when I was 17, soon after 9/11. I had actually moved up to New York City by that point.
It was something that I felt I needed to do, but it was also something that was born out of the radical elements of that hacker scene, which are, you know, very anti-authoritarian in many instances, anti-government, I would say. And I wanted to know more about what the other side of that coin was, right? Like, what does law and order look like? And so I was very fortunate that the US Government, that the US Army, gave me a home. Right, I was a teenage hacker who had dropped out of high school the day after she turned 16 so that she could spend time with her friends in the hacker halfway house, quote unquote, in New York City and Brooklyn. And so for a young kid who didn't have a lot of direction, the military was really the best thing that could have happened to me. And so once I deployed to Iraq, came back, and left the military, I decided to go back over to Iraq. And I was working as a contractor, first as a person repairing vehicles; mechanic work was one of the skills that I picked up in the military. I used to work on tanks, very large pieces of equipment with a lot of electronics and weapon systems and hydraulics, and, you know, interesting things to work on and repair. And when I migrated into IT, I found a very willing mentor in the form of a man named Troy Caffee, who used to work at a company called Raytheon. And this man was the only contractor who was based in Iraq with my unit at the time, in 2003, when defense contracting in the global war on terror was a relatively new thing. And this man was so instrumental to my career because he really took me under his wing. And as a young soldier, I was able to see what it was like to have one's own office and to be able to kind of call your own shots from day to day.
And instead of being this military person, this soldier who is having to be responsive to whatever the wider military was doing that day, you know, he had a lot more autonomy, and he was able to work on really cool stuff. And so Troy, when I got out of the military and was working in that mechanic job in Iraq, he was the one who hired me for my very first IT job. So a very circuitous and strange story as to how I came to have a formal role in IT. I did have a job at an Internet service provider when I was 14, but that didn't last long. That lasted a few weeks.
Debbie Reynolds 06:59
That's amazing. So now that you've told me about your first IT job, how did you go from there to becoming the lead cyber incident responder for the Biden campaign?
Jackie Singh 07:15
Oh, my gosh, wow, there are so many twists and turns. So after that first IT job, I worked various IT jobs in roles of escalating responsibility, doing really interesting stuff. I served as a knowledge management officer for a couple of really interesting projects, working on economic revitalization for the country of Iraq on a Pentagon project. And I did some really interesting things at the US Army's Human Terrain System, which was a program to bring academics, anthropologists, into the war zone to help military commanders make better sense of the fog of war, right? If we, for example, as the US military, build a well in a village, will that actually help? Or will that harm? Or should we build more than one? Or will it be destabilizing? If we build one, should we build it somewhere else? Right, all of those questions can't necessarily be answered by people who don't have a cultural understanding of that region. And so that was the point there, but very interesting roles. I kind of bounced around Iraq; I spent a lot of time doing a lot of different things. And I eventually found my way to my first information security role in Africa. I was working in Djibouti, which is a small country in the Horn of Africa, next to Somalia and Ethiopia. If you've ever seen the movie Captain Phillips, it relates to the strait there, that narrow little area of passing ships that tends to be a security concern. But anyway, very interesting as well, but also a place that had grinding poverty, which was very difficult to deal with as a contractor. You know, obviously, I had seen a lot in Iraq, and I had been throughout other parts of the Middle East. But it's, you know, another type of thing entirely in Djibouti, where I felt like we were a bit like, I would say, an occupying force. And the US military is not the only occupying force, I would say, that has had an interest in Africa. Obviously, the Chinese are there now as well.
But just another circuitous role that took me to another part of the world that I found interesting and was able to learn more about. And so from that role, I pivoted back to the United States and found myself hired by a company called Mandiant, before they were acquired by another company called FireEye. And so this was a cybersecurity firm that was founded by a former Air Force Office of Special Investigations officer, Kevin Mandia, a very interesting organization, very interesting culture. And I was promoted several times at that company, and we did some really interesting things with data, developing some, I would say, at the time new and unusual ways of measuring a company's security posture and getting an understanding of what things an organization needs to do to be secure. Mandiant was a really formative place for me to learn a lot of things. I was working with senior managers who had come from, you know, larger firms. And they had a lot of really useful established methodologies and structure for writing, reporting, and for parsing cybersecurity data. But what they didn't have was a way to create alignment between the organization and its security goals. And so what I did was I developed a program that I called a health check. And the program was essentially designed to walk into any company and suck up a bunch of their data, deploy some sensors into their environment, pull some data from their SIEM, the security information and event management software, put that into a bucket, and derive new, useful insights from that analysis, right?
So instead of the cybersecurity folks at a company, who are used to looking at the data set day in and day out, you have a third party basically taking a snapshot of system configurations and of what's happening on that network day to day. What is the actual boots-on-the-ground truth about what's actually happening on that network, to use the military analogy, versus what the organization wants to be happening, right, the security controls, and what the organization believes is happening, right, what is being reported internally? And that can often be a very different situation from what the boots-on-the-ground truth is. And so through that health check service that I developed, I was able to build a team and deploy this service at many different companies in order to give them that snapshot of the things that don't seem to be going right, which may reflect a need to modify some processes, or implement some new defensive controls, or maybe adjust the measurements of those defensive controls. Right, if we're focusing on measuring for compliance, maybe we should be raising the bar a bit more, for example. So I just had a great time at Mandiant: a really incredible culture, tons of smart people, just incident responders that knew what to do. They had a very in-depth understanding of what systems do, how the software works, what things look normal, and how to tease out the things that looked abnormal. And so I would say that one of my core competencies is being able to identify things that look a little strange, and then pulling the thread on that, and investigating until I feel satisfied that I understand what that is. It's kind of a compulsion.
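The core idea of the health check described above, comparing what an organization believes is in place against what the telemetry actually shows, can be sketched roughly like this. This is a hypothetical illustration only, not Mandiant's actual tooling; the control names, evidence sources, and event data are all invented for the example:

```python
# Hypothetical sketch of a "health check": compare the security controls an
# organization believes are in place against evidence actually observed in
# its telemetry (e.g., events pulled from a SIEM). All names are invented.

def health_check(expected_controls, observed_events):
    """Return the controls with no supporting evidence in the observed telemetry."""
    observed_sources = {event["source"] for event in observed_events}
    return [c for c in expected_controls
            if c["evidence_source"] not in observed_sources]

expected = [
    {"name": "EDR deployed on all endpoints", "evidence_source": "edr_agent"},
    {"name": "MFA enforced for VPN",          "evidence_source": "vpn_mfa_log"},
]
observed = [
    {"source": "edr_agent", "host": "workstation-12"},
    # No "vpn_mfa_log" events were observed, so MFA enforcement
    # cannot be confirmed from the data snapshot.
]

for gap in health_check(expected, observed):
    print("No evidence for control:", gap["name"])
```

A real assessment would of course weigh evidence quality and coverage windows rather than simple presence or absence, but the gap between "believed" and "observed" is the point of the snapshot.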
Debbie Reynolds 13:45
I can see that. So how did you get enough exposure for the Biden campaign to knock on your door? Say we need your help?
Jackie Singh 13:56
Oh, my goodness, I've been asked this question so many times; I actually didn't have any special way in. I had heard about this role opening, and I sent in my resume. I didn't expect to hear anything at all, Debbie, but I was called by the Chief Information Security Officer, who had just been hired about a month prior. And he just seemed like a good dude; he was on a really good mission. And I couldn't possibly have predicted how this was going to affect, you know, me and my family over the following months as we got closer and closer to the election, and in the wake of the election. You know, just a very difficult role to have, but a role that felt so important that I'm so honored to have had the opportunity; to have been selected for something like that is really impressive to me personally. And, you know, it made me feel like there was no room for mistakes, right? It's not the kind of job where if you miss something, you know, it's no big deal. Yeah, if you miss something, you know, it's really, like, affecting the free world. So you really have to feel like you are on your P's and Q's. And I just had a lot of sleepless nights; I had a lot of time spent analyzing and worrying and fretting, but also, you know, being very aware of how incident responders need to also take time and relax and sleep and, you know, balance the work that they do, the difficult work that is incident response, just kind of bouncing from threat to threat. Balancing that with needing to take care of my children. You know, my youngest daughter was just a few months old when I joined the campaign. She was born on May 4, and I joined the campaign in July, so she was just a few months old. And it was a tough time for everyone in my family, but just the honor of my life.
Debbie Reynolds 16:07
Oh, my goodness, well two things that you talk about; I would love your thoughts about I guess the attention or the balance of working on cyber security, cyber threats from a proactive standpoint, and then also reactive. Obviously, you have to try to balance both where I feel like a lot of people, when they think about cyber, maybe just in the press, they think about it on the reactive side. But I just want your thoughts about how do you balance the proactive and the reactive part of your job.
Jackie Singh 16:49
Well, it's unfortunate that we have to be so reactive in the world. And the reason for that is that we aren't thinking enough about security to the left of the boom, where it matters, right? So in my business, right, in digital forensics and incident response, we have a saying: left of boom and right of boom. Are you a professional who works before the incident, or are you a professional who works to help support the incident, the aftermath, and helping recover from that incident? And so I tend to focus on the incident response side. But I've done a lot of work on the preventive side as well. And obviously, these two sides feed each other, right? Information from past incidents helps keep you safe, hopefully, from future incidents, if you're able to have a successful lessons-learned process, right? What is the outcome of the issues that you face? And as an organization, the very first thing you need to do is establish an incident response program so that you have the ability to respond to issues, because if you're not able to respond to them effectively, you don't really have a purpose in detecting more security threats and more security issues. Right, you've got to be able to effectively handle the incidents and then the output of those incidents. And then next, you've got to think about prevention. We're in a world where we're stuck with the security outcomes of the software that we're given. And so the software is written insecurely, there are bugs in software; there are definitely more and more languages that are implementing features such as memory safety, for example the Rust language, and there are some features in Go, which was built by Google. So there are opportunities, I think, for us to continue shifting security left as a technology industry globally. But we as cybersecurity experts need to do more work evangelizing that effort, right? Shifting security left should be the focus of what we do because incident response is extremely expensive, right?
It's just very, very costly. I was looking at some numbers just yesterday that described the total cost of cyber incidents, and we're looking at some really high numbers per incident. When you break that down to all the components of the security program, you know, incident response is extremely expensive. So we want to try to stay away from that as much as possible. And so we have this dichotomy where all of the unfortunate outcomes of bad security decisions are made earlier on in the software development stages, in that lifecycle, and we're stuck with the outcomes of those bad decisions all the way down the line. And the unfortunate reality is that most folks in tech, you know, those builders, they're often very disconnected from these outcomes, right? They don't necessarily know what happens to people on the other side, or have a better understanding of what the harms are that people face when software isn't safe. Right? So I think there are understandings that we all need to gain as an industry. There's more advocacy that needs to be done in cybersecurity, to focus less on the security of corporations and a bit more on the security of individuals. And I think once we shift that focus a little more and start to develop better ethics around the work that we do and a deeper sense of responsibility, I think we'll see cybersecurity folks come up out of their current corporate niches to talk about the things that will make all of us safer, right? Because a rising tide lifts all boats.
Debbie Reynolds 20:42
That's fascinating that you said that. So I think that there has to be a shift from kind of this corporate-think to people-think, because so much now, you know, when we're thinking about Web 3 and things like that, part of that effort is to give people more agency and more of a say in their data and what's happening with it. So I agree with you that there is a disconnect between the people who design those systems and kind of the downstream harm that may happen to people, which they may not be thinking about, or they're not even invested in. They're like, okay, I made the software, check it out, buy this car, or whatever, and I don't really care about what happens next. And what happens next is really important because, you know, it's not just numbers on a page; these are people and their lives.
Jackie Singh 21:37
Yes, absolutely, I think so. I appreciate something that you said there, which is, if I understand correctly, we're talking about data sovereignty, right, and how to empower people to have more agency over their own data. And I think, ultimately, that might be doomed to fail, because I think that people ultimately just don't care that much, right? They want to stay safe, but they don't care too much about the details, right? Like, if I pay for a gated community, for example, because I believe that a gated community is safer for my family, I depend on the HOA to come up with the specifics, right? The HOA comes up with the height of the gate. And they think about where to place the guard shack, and they think about what the process should be for that guard coming in and out every day. Where should the cameras be placed around the perimeter? Who should be reviewing those? How do we respond to law enforcement requests? Right, that's a whole security process that I think a regular resident never wants to have to think about. And that's the world that we want to get to, where folks don't necessarily have to be cyber experts in order to understand how to close their cyber door and how to lock their cyber door. Kind of basic hygiene things, because most people don't walk into their house and leave the door unlocked. I mean, some of us do live in towns that are like Mayberry still, where we know everyone and it still feels safe to do that. And that's wonderful. But most people, I would say, especially those living in cities, wouldn't feel comfortable just leaving the door unlocked. And it's something that we do reflexively, without thinking about it. I don't think we're going to be able to get to that point in cybersecurity. And I don't think that it's necessarily desirable for people to have to get to that point, right? And really, frankly, maybe we are at that point; maybe the equivalent of a deadbolt is our password. Right?
Just like a password, a key can be lost. It can be reused, like a password. But unlike a password, a key is very localized, right? It's a physical object that has a physical location. So it's very different from a password, which can be kind of sprayed all over the Internet after a breach.
Debbie Reynolds 23:48
That's amazing. Yeah, I love that analogy that you said. So I have to dig into that a little bit deeper. Give me some deep thoughts and stuff. What do you think? What is happening in the world that concerns you right now around privacy or cybersecurity? What are you seeing right now that you're saying, you know, I don't like this. I don't know what's happening.
Jackie Singh 24:14
Oh, gosh, what is it? What isn't happening? I work for a 501(c)(3) nonprofit based in New York City that litigates; we advocate against the government's use of mass surveillance. And so what we're finding is that the government, whether that's a local government, a State government, whatever, when they want data, they will simply go buy it from the companies that have collected that data and aggregated that data in order to sell stuff. Right? And so I think surveillance capitalism, right, this business model that companies have adopted of making money from our data, they're not going to leave it on the table anymore, right? It's just simply too lucrative not to profile your users, not to identify your users, or for other people downstream to be able to identify your users and target them a little bit better, right? These are simply monetary data streams that companies can't and don't want to leave on the table. And so we see companies like Apple making changes to their advertising ecosystem that zero out all of the ID numbers on your particular iPhone so that advertisers can't reconcile your device ID with all the other information they have about you. Now, companies are going even further. Now they're saying, well, we're going to find other ways to profile you; we're going to look at all of the characteristics of your phone, how much free space you have, what your IMEI number is, right, whatever little bits of detail they can get, which is pretty vast, actually, given the capabilities that Apple provides to software developers, right? Like, ultimately, Apple and Google are the platforms that create these operating systems that allow companies to suction the data that they have from our phones, which is then passed around all these different companies and aggregated and turned into these dossiers of us, which are then, you know, potentially sold to the government for some weird reason.
And then you might get caught up in a dragnet, right? You might get fingered for a crime that you weren't involved in, which has happened; you might get placed into a gang database in a city and then potentially have an issue come out of that; right, like, the automated license plate software that the police have in their cars can potentially pick you up. There are just a lot of different things that can happen. There are many different systems that are connected now in ways that the public doesn't quite understand very well. And surveillance capitalism makes all of that available to the highest bidder. Right? So we're finding ourselves in situations where power is being concentrated. So it's not just that we're losing privacy. I think people tend to have a question about, well, if I have nothing to hide, then do I have anything to lose? If people are, you know, looking through my private data, whether that's a company or a government, is that really something that affects me? For the individual, the answer to that may be no. But on the whole, for all of us together, we're collectively weakening our own safety by allowing all of this data to be compiled without the appropriate limits, without any thought as to whether that information should ever be deleted. Today, that information is kept kind of indefinitely, right? People can pretty much do whatever they want with data. The ECPA law in the United States essentially allows employers to collect email communications, to intercept anything that you're doing on a work computer, for example. And so you have no expectation of privacy in the workplace. And then when you're looking at apps on your phone, or you're using free services on the Internet, as they say, if the product is free, then you're the product. And so we're just in this ecosystem that takes advantage of our lack of understanding of how this information is being taken from us.
It's an ecosystem that has very few safeguards for people. It's an ecosystem that is an industry, the data industry, which is thriving due to the lack of regulation on privacy in the United States. You know, the wholesale data collection concerns me, Debbie, right, like this spigot infrastructure. It's like a suction-and-spigot infrastructure, as we like to call it, right? Some people are sucking up our data, and other people are opening the spigot and letting it out in different places for different reasons. And we're seeing more and more of the outcomes of those bad effects. And I worry; I think I mentioned this at that event that we did for Business Insider, Debbie, I think I mentioned quantum computing is something that's on my radar that I'm super concerned about. We're having our faces used as access mechanisms, right? Biometrics, our facial characteristics, are being used more and more to, say, open a door to an apartment building or unlock our phone. In the case of our iPhones, that information is stored on our phones locally. But our phones are backed up to iCloud. And that data is not secure at Apple; I think we tend to have the assumption that our data is secure. But when Apple kicks out third-party advertisers, or tries to, what they're really doing is building a higher-walled garden for themselves, right? It's data for me, but not for thee. And so we're just having to trust Daddy Apple and Mama Google. Yeah. And that's it. That's an uncomfortable position to be in, you know; I would love for us to have some regulation with teeth that says, this is what companies aren't allowed to do with data. And we're just not at that point, because it's too dang valuable.
Debbie Reynolds 30:23
And it's confusing, I feel, for a lot of people. I don't know, maybe going back to your analogy about someone who wants to live in a gated community. You know, I think at a fundamental level, someone who decides they want to live in a gated community, they're saying, I want to have this extra layer of safety, but I don't want to have to deal with how that happens. So I think in some ways, people want security, or they want privacy, but they don't know how to make it happen, or, you know, really who they should trust to make it happen, I guess.
Jackie Singh 31:01
Exactly. Right. I would absolutely agree with that.
Debbie Reynolds 31:05
I think there's also a confusion with people; I think people assume that they have more privacy rights than they actually have. So I think that's part of another reason why it's been hard to get legislation or regulation. At one end of the spectrum, we have people saying, I don't care, because I don't have anything to hide. But then somewhere in the middle, there are people saying, you know, nothing bad has happened to me yet around my privacy or my security. So I think that everything is good until something bad happens. So I don't know, what are your thoughts?
Jackie Singh 31:49
Oh, I mean, isn't that how people function inside corporations, right? Like, right. Look, as a consultant, I've seen this again and again: when there's a breach, especially a very public breach, especially at a public company, right, that means that there's pressure for more and more people to do something different, which will lead to a different outcome. Companies tend to open up their wallets; they're ready. They're like, now's the time to solve the security problem, we're going to be seen as doing the right things, we must be visibly solving these problems. Whereas before the breach, they may have been maybe a bit asleep at the wheel, maybe not thinking about security so much; maybe, you know, it was on the back burner and hadn't been invested in so much. And then the breach happens, and they're ready, right? I think that's human nature, right? We tend to underestimate certain things and overestimate certain things, right, the likelihood that a certain scenario will occur. For example, if you're driving on the highway, you have a much higher likelihood of getting into an accident than if you're flying in a plane. But for some reason, when we're on the plane and a little bit of turbulence happens, we get pretty dang nervous. These aren't logical; these are illogical thoughts that we have as human beings. And so it's up to us as cyber experts to understand what those cycles look like, to understand what those psychological tendencies and biases look like, and to try to work against that. And that's why I'm such an advocate of shifting security left as much as possible, of removing security from the human-centered equation and trying to create environments that are trustless or permissionless. Right?
When you think about zero trust, it's kind of the inverse of that. Zero trust is the methodology by which you're checking every single connection, every single conversation that the system is having, to make sure that it's authorized and that it's the right conversation to be having. But there are other environments being created today, such as these blockchain environments with these cryptocurrencies, that allow people to be a trusted participant from the start, to have an identity from the start that is manageable by them. And we're seeing now in those ecosystems that it can be a disaster for people to actually manage their own keys, right? Just last night there were some crazy things happening on the Solana network, where many, many private keys were stolen and then an attack was executed all at once. So some malware campaign very likely was able to steal many private keys, and then they ran a script to steal all the money from those wallets all at once. And so over the course of an evening, we watched as 7,000 wallets got drained of their funds, with millions and millions of dollars being drained. And so even in these environments where you're using memory-safe computing languages like Rust, even in these environments where you have very strong incentives to keep the network safe due to the amounts of money that are circulating, you're still seeing these very large, publicized breaches. And so I would caution anyone who, you know, hears words like permissionless, decentralized, trustless, not to equate them with security the same way that we did with zero trust when zero trust was a new buzzword fresh on the town from Google. So I think we're making some strides. But quantum computing is coming to change all of that. Yeah. And so there's a lot to prepare for.
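The zero-trust idea described above, authenticating and authorizing every single request rather than trusting anything by its network location, can be sketched in a few lines. This is a deliberately simplified, hypothetical illustration; real zero-trust deployments rely on cryptographic identity, device posture checks, and policy engines rather than an in-memory table, and all tokens and policy entries here are invented:

```python
# Simplified, hypothetical sketch of zero-trust request handling: every
# request is checked on its own merits, even if it comes from "inside"
# the network. Tokens and policy entries are invented for illustration.

VALID_TOKENS = {"token-abc": "alice"}          # authentication: who is calling
POLICY = {("alice", "read:reports")}           # authorization: what they may do

def authorize(token, action):
    """Allow a request only if the caller is authenticated AND the action is permitted."""
    user = VALID_TOKENS.get(token)
    if user is None:
        return False                           # unauthenticated: deny, even "internal" traffic
    return (user, action) in POLICY            # authenticated but unpermitted actions are denied too

print(authorize("token-abc", "read:reports"))  # valid identity, permitted action: allowed
print(authorize("token-abc", "delete:users"))  # valid identity, unpermitted action: denied
print(authorize("token-xyz", "read:reports"))  # unknown token: denied
```

The key design point is that there is no "trusted zone" branch: every call goes through the same identity-plus-policy check.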
And when we talk about using our faces as access mechanisms, and then combining that with public data going over the Internet being encrypted with asymmetric encryption, using, you know, standard HTTPS; we go to a website using HTTPS, and all of that is transferred over the public Internet. And folks are able to, you know, store that data. For example, if you're the NSA, you're storing a lot of this data that transits our borders. And imagine some not-so-distant future, you know, 5 or 10 years out, where there are usable quantum computers that are able to very easily decrypt the streams that were encrypted with asymmetric encryption, to decrypt these massive data breaches that may have been encrypted at rest at some point but were stolen from some corporation who we allowed to collect this data, who we allowed to use our faces as an access mechanism or whatever. And now we're in a situation of not being able to revoke that credential, because we cannot revoke our face.
Debbie Reynolds 36:42
Yeah, it's concerning, definitely. I talk a lot about quantum computing and post-quantum cryptography and the threats there, because I don't think people are really thinking about it. Some people think, oh, it's far off. I'm like, you know, it really isn't that far; time goes really fast. And we're already seeing bad actors outsmart us right now, right? So having so much more computing power and capability just makes the threat exponentially more dire, I will say.
Jackie Singh 37:20
So right, and here's, you know, such a good point: compute and storage have become so cheap, and are going to keep getting cheaper due to Moore's law, that we're going to continue seeing improvements in the ability of companies to collect data and to store that data for a very long time. And so the pain of the outcomes, the unintended outcomes, of data collection will keep increasing until we figure out some type of privacy legislation that helps put the kibosh on companies doing absolutely anything they want and making money off of the loss of our privacy. I mean, I think it's ghoulish.
Debbie Reynolds 38:09
Yeah, I agree. I agree with that completely. So if it were the world, according to Jackie, and we did everything you said, what will be your wish for privacy or cybersecurity in the world, whether it be technology, people stuff, or regulation? What are your thoughts?
Jackie Singh 38:39
Oh my gosh, I love this question. I don't like that world; it sounds like a very authoritarian world. I haven't had this question before. It's really good. So obviously, the first thing I would want to do is enact some type of privacy legislation, and it would be very, very strong. I would love to abolish the use of mass surveillance, and I would likely abolish a lot of the work that companies are doing today, where you're seeing this closeness between information security departments and corporate security departments at some of the largest companies in the world, including the technology companies in Silicon Valley. They have started hiring, and I would say this has been going on for some time, from the three-letter agencies in order to staff their corporate security departments. And so you have folks who are used to having very big budgets but lots of regulatory or legal limitations on the work that they do, now working at companies where they have even bigger budgets, sometimes even less oversight over the work that they do, and a very different mission and character and flavor to that work. So I would also look into making some transparency and anti-corruption changes in the United States, in order to make it clearer what that revolving door looks like between government and industry. I think there's a lot of weird stuff that goes on there. Yeah, having been somebody who's gone between government and industry, having worked as a defense contractor for many years, I think there's a lot of reform to be done in these systems. But generally I tend to look at the world a little bit like Alexei Navalny, you know; he wants to increase transparency to reduce corruption. I think that's an incredibly powerful concept.
And decentralizing as well, right? We've created so many unaccountable centers of power that are now harming our democracy and our ability as citizens to kind of control our own institutions. And it's really difficult when you have a series of companies that are so big and powerful globally that they start to rival your own government, right? And that's why we're starting to see some movement on antitrust action from the FTC. I hope to see more from Lina Khan at the FTC there, because we've let it go so far that we're at a very difficult inflection point in the country, right? How do we rein in the power that organizations have gained over the citizenry with their technological might and their data collection and analysis capability? I don't have any easy answer to that, other than that the government must focus on sane technology policy and has to start doing it sooner. I did see some very positive stuff recently about starting to talk about getting the Federal government ready for quantum computing and the advent of that, and what we need to do to prepare our agencies; I think that's really valuable stuff. So I just want to see more and more of that. I'd like to see a national cyber and privacy, well, I don't want to use the word "czar"; that's a word that's been used in the past, and I think it's really an authoritarian and overused word. I'd like to see someone who's less of an authoritarian dictator on the subject and more of a diplomat, right, more of an advocate, more of an evangelist, somebody who can help elevate these issues, help coordinate between agencies on them, and just kind of be the representative who says these things are really important, that we need to care about them as a nation, and that we're going to be the leader on these topics around the world instead of, say, letting the EU lead with GDPR, which has been, I would say, both positively and negatively impactful legislation in many ways.
But no one can deny that it is game-changing legislation for many people; it has impacted the industry quite significantly. And so, you know, as a security vendor it's grumble, grumble; as a corporation it's grumble, grumble, I have more regulatory requirements and more compliance requirements. We just need to, as a nation, enact some regulatory and policy solutions that are actually aligned with the things that keep us safe. So I think if we can focus on those things, we'll be on a better path. And I think we'll get there; it's just moving a bit too slow for my liking.
Debbie Reynolds 43:29
I agree with that completely. I like that, like a data diplomat. I agree. I think there's kind of a void there, isn't there? There is no voice.
Jackie Singh 43:40
That's incredible. I love that it could be you.
Debbie Reynolds 43:48
I guess it could be me. I don't know. We need a voice. We need a voice there for sure. That could pull all these different areas together. Absolutely. I agree. Well, thank you so much for being on the show. This was outstanding. I know that the audience will love it as much as I do. And I love your work. And I always follow the stuff that you're doing. So definitely keep me posted on what you're doing. I'm happy to support you in any way that I can.
Jackie Singh 44:17
Well, thank you so much. If anyone listening would like to follow me on Twitter, I am @hackingbutlegal. And you can also Google "hacking but legal." I've got my blog, which I keep on legal.com. You can find me on LinkedIn under the same rubric, hacking but legal, and I look forward to maybe coming back on the show at a later time. Thank you so much for having me.
Debbie Reynolds 44:40
Thank you. Thank you. We'll talk soon.
Jackie Singh 44:42
Thanks so much. Sounds great. Take care.