E52 - Jimmy Sanders, Information Security, Netflix

43:31

SUMMARY KEYWORDS

people, security, privacy, data, cyber, technology, company, business, person, thought, agree, computers, building, understand, talk, widgets, lan, friends, organizations, consistent basis

SPEAKERS

Jimmy Sanders, Debbie Reynolds


Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds. They call me "The Data Diva". This is "The Data Diva Talks Privacy" podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. Today I have a special guest on the show, Jimmy Sanders. He's in Information Security at Netflix, and he's been the San Francisco Bay Area Chapter President of the Information Systems Security Association since 2014. So, Jimmy, it's very nice to have you on the show. You're one of those people on LinkedIn who just goes by your first name, not your last name, so I know you're a true cyber person.


Jimmy Sanders  00:59

Yes, it's great to be here.


Debbie Reynolds  01:02

So I will just chat a little bit about how you and I first came to interact. Let's see, we were with Lan Jenson, who was with Adaptable Security. I think her new venture is called something different now, CyberTrust, I believe.


Jimmy Sanders  01:20

CyberTrust of America.


Debbie Reynolds  01:21

Cyber Trust of America. So Lan put out a call last year when COVID got really crazy and a lot of businesses were shutting down and didn't know what to do, especially the small and medium-sized businesses that probably didn't have any alternative, that didn't have the technology in place to do things like electronic ordering or accounting. So Lan sent out a call and created this initiative called Tech Cares. She put the call out to a lot of different people, and it was basically for the Bay Area, even though there were people from all over the country who helped out with the initiative. You and I were on some planning calls with Lan, and I was really happy to see so many people who are stellar examples in their industry helping out in a volunteer way, especially at a time when COVID got really crazy and we didn't know where it was going to go. I think a lot of people turned inward, whereas this group was trying to reach out. So I thought that made the group unique, and I really wanted to make sure I talked with you and heard your story.


Jimmy Sanders  02:48

Yeah, and that's a great segue, because for me, it's all about this: you can be the person who decides to get in their shell and protect their own environment, and there's nothing wrong with that, because maybe you're not comfortable branching out yet, or you feel like you need to make sure all your ducks are in a row. Or you can feel like your foundation is set, but recognize that there are so many others out there who have rocky foundations, who don't have a digital presence, who don't have good digital security. And so, for me, I felt it was incumbent on me to work with a group of collaborators to do that.


Debbie Reynolds  03:33

Well, yeah, I'm really happy that we got asked to do that, and I'm really happy with all the stuff that came out of it. There was a lot of education, a lot of boots on the ground, people literally helping people implement things. There were five frameworks that came out of it, guides for businesses. I also want to give a shout-out to my friend Pamela Gupta. She was very, very instrumental in helping, especially on the frameworks, guides that people can actually use. So tell me about your journey. I am fascinated, and I know you didn't want me to just read off the script of your background, but I thought it was interesting that you have a background in behavioral psychology.


Jimmy Sanders  04:19

Yes. So I actually got started in computers in '99. I was going through school, taking computer classes, working on my associate's degree in computers. I actually got that degree; I just don't list it on my resume. Once I got my associate's degree in computer science, I transferred to San Jose State and started taking computer classes there. My problem immediately was that I was learning things on the job I was at that weren't actually being taught in school. So I had a big disconnect between the classes and courses I was taking and what I was actually doing in the real world, and I felt like I was wasting my time. I like computers, I still love computers, but I asked myself, what can I do that will actually expand and broaden my thinking? Because I always wanted to be in management, to be an executive. So what skills did I need? I needed to think as my employees would think, or my boss would think, or whoever that person would be. And that's behavioral science and psychology. So I ended up getting a double major in behavioral science and psychology, because I love the subject, and I love analyzing people and analyzing situations. This was even before I got into computer security, and it has helped immensely in that field as well. So that's how I got started with my degree. Once I graduated, I was already in computers, so I went from being an IT administrator to an IT manager. Then in 2006, one of my friends was walking around with this little resume; he had been looking for a security engineer for over a year at the company I worked for. I looked at the job requirements and thought, I can do that, I can do that. And you know what they say: if you can do 60 or 70 percent of a job, you should go for it. For me, the motto I impart to a lot of people is that a closed mouth doesn't get fed, meaning if you don't speak up, you won't get it. So I asked him to let me try out for it. He said, okay, we'll give you a three-month probation. I took that position, an information security engineering position, and worked my way up to what would now be called a CISO; at the time it was called senior security officer, because the term CISO wasn't around yet, or it may have been around but wasn't as prevalent as it is now. I worked there for five years, an amazing journey, and it's been an amazing journey ever since.


Debbie Reynolds  07:11

Yeah, I love it. I love that your background is in behavioral psychology, because when you think about cybersecurity, a lot of people think about technology, but it really is about people, right?


Jimmy Sanders  07:24

Oh, it is. And I smile all the time, because when I'm evaluating a vendor or talking to people, I try to abstract away the actual tool that I'm using, because I can replace it, and in some cases I'm always looking for the best-of-breed tool and system anyway. But if you can abstract that away, you start thinking about the fact that no matter what tool you have, it was created by somebody with some form of thought pattern. An American thinks differently than a Russian, who thinks differently than someone Chinese, who thinks differently than a Brazilian. So the way we would approach security is totally different from the way somebody else will. How do you get behind that thought process? To me, that's how you build holistic, resilient systems: by thinking as broadly and as diversely across the spectrum as possible.


Debbie Reynolds  08:18

I've never heard anyone say that about cyber, and it's very important. I've heard people say it about privacy, but I never thought about it in the same way: not everyone thinks about cyber the same way. So it can't be, okay, this is the way Jimmy thinks it should be done, and we're going to put that on everybody all around the world, right?


Jimmy Sanders  08:43

Yeah. So this was one of the great revelations when I actually got to Netflix and dvd.com. I had worked at companies where, if you were smart, you were almost like a unicorn: you're the smartest person there, and when you say something, people stop and listen to you, and oh, we're going to go with what the smartest person in the room said. Well, when you come here, to dvd.com, that's not the case. Yes, you're a very smart person, but that's why we hired you. It doesn't mean you're the smartest in the room. So we're going to look at everybody's opinion, because we want the spectrum of ideas, and we go by stats and metrics over feelings. Just because you happen to have the best idea today doesn't mean you have the best idea tomorrow. So how do we bring that use case? How do we ensure that we're always pushing the edge, using data? By bringing in a diverse group of opinions, you can fill gaps that you never even thought about. Because one of the big issues I have is that I see many security architects, security designs, and security systems where they consistently say the end user is the issue. I push back on that all the time, because I question: why does your design allow a checkbox field to accept arbitrary text input in the first place? Don't blame a user for being able to do that; blame your architecture, your design, for allowing it. So my role is to ensure that I'm not pushing on the user to be right. I'm pushing on the user to just do their job.
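
[Editor's illustration: a minimal sketch, in Python, of the design point above, that the system rather than the end user should constrain what input is even possible. The field names, allowed values, and rules here are hypothetical, invented for this example, and are not taken from any real Netflix system.]

# Hypothetical sketch: constrain input by design instead of blaming the user.
import re

ALLOWED_PLANS = {"basic", "standard", "premium"}   # illustrative values only

def validate_signup(form: dict) -> dict:
    """Reject anything the design should never have accepted in the first place."""
    errors = {}
    # A checkbox-style field should only ever be a boolean, never free text.
    if not isinstance(form.get("accept_terms"), bool):
        errors["accept_terms"] = "must be true or false, not arbitrary text"
    # Enumerated choices are enforced server-side, regardless of what the UI shows.
    if form.get("plan") not in ALLOWED_PLANS:
        errors["plan"] = "unknown plan"
    # Allow only an explicit character set rather than trusting whatever arrives.
    if not re.fullmatch(r"[A-Za-z0-9 .'-]{1,80}", form.get("name", "")):
        errors["name"] = "contains characters the design does not allow"
    if errors:
        raise ValueError(errors)
    return {"accept_terms": form["accept_terms"],
            "plan": form["plan"],
            "name": form["name"].strip()}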


Debbie Reynolds  10:18

Yeah, yeah, consumers need help. I think if we design things that people don't understand, or that don't really resonate with them, we're not really doing our job. To me, that's part of having a good client service focus.


Jimmy Sanders  10:36

That makes sense. But if we look at the industry on a consistent basis, most of the companies in the tech age have horrible client service. It is rare that you can actually get on the phone, call any one of these Internet companies, and complain. They intentionally make it hard for you to call them, they make it hard for you to even email them, and they make you go through this twenty-page click-through pattern. Then finally, after you've escalated it to a certain level, somebody maybe emails you back. So I think it's intentional that they've made it hard, because they've built their own kingdoms of data, and with consumers they take an almost take-it-or-leave-it approach.


Debbie Reynolds  11:28

Yeah. What is top of mind for you right now? Actually, before I get to that, I want you to impart some wisdom, drop some knowledge on me, for people who are in cyber and want to ascend to executive roles. Tell me something that would help someone, because you said a couple of things that struck me. One is that a closed mouth doesn't get fed. I've known people in information technology who feel like, if I'm invisible, then I'm doing my job. But being invisible means you don't get the budget, you don't get the money, you don't get the funding, you don't get any of that. So tell me a little bit about the advice you'd give to someone who wants to ascend to an executive position in cyber.


Jimmy Sanders  12:24

So the first question I really ask them is: is that something you really want to do? Because it may sound good in theory, but in practice it may not be what you're cut out for. I was with a group of security leaders last week, and we made the analogy that it's almost like saying you want to be Icarus. There's an issue with that, because if you get too close to the sun, sometimes it burns your wings. So you have to be willing to deal with the heat. You have to be willing to deal with the politics. There are a lot of benefits, but that means you have to be willing to operate in the politics. That means convincing people. It doesn't necessarily mean being a political animal, but you have to convince your team, you have to convince your employees, you have to convince your peers, or you have to be able to partner with them to make strategic decisions. And if you're not a person who can listen to advice as well as give good advice, then it may not be for you. Some of the keys are: can you listen? Do you have a closed mind? And just because you're smart doesn't mean you're good. That's one of the things that ends up happening: somebody is smart, and they think that makes them smart in more than their own subject area. But cyber is a whole spectrum of ideas and technology. So you must realize that, to be a leader, you can't always reach for your axe; you have to be able to use the axe, the spoon, whatever tools are available, and bring your team up to make sure they're strong.


Debbie Reynolds  13:58

Excellent, excellent. And then also, you have to get out of the silo. This goes for privacy too; privacy is about breaking down silos within an organization. The people I've seen be successful in their executive careers know how to do that. They know how to speak to people at different levels, and, like you say, they listen to people's pain points, because the challenges of someone in accounting, for example, may be different from those of someone in legal, right? So being able to listen is very important, and so is being able to break down those silos and communicate. What are your thoughts?


Jimmy Sanders  14:40

Yes. For me, it's about transparency and empathy. The transparency is that you're not coming at them with a hidden agenda. That's one of the things when you go to an existing organization, or you're building an organization that has silos in place: those silos are there for reasons like, we can't necessarily mix security and engineering, we can't mix security and accounting, we can't mix security and finance. At the end of the day, you have to come to a common understanding of what we are trying to achieve and why I am working with this group. Be transparent: I want to work with you because of X. Let them know that upfront, let them know there are no hidden agendas, and then say, here's why it will work best for us going forward; we will have a universal goal. By exhibiting and demonstrating that universal goal of why we should partner together instead of being separate, that's how you break down the silos: by sharing data first and being proactive in outreach. It was funny, I remember I was at a CIO event with a lot of CIOs, just talking to them about security, and at the end of the conversation they were like, Jimmy, security never shares any data with us; they have all the data. And I was laughing about it with my security friends, because security people think that IT has all the data and just isn't naturally giving it to security. So when you are not transparent in what you're doing, and you're not sharing that knowledge, you end up with all these misconceptions. Breaking down the misconceptions comes back to being transparent.


Debbie Reynolds  16:27

Yeah, I agree with that. I agree with it. So tell me a bit about how privacy has touched your business life.


Jimmy Sanders  16:38

Oh, for me, on a consistent basis, privacy is so huge. First and foremost, if I saw your viewing history, or you saw my viewing history, it might sway or shape my opinion about Debbie the person. Your viewing history is your private information. That's that. I don't need to know it, I don't need to do any documentation about it, and your cousin doesn't need to know it. Only you get to decide what's in your viewing history, what your preferences are, whether you want to sign up as a kid or not, or whatever you want to be. To me, that translates into other facets of security as well, meaning I work with my peers all the time because I want to ensure that when I'm going to these other sites, they're practicing security and privacy in a consistent, elevated manner too, so that we're not the only company doing security right, and we're not the only company doing privacy right. Hopefully, by teaming up, all the companies we partner with are doing privacy and security in an effective and efficient manner. Because one of the consistent things you see, especially if you've been paying attention to the news, is how people have been tracked based on phone usage, or tracked based on their cell tower data. When you look at each of these things at a micro level, you don't think it's a big deal. So what, they know my MAC address; so what, they know my phone number. But once you start piecing all that information together, they can build profiles and ideas about you that you didn't even realize.


Debbie Reynolds  18:24

I was on a call today with a woman in Europe. She said that she had requested her information from a company, and based on her browsing and the things she likes to do, gardening and stuff like that, they thought she was a 70-year-old male.


Jimmy Sanders  18:45

And that is funny, but to me that could be a good thing or a bad thing. Because I'm a privacy and security person and I try to be private, I intentionally do bad searches; I intentionally throw dirt into my search history so that it will mess up their algorithm. I do that because, and this is an inherent problem, we have a lot of entities collecting data without our consent. I was just reading about how over 100,000 people allowed their eyes to be scanned so that they could get some cryptocurrency, and the issue is that nobody has explained the implications of what happens when your biometric data gets stored. I hear a lot of people advocate for getting rid of passwords: passwords are evil, get rid of passwords. I hear that a lot, but I always push back on it, because the genius of passwords is that you can change them. Once somebody steals your iris scan, your handprint, your fingerprint, any of that data, it's not like you can go and get a new set of hands; you can't get a new set of eyes. So solely relying on biometric data that can't be changed is, to me, a big privacy and security issue waiting to happen.


Debbie Reynolds  20:14

I agree. I agree with that. It's almost psychological, so I think you'll appreciate this. Someone wants something right now, so they're willing to trade it for a future harm they don't think about down the road. That's kind of why people say yes to those things; they're like, well, I don't want to think about tomorrow, I want to think about today and what I want to do.


Jimmy Sanders  20:44

I don't even think it's necessarily that they're not thinking about tomorrow. I think they're not exposed to the technology that's coming. It's like trying to explain an iPhone to somebody 30 years ago: unless they were reading sci-fi about some man or woman in the Moon, it would have sounded like something 2,000 years from now. And now technology is advancing faster than people can recognize. I equate it to the first person on a horse who saw a train coming; they died because they were used to the speed at which a horse could run, not the speed at which a train could come. To me, that's modern-day life. We have people who are in tech, and they're used to the speed at which they see technology moving. People outside of tech don't realize that the speed is already leaps and bounds ahead of what they can conceive.


Debbie Reynolds  21:41

Yeah, right. So technology is going way ahead, and we haven't really even conceptualized what's happening, what's brewing in the background, and what will happen in the future. What is your thought on privacy? What is your biggest concern right now about privacy, professionally, personally, or just in the world in general?


Jimmy Sanders  22:08

Easily my biggest concern is that we lack what I would consider a Privacy Bill of Rights. We lack anything even remotely like it, something like the three laws of AI, anything that says, universally, they can't track your search history, for instance. I'm not saying that specifically should be part of the bill of rights, but I think there need to be some core tenets that are legally binding on a universal basis. Because one of the things that happens in the corporate world is that the first thing corporations want to do is protect their own rights. That's why we have the DMCA, and we have nothing like the DMCA when it comes to actual privacy. Why? Because corporations care about their rights a lot more than they care about our rights. So we as consumers, we as individuals who care about privacy, need to stand up and say we need something legally binding, so that when these companies do nefarious and bad things behind the scenes, they don't stay in business.


Debbie Reynolds  23:18

I agree with you on that. I get a little upset sometimes. We in the US don't have privacy as a fundamental human right; it's not codified in law. You do have organizations that say, we're going to treat you as if privacy were a fundamental human right, and that's good, right? But it's not legally binding, and it's not something that all organizations will do. That's something my friends in Europe don't understand about the US.


Jimmy Sanders  23:54

And it changes based on the executive team. One team comes in: oh, we really believe in your fundamental rights. All of a sudden, that company gets sold, and the new owners have a totally different view, right? And once that company gets sold and has a new executive team, that data is still at that company. It's not like they purge the data when they get bought. So the things you thought you were sharing with this good company, all of a sudden you're now sharing with this evil empire.


Debbie Reynolds  24:24

Yeah, I agree with that. I think also, just in terms of data and data collection, the technology is advancing so far ahead that now we're talking about data being collected that was never collected before, and being used in ways that people can't even fathom. So I wonder, how can we educate ourselves? How can we at least get a leg up? Because I feel like it's almost like giving a baby a steak at this point.


Jimmy Sanders  25:01

Yes. And this is where the EFF, the Electronic Frontier Foundation, comes in really well, because a lot of the time they're fighting for rights that you don't even realize they're fighting for. And the issue is, once you start fighting for people's rights, you can't arbitrarily choose to fight for the good person's rights but not the bad person's, because hopefully we're fighting for universal rights. So sometimes you end up fighting for somebody you may not necessarily agree with, but whose stance matters on a fundamental level. So, by looking at what the EFF is doing, and they're doing a great job, by looking at some of the things that a Bruce Schneier or a Debbie Reynolds is doing, and by following the privacy-minded people, you can keep up, but it is hard for a regular consumer to do that. It's hard. It's funny, a lot of my friends will send me, oh, what do you think about Company X? The first thing I do is go to the privacy policy and the security policy. To me, if a company has more than a two-page security policy for the public, not internally, just for the public, or if they have anything like arbitration in their language... if I see the word arbitration, I think the company is going to screw you over. There are certain buzzwords you can look for. You see that word? Okay, yep. They can say whatever else they want; if you say arbitration, you're going to do my data dirty.
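
[Editor's illustration: a small hypothetical Python sketch of the kind of quick check Jimmy describes here, skimming a public privacy or security policy for excessive length and for red-flag language such as "arbitration". The term list and the rough page estimate are assumptions made up for this example, not an established screening method.]

# Hypothetical sketch: flag an overly long policy and red-flag terms inside it.
RED_FLAG_TERMS = ["arbitration", "sell your information", "third-party partners"]  # assumed list

def review_policy(text: str, max_pages: float = 2, words_per_page: int = 500) -> list[str]:
    """Return plain-language warnings about a policy's length and wording."""
    warnings = []
    approx_pages = len(text.split()) / words_per_page
    if approx_pages > max_pages:
        warnings.append(f"Roughly {approx_pages:.0f} pages long; more than {max_pages} pages.")
    lowered = text.lower()
    for term in RED_FLAG_TERMS:
        if term in lowered:
            warnings.append(f"Contains the phrase '{term}'.")
    return warnings

# Example usage: print(review_policy(open("policy.txt").read()))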


Debbie Reynolds  26:37

Yeah. Also, I'm very much against the 80-page privacy policy. I think it should be way simpler than that. Like, do you sell my data or don't you?


Jimmy Sanders  26:47

By Design?


Debbie Reynolds  26:48

Yeah.


Jimmy Sanders  26:50

It's like when they give you contracts that are 80 pages long. It's like when you go to the car dealership and they make you wait for, like, ten hours to make the transaction. They do that to wear you down, so you finally just acquiesce and say, okay, I'm good, whatever you want, I'm good, instead of it being a quick and easy process. To me, that's why they have these policies; it's intentional. They know people just want to do some click-throughs; they don't want you to actually read the part that says, we're actually documenting your cell phone location per inch, or per meter. Because that's one of the things happening now: it used to be that they were tracking you per cell tower, but all these phone companies are getting better and better at accurately measuring you per foot, per meter. That's in their documentation, it's in the privacy policy, but it's buried on something like page 130.


Debbie Reynolds  27:44

Absolutely. I agree with that. I agree with that. What is the surprising thing that you're seeing in cyber? Obviously, there's a lot more news about things like ransomware and spyware attacks, but what comes out of these incidents that surprises you the most? I'll tell you what surprises me the most. When you find out what actually happened, a lot of times it's not like someone hung from the ceiling like Tom Cruise in Mission Impossible. It's not that. It's that grandma taped her password to the computer and someone got access. That's what surprises me. What do you think?


Jimmy Sanders  28:32

That is a great example, and what you just described is what's surprising. We have many security leaders, including myself, so I don't want to act like I'm not among them, who consistently preach and talk about resilient security and multiple layers of defense, so that no single failure gets through. And what we end up seeing, time and time again, is one simple attack breaking the whole bank. We're not practicing what we preach. The other thing, and this is not even just in cyber, is the lack of diversity, because a lack of diversity, to me, consistently produces bias. And what I mean by that is not just a lack of diversity in terms of race or gender; it's the lack of diversity in thought. You can hire 20 Black people, 20 Latinos, and 20 Caucasians from Harvard, all at the top of their class, so you've got 60 people from quote-unquote diverse ethnic backgrounds, but they may all have the same thought process. What you end up having, and what I see when you have that groupthink, is everybody building their security systems in a similar way, so that when one attack becomes successful, it cracks everybody, because we're all doing what's considered best business practice instead of really implementing tried-and-true security through layers rather than just saying we do.


Debbie Reynolds  30:07

Yeah, a lot of that is about making too many assumptions about people, what they do, and how they are. I am very sensitive to that, because I don't feel like I fit in any box. If someone had a stereotype of a Black woman and what she does, I would not fit anywhere in it. So being able to see those biases matters, and so does understanding what can happen when you don't fit in the box: you're thrown into another group that doesn't really fit, or someone makes an assumption about you that doesn't really resonate, and that can be harmful, right? Especially if people are making decisions about your career, life, liberty, or just how you live your life.


Jimmy Sanders  30:53

I was talking with a peer yesterday about this, and I was explaining to this person about leveling up, which is what I call it. What I mean by leveling up is this: when you first start off as an engineer, you're working hard, you look up to your peers, you're all on the same level. As you move up from an engineer to maybe a manager, to maybe a director, and so on, nobody cares what you did as an engineer once you become a manager. It's good that you got here. One of the things I see happen a lot is people saying, oh, I came from this small town, I worked hard, and now I'm successful. Right, that sounds good, but nobody really cares once you get to a certain level.


Debbie Reynolds  31:43

No, no, no. But some people like to harken back to the past. I'm about the future, so I try not to look back to the past, right? I'd rather talk about the future.


Jimmy Sanders  31:53

And it's not even that. It's always good to appreciate where you came from; never take that for granted. Appreciate your peers and your family, and understand that that's what helped you get there. But that's not what's going to keep you moving forward. Hopefully, you start working with the people at your current level, maybe some people at a level below, and also some people at a level above, so that you keep aspiring and keep going forward. To me, that's how you keep preparing for what's next.


Debbie Reynolds  32:25

One thing I like to talk about is statistics and metrics and things like that. To me, one of the things that seems true of people in technology who ascend to these really great executive jobs is that they understand metrics and they understand how to talk in numbers. Someone may not understand a very technical concept, but they know how to count, hopefully from one to ten, and if you're talking in numbers, that will help you communicate better.


Jimmy Sanders  33:03

Yes. So I was just talking about this, and my peers and I talk about it on a consistent basis: how do you talk the language of business? How do you translate security into the language of business, into something that's portable and digestible for everybody around you? The way I put it, and I say this about security training as well: I'm not going to do the same security training for developers that I do for administrative assistants or for executives. And just as I wouldn't do security training the same way for everyone, I'm not going to explain security or talk about security the same way either. Understanding your audience is the key; every audience needs to be talked to in a different way. When I'm among my security friends, we talk CISSP and ICMP and whatever the next acronym is. But understand your audience. You may be with a group of engineers who love to talk like that; your board may be made up entirely of engineers who want to talk buzzwords and technical details. Or you may have a board of executives from general business who just want to know the overall ROI and things of that nature.


Debbie Reynolds  34:28

Yeah, I agree with that. I agree with that. I think a major shift has happened, and you're probably right in the middle of it, in the mix of all this. In the past, let's say from when you started in computing and technology until now: when I started, it was, okay, we do this thing; let's say we make widgets. We make widgets, that's our thing, that's our focus, and we're going to use technology to help us do it faster, better, greater. What's happened over the last 30 years is that technology has become so prevalent in business that maybe you can't make widgets at all if you don't have your technology right, your security right, your privacy right. And I feel like some companies have not yet realized how important that is in terms of cybersecurity and privacy. Whatever widget thing you're doing, you literally can't do that thing if you don't get privacy and cyber right.


Jimmy Sanders  35:34

This is one of the things I see because I'm in the Bay Area, and I travel around America, and sometimes around the world, to speak to people and to companies. One of the things you realize is that Silicon Valley is a little bubble. We have all these ideas, and we think that everybody is like this, and then you go to a place, not to pick on Middle America or anywhere else, where the idea of getting things delivered is not there yet. And you ask yourself, how do you bring these ideas without disrupting the culture, without disrupting the way of life? Because technology is a way of life in the Bay Area; we think it's cool to drive your Tesla and have all the gadgets. In other areas, it's not. So how do we get them to integrate technology without disrupting their core belief system, and show them that technology isn't the enemy and privacy isn't the enemy? Because you'll find that that person can tell you a lot about their physical business; they can tell you a lot of the details. They just haven't had the gap explained to them yet, or how to make the transition. That transition is the hard part, and the digital transformation is the gap.


Debbie Reynolds  37:01

I agree. I agree with that. I have seen lots of people really try to get organizations to understand that cyber and privacy need to be talked about in the C-suite. It can't be an add-on; it can't be the dessert after the appetizer and the entree, right? It has to be a foundational conversation. How do you broach that topic with people who may have thought, again, we're making widgets, and we're doing everything fine; I have an MBA, and this is how they taught us to do things? One thing about privacy and cybersecurity that's different from the way some people were taught is that, if you do it right, it requires a proactive stance, as opposed to waiting until all hell breaks loose and then trying to jump on the problem.


Jimmy Sanders  37:59

So I do about three different things, depending on the audience. The first is the cautionary tale: you show an example of how technology, lack of security, and lack of privacy have actually taken companies out of business. One of the stats we see in technology, and in small businesses generally, is how many businesses fail. What you don't see are the true reasons why those businesses fail, and some of those reasons are that they got hit by ransomware and similar attacks. The second thing is what I call unintended consequences. What I mean by that is, you thought you were building this widget, and the widget was only supposed to let your kid record something, and then you let it connect to Wi-Fi. All of a sudden, your kid has a camera connected to Wi-Fi, and there are so many unintended consequences that can stem from that, which I can explain as well. The third thing is using real-life examples. You can talk about Target or the other well-known breaches; the list goes on. That's what I try to do, but I also try not to use too much fear, uncertainty, and doubt; I use numbers to back myself up. And I also show how getting security and privacy into your product earlier in the development cycle actually lowers the number of bugs found, actually accelerates your development cycle instead of making it longer, and actually makes customers happy, because you're not having to patch your system every two days. That's one of the tricks software companies have pulled on us: it used to be Patch Tuesday, and now they push a patch any time they feel like it. You get a new update on your phone: oh, come patch your system; oh, come back to your system. They've conditioned us to constantly patch our systems, where I am of the fundamental belief that they have us constantly patching because they're doing bad coding and getting away with it.


Debbie Reynolds  40:19

Yeah, that's probably it.


Jimmy Sanders  40:23

But, you know, my friends do push back, like, no, Jimmy, we're building in new features, we're doing all these new things. But here's the point, right? If I'm happy with what I've got, let me be happy with what I have.


Debbie Reynolds  40:41

Yeah, I agree with that. I agree with that. So if it were the world according to Jimmy, and we did everything you said, what would be your wish for privacy anywhere in the world? Technology, people, anything.


Jimmy Sanders  40:58

Transparency and consent. That would be my wish. Let me understand what you're doing with my data before you do it, and get my explicit consent; get my opt-in. It would have to be opt-in by default instead of opt-out by default. And you have to be transparent in terms of how you use my data. Yes, Jimmy, I'm trying to find out how many shirts you go through in a year so that I can sell you more shirts. Okay. Let me make an educated decision on how you handle the data that is mine. Right now, we don't even know what they're doing with the data. So for me, those are the only two things I really care about. I don't care what you're doing with the data, as long as I consent to it.
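
[Editor's illustration: a minimal Python sketch of the "opt-in by default" idea described above, where no purpose is permitted until the person explicitly grants it and every use of the data checks that record first. The class and the "shirt_recommendations" purpose are invented for this example, tied only to the shirt analogy in the conversation.]

# Hypothetical sketch: opt-in by default means the consent set starts empty.
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    user_id: str
    granted_purposes: set = field(default_factory=set)  # empty set = nothing is allowed

    def grant(self, purpose: str) -> None:
        """Record an explicit, informed opt-in for one stated purpose."""
        self.granted_purposes.add(purpose)

    def allows(self, purpose: str) -> bool:
        """Data may be used for a purpose only if it was explicitly granted."""
        return purpose in self.granted_purposes

record = ConsentRecord(user_id="example-user")
print(record.allows("shirt_recommendations"))   # False until the person opts in
record.grant("shirt_recommendations")
print(record.allows("shirt_recommendations"))   # True only after explicit consent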


Debbie Reynolds  41:46

Yeah, I agree with that. Transparency is definitely a concern. And then, two of the things that get companies in trouble, that I advise on a lot, are data collection and data retention: first of all, what are you collecting and why are you collecting it? And then they keep the stuff way too long. That's a problem.


Jimmy Sanders  42:07

Yeah, they're keeping it way too long. But everybody wants to be the next big social media company, the next billion-kajillion-dollar social media company that's holding all your data from the time you were born, so that eventually they can have AI run these metrics on you. They don't want to throw away the data, because eventually the AI will catch up to it. I mean, that's the thought process.


Debbie Reynolds  42:36

Yeah, oh, wow. Well, this has been wonderful. Thank you so much for having this chat with me. I'm sure people will really love hearing your insights about all things cyber and privacy, and I like that, even though you're a Bay Area person, you have Midwestern common sense going for you.


Jimmy Sanders  42:59

Thank you, Debbie, it's been great chatting with you. Thank you.
