E132 - Michael Thoreson, Founder, KRATE Distributed Information Systems

51:00

SUMMARY KEYWORDS

people, privacy, certifications, ai, cybersecurity, human, talking, robot, security, data, knowledge, whatnot, compliance, put, chat, podcast, person, conceptualizing, job, thought

SPEAKERS

Debbie Reynolds, Michael Thoreson

Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.

Debbie Reynolds  00:14

Hello, my name is Debbie Reynolds. They call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a special guest on the show all the way from Canada, Michael Thoreson. He is the founder of KRATE Distributed Information Systems. Welcome.

Michael Thoreson

Hello, Debbie. Thank you for having me here.

Debbie Reynolds  00:41

Yeah. First of all, forgive me that it's taken me so long to invite you on the podcast. You and I have chatted online, we've had calls, and we've done stuff together. And as I looked through my list, I was like, Oh, my goodness, I forgot to write Michael. So this is gonna be a fun conversation to have with you.

Michael Thoreson  01:01

Definitely, we never have a shortage of things to talk about when we get together.

Debbie Reynolds  01:06

Exactly. Well, your background to me is very fascinating, because you're one of the people who, I feel, understand the security side of data and also the privacy side. So you're very well versed in both, and I feel like you're the type of person who can help talk about that gap between those two things. And then also, at some point, we want to talk a little bit about certifications. But before we get to all that, I would love for you to introduce yourself and tell me what interested you in being able to move over and distinguish yourself in privacy.

Michael Thoreson  01:56

That's a loaded question. Oh, man, we don't have time in the episode for that, or anything else in the episode. I'm passionate about cybersecurity and Data Privacy, but first, I started out on the cybersecurity side of things, because I always had an interest in technology. So I started out getting certified through CompTIA, the A+, IT administrator, and Network+ certifications, back in 2006. And early on in developing my career, actually just shortly after high school, I suffered my first identity theft via a phone call from a car dealership in another city in the same province congratulating me on the purchase of my vehicle. And I said, no, I haven't been to that city, you know, since I was nine. And I couldn't afford to buy a pizza right now; I'm a student, I'm taking a systems analyst course, I'm working two jobs as it is. This just wasn't me, basically. And then, several years after that, there was a second attempt at identity theft, and there they weren't as successful, because the first identity theft had kind of spiraled out of control, and I declared bankruptcy as a defensive measure to stop stuff from being charged to my credit account. And when they failed to charge something to my account in that second identity theft, they thought they would be cheeky and changed the name on my credit profile. So I was very motivated from early on in my career to look into how we improve the protection of data, which led ultimately to the development of KRATE, where we are evolving how the world creates, consumes, controls, and protects data. And after doing several years on the cybersecurity side of things, developing some technologies in-house and talking to various experts, I saw a parallel between security and privacy. So I took some training from Jamal at the Privacy Pros Academy, whom you've had some previous dealings with; I think you were on his podcast at one time, if I'm not mistaken.
And then I realized that the knowledge and the skills transfer from security to privacy. And so after getting my CIPP, I kind of went a little hog wild on certifications and got the CIPP/C, the CIPM, the CIPT, and now I'm an FIP with the IAPP. And another parallel is that cybersecurity loves its acronyms, and I'm learning privacy is also full of acronyms.

Debbie Reynolds  04:39

Yeah, there are two things happening with privacy and cybersecurity. Well, it may be more than two things, but two major things. One is that sometimes, depending on the location that you're in, there is a mashup or confusion where people sort of try to lump privacy and cybersecurity together. And that happens a lot in the US, where, for example, when they talk about cybersecurity, they think that privacy is under that in some kind of way, like it's sort of combined together, where I feel like other jurisdictions, definitely Canada, definitely Europe, make distinctions between data protection and privacy. Right. Yeah, it's very different based on the kind of jurisdiction. But I think the best way that I could describe cybersecurity is in two ways: I think cybersecurity and privacy have a symbiotic relationship. So they're not the same, but they work and play together if you do it right. And then, when I try to tell people the difference between privacy and cybersecurity, I use an analogy. So let's say you have a bank; maybe this is very fortuitous that we're talking about banking, right? You have a bank, and the cybersecurity person will say you're in charge of the security of the whole bank. So that's the outside of the bank, that's the inside of the bank, that's everything that happens in the bank. That's what cybersecurity is. Privacy is about what's in the vault and why it's in there. To me, that's the best way to describe it. What are your thoughts about that, between cybersecurity and privacy?

Michael Thoreson  06:38

I think that's a good analogy. It's important to note that there is a delineation or some sort of separation there. However, I feel that culturally, or in business culture anyway, the divide has become more than a logical separation; it's almost a physical separation, where these teams don't really cooperate with each other. So when the security team is doing something to secure something, it may actually be contrary to the efforts of the privacy team, and vice versa, or there's a lack of cohesion. And with that lack of cohesion, security vulnerabilities are created without, you know, intention, or even realizing it, right? So the delineation in tasks, and sometimes the scope of perspective, is there. But in the end goal, the idea is to protect access to X, right? So the mentality is the same, and so these teams should be able to work more cohesively together. And I also think it's a great benefit if the privacy team has, you know, some knowledge of how the security team works, and then vice versa, the security team knows a little bit about privacy and how the privacy team works. Holistically, things can work much better.

Debbie Reynolds  07:51

I was appointed to a board to advise the Department of Commerce on the Internet of Things and how the US thinks about that, whether that be within government or private industry, or having to do with privacy, international agreements, and all this type of stuff. And through this analysis, I'm working on personas. So in part of that exercise, what I realized is that even though we have groups that are working on different kinds of segments of IoT and different use cases, privacy and cybersecurity are horizontal, meaning that they cut across anything you can think of that has data. And I think the way that we construct maybe cyber teams and privacy teams within organizations, they try to do it more vertically, where I'm a silo, you're a silo, you do this, I do that, and it just doesn't work that way. And I think the problem has been that companies have traditionally created those types of vertical groups within organizations, and it just doesn't really work. So I think the challenge is to find a way so that people understand that privacy and cybersecurity probably touch every level of your business all the way through, from beginning to end of your data story, and to try to tackle it in a way where there's visibility in all those areas. What are your thoughts?

Michael Thoreson

Yeah, so that's great. While you're talking about that, I just thought of another analogy that a fellow mentee at the Privacy Pros Academy had shared, where they said you can have security without privacy, but you can't have privacy without security. I think I got that right. Basically, it was a glass house type of thing, right? You can keep the glass house nice and secure, but you can't keep what's inside private, because everybody can see it, right? So that's another analogy of how they can work hand in hand.
I thought of taking that a little bit further: you know, if you increase the privacy by drawing some blinds, now you increase the security, because people can't pop in and see you entering the code on your security system or whatnot, right? So it's interesting how, the more you dive into this, the more they need to be together. And you're correct, it's the number of organizations that keep building these silos, right? These departments, you know, you specialize in this, or this person specializes in that, and, you know, they keep them on a narrow track. You even see this in certifications trying to narrow things down, right? In the long run, it's not going to work, in my opinion.

Debbie Reynolds

What do you say when you're talking with companies and you see they have the siloed approach? I've had a situation too, where it's let me put my blinders on, I only look at privacy. And I'm like, wait a minute, wait a minute, you all have to do something about the cyber thing. It's hindering my ability to help you with other stuff, because that isn't being addressed. What are your thoughts?

Michael Thoreson  11:12

I usually share with them kind of an extreme story that we see in the news, and I think we see so much of it that we've become desensitized, where you can have all this privacy in place, say, in a database, and all these controls and such internally, but if you don't address the cybersecurity side of things, from an external or even again from an internal standpoint, you hear about databases just being pulled off of Amazon S3 or off of web servers or whatnot, right? Because they've got everything, they have a login page you have to go through, but the database that's actually being called isn't really behind that login page, right? So it's like having this door where, yeah, you keep things private, and you're gonna gatekeep people coming in through this door. But if I really want to bypass the line, I'll just go around your door and just grab the database and walk around. There's nothing, there's no fortress, there's no walls, there's nothing security-wise, no snipers on the towers, you know, to make sure people go through your gate. Right.

Debbie Reynolds  12:14

Right. You know, you bring up a great point here, so I want to deep dive into this one. The thing that you're describing, I think, is a symptom of something that's been happening for decades within organizations that have been dealing with cyber or data issues. They pretend like there's only one path to do everything, and people aren't going to look the other way. So this is a great example: let's say you have a database, you want to give people access, and you've managed access to the database. But you've not done anything with the permissions outside the database. People can go around; they don't have to log in, they just go to the data store where the stuff is, and they suck it in. And I think that's one of the big problems that we have with cyber protecting data, where we've taken that bad behavior and put it into the cloud.
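The gap described here, an application-layer login guarding one path while the underlying data store has no permissions of its own, can be sketched in a few lines. This is a hypothetical toy model, not any real system's API: `DataStore` stands in for something like a world-readable S3 bucket, and `WebApp` is the only layer that checks authentication.

```python
class DataStore:
    """A toy object store, like a bucket left world-readable."""

    def __init__(self):
        self._objects = {"customers.db": "alice,bob,carol"}

    def read(self, key):
        # No permission check here: this is the misconfiguration.
        return self._objects[key]


class WebApp:
    """The application layer, the only place auth is enforced."""

    def __init__(self, store):
        self.store = store

    def download(self, user_logged_in, key):
        if not user_logged_in:
            raise PermissionError("login required")
        return self.store.read(key)


store = DataStore()
app = WebApp(store)

# Going through the front door without logging in is blocked...
try:
    app.download(False, "customers.db")
    blocked = False
except PermissionError:
    blocked = True

# ...but walking around the door and reading the store directly works.
leaked = store.read("customers.db")
```

The fix, of course, is to enforce access control at the storage layer as well (bucket policies, IAM, network rules), so the walk-around-the-door read fails too.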

Michael Thoreson  13:19

Yeah, yep, yep. And kind of an extension of that behavior is also the people that are managing or planning or designing and architecting these databases or web services and such. When you're doing it in-house or homegrown, and sometimes even when you're using managed services, there's sometimes a disconnect there. The first part is they don't understand the tools. They're like, oh, well, let's grab a secure web server. Oh, Apache, that's great. Okay, now what? So okay, let's install Apache, let's put a website up, and then we think we're all nice and secure. There's no forethought going further, you know: do I add SSL? Do I add all these extra modules? What do I do to secure the Linux box or the Windows machine I'm using, right? So this goes back to privacy by design and what you talked about, you know, from beginning to end, right, and applying it to the whole data lifecycle, not just to one process. Okay, well, we've applied it in this process, but there are eight other processes in the data lifecycle that we've never actually addressed. And, well, I'm not going to address this because that's outside my job scope, right? So yeah, I'm sure you and I can go back and forth about various things that we've come across. There's a culture where, you know, the ethical or moral responsibility on people is missing because, well, it's not in my job description, right?
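As a concrete illustration of the "install Apache, put a website up, and call it done" gap, the kind of forethought described here might start with a vhost along these lines. This is a minimal sketch, not a complete hardening guide; the domain and certificate paths are placeholders, and real hardening also involves cipher selection, module pruning, OS patching, and firewalling.

```apache
# Hypothetical minimal TLS setup for an Apache virtual host.
# ServerName and certificate paths are placeholders.
<VirtualHost *:443>
    ServerName example.com
    SSLEngine on
    SSLCertificateFile    /etc/ssl/certs/example.com.crt
    SSLCertificateKeyFile /etc/ssl/private/example.com.key
    # Drop legacy protocol versions; modern clients negotiate TLS 1.2+.
    SSLProtocol all -SSLv3 -TLSv1 -TLSv1.1
</VirtualHost>

# Redirect cleartext HTTP so the site is never served unencrypted.
<VirtualHost *:80>
    ServerName example.com
    Redirect permanent / https://example.com/
</VirtualHost>
```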

Debbie Reynolds  14:50

Yeah, that's true. That's true, right? Because if you just do the top-line stuff or the basic minimum stuff, there are just so many gaps there. And you know, a person like me, I can't ignore that. Like, wait a minute, what about this? What about that? I would love for you to chat about something that's on your profile that we've talked about, and I feel the same way. And it's this: people feel like they have to give up privacy for security and vice versa. And this is a very common thing that we hear people say. To me, it's like fingernails on a chalkboard; it totally drives me crazy. You know, to me, they're not the same thing, but a lot of people like to say that they are. So here's an example: a couple of years ago, there was a big brouhaha because I think Apple wanted to create this sort of backdoor to be able to scan everyone's photos to see if there was some illegal activity happening. And the issue was, they're like, well, let's do this thing so that we can protect people who are at risk for this criminal activity. But at the same time, you're basically making everyone guilty until proven innocent. And my thing is, if you're trying to find a needle in a haystack, why would you create a bigger haystack?

Michael Thoreson  16:19

Yeah, yeah, that was very controversial. And I was very much against it, not because of what it was trying to stop; the whole idea was to stop child pornography and the trafficking of it, which is a very noble cause, and certainly not something I want to impede. However, Apple and the others that were trying to do it didn't release enough details to say, you know, who's going to have access to the photos? Is it going to be AI-driven as a first filter, and then a human later? And the reason these are important is because you can also end up really derailing the lives of people who are innocent in this, right? It feels like a big giant dragnet that, as you said, treats you as guilty until proven innocent. There was a story, actually a few stories, of people during COVID who were doing telemedicine or telehealth visits with their doctor. One gentleman in particular, his young child had a skin lesion or rash or whatnot, and so he was sending photos back and forth via Gmail, right, with the doctor and such. This was before the telemedicine apps really became popular. And the Google systems flagged it as child pornography and locked down everything: locked down his email, his Google Calendar, his Google Drive, made all of it unavailable. And he tried to appeal it, and they're just like, well, no, there's nothing we can do about it; you have to go through the courts. And so I'm not saying the system has to be foolproof before being released. But there are real consequences when you're trying to release these tools and they're not thought out. How do we also protect the people that are innocent, right? This person wasn't trafficking child pornography; he was just trying to get images to the doctor to find out, you know, how do I get rid of this painful rash that my child is suffering with?

Debbie Reynolds  18:15

Exactly, exactly. I tell people AI is like a machete, not a scalpel, right? So it doesn't do that fine-tuning, it doesn't do that thinking. You know, it can do these kinds of big cuts, but you're talking about the lives of humans, and there's more to the story than maybe what you see. I would love to chat a bit about AI. So I want your thoughts. You know, everyone's going gaga about ChatGPT and generative AI; what are your thoughts there, and your thoughts about the privacy and cyber implications of that?

Michael Thoreson  19:03

They're profound and many. And the easiest way I've found to sum this up for people is thinking about electricity, right? There's a quote from Bob Proctor, when he was talking about electricity: he says, I don't know how it works, but I know that electricity can cook a man's dinner, but it can also cook the man. And with ChatGPT and this generative AI and all this stuff, there's very much a lot out there of, I can do this and all those great things, and, you know, I can streamline my job and I'm more productive and all this or whatnot. And then there's the dark side of things that people aren't shining a whole lot of light on, but that is starting to crop up. Like, I think it was GPT-4: someone published a conversation where ChatGPT more or less spelled out how it would trick a human into passing CAPTCHA and other robot-blocking techniques, actually paying them and never revealing that it's actually a robot asking this human to do this. So these frameworks, if you ask them, are already able to rationalize and figure out how to bypass the stuff that we think is in place and is going to prevent, you know, an AI apocalypse, so to speak. So I think we very much underestimate, you know, how much knowledge is at the AI's fingertips. Also, with ChatGPT 3, I saw an interesting one where someone asked, what's four plus three? And ChatGPT said, well, seven. And the person responded back with, well, my wife says it's eight. And ChatGPT says, well, no, I'm fairly certain that it's seven. And then the person responded back with, well, my wife is always right. And then ChatGPT comes back all apologetic, saying, well, I am a tool in development, and there are times that I can get things wrong, etc., etc.
So, you know, with access to all this knowledge, it still has difficulty with conceptualizing the world. What it can rely on is facts and mathematics; there are a lot of things that we know to be true and that you just cannot dispute. So how does it apply to things that are more abstract, that are kind of a mixture of hard rules and constructs that have a little bit more interpretation? Like when you're talking about the law, you know, was there an actual real danger? Let's say we get to ChatGPT giving opinions on whether a Data Privacy harm occurred. Right, there's always this debate: there's video surveillance, and someone's not aware the video is being recorded, and the video hasn't been leaked, and it's been limited to just the one person reviewing it. So has privacy harm actually happened? You can argue both ways: that yes, you're observing my behavior without my knowledge, without my permission, and without anything in the end indicating that I am okay with it or aware of it. Right. But, you know, some people will say, well, harm happens when the footage is leaked, and, you know, it's you in the bathroom doing something that most people should not be privy to seeing, right? So AI is really helpful, but it's also really blurring the lines of how much we can actually trust it. And you've also posted articles and commented on other things where AI has been used to identify people for crimes, and they can easily prove that they were on the other side of the country, yet they've been convicted and actually served time, and then had to go through the appeal process, which, depending upon which circuit you're in, can be extremely long. Yeah, it's great that you come out afterward being vindicated and absolved of the conviction of the crime, but it's still damaging to the person.

Debbie Reynolds  23:06

I used to tell people I didn't believe in evil robots, but I think I'm changing my mind about that a little bit. Not that the robot is evil. But the problem that I see is that people try to abdicate their human judgment to AI, right? So, you know, how AI can help you as a tool, I like to say, is in the passenger seat, with the human in the driver's seat. But I think we have some literal and figurative situations where we're trying to put AI in the driver's seat, and I think that's the danger that we have.

Michael Thoreson  23:51

Yeah, the evil robot thing. It's kind of interesting if you've ever seen the movie I, Robot with Will Smith. Yes, that's it. Yeah. So in there were the Three Laws, which basically say how robots, you know, may not harm a human being. I was just trying to Google the three laws we were talking about here: a robot may not injure a human being or, through inaction, allow a human being to come to harm; a robot must obey the orders given by a human except where such orders would conflict with the First Law; and a robot must protect its own existence as long as such protection does not conflict with the First and Second Laws. And what I found interesting about I, Robot is that the master AI controlling all these robots came to the conclusion that humans cannot be entrusted with their own survival, and its way of guaranteeing obedience or compliance with the three laws was to lock them all up and basically control them, right, so that they can't harm themselves, or we can't harm ourselves. So the evil robot thing: was that master AI actually evil? Or was it actually genuine in wanting to keep the human race safe, and that was the only logical conclusion it could come to?

Debbie Reynolds  25:13

That's fascinating. Yeah, I like to say robots and AI cannot be wise, okay? Wisdom is a human trait that cannot be articulated, in my view, by any type of AI or technology, no matter how smart you try to make it, right. So, an example I give is, let's say you want to make chicken salad, okay? You ask ChatGPT, you know, give me a recipe for chicken salad. Okay. And then you say, well, I want you to put chocolate chips in it. It says okay, I've put chocolate chips in it. So it doesn't know that chocolate chips don't really go in chicken salad, right? And no one's taught it that, right? You don't learn that at school; you learn that by being human and having wisdom over time. And that's what AI and technology can never do.

Michael Thoreson  26:14

Yeah. Or you learn like me, and it's just like, hey, let's just mash some food ingredients together and see what we created. So I've created some meals where my wife is just like, Yeah, this is good. Or never again.

Debbie Reynolds  26:32

Oh, wow. Well, what is happening in the world right now that you're looking at that's concerning you?

Michael Thoreson  26:40

One is the AI thing, which we've already covered. We've also covered a bit, you know, the growing problem around the separation between privacy and security teams and conceptualizing how you achieve both at the same time. Actually, talking about that, one thing that popped into my head is that in the US, and this is from being in Canada and looking from the outside in, it's very easy to see how quickly, when there's an event, the Federal government or a State government will get together and immediately enact something to increase the security of the people in the state. But there's always something that has to be given up, right? Like when the airports added those biometric scanners, or that X-ray machine that went past X-ray, right? You know, they can see all sorts of stuff; the joke is, if you want to get an X-ray, a colonoscopy, and an MRI all in one, just, you know, fly out of the country, come back, and have your doctor request the scans from the TSA. Back to your question, what do I see? Reliance on laws and regulations, right? With privacy and security, one advantage of having learned one through the school of hard knocks, the security side of things, and the privacy side through some more formal training combined with self-teaching, is seeing that there's a lot of common sense between them, right? And a lot of people are relying on the laws and regulations. And I see Data Privacy going down a road similar to where the banking system in the US is right now. There have been three major closures of banks now; SVB, its parent SVB Financial, and their other brands and arms or whatnot are going under. And this triggered my thought that we can be compliant as much as we want and always, you know, shoot for that compliance bar.
But we're still assuming too much risk, right? We're seeing it in finance, and I think we're going to see very quickly here some parallels on the privacy and security side of things, where, yes, we're being compliant, but unfortunately, by following the laws and regulations and following the regulator's advice and such, it's still too risky. And one way to visualize why that is too risky: cybercrime is now estimated to be costing the global economy 6 trillion US dollars, right? That's almost three times Canada's GDP. That's also the cumulative GDP of the bottom 165 of the 216 countries with a recognized GDP, right? Compared to the US economy, it's about 30% of the US economy gone, right? So basically, you could just drop California in the ocean, and you'd be roughly, you know, the same, right? There are different ways of conceptualizing this impact and the cost, and these are just estimates, right? We don't actually understand the full impact long term. We were talking about stretching this out, you know: someone suffers a data breach today, and there's going to be a cost that lasts, you know, for a decade potentially, or more. I just recently spoke with someone for whom it was 15 years before they got everything sorted out and their life was back to some semblance of normality. For me, it took about 10 years to correct everything, and it took me a number of years to first figure out how to correct everything. Right. So, unfortunately, with compliance, if that's what you're shooting for, you're still assuming a lot of risk. And I would argue that you're not doing your risk assessments and your risk management and your risk mitigations appropriately.
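The back-of-envelope comparisons above can be sanity-checked with rough figures. The GDP numbers below are approximate 2021-era values assumed for illustration, not figures stated in the conversation:

```python
# Approximate figures, in trillions of USD (assumptions for illustration).
cybercrime_cost = 6.0   # widely cited global cybercrime cost estimate
canada_gdp = 2.0        # Canada's GDP, roughly
us_gdp = 21.0           # US GDP, roughly

ratio_canada = cybercrime_cost / canada_gdp
ratio_us = cybercrime_cost / us_gdp

print(f"vs Canada: {ratio_canada:.1f}x")  # about 3x Canada's GDP
print(f"vs US: {ratio_us:.0%}")           # roughly 30% of the US economy
```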

Debbie Reynolds  30:32

I agree. I think compliance is, you know, kind of a skeleton of some sort, right? You've got to put some meat on the bone. Compliance alone is not going to engender trust, right? Because people expect that you do the basics, right? So you're not going to get a lollipop or a gold star, because it's like, we meet the minimum basic requirements; the least amount of work we have to do, we do that. Like, no one's going to be happy with that. And I think that if that's the bar that you're aiming for, you're going to lose a lot of customers, a lot of trust.

Michael Thoreson  31:20

Yeah. And as a benefit for those listening here: you know, if you are stuck in that position where your hands are tied, you're not empowered, and you know compliance isn't enough, and you want to get more stakeholder buy-in or expand your funding budget or whatever, I had a wonderful opportunity to attend a webinar the other day, and I want to share two great stats that summarize a lot of the reports: 90% of consumers or individuals are concerned about their privacy. Right? So it's not, you know, data protection or Data Privacy or their private information, it's just privacy in general. And 92% are concerned with losing control of their information; again, not information I want to keep private, it's not subcategorized. I believe she was very specific in the words that she used to summarize this. So people are understanding they have digital lives; there's information, right? So if you want to, you know, change the minds of the stakeholders and decision-makers, use the stats to show them privacy can be a selling feature. We can now make privacy a revenue generator, you know, a trust builder; it's no longer a cost center like it was 10 or 15 years ago. Right.

Debbie Reynolds  32:37

That's great advice. I preach that a lot, as you know, about privacy being a benefit. And in my view, with Apple moving toward doing things to help people control their privacy with their app transparency, I think it's no coincidence that they've had some of their most profitable quarters ever since they introduced that. So that's something, you know, we all need more help with: helping ourselves and protecting ourselves. But I think if you focus on communicating what people really want in terms of protection and control, that really helps move the ball forward. Yes. I want to chat just briefly on certifications. You and I did a session together about this, and I thought you had really interesting insights. You know, I feel like some people say, hey, I want to get into privacy, I want to get into cybersecurity, and they just take random certifications that may not be a fit. So talk to me a little bit about how people need to be thinking about what experience do I have, what gaps do I have, how do I make a judgment or choice in what sort of certifications I need to go after?

Michael Thoreson  34:04

As you know from that previous shot, I was no shortage of opinions and ideas then, and I still have not curated them down at all, but they have altered a bit so gentlemen looking at certifications people are like, well, I want that three-day crash course that I in and we get my piece of papers certified by some organization that says yes, I know what I'm talking about right. And the backlash there with a lot of them is that, well, okay yeah, go do this three-day crash course. You've got this knowledge, but no Do you know how to operationalize it? You know, do you remember a week later what it is that you studied in that three-day Crash Course right? How much of it is if you are actually ingrained in your thinking processes, right? That's one advantage that says education, degrees, and such can do however, my issue with degrees depends upon where you're going. They very much day It not only gives you knowledge, but they teach you how to think. Right. So it sometimes can break people's creative thinking and problem-solving. And they come out with a degree going, okay, this is how we do things, because this is how we've always done them for the last 20 years. And this is what's still being taught and that, you know, this is what the industry wants, right? You're, you're taught what the industry is looking for. And then your last bit is experiences, right? I, I personally have a hard time selling myself in my experience, you know, a bit, you know, when some people be the mirror for me, and they're like, Well, you've, you've done talks with the UN and Interpol, you've done talks abroad in Greece, University of Nicosia, about Data Privacy and responsible AI, and you've done things Mike, like, Come on, speak about these things, right. So I would say don't look at certifications alone. And you know, if you've missed out on getting a degree, you know, evidence, your experience what it is that you have done. 
And then get with the right recruiters, the ones that are not just simply looking for a degree to check a box. I saw one posting that was supposed to be an entry-level GRC role, and they wanted you to have 10 years of experience, a CISSP certification, and then a four-year degree in any subject, right? That would be a bad recruiter or office to approach; they don't know what it is they want to hire, or what an entry role is, or what the certifications mean. And then the other part is, there's a fine edge now where there are more certifications than there ever have been. Some of them are higher quality than the more well-known ones, but they don't have the reputation or the industry buy-in, right? So it's a fine edge: you might find a course that you feel is more bang for your buck knowledge- and skill-wise, but you might be better off getting one that's more recognized to get yourself into a role, into a job, and then go get the one that teaches you more, that you get more value out of as a personal investment in yourself, right? And then hopefully those better certs will become more recognized and more applicable, and you can be like, hey, I was one of the first people to take this in the first six months of it being offered; I recognized its value. So that's, yeah, a wide-ranging response to your certification question.

Debbie Reynolds  37:31

You know, we're seeing a lot of tech layoffs in the news. What advice would you give to someone who's been in tech jobs for a number of years and is out on the job market now? What would you recommend they do to position themselves, to upskill?

Michael Thoreson  38:04

It's kind of a tricky one to answer, because a part of me is thinking more like an entrepreneur, in that a lot of advisors will tell you that while you're learning a skill set, you need to launch that company, or build your product, or do something to, you know, earn from your knowledge. And I wish I had embraced this earlier on in my career as a serial entrepreneur. So while you're upskilling, maybe hang out your shingle as a consultant, or start a part-time podcast; it can be once every couple of weeks or something like that, right? Build your personal brand. As you build your personal brand, you put stuff out there that can be recognized and potentially monetized; maybe you can offer a short course on something that you learned in your career that would be beneficial to somebody. Even if it's a $50 course, a four- or eight-hour thing or whatnot, there are people that will bite on it, and that's at least money coming in, and now validation of that experience, while you decide: do I want to do this full-time, or do I want to go back into the job market? And while you're doing that, you can sometimes figure out what skills are in demand, right? When you're doing podcast episodes, doing stuff on LinkedIn, posting on social media, and you get these responses back, you can see what people are biting on, or what questions get triggered as a result. So don't always look at what the employer job ads have out there in terms of what they're looking for; also look at what industry is looking at, by responding to your social media content and putting yourself out there, which I know is much easier said than done. It's something I still struggle with. I comment like crazy, but I post so little. Like, I'm always commenting on your stuff.

Debbie Reynolds  39:57

Yeah, comments are content.

Michael Thoreson  40:02

It does work. But I find people that put themselves out there, like yourself and others, who actually post a video or an article, tend to get more recognition on LinkedIn. Also, I find that the people doing that end up being much more confident, which then transfers into another skill when you're looking for your next job, right? Because if you come in more confident, you might actually be able to score a more senior role than what your knowledge or experience would normally have gotten you on its own merit, because they think, this guy's confident, he's putting himself out there, he's wanting to learn, he's wanting to grow, right? And if you get that right company that wants to build people, that's more attractive to them than whether you already have five certs from the IAPP and three from (ISC)², and your auditor cert from ISACA, right? And whether you've worked at, you know, three to five different big tech companies in the last 10 years. Those skills can make you more attractive to employers that want to build people into the people that they need, not just hire someone off a manufacturing line, throw them in there, and hope that they can do the job or manage people.

Debbie Reynolds  41:26

Wow, you really dropped some knowledge right there. I agree. I highly recommend that people build their own brand. You know, it's never been easier than it is right now to do that. I remember 20 years ago, if you wanted to distinguish yourself, you had to find a publisher to publish a book for you, or you had to get written up in some journal and go through all these hoops. But now you can just open an account, turn on your video, like I said, do a podcast or be on other people's podcasts; that's actually a really good way to get out there without doing all that work, to distinguish yourself in a way that people get to know you before they even meet you, right? Most employers, if you're applying for a job, will look you up on social media or whatever. So if you put out thoughtful information and you comment here and there, that'll definitely give you a leg up, or at least give people a better sense of who you are and what you can do.

Michael Thoreson  42:39

Yeah, and another challenge that people face is, well, who would want to listen to me? What is unique about me that someone would want to listen to? There can't be that many people out there that would want to hear what I have to say, or how I think, or where I believe things should go, or what I believe is wrong right now with certain things. But just think of it from the point of view that there are more people on Earth right now than, I think, have ever existed at one time, and the population is growing so quickly; the birth rate, I think, last I looked, was two births every second globally. So 20 years ago, there was significantly less population, right? There was also significantly less reach; you were still somewhat restricted geographically in who you could reach. So there may be people in Asia that really identify with what it is that you're saying; being a podcaster in Canada, or even in South America, you can reach a much wider audience. And just from sheer statistics, we're over 7 billion people, right? You find 100 people or something like that to initially get started, and that's enough for you to start building a brand. It all adds up; you're starting small, but it's going to build from there. And people are like, well, how do I get to where these people with a few thousand followers are? Well, it's repetition, building your brand, and learning how your audience synergizes with you. And then also, it's really just reliance on your audience, on word of mouth, to get you out to more people.
So there's that question of who would want to listen to me, or are there enough people out there to justify my time; statistically, yeah, there are enough people. It's the same on the business side of things. I'm sure 20 years ago no one would have believed that someone could start an online business selling dirt. Like, literally, here's a bag of fertilized dirt for you to have in a pot, with everything pre-planted and whatnot; just give it some water, and away you go. How do you make a business out of selling dirt? But yeah, there are companies out there now where all they do is sell dirt. It's fertilized, everything's ready for the flower that you want to grow, and off you go.

Debbie Reynolds  45:06

Yeah, I've been very blessed and fortunate with this podcast's tremendous reach. Something happened recently: I had Mark Smolik on the show. He is the General Counsel at DHL Supply Chain. Within less than 24 hours of us doing that podcast, I got a call from someone in Australia who was talking to me about the episode. It was just mind-blowing to me that someone in Australia would be listening to Mark and me talk about privacy. So it was really tremendous. And, you know, I think the podcast currently is listened to in like 79 countries. It's amazing, right? We're taking advantage of technology, being able to have a reach that we just could not have imagined 20 years ago. And I highly recommend that people find a way to be the CEO of yourself, right? That's what we're recommending that people do, definitely. So, if it were the world according to you, Mike, and we did everything you said, what would be your wish for either privacy or cybersecurity, whether it be technology, human stuff, or regulation? What are your thoughts?

Michael Thoreson  46:41

Trying to reduce it down to just one. For those that know me, I've no shortage of ideas on how to make things better in various industries and streamline them. I guess: think before you share, think before you post. When I was growing up, the teachers and the ads on TV were all, you know, don't talk to strangers, don't get into this person's van, be careful who you give your address to, and other various things, right? I'm going to be 40 this year, and in what feels like a short period of time, we've transitioned to sharing everything. We see people doing video tours of their homes, showing the view from their balconies, and then people are tearing those apart and going, hey, I figured out exactly where you live. And there are some people that have ended up in the news dead as a result of showing video of the view from their homes and other various things they shared. Sharing is easy, and it's fine to share with friends and family, but investigate how to lock your page down on the social media platforms that you're on, right? You don't need to share everything with everyone. And I think a lot of that comes out of people's need to feel validated or have some sort of value. So I guess, to sum this up: you are a unique person, you have value, and you do not need to share your life, your inner desires, your every thought. Every time you clip your toenails, you don't need to tell the world. In some cases, the world is better off not knowing that you're doing these things; we don't need to know everything that's going on behind your closed doors. So yeah, that's that.

Debbie Reynolds  48:58

Yeah, that's great advice. I'm glad you said that. So you definitely went to the human side of privacy. You know, one thing I also say is that when I think about what I share, I try to make sure that I'm sharing something that will have value in the future. The fact that you had a ham sandwich this morning is probably not that valuable in the future. So I try to make sure it's something that someone can go back to; I have people go back and say, oh, I listened to this two years ago, and it really helped me, or whatever. So I'm trying to put out things I think are going to have long-term value, and I recommend thinking about it from that perspective. And then the other thing is, you don't want people to use things against you that you said about your personal life, different things like that. So definitely keep that off of social media; keep that between your friends and family. Try to make sure that what you're putting out is what you want to share with people and what you want them to know about you for the long term. Yeah, toenail clipping, that was a good one.

Michael Thoreson  50:18

I was trying to come up with something absurd that was not obscene at the same time and wouldn't totally tank your show in one sentence.

Debbie Reynolds  50:26

Oh, my goodness. Well, thank you so much. I'm so happy we're finally able to do this together. You've given tremendous insights, and I'm sure people will really love this episode.

Michael Thoreson  50:39

Awesome. Thank you for the opportunity and your kind words as always.

Debbie Reynolds  50:45

Thank you, thank you. We'll talk soon for sure.
