E228 - Carey Parker, Podcast host and author of "Firewalls Don't Stop Dragons"
[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
[00:12] Hello, my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks Privacy podcast where we discuss data privacy issues with industry leaders around the world with information the businesses need to know.
[00:25] Now, I have a very special guest on the show all the way from North Carolina, Carey Parker. He is the podcast host and author of Firewalls Don't Stop Dragons: A Step-by-Step Guide to Computer Security and Privacy for Non-Techies.
[00:41] Welcome.
[00:42] Carey Parker: Thanks, Debbie. Great to be here. So good to be here. It's been too long. We've danced around each other for long enough. It was time for us to get together.
[00:49] Debbie Reynolds: I think some mutual friends of ours tried to get us together. At some point, I don't know, we kind of got lost and we found each other again. And yeah, first of all, your podcast is tremendous.
[01:01] You've done a lot of podcasts. I'm in the 200s now. You're almost in the 400s now.
[01:06] Carey Parker: I just passed 400. Yep, 400, baby. That's a lot. A lot of podcasts don't go that long.
[01:12] Debbie Reynolds: Yeah, it's a lot. It's a lot of podcasts. But the work that you do is outstanding. I think you do a really great job of being able to explain things in very simple-to-understand ways, because I feel like a lot of us tech people sometimes assume that other people are as technical as we are, so we throw out a lot of jargon, a lot of acronyms and stuff like that.
[01:37] Just regular people just want to know, what does this mean to me? You know, how does this impact me? Again, I really love your show, but one of the shows you had done, which I thought was outstanding, as they all are, is the one about the national data breach.
[01:56] The one about that, the data broker breach.
[01:59] Carey Parker: Right, NPD. Yeah. That was horrible.
[02:02] Debbie Reynolds: It was terrible. And I felt like one of the frustrations that I had in the news media is I felt like people weren't really explaining it very well in terms of, like, what was breached, what happened, how the data that they capture, like, what does this actually mean to the person?
[02:17] And so, yeah, just so much confusion about that, even though I think it was tremendously horrible, but just understanding it from a data perspective. And then also, in addition, you were providing some insight into how businesses can avoid those types of things.
[02:36] I talk a lot about unstructured data and how cybercriminals love that stuff. Right. The stuff that you don't really govern very well, and that's like a treasure trove for them.
[02:46] Please introduce yourself. And I also want to make sure you give your background in terms of being a former software engineer.
[02:53] Carey Parker: So just briefly. I was a software engineer for about 28 years. I managed to retire early, which was lucky for me. But before I did that, right around the time the Snowden revelations dropped, I've always been kind of a private person, personally.
[03:05] Like, I don't share different things with different parts of my life. I share different things with my family versus my friends versus my coworkers. And that just seems like a natural thing.
[03:14] And the classic definition of privacy is being able to tell your story the way you want to tell it. And so I was always kind of that way, but it was just a me thing.
[03:22] And then after the Snowden revelations, I was really pretty shocked and actually angry. Because I'm not a tinfoil hat, black helicopter kind of guy either, but I knew these things were possible.
[03:31] But then it's like, not only is it possible, but they're doing it on a mass level without warrants and without approval. I was like, that's just wrong. And so that was kind of what started this whole thing for me.
[03:42] And I'd always wanted to write a book, and so it kind of came together. Why don't I just write a book about this stuff? Because being the software guy, I'm the IT person for the family, and I've got a pretty big family.
[03:53] And so I'm constantly getting questions like, you know, Carey, do I have a virus? You know, my computer's doing this. Is that right? Is that normal? Or should I install antivirus?
[04:00] If so, what kind? And how do I work this password manager stuff? I don't know about that. So I decided, let me just put this all down in a book so I could hit the easy button.
[04:09] Just say, here. And the whole idea of the book was kind of privacy and security for dummies, but "For Dummies" was taken. And I like my analogies, so one of the main core analogies of the book is that defending your computers and your devices is kind of like defending a medieval castle.
[04:25] And so we talk about defense in depth. You don't just have one defense. You've got a moat, you've got a drawbridge, you've got walls, you've got armed guards.
[04:33] And translate some of those things into the simple and everyday things we all should be doing and can be doing to improve our security and privacy. And then to wrap it all up, the dragon part of the analogy is the CIA or the NSA or some nation-state.
[04:46] Don't bother trying to make a dragon-proof castle. You'll go crazy and broke in the process.
[04:51] So anyway, that's where the title came from. And so, yeah, ever since then, my whole goal has been to try to bring what are pretty complicated issues, security and privacy, to the light, to help people understand what they are and then explain them in ways that regular everyday people could understand.
[05:08] Which is like 99% of the planet. Because like you said, there's so many. For people like us, there's all sorts of resources. Right. If you know what you're talking about already, there's all sorts of books and podcasts and things you can listen to that will explain these in technical detail with jargon.
[05:19] I wanted to skip all that. I wanted to explain this in a way that everybody can understand and put in step-by-step instructions. I'm very OCD about stuff like that.
[05:28] So I've got these really detailed instructions with screenshots. It's a 600-page book now, but half of that is pictures, because there are all these pictures that show you what you should see as you're doing these things.
[05:37] So anyway, yeah, that's been kind of my mission all along: to bring this stuff to everybody else. And the podcast, too, became part of that, where I wanted to take the news stories, like the NPD data breach and other things like this.
[05:49] You see the headlines and a lot of them are either wrong or they're clickbaity. They focus on the wrong thing, or they imply that something happened that didn't really happen, or that it's much worse than it was.
[05:58] And so I try to take these things, tell you what they really mean, and sift through the jargon and then if possible, give you something actionable to do about it.
[06:07] If not this time, for the next time. So, in a nutshell, that's where I came from and that's how I do what I do.
[06:15] Debbie Reynolds: That's a great story. One thing that I've always loved is something Mark Twain had written: he apologized to a friend for writing a long letter because he didn't have time to write a short letter.
[06:30] So I think what you're doing, and what people don't really understand, especially when you're working in media and books and stuff like that, is that the ability to simmer something down to its simplest form is quite an art.
[06:45] And it takes a long time.
[06:47] Carey Parker: Yeah.
[06:48] Debbie Reynolds: Because you have to really understand something at a really deep level in order to be able to explain it to someone else in a very simple term. So I thank you for that art.
[06:57] And I love the way that you really deep dive into things and explain it in a fulsome way that people truly understand.
[07:05] Carey Parker: I think we've mentioned this, but the format of the podcast is a little different than some. I've kind of settled into this every-other-week tick-tock thing. That's a bad choice of words.
[07:12] Debbie Reynolds: Good.
[07:13] Carey Parker: Maybe I shouldn't say it, but it's a back-and-forth thing where I do a news show and then I do an interview show. So every other week it's, here's what's happened lately, here's what it means to you, and here's what you can do about that.
[07:22] And then here's an expert in privacy or security that I dive deep on some particular topic with. So it's a little bit of both. I like doing both and I didn't want to do one versus the other, so I just did both.
[07:31] And so I go back and forth.
[07:33] Debbie Reynolds: Oh, you know what? I never thought about it, but I kind of do that. So I have a weekly video where I talk about a particular news item or something that happened and then I have the podcast.
[07:42] So we kind of do the same thing, I guess.
[07:44] Carey Parker: Yeah. All right.
[07:46] Debbie Reynolds: One thing I would love your thoughts on, since you're very good at helping people crystallize concepts, is the difference between cybersecurity and privacy. I feel like people just confuse that so much.
[07:59] And especially, I'm shaking my fist at the sky when people say you can either have privacy or security. They're not the same thing, and it's not a one-for-one exchange.
[08:08] So you have to really explain that. But I want your thoughts.
[08:15] Carey Parker: Yeah, absolutely. I teach a class for seniors at a continuing education course on this, and I use my book as the textbook. And one of the pictures I love to put up is a political cartoon I saw years ago.
[08:25] And I'll just have to describe it to you. It's this couple in a wooden house, and you can see these carpenters pulling the planks off the house.
[08:34] The house is labeled privacy, and they're building a fence around it that's labeled security. The implication being that they're mutually exclusive: in order to have security, you have to give up privacy.
[08:46] And I think that's kind of what you're saying there. And that is absolutely not the case. That's really a false choice. And I think it's one that you certainly hear from some elements of law enforcement, certainly from government agencies like the FBI, who say they're going dark.
[09:01] With all this encryption, we can't protect you. When in reality, that security is there to protect us. Security enables privacy. You can't really have one without the other.
[09:10] They're not really two sides of the same coin. They are different. And I think some of the differences I usually like to call out when people ask me this question are a couple of things.
[09:17] First of all, security mistakes can sort of be fixed. Like if someone commits financial fraud, if they do identity theft, if they take some money from you, if they take some of your accounts, you can, you know, it's painful, but you can usually replace and recover and move on.
[09:33] You can't really recover from a privacy mistake in that sense. Privacy is different because I can't erase memories, right? If there's something about me that gets out there, I can't get that back.
[09:43] I can't really fix that. So to me, privacy mistakes and privacy failures are more consequential, generally speaking. And obviously, if you lose your life, that's one thing. But generally speaking, for most people, you and I, and most people probably listening to this, security failures can sort of be fixed and remediated, whereas privacy failures often can't.
[10:04] So I think it's important that people understand that, so that maybe they put more attention in preventing privacy mishaps and put some more focus on that. The other thing I like to draw attention to is that for security, your interests are usually aligned with corporations and with governments.
[10:22] We all do better when security is good. If there's a security failure, it's bad for all of the above. That is not true with privacy. We're usually at odds, because corporations want our data, to sell it and monetize it.
[10:34] So, you know, they are trying to get us to give up as much information as possible. And they usually phrase it as: you can totally trust us, it's super, super private from everybody but us.
[10:44] That's usually the line, right, with corporations. But even if that doesn't bother you, even if you like the fact that that shoe ad is chasing you all over the Internet for weeks, or your search history is somehow showing up somewhere else, or the fact that you hit the brakes too often in your car means your insurance rates go up.
[11:02] You know, if that's not enough to put you off, the fact that corporate surveillance is happening and is so rampant is also enabling government surveillance. It basically allows our government to buy your data on the open market, bypassing, in the United States at least, your Fourth Amendment rights.
[11:17] And governments have been doing this. This is not hypothetical; this is actually happening. So that's another difference between security and privacy that I think needs to be called out, and why it's so important.
[11:29] Debbie Reynolds: These are all valid points. I love this. What's happening in the world today that concerns you, that relates to privacy?
[11:38] Carey Parker: Oh gosh, what isn't happening? It's so, so bad.
[11:41] It just keeps getting worse. And I don't understand how it keeps getting worse. After all the data breaches and the outing of priests and all the things that have been consequences of poor privacy, you would think that the people in power, our elected representatives, would at least know somebody these things have happened to, would kind of get religion on this, and decide that we need privacy laws here in the US. That just drives me crazy.
[12:05] But I think what's maybe bothered me the most, and like I said, there are a lot of things bothering me, but at its root what really drives me crazy is the lack of transparency, the lack of understanding of what is really going on out there.
[12:19] And that's something I harp on a lot in the show: making sure people understand, look, this is what's going on, here's what that means to you, here's why this is a problem, here's the 30,000-foot view of why this is an issue.
[12:32] And we get these little glimpses through data breaches and things that happen and that make the news about what's going on. But it's so much worse than that. It is just the tip of the iceberg.
[12:41] And so I think what still bothers me the most is that we don't have the transparency and we don't have the general education of the populace to know that these things are happening.
[12:51] And you know, I know there are a lot of people that hate regulation. Whenever I mention regulation, I think I turn off half the audience, because, oh, regulation is bad, regulation is bad for innovation, all these things.
[13:01] I would actually go so far as to say regulation is the mother of invention. Whenever you have things that cause you to rethink how you're doing something or put more constraints on where you're doing it, you have to get innovative to work around those things.
[13:12] I mean, look at car safety, right? And I think Tom Kemp even brought this up when he was on your show. Car manufacturers hated the fact that they were required to put seat belts on and then put airbags in and then put all these sensors on the vehicle.
[13:23] Oh, we can't do that. It's going to cost too much money. It's going to put us out of business. I can't possibly do that. They all did it. They're all making a lot of money and we're all much safer as a result.
[13:33] So, back to the original point, I think what we really need is more transparency and more education, on two levels. First of all, if you're a free market person, a free market society requires an informed consumer for the free market to work and for the better products to win out.
[13:53] We have to be able to look at product A and product B and say, this one's more private or this one's more secure, so that more people pick the better one and the bad one has to either catch up or go out of business.
[14:02] By the same token, this notion of transparency and education, we need an informed electorate, right? We need the people to be understanding what these privacy problems are so that when they're going to the town halls, when they're going to the voting booth, when they're going to their local council meetings and they're bringing up issues that they can hold their representatives accountable and ask them tough questions.
[14:23] So all of that requires transparency and education. There are so many bad things going on, but to me, the root of it is that, despite all these things happening,
[14:31] I don't think people are fully aware of what's going on, and until we get to that point, I'm not sure we're going to force things to happen.
[14:36] Debbie Reynolds: Yeah, I agree. I think transparency is a huge issue. I decided many moons ago that I wasn't going to hold my breath on regulation.
[14:44] Carey Parker: Oh, sure. Yep. Oh, yeah.
[14:46] Debbie Reynolds: So if that doesn't happen, it's a big missing piece that we have. And I feel, especially in the US, without privacy regulation like we're seeing in almost any other jurisdiction, we're kind of dancing to the tune of other regions. Because with the lack of definition about how we define things and how we go about things, from a business perspective, companies can only look at what other jurisdictions are doing and try to at least align with it, even if they are not legally required to.
[15:22] It gives them some type of framework or some type of guidance to think about, especially as you're doing commerce with other businesses. What do you think?
[15:31] Carey Parker: Well, one more comment in defense of regulations. I do think there are plenty of things, I mean, the whole point of the book is there are a lot of things we could all be doing that will help our privacy and security, and I think we should be doing those things.
[15:42] That's not to say we should just give up on that and expect the government to swoop in on their white horse and fix everything.
[15:48] However, I mean, there's a reason why when you fly somewhere and you get on a plane, you don't have to personally walk around that plane and inspect all the flaps and look at the lights and kick the tires before you get on that flight, because someone's done that for you.
[15:59] Because there are regulations to say it's got to be safe. There's a reason why you don't have to have a taste tester for all of your drugs and your food.
[16:06] Right? When you go to a restaurant, there are people watching over that restaurant to make sure that they're keeping up with code. That's regulation at work for you. I mean, there are certain things that just doesn't make practical sense for each of us individually to do those things.
[16:18] So I think that still eventually we need regulation. But you're right, we can't hold our breath and wait for those things because as we've already seen, it hasn't happened yet, and it may be a while before it does happen, unless you live in California or a few of these states that have managed to step up to the plate.
[16:32] And one nice thing about that is, though, is that oftentimes, as California goes, so does the rest of the country, because a lot of these companies don't want to make two different things, one for California and one for somewhere else.
[16:43] So sometimes we get the benefit, even if we're not living in California, of some of the better privacy laws in California. So regulations can even help in that regard. But yes, you're absolutely right.
[16:51] We can't count on that. We can't put all our chips on that one spot on the roulette table. We've got to be doing the stuff ourselves too.
[16:59] And so, yeah, my book is filled with all sorts of simple things: password managers, two-factor authentication, using privacy-respecting browsers, plugins like uBlock Origin. There are all sorts of things we can be doing that would make us more secure and protect our data, and most of them don't cost any money.
[17:15] Most of them are pretty easy to do. And a good percentage of my tips in the book are just "don't do this."
[17:23] So it's not something you have to do; it's just something you need to avoid doing.
[17:26] Yes. So there are still a lot of things we absolutely should be doing. It's not an either-or thing; we need to be doing both. And absolutely, I like to empower people to do what is within their power to do.
[17:37] And that is a lot of what I preach.
[17:39] Debbie Reynolds: I agree with that. I want your thoughts, and this is something I've been thinking about very deeply over the last, I don't know, year or so. And that is, maybe we should be talking about privacy as a safety issue, as opposed to a feel-good, rah-rah, maybe-you-should-have-it, maybe-you-shouldn't thing.
[17:56] And maybe we go back to the automobile analogy. I know the background of seat belts and how car manufacturers didn't want to do that, but it was a boon for them and it really helped the adoption of automobiles and stuff like that.
[18:15] So, yeah, I wonder about talking about privacy as a safety issue. Let's go to a data breach. Let's talk about 23andMe. They're in super hot water right now.
[18:29] Their stock is in the doldrums. I think they're about to be delisted from some stock exchanges. And some of the stuff they did isn't necessarily illegal in certain places.
[18:42] But I think it was just a bad business move, because the more sensitive the data that you collect, the harder it is, or maybe even impossible, for someone to have any adequate redress.
[18:54] Right, right. And so when that breach came out, they said, oh, we're going to pay whatever, however many million dollars and then we're going to give you three whole years of credit monitoring.
[19:05] It's like, I don't know how credit monitoring will help if someone took my DNA, you know?
[19:10] Carey Parker: Right, right. Yeah, that's a very interesting story because as you know, as I often say, you can't anonymize DNA. Like your DNA is literally you. It is your genetic makeup. You can't make that anonymous.
[19:22] I mean, there's a whole forensic science in law enforcement around identifying people based on DNA.
[19:28] And yes, in particular right now it's a problem because 23andMe does look like it's going to go under. And one of the most valuable assets they have for their hungry creditors is your data.
[19:37] I know, I looked into this, and I forget what their privacy policy currently says, because they can change these things, usually just with an email notice, but I think it is kind of vague about what they are allowed to do with that data.
[19:47] So I have been telling people to try and go and have your data deleted from 23andMe if you've used them in the past. I've heard it's kind of tricky to do, but it's worth a shot.
[19:55] Certainly before they go under. I would think I'd want to get that deleted just so it doesn't become some asset that they could try to sell to creditors if they go under.
[20:02] But just, yeah, generally speaking, DNA. So when you submit DNA, not only is it your DNA, but it's all the DNA of your blood relatives too. That DNA has been used in multiple cases, criminal cases, to find suspects through DNA of relatives.
[20:17] So even if the suspect, their DNA wasn't in 23andMe or Ancestry.com, if they know a blood relative and they get access to their DNA, they've been able to use that to then find a match to the DNA at the crime scene.
[20:31] So you're not just giving away your DNA, you're giving away the DNA of all your blood relatives as well. And that brings me to your security point, and I think we need to understand this: privacy and security both are not just me things.
[20:46] They're very much we things. And so your privacy and your security affect mine. They overlap mine. So even if you don't care about yourself, even if you think I'm boring, I don't care if anybody looks at my emails or reads my text messages.
[21:01] It's just shopping lists and I'll-meet-you-at-5-o'clock-at-the-coffee-shop kind of stuff. I don't care. You've got to realize that when you sign up for these services. Facebook was big on this.
[21:11] When you first signed up for Facebook, they said, hey, give us access to your Google account so that we can look up and automatically find all your friends for you.
[21:18] Oh, that's convenient. Well, that also gives them access to everybody else who's not already on Facebook. And your contact list, your address book, probably has hundreds of people in it, so you're giving away all their information as well, not just yours.
[21:31] So with your address book, when you made that privacy choice, you were making that privacy choice for other people. When you upload pictures, more than likely there's somebody else in that picture.
[21:40] Unless you do a lot of selfies. And even then, selfies often contain other people. Those people, even people in the background, can have their privacy affected by that. AI has gotten so good, and facial recognition has gotten so good, that there are programs like Clearview AI that can identify faces in the background, look these people up, find out their names, find out who they are, and find their social media accounts.
[22:04] It's getting to the point now also where you can look at the background of an image and tell where that picture was taken.
[22:10] So when you upload photos like that to your social media, you might not be thinking about all the other people you're affecting as well. And that becomes a security problem.
[22:17] Let's say, now that Roe v. Wade has been overturned, or if you've got kids: you took a picture at a birthday party, and here are all 20 kids that came to the kid's birthday party at the park, and we all shared it around the block.
[22:29] Some creeper now knows, hey, there are 20 kids that like to go to this park. Right.
[22:34] So your security, your privacy overlap those of others. If you bring a compromised laptop into my home, that means that potentially all the devices in my house are now vulnerable as well.
[22:42] So we've got to think about these things as more than just me things. They're we things. They're a collective good. And you can care as little as you want about your own privacy and security, but when you give yours up, you're also potentially affecting others.
[22:58] Debbie Reynolds: I know you said at the beginning of the show you're not quite tinfoil hat and black helicopter, but I'm very tinfoil hat. I'm not quite black helicopter yet. But are there any stories you've covered recently where you said, wow, we're really crossing, in my view, the 1984 George Orwell threshold?
[23:22] Carey Parker: Yeah, usually those are the ones where you find out that apps or devices are doing some really strange tracking. Like the recent story about the air fryer. There's a smart air fryer, and you think, how smart does an air fryer need to be?
[23:36] Well, it's got an associated app. I guess it can tell you when it's done. I don't know. But the app was sending all sorts of data to some location back in China.
[23:45] I think the tinfoil part of that is most people don't understand how easy it is for these apps to do this, and it usually comes through software development kits and libraries that are reused from other people.
[23:57] Sometimes these libraries are taken over by malicious actors. Everybody used this library because it was free and open source, and then it kind of got abandoned. And then somebody else swoops in, picks it up, takes it over,
[24:10] and puts malware or tracking stuff into it. It's those kinds of backdoor-y things that always seem to blow people's minds. One of the other ones, when I teach my class, that people always shake their heads at is automatic content recognition, which is your TV watching you.
[24:29] There's a reason why a lot of TVs, especially some of the lower-end TVs, are so cheap: they're basically selling them at cost because they can now monetize your data.
[24:39] It's like Shazam for video. Shazam, the music identifier, right? So if you're in the elevator and you're like, oh, what's that song playing? You hit Shazam.
[24:48] It listens for five seconds and says, oh, you're listening to this. It's a similar kind of thing, and it sounds so crazy, but it's true: there's a system called automatic content recognition built into a lot of TVs that can look at the pixels on your screen, determine what you're watching, and then report back.
[25:05] This is the kind of stuff that Carey likes to watch. This is the kind of commercials he's seen today, and they try to maybe map that to a purchase he later made.
[25:12] So they can kind of connect those dots, because advertisers love to do that. And it sounds so tinfoil-hattie, and yet that is a real thing.
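For readers curious how pixel-based content matching can work in principle, here is a minimal, hypothetical sketch of the "Shazam for video" idea Carey describes: reduce a frame to a coarse brightness fingerprint and compare it against a catalog of known fingerprints. This is not any vendor's actual ACR implementation; real systems use far more robust perceptual fingerprints, audio cues, and large server-side catalogs, and every name and reference value below is made up for illustration.

```python
from statistics import mean

def frame_fingerprint(pixels, grid=4):
    """Collapse a frame (2D list of grayscale values 0-255) into a small
    grid of average brightness values, then threshold each cell against the
    overall mean to produce a compact bit string."""
    rows, cols = len(pixels), len(pixels[0])
    cells = []
    for gy in range(grid):
        for gx in range(grid):
            block = [
                pixels[y][x]
                for y in range(gy * rows // grid, (gy + 1) * rows // grid)
                for x in range(gx * cols // grid, (gx + 1) * cols // grid)
            ]
            cells.append(mean(block))
    overall = mean(cells)
    return "".join("1" if c >= overall else "0" for c in cells)

def hamming(a, b):
    """Count differing bits between two equal-length bit strings."""
    return sum(x != y for x, y in zip(a, b))

# Made-up catalog of fingerprints for "known" shows and ads.
CATALOG = {
    "demo_ad": "1100110011001100",
    "demo_show": "0011001100110011",
}

def identify(pixels, max_distance=3):
    """Return the closest catalog entry within max_distance, else None."""
    fp = frame_fingerprint(pixels)
    name, ref = min(CATALOG.items(), key=lambda kv: hamming(fp, kv[1]))
    return name if hamming(fp, ref) <= max_distance else None

if __name__ == "__main__":
    # 8x8 synthetic frame: bright left half, dark right half.
    frame = [[200] * 4 + [20] * 4 for _ in range(8)]
    print(frame_fingerprint(frame))  # '1100110011001100'
    print(identify(frame))           # 'demo_ad'
```

The point of the toy is only that a device never needs to send video anywhere: a tiny fingerprint plus a lookup against a catalog is enough to report what is on the screen.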
[25:21] Debbie Reynolds: So you mean I have to watch TV with sunglasses now?
[25:25] Carey Parker: Well, it's watching the pixels. There are some TVs with cameras and mics built in too; that's a whole other issue. But it doesn't matter what the source is, because it's built into the television.
[25:33] And as long as you've connected that television to the Internet, these systems have ACR turned on, and some of them give you a way to turn it off if you dig into the settings.
[25:42] But yeah, they're watching what you watch and reporting back. Kind of like Nielsen's, but in a much creepier way.
[25:48] Debbie Reynolds: Wow, that's staggering. Oh, my goodness. Thankfully, I don't watch TV. So the point that you made about the air fryer and the TV, this is one reason why I'm very deeply involved in IoT, because I feel like people don't understand, first of all, what these devices can do, and they don't understand why. Like, why would the air fryer need to send information about the chicken that I made to China?
[26:17] So I think for us, it's like we can't imagine why someone wants that, but someone wants that. And someone can sell or monetize that information.
[26:25] Carey Parker: Yep, absolutely. Oh, yeah, the data that they're trying to hoover up is just astounding, because there are no regulations that say they can't do it. And so they are. I mean, these are public companies. You know, I'm a capitalist, I'm not an anti-capitalist.
[26:40] And so I kind of understand that some of these companies, they have shareholders and those shareholders are demanding that number go up.
[26:48] They want the stock value to go up or the dividends to go up. And so if there's money on the table, they expect that company to pick it up. And today that money is monetizing your data.
[26:57] And so there's a lot of pressure in these companies to do that. And because there's no law saying they can't, they're kind of forced to.
[27:04] Debbie Reynolds: Yeah, I think my concern always is that that data will be used, misused and abused, and used against the person. Right. So let's say Carey uses his air fryer and he makes bacon three times a week instead of baked chicken.
[27:21] Right? And then maybe an insurance company says, hey, we want that data, because we're going to change his rates, because he should be eating baked chicken instead of bacon.
[27:31] And this may sound far-fetched, but when I saw that they were putting Internet connectivity in refrigerators, to say, hey, you just ran out of milk, I could see the future where they say, well, we're not going to unlock the fridge for you because you shouldn't be eating ice cream.
[27:50] Carey Parker: Oh yeah, or your rates are going to go up. I mean, I mentioned it before: that is already happening with car insurance.
[27:56] Modern cars are a privacy nightmare. I've been talking with Andrea Amico, who's the head of Privacy4Cars, for many years now. He's a great guy. He and I actually just gave a talk together at a hacker conference in Atlanta about this.
[28:10] And modern cars are just privacy nightmares. These things are chock full of sensors. They're smartphones on wheels, but on steroids because you think your smartphone has a bunch of sensors in it.
[28:19] Your car has a gazillion more sensors in it. And all cars, modern cars today come with a built in cellular modem, whether you pay for the hotspot feature in your car or not.
[28:27] And they're sending telemetry and telematics information back to the manufacturer. But they're also sending things like what you like to listen to on the radio. If you connect your phone, it could potentially be sending your text messages and your address book information to them.
[28:38] And that information and your location information is being turned around and bought and sold.
[28:43] And supposedly you've agreed to this somehow when you buy the vehicle. And I don't know if it's a paper you sign when you're buying the car, or a pop-up when you first turn on the car that
[28:53] everyone just hits okay to and drives on. But you've agreed somewhere along the line to share things like how often you brake hard, how often you accelerate too hard, how often maybe you come close to hitting something and your proximity sensors go off and trigger the brakes.
[29:11] The New York Times did an article on this, and Kashmir Hill, I think it was, who I've interviewed on my show, which was fantastic, did a report on this where this information was being sold in bulk, for pennies per subscriber, by the way, to LexisNexis.
[29:25] And then insurance companies were buying that information and raising people's rates because they could see that they were poor drivers. So that is happening. And I don't think it's a stretch to say that we'll eventually be doing that same thing with medical insurance as well.
[29:39] And based on stuff you buy at the store, if you're using your Kroger loyalty card, you know everything you buy is being tracked, and maybe that's being sold. Who knows?
[29:47] I'm sure that's possible.
[29:49] Debbie Reynolds: Yeah. So you have to buy your Cheetos with cash.
[29:52] Carey Parker: Cash or Bitcoin. Yeah. Or whatever.
[29:56] Debbie Reynolds: Yeah, I know. Kashmir and I actually collaborated with Andrea Amico on a report that I did for the US Department of Commerce around IoT. And we actually had to fight with some lobbyists to get recommendations in the report.
[30:13] The most heated, hotly debated item in that report was that we wanted to have privacy information, just basic stuff, on a car label. And the lobbyists kind of came out of the woodwork, and we had to really go at them to be able to get it into the report.
[30:31] But thank God it's in the report, and we'll see what happens with that. I feel like cars, to me, are the most unique category of IoT device that you can imagine.
[30:42] Right. It's something that you have over many years. You interact with it. I don't know of any IoT device that has so many different sensors in it, especially with this thing about hard braking and different things.
[30:53] It's like you're being graded on a test that you don't know you're taking. That doesn't seem right. You know what I'm saying?
[30:59] Carey Parker: Yeah.
[31:00] Debbie Reynolds: It seems like you should know that you're being rated in this way, and that if you do drive this way, then aha, your insurance rate went up 30%.
[31:10] And it's like, well, why? Then they give you 300 pages of every place you've been and how they rated it? That doesn't seem right. Right?
[31:18] Carey Parker: Nope, it does not. And again, transparency: people don't know. And the other point I like to make about this is that in the United States, we're really built on this notice-and-consent model for privacy.
[31:30] And so everybody claims, oh, we told you we're going to do that, and we gave you a way to opt out. Well, nobody reads the privacy policies, and even if you did, you wouldn't understand them. I forget what the stats are, but they're crazy.
[31:40] Like, if you actually read cover to cover every privacy policy that affects your daily life, you wouldn't get anything else done because they're all crazy long. And even if you did read them, that's legalese and they're meant to be obfuscated.
[31:53] You wouldn't understand it anyway. So this whole thing that we have been informed and are making informed choices is just a lie.
[32:00] Debbie Reynolds: I agree with that 100%.
[32:03] So if it were the world according to you, Carey, and we did everything you said, what would be your wish for privacy or security anywhere in the world? Whether that be regulation, technology, or human behavior.
[32:17] Carey Parker: Oh geez.
[32:18] We could probably do a whole podcast on that one question.
[32:21] I think, actually, kind of circling back to what we said before: I'd like to, you know, think globally, act locally. Do what you can do yourself
[32:33] and make changes where you can.
[32:35] So I would just like people to be aware of what is going on, to understand all the things that are happening. That requires not only taking the time to learn these things, reading a book like mine or others, and listening to podcasts like ours to make sure you're aware of these things, but we also need the transparency.
[32:54] So we need to be fighting for that. That's probably going to take laws, but we can somewhat as consumers demand these things.
[33:01] So I've actually kind of had this dream of a consumers' union, which, I looked it up, was actually the original name, I think, for Consumer Reports back in the day.
[33:09] So it's taken. But the problem with these privacy policies is that the wrong person is writing them. I want to write the privacy policy and then make you sign it for all my devices.
[33:19] I want to come to you and say, here's what I would agree to privacy-wise, and that's what we're going to go with. But the problem is that the people writing these things are not us.
[33:28] And so we almost need collective bargaining as consumers. I wish we had the clout of all of us working together to say we're not going to buy a product unless it meets this stock privacy policy.
[33:41] I've called it Privacy Policy Zero. I thought of it almost like open source software licensing. There's the GPL and some of these well-known, named open source licenses that somebody took the time to write, and then everyone kind of says, okay, we agree, these are good.
[33:57] And because I'm writing open source software, I'm going to publish it under this well known established third party license. I would love to have a third party privacy license written by consumers for consumers that we collectively force these manufacturers to agree to for their products.
[34:13] Because I think if we all worked together, if we had collective bargaining capability as consumers and decided to sign a petition saying, I'm not going to buy an IoT product unless it satisfies this privacy policy and these privacy rules.
[34:28] That's my pie-in-the-sky dreaming. But at a more practical level, I guess I wish that we as consumers would fight back more, demand more transparency, and, by voting with our wallets, try to support companies that are doing the right thing, or at least trying to do the right thing, and incentivize new startup companies to fill that space, because there's money to be made, because people are willing to pay for privacy and security.
[34:49] And then to tell all the other ones, the incumbents: hey, these startups are going to eat your lunch if you don't also start making things that are private and secure.
[34:57] I think waiting for regulation is going to take time, but in the meantime, I kind of hope my dream is that as consumers, we could all get educated enough to demand these things from the people that are making these products.
[35:09] Debbie Reynolds: I agree with that. And I think consumers have more power than they think they have, especially if we work collectively. And I think that was probably the learning that I got from a lot of Kashmir Hill's reporting in the New York Times, where the things that she was talking about weren't necessarily, quote, unquote, illegal.
[35:27] Right. But there were things that consumers didn't like.
[35:30] And so I tell companies, I'm like, are you going to give up a dollar to make a dime? Right. So you're making pennies on selling the data, but then you may lose a brand-loyal or lifelong customer who's buying these high-ticket items.
[35:44] Right. Like cars and different things because you're not handling their data. So hopefully that message is coming across. And I do like your thinking there because I think that that is a more direct bottom line impact on some companies around how they handle data.
[36:00] Well, thank you so much. This has been so much fun, and I would love to find ways we can collaborate in the future.
[36:07] Carey Parker: Yeah, absolutely. This has been fantastic. Thanks for having me on your show.
[36:10] Debbie Reynolds: You're welcome. Thank you.