E19- Jeff Jockisch – CEO Privacy Plan, Data Privacy Researcher


Data Diva Jeff Jockisch

40:10

SUMMARY KEYWORDS

data, people, data brokers, podcasts, privacy, companies, information, problem, FTC, happening, purpose, collect, list, law, world, de-identified data

SPEAKERS

Debbie Reynolds, Jeff Jockisch

Debbie Reynolds 00:00

The personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds; they call me "The Data Diva." This is "The Data Diva Talks Privacy" podcast, where we discuss Data Privacy issues with industry leaders around the world to tell businesses things that they need to know now. It was hard for me to sleep last night because I was so excited to have Jeff Jockisch on my show. He is the head of PrivacyPlan, and Jeff also puts together this fantastic curated list of Data Privacy podcasts. I think you and I had been communicating with each other, or commenting on each other's posts on LinkedIn, probably even before you started putting that out. But your list has gained so much traction because I don't think anyone is curating privacy podcasts in the way that you're doing. And it's very important because, as you know, when we look at podcast directories, they're very general. So being able to have someone like you who can go through and curate that list, and do it in such a detailed manner, is amazing. Just looking at your database and how you track this stuff is unbelievable. So we'd love for you to say a bit about yourself. We may have to do a couple of these episodes because I have so many things I want to ask you. So we'll try to keep it down to 30 or 40 minutes, but I'm not sure.

Jeff Jockisch 01:47

It's good to be here, Debbie. I appreciate the invitation. Yeah, I've been doing these Data Privacy podcast lists for a little while now. It's actually relatively new. I started tracking podcasts when I was studying for my CIPP/US over the summer, but I really was just doing it for myself. I had a relatively small list, maybe 10 or 15 different podcasts, that I was listening to and tracking for my own purposes. But sometime, I think it was in November, it might have even been closer to December, somebody on the IAPP listserv asked about what podcasts people are listening to about privacy. And so I said, well, maybe I should post my list. I had this small data set, so I posted it, and it sort of grew and grew and grew from there. And I do think we were probably interacting before that, but that's when we started interacting a whole lot more. The podcast list has really turned into something that's, I guess, gotten me some attention from the privacy community. And I've really loved turning it into a real pet project, if not more than that, because it's grown pretty massive. It's actually got 75 different podcasts now that I'm tracking, and there's a whole lot of data. I've actually added a whole new tab to the dataset now that tracks individual episodes. And I'm considering doing a project where we not only get the titles of the episodes and the little synopses, but actually go back and try to get some podcasters to provide transcripts, and do some natural language processing of the text in the podcasts to see how privacy professionals talk about the world, and compare that to how normal people talk, which is, I think, sort of interesting.
There are some studies that I've done in the past comparing the way people in different professions talk about language sets, and you can find some really compelling things by doing those kinds of analyses.

Debbie Reynolds 04:16

Oh, that's fascinating. I love it. I'm actually a fan of linguistics, or psycholinguistics. That's one of my little secrets. So that will be fascinating. Your list is THE list. There is no other list besides this list as far as I'm concerned, because you do such a great job, and I've just been in awe of the detail that your list goes into. It's fascinating, and I agree; you've listened to a lot of these podcasts, and I like what you just said: people do talk about it differently. For me, because I am very steeped in technology, and I understand the law, I always try to bridge that gap between the two, because, as I can tell, you're a data person like me. So to me, the patterns of data really fascinate me. One thing I will say is that a lot of times when I'm editing the text or transcripts for the podcast, it's funny because some of the editing software doesn't really comprehend people well enough when they're talking about big issues. So when you talk about the world, they're thinking about what's down the street; the scope of what they're thinking about is different. So when I'm talking with people like you, who understand the concepts and make them simple, I know I have to do this extra look at the editing, because a lot of times the software assumes that things are more provincial, for lack of a better word. But one thing I will say about you and your commentary: I'm always excited. I try to tag you on stuff that I think you'd be really interested in, and you always come back with elevated commentary, bringing the conversation up but doing it in a way where it's still relatable. That is a great skill to have, and you do it really well. I feel like sometimes you're like a hot knife through butter.
You can really slice into these concepts, and you just have such brilliant commentary.

Jeff Jockisch 06:48

I think maybe you recognize that because that's what you do a lot of, Debbie. I appreciate the comment. I think maybe I got a little bit too much into the weeds of what I want to do with my podcast list. But I don't know if all your listeners realize what I really try to do with that podcast list. It really started out as just a list, right? And then I started adding more information about each of the podcasts. You know, who are the podcasters? What are the companies that sponsor them? And then I started saying, okay, well, how can we categorize each of these? So I took each of the podcasts and said, well, how uniquely focused on Data Privacy are they? What other focus do they have? Do they focus on Cybersecurity? Do they focus on risk, on data governance, on policy? And I actually created a whole set of tags and a taxonomy to try to describe them, so that if you're interested, you can go and search on those terms. If your particular interest is in policy, you can find the podcasts that focus more on those kinds of things, or if you're interested in something else, you can find those. And I've actually put a page up on my website now where you can search even more easily than doing it through the data set. But once you've got those kinds of taxonomies in place, then you can start adding additional things where you can actually rank and rate them, right? And when I started rating podcasts, that actually got more people's attention, because they're like, oh wow, okay, now we can actually see who's maybe higher ranked than everybody else. And that was something I didn't really expect was going to be all that popular. But it was, right?
And then, of course, the other thing that I try to do is tell people what I'm actually listening to. So once a week, I'll do my sort of top three, which is getting harder and harder when you're listening to so many, but picking out a couple of the best episodes of the week is pretty great for me, and it helps me promote people in the industry.
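The tag-and-taxonomy search described above can be sketched in a few lines. The podcast entries, tags, and ratings below are invented for illustration, not drawn from the actual dataset:

```python
# A minimal sketch of tag-based filtering over a curated podcast
# dataset; every entry here is hypothetical.
podcasts = [
    {"name": "Podcast A", "tags": {"privacy", "policy"}, "rating": 4.5},
    {"name": "Podcast B", "tags": {"privacy", "cybersecurity"}, "rating": 4.0},
    {"name": "Podcast C", "tags": {"data governance", "risk"}, "rating": 3.5},
]

def find(tag):
    """Return the podcasts carrying a tag, highest-rated first."""
    hits = [p for p in podcasts if tag in p["tags"]]
    return sorted(hits, key=lambda p: p["rating"], reverse=True)

print([p["name"] for p in find("privacy")])
```

Once each podcast carries a tag set and a rating, both the topical search and the "top N" ranking he mentions fall out of the same structure.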

Debbie Reynolds 09:13

Well, I've been very proud that you featured me a couple of times on your list. I was so shocked because I was like, oh my god, what have I done? It's nice to know. For me, I've been doing a podcast because I love having one-on-one conversations with people, and I'm just naturally inquisitive and interested in this topic. So it's not hard for me to come up with questions; I probably have more questions than we have time for. But that type of recognition really has been very helpful, and it helps me know that I'm talking in ways that people really appreciate. Sometimes you don't get that feedback from podcasts, right?

Jeff Jockisch 09:58

Well, absolutely. You know, it's come to my attention that most podcasters really get very little feedback. I mean, maybe you understand how many people are following you, right? Maybe, if you're getting ad revenue, which most privacy podcasts don't get, you see that number. But you're getting very little listener feedback unless they happen to rate you on Apple Podcasts, and that happens very rarely. So you're really getting very little; it's hard.

Debbie Reynolds 10:30

Yeah, you're doing a great service, a great service to the industry. Well, let's get into what we wanted to talk about. This is hard for me because I have so many questions I want to ask you, but one thing that we chatted about was data brokers. I think data brokers are a topic that doesn't get covered a lot, and you know quite a lot about it. So I'd love to dive in and get your thoughts: who are data brokers, what do they do, and what is the challenge with data brokers?

Jeff Jockisch 11:08

Sure. So my company PrivacyPlan is a small company, and what I've really started to do is change its focus to be on privacy datasets, datasets that are focused on the privacy world. And over the summer, in addition to building some datasets on podcasters, what I was really focused on was building a data set of data brokers. I started this dataset by looking at the Vermont data set, because brokers have to register in Vermont. There's also a data set in California, because there's a California data broker law. And there's also the privacy rights website, I'm forgetting exactly what the name of that is, but there's a nonprofit organization that tracks data brokers, though their data set's pretty old now. So I took all that information, essentially scraped it, and put it into a big data set, and I think I ended up with about 600 different data brokers. Then I expanded that information by going to a data broker and paying for information about data brokers from a data broker, right? And then I said, well, this isn't enough; there are lots of data brokers that aren't in here. So I started building that out by hand, by adding in Adtech companies, by adding in social media companies, and using a variety of different tactics to build out that infoset, both vertically and horizontally. At this point in time, I've got about 1400-plus data brokers listed in my dataset, and I've got about 180-plus fields of information on each one of them. Not every single one has all 180, of course, but it's pretty deep, and it spans a variety of different types of information. So I've got basic information about them: who the company is, their descriptions, different keyword tags about the companies, where they're located, phone numbers, email addresses, things like that.
But more importantly, I have information about what industries they're in, so we can figure that out. And then I've got classifications, which are unique to the stuff that I do. Back in 2014, the FTC did a big study of data brokers, where they really said that data brokers are sort of an intrinsic harm, but then they never really followed up and did anything about it. Part of that study put brokers into three different buckets, and those buckets were roughly people search engines, marketers, and risk assessors. I think I've got that pretty close to how they described it. I've taken that taxonomy and expanded it a little further, so the way I group these data brokers is in four different buckets: people search; screeners, which is essentially risk assessors, but a little bit broader; target, which is all kinds of different ad marketers; and then monetize, because there's not a group that really covers the type of companies that collect information but don't sell it. Rather, they monetize it by keeping it inside. This is primarily large social media companies and other organizations. There probably should be a fifth bucket, and that is data collectors: people that collect massive amounts of information but don't yet monetize that data. The problem is that if you were to add that, you'd potentially have every company in the world listed in your data broker list. But you know, we have this problem in the United States where there are really no laws, other than maybe the CCPA, where there's purpose limitation, right? So they can collect all this information, and tomorrow, unless the CCPA steps up or the FTC gets involved, they can take your data and use it however the heck they want. There are some limitations to that, obviously, but not a whole lot. And so it's sort of a scary world out there.
So anyway, that's the way I've organized these companies. And then there are also lots of different pieces of information I've gathered on what information they collect on you, what markets they target, and what information services they provide. Are they collecting location data? Are they collecting this kind of data? Are they targeting human resources people? Are they targeting this kind of market? It's pretty extensive. And then, probably the most important thing that you would be interested in: I've also cross-referenced all of these data brokers against privacy problems, for lack of a better word. So I've cross-referenced it all against data breaches. I've cross-referenced it against FTC consent orders. I've cross-referenced it against big-name privacy lawsuits. I've cross-referenced it against law enforcement ties. And I've even gone so far as to develop, at this point, a relatively crude metric to try to figure out how good or poor they are at being a steward of people's data; I'm probably using the wrong word, it's escaping me at the moment, but it's an index to figure out how good or bad they are at keeping the public's trust with their data. At some point, I'm going to put together a sort of top 10 list, or a top 100 list, on how well they're keeping the public's trust or abusing it.

Debbie Reynolds 17:36

That's fascinating. I would love to see that. And again, you're doing a service that we definitely need, because very few people know anything about data brokers or the information that they're capturing. What people don't know is that there are essentially data dossiers floating out there about individuals. And actually, I've seen some pretty extensive categorization of people. So, you know: suburban dad, drives a pickup truck, has two kids in college, right? The way that they target people is very precise, I will say. And then, too, you have brokers that are collecting different information but combining datasets together. They may be joining forces to pull more information. So let's say, for instance, you gave a company information about you for one purpose, and you gave another piece of information to a different company for a different purpose. That data may end up in two different data brokers' hands, and they may combine that information together, where maybe it was fine the way that you gave it to them, but now they've sort of changed the rules by pulling that information together.
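The dataset-combining just described is, mechanically, a database join on shared quasi-identifiers. Here is a minimal sketch; the records, field names, and the idea that ZIP code plus birth year are the shared keys are all hypothetical:

```python
# Two brokers each hold a "harmless" fragment about the same people,
# keyed by the same quasi-identifiers (ZIP code and birth year here).
broker_a = [
    {"zip": "60601", "birth_year": 1980, "purchases": "pickup truck"},
    {"zip": "10001", "birth_year": 1975, "purchases": "golf clubs"},
]
broker_b = [
    {"zip": "60601", "birth_year": 1980, "kids_in_college": 2},
]

# A simple inner join on the shared keys re-links the fragments
# into a single, richer profile.
index = {(r["zip"], r["birth_year"]): r for r in broker_a}
profiles = []
for r in broker_b:
    key = (r["zip"], r["birth_year"])
    if key in index:
        # Merge the two records; the combined profile now holds
        # fields that neither broker collected on its own.
        profiles.append({**index[key], **r})

print(profiles)
```

Each dataset may be innocuous on its own; the join is where the dossier appears.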

Jeff Jockisch 18:59

Right. And that's really what's happening; the largest data brokers out there have aggregated all of that information, right? I don't necessarily want to point to any particular company, and I'm going to throw this name out without saying they're any better or worse than anybody else. But Acxiom claims to have 11,000 data points on virtually every consumer. 11,000. I mean, that's probably every single data point that could exist on you, right? Maybe there's more, but that's a hell of a lot. Right?

Debbie Reynolds 19:35

That is a lot of data, and it doesn't necessarily mean it's correct, either. And then, too, I think the thing that concerns me, and I tell people about this a lot: for people who are not putting anything out, or not exercising any control over their name and what they're doing in the marketplace, those gaps get filled in by other people, and that may not be correct, either. So being able to put out who you are and what you're about definitely helps. But I think people who don't do any management at all of their digital identity run the risk of having someone make up stuff about them that may harm them in the future.

Jeff Jockisch 20:29

Yeah. And, you know, it's not just that. I think the accuracy problems that these companies have, particularly the screeners, are getting more problematic because there's more and more data. The data glut that's happening out there is actually making the accuracy problem a lot worse, regardless of whether you're trying to control your information or not. I've seen a few studies, I can't quote any specifically at the moment, that essentially showed there's a lot of inaccurate data out there. I can tell you, and this is a little bit tangential, that of all the data brokers I've got in my data set, 59 of them have actually been breached. That's a heck of a lot. Regular companies getting breached is one thing, but a data broker getting breached is even worse, right? But the other problem with screeners, who are screening you for employment or for loans or for an apartment, is that it's better for them to exclude you and say, oh no, this guy might be a bad risk, than to allow you to go through and then turn out to be a problem. So if your name is Bob Smith, and there's some other Bob Smith who's got a problem, it's better for them to err on the side of caution than to err on the side of, oh no, that was the wrong Bob Smith. So they don't have a whole lot of incentive to fix that problem.

Debbie Reynolds 22:08

Right. And because of the way that the data brokers work, they don't necessarily have to tell you how they got the data. Right?

Jeff Jockisch 22:18

Well, I mean, if they're abiding by the background check laws, right, they have to tell you that they are taking adverse action on you based upon your credit rating or credit score. But that doesn't always happen if they're using alternative data sources, right?

Debbie Reynolds 22:37

Right. Yeah, that's very frightening. That's true, right? Because companies really want to reduce their risk. And so if these companies are saying this person is riskier than that person, you may get denied something and not even know why. Instead of just being told you had an adverse action, especially in employment, they can use some of this information to possibly exclude you before you ever enter their system. So your resume goes into the trash based on what they can find out beforehand, and you would never know.

Jeff Jockisch 23:20

Right? And that definitely happens. I mean, there have been some studies that show that just based upon your name sounding foreign, you may not even make it to the recruiter, or whatever. I forget the exact circumstance, but there's definitely some proof that that's happening.

Debbie Reynolds 23:38

Well, what do you think is happening with the data brokerage industry, especially as it relates to regulation? Obviously, with things like the GDPR in the EU, they are very strict on the purpose of data. And then we see it with the CCPA in California, where they're trying to pin down people sending data to third parties and making sure that they let them know about that. And I think that's going to be the trend. But purpose, to me, is a big problem with a lot of this data collection, because all of this data collection is done indiscriminately, and it doesn't really have a purpose at first. And then down the line, maybe they'll make one. So I think it's a danger, and maybe I'm wrong about this, tell me what you think: if laws are going in the direction of saying that before you collect data you have to have a purpose for it, that is a danger, I think, to data brokers, because they're just vacuuming up data and then hoping to find a purpose for it, as opposed to having a purpose.

Jeff Jockisch 24:52

Yeah, no, I think you're definitely right. I mean, the CCPA threw in a purpose limitation amendment, I guess just a little clip right at the end, right before it passed. And the Attorney General has made a clarifying statement on that, but it actually weakened it a bit, based upon whether the change in purpose is material or not. But then the problem with that law is there's no private right of action anyway, right? So it's got to be enforced by, well, with the new CPRA, it's got to be enforced by that agency. But I didn't actually see whether or not the CPRA made any change to that purpose limitation. I don't think it did, but I could be wrong there.

Debbie Reynolds 25:43

I don't think it did. But that's really an important point, and that's sort of a gap in our laws about purpose. I hope that we can shore that up in the future, because I think purpose is very important in collecting. A lot of my European friends, you know, feel this intrinsically, in terms of their fundamental human rights, about just having too much data. You'd be surprised what inferences people make based on data, and those inferences may not be correct, because they just collect this stuff.

Jeff Jockisch 26:21

Yeah, absolutely.

Debbie Reynolds 26:23

Now, this is fascinating. With the Vermont law, I like what you're doing. When the Vermont law came out, the data broker law, I thought it was a good step. But the problem that I have with it is that because people don't know who these data brokers are or what they do, that list is not very helpful. I think of myself as an individual: I went on that site, I looked, and it might as well be hieroglyphics, because I'm like, what does this mean to me? I don't understand, right? So how can people find a way to understand what data brokers are doing, or find out how their data is being used? Or can we?

Jeff Jockisch 27:07

I don't think it really helps a whole lot, right? I mean, it's a good first step. But Debbie, here's the problem: if I gave you a list of 1400 data brokers, what are you going to do? Are you going to email all of them and tell them to stop using your data or selling it, or to delete it? First of all, you've got to find their portal, and then each of them has a whole separate process for how you've got to make that request. And then you've got to make sure that they actually follow it. It's a problem. Consumer Reports just did sort of an exposé, and I've got a friend who's actually been doing some research on it, and it's really not a very effective process right now. In fact, there are some third parties that have been trying to help people do that, where you could go to them and, as a third-party service, they would try to do all that stuff for you. It's just not really working very well. In part, I think those third-party services aren't very good yet, but we're also still in this notice-and-consent framework, and it's just not going to work if that's what we're up against, when there are so many of them and they're just going to keep growing.

Debbie Reynolds 28:23

Very true.

Jeff Jockisch 28:24

And they've all got a different process, right? You know, there's another piece of this that we probably ought to mention, and that is that I told you there are all kinds of different types of data brokers. The one thing we didn't discuss is: what the heck is the definition of a data broker? If you look on Wikipedia, the definition of a data broker is anybody that collects public and private data and aggregates it; that's a synopsis of what it says, right? But if you go to the Vermont law and the California law, they put a lot of limits on that. The law says that you're only a data broker if you sell the data, right? And, of course, some of the other pieces of legislation make it a little bit different. So there are all these caveats and opt-outs. And I'm not even sure some of these companies that are data brokers even realize that they're technically a data broker. So if they're not registering, it may be because they don't even know that they're technically a data broker.

Debbie Reynolds 29:33

Yeah, wow. We need to find some way to make this more visible to people in terms of how their data is being handled. Because, you know, I try to follow these things pretty closely, and I am confused. So the average person just has no shot, no chance whatsoever, to view or even comprehend what's happening with their information. One thing that we talked about, something that you bring up in commentary that I actually adore and want to talk about, is privacy as a human right versus a property right. I would love for you to talk about that and your ideas about it. Maybe I'll chime in and let you know what I think.

Jeff Jockisch 30:26

Sure, sure. So, you know, I think it probably needs some more flesh; my ideas need more flesh. But in essence, I think privacy needs to be a human right. In Europe, they passed a law way back, I'm probably getting the wrong timeframe, but I think back in the 40s or 50s, recognizing that privacy was a human right. But in the United States, we've never really recognized that. California has the right of privacy in its constitution, but we've never really called it a human right. We've always approached it as more of a transactional right or a property right. And as long as we're in that modality, where I can sell my property right for money, or trade it for access to a social media app, or do something like that, then I think we're in the wrong mode. There are lots of people that want to get into this data economy where, you know, maybe I can make money off of my identity and my data by trading it. So instead of having Facebook make money off of my data, I'll make money off my data. And that sounds pretty good when you think about it initially. In fact, parts of that actually are good, right? If you're talking about your intellectual property, about your posts and your content, I actually think that is pretty good. I mean, if you're creating content on YouTube, you ought to be getting part of the revenue from the ads that show up on your content, right? But when you're talking about your actual identity, that's a different proposition. What you don't want is people selling their organs for money, right? You don't want people selling their identity so that they can pay for a meal.
Because what that does is it means that it's only the marginalized communities, the people that have no other recourse, that are going to be giving up their identity, while everybody else, you know, the rich folks, don't have to give up their identity. You create a sort of dual-class citizenship over identity, and that's a really bad proposition.

Debbie Reynolds 32:56

It is terrible. You know, I feel like, in the digital age, there is a caste system being created anyway: the haves and the have-nots, where the haves are the people who have data and insights, and the people who don't aren't going to get ahead. But I agree with you wholeheartedly about privacy as a human right versus a property right, because not every human is a consumer. In our system, those rights kick in in terms of an exchange of some sort. In my view, you should just have that right, so that you don't necessarily have to trade it. My thing is, in the US, do we have a right not to share our information? And really, the answer is no, because of the way that it's articulated in our system: there has to be an exchange of some sort. The fact that you don't want to give your data isn't really protected in the way that we want it to be, because it assumes that there has to be a transaction. So for me, that's the reason why privacy should be a fundamental human right, because I need to be able to have the right to not share my face.

Jeff Jockisch 34:31

Right. I mean, if you look at what the FTC is going to enforce, or what a state Attorney General is going to enforce, if they're going to enforce a privacy problem, it's unfair trade practices, right? That's what Section 5 is for the FTC, and it's unfair and deceptive trade practices on the state level. That's mostly where all the privacy enforcement comes from. It's a contract right being broken. So you're right on target there. This brings up another point that we've discussed, which is the whole concept of a data fiduciary, right?

Debbie Reynolds 35:05

Right.

Jeff Jockisch 35:06

When you give your money to a CPA or somebody else, you know, a bank, they have a fiduciary responsibility to take care of that money for you. If I think about data as being similar to money, then when you give your data to a company, shouldn't they have that same fiduciary responsibility for your data? There are some people that actually think maybe they already have that responsibility; it's just not being enforced. Or perhaps we need to pass legislation to make that more clear to them. But that would be an interesting way to get us privacy rights, maybe instead of, or in addition to, national privacy legislation.

Debbie Reynolds 35:52

I always ask people this question, Jeff: if it were the world according to you, and people did whatever you said, what would be your wish for privacy regulation, either in the US or around the world?

Jeff Jockisch 36:08

That's a good question. I think what I would probably change is how we deal with de-identified data. We've had a few discussions around that. To put it in a nutshell, I don't think people understand the risk of de-identified data. And this sort of goes back to the data brokers, right? Not just the data brokers, but other companies, too. We think that when we de-identify data, we're making it safe. But it's just so easy to re-identify data that it's not safe. You can re-identify data so easily, in so many cases, that it's just scary.

Debbie Reynolds 36:50

Yeah, I love that you said that. Actually, I really like that when I ask people this question, I don't think I've ever had two people say the same thing. So this is great. But I agree with you. Back to your point about the data brokers having so many thousands of data points on people: the legal requirements, or the legal measures written into laws about how you de-identify data, are not sufficient, because there are so many data points that can be used to re-identify a person. I read an article many years ago saying that it really only takes three pieces of information to identify someone. And I can attest that this is true, because it happened to me recently, where someone was trying to tell me about some person, and they said, he has a red beard, he's a lawyer and a technologist. I'm like, I know exactly who that is.

Jeff Jockisch 37:51

Right? You know, there are a couple of other quick data points, pieces of information, we could share with people too, right? If you take four spatio-temporal data points, right, time-and-location stamps, you can identify a unique person 95% of the time. Oh, wow. Yep. Right. And some European researchers back in 2019 found that if you take any random 15 demographic features, you can identify virtually any person in the world uniquely.
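The re-identification risk Jeff describes can be illustrated with a short sketch. This is a hypothetical toy example (the records, fields, and values are invented for illustration, not drawn from any real dataset): even after names are removed, combinations of ordinary quasi-identifiers like zip code, birth year, and gender are often unique, and a unique combination pins down one individual.

```python
# Minimal sketch of re-identification risk in "de-identified" data:
# count how many records share each combination of quasi-identifiers.
from collections import Counter

# Hypothetical de-identified records: names stripped, but quasi-
# identifiers (zip code, birth year, gender) retained.
records = [
    {"zip": "60601", "birth_year": 1980, "gender": "F"},
    {"zip": "60601", "birth_year": 1980, "gender": "M"},
    {"zip": "60602", "birth_year": 1975, "gender": "F"},
    {"zip": "60601", "birth_year": 1980, "gender": "F"},
    {"zip": "60603", "birth_year": 1990, "gender": "M"},
]

quasi_ids = ("zip", "birth_year", "gender")
combos = Counter(tuple(r[q] for q in quasi_ids) for r in records)

# Any combination that occurs exactly once identifies a single person.
unique = [c for c, n in combos.items() if n == 1]
print(f"{len(unique)} of {len(combos)} combinations are unique")
# → "3 of 4 combinations are unique"
```

With just three fields, three of the four value combinations in this tiny dataset are already unique; real datasets with thousands of data points per person are far worse, which is why studies find that a handful of attributes suffices to single people out.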

Debbie Reynolds 38:27

Oh, my goodness. Yeah, we need the regulation to catch up to the fact that it is so easy to identify people. And I think it's going to become even easier, not just because of the data collected, but because computing is becoming more advanced, so they can do these things much more easily. I know a lot of people are obviously concerned about facial recognition, but there are so many other data points by which people can be identified. You don't need a face; your pattern of life is something that identifies you.

Jeff Jockisch 39:07

Yeah, absolutely.

Debbie Reynolds 39:08

So Wow. Well, I have to have you back because we have lots and lots of stuff to talk about, I see.

Jeff Jockisch 39:15

Sounds wonderful.

Debbie Reynolds 39:16

Well, why don't you tell me how people can contact you? I would love for them to know how to access your data broker research, your podcast list, and your company in general.

Jeff Jockisch 39:29

Well, sure, you can contact me at jeff@privacyplan.net. And all of the information, including the data broker and podcast databases, is right on my website at PrivacyPlan.net.

Debbie Reynolds 39:43

That's amazing. Well, thank you so much. I'm going to write a note to myself: we should probably talk in a couple of months to see if anything different has happened, or we can do a deep dive into some of this other stuff, because I think it's fascinating.

Jeff Jockisch 39:57

Sounds wonderful. Great talking to you.

Debbie Reynolds 39:59

Thank you so much, Jeff.
