E76 - Heidi Saas, Data Privacy and Technology Attorney, Washington DC

44:58

SUMMARY KEYWORDS

privacy, people, data, technology, happening, ai, companies, systems, tracked, work, cannabis, build, bias, irs, problem, consent, thought, human, impacts, contracts

SPEAKERS

Debbie Reynolds, Heidi Saas


Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world, with information that businesses need to know now. I have a special guest on the show, Heidi Saas. She is a Data Privacy and technology attorney from the East Coast at Saas, LLC. That's your company. Welcome.


Heidi Saas  00:42

Thank you so much, Debbie. Thanks for having me.


Debbie Reynolds  00:44

Oh, yeah, you are super-duper sassy. I love your work. We've been connected on LinkedIn for a while, but the first time I ever saw you in action was in a video; I think you were hosting the Data Collaboration Alliance's Data Drop session. They have sessions every month where they have different people on to talk, and you were a host. And I was so impressed by your manner. You're just a ball of energy, a very straight talker. You're very funny, very witty in the way that you put things, and you like to cut through the BS and put things out there. So I love your work, and it's so fun for me to be able to do this show with you today.


Heidi Saas  01:31

Thank you so much, Debbie; I love your work. I collect smart people. And I think I told you, you're one of the first people that I came for when I started working in privacy a couple of years ago, because I thought, oh my gosh, she's sharing all this information, and I never heard you say anything that was wrong. And I certainly don't know everything. We learn so much by listening to each other, and you provided a great resource for people: practitioners as well as business owners, everybody that's listening to your stuff. We're all learning at the same time because it is a new industry. But you were one of the first people I went for. I said, oh man, "The Data Diva", she totally knows what's up. If she said it, it's the gospel; just go with it. And I've told everybody I've mentored into privacy through the exam since then: you've got to follow her immediately and listen to what she says. And if you're lucky enough to talk to her, then make everything you can of the opportunity. So I sent you one of the first articles that I wrote in privacy, because I had been working at it for a while and taken the exam, and I felt like I knew enough to start writing articles. It's kind of a pet peeve when you see new people show up and immediately start writing about a field, so I wasn't really sure. I reached out to you and sent it to you, and you took the time to look at it and say, no, this is good information, do this, and then gave some excellent pointers about it. It made a significant impression on me that you really gave your time like that. It says a lot about the community that we're in. The relationships matter, I think, the most.


Debbie Reynolds  02:55

Oh, wow. Oh, my goodness, that must have been a while back.


Heidi Saas  02:59

It was quite a while ago, but I never forgot. I'm like, oh, man; she's one of my heroes now. So I'm excited to be on your show to talk about all kinds of things.


Debbie Reynolds  03:09

Oh, wow. I totally forgot about that. Heidi, oh, my goodness. That's so cool. 


Heidi Saas  03:14

You're changing lives everywhere, Debbie; you don't even take credit for it.


Debbie Reynolds  03:20

This is fun. You're doing a great service. You know, I love talking to people who are attorneys but have a real passion for technology, and you're definitely one of those people. And I think people have this idea that if you're an attorney, you're a very stiff person, and you're just not that way. So I love that you have so much energy, so much knowledge, and this passion; it just comes through for sure. Why don't you tell me about your journey into privacy? What is it about privacy that got you interested, and how did you get to where you are right now? Oh, wow, that's awesome. I didn't know that. That's amazing. I love that you are focused on access to justice issues. I've always been concerned about that. First of all, even setting privacy aside, we have a tremendous access to justice problem. There's a statistic I saw in Illinois that was staggering; I want to say more than 90% of people who show up in court are not represented, which is horrible. And over the years, as I watched technology being developed and more data being collected and used in different ways, I was dismayed that law wasn't keeping up at all. Technology will always outpace law, but we're light years beyond where the legal system needs to be. And I feel like the way data is being collected now, we're creating a new caste system, in my opinion, because unfortunately privacy comes down to who can afford it: who can afford to buy an iPhone, who can pay for different things. Like I like to say, Warren Buffett doesn't have privacy problems, you know what I mean? It's the rest of us trying to figure out, first of all, what data an organization has about us, whether it's even accurate, and what it will be used for. And that part isn't as transparent as it should be.


Heidi Saas  08:01

I agree. Yeah, you're right that privacy is one of those things that the Founding Fathers should have included. So here's a visual for you: believe me, if TMZ had had drones flying over the Jefferson plantation, watching what was going on, we'd have a right to privacy today. They did not anticipate these sorts of invasive things happening. They really didn't.


Debbie Reynolds  08:22

That's right. Exactly, exactly. So what is happening? You post a lot; you write a lot. I follow your stuff religiously. You really dig in deep on the issues. So tell me what's happening right now in privacy that's got your goat or raised your eyebrow.


Heidi Saas  08:45

You know, I really want to invest more in data destruction technologies. And then, once we're there, I want to mandate their use, because we're awash in unstructured data, and we're not slowing down on collecting; with the 5G network, each user is predicted to generate a terabyte of data a day. Well, what are we going to do with all of that data? And now we're moving toward a geopolitical consensus that privacy is at least something important; in some places it's a right. It deserves different treatment. You can't just keep collecting and hoarding all of this data, so I think data destruction technologies need to handle that. What really bothers me lately is technology stacks built with open source components that nobody audited. They just moved too fast putting things together, and they're not building things the right way. I see a lot of that. And also vendor contracts. I'm seeing a lot of problems with vendor contracts, because you finally get a program in place, and then you have to change all the vendor contracts to make sure they comply. And if they don't, can you ditch the contract? I'm seeing internal struggles between CISOs and in-house counsel: the CISOs say, we get to make these changes, and in-house counsel says, you don't tell me what to do with my contracts. Lawyers are sometimes like Terminators; you've got to bring in another Terminator to take them down or get them back in the box or whatever, so I come in to handle those kinds of disputes. But I am seeing more CISOs stepping forward to say, I'm going to own this until it gets done, and then hand it off to whoever needs to run it. It's encouraging to see security and privacy starting to work together with legal to move forward. Pretty much everybody is lining up over here against marketing. But those are some of the things that I see. What really gets me, what really gets my goat, as you said, is the dark money influence in DC. In the last three years especially, it has been raining new white men and money from the West Coast, and this is a town that already overflows in both categories. But hey, the more, the merrier, right? Welcome, Jared. So they're all there, all working on these things. But what people don't really understand about the stalemate on privacy and technology is that there are so many big-money interests in technology, and they all have disparate interests in where they want to go. The stakeholders aren't all fighting for the same thing; they each want their own custom outcome. So there's no consensus amongst the power brokers in the dark money world about what they bought. Until they can decide what they want, they can't tell the legislators what they need. So we've got the stalemate, and on top of that, the excitement of midterm elections this year. I don't know that I see a lot getting done. And that may not be the worst thing, because if something gets done hastily, it could be done poorly as well. So I think that's what bothers me the most: they can't get on the same page. You're all there to do something. There are too many think tanks thinking about themselves. So that's what I think.
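Editor's aside: the mandated data destruction Heidi describes can be pictured as a simple retention rule. Here is a minimal sketch, assuming a hypothetical 90-day window and a made-up `collected_at` field on each record; real policies would vary the window by data category.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention window; a real policy would differ per data category.
RETENTION = timedelta(days=90)

def purge_expired(records: list[dict]) -> list[dict]:
    """Keep only records collected within the retention window.

    Each record is assumed to carry a 'collected_at' timestamp; anything
    older than the window is destroyed instead of hoarded.
    """
    cutoff = datetime.now(timezone.utc) - RETENTION
    return [r for r in records if r["collected_at"] >= cutoff]

records = [
    {"id": 1, "collected_at": datetime.now(timezone.utc) - timedelta(days=10)},
    {"id": 2, "collected_at": datetime.now(timezone.utc) - timedelta(days=400)},
]
print([r["id"] for r in purge_expired(records)])  # [1] -- record 2 is destroyed
```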


Debbie Reynolds  12:03

Yeah, I agree. I agree. You know, I don't like to make money on people's pain, you know what I mean? Some people do. When elections come up, I know that nothing is going to happen on privacy. In an election year, the midterms, people in political circles are trying to get elected, and privacy hasn't really risen to the level where it gets talked about. If they had a top ten list, privacy doesn't even end up on that list.


Heidi Saas  12:45

Yeah, they're definitely polling on it, to see: if we get something done in this area, will it help us, this or that? They're definitely polling on it. But like I said, you've got to line up the opposition and ask: what are your demands? They're pretty much holding legislation hostage, and they haven't told us their demands so that we can work toward a compromise; it's hold the status quo until they figure things out, or until after the election. My history in public policy goes way back; I'll date myself here, but it goes back to the Clinton administration. I went to undergrad at George Washington University, and I went to school full time, worked full time at a lobbying law firm, and tended bar a couple of nights a week, because Georgetown is crazy. But between all of that, learning in the classroom and then doing the work, when I left Washington I had a bachelor's in International Affairs and Government Relations and a couple of years of practical experience alongside the people doing that work. Plus, I was serving them drinks when they were done. So I definitely met some people and saw some things. But most importantly, I think I figured out how things really work, and it goes back to the relationships. New people keep showing up in Washington, but look at Microsoft; they just announced a huge, huge acquisition. Have you heard even a whiff of antitrust? Is there news about it? No, exactly, because they have played the long game for a long time in Washington, and they've built a relationship of trust with the lawmakers in the establishment there. When the lawmakers call upon Microsoft, they show up. They're starting to catch a little heat; they can't get by unscathed. But by and large, they are in the trust category because they play the long game in politics. Apple did much the same thing. It's starting to unravel a little bit now with the App Store issues, but it's the same pattern when you look at how things get done in Washington.


Debbie Reynolds  14:43

Wow. That's amazing. That's amazing. So let's talk about this new thing that came out recently. There was an article about certain companies losing, I don't know, over $200 billion in market value as a result of Apple's App Tracking Transparency privacy changes. So, what are your thoughts on that?


Heidi Saas  15:14

I think it's amazing that they asked, do you want privacy? And 96% of the people said, yes, I do. So I think it's fantastic that consumers are the ones driving this push. What I started doing a couple of years ago, when Google announced the end of cookies, was switching my people over to first-party consent systems: redo the marketing, use different analytics and all of the different alternatives that are out there and available. So they got a head start on collecting consent and building these relationships with their clients. Now that people are looking into it, clients are coming to them to find out what's going on and how their data is treated, and they're pleased to hear that this has always been the way it's been done. It gives us positive brand messaging; privacy and trust are part of our brand messaging now, whereas these other companies surreptitiously stole our data and continue to misuse it against us. So yeah, I think the collective we are ready for a more trust-based system, and privacy is at the forefront of that. How do you treat me as a person? We've been through some stuff lately. How are you treating me right now? That's where we are, I think.
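Editor's aside: for readers wondering what "collecting the consent" looks like in practice, here is a minimal sketch of a first-party consent record and check. The field names are hypothetical, not any particular vendor's schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """One user's consent for one purpose, captured directly by the first party."""
    user_id: str
    purpose: str            # e.g. "email_marketing", "analytics"
    granted: bool
    captured_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

def may_process(records: list[ConsentRecord], user_id: str, purpose: str) -> bool:
    """Process only if the most recent record for this purpose grants consent."""
    matches = [r for r in records if r.user_id == user_id and r.purpose == purpose]
    if not matches:
        return False  # no consent on file -> no processing
    return max(matches, key=lambda r: r.captured_at).granted
```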


Debbie Reynolds  16:27

Yeah, I think the great thing Apple did, and I'm happy they did it, is prove in a dollars-and-cents way that privacy can be profitable. I don't think it's a coincidence that they've had some of their biggest quarters ever right on the heels of announcing this app transparency. And that's because they played the long game and built that trust in. To me, this is a wide-open issue, because consumers need help, right? We don't know how to help ourselves. We don't know the threats that are out there. So any companies or organizations working on education or technology, anything that helps people do better, I think they'll do well. I'm hoping to see more investment in the privacy-enhancing tech space. I want to see more companies think about privacy foundationally as they build tools, because it's only going to help them make money. Before, I think people thought about privacy as a tax: oh, my God, this is the other thing that I have to do to be compliant.


Heidi Saas  17:41

Yeah. Overhead, compliance work. That's what they think, exactly. Whereas this is actually something that can add to the bottom line of the organization. I'm so glad you said that. That's part of my pitch when I go to talk to companies that I work with: yes, traditionally, it's compliance work, but once it's a mature system and it's up and running, there are so many creative avenues to explore. We've been able to open new verticals in creative ways so that we could monetize the data within the privacy ecosystem. But you've got to have that existing privacy ecosystem before you can take the next step. So I explain privacy program maturity in terms of crawling, walking, and then running; once you're running, you can really start to optimize and get better results for your business and for your customers. There is a long-term game there. The return on investing in a privacy program shows up every year in the Cisco report; they keep coming out every year showing all these great returns, and a lot of it is brand reputation. That's something that has the most value to people right now, especially in cancel culture; they want to make sure they don't run afoul of public sentiment. So yeah, I think one of the best things people can do is also look at the advertising. You don't need to track and harass people. There's a lot of information out there from people who have switched over to privacy-based and contextual advertising, and they're having stellar results with it. You can find information like that all over the place. So I'm not hearing companies when they cry to me that the end of the world is coming because you can't track and surveil people. If we were happy with being tracked, surveilled, harassed, and hacked all the time, we'd stick with Web 2.0; we're rebuilding Web3 because we want more privacy and more control.
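Editor's aside: contextual advertising, the alternative Heidi points to, targets the page rather than the person. A minimal sketch with made-up ad inventory; the only input is what the reader is currently looking at, with no user profile or tracking history involved.

```python
# Contextual advertising: match ads to page content, not to a tracked user.
AD_INVENTORY = {
    "running shoes ad": {"marathon", "training", "running"},
    "privacy software ad": {"privacy", "gdpr", "data"},
}

def pick_ad(page_keywords: set[str]) -> str | None:
    """Return the ad whose topic set best overlaps the page's keywords."""
    scored = {ad: len(topics & page_keywords) for ad, topics in AD_INVENTORY.items()}
    best = max(scored, key=scored.get)
    return best if scored[best] > 0 else None

print(pick_ad({"gdpr", "privacy", "compliance"}))  # privacy software ad
```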


Debbie Reynolds  19:35

Right. Well, let's talk about third-party data risk. We know that a lot of laws are circling around the idea that if I give my data to a first-party company and they decide to transfer it to a third party, especially on the state level in the US, they're adding more consent requirements there, as opposed to just notice. And obviously in the EU, there's definitely consent there. So tell me your thoughts about the landscape on third-party data sharing and the third-party risk issue that's happening right now in the world.


Heidi Saas  20:13

It looks to me like, in general contracts, when you set them up, you do indemnity and kick it down the chain; these regulations are some of the first where it turns around and liability runs up the chain. So now you're responsible for whatever's happening. What I'm seeing is, people don't know what's happening. You have third-party affiliates. Well, how many affiliates is it? Oh, 300. Well, how many affiliates do they have? 6,000? Oh, well, how many different places did the data go now? And it's hard, because once it's out of your infrastructure, you can't map that data. You also can't get it back. Well, how long did you say you'd keep it? Because if you said you'd keep it as long as reasonably necessary, that might mean forever; who knows? But how long are they going to keep it? What if they get sold or acquired by someone else? They're taking that data with them. You don't have any control over that third-party data. So when you do the vendor contracts, you've got to make sure you're using the right vendors that aren't going to get you in a jam, because you can try to paper your indemnity as much as you want; it's still coming back to you. That's where businesses need to know these are things that are out of their control. And the bottom line in law is this: it still costs real time and money to get you out of trouble you may or may not be in.
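Editor's aside: the affiliate numbers in this exchange compound multiplicatively. A back-of-the-envelope sketch using the figures quoted above shows why the data quickly becomes unmappable.

```python
# Rough upper bound on how many places shared data can travel:
# each hop multiplies the number of potential recipients.
first_party_affiliates = 300      # figure from the conversation
affiliates_per_affiliate = 6_000  # figure from the conversation

second_hop_recipients = first_party_affiliates * affiliates_per_affiliate
print(f"{second_hop_recipients:,}")  # 1,800,000 potential endpoints after two hops
```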


Debbie Reynolds  21:40

Right, exactly; you're going to pay one way or another. So being proactive is really the way to go. Let's talk a little bit about proactivity. I think the problem we have with privacy, and I also see this in AI and emerging technology, is that some people think all we need is regulation and we'll be great. But some of the harm that happens to individuals from misuse or abuse of their data comes fast, and there may not be any adequate redress. So talk to me about privacy in more proactive terms, because you can't chase the ambulance on this; there will be no ambulance, as far as I'm concerned. What are your thoughts?


Heidi Saas  22:31

Yeah. So I recently wrote a paper while working on a project involving Data Privacy and applicant tracking systems, the ATS systems used in HR departments where people apply online for a job and the system parses your resume. I had done a lot of research in AI to figure out how it worked, and I wanted to find out how privacy plays a role in that, because AI is so research-and-development based right now; it's still such a nascent technology. Where we are in AI right now is that everybody's calling everything AI. That's where we're at, right? So I started looking at it in the employment context, because I was reading all these papers that said we have all these exposures to liability here, there, and everywhere else, and I didn't see anybody doing anything about it. We're still trying to get people to categorize data; meanwhile, they're saying that once it's categorized, you have to treat it in a separate way, and they still haven't gone through step one. So nobody's doing anything about it, and all of these technologies are out there running massive amounts of data from God knows where, and violations are happening. I learned Data Privacy and technology so that I could fight consumer rights violations where they're happening. And there they are. The first individual who asked me to consider taking one of these cases had tracked their efforts over six months of job searching: looking at the posting, do I fit, researching, do I want to work with this company, getting the resume together, sending it through, and then tracking whatever came back. Patterns emerged over that, with data brokers and things like that; this individual had tracked their progress for six months, and that spreadsheet had 172 potential defendants. I was crushed, not by the scope of the work but by the scale of the damage to the human spirit. Imagine that much rejection and ghosting, never knowing why you're not going any further, because a human never even saw you. You're trying to give people information, your data, and you want them to see it, but they're not seeing it; they're seeing data compiled by someone else that they think is a representation of you, and the computer decides whether you go in the trash or not from there. So I think it's wildly illegal; I'll just start there. But you've got to be able to prove it, and the courts have been looking for concrete harm for a long time when it comes to privacy cases. I see a shift now; we're in a better position, and the courts have signaled that they are ready to hear these sorts of cases. I think this is a great time to start this sort of awareness project and bring these things to the forefront so that people build better systems and don't get into this situation. These systems are not designed by a bunch of racists somewhere trying to screw people. They're designed by developers who don't employ ethics in their decision-making, and no design choice is neutral. So in looking at all of this, I decided, you know what, I have to start doing something about it, learning as much as I can and working with other people in other ways, because there is such a lack of knowledge, and the harm is so pervasive.


Debbie Reynolds  25:51

Yeah, exactly. And I'm glad you touched on this, because it goes into bias in digital systems as well. Part of the defensiveness that I see is that people think you're accusing them of being racist. And that's not necessarily what's being said. It's more: you may have a very narrow perspective, and if you aren't including different types of people who have different types of experiences, you may not see those gaps, right? So I want you to comment on that, and then I want to ask you something about the IRS.


Heidi Saas  26:33

It doesn't even have to get as far as someone making a decision. It's just in the code: here's the shortcut, optimize for this or that, and I'm looking for these features. One of those features may be your zip code. So, yep, I'm looking in this zip code, because people who live closer to the office are happier, since they have a shorter commute, right? But that's not fair, because the people who live right outside your zip code may also want to apply for this job, but it's a poorer neighborhood and they have to use the bus, so you assume they're not going to be happy. And you exclude all of these people just because it made sense to you that, oh yeah, people are happier when they're closer to work. It wasn't the decision they thought they were making; it's a decision they made that had impacts they weren't thinking about. That's why you've got to have the right stack of people, data scientists and ethicists and all of these other people, to help you when you're designing these AI models, because the data set itself is where the bias is, and then you compound that with the coding of the shortcuts for how to access the biased data. And when it comes to racism, I tell people: yes, we all carry bias; that's how we have been conditioned. What's important is that you learn where your biases are and take proactive measures to make sure you don't fall into those traps. But please don't take offense at the suggestion of bias, because we all have it; it's the country we live in. I think people need to get past the idea that this is only about haters and bigots and white supremacists. Those people are a problem, and they're not new; our collective awareness of them is what's new. But that's a whole other set of snakes right there. When it comes to technology, it's not really the decisions people are making; it's the impacts of those decisions that they're not thinking about.
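Editor's aside: to make the zip-code example concrete, here is a minimal sketch with hypothetical zip codes and applicants. Nothing in the code mentions race or income, yet the exclusion happens anyway, because zip code acts as a proxy for both.

```python
# A seemingly neutral "short commute" shortcut, encoded as a zip-code filter.
TARGET_ZIPS = {"20001", "20002"}  # hypothetical zips near the office

applicants = [
    {"name": "A", "zip": "20001", "qualified": True},
    {"name": "B", "zip": "20019", "qualified": True},  # qualified, but bus commute
]

def screen(applicants: list[dict]) -> list[dict]:
    """'Optimize for happy employees' by keeping only nearby applicants."""
    return [a for a in applicants if a["zip"] in TARGET_ZIPS]

shortlist = screen(applicants)
print([a["name"] for a in shortlist])  # ['A'] -- B is never seen by a human
```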


Debbie Reynolds  28:18

Right, exactly. So all of that means we need to be able to work together. Tell me about collaboration. One of the things that I enjoy about privacy, and something I value in general, is working with people across different groups. When information is siloed too much, collaboration doesn't happen the way it should. In order to be mature in privacy, you have to have these conversations with people at all levels; you have to make sure they understand the big picture, not just their part of the organization or what they do in particular, but the impact of what they do on the whole. To me, that's kind of a new thing. In the past, even MBA programs were taught in siloed ways, so when people get into organizations, very few of them can cut across those silos. So tell me a little bit about that.


Heidi Saas  29:27

I think EQ is where people are at now: emotional intelligence. Yes, you get leadership skills, but do you have the right empathy level to understand the people on your team? When you meet people, you want the right fit, and fit doesn't mean they're like everybody else. Fit means they fit the need you have for that individual. Diversity of thought requires asking: what can you bring to the table? And what's your background, so that you can bring something other than what we already have? Team building is part of building consensus. You want people who all think differently but can also feel part of an overall team, where there's cohesion. You feel supported when you know your role and know everybody else's role, and when you're working toward a common goal, it helps move the project forward. I think that's another area where I can help people working on projects who don't have in-house staff and are bringing in consultants: helping build a collaborative effort so that everybody feels understood and valued. And people need a purpose, because we're bringing change. First it's compliance, and it's change: I've got to do things differently; tell me why. Well, if you do, and you give them a reason to buy in, you are more likely to have them support you. Then you can let them know: our goal is ultimately to do this so that everybody can succeed together. Hopefully you get your buy-in. You'll always have a couple of rotten apples who just don't want to change the way they do anything, but that's where you find the right people on your team to be the early adopters and to personally work with the rotten apples, to help turn the tide and keep the project going. So part of collaboration, I think, is the empathy to connect on a human level with people and understand: how do you work, and where can we put you, in the right environment, so everybody can move forward together and feel good about it? Because unhappy people make bad technology.


Debbie Reynolds  31:30

I never thought of it that way. I think that's true. So let's talk about something that's been in the news recently, and that is the IRS trying to use a vendor to do facial recognition for their identity systems. So for certain activities, say you have an irs.gov account, you normally log in with a username and password, and the IRS is trying to use a third-party vendor to do this, which is very invasive, in my opinion.


Heidi Saas  32:07

They're not going to get to do that. They announced, hey, we'd like to, and people got really mad, myself included. I think the best article I saw on this was by Dr. Joy Buolamwini, who is with the Algorithmic Justice League; it was in The Atlantic, a great article on exactly why this should not be used. But I want to go back to an earlier point. One of my favorite AI ethicists makes a great point as well: AI is still R&D, and people need to understand that. And also, the government is the government, man; they're not really good at many things. They're just not. And you want to give them this kind of control over this kind of data? I can't get another face. And I can't argue with my government that they've got my face wrong, because then I'm going to end up in red tape. The impacts of such a bad decision are so far-reaching that I really don't think this is going to move forward. And if it does, it's going to be mired in lawsuits as far as the eye can see. There are enough people on the side of "oh, hell no" that I really don't think this is going to happen, but it has sparked a lot of public debate. And I'm glad it's getting people thinking about how this can be thrust upon you without being asked for consent. I mean, that's dangerous.


Debbie Reynolds  33:34

Yeah. And about that article: Dr. Joy did the documentary Coded Bias, and she wrote a great article in The Atlantic that anyone can read. A very well-thought-out piece. For me, as a woman of color, knowing that I'm 10 to 100 times more likely to be misidentified by facial recognition systems, it obviously concerns me greatly. I'm glad that people have been raising their eyebrows about this and been vocal about their concerns. But let's back up. Let's go to the very beginning, when the IRS or any government was thinking about this type of step: what should they have been doing differently before they got to this point? Oh, it was for fraud? Yeah, it just seems to me that their solution is overboard for what the problem is, right?


Heidi Saas  34:41

Yeah, they have a shortage of people over there; they can't find enough people to work at the IRS, and they have more work than ever. So it's just government inefficiency. At this point, they're looking for: what can we do that will cut down on the amount of work? Let's outsource it to a tech tool. A lot of the problem with the automation of systemic racism is that human responsibilities have been outsourced to machines. And if you're going to do that, you have to have a human in the loop to make sure you're handling things responsibly; otherwise, you're creating liabilities. So I think what they did was say, what's the technology solution for us to do this? All right, that sounds great; let's throw it against the wall and see if it sticks. I don't think they expected this kind of blowback.
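Editor's aside: the human-in-the-loop requirement Heidi describes can be expressed as a simple gate: the model may assist, but low-confidence or adverse outcomes route to a person. A minimal sketch with a hypothetical confidence threshold follows.

```python
REVIEW_THRESHOLD = 0.95  # hypothetical; below this, a human must decide

def decide(match_confidence: float, model_says_fraud: bool) -> str:
    """Never let the model alone take an adverse action against a person."""
    if model_says_fraud and match_confidence < REVIEW_THRESHOLD:
        return "route to human reviewer"
    if model_says_fraud:
        return "flag for human confirmation"  # a human still confirms adverse action
    return "approve automatically"

print(decide(0.80, True))   # route to human reviewer
print(decide(0.99, True))   # flag for human confirmation
print(decide(0.99, False))  # approve automatically
```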


Debbie Reynolds  35:29

No, I don't think so either. My thing is, if you want to boil a kettle of fish, you don't have to boil the whole ocean, right? If fraud was their focus, they should focus on the percentage of transactions that are fraudulent, not try to put everyone through this whole process.


Heidi Saas  35:49

Or they could look at other technology tools that are being used by multinational conglomerates that have been working on AML for years. That technology is not secret. There's the BrewRIGHT tool from AB InBev, Anheuser-Busch, and that tool is open and available; you can contact the people over there if you want to use it. There are, I don't know, a dozen different applications in there to track anti-money laundering, know your customer, free beer; it's all in a supply chain. They had to build a system to track the data and find the information, but it was based on regulatory compliance obligations around fraud. So fraud detection has had years of R&D done on it already, just not by the government. That's where they need to look: outside. I don't think ID.me is really something they need to look at, right? With a new whiz-bang technology company, they're going to end up in a Clearview AI situation, and we don't need that. We need to look at other industries that have been doing this for other purposes, and whose sole purpose isn't to sell technology to the government. I think we need to look at other applications of these things. Now, I recently started working on a new project, and it's the trickiest kind of data I think I've encountered yet. You want to take a guess?


Debbie Reynolds  37:17

Oh, my goodness. I don't know what it could be.


Heidi Saas  37:30

We're starting to work on building AI models for the cannabis supply chain. When it comes to cannabis, some of that personal information is medical in one state, so it's health data. But over in another state, it's evidence of criminal behavior that may be used against you in housing, lending, things like that. But it's all the same data. So it depends on where the data is, the nature of its sensitivity, and how it can or should be used. There are so many different layers to this project. Not only does the supply chain have to be tracked from seed to sale, but you can't have the same people in multiple verticals, either, because there's a history of organized crime taking over all the verticals in New York. So New York passed a cannabis law saying it's got to change: if you're a grower, you're a grower; if you're a distributor, that's what you do. The supply chain still has to be tracked all the way, but all of this data still has to change hands, and a lot of it includes personal information, some of it health information. I think it's the trickiest data I've encountered yet. What are your thoughts?
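Editor's aside: the tricky part here is that a cannabis record's classification depends on the jurisdiction, not on the record itself. A minimal sketch; the state rules below are hypothetical placeholders, not statements of any state's actual law.

```python
# The same purchase record changes legal character as it crosses state lines.
STATE_RULES = {
    "NY": "health_data",        # hypothetical: medical program -> health data
    "XX": "criminal_evidence",  # hypothetical: prohibition state -> usable against you
}

def classify(record: dict, state: str) -> str:
    """Sensitivity is a property of (record, jurisdiction), not the record alone."""
    return STATE_RULES.get(state, "unknown_handle_as_most_sensitive")

purchase = {"person": "P", "product": "cannabis", "qty_g": 3.5}
print(classify(purchase, "NY"))  # health_data
print(classify(purchase, "XX"))  # criminal_evidence
```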


Debbie Reynolds  38:32

Oh, wow, you know what? That's crazy. I guess it would be, because it's different from state to state. I live in Illinois, and cannabis is legal up to a certain amount, I guess, and it's heavily regulated here. But then I drive 30 miles away into Indiana, and it's illegal, right? So you have the state-line issues. And like I said, there's the medical stuff, and there's recreational.


Heidi Saas  38:48

Whose privacy data is it? That's the layer on top, and then you keep layering the onion. It all stinks, let me tell you, but that's because it's hard. I'm not deterred, and I think it's possible. I am excited about having access to so many different smart people to help me work on this, all of the people you need in the think stack to help you build something like this. It's going to take years, and it could cost a bazillion dollars. But somebody needs to do this, and not for nothing: my client is building this system because he's established a partner relationship management platform for Black entrepreneurs in the cannabis and hemp industries to network, share opportunities, and provide services to each other so they can go forward. He encountered systemic racism in the application process and said, I want to build something different. So he and I found each other a few years ago and started doing that. I'm pretty excited about the social impact of this project as well as the complexity, and potentially the opportunity to build it on something like Hedera, so that we lower the carbon footprint of this kind of tool.


Debbie Reynolds

That's so cool. Wow, look at you. I love that. You know, I think the US is so complicated, right? Just because we say we're the United States, we're kind of like the Divided States of America; we share a common currency, and that's basically it. You go to a different state, and it's like being in a different country. These regulations in certain states play off of one another. For example, cannabis is legal in Illinois and illegal in Indiana, so maybe in Indiana they can't sell cannabis, but they sell the accessories cheaper. There's a lot of crossfade happening, and it's absolutely bananas.


Heidi Saas

Yeah, but it's also one of those industries that's moving forward, and we can kind of tell where it's going to go, so I think it's interesting as well. But like I said, we've got to design better systems for the way that we use data. Before I start messing with access to justice based on data systems like that, we've got to have better reliance on and more control over our data. Then we can start to tackle bigger issues.


Debbie Reynolds  41:25

Yeah, I agree with that. I agree with that. So if it were the world according to Heidi, and we did everything that you said, what would be your wish for privacy, anywhere in the world or in any sector? Whether it's technology, law, or the human side, what are your thoughts?


Heidi Saas  41:43

Can I split it into a PSA and a wish? Right. So the PSA is: please do not contact me if you're looking for a rubber stamp on what you're already doing with your privacy program. That's not what I do. I'm not a rubber stamp girl. My clients know the value of bringing in a consultant who can tell them something they don't know. People see the CIPP and the JD, and they sweat me hard for that, because they want the rubber stamp, and that's not me. So that's my PSA: don't come to me for that. And if I have the chance to make a wish, I wish for people to start caring more about each other as humans. I think we could really go to a lot of good places if we just started with that.


Debbie Reynolds  42:30

Oh, look at you. I love it. I love it. You know, we're talking about data, and it gets very technical, but this is a human issue, right? You're dealing with humans. So I think understanding that, and coming back to it as a focus, is right: what would you want done to you or your data?


Heidi Saas  42:51

Well, think about the English speakers in the EU talking about privacy; they even say it differently, and the way they say it sounds like a right. They treat people differently, too. They talk about data protection and data subjects; people aren't called users, like they're subject to a dealer or something. Even when they say privacy, it sounds like it has rights attached to it. We've got a long way to go here in our country. But I think the good news is that we are at a second stage of awakening; we've gone through the Snowden awakening and now the Haugen awakening. Everybody else is starting to catch on and see the impacts. So I think we have more support. We've been out there telling people, here's what we need to do, for a while, and I think now we're finally getting the support, where they say, oh great, you know how to do it? Great, I'll call you. So the busier we get doing this type of work, the more encouraging it is overall for privacy, right?


Debbie Reynolds  43:44

Yeah, that's amazing. I love to see more focus. I guess what I'm loving to see now is that it's more mainstream in the media. Before, it was kind of like little groups talking amongst ourselves, right?


Heidi Saas  44:00

You and I are totally making it cool, because that's what we do; that's how we roll. We're making it totally cool.


Debbie Reynolds  44:04

Exactly. I hope that we can continue to do that. This was so much fun. Thank you so much for being on the show.


Heidi Saas  44:12

Thank you so much. Take care.


Debbie Reynolds  44:14

Yeah, we'll talk soon.
