E128 - Garrison Ross, Sr. Data Privacy Consultant, Founder and Advocate, Data Engineering

34:27

SUMMARY KEYWORDS

privacy, data, organization, company, technology, people, consumer, communicating, customer, expose, compliance, silos, starting, rights, deletion, information, terms, world, business, shared

SPEAKERS

Debbie Reynolds, Garrison Ross

Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds. They call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a special guest on the show. His name is Garrison Ross. He is the CEO of Strategic Privacy Partners. Welcome.

Garrison Ross  00:39

Hello, Debbie. It's good to speak with you.

Debbie Reynolds  00:41

Very good, very good. So you and I met on LinkedIn, and we actually have some very lively discussions and chats; we chat on the phone and text and different things. So I thought it'd be great to have you on the show; you have a lot of deep knowledge and experience. You've done privacy work for a lot of different big organizations, and now you're out and you have your own company. So you've done privacy work at Sirius XM, Toyota, and Lululemon, and you're also working on privacy at a university level with your degree at Cornell. Congratulations.

Garrison Ross  01:31

Thank you. And I would like to say that it's actually going through several certifications at Cornell.

Debbie Reynolds  01:37

Very good. Very good. So tell me a little bit about how you got into privacy and why privacy is so important to you.

Garrison Ross  01:47

Ah, that's an excellent question. Thank you. I tend to have a unique story in terms of how I entered the privacy world. You know, back in the day, we have kids now that grow up very smart, and we are kind of blown away at how much they know. I would say 25 years ago, growing up in technology and in the Washington area, surrounded by Microsoft and Boeing, I explored technology a lot as a youth and was kind of a hacker, a hacker in an ethical way; I never stole money, I never caused harm. But I definitely was curious about how the technology worked and would explore it and exploit it, just out of curiosity and for learning. And so that sort of started my tech journey. When I completed school really early, I wound up getting a job in semiconductors, and that was how I got exposed to the tech industry. Well, long story short, through multiple careers, after being a director of quality and regulatory for a company, I eventually wound up getting a job traveling around the world, developing various compliance frameworks and testing modules and auditing protocols and processes that could be customized based on the vendor, the factory, the region, or the jurisdiction. And that really exposed me to the regulatory framework and world, how strict it is, and how to respond to various risks and mitigate those risks, etc. Over the years, I strategically took roles at specific companies to learn the operations of what it means to have a Data Privacy program: everything from helping implement and integrate a DSR management tool, such as OneTrust, and configuring that for compliance regulations so that customers can act on their rights, to helping do Privacy by Design for organizations that create a lot of connected technologies, say in a vehicle when I was working for Toyota, and really making sure that those products get vetted properly, starting from the data sharing agreement, or the data sharing contract, and understanding where that data is being sold to or shared with. In this instance, that company didn't sell data, per se, but data was shared for various services, and they had really strong principles in terms of communicating with the customer. So if you want to use a technology for the purpose of providing enhanced, you know, navigation, you would disclose that to the customer; the customer can opt in and opt out, and also be able to turn that service off according to their unique preferences and needs. And so really getting to participate in that, and seeing how that impacts the privacy policy, how it impacts the terms of use, excuse me, the terms of service, really opened my eyes to the world of Data Privacy. I worked in data compliance and Data Quality audits, so you get to really understand the operations of privacy and how companies need to be more mindful of how they communicate, how they educate, how they work together, and start to eliminate silos in their Data Privacy process, so that we can have more unity and more standardization, have more consistent compliance, and help corporations mitigate risk. So that's sort of my journey in terms of how I got into Data Privacy.
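To make that opt-in and opt-out mechanism concrete, here is a minimal sketch, in Python, of per-service consent preferences of the kind Garrison describes: each connected service is disclosed separately, and the customer can opt in, opt out, or turn the service off at any time. The class and field names are illustrative assumptions, not any specific company's or vendor's implementation.

# A minimal sketch of per-service consent preferences (illustrative only).
from datetime import datetime, timezone

class ConsentPreferences:
    """Latest opt-in/opt-out choice per disclosed service, per customer."""

    def __init__(self, customer_id: str):
        self.customer_id = customer_id
        self._choices: dict[str, dict] = {}  # service name -> latest recorded choice

    def record_choice(self, service: str, opted_in: bool) -> None:
        # Keep a timestamp so the latest choice always wins in an audit trail.
        self._choices[service] = {
            "opted_in": opted_in,
            "at": datetime.now(timezone.utc).isoformat(),
        }

    def may_use(self, service: str) -> bool:
        # Design choice: default to no data sharing unless the customer opted in.
        choice = self._choices.get(service)
        return bool(choice and choice["opted_in"])

prefs = ConsentPreferences("cust-123")
prefs.record_choice("enhanced-navigation", opted_in=True)
prefs.record_choice("enhanced-navigation", opted_in=False)  # customer turns it off later
assert prefs.may_use("enhanced-navigation") is False

The design choice worth noting is the default: a service is treated as off unless there is an affirmative opt-in on record.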

Debbie Reynolds  05:31

Yeah. Very good. I think sometimes people think of privacy, and how you attack it, in kind of a linear fashion, right, almost like Santa's workshop. But I think technology today is changing so rapidly that you have to adjust to that. So, for example, let's say an organization has an IoT device that they want to incorporate into their business, right? Maybe today that device has certain functions, but maybe tomorrow, because of software updates, firmware updates, or people deciding that they want to use certain new features, that device may need to be vetted again for those new things. So a lot of times, what I see is people say, okay, let's evaluate this thing for right now. But then it's changing down the line, and they're not really looking at those gaps. So can you talk a little bit about that?

Garrison Ross  06:40

Yes, that's an excellent call out, and that's where operations really come into play. Like, if a company is going to have a DPA, or even if they don't want to call it a DPA and they want to call it a privacy evaluation for a specific technology: when do you deploy that assessment, and how frequently in the product lifecycle do you review and re-audit it, right? So I've been in situations in the past where there may be a particular technology that's being used by an organization, and that technology has a specific function; it could be something like a chat module using voice activation, so customers can call in and talk about various services, and they may expose PII in their discussion. Well, they may have a specific implementation plan from a previous project they've deployed that used the same technology, but what was being discussed were standard protocols and terms that would activate the technology itself through voice activation and didn't involve customer information. And they replicate that same design in the new product and think that they can just pass it, because it's passed before, not recognizing that now that particular technology is capturing specific customer information. And there may be modifications in various places: the SDK may have a slight modification, how the technology is being used may have a slight modification, the systems and databases it connects to may have slight modifications. And they're not really vetting the whole process of how their internal employees may access that information, thereby exposing the customer's information to potential data breach, because it's not being controlled. So those things have to be looked at. There are instances where you may be using a Bluetooth technology that may be 3.0, and that 3.0 has vulnerabilities and can be exploited; there may be known exploits, and there may be some that have not even been discovered yet. And the new standard may be 5.0. So where in the decision lifecycle do you decide which version of the technology to use, and how are you identifying those gaps? Those are things that consumers have to be prepared for, but businesses also have to prepare themselves to mitigate the risks in the industry and not expose sensitive customer information to exploitation.
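As a rough illustration of the re-vetting trigger Garrison describes, here is a minimal sketch, in Python, of a check that flags when a product's technology profile has drifted since its last privacy assessment. Every field name and trigger here is an illustrative assumption, not a prescribed assessment standard or any vendor's tooling.

# A minimal sketch of a privacy re-assessment trigger (illustrative only).
from dataclasses import dataclass

@dataclass(frozen=True)
class TechProfile:
    """Snapshot of a product's technology at a privacy review."""
    sdk_version: str               # e.g. "2.3.1"
    protocol_version: str          # e.g. "bluetooth-3.0" vs "bluetooth-5.0"
    data_categories: frozenset     # e.g. frozenset({"voice", "customer-pii"})
    connected_systems: frozenset   # databases and services the product talks to

def reassessment_reasons(last_reviewed: TechProfile, current: TechProfile) -> list:
    """Return the reasons a fresh privacy assessment is warranted (empty = none found)."""
    reasons = []
    if current.sdk_version != last_reviewed.sdk_version:
        reasons.append("SDK modified since last review")
    if current.protocol_version != last_reviewed.protocol_version:
        reasons.append("communication protocol or version changed")
    new_data = current.data_categories - last_reviewed.data_categories
    if new_data:
        # A design that passed before may now be capturing customer information.
        reasons.append("new data categories captured: " + ", ".join(sorted(new_data)))
    new_systems = current.connected_systems - last_reviewed.connected_systems
    if new_systems:
        reasons.append("new connected systems: " + ", ".join(sorted(new_systems)))
    return reasons

A usage pattern might be to run such a check on every release candidate, so that a changed SDK, protocol version, or newly captured data category forces the product back through privacy review rather than inheriting a pass from a prior design.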

Debbie Reynolds  09:26

What are your thoughts about compliance? So sometimes I feel like the privacy industry has an overreliance on the legal aspects of privacy and not enough on the technical, data, and operational aspects. I guess the assumption is that if everyone does what they're supposed to do, the product you come out with is like a baked cake, right? But you know, let's say you give someone eggs, you give someone butter, and someone else gets sugar and flour. That doesn't mean you're going to come out with a cake at the end, right? So there has to be more orchestration there. And then also, I think just looking at privacy from a legal lens isn't comprehensive enough to be able to handle the problem, especially operationally. What are your thoughts?

Garrison Ross  10:35

I think that's an excellent call out. I mean, it's still emerging, right, so it's a great question. It's a question that even an organization like the FTC is curious about. So in September, I attended, and I think I sent you the link on this, the commercial surveillance and data security public forum that the FTC held to understand more of what's happening from industry professionals and even consumers; they were facilitating public comments so they could help with their rulemaking. And I love the cake analogy, because we tend to think of privacy, at least corporately, as a cookie cutter. And most organizations only do the minimum viable product, what's necessary from a legislative standpoint, but there's a greater risk posed by the operational structure, how a company communicates and disseminates that information, how they process it. And so even working with third-party vendors, for instance, that area is still relatively unexplored. When you're working with a lot of, say, contract agreements and data sharing agreements, you'll let your vendors or your partners do certain things with the data, and then the originating company, or the source company, will lose control of how that data is being managed once they release it. And in the contract, sometimes there's loose terminology in terms of who's responsible for what in that process. One company may not want to have the responsibility, and they may say that the other company is responsible. And there are different reporting requirements that have to happen. This is one of the areas that was also talked about in my course this week, in terms of privacy incidents and responding to a data security or Data Privacy crisis: when a third party uses particular software and they don't uphold the standards of the primary company who makes that software, and they're given control to, say, access or modify data without consumer consent or disclosure or full transparency, that is an area that exposes risk, but there's no real area in compliance that covers that umbrella. And if you're sharing information with a particular third party or vendor for the purpose of doing business, where in that process, in that product lifecycle, does the customer have the ability to access their data rights? Are they being communicated with? Do they have the option to opt in or opt out? Is there full disclosure in terms of what technologies and services and other vendors the company may share information with? So these are areas that I understand the compliance industry wants to get its hands around, but there's really no precedent for it, because there hasn't really been a major incident or violation in the industry that exposes it. Does that make sense?

Debbie Reynolds  13:37

Yes. So I'm glad you mentioned this; it makes me laugh a lot, right. So the issue with the law, and the reason, you know, I'm a technologist, and a lot of times with some of the things I talk about, people are like, well, how did you get that? It's because I don't follow the law, right? Because the law is like, something bad happens and, oh, let's pass a law, right, so it's always kind of looking backward. So we have challenges now in privacy where you have to look forward; you can't drive a car looking in the rearview mirror, right, you're trying to go forward. So I think that is one reason why just looking at privacy through the compliance lens isn't sufficient, because we have emerging threats, emerging issues. So companies need to be thinking about their privacy tenets at a fundamental level, right, not just trying to lurch from one law to the next. What are your thoughts?

Garrison Ross  14:37

I agree it needs to be holistic. I think that's where privacy ethics comes into play, and a company really taking a position on what their privacy principles are. Are they going to be legislative and reactive? Or are they going to be responsive and stay ahead of emerging risks and trends, and make decisions based on what is happening in the market and based on best practices? And it's really important; in today's climate, you can't just focus on the law. There's the trust factor, in the end. So you can be compliant legally, but if you're not compliant ethically, that can dramatically hurt your business just as much as a legislative violation. And we're starting to see that happen with companies like Meta, for instance, right? Just look how much money they've lost and what's happening with the organization. Do they create great products? Absolutely. But they're losing trust in the market, and that's having a big impact on the success of their business.

Debbie Reynolds  15:41

That's a great point. That's a great point. I want to talk a little bit about silos. So I think privacy comes with a unique challenge, and it's a good one, because I think it can push change within organizations. But we have organizations that have traditionally been very siloed in how they handle data, or that think it's a legal issue. So I think there are different sections within organizations that handle certain things. And so, back to my cake analogy, right? You know, you can't bake a cake if people aren't really collaborating, right? It's not just, okay, I have this ingredient, you have this ingredient, or whatever. It's not like Santa's workshop, where everything's going to turn out fine in the end. So how do you find a way to break down those silos within organizations? Because I feel like, as a person in privacy, you really need to know all aspects of the business, and you also need to be able to communicate across all these different silos within the organization.

Garrison Ross  16:58

How you break them down really comes down to what you build. I think one of the things that's actually brought up in the certification at Cornell in Data Privacy, data security, and privacy policy is how you build a comprehensive professional privacy network, and how you eliminate silos, because when you have silos in an organization, it creates dysfunction and it creates a toxic work environment. And there have been various reports that have come out recently that have talked about these different things. In the age of privacy, you can't really operate in silos, because everything is connected, and we live in a digital and connected world. So ethics plays into a company's privacy culture and how they respond to and handle different privacy incidents within an organization. And that comes down to communication: what their managers are doing, how they communicate with each other, and how they handle incidents. If you're developing a product or a tool, or you're working with product owners or product developers, or you're working with a Data Privacy compliance team, right, how are they communicating? And who needs to know? Is it generally provided across the organization? Is there equal education and access to the information within an organization? Or is it tightly controlled by those who need to know? So ethics is really important, because change comes from the top. I know there's this approach where, when organizations are going through a transformation, they want to have that happen from the bottom up. But when it comes to privacy, it really starts at the top, and it starts with senior management. They have to communicate that information down to make sure that it's been integrated within the organization effectively, and then it starts from the bottom and the top; it's not just one-sided. And so how I have generally broken down those silos in the past is really just getting straight to, you know, the decision-makers and making sure that you're communicating as much as possible when there's risk involved, and not letting someone higher than you really dictate; old models and old thinking can expose a company to unnecessary additional risk, if that makes sense. Really being clear when you're sitting in front of an attorney, like, if there's risk involved, communicating that risk very clearly and articulately and making sure that they're informed, so they can make the best decision possible.

Debbie Reynolds  19:39

Very good. So what is happening in the world in privacy, maybe in the news or something, that concerns you right now? Something you look at like, oh man, I don't like this.

Garrison Ross  19:54

I wouldn't say necessarily in the news, but I think just in early guidance documentation, say from the FTC or the Department of Homeland Security; if you go to any of their public events or you read their publications, they're really starting to curtail dark patterns and how companies collect information. And there's been this loose connection between how they collect it and what they communicate. And so we've seen in recent reports that have come out about certain companies that you have to be really transparent. So if you are an organization and you're communicating to a company, or you're communicating to a consumer, and you tell the consumer you're going to do something with your internal processes when you collect their data, do you follow through with that communication? Do you follow up with them? And if your public-facing operations are inconsistent across multiple channels, and you have different organizations, and, going back to what you said prior, which has to do with silos, different teams not communicating with each other and having different processes for how they do things and handle customers' information, or how they interface and respond to a customer; if it's not holistic and it's not in harmony, and the customer experiences a lot of disruption and disconnection in that process, that can also expose your company to unnecessary risk, because it shows that your internal organization and teams aren't communicating with each other. So I would say, from that standpoint, really make sure that you stay ahead of emerging trends, and you stay on top of best practices and early guidance that's coming down from regulatory and governing bodies.

Debbie Reynolds  21:56

Let's talk a little bit about deletion. So in the US, we don't have a right to be forgotten; I don't think we'll ever have that. But I think companies around the world struggle with deletion, mostly because a lot of companies don't really know where their data is, so they're afraid to get a deletion request, because they're like, oh my God, what data do we have about this person? How far back do we have to go? Do I have to go into this room with cobwebs? Where do I find this information? Right? So tell me about the challenge that organizations have around deletion, because I think we consumers have a very simple picture: you're on a computer, you press the delete button, right, and the file is gone, or whatever. But that's not how deletion happens within organizations, because data is in multiple different places. You know, there's just a lot more mechanics behind actually being able to do a delete request. So tell me a little bit about the mechanics of that, behind the curtain, that maybe consumers or people outside of organizations don't understand.

Garrison Ross  23:18

Yes, that's a really good point, Debbie. I mean, where do we start with deletion, right? In most companies, you have to have a really comprehensive data strategy and Enterprise Data Management Policy and Program. And that doesn't just mean knowing what data is stored in what system, right? It also means who has access to that data and how that data is being extracted, shared, and replicated. And so we hear the term data minimization; it's really important in the organization to really control how that information is accessed, copied, controlled, and shared. Because if it's in an Excel document somewhere, or it's in, you know, a server or a database that has been replicated, when it comes to deleting that information, it's really hard to confidently and accurately say that you deleted that information if you get a deletion request. But deletion requests, we tend to think of from a consumer standpoint: if a consumer is aware, that's one aspect of it; the consumer can go to a privacy policy, you know, exercise their Data Privacy rights if they live in a specific state, and delete their data. In the US, most companies that have implemented a DSR management tool or process have only had to focus on California. Now we're seeing Connecticut and Massachusetts and Virginia and all these other states start to come forward with Data Privacy rights for their particular state. And so as we move into next year, we're starting to implement those additional states to be rolled up with California. That's going to continue over the next one to two years, and so we'll have about 40 or 50 states where you'll be able to go in and say you want your data deleted. So when you think of data deletion, from my perspective and your perspective, we understand that it should be an equal right across our country. And so what you tend to see, and this may not be the best analogy, but, you know, because of the jurisdiction lines and states, we tend to think of one state having more rights than a different state, but we all live in America, right? So when you think of things that we've had to deal with in the past that may have to do with race, throughout human history in this country there have been different classes of people that have had different rights because of the color of their skin. Well, when you think of data deletion rights, that same mentality and mindset is being applied to data, right? We're saying, well, if you live in this state, you can delete your data, but if you live in this state, you're not good enough, or we're not going to honor your request. And so that kind of separation mentality, in terms of how we handle data in our country, creates a lot of separation. And it's going to cause some of the same discrimination; there are consumers that are starting to feel discriminated against, because we may live in Washington, or we may live in Chicago, and we want to be able to delete our data, and we can't, because we don't live in California, right? That type of activity has to evolve so that we can have more unity in our country, and fair access to information and fair control over our privacy as citizens of the United States, equally across the board.
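To illustrate why deletion is more than pressing a button, here is a minimal sketch, in Python, of a deletion request fanning out to every store that may hold a copy of a person's data; a company can only claim completion when every store confirms. The DataStore interface and the per-store report are illustrative assumptions, not any particular DSR tool's API.

# A minimal sketch of a deletion request fan-out (illustrative only).
from typing import Protocol

class DataStore(Protocol):
    """Anything holding a copy: a database, a replica, a spreadsheet, a vendor."""
    name: str
    def delete_subject(self, subject_id: str) -> bool: ...

def process_deletion_request(subject_id: str, stores: list[DataStore]) -> dict:
    # Fan the request out to every known copy of the subject's data.
    results = {store.name: store.delete_subject(subject_id) for store in stores}
    return {
        "subject": subject_id,
        # Only claim the deletion is complete if every single copy is gone.
        "complete": all(results.values()),
        "per_store": results,
    }

In practice, the list of stores is the hard part: replicated databases, backups, exports, spreadsheets, and vendor copies all have to be inventoried before a report like this can be trusted, which is exactly the data-mapping problem described above.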

Debbie Reynolds  26:37

I hadn't thought about it as a discrimination thing; that makes sense. Because some companies will decide, hey, I'm going to give all people in the US the rights that people have in California, mostly because it makes their life easier, right? They don't have to segment people out. But that really shouldn't be in the control of the organization, right? It shouldn't be, oh, you know, Mother May I, with an organization, like, hey, could you please delete my data? You have to kind of beg them to do it. So, in my view, I hope that we can at some point get more to a human rights type of regime, as opposed to a consumer rights one. Because right now in the US, if you're not consuming stuff, you don't have any rights; like, you don't have a right not to share. That's something that people in Europe have that we don't have here. So what are your thoughts about that?

Garrison Ross  27:45

Yes. And that's also a good point, too. I think that competition with data exists. I mean, I've posted information on various boards on LinkedIn, and you get some people from the UK who are very territorial, and they're like, our rights are better than your rights. And, you know, they want to think of it separately. And it's sort of like thinking about Data Privacy in silos, but Data Privacy, as you stated, is a human right, right? We should all be working together. It's not about your country being better than our country; it's about us working together to find a unified and common ground on how we can protect Data Privacy across the board for all individuals, whether they're consumers or otherwise.

Debbie Reynolds  28:28

Yeah, I think this is something that Tim Cook of Apple says a lot; he says privacy is a fundamental human right. The problem that I have with that statement is that it actually is not in the US; it isn't, in legal terms. You know, we don't have legal standing in the US to say that privacy is a fundamental human right. So when these big companies say that, they give people a false impression, right? If privacy were a fundamental human right in the US, we wouldn't have the same gaps that we have right now. So for me, I think in order to get better, we have to be real about what we have now. I think the Dobbs decision overturning Roe v. Wade highlighted for people, especially women, how fragile our rights are, right? So being able to have something that's more fundamental on a human level would fill a lot of those gaps. So what are your thoughts about that?

Garrison Ross  29:50

I do feel that it would fill some of those gaps. But I think, even in the statement that some CEOs make on Data Privacy, how is that applied across the organization or across their customer base? Because not all customers are treated equally. So you may say privacy is a fundamental human right, but then you have classes of customers that you treat differently based on their buying behavior, based on their online internet activity, etc. And you may advertise specific privacy rights, but how you treat them behind the scenes, when they want to exercise those rights, or if you want to test or deploy certain technologies because of your interest in the data that they're generating, you know, really has to be standardized. Does that make sense?

Debbie Reynolds  30:40

Absolutely. Absolutely. So if it were the world according to you, Garrison, and we did everything that you said, what would be your wish for privacy anywhere in the world, whether it be technology, law, or human stuff? What are your thoughts?

Garrison Ross  31:03

My thoughts are, I would like to see more partnerships and more alliances across states and countries, across governments. Because we as humans are in this era of the space race, right? We want to go to a different planet and live there harmoniously and explore the intergalactic, you know, the multiverse. We can't enter that domain as the United States against Russia, or the United States against Korea; we have to enter that as the human race, as, you know, Earthlings, per se, right, as one collective race of humanity together, entering into a greater multiverse, right? So I tend to think of it, if it's the world according to me, there definitely needs to be more partnership, because there is more than enough data to go around. And statistically speaking, there's this idea that we don't want to let people delete their data because we're afraid that everyone's going to rush to the podium at this stage and exercise their data rights and their voting ability to delete their information. Well, there will never be enough people coming forward to delete their information to take away from the power of the data that is available and accessible to companies to create products and services, because there's always going to be someone willing to share their information. So why deny people the choice and the right to exercise their unique preferences, according to their needs, when we have more than enough citizens and people and data available to continue the process of innovation and evolution? So we need to think about the big picture and the statistics, and not be afraid to give people the free will to exercise rights that are specific to their unique needs.

Debbie Reynolds  33:02

Wow, that was a great answer; "Earthlings", I love that. You went intergalactic on us. Very nice. Well, thank you so much. I really appreciate you doing this; this is very early for you, so I really appreciate it, and we'll be having more conversations. I look forward to us chatting more and being able to collaborate in the future.

Garrison Ross  33:26

Likewise, Debbie, thank you for this time today. And it's been a pleasure speaking with you. You know, my journey in the privacy space is still relatively young. And I definitely am passionate. And I'm so grateful to be a part of the evolution of what it means to be data-compliant and respect consumer preferences and choices.

Debbie Reynolds  33:46

Excellent. Excellent. Well, we'll talk soon. Thank you so much.

Garrison Ross  33:50

You're very welcome. Have an excellent day today.
