E118 - Nicole Stephensen, Owner and Partner, IIS Partners, Australia
49:54
SUMMARY KEYWORDS
privacy, organizations, australia, technology, people, companies, terms, privacy policy, privacy laws, big, compliance, data, law, school, happening, information, professional, breach, Federal, privacy legislation
SPEAKERS
Debbie Reynolds, Nicole Stephensen
Debbie Reynolds 00:00
Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a special guest all the way from Down Under in Brisbane, Australia. Nicole Stephensen, how are you?
Nicole Stephensen 00:39
Oh, hello, Debbie. Thank you. I'm great. Lovely to be here.
Debbie Reynolds 00:42
Well, this is going to be a really great show. Nicole is someone that I really adore. It's been a thrill for me to get to know her and collaborate with her. Nicole, I think we met through a mutual friend, Zoe Eather, around Smart Cities. Zoe, who's also been on the podcast, I call her the Queen of Smart Cities, and we've collaborated on different panels. And we tend to chat on LinkedIn, and you always put up really interesting stuff. So I would love to just say a few words about you and some of the things that you work on. I know that you hold the Smart Cities and Critical Infrastructure Security Professional designation, and you are a fellow of the Australian Information Security Association. You are also the IAPP KnowledgeNet chair for Queensland, Brisbane and Gold Coast. You were also a founding member of the privacy industry membership association for the Australia and New Zealand region, iappANZ, now part of the larger IAPP, where you sat for three consecutive terms on the board. You're also an active member of the Smart Cities Council for Australia and New Zealand, and you're on the advisory board of the Centre for Data Leadership.
Nicole Stephensen 02:15
I'm pretty busy.
Debbie Reynolds 02:16
That's a big list. That's a big list. In addition to having your own consultancy, where you're the owner and partner of IIS Partners in Australia. So you have a lot going on, don't you?
Nicole Stephensen 02:31
I surely do. But it's wonderful. It's an exciting time to be in privacy and an exciting time to be in business in privacy, right? It's great to have a consultancy; my little consultancy, Ground Up Consulting, recently merged with IIS Partners. We had been working together probably since about 2017 on some major projects, because my consultancy was so small; when I wanted to get involved in the big stuff, you know, you need to partner with a bigger firm. I was often partnering with IIS, and sort of after a number of years of dating, right, we just decided to make it official and go into business formally together. And it's just been such a joy and a treat. My mentor, actually, Malcolm Crompton, who's a former Federal Privacy Commissioner, he founded IIS Partners. And so for the last 22 years, I've had the opportunity to be mentored by this man and become friends along the way. And now we're running the business together along with our colleague, Mike Trovato. And it's fun. It's really fantastic. Business-wise, it's a really happy time. And in terms of all the volunteer stuff, I think that's what you do, right? At this stage in your career, I know you do the same, Debbie; it's just lovely to be able to give back, and you just have to pick a couple of areas of interest to you and start there.
Debbie Reynolds 04:00
Yeah, it's always nice to be able to get to know other people. You know, I think even though we're in different locations, we're dealing with very similar challenges, so it's always interesting to get a point of view from people. And I've reached out to you on a number of occasions about different things, as you've reached out to me, and it's been fun that we have this kind of international camaraderie, where we feel comfortable reaching out to one another and asking each other questions. I think it's really cool.
Nicole Stephensen 04:31
Yeah, yeah, it really is. And you know what? It just shows us how small the privacy professional community is, as well. There's a group of trusted experts, I feel, that we've both cultivated around the world, and when we have a question or a problem, that's a good spot to reach out in a collegial way and just have a conversation. And sometimes that's really all it takes to clarify things, particularly when you're talking about other jurisdictions. Yeah, it's really lovely.
Debbie Reynolds 05:04
Yeah, I would love to talk about how you deal with so many different issues. I remember something you posted, or you and I have talked about; we touched on it a little bit. And this is around education, like technology in Edtech. I think you had posted something many months ago, I think it was about your daughter's school or something, I can't remember exactly, about parents who are privacy folks going into schools and seeing kind of the wacky things the schools are trying to do. And, you know, just your perspective, not just as a parent, but also as a privacy professional. What do you think is happening in Edtech that we need to think about?
Nicole Stephensen 05:51
Look, I think not enough is happening in privacy and Edtech. That's just, you know, my opinion. But I would venture to say that a number of privacy professionals feel the same way. I think there are two things to talk about. First is that privacy, cyber safety, and safety and duty of care, those concepts all come together when you're talking about schools and education and kids, right? And you almost have to look at them as a whole in order to work out the issues that you might have with all those separate parts. So often, in that sort of Edtech space, we see schools saying things like, oh, we need to deploy this monitoring software because we have a duty of care to our kids, you know, to make sure that when they're on school grounds and they're using technology, they're doing so safely; and if we don't do that, then we're in breach of our duty of care. And I think, well, duty of care, it's a pretty big topic, right? Pretty big concept. And it includes not doing things, or not failing to do things, that would protect kids in their school environment. One of those things, you would think, would be protecting children's privacy. So failing to protect their privacy could potentially be a breach of a school's duty of care and/or their safety in online environments, right? You could take this conversation in many different directions. But the point here is, schools are leveraging duty of care as a concept, and more vendors are maybe leveraging the concept and getting their products into schools. And one of the examples that I raised last year came about because this mom was talking to me about how her daughter had been suspended from school. Not suspended, I apologize; she was given detention at school. And the reason for this was that she had been caught watching JoJo Siwa videos, like YouTube videos, on her break.
Now, I don't know about you, but JoJo Siwa was this sort of, you know, cute little bebop tween singer-dancer, really, really kind of soft in terms of the things that you might see on YouTube or on music channels, right, and generationally really popular. And this was considered unsafe activity by the school. Now, what I was more concerned about, though, is, one, how did the school know that this child was watching JoJo Siwa videos? But then the other: what were the parameters that they used to set what is safe or unsafe in the school environment? And further than that, how come the mom didn't know that her daughter was being monitored like this at school at all, or that any of the kids were being monitored? Right? So there were all these questions coming up, and unpacking it from a duty of care perspective and from a privacy perspective was hard work. And it got me feeling pretty frustrated about how schools generally are tackling technology. So yeah, I wrote an article about it. I tried not to be too ranty. But it can be really challenging. And it's a space that I try, in a professional way but also just in a collegial way, to continue to educate schools about.
Debbie Reynolds 09:36
I agree with that. You know, I work with companies that are developing technology in the education space, and it's kind of surprising. I think a lot of times I see designers who are very excited about what they've created and the stuff that it can do. And then it's like I said, we can't do that. Like, yeah, you have created this integration, but if you're applying it to the education space, there are so many different levels of consent that have to happen, so many things to let parents know; children can't consent, right, on their own. So you can't be monitoring them or, for example, doing things like facial recognition, you know, data being collected in databases or searches run against open databases. Or I have one where someone was creating a tool, like a sanitation tool, to deploy around the school. They're like, oh, yeah, well, when people walk from the school, they can be zapped with a UV light. Like, you can't zap people. You cannot do that. You especially can't. No, no.
Nicole Stephensen 10:52
No, this is what I'm thinking too, right? So there are a number of companies that are kind of working in this space. And there's one that I was looking at recently, and I thought, I do not know how you guys are getting away with this. They're based over in the European Economic Area, and they have a cloud-based tool that offers things like keystroke monitoring and live screen monitoring for teachers. So say kids are learning remotely, which seems to be the way that we're going in a number of jurisdictions now, with, you know, COVID lockdowns going on and off, and I get the challenges there. But imagine, you know, parents finding out that the school has remotely installed software on the laptop the kid is using at home, where the teacher can observe the child's screen, so what the child is doing in real time, including what they're typing, what they're looking at, and how fast they're completing their activities. And then if the teacher doesn't like it, the teacher can actually take a screenshot of what they see on the screen, date and time stamp it, and then stop the kid, via remote access to their computer, from completing whatever the activity is that they're doing, or watching whatever the video is that they're watching. Now, I, just as a privacy professional, see that as a bit of overkill. And what's worse is that this particular product leverages privacy by design when they're selling their stuff to the schools. And I find that really challenging, because privacy by design is pretty tricky when you're talking about a technology that is intended to invade privacy, right? I mean, at the starting point already, privacy by design has been lost. So I do find this area really tricky. I find it frustrating, and I find that it's in deep need of education, you know, both in the vendor space and in the buyer space, i.e., the schools. And I'd like to see some uplift there, for sure.
Debbie Reynolds 13:08
Yeah, I think when people think about privacy by design, they only think about it from a technological standpoint. Privacy by design should also be about human behavior, because you can have all the policies and procedures that you want, but if someone is misusing or abusing technology in some way, then, you know, that's part of the design; it's part of the policy, it's part of the training and education that's required for people to be able to use these types of tools.
Nicole Stephensen 13:39
And it's actually really quite impossible to look at privacy by design, say, one principle at a time. So, you know, there are seven foundational principles of privacy by design, but they're intended to be taken together as a whole, not looking just at one thing. So for example, if you engineer end-to-end security as part of your product or your service offering, you're meeting an aspect of privacy by design, and that's awesome. But you can't say that, on that basis alone, you have a privacy-by-design solution. And I do think that this is where vendors can fall down. And similarly, you're quite right that there's that human condition element, the ethos of the company or the vendor and what it is they're trying to provide, and privacy must be your starting point. So from the time you're even conceptualizing your idea, before you start building it and before you bring it to market, it needs to have privacy at its core; it needs to have privacy in mind. And I do see the term privacy by design having so much more play, particularly with the advent of the GDPR, because we know it's been around since the 1990s as a concept, when Ann Cavoukian coined it all those years ago, but the GDPR really brought it to the fore. And so now companies are leveraging the term because they see that it assists them with their sort of observable compliance. But I'm not necessarily sure that they're getting it right 100% of the time.
Debbie Reynolds 15:29
I would love to talk with you about privacy program management and the culture around that. I think when we think about privacy programs, people think about organizations that either don't have a privacy program at all, or ones that are very well developed, very entrenched, maybe more into compliance than actual behavior, or reactive as opposed to proactive. But I'd love to ask you about organizations that are sort of in the middle. I don't know about you, but I find a lot of people in the middle, where they know that privacy is important in some way, they've tried something, you know, they may have put some policies and procedures in place, but they're not sure exactly how to get to the maturity level where they need to be. Tell me a little bit about that story. I think it's really interesting.
Nicole Stephensen 16:29
You know, I think there's a bit of a misunderstanding that privacy, good privacy practice, starts with a good privacy policy. So you often see organizations where the first thing they do when they contact me, wondering about privacy and where to go and how to get it done, the first thing they ask for is, can you help us build a privacy policy? And it's through that conversation that I'm able to show them that having a good privacy policy only provides you with good window dressing, right? If people open up your shopfront and have a look at what's inside, they need to actually see that you are doing what you say you do in your privacy policy, and that you have the systems and the processes and, I guess, the ethos behind you that allows you to give effect to that policy. So my advice for organizations that think that a good policy environment equals good information practice or good privacy practice: I think they're starting in the wrong place. If you think about what good privacy program management looks like, it always starts with strategy, right? It starts with something strong at the foundation. I know we often think of strategy as being like an umbrella, but I like to think of it as a tree. You have a really strong root system to hold your tree up, and the root system, the foundation, is your privacy strategy. And from that, you're able to kind of grow and nourish your privacy management framework and your privacy planning and any policies or procedures or processes that flow from that, and any training that flows from that. But it does start with strategy, and strategy, to my mind, isn't just some glossy document, right? Some glossy 500-page document that you stick on a shelf and it gathers dust. It's something else; it's about communicating within the organization what your beliefs are, what you think and feel and care about in terms of privacy and the community that you serve.
And that is the best possible starting point for privacy. So, you know, if for example you are a company that deals with the community directly, they matter to you, their trust is incredibly important to you. Having a privacy strategy that reflects that, and your belief that the community is your starting point with everything, helps to build out that culture in a way that simply having a good privacy policy wouldn't. Also, on the topic of policy, just for a second: you can have a whiz-bang privacy policy, you can have the most excellent privacy policy that ticks all the boxes in terms of what should be placed in it from a statutory perspective, right? You can meet all your transparency obligations but still have a rubbish privacy culture and terrible information practice. You know, as long as your privacy policy tells the truth about what you're doing, companies often think that's enough. But if the privacy policy is telling the truth about what you're doing, and on the face of it you have terrible information practice and a terrible culture, and are not showing a good degree of respect for the community that you serve? Well, that's a real shame. And, you know, there's an opportunity there to turn it around by starting with the roots of the tree and growing from there.
Debbie Reynolds 20:18
Very good. I would love to talk to you about your notion of the things that people think about when they think about privacy. I hear corporations often think of privacy as a tax, like, oh my goodness, this is extra pain I didn't really care about before, and now I have to deal with it; some companies are doing so begrudgingly. I don't agree with that, and I'm pretty sure that you don't either. But what do you say to organizations that may have that attitude when you first start working with them?
Nicole Stephensen 20:51
That's like every organization I work with; it starts there, where privacy is a compliance job; it's another hoop that they're jumping through. And by the time I speak to a number of organizations, they're already exhausted. They already have a deep compliance burden across a variety of things, right, like financial reporting, accountability, or cybersecurity, or human resources, workplace health and safety, you name it. There are so many compliance tasks that organizations have, and they view privacy as just one of them. You know how I was talking about strategy before? If you view privacy only as a compliance task, you're viewing it at the policy level. If you view it as a strategic imperative, something that's important to you and critical to the operation of your business because you care about it, it stops being a task and starts being just what we do here.
Debbie Reynolds 21:58
I agree with that. For people who don't know, what is the latest and greatest thing happening with privacy regulation in Australia that maybe people will be surprised about or don't understand? In my view, I think people don't understand how robust privacy regulation is in Australia compared to a lot of other countries. What are your thoughts?
Nicole Stephensen 22:27
I think there's a difference between robust and big, right? To me, robust means it's sufficient for the purpose, it's working well, it doesn't require a whole lot of change in order to be applied consistently across industry and the public sector, those sorts of things. That's what I think of when I think of robust. I think we're trying to get there. So we're in the process right now of a major legislative review of Australia's Privacy Act 1988. And it has been reviewed a couple of times; this isn't the first time since 1988, but this is the biggest review. And I see that there's potentially some important and big change coming there, and I think that's wonderful. There's a thing about Australia's privacy landscape right now, though, that makes it big but not necessarily robust, and this is potentially a lesson for our US counterparts, who are right now looking at the opportunity for Federal privacy legislation. And I know this has been batted around for a number of years; privacy professionals get super excited, and then they get super sad. But I do think Federal privacy legislation is so important in order to regulate private sector entities, and to ensure that organizations servicing the community, and the public sector entities across all the various States, have a consistent framework that they're going to apply every single time. That's what we have in Australia: our Federal legislation regulates the private sector, as well as our Commonwealth government entities. Where it gets a little bit tricky is that each of the States and territories in Australia, except for two that don't have privacy legislation yet, which is appalling in this day and age, but I digress, each of the States and territories has its own privacy laws, and those laws regulate public sector entities here. So it could be fairly complicated if we had more States and territories, right?
So in terms of the lessons for our US counterparts, imagine, you know, 52 State-based public sector privacy laws and then having your Federal private sector privacy legislation overlay; you can imagine how complicated that would be in a much bigger country. But here we have a minimal number of States and territories to worry about. The challenge, though, is that each of the privacy laws at the State level that regulate the public sector are waiting to see what the Federal government is going to do with its big review before they do anything with those State-based laws. And it takes a long time for them to do that; once they say, yes, we're going to go ahead and push the button, we're still waiting another 18 months to two years before anything really happens. So there's a bigness here in terms of the amount of legislative requirements that potentially apply in this space, but it's still a bit fragmented. In addition to the privacy laws, we also have a Consumer Data Right that at the moment applies in the financial services sector but is looking to roll out to other sectors. And that adds a degree of complexity, because that's about open data, right? So instead of keeping the data back, holding it back, it's about giving it out. And so there's this tension between the Consumer Data Right and our privacy laws. And in addition to that, we have a number of different information security laws that are tied to our sort of Federal national security objectives, and those laws often butt heads with the privacy regime. So it's definitely big here. But I think there's a way to go before we could call it robust, in, you know, that positive way that we would view something as being robust.
Debbie Reynolds 26:47
I don't know, I hear this from my counterparts in different countries. I would say our privacy regulations in the US are super-duper not robust. It's just kind of, you know, we have dribs and drabs of things from different places; we have things on the Federal level that cover certain things. It's definitely, you know, a patchwork, or a work of nothing, right? So there are a lot of gaps there; there are a lot of things there. So I'm excited to see countries like Australia, who have a, you know, privacy act or law or regulation, looking to modernize. I think that's really important. We don't have anything that we can, you know, modernize at this point; we're just trying to create.
Nicole Stephensen 27:33
You know, it's so funny. Here, the privacy professionals, and I won't speak for all of them, but those sort of within my collegial circle, we all kind of call our landscape a patchwork quilt. And, you know, we're used to our patchwork quilt, so we can navigate it okay, but we are looking forward to some modernization; you know, catching up with the rest of the world would be great. I think it would also be great for Australia's economic relations with other parts of the world, to ensure that there's that robustness of regimes in terms of the transferring and moving of personal information worldwide. The United States, on the other hand, is sort of like, you know, an unfinished needlepoint cushion, right? There's a couple of stitches here and a couple of stitches there. But I am encouraged by what I'm seeing coming out of the US in terms of all the dialogue around the importance of national privacy regulation. And I do hope to see that happen. I think it would make things a lot easier all around the world in terms of being able to deal meaningfully with companies in the US.
Debbie Reynolds 28:50
I think it will definitely be a benefit. I personally am not holding my breath here. But if it happens, that's great. You know, I've been watching this very closely for over 20 years.
Nicole Stephensen 29:04
What's the area you see the biggest gap in? Like, let's not talk so big, like, you know, the Schrems II decision making it hard for businesses in the US to deal with their EU counterparts or companies that they're affiliated with; let's not go that big. But in terms of problems that you think need addressing locally, what are the ones that a Federal privacy law would really nail for you guys?
Debbie Reynolds 29:31
I think, you know, trying to harmonize the definitions of what personal data is, or persistent data is, and what the data breach notification requirements will be, on a Federal level as opposed to a State-by-State level. That would be tremendous, right? Because right now, it's like every State is different. Some States want to be more bespoke than others. Some don't have anything to say about privacy in a meaningful way, in a proactive way; a lot of them only have stuff to say when we have a breach, right? And then in the US, we sort of commingle cybersecurity with privacy a lot of the time. So when you hear people say cybersecurity, sometimes they're talking about privacy as well. And that's not the same thing, right?
Nicole Stephensen 30:18
Oh, gosh, let's talk about that for a second. Oh, yes, yes. How many times have I gone in to advise on a project, and after hearing all about the project and all the great things they're going to do, I just, you know, very calmly and quietly say, well, what about privacy? And they say, we just told you, we've got cybersecurity covered, and on and on and on. They're not the same thing, and here are all the reasons why, right? I like to say that privacy, if we're talking about protection for a second, privacy is about protecting personal information through its lifecycle. But security is about protecting all the information, and personal information is just a subset of that, and, you know, arguably the most important one, from my perspective as a professional. When we talk about protecting in the privacy sense, it's about thinking about your information practice: collecting the information appropriately, according to a purpose, and then not using or disclosing that information outside those guardrails. And security professionals are mainly seeing technical solutions and controls that they place around data, right, all the information. But the personal stuff is, you know, a bit amorphous to them, I think.
Debbie Reynolds 31:47
Yeah, it's very confusing. A lot of people I talk to, when you talk to them for a while, you see that they're confusing what those two things are. And some people are like, oh, you know, privacy and security obviously have a symbiotic relationship. But I tell people, you know, privacy existed before technology, right? So it's more of a rights-based issue. And so the analogy I give is, like, think of a bank. Someone in cybersecurity is protecting the bank; they're protecting the exterior of the bank, the interior of the bank, who comes and goes, why, what's happening, everything having to do with the bank. That's what cybersecurity folks have to protect, right? But we as privacy professionals, for example, we want to know what's in the vault and why it's there. Understanding the why is what we really get down to, and that's how you determine what needs to be protected in terms of the rights of individuals.
Nicole Stephensen 32:49
Yeah, that is so, so true. And I'm imagining, too, that it's a difficult conversation, particularly because privacy professionals are still not brought in early enough; I feel they're not being brought in early enough into the discussion. So, kind of reverting back to that discussion we had earlier about privacy by design: bringing privacy professionals in at the inception of a project, when you're first thinking about its parameters and why you want to do the thing you're gonna do, that's the time to engage a privacy professional, not when you're down to contracts administration, right? And often that's when I get brought in; someone says, do we have enough information about privacy in our vendor agreements? And I'm like, wow, probably not. No. But in addition, you know, the whole point here is that you've had no opportunity to have visibility over all the why elements that you were talking about, you know, what's in there and why it's there. We haven't had any visibility of that if we get into it late in the game.
Debbie Reynolds 34:08
People have thought about cybersecurity, or even managing data, in a very reactive way. So it's like, okay, we're not going to deal with it until something bad happens, and then we're going to spring into action, right? Whereas privacy really needs to be a more proactive approach. It needs to be baked in; as you said, it's part of the culture, it's part of the overall strategy for the company, so you reduce the risks that organizations have downstream when they're dealing with data. Yeah, I want to ask you about something that's fascinating to me, and I would love for you to tell the story. Choice magazine in Australia has an exposé, I believe, about particular organizations using more emerging technology within their stores, and it is creating a privacy dust-up in Australia. We're reading about it here, and I'd love for you to tell the story about what's happening.
Nicole Stephensen 35:12
Well, it's like one of those moments where, imagine yourself in a schoolroom for a second, right? One of the children gets told to go stand in the corner because they've done the wrong thing, and all the kids watch. And then three days later, all the kids that were watching do the exact same wrong thing. And that's what's happened here. So Choice magazine found out through their investigative processes and survey processes that a number of Australia's major retailers are using facial recognition in their stores in order to catch baddies, right? And, like, I get that; no store really wants to have individuals coming in and shoplifting, you know, reducing their ability to make a profit; I fully appreciate that. But there are guardrails for using technologies like that, and our privacy law is pretty specific about what's required before you deploy any technology that collects personal information from the community. Now, we had a case in Australia recently. So this is the first kid that had to stand in the corner, going back to that analogy; the first kid was 7-Eleven. And 7-Eleven was using a survey tool, or technique, that involved taking pictures of people's faces when they filled out their customer satisfaction survey as they were checking out of the store. And there was a big investigation by the Office of the Australian Information Commissioner on this, and it was found that 7-Eleven was indeed in breach of our privacy law. And the decision really set out all the ways that 7-Eleven had breached the law, in great detail. And it was a super resource for the other kids in the classroom who were watching 7-Eleven standing in the corner. And I don't think any of these other Australian retailers, so it was Bunnings, Kmart Australia Limited, and The Good Guys, I don't think they read the 7-Eleven decision, or perhaps, I don't know, maybe they didn't know that it even happened.
Because they have just now been profiled for doing exactly the same thing, or using a similar technology in a way that appears to be in breach of our law. And it's just gobsmacking. So the privacy professionals around here, we've just got our heads in our hands saying, what on earth is going on? And, you know, the other day, I just made some really pointed remarks about, hey, this has got to be only the tip of the iceberg if our major retailers are doing this. And these guys, okay, these are big companies. So they have compliance teams, they have legal teams, and they have privacy officers. And they have access to a plethora of information on the Office of the Australian Information Commissioner's website, right? They've got all that information at hand, including that 7-11 decision. And they have this ability, this lens that they can look through, if they wanted to do privacy impact assessments right before they deploy technologies like this. And there are many consultancies, right, from your Big Four firms and law firms all the way down to the boutique privacy consultancies, that do stuff like this every day. So they've got three opportunities: their own privacy officers, the Federal Commissioner's guidance materials, and privacy impact assessments to get it right. And they haven't done any of those things. And now they've been profiled in a major way in the media, and are going to suffer brand damage as a result.
Debbie Reynolds 39:21
Yeah, so I did a video on that, the 7-11 case, and I feel the same way. It's like everyone's falling into the same ditch somehow. And I think what happens, you know, we're seeing that in the US too, where we see companies get hit. For example, I live in Illinois, and we have BIPA, the biometric privacy law in Illinois. And it's like eight pages, very plain and simple to understand. And we see a lot of big companies run afoul of that. It's like, why? It's very simple if you follow the steps to be able to do this. But I think what happens is there's a disconnect within the organization. So, for example, let's say a company has traditionally done CCTV, where they have a static camera that just films people in stores, and then they get the sales pitch: hey, we'll give you these new cameras, and they have all this new technology, bells and whistles. And they think, okay, since we've had this in the past, this is kind of the new version of that, it's like a new car, right? And we don't have to have these other considerations. But when you're dealing with data of individuals, you do have to have these considerations. You have to be able to explain it; you have to do your homework. And the thing is, if the technology is different, if it gathers the same personal information but in a slightly different way, or manipulates it in a different way, or stores it in a different place, these are all new privacy considerations that have to be looked at by that organization in order to determine if the risk is okay with them, or what are the ways that they can manage that risk. And, you know, out of the 7-11 decision, and certainly, as we've seen with this latest Choice exposé, one of the biggest things that came up was not about the tech.
And this is what's really interesting, Debbie. It's actually about the privacy program management; it's about the privacy program that sits behind the use of technologies, or the onboarding of vendors, or whatever it is that the organization is doing. And what we're seeing, I think part of the disconnect, is that there's a lack of understanding about central privacy concepts. And then there's also a lack of transparency. And transparency is a real key tenet, right, of good privacy practice and program management. So what folks like 7-11, and then these other organizations, have done is they think, oh, if I post a notice somewhere visible that people will see, saying there are cameras operating in the store, that's enough. But these notices are not specific enough to be truly transparent. Similarly, sometimes the notices, or even the privacy policies that are posted by these companies, say things like, by entering the store, you consent to be photographed. Well, a notice and consent are not the same thing, right? A notice informs; it tells you what's what, it gives you some idea about what's going on. A notice informs; it doesn't ask. And I think that, as a key learning, is something that came out of the 7-11 decision. And now we're seeing these other retailers that have been engaged in sort of the same misunderstanding, or same disconnect, in terms of what their obligation actually is. So I can't wait to see how this one washes out. Yeah, I recommend that anyone in any country, if you have clients that are employing emerging tech, look at the 7-11 case, because it's very detailed, and it will help you avoid problems in the future if you're thinking about doing similar things in terms of implementing new technologies, because unfortunately, it's easy for companies to fall into this.
In a way, I feel like the way companies operate is probably one of the reasons why this creates a problem, because in a lot of companies, everything is sort of compartmentalized. It's like Santa's workshop; everyone works on their little part. But you need people who can look at a higher level across different functions within the organization and figure out, okay, this is a problem because of X, Y, and Z, as opposed to, okay, we have an old camera, we're going to have a new camera, and you just install it, and then all hell breaks loose.
Nicole Stephensen 44:13
Yeah, I think you're right. And I think that's the role of a Chief Privacy Officer or a Data Protection Officer, or whatever you want to call that person in the organization, to have their tentacles out within the organization and try to understand it on the whole, as opposed to just those component parts. Look, I think this is such a relevant space to be in when it comes to tech. I think it's also really important to have somebody that's looking at it from the privacy perspective and through that lens. You can also see that technology itself is not necessarily the problem. Technology doesn't really have thoughts and feelings; it's neutral, right? It's how we choose to deploy it as an organization that can really have us coming unstuck from a privacy perspective. The facial recognition example is one that I think really resonates here, because not all facial recognition is created equal, right? We can use facial recognition in a really privacy-proactive or positive way, like the way that we use Face ID to open our phones, right? We can use it to protect ourselves and our assets, and that can be seen as a privacy-positive use of the technology. But when we're using it for surveillance, or covert law enforcement processes, for example, it doesn't sit so well with the community. So often, we have to think about the technology and its purpose and its intended uses and the consequences of that. That is something a privacy professional is able to do in a way that the folks on the ground, who are just trying to come up with a solution to whatever the problem is at hand, may miss some of those nuances.
Debbie Reynolds 46:18
Excellent. If it were the world according to you, Nicole, and we did everything you said, what would be your wish for privacy anywhere in the world, whether it be law, technology, regulation, or program management? What are your thoughts?
Nicole Stephensen 46:33
Okay, so on privacy law, I have a wish, but I know it's a big one, because privacy is relative to the jurisdiction that you're in and the socio-political climate that you're in. Although I wish for harmonization, I don't see that happening anytime soon at an international level, because the regimes that we have around the world are just so very different, right? And the considerations for the privacy laws in those regimes are so very different. That would be a big-picture wish; it's a bit like saying one day having a whole world government would be really cool, right? But I don't see that really happening in my lifetime. So I think I'm going to stick to privacy strategy and culture building as an area that we all have a real opportunity to get right. And if every company took some time to invest in privacy strategy, and to think about the community they serve when they're doing that, I see them having a much greater level of downstream compliance, whatever privacy regime they are operating under.
Debbie Reynolds 47:53
I love that you are a pragmatic lady. That's a very good pragmatic answer, something we can actually achieve on a case-by-case or client-by-client basis. Very good. Well, it's been great to have you on the show. Thank you so much for waking up early in Australia and having this call with me. I really appreciate it, and we'll talk soon.
Nicole Stephensen 48:16
It was absolutely fabulous. And again, I'm just delighted. I love every opportunity you and I have to chat or collaborate; I just soak it in. Thank you so much, Debbie.
Debbie Reynolds 48:26
You're welcome. Thank you. Thank you.