E163 - Karen Habercoss, Privacy & Compliance Professional, Chief Privacy Officer, University of Chicago

35:30

SUMMARY KEYWORDS

privacy, data, understand, hipaa, work, people, cybersecurity, talk, part, business, risk, contemplated, health, ai, security, areas, space, health care, feel, healthcare

SPEAKERS

Debbie Reynolds, Karen Habercoss

Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a very special guest from Chicagoland, Karen Habercoss; she is the Chief Privacy Officer at the University of Chicago Medicine health system. Welcome.

Karen Habercoss  00:46

Thank you, Debbie. It's a pleasure to be here.

Debbie Reynolds  00:48

Yeah. I'm glad to have you here. I love to talk to privacy folks in Chicago; I feel like we're all hiding somewhere, so.

Karen Habercoss  00:58

And it's nice to meet people that are local.

Debbie Reynolds  01:01

Yeah, very nice. Very nice. Well, first of all, again, I'm happy to have you on the show. I love to talk about privacy in the health area, because I feel like sometimes people think, okay, I don't have to worry about privacy in health, because health has HIPAA in the US, and it's a lot more than that, right? Why don't you tell me your trajectory? How did you get into privacy, and how did you end up where you are now in your role?

Karen Habercoss  01:34

Yes, I've had a very non-traditional trajectory, I would say. I started many, many years ago as a licensed clinical social worker in the Department of Psychiatry. So, I worked with people who had mental health issues, and folks who had medical issues that were affecting them in a psychiatric way. I was a social worker for many, many years, maybe 13 or 14 years, and then took a different career path; I went to work for a small healthcare startup, initially in the social work leadership space. But when you work for a startup, you wear many hats. And so one of the other hats I was wearing at the time was compliance, which back then was a fairly new and emerging area. And so I got quickly up to speed on compliance in healthcare, then went to work for the Joint Commission, and then came back to the University of Chicago Medicine. I've been back about seven or eight years now, specifically in the privacy space. But I will say the interesting thing is that I use my social work skills in privacy probably every day.

Debbie Reynolds  02:59

I can imagine; first of all, that's tremendous. I find that a lot of people who are really good in this space found a way to leverage the tools of the backgrounds they have to do this job and move into those areas. So, bravo to you for making your own ladder, so to speak, and going into this. But I agree; I think your social work skills definitely play into this. I've known people who've gone into privacy, and maybe they've taken more of a lawyer route. And some of them, unfortunately, think, okay, I want to come into the room, and I'm going to tell everyone what to do, and that's just the way privacy is going to be. And you know, and I know, that's just not how it works.

Karen Habercoss  03:57

No, I think that's true. Obviously, you have to know what the laws are. But being able to interpret them for, in my case, patients, healthcare consumers, or other people in the business, I think, is a key skill. So knowing the law, or knowing the regulation, really is only one part of it; being able to assess where things are and explain that to the people who need to understand it, in a way that's relevant to them and not terribly high level or in a legal fashion, I think helps a lot.

Debbie Reynolds  04:37

Yeah, I had read something that a guy wrote on LinkedIn; I think of an article he wrote at IAPP. He was talking about being a privacy person and not a lawyer; I think he's in the job market now. And he was saying that so many of these job descriptions require you to be a lawyer. But then, when you read down the list of requirements, it's, oh, you have to have experience in operationalizing XYZ or this or that. So he was saying he felt like some of the people who are doing hiring are very narrow in their thought, or maybe they didn't really understand the skill set that's needed to really operationalize privacy. What are your thoughts?

Karen Habercoss  05:26

Yeah, I think that's true. I think that the job market does seem to sway toward folks who are lawyers by training or certainly who have a law school education. But I agree; I have been successful in my career with a number of other skills. I also have a business degree, so that doesn't hurt, I would say, in terms of operationalizing what's happening in the business structure. So I think those things definitely are part of it. But no, I would say I'm a success story for someone who's not a lawyer, who has developed a program that's pretty mature here at the University of Chicago Medicine. And I think that we have grown year over year. And I don't know that that is a result of anything legal; I think that it's really more about being able to understand what the business is. Now, the whole idea of privacy risk in health care is really just coming into its own right now. I think some other areas of industry probably adopted the idea of privacy risk a little while ago, but healthcare is really just starting to embrace it. And I think that's not necessarily a legal concept.

Debbie Reynolds  06:50

Let's talk a little bit about privacy risk; you say some industries may have embraced it already and it's coming to us in health. Tell me about that a bit more.

Karen Habercoss  07:02

Yeah, I mean, I think for a very long time in privacy in health care, and even still to date, people think of HIPAA. And I think that's important, for sure. But HIPAA is now old, let's just say that, and it never contemplated much of the technology that we're speaking about and working under today. Most health care companies, or most covered entities, if you will, in health care, I think understand HIPAA fairly well at this point. But the world has changed. And so we have situations where you have healthcare consumers who may never become your patients, and that is typically regulated by other laws aside from HIPAA. Or when you work, for example, like I do, in academic medicine and you do research, that contemplates all kinds of international laws that you may not have thought about before. And so I think the concept of privacy in healthcare expanding outside the HIPAA space is crucial for privacy leaders in this industry. I also think that you have to be able to work with risk in the business to understand that it's not just about privacy risk under HIPAA, but about all of the other obligations you have around data. You can collect data in many ways. But sometimes you have to think: just because you can do things with your data, do you want to? Is it the right thing for your business, your vision, or your mission? And so I think those things are very important to consider alongside the concept of privacy risk.

Debbie Reynolds  08:54

That's great. Thank you for that. I want to talk a little bit about HIPAA. And I feel like a lot of people don't understand this. So I don't want to date myself, but I remember when HIPAA came out, okay.

Karen Habercoss  09:06

It's been around a while.

Debbie Reynolds  09:07

I remember it very, very much when it came out; this was during Bill Clinton's administration. So I feel like people who aren't as deep in the space as you are don't understand that HIPAA isn't a privacy law. It's a health data portability law that has a privacy part in it. Because it's one of the stronger privacy safeguards in the country at the Federal level, when people talk about health, they automatically talk about HIPAA in the privacy space. But really, someone made this comment to me: HIPAA exists because, basically, in the US we don't have universal health care. When people change jobs, they have to move their health information around. And so, really, HIPAA was trying to cover that data gap in terms of how the data moves. But as you're saying, there are so many different new technologies and new ways that data is being used where, for example, your example, which is a good one, people who aren't patients, if they're interacting with health systems or marketing systems, their data is still being captured, even though it may not be covered by HIPAA. So tell me a little bit about that gap.

Karen Habercoss  10:35

Yeah, I mean, I think you're right; back in the day, HIPAA was about the portability of your data, being able to move your data easily if you want to move from one physician to another. That's really what it was about. And then, of course, it expanded to all these other areas about protections, certainly for electronic data, because we stopped using paper, for the most part, and what does that look like from a safeguards perspective? And then, of course, it contemplated what happens when the data goes astray: how do you deal with breaches, and how do you tell people that their data has gone to the incorrect place? What are the requirements for that? So it's expanded a lot over the years; I still don't know that it's fully able to keep up with the emerging technologies that we have, certainly. And you're right; for lack of a better way of understanding it, for most people, it is the only federal type of legislation people think of when they think of health data. But you're right, it doesn't cover everybody. It covers, obviously, providers, pharmacies, labs, clearinghouses, insurance payers, those kinds of entities. But a lot of the technology coming out today is not regulated under HIPAA. And so I think we're trying to play catch-up now for all of the ingestion of this data that nobody has ever contemplated before. And it puts us at a slight disadvantage, because now it's up to consumers to become more knowledgeable. One thing to think about: even in the provider space, for example, the space I operate in, we try to be transparent and tell all of the people that we see, in our notice of privacy practices, how we're going to use your data, how your data moves throughout the system, and how it's disclosed outside and inside. But I would say most people don't read that document.

And so you get that document from your provider, and most of us, myself included at times, take it home and then never take a look at it again. And so that's part of the reason, I think, that people don't always understand what's happening. And then on top of that, there are all of these other emerging areas of health care consumerism. I'm wearing my watch right now, which is tracking all kinds of data. It's not provider-level data, but it certainly is data: my heart rate and things like that. And somebody holds that data; probably lots of people do. But you don't really contemplate it until something maybe happens down the line. Even now, people, I think, are becoming much more aware of technologies and how those things intersect with each other. And HIPAA always allowed for this concept, as you mentioned: the ability to make transactions between providers and share data so that patients could get good care, really seamless care, if you will. And that's only been doubled down on, right, with the Cures Act: the whole idea of electronic interoperability and how things connect with each other to make that even more seamless for patients and health care consumers. But again, we were just talking about risk; that doesn't come without risk.

Debbie Reynolds  14:03

I agree with that. I want your thoughts on marketing. Marketing in the health space can be a bit tricky, right? Obviously, there are reasons to go out there; there are things that health organizations may want to communicate to a broader audience. I think marketing right now is getting a lot of focus because of all the programmatic advertising that happens and how companies need to really be careful. I always tell people, if you're doing anything with children, anything with health, anything with maybe protected categories of information, that should be a red flag; you really need to look at marketing more closely. What are your thoughts there?

Karen Habercoss  14:51

Yeah, I don't disagree with you. I think that, just like anything, there are probably pluses and minuses to it. There is some help for consumers who really want the experience of knowing what's available, and sometimes marketing is a good thing, especially in the healthcare space, to let people know. I work for an academic institution; we want people to know the services we offer. We offer clinical drug trials, sometimes things that aren't necessarily available in community hospitals. And so we want people to be aware of the additional services that we offer. But you also, I think, balance that with explaining to people: this is the type of data that we collect, and this is how we use it. The whole idea of transparency, I think, really is important for letting people know that, and putting these things in plain language, right? I think most privacy officers in health these days are really focused on letting people know what's happening, but letting people know in a way that they understand it, and giving them the ability to say no if they want to. So ways that people can opt out if they really don't want to have the experience, and making it easy for people to contact us and opt out of those things, I think, are important. So I think it kind of goes both ways. That's sort of my opinion on it, I would say.

Debbie Reynolds  16:19

Thank you for that. Yeah, something weird happened to me the other day; maybe it's not that weird, but I want your thoughts on it. I have an app on my phone; it's an accounting app. And it asked me, oh, can we track your location? And I thought, I know that they have a part in there where they're like, ah, you know, we'll track your mileage for you; that's why they want your location. I said no, because I don't need that, right? But then I got a prompt saying it wanted access to the health kit on my phone. Are you crazy?

Karen Habercoss  16:55

Well, it sounds like that is the app being more transparent with you. So that's a good thing; at least they're telling you. Actually, I've noticed this on my own iPhone more recently; when I installed a recent update, it had to go back through everything. Maybe that's what you're speaking about. But it's almost like every time I use an app now for the first time since that update, it asks me whether I want to allow that specific app to track. Apple has put that out there in an attempt to help with privacy. But yeah, I've noticed that too; people are becoming much more aware, and companies especially are becoming more transparent with people.

Debbie Reynolds  17:36

Yeah, I would love to chat with you a bit about privacy in the research space; I think it's very different. Whenever I've worked with anyone doing research, I feel that a lot of people are nervous about using data. But to me, this is one of the best use cases for data, right? You definitely have a legitimate reason to use it; you just have to have safeguards in place. What are your thoughts?

Karen Habercoss  18:05

I think that one of the reasons why I love working in academic medicine is the research part; I feel very strongly that these things are very important. And COVID was a good example of why. The ability to use and share data is important in order to develop vaccinations, or get a handle on a disease and know how it was spreading and how variants were coming about. So those things are very important, and the only way to do that is typically through research. And so, there are allowances for the use of data under research, and much of the time, patients consent to that, while there are a few exceptions, right, where different research boards can issue waivers for the use of data. Again, that's also fairly transparent; those of us who work in academic medicine tell people when they come to us that we are a research institution and we want to use data for research. Especially if it's going to be identifiable, we're going to ask for your permission; we're going to get your written permission on those things. The University of Chicago Medicine, for those who don't know, is on the South Side of Chicago. We serve a fairly specific population, and that population has its own health issues. And so, research is key to us helping that specific population.

Debbie Reynolds  19:40

Thank you for that. You're working on a project, a public-private partnership, around people who operationalize privacy and cybersecurity. I would love for you to tell me a little bit about the work that you're doing there. It sounds fascinating.

Karen Habercoss  19:58

Yeah, thank you. So, the Health Sector Coordinating Council Cybersecurity Working Group is a public-private partnership, and there are a number of task groups charged with the healthcare space, which is one of the government's critical infrastructure sectors. The government has identified lots of critical infrastructure, especially in the cybersecurity space, where it thinks it's important to focus; you can think about things like water, gas, health care, finance, those types of sectors. This one is specific to the healthcare sector. And I co-lead a task group where we are starting to look at how privacy and cybersecurity intersect and what is really important about that relationship. I think historically, privacy and security operated in silos, for lack of a better word, and everybody was doing their part. But more and more literature and knowledge is coming out that privacy and security really must work hand in hand. When it doesn't function that way, it increases the organization's risk, and things get missed. And so, this task group is really focused on those areas where privacy and security might experience challenges. What does that look like? What are those challenges? And then what are some best practices that can be put forward to handle those challenges? Many times those challenges are what create the risk for the organization. And so we are putting together a document that we will be publishing in late 2023 or early 2024 that really talks about themes of risk between privacy and security. Things like team dynamics: how do the teams trust each other? How do they incorporate emotional support for each other? Things like having an operational understanding: does privacy understand what security is doing on a day-to-day basis, and vice versa? Does security understand what privacy does, to actually inform their own work? Or things like cross-functional alignment.

So, are both privacy and security aligned around not only the business's vision and mission but their own teams' visions and missions? What is the vision and mission for the privacy team and for the security team? How can those be stood up in a symbiotic way and then delivered to the board and leadership so that it's one unified message? And then other things like: what is the culture of the organization? Are there impacts where one team is more noticeable than the other, and how do you equalize that a little bit? And then, what are the regulatory responsibilities of both? How can you inform each other when one is in a meeting and the other one is not? You can say, oh, that sounds like something for my security counterpart; let me connect you with that person as well. So, it's not necessarily doing the work for each other, but having a really strong understanding of the other's perspective, so that you can jump in when you hear something or see something, know who to go to, and then trust that the other team is going to have your back.

Debbie Reynolds  23:38

Absolutely, I love that you brought up visibility; visibility is a challenge for a lot of reasons. First of all, a lot of companies, the way that they are built internally, are almost very siloed, right? I call them almost like Santa's workshop, where everyone's supposed to do their little part, and then, at the end, things come out like this magic toy. But the problem is that privacy and cybersecurity are horizontal issues that go throughout the organization; that siloed approach doesn't work for them. And I love the fact that you're talking about, maybe you're a privacy professional, you're in a meeting, you start to see people enter into the cyber area, and you make sure that you can let them know, hey, this is something that this other group needs to be involved in. That's extremely important. And I'm glad that you mentioned that, because a lot of times that doesn't happen, and that's where a lot of those gaps happen. What are your thoughts?

Karen Habercoss  24:41

No, I think that's absolutely the case. I think that's where the challenge is in the risk column; it is sometimes very challenging for the business to understand the differences between privacy and security. Sometimes they think that when one is at the table, both are at the table. And while there is a lot of overlap, I think they are distinctly different areas. And I don't think that we can rely on the business to know, and assume that if you tell one, both know. So I think there is a lot of what I would call deliberate cohesion that needs to happen; it needs to be a very deliberate space between privacy and security. By making that deliberate, the other is always informed, or there is some understanding by someone in the room that, while I can handle this part, I'm also going to need to bring in my counterpart, because they're the experts in that area. And I think all of this is helpful for if and when most of us have an incident, because it will probably happen to everybody at some point. If the history has already played itself out, there's a good enough and trusting enough relationship that everybody can just dig in and help at the time when it's needed the most, and the business can feel secure that it's being handled efficiently and effectively.

Debbie Reynolds  26:23

I also want to hit one more thing on visibility. And I think this is something where privacy maybe can help cybersecurity; I've heard many people say that cybersecurity, when it's done well, is invisible. And I don't think that's true. The problem with that invisibility is that you may not get the support, you may not get the budget, because people don't feel like that's where the investment needs to go. So tell me your thoughts on that, where privacy has to be more visible.

Karen Habercoss  26:53

I think that's right. Part of what is really important to me is really pushing privacy visibility, especially in the healthcare space, because you can turn on any news station or read the Internet, and everything is about cybersecurity and ransomware attacks. And I think those things are certainly important, definitely important. But privacy is just as important. I think that it needs the same level of visibility, especially at the board level and in leadership meetings; I feel very strongly that privacy and security leaders should be at the same level. I'm not one to say one should report to the other, either way, actually; I think that each needs their own seat at the table, and each needs to be able to have their own board presentations, to make sure that everybody sees them as equal. I often talk to privacy leaders who may not be going to the board; maybe someone in compliance is going instead, which sometimes is just structural, and we talked about organizational culture. But privacy is only one part of compliance. And so I think it's really important that if cybersecurity is getting talking time in meetings, or talking time at the board level, privacy in and of itself gets the same consideration. So I agree with you; I think it's that important in this day and age.

Debbie Reynolds  28:31

I agree. What is happening in the world right now, maybe in privacy or something you see in the news, that has a privacy implication that concerns you?

Karen Habercoss  28:46

Yeah, I mean, the cliché is that it's got to be AI right now, probably. I think that we don't even know that it's new or novel so much anymore, but certainly everybody's talking about it as if it's new and novel. And I think that it's still not well understood by the vast majority of people; I'm certainly not sure that the privacy or even the security implications of it are well understood by government regulators. And so I think that they rely a lot on us in the business to be able to share with them what we know, and it's incumbent upon us to figure it out and explain it in a way that people understand. And it really goes back, in my mind, to the whole concept of risk management. I'm a firm believer that privacy has to be part of enterprise risk management. So, there's a piece of privacy risk management that exists, but your privacy program, if it's not there already, needs to get into the business's enterprise risk management structure as soon as it can, and AI just falls into another space there. I think it has its own risks, and I think they're not well understood; I think that especially things like publicly available generative AI have a lot of data risk behind them. And I think that we need to understand that; we need to take the time. Many times privacy leaders are very busy with all of the things that we've already talked about, relating to the regulations that are fluidly coming down in a rapidly changing regulatory environment, trying to track everything at the State level, the Federal level, and the international level in some respects. And now we've also got to track AI and what that means. In healthcare, I think there is a lot of good that can be done with it; physicians can get back to the business of caring for patients. So there are some things that I think are exciting coming out of the AI space.

But I also think that in order to use these tools, we have to assess them; we have to assess the impact, understand the whole data flow, how it's being used, and who gets access to that information. And then what are the secondary uses, maybe even tertiary uses, that need to be considered? And I just don't know that people are doing that in the privacy space as much as they should be right now.

Debbie Reynolds  31:26

Those are pretty good callouts there. Also, I feel like AI poses an additional risk because of the lack of understanding of what it does. Especially when I talk to people who maybe aren't in corporations; companies have been using AI for a while, right, but Generative AI has gotten a lot of attention. So, sometimes people assume that Generative AI is all of AI. And it operates very differently, and it poses different risks, especially because of the exponential nature of being able to create more data, whether it's right or wrong, and especially in the health space, where being right is important.

Karen Habercoss  32:15

Right, the integrity of the data is extremely important.

Debbie Reynolds  32:19

Yeah. So, as you said, there has to be more education around what those risks are, how companies need to use or leverage it, and whether it makes sense to use or leverage it at all. I think that's what companies are trying to figure out right now. If it were the world according to you, Karen, and we did everything you said, what would be your wish for privacy anywhere in the world, whether that be human behavior, regulation, or technology?

Karen Habercoss  32:49

I think my wish is that we just continue to increase transparency. To me, this probably goes back to my social work background; I think that transparency helps people to understand and make choices for themselves. And in a world, and it's not just health care, where we tend to be very paternalistic toward each other, I think that people who are informed and able to get a good understanding make good decisions for themselves. But that can only happen when we're transparent about what is being done. And so that goes back a lot to what I was talking about earlier: just because you can do things, do you want to? And maybe it also fits with your mission, but do your consumers want that? Does your patient population want that? And it's different, I think, as we talked about, in academic medicine, because of the research component. We in academic medicine have a training component; we train future healthcare workers. And there is a problem right now in healthcare in terms of getting people to stay; it's an extremely stressful job. And so the more transparent we can be across the spectrum, and this is not even unique to the healthcare industry, I think it's even consumerism, the more people can understand what the options are and then make decisions for themselves. So, I think that my wish is just to increase transparency and increase the way in which we tell people so they understand.

Debbie Reynolds  34:44

That's a very good wish. I agree with that. This was tremendous. Thank you so much for being on the show; this was very illuminating. I love to talk with people who are in the health space, but you also dropped a lot of gems about the enterprise and understanding working relationships within companies. So this is great.

Karen Habercoss  35:07

Yeah. Thank you so much for having me. It's been a pleasure.

Debbie Reynolds  35:10

Yeah, well, we'll talk soon; maybe we'll hang out.

Karen Habercoss  35:13

Yeah, that would be great.

Debbie Reynolds  35:15

All right, thank you so much.
