E153 - Swati Handa, Privacy Product Executive, X, formerly known as Twitter
36:44
SUMMARY KEYWORDS
privacy, data, people, work, information, tech, podcast, federated, encryption, feel, talk, ensure, thoughts, ai, implement, swati, companies, tool, app, access
SPEAKERS
Swati Handa, Debbie Reynolds
Debbie Reynolds 00:00
Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a special guest on the show all the way from the Bay Area. Swati Handa is a Privacy Product Executive at X, formerly known as Twitter. Welcome.
Swati Handa 00:41
Hi, Debbie. Thank you for having me. Yeah, I have to admit it. You know, being on a podcast named "Data Diva" makes me feel I should be wearing a sparkling tiara. That's the one thing that's missing. But other than that, the vibes are pretty good.
Debbie Reynolds 01:04
Well, I'm so excited to have you on the show. I don't know what I had done, but you reached out to me and congratulated me, saying, I love the way you evangelize privacy. Just like that, you reached out and thanked me, and I said, be on the show, and you were like, yeah, why don't I? I'm so glad we're able to talk. You've had a very illustrious career in privacy and product areas, working at companies like American Express, Cisco, and now X. Tell me a little bit about how you got into privacy. How did this interest spark in you in terms of a career path?
Swati Handa 01:48
Yeah, great question. Even before I kick off my origin story, I want to speak to the work that you're doing. I said this in a note, but I want to say it here again, on the podcast: you're doing great work, making sure that people understand that privacy is front and center. Privacy can easily take a backseat, where we expect somebody else to do the work for us. So thank you, again, thank you for making sure that privacy as a domain, as a subject, is front and center for all of us. I've listened to your podcast many times when I'm driving, when I'm doing my errands. And I love the part where it just feels like a conversation, versus, oh my god, this is yet another errand that I have to finish. It's like I have a friend driving with me and talking good stuff while I'm running my errands. So thank you for what you do.
Debbie Reynolds 02:44
Thank you so much. That's so sweet. Oh, well, I always hope that's the case. Right? I think about it myself. When I'm listening to a podcast. I'm like, you don't want to be annoying. You want to make it fresh and interesting and not look scripted. So thank you. Thank you. Thank you. But more about you.
Swati Handa 03:08
Sure. So, Debbie, I got my way into tech nearly two decades ago, right? I started my journey at Amex, and that was a place that taught me the weight of data responsibility. Each transaction wasn't just a data point. It's somebody's holidays, somebody's dinners, someone's emergency. It's amazing what you can glean from just a spreadsheet, from sheets of data. That's where I realized that with great power comes great responsibility, and what all we could do being in the tech space. But the biggest thing that hit me was when my niece hesitated to use her Amex card for dinner, and this was years and years back, because she was like, oh, no, there could be online fraud. Just that one incident showed me how much trust millions of people place in the individuals working inside these organizations. And that made sure I remained unwavering in my mission towards Data Privacy. And not just that; what really got me into tech was someone really close in my family. They were a victim of identity theft, and this was back in the day. The chaos it caused for all of us, not just the person who was impacted but everybody in the family, was eye-opening. So it felt very natural as a progression: I had a fascination for tech, and then I saw the impact it has personally. It was a natural segue into privacy.
Debbie Reynolds 05:01
That's incredible. I think a lot of people who are very passionate have a back story like that, a personal experience they've either been through or they've seen that gets their attention. There's something that you said already that's piqued my interest. Maybe it was something you said before we started recording, about data responsibility. When I think about, for example, AI right now, with this AI boom going on, everybody wants to use AI, and no one really wants to be responsible. You know, they're like, use it at your own risk.
Swati Handa 05:44
Yeah, yeah.
Debbie Reynolds 05:45
I just coded it, it's not my fault that you get arrested or something, you know, so.
Swati Handa 05:50
Right. Right.
Debbie Reynolds 05:51
So when you talk about it, I feel like people, especially in those areas around finance, understand the responsibility part. But as these new emerging tools are created, people sometimes are so fascinated or beguiled by what's happening. They're excited about the innovation, and no one wants to take responsibility. And I feel like, in AI, every single person on the chain has some level of responsibility. It can't rest just with the user; they can't say, well, why didn't the company tell me this? And the company can't say, well, it's all the customer's fault. So tell me your thoughts about that.
Swati Handa 06:36
I've always felt the latest tech is awesome, right? It's like the new shiny tool on the block. People feel great about it. But true innovation is where the latest tech not only wows us but safeguards us too. That, to me, is the real innovation: where I have confidence in using what you have created, be it an opinion that you share, be it the tool that you have built. So, I have a confession to make. I'm a big word game fan, and I'm on Android, right? Anytime: crosswords, be it any app, I'll just sit down, and you'll see me scribbling. That's kind of my hobby, and that's what I enjoy. A few years back, and this was before the Play Store had implemented the whole overhaul of giving apps ratings, my niece said, hey, I'm leading on the leaderboard globally, and we live in different countries, so why don't you download this app, and we'll connect; let me see if you can beat me. Right? A good, old challenge. And I'm like, yeah, that's a way for me to connect back, it's in a realm that I enjoy, why not? As I downloaded the app, I realized that it was asking me for all this: location information, access to my camera. And that just hit me. I'm like, you're asking me for all this? I get it, it's a wonderful experience, the value it provides to the customer; again, wearing that product hat, it's wonderful how I can connect with someone that I love over a leaderboard. But at the same time, why do you need to know all this information about my location? Or the pictures? Why do you need access to my gallery? And that, in itself, made me think about the digital playground that we have created and given access to, where you have all these creeps, so to say, hanging out and just observing what you're doing. So, you know, when we talk about AI, I feel it just amplifies that problem.
I don't even know what's the right number, not 10x, not 100x, I don't have a number. But you get the idea: the next app that we download has to continue to wow us as well as continue to be safe, along with the value it provides.
Debbie Reynolds 09:24
And I think when you see things like that, like you said, it definitely erodes trust, right? You saw the game; you thought it was cool; you wanted to connect with your niece; then you started asking the questions that you should ask, like, why are you doing this? And just like you said, along with innovation, companies doing innovative things want people to adopt their innovation. They don't want people to refuse to use their tool because of some setting that they put there, so their incentives may not be aligned with the person's. For me, human centricity is vital, whether the human is the product or the human is the customer.
Swati Handa 10:12
Exactly. It's not just a download. I know you care about how many people adopt it; those are your key metrics. But there's a human at the end of it. And that's where I feel privacy practices are helpful: they help bridge the gap between ensuring the business moves the needle and achieves its goals, and doing what's right for the customer.
Debbie Reynolds 10:43
I know that the tech companies breathe and sleep this whole thing around filling the gaps in privacy, not just on the legal side but on the technical side as well. But even companies who do not consider themselves tech companies have data, right? Almost any company is a data company; data is a huge asset. But it can also be a huge liability if you use it in the wrong ways. At the beginning of my career, the way I saw people look at data, they saw it as a commodity, and it really isn't anymore, right? It definitely is an asset. If companies use it right, it can really help them, or it can hurt them; it's almost like a double-edged sword that cuts both ways. What are your thoughts?
Swati Handa 11:37
You're absolutely right. In fact, the saying that's been around the block for anybody in this profession is: any application or any business that you use, anytime you're not paying for it, you are paying for it; you're just paying in a different currency, and that currency is data, what you just talked about, right? So it is an asset; you're paying with your behavior, you're paying with your choices, and helping them create a better product. That's one thing when that currency is being used for enhancing the product. But unfortunately, as you said, it's not just big tech; organizations of all sizes house data, and it is important how they use this data and give the user control of that data. That's the most important part. There needs to be universal respect for privacy: the future should acknowledge privacy as a universal right and not a privilege. And there need to be transparent data practices. Organizations should be clear about how they use the data; the hidden dark patterns, the oh, yeah, I didn't know about that, oh, did we also do that, oops, that is not an acceptable response. And finally, user-centric controls, right? Even the Delete Act, coming together in 2026, is where Californians can expect a user-friendly tool. It's sort of like the Do Not Call list; it's kind of a do-not-sell list. You have one central place where you can go in and request that all data brokers delete your data. It's one thing for them to pass the act; the challenge will be, how do you build something which is effective? That's going to be an interesting time for organizations of all sizes: keeping the user front and center and building something for it.
Debbie Reynolds 13:48
Right. And the Delete Act is basically a riff off of the Do Not Call list from telecom, which was wildly popular, as you may imagine. You go on the list, you put your number in for five years, and they can only call you for certain purposes, and people love it. So I think what they were trying to do was in that vein, and even though I hear people kind of upset about it, I'm like, I prefer that, as opposed to me going to each data broker's website; there are, like, 200 data brokers, and I'd have to opt out with each one, right? Now they're putting the work on the companies who want to use this data. So yeah, I think it's a great thing. Well, I'll tell you what I'm thinking about and what I'm concerned about. I feel like what is being created now is like a new caste system when it relates to data. Instead of the haves and have-nots, we're going to have the knows and the know-nots: people who have access to data and can do things with information, and people who don't. What are your thoughts about that?
Swati Handa 15:07
Interesting. Basically, what you're saying kind of dials back to data being your super currency. People say that about networks, too, right? It's not what you know, it's who you know. But here, and I'm actually going to tie it back to the work that you're doing, Debbie, it's the awareness of privacy. It's not just, hey, we know about privacy, but the implications it has. Those who do listen to your podcast have an edge versus those who have yet to discover it. So those who have discovered it and those who have not. That's my thought on the data piece: I agree with you, a lot of it has to do with the information you have access to and how fast you act on that information; a lot also has to do with the resources you have access to. I mean, look at the whole boom of AI apps that has sprouted in the last year; not even a year, it's a couple of months. Investors have all the reasons to invest and are ready to: an app has AI at the end of its name, and they're ready to write you a check, because everyone wants to ride the wave. It all goes back to access to information. And now there's the next phase that I see in AI, where people come up with specific use cases to solve, versus just general capabilities, the oh yeah, we can yet again summarize a document and give you something, whether it's legible or not.
Debbie Reynolds 17:08
I agree with that. As a technologist, I don't love everything that people try with technology, so I'm happy to give support when I think, okay, this is a great way to use technology, I really like the way they did this. And there are others where I go, oh my God, I can't believe they're going this way. But I think the future will be more human-centric, because that is the gap a lot of these laws are trying to fill, right? We feel like businesses have the upper hand. The business-to-consumer relationship is always asymmetrical, but the asymmetry, I think, is getting out of hand; like you're saying, you don't know whether it's 10x or 100x. It's an exponential asymmetry that's being created with a lot of this technology.
Swati Handa 18:02
As you were speaking about it, it made me reflect on the differences in what companies are doing to get it right. A lot of the time, they're trying to find loopholes. They're like, hey, anonymized data is out of scope for GDPR, or out of scope for maybe all these privacy laws, so why don't we just go ahead and implement that? Anonymization is such a cool word, but in reality it's hard to get down to true implementation. So, I was reading this report that NIST and the US Census Bureau came out with a couple of days back, and they were talking about how de-identification, right, not even anonymization, is pivotal, but it's not infallible. The word is thrown about a lot, but it's not perfect, and we need to acknowledge that re-identification is possible, through quasi-identifiers or through methods that we haven't thought through yet. There were three things in that report that really caught my eye. The second was around encryption. People say, hey, this is valuable, let's just encrypt it. It has its merits, I'm not going to deny that, but anybody who works in business knows how expensive and how hard it is to implement in a way which is scalable, which is cost-effective, and which at the same time keeps access to data effective, right? Basically, you can't buy an expensive lock and expect all your problems to be solved. You've got to have that defense-in-depth mindset versus just getting a heavy lock for everything, because every time you leave the house, there's a flaw somebody can get in through, right? I'm just using a lock-and-key analogy, but essentially: de-identification is not perfect, and encryption is not your solution to everything. And then differential privacy; I feel there is merit there, but a lot of work has to be done. It's emerging stuff, right? It's not perfect yet. It's not scaled out yet.
There are organizations that are building frameworks to implement it, but we'll see. Does it stand the test of time? And does it scale well across the tech industry, or not?
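To make the differential privacy idea mentioned above concrete, here is a minimal sketch of the classic Laplace mechanism: instead of relying only on locks like encryption, calibrated noise is added to an aggregate statistic so no single individual's record can be inferred from the released number. Everything here (the epsilon value, the toy dataset) is illustrative, not a real deployment.

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) noise via inverse transform sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def private_count(records, predicate, epsilon=0.5):
    """Release a count (sensitivity 1) with epsilon-differential privacy."""
    true_count = sum(1 for r in records if predicate(r))
    # Noise with scale 1/epsilon masks any single individual's contribution.
    return true_count + laplace_noise(1.0 / epsilon)

# Illustrative data: ask "how many people are 40 or older?" (true answer: 3)
ages = [34, 29, 51, 42, 38, 27, 60]
noisy_answer = private_count(ages, lambda a: a >= 40)
```

Each released answer is noisy, but averaged over many queries the statistic stays useful, which is the utility-versus-protection balance discussed throughout this episode.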
Debbie Reynolds 20:46
Yeah. Oh, fascinating things. And I agree; I love to see all these different things happening. One thing that you said just made me remember a myth that I hear, and I want your thoughts on this; it drives me bananas. Some people think that encryption solves privacy issues, and it's like nails on a chalkboard when I hear that. It's like trying to eat soup with a butter knife; it doesn't work, right? What are your thoughts?
Swati Handa 21:24
Yeah, you'll get a taste of it with the butter knife, sure, but you're not going to finish your soup, and it's not going to be effective in getting the soup into your tummy anytime soon. That's how I think about the whole encryption thing. Encryption is one tool in your toolbox; it may not be the most effective tool for what you want to achieve at the time, but it's often the handiest tool, and it's probably the one that has been around the longest. That's why people tend to gravitate towards it. But it shouldn't automatically be the first tool you reach for; okay, you'll pick it up, but then put it back and use the right one. That's all I'm saying.
Debbie Reynolds 22:08
That's true. That's true. Your thoughts on federated learning? I feel like a lot of people don't really understand this. Maybe it's just me, but I feel like people have dropped things in the media around Web3 and decentralization without really explaining them that well. But federated learning is a way to protect privacy because it changes the way that data gets transferred and transmitted. So tell me a little bit about that.
Swati Handa 22:41
So I find federated learning, especially with this whole machine learning and AI wave, pretty cool. Very few organizations have the infrastructure to implement it right now, but I'm hoping that will change soon. Federated learning, for me, is a decentralized approach to machine learning, as you said. Traditionally, models would require collecting a whole bunch of data centrally: I need to know everything about you in order to train. I was listening to another podcast today where people were making claims about all this proprietary work, like, there's no way you haven't fed my entire novel into the model in order to answer the questions that you're answering, right? But the way federated learning works is that the model is trained at the device level using local data. Only the model updates, not the raw data, are sent back to the central repository. So it's a personalized experience without compromising individual data. The best analogy I can think of, Debbie: it's like getting a tailored suit without somebody touching you and taking all your measurements. You get a custom suit, but nobody actually came in to measure you. Or think T-shirt sizes, right? Small, medium, large, versus everybody taking your exact measurements every time you want to buy clothes. That's the analogy. Federated learning is an opportunity where the data still stays with you, on your devices, and only aggregated information is sent back centrally to train the main model.
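The train-locally, aggregate-centrally loop described above can be sketched in a few lines of federated averaging. This is a toy illustration with made-up data: three "devices" each hold private data, each computes a model update locally, and the server averages only the updates, never seeing the raw data.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, X, y, lr=0.1):
    """One gradient step of linear regression on data that stays on-device."""
    grad = X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

# Three devices, each with private data the server never sees.
devices = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(3)]

global_weights = np.zeros(3)
for round_num in range(5):
    # Each device trains locally and sends back only its updated weights.
    updates = [local_update(global_weights, X, y) for X, y in devices]
    # The server aggregates the updates (FedAvg); raw data never left the devices.
    global_weights = np.mean(updates, axis=0)
```

After the loop, `global_weights` is the shared model, learned without ever centralizing anyone's data, which is exactly the tailored-suit-without-measurements idea.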
Debbie Reynolds 24:39
I think the one thing that people don't understand about federated learning and the things you're talking about is that these things are made possible by the fact that mobile devices are more sophisticated in how they do computing, right?
Swati Handa 24:57
Exactly.
Debbie Reynolds 24:58
So a lot of the data centralization that we see now, like big players having data, is because a long time ago, the computing power was in those central places, right? Now we're at a place where these mobile smartphone devices, the things you have in your pocket, are more powerful than the huge mainframe computers that NASA used in the 50s. You know what I'm saying? Because of that, people are thinking about putting more of the computing on those mobile devices and not having to share so much data across, right?
Swati Handa 25:43
Yeah. And not just mobile devices, even smart devices; what we have with Apple, that's a totally different podcast on its own when it comes to smart devices. But you get the idea. Any physical device, be it the smart speaker or the smartwatch or the mobile device, all of them are out there on the edge and have a lot of computing power.
Debbie Reynolds 26:09
And they'll talk to one another at some point.
Swati Handa 26:14
Yeah.
Debbie Reynolds 26:16
So you're right, and I'm just waiting for that to happen: your toaster and your thermostat will talk to each other. And they'll be talking about you, Swati, like, what is she doing?
Swati Handa 26:29
It's a hot day; she's not gonna come close today. Forget it; you're gonna get a rest. That's what's gonna happen.
Debbie Reynolds 26:38
Oh, my goodness. Yeah, yeah. Talk a little bit about zero-knowledge proofs. So people heard of zero trust, obviously, from a cyber perspective. But tell me a little bit about zero-knowledge proofs and how that can improve privacy.
Swati Handa 27:00
The beauty of a zero-knowledge proof is that one party can prove to another party that a specific statement is true without revealing anything beyond that. It's a cryptographic method where you do not need to reveal the hidden information; yet another form of privacy-enhancing tech. They work well when the prover can convince the verifier that they have correctly executed a computation on secret data. The whole idea is that you're not revealing a whole bunch of information, so it keeps the customer in focus: they're not revealing private information. And that's where I see the relevance to privacy. In fact, a lot of the new crypto tech coming into place is leaning heavily on zero knowledge for on-chain transactions. So, for me, it's a fantastic tool in your kit. As we were talking about, encryption is not the only tool you have; this is one of those I will pick up in order to ensure privacy for the end user in my tech.
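The prove-without-revealing shape described above can be sketched with a toy Schnorr-style proof of knowledge. This is an illustration only: the parameters are demo values, the response is left unreduced for simplicity, and a real deployment would use a vetted protocol. The prover convinces the verifier she knows the secret x behind y = g^x mod p without ever sending x.

```python
import secrets

p = 2**127 - 1                  # a Mersenne prime; fine for a demo, not production
g = 3                           # public generator (demo value)

x = secrets.randbelow(p - 1)    # prover's secret: "the hidden information"
y = pow(g, x, p)                # public value the verifier already knows

# 1. Commit: prover picks a random r and sends t = g^r mod p.
r = secrets.randbelow(p - 1)
t = pow(g, r, p)

# 2. Challenge: verifier sends back a random challenge c.
c = secrets.randbelow(2**64)

# 3. Response: prover sends s = r + c*x; the random r masks x in the response.
s = r + c * x

# Verify: g^s == t * y^c (mod p) holds exactly when the prover knew x,
# yet the transcript (t, c, s) never contains x itself.
valid = pow(g, s, p) == (t * pow(y, c, p)) % p
```

The verifier learns only that the statement "I know x" is true, which is the minimal-disclosure property that makes zero knowledge attractive as a privacy-enhancing technology.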
Debbie Reynolds 28:13
Wonderful. Well, what's happening in the world right now in privacy or data that concerns you? Something you see where you're like, oh, wow, I don't like the way this is going?
Swati Handa 28:27
My top three would be: lack of control as an end user; all this fantastic tech around facial recognition; and finally, privacy around kids and young adults. Especially that last one, right, the privacy of young adults who think they can make all the decisions. Perhaps there's an opportunity for education, as guardians, as parents, in what we can do through the apps we use; we talked about that, right? The parental controls, and all the tech in our schools. It's fantastic that we have access to all this tech and we get to know all of it. But instead of just saying I agree, most of the time there are waivers and addenda that you can sign to keep the information from being used widely. So make sure you can still protect; don't just sign yet another form. I know it's easy, and it gets you out of the way, but just be aware that there's a lot going on in the tech used with schoolchildren. Yes, we have COPPA here in the US, and then there's GDPR, but there's a bigger narrative here: what do we do to ensure the safety of the young ones in our lives? And look at the landscape, right, Debbie? You know this, and in fact, I'm sure a lot of listeners do: 70% of the countries in the world, 137 out of 193, have some form of privacy laws. The landscape is constantly updating; to expect an average caregiver or parent to be aware of all these laws is out of the question. That's not a fair expectation. But what is, and what can be encouraged for all of us, is bringing education and awareness to the young ones and telling them it's okay to question: do I really need to sign this? And how will you use this? I encourage the people in my life to do that. And it's interesting when there are alternatives, and it's not about yet another form to sign.
It's more about what can we do to show that we are aware and that institutions around us need to ensure that safety.
Debbie Reynolds 31:05
I agree; I kind of feel sorry for the schools where the parents are privacy professionals, because you all ask a lot of questions, and you're probably giving people a lot of heartburn when you ask them. But they're all fair questions. And I think the gap, a lot of times, is that companies, or schools when they're dealing with children, when they're trying to implement an emerging technology, think of it as a like-for-like exchange: okay, we did it the old way, now here's the new way. Well, the new way is totally different. The new way collects more data, and the new way gives you more risk, right? If people are thinking about it as a like-for-like exchange, you can see how they don't really understand the gap. So let's say, for instance, you had a permission slip that was paper, sent home with your kid; they came home, you signed it, you gave it back to them. Now, let's suppose it's electronic. You're increasing the risk around this data, because it can be disseminated in so many different ways, but you're not taking the steps to assess those differences and try to mitigate that risk. What are your thoughts?
Swati Handa 32:37
What I do want to say is that I don't think privacy's future is overshadowed by everything else. I think privacy's future is pretty bright, but only if we steer it right, and that means we, as a society, have to ask those questions. Yeah, we're going to be a pain in the butt for someone, but I feel when ten people ask the same questions, someone stands up and takes notice, like, okay, we've got to do the right thing, because that's the right thing to do. As I said before: privacy is not a privilege. It's a universal right, and we want to ensure people understand that universal respect for privacy.
Debbie Reynolds 33:26
Wow, I love it. It's very deep, very deep. I love it. So if it were the world, according to you, Swati, and we did everything you said, what would be your wish for privacy anywhere in the world? So, whether that be regulation, technology, or human behavior, what are your thoughts?
Swati Handa 33:46
What would be my one wish for privacy? Is that what you're saying?
Debbie Reynolds 33:52
You can have more than one. Okay.
Swati Handa 34:00
Okay. You know what? I'm going to stick with what I've been saying the entire time: universal respect for privacy, something we really live and breathe; transparent data practices, wherever we are in the world; and finally, user-centric control. You should really feel you have control over the data being used in any aspect of your life, be it the car you drive, the watch you wear, the phone you use, or the device you use to have your next conversation. So balancing that utility with protection, and empowering individuals, is really, really important. But I liked the way you put it, Debbie; it's like you're the genie asking me my wishes. And you know what the cool part is? By ethnicity, I'm Indian, and we believe that at least one time in a day, whatever we say actually comes true. So what I'm hoping is that when you asked me this question, this was that time. I don't know if it's a myth or just something we believe; it was passed down to me to encourage me to say positive things and move forward. But what I'm hoping is that as we talk about this, it really becomes a reality: universal respect for privacy, transparent data practices, and user-centric control.
Debbie Reynolds 35:40
Those are very high on my list; I agree with you completely. Well, thank you so much. It was such a joy to have you on the show, and I'm sure people will learn so much from you. You just have such a spark; I can tell that you feel this in your heart. You're not just a seasoned executive and professional; you have a heart for people, which is, I think, the best you can ask for in someone in your profession.
Swati Handa 36:10
Thank you, Debbie. It feels awesome to get that compliment from you. Big hug. Thank you.
Debbie Reynolds 36:19
Well, thank you so much for being on the show. This is tremendous. And I'm sure that everyone will enjoy the episode as much as I did. Thank you so much.
Swati Handa 36:29
Thanks for having me.
Debbie Reynolds 36:31
You're welcome. Talk to you soon.