
Data Privacy with Debbie Reynolds, The Data Diva, Founder, CEO, & Chief Data Privacy Officer

Steve interviews The Data Diva, Debbie Reynolds

FEATURING: Debbie Reynolds, The Data Diva

In this premiere episode, I speak with Debbie Reynolds, Founder, Chief Executive Officer, and Chief Data Privacy Officer of Debbie Reynolds Consulting, LLC.

Debbie discusses her career in emerging technology, the field of data privacy, and how her consultancy helps companies make important decisions on data collection and data retention. You'll also learn about data provenance, data lineage, and how to best deploy data practices at global scale.

RESOURCES:

Connecting with Debbie Reynolds

Debbie’s LinkedIn: https://www.linkedin.com/in/debbieareynolds/

Debbie Reynolds Consulting, LLC: https://www.debbiereynoldsconsulting.com/

The Data Diva Talks Privacy podcast: https://thedatadivatalksprivacypodcast.buzzsprout.com/

Companies & Resources Discussed

GDPR: https://gdpr-info.eu/

McDonald’s Corporation: https://corporate.mcdonalds.com/corpmcd/home.html

PBS Interview: https://www.pbs.org/video/new-european-law-raises-bar-data-privacy-protection-vgrrid/

Debbie’s most recent NeXt Curve interview:

Rite Aid/FTC: https://www.ftc.gov/news-events/news/press-releases/2023/12/rite-aid-banned-using-ai-facial-recognition-after-ftc-says-retailer-deployed-technology-without

23andMe Data Breach: https://techcrunch.com/2023/12/04/23andme-confirms-hackers-stole-ancestry-data-on-6-9-million-users/

Twitter/X: https://twitter.com/home

OpenAI: https://openai.com/

Zoom/Data Privacy: https://www.nbcnews.com/tech/innovation/zoom-ai-privacy-tos-terms-of-service-data-rcna98665

LLMs: https://en.wikipedia.org/wiki/Large_language_model

PEAK IDV LIVE: THE CONVERGENCE:

IoT Advisory Board: https://www.nist.gov/system/files/documents/2023/02/28/IoT%20AB%20Member%20-%20Reynolds%20Statement%202023-01-18%20v2.pdf

Debbie’s Newsletter, The Data Privacy Advantage, on LinkedIn: https://www.linkedin.com/newsletters/the-data-privacy-advantage-6881658462879657984/


FULL EPISODE TRANSCRIPT

Steve Craig:

Welcome to the PEAK IDV EXECUTIVE SERIES video podcast, where I speak with executives, leaders, founders, and change makers in the digital identity space. I'm your host, Steve Craig, Founder and Chief Enablement Officer at PEAK IDV. For our audience, this is a video first series, so if you're enjoying the audio version, please check out the video recording on executiveseries.peakidv.com, where you can watch the full episode, read the transcript, and access any of the resources or links from today's conversation.

This is the premiere episode of Season 3, and I'm thrilled to kick off this new season with Debbie Reynolds, The Data Diva. Debbie is Founder, Chief Executive Officer, and Chief Data Privacy Officer of Debbie Reynolds Consulting, LLC.

She is a world-renowned technologist, thought leader, and advisor working at the intersection of data privacy, technology, and law. Debbie is also a top consulting voice on LinkedIn, an internationally published author, a highly sought-after speaker, and a top media presence on global data privacy, data protection, and emerging technology issues.

Debbie's podcast, The Data Diva Talks Privacy, is the #1 data privacy podcast worldwide.

Welcome, Debbie. I'm beyond thrilled for you to be on the podcast today.

Debbie Reynolds:

Thank you so much, Steve, for having me on the show. I really enjoy your show, so I'm happy to be a guest here. Thank you.

Steve: Oh, thank you so much.

Well, let's get started. Can you share more about your consulting practice and your business? Like what's your elevator pitch?

Debbie: Oh, that's a good question. I think my elevator pitch is: I like to help companies overcome things like barriers to adoption as it relates to privacy and emerging technology.

Companies get excited about emerging technology and want to see how they can implement it. So I work with them either on the implementation side or on the advisory side, in terms of how they navigate the global privacy landscape. And then there's the other side of my business: like you have a podcast, I have a production company.

So we do videos, podcasts, different things for our company. Very exciting. And I'm a speaker as well.

Steve: Excellent. Excellent. And I love how you include Chief Data Privacy Officer in your title. In part, I call myself Chief Enablement Officer because I'm enabling people to be experts in IDV. But what are some of the key responsibilities of a CDPO? Should companies have this role?

Debbie: Yeah, because I am the CEO of my own company, and my company has my own name, when you say CEO, it doesn't really tell you what you do. So that's why I have the title of Chief Data Privacy Officer.

So basically what I do is help companies with maturity in terms of data privacy. I do that for my company; I do that for other companies. You know, not all companies are at the same place, and not all companies have the same risk appetite, I would say, around privacy. So I try to meet people where they are, see where they want to go, and see where we can move them to in terms of maturity.

It's sort of like, you know, I've heard someone say, like sharks: if a shark stops swimming, it dies, right? Because it'll just sink to the bottom of the ocean. So you have to keep swimming. You know, privacy is not a one-and-done thing. It's not a sprint where you get to the end and say, "Whoo, I solved privacy, I'm so cool, and I don't have to do anything else."

It's just not that way. It is a progression. It's something that has to happen, that has to be part of the lifeblood of an organization. It has to be very fundamental now to how companies handle data, and it is an ongoing process.

Steve: Are you seeing more and more companies have it as a designated role within an organization, where someone's responsible specifically for that?

Debbie: We are. I am seeing that a lot more, whether it be people internal; obviously, really big companies have a whole staff, many people who work on that. Sometimes it's divided between the legal part of privacy and the technology part, and I play in that middle space. Not all companies need a Data Privacy Officer, but some companies decide they do want someone fractional. They may not need someone full time, but they want a relationship with someone who knows enough about, you know, what they're doing.

And then a lot of companies try to give it as a role, maybe a responsibility, to someone existing. I don't think that's a bad thing, but if companies want to do that, they need to really invest in that person and make sure that they have the proper resources and training.

So I don't recommend what I've seen other people do: "Hey, John down the hall. You know, you like technology. You're the data privacy officer." And they don't give him any tasks to do, they don't give him any funding, no training. In certain jurisdictions, it could get serious: in Asia, there can be criminal charges against the person if they mishandle data. On the other side of that spectrum, let's say in Europe or wherever, a person could be sanctioned if they don't handle data properly. So it can be a serious thing depending on the jurisdiction.

I am not the person to say, "Oh, you always have to have a full-time person," or, "You can't have someone internal." That's not true, right? There are a lot of people who can come into this role and do it. You do need someone who has the aptitude to learn, who knows that it's an ongoing process and that they'll constantly be reading and learning, basically.

Steve:

Yeah, and looking at your background, I see your career has predominantly been in the legal field. Can you share your journey of getting more involved in the data privacy components of the legal side and compliance?

Debbie: Sure. So actually, my background is in technology. That's my trajectory. You are probably too young, Steve, to remember when people had card catalogs in libraries, but I was turning those into databases back in the day. That's really how my technology journey started. Then I started getting contacted...

Steve: Dewey Decimal System? That style? Not that young.

Debbie: Yeah. Oh, yeah. So I started creating databases for companies, and some of those companies were involved in legal matters and things like that. But I'd always had this digital transformation thread through my career, and I've done it for a lot of different companies and different industries, not just legal.

But I know a lot of legal people, because a lot of times they were involved in litigation and they needed data, right? "Okay, I need data. I need it fast. I need it from here." "Well, this is how you do that. This is how you move that data here or there." Because I had done that for so long, I started to get calls from companies that I had worked with and that knew me. Around the mid-2000s, when Europe started to decide they were going to update their data directive, I started getting calls from people who knew me from all those digital transformation things, and they were like, "Hey, we know you know about this privacy stuff, so come talk to us."

So probably one of the first big companies that asked me to come speak with them was McDonald's Corporation. So I spoke with their corporate legal department.

This was before the General Data Protection Regulation came out, and they were like, "Hey, what do we need to be thinking about?" and stuff like that.

So I talked to their entire legal department, and that was a long time ago; as I said, it was before GDPR came out. As I started talking more about privacy, you know, I actually have a personal interest in privacy, and I knew that all this digital transformation was bringing in a lot of privacy risk.

So, you know, it gave me an opportunity to focus there, and I've been really excited to do that. A few years later, when GDPR, the General Data Protection Regulation, went into effect, PBS called me and asked me to be on TV to talk about it. People still ask me about that interview.

Steve: So, yeah. Was that your first media interview? With PBS?

Debbie: No. I've been interviewed a lot in the press, you know, the New York Times, about my tech work. For many years I've been interviewed by media outlets and different things like that. But I think up to that time, no one really cared a lot about privacy, right? So that was the first time it wasn't entangled with some other thing.

Right. So I think that was a big deal.

Steve: Well, at what point did you decide that you wanted to go independent and start your consultancy? It looked like on LinkedIn it was 2019, but had you been consulting prior to that?

Debbie: Yeah, I'd always done consulting. Actually, after my first job out of college, I had my own business doing consulting.

So I've always done consulting, and, again, digital transformation and a lot of different things: AdTech, FinTech, whatever. People knew that I knew how to do these things, so I'd help them with different projects. To me, it all comes together, because the central thread through my career is data, right?

And so for me, I can go in any direction. So it doesn't matter to me. Like people ask me, “Oh, what type of clients do you have?”

You know, I have Fortune 50 companies and small companies; it really depends not on the size of the company, but on the seriousness of their data problem. So, yeah, the more serious the data problem, the more they want someone who understands those high risks.

Basically, yeah.

Steve: Does it tend to be B2B2C versus B2C? Like, what sort of companies end up having the most issues?

Debbie: Yeah. I definitely work a lot with B2B, but also companies that have B2C issues, right? So I'm working directly with a company; maybe they have millions of customers and they want to do X, whatever X is, and they want some help there. Or maybe companies are trying to do contracts between two different businesses, right?

They want to collaborate in some way, and they want to work through some of these privacy challenges. So, yeah.

Steve: What are some of the common struggles that are brought to you in terms of like the challenges and what they need the most pressing help with?

Debbie: Some of the common struggles? I get a lot of calls from people about, as I said before, barriers to adoption, right?

So barriers to adoption can be, let's say a company has a cool product, they want to sign a contract with a big company, and the company says, “Hey, we want to see your privacy program.”

They're like, oh, we don't have one, right? So then they have to go back to the drawing board. Or let's say a company wants to move into a new area, maybe one they've never been in before, right?

They have some emerging technology components. So, for example, I had a client. They’re famous for being in the cable TV business, right? And they wanted to move into smart cities. That meant that, okay, now it's not just their B2C customers. Now it's like, okay, now I'm putting technology into vehicles.

I'm putting technology into the traffic lights. I'm doing all these different things that mean they're handling data in different ways. So a lot of times I get involved with the developer groups. I'm typically advising them before it even gets to legal, because I tell people privacy is a data problem that has legal implications.

It's not a legal problem that has data implications. Unfortunately, some people take it from the legal side, but really the problem is the data, right? So companies need a better way to govern their data, to understand what data they have, why they have it, why they're using it, and when they get rid of it.

That data lifecycle is a huge deal, and it's a big problem for companies, because until a lot of these data privacy regulations came about, a lot of companies didn't have a reason to delete data, right? And a lot of these regulations are saying, "Hey, only keep data until your purpose has been extinguished."

And before, it was like, let's keep it forever, let's do whatever we want with it, let's use it for other stuff. That's what a lot of these regulations are trying to guard against, because they know there's a lot of harm that can happen as a result.

Steve: Yeah, yeah.

I felt like in the 2000s and the 2010s, it was all about big data. Collect as much data as you could, because data was the new oil. It was like, okay, we don't know what we're going to do with it all, but let's just collect as much of it as we can, because someday we might be able to monetize it. It's interesting how that transition has occurred. With PEAK IDV and the audience that watches this podcast, it's typically providers and practitioners of digital identity or identity verification technologies. Sometimes data privacy and IDV practices can be at odds with each other, because you want to know someone. So you want to ask them as much as you can about them, and get driver's licenses and selfies, and ask all these questions. But then you need to balance that.

And I think consumers have awoken to some of the challenges with over-collection. How have you seen that area, the world of digital identity and that transformation, be affected by some of the privacy policies that are happening?

Debbie: Yeah. Identity is a fascinating space. I have a lot of friends in identity, especially in banking and FinTech, biometrics. I think some of the growing pains that we're seeing in the democratization of AI have probably already happened a lot in identity, where it's like, how much data should we collect? Should we have an empty database or should we have a full database?

Like, how much data do we need to collect to be able to do this process? So I think those challenges are always there. What people don't really realize is what happens when they move into a situation where they're using biometrics. Let me give you an example. Let's say, for instance, a company, a drugstore, that's accustomed to using just a regular video camera in the store, right? To make sure people aren't stealing, or just, you know, checking out everything. And then they decide they want to use a biometric camera that actually has a database attached to it, and they're doing facial recognition, right?

So that's not a like-for-like exchange, right? Maybe the camera looks the same, but the technology is doing something different, and that raises those privacy risks. So it really is incumbent upon not just the people who create these technologies, but also the people who use them, to understand what they need to do, right?

Because a lot of the problems that companies have around identity come from the over-collection of data. Really, there are only two privacy problems: data collection and data retention, right? So for the data you collect: why are you collecting it? Why do you need it?

How long do you need it? That's where you get into privacy, and that's where you get into some of those cyber risks for a company. So being able to do that balancing act, like you said, is really important, and not just sort of close your eyes and think, oh, the tool works perfectly and we don't have to do anything. You know, with great power comes great responsibility.

So I think people need to understand, when they implement these tools, it's not like an easy button. There has to be some human judgment and some governance in terms of how the data is handled.

Steve: Totally. Now, we don't have to name names, but I think I might have read a similar article about the company that was using some of those technologies.

And they were told that they can't do that anymore; they got put in a penalty box, so they're no longer able to do that for a few years. The risks of abuse, beyond the reputational damage and the potential for fraud, of just making poor decisions by not having the right use of that technology, are considerably high right now as people are diving in. We're recording this at the start of 2024.

What are some of the topics that practitioners or solution providers should really have on their radar as they think about their next big project or go-to-market? Anything that comes to mind, so you don't end up getting in trouble like that company?

Debbie: Well, you know, for me it goes back to data collection and data retention again.

And what you're going to do with that data, that's a problem. The particular case you're talking about is in the public domain, so I can talk about it: Rite Aid. Rite Aid was banned for five years; they can't use facial recognition in any of their stores for five years, because they got put on the naughty list. They had a run-in previously, many, many years ago, over some bad governance around data: collecting data, having false matches of people's faces in databases, going out to social media and trying to match the faces of people in the store with social media, harassing customers because they thought they looked like other people, calling the police.

I mean, it was just buck wild, right? It's like falling from a tree and hitting every branch on the way down. I think this is the reason the FTC highlighted this case: almost anything that could have gone wrong in that scenario did go wrong. But on the flip side, I see a lot of companies that do a very good job of managing the governance risks that have to do with biometrics: making sure that people have their rights, making sure that people are aware of what is happening, like if they're recording and using facial recognition, making sure that people can opt out if there's an opportunity to do that. Making sure that they're not falsely accusing people; you know, "have to be sure you're sure," that was a Stevie Wonder song, before you try to accuse people or try to arrest them and different things like that.

So really, what the regulators are watching is people who use these types of technology in what they call a high-risk use case; anything that could possibly harm someone's life or liberty is one the regulators really, really look at. The people who do these technologies the best, to me, say, okay, I'm going to collect this data, maybe to match your face, you know, when you go to an airport, and then once we do that, we delete the data.

That's like the best you can hope for, right? Because beyond that, what do you need it for? Right? So I think companies really need to ask themselves that question all the way through the process.

So data has a lifecycle, cradle to grave. A lot of times companies get excited about the cradle part, and they don't think about the end-of-life strategy for that data. There has to be an end of life. It can't be an open-ended situation, especially when you're thinking about identity and biometrics.
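To make that cradle-to-grave idea concrete, here is a minimal Python sketch of a retention purge. All names here (Record, purge_expired, the field names) are illustrative assumptions, not from any system discussed in the episode: each record carries the purpose it was collected for and a retention window, and anything past its window is deleted.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical record type; the field names are illustrative only.
@dataclass
class Record:
    subject_id: str
    purpose: str           # why the data was collected (the "cradle")
    collected_at: datetime
    retention: timedelta   # how long the purpose justifies keeping it

def purge_expired(store: list, now: datetime) -> list:
    """Drop every record whose retention window has closed (the "grave")."""
    kept = []
    for rec in store:
        if now < rec.collected_at + rec.retention:
            kept.append(rec)
        else:
            print(f"purging {rec.subject_id}: purpose '{rec.purpose}' has expired")
    return kept

# Example: a face template kept only 30 days past collection.
store = [Record("visitor-1", "airport face match", datetime(2024, 1, 1), timedelta(days=30))]
store = purge_expired(store, now=datetime(2024, 3, 1))  # record is purged; store is now empty
```

The point of the sketch is that deletion is scheduled at collection time, not decided later, which is the "end-of-life strategy" Debbie describes.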

Steve: Yeah, it feels like companies should be thinking about each angle and playing ahead on how that data could be used and what could go wrong in that scenario. One that came to mind as you were speaking was what happened with Zoom, when Zoom, the web conferencing system, changed their terms and conditions around how they were going to use the data.

And there was a tremendous uproar around that, because they weren't thinking through how it would be perceived by the public, what privacy laws it might break, or any challenges with that. So I find that thought work to be fascinating; it's almost like you have to have a crystal ball.

At least think about it first before you make those big changes.

Debbie: Yeah. A lot of companies get in trouble because they collect data for one purpose and it ends up getting used for something else, and that to me is indicative of a governance problem within an organization. So another public thing we can talk about is Twitter.

So, Twitter had a problem where they were asking users to give them their email address and their phone number. And the purpose was that they wanted to use it for multifactor authentication, and that's fine, no problem, right? But then the marketing people got that data and started using it to market to people.

It's like, okay, that's no, no, no, that's a problem, right? And so that's a governance problem. So if the data was collected for one purpose, it shouldn't be used for another purpose without you getting permission from that person, right? Because basically you're saying, “Hey, let me pretend like I want it for this one purpose and use it for another one.”

Right? And it may not be nefarious, because what happens a lot of times in organizations is that data comes in, and then how it got there gets lost, right? So that lineage is not there. And someone says, "Hey, I found a bucket of data. Let's use it for this. Okay, whatever."

So these laws and regulations, what they're trying to do is bring more transparency to that data lifecycle, and make sure that companies know there is a governance step there. Just because data is in your possession doesn't mean you can do whatever you want with it. Those days are gone.
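Debbie's purpose-limitation point, that data collected for one purpose can't silently be reused for another, can be sketched as a governance check in code. This is a hypothetical illustration, not any real system's API: the store refuses any use whose purpose was never consented to, which is the step that failed in the Twitter example.

```python
class PurposeLimitedStore:
    """Data is tagged with the purpose it was collected for; using it for any
    other purpose without fresh consent is refused."""

    def __init__(self):
        self._values = {}    # subject_id -> stored value
        self._purposes = {}  # subject_id -> set of consented purposes

    def collect(self, subject_id, value, purpose):
        self._values[subject_id] = value
        self._purposes.setdefault(subject_id, set()).add(purpose)

    def use(self, subject_id, purpose):
        # The governance step: every use is checked against consent.
        if purpose not in self._purposes.get(subject_id, set()):
            raise PermissionError(f"no consent to use {subject_id}'s data for '{purpose}'")
        return self._values[subject_id]

store = PurposeLimitedStore()
store.collect("user-1", "+1-555-0100", purpose="multifactor authentication")
store.use("user-1", "multifactor authentication")   # fine: matches the collection purpose
try:
    store.use("user-1", "marketing")                # the Twitter-style misuse
except PermissionError as err:
    print(err)                                      # refused, and auditable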

Steve: That's a really good point. I think by now people realize, okay, I can't sell customers' data. I can't use it for marketing purposes. But one of the things we've seen a lot of in this last year is, "Well, hey, I've got this data. What artificial intelligence can I train with it?"

And I know you do a lot of thought leadership on that. I was watching a session you did with the NeXt Curve webcast on personalization and privacy. With all that's going on with generative AI, it feels like companies are really wanting to jump on that large language model train and start to do stuff with the data they sit on. What should companies or individuals think about when it comes to privacy and data risk for that plan?

Debbie: First of all, people should think of generative AI tools like a Swiss army knife.

They can do many different things. A lot of times when I hear people talk about them, they think in terms of them doing one thing, right? And it's just not that way. But the thing that companies need to think about is provenance and data lineage. Provenance is: do I have a right to have this data in the first place?

That's your first question, right? The legal people are really concerned about that, provenance. But then you can go off the rails with lineage. Go back to the Twitter example: they had the provenance of the data when they requested it for multi-factor authentication. The people consented to it, right? They gave them the data.

So that's provenance, right? You have a legal right to have it; people said it's okay. Then you take the data, somehow it gets to marketing, which the person didn't agree to, and you start using it for that, and that's data lineage. So when you're thinking about AI and AI systems, you have to track the data all the way through the process. "Do we have a legal right to have the data?" is one question. And then, as it's going through the process, you can succeed at provenance, fail at lineage, and fail completely, right?

So don't get so excited about the provenance part, because you can still mess up. For a lot of people, the mess-up is not really the provenance; it's really the lineage.

Right now, OpenAI is having a double whammy: they have both lineage and provenance legal issues going on, so I think it'll be interesting to see how all that plays out. But, you know, these are powerful tools.

I think they'll evolve a lot and change very rapidly. Right now we're talking a lot about Large Language Models. I think the future will be little language models, where things will be more purpose-built. They won't need as many parameters. They won't need as many, you know, documents in these databases. Like, what do you or I need with a million cat videos in the database? I don't need that, right? Someone created that; that's their own IP, their own copyright. I don't need that. But maybe what I want is the tool, so that I can do what I want to do with it.

So I think that's going to be the future of LLMs.
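The provenance-versus-lineage distinction Debbie draws above lends itself to a small sketch: provenance is the consent recorded at collection, lineage is the audit trail of every hop afterward. The names below (TrackedData, move_to) are hypothetical, for illustration only; the marketing hop succeeds at provenance but fails at lineage, just as she describes.

```python
from dataclasses import dataclass, field

@dataclass
class TrackedData:
    """Provenance is the legal basis recorded at collection; lineage is the
    audit trail of every hop the data takes afterward."""
    value: str
    consented_uses: frozenset                 # provenance: what the person agreed to
    lineage: list = field(default_factory=list)

    def move_to(self, system: str, use: str) -> bool:
        # Each hop is logged, so misuse is visible instead of getting lost.
        ok = use in self.consented_uses
        self.lineage.append((system, use, "ok" if ok else "VIOLATION"))
        return ok

email = TrackedData("user@example.com", frozenset({"multifactor authentication"}))
email.move_to("auth-service", "multifactor authentication")  # provenance holds, lineage clean
email.move_to("marketing-crm", "marketing")                  # succeeded at provenance, failed at lineage
print(email.lineage)
```

The design choice is that lineage is appended on every move, so "I found a bucket of data" can always be traced back to what the person actually consented to.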

Steve: That's great. I can envision a not-too-distant future when these models run securely on the device, and that data doesn't go anywhere. I can see Apple being a leader in that area as they develop some of those systems.

Debbie: Yeah.

Steve:

Another area that is expanding and changing when we think about transformation is this convergence of the physical and the digital world. That can include your identity, but also just how you interact in physical spaces and then in the digital world. You dropped in on one of the PEAK IDV LIVE events where we talked about that convergence; thanks for joining. We didn't get to go too deep into some of the topics I think are really relevant, and one of them was IoT, the Internet of Things. I was looking at your profile, and you're on the advisory board for the U.S. Department of Commerce's initiatives around this area. What are some of the things we should think about when it comes to IoT and privacy and personal identity, and how that all relates?

Debbie: Yeah, I decided that was the area of emerging technologies I wanted to focus on, because in terms of technology, hardware is one of the most ubiquitous types of interactions that humans have with technology, in my view.

These are smart speakers, your TVs, you know, all those things. I call an IoT device a computer without a screen, right? When you have a speaker in your house, it looks innocuous, like, "Oh, it's not doing anything. It doesn't have any buttons on it. It doesn't have a screen."

You don't really know what it's doing. But what you may not understand is that, first of all, it's connected to the Internet, and it's being updated, right? So maybe the thing that you bought a year ago now has functionality that it didn't have a year ago. Now it's doing things that it wasn't doing a year ago.

So how do you know that as a consumer? What do you need to know about how it's collecting your data? Can you opt out of certain things? That's the concern I have around IoT. And then, like you said, we have the metaverse, right? That was a hot word.

It hasn't gone away under any circumstances, right? That's still there, but it's probably going to have more real-world uses, not these crazy, out-of-this-world use cases, I think. But, you know, how are you moving through spaces? There's technology where sensors can be the size of a postage stamp, and they can be put anywhere.

And all they really need is a device that knows how to read that information.

What does that mean for the person who's moving through these spaces, for how their data is being collected? All of it is about collection and retention. So let's say you have a space someone's moving through.

It has all these sensors or whatever. Maybe the data is only needed for the time when that person is in that space. And if the company decides, okay, we don't need this data, we just need it while they're in the experience, and then we delete it, that's great. To me, that's the best you could possibly ask for, because it keeps you out of legal trouble.

It keeps you out of a problem with the person, right? They understand: we're collecting your data right now as you move through the space, but once you're out of the space, we're going to delete the data. So no privacy problems there, no consumer problems, because it's more evident, more transparent.

That's where we need to go. Right now we have organizations that, unfortunately, aren't doing that communication. They're not being transparent, because corporations have traditionally always kept everything; that's just the default. Deleting stuff is like swimming upstream within a corporation. And then there are the databases themselves.

Databases are made to remember, not to forget. So when we say, okay, you need to delete stuff, they're like, "What? Delete?" It becomes a huge deal and a huge project, because a lot of these technologies were never made to do that.
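As a rough sketch of the "delete it when they leave the space" pattern Debbie describes, here is a hypothetical ephemeral store, with illustrative names only: sensor data about a visitor is destroyed the moment the visit ends, so retention never outlives the purpose.

```python
class EphemeralSpace:
    """Sensor readings about a visitor exist only while they are in the space;
    on exit, everything collected about them is deleted."""

    def __init__(self):
        self._readings = {}  # visitor_id -> list of sensor readings

    def on_sensor(self, visitor_id, reading):
        self._readings.setdefault(visitor_id, []).append(reading)

    def on_exit(self, visitor_id):
        # Retention ends with the visit: the purpose is extinguished, so the data goes.
        self._readings.pop(visitor_id, None)

space = EphemeralSpace()
space.on_sensor("visitor-1", {"zone": "entrance"})
space.on_sensor("visitor-1", {"zone": "checkout"})
space.on_exit("visitor-1")
assert "visitor-1" not in space._readings  # nothing left to breach or misuse
```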

Steve: Yeah. And these days, almost everything you buy, even if it doesn't have a screen, has a computer in it, and it's probably an Internet-connected computer. The appliances, we're all pretty familiar with that. You've got refrigerators that connect now to the cloud, and ovens, and this and that. But it's really hard to know, to your point about the speaker: is it just a speaker? Is it just output, or are there microphones in there?

Does it connect to WiFi? Does it have a cellular chip that you might not know about? All these things are problematic for consumers. As you were describing this spaces concept, I was thinking about going out to the mall or a store. How do you know what is tracking you? Of course, you've got computer vision and cameras, but as you move about, you're not signing a release when you walk into the store. How can they collect things on you if you haven't been privy to that?

I think it's a fascinating area, and I think investing in IoT is a good play. Now, we're coming up on time, and there is a timely topic I want to bring up. Just as this was being recorded, in the last few weeks, it was getting a lot of press, and that had to do with the DNA company 23andMe.

They had a data breach, and a lot of data was exposed: not just your personal DNA data, but your relative connections, family members. There was some sort of hack, but then in the end they said, "Oh, no, this wasn't a hack. This was password reuse, and it was our customers that caused the problem."

And they did a little bit of, almost like, victim shaming: "Oh, if you hadn't reused your passwords..." That didn't go over well, and it's still an ongoing topic. But what are some of your recommendations when you think about data privacy, connecting it to the identity topic?

When someone runs into that situation?

Debbie: Yeah, well, 23andMe is a very unique situation. I can't even think of anything that compares in the magnitude of a breach in that regard, right? Your DNA, you know, it is you, right? It is everything about you. It opens up all types of inferences and things like that.

And, you know, it was so popular that police agencies around the world wanted that data, because they want to say, "Hey, we want this DNA. We think your cousin stole something from the store," or whatever.

And, you know, I have family members who've used that service, and it's very innocent, right?

They want to know, what is my heritage? Especially if you're Black. As a Black person, unfortunately, I don't know my lineage in the same way. I mean, my boyfriend knows his lineage because he's Jewish, or whatever. But, you know, very innocent, right? So when people were signing up for that, they didn't intend for their data to be used in police databases.

And they didn't intend for their data to be left unsecured in a way that it could be breached by someone. There are things in your DNA; maybe you're predisposed to cancer or something, and someone says, "Hey, I'm not going to hire Steve because he's predisposed to cancer and we don't want to have high insurance rates," or whatever,

whether you have cancer or not. So this is the danger of over-collecting data, collecting it for the wrong purpose, and then of protection, right?

If you want to collect data of such a sensitive nature, you have to have the best security ever. This is not like, oh, you know, Sally drinks a flat white latte on Tuesdays, right?

This is your heritage, where you're from, what your predisposition is for certain genetic things, whether it's true or not. So I think the government will make an example out of 23andMe. Before this, probably the worst breaches I could think of were the credit bureau breaches, because when someone steals your identity, they may not use it right then; it may not impact you right now, but maybe it'll impact you in the future. There are people who couldn't get jobs, people who were denied housing, just from all types of wackiness with that, right? And that's bad.

But I think DNA is the worst because you can't change your DNA. So if it's breached or being misused, you can't do anything about that. So there is really no adequate legal redress for that type of problem. And it's not a victim problem, right? It's the problem of a company collecting the data.

So my thing is, if you can't protect it, don't collect it.

Steve: That's a good point. And I think about the fallout from trying to address the issue. The first mistake was that the data got accessed, but then they put out a statement that says, well, you know, "not our fault," in a way. Companies need to think through the ramifications of how they react to that.

Of course, they can’t also be silent. That's not good either, but I think you're right. The government needs to do more when it comes to making examples. There are a few other DNA database services out there that hopefully are seeing that and really boosting up their security processes and whatnot.

To close out today's session, I really want to highlight two properties from the media side of your business that are really inspiring to me, because I'm leveraging similar tools and capabilities. The first one is your LinkedIn newsletter, The Data Privacy Advantage. You've got over 11,000 subscribers on that. That's tremendous. Can you share more about the topics you cover, and whether there are any particular topics that resonate with your LinkedIn audience?

Debbie: Oh, aren't you sweet? Thank you. Thank you. I really love the newsletter. I had struggled for years to decide whether I wanted to do a newsletter or not. And I thought I did, because sometimes you can't really express things in a soundbite, right?

You don't get a lot of time or attention from people. And sometimes you need an outlet where you can explain things, and people can read them and go back and ruminate on them. That's really the impetus behind the newsletter.

So I always try to think about: what is the higher vision that I want to share, that you can't do in a five-minute video, or that maybe doesn't come across in a podcast or things like that? That's really what I do. I try to think about: what's happening right now? What is it that people need to know that they can use now, that would probably help them in the future?

So, you know, I don't do hot takes. I don't do, "oh, a dog bit me yesterday." That's not what I try to do in the newsletter. I try to say: what can I share with someone that they can take right now and use? That's what I try to do in the newsletter.

Everybody asks me about AI. So AI and privacy...

Steve: AI and privacy, those are the main topics that seem to get a lot of engagement, reactions.

Debbie: Yes. Yeah. So I do a lot of speaking. I've done quite a bit of speaking in the last, well, months, to, you know, "tiny" companies like PayPal and Coca-Cola and Johnson & Johnson, about AI and privacy. It's a hot, hot...

Steve: Small little companies, the biggest brands.

Yeah. You know, the newsletter is really impressive. I think the newsletter format from LinkedIn is great because it goes into people's inboxes, versus when you're just posting on LinkedIn, the algorithm decides; maybe you'll get a lot of engagement, maybe you won't. But with the newsletter, it does get out there.

The second property I want to highlight, which I mentioned in the intro, is your podcast. It's the number one data privacy podcast in the world. I saw your post around December, the holidays: 100,000 downloads over 160 episodes. I would call that a phenomenon, right?

Can you share the origin story of how you decided, "Hey, I'm going to do a podcast"? Because I've got one myself, but I'm not at 160 episodes and a hundred thousand downloads.

I'm curious about it.

Debbie: I fought with myself for a while about a podcast. I thought I wanted it; I didn't know. You know, you see a lot of people out on LinkedIn or in different places talking about things, and I felt like I was screaming at the computer, right? Because I'm like, why aren't people talking about this?

So the thing that I want to make sure my podcast always does, and this is the origin of why I do it, is talk about the things that people aren't talking about. People describe it as the narrows: I go into those deep issues that a lot of people maybe don't want to touch.

They don't want to talk about them, they don't want to think about them. So I'm talking about not just digital transformation, but digital degradation, end of life, data retention, bias, equity, all types of stuff. These are all human issues, and they are hard.

They are really, really hard. I'm always attracted to really complex problems, so I steer towards those more complex issues, and I try to make sure I'm illuminating them in a way that you can't really do in a soundbite.

Steve: That's great. Well, I'll definitely include a link directly to your newsletter as well as the podcast.

So those that are watching this will see it, but chances are, with 100,000+ downloads and that number of episodes, you're already on their radar. We're just about out of time, Debbie. If you've seen a few of these episodes, you know I like to go beyond the LinkedIn profile, beyond the business side. Are there any passions or personal causes outside of data privacy and the things we've discussed that are important to you, that you could share?

Debbie: That's a great question. You know, privacy is not just my job; I am passionate about it, right? I'm passionate about human rights, I guess; that's a way I could talk about it. Privacy is part of human rights, and for me, that's what I want in the U.S. I want privacy to be a human right in the Constitution, which it is not now.

There are other places that have privacy as a fundamental human right in their constitutions, including the EU, and India added that a few years ago. To me, that's a proper foundation we could have in the U.S. to be able to create regulation that actually helps people and doesn't hurt them.

Steve: Yeah. It's almost as if we need a constitutional amendment around that. Then at the same time, it can be tough with the way the government likes to track, or the Internal Revenue Service; sometimes they push the boundaries of that as well. Well, as we wrap up, what kind of conversations would you be looking to have with the audience?

Anyone who's watched this or listened to it? What are you interested in as far as engagement in the market?

Debbie: Yeah, well, I'm always open. I love identity. I love that space, that area. So, anything in biometrics, anything in AI, all that stuff really interests me. All the little wacky things people are trying to do.

For me, I'm always like, oh! What is the privacy implication of that? I'm a technologist by trade, so I love technology. I don't necessarily love everything people try to do with technology, but I'm always interested to see, or to have conversations with people who are thinking about, what can we do?

You know, how can I help? Obviously companies want to make money, which is fine, right? But you want to do that in a way that doesn't harm people. That's the goal.

Steve: Are there any upcoming events that you'll be at? Any keynotes that you have already scheduled or planned for 2024?

Debbie: I have some things brewing that I can't really discuss yet. I try not to telegraph my punches, but I will make sure I share once everything is set in stone. There'll be some surprising things coming up in 2024 that people probably wouldn't expect, but they'll be fun, interesting things.

Steve: Very cool. Very cool. And if anyone wants to reach out, what's the best way to get ahold of you? Is it LinkedIn or your website?

Debbie: LinkedIn is great, so you can contact me there. If you type in "Data Diva" or "Debbie Reynolds," my name will pop up. Or you can look at my website, DebbieReynoldsConsulting.com; I have all my resources and videos and newsletters on my website.

Steve: Excellent. Well, Debbie, thank you so much for taking the time to speak with me and to be a part of this premiere episode for Season 3. I look forward to seeing those announcements that you're going to post and then potentially collaborating with you in the future. Thank you so much.

Debbie: I look forward to it. Thank you so much.
