Child Online Protection with Caroline Humer

Steve interviews Digital Safety Advocate & Global Protection Expert, Caroline Humer

This week’s episode features Caroline Humer, Digital Safety Advocate and Global Protection Expert.

Caroline is a child protection expert who has dedicated her life to protecting the most vulnerable populations through her work at international non-profits and in the trust & safety industry.

She’s a globally recognized expert in policies and frameworks that fight harms online and teaches law enforcement teams around the world how to investigate missing and abducted children. Caroline has helped implement AMBER Alerts in 19 countries and launched the first and largest AI search engine for missing children.

She also hosts a bi-weekly podcast series called "Missing Persons Uncovered."

If you're in the field of trust & safety or just interested in keeping your family or community safe, you’ll learn from this engaging conversation with Caroline Humer.

RESOURCES:

Connecting with Caroline Humer

Caroline Humer’s LinkedIn: https://www.linkedin.com/in/caroline-humer-669bb71/

HMC Group, LLC website: https://hmcgroupllc.com/

Companies & Resources Discussed

HMC Group, LLC was founded by Caroline Humer to construct sustainable mechanisms to protect the most vulnerable people. HMC’s focus is on missing persons and digital protection.

Podcast: Missing Persons Uncovered is a bi-weekly podcast series hosted by Caroline Humer and Karen Shalev featuring the first-hand experiences of those affected by missing persons cases, seeking the people behind the statistics.

Trust and Safety Forum is an annual event hosted by Caroline Humer and Jean-Christophe LeToquin to bring policymakers and industry closer together and identify opportunities to build a safer digital environment and a balanced innovation process.

National Center for Missing and Exploited Children is a private, non-profit 501(c)(3) corporation whose mission is to help find missing children, reduce child sexual exploitation, and prevent child victimization.

International Centre for Missing and Exploited Children works to protect children around the world from going missing by providing resources for governments, law enforcement, NGOs, and families on prevention as well as the appropriate actions to take in the event a child does go missing.

Ernie Allen was the founder and former CEO of the National Center for Missing and Exploited Children and the International Centre for Missing and Exploited Children.

Marketplace Risk is the only conference designed for marketplace and digital platform founders and leaders. The next conference is in New York, September 17.

Regulations referenced in this episode:

Children's Online Privacy Protection Rule (COPPA)

California Age Appropriate Design Code Act

UK Age Appropriate Design Code

Videntifier is dedicated to building video identification tools and solutions that can help organizations accurately identify and address illegal content online. Since 2012, it has helped organizations such as online platforms, law enforcement, and CSAM hotlines efficiently navigate the heavy influx of illegal content.

WebKyte is a MediaTech company. It offers an automatic video recognition engine that detects criminal, adult, and copyrighted content among user-generated videos. The tool gives visibility over what’s available on a platform and automates content moderation.

T3K is a provider of AI solutions for law enforcement and private enterprise. It harnesses the power of AI to find and classify illegal and harmful content.

Karen Shalev is the Director of the Centre for the Study of Missing Persons and a professor in missing person studies at the University of Portsmouth.

LexisNexis Risk Solutions provides customers with innovative technologies, information-based analytics, decisioning tools and data management services that help them solve problems, make better decisions, stay compliant, reduce risk and improve operations.

National Missing Persons and Unidentified Conference focuses on identifying innovative and effective technologies, approaches and strategies in the search, investigation, identification, recovery, and reunification of missing persons, regardless of the circumstances surrounding their disappearance.

FULL EPISODE TRANSCRIPT

Steve Craig: Welcome to the PEAK IDV EXECUTIVE SERIES video podcast, where I speak with executives, leaders, founders, and changemakers in the digital identity space. I'm your host, Steve Craig, Founder and Chief Enablement Officer of PEAK IDV. For our audience, this is a video first series, so if you're enjoying the audio version, please check out the video recording on executiveseries.peakidv.com. There you could watch the full episode. You could read the transcript and you can access any of the resources or links discussed in today's conversation. In this week's episode, I have the distinct honor of speaking with Caroline Humer. Caroline is a child protection expert who has dedicated her life to protecting the most vulnerable populations through her work at international nonprofits and in the trust and safety industry. She's a globally recognized expert in policies and frameworks that fight harms online, and she teaches law enforcement teams around the world how to investigate missing and abducted children. Caroline also helped implement the Amber Alert system in 19 countries, as well as launching the first and largest AI search engine for missing children.

In 2021, Caroline founded HMC Group, LLC with the aim to share her extensive knowledge to help any organized-- organization interested in protecting vulnerable people. She also hosts a bi-weekly podcast series called Missing Persons Uncovered. Welcome Caroline. Thank you for making the time to be on the podcast.

Caroline Humer: Thank you so much for having me. 

Steve: Let's get started. I'd love to learn more about your company, HMC Group. 

Caroline: Yeah. So as you mentioned, I started it in 2021. And it really came after I left the nonprofit world after 18 years in child protection with the National Center for Missing and Exploited Children and the International Center for Missing and Exploited Children.

And what the aim of my consulting company is, is really to help companies strategize and implement how best to protect their clients, their services, products, for children who use their products and -- as well as adults -- from either children who go missing or adults that go missing-- from online child sexual exploitation, so child sexual abuse material, grooming, exploitation, trafficking, and so forth. And find sustainable mechanisms to essentially implement those for the long term, hoping that their services are safer and less risky for the vulnerable people that we have that are online.

Steve: It's fascinating and very important work. You and I met most recently at the Marketplace Risk event. We're both on the advisory board and we had a Marketplace Risk dinner. You shared your background. I thought your background was very fascinating as well. Can you share your personal origin story or your history? 

Caroline: Sure. I think, similar to some of your other guests-- I fell into this world of child protection. So it's not something I was looking for, nor did I know it existed when I looked for a job after university. But originally I'm actually from Switzerland, grew up in the UK, went to boarding school-- international boarding school, which really gave me a vast exposure to international culture. And then I started business studies because I didn't know what else to study. And so I did that for two to three years. And then I worked for the British Transport Police-- that was my first job out of university, which led me into police work. Essentially with-- I was the sole intelligence analyst working on one of the biggest train crash investigations in the early 2000s. So, working with the police, understanding what was needed, and it was an interesting way to get into the work of working with the police and understanding investigations.

Steve: I want to talk about that next. What is the National Center for Missing and Exploited Children? Can you share more about what that organization does? 

Caroline: So when I came over in 2003, from the UK, I essentially started work at the National Center for Missing and Exploited Children, which is essentially called NCMEC.

NCMEC was created in 1984 as the US clearinghouse for all issues related to missing children and child sexual exploitation. I specifically worked on their cyber tip line, which was created in 1998 to specifically focus on the online harms and risks for children, specifically child sexual abuse material, or what some-- still refer to as child pornography, though we don't use that term in the industry. And so I, for two and a half years, reviewed reports from the public as well as from platforms that suspected they had child sexual abuse material, or saw it and wanted to report it. We then analyzed the information, and if we could find a location anywhere in the world, we would then forward it on to the law enforcement officers who would then investigate that actual case.

So we weren't-- we were doing the analytical work and the intelligence work, if you like, to give law enforcement what they need to then determine whether they can investigate or not.

Steve: It's really on the front line of suspicion, is that right? Where you're investigating as it comes through and supporting law enforcement.

Now, I understand within a few years, you transitioned to the International Center for Missing and Exploited Children. Can you share more about how those two organizations interrelate and then what led you to go from national to international? 

Caroline: So the International Center for Missing and Exploited Children, or ICMEC, the international center, is the sister organization of NCMEC.

It was created by the same founder, Ernie Allen, and-- but it's a separate NGO, and the idea of the international center is to help countries all over the world to establish-- or build capacity to handle issues related to child sexual exploitation and missing children. So at that point in 2006, I took over the missing children's division and built up the global missing children's network, which, at the time, consisted of maybe 15 countries. And when I left, it consisted of 30 countries, NGOs, law enforcement, working on missing children issues. Whether that was investigations, protection, putting out posters, or even working with children from a healthcare perspective, we wanted to build a network that is holistic and understands the whole concept, and then essentially train other countries, in their own country, on becoming stronger in that response. And that's where the Amber Alert comes in that you've mentioned. I know we'll talk about that in a bit.

Steve: Yeah, I definitely want to go deeper into the Amber Alert, especially your role in rolling those programs out. I think about the time that you were in first the National Center, then the International Center, just what the internet was like. I, myself, I was an early internet user back to when it was like, modem dial-up bulletin board systems. And, the place you'd meet strangers on the internet was these internet relay chat rooms and then there was AOL and all that. As the internet became more and more mainstream, or it wasn't so niche in population, how did that start to impact the work at the ICMEC, and how did the sort of the risk for children and other vulnerable populations grow as more people got online? 

Caroline: Well, like you, I was an early user, not necessarily from a technical standpoint of view, but I started using, in ‘95, ‘96, emails and started using the internet, as limited as it was. But that was sort of it. And then when I came to the National Center for Missing and Exploited Children, it opened my eyes, in 2003 and ‘04, to what kind of risks there actually are on the internet. But even then, the risks were very different from today, in that at that point it was emails, it was maybe, you know, IRCs, if people knew what that was, and websites.

But we didn't have social media in 2003 and ‘04. We didn't have the dark web. We didn't have smartphones. I remember when I moved over from the UK, barely anybody in the US was using text messages in 2003, because it came way after. So I think there's a whole different change in the risks that we see from 20 years ago-- but I don't know-- I would say that they're similar, or the risk hasn't increased. It is just very different with today's technology than then. But I think the technology we had was simple to use, and now it's just more complex and more diverse in what we've got.

Steve: Yeah, I think back to the late ‘90s, early 2000s, when a lot of households were on dial-up and they were just transitioning into persistent internet connections. You often had just one computer in the house and it was maybe in the living room. And so you could see what your kids were doing online and who they were talking with. And then as things changed, you started to get computers in your room. And then eventually the smartphones-- there's this public service announcement in the late seventies in the US that said “it's 10 PM, do you know where your children are?” And it was meant to be like, “hey, your children are out running in the streets. Like they should be home.” But now it's like your children are home, but they're probably in their room on a smartphone and you might not know who they're talking with. What's your perspective on underage access? Just not specific to the content, but just having-- children having access to the internet from such a young age.

Caroline: I think you can't have no access. I think today's children need to have access to the online world. They don't separate the online world from the real world like we do. And we're just-- I think-- I guess our age would be where we are just on that cusp of getting the native technology people, right? But we grew up as children without technology. So today's children, they need technology. That's something that-- it's part of their life. So for us as adults, it's important that we figure out how to find that balance between being online and having-- and being offline or being in the real world. Where's the connection? How do we make sure that children as well as adults behave the same way online as well as offline? I think that's the really important part. So, it's not about how much time they spend on the mobile phone or online, although that has an impact on their development. It's more about what are they doing while they're online. For what purposes are they-- is a two year old on the mobile phone? For what purpose is a six year old on their phone? Is it schools-- school homework, or is it because they're talking to their cousins or family members across the pond? You don't know, right?

So that's the hard part. So I think it's an individual matter that parents have to understand, have to look at and say, what is good for my child? And then you've got the minimum standards that we try to implement as the industry to say, this is what should be done as overall if you don't want to actually coordinate or communicate with your child about internet usage.

Steve: It's something I personally struggle with. I have two young children that have grown up with access to iPads and smartphones, and now they're getting into that late primary school age and it's very tough because they're savvy. They know what browsers can do and what you can search for and then you can go down a rabbit hole and end up in a forum or somewhere where they're mixed in with adults.

And you wouldn't just send your children off to some physical event that's full of adults unsupervised. So at the same time, it's like you're doing that in a way on the internet and unfortunately, the internet's full of fraudsters, which is something I talk a lot about on the podcast with guests here, you know, scammers, criminals, but also predators that-- sort of manipulate children that convince them that they're one thing when they're really not. And then that unfortunately leads to abductions. 

Going back to the comment you made or your work with Amber Alerts, I'm really curious about how that rolled out. Maybe start by sharing more about what the US Amber Alerts system is and then how demand for that grew internationally. 

Caroline: Yeah, and I think, you know, there's a lot to unpack with the online world and children and access. So I think that could be your-- a separate podcast in itself when you focus on that part. But from Amber Alert to missing children, what-- I started this in 2006 when I went over to the International Center for Missing and Exploited Children, I realized that the US, with its 50 states and territories, had Amber Alert. And Amber Alert essentially started in 1996 after Amber Hagerman in Arlington, Texas. She was playing outside of her house in the front yard and was abducted by a stranger, white van, and they drove off.

That led the public to ask, why can't we use the emergency alert system that the US already has for tornadoes, for weather, presidential addresses, et cetera, to alert the public for missing and abducted children cases? So in 2023, there were about 360,000 missing children under the age of 18 reported to law enforcement in the US. So that's about a thousand a day.

Now, a thousand a day across the US doesn't sound a lot. But take into account that there are probably a number of un-- not reported cases, where children aren't reported to law enforcement, or children who go missing and, just before you report to law enforcement, somebody else finds them.

So there's probably a higher number than a thousand a day. You do not want to get an Amber Alert for a thousand people-- for a thousand people every day. That would defeat the purpose. The purpose is for the public to help when there is a missing child or an abducted child at high risk. So law enforcement created criterias.

Then in between 2000-- or 1996 to about 2003, the US and each state created their own system. So each state has their own Amber Alert program and they issue it for their state only. So there's no federal Amber Alert; there's only statewide alerts. It-- there's also local alerts as well. When I came on board in 2006, I-- then in 2007, we, unfortunately, on the other side of the world, in Europe, had a high profile case, Madeleine McCann from the UK, who went missing.

She was three years old, or just about three or four years old, in 2007. And it made news worldwide, and she's still not found. But because of that interest worldwide, and specifically in Europe, the European Union and the European Commission said, we need to have something, we need to deal with this issue. This is not just a one off case. And so they came to the US, looked at the US system and said, “you know what, we could probably do the same.” So they put money behind their countries in the-- under the European Commission. And said, “let us create Amber Alerts for each of the countries under the EU. Can you help us build that?”

And so they came to me, and I worked then with each of the countries' law enforcement agencies, NGOs, and department of justice equivalents to basically build-- help build their standards, their mechanisms, their training programs, and explain exactly how the system works.
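
To make the thresholds Caroline describes concrete, here is a minimal sketch of an alert-criteria gate. It loosely distills the kind of criteria she mentions (confirmed abduction, an age limit, serious risk, and enough descriptive detail for the public to act on); the field names and exact rules are hypothetical, since each state defines its own.

```python
from dataclasses import dataclass

@dataclass
class MissingChildReport:
    """Illustrative fields an alert decision might weigh (hypothetical)."""
    abduction_confirmed: bool     # law enforcement believes an abduction occurred
    age: int                      # child's age in years
    risk_of_serious_harm: bool    # imminent danger of serious injury or death
    actionable_description: bool  # enough detail on child/captor/vehicle to act on

def should_issue_amber_alert(report: MissingChildReport, max_age: int = 17) -> bool:
    # Alerting on all ~1,000 daily reports would cause alert fatigue, so
    # only confirmed, high-risk cases with actionable detail qualify.
    return (report.abduction_confirmed
            and report.age <= max_age
            and report.risk_of_serious_harm
            and report.actionable_description)

# A confirmed high-risk abduction with vehicle details qualifies;
# a case with no confirmed abduction does not.
print(should_issue_amber_alert(MissingChildReport(True, 9, True, True)))    # True
print(should_issue_amber_alert(MissingChildReport(False, 14, True, True)))  # False
```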

Steve: So in that process, I imagine there's a lot of bureaucracy and challenges with interdependency between organizations and governments and police.

What was your approach to building collisions-- coalitions between those groups? And like, how did you-- this sounds like such a complex task. And then for the number of those that you rolled out?

Caroline: It sounds complex, but when you actually take it apart and look at it piece by piece, it becomes very simple. And that's sort of how I've always dealt with things, even at an early age, when, as I said, I wasn't a tech computer person, but I would spend my weekends taking a Walkman apart and building it back together again to see if it would still work.

So it's like literally piece by piece. It's, for me, it's puzzling-- a jigsaw puzzle, the same thing, except you don't know what the end looks like necessarily. You have to adapt. And so for me, it's piece by piece. And that's alongside my exposure to the international community from boarding school, where I was exposed to 80 different nationalities-- no, 70 different nationalities-- for two years, which allowed me to understand how we can communicate without offending each other, or finding a common ground that allowed us to actually say, “okay, we want the same thing, but we will go about it differently, but the end result will still be the same.”

So with that and the puzzling and put people-- you know, putting pieces together, it allowed me to look at each of the countries and say, “okay, explain to me how it works in your country. What is the structure? How is law enforcement connected to NGOs, or how is law enforcement connected to the prosecutor's office? Who has which role?” That then allowed me, at least in my head or maybe on paper, to understand the diagram and then understand how that works. That allowed me then to say, “okay, here's what we do in the US. You don't need to do exactly what we do, but here's just one way. And if you want to do it differently, because you have maybe a more similar system to Australia or South Africa or Argentina, then here's another way how they've set it up, and maybe that will work better for you.”

And so, we adapted the criteria for Amber Alerts in different ways in all the different countries. We adapted the technology that they used, but in the end, the system works the same way. It's about alerting the public on high risk missing children cases when the pub-- when the law enforcement or the investigating agency needs help to look for that particular child. And that's the important part. So, we all came together. You know, we all did it differently, but the end result was again, the same. 

Steve: That's great. That's great. I think despite differences in cultures and nationalities and laws and rules, at the core humans want their children and their family members to be safe and they want them to be protected.

Caroline: Yeah, and I think it's-- it's, again-- it's common-- finding common ground, right? It's-- if we are going to be different, which we all are as human beings, we have different experiences, we need to find that common ground. What can we live with and what can't we live with? And that's more human than looking at criterias and strategies and legislations, let's say.

Steve: That's great. That's great. Well, shifting gears a bit, thank you for sharing your background and your origin and the great work you've done. I want to talk a little bit about the trust and safety ecosystem. This podcast is predominantly focused on digital identity. Some who've been listening so far might not immediately make the connection; “hey, why are we talking about child exploitation? And then this is an IDV or identity verification podcast.” But I think identity technology has a really important role to play because in marketplaces, in communities where you're connecting different individuals, peer-to-peer, anonymity can be very dangerous. And I want to get your thoughts on the concept of anonymity in trust and safety and just some of the work that you've done in this field.

Caroline: Yeah. So essentially, as part of my growing with-- growing my consulting company, I started the Trust and Safety Forum in conjunction with a colleague of mine in France, which allowed me to really get exposed to trust and safety and what that is. The work that I did with the nonprofits was a small part of trust and safety. There's so much more to trust and safety than just child protection, but talking about anon-- anonymity is… I think I have two hats when it comes to this. One is sort of my personal hat, as a single adult. And then the other one is more as a child protection expert. As a child-- child protection expert, I think one of the things that I look at is that children should have their private space. They should have a space where they can explore. They can experiment in a way that allows them to make mistakes, but not have massive consequences that cannot be undone because of legislation or because of lifelong consequences.

I think we need to understand that children are still children, regardless of whether they're online or offline, and therefore they need to have that space to explore and make mistakes and learn from them. And that is something we need to give them online. Should that mean that they can do this in full anonymity? I don't think so. It should be more of, let's have that safe space. But if something goes wrong, you have a recourse, you have a way to say something and speak out as either a parent, as a victim, or as someone who's looking on where somebody is being victimized. So that's the important part for the platforms and for the online world: that we build something that allows us, or allows our children, to grow up with a balanced view of how the world works. And it's not just safe, but it's also not just danger; it's both. It's a gray world, it's not black and white, as we all keep saying. From a standpoint of view of me being a single adult with nobody in the house but me, I want my anonymity to basically say, I want to go wherever I want to go online, as long as it's legal. I don't want anybody to know what I'm doing online unless I tell them. So how do we make that balance between us as adults, who should have that balance or that anonymity, as well as children, who have the safe space but are able to report if something happens? I don't -- and this is the hard part -- I don't have the answer because I wear both hats.

And so I try to find a balance between the two in these kind of discussions, because it's not one solution fits all. It's a multitude of things. So if we just take one platform, one platform can do it one way. Another platform does it another way. But as long as there's multitudes of structures or tools in place to keep you safe, adult or child, then we can say, do I want it or not, as the user. And I think that's important, to have that structure in place. For platforms to say, “okay, you are a user here. Here are our safety measures. If you don't want any of them, you can opt out, or if you're a child, these are going to stay in because you are a child under age, therefore, that's just how things work in this world.”

Same way as the real world. There, someone under the age of 21 in the US can't just walk into a liquor store and buy something. They have to prove that they're over 21-- or 21 and over. So why not do the same with the online world, where we have that ability to say private space, your own space, but have the anonymity when you need it or want it for various reasons.
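
Caroline's framing here (protections every user starts with, an opt-out offered only to adults, settings locked in for minors) amounts to a safety-by-default configuration. A minimal sketch, with hypothetical setting names, assuming the platform already knows the user's age bracket:

```python
from dataclasses import dataclass

@dataclass
class SafetySettings:
    block_dms_from_strangers: bool
    private_profile: bool
    strict_content_filter: bool
    can_opt_out: bool  # may the user loosen these protections?

def default_settings(age: int, adult_threshold: int = 18) -> SafetySettings:
    # Everyone starts with protections on; only adults are offered the
    # choice to opt out, while minors keep them locked in.
    return SafetySettings(
        block_dms_from_strangers=True,
        private_profile=True,
        strict_content_filter=True,
        can_opt_out=age >= adult_threshold,
    )

print(default_settings(11).can_opt_out)  # False: a child's protections stay in
print(default_settings(34).can_opt_out)  # True: an adult may choose to opt out
```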

Steve: Depending on who you speak with, some are big proponents of anonymity. And it's actually one of the core features of the internet, going back to that New Yorker cartoon where, you know, on the internet, no one knows you're a dog. That was a benefit, that people could voice their opinions without concern for, you know, consequence, which is a double-edged sword, because then they can also be trolls and they can do bad things on the internet.

At the same time, there's been this balance for privacy, and very early on, as you started your work with child protection, at least for the US, we had a regulation come into effect. The Children's Online Privacy Reg.-- I think I say this right-- Children's Online Privacy Protection Rule, COPPA, is designed to prevent the collection of personal information from a child that's 13 and under or under 13-- but that's self attes-- self attestation. So the child could be like, “oh, I'm actually 15” when they're really 11. And most of the companies that are complying with the COPPA rule, like, they just let the child select from, like, a drop down for their age.

When it comes to digital identity, there's this concept of like, knowing who you are, which might go too far knowing a name or address or location. But then there's this concept of just knowing with certainty that you're of a certain age, that you're truly over the age maybe of 18, or you're under a certain age if you're a child. What do you think about age verification or age compliance with respect to this problem space?

Caroline: Yeah, I think, you know, it's a different-- again, it's not black and white. It's sort of a gray area where we need to have age verification. We need to be able to check who is online, but not necessarily have the whole internet know your identity or your DNA as such, but more of, “hey, this person, from the looks of it, they're over 18.” That makes sense, right? Or they're over 13.

The hard part becomes-- is not only, as you said, that the child will just do a drop down and say, I'm 13, when they're actually 11. It's that parents create profiles for their kids. So the parents will choose 13 and above, which, if they would do an age-- if they did an age estimation through a digital identity, then yes, that would-- make sense.

But then they give the password or username to a child, and then what? That's where I think we can't just rely on technology. We can't just rely on age verification or age assurance, but also do education for parents. To say, this is why we're putting age assurance and age verification in place. This is not to hinder your child, but this is to help, because here are the risks, or here are the challenges that you would face if you put an 11 year old onto social media, or an eight year old on social media, or a gaming platform-- these are the exposures. So I think that's one aspect.

The other thing is that with age verification or digital identity-- at least coming from the outside, because it's not really my industry-- I look at it as more of a user perspective, going, “do I trust that technology? What do you do with my data? What happens when you've checked my face, or you've checked my-- or I've given my driver's license as an ID? What happens next with that information?” We've had a lot of leaks. We've had a lot of hacking of private information in this country. So that, and not knowing the technology aspect of it, makes it hard to trust that age verification and age assurance, and the technology that is out there that's being used, is actually making a difference or should be used on a regular basis. That's from a user perspective, not coming from the industry, right? And I think that's the difficulty then when we're talking about kids. Kids will find a way to get around age verification at some point or another.

So, again, it's coming back to: we need to find a way to teach them, educate them, empower them on making those changes and saying, “okay, this is maybe not appropriate for me, or I have my little sister sitting next to me, maybe I shouldn't be watching pornography while she's sitting next to me,” because of whatever reasons, right? So, I think that's where the education comes in, building awareness, and we haven't really done that education as much as we should with children. I think schools need to do more. I think parents need to do more, and certainly we're working on the platforms and we're working on-- from an industry perspective, but everybody has to work together in this sense when it comes to all of the components that we're talking about.
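
The gap discussed in this exchange, between a self-attested drop-down and a verified but data-minimizing age signal, can be sketched in a few lines. Everything below is a hypothetical illustration, not any vendor's actual API:

```python
from dataclasses import dataclass

def self_attested_gate(selected_age: int, minimum: int = 13) -> bool:
    # The drop-down model Steve describes: the platform simply trusts
    # whatever age the user selects, so an 11-year-old can pick 15.
    return selected_age >= minimum

@dataclass
class VerifiedAgeClaim:
    """What a data-minimizing age-assurance provider could return: a yes/no
    answer to one question, with no name, address, or date of birth."""
    over_threshold: bool  # e.g. "is this person 18 or older?"
    threshold: int        # the age the claim attests to
    provider: str         # who vouched for the claim (hypothetical)

def verified_gate(claim: VerifiedAgeClaim, minimum: int = 18) -> bool:
    # The platform learns only "over 18: yes/no", not who the user is;
    # checking the provider's signature on the claim is omitted here.
    return claim.threshold >= minimum and claim.over_threshold

print(self_attested_gate(15))                                   # True, even if the user is 11
print(verified_gate(VerifiedAgeClaim(True, 18, "ExampleIDV")))  # True
```

Neither gate addresses the account-sharing problem Caroline raises: a verified adult can still hand credentials to a child, which is why she pairs the technology with parent education.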

Steve: We've got a series of regulations that are starting to pop-up in the world that are trying to tackle this issue. In California, there's the California Age Appropriate Design Code Act, which is modeled after the UK Age Appropriate Design Code.

What's your perspective on these laws? Like, where are they putting the onus? Is it on the platforms themselves, or dictating what they have to design their product around to support?

Caroline: Yeah, I think a lot of the comp-- a lot of the legislations that are coming in place right now are putting the emphasis on the platforms, saying “you're bad, you're not doing enough. You need to do more.” Which in an essence is right, but we also need to check and understand that they are not the only player when it comes to the online world. Again, education, parents, schools-- we need awareness campaigns, the same as the stop, drop and roll that we had when we were kids, or, as you said, where are your-- where's your child at 10 PM.

We need that kind of awareness. For the age appropriate design codes-- the one from the UK as well as from California-- I think they're a great start. Because what that means is that the companies need to think about safety and privacy from the start. So when they're building a product or a tool that will be used by us, the public, it has those measures in place about privacy, about safety, so that if I'm going to use a new tool, it will tell me “these are the things that we've put in place to make sure that you are safe, or your identity is safe, or your information-- this is what we're doing with your information. This is what's happening when you go onto our platform,” right?

And I think that's the important part. That's a great step forward in putting some responsibilities to the platforms to say, “we need to-- we need to create tools, games, internet sites that are appropriate for the use of anybody that's out there.” And if they're not appropriate for children, then they need to have certain measures in place, whether that is from the safety perspective or from the privacy perspective.

Steve: It's definitely a trend that I've seen in the last few years where traditionally unregulated platforms are now facing regulations. The last few years, it's been more about the emphasis on taxation and fraud and scams, but there are more and more around protecting the end consumers and especially children. But regulations move slowly and sometimes you have periods of gridlock within countries where nothing gets done. 

What role do you think the private sector, specifically the platforms and any of the trust and safety leaders that might be listening to this have? What role do they have in self regulation, like really pushing the frontier without having to have the government tell them to do X, Y, Z? 

Caroline: Yeah, I think it's certainly, from a child protection perspective, and being in this field for 20 years, we've tried the self regulation with platforms on child protection and it hasn't-- let's say they've done some stuff, but not enough. And that's why the regulations are now coming into place. We've got-- there's currently, I think, four or five legislations in front of Congress just focusing on child sexual exploitation, because essentially the industry and the government have said, “look, we've, we've tried, you-- you had the opportunity. We've talked to you and you're not doing it. So we're going to now start regulating you.” Some of it-- again, it's the balance-- and I think we need to find a balance between regulation and policies and the freedom platforms have to do things themselves.

Steve: This audience for PEAK IDV EXECUTIVE SERIES includes a lot of identity verification companies, biometrics providers-- what would your message be to them on how they can do their part? Like what are some gaps in the market, or how can they better serve the platforms to solve these tough challenges? 

Caroline: Yeah, I think part of what I've already said at the-- earlier on is build trust; build trust with your consumers. Build trust with the public that your products are safe, secure, usable without fear of us going, “is my data going to be hacked, or what happens to my information?” And I think that's building awareness. I think-- and you can correct me, and anybody else in the digital identity space can correct me-- again, from a user perspective, I see that you've worked a lot on the technology, and I think the technology works. I think that is fantastic. Now it's a question of, okay, how do we make sure that the implementation of that technology is actually being used? How do we get people to use the digital identity tools that are out there? And that comes with education. That comes with awareness and building that trust, and I think that's probably where the trust and safety and digital identity industries sort of come together a little bit-- and intertwine, because it is building that trust with the consumers, regardless if it's a platform or digital identity tool. It's how do we make sure that our users, our customers, have a good experience and understand that we're there for them. They're the priority, not the profit, not anything else, not the stakeholders, but actually the customer themselves.

Steve: What I see in the market is a gap between-- or a communication breakdown between the solution providers and those end consumers, because often there is an intermediary; it's the platform or it's the bank. And so you may not even know your data is going to a third-party, or in many cases, multiple third parties. And that creates a bit of angst-- and then people just exit from that process.

Caroline: I think a simple awareness campaign, maybe as a collective or with your third-party, if that's the bank, if that is insurance, if that is a platform-- but to be able to say that this is what happens, right? Again, it comes down to transparency: “This is what's going on in our platform. This is why we've hired this digital identity company, and they will be doing X, Y, and Z. And this is what happens when you access our platform with their identity. And this is why we do this.” People will be much more willing to then use the technology and understand why. And I think that's where, as you say, there is that gap. And that's not just, you know, for digital identity; I think that's in many places when it comes to the online world.

Steve: We're coming up on time. I want to transition to another topic, which I-- which I think is related. And that is this explosion and acceleration of AI content, deepfake content. And no doubt these tools are being used for exploitation, catfishing, sextortion, just general trickery; you could create deepfakes of people's voices. I want to talk about the positive side of that, which is using AI for good. And, in the intro, I mentioned you created one of the first AI systems to support the search for missing children. Are there any projects or organizations that you can talk about, how AI is helping us fight some of these evils?

Caroline: I think there's a lot of different-- great startup companies that are working in this space-- at least in the child protection space, certainly also in the copyright space. For example, there-- there's Videntifier from Iceland, there's Webkyte out of Lithuania. You have T3K out of Austria. There-- there's probably too many to mention, but those are just the three that I'm thinking of. Most companies who work in our space-- whether it's digital identity or even fraud or copyright infringement or child protection-- are working in some shape or form with AI.

What we again need to do is understand the use of AI. How good is it? How accurate is it? How willing is it to adapt to a human being? What's their-- what's the limitation of AI? And at what point does a human being need to be involved in making a decision that AI cannot make? And so AI will help us, I think, in navigating the massive amount of data that we have, that we need to weed through to understand what's happening in the online world.

At the same time, we can use AI tools to protect the users and find ways to ensure that AI is being used in an ethical manner. Of course, it's always going to be-- there's always going to be people who use it for bad stuff. Like any other technology or any other thing we create, there's always someone who's going to use it for bad things.

So-- but if we can minimize that-- minimize the usage of bad AI, then we have a better chance of really creating an AI online world, offline world, that allows us to harness the positiveness and the massive greatness of what AI could be.
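
A common building block behind detection tools like the ones Caroline names is perceptual hashing: visually similar images produce nearby fingerprints, so new uploads can be matched against hash lists of known illegal content. The internals of those specific products aren't public, so the following is only a generic minimal sketch of the technique:

```python
def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit perceptual hashes."""
    return bin(a ^ b).count("1")

def matches_known_content(candidate_hash: int,
                          known_hashes: set[int],
                          max_distance: int = 6) -> bool:
    # Flag an image or frame whose perceptual hash lands within a small
    # Hamming distance of any entry on a known-content hash list.
    # Production systems add indexing for scale and, as Caroline notes,
    # a human review step before any enforcement decision.
    return any(hamming_distance(candidate_hash, h) <= max_distance
               for h in known_hashes)

# Made-up 64-bit hash values, for illustration only.
known = {0x9F3A5C210B44E7D8}
candidate = 0x9F3A5C210B44E7DC  # differs by one bit: likely the same image
print(matches_known_content(candidate, known))  # True
```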

Steve: I hadn't heard of any of those companies that you mentioned, so I'll be sure to get the links from you and then I'll include them in the transcript section of this podcast.

So thank you for sharing that. And I agree, it's like when there's a technological advancement, it immediately gets weaponized for bad. And then we're always in this arms race or cat and mouse game of trying to stop the evils that come out of it with better technology. So it's-- I feel like it's-- it's never ending. 

But we are getting close to the end of the podcast episode. I do want to talk a little bit about your podcast. I mentioned in your intro that you have the Missing Persons Uncovered series. Could you please share more about that show and how you got started with that?

Caroline: Yeah, what-- so-- I got started with this after I left the nonprofit world, the ICMEC and the NCMEC, because I sort of felt there was a frustration that missing children weren't getting the same attention as child sexual exploitation and trafficking. All the funding, all the awareness from the media went to human trafficking, went to child sexual exploitation.

And I think that's still probably true today, although it has shifted a little bit. And so one of the things that we wanted to do was to highlight the complexity of missing. So I partnered up with Professor Shalev-- Karen Shalev from the University of Portsmouth. She is a professor there and has done over 10 years of research into missing persons, so she has an extensive library of all the research that's been done on this issue. And so we both felt that frustration. And so we said we will build a podcast that allows us to bring on experts, like you're doing with your EXECUTIVE SERIES, to say, “talk about what you know about this issue. What do you understand about the complexity around missing? What is it that we need to do to build more awareness, whether it's from a health care perspective, a social welfare perspective, a law enforcement perspective, academia, nonprofits, platforms?” So, similar to yours, we essentially invite people on. We do it on a bi-weekly basis. We're now in our third year, or third season. We just launched the third episode of the third season on Wednesday. So in two weeks, or less than a week and a half, we'll do the fourth season-- fourth episode. And we talk to anybody around the world. So, it's not just US or UK focused; we really try to get expertise as varied as we can, including-- I just edited one episode that's coming up in probably two months with a death investigator from South Africa.

Which… you just never know who is working in this space and how it's all connected. In my mind, missing is the starting point of the vulnerability for a child. A child who goes missing is either vulnerable already because something's happening in the house, or they leave the house and they become vulnerable.

And therefore, we as a society need to do the-- to work on being able to protect and put resources behind the child to say, “if you are vulnerable, here are resources and tools to help you.” And that's really what we want to do with the podcast: build awareness of the issue rather than of actual cases themselves.

Steve: I'll be sure to provide a link to your podcast in the transcript as well. And when we were at that advisory board dinner, you mentioned that you also did some live episodes of the podcast, and it was sponsored by one of the more known identity verification companies, LexisNexis Risk Solutions. How can identity verification companies get involved or get into the space or support you?

Caroline: The easiest way is to get in touch with me, but, essentially, how that worked is: under the Department of Justice, the OJJDP-- and I can't remember exactly what that stands for; Office of Juvenile Justice and Delinquency Prevention, I think-- they run a national missing persons and unidentified conference every year, which essentially brings law enforcement from around the country together for three days to talk about issues related to missing. And I reached out to LexisNexis Risk Solutions and said, “look, we're doing this podcast, we want to get more exposure. Is this something you would, would you be interested in?”

They had not heard of the conference. And so they said, “Absolutely. We'll come with you and we can actually talk a little bit about the solutions LexisNexis has for law enforcement.” So it was a win-win. We got exposure because we could talk to participants and speakers about all of their experiences related to missing, and LexisNexis got access to law enforcement that didn't know what kind of resources they had.

So anybody that's interested in that, absolutely reach out to me, reach out to the National Missing Persons and Unidentified Conference. You can find that online, or we can put a link to it also. That way, we can talk about next year. It always takes place in Vegas. So that's a great place, if anybody likes Vegas.

Steve: Yeah. A lot of identity verification conferences and FinTech conferences end up in Las Vegas. And maybe it's easy flight schedules, I guess, is the main driver.

We're going to wrap up today's episode, Caroline. Before I finish, though, if you've seen episodes of EXECUTIVE SERIES, I'd like to go a little, a little bit beyond the professional profile and your LinkedIn background and whatnot.

Can you share with us the things you like to do from a leisure perspective? Like, where do you spend your time when you're not doing international protection work? 

Caroline: So that's-- I like to travel-- that's the easiest. I love to explore new countries, new places. So one of the things that I started with my stepmom two-- three years ago is road trips through the US-- she is a non-American and she loves America and said, “I want to, I need to know this country.” I was like, well, we spent a lot of time in Wyoming, and said, “Wyoming is not the typical US, and nor is New York City. So let's drive around the country and you can-- and we can explore.” And so every year we do different parts of the US for about 10 days. And we actually just finished one, the Northwest corner, with Oregon, Idaho, Washington state. And for the first time we went into British Columbia, and then back into Montana and Wyoming. Next year will be Alaska. So that will be another state that we'll try to cover. And this year I'll do two road trips, because I'll do one in Europe, in North-- Northern Spain. So I love to explore and just take a moment to enjoy the world and the beauty that this world has, and not always think about the negative and the risks or the challenges, but really understand that there's a balance between the two. And this world is a great world to live in. And we have great things that work really amazingly and are beautiful. And the negative is really just a small part of the world that we need to work on.

Steve: That's very cool. Do you like to do RVs or you drive and hotel? Like what's your typical? 

Caroline: We do B&Bs. We have a Ram 1500 truck that will-- that we use in the US, so we drive in comfort. And then on the day, we'll just do a Google search and see what we can find for a hotel or, usually, a B&B. We don't like to stay at the normal hotel stuff. So we try to find B&Bs, because we usually stay just one night.

So we found some interesting B&Bs that are theme-based. As much as just lovely families who open up their house for the night for anybody that's traveling through. So that's a great way to meet, again, Americans, and learn about the country that is so varied depending on which part of the-- the side of-- which state you're in, which city you're in, or not even a city, but a town. You learn so much from the-- from others. So we sort of wing it and sort of say, “let's-- let's figure this out in the end. If we don't find anything, we can always stay in the truck and sleep there.”

Steve: Well, for those that are listening to this internationally that aren't based in the US, when you go to the US and you do a road trip, know that different drivers behave differently in different states and regions. Like, it's not all the same. LA versus Rhode Island versus, yeah, Wyoming. All different. 

Caroline: It's not even that they drive differently. The laws are different. 

Steve: That too. 

Caroline: So you have to learn the laws too. And that's a difficult one. So always, always research which states you're going to go through to understand the laws, as well as know your, know your rules, know-- stay under the speed limit. Make sure you know when it's a four-way or a one-way street, or how to park. You're not allowed to park on the opposite of-- opposite side of the road. Always park on the side of the road that you're driving on. My mistake when I moved here, so…

Steve: You don't want to get a ticket if you're here just visiting because it could be complex, but very cool.

Well, thank you for sharing that. For those that are watching this episode or listening to it, what types of conversations are you looking to have from the audience? Would you like them to reach out to you on any specific topics? 

Caroline: I honestly-- I would love to learn more about digital identity and how-- how we can connect between the digital identity and trust and safety.

I think we sort of hit on a couple of points. I do think there's a way that we can connect the dots on that, instead of reinventing the wheel, and actually learn from each other and collaborate. I think that would be great. So anybody who's interested in having a conversation with me, feel free to reach out on LinkedIn or through my website. That's the easiest way to get in touch with me. And, my phone's hip-- is on my hip. So I will respond as quickly as I normally do.

Steve: Perfect. Well, I'll be sure to put a link to your LinkedIn profile and your website, Caroline, thank you so much for taking the time to speak with me today. The work you do is truly inspiring, and I appreciate the positive impact that you're having on the world. And thank you for being on the podcast. 

Caroline: Thank you so much, Steve, for allowing me to share this and I hope your audience enjoys this and finds it interesting. And I'm sure our paths will cross soon again.
