
Thorn CEO on using machine learning and tech partnerships to tackle online child sex abuse

The CEO of US-based non-profit Thorn explains how cross-industry collaboration and machine learning are helping her organisation stay one step ahead of online child sex abusers

The ongoing democratisation of technology, coupled with the spread of internet connectivity across the globe, means it has never been easier to get access to information wherever and whenever we need it.

The trend has brought innumerable benefits to many of us, but not for all. In some instances, the information and data the more nefarious web users seek out online serves only to harm society’s most vulnerable groups.

Technology not only gives the perpetrators access to data, but also a wider range of ways and options when it comes to inflicting their harm, and that is certainly true of online child sexual exploitation.

The borderless nature of the internet makes it harder to track down victims, while the anonymity it provides offers those responsible numerous places to hide, which makes curbing the spread of the associated imagery and web content an increasingly difficult job for law enforcers around the world.

For the past five and a half years, US-based non-profit Thorn has been on a mission to fight fire with fire, using technology to clamp down on a crime that technology has inadvertently enabled.

Co-founded in 2009 by Hollywood actor Ashton Kutcher and his then wife Demi Moore, Thorn initially set out to use technology to tackle child sex trafficking online, before going on to extend its remit to include combating the growing problem of online child sex abuse.

“Kutcher and Moore were seeing the role technology was playing in extending a crime that was disproportionately affecting children, and yet technology had not been playing a part in its solution,” Julie Cordua, CEO of Thorn, tells Computer Weekly.

Abusers are keen adopters of cutting-edge technologies that aid and extend their ability to commit crimes, or help mask their activities, says Cordua.

To support this point, she cites several instances where seemingly innocuous technologies, such as live-streaming tools and virtual reality headsets, have been misappropriated by abusers, contributing to the democratisation of child abuse online.

“When someone can abuse a child literally in their home by opening their laptop, it makes it much more accessible, and people behind the screen feel a sense of freedom and anonymity they didn’t feel when going to trade child abuse material in the past,” she says.

Back then, the spread of such material often required perpetrators to visit arranged pick-up and drop-off points in their local area, and hope nobody cottoned on to what they were doing.

“Now, you just need to know the right search terms and you have child abuse material at your fingertips,” she says.

Collaborating to tackle online sex abuse

The wide-ranging role technology plays in this crime means Thorn, and the law enforcement officers it supports, have adopted a holistic and collaborative approach to tackling the problem of online child sex abuse.

Specifically, the focus of its work is on using technology to help identify victims, and on forging close ties with some of Silicon Valley’s household names and best innovators, she says.

“We have all this data online, and, instead of it being a detriment to us, we should be able to use it to find the victims in the abuse material more quickly.”

“We have to think about how technology can be used to stop these crimes, and how we can get the minds building the new tools to think about how the technologies they make could be used for abuse, and try to be part of the solution instead of thinking about it too late.”

The Thorn technology taskforce

The effort the organisation puts into collaborating with the technology industry is essential to ensure Thorn and the law enforcers who depend on its services can keep pace with the different tools and platforms perpetrators use to carry out their crimes.

To date, the organisation has partnered with more than 20 companies, including Facebook, Google, Microsoft and Salesforce, through its Thorn Tech Task Force initiative. These companies have all pledged to donate knowledge, time and resources to support its work.

As an example of the difference these partnerships make to the work Thorn does, Cordua points to the firm’s collaboration with Microsoft, which is supporting the non-profit’s bid to create new facial recognition tools and services to boost its ability to identify victims.

“Facial recognition is very important to what we do, because if we can take an image of a child that’s been abused and cross-reference it with data from the open web, and find some other place they may be, such as on social media, we can identify and find them more quickly,” she says.

Existing facial recognition tools have their limitations in this context, because of the low quality of the images the organisation is dealing with and the fact these tools struggle with detecting age progression in children, says Cordua.

Through its partnership with Microsoft, the organisation has made progress on both fronts, drawing on the Redmond-based software giant’s open source facial recognition and age progression tools. Microsoft has also lent Thorn some of its staff to work in-house on the project.
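
To make the approach concrete, the sketch below shows the general shape of embedding-based face matching: embed a reference image of the victim, then rank scraped open-web images by embedding distance. It uses the open source face_recognition library as a stand-in; the Microsoft tooling Thorn actually uses is not public, and the file names and tolerance threshold here are illustrative.

```python
# Illustrative sketch of embedding-based face matching, not Thorn's
# actual pipeline. Uses the open source face_recognition library.
import face_recognition

# Embed the reference image (assumes exactly one face is present).
reference = face_recognition.load_image_file("reference.jpg")  # hypothetical path
reference_encoding = face_recognition.face_encodings(reference)[0]

def best_match(candidate_paths, tolerance=0.6):
    """Return the candidate image whose face is closest to the reference.

    tolerance is the maximum embedding distance to count as a match;
    0.6 is the library default, and would need tuning for the
    low-quality imagery the article describes.
    """
    best = None
    for path in candidate_paths:
        image = face_recognition.load_image_file(path)
        for encoding in face_recognition.face_encodings(image):
            distance = face_recognition.face_distance([encoding], reference_encoding)[0]
            if distance <= tolerance and (best is None or distance < best[1]):
                best = (path, distance)
    return best

match = best_match(["scraped/profile_001.jpg", "scraped/profile_002.jpg"])
if match:
    print(f"Possible match: {match[0]} (distance {match[1]:.2f})")
```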

“We, as a small entity, are never going to be world-renowned experts in facial recognition, but many of our partners are. So working out how to bring that technology in and deploy it in that environment is critical,” she says.

Working with smaller technology providers

Thorn is not averse to working with smaller, more niche technology providers either. One such partner is distributed, in-memory database company MemSQL, whose technology forms part of the underlying architecture of Thorn’s facial recognition tools.

“Part of the challenge with facial recognition is the processing power,” says Cordua. “When you’re dealing with 150,000 escort adverts online per day, each ad can have anywhere between one and ten pictures associated with it.

“After we’ve been collecting that information over several years, we have millions of images that we are working with, and processing those images is a very compute-intense process.”

MemSQL offered up its database and data warehousing technologies to help Thorn with this, while affording it the opportunity to prototype some additional functionality the software company was looking to build into its platform. 

“They come in, give us software for free and work hand-in-hand with our developers to customise a solution that increases the processing power and speed needed to analyse millions of images. That cuts our processing time dramatically for this type of work,” she says.

“They also get to prototype a feature that will be useful to their customers, so it’s a win-win for everyone.”
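
MemSQL speaks the MySQL wire protocol, so a workload of this shape might look roughly like the sketch below: persist one feature vector per scraped image, then scan and rank them. The schema, connection details and brute-force similarity scan are hypothetical, for illustration only; in a real deployment the distance computation would be pushed into the database.

```python
# Hypothetical sketch of the workload described above: storing per-image
# face embeddings in a MemSQL (MySQL wire-compatible) table so millions
# of vectors can be scanned. Table and column names are illustrative.
import numpy as np
import pymysql

conn = pymysql.connect(host="memsql-host", user="app", password="...",
                       database="spotlight")  # hypothetical cluster

def store_embedding(ad_id: int, image_url: str, embedding: np.ndarray):
    """Persist one image's embedding as raw float32 bytes."""
    with conn.cursor() as cur:
        cur.execute(
            "INSERT INTO ad_image_embeddings (ad_id, image_url, embedding) "
            "VALUES (%s, %s, %s)",
            (ad_id, image_url, embedding.astype(np.float32).tobytes()),
        )
    conn.commit()

def nearest(query: np.ndarray, limit: int = 10):
    """Brute-force scan: fetch embeddings and rank by cosine similarity."""
    with conn.cursor() as cur:
        cur.execute("SELECT ad_id, image_url, embedding FROM ad_image_embeddings")
        rows = cur.fetchall()
    scored = []
    for ad_id, url, blob in rows:
        vec = np.frombuffer(blob, dtype=np.float32)
        sim = float(np.dot(query, vec) / (np.linalg.norm(query) * np.linalg.norm(vec)))
        scored.append((sim, ad_id, url))
    return sorted(scored, reverse=True)[:limit]
```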

Working with machine learning experts

Another example of how Thorn’s technology partnerships have generated tangible results is through its work with machine learning experts Digital Reasoning.

This collaboration paved the way for the development of its web-based tool Spotlight, which is used by law enforcers to sift through the hundreds of thousands of online sex trafficking adverts to find, identify and assist with the rescue of child victims.

The tool launched two years ago, and is now reportedly used by 4,000 law enforcers across all 50 US states. In the past 12 months alone, it has led to the identification of 1,980 child victims of online sex trafficking, as well as 4,545 adult victims.

The app processes between 150,000 and 200,000 adverts a day, and assists law enforcers with establishing links between the data they contain in order to identify traffickers, and locate and protect their victims.

“Somewhere in amongst those adverts is a child, but the ad is not going to tell you it’s a child. All the ads will say is that the person is 18 or older, and they won’t say they’re advertising sex because that’s illegal – but they are,” says Cordua.

“The machine learning comes into play here, because we have to teach the algorithms what it looks like when a child is being advertised, and you use data to do that. The adverts then inform the prediction model going forward.”
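The loop Cordua describes is a standard supervised-learning one: label historical adverts using the outcomes of past investigations, train a model on them, and score new adverts as they are scraped. A minimal, illustrative sketch of that shape might pair text features with a simple classifier; Digital Reasoning’s production models are proprietary, and the example adverts and labels below are invented.

```python
# Illustrative sketch of the classification step described above, not
# Digital Reasoning's proprietary system: learn what an ad offering a
# minor tends to look like from labelled historical ads.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# ads: raw advert text; labels: 1 if a past investigation confirmed the
# ad involved a minor, else 0 (hypothetical training data).
ads = ["new in town, young and fresh ...", "mature companion, 30s ..."]
labels = [1, 0]

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=1),  # word and bigram features
    LogisticRegression(max_iter=1000),
)
model.fit(ads, labels)

# Score newly scraped ads; high-probability hits surface to investigators first.
for ad in ["petite, just arrived, available now ..."]:
    print(ad[:40], model.predict_proba([ad])[0][1])
```

Each confirmed investigation can then feed back into the training set, which is the “adverts then inform the prediction model” loop Cordua describes.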

Keeping up with perpetrators

Before Spotlight, law enforcers were forced to scroll manually through the adverts, recording any data they thought might be significant.

But with adverts often disappearing within hours of being posted, and with perpetrators going to great lengths to obfuscate phone numbers and other key data, keeping pace was difficult.
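
Part of that obfuscation involves writing phone numbers as spelled-out digits or lookalike characters. A normalisation pass of the kind such tooling needs might look like the hypothetical sketch below; the substitution tables are illustrative, not taken from Spotlight.

```python
# Minimal, illustrative sketch of de-obfuscating phone numbers in advert
# text (digits spelled out, lookalike characters, scattered punctuation).
import re

DIGIT_WORDS = {
    "zero": "0", "oh": "0", "one": "1", "two": "2", "three": "3",
    "four": "4", "five": "5", "six": "6", "seven": "7",
    "eight": "8", "nine": "9",
}
LOOKALIKES = str.maketrans({"o": "0", "O": "0", "l": "1", "I": "1"})

def extract_phone_numbers(text: str):
    # Replace spelled-out digits ("five five five ...") with numerals.
    for word, digit in DIGIT_WORDS.items():
        text = re.sub(rf"\b{word}\b", digit, text, flags=re.IGNORECASE)
    # Map common digit lookalikes, then find runs of 10+ digits that may
    # be broken up by spaces and punctuation.
    text = text.translate(LOOKALIKES)
    candidates = re.findall(r"(?:\d[\s\-.()*]*){10,}", text)
    return [re.sub(r"\D", "", c) for c in candidates]

print(extract_phone_numbers("call me at five five 5 - 0 1 two 3 4 six 7"))
# -> ['5550123467']
```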

According to Thorn’s data, the roll-out of Spotlight has contributed to a 60% reduction in the time law enforcers devote to investigating these types of crime, by cutting the manual effort that goes into sifting through this data.

On the back of this success, Spotlight is now being trialled in Canada and parts of Europe.

Protecting the investigators

The act of manually trawling through child abuse images and data can have a traumatising impact on the law enforcers tasked with doing it, and Spotlight and Thorn’s other tools are all designed to minimise the amount of exposure they have to it.

For example, in Spotlight, the images can be blurred out, negating the need for investigators to look at them unless they really need to, while some of the newer technologies Thorn is developing do not require investigators to view the pictures at all.
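
A blur-by-default preview is straightforward to sketch. The snippet below is a minimal illustration using the Pillow imaging library, not Spotlight’s actual implementation; the file paths and blur radius are illustrative.

```python
# Illustrative blur-by-default image view, in the spirit of the
# Spotlight behaviour described above. Uses Pillow.
from PIL import Image, ImageFilter

def blurred_preview(path: str, radius: int = 24) -> Image.Image:
    """Return a heavily blurred copy of the image; the original is only
    opened on an explicit, deliberate request by the investigator."""
    image = Image.open(path)
    return image.filter(ImageFilter.GaussianBlur(radius))

preview = blurred_preview("evidence/ad_photo.jpg")  # hypothetical path
preview.save("evidence/ad_photo_blurred.jpg")
```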

“It’s really helpful for investigators who can do their work without having to go on these websites that are 100% focused on child pornography because it is incredibly traumatic,” she says.

“Most investigators go through therapy, and we offer therapy for our team working on this content, because you know the work needs to be done; but we have to protect the people on the frontlines.”

This can also help reduce the levels of churn and brain drain often seen in investigative teams, which can have a detrimental impact on their ability to solve online crimes.

“A lot of law enforcement divisions have a mandatory rotation where they only let people work on this subject matter for a few years, which is understandable from a mental health perspective,” she says.

“The technology in this space is moving so quickly, though, you’ll often lose the institutional knowledge about how to investigate. What you want is for people to be resilient, taken care of mentally and still be able to do this work. That has definitely gone into the design of the tools we’re building.”

In-house development

Thorn began taking steps to build out its in-house engineering capabilities about two years ago so it could start developing its own tools and technologies to further its work.

This began with the recruitment of engineers on a year-long fellowship basis to see if Thorn’s ambition to create its own products in-house would be a viable option.

“We recruited engineers and data scientists, and it worked well,” she says. “We saw a lot of progress with the work they were able to do.

“It also gave us the confidence to say we could be the first in the world to create a centre of excellence with amazing engineers which, coupled with our subject matter knowledge of online child abuse, would help us make the most dramatic impact possible in this field.”

Tapping into the Innovation Lab

The team responsible for Thorn’s in-house development is part of what is known as the organisation’s Innovation Lab, and is based in San Francisco.

From a recruitment perspective, the location of its Innovation Lab ensures there is no shortage of technology talent for the organisation to tap into as and when needed.

“Great engineers can put their talent to work at Thorn by still building great products and taking on challenging issues for a social good,” she says.

“When we recruit, we often have an advantage, but it takes finding the right person who is at the right time in their career and life, where they have made a conscious decision to lend their talent to an issue such as this.”

Gaining intelligence from anonymised communities

At the moment, the Lab is developing a Dark Web-focused investigation tool that is geared towards gleaning intelligence on child sex crimes from the anonymised communities that reside there.

It is also seeking to boost cross-industry collaboration between tech firms through the creation of a data-sharing service that will allow various parties to distribute intelligence about what abuse looks like on their various platforms.  

“We hope that sharing intelligence about what abuse looks like on their platforms will allow these companies to learn from each other, and get better at identifying and removing bad material,” she says.
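
One established pattern for this kind of exchange is sharing fingerprints of known abuse material rather than the material itself; Microsoft’s proprietary PhotoDNA works this way. The sketch below substitutes an open source perceptual hash to show the general mechanism only; the hash list and distance threshold are hypothetical.

```python
# Hedged sketch of fingerprint sharing between platforms: exchange
# perceptual hashes of known-bad images, never the images themselves.
# Uses the open source imagehash library as a stand-in for proprietary
# systems such as PhotoDNA.
import imagehash
from PIL import Image

# Fingerprints contributed by partner platforms (hypothetical feed).
shared_hashes = [imagehash.hex_to_hash(h) for h in [
    "d1c1b2a2e4f40810",
]]

def is_known_bad(path: str, max_distance: int = 6) -> bool:
    """Flag an upload if its perceptual hash is within max_distance bits
    of any shared fingerprint (tolerates re-encodes and small edits)."""
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - known <= max_distance for known in shared_hashes)

if is_known_bad("uploads/incoming.jpg"):  # hypothetical path
    print("Match against shared hash list - queue for review and removal")
```

Matching on fingerprints rather than raw files means no platform has to redistribute abuse imagery in order to help others detect it.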

For Cordua and the team at Thorn, their goal is to turn the tables on child sex abusers by using the data the perpetrators create and share online to track them down and bring them to justice.

“When you interview offenders who traded child abuse material, you often hear them say they didn’t know it was illegal – and if it was so illegal, why was it so easy to find, and why did no-one try to stop them? That’s what we’re here to do,” she says.
