Cisco report: why the IWF can never completely rely on artificial intelligence
Harriet Lester, IWF Technical Projects Officer
"Technology is great but it will never completely replace human judgement."
A great piece of research published by Internet Watch Foundation Member Cisco finds that 39 percent of security leaders are reliant on automation, 34 percent are completely reliant on machine learning, and 32 percent are highly reliant on artificial intelligence (AI).
Like many of these organisations, the IWF is continuously advancing its own technology. As a world leader in the fight against child sexual abuse imagery online, we face an ongoing challenge to innovate as offenders seek new ways to evade detection on the internet. We know the value of bots, crawlers and the thousand and one different ways technology gives us to interrogate online content faster and more accurately.
As you can see from our list of services, we invest heavily in technology and couldn’t do the work we do without it. With lots of new advancements in the pipeline, we’re excited about how much the IWF’s work will progress in the next few years thanks to machine learning.
However, the way we work suggests the human element will always play a crucial role, despite the light Cisco has shone on the sector’s growing reliance on technology.
In the online industries of tomorrow there will still be jobs which only a set of human eyes can truly and reliably perform, and the kind of sensitive assessment of content which our analysts at the IWF carry out will surely be one of them.
It is really important to us that we are not solely reliant on automation, however great an aid it is in the work we do. Our analysts remain indispensable: they provide an eyes-on approach for all our reports, ensure previously unseen victims are not missed, and interpret what is often going on off screen, or the domestic context within which content has been created.
Machine learning is going to play a big part in the future. There are potentially millions of images out there, and we don’t have the resources to have eyes on every single one.
But for us, the future of machine learning lies in working hand-in-hand with our analysts: the machine flags images and gives us a heads-up, while a human confirms any illegal material and rules out any anomalies.
Picture a million images: a machine could be trained to learn what a child abuse image typically looks like, then scan the internet, flagging and prioritising what it thinks is a child abuse image. That would lead us to new victims and let us take action against more images than we would ever have had time for without artificial intelligence.
Our analysts look for clues in child sexual abuse images and videos which help track down victims: the clothes they wear, accents or names heard on audio, or even where someone lives, identified by the products they buy. Maybe one day machines can help with all of this, but right now our analysts are absolutely vital to this task, and there is no room for error when it comes to victims of child sexual abuse.
To read the 11th Cisco® 2018 Annual Cybersecurity Report (ACR), click here.
Want to find out more about becoming a Member of the IWF? Head to our Become a Member page on our website.