IWF welcomes award-winning AI firm to its child-protection community

Published: Wed 19 Jul 2023

Image Analyzer provides artificial intelligence-based content moderation technology for image, video and streaming media.  

Its award-winning technology swiftly identifies visual risks on digital platforms, including illegal content and images and videos deemed harmful to users, especially children and vulnerable adults.

The IWF helps its Members to reduce the risk of users finding illegal imagery online by providing cutting-edge datasets and alerts to block and disrupt child sexual abuse images and videos online.

Image Analyzer, which is also a member of the Online Safety Tech Industry Association, will use the IWF’s Keywords List.

The Keywords List is a unique compilation of the words, phrases, and codes often used by offenders to conceal criminal child sexual abuse material online. It’s particularly helpful for moderation purposes in chat, gaming, and forum environments.   

Nick Drew, Image Analyzer Senior VP of Global Sales, said: “Image Analyzer detects visual threats with unique artificial intelligence-based content moderation software.  

“The technology helps organisations minimise corporate legal risk exposure, protect brand reputation and comply with online safeguarding regulations by recognising visual, harmful material, including pornography, extremism and graphic violence in images, videos and streaming media.  

“Image Analyzer is committed to further developing its Visual Intelligence technology to aid safeguarding and protecting human moderators from the worst excesses of internet-based content. The company is a firm supporter of the work and ethos of the IWF and is proud to be a corporate member.” 

Susie Hargreaves OBE, Chief Executive of the IWF, said: “IWF analysts identified more than 255,000 webpages containing horrific child sexual abuse in 2022. Each of these web pages could contain thousands of images and videos depicting the rape and exploitation of children, amounting to millions in total. 

“We work with tech companies to ensure that as many of these images are blocked and removed from the internet as possible. By working together with organisations like Image Analyzer, we can help stop the repeated traumatisation of survivors who continue to suffer while imagery of their abuse is traded and shared on the internet.” 

Find out more about becoming a Member and the services the IWF can provide.
