IWF supports the Online Safety Act by helping adult sites detect, remove, and prevent child sexual abuse imagery online.
The IWF will provide hashes of child sexual abuse images to the online industry to speed up the identification and removal of this content worldwide.
The IWF's role under UK government legislation on the possession of non-photographic visual depictions of the sexual abuse of children.
These ‘shocking’ images of children can involve penetrative sexual activity, sexual activity with an animal, and sadism.
A list of ‘digital fingerprints’ of known child sexual abuse imagery allowing you to stop it on your networks, platforms and apps.
A unique list of words, phrases, and codes offenders may use to conceal child sexual abuse imagery on legitimate networks and platforms.
The Internet Watch Foundation (IWF) has hashed more than a million images in a ‘major boost’ to internet safety.
Stability AI, the world’s leading independent open-source generative AI company, has partnered with the Internet Watch Foundation to tackle the creation of AI-generated child sexual abuse imagery online.
Explore how IWF identifies and addresses non-photographic child sexual abuse imagery, including drawings and CGI, under UK legislation.
IWF and Black Forest Labs join forces to combat harmful AI-generated content. The partnership grants the frontier AI lab access to safety tech tools.