More children than ever are becoming victims of online sexual abuse, as technology gives abusers unprecedented access to them.
The IWF is one of the most effective hotlines in the world at removing child sexual abuse imagery from the internet, but this has only been possible thanks to key international partnerships.
Every five minutes, our analysts in Cambridge find and remove an online image or video of a child suffering sexual abuse.
Help for people aged 18 or over who may be experiencing online blackmail, sexually coerced extortion or 'sextortion' involving nude images, videos and/or money.
Tech Monitor spoke to the IWF’s chief technology officer Dan Sexton about how his team is developing bespoke software to support the charity’s work.
An increase in sophisticated AI-generated images of child abuse could result in police and other agencies chasing "fake" rather than genuine abuse, a charity has said.
For nine years, Chris Hughes has fought a battle very few people ever see. He oversees a team of 21 analysts in Cambridge who locate, identify and remove child sexual abuse material (CSAM) from the internet.
A rise in child sexual abuse material has been linked to websites hosted in EU countries, according to the Internet Watch Foundation (IWF).
The Voluntary Principles to Counter Online Child Sexual Exploitation and Abuse are a set of 11 actions tech firms should take to ensure children are not sexually exploited on their platforms.
Elliptic, a global leader in digital asset decisioning, has partnered with the Internet Watch Foundation (IWF) to strengthen efforts in preventing the financing of child sexual abuse images and videos through cryptocurrencies and blockchain infrastructure.