Ofcom, the United Kingdom’s communications regulator, has formally opened an investigation into Telegram following reports that child sexual abuse material (CSAM) is being shared on the platform. The move marks a significant escalation in the British government’s effort to enforce the Online Safety Act 2023, legislation designed to hold digital platforms accountable for the content they host.
The investigation was prompted by evidence provided by the Canadian Centre for Child Protection, which Ofcom supplemented with its own internal assessment. For Telegram, a platform that has long positioned itself as a bastion of privacy and minimal interference, the inquiry represents a direct challenge to its historically hands-off moderation philosophy. The regulator is specifically examining whether the company has failed to meet its legal obligations regarding the removal of illegal content.
This regulatory friction arrives as Prime Minister Keir Starmer’s administration signals a desire for even stricter controls, including a potential social media ban for children under 16. Telegram has "categorically" denied the allegations, asserting that it has significantly curtailed the spread of public CSAM since 2018. But as London seeks to define the boundaries of digital safety, the platform’s claims of internal vigilance are now being subjected to official, external scrutiny.
With reporting from InfoMoney.