
UK Opens Formal Investigation Into Telegram Over CSAM and Child Safety Compliance Concerns

The United Kingdom’s communications regulator, Ofcom, has launched a formal investigation into Telegram over concerns that the platform is being used to share child sexual abuse material (CSAM).

The probe is being carried out under the UK’s Online Safety Act, which requires online services to take strong action against illegal content, including preventing the distribution of CSAM. Ofcom is investigating whether Telegram has failed or is failing to meet these safety obligations.

The regulator stated that it received evidence from the Canadian Centre for Child Protection indicating the presence and sharing of CSAM on Telegram. Ofcom also carried out its own assessment of the platform before deciding to open the investigation.

Telegram has rejected the allegations, claiming it has “virtually eliminated the public spread of CSAM” on its platform since 2018. The company also expressed concern that the investigation may be part of a wider effort targeting online platforms that prioritize free expression and privacy.

Alongside Telegram, Ofcom has also launched investigations into two teen-focused chat platforms, Teen Chat and Chat Avenue. These probes focus on whether predators are using the services to groom children and whether the platforms are properly identifying and mitigating such risks.

In a separate action, Ofcom is also investigating X over concerns involving non-consensual sexually explicit content generated using the Grok AI chatbot.

Under the Online Safety Act, Ofcom has significant enforcement powers. If a platform is found in violation, the regulator can impose fines of up to £18 million or 10% of global revenue, whichever is higher. In severe cases, it can also seek court orders to restrict access to services in the UK, including requiring internet providers, payment processors, or advertisers to cut ties with non-compliant platforms or to block access entirely.

