Author Topic: Telegram faces UK ban threat as Ofcom launches massive safety investigation

javajolt (Administrator) • windows10newsinfo.com
The UK regulator is cracking down on Telegram and teen chat apps over illegal content, with potential fines reaching 10% of global revenue.

The UK digital regulator, Ofcom, has launched enforcement action against Telegram after evidence suggested child sexual abuse material (CSAM) was being shared on the platform. The investigation falls under the Online Safety Act (OSA) and will examine whether the platform is meeting its obligations to stop CSAM from being shared.

The regulator also revealed that it was opening investigations into Teen Chat and Chat Avenue to see whether they were meeting their duties to protect children from being groomed by predators.

Under the OSA, platforms providing user-to-user services must tackle the sharing of CSAM. Ofcom said that it works with law enforcement agencies to identify platforms that are being used by offenders to share such material. Most recently, it received evidence from the Canadian Centre for Child Protection about the alleged existence and sharing of CSAM on Telegram.

Ofcom launched the investigation in response to this report. If it finds the company has broken the law, it can require Telegram to take specific actions to come into compliance. It can also impose fines of £18 million or 10% of qualifying worldwide revenue, whichever is higher. If Telegram continues not to comply, Ofcom could ask a court to block the service in the UK or require payment providers and advertisers to withdraw their services from the platform.

Commenting on this development, Suzanne Cater, Director of Enforcement at Ofcom, said:

“Child sexual exploitation and abuse causes devastating harm to victims, and making sure sites and apps tackle this is one of our highest priorities. It’s why we work so closely with partners in law enforcement and child protection organisations to identify where these harms are occurring and hold providers to account where they’re failing to meet their obligations.

“Progress has undeniably been made, particularly with file-sharing services, which are too often used to share horrific child sexual abuse imagery. But this problem extends to big platforms too, and teen-focused chat services are too easily being used by predators to groom children. These firms must do more to protect children, or face serious consequences under the Online Safety Act.”

source