eSafety Regulator Seeks Platform’s Policy on Extremist and Child Sexual Content

Australia’s online safety regulator on Monday fined social networking company Telegram nearly AU$1 million for delaying its response to questions about how the company regulates violent extremist and child sexual abuse content on its platform.
Australia’s eSafety commissioner said Telegram took more than 160 days to respond to a reporting notice asking how it tackled terrorist, violent extremist and child sexual abuse content on its platform. The company missed the commissioner’s May 6 deadline, triggering an infringement fine of AU$957,780.
“If we want accountability from the tech industry we need much greater transparency,” eSafety commissioner Julie Inman Grant said. “These powers give us a look under the hood at just how these platforms are dealing, or not dealing, with a range of serious and egregious online harms which affect Australians.”
The commissioner’s action follows similar transparency notices sent in March 2024 to Meta, WhatsApp, Google, Reddit and X, formerly Twitter, asking how they address illegal activities on their platforms. Telegram was the only recipient to miss the May 6 deadline.
Inman Grant said eSafety could have levied a larger fine against Telegram but believed the figure reflected both the seriousness of the delayed response and the company’s subsequent improvement in its engagement with eSafety. Telegram now has 28 days to pay the fine, seek an extension to pay or request a withdrawal of the infringement notice.
Australia’s Online Safety Act, adopted in 2021, empowers the eSafety commissioner to hold social media platforms, search engines, internet service providers, app distribution services and electronic messaging services accountable for violent, extremist and age-inappropriate material on their platforms.
With over 900 million monthly active users, Telegram has long drawn the ire of regulators for failing to apply country-specific laws to content hosted on its platform. French authorities arrested the company’s chief executive officer, Pavel Durov, in August on criminal charges alleging complicity in hacking, distribution of child sexual abuse material and refusal to act on law enforcement requests (see: French Police Arrest Telegram CEO and Owner).
In the weeks following his arrest, Durov announced that Telegram would provide the IP addresses and phone numbers of offenders to authorities in response to valid legal requests. He also said the platform had deployed artificial intelligence and content moderators to clean up its search results and remove “problematic content.”
South Korea also struggled with Telegram over a surge in deepfake videos on the platform that victimized minors. After spending several frustrating months trying to reach a Telegram executive, South Korean authorities seized on the news of Durov’s arrest and flew to Paris to ask French authorities to press Durov to cooperate with South Korean investigators (see: Telegram Removes Deepfake Videos at South Korea’s Behest).
Australia has criticized Telegram’s delay in reporting how it handles sensitive content on its platform. Inman Grant said online radicalization of young people was a major factor in raising the country’s terror threat level, and that online platforms should be accountable for the content they host.
“Research and observation have shown us that this material can normalize, desensitize and sometimes radicalize – especially the young who are viewing harmful material online that they cannot un-see,” Inman Grant said, adding that eSafety will seek a civil penalty in Federal Court if Telegram refuses to pay the infringement fine.