As reported by CoinDesk, Telegram quietly edited its FAQs to remove language about not moderating private or group chats. A section headed “I have illegal content on Telegram, how do I remove it?” previously said that content in chats and group chats remains between participants. But now the section says that “All Telegram apps have a Report button” that users can use to report illegal content to the app’s moderators. Users simply tap on the message on Android or long-press on iOS and select the Report option. They can also send an email to the service’s takedown email address (abuse@telegram.org) with a link to the content they want to report.
The change came after Telegram chief executive Pavel Durov posted his first public comments since his arrest to his Telegram channel. Durov was arrested at a French airport in late August as part of authorities’ investigation into the app’s lack of moderation and its failure to curb criminal activity. He has since been released from custody but was charged with “participating in the distribution of child pornography, illegal drugs and hacking software” on the messaging app, as well as with “refusing to cooperate with the investigation into illegal activity on Telegram.”
French authorities reportedly told Durov that he had been arrested because they had received no answers from Telegram regarding their investigation. In his post, the app’s founder said this came as a surprise: Telegram has an official representative in the EU whose email address is publicly available, French authorities had numerous ways to reach him to request assistance, and he had even helped them establish a hotline with Telegram to deal with terrorist threats in the country. He called the decision by French authorities to “prosecute the CEO for crimes committed by third parties on the platform” a “wrong approach,” arguing that no innovator would build new tools if they could be held personally liable for their potential misuse.
Durov also spoke about how Telegram defends people’s fundamental rights, especially in places where those rights are being violated. For example, Telegram was banned in Russia after it refused to hand over encryption keys that would allow authorities to monitor its users. He said Telegram removes “millions of harmful posts and channels every day,” publishes transparency reports, and maintains a direct hotline with NGOs for urgent moderation requests.
But the CEO acknowledged that Telegram has room to improve. He said the service’s “rapid growth in users” to 950 million “caused growing pains” that made it easier for criminals to abuse the platform, and that Telegram aims to “significantly improve in this regard,” a process it has already begun internally. The FAQ change is presumably part of that effort to address authorities’ accusations that the service has failed to stop criminals from using the app. Notably, Telegram reported having 41 million users in the European Union earlier this year, but EU authorities suspect it understated that figure to avoid stricter obligations under the Digital Services Act (DSA).