Telegram has rolled out a major update to its chat moderation policy following the arrest of its founder, Pavel Durov, in France. This marks a pivotal change in how the platform handles content, particularly in private chats, signaling a shift from its historically strong stance on privacy.
Previously, Telegram maintained that all chats, public and private, were confidential and exempt from moderation requests. But after Durov was arrested over allegations that the platform had done too little against illegal content, Telegram revised its policy: even private chats are now subject to moderation, and users can report inappropriate content directly to moderators. This is a significant departure from the platform's earlier commitment to absolute privacy.
Telegram has quietly introduced a ‘Report’ button across its apps, enabling users to flag illegal content, including in private chats. The shift toward more active content moderation largely flew under the radar but reflects a growing effort to crack down on harmful activity.
Telegram’s FAQ section, which previously assured users that private chats were off-limits for moderation, has been revised to reflect the platform’s new stance. Content from all types of chats, public or private, can now be reviewed for illegal activity, underscoring Telegram’s intent to comply with legal requirements.
This policy shift follows Durov’s legal troubles in France, where he was detained for allegedly allowing illegal activities on Telegram, including child exploitation, drug trafficking, and fraud. In response, Durov pledged to address these issues directly and committed to making the platform safer.
The update has stirred mixed reactions across Telegram’s massive user base. While some are concerned about the privacy implications, others view it as a necessary step to curb criminal activities and protect users.