Less than two weeks after Telegram founder and CEO Pavel Durov’s high-profile arrest by French police, the company has announced that it will start moderating “illegal content” in the platform’s private and group chats.
Perhaps the word “announced” is overstating a change so subtle it took eagle-eyed journalists to notice the new wording in a paragraph buried inside a revised Telegram FAQ.
After years of allegedly stonewalling police inquiries — as well as refusing to join international initiatives designed to detect and remove child abuse material — the company appears to have had a change of heart.
“All Telegram apps have ‘Report’ buttons that let you flag illegal content for our moderators — in just a few taps,” states the FAQ published on Thursday in answer to the question, “There’s illegal content on Telegram. How do I take it down?”
This is different wording from the previous version, as recorded on the Wayback Machine the day before.
“All Telegram chats and group chats are private amongst their participants. We do not process any requests related to them,” that version stated.
Both include an official email address for reporting illegal content, but the new version makes it clear that this channel is specifically for reporting “content on Telegram which you think needs attention from our moderators.”
To some, the new wording will seem a subtle shift for a platform that has consistently been criticized for paying little heed to an alleged flood of criminal content that has found a home on it.
What’s new is the explicit reference to a moderation process, something Telegram has never promoted in the past.
Growing pains
On the same day as the tweak to the FAQ, Pavel Durov released a statement that reads as part apologia, and part acknowledgement that the company’s content oversight needs an upgrade.
“Last month I got interviewed by police for 4 days after arriving in Paris. I was told I may be personally responsible for other people’s illegal use of Telegram, because the French authorities didn’t receive responses from Telegram,” he began.
“No innovator will ever build new tools if they know they can be personally held responsible for potential abuse of those tools,” added Durov, before conceding: “However, we hear voices saying that it’s not enough. Telegram’s abrupt increase in user count to 950M caused growing pains that made it easier for criminals to abuse our platform.”
Of course, having a reporting address for moderation requests is not the same thing as actually moderating. Telegram will be closely scrutinized on that score, and it could prove heavy going: other social media platforms have struggled to contain abuse despite employing large moderation teams dedicated to the task.
No secrets
The moderation debate has overshadowed another issue, namely privacy — something that seems to endlessly confuse the users of all messaging apps.
WhatsApp, Signal, and Google’s RCS-based Messages all employ end-to-end encryption (E2EE) using the Signal protocol. This means that communications can’t be eavesdropped on because the keys used to encrypt them are stored only on the participants’ devices, never on the provider’s servers.
Telegram’s approach is more confusing. By default, there is no E2EE on Telegram private or group messages. These are encrypted only between client and server, which means that, in theory, the company can decrypt their content if it wants to. Telegram’s Secret Chats do offer E2EE, but they are inconvenient to set up. Durov has blogged about why the company eschews E2EE by default: mainly, he claimed, so that users can conveniently back up and access messages across multiple devices.
That’s the main reason why it makes sense for police to target criminals on the platform in a way that’s impossible with apps using E2EE: Telegram should be able to unscramble the content of most messages if it chooses to.
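The distinction comes down to who holds the decryption key. A toy sketch (deliberately simplified XOR, not real cryptography, and not Telegram’s actual MTProto protocol) illustrates why a server that holds the key can always read stored messages, while a server that only relays E2EE ciphertext cannot:

```python
# Toy illustration (NOT real cryptography): the practical difference
# between end-to-end and server-side encryption is who holds the key.
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """One-time-pad-style XOR, standing in for a real cipher."""
    return bytes(k ^ d for k, d in zip(key, data))

message = b"meet at noon"

# End-to-end model: the key lives only on the users' devices.
# The server relays ciphertext it has no way to decrypt.
device_key = secrets.token_bytes(len(message))
relayed_ciphertext = xor_cipher(device_key, message)
server_holds = {"ciphertext": relayed_ciphertext}   # no key stored

# Server-side model (Telegram's default): messages are encrypted
# in transit and at rest, but the operator keeps the key — so it
# *could* decrypt stored content if compelled to.
server_key = secrets.token_bytes(len(message))
stored_ciphertext = xor_cipher(server_key, message)
server_can_decrypt = xor_cipher(server_key, stored_ciphertext) == message
```

In the first model, only a device holding `device_key` can recover the plaintext; in the second, the operator can, which is precisely the lever available to law enforcement.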
Of course, as an app which has long played on its unruly MO, Telegram doesn’t want to advertise that uncomfortable possibility. Indeed, the company has long talked up its willingness to resist government oversight, including falling out with Russia’s Federal Security Service (FSB) in 2018 over its refusal to grant access to communications.
What’s clear — although perhaps less so to much of its user base — is that Telegram is not, and never has been, a guaranteed confidential channel.
Despite this, figures as senior as France’s President Macron himself are reportedly enthusiastic users. Some politicians in the US are also said to have embraced Telegram, ironically because they see it as being less inclined towards moderation.
Does this super-connected user base understand the risks that come with Telegram, including to apparently confidential messages sent months or years ago? Presumably not, or they wouldn’t have used it so extensively.
It’s all part of a wider movement towards informal off-channel communication that has leaders and businesspeople using a wide range of platforms on the basis that what is said there, stays there. That assumption might be in the process of unraveling.
What’s not always appreciated is how rapidly platforms such as Telegram are evolving away from their communication and content broadcast roots towards new capabilities such as crypto. These take its development and future in a very different direction.
At the same time, it’s still a surprisingly small company, with revenue of only $342 million in 2023. The crypto business unit dominates the bottom line.
Telegram might have changed its policy on illegal content, but it’s hard to see how such a small workforce will allow it to provide the sort of hands-on moderation the authorities might now be expecting.