Pavel Durov’s arrest suggests that the law enforcement dragnet is being widened from private financial transactions to private speech.
The arrest of the Telegram CEO Pavel Durov in France this week is extremely significant. It confirms that we are deep into the second crypto war, in which governments are systematically seeking to prosecute developers of digital encryption tools because encryption frustrates state surveillance and control. While the first crypto war in the 1990s was led by the United States, this one is being led by the European Union, which has become a regulatory superpower in its own right.
Durov, a Russian-born entrepreneur who is now a French citizen, was arrested in Paris on Saturday and has since been indicted. You can read the French accusations here. They include complicity in drug possession and sale, fraud, child pornography, and money laundering. These are extremely serious crimes, but note that the charge is complicity, not participation. The meaning of that word "complicity" seems to be revealed by the last three charges: Telegram has been providing users with a "cryptology tool" unauthorised by French regulators.
Who said it’s a street? What makes it a street?
Did you seek it out? Neither I nor anybody I know personally has ever encountered anything like what was described on that platform, and I've been on it for years.
Was it the same “channel” or “group chat” that persisted for years?
What gives them the right or responsibility to moderate a group chat or channel any more than, say, Signal or Threema? Just because their technical back end lets them?
I mean, by that argument Signal could do client-side scanning on everything (that's enforcement at the platform level that fits their technical constraints). Is that where we're at? "If you can figure out how to violate privacy in the name of looking for illegal content, you should."
Nothing Telegram offers is equivalent to the algorithmic feeds that require moderation on YouTube, Twitter, Instagram, or Facebook; on Telegram, everything has to be sought out.
Make no mistake, I'm not defending the content. The people who used the platform to share that content should be arrested. However, I'm not sure I agree with the moral dichotomy we've gotten ourselves into, where, e.g., the messenger is held legally responsible for failing to refuse service to people doing illegal activity.
Yes. They can VERY CLEARLY SEE that the platform is being misused. Signal can't. Signal is genuinely clueless as to what you do on their platform. If you're going to promote your service as "privacy respecting" and not mean it, you'd better count on any world government getting on your ass for not taking down CSAM. The difference between being ignorant and being irresponsible is ignoring the issue after you've been made aware of it.
Signal can very clearly see all the messages you send if they just add a bit of code.
And then no one will use Signal because there is no way someone won’t notice immediately. It is OPEN SOURCE. Look at the xz backdoor that got figured out in less than a day. Signal is more popular than a Linux tool for extracting files. Stupid statement.
But with that small tweak to their front end they can “VERY CLEARLY SEE that the platform is being misused.” So per your own argument, the government should force them to do so (and presumably anyone that’s uncomfortable with that can “just not use Signal”).
Stop with the what-aboutisms because this is not the current situation.
🙄
I won't go into the specific channels, so as not to promote them or what they do, but we can talk about one known example: how Bellingcat got to the FSB officers responsible for the poisoning of Navalny via their mobile phone call logs and airline ticket data. They used two highly popular bots, called H****a and the E** ** G**, which let anyone retrieve everything known to the government and other social networks about any citizen of Russia for about $1 to $5. They use the Telegram API and have been there for years. How do you moderate that? You don't. You take it down as the illegal, privacy-violating, and doxing-enabling content that it is.
Edit: “Censored” the names of the bots, as I still don’t want to make them even easier to find.
Was that a bad thing? I've never heard the name Bellingcat before, but it sounds like these bots were partly responsible for enabling the reporting on the Navalny poisoning?
Ultimately, that sounds like an issue the Russian government needs to fix. Telegram bots are also trivial to launch and duplicate, so … actually detecting and shutting that down without it becoming a massive, expensive money pit is difficult.
It’s easy to say “oh they’re hosting it, they should just take it down.”
https://www.washingtonpost.com/politics/2018/10/16/postal-service-preferred-shipper-drug-dealers/
Should the US federal government hold themselves liable for delivering illegal drugs via their own postal service? I mean there’s serious nuance in what’s reasonable liability for a carrier … and personally holding the CEO criminally liable is a pretty extreme instance of that.
Telegram is often in the news for public groups with lots of crime.
“The news” is too vague a source to dispute.