“Telegram is not a private messenger. There’s nothing private about it. It’s the opposite. It’s a cloud messenger where every message you’ve ever sent or received is in plain text in a database that Telegram, the organization, controls and has access to.”

“It’s like a Russian oligarch starting an unencrypted version of WhatsApp, a pixel for pixel clone of WhatsApp. That should be kind of a difficult brand to operate. Somehow, they’ve done a really amazing job of convincing the whole world that this is an encrypted messaging app and that the founder is some kind of Russian dissident, even though he goes there once a month, the whole team lives in Russia, and their families are there.”

" What happened in France is they just chose not to respond to the subpoena. So that’s in violation of the law. And, he gets arrested in France, right? And everyone’s like, oh, France. But I think the key point is they have the data, like they can respond to the subpoenas where as Signal, for instance, doesn’t have access to the data and couldn’t respond to that same request.  To me it’s very obvious that Russia would’ve had a much less polite version of that conversation with Pavel Durov and the telegram team before this moment"

  • ☆ Yσɠƚԋσʂ ☆@lemmy.ml

    It’s not really a partial solution, it’s just sophistry to obscure the problem. The fact that I’ve had this same discussion with many people now, and it always takes effort to explain why sealed sender doesn’t actually address the problem, leads me to believe that the actual problem it’s solving is not that of making the platform more secure. The complete and obvious solution to the problem is to not collect personally identifying information in the first place.
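    As a rough sketch of the point being argued here (field names and values are illustrative only, not Signal’s actual wire format): a sealed-sender-style envelope moves the sender identity inside the ciphertext, but the server still observes the recipient and the connecting IP address, which is why a proxy or similar network-level protection comes up elsewhere in this thread.

    ```python
    # Toy model of the metadata a server can observe, with and without
    # a sealed-sender-style envelope. Illustrative only; this is not
    # Signal's real protocol or message format.

    def server_view(envelope: dict, connection_ip: str) -> dict:
        """Return what the server observes: every envelope field except
        the encrypted payload, plus the network-level metadata (the
        connecting IP) that it always sees regardless of the envelope."""
        visible = {k: v for k, v in envelope.items() if k != "encrypted_payload"}
        visible["connection_ip"] = connection_ip
        return visible

    # Ordinary envelope: the sender identity travels in the clear.
    plain = {"sender": "alice", "recipient": "bob", "encrypted_payload": "..."}

    # Sealed-sender-style envelope: the sender is moved inside the
    # ciphertext, so only the recipient remains visible in the envelope.
    sealed = {"recipient": "bob", "encrypted_payload": "..."}

    print(server_view(plain, "203.0.113.5"))
    # {'sender': 'alice', 'recipient': 'bob', 'connection_ip': '203.0.113.5'}
    print(server_view(sealed, "203.0.113.5"))
    # {'recipient': 'bob', 'connection_ip': '203.0.113.5'}
    ```

    The toy makes both sides of the disagreement concrete: sealing removes the sender field from what the server stores, but the recipient and the source IP remain observable unless something else (e.g. a proxy) hides them.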

    You have a very charitable view of Signal making the base assumption that people running it are good actors. Yet, given that it has direct ties to the US government, that it’s operated in the US on a central server, and the team won’t even release the app outside proprietary platforms, that base assumption does not seem well founded to me. I do not trust the people operating this service, and I think it’s a very dangerous assumption to think that they have your best interests in mind.

    • Pup Biru@aussie.zone

      It’s not really a partial solution

      disagree, and that’s fine… STEM is full of partial solutions that become complete solutions as additional pieces are added (and as i said with the proxy, imo the proxy makes it a complete solution)

      The complete and obvious solution to the problem is to not collect personally identifying information in the first place.

      but that creates other problems… for example, with spam and usability

      it’s all trade-offs, and signal has done a lot for global privacy when compared to alternatives exactly because of the compromises they’ve made

      You have a very charitable view of Signal making the base assumption that people running it are good actors

      i don’t consider it charity… they’re making a lot of right moves, and are explaining their compromises. they’ve given me no reason not to trust them, and plenty of reasons to say they’re a good compromise that will have the greatest impact on global privacy

      are there better privacy solutions? sure… will they ever take off? personally, i doubt it… not letting perfect be the enemy of better or good enough is important: a solution that keeps people who don’t care about privacy relatively safe is important, including for the privacy of people who do care about their privacy because it allows everyone to blend in with the crowd

      Yet, given that it has direct ties to the US government, that it’s operated in the US on a central server, and the team won’t even release the app outside proprietary platforms

      imo the fact that it’s hosted in the US is pretty irrelevant… as you’ve pointed out: it shouldn’t be a matter of trust… validation of the client is the only thing you can rely on, so even if the NSA hosted the servers you should still theoretically be able to “trust” the platform (outside of the fact that you couldn’t ever trust that they’re using encryption that they don’t have a secret back door in or something)

      I do not trust the people operating this service, and I think it’s a very dangerous assumption to think that they have your best interests in mind.

      i trust them as much as i trust anyone running any other privacy service