“Telegram is not a private messenger. There’s nothing private about it. It’s the opposite. It’s a cloud messenger where every message you’ve ever sent or received is in plain text in a database that Telegram the organization controls and has access to.”

“It’s like a Russian oligarch starting an unencrypted version of WhatsApp, a pixel-for-pixel clone of WhatsApp. That should be kind of a difficult brand to operate. Somehow, they’ve done a really amazing job of convincing the whole world that this is an encrypted messaging app and that the founder is some kind of Russian dissident, even though he goes there once a month, the whole team lives in Russia, and their families are there.”

“What happened in France is they just chose not to respond to the subpoena. So that’s in violation of the law. And he gets arrested in France, right? And everyone’s like, oh, France. But I think the key point is they have the data, like they can respond to the subpoenas, whereas Signal, for instance, doesn’t have access to the data and couldn’t respond to that same request. To me it’s very obvious that Russia would’ve had a much less polite version of that conversation with Pavel Durov and the Telegram team before this moment.”

  • Pup Biru@aussie.zone · 3 hours ago

    If I run a server, I have access to the raw requests coming in. I can do whatever I want with them even outside the Signal protocol. You can’t verify that my server is set up to work the way I say it is. You get that, right?

    i do, of course… and the information you have in that raw request is limited to the information that’s in the request (including metadata like IP address and other header information in the packets that make it up)

    You’re confusing what Signal team says their server does, and the open source server implementation they released with what’s actually running. The latter, you have no idea about.

    i’m really not… i’m saying it doesn’t matter what their server is doing, because the only sound way to reason about this is to treat the client as the only thing you can trust in the chain: you should always assume the server is compromised, maliciously or not

    trusting signal doesn’t even have anything to do with it; they could be compromised and not know it

    these are things we both agree on

    the server still receives the raw network request and all the metadata attached to it. Your client has to talk to the server and identify itself before any messages are even sent.

    i agree with that too. what information contained within that request do you take issue with?

    as i said earlier, your IP address is problematic, but that can be said about any service: you have no way to validate any server software, open source or not… so you have to take measures to protect that information no matter the service you’re connecting to

    this is pretty trivially achieved with a trustworthy VPN these days (again, this is unverifiable, but you have to draw the line somewhere: can we agree that IP address privacy is within the realm of personal responsibility since that applies to any service?)

    When your device connects to send that sealed message, it inevitably reveals your IP address and connection timing to the server.

    agree

    The server also knows your IP address from when you initially registered your phone number or when you requested those temporary rate limiting tokens.

    also agree

    By logging the raw incoming requests at the network level, a malicious server can easily correlate the IP address sending the sealed message with the IP address tied to the phone number.
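The correlation described there can be sketched concretely. This is an illustrative toy in Python with entirely hypothetical data and field names (not Signal’s actual log format): it just shows that a plain dictionary join on source IP is all a logging server would need.

```python
# Toy sketch of the attack; all data and field names are hypothetical.

# What a logging server learned at registration time: IP -> phone number.
registration_log = {
    "203.0.113.7": "+15551234567",
    "198.51.100.9": "+15559876543",
}

# Network-level log of incoming sealed-sender messages. The envelope
# hides the sender's account, but the TCP connection exposes the IP.
sealed_message_log = [
    {"src_ip": "203.0.113.7", "timestamp": 1717000000,
     "recipient": "+15550001111"},
]

def deanonymize(entry, reg_log):
    """Match a sealed message's source IP against registration IPs."""
    return reg_log.get(entry["src_ip"])

for entry in sealed_message_log:
    sender = deanonymize(entry, registration_log)
    # sender now holds the phone number the envelope was meant to hide
```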

    okay, i can see where your problem is

    i can agree that’s definitely a vector they can use to build a social graph, and then tie that social graph back to real identities, and also that’s far from what you want in a private platform
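The graph-building step being conceded here can be sketched the same way, again with purely hypothetical data: each correlated message becomes an edge tying a real identity (a phone number) to who it talks to.

```python
from collections import defaultdict

# Hypothetical logs: IP -> registered phone number, plus sealed messages
# where only the network-level source IP and the recipient are visible.
registration_log = {
    "203.0.113.7": "+15551230001",
    "198.51.100.9": "+15551230002",
}
sealed_message_log = [
    {"src_ip": "203.0.113.7", "recipient": "+15551230002"},
    {"src_ip": "203.0.113.7", "recipient": "+15551230003"},
    {"src_ip": "198.51.100.9", "recipient": "+15551230001"},
]

# Join on IP, then record an edge per correlated message.
social_graph = defaultdict(set)
for entry in sealed_message_log:
    sender = registration_log.get(entry["src_ip"])
    if sender is not None:
        social_graph[sender].add(entry["recipient"])
```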

    id say that it comes down to trade-offs… signal (says) they require the phone number in order to combat spam, which i can see as a real issue (i’d be happier if they didn’t store the phone number, or at least didn’t link it to your account, but that comes with a whole load of other issues)

    services need to have some way of combatting spam, which either boils down to “expensive accounts” so that blocking is a viable option, or spam filters which can be abused by corporate entities like they have with email

    if you really care about privacy with signal, you can get a VPN that allows you to frequently rotate your IP… most users won’t do that, so i can agree it’s a sub-optimal solution

    but i do think it’s a reasonable trade-off

    • ☆ Yσɠƚԋσʂ ☆@lemmy.ml · 2 hours ago

      Sure, you can absolutely decide that it’s a reasonable trade-off, but your original claim was that sealed sender addressed the problem. Sounds like you’re now acknowledging that’s not actually the case…

      • Pup Biru@aussie.zone · edited 58 minutes ago

        i think it’s a very clever partial solution, but when combined with signal’s broader ethos (making privacy simple so that more people use privacy-centric options), it means people aren’t going to change IPs between the temp token and the message to solve the last part of the puzzle: thanks for explaining your line of reasoning

        i also think that there’s a way forward where messages are sent or tokens are retrieved via a 3rd-party proxy to hide IPs (i thought i read something about signal contracting a 3rd party to provide some of those services, but i can’t find the reference, and it’s not verifiable anyway, so it’s of limited usefulness), which would be a complete solution to the problem as long as said proxies aren’t controlled by signal. thinking about it now, you could also simply route signal traffic through a proxy so that many people share an IP, and they do provide proxy functionality separate from the system proxy configuration
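The shared-IP point can be sketched with the same toy model (hypothetical data again): once many users sit behind one proxy egress IP, the join from the earlier attack no longer resolves to a single account, only to the proxy’s whole user population.

```python
# Hypothetical: several accounts are all seen behind one proxy egress IP.
registration_log_by_ip = {
    "192.0.2.50": ["+15551230001", "+15551230002", "+15551230003"],
}

def candidate_senders(src_ip, reg_log):
    """A logging server can only narrow a sealed message down to
    everyone who connects from this IP: the anonymity set."""
    return reg_log.get(src_ip, [])

candidates = candidate_senders("192.0.2.50", registration_log_by_ip)
# candidates has 3 entries, not 1: correlation no longer
# identifies a single sender.
```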

        i still think that signal has made a pretty reasonable set of trade-offs in order to balance privacy and usability in order to have a large impact on global privacy

        *edit: actually, adding to the proxy point, it turns out the EFF runs a public proxy

        and there’s a list of public proxies available (not published as one big list, to avoid censorship, but still a good resource)

        and they also have support for tapping a link to configure the proxy, so very quick and easy

        • ☆ Yσɠƚԋσʂ ☆@lemmy.ml · 23 minutes ago

          It’s not really a partial solution, it’s just sophistry to obscure the problem. The fact that I’ve had this same discussion with many people now, and that it always takes effort to explain why sealed sender doesn’t actually address the problem, leads me to believe that the actual problem it’s solving is not that of making the platform more secure. The complete and obvious solution is to not collect personally identifying information in the first place.

          You have a very charitable view of Signal making the base assumption that people running it are good actors. Yet, given that it has direct ties to the US government, that it’s operated in the US on a central server, and the team won’t even release the app outside proprietary platforms, that base assumption does not seem well founded to me. I do not trust the people operating this service, and I think it’s a very dangerous assumption to think that they have your best interests in mind.