• givesomefucks@lemmy.world
    10 months ago

    Ok, sounds like a transparent display on top of the normal one. That makes sense.

    And completely undetectable

    • conciselyverbose@kbin.social
      10 months ago

      I don’t think it’s an extra display.

      If you have a 4K display, it can still accept a 1080p signal. That works because the GPU isn’t controlling the display; it’s merely sending an image to the display 60 (or 120, or 144, etc.) times a second. That feed passes through a chip inside the display that turns it into the drive signals for each sub-pixel and tells them how bright to be. Monitors generally don’t process the image much, but TVs often do additional processing to (in theory) make the image look better. This is the level at which this MSI display is going to be processing and adjusting the image. (I’ve glossed over a lot of details here, and am not pretending I understand all of them, but in broad strokes this is what’s happening.)
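      Here’s a rough toy sketch in Python of the split I mean (all names are made up; this isn’t MSI’s firmware or any real display API, just an illustration of the display-side chip doing its own scaling after the GPU has sent the frame):

      ```python
      # Toy model only: a display's scaler chip mapping an incoming 1080p frame
      # onto a 4K panel. The GPU never sees this step; it only sent the frame.
      import numpy as np

      def scale_to_panel(frame: np.ndarray, panel_h: int = 2160, panel_w: int = 3840) -> np.ndarray:
          """Nearest-neighbour upscale of a (H x W x 3, 0-255) frame to the panel's native resolution."""
          src_h, src_w, _ = frame.shape
          rows = np.arange(panel_h) * src_h // panel_h
          cols = np.arange(panel_w) * src_w // panel_w
          return frame[rows[:, None], cols]

      # The "signal" from the GPU: a 1080p frame.
      gpu_frame = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)

      # The display's own chip expands it to the panel's pixels.
      panel_frame = scale_to_panel(gpu_frame)
      print(gpu_frame.shape, "->", panel_frame.shape)  # (1080, 1920, 3) -> (2160, 3840, 3)
      ```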

      Screen captures by your computer don’t see any of the adjustments your display makes; they just see the image the computer sends to the display. They have no way of knowing if your display is cranking saturation through the roof, inserting gross fake frames it’s calling “true motion” or whatever, blasting the shit out of brightness and blowing out highlights, etc. They don’t actually know what the final output looks like. They only know what the computer sent.
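      To make that concrete, here’s a toy Python model (purely illustrative; none of these functions are a real capture or display API): the “screenshot” is a copy of the framebuffer the GPU sent, and the display’s “enhancements” happen after that copy exists, so the capture never contains them.

      ```python
      # Toy model only: OS screenshots read the framebuffer the GPU sent;
      # the display applies its own tweaks afterwards, on its side of the cable.
      import numpy as np

      def display_post_process(frame: np.ndarray) -> np.ndarray:
          """The display's own tweak: crank brightness, clipping (blowing out) highlights."""
          return np.clip(frame.astype(np.int16) + 80, 0, 255).astype(np.uint8)

      framebuffer = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)  # what the GPU sends

      screenshot = framebuffer.copy()                 # what the computer captures
      on_panel   = display_post_process(framebuffer)  # what actually reaches your eyes

      print(np.array_equal(screenshot, framebuffer))  # True: the capture matches what was sent
      print(np.array_equal(screenshot, on_panel))     # False: the capture never sees the display's changes
      ```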

    • fishpen0@lemmy.world
      10 months ago

      Think of when you open the menu on your screen to adjust the colors or brightness, or to switch inputs. That overlay is drawn by the screen itself, not by the computer. If you take a screenshot while it’s up, the screenshot won’t show it, because the monitor is driving that overlay, not the computer.
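      Same idea as a tiny Python sketch (made-up names, just a toy model, not any real monitor firmware): the monitor composites its menu onto the frame on its own side, so a screenshot of what the computer sent never contains it.

      ```python
      # Toy model only: the monitor draws its own on-screen menu (OSD) over the
      # incoming frame; the computer's screenshot is taken before that happens.
      import numpy as np

      def draw_osd(frame: np.ndarray) -> np.ndarray:
          """Monitor firmware drawing a menu box over the incoming frame."""
          out = frame.copy()
          out[100:300, 100:500] = (40, 40, 40)  # grey menu box drawn by the display, not the PC
          return out

      framebuffer = np.zeros((1080, 1920, 3), dtype=np.uint8)  # what the computer sends
      screenshot = framebuffer.copy()   # OS screenshot: only what was sent
      on_panel = draw_osd(framebuffer)  # what you see: the frame plus the monitor's menu

      print((screenshot == on_panel).all())  # False: the menu exists only on the panel side
      ```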