• frezik@midwest.social

    Nope. There is an industry standard way of measuring latency, and it’s measured at the halfway point of drawing the image.

    Edit: you can measure this through Nvidia’s LDAT system, for example, which combines a light sensor placed in the middle of the display with detection of the exact moment you create an input. The light sensor picks up a change (such as the muzzle flash in an FPS) and measures the difference in time. If you were to run this against a CRT at NTSC refresh rates, it would never show less than about 8.3 ms with the sensor in the middle of the screen.
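
    Back-of-the-envelope version of that figure (my simplification, not LDAT’s actual methodology): if the sensor sits halfway down the screen and scanout starts the instant the input registers, the smallest number you can ever measure is half a refresh period.

      def min_mid_screen_latency_ms(refresh_hz):
          # The mid-screen sensor can't light up before scanout reaches it,
          # i.e. half a refresh period after the frame starts.
          return (1000.0 / refresh_hz) / 2

      print(min_mid_screen_latency_ms(59.94))  # NTSC field rate -> ~8.3 ms
      print(min_mid_screen_latency_ms(120.0))  # ~4.2 ms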

    If you are measuring fairly with techniques we use against LCDs, then yes, CRTs have latency.

    • Saik0@lemmy.saik0.com

      Nope. There is an industry standard way of measuring latency, and it’s measured at the halfway point of drawing the image.

      No. And if you actually want to provide a link to your “industry standard”, feel free, just make sure that your “standard” can actually be applied to a CRT first.

      You can literally focus a CRT to show only a single pixel’s (more accurately, a single beam width’s) worth of value. And that pixel gets updated many thousands of times a second (effectively continuously, since it’s analog).

      If you’re going to define latency as “drawing the image” (by any part of that metric), then a CRT can draw a single “pixel” worth of value thousands of times a second… probably more. Whereas your standard 60 Hz panel can only manage it once every 1/60th of a second (or 1/360th on even the fastest LCDs).
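
      Back-of-the-envelope on what “updated constantly” cashes out to, using nominal NTSC numbers (the real pixel/dot clock is far higher still):

        refresh_hz = 59.94
        lines_per_field = 262
        line_rate = refresh_hz * lines_per_field   # ~15,700 line updates per second
        print(f"{line_rate:.0f} scanline updates/s on the CRT vs 60 whole-frame updates/s on a 60 Hz LCD")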

      If there is a frame to draw and that frame is being processed, then yes, you’re right: measuring at the middle will yield a delay. But that isn’t how all games/operations have worked on all devices throughout history. There are many applications where the data being sent to the display is literally read from memory nanoseconds prior. CRTs have NONE of the processing delay that LCDs do.

      Further points of failure in your post: CRTs are not all “NTSC” standard (virtually every computer monitor, for instance). There are plenty of CRTs that can push much higher than the NTSC standard specifies.

      Here’s an example from a bog-standard monitor I had a long time ago… https://www.manualslib.com/products/Sony-Trinitron-Cpd-E430-364427.html

      800 x 600/155 Hz
      1024 x 768/121 Hz
      1280 x 1024/91 Hz
      1600 x 1200/78 Hz

      So a 60 Hz LCD will always take 1/60th of a second (~16.7 ms) to draw the whole image, regardless of the resolution being displayed. Not so on the CRT… higher-performance CRTs can draw more “pixels” per second, and when you lower the number of lines you want displayed, the full-frame draw time goes down substantially. There are a lot of ways to define these things that your simplistic view doesn’t account for. The reality is, if you drop the idea of a “frame”, the time from input to display on the CRT is lower simply because there’s no processing occurring, your limit is the physics of the materials the monitor is built from, not some chip’s ability to decode a frame. Thus… no latency.
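
      To put rough numbers on the modes listed above (simple arithmetic, ignoring blanking intervals, so real visible-area scanout is a touch faster):

        # Full-frame scanout time for each mode above vs a fixed 60 Hz LCD.
        modes = {
            "800 x 600 @ 155 Hz": 155,
            "1024 x 768 @ 121 Hz": 121,
            "1280 x 1024 @ 91 Hz": 91,
            "1600 x 1200 @ 78 Hz": 78,
            "typical LCD @ 60 Hz": 60,
        }
        for name, hz in modes.items():
            print(f"{name}: one full scanout every {1000.0 / hz:.1f} ms")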

      Not frametime. Not FPS. Not Hz. Latency is NONE of those things, otherwise we wouldn’t have those other terms and would have strictly used “latency” instead.

      And a wonderful example of this is the Commodore 64 tape-loading screens. https://youtube.com/watch?v=Swd2qFZz98U

      Those lines/colors are drawn straight from memory without the concept of a frame. There is no latency here. Many scene demos abused this to achieve really wild effects as well. Your LCD cannot do that; those demos don’t function correctly on LCDs…

      Lightguns are a perfect example of how this can be leveraged (which is completely impossible on an LCD as well).

      Specifically scroll down to the Sega section. https://www.retrorgb.com/yes-you-can-use-lightguns-on-lcds-sometimes.html

      The console times the click of the lightgun trigger against which pixel is currently being drawn in the frame and takes that as the gun’s input. That requires minimal latency to do. LCDs can’t do that.
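
      Toy model of that trick (not Sega’s actual code, just the idea, with nominal NTSC timing): how long it’s been since the top of the frame tells you which line, and roughly which pixel, the beam is lighting up right now.

        LINES = 262          # NTSC scanlines per field (approx.)
        FIELD_HZ = 59.94     # NTSC field rate
        LINE_TIME = 1.0 / (LINES * FIELD_HZ)   # ~63.7 microseconds per line

        def beam_position(seconds_since_vsync, pixels_per_line=256):
            # Map elapsed time since the start of the field to a beam position.
            line = int(seconds_since_vsync / LINE_TIME) % LINES
            frac = (seconds_since_vsync % LINE_TIME) / LINE_TIME
            return line, int(frac * pixels_per_line)

        print(beam_position(0.008))  # ~8 ms into the field -> roughly mid-screen

      That mapping only holds because the CRT lights the signal the instant it arrives; add a frame of buffering and the correlation between time and beam position is gone.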

      Ultimately, people like you are trying to redefine what latency is, and that flies in the face of actual history, which shows a distinct difference that has historically mattered, and applications of that latency that CANNOT exist if latency were what you’re claiming it to be.

      https://yt.saik0.com/watch?v=llGzvCaw62Y#player-container

      Can you tell me why the LCD on the right is ALWAYS behind? And why it will ALWAYS be the case that it will not work, regardless of how fast the LCD panel is? The reason you’re going to arrive at is processing delay, which didn’t exist on CRTs. That’s “LATENCY”.

      When talking about retro consoles, we’re limited by the hardware feeding the display, and the frame can’t start drawing until the console has transmitted everything.

      This is where you’re completely wrong. A CRT doesn’t know the concept of a frame. It draws the input it gets. Period. There’s no buffer… there’s nowhere to hold onto anything that is being transmitted. It’s literally just spewing electrons at the phosphors.

      Edit: typo

      Edit2: to expound on the LCD vs CRT thing with light guns. A CRT draws the “frame” as it’s received… as it gets the signal it varies the voltage on the electron gun itself, which means that when the Sega console in this case sets the video buffer to white for a coordinate and displays it, it knows exactly which pixel is currently being modified. The LCD takes the input, stores it in a buffer until it has the full frame, then displays it. The Sega doesn’t know when that frame will actually be displayed because there’s other shit between it and the display mechanism doing stuff. There is an innate delay that MUST occur on the LCD that simply doesn’t on the CRT. That’s the latency.
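
      To make that innate delay concrete, here’s a crude model (the 8 ms processing and 4 ms pixel-response figures are placeholders I made up; real panels vary, but the structural difference doesn’t):

        # When does a given line actually light up? Crude model, 60 Hz signal.
        FRAME_MS = 1000.0 / 60.0
        TOTAL_LINES = 480

        def crt_photon_ms(line):
            # CRT: the phosphor lights the moment the signal for that line arrives.
            return FRAME_MS * line / TOTAL_LINES

        def lcd_photon_ms(line, processing_ms=8.0, response_ms=4.0):
            # LCD: buffer the whole frame, run the scaler, refresh down to the
            # line, then wait for the pixel to actually transition.
            return FRAME_MS + processing_ms + FRAME_MS * line / TOTAL_LINES + response_ms

        print(round(crt_photon_ms(240), 1), round(lcd_photon_ms(240), 1))  # ~8.3 ms vs ~37.0 ms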