• frezik@midwest.social
    5 months ago

    Of course there are buffers. Once RAM got cheap enough to hold a buffer representing the whole screen, everyone did that. That was in the late 80s/early 90s.

    There are some really bad misconceptions about how latency works on screens.
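    Not from the thread itself, just a toy sketch of what such a buffer is: the whole screen as one array of pixels in RAM, which the video hardware scans out. The resolution and layout here are illustrative (a common late-80s mode), not a claim about any specific machine.

    ```python
    # Minimal framebuffer sketch: the entire screen lives in RAM as one
    # array of pixels. 320x200 at one byte per pixel is illustrative
    # (roughly VGA mode 13h-era), not any particular hardware.
    WIDTH, HEIGHT = 320, 200

    # Row-major layout: pixel (x, y) is framebuffer[y * WIDTH + x]
    framebuffer = bytearray(WIDTH * HEIGHT)

    def put_pixel(x: int, y: int, color: int) -> None:
        """Write one pixel; the display hardware later scans this memory out."""
        framebuffer[y * WIDTH + x] = color

    put_pixel(10, 5, 0x0F)
    ```
    
    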

    • HackerJoe@sh.itjust.works
      5 months ago

      Those are on the graphics adapter, not in the CRT.
      You can update the framebuffer faster than the CRT can draw it; that’s when you get tearing. Same VSync problem then as now.
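      A toy model of that comment (mine, not the commenter’s): the “CRT” reads the buffer out one row at a time, and if the program overwrites the buffer mid-scan, the top of the picture shows the old frame and the bottom shows the new one — a tear. Waiting for the scan to finish before swapping is the VSync case.

      ```python
      # Toy tearing model: scan-out reads the buffer row by row while the
      # program may overwrite it mid-scan. Frame A is all 'A', frame B all 'B'.
      ROWS = 4

      def scan_out(buffer, update_at_row=None, new_frame=None):
          """Read the buffer one row at a time; optionally overwrite it mid-scan."""
          shown = []
          for row in range(ROWS):
              if row == update_at_row:      # the update outruns the scan
                  buffer[:] = new_frame
              shown.append(buffer[row])
          return shown

      frame_a, frame_b = ["A"] * ROWS, ["B"] * ROWS

      # Update lands mid-scan: top half is the old frame, bottom half the new one.
      torn = scan_out(list(frame_a), update_at_row=2, new_frame=frame_b)
      print(torn)    # ['A', 'A', 'B', 'B'] -- a tear

      # "VSync": swap only between scans, so each scan shows exactly one frame.
      clean = scan_out(list(frame_b))
      print(clean)   # ['B', 'B', 'B', 'B']
      ```
      
      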

    • __dev@lemmy.world
      5 months ago

      CRTs (apart from some exceptions) did not have a display buffer. The analog video signal directly controls the output of each electron gun, with no digital processing in between. The computer on the other end does have display buffers, just as it does now; eliminating the extra buffers that modern monitors add does reduce latency, though.

      • frezik@midwest.social
        5 months ago

        Doesn’t matter. Having a buffer means you either wait for the buffer to fill before drawing, or you get screen tearing. It wasn’t like racing the beam.
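        For contrast, a hedged sketch of the two models being argued about here (my toy, assuming a made-up `render_line` callback): “racing the beam” keeps no frame in memory and computes each scanline just before the beam draws it, while a framebuffer fixes the whole frame’s contents before scan-out begins — that per-frame commit is where the buffering latency comes from.

        ```python
        # Toy contrast: beam-racing vs. framebuffer scan-out. `render_line`
        # is a hypothetical per-scanline drawing callback.
        NUM_LINES = 3

        def race_the_beam(render_line):
            """No buffer: each line is generated just ahead of the beam."""
            for y in range(NUM_LINES):
                yield render_line(y)   # content fixed per-line, <1 scanline late

        def framebuffer_scanout(render_line):
            """Buffered: render the whole frame first, then scan it out."""
            frame = [render_line(y) for y in range(NUM_LINES)]  # fixed per-frame
            yield from frame

        paint = lambda y: f"line {y}"
        beam = list(race_the_beam(paint))
        buffered = list(framebuffer_scanout(paint))
        print(beam == buffered)   # True -- same picture; what differs is *when*
                                  # each line's content gets committed
        ```
        
        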