I had this realization that 144 Hz screens were the only type of screen I knew of that was not a multiple of 60: 60 Hz, 120 Hz, 240 Hz, 360 Hz.

And right in the middle sits 144 Hz. Is there a reason why they all follow this rule of 60, and if so, why does 144 Hz exist?

    • MeanEYE@lemmy.world · 8 months ago

      They are related. Black-and-white TV ran at 30 frames per second for easy timing, since the US power grid is 60 Hz. Then the introduction of color added two more channels and caused interference between them. The signal stayed backwards compatible (the luminance channel was still black and white, while color TVs used the two additional channels for color information), but color TVs had an interference problem. The whole 29.97 figure is the result of halving 60/1.001 ≈ 59.94. That 0.1% slowdown was there to prevent dot crawl, or chroma crawl (that interference). So all of today's 29.97 video, even digital, exists purely for backwards compatibility with B&W TVs that no longer exist, which is certainly pointless in digital formats.
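The arithmetic in that comment can be checked directly. A minimal sketch (the 1.001 divisor is the standard NTSC color-timing adjustment the comment describes):

```python
# NTSC color timing: the 0.1% slowdown that produced 29.97 fps.
COLOR_ADJUSTMENT = 1.001  # standard NTSC divisor introduced for color broadcast

field_rate = 60 / COLOR_ADJUSTMENT  # ≈ 59.94 fields per second
frame_rate = field_rate / 2         # ≈ 29.97 frames per second (interlaced: 2 fields = 1 frame)

print(round(field_rate, 2))  # 59.94
print(round(frame_rate, 2))  # 29.97
```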

      On the other hand, 24 fps was just a convenience pick. It's easily divisible by 2, 3, 4, 6, and so on, and it was good enough at a time when film stock was expensive. Europe's power grid runs at 50 Hz, so they got 25 fps, and movies stuck with 24, which was good enough and close enough to all the others. They still use this frame rate today, which is a joke considering you can get 8K resolution but a frame rate of a lantern show from last century.
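On the divisibility point, and on the 144 Hz question from the top of the thread: a quick sketch checking which common refresh rates divide evenly by 60 versus by film's 24 fps. (That 144 was picked as a clean multiple of 24 is an often-cited explanation, not something stated in the comment above.)

```python
# Check common refresh rates against the 60 Hz grid rate and film's 24 fps.
rates = [60, 120, 144, 240, 360]

for r in rates:
    tags = []
    if r % 60 == 0:
        tags.append("multiple of 60")
    if r % 24 == 0:
        tags.append("multiple of 24 (film)")
    print(r, "->", ", ".join(tags))
```

Running this shows 144 is the odd one out: not a multiple of 60, but exactly 6 × 24.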

      • TheRealKuni@lemmy.world · 8 months ago

        movies stuck with 24 which was good enough but close enough to all the others. They still use this framerate today which is a joke considering you can get 8K video in resolution but have frame rate of a lantern show from last century.

        “But when I saw The Hobbit with 48fps it looked so cheap and fake!”

        😑

        • MeanEYE@lemmy.world · 8 months ago

          Because it was fake. :) It's much harder to hide an actor's inability to fight when you can see things actually moving instead of a blurry frame, or to hide poor animation when your eyes have time to pick out the details. Watch a good fighting movie like Ong Bak or anything by Jackie Chan and you'll be fine, because they actually know how to fight. No faking needed.

          • TheRealKuni@lemmy.world · 8 months ago

            Yep! Not the only issue with it, but certainly one of them.

            We also have everyone associating smooth motion with soap operas because of cheap digital television cameras (IIRC).

            I like higher frame rates. Sweeping shots and action scenes at 24 fps can be so jarring when you're used to video games.

          • TheRealKuni@lemmy.world · 8 months ago

            Of course it did; Weta had no lead time at all, whereas they had years for the original LotR trilogy. They were set up for failure.

            But unfortunately it ruined the industry's perception of 48 fps movies for years. To the point that when the new Avatar came out last year, they were like "it's 48 fps, but we promise we double up frames for some scenes, so those are effectively 24 fps, don't worry!"

    • smallaubergine@kbin.social · 8 months ago

      It's actually 23.976, and yes, it's because of NTSC frame rates. But increasingly things are now shot at a flat 24p, since we're not as tied to the NTSC frame rate these days.
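The 23.976 figure comes from the same 1.001 NTSC adjustment mentioned earlier in the thread, applied to film's 24 fps:

```python
# 23.976 is just 24 fps slowed by the same NTSC 1.001 factor.
ntsc_film_rate = 24 / 1.001  # ≈ 23.976 fps, often labeled "23.98"

print(round(ntsc_film_rate, 3))  # 23.976
```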