• micka190@lemmy.world
    17 days ago

    “Instructions” is probably the wrong word here (I was mostly trying to dumb it down for people who aren’t familiar with graphics rendering terminology).

    Here’s a link to the Digital Foundry video I was talking about (didn’t realize they made like 5 videos for Alan Wake 2, so it took a bit to find it).

    The big thing, in Alan Wake 2’s case, is that it uses Mesh Shaders. The video I linked above goes into it at around the 3:38 mark.

    AMD has a pretty detailed article on how they work here.

    This /r/GameDev post here has some devs explaining why it’s useful in a more accessible manner.

    The idea is that it allows offloading more work to the GPU in ways that are much better performance-wise. It just requires that the hardware actually support it, which is why you basically need an RTX card for Alan Wake 2 (or whichever AMD GPU supports Mesh Shaders, I’m not as familiar with their cards).
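    To make the “offloading more work to the GPU” idea concrete: mesh shaders consume a mesh that has been pre-split into small chunks called meshlets, each capped at a fixed vertex and triangle count so a single GPU workgroup can process one chunk. Here’s a minimal, illustrative CPU-side sketch in Python (not real GPU code; the 64-vertex / 126-triangle limits are commonly recommended values I’m assuming here, not something from the video):

    ```python
    # Illustrative sketch: greedily pack an indexed triangle list into
    # "meshlets" with capped vertex/triangle counts, as a mesh shading
    # pipeline expects. Limits below are assumed common recommendations.
    MAX_VERTICES = 64
    MAX_TRIANGLES = 126

    def build_meshlets(indices):
        """Split a flat triangle index list into meshlet dicts."""
        meshlets = []
        current = {"vertices": [], "triangles": []}
        vert_map = {}  # global vertex index -> local index in this meshlet

        for i in range(0, len(indices), 3):
            tri = indices[i:i + 3]
            new_verts = [v for v in dict.fromkeys(tri) if v not in vert_map]

            # Start a new meshlet if this triangle would exceed either limit.
            if (len(vert_map) + len(new_verts) > MAX_VERTICES
                    or len(current["triangles"]) >= MAX_TRIANGLES):
                meshlets.append(current)
                current = {"vertices": [], "triangles": []}
                vert_map = {}
                new_verts = list(dict.fromkeys(tri))

            for v in new_verts:
                vert_map[v] = len(current["vertices"])
                current["vertices"].append(v)
            # Store the triangle with meshlet-local indices.
            current["triangles"].append(tuple(vert_map[v] for v in tri))

        if current["triangles"]:
            meshlets.append(current)
        return meshlets

    # Two triangles sharing an edge (a quad) fit in one meshlet:
    quad = build_meshlets([0, 1, 2, 2, 1, 3])
    ```

    The point is that culling and level-of-detail decisions can then happen per-meshlet on the GPU, instead of the CPU feeding the fixed-function vertex pipeline one big draw at a time.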

    • uis@lemm.ee
      17 days ago

      Ah, mesh shaders. Cool stuff. AMD retroactively added them to their old GPUs in drivers. I think the same goes for Intel’s post-Ivy Bridge GPUs (the send opcode can throw primitives into the 3D pipeline; if you’re interested, you can go read the docs). I guess Nvidia can do something similar.

      And even if they don’t have such a straightforward way of implementing them, they could probably (my guess, I could be wrong) be emulated with geometry shaders.

      What I don’t like is the apparent removal of the vertex fetcher, but maybe an extension will bring it back.

      • micka190@lemmy.world
        17 days ago

        I could be wrong, but I’m pretty sure Nvidia has patched them into the GTX series; they’re just really slow compared to RTX cards.