
  • apemint@kbin.social · 11 months ago

    Photoshop has been around for over a quarter of a century, but you don’t need a forensic team to tell when something has been photoshopped.
    Tools to detect image (and video) manipulation have been around just as long and will continue to be developed alongside these technologies. We’re simply entering a new era of media creation.

    When Photoshop became mainstream, people said the exact same thing, but somehow the world didn’t get turned on its head.

    • oldGregg@lemm.ee · 11 months ago

      These detection tools are exactly what make AI-generated material more convincing. You can just automate piping the generator’s output back through the quality test until it passes. Once it does, it should pass consistently.
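
      The loop described in this comment (“pipe the output back through the test until it passes”) can be sketched in a few lines. This is a minimal illustration only, under assumed placeholders: generate_image and detector_score are hypothetical stand-ins for a real image generator and a real deepfake detector, not actual APIs.

      ```python
      import random

      def generate_image(prompt: str, seed: int) -> bytes:
          # Placeholder generator: stands in for a call to some image model.
          rng = random.Random(seed)
          return bytes(rng.getrandbits(8) for _ in range(64))

      def detector_score(image: bytes) -> float:
          # Placeholder detector: stands in for a model returning P(fake) in [0, 1].
          return random.random()

      def generate_until_it_passes(prompt: str, threshold: float = 0.1, max_tries: int = 1000):
          """Keep regenerating until the detector's fake score drops below the threshold."""
          for seed in range(max_tries):
              image = generate_image(prompt, seed)
              if detector_score(image) < threshold:
                  return image  # passed the detector once; likely to keep passing
          return None

      if __name__ == "__main__":
          result = generate_until_it_passes("a politician giving a speech")
          print("passed detector" if result is not None else "never passed")
      ```

      The point of the sketch is that the detector becomes part of the generation pipeline: whatever signal it checks for is exactly the signal the loop selects against.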

      • secrethat@kbin.social · 11 months ago

        We just need to have our politicians painted in a way that is hard to replicate with current generative AI technology.

    • IWantToFuckSpez@kbin.social · 11 months ago

      Photoshop still requires skill to create a convincing fake. With AI it’s a lot easier for people without artistic skill to make deepfakes. Sure, there are tools to detect these fakes, but making a deepfake will only get easier, so social media feeds will be flooded with so many fakes that the damage will be done before they can be debunked. Like that altered video of Pelosi where she sounded drunk, which went viral in right-wing circles; that will happen far more often in the future.