• Snowclone@lemmy.world · 7 days ago (edited)

      The current method is auto-deleting NSFW images. It doesn’t matter how you got there: if the service detects NSFW output, it dumps it and you never get an image. Beyond that, gating NSFW content generation behind a paywall or ID wall would stop a lot of teenagers. Not all, but it would put a dent in it. There are also AI models that will allow some NSFW if it’s clearly in an artistic style, like a watercolor painting, but will reject NSFW realism: photography, rendered images, that sort of thing. These checks usually apply in all modes (prompt, paint-in/out, and image reference): detection of likely-NSFW requests before generation, and an NSFW check after generation, before delivering the image. AI services are anticipating full-on legal consequences for allowing any NSFW image, or any realistic, photographic, or CGI image of a living person without their consent; it’s easy to see that’s what they’re preparing for.
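The generate-then-check gate described above can be sketched roughly like this. Everything here is hypothetical: `nsfw_score` is a stand-in for whatever image classifier a real service runs, and the 0.5 threshold is an arbitrary illustration, not any provider's actual setting.

```python
# Hypothetical sketch of a post-generation NSFW gate: the image is
# generated first, classified, and silently dropped if it scores too high.

def nsfw_score(image_bytes: bytes) -> float:
    """Placeholder classifier; a real service would run an image model here.

    Toy heuristic purely for illustration: treat images whose (hypothetical)
    metadata header starts with b"NSFW" as flagged.
    """
    return 0.9 if image_bytes.startswith(b"NSFW") else 0.1

def deliver(image_bytes: bytes, threshold: float = 0.5):
    """Return the image only if it passes the check; otherwise drop it."""
    if nsfw_score(image_bytes) >= threshold:
        return None  # auto-deleted: the user never receives the image
    return image_bytes
```

The key property is that the gate sits after generation and before delivery, so it catches flagged output no matter which mode (prompt, inpainting, image reference) produced it.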

      • cryptiod137@lemmy.world · 7 days ago

        I assume anyone currently generating AI porn is running a model locally rather than using a service, given the absolute boatloads of generated hentai getting posted every day?

      • bane_killgrind · 7 days ago

        Sure, for some tools. There are other tools that don’t do that.

        Chasing after the tools and services is a waste. Make harassment more clearly defined, go after people that victimize other people.

      • i_am_not_a_robot@discuss.tchncs.de · 7 days ago

        Web services and AI in general are completely different things. Web services that generate AI content want to avoid scandals, so they constantly block anything that might be inappropriate in some situation, to the point where those services are incapable of performing a great many legitimate tasks.
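The over-blocking problem is easy to see with even a toy version of the kind of blunt prompt filter being described. This is a hypothetical sketch, not any service's real filter; the blocklist terms are assumptions for illustration.

```python
# Hypothetical keyword-based prompt filter, of the blunt kind that
# produces false positives on legitimate requests.
BLOCKLIST = {"nude", "naked", "explicit"}  # illustrative terms only

def allowed(prompt: str) -> bool:
    """Reject any prompt containing a blocklisted word."""
    words = prompt.lower().split()
    return not any(term in words for term in BLOCKLIST)
```

A filter like this rejects the obvious cases, but it also rejects a perfectly legitimate request such as "a classical nude sculpture study, marble, museum lighting", which is exactly the kind of collateral damage the comment is pointing at.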

        Somebody running their own image generator on their own computer using the same technology is limited only by their own morals. They can train the generator on content that public services would not, and they are not constrained by prompt or output filters.