• mrginger@lemmy.world · 11 months ago

    Any one of us who actually codes/scripts knows ChatGPT spits out hot garbage when asked to produce anything beyond maybe a single short one- or two-line code snippet or bash/powershell command. Like the article said, the AI lacks the context of what you’re trying to do. It will confidently spit out either completely wrong or made-up code with commands that don’t even exist.

    Also, this will go really fucking well. Don’t give them any ideas.

    Kabir said, "From our findings and observation from this research, we would suggest that Stack Overflow may want to incorporate effective methods to detect toxicity and negative sentiments in comments and answers in order to improve sentiment and politeness."

    • lloram239@feddit.de · 11 months ago
      Any one of us who actually codes/scripts knows ChatGPT spits out hot garbage

      Not my experience at all. What it spits out is almost always pretty damn close to the goal; for shell one-liners it’s easily a better programmer than I am. Sometimes it might invent API calls that don’t exist, but so would any human who isn’t allowed to look up the documentation or compile the code for testing. I don’t think I have ever seen ChatGPT spit out anything remotely close to “hot garbage”. The situations where it fails worst are the ones where there isn’t any good solution to begin with.
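
      To be concrete about the kind of one-liner I mean (my own illustration, not actual ChatGPT output): ask for “the ten largest files under the current directory, with sizes” and the answer should be something along the lines of

          # list the ten largest files under the current directory (GNU find/sort)
          find . -type f -printf '%s\t%p\n' | sort -rn | head -n 10

      which is exactly the sort of thing I’d otherwise have to piece together from man pages.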

      It will confidently spit out either completely wrong or made-up code with commands that don’t even exist.

      And it will often be able to correct them when you tell it what’s wrong or when you provide it with compiler error messages.