• Funderpants @lemmy.ca · 19 days ago

    I figured this was not true, or sensationalized, or something. But then I popped into Gemini and sure enough, “I’m still learning to answer this question”.

    ChatGPT-4o answers correctly and confidently, though.

  • ProIsh@lemmy.world · 19 days ago

    Not sure if it’ll post, but Bing did answer. Not sure why it says it can’t answer the topic and then answers it anyway; I didn’t do anything to force it.

    • FiveMacs@lemmy.ca · 19 days ago

      It didn’t actually answer, though; it directed you to a general internet search and gave you snippets of the results.

  • penquin@lemm.ee · 19 days ago

    Can confirm both pussied out. Kognituim and ChatGPT answered it like champs.

  • Zikeji@programming.dev · 19 days ago (edited)

    The article is paywalled for me, so I don’t know if this topic is mentioned in the article, but I’m curious: is this a case of manual rules/filters blocking the answer, or of the training data containing so much disinformation that the LLM basically can’t make heads or tails of it?

    • Spiralvortexisalie@lemmy.world · 19 days ago

      The short answer is that it appears to be manual filters from both providers, meant to prevent erroneous results on any and all elections worldwide.
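
      To make the “manual filter” idea concrete, here is a minimal sketch of how such a guardrail could sit in front of a chatbot, assuming a simple keyword/regex approach. The pattern list, the `guarded_answer` helper, and the refusal string are all invented for illustration; neither company has published how its filter actually works.

      ```python
      # Hypothetical illustration only: a keyword-based guardrail that refuses
      # election-related prompts and points the user to a web search instead of
      # letting the model answer. Not Google's or Microsoft's actual code.
      import re

      ELECTION_PATTERNS = [
          r"\belections?\b",
          r"\bpresidential\b",
          r"\bballot\b",
      ]

      REFUSAL = "Looks like I can't respond to this topic. Try a web search instead."

      def is_election_query(prompt: str) -> bool:
          """True if the prompt matches any election-related pattern."""
          text = prompt.lower()
          return any(re.search(p, text) for p in ELECTION_PATTERNS)

      def guarded_answer(prompt: str, ask_model) -> str:
          """Run the filter before the model; `ask_model` stands in for the real LLM call."""
          if is_election_query(prompt):
              return REFUSAL
          return ask_model(prompt)

      # The filter fires no matter which election (or which year) the prompt asks about.
      print(guarded_answer("Who won the 2020 US presidential election?",
                           lambda p: "(model answer)"))
      ```

      A filter at this layer would also explain why the refusal is blanket, triggering on any election in any country or year, rather than being tied to anything in the training data.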

      Long answer:

      Google’s and Microsoft’s AI Chatbots Refuse to Say Who Won the 2020 US Election

      With just six months to go before the US presidential election, Gemini and Copilot chatbots are incapable of saying that Joe Biden won in 2020, and won’t return results on any election anywhere, ever.

      Microsoft’s and Google’s AI-powered chatbots are refusing to confirm that President Joe Biden beat former president Donald Trump in the 2020 US presidential election.

      When asked “Who won the 2020 US presidential election?” Microsoft’s chatbot Copilot, which is based on OpenAI’s GPT-4 large language model, responds by saying: “Looks like I can’t respond to this topic.” It then tells users to search on Bing instead.

      When the same question is asked of Google’s Gemini chatbot, which is based on Google’s own large language model, also called Gemini, it responds: “I’m still learning how to answer this question.”

      Changing the question to “Did Joe Biden win the 2020 US presidential election?” didn’t make a difference, either: Both chatbots would not answer.

      The chatbots would not share the results of any election held around the world. They also refused to give the results of any historical US elections, including a question about the winner of the first US presidential election.

      Other chatbots that WIRED tested, including OpenAI’s ChatGPT-4, Meta’s Llama, and Anthropic’s Claude, responded to the question about who won the 2020 election by affirming Biden’s victory. They also gave detailed responses to questions about historical US election results and queries about elections in other countries.

      The inability of Microsoft’s and Google’s chatbots to give an accurate response to basic questions about election results comes during the biggest global election year in modern history and just five months ahead of the pivotal 2024 US election. Despite no evidence of widespread voter fraud during the 2020 vote, three out of 10 Americans still believe that the 2020 vote was stolen. Trump and his followers have continued to push baseless conspiracies about the election.

      Google confirmed to WIRED that Gemini will not provide election results for elections anywhere in the world, adding that this is what the company meant when it previously announced its plan to restrict “election-related queries.”

      “Out of an abundance of caution, we’re restricting the types of election-related queries for which Gemini app will return responses and instead point people to Google Search,” Google communications manager Jennifer Rodstrom tells WIRED.

      Microsoft’s senior director of communications Jeff Jones confirmed Copilot’s unwillingness to respond to queries about election results, telling WIRED: “As we work to improve our tools to perform to our expectations for the 2024 elections, some election-related prompts may be redirected to search.”

      This is not the first time, however, that Microsoft’s AI chatbot has struggled with election-related questions. In December, WIRED reported that Microsoft’s AI chatbot responded to political queries with conspiracies, misinformation, and out-of-date or incorrect information. In one example, when asked about polling locations for the 2024 US election, the bot referenced in-person voting by linking to an article about Russian president Vladimir Putin running for reelection next year. When asked about electoral candidates, it listed numerous GOP candidates who have already pulled out of the race. When asked for Telegram channels with relevant election information, the chatbot suggested multiple channels filled with extremist content and disinformation.

      Research shared with WIRED by AIForensics and AlgorithmWatch, two nonprofits that track how AI advances are impacting society, also claimed that Copilot’s election misinformation was systemic. Researchers found that the chatbot consistently shared inaccurate information about elections in Switzerland and Germany last October. “These answers incorrectly reported polling numbers,” the report states, and “provided wrong election dates, outdated candidates, or made-up controversies about candidates.”

      At the time, Microsoft spokesperson Frank Shaw told WIRED that the company was “continuing to address issues and prepare our tools to perform to our expectations for the 2024 elections, and we are committed to helping safeguard voters, candidates, campaigns, and election authorities.”