Not all searches get AI answers, but Google has been steadily expanding this feature since it debuted last year. One searcher on Reddit spotted a troubling confabulation when searching for crashes involving Airbus planes. AI Overviews, apparently overwhelmed with results reporting on the Air India crash, stated confidently (and incorrectly) that it was an Airbus A330 that fell out of the sky shortly after takeoff. We’ve run a few similar searches—some of the AI results say Boeing, some say Airbus, and some include a strange mashup blaming both Airbus and Boeing. It’s a mess.

Always remember that AI, or more accurately LLMs, are just glorified predictive text, like the autocomplete on your phone. Don’t trust them. Maybe someday they will be reliable, but that day isn’t today.
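The “glorified predictive text” framing can be made concrete with a toy sketch. The hypothetical Python bigram model below (vastly simpler than a real transformer, and purely illustrative) does the same basic thing phone autocomplete does: it counts which word tends to follow which, then predicts the most frequent one.

```python
from collections import Counter, defaultdict

# Toy corpus for illustration only -- not real training data.
corpus = (
    "the plane took off the plane landed safely "
    "the plane took off on time"
).split()

# Count which word follows which word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word`, or None if unseen."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("plane"))  # "took" -- seen twice, vs. "landed" once
```

Real LLMs replace the word counts with a transformer network over tokens and billions of parameters, but the core training objective is still the same: predict the most plausible next token, not state the verified truth.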

  • ctrl_alt_esc@lemmy.ml · 2 points · 9 days ago

    Or because it was programmed with a bias to respond in a certain way. There may not be intent on the LLM’s part, but the same is not necessarily true of its developers.

    • Boddhisatva@lemmy.world (OP) · 2 points · 8 days ago

      There may not be intent on the LLM’s part

      There can’t be intent on the part of a non-sentient program. It has working code, flawed code, and probably intentionally biased code. Don’t think of it as a being that intends anything.