Flutterby™!
LLMs lie. It's what they do.
2024-02-15 18:48:19.209173+01 by Dan Lyke
Air Canada's chatbot gave a B.C. man the wrong information. Now, the airline has to pay for the mistake
Air Canada, for its part, argued that it could not be held liable for information provided by the bot.
"In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions. This is a remarkable submission. While a chatbot has an interactive component, it is still just a part of Air Canada’s website," Rivers wrote.
Civil Resolution Tribunal: Moffatt v. Air Canada
[ related topics: Aviation Law ]