A passenger has won a civil case against Air Canada in a legal row over the validity of an artificial intelligence (AI) chatbot’s advice.
Whilst mourning the death of his grandmother, Jake Moffatt turned to the airline's chatbot to make his case for a refund of his flight under the carrier's bereavement policy.
Air Canada, however, stated that Mr Moffatt did not follow the "proper procedure and cannot claim them retroactively", according to case documents provided by the Civil Resolution Tribunal of British Columbia (CRT).
But the tribunal found that Air Canada was responsible for the words of its chatbot, and that the refund policy it had invented and presented to Mr Moffatt would have to be honored.
Air Canada’s rogue AI
The case was heard by Christopher Rivers of the CRT, who described Air Canada's defence as a "remarkable submission".
Mr Moffatt provided the tribunal with screenshots of his conversation with the chatbot, which he had consulted on the day his grandmother passed away, seeking advice on bereavement support.
"Air Canada offers reduced bereavement fares if you need to travel because of an imminent death or a death in your immediate family," the chatbot's response reads.
“If you need to travel immediately or have already traveled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form.”
After contacting the airline, Mr Moffatt received a response from an Air Canada representative acknowledging that the chatbot had used "misleading words"; the company maintained it could not be held liable for information provided by any of its "agents, servants or representatives — including a chatbot".
Mr Rivers' review of the case led him to conclude that Air Canada did not "explain (to Mr Moffatt or the CRT) why it believes that is the case. In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions."
Air Canada also argued that the chatbot's advice contradicted the policy published elsewhere on its website, but Mr Rivers rebuffed this claim, noting that while the "chatbot has an interactive component, it is still just a part of Air Canada's website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot."
Mr Moffatt was awarded $650.88 in damages, $36.14 in pre-judgment interest under the Court Order Interest Act, and $125 in CRT fees.