AI: Air Canada pays up for its chatbot's error

Feeling wronged by the airline after its chatbot invented a discount, Jake Moffatt filed a complaint against Air Canada. Although the company argued that the tool was "a separate legal entity", the Canadian tribunal ruled against it. And Air Canada is not the only company to run into trouble with this kind of use of AI.

Air Canada's chatbot gave a customer a nonexistent discount

$812.02. That is the amount Air Canada must pay to make good on its chatbot's error, under a decision of a Canadian civil tribunal issued on February 14. The story begins in 2022, when Jake Moffatt visited the airline's website to book plane tickets to attend his grandmother's funeral.

He asked the chatbot for advice, and it assured him that the company offers discounted fares for last-minute bookings due to bereavement. Moffatt then booked a $600 ticket, expecting Air Canada to refund part of the amount as long as he requested it within 90 days of booking, as reported by the Washington Post. On that last point, however, the chatbot was wrong. When he tried to obtain a refund, Jake Moffatt learned that the company only grants this type of fare when it is requested before the ticket is purchased.


For three months, the customer exchanged emails with Air Canada, which refused to reimburse him, prompting him to take the airline to court. For its part, Air Canada argued that the chatbot had been updated and that the error no longer appeared; the chatbot has since been deactivated.

According to Air Canada, the chatbot was a "separate legal entity"

The story may seem anecdotal, but Air Canada's defense is interesting. In court, the company argued that the chatbot was a "separate legal entity", "responsible for its own actions". The firm also maintained that the correct information was displayed on the pages of its website. The argument did not convince the tribunal, which held that Air Canada is responsible for all the information provided on its website. For Christopher Rivers, the tribunal member who decided the case, Mr. Moffatt could not be expected to know "that one of the sections of the site was valid and the other was not". The tribunal therefore ordered the company to pay the difference between the amount its customer paid and the fare quoted by the chatbot, as well as legal costs.


This issue could affect other businesses, since customer-service chatbots are one of the most common corporate applications of language models. Chatbots based on large language models (LLMs), however, are prone to what are called "hallucinations": they generate text probabilistically, and can therefore invent false information while maintaining a confident tone.
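To make that mechanism concrete, here is a minimal Python sketch of probabilistic next-token sampling, the process described above. The vocabulary, logits, and fare-policy prompt are invented for illustration and do not come from any real model.

```python
# Minimal sketch of probabilistic token sampling, the mechanism behind
# LLM "hallucinations": the model picks the next token by probability,
# not by factual accuracy. Vocabulary and logits are hypothetical.
import math
import random

def softmax(logits):
    # Convert raw scores into a probability distribution.
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical next-token candidates after a prompt like
# "Bereavement fares can be claimed ... the booking":
vocab = ["before", "after", "within"]
logits = [2.0, 1.8, 0.5]  # the wrong answer is nearly as likely as the right one

probs = softmax(logits)
choice = random.choices(vocab, weights=probs, k=1)[0]
print(dict(zip(vocab, [round(p, 2) for p in probs])), "->", choice)
# Whichever token is sampled, the model states it with the same
# confident tone, which is how a plausible but wrong policy detail
# ends up in a customer-facing answer.
```

The point of the sketch is that nothing in the sampling step checks the output against a policy document; without retrieval or guardrails, a fluent wrong answer is generated exactly like a fluent right one.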

General Motors' chatbot recommended a competing brand

This can produce comical situations. Recently, a customer of DPD, a parcel-delivery subsidiary of La Poste, amused himself by getting the company's chatbot to swear and insult the company. DPD had no choice but to deactivate the tool.

Likewise, a few months earlier, journalists at the site Carscoops reported that the General Motors chatbot readily recommended a competing brand (Tesla).
