
Dentons Data

Your trusted advisor for all things digital.


Airline ordered to compensate a B.C. man because its chatbot provided inaccurate information

By Kirsten Thompson
February 15, 2024
  • Artificial Intelligence
  • Litigation
  • Technology

In the British Columbia Civil Resolution Tribunal’s decision Moffatt v. Air Canada, 2024 BCCRT 149, tribunal member Christopher C. Rivers (“Member”) found in favour of a man who relied on fare information provided by an airline’s chatbot when he was booking a flight to attend his grandmother’s funeral.

The Member found that the airline owed a duty of care to chatbot users, that it “did not take reasonable care to ensure its chatbot was accurate,” and that the resulting inaccuracies amounted to negligent misrepresentation.

Mr. Moffatt was awarded $650.88 in damages for negligent misrepresentation, essentially the difference between the lower bereavement fare price and the price he had actually paid.

Background

Mr. Moffatt was booking a flight to Toronto following the death of his grandmother, and had asked the chatbot about the airline’s bereavement rates – reduced fares available to a person who needs to travel due to the death of an immediate family member.

Mr. Moffatt claimed that the chatbot told him these fares could be claimed retroactively by completing a refund application within 90 days of the date the ticket was issued. Mr. Moffatt submitted a screenshot of his conversation with the chatbot as evidence supporting this claim, which said, in part, as follows:

Air Canada offers reduced bereavement fares if you need to travel because of an imminent death or a death in your immediate family.

…

If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form. (emphasis in original)

He submitted his request within a week of his travel, supported by a copy of his grandmother’s death certificate as required. The airline denied his request, saying that such requests could not be submitted retroactively. Mr. Moffatt’s attempts to receive a partial refund continued for another two-and-a-half months.

In February 2023, Mr. Moffatt emailed the airline and included the screenshot from the chatbot that set out the 90-day window to request a reduced rate and confirmed he had filled out the refund form and provided a death certificate.

An airline representative responded and admitted the chatbot had provided “misleading words.” The representative pointed out the chatbot’s link to the bereavement travel webpage containing the airline’s full policy stating that bereavement fares could not be claimed retroactively, and said Air Canada had noted the issue so it could update the chatbot.

However, Mr. Moffatt was still unable to get a partial refund, prompting him to file the claim with the tribunal.

Analysis

The airline argued that it could not be held liable for information provided by the chatbot, an argument emphatically rejected by the Member:

In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions. This is a remarkable submission. While a chatbot has an interactive component, it is still just a part of Air Canada’s website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.

Para. 27

The airline also argued that the chatbot’s response had included a link to a section of its website that outlined the company’s policy, which said refund requests after travel had occurred were not permitted. This, claimed the airline, was the controlling information.

The Member rejected this argument as well, saying:

While Air Canada argues Mr. Moffatt could find the correct information on another part of its website, it does not explain why the webpage titled “Bereavement travel” was inherently more trustworthy than its chatbot. It also does not explain why customers should have to double-check information found in one part of its website on another part of its website.

Para. 28

The Member accepted Mr. Moffatt’s claim that he had relied upon the chatbot to provide accurate information, and found this to be reasonable in the circumstances. The Member noted that “[t]here is no reason why Mr. Moffatt should know that one section of Air Canada’s webpage is accurate, and another is not.”

The compensation the tribunal awarded was the difference between what Mr. Moffatt paid for his flight and the discounted bereavement fare: $650.88 in damages for negligent misrepresentation. In addition, the airline was ordered to pay $36.14 in pre-judgment interest and $125 in Tribunal fees.

Takeaways

It is clear from this decision that organizations will be held responsible for the representations made by their chatbots. Organizations should have processes in place to verify (and update) the information their chatbots provide, and ensure that such information aligns with the organization’s policies and positions. It will not be sufficient to cure a chatbot’s misrepresentation or inaccurate summary merely by providing a link to the actual policy or document.

When chatbots are provided by third parties, both the organization and the chatbot provider should ensure contractual terms appropriately set out which party is responsible (and liable) for the chatbot’s content, training, and responses.


About Kirsten Thompson

Kirsten Thompson is a partner and the national lead of Dentons’ Privacy and Cybersecurity group. She has both an advisory and advocacy practice, and provides privacy, data security and data management advice to clients in a wide variety of industries.

