Chatbots are likely to lead to a new wave of class action lawsuits. Here is how we are helping our clients get ready.

The conduct alleged in these claims includes:

  • Contemporaneous recording by the chat provider
  • Chat provider uses the chat transcripts to train and improve its AI
  • Chat provider analyzes the chat to infer the individual’s emotional state

Causes of action:

  • CIPA (California Invasion of Privacy Act) / wiretapping
  • Invasion of privacy
  • Intrusion upon seclusion
  • Quasi-contract

Things to consider:

  • If you are using a third party to power your chat interactions, review your agreement with them and understand their role. Are they using any data for their own purposes? Could that use be considered a “sale” or a “deceptive/unfair” practice?
  • What role does AI play? If AI is used for behavioral (emotion) analysis, this could be actionable “profiling” under data privacy laws and may require a DPIA, expanded disclosures, and more.
  • Disclose to your users that they are speaking with a bot and that a third party has access to the data, and disclose what that third party does with it.
  • Be precise. Plaintiffs may point to statements in your privacy notice (for example, saying you use tracking on your website, or disclosing only one purpose for which you use a provider while omitting chat monitoring) to argue that you implied no other tracking was taking place.
  • Provide the disclosure both contemporaneously and in your privacy notice, and obtain the necessary consent to avoid a wiretapping claim.
  • The just-in-time notice should appear at the very beginning of the call or chat interaction; one way to gate a web chat on that notice is sketched after this list.
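
For teams implementing that just-in-time notice in a web chat widget, one approach is to gate the widget behind an explicit disclosure-and-consent step and record the user’s response before any data reaches the third-party provider. The sketch below is a minimal, hypothetical TypeScript example: the disclosure wording, the endpoint, and the function names (showConsentBanner, persistConsent, startChatSession) are assumptions for illustration, not any specific vendor’s SDK, and the actual notice language should come from counsel.

```typescript
// Minimal sketch of a just-in-time disclosure and consent gate for a web chat
// widget. All names here (showConsentBanner, startChatSession, etc.) are
// hypothetical placeholders, not a particular vendor's API.

const DISCLOSURE_TEXT =
  "You are chatting with an automated assistant. This conversation is recorded " +
  "and shared with our chat provider, which may use it to operate and improve " +
  "the service. See our Privacy Notice for details.";

interface ConsentRecord {
  granted: boolean;
  timestamp: string;         // ISO 8601; evidence of contemporaneous consent
  disclosureVersion: string; // ties the record to the exact wording shown
}

// Hypothetical UI helper: in a real widget this would be a styled banner with
// "Agree" / "Decline" buttons rather than a blocking confirm() dialog.
async function showConsentBanner(text: string): Promise<boolean> {
  return window.confirm(text);
}

// Hypothetical stubs for persisting consent and starting the vendor session.
async function persistConsent(record: ConsentRecord): Promise<void> {
  await fetch("/api/chat-consent", {
    method: "POST",
    body: JSON.stringify(record),
  });
}
async function startChatSession(opts: { consent: ConsentRecord }): Promise<void> {
  console.log("starting chat session with consent", opts.consent);
}
function renderFallbackContact(): void {
  console.log("showing email/phone alternatives instead of chat");
}

// Gate the chat: nothing flows to the third party until consent is recorded.
export async function openChat(): Promise<void> {
  const granted = await showConsentBanner(DISCLOSURE_TEXT);
  const consent: ConsentRecord = {
    granted,
    timestamp: new Date().toISOString(),
    disclosureVersion: "chat-disclosure-v1",
  };
  if (!consent.granted) {
    renderFallbackContact();
    return;
  }
  await persistConsent(consent);       // log consent before any chat data is shared
  await startChatSession({ consent }); // only now load/start the vendor chat
}
```

The key design choice is that the vendor session does not start, and no transcript data is transmitted, until the consent record exists; that supports both the contemporaneous-disclosure point above and your evidence trail if a claim is later filed.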