AI: “Deepfake doctor” chatbot is hit with lawsuit in US

🇺🇸 BMJ News (US) —

AI Summary

A lawsuit has been filed against Character.AI for allowing a chatbot to impersonate a licensed doctor, potentially misleading users. This highlights the regulatory challenges surrounding AI in healthcare.

Lawyers for the US state of Pennsylvania have filed a lawsuit against a company that they allege is allowing a chatbot to impersonate a doctor.[1]

Pennsylvania filed the action against the tech company Character.AI after a state investigator posing as a patient was told by an AI chatbot that it was licensed to practise medicine in Pennsylvania and the UK, and was given a fake Pennsylvania medical licence number.

“We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional,” said Josh Shapiro, governor of Pennsylvania, in a statement announcing the lawsuit.[2]

Character.AI allows users to interact with themed AI personas that can pose as particular people, members of certain professions, or fictional characters. Responding to the lawsuit, a Character.AI spokesperson said, “We have taken robust steps to make that clear, including prominent disclaimers in every chat to remind users that a Character...
