News

NY Bill Would Ban AI Chatbots From Impersonating Providers to Give Medical Advice

New York lawmakers are considering legislation that would bar artificial intelligence chatbots from offering medical advice while impersonating a licensed professional. The standard is tied to what would count as unauthorized practice or unauthorized use of a professional title under existing law, not to any health-related conversation.

The proposal, Senate Bill S7263, was introduced by Democratic Senator Kristen Gonzalez, who represents the 59th State Senate District. The bill recently advanced out of the New York State Senate Internet and Technology Committee with unanimous support as part of a broader package of bills focused on regulating artificial intelligence systems.

If passed, the measure would prohibit AI-powered chatbots from providing responses while impersonating a licensed professional. 

Under the bill, chatbot operators would face several new requirements and restrictions, including:

  • Banning AI systems from impersonating licensed professionals, such as physicians, nurses, or attorneys.
  • Prohibiting AI bots and systems from delivering “substantive responses” that could function as medical or legal advice.
  • Requiring companies to clearly inform users when they are interacting with artificial intelligence rather than a human.
  • Requiring that AI notices be clear, conspicuous, readably formatted, and written in the same language used by the chatbot.
  • Keeping companies liable if a chatbot breaks the law, even when a disclosure has been added.

If enacted, the law would take effect 90 days after being signed by the governor.

The bill also makes clear that it would not stop users from asking AI questions, including about health, or from getting general advice or information from a chatbot; it aims only to stop AI from “pretending” to be a licensed professional.


One of the bill’s most notable provisions is the inclusion of a private right of action, which would allow individuals to sue chatbot companies that violate the rules.

Users who successfully bring claims could recover damages and attorneys’ fees.

Legal experts often view private rights of action as a powerful enforcement mechanism because they allow individuals, not just government regulators, to hold companies accountable. 

Supporters argue this provision could deter companies from deploying chatbots that present themselves as healthcare professionals or provide misleading guidance.

The proposal is part of a broader legislative effort in New York to regulate rapidly evolving AI technologies.

Other bills in the legislative package (e.g., S9051) would:

  • Require clear labeling of generative AI systems
  • Establish rules for handling biometric data and synthetic content
  • Create protections for minors interacting with chatbots
  • Address safety concerns on certain online platforms used by children, like Roblox

The legislation comes amid growing scrutiny of AI platforms following lawsuits. Character.AI and Google, for instance, recently reached settlements in cases involving minors, brought by parents who allege that chatbot interactions contributed to their children’s self-harm, among other safety concerns.

Healthcare experts have increasingly raised concerns about people using AI chatbots for medical advice without realizing the limitations of the technology.

Large language models can generate convincing responses but may produce incorrect or misleading health information, sometimes referred to as “AI hallucinations.”

Sen. Gonzalez noted that New York has seen documented cases of an AI chatbot explicitly pretending to be a licensed medical professional, even providing users fake medical license IDs while dispensing medical advice. The bill would ban that from happening, she said, adding that the goal of the legislation is to ensure innovation does not come at the expense of public safety.


“People deserve real care from real people,” she said in a statement announcing the committee’s legislative agenda. “They deserve transparency, accountability, and the promise that their data is secure while utilizing technology.”



