Black patients get worse care from AI tools

By Dylan Bettencourt

  • Studies show some AI tools tell women with serious symptoms to stay home, while Black and Asian patients receive less empathetic responses when seeking mental health support.
  • Researchers warn that patients who write in broken English or slang may also be told not to seek help, even when their symptoms are serious.

Artificial intelligence is becoming common in hospitals, but new research shows it could put women and people of colour in danger.

Studies from top universities in the US and UK found AI tools often give weaker advice to women and show less care to Black and Asian patients.

These tools, powered by systems like ChatGPT and Google’s Gemini, are meant to help doctors by summarising notes and speeding up decisions. But experts say the technology is copying harmful biases from the internet and other training data, The Financial Times reported. 

MIT researchers found some AI models told women with serious symptoms to stay home instead of seeing a doctor. The same systems showed less empathy to Black and Asian patients asking for mental health support.

Professor Marzyeh Ghassemi from MIT said this could lead to poor treatment “based purely on perceived race”.

The London School of Economics found Google’s Gemma model also downplayed women’s health concerns, particularly when used in social work settings.

Researchers also discovered that patients who used broken English or informal language were more likely to be told not to seek help, even when they described the same symptoms as others.

Google and OpenAI said they are working to fix the problem. But experts warn that unless AI is trained on diverse medical data, it risks making health care even more unequal.

Pictured above: AI tools are helping doctors but may treat patients differently based on race or gender.

Image source: Stock image
