Fitbit lets you upload medical records and ask its AI for advice now – but is that safe?


Fitbit personal health coach (Image: Google)



ZDNET’s key takeaways 

  • Fitbit users will soon be able to upload medical records to the app.
  • Its AI health coach reviews these records to generate responses. 
  • Google also unveiled other Fitbit feature upgrades.

The era of the AI-powered health coach is in full swing, and now it’s moving into medical territory. Google is among the latest tech companies leading the charge. On Tuesday, it announced updates to Fitbit’s personal health coach that will further integrate medical data with AI. 

Fitbit users will soon be able to connect their medical records to the Fitbit app for the personal health coach’s review. The AI-powered health coach can contextualize and use this data in response to health questions. The data includes lab results, medications, and visit history, according to a press release from Google, Fitbit’s owner.

Other, smaller wearable health technology companies already offer similar features through their apps. They include the fitness band maker Whoop, which lets users upload records to its app and then ask its AI questions about them. 

“When your coach understands your medical history, its guidance becomes safer, more relevant and more personalized,” Google said in a press release.

The feature comes at a time when more people are turning to AI for guidance on everything from task management to health regimens. Eight in 10 US adults go online to look up health information, and over two-thirds of Americans find that information reliable. Meanwhile, some medical professionals worry that over-reliance on AI for health information could lead patients to self-diagnose or pursue incorrect treatment. 

Also: Asking AI for medical advice? There’s a right and wrong way, one doctor explains

Google said that by connecting medical data to Fitbit, the personal health coach’s responses and advice are more personalized to the user. If a user is curious about the implications of a recent lab test, they can now query the AI for help. And when a Fitbit user asks the personal health coach how to improve their cholesterol, the AI can use the medical data and lifestyle patterns (such as sleep, diet, and activity) recorded on Fitbit for helpful responses and advice. 

When a conversation goes too far into medical territory, Google said that the AI coach will remind users to consult with a healthcare professional for medical needs. In other words, it’s not going to give medical advice, diagnose, or offer a treatment plan. 

The feature will arrive next month. 

Perhaps you’re reading this news and scratching your head, worrying about the privacy implications of connecting your health records to a consumer wearable owned by Google. Google said that medical records are stored securely within Fitbit and that users will control how they are used. It didn’t specify whether the data will be stored on-device or on Fitbit’s servers. 

Also: OpenAI, Anthropic, and Google all have new AI healthcare tools – here’s how they work

Florence Thng, Google’s Health Intelligence product management director, said in the press release that users’ medical records, like other health data in Fitbit, are not used for ads. 

Chatbots have a tendency to hallucinate, or make up information, and this could pose a risk to Fitbit users asking the coach questions about their health. Google acknowledged large language models’ limitations in an email to ZDNET, saying it is committed to ongoing evaluation and improvement. 

“We acknowledge that Large Language Models can have limitations, including potential inaccuracies and hallucination. To address this, we invest heavily in a validation process to enhance the quality of our models. This involves ongoing evaluations and the use of advanced architectures that rely on tools and self-critique to improve reliability and reduce the chance of inaccurate responses,” a Google spokesperson said in an email. 

Google is bringing another health device to its ecosystem for Fitbit users to try out. Starting next month in public preview, users will be able to connect a continuous glucose monitor to Fitbit and query the health AI for insights into their blood sugar trends after a meal or workout. 

Also: Are AI health coach subscriptions a scam? My verdict after testing Fitbit’s for a month

The company also announced improved sleep-staging accuracy, which will roll out to public preview users over the next few days. The updated models can better distinguish naps, interruptions, and transitions between sleep stages. 





Do you ever walk past a person on the streets exhibiting mental health issues and wonder what happened to their family? I have a brother—or at least, I used to. I worry about where he is and hope he is safe. He hasn’t taken my call since 2014.

James and his brother as young children playing together before his brother became sick. James is on the right and his brother is on the left.

When I was 13, I had a very bad day. I was in the back of the car, and what I remember most was the world-crushing sound violently panging off every surface: he was pounding his fists into the steering wheel, and I worried it would break apart. He was screaming at me and my mother, and I remember the web of saliva and tears hanging over his mouth. His eyes were red, and I knew this day would change everything between us. My brother was sick.

Nearly 20 years later, I still have trouble thinking about him. By the time we realized he was mentally ill, he was no longer a minor. The police brought him to a facility for the standard 72-hour hold, where he was diagnosed with paranoid delusional schizophrenia. Concluding he was not a danger to himself or others, they released him.

There was only one problem: at 18, my brother told the facility he was not related to us and that we were imposters. When they let him out, he refused to come home.

My parents sought help and even arranged for medication, but he didn’t take it. Before long, he disappeared.

My brother’s decline and disappearance had nothing to do with the common narratives about drug use or criminal behavior. He was sick. By the time my family discovered his condition, he was already 18 and legally independent from our custody.

The last time he let me visit, I asked about his bed. I remember seeing his dirty mattress on the floor beside broken glass and garbage. I also asked about the laptop my parents had gifted him just a year earlier. He needed the money, he said—and he had maxed out my parents’ credit card.

In secret from my parents, I gave him all the cash I had saved. I just wanted him to be alright.

My parents and I tried texting and calling him; there was no response except the occasional text every few weeks. But weeks turned into months.

Before long, I was graduating from high school. I begged him to come. When I looked in the bleachers, he was nowhere to be seen. I couldn’t help but wonder what I had done wrong.

The last time I heard from him was over the phone in 2014. I tried to tell him about our parents and how much we all missed him. I asked him to be my brother again, but he cut me off, saying he was never my brother. After a pause, he admitted we could be friends. Making the toughest call of my life, I told him he was my brother—and if he ever remembers that, I’ll be there, ready for him to come back.

I’m now 32 years old. I often wonder how different our lives would have been if he had been diagnosed as a minor and received appropriate care. The laws in place do not help families in my situation.

My brother has no social media, and we suspect he traded his phone several years ago. My family has hired private investigators over the years, who have also worked with local police to try to track him down.

One private investigator’s report indicated an artist befriended my brother many years ago. When my mother tried contacting the artist, they said whatever happened between them was best left in the past and declined to respond. My mom had wanted to wish my brother a happy 30th birthday.

My brother grew up in a safe, middle-class home with two parents. He had no history of drug use or criminal record. He loved collecting vintage basketball cards, eating mint chocolate chip ice cream, and listening to Motown music. To my parents, there was no smoking gun indicating he needed help before it was too late.

The next time you think about a person screaming outside on the street, picture their families. We need policies and services that allow families to locate and support their loved ones living with mental illness, and stronger protections to ensure that individuals leaving facilities can transition into stable care. Current laws, including age-based consent rules, the limits of 72-hour holds, and the lack of step-down or supported housing options, leave too many families without resources when a serious diagnosis occurs.

Governments and lawmakers need to do better for people like my brother. As someone who thinks about him every day, I can tell you the burden is too heavy to carry alone.

James Finney-Conlon is a concerned brother and mental health advocate. He can be reached at [email protected].
