Readit News
Posted by u/ar7hur 2 years ago
Show HN: Using GPT-3 and Whisper to save doctors’ time
Hey HN,

We're Alex, Martin and Laurent. We previously founded Wit.ai (W14), which we sold to Facebook in 2015. Since 2019, we've been working on Nabla (https://nabla.com), an intelligent assistant for health practitioners.

When GPT-3 was released in 2020, we investigated its use in a medical context[0], with mixed results.

Since then we’ve kept exploring opportunities at the intersection of healthcare and AI, and noticed that doctors spend an awful lot of time on medical documentation (writing clinical notes, updating their EHR, etc.).

Today, we're releasing Nabla Copilot, a Chrome extension generating clinical notes from video consultations, to address this problem.

You can try it out, without installation or sign-up, on our demo page: https://nabla.com/copilot-demo/

Here’s how it works under the hood:

- When a doctor starts a video consultation, our Chrome extension auto-starts itself and listens to the active tab as well as the doctor’s microphone.

- We then transcribe the consultation using a fine-tuned version of Whisper. We've trained Whisper on tens of thousands of hours of recordings of medical consultations and medical terminology, and have now reached an error rate 3× lower than Google's Speech-to-Text.

- Once we have the transcript, we feed it to a heavily fine-tuned GPT-3, which generates a clinical note.

- We finally return the clinical note to the doctor through our Chrome extension; the doctor can copy it to their EHR and send a version to the patient.
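In rough Python, the summarization step looks something like the sketch below. The prompt wording and function names are illustrative only (not our production code), and the fine-tuned model is stubbed with an injected callable so the sketch runs offline:

```python
# Illustrative sketch of the transcript-to-note step; prompt wording and
# function names are hypothetical, not Nabla's actual implementation.

def build_note_prompt(transcript: str) -> str:
    """Wrap the raw consultation transcript in an instruction prompt."""
    return (
        "Write a structured clinical note (chief complaint, history, "
        "assessment, plan) from this doctor-patient conversation:\n\n"
        + transcript
    )

def generate_note(transcript: str, summarize) -> str:
    """In production, `summarize` would call the fine-tuned GPT-3 model;
    injecting it as a parameter keeps the sketch testable offline."""
    return summarize(build_note_prompt(transcript))

# Offline demo with a stand-in summarizer:
fake_summarize = lambda prompt: "NOTE: " + prompt.splitlines()[-1]
note = generate_note("Patient reports knee pain for two weeks.", fake_summarize)
```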

This allows doctors to be fully focused on their consultation, and saves them a lot of time.

Next, we want to make this work for in-person consultations.

We also want to extract structured data (in the FHIR standard) from the clinical note, and feed it to the doctor’s EHR so that it is automatically added to the patient's record.
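To make the FHIR idea concrete, here is a hypothetical sketch of shaping one extracted diagnosis into a FHIR R4 Condition resource (the extraction step itself, and our actual mapping, are not shown; the example code and IDs are made up):

```python
# Illustrative only: shape an extracted diagnosis into a FHIR R4
# Condition resource as a plain dict, ready to POST to an EHR's FHIR API.

def to_fhir_condition(patient_id: str, snomed_code: str, display: str) -> dict:
    return {
        "resourceType": "Condition",
        "subject": {"reference": f"Patient/{patient_id}"},
        "code": {
            "coding": [{
                "system": "http://snomed.info/sct",  # SNOMED CT terminology
                "code": snomed_code,
                "display": display,
            }],
            "text": display,
        },
    }

# Hypothetical extracted diagnosis (code 38341003 is SNOMED for hypertension):
condition = to_fhir_condition("123", "38341003", "Hypertensive disorder")
```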

Happy to further discuss technical details in comments!

---

[0]: https://nabla.com/blog/gpt-3/

swatcoder · 2 years ago
I don't spend much time worrying about AI ethics, but bringing AI close to patient interactions and record-keeping in healthcare seems grossly premature and irresponsible. There are countless brilliant applications of AI and the last one we need is a mediocre automated transcriptionist that distances a doctor from their responsibility to engage with the patient and can bear no accountability for error.

This is a task that perhaps can be supported with AI some day, but there are fields that deserve the application of a mature technology, not the gold rush game of integrating today's hottest thing.

D13Fd · 2 years ago
The transcription part doesn't bother me. That's just advanced speech-to-text.

The summarization part, though, is dangerous. It's a very quick path to us losing all faith in our own medical records. Any way you slice it, no matter how much you train it, it's still going to be vulnerable to hallucination errors that slip by the reviewing doctor and become part of the patient's medical history.

ar7hur · 2 years ago
I understand your concerns. Our objective is, on the contrary, to reduce the doctor/patient distance created by EHRs and all the required administrative processes. We already measure that when AI takes care of this stuff, doctors engage better with patients.
ttpphd · 2 years ago
"I don't spend much time worrying about AI ethics"

Maybe you should re-evaluate whether that's the right choice based on your own comment, especially when ethics and safety experts are being fired by big tech companies.

JohnFen · 2 years ago
As a patient, this is an application that I would be extremely wary of, personally. I don't want the details of my conversations with my doctor to be sent to a third party in the first place, and I wouldn't trust the results of the transcription to be correct.

My doctor could vet it for accuracy, I suppose, but why? He's already putting his notes in my records anyway.

1123581321 · 2 years ago
Most people doing a video meeting with their doctor are already trusting their information to third parties in some way. I understand that you probably are in a different situation. Perhaps services like these are on the path to your more privacy-oriented doctor having a tool that helps them and is comfortable for knowledgeable patients.
JohnFen · 2 years ago
Yes, if you're doing telehealth, you're already taking a risk in terms of privacy. But the service provider doesn't need to actually pay attention to what you and your doctor are talking about to provide the service.

This is different, in that this service requires that a third party pay attention to the content of your discussion in order to work. That's significantly more intrusive.

l5t · 2 years ago
Putting notes in your medical records takes up 40% of your doctor's time. We have been testing with doctors to reduce this time drastically so that they can spend more quality time with you. We also interviewed a lot of patients who take notes during consultations to remember what was discussed, so it could benefit the patient too.
Veen · 2 years ago
Your response completely ignores the concerns raised in the comment you are replying to. "It saves your doctor 40% of their time" is not a reasonable response to "I don't want my private medical details divulged to a third party".
burntcookie90 · 2 years ago
> that they can spend more quality time with you

Maybe it's just the doc offices I've been to, but I think this would actually just increase the patient churn in a hospital. There's no way a doc is going to increase their patient time by the 40% saved if the hospital can toss them in front of another patient.

JohnFen · 2 years ago
I can absolutely see the benefits of this, that's not the question. The question is whether or not the benefit justifies the risk.
mannyv · 2 years ago
Many practices today have a transcriber with every doc, and that person is their personal scribe. You trust them, right?

In any case HIPAA will protect you. Your provider will have a business associate contract with them, so they're subject to huge fines if they violate that.

This is a demo. In real life this'll be wrapped with a whole lot of legal contracts.

c0m · 2 years ago
I trust a scribe because they are a person with years of specialised training and experience who can be held accountable.

> in any case, HIPAA will protect you

It protects you insofar as it disincentivizes honest actors from doing sketchy things with your data. It's punishment for orgs, not protection for patients, like how laws against murder punish the murderer rather than protecting the victim.

The best thing a patient can do to protect their privacy is to actively avoid medical practitioners that, for example, use tools like this which send your private medical consultation transcripts to God-knows-where.

SkyPuncher · 2 years ago
> I don't want the details of my conversations with my doctor to be sent to a third party in the first place

Then you simply cannot use modern healthcare. Unless your doctor is cash-pay, off the cuff, they're _legally_ using third parties to perform their services.

----

If your doctor/healthcare provider is a covered entity, they're bound by HIPAA. That means they're free to establish relationships with Business Associates that are part of delivering healthcare. Those Business Associates must also be HIPAA compliant.

v4dok · 2 years ago
I like the idea, but I would like it even more if it was on-prem. The doctors (at least in EU) will be very wary of having their client meeting essentially recorded by a third party. With this as a cloud SaaS, patient confidentiality is essentially broken since the raw data is available to you while you transcribe it. I understand that you compete with "google speech-to-text" but this is not a feature meant to be used by doctors (even if they "illegally" do).

Obviously the business model is harder with on-prem, but cloud-first for doctor notes is in the long run much harder.

agilob · 2 years ago
There are so many regulations and certifications that there is no chance anyone will have the money and time to do all the paperwork to host it on-prem. For these regulatory reasons, most hospitals use systems so old that it surprises no one to see Windows XP with IE6 there. I had the pleasure myself of fixing a Material Angular bug that wasn't displaying form field validation correctly in IE6 only 2 years ago, and that was for a network of hospitals in Canada, the US, and Germany. The effort to allow them to use anything newer is too big and there are too many documents to review. It's simply cheaper to keep WinXP and IE6 running for as long as possible.
v4dok · 2 years ago
Just yesterday I read about someone running LLaMA on a single GPU. Maybe if you optimise the model enough, you can give it to them as a box.
actionfromafar · 2 years ago
Off-prem is also a rat's nest of regulations.
ar7hur · 2 years ago
Thanks for your comment.

We plan to offer an on-prem option eventually.

In the meantime:

- we offer a GDPR-compliant, EU-based hosting option

- we don't store anything (no audio, no transcript, no note): it's stateless and everything is erased at the end of each consultation

- data is pseudonymized as it flows through our systems
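For illustration, pseudonymization can be as simple as replacing known identifiers with stable, salted-hash tokens before text moves between systems. This is a sketch under assumptions, not Nabla's actual implementation; the salt and token format are made up:

```python
import hashlib

# Hypothetical pseudonymization sketch: the same identifier always maps
# to the same token, but the token cannot be reversed without the salt.
def pseudonymize(text: str, identifiers: list[str], salt: str = "example-salt") -> str:
    for ident in identifiers:
        digest = hashlib.sha256((salt + ident).encode()).hexdigest()[:8]
        text = text.replace(ident, "PATIENT_" + digest)
    return text

out = pseudonymize("John Doe reports chest pain.", ["John Doe"])
```

Real systems would also need to detect the identifiers in free text (itself a hard NER problem), which this sketch assumes is already done.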

Our first customers (large healthcare orgs) have been OK with this so far!

barnd4 · 2 years ago
Hi, I have two questions.

First question - you say that you don't store anything (no audio, no transcript, no note), but your legal agreement says that in order to use this service, the doctor asserts that they have obtained the patient's consent for you to reuse all data processed through the service for research and development, and to improve the performance, models, and algorithms of this or any other solution you come up with in the future. Why the difference, and how do you square A with B?

"Due to the substantial financial, material and human investments made by NABLA within the framework of the Contract for the development and updating of the Solution, NABLA wish to be allowed to reuse the data processed within the framework of the Contract.

The CLIENT, when applicable in the name and on behalf of the DATA CONTROLLER, warrants that the Data Subjects have been informed of their rights and have given their consent for the use of their data within the framework of the Contract when required by applicable laws or the Regulation and authorizes the DATA PROCESSOR to reuse the Data processed within the framework of the Contract, as long as the latter undertakes to comply with the Regulation for all of this Data, for the uses listed below:

- research and development of the Solution,

- improving the performance, models and algorithms developed and trained by NABLA in the context of the Solution or any other solution published by NABLA,"

Second question - you say that you don't store anything (no audio, no transcript, no note) and that it is all erased at the end of each consultation. But do you store any artifacts derived from the audio, transcripts, or notes of a consultation, such as data processed directly or indirectly from them that is fed into an AI model, ML model, or other dataset that you persist after the consultation?

v4dok · 2 years ago
I am in this space, dealing with similar problems with similar organizations. Any DPO that is not listing you as a joint controller will be acting illegally. And they might; it's still a grey area, but how many of those assessments can you handle before your CAC becomes too high? Don't get me wrong, I really think this can help a lot in the HCLS space. If you want to chat more, shoot me an email: contact_vdk@proton.me
priyanmuthu · 2 years ago
My worry is verification of facts. What if the model summarized incorrect facts, and it gets added to the medical history? How can doctors easily verify and fix things?
jjoonathan · 2 years ago
Have you had a medical billing experience recently? They don't care about facts, they care about maximizing the bill subject to plausibility constraints.

I fear that LLMs will kick this to a whole new level :(

sp332 · 2 years ago
It’s easier to revise than to write a first draft. It’s possible to miss mistakes on review, but relatively, I think it’s more likely that a distracted human will mess something up on the first draft.
D13Fd · 2 years ago
That's exactly the thought process that leads to us trusting LLMs in places where they shouldn't be trusted.
qgin · 2 years ago
A good benchmark could be to compare the accuracy of generated notes to the accuracy of other currently accepted practices like human medical scribes and assistants.
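For the transcription side of such a benchmark, the standard metric is word error rate (WER): word-level edit distance divided by the reference length. A from-scratch sketch (purely illustrative; real evaluations would use a tested library and text normalization):

```python
# Minimal WER: Levenshtein distance over words, normalized by reference length.
def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between ref[:i] and hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # deleting i reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # inserting j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1  # substitution cost
            d[i][j] = min(d[i - 1][j] + 1,       # deletion
                          d[i][j - 1] + 1,       # insertion
                          d[i - 1][j - 1] + cost)
    return d[len(ref)][len(hyp)] / len(ref)

# One substitution ("chest" -> "chess") out of four reference words:
score = wer("patient denies chest pain", "patient denies chess pain")
```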
ar7hur · 2 years ago
Thanks for your comment. We clearly show a diff to the doctor before updating patient records. We are very explicit that they should check it (and edit if necessary).
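A review diff of that sort can be sketched with Python's stdlib difflib (illustrative of the review step only, not Nabla's actual UI):

```python
import difflib

# Show a reviewing doctor what a proposed note edit would change.
old_note = ["Patient reports knee pain.", "Plan: X-ray."]
new_note = ["Patient reports knee pain for two weeks.", "Plan: X-ray."]

# lineterm="" because our note lines carry no trailing newlines.
diff = list(difflib.unified_diff(old_note, new_note, lineterm=""))
```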
avgDev · 2 years ago
I feel uncomfortable knowing doctor would use anything like this.

How are two-party consent states handled?

Is this HIPAA compliant?

rkaregaran · 2 years ago
This! No mention of HIPAA on their site at all. Would be a total non-starter for any providers not in private practice.
freshpots · 2 years ago
From their main page (https://www.nabla.com/), they mention HIPAA:

Secure and HIPAA-eligible

    Audio, transcripts, and notes are not stored by Nabla
    HIPAA-eligible and GDPR compliant
    SOC 2 and ISO 27001 certifications in progress
Digging deeper (https://www.nabla.com/blog/privacy-security/):

This data processing is done on Nabla's servers, which are powered by the HIPAA and GDPR compliant Google Cloud Platform (GCP), and on HIPAA-eligible LLM servers.

egorfine · 2 years ago
Kind reminder that HIPAA is a particular law applicable only in one particular country.
shagie · 2 years ago
Their website seems to suggest it is a French company with a US office. The issues around HIPAA would not be there, and would instead be replaced by GDPR.

The blog posts also mention French trained ML.

> Cedille is a new open source French language model created by Coteries. It is trained to understand and write French and is also the largest model of its kind for French. Cedille is trained using large databases of publicly available content on the internet filtered for toxic content.

Expanding into the US, yes - they would need to deal with HIPAA, but until they do they likely don't need to.

jimnotgym · 2 years ago
Imagine a doctor instead mumbled his notes in a non-native accent into a dictation machine, and had someone on minimum wage type them up?
shagie · 2 years ago
That job is known as a medical scribe.

https://www.scribeamerica.com/what-is-a-medical-scribe/

> A Medical Scribe is a revolutionary concept in modern medicine. Traditionally, a physician's job has been focusing solely on direct patient contact and care. However, the advent of the Electronic Health Record (EHR) created an overload of documentation and clerical responsibilities that slows physicians down and pulls them away from actual patient care. To relieve the documentation overload, physicians across the country are turning to Medical Scribe services.

> A Medical Scribe is essentially a personal assistant to the physician; performing documentation in the EHR, gathering information for the patient's visit, and partnering with the physician to deliver the pinnacle of efficient patient care.

avgDev · 2 years ago
Um, there are dictation tools which are HIPAA compliant?
moomoo11 · 2 years ago
HIPAA only applies to healthcare providers I think. Private companies like the extension maker can do whatever they want. At least that’s what someone on the internet told me and maybe it’s wrong.
ceejayoz · 2 years ago
If the provider wants to use the extension for patient care, the extension maker must be prepared to enter into an agreement to comply with the HIPAA rules.

https://www.hhs.gov/hipaa/for-professionals/covered-entities...

> If a covered entity engages a business associate to help it carry out its health care activities and functions, the covered entity must have a written business associate contract or other arrangement with the business associate that establishes specifically what the business associate has been engaged to do and requires the business associate to comply with the Rules’ requirements to protect the privacy and security of protected health information.

avgDev · 2 years ago
No. HIPAA applies to software, as the software company would be considered a business associate.

"If you handle, store or transmit protected health information (PHI) to or from a covered entity then you need to be HIPAA compliant."

Source: https://github.com/truevault/hipaa-compliance-developers-gui...

SkyPuncher · 2 years ago
Your statement is correct, but not complete.

When a covered entity (a HIPAA-required provider) does business with a private non-covered entity _and_ that transaction involves HIPAA controlled information, they must enter into a Business Associate Agreement (BAA). This effectively forces the private entity to maintain the same HIPAA standard as the provider.

A private company is absolutely free to build non-HIPAA-compliant software, but they're completely unlikely to get any healthcare providers to actually use it.

NorthOf33rd · 2 years ago
This is the scariest real life attempt to use the bullshit machine I’ve seen so far.

Nothing like “subtly wrong” clinical notes to affect positive medical outcomes. And time pressed medical professionals are certainly well suited to vigilance tasks like correcting machine generated errors. /s.

I sincerely hope this never sees the light of day.

manv1 · 2 years ago
If you think this is scary you should see how it's done today without AI.

Doctors' handwriting is notoriously bad. Now try putting that into an EMR. Good luck with that! You think you're going to get someone who gets paid $2000/hr to type shit up? Yeah, guess again.

Pigalowda · 2 years ago
We don’t get paid even close to $2k/hr. Where is that number from? Many physicians type notes, others use dragon dictation, or medical assistants/scribes.

I used to type, now I dictate.

Quality of note is provider dependent though. If an LLM can improve quality of notes across the board through simple summary that’s interesting. But the input matters - is the provider actually asking the right questions? Did they look at past relevant history and other notes and put it in their current note?

The LLM should be in a HIPAA compliant environment and have access to the EMR. Then it provides a tidy summary for that. Second it should have relevant questions for the provider to ask during the interview but also a real time/dynamic component which generates relevant follow up questions dependent on patient responses.

Lastly it should put together the old and the new and generate a new note which can then be edited by the provider. Editing notes is much easier than generation of a new high quality note.

urduntupu · 2 years ago
Technically interesting but solving the wrong problem.

Doctors already spend too little time with their patients to understand diseases holistically enough.

Adding technology between the two will make treatments worse in most cases, not better.

vsareto · 2 years ago
Nothing's going to fix that but having more doctors around.

Going from a recorded transcript to a summary note, extracted structured data, and diagnostic codes is a big time saver.

nharada · 2 years ago
> solving the wrong problem

You seem quite confident about this, but based on the doctors I know writing notes is a real pain and mostly seen as (important) scutwork. They're only given a set amount of time to see a patient (including notes), and if you can reduce notes they'd actually get more time talking and seeing the patient.

alach11 · 2 years ago
> if you can reduce notes they'd actually get more time talking and seeing the patient

You're not thinking like a hospital administrator. Sounds like these doctors can handle 40% more patients to me!

SkyPuncher · 2 years ago
You're both correct. My wife is a psychiatrist. I'm friends with several doctors through her.

Everyone hates notes. However, there are already a bunch of dictation services that help write notes more efficiently. They absolutely work well enough.

Adoption and funding are the primary blockers. When my wife works at facilities that have M-Modal, she's 80%+ faster. However, she works at many that, despite knowing the benefits, simply don't care to spend the time, money, or resources on implementing scribe/dictation services.

thfuran · 2 years ago
What percent of that scutwork is actually note taking rather than clicking a bunch of checkboxes or combo boxes or otherwise navigating through EHR UI?
JohnFen · 2 years ago
The last time I visited a doctor, he had an assistant with him that took all the notes.
stopachka · 2 years ago
Right now, drs type notes as you speak to them. If this works, they'd have more time to spend actually listening and talking to you.
hartleybrody · 2 years ago
Wow, lots of haters in this thread. Having doctors spend less time with clerical/administrative work needed to record conversations will free up more time to focus on patients. I assume the generated note text can be easily modified by the doc if necessary, but saves them the bulk of having to type things up and remember everything that was discussed.
orcajerk · 2 years ago
Last time I went to the doctor, he charged me $300 to google symptoms on a computer. Now you're saying I get to pay $300 for him to ask ChatGPT? No thanks. We all know that is where this is going - doctor decision support system, i.e. the AI has access to information locked behind expensive journals that the patient does not.
matsemann · 2 years ago
Last time I asked a developer to fix a bug, he charged me $300 to google the api docs..!

But this isn't even about the doctor asking GPT, this is transcribing the video call so the journaling becomes easier and the doctor can spend more time with their patients. A good thing, no?

CuriouslyC · 2 years ago
Doctors are liability lightning rods who've gone through a professional hazing process to join an exclusive club that has massively warped American medicine through its lobbying efforts. Unless you're dealing with a specialist they're often not super competent - that's why doctors are being replaced with nurse practitioners and physicians assistants everywhere (to the point that you often won't ever see an actual doctor at a lot of medical facilities) and usually people are completely oblivious to the fact.

Pharmacists in other countries do a lot of the basic shit doctors in the USA do, with less overhead, for cheaper and with a much faster turnaround. Antibiotics, blood pressure/cholesterol meds, antidepressants and other rubber stamp drugs absolutely do not need to be strongly controlled.

soco · 2 years ago
No. The transcription part is done by Whisper. After that, GPT generates a clinical note, which may be truthful, or may just as well contain hallucinations.
sharemywin · 2 years ago
I'll be happy to do that for like $200 as long as I can use ChatGPT
avgDev · 2 years ago
If my doctor is googling he is doing his job.

He has medical knowledge which allows him to sort through information.

Any developer who doesn't google today is hindering his ability to develop and slowing the development process. Just like the dev cannot hold all the algorithms and language caveats/syntax in his head, a physician cannot hold all the information in his brain. He may not be familiar with symptoms, or might be googling to see if there is anything "new" pertaining to a patient's specific issue.

There was a time when steroid injections were the go-to for tendon injuries; recent research showed that they result in worse long-term outcomes and that PT should be the first-line treatment.

CuriouslyC · 2 years ago
I would find it suspect if my specialist was googling unless I had something really weird going on, and might consider going to a different specialist.
Shared404 · 2 years ago
> Last time I went to the doctor, he charged me $300 to google symptoms on a computer.

Would you be upset if someone were to ask you to help with their code and you needed to Google for documentation?

It's not exactly the same thing, but it's a similar situation.

I do tend to agree with you on the rest of your concerns, especially info being used that the patient has no way to verify, but that seems tangential at worst to the product in its current state.

JohnFen · 2 years ago
> It's not exactly the same thing, but it's a similar situation.

It's pretty close to the same thing. Programmers are paid for knowing how to program, not for memorizing APIs, system calls, etc. Being able to look up details is essential because it lightens the cognitive load, allowing deeper thought to be put into the real meat of the problem. No programmer can realistically memorize every technical detail they'll need to know.

It's the same with doctors. They're paid because they know the process of diagnosing illnesses, not for memorizing Gray's Anatomy and every pharmacopoeia. Being able to look stuff up is essential for the same reason it is for programmers.

Not to mention that medicine changes, and you want your doctor to have the latest information about whatever it is that ails you.

clemailacct1 · 2 years ago
The tone of this reply is excessively snarky.

You clearly have a personal bad experience with your doctor - but that doesn’t mean that this potential product is a terrible idea.

mr_mitm · 2 years ago
I would want my doctor to Google things. No one knows everything, but I expect an expert to know where to look. If he used Google the same way I do, he'd know which results are reliable. Doctors who finish med school and never consult a resource ever again cannot be good doctors.
exclusiv · 2 years ago
Sure, but as an example: I paid a lawyer who presented themselves as an expert in real estate, and I later found out they had zero understanding of legal non-conforming permits. It was a key factor I made clear upfront before I hired them.

I shouldn't have to pay for your time so you can educate yourself on the thing you purported to know before I hired you. Especially when the hourly rates are in the hundreds of dollars.

So there is some balance there. And doctors are using software that spits them out shit in real-time and then they send you on your way with some printed info you can readily find on the web.

I just had a knee issue and the doctor thought it may be a torn meniscus. Didn't have me do any range of motion movements. We just chatted and he said I think it is this. Take an xray. Bye.

I go get an xray and results come back the next day over the app. All he says in the app - "xray is normal".

So I'm thinking what the hell. That's where you leave it? So what. Should we do an MRI? What's next. You gave me an anti-inflammatory and your guess was apparently wrong. What a shitty engineer this person would be.

Tech can help in some cases. I like getting lab results back in the app. I like that I can follow up with chat later.

But I fear it's making doctors less effective. And the best ones will be the ones that maintain their traditional craft and nuance even with all the fancy new tools and tech that save them time.

Just like everyone is a React developer these days. Tech and advances can make many people sloppy. And dumber. It can push away great talent due to mandates of the tech or process. And attract new talent that is worse.

Deleted Comment

trdtaylor1 · 2 years ago
You are not wrong; I did not think about this until you brought it up, but it's a great plan: sign up nearly-free customers, then get in bed with health insurers to reduce the cost of care by recording and transcribing all doctor conversations, then alert on possible fraud or excessive care based on patient input, matched against your huge DB of patient interactions. Historically, recording the patient-doctor interaction has always been an incredibly hard sell for the management types. Then insurers mandate chat capture, with the visit summary billed to them. Love it!
mrtksn · 2 years ago
Well, if you believe that you can get yourself fixed by Googling, yet still go to someone else to do the Googling, you obviously have to pay.

No one is obligated to do free Googling.

Or maybe you didn’t go the for the Googling?

nickphx · 2 years ago
I believe this product is to help the physicians transcribe notes and medical chart information from video/audio recordings, not diagnose.
jimnotgym · 2 years ago
Last time I went to the doctor it was free.

I just like to point out the small advantages of living in Europe now and again

nickphx · 2 years ago
I'm not sure you understand the definition of free.