Readit News

shortrounddev2 commented on I misused LLMs to diagnose myself and ended up bedridden for a week   blog.shortround.space/blo... · Posted by u/shortrounddev2
monerozcash · 2 months ago
Nobody here is advocating blindly trusting medical advice from LLMs, that is not "dangerous medical misinformation".

Even if you absolutely despise LLMs, this is just silly. The problem here isn't "AI enthusiasts", you're getting called out for the absolute lack of nuance in your article.

Yes, people shouldn't do what you did. Yes, people will unfortunately continue doing what you did until they get better advice. But the correct nuanced advice in a HN context is not "never ask LLMs for medical advice", you will rightfully get flamed for that. The correct advice is "never trust medical advice from LLMs, it could be helpful or it could kill you".

shortrounddev2 · 2 months ago
You are unambiguously wrong. Never ask LLMs for medical advice. There is no nuance here, and I suspect the only reason there's so much backlash against this simple and obvious fact is because of the amount of money in this scam of an industry.
shortrounddev2 · 2 months ago
I've tried removing my post because the comment section here has become a platform for AI enthusiasts to spread dangerous medical misinformation. As HN does not really care about user privacy, I am unable to actually delete it. I renamed the post to [Removed], but it appears the admins are uninterested in respecting the intent of this, and renamed the post back to its original title.

Moral of the story kids: don't post on HN

monerozcash · 2 months ago
What about the multiple people who have reported receiving incredibly useful information after asking an LLM, when doctors were useless?

Should they not have done so?

Like this guy for example, was he being stupid? https://www.thesun.co.uk/health/37561550/teen-saves-life-cha...

Or this guy? https://www.reddit.com/r/ChatGPT/comments/1krzu6t/chatgpt_an...

Or this woman? https://news.ycombinator.com/item?id=43171639

This is a real thing that's happening every day. Doctors are not very good at recognizing rare conditions.

shortrounddev2 · 2 months ago
> What about the multiple people who have reported receiving incredibly useful information after asking an LLM, when doctors were useless?

They got lucky.

This is why I wrote this blog post. I'm sure some people got lucky when an LLM managed to give them the right answer - those are the ones who go and brag about it. How many people got the wrong answer? How many of them bragged about their bad decision? This is _selection bias_. I'm writing about my embarrassing lapse of judgment because I doubt anyone else will.

blakesterz · 2 months ago
They wrote...

  "Turns out it was Lyme disease (yes, the real one, not the fake one) and it (nearly) progressed to meningitis"
What does "not the fake one" mean? I must be missing something.

shortrounddev2 · 2 months ago
Disclaimer: not a doctor (obviously), ask someone who is qualified, but this is what the ID doctor told me:

Lyme is a bacterial infection and can be cured with antibiotics. Once the bacteria are gone, you no longer have Lyme disease.

However, there is a lot of misinformation about Lyme online. Some people think Lyme is a chronic, incurable disease, which they call "chronic Lyme". Often, when a celebrity says they have Lyme disease, this is what they mean. Chronic Lyme is not a real condition - it is a diagnosis given to wealthy people by unqualified conmen or unscrupulous doctors in response to vague, hard-to-pin-down symptoms.

monerozcash · 2 months ago
"Don't ask LLMs leading questions" is a perfectly valid lesson here too. If you're going to ask an LLM for a medical diagnosis, you should at the very least know how to use LLMs properly.

I'm certainly not suggesting that you should ask an LLM for medical diagnoses, but still, someone who actually understands the tool they're using would likely not have ended up in your situation.

shortrounddev2 · 2 months ago
If you're going to ask an LLM for a medical diagnosis, stop what you're doing and ask a doctor instead. There is no good advice downstream of the decision to ask an LLM for a medical diagnosis
cheald · 2 months ago
I think the author took the wrong lesson here. I've had doctors misdiagnose me just as readily as I've had LLMs misdiagnose me - but I can sit there and plug at an LLM in separate unrelated contexts for hours if I'd like, and follow up assertions with checks to primary sources. That's not to say that LLMs replace doctors, but that neither is perfect and that at the end of the day you have to have your brain turned on.

The real lesson here is "learn to use an LLM without asking leading questions". The author is correct, they're very good at picking up the subtext of what you are actually asking about and shaping their responses to match. That is, after all, the entire purpose of an LLM. If you can learn to query in such a way that you avoid introducing unintended bias, and you learn to recognize when you've "tainted" a conversation and start a new one, they're marvelous exploratory (and even diagnostic) tools. But you absolutely cannot stop with their outputs - primary sources and expert input remain supreme. This should be particularly obvious to any actual experts who do use these tools on a regular basis - such as developers.

shortrounddev2 · 2 months ago
No, the lesson here is never use an LLM to diagnose you, full stop. See a real doctor. Do not make the same mistake as me
hansmayer · 2 months ago
> "If you read nothing else, read this: do not ever use an AI or the internet for medical advice. Go to a doctor."

Yeah, no shit, Sherlock? I'd be absolutely embarrassed to even admit to something like this, let alone share "pearls of wisdom" like "don't use a machine that guesses its outputs based on whatever text it has been fed to diagnose yourself". Who would have thought that an individual professional with decades of theoretical and practical training, AND actual human intelligence (or do we need to call it HGI now), plus tons of experience, is more trustworthy, reliable and qualified to deal with something as serious as the human body? Plus there are hundreds of thousands of such individuals, and they don't need to boil an ocean every time they solve a problem in their domain of expertise. Compare that to a product of the enshittified tech industry, which in recent years has only ever given us irrelevant "apps" to live in, without addressing the really important issues of our time. Heck, even Peter Thiel agrees with this, at least he did in his "Zero to One".

shortrounddev2 · 2 months ago
To be honest, I am pretty embarrassed about the whole thing, but I figured I'd post my story because of that. There are lots of people who misdiagnose themselves doing something stupid on the internet (or teenagers who kill themselves because they fell in love with some Waifu LLM), but you never hear about it because they either died or were too embarrassed to talk about it. Better to be transparent that I did something stupid so that hopefully someone else reads about it and doesn't do the same thing I did
shortrounddev2 · 2 months ago
> Not using ChatGPT in 2026 for medical issues and arming yourself with information [...] would be foolish in my opinion.

Using ChatGPT for medical issues is the single dumbest thing you can do with ChatGPT

xiphias2 · 2 months ago
Another poorly written article that doesn't even specify the LLM being used.

Both the ChatGPT o3 and 5.1 Pro models have helped me a lot in diagnosing illnesses, given the right queries. I use many queries with different contexts and context lengths for medical questions, as they are very serious.

They also give better answers if I use medical language, because they retrieve from higher-quality articles.

I still went to doctors and got more information from them.

I also get blood tests and MRIs before going to doctors, and the great doctors actually like that I arrive prepared but still open to their diagnosis.

shortrounddev2 · 2 months ago
Read to the bottom. I didn't specify the LLM because it doesn't matter. It's not the fault of the LLM, it's the fault of the user

u/shortrounddev2

Karma: 2427 · Cake day: June 15, 2023