Readit News
Posted by u/sungam 6 days ago
Show HN: I'm a dermatologist and I vibe coded a skin cancer learning app (molecheck.info/...)
Coded using Gemini Pro 2.5 (free version) in about 2-3 hours.

Single file including all html/js/css, Vanilla JS, no backend, scores persisted with localStorage.

Deployed using ubuntu/apache2/python/flask on a £5 Digital Ocean server (but could have been hosted on a static hosting provider as it's just a single page with no backend).

Images / metadata stored in an AWS S3 bucket.
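Since the page has no backend, the score persistence mentioned above can be as simple as a couple of localStorage helpers. A minimal sketch, assuming a hypothetical storage key and score shape (not the app's actual code):

```javascript
// Hypothetical score-persistence helpers for a single-page quiz.
// The storage key and the score shape are illustrative assumptions.
const STORAGE_KEY = 'molecheck-scores';

function loadScores(storage = localStorage) {
  try {
    // getItem returns null on first visit; JSON.parse then yields null.
    return JSON.parse(storage.getItem(STORAGE_KEY)) ?? { correct: 0, total: 0 };
  } catch {
    return { correct: 0, total: 0 }; // corrupted entry: start fresh
  }
}

function saveAnswer(wasCorrect, storage = localStorage) {
  const scores = loadScores(storage);
  scores.total += 1;
  if (wasCorrect) scores.correct += 1;
  storage.setItem(STORAGE_KEY, JSON.stringify(scores));
  return scores;
}
```

The `storage` parameter is only there so the helpers can be exercised outside a browser; in the page itself the `localStorage` default applies.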

jmull · 6 days ago
I kind of love the DIY aspect of AI coding.

A dermatologist with this idea a short while ago would have had to find a willing and able partner to do a bunch of work -- meaning that most likely it would just remain an idea.

This isn't just for non-tech people either -- I have a decades-long list of ideas I'd like to work on but simply do not have time for. So now I'm cranking up the ol' AI agents and seeing what I can do about it.

Waterluvian · 6 days ago
I feel like the name “vibe code” is really the only issue I have. Enabling everyone to program computers to do useful things is very very good.
sollewitt · 6 days ago
It captures not understanding what you’re doing crossed with limited AI understanding which means the whole thing is running on vibes.
AuthAuth · 6 days ago
I wish that computers were designed in a way that pushed users to script more. It's such a powerful ability that it would benefit almost every worker.
notTooFarGone · 5 days ago
The only issue is security. The number of open endpoints, default logins, and the like will get out of control.

vrighter · 5 days ago
But they're not programming computers; they're commissioning footgun-riddled software from a junior intern.
farai89 · 5 days ago
I believe this captures it well. There are many people who previously would have needed to hire dev shops to get their ideas out, and now they can just get them done faster. I believe the impact will be larger in non-tech sectors.
NitpickLawyer · 5 days ago
Right. And what a lot of folks here miss is that the prototype was always bad. This process only speeds up the MVP, and gives the idea person a faster way to validate an idea.

Focusing on "but security lol" is a bad take, IMO. Every early attempt is bad at something, be it security, or scale, or any number of problems. Validating early is good. Giving non-tech people a chance is good. If an idea is worth pursuing, you can always redo it with "experts". But you can't afford experts (hell, you can't even afford amateurs) for every idea you want to put into an MVP.

utyop22 · 5 days ago
Most ideas suck and never deserve to see the light of day.

True productivity is when what is produced is of benefit.

jmkni · 6 days ago
Same, I've had ideas rattling around in my brain for years which I've just never executed on, because I'm 'pretty sure' they won't work and it hasn't been worth the effort

I've been coding professionally for ~20 years now, so it's not that I don't know what to do, it's just a time sink

Now I'm blasting through them with AI and getting them out there just in case

They're a bit crap, but better than not existing at all, you never know

ecocentrik · 6 days ago
I'm a big fan of barriers to entry and using effort as a filter for good work. This derma app could be so much better if it actually taught laypeople to identify the difference between carcinomas, melanomas and non-cancerous moles instead of just being a fixed loop quiz.
citizenpaul · 6 days ago
>They're a bit crap, but better than not existing at all, you never know

I don't agree. I think that by LLM/vibe coding my random ideas I've actually wasted more time than if I'd done them manually. The vibe code, as you said, is often crap, and often, after I've spent a lot of time on it, I realize there are countless subtle errors that mean it's not actually doing what I intended at all. I've learned nothing and made a pointless app that doesn't even do anything, but looks like it does.

That's the big allure that has been keeping "AI" hype afloat. It always seems so dang close to being a magic wand. Then, after time spent reviewing with a critical eye, you realize it has been tricking you, like a janitor who just sweeps dirt under the rug.

At this point I've relegated LLMs to advanced find-and-replace and formatted data structuring (take this list, make it into JSON), and that's about it. Tools that do everything else LLMs do already exist, and do it better.

I can't count how many times "AI" has taken some logic I want, produced a bunch of complex-looking stuff that takes forever to review, and then I find out it fudged the logic to simply always be true/false when it's not even a boolean problem.

vrighter · 5 days ago
Well yeah, better not existing at all, actually, if they're crap and you're OK with that. Those just serve to pad out your resume for non-technical people. It's not like you're learning much if you couldn't be bothered to even remove the crap parts.
sungam · 6 days ago
Yes I agree - I could probably have worked out how to do it myself but it would have taken weeks and realistically I would never have had the time to finish it.

amelius · 6 days ago
Well, image classification tasks don't require coding at all.

You just need one program that can read the training data, train a model, and then do the classification based on input images from the user.

This works for basically any kind of image, whether it's dogs/cats or skin cancer.

chaps · 6 days ago
...none of this requires coding?

yread · 6 days ago
Why? I know tons of coding MDs: a pathologist hacking the original Prince and adding mods, all in assembly; a molecular pathologist organizing their own pipelines and ETLs.

Lots of people like computers but earn a living doing something else

jonahx · 6 days ago
He wasn't saying no coding MDs existed. Just that, generally speaking, most MDs would have had to partner with a technical person. That was true, and it is now less true than it was before.
jjallen · 6 days ago
Very cool. I learned a lot as a non-dermatologist, but one with a sister who had melanoma at a very young age.

I went from 50% to 85% very quickly. And that’s because most of them are skin cancer and that was easy to learn.

So my only advice would be to make closer to 50% actually skin cancer.

Although maybe you want to focus on the bad ones and get people to learn those more.

This detection was way harder than I thought it would be. Makes me want to go to a dermatologist.

sungam · 6 days ago
Thanks, this is a good point - I think a 50:50 balance of cancer versus harmless lesions would be better and will change this in a future version.

Of course, in reality the vast majority of skin lesions and moles are harmless; the challenge is identifying those that are not. I think that even a short period of focused training like this can help the average person identify a concerning lesion.

alanfranz · 6 days ago
> So my only advice would be to make closer to 50% actually skin cancer.

If I were to code this for "real training" of a dermatologist, I'd make it closer to the "real world" rate. I'd imagine that probably just 1 out of 100 (or something like that) of the skin lesions people suspect are cancerous actually are.

With the current dataset, there are just too many cancerous images. This makes it easy to just flag everything as "cancerous" and still retain a good "score". But that score is meaningless: if, as a dermatologist, you send _too many_ people without cancer for further exams, you're negating the usefulness of what you're doing.
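The skew point can be sanity-checked with trivial arithmetic: a strategy that learns nothing scores exactly the prevalence (or its complement). A throwaway sketch with illustrative numbers, not the app's:

```javascript
// Accuracy of two no-learning strategies, given the fraction of quiz
// images that are actually cancerous. Numbers here are illustrative.
function lazyAccuracy(cancerFraction) {
  return {
    alwaysFlag: cancerFraction,     // correct only on the cancerous images
    neverFlag: 1 - cancerFraction,  // correct only on the harmless images
  };
}

// On a cancer-heavy deck, "always flag" looks impressive; at a 1-in-100
// real-world rate, "never flag" scores 99% while missing every cancer.
```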

mewpmewp2 · 5 days ago
It needs a scoring system where each false positive causes a small score drop but a false negative causes a huge one. At the same time, as you said, positives would be much rarer. Should be easy to ask an LLM to vibe code that, so it simulates the real world and its consequences.
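An asymmetric rule like that fits in a few lines of the app's vanilla JS. A sketch with invented weights, purely for illustration:

```javascript
// Asymmetric scoring: a missed cancer (false negative) is punished far
// more than over-calling a harmless mole (false positive).
// All weights are illustrative assumptions, not taken from the app.
const REWARD = { truePositive: 5, trueNegative: 1 };
const PENALTY = { falseNegative: 10, falsePositive: 1 };

function scoreAnswer(isCancer, flaggedAsCancer) {
  if (isCancer) {
    return flaggedAsCancer ? REWARD.truePositive : -PENALTY.falseNegative;
  }
  return flaggedAsCancer ? -PENALTY.falsePositive : REWARD.trueNegative;
}
```

With rare positives, routinely flagging everything bleeds a point per harmless mole, while a single missed cancer erases ten correct "harmless" calls.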
jjallen · 6 days ago
Thought about this some more. I think you want to start at 100%, or at least high, so people actually learn what needs to be learned: what malignant skin conditions actually look like.

And then, once they have learned, it gets progressively harder. Basically, the closer the mix is to 50%, the harder it is to score higher than chance.
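One hypothetical way to realise that ramp is an exponential decay from an all-malignant deck toward the 50:50 mix that is hardest to beat by guessing. The function and its parameter values are illustrative, not the app's:

```javascript
// Fraction of malignant images to show in a given round: starts at
// `start` (all malignant) and decays toward `floor` (a 50:50 mix).
// Parameter values are illustrative assumptions.
function malignantFraction(round, { start = 1.0, floor = 0.5, decay = 0.9 } = {}) {
  return floor + (start - floor) * Math.pow(decay, round);
}
```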

loeg · 6 days ago
I found the first dozen to be mostly cancer and then the next dozen were mostly non-cancer. (Not sure if it's randomized.) (Also, I'm really bad at identifying cancerous vs non-cancerous skin lesions.)
sungam · 6 days ago
It is randomized so probably just bad luck! FWIW I get a high score and another skin cancer doctor who commented also gets a high score so it is possible to make the diagnosis in most cases on the basis of these images.
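For what it's worth, streaks like the one loeg describes are expected when each image is drawn independently. A common alternative, assuming a simple front-end deck (not the app's actual code), is to build the deck with a fixed benign:malignant ratio and Fisher-Yates shuffle it, so the overall mix is exact while the order stays random:

```javascript
// Fisher-Yates shuffle: uniformly random order, unbiased.
function shuffle(items, rand = Math.random) {
  const deck = items.slice();
  for (let i = deck.length - 1; i > 0; i--) {
    const j = Math.floor(rand() * (i + 1));
    [deck[i], deck[j]] = [deck[j], deck[i]];
  }
  return deck;
}

// Build a quiz deck with an exact malignant:benign ratio, then shuffle.
function buildDeck(nMalignant, nBenign) {
  const deck = Array(nMalignant).fill('malignant')
    .concat(Array(nBenign).fill('benign'));
  return shuffle(deck);
}
```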

vindex10 · 6 days ago
Hi! That's a really useful tool!

I wish it also explained the decision-making process: how to tell from the picture what the right answer is.

I'm really getting lost between melanoma and seborrheic keratosis / nevus.

I went through ~120 pictures, but couldn't learn to distinguish those.

Also, the guide in the burger menu leads to a page that doesn't exist: https://molecheck.info/how-to-recognise-skin-cancer

sungam · 6 days ago
This is very helpful feedback. I will add some more information to help with the diagnosis, and add an article in the burger menu with a detailed explanation.

Being honest, I didn't expect anyone apart from a few of my patients to use the app, and certainly did not expect the front page of HN!

jgilias · 6 days ago
Hey!

Thanks for making this! A bit more polish and this is something I’d make sure everyone in my family has played with.

Imagine a world where every third person is able to recognise worrying skin lesions early on.

addandsubtract · 5 days ago
I'm not a doctor, but there's an ABCDE[0] rule of thumb to spot signs of skin cancer:

Asymmetry: one half of the spot is unlike the other half.

Border: the spot has an irregular, scalloped, or poorly defined border.

Color: the spot has varying colors from one area to the next.

Diameter: melanomas are usually greater than 6 millimeters, or about the size of a pencil eraser.

Evolving: the spot is changing in size, shape, or color, or has new symptoms (itching, bleeding).

[0] https://www.aad.org/public/diseases/skin-cancer/find/at-risk...

jgilias · 6 days ago
Also came to the same conclusion. I want a mode where 50% of the set are melanomas, and the other 50% are “brown benign things”.
sungam · 6 days ago
Will add this in next version!
lukko · 6 days ago
I'm a doctor too and would love to hear more about the rationale and process for creating this.

It's quite interesting to have a binary distinction, 'concerned vs not concerned', which I guess is more relevant for referring clinicians than getting an actual diagnosis. Whereas a multiple-choice 'BCC vs melanoma' naming format would be more of a learning tool for medical students.

Echoing the other comments, but it would be interesting to match the cards to the actual incidence in the population or in primary care - although it may be a lot more boring with the amount of harmless naevi!

sungam · 6 days ago
Thanks for your comment. The main motivation for me in developing the app was that lots of my patients wanted me to guide them to a resource that could help them improve their ability to recognise skin cancer, and, in my view, a good way to learn is to be forced to make a decision and then receive feedback on that decision.

For the patient I think the decision actually is binary: either (i) I contact a doctor about this skin lesion now, or (ii) I wait a bit to see what happens, or do nothing. In reality most skin cancers are very obvious even to a non-expert, and the reason they are missed is that patients are not checking their skin or have no idea what to look for.

I think you are right about the incidence: it would be better to have a more balanced distribution of benign versus malignant. But I don't think it would be good to show 99% harmless moles and 1% cancers (which is probably the accurate representation of skin lesions in primary care), since it would take too long for patients to learn the appearance of skin cancer.

jazoom · 6 days ago
> most skin cancers are very obvious even to a non-expert and the reason they are missed are that patients are not checking their skin or have no idea what to look for

I am a skin cancer doctor in Queensland and all I do is find and remove skin cancers (I find between 10 and 30 every day). In my experience the vast majority of cancers I find are not obvious to other doctors (not even seen by them), let alone obvious to the patient. Most of what I find are BCCs, which are usually very subtle when they are small. Even when I point them out to the patient they still can't see them.

Also, almost all melanomas I find were not noticed by the patient and they're usually a little surprised about the one I point to.

In my experience the only skin cancers routinely noticed by patients are SCCs and Merkel cell carcinomas.

With respect, if "most skin cancers are very obvious even to a non-expert" I suggest the experts are missing them and letting them get larger than necessary.

I realise things will be different in other parts of the world and my location allows a lot more practice than most doctors would get.

Update: I like the quiz. Nice work! In case anyone is wondering, I only got 27/30. Distinguishing between naevus and melanoma without a dermatoscope on it is sometimes impossible. Get your skin checked.

jacquesm · 6 days ago
Nice job. Now you really need to study up on the statistics behind this, and you'll quickly conclude that this was the easy part: what to do with the output is the hard part. I've seen a start-up that made their bread and butter on such classifications. They did an absolutely great job of it, but found that the problem of deciding what to do with such an application, without ending up with net-negative patient outcomes, was far, far harder than the classification problem itself. The error rates, no matter how low, are going to be your main challenge; both false positives and false negatives can be extremely expensive, financially and emotionally.
sungam · 6 days ago
Thanks for your comment - the purpose of this app is patient education rather than diagnosis but I will definitely have a look at the relevant stats in more detail!
jacquesm · 6 days ago
The risk, I think, is that people will not understand that that is your goal; instead they will use it to help them diagnose something they might think is suspicious.

They will go through your images until they get a good score, believe themselves an expert, and proceed to diagnose themselves (and their friends).

By the time you have an image set that is representative and that will actually educate people to the point where they know what to do and what not to do, you've created a whole raft of amateur dermatologists. And the result will be that a lot of people knock on the doors of real dermatologists, who might tell them not to worry about something when they are now primed to argue with them.

I've seen this pattern before with self diagnosis.

thebeardisred · 6 days ago
To that end, I quickly learned something that AI models would as well (which isn't your intention): pictures with purple circles (e.g. faded pen ink on light skin outlining the area of concern) are a strong indicator of cancer. :wink:

DrewADesign · 6 days ago
This is awesome. Great use of AI to realize an idea. Subject matter experts making educational tools is one of the most hopeful things to come out of AI.

It’s just a bummer that it’s far more frequently used to pump wealth to tech investors from the entire class of people that have been creating things on the internet for the past couple of decades, and that projects like this fuel the “why do you oppose fighting cancer” sort of counter arguments against that.

jacquesm · 6 days ago
On the contrary. There is a whole raft of start-ups around this idea and other related ones. And almost all of them have found the technical challenges manageable, and the medical and ethical challenges formidable.
DrewADesign · 6 days ago
I’m not exactly sure what in my comment you’re responding to, here: My appreciation that a subject matter expert is now capable of creating a tool to share their knowledge, that tech investors are using AI to siphon money from people that actually make things, or that good projects like this are used to justify that siphoning?
sungam · 6 days ago
Thanks for your comment. I'm pleased that people have found it useful; it was definitely only possible because of AI coding. I agree that this is likely to be applicable to non-experts in many different areas.
DrewADesign · 6 days ago
Absolutely. I hope you’ll encourage your colleagues to follow suit!
globalise83 · 6 days ago
As someone with literally every single possible variation of skin blemish, mole and God knows what else, this scares the living hell out of me.
abootstrapper · 5 days ago
Get a yearly full-body skin check from a dermatologist. It's a common thing. I've been doing it for years because of my skin type; they caught an early basal cell carcinoma the last time I went.
mewpmewp2 · 5 days ago
Yeah, I have only one concerning spot, but it still made me spend 20 minutes googling the difference between dermatofibroma and basal cell cancer. I think it is a dermatofibroma, but I guess it's a good idea to get it checked out anyway.
meindnoch · 6 days ago
sungam · 6 days ago
According to the metadata supplied with the dataset, yes.

Could definitely be a misclassification; however, a small proportion of moles that look entirely harmless to the naked eye and under the dermatoscope (skin microscope) can be cancerous.

For example, have a look at these images of naevoid melanoma: https://www.google.com/search?tbm=isch&q=naevoid+melanoma

This is why dermatology can be challenging, and why AI-based image classification is difficult from a liability/risk perspective.

I was previously clinical lead for a melanoma multidisciplinary meeting, and 1-2 times per year I would see a patient with a melanoma that presented like this; looking back at previous photos, there were no features that would have worried me.

The key thing that I emphasise to patients is that even if a mole looks harmless, it is important to monitor for any signs of change, since a skin cancer will almost always change in appearance over a period of several months.

jonahx · 6 days ago
> however a small proportion of moles that look entirely harmless to the naked eye and under the dermatoscope (skin microscope) can be cancerous.

That is very scary.

So the only way to be sure is to have everything sent to the lab. But I'm guessing the cost/benefit of that from a risk perspective makes it prohibitive? So if you're an unlucky person with a completely benign-presenting melanoma, you're just shit out of luck? Or will the appearance change before it spreads internally?

48terry · 6 days ago
> According to the metadata supplied with the dataset yes

"idk but that's what it says" somehow this does not inspire confidence in the skin cancer learning app.

jonahx · 6 days ago
Yeah that seems likely to be a misclassification...

Deleted Comment