pclmulqdq · a year ago
I have to say that I think the photo without the Lightroom processing actually looks better. The second one hasn't just added a bitcoin, it has also added the "AI shimmer" that seems to be part of a lot of generated images. I can't put my finger on exactly what the characteristic is, but my first instinct (separate from the bitcoin, which was hard to find) is "that's an AI picture." Someone should just spend some time in the blur tool if they don't like those glints.
serviceberry · a year ago
I don't think there's any AI-fication going on in that photo. The modified version has a more compressed tone curve to bring out more detail, along with jacked up saturation (especially evident for water). This is similar to what most cell phones do by default to make photos look more pleasing.
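The adjustment described above can be sketched in a few lines of numpy. This is an illustrative toy, not what Lightroom actually computes: pull every tone toward middle gray to compress the curve, then scale each pixel away from its own gray value to boost saturation.

```python
import numpy as np

def compress_tones(v, strength=0.4):
    """Pull all tones toward middle gray: shadows come up, highlights come down."""
    return 0.5 + (v - 0.5) * (1.0 - strength)

def boost_saturation(rgb, factor=1.4):
    """Scale each pixel's distance from its own gray value, then clip to [0, 1]."""
    gray = rgb.mean(axis=-1, keepdims=True)
    return np.clip(gray + factor * (rgb - gray), 0.0, 1.0)

# A dark blue "water" pixel: tones come up, blue pulls further from gray.
pixel = np.array([[0.10, 0.20, 0.35]])
out = boost_saturation(compress_tones(pixel))
```

Run on a dark blue pixel, the result is brighter and more saturated, which is roughly the "jacked up" cell-phone look described above.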

I do agree that the original looks better, but the author of the post clearly prefers the modified version.

strogonoff · a year ago
Clipped highlights in digital photography are simply impossible to eliminate in post-processing without conjuring nonexistent information, even if you shoot raw. Different raw processors use different tricks, color propagation and such, but naturally these work best when the clipped highlights are really small. I would not be surprised if tools like Lightroom invoke ML at the first hint of clipping (because why not, if all you have is a hammer…).

Pro tip: digital sensors are much less forgiving than negative film when it comes to exposing highlights, so with a bit of foresight they are best tackled at shooting time. Highlights from water/glass reflections are tamed by a fairly cheap polarizing filter, and if you shoot raw you should do the opposite of what you would with negative film and always underexpose a scene with bright highlights (especially if the highlights are large or in your subject of interest). Let it be dark; you will have more noise, but noise is manageable without having to invent what doesn't exist in the parts that are most noticeable to the human eye.
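The point about clipping being unrecoverable can be demonstrated with a toy sensor model in numpy (illustrative numbers only, not any real sensor): once two bright tones both hit the clip ceiling they are indistinguishable forever, while an underexposed frame pushed back up in post keeps them apart at the cost of amplified noise.

```python
import numpy as np

rng = np.random.default_rng(0)
# Linear scene luminance; values > 1.0 exceed the sensor's full-well capacity.
scene = np.array([0.2, 0.7, 1.6, 2.4])

def expose(scene, ev):
    """Toy sensor: scale by exposure, clip at saturation, add a little read noise."""
    signal = np.clip(scene * 2.0 ** ev, 0.0, 1.0)
    return signal + rng.normal(0.0, 0.005, size=scene.shape)

normal = expose(scene, 0.0)        # both bright tones record ~1.0: detail gone for good
pushed = expose(scene, -2.0) * 4   # two stops under, pushed back in post: noisier, but distinct
```

In the normally exposed frame the two highlight tones collapse to the same value; in the underexposed-then-pushed frame they stay well separated, with the noise floor multiplied along with the signal.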

pclmulqdq · a year ago
Oh, yeah, compression of the dynamic range combined with increased brightness makes sense. It's not just Stable Diffusion that produces that look, but also things like ML filters, etc.
ruraljuror · a year ago
This reminds me of the soap-opera effect[0] on modern tvs. I have difficulty watching a movie on someone’s tv with it enabled, but they don’t even seem to notice.

0: https://en.wikipedia.org/wiki/Soap_opera_effect

johnnyanmac · a year ago
A truly bizarre effect. One of the first times in my life I ever thought "wait, this looks TOO smooth, it's weird." As if my eyes just instinctively knew there were fake frames (before I understood the concept of "frames").
samsartor · a year ago
That could be the VAE? The "latent" part of latent diffusion models is surprisingly lossy. And however much of the image is getting inpainted, the entire thing gets encoded and decoded.

Edit: I'll note some new models (SD3 and Flux) have a wider latent dim and seem to suffer from this problem less.

AI-generated images are also biased strongly towards medium lightness. The photographer's adjustment of the tone curve may simply give it that "look".
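The whole-image round-trip loss mentioned above can be mimicked, very crudely, with plain block averaging in numpy. This is only an analogy: the real SD autoencoder compresses 8x8 pixel blocks into latent vectors far more gracefully than this. But it shows why a full encode/decode touches every pixel, inpainted or not.

```python
import numpy as np

def lossy_roundtrip(img, factor=8):
    """Crude stand-in for a VAE round trip: average-pool factor x factor
    blocks ("encode"), then upsample by repetition ("decode")."""
    h, w = img.shape
    pooled = img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))
    return np.repeat(np.repeat(pooled, factor, axis=0), factor, axis=1)

img = np.zeros((64, 64))
img[10, 10] = 1.0                  # one pixel of sharp detail
recon = lossy_roundtrip(img)
err = np.abs(recon - img).max()    # fine detail lost even where nothing was inpainted
```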

AuryGlenz · a year ago
You absolutely do not need to encode and decode the whole image, even in ComfyUI. All you need to do is composite the changed areas back in the original photo. There are nodes for that and I’m sure that’s what Adobe does as well, if they even encode in the first place. These tools don’t really work quite like inpainting. There’s no denoise value - it’s all or nothing.

I’ve used Photoshop’s generative fill many times on singular images and there’s no loss on the ungenerated parts.
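The compositing step described above is simple to sketch in numpy (a hypothetical mask blend, not Adobe's actual pipeline): only the masked region takes the generated pixels, so everything outside it stays bit-identical to the original.

```python
import numpy as np

def composite(original, generated, mask):
    """Paste generated pixels back only where the fill ran.
    mask is 1.0 inside the filled region, 0.0 elsewhere (it could also be feathered)."""
    return mask * generated + (1.0 - mask) * original

original = np.full((4, 4), 0.3)
generated = np.full((4, 4), 0.9)   # stand-in for the decoder's output for the whole frame
mask = np.zeros((4, 4))
mask[1:3, 1:3] = 1.0

result = composite(original, generated, mask)
```

Outside the mask the blend multiplies the original by exactly 1.0, so no generation loss reaches the untouched pixels.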

Culonavirus · a year ago
If you zoom in and squint your eyes, it does look like some kind of shiny coin.

What I'd like to know, though, is how the model is so bad that when you tell it to "remove this artifact," instead of looking at the surroundings and painting over it with some DoF-ed out ocean, it slaps an even more distinct artifact in there. Makes no sense.

samsartor · a year ago
Many current inpainting models have quite a lot of "signal leak": they're better at covering stuff up than removing it entirely.

Ironically, some older SD1/2-era models work a lot better for complete removal.

AuryGlenz · a year ago
I mean, this is notable because it screwed up. It usually does a pretty good job. Usually.

In this case there are better tools for the job anyway. Generative fill shines when it's over something that'd be hard to paint back in; out-of-focus water isn't that.

gedy · a year ago
Heaven forbid your picture has a woman in it somewhere, though; Adobe's AI will refuse to fill half the time. I've taken to censoring out blocks of the image with black squares if it has any body parts showing (still clothed), filling, copying, then undoing the censoring. It's pretty ridiculous for a paid subscription.
kyriakos · a year ago
For a paid product, even if the content explicitly contained nudity or depicted sexual activity, it should still be allowed; those are valid cases for which Lightroom and Photoshop could be used. The censorship in AI is stupid; babysitting users should not be part of the tool's responsibility. It's like banning kitchen knives to keep people from using them for violence.
jsheard · a year ago
Apparently this isn't an isolated incident: https://www.reddit.com/r/photoshop/comments/1e5nyt7/generati...
ripped_britches · a year ago
“Large company ships half baked product” has gotta be the least interesting story to read
slg · a year ago
Sure, if you view this as an isolated incident. But I think of it more as the latest example of the larger trend of how the industry has gone mad actively making their products worse with half-baked AI features. That is a more interesting story.
DrewADesign · a year ago
And this is the closest thing the professional imaging world has to a broadly available tool designed for high-end professional use cases. In its current state it's barely consistently good enough for throwaway blog headers for large edits, and for small edits it's exactly 0 percent better than the removal tools they added 20 years ago. Adobe had better start giving some love to its professional users, because the alternatives are getting better and their ecosystem is getting worse. It's like they're trying to put themselves in a position where they're competing with Canva and Instagram rather than Affinity, Procreate, Rebelle, etc. If it's not clear to them yet that they're not right around the corner from having their AI tools be a drop-in replacement for their regular stuff, they're gonna have a bad time.
thegeomaster · a year ago
Is it actively worse, though? My impression is that all of the classical inpainting methods are still alive and well in Adobe products. And I think their inpainting works well, when it does work. To me, this honestly sounds like an improvement, especially in a creative tool like Lightroom or Photoshop: the artist has more options to achieve their vision, and it's usually understood to be up to the artist to use the tools appropriately.
johnnyanmac · a year ago
The fact that we call it the least interesting story shows exactly how interesting it is that we just accept that companies are expected to ship broken slop.
ttoinou · a year ago
I don't see where in the picture the bitcoin crop is zoomed from.
jxi · a year ago
You have to click into it as it's not visible from the preview.
crooked-v · a year ago
Bottom left.
permo-w · a year ago
it's in the second picture not the first
uberman · a year ago
Down from the tip of the bird's right wing, near the very bottom of the image.
missing-acumen · a year ago
Question for people who know Adobe Lightroom: could this feature be compromised? Is this just making API calls to some remote thing?
nshireman · a year ago
Lightroom has a local Heal/Remove feature, and at least with LR Classic you have to tick a box for the AI remove, which processes it on Adobe servers.

As for whether it can be compromised... probably? It sends all or some of your photo to a remote server, so that traffic could certainly be intercepted.

missing-acumen · a year ago
I mean, having the model behave this way looks too easy, and I assume Adobe does QC on the features it releases, so I don't see an alternative explanation... unless Adobe's QC is poor/nonexistent.
wmf · a year ago
I'm not sure what you mean by compromised but I'm pretty sure Adobe Firefly AI features are server-based. These features are too good to be done locally.
jsheard · a year ago
Plus even if it could be done locally, doing it server-side has the side benefit (for Adobe) of making it trivial to prevent pirates from ever being able to use those features.
missing-acumen · a year ago
By compromised I mean something like someone having access to Adobe's servers where this is running and uploading troll models, or toying with the model's responses.
IAmGraydon · a year ago
It’s almost like integrating a poorly understood black box with your software is a bad idea.