I've noticed the same issue with WebP and have gone back to JPG/PNG for most things (jpg for photos, png for UI-type images)
I think the real problem is that, like many of the commenters here, most people can't tell the difference because desktop monitors have been stuck in a deadzone of zero innovation for the last 10 years. I'm sure half the folks here are viewing his example images on a 2012-era HD 1920x1080 LCD, which is definitely part of the problem.
It's bizarre. Smaller displays (mobile phones) and larger displays (4K TVs) have fantastic pixel densities now considering their viewing distance. However, any panel in the range of 20"-40" is stuck in the mid-2000s.
Also, I think the author would have done us a favor by using example photos with lighter backgrounds (or changing the background color of his post to black). The harshness of the black images on white doesn't allow the eye to adjust enough to see the issue. If you put those images on a dark background it's super easy to tell the difference.
I have no problem seeing the artefacts on both my 2012-era displays. One of them is a 30" 2560x1600 IPS monitor that was rather good at the time, the other an entry-level 27" TN 1080p TV.
So I don't think display quality really is the problem here. Maybe the drivers, or post-processing filters. Or maybe not everyone has an eye for this.
I have an interest in image processing, and that's the kind of detail one tends to notice with experience. The author of the article is undoubtedly more experienced than me and noticing these details may even be part of his job. He most likely will be able to notice these problems on crappy monitors, as well as telling you in which way that monitor is crap.
It's hard to see in the first set of images, but the second set is much clearer. In the WebP example, look to the right of the subject, about 1/6th of the image's width from the right edge. There's a hard transition between shades of grey. The JPEG version directly above it also has banding but each band is narrower so the difference at the edges is more subtle.
One might argue that if you need to enlarge it to see the artifacts, then the artifacts aren't perceptible enough and the codec is already good enough for the use case.
> The examples are just bad. If you want to show something, screenshot and enlarge it to show the artifacts.
Yes! Where's the red underlines and diffs? I can see the background banding, but the foreground looks the same at a glance except that some of them look ambiguously "off" in ways that could just be placebo.
You'd think a visual artist would be more interested in visual communication and not just a wall of text with un-annotated photos.
Laptop and desktop monitors have been advancing just fine over in the Apple world, with high PPI, brightness and color accuracy being standard for nearly a decade... it's just expensive, so it's one of the first corners cut on PCs, as most folks simply don't care.
> I've noticed the same issue with WebP and have gone back to JPG/PNG for most things (jpg for photos, png for UI-type images)
Wait... I agree for JPG but if you use lossless WEBP instead of PNG, isn't it simply the same pixels, just with a file about 30% smaller than the corresponding PNG file? (and 15% smaller compared to already heavily optimized PNG files like when using zopfli/optipng/etc.).
Isn't the "lossless" in "lossless WEBP" actually lossless when converting a PNG file to WEBP?
FWIW when you convert losslessly a PNG to WEBP, then decompress the WEBP back to a PNG file, then convert again that PNG back to a WEBP file, you get the exact same lossless WEBP file. It's also the same WEBP you get when you encode losslessly from either a PNG or that same PNG but "crushed" with a PNG optimizer.
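A minimal sketch of how one might check that, assuming Pillow with WebP support and any local PNG (photo.png is just a placeholder name):

    # Round-trip a PNG through lossless WebP and check the pixels are identical.
    import os
    import numpy as np
    from PIL import Image

    src = Image.open("photo.png").convert("RGB")
    src.save("photo_lossless.webp", format="WEBP", lossless=True)

    back = Image.open("photo_lossless.webp").convert("RGB")
    print("pixels identical:", np.array_equal(np.array(src), np.array(back)))
    print("bytes:", os.path.getsize("photo.png"), "->", os.path.getsize("photo_lossless.webp"))

If the first print shows True, the round trip really is lossless and only the byte count changes.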
Yeah but I just don't fw webp and other weird formats. JPEG and PNG are tried and true, also it's nice how the extension indicates lossiness.
On the technical side, webp support still isn't like png. Tried dragging a webp into Google Slides just now, got "unsupported image type," which is ironic. I'll try again in like 10 years.
I'm on a 27" 4K IPS screen here and have to squint/zoom in to see the difference the author is writing about.
While it's nice some people really care for the best result I think most people aren't going to notice or care about it.
I guess it's also true that HN is definitely the wrong audience for this post. As the author suggests, if you spend all day in VSCode/Vim, you're among the segment of computer users who looks at images the least as a percentage of time spent on a computer.
It's like the audiophile equivalent of using $500 speaker wire. Nobody normal really cares about the difference, if there's really any difference at all.
I caught it on my Android 12 without full screening. He's talking about the background, not the foreground. The background's color noticeably changes from shot to shot around the edges.
>because desktop monitors have been stuck in a deadzone of zero innovation for the last 10 years.
That's a weird thing to say unless the pixel density is your one and only measure. Regardless of that, the posterization should be perfectly visible on a 2012 FullHD monitor, or even a 1366x768 TN screen of a decade-old laptop. Most commenters here are probably viewing the pictures on a scale different from 1:1.
> That's a weird thing to say unless the pixel density is your one and only measure.
Is it though? We now have OLED TVs and OLED smartphones.
Where are our OLED PC monitors?
On every measure, if you care about colors/contrast/black+white levels/resolution/density, the average computer monitor has fallen far behind.
You can't even buy a smartphone that has a panel half as bad as most PC monitors on the market. And, at least in my area, you'd actually have to go to a lot of effort to find a non-4k TV.
Not true. Monitors now are 1440p or 4k. Even at work for me.
The "issue" is that monitors last a LONG time. And that's good. We don't touch them or fiddle with them. They tend to just work. Phones and shit we keep dropping and breaking, then the battery gets bad.
Also, for gaming you may even want a 1080p 200 Hz monitor, favoring high refresh rate and FPS over pixel density.
> I'm sure half the folks here are viewing his example images on a 2012-era HD 1920x1080 LCD, which is definitely part of the problem.
I just looked at the first two images of the post.
First on two mid-range LCDs: one ASUS IPS from this year and one BenQ TN from 2012, both 24" 1920x1080 (~91 DPI). The difference between the images is clear on both.
And before posting, to make sure, I pulled out a 15" 1024x768 (~85 DPI: basically the same) NEC TN LCD from 2002. And a NEC CRT roughly 15" viewable 1024x768 from 1998. Both on VGA connectors (so there is the typical noise from that, which still doesn't cover up the posterization). The difference between the images is clear on both.
All monitors viewed from 3' away.
People are simply accustomed to poor image quality, including posterization. AAA FPS video games display it on static art backgrounds in the loading menu, and I can never tell if it's intentional. Show them a 240Hz monitor with 30ms input lag, 5 frames of overshoot artifacts, and viewing angles worse than 1998, and they'll be wowed.
It’s quite noticeable on a 2011 MacBook Air, too. The issue is less pronounced if you don’t have a decent display but it’s more that people are not used to it. Like bad kerning, it’s something you’ll notice everywhere if you train your eye to look for it, but otherwise probably don’t notice except that some things feel less appealing.
Also, only a tiny fraction of PC monitors have color gamuts wider than sRGB, proper HDR support, or any kind of calibration.
Recently I’ve been dabbling in HDR video, but I realised that the exercise is futile because I can’t send the results to anyone — unless they’re using an Apple device.
Pixel density isn't the issue. 2K-4K computer monitors are pretty common. But they tend to suck in other ways compared to a MacBook screen. And yes I can tell the difference between the images on my MBP.
I opened the first two pictures in separate tabs and switched quickly between them. There is zero difference. Tried it on two different monitors, Chrome and Firefox. Same with the pictures of the guy at the end.
EDIT: The last comparison is webp twice, he linked it wrong. Here is the jpg one, still no difference:
https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20...
I checked those images on a Macbook 16 M2 Max (standard P3-1600 nits preset), Chrome 120.0.6099.109. All of the WebP images had pretty bad posterization, while JPEG examples did not.
Edit: You have to actually click for a full size image to see the truth. Those inline images had pretty bad compression artefacts, even the supposed lossless versions.
So https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20... (full size lossless WebP image) looks fine, but inline version of the same image https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20... looks terrible.
Edit 2: The difference between...
https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20... lossy-noise.jpg (216 kB JPEG)
https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20... (150 kB WebP)
https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20... (301 kB WebP)
... is pretty obvious. Both of the WebP examples, even that 301 kB version, show clearly visible posterization.
I wonder if there's some issue with the WebP encoder (or the settings) he is using?
Edit 3:
It should be noted that monitor gamma and color profile might affect gradient posterization visibility.
> I wonder if there's some issue with the WebP encoder (or the settings) he is using?
I played around with online optimizers and IrfanView which I had locally. IrfanView got the results they did, no matter what else I tuned, obvious degradation at 90. Online optimizers were not even comparable in how bad they were.
edit: I found Squoosh [0], which has WebP V2 compression marked as unstable. It’s far better, half the size of JPEG 90, but it’s still degraded in comparison. Also, it saves as wp2 file, which neither Chrome nor FF support natively.
[0]: https://squoosh.app/editor
Tried it with a Windows laptop connected to a Samsung LS32A800 32" 4k display. Laptop has factory default settings. Chrome 120. The monitor is pretty low end for a 4k model.
Monitor's picture settings: Custom, brightness 81, contrast 75, sharpness 60, gamma mode1 and response time fastest.
Switched between those three "Edit 2" images blindly, yet the issues are obvious also on this combination.
The JPEG version looks better compared to WebP ones. (Also, this goes against my prior general assumptions about JPEG vs WebP quality.)
The second image and the third image are half the resolution of the other; yeah, some posterization is visible in Shoot-Antoine-0044-_DSC0085-lossless-1200x675.webp, but it's half resolution, and he purposefully added high-frequency noise for his test and then averaged that noise away through resizing, and well, of course it's blurry.
> I opened the first two pictures in separate tabs and switched quickly between them. There is zero difference. Tried it on two different monitors, Chrome and Firefox. Same with the pictures of the guy at the end.
For clarity if anyone is still confused, on Wikipedia's example image, look at the snake's shadow - that's what's happening to the background in the blog's image.
I didn't know the word "posterization", so I'd describe this (slightly?) more simply as a stepped gradient instead of a smooth gradient.
There is a clear difference though, I can see it in all my monitors, from desktop to laptop and even mobile. It's especially visible in the top right quarter.
That being said if you're not into photography you might just not care enough to see it
At 50 y/o my eyesight began to fail and yet the differences in the pictures are freaking obvious. As in: it's impossible to not see how huge the differences are.
And many people commented the same. These simply aren't small differences.
People who cannot see the differences or who only see them after taking a close look should realize something: there are many people for whom the differences are going to be immediately obvious.
> People who cannot see the differences or who only see them after taking a close look should realize something: there are many people for whom the differences are going to be immediately obvious.
That's one possible conclusion. Another is that some people are overstating how obvious it is. I don't mean this as an insult - there's plenty of cases where people's stated perceptions and preferences disappear when tested under strict conditions (hello Audiophiles).
So - it's not immediately obvious whether claims such as yours are trustworthy.
(for the record I can see the difference but it's fairly subtle on my screen)
The first picture is very hard to spot imo. I had to zoom in a bit to spot it initially. You'll see the "blockiness" is slightly worse in the webp version. (Left side of the image, head height)
For the second image, I opened the jpeg 90 [1] and webp 90 [2] versions. Comparing those two, there are clear banding issues to the right of the neck. Slightly less visible are the darker bands circling around the whole image, though still noticeable if you know where to look.
Comparing the jpeg 90 version with either webp lossless, jpeg 100 or jpeg 95, I can spot some very slight banding in the jpeg 90 version just to the right of the neck. Very difficult to spot though without zooming in.
[1] https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20...
[2] https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20...
I don't see any difference either on Windows on either of my monitors.
I wonder if the author's issue is due to the author using a Mac. Back when I was at Google working on VR images, my work machine was a Macbook and my home machine was a normal Windows desktop. I realized that images looked worse on my laptop's screen because the native resolution of the display hardware was something like 4000 (numbers made up because I don't remember the specs) but the display was set to 3000. So OSX would incorrectly rescale the image using the wrong gamma curves. Since I was trying to calibrate VR headsets, I spent way too much time looking at gamma test images like https://www.epaperpress.com/monitorcal/gamma.html where a high res pure black + pure white grid is shown next to a set of grays. That was how I realized that my Mac was incorrectly resizing the graphics without being properly gamma aware. I also realized that if I set the OS resolution to 2000, it would use nearest neighbor instead of bilinear filtering and the gamma issue would go away. My Windows desktop had the OS running at the native resolution of the display so this wasn't an issue there. This also wasn't an issue if I had an external monitor hooked up to the Mac and set to its native resolution.
Apple users tend to say "it just works" which is true 90% of the time. But there are cases like this where it doesn't "just work" and there was no easy way to force the OS to run at its native resolution on that specific laptop.
Edit:
I tested with the second set of images (the upper body shot) and the problems with the gradient are visible there. But I still can't see a difference when quickly flipping through the first set of images on my properly calibrated native-resolution monitor. I _can_ see some banding on one of my monitors that was intentionally miscalibrated so that I could read text better.
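For anyone curious what gamma-unaware rescaling does to the numbers, here is a small numpy sketch of just the averaging step (my own illustration of the effect described above, not Apple's actual scaler):

    # Average a 1:1 black/white pixel grid down to one value, two ways.
    import numpy as np

    def srgb_to_linear(v):
        return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

    def linear_to_srgb(v):
        return np.where(v <= 0.0031308, v * 12.92, 1.055 * v ** (1 / 2.4) - 0.055)

    checker = np.array([0.0, 1.0, 0.0, 1.0])   # encoded sRGB: black, white, black, white

    naive = float(checker.mean())                                  # averages the encoded values
    aware = float(linear_to_srgb(srgb_to_linear(checker).mean()))  # averages light, then re-encodes

    print(f"naive downscale:       {naive:.3f}")   # 0.500 -> renders too dark
    print(f"gamma-aware downscale: {aware:.3f}")   # ~0.735 -> matches the grey it should blend into

That 0.500 vs ~0.735 gap is exactly what the grid-next-to-grey gamma charts are designed to expose.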
It could also be a browser issue implementing webp. There's a decade-old bug in Chrome, where they're using the wrong color profile for CSS, so colors are brighter than in other browsers. It's extreme enough that one of the designers I worked with spotted it in passing just glancing at my Firefox window, which led down a rabbit hole finding the bug report.
https://bugs.chromium.org/p/chromium/issues/detail?id=44872
Total aside, y'know how people do things like make their smartphones greyscale (or at least mute the colors a bit) to reduce smartphone addiction? It wouldn't surprise me if these over-saturated colors were part of why Chrome got so popular so fast...
> I wonder if the author's issue is due to the author using a Mac.
It is not, since I tested positive on Linux. What post processing would any OS even do on an image when you view it in a new tab as one is meant to do for this tutorial?
I did the same, and it took me a long time to spot it, but in the upper-right corner you see circles in the WebP version. It's outside the centre of attention, so it's not that obvious. Actually, it wasn't until I saw the second picture and knew what to look for that I spotted this in the first picture.
It's not so easy to see if the browser zooms the image, so make sure to open the image and set zoom to 100%. I also need to keep my face fairly close to my screen (12" 1920×1080, so not that large).
I can readily tell the difference on the guy's forehead. The webp version has less dynamic range and looks like a big white spot, while jpeg has more shades.
The same image rendered with different os/hardware will almost always look different.
Different operating systems and monitors have different default gamma curves for rendering brightness and black levels. Monitors are most likely either uncalibrated, or _can't be calibrated_ to render a greyscale with just 64 brightness levels distinctly.
TFA is calling attention to "posterization" in their portrait backgrounds. They expected the grey background to have a smooth gradient, but, depending on your monitor, you should see visual jagged stair-steps between different grey levels.
When an image uses a color palette that's insufficiently variable to render the original image colors with high fidelity, that's "posterization."
(I paid for my college doing high-end prepress and digital image services, and got to work with a ton of really talented photographers who helped me see what they were seeing)
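If you'd rather test your own chain (panel, cable, OS gamma) than argue about it, here is a quick sketch that writes a smooth gradient stacked on top of the same gradient crushed to 64 levels; visible stripes in the bottom half are what posterization looks like, and stripes in the top half mean your setup resolves single 8-bit steps:

    # Banding test image: smooth 8-bit gradient on top, 64-level version below.
    import numpy as np
    from PIL import Image

    w, h = 1024, 256
    smooth = np.tile(np.linspace(0, 255, w), (h, 1))
    banded = np.floor(smooth / 4) * 4          # keep only 64 distinct grey levels

    img = np.vstack([smooth, banded]).astype(np.uint8)
    Image.fromarray(img, mode="L").save("banding_test.png")
    # View it at 100% zoom so the browser or OS doesn't rescale it.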
I thought it was pretty clear. I'm not even running any special monitor/computer setup. The light behind her is clearly different, it almost looks like a photo with different lighting.
If I view the full images of the first two in two Chrome tabs, two Firefox tabs, or download them and open then both in Preview on a 27" 5k iMac and flip back and forth between the two I see nothing changing.
There is definitely something changing though, because if I open each in Preview, switch Preview to full screen, set the view to be actual size, and take a full screen screenshot, the screenshot for the WebP image is 14% smaller than the one for the JPEG.
If I use screen zoom to go way in and then flip between the two images I can finally see some changes. The JPEG background has more small scale variation in shade. In the hair there are some white streaks that aren't quite as long in the WebP. Lots of small changes in the shirt, but it is about 50/50 whether or not any given difference there looks better in the JPEG or the WebP.
This whole thread feels like one of those "I can tell the difference between an MP3 encoded at 320 kbit/s and one encoded at 256 kbit/s!" audiophile threads. Yes, there are probably people out there with well-calibrated ears who can, but I am sure not one of them. FWIW I have a 27" 5k iMac and can't even remotely see any difference between the images.
Lots of replies here saying either: "I can't see the difference" or "Wow the difference is stark".
My takeaway as a non-photographer is: "different tools for different uses". If you're posting photography where image quality matters then use JPEG or another format that you think displays the image best. If you're writing a blog post with screenshots or other images where minute quality doesn't matter that much then use WebP.
There's a clear difference between the JPEG and WEBP versions. Especially on the background on the right of the man.
There are clear bands of various shades of grey that circle out of the brighter areas behind the face and from the mid-right edge. They appear to join about two thirds from the middle to the right edge. That artifacting is most notable at full size, but is still visible on the smaller size on the web page.
You either have a bad screen or limited eyesight, it's quite funny to me that this is the most upvoted comment.
I'm looking at an LG UltraFine, which as far as I know, is not a bad screen, but I can't really tell.
I've read all the comments, and zoomed way in. I can see it on one of the pairs if I pay attention, but on most of them, I still am not sure how to even look for the difference.
Last time I had a vision check, I got a 20/15, which is supposed to be better than "normal". It may have declined since then.
I don't think it's a monitor or eyesight thing. I think I don't know "how" to look for the effect I'm supposed to be seeing.
I can see a difference in the gradients, but in practical use on the average website does that even matter?
Photography portfolios are the one use case where having gigantic JPEG 90 images might make sense I suppose. Although everyone is going to get annoyed at your loading times.
The author is complaining about the consequences of recompressing images, which are also black and white and have a huge gradient background, and on top of that the post is full of flaws. I don’t know, Hacker News is better when it’s less of a Hacker Rants.
> which are also black and white and have a huge gradient background
That's the entire point of this article. Rather than picking a dozen different kinds of images at random, it considers the problem within the very specific context of actual photographs, made by actual professional photographers, with specific (yet not uncommon) artistic/stylistic choices.
It's like showing why an audio codec sucks for cellos. Yes, there is going to be a hundred other things you may want to record (like a podcast, a rock band, etc), and most of them will not be cellos, but still that doesn't change the fact that the codec sucks for cellos.
It's not, it's just that people who pour thousands of dollars and hours into photography are more likely to care. Same with music, most people are fine with $15 earphones while musicians or music enthusiasts will find them disgusting.
In my opinion the worst and most noticeable downside of webp is the forced 4:2:0 chroma subsampling. On many images with bright colors you can clearly see the color and brightness loss without an educated eye.
On comparison [1] you can clearly see that the top right balloon has lost its vibrant red color. On comparison [2] the bright blue neon art on the center has lost its brightness.
[1] https://storage.googleapis.com/demos.webmproject.org/webp/cm...
[2] https://storage.googleapis.com/demos.webmproject.org/webp/cm...
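A sketch of how one could poke at this locally with Pillow (balloons.png stands in for any saturated test shot): JPEG lets you keep chroma at full resolution, while Pillow's lossy WebP encoder exposes no subsampling option at all, which matches the 4:2:0-only behaviour described above.

    # JPEG at 4:4:4 vs 4:2:0 chroma, next to lossy WebP (4:2:0 only).
    import os
    from PIL import Image

    src = Image.open("balloons.png").convert("RGB")

    src.save("out_444.jpg", quality=90, subsampling=0)   # 4:4:4, full-resolution chroma
    src.save("out_420.jpg", quality=90, subsampling=2)   # 4:2:0, quarter-resolution chroma
    src.save("out.webp", quality=90)                     # no subsampling knob to turn here

    for name in ("out_444.jpg", "out_420.jpg", "out.webp"):
        print(name, os.path.getsize(name), "bytes")

Flipping between out_444.jpg and out.webp around hard red or blue edges should show the saturation loss being described.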
Not to stir up yet another debate but yeah, definitely not able to perceive the difference in either of the examples you linked. It would be helpful if that site let you drag the vertical comparison bar at least. On an iPhone 14 display.
I can see it in the second link, setting webp to small, in the orange reflections above the rightmost needle tree on the outside. ... oh, you can't drag it? ...
This article didn't go into the biggest problem with webp for me: the inconvenience of the format outside the browser compared to the small space saving. There are better formats (the video-codec inspired ones like heif, avif, and what might come out of h266, or even jpeg-xl), and webp just seems like a compromise without enough upside.
My favorite is the URL ends with jpg but when you save the image you get a fucking WebP. Thanks everyone for breaking the Internet in the name of Google. The best.
Even worse than in the original blog post: because of this you may be dealing with a JPEG image, converted to WEBP, and then back to JPEG. And then maybe someone edited that JPEG and it got converted back to WEBP!
A large chunk of the HN commenters are debating over banding they can or can't see in a best-case-scenario WEBP image. The reality is the bulk of the WEBP images look horrible, something I've started to really notice only recently. Of course, you can "clean" the images by using different generative upscaling processes now, which makes it pretty ironic how much electricity we are using because someone wanted to save 45kb.
Also this reminds me a lot about GIFs being converted to JPEGs. 25~ years ago there was a lot of nice, clean GIF screenshots (256 colors was all you needed) that got destroyed by JPEG.
Google tells developers to use WEBP but has no problem serving petabytes of video ads no one wants to watch!
> To the non-educated eye, this might look ok, but for a photographer it’s not, and for several reasons.
There surely must be better examples to show "non-educated" plebs (to use the tone of the post) why webp is bad and to justify the post and the tone.
I'm on Android, maybe this is why all the pics look the same quality to me?
Also - yeah, if you are making pics for educated eyes: don't use tech that is not suitable for educated eyes? Or don't outsource that decision making to others?
The author's point is that if you are making this tech, you should have educated eyes.
And given all the confident comments in this thread claiming the author is full of shit and there's no difference, I think their frustration is justified? If you can't see the difference in the first images that's fine but you probably shouldn't be confidently claiming to know better than the author, let alone designing an image codec.
His font choice is terrible for my legibility. Maybe for others it's great. But it made the already difficult article that much harder to read. And I like this topic. I already seriously question his sense of what is reasonable and good and for what purpose. His purposes are so alien to mine that his opinion ends up being pretty irrelevant to mine. I wish him well with his.
I can't see the things he's pointing out in the images, and I tried and tried.
I use webp extensively, there have been zero complaints from users about the images. But I don't make art sites. I make software people use to get stuff done. I don't transfer images above maybe 50-80k. Art, aside from modest marketing, is most definitely not the point.
The author's point is deeply stupid. As he admits:
> WebP re-encoding of an already lossy compressed JPEG
So... all this shows nothing. Is webp worse than jpeg? Not addressed. He re-encoded jpeg to webp and it somehow didn't magically cure the compression artifacts he's seeing! Who coulda thunk!
Any comparison starts with taking the originals, encoding to jpeg and webp, and comparing that. Or he could repeatedly encode original -> jpeg -> jpeg and compare that to what he has, which is original -> jpeg -> webp
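A minimal sketch of that fairer test, assuming an uncompressed master (master.tif here) and Pillow; a serious comparison would also sweep the quality settings until the outputs look equivalent, rather than trusting that 90 means the same thing in both encoders:

    # Encode the same uncompressed master once per format, instead of JPEG -> WebP.
    import os
    from PIL import Image

    master = Image.open("master.tif").convert("RGB")

    master.save("candidate.jpg", quality=90)
    master.save("candidate.webp", quality=90)

    for name in ("candidate.jpg", "candidate.webp"):
        print(name, os.path.getsize(name), "bytes")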
Still, the author could do more to highlight the differences using zooms and annotations. The banding in the background is particularly strong and would help their point to highlight visually to the reader.
Look at the man with his face screwed up. Look at the edges of his shirt near his shoulders.
In the pictures that had bad image quality, there is a sort of glow around his shoulders, as if they are backlit.
In the pictures that had good image quality, the gradient was smooth. There was no backlit glow around his shoulders; it just looked like a smooth gradient background image.
To be clear, I'm not a photographer. I'm a DevOps engineer. The last time I professionally wrote a line of JavaScript was at least 11 years ago.
…so essentially WebP is fine for mobile devices and the vast majority of desktop web cases. I’m fine with WebP not being a suitable format for permanent storage of photography.
A close up section of the same zone in the images would make them visible. I could hardly see the artefacts in the first place as my attention was caught with the highly contrasted parts of the images.
For starters, anyone who has ever worked with a codec will know that you don't compare them with ONE SINGLE IMAGE.
This whole basic idea of the blog post is just to generate more whining and clicks and not to actually make a comparison between formats that's worth a basic smell test.
This cuts against WebP more: all of Google’s marketing was “it’s a third smaller!!!!” and then when you looked they were comparing it to unoptimized libjpeg output and using computational metrics like SSIM which only crudely approximate what humans notice about image quality.
I did the same comparison the author did when WebP came out but used an optimized JPEG encoder and found the same conclusion: when you produced subjectively equivalent images, the savings were more like -10% to +15% and for web sites which didn’t get Google-scale traffic the performance impact was negative since it made caching less effective and you had to support an entire new toolchain.
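For reference, the computational side of such a comparison is only a few lines (a sketch assuming scikit-image and two same-sized files; the filenames are placeholders). The catch, as noted above and in TFA, is that a score near 1.0 can coexist with banding the eye picks up immediately:

    # SSIM between a master and a re-encoded version; high scores can still hide banding.
    import numpy as np
    from PIL import Image
    from skimage.metrics import structural_similarity

    a = np.array(Image.open("master.png").convert("L"))
    b = np.array(Image.open("candidate.webp").convert("L"))

    print("SSIM:", structural_similarity(a, b, data_range=255))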
A bit of context: Aurelien Pierre is known to be a major contributor to Darktable (an open source raw developer / catalog; in other words, an open source Adobe Lightroom), and is known to have strong opinions about the correct way to do stuff, to the point of abrasiveness and to the point where he has forked Darktable into his own project (Ansel; see the HN discussion some time ago https://news.ycombinator.com/item?id=38390914 ).
If I cared about archive image quality (and I do), I wouldn't re-compress older images in a new format unless I could do so from uncompressed originals. Re-encoding from a lossy compressed source will make quality worse. Storage is cheap and getting cheaper.
What would make sense is choosing safe settings for compressing new photos in the new format.
> Re-encoding from a lossy compressed source will make quality worse.
JPEG-XL is supposed to reencode old JPEG files into 20% smaller files without quality loss though.
In context, Google has been holding JPEG-XL back by removing support for it from Chrome and refusing to reinstate it, claiming that it did not have good enough "incremental benefits compared to existing formats" such as webp.
Wow, I didn't know that. A top google result says:
> It is possible to losslessly transcode JPEG images into JPEG XL. Transcoding preserves the already-lossy compression data from the original JPEG image without any quality loss caused by re-encoding, while making the file size smaller than the original.
I wonder how it does that and why JPEG didn't notice it could. I would re-encode to JPEG-XL, when supported. So then the situation isn't that WebP is so great but rather Chrome's not so great.
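The trick, as I understand it, is that JPEG XL can take the JPEG's already-quantized DCT coefficients as-is and re-compress them with a more modern entropy coder, keeping enough bookkeeping to rebuild the original .jpg byte for byte, so no new quantization ever happens. A sketch with the libjxl reference tools, assuming cjxl and djxl are installed (lossless JPEG recompression is, as far as I know, their default for JPEG input):

    # Recompress a JPEG to JPEG XL, then reconstruct the original file from it.
    import filecmp
    import subprocess

    subprocess.run(["cjxl", "photo.jpg", "photo.jxl"], check=True)
    subprocess.run(["djxl", "photo.jxl", "photo_roundtrip.jpg"], check=True)

    # Should print True if the .jxl kept the JPEG reconstruction data.
    print(filecmp.cmp("photo.jpg", "photo_roundtrip.jpg", shallow=False))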
Careful with the JPEG-XL re-compression, though--depending on how you're re-encoding, jxl may use SSIM to evaluate for visual losslessness, and the whole point of TFA is that SSIM is blind to posterization, but (some) humans aren't.
Disk space is cheap. It's most likely not worth the 20% compression to lose your original images (and possibly lose metadata as well--it's quite hard to robustly retain all vendor-specific MakerNotes, for example).
Okay, but that isn't really the point. You can start from a perfect gradient saved as a PNG and you will still see that WebP has visible banding at -q100 while JPEG is visually transparent at -q90.
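That claim is cheap to test yourself; here is a sketch that builds a synthetic dark gradient (roughly the kind of studio background in the article) and runs it through both encoders via Pillow, with the caveat that these quality numbers map only loosely onto the cwebp/cjpeg flags:

    # Synthetic smooth gradient -> lossy WebP q100 and JPEG q90; look for stair-steps.
    import numpy as np
    from PIL import Image

    grad = np.tile(np.linspace(40, 90, 1600), (900, 1)).astype(np.uint8)
    Image.fromarray(grad, mode="L").save("gradient.png")

    src = Image.open("gradient.png").convert("RGB")
    src.save("gradient_q100.webp", quality=100)   # lossy WebP at its highest quality setting
    src.save("gradient_q90.jpg", quality=90)
    # Open all three files at 100% zoom on a dark page background and compare.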
I think the author is focusing on the wrong thing. They focused on the difference in format, when they should have focused on the compression. Different image processing programs will have different compression even when set to the same number (eg "80").
I think for a truly meaningful comparison you'd need to test a variety of images, including full color with busy backgrounds as well as these b&w studio portraits on a smooth gradient type bg, and test a variety of programs like ImageMagick, GraphicsMagick, sharp, Photoshop, whatever cloud offerings, etc.
The other issue I see is use case. If you're a professional photographer trying to upload full size full quality photos, maybe just don't compress at all so you know your creative / editing work is completely preserved. That use case is not the average use case of a website displaying a reasonably sized image of reasonable quality. For many situations a significantly smaller image might be worth having a more compressed image, and for many images the compression won't be as noticeable as it is in a full resolution professional studio photo with a large gradient type background.
Generally though I would expect wide gamut monitors to make a significant difference for these types of artifacts.
The examples are just bad. If you want to show something, screenshot and enlarge it to show the artifacts.
WebP image gradients just looked broken (posterized) except the lossless one, which was (obviously) perfect.
The difference is in the colors around the edges of the picture: the background changes noticeably, even on a non-fullscreen image on my Android 12 device.
They really don't...
He's re-encoding the JPEG compressed images. That is a huge mistake.
One easy difference to spot is the background in this pair is posterized (https://en.wikipedia.org/wiki/Posterization) in webp but not in jpg:
https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20...
https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20...
>> To the non-educated eye, this might look ok, but for a photographer it’s not, and for several reasons.
webp is a banding nightmare.
4k Dell monitor, Safari on a Mac.
There's definitely very ugly "banding" going on in the gradients on the WebP versions i say as someone who's worked extensively with UX and interfaces.
I'm on a M2 Macbook Air.
See my post lower in this thread.
https://news.ycombinator.com/item?id=38656046
I have to ask, what could be the reason this gives me pale blue (other colors are okayish) jpg > webp:
cwebp -pass 10 -m 6 -nostrong -sharp_yuv -quiet -q 60 -sharpness 2 $1 -o
I was able to see it without full screening.
It's easy enough to see.
[1] https://news.ycombinator.com/item?id=38653224