declan_roberts · 9 months ago
> Merely keeping up with the stream of issues found by fuzzing costs Google at least 0.25 full time software engineers

I like this way of measuring extra work. Is this standard at Google?

pradn · 9 months ago
Yes, a SWE-year is a common unit of cost.

And there are internal calculators that tell you how much CPU, memory, network etc a SWE-year gets you. Same for other internal units, like the cost of a particular DB.

This allows you to make time/resource tradeoffs. Spending half an engineer’s year to save 0.5 SWE-y of CPU is not a great ROI. But if you get 10 SWE-years out of it, it’s probably a great idea.

I personally have used it to argue that we shouldn’t spend 2 weeks of engineering time to save a TB of DB disk space. The cost of the disk comes to less than a SWE-hour per year!
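
As a rough illustration of the kind of back-of-the-envelope comparison involved (all numbers below are made up for the sketch; the internal calculators supply the real rates):

    // Illustrative only: assumed costs, not Google's actual internal rates.
    fn main() {
        let swe_year_usd = 300_000.0;      // assumed fully-loaded cost of one SWE-year
        let disk_usd_per_tb_year = 25.0;   // assumed cost of 1 TB of DB disk for a year

        // Proposal: spend 2 weeks of engineering time to save 1 TB of disk.
        let eng_cost = swe_year_usd * (2.0 / 52.0);
        let yearly_saving = disk_usd_per_tb_year;

        println!("engineering cost: ${eng_cost:.0}, saving: ${yearly_saving:.0}/year");
        println!("break-even after {:.1} years", eng_cost / yearly_saving);
    }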

jorvi · 9 months ago
Note that this can lead to horrid economics for the user.

An example being Google unilaterally flipping on VP8/VP9 decode, which at that time purely decoded on the CPU or experimentally on the GPU.

It saved Google a few CPU cycles and some bandwidth but it nuked every user's CPU and battery. And the amount of energy YouTube consumed wholesale (so servers + clients) skyrocketed. Tragedy of the Commons.

nairb774 · 9 months ago
FTE. Full-time equivalent. Most costs are denominated in FTE - headcount as well as things like CPU/memory/storage/...

The main economic unit for most engineers is FTE, not $.

sanbor · 9 months ago
I’m pretty sure FTE stands for full-time employee
h3half · 9 months ago
This is also extremely common in engineering services contracts, both for government and private-sector contracting. RFPs (or their equivalent) will specifically lay out what level of effort is expected and will often denote effort in terms of FTE.
fph · 9 months ago
Technically a measure of power, work/time.
summerlight · 9 months ago
Yeah, kind of. It's preferred because whenever you propose or evaluate a change, you can get a rough idea of whether it was worth the time. For example, if you work on some significant optimization, you can measure it and justify it by showing that the savings were 10 SWE-years when you only put in 1 SWE-quarter.
adam_gyroscope · 9 months ago
Yep, often things are measured in FTE or FTE-equivalent units. It’s not precise of course but is a reasonable shorthand for the amount of work required.
kiicia · 9 months ago
Yes, all those well-paid C-level managers cannot handle multiple units, so they require everyone to use one “easy to understand unit so that everything is easy to compare and micromanage”
Someone · 9 months ago
If you want to decide which of several options is better, how do you propose doing that without using a single number? You can’t, in general, compare multi-dimensional quantities.
MForster · 9 months ago
It's quite the opposite. This is intended for engineers to make good trade-off decisions as a rule of thumb without financial micromanaging.
whazor · 9 months ago
I think this means the engineer fuzzes 4 projects?
colejohnson66 · 9 months ago
It means it costs them three months per year per employee. So with 'n' employees, n/4 years of man-power are spent fixing issues found by fuzzing. As others have said, FTE (full-time equivalent) is the more common name.
maxdamantus · 9 months ago
I hope that if we switch away from FreeType, we'll still have a way to use TTF hinting instructions fully.

Windows/macOS don't seem to have a way to enable proper hinting anymore [0], and even FreeType (since 2.7 [1]) defaults to improper hinting (they call it "subpixel hinting", which doesn't make sense to me in theory, and in practice still seems blurry, as if it's unhinted).

In case anyone's wondering what properly hinted text looks like, here's a screenshot [2]. This currently relies on setting the environment variable `FREETYPE_PROPERTIES=truetype:interpreter-version=35`, possibly some other configuration through fontconfig, and using a font with good hinting instructions (e.g., DejaVu Sans and DejaVu Sans Mono in the screenshot).

My suspicion is that Windows moved away from font hinting after XP because it's very hard to create fonts with good sets of hinting instructions (aiui, OTF effectively only allows something like "autohinting"), so in the modern world of designer fonts it's easier to just have everyone look at blurry text. Some other minor reasons would be that UI scaling in Windows sometimes (or always?) introduces blurring anyway, and viewing raster images on screens of varying resolutions also introduces scaling blur.

[0] Windows still has a way to enable it, but it disables antialiasing at the same time (using an option called "Smooth edges of screen fonts" in "Performance Options"). This basically makes it look like XP, which imo is an improvement, but not as good as FreeType, which can do hinting and antialiasing at the same time.

[1] https://freetype.org/freetype2/docs/hinting/subpixel-hinting...

[2] https://gist.githubusercontent.com/Maxdamantus/3a58d8e764b29...

AndriyKunitsyn · 9 months ago
>In case anyone's wondering what properly hinted text looks like, here's a screenshot

I'm not an expert, but - I'm sorry, it's not.

The point of hinting is to change the shape of the glyphs so the rasterized result looks "better". What "better" means is, of course, purely subjective, but most people would agree that it's better when the perceived thicknesses of strokes and gaps are uniform and the text is less blurry, so the eye can discern the shapes faster. I don't think your rendering scores high points in that regard.

I'll take a phrase from your rendering: "it usually pays" [0]. I don't like it, I'm sorry. The hinter can't make up its mind whether the stroke should be two pixels wide, or three pixels with faint pixels on the sides and an intense one in the center - therefore, the perceived thicknesses vary, which increases eye strain; the "l"s are noticeably different; the "ys" at the end clump together into one blurry thing; and there's a completely unnecessary line of faint pixels at the bottom of the characters, which hinting should have prevented.

The second line is how it looks on Windows at 150% scale. Verdana is a different font, so it's an unfair comparison (Verdana was designed specifically to look good at low resolutions), and the rainbows can be off-putting, but I still think the hinter tucks the shapes into pixels better.

Maybe I don't understand something, or maybe there's a mistake.

[0] https://postimg.cc/cKCQR60F

maxdamantus · 9 months ago
I'm not entirely sure how you got that first line, but if it's derived from my image, your system must be scaling the image, which introduces blur. Since you mentioned a 150% scale, I'm guessing your image viewer is rendering each pixel in my image as 1.5 pixels (on average) on your screen, which would explain the scaling/blurring, and makes it difficult to demonstrate proper hinting on your screen with raster images (I alluded to this in my previous post).

Here's an updated version of your image, with the actual pixel data from my image copied in, at 8x and 1x scale [0]. It should be possible to see the pixels yourself if you load it into a tool like GIMP, which preserves pixels as you zoom in.

It should be fairly clear from the image above that the hinting causes the outlines (particularly, horizontal and vertical lines) to align to the pixel grid, which, as you say, both makes the line widths uniform and makes the text less blurry (by reducing the need for antialiasing; SSAA is basically the gold standard for antialiasing, which involves rendering at a higher resolution and then downscaling, meaning a single physical pixel corresponds to an average of multiple pixels from the original image).

Out of interest, I've done a bit of processing [1] on your image to see what ClearType is actually doing [2]. As described in the FreeType post I linked, it does seem to be using vertical hints (the horizontal lines don't have colours next to them; this is obvious from your picture), and it does seem to be ignoring any horizontal hints, since the grey levels around the vertical lines are inconsistent and the image still looks horizontally blurry.

I might as well also link to this demo I started putting together a few years ago [3]. It uses FreeType to render with hinting within the browser, and it should handle different DPIs as long as `devicePixelRatio` is correct. I think this should work on desktop browsers as long as they're scaling aware (I think this is the case with UI scaling on Windows but not on macOS). Mobile browsers tend to give nonsense values here since they don't want to change it as the user zooms in/out. Since it's using FreeType alone without something like HarfBuzz, maybe some of the positioning is not optimal.

[0] https://gist.githubusercontent.com/Maxdamantus/3a58d8e764b29...

[1] After jiggling the image around and converting it back from 8x to 1x, I used this command to show each RGB subpixel in greyscale (assuming the typical R-G-B pixel layout used on LCD computer monitors):

    # Show each R/G/B subpixel sample as its own grey pixel, upscaled so each original pixel becomes a 24x24 block (8x24 per subpixel).
    width=137; height=53; stream /tmp/image_1x.png - | xxd -p | sed 's/../&&&/g' | xxd -r -p | convert -size $((width*3))x$((height)) -depth 8 rgb:- -interpolate nearest-neighbor -interpolative-resize $((100*8))%x$((300*8))% /tmp/image_8x_rgb.png
[2] https://gist.githubusercontent.com/Maxdamantus/3a58d8e764b29...

[3] https://maxdamantus.eu.org/ftv35-220719/site/

raggi · 9 months ago
I really want hinting, subpixel rendering, and anti-aliasing available on all systems, and I want to pick the appropriate set of techniques based on the DPI and font size to minimize error and fuzzy excesses, and to balance computation cost. Obviously we still don't all have high DPI all the time and likely won't for a long while. Apple dropping support was a disaster, and I currently run Apple devices plugged into regular-DPI screens with a slightly downsized resolution to trick them into rendering in a sane way, but it's a bonkers process that's also expensive and unnecessary for non-font paint work.

That said, one of the horrors of using the old font stacks, and in many ways the very thing that drove me away from fighting Linux desktop configurations (giving me now about 10 years of freedom from that pain), was dealing with other related disasters. First, it's a fight to even get things consistent; as seen in your screenshot, there's inconsistency between the content rendering, the title, and the address bar. Worse, though, the kerning in `Yes` in your screenshot is just too bad for constant use, for me.

I hope that as we usher in a new generation of font tech, we can finally reach consistency. I really want to see these engines used on Windows and macOS as well, where they're currently only used as a fallback, because I want them to produce extremely high-quality results on all platforms; then I can use them in desktop app work and stop having constant fights with these subsystems. I'm fed up with it after many decades; I just want to be able to recommend a solution that works and have it slowly become the consistently correct output for everyone, everywhere.

emidoots · 9 months ago
If you want consistency, you only need to convince everyone to switch to a single font renderer (e.g. FreeType). That won't happen, though, because OS font renderers have quirks that cause them to render the same things in subtly different ways, and users have come to unintentionally expect those quirks. Even if rendering is 'better' in one app, if it doesn't match the others or what the user is used to, then it won't 'feel native'.

Maybe if what FreeType is pushing for (fonts as WASM binaries) continues to take hold and encompasses more of fonts, we'll find more consistency over time, though.

drott · 9 months ago
Yes, Skrifa executes TrueType hints and has a new autohinting implementation written in Rust. We use these modes in Chrome.
maxdamantus · 9 months ago
Hmm... I tried using the "tools/viewer/viewer --slide GM_typeface_fontations_roboto" example from the Skia repository earlier (swapping out the Roboto font for DejaVuSans, since the Roboto font doesn't seem to hint properly horizontally [0]), but the result [1] seems to only be hinted vertically, so it's similar to the FreeType "v40" interpreter, which supposedly ignores horizontal hints.

Admittedly I haven't looked into how the setup is configured, and haven't tried it in Chrome, so maybe it's still possible to enable full hinting as intended by the font somehow.

[0] https://gist.githubusercontent.com/Maxdamantus/3a58d8e764b29...

[1] https://gist.githubusercontent.com/Maxdamantus/3a58d8e764b29...

nyanpasu64 · 9 months ago
> In addition, before the integration into Chromium, we ran a wide set of pixel comparisons in Skia, comparing FreeType rendering to Skrifa and Skia rendering to ensure the pixel differences are absolutely minimal, in all required rendering modes (across different antialiasing and hinting modes).

I'm hoping (but not sure) Skrifa will support hinting (though I'm not sure how it interacts with fontconfig). I noticed your screenshot uses full hinting (a subjective choice I currently don't use on my machines), presumably with integer glyph positioning/advance which isn't scale-independent (neither is Windows GDI), though this is quite a nonstandard configuration IMO.

Keyb0ardWarri0r · 9 months ago
This is the true power of Rust that many are missing (like Microsoft with its TypeScript rewrite in Go): a gradual migration towards safety and the capability of being embedded in existing projects.

You don't have to do the Big Rewrite™, you can simply migrate components one by one instead.

TheCoreh · 9 months ago
> like Microsoft with its TypeScript rewrite in Go

My understanding is that Microsoft chose Go precisely to avoid having to do a full rewrite. Of all the “modern” native/AoT compiled languages (Rust, Swift, Go, Zig) Go has the most straightforward 1:1 mapping in semantics with the original TypeScript/JavaScript, so that a tool-assisted translation of the whole codebase is feasible with bug-for-bug compatibility, and minimal support/utility code.

It would of course be _possible_ to port/translate it to any language (including Rust), but you would essentially end up implementing a small JavaScript runtime and GC, with none or very little of the safety guarantees provided by Rust. (Rust's ownership model generally favors drastically different architectures.)

jeppester · 9 months ago
As I understood their arguments, it was not about the effort needed to rewrite the project.

It was about being able to have two codebases (old and new) that are so structurally similar that it won't be a big deal to keep updating both.

mdriley · 9 months ago
Hi, I lead Chrome's Rust efforts. I think the Typescript folks made a great and well-reasoned decision.
aapoalas · 9 months ago
Thank you, it's really nice seeing cooler heads prevail on the question of "why didn't they build in my favourite thing X?"

In entirely unrelated news, I think Chrome should totally switch engines from V8 to a Rust built one, like hmm... our / my Nova JavaScript engine! j/k

Great stuff on the font front, thank you for the articles, Rust/C++ interop work, and keep it up!

K0nserv · 9 months ago
In a similar way, Rust can be very useful for the hot paths in programs written in Python, Ruby, etc. You don't have to throw out and rewrite everything; because Rust can look like C to the outside world, you can use it easily anywhere C FFI is supported.
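
As a minimal sketch of what that looks like on the Rust side (the function name and signature are made up for illustration, and you'd build the crate as a cdylib):

    /// Sum a buffer of f64 values handed over as a raw pointer + length,
    /// callable from C, C++, Python (ctypes/cffi), Ruby (FFI), etc.
    ///
    /// # Safety
    /// The caller must pass a valid pointer to `len` readable f64 values.
    #[no_mangle]
    pub unsafe extern "C" fn sum_f64(data: *const f64, len: usize) -> f64 {
        if data.is_null() {
            return 0.0;
        }
        std::slice::from_raw_parts(data, len).iter().sum()
    }
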
steveklabnik · 9 months ago
> like Microsoft with its TypeScript rewrite in Go

Go is also memory safe.

gpm · 9 months ago
I'd argue technically not due to data races on interface values, maps, slices, and strings... but close enough for almost all purposes.

PS. Note that unlike in most languages, a data race on something like an int in Go isn't undefined behavior, just non-deterministic and discouraged.

GaggiX · 9 months ago
Go can have data races, so I would not consider it memory safe.
tedunangst · 9 months ago
Since you've mentioned that you never see the annoying strike force threads that others complain about, you're in one.
Keyb0ardWarri0r · 9 months ago
But it can't be embedded in other projects as easily as Rust can (FFI, WASM).

Deleted Comment

raggi · 9 months ago
Odd comparison / statement in the context of MS rewriting GDI in Rust
wavemode · 9 months ago
You're saying choosing Go over Rust was a mistake? Why?
timewizard · 9 months ago
"If you build it they will come."
AndriyKunitsyn · 9 months ago
So, there's Skia. Skia is a high-level library that converts text to glyph indices, does high-level text layout, and caches glyphs (and it also makes GPU calls). But the actual parsing of the font file and conversion of glyphs to bitmaps happens below, in FreeType.

Skia is made in C++. It's made by Google.

There's FreeType. It actually measures and renders glyphs, simultaneously supporting various antialiasing modes, hinting, kerning, interpreting TrueType bytecode and other things.

FreeType is made in C. It's not made by Google.

Question: why was it FreeType that got a Rust rewrite first?

raggi · 9 months ago
It has a smaller API surface into the consuming applications and platforms.

Skia's tendrils run deep and leak all over the place.

There's also a different set of work to invest in: next-generation Skia is likely to look quite different, moving much of the work onto the GPU, and this work is being researched and developed: https://github.com/linebender/vello. Some presentations about this work too: https://youtu.be/mmW_RbTyj8c https://youtu.be/OvfNipIcRiQ

interroboink · 9 months ago
Perhaps since FreeType is the one handling the more untrusted inputs (the font files themselves, downloaded from who-knows-where), it is more at-risk and thus stood to benefit from the conversion more?

But I don't really know anything about how all the parts fit together; just speculating.

londons_explore · 9 months ago
Skia's inputs are relatively less complex, so there is less risk of dangerous corner cases.
bawolff · 9 months ago
Format parsing is generally considered some of the riskiest kind of code to have from a memory-safety standpoint. Skia is probably considered a less risky problem domain.
SquareWheel · 9 months ago
I've recently been learning about how fonts render based on subpixel layouts in monitor panels. Windows assumes that all panels use RGB layout, and their ClearType software will render fonts with that assumption in mind. Unfortunately, this leads to visible text fringing on new display types, like the alternative stripe pattern used on WOLED monitors, or the triangular pattern used on QD-OLED.

Some third-party tools exist to tweak how ClearType works, like MacType[1] or Better ClearType Tuner[2]. Unfortunately, these tools don't work in Chrome/electron, which seems to implement its own font rendering. Reading this, I guess that's through FreeType.

I hope that as new panel technologies become more prevalent, somebody takes the initiative to help define a standard for communicating subpixel layouts from displays to the graphics layer, which text (or graphics) rendering engines can then make use of to improve type hinting. I do see some efforts in that area from Blur Busters[3] (the UFO Test guy), but still not much recognition from vendors.

Note I'm still learning about this topic, so please let me know if I'm mistaken about any points here.

[1] https://github.com/snowie2000/mactype

[2] https://github.com/bp2008/BetterClearTypeTuner

[3] https://github.com/microsoft/PowerToys/issues/25595

wkat4242 · 9 months ago
I'm pretty sure Windows dropped subpixel anti-aliasing a few years ago. When it did exist, there was a wizard to determine and set the subpixel layout.

Personally I don't bother anymore anyway since I have a HiDPI display (about 200 DPI, 4K@24"). I think that's a better solution: simply have enough pixels to look smooth. It's what phones do too, of course.

ghusbands · 9 months ago
To be clear: Windows still does subpixel rendering, and the wizard is still there. The wizard has not actually worked properly for at least a decade at this point, and subpixel rendering is always enabled, unless you use hacks or third-party programs.
cosmic_cheese · 9 months ago
macOS dropped it a few years ago, primarily because there are no Macs with non-HiDPI displays any more (reducing benefit of subpixel AA) and to improve uniformity with iOS apps running on macOS via Catalyst (iOS has never supported subpixel AA, since it doesn’t play nice with frequently adjusted orientations).

Windows I believe still uses RGB subpixel AA, because OLED monitor users still need to tweak ClearType settings to make text not look bad.

hnuser123456 · 9 months ago
It absolutely still does subpixel AA. Take a screenshot of any text and zoom way in, there's red and blue fringing. And the ClearType text tuner is still a functional builtin program in Win11 24H2.
nine_k · 9 months ago
I still have subpixel antialiasing on when using a 28" 4K display. It's the same DPI as a FHD 14" display, typical on laptops. Subpixel AA makes small fonts look significantly more readable.

But it only applies to Linux, where the small fonts can be made to look crisp this way. Windows AA is worse, small fonts are a bit more blurred on the same screen, and macOS is the worst: connecting a 24" FHD screen to an MBP gives really horrible font rendering, unless you make fonts really large. I suppose it's because macOS does not do subpixel AA at all, and assumes high-DPI screens only.

Clamchop · 9 months ago
As far as I'm aware, ClearType is still enabled by default in Windows.

Subpixel text rendering was removed from macOS some time ago, though, presumably because they decided it was not needed on Retina screens. Maybe you're thinking of that?

perching_aix · 9 months ago
It didn't. Some parts of the UI are using grayscale AA, some are on subpixel AA. And sometimes it's just a blur, to keep things fun I guess.

Pretty sure phones do grayscale AA.

tadfisher · 9 months ago
The standard is EDID-DDDB, and subpixel layout is a major part of that specification. However, I believe display manufacturers are dropping the ball here.

https://glenwing.github.io/docs/VESA-EEDID-DDDB-1.pdf

kiicia · 9 months ago
For me, as an old-time user, (ab)using subpixel layouts for text rendering and antialiasing is counterproductive and (especially with current pixel densities, but also in general) introduces many more issues than it ever actually solved.

"Whole-pixel/grayscale antialiasing" should be enough, and a specialized display controller would then handle the rest.

tadfisher · 9 months ago
Agreed, but layouts such as Pentile don't actually have all three subpixel components in a logical pixel, so you'll still get artifacts even with grayscale AA. You can compensate for this by masking those missing components.

https://github.com/snowie2000/mactype/issues/932

TheRealPomax · 9 months ago
Mandatory reading when getting into this topic: http://rastertragedy.com/
cosmic_cheese · 9 months ago
I may be totally off the mark here, but my understanding is that the alternative pixel arrangements found in current WOLED and QD-OLED monitors are suboptimal in various ways (despite the otherwise excellent qualities of these displays) and so panel manufacturers are working towards OLED panels built with traditional RGB subpixel arrangements that don’t forfeit the benefits of current WOLED and QD-OLED tech.

That being the case, it may be that in the near future the alternative arrangements end up being abandoned and become one of the many quirky "stepping stone" technologies that litter display technology history. While it's still a good idea to support them better in software, that might put into context why there hasn't been more effort put into doing so.

RKFADU_UOFCCLEL · 9 months ago
Windows has always allowed you to change the subpixel layout; it's right there in the ClearType settings.
zozbot234 · 9 months ago
Sub-pixel anti-aliasing requires outputting a pixel-perfect image to the screen, which is a challenge when you're also doing rendering on the GPU. You generally can't rely on any non-trivial part of the standard 3D-rendering pipeline (except for simple blitting/compositing) and have to use the GPU's compute stack instead to address those requirements. This adds quite a bit of complexity.

Deleted Comment

K0nserv · 9 months ago
I appreciate the pun in the repository name https://github.com/googlefonts/fontations/
sidcool · 9 months ago
This is a wonderful write-up. Reminiscent of the old Google.
xyst · 9 months ago
G engineering write-ups are usually well written, with plenty of useful information to carry forward.

It’s G’s _business_ folks (i.e., C-level executives) that I have no respect for. Their business model of exploiting users is just awful.

tsuru · 9 months ago
It looks like there is an extern C interface... I wonder if it is everything necessary for someone to use it via FFI.
steveklabnik · 9 months ago
Given that it's being used in a large C++ codebase, I would assume it has everything needed to use it in that API.
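
For what it's worth, such an interface usually takes the shape of an opaque-handle pattern like the sketch below (the names here are made up for illustration; this is not the actual Fontations C API):

    /// Opaque type handed to C/C++ callers as a raw pointer.
    pub struct Face {
        data: Vec<u8>,
    }

    /// Take ownership of the font bytes and return a handle the caller must later free.
    #[no_mangle]
    pub unsafe extern "C" fn face_new(data: *const u8, len: usize) -> *mut Face {
        if data.is_null() {
            return std::ptr::null_mut();
        }
        let bytes = std::slice::from_raw_parts(data, len).to_vec();
        Box::into_raw(Box::new(Face { data: bytes }))
    }

    /// Example accessor: byte length of the loaded font data.
    #[no_mangle]
    pub unsafe extern "C" fn face_data_len(face: *const Face) -> usize {
        if face.is_null() { 0 } else { (*face).data.len() }
    }

    /// Free a handle previously returned by `face_new`.
    #[no_mangle]
    pub unsafe extern "C" fn face_free(face: *mut Face) {
        if !face.is_null() {
            drop(Box::from_raw(face));
        }
    }
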
pornel · 9 months ago
They just need to rewrite the rest of Chrome to use the native Rust<>Rust interface.

(in reality Google is investing a lot of effort into automating the FFI layer to make it safer and less tedious)