Readit News
DNF2 commented on Julia 1.12 brings progress on standalone binaries and more   lwn.net/Articles/1044280/... · Posted by u/leephillips
mccoyb · 4 months ago
I used Julia for 4 years. I'm not a moron: I'm familiar with how it works, I've written several packages in it, including some speculative compiler ones.

You claimed:

> Allowing invocation of the compiler at runtime is definitely not something that is done for performance, but for dynamism, to allow some code to run that could not otherwise be run.

I asked:

> why not just compile a static but generic version of the method with branches based on the tags of values? ("Can't figure out the types, wait until runtime and then just branch to the specialized method instances which I do know the types for")

Which can be done completely ahead of time, before runtime, and doesn't rely on re-invoking the compiler, thereby making this whole "ahead of time compilation only works for a subset of Julia code" problem disappear.

Do you understand now?

My original comment:

> The problem (which the author didn't focus on, but which I believe to be the case) that Julia willingly foisted on itself in the pursuit of maximum performance is _invoking the compiler at runtime_ to specialize methods when type information is finally known.

is NOT a claim about the overall architecture of Julia -- it's a point about this specific problem (Julia's static ahead-of-time compilation) which is currently highly limited.

DNF2 · 4 months ago
First of all, I think this sort of aggressive tone is unwarranted.

Secondly, I think it's on you to clarify that you were talking specifically and exclusively about static compilation to standalone binaries. Re-reading your first post, I strongly get the impression that you were talking about the compilation strategy in general.

I would also remind you that Julia always does just-ahead-of-time compilation.

Furthermore, my limited understanding of the static compiler (--trim feature), based on hearsay, is that it does pretty much what you are suggesting, supporting dynamic dispatch as long as one can enumerate all the types in advance (though requiring special implementation tricks). Open-ended type sets are not at all supported.
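The "enumerate all the types in advance" approach can be sketched as manual union splitting: a closed set of concrete types, dispatched over with explicit branches. This is only an illustrative sketch of the pattern, not the actual `--trim` machinery, and all names here are made up:

```julia
# Manual "union splitting": every reachable concrete type is known ahead
# of time, so dispatch becomes a chain of branches on the value's tag.
abstract type Shape end
struct Circle <: Shape; r::Float64; end
struct Square <: Shape; s::Float64; end

area(c::Circle) = pi * c.r^2
area(s::Square) = s.s^2

function total_area(shapes::Vector{Shape})
    acc = 0.0
    for sh in shapes
        if sh isa Circle
            acc += area(sh)   # type is narrowed here; the call resolves statically
        elseif sh isa Square
            acc += area(sh)
        else
            error("unsupported shape")  # the type set is closed
        end
    end
    return acc
end
```

With an open-ended type set, the `else` branch would have to fall back to true runtime dispatch, which is exactly what static compilation cannot enumerate.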

DNF2 commented on Julia 1.12 brings progress on standalone binaries and more   lwn.net/Articles/1044280/... · Posted by u/leephillips
mccoyb · 4 months ago
I don't believe it, otherwise why not just compile a static but generic version of the method with branches based on the tags of values? ("Can't figure out the types, wait until runtime and then just branch to the specialized method instances which I do know the types for")

Perhaps there is something about subtyping which makes this answer ... not correct -- and if someone knows the real answer, I'd love to understand it.

I believe the answer is performance -- if I can JIT at runtime, that's great -- I get dynamism and performance ... at the cost of a small blip at runtime.

And yes, "performant Julia code" -- that's the static subset of the language that I roughly equated to be the subset which is trying to be pried free from the dynamic "invoking the compiler again" part.

DNF2 · 4 months ago
I'm not exactly sure what you don't believe; your comment is hard to follow, or relies on premises I haven't detected. What you are describing in your first paragraph is somewhat reminiscent of dynamic dispatch, which Julia does use, but which generally hampers performance. It is something to avoid in most cases.

Anyway, performance in Julia relies heavily on statically inferring types and aggressive type specialization at compile time. Triggering the compiler later, during actual runtime, can happen, but is certainly not beneficial for performance, and it's quite unusual to claim that it's central to the performance model of Julia.

If you are asking why Julia allows recompiling code and has dynamic types, it's not for performance, but to allow an interactive workflow and user friendly dynamism. It is the central tradeoff in Julia to enable this while retaining performance. If performance was the only concern, the language would be very different.

DNF2 commented on Julia 1.12 brings progress on standalone binaries and more   lwn.net/Articles/1044280/... · Posted by u/leephillips
mccoyb · 4 months ago
> Julia's "secret sauce", the dynamic type system and method dispatch that endows it with its powers of composability, will never be a feature of languages such as Fortran. The tradeoff is a more complex compilation process and the necessity to have part of the Julia runtime available during execution.

> The main limitation is the prohibition of dynamic dispatch. This is a key feature of Julia, where methods can be selected at run time based on the types of function arguments encountered. The consequence is that most public packages don't work, as they may contain at least some instances of dynamic dispatch in contexts that are not performance-critical. Some of these packages can and will be rewritten so that they can be used in standalone binaries, but, in others, the dynamic dispatch is a necessary or desirable feature, so they will never be suitable for static compilation.

The problem (which the author didn't focus on, but which I believe to be the case) that Julia willingly foisted on itself in the pursuit of maximum performance is _invoking the compiler at runtime_ to specialize methods when type information is finally known.

Method dispatch can be done statically. For instance, what if I don't know what method to call via abstract interpretation? Well, use a bunch of branches. Okay, you say, but that's garbage for performance ... well, raise a compiler error or warning like JET.jl so someone knows that it is garbage for performance.

Now, my read on this work is the noble goal of prying a different, more static version of Julia free from this compiler design decision.

But I think at the heart of this is an infrastructural problem ... does one really need to invoke the compiler at runtime? What space of programs is that serving that cannot be served statically, or with a bit of upfront user refactoring?

Open to be shown wrong, but I believe this is the key compiler issue.

DNF2 · 4 months ago
This is not how I understand the performance model. Allowing invocation of the compiler at runtime is definitely not something that is done for performance, but for dynamism, to allow some code to run that could not otherwise be run.

In performant Julia code, the compiler is not invoked, because types are statically inferred. In some cases you can have dynamic dispatch, but that doesn't necessarily mean that the compiler needs to run. Instead you can get runtime lookup of previously compiled methods. Dynamic dispatch does not necessitate running the compiler.
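The distinction can be made concrete with a small sketch (function names are made up for illustration): dynamic dispatch picks a method at runtime, but it reuses specializations that are already compiled.

```julia
half(x::Int) = x ÷ 2
half(x::Float64) = x / 2

# The container's element type is Any, so the concrete method for each
# element is looked up at runtime (dynamic dispatch). After the first
# call per concrete type, Julia reuses the compiled specialization;
# dispatching again does not rerun the compiler.
vals = Any[4, 3.0]
results = [half(v) for v in vals]   # [2, 1.5]
```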

DNF2 commented on Lenses in Julia   juliaobjects.github.io/Ac... · Posted by u/samuel2
electroly · 4 months ago
My language doesn't have any other kind; it's not immutable by default, it's immutable only. There are no mutable types or reference semantics, so there's no other kind of type that I need to differentiate. That's my question--why haven't other languages taken this approach? Many newer languages today are full-throated defenses of immutable data structures--why do they still make the mutable structures the easiest, syntactically, to change? Why not the other way around? Julia is fastest with immutable structures--why provide a built-in syntax for complex assignment to mutable types, but then relegate lenses to a library that only FP aficionados will use? We don't want add() and subtract() when we have + and -; why should we live with set() when we have =?

I must be missing it because it worked out pretty nicely in my toy language. Complex assignments are written in exactly the way that people expect them to be. That's why I think it must be about taste or practical consideration--obviously it's possible to write a language like this. But experienced designers don't, presumably because it's a bad idea, and I don't understand what the badness is. Since my language is a toy, I likely haven't hit the practical considerations.

Lenses, to me, feel at home in Haskell where the entire language is a game to see how much theory you can implement in the "userspace" of a tight, maximally-orthogonal FP language. But this is Julia, a monstrously large, imperative, Algol-family language with every possible language feature built-in, intended to be a practical language for analysis by people who aren't programming language experts. Julia's compiler already has knowledge of immutable types which it uses for optimization. Seems like they could do better than lenses if they weren't forced to implement it as a library in the language itself.
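For concreteness, the lens-style update under discussion looks roughly like this with Accessors.jl (the library the linked page documents); a sketch assuming its `@set` macro, with made-up struct names:

```julia
using Accessors  # provides the @set macro for immutable updates

struct Point; x::Float64; y::Float64; end
struct Segment; a::Point; b::Point; end

s = Segment(Point(0.0, 0.0), Point(1.0, 1.0))

# "Complex assignment" on an immutable value: @set returns a new
# Segment with the nested field replaced; `s` itself is unchanged.
s2 = @set s.b.y = 5.0
```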

DNF2 · 4 months ago
> Julia is fastest with immutable structures--why provide a built-in syntax for complex assignment to mutable types, but then relegate lenses to a library that only FP aficionados will use?

This is not really accurate. Performance in Julia is heavily organized around mutability, in particular for arrays. The main reason Julia does not fully embrace immutability for everything is, simply, performance.
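A minimal illustration of the point, with a hypothetical function name: in-place mutation avoids allocating a fresh array per operation, which is where much of Julia's array performance story lives.

```julia
# Scale a vector in place: no new array is allocated per call, unlike
# the immutable-style `a .* c`, which builds a fresh array each time.
function scale!(a::Vector{Float64}, c::Float64)
    @inbounds for i in eachindex(a)
        a[i] *= c
    end
    return a
end

a = ones(3)
scale!(a, 2.0)   # a is now [2.0, 2.0, 2.0]
```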

DNF2 commented on Typst 0.14   typst.app/blog/2025/typst... · Posted by u/optionalsquid
b33j0r · 5 months ago
Yeah but can it run doom
DNF2 · 4 months ago
DNF2 commented on Typst 0.14   typst.app/blog/2025/typst... · Posted by u/optionalsquid
fiso64 · 5 months ago
I have a laptop with a good-ish CPU that is only a few years old, and on page 3 tinymist is already starting to struggle. There is a noticeable input delay between me pressing a key on the keyboard, and the key getting typed & the preview updating. I think it's more of a tinymist issue though, as it has no debouncing and apparently also runs the preview updates on the same thread as vscode's input handling.
DNF2 · 4 months ago
Interesting. I have not experienced that, except when trying out the pre-release version of tinymist while doing some messy multi-view cropping into a big PDF (testing out the new pdf-image features). I chalked it up to it being new and beta.

Admittedly, I have still not created large documents in Typst.

DNF2 commented on Typst 0.14   typst.app/blog/2025/typst... · Posted by u/optionalsquid
idoubtit · 5 months ago
When I compile LaTeX files, I use tectonic¹, which automatically downloads dependencies, compiles in one pass, and hides temporary files. But the regular LaTeX users I know all use a web interface — IIRC, an instance of Overleaf² installed by their university, with real-time rendering.

So when I read your list, I had these tools in mind, and the only items that made sense to me were:

2. (minor compared to Overleaf) typst compiles faster.

3. Diagnostics are better.

4. (minor and arguable) Lists have 2 simpler syntaxes.

The other points were irrelevant (dependencies), wrong (macros) or really dubious (margins, Git, bibliography). I think Typst has many more interesting features over LaTeX.

¹: https://tectonic-typesetting.github.io/

²: https://docs.overleaf.com/on-premises/installation/using-the...

DNF2 · 5 months ago
> 2. (minor compared to Overleaf) typst compiles faster.

I would argue that this isn't minor. At least in my opinion, it makes a big difference.

Overleaf, already 3 pages into a document with a couple of TikZ figures, was getting slow, as in a multiple-second wait for each save.

Typst, on the other hand (Tinymist in VS Code), is truly real-time: text updates within some tens of milliseconds, and figures render in well under a second. It really _feels_ instant, and to me that changes the experience a lot.

DNF2 commented on Typst 0.14   typst.app/blog/2025/typst... · Posted by u/optionalsquid
imiric · 5 months ago
I'm super happy that Typst continues to chip away at LaTeX's dominance. Kudos to the team and contributors! <3

This looks like a great release. Lossless embedding of PDFs seems like it would be useful in many scenarios. I'm surprised with how much better the character-level justified text actually looks. And I wasn't even aware that it supported exporting HTML. Typst—both the tool and the language—is more robust and enjoyable to use IME than something like Markdown, Pandoc, Org mode, and other formats, so I'll definitely consider using it for my next web project.

My only concern is backwards compatibility. How committed is the team to supporting older syntax? What will happen in a year or two from now when I have to generate a PDF from a .typ file written with version 0.13? They mention deprecations in v0.14, so I assume that I should expect breaking issues. I suppose only time will tell how difficult upgrading will be in the future.

This was a big problem for me when using LaTeX, which is why I maintained a TeX Live Docker image with the exact version and dependencies I needed. Upgrading it was always a nerve-racking ordeal. Since Typst is a single binary, this should at least be easier to manage.

DNF2 · 5 months ago
As long as Typst is on version 0.x, you should probably expect breaking changes. There is talk about changing even part of the parsing rules.

This is the risk of being an early adopter.

Once v1.0 is out, I hope it will stabilize for the long term.

DNF2 commented on Correctness and composability bugs in the Julia ecosystem (2022)   yuri.is/not-julia/... · Posted by u/cs702
cs702 · 5 months ago
The OP shows examples of people being unable to solve a problem in Julia that they solve quickly after switching to PyTorch, Jax, or TensorFlow, so the OP is implicitly recommending those alternatives.
DNF2 · 5 months ago
But those are not languages, but frameworks, and are not general enough to solve many problems, especially outside of machine learning.
DNF2 commented on New horizons for Julia   lwn.net/Articles/1006117/... · Posted by u/leephillips
kjrfghslkdjfl · a year ago
> But my main complaint about Julia is its general approach to memory management.

I'm not a full-blown hater, but I have problems with that as well. Specifically, you have no control over it whatsoever; you're just promised that "if you do things right, it'll be amazing". And it is! The problem is that any tiny, minuscule mistake causes catastrophic failure of performance due to allocations. Since good performance depends on type stability, and type stability propagates, any mistake anywhere will propagate everywhere. Think: if a variable becomes type unstable due to a programmer mistake, any function that consumes it generally might become type unstable as well, and any function that consumes the output of that function as well, etc. The upshot is that this forces you to think more carefully about your types and data structures. Programming in Julia extensively has made me a better programmer. I'm not a C++ expert, but I believe that in C++ these kinds of mistakes always end up being localized.

DNF2 · a year ago
That is not really correct. Type instabilities tend to disappear at function boundaries, which is one of the reasons why using functions is so heavily promoted in Julia: it helps keep type instabilities 'localized'.
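A sketch of the function-barrier idiom this refers to (hypothetical names):

```julia
# The inner function is the "barrier": it gets compiled separately for
# each concrete argument type, so code inside it is type-stable even
# though the caller's variable is not.
kernel(x) = x + one(x)

function outer(flag::Bool)
    x = flag ? 1 : 1.0   # type-unstable: x is Union{Int, Float64}
    return kernel(x)     # instability stops here, at the call boundary
end
```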

u/DNF2

Karma: 325 · Cake day: August 9, 2018