I work in "one of those industries" where failure can easily cost millions in the space of minutes, and we use Julia almost exclusively. Unfortunately I can't give you details because of reasons, other than tell you you're very poorly informed.
> you're locked in to Julia
You're not "locked in" to Julia any more you're locked in to Python. Arguably less, as Julia can call C natively, Python can't.
My point isn't that you can't interop, and it seems you're intentionally (?) equivocating on this point. Surely you didn't misinterpret what I wrote as "you can't interop"? My point is that you get more benefits if you don't. Like I said, if you want a slow language that can call fast routines, why not just use Python?
Which country are you in?
Compiler latency ("time to first plot") used to be miserable, but after a few releases of incremental improvements it feels mostly solved to me.
Just this Friday at JuliaCon Local Eindhoven, one of the keynotes was about similar ongoing work on stand-alone binaries (including shared libraries you can call like C/Fortran).
The discussion seems to be too deep in Julia internals for me to follow. Is this about startup time or defining an entry point (or both)? I haven't had problems with Julia entrypoints (yet, at least).
With the nightly `./julia-f7618602d4/bin/julia -e "using DynamicalSystems"` still takes over 5 seconds. Can I somehow define a main to make this faster or precompile more efficiently?
> Since PackageCompiler.jl already has a C ABI, and one stated goal is for binaries to be easily callable from other languages and vice versa, I would bet that it will have shared libraries.
Sounds promising. Shared libraries are not a must-have for me, but they could let Julia save us from C++ in more cases.
DynamicalSystems looks like a heavy project; I don't think you can do much more on your own. Recent features in 1.10 let a package expose just the portion you need (via weak dependencies), and there is PrecompileTools.jl, but these are on the package's side.
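For what it's worth, here is a minimal sketch of what PrecompileTools.jl usage looks like inside a package; the module and function names (`MyAnalysis`, `run_model`) are made up for illustration:

```julia
module MyAnalysis  # hypothetical package module

using PrecompileTools

run_model(x) = sum(abs2, x)

# Code in @setup_workload runs during precompilation but is not itself cached;
# everything called inside @compile_workload gets compiled and stored in the
# package's precompile cache (native code caching since Julia 1.9).
@setup_workload begin
    data = rand(100)
    @compile_workload begin
        run_model(data)
    end
end

end # module
```

Loading the package then skips most of the JIT cost for the call signatures exercised in the workload, which is exactly the kind of thing a heavy package like DynamicalSystems would have to adopt upstream.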
You can also look into https://github.com/dmolina/DaemonMode.jl to keep a Julia process running in the background and do your work from the shell without the startup cost, until the standalone binaries are there.
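Roughly, per the DaemonMode.jl README, the workflow looks like this (assuming `stuff.jl` is your script):

```shell
# Start a persistent server once; it keeps loaded packages warm in memory:
julia --startup-file=no -e 'using DaemonMode; serve()' &

# Run scripts against the warm server instead of paying startup each time:
julia --startup-file=no -e 'using DaemonMode; runargs()' stuff.jl
```

The second command returns almost immediately once the server has loaded your dependencies, at the cost of all runs sharing one long-lived process.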
Do standalone binaries here include shared libraries (with a C ABI)? That would be a dream.
Unfortunately, the core devs are not too chatty about standalone binaries: because of how Julia's internals are set up, there are going to be a lot of unforeseen challenges, so rather than promising how things will be, they'd rather wait and see how things turn out. Since PackageCompiler.jl already has a C ABI, and one stated goal is for binaries to be easily callable from other languages and vice versa, I would bet that it will have shared libraries.
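For context, PackageCompiler.jl's existing shared-library support looks roughly like this; the package name and paths here are hypothetical, so check the PackageCompiler docs for real usage:

```julia
using PackageCompiler

# Build a C-callable shared library from a local package directory.
# Functions in the package annotated with Base.@ccallable become
# exported C symbols in the resulting library.
create_library("MyLib", "MyLibCompiled";
               lib_name = "mylib",
               precompile_execution_file = "precompile.jl")
```

The output bundles the Julia runtime, which is why the resulting libraries are large; the stand-alone binary work discussed above is partly about shrinking that.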
Usage where you don't keep the Julia process running, e.g. `$ julia stuff.jl` from a shell.
I just timed vscode with the lsp. From the point I open a 40 line file of the lorenz attractor example, it takes 45 seconds until navigation within that same file works, and the lsp hogs 1 GB of memory. That's 5x the memory of clangd and 20x worse performance; hardly what I would consider a snappy experience.
I have no doubt that Julia can be shoe-horned into realtime applications. But when I read threads like this [1], it's pretty clear that doing so amounts to a hack (e.g., people recommending that you somehow call all your functions to get them JITed before the main loop actually starts). Even the mitigations you propose, i.e., pre-allocating everything, don't exploit any guarantees made by the language, so you're basically in cross-your-fingers-and-pray territory. I would never feel comfortable advocating for this in a commercial setting.
[1] https://discourse.julialang.org/t/julia-for-real-time-worrie...
Having a REPL open is not the same thing as a notebook; if you feel like that, cool I guess.
That thread is old; Julia can cache compiled code from 1.9 onward. However, it cannot distribute the cached code (yet).
Writing the fastest possible real-time application in C/C++ follows the same principles as in Julia. It's not as shoe-horned as you might believe.
When developing Julia, the developers made some design decisions that affect the workflow of using the language. If it doesn't fit your needs, that's cool, don't use it. If you are frustrated and would like to try the language, come to Discourse; people are friendly.
Autocompletion in Julia is also just terrible, and the tooling really is lacking compared to better-funded languages. No harm in admitting that. (When Julia had no working debugger, some people were seriously arguing that you don't need one: just think harder about your code! Let's please bury that attitude...)
- Slow startup times (e.g., time-to-first-plot) kill its appeal for scripting. For a long time, one was told that the "correct" way to use Julia was in a notebook. Outside of that, nobody wanted to hear your complaints.
- Garbage collection kills its appeal for realtime applications.
- The potential for new code paths to trigger JIT compilation presents similar issues for domains that care about latency. Yes, I know there is supposedly static compilation for Julia, but as you can read in other comments here, that's still a half-baked, brittle feature.
The last two points mean I still have the same two-language problem I had with C++ and Python. I'm still going to write my robotics algorithms in C++, so Julia just becomes a glue language; but there's nothing that makes it more compelling than Python for that use. This is especially true when you consider the sub-par tooling. For example, the LSP is written in Julia itself, so it suffers the same usability problems as TTFP: you won't start getting autocompletions for several minutes after opening a file. It is also insanely memory-hungry, to the extent that it's basically unusable on a laptop with 8 GB of RAM (whereas I have no problem with clangd). Similarly, auto-formatting a 40-line file takes 5 seconds. The debugging and stacktrace story is similarly frustrating.
When you take all of this together, Julia just doesn't seem worth it outside of very specific uses, e.g., long-running large-scale simulations where startup time is amortized away and aggregate throughput is more important than P99 latency.
You can do real-time applications just fine in Julia: preallocate everything you need and avoid allocations in the hot loop. I am doing real-time stuff in Julia. There are some annoyances with the GC, but nothing that stops you from doing real-time. There are robotics packages in Julia, and they are old; there is a talk comparing them with C++ (spoiler: developing in Julia was both faster and easier, and the results were faster).
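A minimal sketch of the preallocation pattern (function name, buffer sizes, and the gain factor are all illustrative):

```julia
using LinearAlgebra

# All buffers are allocated once, before the loop starts; the hot loop only
# mutates them in place, so steady-state iterations produce no garbage for
# the GC to collect.
function control_loop!(y, A, x, iters)
    for _ in 1:iters
        mul!(y, A, x)      # in-place matrix-vector product, no allocation
        @. x = 0.99 * y    # broadcast assignment into the existing buffer
    end
    return y
end

A = rand(8, 8)
x = rand(8)
y = similar(x)

control_loop!(y, A, x, 1)      # first call triggers JIT compilation (the warm-up)
control_loop!(y, A, x, 1_000)  # subsequent calls run the loop without allocating
```

The warm-up call is the same trick people dismiss as a hack upthread, but combined with allocation-free inner loops it is also exactly what you'd do to avoid `malloc` in a C real-time loop.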
I have been using two Julia sessions on an 8 GB laptop constantly while developing, no problem. The LSP loads fine and fast in vscode, no problem there either.
The debugger in vscode is slow and most don't use it; there is a package for that. The big binaries are a problem, and the focus is shifting there to solve that. Stacktraces will become much better in 1.10 but still need better hints (there are plans for 1.11). In general, we need better onboarding documentation for newcomers to make their experience as smooth as possible.