torrance · 3 years ago
Whilst the Julia version currently beats Mojo, I fully expect both to approach basically the same performance with enough tinkering, and for that performance to be on par with C or Fortran.

A more interesting question is which version is more elegant, ‘obvious’ and maintainable. (Deeply familiar with both, but money is on Julia).

sundarurfriend · 3 years ago
> A more interesting question is which version is more elegant, ‘obvious’ and maintainable. (Deeply familiar with both, but money is on Julia).

Yes, more than raw speed, what impresses me is that the version of code in [1] is already a few times faster than the Mojo code - because that's pretty basic Julia code that anyone with a little Julia experience could write, and maintain easily.

The later versions with LoopVectorization require more specialized knowledge, and get into the "how can we tune this particular benchmark" territory for me (I don't know how to evaluate the Mojo code in this regard as yet, or how 'obvious' it would be to an everyday Mojo developer). So [1] is a more impressive demonstration of how an average developer can write very performant code in Julia.

[1] https://discourse.julialang.org/t/julia-mojo-mandelbrot-benc...

sundarurfriend · 3 years ago
User lmiq articulates the sentiment well in a later comment in the OP thread:

> what I find more interesting here is that the reasoning and code type applies to any composite type of a similar structure, thus we can use that to optimize other code completely unrelated to calculations with complex numbers.

It matters less whether super-optimized code can be written in Julia for this particular case (though there's some value in that too), the more important and telling part to me is that the language has features and tools that can easily be adopted to a general class of problems like this.

nologic01 · 3 years ago
An even more interesting question is: which version will actually entice millions of independent and variably motivated actors from all walks of life to commit to and invest in a particular ecosystem. Technical and usability aspects play only a minor role in technology adoption. In particular, the best technology doesn't always win.

My humble two pennies: Julia is missing the influencer factor, i.e. endorsement by widely known entities that would attract the attention of both corporate eyes and the hordes of developers constantly looking for the next big thing.

Your money might be on Julia but $100mln was just placed on the Mojo/Modular bet...

patrick451 · 3 years ago
I've tried julia a handful of times. IMO, the thing slowing adoption is that the usecases where julia feels like the most powerful, optimal choice are too limited. For example

- Slow startup times (e.g., time-to-first-plot) kill its appeal for scripting. For a long time, one was told that the "correct" way to use julia was in a notebook. Outside of that, nobody wanted to hear your complaints.

- Garbage collection kills its appeal for realtime applications.

- The potential for new code paths to trigger JIT compilation presents similar issues for domains that care about latency. Yes, I know there is supposedly static compilation for julia, but as you can read in other comments here, that's still a half-baked, brittle feature.

The last two points mean I still have the same two-language problem I had with c++ and python. I'm still going to write my robotics algorithms in c++, so julia just becomes a glue language; but there's nothing that makes it more compelling than python for that use. This is especially true when you consider the sub-par tooling. For example, the lsp is written in julia itself, so it suffers the same usability problems as TTFP: you won't start getting autocompletions for several minutes after opening a file. It is also insanely memory hungry, to the extent that it's basically unusable on a laptop with 8gb of ram (on the other hand, I have no problem with clangd). Similarly, auto-formatting a 40 line file takes 5 seconds. The debugging and stacktrace story is similarly frustrating.

When you take all of this together, julia just doesn't seem worth it outside of very specific uses, e.g., long running large scale simulations where startup time is amortized away and aggregate throughput is more important than P99 latency.

bwanab · 3 years ago
You may very well be right, but for my money Alan Edelman (MIT) is influencer enough.
sa-code · 3 years ago
IMO the reason Julia gets to be this fast is because of LLVM, and the guy who created LLVM is also the creator of Mojo, so there is something to be said about that.
sundarurfriend · 3 years ago
My understanding is that Julia gets to be this fast because the language design was optimized for performance from the beginning, by clever use of its type hierarchy and multiple dispatch for compilation into very specific and efficient machine code (plus a thousand other little optimizations like constant propagation, auto-vectorization, etc.)

LLVM helps with its own optimizations, but more and more of those optimizations are being moved to the Julia side nowadays (since the language has more semantic understanding of the code and can do better optimizations). I believe the main thing LLVM helps with is portability across many platforms, without having to write individual backends for each one.
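
The specialization idea described above can be illustrated with a toy sketch in pure Python (purely an analogy added here for clarity: no real code generation happens, this only mimics the per-type-signature caching shape of Julia's compilation model, and all names are made up):

```python
# Toy sketch: Julia compiles and caches a fresh specialization of a
# function for each concrete argument-type signature it sees. This
# decorator mimics only the dispatch/caching structure -- in Julia the
# cache-miss branch would run type inference and LLVM codegen.
def specialize_by_type(generic):
    cache = {}

    def dispatcher(*args):
        sig = tuple(type(a) for a in args)
        if sig not in cache:
            # Stand-in for "compile a specialization for this signature".
            cache[sig] = generic
        return cache[sig](*args)

    dispatcher.cache = cache
    return dispatcher

@specialize_by_type
def add(x, y):
    return x + y

add(1, 2)          # dispatched via the (int, int) "specialization"
add(1.0, 2.0)      # dispatched via the (float, float) "specialization"
print(len(add.cache))  # two distinct type signatures seen so far
```

In real Julia, each cached entry is specialized machine code for that signature, which is why the same generic source can run as fast as hand-monomorphized code.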

__rito__ · 3 years ago
One language is literally some months old, and another is 11 years old. Not a fair comparison at this point, and for some years to come.

The gap will definitely narrow.

Max-Limelihood · 3 years ago
Not necessarily. Julia will always be 11 years older than Mojo, no matter how old both of them get, and that advantage won't shrink. Not to mention, Mojo is a superset of a 40-year-old language with billions of dollars of development poured into it, plus an extra hundred million poured directly into Mojo itself. If we go by resources spent on each, Mojo has gotten about 5x more investment than Julia.
eigenspace · 3 years ago
Yeah, pretty much any strongly performance-oriented modern language should be able to be massaged into emitting whatever instructions give close to optimal performance here.

It's always fun, though, when one language does better naïvely in a benchmark, to delve in and see how to match or surpass it, and see if it was worth the trouble.

Microbenchmark performance for languages in this class definitely shouldn't be seen as a strongly deciding factor though.

jakobnissen · 3 years ago
I disagree, actually. I have found microbenchmarks to be very informative to understand why a language is fast in some cases and slow in others.

It's not only the actual benchmark numbers, though. It's understanding the code that reaches those numbers: Can Julia do explicit SIMD? How awkward is that in one language or the other? Are there idiosyncratic bottlenecks? Bad design decisions that need to be worked around in one language but not the other? And so on.

ubj · 3 years ago
In my opinion, the issue that will make more of a difference in the long run is Mojo's first-class support for AoT compiled binaries (as well as JIT compilation).

Julia's poor AoT support (with small binaries) is a major Achilles heel. I really wish that the Julia developers had taken that more seriously earlier on.

arbitrandomuser · 3 years ago
If anyone is interested in compiling small binaries with Julia, do check out StaticCompiler.jl and the supporting StaticTools.jl, which manage to produce small binaries without the Julia runtime. Of course it's a WIP and not fully mature; I'm just putting it out there for people to know. There are some really cool demonstrations that have come out of this...

Wasm fluid simulation in Julia: https://alexander-barth.github.io/FluidSimDemo-WebAssembly/

Differential equations demo in the browser: https://tshort.github.io/Lorenz-WebAssembly-Model.jl/

Someone even set up Julia to run on AVR MCUs for Arduino (directly using GPUCompiler, which StaticTools uses to compile to binaries):

https://seelengrab.github.io/articles/Running%20Julia%20bare...

ubj · 3 years ago
Yep, I was aware of StaticCompiler.jl. I wish it was more mature.

Static compilation is indeed possible with Julia. But it's very limited in its capabilities and certainly not as effortless as a simple `mojo build myfile.mojo`.

sundarurfriend · 3 years ago
FWIW, the core Julia developers seem to be taking this more and more seriously, and AoT compilation to small binaries seems more of a "when" question than an "if" at this point. Open source development - without multi-million dollar support from outside - is unpredictable, but I wouldn't be surprised if, a year from now, writing a restricted subset of Julia allowed AoT compilation to reasonable binaries (and not something as restricted as StaticCompiler.jl requires, just avoiding some of the extreme dynamic features).
baggepinnen · 3 years ago
You mean investment like this? https://info.juliahub.com/juliahub-receives-13-million-strat...

You can imagine what a company like Boeing might be interested in when it comes to a programming language.

Certhas · 3 years ago
In the long run I think the determining factors will be:

Connection to the Python ecosystem. Python remains the number 1 teaching language by a large margin.

AI funding. If they can get the buy-in from the AI community that Julia never got, they have the resources to engineer around any challenges faced.

Solid foundation in modern language design, and with that, a focus on correct code produced by larger teams.

hpcjoe · 3 years ago
This is my current pain with Julia. It makes deploying code require the entire environment, or a PackageCompiler-built sysimage. I've played with StaticCompiler and other techniques. They are sadly quite brittle for my previous use cases. Lack of ability to use threads in a StaticCompiler-built binary was a deal killer for me.
zarkenfrood · 3 years ago
I think in the long run the real difference will be whether Mojo gets accepted into industry usage, given that it initially looks closer to Python. Julia has struggled to gain wider industry adoption, and Mojo is currently selling itself as a minimal uplift from existing Python, which will help the sell in industry.
turndown · 3 years ago
IMO this is just not a great example on either side. As others have pointed out, the Julia implementation was refined to be 8x faster. The Mojo code has to run the CPython interpreter to run numpy.
suavesito · 3 years ago
The example Mojo code does not run the computations with numpy. It uses Mojo's extensions to do it, testing the capabilities of these extensions, which are the ones that promise the speedup. I must admit, though, that not as much work has been done to optimise it as the optimised Julia version.
DNF2 · 3 years ago
Why are you saying that? The Mojo code seems to have the same optimizations as the Julia code.
Max-Limelihood · 3 years ago
> The Mojo code has to run the CPython interpreter to run numpy.

Yes, the need to run the CPython interpreter is what makes Mojo slow (and it will remain that way, unless they abandon their "superset of Python" promise).

wiz21c · 3 years ago
A bit OT, but what is Julia's adoption rate nowadays? I know there are people who think it's the best, and others who think it's not going to cut it, but well... in your experience? (My experience: a little too slow to load, and type hierarchies sometimes lead to unbearable error messages, but it looks like a serious attempt to replace whatever language in the math/physics/stats/... space.)
eigenspace · 3 years ago
Hard to know really since the language is open source and tries not to be too onerous with telemetry (though there is some limited opt-out telemetry in the package manager).

It's growing, but certainly not growing exponentially or anything like that. Here's some statistics from January this year: https://info.juliahub.com/julia-annual-growth-statistics-jan...

Regarding your negative experiences, the bad news is that we haven't solved those issues, but the good news is that we're making real progress on them. Version 1.9, released in May of this year, is the first version to cache native code from packages, which makes loading of julia code MUCH faster through more AOT compilation, and there are even more improvements coming in v1.10 later this year. https://julialang.org/blog/2023/04/julia-1.9-highlights/#cac...

Error messages are also receiving a fair amount of attention, but it's a hard problem and there's less agreement on what the best way forward is. However, there's been some good work going into improving the readability and clarity of error messages that I think will help alleviate these struggles.

wiz21c · 3 years ago
I know you're doing a lot of great work, I'm following Julia rather closely. It's just that as of now, it's not easy enough to grasp to make quick tests at work (I'd have some use there, but I have to be on schedule with the projects).

I've used it a bit for various pet projects (mainly some graph search and JuMP stuff) and it was convincing. But now I can see Fortran perform in real production code (where it shines, at the cost of being so antiquated that it's not funny anymore), and my expectations for Julia are now higher.

I'll give it another try 'cos you spent some time answering my question :-) (and because the charts in the 2nd provided link are just really convincing)

cswhnjidd · 3 years ago
I've been using it at work for almost 2 years now. I haven't used python personally since.
thetwentyone · 3 years ago
Same, though about 3 years here. My company (financial services) officially supports Python and Julia internally.
anoy8888 · 3 years ago
It is still fascinating that lisp languages lost to python for AI and data processing, and now pretty much everything else. In a perfect world, we would be using lisp or lisp-like languages for everything.
dagw · 3 years ago
> It is still fascinating that lisp languages lost to python for AI and data processing

To a first approximation, the only people that love lisps are people with a solid computer science background, and most people working with AI and data processing day to day do not have a computer science background. They're scientists, engineers and mathematicians who see programming and programming languages as a tool needed to do their 'real' job and not as an end in itself. Python is the perfect language for people who want to learn as little programming as possible so that they can get on with what actually interests them.

QuadmasterXLII · 3 years ago
I think the secret is that python is so slow that you have to vectorize and call a library written in C to do any serious math. In 2008 this was a serious downside, but it meant that a whole community got used to slicing, multi-indexing, specialized functions like cumsum, and shared idioms. As a result, when the GPGPU revolution hit, you could write vectorized gpu code in any language, but the shared idioms meant that python programmers had the unique superpower of being able to read each other's vectorized gpu code.
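
That shared vectorized idiom can be sketched with a small numpy Mandelbrot escape-time kernel (a minimal sketch; the function name, grid bounds, and iteration cap are illustrative, not taken from any of the benchmarked implementations):

```python
import numpy as np

def mandelbrot_counts(width, height, max_iter=50):
    # Build the complex grid by broadcasting -- no Python-level pixel loop.
    x = np.linspace(-2.0, 0.5, width)
    y = np.linspace(-1.25, 1.25, height)
    c = x[np.newaxis, :] + 1j * y[:, np.newaxis]

    z = np.zeros_like(c)                       # complex iterates
    counts = np.zeros(c.shape, dtype=np.int64) # iterations survived
    active = np.ones(c.shape, dtype=bool)      # points not yet escaped

    for _ in range(max_iter):
        # Update only the still-active points with a boolean mask (the
        # shared slicing/masking idiom), then retire any that escaped.
        z[active] = z[active] ** 2 + c[active]
        active &= np.abs(z) <= 2.0
        counts[active] += 1
    return counts
```

The same whole-array-plus-mask pattern ports almost verbatim to GPU array libraries (e.g. CuPy or JAX), which is arguably why the shared idiom made vectorized GPU code mutually readable across the community.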
xiaodai · 3 years ago
The other data science language, R, is also slow.
eigenspace · 3 years ago
What do you mean by lisp-like?

If you mean “primarily uses S-expressions” then I guess I don't really see why that's so important to you.

If you mean a language that is semantically similar to lisps and learned a lot of the important lessons that Lisp taught the programming world, I think Julia is one of the Lispiest languages in this space right now.

The syntax may not be S-expression based on the surface, but our Exprs are actually essentially just S-expressions, so writing syntactic macros is very easy. The language is about as dynamic as is possible without major performance concessions, and is very heavily influenced by a lot of design ideas from CLOS, with some features missing but also some cool features CLOS doesn't have.

Oreb · 3 years ago
I certainly consider Julia to be a Lisp, and I’m pretty sure that’s what the person you responded to meant, too. His point remains true: Julia appears to have little chance of overtaking Python, except in some tiny niche areas. And even in these niche areas, I fear that Julia will end up losing to Mojo.

I really hope I am wrong. I love Julia and would like to see it succeed everywhere, but it doesn’t seem to be happening.

dan-robertson · 3 years ago
I think the point the GP is making is that Julia (and other lisps) lost in this space.
est · 3 years ago
And dialects of Lisp-alikes will fight each other. https://locklessinc.com/articles/why_lisp_failed/
dgb23 · 3 years ago
For reference:

Julia is basically a Lisp under the hood. From playing around with it, it seems like the REPL experience is up there too.

sundarurfriend · 3 years ago
The language is a "reasonable" Lisp, with the caveat that it doesn't have things like reader macros (which is a good thing IMO, and helps avoid the Curse of Lisp).

The runtime/REPL is pretty good too, and can be quite dynamic with Revise.jl, but doesn't have the Real REPL-driven Programming Experience™ as defined here: https://mikelevins.github.io/posts/2020-12-18-repl-driven/

I'm not sure how much of the "break loop" functionality Infiltrator.jl provides, but at least runtime redefinition of types isn't supported in Julia, and is one of the shortcomings of the Revise.jl-based workflow.

All this is not to take away from the original point, Julia does get you a big chunk of the way to being a Lisp and gives you a lot of expressive power. It's just to say that Julia is not just a reskinning of a Lisp with familiar syntax, it has some important design and implementation differences.

IshKebab · 3 years ago
Doesn't sound very perfect to me. I'm no fan of Python but it is at least readable. Lisp is not.
sva_ · 3 years ago
In a perfect world, we'd probably all live in some kind of harmonious communistic utopia, but theory and reality are two different things.
WantonQuantum · 3 years ago
Be sure to check out the comments on the page - lots of optimisations for the julia code.
ayhanfuat · 3 years ago
Not if you want to avoid condescending "I cannot look at the Python code my eyes hurt" comments. Good to know the Julia community hasn't made any progress in that regard, though.
suavesito · 3 years ago
I think this is a little bit unfair. The comment refers to the Mojo-specific use of [], not to regular Python. It also starts by saying

> I know I shouldn’t say so but I can’t help...

Remarking that the comment should not be taken too seriously, as it might be inappropriate.

Finally, saying the whole community is condescending given 1 in 32 comments is... a little rounding up from the statistics there.

ballooney · 3 years ago
I only see one of those among 32 constructive ones - a rather better ratio than this site!
amj7e · 3 years ago
Well, still a much better community than most where having an opinion is considered a sin.
agumonkey · 3 years ago
So Julia has --pedantic by default?

eddtests · 3 years ago
Mojo released an example of their new language, which means readability and simplicity compared to the Python implementation will surely have been a requirement… I get that someone within Modular doing some horrific-looking low-level Mojo stuff could get it much quicker.
oivey · 3 years ago
The Mojo one is already doing some pretty horrific low-level stuff with fairly manual SIMD. That’s why it was faster than Julia in the initial example, and the edge is lost when a couple of posters did similar things for Julia.
patagurbon · 3 years ago
We have yet to see Mojo do any "sufficiently smart compiler" optimizations that Julia or similar languages don't already do. The Mojo code in the blog post does the same ugly optimizations to get good SIMD as the Julia code.

Convincing LLVM to vectorize is still a problem in both languages. I do hope Mojo can make some headway there in the future. Especially since with MLIR they might be able to capture some higher level semantics Julia can't.

eigenspace · 3 years ago
> Mojo released an example of their new language which will mean readability and simplicity compared to the Python implementation will surely have been a requirement…

Did you read the Mojo code? It’s very messy and low-level dealing with explicit SIMD intrinsics and such.

eddtests · 3 years ago
Yea I did, but after using it quite a bit over the last 24 hours since the SDK came out, I’ve already seen far worse lol
bjourne · 3 years ago
Last time this was up I wrote a single-threaded version in C which I'm pretty sure beats both Julia and Mojo: https://github.com/bjourne/c-examples/blob/master/programs/m...
TwentyPosts · 3 years ago
> which I'm pretty sure beats both Julia and Mojo

Sometimes "showing the code" is not enough. Show me the benchmark.

bjourne · 3 years ago
I'll pass - Hacker News comments are not dissertations. The C code ran faster on my machine. YMMV.
adgjlsfhk1 · 3 years ago
I'm pretty sure it doesn't. That looks exactly like the single threaded code for the good julia versions.
ssivark · 3 years ago
> pretty sure beats both Julia and Mojo:

Would the C compiler automatically exploit vectorized instructions on the CPU, or loop/kernel fusion, etc? It’s unclear otherwise how it would be faster than Julia/Mojo code exploiting several hardware features.

bjourne · 3 years ago
In an HLL like Julia or Mojo you use special types and annotations to nudge the compiler to use the correct SIMD instructions. In C the instructions are directly usable via intrinsics. Julia's and Mojo's advantage is that the same code is portable over many SIMD instruction sets like sse, avx2, avx512, etc. But you generally never get close to the same performance hand-optimized C code gets you.