In a nutshell, there have been trends in Python (such as JAX and Thinc.ai) arguing that functional programming can provide better abstractions and more composable building blocks for deep learning libraries. And I believe Elixir, as a functional language with Lisp-style macros, is in an excellent position to exploit that, as seen in "numerical definitions", which compile a subset of Elixir to the CPU/GPU.
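For readers coming from Python, a rough analogue of a numerical definition is a jit-compiled JAX function: a pure numerical function that a tracing compiler lowers to CPU/GPU code (XLA in JAX's case). A minimal sketch, using JAX rather than Elixir for illustration:

```python
# Sketch: the JAX analogue of an Nx "numerical definition".
# A pure numerical function is traced once and compiled by XLA
# to run on CPU/GPU, rather than being interpreted op by op.
import jax
import jax.numpy as jnp

@jax.jit
def softmax(t):
    # Subtract the max before exponentiating for numerical stability.
    e = jnp.exp(t - jnp.max(t))
    return e / jnp.sum(e)

probs = softmax(jnp.array([1.0, 2.0, 3.0]))
print(probs)  # probabilities that sum to 1
```

Nx's `defn` plays the same role in Elixir: it restricts the function body to a numerical subset of the language so the whole computation can be handed to a compiler backend.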
I also think the Erlang VM, with its distribution and network capabilities, can provide exciting developments in the realm of federated and distributed learning. We aren't exploring those aspects yet, but we are getting closer to having the foundation to do so.
Regarding MLOps, I believe one main advantage is explained in this video announcement. When deploying an ML model with Nx, you can embed the model within your applications: you don't need a third-party service because we batch and route requests across multiple cores and multiple nodes within Erlang/Elixir. This can be especially beneficial for projects like Nerves [5], and we will see how it evolves in the long term (as we _just_ announced it).
Finally, one of the benefits of starting from scratch after Python has paved the way is that we can learn from its ecosystem and provide a unified experience. You can think of Nx as NumPy+JAX+TFServing all in one place, and we hope that streamlines the developer experience. This also means libraries like Scholar [2] (which aims to serve a similar role to SciPy) and Meow [3] (for genetic algorithms) get to use the same abstractions and compile to the CPU/GPU. The latter can show an order-of-magnitude improvement over other currently used frameworks [4].
[0]: https://dashbit.co/blog/nx-numerical-elixir-is-now-publicly-...
[1]: https://dashbit.co/blog/elixir-and-machine-learning-nx-v0.1
[2]: https://github.com/elixir-nx/scholar/
[3]: https://github.com/jonatanklosko/meow
[4]: https://dl.acm.org/doi/10.1145/3512290.3528753
[5]: https://www.nerves-project.org/
One thing that still suffers is AI autocomplete. While I have tried Zed's own solution and Supermaven (now part of Cursor), I still find Cursor's AI autocomplete and predictions much more accurate (even pulling up a file via search is more accurate in Cursor).
I am glad to hear that Zed got a round of funding (https://zed.dev/blog/sequoia-backs-zed). This will go a long way toward creating real competition for Cursor in the form of a quality IDE not built on VSCode.
I’ll keep an eye on this ‘proper’ Zed support for sure, although the current setup is working just fine so I might wait for v0.2.