Readit News
darsnack commented on Julia 1.10   docs.julialang.org/en/v1/... · Posted by u/CoreyFieldens
jszymborski · 2 years ago
Out of curiosity, how's the state of DL for Julia?

Can I use PyTorch or JAX comfortably in Julia?

darsnack · 2 years ago
> Can I use PyTorch or JAX comfortably in Julia?

There is https://github.com/rejuvyesh/PyCallChainRules.jl which makes this possible. But using some of the native Julia ML libraries that others have mentioned is preferable.

darsnack commented on Julia 1.10   docs.julialang.org/en/v1/... · Posted by u/CoreyFieldens
Tarrosion · 2 years ago
What's the user-facing difference between Lux and Flux?
darsnack · 2 years ago
How you interact with parameters.

Lux is similar to Flax (Jax) where the parameters are kept in a separate variable from the model definition, and they are passed in on the forward pass. Notably, this design choice allows Lux to accept parameters built with ComponentArrays.jl which can be especially helpful when working with libraries that expect flat vectors of parameters.

Flux lies somewhere between Jax and PyTorch. Like PyTorch, the parameters are stored as part of the model. Unlike traditional PyTorch, Flux has “functional” conventions, e.g. `g = gradient(loss, model)` vs. `loss.backward()`. Similar to Flax, the model is a tree of parameters.
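The design difference above can be illustrated without either package, using plain Julia structs (a hedged sketch: the names `Dense`, `StatelessDense`, `init`, and `apply` below are hypothetical stand-ins, not the actual Flux or Lux APIs):

```julia
# Flux-style: the parameters live inside the model struct itself
struct Dense
    W::Matrix{Float64}
    b::Vector{Float64}
end
(m::Dense)(x) = m.W * x .+ m.b  # calling the model uses its own parameters

# Lux-style: the model only describes the architecture; parameters are
# created separately and passed in explicitly on every forward pass
struct StatelessDense
    in::Int
    out::Int
end
init(m::StatelessDense) = (W = zeros(m.out, m.in), b = zeros(m.out))
apply(m::StatelessDense, x, ps) = ps.W * x .+ ps.b
```

Because the Lux-style parameters are just a standalone tree of arrays, they can be swapped for other representations (e.g. a flat vector via ComponentArrays.jl) without touching the model definition.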

darsnack commented on Texas could get a bullet train between Houston and Dallas   popsci.com/technology/amt... · Posted by u/elorant
yarpen_z · 3 years ago
The last mile argument is often ignored, but it's very important in this discussion. What's the point of traveling fast by train unless you're going exactly to the downtown? Without reliable public transport, it might be more efficient to travel by car, since you won't have to rent one or Uber everywhere.
darsnack · 3 years ago
You are also allowed to have bike share, buses, subway/light rail, and taxis. No one is saying trains are the only mode of transportation. I’ve lived in midwestern states with miles of farmland. In the town center, a bus network and bike paths meant I almost never needed a car. Folks who lived outside of town had the option to drive in and park their car at commuter lots.

Also, this is what railway lines looked like in the US when we cared to build out our train infrastructure: https://i.pinimg.com/originals/3c/57/ee/3c57eeffb7e1a3c78691.... Just because the US is big doesn’t mean we can’t build tracks.

darsnack commented on FastAI.jl: FastAI for Julia   forums.fast.ai/t/ann-anno... · Posted by u/dklend122
xvilka · 5 years ago
I didn't check the library yet, asking here in case there's a quick answer - does it support GNNs too, via GeometricFlux.jl[1]?

[1] https://github.com/FluxML/GeometricFlux.jl

darsnack · 5 years ago
You would have to add a learning method to tell it how to encode/decode graph data, but the framework is agnostic to the model choice. So any Flux model is supported.
darsnack commented on FastAI.jl: FastAI for Julia   forums.fast.ai/t/ann-anno... · Posted by u/dklend122
ellisv · 5 years ago
This is interesting to me but the motivation behind this is unclear. Since FastAI.jl uses Flux, and not PyTorch, functionality has to be reimplemented. FastAI.jl has vision support but no text support yet.

What does this mean for the development of fastai?

What is the timeline for FastAI.jl to achieve parity?

When should I choose FastAI.jl vs fastai?

darsnack · 5 years ago
I’m not the main dev on FastAI.jl, but I work on the Julia ML community team that supported this project.

> Since FastAI.jl uses Flux, and not PyTorch, functionality has to be reimplemented.

We are looking to offer a high level API for ML in Julia similar to fastai for PyTorch. The goal is to enrich the Flux ecosystem, so just calling into Python fastai wouldn’t be appropriate. FastAI.jl is built on top of several lower level packages that can be used separately from FastAI.jl. These packages help build out the ecosystem not just for FastAI.jl, but any ML framework or workflow in Julia.

> What does this mean for the development of fastai?

FastAI.jl is “unofficial” in that Jeremy and the fastai team did not develop it. But Jeremy knows about the project, and we have kept in touch with the fastai team for feedback. FastAI.jl doesn’t affect the development of Python fastai in any way.

> FastAI.jl has vision support but no text support yet.

> What is the timeline for FastAI.jl to achieve parity?

We’re working to add more out-of-the-box support for other learning tasks. Currently, we have tabular support on the way, but the timeline for text is not decided.

Note that the framework itself could already support a text learning method, but you’d have to implement the high level interface functions for it yourself. We just don’t have built-in defaults like vision. You can check out https://fluxml.ai/FastAI.jl/dev/docs/learning_methods.md.htm... for a bit more on what I mean.

> When should I choose FastAI.jl vs fastai?

It depends on what you need. PyTorch and fastai are more mature, but Julia and Flux tend to be more flexible to non-standard problems in my experience. If you’re interested, then give Julia/Flux/FastAI.jl a try. If we’re missing a mission critical feature for you, then please let us know so we can prioritize it.

darsnack commented on Julia adoption keeps climbing   hpcwire.com/2021/01/13/ju... · Posted by u/newswasboring
dopu · 5 years ago
I'm a graduate student that's switched almost completely over to Julia. Prior to it I worked in both MATLAB (the IDE is so nice, and writing out matrix computations is just great) and Python (for ML). Julia is absolutely nicer to write in than either of the two. MATLAB is slow and at times feels less like a programming language and more like an incomplete and brittle interface with the JVM. Python is also slow, and it feels awkward to use given that it was not explicitly designed for scientific workflows. With Julia I get proper typing, incredible speed, easy parallelization, and a kick ass REPL.

The only thing I truly miss in using Julia is the plotting capacities of MATLAB. I haven't found an environment that can match it in terms of interactivity. Give me the ability to (easily) save interactive figures for later use and Julia would be perfect.

darsnack · 5 years ago
You should check out Makie. Getting it set up can be a bit frustrating if things don’t go right, and there is a small learning curve for using `@lift`, but it is an absolute joy to use once you ramp up.

I use it for my research by default. You can pan, zoom, etc. The subplot/layout system is frankly a lot better than Matlab (and I enjoyed Matlab for plotting!). The best part is that I can insert sliders and drop downs into my plot easily, which means I don’t need to waste time figuring out the best static, 2D plot for my experiment. I just dump all the data into some custom logging struct and use sliders to index into the correct 2D plot (e.g. a heat map changing over time, I just save all the matrices and use the slider to get the heat map at time t).
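That slider-over-logged-data workflow can be sketched roughly as follows, assuming GLMakie is installed (a minimal illustration of the pattern, not a polished layout):

```julia
using GLMakie

# Log one matrix per timestep during the experiment
frames = [rand(32, 32) for _ in 1:100]

fig = Figure()
ax = Axis(fig[1, 1], title = "Heat map over time")
sl = Slider(fig[2, 1], range = 1:length(frames), startvalue = 1)

# @lift re-computes the displayed frame whenever the slider moves,
# so the static plot becomes an interactive index into the full log
frame = @lift frames[$(sl.value)]
heatmap!(ax, frame)

display(fig)
```

Dragging the slider then re-renders the heat map at the selected timestep, with no need to pick a single static view up front.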

darsnack commented on Apple unveils M1, its first system-on-a-chip for portable Mac computers   9to5mac.com/2020/11/10/ap... · Posted by u/runesoerensen
mmm_grayons · 5 years ago
Can anyone who knows about machine learning hardware comment on how much faster dedicated hardware is as opposed to, say, a vulkan compute shader?
darsnack · 5 years ago
On the NVIDIA A100, the standard FP32 performance is 20 TFLOPs, but if you use the tensor cores and all the ML features available, then it peaks out at 300+ TFLOPs. Not exactly your question, but a simple reference point.

Now the accelerator in the M1 is only 11 TFLOPs. So it’s definitely not trying to compete as an accelerator for training.

darsnack commented on Pluto.jl – a reactive, lightweight, simple notebook   github.com/fonsp/Pluto.jl... · Posted by u/dunefox
benhurmarcel · 6 years ago
What happens if you reassign a variable below? Like:

  a = 1
  println(a)
  a = 2
Does it show 1 or 2?

Edit: tested it, it throws an error "Multiple definitions for a: Combine all definitions into a single reactive cell using a `begin ... end` block."

Not sure I like that way of working.

darsnack · 6 years ago
In a more complex example where you actually take a variable, do some operations on it, then reassign it, Pluto.jl encourages you to separate that into multiple cells. The reason is that each cell marks a distinct node in the dependency graph. If you split your work across cells, the notebook can be smarter about which lines actually need to be re-run and which don't.

A downside to using multiple cells is vertical spacing/visual noise. This is something that the package authors are currently thinking about addressing.
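Concretely, the two ways to resolve the reassignment error above look like this (a sketch; outside of Pluto this is just ordinary Julia):

```julia
# Option 1: combine the definitions into a single cell with begin ... end,
# exactly as the error message suggests
begin
    a = 1
    println(a)   # prints 1
    a = 2
end

# Option 2 (what Pluto encourages): give each step its own name and cell,
# so every cell becomes a distinct node in the dependency graph
a0 = 1        # cell 1
a1 = a0 + 1   # cell 2 -- re-runs automatically whenever a0 changes
```

With option 2, editing `a0` invalidates and re-runs only the cells that depend on it, which is where the reactive model pays off.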

darsnack commented on Julia library for fast machine learning   turing.ml/dev/... · Posted by u/todsacerdoti
nerdponx · 6 years ago
I've been excited by Turing, just to see a probabilistic programming framework like that in Julia. I think the expressiveness of Julia and it being native to that framework will be helpful.

I hope so too. But hasn't Julia's TF/Torch equivalent, Flux, had performance problems? That was the rumor I heard anyway, I haven't had the chance to use it myself.

darsnack · 6 years ago
Most of that is not fundamental to Julia or Flux itself. It’s the difference between a monolithic package like TF and source-to-source AD in Julia. The former allows the designers to use their own data structures and external libraries to do optimizations. Source-to-source relies on the underlying IR used by Julia, making optimizations challenging without some compiler assistance. But all of that is in the pipeline with stopgap solutions on the way.

As with most things in Julia, the developers don’t just want to hack in changes that work, but make changes that are flexible, extensible, and can solve many problems at once. So Flux isn’t ready for prime time yet, but it is definitely worth keeping an eye on.

u/darsnack

Karma: 44 · Cake day: October 14, 2014