flerovium114 commented on Derivatives, Gradients, Jacobians and Hessians   blog.demofox.org/2025/08/... · Posted by u/ibobev
sestep · 14 days ago
A bit more advanced than this post, but for calculating Jacobians and Hessians, the Julia folks have done some cool work recently building on classical automatic differentiation research: https://iclr-blogposts.github.io/2025/blog/sparse-autodiff/
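For a flavor of the compression trick described there: if groups of Jacobian columns are structurally orthogonal (they share no nonzero row), a single directional derivative per group recovers all of them at once. A minimal Julia sketch, with finite differences standing in for true forward-mode AD seeds; the tridiagonal test function and the 3-coloring are illustrative assumptions:

```julia
# Column compression for a sparse Jacobian: probe structurally
# orthogonal columns together, one directional derivative per color.
n = 9
f(x) = [x[i]^2 + (i > 1 ? x[i-1] : 0.0) + (i < n ? x[i+1] : 0.0) for i in 1:n]

x = collect(1.0:n)
h = 1e-6
J = zeros(n, n)

for color in 1:3
    seed = zeros(n)
    seed[color:3:n] .= 1.0                 # compressed seed direction
    d = (f(x .+ h .* seed) .- f(x)) ./ h   # one directional derivative ("JVP")
    for j in color:3:n
        rows = max(1, j - 1):min(n, j + 1) # rows column j can touch
        J[rows, j] .= d[rows]
    end
end
# 3 probes instead of n = 9 recover the full tridiagonal Jacobian.
```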
flerovium114 · 14 days ago
Have you tried using Enzyme (https://enzyme.mit.edu/)? It operates on the LLVM IR, so it's available in any language that breaks down into LLVM (e.g., Julia, where I've used it for surface gradients) and it produces highly optimized AD code. Pretty cool stuff.
flerovium114 commented on How randomness improves algorithms (2023)   quantamagazine.org/how-ra... · Posted by u/kehiy
flerovium114 · 16 days ago
Randomized numerical linear algebra has proven very useful as well. It lets you compute standard decompositions (SVD, QR, etc.) from nothing more than a black-box function implementing matrix-vector multiplication (MVM), which pays off especially when the MVM costs O(N log N) or better.
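As a concrete example, a minimal Julia sketch of the randomized SVD in the Halko–Martinsson–Tropp style, where the matrix is touched only through black-box products A*v and A'*v; the dense stand-in operator, rank k, and oversampling p are illustrative assumptions:

```julia
using LinearAlgebra, Random

Random.seed!(0)
m, n, k, p = 200, 150, 10, 5
A = randn(m, n) * Diagonal(exp.(-(1:n) ./ 10))  # stand-in with decaying spectrum

matvec(v)  = A * v    # black-box forward product (imagine FFT/FMM here)
tmatvec(v) = A' * v   # black-box adjoint product

# Range finder: sketch the column space with k + p random probes.
Ω = randn(n, k + p)
Y = reduce(hcat, [matvec(Ω[:, j]) for j in 1:k+p])
Q = Matrix(qr(Y).Q)   # orthonormal basis approximating range(A)

# Projection: B = Q' * A, built from one adjoint matvec per basis vector.
B = Matrix(reduce(hcat, [tmatvec(Q[:, j]) for j in 1:k+p])')   # (k+p) × n

# A small dense SVD of B lifts back to an approximate SVD of A.
F = svd(B)
U, S, V = Q * F.U, F.S, F.V
@show norm(A - U[:, 1:k] * Diagonal(S[1:k]) * V[:, 1:k]') / norm(A)
```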

u/flerovium114

Karma: 2 · Cake day: August 16, 2025
About
I do research and development in computational electromagnetics (CEM), specializing in fast algorithms and high-performance computing.