I've been a full-time FEM analyst for 15 years now. It's generally a nice article, though in my opinion it paints a far rosier picture of the last couple of decades than is warranted.
Actual, practical use of FEM has been stagnant for quite some time. There have been some nice stability improvements to the numerical algorithms that make highly nonlinear problems a little easier; solvers are more optimized; and hardware is of course dramatically more capable (flash storage has been a godsend).
Basically every advanced/"next generation" thing the article touts has fallen flat on its face when applied to real problems. They have some nice results on the world's simplest "laboratory" problems, but accuracy is abysmal on most real-world ones - e.g. a method might give good results on a cylinder in simple tension, but fail horribly once you add bending.
There's still nothing better, but looking back I'm pretty surprised I'm still basically doing things the same way I did as an Engineer 1, and not for lack of trying. I've been on countless development projects that seemed promising but just wouldn't validate in the real world.
Industry focus has been far more on Verification and Validation (ASME V&V 10/20/40) which has done a lot to point out the various pitfalls and limitations. Academic research and the software vendors haven't been particularly keen to revisit the supposedly "solved" problems we're finding.
I'm a mechanical engineer, and I've been wanting to better understand the computational side of the tools I use every day. Do you have any recommendations for learning resources if one wanted to "relearn" FEA from a computer science perspective?
I kind of thought Neural Operators were slotting into some of the problem domains where FEM is used (based on recent work in weather modelling, cloth modelling, etc.) and thought there was some sort of FEM -> NO lineage. Did I completely misunderstand that whole thing?
Those are definitely up next in the flashy-new-thing pipeline and I'm not that up to speed on them yet.
Another group within my company is evaluating them right now, and the early results seem to be "not very accurate, but directionally correct and very fast", so there may be some value in non-FEM experts using them to quickly tell whether design A or B is better; but they will still need a proper analysis in more accurate tools.
It's still early though and we're just starting to see the first non-research solvers hitting the market.
I was under the impression that the linear systems that come out of FEM methods are in some cases being solved by neural networks (or partially, e.g. as a preconditioner in an iterative scheme), but I don't know the details.
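I can at least sketch the shape of that idea. A preconditioned conjugate-gradient solver only ever needs a routine that applies an approximate inverse to the residual, so a trained network can slot in there without touching the rest of the solver. A toy sketch in NumPy, where `apply_preconditioner` is a hypothetical stand-in (plain Jacobi here) for whatever a learned model would compute:

```python
import numpy as np

def pcg(A, b, apply_preconditioner, tol=1e-8, max_iter=500):
    """Preconditioned conjugate gradients for a symmetric positive-definite A.

    apply_preconditioner(r) should return an approximation of A^{-1} r.
    In the setups described above, this is the slot where a trained
    neural network would be called; here plain Jacobi stands in for it.
    """
    x = np.zeros_like(b)
    r = b - A @ x
    z = apply_preconditioner(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        z = apply_preconditioner(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Toy SPD system (1D Laplacian stiffness matrix, like FEM would produce):
n = 100
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = pcg(A, b, lambda r: r / np.diag(A))   # Jacobi as the stand-in "model"
print(np.linalg.norm(A @ x - b))
```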
Stagnant for the last 15 years???
Contact elements, bolt preload, modeling individual composite fibers, delamination, progressive ply failure, modeling layers of material down to a few thousandths of an inch. Design optimization.
ANSYS Workbench = FEA For Dummies.
The list goes on.
Could you write a blogpost-style article on how to model the shallow water wave equation on a sphere? The article would start with the simplest possible method, something that could be implemented in short C program, and would continue with a progressively more accurate and complex methods.
If you are interested in this, I'd recommend following an openfoam tutorial, c++ though.
You could do SWE with finite elements, but generally finite volumes would be your choice to handle any potential discontinuities and is more stable and accurate for practical problems.
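Not a blog post, but the core of the "simplest possible method" fits in a few lines. A toy 1D shallow-water solver with a Lax-Friedrichs finite-volume flux, on flat geometry; a sphere adds metric terms and a smarter grid on top of the same mechanics:

```python
import numpy as np

# 1D shallow water equations, U = (h, hu), solved with the Lax-Friedrichs
# finite-volume flux on a periodic domain. About the simplest stable scheme.
g = 9.81
n, L = 200, 10.0
dx = L / n
x = (np.arange(n) + 0.5) * dx

h = 1.0 + 0.3 * np.exp(-((x - L / 2) ** 2))   # smooth hump, fluid at rest
hu = np.zeros(n)

def flux(h, hu):
    u = hu / h
    return np.array([hu, hu * u + 0.5 * g * h ** 2])

t, t_end = 0.0, 1.0
while t < t_end:
    c = np.abs(hu / h) + np.sqrt(g * h)        # characteristic wave speeds
    dt = 0.4 * dx / c.max()                    # CFL-limited time step
    U = np.array([h, hu])
    F = flux(h, hu)
    Ur = np.roll(U, -1, axis=1)                # right-neighbour states
    Fr = np.roll(F, -1, axis=1)
    Fhalf = 0.5 * (F + Fr) - 0.5 * (dx / dt) * (Ur - U)   # interface fluxes
    U = U - (dt / dx) * (Fhalf - np.roll(Fhalf, 1, axis=1))
    h, hu = U
    t += dt

print("max depth after %.2f s: %.3f" % (t, h.max()))
```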
I started my career doing FE modeling and analysis with ANSYS and NASTRAN. Sometimes I miss those days. Thinking about how to simplify a real-world problem so far that it is solvable with the computational means available was always fun. Then pushing quads around for hours until the mesh was good had an almost meditative effect. But I don't feel overwhelmingly eager to learn new software or a new language.
Much to my surprise, it seems there hasn't been much movement there. ANSYS still seems to be the leader for general simulation and multi-physics. NASTRAN still popular. Still no viable open-source solution.
The only new player seems to be COMSOL. Does anyone have experience with it? Would it be worth a try for someone who knows ANSYS and NASTRAN well?
I've used ANSYS daily for over a decade, and the only movement is in how they name their license tiers. It's a slow, muddy death march. Every year I'm fighting the software more and more; the salesmen are clearly at the wheel.
They buy "vertically aligned" software, integrate it, then slowly let it die. They just announced they're killing off one of these next year, a product they bought ten years ago, because they want to push a competing product with 20% of the features.
I've been using NASTRAN for half as long, but it isn't much better. It's all sales.
I dabbled a bit in Abaqus, and that seems nice. Probably because I only dabbled in it.
But here I'm just trying to do my work, and all these companies do is move capabilities around their license tiers and boil the frog as fast as they can get away with.
I've gone Abaqus > Ansys > Abaqus/LS-DYNA over my career and hate Ansys with a fiery passion. It's the easiest one to run your first model in, but when you start applying it to real problems it's a fully adversarial relationship. The fact that you have to make a complete copy of the geometry/mesh in a new Workbench "block" to run a slightly different load case (and you can't read in an orphaned results file) is just horrible.
Abaqus is more difficult to get up to speed in, but it's really nice from an advanced-usability standpoint. They struggle due to cost, though; it is hugely expensive, and we've had to fight hard to keep it time and time again.
LS-DYNA is similar to Abaqus (though I'm not fully up to speed on it yet), but we're all just waiting to see how Ansys ruins it, especially now that Ansys got bought by Synopsys.
For the more low-level stuff there's the FEniCS project[1], for solving PDEs using fairly straightforward Python code like this[2]. When I say fairly straightforward, I mean it follows the math pretty closely; it's not exactly high-school-level stuff.
[1]: https://fenicsproject.org/
[2]: https://jsdokken.com/dolfinx-tutorial/chapter2/linearelastic...
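To give a flavour of "follows the math closely": the weak form of a Poisson problem becomes essentially one line. This sketch uses the older FEniCS (dolfin) syntax; the tutorial linked above uses the newer DOLFINx API, which is wordier but does the same thing:

```python
from fenics import *

# Poisson problem: -laplace(u) = f on the unit square, u = 0 on the boundary.
mesh = UnitSquareMesh(32, 32)
V = FunctionSpace(mesh, "P", 1)

u = TrialFunction(V)
v = TestFunction(V)
f = Constant(1.0)

# Weak form, almost verbatim from the math:
# find u such that  int grad(u).grad(v) dx = int f v dx  for all v
a = dot(grad(u), grad(v)) * dx
L = f * v * dx

bc = DirichletBC(V, Constant(0.0), "on_boundary")
uh = Function(V)
solve(a == L, uh, bc)
print(uh.vector().max())
```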
Interesting. Please bear with me, as this is going off 25-year-old memories, but my memory is that the workflow for using FEA tools was: model in some 3D engineering modelling tool (e.g. SolidWorks), ANSYS to run FEA, iterate if needed, prototype, iterate.
So to have anything useful, you need that entire pipeline? For hobbyists, I assume we need this stack. What are the popular modelling tools?
COMSOL's big advantage is that it ties a lot of different physics regimes together and makes it very easy to couple them. Want to do coupled structures/fluid? Or coupled electromagnetic/mechanical? It's probably the easiest one to use.
Each individual physics regime is not particularly good on its own - there are far better mechanical, CFD, electromagnetism, etc solvers out there - but they're all made by different vendors and don't play nicely with each other.
Ouch. I kind of know Comsol because it was already taught in my engineering school 15 years ago, so that it still counts as a “new entrant” really gives an idea of how slow the field evolves.
I am hoping this open-source FEM library will catch on: https://www.dealii.org/. The "deal" in deal.II stands for Differential Equation Analysis Library.
It's written in C++, makes heavy use of templates, and has been in development since 2000. It's not meant for solid mechanics or fluid mechanics specifically, but for FEM solutions of general PDEs.
The documentation is vast, the examples are numerous, and the library interfaces with other libraries like PETSc, Trilinos, etc. You can output results to a variety of formats.
I believe support for triangle and tetrahedral elements was added only recently; historically it was all quadrilaterals and hexahedra. In spite of this, one quirk of the library is that meshes are called "triangulations".
I've worked with COMSOL (I have a smaller amount of ANSYS experience to compare to). For the most part I preferred COMSOL's UI and workflow, and I leveraged a lot of COMSOL's scripting capabilities, which was handy for a big but procedural geometry I had (I don't know ANSYS's capabilities for that). They of course largely do the same stuff. If you have easy access to COMSOL to try it out, I'd recommend it just for the experience. I've found that working with other tools sometimes makes me recognize some capability or technique that hadn't clicked for me yet.
Once you have a mesh that's "good enough", you can use any number of numeric solvers. COMSOL has a very good mesher, and a competent geometry editor. It's scriptable, and their solvers are also very good.
There might be better programs for some problems, but COMSOL is quite nice.
OpenFOAM seems like an open-source option, but I have found it rather impenetrable: there are some YouTube videos and PDF tutorials, but they are quite dense and specific and don't seem to cover the entire pipeline.
Wait? What? NASTRAN was originally developed by NASA and open sourced over two decades ago. Is this commercial software built on top that is closed source?
I’m astonished ANSYS and NASTRAN are still the only players in town. I remember using NASTRAN 20 years ago for FE of structures while doing aero engineering. And even then NASTRAN was almost 40 years old and ancient.
There's a bunch of open-source FEM solvers, e.g. CalculiX, Code_Aster, OpenRadioss, and probably a few unmaintained forks of (NASA) NASTRAN, but I don't think there's a multiphysics package.
I work in this field and it really is stagnant and dominated by high-priced Ansys et al. For some reason Silicon Valley's open-source culture hasn't touched it. For open source, there's CalculiX, which is full of bugs, and Code_Aster, which everybody I've heard from says is too confusing to use. CalculiX has PrePoMax as a fairly new and popular pre/post-processor.
During my industrial PhD, I created an Object-Oriented Programming (OOP) framework for Large Scale Air-Pollution (LSAP) simulations.
The OOP framework I created was based on Petrov-Galerkin FEM. (Both proper 2D and "layered" 3D.)
Before my PhD work, the people I worked with (worked for) used spectral methods and Alternate-direction FEM (i.e. using 1D to approximate 2D.)
In some conferences and interviews certain scientists would tell me that programming FEM is easy (for LSAP.) I always kind of agree and ask how many times they have done it. (For LSAP or anything else.)
I was not getting an answer from those scientists...
Applying FEM to real-life problems can involve the resolving of quite a lot of "little" practical and theoretical gotchas, bugs, etc.
> Applying FEM to real-life problems can involve the resolving of quite a lot of "little" practical and theoretical gotchas, bugs, etc.
FEM at its core ends up being just a technique for finding approximate solutions to problems expressed as partial differential equations.
Finding analytical solutions that satisfy both the boundary conditions and the domain geometry is practically impossible for real problems. FEM trades correctness for an approximation: it can be exact at prescribed boundary conditions but approximates both the domain and the solution, and it has nice properties such as the approximation error converging to the exact solution as the discretization is refined. The catch is that refinement means rapidly growing computational budgets (halving the element size in 3D multiplies the element count by eight).
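The smallest possible illustration of both points is a toy 1D example: solve -u'' = 1 on (0,1) with u(0) = u(1) = 0 using linear elements. The assembled stiffness matrix is tridiagonal, and the error at element midpoints drops by roughly 4x each time the element size is halved, which is the convergence-under-refinement property (and the growing cost) mentioned above:

```python
import numpy as np

def solve_poisson_1d(n):
    """Linear FEM for -u'' = 1 on (0,1), u(0) = u(1) = 0, with n elements."""
    h = 1.0 / n
    m = n - 1                                  # interior nodes
    # Element matrices [[1,-1],[-1,1]]/h assemble into a tridiagonal K.
    K = (2 * np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1)) / h
    F = h * np.ones(m)                         # load vector for f = 1, hat functions
    u = np.concatenate(([0.0], np.linalg.solve(K, F), [0.0]))
    # Max error of the piecewise-linear solution at element midpoints,
    # against the exact solution u(x) = x(1-x)/2.
    xm = (np.arange(n) + 0.5) * h
    u_mid = 0.5 * (u[:-1] + u[1:])
    exact = xm * (1 - xm) / 2
    return np.max(np.abs(u_mid - exact))

for n in (4, 8, 16, 32):                       # halving h cuts the error ~4x
    print(n, solve_poisson_1d(n))
```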
I also studied FEM in undergrad and grad school. There's something very satisfying about breaking an intractably difficult real-world problem up into finite chunks of simplified, simulated reality and getting a useful, albeit explicitly imperfect, answer out of the other end. I find myself thinking about this approach often.
Predicting how things evolve in space-time is a fundamental need. Finite element methods deserve the glory of a place at the top of the HN list. I opted for "orthogonal collocation" as the method of choice for my model back in the day because it was faster and more fitting to the problem at hand. A couple of my fellow researchers did use FEM. It was all the rage in the 90s for sure.
Interesting perspective. I just attended an academic conference on isogeometric analysis (IGA), which is briefly mentioned in this article. Tom Hughes, who is mentioned several times, is now the de facto leader of the IGA research community. IGA has a lot of potential to solve many of the pain points of FEM. It has better convergence rates in general, allows for better timesteps in explicit solvers, has better methods to ensure stability in, e.g., incompressible solids, and perhaps most exciting, enables an immersed approach, where the problem of meshing is all but gone as the geometry is just immersed in a background grid that is easy to mesh. There is still a lot to be done to drive adoption in industry, but this is likely the future of FEM.
> IGA has a lot of potential to solve many of the pain points of FEM.
Isn't IGA's shtick just replacing classical shape functions with the splines used to specify the geometry?
If I recall correctly, convergence rates are exactly the same, but the whole approach fails to realize that, other than at boundaries, the geometry and the fields of the quantities of interest do not have the same spatial distributions.
IGA has been around for ages and never materialized beyond the "let's reuse the CAD functions" trick, which ends up making the problem more complex without any tangible return when compared with plain old p-refinement. What is left in terms of potential?
> Tom Hughes, who is mentioned several times, is now the de facto leader of the IGA research community.
I recall the name Tom Hughes. I have his FEM book, and he's been for years (decades) the only one pushing the concept. The reason being that the whole computational mechanics community looked at it, found it interesting, but ultimately decided it wasn't worth the trouble. There are far more interesting and promising ideas in FEM than using splines to build elements.
> Isn't IGA's shtick just replacing classical shape functions with the splines used to specify the geometry?
That's how it started, yes. The splines used to specify the geometry are trimmed surfaces, and IGA has expanded from there to the use of splines generally as the shape functions, as well as trimming of volumes, etc. This use of smooth splines as shape functions improves the accuracy per degree of freedom.
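For anyone who hasn't run into them: the shape functions in question are just the B-spline basis from CAD, evaluated with the Cox-de Boor recursion. Unlike C0 Lagrange elements, neighbouring functions overlap smoothly across element (knot) boundaries, which is where the accuracy-per-DOF gain comes from. A toy sketch:

```python
import numpy as np

def bspline_basis(i, p, knots, x):
    """Cox-de Boor recursion: value of the i-th degree-p B-spline at x."""
    if p == 0:
        return 1.0 if knots[i] <= x < knots[i + 1] else 0.0
    left = right = 0.0
    d = knots[i + p] - knots[i]
    if d > 0:
        left = (x - knots[i]) / d * bspline_basis(i, p - 1, knots, x)
    d = knots[i + p + 1] - knots[i + 1]
    if d > 0:
        right = (knots[i + p + 1] - x) / d * bspline_basis(i + 1, p - 1, knots, x)
    return left + right

# Clamped quadratic B-spline basis on [0, 1]: six overlapping, C^1-smooth
# functions that form a partition of unity (they sum to one everywhere).
p = 2
knots = [0, 0, 0, 0.25, 0.5, 0.75, 1, 1, 1]
n_funcs = len(knots) - p - 1
for x in np.linspace(0.0, 0.999, 5):
    vals = [bspline_basis(i, p, knots, x) for i in range(n_funcs)]
    print("x=%.3f  sum=%.3f " % (x, sum(vals)), np.round(vals, 3))
```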
> If I recall correctly convergence rates are exactly the same
Okay, looks like I remembered wrong here. What we definitely do see is that in IGA you get the convergence rates of higher degrees without drastically increasing your degrees of freedom, meaning there is better accuracy per degree of freedom for any degree above 1. See for example Figures 16 and 18 in this paper: https://www.researchgate.net/profile/Laurens-Coox/publicatio...
> geometry and the fields of quantities of interest do not have the same spatial distributions.
Using the same shape functions doesn't automatically mean that they will have the same spatial distributions. In fact, with hierarchical refinement in splines you can refine the geometry and any single field of interest separately.
> What is left in terms of potential?
The biggest potential other than higher accuracy per degree of freedom is perhaps trimming. In FEM, trimming your shape functions makes the solution unusable. In IGA, you can immerse your model in a "brick" of smooth spline shape functions, trim off the region outside, and run the simulation while still getting optimal convergence properties. This effectively means little to no meshing required. For a company that is readying this for use in industry, take a look at https://coreform.com/ (disclosure, I used to be a software developer there).
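The trimming idea is easier to see in miniature than to explain: lay a background grid over the geometry and let quadrature points that fall outside the trimmed domain contribute nothing. A toy version that "trims" a disk out of a Cartesian grid and integrates its area; a real immersed solver would assemble stiffness terms from the background spline basis in the same loop, and treats cut cells far more carefully than this:

```python
import numpy as np

# Immersed/trimmed idea in miniature: integrate over a disk of radius R
# without ever meshing it, using only a background grid plus an
# inside/outside test.
R = 1.0
n = 64                                   # background cells per side
lo, hi = -1.2, 1.2
h = (hi - lo) / n

def inside(x, y):                        # the "trimming" predicate
    return x * x + y * y <= R * R

gp = np.array([-1.0, 1.0]) / np.sqrt(3)  # 2-point Gauss rule per direction
area = 0.0
for i in range(n):
    for j in range(n):
        cx = lo + (i + 0.5) * h
        cy = lo + (j + 0.5) * h
        for gx in gp:
            for gy in gp:
                x, y = cx + gx * h / 2, cy + gy * h / 2
                if inside(x, y):
                    # Gauss weight 1*1 times the cell Jacobian (h/2)^2.
                    # An immersed FEM solver would assemble stiffness
                    # contributions from the background basis here.
                    area += (h / 2) ** 2
print(area, np.pi * R ** 2)              # converges to pi as n grows
```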
Even Arnold's work? FEEC seemed quite promising last time I was reading about it, but never seemed to get much traction in the wider FEM world.
It seems like a hot candidate to yield better results in the future.
Here is a tutorial: https://www.tfd.chalmers.se/~hani/kurser/OS_CFD_2010/johanPi...
Typically, the Finite Volume Method is used for fluid-flow problems. It is possible to use Finite Element Methods, but it is rare.
Happy to hear if people have good resources!
> FEM: Finite Element Method: https://en.wikipedia.org/wiki/Finite_element_method
>> FEM: Finite Element Method (for ~solving coupled PDEs (Partial Differential Equations))
>> FEA: Finite Element Analysis (applied FEM)
> awesome-mecheng > Finite Element Analysis: https://github.com/m2n037/awesome-mecheng#fea
And also, "Learning quantum Hamiltonians at any temperature in polynomial time" (2024) https://arxiv.org/abs/2310.02243 re: the "relaxation technique": https://news.ycombinator.com/item?id=40396171