https://proceedings.neurips.cc/paper/2012/file/c399862d3b9d6...
The approach is clearly valid but I feel like there are some missing pieces in making it work effectively in a digital context. I played around with this stuff a lot in college and my take was that the evolvable encoding problem (a form of representation problem) is fairly major and not really solved. There are also some unsolved issues around evolutionary dynamics and evolutionary game theory, i.e., how to structure the population and the "game" for best results. Not sure how much progress has been made since then on these, but my impression is not much.
The main area where GAs have seen use in the field so far is optimization problems with a lot of parameters related in unknown or hard-to-model ways, or where a closed-form solution is unknown or computationally too expensive (e.g., NP-hard). These include shipping and air travel routing (traveling salesman with multiple optimization goals like distance + time + fuel + depreciation), circuit board and IC layout, antenna design for exotic RF modulations, drug discovery, materials science, etc. Problems like these are fairly easy to map to a GA, and current GAs are pretty good at finding local maxima in these functions.
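To make the "easy to map" claim concrete, here's a toy sketch of that routing use case: a GA over tours where the fitness blends two objectives (distance plus a made-up travel-time matrix, with arbitrary weights). Everything here is illustrative, not a production routing solver.

```python
import random

random.seed(0)
N = 12
cities = [(random.random(), random.random()) for _ in range(N)]
times = [[random.random() for _ in range(N)] for _ in range(N)]  # fake travel times

def dist(a, b):
    (x1, y1), (x2, y2) = cities[a], cities[b]
    return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5

def cost(tour):
    # weighted sum of the two objectives; the 0.7/0.3 weights are arbitrary
    d = sum(dist(tour[i], tour[(i + 1) % N]) for i in range(N))
    t = sum(times[tour[i]][tour[(i + 1) % N]] for i in range(N))
    return 0.7 * d + 0.3 * t

def crossover(p1, p2):
    # order crossover (OX): keep a slice of p1, fill the rest in p2's order
    i, j = sorted(random.sample(range(N), 2))
    child = [None] * N
    child[i:j] = p1[i:j]
    rest = [c for c in p2 if c not in child]
    for k in range(N):
        if child[k] is None:
            child[k] = rest.pop(0)
    return child

def mutate(tour):
    # occasional swap mutation keeps the tour a valid permutation
    if random.random() < 0.2:
        i, j = random.sample(range(N), 2)
        tour[i], tour[j] = tour[j], tour[i]
    return tour

pop = [random.sample(range(N), N) for _ in range(60)]
baseline = min(cost(t) for t in pop)  # best random tour, for comparison
for gen in range(200):
    pop.sort(key=cost)
    elite = pop[:20]  # truncation selection with elitism
    pop = elite + [mutate(crossover(*random.sample(elite, 2))) for _ in range(40)]
best = min(pop, key=cost)
print(round(cost(best), 3), "vs random baseline", round(baseline, 3))
```

Because the elites are carried over unchanged each generation, the best cost never regresses; the GA never "knows" anything about the structure of the objective, which is exactly why this maps so easily to hard-to-model problems.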
Still, those are a far cry from "design ex nihilo," which is really the promise that evolutionary computation carries. Those applications are using GAs as a bigger brother to things like Monte Carlo and simulated annealing.
One area that I'd look into if I were doing this now would be a hybrid approach where a GA is used to design deep learning architectures and their associated parameters. Seems like it could be very powerful but damn would that ever take a lot of computing power. The fitness function of the GA would consist of N deep learning runs for each candidate. Luckily GAs are parallelizable to an almost unlimited degree.
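A sketch of what that hybrid could look like: a GA whose genomes are hyperparameter/architecture choices. The search space and the `fitness()` function below are stand-ins I made up; in a real system `fitness()` would be the expensive part (N deep learning runs per candidate), and since each evaluation is independent, the whole inner loop parallelizes across the population.

```python
import random

random.seed(1)

# hypothetical search space: depth, width, and log learning rate
SPACE = {"layers": range(1, 9), "width": range(16, 513), "log_lr": range(-5, -1)}

def random_genome():
    return {k: random.choice(list(v)) for k, v in SPACE.items()}

def fitness(g):
    # STAND-IN for validation accuracy after N training runs;
    # a cheap fake so the sketch actually executes
    return -(g["layers"] - 4) ** 2 - ((g["width"] - 256) / 64) ** 2 - (g["log_lr"] + 3) ** 2

def mutate(g):
    g = dict(g)
    k = random.choice(list(SPACE))
    g[k] = random.choice(list(SPACE[k]))  # resample one gene
    return g

def crossover(a, b):
    return {k: random.choice((a[k], b[k])) for k in SPACE}  # uniform crossover

pop = [random_genome() for _ in range(20)]
baseline = max(fitness(g) for g in pop)
for gen in range(30):
    # in practice each fitness() call is a full training job, so this
    # sort-select-breed step is the only serial part of the loop
    pop.sort(key=fitness, reverse=True)
    parents = pop[:6]
    pop = parents + [mutate(crossover(*random.sample(parents, 2))) for _ in range(14)]

best = max(pop, key=fitness)
print(best)
```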
Most recent advances in the fields you mentioned were driven by gradient-based optimization (e.g., drug design, routing, or chip design: https://www.nature.com/articles/s41586-021-03544-w).
Nature can't SGD through genomes but has a metric ton of time, so evolution via sexual reproduction might be near-ideal for its setting. We typically don't have billions of generations, trillions of instantiations, and complex environments to play with when optimizing functions... It's telling that the fastest-evolving biological system (our brain!) certainly doesn't employ large-scale GA; if anything, it probably approximates gradients via funky distributed rules.
EDIT: The most modern application I can think of was some stuff from OpenAI (https://openai.com/blog/evolution-strategies/). But the point here is one of computational feasibility -- if they could backprop through the same workload, they would.
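For reference, the core of that OpenAI-style evolution strategy is tiny: perturb the parameters with Gaussian noise, weight each perturbation by its (normalized) reward, and average to get a search-gradient estimate, with no backprop anywhere. The quadratic `f()` below is a toy objective standing in for an RL episode return; the hyperparameters are arbitrary.

```python
import random

random.seed(0)
DIM, SIGMA, ALPHA, POP = 5, 0.1, 0.02, 50

def f(theta):
    # toy objective standing in for an episode return; maximum at theta = 3
    return -sum((t - 3.0) ** 2 for t in theta)

theta = [0.0] * DIM
for step in range(300):
    # sample a population of Gaussian perturbations and evaluate each one
    eps = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(POP)]
    rewards = [f([t + SIGMA * e for t, e in zip(theta, ep)]) for ep in eps]
    # normalize rewards so the update scale is insensitive to reward units
    mean = sum(rewards) / POP
    std = (sum((r - mean) ** 2 for r in rewards) / POP) ** 0.5 + 1e-8
    norm = [(r - mean) / std for r in rewards]
    # reward-weighted average of perturbations ~ gradient estimate
    for d in range(DIM):
        theta[d] += ALPHA / (POP * SIGMA) * sum(eps[i][d] * norm[i] for i in range(POP))
print([round(t, 2) for t in theta])
```

Note that only scalar rewards cross between evaluations, which is why this parallelizes so well across workers, and also why backprop wins whenever it's available: each perturbation yields one number, versus a full gradient per sample.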
Plenty of work on it, but it's early days and we need massive, massive compute power.