2. I don't believe scala-native was a good idea in the first place, given all the Java-interop boilerplate already present. C/JVM interop design conflicts are very hard to abstract properly. It would have been nicer to adopt some WASM-compatible IR instead of MLIR/LLVM lock-in.
3. WASM-first is a very viable AOT+PGO option. It also brings new opportunities for remote execution and some interesting architectural approaches, like automagically splitting a monolith into microservices on the fly by estimating communication overhead and performing basic discrete optimizations.
So, Scala is still fun, just missing a lot of business opportunities; it's just that Odersky decided to take another spin at EU grant acquisition by developing a new language.
I personally don't think that dotty was "good enough" to roll out last year, and overall project traction and cash-flow directions don't look that promising either. I personally choose to call it EU budget laundering, because it's really puzzling to me how exactly 60 million euros in grants was not enough to make Dotty stable. If no one audits, then no one cares about it; or how does laundering and embezzlement work nowadays?...
From a language-design standpoint, there are three things which make Scala obsolete:
1. No proper formal verification - although there are things like Stainless (stainless.epfl.ch), and it would have been possible to adopt zero-GC allocation during codegen by adopting the Calculus of Constructions (CoC), similarly to Neut (github.com/vekatze/neut).
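To make the Stainless point concrete, here is a minimal sketch of contract-style verification in plain Scala. `require` and `ensuring` are ordinary `scala.Predef` members that check at runtime; Stainless takes the same annotations and attempts to discharge them statically. The function and its bounds are invented for illustration, not taken from the Stainless docs.

```scala
// Sketch of Stainless-style contracts, expressed in plain Scala.
// Stainless would try to PROVE these statically; vanilla Scala
// merely checks them at runtime via Predef.require/ensuring.
object Contracts {
  def sum(n: Int): Int = {
    require(n >= 0 && n < 10000) // precondition: keep recursion bounded
    if (n == 0) 0 else n + sum(n - 1)
  } ensuring (res => res >= n)   // postcondition: sum(n) is at least n

  def main(args: Array[String]): Unit = {
    println(sum(4)) // 4 + 3 + 2 + 1 + 0 = 10
  }
}
```

The appeal is that the verified program is still just Scala: drop the tool and the contracts degrade gracefully into runtime assertions.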
2. No proper support for protocol definitions and IDLs - nowadays efficient serialization (marshaling) defines software reliability. All these JSON-y/Protobuf-y/FlatBuffers-y things are really gaining traction, although not a single one of them is something I'd call scalable.
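To illustrate what IDL-driven marshaling buys over free-form JSON, here is a sketch of the fixed-layout, length-prefixed binary encoding that IDL compilers generate from a schema. The `User` record and its wire layout (8-byte id, 4-byte length, UTF-8 name) are invented for this example, not any real IDL's format:

```scala
import java.nio.ByteBuffer
import java.nio.charset.StandardCharsets

// Hypothetical record; in an IDL workflow this shape would be
// declared once in a schema and the codec generated from it.
final case class User(id: Long, name: String)

object Codec {
  // Wire layout: [id: 8 bytes][name length: 4 bytes][name: UTF-8 bytes]
  def encode(u: User): Array[Byte] = {
    val name = u.name.getBytes(StandardCharsets.UTF_8)
    val buf  = ByteBuffer.allocate(8 + 4 + name.length)
    buf.putLong(u.id).putInt(name.length).put(name)
    buf.array()
  }

  def decode(bytes: Array[Byte]): User = {
    val buf  = ByteBuffer.wrap(bytes)
    val id   = buf.getLong()
    val len  = buf.getInt()
    val name = new Array[Byte](len)
    buf.get(name)
    User(id, new String(name, StandardCharsets.UTF_8))
  }
}
```

Because the layout is fixed by the schema, decoding is a handful of offset reads instead of a parse; that predictability is exactly what hand-written JSON handling never gives you.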
3. Adopting the Calculus of Constructions (CoC) and formal verification alongside bunched/separation logic (as in the F* language and Rust) should be enough to formally prove memory consumption, the amount of IO, and the respective computational overheads. Basically it would work similarly to CAP: pick either a small memory footprint or the bandwidth needed, and the amount of compute power and the respective latency will be calculated and formally proven.
The exact design of separation logics for multi-threaded apps is a complex subject, but I like what Azalea Raad has done with Concurrent Incorrectness Separation Logic (CISL) - it's something that would have allowed Rust, for instance, to drop its boxed types for RAII and a lot of the existing sync primitives (Arc, Barrier, Condvar, PoisonError, etc.).
But who am I to talk about that... I never boiled in the Sciency Kettle and played by the Academic Tribe's rules, spending half of my life just copy-pasting generic paperwork from here and there, filling up the gaps by exploiting Kenyan/Nigerian students on forced contract terms, with the usual threats, IP extortion, and common worker-contractor misclassification.
Everything professorial and academia-related looks corrupt to me nowadays.
2. The upstream fight has always followed the cash flow. If you have so-called "problems", you either act yourself or let other people do something about it, in a way that won't result in more long-term issues. Deliberate Detraction IS a Sign of Corruption and Abuse of Power.
3. An ambiguous Point of Conflict may not be a Conflict at all, but just an outcome of a Lack of Proper Communication and Transparency. Detraction is Accountable; social dynamics and social incline with explicit or implicit motives are Accountable as well. The lack of an Effective Code of Conduct, and the absence of Proper Punitive Action for everyone, will cause Authoritarianism (or just the Genocide of Engineering Thought).
I do feel bad about the state of Linux kernel development, but we'll either have to move on and learn from This Mistake, or do something about it.