I'm so glad I was taught Java at Macquarie University back in 1998. For the past 20 years I've had a career built on a solid API that doesn't change every 2 years like some flavour-of-the-month Javascript framework.
Even on the client, where Java has lost to Javascript, I'm finding it more enjoyable to add features to my 15-year-old SWT app [0] rather than dealing with the multiple layers of abstraction that are Javascript+CSS+DOM (+ maybe Electron).
Personally I think it's a shame Sun dropped the ball with client Java - if they had chosen SWT over Swing and provided a minimal JVM then maybe Java Web Start would have beaten Javascript web apps. It's also a shame Sun sold Java to Oracle - Google would have been a better steward, and probably would have been willing to pay more for the Java parts of Sun.
I'm now trying Dart to develop a few Flutter apps. It's no doubt a better language, but not that much better - I think Flutter would have been more successful if it was built on Java.
The times of ever-changing JavaScript frontend frameworks are long behind us (and, arguably, React has won for MVW-style browser apps). The core node.js web-serving APIs (expressjs and the core http request API, which expressjs middleware forwards and decorates) have been stable since node.js v0.1, or at least since 2015, and are infinitely better than Java's servlet, JSP, and taglib APIs (web.xml/jetty-config.xml, anyone?).
The flip side of Java's stability is stagnation. On the server-side, customers use mostly Spring/Spring Boot these days to make Java's overengineered stack usable, using even more over-engineering and metaprogramming. Same with maven/gradle, which turns what should be a developer-focussed build system into an enterprise XML mess.
SWT? I'm glad it works for you. Last I heard, IBM had pulled out of developing Eclipse and SWT/RCP over ten years ago. In the one project where we used it, the app looked like an IDE, and customers hated it, then cancelled it. Designed as wrappers for OS-native GUI controls, it has even less of a place in a browser than Swing had. Last I looked, of the Java rich-client frameworks, only Swing has been updated with HiDPI support, whereas JavaFX and SWT haven't. Nobody is using any of those (with the exception of experimental JFX apps) for new apps in this decade.
You can call Spring (Boot) over-engineered. I call it feature-rich and extensible.
Whereas in the modern JS world you start out with a lean project and then add small libraries from NPM that all work slightly differently for every feature you need, in the Spring world, where the framework has matured for 10+ years, most of the things you will need are in the box or are available as external libraries that all work against the same defined interfaces.
Spring Boot with Java 11 or Kotlin is still my preferred backend stack for that reason. It's solid, has aged well and makes me super productive.
Haha! Funny man! I only do JavaScript occasionally. Most of the time I do Python and Swift.
Even stuff in “core” dependencies like webpack/yarn/npm/npx/nvm/babel changes a few times a year.
Right now I'm doing a React Native app and a browser extension. While JavaScript/TypeScript is fun to write, the time you have to spend fixing the tools is just horrible. Wiping the cache, rebooting, switching package versions... It makes the overall experience cumbersome.
I would not call something stable if it manages to work 2 whole years.
> On the server-side, customers use mostly Spring/Spring Boot these days to make Java's overengineered stack usable, using even more over-engineering and metaprogramming.
There was definitely an era of that, but over the past decade or so it's been acknowledged as a problem, and there's been a lot of effort put into making things simpler and more vanilla. Modern Spring is much much closer to plain old code. (Of course it's still possible to write in the XML/setter injection/afterPropertiesSet style, and always will be - that's backward compatibility for you - but it's very possible to migrate existing applications to a better style in place, which very few platforms manage).
> Same with maven/gradle, which turns what should be a developer-focussed build system into an enterprise XML mess.
Maven is pretty close to the perfect build system, it's just taken a while for knowledge and practices to catch up with it. All the stuff people love and praise Cargo for, Maven has been doing for 15+ years.
Bad decisions can be made with any technology. Java has a lot of crufty legacy APIs because, well, it has a legacy. I gave up on JSPs years ago in favour of Freemarker. I now use Vert.x rather than servlet containers, which gives me a node.js-style environment but with access to the Java ecosystem and basic things like mature logging frameworks, as well as more advanced features (oh, and multithreading in a single process).
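For anyone who hasn't seen it, the HTTP side boils down to something like this minimal sketch (handler body and port are just for illustration):

    import io.vertx.core.Vertx;

    public class HelloVerticle {
        public static void main(String[] args) {
            // One event-loop-driven HTTP server, node.js-style, but on the JVM.
            Vertx vertx = Vertx.vertx();
            vertx.createHttpServer()
                 .requestHandler(req -> req.response()
                                           .putHeader("content-type", "text/plain")
                                           .end("hello from the event loop"))
                 .listen(8080);
        }
    }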
Despite Spring being Enterprise it still scales down to simpler apps pretty well due to its modularity.
These can be built into single-jar apps that are easy to build and deploy and, combined with a database migration library, maintain their own database schemas. Very, very usable.
Yes, I probably would not use Java on the frontend unless it was for a technical audience.
Spring and other frameworks may be popular, but the base servlet API with DB connection pooling is enough. This approach feels much lighter than using frameworks. Not sure where the 'over-engineered' view comes from. I'd call that 'sane'.
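To make that concrete, a bare endpoint in this style is roughly the sketch below -- just the servlet API plus a container-managed pool; the JNDI name, path and query are made up:

    import java.io.IOException;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import javax.annotation.Resource;
    import javax.servlet.annotation.WebServlet;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import javax.sql.DataSource;

    @WebServlet("/users/count")
    public class UserCountServlet extends HttpServlet {

        // Connection pool configured in the container; "jdbc/mainDb" is a placeholder JNDI name.
        @Resource(name = "jdbc/mainDb")
        private DataSource dataSource;

        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
            resp.setContentType("text/plain");
            try (Connection con = dataSource.getConnection();
                 PreparedStatement ps = con.prepareStatement("SELECT count(*) FROM users");
                 ResultSet rs = ps.executeQuery()) {
                rs.next();
                resp.getWriter().println("users: " + rs.getLong(1));
            } catch (SQLException e) {
                throw new IOException(e);
            }
        }
    }

No framework, no DI container, and it's obvious what every line does.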
People complain about the verbosity of Java, but how much time do you spend learning a new framework and all of the gotchas? At the end of the day it seems easier to just write the boilerplate. All of that verbose code means you got exactly what you asked for.
JVM projects never surprise me. When there is a problem, debugging is a dream.
Any app reaching the complexity of what's being done in Java will look the same, no matter what platform you choose. I haven't yet seen large node.js deployments with MLOCs and hundreds of developers, but I'm pretty sure they can easily become a much bigger mess - the tools and the libraries are immature, there's a lack of enterprise SDLC patterns, etc., which creates a lot of obstacles. Spring is becoming too enterprisey these days, that's true, but it's still fit for use in modern architectures - but only if you understand how to utilize all of its capabilities.
> On the server-side, customers use mostly Spring/Spring Boot these days to make Java's overengineered stack usable, using even more over-engineering and metaprogramming.
Could you elaborate on the over-engineering part? There are many modern frameworks that are much lighter-weight than Spring/Spring Boot, and with the introduction of lambdas in Java, things are even more straightforward.
One of my Java guilty pleasures is pulling in some 14 year old library and it just works. Absolute gold for a company with a lotta old code sitting around.
I also do a good bit of frontend and holy crap it's exhausting. That's the main reason I push hard for Java backends. Backends tend to stick around about a decade longer than you would like, while frontend code is due for a rewrite every 3 years.
>Google would have been a better steward, and probably would have been willing to pay more for the Java parts of Sun.
They didn't, and Google didn't want to pay a single dollar to Sun for Java, as James Gosling has mentioned multiple times. I don't understand why people are still painting Google as a saint after all these years.
Actually they didn't want to pay a single dollar to Sun because of licensing reasons, not because they didn't want to buy Java.
Instead they recreated the Java APIs by pulling in Apache Harmony, and they thought they were in the right, or at least in a fair-use position (they still think they are).
But I agree, I'm not sure the direction of Java would have been better if Google had had the stewardship.
Just had this conversation with a co-worker today.
Java is a stable API, but it also doesn't evolve. The tradeoff is that you get a program guaranteed to work no matter the upgrade vs. being able to build better tooling.
React changes every year or 2. It is exhausting. But you get way better patterns and some things that drastically improve productivity.
Java gives you a flexible object-oriented language and an incredible set of libraries. It's got an archiver for artifact and dependency management. It now has lambdas and FRP-style stream programming. I'd say it's kept up pretty well with modern fashion, while it's been doing other things far longer than newer, more popular languages. Take Swift, now my daily driver (which I love so much compared to ObjC, which I moved to after Java)... an example of something Java had way before Swift is "Protocol Oriented Programming". Also, Swing's GridBagLayout predates iOS' AutoLayout and CSS (edit: don't know where I was going with CSS haha...). Everything old is new again, and I'm sure Java reinvented a few wheels of its own.
Also keep in mind that React is a library, not a language. If you want to compare things, compare Javascript to Java, or the Java stdlibs to React. Maybe that's what you meant when you said Java.
IMO, Java and its libraries are nicer to work with solely due to its static nature and OO design. It is worlds apart from a scripting language when you need to refactor, or even just understand, legacy code. And if you can't even run tests on that legacy code anymore, like a React project over a year or two old?
What are the better patterns and productivity boosts you get from React vs. a Java ecosystem?
If you develop, not if you maintain. This is a major reason we stay away from JS frameworks, after burning our hands implementing an AngularJS app that was obsolete a year later.
Doesn't evolve _quickly_. Which, as you say, is part of the tradeoff of giving a trustable guarantee that the tooling you use will continue to work for a long time.
I've been working professionally in Java for almost 20 years. I can't remember the last time I saw someone use features introduced after 1.6. The time API in 1.8 was nice. I've never seen anyone use lambdas in production code. I'm sure people do use newer stuff, but I think it's the minority. So in practice it's even more stable than it is in the headlines.
You have probably missed the last 5 years. From streams, to closures, to default methods, to functional interfaces, to local type inference, to new Process APIs, to modules, to unsigned arithmetic, to new date/time APIs, the http client, 2 new GCs, a shell REPL, etc... with more to come, and new releases every 6 months...
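For anyone catching up, here's a compressed tour of a few of them in roughly Java 11 syntax (the URL is just a placeholder):

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.util.List;
    import java.util.stream.Collectors;

    public class Post16Tour {
        public static void main(String[] args) throws Exception {
            // Local variable type inference (10), streams and lambdas (8), List.of (9)
            var names = List.of("ada", "grace", "barbara");
            var shouting = names.stream()
                                .map(String::toUpperCase)
                                .collect(Collectors.joining(", "));
            System.out.println(shouting);

            // Standard HTTP client (11); example.org is a placeholder
            var client = HttpClient.newHttpClient();
            var request = HttpRequest.newBuilder(URI.create("https://example.org/")).build();
            var response = client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode());
        }
    }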
I love Dart and Flutter. I actually think it was a good idea not to use Java. Dart is much more modern and nicer to work with, in my opinion. There are many reasons, and I won't go into them here.
The biggest-ish project I've built with Flutter is an alternative to Nissan's ConnectEV app; it's used with the electric vehicles Nissan Leaf and Nissan E-NV200. You can see statistics, battery status and control climate control and charging of your vehicle.
My alternative is called "My Leaf" on the Play Store and "My Leaf for Nissan EV" on the App Store. It's completely open source; https://gitlab.com/tobiaswkjeldsen/carwingsflutter
It consists of the main Flutter app and a Dart library for communicating with Nissan's API.
I am currently building my first mobile app on Flutter and am quite happy I chose that technology! Developing in Android Studio + Flutter does not feel like the chore I remember developing Android apps to be.
The biggest barrier to entry for me is the build/dependency system, Maven. And I never understood Ant. I did like the language when I played around with Java 8. Streams are amazing.
Maven is an incredible system. It can do so much and yet, even though I've set up countless apps and services, I can never use it without looking literally everything up.
Maven and spring are the worst parts, for me, when dealing with Java. I don't mind the language at all.
The success of JavaScript is hugely due to the absence of dependency hell (not being able to upgrade some lib to a new version because it would conflict with some other lib that requires it at its current version, and you can't have both of them loaded at the same time).
The most interesting new feature I think is the Shenandoah GC. The summary from [1]:
"Add a new garbage collection (GC) algorithm named Shenandoah which reduces GC pause times by doing evacuation work concurrently with the running Java threads. Pause times with Shenandoah are independent of heap size, meaning you will have the same consistent pause times whether your heap is 200 MB or 200 GB."
The original algorithm was published in 2016 [2]. It consists of 4 phases: initial marking, concurrent marking, final marking, and concurrent compaction.
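If you want to kick the tires (assuming your JDK 12 build actually includes the experimental collector -- not every vendor build does), it's opt-in via flags; MyApp is a placeholder:

    java -XX:+UnlockExperimentalVMOptions -XX:+UseShenandoahGC -Xlog:gc MyApp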
GC pauses have been one of the major barriers to using garbage collected (read: higher-level) languages for game development. This could open up the JVM for games, which could have some exciting implications.
Once upon a time the barrier was the existence of virtual method tables and dynamic method dispatch.
GC use can be optimized in high-level languages with support for value types (which are still missing in Java, though), and it is not like every game needs to be the next Fortnite.
Honestly, if you need real performance when writing a game, there's no reason not to use the language that's most popular for your engine of choice. Mostly that's going to be C++ with Python scripting.
This has already happened, in a way. Unity uses a ton of C# for scripting. While C# is somewhat nicer to work with, this low-pause GC makes Java a compelling option for moving much more logic into the VM language.
From the tests I've seen on beta versions, no difference.
The performance increase comes from a different algorithm plus full multithreading, not from collecting garbage less often, which is a common tactic to reduce time spent in GC.
This thread reads eerily like threads about Go's low latency GC from 2015 and how 10ms isn't good enough and throughput will be impacted and on and on. Three years later Go treats any 500 microsecond pause as a bug as Go continues to focus on throughput. Shenandoah is being put together by some very very smart people and I'm optimistic that the only thing that stands in the way of Java reaching the "500 microseconds is a bug" level is engineering hours and resources. More kudos for this achievement are in order.
Is there a guide to the various GC algorithms? Suppose I'm running an app where pause times don't matter, and I want throughput or reduced memory usage. How much more of this will I get by switching to another GC algorithm? 2%? 20%? 40%?
Tangent: Go's GC (non-generational, non-copying, concurrent mark and concurrent sweep) and the improvements it saw from 1.5 -> 1.8: https://blog.golang.org/ismmkeynote
The mutator/pacer scheme, the tri-color mark scheme, using rw-barriers during the sweep phase are interesting to contrast between the two GCs.
It should help performance on almost any JVM app, but much more on large heaps.
GC work is multi-threaded and total GC time is lower, but most importantly the "pause time" is low and nearly constant (it depends on root-set size, not heap size).
The old GCs all scaled pause time roughly linearly with heap size. Apps that created a lot of garbage would have all their threads stopped for large amounts of time.
Now, essentially, it doesn't matter how much garbage you create. This is awesome because you used to carefully watch how many allocations you made to avoid bad GC times. Now it doesn't matter. Make as much trash as you want and the GC will deal with it.
Interesting that they explicitly exclude time-to-safepoint from that improvement. In my experience that is usually the most expensive part of a stop-the-world pause.
Switch expressions prepare the way for pattern matching (https://openjdk.java.net/jeps/305), instead of `instanceof` checks and casts.
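For reference, here is what the JDK 12 preview switch expression looks like next to the instanceof-and-cast chain that JEP 305 pattern matching is aimed at; Circle/Square are made-up types, and the switch form needs --enable-preview on 12:

    import java.time.DayOfWeek;

    public class SwitchDemo {
        // Arrow-form switch expression (JEP 325, preview in 12): an expression,
        // no fall-through, and the compiler checks the enum is fully covered.
        static int numLetters(DayOfWeek day) {
            return switch (day) {
                case MONDAY, FRIDAY, SUNDAY -> 6;
                case TUESDAY                -> 7;
                case THURSDAY, SATURDAY     -> 8;
                case WEDNESDAY              -> 9;
            };
        }

        // The instanceof-and-cast style that pattern matching is meant to replace.
        static class Circle { final double radius; Circle(double r) { radius = r; } }
        static class Square { final double side;   Square(double s) { side = s; } }

        static double area(Object shape) {
            if (shape instanceof Circle) {
                Circle c = (Circle) shape;
                return Math.PI * c.radius * c.radius;
            } else if (shape instanceof Square) {
                Square s = (Square) shape;
                return s.side * s.side;
            }
            throw new IllegalArgumentException("unknown shape: " + shape);
        }

        public static void main(String[] args) {
            System.out.println(numLetters(DayOfWeek.WEDNESDAY)); // 9
            System.out.println(area(new Circle(2.0)));           // ~12.57
        }
    }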
But (in my naive opinion) double dispatch seems a more elegant and Java-y solution, i.e. polymorphism on argument classes, so different methods are invoked for different runtime classes of the object (instead of using the compile-time type of the variable).
The switching could be optimised, as ordinary polymorphism is in the JVM.
Sure, you'd wreck legacy code if you just introduced it, but there's surely a backward-compatible way to do it that isn't too awkward.
That kind of multiple dispatch would be far more disruptive at several levels, and still wouldn’t really get you where you want to go without a lot more compiler and JIT magic.
Sure, you could match on types, but you wouldn't be able to extend that to destructuring patterns, or regexps as patterns on strings, or so many other ways patterns may be extended in future releases.
Anybody know how the Graal project ties in with all of this? Is Oracle effectively developing 3 different JVMs (OpenJDK, Oracle JDK, GraalVM)? Or is there some sort of convergence plan?
From what I understand, Graal has made a lot of headway with language interop, as well as a handful of specific memory optimizations and native compilation, but overall it lags behind HotSpot in pure throughput/latency performance. It would be really cool if we could get the best of both worlds.
Well, the relationship between OpenJDK and Oracle JDK is that OpenJDK is the FOSS base and Oracle JDK is the proprietary structure built on top, kinda like Chromium and Chrome.
As for GraalVM, I don't think it's meant to be mixed in with those 2 at all. From their website (https://www.graalvm.org/), it sounds to me as if it's a completely different project that shares none of the goals the other 2 JDKs have:
> GraalVM is a universal virtual machine for running applications written in JavaScript, Python, Ruby, R, JVM-based languages like Java, Scala, Kotlin, Clojure, and LLVM-based languages such as C and C++.
> GraalVM removes the isolation between programming languages and enables interoperability in a shared runtime. It can run either standalone or in the context of OpenJDK, Node.js, Oracle Database, or MySQL.
Graal is a JIT for HotSpot (and other things); you can already use it by turning on experimental features and enabling JVMCICompiler. GraalVM is that packaged up with the Truffle framework (and languages written for it), and SubstrateVM, which allows programs to be ahead-of-time compiled and linked as executables with a minimal VM.
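Concretely, on a JDK 10/11 build that ships the Graal compiler module, switching the top-tier JIT over is just flags (MyApp is a placeholder):

    java -XX:+UnlockExperimentalVMOptions -XX:+EnableJVMCI -XX:+UseJVMCICompiler MyApp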
Oracle JDK is just the Oracle(TM)-supported branding of OpenJDK now. As of Java 11, there are no feature-parity or real technical differences between OpenJDK and Oracle JDK, just support/EOL differences from vendors. So that's pretty simple.
GraalVM is... complicated. There are a few parts to it:
1) An advanced JIT compiler, written in Java, and distributed as ordinary maven/jar packages, etc. You can use it yourself if you want. Interestingly, this compiler package ("graal") can be used for the JVM itself, and it will be used to compile the Java code that is run by the JVM. The "JVMCI" feature allows you to plug third-party compilers into the JVM and replace HotSpot's JIT, and Graal (the library) is such a replacement. You can use this compiler on ordinary OpenJDK 10+ with a special startup invocation.
2) Truffle, which is another library for writing programming language interpreters. It does a lot of magic to make them fast. It uses the graal JIT compiler to do this, so it depends on the previous library.
3) A bunch of programming language implementations under the umbrella: Python, JS, etc. These are implemented using Truffle. By using Truffle, these language implementations magically share an object model (so they can interoperate), and they get a JIT compiler, based on Graal, for "free". This means the languages can interoperate and JIT together cleanly (the compiler can inline JavaScript into Ruby!)
4) If you use Graal as the JVMCI compiler (i.e. for your host Java code), and you use Truffle-based languages -- Graal can extend benefit #3 to the host Java code itself. This effectively means the JIT compiler can inline and optimize code across every language boundary.
5) SubstrateVM, which is a tool to turn Java programs into native exe binaries. The intent is that you use SVM on the programming language interpreters, to produce interpreter binaries that look "normal" to the user. These binaries run on a custom, non-JDK/HotSpot runtime. The Java code is not JITed, but interpreted code -- Ruby, Python, JS, etc -- is. (This means you have benefit #3, but not #4). SubstrateVM uses a custom generational garbage collector (written in Java!)
6) The "GraalVM distribution", from graalvm.org. This is a combination of all the above pieces together as a sanctioned package. This package uses JDK8 as the base virtual machine.
7) The GraalVM distribution comes in both a "community" and "commercial" edition, which do have technical/feature differences.
Here's the thing: you can use everything from 1-4 with an "ordinary" OpenJDK today if you know what you're doing, and you don't need a special JDK build. SubstrateVM might also be workable, but I don't know.
Points 6-7 actually mean that there is generally a difference between "GraalVM the platform" and "Graal the compiler". Notably, while OpenJDK no longer has feature gaps vs the commercial OracleJDK, GraalVM does, which I find a very weird choice on Oracle's behalf and unfortunate, but maybe it will change one day.
If you want to use one of the new low-latency garbage collectors (ZGC/Shenandoah) with Graal, I don't think that's possible as of right now: garbage collectors need to interface with the JIT compiler, but ZGC does not support the JVMCI interface (it will explode on startup), and Shenandoah doesn't either, I believe (but maybe this changed). This will almost certainly be fixed in the future, of course.
The worst thing Oracle can do is deliver a bad feature. Java generics turn 15 this year. It's better to delay such a radical change than to be stuck with it for the next few decades. Oracle is in no hurry to do that. Maybe they get it together in time for the next LTS release (Java 14).
I get that Java needs it because some uses need the performance gain in terms of less RAM and less pointer indirection. I'm glad library authors of things like servers or caches will be able to improve performance without weird tricks like unsafe byte buffers and native memory leaks.
I suspect I won't ever use value types in my code though unless it's drop in to refactor from AnyVal=>AnyRef (or w/e the java equiv is).
* Does a method accepting a value-type parameter receive it by value or by ref?
* How does this play with GC? Can I get a dangling ref with a value type?
* How do value types play with inheritance, e.g. will it be like C# structs, where they get boxed any time they're treated as a ref?
* How safe will it be to convert a class from being value-type based to Object-based (whatever the AnyRef equivalent is)? Will I need to inspect all methods that deal with it now?
This can probably be resolved with a style guide, the way C++ can be mostly sane if you're starting a new project today with an aggressive style guide, but I just appreciated not thinking about this and counting on escape analysis + GC + HotSpot tricks (like monomorphizing code paths with some counters) to be good enough.
I share your frustration, but this new GC should drastically reduce the issues with creating tons of garbage. There's still the infamous "allocation wall", and value types wouldn't solve that either, but they would make it a lot better.
GC is only part of the issue. Value types allow much finer control over locality, something completely lacking in Java now. And without that, you give up a lot of performance.
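A rough sketch of the locality point: today an array of small objects is an array of pointers to scattered heap nodes, which is why hot loops get hand-flattened into parallel primitive arrays -- the flat layout that value types would let you express without giving up the abstraction. Class and method names here are made up:

    public class LocalityDemo {
        // Idiomatic today: Point[] is an array of references; every element is a
        // separate heap object, so the loop chases a pointer per element.
        static final class Point {
            final double x, y;
            Point(double x, double y) { this.x = x; this.y = y; }
        }

        static double sumXBoxed(Point[] pts) {
            double sum = 0;
            for (Point p : pts) sum += p.x;   // one dereference per element
            return sum;
        }

        // The common workaround: flatten into primitive arrays so the data sits
        // contiguously in memory and reads are cache-friendly.
        static double sumXFlat(double[] xs) {
            double sum = 0;
            for (double x : xs) sum += x;
            return sum;
        }

        public static void main(String[] args) {
            Point[] pts = { new Point(1, 2), new Point(3, 4) };
            double[] xs = { 1, 3 };
            System.out.println(sumXBoxed(pts) + " " + sumXFlat(xs));
        }
    }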
No feature is meant to be in a release. A feature will be done when it’s done, though features may be split so that some functionality can land earlier.
In Java, the concept of exceptions is incompatible with the concept of parametrized types.
During compilation, type parameters are erased. "T" becomes "Object"; "T extends Something" becomes "Something"; assignment becomes type conversion. But at the end it works correctly. If you try to make something that wouldn't survive the type parameter erasure -- such as having two methods with the same name, one of which accepts "List<String>" and another accepts "List<Integer>" as their input -- it will be a compile-time error, because you can't have two methods with the same name and the same parameter "List".
But imagine that you have a type "MyException" extending RuntimeException, and your code has "catch (MyException)". After compilation, this would become "catch (RuntimeException)", which is wrong, because it would also catch other RuntimeExceptions, and it is not supposed to. But if you make this construction a compile-time error, then... the whole thing becomes useless, because what's the point of having an exception type you cannot catch.
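Both halves of that are easy to see in a toy class (hypothetical code; the offending lines are left commented out so the rest compiles):

    import java.util.List;

    public class ErasureDemo<T extends RuntimeException> {

        // Rejected: both methods erase to handle(List), so they collide after erasure.
        // void handle(List<String> xs)  {}
        // void handle(List<Integer> xs) {}

        void run(Runnable body) {
            try {
                body.run();
            // Rejected: a type variable may not appear in a catch clause, precisely
            // because after erasure it would mean catch (RuntimeException).
            // } catch (T e) {
            //     ...
            } catch (RuntimeException e) {  // what you have to write instead
                throw e;
            }
        }
    }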
On further thought I understood that streams are usually lazy, and the real exception would be thrown not at map, but later (probably in collect or a similar method). So it would require propagating that E generic parameter to almost every function (and the Stream type signature). Also, it won't work well with multiple checked exceptions. So yeah, not that easy.
Or in FunctionalInterface, or in anonymous functions.
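Right -- Function.apply (the functional interface behind Stream.map) declares no checked exceptions, so the first version below doesn't compile and you end up laundering the exception; parse() is a made-up helper:

    import java.io.IOException;
    import java.util.List;
    import java.util.stream.Collectors;

    public class StreamsVsChecked {
        // Hypothetical parser that throws a checked exception.
        static int parse(String s) throws IOException {
            if (s.isEmpty()) throw new IOException("empty input");
            return Integer.parseInt(s);
        }

        public static void main(String[] args) {
            List<String> raw = List.of("1", "2", "3");

            // Doesn't compile: the lambda passed to map() may not throw IOException.
            // List<Integer> nums = raw.stream().map(s -> parse(s)).collect(Collectors.toList());

            // The usual dance: catch inside the lambda and rethrow unchecked.
            List<Integer> nums = raw.stream()
                                    .map(s -> {
                                        try {
                                            return parse(s);
                                        } catch (IOException e) {
                                            throw new RuntimeException(e);
                                        }
                                    })
                                    .collect(Collectors.toList());
            System.out.println(nums);
        }
    }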
Makes HashMap computeIfAbsent much less useful than it should be. It's impossible to use a function in there that throws a checked exception (all I want is for the exception to propagate out of computeIfAbsent for the parent method to deal with).
Best option I've found is to wrap it in an unchecked exception, then unwrap it in the parent method, which is just .... yuck.
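The wrap-and-unwrap dance, roughly (loadUser is a placeholder for whatever throws the checked exception; UncheckedIOException conveniently keeps the IOException as its cause):

    import java.io.IOException;
    import java.io.UncheckedIOException;
    import java.util.HashMap;
    import java.util.Map;

    public class ComputeIfAbsentDance {
        private final Map<String, String> cache = new HashMap<>();

        // Placeholder for a lookup that throws a checked exception.
        private String loadUser(String id) throws IOException {
            return "user-" + id;
        }

        String getUser(String id) throws IOException {
            try {
                return cache.computeIfAbsent(id, key -> {
                    try {
                        return loadUser(key);              // can't throw IOException here...
                    } catch (IOException e) {
                        throw new UncheckedIOException(e); // ...so smuggle it out unchecked
                    }
                });
            } catch (UncheckedIOException e) {
                throw e.getCause();                        // ...and unwrap it for the caller
            }
        }
    }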
If they are officially bad, remove them from the language. It's a backwards-compatible change. Currently we are in a weird situation where parts of the standard library don't work well with each other. It's like making an iPhone without a USB-C cable and a Macbook without a USB-A port.
I find checked exceptions interesting. They happen to be the only example of an effect system in a mainstream language. Anders Hejlsberg formulated the main argument against them here [1]
Basically the problem is that checked exceptions are infectious, and intermediary code needs to annotate all the exceptions of the code that it calls, which is a chore. Lucas Rytz states in his PhD thesis [2] that the problem is not with checked exceptions themselves, but that in the Java implementation 'not mentioning an exception means it won't happen'. He proposes a system where the default is 'any exception', with the ability to turn that off with a compiler flag, and states that the developer experience of that would be much better.
This may become possible in Dotty Scala using the effect system based on implicit functions. But that's in 'proposed' status.
That's what languages like Rust do, with a Result return type that is part of the core language.
Sometimes I wish Java had one too to complement Optional. I wonder what the argument against that is? Too much confusion with the existing Exception system?
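For what it's worth, nothing stops you from writing a small Result in user land today -- a hypothetical sketch, not anything from the JDK:

    import java.util.Objects;
    import java.util.function.Function;

    // Minimal Result: holds either a value or an error, never both.
    public final class Result<T, E extends Exception> {
        private final T value;
        private final E error;

        private Result(T value, E error) { this.value = value; this.error = error; }

        public static <T, E extends Exception> Result<T, E> ok(T value) {
            return new Result<>(Objects.requireNonNull(value), null);
        }

        public static <T, E extends Exception> Result<T, E> err(E error) {
            return new Result<>(null, Objects.requireNonNull(error));
        }

        public boolean isOk() { return error == null; }

        public <U> Result<U, E> map(Function<? super T, ? extends U> f) {
            if (isOk()) {
                return Result.<U, E>ok(f.apply(value));
            }
            return Result.err(error);
        }

        public T orElseThrow() throws E {
            if (isOk()) return value;
            throw error;
        }
    }

Without pattern matching or sealed types it stays clunkier than Rust's version, which is probably part of why the JDK stuck with Optional plus exceptions.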
[0] https://www.solaraccounts.co.uk
Also, Gradle doesn't use XML.
All our customers doing Java web projects are either on JEE, or on a CMS platform running on top of servlets and JEE-related JSRs, like Liferay and AM.
That and a few page reloads never killed anyone.
Worse: Oracle bought Sun.
"Add a new garbage collection (GC) algorithm named Shenandoah which reduces GC pause times by doing evacuation work concurrently with the running Java threads. Pause times with Shenandoah are independent of heap size, meaning you will have the same consistent pause times whether your heap is 200 MB or 200 GB."
The original algorithm was published in 2016 [2]. It consists of 4 phases: initial marking, concurrent marking, final marking, and concurrent compaction.
[1] http://openjdk.java.net/jeps/189
[2] https://dl.acm.org/citation.cfm?id=2972210
There is such a thing as reference counting :)
(This is why I'm excited for ARC / Swift on the server...)
"35C3 - Safe and Secure Drivers in High-Level Languages"
https://www.youtube.com/watch?v=aSuRyLBrXgI
Each JVM implementation (Azul, IBM, OpenJDK, PTC, Aicas, ...) has its own collection of GC algorithms.
And their behaviour depends pretty much on the application as well.
Well, one could build and run Shenandoah GC with JDK8 and JDK11. Ref: https://wiki.openjdk.java.net/display/shenandoah/Main#Main-B...
Here's a presentation on Shenandoah by one of the principal developers (focuses on the how): https://youtube.com/watch?v=4d9-FQZZoVA
Slides: https://christineflood.files.wordpress.com/2014/10/shenandoa...
Another presentation (focuses on the how, why, and when) at devoxx: https://www.youtube.com/watch?v=VCeHkcwfF9Q
Slides: https://shipilev.net/talks/devoxx-Nov2017-shenandoah.pdf
If you want an order-of-magnitude jump in performance, you need to see whether the IBM work on a Scala Native version of Spark manifests.
I expect this needs to be taken with a chunk of salt, since a smaller heap can only require so much GC...
Pause times. Not GC times. Shenandoah pauses to scan the root set only. The size of the root set doesn't grow with the size of the heap.
You can have a large root set and a tiny heap, or a tiny root set and a massive heap. They're entirely independent.
BONUS: goodbye visitor pattern!
Their initial motivating example was bare instanceof, but I now see they extend it.
How would destructuring fit this model?!
Twitter use Graal in production - I think it's about 13% faster for them on their real workloads. If you send a Tweet it's going through Graal.
http://openjdk.java.net/jeps/230
http://openjdk.java.net/projects/jdk/12/
Rest assured progress is being made.
https://jdk.java.net/valhalla/
The engineering problem is that they want to keep old jars running in a world of value types.
For example: https://rcoh.me/posts/cache-oblivious-datastructures/
[1] https://www.artima.com/intv/handcuffs.html [2] https://lrytz.github.io/pubsTalks/ (the link to his thesis doesn't work anymore). I cherry-pick some of it in my talk for a Scala meetup in Utrecht: https://www.slideshare.net/yoozd/effect-systems-in-scala-bey...
But my favorite is abusing the language using Lombok SneakyThrows. It just feels dirty using it. In a good way.