The year was 1997. The Red Hot Chili Peppers were musing on love and the motions of amusement park rides, Pathfinder landed on Mars, and Leonardo DiCaprio drew Kate Winslet as per one of his French associates. It was around this time that I first heard about a thing called “Java”, a fancy new language everyone was talking about. The word on IRC was that it was based on work Sun Microsystems had originally done for embedded software on set-top boxes and other smart appliances.
I jumped straight onto AltaVista to perform an automated hypertext transfer search for this jovially caffeine-related language. There were only 17 pages on the Internet at that time, and the first search result was…the Microsoft Java Virtual Machine. I downloaded this using my 56-kbps modem and started working through the examples. I’d been programming in C++ for several years at this point, and, as designed, the transition to Java was fairly painless. But wait…COM objects? I’m constructing Microsoft COM objects in Java? I swiftly realised this was Microsoft’s standard embrace, extend, extinguish manoeuvre.
I returned to AltaVista and found the real Sun Microsystems JVM. I wrote a little animated game. Small discs with pictures of Max Headroom and Neve Campbell spun around the screen. I put this project on the shelf and continued writing C++ for my day job. Neve Campbell never did respond to my merge request.
The following year I was approached by a friend who was looking to hire Java developers to work in the United States. I, of course, now an expert-level developer, was immediately hired. My long history with Java, for better or for worse, had begun.
Sun Microsystems went on to sue Microsoft for making an incomplete Java implementation. Microsoft would drop all development of their JVM.
Applets – Remember Those?
Java was incorporated into the Netscape Navigator browser and would manifest itself on web pages in the form of applets. An applet is a rectangular viewport, large or small, embedded in the page and backed by Java bytecode executing in a sandbox.
The Java sandbox could interact with the page’s document object model, but applets were more commonly used for trivial purposes: simple applets added ripple effects to images, while more complex ones implemented banking applications and other rich user interfaces. The power here was that you could build a reasonably complex desktop-style application and run it in a browser. Applets served their purpose at the time, as web browsers were not as capable as they are today. But applets would eventually meet the same fate as Adobe Flash, only years earlier. Web browsers favour open specifications, yet technologies such as Java and Flash open a trap door to a whole new world inside a magical component.
Steve Jobs was right to get Flash out of the browser, especially mobile ones. Applets presented some security issues, mainly involving parts of the APIs that touched the native layers, such as JDBC database integration. They could also grind your machine to a halt as they fired up a giant JVM simply to add that ripple effect to your MySpace page. The inbuilt graphics libraries gave you windows and buttons but nothing as fancy as the vector system of something like Flash. Eventually JVMs were extracted from web browsers altogether, bound with duct tape and shot into the sun.
Sun’s insistence that the “Write Once, Run Anywhere” concept be strictly enforced meant that they did something truly bizarre, and in my opinion profoundly naive (but unfortunately, not profoundly Neve). The original Java graphics system is called AWT, the Abstract Window Toolkit. On top of that, Sun built Swing for extra “advancededness”. Java has its own graphics components, buttons, windows, fonts and all the associated widgets. So when a Java desktop application opens a window on your operating system, it creates a borderless, undecorated native window and paints its own components, window frame and all.
This gave applications consistency across platforms…in the sense that they looked consistently bad. It also guaranteed that Java desktop applications would always look out of place in comparison to their native environment. Furthermore, apart from the obvious display differences, any optimisations or extra features that the local windowing environment gave to its windows and widgets were not inherited.
To make things even more comical, Sun created pluggable look-and-feel themes for their components, which mimicked each platform. So, for example, as Microsoft transitioned its desktop theme to the (then) new Windows XP look, Java apps would still display the old theme on new desktops. High-five.
When Eclipse began development, for whatever reason (I’m assuming performance), the developers decided that Java’s windowing environment was not up to the task and created their own bridge to the native windowing components of the platform. This would eventually be split out into a project called SWT, the Standard Widget Toolkit. I wrote apps using SWT. It made sense, and it worked delightfully.
It should be noted that the “emulate a platform on the platform rather than using the native features of the platform” approach is one taken by Flutter, Google’s new portable UI toolkit. So it seems I’m one of the few people who think this is not a sustainable idea.
More recently, Oracle developed JavaFX, a more capable graphics front end intended to replace Swing. It also contained JavaFX Script. Long story short, JavaFX is dead. Well, maybe not officially. But it never gained any of the traction Oracle had hoped for. Too little, too late. Build it and they will come. Or not.
Java’s True Home
But as we were to learn, the true home of Java was not the browser, or the desktop, or even the set-top box. It was the server.
Java was found to be a great way to develop server-side code. It was a C++-like object-oriented language, but with garbage collection and type safety. Did your request thread throw a random NullPointerException? Perhaps, but it wasn’t going to cause a general protection fault and bring down the whole server. Memory leaks? Not unheard of, but much less likely. Even something as simple as the package-and-classname-based file naming structure helped bring reason to large projects. The days of the 10,000-line C++ file were over (not that that stopped some developers from trying).
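To make that concrete, here is a minimal sketch of the containment property described above. The class and method names are hypothetical, not from any real server framework: the point is simply that a NullPointerException in one request is an ordinary, catchable object, and the process keeps serving.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch: a request loop in which one bad request throws a
// NullPointerException, but the "server" catches it and keeps going --
// no general protection fault, no downed process.
public class SafeRequestLoop {

    // Pretend request handler; a null payload triggers an NPE.
    static String handle(String payload) {
        return "echo:" + payload.toUpperCase(); // throws NPE if payload is null
    }

    // Serve each request in turn; a failure in one does not stop the rest.
    public static List<String> serve(List<String> requests) {
        List<String> responses = new ArrayList<>();
        for (String req : requests) {
            try {
                responses.add(handle(req));
            } catch (NullPointerException e) {
                // One failed request; the JVM and the loop survive.
                responses.add("error:bad-request");
            }
        }
        return responses;
    }

    public static void main(String[] args) {
        List<String> out = serve(Arrays.asList("ping", null, "pong"));
        System.out.println(out); // [echo:PING, error:bad-request, echo:PONG]
    }
}
```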
But of course Sun were clearly trying to kill Java with the early Enterprise Edition specifications. Bloated, clunky and providing questionable benefits, many developers who hadn’t drunk the Java EE Kool-Aid simply developed enterprise applications with the standard Java edition. It didn’t take long before viable alternatives appeared. Yes, I’m talking about Spring: the inversion of control framework known mostly for its notoriously large stack traces.
In fact, on February 14, 1990, the Voyager 1 spacecraft was 6.4 billion kilometres from Earth and, at Carl Sagan’s suggestion, engineers turned the spacecraft around to take a photograph of one of the longer Spring stack traces which exist in our solar system. The only thing preventing this story from being true is that Spring did not exist for another twelve years.
The ability to run different languages on the JVM had always been there, but it didn’t really gain much traction until Groovy (which was a Java extension anyway). Scala then emerged as another JVM language which gained some interest, but never to the degree of threatening the defences of the Java mothership. And in recent years Kotlin has appeared as a more serious contender.
Apples and Tangerines
Java has always been unfairly compared with scripting languages. JVM start-up times have never been as fast as some script interpreters, but that’s not really how you should be using Java anyway. Instead, it’s a language best suited to processes that will be running for a long time. In other words: servers.
Dynamically-typed languages such as PHP, Python, Perl and Ruby initially gained popularity for web server scripting. Script hosting is simple and inexpensive: initially it was as straightforward as dropping a script in the cgi-bin directory of your server. However, scripts needed to be re-interpreted and executed on a per-request basis. Consequently, the load capacity of a web server using scripts is dictated by its ability to concurrently run those scripts.
Java hosting, on the other hand, traditionally requires the pre-commitment of a running JVM. Non-shared Tomcat hosting, for example, requires an allocation of memory for your JVM, even if it is sitting idle most of the time. However, for sites with high request volumes, a warmed-up JVM can service many more concurrent requests than running parallel script interpreters, and the shared memory space allows efficient sharing of in-demand resources like database connections.
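The resource-sharing point above can be sketched in a few lines. This is a toy, with a made-up `Connection` class standing in for something genuinely expensive like a JDBC connection: every request thread borrows from one shared, pre-warmed pool inside the long-lived JVM, rather than each request paying the setup cost itself as a freshly spawned script would.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Hypothetical sketch of why a warmed-up JVM beats per-request interpreters:
// all request threads share one pool of expensive resources (here, fake
// "connections" created once at startup).
public class SharedPool {

    // Stand-in for an expensive resource such as a database connection.
    static final class Connection {
        final int id;
        Connection(int id) { this.id = id; }
    }

    private final BlockingQueue<Connection> pool;

    SharedPool(int size) {
        pool = new ArrayBlockingQueue<>(size);
        for (int i = 0; i < size; i++) {
            pool.add(new Connection(i)); // setup cost paid once, at startup
        }
    }

    // Borrow a connection, "use" it, and always return it to the shared pool.
    String serveRequest(String query) throws InterruptedException {
        Connection c = pool.take(); // blocks if all connections are busy
        try {
            return "conn-" + c.id + ":" + query;
        } finally {
            pool.put(c);
        }
    }

    public static void main(String[] args) throws Exception {
        SharedPool server = new SharedPool(2);
        System.out.println(server.serveRequest("select 1"));
        System.out.println(server.serveRequest("select 2"));
    }
}
```

Blocking on `take()` when the pool is exhausted is also the natural back-pressure mechanism: excess requests queue for a connection instead of spawning ever more interpreter processes.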
In the early days of Java, the JVM was pretty slow, as it was mostly running in interpreted mode. But over the years, Just-In-Time compilers for Java became quite advanced. Furthermore, Java’s structured type system and simple memory model facilitated these performance improvements by making it possible to guarantee what a program would (and would not) do at runtime. In contrast, it’s harder to predict in advance how a program written in a dynamically-typed scripting language will behave, and thus harder to optimise its performance.
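A trivial illustration of the kind of guarantee that helps: in the loop below, the runtime knows every operand is a primitive `int` or `long`, so after warm-up the HotSpot JIT can emit a tight native loop with no boxing and no per-element type checks. A dynamic-language runtime must instead discover or guard those types at run time.

```java
// Sketch: a hot loop whose types are fully known at compile time, which is
// exactly the shape a JIT compiler can turn into efficient machine code.
public class HotLoop {

    static long sum(int[] data) {
        long sum = 0;
        for (int x : data) {
            sum += x; // plain integer add: no boxing, no dynamic dispatch
        }
        return sum;
    }

    public static void main(String[] args) {
        int[] data = new int[1_000_000];
        for (int i = 0; i < data.length; i++) {
            data[i] = i % 10;
        }
        System.out.println(sum(data)); // 4500000
    }
}
```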
The end result: a platform that just got faster and faster (although somehow, Java retained a reputation for being slow).
Java, Wherefore Art Thou?
Java has an interesting place in the ecosystem of languages, mainly because it’s more than a language: it’s an entire platform. And you’d be surprised how widely deployed this platform still is. For example, Minecraft, one of the most culturally-significant games of all time, was written in Java (although it didn’t use any of Java’s clunky built-in graphics libraries, instead opting for a lightweight bridging library to OpenGL and the operating system’s I/O primitives).
Yet there is a flip-side to this all-pervasiveness. Programming languages are normally open standards. For languages like C++, while there will be competing commercial compilers and tools, you’ll also probably find a free compiler in the GNU compiler collection.
In contrast, Oracle (and Sun before them) have played a game of making Java mostly an open standard, whilst still controlling certain strategic parts, seemingly with the hope of charging high-value users for them.
Of course, high-value users found ways to work around this. For example, Google made the Android mobile platform Java source-code compatible, but replaced the whole bytecode implementation with their own format, which was logically compatible, but different enough to avoid losing a lawsuit from Oracle.
Whilst Oracle is often cast as the bad-guy in this scenario, it’s understandable that they would try to protect their intellectual property – they are a corporation, after all. It also raises the interesting issue of what happens to Java if Oracle can’t make enough money from it.
A few years ago I began development of a robotics project using the Raspberry Pi single-board computer. Oracle had just released a Java 8 JVM for the Raspberry Pi, complete with a kick-butt Just-In-Time compiler for the ARM processor. They shipped it with the Raspbian Linux distribution.
Of course this JDK is not free for use in commercial applications – please contact Oracle for pricing information. As this was a pet project for me, I wasn’t too fazed by this, as I didn’t have to buy a licence. But evidently there weren’t enough commercial applications wanting to buy licences either, as Oracle eventually dropped support for the whole product.
“Write Once, Run Maybe Somewhere, for a While.”
So let’s have a think about that: there is now no current official version of the Oracle JVM on ARM, the world’s standard mobile architecture, the world’s most prolific CPU architecture after Intel, and an architecture that is likely to eclipse Intel in many application spaces.
There is the OpenJDK version for ARM, but it uses the Zero bytecode interpreter, which is nowhere near as fast as the Oracle JIT compiler. Now, I understand that Oracle have donated the code to the OpenJDK project, but platform support for Java seems to be at the whim of where money can be extracted.
This is of course expected behaviour from a commercial entity, in particular Oracle. But it raises the question of how dependable – despite all the hubris – Java actually is as a platform for anything other than Intel servers on Linux or Windows. It’s made me rethink whether Java is suitable at all for use on embedded systems, which is ironic given the embedded roots it came from.
Where to from here?
Java might not be the coolest technology on the block, but I always come back to what I think made it successful. It’s a statically typed language which allows efficient bit-level operations, like a statically compiled language. Its type system requires no summoning of the dark arts. Yet due to the strict nature of its bytecode, it’s free from general protection faults, buffer overruns or accidentally casting string types to negative exponential quantities of pickled cucumbers. And its Just-In-Time compiler and garbage collectors are the result of years of development and research. These things combined are, in my mind, the sweet spot between higher- and lower-level languages.
Would I be sad if some cool new tech appeared and truly replaced Java? That depends. I hope other languages learn from Java’s mistakes, but also from what made it such a force. Go might have a place. Perhaps Rust will oxidise itself onto the industry. Maybe Kotlin will pick up where Java leaves off. My preference isn’t to live in the past, but neither is it to throw out all the benefits of a structured, type-safe language. If the future of server-side development is Node.js or Python, I’ll be in my DeLorean with my Commodore 64 and my 8-bit assembler. See you back in the 1980s, fellow developers.