The 8 Nights of Java – Night 1

Given the holiday season, we thought it would be fun to share our favorite (or least favorite) features from all 8 versions of Java that have been released to date. Some features, like generics and autoboxing/unboxing, were met with a lot of fanfare and have since changed the way we as developers write code. Others, like NIO.1 and RMI, are not nearly as popular today as originally envisioned. With that in mind, we’ll be posting one entry each night on a different version of Java, starting tonight with Java 1.

Oh, we want to wish all of our readers a Happy and Healthy Holiday, as well as a Wonderful New Year!

Jump to: [Night 1 | Night 2 | Night 3 | Night 4 | Night 5 | Night 6 | Night 7 | Night 8]

Java 1.0/1.1 Notable Features
Sun introduced Java 1.0 (codename Oak) on January 23, 1996, with a more stable Java 1.1 released in February of the following year. These versions included:

  • Compiler and JVM runtime environment
  • JDBC
  • The beginnings of reflection
  • Inner classes
  • Thread class

From Scott:

Java 1.0 released… well, Java! We all take for granted the ability to execute Java code on nearly every platform, but back when Java was first released, the idea of compiling source code into bytecode and running it inside a virtual machine was absolutely revolutionary. It was one of the first languages to allow developers to work freely in any environment and deploy to any other environment. Before Java, programmers tended to use the same operating system, same IDE software, and same compiler to write software that often could only run on a handful of environments. Java helped foster the open source movement, allowing developers to work on Linux, Windows, MacOS, etc., and deploy to any system. At a time when hardware and software systems were much more heterogeneous than they are today, Java helped improve productivity and sharing across a wide variety of landscapes.

Of course, early on there were some problems. Microsoft released Visual J++ alongside Sun’s implementation; it contained only a subset of Sun’s features, as well as additional features Microsoft wrote itself. These differences almost splintered the Java landscape within the first few years, since Visual J++ was so different from Sun’s implementation. Luckily for us, Sun intervened, successfully suing Microsoft on the basis that it had violated Sun’s license agreement by releasing a version of Java not compatible with other versions of Java, helping to solidify Java as a standards-based language. In hindsight, articles like “Microsoft’s J/Direct called death of Java,” written in 1997 (and comical now), would be the first of many (including a famous interview with Steve Jobs) to incorrectly predict Java’s demise. Today, Java is used on over 3 billion devices worldwide.

My favorite part of Java 1.0? The fact that the Thread class was included right away. Multi-threaded programming was still somewhat new, especially since multi-core processors were still in their infancy. Providing a new language in which developers could process tasks in parallel was pretty forward thinking, even if our thread-based implementations weren’t always perfect. Today, we tend to rely on the Concurrency API given its feature-rich convenience and stability, but never forget that it is built entirely upon the Thread class.
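
To picture the then-and-now, here’s a minimal sketch (my own example; the class name is made up) of the original Thread API next to the Concurrency API that wraps it:

    // My own illustrative example: the original Thread API next to the
    // Concurrency API that is built on top of it.
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    public class ThreadThenAndNow {
        public static void main(String[] args) throws InterruptedException {
            // Java 1.0 style: create and start a Thread directly
            Thread worker = new Thread(new Runnable() {
                public void run() {
                    System.out.println("Hello from a raw Thread");
                }
            });
            worker.start();
            worker.join();

            // Today: the Concurrency API, which still creates Thread objects underneath
            ExecutorService pool = Executors.newFixedThreadPool(2);
            pool.submit(() -> System.out.println("Hello from an ExecutorService"));
            pool.shutdown();
        }
    }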

From Jeanne:

Version 1.0 included Vector. While we no longer use Vector for new code, it paved the way for ArrayList and the Collections framework. JDBC is one of my favorite libraries. I use a mix of raw JDBC, ORM and Spring JDBC template these days, but JDBC started all of this. And then we have the parts of the language that stood the test of time. Plus, since Sun/Oracle find it hard to actually get rid of anything from the language, we also have such fond deprecated memories as Date’s getHours() method. I was still in high school when Java 1 launched. I never actually worked with it directly, as Java 1.2 was out before I even started reading about Java. When Java turned 10, it was cool to read Hello World(s) – From Code to Culture and see how Java got started. Or should I say how Oak got started?
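
For anyone who has only seen JDBC through an ORM, here’s a minimal sketch of raw JDBC (my own example; the JDBC URL, table, and column names are hypothetical placeholders):

    // My own illustrative example; the JDBC URL, table, and column names are
    // hypothetical, and the matching driver needs to be on the classpath.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    public class RawJdbcSketch {
        public static void main(String[] args) throws SQLException {
            String url = "jdbc:hsqldb:mem:demo";   // placeholder URL
            try (Connection conn = DriverManager.getConnection(url);
                 PreparedStatement ps = conn.prepareStatement(
                         "SELECT name FROM authors WHERE id = ?")) {
                ps.setInt(1, 1);
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getString("name"));
                    }
                }
            }
        }
    }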

Announcing: Our NEW Java OCA / OCP 8 Practice Test Book!

Jeanne and I are thrilled and excited to announce that we are nearly done writing a brand new book for the OCA / OCP Java 8 Programmer exams!

The new book, OCA / OCP Java SE 8 Programmer Practice Tests, includes more than 1,000 hand-crafted and peer-reviewed questions. We’ve also created a new permanent Practice Tests OCA / OCP 8 book page on the blog to keep track of updates and news about the new book.

Jeanne and I wanted to say how thankful we are to all of the readers of our first two books. Without your invaluable feedback and positive reviews, we never would have been offered this opportunity to expand our OCA and OCP subject matter. We promise to make this next book our best yet!

We are expecting to ship the book in March 2017. Stay tuned for additional news about the book!

performance engineer’s guide to hotspot JIT compilation – monica beckwith – qcon

For more QCon posts, see my live blog table of contents. This presentation covers both the compiler and the runtime.

Major pieces

  • Execution engine
    • Heap management/garbage collection
    • JIT compilation
  • Runtime
    • VM Class loading
    • Interpreter
    • Bytecode verification, etc.

Runtime goal – convert from bytecode to native code and do optimizations along the way

Compilation Techniques and Notes

  • Pre-compiled/ahead of time
  • Profile guided – based on critical hotspots
  • Adaptive optimization (Java uses Profile guided and Adaptive optimization)
  • Identify root of compilation
  • Replace the whole method or do on-stack replacement – depends on the number of times through the loop
  • Server compiler has a higher threshold than the client compiler for when optimization kicks in
  • Tiered compilation – tier 1 is the client compiler with no profiling info, tiers 2 and 3 are the client compiler with profiling info. Then comes the server compiler
  • CodeCache is an order of magnitude larger when tiered compilation is enabled. If you need more, you can use -XX:ReservedCodeCacheSize
  • Inlining – many different parameters when figuring out when to inline
  • Vectorization – SIMD (Single Instruction, Multiple Data). Can generate stubs and benefit from caching size chunks. For SuperWord Level Parallelism, you need to unroll the loop, do analysis/pre-optimization, etc. Still in its infancy with HotSpot.
  • Escape analysis – Want to see if an object is only used within a compiled method. Need the entire graph to confirm it is not stored in a static field/returned from the method/passed as a parameter/etc. If it really is local, the JIT can optimize by keeping it in registers (see the sketch after this list).
  • Objects are 8 byte aligned by default. Fields are aligned by type.
  • OOP (ordinary object pointer) is a managed pointer. The size can be changed to optimize.
  • Compressed Class Pointers – part of the Metaspace. Class data is outside of heap.
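
To make the escape-analysis bullet concrete, here is a tiny sketch of my own (not from the talk): the object below never leaves its method, so once the method is hot and inlined, the JIT can scalar-replace it and keep its fields in registers instead of allocating it on the heap.

    // My own illustrative example (not from the talk): the Point created in
    // doesNotEscape() is never stored, returned, or passed anywhere, so after
    // inlining, escape analysis can scalar-replace it and keep x/y in registers.
    public class EscapeAnalysisSketch {

        static class Point {
            final int x;
            final int y;
            Point(int x, int y) { this.x = x; this.y = y; }
        }

        static int doesNotEscape(int a, int b) {
            Point p = new Point(a, b);   // candidate for scalar replacement
            return p.x + p.y;
        }

        public static void main(String[] args) {
            long sum = 0;
            for (int i = 0; i < 1_000_000; i++) {   // hot loop so the JIT compiles the method
                sum += doesNotEscape(i, i + 1);
            }
            System.out.println(sum);
        }
    }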

Deoptimization

  1. dependency issues
  2. class unloading/redefinition
  3. uncommon path (see the sketch below)
  4. profiled info isn’t useful for the path [like with databases when the db assumes something different than you want]
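
As an illustration of the uncommon-path case, here is a small sketch of my own (not from the talk): a branch that is never taken during warm-up, so the compiled code can treat it as an uncommon trap and deoptimize the first time it is actually reached.

    // My own illustrative example (not from the talk): the rare branch is never
    // taken while the method is profiled and compiled, so the JIT can compile it
    // as an uncommon trap. The first time the branch is actually taken, the JVM
    // deoptimizes back to the interpreter and may later recompile the method.
    public class UncommonPathSketch {

        static int compute(int value, boolean rare) {
            if (rare) {                  // never taken during warm-up
                return value * 31;
            }
            return value + 1;
        }

        public static void main(String[] args) {
            long sum = 0;
            for (int i = 0; i < 1_000_000; i++) {   // warm up with rare == false
                sum += compute(i, false);
            }
            sum += compute(42, true);    // first trip down the rare path; can trigger deoptimization
            System.out.println(sum);
        }
    }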

If curious about details

To get information about what compiler thinks/did:

  • PrintCompilation – e.g., what level instructions were compiled at
  • PrintInlining – use -XX:+UnlockDiagnosticVMOptions
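
As a quick way to try these flags out, here is a small sketch of my own (the flags are real HotSpot options; the class and method names are made up):

    // My own illustrative example. Run with:
    //   java -XX:+PrintCompilation HotLoop
    // to see when and at what tier methods are compiled, or with:
    //   java -XX:+UnlockDiagnosticVMOptions -XX:+PrintInlining HotLoop
    // to see inlining decisions. The class and method names are made up.
    public class HotLoop {

        static int square(int n) {
            return n * n;   // tiny method, a likely inlining candidate
        }

        public static void main(String[] args) {
            long sum = 0;
            for (int i = 0; i < 5_000_000; i++) {   // enough iterations to trigger JIT compilation
                sum += square(i);
            }
            System.out.println(sum);
        }
    }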