Benefit of specifying -jvm-target / jvmTarget version other than 1.8

As of Kotlin 1.6.0, for Kotlin/JVM projects one may specify the -jvm-target option with versions up to Java 17; see the general and Gradle plugin documentation.
What are the benefits of doing so?
I couldn't find much on the benefits of specifying something other than the default value of 1.8.
The only things I could find on this were:
on JVM 9+ targets, string concatenations are compiled into dynamic invocations (invokedynamic); see the release blog of Kotlin 1.5.20.
on JVM 16+ targets (or 15 with preview features enabled), Java's Records are supported; see the release blog of Kotlin 1.5.0.
Both seem negligible to me.
Especially because when specifying a higher target, one loses the ability to use the resulting artifact in projects stuck with Java 1.8, which seems undesirable, especially for libraries.
Is there something I missed here?

I don’t know what Kotlin actively uses or supports.
The following features available in later Java environments may provide a benefit to other programming languages, even if you are not actively using them in your application code:
Concurrent constructs of your language may use VarHandle internally even if you don’t use this API directly. [JDK 9]
If your language needs it, reachabilityFence allows preventing garbage collection of an object prior to a point in the execution, instead of relying on fragile or expensive workarounds (see the sketch after this list). [JDK 9]
An official way to add classes to the current environment dynamically, rather than hacking into JRE internals [JDK 9]
You already mentioned string concatenation… [JDK 9]
When you create a module declaring the required dependencies, you can create a customized JDK containing only the required modules, to be deployed with the application (which eliminates the need for 1.8 compatibility anyway). [JDK 9]
Classes belonging to a nest can access each other's private members without the need for helper methods. The compiler of your language can decide which classes belong to a nest; it doesn't have to be the semantics of Java's nested classes. [JDK 11]
Custom dynamic constants. You can have arbitrary constants loadable by an ldc instruction, which is constructed by a bootstrap method on the first execution and subsequently reused. This means the language can use its own constants of its own types the same way as Java’s built-in constants (think, string interning). [JDK 11]
Create dynamic anonymous classes using an official API instead of assuming the presence of the proprietary sun.misc.Unsafe [JDK 15]
Sealed classes are directly supported by the JVM, so if the language has such a concept, it can translate it directly instead of emulating it. [JDK 17]
Perhaps something more that is useful for the particular language implementation but not obvious to those of us who are not trying to implement a language on the JVM.
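
As a concrete illustration of the reachabilityFence point, here is a minimal sketch; NativeBuffer and its field are hypothetical, standing in for any wrapper whose off-heap state a Cleaner frees once the wrapper becomes unreachable:

import java.lang.ref.Reference;

// Hypothetical wrapper around off-heap state that a Cleaner frees once
// this object becomes unreachable.
class NativeBuffer {
    private final long address = 0x1000L; // placeholder for a real native address

    long read() {
        try {
            return address; // stand-in for a real native read via 'address'
        } finally {
            // JDK 9+: keeps 'this' reachable until the read has finished,
            // so the Cleaner cannot free the memory while it is still in use.
            Reference.reachabilityFence(this);
        }
    }
}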

Related

Use case for RuntimeException from package Kotlin

I was writing some code and was going to throw a RuntimeException as a default for something, when I noticed that there are two options for RuntimeException - one from java.lang and one from kotlin:
Inside the kotlin version of this, there's an entire list of other aliases that work the same way:
So, from my understanding, the one from package kotlin is simply an alias for the Java equivalent (correct me if I'm wrong), which leads me to the question:
What is the point of having this alias file, and when should you use it over the "standard" Java equivalent? Does this simply save a few imports?
When using the JDK, these map to JDK classes. When using Kotlin for JavaScript, they would map to a specific implementation in the Kotlin JavaScript library. Documentation about actual and expect here.
To answer your question, if there's a chance of you porting your code to also work on another platform, always use the kotlin. variant. Otherwise, it doesn't matter.
In my opinion, typealias is a strong feature provided by the Kotlin language.
Why? Because it gives the ability to extend code to multiple domains, for example providing interoperability between Kotlin and any other language.
It also becomes helpful when providing APIs or SDKs with semantic versioning, without worrying much about changes affecting lower versions of the API.
One good example is the collections derived from Java in the Kotlin language, with some additional powerful methods provided as typealiases (literally, it saves a lot of development time and effort).
Another good example is the multiplatform programming support in the Kotlin language, which helps you create APIs implemented with the actual and expect keywords.
So, long story short, I prefer using RuntimeException from the kotlin package to extend support across a variety of Kotlin versions (you can see newly added classes starting from version 1.3, which don't affect the existing API).

Enhance library for Java 8 while keeping backwards compatibility

I'm developing an open source library in Java and would like to ensure that it is convenient for Java 8 users, and takes advantage of new concepts in Java 8 wherever possible (lambdas etc.)
At the same time I absolutely need to maintain backwards compatibility (the library must still be usable for people using Java 6 or 7).
What useful features from Java 8 can I adopt that would be beneficial for library users without breaking library compatibility for users of older Java versions?
I don't know about your library, so this advice might be slightly off.
Lambdas: Don't worry. Any functional interface can be implemented using a lambda expression.
Method references: Same as lambdas; they should just be usable.
Streams: If this fits your library, you should use them, but keeping compatibility is harder here. Backwards compatibility could be achieved using a second library part that wraps around the base library and hooks into its public API. It could therefore provide additional sugar/functionality without abandoning Java 6/7.
Default methods: By all means, use these! They are a quick/cheap/good way to enhance existing implementations without breaking them. All default methods you add will be automatically available for implementing classes. These will, however, also need the second library part, so you should provide the base interfaces in your base library and extend them from the companion library (see the sketch below).
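
For instance, here is a minimal sketch of that base/companion split, with hypothetical names (Repository in the base library, Repository8 in the Java 8 companion):

import java.util.Optional;

// Base library: compiles with -source 1.6, no Java 8 types anywhere.
interface Repository {
    String find(String id); // returns null when absent
}

// Companion library (requires Java 8): adds sugar via a default method,
// so existing Repository implementations keep working unchanged.
interface Repository8 extends Repository {
    default Optional<String> tryFind(String id) {
        return Optional.ofNullable(find(id));
    }
}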
Don't fork the library, abandoning the old one, as there are still many developers who cannot use Java 8, or even Java 7. If your library makes sense to use on e.g. Android, please keep that compatibility.
If you want your code to be usable by Java 6 consuming VMs, you have to write using Java 6 language compatibility. Alas. The bytecode format critically changed from 6 to 7, so code for 7 and 8 won't load into 6. (I found this with my own code migrating to 7; even when all I was using was multi-catch — which should be writable in the 6 bytecode — I couldn't build it for a 6 target. The compiler refused.)
I've yet to experiment with 8, but you'll have to be careful because if you depend on any new Java packages or classes, you will still be unable to work with older versions; the old consuming VMs simply won't have access to the relevant classes and won't be able to load the code. In particular, lambdas definitely won't work.
Well, can we target a different classfile version? There's no combination of source and target that will actually make javac happy with this.
kevin$ $JAVA8/bin/javac -source 1.8 -target 1.7 *.java
javac: source release 1.8 requires target release 1.8
So there's simply no way to compile Java source with lambdas into a pre-Java 8 classfile version.
My general take is that if you want to support old versions of Java, you have to use old versions of Java to do so. You can use a Java 8 toolchain, but you need Java 7 or Java 6 source. You can do it by forking the code, maintaining versions for all the versions of Java you want to support, but that's far more work than you could ever justify as a lone developer. Pick the minimum version and go with that (and kiss those juicy new features goodbye for now, alas).
If you use any new language features in Java 8, it requires also using Java 8 bytecode.
$ javac -source 1.8 -target 1.7
javac: source release 1.8 requires target release 1.8
That means your options are quite limited. You cannot use lambdas, method references, default methods, Streams, etc. and maintain backwards compatibility.
There are still two things you can do that users of Java 8 will benefit from. The first is using Functional Interfaces in your public API. If your methods take Runnables, Callables, Comparators, etc. as parameters, then users of Java 8 will be able to pass in lambdas. You may want to create your own Single-Abstract-Method interfaces as well. If you find you need Functions and Predicates, I'd suggest reusing the ones that ship with GS Collections or Guava instead of writing your own.
The second thing you can do is use a rich collections API that benefits from using Functional Interfaces. Again, that means using GS Collections or Guava. For example, if you have a method that would return List, return MutableList or ImmutableList instead. That way, callers of the method will be able to chain usages of the rich API exposed by these interfaces.
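
To illustrate the functional-interface point, here is a minimal sketch with hypothetical names (EventBus, EventListener): the library part compiles with -source 1.6, yet Java 8 users can pass lambdas to it.

public class EventBus {

    // Single abstract method: Java 8 callers can pass a lambda,
    // while Java 6/7 callers use an anonymous class.
    public interface EventListener {
        void onEvent(String event);
    }

    public void subscribe(EventListener listener) {
        // hypothetical registration logic
    }

    public static void main(String[] args) {
        EventBus bus = new EventBus();
        // Java 8+ caller (the library part above contains no Java 8 types):
        bus.subscribe(event -> System.out.println(event));
        // Java 6/7 caller:
        bus.subscribe(new EventListener() {
            public void onEvent(String event) { System.out.println(event); }
        });
    }
}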
As said by others, providing and using interfaces with a single method such that they can be implemented using lambdas or method references when using Java 8 is a good way of supporting Java 8 without breaking Java 7 compatibility.
This can be complemented by providing methods in your library that fit into one of the standard function types of Java 8 (e.g. Supplier, (Bi)Consumer, (Bi)Function), so that Java 8 developers can create method references to them for Java 8 API methods. This implies that their signature matches one of these functional interfaces and that they don't throw checked exceptions. This often comes naturally, e.g. getFoo() may act as a Function and isBar() as a Predicate, but sometimes it's possible to improve methods by thinking about possible Java 8 use scenarios.
For example, if you provide a method taking two parameters, it's useful to choose the order in which the first parameter is the one more likely to be a key in a Map, so that the method is more likely to be useful for Map.forEach with a method reference.
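
As a sketch (all names hypothetical): a library method whose key-like parameter comes first matches BiConsumer<String, Integer> and can be handed to Map.forEach directly.

import java.util.HashMap;
import java.util.Map;

public class Counts {
    // Hypothetical library method: the key-like parameter comes first,
    // so the signature matches BiConsumer<String, Integer>.
    public static void record(String name, Integer count) {
        System.out.println(name + " -> " + count);
    }

    public static void main(String[] args) {
        Map<String, Integer> m = new HashMap<>();
        m.put("a", 1);
        m.forEach(Counts::record); // method reference to the library method
    }
}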
And avoid methods with ambiguous signatures. E.g. if you have a class Foo with an instance method ReturnType bar() and a static method ReturnType bar(Foo), neither of them can be used as a method reference anymore, as Foo::bar would be ambiguous. Eliminate or rename one of these methods.
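
A minimal sketch of that ambiguity, with String standing in for ReturnType:

class Foo {
    String bar() { return "instance"; }
    static String bar(Foo foo) { return "static"; }
}

// This does not compile: Foo::bar matches both the instance method
// (Foo -> String) and the static method (Foo -> String).
//     java.util.function.Function<Foo, String> f = Foo::bar;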
It's important that such methods do not have undocumented internal state that causes surprising behavior when used by multiple threads. Otherwise, they cannot be used with parallel streams.
Another opportunity that should not be underestimated is to use names for classes, interfaces and members conforming to patterns introduced by the Java 8 API. E.g. if you have to introduce some sort of filter interface with a test method for your library that ought to work with Java 7 as well, you should name the interface Predicate and the method test to associate it with the similar named functional interface of Java 8.

Why project Jigsaw / JPMS?

Java's package management system always seemed simple and effective to me. It is heavily used by the JDK itself. We have been using it to mimic the concept of namespaces and modules.
What is Project Jigsaw (aka Java Platform Module System) trying to fill in?
From the official site:
The goal of this Project is to design and implement a standard module
system for the Java SE Platform, and to apply that system to the
Platform itself and to the JDK.
Jigsaw and OSGi are trying to solve the same problem: how to allow coarser-grained modules to interact while shielding their internals.
In Jigsaw's case, the coarser-grained modules include Java classes, packages, and their dependencies.
Here's an example: Spring and Hibernate. Both have a dependency on a 3rd-party JAR, CGLIB, but they use different, incompatible versions of that JAR. What can you do if you rely on the standard JDK? Including the version that Spring wants breaks Hibernate and vice versa.
But, if you have a higher-level model like Jigsaw you can easily manage different versions of a JAR in different modules. Think of them as higher-level packages.
If you build Spring from the GitHub source you'll see it, too. They've redone the framework so it consists of several modules: core, persistence, etc. You can pick and choose the minimal set of module dependencies that your application needs and ignore the rest. It used to be a single Spring JAR, with all the .class files in it.
Update: Five years later - Jigsaw might still have some issues to resolve.
AFAIK the plan is to make the JRE more modular, i.e. have smaller JARs which are optional and/or let you download/upgrade only the functionality you need.
It's to make it less bloated and give you the option of dropping legacy modules which perhaps most people don't use.
Based on Mark Reinhold's keynote speech at Devoxx Belgium, Project Jigsaw is going to address two main pain points:
Classpath
Massive Monolithic JDK
What's wrong with Classpath?
We all know about JAR hell. This term describes all the various ways in which the classloading process can end up not working. The best-known limitations of the classpath are:
It's hard to tell if there are conflicts. Build tools like Maven can do a pretty good job based on artifact names, but if the artifacts themselves have different names but the same contents, there could be a conflict.
The fundamental problem with JAR files is that they are not components. They're just a bunch of file containers that will be searched linearly. The classpath is a way to look up classes regardless of what components they're in, what packages they're in, or their intended use.
Massive Monolithic JDK
The big monolithic nature of the JDK causes several problems:
It doesn't fit on small devices. Even though small IoT-type devices have processors capable of running an SE-class VM, they do not necessarily have the memory to hold all of the JDK, especially when the application only uses a small part of it.
It's even a problem in the cloud. The cloud is all about optimizing the use of hardware; if you have thousands of images containing the whole JDK but applications only use a small part of it, that is a waste.
Modules: The Common Solution
To address the above problems, we treat modules as a fundamental new kind of Java program component. A module is a named, self-describing collection of code and data. Its code is organized as a set of packages containing types, i.e., Java classes and interfaces; its data includes resources and other kinds of static information.
To control how its code refers to types in other modules, a module declares which other modules it requires in order to be compiled and run. To control how code in other modules refers to types in its packages, a module declares which of those packages it exports.
The module system locates required modules and, unlike the class-path mechanism, ensures that code in a module can only refer to types in the modules upon which it depends. The access-control mechanisms of the Java language and the Java virtual machine prevent code from accessing types in packages that are not exported by their defining modules.
Apart from being more reliable, modularity could improve performance. When code in a module refers to a type in a package then that package is guaranteed to be defined either in that module or in precisely one of the modules read by that module. When looking for the definition of a specific type there is, therefore, no need to search for it in multiple modules or, worse, along the entire class path.
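
As a minimal sketch of the declarations involved (module and package names are made up):

// module-info.java
module com.example.app {
    requires java.sql;            // may use the types that java.sql exports
    exports com.example.app.api;  // only this package is visible to other modules
}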
JEPs to Follow
Jigsaw is an enormous project that has been ongoing for quite a few years. It has an impressive number of JEPs, which are great places to gain more information about the project. Some of these JEPs are the following:
JEP 200: The Modular JDK: Use the Java Platform Module System (JPMS) to modularize the JDK
JEP 201: Modular Source Code: Reorganize the JDK source code into modules, enhance the build system to compile modules, and enforce module boundaries at build time
JEP 261: Module System: Implement the Java Platform Module System, as specified by JSR 376, together with related JDK-specific changes and enhancements
JEP 220: Modular Run-Time Images: Restructure the JDK and JRE run-time images to accommodate modules and to improve performance, security, and maintainability
JEP 260: Encapsulate Most Internal APIs: Make most of the JDK's internal APIs inaccessible by default but leave a few critical, widely-used internal APIs accessible, until supported replacements exist for all or most of their functionality
JEP 282: jlink: The Java Linker: Create a tool that can assemble and optimize a set of modules and their dependencies into a custom run-time image as defined in JEP 220
Closing Remarks
In the initial edition of The State of the Module System report, Mark Reinhold describes the specific goals of the module system as follows:
Reliable configuration, to replace the brittle, error-prone class-path mechanism with a means for program components to declare explicit dependences upon one another, along with
Strong encapsulation, to allow a component to declare which of its public types are accessible to other components, and which are not.
These features will benefit application developers, library developers, and implementors of the Java SE Platform itself directly and, also, indirectly, since they will enable a scalable platform, greater platform integrity, and improved performance.
For the sake of argument, let's assert that Java 8 (and earlier) already has a "form" of modules (jars) and module system (the classpath). But there are well-known problems with these.
By examining the problems, we can illustrate the motivation for Jigsaw. (The following assumes we are not using OSGi, JBoss Modules, etc, which certainly offer solutions.)
Problem 1: public is too public
Consider the following classes (assume both are public):
com.acme.foo.db.api.UserDao
com.acme.foo.db.impl.UserDaoImpl
At Foo.com, we might decide that our team should use UserDao and not use UserDaoImpl directly. However, there is no way to enforce that on the classpath.
In Jigsaw, a module contains a module-info.java file which allows us to explicitly state what is public to other modules. That is, public has nuance. For example:
// com.acme.foo.db.api.UserDao is accessible, but
// com.acme.foo.db.impl.UserDaoImpl is not
module com.acme.foo.db {
exports com.acme.foo.db.api;
}
Problem 2: reflection is unbridled
Given the classes in #1, someone could still do this in Java 8:
Class<?> c = Class.forName("com.acme.foo.db.impl.UserDaoImpl");
Object obj = c.getConstructor().newInstance();
That is to say: reflection is powerful and essential, but if unchecked, it can be used to reach into the internals of a module in undesirable ways. Mark Reinhold has a rather alarming example. (The SO post is here.)
In Jigsaw, strong encapsulation offers the ability to deny access to a class, including reflection. (This may depend on command-line settings, pending the revised tech spec for JDK 9.) Note that because Jigsaw is used for the JDK itself, Oracle claims that this will allow the Java team to innovate the platform internals more quickly.
Problem 3: the classpath erases architectural relationships
A team typically has a mental model about the relationships between jars. For example, foo-app.jar may use foo-services.jar which uses foo-db.jar. We might assert that classes in foo-app.jar should not bypass "the service layer" and use foo-db.jar directly. However, there is no way to enforce that via the classpath. Mark Reinhold mentions this here.
By comparison, Jigsaw offers an explicit, reliable accessibility model for modules.
Problem 4: monolithic run-time
The Java runtime is in the monolithic rt.jar. On my machine, it is 60+ MB with 20k classes! In an age of micro-services, IoT devices, etc, it is undesirable to have Corba, Swing, XML, and other libraries on disk if they aren't being used.
Jigsaw breaks up the JDK itself into many modules; e.g. java.sql contains the familiar SQL classes. There are several benefits to this, but a new one is the jlink tool. Assuming an app is completely modularized, jlink generates a distributable run-time image that is trimmed to contain only the modules specified (and their dependencies). Looking ahead, Oracle envisions a future where the JDK modules are compiled ahead-of-time into native code. Though jlink is optional, and AOT compilation is experimental, they are major indications of where Oracle is headed.
Problem 5: versioning
It is well-known that the classpath does not allow us to use multiple versions of the same jar: e.g. bar-lib-1.1.jar and bar-lib-2.2.jar.
Jigsaw does not address this problem; Mark Reinhold states the rationale here. The gist is that Maven, Gradle, and other tools represent a large ecosystem for dependency management, and another solution will be more harmful than beneficial.
It should be noted that other solutions (e.g. OSGi) do indeed address this problem (and others, aside from #4).
Bottom Line
Those are some key points for Jigsaw, motivated by specific problems.
Note that explaining the controversy between Jigsaw, OSGi, JBoss Modules, etc is a separate discussion that belongs on another Stack Exchange site. There are many more differences between the solutions than described here. What's more, there was sufficient consensus to approve the Public Review Reconsideration Ballot for JSR 376.
This article explains in detail the problems which both OSGi and JPMS/Jigsaw try to solve:
"Java 9, OSGi and the Future of Modularity" [22 SEP 2016]
It also goes thoroughly into the approaches of both OSGi and JPMS/Jigsaw.
As of now, it appears the authors list almost no practical pros for JPMS/Jigsaw compared with the mature (16-year-old) OSGi.

Mixing Java 1.4 and 1.6 bytecode in a class hierarchy

The question first, the story will follow:
Is it safe to mix different bytecode versions in a class hierarchy? What are the risks?
As a case: class C extends class B, class B extends class A, and class A implements interface I.
My question involves the following example scenarios:
Class A is compiled to Java 1.6 bytecode and has 1.6 features such as generics, etc. Its heirs, B and C, are compiled to 1.4 bytecode.
Interface I is compiled to 1.6, while its implementor is compiled to 1.4.
Other exotic inheritance scenarios involving different versions of bytecode.
I have tried as many scenarios as I could imagine, and it all seems to run just fine. However, I still feel the urge to ask here, as I only know Java at the surface; I know how to code and tweak Java but don't really know what happens under the hood.
Now for those minds who can't help themselves to ask "why would you need to do that???".
I'm in a project to assess the migration of a legacy Java 1.4 Swing app, connected to EJB 2 via RMI, to Java 1.6 Swing connected to a newer version of the app server, also running on top of 1.6. The J2EE platform will still be 1.4 (EJB 2).
The migration will not be "recompile everything to 1.6", but it will be "code and compile new features to 1.6".
The way they do things is like this:
They only have one path in CVS; everyone commits there. There are no tags/branches whatsoever to get the production code.
Whenever a new feature needs to be added, they get the JARs from the production server, explode them, replace or add new classes as needed, repackage the JARs, and put them back on the server.
Therefore, if they use Java 6 to compile and use the above method for deployment, there will be a lot of exotic mixes of 1.4 and 1.6 bytecode.
The JVM byte code is not significantly different between Java 1.0 and Java 6. In Java 7 they added one new instruction (invokedynamic). Woohoo.
So little has changed in how the byte code works that:
The JVM doesn't support nested classes accessing private members of outer classes; this works through generated code.
The JVM doesn't support runtime checks for generics, e.g. you cannot do new T() where T is a generic type parameter.
Basically, they make the JVM smarter and faster, but until recently changing the model of how the byte code works has been avoided at all costs.
You can compile with Java 6 but target 1.4 with a compiler setting. We did this for a migration project once. If/when 1.4 disappears, you then change your compiler settings again and target 1.6.
Keeping the target version explicit also means that you can upgrade your SDK without fear of your JAR files becoming unusable to an older JVM.
I am maintaining an environment with a mix of 1.4 (old library JARs) and 1.5 (my fixes and stuff) classes on Tomcat using Sun JVM 1.5, and it runs fine.
However, for RMI you may be in trouble if the client and server have different class versions, because the server might check the class version (I ran into this problem).
The best way to find out is to do a proof of concept type of project on small scale.
A friendly reminder though, you are digging a pretty big hole for yourself here :-)
These links seem relevant. They document the few edge cases that could break compatibility between 1.4 and 1.5 and between 1.5 and 1.6.
The biggest difference that could cause problems that I can think of is that enum became a keyword, but that would only affect a 1.5+ JVM loading an older class file (which doesn't seem to be what you will be doing). The other thing is annotations. The above links seem to suggest everything would be fine, but I would be wary about what would happen if an older JVM loaded a class with runtime annotations.
Other than that, I don't think there have been any bytecode changes between the first version of Java and Java 6, meaning the only problems you should encounter are changes to functionality, the API, or deprecations (listed in the links above).
As long as you aren't using reflection, the only major problem you could have from differing bytecode versions is the ACC_SUPER flag.
In very early versions, invocation of superclass methods was not handled correctly. When they fixed it, they added a new flag to the classfile format, ACC_SUPER, to enable the corrected behavior, so that applications relying on the old, broken behavior were not affected. Naturally, using a class that doesn't have this flag set could cause problems.
However, this is ancient history. Every class compiled in 1.4 and later will have the flag, so this isn't a problem. Bytecode-wise, the only major differences between 1.4 and 1.6 are the addition of optional attributes used to store metadata about inner classes, generics, annotations, etc.
However, these don't directly affect bytecode execution. The only way they have an effect is if you access them through reflection. For instance, java.lang.Class.getDeclaredClasses() will return information from the optional attribute InnerClasses.
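
For example, a minimal sketch: the optional InnerClasses attribute only surfaces when you ask for it via reflection:

public class Outer {
    class Inner {}

    public static void main(String[] args) {
        // Reads the optional InnerClasses attribute; ordinary bytecode
        // execution never touches it.
        for (Class<?> c : Outer.class.getDeclaredClasses()) {
            System.out.println(c.getName()); // prints Outer$Inner
        }
    }
}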

Why is org.ietf, org.omg, org.w3c, and org.xml part of the POJOs?

Hi all, I was wondering how the packages org.ietf, org.omg, org.w3c, and org.xml made it into the "official" Java classes?
For example, it makes sense that the default JDK wouldn't have all the classes from Apache Commons.
By the same philosophy, shouldn't these org.w3c and org.omg packages be outside of the default JDK classes (i.e. not included within the JDK installation)?
These are all generally code representing standards; IETF, OMG, and W3C are all standards organizations. The code you are referring to was created with these package names and was/is very widely used, so it made sense to put it into the JDK under the original names. An exception to the standards names is the org.xml package. That contains SAX, an early Java/XML open source implementation of streaming XML event handling that became very popular. It's also code that sits at the right (fairly low) level in the programming hierarchy, so it is generally needed universally. Some of it is code that other parts of the Java runtime environment depend on.
Code in open source projects like Apache Commons is either not a standard or not required by other parts of the Java runtime, so there is no strong reason to include it.
Note that in other cases Sun/Oracle has added code external to the JDK to implement core features (Doug Lea's concurrency stuff comes to mind), but those packages were renamed into java.* packages.
