My initial question was an exact duplicate of this one; that is, why does this annotation have a runtime retention policy?
But the accepted answer does not satisfy me at all, for two reasons:
the fact that this annotation is @Documented has (I believe) nothing to do with it (although why @Documented has a runtime retention policy is a mystery to me as well);
even though many "would be" functional interfaces existed in Java prior to Java 8 (Comparable, as the answer mentions, but also Runnable etc.), this does not prevent them from being used as "substitutes": for instance, you can perfectly well use a DirectoryStream.Filter as a substitute for a Predicate if all you do is filter on Path.
But still, it has this retention, which means it has to influence the JVM's behavior somehow. How?
I've found the thread on the core-libs-dev mailing list which discusses the retention of the @FunctionalInterface annotation. The main point mentioned there is to allow third-party tools to use this information for code analysis/validation and to allow non-Java JVM languages to map their lambdas correctly to functional interfaces. Some excerpts:
Joe Darcy (original committer of @FunctionalInterface):
We intentionally made this annotation have runtime retention to
allow it to also be queried to various tools at runtime, etc.
Brian Goetz:
There is a benefit for languages other than Java, that can use this as a means to determine whether the interface is suitable for passing to the SAM conversion machinery. The JDK support for lambda conversion is available to other languages as well.
So it seems that it's not used by the JVM itself; it's just an additional possibility for third-party tools. Making the annotation runtime-visible is not a big cost, so it seems there were no strong reasons not to do this.
The only requirement for annotations with the RUNTIME retention policy is:
Annotations are to be recorded in the class file by the compiler and retained by the VM at run time, so they may be read reflectively. (https://docs.oracle.com/javase/7/docs/api/java/lang/annotation/RetentionPolicy.html#RUNTIME)
Now this has some consequences on runtime behaviour, since the class loader must load these annotations and the VM must keep them in memory for reflective access (for example, by third-party libraries).
There is however no requirement for the VM to act on such annotations.
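As a minimal sketch of what that reflective access looks like (Runnable carries @FunctionalInterface, while Comparable, a single-method interface that predates Java 8, does not), a third-party tool could simply query the annotation at runtime:

public class FunctionalInterfaceCheck {
    public static void main(String[] args) {
        // RUNTIME retention makes the annotation visible to reflection,
        // so a tool can test whether a type is declared @FunctionalInterface.
        System.out.println(Runnable.class.isAnnotationPresent(FunctionalInterface.class));   // true
        System.out.println(Comparable.class.isAnnotationPresent(FunctionalInterface.class)); // false
    }
}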
I'm currently working on optimizing a particular method, which is unfortunately inlined by the JVM, and this prevents it from being properly vectorized. I've noticed that there is an annotation to forbid inlining, namely jdk.internal.vm.annotation.DontInline. However, it cannot be accessed from the default module.
Is there a clean way of gaining access to this annotation or to prevent the inlining of the offending method some other way?
DontInline, ForceInline, etc. are JDK-internal annotations; they cannot be applied to user code. Even if you somehow manage to open these annotations, the HotSpot JVM has an explicit check to disallow them for non-privileged classes.
The reasons are understandable. These annotations are an implementation detail of the particular JVM version; JDK developers are free to add, remove, or change the meaning of such annotations without notice, even in a minor JDK update.
Using @DontInline to force vectorization does not seem a good approach anyway. In general, inlining should not prevent other optimizations. If you encounter such a problem, it's better to report an issue on the hotspot-compiler-dev mailing list.
Now the good news.
Since JDK 9, there is a public, supported API to manually tune the JIT compiler: JEP 165: Compiler Control.
The idea is to provide compiler directives in a separate JSON file and start the JVM with the -XX:CompilerDirectivesFile=<file> option. If your application is sensitive to certain compiler decisions, you may ship the directives file along with the application. For example:
[
  {
    // apply to all compiled methods...
    match: "*::*",
    // ...but never inline this particular method
    inline: "-org/package/MyClass::hotMethod"
  }
]
It is even possible to apply compiler directives programmatically at runtime using the DiagnosticCommand API:
import java.lang.management.ManagementFactory;
import javax.management.ObjectName;

// Invokes the "Compiler.directives_add" diagnostic command over JMX,
// loading additional directives from compiler.json at runtime.
ManagementFactory.getPlatformMBeanServer().invoke(
        new ObjectName("com.sun.management:type=DiagnosticCommand"),
        "compilerDirectivesAdd",
        new Object[]{new String[]{"compiler.json"}},
        new String[]{"[Ljava.lang.String;"}
);
By the way, there is a Vectorize: true option among the available directives, which may help in vectorizing the particular method.
Currently I'm trying to modify method bodies residing in classes already loaded by the JVM. I'm aware that the JVM does not normally allow changing the definition of classes that have already been loaded. But my research brought me to implementations like JRebel and the Java Instrumentation API, both using an agent-based approach. I know how to modify bytecode right before a class is loaded with the help of Javassist. But considering e.g. JRebel in an EJB environment where class definitions are loaded on application startup, shouldn't bytecode modification be possible on JVM-loaded classes?
Well, you learned that the Instrumentation API exists and that it offers redefinition of classes as an operation. So then it is time to rethink your initial premise of "the JVM actually not allowing to change the definition of classes that have already been loaded".
You should note that
as the links show, the Instrumentation API is part of the standard API
the support for redefinition of classes is, however, optional; you may ask via Instrumentation.isRedefineClassesSupported() whether the current JVM supports this feature
it might be limited so as not to support every class; you may ask via isModifiableClass(Class) whether it's possible for a particular class (both checks are shown in the sketch after this list)
Even if it is supported, the changes may be limited, to cite the documentation:
The redefinition may change method bodies, the constant pool and attributes. The redefinition must not add, remove or rename fields or methods, change the signatures of methods, or change inheritance. These restrictions maybe be lifted in future versions.
at the time you perform the redefinition, there might be threads executing code of methods of these classes; those executions will then complete using the old code
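A minimal sketch of both capability checks from inside an agent; premain is the standard agent entry point, and String is just an arbitrary example class:

import java.lang.instrument.Instrumentation;

public class RedefineCapabilityCheck {
    public static void premain(String args, Instrumentation inst) {
        // Does this JVM support class redefinition at all?
        System.out.println("redefine supported: " + inst.isRedefineClassesSupported());
        // Can this particular class be modified?
        System.out.println("String modifiable: " + inst.isModifiableClass(String.class));
    }
}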
So the Instrumentation API is mainly useful for debugging, profiling, etc.
But other frameworks, like EJB containers, offering class reloading in production code, usually resort to creating new ClassLoaders which create different versions of the classes that are then entirely independent of the older versions.
In a Java runtime environment, the identity of a class consists of a pair of <ClassLoader, Qualified Name> rather than just a qualified name…
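To illustrate that identity pair, here is a small sketch; the file URL and the class name com.example.Foo are hypothetical placeholders for a directory containing a compiled class:

import java.net.URL;
import java.net.URLClassLoader;

public class ClassIdentityDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical directory containing a compiled com.example.Foo
        URL[] path = { new URL("file:/tmp/classes/") };

        // Two independent loaders (null parent, so the class isn't shared)
        ClassLoader a = new URLClassLoader(path, null);
        ClassLoader b = new URLClassLoader(path, null);

        Class<?> fooA = a.loadClass("com.example.Foo");
        Class<?> fooB = b.loadClass("com.example.Foo");

        // Same bytes and same qualified name, but two distinct runtime classes
        System.out.println(fooA.getName().equals(fooB.getName())); // true
        System.out.println(fooA == fooB);                          // false
    }
}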
I wasn't aware that you can use the instrumentation API to redefine classes (see @Holger's answer). However, as he notes, there are some significant limitations on that approach. Furthermore, the javadoc says:
"This method is intended for use in instrumentation, as described in the class specification."
Using it to materially change the semantics of a class is ... all sorts of bad from the perspective of the Java type system.
I want to redefine the bytecode of the StackOverflowError constructor so I have a "hook" for when a stack overflow occurs. All I want to do is insert a single method call to a static method of my choosing at the start of the constructor. Is it possible to do this?
You should be able to do it in one of two ways (unless something changed in the last 1-2 years, in which case I'd love some links to changelogs/docs):
As mentioned in a comment, and not very feasible I guess: modify the classes you are interested in, put them in a jar, and then use the -Xbootclasspath/p: option to load them instead of the default ones. As was mentioned before, this can have some legal issues (and is a pain to do in general).
You should be able to (or at least you used to be able to) instrument almost all core classes (IIRC, Class was the only exception I've seen). One of the problems you might have is the fact that many core classes are initialized before the agents you provide (or their premain methods, to be exact) are consulted. To overcome this you will have to add the Can-Retransform-Classes attribute to your agent jar's manifest and then re-transform the classes you are interested in, as in the sketch below. Be aware that re-transformation is a bit less powerful and doesn't give you all the options you'd normally have with instrumentation; you can read more about it in the doc.
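A hedged sketch of that second approach, assuming the agent jar's manifest declares Can-Retransform-Classes: true; the transformer here is a no-op placeholder where the real bytecode rewriting (e.g. via Javassist) would go:

import java.lang.instrument.ClassFileTransformer;
import java.lang.instrument.Instrumentation;
import java.security.ProtectionDomain;

public class RetransformAgent {
    public static void premain(String args, Instrumentation inst) throws Exception {
        ClassFileTransformer transformer = new ClassFileTransformer() {
            @Override
            public byte[] transform(ClassLoader loader, String className,
                                    Class<?> classBeingRedefined,
                                    ProtectionDomain domain, byte[] classfileBuffer) {
                return null; // null means "leave the class unchanged"
            }
        };
        // true = this transformer also participates in re-transformation
        inst.addTransformer(transformer, true);
        // Core classes were loaded before premain ran, so re-transform them explicitly
        inst.retransformClasses(StackOverflowError.class);
    }
}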
I am assuming you know how to do instrumentation?
There are several things to consider.
It is possible to redefine java.lang.StackOverflowError. I tried it successfully on 1.7.0_40: isModifiableClass(java.lang.StackOverflowError.class) returned true, and I successfully redefined it, inserting a method invocation into all of its constructors (see the sketch after these points).
You should be aware that when you insert a method call into a class via Instrumentation, you still have to obey the visibility imposed by the ClassLoader relationships. Since StackOverflowError is loaded by the bootstrap loader, it can only invoke methods of classes loaded by the bootstrap loader. You would have to add the target method's class(es) to the bootstrap loader.
This works if the application's code throws a StackOverflowError manually. However, when a real stack overflow occurs, the last thing the JVM will do is invoke additional methods (keep in mind what the error says: the stack is full). Consequently, it creates an instance of StackOverflowError without calling its constructor (a JVM can do that). So your instrumentation is pointless in this situation.
As already pointed out by others, a "Pure Java Application" must not rely on modified JRE classes. It is only valid to use Instrumentation as an add-on, i.e. a development or JVM management tool. You should keep in mind that the fact that Oracle's JVM 1.7.0_40 supports the redefinition of StackOverflowError does not imply that other versions or other JVMs do as well.
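For reference, a sketch of how such a redefinition might look with Javassist; the hook class com.example.Hook is hypothetical and, as noted above, would have to be visible to the bootstrap loader, and the agent jar is assumed to declare Can-Redefine-Classes: true:

import java.lang.instrument.ClassDefinition;
import java.lang.instrument.Instrumentation;
import javassist.ClassPool;
import javassist.CtClass;
import javassist.CtConstructor;

public class StackOverflowHookAgent {
    public static void agentmain(String args, Instrumentation inst) throws Exception {
        ClassPool pool = ClassPool.getDefault();
        CtClass cc = pool.get("java.lang.StackOverflowError");
        for (CtConstructor ctor : cc.getDeclaredConstructors()) {
            // Insert the hook call at the very start of each constructor
            ctor.insertBefore("com.example.Hook.onStackOverflow();");
        }
        inst.redefineClasses(
                new ClassDefinition(StackOverflowError.class, cc.toBytecode()));
        cc.detach();
    }
}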
I am trying out Java 7 for one project and getting warnings from annotation processors (Bindgen and Hibernate JPA modelgen) of this sort:
warning: Supported source version 'RELEASE_6' from annotation processor 'org.hibernate.jpamodelgen.JPAMetaModelEntityProcessor' less than -source '1.7'
This is caused by the @SupportedSourceVersion(SourceVersion.RELEASE_6) annotation on the annotation processor classes. Since they are compiled with Java 6, the highest value of SourceVersion available to them is RELEASE_6. The Java 7 version of SourceVersion introduces RELEASE_7.
My questions: How are annotation processors supposed to handle forward compatibility? Will there have to be separate jdk6 and jdk7 binary versions of them? Am I not understanding something else here?
I only found the following information regarding this concern:
A Querydsl bug report which used:
@Override
public SourceVersion getSupportedSourceVersion() {
    return SourceVersion.latest();
}
An Oracle blog post in which a commenter recommends supporting the latest source version
Forward compatibility is handled by processing unknown language constructs appropriately, for example by implementing ElementVisitor.visitUnknown.
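Here is a minimal sketch of a processor combining both ideas; the annotation name com.example.MyAnnotation is a hypothetical placeholder, and the visitor simply skips constructs it doesn't recognize instead of failing:

import java.util.Collections;
import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.Element;
import javax.lang.model.element.TypeElement;
import javax.lang.model.util.SimpleElementVisitor7;

public class ForwardCompatibleProcessor extends AbstractProcessor {

    @Override
    public Set<String> getSupportedAnnotationTypes() {
        return Collections.singleton("com.example.MyAnnotation"); // hypothetical
    }

    @Override
    public SourceVersion getSupportedSourceVersion() {
        // Claim whatever source version the running compiler offers
        return SourceVersion.latest();
    }

    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        SimpleElementVisitor7<Void, Void> visitor = new SimpleElementVisitor7<Void, Void>() {
            @Override
            public Void visitUnknown(Element e, Void p) {
                // A construct newer than this processor: skip it instead of failing
                return null;
            }
        };
        for (Element root : roundEnv.getRootElements()) {
            root.accept(visitor, null);
        }
        return false; // don't claim the annotations for other processors
    }
}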
There is another entry in the mentioned Oracle blog, which suggests two policies regarding forward compatibility:
Write the processor to only work with the current language version.
Write the processor to cope with unknown future constructs.
The second one is done by returning SourceVersion.latest() as already posted in the question.
I think it's OK to do this in most cases, i.e. when you are sure additional language elements won't break anything. Of course you shouldn't just assume that everything will be fine even with newer versions; you should test it too.
OK, I guess "processing unknown language constructs appropriately" sounds a bit vague, so here are some examples.
Suppose you have a processor that checks for a custom type of annotation on known language constructs (annotations on a class, for example) and creates a simple documentation of what it has found. You are probably safe to assume it will also work in newer versions. Restricting it to a particular version would not be a good decision, in my opinion.
Suppose you have a processor that checks every element it can find and analyses the code structure to generate a graph out of it. You may get problems with newer versions. You may be able to handle unknown language constructs somehow (for example, by adding an "unknown" node to the graph), but only do this if it makes sense and is worth the trouble. If the processor just wouldn't be useful any more when confronted with something unknown, it should probably stick to a particular Java version.
Regardless of the policy used, the best way, in my opinion, is to monitor upcoming changes to the language and update the processor accordingly. In Java 7, for example, Project Coin introduced some new language features which are most likely not even visible to a processor. Java 8, on the other hand, does have new constructs that will affect processing, for example type annotations. New language features don't happen that often, though, so chances are you won't even need to change anything for a long time.
We know that there are several deprecated items in Java.
Will they be removed?
Have any deprecated items ever been removed from Java?
Will they be removed?
Unlikely, since Java has always been about maintaining backward compatibility, but it can happen. I see deprecation as a warning that the API is either unreliable or somehow seriously flawed.
(Thread has several of these).
Have any of the deprecated items been removed from Java in the past?
AFAIK nothing has been removed, but Thread.destroy() was never implemented, as it was, along with several other Thread methods, inherently unsafe.
This question has been asked elsewhere.
Quite frankly, what the Java team usually does is deprecate the method and move its implementation to the suggested method instead; the deprecated method then just delegates to its replacement.
According to the documentation here, you can see that it says:
A program element annotated @Deprecated is one that programmers are discouraged from using, typically because it is dangerous, or because a better alternative exists.
So a deprecated method or class is basically an older method or class whose use is discouraged because there are newer, more sensible ways to perform that action.
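As a small illustration (the class and method names here are made up), this is the typical shape of a deprecated member that simply forwards to its replacement:

public class Legacy {
    /**
     * @deprecated use {@link #newWay()} instead.
     */
    @Deprecated
    public void oldWay() {
        // Kept for backward compatibility; delegates to the replacement
        newWay();
    }

    public void newWay() {
        // the current implementation
    }
}

Callers of oldWay() still compile and run; they just get a deprecation warning.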
Will these methods ever be removed?
Probably not. A deprecated method will continue to work as before; you just have to deal with the pesky warning. To keep older programs that aren't being updated running correctly, almost all deprecated classes and methods are kept around for that reason alone.
Deprecated APIs are interfaces that are supported only for backwards compatibility. The javac compiler generates a warning message whenever one of these is used, unless the -nowarn command-line option is used. It is recommended that programs be modified to eliminate the use of deprecated APIs, though there are no current plans to remove such APIs – with the exception of JVMDI and JVMPI – entirely from the system.
Have any deprecated items been removed?
In the java.* packages, no. There have been a few changes in the javax.* packages, but in regular Java there has never been a deprecated method or class removed.