How to use the HotSpot JVM @DontInline annotation? - java

I'm currently optimizing a particular method which, unfortunately, the JVM inlines, preventing it from being properly vectorized. I've noticed that there is an annotation to forbid inlining, namely jdk.internal.vm.annotation.DontInline. However, it cannot be accessed from the default module.
Is there a clean way of gaining access to this annotation or to prevent the inlining of the offending method some other way?

DontInline, ForceInline, etc. are JDK-internal annotations; they cannot be applied to user code. Even if you somehow manage to open these annotations, the HotSpot JVM has an explicit check to disallow them for non-privileged classes.
The reasons are understandable: these annotations are an implementation detail of a particular JVM version; JDK developers are free to add, remove, or change the meaning of such annotations without notice, even in a minor JDK update.
Using @DontInline to force vectorization does not seem like a good approach anyway. In general, inlining should not prevent other optimizations. If you encounter such a problem, it's better to report an issue on the hotspot-compiler-dev mailing list.
Now the good news.
Since JDK 9, there is a supported public API to manually tune the JIT compiler: JEP 165 (Compiler Control).
The idea is to provide compiler directives in a separate JSON file and start the JVM with the -XX:CompilerDirectivesFile=<file> option. If your application is sensitive to certain compiler decisions, you may ship the directives file along with the application.
[
  {
    match: "*::*",
    inline: "-org/package/MyClass::hotMethod"
  }
]
It is even possible to apply compiler directives programmatically at run time using the DiagnosticCommand API:
ManagementFactory.getPlatformMBeanServer().invoke(
        new ObjectName("com.sun.management:type=DiagnosticCommand"),
        "compilerDirectivesAdd",
        new Object[]{new String[]{"compiler.json"}},
        new String[]{"[Ljava.lang.String;"}
);
By the way, there is a Vectorize: true option among the available directives, which may help vectorize the particular method.
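For instance, a directives file targeting just that method might look like the sketch below. This is a hedged example: org/package/MyClass::hotMethod is a placeholder, and the exact set of supported options (including Vectorize) varies by JVM version.

```
[
  {
    match: "org/package/MyClass::hotMethod",
    c2: {
      Vectorize: true
    }
  }
]
```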

Related

How does @FunctionalInterface influence the JVM's runtime behavior?

My initial question was an exact duplicate of this one; that is, why does this interface have a runtime retention policy?
But the accepted answer does not satisfy me at all, for two reasons:
the fact that this interface is @Documented has (I believe) nothing to do with it (although why @Documented itself has a runtime retention policy is a mystery to me as well);
even though many "would be" functional interfaces existed prior to Java 8 (Comparable, as the answer mentions, but also Runnable, etc.), this does not prevent them from being used as "substitutes" (for instance, you can perfectly well use a DirectoryStream.Filter as a substitute for a Predicate if all you do is filter on Path).
But still, it has this retention. Which means that it has to influence the JVM behavior somehow. How?
I've found the thread on the core-libs-dev mailing list which discusses the retention of the @FunctionalInterface annotation. The main points mentioned there are to allow third-party tools to use this information for code analysis/validation and to allow non-Java JVM languages to map their lambdas correctly to functional interfaces. Some excerpts:
Joe Darcy (original committer of @FunctionalInterface):
We intentionally made this annotation have runtime retention to
allow it to also be queried to various tools at runtime, etc.
Brian Goetz
There is a benefit for languages other than Java, that can use this as a means to determine whether the interface is suitable for passing to the SAM conversion machinery. The JDK support for lambda conversion is available to other languages as well.
So it seems that it's not used by the JVM itself; it's just an additional possibility for third-party tools. Making the annotation runtime-visible is not a big cost, so it seems there were no strong reasons not to do it.
The only requirement for annotations with the RUNTIME retention policy is:
Annotations are to be recorded in the class file by the compiler and retained by the VM at run time, so they may be read reflectively. (https://docs.oracle.com/javase/7/docs/api/java/lang/annotation/RetentionPolicy.html#RUNTIME)
Now this has some consequences on runtime behaviour, since the class loader must load these annotations and the VM must keep them in memory for reflective access (for example, by third-party libraries).
There is however no requirement for the VM to act on such annotations.
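That reflective access can be seen in a minimal sketch (not from the original thread), which checks for the annotation on two java.lang interfaces:

```java
public class FunctionalInterfaceCheck {
    public static void main(String[] args) {
        // Runnable carries @FunctionalInterface, which has RUNTIME retention,
        // so tools can query it reflectively:
        System.out.println(Runnable.class.isAnnotationPresent(FunctionalInterface.class));
        // Comparable is SAM-shaped but deliberately not annotated:
        System.out.println(Comparable.class.isAnnotationPresent(FunctionalInterface.class));
    }
}
```

This prints true for Runnable and false for Comparable, matching the idea that the annotation documents intent rather than changing JVM behavior.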

Modify already loaded class with Java agent?

Currently I'm trying to modify method bodies residing in classes already loaded by the JVM. I'm aware that the JVM does not normally allow changing the definition of classes that have already been loaded. But my research brought me to implementations like JRebel and the Java Instrumentation API, both using an agent-based approach. I know how to modify bytecode right before a class is loaded, with the help of Javassist. But considering e.g. JRebel in an EJB environment, where class definitions are loaded on application startup, shouldn't bytecode modification be possible on JVM-loaded classes?
Well, you learned that the Instrumentation API exists and it offers redefinition of classes as an operation. So then it is time to rethink your initial premise of "the JVM actually not allowing to change the definition of classes that have already been loaded".
You should note that
as the links show, the Instrumentation API is part of the standard API
the support for class redefinition is, however, optional; you may ask whether the current JVM supports this feature (Instrumentation.isRedefineClassesSupported())
it might not support every class; you may ask whether redefinition is possible for a particular class (Instrumentation.isModifiableClass(Class))
Even if it is supported, the changes may be limited, to cite the documentation:
The redefinition may change method bodies, the constant pool and attributes. The redefinition must not add, remove or rename fields or methods, change the signatures of methods, or change inheritance. These restrictions maybe be lifted in future versions.
at the time you perform the redefinition, there might be threads executing code of methods of these classes; these executions will then complete using the old code
So the Instrumentation API is mainly useful for things like debugging and profiling.
But other frameworks, like EJB containers, offering class reloading in production code, usually resort to creating new ClassLoaders which create different versions of the classes, which are then entirely independent of the older versions.
In a Java runtime environment, the identity of a class consists of a pair of <ClassLoader, Qualified Name> rather than just a qualified name…
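That <ClassLoader, Qualified Name> identity can be demonstrated with a small sketch (not from the original answer; it assumes a JDK, since it uses the system Java compiler to produce a class file, and Java 11+ for Files.writeString):

```java
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;
import javax.tools.ToolProvider;

public class LoaderIdentity {
    // compiles a trivial class, then loads it through two sibling loaders
    static boolean sameClass() throws Exception {
        Path dir = Files.createTempDirectory("ldr");
        Files.writeString(dir.resolve("Hello.java"), "public class Hello {}");
        ToolProvider.getSystemJavaCompiler()
                .run(null, null, null, dir.resolve("Hello.java").toString());
        URL[] urls = { dir.toUri().toURL() };
        // parent = null, so each loader defines Hello itself instead of delegating
        try (URLClassLoader l1 = new URLClassLoader(urls, null);
             URLClassLoader l2 = new URLClassLoader(urls, null)) {
            return l1.loadClass("Hello") == l2.loadClass("Hello");
        }
    }

    public static void main(String[] args) throws Exception {
        // same bytes, same name, but two defining loaders -> two distinct classes
        System.out.println("same Class object: " + sameClass());
    }
}
```

This prints false: the two Class objects are distinct and mutually incompatible, which is exactly what reloading frameworks exploit.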
I wasn't aware that you can use the instrumentation API to redefine classes (see @Holger's answer). However, as he notes, there are some significant limitations on that approach. Furthermore, the javadoc says:
"This method is intended for use in instrumentation, as described in the class specification."
Using it to materially change the semantics of a class is ... all sorts of bad from the perspective of the Java type system.

How can we measure the execution time of private methods as well?

We use interceptors to measure the execution time of a bean's public method invocations. Nonetheless, when a bean's method invokes other private methods, the audit interceptor seems to be bypassed.
How can we measure the execution time of private methods as well?
Using AOP
You could implement a benchmarker using an Aspect-Oriented Programming library like AspectJ.
For instance, see:
this SO question on Catching Private or Inner Methods with AspectJ,
or this article on Profiling with AspectJ.
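If full AOP tooling is overkill, a crude stdlib-only alternative (not mentioned in the original answers) is to invoke the private method reflectively and time it yourself; fib here is a hypothetical stand-in for the bean's internal logic:

```java
import java.lang.reflect.Method;

public class PrivateTimer {
    // hypothetical private method standing in for the bean's internal logic
    private static long fib(int n) {
        return n < 2 ? n : fib(n - 1) + fib(n - 2);
    }

    public static void main(String[] args) throws Exception {
        Method m = PrivateTimer.class.getDeclaredMethod("fib", int.class);
        m.setAccessible(true); // required to call private methods from outside the class
        long t0 = System.nanoTime();
        Object result = m.invoke(null, 25);
        long elapsedNs = System.nanoTime() - t0;
        System.out.println("fib(25) = " + result + " took " + elapsedNs + " ns");
    }
}
```

Note that reflective invocation adds its own overhead, so this only gives rough numbers; weaving with AspectJ measures the method closer to its real cost.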
Using a Profiler
You could implement your own agent extension (for instance, for JProfiler).
Or you could give up on your interceptors and simply use any profiler that allows capturing snapshots and recording execution times.
Using JVMTI
Which is what some profilers do, actually.
You could resort to using the JVMTI API (not entirely sure this would fly, to be honest) to implement your own code inspector and directly hook yourself into the JVM.
The Sneaky and Evil Inlining Issue
Regarding jb's (valid) concern in his answer that private methods might be inlined at either compilation time or runtime: some JVMs may not do it, or may allow disabling this feature.
Oracle's JRockit has a -XnoOpt option that would disable optimizations (including this particular one).
Oracle/Sun's HotSpot at least used to have -XX:-Inline (not sure it still exists or does anything).
However, it means you won't measure exactly what you'd have in production, where inlining is activated. Still, it's probably handy for inspecting your code.
Interceptors are applied by the EJB container on invocation of interface methods; your private methods are invisible to it. What about using a profiling tool instead?
Well, AFAIK private methods can be inlined at the JVM's leisure (even at compilation time), so they can't be profiled, since they might not exist in the bytecode.
I suppose you could mark your methods protected, so they will not be inlined even in production, and then profile.
If you want to profile your test instance, you might try VisualVM, a graphical tool that profiles JVM instances and does all the instrumentation needed: http://visualvm.java.net/. Moreover, it is a standard tool in most JDK distros.

Forward compatible Java 6 annotation processor and SupportedSourceVersion

I am trying out Java 7 for one project and getting warnings from annotation processors (Bindgen and Hibernate JPA modelgen) of this sort:
warning: Supported source version 'RELEASE_6' from annotation processor 'org.hibernate.jpamodelgen.JPAMetaModelEntityProcessor' less than -source '1.7'
This is caused by the @SupportedSourceVersion(SourceVersion.RELEASE_6) annotation on the annotation processor classes. Since they are compiled with Java 6, the highest value of SourceVersion available to them is RELEASE_6. The Java 7 version of SourceVersion introduces RELEASE_7.
My questions: How are annotation processors supposed to handle forward compatibility? Will there have to be separate jdk6 and jdk7 binary versions of them? Am I not understanding something else here?
I only found the following information regarding this concern:
Querydsl bug report which used
@Override
public SourceVersion getSupportedSourceVersion() {
    return SourceVersion.latest();
}
Oracle blog in which a commenter recommends supporting the latest source version
Forward compatibility is handled by processing unknown language constructs appropriately, for example by implementing ElementVisitor.visitUnknown.
There is another entry in the mentioned Oracle blog, which suggests two policies regarding forward compatibility:
Write the processor to only work with the current language version.
Write the processor to cope with unknown future constructs.
The second one is done by returning SourceVersion.latest() as already posted in the question.
I think it's OK to do this in most cases, when you are sure additional language elements won't break anything. Of course, you shouldn't just assume that everything will be fine even with newer versions; you should test it too.
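Concretely, a minimal processor following the second policy might look like this sketch (the class name is illustrative, and the empty process body is where real processing would go):

```java
import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.TypeElement;

@SupportedAnnotationTypes("*")
public class ForwardCompatibleProcessor extends AbstractProcessor {

    // claim whatever version the running compiler supports,
    // instead of pinning a RELEASE_* constant at compile time
    @Override
    public SourceVersion getSupportedSourceVersion() {
        return SourceVersion.latest();
    }

    @Override
    public boolean process(Set<? extends TypeElement> annotations,
                           RoundEnvironment roundEnv) {
        return false; // don't claim the annotations; real processing goes here
    }
}
```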
Ok, I guess processing unknown language constructs appropriately sounds a bit vague, so here are some examples.
Suppose you have a processor that checks for a custom type of annotation on known language constructs (annotations on a class, for example) and creates a simple documentation of what it has found. You are probably safe to assume it will also work in newer versions. Restricting it to a particular version would not be a good decision, in my opinion.
Suppose you have a processor that checks every element it can find and analyses the code structure to generate a graph out of it. You may get problems with newer versions. You may be able to handle unknown language constructs somehow (such as by adding an "unknown" node to the graph), but only do this if that makes sense, and if it's worth the trouble. If the processor just wouldn't be useful any more when confronted with something unknown, it should probably stick to a particular Java version.
Regardless of the policy used, the best way in my opinion would be to monitor upcoming changes to the language and update the processor accordingly. In Java 7, for example, Project Coin introduced some new language features which are most likely not even visible to a processor. Java 8, on the other hand, does have new constructs that will affect processing, for example type annotations. New language features don't happen that often, though, so chances are that you won't need to change anything for a long time.

Java: Locate reflection code usage

We have a huge codebase, and some classes are often used via reflection all over the code. We can safely remove classes and the compiler is happy, but some of them are used dynamically via reflection, so I can't locate those uses other than by searching strings...
Is there some reflection explorer for Java code?
There is no simple tool to do this. However, you can use code coverage instead. This gives you a report of all the lines of code executed, which can be even more useful for improving test code or removing dead code.
Reflection is by definition very dynamic, and you have to run the right code to see what it does, i.e. you have to have reasonable tests. You can add logging to everything reflection does if you can access this code, or perhaps you can instrument these libraries (or change them directly).
I suggest, using an appropriately licensed source for your JRE, modifying the reflection classes to log when classes are used via reflection (use a Map/WeakHashMap to ignore duplicates). Your modified system classes can replace those in rt.jar with -Xbootclasspath/p: on the command line (on the Oracle/Sun JRE; others will presumably have something similar). Run your program and tests and see what comes up.
(Possibly you might have to hack around issues with class loading order in the system classes.)
I doubt any such utility is readily available, but I could be wrong.
This is quite complex, considering that dynamically loaded classes (via reflection) can themselves load other classes dynamically and that the names of loaded classes may come from variables or some runtime input.
Your codebase probably does neither of these. If this is a one-time effort, searching strings might be a good option. Or you can look for calls to reflection methods.
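For the string-search approach, these are the typical reflective entry points worth grepping for, shown in a small runnable sketch (java.util.ArrayList is just a stand-in target class):

```java
public class ReflectionProbe {
    public static void main(String[] args) throws Exception {
        // the call patterns a string search should target:
        // Class.forName(...), ClassLoader.loadClass(...), newInstance(...)
        Class<?> c = Class.forName("java.util.ArrayList");
        Object o = c.getDeclaredConstructor().newInstance();
        System.out.println(c.getName() + " loaded reflectively -> " + o);
    }
}
```

Searching for Class.forName, loadClass, and getDeclaredConstructor/newInstance call sites narrows down where class names might be built dynamically.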
As the other posters have mentioned, this cannot be done with static analysis due to the dynamic nature of reflection. If you are using Eclipse, you might find the coverage tool EclEmma useful; it's very easy to work with.
