I'm seriously considering adding default methods to Java 8's Iterable/List/etc., instead of using the current approach via streams/static methods, which has made my code rather long and difficult to read, especially for the numerous small pieces of code where a simple list filter/combine is required, such as:
printLines(myList.filter(a -> a.alive).map(a -> a.name))
I understand this would require every environment that compiles and executes my code to have specific jars in jre\lib\endorsed, but that doesn't matter for our project since it has to ship with an embedded JRE anyway. Binary compatibility is fine, as described in the Java 8 documentation. The endorsed mechanism is also not restricted by the binary license (unlike -Xbootclasspath). What else should I consider? Has anyone, or any project, done this already?
The jre\lib\endorsed approach works. Tested by running Eclipse/IntelliJ/NetBeans and LWJGL/JOGL demos without any problems.
I have to use the sources from OpenJDK to avoid license problems. New default methods are placed in separate interfaces, like this:
public interface Iterable<T> extends IterableExt<T> {
    Iterator<T> iterator();
    ....
}

public interface IterableExt<T> {
    default boolean all(Predicate<? super T> filter) {
        for (T item : (Iterable<T>) this) {
            if (!filter.test(item)) {
                return false;
            }
        }
        return true;
    }

    default Iterable<T> filter(Predicate<? super T> filter) {
        Iterable<T> thiz = (Iterable<T>) this;
        return () -> new FilteredIterator<>(thiz.iterator(), filter);
    }
}
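As an aside, FilteredIterator itself isn't shown above. A minimal sketch of what such a class might look like (the name comes from the snippet; the implementation here is an assumption, not the project's actual code):

```java
import java.util.Iterator;
import java.util.NoSuchElementException;
import java.util.function.Predicate;

// Hypothetical implementation of the FilteredIterator referenced by filter():
// lazily skips elements that fail the predicate.
public class FilteredIterator<T> implements Iterator<T> {
    private final Iterator<T> source;
    private final Predicate<? super T> filter;
    private boolean nextReady;
    private T next;

    public FilteredIterator(Iterator<T> source, Predicate<? super T> filter) {
        this.source = source;
        this.filter = filter;
    }

    @Override
    public boolean hasNext() {
        // Advance the source until a matching element is buffered, or it runs out.
        while (!nextReady && source.hasNext()) {
            T candidate = source.next();
            if (filter.test(candidate)) {
                next = candidate;
                nextReady = true;
            }
        }
        return nextReady;
    }

    @Override
    public T next() {
        if (!hasNext()) {
            throw new NoSuchElementException();
        }
        nextReady = false;
        return next;
    }
}
```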
Deployment is simple enough: just wrap everything into a jar and put it into %JRE_HOME%\lib\endorsed. Since it applies to the entire system and the jar is locked on Windows, it's impossible to upgrade it without stopping all running Java programs.
Another problem is that classes compiled by Eclipse's compiler don't work when later referenced by javac ("bad class file" errors, etc.), but recompiling everything with javac works for both.
EDIT: I uploaded the source to https://github.com/AqD/JXTN (.axi project) under public domain.
I'm attempting to compile a Google Cloud Endpoints project in Java, but I'm getting the following error:
There was an error running endpoints command get-client-lib: Object type K not supported.
I don't have any public methods in any of my API classes that take in or return a generic type (K or otherwise), nor any class specifically named K. My body types should all be entity types according to the documentation's definitions.
The biggest change I've made recently is that I moved my model (which defines the entity types) to a separate project and I'm including those classes as a JAR dependency.
I'm on the App Engine SDK version 1.9.51 (recently upgraded to this), though using previous versions doesn't seem to change anything. I'm using Gradle with the gradle-appengine-plugin (eventually planning to migrate to the newer app-gradle-plugin).
Any thoughts on why this is happening, and what steps I might take to resolve it?
EDIT:
I seem to have isolated the problem, but it hasn't resolved the issue.
I have a list method in my API that returns an object containing a list of other objects. Something like:
public class ListWrapper implements Serializable {
    private List<Object> list;

    public List<Object> getList() { return list; }
    public void setList(List<Object> list) { this.list = list; }
}
It seems to be this List type which is suddenly the problem. If I remove it from the class, it works fine. If I edit the method to return something else, it works fine. But in a previous version of the code, this ListWrapper object existed and was returned exactly as it is, and it worked fine. What has changed? (Reverting to earlier versions of the App Engine SDK doesn't seem to help.)
I'm building a library that requires some annotation processing to generate code. I've now run into an issue: the release build doesn't need as much generated code as the debug build does (since this is a library for modifying configuration variants, primarily used for testing purposes). The following code illustrates the situation. Let's say I want to generate a class ConfigManager from some annotated classes and properties. In debug builds, I need this much:
public class ConfigManager {
    public Class getConfigClass() {
        return abc.class;
    }

    public void method1() {
        doSomething1();
    }

    public void method2() {
        doSomething2();
    }

    public void method3() {
        doSomething3();
    }
}
While in release builds, I only need this much:
public class ConfigManager {
    public Class getConfigClass() {
        return abc.class;
    }
}
I have a feeling this may be possible by writing a Gradle plugin that checks the build flavor at compile time and either invokes a different processor or somehow passes a parameter to the processor so it generates different code. However, this topic is pretty new to me, so I'm not sure how to achieve it, and a couple of hours of googling didn't help either. Could anyone give me a direction or an example? Thanks
Pass an option (release=true/false) to your processor.
From the javac documentation https://docs.oracle.com/javase/8/docs/technotes/tools/windows/javac.html
-Akey[=value]
Specifies options to pass to annotation processors. These options are not interpreted by javac directly, but are made available for use by individual processors. The key value should be one or more identifiers separated by a dot (.).
In combination with Processor.getSupportedOptions() https://docs.oracle.com/javase/8/docs/api/javax/annotation/processing/Processor.html#getSupportedOptions
Returns the options recognized by this processor. An implementation of the processing tool must provide a way to pass processor-specific options distinctly from options passed to the tool itself, see getOptions.
Implementation outline:
public Set<String> getSupportedOptions() {
    Set<String> set = new HashSet<>();
    set.add("release");
    return set;
}

// -Arelease=true
boolean isRelease(ProcessingEnvironment env) {
    return Boolean.parseBoolean(env.getOptions().get("release"));
}
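Putting those pieces together, a minimal processor sketch might look like the following. The class name and the generation branches are placeholders for illustration, not code from the question:

```java
import java.util.Collections;
import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.lang.model.element.TypeElement;

// Hypothetical processor that generates less code when -Arelease=true is passed.
public class ConfigProcessor extends AbstractProcessor {

    @Override
    public Set<String> getSupportedOptions() {
        return Collections.singleton("release");
    }

    @Override
    public Set<String> getSupportedAnnotationTypes() {
        return Collections.singleton("*");
    }

    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        // processingEnv.getOptions() exposes the -A options passed to javac.
        boolean release = Boolean.parseBoolean(processingEnv.getOptions().get("release"));
        if (release) {
            // generate only getConfigClass()
        } else {
            // generate getConfigClass() plus method1()/method2()/method3()
        }
        return false;
    }
}
```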
See Pass options to JPAAnnotationProcessor from Gradle for how to pass options in a gradle build.
While trying to upgrade a big project to JDK 8, compilation goes really slowly compared to JDK 7.
Running the compiler in verbose mode shows that the JDK 8 compiler stops at a big generated converter class (Mapping) that converts entities from server to client.
The converter methods in several cases call other converter methods from the same Mapping class.
As a workaround I tried to split the Mapping file into multiple files. This visibly improved performance when compiling only the Mapping class or its containing project (projectA), but compile time was still very slow for other projects that invoke converter methods from projectA.
Another workaround was to make all convert methods return null, without calling anything else. Again, the performance was good for projectA but not for the depending projects.
ProjectA uses generics, but since it is compatible with JDK 6, which didn't have generalized type inference, maybe it's another JDK 8 bug that causes this slowdown.
Possibly out of context, but for generalized type inference some threads like the one below suggest an upgrade to JDK 9. Since JDK 9 is not yet released, though, it's not a viable upgrade option.
It'd be ideal if the fix were backported to JDK 8. This was requested in the following StackOverflow thread, but there has been no reply from the Oracle team yet.
Slow compilation with jOOQ 3.6+, plain SQL, and the javac compiler
I've attached 2 screenshots of how the heap looks in JDK7 vs JDK8. Could this be a cause for the JDK8 slowdown?
Thank you!
Update 20160314
The converter methods from Mapping class look like:
public static ResponseItemVO convert(ResponseItem pArg0) {
    if (pArg0 == null) {
        return null;
    }
    ResponseItemVO ret = new ResponseItemVO();
    ret.setErrorDetails(pArg0.getErrorDetails());
    ret.setResult(Mapping.convert(pArg0.getResult()));
    ret.setIdentifier(Mapping.convert(pArg0.getIdentifier()));
    return ret;
}
And the VO looks like:
public class ResponseItemVO extends ResultVO<IdentifierVO, DetailsVO> {
    public ResponseItemVO() {}
}
JDK7 Heap:
JDK8 Heap:
As you've noticed already, there's a severe performance regression in Java 8 when it comes to overload resolution based on generic target typing. One of the reasons in your case might be the fact that the compiler needs to find the appropriate method from an assignment's target type:
ResultVO<Something, Something> result = Mapping.convert(...);
// heavy lookup here ---------------------------^^^^^^^
If you're in control of the code generator, and not constrained by backwards compatibility, it might be worth thinking about avoiding the overloading of the convert() method. Without overloading, the compiler doesn't have to do any overload resolution work, neither inside your mapping code nor at the call site. This will certainly be much, much faster.
Attempt 1: By using the parameter type in the method name:
class Mapping {
    public static ResponseItemVO convertResponseItem(ResponseItem pArg0) {
        if (pArg0 == null) {
            return null;
        }
        ResponseItemVO ret = new ResponseItemVO();
        ret.setErrorDetails(pArg0.getErrorDetails());
        ret.setResult(Mapping.convertResult(pArg0.getResult()));
        ret.setIdentifier(Mapping.convertIdentifier(pArg0.getIdentifier()));
        return ret;
    }
}
Attempt 2: By moving the convert method elsewhere, e.g. into the VO type
class ResponseItemVO {
    public static ResponseItemVO from(ResponseItem pArg0) {
        if (pArg0 == null) {
            return null;
        }
        ResponseItemVO ret = new ResponseItemVO();
        ret.setErrorDetails(pArg0.getErrorDetails());
        ret.setResult(ResultVO.from(pArg0.getResult()));
        ret.setIdentifier(IdentifierVO.from(pArg0.getIdentifier()));
        return ret;
    }
}
Or better...
class ResponseItem {
    public ResponseItemVO toVO() {
        ResponseItemVO ret = new ResponseItemVO();
        ret.setErrorDetails(getErrorDetails());
        ret.setResult(getResult().toVO());
        ret.setIdentifier(getIdentifier().toVO());
        return ret;
    }
}
I've got a project that was originally written for Java 1.4, but I only have Java 6 on my Mac and I cannot install Java 1.4.
Normally, I'd use a line like this to compile:
javac -source=1.4 -target=1.4 MyClass.java
However, MyClass.java implements the java.sql.ResultSet interface, which added several new methods in Java 6, so I get compile errors like:
MyClass is not abstract and does not override abstract method
updateNClob(java.lang.String,java.io.Reader) in java.sql.ResultSet
I cannot simply implement the missing methods because many use generics, which are not available in Java 1.4.
It seems a solution would be to obtain and compile against the Java 1.4 JARs. So, I've got a few questions:
Is there a better way?
How do I specify to my Java 1.6 javac that I'd like to use the 1.4 JARs instead of the Java 6 JARs?
Will this even work, and if so, will the project run on Java 1.4 as well as Java 6?
How do I do this in Maven?
Thanks!
Your situation seems quite contrived; I'll try to simplify matters. For now, I am going to ignore your question about Maven.
So let me first state some facts:
-source=1.4 means: Dear compiler, please accept only language constructs --- not library features --- which were available with javac of JDK 1.4.
-target=1.4 means: Dear compiler, please write class files in a binary file format which is compatible with a JRE 1.4.
I gather that you are interested in load-time compatibility with JDK 1.4, i.e. you want the class files produced in your setup to be loadable by a JDK 1.4. Is that right?
Do you also want to support source compatibility, i.e. allow others to compile your code on a JDK 1.4?
If the answer to the last question is yes, I would try to install JDK 1.4 on OS X. It supports multiple installed JDKs, so I am pretty sure it is possible. If that is not an option, use:
-source=1.4 -target=1.4 -bootclasspath=[path/to/1.4.jar]
Note: do not use -Xbootclasspath. That changes the boot classpath of the JVM executing javac.
If the answer to the above question is no, you can drop -source=1.4, allowing you to use generics and other Java 5 enhancements in your code. But you still have to provide binary compatibility by using:
-target=1.4 -bootclasspath=[path/to/1.4.jar]
Another option would be to use Retroweaver.
After re-reading your question, I'd like to add that you have to get hold of the JDK 1.4 variants of the JDBC class files. Otherwise you'll run into the compiler errors you've shown in your question.
Unless you are a JDBC vendor, it is unwise to implement interfaces like this one.
Consider using a proxy to maintain compatibility across JVM versions.
Migrating to a proxy is accomplished as follows. Consider this ResultSet implementation:
public class ResultSetFoo implements ResultSet {
    public String getString(int columnIndex) throws SQLException {
        return "foobar";
    }
    // other Java 1.4 methods
This would be changed so no classes implement ResultSet:
public class ResultBar {
    public String getString(int columnIndex) throws SQLException {
        return "foobar";
    }
    // other method signatures matching the 1.4 ResultSet, as before
You would then need to build a mapping of methods between the two types at runtime (a primitive form of duck typing):
private static final Map RESULT_SET_DUCK = initResultSet();

private static Map initResultSet() {
    Map map = new HashMap();
    Method[] methods = ResultSet.class.getMethods();
    for (int i = 0; i < methods.length; i++) {
        try {
            Method match = ResultBar.class.getMethod(methods[i].getName(),
                    methods[i].getParameterTypes());
            map.put(methods[i], match);
        } catch (SecurityException e) {
            throw new IllegalStateException(e);
        } catch (NoSuchMethodException e) {
            // OK; not supported in 1.4
        }
    }
    return map;
}
This allows you to invoke the ResultBar type by proxy:
/** Create a java.sql.ResultSet proxy */
public static ResultSet proxy(final ResultBar duck) {
    class Handler implements InvocationHandler {
        public Object invoke(Object proxy, Method method, Object[] args)
                throws Throwable {
            Method proxiedMethod = (Method) RESULT_SET_DUCK.get(method);
            if (proxiedMethod == null) {
                throw new UnsupportedOperationException("TODO: method detail");
            } else {
                return invoke(proxiedMethod, duck, args);
            }
        }

        private Object invoke(Method m, Object target, Object[] args)
                throws Throwable {
            try {
                return m.invoke(target, args);
            } catch (InvocationTargetException e) {
                throw e.getCause();
            }
        }
    }
    return (ResultSet) Proxy.newProxyInstance(null,
            new Class[] { ResultSet.class }, new Handler());
}
Such an implementation should allow code compiled in one JVM version to be used in future JVMs even if new methods are added. Existing method signatures are unlikely to change, because it is one thing to make database vendors do some work, and something else entirely to make all API consumers change.
You may need to change how class instances are created. You can no longer use a constructor directly:
ResultSet nonPortable = new ResultSetFoo();
//becomes...
ResultSet portable = proxy(new ResultBar());
If you're already employing a factory/builder/etc. pattern this bit is easy.
Although reflection is relatively cheap in the latest JVMs, it is less so in older versions; this may have a detrimental effect on performance.
How do I specify to my Java 1.6 javac that I'd like to use the 1.4 JARs instead of the Java 6 JARs?
On Windows & *nix it would be done by specifying the bootclasspath option. See javac: Cross-Compilation Options for more details.
I'm using the Java Compiler API to compile in-memory classes. That is, classes are compiled to bytecode (no .class files stored on disk) and then loaded by reconstructing the bytecode.
Sometimes, I need to compile a class that depends on another, also in-memory compiled, class. For instance: Compile Class A, then compile Class B which depends on Class A.
To solve this, I pass both Class A and Class B as the compilation units needed by the getTask method of the compiler API.
However, I really don't like this solution, as it makes me recompile Class A which was already compiled.
Is there a way to get around this?
EDIT: I found a solution through this link: http://www.ibm.com/developerworks/java/library/j-jcomp/index.html
Yes, this is totally possible as long as you properly implement the ForwardingJavaFileManager. The two most important methods are inferBinaryName() and list(). If you set these two up properly, the compiler will be able to resolve classes that you've previously compiled.
inferBinaryName() must return the class' simple name (e.g. the inferred binary name for com.test.Test would be just Test). Here is my implementation (my subclass of JavaFileObject is called InAppJavaFileObject):
@Override
public String inferBinaryName(Location location, JavaFileObject javaFileObject) {
    if (location == StandardLocation.CLASS_PATH && javaFileObject instanceof InAppJavaFileObject) {
        return StringUtils.substringBeforeLast(javaFileObject.getName(), ".java");
    }
    return super.inferBinaryName(location, javaFileObject);
}
Note that I'm stripping off ".java" from the end. When constructing a JavaFileObject, the file name must end in ".java", but if you don't strip the suffix later, the compiler won't find your class.
list() is a little bit more complicated because you have to be careful to play along nicely with your delegate file manager. In my implementation, I keep a map of fully-qualified class name to my subclass of JavaFileObject that I can iterate over:
@Override
public Iterable<JavaFileObject> list(Location action, String pkg, Set<JavaFileObject.Kind> kind, boolean recurse) throws IOException {
    Iterable<JavaFileObject> superFiles = super.list(action, pkg, kind, recurse);
    // see if there's anything in our cache that matches the criteria.
    if (action == StandardLocation.CLASS_PATH && (kind.contains(JavaFileObject.Kind.CLASS) || kind.contains(JavaFileObject.Kind.SOURCE))) {
        List<JavaFileObject> ourFiles = new ArrayList<JavaFileObject>();
        for (Map.Entry<String,InAppJavaFileObject> entry : files.entrySet()) {
            String className = entry.getKey();
            if (className.startsWith(pkg) && ("".equals(pkg) || pkg.equals(className.substring(0, className.lastIndexOf('.'))))) {
                ourFiles.add(entry.getValue());
            }
        }
        if (ourFiles.size() > 0) {
            for (JavaFileObject javaFileObject : superFiles) {
                ourFiles.add(javaFileObject);
            }
            return ourFiles;
        }
    }
    // nothing found in our hash map that matches the criteria... return
    // whatever super came up with.
    return superFiles;
}
Once you have those methods properly implemented, the rest just works. Enjoy!
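For completeness, here's a minimal, self-contained sketch of the batch approach from the question: compiling two dependent sources in one getTask() call via the Compiler API. The class names and sources are made up, and the custom file manager described above is what would let you drop A from later compilations:

```java
import java.net.URI;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;
import javax.tools.JavaCompiler;
import javax.tools.SimpleJavaFileObject;
import javax.tools.ToolProvider;

// Illustration only: compile class A and class B (which depends on A)
// as one batch of compilation units.
public class InMemoryCompileDemo {

    // A string-backed source file, as commonly used with the Compiler API.
    static class StringSource extends SimpleJavaFileObject {
        private final String code;

        StringSource(String className, String code) {
            super(URI.create("string:///" + className.replace('.', '/') + ".java"),
                  Kind.SOURCE);
            this.code = code;
        }

        @Override
        public CharSequence getCharContent(boolean ignoreEncodingErrors) {
            return code;
        }
    }

    public static boolean compile() throws Exception {
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
        StringSource a = new StringSource("A",
                "public class A { public int one() { return 1; } }");
        StringSource b = new StringSource("B",
                "public class B { public int two() { return new A().one() + 1; } }");
        // Write class files to a temp directory; a custom file manager
        // could instead keep the bytecode in memory and cache A.
        Path out = Files.createTempDirectory("classes");
        JavaCompiler.CompilationTask task = compiler.getTask(
                null, null, null,
                Arrays.asList("-d", out.toString()), null,
                Arrays.asList(a, b));
        return task.call();
    }
}
```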
That leads to the obvious question of why you want to compile class A separately first. Why not just compile everything in one go?
What if you maintain the modified times of the files alongside the (in-memory) compiled bytecode?
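A sketch of that idea, with hypothetical names: keep the compiled bytecode keyed by class, and treat an entry as stale when the recorded modification time no longer matches, so only stale classes are recompiled:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical cache: recompile a class only when its source's
// modification time differs from the cached one.
public class BytecodeCache {
    private static class Entry {
        final long modifiedTime;
        final byte[] bytecode;

        Entry(long modifiedTime, byte[] bytecode) {
            this.modifiedTime = modifiedTime;
            this.bytecode = bytecode;
        }
    }

    private final Map<String, Entry> cache = new HashMap<>();

    /** Returns cached bytecode, or null if absent or stale (caller recompiles). */
    public byte[] get(String className, long modifiedTime) {
        Entry e = cache.get(className);
        return (e != null && e.modifiedTime == modifiedTime) ? e.bytecode : null;
    }

    public void put(String className, long modifiedTime, byte[] bytecode) {
        cache.put(className, new Entry(modifiedTime, bytecode));
    }
}
```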
I don't think you can avoid compiling both classes. In fact, if you don't compile both of them, there is a chance that you will end up with binary compatibility problems, or problems with incorrect inlined constants.
This is essentially the same problem as you'd get if you compiled one class and not the other from the command line.
But to be honest, I wouldn't worry about trying to optimize the compilation like that. (And if your application needs to be able to dynamically compile one class and not the other, it probably has significant design issues.)