Compatibility between protobuf 2.3.0 and 2.6.0 - java

I'm working with two sets of protobuf bindings, A and B.
A was generated with protoc version 2.3.0, and B was generated with protoc version 2.6.0. In the application that uses both of them, I'm using the protobuf-java library version 2.6.0.
With this setup I get the following kind of error when interacting with the A bindings:
java.lang.RuntimeException: Generated message class "A$Builder" missing method "getAMessageBuilder"
There are plenty of posts and questions about backwards compatibility between protobuf schemas, but what about library versions? Are there any guarantees between protobuf versions?

Protobuf for Java supports running older generated code against a newer runtime library, and it also supports mixing generated code from different compiler versions. However, this support only exists starting from version 3.0 (see the change log entry here). So in your case it would probably be best to upgrade to 3.0 (or higher) if possible and regenerate your code with the 3.0 protoc. After that one-time step you will no longer need to worry about regenerating your code when you update the protobuf library version.
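For illustration, here is a minimal sketch of what that looks like on the code side once the bindings are regenerated with a 3.x protoc and compiled against the protobuf-java 3.x runtime. AProto.A and its helper are hypothetical stand-ins for your actual generated classes; the point is that the wire format itself is unchanged, so bytes written by the old 2.x-generated bindings still parse with the regenerated classes.

import com.google.protobuf.InvalidProtocolBufferException;

public class LegacyPayloadReader {
    // AProto.A is a hypothetical message class regenerated with protoc 3.x.
    // Bytes serialized by the old 2.3.0/2.6.0 bindings use the same wire format,
    // so they can be parsed directly by the regenerated class.
    public static AProto.A parseLegacy(byte[] bytesFromOldBindings)
            throws InvalidProtocolBufferException {
        return AProto.A.parseFrom(bytesFromOldBindings);
    }
}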

Related

ANTLR Tool version mismatch

I have been working on a Java Eclipse project that uses ANTLR 4.4. In my project, I need to use classes from another Java project that in turn uses ANTLR 4.6. As a result, I get the error message: ANTLR Tool version 4.6 used for code generation does not match the current runtime version 4.4. I have no idea how to solve this problem without breaking the code.
Pretty simple: use the runtime with the same version as the one used to generate your parser files. In this case, upgrade your runtime to 4.6.
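If you're unsure which runtime JAR Eclipse actually puts on the classpath, a quick diagnostic (not part of the original answer) is to print the runtime's own version constant; org.antlr.v4.runtime.RuntimeMetaData is the class that performs the version check behind the error you see:

import org.antlr.v4.runtime.RuntimeMetaData;

public class AntlrRuntimeCheck {
    public static void main(String[] args) {
        // Prints the version of the antlr4-runtime JAR that was actually loaded.
        // It should match the tool version used to generate the parsers (4.6 here).
        System.out.println("ANTLR runtime version: " + RuntimeMetaData.VERSION);
    }
}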

How to validate dependencies' java version when compiling using higher version JDK?

We're using Java 8 for most modules/projects, but for some of the modules we use Java 6 (a customer requirement).
The developers have Java 8 installed, and we compile the Java 6 projects using these flags:
compileJava {
    sourceCompatibility = 1.6
    targetCompatibility = 1.6
}
We thought we were all good until we upgraded Guava from v20 to the latest version, 28.1-jre.
To our surprise, the build succeeded but the application failed at runtime.
We have a workaround for building for Java 6 using a specific javac found in JDK 6. See more info here. This workaround yields the error class file has wrong version 52.0, should be 50.0 at compile time. The downside is that it requires developers to download, configure, and use JDK 6.
Is there a way to validate the dependencies' Java version at compile time when using a higher Java version (without installing a lower Java version)? Thanks.
Setting -source and -target values to 1.6 is insufficient to ensure that the resulting output is compatible with 1.6. The program itself must not have any library API dependencies on later versions, and the -source and -target options don't do that. (GhostCat said pretty much the same thing.)
For example, in Java 8, ConcurrentHashMap added a covariant override for the keySet method that returns a new type ConcurrentHashMap.KeySetView. This type didn't exist in earlier versions of Java. However, in the class binary, the return type is encoded at the call site. Thus, even if the source code is compiled with -source 1.6 -target 1.6, the resulting class file contains a dependency on the Java 8 class library API.
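As a concrete illustration of that trap, the snippet below is valid Java 6 source, yet if it is compiled on JDK 8 with -source 1.6 -target 1.6 (and without -Xbootclasspath), the keySet() call site is recorded with the Java 8 return type, and the class fails on a Java 6 or 7 JRE:

import java.util.concurrent.ConcurrentHashMap;

public class KeySetTrap {
    public static void main(String[] args) {
        ConcurrentHashMap<String, Integer> map =
                new ConcurrentHashMap<String, Integer>();
        map.put("answer", 42);
        // Compiled against the Java 8 class library, this call is encoded as
        // keySet()Ljava/util/concurrent/ConcurrentHashMap$KeySetView;
        // which does not exist on Java 6/7, hence NoSuchMethodError at runtime.
        for (String key : map.keySet()) {
            System.out.println(key + " -> " + map.get(key));
        }
    }
}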
The only solution to this is to ensure that only Java 1.6 compatible libraries are in the classpath at compile time. This can be done using the -Xbootclasspath option to point to a JDK 1.6 class library, or it might be simpler just to use a JDK 1.6 installation in the first place.
This applies to external libraries in addition to the JDK, as you've discovered with Guava. The Animal Sniffer project provides plugins for Ant and Maven that check library dependencies for version problems. Offhand I don't know if there is something similar for Gradle. There might be a way to get Animal Sniffer to work with Gradle, but I have no experience doing that.
Is there a way to validate the dependencies' Java version at compile time when using a higher Java version (without installing a lower Java version)?
You specify your dependencies. When you tell your build system to explicitly use some library X in version Y, then you have made a very clear statement.
And you see, it is not only about the class file version number. What if some person doesn't pay attention and compiles something with Java 8 ... with a Java 6 target, but forgets that the code base uses Java 8-only API calls?!
In other words: you are looking in the wrong place.
The person who updates the build description and changes a library version from Y to Y+8 needs to carefully assess that change, for example by reading the release notes.
I agree that a really clever build system could check whether the libraries you are using come with a matching class file version. But as said, that is only one aspect of the problem. So instead of looking for a technical solution, I think the real answer is: don't bump version numbers because you can, but because you have to. And that manual step of changing the version number is something that requires due diligence (on the side of the human doing it).
Thus: I think the sanest approach here is to compile the Java 6 deliverables within their own specific build setup, which you only touch after careful inspection of such details. And sure: convince your customer to move on and give up a long-dead version of Java.

OpenNLP - Model version 1.6.0 not supported by this (1.5.3) version of OpenNLP

I am currently trying to use a custom-trained OpenNLP Name Finder model in code. My project uses OpenNLP 1.6.0 and is developed using Eclipse IDE. The model was also trained using OpenNLP 1.6.0.
However, I'm getting this annoying error:
java.lang.IllegalArgumentException : opennlp.tools.util.InvalidFormatException: Model version 1.6.0 is not supported by this (1.5.3) version of OpenNLP!
A similar question was asked here, and the answer stated that the problem was due to the model being trained with a different OpenNLP version than the one in use (i.e. training a model with one version and loading it in a project that uses another). However, I also have other Java projects using OpenNLP 1.6.0, and they were able to load the model successfully, so I don't think this applies to me.
The .classpath of my project also shows that the project is referencing the OpenNLP 1.6.0 libraries.
I know the question is rather vague, but if anyone has any insight into why this could be happening, please let me know!
To sum up: Unable to load custom trained OpenNLP Name Finder model in code due to apparent OpenNLP version incompatibility. Model was trained in OpenNLP 1.6.0, which my project also uses. Other projects also using 1.6.0 were able to load the model successfully.
I've pinpointed the source of my error: my project was also making use of the Apache Tika 1.13 library, which pulls in its own copy of OpenNLP 1.5.3.
As a result, the classpath contained multiple JARs providing their own versions of the OpenNLP classes.
EDIT 16/1/2017: From my findings, the order in which the libraries are loaded matters.
Please correct me if I am wrong here: in Java (and other languages as well), once a class has been loaded, by default it will not be loaded again, even if another version of it is required later.
Because of this, if the Tika library is loaded before the OpenNLP 1.6.0 libraries, OpenNLP 1.5.3 is loaded first and the program "sticks" to 1.5.3 in spite of the subsequent loading of 1.6.0. Similarly, if the reverse happens (1.6.0 is loaded before Tika), the program "sticks" to 1.6.0 instead.
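A quick way to verify which copy wins (a diagnostic sketch, not part of the original fix) is to ask the JVM where the OpenNLP classes were loaded from; opennlp.tools.util.Version is a real OpenNLP class, and the printed location tells you whether it came from the Tika bundle or from the opennlp-tools 1.6.0 JAR:

import opennlp.tools.util.Version;

public class OpenNlpClasspathCheck {
    public static void main(String[] args) {
        // Which OpenNLP version did the class loader actually pick up?
        System.out.println("Loaded OpenNLP version: " + Version.currentVersion());
        // And from which JAR on the classpath was it loaded?
        System.out.println("Loaded from: "
                + Version.class.getProtectionDomain().getCodeSource().getLocation());
    }
}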

Is there a protobuf 3 compatible Spring HTTP message converter?

There's an org.springframework.http.converter.protobuf.ProtobufHttpMessageConverter that can be used to convert protobuf-generated Java message classes, but it's only tested with protobuf version 2.5.
Is there a version compatible with the version 3 of protobuf?
Here's a not-fully-tested solution that still lives in a Git branch:
https://github.com/bclozel/spring-framework/blob/SPR-13589/spring-web/src/main/java/org/springframework/http/converter/protobuf/ProtobufHttpMessageConverter.java
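If you end up using that converter (or a copy of it) with protobuf 3, registration follows the standard Spring MVC mechanism. This is a hedged sketch only, assuming the class from the linked branch is on your classpath under the usual ProtobufHttpMessageConverter name:

import java.util.List;

import org.springframework.context.annotation.Configuration;
import org.springframework.http.converter.HttpMessageConverter;
import org.springframework.http.converter.protobuf.ProtobufHttpMessageConverter;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurerAdapter;

@Configuration
public class ProtobufWebConfig extends WebMvcConfigurerAdapter {
    @Override
    public void extendMessageConverters(List<HttpMessageConverter<?>> converters) {
        // Adds the protobuf converter while keeping Spring's default converters.
        converters.add(new ProtobufHttpMessageConverter());
    }
}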

Converting Java to .NET library using IKVMC - Warning IKVMC0108: not a class file

There is a Java tool called Mallet:
http://mallet.cs.umass.edu/download.php
I want to use it in my .NET project.
To convert this tool to a .NET library, I first tried to build it into a single .jar file using Apache Ant, following the instructions at the link above.
Downloaded the Developer Release from the Mercurial repository.
Downloaded Apache Ant, installed the JDK, and set the JAVA_HOME variable so Ant could run.
Using Ant, I built a single mallet.jar file.
Then I tried to convert mallet.jar to a .NET library using IKVMC.
During conversion, I got a lot of warnings such as:
Warning IKVMC0108: not a class file "cc/mallet/util/tests/TestPriorityQueue$1.class", including it as resource
(class format error "51.0")
Despite these warnings, mallet.dll was created. But when I try to reference it from my .NET project, it looks "empty": it has no classes or namespaces. I did not forget to reference IKVM.OpenJDK.Core.
Oddly, I can't find any similar problems on Google.
I think the problem lies in the warnings. I have never worked with Ant, and I don't fully understand the whole build process.
The class format version 51 was introduced with Java 7.
IKVM most likely doesn't support that version yet, and the file name you quote (cc/mallet/util/tests/TestPriorityQueue$1.class) points at an anonymous inner class of TestPriorityQueue that certainly is needed for the library to work correctly.
My suggestion: compile Mallet using an older JDK, or at least with the -source and -target switches set to 6 (to ensure that it's compiled for Java 6).
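If you want to confirm which class file versions ended up inside mallet.jar before re-running IKVMC, here is a small diagnostic sketch (not part of the original answer) that reads the version header of a .class file extracted from the JAR; major version 50 is Java 6, 51 is Java 7, 52 is Java 8:

import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

public class ClassFileVersion {
    public static void main(String[] args) throws IOException {
        DataInputStream in = new DataInputStream(new FileInputStream(args[0]));
        try {
            int magic = in.readInt();               // should be 0xCAFEBABE
            int minor = in.readUnsignedShort();
            int major = in.readUnsignedShort();     // 50 = Java 6, 51 = Java 7, 52 = Java 8
            System.out.println("magic ok: " + (magic == 0xCAFEBABE)
                    + ", class file version: " + major + "." + minor);
        } finally {
            in.close();
        }
    }
}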
FYI v8.1 (currently in RC) of IKVM supports Java 8:
http://weblog.ikvm.net/2015/08/26/IKVMNET81ReleaseCandidate0.aspx
http://sourceforge.net/p/ikvm/mailman/message/34502991/
