Using a Maven library in an Android project - java

I have a library project which does certain work. The library is built using Maven.
I now want to include this library in an Android project. I added the library jar as a compile dependency in Gradle and I can successfully use the library in my Android code.
I have JDK 8 installed and build the library with it. However, as I have read, Android uses Java 7. Since the library is built using JDK 8, can this cause a problem?
If it can cause problems, I don't think building the library using JDK 7 would solve it either, since the library depends on other Maven libraries from an external Maven repository. Is there anything I can do about it?

This is a common JDK lifecycle problem. The good news is there are things you can do to make sure that everything in your build is compliant with a certain JDK.
First off, make sure that your module is indeed compiled with the latest JDK version you are willing to accept. You can set the compiler plugin to only generate bytecode that is compliant with a certain version, for instance JDK 7.
For example:
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>3.3</version>
      <configuration>
        <source>THE_JDK_VERSION_YOU_WANT</source>
        <target>THE_JDK_VERSION_YOU_WANT</target>
      </configuration>
    </plugin>
  </plugins>
</build>
NOTE
There is a drawback to this: even with a specific source and target set, the compiler may still accept code that uses JDK APIs that aren't available in the target environment's JRE. For example, with JAVA_HOME set to JDK 8 and source and target set to 1.7, the code can still use (say) ConcurrentHashMap.mappingCount(), which was introduced in JDK 8.
The answer to this problem is the animal-sniffer plugin. It checks your code for any usage of disallowed APIs (such as JDK 8 APIs, if you configure it that way). The plugin was previously hosted by Codehaus, but it will come up if you Google around a bit. I can provide you with a working example tomorrow when I get back to work.
Next, as you pointed out, you have your dependencies. Fortunately, the enforcer plugin can use an extended rule set, called <enforceBytecodeVersion>, that checks that all of your dependencies are also compliant with a specific bytecode version. All the details are available here.
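A sketch of how that enforcer rule could be wired up; the rule itself ships in the separate extra-enforcer-rules artifact, and the version numbers below are illustrative (check Maven Central for current ones):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <version>1.4.1</version>
  <executions>
    <execution>
      <id>enforce-bytecode-version</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <enforceBytecodeVersion>
            <!-- Fail the build if any dependency contains bytecode newer than JDK 7 -->
            <maxJdkVersion>1.7</maxJdkVersion>
          </enforceBytecodeVersion>
        </rules>
      </configuration>
    </execution>
  </executions>
  <dependencies>
    <!-- The enforceBytecodeVersion rule lives in this extra artifact -->
    <dependency>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>extra-enforcer-rules</artifactId>
      <version>1.0-beta-6</version>
    </dependency>
  </dependencies>
</plugin>
```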
EDITED
Here is the configuration for the animal-sniffer plugin. There's a newer version available (1.14), but it didn't work for me. Maybe you'll have better luck. The API signatures for the JDK are available at Maven Central.
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>animal-sniffer-maven-plugin</artifactId>
  <version>1.13</version>
  <executions>
    <execution>
      <id>check-for-jdk6-compliance</id>
      <phase>test</phase>
      <goals>
        <goal>check</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <signature>
      <groupId>org.codehaus.mojo.signature</groupId>
      <artifactId>java16</artifactId>
      <version>1.1</version>
    </signature>
  </configuration>
</plugin>

Related

Setting up Spring Tool Suite 4 with AspectJ/Spring Boot (m2e connector missing)

I am using the latest version of STS, which at the moment is 4.11. I'm building a new project and trying to get AspectJ CTW (compile-time weaving) working with Spring Boot. I have some unit tests to check the aspects with @Async method calls. The funny thing is that the unit tests pass with mvn clean install, but not when building through STS.
I believe the reason is the AJDT plugin or AJDT configurator plugins are not working because I see this error:
Plugin execution not covered by lifecycle configuration:
dev.aspectj:aspectj-maven-plugin:1.13.M3:compile
(execution: default, phase: compile)
I am using the latest aspectj maven plugin with these settings.
<plugin>
  <groupId>dev.aspectj</groupId>
  <artifactId>aspectj-maven-plugin</artifactId>
  <version>1.13.M3</version>
  <configuration>
    <source>${maven.compiler.source}</source>
    <target>${maven.compiler.target}</target>
    <complianceLevel>${maven.compiler.target}</complianceLevel>
    <encoding>${project.build.sourceEncoding}</encoding>
    <XnoInline>true</XnoInline>
    <aspectLibraries>
      <aspectLibrary>
        <groupId>org.springframework</groupId>
        <artifactId>spring-aspects</artifactId>
      </aspectLibrary>
    </aspectLibraries>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
        <goal>test-compile</goal>
      </goals>
    </execution>
  </executions>
  <dependencies>
    <dependency>
      <groupId>org.aspectj</groupId>
      <artifactId>aspectjtools</artifactId>
      <version>${aspectj.version}</version>
    </dependency>
  </dependencies>
</plugin>
I also tried to install these two update sites:
http://download.eclipse.org/tools/ajdt/410/dev/update
http://dist.springsource.org/release/AJDT/configurator/
Getting this set up always seems to be a pain. Has anyone managed to do it with Java 11?
AspectJ Development Tools (AJDT)
I do not use STS, but mostly IntelliJ IDEA and if Eclipse, then plain Eclipse for Java developers. A while ago, I prepared a new AJDT version for Eclipse 2021-03, which still seems to be working in 2021-06, while developing AspectJ 1.9.7.
Try downloading the latest snapshot from aspectj.dev:
https://aspectj.dev/maven/org/eclipse/ajdt/org.eclipse.ajdt.releng/2.2.4-SNAPSHOT/
At the time of writing this, the latest snapshot is:
https://aspectj.dev/maven/org/eclipse/ajdt/org.eclipse.ajdt.releng/2.2.4-SNAPSHOT/org.eclipse.ajdt.releng-2.2.4-20210509.044425-2.zip
Sorry that I cannot provide you with a regular Eclipse update site, but while contributing to AspectJ, I have no access to the Eclipse infrastructure. The lead developer is busy, so my own web server is the easiest way to provide you with AJDT. The ZIP archive is about 15 MB in size. You can import it into Eclipse as a virtual update site as described here (scroll to "Install AJDT (AspectJ Development tools) for Eclipse IDE").
AspectJ Maven Plugin by aspectj.dev
Some small news: yesterday I released version 1.13 of AspectJ Maven. It has a few more improvements compared to 1.13.M3. Most notably, 1.13
- depends on AspectJ 1.9.8.M1 by default (you can also use 1.9.7, of course, but 1.9.8.M1 supports the --release N compiler switch),
- recognises language level 17 as a valid value for the source, target, compliance-level and release parameters, i.e. it can be used with the latest AspectJ 1.9.8 snapshots in order to experimentally compile Java 17-EA,
- has precedence rules for compiler-level settings: if the compliance level is set, you no longer need source and target (they are the same), and if you set source and target, you no longer need to specify the compliance level. The fact that you previously had to set all three was always a bug, IMO. Besides, if you set release for cross-compilation, then source, target and compliance level are all ignored.
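Under those precedence rules, a minimal configuration could look like the sketch below (plugin coordinates taken from the question; the release value of 11 is just an example):

```xml
<plugin>
  <groupId>dev.aspectj</groupId>
  <artifactId>aspectj-maven-plugin</artifactId>
  <version>1.13</version>
  <configuration>
    <!-- With 'release' set, source/target/complianceLevel can be omitted -->
    <release>11</release>
  </configuration>
</plugin>
```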
More information can be found on the plugin's GitHub site.
Update: I found an m2e connector for AJDT which is maintained by Miika Vesti for his private use. At first it was not working for the dev.aspectj groupId, because he had forgotten to push an update to the Eclipse update site, but I got in touch with him and now it works. Please see the project's README for more information. You can use the existing update site for Eclipse 2020-12 in order to install a connector which also works on Eclipse 2021-06.
The connector needs some more work in order to import all AspectJ Maven settings correctly. For example, it does not work in some of my projects where I deactivated the Maven Compiler Plugin, because it currently relies on that plugin being active and on things like source/target compiler levels being configured there. Only then will it correctly import source and target directories as well as dependencies (most prominently the AspectJ runtime library) and produce a usable Eclipse project. I am trying to work with Miika to make the connector more self-sufficient in the future.

Apache Spark -- using spark-submit throws a NoSuchMethodError

To submit a Spark application to a cluster, their documentation notes:
To do this, create an assembly jar (or “uber” jar) containing your code and its dependencies. Both sbt and Maven have assembly plugins. When creating assembly jars, list Spark and Hadoop as provided dependencies; these need not be bundled since they are provided by the cluster manager at runtime. -- http://spark.apache.org/docs/latest/submitting-applications.html
So, I added the Apache Maven Shade Plugin to my pom.xml file. (version 3.0.0)
And I turned my Spark dependency's scope into provided. (version 2.1.0)
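For reference, a provided-scope Spark dependency could look like the sketch below (the artifact coordinates assume Spark 2.1.0 built for Scala 2.11; adjust to your build):

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.1.0</version>
  <!-- Provided by the cluster manager at runtime, so not bundled in the uber jar -->
  <scope>provided</scope>
</dependency>
```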
(I also added the Apache Maven Assembly Plugin to ensure I was wrapping all of my dependencies in the jar when I run mvn clean package. I'm unsure if it's truly necessary.)
This is how spark-submit fails. It throws a NoSuchMethodError for a dependency I have (note that the code works from a local instance when compiling inside IntelliJ, assuming that provided is removed).
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Stopwatch.createStarted()Lcom/google/common/base/Stopwatch;
The line of code that throws the error is irrelevant--it's simply the first line in my main method that creates a Stopwatch, part of the Google Guava utilities. (version 21.0)
Other solutions online suggest that it has to do with version conflicts of Guava, but I haven't had any luck yet with those suggestions. Any help would be appreciated, thank you.
If you take a look at the /jars subdirectory of the Spark 2.1.0 installation, you will likely see guava-14.0.1.jar. Per the API for the Guava Stopwatch#createStarted method you are using, createStarted did not exist until Guava 15.0. What is most likely happening is that the Spark process Classloader is finding the Spark-provided Guava 14.0.1 library before it finds the Guava 21.0 library packaged in your uberjar.
One possible resolution is to use the class-relocation feature provided by the Maven Shade plugin (which you're already using to construct your uberjar). Via "class relocation", Maven-Shade moves the Guava 21.0 classes (needed by your code) during the packaging of the uberjar from a pattern location reflecting their existing package name (e.g. com.google.common.base) to an arbitrary shadedPattern location, which you specify in the Shade configuration (e.g. myguava123.com.google.common.base).
The result is that the older and newer Guava libraries no longer share a package name, avoiding the runtime conflict.
Most likely you're having a dependency conflict, yes.
First you can check whether you have a dependency conflict when you build your jar. A quick way is to look inside your jar directly to see if the Stopwatch.class file is there and, by inspecting the bytecode, whether the createStarted method is there.
Otherwise you can also list the dependency tree and work from there: https://maven.apache.org/plugins/maven-dependency-plugin/examples/resolving-conflicts-using-the-dependency-tree.html
If it's not an issue with your jar, you might have a dependency issue due to a conflict between your spark installation and your jar.
Look in the lib and jars folders of your Spark installation. There you can see whether you have jars that include an alternate version of Guava that doesn't support the createStarted() method from Stopwatch.
Applying the above answers, I solved the problem with the following config:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.1.0</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.google.common</pattern>
            <shadedPattern>shade.com.google.common</shadedPattern>
          </relocation>
          <relocation>
            <pattern>com.google.thirdparty.publicsuffix</pattern>
            <shadedPattern>shade.com.google.thirdparty.publicsuffix</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>

`class file has wrong version` after Jenkins upgrade

A couple of days ago, I upgraded Jenkins to version 1.643. Before, we were using Jenkins 1.593. Starting with Jenkins 1.612, Jenkins requires Java 7, see changelog, announcement and issue. Our Jenkins server has Java 8.
I have a Maven project consisting of submodules.
In the job configuration in Jenkins, I have configured the build to use JDK 1.6.
When looking at the build environment, it's indeed 1.6:
JAVA_HOME=/var/lib/jenkins/tools/hudson.model.JDK/1.6
One of the submodules fails to build on Jenkins, with this error:
[ERROR] /var/lib/jenkins/<REDACTED>.java:[15,-1] cannot access java.lang.Object
bad class file: java/lang/Object.class(java/lang:Object.class)
class file has wrong version 52.0, should be 50.0
According to what I can Google, class file version 52.0 is JDK 1.8, while the compiler is expecting version 50.0, which is JDK 1.6. I am assuming that class file version 52.0 refers to rt.jar (the Java runtime), which contains java.lang.Object (see also the pom.xml snippet below).
I have found this SO question (and others that are duplicate of it), but they are all in the context of someone building from their IDE (IntelliJ) or from command prompt, and after reading them, I don't see how I could apply the suggested solutions. They involve setting $JAVA_HOME, which is already done by Jenkins.
My question is different because the issue is in the context of Jenkins (and Maven), and only occurred after the Jenkins upgrade. When I execute mvn clean install on my own desktop (with JDK 1.8), the error does not occur. If I execute the file command on the offending class file, but on the desktop where compilation succeeded, I get compiled Java class data, version 50.0 (Java 1.6). For me, this confirms that my pom.xml is (probably) still correct and it's (probably) a Jenkins configuration issue.
That specific submodule has this in the pom.xml, which may or may not be relevant:
<build>
  <plugins>
    <plugin>
      <artifactId>maven-compiler-plugin</artifactId>
      <configuration>
        <source>1.6</source>
        <target>1.6</target>
        <compilerArguments>
          <verbose />
          <bootclasspath>${java.home}/lib/rt.jar</bootclasspath>
        </compilerArguments>
      </configuration>
    </plugin>
  </plugins>
</build>
So, as you can see, it takes rt.jar from the current $JAVA_HOME so it can cross compile with a target of 1.6.
I'm a bit lost about the origin of this Java 8. Before the Jenkins upgrade, we were already using Java 8 on the server and cross compiling with a target of Java 6. What am I missing here?
EDIT
Do I even need this? If I comment out
<compilerArguments>
  <verbose />
  <bootclasspath>${java.home}/lib/rt.jar</bootclasspath>
</compilerArguments>
in pom.xml, I can still cross compile on my desktop and the class files are still version 50.0.
EDIT
When I take that part out, the build does not fail any more.
Which means I solved it myself.
I want to change the question to: why did it fail in the first place? And why didn't it fail before on Jenkins 1.593?
I changed my pom.xml to exactly how it is described in this SO answer: Maven release plugin: specify java compiler version
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>3.3</version>
      <configuration>
        <source>1.6</source>
        <target>1.6</target>
        <encoding>UTF-8</encoding>
        <bootclasspath>${java.home}/lib/rt.jar</bootclasspath>
      </configuration>
    </plugin>
  </plugins>
</build>
As you can see:
I explicitly set the groupId
I explicitly set the version to the latest version, 3.3
The configuration parameters are formatted a bit differently.
My educated guess is that the Maven installation on the Jenkins server didn't pick up configuration inside compilerArguments and was only happy when it was directly inside configuration. I leave it to the comments to explain how and why, but for me the issue is solved.
I think you have a few errors in your assumptions.
compilerArguments is deprecated. It's been superseded by compilerArgs.
As you can see from the compiler plugin documentation, compilerArguments/compilerArgs is meant to be used only for arguments not supported by the configuration section itself. As bootclasspath is supported, using it in the compilerArgs/compilerArguments section is generally incorrect.
compilerArgs/compilerArguments is only used if fork is set to true, which was not correct for your configuration.
The third point was probably the most important reason why it didn't work for you. Using the configuration section for your use case, there should be no issues, and indeed, based on your question/answer, this seems to be the case.
Also note that java.home is not JAVA_HOME. I've expanded on that on my other answer here. I'd guess that is related to why you see changes between Jenkins versions.

Multi-JDK Maven builds using classifiers

Maven docs explicitly suggest classifiers as a solution for multiple JDK support:
The classifier allows to distinguish artifacts that were built from the same POM but differ in their content. It is some optional and arbitrary string that - if present - is appended to the artifact name just after the version number. As a motivation for this element, consider for example a project that offers an artifact targeting JRE 1.5 but at the same time also an artifact that still supports JRE 1.4. The first artifact could be equipped with the classifier jdk15 and the second one with jdk14 such that clients can choose which one to use.
I have never seen a working example of this. Is the documentation wrong, or is it somehow possible to actually make Maven build the same artifact multiple times with different JDKs (and obviously distinct source directories, since they will have different syntax (e.g. diamond or lambdas)) and, most importantly, deploy them together?
Seems like this kind of thing would be a basic requirement for potential support of JEP 238, too.
The documentation is not wrong. It is just giving an example of how classifiers can be applied, in this case by targeting several JREs.
As for how this can be done, there may be several ways. See How to configure Maven to build two versions of an artifact, each one for a different target JRE for a related problem. You can also trigger different executions with Maven profiles. In this case, each profile triggers a different configuration of the maven-jar-plugin with a different classifier:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jar-plugin</artifactId>
  <version>2.6</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>jar</goal>
      </goals>
      <configuration>
        <classifier>jdk14</classifier>
      </configuration>
    </execution>
  </executions>
</plugin>
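A hedged sketch of how two such profiles might each supply a different classifier (the profile ids are illustrative, and each profile would also need its own source directory and compiler settings):

```xml
<profiles>
  <profile>
    <id>jdk14</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-jar-plugin</artifactId>
          <version>2.6</version>
          <executions>
            <execution>
              <phase>package</phase>
              <goals>
                <goal>jar</goal>
              </goals>
              <configuration>
                <!-- Produces artifact-version-jdk14.jar -->
                <classifier>jdk14</classifier>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
  <profile>
    <id>jdk15</id>
    <!-- Same plugin configuration, but with <classifier>jdk15</classifier> -->
  </profile>
</profiles>
```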

Maven: different java compiler versions for project and plugin

We have a Java 7 project which uses a plugin built with Java 8. Is it possible to use the Java 8 plugin in the Java 7 POM by setting a compiler version for the plugin only?
I just found this on the internet:
http://maven.apache.org/plugins/maven-compiler-plugin/examples/set-compiler-source-and-target.html
You should just add a configuration section specifying the JDK version you want:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.3</version>
  <configuration>
    <source>1.4</source>
    <target>1.4</target>
  </configuration>
</plugin>
I hope it helps!
It is only possible if the plugin forks into a new JVM process and allows you to configure the JVM it uses to execute. Otherwise it will use the JVM that runs the whole build. You would then have to run the build with a JRE/JDK 8, which shouldn't be a problem if you set the source and target version appropriately, as in the answer from Juan Wolf (just set 1.7 instead of 1.4).
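The maven-compiler-plugin itself is an example of a plugin that supports forking: it can be pointed at a different javac via its fork and executable parameters. A sketch (the executable path is an assumption; adjust it to your machine):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.3</version>
  <configuration>
    <!-- Fork a separate compiler process instead of compiling in the build JVM -->
    <fork>true</fork>
    <!-- Hypothetical path to a specific JDK's javac -->
    <executable>/usr/lib/jvm/jdk1.8.0/bin/javac</executable>
    <source>1.7</source>
    <target>1.7</target>
  </configuration>
</plugin>
```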
