My main project uses Java 1.6, and I need to provide a client jar to a system that can only run Java 1.5. The client jar is a separate module, so I am able to specify the Java version in the maven-compiler-plugin. However, the client jar depends on a core jar, which is on 1.6. One way to solve this would be to also produce a core jar compiled for 1.5.
I have used the "test-jar" goal in the maven-jar-plugin to generate a test jar for other modules to use. I am hoping to do something similar here and consume it in my client module with the following dependency:
<dependency>
    <groupId>org.mygroup</groupId>
    <artifactId>module-core</artifactId>
    <classifier>java1_5</classifier>
</dependency>
Why does your client project depend on core?
If it uses code from the core, you apparently need to compile the core JAR for 1.5 as well. You have several options here:
Set the target globally to 1.5 and make sure you are not using JDK 1.6 features in your code (at least in the parts of the code invoked by the client on JDK 1.5).
Use profiles plus classifiers to generate artifacts for different JDKs (see this question). You have to run the build multiple times, though. Actually, each build will compile everything using the same -target version, so this approach is only a small improvement over 1), allowing you to publish your artifact for multiple JDK versions. A combined sketch of 1) and 2) follows after this list.
If the client code does not actually use the core (for example, it only uses WSDLs from the core or some other non-Java material), you can remove this dependency by moving that material into a separate "shared" module.
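For illustration, here is a minimal sketch combining options 1) and 2): a Maven profile (the profile id jdk15 is made up) that compiles the module for 1.5 and attaches the resulting jar under the java1_5 classifier used in your dependency snippet. You would activate it with mvn -Pjdk15 install.

<profiles>
    <profile>
        <id>jdk15</id>
        <build>
            <plugins>
                <!-- compile everything with -source/-target 1.5 -->
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-compiler-plugin</artifactId>
                    <configuration>
                        <source>1.5</source>
                        <target>1.5</target>
                    </configuration>
                </plugin>
                <!-- publish the resulting jar under the java1_5 classifier -->
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-jar-plugin</artifactId>
                    <configuration>
                        <classifier>java1_5</classifier>
                    </configuration>
                </plugin>
            </plugins>
        </build>
    </profile>
</profiles>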
We have a Java 11 based project with Maven, and within our source code we have a package with the classes generated by xjc. As soon as we modify the XSD file, we run xjc and copy the classes into our project in order to access the new fields. This works fine, but as the project grows we get more and more XSD files, and the process of using xjc is not very intuitive. Also, xjc is no longer part of the Java 11 AdoptOpenJDK, so we have to use the old Java 8 Oracle JDK in order to run xjc.
The other option we have is to use a Maven plugin. But I am wondering whether that would work in our project setup. Since the plugin is executed during the install phase (or whatever phase it is bound to), the classes are only generated during that phase and are therefore only available after running through it.
That means if I change an XSD, I copy the new XSD into my project, run mvn install, then update my sources in IntelliJ and then I can access the new fields?
Or is the usual procedure to manage the XSDs in a separate Maven project and configure this project as a dependency in your own main project? As soon as an XSD changes, would I then have a new version of my dependency and would I have to adjust it in the pom.xml?
I am wondering which way makes sense here.
There are a couple of misconceptions in your question, I'd like to straighten them up:
you don't need Java 8 to use XJC: the tool is available on Maven Central in both the com.sun.xml.bind and org.glassfish.jaxb groups (the latter has its dependencies split across many jars). If you want to use JAXB on Java 11, you must include either jaxb-impl or jaxb-runtime with your software.
the jaxb2-maven-plugin does not run at the install phase but at the generate-sources phase, which precedes the compile phase (cf. the Lifecycle Reference).
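For illustration, a minimal configuration might look like the sketch below (the plugin version and the src/main/xsd directory are assumptions; the plugin's xjc goal binds to generate-sources by default, so the classes exist before compilation starts):

<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>jaxb2-maven-plugin</artifactId>
    <version>2.5.0</version>
    <executions>
        <execution>
            <id>xjc</id>
            <goals>
                <goal>xjc</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <!-- assumption: the XSD files live under src/main/xsd -->
        <sources>
            <source>src/main/xsd</source>
        </sources>
    </configuration>
</plugin>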
Whether you decide to run XJC externally or through Maven mostly depends on your development style.
I would split the generated code into another project only if its size becomes considerable: e.g. when the amount of generated code (which is never modified manually) started to slow down Eclipse's startup, we migrated it into another project.
I have a project 'java11-core' that generates a test jar artifact to share with project 'java11-app'. These projects build fine with command line Maven, but within Eclipse, the classes shared in the test jar cannot be found.
Version Info:
Apache Maven 3.6.0 (command line and Eclipse)
Java version: 11.0.1, vendor: Oracle Corporation
Eclipse IDE: Version: 2018-09 (4.9.0)
M2E Plugin: 1.9.1.20180912-1601
I originally created these two projects as traditional non-JPMS projects. They compiled and ran tests normally as expected. After I added module-info.java to both java11-core and java11-app, the Eclipse compiler could no longer recognize the shared test files from the core project.
Here is a snapshot of the package explorer for an overview of the project structure.
The added java11-app and java11-core module-info contents respectively:
module com.java11.app {
    exports com.java11.app;
    requires com.java11.core;
}

module com.java11.core {
    exports com.java11.core;
}
As you can see, I do not export the test utilities package from com.java11.core. I do not want to export the test packages because this would make the test classes publicly available. I also do not wish to introduce a new test project, because in real-world scenarios, this is very likely to require cyclic dependencies between test utilities and the projects they assist in testing.
Build errors appear in AppTest.java. What is interesting about the failure reported by Eclipse is that it does not claim it cannot find the CoreTestUtil class, but rather:
The type com.java11.test.core.util.CoreTestUtil is not accessible AppTest.java /java11-app/src/test/java/com/java11/app line 8 Java Problem
CoreTestUtil cannot be resolved AppTest.java /java11-app/src/test/java/com/java11/app line 21 Java Problem
My assumption is that the lack of an export for this package from java11-core, and/or the lack of a requires for it in java11-app, makes Eclipse believe the access is restricted, even though the classes exist in a separate test-jar.
The module path for java11-app shows it includes java11-core as a module, and the Without test code is set to No.
I know I am working with newly released features and suspect that sharing test classes across Eclipse JPMS projects is not yet supported. But I am not sure where to look (Eclipse? the M2E plugin?) for an update on when it will be supported. I am also not aware of a workaround that would allow me to stay productive while adopting JPMS for my software projects.
For those that believe test utilities should not be shared this way...
This subject has been characterized as a best-practice issue that should be resolved by refactoring the test utilities into a separate module. I respect this perspective, but in attempting to follow that guidance I found myself forced to violate other best practices, including DRY (Don't Repeat Yourself) and the avoidance of cyclic dependencies between packages.
It is common for a test utility to emerge while developing a module, one that both assists in effectively testing that module and depends on it. This creates a cycle if those utilities are pulled out into a separate module. Furthermore, some of these utilities can be equally useful when testing other modules that depend on that module, and copying them into a new test module for each dependent duplicates code. This reasoning may be why Maven's test-jar support was originally added.
Eclipse does not support multiple module-info files per project: across all source folders (main or test), a project may contain only one module-info.
From Eclipse's point of view, your only option is to create a new Java project that references the other and has its own module-info/exports:
module mod.a {
    exports com.example.a;
    // com.example.a.Main
}

module mod.a.tests { // (1)
    exports com.example.a.tests;
    // com.example.a.tests.MainUtils calling com.example.a.Main
    requires mod.a;
}
In case (1), you will have problems if you don't use mod.a.tests: Java will never find com.example.a.Main, probably because the second project shadows the first one.
I am not an OSGi expert, but I think that is one of the reasons why most Eclipse plugins have separate main and test projects: org.eclipse.m2e.core is patched by org.eclipse.m2e.core.tests.
However, module-info has no notion of "patches": you may patch a module on the command line (java --patch-module), but not in the module-info itself; perhaps Eclipse could do that on your behalf, but it doesn't.
As you can see, two projects in Eclipse = two Maven modules.
In the case of Maven, you can certainly create other artefacts within the same build (though I think this tends to pollute the dependencies, because every time a secondary artefact requires a dependency, that dependency has to go into the common scope).
This can be done using the maven-compiler-plugin, maven-shade-plugin and maven-jar-plugin. I think you should not rely on test-jar, because you want to emulate Java's --patch-module by merging the classes and test-classes directories; a rough sketch follows below.
You don't want to import this project into Eclipse, due to the multiple module-info files; or you must ensure that its module-info is only visible to Maven (you can use a profile plus the m2e.version property to detect m2e and disable it).
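A rough sketch of the idea (all coordinates, versions and module names are hypothetical): a separate mod.a.tests Maven module pulls in mod.a and its test classes at compile scope, and the shade plugin merges them into a single jar, roughly what --patch-module would have produced at runtime.

<dependencies>
    <dependency>
        <groupId>org.mygroup</groupId>
        <artifactId>mod.a</artifactId>
        <version>1.0</version>
    </dependency>
    <!-- the test classes, deliberately NOT in test scope so shade picks them up -->
    <dependency>
        <groupId>org.mygroup</groupId>
        <artifactId>mod.a</artifactId>
        <version>1.0</version>
        <type>test-jar</type>
    </dependency>
</dependencies>

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>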
I fully agree with you. Why should I only use the src/main code from some core module when I could also inherit some of its src/test functionality?
But how do you handle the scope problem? When I use the test scope, I lose the relation to the src/main code. When I don't use the test scope, I lose the relation to the src/test code.
My core test code does not change very often, so to get things working in Eclipse I install the test-jar into my local repository, and everything works fine.
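For reference, a sketch of the usual test-jar pattern (the version is a placeholder): the core module attaches a test-jar artifact, and the consuming module declares it with type test-jar and test scope so it only affects the test classpath.

In the core module's pom:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <executions>
        <execution>
            <goals>
                <goal>test-jar</goal>
            </goals>
        </execution>
    </executions>
</plugin>

In the consuming module's pom:

<dependency>
    <groupId>org.mygroup</groupId>
    <artifactId>module-core</artifactId>
    <version>1.0</version>
    <type>test-jar</type>
    <scope>test</scope>
</dependency>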
I am in a bit of a jam.
I am working on upgrading our software to Kettle 6.1; specifically, we need the S3FileOutput feature. Meanwhile, our application was already using the aws-sdk for other things.
So I am running into a problem: Pentaho Kettle requires version 1.0.something of the aws-sdk. Our application, on the other hand, needs 1.9.6 of the aws-sdk.
To give more details, the Kettle feature we require is in the pentaho-big-data-legacy plugin. Even if I upgrade to the latest version of Kettle, pentaho-big-data-legacy still uses the old version of the aws-sdk.
I've read a bit about plugins having special classloaders, so one option I was considering is that maybe I am not downloading the right dependency. However, when I tried downloading pentaho-big-data-plugin instead of pentaho-big-data-legacy, I got weird errors, so I stopped going down this path.
I was wondering if there is any way I could put the Kettle Libs in one folder, and my application libs in another folder, and then set some sort of a PENTAHO environment variable to pick up the libraries from the alternative folder.
Another option is if I could somehow set the pentaho classloader, but I don't know if this is possible.
What are my options for having 2 versions of the aws-sdk in my application, with regards to Kettle?
Maven can do much more than download dependencies.
The Maven Shade plugin can help with your current predicament. During a build, it can rename packages.
You would make a project that builds a "fat jar" (or "uber jar") with Pentaho Kettle and its version of the aws-sdk re-packaged as appropriate. That dependency is handled before your project is built, so you are free to use whatever version of the aws-sdk you like, since there is no longer a conflict on package names.
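As a sketch, the relocation configuration might look like this (the shaded package prefix is arbitrary):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <relocations>
                    <!-- move Kettle's old aws-sdk out of the way of aws-sdk 1.9.6 -->
                    <relocation>
                        <pattern>com.amazonaws</pattern>
                        <shadedPattern>shaded.com.amazonaws</shadedPattern>
                    </relocation>
                </relocations>
            </configuration>
        </execution>
    </executions>
</plugin>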
I'm working on a very old system: we have Ant 1.5.3 and we need to add unit tests to the environment. As far as I have researched, there is no ant-junit jar available for version 1.5.3. Did it have a different name before ant-junit-1.7.0? My application says that JUnitTask is not available when compiling (because ant and ant-junit.jar should be of the same version).
For Ant 1.5.x, the classes for optional tasks such as <junit> were contained in optional.jar, under the directory org/apache/tools/ant/taskdefs/optional/junit to be precise.
From Ant 1.6.x onwards, optional.jar was split into multiple jars, one of them being ant-junit-1.6.*.jar.
So I doubt ant-junit-1.5.3.jar ever existed.
Read delegating-classloader-1.6 for more on this.
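For what it's worth, a minimal Ant 1.5.x target might look like the sketch below (the property names are made up; it assumes optional.jar and a junit.jar are available to Ant, e.g. in ANT_HOME/lib, so the <junit> task class can be loaded):

<target name="test" depends="compile-tests">
    <junit printsummary="on" haltonfailure="true">
        <classpath>
            <pathelement location="${classes.dir}"/>
            <pathelement location="${test.classes.dir}"/>
            <pathelement location="lib/junit.jar"/>
        </classpath>
        <formatter type="plain"/>
        <batchtest todir="${reports.dir}">
            <fileset dir="${test.src.dir}" includes="**/*Test.java"/>
        </batchtest>
    </junit>
</target>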
I have a large Ivy project, and I've noticed that my code, which runs well in Eclipse, causes a compile error when built with Ant. I've narrowed the problem down to the following line:
FileUtils.write(...).
This line fails during compilation: the method is simply not found. Obviously, my code depends on Apache's commons-io library, and it's quite clear that the current commons-io has this method:
http://commons.apache.org/io/apidocs/org/apache/commons/io/FileUtils.html
So what gives?
I am pretty sure this is related to my ivy.xml: the Eclipse compiler is luckily (or smartly) using the newest possible version of commons-io, whereas my ivy.xml is pulling in an older version which lacks this method.
Most important of all to note here is that Ant is clearly using a different version of this jar.
So, my question is:
1) How can I tell Ant/Ivy to preferentially compile my code against the latest versions of the libraries I specify? I'm assuming that some of the dependencies in my lib/ may depend on older versions of commons-io...
Also:
2) In this context, any hints about what to worry about regarding how the classloader deals with duplicates in a multi-jar-dependent project would also be helpful to me...
Dependency Reporting
I would suggest that you first add the generation of an ivy dependency report into your build, using the report task.
An example of this task is included in the following answer:
What is the Ivy equivalent of Maven's versions:display-dependency-updates?
This will tell you which versions of which jars are being used. Normally, Ivy will use the version you specify in the ivy.xml file; however, another module might depend on a more recent version. Ivy's default behaviour is to always favour the most recent version of a Maven module.
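A minimal sketch of such a target (assuming Ivy's Ant tasks are bound to the xmlns:ivy="antlib:org.apache.ivy.ant" namespace and a resolve target already exists):

<target name="report" depends="resolve" description="generate an HTML dependency report">
    <ivy:report todir="${build.dir}/ivy-report"/>
</target>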
Retrieve the latest dependency
If you want Ivy to always prefer the latest version of a particular library, then declare the dependency as follows:
<dependency org="commons-io" name="commons-io" rev="latest.release"/>
Ivy has a feature called Fixed and Dynamic Revisions.
You can set the version/revision of any artifact to a "latest" status, like:
rev="latest.integration" --> for development builds
rev="latest.release" --> for released versions
Ivy takes the highest version among those you have specified and omits all libraries with lower versions, so that you only have one copy of the lib on the Ivy classpath (have a look at the resolution report, or run ant -v for verbose mode). This avoids having duplicate jars with conflicting versions.
This might be worth checking out, maybe you just have an old version defined in one of your ivy files.
As to the second point:
The classloader takes the class that happens to come first on the classpath (or the jar that is first on the classpath). So mixed versions of the same lib can behave differently on different systems, depending on how the classpath is constructed.
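One quick way to see which jar actually wins is Ant's <whichresource> task (a sketch; the compile.classpath path id is an assumption):

<target name="which-commons-io">
    <whichresource class="org.apache.commons.io.FileUtils"
                   property="fileutils.location">
        <classpath refid="compile.classpath"/>
    </whichresource>
    <echo message="FileUtils loaded from: ${fileutils.location}"/>
</target>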