Maven Java interface and implementation multi-module issue

Let's say I have a re-use project with 2 modules: api and service. The re-use api module defines interfaces, types, annotations that can be used by other application projects. The re-use service module contains the real implementation. The re-use project looks like this:
pom.xml: id=reuse, group=com.test.project, version=1.0.0
|__api-module
|  |__pom.xml: parent={id=reuse, group=com.test.project, version=1.0.0}, id=reuse-api
|__service-module
|  |__pom.xml: parent={id=reuse, group=com.test.project, version=1.0.0}, id=reuse-srv
And one of our applications has a dependency on the re-use module.
pom.xml: id=application, group=com.test.project, version=2.0.0
dependency={scope=compile, id=reuse-api, group=com.test.project, version=1.0.0}
dependency={scope=runtime, id=reuse-srv, group=com.test.project, version=1.0.0}
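In actual POM terms, the application's two dependencies would look like this (a sketch of the structure described above; compile is Maven's default scope):
<dependency>
    <groupId>com.test.project</groupId>
    <artifactId>reuse-api</artifactId>
    <version>1.0.0</version>
</dependency>
<dependency>
    <groupId>com.test.project</groupId>
    <artifactId>reuse-srv</artifactId>
    <version>1.0.0</version>
    <scope>runtime</scope>
</dependency>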
The thing is, if we now change something in the re-use service module, meaning we change the implementation of the re-use module, the whole re-use project (api and service) has to be compiled and a new version will be released, because the versions of the api and the service module are inherited from the parent module. The new pom structure will look like this:
pom.xml: id=reuse, group=com.test.project, version=1.0.1
|__api-module
|  |__pom.xml: parent={id=reuse, group=com.test.project, version=1.0.1}, id=reuse-api
|__service-module
|  |__pom.xml: parent={id=reuse, group=com.test.project, version=1.0.1}, id=reuse-srv
After that, the application has to change the dependency with the new version of re-use:
pom.xml: id=application, group=com.test.project, version=2.0.1
dependency={scope=compile, id=reuse-api, group=com.test.project, version=1.0.1}
dependency={scope=runtime, id=reuse-srv, group=com.test.project, version=1.0.1}
Is there a way that a modification of the re-use service module does not force a change to the application as well? The application does not really need to be affected by a change to the re-use implementation, does it?
Do you have some comments/suggestions? Thank you.

Let's assume that you have your 2 modules stored in some artifact repository (for example, a repository manager hosted in your company):
service-api-1.0.0
service-impl-1.0.0
These modules are shared across several applications:
app-1
compile: service-api-1.0.0
runtime: service-impl-1.0.0
app-2
compile: service-api-1.0.0
runtime: service-impl-1.0.0
The applications should define your repository in their build configuration (as an additional Maven repository) and reference the modules by version number.
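For example, a repository declaration along these lines (the id and URL are placeholders for your company's repository manager):
<repositories>
    <repository>
        <id>company-releases</id>
        <url>https://repo.example.com/releases</url>
    </repository>
</repositories>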
Every time you complete a set of changes on your API/implementation you should release a library with a higher version.
A library version usually consists of three components: [major].[minor].[bugfix]
The bugfix version should be bumped on each release that contains no new features, just fixes to existing functionality.
The minor version is bumped when there are new features that are backward compatible with the previous release.
The major version is changed when we introduce incompatible changes.
E.g. when fixing some implementation defects in service-impl-1.0.0, you release service-impl-1.0.1. This one may (and should) still be compiled against service-api-1.0.0.
This new version will be installed in your repository, whose contents will then be:
service-api-1.0.0
service-impl-1.0.0
service-impl-1.0.1
Unless you update the configuration of your application, it will still be compiled and built against the older version, which remains available in your repository. So you can now have:
app-1
compile: service-api-1.0.0
runtime: service-impl-1.0.0
app-2
compile: service-api-1.0.0
runtime: service-impl-1.0.1
There is no need to change each application as long as they have access to the historical versions of your library. You change the dependency version only when you want to pull in new features (very much the same way you treat any other public shared library from Maven Central).

If you change the code of your library, the application has three choices:
Use the new version and compile again.
Stay with the old version and ignore the changes made.
Do not compile again but use the new library at runtime.
The third option is a bit risky because it requires all interfaces and all behaviour to be unchanged; otherwise you might get runtime exceptions. It is pretty common, though.
If you really want to decouple the implementation of the library from the application, you need a different kind of dependency, e.g. a REST service. In effect, this is similar to (3): you change the implementation of the REST service but guarantee that the behaviour and interface stay unchanged.

Related

How do optional compile-time dependencies work?

For the longest time, I thought that in Java you had one of two types of dependencies:
Required compile-time dependencies (dependencies always required at compile time)
Possibly optional runtime dependencies (dependencies that can be resolved at runtime)
Recently, I found out that compile dependencies can be optional too. For example, commons-beanutils is listed as an optional compile dependency of JXPath.
How can this work? Can a dependency really be used at the time of compilation yet remain fully optional?
EDIT: I might have been unclear. I'm looking for a case where a dependency is used at compile-time and is at the same time fully optional, or an explanation why such a dependency is impossible.
A class can be compiled against an interface, but the implementation of that interface is not needed during compilation. The implementation is only needed at runtime.
Examples are commons-logging, JPA, and JDBC, which are APIs/frameworks an application can compile against. At runtime an implementation is needed to execute the code. Sample implementations: Commons BeanUtils, the Oracle thin driver, EclipseLink, etc.
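To illustrate the pattern, a minimal sketch that compiles against the JDBC API only (the connection URL and credentials are placeholders); the actual driver is needed only when the code runs:
import java.sql.Connection;
import java.sql.DriverManager;

public class JdbcExample {
    public static void main(String[] args) throws Exception {
        // Compiles against java.sql interfaces only; a driver such as the
        // Oracle thin driver must be on the classpath at runtime.
        try (Connection c = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/service", "user", "secret")) {
            System.out.println("Connected: " + !c.isClosed());
        }
    }
}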
An extensive quote from Maven documentation describes this quite clearly:
Optional dependencies are used when it's not possible (for whatever reason) to split a project into sub-modules. The idea is that some of the dependencies are only used for certain features in the project and will not be needed if that feature isn't used. Ideally, such a feature would be split into a sub-module that depends on the core functionality project. This new subproject would have only non-optional dependencies, since you'd need them all if you decided to use the subproject's functionality.
However, since the project cannot be split up (again, for whatever reason), these dependencies are declared optional. If a user wants to use functionality related to an optional dependency, they have to redeclare that optional dependency in their own project. This is not the clearest way to handle this situation, but both optional dependencies and dependency exclusions are stop-gap solutions.
Why use optional dependencies?
Optional dependencies save space and memory. They prevent problematic jars that violate a license agreement or cause classpath issues from being bundled into a WAR, EAR, fat jar, or the like.
How do optional dependencies work?
Project-A -> Project-B
The diagram above says that Project-A depends on Project-B. When A declares B as an optional dependency in its POM, this relationship remains unchanged. It's just like a normal build, where Project-B is added to Project-A's classpath.
Project-X -> Project-A
When another project (Project-X) declares Project-A as a dependency in its POM, the optional nature of the dependency takes effect: Project-B is not included in the classpath of Project-X. You need to declare B directly in the POM of Project-X for it to be included in X's classpath.
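For reference, marking a dependency optional in Project-A's POM looks like this (coordinates are illustrative):
<dependency>
    <groupId>com.example</groupId>
    <artifactId>project-b</artifactId>
    <version>1.0</version>
    <optional>true</optional>
</dependency>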
A practical example: imagine that you are a developer of a library/framework SuperLib that is built as one superlib.jar. Your library provides multiple features. Its main feature (that most of the users use) is dependency injection based on a third-party di library. However, one of your classes - EmailApi - offers features to send e-mails, using a third-party email library. Since superlib is one artifact, it needs both di and email to be compiled.
Now put yourself in the position of a user who uses superlib. They are interested in the dependency injection features. This is the core role of your library, so the dependency between superlib and di would not be optional.
However, most users are not interested in sending emails and may be bothered by having a useless email library and its dependencies added to their application (which increases the size of their application and may cause a version clash between the dependencies of email and those of the user's application). Therefore, you would mark the dependency on email as optional. As long as the user does not use your EmailApi class, everything will run fine. However, if they do use EmailApi, they will need the email dependency; otherwise the application will fail at runtime with a ClassNotFoundException for whichever class from email is referenced in EmailApi. The user of your library will need to add the email dependency explicitly in their POM.
See also When to use <optional>true</optional> and when to use <scope>provided</scope>.
What you described is actually a feature of Maven, the build tool, not of Java itself.
Without build tools, using just javac, you need to specify all classes or interfaces that are directly used in your code. (Sure, there are options for dynamic class loading and even runtime compilation, but that's off topic.)
One use case, the separation of interface and implementation, is described in the previous answer; another popular case is based on classpath scanning:
if some specific class is present on the classpath and/or carries a specific annotation, an optional module will be loaded.
That's how Spring Boot modules are loaded.
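A minimal sketch of such a presence check (the class name is illustrative; Spring Boot's @ConditionalOnClass works along similar lines):
// Returns true if the optional feature's marker class is on the classpath.
static boolean featureAvailable() {
    try {
        Class.forName("com.example.optional.FeatureClass", false,
                Thread.currentThread().getContextClassLoader());
        return true;
    } catch (ClassNotFoundException e) {
        return false;
    }
}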

Maven dependency: ranges resolving issue

I want to use a range for the version of some dependency, but I don't really get how it should be defined for my case.
Here are the results of the lookup, from the maven-metadata-nexus.xml file:
<versioning>
    <latest>0.1.0-SNAPSHOT</latest>
    <versions>
        <version>0.0.13-SNAPSHOT</version>
        <version>0.0.14-SNAPSHOT</version>
        <version>0.0.15-SNAPSHOT</version>
        <version>0.0.16-SNAPSHOT</version>
        <version>0.0.17-SNAPSHOT</version>
        <version>0.1.0-SNAPSHOT</version>
    </versions>
    <lastUpdated>20190826092951</lastUpdated>
</versioning>
I want to import the latest 0.1.x dependency, so I thought writing the range this way would do the trick:
<dependency>
    <groupId>my.group.id</groupId>
    <artifactId>my-artifact</artifactId>
    <version>[0.1, 0.2)</version>
</dependency>
However, Maven says that there's no version of my artifact available.
Defining the range as [0.1.0-SNAPSHOT, 0.2) fixes the problem, but I don't really understand why I need to be so specific with the boundary, or whether that is good practice. What's the right way to define such ranges?
Maven treats SNAPSHOT versions differently from "normal" versions.
A "normal" (published in a repo) version is typically immutable: it can't be updated or removed, and no matter when you access it, it will still be the same.
SNAPSHOT versions are the opposite of that: they can change at any time (think work in progress).
Typically SNAPSHOTs are only found in your local repo. If you want to use SNAPSHOTs from a remote repo, you have to explicitly tell Maven that the repo provides SNAPSHOT versions.
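Enabling SNAPSHOT resolution for a remote repo looks roughly like this (the id and URL are placeholders):
<repository>
    <id>company-snapshots</id>
    <url>https://repo.example.com/snapshots</url>
    <snapshots>
        <enabled>true</enabled>
    </snapshots>
</repository>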
With that distinction in mind, the Maven folks have decided that:
Resolution of dependency ranges should not resolve to a snapshot (development version) unless it is included as an explicit boundary. There is no need to compile against development code unless you are explicitly using a new feature, under which the snapshot will become the lower bound of your version specification.
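That is why the explicit boundary in the question works: naming the SNAPSHOT in the range explicitly opts in to development versions:
<version>[0.1.0-SNAPSHOT, 0.2)</version>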

How to deal with libraries' incompatible dependencies in maven while running unit tests?

I'm trying to run a unit test that uses both libraryA and libraryB.
LibraryA has a dependency on com.oldlib:dependencyX:2.0. LibraryB has a dependency on com.newlib:dependencyX:3.0.
Note that dependencyX 2.0 and 3.0 are the same library, but the groupId changed between releases (not sure if this is relevant). Internal package names are the same between 2 and 3.
dependencyX has had API changes between 2.0 and 3.0. If I force Maven (with a dependency exclusion) to use only dependencyX:2.0, libraryB throws class-not-found exceptions for classes new in 3.0. If I force Maven to use only dependencyX:3.0, libraryA throws method-not-found exceptions because some APIs were removed between 2.0 and 3.0.
Is there any way to isolate these in a unit test? I know about OSGi - it would be really neat if Maven could seamlessly isolate dependencies into separate classloaders in an OSGi-like manner, without actually having to set up all the OSGi stuff.
(the guilty libraries are apache storm 0.10.0 and cassandra-unit, and the guilty dependency is com.lmax:disruptor / com.googlecode:disruptor).
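For reference, the exclusion-based pinning mentioned above looks roughly like this (coordinates and version are illustrative, following the naming in the question):
<dependency>
    <groupId>com.example</groupId>
    <artifactId>libraryB</artifactId>
    <version>1.0</version>
    <exclusions>
        <exclusion>
            <groupId>com.newlib</groupId>
            <artifactId>dependencyX</artifactId>
        </exclusion>
    </exclusions>
</dependency>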

maven - separate modules for interfaces and implementation with Spring

We are working on Mavenizing our Java project and we would like to set up a clean separation between interfaces and implementations for each module.
In order to do so, we want to split each module into two sub-modules: one for interfaces and the data objects they use, and another for implementations.
For example:
+commons
+commons-api
+commons-impl
The POMs of the modules will be configured such that no module depends on the impl sub-modules. This way no code from one module will be able to "see" implementation details of another module.
What we are having trouble with, is where to put our spring XMLs.
In our project we automatically import spring XML files using wildcard import like
<import resource="classpath*:**/*-beans.xml"/>
This way the location of the Spring XMLs doesn't really matter at runtime, as all the modules get loaded into the same class loader, and the strict one-way dependency rules in the POMs don't apply.
However, during development we want the IDE - we use IntelliJ IDEA - to recognize implementation classes referenced from the Spring XMLs.
We also want IDEA to recognize beans defined in other modules.
If we put the spring XMLs in API sub-modules - they won't "see" the implementation classes in the impl sub-modules.
If we put them in the impl sub-modules, their beans won't be "seen" from other modules.
It is probably possible to configure the IDEA project to recognize spring XMLs from modules on which there is no dependency, but we prefer for our POMs to hold all the project structure information and not rely on IDEA project files.
We considered creating a third sub-module just to hold Spring XMLs (and perhaps hibernate xmls as well). For example:
+commons
+commons-api
+commons-impl
+commons-config
The external modules will depend on both commons-api and commons-config, and commons-config will depend on both commons-api and commons-impl, with the dependency on commons-impl marked as "provided" (to prevent transitive resolution).
This, however, seems like a complex and awkward solution, and we feel that there must be a better, simpler way to achieve interface/impl separation with Maven and Spring.
What you need is a runtime dependency scope:
runtime - This scope indicates that the dependency is not required for compilation, but is for execution. It is in the runtime and test classpaths, but not the compile classpath.
(https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html)
Define a runtime dependency from one impl module to another impl module where you use its impl classes in the *-beans.xml config. IntelliJ will correctly recognize these classes in Spring configuration files, but won't auto-complete them in code (it will in test code, though).
Also, if anyone used those classes in the code, compilation through Maven would fail, because the runtime dependency is not on the compile classpath.
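A sketch of such a declaration, using the module names from the question (groupId and version are placeholders):
<dependency>
    <groupId>com.example</groupId>
    <artifactId>commons-impl</artifactId>
    <version>1.0.0</version>
    <scope>runtime</scope>
</dependency>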
You can achieve decoupling of api and impl like this:
+ commons (pom)
+ pom.xml <--- serves as a parent aggregator (see below)
+ commons-api (jar) <--- contains models, interfaces and abstract classes only
+ commons-impl (jar) <--- depends on commons-api
+ commons-config (jar) <--- depends on commons-impl only (no need to depend on commons-api as it is brought in transitively)
+ external-project (war or jar) <--- has commons-config as a dependency
Parent aggregator pom (specify build order):
<modules>
    <module>commons-api</module>
    <module>commons-impl</module>
    <module>commons-config</module>
</modules>
The config module can be omitted if it only contains the Spring application context configuration. The application context XML should be in the classpath and folder structure of the module that produces the artifact you are deploying. So if you are building a war artifact, the app context should be in there.
The only configuration that should be in your commons module would be in a test package of your impl module.
In short, you want IDEA to override the Maven dependency graph, but avoid keeping this configuration in IDEA project files?
One option is to group the implementation dependencies in a Maven profile. This profile would not be enabled by default, but you should be able to mark it as active under IDEA.
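A sketch of such a profile (the profile id and coordinates are illustrative):
<profiles>
    <profile>
        <id>ide</id>
        <dependencies>
            <dependency>
                <groupId>com.example</groupId>
                <artifactId>commons-impl</artifactId>
                <version>1.0.0</version>
            </dependency>
        </dependencies>
    </profile>
</profiles>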
Two ideas come to mind:
You will have one (or more) modules where all the other modules (api + impl) are dependencies; you could place your spring configuration files there.
Place the spring configuration files in the api modules and declare a dependency on the impl module with scope provided. This way the implementations will be known, while the api has no dependency on them for deployment.
commons-impl at runtime scope in external modules
commons (pom dependencyManagement) =>
+commons-api (compile)
+commons-impl (compile)
+commons-config (compile)
commons-impl (pom dependencies) =>
+commons-api (compile)
+commons-config (compile)
external modules (pom dependencies) =>
+commons-impl (runtime)
+commons-api (compile)
+commons-config (compile)
keep the number of modules as small as possible;
This speeds up the project build and simplifies its layout.
keep the module structure as flat as possible: a single root + all sub-modules in the same folder, e.g.:
pom.xml
commons-api/
commons-runtime/
module-a-api/
module-a-runtime/
...
This simplifies navigation across the project when the number of modules is really high (>50)
provide runtime-scoped dependencies to the runtime modules only when they are required;
This keeps your architecture clear. Use mocks instead of an explicit dependency on another runtime module.
keep your api spring contexts in api modules; define your public beans as abstract bean + interface;
keep your implementation contexts in runtime modules; override api beans with your implementations via spring profiles (use <beans profile="default">; see the sketch below).
Result: simple, transparent layout and design; full ide support; no explicit dependencies on runtime module internals.
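A minimal sketch of the abstract-bean-plus-profile pattern from the last two points (bean ids and class names are illustrative):
<!-- commons-api: api-beans.xml -->
<bean id="userService" abstract="true"/>

<!-- commons-runtime: runtime-beans.xml -->
<beans profile="default">
    <bean id="userService" class="com.example.impl.UserServiceImpl"/>
</beans>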

REST client inside of OSGi application

I need to integrate a REST client into an existing OSGi application implemented using Apache Felix. The REST service is based on the RESTeasy implementation (version 2.3.2.Final) of JAX-RS. I created a separate bundle with the client's dependencies, exporting the required RESTeasy packages and importing them in the bundle where the client is used, but unfortunately I cannot get it working inside of the OSGi context.
I tried two different approaches. The first one uses the generic ClientRequest:
ClientRequest request = new ClientRequest(MyService.URL_TEST+"/stats");
request.body(javax.ws.rs.core.MediaType.APPLICATION_XML, stats);
ClientResponse<String> response = request.post(String.class);
The error that I get in this case is pretty weird:
[java] java.lang.RuntimeException: java.lang.ClassCastException:
org.jboss.resteasy.client.core.executors.ApacheHttpClient4Executor cannot be cast to
org.jboss.resteasy.client.ClientExecutor
where it is known for sure that ApacheHttpClient4Executor implements the ClientExecutor interface.
When I try to use my own REST client wrapper around RESTeasy like this:
MyService myService = MyServiceClient.getInstance();
myService.saveStatistics(stats);
I get a different exception:
[java] java.lang.LinkageError: ClassCastException: attempting to cast
jar:file:/D:/Development/Eclipses/eclipse_4.2_j2ee_x64/lib/jaxrs-api-2.3.2.Final.jar!/javax/ws/rs/ext/RuntimeDelegate.class
to
bundle://78.0:1/javax/ws/rs/ext/RuntimeDelegate.class
As far as I understand, the LinkageError most probably has to do with the way RESTeasy initializes the RuntimeDelegate using some classloader tricks, which probably run afoul of the restrictions of the OSGi framework. I suspect that the java.lang.ClassCastException mentioned first has the same source.
Is there any way to get RESTeasy working inside of OSGi?
PS: discussion about a similar issue with RESTeasy, but outside of OSGi: java.lang.LinkageError: ClassCastException
Update:
These are the libraries included in the restclient bundle:
activation-1.1.jar commons-codec-1.2.jar commons-httpclient-3.1.jar commons-io-2.1.jar commons-logging-1.0.4.jar flexjson-2.1.jar httpclient-4.1.2.jar httpcore-4.1.2.jar javassist-3.12.1.GA.jar jaxb-api-2.2.3.jar jaxb-impl-2.2.4.jar jaxrs-api-2.3.2.Final.jar jcip-annotations-1.0.jar jettison-1.3.1.jar jsr250-api-1.0.jar junit-4.10.jar log4j-1.2.14.jar resteasy-jaxb-provider-2.3.2.Final.jar resteasy-jaxrs-2.3.2.Final.jar resteasy-jettison-provider-2.3.2.Final.jar scannotation-1.0.3.jar slf4j-api-1.6.4.jar slf4j-log4j12-1.6.4.jar myservice-common-0.1.0.3.jar my-service-client-0.1.0.3-SNAPSHOT.jar stax-api-1.0-2.jar xmlpull-1.1.3.1.jar xpp3_min-1.1.4c.jar xstream-1.4.2.jar
These are the exports from the restclient bundle: javax.ws.rs, javax.ws.rs.ext, javax.ws.rs.core, org.jboss.resteasy.client, org.jboss.resteasy.client.cache, org.jboss.resteasy.client.extractors, org.jboss.resteasy.client.marshallers, org.jboss.resteasy.client.core.executors, javax.xml.bind.annotation, org.jboss.resteasy.plugins.providers, org.jboss.resteasy.plugins.providers.jaxb, org.jboss.resteasy.spi
Have a look at the SpringSource Bundle Repo; it's got some very useful pre-built bundles of common libraries, including the Apache HTTP Client, which we are using (in conjunction with Gson) to do our RESTful comms.
(unfortunately a legacy module of my project still uses OSGi, but using RESTeasy 3.0.16 now)
When I need to OSGify a dependency, my preferred solution now is to wrap it using the excellent OPS4J Pax Tipi project.
The project provides a preconfigured Maven setup (a parent POM handles the bundling); you just adapt the GAV coordinates of the original project in a Tipi sub-module with an org.ops4j.pax.tipi prefix and build the new bundle project, which pulls in the original dependency, unpacks it, and wraps it as an OSGi bundle.
You can start from an existing Tipi sub-project that best matches your project setup (dependencies, etc.) and adapt any missing OSGi imports/exports (most often these are created automatically by the maven-bundle-plugin anyway).
This worked quite well for me as long as the original project did not contain too many exotic or malformed dependencies.
However, you may run into snags like transitive dependencies using the root package, as I am currently experiencing, which can be a real show stopper (finding out which library is responsible is a real nightmare).
Unfortunately, RESTeasy seems to be affected by this, as I get exactly the same error (default package '.'), even after declaring non-test and non-provided dependencies as optional:
The default package '.' is not permitted by the Import-Package syntax.
Upgrading the maven-bundle-plugin to the latest release 3.0.1 yields a different error (even less helpful):
[ERROR] Bundle org.ops4j.pax.tipi:org.ops4j.pax.tipi.resteasy-jaxrs:bundle:3.0.16.Final.1 : Can not parse name from bundle native code header:
[ERROR] Error(s) found in bundle configuration
Update: this seems to be solved by upping the Tipi version in the POM to 1.4.0; testing...
Is RESTeasy mandatory?
I personally use Jersey in OSGi and it works perfectly, both as client and server.
This problem isn't limited to RESTeasy. It also occurs with Jersey.
It is occurring because you have two copies of the JAX-RS classes on the classpath.
You can see this in the LinkageError:
[java] java.lang.LinkageError: ClassCastException: attempting to cast jar:file:/D:/Development/Eclipses/eclipse_4.2_j2ee_x64/lib/jaxrs-api-2.3.2.Final.jar!/javax/ws/rs/ext/RuntimeDelegate.class to bundle://78.0:1/javax/ws/rs/ext/RuntimeDelegate.class
i.e. one copy is coming from:
D:/Development/Eclipses/eclipse_4.2_j2ee_x64/lib/jaxrs-api-2.3.2.Final.jar
and the other from the OSGI bundle.
This causes problems for the RuntimeDelegate class, which by default uses the system class loader to create the RuntimeDelegate implementation (see javax.ws.rs.ext.FactoryFinder).
The problem can also occur if the same jar is loaded via two different class loaders.
There are a couple of workarounds:
remove the jaxrs-api-2.3.2.Final.jar from the system class path
set the thread context class loader to that of your bundle, prior to making any JAX-RS calls.
The FactoryFinder will use this to load the RuntimeDelegate.
To avoid polluting your code with calls to Thread.currentThread().setContextClassLoader(myBundleClassLoader), you can wrap your JAX-RS client using a Proxy; e.g. see the Thread context classloader section of https://puredanger.github.io/tech.puredanger.com/2007/06/15/classloaders/
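A minimal sketch of the second workaround (bundleClassLoader is a placeholder for your bundle's class loader; restoring it in finally keeps the swap from leaking):
// Temporarily switch the thread context class loader around JAX-RS calls
// so FactoryFinder resolves RuntimeDelegate from the bundle's class space.
ClassLoader original = Thread.currentThread().getContextClassLoader();
try {
    Thread.currentThread().setContextClassLoader(bundleClassLoader);
    // ... make RESTeasy / JAX-RS client calls here ...
} finally {
    Thread.currentThread().setContextClassLoader(original);
}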
