I am trying to build a Dropwizard (Jersey) REST endpoint that communicates with HBase. Although these are my only two top-level dependencies, both of these dependencies come loaded with many transitive dependencies that conflict. A simple example of such a conflict is Google's Guava:
The HBase client specifies version 11
Dropwizard specifies 18
Dropwizard will not work with version 11 and HBase will not work with version 18.
I have examined the Maven shade plugin documentation, but it does not seem to let you relocate classes found in dependency jars. So I don't know how to resolve this issue short of separating these two components into separate JVMs.
This is a dirty solution. But you could...
Create a project / module where you define a set of service interfaces that your Dropwizard app will use to talk to HBase.
Create another module / project that implements these interfaces and uses the HBase classes. Shade this project.
In your Dropwizard project include only the interface jar but create a task to copy the shaded artifact into your resources.
Create a JarClassLoader for your shaded HBase client artifact. You may have to make a special subclass that does not delegate to the parent, as by default the classloader will ask the parent to resolve linkages and may pull the newer version of Guava from the outer classloader.
Ask for an instance of the service contract from the Jar loader...
Businessing api = (Businessing) Class.forName("com.awesome.Businessing", true, jarLoader).newInstance();
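A minimal sketch of such a parent-last loader (the class name is illustrative, not from the original setup). One caveat: keep the service interface itself out of the shaded JAR, otherwise it loads twice and the cast above fails with a ClassCastException.

import java.net.URL;
import java.net.URLClassLoader;

// Child-first (parent-last) classloader: tries its own JARs before delegating
// to the parent, so the shaded HBase client resolves its own Guava instead of
// the Dropwizard app's.
public class ChildFirstClassLoader extends URLClassLoader {

    public ChildFirstClassLoader(URL[] urls, ClassLoader parent) {
        super(urls, parent);
    }

    @Override
    protected Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException {
        synchronized (getClassLoadingLock(name)) {
            Class<?> c = findLoadedClass(name);
            if (c == null) {
                try {
                    // Look in this loader's JARs first; JDK classes and anything
                    // not in the shaded artifact fall through to the parent below.
                    c = findClass(name);
                } catch (ClassNotFoundException e) {
                    c = super.loadClass(name, resolve);
                }
            }
            if (resolve) {
                resolveClass(c);
            }
            return c;
        }
    }
}

You would construct it with the URL of the copied shaded artifact and the application's classloader as the parent.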
Try to specify concrete versions for those dependencies in the <dependencyManagement/> section of your pom.xml.
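For illustration, a sketch of pinning a single Guava version this way (the version shown is only an example; per the question, no one Guava version may satisfy both sides here, so this only helps when a mutually compatible version exists):

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
      <!-- illustrative version; must be one both consumers tolerate -->
      <version>18.0</version>
    </dependency>
  </dependencies>
</dependencyManagement>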
Related
I have implemented a springboot application that can be extended by loading plugins into it. The plugins are created by extending a common interface specified in my library (LibA) and are loaded into the springboot application using Java SPI. Because of this, LibA is a dependency of the springboot application.
There are some common libraries (LibB, LibC) that are used by the plugins which are not part of the springboot application.
The current implementation requires each plugin to be a fat jar containing LibA, LibB, and LibC, and because of this the versions of the libs need to be the same, otherwise there will be issues (such as NoSuchMethodError) while loading these plugins.
I would like to have these common libraries as part of the springboot application and have the plugins refer to the versions in the application. Doing this could result in smaller jars and also ensure the consistency of the version of the libraries used by the plugins.
I could specify LibA, LibB, and LibC as dependencies of the springboot application and have the plugins remove them from their jars, and things might work, but this would still require the versions to be specified in each plugin's build.gradle.
What I want to know is how to have something similar to the way springboot specifies and provides its dependencies. Other ways to manage this are also appreciated. I'm using gradle as the build tool (see the sketch after the summary below).
Current
App (LibA), plugins (LibA, LibB, LibC)
What I need
App (LibA, LibB, LibC), plugins (just refer to the versions in app)
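A minimal sketch of what that could look like on the plugin side, assuming the app publishes a platform/BOM and provides the libraries at runtime (all coordinates below are hypothetical):

// Plugin's build.gradle
dependencies {
    // Import the app's published platform/BOM so versions come from the app.
    compileOnly platform('com.example:app-dependencies:1.0.0')
    // Compile against the libraries without bundling them; the app supplies
    // them on the classpath when it loads the plugin via SPI.
    compileOnly 'com.example:lib-a'
    compileOnly 'com.example:lib-b'
    compileOnly 'com.example:lib-c'
}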
I'm unsure if this is the correct channel to post this question, let me know if it's not suitable.
I am using spring-retry in my spring-boot service, as well as in a spring-boot library.
I noticed that this scenario works:
The library uses spring-retry logic, but the spring-retry jars are not declared as dependencies of the library.
The library is used in the parent service as a Maven dependency.
The parent service declares the spring-retry Maven dependencies.
Is it normal for the library code to use the Maven jars from the parent app, and not need to declare the jars itself? My intuition says it should work, since the resulting compilation unit will have the dependencies available, but I'm not sure.
Sorry if this is a super naive question, but my searches did not give a good answer (maybe I wasn't using the right keywords).
I'm not sure I've got you right, probably this question should be rephrased.
So you say that there is a "spring-boot library" that uses spring-retry logic.
If so, this library has a Maven module and it gets compiled into a regular jar, right?
But if so, and it uses classes/interfaces/annotations from the spring-retry library without having it on the compile classpath, how does it get compiled? I believe you do have the spring-retry library in the dependencies but just don't notice it (try mvn dependency:tree in the spring boot library module to see the dependencies).
Other than that - usually when you develop a library, it's intended to be reused by different applications, and if it has dependencies of its own, it should usually list them in the library's pom. Library developers also usually try to minimize the library's dependency list.
So if the pom.xml of the library doesn't list the required dependencies, it won't even compile.
Now at runtime, all the dependencies (including transitive ones, of course) should be available to the spring boot application, otherwise the classes that use them might not load. But other than that, spring, being a runtime framework, doesn't really care how the dependency found its way into the BOOT-INF/lib folder - it's expected to work as long as the dependency is there.
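As an illustration of that split between compile-time declaration and runtime provisioning, the library could declare spring-retry with provided scope, compiling against it while expecting the consuming application to ship it (the version shown is only an example):

<!-- In the library's pom.xml -->
<dependency>
  <groupId>org.springframework.retry</groupId>
  <artifactId>spring-retry</artifactId>
  <version>1.3.4</version>
  <!-- available at compile time, expected from the app at runtime -->
  <scope>provided</scope>
</dependency>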
I am using gradle 6.7 and creating a library project which is compiled to a jar that is placed in my own s3 artifacts repository.
In my project I have dependencies on other artifacts, declared with the implementation configuration.
The jar is created (not a fat jar) and uploaded to the s3 repository.
When, in another project, I use my library by fetching it as an implementation dependency, I get NoClassDefFoundError for classes from the other dependencies I used in my library, which means those dependencies are not found at runtime.
My question is whether it is a good idea to create a fat jar. I don't think other libraries (e.g. springboot and others) use fat jars, right? However, when I use them as dependencies, their own dependencies are found at runtime.
Does it mean that using implementation for the other dependencies in my project is not the right way? Should I use something else? Could you please elaborate a bit more on what the right way is?
Thank you
Check out the Java Library Plugin for Gradle. It exists for this exact situation.
If a dependency of your library needs to be exposed to the consumer of your library, then you would use api instead of implementation. There is a nice section within the plugin documentation here that can be used to help you identify when to declare a dependency as api vs implementation.
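A minimal sketch of the distinction in a library's build.gradle (coordinates are hypothetical):

plugins {
    id 'java-library'
}

dependencies {
    // Types from this artifact appear in the library's public API, so
    // consumers also get it on their compile classpath.
    api 'com.example:exposed-lib:1.0.0'

    // Used only internally; consumers still get it on their runtime
    // classpath, but not at compile time.
    implementation 'com.example:internal-lib:2.0.0'
}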
I have a project that has two needed dependencies. These dependencies in turn have Apache CXF dependencies: dependency A uses Apache CXF version 2.4.0, whereas dependency B uses Apache CXF version 3.1.0.
Because classes were changed between the versions, using one or the other results in ClassNotFoundExceptions.
A little description of what they are: dependency A is the client jar for some old SOAP webservices, including the files autogenerated by wsdl2java. Dependency B is the client jar for a new REST webservice.
The main project uses Maven to handle dependencies, is a war, and is on tomcat 7.
Any thoughts for what I could do to try and get this to work correctly? I have already tried making the REST client a jar-with-dependencies and bringing it in through Tomcat's common classloader with no luck.
EDIT
After reviewing the dependencies I am bringing in, I can't simply update the dependencies because I do not have control of some of the dependencies being brought in. I would need to alter what those jars are doing and that will not work.
It's never a good idea to use two versions of the same dependency, as it can create discrepancies in your project.
Ideally you should use the latest version and then make the old code compatible with it.
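One way to converge on a single CXF version, sketched with hypothetical coordinates for dependency A: exclude the CXF 2.4.0 artifacts it drags in and let dependency B's 3.1.0 win. This only works if A's code actually runs against 3.1.0 (wildcard exclusions require Maven 3.2.1+):

<dependency>
  <groupId>com.example</groupId>
  <artifactId>dependency-a</artifactId>
  <version>1.0</version>
  <exclusions>
    <!-- drop A's CXF 2.4.0 so B's 3.1.0 is the only copy on the classpath -->
    <exclusion>
      <groupId>org.apache.cxf</groupId>
      <artifactId>*</artifactId>
    </exclusion>
  </exclusions>
</dependency>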
The basic problem is as such: I've got a project that already uses multiple Maven modules for various sub-projects. However, one of the modules (the core module) could itself be split into multiple OSGi bundles when created. This is due to the core module containing several optional dependencies, each of which have isolated Java packages where they're required. For instance, support for JSON input files are optional as they require the optional dependencies from Jackson. The classes that rely on the Jackson dependencies are all isolated to certain json packages within the module. Thus, in theory, I could create a minimal bundle from core that doesn't include the packages that rely on optional dependencies.
Normally, I'd simply split up this module into more Maven modules to make life easier for creating bundles via Felix's maven-bundle-plugin. The problem here is that I still want to create a core JAR for non-OSGi users who don't want to have to include several extra JARs just to use optional functionality (which requires they provide the optional dependencies on the class path as it is). Not only that, but I don't wish to have to split up this module into more modules as it makes development on the project more tedious for the developers as well, especially when we're already splitting up code into proper package-based modules as it is.
The way we were trying to use OSGi already was to make the API module a fragment host (in order to allow it to load a provider bundle without requiring OSGi support), then make the other bundles use said fragment host. This seemed to work well for the smaller modules outside of core, but for core, we wanted to be able to provide multiple bundles from a single module so that optional dependencies wouldn't be required in the bundle itself. As it stands, for plugins, we already have a mechanism for scanning them and ignoring plugins that don't have all the required classes to load them (e.g., if a plugin requires a JPA provider but the JPA API is not available, that plugin isn't loaded). Once we can successfully split up the core module into multiple bundles, I can use declarative services as the plugin method in an OSGi environment (instead of the default class path JAR scanning mechanism in place for normal Java environments), so that isn't an issue.
Is there any way to do all this using Felix's maven-bundle-plugin? Or will I have to use the assembly plugin to copy subsets of the module where bundles can be generated from? Or will I have to resort to writing an Ant script (or Maven plugin) to do this? We've tried using separate Maven modules that simply import the core module as a dependency and generating a bundle from there, but the resultant bundle is always empty regardless of import/export package settings and embed dependencies.
Or, is there a better way to do this? We already use the <optional>true</optional> configuration for the optional dependencies, yet the Felix plugin doesn't seem to care about that and imports all of those dependencies anyway, without using the optional attribute.
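For reference, optional resolution can also be declared explicitly in the plugin's instructions rather than relying on the Maven flag; a sketch, with illustrative package names:

<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <extensions>true</extensions>
  <configuration>
    <instructions>
      <!-- mark the Jackson packages as optional imports; everything else as usual -->
      <Import-Package>com.fasterxml.jackson.*;resolution:=optional,*</Import-Package>
    </instructions>
  </configuration>
</plugin>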
Well, this is what I'm ending up doing to accomplish this. I'm using the maven-assembly-plugin to copy the binaries I need and filtering out the classes I don't want to include using the <fileSets/> element similar to the <fileset/> element in Ant.
Using the generated directories for each assembly, I'm using the maven-bundle-plugin along with the <buildDirectory/> configuration option to specify where the bundle's class files are located.
It's not ideal, but it's better than writing an Ant script for a Maven project!
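For illustration, a sketch of such an assembly descriptor, with a hypothetical json package being excluded from the minimal core bundle:

<assembly>
  <id>core-minimal</id>
  <formats>
    <!-- produce a directory the bundle plugin's buildDirectory can point at -->
    <format>dir</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <fileSets>
    <fileSet>
      <directory>${project.build.outputDirectory}</directory>
      <outputDirectory>/</outputDirectory>
      <excludes>
        <!-- leave out the classes that need the optional Jackson dependency -->
        <exclude>com/example/core/json/**</exclude>
      </excludes>
    </fileSet>
  </fileSets>
</assembly>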