I have the following dependency hierarchy in a service:
Service A
  -> Lib A
  -> Lib B
     -> Lib A
  -> Lib C
     -> Lib A
  -> Lib D
     -> Lib A
"->" means depends on.
A lot of problems pop up because of the above structure. The one that requires the most effort is keeping Lib A in sync across all modules to avoid class conflicts and other issues.
I am using Maven for dependency management, but it doesn't solve the issue, as I still have to update all the dependencies to avoid conflicts (semantic and functional).
Is there any better way to manage these dependencies?
Edit 1:
Scenario 1: Adding new code to Lib A which is only going to be used by Lib B.
In this case, it wouldn't be sufficient to change only Lib B. I would have to add the latest version of Lib A to Service A so that Maven picks up the correct version.
Scenario 2: Non-backward-compatible changes in Lib A.
This will cause problems if I just update Lib A in Service A, because the other libs (B, C and D) don't know about the new change and might break (e.g. adding a new argument to an existing method). I would have to update all of them.
Scenario 3: Changing an existing method in Lib A (in a backward-compatible way).
This works fine. If I update Lib A in Service A, Maven will pick up the latest Lib A, and all libs (B, C and D) will use the latest version of Lib A.
I'm not sure if I understand the question right; are you referring to dependency conflicts between libraries?
If you have the same dependency across libraries, one way of sorting it out could be to use exclusions; that way you can keep one declared dependency and exclude the others (https://maven.apache.org/guides/introduction/introduction-to-optional-and-excludes-dependencies.html).
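As an illustrative sketch (the coordinates are made up, not from the question), an exclusion on one dependency so that only the version you declare directly wins:

<dependency>
    <groupId>com.example</groupId>
    <artifactId>lib-b</artifactId>
    <version>1.0.0</version>
    <exclusions>
        <exclusion>
            <!-- drop Lib B's transitive copy of Lib A; the version
                 declared directly by Service A is used instead -->
            <groupId>com.example</groupId>
            <artifactId>lib-a</artifactId>
        </exclusion>
    </exclusions>
</dependency>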
You might look into OSGi or Java 9 modules (not finished yet, of course). I am not an expert, but these frameworks have a concept of modules/bundles where each bundle gets its own class loader. This means that if Lib B and Lib C were bundles, they could use different versions of Lib A and everything would work smoothly. (Note that OSGi supports side-by-side versions out of the box, while plain Java 9 modules put everything in one layer by default, so running two versions of the same module takes extra work with ModuleLayers.) It does prevent Lib B and Lib C from communicating directly, but given your dependency tree that should be what you want.
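For instance, in OSGi each bundle declares a version range for the packages it imports, and the framework resolves each bundle independently, so B and C can be wired to different versions of Lib A (a sketch; the package name and versions are illustrative):

Bundle B's MANIFEST.MF:
Import-Package: com.example.liba;version="[1.0,2.0)"

Bundle C's MANIFEST.MF:
Import-Package: com.example.liba;version="[2.0,3.0)"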
--Update to match your update--
Scenario 1:
Update lib A version only in module B. Module C/D can remain unchanged.
Scenario 2:
You can update lib A in module B/C/D independently when each of those modules is ready to use the new version of lib A.
Scenario 3:
No issue.
Think about Domain-Driven Design and, more specifically, the Bounded Context pattern. It allows code duplication to a certain degree, and that flexibility will let you decouple "jar A" from the rest very easily.
What exactly is a Bounded Context?
Let's imagine that we have a User. This User in one context can be an AdminUser, in another context a PowerUser, and in yet another context just a User.
Now let's imagine that we have 3 services addressing three different functions of a User based on its type. Each service represents a different context of the term User. The point here is that instead of creating one almighty User (which in your case would go into the almighty libA), we create 3 users based on context and give each of them different functions. Eventually we end up with 3 domain objects: AdminUser for service 1, PowerUser for service 2 and just User for service 3. These three services will not have dependencies on each other unless it suits us, and even if they do, it will be a piece of cake to decouple them at any point.
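As a toy sketch in Java (all names are illustrative, not from the question), each context owns its own user type instead of sharing one class from libA:

// admin context (service 1)
package com.example.admin;
public class AdminUser {
    public void manageAccounts() { /* behavior only the admin context needs */ }
}

// power-user context (service 2)
package com.example.power;
public class PowerUser {
    public void runAdvancedReports() { /* behavior only the power-user context needs */ }
}

// basic context (service 3)
package com.example.basic;
public class User {
    public void viewProfile() { /* behavior every plain user has */ }
}

Since each type lives in its own context, changing AdminUser never forces a release of the other two services.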
Think of a service as an onion. On the first layer we have persistence. On the second layer we start unwrapping our domain model. On the next layer we build our services, then our network representation, and so on.
Thinking in terms of Bounded Contexts will allow you to create a unique Bounded Context per service, and this will let you avoid these interdependencies between the different jars, because your services will not necessarily share code with each other.
Service A
  -> Project Commons
     -> Lib B_A (functional dependency; commons for B, no functional dependency with C and D)
     -> Lib C_A (functional dependency; commons for C, no functional dependency with B and D)
     -> Lib D_A (functional dependency; commons for D, no functional dependency with B and C)
     -> Lib A (technical dependencies, common to B, C and D)
     -> pom_parent
  -> Lib B
     -> Lib A and Lib B_A
  -> Lib C
     -> Lib A and Lib C_A
  -> Lib D
     -> Lib A and Lib D_A
pom_parent (SNAPSHOT)
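As an illustrative sketch (the coordinates are made up), pom_parent can pin Lib A's version for every module through dependencyManagement:

<!-- pom_parent pom.xml -->
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.example</groupId>
            <artifactId>lib-a</artifactId>
            <version>2.0.0</version>
        </dependency>
    </dependencies>
</dependencyManagement>

Each lib then declares pom_parent as its <parent> and declares Lib A without a version, inheriting the managed one:

<dependency>
    <groupId>com.example</groupId>
    <artifactId>lib-a</artifactId>
</dependency>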
Related
I have a library B that depends on another library A.
A third library C depends on B and A.
I can satisfy C's dependencies with
dependencies {
    implementation files('/path/A.jar')
    implementation files('/path/B.jar')
}
but I would rather only declare B and build B.jar in such a way that it contains and exposes A as well.
I know that with api files('/path/A.jar') B can expose the parts of A that it uses in interfaces, but (my experience is that) it doesn't let consuming projects import anything from A explicitly.
How can B expose A completely?
files() is only for local flat-file use. There are two mechanisms to share transitive dependencies:
project(':other') (project dependency)
'group:artifact:version' (Maven repository dependency)
https://docs.gradle.org/current/userguide/declaring_repositories.html#sec:repository-types
If the source for B and A is available locally and can be built along with C, then declaring them like below is possible:
implementation project(':A')
implementation project(':B')
Otherwise, go with a proper Maven repository artifact.
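As a sketch, assuming a multi-project build where A and B are included as subprojects, declaring A as an api dependency of B makes A visible to B's consumers as well:

// settings.gradle
include ':A', ':B', ':C'

// B/build.gradle
plugins { id 'java-library' }
dependencies {
    api project(':A')   // api (not implementation) exports A to B's consumers
}

// C/build.gradle
dependencies {
    implementation project(':B')   // C can now compile against both B and A
}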
I have a weird behavior in my project using Spring. I have this structure:
Main-Spring-Project
|_Depends on Library A
|_Depends on Library B
Now... there is an @Autowired field in a component in A which injects a component that lives in B.
When I have all 3 projects open in my development environment and start the main project, everything works fine. But if I close project A and leave B open, it all crashes on startup, saying:
Field factory in [class in A] required a bean of type '[class in B]' that could not be found.
So... I suspect it is loading A before B.
What I don't understand is why it works perfectly if I have both projects open, and why it crashes when I have A closed (the main project then uses its JAR).
By the way... if I close both A and B, everything works perfectly again.
No, Spring doesn't make a mistake during context loading.
This behavior makes sense: the class is just not available at runtime.
Here you are working with the snapshot/current development source code/compiled classes of library B, since library B is part of your current development code.
It means that the dependencies/classes of library B needed while the application runs are not retrieved through Maven but from what your IDE has access to: the open library A and B projects.
To avoid that, you should install (mvn clean install) the B dependency into your local Maven/Gradle repository, but that may not be practical if you need to repeat the task 20 times a day because the B source code changes 20 times a day.
Note that if you didn't use your IDE, you would be forced to install the dependency after each modification. So in the end, it is not so bad after all, no?
General Description:
I have two projects A and B.
Project A must use version v1 of the L library/API.
Project B must use version v2 of the L library/API.
Project A has a dependency on project B (in project A, I need to call a method contained in B).
Concrete description:
Project A is actually a machine learner with a collection of algorithms that use an older version of spark-mllib.
I want to integrate the XGBoost-Spark algorithm into project A.
The problem is that the XGBoost API, specifically the ml.dmlc.xgboost4j.scala.spark.XGBoost.train() method, expects an RDD<org.apache.spark.ml.feature.LabeledPoint>. But org.apache.spark.ml.feature.LabeledPoint is only available in the newer version of spark-mllib, and from project A (which uses the older version of spark-mllib) I only have access to an org.apache.spark.mllib.regression.LabeledPoint. So I cannot directly integrate XGBoost into project A without upgrading project A's spark-mllib version.
Fortunately, the newer version of spark-mllib has a method of converting from the old LabeledPoint (org.apache.spark.mllib.regression.LabeledPoint) to the new LabeledPoint (org.apache.spark.ml.feature.LabeledPoint). The method is: org.apache.spark.mllib.regression.LabeledPoint.asML().
So, the question is: is there any clever way of using that .asML() method, which is available only in the newer version of Spark, so that I can convert the LabeledPoint and pass it to the XGBoost API?
I am not familiar with how dependencies are treated by Maven, but I thought of something like this:
Create a project B that uses the newer version of spark-mllib and the XGBoost API, in which we have a class with a method that receives the parameters (from project A), converts the old LabeledPoint to the new LabeledPoint, calls the XGBoost.train() method, which generates a model, and then passes the model back to project A. We import that class into project A (from project B), call its method, get the model, and continue with business as usual.
Of course, I tried to do that. But it doesn't work, I think because we can only have one version of spark-mllib in the whole dependency tree. Since the class from project B throws java.lang.NoSuchMethodError: org.apache.spark.mllib.regression.LabeledPoint.asML()Lorg/apache/spark/ml/feature/LabeledPoint;, it seems that the whole dependency tree actually uses the older version of spark-mllib (and that happens because the older version is closer to the root of the dependency tree), even though project B uses the newer version of spark-mllib, which has the asML() method available.
So, the actual question is: is there any clever way of making this work without upgrading the spark-mllib version in project A? Upgrading is not a viable option. Project A is big, and if I upgrade that version, I screw up just about everything.
[Update]
I even tried to use a ClassLoader (URLClassLoader) to load the class directly from spark-mllib_2.11-2.3.0.jar and print all the available methods. Code here:

import java.lang.reflect.Method;
import java.net.URL;
import java.net.URLClassLoader;

URLClassLoader clsLoader = URLClassLoader.newInstance(new URL[] {
        new URL("file:///home/myhome/spark-mllib_2.11-2.3.0.jar")
});
Class<?> cls = clsLoader.loadClass("org.apache.spark.mllib.regression.LabeledPoint");
Method[] m = cls.getDeclaredMethods();
for (int i = 0; i < m.length; i++)
    System.out.println(m[i].toString());
In the pom.xml of this project, if I add a dependency of:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.11</artifactId>
    <version>2.3.0</version>
</dependency>
the method public org.apache.spark.ml.feature.LabeledPoint org.apache.spark.mllib.regression.LabeledPoint.asML() is present in the results.
But when I use version 1.6.2 of spark-mllib, it isn't there anymore, even though the asML() method is inside the jar I point the class loader at. Which is kind of weird.
You can achieve this by creating a shaded dependency of Project B and using it in Project A. Refer to this answer for understanding maven shading and how to use it.
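For illustration, a minimal sketch of what shading could look like in Project B's pom.xml, assuming the maven-shade-plugin with a package relocation (the shaded package name is made up):

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>3.2.4</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <relocations>
                            <relocation>
                                <!-- move B's bundled copy of the Spark classes into a
                                     private package so they cannot clash with A's
                                     older spark-mllib -->
                                <pattern>org.apache.spark</pattern>
                                <shadedPattern>shaded.org.apache.spark</shadedPattern>
                            </relocation>
                        </relocations>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>

Relocation rewrites B's bytecode to reference the renamed packages; whether relocating Spark itself is practical depends on how the job is launched, so treat this as a starting point. Incidentally, the URLClassLoader experiment above behaves that way because URLClassLoader delegates to its parent class loader first: when spark-mllib 1.6.2 is on the application classpath, LabeledPoint is loaded from there rather than from the 2.3.0 jar passed to the loader.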
Consider the following situation. I have two gradle (sub-)projects called "A" and "B". A defines some classes/interfaces that are being referenced by B. So B has a compile dependency to A. Now A is a web server that should be started with B on the classpath. How do you achieve that with gradle?
Of course it is not possible to add B as a compile dependency to A, because that would create a circular dependency between A and B. Even adding B as a runtime dependency to A did not work, because then compile errors in B state that the referenced classes from A do not exist. But why?
One solution would be to move the code from B into A, but I really would like to keep that code separate, because there might be another implementation of B later that I want to swap in easily (e.g. by exchanging the jar on the runtime classpath).
Another solution I was thinking about is to extract the classes from A that are referenced by B into a new module and make both A and B depend on that new module. This sounds valid, but it would imply moving the persistence layer from A into that new module, which feels wrong.
Additional information: A is a Spring boot web application with persistence layer, web services etc, B produces a JAR.
Circular dependencies are a well-known problem when you try to do dependency injection. In this case, you have something similar but at the module level.
The only way I see to solve your issue is by creating a third module C with the common code (probably the A interfaces referenced by B).
This way you can compile C (it doesn't have any dependencies), A (it depends on C), and B (it depends on C), and launch A with B on its classpath.
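A sketch of that layout in Gradle, assuming subprojects named A, B and C (the names are illustrative):

// settings.gradle
include ':A', ':B', ':C'

// C/build.gradle — shared interfaces only, no dependencies
plugins { id 'java-library' }

// A/build.gradle — compiles against C, sees B only at runtime
dependencies {
    implementation project(':C')
    runtimeOnly project(':B')
}

// B/build.gradle — implements the interfaces from C
dependencies {
    implementation project(':C')
}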
Every time you end up with a circular dependency, you should probably introduce another entity to break the cycle.
Have a look at my explanation in this other Q&A article (it deals with packages and classes, but the idea is the same): What does it mean and how to fix SonarQube Java issue "Cycles between packages should be removed" (squid:CycleBetweenPackages)
I am writing Android JUnit test cases for Project A which has Project B as a library (Project Properties / Android / Library). My test cases need to access resources (view ids, strings, etc.) from both Project A and Project B. If I add Project A and/or Project B as Android / Libraries or as Java Build Path / Projects to my test project, any call to ActivityInstrumentationTestCase2.getActivity() throws a ClassCastException.
If I don't add them, the call returns the appropriate Activity, but I don't have access to the resource ids of the two projects under test. I also don't have access to the objects in Project B, which are needed to properly test Project A. I can't use mock objects here.
Has anybody encountered and resolved this before?
Try this to access resources from the library: (I think it will work)
getInstrumentation().getTargetContext().getResources()...;
This will load resources from the test project: (definitely works)
getInstrumentation().getContext().getResources()...;
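Putting the two together inside an ActivityInstrumentationTestCase2 subclass, a sketch (assumes android.content.res.Resources is imported; the resource name "some_view" is illustrative):

// Resources of the application under test (Project A, including the
// resources merged in from library Project B)
Resources targetRes = getInstrumentation().getTargetContext().getResources();

// Resources of the test project itself
Resources testRes = getInstrumentation().getContext().getResources();

// Resolving an id by name avoids a compile-time dependency on the
// other project's generated R class
int viewId = targetRes.getIdentifier("some_view", "id",
        getInstrumentation().getTargetContext().getPackageName());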