Consider the following situation. I have two Gradle (sub-)projects called "A" and "B". A defines some classes/interfaces that are referenced by B, so B has a compile dependency on A. Now A is a web server that should be started with B on the classpath. How do you achieve that with Gradle?
Of course it is not possible to add B as a compile dependency of A, because that would create a circular dependency between A and B. Even adding B as a runtime dependency of A did not work: compile errors in B then state that the referenced classes from A do not exist. But why?
One solution would be to move the code from B into A, but I would really like to keep that code separate, because there might be another implementation of B later that I want to swap in easily (e.g. by exchanging the JAR on the runtime classpath).
Another solution I was thinking about is to extract the classes from A that are referenced by B into a new module and make both A and B depend on that new module. This sounds valid, but it would imply moving the persistence layer from A into that new module, which feels wrong.
Additional information: A is a Spring Boot web application with a persistence layer, web services, etc.; B produces a JAR.
Circular dependencies are a well-known problem when you try to get Dependency Injection right. In this case you have something similar, but at the module level.
The only way I see to solve your issue is to create a third module C with the common code (probably the interfaces of A that B references).
This way you can compile C (it doesn't have any dependencies), then A (it depends on C) and B (it depends on C), and launch A with B on its classpath.
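A minimal sketch of that layout in Gradle (project names taken from the question, everything else assumed):

```groovy
// settings.gradle -- a sketch; A and B as in the question, C is the new shared module
include 'A', 'B', 'C'

// A/build.gradle -- the web server, compiled only against the shared module C
dependencies {
    implementation project(':C')
}

// B/build.gradle -- the implementation, also compiled only against C
dependencies {
    implementation project(':C')
}
```

At runtime, A can then be started with B's built JAR added to its classpath (for example via `java -cp`), without either project depending on the other at compile time.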
Every time you end up with a circular dependency, you should probably introduce another entity to break the cycle.
Have a look at my explanation in this other Q&A (it deals with packages and classes, but the idea is the same): What does it mean and how to fix SonarQube Java issue "Cycles between packages should be removed" (squid:CycleBetweenPackages)
Related
I have a library B that depends on another library A.
A third library C depends on B and A.
I can satisfy C's dependencies with
dependencies {
    implementation files('/path/A.jar')
    implementation files('/path/B.jar')
}
but I would rather declare only B, and build B.jar in such a way that it contains and exposes A as well.
I know that with api files('/path/A.jar') B can expose the parts of A that it uses in its interfaces, but (in my experience) it doesn't let consuming projects import anything from A explicitly.
How can B expose A completely?
files() is only for local flat-file use. There are two mechanisms for sharing transitive dependencies:
project(':other') (project dependency)
'group:artifact:version' (maven repository dependency)
https://docs.gradle.org/current/userguide/declaring_repositories.html#sec:repository-types
If the source for B and A is available locally and can be built along with C, then declaring them like below is possible:
implementation project(':A')
implementation project(':B')
Otherwise, go with a proper Maven repository artifact.
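When A and B are subprojects of the same build, the usual way for B to expose A to its consumers is the `api` configuration of the `java-library` plugin; a sketch, assuming a multi-project layout:

```groovy
// B/build.gradle -- a sketch; assumes A is a subproject in the same build
plugins {
    id 'java-library'
}

dependencies {
    // `api` places A on the compile classpath of B's consumers,
    // so C can import classes from A directly
    api project(':A')
}
```

With flat JAR files the analogous declaration is `api files('/path/A.jar')`, with the caveat the question mentions; publishing A as a real repository artifact avoids that limitation.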
I have a weird behavior in my project using Spring. I have this structure:
Main-Spring-Project
|_Depends on Library A
|_Depends on Library B
Now... there is an @Autowired in a component in A which injects a component that lives in B.
When I have all 3 projects open in my development environment and start the main project, everything works fine. But if I close project A and leave B open, it all crashes on startup, saying:
Field factory in [class in A] required a bean of type '[class in B]' that could not be found.
So... I suspect it is loading A before B.
What I don't understand is why it works perfectly when I have both projects open, and why it crashes when A is closed (the main project then uses its JAR).
By the way... if I close both A and B, everything works perfectly again.
No, Spring doesn't make a mistake during context loading.
This behavior makes sense: the class is just not available at runtime.
Here you work with the snapshot/current development source code and compiled classes of library B, since library B is part of your current development code.
It means that the dependencies/classes of library B needed while running the application are not retrieved through Maven but from whatever your IDE has access to: the library A and B projects.
To avoid that, you should install (mvn clean install) the B dependency into your local Maven/Gradle repository, but that may not be practical if you need to repeat the task 20 times a day because the B source code changes 20 times a day.
Note that if you didn't use your IDE, you would be forced to install the dependency after each modification. So finally, it is not so bad at all, no?
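If B happens to be built with Gradle rather than Maven, the equivalent of `mvn clean install` is publishing to the local Maven repository; a sketch, with hypothetical coordinates:

```groovy
// build.gradle of library B -- a sketch; group/version are assumptions
plugins {
    id 'java-library'
    id 'maven-publish'
}

group = 'com.example'
version = '1.0.0-SNAPSHOT'

publishing {
    publications {
        mavenJava(MavenPublication) {
            from components.java
        }
    }
}

// Run `./gradlew publishToMavenLocal` after each change so that
// consumers resolving from mavenLocal() pick up the new snapshot.
```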
I have an Android Studio project with 2 modules: A and B. (I am not including the Annotation Processor and Annotations modules here.)
B depends on A.
B is an Android library module, and A is a simple Java library module. I also have an Annotation Processor on module B.
The problem I'm facing is:
I want to generate some code based on annotated files located in both modules, A and B. The problem comes from the way the Annotation Processor works: only with source code files (*.java), not with compiled (*.class) ones. Unfortunately, during the compilation of B, the Annotation Processor doesn't have access to those source files from A...
The only thing I was able to think of as a kind of solution, even an ugly one, was to include the folder with the annotated classes from module A as a source set of module B. This way I give module B access to those files during compilation:
sourceSets {
    main {
        java {
            srcDirs = ['src/main/java', '../module_A/src/main/java/path/to/annotated/classes/folder']
        }
    }
}
That solves the problem - now the Annotation Processor has access to all the annotated classes from both modules, but...
Unfortunately, it introduces another issue: those annotated classes from module A are now compiled twice, and they are included both in module A's JAR file and in module B's AAR file.
Question 1: Is there another way to access those source files of module A from the Annotation Processor running on B? (From what I was able to find, the answer is NO, but checking...)
Question 2: How can I exclude those compiled files (the repeated ones) from the AAR final package of module B?
Question 3: Maybe... that's an absolutely wrong approach? Any suggestions?
Thanks in advance!
Nope, you cannot achieve what you want using just the java.lang.model API. At least not without some additional tricks.
The issue is not binary vs. source. Annotation processors can use Elements#getTypeElement to introspect compiled classes as well as source-defined classes:
Elements elementUtil = processingEnvironment.getElementUtils();
TypeElement integerClass = elementUtil.getTypeElement("java.lang.Integer");
TypeElement myClass = elementUtil.getTypeElement("currently.compiled.Class");
But you still need to have the class on the compilation classpath to observe it, and the class must be in the process of being compiled to be visible to getElementsAnnotatedWith.
You can work around the latter limitation by using a tool like FastClasspathScanner: it will use its own mechanisms to find annotations in compiled bytecode and report them to you separately from the compilation process. But you cannot work around the classpath issue: if a dependency is not on the compilation classpath, it cannot be processed. So you have to compile the modules together, either by merging them into one (as you did) or by declaring one to depend on the other. In the latter case you might not be able to use getElementsAnnotatedWith, but getTypeElement and FastClasspathScanner will work.
I am trying to pull a number of classes and packages out of a large project to create a separate standalone module. When I try to compile these classes, the dependencies on other classes leave me with a very large number of compiled classes that I don't intend to have in the standalone module.
e.g. if the class dependency chain is A -> B -> C -> D and I compile A, I will end up with A.class, B.class, C.class and D.class. I want to break the dependency on class D and refactor the code so that class D doesn't become part of the module. But for that to happen, I would have to know the dependency path(s) between class A and class D.
I tried searching SO but without success so far.
For future stumble-upons: at least as of IntelliJ 2020.1, you can find the list of classes a class needs (or depends upon), recursively, by:
Right click on the class name
Click Analyze > Dependencies
Choose File .java
You can also specify the depth required
(For finding which classes depend on the class in question, choose Analyze > "Backward Dependencies", instead)
For other cases or alternatives: How do I get a list of Java class dependencies for a main class?
I have following dependency hierarchy in a service:
Service A
-> Lib A
-> Lib B
-> Lib A
-> Lib C
-> Lib A
-> Lib D
-> Lib A
"->" means depends on.
There are a lot of problems that pop up because of the above structure. The one that requires the most effort is keeping Lib A in sync across all modules to avoid class conflicts and other issues.
I am using Maven for dependency management, but it doesn't solve the issue, as I have to update all dependencies to avoid conflicts (semantic and functional).
Is there any better way to manage these dependencies?
Edit 1:
Scenario 1: Adding new code to Lib A which is only going to be used by Lib B.
In this case, it wouldn't be sufficient to change only Lib B. I would also have to add the latest version of Lib A to Service A so that Maven picks up the correct version.
Scenario 2: Non-backward-compatible changes in Lib A.
This will cause problems if I just update Lib A in Service A, because the other libs (B, C and D) don't know about the new change, and it might break (e.g. adding a new argument to an existing method). I would have to update all of them.
Scenario 3: Changing an existing method in Lib A.
This will work fine. If I update Lib A in Service A, Maven will pick up the latest Lib A and all libs (B, C and D) will use the latest version of Lib A.
I'm not sure if I understand the question right; are you referring to a dependency conflict between the libraries?
If you have the same dependency across libraries, one way of sorting it out could be to use exclusions; that way you can declare one dependency and exclude the other copies (https://maven.apache.org/guides/introduction/introduction-to-optional-and-excludes-dependencies.html)
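The linked guide shows the Maven syntax. If the build were Gradle, a comparable sketch (all coordinates here are hypothetical) would be:

```groovy
// build.gradle of Service A -- a sketch with hypothetical coordinates
dependencies {
    // Keep the copy of Lib A that Service A declares directly ...
    implementation 'com.example:lib-a:2.0'

    // ... and exclude the transitive copies pulled in by the other libs,
    // so only one version of Lib A ends up on the classpath
    implementation('com.example:lib-b:1.0') {
        exclude group: 'com.example', module: 'lib-a'
    }
    implementation('com.example:lib-c:1.0') {
        exclude group: 'com.example', module: 'lib-a'
    }
}
```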
You might look into OSGi or Java 9 modules (not finished yet, of course). I am not an expert, but I believe these frameworks allow for a concept of modules/bundles where each module has its own class loader. This means that if Lib B and Lib C were modules, they could have different versions of Lib A and everything would work smoothly. It does prevent Lib B and Lib C from communicating directly, but given your dependency tree that should be desirable.
--Update to match your update--
Scenario 1:
Update the Lib A version only in module B. Modules C/D can remain unchanged.
Scenario 2:
You can update Lib A in modules B/C/D independently, when each of those modules is ready to use the new version of Lib A.
Scenario 3:
No issue.
Think about Domain-Driven Design and, more specifically, the Bounded Context pattern. It allows code duplication to a certain degree, and that flexibility will let you decouple "jar A" from the rest very easily.
What exactly is a Bounded Context?
Let's imagine that we have a User. In one context this User can be a UserAdmin, in another context a PowerUser, and in yet another context just a User.
Now let's imagine that we have 3 services addressing three different functions of a User based on its type. Each service represents a different context of the term User. The point here is that instead of creating one almighty User (in your case it would go into the almighty "Lib A"), we create 3 users based on context and give each of them different functions. Eventually we end up with 3 domain objects: AdminUser for service 1, PowerUser for service 2 and just User for service 3. These three services will not have dependencies on each other unless it suits us, and even if they do, it will be a piece of cake to decouple them at any point.
Think of a service as an onion: on the first layer we have persistence, on the second layer we start unwrapping our domain model, on the next layer we start building our services, then our network representation, and so on.
Thinking in terms of Bounded Contexts will allow you to create a unique bounded context per service, and this will let you avoid these interdependencies between the different JARs, because your services will not necessarily share code with each other.
Service A
-> Project Commons
-> Lib B_A (functional dependency; commons for B, no functional dependency with C and D)
-> Lib C_A (functional dependency; commons for C, no functional dependency with B and D)
-> Lib D_A (functional dependency; commons for D, no functional dependency with B and C)
-> Lib A (technical dependencies, common to B, C and D)
-> pom_parent
-> Lib B
-> Lib A and Lib B_A
-> Lib C
-> Lib A and Lib C_A
-> Lib D
-> Lib A and Lib D_A
pom_parent (SNAPSHOT)
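If this layout were a Gradle build rather than a Maven parent POM, a sketch of the wiring (module names are hypothetical, mirroring the tree above) could look like:

```groovy
// settings.gradle -- hypothetical names mirroring the tree above
rootProject.name = 'service-a'
include 'lib-a', 'lib-b-a', 'lib-c-a', 'lib-d-a'
include 'lib-b', 'lib-c', 'lib-d'

// lib-b/build.gradle -- Lib B depends on the shared technical commons
// plus its own functional commons only (lib-c and lib-d are analogous)
dependencies {
    implementation project(':lib-a')
    implementation project(':lib-b-a')
}
```

The point of the split is that a change in, say, Lib B's functional commons (lib-b-a) never forces a rebuild or re-release of Lib C or Lib D.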