I have a Maven project with the following modules: app, app-impl, web. I build web with the maven-war-plugin. Currently app-impl depends on app, and web depends on app-impl.
But web should depend only on app, and by some magic the maven-war-plugin should still include app-impl in the WAR. Can I do this without an additional Maven module?
I absolutely agree with @Torsten, but maybe there is something valid in your request:
when using IDE to develop the project, you can require it to stop offering your implementation classes in code completion lists.
If this is your reason, just add both dependencies, and for the implementation one, specify <scope>runtime</scope>.
This ensures that:
your app module gets onto the javac classpath, but app-impl does not
both app and app-impl will be placed under WEB-INF/lib/ in your war
IDE (if properly implemented) will not offer you completions from app-impl
This definitely does not save you keystrokes, but it gives you a better POM that carefully models reality.
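For concreteness, here is a minimal sketch of the web module's dependency section under that setup (the groupId and version are placeholders, not taken from the question):

<!-- web/pom.xml -- a sketch; coordinates are placeholders -->
<dependencies>
    <!-- compile-time dependency: visible to javac and the IDE -->
    <dependency>
        <groupId>com.example</groupId>
        <artifactId>app</artifactId>
        <version>1.0.0</version>
    </dependency>
    <!-- runtime-only: packaged into WEB-INF/lib, hidden from javac -->
    <dependency>
        <groupId>com.example</groupId>
        <artifactId>app-impl</artifactId>
        <version>1.0.0</version>
        <scope>runtime</scope>
    </dependency>
</dependencies>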
IMHO, having web depend only on app does not reflect reality, as your web application really depends on having an implementation of the interfaces I assume you have defined in app.
Consider having an alternative implementation in, let's say, app-impl2. How could Maven possibly decide which implementation it should choose?
So having web depend on app-impl is, to me, the way to go, and given the above setup it should also work out of the box.
I have an Android project which is architected in a modularized way. I have modularized the project by dividing its source code between multiple Gradle modules, following Clean Architecture.
Here is the structure of the App.
The top module in this hierarchy, app, is the one that no other module depends upon; it is the main module of the application. The lower-level modules domain and data do not depend on the app module, while the app module includes the data and domain modules. I have added the code below to the build.gradle of the app module:
implementation project(':domain')
api project(':data')
Now I'm having some issues with maintaining dependencies across the modules, since each of them is an individual Android module with its own build.gradle. The app module can use classes in the data and domain modules. But I have some general-purpose classes (such as some annotations, utilities, broadcast classes, Dagger scopes, etc.) which I want to use in all the modules. These are the issues I'm facing:
Since these classes are contained in the main module app, I cannot access them in data and domain, because those modules do not depend on the higher layer app.
Any libraries I'm using in all the layers (e.g. RxJava) need to be included in the build.gradle of each module.
As a solution for this I thought of adding one more Android module, say common, which will contain all my general-purpose classes as well as the libraries I use in all the modules.
All my other modules app, domain and data will be having this module as a dependency.
implementation project(':common')
So any global libraries and classes will be added to this module, and each of the individual modules will contain only module-specific classes.
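For illustration, a sketch of what such a common module's build.gradle could look like (the plugin and library versions are placeholders, not taken from my actual project). Using api rather than implementation re-exports the shared libraries to every module that depends on common, so they only need to be declared once:

// common/build.gradle -- a sketch; versions are placeholders
apply plugin: 'com.android.library'

dependencies {
    // 'api' (rather than 'implementation') re-exports these,
    // so app, domain and data get them transitively
    api 'io.reactivex.rxjava2:rxjava:2.2.0'
    api 'io.reactivex.rxjava2:rxandroid:2.1.0'
}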
Is that a good approach? Or is there any way to solve this issue efficiently?
We recently encountered this problem, as we transitioned to a multi-module project for reuse, build time optimisation (unchanged modules aren't recompiled), etc. Your core goal is to make your app module as small as possible, as it will be recompiled every time.
We used a few general principles, which may help you:
A common base-ui module contains the primary strings.xml, styles.xml etc.
Other front-end modules (profile, dashboard, etc) implement this base-ui module.
Libraries that will be used in all user-facing modules are included in base-ui, as an api instead of implementation.
Libraries that are only used in some modules are added as dependencies only in those modules.
The project makes extensive use of data syncing etc too, so there are also base-data, dashboard-data etc modules, following the same logic.
The dashboard feature module depends on dashboard-data.
The app module depends only on feature modules, dashboard, profile, etc.
I strongly suggest sketching out your module dependency flow beforehand; we ended up with ~15 or so modules, all strictly organised. In your case, you mentioned it's already quite a large app, so I imagine app needs feature modules pulled out of it, as does domain. Remember: small modules = less code to be recompiled!
We encountered some issues with making sure the same variant (buildType, flavors) of the app was used across all submodules. Essentially, all submodules have to have the same flavors and buildTypes defined as the app module.
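For example, a library module has to mirror the app module's build types roughly like this (a sketch; the staging build type is just an assumed example):

// dashboard/build.gradle -- a sketch; 'staging' is an assumed example
android {
    buildTypes {
        debug {}
        // every buildType defined in app must exist in each submodule too
        // (newer Android Gradle plugins can bridge gaps via matchingFallbacks)
        staging { initWith debug }
        release {}
    }
}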
On the other side of the coin, multi-module development really does make you think about dependencies, and it enforces strict separation between features. You're likely to run into a few unexpected problems that you've never considered before. For example, something as simple as displaying the app's version suddenly becomes complicated (disclaimer: my article).
This article also helped us decide on our approach. The article you linked also seems to be an excellent resource; I wish it had existed when we'd transitioned!
After comment discussion, here's an example diagram (with unfortunate untidiness, but enough to illustrate the concept; note that distinguishing between api and implementation would be a good next step):
I'm currently working on an ebanking platform, so our customers are banks. To extend this platform, we develop our own 'xDK' (development kit) for 3rd-party developers (usually the banks themselves).
When xDK is used as a dependency (via Maven or Gradle), it brings along a lot of transitive dependencies in order to work (~25MB). I was trying to think of solutions to make the dependency a bit lighter to use (given that it does need all of its dependencies), which in turn would promote having smaller, more focused services (not exactly microservices, but at least a step closer).
The current situation's benefit is that every service/project can use its own version of xDK and doesn't have to update until it needs to. The problem is that it doesn't scale: if we assume 100 WAR files having xDK as a dependency, we create a 2.5GB overhead on the application server (even if they all use the same version).
I'll list two options I was thinking of, but I'd like to know if there are better solutions for this problem. Feel free to ask for more info. Thanks in advance.
Similar to Java EE components (JPA, JAX-RS, ...), we'll have an 'api' dependency and the implementation. The projects will only declare the 'api' as a provided dependency, while the implementation will be provided in one of the following ways:
JBoss module
I haven't worked with other application servers. We (and our customers) only use JBoss EAP, so this might be a JBoss specific solution. We can create a JBoss module for xDK and then make every deployment depend on it via the JBoss deployment descriptor. The benefit is that we get rid of the multiple copies of the library, but we lose on version flexibility. This would mean that there needs to be some kind of governance on which version of xDK you code against in your service. Also, every time there is a breaking change, we'd need to update all services if we want to update the JBoss module to the latest version.
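For reference, wiring a deployment to such a module would look roughly like this (a sketch; the module name is a placeholder):

<!-- WEB-INF/jboss-deployment-structure.xml -- a sketch; module name is a placeholder -->
<jboss-deployment-structure>
    <deployment>
        <dependencies>
            <!-- puts the shared xDK module on this deployment's classpath -->
            <module name="com.example.xdk" export="true"/>
        </dependencies>
    </deployment>
</jboss-deployment-structure>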
Bundle in an EAR
EARs allow multiple WAR files in them, as well as JARs as libs. xDK would be an EAR dependency. Again, we have the same pros and cons as the previous solution, but this solution is JBoss-independent. However, it needs an extra build step to collect all the projects and bundle them, which might be annoying for our customers if they need to bundle their own services.
How about using the Maven dependency scope provided to declare that, for the individual WAR files, the JAR is provided outside of the WAR, and then have another mechanism to inject the shared JAR into the application server?
cf. https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html
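A minimal sketch of what that would look like in each service's pom.xml (the coordinates are placeholders):

<!-- service pom.xml -- a sketch; coordinates are placeholders -->
<dependency>
    <groupId>com.example</groupId>
    <artifactId>xdk</artifactId>
    <version>1.0.0</version>
    <!-- compiled against, but not packaged into the WAR; the server
         supplies it at runtime (e.g. a JBoss module or an EAR lib) -->
    <scope>provided</scope>
</dependency>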
[Clarification] Forgive the lack of clarity in the initial description. Allow me to re-phrase the question.
Does there exist a way to perform runtime compilation using the javax.tools API, usable in OSGi (again stressing runtime), which understands a bundle's dependencies and security constraints?
[update]
Please see https://github.com/rotty3000/phidias
It's a well-formed OSGi bundle.
The readme provides all the details of the very tiny 4-class API (8k module).
In order to get from a set of package imports and exports to a list of bundles which can be used for compilation, you'll need some sort of repository of candidate bundles, and a provisioner to work out which bundles best provide which packages. If you're using 'Require-Bundle' (not a best practice), you'll know the bundle names, but not necessarily the versions, so some provisioning is still required.
For example, in Eclipse PDE, the target platform is used as the basic repository for compilation. You can also do more sophisticated things like using Eclipse's p2 provisioning to provision your target platform, so you can use an external p2 repository as your repository instead of setting one up yourself. For command line builds, Tycho allows Maven builds to use the same sort of mechanisms for resolving the classpath as Eclipse itself uses.
An alternative approach is to list your 'classpath' as Maven dependencies, and let the maven-bundle-plugin (based on bnd) generate your manifest for you.
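A typical configuration of that plugin looks something like this (a sketch; the package names are placeholders):

<!-- pom.xml build/plugins section -- a sketch; package names are placeholders -->
<plugin>
    <groupId>org.apache.felix</groupId>
    <artifactId>maven-bundle-plugin</artifactId>
    <extensions>true</extensions>
    <configuration>
        <instructions>
            <Export-Package>com.example.api</Export-Package>
            <!-- bnd inspects the bytecode and generates Import-Package for you -->
            <Import-Package>*</Import-Package>
        </instructions>
    </configuration>
</plugin>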
If you can't take advantage of existing command-line tools because you're compiling programmatically (it's not entirely clear from your question what problem you're trying to solve), your best bet is probably to take advantage of an existing provisioning technology, like OBR, Eclipse p2, or Apache ACE, to work out the bundles which should be on the classpath for compilation.
This is exactly what we do in bndtools ... If I had a bit of time I would add a compiler to bnd so it could also do this.
Sure you can; you just have to write a custom JavaFileManager which supplies the JavaCompiler with the right classes to compile against.
For example, you can write one that gets its classes from an OSGi runtime. If you don't mind having a dependency from your compiler bundle to the libraries you need, then it's pretty easy; otherwise you can use the wiring API to look into other bundles as well (OSGi 4.3+ only). If you intercept which packages it requests while compiling, you can generate Import-Package statements, so you can generate a bundle.
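As a minimal sketch of the idea (plain javax.tools; the OSGi lookup itself is left as a comment, since it depends on your framework wiring):

import java.io.IOException;
import java.util.HashSet;
import java.util.Set;
import javax.tools.*;

// Sketch: a file manager that records which packages the compiler asks for,
// so Import-Package headers could be generated afterwards.
public class RecordingFileManager
        extends ForwardingJavaFileManager<StandardJavaFileManager> {

    private final Set<String> requestedPackages = new HashSet<>();

    public RecordingFileManager(StandardJavaFileManager delegate) {
        super(delegate);
    }

    @Override
    public Iterable<JavaFileObject> list(Location location, String packageName,
            Set<JavaFileObject.Kind> kinds, boolean recurse) throws IOException {
        if (location == StandardLocation.CLASS_PATH && !packageName.isEmpty()) {
            requestedPackages.add(packageName);
            // An OSGi-aware version would resolve class files here via the
            // wiring API (e.g. BundleWiring.getClassLoader()) instead of
            // delegating to the default classpath lookup.
        }
        return super.list(location, packageName, kinds, recurse);
    }

    public Set<String> getRequestedPackages() {
        return requestedPackages;
    }
}

You'd pass an instance of this to JavaCompiler.getTask(...) in place of the standard file manager, then read off the requested packages once compilation finishes.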
I made a rough GitHub example a few months back:
https://github.com/flyaruu/test-dynamic-compiler
There were some issues (I couldn't get the Eclipse ecj compiler to work, for example; I didn't look into bundle security at all; and due to the dynamic nature of OSGi you have to listen for bundle changes to update your compilation path), but it works fine.
I've so far found that the real answer is "No, there is not!"
The predominant runtime-compilation scenario for Java today is JSP compilation. The app servers I've had the occasion to review use one of these methods:
invocation of javac (through a system call)
use of ecj/jdt
use of javax.tools in a non-OSGi-aware way
All of these approaches are based on collecting the available classpath by directly introspecting jars or classes in the file system.
None of the current approaches are aware of OSGi characteristics like the dynamic nature of the environment or the underlying restrictions imposed by the framework itself.
All -
we have several web applications, all based on some version of Spring, developed over time by different teams across the organization. They each produce their own WAR and have a different context to work within, and they often get deployed on the same machine, as their functionalities are closely knit together. So we end up with:
tomcat/webapps/{A, B, C ... }
Upon deployment, each uses a very similar toolchain and replicates all the Spring JARs and dependencies all around.
I am wondering if there is a way to make the project structure better and deploy a SINGLE WAR, while allowing each webapp to live in its own source repo and keep its own pace of development?
Any pointer or references are much appreciated.
Oliver
Deploying in a single WAR will couple all the projects together. Modifying one will mean redeploying all, with the accompanying QA effort to validate and run regression tests. I wouldn't recommend that.
Multiple copies of Spring JARs can be addressed by putting them in the Tomcat /lib; they're loaded by the Tomcat class loader instead of the WAR class loader. That will mean that every app has to be on the same version of Spring; upgrading one means upgrading all. You'll have to regression test all at once.
What harm is separate WAR files doing you? What do you care if the Tomcat /webapps directory has lots of deployments? One advantage is that they CAN be on separate release schedules. That's a big one to give away. Be sure you have a good reason before doing it.
You would probably have to move to an app server like JBoss, but couldn't you use an EAR file and have Maven build the modules for you? That way you could put them in separate repos if you want, each with its own POM, and then have another project with a POM for the EAR file:
here is the maven ear plugin:
http://maven.apache.org/plugins/maven-ear-plugin/
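A rough sketch of such an EAR project's pom.xml (all coordinates and the context root are placeholders):

<project>
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>platform-ear</artifactId>
    <version>1.0.0</version>
    <packaging>ear</packaging>

    <dependencies>
        <!-- each webapp, built in its own repo, consumed as a WAR artifact -->
        <dependency>
            <groupId>com.example</groupId>
            <artifactId>webapp-a</artifactId>
            <version>1.0.0</version>
            <type>war</type>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-ear-plugin</artifactId>
                <configuration>
                    <modules>
                        <webModule>
                            <groupId>com.example</groupId>
                            <artifactId>webapp-a</artifactId>
                            <contextRoot>/a</contextRoot>
                        </webModule>
                    </modules>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>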
here is an older blog post about a multi-WAR Spring EAR (with a single parent applicationContext for all WARs to share if you need it):
http://blog.springsource.com/2007/06/11/using-a-shared-parent-application-context-in-a-multi-war-spring-application/
Based on one of your comments to another response, it sounds like you might be more interested in Maven's multi-module project feature. This will allow you to define a parent POM with consistent dependencies and project layouts managed across multiple projects.
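A minimal sketch of such a parent POM (the module names and the Spring version are placeholders):

<project>
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>parent</artifactId>
    <version>1.0.0</version>
    <packaging>pom</packaging>

    <!-- the webapps stay separate projects but share this parent -->
    <modules>
        <module>webapp-a</module>
        <module>webapp-b</module>
    </modules>

    <!-- pin dependency versions once, consistently, for all children -->
    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>org.springframework</groupId>
                <artifactId>spring-core</artifactId>
                <version>3.0.5.RELEASE</version>
            </dependency>
        </dependencies>
    </dependencyManagement>
</project>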
You might benefit from combining each project into a single WAR, but I do think this is really one of those 'the grass is always greener' problems. One key thing I would keep in mind is figuring out how much longer (or shorter!) redeployment will take if the projects were combined.
Think about OSGi. You can deploy all the dependencies just once, build your separate but interrelated modules as OSGi bundles, and deploy and upgrade them all independently. You can also choose whether to deploy them all as WARs (web bundles) or to deploy them as JARs with one or many WARs importing them to tie everything up. Virgo Web Server, formerly Spring DM Server, is really nice and comes ready to do this kind of stuff right out of the box.
I'm writing a Java API for several clients and would internally like to use Spring and its several features, but I don't want to expose my dependencies to the client.
Is this possible?
So if my client uses a different version of Spring, would they be insulated from my internal Spring dependencies?
If so, would my Spring dependencies be bundled internally inside my JAR? And would a custom class loader be required by my client application?
I have heard you can do this with OSGi bundles, but I'm wondering whether that would satisfy my requirement.
The clients of my API aren't OSGi-enabled, and we have no current environment that utilizes OSGi bundles.
It is not really feasible, or desirable, to do so. Why would you want to "hide" the dependencies? Would you also want to hide a dependency on whatever logging package you may be using (for example)?
If you have the dependencies in your implementation, then they are best published, as doing so will cause a lot less grief for users of your API: they will know what conflicts may exist before they even attempt to use your code.
Don't forget, your users are actually developers and I am sure that they would rather be aware of any landmines or requirements up front.
Edit - Regarding OSGi:
OSGi will definitely take care of your conflicting-dependency issues, but it also relies on deploying in an OSGi environment, which you haven't mentioned is the case for your clients. In addition, it is still not recommended to "hide" those dependencies in a bundle. The very nature of OSGi allows those conflicting dependencies to coexist in the same application.