For the longest time, I thought that in Java a dependency was one of two types:
Required compile-time dependencies (dependencies always required at compile time)
Possibly optional runtime dependencies (dependencies that can be resolved at runtime)
Recently, I found out that compile dependencies can be optional too. For example, commons-beanutils is listed as an optional compile dependency of JXPath.
How can this work? Can a dependency really be used at the time of compilation yet remain fully optional?
EDIT: I might have been unclear. I'm looking for a case where a dependency is used at compile-time and is at the same time fully optional, or an explanation why such a dependency is impossible.
Code can be compiled against an interface; the implementation of that interface is not needed during compilation, only at runtime.
Examples: commons-logging, JPA, and JDBC are APIs/frameworks that an application can compile against. At runtime, an implementation is needed to execute the code. Sample implementations: Commons BeanUtils, the Oracle thin JDBC driver, EclipseLink, etc.
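A minimal sketch of the idea (all names here are illustrative, not from any real library): the application compiles against an interface only, and the concrete class is looked up by name at runtime, just as it would be if it lived in a separate runtime-only jar.

```java
// The application code compiles against this interface only; in a real setup
// the interface would live in the compile-time (API) dependency.
interface Greeter {
    String greet(String name);
}

// Stand-in for a class that would normally live in the optional/runtime jar.
class SimpleGreeter implements Greeter {
    public String greet(String name) { return "hello " + name; }
}

class App {
    public static void main(String[] args) throws Exception {
        // Look the implementation up by name and instantiate it reflectively;
        // if the runtime jar were absent, this line would throw
        // ClassNotFoundException instead of a compile error.
        Greeter g = (Greeter) Class.forName("SimpleGreeter")
                .getDeclaredConstructor().newInstance();
        System.out.println(g.greet("world")); // prints "hello world"
    }
}
```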
An extensive quote from the Maven documentation describes this quite clearly:
Optional dependencies are used when it's not possible (for whatever reason) to split a project into sub-modules. The idea is that some of the dependencies are only used for certain features in the project and will not be needed if that feature isn't used. Ideally, such a feature would be split into a sub-module that depends on the core functionality project. This new subproject would have only non-optional dependencies, since you'd need them all if you decided to use the subproject's functionality.
However, since the project cannot be split up (again, for whatever reason), these dependencies are declared optional. If a user wants to use functionality related to an optional dependency, they have to redeclare that optional dependency in their own project. This is not the clearest way to handle this situation, but both optional dependencies and dependency exclusions are stop-gap solutions.
Why use optional dependencies?
Optional dependencies save space and memory. They prevent problematic jars that violate a license agreement or cause classpath issues from being bundled into a WAR, EAR, fat jar, or the like.
How do optional dependencies work?
Project-A -> Project-B
The diagram above says that Project-A depends on Project-B. When A declares B as an optional dependency in its POM, this relationship remains unchanged. It's just like a normal build, where Project-B will be added to Project-A's classpath.
Project-X -> Project-A
When another project (Project-X) declares Project-A as a dependency in its POM, the optional nature of the dependency takes effect. Project-B is not included in the classpath of Project-X. You need to declare it directly in the POM of Project X for B to be included in X's classpath.
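A sketch of how this looks in the POMs (the coordinates are illustrative):

```xml
<!-- In Project-A's POM: B is used by A's own code, but is not passed on
     transitively to A's consumers -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>project-b</artifactId>
  <version>1.0</version>
  <optional>true</optional>
</dependency>

<!-- In Project-X's POM: X depends on A, and must redeclare B explicitly
     if it uses the features of A that need B -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>project-a</artifactId>
  <version>1.0</version>
</dependency>
<dependency>
  <groupId>com.example</groupId>
  <artifactId>project-b</artifactId>
  <version>1.0</version>
</dependency>
```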
A practical example: imagine that you are a developer of a library/framework SuperLib that is built as one superlib.jar. Your library provides multiple features. Its main feature (that most of the users use) is dependency injection based on a third-party di library. However, one of your classes - EmailApi - offers features to send e-mails, using a third-party email library. Since superlib is one artifact, it needs both di and email to be compiled.
Now put yourself in the position of a user who uses superlib. They are interested in the dependency injection features. This is the core role of your library, so the dependency between superlib and di would not be optional.
However, most users are not interested in sending emails and may be bothered by having a useless email library and its dependencies added to their application (which will cause size increase of their application and may cause a dependency version clash between the dependencies of email and dependencies of the user's application). Therefore, you would mark the dependency on email as optional. As long as the user does not use your EmailApi class, everything will run fine. However, if they do use EmailApi, they will need the email dependency, otherwise the application will fail at runtime with ClassNotFoundException for whichever class from email would be referenced in EmailApi. The user of your library will need to add the email dependency explicitly in their POM.
See also When to use <optional>true</optional> and when to use <scope>provided</scope>.
What you described is actually a feature of Maven, the build tool, not of Java itself.
Without build tools, using just javac, you need to specify all classes or interfaces that are directly used in your code. Sure, there are options for dynamic class loading and even runtime compilation, but that's off topic here.
One use case, separating interface from implementation, is described in the previous answer; another popular case is based on classpath scanning:
if some specific class is present on the classpath and/or has a specific annotation, an optional module will be loaded.
That's how Spring Boot modules are loaded.
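A minimal sketch of such a presence check (the class names are illustrative; this is the same idea Spring Boot's conditional loading builds on, not its actual implementation):

```java
class ClasspathCheck {
    // Returns true if the named class can be found on the classpath,
    // without initializing it (the "false" argument).
    static boolean isPresent(String className) {
        try {
            Class.forName(className, false, ClasspathCheck.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // A JDK class is always present; a made-up class is not.
        System.out.println(isPresent("java.util.List"));           // true
        System.out.println(isPresent("com.example.MissingThing"));  // false
    }
}
```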
Related
Let's say I have a re-use project with 2 modules: api and service. The re-use api module defines interfaces, types, annotations that can be used by other application projects. The re-use service module contains the real implementation. The re-use project looks like this:
pom.xml: id=reuse, group=com.test.project, version=1.0.0
|__api-module
|__pom.xml: parent={id=reuse, group=com.test.project, version=1.0.0}, id=reuse-api
|__service-module
|__pom.xml: parent={id=reuse, group=com.test.project, version=1.0.0}, id=reuse-srv
And one of our applications has a dependency on the re-use module.
pom.xml: id=application, group=com.test.project, version=2.0.0
dependency={scope=compile, id=reuse-api, group=com.test.project, version=1.0.0}
dependency={scope=runtime, id=reuse-srv, group=com.test.project, version=1.0.0}
The thing is, if we now change something in the re-use service module, meaning we change the implementation of the re-use module, the whole re-use module (api and service) has to be compiled and a new version released, because the versions of the api and service modules inherit from the parent module. The new POM structure will look like this:
pom.xml: id=reuse, group=com.test.project, version=1.0.1
|__api-module
|__pom.xml: parent={id=reuse, group=com.test.project, version=1.0.1}, id=reuse-api
|__service-module
|__pom.xml: parent={id=reuse, group=com.test.project, version=1.0.1}, id=reuse-srv
After that, the application has to change the dependency with the new version of re-use:
pom.xml: id=application, group=com.test.project, version=2.0.1
dependency={scope=compile, id=reuse-api, group=com.test.project, version=1.0.1}
dependency={scope=runtime, id=reuse-srv, group=com.test.project, version=1.0.1}
Is there a way that a modification of the re-use service module will not also cause a change to the application? The application does not really need to be affected by a change of the re-use implementation, does it?
Do you have any comments/suggestions? Thank you.
Let's assume that you have your two modules stored in some artifact repository (for example, a repository manager hosted in your company):
service-api-1.0.0
service-impl-1.0.0
These modules are shared across several applications:
app-1
compile: service-api-1.0.0
runtime: service-impl-1.0.0
app-2
compile: service-api-1.0.0
runtime: service-impl-1.0.0
The applications should define your repository in their build configuration (as an additional Maven repository) and reference the modules by version number.
Every time you complete a set of changes to your API/implementation, you should release the library with a higher version.
A library version usually consists of three components: [major].[minor].[bugfix].
The bugfix version is incremented when a release contains no new features, just fixes to existing functionality.
The minor version is incremented when a release adds new features that are backward compatible with the previous release.
The major version is incremented when incompatible changes are introduced.
E.g. when fixing some implementation defects in service-impl-1.0.0, you release service-impl-1.0.1. This one may (and should) still be compiled against service-api-1.0.0.
This new version will be installed in your repository, whose contents will then be:
service-api-1.0.0
service-impl-1.0.0
service-impl-1.0.1
Unless you update the configuration of your application, it will still be compiled and built against the older version, which remains available in your repository. So you can now have:
app-1
compile: service-api-1.0.0
runtime: service-impl-1.0.0
app-2
compile: service-api-1.0.0
runtime: service-impl-1.0.1
There is no need to change each application as long as they have access to historical versions of your library. You change the dependency version only when you want to get new features in (very much the same way you do about any other public shared library in maven central).
If you change the code of your library, the application has three choices:
Use the new version and compile again.
Stay with the old version and ignore the changes made.
Do not compile again but use the new library at runtime.
The third option is a bit risky because it requires all interfaces and all behaviour to be unchanged, otherwise you might get runtime exceptions. It is pretty common, though.
If you really want to decouple the implementation of the library from the application, you need a different kind of dependency, e.g. a REST service. In effect, this is similar to (3), because you change the implementation of the REST service but guarantee that the behaviour and interface stay unchanged.
I am writing an extension for a library which consists of several Maven modules. I need to add some functionality on top of one module but do not want to add unnecessary dependencies in case somebody wants to use this module without my extension (typical use case).
One solution that I can think of is to create another module with my extension and try to call methods from its classes using reflection. There would be some kind of check like this:
try {
    Class.forName("my.package.Foo", false, getClass().getClassLoader());
    // extension will be enabled and some method will be called using reflection
} catch (ClassNotFoundException e) {
    // extension will be disabled
}
Methods on that class will only be called if it is on the classpath. The extension can then be activated by adding a Maven dependency on its module (in addition to the dependency on the module it extends).
But this does not sound like the best approach. Are there any more elegant solutions to this problem?
One way is to use the built-in service provider interface (SPI).
The basic idea is to have your optional libraries provide implementations of some interface (a "service") that can easily be found by your main application. Take a look at this example:
import java.util.ServiceLoader;

// scan the classpath for all registered
// implementations of the Module interface
ServiceLoader<Module> loader = ServiceLoader.load(Module.class);
for (Module module : loader) {
    module.doSomething();
}
Once your optional dependency is on the classpath, the service loader will find it.
You can find a lot of examples of how to do this in the "Creating Extensible Applications" tutorial from Oracle.
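For ServiceLoader to find anything, the optional jar must register its implementation in a provider-configuration file. A sketch of the layout (the module and class names are illustrative):

```
optional-module.jar
 ├─ my/impl/FooModule.class                   (implements com.example.Module)
 └─ META-INF/services/com.example.Module      (text file containing one line:
                                               my.impl.FooModule)
```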
The other way is to use a dependency injection framework such as Spring or Google Guice. These frameworks also provide classpath-scanning mechanisms for automatic component discovery. This solution is more flexible, but heavier than SPI.
You can define your dependency like this:
<dependency>
<groupId>com.thoughtworks.paranamer</groupId>
<artifactId>paranamer</artifactId>
<version>2.6</version>
<optional>true</optional>
</dependency>
Check out the details at this link.
The simplest approach would be to create a new module, as you mentioned. In this new Project A you add a dependency on the existing module you are talking about, Project B.
Now anybody who wants to use the module without your extension uses Project B, and anyone who needs your extension uses Project A.
Just make sure to add the Maven dependencies to the build path to avoid ClassNotFoundException errors.
I needed EasyStream, available from its SourceForge site, and added the dependency to my application. My SLF4J API used to work just fine, but now there is a conflict. The code snippet I am using is:
private final Map<?, ?> parentContext;
MDC.setContextMap(parentContext);
For some reason I am now getting a compile-time error with the message:
The method setContextMap(Map<String,String>) in the type MDC is not applicable for the argument Map<Capture#5of-?,Capture#6of-?>
Kindly suggest how I can get rid of this error.
My guess is that EasyStream depends on a different version of SLF4J than the one you're using. Having more than one version of the same dependency on one's classpath leads to all sorts of weird issues. Depending on what dependency management system you're using, you probably need to tell it to exclude the transitive slf4j-api dependency, or perhaps override it. For example, in Maven, I'd use the <dependencyManagement> section to force all dependencies to use the same version of slf4j-api. Perhaps if you edit your question with more details about the dependency management system you're using, you can get a more specific answer, along with details on how to check for and prevent such problems in the future.
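In Maven, pinning every module to a single slf4j-api version might look like this (the version number is illustrative; pick the one your code actually targets):

```xml
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-api</artifactId>
      <version>1.7.36</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

With this in the parent POM, any transitive slf4j-api brought in by EasyStream is resolved to the managed version instead.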
We are working on Mavenizing our java project and we would like to setup a clean separation between interfaces and implementations for each module.
In order to do so, we want to split each module into two sub-modules one for interfaces and data objects used by them and another for implementations.
For example:
+commons
+commons-api
+commons-impl
The POMs of the modules will be configured such that no module depends on the impl sub-modules. This way no code from one module will be able to "see" implementation details of another module.
What we are having trouble with, is where to put our spring XMLs.
In our project we automatically import spring XML files using wildcard import like
<import resource="classpath*:**/*-beans.xml"/>
This way the location of the Spring XMLs doesn't really matter at runtime, as all the modules get loaded into the same class loader, and the strict one-way dependency rules in the POMs don't apply.
However, during development we want the IDE - we use Intellij IDEA - to recognize implementation classes referenced from the spring XMLs.
We also want IDEA to recognize beans defined in other modules.
If we put the spring XMLs in API sub-modules - they won't "see" the implementation classes in the impl sub-modules.
If we put them in the impl sub-modules, their beans won't be "seen" from other modules.
It is probably possible to configure the IDEA project to recognize spring XMLs from modules on which there is no dependency, but we prefer for our POMs to hold all the project structure information and not rely on IDEA project files.
We considered creating a third sub-module just to hold Spring XMLs (and perhaps hibernate xmls as well). For example:
+commons
+commons-api
+commons-impl
+commons-config
The external modules will depend on both commons-api and commons-config and commons-config will depend on both commons-api and commons-impl, with the dependency on commons-impl marked as "provided" (to prevent transitive resolution).
This, however, seems like a complex and awkward solution, and we feel that there must be a better, simpler way to achieve interface/impl separation with Maven and Spring.
What you need is a runtime dependency scope:
runtime - This scope indicates that the dependency is not required for compilation, but is for execution. It is in the runtime and test classpaths, but not the compile classpath.
(https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html)
Define a runtime dependency from one impl module to another impl module where you use the impl classes in the *-beans.xml config. IntelliJ will correctly recognize this in Spring configuration files, but won't auto-complete the classes in code (it will in test code, though).
Also, if anyone used the classes in code, compilation through Maven would fail, because the runtime dependency is not on the compile classpath.
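A sketch of such a runtime-scoped declaration in the depending module's POM (the coordinates are illustrative):

```xml
<dependency>
  <groupId>com.example</groupId>
  <artifactId>other-impl</artifactId>
  <version>1.0</version>
  <scope>runtime</scope>
</dependency>
```

The classes of other-impl are then available on the runtime and test classpaths, but any attempt to reference them in main code fails at compile time.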
You can achieve decoupling of api and impl like this:
+ commons (pom)
+ pom.xml <--- serves as a parent aggregator (see below)
+ commons-api (jar) <--- contains models, interfaces and abstract classes only
+ commons-impl (jar) <--- depends on commons-api
+ commons-config (jar) <--- depends on commons-impl only (no need to depend on commons-api as it is brought in transitively)
+ external-project (war or jar) <--- has commons-config as a dependency
Parent aggregator pom (specify build order):
<modules>
<module>commons-api</module>
<module>commons-impl</module>
<module>commons-config</module>
</modules>
The config module can be omitted if it only contains spring application context configuration. The app configuration xml should be in the classpath and folder structure of the module that contains the artifact that you are deploying. So if you are building a war artifact, the app context should be in there.
The only configuration that should be in your commons module would be in a test package of your impl module.
In short, you want IDEA to override the Maven dependency graph, but want to avoid keeping this configuration in IDEA project files?
One option is to group the implementation dependencies in a Maven profile. This profile would not be enabled by default, but you should be able to mark it as active in IDEA.
Two ideas come to mind:
You will have one (or more) modules where all the modules (api + impl) are dependencies; you could place your Spring configuration files there.
Place the Spring configuration files in the api modules and declare a dependency on the impl module with scope provided; this way the implementations will be known, while there is no dependency on the impl for deployment.
commons-impl at runtime scope in external modules
commons (pom dependencyManagement) =>
+commons-api (compile)
+commons-impl (compile)
+commons-config (compile)
commons-impl (pom dependencies) =>
+commons-api (compile)
+commons-config (compile)
external modules (pom dependencies) =>
+commons-impl (runtime)
+commons-api (compile)
+commons-config (compile)
keep the number of modules as small as possible;
This speeds up project build time and simplifies the layout.
keep the module structure as flat as possible: a single root plus all submodules in the same folder, e.g.:
pom.xml
commons-api/
commons-runtime/
module-a-api/
module-a-runtime/
...
This simplifies navigation across the project when the number of modules is really high (>50).
provide runtime-scoped dependencies to the runtime modules only when they are required;
This keeps your architecture clear. Use mocks instead of an explicit dependency on another runtime module.
keep your api Spring contexts in the api modules; define your public beans as an abstract bean + interface;
keep your implementation contexts in the runtime modules; override the api beans with your implementations via Spring profiles (use <beans profile="...">).
Result: simple, transparent layout and design; full ide support; no explicit dependencies on runtime module internals.
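A sketch of the abstract-bean-plus-profile pattern described above (the bean, class, and profile names are illustrative; nested <beans profile="..."> elements require Spring 3.1+ and must come last in the file):

```xml
<!-- api module context: declares the public bean as an abstract template -->
<bean id="mailService" abstract="true"/>

<!-- runtime module context: supplies the implementation for a given profile -->
<beans profile="default">
    <bean id="mailService" class="com.example.SmtpMailService"/>
</beans>
```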
I have three modules in my Maven project (this is slightly simplified):
model contains JPA annotated entity classes
persistence instantiates an EntityManager and calls methods on it
application creates instances of the classes in model, sets some values and passes them to persistence
model and persistence obviously depend on javax.persistence, but application shouldn't, I think.
The javax.persistence dependency is moved to a top-level POM's dependencyManagement section because it occurs in a number of submodules where I only reference that entry.
What's surprising to me is that I have to reference the dependency in application when I set its scope to provided, whereas I don't have to when its scope is compile.
With a scope of provided, if I don't list it in the dependencies for application, the build fails with an error message from javac:
com.sun.tools.javac.code.Symbol$CompletionFailure: class file for javax.persistence.InheritanceType not found
What's going on?
model and persistence obviously depend on javax.persistence, but application shouldn't, I think.
That's true. But transitive dependencies resolution has nothing to do with your problem (and actually, javax.persistence is provided to model and persistence on which application depends with a compile scope so it's omitted as documented in 3.4.4. Transitive Dependencies).
In my opinion, you are victim of this bug: http://bugs.sun.com/view_bug.do?bug_id=6550655
I have the same issues with an EJB3 entity that uses the Inheritance annotation:
@Inheritance(strategy=InheritanceType.SINGLE_TABLE)
A client class using this entity won't compile when the EJB3 annotations are not on the classpath, but crashes with the following message:
com.sun.tools.javac.code.Symbol$CompletionFailure: class file for javax.persistence.InheritanceType not found
[...]
Note that this is a special case of bug 6365854 (which is reported as fixed); the problem here seems to be that the annotation uses an enum as its value.
The current workaround is to add the missing enum to the CLASSPATH.
In your case, the "least bad" way to do that would be to add javax.persistence as a provided dependency to the application module. But that's a workaround for the javac bug; application shouldn't need that dependency to compile.
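The workaround in the application module's POM would look something like this (the javax.persistence:persistence-api coordinates are one common choice; your provider may ship the API under different coordinates):

```xml
<dependency>
  <groupId>javax.persistence</groupId>
  <artifactId>persistence-api</artifactId>
  <version>1.0</version>
  <scope>provided</scope>
</dependency>
```

The provided scope puts the enum on the compile classpath to satisfy javac, without bundling the jar into the deployed artifact.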
Umm, because provided dependencies are not transitive? That's built-in behavior for Maven.
The dependencyManagement section declares what dependencies will look like if you use them, not that you will use them. So you still need to declare a minimal dependency declaration to have the configuration applied in your child project. See the dependency management section of the Maven book for details.
The minimum required is typically the groupId and the artifactId.
If you want to inherit the configuration without declaring it at all, you should define it in the parent's dependencies section rather than dependencyManagement
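For example (coordinates illustrative), the parent pins the version in dependencyManagement, and a child still opts in with a minimal declaration:

```xml
<!-- parent POM -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.apache.commons</groupId>
      <artifactId>commons-lang3</artifactId>
      <version>3.12.0</version>
    </dependency>
  </dependencies>
</dependencyManagement>

<!-- child POM: groupId + artifactId are enough; the version (and any other
     configuration, such as exclusions) comes from the parent -->
<dependencies>
  <dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-lang3</artifactId>
  </dependency>
</dependencies>
```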