I have a Maven project made up of several modules. Some of the modules depend on other modules, for example Module C <- Module B <- Module A: Module C depends on Module B, which depends on Module A.
In each module I have Spring config files in main/resources and test/resources; those under test are for unit testing, while those under main are for release/production. Each config file is self-contained: Module B contains only its own Spring config (file names are like foo-B.xml, foo-A.xml).
However, when I need to test Module C, I need to reference Module B's Spring config under test/resources, but what gets picked up is Module B's main/resources config file. This is a problem because the production file references JNDI datasources, whereas the test one does not.
How can I get Maven or Spring to reference the test configuration file from the module dependency?
Maven separates the source classes & resources from the test classes & resources. You can configure Module B to produce a test JAR using the maven-jar-plugin's test-jar goal, and then have Module C reference Module B's test code as a dependency:
<dependency>
<groupId>com.myCompany</groupId>
<artifactId>moduleB</artifactId>
<version>${project.version}</version>
<classifier>tests</classifier>
<scope>test</scope>
</dependency>
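For reference, a minimal sketch of the corresponding setup in Module B's pom.xml, along the lines of the maven-jar-plugin usage docs (the plugin version is omitted here; pin one in practice):
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <executions>
        <execution>
            <goals>
                <!-- packages target/test-classes (including files from src/test/resources)
                     as moduleB-<version>-tests.jar -->
                <goal>test-jar</goal>
            </goals>
        </execution>
    </executions>
</plugin>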
Alternatively, you can create a regular Maven project containing only the test code you'd like to share, then include that as a test dependency where needed. This approach is described in the maven-jar-plugin's usage documentation.
I'm not sure this can be done. Maven deliberately does not include test resources in artifacts. If I were in your place, I would duplicate the test resources in module C. Presumably, you're not testing the same things in both modules, so hopefully it won't cause a bad case of copy&paste/dual-maintenance.
As an aside, I try to avoid having "production" data sources and "test" data sources. Use the same JNDI name for both, but have the JNDI provider configured to point to test or production based on the circumstance. For example, all of our web servers have the same data sources defined, but the JDBC urls are different for dev/qa/prod. For your unit tests, use something like simple-jndi to simulate a JNDI environment.
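As an illustration of that approach, here is a minimal sketch of a Spring XML fragment that looks the data source up under a single JNDI name; the bean id and the name jdbc/appDS are assumptions, and in unit tests something like simple-jndi would be configured to serve that same name:
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:jee="http://www.springframework.org/schema/jee"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans.xsd
           http://www.springframework.org/schema/jee
           http://www.springframework.org/schema/jee/spring-jee.xsd">

    <!-- one lookup for every environment; the JNDI provider decides which database jdbc/appDS points to -->
    <jee:jndi-lookup id="dataSource" jndi-name="jdbc/appDS"/>

</beans>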
Related
I'm working on a Spring multi-module project. One of the child modules has some files under /test/resources/certs/ and a property file under /test/resources/test-ssl.properties.
└───resources
    │   test-ssl.properties
    │
    └───certs
            test-keystore.p12
test-ssl.properties has a property that points to the /certs/test-keystore.p12 file.
server.ssl.trust-store=/certs/test-keystore.p12
In the child module's pom.xml I'm using the maven-jar-plugin's test-jar goal, and in the parent pom I've added this module as a dependency.
With this structure, the integration test in the parent module is able to read classpath:test-ssl.properties successfully, but it fails to resolve the property's value.
Spring throws FileNotFoundException: \certs\test-keystore.p12. What can we change so that Spring reads a file packaged in the test jar?
I also tried the following patterns:
server.ssl.trust-store=classpath:/certs/test-keystore.p12
server.ssl.trust-store=classpath:certs/test-keystore.p12
server.ssl.trust-store=classpath*:/certs/test-keystore.p12
Please note that this test property doesn't actually load any certificate; it is only there so that the property placeholder can resolve a value during the build.
The issue was resolved by changing the integration-test phase to process-test-resources.
Credit goes to the following answer by Pascal Thivent:
The content of the test output directory (target/test-classes) is on the class path, not src/test/resources. But resources under src/test/resources are copied to the test output directory by the resources:testResources goal (which is bound by default to the process-test-resources phase).
I have a multi-module Maven project with three modules: core, utils and test-utils.
core has the following dependency definitions:
<dependency>
<groupId>my.project</groupId>
<artifactId>utils</artifactId>
</dependency>
<dependency>
<groupId>my.project</groupId>
<artifactId>test-utils</artifactId>
<scope>test</scope>
</dependency>
I have added Java 9 module-info.java definitions for all three modules and core's looks like this:
module my.project.core {
requires my.project.utils;
}
However, I cannot figure out how to get core's test classes to see the test-utils classes during test execution. When maven-surefire-plugin attempts the test run, I get a class-not-found error.
If I add a requires my.project.testutils; to core's module-info.java:
module my.project.core {
requires my.project.utils;
requires my.project.testutils; //test dependency
}
Then at compile time I get an error that the my.project.testutils module can't be found (presumably because it's only brought in as a test dependency).
How does one work with test dependencies in a Java 9 modular world? For obvious reasons I don't want my main code to pull in test dependencies. Am I missing something?
With Maven and Java 9, if my.project.testutils is a test-scoped dependency, you don't need to explicitly require it in the module descriptor.
Test dependencies are handled via the classpath, so you can simply remove the requires for testutils; Maven patches the module for you while executing the tests.
module my.project.core {
requires my.project.utils;
}
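To illustrate what that patching amounts to (an approximation, not the exact arguments Maven generates), Surefire and the compiler plugin effectively add options like the following so that the test classes join the core module and can read test dependencies that only exist on the classpath:
--patch-module my.project.core=target/test-classes
--add-reads my.project.core=ALL-UNNAMED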
Refer to slide 30, which pertains to the maven-compiler-plugin.
I would also suggest you take a look at Where should I put unit tests when migrating a Java 8 project to Jigsaw and this comment by Robert confirming the implementation that Maven follows.
Edit: I created a sample project; by analogy, its main module corresponds to your core, the Guava dependency to your utils, and the JUnit dependency to your testutils.
I'm having a problem properly setting up spring boot for my multi-module maven project.
There is a module "api" that uses another module "core". Api has an application.properties file that contains spring.mail.host=xxx. According to the spring boot documentation this provides you with a default implementation of the JavaMailSender interface, ready to be autowired.
However, the class responsible for sending out the e-mails resides in the "core" module. When I try to build that module, the build fails because no implementation of JavaMailSender can be found.
My guess then was that the mailing config should reside in "core" in a separate application.properties. I created that and moved the spring.mail.host property from the "api" to the "core" property file.
This time the core module builds successfully, but "api" fails to build because of the same exception, so I think I just moved the problem.
I don't understand the required structure for handling this kind of situation well enough, so I am wondering: what is the correct way to have a "core" module that contains all the configuration for sending mail, and have other modules use the mailing code and config that resides in it?
I found the answer in another stack overflow question: How to add multiple application.properties files in spring-boot?
It turns out there can only be one application.properties file in the final jar that Spring Boot creates. To have multiple files, you have to rename one of them to something custom. I named the core module's properties file "core-application.properties".
Then in the API module I added this to the spring boot application class:
@SpringBootApplication
@PropertySource(value = {"core-application.properties", "application.properties"})
Doing this, I can use the base properties file and override its values in the more specific modules. You can also still create profile-specific properties files (core-application-production.properties) with this setup; there is no need to add those to the property source manually. Note that @PropertySource does not work for YAML configuration files at the moment.
There is one effective application.properties per project; you just keep two properties files for a successful build.
When the api module uses the core module, the application.properties in core is overridden by the one in api.
Your api module's pom.xml must have a dependency on the core module.
The solution is to define the properties files as the value of @PropertySource in the starter class,
but it is better to put "classpath:" in front of the properties file names.
For example, in IntelliJ IDEA, after adding the "classpath:" prefix in front of the file names, the values turn into clickable links, like this:
@SpringBootApplication
@PropertySource(value = {"classpath:core-application.properties", "classpath:application.properties"})
I hope this helps.
I have a common service which is packaged as a jar with all its dependencies.
Consumer1, Consumer2 and Consumer3 provide different configurations for common_service.
What is the best way to repackage common_service with all of its jar content and, in addition, bundle the configuration with it?
The final output for Consumer1 would be:
Consumer1-shaded.jar (without common-service)
Consumer1-Common-service.jar (only common-service with custom configuration)
Consumer2 would give:
Consumer2-shaded.jar (without common-service)
Consumer2-Common-service.jar (only common-service with custom configuration)
I tried maven-shade to repackage common_service, but I need to explicitly include all of common_service's dependencies in the consumer. Why should the consumer be aware of the common service's contents? Is there any direct way to take the jar, add the config, and repackage it?
To get what you want, your best bet is probably to have four Maven modules. The common_service module would no longer produce a shaded jar, just a regular jar. The consumer_1 module would include the configuration files, have common_service as a dependency, and produce a shaded jar. The consumer_2 and consumer_3 modules would be set up similarly to consumer_1.
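A rough sketch of what consumer_1's pom.xml could look like under that layout; the coordinates are illustrative, and the custom configuration files would simply live in consumer_1's src/main/resources so the shade plugin packs them together with common_service:
<dependencies>
    <dependency>
        <groupId>com.example</groupId>
        <artifactId>common-service</artifactId>
        <version>1.0.0</version>
    </dependency>
</dependencies>

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <executions>
                <execution>
                    <!-- build one jar containing consumer_1, its config resources and common-service -->
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>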
We are working on Mavenizing our Java project and we would like to set up a clean separation between interfaces and implementations for each module.
In order to do so, we want to split each module into two sub-modules: one for the interfaces and the data objects they use, and another for the implementations.
For example:
+commons
+commons-api
+commons-impl
The POMs of the modules will be configured such that no module depends on the impl sub-modules. This way no code from one module will be able to "see" implementation details of another module.
What we are having trouble with, is where to put our spring XMLs.
In our project we automatically import Spring XML files using a wildcard import like:
<import resource="classpath*:**/*-beans.xml"/>
This way the location of the Spring XMLs doesn't really matter at runtime, as all the modules get loaded into the same class loader, and the strict one-way dependency rules in the POMs don't apply.
However, during development we want the IDE - we use Intellij IDEA - to recognize implementation classes referenced from the spring XMLs.
We also want IDEA to recognize beans defined in other modules.
If we put the spring XMLs in API sub-modules - they won't "see" the implementation classes in the impl sub-modules.
If we put them in the impl sub-modules, their beans won't be "seen" from other modules.
It is probably possible to configure the IDEA project to recognize spring XMLs from modules on which there is no dependency, but we prefer for our POMs to hold all the project structure information and not rely on IDEA project files.
We considered creating a third sub-module just to hold Spring XMLs (and perhaps hibernate xmls as well). For example:
+commons
+commons-api
+commons-impl
+commons-config
The external modules will depend on both commons-api and commons-config and commons-config will depend on both commons-api and commons-impl, with the dependency on commons-impl marked as "provided" (to prevent transitive resolution).
This, however, seems like a complex and awkward solution, and we feel there must be a better, simpler way to achieve interface/impl separation with Maven and Spring.
What you need is a runtime dependency scope:
runtime - This scope indicates that the dependency is not required for compilation, but is for execution. It is in the runtime and test classpaths, but not the compile classpath.
(https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html)
Define a runtime dependency from one impl module to another impl module whose classes you use in the *-beans.xml config. IntelliJ will correctly recognize these classes in Spring configuration files, but won't auto-complete them in production code (it will in test code).
Also, if anyone used those classes in production code, compilation through Maven would fail, because a runtime dependency is not on the compile classpath.
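A minimal sketch of such a dependency in the consuming impl module's pom.xml (the coordinates are illustrative):
<dependency>
    <groupId>com.example</groupId>
    <artifactId>other-module-impl</artifactId>
    <version>${project.version}</version>
    <!-- visible to Spring at runtime and to the IDE's XML support, but not on the compile classpath -->
    <scope>runtime</scope>
</dependency>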
You can achieve decoupling of api and impl like this:
+ commons (pom)
+ pom.xml <--- serves as a parent aggregator (see below)
+ commons-api (jar) <--- contains models, interfaces and abstract classes only
+ commons-impl (jar) <--- depends on commons-api
+ commons-config (jar) <--- depends on commons-impl only (no need to depend on commons-api as it is brought in transitively)
+ external-project (war or jar) <--- has commons-config as a dependency
Parent aggregator pom (specify build order):
<modules>
<module>commons-api</module>
<module>commons-impl</module>
<module>commons-config</module>
</modules>
The config module can be omitted if it only contains Spring application context configuration. The application context XML should sit in the classpath and folder structure of the module that produces the artifact you are deploying; so if you are building a war artifact, the app context should be in there.
The only configuration remaining in your commons module would then be in a test package of your impl module.
In short, you want IDEA to override the Maven dependency graph but avoid keeping this configuration in IDEA project files?
One option is to group the implementation dependencies in a Maven profile. This profile would not be enabled by default, but you should be able to mark it as active in IDEA.
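A sketch of what such a profile might look like (the module names and profile id are assumptions); it stays inactive in the Maven build but can be ticked in IDEA's Maven tool window:
<profiles>
    <profile>
        <id>ide-impl-visibility</id>
        <dependencies>
            <!-- impl modules pulled in only so the IDE can resolve classes referenced from Spring XMLs -->
            <dependency>
                <groupId>com.example</groupId>
                <artifactId>commons-impl</artifactId>
                <version>${project.version}</version>
                <scope>provided</scope>
            </dependency>
        </dependencies>
    </profile>
</profiles>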
Two ideas come to mind:
You will have one (or more) modules where all the other modules (api + impl) are dependencies; you could place your Spring configuration files there.
Place the Spring configuration files in the api modules and declare a dependency on the impl module with scope provided. This way the implementations will be known during development, while the api does not carry the impl dependency into the deployment.
commons-impl at runtime scope in external modules
commons (pom dependencyManagement) =>
+commons-api (compile)
+commons-impl (compile)
+commons-config (compile)
commons-impl (pom dependencies) =>
+commons-api (compile)
+commons-config (compile)
external modules (pom dependencies) =>
+commons-impl (runtime)
+commons-api (compile)
+commons-config (compile)
keep the number of modules as small as possible;
This speeds up project build time and simplifies the layout.
keep the module structure as flat as possible: a single root plus all sub-modules in the same folder, e.g.:
pom.xml
commons-api/
commons-runtime/
module-a-api/
module-a-runtime/
...
This simplifies navigation across the project when the number of modules is really high (>50).
provide runtime-scoped dependencies on the runtime modules only when they are required;
This keeps your architecture clean. Use mocks instead of an explicit dependency on another runtime module.
keep your api Spring contexts in the api modules; define your public beans as an abstract bean + interface;
keep your implementation contexts in the runtime modules; override the api beans with your implementations via Spring profiles (use <beans profile="default">), as sketched after this answer.
Result: simple, transparent layout and design; full IDE support; no explicit dependencies on runtime module internals.
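A minimal sketch of the last two points; the bean id, the FooService interface and the DefaultFooService class are illustrative names. The api context declares the bean abstractly, and the runtime context supplies the implementation for the default profile:
<!-- commons-api: api-beans.xml -->
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans.xsd">

    <!-- abstract placeholder: callers wire fooService against the com.example.FooService interface -->
    <bean id="fooService" abstract="true"/>

</beans>

<!-- commons-runtime: runtime-beans.xml -->
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans.xsd">

    <beans profile="default">
        <!-- same id: overrides the abstract api definition with a concrete implementation -->
        <bean id="fooService" class="com.example.DefaultFooService"/>
    </beans>

</beans>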