Project organization using Maven and ParameterSupplier - java

I am wondering what would be some suggestions for project / module organization given the following situation:
I have a project DomainObjects in which I have a class MyObject
In /src/test/java of DomainObjects I have the tests for MyObject
I have a project Client that depends on DomainObjects
I would like to add a ParameterSupplier class called MyObjectTestSupplier to provide test instances of MyObject for use by tests in Client.
It seems to make the most sense to provide MyObjectTestSupplier in the DomainObjects project. Here is my dilemma...
if I put the supplier in src/test/java of DomainObjects it will not be available to Client.
I don't want to put it in src/main/java of DomainObjects, because that would mean JUnit has to be included as a compile dependency of DomainObjects and would thereby end up in my production code.
if I put the supplier in some project DomainObjectsTest I have three options
put just the supplier in the test project, but this means that tests in DomainObjects could not use this supplier.
put all the tests and suppliers for DomainObjects in DomainObjectsTest, but that means that DomainObjects will compile successfully under Maven even if its tests fail.
copy the supplier into both src/test/java of DomainObjects and src/main/java of DomainObjectsTest.
I thought about trying to make DomainObjectsTest a module of DomainObjects but that only works if the packaging for DomainObjects is pom which does not work here.
Thoughts? Suggestions?
EDIT: As an explanation, MyObject is a simple bean (just getters and setters) and I use the ParameterSupplier pattern for providing populated instances of beans. The supplier provides utility methods to easily create populated instances of the bean for use in testing. I do this so that I don't repeat this population code (or the mocking equivalent) throughout my project(s).

As the official Maven mini guide on this particular topic says, you should publish a test artifact of the DomainObjects project into your local Maven repository (or anywhere else you'd like, or are able to) and use the DomainObjects-X.Y-tests artifact as a test-scoped dependency in your Client project.
Publishing a test artifact is done by using the jar:test-jar goal of the Maven JAR plugin.
If you include this artifact as a test-scoped dependency in your Client project then any other project that depends on the Client project won't inherit your DomainObjects project's test artifact, because test-scoped dependencies are not transitive by default as stated by the official guide on Maven's dependency mechanism.
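As a sketch (the groupId and version below are placeholders for your own coordinates), the DomainObjects POM attaches the test jar, and the Client POM consumes it with the tests classifier at test scope:

<!-- DomainObjects/pom.xml: attach a -tests artifact next to the regular jar -->
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <executions>
        <execution>
            <goals>
                <goal>test-jar</goal>
            </goals>
        </execution>
    </executions>
</plugin>

<!-- Client/pom.xml: visible to Client's own tests only -->
<dependency>
    <groupId>com.example</groupId>
    <artifactId>DomainObjects</artifactId>
    <version>1.0</version>
    <classifier>tests</classifier>
    <scope>test</scope>
</dependency>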

What you're describing is an example of bad design. A unit test can't depend on external dependencies and still be a "unit" test.
What you need to do is mock all dependencies and test only the Client code. Use Mockito or another library of your choice to create mock instances of MyObject in the Client project according to what you expect this class to do. Test MyObject's behavior in its own project, DomainObjects.
In Mockito, creating a mock is just:
import static org.mockito.Mockito.*;
...
MyObject myMock = mock(MyObject.class);
when(myMock.doWhatYouNeed(params)).thenReturn(whatYouExpect);
Edit:
Other ideas:
Publish DomainObjects' tests as an artifact of type test-jar, as described here, and use it as a test-scoped dependency in Client. But this is quite ugly...
A nicer design is:
DomainObjectAPI project with MyObject,
DomainObjectTestSupplier using DomainObjectAPI providing suppliers,
DomainObject using DomainObjectAPI for compile and DomainObjectTestSupplier for testing
Client using DomainObjectAPI, DomainObject for compile and DomainObjectTestSupplier for testing.
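For illustration (groupId and version are placeholders), Client's POM under that layout would roughly contain:

<dependency>
    <groupId>com.example</groupId>
    <artifactId>DomainObjectAPI</artifactId>
    <version>1.0</version>
</dependency>
<dependency>
    <groupId>com.example</groupId>
    <artifactId>DomainObject</artifactId>
    <version>1.0</version>
</dependency>
<!-- suppliers live on the test classpath only -->
<dependency>
    <groupId>com.example</groupId>
    <artifactId>DomainObjectTestSupplier</artifactId>
    <version>1.0</version>
    <scope>test</scope>
</dependency>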
It's just overkill, though.

Related

Java modules: accessibility problems for Mockito 2.20.0

I am migrating from Java 8 to Java 10, and my tests, which used to pass, now fail because of package-protected classes. The build runs under Maven 3.5.4 + Oracle JDK 10.0.2:
maven-compiler-plugin 3.7.0 + asm 6.2
maven-surefire-plugin 2.22.0 + asm 6.2 + junit 5.2.0
asm 6.2 is required for both compiler/surefire because of a bug in the version of ASM used by those plugins.
mockito-core 2.20.0 (but was using 2.20.0 with Java 8 before).
Eclipse Photon R
The project can be found here: ide-bugs.zip (it is hosted on the Eclipse forum because I opened a topic there for another problem, this time with Eclipse reporting a local error with the module).
The test is very simple: we try to mock different classes with different access levels, all of which worked in Java 8:
package protected class
public class but not exported, not opened
public class not exported but opened to Mockito
public class not exported but opened to all
package protected class not exported but opened to Mockito
package protected class not exported but opened to all
In Java 8, case 1, 5 and 6 are the same (access to package protected). Case 2, 3 and 4 are the same (access to public).
The test fails because Mockito hits one of the following errors:
class org.mockito.codegen.NotExportedOpenToMockitoProtected$MockitoMock$117073031 cannot access its superclass nodatafound.mjpmsuc.withopens.NotExportedOpenToMockitoProtected
class org.mockito.codegen.NotExportedNotOpenedPublic$MockitoMock$365628885 (in unnamed module #0x3f07b12c) cannot access class nodatafound.mjpmsuc.internal.NotExportedNotOpenedPublic (in module nodatafound.mockito_jpms_usecase) because module nodatafound.mockito_jpms_usecase does not export nodatafound.mjpmsuc.internal to unnamed module #0x3f07b12c
Mockito does have an Automatic-Module-Name, but it is seen as part of the unnamed module, because all jars found on the class path are lumped into one big "unnamed module".
While I'm fine with migrating from package-protected classes to non-exported packages, I fail to understand how I can address the problem while keeping my interfaces/classes invisible to other modules.
[edit] Updated the plugin/dependency versions a month later; no result.
I found part of the answer to my problem here: https://blog.codefx.org/java/java-module-system-tutorial/#Open-Packages-And-Modules
Mockito is using reflection to access classes from module or class path.
Mockito is in the "unnamed module" because Maven adds it into the class path rather than the module path. This explains why the opens package to org.mockito never works: there is no org.mockito module.
Maven Surefire does not take care of contributing the required "opens" for the module so that Mockito can access it.
Mockito is (no longer?) able to mock package-protected classes: it can only mock non-private, non-final classes, and for all practical purposes a package-protected class is private here. The error is rather explicit: Mockito creates a class extending the package-protected class, which now fails (it worked before, probably because Mockito generated the mock class in the same package as the class being mocked).
Nevertheless, this leads to a problematic configuration in the pom.xml of each module:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <configuration>
        <argLine>
            --add-opens nodatafound.mockito_jpms_usecase/nodatafound.mjpmsuc=ALL-UNNAMED
        </argLine>
    </configuration>
</plugin>
We need to explicitly add an opens to the unnamed module. This should not be done in module-info.java, because that would open the package to all other modules or jars, which is against encapsulation.
This is problematic because:
You need to specify it in the pom.xml for each package.
It adds extra burden to the Surefire configuration, which I prefer to keep simple.
You don't get validation from the IDE; Eclipse validates module-info.java and marks invalid packages, but not this argLine.
m2e does not pass the necessary <argLine /> to the Eclipse JUnit plugin, which makes the tests fail in Eclipse.
The Maven approach (which is the same in Eclipse, and perhaps Gradle as far as I know) does not permit an additional module-info for the tests, e.g. letting the test dependencies be modular (this could probably be done using a dedicated test module per source module, like Eclipse does for plugin tests).

How do I configure arquillian suite extension

How do I configure the Arquillian Suite extension?
https://github.com/it-crowd/arquillian-suite-extension
I would like to use it for single-deployment tests, so I don't have to deploy for every single class that has @Test methods in my project.
By the way, I'm using TestNG with Arquillian.
I pushed the extension a bit further; it can be found on Maven Central, and there is some documentation written, plus tests showing how it should be used.
I also created a "generic" deployer builder that should work with Java EE 6.
https://github.com/ingwarsw/arquillian-suite-extension

Maven modules and Spring test resources

I have a Maven project made up of several modules. Some of the modules depend on the other modules for example Module C <- Module B <- Module A. Module C depends on Module B which depends on Module A.
In each module, I have Spring config files in main/resources and test/resources; those under test are for unit testing, while those under main are for release/production. Each config file is self-contained: Module B contains only its own Spring config (file names are like foo-B.xml, foo-A.xml).
However, when I need to test Module C, I need to reference Module B's Spring config under test/resources, but what gets included is Module B's main/resources config file. This presents a problem because the production file has references to JNDI datasources, whereas the test one does not.
How can I get Maven or Spring to reference the test configuration file from the module dependency?
Maven separates the source classes & resources from the test classes & resources. You may configure Module B to create a test jar using the maven-jar-plugin test-jar goal. Then, you may have Module C reference Module B's test code as a dependency.
<dependency>
    <groupId>com.myCompany</groupId>
    <artifactId>moduleB</artifactId>
    <version>${project.version}</version>
    <classifier>tests</classifier>
    <scope>test</scope>
</dependency>
Alternately, you can create a regular Maven project including only the test code you'd like to share, then include that as a test dependency where needed. This idea is described in the maven jar plugin's usage docs.
I'm not sure this can be done. Maven deliberately does not include test resources in artifacts. If I were in your place, I would duplicate the test resources in module C. Presumably, you're not testing the same things in both modules, so hopefully it won't cause a bad case of copy&paste/dual-maintenance.
As an aside, I try to avoid having "production" data sources and "test" data sources. Use the same JNDI name for both, but have the JNDI provider configured to point to test or production based on the circumstance. For example, all of our web servers have the same data sources defined, but the JDBC urls are different for dev/qa/prod. For your unit tests, use something like simple-jndi to simulate a JNDI environment.

maven - separate modules for interfaces and implementation with Spring

We are working on Mavenizing our java project and we would like to setup a clean separation between interfaces and implementations for each module.
In order to do so, we want to split each module into two sub-modules one for interfaces and data objects used by them and another for implementations.
For example:
+commons
+commons-api
+commons-impl
The POMs of the modules will be configured such that no module depends on the impl sub-modules. This way no code from one module will be able to "see" implementation details of another module.
What we are having trouble with, is where to put our spring XMLs.
In our project we automatically import spring XML files using wildcard import like
<import resource="classpath*:**/*-beans.xml"/>
This way the location of Spring XMLs doesn't really matter at runtime, as all the modules get loaded into the same class loader and, the strict one way dependency rules in the POMs don't apply.
However, during development we want the IDE - we use Intellij IDEA - to recognize implementation classes referenced from the spring XMLs.
We also want IDEA to recognize beans defined in other modules.
If we put the spring XMLs in API sub-modules - they won't "see" the implementation classes in the impl sub-modules.
If we put them in the impl sub-modules, their beans won't be "seen" from other modules.
It is probably possible to configure the IDEA project to recognize spring XMLs from modules on which there is no dependency, but we prefer for our POMs to hold all the project structure information and not rely on IDEA project files.
We considered creating a third sub-module just to hold Spring XMLs (and perhaps hibernate xmls as well). For example:
+commons
+commons-api
+commons-impl
+commons-config
The external modules will depend on both commons-api and commons-config and commons-config will depend on both commons-api and commons-impl, with the dependency on commons-impl marked as "provided" (to prevent transitive resolution).
This however seems like a complex and awkward solution, and we feel that there must be a better, simpler way to achieve interface/impl separation with Maven and Spring.
What you need is a runtime dependency scope:
runtime - This scope indicates that the dependency is not required for compilation, but is for execution. It is in the runtime and test classpaths, but not the compile classpath.
(https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html)
Define a runtime dependency from one impl module to another impl module wherever you use the impl classes in the *-beans.xml config. IntelliJ will correctly recognize these classes in Spring configuration files, but won't auto-complete them in code (it will in test code, though).
Also, if anyone used those classes in production code, compilation through Maven would fail, because the runtime dependency is not on the compile classpath.
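For example (a sketch only; the groupId and artifactId below are placeholders, not from the question), the POM of one impl module would declare the other impl module like this:

<dependency>
    <groupId>com.example</groupId>
    <artifactId>other-module-impl</artifactId>
    <version>${project.version}</version>
    <!-- available at run time and in tests, but not on the compile classpath -->
    <scope>runtime</scope>
</dependency>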
You can achieve decoupling of api and impl like this:
+ commons (pom)
+ pom.xml <--- serves as a parent aggregator (see below)
+ commons-api (jar) <--- contains models, interfaces and abstract classes only
+ commons-impl (jar) <--- depends on commons-api
+ commons-config (jar) <--- depends on commons-impl only (no need to depend on commons-api as it is brought in transitively)
+ external-project (war or jar) <--- has commons-config as a dependency
Parent aggregator pom (specify build order):
<modules>
    <module>commons-api</module>
    <module>commons-impl</module>
    <module>commons-config</module>
</modules>
The config module can be omitted if it only contains spring application context configuration. The app configuration xml should be in the classpath and folder structure of the module that contains the artifact that you are deploying. So if you are building a war artifact, the app context should be in there.
The only configuration that should be in your commons module would be in a test package of your impl module.
In short, you want IDEA to override the Maven dependency graph, but you want to avoid keeping this configuration in IDEA project files?
One option is to group the implementation dependencies in a Maven profile. This profile would not be enabled by default, but you should be able to mark it as active in IDEA.
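A minimal sketch of such a profile (the profile id and artifact coordinates are made up for illustration):

<profiles>
    <profile>
        <!-- not active by default; enable it manually in IDEA's Maven tool window -->
        <id>ide-impl-deps</id>
        <dependencies>
            <dependency>
                <groupId>com.example</groupId>
                <artifactId>commons-impl</artifactId>
                <version>${project.version}</version>
            </dependency>
        </dependencies>
    </profile>
</profiles>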
Two ideas come to mind:
You will have one (or more) modules that depend on all the other modules (api + impl); you could place your Spring configuration files there.
Place the Spring configuration files in the api modules and declare a dependency on the impl module with scope provided; this way the implementations are known at development time, while the api carries no impl dependency in the deployment.
commons-impl at runtime scope in external modules
commons (pom dependencyManagement) =>
+commons-api (compile)
+commons-impl (compile)
+commons-config (compile)
commons-impl (pom dependencies) =>
+commons-api (compile)
+commons-config (compile)
external modules (pom dependencies) =>
+commons-impl (runtime)
+commons-api (compile)
+commons-config (compile)
Keep the number of modules as small as possible;
This speeds up project build time and simplifies its layout.
Keep the module structure as flat as possible: a single root plus all sub-modules in the same folder, e.g.:
pom.xml
commons-api/
commons-runtime/
module-a-api/
module-a-runtime/
...
This simplifies navigation across the project when the number of modules is really high (>50).
Provide runtime-scoped dependencies to the runtime modules only when they are required;
This keeps your architecture clear. Use mocks instead of an explicit dependency on another runtime module.
Keep your api Spring contexts in the api modules; define your public beans as an abstract bean + interface;
Keep your implementation contexts in the runtime modules; override the api beans with your implementations via Spring profiles (use <beans profile="default">), as sketched below.
Result: a simple, transparent layout and design; full IDE support; no explicit dependencies on runtime module internals.
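A minimal sketch of the profile-based override, assuming made-up names (fooService, DefaultFooService, commons-runtime-beans.xml): the api context declares fooService as an abstract template bean typed only by its interface, and the runtime module's context, picked up by the *-beans.xml wildcard import, overrides it inside a profile:

<!-- commons-runtime/src/main/resources/commons-runtime-beans.xml (name assumed) -->
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
                           http://www.springframework.org/schema/beans/spring-beans.xsd">

    <!-- nested <beans> requires Spring 3.1+; only applied when the "default" profile is active -->
    <beans profile="default">
        <bean id="fooService" class="com.example.commons.internal.DefaultFooService"/>
    </beans>
</beans>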

Maven - use different java classes during 'test' and 'war' phase

I'm using the Maven WAR plugin to build a war package.
Before the package is built, the tests are executed. To pre-initialize my database with sample data I use a Spring bean. I would like to have different data in my DB for the tests than when the application starts.
I was thinking that maybe it is possible to use two different Spring initializer classes in the 'test' and 'war' phases, but I don't know how to achieve this.
Put the different classes you need into src/main/java or src/test/java respectively, or perhaps supplemental application.xml files into src/main/resources or src/test/resources. The test initialization can be done by a test class that runs first, before all other tests (take a look at TestNG, which has this kind of feature).
Your tests should not be using the production Spring context (xml) files.
Instead, if you need to access an ApplicationContext in your tests (or if you are using a base testcase class like AbstractTransactionalJUnit4SpringContextTests), set up a test-context.xml context which points to the test database configuration and the test data scripts.
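For instance, a minimal test-context.xml (the bean name, the embedded H2 database and the script location are assumptions for illustration, not taken from the question) could replace the JNDI-bound datasource and load the test data script:

<!-- src/test/resources/test-context.xml -->
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:jdbc="http://www.springframework.org/schema/jdbc"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
                           http://www.springframework.org/schema/beans/spring-beans.xsd
                           http://www.springframework.org/schema/jdbc
                           http://www.springframework.org/schema/jdbc/spring-jdbc.xsd">

    <!-- in-memory database standing in for the production JNDI datasource -->
    <jdbc:embedded-database id="dataSource" type="H2">
        <jdbc:script location="classpath:test-data.sql"/>
    </jdbc:embedded-database>
</beans>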
