Using stubs in a basic Java Maven project

I have a basic Maven project with the standard folder structure: src/main and src/test directories.
I have one package in the main source directory consisting of a few classes, say A, B, and C, all in the same package, and they all depend on each other. To do proper unit testing, and to cut the dependencies between the classes, I write stub versions of A, B, and C, declare them in the same package, and put them in the test source directory. Then I run: mvn test
Fine, the stubs are now found first on the classpath and used, but I want to modify the classpath (on the fly?) so that, when testing class A, the original A is used together with the stubs of B and C. Similarly, when testing class B, I need the original B and the stubs of A and C.
How do I accomplish this using Maven and JUnit?
This is kind of frustrating in Java, because in C++, one can use the makefile source path and user defined include paths in unit test header files to force the stubs to be found first and then explicitly add an include to the original class to be tested.

If you have interdependent classes, you should define an interface for each class. Then you can actually resolve the dependency problems...
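As a minimal sketch of that idea (all names here are hypothetical, not from the question): the consuming class depends only on an interface, so a test can hand it a stub instead of the real implementation, and no classpath tricks are needed.

```java
// Hypothetical example: Consumer depends on the Greeter interface,
// so a test can substitute a stub for the real implementation.
interface Greeter {
    String greet();
}

class RealGreeter implements Greeter {
    public String greet() { return "hello from the real class"; }
}

class Consumer {
    private final Greeter greeter;

    Consumer(Greeter greeter) { this.greeter = greeter; }

    String describe() { return "consumer saw: " + greeter.greet(); }
}

public class StubDemo {
    public static void main(String[] args) {
        // In a unit test, inject a stub instead of RealGreeter.
        Greeter stub = () -> "stubbed greeting";
        Consumer consumer = new Consumer(stub);
        System.out.println(consumer.describe()); // prints "consumer saw: stubbed greeting"
    }
}
```

When testing Consumer you pass in stubs; when testing RealGreeter you use it directly, which is exactly the per-class selection the question asks for.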

Like @khmarbaise already pointed out, you are going the wrong way. In Java it is good practice to use mocking libraries like Mockito, or PowerMock if you want to test static methods.
Those libraries help you write stubs for your existing classes without modifying the classes themselves. Check Maven Central for Mockito. You can include it with Maven via
<dependency>
    <groupId>org.mockito</groupId>
    <artifactId>mockito-core</artifactId>
    <version>1.10.19</version>
    <scope>test</scope>
</dependency>
Then, using JUnit, you end up writing mocks for your existing classes. There are many tutorials about Mockito out there.
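A minimal sketch of what that looks like, assuming the mockito-core artifact above is on the test classpath (the List example is illustrative, not from the question):

```java
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import java.util.List;

public class MockitoSketch {
    public static void main(String[] args) {
        // Create a mock of an interface without hand-writing a stub class.
        @SuppressWarnings("unchecked")
        List<String> mockedList = mock(List.class);

        // Stub a method call: get(0) now returns a canned value.
        when(mockedList.get(0)).thenReturn("stubbed");

        System.out.println(mockedList.get(0)); // prints "stubbed"

        // Verify the interaction actually happened.
        verify(mockedList).get(0);
    }
}
```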

Related

Choose which dependency to use in class

I want to use a class and in my pom there are two dependencies that support it: dependency1 and dependency2.
Using the class with dependency1 crashed my program, so I deleted it completely from the pom and left dependency2 as it was, and the code worked.
How do I tell maven to build my class with dependency2 and not dependency1, without deleting dependency1 (in case dependency1 contains something that I want to use in my code later on)?
You cannot sensibly use two libraries that contain classes with the same qualified class names.
So either restrict yourself to one of them, or use the Maven Shade plugin to relocate the packages of one of the dependencies.
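A sketch of what such a relocation configuration could look like in the pom (the package names here are placeholders, not taken from the question):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <!-- placeholder package names: move the conflicting classes
                 of one dependency into a shaded namespace -->
            <pattern>com.example.conflicting</pattern>
            <shadedPattern>shaded.com.example.conflicting</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

After relocation, the two sets of classes no longer share qualified names, so both dependencies can coexist.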
When importing the class in question, check its fully qualified name (the whole package structure) and make sure you are importing it from dependency2.
Also, if you have removed dependency1 from the pom.xml, Maven will not put dependency1 into the target folder that is generated when building the project.
It's very unlikely to have the same class with the same package name in two different dependencies, because artifactIds are unique for each dependency, even within the same groupId. So if you solved your issue by using dependency2's class, then that's the class you need. And, as you asked: if you need dependency1 for any other task, keeping dependency1 in your pom.xml won't be a problem. The only thing you need to take care of is importing exactly the class you need from dependency2. So please check the import statements in your class and see whether they import the class from dependency2.

Referencing a function from another drl file in a separate maven project

I have multiple maven projects with DROOLs drl files in them. I would like to put things like helper functions in a central location and then have the drls in other projects be able to use them, but it isn't working.
The common project is a maven dependency in the other projects. I can prove this is working because I have access to the facts that I define in the common project, but I don't have access to functions.
I initially tried creating a file called:
helperfunctions.drl and putting the functions directly in it, thinking they would be available without any imports at build time, but they are not found.
I then tried wrapping the functions in a declare HelperFunctions end, but this syntax doesn't work.
Finally, I tried changing the file to HelperFunctions.java and did public class HelperFunctions and made all of the methods static. Then in the other project drls I imported using the namespace com.myproject.common.
I am out of options; is there anything else I can try, or is this not possible?
I'm a little unclear about what you've attempted, but Java code from a dependency (your point #3) can be invoked from the rules if you have the Jar on your classpath.
Imagine you have your project set up as follows:
|- rule-utils (project name)
|-- src/main/java/com/mycompany/common/HelperFunctions.java
|-- pom.xml
And you have defined some utility function public static void doSomethingUseful() in your HelperFunctions class.
In your other project where your rules exist, you can include your project1 jar as a dependency, possibly as follows in your pom:
<dependency>
    <groupId>com.mycompany</groupId>
    <artifactId>rule-utils</artifactId>
</dependency>
And then you can import and use HelperFunctions and its doSomethingUseful method as you would any other Java code in your drl:
import com.mycompany.common.HelperFunctions;

rule "Example rule"
when
    // conditions omitted
then
    HelperFunctions.doSomethingUseful();
end
In my experience, it's pretty common to invoke third-party utility code this way, for example the Apache Commons utility classes like StringUtils and CollectionUtils (though more often on the left-hand side than in the consequences).

How to detect nested unit test classes (not under the test folder) in Java

I know that it's quite hard to test many features of the Java language. For example, it would be impossible to test the private variables of a class, or similar members.
I generally tackle this by making a nested class, where this nested class is a unit test, like so:
public class MyClass {
    private String somePrivate;
    // omitted for brevity

    @RunWith(MockitoJUnitRunner.class)
    public static class MyClassUnitTest {
        @InjectMocks
        MyClass myclassMocked;
        // and so forth...
    }
}
thus no need for reflection/powermock or others!
This structure helps me to test all unreachable members or methods of a class.
But it appears that I also need an automated build where Maven will look up these nested classes as unit tests and run them when I run mvn clean test during deployment.
I've been trying to find an answer to this, but to no avail: I couldn't find any setting of Maven or the maven-surefire-plugin that says 'hey, please look at these nested classes in the src/main folder and mark them as unit tests'. Also, I am using Spring Boot to package my project (thus most of the dependencies are from Spring).
Anyone up for a solution?
For example, it would be impossible to test a private variables of a class or similar methods
You don't need to do this: private methods are private, and they are indirectly tested by testing the public methods that use them.
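For instance (a hypothetical sketch, not code from the question), the private field below is verified entirely through the public methods that use it, so the test class needs no access to private members:

```java
// Hypothetical class: the private counter is never read directly by
// tests; its behavior is observed through the public API alone.
class Counter {
    private int count; // private state

    public void increment() { count++; }

    public int value() { return count; }
}

public class CounterDemo {
    public static void main(String[] args) {
        Counter c = new Counter();
        c.increment();
        c.increment();
        // The private field is verified indirectly via value().
        System.out.println(c.value()); // prints 2
    }
}
```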
You should not be embedding test code or libs such that they have to ship with production software, period.
Don't do what you are proposing.
Edit based on your comment:
As for how you would do it technically: by default, Maven supports only one test-source directory (more can be added with the build-helper-maven-plugin; see the link below).
You could create an integration-test setup that finds the tests in your src/main directory, but the reason this is not easy to do with Maven is that Maven promotes sane patterns, and your pattern is not one of those.
Howto add another test source folder to Maven and compile it to a separate folder?

How to create class that will only receive dependency in classpath at runtime?

I work for a company that distributes our product as a jar file, and I'm trying to write something that will be able to test past versions of these jars with various inputs. Ideally, I could then run the test framework like
java -jar testframework.jar -cp "version1.jar"
or
java -jar testframework.jar -cp "version2.jar"
and get different outputs. Since the methods that take in input are set in stone, I figured I could make the dependency on our product scope "provided" or "runtime" in maven, and then call input methods on whatever version of the jar was provided in the classpath. Something like this:
<dependency>
    <groupId>com.ourCompany</groupId>
    <artifactId>ourProduct</artifactId>
    <scope>provided</scope>
</dependency>
and then in the main TestFramework class:
public static void main(String[] args) {
    ProductClass.doSomething();
}
However, I'm getting a compilation error that the doSomething method doesn't exist. I imagine I'm misunderstanding exactly what "provided" and "runtime" mean with respect to maven dependencies, but I haven't been able to find any resources that explain my mistake. Does anyone know how I can do what I'm trying to do?
ProductClass definitely exists within ProductJar. It has no problem importing the class, just calling the method doSomething. And I'm getting that error when I use provided scope.
Since you are confirming that the JAR exists, the issue seems to be with the version of the JAR file you are pointing to. Specify the <version>X</version> (one in which the doSomething method exists) for the <dependency> as well, and that should solve the problem.
I'm misunderstanding exactly what "provided" and "runtime" mean with respect to maven dependencies
provided and runtime scopes are completely different; they serve two different purposes.
provided scope means that the dependency is required at compile and test time, but the dependency JAR will not be bundled as part of the packaging, so the JAR must be available on the container's classpath at runtime.
runtime scope means that the dependency is required only during execution of the program, not at compile time.
The dependencies always need to be available at compile time. Otherwise, how would the compiler be able to know if your code is valid or not? Check that the version you've declared in the dependency does indeed have the doSomething method you want to use. If not you will need to change the version to one that does have that method.

Android Annotation Processor accessing Annotated classes from different modules

I have an Android Studio project with 2 modules: A and B. (I do not include the Annotation Processor and the Annotations modules here.)
B depends on A.
B is an Android library module, and A is a plain Java library module. I also have an Annotation Processor on module B.
The problem I'm facing is:
I want to generate some code based on annotated files placed in both modules, A and B. The problem comes from the way the Annotation Processor works: only with source files (*.java), not with compiled *.class ones. Unfortunately, during the compilation of B, the Annotation Processor doesn't have access to the source files from A...
The only thing I was able to think of as a kind of solution, even an ugly one, was to include the folder with the annotated classes from module A as a source set of module B. This way I give module B access to those files during compilation.
sourceSets {
    main {
        java {
            srcDirs = ['src/main/java', '../module_A/src/main/java/path/to/annotated/classes/folder']
        }
    }
}
That solves the problem - now the Annotation Processor has access to all the annotated classes from both modules, but...
Unfortunately, it introduces another issue... those annotated classes from module A, are now compiled twice. And they are included in the module A's JAR file and in the module B's AAR file.
Question 1: Is there another way to access those source files of module A from the Annotation Processor running on B? (From what I was able to find, the answer is NO, but checking...)
Question 2: How can I exclude those compiled files (the repeated ones) from the AAR final package of module B?
Question 3: Maybe... that's an absolutely wrong approach? Any suggestions?
Thanks in advance!
Nope, you cannot achieve what you want using just the java.lang.model API. At least not without some additional tricks.
The issue is not binary-vs-source. Annotation processors can use Elements#getTypeElement to introspect compiled classes as well as source-defined classes:
Elements elementUtil = processingEnvironment.getElementUtils();
TypeElement integerClass = elementUtil.getTypeElement("java.lang.Integer");
TypeElement myClass = elementUtil.getTypeElement("currently.compiled.Class");
But you still need to have the class on the compilation classpath to observe it, and the class must be in the process of being compiled to be visible to getElementsAnnotatedWith.
You can work around the latter limitation by using a tool like FastClasspathScanner: it will use its own mechanisms to find annotations in compiled bytecode, and report them to you separately from the compilation process. But you cannot work around the classpath issue: if a dependency is not on the compilation classpath, it cannot be processed. So you have to compile the modules together, either by merging them into one (as you did) or by declaring one to depend on the other. In the latter case you might not be able to use getElementsAnnotatedWith, but getTypeElement and FastClasspathScanner will work.
