I am trying to use Micrometer to record execution time in my Java application. This is related to my other question about using the @Timed annotation.
I have a class CountedObject that has the following 2 methods:
@Measured
@Timed(value = "timer1")
public void measuredFunction() {
    try {
        int sleepTime = new Random().nextInt(3) + 1;
        Thread.sleep(sleepTime * 1000L);
    } catch (InterruptedException e) {}
}

@Timed(value = "timer2")
public void timedFunction() {
    try {
        int sleepTime = new Random().nextInt(3) + 1;
        Thread.sleep(sleepTime * 1000L);
    } catch (InterruptedException e) {}
}
I have defined a custom annotation @Measured:

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface Measured {
}
And a MeasuredAspect to intercept calls to methods annotated with my @Measured annotation:

@Aspect
public class MeasuredAspect {
    @Around("execution(* *(..)) && @annotation(Measured)")
    public Object around(ProceedingJoinPoint pjp) throws Throwable {
        return AppMetrics.getInstance().handle(pjp);
    }
}
In my AppMetrics class I initialize an instance of Micrometer's TimedAspect, and the handle(ProceedingJoinPoint pjp) method passes the join point on to that TimedAspect instance.
public class AppMetrics {
    private static final AppMetrics instance = new AppMetrics();
    private MeterRegistry registry;
    private TimedAspect timedAspect;

    public static AppMetrics getInstance() {
        return instance;
    }

    private AppMetrics() {
        this.registry = new SimpleMeterRegistry();
        this.timedAspect = new TimedAspect(registry);
    }

    public Object handle(ProceedingJoinPoint pjp) throws Throwable {
        return timedAspect.timedMethod(pjp);
    }
}
In my application main I create a CountedObject and invoke measuredFunction() and timedFunction(), then I check registry.getMeters(). Only timer1, used by measuredFunction() [which is annotated with both @Measured and @Timed], is found, while timer2, which should be created for timedFunction() [annotated only with @Timed], does not exist.
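A minimal sketch of that main (getRegistry() is a hypothetical accessor; the AppMetrics class above keeps its MeterRegistry private, so adjust this to however you expose the registry):

import io.micrometer.core.instrument.Meter;

public class Main {
    public static void main(String[] args) {
        CountedObject countedObject = new CountedObject();
        countedObject.measuredFunction();
        countedObject.timedFunction();

        // Print the meters that ended up in the registry; with the setup above
        // only "timer1" appears, "timer2" is missing.
        for (Meter meter : AppMetrics.getInstance().getRegistry().getMeters()) {
            System.out.println(meter.getId());
        }
    }
}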
I am using Eclipse with the AspectJ Development Tools plugin, and my project is a Gradle project with AspectJ capability; I apply the id "io.freefair.aspectj" version "5.1.1" plugin in my Gradle plugins block. This is a plain Java application, not a Spring app.
What configuration or code changes are required so that Micrometer's TimedAspect can intercept my method calls directly [i.e. timedFunction() should be timed and timer2 should appear in the registry] without the need for my custom annotation?
I created an example project for you:
https://github.com/kriegaex/SO_AJ_MicrometerTimed_67803726
Quoting the read-me (sorry, but answers only containing links are frowned upon on StackOverflow):
In https://github.com/micrometer-metrics/micrometer/issues/1149 and on StackOverflow, a frequently asked question about Micrometer's @Timed annotation is why it works with Spring AOP, but not when using Micrometer as an aspect library for native AspectJ in the context of compile-time weaving (CTW), e.g. with the AspectJ Maven Plugin. It can be made to work with load-time weaving (LTW) by providing an aop.xml pointing to TimedAspect, but in a CTW setup the aspect never kicks in.
The reason is that the aspect has been compiled with Javac, not with the AspectJ compiler (AJC), which is necessary to "finish" the Java class,
i.e. to enhance its byte code in order to be a full AspectJ aspect. The LTW agent does that on the fly during class-loading, but in a CTW context
you need to explicitly tell AJC to do post-compile weaving (a.k.a. binary weaving) on the Micrometer library, producing newly woven class files.
This is done by putting Micrometer on AJC's inpath in order to make sure that its class files are being transformed and written to the target
directory. The inpath in AspectJ Maven is configured via <weaveDependencies>. There are at least two ways to do this:
You can either create your own woven version of the library in a separate Maven module and then use that module instead of Micrometer. In that case, you need to exclude the original Micrometer library in the consuming module, to make sure that the unwoven class files are no longer on the classpath where they could accidentally be used.
The way shown here in this example project is a single-module approach, building an executable uber JAR with Maven Shade. The Micrometer class
files are not a re-usable library like in the first approach, but it is nice for demonstration purposes, because we can just run the sample
application and check its output:
$ mvn clean package
...
[INFO] --- aspectj-maven-plugin:1.12.6:compile (default) @ SO_AJ_MicrometerTimed_67803726 ---
[INFO] Showing AJC message detail for messages of types: [error, warning, fail]
[INFO] Join point 'method-execution(void de.scrum_master.app.Application.doSomething())' in Type 'de.scrum_master.app.Application' (Application.java:23) advised by around advice from 'io.micrometer.core.aop.TimedAspect' (micrometer-core-1.7.0.jar!TimedAspect.class(from TimedAspect.java))
...
[INFO] --- maven-shade-plugin:3.2.4:shade (default) @ SO_AJ_MicrometerTimed_67803726 ---
[INFO] Including org.hdrhistogram:HdrHistogram:jar:2.1.12 in the shaded jar.
[INFO] Including org.latencyutils:LatencyUtils:jar:2.0.3 in the shaded jar.
[INFO] Including org.aspectj:aspectjrt:jar:1.9.6 in the shaded jar.
[INFO] Excluding io.micrometer:micrometer-core:jar:1.7.0 from the shaded jar.
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing C:\Users\me\java-src\SO_AJ_MicrometerTimed_67803726\target\SO_AJ_MicrometerTimed_67803726-1.0-SNAPSHOT.jar with C:\Users\me\java-src\SO_AJ_MicrometerTimed_67803726\target\SO_AJ_MicrometerTimed_67803726-1.0-SNAPSHOT-shaded.jar
[INFO] Dependency-reduced POM written at: C:\Users\me\java-src\SO_AJ_MicrometerTimed_67803726\target\dependency-reduced-pom.xml
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
$ java -jar target/SO_AJ_MicrometerTimed_67803726-1.0-SNAPSHOT.jar
Juni 05, 2021 1:12:27 PM io.micrometer.core.instrument.push.PushMeterRegistry start
INFO: publishing metrics for LoggingMeterRegistry every 1m
Juni 05, 2021 1:13:00 PM io.micrometer.core.instrument.logging.LoggingMeterRegistry lambda$publish$5
INFO: method.timed{class=de.scrum_master.app.Application,exception=none,method=doSomething} throughput=0.166667/s mean=0.11842469s max=0.2146482s
Please specifically note those log lines (line breaks inserted for better readability):
Join point 'method-execution(void de.scrum_master.app.Application.doSomething())'
in Type 'de.scrum_master.app.Application' (Application.java:23)
advised by around advice from 'io.micrometer.core.aop.TimedAspect'
(micrometer-core-1.7.0.jar!TimedAspect.class(from TimedAspect.java))
The above is proof that the @Timed annotation actually causes Micrometer's TimedAspect to be woven into our application code. And here are
the measurements created by the aspect for the sample application:
method.timed
{class=de.scrum_master.app.Application,exception=none,method=doSomething}
throughput=0.166667/s mean=0.11842469s max=0.2146482s
I'm not sure what you expect from this:

public Object handle(ProceedingJoinPoint pjp) throws Throwable {
    return timedAspect.timedMethod(pjp);
}

If I understand this correctly, it does nothing.
There are guides that you can follow to set up AspectJ properly for your project. Once that is done, TimedAspect should work; you don't need MeasuredAspect or @Measured, just set up AspectJ.
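One detail to keep in mind, based on my reading of the Micrometer source (please verify against your version): when TimedAspect is woven natively by AspectJ, the aspect is instantiated by AspectJ itself via its no-argument constructor, which appears to fall back to Metrics.globalRegistry rather than to the SimpleMeterRegistry you construct yourself. A sketch of how you could still see the timers in a registry you control:

import io.micrometer.core.instrument.MeterRegistry;
import io.micrometer.core.instrument.Metrics;
import io.micrometer.core.instrument.simple.SimpleMeterRegistry;

public class MetricsBootstrap {
    // Call this once at application start-up, before any @Timed method runs.
    public static MeterRegistry install() {
        SimpleMeterRegistry registry = new SimpleMeterRegistry();
        // Assumption: the natively woven TimedAspect records into Metrics.globalRegistry,
        // which is a composite, so adding our registry makes the timers visible here too.
        Metrics.addRegistry(registry);
        return registry;
    }
}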
Inspired by @kriegaex's Maven solution https://github.com/kriegaex/SO_AJ_MicrometerTimed_67803726, this is what I came up with for Gradle.
IMPORTANT: once you have produced your jar, it replaces micrometer-core, so remember to exclude the original micrometer-core from your dependencies. If you don't, it will be the luck of the draw which TimedAspect and CountedAspect classes are chosen by your runtime.
The goal is to produce a replacement jar for io.micrometer:micrometer-core. This involves compiling the original micrometer-core jar together with its transitive dependencies using ajc. The original micrometer-core contains only two @Aspects, TimedAspect and CountedAspect, so only those classes will be changed by ajc.
build.gradle
plugins {
    id 'java-library'
    id 'io.freefair.aspectj.post-compile-weaving' version '6.6'
}

dependencies {
    implementation 'org.aspectj:aspectjrt:1.9.9.1'
    inpath 'io.micrometer:micrometer-core:1.10.2'
}

jar {
    exclude 'dummy'
}

sourcesJar {
    exclude 'dummy'
}

// Since I'm only compiling micrometer's library code, I don't need linting
compileJava.ajc.options.compilerArgs += "-Xlint:ignore"
Gradle won't run the compilation without at least one Java class to compile. To work around this, I created the following class. It does get compiled, but it is not included in the jar, because it is filtered out by the jar's exclude path.
dummy.Dummy.java
package dummy;
public class Dummy {
public static void main(String[] args) {}
}
Related
I have a multi-module project which is using
Java 1.8
JUnit 4.12
Gradle
When compiling a single module, its unit tests fail on null assertions with Gradle 5.1, but the same tests pass on Gradle 1.12 and the module compiles successfully. I am not changing anything except what is deprecated in 5.1. I can't understand why the same framework fails on a newer Gradle version.
One test fails on JUnit Assert.assertNotNull(), which checks whether a string is null.
A second test fails on JUnit Assert.assertTrue().
build.gradle is the same in both, except for configuration name changes, and I have confirmed that all dependencies are downloaded and compiling.
Can't share build script, but if you don't understand something I'll try to make a pseudo script.
I thought assertion errors were more related to language version than tools?
public class Test {
    private String property;

    @Before
    public void setUp() {
        property = Singleton.getInstance().getProperty();
    }

    // test failure 1
    @Test
    public void shouldAbleToGetProperty() {
        assertNotNull(property);
    }

    // test failure 2
    @Test
    public void shouldReturnTrueIfPropertyIsTrue() {
        Assert.assertTrue(Singleton.getInstance().isTrue());
    }
}
The Singleton class is a normal singleton which reads property files from the resources folder.
NOT ACTUAL CODE
class Singleton {
    private Map<String, Properties> properties;

    public static Singleton getInstance() {
        // return singleton as it's meant to be ...
        // read property file from project and hold it in map.
    }
}
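To make that description concrete, here is a minimal sketch of the kind of singleton I mean (the file name, keys and error handling are invented, not the actual code):

import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

class Singleton {
    private static final Singleton INSTANCE = new Singleton();
    private final Properties properties = new Properties();

    private Singleton() {
        // Load a property file from the classpath (e.g. src/main/resources or src/test/resources).
        try (InputStream in = Singleton.class.getResourceAsStream("/app.properties")) {
            if (in != null) {
                properties.load(in);
            }
        } catch (IOException e) {
            throw new IllegalStateException("Could not load app.properties", e);
        }
    }

    public static Singleton getInstance() {
        return INSTANCE;
    }

    public String getProperty() {
        return properties.getProperty("property");
    }

    public boolean isTrue() {
        return Boolean.parseBoolean(properties.getProperty("flag"));
    }
}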
Background
Performing post-compile weaving of projects using:
AspectJ 1.9.4
io.freefair.aspectj.post-compile-weaving 4.1.1
Java 11.0.3
Gradle 5.6.2 (Groovy 2.5.4, Kotlin 1.3.41)
This project does not use Maven or Spring.
Layout
The projects include:
app.aspects - Contains a single LogAspect class annotated with @Aspect.
app.aspects.weaver - No source files, only dependencies to declare the aspects and the project to weave.
app.common - Defines the @Log annotation referenced by pointcuts described in LogAspect.
app.program.main - Files to be woven with join points described in LogAspect.
Gradle
Build files that relate to aspects are defined here. The idea is that weaving is independent of the application, so neither the application's common classes nor the main program need to know about weaving. Rather, the main program need only reference @Log from the common package, and AJC will take care of the weaving.
app.aspects
apply plugin: "io.freefair.aspectj.post-compile-weaving"

dependencies {
    // For the @Log annotation
    compileOnly project(':app.common')

    // The LogAspect's joinpoint references the Main Program
    compileOnly project(':app.program.main')

    // Logging dependency is also compiled, but not shown here
}
app.aspects.weaver
apply plugin: "io.freefair.aspectj.post-compile-weaving"

dependencies {
    compileOnly "org.aspectj:aspectjrt:1.9.4"

    // This should set the -aspectpath ?
    aspect project(":app.aspects")

    // This should set the -inpath ?
    inpath(project(":app.program.main")) {
        // Only weave within the project
        transitive = false
    }
}
Classes
Log
The Log annotation is straightforward:
package com.app.common.aspects;

@Retention(RetentionPolicy.RUNTIME)
@Target({ ElementType.METHOD, ElementType.TYPE, ElementType.CONSTRUCTOR })
public @interface Log {
    boolean secure() default false;
}
Main Program
The main program resembles:
package com.app.program.main;

import com.app.common.aspects.Log;

@Log
public class Program {
    /** This is the method to weave. */
    protected void run() throws InterruptedException, TimeoutException {
    }
}
Logging Aspect
The logging aspect resembles (see the code from a related question):
@Aspect
public class LogAspect {
    // In the future this will target points annotated with @Log
    @Pointcut("execution(* com.app.program.main.Program.run(..))")
    public void loggedClass() {
    }

    @Around("loggedClass()")
    public Object log(final ProceedingJoinPoint joinPoint) throws Throwable {
        return log(joinPoint, false);
    }

    private Object log(final ProceedingJoinPoint joinPoint, boolean secure) throws Throwable {
        // See last year's code for the full listing
        log.info("\u21B7| {}{}#{}({})", indent, className, memberName, params);
    }
}
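For reference, the @Log-driven pointcut hinted at in the comment could look roughly like the sketch below. This only matches type-level @Log (as on Program), the names are assumptions, and the binding syntax should be double-checked against the AspectJ documentation:

// Sketch of the future annotation-driven pointcut; not what is currently woven.
@Pointcut("execution(* *(..)) && @within(logAnnotation)")
public void loggedClass(final Log logAnnotation) {
}

@Around("loggedClass(logAnnotation)")
public Object log(final ProceedingJoinPoint joinPoint, final Log logAnnotation) throws Throwable {
    // Pass the annotation's secure() flag through to the existing private helper.
    return log(joinPoint, logAnnotation.secure());
}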
Problem
It appears weaving is taking place, but the advice cannot be found:
.../app.aspects/build/classes/java/main!com/app/aspects/LogAspect.class [warning] advice defined in com.app.aspects.LogAspect has not been applied [Xlint:adviceDidNotMatch]
Question
What needs to change so that weaving of the LogAspect into Program's run() method works using Gradle?
Options File
The ajc.options file shows:
-inpath
.../app.aspects/build/classes/java/main
-classpath
.../.gradle/caches/modules-2/files-2.1/org.aspectj/...
-d
.../app.aspects/build/classes/java/main
-target
11
-source
11
It is disconcerting that -aspectpath isn't shown and -inpath is listing app.aspects instead of app.program.main.
Merging app.aspects and app.aspects.weaver into the same project has produced:
Join point 'method-execution(void com.app.program.main.Program.run())' in Type 'com.app.program.main.Program' (Program.java:396) advised by around advice from 'com.app.aspects.LogAspect' (LogAspect.class(from LogAspect.java))
While this solves the problem, I don't understand why LogAspect needs to be in the same project that performs the weaving. The Gradle file becomes:
apply plugin: "io.freefair.aspectj.post-compile-weaving"

dependencies {
    compileOnly "org.aspectj:aspectjrt:1.9.4"
    compileOnly project(':app.common')
    compileOnly project(':app.program.main')
    compileOnly org_apache_logging_log4j__log4j_api

    inpath(project(":app.program.main")) {
        transitive = false
    }
}

compileJava.ajc.options.compilerArgs += "-showWeaveInfo"
compileJava.ajc.options.compilerArgs += "-verbose"
I have an Enum inside a jar that I have produced myself. This jar is a dependency of a second jar, which uses the enum values.
Now, the second jar is a logging framework, whereas the first jar in this case is the model classes of the logging framework.
I am trying to implement this logging framework in a web application that I have made. Long story short, it still needs some work, but I am stuck on a single problem. An error in the framework's configuration initialization is caught as an exception, and the handler calls a method. This method has an enum value as one of its parameters. However, I get a java.lang.NoSuchFieldError for this enum.
The enum value was ERROR, so I figured it could be a coincidence, but when I changed it to BABYLOVE the error message changed as well.
I've checked for redundancies and/or possible overlaps in class/enum names, but there are none that I can find.
Sequence of events:
Web App calls for initialization of logging-framework (direct dependency)
logging-framework has issues loading its own configuration and throws an exception
The exception is handled, and a method is called to register the error
The method is called with several parameters, one of which is an enum value from logging-framework-model.jar, which is a transitive dependency of the web app
The web app throws an exception
java.lang.NoSuchFieldError: BABYLOVE
at logging.framework.Constants.<clinit>(Constants.java:52)
at logging.framework.Logger.<init>(Logger.java:60)
at logging.framework.LogContext.getLoggerFromContext(LogContext.java:95)
at logging.framework.LogContext.getCurrent(LogContext.java:48)
at action.navigation.CalendarElementEditorAction.execute(CalendarElementEditorAction.java:39)
Truncated. see log file for complete stacktrace
Constants, lines 51-52:

public static final Event ConfigValidationFailed =
    EventLogHelper.getEvent(EventLogSource.LoggingFramework, EventLogEntryType.BABYLOVE);
EventLogEntryType:
@XmlType(name = "EventLogEntryType")
@XmlEnum
public enum EventLogEntryType {

    // For test purposes, should be removed. This variable is given a name that cannot be
    // confused with standard names in error messages, like Error and Warning can.
    @XmlEnumValue("BabyLove")
    BABYLOVE("BabyLove"),
    @XmlEnumValue("Error")
    ERROR("Error"),
    @XmlEnumValue("Warning")
    WARNING("Warning"),
    @XmlEnumValue("Information")
    INFORMATION("Information"),
    @XmlEnumValue("SuccessAudit")
    SUCCESSAUDIT("SuccessAudit"),
    @XmlEnumValue("FailureAudit")
    FAILUREAUDIT("FailureAudit");

    private final String value;

    EventLogEntryType(String v) {
        value = v;
    }

    public String value() {
        return value;
    }

    public static EventLogEntryType fromValue(String v) {
        for (EventLogEntryType c : EventLogEntryType.values()) {
            if (c.value.equals(v)) {
                return c;
            }
        }
        throw new IllegalArgumentException(v);
    }
}
I don't know if it matters, but I am using maven2 to deal with my dependencies.
I was told to check if the versions of my dependencies had mismatches, and after checking the war's content, I found that to be the problem.
My webapp is one of two very similar ones that both have a dependency on a jar containing some base model and business logic classes. I had previously added the logging framework (version 1) to that project's pom.xml, so logging framework 1.0 was a transitive dependency of the web app, while logging framework 2.0 was a direct dependency of the web app. I am guessing that direct dependencies take precedence over transitive dependencies, so 2.0 was the one that was packaged into my war. However, since the logging framework is composed of a framework (direct dependency) and a set of model classes (transitive dependency), the war was packaged with logging framework model version 1.0.
After I unpacked the war and found this, it was a pretty easy process to find out where the wrong version was pulled in, and I ended up with only logging framework version 2.0 for the complete set.
I am trying to get gwt-test-utils to work. I set up the project in the following way:
src/main/java : all the java source code
src/test/java : the test source code
src/test/resources : resource files for the tests
I am building my project with Gradle and Eclipse. Gradle uses these directories correctly by default, and I added all three of them as source directories in Eclipse.
I have successfully built and run the project and was able to execute some plain old JUnit tests as well as a GWTTestCase, so I think I set up the project and its dependencies correctly.
Now I wanted to use gwt-test-utils for some more advanced integration tests. To do so I did the following:
Add the gwt-test-utils and gwt-test-utils-csv to my dependencies
gwtTestUtilsVersion = '0.45'
testCompile group:'com.googlecode.gwt-test-utils', name:'gwt-test-utils', version:gwtTestUtilsVersion
testCompile group:'com.googlecode.gwt-test-utils', name:'gwt-test-utils-csv', version:gwtTestUtilsVersion
Add a gwt-test-utils.properties file to the directory src/test/resources/META-INF with the following content:
path/to/my/module = gwt-module
Added a class that extends GwtCsvTest to a package in the src/test/java directory. It is modeled after the second example in HowToWriteCsvScenario from the gwt-test-utils project wiki, replacing occurrences of their example classes with mine. It looks like this:
@CsvDirectory(value = "gwtTests")
public class LoginLogoutTest extends GwtCsvTest
{
    @Mock
    private MainServiceAsync mainService;

    private AppController appController = new AppController();

    @CsvMethod
    public void initApp()
    {
        appController.onModuleLoad();
    }

    @Before
    public void setup()
    {
        GwtFinder.registerNodeFinder("myApp", new NodeObjectFinder()
        {
            @Override
            public Object find(Node node)
            {
                return csvRunner.getNodeValue(appController, node);
            }
        });
        GwtFinder.registerNodeFinder("loginView", new NodeObjectFinder()
        {
            @Override
            public Object find(Node node)
            {
                return csvRunner.getNodeValue(appController.getRootPresenter().getCurrentlyActiveSubPresenters().iterator().next().getView(), node);
            }
        });
        addGwtCreateHandler(createRemoteServiceCreateHandler());
    }
}
Added a CSV file for configuring the test to src/test/resources/gwtTests with the following content:
start
initApp
assertExist;/loginView/emailTextBox
I tried executing it via Eclipse's Run As > JUnit Test and indirectly via gradle build (which executes all the test cases, not just this one). Both lead to the same error:
ERROR GwtTreeLogger Unable to find type 'myPackage.client.AppController'
ERROR GwtTreeLogger Hint: Check that the type name 'myPackage.client.AppController' is really what you meant
ERROR GwtTreeLogger Hint: Check that your classpath includes all required source roots
The AppController class is the entry point configured in the module I referenced in gwt-test-utils.properties, which makes me think that the configuration works correctly and the rest of the setup (dependencies and all) works as well.
In an earlier version I used the same file as a subclass of GWTTestCase and created an AppController instance in the same way. That worked, so I'm pretty sure the classpath is set up correctly to include it as well. I also tried changing it back to the previous version just now, and it still works.
I have no clue why the class is not found. Is there anything gwt-test-utils does differently that means I need to set the classpath for it specifically? Otherwise it should just work, since both Gradle and Eclipse know about all the relevant source folders and dependencies.
I want to write unit tests (JUnit 4) for my Maven plugin. All examples I found use AbstractMojoTestCase (JUnit 3 :-(). To get rid of this I got an answer here. But the problem is how mojos get instantiated:
MyMojo myMojo = (MyMojo) lookupMojo( "touch", pom );
That means I need a pom for every test case - the pom is the test's input data. But is there a way to mock (I would use Mockito) the project model somehow?
Could lookupMojo(String groupId, String artifactId, String version, String goal, PlexusConfiguration pluginConfiguration) be a good starting point? In this case I would mock PlexusConfiguration, but which methods?
Some Maven plugin testing documentation uses classes like MavenProjectStub. But I can't get a consistent picture of how a mojo is created and which interfaces it talks to on creation.
A perfect solution would be if I could just write

@Inject
MyMojo testObject;

and mock all the stuff it needs to get it working (primarily I need @Parameters).
Based on my experience writing Maven plugins, there are two levels of testing a plugin: via unit tests (using mocks) and via integration tests (using the maven-invoker-plugin).
For the integration tests, the Maven archetype for new Maven plugins already provides a good example out of the box; just execute the following and have a look at it:
mvn archetype:generate \
-DgroupId=sample.plugin \
-DartifactId=hello-maven-plugin \
-DarchetypeGroupId=org.apache.maven.archetypes \
-DarchetypeArtifactId=maven-archetype-plugin
By default you will get integration tests in a profile to start with. An example maven project will also be available (under src\it\simple-it\pom.xml) which can execute your plugin goals. What I suggest is also to enforce the result of your integration test via additional constraints in that pom.xml. For instance: you can add the Maven Enforcer Plugin rule to check against created files, if that makes sense for your plugin.
To answer more specifically to your question on how to write unit tests for custom maven plugins, this is the approach I'm using:
JUnit + Mockito.
Test case running using @RunWith(MockitoJUnitRunner.class)
Mock Maven specific classes (MavenProject, Log, Build, DependencyNode, etc.) using @Mock annotations
Initialize and link your mock objects in a @Before method (typically a setUp() method)
Test your plugin :)
As an example, you might have the following mocked objects as class variable of your unit test:
@Mock
private MavenProject project;

@Mock
private Log log;

@Mock
Build build;
Then, in your @Before method you need to add a bit of glue code like the following:
Mockito.when(this.project.getBuild()).thenReturn(this.build);
For instance, I write some custom Enforcer Plugin rules, hence I need:

@Mock
private EnforcerRuleHelper helper;

And in the @Before method:
Mockito.when(this.helper.evaluate("${project}")).thenReturn(this.project);
Mockito.when(this.helper.getLog()).thenReturn(this.log);
Mockito.when(this.project.getBuild()).thenReturn(this.build);
Mockito.when(this.helper.getComponent(DependencyGraphBuilder.class)).thenReturn(this.graphBuilder);
Mockito.when(this.graphBuilder.buildDependencyGraph(this.project, null)).thenReturn(this.node);
As such, it will be easy to use these mock objects in your tests. For instance, a must-have first dummy test is to test against an empty build, as follows (below, testing a custom Enforcer rule):
@Test
public void testEmptyBuild() throws Exception {
    try {
        this.rule.execute(this.helper);
    } catch (EnforcerRuleException e) {
        Assert.fail("Rule should not fail");
    }
}
If you need to test against the dependencies of your build, for instance, you might end up writing utility methods like the following:
private static DependencyNode generateNode(String groupId, String artifactId, String version) {
    DependencyNode node = Mockito.mock(DependencyNode.class);
    Artifact artifact = Mockito.mock(Artifact.class);
    Mockito.when(node.getArtifact()).thenReturn(artifact);
    // mock artifact
    Mockito.when(artifact.getGroupId()).thenReturn(groupId);
    Mockito.when(artifact.getArtifactId()).thenReturn(artifactId);
    Mockito.when(artifact.getVersion()).thenReturn(version);
    return node;
}
This makes it easy to create dependencies in the dependency graph of your build, like this:
List<DependencyNode> nodes = new ArrayList<DependencyNode>();
nodes.add(generateNode("junit", "junit", "4.12"));
Mockito.when(node.getChildren()).thenReturn(nodes);
NOTE: you can improve the utility method if you need further details (like scope or classifier for a dependency).
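As a sketch of such an improvement (the extra parameters are my own addition, and the scope and classifier getters are assumed to exist on the Artifact interface you mock):

// Sketch: extends generateNode with scope and classifier; adjust to your Artifact API.
private static DependencyNode generateNode(String groupId, String artifactId, String version,
        String scope, String classifier) {
    DependencyNode node = Mockito.mock(DependencyNode.class);
    Artifact artifact = Mockito.mock(Artifact.class);
    Mockito.when(node.getArtifact()).thenReturn(artifact);
    // mock artifact coordinates plus the additional details
    Mockito.when(artifact.getGroupId()).thenReturn(groupId);
    Mockito.when(artifact.getArtifactId()).thenReturn(artifactId);
    Mockito.when(artifact.getVersion()).thenReturn(version);
    Mockito.when(artifact.getScope()).thenReturn(scope);
    Mockito.when(artifact.getClassifier()).thenReturn(classifier);
    return node;
}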
If you also need to mock the configuration of a plugin, for instance because you need to scan existing plugins and their configuration, you can do it as follows:
List<Plugin> plugins = new ArrayList<Plugin>();
Plugin p = new Plugin(); // no need to mock it
p.setArtifactId("maven-surefire-plugin");
Xpp3Dom conf = new Xpp3Dom("configuration");
Xpp3Dom skip = new Xpp3Dom("skip");
skip.setValue("true");
conf.addChild(skip);
p.setConfiguration(conf);
plugins.add(p);
Mockito.when(this.build.getPlugins()).thenReturn(plugins);
I will obviously not cover all the possible cases, but I am sure you got an understanding of the approach and its usage. Hope it helps.