Exception Handling/Mapping for a particular class - Java

I have a resource class that talks to an internal service; the resource acts as a REST API for that service. The service layer can throw unexpected exceptions, so the resource should catch those unexpected exceptions and log them. I am using the Dropwizard framework, which in turn uses Jersey. It goes like this:
@Path("/user")
@GET
public Response getUser(@QueryParam("id") String userId) {
    assertNotNull(userId);
    try {
        User user = service.getUser(userId);
        return Response.ok(user).build();
    }
    catch (MyOwnException moe) { // basically 400s
        return Response.status(400).entity(moe.getMsg()).build();
    }
    catch (Exception e) { // unexpected exceptions
        logger.debug(e.getMessage());
        return Response.status(500).entity(e.getMessage()).build();
    }
}
The problem here is that I have to repeat this exact same exception handling for each REST API endpoint. Can I do some kind of exception mapping for this particular resource, so that all the handling logic and logging lives in one place?
I know I can build a mapper for a particular exception in Jersey, but that applies to the whole module, not a single class.
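For reference, such a module-wide mapper for a single exception type looks roughly like this (a minimal sketch, assuming the MyOwnException type from the code above):
import javax.ws.rs.core.Response;
import javax.ws.rs.ext.ExceptionMapper;
import javax.ws.rs.ext.Provider;

@Provider
public class MyOwnExceptionMapper implements ExceptionMapper<MyOwnException> {
    @Override
    public Response toResponse(MyOwnException moe) {
        // Translates the domain exception into a 400 for every resource in the module.
        return Response.status(400).entity(moe.getMsg()).build();
    }
}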

AFAIK you can't register an ExceptionMapper for a single resource method. I've tried this by implementing a DynamicFeature that looked for a custom annotation and then tried to register a custom ExceptionMapper with the FeatureContext.
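For reference, the attempt looked roughly like this (a sketch; PerMethodMapperFeature and CustomExceptionMapper are hypothetical names, and HandleMyOwnException is the annotation defined further down):
import javax.ws.rs.container.DynamicFeature;
import javax.ws.rs.container.ResourceInfo;
import javax.ws.rs.core.FeatureContext;
import javax.ws.rs.ext.Provider;

@Provider
public class PerMethodMapperFeature implements DynamicFeature {
    @Override
    public void configure(ResourceInfo resourceInfo, FeatureContext context) {
        // Register the mapper only for methods carrying the custom annotation...
        if (resourceInfo.getResourceMethod().isAnnotationPresent(HandleMyOwnException.class)) {
            // ...which is exactly what Jersey rejects with the warning below.
            context.register(CustomExceptionMapper.class);
        }
    }
}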
The result was disillusioning:
WARNING: The given contract (interface javax.ws.rs.ext.ExceptionMapper) of class path.to.CustomExceptionMapper provider cannot be bound to a resource method.
So binding a mapper at the resource-method level might not work. But...
For a resource class this is in fact easy. Just register your ExceptionMapper together with your resource class in your ResourceConfig. For me it looks like this:
@ApplicationPath("/")
public class ApplicationResourceConfig extends ResourceConfig {
    public ApplicationResourceConfig() {
        // [...]
        register(YourExceptionMapper.class, YourResource.class);
        // [...]
    }
}
So if you are okay with having this at resource-class level, do it like this.
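The mapper itself isn't shown above; here is a minimal sketch of what YourExceptionMapper could look like, assuming a catch-all mapper that centralizes the 400/500 translation and logging from the question:
import javax.ws.rs.core.Response;
import javax.ws.rs.ext.ExceptionMapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class YourExceptionMapper implements ExceptionMapper<Exception> {

    private static final Logger logger = LoggerFactory.getLogger(YourExceptionMapper.class);

    @Override
    public Response toResponse(Exception e) {
        if (e instanceof MyOwnException) { // basically 400s
            return Response.status(400).entity(((MyOwnException) e).getMsg()).build();
        }
        logger.debug(e.getMessage()); // unexpected exceptions
        return Response.status(500).entity(e.getMessage()).build();
    }
}
Be aware that a mapper this broad also catches WebApplicationException, so in practice you may want to special-case or rethrow that.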
Otherwise you might need to use aspects (but I don't see any reason to do so). Example:
Aspect
@Aspect
public class ResourceAspect {

    Logger logger = [...]

    private static final String RESOURCE = "execution(public !static javax.ws.rs.core.Response path.to.resources..*(..)) && @annotation(path.to.HandleMyOwnException)";

    @Around(RESOURCE)
    public Object translateRuntimeException(ProceedingJoinPoint p) throws Throwable {
        try {
            return p.proceed();
        } catch (MyOwnException moe) { // basically 400s
            return Response.status(400).entity(moe.getMsg()).build();
        } catch (Exception e) { // unexpected exceptions
            logger.debug(e.getMessage());
            return Response.status(500).entity(e.getMessage()).build();
        }
    }
}
Please note the RESOURCE pointcut: it matches non-static public methods under path.to.resources that return Response and are annotated with the HandleMyOwnException annotation.
HandleMyOwnException
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface HandleMyOwnException {}
ResourceMethod
@GET
@Path("/user")
@HandleMyOwnException
public Response getUser(@QueryParam("id") String userId) {
    assertNotNull(userId);
    return Response.ok(service.getUser(userId)).build();
}
pom.xml
<!-- deps -->
<dependency>
    <groupId>org.aspectj</groupId>
    <artifactId>aspectjrt</artifactId>
    <version>1.8.2</version> <!-- or newer version -->
</dependency>

<!-- build plugins -->
<plugins>
    <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>aspectj-maven-plugin</artifactId>
        <version>1.7</version>
        <configuration>
            <complianceLevel>1.8</complianceLevel>
            <showWeaveInfo>true</showWeaveInfo>
        </configuration>
        <executions>
            <execution>
                <goals>
                    <goal>compile</goal>
                </goals>
            </execution>
        </executions>
    </plugin>
</plugins>

<pluginManagement>
    <plugins>
        <plugin>
            <groupId>org.eclipse.m2e</groupId>
            <artifactId>lifecycle-mapping</artifactId>
            <version>1.0.0</version>
            <configuration>
                <lifecycleMappingMetadata>
                    <pluginExecutions>
                        <pluginExecution>
                            <pluginExecutionFilter>
                                <groupId>org.codehaus.mojo</groupId>
                                <artifactId>aspectj-maven-plugin</artifactId>
                                <versionRange>[1.7,)</versionRange>
                                <goals>
                                    <goal>compile</goal>
                                </goals>
                            </pluginExecutionFilter>
                            <action>
                                <ignore></ignore>
                            </action>
                        </pluginExecution>
                    </pluginExecutions>
                </lifecycleMappingMetadata>
            </configuration>
        </plugin>
    </plugins>
</pluginManagement>
Have a nice day!
EDITED
~ Added more complete pom.xml config
~ Corrected missing path for Annotation in ResourceAspect

Why not just factor out the exception handling into a private method?
@Path("/user")
@GET
public Response getUser(@QueryParam("id") String userId) {
    assertNotNull(userId);
    return handleExceptions(() -> {
        User user = service.getUser(userId);
        return Response.ok(user).build();
    });
}

private Response handleExceptions(Callable<Response> callable) {
    try {
        // Callable.call() declares "throws Exception", so checked exceptions
        // such as MyOwnException funnel into the catch blocks below.
        return callable.call();
    }
    catch (MyOwnException moe) { // basically 400s
        return Response.status(400).entity(moe.getMsg()).build();
    }
    catch (Exception e) { // unexpected exceptions
        logger.debug(e.getMessage());
        return Response.status(500).entity(e.getMessage()).build();
    }
}

Related

AspectJ plugin builds fine but at runtime annotations don't work

I am using the AspectJ Maven plugin to build my project, with an aspect library, which is a jar in which my aspects are defined.
Here is the aspect that I am trying to use:
#Around("execution(* *(..))&&#annotation(com.cisco.commerce.pricing.lp.commons.util.annotations.TimeMe)")
public Object timeMeAroundAspect(ProceedingJoinPoint proceedingJoinPoint) throws Throwable {// NOSONAR
Timer timer = Timer.instance().start();
MethodSignature signature = (MethodSignature) proceedingJoinPoint.getSignature();
Method method = signature.getMethod();
TimeMe timeMeAnnotation = method.getAnnotation(TimeMe.class);
String name = timeMeAnnotation.name();
boolean log = timeMeAnnotation.log();
boolean addToMetrics = timeMeAnnotation.addToMetrics();
Object response = null;
try {
response = proceedingJoinPoint.proceed();
} finally {
try {
Long timeTaken = timer.timeTaken();
if (log) {
LOGGER.info("MethodName: {} Time taken: {}", name, timeTaken);
}
if (addToMetrics) {
ExecutionDetailsUtil.addMethodExecutionTime(name, timeTaken);
}
} catch (Exception e) {
LOGGER.warn("Exception while trying to log time", e);
}
}
return response;
}
This code is in a jar file, which I am using as the aspectLibrary in my pom
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>aspectj-maven-plugin</artifactId>
<version>1.7</version>
<configuration>
<showWeaveInfo>true</showWeaveInfo>
<verbose>true</verbose>
<encoding>UTF-8</encoding>
<source>${java.source-target.version}</source>
<target>${java.source-target.version}</target>
<Xlint>ignore</Xlint>
<aspectLibraries>
<aspectLibrary>
<groupId>it.cvc.ciscocommerce.lps.lp-commons</groupId>
<artifactId>lp-commons</artifactId>
</aspectLibrary>
</aspectLibraries>
<complianceLevel>${java.source-target.version}</complianceLevel>
</configuration>
<executions>
<execution>
<phase>process-sources</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>
</executions>
<dependencies>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjtools</artifactId>
<version>${aspectj.version}</version>
</dependency>
</dependencies>
</plugin>
Below is my annotation definition:
@Target(ElementType.METHOD)
@Retention(RetentionPolicy.RUNTIME)
public @interface TimeMe {
    public String name();
    public boolean log() default true;
    public boolean addToMetrics() default true;
}
Here is the snippet where I am trying to use this annotation (in a different code base, which uses the above jar as a dependency):
@TimeMe(name = "classifyLine")
private void classifyLine(PricingObject pricingObject,
        PricingLineObject pricingLineObject, LineTypes lineTypes) {
    // logic
}
My build runs fine and prints the following in the Maven console:
[INFO] Join point 'method-execution(void com.cisco.pricing.lps.main.ListPriceService.classifyLine(com.cisco.pricing.lps.bean.PricingObject, com.cisco.pricing.lps.bean.PricingLineObject, com.cisco.pricing.lps.dto.LineTypes))' in Type 'com.cisco.pricing.lps.main.ListPriceService' (ListPriceService.java:235) advised by around advice from 'com.cisco.commerce.pricing.lp.commons.util.logging.LoggingAspectDefiner' (lp-commons-2019.03.01-SNAPSHOT.jar!LoggingAspectDefiner.class(from LoggingAspectDefiner.java))
I exploded the war file and looked at the generated class files. The following AjcClosure1 class was generated for the java file where I used the annotation:
public class ListPriceService$AjcClosure1 extends AroundClosure {
    public Object run(Object[] paramArrayOfObject) {
        Object[] arrayOfObject = this.state;
        ListPriceService.classifyLine_aroundBody0((ListPriceService) arrayOfObject[0],
            (PricingObject) arrayOfObject[1],
            (PricingLineObject) arrayOfObject[2], (LineTypes) arrayOfObject[3],
            (JoinPoint) arrayOfObject[4]);
        return null;
    }
    public ListPriceService$AjcClosure1(Object[] paramArrayOfObject) {
        super(paramArrayOfObject);
    }
}
And in the java class file where I use the annotation, I see no changes to the classifyLine method.
However, when I run my application, the annotation is not working: it doesn't execute the aspect I have defined in the jar.
I have no clue why. Is my pattern not matching? It matches and works fine in a Spring application, but not in this non-Spring application.

AspectJ Compile time weaving fails for streams

I am getting the below error when the AspectJ compiler runs:
[ERROR] Type mismatch: cannot convert from List<Object> to List<Tag>
My code is,
final List<Tag> customTags =
    pathVariables.entrySet().stream().filter(entry -> {
        return tagList.contains(entry.getKey());
    }).map(tag -> {
        return new Tag() {
            @Override
            public String getValue() {
                logger.info("Key for the attached tag is: {}", tag.getKey());
                return tag.getKey();
            }
            @Override
            public String getKey() {
                logger.info("Value for the attached tag is: {}", (String) tag.getValue());
                return (String) tag.getValue();
            }
        };
    }).collect(Collectors.toList());
pom.xml
<dependency>
    <groupId>org.aspectj</groupId>
    <artifactId>aspectjrt</artifactId>
    <version>1.8.7</version>
</dependency>
<dependency>
    <groupId>org.aspectj</groupId>
    <artifactId>aspectjtools</artifactId>
    <version>1.8.7</version>
</dependency>

<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>aspectj-maven-plugin</artifactId>
    <version>1.8</version>
    <configuration>
        <source>1.8</source>
        <target>1.8</target>
        <complianceLevel>1.8</complianceLevel>
        <encoding>UTF-8</encoding>
        <verbose>true</verbose>
    </configuration>
    <executions>
        <execution>
            <goals>
                <goal>compile</goal> <!-- use this goal to weave all your main classes -->
                <goal>test-compile</goal> <!-- use this goal to weave all your test classes -->
            </goals>
        </execution>
    </executions>
</plugin>
Things that I have tried:
1. Adding properties to tell the Maven compiler plugin to comply with Java 8:
<properties>
    <maven.compiler.target>1.8</maven.compiler.target>
    <maven.compiler.source>1.8</maven.compiler.source>
</properties>
2. Changing the AspectJ version to 1.8.13.
In both cases I got the same error. If I use
final List customTags = ...
with the same code, I do not get this error. Am I missing anything?
I tried running dummy code of a similar structure:
Map<String, Object> HOSTING1 = new HashMap<>();
HOSTING1.put("1", "linode.com");
HOSTING1.put("2", "heroku.com");
HOSTING1.put("3", "digitalocean.com");
HOSTING1.put("4", "aws.amazon.com");
List<String> tagList = Arrays.asList("1", "2", "3");
final List<Tag> customTags = HOSTING1.entrySet().stream().filter(entry -> {
    return tagList.contains(entry.getKey());
}).map(tag -> {
    return new Tag() {
        @Override
        public String getValue() {
            System.out.println("Value for the attached tag is: {}" + tag.getValue());
            return (String) tag.getValue();
        }
        @Override
        public String getKey() {
            System.out.println("Key for the attached tag is: {}" + tag.getKey());
            return (String) tag.getKey();
        }
    };
}).collect(Collectors.toList());
Explicitly specifying the type argument, .<Tag>map(tag -> {..., helped! Without it, the AspectJ compiler was causing the issue. The code is:
final List<Tag> customTags =
    pathVariables.entrySet().stream().filter(entry -> {
        return tagList.contains(entry.getKey());
    }).<Tag>map(tag -> {
        return new Tag() {
            @Override
            public String getValue() {
                logger.info("Key for the attached tag is: {}", tag.getKey());
                return tag.getKey();
            }
            @Override
            public String getKey() {
                logger.info("Value for the attached tag is: {}", (String) tag.getValue());
                return (String) tag.getValue();
            }
        };
    }).collect(Collectors.toList());
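The .<Tag>map(...) form is an explicit type witness for the generic map call: it pins the stream's element type to Tag up front instead of leaving it to type inference, which is apparently where ajc diverges from javac on this code.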

Get annotations when exec-maven-plugin runs Main does not work

I would like to run a Main class with the exec-maven-plugin and generate documentation, like a Swagger file, from my dependencies.
The annotation I care about is javax.ws.rs.Path, which has @Retention(RetentionPolicy.RUNTIME).
My Java code
public class ContextClassReader extends ClassReader {
    private static final ExtensibleClassLoader CLASS_LOADER = new ExtensibleClassLoader();

    public ContextClassReader(final String className) throws IOException {
        super(CLASS_LOADER.getResourceAsStream(className.replace('.', '/') + ".class"));
        final URL resource = CLASS_LOADER.getResource(className.replace('.', '/') + ".class");
    }

    public static ClassLoader getClassLoader() {
        return CLASS_LOADER;
    }

    public static void addClassPath(final URL url) {
        CLASS_LOADER.addURL(url);
    }

    private static class ExtensibleClassLoader extends URLClassLoader {
        ExtensibleClassLoader() {
            super(new URL[]{});
        }

        @Override
        public void addURL(final URL url) {
            super.addURL(url);
        }
    }
}
Here is the loading of the class and the test for annotations:
final Class<?> clazz = ContextClassReader.getClassLoader().loadClass(className);
isAnnotationPresent(clazz);

public static boolean isAnnotationPresent(final AnnotatedElement annotatedElement) {
    // ... annotatedElement.getAnnotations().length --> 0
    // ... clazz.getMethods().length --> works!
}
My pom.xml
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>exec-maven-plugin</artifactId>
    <version>1.6.0</version>
    <executions>
        <execution>
            <goals>
                <goal>java</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <executable>java</executable>
        <workingDirectory>XXXXXXXXXXXXXXXXXXX</workingDirectory>
        <addResourcesToClasspath>true</addResourcesToClasspath>
        <additionalClasspathElements>true</additionalClasspathElements>
        <includeProjectDependencies>true</includeProjectDependencies>
        <includePluginDependencies>true</includePluginDependencies>
        <mainClass>XXX.Main</mainClass>
    </configuration>
</plugin>

Creating a lambda function to store metadata of file in S3 to the Mysql database?

I am new to AWS and I am currently trying to understand Lambda functions and how to trigger one when I upload a file to an S3 bucket.
I wrote a handler class for this:
public class Hello implements RequestHandler<Employee, String> {
    public String handleRequest(Employee input, Context context) {
        context.getLogger().log("helloWorld");
        return "Hello World ";
    }
}
This was just basic, and I could see "helloWorld" printed in the CloudWatch logs when I uploaded a file to the S3 bucket.
But now I want to log the metadata of the file (fileName, createdTime, etc.).
I went through a sample template example on the AWS Lambda page, where I can see that with Node.js the event comes in as an argument and the name and other fields can be extracted from it:
const aws = require('aws-sdk');
const s3 = new aws.S3({ apiVersion: '2006-03-01' });

exports.handler = (event, context, callback) => {
    const bucket = event.Records[0].s3.bucket.name;
    ...
}
But as a Java developer, I tried to use S3EventNotification as the argument:
public class Hello implements RequestHandler<S3EventNotification, String> {
    public String handleRequest(S3EventNotification input, Context context) {
        context.getLogger().log(input.getRecords().get(0).getEventSource());
        return "Hello World ";
    }
}
But I am getting below error:
An error occurred during JSON parsing: java.lang.RuntimeException
java.lang.RuntimeException: An error occurred during JSON parsing
Caused by: lambdainternal.util.ReflectUtil$ReflectException: java.lang.NoSuchMethodException: com.amazonaws.services.s3.event.S3EventNotification$S3ObjectEntity.<init>(java.lang.String, java.lang.Long, java.lang.String, java.lang.String)
Caused by: java.lang.NoSuchMethodException: com.amazonaws.services.s3.event.S3EventNotification$S3ObjectEntity.<init>(java.lang.String, java.lang.Long, java.lang.String, java.lang.String)
How can I achieve the same thing in Java? Thanks.
Try some variant of the following:
import java.net.URLDecoder;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;
import com.amazonaws.services.s3.event.S3EventNotification.S3EventNotificationRecord;

public class Hello implements RequestHandler<S3Event, Void> {
    @Override
    public Void handleRequest(S3Event s3event, Context context) {
        try {
            S3EventNotificationRecord record = s3event.getRecords().get(0);
            String bkt = record.getS3().getBucket().getName();
            String key = record.getS3().getObject().getKey().replace('+', ' ');
            key = URLDecoder.decode(key, "UTF-8");
        } catch (Exception e) {
            // do something
        }
        return null;
    }
}
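Since the original goal was to log the file's metadata, note that the record carries more than the bucket and key. Here is a sketch of pulling a few more fields inside the try block (getter names as found in the S3EventNotification classes of aws-java-sdk-s3; worth verifying against your SDK version):
// Sketch: log object metadata from the event record.
String fileName = record.getS3().getObject().getKey();
Long sizeBytes = record.getS3().getObject().getSizeAsLong(); // object size in bytes
String eventTime = record.getEventTime().toString();         // when the event fired (Joda DateTime)
context.getLogger().log("file=" + fileName + " size=" + sizeBytes + " time=" + eventTime);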
And here are the corresponding dependencies that I used in pom.xml:
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-s3</artifactId>
    <version>1.11.228</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-lambda-java-core</artifactId>
    <version>1.1.0</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-lambda-java-events</artifactId>
    <version>2.0.1</version>
</dependency>
And here is the build specification from my pom.xml (which will cause dependent classes to be pulled into my built JAR):
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>2.3</version>
            <configuration>
                <createDependencyReducedPom>false</createDependencyReducedPom>
            </configuration>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
None of this is very simple, unfortunately, but that's Java and Maven for you. AWS Lambda programming in Node.js or Python is much simpler (and more fun) than in Java, so if there's no strong requirement to use Java, you're better off without it.
Also note that if the Lambda is going to be invoked asynchronously, the output type should be Void rather than String (see the docs).

Run migrations programmatically in Dropwizard

I have a Dropwizard application (0.7.0) for which I want to run integration tests.
I've set up an integration test using DropwizardAppRule, like this:
@ClassRule
public static final DropwizardAppRule<MyAppConfiguration> RULE =
    new DropwizardAppRule<MyAppConfiguration>(
        MyApplication.class, Resources.getResource("testconfiguration.yml").getPath());
When I try to run the test below, it fails because I haven't run my migrations. What is the best way to run the migrations?
Test:
@Test
public void fooTest() {
    Client client = new Client();
    String root = String.format("http://localhost:%d/", RULE.getLocalPort());
    URI uri = UriBuilder.fromUri(root).path("/users").build();
    client.resource(uri).accept(MediaType.APPLICATION_JSON).type(MediaType.APPLICATION_JSON)
          .post(User.class, new LoginUserDTO("email@email.com", "password"));
}
Configuration:
public class MyAppConfiguration extends Configuration {
    @Valid
    @NotNull
    private DataSourceFactory database = new DataSourceFactory();

    @JsonProperty("database")
    public DataSourceFactory getDataSourceFactory() {
        return database;
    }

    @JsonProperty("database")
    public void setDataSourceFactory(DataSourceFactory dataSourceFactory) {
        this.database = dataSourceFactory;
    }
}
Thanks to Kimble and andersem for putting me on the right track. Here's what I came up with in my @BeforeClass method:
// Create the test database with the Liquibase migrations.
@BeforeClass
public static void up() throws Exception {
    ManagedDataSource ds = RULE.getConfiguration().getMainDataSource().build(
        RULE.getEnvironment().metrics(), "migrations");
    try (Connection connection = ds.getConnection()) {
        Liquibase migrator = new Liquibase("migrations.xml", new ClassLoaderResourceAccessor(), new JdbcConnection(connection));
        migrator.update("");
    }
}
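(getMainDataSource() above is assumed to be a method on MyAppConfiguration analogous to the getDataSourceFactory() shown in the question; adjust it to whatever accessor your configuration class exposes.)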
I ran into some concurrency issues when trying to do the database migration as part of the test case and ended up baking it into the application itself (protected by a configuration option).
private void migrate(MyAppConfiguration configuration, Environment environment) {
    if (configuration.isMigrateSchemaOnStartup()) {
        log.info("Running schema migration");
        ManagedDataSource dataSource = createMigrationDataSource(configuration, environment);
        try (Connection connection = dataSource.getConnection()) {
            JdbcConnection conn = new JdbcConnection(connection);
            Database database = DatabaseFactory.getInstance().findCorrectDatabaseImplementation(conn);
            Liquibase liquibase = new Liquibase("migrations.xml", new ClassLoaderResourceAccessor(), database);
            liquibase.update("");
            log.info("Migration completed!");
        }
        catch (Exception ex) {
            throw new IllegalStateException("Unable to migrate database", ex);
        }
        finally {
            try {
                dataSource.stop();
            }
            catch (Exception ex) {
                log.error("Unable to stop data source used to execute schema migration", ex);
            }
        }
    }
    else {
        log.info("Skipping schema migration");
    }
}

private ManagedDataSource createMigrationDataSource(MyAppConfiguration configuration, Environment environment) {
    DataSourceFactory dataSourceFactory = configuration.getDataSourceFactory();
    try {
        return dataSourceFactory.build(environment.metrics(), "migration-ds");
    }
    catch (ClassNotFoundException ex) {
        throw new IllegalStateException("Unable to initialize data source for schema migration", ex);
    }
}
Another approach that doesn't rely on importing Liquibase's classes directly is to run the db migrate command in the same way you might from the command line, using the RULE:
@Before
public void migrateDatabase() throws Exception {
    RULE.getApplication().run("db", "migrate", ResourceHelpers.resourceFilePath("testconfiguration.yml"));
}
This approach also works for any other commands from any other bundles that you might want to run before starting the tests.
A small wrinkle: doing this with any command that extends Dropwizard's ConfiguredCommand (which all of the dropwizard-migrations commands do) will unnecessarily disable logback when the command finishes.
To restore it, you can call:
RULE.getConfiguration().getLoggingFactory().configure(RULE.getEnvironment().metrics(),
RULE.getApplication().getName());
I did it this way using Liquibase's API:
private static void migrate() throws Exception {
    DataSourceFactory dataSourceFactory = RULE.getConfiguration().dataSourceFactory;
    Properties info = new Properties();
    info.setProperty("user", dataSourceFactory.getUser());
    info.setProperty("password", dataSourceFactory.getPassword());
    org.h2.jdbc.JdbcConnection h2Conn = new org.h2.jdbc.JdbcConnection(dataSourceFactory.getUrl(), info);
    JdbcConnection conn = new JdbcConnection(h2Conn);
    Database database = DatabaseFactory.getInstance().findCorrectDatabaseImplementation(conn);
    Liquibase liquibase = new Liquibase("migrations.xml", new ClassLoaderResourceAccessor(), database);
    String ctx = null;
    liquibase.update(ctx);
}
And then I call this from a @BeforeClass method:
@BeforeClass
public static void setup() throws Exception {
    migrate();
}
It's probably not the ultimate solution, and it depends a lot on the database you're using, but it works.
What I do to achieve the same goal is to run the migration from within Maven.
Add this to the plugins section in the build section of your pom.xml:
<plugin>
    <groupId>org.liquibase</groupId>
    <artifactId>liquibase-maven-plugin</artifactId>
    <version>3.0.5</version>
    <executions>
        <execution>
            <phase>process-test-resources</phase>
            <configuration>
                <changeLogFile>PATH TO YOUR MIGRATIONS FILE</changeLogFile>
                <driver>org.h2.Driver</driver>
                <url>JDBC URL LIKE IN YOUR APP.YML</url>
                <username>USERNAME</username>
                <password>PASSWORD</password>
                <dropFirst>false</dropFirst>
                <promptOnNonLocalDatabase>false</promptOnNonLocalDatabase>
                <logging>info</logging>
            </configuration>
            <goals>
                <goal>dropAll</goal>
                <goal>update</goal>
            </goals>
        </execution>
    </executions>
</plugin>
This will work with Maven from the command line. With this setting, Maven uses Liquibase's dropAll to drop all database objects and then runs a migration, so every test starts with a clean new database.
When using this, I ran into issues with Eclipse: it complained about the lifecycle mapping not working on the execution tag of the plugin. In that case, you need to add the following to the build section as well, so Eclipse can map the lifecycles properly:
<pluginManagement>
    <plugins>
        <plugin>
            <groupId>org.eclipse.m2e</groupId>
            <artifactId>lifecycle-mapping</artifactId>
            <version>1.0.0</version>
            <configuration>
                <lifecycleMappingMetadata>
                    <pluginExecutions>
                        <pluginExecution>
                            <pluginExecutionFilter>
                                <groupId>org.liquibase</groupId>
                                <artifactId>liquibase-maven-plugin</artifactId>
                                <versionRange>[1.0,)</versionRange>
                                <goals>
                                    <goal>dropAll</goal>
                                    <goal>update</goal>
                                </goals>
                            </pluginExecutionFilter>
                            <action>
                                <execute />
                            </action>
                        </pluginExecution>
                    </pluginExecutions>
                </lifecycleMappingMetadata>
            </configuration>
        </plugin>
    </plugins>
</pluginManagement>
