I'm currently facing a problem.
My project looks like this:
Project
|_ module 1
   |_ liquibase
      |_ migration.xml
      |_ file1.xml
   |_ src
      |_ main
         |_ java
         |_ resources
To run component tests, I start a PostgreSQL container with Docker, and I then want to run my Liquibase scripts against it.
Here's my code:
SpringLiquibase liquibase = new SpringLiquibase();
liquibase.setResourceLoader(new FileSystemResourceLoader());
liquibase.setDataSource(dataSource);
liquibase.setChangeLog("liquibase/migration.xml");
liquibase.setDefaultSchema("mySchema");
liquibase.setDropFirst(false);
liquibase.setShouldRun(true);
try {
liquibase.afterPropertiesSet();
log.info("Liquibase run ended");
} catch (Exception e) {
log.error(e.getMessage());
throw new RuntimeException(e.getMessage());
}
This ran fine for a long time, until I upgraded to Liquibase 4.
Now I'm getting the following error: Specifying files by absolute path was removed in Liquibase 4.0. Please use a relative path or add '/' to the classpath parameter.
I searched the web and didn't find anything helpful.
I tried a lot of different things, and nothing worked.
Does anyone have a clue? (Other than moving my liquibase folder inside resources.)
I worked it out by implementing a custom SpringLiquibase and SpringResourceAccessor, and by moving from Liquibase 4.0 to 4.6.1.
If anyone is interested, here's my code:
public class CustomSpringResourceAccessor extends SpringResourceAccessor {

    public CustomSpringResourceAccessor(ResourceLoader resourceLoader) {
        super(resourceLoader);
    }

    @Override
    protected String finalizeSearchPath(String searchPath) {
        // Drop the first 11 characters (the "classpath*:" prefix the parent adds)
        // so the search path is resolved against the file system instead of the classpath.
        return super.finalizeSearchPath(searchPath).substring(11);
    }

    @Override
    public InputStreamList openStreams(String relativeTo, String streamPath) throws IOException {
        // Resolve change log paths relative to the module directory (everything before "/target").
        String path = this.getClass().getProtectionDomain().getCodeSource().getLocation().getPath();
        path = path.substring(0, path.indexOf("/target"));
        if (relativeTo == null) {
            return super.openStreams(path, streamPath);
        }
        return super.openStreams(path + "/" + relativeTo, streamPath);
    }
}
and
public class CustomSpringLiquibase extends SpringLiquibase {

    @Override
    protected SpringResourceAccessor createResourceOpener() {
        return new CustomSpringResourceAccessor(getResourceLoader());
    }
}
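For reference, this is how the custom class slots into the original setup (just a sketch, reusing the same data source, schema and change log path as in the question):
SpringLiquibase liquibase = new CustomSpringLiquibase();
liquibase.setResourceLoader(new FileSystemResourceLoader());
liquibase.setDataSource(dataSource);
liquibase.setChangeLog("liquibase/migration.xml");
liquibase.setDefaultSchema("mySchema");
liquibase.setShouldRun(true);
liquibase.afterPropertiesSet(); // wrap in try/catch as in the original snippet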
In a multi-module project I want to be sure that Spring's @Sql annotation uses the correct resources. Is there a way to log the full paths of those files to the console somehow?
Spring does log the script file name before execution, but in tests for different modules those file names are sometimes the same.
SqlScriptsTestExecutionListener is responsible for processing @Sql. As a first step you can enable debug logging for it by adding the property logging.level.org.springframework.test.context.jdbc=debug, but the debug message does not show the full path. If that is not enough, you should create your own TestExecutionListener and declare it on the test class with @TestExecutionListeners(listeners = SqlScriptsCustomTestExecutionListener.class).
for example:
public class SqlScriptsCustomTestExecutionListener extends AbstractTestExecutionListener {

    @Override
    public void beforeTestMethod(TestContext testContext) {
        List<Resource> scriptResources = new ArrayList<>();
        Set<Sql> sqlAnnotations = AnnotatedElementUtils.getMergedRepeatableAnnotations(testContext.getTestMethod(), Sql.class);
        for (Sql sqlAnnotation : sqlAnnotations) {
            String[] scripts = sqlAnnotation.scripts();
            scripts = TestContextResourceUtils.convertToClasspathResourcePaths(testContext.getTestClass(), scripts);
            scriptResources.addAll(TestContextResourceUtils.convertToResourceList(testContext.getApplicationContext(), scripts));
        }
        if (!scriptResources.isEmpty()) {
            String debugString = scriptResources.stream().map(r -> {
                try {
                    return r.getFile().getAbsolutePath();
                } catch (IOException e) {
                    System.out.println("Unable to find file resource");
                }
                return null;
            }).collect(Collectors.joining(","));
            System.out.println(String.format("Execute sql scripts: [%s]", debugString));
        }
    }
}
This is just a quick example, and it works. Most of the source code I copied from SqlScriptsTestExecutionListener, just for explanation. It only handles @Sql annotations at the method level, not the class level.
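For example, a usage sketch (hypothetical test class and script path, assuming a Spring Boot test setup) showing how the listener is registered for a method-level @Sql:
@SpringBootTest
@TestExecutionListeners(
        listeners = SqlScriptsCustomTestExecutionListener.class,
        mergeMode = TestExecutionListeners.MergeMode.MERGE_WITH_DEFAULTS)
class MyRepositoryTest {

    @Test
    @Sql(scripts = "/data/init.sql") // placeholder path
    void insertsSeedData() {
        // assertions omitted
    }
}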
I hope it helps.
I'm running into a problem with an AntTask run within the maven-antrun-plugin. Unfortunately, the AntTask uses the plugin classloader to locate a file from the project, but when run from within a plugin, the build output is not included in the plugin's classpath.
From the Guide to Maven Classloading:
Please note that the plugin classloader does neither contain the
dependencies of the current project nor its build output.
...
Plugins are free to create further classloaders on their discretion.
For example, a plugin might want to create a classloader that combines
the plugin class path and the project class path.
Can anyone point me in the right direction on how to create my own version of the maven-antrun-plugin in which I can create a classloader that combines the plugin class path and the project class path? I need to update the classloader so that when a class executed by my custom antrun plugin calls:
getClass().getClassLoader().getResource()
the classloader will search the build output folder as well.
After several hours trying to work my way around this issue with configuration, I bit the bullet and simply wrote my own plugin that extends the AntRun plugin. This was done using Maven 3.2.5:
@Mojo( name = "run", threadSafe = true, requiresDependencyResolution = ResolutionScope.TEST )
public class CustomAntMojo extends AntRunMojo {

    @Component
    private PluginDescriptor pluginDescriptor;

    public void execute() throws MojoExecutionException {
        File buildDirectory = new File( getMavenProject().getBuild().getOutputDirectory() );

        // add the build directory to the classpath for the classloader
        try {
            ClassRealm realm = pluginDescriptor.getClassRealm();
            realm.addURL( buildDirectory.toURI().toURL() );
        } catch (MalformedURLException e1) {
            e1.printStackTrace();
        }

        // configure the log4j logger to output the ant logs to the maven log
        BasicConfigurator.configure( new MavenLoggerLog4jBridge( getLog() ) );

        super.execute();
    }
}
The MavenLoggerLog4jBridge class converts my Ant task's Log4j output to the Maven logger (https://stackoverflow.com/a/6948208/827480):
import org.apache.log4j.AppenderSkeleton;
import org.apache.log4j.Level;
import org.apache.log4j.spi.LoggingEvent;
import org.apache.maven.plugin.logging.Log;
public class MavenLoggerLog4jBridge extends AppenderSkeleton {
private Log logger;
public MavenLoggerLog4jBridge(Log logger) {
this.logger = logger;
}
protected void append(LoggingEvent event) {
int level = event.getLevel().toInt();
String msg = event.getMessage().toString();
if (level <= Level.DEBUG_INT ) {
this.logger.debug(msg);
} else if (level == Level.INFO_INT) {
this.logger.info(msg);
} else if (level == Level.WARN_INT) {
this.logger.warn(msg);
} else if (level == Level.ERROR_INT || level == Level.FATAL_INT) {
this.logger.error(msg);
}
}
public void close() {
}
public boolean requiresLayout() {
return false;
}
}
Hopefully it might be of some use or assistance to someone in the future.
I would like to distribute a jar of a library I created with all my dependencies bundled inside. However, I would like to avoid version conflicts between those dependencies and the adopting project.
I think Maven Shade can do this, but I could not find a way to do it with Scala / SBT. I found OneJar, but from my experiments it seems to work only for executables.
How could I achieve this?
Thanks!
You can do this with your own classloader.
The classloader:
Write a classloader which loads class files from a different classloader using a path rewrite.
For example, you could add a library prefix to the path when fetching the resource.
I have created a classloader using this technique:
https://github.com/espenbrekke/dependent/blob/master/src/main/java/no/dependent/hacks/PathRewritingClassLoader.java
It replaces the findClass method of URLClassLoader with one that adds a prefix:
protected Class<?> findClass(final String name) throws ClassNotFoundException {
Class result;
try {
result = (Class)AccessController.doPrivileged(new PrivilegedExceptionAction() {
public Class<?> run() throws ClassNotFoundException {
// This is where the prefix is added:
String path = PathRewritingClassLoader.this.prefix + name.replace('.', '/').concat(".class");
Resource res = PathRewritingClassLoader.this._ucp.getResource(path, false);
if(res != null) {
try {
return PathRewritingClassLoader.this._defineClass(name, res);
} catch (IOException var4) {
throw new ClassNotFoundException(name, var4);
}
} else {
return null;
}
}
}, this._acc);
} catch (PrivilegedActionException var4) {
throw (ClassNotFoundException)var4.getException();
}
if(result == null) {
throw new ClassNotFoundException(name);
} else {
return result;
}
}
We also have to rewrite resource loading:
@Override
public URL getResource(String name) {
    return super.getResource(prefix + name);
}
Here is how it is used:
_dependentClassLoader = new PathRewritingClassLoader("private", (URLClassLoader)DependentFactory.class.getClassLoader());
Class myImplementationClass=_dependentClassLoader.loadClass("my.hidden.Implementation");
Building your jar:
In your build you place all the library and private classes under your chosen prefix. In my Gradle build I have a simple loop collecting all the dependencies:
task packageImplementation {
dependsOn cleanImplementationClasses
doLast {
def paths = project.configurations.runtime.asPath
paths.split(':').each { dependencyJar ->
println "unpacking" + dependencyJar
ant.unzip(src: dependencyJar,
dest: "build/classes/main/private/",
overwrite: "true")
}
}
}
ProGuard can rename packages inside a jar and obfuscate code. It is a bit complicated, but you can achieve your goal with it. The sbt-proguard plugin is actively maintained.
You can also check the answers from a similar thread:
maven-shade like plugin for SBT
UPDATE: as of version 0.14.0, the sbt-assembly plugin seems to have shading support.
Have you tried the sbt-assembly plugin? It has a set of merge strategies for conflicts and a pretty nice getting-started guide.
So I am developing a Maven plugin where I need to modify the classloaders in order for it to work correctly. The problem is that I am not sure that I am modifying the correct classloader. What I'm doing is the following:
@Mojo(name = "aggregate", requiresDependencyResolution = ResolutionScope.TEST)
public class AcceptanceTestMojo extends AbstractMojo {

    private static final String SYSTEM_CLASSLOADER_FIELD_NAME = "scl";

    @Parameter
    private String property;

    @Component
    public PluginDescriptor pluginDescriptor;

    @Component
    public MavenProject mavenProject;

    @Override
    public void execute() throws MojoExecutionException, MojoFailureException {
        ClassLoader newClassLoader = null;
        List<String> runtimeClassPathElements;
        try {
            runtimeClassPathElements = mavenProject.getTestClasspathElements();
        } catch (DependencyResolutionRequiredException e) {
            throw new MojoFailureException(MojoFailureMessages.UNRESOLVED_DEPENDENCIES_MESSAGE);
        }
        ClassRealm realm = pluginDescriptor.getClassRealm();
        ClassRealm modifiedRealm = new ClassRealm(realm.getWorld(), realm.getId(), realm.getParentClassLoader());
        try {
            for (String element : runtimeClassPathElements) {
                File elementFile = new File(element);
                modifiedRealm.addURL(elementFile.toURI().toURL());
            }
        } catch (MalformedURLException e) {
            throw new MojoFailureException(MojoFailureMessages.UNRESOLVED_CLASSES_MESSAGE);
        }
        pluginDescriptor.setClassRealm(modifiedRealm);
So I am getting the ClassRealm, making slight changes to the UCP (removing some jars), and then setting the newly created ClassRealm on the plugin descriptor. I am also changing the ContextClassLoader and the SystemClassLoader, because the project I am executing my plugin on uses them for some interactions; those two changes work fine. The problem is the plugin classloader. For some reason, when executing my plugin on one project, it looks in the plugin ClassRealm for the needed jars. But the code above is not fully correct, because by the time the plugin looks into its ClassRealm, it is not the modified one; it gets another reference, and I don't know where that comes from. What I think is that I am either not setting the ClassRealm correctly or missing something else.
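For context, the ContextClassLoader and SystemClassLoader swaps mentioned above are usually done roughly like this; this is only a sketch of the technique (reusing modifiedRealm and the "scl" constant from the code above), not the asker's actual code:
// Point the thread context classloader at the modified realm.
Thread.currentThread().setContextClassLoader(modifiedRealm);

// Replace the cached JVM system classloader via reflection (java.lang.reflect.Field) on the
// private static ClassLoader field named by SYSTEM_CLASSLOADER_FIELD_NAME ("scl"); this is a
// fragile hack, newer JDKs may reject the setAccessible call, and exception handling is omitted.
Field sclField = ClassLoader.class.getDeclaredField(SYSTEM_CLASSLOADER_FIELD_NAME);
sclField.setAccessible(true);
sclField.set(null, modifiedRealm);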
I am trying to use FreeMarker to render some templates that come from a CMS path that happens to include a symbolic link (under Linux). Our CMS code handles the path to the template so, for example, this path:
/var/cms/live/display/main.html
really points to:
/var/cms/trunk/127/display/main.html
/var/cms/live is the base-directory while /display/main.html is the path.
In my case, live is a symbolic link -- in this case to trunk/127. FYI: the trunk is our SVN branch. When our CMS system downloads a new release of CMS files as (for example) trunk-128.zip, it unpacks it into trunk/128 and then changes the symlink (atomically) to trunk/128. Great.
The problem is that FreeMarker seems to have cached the trunk/127 path. It doesn't recognize that the file /var/cms/live/display/main.html has been updated and if the trunk/127 tree is removed, it generates a 500 error.
500 Unable to load template: /display/main.html
How can I get FreeMarker to cache the proper path?
The problem turned out to be with FreeMarker's FileTemplateLoader class. It does a baseDir.getCanonicalFile(...) call on the base directory passed into the constructor. When our application boots, the base directory /var/cms/live gets resolved into the real path /var/cms/trunk/127/ by getCanonicalFile(...), so any future changes to the symlink are ignored.
It does this in the constructor, so we were forced to create our own LocalFileTemplateLoader, which is listed below.
It is just a basic Spring-wired implementation of TemplateLoader. Then, when we build our FreeMarker Configuration, we set the template loader:
Configuration config = new Configuration();
LocalFileTemplateLoader loader = new LocalFileTemplateLoader();
// this is designed for spring
loader.setBaseDir(new File("/var/cms/live"));
config.setTemplateLoader(loader);
...
Here is our LocalFileTemplateLoader code. Full class on pastebin:
public class LocalFileTemplateLoader implements TemplateLoader {

    public File baseDir;

    @Override
    public Object findTemplateSource(String name) {
        File source = new File(baseDir, name);
        if (source.isFile()) {
            return source;
        } else {
            return null;
        }
    }

    @Override
    public long getLastModified(Object templateSource) {
        if (templateSource instanceof File) {
            return ((File) templateSource).lastModified();
        } else {
            throw new IllegalArgumentException("templateSource is an unknown type: " + templateSource.getClass());
        }
    }

    @Override
    public Reader getReader(Object templateSource, String encoding) throws IOException {
        if (templateSource instanceof File) {
            return new InputStreamReader(new FileInputStream((File) templateSource), encoding);
        } else {
            throw new IllegalArgumentException("templateSource is an unknown type: " + templateSource.getClass());
        }
    }

    @Override
    public void closeTemplateSource(Object templateSource) {
        // noop
    }

    @Required
    public void setBaseDir(File baseDir) {
        this.baseDir = baseDir;
        // it may not exist yet because CMS is going to download and create it
    }
}