I am using Liquibase (3.1.1) in a Spring environment (3.2.x) and load the changesets via the includeAll tag in a master file, where I use "classpath*:/package/to/changesets" as the path.
<?xml version="1.0" encoding="UTF-8"?>
<databaseChangeLog xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.1.xsd">
<includeAll path="classpath*:/package/to/changesets"/>...
I use a naming strategy like "nnn_changesetname.xml" to keep the ordering. But when I look into the changeset table, this ordering by filename is not kept. Does this only work if the changeset files are contained in a directory and not on the classpath?
Update
I found out that the solution suggested below is not enough. I think the cause lies in how Liquibase resolves the includeAll attribute. In my case it first resolves all "folders" and then looks into each folder for changeset XMLs. This breaks the ordering of the XML files across all classpath*:/changes locations, because there are now several "changes" folders in different locations. What I would expect in such a case is a merge of the contents of these "virtual" classpath folders and loading of all resources in one enumeration. Alternatively, the includeAll tag could allow a resource pattern like resources="classpath*:/changes/*.xml" to directly select all needed files (I tried this with the path attribute, but it did not work, because it checks for a folder).
Update
I made a hack to check whether the ordering in the returned enumeration is preserved with the answer from below. To achieve this I checked for the given package name and, if it matched my pattern, appended an additional "*.xml" to it. With this extension I get all changesets as needed.
@Override
public Enumeration<URL> getResources(String packageName)
throws IOException {
if(packageName.equals("classpath*:/plugin/liquibase/changes/")) {
packageName = packageName + "*.xml";
}
List<URL> resources = Collections.list(super.getResources(packageName));
Collections.sort(resources, new Comparator<URL>() {
@Override
public int compare(URL url1, URL url2) {
String path1 = FilenameUtils.getName(url1.getPath());
String path2 = FilenameUtils.getName(url2.getPath());
return String.CASE_INSENSITIVE_ORDER.compare(path1, path2);
}
});
logger.info("Found resources: {}", resources);
return Collections.enumeration(resources);
}};
In the log I can now see that the resources have the correct order. But when I look into the table DATABASECHANGELOG it does not reflect the order I had in the enumeration. So it seems that these values get reordered somewhere else.
Update
Analyzed the code further and found out that the class liquibase.parser.core.xml.XMLChangeLogSAXHandler reorders the returned enumeration, so my changes have no effect. I do not think I can hack into this class as well.
You are right, Liquibase relies on the underlying "list files" logic, which orders files alphabetically on the file system but apparently does not do so for classpath resources.
I created https://liquibase.jira.com/browse/CORE-1843 to track the fix.
For now, if you configure Spring with a subclass of liquibase.integration.spring.SpringLiquibase that overrides getResources(String packageName) with a method that sorts the returned Enumeration, that should resolve the problem for you.
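A minimal sketch of that suggestion, assembled from the snippets in the question (the class name SortedSpringLiquibase is made up here, and it assumes this Liquibase version lets you override SpringResourceOpener.getResources; as the later updates show, the XML parser may still reorder the result):
import java.io.IOException;
import java.net.URL;
import java.util.Collections;
import java.util.Comparator;
import java.util.Enumeration;
import java.util.List;
import liquibase.integration.spring.SpringLiquibase;

public class SortedSpringLiquibase extends SpringLiquibase {

    @Override
    protected SpringResourceOpener createResourceOpener() {
        return new SpringResourceOpener(getChangeLog()) {
            @Override
            public Enumeration<URL> getResources(String packageName) throws IOException {
                // Sort the resolved URLs case-insensitively by path so that
                // includeAll sees them in file-name order.
                List<URL> resources = Collections.list(super.getResources(packageName));
                Collections.sort(resources, new Comparator<URL>() {
                    @Override
                    public int compare(URL url1, URL url2) {
                        return String.CASE_INSENSITIVE_ORDER.compare(url1.getPath(), url2.getPath());
                    }
                });
                return Collections.enumeration(resources);
            }
        };
    }
}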
So after some thinking and one night of sleep I came up with the following hack to guarantee the order of changelog files loaded via the classpath pattern classpath*:/my/path/to/changelog/*.xml. The idea is to create the main changelog file on the fly via DOM manipulation when Liquibase requests it.
It only works for the main changelog file. The following prerequisites apply:
The pattern can only be used for the main changelog file
I use an empty master changelog file as template
All other changelog files have to use the normal allowed loading mechanism
Works only in a Spring environment
First I had to extend/override liquibase.integration.spring.SpringLiquibase with my own implementation.
public class MySpringLiquibase extends SpringLiquibase {
private static final Logger logger = LoggerFactory.getLogger(MySpringLiquibase.class);
private ApplicationContext context;
private String changeLogLocationPattern;
private List<String> changeLogLocations;
@Autowired
public void setContext(ApplicationContext context) {
this.context = context;
}
/**
* Location pattern to search for changelog files.
*
* @param changeLogLocationPattern
*/
public void setChangeLogLocationPattern(String changeLogLocationPattern) {
this.changeLogLocationPattern = changeLogLocationPattern;
}
@Override
public void afterPropertiesSet() throws LiquibaseException {
try {
changeLogLocations = new ArrayList<String>();
// retrieve all changelog resources for the pattern
List<Resource> changeLogResources = Arrays.asList(context.getResources(changeLogLocationPattern));
for (Resource changeLogResource : changeLogResources) {
// get only the classpath path of the resource
String changeLogLocation = changeLogResource.getURL().getPath();
changeLogLocation = "classpath:" + StringUtils.substringAfterLast(changeLogLocation, "!");
changeLogLocations.add(changeLogLocation);
}
// sort all found resources by string
Collections.sort(changeLogLocations, String.CASE_INSENSITIVE_ORDER);
} catch (IOException e) {
throw new LiquibaseException("Could not resolve changeLogLocationPattern", e);
}
super.afterPropertiesSet();
}
@Override
protected SpringResourceOpener createResourceOpener() {
final String mainChangeLog = getChangeLog();
return new SpringResourceOpener(getChangeLog()) {
@Override
public InputStream getResourceAsStream(String file)
throws IOException {
// check if main changelog file
if(mainChangeLog.equals(file)) {
// load master template and convert to dom object
Resource masterResource = getResourceLoader().getResource(file);
Document masterDocument = DomUtils.parse(masterResource, true);
// add all changelog locations as include elements
for (String changeLogLocation : changeLogLocations) {
Element includeElement = masterDocument.createElement("include");
includeElement.setAttribute("file", changeLogLocation);
masterDocument.getDocumentElement().appendChild(includeElement);
}
if(logger.isDebugEnabled()) {
logger.debug("Master changeset: {}", DomUtils.toString(masterDocument));
}
// convert dom back to string and give it back as input resource
return new ByteArrayInputStream(DomUtils.toBytes(masterDocument));
} else {
return super.getResourceAsStream(file);
}
}
};
}
}
This class now needs to be used in the Spring XML configuration.
<bean id="liquibase" class="liquibase.integration.spring.MySpringLiquibase"
p:changeLog="classpath:/plugin/liquibase/master.xml"
p:dataSource-ref="dataSource"
p:contexts="${liquibase.contexts:prod}"
p:ignoreClasspathPrefix="true"
p:changeLogLocationPattern="classpath*:/plugin/liquibase/changes/*.xml"/>
With these changes I have achieved that the changelog files included by my main changelog are ordered by name.
Hope that helps others too.
Related
In a multi-module project I want to be sure that Spring's @Sql annotation uses the correct resources. Is there a way to log the full path of those files to the console somehow?
Spring does log the script file name before execution, but in tests for different modules those file names are sometimes the same.
SqlScriptsTestExecutionListener is responsible for processing @Sql. As a first step you can enable the related debug logging by adding the property logging.level.org.springframework.test.context.jdbc=debug, but the debug message does not contain the full path. If that is not enough, you should create your own TestExecutionListener and declare it on the test class with @TestExecutionListeners(listeners = SqlScriptsCustomTestExecutionListener.class).
for example:
public class SqlScriptsCustomTestExecutionListener extends AbstractTestExecutionListener {
@Override
public void beforeTestMethod(TestContext testContext) {
List<Resource> scriptResources = new ArrayList<>();
Set<Sql> sqlAnnotations = AnnotatedElementUtils.getMergedRepeatableAnnotations(testContext.getTestMethod(), Sql.class);
for (Sql sqlAnnotation : sqlAnnotations) {
String[] scripts = sqlAnnotation.scripts();
scripts = TestContextResourceUtils.convertToClasspathResourcePaths(testContext.getTestClass(), scripts);
scriptResources.addAll(TestContextResourceUtils.convertToResourceList(testContext.getApplicationContext(), scripts));
}
if (!scriptResources.isEmpty()) {
String debugString = scriptResources.stream().map(r -> {
try {
return r.getFile().getAbsolutePath();
} catch (IOException e) {
System.out.println("Unable to found file resource");
}
return null;
}).collect(Collectors.joining(","));
System.out.println(String.format("Execute sql script :[%s]", debugString));
}
}
}
It is just a quick example and it works. Most of the source code I copied from SqlScriptsTestExecutionListener, just for explanation. It only handles @Sql annotations at the method level, not the class level.
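For completeness, attaching the listener could look roughly like this (just a sketch: the test class name and script path are made up, the usual Spring test setup such as @RunWith/@ExtendWith plus a context configuration is assumed to be present, and mergeMode keeps the default listeners, including the one that actually executes the scripts):
import org.junit.Test;
import org.springframework.test.context.TestExecutionListeners;
import org.springframework.test.context.jdbc.Sql;

@TestExecutionListeners(
        listeners = SqlScriptsCustomTestExecutionListener.class,
        mergeMode = TestExecutionListeners.MergeMode.MERGE_WITH_DEFAULTS)
public class UserRepositoryIT {

    @Test
    @Sql(scripts = "/sql/insert-users.sql")  // hypothetical script path
    public void printsFullScriptPathBeforeRunning() {
        // the custom listener logs the absolute path of insert-users.sql before this runs
    }
}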
I hope it helps you.
Let's say I have the following code.
private static String configFile = null;
File cf = new File(configFile);
Configuration c = new Configuration();
if (cf.exists() && cf.isFile()) {
c.configure(cf);
} else {
c.configure(configFile);
}
I am wondering what the difference is between c.configure(cf) and c.configure(configFile). In my code, configFile is represented as a resource and cf is the configFile object.
I found these two from this (api).
public Configuration configure(String resource)
throws HibernateException
public Configuration configure(File configFile)
throws HibernateException
The documentation of the API isn't explicitly clear, is it?
I tracked it as far as this class before getting fed up:
https://github.com/hibernate/hibernate-orm/blob/master/hibernate-core/src/main/java/org/hibernate/boot/cfgxml/internal/ConfigLoader.java
But it looks like, in the case of configure(String resource), the argument is the name of a resource as it would be passed to the Java class loader to get a resource as a stream, i.e.:
http://docs.oracle.com/javase/7/docs/api/java/lang/Class.html#getResourceAsStream(java.lang.String)
Whereas configure(File configFile) uses a FileInputStream.
In either case Hibernate is still expecting the same XML format for the configuration.
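To make the difference concrete, here is a small sketch of both overloads (the filesystem path is purely illustrative; hibernate.cfg.xml is the conventional resource name):
import java.io.File;
import org.hibernate.cfg.Configuration;

public class ConfigureExample {
    public static void main(String[] args) {
        // configure(String resource): the argument is a classpath resource name,
        // resolved the same way as ClassLoader.getResourceAsStream(...)
        Configuration fromClasspath = new Configuration().configure("hibernate.cfg.xml");

        // configure(File): the argument is a plain filesystem path,
        // read through a FileInputStream
        Configuration fromFile = new Configuration().configure(new File("/etc/myapp/hibernate.cfg.xml"));
    }
}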
I'm trying to write a test for a Mule flow that involves dropping a file in a location, waiting for it to be processed by my flow, and comparing the output to see if it has been transformed correctly. My flow looks as follows:
<flow name="mainFlow" processingStrategy="synchronous">
<file:inbound-endpoint name="fileIn" path="${inboundPath}">
<file:filename-regex-filter pattern="myFile.csv" caseSensitive="true"/>
</file:inbound-endpoint>
...
<file:outbound-endpoint path="${outboundPath}" outputPattern="out.csv"/>
</flow>
Is there a way I can access the inboundPath and outboundPath Mule properties inside of my test class so that I can drop files and wait for output in the correct places?
The test class I'm using is:
public class MappingTest extends BaseFileToFileFunctionalTest {
@Override
protected String getConfigResources() {
return "mappingtest.xml";
}
@Test
public void testMapping() throws Exception {
dropInputFileIntoPlace("myFile.csv");
waitForOutputFile("out.csv", 100);
assertEquals(getExpectedOutputFile("expected-out.csv"), getActualOutputFile("out.csv"));
}
}
Which extends this class:
public abstract class BaseFileToFileFunctionalTest extends FunctionalTestCase {
private static final File INPUT_DIR = new File("/tmp/muletest/input");
private static final File OUTPUT_DIR = new File("/tmp/muletest/output");
private static final Charset CHARSET = Charsets.UTF_8;
@Before
public void setup() {
new File("/tmp/muletest/input").mkdirs();
new File("/tmp/muletest/output").mkdirs();
empty(INPUT_DIR);
empty(OUTPUT_DIR);
}
private void empty(File inputDir) {
for (File file : inputDir.listFiles()) {
file.delete();
}
}
protected File waitForOutputFile(String expectedFileName, int retryAttempts) throws InterruptedException {
boolean polling = true;
int attemptsRemaining = retryAttempts;
File outputFile = new File(OUTPUT_DIR, expectedFileName);
while (polling) {
Thread.sleep(100L);
if (outputFile.exists()) {
polling = false;
}
if (attemptsRemaining == 0) {
VisibleAssertions.fail("Output file did not appear within expected time");
}
attemptsRemaining--;
}
outputFile.deleteOnExit();
return outputFile;
}
protected void dropInputFileIntoPlace(String inputFileResourceName) throws IOException {
File inputFile = new File(INPUT_DIR, inputFileResourceName);
Files.copy(Resources.newInputStreamSupplier(Resources.getResource(inputFileResourceName)), inputFile);
inputFile.deleteOnExit();
}
protected String getActualOutputFile(String outputFileName) throws IOException {
File outputFile = new File(OUTPUT_DIR, outputFileName);
return Files.toString(outputFile, CHARSET);
}
protected String getExpectedOutputFile(String resourceName) throws IOException {
return Resources.toString(Resources.getResource(resourceName), CHARSET);
}
}
As you can see, I'm currently creating temporary input/output directories. I'd like to make this part read from the Mule properties if possible. Thanks in advance.
From your test classes and code I can see that you want to dynamically create temp folders and place files in them, with the flow reading the files from one temp directory and writing output to another. The point to note is that Mule's endpoints are created when the configuration is loaded, so ${inboundPath} and ${outboundPath} must be available to the Mule flow by the time the configuration is loaded.
So one option can be to create a dummy flow pointing to the temp folders for testing.
or
Create a test properties file pointing to the temp folders and load that in your flow config, so that your flow endpoints get the temp folder paths.
Either way, the path cannot be provided to the flow's inbound endpoints after they have been created (on config load).
Update1:
As per your comment, the solution with the second option (the test properties file) would look like the following.
Separate the properties-loading part of the config into another config file.
Like "mapping-core-config.xml,mappingtest.xml" where the mapping-core-config will have the tags to load the properties file.
Now create a test counterpart of the mapping-core-config.xml file that loads the test properties file, and use it in your test config. This way, without modifying or disturbing your main code, you can test your flows pointing to the temp folders.
"mapping-core-test-config.xml,mappingtest.xml"
Note: The test config can reside in the src/test/resources folder.
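If you go with the properties-file option, one way (just a sketch; the file name test.properties and the property keys are assumptions, not part of the original post) is to have the test read the same properties file that the test config loads, so the temp directories are defined in exactly one place:
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

// Hypothetical helper for the test base class: reads the properties file that
// the test config is assumed to load, so the directories used by the test match
// ${inboundPath}/${outboundPath} in the flow.
public final class TestFlowPaths {

    private TestFlowPaths() {
    }

    public static File dir(String key) throws IOException {
        Properties props = new Properties();
        InputStream in = TestFlowPaths.class.getClassLoader()
                .getResourceAsStream("test.properties");
        try {
            props.load(in);
        } finally {
            in.close();
        }
        return new File(props.getProperty(key));
    }
}
The base test class could then initialise INPUT_DIR and OUTPUT_DIR from TestFlowPaths.dir("inboundPath") and TestFlowPaths.dir("outboundPath") instead of hard-coding the /tmp/muletest paths.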
Hope this helps.
I am trying to use FreeMarker to render some templates that come from a CMS path that happens to include a symbolic link (under Linux). Our CMS code handles the path to the template so, for example, this path:
/var/cms/live/display/main.html
really points to:
/var/cms/trunk/127/display/main.html
/var/cms/live is the base-directory while /display/main.html is the path.
In my case, live is a symbolic link, currently pointing to trunk/127. FYI: trunk is our SVN branch. When our CMS system downloads a new release of CMS files as (for example) trunk-128.zip, it unpacks it into trunk/128 and then changes the symlink (atomically) to trunk/128. Great.
The problem is that FreeMarker seems to have cached the trunk/127 path. It doesn't recognize that the file /var/cms/live/display/main.html has been updated and if the trunk/127 tree is removed, it generates a 500 error.
500 Unable to load template: /display/main.html
How can I get FreeMarker to cache the proper path?
The problem turned out to be with FreeMarker's FileTemplateLoader class. It calls baseDir.getCanonicalFile() on the base directory passed into the constructor. When our application booted, the base directory /var/cms/live got resolved into the real path /var/cms/trunk/127/ by getCanonicalFile(), so any future changes to the symlink were ignored.
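A quick way to see the root cause, using the paths from above (the printed target will of course depend on where the symlink currently points):
import java.io.File;
import java.io.IOException;

public class SymlinkDemo {
    public static void main(String[] args) throws IOException {
        File base = new File("/var/cms/live");
        // getCanonicalFile() resolves the symlink, freezing the path at its current target
        System.out.println(base.getCanonicalFile());  // e.g. /var/cms/trunk/127
        // getAbsoluteFile() keeps the symlink, so later retargeting is still honoured
        System.out.println(base.getAbsoluteFile());   // /var/cms/live
    }
}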
FileTemplateLoader does this canonicalisation in its constructor, so we were forced to create our own LocalFileTemplateLoader, which is listed below.
It is just a basic Spring-wired implementation of TemplateLoader. Then, when we build our FreeMarker Configuration, we set the template loader:
Configuration config = new Configuration();
LocalFileTemplateLoader loader = new LocalFileTemplateLoader();
// this is designed for spring
loader.setBaseDir("/var/cms/live");
config.setTemplateLoader(loader);
...
Here is our LocalFileTemplateLoader code. Full class on pastebin:
public class LocalFileTemplateLoader implements TemplateLoader {
public File baseDir;
@Override
public Object findTemplateSource(String name) {
File source = new File(baseDir, name);
if (source.isFile()) {
return source;
} else {
return null;
}
}
@Override
public long getLastModified(Object templateSource) {
if (templateSource instanceof File) {
return new Long(((File) templateSource).lastModified());
} else {
throw new IllegalArgumentException("templateSource is an unknown type: " + templateSource.getClass());
}
}
@Override
public Reader getReader(Object templateSource, String encoding) throws IOException {
if (templateSource instanceof File) {
return new InputStreamReader(new FileInputStream((File) templateSource), encoding);
} else {
throw new IllegalArgumentException("templateSource is an unknown type: " + templateSource.getClass());
}
}
@Override
public void closeTemplateSource(Object templateSource) {
// noop
}
@Required
public void setBaseDir(File baseDir) {
this.baseDir = baseDir;
// it may not exist yet because CMS is going to download and create it
}
}
Background:
One of the components of our project operates using spring. Some SQL code is dynamically generated, based on a given XML spring configuration.
At first it was fine to store all the XML configurations in the same package on the classpath (and then load them as resources when the service is called), but over time we ended up with a large number of configurations. It came time to separate the configurations into different namespaces.
The Goal
What I want is, given a starting package on the classpath, to recursively walk the directory structure and discover any spring XML files dynamically. (So that as new configurations / packages are added, the files will still be found by the service).
The Problem
I was able to accomplish my goal fine when running outside an EJB container by using Thread.currentThread().getContextClassLoader().getResource(myBasePackage), then getting a File object and using it to walk the tree on the filesystem. Clunky, I know, but it was still classpath-relative and it worked.
However, you cannot do this inside an EJB container (you are not allowed to interact with the filesystem at all), so I had to use the rather annoying workaround of maintaining a hard-coded list of packages to search.
The Question
Is there a way (running inside an EJB container) to dynamically walk the classpath (from a given starting location) searching for arbitrary resources?
Short answer: Not while staying in compliance with the EJB spec. Because the spec envisions containers running in all kinds of non-standard situations, it does not make this possible.
Longer answer: Since you are not creating these resources dynamically, I would write a routine that produces a list of all of the resources at build time and puts them in a generated file that your EJB knows how to reference. So you basically create a directory listing of packages and files, referenced in one master file, that you can load in the EJB.
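A rough sketch of that idea (the index file name spring/resource-index.txt is made up; the assumption is that a build step writes one classpath-relative resource path per line into it):
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;

// Reads the build-time generated index; the EJB never touches the filesystem,
// it only loads a known classpath resource.
public class ResourceIndex {

    public static List<String> listConfiguredResources() throws IOException {
        List<String> paths = new ArrayList<String>();
        InputStream in = Thread.currentThread().getContextClassLoader()
                .getResourceAsStream("spring/resource-index.txt");
        if (in == null) {
            throw new IOException("resource index not found on classpath");
        }
        BufferedReader reader = new BufferedReader(new InputStreamReader(in, "UTF-8"));
        try {
            String line;
            while ((line = reader.readLine()) != null) {
                if (line.trim().length() > 0) {
                    paths.add(line.trim());
                }
            }
        } finally {
            reader.close();
        }
        return paths;
    }
}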
Spring answer: Spring supports finding resources on the classpath, although I have no idea how well this works in the EJB context (and I doubt it's EJB compliant, but I haven't checked). Some details here.
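For reference, the Spring mechanism mentioned above looks roughly like this (the package in the pattern is purely illustrative):
import java.io.IOException;
import org.springframework.core.io.Resource;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;

public class SpringResourceScan {
    public static void main(String[] args) throws IOException {
        PathMatchingResourcePatternResolver resolver = new PathMatchingResourcePatternResolver();
        // Find every XML file below the base package, across all classpath roots (jars included)
        Resource[] configs = resolver.getResources("classpath*:com/example/configs/**/*.xml");
        for (Resource config : configs) {
            System.out.println(config.getDescription());
        }
    }
}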
DISCLAIMER: As already pointed out, creating resources in the classpath is not recommended and, depending on the EJB container, explicitly forbidden. This may cause you a lot of problems, because containers may explode your resources into another folder or even replicate them throughout the cluster (if that is the case). In order to create resources dynamically you have to create a custom classloader, so I would never do it. It is better to access the filesystem directly than the classpath; it is less ugly, and eventually cluster-safe if you use a remote filesystem plus file locks.
If, even after all I explained, you still want to play with the classpath, you can try something like the following: get the classloader via
ClassLoader cld = Thread.currentThread().getContextClassLoader();
Starting from a base package, enumerate all occurrences:
Enumeration<URL> basePackageUrls = cld.getResources(basePackagePath);
Each URL is generally either a file link (file:///home/scott/.../MyResource.properties) or a JAR link (jar:file:///lib.jar!/com/domain/MyResource.properties). You have to check the pattern in the URL. Using that, enumerate the contents of the folder using the normal Java API and find the subpackages. Proceed until you have scanned all packages.
See the class below (it will be released with an open-source project of mine soon). It implements a classpath scanner to which you can pass a selector; it works like a visitor. It may work for you as-is; if not, take ideas from it. See the sample annotation selector at the end.
public class ClasspathScanner
{
private static final Log log = LogFactory.getLog(ClasspathScanner.class);
private static final String JAR_FILE_PATTERN = ".jar!";
private ClassSelector selector;
private Set<Class<?>> classes;
// PUBLIC METHODS ------------------------------------------------------------------------------
public synchronized Set<Class<?>> scanPackage(String basePackage, ClassSelector selector)
throws Exception
{
if (selector == null)
{
throw new NullPointerException("Selector cannot be NULL");
}
this.selector = selector;
this.classes = new HashSet<Class<?>>();
Set<Class<?>> aux;
try
{
scanClasses0(basePackage);
aux = this.classes;
}
finally
{
this.selector = null;
this.classes = null;
}
return aux;
}
// HELPER CLASSES ------------------------------------------------------------------------------
private void scanClasses0(String basePackage)
throws IOException, ClassNotFoundException, FileNotFoundException
{
File packageDirectory = null;
ClassLoader cld = getLoader();
String basePackagePath = basePackage.replace('.', '/');
Enumeration<URL> basePackageUrls = cld.getResources(basePackagePath);
if (basePackageUrls == null || !basePackageUrls.hasMoreElements())
{
throw new ClassNotFoundException("Base package path not found: [" + basePackagePath
+ "]");
}
while (basePackageUrls.hasMoreElements())
{
String packagePath = basePackageUrls.nextElement().getFile();
if (packagePath.contains(JAR_FILE_PATTERN))
{
scanJarFile(basePackagePath, packagePath);
}
else
{
packageDirectory = new File(packagePath);
scanDirectory(basePackage, packageDirectory);
}
}
}
private void scanDirectory(String packageName, File packagePath)
throws ClassNotFoundException, FileNotFoundException
{
if (packagePath.exists())
{
File[] packageFiles = packagePath.listFiles();
for (File file : packageFiles)
{
if (file.isFile() && file.getName().endsWith(".class"))
{
String fullFileName = packageName + '.' + file.getName();
checkClass(fullFileName);
}
else if (file.isDirectory())
{
scanDirectory(packageName + "." + file.getName(), file);
}
}
}
else
{
throw new FileNotFoundException(packagePath.getPath());
}
}
private void scanJarFile(String basePackagePath, String jarFileUrl)
throws IOException, ClassNotFoundException
{
String jarFilePath = jarFileUrl.substring("file:".length(), jarFileUrl
.indexOf(JAR_FILE_PATTERN)
+ JAR_FILE_PATTERN.length() - 1);
log.debug("URL JAR file path: [" + jarFilePath + "]");
jarFilePath = URLDecoder.decode(jarFilePath, "UTF-8");
log.debug("Decoded JAR file path: [" + jarFilePath + "]");
JarFile jar = new JarFile(new File(jarFilePath));
for (Enumeration<JarEntry> jarFiles = jar.entries(); jarFiles.hasMoreElements();)
{
JarEntry file = jarFiles.nextElement();
String fileName = file.getName();
if (!file.isDirectory() && fileName.endsWith(".class")
&& fileName.startsWith(basePackagePath))
{
String className = fileName.replace('/', '.');
checkClass(className);
}
}
}
private void checkClass(String fullFilePath) throws ClassNotFoundException
{
String className = fullFilePath.substring(0, fullFilePath.length() - 6);
Class<?> c = getLoader().loadClass(className);
if (selector.select(c))
{
classes.add(c);
}
}
private ClassLoader getLoader()
{
ClassLoader loader = Thread.currentThread().getContextClassLoader();
if (loader == null)
{
loader = getClass().getClassLoader();
}
return loader;
}
// INNER CLASSES -------------------------------------------------------------------------------
public interface ClassSelector
{
boolean select(Class<?> clazz);
}
public static class AnnotatedClassSelector implements ClassSelector
{
private final Class<? extends Annotation>[] annotations;
public AnnotatedClassSelector(Class<? extends Annotation>... annotations)
{
this.annotations = annotations;
}
public boolean select(Class<?> clazz)
{
for (Class<? extends Annotation> ac : annotations)
{
if (clazz.isAnnotationPresent(ac))
{
return true;
}
}
return false;
}
}
}
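A hypothetical usage example of the scanner, selecting all classes under a base package that carry a runtime-retained annotation (here @Deprecated and the package com.example.model, purely for illustration):
import java.util.Set;

public class ScannerExample {
    public static void main(String[] args) throws Exception {
        ClasspathScanner scanner = new ClasspathScanner();
        Set<Class<?>> deprecatedClasses = scanner.scanPackage(
                "com.example.model",
                new ClasspathScanner.AnnotatedClassSelector(Deprecated.class));
        for (Class<?> c : deprecatedClasses) {
            System.out.println(c.getName());
        }
    }
}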