I'd like to use Hibernate's schema generation to generate DDL for a database that I cannot access directly from my PC, using just the Hibernate config files. If possible, I'd like to skip installing a local Oracle database. Can Hibernate generate DDL for a "theoretical" database of the appropriate dialect, version, etc., or is this a pipe dream?
Are there other tools that can do this?
You can either use an in-memory database during the testing phase:
hibernate.hbm2ddl.auto="update"
Or you can generate your DDL using hibernatetool from Maven (via the maven-antrun-plugin):
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<id>generate-test-sql-scripts</id>
<phase>generate-test-resources</phase>
<goals>
<goal>run</goal>
</goals>
<configuration>
<tasks>
<property name="maven_test_classpath" refid="maven.test.classpath"/>
<path id="hibernate_tools_path">
<pathelement path="${maven_test_classpath}"/>
</path>
<property name="hibernate_tools_classpath" refid="hibernate_tools_path"/>
<taskdef name="hibernatetool"
classname="org.hibernate.tool.ant.HibernateToolTask"/>
<mkdir dir="${project.build.directory}/test-classes/hsqldb"/>
<hibernatetool destdir="${project.build.directory}/test-classes/hsqldb">
<classpath refid="hibernate_tools_path"/>
<jpaconfiguration persistenceunit="testPersistenceUnit"
propertyfile="src/test/resources/META-INF/spring/jdbc.properties"/>
<hbm2ddl drop="false" create="true" export="false"
outputfilename="create_db.sql"
delimiter=";" format="true"/>
<hbm2ddl drop="true" create="false" export="false"
outputfilename="drop_db.sql"
delimiter=";" format="true"/>
</hibernatetool>
</tasks>
</configuration>
</execution>
</executions>
</plugin>
This Maven plugin will generate the following DDL files:
create_db.sql (containing all DDL statements for creating the DB)
drop_db.sql (containing all DDL statements for dropping the DB)
As of hibernate-tools 5.3 you can also use the integrated hibernate-tools-maven-plugin, which ships with documentation on how to use it.
Unfortunately the Maven site with a detailed description of its parameters and goals has not been published yet, but a slightly older version of it is available here.
There is a PR (#842) that would allow the site to be generated, but unfortunately it hasn't found its way in yet.
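For completeness: if you would rather not go through Maven or Ant at all, the same DDL can be produced programmatically against a target dialect, with no live database connection. This is only a minimal sketch, assuming the Hibernate 3/4-era SchemaExport API and a hibernate.cfg.xml (or equivalent mapping configuration) on the classpath:

import org.hibernate.cfg.Configuration;
import org.hibernate.tool.hbm2ddl.SchemaExport;

public class DdlGenerator {
    public static void main(String[] args) {
        // Load the mappings from hibernate.cfg.xml; only the dialect matters here,
        // no JDBC connection is ever opened.
        Configuration cfg = new Configuration().configure();
        cfg.setProperty("hibernate.dialect", "org.hibernate.dialect.Oracle10gDialect");

        SchemaExport export = new SchemaExport(cfg);
        export.setOutputFile("create_db.sql");
        export.setDelimiter(";");
        export.setFormat(true);
        // script = true  -> write the statements to the output file
        // export = false -> do not execute anything against a database
        export.create(true, false);
    }
}

Because export is false, nothing is run against any database; the CREATE statements for the configured Oracle dialect are simply written to create_db.sql.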
In Spring Boot, you can do the following:
src/main/resources/META-INF/spring.factories
# Auto Configure
org.springframework.boot.autoconfigure.EnableAutoConfiguration=\
org.springframework.boot.autoconfigure.admin.AConfiguration,\
org.springframework.boot.autoconfigure.admin.BConfiguration,\
org.springframework.boot.autoconfigure.admin.CConfiguration,\
org.springframework.boot.autoconfigure.admin.DConfiguration,\
org.springframework.boot.autoconfigure.admin.EConfiguration,\
org.springframework.boot.autoconfigure.admin.FConfiguration,\
This works very nicely. However, after a year of development the list of auto-configurations is now more than 15 lines long, which makes it hard to manage.
I'd like to know whether it is possible to split spring.factories into multiple files. Preferably I'd like to keep the whole project in one JAR.
Or is there another way to organize the EnableAutoConfiguration entries that I am not aware of?
Thanks in advance!
When using Spring Boot we typically rely on multiple "starters", each with its own auto-configuration and spring.factories file.
So one way could be to split your project into modules, one for each auto-configuration, define a dedicated spring.factories file in each module, and import all the modules as runtime dependencies in the main application module.
You can use Maven or Gradle to manage the multi-module project and the dependencies among the modules:
Gradle: https://guides.gradle.org/creating-multi-project-builds/
Maven: https://www.baeldung.com/maven-multi-module
Example:
root
moduleA
src/main/resources/META-INF/spring.factories
moduleB
src/main/resources/META-INF/spring.factories
and so on...
I have found a solution for this question.
Note: this exact solution assumes that you only use EnableAutoConfiguration in your spring.factories; it will break if you use more than one type of config key inside spring.factories.
One can do:
src/main/resources/META-INF/spring.factories
src/main/resources/META-INF/spring-2.factories
src/main/resources/META-INF/spring-3.factories
src/main/resources/META-INF/spring-4.factories
and have the build merge them into one file.
Note: I am using the Maven AntRun plugin, but I suspect Gradle has a similar feature.
In your pom.xml, add the following:
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.8</version>
<executions>
<execution>
<id>default-ci</id>
<goals>
<goal>run</goal>
</goals>
<phase>process-resources</phase>
<configuration>
<target>
<replace token='org.springframework.boot.autoconfigure.EnableAutoConfiguration=' value=','
dir="${project.build.directory}/classes/META-INF">
<include name="spring-*.factories"/>
</replace>
<concat destfile="${project.build.directory}/classes/META-INF/spring.factories" overwrite="yes" append="yes">
<fileset dir="${project.build.directory}/classes/META-INF" includes="spring-*.factories" />
</concat>
</target>
</configuration>
</execution>
</executions>
</plugin>
And spring.factories contains the normal config:
# Auto Configure
org.springframework.boot.autoconfigure.EnableAutoConfiguration=\
org.springframework.boot.autoconfigure.admin.AConfiguration,\
org.springframework.boot.autoconfigure.admin.BConfiguration
In spring-2.factories and the others you write the same key again; the replace task above substitutes a comma for that key at build time, so each appended file continues the list from spring.factories instead of redefining it:
spring-2.factories:
org.springframework.boot.autoconfigure.EnableAutoConfiguration=\
org.springframework.boot.autoconfigure.admin.CConfiguration
spring-3.factories:
org.springframework.boot.autoconfigure.EnableAutoConfiguration=\
org.springframework.boot.autoconfigure.admin.DConfiguration
After all that, the resulting spring.factories in your output classes directory will be:
# Auto Configure
org.springframework.boot.autoconfigure.EnableAutoConfiguration=\
org.springframework.boot.autoconfigure.admin.AConfiguration,\
org.springframework.boot.autoconfigure.admin.BConfiguration,\
org.springframework.boot.autoconfigure.admin.CConfiguration,\
org.springframework.boot.autoconfigure.admin.DConfiguration
I added to my Maven project the solution advised in this Stack Overflow question. The only difference I introduced was to replace <tasks /> with <target /> (the issue I am experiencing appears with either).
Everything works fine on the testing side: when I run my tests, the correct persistence file (test-persistence.xml) is used. However, when I do a clean install, or even hit Run from my IDE (NetBeans 8.2), only the first target (copy-test-persistence) is executed. The second execution is entered after the tests (see the build output below), but its target is not executed. What I get after every clean install, and when running the app on the server, is that the contents of test-persistence.xml end up in persistence.xml. The right content remains in the persistence.xml.proper file created by the first target.
--- maven-antrun-plugin:1.8:run (copy-test-persistence) @ RimmaNew ---
Executing tasks
main:
[copy] Copying 1 file to /my-project-home/target/classes/META-INF
[copy] Copying 1 file to /my-project-home/target/classes/META-INF
Executed tasks
...
--- maven-antrun-plugin:1.8:run (restore-persistence) @ RimmaNew ---
Executing tasks
main:
Executed tasks
You will notice that 0 tasks are executed under restore-persistence. Strangely enough, in the generated /target/antrun folder there's a build-main.xml file which does include the skipped task:
<?xml version="1.0" encoding="UTF-8" ?>
<project name="maven-antrun-" default="main" >
<target name="main">
<copy file="/home/vgorcinschi/NetBeansProjects/rimmanew/target/classes/META-INF/persistence.xml.proper" tofile="/home/vgorcinschi/NetBeansProjects/rimmanew/target/classes/META-INF/persistence.xml"/>
</target>
</project>
I would appreciate a hint, as I can't get my head around this. As usual, I am posting my current pom.xml:
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.8</version>
<executions>
<execution>
<id>copy-test-persistence</id>
<phase>process-test-resources</phase>
<configuration>
<target>
<!--backup the "proper" persistence.xml-->
<copy file="${project.build.outputDirectory}/META-INF/persistence.xml" tofile="${project.build.outputDirectory}/META-INF/persistence.xml.proper" />
<!--replace the "proper" persistence.xml with the "test" version-->
<copy file="${project.build.testOutputDirectory}/META-INF/test-persistence.xml" tofile="${project.build.outputDirectory}/META-INF/persistence.xml" />
</target>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
<execution>
<id>restore-persistence</id>
<phase>prepare-package</phase>
<configuration>
<target>
<!--restore the "proper" persistence.xml-->
<copy file="${project.build.outputDirectory}/META-INF/persistence.xml.proper" tofile="${project.build.outputDirectory}/META-INF/persistence.xml" />
</target>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
The issue has to do with how Ant's copy task works:
By default, files are only copied if the source file is newer than the destination file, or when the destination file does not exist.
This is the problem here. Ant detects that the target file already exists and that it is not newer. There is a granularity used to determine "newer"; by default it is 1 second, or 2 seconds on DOS systems. So what happens is that, during the build, persistence.xml is copied by Maven into the build directory, its last-modified date is changed (the Resources Plugin doesn't keep it), and then your own copy happens just a few milliseconds later. Thus the copied persistence.xml.proper will never be considered newer, because all of this happens within the default granularity.
You can force the copy by setting the overwrite parameter to true with
<copy file="${project.build.outputDirectory}/META-INF/persistence.xml.proper"
tofile="${project.build.outputDirectory}/META-INF/persistence.xml"
overwrite="true"/>
Or you could use the move task instead, since you probably don't need to keep the .proper file anyway:
<move file="${project.build.outputDirectory}/META-INF/persistence.xml.proper"
tofile="${project.build.outputDirectory}/META-INF/persistence.xml" />
I'm trying to build up some documentation for my Wicket Web Application. I have created a page to grab all of my mounted pages and display them in /sitemap.xml.
In the vein of documentation, I've added a new tag to the file: <siteMap:Description>.
Now I want to fill that description with the Javadoc comment that describes the class.
I know there is no direct way to access Javadoc at runtime. So instead I'm hoping to copy the comments at compile time into a list where they will be accessible at runtime. How would I do that?
I'm using Maven for my build.
EDIT
I should probably also mention that I already have an Ant task defined as part of my build process to save the compile dates/times to a property file.
It seems to me that a task to scan my classes and then put the information into a file is probably the way to go. The problem is I'm not sure how to proceed.
My Ant task is defined in my pom.xml like so:
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<dependencies>
<dependency>
<groupId>ant</groupId>
<artifactId>ant-nodeps</artifactId>
<version>1.6.5</version>
</dependency>
</dependencies>
<executions>
<execution>
<id>set-build-time</id>
<phase>process-sources</phase>
<configuration>
<tasks>
<tstamp>
<format property="build.timestamp" pattern="yyyy/MM/dd HH:mm:ss"/>
<format property="build.time" pattern="HH:mm:ss" />
<format property="build.date" pattern="MM/dd/yyyy" />
<format property="build.year" pattern="yyyy"/>
</tstamp>
<replaceregexp byline="true">
<regexp pattern="copyYear\=.*" />
<!--suppress MavenModelInspection -->
<substitution expression="copyYear=${build.year}" />
<fileset dir="src/main/java/" includes="**/*.properties" />
</replaceregexp>
<replaceregexp byline="true">
<regexp pattern="buildTime\=.*" />
<!--suppress MavenModelInspection -->
<substitution expression="buildTime=${build.date} ${build.time}" />
<fileset dir="src/main/java/" includes="**/*.properties" />
</replaceregexp>
</tasks>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
After doing more research I determined I was barking up the wrong tree.
Since I was trying to get Javadoc comments, a Doclet was the better answer.
So I implemented a custom Doclet and wired it up to run automatically, as described in the follow-up question and answer below.
How can I compile and run my Custom Doclet class in my project?
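For reference, the entry point of such a Doclet is quite small. Here is a minimal sketch, assuming the pre-JDK-9 com.sun.javadoc Doclet API; the class name and the class-descriptions.properties output file are placeholders chosen for illustration:

import java.io.FileWriter;
import java.io.IOException;
import java.util.Properties;

import com.sun.javadoc.ClassDoc;
import com.sun.javadoc.RootDoc;

public class DescriptionDoclet {

    // Standard doclet entry point: invoked by the javadoc tool.
    public static boolean start(RootDoc root) {
        Properties descriptions = new Properties();
        for (ClassDoc cls : root.classes()) {
            // Store the class-level Javadoc comment keyed by the fully qualified class name.
            descriptions.setProperty(cls.qualifiedName(), cls.commentText());
        }
        try (FileWriter out = new FileWriter("class-descriptions.properties")) {
            descriptions.store(out, "Generated from Javadoc");
        } catch (IOException e) {
            root.printError("Could not write class descriptions: " + e.getMessage());
            return false;
        }
        return true;
    }
}

Wired into the javadoc tool at build time (for example through the maven-javadoc-plugin's doclet configuration), this produces a properties file that the sitemap page can load from the classpath at runtime to fill <siteMap:Description>.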
I want to share test resources between two modules, A and B. In A's test resources I have a directory with some files, like this:
-dir
----f1
----f2
I've done everything according to Share test resources between maven projects. Now from B's tests I can access the resources using syntax like:
this.getClass().getClassLoader().getResource("dir/f1")
And it works perfectly fine. But I don't want to hardcode all the file names like f1 or f2. What I really want is to get all the files from the directory, like:
File foo = new File(this.getClass().getClassLoader().getResource("dir").getFile());
assert foo.isDirectory()
foo.list()
...
But when I create foo this way it doesn't even exist (foo.exists() returns false).
How can I deal with this?
Update: an alternative solution using Ant
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.5</version>
<executions>
<execution>
<configuration>
<target>
<fileset id="d" dir="${basedir}/src/test/resources/dir" includes="*"/>
<pathconvert property="d" refid="d">
<map from="${basedir}/src/test/resources/" to=""/>
</pathconvert>
<touch file="${basedir}/src/test/resources/d/listOfCharts.txt"/>
<echo file="${basedir}/src/test/resources/d/listOfCharts.txt" message="${charts}"/>
</target>
</configuration>
<phase>package</phase>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
The approach you talk about creates a JAR which can be used in later tests. You can't list the contents of a JAR this way; you need to read it as a JAR specifically:
How do I list the files inside a JAR file?
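To make that concrete, here is a hedged sketch of enumerating everything under dir/ from the shared test JAR, falling back to a plain directory listing when the resource is not packaged (e.g. when tests run against target/test-classes). The class and method names are illustrative only:

import java.io.File;
import java.io.IOException;
import java.net.JarURLConnection;
import java.net.URL;
import java.util.ArrayList;
import java.util.Enumeration;
import java.util.List;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;

public class JarResourceLister {

    public static List<String> listDirEntries() throws IOException {
        List<String> names = new ArrayList<String>();
        URL url = JarResourceLister.class.getClassLoader().getResource("dir");
        if (url == null) {
            return names;
        }
        if ("jar".equals(url.getProtocol())) {
            // The resource lives inside a JAR: open it and walk its entries.
            JarURLConnection connection = (JarURLConnection) url.openConnection();
            JarFile jar = connection.getJarFile(); // cached by the URL handler, do not close
            for (Enumeration<JarEntry> entries = jar.entries(); entries.hasMoreElements(); ) {
                JarEntry entry = entries.nextElement();
                if (entry.getName().startsWith("dir/") && !entry.isDirectory()) {
                    names.add(entry.getName());
                }
            }
        } else {
            // Exploded classpath (e.g. target/test-classes): a plain File listing works.
            String[] files = new File(url.getFile()).list();
            if (files != null) {
                for (String file : files) {
                    names.add("dir/" + file);
                }
            }
        }
        return names;
    }
}

Note that getResource("dir").getFile() only behaves like a normal directory in the exploded case, which is why the File-based code in the question reports exists() == false once the resources are packaged into a JAR.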
We are using ANT for our build process and don't have plans to change this in the near future.
Is it possible to use Maven to just fetch common open-source JAR files (e.g. Log4J, SWT, JFace) and put them in the right location in our project, so we don't have to store them in our version control (preferably without creating the typical Maven cache in the home directory)?
NO NO NO Everyone!
If you're using Ant, the best way to use Maven repositories to download jar dependencies is to use Ivy with Ant. That's exactly what Ivy is for.
Installing Ivy and getting it to work with existing Ant projects is simple to do. It works with Nexus and Artifactory if you use those as your local Maven repositories.
Take a look at Ivy. It is probably exactly what you want.
As a variation on org.life.java's answer, I would not do mvn install.
Instead, in the pom.xml I would add the following bit:
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>2.1</version>
<executions>
<execution>
<phase>generate-sources</phase>
<goals>
<goal>copy-dependencies</goal>
</goals>
<configuration>
<outputDirectory>${project.build.directory}/lib</outputDirectory>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
Now you just need to do mvn generate-sources, which is a lot faster than the full mvn install, and all dependencies will be copied to the specified directory.
Oh btw, isn't that what Apache Ivy is about? Extending Ant to understand Maven's dependency management?
It's possible: you should use maven-ant-tasks, in particular its dependencies Ant task. With this setup no Maven installation is required.
<?xml version="1.0" encoding="UTF-8"?>
<project
name="download-dependency"
basedir="."
default="download-dependency"
xmlns:artifact="antlib:org.apache.maven.artifact.ant"
>
<target name="download-dependency">
... define properties ...
<taskdef
resource="org/apache/maven/artifact/ant/antlib.xml"
uri="antlib:org.apache.maven.artifact.ant"
/>
<artifact:dependencies>
<localRepository path="${local-repo.dir}"/>
<remoteRepository id="central" url="${repository-uri}"/>
<dependency
groupId="${groupId}"
artifactId="${artifactId}"
version="${version}"
type="${type}"
classifier="${classifier}"
scope="runtime"
/>
</artifact:dependencies>
</target>
</project>
The only binary you should check into your project is maven-ant-tasks.jar.
Actually, in our project I used the Sonatype Nexus (documentation) Maven repository manager to centralize access to different repositories and even to maintain some binaries unique to our environment. With Nexus' help I just fetch maven-ant-tasks.jar with Ant's <get> task from a known URL. You don't have to use Nexus, but it greatly speeds up builds, because it caches binaries close to your developers' machines.
Ivy does just this when it bootstraps itself:
http://ant.apache.org/ivy/history/latest-milestone/samples/build.xml
<property name="ivy.install.version" value="2.0.0-beta1"/>
<property name="ivy.jar.dir" value="lib"/>
<property name="ivy.jar.file" value="${ivy.jar.dir}/ivy.jar"/>
<target name="resolve" unless="skip.download">
<mkdir dir="${ivy.jar.dir}"/>
<echo message="installing ivy..."/>
<get src="http://repo1.maven.org/maven2/org/apache/ivy/ivy/${ivy.install.version}/ivy-${ivy.install.version}.jar" dest="${ivy.jar.file}" usetimestamp="true"/>
</target>