What is the best (easiest) way to reverse-engineer POJOs from a database? I would like to generate probably 40 entity classes from tables, just to save a bunch of typing. I would like to use the Hibernate Tools toolset, but all the examples seem incomplete or contradictory: some reference Ant tasks, some reference Maven plugins, and the JBoss site itself indicates that Hibernate Tools 4.x now seems to be an Eclipse plugin!
What is the "correct" way to do this, starting from scratch?
I actually wound up using an Ant task. If you have a situation where you need to reverse-engineer POJOs from a database, and you have no existing infrastructure in place, I believe the Ant method is best. I started with this excellent blog post and was able to cut and paste most of the code I needed. I found through experimentation that some additional JARs were needed and after some tweaking was able to generate the POJOs I needed in fairly short order.
This assumes that you know basic Java terminology and a little about Ant, and have both installed. Here are the steps.
You'll need to create two files (build.xml and hibernate.cfg.xml) and download some JARs. You may also need to download the Hibernate DTD files if you are behind a proxy or firewall (since Hibernate will try to go out and read the DTDs). That's it.
Create the following directories:
/myantproject
    /lib
    /src
In your "myantproject" directory create your build.xml file as follows:
<project name="antbuild" basedir="." default="gen_hibernate">
<taskdef name="hibernatetool"
classname="org.hibernate.tool.ant.HibernateToolTask">
<classpath>
<fileset dir="lib">
<include name="**/*.jar"/>
</fileset>
</classpath>
</taskdef>
<target name="gen_hibernate"
description="generate hibernate classes">
<hibernatetool>
<jdbcconfiguration
configurationfile="hibernate.cfg.xml"
packagename="com.mycompany.model"
detectmanytomany="true"
/>
<hbm2hbmxml destdir="src" />
<hbm2java destdir="src" />
</hibernatetool>
</target>
</project>
Also in the "myantproject" directory create your hibernate.cfg.xml file as follows:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE hibernate-configuration PUBLIC
"-//Hibernate/Hibernate Configuration DTD 3.0//EN"
"http://www.hibernate.org/dtd/hibernate-configuration-3.0.dtd" >
<hibernate-configuration>
<session-factory>
<property name="hibernate.connection.driver_class">com.ibm.as400.access.AS400JDBCDriver</property>
<property name="hibernate.connection.url">jdbc:as400://myserver;libraries=MYLIB;dateformat=iso;timeformat=iso;prompt=false;naming=system;transaction isolation=none</property>
<property name="hibernate.connection.username">myuser</property>
<property name="hibernate.connection.password">mypassword</property>
<property name="hibernate.dialect">org.hibernate.dialect.DB2400Dialect</property>
</session-factory>
</hibernate-configuration>
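Before running the Ant task, it can help to sanity-check the connection settings with plain JDBC. This is only a minimal sketch using the driver and credentials from the config above (the URL is shortened here; use your full connection URL), not part of the generation itself:

import java.sql.Connection;
import java.sql.DriverManager;

public class ConnectionCheck {
    public static void main(String[] args) throws Exception {
        // Same driver class as in hibernate.cfg.xml above
        Class.forName("com.ibm.as400.access.AS400JDBCDriver");
        // Use the full hibernate.connection.url from hibernate.cfg.xml here
        try (Connection conn = DriverManager.getConnection(
                "jdbc:as400://myserver;libraries=MYLIB;naming=system",
                "myuser", "mypassword")) {
            System.out.println("Connected to: " + conn.getMetaData().getDatabaseProductName());
        }
    }
}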
If you are behind a firewall/proxy, you can download the DTD and change the DTD reference in the file to this (make sure you edit it to point to your actual file location):
"file:///mypath/myantproject/lib/hibernate-configuration-3.0.dtd"
You can then download the DTD from the original URL and stick it in your "lib" directory.
Here are the JARs I wound up with. With these JARs, you should be able to run this Ant task and it will reverse-engineer all the tables in the database you have pointed to in "hibernate.cfg.xml".
cglib-nodep-2.2.3.jar
commons-collections-3.2.1.jar
commons-logging-1.1.1.jar
dom4j-1.6.1.jar
freemarker-2.3.8.jar
hibernate-annotations-3.5.0-Final.jar
hibernate-commons-annotations-4.0.1.Final.jar
hibernate-configuration-3.0.dtd
hibernate-core-3.3.1.GA.jar
hibernate-entitymanager-4.2.0.Final.jar
hibernate-tools-3.2.3.GA.jar
jt400-6.6.jar
jtidy-r938.jar
log4j-1.2.16.jar
slf4j-api-1.7.5.jar
These come from various sources - most either from apache.org or hibernate.org. You will need your database JDBC JAR from your database vendor (in this case an AS400 connector jar from IBM) to connect to the database. I also needed to download these DTDs since I was behind a firewall:
hibernate-mapping-3.0.dtd
hibernate-reverse-engineering-3.0.dtd
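For reference, the classes that hbm2java writes under "src" are plain POJOs with fields, a default constructor and getters/setters. Here is a minimal sketch of what one typically looks like (the class and property names are hypothetical; the real output depends on your tables):

package com.mycompany.model;

// Hypothetical example of a generated entity; actual fields mirror your table columns.
public class Customer implements java.io.Serializable {

    private Integer id;
    private String name;

    public Customer() {
    }

    public Integer getId() {
        return this.id;
    }

    public void setId(Integer id) {
        this.id = id;
    }

    public String getName() {
        return this.name;
    }

    public void setName(String name) {
        this.name = name;
    }
}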
Good luck!
Related
I have a task in build.xml that downloads all dependencies to the cache:
<target name="init" depends="init-ivy">
...
<ivy:cachepath
inline="true"
module="jersey-container-servlet"
organisation="org.glassfish.jersey.containers"
pathid="jersey.classpath"
revision="2.23.2"/>
<ivy:cachepath
inline="true"
module="javax.json"
organisation="org.glassfish"
pathid="json.classpath"
revision="1.0.4"/>
...
</target>
The code compiles successfully and a WAR file is created. Now I need to write a task that deploys the app to Tomcat. I need to copy all dependencies to the app's WEB-INF/lib. How is this done? Maybe there is a way to include the dependencies' JARs in the WAR file? I am new to Java development, please help.
The following answer outlines a comprehensive solution using an ivy file:
Ivy dependency as provided
It answers a different question ("provided" dependencies), but it is one you will eventually face, because not all the JARs you use in your build need to be shipped with your application (some already exist on Tomcat).
Attempting to apply this answer to your question is not straightforward because you're resolving your dependencies in inline mode (no ivy file). Firstly, I'd recommend combining your dependencies into a single path rather than creating a separate path for each dependency:
<ivy:cachepath pathid="compile.classpath">
    <dependency org="org.glassfish" name="javax.json" rev="1.0.4" />
    <dependency org="org.glassfish.jersey.containers" name="jersey-container-servlet" rev="2.23.2" />
</ivy:cachepath>
Secondly (and to answer your question), it's the alternate ivy retrieve task that is used to place the dependency JARs on the file system. It too supports an inline resolution, as follows:
<ivy:retrieve pattern="${build.dir}/lib/[artifact].[ext]">
    <dependency org="org.glassfish" name="javax.json" rev="1.0.4" />
    <dependency org="org.glassfish.jersey.containers" name="jersey-container-servlet" rev="2.23.2" />
</ivy:retrieve>

<war destfile="${war.file}" webxml="${resources.dir}/web.xml">
    <fileset dir="${resources.dir}" excludes="web.xml"/>
    <lib dir="${build.dir}/lib"/>
</war>
So in conclusion, while this suggested answer will work, I would recommend investigating how configurations work in concert with an external ivy file to manage your dependencies. Configurations may appear challenging, but they're also very powerful.
Your other question is related. Using ivy's inline mode is convenient, but it is not the most efficient way to use ivy. A single call to the resolve task can determine all of a project's dependencies, and configurations can then be used to partition these into various classpaths, filesets, etc.
how to get ivy:cachepath location without checking if dependencies downloaded
Short Version
I've got a Java project which uses JPA 2.0 with Hibernate 4.3.4. All is fine when I run it inside Eclipse. But when I let Eclipse export a runnable JAR, the trouble begins and the program crashes due to a seemingly missing persistence unit...
javax.persistence.PersistenceException: No Persistence provider for EntityManager named MyDBManager
at javax.persistence.Persistence.createEntityManagerFactory(Persistence.java:61)
... or seemingly unmapped classes...
3024 Thread-4| FATAL DbManager : DBManager could not load countries from database.
java.lang.IllegalArgumentException: org.hibernate.hql.internal.ast.QuerySyntaxException: Country is not mapped [SELECT x FROM Country x]
at org.hibernate.jpa.spi.AbstractEntityManagerImpl.convert(AbstractEntityManagerImpl.java:1750)
at org.hibernate.jpa.spi.AbstractEntityManagerImpl.convert(AbstractEntityManagerImpl.java:1677)
at org.hibernate.jpa.spi.AbstractEntityManagerImpl.convert(AbstractEntityManagerImpl.java:1683)
at org.hibernate.jpa.spi.AbstractEntityManagerImpl.createQuery(AbstractEntityManagerImpl.java:331)
Background
Depending on the kind of export (extracting vs. packaging vs. copying libraries), I run into different errors that I have not been able to solve. The furthest I get is with the last approach, which is also the one I have to choose for license reasons, so let's focus on that one.
In this case the exported JAR fails to look into its persistence.xml. I will specify that later but first some background information...
Folder Structure
some_folder
    myproject_lib
    myproject.jar
        root of my project's package structure
        meta-inf
            persistence.xml
File persistence.xml
<persistence xmlns="http://java.sun.com/xml/ns/persistence" version="2.0">
    <persistence-unit name="MyDBManager" transaction-type="RESOURCE_LOCAL">
        <!-- <exclude-unlisted-classes>false</exclude-unlisted-classes> -->
        <!-- <class>isi.eload.core.Country</class> -->
        <properties>
            <property name="javax.persistence.jdbc.driver" value="com.mysql.jdbc.Driver" />
            <property name="hibernate.dialect" value="org.hibernate.dialect.MySQLDialect" />
            <!-- Do not define a connection here - this is done by the DbManager according to the command line arguments -->
            <property name="hibernate.id.new_generator_mappings" value="true" />
            <!-- <property name="hibernate.archive.autodetection" value="class, hbm" /> -->
        </properties>
    </persistence-unit>
</persistence>
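For context, the bootstrap code and the failing query boil down to roughly the following (the DbManager shown here is only a sketch reconstructed from the stack traces above; Country is one of my annotated entity classes):

import java.util.List;
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

public class DbManager {

    public void loadCountries() {
        // The unit name must match <persistence-unit name="MyDBManager"> above
        EntityManagerFactory emf = Persistence.createEntityManagerFactory("MyDBManager");
        EntityManager em = emf.createEntityManager();
        try {
            // This is the query that fails with "Country is not mapped" in the exported JAR
            List<?> countries = em.createQuery("SELECT x FROM Country x").getResultList();
            System.out.println("Loaded " + countries.size() + " countries");
        } finally {
            em.close();
            emf.close();
        }
    }
}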
I played around with the commented lines once I felt that the XML was actually being processed (see below), but that didn't help.
JPA/Hibernate JARs
Essentially the ones from the 4.3.4 Final Release:
antlr-2.7.7
dom4j-1.6.1
hibernate-commons-annotations-4.0.4.Final
hibernate-core-4.3.4.Final
hibernate-entitymanager-4.3.4.Final
hibernate-jpa-2.1-api-1.0.0.Final
jandex-1.1.0.Final
javassist-3.18.1-GA
jboss-logging-3.1.3.GA
jboss-logging-annotations-1.2.0.Beta1
jboss-transaction-api_1.2_spec-1.0.0.Final
Failing with persistence.xml
Packaged meta-inf
As I hinted at before, the exported JAR fails to properly process the persistence.xml. When I execute it in the above folder structure, the following exception is thrown:
javax.persistence.PersistenceException: No Persistence provider for EntityManager named MyDBManager
at javax.persistence.Persistence.createEntityManagerFactory(Persistence.java:61)
This exception is usually thrown when the file was found but the persistence unit name given to Persistence.createEntityManagerFactory does not match any persistence unit declared in the file. But that is definitely not the case here!
I have no good idea why this exception is thrown.
When I edit the JAR file and empty or remove the persistence.xml (though I'm not sure such tampering doesn't cause problems of its own), the error stays the same.
Another meta-inf
My first response was to copy a meta-inf folder next to the JAR:
myproject_lib
myproject.jar
    ... unchanged ...
meta-inf
    persistence.xml
This seems to work, as an entity manager factory can now be created. But then no entities are found, and I think this is related to the fact that the persistence.xml which is actually used is not "on the same class path" as the JAR file.
Is there a link or an idea for how I can fix this? Preferably by forcing the JAR file to use the meta-inf folder which it contains itself.
META-INF needs to be in upper case. When Java accesses the filesystem on Windows (or OS X, for that matter), a lookup for META-INF/persistence.xml will still find meta-inf/persistence.xml because the operating system treats the names as equivalent. Once you package it up into a JAR, the lookup becomes case sensitive and stops working.
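A quick way to see what the exported JAR actually exposes is a classpath lookup like the following (just a sketch; run it with the exported JAR on the classpath):

public class PersistenceXmlCheck {
    public static void main(String[] args) {
        // Prints null if the JAR only contains meta-inf/persistence.xml (lower case),
        // because resource lookups inside a JAR are case sensitive.
        java.net.URL url = PersistenceXmlCheck.class.getClassLoader()
                .getResource("META-INF/persistence.xml");
        System.out.println("META-INF/persistence.xml -> " + url);
    }
}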
I've been doing static analysis on Java projects, which usually boils down to running javac2 #a_list_of_all_the_java_files_in_the_project, where javac2 is my modified compiler. Except, finding the right libraries to make everything compile is difficult.
I'm working with a project now (incidentally, Eclipse SDK 3.7.1) that has a file artifacts.xml in the root folder. This file looks useful. My understanding so far is that it tells Eclipse which libraries to use when opening the folder as an Eclipse project. If so, I'd like to download these libraries locally and reference them in my custom compilation command.
Can someone explain the purpose of artifacts.xml, and optionally offer feedback on my approach? Ultimately all I want is to be able to compile the project on the command line using a nonstandard compiler.
First few lines of artifacts.xml
<?xml version='1.0' encoding='UTF-8'?>
<?artifactRepository version='1.1.0'?>
<repository name='Bundle pool' type='org.eclipse.equinox.p2.artifact.repository.simpleRepository' version='1'>
  <properties size='2'>
    <property name='p2.system' value='true'/>
    <property name='p2.timestamp' value='1315600353875'/>
  </properties>
  <mappings size='3'>
    <rule filter='(&amp; (classifier=osgi.bundle))' output='${repoUrl}/plugins/${id}_${version}.jar'/>
    <rule filter='(&amp; (classifier=binary))' output='${repoUrl}/binary/${id}_${version}'/>
    <rule filter='(&amp; (classifier=org.eclipse.update.feature))' output='${repoUrl}/features/${id}_${version}.jar'/>
  </mappings>
  <artifacts size='405'>
    <artifact classifier='osgi.bundle' id='org.eclipse.ecf.provider.filetransfer.ssl' version='1.0.0.v20110531-2218'>
      <properties size='1'>
        <property name='download.size' value='8460'/>
      </properties>
    </artifact>
This is the file that the Eclipse 'p2' install system uses to describe a repository of installable artifacts. The file is sometimes compressed into an artifacts.jar file.
Eclipse p2 is described here: http://wiki.eclipse.org/Equinox/p2
A maven artifact, in general, is a file that gets deployed to a maven repo.
artifacts.xml is the canonical way of listing everything that needs to be sent to said repository.
Check this previous post for more information:
What is a Maven artifact?
I have tried setting different properties and attributes (debug="true"), but it didn't work.
This is from our build.xml (just showing the parts relating to the build step):
<!-- Get the environment -->
<property environment="env" />

<!-- Target: all -->
<target name="all" depends="build, test, export">
</target>

<!-- Target: build -->
<target name="build">
    <ant4eclipse:executeProjectSet workspaceDirectory="${env.WORKSPACE}" teamprojectset="${env.WORKSPACE}\${env.JOB_NAME}\projectSet.psf">
        <ant4eclipse:forEachProject filter="(executeProjectSet.org.eclipse.jdt.core.javanature=*)">
            <buildJdtProject workspaceDirectory="${env.WORKSPACE}" projectName="${executeProjectSet.project.name}" targetLevel="1.6" />
        </ant4eclipse:forEachProject>
    </ant4eclipse:executeProjectSet>
</target>
Detailed description:
An internal project consists of a large number of classes and some applications, all written in Java. Everything runs just fine when started from within Eclipse.
After each commit to our SVN repository, the project is built using ant4eclipse on our Hudson installation, and if the tests pass, a zip is automatically created and copied to a file server; it is used by simply unpacking it and starting the supplied startup batch script.
Now last week a colleague informed me that the version from the file server doesn't work for him. I checked and am able to reproduce the problem: loading data from a database doesn't work. No exception is shown in the log/console and I have no idea what goes wrong. Everything works when started from within Eclipse (same vmargs, same JVM, etc.).
When trying to connect the debugger, it seems like no debug info is present ("line numbers missing" etc.). So I now need to find out how to convince ant4eclipse to include debug info.
In the meantime, I found out how to do this myself: I added a default compiler options file like this (attribute defaultCompilerOptionsFile):
<!-- Target: build -->
<target name="build">
    <ant4eclipse:executeProjectSet workspaceDirectory="${env.WORKSPACE}" teamprojectset="${env.WORKSPACE}\${env.JOB_NAME}\projectSet.psf">
        <ant4eclipse:forEachProject filter="(executeProjectSet.org.eclipse.jdt.core.javanature=*)">
            <buildJdtProject
                workspaceDirectory="${env.WORKSPACE}"
                projectName="${executeProjectSet.project.name}"
                targetLevel="1.6"
                defaultCompilerOptionsFile="compilerOptions.prefs"/>
        </ant4eclipse:forEachProject>
    </ant4eclipse:executeProjectSet>
</target>
The compiler options file is just a copy of .metadata\.plugins\org.eclipse.core.runtime\.settings\org.eclipse.jdt.core.prefs from inside the workspace. Make sure to set the desired options inside the file:
org.eclipse.jdt.core.compiler.debug.lineNumber=generate
org.eclipse.jdt.core.compiler.debug.localVariable=generate
org.eclipse.jdt.core.compiler.debug.sourceFile=generate
I haven't tested whether it works if you create a compiler options file that contains just the three lines above.
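One way to verify that the generated classes really contain line-number information is a small runtime check like the one below. This is only a sketch: the class has to be compiled by the same ant4eclipse build (or you throw and inspect an exception from one of the project's own classes) for the result to be meaningful.

public class DebugInfoCheck {
    public static void main(String[] args) {
        // getLineNumber() returns a negative value when the calling class
        // was compiled without line-number debug information.
        StackTraceElement here = new Throwable().getStackTrace()[0];
        System.out.println(here.getClassName() + " line: " + here.getLineNumber());
    }
}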
We have a project called web-app1 that has a dependency on another JAR file, core-app.jar, which is provided by another team as a shared library. There is a hibernate.cfg.xml inside this core-app.jar (inside the JAR itself), with the following content:
<hibernate-configuration>
    <session-factory>
        <property name="dialect">${hibernate.dialect}</property>
        <property name="query.substitutions"><![CDATA[false 'N', true 'Y']]></property>
        <property name="show_sql">false</property>
        <property name="format_sql">false</property>
        <property name="use_sql_comments">false</property>
        <property name="generate_statistics">true</property>
        <property name="hibernate.connection.release_mode">after_transaction</property>

        <!-- Search Configurations -->
        <property name="hibernate.search.default.directory_provider">org.hibernate.search.store.FSDirectoryProvider</property>
        <property name="hibernate.search.default.indexBase">${lucene.index.home}</property>
        <property name="hibernate.search.default.batch.merge_factor">10</property>
        <property name="hibernate.search.default.batch.max_buffered_docs">10</property>
    </session-factory>
</hibernate-configuration>
As you can see in the Search Configurations section, there is a variable ${lucene.index.home} that is meant to be replaced by the projects that use the JAR, on different OS platforms.
So the question: does Maven provide a way to filter a dependency JAR and replace content inside it? Any plugins? war:war, unzip, the dependency plugin? I couldn't figure out a fast way to do that. It looks to me like, no matter which plugin is adopted, it would basically need to do four things:
1. Unpack the JAR in the process-resources phase.
2. Substitute the ${var} placeholders with the values defined in a profile.
3. Pack it back into a JAR.
4. Copy it from the packing/unpacking workspace back to the Maven process path(?)
Did anyone run into a similar requirement before?
Thanks
I would assume that those values are meant to be set at runtime, likely as VM arguments. It doesn't make sense to provide a JAR file that has to be modified before it can be used.
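To make the "set at runtime" suggestion concrete, here is a rough sketch of what the consuming application can do instead of filtering the JAR. Whether core-app exposes a hook like this, and the exact property names you have to override, are assumptions on my part:

import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;

public class SessionFactoryBuilder {

    public static SessionFactory build() {
        // Loads the hibernate.cfg.xml that ships inside core-app.jar
        Configuration cfg = new Configuration().configure();

        // Resolve the placeholders from JVM arguments at runtime, e.g.
        //   -Dhibernate.dialect=org.hibernate.dialect.MySQLDialect
        //   -Dlucene.index.home=/var/data/lucene-index
        cfg.setProperty("hibernate.dialect", System.getProperty("hibernate.dialect"));
        cfg.setProperty("hibernate.search.default.indexBase",
                System.getProperty("lucene.index.home"));

        return cfg.buildSessionFactory();
    }
}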
If you really, really, REALLY have to do filtering at build time for configuration purposes, it is those configuration files that should be filtered, NOT your dependencies. Then you should either bundle the filtered file into multiple artifacts (assuming, of course, you are targeting multiple environments), or provide it outside the built artifact as an externalized resource.