How to check the GORM version of a Grails application

I am debugging code that throws errors on certain fields during a database query and in some constructor actions of a load method.
We are setting properties manually on an object before creating an instance of a class/model, but for some reason these fields (DateCreated, DateUpdated) are returning null shortly after we pass them.
We have verified that these fields are in fact NOT null up to this point, so our concern is that GORM is discarding and attempting to replace these fields because of its naming conventions, and that perhaps we should stop setting them manually and let GORM manage them.
Is there a way to see which version of GORM the Grails app is using, and perhaps get some understanding of how it manages these fields and uses its own generated constructor for this object?

It depends on which version of Grails you are using. Assuming you are on a newer Grails version, the GORM version is specified in the gradle.properties file inside your Grails project and looks like this:
grailsVersion=3.3.2
gormVersion=6.1.8.RELEASE
gradleWrapperVersion=3.5
You can also find it in the Gradle dependency tree with this Gradle task:
./gradlew dependencies
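On Linux or macOS you can filter the (fairly long) report down to the GORM artifacts, for example:
./gradlew dependencies | grep -i gorm
You can also pass --configuration runtimeClasspath (or another configuration name) to limit the report to a single configuration.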

We are setting properties manually on an object before creating an
instance of a class/model...
It isn't possible to set properties on an object before creating it.
...so our concern is that the GORM is deleting and attempting to
replace these fields because of naming convention
You haven't said what the property names are but if they are dateCreated and lastUpdated, then GORM will treat those properties specially. If you don't want that, you can turn auto time stamping off, as shown below...
class Person {
    Date dateCreated
    Date lastUpdated

    static mapping = {
        // disable GORM's automatic timestamping of dateCreated/lastUpdated
        autoTimestamp false
    }
}
See section 8.1.9 at http://gorm.grails.org/6.1.8/hibernate/manual/index.html#eventsAutoTimestamping.

Related

How to avoid validation error "Cannot resolve column 'xy'" for embeddable classes in IntelliJ?

I'm using IntelliJ on a Java/Hibernate project. I've also assigned a data source to that project, so most of the JPA validation errors for non-existing columns are gone.
The only errors remaining are for columns which are defined in @Embeddable classes like:
@Embeddable
public class MyEmbeddedClass {
    @Column(name = "my_embedded_column")
    private String myEmbeddedColumn;
}
IntelliJ keeps warning me that these columns do not exist in the data model:
"Cannot resolve column 'my_embedded_column'"
Is there any way to make IntelliJ skip these JPA validation checks for @Embeddable classes without disabling the whole JPA validation functionality, or am I supposed to create a bug ticket for the JPA validation plugin?
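For reference, the owning entity might look like this (a hypothetical sketch only; the answer below calls these "embeddableId" classes, so an @EmbeddedId mapping is assumed):
import javax.persistence.EmbeddedId;
import javax.persistence.Entity;

// Hypothetical entity using MyEmbeddedClass as a composite key.
@Entity
public class MyEntity {

    @EmbeddedId
    private MyEmbeddedClass id;

    public MyEmbeddedClass getId() { return id; }
    public void setId(MyEmbeddedClass id) { this.id = id; }
}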
Same issue here. I followed @Andrey's suggestion and it worked for me.
I moved my @Embeddable classes into a subpackage (org.mydomain.back.entities.ids).
I defined a scope, "MyCustomScope", explicitly excluding this subpackage (a possible pattern is sketched after the list below). Then I configured "Inspections --> JPA --> Unresolved database references in annotations" like this:
Severity by Scope:
- MyCustomScope (allButEmbeddedKeys): Checked and "Error".
- Everywhere else: Unchecked.
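The scope pattern itself might look roughly like the following (illustrative only; the exact pattern syntax and package prefix are assumptions based on the description above):
src:org.mydomain..*&&!src:org.mydomain.back.entities.ids..*
i.e. everything under org.mydomain except the subpackage holding the @Embeddable id classes.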
It's more a workaround than an actual solution, because JPA inspection is disabled in those classes, with all the inconveniences that could cause. But all errors in the "embeddableId" classes are gone, and the same applies to new classes defined in that subpackage.
Hope it helps!

How do I avoid content fields in Joda objects?

I'm using Joda objects (DateTime and DateTimeZone) in a document, and whenever I access it via the REST interface I get entries with fields like this:
lastAggregationDate: { content: "2016-07-12T17:58:43.643Z" }
instead of
lastAggregationDate: "2016-07-12T17:58:43.643Z"
I have the Joda Jackson dependencies declared and I see the de/serializers for these types so I'm puzzled as to what's at work here.
I've duplicated this behavior in a slightly modified Spring sample project but using Java's native date types rather than Joda's. I've added a date of birth property to the Person object and modified the shouldRetrieveEntity test to look for $.dateOfBirth.content. I've confirmed the serializer is being used and it seems like the LocalDate object is being treated as a resource rather than as a simple property.
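For reference, the modified sample entity might look roughly like this (a hypothetical sketch only; apart from dateOfBirth the field names are assumptions, and the entity is shown as a MongoDB document to match the DATAMONGO ticket referenced below):
import java.time.LocalDate;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
public class Person {

    @Id
    private String id;

    private String firstName;
    private String lastName;

    // Rendered by Spring Data REST as { "content": "..." } instead of a plain string.
    private LocalDate dateOfBirth;

    // getters and setters omitted
}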
This is fixed in Spring Data Hopper-SR4:
https://jira.spring.io/browse/DATAMONGO-1498
The issue results from Spring Boot not setting up MongoMappingContext correctly. A ticket has been created for Spring Boot and the fix is anticipated for the 1.4.1 release (credit for this answer goes to Oyku Gencay and Oliver Gierke). For more detail, see the ticket or the pull request.

Hibernate + JodaTime mapping different types

I am using Hibernate reverse engineering and trying to get my timestamps to map to a Joda-Time type.
I have set up my hibernate.reveng.xml file properly:
<sql-type jdbc-type="TIMESTAMP" hibernate-type="org.joda.time.contrib.hibernate.PersistentDateTime" not-null="true"></sql-type>
The issue is that when I run the rev-eng process, my Java classes also get their members created as PersistentDateTime objects, but I don't want that because they are not usable. I need the Java fields to be org.joda.time.DateTime.
So I tried creating a custom reverse engineering strategy:
import java.sql.Types;

import org.hibernate.cfg.reveng.DelegatingReverseEngineeringStrategy;
import org.hibernate.cfg.reveng.ReverseEngineeringStrategy;
import org.hibernate.cfg.reveng.TableIdentifier;

public class C3CustomRevEngStrategy extends DelegatingReverseEngineeringStrategy {

    public C3CustomRevEngStrategy(ReverseEngineeringStrategy res) {
        super(res);
    }

    @Override
    public String columnToHibernateTypeName(TableIdentifier table, String columnName, int sqlType, int length,
            int precision, int scale, boolean nullable, boolean generatedIdentifier) {
        // Map TIMESTAMP columns to Joda-Time DateTime; everything else keeps the default mapping.
        if (sqlType == Types.TIMESTAMP) {
            return "org.joda.time.DateTime";
        } else {
            return super.columnToHibernateTypeName(table, columnName, sqlType, length, precision, scale,
                    nullable, generatedIdentifier);
        }
    }
}
My thought was that the Hibernate mapping files would get the hibernate.reveng.xml settings and the Java objects would get the settings from the custom strategy file, but that was not the case. Both the mapping file and the object are of type "org.joda.time.DateTime", which is not what I want.
How can I achieve my goal? Also, I am NOT using annotations.
Hibernate 3.6
JodaTime 2.3
JodaTime-Hibernate 1.3
Thanks
EDIT: To clarify exactly what the issue is
After reverse engineering this is what I get in my mapping file and POJO class
<property name="timestamp" type="org.joda.time.contrib.hibernate.PersistentDateTime">
private PersistentDateTime timestamp;
As a POJO property, PersistentDateTime is useless to me because I cannot do anything with it, such as time manipulation. So this is what I want after my reverse engineering:
<property name="timestamp" type="org.joda.time.contrib.hibernate.PersistentDateTime">
private org.joda.time.DateTime timestamp;
Using the Jadira library as suggested below gives me the same result, a POJO that I cannot use.
The JodaTime-Hibernate library is deprecated, and it is probably the source of your problem. Don't despair, however, as there is a (better) alternative.
You will need to use the Jadira Usertype library to create the correct Joda-Time objects from Hibernate. Add the library (which can be found here) to your project classpath and then change your type to org.jadira.usertype.dateandtime.joda.PersistentDateTime. All of the Joda-Time objects have a corresponding mapping in that package, so if you decide to change to another object, just update your type.
This should ensure that your objects get created correctly.
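For example, with XML mappings the edited property element would then look something like this (illustrative; the property and column names are taken from the question above):
<property name="timestamp"
          type="org.jadira.usertype.dateandtime.joda.PersistentDateTime"
          column="timestamp"/>
while the POJO field itself can be declared as org.joda.time.DateTime.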
I should add a caveat to my answer, which is that I have never used the Jadira Usertype library with Hibernate 3. If it only supports Hibernate 4 (I don't see why it would, but...), let me know and I'll delete my answer.
The Hibernate Tools don't seem to separate Hibernate types from Java types. If you were using annotations, this would be clearer, as in that case you'd need an @Type annotation, which Hibernate will not generate at all. So using the exposed APIs won't help here.
Fortunately, Hibernate lets you plug into the actual code (or XML) generation after it does its processing. You can do that by replacing the Freemarker templates it uses to generate both XML and Java code. You'll need to use Ant for reverse engineering, however.
To start using Ant for this purpose (if you're not doing so already), you can either pull Hibernate Tools in as a build-time dependency using your dependency manager or download a JAR. Put the jar in Ant's classpath, but also extract the template files from it: since you're using XML configuration, you'll be interested in the /hbm directory. Then add the task using <taskdef> and, assuming that you extracted the templates to TPL_PATH/hbm/*.ftl, call it using <hibernatetool templatepath="TPL_PATH" ...>. (For more info, see below for a link to the docs). Using Ant also helps with automation, for example on CI servers where you won't have Eclipse.
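An Ant target along these lines might look roughly like this (a sketch only; paths, the classpath reference and the strategy class name are assumptions):
<taskdef name="hibernatetool"
         classname="org.hibernate.tool.ant.HibernateToolTask"
         classpathref="hibernate.tools.classpath"/>

<target name="reveng">
  <hibernatetool destdir="src/generated" templatepath="TPL_PATH">
    <jdbcconfiguration configurationfile="hibernate.cfg.xml"
                       revengfile="hibernate.reveng.xml"
                       reversestrategy="com.example.C3CustomRevEngStrategy"/>
    <!-- generate the *.hbm.xml mapping files and the POJOs -->
    <hbm2hbmxml/>
    <hbm2java/>
  </hibernatetool>
</target>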
You'll want to keep hibernate-type="org.joda.time.DateTime" in your hibernate.reveng.xml so that the Java files get generated correctly, but have the generated mapping XML use the Hibernate user type instead (the Jadira PersistentDateTime shown below, or the contrib one from your question). To do that, edit TPL_PATH/hbm/property.hbm.ftl, replace ${property.value.typeName} with ${javaType}, and assign javaType to the right value before it's used:
<#assign javaType=property.value.typeName>
<#-- the reveng.xml above maps TIMESTAMP columns to org.joda.time.DateTime;
     adjust the literal below if typeName comes back unqualified as just "DateTime" -->
<#if javaType == "org.joda.time.DateTime">
    <#assign javaType="org.jadira.usertype.dateandtime.joda.PersistentDateTime">
</#if>
<property
    name="${property.name}"
    type="${javaType}"
    ...
You might want to remove newlines to keep the generated XML clean.
This is all described in the Hibernate Tools documentation at http://docs.jboss.org/tools/archive/3.2.1.GA/en/hibernatetools/html/codegen.html#d0e6349, except that the documentation doesn't tell you exactly which templates you need to modify; you have to figure that out by reading the templates.

How to use the Hibernate optimistic locking version property on the front end?

Optimistic locking using the version attribute for an entity works fine and is easy to implement:
<version name="VERSION" type="int" column="EX_VERSION" />
The entity has a property of the following type:
private int VERSION;
public int getVERSION() { return VERSION; }
public void setVERSION(int VERSION) { this.VERSION = VERSION; }
So far, so good. Now service methods return a data transfer object (DTO) for the entity above, which the views display in HTML. For update pages, the VERSION attribute is stored in an HTML hidden field and submitted with the form.
The intent is to use the version property to ensure that a user's update will fail if the information displayed is accompanied by an old version.
The controller responds to a users update request by invoking a service method with the DTO containing the updated information (including the version property), and the service method in turn uses a data access object (DAO) to persist the changes:
public void update(SimpleDTO dto) {
    SimpleEntity entity = getSimpleDao().load(dto.getId());
    copyProperties(dto, entity); // all properties, including VERSION, copied to entity
    getSimpleDao().update(entity);
}
The problem is that the version property copied into the entity by copyProperties(...) is not respected by Hibernate. I tracked down the reason in the following forum: https://forum.hibernate.org/viewtopic.php?f=1&t=955893&p=2418068
In short, when load() is called, Hibernate caches the version property in the session cache, and it doesn't matter what its value is subsequently changed to. I agree that this is the correct behavior, but I have been instructed by the bosses to pass the version via an HTML form property (if there is a better pattern for this, I'd love to hear it).
One solution I am exploring now is to evict the entity from the session after its version has been set, using hibernateTemplate.evict(simpleEntity) before the update happens. I hope this works, but it doesn't seem efficient.
I would like to ask Hibernate to check the version property on the instance itself, rather than only from the session cache.
Thanks in advance for answers!
--
LES
Do you really need to use a DTO? You wouldn't have had this problem if you were passing the actual entity around, nor would you have to load the entity again, which isn't exactly great for performance.
But even if you do have a legitimate reason to use a DTO, I'm not quite grasping why you would try to update the version number on your freshly reloaded entity prior to saving. Consider the different scenarios possible in your workflow:
Entity is loaded initially, has version = V1
It's transferred to DTO which goes to UI, comes back and is ready to be saved.
Entity is loaded again, has version = V2
You have two possibilities now:
V1 == V2. Peachy, you don't have to do anything.
V1 is less than V2, meaning the entity was updated by someone else while you were editing it. There's no reason to set the version back to V1 and attempt to save, because saving will fail. You can either save it with V2 (thus overriding someone else's changes) or fail now, without involving Hibernate (see the sketch below).
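A minimal sketch of the "fail now" option, assuming a hypothetical getVersion() on the DTO and using Spring's OptimisticLockingFailureException (any exception of your choosing would do):
public void update(SimpleDTO dto) {
    SimpleEntity entity = getSimpleDao().load(dto.getId());

    // Fail fast if the row changed since the form was rendered, instead of
    // copying the stale version into the managed entity where Hibernate ignores it.
    if (dto.getVersion() != entity.getVERSION()) {
        throw new org.springframework.dao.OptimisticLockingFailureException(
                "Entity " + dto.getId() + " was modified by another user");
    }

    copyProperties(dto, entity); // copy the edited fields; the entity keeps its current VERSION
    getSimpleDao().update(entity);
}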

Running Hibernate Tools annotation generation without the "catalog" attribute

When I run my Hibernate Tools reverse engineering, it reads from the DB and creates a Java class for each table, plus a Java class for composite primary keys. That's great.
The problem is this line:
@Table(name="tst_feature"
    ,catalog="tstdb"
)
While the table name is required, the "catalog" attribute is not.
Sometimes I want to use "tstdb", and sometimes I want to use "tstdev".
I thought the database chosen depended on the JDBC connection URL, but when I change the JDBC URL to point to "tstdev", it still uses "tstdb".
So I know what must be done, I just don't know how it is done. My options are:
Suppress the generation of the "catalog" attribute. Currently I'm doing this manually (not very productive), or I could write a program that parses the Java files and removes the attribute, but I'm hoping I don't have to.
OR
Find a way to tell Hibernate to ignore the "catalog" attribute and use the schema that is explicitly specified. I don't know the exact setting I have to change to achieve this, or even if the option is available.
You need to follow 3 steps -
1) In hibernate.cfg.xml, add this property (as specified in the post above):
hibernate.default_catalog = MyDatabaseName
2) In hibernate.reveng.xml, add all the table filters like this (just this, no catalog name here):
table-filter match-name="MyTableName"
3) Regenerate the Hibernate code. (Illustrative snippets of both files follow below.)
You will not see any catalog name in any of the *.hbm.xml files.
I have used Eclipse Galileo and Hibernate-3.2.4.GA.
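In XML form, those two entries might look roughly like this (illustrative; the database and table names are just the examples used in the question above):
<!-- hibernate.cfg.xml -->
<session-factory>
  <property name="hibernate.default_catalog">tstdb</property>
  <!-- other connection properties -->
</session-factory>

<!-- hibernate.reveng.xml -->
<hibernate-reverse-engineering>
  <table-filter match-name="tst_feature"/>
</hibernate-reverse-engineering>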
There is a customization to the generation that controls which tables go in which catalog.
You can specify the catalogue manually (in the reveng file, <table> element) or programmatically (in your custom ReverseEngineeringStrategy class, if I remember well).
Also, I recently had to modify the generation templates.
See the reference documentation:
http://docs.jboss.org/tools/archive/3.0.1.GA/en/hibernatetools/html_single/index.html#hibernaterevengxmlfile (you can customize the catalogue of each of your tables manually)
https://www.hibernate.org/hib_docs/tools/viewlets/custom_reverse_engineering.htm (a screencast that explains a lot)
http://docs.jboss.org/tools/archive/3.0.1.GA/en/hibernatetools/html_single/index.html#d0e5363 (for customizing the templates; I start with the directory that's closest to my needs, copy all of them into my own directory, then edit at will)
Sorry, this could be more precise, but I don't have access to my work computer right now.
The catalog attribute is a "connection" attribute and should be specified in the "connection" config file, hibernate.cfg.xml, and NOT in a "data" config file (*.hbm.xml).
I generate Hibernate code via the Ant task <hibernatetool>, and I put this replace task after regeneration (replace schema-name with your database name):
<replace dir='../src' token='catalog="schema-name"' value=''/>
So, after generation, the catalog attribute has been removed.
This is a workaround, but the generated code works in my development and production environments with different schema names.
