I need help with JPA/Hibernate. I have added two new columns to a table:
CREATE TABLE DMU_TRACKED_SERIAL (
TSO_SPLIT_NUMBER varchar(6) PRIMARY KEY NOT NULL,
CREATED_BY varchar(100) DEFAULT 'APDMUIF',
CREATED_TS timestamp DEFAULT CURRENT TIMESTAMP NOT NULL, -- new column added
MODIFIED char(1) DEFAULT 'F' NOT NULL -- new column added
)
The entity class is TrackedSerial (attached). I’m using what I believe to be the correct annotations for the new getters, based on getters for the same type in other entity classes in the project:
#Column(name = "CREATED_TS", nullable = false, length = 26,
updatable = false, columnDefinition="TIMESTAMP DEFAULT CURRENT_TIMESTAMP")
#Temporal(TemporalType.TIMESTAMP)
public Date getCreatedTs() {
return createdTs;
}
#Column(name = "MODIFIED", nullable = false, columnDefinition = "char(1)")
#Convert(converter = BooleanCharacter.class)
public Boolean getModified() {
return modified;
}
Insertions seem to work fine, but although this query returns with no error:
@NamedNativeQuery(name = "getUnimportedTrackedSerials",
        query = "SELECT * FROM CADBOM.DMU_TRACKED_SERIAL WHERE modified = 'F'")
A few lines after the call that uses this query on line 73, an exception is thrown at line 79 (indicated by arrow)
java.lang.ClassCastException: [Ljava.lang.Object; incompatible with com.daimler.dmu.jpa.entities.dmu.TrackedSerial
at com.daimler.dmu.batch.jobs.PerSerialItemFetcher.execute(PerSerialItemFetcher.java:79)
Now this is very odd, because the variable t is the same class as the members of trackedSerials. There should be no need to cast, let alone throw an exception. I suspect Hibernate is doing some jiggery-pokery with reflection, creating objects that look like TrackedSerial but aren't exactly. Does anyone know what's going on here?
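For what it's worth, one common cause of exactly this symptom (not confirmed against the attached class) is a @NamedNativeQuery declared without a resultClass: Hibernate then returns raw Object[] rows, the List<TrackedSerial> compiles anyway because of type erasure, and the cast only blows up when an element is read. A minimal sketch of the mapped form, reusing the names from the query above:
// Sketch: tell Hibernate which entity the native query rows map to,
// so getResultList() returns TrackedSerial instances instead of Object[].
@NamedNativeQuery(name = "getUnimportedTrackedSerials",
        query = "SELECT * FROM CADBOM.DMU_TRACKED_SERIAL WHERE modified = 'F'",
        resultClass = TrackedSerial.class)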
Something very bizarre has been happening. I have a very simple entity, Recipe, like so:
@Data
@Entity
@Table(name = "recipe", schema = "public")
@AllArgsConstructor
@NoArgsConstructor
public class Recipe {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    @Column(name = "id", updatable = false, nullable = false)
    private long id;

    @Column(name = "name")
    private String name;

    @Column(name = "instructions")
    private String instructions;

    @Column(name = "date_added", nullable = false)
    private String dateAdded;

    @Column(name = "last_edited", nullable = false)
    private String lastEdited;
}
and I have this post service that should post the 4 String attributes to the database:
public void postRecipe(Recipe recipe){
var sql = """
INSERT INTO public.recipe ("name","instructions","date_added","last_edited")
VALUES (?, ?, ?, ?)
""";
jdbcTemplate.update(
sql,
recipe.getName(),
recipe.getInstructions(),
recipe.getDateAdded(),
recipe.getLastEdited()
);
}
However, when the following JSON is sent using Postman, I get a null-value error:
{
"name":"test",
"instructions":"don't eat",
"date_added":"03/04/2017",
"last_edited":"03/04/2017"
}
ERROR: null value in column "date_added" of relation "recipe" violates not-null constraint
  Detail: Failing row contains (3, null, don't eat, null, test)
The strangest thing is that only the "name" and "instructions" columns receive their data and not the other columns. I have tried adding another String attribute to the entity class and it also cannot get data from the JSON.
Edit 1:
I have tried adding the data directly through pgAdmin and it worked fine:
INSERT INTO recipe (name, instructions, date_added, last_edited)
VALUES ('test', 'test instruction', '2020/03/05', '2020/05/08');
It looks like your deserialization is broken: transforming the JSON into the Java entity leaves some values null. Most likely this is because date_added != dateAdded (the field name), so Jackson cannot set the value.
I recommend having a look at the Jackson guide: https://www.baeldung.com/jackson-annotations, @JsonProperty specifically. And in general, do not mix entities and DTOs.
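For example, a minimal sketch of the @JsonProperty fix (field names taken from the question; whether these live on the entity or on a separate DTO is up to you):
// Sketch: tell Jackson which snake_case JSON property feeds each camelCase field.
@JsonProperty("date_added")
private String dateAdded;

@JsonProperty("last_edited")
private String lastEdited;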
After much trial and error I was able to come up with a solution, but I still have no clue why this is happening. It turns out the underscore in the annotation is the problem.
// won't work
@Column(name = "date_added")
// works
@Column(name = "dateadded")
This is pretty strange because I am fairly certain that the underscore is generated by Hibernate.
If anyone knows why this is happening please let me know... for now I will just stay away from underscores.
Hello, I'm trying to learn Quarkus with Hibernate, but I've run into an issue with schema validation.
The error:
2021-12-29 16:05:14,915 ERROR [io.qua.hib.orm.run.sch.SchemaManagementIntegrator] (Hibernate post-boot validation thread for ) Failed to validate Schema: Schema-validation: wrong column type encountered in column [BED_INFO] in table [ROOM]; found [bpchar (Types#CHAR)], but expecting [char (Types#VARCHAR)]
2021-12-29 16:05:14,921 ERROR [io.qua.hib.orm.run.sch.SchemaManagementIntegrator] (Hibernate post-boot validation thread for ) The following SQL may resolve the database issues, as generated by the Hibernate schema migration tool. WARNING: You must manually verify this SQL is correct, this is a best effort guess, do not copy/paste it without verifying that it does what you expect.
The Room class looks like this:
@Entity
@Table(name = "ROOM")
public class Room {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    @Column(name = "ROOM_ID")
    private long id;

    @Column(name = "NAME")
    private String name;

    @Column(name = "ROOM_NUMBER")
    private String roomNumber;

    @Column(name = "BED_INFO", columnDefinition = "char")
    private String bedInfo;

    public Room(String name, String roomNumber, String bedInfo) {
        this.name = name;
        this.roomNumber = roomNumber;
        this.bedInfo = bedInfo;
    }
}
and the database schema like this:
CREATE TABLE ROOM(
ROOM_ID BIGSERIAL PRIMARY KEY,
NAME VARCHAR(16) NOT NULL,
ROOM_NUMBER CHAR(2) NOT NULL UNIQUE,
BED_INFO CHAR(2) NOT NULL
);
According to their documentation this "should" work, but I'm perhaps missing something here.
[BED_INFO] in table [ROOM]; found [bpchar (Types#CHAR)], but expecting [char (Types#VARCHAR)]
This means that Hibernate found a bpchar (char) where it expected a varchar. It seems that columnDefinition does not handle char. If you really want it to be a char, try bpchar instead.
In case that doesn't work, try running your Quarkus app in dev mode with this option in the application.properties file:
quarkus.hibernate-orm.database.generation=create
This will generate the actual DDL for your database that Hibernate is expecting.
Personally I would refrain from columnDefinition; instead I would use length and nullable. Unless you are building an app on an existing schema, I would also remove the explicit name attributes and @Table.
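A rough sketch of both suggestions (pick one; the first assumes you keep the CHAR(2) column, the second assumes you change it to VARCHAR(2) in the database):
// Option A: keep the char column and declare what Postgres actually reports (bpchar).
@Column(name = "BED_INFO", columnDefinition = "bpchar(2)", nullable = false)
private String bedInfo;

// Option B: drop columnDefinition and let Hibernate map a plain varchar,
// which also means altering BED_INFO to VARCHAR(2) in the database.
@Column(length = 2, nullable = false)
private String bedInfo;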
Query:
INSERT INTO PERSON
(email, mobile, party_id, affiliate_id, eligibility, member_start_date, created_by, created_dt, first_name, last_name, google_connected)
values
('xxx@yyy.org', NULL, 123, '123', '1', NULL, NULL, '2018-8-30 21:45:56.859000 -6:0:0', 'xxx', 'yyy', '0')
ON CONFLICT (email)
DO UPDATE SET create_dt = '2018-8-30 21:45:56.859000 -6:0:0' where email = ?
When the LocalDate value is not null, it works fine. I face this issue only when the LocalDate value is given as null.
Even after adding a PostgreSQL cast, it does the same.
Exception stacktrace:
2018-08-30 21:10:48,372 -- [ERROR]-- There was an unexpected problem with your request
org.postgresql.util.PSQLException: ERROR: column "member_start_date" is of type date but expression is of type bytea
  Hint: You will need to rewrite or cast the expression.  Position: 185
    at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2182)
    at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1911)
    at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:173)
    at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:645)
    at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:495)
    at org.postgresql.jdbc2.AbstractJdbc2Statement.executeQuery(AbstractJdbc2Statement.java:380)
    at sun.reflect.GeneratedMethodAccessor98.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.tomcat.jdbc.pool.StatementFacade$StatementProxy.invoke(StatementFacade.java:114)
    at com.sun.proxy.$Proxy185.executeQuery(Unknown Source)
    at org.hibernate.engine.jdbc.internal.ResultSetReturnImpl.extract(ResultSetReturnImpl.java:70)
    ... 149 common frames omitted
Entity:
#Entity(name = "person")
#EqualsAndHashCode(callSuper = false)
public class PersonEntity extends Audit {
#Id
#GeneratedValue
#Column(name = "person_id", columnDefinition = "uuid", updatable = false)
private UUID id;
#Column(name = "first_name")
private String firstName;
#Column(name = "last_name")
private String lastName;
#Column(name = "email")
#NotNull
private String email;
#Column(name = "mobile")
private String mobile;
#Column(name = "party_id")
private Long partyId;
#Column(name = "affiliate_id")
private String affiliateId;
#Column(name = "eligibility")
#NotNull
private Boolean eligibility;
#Column(name = "member_start_date")
private LocalDate memberStartDate;
#Column(name = "google_connected")
private Boolean googleAccountConnected;
}
PostgreSQL table definition (it's missing the google_connected column, which is not important):
CREATE TABLE person
(
person_id UUID NOT NULL,
email VARCHAR(128) NOT NULL,
mobile VARCHAR(20),
party_id INTEGER,
affiliate_id VARCHAR(20),
eligibility BOOLEAN NOT NULL,
member_start_date DATE,
created_by VARCHAR(128) NOT NULL,
created_dt TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
updated_by VARCHAR(128) DEFAULT NULL,
updated_dt TIMESTAMP NULL,
CONSTRAINT person_pk PRIMARY KEY ( person_id )
);
Because the query is native, Hibernate doesn't know which data types to expect, so when you pass a null it defaults to the generic Serializable type handler (which PostgreSQL sees as bytea). Changing this behaviour would break compatibility with other databases.
Postgres, however, parses the query immediately and determines which types are acceptable, and it always checks the type before it checks for null. They are the only ones who can fix this, but they refuse to do so and say it works as intended.
The only solutions for you are:
use JPQL
use managed entities
use hard-coded nulls in the query string where you need to
Fortunately for the third option, with Hibernate you can use named parameters in native queries, so you don't have to do positional calculations for when something is available and when it isn't.
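A rough sketch of the third option, using the question's table and column names (memberStartDate and email are assumed local variables, em the EntityManager): splice a literal NULL into the SQL when the value is absent, and bind a named parameter only when it is present.
// Sketch: hard-code NULL so Hibernate never has to guess the type of a null parameter.
String sql = "UPDATE person SET member_start_date = "
        + (memberStartDate == null ? "NULL" : ":startDate")
        + " WHERE email = :email";
Query q = em.createNativeQuery(sql);
if (memberStartDate != null) {
    q.setParameter("startDate", memberStartDate);
}
q.setParameter("email", email);
q.executeUpdate();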
Edit: a 4th solution that I've discovered since.
You have your query:
Query q = em.createNativeQuery("UPDATE...");
Have some static final LocalDate object somewhere:
public static final LocalDate EPOCH_DATE = LocalDate.of(1970, 1, 1);
then call the query like this:
q.setParameter("start_date", EPOCH_DATE);
q.setParameter("start_date", nullableDateParam);
The first time you call setParameter for a parameter, Hibernate uses the class to resolve the type. The second time you call it, the type is already resolved, so a null will work.
It's an old question, but there is a more useful way:
your query...
.setParameter("transaction_id", null, LongType.INSTANCE)
It works.
Found from https://forum.hibernate.org/viewtopic.php?p=2493645
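If you only have the plain JPA Query interface in hand, the same idea can be written with a wrapper, assuming a Hibernate 5.2+ version that ships org.hibernate.jpa.TypedParameterValue:
// Sketch: wrap the null so Hibernate knows which SQL type to bind
// (uses org.hibernate.jpa.TypedParameterValue and org.hibernate.type.LongType).
query.setParameter("transaction_id", new TypedParameterValue(LongType.INSTANCE, null));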
Moving to newer versions of Hibernate (5.1.17 and above) plus Postgres seems to exhibit this behavior. Looking into the code: when it binds a parameter that has no value, the old Hibernate code attempted to resolve the type through a type resolver. The newer implementation states that it will not guess:
public Type resolveParameterBindType(Object bindValue) {
    if ( bindValue == null ) {
        // we can't guess
        return null;
    }
    // ...
}
We ended up just setting a default value based on the type first, and then the real null value.
I'm having problems where two Date fields are updated to the exact same date when only one should be. I'm trying to figure out why this is happening and how I can update only the one date field I want updated, and leave the other at its original value.
I'm using Hibernate with JPA on a MySQL database, in case that is part of the reason.
I have a persistence entity that looks something like this:
@NamedQueries({
    @NamedQuery(name = "MyObject.updateItem",
            query = "UPDATE MyObject m SET m.item = :item, m.lastUpdate = :updated WHERE m.id = :id")
})
@Entity
@Table(name = "entries")
public class MyObject implements Serializable
{
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    private String item;

    @Column(columnDefinition = "TIMESTAMP", nullable = false)
    private Date dateCreated = new Date();

    @Column(columnDefinition = "TIMESTAMP", nullable = false)
    private Date lastUpdate = new Date();

    // after here standard constructors, getters, setters, etc.
}
When I call the NamedQuery from my DAO and provide the correct parameters, I find that both lastUpdate and dateCreated are changed. Is there any reason for this, and how can I prevent it from happening? Is this caused by my initializing the two date fields in the entity class?
I'm using the TIMESTAMP column definition because I want to be able to perform queries with < or >.
lastUpdate and dateCreated, after the update, have the same value?
I don't know if this will be a solution for you, but this is what I commonly do for all of the entities I implement: add a @PrePersist and a @PreUpdate function to your entity in order to set the created and last-modified times. Also try adding @Temporal(TemporalType.TIMESTAMP) to each of your date fields.
@PrePersist
public void prePersist() {
    this.dateCreated = new Date();
    this.lastUpdated = this.dateCreated;
}

@PreUpdate
public void preUpdate() {
    this.lastUpdated = new Date();
}
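For the @Temporal part, the two fields might look like this (a sketch reusing the question's field names and column definitions):
@Temporal(TemporalType.TIMESTAMP)
@Column(columnDefinition = "TIMESTAMP", nullable = false)
private Date dateCreated;

@Temporal(TemporalType.TIMESTAMP)
@Column(columnDefinition = "TIMESTAMP", nullable = false)
private Date lastUpdate;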
Beyond that, I'm a little stumped...
So I figured out the problem wasn't to do with my query or how I used persistence but how I built the database itself.
When I created the table to contain the data for the object I didn't specify a specific default for a NOT NULL field.
My original SQL CREATE statement looked something like this.
CREATE TABLE IF NOT EXISTS `entries` (
`id` bigint(20) NOT NULL AUTO_INCREMENT,
`item` VARCHAR(255) NOT NULL,
`dateCreated` TIMESTAMP NOT NULL,
`lastUpdate` TIMESTAMP NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
When the MySQL server executed this statement, it gave the first TIMESTAMP field (in this case dateCreated) an implicit DEFAULT CURRENT_TIMESTAMP plus the ON UPDATE CURRENT_TIMESTAMP attribute, which was unexpected by me.
I corrected this problem by changing the default for the fields to DEFAULT '0000-00-00 00:00:00' and forcing this default in my CREATE TABLE statement, so my new statement looks like:
CREATE TABLE IF NOT EXISTS `entries` (
`id` bigint(20) NOT NULL AUTO_INCREMENT,
`item` VARCHAR(255) NOT NULL,
`dateCreated` TIMESTAMP NOT NULL DEFAULT '0000-00-00 00:00:00',
`lastUpdate` TIMESTAMP NOT NULL DEFAULT '0000-00-00 00:00:00',
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
This apparently allows me to update the fields that I want without causing the other to update automatically.
I'm still not sure why MySQL assumed the defaults that it did. I guess it's probably somewhere in the documentation.
Hoping someone can clear this up for me. I'm getting some warnings when I run a unit test that is using hibernate criteria. The specific warnings are:
Mar 10, 2016 11:48:31 AM org.hibernate.engine.jdbc.spi.SqlExceptionHelper$StandardWarningHandler logWarning
WARN: SQL Warning Code: 1292, SQLState: 22007
Mar 10, 2016 11:48:31 AM org.hibernate.engine.jdbc.spi.SqlExceptionHelper$StandardWarningHandler logWarning
WARN: Incorrect datetime value: '1454684370' for column 'date_created' at row 1
Mar 10, 2016 11:48:31 AM org.hibernate.engine.jdbc.spi.SqlExceptionHelper$StandardWarningHandler logWarning
WARN: SQL Warning Code: 1292, SQLState: 22007
Mar 10, 2016 11:48:31 AM org.hibernate.engine.jdbc.spi.SqlExceptionHelper$StandardWarningHandler logWarning
WARN: Incorrect datetime value: '1454684700' for column 'date_created' at row 1
I can't figure out what Hibernate is complaining about. The column it's talking about, 'date_created', is of type datetime. I've tried passing in every other version of the date object, a string, a java.util.Date, a java.sql.Timestamp, but those just cause actual errors. Specifically they cause:
java.lang.ClassCastException: java.sql.Timestamp cannot be cast to java.lang.Long
or, of course, whatever other type I tried passing instead of Long. The times I'm passing in are epoch times but for some reason I'm getting these errors and the unit tests aren't passing.
Also, in case it might help, here is the specific code in the tests:
public List<Content> findByCollectionName(String collectionName, Long exclusiveBegin, Long inclusiveEnd, Long expiration)
{
if(collectionName == null)
return null;
Session session = currentSession();
session.beginTransaction();
Criteria criteria = session.createCriteria(Content.class).setResultTransformer(Criteria.DISTINCT_ROOT_ENTITY);
criteria.createAlias("collections", "cols");
criteria
.add(Restrictions.and(Restrictions.eq("cols.name", collectionName), buildCriterion(exclusiveBegin, inclusiveEnd, expiration)));
List<Content> list = criteria.list();
session.getTransaction().commit();
return list;
}
private Criterion buildCriterion(Long exclusiveBegin, Long inclusiveEnd, Long expiration)
{
List<Criterion> criterion = new ArrayList<Criterion>();
if(exclusiveBegin != null)
criterion.add(Restrictions.gt("dateCreated", exclusiveBegin));
if(inclusiveEnd != null)
criterion.add(Restrictions.le("dateCreated", inclusiveEnd));
if(expiration != null)
criterion.add(Restrictions.ge("dateCreated", expiration));
Criterion[] array = new Criterion[criterion.size()];
for(int j = 0; j < array.length; j++)
{
array[j] = criterion.get(j);
}
return Restrictions.and(array);
}
EDIT Sorry for the delay, here are the requested additions.
@Entity
@Table(name = "content")
public class Content implements Serializable
{
    private static final long serialVersionUID = -8483381938400121236L;

    public Content()
    {
    }

    public Content(String messageId, ContentBlock block) throws NullPointerException
    {
        if(messageId == null || block == null)
            throw new NullPointerException("Null objects passed for Content object creation");
        this.messageId = messageId;
        this.setContentBlock(block);
        this.dateCreated = System.currentTimeMillis();
    }

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    @Column(name = "id", columnDefinition = "BIGINT UNSIGNED")
    private int id;

    @Column(name = "message_id")
    private String messageId;

    @Column(name = "date_created")
    private long dateCreated;

    @Column(name = "content_wrapper", columnDefinition = "longblob")
    private byte[] contentWrapper;

    @ManyToMany(mappedBy = "contentBlocks")
    private List<TCollection> collections;

    /*@ManyToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "message_id")
    private TMessage message;*/
I omitted the getters and setters for brevity
As for the database, it's got a regular structure. The TContent table has:
column type
id bigint
user_id int
name varchar
date_created datetime
collection_wrapper longblob
tspoll_expire decimal
Please let me know if I can add anything else or have missed anything. Thanks for taking a look.
In the past I've used the following, which successfully converts between MySQL datetime field and java.util.Date:
@Temporal(TemporalType.TIMESTAMP)
private Date dateTime;
And more recently with Joda-time, the following successfully converts between MySQL 5.7 datetime(3) and org.joda.time.DateTime:
#Column(columnDefinition = "DATETIME(3)")
private DateTime dateTime;
There will be other options but these are the two I'm currently familiar with.
Hibernate has a bug (sort of) when the parameter hibernate.hbm2ddl.auto is set to update: if the table already contains a field with the same name, for example from a previous deployment, it just leaves the field 'as is' and doesn't change the field type.
I think you previously set date_created as a Date type, but later switched it to long.
Possible solutions:
Change the Java bean field type from long dateCreated; to Date dateCreated; and work with the Date type in code (see the sketch below).
Manually change the database structure according to your Java classes: ALTER TABLE content ALTER COLUMN date_created TYPE bigint. I'm not sure MySQL allows this; in that case you should write a migration procedure.
Drop the database and recreate it with Hibernate (only if the content is not important/test/garbage).
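A rough sketch of the first option, reusing the question's names (the DAO then compares java.util.Date values instead of raw epoch millis):
// Sketch: map the column as a temporal type instead of a raw long.
@Temporal(TemporalType.TIMESTAMP)
@Column(name = "date_created")
private Date dateCreated;

// ...and in buildCriterion, convert the incoming epoch millis before comparing:
if(exclusiveBegin != null)
    criterion.add(Restrictions.gt("dateCreated", new Date(exclusiveBegin)));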