It happens when I insert an entity from the UI: the first time, it is stored in the database exactly as I entered it. After I refresh the page, the database is updated and I get back some invalid characters, something like this:
'8', NULL, NULL, '?e??_??e?', '?e??_o??a??', '2', NULL, '?e??_o?'
Here is the relevant part of the SQL log:
Hibernate:
/* insert test.model.Smer
*/ insert
into
test.smer
(naziv, smer, oblast, obrazovni_profil, odsek_id, stari_naziv, studijska_grupa_id)
values
(?, ?, ?, ?, ?, ?, ?)
Hibernate:
/* select
generatedAlias0
from
Smer as generatedAlias0 */ select
smer0_.smer_id as smer_id1_19_,
smer0_.naziv as naziv2_19_,
smer0_.smer as smer3_19_,
smer0_.oblast as oblast4_19_,
smer0_.obrazovni_profil as obrazovn5_19_,
smer0_.odsek_id as odsek_id8_19_,
smer0_.stari_naziv as stari_na6_19_,
smer0_.studijska_grupa_id as studijsk7_19_
from
test.smer smer0_
Hibernate:
select
odsek0_.odsek_id as odsek_id1_13_0_,
odsek0_.odsek as odsek2_13_0_
from
test.odsek odsek0_
where
odsek0_.odsek_id=?
Hibernate:
select
odsek0_.odsek_id as odsek_id1_13_0_,
odsek0_.odsek as odsek2_13_0_
from
test.odsek odsek0_
where
odsek0_.odsek_id=?
Hibernate:
select
odsek0_.odsek_id as odsek_id1_13_0_,
odsek0_.odsek as odsek2_13_0_
from
test.odsek odsek0_
where
odsek0_.odsek_id=?
Hibernate:
/* select
generatedAlias0
from
Odsek as generatedAlias0 */ select
odsek0_.odsek_id as odsek_id1_13_,
odsek0_.odsek as odsek2_13_
from
test.odsek odsek0_
Hibernate:
/* update
test.model.Smer */ update
test.smer
set
naziv=?,
smer=?,
oblast=?,
obrazovni_profil=?,
odsek_id=?,
stari_naziv=?,
studijska_grupa_id=?
where
smer_id=?
What language are you working with?
Double-check that the page is UTF-8 (or whatever charset is suitable) and that the database column uses the same character set. Then inspect the received values right before they are sent to the DB to pinpoint exactly where the corruption happens.
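A minimal diagnostic sketch (the class name is illustrative, not from the question) that logs the code points and UTF-8 bytes of an incoming value, so you can see whether the string is already corrupted before Hibernate touches it:

import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class EncodingProbe {

    // Print the exact code points and UTF-8 bytes of a value before it is persisted.
    public static void dump(String label, String value) {
        System.out.println(label + " = " + value);
        value.codePoints().forEach(cp -> System.out.printf("U+%04X ", cp));
        System.out.println();
        System.out.println("UTF-8 bytes: " + Arrays.toString(value.getBytes(StandardCharsets.UTF_8)));
    }
}

If the value is still intact at that point but broken after the UPDATE, the usual suspects are the JDBC connection charset (for MySQL Connector/J, for example, useUnicode=true&characterEncoding=UTF-8 on the JDBC URL) and the column or table collation.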
I can't save the result of a select into the database using JPA in a Spring Boot application. The code that I use is below:
@Override
@Transactional
public void fetchAndSave() {
    List<TestData> all = testDataRepository.findAllRecords();
    testDataRepository.saveAll(all);
    // let's suppose I will also save other data here; that's why I need @Transactional for rollback in case of an exception
}

@Repository
public interface TestDataRepository extends JpaRepository<TestData, Long> {
    @Query(value = "select raw_values.identificator AS id, raw_values.name as value from test.raw_values", nativeQuery = true)
    List<TestData> findAllRecords();
}
When I call fetchAndSave with the property spring.jpa.show-sql=true, I see only the select in the logs:
Hibernate: select raw_values.identificator AS id, raw_values.name as value from test.raw_values
If I don't use @Transactional, I can see more requests to the database in the logs and the values are saved:
Hibernate: select raw_values.identificator AS id, raw_values.name as value from test.raw_values
Hibernate: select testdata0_.id as id1_0_0_, testdata0_.value as value2_0_0_ from test.test_data testdata0_ where testdata0_.id=?
Hibernate: select testdata0_.id as id1_0_0_, testdata0_.value as value2_0_0_ from test.test_data testdata0_ where testdata0_.id=?
Hibernate: select testdata0_.id as id1_0_0_, testdata0_.value as value2_0_0_ from test.test_data testdata0_ where testdata0_.id=?
Hibernate: insert into test.test_data (value, id) values (?, ?)
Hibernate: insert into test.test_data (value, id) values (?, ?)
Hibernate: insert into test.test_data (value, id) values (?, ?)
I have a pretty simple table in the database; the DDL looks like this:
create table test_data
(
    id serial not null
        constraint test_data_pk
            primary key,
    value varchar(256)
);

-- There are 3 records in the table raw_values
create table raw_values
(
    identificator integer not null
        constraint raw_values_pk
            primary key,
    name varchar(256)
);
Can you help me identify the reason for this behavior? I expect the records to be saved into the database when I use @Transactional.
The short answer to "why it does not save" is: because the records are already saved.
The longer answer is that Hibernate sees that entities with these IDs are already present in the DB, so it does not save them again.
If you want to insert another three entities into the DB, just create copies of these objects with id = null and save those:
List<TestData> all = testDataRepository.findAllRecords();
List<TestData> copies = all.stream()
        .map(testData -> new TestData(...)) //copy all the fields EXCEPT ID
        .collect(toList());
testDataRepository.saveAll(copies);
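As a concrete illustration, here is a minimal sketch of what such a copy could look like, assuming a TestData entity with a generated id and a single value field (the real entity is not shown in the question, so the names here are only illustrative):

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;

@Entity
public class TestData {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    private String value;

    protected TestData() {
        // no-arg constructor required by JPA
    }

    // Copy helper: leaves id null so Hibernate treats the copy as a new row.
    public static TestData copyOf(TestData source) {
        TestData copy = new TestData();
        copy.value = source.getValue();
        return copy;
    }

    public String getValue() {
        return value;
    }
}

With such a helper, the stream above becomes .map(TestData::copyOf).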
The entities (using Set):
@Entity
class Product(
    @Id
    @GeneratedValue
    val int: Int = 0,
    val name: String,
    @ManyToMany(cascade = [(CascadeType.PERSIST), (CascadeType.MERGE)])
    val stores: MutableSet<Store> = mutableSetOf()
)

@Entity
class Store(
    @Id
    @GeneratedValue
    val int: Int = 0,
    val name: String = ""
)
...
val p = Product(name = "product")
em.persist(p)
val store = Store(name = "store")
p.stores += store
em.persist(store)
for (i in 1..5) {
    val s = Store(name = i.toString())
    p.stores += s
    em.persist(s)
}
em.flush()
p.stores.remove(store)
em.flush()
The result:
Hibernate: insert into product (name, int) values (?, ?)
Hibernate: insert into store (name, int) values (?, ?)
Hibernate: insert into store (name, int) values (?, ?)
Hibernate: insert into store (name, int) values (?, ?)
Hibernate: insert into store (name, int) values (?, ?)
Hibernate: insert into store (name, int) values (?, ?)
Hibernate: insert into store (name, int) values (?, ?)
Hibernate: insert into product_stores (product_int, stores_int) values (?, ?)
Hibernate: insert into product_stores (product_int, stores_int) values (?, ?)
Hibernate: insert into product_stores (product_int, stores_int) values (?, ?)
Hibernate: insert into product_stores (product_int, stores_int) values (?, ?)
Hibernate: insert into product_stores (product_int, stores_int) values (?, ?)
Hibernate: insert into product_stores (product_int, stores_int) values (?, ?)
Hibernate: delete from product_stores where product_int=? and stores_int=?
But with this entity (using List):
@Entity
class Product(
    @Id
    @GeneratedValue
    val int: Int = 0,
    val name: String,
    @ManyToMany(cascade = [(CascadeType.PERSIST), (CascadeType.MERGE)])
    val stores: MutableList<Store> = mutableListOf()
)
The result:
Hibernate: insert into product (name, int) values (?, ?)
Hibernate: insert into store (name, int) values (?, ?)
Hibernate: insert into store (name, int) values (?, ?)
Hibernate: insert into store (name, int) values (?, ?)
Hibernate: insert into store (name, int) values (?, ?)
Hibernate: insert into store (name, int) values (?, ?)
Hibernate: insert into store (name, int) values (?, ?)
Hibernate: insert into product_stores (product_int, stores_int) values (?, ?)
Hibernate: insert into product_stores (product_int, stores_int) values (?, ?)
Hibernate: insert into product_stores (product_int, stores_int) values (?, ?)
Hibernate: insert into product_stores (product_int, stores_int) values (?, ?)
Hibernate: insert into product_stores (product_int, stores_int) values (?, ?)
Hibernate: insert into product_stores (product_int, stores_int) values (?, ?)
Hibernate: delete from product_stores where product_int=?
Hibernate: insert into product_stores (product_int, stores_int) values (?, ?)
Hibernate: insert into product_stores (product_int, stores_int) values (?, ?)
Hibernate: insert into product_stores (product_int, stores_int) values (?, ?)
Hibernate: insert into product_stores (product_int, stores_int) values (?, ?)
Hibernate: insert into product_stores (product_int, stores_int) values (?, ?)
From the Hibernate User Guide:
http://docs.jboss.org/hibernate/orm/5.2/userguide/html_single/Hibernate_User_Guide.html#associations-many-to-many
When an entity is removed from the #ManyToMany collection, Hibernate simply deletes the joining record in the link table. Unfortunately, this operation requires removing all entries associated with a given parent and recreating the ones that are listed in the current running persistent context.
But I don't know the reason behind it. Why can't Hibernate just do the same as in the Set case?
Here is the explanation: http://assarconsulting.blogspot.com/2009/08/why-hibernate-does-delete-all-then-re.html
The @ManyToMany annotation with a List tells Hibernate to expect a join table without an index column.
Lists that don't use an index column are treated implicitly as bags, and that leads to the following:
https://docs.jboss.org/hibernate/stable/core.old/reference/en/html/performance-collections.html
Bags are the worst case. Since a bag permits duplicate element values and has no index column, no primary key may be defined. Hibernate has no way of distinguishing between duplicate rows. Hibernate resolves this problem by completely removing (in a single DELETE) and recreating the collection whenever it changes.
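One way to keep List semantics without the bag behavior is to map an index column with @OrderColumn, so Hibernate can address individual rows in the join table instead of recreating the whole collection. A minimal sketch, shown in Java rather than the Kotlin of the question, with an illustrative column name stores_order that is not taken from the original mapping:

import java.util.ArrayList;
import java.util.List;
import javax.persistence.CascadeType;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.ManyToMany;
import javax.persistence.OrderColumn;

@Entity
public class Product {

    @Id
    @GeneratedValue
    private Integer id;

    private String name;

    // The index column turns the bag into an indexed list, so Hibernate can
    // remove or shift individual join-table rows instead of deleting and
    // re-inserting the entire collection.
    @ManyToMany(cascade = {CascadeType.PERSIST, CascadeType.MERGE})
    @OrderColumn(name = "stores_order")
    private List<Store> stores = new ArrayList<>();

    public List<Store> getStores() {
        return stores;
    }
}

@Entity
class Store {

    @Id
    @GeneratedValue
    private Integer id;

    private String name;
}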
Here is my code to check whether a record already exists in the system before inserting the new record into the SQL database.
String sql = "INSERT INTO Stock (name, cost_price, selling_price, numberinstock, supplier) VALUES (?, ?, ?, ?,?) "
+ "Select name"
+ " from Stock"
+ "Where not exists (select * from Stock"
+ "where name = "+NameTextField+")";
I am using Java, SQL, and a Derby database.
What I am trying to do is: when a new item is entered into the system, the SQL statement should check whether that item is already in the system.
What is wrong with this SQL statement?
You cannot do what you want using insert . . . values. So use insert . . . select instead. The code should look like this:
INSERT INTO Stock(name, cost_price, selling_price, numberinstock, supplier)
Select ?, ?, ?, ?, ?
from sysibm.sysdummy1
Where not exists (select * from Stock where name = "+NameTextField+");
However, you should pass the second reference to name as a parameter, just like all the others:
INSERT INTO Stock(name, cost_price, selling_price, numberinstock, supplier)
Select ?, ?, ?, ?, ?
from sysibm.sysdummy1
Where not exists (select 1 from Stock where name = ?);
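In Java with plain JDBC, binding every value as a parameter would look roughly like this (a sketch; the class, method, and variable names are illustrative, and a java.sql.Connection is assumed to be available):

import java.math.BigDecimal;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class StockDao {

    // Inserts the stock row only if no row with the same name already exists.
    // Returns true when a row was actually inserted.
    public boolean insertIfAbsent(Connection conn, String name, BigDecimal costPrice,
                                  BigDecimal sellingPrice, int numberInStock,
                                  String supplier) throws SQLException {
        String sql = "INSERT INTO Stock (name, cost_price, selling_price, numberinstock, supplier) "
                + "SELECT ?, ?, ?, ?, ? FROM sysibm.sysdummy1 "
                + "WHERE NOT EXISTS (SELECT 1 FROM Stock WHERE name = ?)";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setString(1, name);
            ps.setBigDecimal(2, costPrice);
            ps.setBigDecimal(3, sellingPrice);
            ps.setInt(4, numberInStock);
            ps.setString(5, supplier);
            ps.setString(6, name); // same value as the first parameter
            return ps.executeUpdate() > 0; // 0 rows means the name already existed
        }
    }
}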
Try using a trigger:
Create Trigger Modified_Order_Trigger
On Orders
After Update -- as per your need: insert, delete or update
AS
Insert Into tablename (table fields)
SELECT table fields
FROM INSERTED
I have a table like this:
create table images (
    image_id serial primary key,
    user_id int references users(user_id),
    date_created timestamp with time zone
);
I then have a tag table for tags that images can have:
create table images_tags (
    images_tag_id serial primary key,
    image_id int references images(image_id),
    tag_id int references tags(tag_id)
);
To get the results I want, I run a query like this:
select image_id,user_id,tag_id from images left join images_tags using(image_id)
where (?=-1 or user_id=?)
and (?=-1 or tag_id in (?, ?, ?, ?)) --have up to 4 tag_ids to search for
order by date_created desc limit 100;
The problem is, I want to limit based on the number of unique image_ids because my output will look like this:
{"images":[
{"image_id":1, "tag_ids":[1, 2, 3]},
....
]}
Notice how I group the tag_ids into an array for output, even though the SQL returns a row for each tag_id and image_id combo.
So, when I say limit 100, I want it to apply to 100 unique image_ids.
Maybe you should put one image on each row? If that works, you can do:
select image_id, user_id, string_agg(cast(tag_id as varchar(2000)), ',') as tags
from images left join
images_tags
using (image_id)
where (?=-1 or user_id=?) and
(?=-1 or tag_id in (?, ?, ?, ?)) --have up to 4 tag_ids to search for
group by image_id, user_id
order by date_created desc
limit 100;
If that doesn't work, then use a CTE:
with cte as (
select image_id, user_id, tag_id,
dense_rank() over (order by date_created desc) as seqnum
from images left join
images_tags
using (image_id)
where (?=-1 or user_id=?) and
(?=-1 or tag_id in (?, ?, ?, ?)) --have up to 4 tag_ids to search for
)
select *
from cte
where seqnum <= 100
order by seqnum;
Select 100 qualifying images first, and then join images_tags.
Use an EXISTS semi-join to satisfy the condition on images_tags and take care to get the parentheses right.
SELECT i.*, t.tag_id
FROM (
SELECT i.image_id, i.user_id
FROM images i
WHERE (? = -1 OR i.user_id = ?)
AND (? = -1 OR EXISTS (
SELECT 1
FROM images_tags t
WHERE t.image_id = i.image_id
AND t.tag_id IN (?, ?, ?, ?)
))
ORDER BY i.date_created DESC
LIMIT 100
) i
LEFT JOIN images_tags t
ON t.image_id = i.image_id
AND (? = -1 OR t.tag_id in (?, ?, ?, ?)) -- repeat condition
This should be faster than a solution with window functions and CTEs.
Test performance with EXPLAIN ANALYZE. As always, run it a couple of times to warm up the cache.
In my application, which uses Spring and Hibernate, I parse a CSV file and populate the db by calling handleRow() every time a record is read from the CSV file.
My domain model:
'Family' has many 'SubFamily'
'SubFamily' has many 'Locus'
a 'Locus' belongs to a 'Species'
Family <-> SubFamily <-> Locus are all bi-directional mappings.
Code:
public void handleRow(Family dummyFamily, SubFamily dummySubFamily, Locus dummyLocus) {
    //Service method which access DAO layers
    CommonService serv = ctx.getCommonService();

    boolean newFamily = false;
    Family family = serv.getFamilyByFamilyId(dummyFamily.getFamilyId());
    if (family == null) {
        newFamily = true;
        family = new Family();
        family.setFamilyId(dummyFamily.getFamilyId());
        family.setFamilyIPRId(dummyFamily.getFamilyIPRId());
        family.setFamilyName(dummyFamily.getFamilyName());
        family.setFamilyPattern(dummyFamily.getFamilyPattern());
        family.setRifID(dummyFamily.getRifID());
    }

    SubFamily subFamily = family.getSubFamilyBySubFamilyId(dummySubFamily.getSubFamilyId());
    if (subFamily == null) {
        subFamily = new SubFamily();
        subFamily.setRifID(dummySubFamily.getRifID());
        subFamily.setSubFamilyId(dummySubFamily.getSubFamilyId());
        subFamily.setSubFamilyIPRId(dummySubFamily.getSubFamilyIPRId());
        subFamily.setSubFamilyName(dummySubFamily.getSubFamilyName());
        subFamily.setSubFamilyPattern(dummySubFamily.getSubFamilyPattern());
        family.addSubFamily(subFamily);
    }

    //use the save reference, to update from GFF handler
    Locus locus = dummyLocus;
    subFamily.addLocus(locus);
    assignSpecies(serv, locus);

    //Persist object
    if (newFamily) {
        serv.createFamily(family);
    } else {
        serv.updateFamily(family);
    }
}
A Species is assigned to a Locus using the following method, which simply accesses the DAO layer:
private void assignSpecies(CommonService serv, Locus locus) {
    String locusId = locus.getLocusId();
    String speciesId = CommonUtils.getLocusSpecies(locusId, ctx.getSpeciesList()).getSpeciesId();
    //Simply get Species object from DAO
    Species sp = serv.getSpeciesBySpeciesId(speciesId);
    locus.setSpecies(sp);
}
Hibernate gives the following error:
[INFO] Starting scheduled refresh cache with period [5000ms]
Hibernate: insert into species (species_id, name) values (?, ?)
Hibernate: insert into species (species_id, name) values (?, ?)
Hibernate: insert into species (species_id, name) values (?, ?)
############################ROW#####################1
SubFamiyID#######RIF0005913
Hibernate: select this_.id as id1_0_, this_.family_id as family2_1_0_, this_.rif_iD as rif3_1_0_, this_.family_name as family4_1_0_, this_.family_ipr_id as family5_1_0_, this_.family_pattern as family6_1_0_ from family this_ where this_.family_id=?
Creating NEW SubFamiyID#######RIF0005913
Hibernate: select this_.id as id3_0_, this_.species_id as species2_3_0_, this_.name as name3_0_ from species this_ where this_.species_id=?
Hibernate: insert into family (family_id, rif_iD, family_name, family_ipr_id, family_pattern) values (?, ?, ?, ?, ?)
Hibernate: insert into subfamily (sub_family_id, rif_iD, sub_family_name, sub_family_ipr_id, sub_family_pattern, family_id, sub_family_index) values (?, ?, ?, ?, ?, ?, ?)
Hibernate: insert into locus (locus_id, refTrans_id, function, species_id, sub_family_id, sub_family_index) values (?, ?, ?, ?, ?, ?)
Hibernate: update species set species_id=?, name=? where id=?
Hibernate: update subfamily set family_id=?, sub_family_index=? where id=?
Hibernate: update locus set sub_family_id=?, sub_family_index=? where id=?
############################ROW#####################2
SubFamiyID#######RIF0005913
Hibernate: select this_.id as id1_0_, this_.family_id as family2_1_0_, this_.rif_iD as rif3_1_0_, this_.family_name as family4_1_0_, this_.family_ipr_id as family5_1_0_, this_.family_pattern as family6_1_0_ from family this_ where this_.family_id=?
Hibernate: select subfamilie0_.family_id as family7_1_, subfamilie0_.id as id1_, subfamilie0_.sub_family_index as sub8_1_, subfamilie0_.id as id0_0_, subfamilie0_.sub_family_id as sub2_0_0_, subfamilie0_.rif_iD as rif3_0_0_, subfamilie0_.sub_family_name as sub4_0_0_, subfamilie0_.sub_family_ipr_id as sub5_0_0_, subfamilie0_.sub_family_pattern as sub6_0_0_, subfamilie0_.family_id as family7_0_0_ from subfamily subfamilie0_ where subfamilie0_.family_id=?
Hibernate: select locuslist0_.sub_family_id as sub5_1_, locuslist0_.id as id1_, locuslist0_.sub_family_index as sub7_1_, locuslist0_.id as id2_0_, locuslist0_.locus_id as locus2_2_0_, locuslist0_.refTrans_id as refTrans3_2_0_, locuslist0_.function as function2_0_, locuslist0_.sub_family_id as sub5_2_0_, locuslist0_.species_id as species6_2_0_ from locus locuslist0_ where locuslist0_.sub_family_id=?
Hibernate: select species0_.id as id3_0_, species0_.species_id as species2_3_0_, species0_.name as name3_0_ from species species0_ where species0_.id=?
Hibernate: select this_.id as id1_0_, this_.family_id as family2_1_0_, this_.rif_iD as rif3_1_0_, this_.family_name as family4_1_0_, this_.family_ipr_id as family5_1_0_, this_.family_pattern as family6_1_0_ from family this_ where this_.family_id=?
Hibernate: select this_.id as id3_0_, this_.species_id as species2_3_0_, this_.name as name3_0_ from species this_ where this_.species_id=?
Exception in thread "main" [INFO] Closing Compass [compass]
org.springframework.orm.hibernate3.HibernateSystemException: a different object with the same identifier value was already associated with the session: [com.bigg.nihonbare.common.domain.Species#1]; nested exception is org.hibernate.NonUniqueObjectException: a different object with the same identifier value was already associated with the session: [com.bigg.nihonbare.common.domain.Species#1]
Caused by: org.hibernate.NonUniqueObjectException: a different object with the same identifier value was already associated with the session: [com.bigg.nihonbare.common.domain.Species#1]
at org.hibernate.engine.StatefulPersistenceContext.checkUniqueness(StatefulPersistenceContext.java:590)
at org.hibernate.event.def.DefaultSaveOrUpdateEventListener.performUpdate(DefaultSaveOrUpdateEventListener.java:284)
at org.hibernate.event.def.DefaultSaveOrUpdateEventListener.entityIsDetached(DefaultSaveOrUpdateEventListener.java:223)
at org.hibernate.event.def.DefaultSaveOrUpdateEventListener.performSaveOrUpdate(DefaultSaveOrUpdateEventListener.java:89)
at org.hibernate.event.def.DefaultSaveOrUpdateEventListener.onSaveOrUpdate(DefaultSaveOrUpdateEventListener.java:70)
at org.hibernate.impl.SessionImpl.fireSaveOrUpdate(SessionImpl.java:507)
at org.hibernate.impl.SessionImpl.saveOrUpdate(SessionImpl.java:499)
at org.hibernate.engine.CascadingAction$5.cascade(CascadingAction.java:218)
at org.hibernate.engine.Cascade.cascadeToOne(Cascade.java:268)
Any tips?
Use merge(). The exception means that the current session already contains a different instance with the same identifier as the entity you are passing. If that should not be the case, check how you have overridden hashCode() and equals() - they must distinguish different entities.
You can also encounter this problem if you are doing a delete() or update(). It is likely to occur if you build the Hibernate-mapped POJO yourself, perhaps from a DTO: that POJO now has the same identifier as one that is already in the Session, and that causes the problem.
You now have two options. Either do what @Bozho said and merge() the object first; that takes care of updating. For deleting, take the object returned by merge() and delete that.
The other option is to first query the Session by the object's id and then delete or update the managed instance it returns.
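A minimal sketch of the merge-then-delete approach, assuming a plain Hibernate Session and a detached, DTO-built Species instance (the class and variable names are only illustrative):

import org.hibernate.Session;

public class SpeciesRemover {

    // Re-attach the detached object first, then delete the managed copy returned by merge().
    public void remove(Session session, Species detachedSpecies) {
        Species managed = (Species) session.merge(detachedSpecies);
        session.delete(managed);
        session.flush();
    }
}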
I have seen this when an Entity does not have a @GeneratedValue annotation on its ID column:
@GeneratedValue(strategy = GenerationType.AUTO)
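For context, a minimal sketch of what that looks like on an entity (the class and field names are illustrative, not taken from the question):

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;

@Entity
public class Species {

    // Without @GeneratedValue, Hibernate relies on whatever id you assign yourself,
    // which makes it easy to end up with two instances carrying the same identifier.
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;

    private String name;
}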
I resolved it like this:
On the delete method:
this.getHibernateTemplate().clear();
this.getHibernateTemplate().delete(obj);
// This line performs the "commit" of the command
this.getHibernateTemplate().flush();
On the update method:
this.getHibernateTemplate().merge(obj);
// This line performs the "commit" of the command
this.getHibernateTemplate().flush();
If you are updating an object, evict() it from the session after the saveOrUpdate() call, and also check your hashCode() implementation for that object.
You may have created two instances of Session:
Session session = factory.openSession();
If you open one session in one function and then execute another function that opens another session, this problem can occur.
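A minimal sketch of keeping both operations on one Session instead of opening a second one (the class and method names are illustrative, not from the original code):

import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;

public class SingleSessionExample {

    private final SessionFactory factory;

    public SingleSessionExample(SessionFactory factory) {
        this.factory = factory;
    }

    public void doWork(Object first, Object second) {
        Session session = factory.openSession();
        Transaction tx = session.beginTransaction();
        try {
            // Both operations share the same session, so the persistence
            // context holds exactly one instance per identifier.
            session.saveOrUpdate(first);
            session.saveOrUpdate(second);
            tx.commit();
        } catch (RuntimeException e) {
            tx.rollback();
            throw e;
        } finally {
            session.close();
        }
    }
}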
This happened to me because part of my compound key was null. For example:
@Id
@Column(name = "id")
private String id;

@JoinColumn(name = "id")
private Username username;
Username happened to be null, which led to "duplicate" null primary keys, even though the id was different.