I have a mapping class defined as:
@Entity
@Table(name = "TEST_TABLE")
public class DBTestAccount
{
    @Id
    @Column(name = "UUID", nullable = false, length = 36)
    private String uuid;

    @Column(name = "REGION")
    private String region;

    @Column(name = "COUNTRY")
    private String countryCode;

    // getters and setters
}
Now I need to update the table. For that, let's say I create the following object:
DBTestAccount dbTestAccount = new DBTestAccount();
dbTestAccount.setUuid("testUUID");
dbTestAccount.setRegion("testRegion");
dbTestAccount.setCountryCode(null);
Now let's say the table initially contains a record with some value in COUNTRY. Saving the object above will overwrite that value and set COUNTRY to null. I want the save to update the data, but to skip any attribute that is null: if a field is null it should be ignored and not updated, and if it is non-null it should be updated. How can I achieve this in Hibernate? Is there an annotation for it? If not, what is a possible solution (other than if-else checks)? Can I create a custom annotation for this?
PS:
The underlying database is PostgreSQL.
The scenario you describe can't occur as stated, because the object is not a managed entity yet: it is created with the new keyword and has not been persisted in the database.
From your explanation, what I understand is that you want to save only the changed attributes. For that purpose Hibernate has the @DynamicUpdate annotation.
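A minimal sketch of that, assuming Hibernate as the JPA provider (@DynamicUpdate is org.hibernate.annotations.DynamicUpdate):

@Entity
@Table(name = "TEST_TABLE")
@DynamicUpdate // the generated UPDATE contains only the columns whose values actually changed
public class DBTestAccount {
    // fields as above
}

Note that @DynamicUpdate only applies to a managed entity whose previous state Hibernate already knows; it does not skip attributes that are explicitly null on a new or detached object, so you would still load the existing row first and copy only the non-null values onto it.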
Related
I have a CSV file from which I read data and map it to the corresponding Java object using OpenCSV.
I have my entity class like below:
@Data
@Entity
@Table(name = "student_info")
public class Student {

    @CsvBindByPosition(position = 0)
    private String name;

    @CsvBindByPosition(position = 1)
    private boolean isBusFacilityAvailed;

    @CsvBindByPosition(position = 2)
    private Integer busFee = 0;

    @OneToMany(mappedBy = "markSheet", fetch = FetchType.LAZY)
    private List<MarkSheet> marks;
}
Here I may not get busFee information if isBusFacilityAvailed is false. In the database the busFee column should not be saved as null; instead it should be saved as 0 when the information is not present.
For that purpose I have given the busFee field a default value of 0 in the class above, and that works fine when I don't have a OneToMany mapping in my entity class.
But when I add the OneToMany mapping to the entity class it stops working, and the value for busFee is saved as null in the DB.
Is there any other way I could resolve this issue?
The trouble here is that your class is doing more than one thing: it is both a DTO used to read from a CSV file and an entity used to talk to the database.
If you separate the concerns, create a class whose only purpose is to read from the CSV file, and keep the entity as it is, you can then transform the DTO into the entity, which makes it much easier to see where the problem is (or isn't).
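A rough sketch of that separation, with a hypothetical StudentCsvRow DTO carrying the OpenCSV annotations and a small mapper that applies the default in one obvious place (accessor names are assumed from the @Data-annotated entity above):

// DTO whose only job is to mirror one CSV line (illustrative class, not from the question)
public class StudentCsvRow {

    @CsvBindByPosition(position = 0)
    private String name;

    @CsvBindByPosition(position = 1)
    private boolean isBusFacilityAvailed;

    @CsvBindByPosition(position = 2)
    private Integer busFee;

    // getters and setters
}

// mapper from the CSV row to the JPA entity
public Student toEntity(StudentCsvRow row) {
    Student student = new Student();
    student.setName(row.getName());
    student.setBusFacilityAvailed(row.isBusFacilityAvailed());
    student.setBusFee(row.getBusFee() == null ? 0 : row.getBusFee()); // apply the 0 default here
    return student;
}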
How about using the @Column annotation with columnDefinition on the busFee field:
@Column(columnDefinition = "integer default 0")
@CsvBindByPosition(position = 2)
private Integer busFee = 0;
Let us know if this works
I want one of the fields to be ignored when the save() method is called. The field will be populated automatically by the database and returned; it should be treated as a read-only field.
I am concerned about the private Timestamp ts; field:
@Entity
@Table(name = "time_series", schema = "ms")
@IdClass(Reading.class)
public class Reading implements Serializable {

    private static final long serialVersionUID = 1L;

    @Id
    @Column(name = "name", nullable = false)
    private String sensorName;

    @Id
    @Column(name = "ts", insertable = false, updatable = false)
    private Timestamp ts;

    @Column(name = "reading")
    private Double value;
    ...
As you can see, I use insertable = false, updatable = false inside the @Column annotation, so I'd expect ts to be ignored when the actual SQL is formed behind the curtain.
@Override
@Transactional(readOnly = false)
public Reading save(Reading r) {
    return readingRepository.save(r);
}
ReadingRepository basically extends Spring's CrudRepository, which has the save(...) method.
When I save a Reading object with ts = null, I get an error from Postgres:
ERROR: null value in column "ts" violates not-null constraint
because Spring Data did not actually ignore the ts field, based on what I see in the log:
insert into ms.time_series (ts, name, reading) values (NULL, 'sensor1', 10.0)
Clearly, I want the query to omit ts, like this:
insert into ms.time_series (name, reading) values ('sensor1', 10.0)
Why is the field not being ignored?
Now, if you ask me whether my database schema is okay, I say yes. When I type the SQL query in the console without ts, everything works fine. I even tried the @Generated and @GeneratedValue annotations. name and ts together form the primary key of the table; however, the result is the same if I make only one of them the PK or if I add an extra surrogate ID column. Same result...
Am I overlooking something, or is there maybe a bug in the Spring framework? I am using Spring 5.1.2 and Spring Data 2.1.2.
Note: if I use the @Transient annotation, the insert query is generated correctly, but then the field is ignored completely, even on read/fetch.
Many thanks for any help with this!
Try using @GenericGenerator and @GeneratedValue in your code.
Add the needed annotations and give values to all other members of the Reading class, except ts.
Here are some examples.
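The examples themselves are not reproduced above, but the usual shape of that pattern is roughly the following sketch; the generator name and the DbTimestampGenerator class are purely illustrative, and whether a generated value is honoured on a field of a composite @IdClass key depends on the provider:

// illustrative mapping; imports: org.hibernate.annotations.GenericGenerator, java.sql.Timestamp
@Id
@GeneratedValue(generator = "ts-generator")
@GenericGenerator(name = "ts-generator", strategy = "com.example.DbTimestampGenerator")
@Column(name = "ts")
private Timestamp ts;

// minimal custom generator (Hibernate 5.x signature);
// imports: org.hibernate.id.IdentifierGenerator, org.hibernate.engine.spi.SharedSessionContractImplementor, java.io.Serializable
public class DbTimestampGenerator implements IdentifierGenerator {
    @Override
    public Serializable generate(SharedSessionContractImplementor session, Object object) {
        return new Timestamp(System.currentTimeMillis()); // simplistic; a real one might query the database instead
    }
}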
As you say
I get an error from Postgres
If you check the docs it states:
Technically, a primary key constraint is simply a combination of a unique constraint and a not-null constraint.
That's also true for multi-column primary keys (see here)
So, if ts is part of your primary key in the database (as the #Id indicates) it's simply not possible to insert null values in that column.
IMO Hibernate/Spring have nothing to do with that, as
insert into ms.time_series (ts, name, reading) values (NULL, 'sensor1', 10.0)
should be equivalent to
insert into ms.time_series (name, reading) values ('sensor1', 10.0)
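If the timestamp really is meant to be filled in by the database, one mapping worth trying is to take ts out of the primary key and mark it as database-generated, so Hibernate omits it from the INSERT and re-reads it afterwards. This is only a sketch, not part of the original answer, and it assumes the ms.time_series.ts column has a DEFAULT (e.g. now()) defined in PostgreSQL:

// imports: org.hibernate.annotations.Generated, org.hibernate.annotations.GenerationTime
@Generated(GenerationTime.INSERT) // Hibernate re-selects the value after the INSERT
@Column(name = "ts", insertable = false, updatable = false)
private Timestamp ts;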
I am trying to insert a record into the database (MySQL) using an entity class and the EntityManager. But one of the fields is an auto-incrementing primary key, so unless I provide a value manually, the insertion is not successful.
public boolean newContributor(String name, String email, String country, Integer contactable, String address) {
Contributors contributor = new Contributors(); //entity class
contributor.setId(??????); //this is Primary Key field and is of int datatype
contributor.setName(name);
contributor.setEmail(email);
contributor.setCountry(country);
contributor.setContactable(contactable);
contributor.setAddress(address);
em.persist(contributor);
return true;
}
How do I solve this problem? Is there a way to tell the entity manager to attempt the insert without the ID field and use a NULL value instead?
Update: Here is a portion of the entity class defining the id
...
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
@Basic(optional = false)
@NotNull
@Column(name = "id")
private Integer id;

@Basic(optional = false)
@NotNull
@Size(min = 1, max = 50)
....
Is there a way to tell the entity manager to attempt the insert without the ID field and use NULL value instead?
Sure. You need to remove the @NotNull annotation from the id field in the @Entity definition, and also remove the line:
contributor.setId(??????);
from the newContributor() method. The reason for this is that the @NotNull annotation enforces a validation check in the JPA stack; it doesn't mean that the field is NOT NULL at the database level. See here for a discussion of this issue.
The rest of the code looks fine.
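Put together, a minimal sketch of the corrected mapping and method, using the field names from the question:

@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
@Basic(optional = false)
@Column(name = "id") // no @NotNull here: the value is assigned by MySQL on insert
private Integer id;

public boolean newContributor(String name, String email, String country, Integer contactable, String address) {
    Contributors contributor = new Contributors();
    // no setId(...) call: the auto-increment column is left to the database
    contributor.setName(name);
    contributor.setEmail(email);
    contributor.setCountry(country);
    contributor.setContactable(contactable);
    contributor.setAddress(address);
    em.persist(contributor);
    return true;
}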
This is my POJO annotated as an entity:
@Entity
@Table(name = "book", catalog = "book_db")
public class Book {

    private Integer bookId;
    private String bookName;
    private String bookShortDesc;
    private String bookDesc;
    private String bookAuthor;

    @Id
    @GeneratedValue(strategy = IDENTITY)
    @Column(name = "book_id", unique = true, nullable = false)
    public Integer getBookId() {
        return this.bookId;
    }

    @Column(name = "book_name", nullable = false, length = 256)
    public String getBookName() {
        return this.bookName;
    }

    @Column(name = "book_short_desc", nullable = false, length = 1024)
    public String getBookShortDesc() {
        return this.bookShortDesc;
    }

    // etc.
}
The above entity is mapped using annotations. When I look at the MySQL database, the columns are not created in the order I have written them; instead, the first column is book_id, then book_desc, then book_author, then book_short_desc, then book_name.
My question is: how can I tell Hibernate to create the columns in the same order as I have written them in the Java code?
Is there any annotation for that?
regards
Assuming you mean the database tables are generated via hbm2ddl.auto being set to CREATE or similar, there is no way to specify the order of the columns, at least as of 2008, according to one member of the Hibernate team.
My advice would be to create the database via separately maintained database scripts (possibly in conjunction with a script deployment/migration tool such as Flyway and perhaps also with hbm2ddl.auto = VALIDATE to check the resulting schema matches the entities). Maintaining the database via scripts becomes much more necessary once the application goes into production.
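For illustration, a small sketch of that setup; the JDBC URL, credentials, persistence-unit name and migration script name are placeholders, and it assumes Flyway 5+ is on the classpath:

// imports: org.flywaydb.core.Flyway, javax.persistence.Persistence, javax.persistence.EntityManagerFactory, java.util.Map, java.util.HashMap

// run the separately maintained SQL migrations first
Flyway flyway = Flyway.configure()
        .dataSource("jdbc:mysql://localhost/book_db", "user", "password")
        .load();
flyway.migrate(); // applies e.g. classpath:db/migration/V1__create_book_table.sql

// then let Hibernate only validate the schema instead of creating it
Map<String, Object> props = new HashMap<>();
props.put("hibernate.hbm2ddl.auto", "validate");
EntityManagerFactory emf = Persistence.createEntityManagerFactory("bookPU", props);

With this arrangement the column order is whatever your own CREATE TABLE script says, and Hibernate fails fast at startup if the entities and the schema drift apart.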
@Entity
public class Person {

    @ElementCollection
    @CollectionTable(name = "PERSON_LOCATIONS", joinColumns = @JoinColumn(name = "PERSON_ID"))
    private List<Location> locations;

    [...]
}
@Embeddable
public class Location {
    [...]
}
Given the class structure above, when I try to add a new location to the Person's list of locations, it always results in the following SQL queries:
DELETE FROM PERSON_LOCATIONS WHERE PERSON_ID = :idOfPerson
And
Lots of inserts into the PERSON_LOCATIONS table
Hibernate (3.5.x / JPA 2) deletes all associated records for the given Person and re-inserts all previous records, plus the new one.
I had the idea that the equals/hashcode method on Location would solve the problem, but it didn't change anything.
Any hints are appreciated!
The problem is somewhat explained on the ElementCollection page of the JPA wikibook:
Primary keys in CollectionTable
The JPA 2.0 specification does not provide a way to define the Id in the Embeddable. However, to delete or update an element of the ElementCollection mapping, some unique key is normally required. Otherwise, on every update the JPA provider would need to delete everything from the CollectionTable for the Entity, and then insert the values back. So, the JPA provider will most likely assume that the combination of all of the fields in the Embeddable is unique, in combination with the foreign key (JoinColumn(s)). This however could be inefficient, or just not feasible if the Embeddable is big, or complex.
And this is exactly (the part in bold) what happens here: Hibernate doesn't generate a primary key for the collection table, has no way to detect which element of the collection changed, and so deletes the old content of the table in order to insert the new content.
However, if you define an @OrderColumn (to specify a column used to maintain the persistent order of the list, which would make sense since you're using a List), Hibernate will create a primary key (made of the order column and the join column) and will be able to update the collection table without deleting the whole content.
Something like this (if you want to use the default column name):
@Entity
public class Person {
    ...
    @ElementCollection
    @CollectionTable(name = "PERSON_LOCATIONS", joinColumns = @JoinColumn(name = "PERSON_ID"))
    @OrderColumn
    private List<Location> locations;
    ...
}
References
JPA 2.0 Specification
Section 11.1.12 "ElementCollection Annotation"
Section 11.1.39 "OrderColumn Annotation"
JPA Wikibook
Java Persistence/ElementCollection
In addition to Pascal's answer, you also have to set at least one column as NOT NULL:
@Embeddable
public class Location {

    @Column(name = "path", nullable = false)
    private String path;

    @Column(name = "parent", nullable = false)
    private String parent;

    public Location() {
    }

    public Location(String path, String parent) {
        this.path = path;
        this.parent = parent;
    }

    public String getPath() {
        return path;
    }

    public String getParent() {
        return parent;
    }
}
This requirement is documented in AbstractPersistentCollection:
Workaround for situations like HHH-7072. If the collection element is a component that consists entirely of nullable properties, we currently have to forcefully recreate the entire collection. See the use of hasNotNullableColumns in the AbstractCollectionPersister constructor for more info. In order to delete row-by-row, that would require SQL like "WHERE ( COL = ? OR ( COL is null AND ? is null ) )", rather than the current "WHERE COL = ?" (fails for null for most DBs). Note that the param would have to be bound twice. Until we eventually add "parameter bind points" concepts to the AST in ORM 5+, handling this type of condition is either extremely difficult or impossible. Forcing recreation isn't ideal, but not really any other option in ORM 4.
We discovered that the classes we were using as our ElementCollection types did not have equals or hashCode methods defined and had nullable fields. We provided those (via Lombok, for what it's worth) on the element type, and that allowed Hibernate (v5.2.14) to identify whether or not the collection was dirty.
Additionally, this error manifested for us because we were inside a service method marked with the annotation @Transactional(readOnly = true). Since Hibernate would attempt to clear the related element collection and insert it all over again, the transaction would fail when flushed, and things broke with this very hard-to-trace message:
HHH000346: Error during managed flush [Batch update returned unexpected row count from update [0]; actual row count: 0; expected: 1]
Here is an example of our entity model that had the error
@Entity
public class Entity1 {
    @ElementCollection
    @Default
    private Set<Entity2> relatedEntity2s = Sets.newHashSet();
}

public class Entity2 {
    private UUID someUUID;
}
Changing it to this
@Entity
public class Entity1 {
    @ElementCollection
    @Default
    private Set<Entity2> relatedEntity2s = Sets.newHashSet();
}

@EqualsAndHashCode
public class Entity2 {
    @Column(nullable = false)
    private UUID someUUID;
}
Fixed our issue. Good luck.
I had the same issue but wanted to map a list of enums: List<EnumType>.
I got it working like this:
@ElementCollection
@CollectionTable(
        name = "enum_table",
        joinColumns = @JoinColumn(name = "some_id")
)
@OrderColumn
@Enumerated(EnumType.STRING)
private List<EnumType> enumTypeList = new ArrayList<>();
public void setEnumList(List<EnumType> newEnumList) {
this.enumTypeList.clear();
this.enumTypeList.addAll(newEnumList);
}
The issue in my case was that the List object was always replaced via the default setter, and therefore Hibernate treated it as a completely "new" collection even though the enums did not change.
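For contrast, the problematic version was roughly the default setter below: replacing the collection reference makes Hibernate see an entirely new collection and recreate the table contents even when nothing changed.

// problematic: swapping the reference discards the Hibernate-managed collection
public void setEnumList(List<EnumType> newEnumList) {
    this.enumTypeList = newEnumList;
}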