I'd like to override a setter so that I can perform some function on the data and return a calculated column for my entity. The calculation depends on several columns (e.g. COL1, COL2, ...), so I can't really intercept any particular setter because the other values might not yet be populated. Does Hibernate provide some sort of "finish()" method that is called once all the values are set for the entity?
@Override
@Column(name = "COL1")
public String getCol1() {
    return this.col1;
}

@Override
public void setCol1(String value) {
    super.setCol1(value);
    genMagicValue();
}

public String getMagicValue() {
    return this.magicValue;
}
I don't understand your question.
setCol1 may never be called (leaving col1 with its default value).
Furthermore, nothing prevents it from being called twice with different values.
Perhaps the pattern you are looking for is:
boolean magicDone = false;

public String getMagicValue() {
    if (!magicDone) {
        magicDone = true;
        genMagicValue();
    }
    return this.magicValue;
}
Aside from what Hibernate provides, is it not possible to lazily initialize the magicValue, so that the calculation happens the first time getMagicValue is called and subsequent calls just return the computed value?
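A rough sketch of that lazy-initialization idea; the field and method names mirror the snippets above and are assumptions, not the asker's actual entity:

public class MagicEntity {

    private String col1;
    private String col2;
    private String magicValue;              // cached derived value

    public String getMagicValue() {
        if (magicValue == null) {           // compute only once, on first access
            magicValue = genMagicValue();
        }
        return magicValue;
    }

    // stand-in for the asker's genMagicValue(), which combines several columns
    private String genMagicValue() {
        return col1 + "/" + col2;
    }
}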
I really like the addition of records in Java 14, at least as a preview feature, as it helps to reduce my need to use lombok for simple, immutable "data holders". But I'm having an issue with the implementation of nullable components. I'm trying to avoid returning null in my codebase to indicate that a value might not be present. Therefore I currently often use something like the following pattern with lombok.
@Value
public class MyClass {
    String id;
    @Nullable String value;

    public Optional<String> getValue() { // override the generated getter
        return Optional.ofNullable(this.value);
    }
}
When I try the same pattern now with records, it is rejected with a compiler error stating an incorrect component accessor return type.
record MyRecord(String id, @Nullable String value) {
    Optional<String> value() {
        return Optional.ofNullable(this.value);
    }
}
Since I thought the use of Optional as a return type is now preferred, I'm really wondering why this restriction is in place. Is my understanding of the usage wrong? How can I achieve the same without adding another accessor with a different signature, which would not hide the default one? Should Optional not be used in this case at all?
A record comprises attributes that primarily define its state. The derivation of the accessors, constructors, etc. is based entirely on this state of the record.
Now in your example, the state of the attribute value is null, so the default accessor implementation ends up reporting that true state. To provide customized access to this attribute, you are instead looking for an overriding API that wraps the actual state and provides an Optional return type.
Of course, as you mentioned, one of the ways to deal with it would be to include a custom implementation in the record definition itself:
record MyClass(String id, String value) {
    Optional<String> getValue() {
        return Optional.ofNullable(value());
    }
}
Alternatively, you could decouple the read and write APIs from the data carrier in a separate class and pass the record instance to it for custom access.
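For instance, a minimal sketch of that decoupling; the wrapper class name is purely illustrative:

import java.util.Optional;

record MyRecord(String id, String value) {
}

// Thin read-side wrapper that exposes the Optional-based accessor
// without touching the record's canonical API.
final class MyRecordView {

    private final MyRecord record;

    MyRecordView(MyRecord record) {
        this.record = record;
    }

    Optional<String> value() {
        return Optional.ofNullable(record.value());
    }
}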
The most relevant quote from JEP 384: Records that I found would be (formatting mine):
A record declares its state -- the group of variables -- and commits
to an API that matches that state. This means that records give up a
freedom that classes usually enjoy -- the ability to decouple a
class's API from its internal representation -- but in return, records
become significantly more concise.
Due to the restrictions placed on records, namely that an accessor's return type must match its component type, a pragmatic way to use Optional with records would be to define it as the component type:
record MyRecord (String id, Optional<String> value){
}
A point has been made that this is problematic because null might be passed as a value to the constructor. This can be solved by rejecting such null values in the canonical constructor:
record MyRecord(String id, Optional<String> value) {
    MyRecord(String id, Optional<String> value) {
        this.id = id;
        this.value = Objects.requireNonNull(value);
    }
}
In practice, most common libraries and frameworks (e.g. Jackson, Spring) can recognize the Optional type and translate null into Optional.empty() automatically, so whether this is an issue you need to tackle in your particular case depends on context. I recommend researching the support for Optional in your codebase before possibly cluttering your code unnecessarily.
Credits go to Holger! I really like his proposed way of questioning the actual need for null. So, with a short example, I wanted to give his approach a bit more space, even if it is a bit convoluted for this use case.
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.util.Optional;
import java.util.function.Consumer;

interface ConversionResult<T> {

    String raw();

    default Optional<T> value() {
        return Optional.empty();
    }

    default Optional<String> error() {
        return Optional.empty();
    }

    default void ifOk(Consumer<T> okAction) {
        value().ifPresent(okAction);
    }

    default void okOrError(Consumer<T> okAction, Consumer<String> errorAction) {
        value().ifPresent(okAction);
        error().ifPresent(errorAction);
    }

    static ConversionResult<LocalDate> ofDate(String raw, String pattern) {
        try {
            var value = LocalDate.parse(raw, DateTimeFormatter.ofPattern(pattern));
            return new Ok<>(raw, value);
        } catch (Exception e) {
            var error = String.format("Invalid date value '%s'. Expected pattern '%s'.", raw, pattern);
            return new Error<>(raw, error);
        }
    }

    // more conversion operations
}
record Ok<T>(String raw, T actualValue) implements ConversionResult<T> {
    public Optional<T> value() {
        return Optional.of(actualValue);
    }
}

record Error<T>(String raw, String actualError) implements ConversionResult<T> {
    public Optional<String> error() {
        return Optional.of(actualError);
    }
}
Usage would be something like
var okConv = ConversionResult.ofDate("12.03.2020", "dd.MM.yyyy");
okConv.okOrError(
v -> System.out.println("SUCCESS: "+v),
e -> System.err.println("FAILURE: "+e)
);
System.out.println(okConv);
System.out.println();
var failedConv = ConversionResult.ofDate("12.03.2020", "yyyy-MM-dd");
failedConv.okOrError(
v -> System.out.println("SUCCESS: "+v),
e -> System.err.println("FAILURE: "+e)
);
System.out.println(failedConv);
which leads to the following output...
SUCCESS: 2020-03-12
Ok[raw=12.03.2020, actualValue=2020-03-12]
FAILURE: Invalid date value '12.03.2020'. Expected pattern 'yyyy-MM-dd'.
Error[raw=12.03.2020, actualError=Invalid date value '12.03.2020'. Expected pattern 'yyyy-MM-dd'.]
The only minor issue is that toString now prints the actualValue/actualError component names. And of course we do not NEED to use records for this.
Don't have the rep to comment, but I just wanted to point out that you've essentially reinvented the Either datatype. https://hackage.haskell.org/package/base-4.14.0.0/docs/Data-Either.html or https://www.scala-lang.org/api/2.9.3/scala/Either.html. I find Try, Either, and Validation to be incredibly useful for parsing and there are a few java libraries with this functionality that I use: https://github.com/aol/cyclops/tree/master/cyclops and https://www.vavr.io/vavr-docs/#_either.
Unfortunately, I think your main question is still open (and I'd be interested in finding an answer).
Doing something like

RecordA(String a)
RecordAandB(String a, Integer b)

to deal with an immutable data carrier with a null b seems bad, but wrapping recordA(String a, Integer b) to have an Optional getB somewhere else seems counter-productive. There is almost no point to the record class then, and I think the Lombok @Value is still the best answer. I'm just concerned that it won't play well with deconstruction for pattern matching.
I want to change a lot of data at once - how?
The DAO method below changes just one record; what if I want to change many?
#PutMapping("/roomstatus/update/{id}")
public ResponseEntity<RoomStatus> updatRoomStatus(#PathVariable(value="id") Integer empid,#Valid #RequestBody RoomStatus RsDetails){
RoomStatus emp=roomStatusDAO.findOne(empid);
if(emp==null) {
return ResponseEntity.notFound().build();
}
emp.setRoomNumber(RsDetails.getRoomNumber());
emp.setFloor(RsDetails.getFloor());
emp.setGuestName(RsDetails.getGuestName());
emp.setRoomType(RsDetails.getRoomType());
emp.setBedType(RsDetails.getBedType());
emp.setRoomStatus(RsDetails.getRoomStatus());
emp.setConditions(RsDetails.getConditions());
RoomStatus updateRoomStatus=roomStatusDAO.save(emp);
return ResponseEntity.ok().body(updateRoomStatus);
}
The simplest solution is to pass a collection of RoomStatus objects as the @RequestBody.
@PatchMapping("/roomstatus/update")
public ResponseEntity<List<RoomStatus>> updateRoomsStatus(@Valid @RequestBody List<RoomStatus> rsDetails) {
    List<RoomStatus> updated = rsDetails.stream()
            .map(this::updateRoomStatus)
            .collect(Collectors.toList());
    return ResponseEntity.ok().body(updated);
}

private RoomStatus updateRoomStatus(RoomStatus rsDetails) {
    // Assumes each RoomStatus in the request carries its own id so the existing row can be looked up.
    RoomStatus emp = roomStatusDAO.findOne(rsDetails.getId());
    if (emp == null) {
        return null; // or skip/report the missing rooms as appropriate
    }
    emp.setRoomNumber(rsDetails.getRoomNumber());
    emp.setFloor(rsDetails.getFloor());
    emp.setGuestName(rsDetails.getGuestName());
    emp.setRoomType(rsDetails.getRoomType());
    emp.setBedType(rsDetails.getBedType());
    emp.setRoomStatus(rsDetails.getRoomStatus());
    emp.setConditions(rsDetails.getConditions());
    return roomStatusDAO.save(emp);
}
But be careful of heap size - all of those objects together can be really heavy.
Also note that the objects passed inside the List will not be validated by @Valid on their own, so you may need to handle that validation separately.
PS: If the modification is the same for every object, just require a custom object in the @RequestBody that contains a list of IDs plus the modification. Then call a custom JPQL query from Spring Data that hits the database once instead of making n calls.
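A hedged sketch of that bulk-update idea with Spring Data JPA; the repository name, the updated field, and the id type are assumptions for illustration:

import java.util.List;

import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.Modifying;
import org.springframework.data.jpa.repository.Query;
import org.springframework.data.repository.query.Param;
import org.springframework.transaction.annotation.Transactional;

public interface RoomStatusBulkRepository extends JpaRepository<RoomStatus, Integer> {

    // One JPQL statement updates every matching row in a single round trip.
    @Modifying
    @Transactional
    @Query("UPDATE RoomStatus r SET r.roomStatus = :status WHERE r.id IN :ids")
    int updateRoomStatusForIds(@Param("ids") List<Integer> ids, @Param("status") String status);
}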
Using ModelMapper you can map a lot of data at once; check these links:
https://www.appsdeveloperblog.com/java-objects-mapping-with-modelmapper/
or
https://www.baeldung.com/entity-to-and-from-dto-for-a-java-spring-application
If you want to update your resource completely, use @PutMapping; to update it partially, use @PatchMapping.
First of all, your question is not specific and not formulated well enough.
But as I have understood it, empId is an employee id, and Employee itself has a one-to-many relation to RoomStatus. Your method takes an employeeId and roomStatusDetails, and for each RoomStatus associated with that employeeId you want to set the roomStatusDetails. In that case you can do the following.
First, in your roomStatusDAO interface, add a method to get a collection of RoomStatus by empId, e.g. findAllByEmpId(Integer empId), and provide its implementation in the respective class.
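A hedged sketch of that method; if roomStatusDAO is a Spring Data repository, the derived query below needs no hand-written implementation (the interface name and entity mapping are assumptions):

import java.util.List;

import org.springframework.data.jpa.repository.JpaRepository;

public interface RoomStatusDao extends JpaRepository<RoomStatus, Integer> {

    // Spring Data derives the query from the method name, assuming RoomStatus has an empId property.
    List<RoomStatus> findAllByEmpId(Integer empId);
}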
Then, the solution will be like this:
#PutMapping("/roomstatus/update/{id}")
public ResponseEntity<RoomStatus> updatRoomStatus(#PathVariable(value="id") Integer empid,#Valid #RequestBody RoomStatus RsDetails){
List<RoomStatus> updatedRoomStatuses = roomStatusDAO.findAllByEmpId(empid).stream().map(rs -> setRsDetailsAndSave(rs, RsDetails)).collect(Collectors.toList());
return updatedRoomStatuses.isEmpty() ? ResponseEntity.notFound().build() : ResponseEntity.ok().body(updatedRoomStatus);
}
private RoomStatus setRsDetailsAndSave (RoomStatus rs, RoomStatus RsDetails) {
rs.setRoomNumber(RsDetails.getRoomNumber());
rs.setFloor(RsDetails.getFloor());
rs.setGuestName(RsDetails.getGuestName());
rs.setRoomType(RsDetails.getRoomType());
rs.setBedType(RsDetails.getBedType());
rs.setRoomStatus(RsDetails.getRoomStatus());
rs.setConditions(RsDetails.getConditions());
return roomStatusDAO.save(rs);
}
I'm trying to map an EnumSet to a single integer value via Hibernate.
I have implemented AttributeConverter:
public class RolesToIntConverter implements AttributeConverter<Set<Roles>, Integer> {
#Override
public Integer convertToDatabaseColumn(Set<Roles> attribute) {
return Roles.encode(attribute);
}
#Override
public Set<Roles> convertToEntityAttribute(Integer dbData) {
return Roles.decode(dbData);
}
}
As well as a new SQL dialect:
public class LocalSqlDialect extends MySQL5Dialect {
    public LocalSqlDialect() {
        super();
        registerFunction("bitwise_and", new SQLFunctionTemplate(IntegerType.INSTANCE, "(?1 & ?2)"));
    }
}
Then I call it like this:
public Collection<PersonsEntity> getAll(Roles roles) {
    Query q = getEntityManager().createQuery("SELECT s FROM PersonsEntity AS s WHERE ( bitwise_and(s.roles, :roles) <> 0 )");
    q.setParameter("roles", EnumSet.of(roles));
    List<PersonsEntity> result = (List<PersonsEntity>) q.getResultList();
    return result;
}
This causes several issues:
A ClassCastException, because for some reason each element of the set is passed to the AttributeConverter separately instead of the whole set.
I tried changing the converter's attribute type from Set<Roles> to Object, and then using instanceof I checked whether it was a single object or a set and parsed accordingly. After that I found out that while s.roles = :roles worked fine, calling the registered bitwise function did not even invoke the AttributeConverter.
A ResultSet exception, because after calling the bitwise function with a Set of two values as input, it actually puts ?, ? into the query instead of calling the AttributeConverter, which should merge them into a single number.
The question is: what am I doing wrong? Or do you know a better solution to the problem of mapping an EnumSet to a single database column while being able to assign multiple roles to one entity?
For example, I have value 1 for user, value 2 for manager, value 4 for admin, etc. I want a particular person to be a user and a manager at the same time, which would mean value 3 (1 | 2), and then I want to find them when searching for user only (resp. manager only) via number 1 (resp. 2) - which suggests a bitwise AND.
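For reference, a minimal sketch of the bit-flag encode/decode described here; the real Roles enum and its helpers may well look different:

import java.util.EnumSet;
import java.util.Set;

public enum Roles {
    USER(1), MANAGER(2), ADMIN(4);

    private final int bit;

    Roles(int bit) {
        this.bit = bit;
    }

    // e.g. USER | MANAGER == 3, matching the example above
    public static int encode(Set<Roles> roles) {
        int result = 0;
        for (Roles role : roles) {
            result |= role.bit;
        }
        return result;
    }

    public static Set<Roles> decode(int dbValue) {
        EnumSet<Roles> result = EnumSet.noneOf(Roles.class);
        for (Roles role : values()) {
            if ((dbValue & role.bit) != 0) {
                result.add(role);
            }
        }
        return result;
    }
}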
Thank you in advance for any response!
I'm using JPA 2.0, more precisely EclipseLink. Here's my problem:
I have an entity that has a property like "isPaid". That property is the result of some calculations the entity performs with some of its other fields. Since it is derived from other fields, the property does not have a setter method.
As an example, the getter is something like this:
public boolean isPaid() {
    return this.totalAmount - this.amountPaid == 0;
}
That's just an example. The thing is, I want this property to be calculated and persisted, so I can do a JPQL query like:
SELECT d FROM Debt d WHERE d.isPaid = true
Is this possible? Is there any workaround for this?
I don't want to retrieve all entities to call this method and then filter those that return true.
Here are a couple of options:
1) Create a JPQL query that directly does what you need:
select d from Debt d where (d.totalAmount - d.amountPaid) = 0
The benefit of this approach is that it is simple and will always work. The downside is that your query has to understand how the paid logic is calculated.
2) Create a persisted paid value that stores the calculated value:
@Basic
private boolean paid;

public boolean isPaid() {
    return this.paid;
}

private void updateCalculations() {
    this.paid = (this.totalAmount - this.amountPaid == 0);
}

// using int as example here
public void setTotalAmount(int totalAmount) {
    this.totalAmount = totalAmount;
    updateCalculations();
}

public void setAmountPaid(int amountPaid) {
    this.amountPaid = amountPaid;
    updateCalculations();
}
The benefit of this approach is that you will be able to create a JPQL query that directly checks the boolean value, i.e.,
select d from Debt d where d.paid = true;
Obviously, the downside of this approach is that you need to make sure to recalculate the value anytime you update the underlying values. However, this can be alleviated if you only calculate it on access: in your isPaid() method, calculate the value, assign it to the paid attribute, and then return it. If you go with this approach, you will need to add @PrePersist and @PreUpdate methods that perform the paid calculation and update the paid attribute before the bean is persisted to the datastore (this makes sure the paid value is always correct).
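A hedged sketch of those lifecycle callbacks, reusing the Debt entity from the question's query and the field names from the example above (types and the id field are assumptions):

import javax.persistence.Basic;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.PrePersist;
import javax.persistence.PreUpdate;

@Entity
public class Debt {

    @Id
    private Long id;

    private int totalAmount;
    private int amountPaid;

    @Basic
    private boolean paid;

    // Recalculate the derived value right before the entity is inserted or updated.
    @PrePersist
    @PreUpdate
    private void updateCalculations() {
        this.paid = (this.totalAmount - this.amountPaid == 0);
    }

    public boolean isPaid() {
        return paid;
    }
}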
If you use JPA annotations on your attributes themselves, you can have a getter without a setter and still be able to correctly retrieve and store the values in the database.
Have you seen this: Mapping calculated properties with JPA?
Basically you need a setter one way or the other in order to make JPA happy.
I have a database table with a field that I need to read from and write to via Hibernate. It is a string field, but the contents are encrypted. And for various reasons (e.g. a need to sort the plain text values), the encrypt/decrypt functions are implemented inside the database, not in Java.
The problem I'm struggling with now is finding a way to invoke the encrypt/decrypt functions in Hibernate-generated SQL everywhere that the field is referenced and in a way that's transparent to my application code. Is this possible? I've looked into Hibernate's support for "derived" properties, but unfortunately, that approach doesn't support read-write fields. Any ideas appreciated.
I don't think there's a way to make encryption like you've described it completely transparent to your application. The closest you can get is to make it transparent outside of the entity. In your entity class:
@Entity
@SQLInsert(sql = "INSERT INTO my_table(my_column, id) VALUES(encrypt(?), ?)")
@SQLUpdate(sql = "UPDATE my_table SET my_column = encrypt(?) WHERE id = ?")
public class MyEntity {

    private String myValue;
    ....

    @Formula("decrypt(my_column)")
    public String getValue() {
        return myValue;
    }

    public void setValue(String value) {
        myValue = value;
    }

    @Column(name = "my_column")
    private String getValueCopy() {
        return myValue;
    }

    private void setValueCopy(String value) {
        // intentionally empty: the raw encrypted value read from the column is discarded,
        // since the decrypted value comes from the @Formula-mapped property
    }
}
value is mapped as a derived property, so you should be able to use it in queries.
valueCopy is private and is used to get around the derived property being read-only.
SQLInsert and SQLUpdate are black voodoo magic to force encryption on insert / update. Note that parameter order IS important: you need to find out what order Hibernate would generate the parameters in without the custom insert / update and then replicate it.
You could have a trigger internal to the database that, on retrieval, decrypts the value and replaces the returned result and on insert encrypts the value and replaces the stored result with the encrypted value. You could also do this with a view wrapper - i.e. have an insert trigger on the view, and have the view automatically decrypt the value.
To better explain: have a view that decrypts the value, plus an on-insert trigger linked to that view that encrypts the value before storing it.
Actually, in the end, I went a different route and submitted a patch to Hibernate. It was committed to trunk last week and so I think it will be in the next release following 3.5. Now, in property mappings, you can specify SQL "read" and "write" expressions to call SQL functions or perform some other kind of database-side conversion.
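For readers on later Hibernate versions, a hedged illustration of what such read/write expressions look like with the @ColumnTransformer annotation; the entity and the encrypt/decrypt names are placeholders for the database-side routines described in the question:

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;

import org.hibernate.annotations.ColumnTransformer;

@Entity
public class SecretEntity {

    @Id
    private Long id;

    @Column(name = "my_column")
    @ColumnTransformer(
            read = "decrypt(my_column)", // wrapped around the column in every SELECT
            write = "encrypt(?)")        // wrapped around the bound parameter on INSERT/UPDATE
    private String value;

    // getters and setters omitted
}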
Assuming you have access to the encrypt/decrypt algorithm from within Java, I would set up my mapped class something like
public class EncryptedTable {

    @Column(name = "encrypted_field")
    private String encryptedValue;

    @Transient
    private String value;

    public String getEncryptedValue() {
        return encryptedValue;
    }

    public String getValue() {
        return value;
    }

    public void setEncryptedValue(String encryptedValue) {
        this.encryptedValue = encryptedValue;
        this.value = decrypt(encryptedValue);
    }

    public void setValue(String value) {
        this.value = value;
        this.encryptedValue = encrypt(value);
    }
}
And then use getValue/setValue as the accessors within your program, and leave getEncryptedValue/setEncryptedValue for Hibernate's use when accessing the database.
Why not just use the SQL server encryption that seems to already be in place, by calling a stored proc from Hibernate instead of letting Hibernate generate a query?