Save to database inside CompletableFuture - java

I'm trying to save an entity to an Oracle DB using CrudRepository, from inside a method which returns a CompletableFuture.
In this method I am doing a REST call to get the data, processing it into an Excel file, and then saving it to the DB.
public CompletableFuture<byte[]> getCompletableFuture(**some parameters**) {
    return CompletableFuture.supplyAsync(() -> caller.getForObjectList(**some parameters**))
        .thenApply(listWithData -> {
            Collections.sort(**some sorting**);
            // Inside the exportExcel method I'm creating the excel and calling the save method
            // from my service, which calls the repository extending CrudRepository.
            return excelExportService.exportExcel(listWithData, **some parameters**);
        });
}
I am calling getCompletableFuture(parameters).get() to get the result from the CompletableFuture.
When the save method is called, my entity is populated and everything looks fine, except that nothing is saved in the DB. I think this might be a transaction problem, since the code runs on a separate thread.
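One thing worth checking (a sketch of an assumption, not a confirmed diagnosis): the save has to run inside a transaction that is opened on the worker thread. Putting the export/save logic behind a @Transactional method on a separate Spring bean, so the transactional proxy is not bypassed by self-invocation, is one way to do that. The names MyEntity, MyEntityRepository and buildWorkbook below are hypothetical placeholders:

import java.util.List;

import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class ExcelExportService {

    private final MyEntityRepository repository; // hypothetical CrudRepository

    public ExcelExportService(MyEntityRepository repository) {
        this.repository = repository;
    }

    // Called from the CompletableFuture worker thread; because it is a public method
    // on another Spring bean, the proxy opens and commits a transaction on that thread.
    @Transactional
    public byte[] exportExcel(List<MyEntity> listWithData) {
        byte[] excel = buildWorkbook(listWithData); // hypothetical helper that builds the file
        repository.saveAll(listWithData);           // flushed and committed when this method returns
        return excel;
    }

    private byte[] buildWorkbook(List<MyEntity> listWithData) {
        // ... create the Excel content ...
        return new byte[0];
    }
}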

Related

How can I tell if current session is dirty?

I want to publish an event if and only if there were changes to the DB. I'm running under @Transactional in a Spring context, and I came up with this check:
Session session = entityManager.unwrap(Session.class);
session.isDirty();
That seems to fail for new (Transient) objects:
@Transactional
public Entity save(Entity newEntity) {
    Entity entity = entityRepository.save(newEntity);
    Session session = entityManager.unwrap(Session.class);
    session.isDirty(); // <-- returns `false` ):
    return entity;
}
Based on the answer here https://stackoverflow.com/a/5268617/672689 I would expect it to work and return true.
What am I missing?
UPDATE
Considering @fladdimir's answer: although this function is called in a transaction context, I did add @Transactional (from org.springframework.transaction.annotation) on the function, but I still encounter the same behaviour - isDirty() returns false.
Moreover, as expected, the new entity does not show up in the DB while the program is held on a breakpoint at the session.isDirty() line.
UPDATE_2
I also tried changing the session flush modes before calling the repository save, without any effect:
session.setFlushMode(FlushModeType.COMMIT);
session.setHibernateFlushMode(FlushMode.MANUAL);
First of all, Session.isDirty() has a different meaning than what I understood. It tells whether the current session is holding in-memory changes that have not yet been sent to the DB, whereas I thought it tells whether the transaction contains any modifying statements. When saving a new entity, even inside a transaction, the insert must be sent to the DB in order to get the new entity's id, so isDirty() will always be false right after it.
So I ended up creating a class that extends SessionImpl and holds the change status for the session, updating it on persist and merge calls (the methods Hibernate uses).
This is the class I wrote:
import org.hibernate.HibernateException;
import org.hibernate.internal.SessionCreationOptions;
import org.hibernate.internal.SessionFactoryImpl;
import org.hibernate.internal.SessionImpl;

public class CustomSession extends SessionImpl {

    private boolean changed;

    public CustomSession(SessionFactoryImpl factory, SessionCreationOptions options) {
        super(factory, options);
        changed = false;
    }

    @Override
    public void persist(Object object) throws HibernateException {
        super.persist(object);
        changed = true;
    }

    @Override
    public void flush() throws HibernateException {
        changed = changed || isDirty();
        super.flush();
    }

    public boolean isChanged() {
        return changed || isDirty();
    }
}
In order to use it I had to:
- extend SessionFactoryImpl.SessionBuilderImpl to override the openSession function and return my CustomSession
- extend SessionFactoryImpl to override the withOptions function to return the extended SessionFactoryImpl.SessionBuilderImpl
- extend AbstractDelegatingSessionFactoryBuilderImplementor to override the build function to return the extended SessionFactoryImpl
- implement SessionFactoryBuilderFactory, whose getSessionFactoryBuilder returns the extended AbstractDelegatingSessionFactoryBuilderImplementor
- add an org.hibernate.boot.spi.SessionFactoryBuilderFactory file under META-INF/services containing the fully qualified class name of my SessionFactoryBuilderFactory implementation, so it is picked up via the service loader at startup
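For that last step, assuming the implementation class is called com.example.CustomSessionFactoryBuilderFactory (a made-up name), the service file would be
META-INF/services/org.hibernate.boot.spi.SessionFactoryBuilderFactory
and contain the single line:
com.example.CustomSessionFactoryBuilderFactory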
UPDATE
There was a bug with capturing the "merge" calls (as tremendous7 commented), so I ended up capturing the isDirty state before any flush, and also checking it once more in isChanged().
The following is a different approach you might be able to leverage to track dirtiness.
Though architecturally different from your sample code, it may be closer to your actual goal (I want to publish an event if and only if there were changes to the DB).
Maybe you could use an Interceptor listener to let the entity manager do the heavy lifting and simply TELL you what's dirty. Then you only have to react to it, instead of prodding it to sort out what's dirty in the first place.
Take a look at this article: https://www.baeldung.com/hibernate-entity-lifecycle
It has a lot of test cases that check for dirtiness of objects being saved in various contexts. They rely on a piece of code called the DirtyDataInspector, which listens for any items flagged dirty on flush and simply remembers them (i.e. keeps them in a list), so the unit tests can assert that the things that SHOULD have been dirty were actually flushed as dirty.
The DirtyDataInspector code is on their GitHub. Here's the direct link for ease of access.
Here is the code where the interceptor is applied to the factory so it can take effect. You might need to wire this up in your injection framework accordingly.
The Interceptor it is based on has a TON of lifecycle methods you can probably exploit to get exactly the behavior you want for "do this if there was actually a dirty save that occurred".
You can see the full docs of it here.
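As a rough idea of the shape such an interceptor can take, here is a minimal sketch (assuming Hibernate 5.x, where extending EmptyInterceptor is the simplest route; the class name and fields are made up, not taken from the article):

import java.io.Serializable;
import java.util.ArrayList;
import java.util.List;

import org.hibernate.EmptyInterceptor;
import org.hibernate.type.Type;

// Remembers every entity that Hibernate saves or flushes as dirty, so the caller
// can ask afterwards whether the transaction actually changed anything.
public class DirtyTrackingInterceptor extends EmptyInterceptor {

    private final List<Object> changedEntities = new ArrayList<>();

    @Override
    public boolean onSave(Object entity, Serializable id, Object[] state,
                          String[] propertyNames, Type[] types) {
        changedEntities.add(entity); // newly persisted entity
        return false;                // the interceptor did not modify the entity state
    }

    @Override
    public boolean onFlushDirty(Object entity, Serializable id, Object[] currentState,
                                Object[] previousState, String[] propertyNames, Type[] types) {
        changedEntities.add(entity); // existing entity with modified state
        return false;
    }

    public boolean hasChanges() {
        return !changedEntities.isEmpty();
    }
}

In a Spring Boot setup such an interceptor can usually be registered through the hibernate.session_factory.interceptor property (e.g. via spring.jpa.properties), assuming it has a no-arg constructor.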
We do not know your complete setup, but as @Christian Beikov suggested in the comment, is it possible that the insertion was already flushed before you call isDirty()?
This would happen when you call repository.save(newEntity) without a running transaction, since SimpleJpaRepository's save method is itself annotated with @Transactional:
@Transactional
@Override
public <S extends T> S save(S entity) {
    ...
}
This will wrap the call in a new transaction if none is already active, and flush the insertion to the DB at the end of that transaction, just before the method returns.
You might choose to annotate the method where you call save and isDirty with @Transactional, so that the transaction is created when your method is called and propagated to the repository call. This way the transaction is not committed when save returns, and the session will still be dirty.
(Edit, just for completeness: in case of an identity ID generation strategy, the insertion of a newly created entity is flushed during the repository's save call in order to generate the ID, before the running transaction is committed.)
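If that early flush turns out to be the obstacle, one option (a sketch, not part of the original answer; it assumes a javax.persistence setup, that an Oracle sequence can be used, and made-up entity/sequence names) is a sequence-based generator, which lets Hibernate obtain IDs without flushing the insert:

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.SequenceGenerator;

@Entity
public class MyEntity { // stands in for the question's Entity class

    @Id
    @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "my_entity_seq")
    // allocationSize must match the INCREMENT BY of the database sequence
    @SequenceGenerator(name = "my_entity_seq", sequenceName = "MY_ENTITY_SEQ", allocationSize = 1)
    private Long id;

    // ... other fields, getters and setters ...
}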

Returning Java Object from Mono

I am trying to get a JSON String from a Mono. I tried using the block() method to get the object and it worked fine, but when I use map/flatMap, I don't see the following lines of code being executed. And I can see the account Mono is not empty.
private String getJsonString(Mono<Account> account) {
    account.map(it -> {
        **// call is not coming here**
        String json = mapper.writeValueAsString(it);
        System.out.println(json);
    });
}
Am I doing anything wrong here?
If you give the official documentation a read, you will see that:
Nothing happens until you subscribe
Now, to understand who the subscriber is in a Spring Boot WebFlux based microservice, have a look at this stackoverflow question.
Now, if you think you can mix blocking and reactive implementations in the same service, unfortunately it doesn't work like that. To see why, you have to understand the event loop model that Reactor works on. Calling a block method at any point in the flow is therefore of no good and is equivalent to using the old blocking spring-web methods, because the thread in which the request is being processed gets blocked and waits for the outcome of the I/O operation / network call.
Coming to your question in the comment:
But when i use flatMap in my controller to call handler method it goes service method with Object not mono?serviceRequest-->Mono-->Object how this works?
Let me give you a simple example for this:
Suppose you have an employee application, where you want to fetch details of an employee for a given id.
Now in your controller, you will have an endpoint like this:
@GetMapping("/{tenant}/api/employee/{id}")
public Mono<ResponseEntity> getEmployeeDetails(@PathVariable("id") Long employeeId) {
    return employeeService.getDetails(employeeId)
        .map(ResponseEntity::ok);
}
Now in your service,
public Mono<EmployeeEntity> getDetails(Long employeeId) {
    return employeeRepository.findById(employeeId);
}
And your repository will look like this:
@Repository
public interface EmployeeRepository extends ReactiveCrudRepository<EmployeeEntity, Long> {
}
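Applying the same idea to the original getJsonString snippet, a minimal sketch (assuming the Account type and a Jackson ObjectMapper field named mapper from the question) would return the Mono instead of trying to extract a String from it. The mapping lambda runs only once something downstream subscribes, e.g. when the Mono is returned from a controller endpoint:

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import reactor.core.publisher.Mono;

private final ObjectMapper mapper = new ObjectMapper();

private Mono<String> getJsonString(Mono<Account> account) {
    // The lambda executes only when the resulting Mono is subscribed to,
    // which the framework does when the Mono is returned from an endpoint.
    return account.map(it -> {
        try {
            return mapper.writeValueAsString(it);
        } catch (JsonProcessingException e) {
            throw new RuntimeException(e);
        }
    });
}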

Spring @Transactional. Save entity in main thread and select in new thread in one transaction

I have method in my service:
@Slf4j
@Component
public class ImportChargesHandler implements SendRequestHandler {
    ...

    @Override
    @Transactional
    public void process(SendRequestRequest request, String fullRequest) {
        String guid = sendRequestService.saveImportChargesRequest(...);
        taskExecutor.execute(() -> importChargesOperation.importChargesRequest(request, guid));
    }
}
In this method I have 2 logic parts:
saveImportChargesRequest - creates an object, saves it in the DB and returns an id (guid)
importChargesRequest - called in a new thread; I pass the saved id (guid) and use it inside this method (select by guid)
But when I save the object in the DB in the first method and pass the id to the second method, the select by id in the second method sometimes throws an exception (entity not found) and sometimes succeeds. I think this happens because, at the time of the select, the save from the previous method has not yet persisted the data, so the second thread does not see it.
I tried saveAndFlush() when saving the object to the DB - then by the time I select, the data is already flushed and I can read it. But sometimes I still cannot select it anyway.
-> start main method with @Transactional
------transaction begin------------
-> start first internal method to save entity
-> save entity
-> flush
-> return id
-----start new thread------------
-> start second method
-> select entity by id (id from first method)
-> exception (not found) or successful select (it depends)
------transaction commit-----------
Now I removed @Transactional from the main method, and the first and second methods each have their own @Transactional. Then I have this logic:
-> start main method without @Transactional
------transaction begin------------
-> start first internal method to save entity
-> save entity
-> flush
-> return id
------transaction commit-----------
-----start new thread------------
------transaction begin------------
-> start second method
-> select entity by id (id from first method)
-> successful select
------transaction commit-----------
But I do not know whether this implementation is correct or not. And how can I fix the first implementation - save data in the main thread and select it in a new thread within one transaction?
Why do you need a second thread? Please try to remove the second thread and use only one. If you still need the second thread, then you need to commit the first transaction before reading the result, as you already did. You must also be aware that with your second approach you will have a problem rolling back the transactions if you need to clear all persisted data. That is why it is good to consider doing it in one thread.
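If the outer @Transactional has to stay, one option (a sketch based on Spring's transaction synchronization API, not something from the question; it assumes Spring 5.3+, where TransactionSynchronization has default methods - on older versions TransactionSynchronizationAdapter serves the same purpose) is to start the worker only after the surrounding transaction has committed, so the second thread is guaranteed to see the saved row:

import org.springframework.transaction.support.TransactionSynchronization;
import org.springframework.transaction.support.TransactionSynchronizationManager;

@Override
@Transactional
public void process(SendRequestRequest request, String fullRequest) {
    String guid = sendRequestService.saveImportChargesRequest(...);

    // Defer the async work until the current transaction has committed,
    // so the row identified by guid is visible to the new thread.
    TransactionSynchronizationManager.registerSynchronization(new TransactionSynchronization() {
        @Override
        public void afterCommit() {
            taskExecutor.execute(() -> importChargesOperation.importChargesRequest(request, guid));
        }
    });
}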

StaleObjectStateException when read from Spring Data repository

I have the following Java code:
public Role reproduceIssue(String code) {
    repository.findOneByCode(code);
    roleChangedService.onRoleChanged(code);
    return null;
}
The code first reads a role by code from repository (in one transaction, marked as read only) and then calls another method RoleChangedService#onRoleChanged (in another transaction). The method is like this:
@Transactional
public void onRoleChanged(String code) {
    Role resource = roleRepository.findOneAndLockByCode(code);
    resource.setName(null);
    roleRepository.saveAndFlush(resource);
}
RoleRepository#findOneAndLockByCode is annotated with @Lock(PESSIMISTIC_WRITE).
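For reference, a repository method declared that way would look roughly like this (a sketch; the entity's ID type and the exact base interface are assumptions, not taken from the question):

import javax.persistence.LockModeType;

import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.Lock;

public interface RoleRepository extends JpaRepository<Role, Long> {

    // Acquires a pessimistic write lock (SELECT ... FOR UPDATE) on the row
    // for as long as the surrounding transaction is active.
    @Lock(LockModeType.PESSIMISTIC_WRITE)
    Role findOneAndLockByCode(String code);
}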
Every time I send 2 HTTP requests that call the reproduceIssue method in parallel, I get org.hibernate.StaleObjectStateException: Row was updated or deleted by another transaction (or unsaved-value mapping was incorrect) on the RoleRepository#findOneAndLockByCode method call.
I wonder, how is the exception possible, given that I use the lock on this method?
The reproduction project (as well as the shell scripts that simulate the parallel call) is available here.

Execute function from model in viewcontroller

In my app (Fusion Web) there is a ViewObject from an Oracle DB.
I created the Java classes and built a specific method (makeUniqueSearchByDate(String)) to process the data.
This method appears in "Data controls", which I can drag to the view and use like any other function. When I try to use it in a bean (instead of dragging it directly):
public void setDate(ActionEvent actionEvent) {
    ApplicationModule appMod =
        Configuration.createRootApplicationModule("com.svr.model.AppModule", "AppModuleLocal");
    ViewModelosByDataImpl fo = (ViewModelosByDataImpl) appMod.findViewObject("ViewModelosByData1");
    String dateV = "07-01-2013";
    fo.makeUniqueSearchByDate(dateV);
}
This code has no effect on the table. Can anyone see why?
Btw, the program does not throw any exception; it just does not work - the table remains the same. But if I use the button automatically generated by drag and drop, the function runs normally. I know I should study ADF, but unfortunately I have no time.
I think after you have exposed the method written in the VO as a client interface, you need to create a method binding in the pageDef file of your page. After creating the method binding, you need to access the method in the bean through the binding layer, something like this:
OperationBinding op = ((DCBindingContainer) BindingContext.getCurrent().getCurrentBindingsEntry())
        .getOperationBinding("Method Binding");
op.execute();
I think the way you are calling the VO method from the bean is not right.
I think one more thing you need to do in your bean after calling the VO method is to refresh the table / perform PPR programmatically:
AdfFacesContext adfFacesContext = AdfFacesContext.getCurrentInstance();
adfFacesContext.addPartialTarget(component binding for your table component);
You can try setting autoSubmit to true for the command button which invokes the action event, and set the table's partialTriggers to the component id of the command button.
Can you post the VO method code as well?
Does the method get called and does the data get committed/updated when you execute it through the bean? Is it only a table refresh issue? Do you see changes to the data if you manually refresh the page?
