blocking EntityManager operations - java

I don't want to perform a blocking operation, but I get this error:
Caused by: java.lang.IllegalStateException: You have attempted to perform a blocking operation on a IO thread. This is not allowed, as blocking the IO thread will cause major performance issues with your application. If you want to perform blocking EntityManager operations make sure you are doing it from a worker thread.
Does anyone know how to fix this problem?
I only have simple operations: a single findAll request that returns 10 rows. I put Transactional NEVER and I still have the problem.
I'm using Panache with a simple entity.
@GET
@Path("/type")
@Produces(MediaType.APPLICATION_JSON)
@Transactional(Transactional.TxType.NEVER)
public Response get() {
    // wrap the list in a Response so it matches the method signature
    return Response.ok(AlerteType.listAll()).build();
}
@Entity
public class AlerteType extends PanacheEntityBase {

    @Column(name = "ATY_ACTIVE")
    private String active;

    @Id
    @Column(name = "ATY_ID")
    private Long oraId;

    @Column(name = "ATY_TYPE")
    private String type;
}
Thanks.

If you want to keep using non-reactive code, you can use the @Blocking annotation on the get() method. It will offload the work to a worker thread (instead of an IO thread).
Quarkus is really picky with IO threads: you cannot block them. Anything like a database call (or any other remote call) is blocking, so you cannot do it on an IO thread.
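For example, a minimal sketch based on the resource from the question (the resource class name is assumed; @Blocking comes from io.smallrye.common.annotation.Blocking, and the imports use the javax namespace, which newer Quarkus versions replace with jakarta):
import io.smallrye.common.annotation.Blocking;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

@Path("/type")
public class AlerteTypeResource {

    @GET
    @Produces(MediaType.APPLICATION_JSON)
    @Blocking // forces execution on a worker thread, so the blocking Panache call is allowed
    public Response get() {
        return Response.ok(AlerteType.listAll()).build();
    }
}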
More info:
https://quarkus.io/guides/getting-started-reactive
https://quarkus.io/blog/resteasy-reactive-faq/

"Controller" methods (request / route / path handlers, or whatever you call them) are executed on an IO thread and are not supposed to do any time-consuming work such as database querying.
If you're not using a reactive database client, try wrapping the calls inside a "Service" class.
@ApplicationScoped
public class AlertService {

    private final AlertType alertType;

    @Inject
    public AlertService(AlertType alertType) {
        this.alertType = alertType;
    }

    public List<Alert> listAll() {
        return this.alertType.listAll();
    }
}

Thank you, but I already had the call in a service.
I found a solution with Mutiny:
@GET
@Path("type")
@Produces(MediaType.APPLICATION_JSON)
public Uni<Response> get() {
    return Uni.createFrom().item(alertTypeService.findAll().get())
            .onItem().transform(data -> Response.ok(data))
            .onFailure().recoverWithItem(err -> Response.status(600, err.getMessage()))
            .onItem().transform(ResponseBuilder::build)
            .emitOn(Infrastructure.getDefaultExecutor());
}
where alertTypeService.findAll() returns a supplier:
@Transactional(Transactional.TxType.NEVER)
public Supplier<Set<AlerteTypeDTO>> findAll() {
    return () -> alerteTypeDAO.streamAll()
            .map(AlertTypeDTOMapper::mapToDTO)
            .collect(Collectors.toSet());
}
I don't know if this is the right solution, but it works.
This way the service provides a supplier which will be invoked by the correct thread.
At least that's how I understood it.
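For reference, one possible variant (just a sketch, not necessarily the canonical approach): passing the Supplier itself to item(...) and moving the subscription to Mutiny's worker pool keeps the database call off the IO thread, instead of invoking the supplier while building the Uni.
@GET
@Path("type")
@Produces(MediaType.APPLICATION_JSON)
public Uni<Response> get() {
    // item(Supplier) defers the call; runSubscriptionOn moves that work to a worker thread
    return Uni.createFrom().item(alertTypeService.findAll())
            .runSubscriptionOn(Infrastructure.getDefaultWorkerPool())
            .onItem().transform(data -> Response.ok(data).build())
            .onFailure().recoverWithItem(err -> Response.serverError().entity(err.getMessage()).build());
}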

Related

Persist objects in DB using reactive programming and JPA repository

I am using WebFlux in my project and am trying to do simple CRUD operations using a JPA repository. However, I'm unable to persist the object in the DB; the object is always persisted with null values instead. Please help. Thanks in advance.
Here is my pojo:
@Entity
@Table(name = "tbl_student")
@Data
public class Student {

    @Id
    @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "student_seq")
    @SequenceGenerator(name = "student_seq", allocationSize = 1)
    @Column(name = "id", insertable = false, nullable = false, updatable = false)
    private Long id;

    @Column(name = "name")
    private String name;

    @Column(name = "school")
    private String school;
}
My Repo:
public interface StudentRepository extends JpaRepository<Student,Long> {
}
My controller:
@RestController
@RequiredArgsConstructor
@Slf4j
public class StudentApiControllerImpl extends StudentApiController {

    private final StudentRepository StudentRepository;
    private final ModelMapper modelMapper;

    public Mono<Void> addStudent(@Valid @RequestBody(required = false) Mono<StudentDetails> studentDetails, ServerWebExchange exchange) {
        StudentRepository.save(modelMapper.map(studentDetails, StudentDTO.class));
        return Mono.empty();
    }

    public Flux<StudentDetails> getStudent(ServerWebExchange exchange) {
        return Flux.fromStream(StudentRepository.findAll().stream().map(v ->
                modelMapper.map(v, LoginCredential.class))).subscribeOn(Schedulers.boundedElastic());
    }
}
You are breaking the reactive chain, that's why nothing happens.
Reactive programming is a lot different from standard Java programming, so you can't just do what you have done before and expect it to work the same.
In reactive programming, one of the most important things to understand is that nothing happens until you subscribe.
A reactive chain is built from a producer and a subscriber, which means someone produces (your application) and someone subscribes (the calling client/application). A Flux/Mono is a producer, and nothing will happen until someone subscribes.
// Nothing happens, this is just a declaration
Mono.just("Foobar");
But when we subscribe:
// This will print FooBar
Mono.just("Foobar").subscribe(s -> System.out.println(s));
So if we look at your code, especially this line
// This is just a declaration, no one is subscribing, so nothing will happen
StudentRepository.save(modelMapper.map(studentDetails, StudentDTO.class));
A common misconception is that people will solve this by just subscribing.
// This is in most cases wrong
StudentRepository.save(modelMapper.map(studentDetails, StudentDTO.class)).subscribe();
The subscriber is the calling client, the one that initiated the call, and subscribing yourself like this might lead to very bad performance under heavier loads. Instead, you need to return the Mono out to the client, so that the client can subscribe.
public Mono<Void> addStudent(@Valid @RequestBody(required = false) Mono<StudentDetails> studentDetails, ServerWebExchange exchange) {
    return StudentRepository.save(modelMapper.map(studentDetails, StudentDTO.class))
            .then();
}
I am using Mono#then here to throw away the return value and just return a void value to the calling client.
But as you can see, you need to think of it like callbacks: you always need to return, so a chain is built up and handed back to the calling client, which can then subscribe.
I highly suggest you read the reactive documentation so you understand the core concepts before starting out with reactive programming.
Reactive programming getting started
Also, another thing: you cannot use JpaRepository in a reactive world, because it uses a blocking database driver. It is basically not written to work well with WebFlux and will give very poor performance. If you want a database connection, I suggest you look into R2DBC by Spring; here is a tutorial to get it up and running: R2DBC getting started
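To illustrate what that could look like (a sketch only, assuming Spring Data R2DBC and a matching R2DBC driver are configured; the Student class is the one from the question, although its JPA annotations would need to be swapped for their R2DBC counterparts):
import org.springframework.data.repository.reactive.ReactiveCrudRepository;
import reactor.core.publisher.Mono;

// Reactive repository: every method returns Mono/Flux instead of a concrete value.
interface ReactiveStudentRepository extends ReactiveCrudRepository<Student, Long> {
}

class ReactiveStudentHandler {

    private final ReactiveStudentRepository repository;

    ReactiveStudentHandler(ReactiveStudentRepository repository) {
        this.repository = repository;
    }

    // The save stays inside the reactive chain; no extra scheduler is needed.
    Mono<Void> addStudent(Mono<Student> student) {
        return student
                .flatMap(repository::save)
                .then();
    }
}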
Update:
I see now that you have placed the entire call on its own scheduler, which is good: since you are using JPA, it at least limits the performance impact. But I still recommend, if possible, looking into R2DBC for a fully reactive application.
Also, if you insist on using JPA with a blocking database driver, you need to wrap your call in a reactive type yourself, since it returns a concrete value. So for instance:
return Mono.fromCallable(() -> StudentRepository.save(modelMapper.map(studentDetails, StudentDTO.class)))
        .then();
This basically executes the function and immediately places the returned value into a Mono, and then, as in the example above, uses Mono#then to discard the return value and just signal that the execution has completed.
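Putting the update and the earlier point together, a sketch of what the blocking save could look like when kept off the event loop (note that it maps the payload to the Student entity the repository expects, whereas the question maps to StudentDTO.class, and the boundedElastic scheduler is a choice, not a requirement):
import reactor.core.publisher.Mono;
import reactor.core.scheduler.Schedulers;

// inside StudentApiControllerImpl from the question
public Mono<Void> addStudent(@Valid @RequestBody(required = false) Mono<StudentDetails> studentDetails,
                             ServerWebExchange exchange) {
    return studentDetails
            // map the payload once it arrives, not the Mono wrapper itself
            .map(details -> modelMapper.map(details, Student.class))
            // wrap the blocking JPA call and keep it off the event loop
            .flatMap(student -> Mono.fromCallable(() -> StudentRepository.save(student))
                    .subscribeOn(Schedulers.boundedElastic()))
            .then();
}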

Transaction handling when wrapping Stream into Flux

I really have issues understanding what's going on behind the scenes when manually wrapping a Stream, received as a query result from Spring Data JPA, into a Flux.
Consider the following:
Entity:
@NoArgsConstructor
@AllArgsConstructor
@Data
@Entity
public class TestEntity {

    @Id
    private Integer a;

    private Integer b;
}
Repository:
public interface TestEntityRepository extends JpaRepository<TestEntity, Integer> {

    Stream<TestEntity> findByBBetween(int b1, int b2);
}
Simple test code:
@Test
@SneakyThrows
@Transactional
public void dbStreamToFluxTest() {
    testEntityRepository.save(new TestEntity(2, 6));
    testEntityRepository.save(new TestEntity(3, 8));
    testEntityRepository.save(new TestEntity(4, 10));

    testEntityFlux(testEntityStream()).subscribe(System.out::println);
    testEntityFlux().subscribe(System.out::println);

    Thread.sleep(200);
}

private Flux<TestEntity> testEntityFlux() {
    return fromStream(this::testEntityStream);
}

private Flux<TestEntity> testEntityFlux(Stream<TestEntity> testEntityStream) {
    return fromStream(() -> testEntityStream);
}

private Stream<TestEntity> testEntityStream() {
    return testEntityRepository.findByBBetween(1, 9);
}

static <T> Flux<T> fromStream(final Supplier<Stream<? extends T>> streamSupplier) {
    return Flux
            .defer(() -> Flux.fromStream(streamSupplier))
            .subscribeOn(Schedulers.elastic());
}
Questions:
Is this the correct way to do what I do, especially regarding the static fromStream method?
While the call to testEntityFlux(testEntityStream()) does what I expect, for reasons I really don't understand, the call to testEntityFlux() runs into an error:
reactor.core.Exceptions$ErrorCallbackNotImplemented: org.springframework.dao.InvalidDataAccessApiUsageException: You're trying to execute a streaming query method without a surrounding transaction that keeps the connection open so that the Stream can actually be consumed. Make sure the code consuming the stream uses #Transactional or any other way of declaring a (read-only) transaction.
Caused by: org.springframework.dao.InvalidDataAccessApiUsageException: You're trying to execute a streaming query method without a surrounding transaction that keeps the connection open so that the Stream can actually be consumed. Make sure the code consuming the stream uses #Transactional or any other way of declaring a (read-only) transaction.
... what usually happens when I forget the @Transactional, which I didn't.
EDIT
Note: The code was inspired by: https://github.com/chang-chao/spring-webflux-reactive-jdbc-sample/blob/master/src/main/java/me/changchao/spring/springwebfluxasyncjdbcsample/service/CityServiceImpl.java which in turn was inspired by https://spring.io/blog/2016/07/20/notes-on-reactive-programming-part-iii-a-simple-http-server-application.
However, the Mono version has the same "issue".
EDIT 2
An example using Optional; note that in testEntityMono(), replacing testEntityOptional() with testEntityOptionalManual() leads to working code. Thus it all seems to be directly related to how JPA does the data fetching:
@SneakyThrows
@Transactional
public void dbOptionalToMonoTest() {
    testEntityRepository.save(new TestEntity(2, 6));
    testEntityRepository.save(new TestEntity(3, 8));
    testEntityRepository.save(new TestEntity(4, 10));

    testEntityMono(testEntityOptional()).subscribe(System.out::println);
    testEntityMono().subscribe(System.out::println);

    Thread.sleep(1200);
}

private Mono<TestEntity> testEntityMono() {
    return fromSingle(() -> testEntityOptional().get());
}

private Mono<TestEntity> testEntityMono(Optional<TestEntity> testEntity) {
    return fromSingle(() -> testEntity.get());
}

private Optional<TestEntity> testEntityOptional() {
    return testEntityRepository.findById(4);
}

@SneakyThrows
private Optional<TestEntity> testEntityOptionalManual() {
    Thread.sleep(1000);
    return Optional.of(new TestEntity(20, 20));
}

static <T> Mono<T> fromSingle(final Supplier<T> tSupplier) {
    return Mono
            .defer(() -> Mono.fromSupplier(tSupplier))
            .subscribeOn(Schedulers.elastic());
}
TL;DR:
It boils down to the differences between imperative and reactive programming assumptions and Thread affinity.
Details
To understand why your arrangement ends in a failure, we first need to look at what happens with transaction management.
Using a #Transactional method creates a transactional scope for all code within the method. Transactional methods returning scalar values, Stream, collection-like types, or void (basically non-reactive types) are considered imperative transactional methods.
In imperative programming, flows stick to their carrier Thread. The code is expected to remain on the same Thread and not to switch threads. Therefore, transaction management associates transactional state and resources with the carrier Thread in a ThreadLocal storage. As soon as code within a transactional method switches threads (e.g. spinning up a new Thread or using a Thread pool), the unit of work that gets executed on a different Thread leaves the transactional scope and potentially runs in its own transaction. In the worst case, the transaction is left open on an external Thread because there is no transaction manager monitoring entry/exit of the transactional unit of work.
#Transactional methods returning a reactive type (such as Mono or Flux) are subject to reactive transaction management. Reactive transaction management is different from imperative transaction management as the transactional state is attached to a Subscription, specifically the subscriber Context. The context is only available with reactive types, not with scalar types as there are no means to attach data to void or a String.
Looking at the code:
@Test
@Transactional
public void dbStreamToFluxTest() {
    // …
}
we see that this method is a @Transactional test method. Here we have two things to consider:
The method returns void, so it is subject to imperative transaction management, associating the transactional state with a ThreadLocal.
There's no reactive transaction support for @Test methods, because typically a Publisher is expected to be returned from the method, and by doing so there would be no way to assert the outcome of the stream.
@Test
@Transactional
public Publisher<Object> thisDoesNotWork() {
    return myRepository.findAll(); // Where did my assertions go?
}
Let's take a closer look at the fromStream(…) method:
static <T> Flux<T> fromStream(final Supplier<Stream<? extends T>> streamSupplier) {
    return Flux
            .defer(() -> Flux.fromStream(streamSupplier))
            .subscribeOn(Schedulers.elastic());
}
The code accepts a Supplier that returns a Stream. Next, subscription signals (subscribe(…), request(…)) are instructed to happen on the elastic Scheduler, which effectively switches the Thread on which the Stream gets created and consumed. Therefore, subscribeOn causes the Stream creation (the call to findByBBetween(…)) to happen on a different Thread than your carrier Thread.
Removing subscribeOn(…) will fix your issue.
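In code, the suggested fix would look roughly like this (a sketch of the helper from the question, minus the scheduler hop):
// Without subscribeOn, the Stream is created and consumed on the calling
// (carrier) Thread, which still holds the transaction in its ThreadLocal.
static <T> Flux<T> fromStream(final Supplier<Stream<? extends T>> streamSupplier) {
    return Flux.defer(() -> Flux.fromStream(streamSupplier));
}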
There is a bit more to explain why you want to refrain from using reactive types with JPA. Reactive programming has no strong Thread affinity. Thread switching may occur at any time. Depending on how you use the resulting Flux and how you have designed your entities, you might experience visibility issues as entities are passed across threads. Ideally, data in a reactive context remains immutable. Such an approach does not always comply with JPA rules.
Another aspect is lazy loading. By using JPA entities from threads other than the carrier Thread, the entity may not be able to correlate its context back to the JPA Transaction. You can easily run into LazyInitializationException without being aware of why this is as Thread switching can be opaque to you.
The recommendation is: Do not use reactive types with JPA or any other transactional resources. Stay with Java 8 Stream instead.
The Stream returned by the repository is lazy. It uses the connection to the database in order to get the rows when the stream is being consumed by a terminal operation.
The connection is bound to the current transaction, and the current transaction is stored in a ThreadLocal variable, i.e. it is bound to the thread that is executing your test method.
But the consumption of the stream is done on a separate thread, belonging to the thread pool used by the elastic scheduler of Reactor. So you create the lazy stream on the main thread, which has the transaction bound to it, but you consume the stream on a separate thread, which doesn't have the transaction bound to it.
Don't use reactor with JPA transactions and entities. They're incompatible.

What is a safe way of making a Future available between HTTP requests?

I have a long-running operation in a Spring Boot web application.
This is how it works:
When the user clicks a button, a POST request is made and the operation starts.
Because the operation will take a long time, it is started asynchronously and a response is sent immediately.
Using JavaScript, I periodically send GET requests to find out if the operation has finished.
Here are the request handlers:
import java.util.concurrent.Future;

@RequestMapping(value = "/start", method = RequestMethod.POST)
@ResponseBody
String start(HttpSession session) {
    Future<String> result = resultService.result();
    session.setAttribute("result", result);
    return "started";
}

@RequestMapping(value = "/status")
@ResponseBody
String status(HttpSession session) throws Exception {
    @SuppressWarnings("unchecked")
    Future<String> result = (Future<String>) session.getAttribute("result");
    if (result != null && result.isDone()) {
        return result.get();
    } else {
        return "working";
    }
}
And this is the long-running operation (in a separate bean):
import org.springframework.scheduling.annotation.Async;
import org.springframework.scheduling.annotation.AsyncResult;

@Async
@Override
public Future<String> result() {
    String result = computeResult(); // takes long
    return new AsyncResult<String>(result);
}
The complete example is on GitLab.
Also, here's a GIF showing how it works.
Now, this works, but the problem is that SonarQube raised an issue:
Make "Future" and its parameters serializable or don't store it in the session.
It explained that
the session [...] may be written to disk anyway, as the server manages its memory use in a process called "passivation". Further, some servers automatically write their active sessions out to file at shutdown & deserialize any such sessions at startup.
See MITRE, CWE-579 - J2EE Bad Practices: Non-serializable Object Stored in Session
Since I can't make Future serializable, what would be a better way to keep track of the long-running operation between requests?
To fix the above issue, you can write a wrapper class implementing Serializable that holds the result of the Future along with the Future itself marked transient. You can then place this wrapper object in the session instead of putting the Future in directly.
Ex:
public class ResultWrapper implements Serializable {

    private String result = "working"; // String, since resultService.result() returns Future<String>
    private transient Future<String> future; // transient, as Future is not serializable

    public String getResult() {
        if (future != null && future.isDone()) {
            try {
                result = future.get();
                // Once the future is done, call session.setAttribute(...) again so that
                // the value of the result field is replicated across the JVM nodes.
            } catch (InterruptedException | ExecutionException e) {
                result = e.getMessage();
            }
        }
        return result;
    }
}
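Usage could then look roughly like this (a sketch; it assumes you add a constructor taking the Future, which the class above does not show):
@RequestMapping(value = "/start", method = RequestMethod.POST)
@ResponseBody
String start(HttpSession session) {
    // assumed constructor: ResultWrapper(Future<String> future)
    session.setAttribute("result", new ResultWrapper(resultService.result()));
    return "started";
}

@RequestMapping(value = "/status")
@ResponseBody
String status(HttpSession session) {
    ResultWrapper wrapper = (ResultWrapper) session.getAttribute("result");
    return wrapper != null ? wrapper.getResult() : "working";
}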
Note that this only resolves the issue raised by SonarQube. It doesn't really provide failover or handle activation/passivation, even if session replication is active.
If there are two nodes M1 & M2 on which the webapp is running with session replication in place, the async job computeResult() will obviously be running on only one of the machines (the one which received the initial request). If that machine goes down, all requests will be forwarded to the other active machine, and the result will always be "working" forever.
Another issue, which applies even if the webapp is running on a single node, is that if the session gets passivated, the Future will not be passivated because it is transient, so you will lose the reference to that object and the reactivated wrapper will have a null future. The end result is the same as in the case above.

Asynchronous processing with fallback java

I am working on a Java application that makes calls to a web service. I don't want to incur additional latency while making these calls, hence I am planning to do this asynchronously. Using threads is one way to go about it, but this approach might become unreliable if the calls to the dependent service fail for various reasons. So essentially what I am looking for is some kind of in-process asynchronous service that will fall back to temporary storage (an in-process database?) and retry the failed requests. Are there any existing solutions out there that achieve this? If not, it would help if someone could point me to something that does a similar task.
Thanks
Actually, I've not yet tried it, but Reactor is something like Node.js and should allow you to program using an event-driven paradigm.
Please check it out and let us know if it suits your needs.
Quarkus has easy built-in reactive asynchronous call processing. For example:
@ApplicationScoped
@RegisterForReflection
@Path("/")
public class JokesServiceCallManager {

    private static final Logger LOGGER = LoggerFactory.getLogger(JokesServiceCallManager.class);

    @Inject
    JokeResponseHandler jokeResponseHandler;

    @Inject
    JokeSetupAdapter jokesSetupAdapter;

    @Inject
    JokeReactionAdapter jokesReactionAdapter;

    public Uni<Response> getData(String id, String correlationId) {
        LOGGER.debug("********** getData(String id) id = " + id);
        Uni<Response> uniSetup = jokesSetupAdapter.getByIdAsync(id);
        Uni<Response> uniReaction = jokesReactionAdapter.getByIdAsync(id);
        return Uni.combine().all().unis(uniSetup, uniReaction)
                .asTuple().onItem().transformToUni(tuple ->
                        jokeResponseHandler.createUniResponse(tuple.getItem1(), tuple.getItem2()));
        // .onFailure().invoke(exception -> jokeResponseHandler.buildUniExceptionResponse(exception));
    }
}
This returns a tuple of all the call results when they complete.
Simply having the service return an Uni makes it all reactive (non-blocking).
The calls are as simple as:
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;
import org.eclipse.microprofile.rest.client.annotation.RegisterClientHeaders;
import org.eclipse.microprofile.rest.client.inject.RegisterRestClient;
import io.smallrye.mutiny.Uni;

@RegisterRestClient(configKey = "setup-api")
@RegisterClientHeaders(SetupHeaderFactory.class)
public interface JokesSetupService {

    @GET
    @Path("/jokesetup/{id}")
    @Produces(MediaType.APPLICATION_JSON)
    Uni<Response> getByIdAsync(@PathParam("id") String id);
}
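For completeness, the interface is typically injected where it is used, for example in the adapter referenced above (a sketch; the adapter class is not shown in the original answer, so its shape is assumed):
import javax.enterprise.context.ApplicationScoped;
import javax.inject.Inject;
import javax.ws.rs.core.Response;
import org.eclipse.microprofile.rest.client.inject.RestClient;
import io.smallrye.mutiny.Uni;

@ApplicationScoped
public class JokeSetupAdapter {

    @Inject
    @RestClient
    JokesSetupService jokesSetupService; // Quarkus generates a client for the interface above

    public Uni<Response> getByIdAsync(String id) {
        // Non-blocking: the Uni completes when the remote call returns
        return jokesSetupService.getByIdAsync(id);
    }
}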

Java concurrency : Making webservice access threadsafe

I'd like to know the correct / best way to handle concurrency with an Axis2 web service.
E.g., given this code:
public class MyServiceDelegate {

    @Resource
    UserWebService service; // injected by Spring

    public CustomerDTO getCustomer() {
        String sessionString = getSessionStringFromCookies();
        service.setJSESSIONID(sessionString);
        CustomerDTO customer = service.getCustomerFromSessionID();
        return customer;
    }
}
Note that in the above, UserWebService is a 3rd-party API. The service requires that when making calls, we pass a cookie with the JSESSIONID of an authenticated session.
Am I correct in assuming that this statement is not thread-safe? I.e., given two threads, is it possible for the following to occur?
ThreadA : service.setJSESSIONID("threadA")
ThreadB : service.setJSESSIONID("threadB")
ThreadA : service.getCustomerFromSessionID // service.sesionID == "threadB"
If so, what's the most appropriate way to handle this situation? Should I use a resource pool for service? Or should I declare service as synchronized?
public CustomerDTO getCustomer() {
    synchronized (service) {
        service.setJSESSIONID(sessionString);
        CustomerDTO customer = service.getCustomerFromSessionID();
        return customer;
    }
}
Or, is there another, more appropriate way to handle this problem?
Would each thread have its own Delegate object and hence its own UserWebService service?
In the simple case, if delegates are created on the stack the threads would be independent.
If the cost of creation is high, have a pool of the delegate objects. Taking one from the pool is comparatively cheap. You need to be very careful with the housekeeping, but effectively this is what is done with database connections. Some environments have utility classes for managing such pooling, which tends to be preferable to rolling your own.
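One lightweight way to give each thread its own instance, without a full pool, is a ThreadLocal (just a sketch; newUserWebService() and getSessionStringFromCookies() are placeholders for however your code obtains a client and the session cookie):
public class MyServiceDelegate {

    // One UserWebService per thread, created lazily on first use.
    private static final ThreadLocal<UserWebService> SERVICE =
            ThreadLocal.withInitial(MyServiceDelegate::newUserWebService);

    public CustomerDTO getCustomer() {
        UserWebService service = SERVICE.get();
        // No interleaving is possible: no other thread ever sees this instance.
        service.setJSESSIONID(getSessionStringFromCookies());
        return service.getCustomerFromSessionID();
    }

    private static UserWebService newUserWebService() {
        // construct or look up a fresh client instance here (specific to the 3rd-party API)
        throw new UnsupportedOperationException("wire in client creation");
    }

    private String getSessionStringFromCookies() {
        return "..."; // same helper as in the question
    }
}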
Is UserWebService one of your classes? If so, I think I'd change the method signature to:
public CustomerDTO getCustomer() {
    return service.getCustomerFromSessionID(sessionString);
}
And not have your UserWebService maintain state; that way it will be inherently thread-safe.
As you said, the function is not thread-safe. Java has a simple way to make monitors; a monitor is an object that allows only one thread at a time into its synchronized sections. More info on monitors
To make it thread-safe, you can put synchronized either around the block, as you did, or on the method itself:
public synchronized CustomerDTO getCustomer() {
    service.setJSESSIONID(sessionString);
    CustomerDTO customer = service.getCustomerFromSessionID();
    return customer;
}
The difference between the two is which object you turn into a monitor.
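To make that concrete, a small illustration (class and method names are made up):
public class MonitorExample {

    private final Object service = new Object();

    // Monitor is "this": two threads calling any synchronized instance method
    // on the same MonitorExample instance block each other.
    public synchronized void viaSynchronizedMethod() {
        // critical section guarded by the MonitorExample instance
    }

    // Monitor is "service": only code synchronizing on that same service object
    // is mutually excluded; synchronized methods on "this" are not affected.
    public void viaSynchronizedBlock() {
        synchronized (service) {
            // critical section guarded by the service instance
        }
    }
}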
