I'm building a Spring Boot 2.1.6 REST API in which one of the controllers receives frequent requests. Each request contains data pertaining to some client event. The client is a video player which sends an event each time a user viewed a video, viewed 25%, 50% etc. of it, clicked, and so on. There may be around 20 events per client cycle.
I need to save each event in the database. As far as I understand, inserting an event into the database inside the controller is not the best solution because inserts are expensive, especially if one happens on every call to the controller.
Therefore I thought to accumulate the events inside the controller in some data structure, perhaps a queue. Then, when the queue reaches a certain size, I'd send all the events via JMS to a consumer which updates the db with all of them (in a separate thread).
Controllers in Spring are singletons by default (and I want to keep them that way). Therefore I will need to synchronize the controller's queue, because I don't think that declaring the queue static would prevent synchronization issues. In addition, if I use synchronized inside the controller, I'm not taking advantage of the multi-threading model that Tomcat uses.
What is the best practice to save controller state / perform synchronization in such a case?
This is my controller:
@RestController
public class TrackingEventController {

    @Autowired
    private DBHandler db;

    @Autowired
    private JmsTemplate jmsTemplate;

    private Queue<TrackingEvent> queue;

    public TrackingEventController() {
        this.queue = new ArrayBlockingQueue<>(30);
    }

    @CrossOrigin(origins = "*")
    @RequestMapping(method = GET, path = trackingEventPath)
    public ResponseEntity<Object> handleTrackingEvent(
            @RequestParam(name = Routes.event) String event,
            @RequestParam(name = Routes.pubId) String pubId,
            @RequestParam(name = Routes.advId) String advId) {
        TrackingEvent tr = new TrackingEvent(event, pubId, advId);
        this.trySendEventsMessage(tr);
        return new ResponseEntity<>(null, new HttpHeaders(), HttpStatus.OK);
    }

    private synchronized void trySendEventsMessage(TrackingEvent tr) {
        this.queue.add(tr);
        if (this.queue.size() >= eventsMapMaxMessageSize) {
            jmsTemplate.convertAndSend(jmsTopicName, this.queue);
            queue.clear();
        }
    }
}
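For what it's worth, a minimal sketch of one way to batch without synchronizing the whole handler: drain the queue atomically into a local batch before sending, so concurrent request threads never send the same event twice. This is only an illustration, assuming the same jmsTemplate, jmsTopicName and eventsMapMaxMessageSize fields as above.

// Sketch only, not the asker's code: thread-safe batching via BlockingQueue#drainTo.
private final BlockingQueue<TrackingEvent> queue = new LinkedBlockingQueue<>();

private void trySendEventsMessage(TrackingEvent tr) {
    queue.add(tr);
    if (queue.size() >= eventsMapMaxMessageSize) {
        List<TrackingEvent> batch = new ArrayList<>();
        // drainTo atomically moves up to N queued events into the local batch;
        // concurrent callers receive disjoint events, so nothing is sent twice
        queue.drainTo(batch, eventsMapMaxMessageSize);
        if (!batch.isEmpty()) {
            jmsTemplate.convertAndSend(jmsTopicName, batch);
        }
    }
}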
Related
I'm new to Project Reactor, but I have a task to send some information from a classic Spring REST controller to a service that interacts with a different system. The whole project is developed with Project Reactor.
Here is my rest controller:
@RestController
public class Controller {

    @Autowired
    Service service;

    @PostMapping("/path")
    public Mono<String> test(@RequestHeader Map<String, String> headers) throws Exception {
        service.saveHeader(headers.get("header"));
        return service.getData();
    }
}
And here is my service:
@Service
public class Service {

    private Mono<String> monoHeader;

    private InteractionService interactor;

    public Mono<String> getData() {
        return Mono.fromSupplier(() -> interactor.interact(monoHeader.block()));
    }

    public void saveHeader(String header) {
        String key = "header";
        monoHeader = Mono.just("")
                .flatMap(s -> Mono.subscriberContext()
                        .map(ctx -> s + ctx.get(key)))
                .subscriberContext(ctx -> ctx.put(key, header));
    }
}
Is this an acceptable solution?
First off, I don't think you need the Context here. It is useful to implicitly pass data to a Flux or a Mono that you don't create (e.g. one that a database driver creates for you). But here you're in charge of creating the Mono<String>.
Does the saveHeader service method really achieve something? The call seems transient in nature: you always immediately call the interactor with the last saved header (and there could be a side effect where two parallel calls to your endpoint end up overwriting each other's headers).
If you really want to store the headers, you could add a list or map in your service, but the most logical path would be to add the header as a parameter of getData().
This eliminates monoHeader field and saveHeader method.
Then getData itself: you don't need to ever block() on a Mono if you aim at returning a Mono. Adding an input parameter would allow you to rewrite the method as:
public Mono<String> getData(String header) {
    return Mono.fromSupplier(() -> interactor.interact(header));
}
Last but not least, blocking.
The interactor seems to be an external service or library that is not reactive in nature. If the operation involves some latency (which it probably does) or blocks for more than a few milliseconds, then it should run on a separate thread.
Mono.fromSupplier runs in whatever thread is subscribing to it. In this case, Spring WebFlux will subscribe to it, and it will run in the Netty eventloop thread. If you block that thread, it means no other request can be serviced in the whole application!
So you want to execute the interactor in a dedicated thread, which you can do by using subscribeOn(Schedulers.boundedElastic()).
All in all:
@RestController
public class Controller {

    @Autowired
    Service service;

    @PostMapping("/path")
    public Mono<String> test(@RequestHeader Map<String, String> headers) throws Exception {
        return service.getData(headers.get("header"));
    }
}

@Service
public class Service {

    private InteractionService interactor;

    public Mono<String> getData(String header) {
        return Mono.fromSupplier(() -> interactor.interact(header))
                .subscribeOn(Schedulers.boundedElastic());
    }
}
How to transfer data via reactor's subscriber context?
Is this an acceptable solution?
No.
Your saveHeader() method is equivalent to the simple

public void saveHeader(String header) {
    monoHeader = Mono.just(header);
}
A subscriberContext is needed if you consume the value elsewhere - if the mono is constructed elsewhere. In your case (where you have all code before your eyes in the same method) just use the actual value.
BTW, there are many ways to implement your getData() method.
One, as suggested by Simon Baslé, is to get rid of the separate saveHeader() method.
One other way, if you have to keep your monoHeader field, could be
public Mono<String> getData() {
    return monoHeader.publishOn(Schedulers.boundedElastic())
            .map(header -> interactor.interact(header));
}
I need to send emails/SMS/events as a background async task inside a Spring Boot REST service.
My REST controller
@RestController
public class UserController {

    @PostMapping(value = "/register")
    public ResponseEntity<Object> registerUser(@RequestBody UserRequest userRequest) {
        // I will create the user

        // I need to make the async call to a background job to send email/sms/events
        sendEvents(userId, type); // this shouldn't block the response

        // need to send an immediate response
        Response x = new Response();
        x.setCode("success");
        x.setMessage("success message");
        return new ResponseEntity<>(x, HttpStatus.OK);
    }
}
How can I make the sendEvents call without blocking the response (I don't need the return value, it's just a background task)?
sendEvents calls a third-party SMS/email API or sends events to a Kafka topic.
Thanks.
Sounds like a perfect use case for Spring's @Async annotation.

@Async
public void sendEvents() {
    // this method is executed asynchronously, client code isn't blocked
}
Important: @Async works only on public methods, and the annotated method can't be called from inside the same class. If you put the sendEvents() method in the UserController class it will be executed synchronously (because self-invocation bypasses the proxy mechanism). Create a separate class to extract the asynchronous operation.
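For illustration, a minimal sketch of such a separate component; the NotificationService name and the userId/type parameters are illustrative, not part of the original code:

@Service
public class NotificationService {

    @Async
    public void sendEvents(String userId, String type) {
        // Runs on a task-executor thread, so the caller returns immediately;
        // e.g. call the SMS/email provider or publish to a Kafka topic here.
    }
}

The controller then injects NotificationService and calls notificationService.sendEvents(userId, type); the HTTP response is returned without waiting for this method to finish.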
In order to enable async processing in your Spring Boot application, you have to mark your main class with the appropriate annotation.

@EnableAsync
@SpringBootApplication
public class Application {

    public static void main(String[] args) {
        SpringApplication application = new SpringApplication(Application.class);
        application.run(args);
    }
}
Alternatively, you can also place the #EnableAsync annotation on any #Configuration class. The result will be the same.
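For example, a minimal configuration class would do (the AsyncConfig name is illustrative):

@Configuration
@EnableAsync
public class AsyncConfig {
    // no further code needed; this simply switches on Spring's async processing
}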
The title might be incorrect, but I will try to explain my issue. My project is a Spring Boot project. I have services which do calls to external REST endpoints.
I have a service method which contains several calls to other services of mine. Every individual call can be successful or not. Each call is made to a REST endpoint, and there can be issues, for example the webservice is not available or it throws an unknown exception in rare cases. Whatever happens, I need to be able to track which calls were successful, and if any one of them fails I want to roll back to the original state as if nothing happened, a bit like the @Transactional annotation. All REST calls are different endpoints, need to be called separately, and belong to an external party I have no influence on. Example:
public class MyServiceImpl implements MyService {

    @Autowired
    private Process1Service process1Service;

    @Autowired
    private Process2Service process2Service;

    @Autowired
    private Process3Service process3Service;

    @Autowired
    private Process4Service process4Service;

    public void bundledProcess() {
        process1Service.createFileRESTcall();
        process2Service.addFilePermissionsRESTcall();
        process3Service.addFileMetadataRESTcall(); // <-- might fail, for example
        process4Service.addFileTimestampRESTcall();
    }
}
If for example process3Service.addFileMetadataRESTcall fails I want to do something like undo (in reverse order) for every step before process3:
process2Service.removeFilePermissionsRESTcall();
process1Service.deleteFileRESTcall();
I read about the Command pattern, but that seems to be used for Undo actions inside an application as a sort of history of actions performed, not inside a Spring web application. Is this correct for my use case too or should I track per method/webservice call if it was successful? Is there a best practice for doing this?
I guess however I track it, I need to know which method call failed and from there on perform my 'undo' method REST calls. Although in theory even these calls might also fail of course.
My main goal is to not have files created (in my example) on which the further processing steps have not been performed. It should either all be successful or nothing: a sort of transaction.
Update1: improved pseudo implementation based on comments:
public class Process1ServiceImpl implements Process1Service {

    public void createFileRESTcall() throws MyException {
        // Call an external REST api, pseudo code:
        if (REST-call fails) {
            throw new MyException("External REST api failed");
        }
    }
}
public class BundledProcessEvent {

    private boolean createFileSuccess;
    private boolean addFilePermissionsSuccess;
    private boolean addFileMetadataSuccess;
    private boolean addFileTimestampSuccess;

    // Getters and setters
}
public class MyServiceImpl implements MyService {

    @Autowired
    private Process1Service process1Service;

    @Autowired
    private Process2Service process2Service;

    @Autowired
    private Process3Service process3Service;

    @Autowired
    private Process4Service process4Service;

    @Autowired
    private ApplicationEventPublisher applicationEventPublisher;

    @Transactional(rollbackOn = MyException.class)
    public void bundledProcess() {
        BundledProcessEvent bundledProcessEvent = new BundledProcessEvent();
        this.applicationEventPublisher.publishEvent(bundledProcessEvent);

        bundledProcessEvent.setCreateFileSuccess(process1Service.createFileRESTcall());
        bundledProcessEvent.setAddFilePermissionsSuccess(process2Service.addFilePermissionsRESTcall());
        bundledProcessEvent.setAddFileMetadataSuccess(process3Service.addFileMetadataRESTcall());
        bundledProcessEvent.setAddFileTimestampSuccess(process4Service.addFileTimestampRESTcall());
    }

    @TransactionalEventListener(phase = TransactionPhase.AFTER_ROLLBACK)
    public void rollback(BundledProcessEvent bundledProcessEvent) {
        // If the last process step was successful, we should not
        // be in this rollback method at all
        //if (bundledProcessEvent.isAddFileTimestampSuccess()) {
        //    // remove timestamp
        //}
        if (bundledProcessEvent.isAddFileMetadataSuccess()) {
            // remove metadata
        }
        if (bundledProcessEvent.isAddFilePermissionsSuccess()) {
            // remove file permissions
        }
        if (bundledProcessEvent.isCreateFileSuccess()) {
            // remove file
        }
    }
}
Your operation looks like a transaction, so you can use the @Transactional annotation. From your code I can't really tell how you are managing the HTTP responses for each of those operations, but you should consider having your service methods return them and then rolling back depending on the response codes. You can create an array of processes like so; how exactly you want your logic to work is up to you.
private Process[] restCalls = new Process[] {
    new Process() { public void call() { process1Service.createFileRESTcall(); } },
    new Process() { public void call() { process2Service.addFilePermissionsRESTcall(); } },
    new Process() { public void call() { process3Service.addFileMetadataRESTcall(); } },
    new Process() { public void call() { process4Service.addFileTimestampRESTcall(); } },
};

interface Process {
    void call();
}
@Transactional(rollbackOn = Exception.class)
public void bundledProcess() {
    restCalls[0].call();
    ... // say, see which process returned a wrong response code
}

@TransactionalEventListener(phase = TransactionPhase.AFTER_ROLLBACK)
public void rollback() {
    // handle rollback according to the failed method index
}
Check this article. Might come in handy.
The answer to this question is quite broad. There are too many ways to do distributed transactions to go through them all here. However, since you are using Java and Spring, your best bet is to use something like JTA (Java Transaction API), which enables distributed transactions across multiple services/instances/etc. Fortunately, Spring Boot supports JTA using either Atomikos or Bitronix. You can read the doc here.
One approach to enable distributed transactions is through a message broker such as JMS, RabbitMQ, Kafka, ActiveMQ, etc. and a protocol like XA transactions (two-phase commit). For external services that do not support distributed transactions, one approach is to write a wrapper service that speaks XA transactions to that external service.
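Where two-phase commit is not an option (the endpoints here are plain REST calls), the "undo in reverse order" idea from the question can also be sketched as explicit compensating actions. This is only an illustration, assuming each ProcessXService call throws an unchecked exception on failure; removeFileMetadataRESTcall is a hypothetical undo call mirroring the naming in the question, and Deque/ArrayDeque come from java.util:

public void bundledProcess() {
    // Stack of compensating actions, recorded after each successful step
    Deque<Runnable> compensations = new ArrayDeque<>();
    try {
        process1Service.createFileRESTcall();
        compensations.push(() -> process1Service.deleteFileRESTcall());

        process2Service.addFilePermissionsRESTcall();
        compensations.push(() -> process2Service.removeFilePermissionsRESTcall());

        process3Service.addFileMetadataRESTcall();
        compensations.push(() -> process3Service.removeFileMetadataRESTcall()); // hypothetical undo call

        process4Service.addFileTimestampRESTcall();
    } catch (RuntimeException e) {
        // Undo the completed steps in reverse order; an undo can fail too,
        // so log and continue instead of stopping at the first failed compensation
        while (!compensations.isEmpty()) {
            try {
                compensations.pop().run();
            } catch (RuntimeException undoFailure) {
                // log and carry on with the remaining compensations
            }
        }
        throw e;
    }
}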
I have a Spring web application with concurrent access to a certain resource. This resource holds a list of a certain number of objects which might be fetched by a request. If the list is empty, no SomeClass objects should be returned anymore.
The resource looks like this:
public class Resource {

    private List<SomeClass> someList;

    public List<SomeClass> fetch() {
        List<SomeClass> fetched = new ArrayList<SomeClass>();
        int max = someList.size();
        if (max <= 0) {
            return fetched;
        }
        int added = 0;
        while (added < max) {
            int randomIndex = (int) (Math.random() * max);
            SomeClass someClass = someList.get(randomIndex);
            if (!fetched.contains(someClass)) {
                fetched.add(someClass);
                ++added;
            }
        }
        someList.removeAll(fetched);
        return fetched;
    }
}
This resource is loaded in the service layer, then accessed and saved back to the database:
@Service
public class ResourceService {

    @Autowired
    private ResourceRepository repo;

    public List<SomeClass> fetch(long id) {
        Resource resource = repo.findOne(id);
        List<SomeClass> fetched = resource.fetch();
        repo.save(resource);
        return fetched;
    }
}
I tried to use @Transactional on the ResourceService#fetch method to avoid the problem that two concurrent requests might fetch a SomeClass object from the list although the list was already emptied by the first request, but I'm not sure this is the right approach... Do I have to use @Synchronized on the Resource#fetch method, or introduce an explicit Lock in the service layer?
I need to make sure that only one request is accessing the Resource (fetching a list of SomeClass objects) without throwing an exception. Instead, subsequent requests should be queued and try to access the resource after the current request has finished fetching the list of SomeClass objects.
My final solution was to introduce a BlockingQueue in the @Service and add all incoming requests to it. A separate thread then takes an element from the queue as soon as one is added and processes it.
I think this is the cleanest solution, as adding a ReentrantLock would block request processing.
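For illustration, a rough sketch of that queue-plus-worker idea, reusing the ResourceService from the question; the FetchRequest holder and the worker-thread wiring are illustrative and not the asker's actual code:

// Sketch only: a single worker thread drains the queue, so Resource#fetch is
// never executed concurrently; callers wait on a CompletableFuture for their result.
@Service
public class ResourceService {

    private static final class FetchRequest {
        final long id;
        final CompletableFuture<List<SomeClass>> result = new CompletableFuture<>();
        FetchRequest(long id) { this.id = id; }
    }

    private final BlockingQueue<FetchRequest> requests = new LinkedBlockingQueue<>();

    @Autowired
    private ResourceRepository repo;

    @PostConstruct
    void startWorker() {
        Thread worker = new Thread(() -> {
            while (!Thread.currentThread().isInterrupted()) {
                try {
                    FetchRequest request = requests.take();
                    try {
                        Resource resource = repo.findOne(request.id);
                        List<SomeClass> fetched = resource.fetch();
                        repo.save(resource);
                        request.result.complete(fetched);
                    } catch (RuntimeException e) {
                        // propagate the failure to the waiting caller instead of leaving it hanging
                        request.result.completeExceptionally(e);
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        }, "resource-fetch-worker");
        worker.setDaemon(true);
        worker.start();
    }

    public List<SomeClass> fetch(long id) throws Exception {
        FetchRequest request = new FetchRequest(id);
        requests.put(request);
        return request.result.get(); // the caller blocks until the worker has processed its request
    }
}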
If I have a Java class defined below that is injected in my web application via dependency injection:
public class AccountDao
{
    private NamedParameterJdbcTemplate njt;
    private List<Account> accounts;

    public AccountDao(DataSource ds)
    {
        this.njt = new NamedParameterJdbcTemplate(ds);
        refreshAccounts();
    }

    /* called at creation, and then via API calls to inform the service that new users
       have been added to the database by a separate program */
    public void refreshAccounts()
    {
        this.accounts = /* call to database to get list of accounts */
    }

    // called by every request to the web service
    // ('map' is a lookup keyed by accountId, assumed to be rebuilt from 'accounts')
    public boolean isActiveAccount(String accountId)
    {
        Account a = map.get(accountId);
        return a == null ? false : a.isActive();
    }
}
I am concerned about thread safety. Does the Spring framework not handle cases where one request is reading from the list and it is currently being updated by another? I have used read/write locks before in other applications, but I have never thought about a case such as above before.
I was planning on using the bean as a singleton so I could reduce database load.
By the way, this is a follow up of the below question:
Java Memory Storage to Reduce Database Load - Safe?
EDIT:
So would code like this solve this problem:
/* called at creation, and then via API calls to inform the service that new users
   have been added to the database by a separate program */

// java.util.concurrent.locks.ReadWriteLock
private final ReadWriteLock lock = new ReentrantReadWriteLock();

public void refreshAccounts()
{
    final Lock w = lock.writeLock();
    w.lock();
    try {
        this.accounts = /* call to database to get list of accounts */
    }
    finally {
        w.unlock();
    }
}

// called by every request to the web service
public boolean isActiveAccount(String accountId)
{
    Account a;
    final Lock r = lock.readLock();
    r.lock();
    try {
        a = map.get(accountId);
    }
    finally {
        r.unlock();
    }
    return a == null ? false : a.isActive();
}
The Spring framework does not do anything under the hood concerning the multithreaded behavior of a singleton bean. It is the developer's responsibility to deal with concurrency issues and the thread safety of the singleton bean.
I would suggest reading the below article: Spring Singleton, Request, Session Beans and Thread Safety
You could have asked for clarification on my initial answer. Spring does not synchronize access to a bean. If you have a bean in the default scope (singleton), there will only be a single object for that bean, and all concurrent requests will access that object, requiring that object to the thread safe.
Most Spring beans have no mutable state, and as such are trivially thread safe. Your bean has mutable state, so you need to ensure no thread sees a list of accounts that another thread is currently assembling.
The easiest way to do that is to make the accounts field volatile. That assumes that you assign the new list to the field after having filled it (as you appear to be doing).
private volatile List<Account> accounts;
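Spelled out, that pattern (build the new list locally, then publish it with a single volatile write) might look roughly like the following sketch; loadAccountsFromDatabase() and Account#getId() are illustrative names, not part of the original code:

private volatile List<Account> accounts = Collections.emptyList();

public void refreshAccounts()
{
    // build the replacement list completely before publishing it
    List<Account> fresh = loadAccountsFromDatabase(); // illustrative helper
    this.accounts = fresh; // single volatile write makes the new list visible to readers
}

public boolean isActiveAccount(String accountId)
{
    List<Account> snapshot = this.accounts; // volatile read: never a half-built list
    for (Account a : snapshot) {
        if (a.getId().equals(accountId)) {
            return a.isActive();
        }
    }
    return false;
}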
Since the bean is a singleton and not synchronized, Spring will allow any number of threads to concurrently invoke isActiveAccount and refreshAccounts. So no, this class is not thread-safe as written and will not reduce the database load.
We have a lot of such metadata and some 11 nodes running. On each app node we keep static maps for this data so there is only one instance; it is initialized from the DB once at startup, during off-peak hours every day, or when a support person triggers it. We have an internal, simple HTTP POST based API to send updates from one node to the others for the data that needs real-time updates.
public class AccountDao
{
    private static List<Account> accounts;
    private static List<String> activeAccounts;

    private NamedParameterJdbcTemplate njt;

    static {
        try {
            refreshAccounts();
        } catch (Exception e) {
            // log but do not throw: any uncaught exception in a static block makes the class unusable
        }
    }

    public AccountDao(DataSource ds)
    {
        this.njt = new NamedParameterJdbcTemplate(ds);
        //refreshAccounts();
    }

    /* called at creation, and then via API calls to inform the service that new users
       have been added to the database by a separate program */
    public static void refreshAccounts()
    {
        accounts = /* call to database to get list of accounts */
    }

    public void addAccount(Account acEditedOrAdded)
    {
        // add or remove one row from the map
        // can be called from this node or another node;
        // meaning: if you have 2 nodes, keep the IP/port of each (or use an internal
        // web service or the like) to tell node B when an account is added or changed on node A ...
    }

    // called by every request to the web service
    // ('map' is a lookup keyed by accountId, assumed to be rebuilt from 'accounts')
    public static boolean isActiveAccount(String accountId)
    {
        Account a = map.get(accountId);
        return a == null ? false : a.isActive();
    }
}