I have a REST API with some synchronized tasks that are giving me errors; I'll try to explain the idea.
Suppose I have the following code in my Controller class:
#Qualifier("serviceA")
#Autowired
private UpdateService updateServiceA; // connected to host A
#Qualifier("serviceB")
#Autowired
private UpdateService updateServiceB; // connected to host B
#Autowired
private ExecutorService myExecutorService;
// one of my endpoints calls update method below
public void update(Object x) {
myExecutorService.submit(() -> {
updateServiceA.remove(x); // remove x on host A
updateServiceB.remove(x); // remove x on host B
}
}
And in UpdateServiceImpl class I have the following:
public class UpdateServiceImpl implements UpdateService {
    public synchronized void remove(Object x) {
        // 1. find x
        // 2. if x exists, remove it
        // Response response = elasticSearch.performRequest(...)
    }
}
And the beans are configured in a configuration class as follows:
@Bean
public ExecutorService executorService() {
    return Executors.newFixedThreadPool(Runtime.getRuntime().availableProcessors() + 1);
}

@Qualifier("serviceA")
@Bean
public UpdateService getUpdateServiceA() {
    return new UpdateServiceImpl();
}

@Qualifier("serviceB")
@Bean
public UpdateService getUpdateServiceB() {
    return new UpdateServiceImpl();
}
The backend is Elasticsearch; the remove method just performs a delete operation.
Once in a while I see in my logs that an entry is found, but when the code tries to remove it, it no longer exists. I get the feeling that another thread (?) already removed the object in the meantime. I don't understand why the synchronized method is not working properly.
Am I missing something here?
Related
I am trying to use a BlockingQueue inside Spring Boot. My design was like this: a user submits a request via a controller, and the controller in turn puts some objects onto a blocking queue. After that, the consumer should be able to take the objects and process them further.
I have used @Async, a thread pool, and an @EventListener. However, with my code below I found that the consumer class is not consuming objects. Could you please point out how to improve it?
Queue Configuration
@Bean
public BlockingQueue<MyObject> myQueue() {
    return new PriorityBlockingQueue<>();
}

@Bean
public Executor getAsyncExecutor() {
    ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
    executor.setCorePoolSize(3);
    executor.setMaxPoolSize(3);
    executor.setQueueCapacity(10);
    executor.setThreadNamePrefix("Test-");
    executor.initialize();
    return executor;
}
Rest Controller
@Autowired
BlockingQueue<MyObject> myQueue;

@RequestMapping(path = "/api/produce")
public void produce() throws InterruptedException {
    /* Do something */
    MyObject myObject = new MyObject();
    myQueue.put(myObject);
}
Consumer Class
@Autowired
private BlockingQueue<MyObject> myQueue;

@EventListener
public void onApplicationEvent(ContextRefreshedEvent event) {
    consume();
}

@Async
public void consume() {
    while (true) {
        try {
            MyObject myObject = myQueue.take();
        }
        catch (Exception e) {
        }
    }
}
Your idea is to use a queue to store messages, while the consumer listens to Spring events and consumes them.
However, I don't see your code actually publishing an event; it just stores the objects in the queue.
If you want to use Spring events, the producer could look like this:
@Autowired
private ApplicationEventPublisher applicationEventPublisher;

public void doStuffAndPublishAnEvent(final String message) {
    System.out.println("Publishing custom event. ");
    CustomSpringEvent customSpringEvent = new CustomSpringEvent(this, message);
    applicationEventPublisher.publishEvent(customSpringEvent);
}
check this doc
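For completeness, a minimal listener counterpart (just a sketch, assuming a CustomSpringEvent class that exposes its message via getMessage(), as in that doc; the listener class name is a placeholder):

@Component
public class CustomSpringEventListener {

    // Invoked by Spring whenever a CustomSpringEvent is published.
    @EventListener
    public void handleCustomEvent(CustomSpringEvent event) {
        System.out.println("Received custom event: " + event.getMessage());
    }
}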
If you still want to use BlockingQueue, your consumer should be a running thread, continuously waiting for tasks in the queue, like:
public class NumbersConsumer implements Runnable {
    private BlockingQueue<Integer> queue;
    private final int poisonPill;

    public NumbersConsumer(BlockingQueue<Integer> queue, int poisonPill) {
        this.queue = queue;
        this.poisonPill = poisonPill;
    }

    public void run() {
        try {
            while (true) {
                Integer number = queue.take(); // always waiting
                if (number.equals(poisonPill)) {
                    return;
                }
                System.out.println(Thread.currentThread().getName() + " result: " + number);
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}
You could check this code example.
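To wire such a consumer into Spring Boot, one option (a sketch; the numbersQueue bean and the -1 poison pill are arbitrary choices for illustration) is to start the thread once the application is up, for example from an ApplicationRunner:

@Bean
public BlockingQueue<Integer> numbersQueue() {
    return new LinkedBlockingQueue<>();
}

@Bean
public ApplicationRunner numbersConsumerRunner(BlockingQueue<Integer> numbersQueue) {
    // Start the consumer on its own thread once the application context is ready.
    return args -> new Thread(new NumbersConsumer(numbersQueue, -1), "numbers-consumer").start();
}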
@Async doesn't actually start a new thread if the target method is called from within the same object instance, which could be the problem in your case.
Also note that you need to put @EnableAsync on a configuration class to enable the @Async annotation.
See Spring documentation: https://docs.spring.io/spring-framework/docs/current/reference/html/integration.html#scheduling-annotation-support
The default advice mode for processing @Async annotations is proxy which allows for interception of calls through the proxy only. Local calls within the same class cannot get intercepted that way. For a more advanced mode of interception, consider switching to aspectj mode in combination with compile-time or load-time weaving.
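As a sketch of one way around that with the code above (splitting the consuming loop into its own bean so the @Async call goes through the Spring proxy; the QueueConsumer and QueueConsumerStarter names are placeholders):

@Component
public class QueueConsumer {

    @Autowired
    private BlockingQueue<MyObject> myQueue;

    // Called through the Spring proxy, so @Async takes effect and the
    // loop runs on the async executor instead of blocking the startup thread.
    @Async
    public void consume() {
        while (true) {
            try {
                MyObject myObject = myQueue.take();
                // process myObject here
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return;
            }
        }
    }
}

@Component
public class QueueConsumerStarter {

    @Autowired
    private QueueConsumer queueConsumer;

    // Calling consume() on an injected bean (not on this) lets the proxy intercept it.
    @EventListener
    public void onApplicationEvent(ContextRefreshedEvent event) {
        queueConsumer.consume();
    }
}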
In the end I came up with this solution.
Rest Controller
@Autowired
BlockingQueue<MyObject> myQueue;

@RequestMapping(path = "/api/produce")
public void produce() throws InterruptedException {
    /* Do something */
    MyObject myObject = new MyObject();
    myQueue.put(myObject);
    Consumer.consume();
}
It is a little bit weird because you have to first put the object on the queue yourself and then consume that object yourself. Any suggestions for improvement are highly appreciated.
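One possible simplification (again just a sketch; the MyObjectProcessor and ProduceController names are placeholders): since the ThreadPoolTaskExecutor already queues submitted work internally, the controller could hand the object straight to an @Async method and skip the shared BlockingQueue entirely:

@Service
public class MyObjectProcessor {

    // Runs on the async executor; the executor's own queue buffers pending objects.
    @Async
    public void process(MyObject myObject) {
        // process myObject here
    }
}

@RestController
public class ProduceController {

    @Autowired
    private MyObjectProcessor processor;

    @RequestMapping(path = "/api/produce")
    public void produce() {
        MyObject myObject = new MyObject();
        processor.process(myObject); // returns immediately, the work happens asynchronously
    }
}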
@Service
public class SomeService {
    private SomeServiceAsync someServiceAsync = new SomeServiceAsync();
    ...
    public String DoAThing() throws Exception {
        CompletableFuture<String> future = someServiceAsync.GetAString();
        return future.get();
    }
}
@Service
@EnableAsync
public class SomeServiceAsync {
    @Value("${someProp1}")
    private String someProp1;

    @Autowired
    private Environment env;
    ...
    @Async
    public CompletableFuture<String> GetAString() {
        System.out.println(someProp1); // returns null
        String someProp2 = env.getProperty("someProp2"); // throws null pointer exception
        return CompletableFuture.completedFuture("blablabla");
    }
}
My problem is simply that I cannot access my application properties after making some of my methods run asynchronously. Nothing fails before I try to execute the method: either I get null from @Value or env itself is null.
The method worked before making it async, and the async version works fine when I am not accessing the application properties.
Looks like the problem was the new SomeServiceAsync() instead of an @Autowired injection.
I've made the same mistake myself many times.
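A minimal sketch of the corrected wiring, assuming the same class names as above:

@Service
public class SomeService {

    // Let Spring inject the proxied bean instead of constructing it with new,
    // so @Async, @Value and the Environment are all wired up correctly.
    @Autowired
    private SomeServiceAsync someServiceAsync;

    public String DoAThing() throws Exception {
        CompletableFuture<String> future = someServiceAsync.GetAString();
        return future.get();
    }
}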
I have an application with 3 layers:
App <--> Graph <--> Couchbase
I'm trying to test the GraphConnector by mocking the Couchbase layer and "replacing" it with a very basic in-memory implementation, using the same approach demonstrated in the JMockit tutorial.
This is my test class (pardon the poor indentation, I haven't gotten the hang of it yet):
public class GraphConnectorTest {

    public static final class MockCouchbase extends MockUp<ICouchConnector> {
        private Map<String, CouchEntry> couch;

        @Mock
        public void $clinit() {
            couch = new HashMap<String, CouchEntry>();
        }

        @Mock
        public void put(CouchEntry entry) {
            couch.put(entry.getKey(), entry);
        }

        @Mock
        public CouchEntry get(String key) {
            return couch.get(key);
        }
    }

    GraphConnectorImpl graph = new GraphConnectorImpl();

    @BeforeClass
    public static void setUpClass() {
        new MockCouchbase();
    }

    @Test
    public void testPost() throws Exception {
        GraphNode node = new GraphNode(GraphNodeType.DOMAIN, "alon.com");
        graph.post(node);

        GraphNode retNode = graph.getSingleNode(node.getValue(), node.getType());

        assertEquals(node.getValue(), retNode.getValue());
        assertEquals(node.getType(), retNode.getType());
    }
}
And here is my class under test:
public class GraphConnectorImpl implements IGraphConnector {
    private static ICouchConnector couch = new CouchConnectorImpl(); // <-- Basic implementation which I don't want the test to execute

    @Override
    public void post(GraphNode node) {
        CouchEntry entry = new CouchEntry(node.getValue(), JsonDocument.create(node.getValue()));
        couch.put(entry);
    }

    @Override
    public GraphNode getSingleNode(String nodeName, GraphNodeType nodeType) {
        return new GraphNode(nodeType, couch.get(nodeName).getKey());
    }
}
For some reason, the class MockCouchbase that I created within the test class isn't automatically bound to the private field ICouchConnector couch of the tested class, as shown in the tutorial. Instead, the real implementation is called, which is obviously undesirable.
If I remove the reference to the real implementation, I just get a good ol' NullPointerException.
I tried playing with the @Tested and @Injectable annotations but to no avail.
Solving my own question.
The problem with the way I wrote the class under test was explicitly invoking the constructor of the real implementation; I'd be surprised if any mocking framework could "bypass" that.
Instead, I should have declared a constructor that takes an ICouchConnector as one of its arguments, i.e. use dependency injection properly.
public class GraphConnectorImpl implements IGraphConnector {
    private static ICouchConnector couch;

    public GraphConnectorImpl(ICouchConnector connector) {
        couch = connector;
    }

    // Rest of class...
}
JMockit will then attempt to find a constructor that corresponds to the fields annotated @Tested and @Injectable in the test class.
public class GraphConnectorTest {
    @Tested
    GraphConnectorImpl graph;

    @Injectable
    ICouchConnector couch;

    // Rest of class...
}
I have a web service, DocGenerationServiceImpl, that inserts a record in the table for every format, using DocRepository and a DocFileDO object representing the record. In the for-loop I can get the id of the record that was created in the table. For each record I call the executor's execute method, where DocGenTask will look up the record by the given id. However, if there are 3 formats, for example, the DocGenTask is only able to find the last record; the first 2 it cannot find, even though it's using HibernateTemplate. Can you please advise?
@RestfulService
@Controller
@RequestMapping("/docs")
public class DocGenerationServiceImpl {
    @Autowired
    private TaskExecutor taskExecutor;

    @Autowired
    private DocRepository docRepository;

    @RequestMapping(value = "/generate", method = RequestMethod.POST)
    @ResponseBody
    public String generatedDocFile(DOCParam param) {
        for (String format : param.getFormatList()) {
            DocFileDO docFileDO = new DocFileDO();
            ...
            docRepository.saveDocFile(docFileDO);
            log.debug("docFileDO id = " + docFileDO.getId());

            DocGenTask task = new DocGenTask(docFileDO.getId());
            task.setDocRepository(docRepository);
            taskExecutor.execute(task);
        }
    }
}
@Repository
public class DocRepository {
    @Autowired
    private HibernateTemplate hibernateTemplate;

    public DocFileDO saveDocFile(DocFileDO docFile) {
        hibernateTemplate.save(docFile);
        hibernateTemplate.flush();
        return docFile;
    }

    public DocFileDO getDocFile(Long docFileId) {
        return hibernateTemplate.get(DocFileDO.class, docFileId);
    }
}
public class DocGenTask implements Runnable {
    private final Long docFileId;
    private DocRepository docRepository;

    public DocGenTask(Long docFileId) {
        this.docFileId = docFileId;
    }

    public void setDocRepository(DocRepository docRepository) {
        this.docRepository = docRepository;
    }

    public void run() {
        generate();
    }

    private void generate() {
        DocFileDO docFileObj = docRepository.getDocFile(docFileId);
    }
}
A couple of things:
Don't use HibernateTemplate; it should be considered deprecated as of Hibernate 3.0.1 (which was released somewhere in 2006). Use the SessionFactory directly and use the getCurrentSession() method to get a Hibernate Session to operate on.
You don't have transactions set up (judging from the snippets); to work with a database you need a proper transaction setup.
Your controller is doing too much; all of this should be inside a service.
First, refactor your repository:
@Repository
public class DocRepository {
    @Autowired
    private SessionFactory sf;

    public DocFileDO saveDocFile(DocFileDO docFile) {
        Session session = sf.getCurrentSession();
        session.save(docFile);
        return docFile;
    }

    public DocFileDO getDocFile(Long docFileId) {
        return sf.getCurrentSession().get(DocFileDO.class, docFileId);
    }
}
Now your code will probably fail due to an improper transaction setup. Add @Transactional to all the methods (or the class) that need a transaction (like the saveDocFile method).
As mentioned, you should probably move the code found in the controller to a service. The controller should be nothing more than a thin integration layer, converting from the web to an internal representation of something and then kicking off a service/business method somewhere. This service/business method is also your transactional unit of work: it either all succeeds or all fails.
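A rough sketch of what that service could look like (the DocService name and method signature are placeholders, not part of the original code):

@Service
public class DocService {

    @Autowired
    private DocRepository docRepository;

    @Autowired
    private TaskExecutor taskExecutor;

    // One transactional unit of work per request: either all records are saved or none.
    @Transactional
    public void generateDocFiles(DOCParam param) {
        for (String format : param.getFormatList()) {
            DocFileDO docFileDO = new DocFileDO();
            // populate docFileDO for this format...
            docRepository.saveDocFile(docFileDO);

            DocGenTask task = new DocGenTask(docFileDO.getId());
            task.setDocRepository(docRepository);
            taskExecutor.execute(task);
        }
    }
}

Note that the submitted tasks may still start before this transaction commits, so if DocGenTask must see the new rows immediately, the execute calls may need to be deferred until after the commit.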
I have a JSF application where users create some files. The problem is that they must upload them and download the confirmation messages too, and the upload/download process is exclusive: only one user at a time, because the authentication requires a technical user/password. My question is, how can I make the waiting process transparent for the user, as a kind of protocol, for example:
waiting to get the connection
authentication
upload file
download confirmation file
done
Use a single thread executor.
@ManagedBean
@ApplicationScoped
public class FileManager {
    private ExecutorService executor;

    @PostConstruct
    public void init() {
        executor = Executors.newSingleThreadExecutor();
    }

    public Result process(Task task) throws InterruptedException, ExecutionException {
        return executor.submit(task).get();
    }

    @PreDestroy
    public void destroy() {
        executor.shutdownNow();
    }
}
Where Result is just your javabean object containing the desired result, and Task looks like this:
public class Task implements Callable<Result> {
    private Data data;

    public Task(Data data) {
        this.data = data;
    }

    @Override
    public Result call() throws Exception {
        Result result = process(data); // Do your upload/download/auth job here.
        return result;
    }
}
Data is just your javabean object containing the input data (the uploaded file?). Finally, invoke it from your managed bean as follows:
@ManagedBean
@RequestScoped
public class Bean {
    @ManagedProperty("#{fileManager}")
    private FileManager fileManager;

    public void submit() {
        try {
            Data data = prepareItSomehow();
            Result result = fileManager.process(new Task(data));
            // Now do your job with result.
        }
        catch (Exception e) {
            // Handle
        }
    }

    // ...
}
This way all tasks will be processed by a single thread in first-in, first-out order.
If your container supports EJB, then there are other ways.
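For instance, one such alternative (just a sketch; the ExclusiveFileProcessor name and method body are placeholders, not from the original answer): a @Singleton session bean uses container-managed concurrency with a default @Lock(WRITE) on its business methods, so the container itself serializes the calls without an explicit executor.

@Singleton
@Lock(LockType.WRITE) // the default for singleton business methods; shown here for clarity
public class ExclusiveFileProcessor {

    // Only one caller at a time can be inside this method, so the
    // authenticate/upload/download sequence stays exclusive without an ExecutorService.
    public Result process(Data data) throws Exception {
        Result result = null;
        // Do your upload/download/auth job here, as in Task#call() above.
        return result;
    }
}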