Objective: generate an Excel report.
I call a Java controller after the user clicks the submit button in the UI. The controller then populates data using a stored procedure and does some manipulation in the service layer. This takes a long time, so I get a gateway timeout error on the UI (there is some load on the server).
So now I am planning to call the controller from the UI and tell the user that the Excel report will be emailed to them, so that the user doesn't have to wait on that screen for the report.
You can run the task asynchronously in Spring with the @Async annotation. For more detail, have a look at section 25.5.2 of the Spring reference documentation.
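A minimal sketch of how that could look (the configuration class, service and method names are illustrative, not from the question):

import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.Async;
import org.springframework.scheduling.annotation.EnableAsync;
import org.springframework.stereotype.Service;

// enable Spring's async support once in the configuration
@Configuration
@EnableAsync
class AsyncConfig {
}

// the report generation runs on a separate thread, so the controller that
// calls generateAndEmailReport(...) can return to the UI immediately
@Service
class ReportService {

    @Async
    public void generateAndEmailReport(long requestId) {
        // run the procedure, build the Excel file and email it here
    }
}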
Once the user submits the request from the UI, just make an entry in the database from your controller and show the user a message saying "We have received your request and the Excel file will be emailed to you".
Now in the background there is a job running; you can write this job on the server side using a Thread or, better, use Spring Batch. This job will do the following (a rough sketch follows the list):
1) It runs continuously and checks whether there is any new entry from the UI in this table; you can identify new entries by a flag column or similar.
2) It generates the Excel file and emails it to the customer.
3) Once the file is emailed, it updates the flag in the database (e.g. sets processed = true) so that the next run picks up only the records that have not been processed yet.
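A rough sketch of such a job using Spring's @Scheduled (this assumes @EnableScheduling is configured; the table name, columns and helper methods are placeholders, not from the question):

import java.util.List;
import java.util.Map;

import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class ReportEmailJob {

    private final JdbcTemplate jdbc;

    public ReportEmailJob(JdbcTemplate jdbc) {
        this.jdbc = jdbc;
    }

    // 1) poll the table for entries the UI inserted but the job has not handled yet
    @Scheduled(fixedDelay = 60000)
    public void processPendingRequests() {
        List<Map<String, Object>> pending =
                jdbc.queryForList("SELECT id, email FROM report_request WHERE processed = false");
        for (Map<String, Object> row : pending) {
            long id = ((Number) row.get("id")).longValue();
            String email = (String) row.get("email");
            // 2) generate the Excel file and email it to the customer
            byte[] excel = buildExcel(id);
            emailToCustomer(email, excel);
            // 3) mark the record as processed so the next run skips it
            jdbc.update("UPDATE report_request SET processed = true WHERE id = ?", id);
        }
    }

    private byte[] buildExcel(long requestId) {
        // run the procedure / service-layer logic and build the workbook here
        return new byte[0];
    }

    private void emailToCustomer(String email, byte[] excel) {
        // send the mail with the Excel file attached
    }
}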
Create a Java program that populates your Excel sheet and does the rest of the work. Then in your servlet use
Process p = Runtime.getRuntime().exec(new String[] { "java", "-jar", "report-generator.jar" }); // the command is a placeholder for your program
This creates a separate process, so your servlet can return while the report is generated in parallel.
I have a REST API that uploads a file to AWS and returns a success response to the user.
I now have a requirement to post the uploaded data's details to another service for reporting purposes.
The problem is that this posting of data should happen independently, without affecting the response time of the API.
i.e. after the upload has completed, I should run a background process that posts the data to another service, while the success response is sent back to the user without any delay.
I have gone through some solutions and tried something like the snippet below:
if (uploadSuccess) {
    response.setStatus(HttpServletResponse.SC_OK);
    // post the data to the reporter asynchronously
    CompletableFuture.runAsync(() -> postUploadedData(fileName, fileId));
}
With this approach the task runs in the background, but the API response is held until the data post call has completed.
Are there any other ways I can achieve this?
I have an internal tool that shows results by pulling 100k records from the DB.
In one case I have to make a web service call for each record (I can't store the result in the DB as there is a security issue).
Please find my design below:
Step 1:
The user clicks the button (get file).
Step 2:
I call the DB and the web service using threads from a thread pool and write the results into a CSV file.
This takes about 5 minutes to process, as there are 100k web service calls.
Step 3:
Once step 2 is done, I read the CSV file, write it into a ByteArrayOutputStream, set the response content type to text/csv, and write it to the response, which downloads the CSV file onto the user's system.
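Simplified, step 3 is roughly like this (the file name and method are just placeholders):

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

import javax.servlet.http.HttpServletResponse;

// simplified version of step 3: read the CSV produced in step 2 and stream it back
void writeCsvToResponse(HttpServletResponse response) throws IOException {
    ByteArrayOutputStream buffer = new ByteArrayOutputStream();
    buffer.write(Files.readAllBytes(Paths.get("result.csv"))); // CSV written in step 2

    response.setContentType("text/csv");
    response.setHeader("Content-Disposition", "attachment; filename=\"result.csv\"");
    response.setContentLength(buffer.size());
    buffer.writeTo(response.getOutputStream());
    response.getOutputStream().flush();
}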
Issue:
I am getting a proxy server issue because it takes more than 5 minutes:
I get an HTTP session timeout or a 502 invalid response.
Any suggestions on how to deliver the file to the user?
Please help me to resolve this issue if you can.
P.S. As it is an internal tool, the users are willing to wait more than five minutes to get the file.
I can't change steps 1 and 2.
There is a heavy database process that has to be executed before loading the JSP.
// jsp
<% new bean().startHeavyProcess(); %>
.
.
.
While the process is running, I want to show a progress bar using JavaScript. Is there any way I can make the progress bar work in synchronization with the server-side code? Can I know when the server-side code finishes its execution, or is about to finish?
There is no way of doing this without building functionality specific to your application.
I'm assuming the heavy processing is initiated by the client clicking on a link?
If so, then you could try the following:
User clicks on the link that initiates the heavy processing.
The server returns a unique id for the process it started, in the same request as above.
Create a separate interface where you can query the progress of a specific process by id.
Using JavaScript, query that interface every second to see how far along the process has come.
Display a progress bar to the user (maybe replace the original link with the progress bar).
When the process is at 100%, redirect the user to the new page.
This approach requires that you have some sort of heavy process manager that knows about all running processes and can look up a specific one among them.
This could perhaps be done by creating a new table in your database in which you store each process's unique id and its % progress. Each process then updates its row in the table as it runs, and you simply return the progress value from the table when it is requested by id.
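For example, the query interface could be as small as this (the servlet URL and the lookup query are placeholders, not something your application already has):

import java.io.IOException;

import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// the JavaScript polls this servlet with the process id and gets back the % progress
@WebServlet("/progress")
public class ProgressServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        String processId = req.getParameter("id");
        int percent = lookupProgress(processId);
        resp.setContentType("text/plain");
        resp.getWriter().print(percent);
    }

    private int lookupProgress(String processId) {
        // placeholder: SELECT progress FROM process_progress WHERE id = ?
        return 0;
    }
}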
If you're looking for a good progress bar plugin, jQuery UI has one: http://jqueryui.com/progressbar/ :)
I did this by using
function pollProgress() {
    setTimeout(function() {
        var progress = queryServerProgress(); // ask the server for the current %
        if (progress < 100) {
            pollProgress();
        }
    }, 1000);
}
and querying the server for the current progress, triggering the same query/update function as long as the progress is < 100%.
You need to keep track of the progress on the server side though.
We had a similar use case, and we implemented it by checking the status periodically through a REST API. While the task is in progress on the server side, the server updates the status in memory and the client requests the progress from the API.
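A minimal sketch of that in-memory bookkeeping (class and method names are illustrative; the task id is whatever identifier the client and server share):

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class ProgressTracker {

    private static final Map<String, Integer> PROGRESS = new ConcurrentHashMap<>();

    // the long-running task calls this as it works
    public static void update(String taskId, int percent) {
        PROGRESS.put(taskId, percent);
    }

    // the REST endpoint the client polls calls this
    public static int get(String taskId) {
        return PROGRESS.getOrDefault(taskId, 0);
    }
}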
I am measuring the cost of requests to GAE by inspecting the x-appengine-estimated-cpm-us-dollars header. This works great, and in combination with x-appengine-resource-usage and x-traceurl I can get even more detailed information.
However, a large part of my application runs in the context of task queues, so a huge part of the instance hour costs is consumed by queues. Whenever code executes outside of a request, its cost is not included in the x-appengine-estimated-cpm-us-dollars header.
I am looking for a way to measure the full cost consumed by each request, i.e. the cost generated by the request itself plus the cost of the tasks that the request added.
It may be overkill, but there is a tool with which you can download the Google App Engine logs and convert them to SQLite:
http://code.google.com/p/google-app-engine-samples/source/browse/trunk/logparser/logparser.py
With this tool, the CPM USD values for both task requests and normal requests are downloaded together. You can store each day's log in a separate SQLite file and do as much analysis as you want.
As for relating the cost of a task back to its original request: the log data downloaded with this tool includes the full output of the logging module, so you can simply
log a generated id in the original request,
pass the id to the task,
log the received id again in the task request, and
find the matching normal/task request pair via the id.
for example:
# in the original request
a_id = generate_a_random_id()
logging.info(a_id)  # the id will be included in the log output
taskqueue.add(url='/path_to_task', params={'id': a_id})

# in the task request
a_id = self.request.get('id')
logging.info(a_id)
EDIT1
I think there is another possible way to estimate the cost of a normal request plus its task requests.
The trick is to change the async task into a sync call (assuming the cost would be the same).
I haven't tried it, but it is much easier to try.
# in the original request, add a parameter to identify debug mode
debug = self.request.get('DEBUG')
if debug:
    self.redirect('/path_to_task')
else:
    taskqueue.add(url='/path_to_task')
Thus, when you test the normal request with the DEBUG parameter, it will first process the normal request and return x-appengine-estimated-cpm-us-dollars for the normal request. It will then redirect your test client to the corresponding task request (the task handler can also be accessed and triggered via its URL like a normal request) and return x-appengine-estimated-cpm-us-dollars for the task request. You can simply add them together to get the total cost.
I have a JSF app where I upload a file. Very basic. The question is, is there a way to launch (or keep executing) another Java program when I reach the last page of my app? That is:
UploadFile.xhtml -> receiveFile.java -> Thanks.xhtml (the user will close the browser here) -> another program does some processing on the recently uploaded file (even if the user shuts down the PC)
I thought about using a daemon program that keeps checking whether a new file has arrived, but I want to know if there's a way to keep executing things even after the user closes the browser.
Thanks.
Certainly the best way to do this is to have a scheduler that looks for certain files every x time interval and processes them within a thread.
Advice: make sure you shut down the scheduler on context unload. Here is an example of how to use a ScheduledExecutorService.
You would want to do something like this in a context listener:
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1);
Runnable handler = new Runnable() {
    public void run() {
        // look for newly uploaded files and handle them here
    }
};
// run the handler every 10 seconds, starting after an initial 10-second delay
scheduler.scheduleAtFixedRate(handler, 10, 10, TimeUnit.SECONDS);
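And, following the advice about shutting the scheduler down on context unload, a sketch of the listener that owns and stops it (the class name is illustrative):

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

public class FileJobListener implements ServletContextListener {

    private ScheduledExecutorService scheduler;

    @Override
    public void contextInitialized(ServletContextEvent sce) {
        scheduler = Executors.newScheduledThreadPool(1);
        // schedule the file-handling task as shown above
        scheduler.scheduleAtFixedRate(
                () -> { /* look for new files and handle them */ }, 10, 10, TimeUnit.SECONDS);
    }

    @Override
    public void contextDestroyed(ServletContextEvent sce) {
        scheduler.shutdownNow(); // stop the background thread when the app is undeployed
    }
}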
Firstly, after receiving the file from the client, even if the client has closed their browser, the file is now on the server and you can process it freely (it no longer depends on the client) :-D
As for your question, you want to launch something right after the page is closed. How about using the @PreDestroy or @PostConstruct annotation with a @ViewScoped bean in JSF?
(In my mind, you can do this right after receiving the uploaded file from the client.)
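A rough sketch of that idea with a JSF 2 managed bean (the bean name, field and callback are placeholders, not from the question):

import java.io.Serializable;

import javax.annotation.PreDestroy;
import javax.faces.bean.ManagedBean;
import javax.faces.bean.ViewScoped;

@ManagedBean
@ViewScoped
public class ThanksBean implements Serializable {

    private String uploadedFilePath; // set earlier, e.g. by the upload handling code

    @PreDestroy
    public void onViewDestroyed() {
        // the view scope is ending (the user has left Thanks.xhtml); hand the
        // uploaded file to a background worker here so processing continues
        // even though the browser is closed
    }

    public void setUploadedFilePath(String uploadedFilePath) {
        this.uploadedFilePath = uploadedFilePath;
    }
}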
A reliable way to implement this would be to have some form of metadata saved in the database after the file upload is done,
and a separate scheduler such as Quartz to read the metadata periodically and do the post-processing.
The same metadata can also contain status flags to avoid conflicts during file processing.
Again, this would depend upon the complexity of your requirements.
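A rough sketch of such a Quartz job and its scheduling (the job logic and the one-minute interval are assumptions, not from the answer):

import org.quartz.Job;
import org.quartz.JobBuilder;
import org.quartz.JobDetail;
import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;
import org.quartz.Scheduler;
import org.quartz.SchedulerException;
import org.quartz.SimpleScheduleBuilder;
import org.quartz.Trigger;
import org.quartz.TriggerBuilder;
import org.quartz.impl.StdSchedulerFactory;

public class UploadPostProcessJob implements Job {

    @Override
    public void execute(JobExecutionContext context) throws JobExecutionException {
        // read metadata rows whose status flag says "uploaded, not yet processed",
        // post-process the corresponding files, then update the flag so other
        // runs do not pick them up again
    }

    // called once at application startup
    public static void schedule() throws SchedulerException {
        Scheduler scheduler = StdSchedulerFactory.getDefaultScheduler();
        JobDetail job = JobBuilder.newJob(UploadPostProcessJob.class)
                .withIdentity("uploadPostProcess")
                .build();
        Trigger trigger = TriggerBuilder.newTrigger()
                .withSchedule(SimpleScheduleBuilder.repeatMinutelyForever())
                .build();
        scheduler.scheduleJob(job, trigger);
        scheduler.start();
    }
}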