Is there any way I can use multithreading to run the processing in parallel and make execution faster? I have created a @RestController that accepts a String and a List<MultipartFile> as request parameters, and the code works fine. The problem is that I'm parsing one file after the other in a for loop, so the execution takes a long time.
Below is Controller
@RequestMapping(value = "/csvUpload", method = RequestMethod.POST)
public List<String> csvUpload(@RequestParam String parentPkId, @RequestParam List<MultipartFile> file)
        throws IOException {
    log.info("Entered method csvUpload() of DaoController.class");
    List<String> response = new ArrayList<String>();
    String temp = parentPkId.replaceAll("[-+.^:,]", "");
    for (MultipartFile f : file) {
        String resp = uploadService.csvUpload(temp, f);
        response.add(resp);
    }
    return response;
}
From the controller I'm calling the uploadService.csvUpload() method, which parses the files one after the other because of the for loop.
Below is my UploadService Class
public String csvUpload(String parentPkId, MultipartFile file) {
    try {
        BufferedReader br = new BufferedReader(new InputStreamReader(file.getInputStream()));
        String line = "";
        int header = 0;
        while ((line = br.readLine()) != null) {
            // skip the header row
            if (header == 0) {
                header++;
                continue;
            }
            header++;
            // use comma as separator
            String[] csvDataSet = line.split(",");
            // saving it to DB
        }
    } catch (IOException ex) {
        ex.printStackTrace();
    }
    return "Successfully Uploaded " + file.getOriginalFilename();
}
How can I make this controller multithreaded so that the files are processed in parallel and the request finishes faster? I'm new to multithreading, and I tried using the Callable interface, but its call() method does not take parameters.
Any leads and suggestions are welcome, thanks in advance.
You need to create a class that implements Callable as below, submit the tasks so the futures are stored in a list, and finally process the futures.
public class ProcessMutlipartFile implements Callable<String> {
    private MultipartFile file;
    private String temp;
    private UploadService uploadService;

    public ProcessMutlipartFile(MultipartFile file, String temp, UploadService uploadService) {
        this.file = file;
        this.temp = temp;
        this.uploadService = uploadService;
    }

    public String call() throws Exception {
        return uploadService.csvUpload(temp, file);
    }
}
In your controller, create an executor and a list of Future objects:
ExecutorService executor = Executors.newFixedThreadPool(10);
List<Future<String>> futureList = new ArrayList<Future<String>>();
.
.
.
for (MultipartFile f : file) {
    futureList.add(executor.submit(new ProcessMutlipartFile(f, temp, uploadService)));
}
Finally, in your controller, collect the results from the futures:
for (Future<String> f : futureList) {
    response.add(f.get());
}
// shutting down the executor
executor.shutdown();
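Putting the pieces together, a minimal sketch of the whole controller method could look like this (the pool size of 10 and the broad throws clause are assumptions; in real code you would handle InterruptedException and ExecutionException explicitly):

@RequestMapping(value = "/csvUpload", method = RequestMethod.POST)
public List<String> csvUpload(@RequestParam String parentPkId, @RequestParam List<MultipartFile> file)
        throws Exception {
    String temp = parentPkId.replaceAll("[-+.^:,]", "");
    // submit one task per file so the files are parsed concurrently
    ExecutorService executor = Executors.newFixedThreadPool(10);
    List<Future<String>> futureList = new ArrayList<>();
    for (MultipartFile f : file) {
        futureList.add(executor.submit(new ProcessMutlipartFile(f, temp, uploadService)));
    }
    // collect the results; get() blocks until the corresponding file is done
    List<String> response = new ArrayList<>();
    for (Future<String> future : futureList) {
        response.add(future.get());
    }
    executor.shutdown();
    return response;
}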
hope this helps
You can execute the upload code using a parallel stream:
List<String> response = file.parallelStream().map(f -> uploadService.csvUpload(temp, f))
.collect(Collectors.toList());
You can execute streams in serial or in parallel. When a stream executes in parallel, the Java runtime partitions the stream into multiple substreams. Aggregate operations iterate over and process these substreams in parallel and then combine the results.
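By default a parallel stream runs on the common ForkJoinPool, which is sized to the number of CPU cores. If you want to control the parallelism yourself, one known workaround is to run the stream inside your own ForkJoinPool; a sketch, assuming the same controller variables as above (the pool size of 4 is only an example):

ForkJoinPool pool = new ForkJoinPool(4);
List<String> response = pool.submit(() ->
        file.parallelStream()
            .map(f -> uploadService.csvUpload(temp, f))
            .collect(Collectors.toList()))
    .get();  // get() throws InterruptedException/ExecutionException, which must be handled or declared
pool.shutdown();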
Related
I want to process files with a Flink stream in which two lines belong together: the first line is a header and the second line the corresponding text.
The files are located on my local file system. I am using the readFile(fileInputFormat, path, watchType, interval, pathFilter, typeInfo) method with a custom FileInputFormat.
My streaming job class looks like this:
final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
DataStream<Read> inputStream = env.readFile(new ReadInputFormatTest("path/to/monitored/folder"), "path/to/monitored/folder", FileProcessingMode.PROCESS_CONTINUOUSLY, 100);
inputStream.print();
env.execute("Flink Streaming Java API Skeleton");
and my ReadInputFormatTest like this:
public class ReadInputFormatTest extends FileInputFormat<Read> {
private transient FileSystem fileSystem;
private transient BufferedReader reader;
private final String inputPath;
private String headerLine;
private String readLine;
public ReadInputFormatTest(String inputPath) {
this.inputPath = inputPath;
}
@Override
public void open(FileInputSplit inputSplit) throws IOException {
FileSystem fileSystem = getFileSystem();
this.reader = new BufferedReader(new InputStreamReader(fileSystem.open(inputSplit.getPath())));
this.headerLine = reader.readLine();
this.readLine = reader.readLine();
}
private FileSystem getFileSystem() {
if (fileSystem == null) {
try {
fileSystem = FileSystem.get(new URI(inputPath));
} catch (URISyntaxException | IOException e) {
throw new RuntimeException(e);
}
}
return fileSystem;
}
@Override
public boolean reachedEnd() throws IOException {
return headerLine == null;
}
@Override
public Read nextRecord(Read r) throws IOException {
r.setHeader(headerLine);
r.setSequence(readLine);
headerLine = reader.readLine();
readLine = reader.readLine();
return r;
}
}
As expected, the headers and the text are stored together in one object. However, the file is read eight times. So the problem is the parallelization. Where and how can I specify that a file is processed only once, but several files in parallel?
Or do I have to change my custom FileInputFormat even further?
I would modify your source to emit the available filenames (instead of the actual file contents) and then add a new processor to read a name from the input stream and then emit pairs of lines. In other words, split the current source into a source followed by a processor. The processor can be made to run at any degree of parallelism and the source would be a single instance.
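A rough sketch of that split, assuming Read has a no-arg constructor plus the setters from the question (FileNameSource and the parallelism of 4 below are illustrative, not part of the original job):

// runs as a single instance: only emits the paths of the files to process
public class FileNameSource implements SourceFunction<String> {
    private final String folder;

    public FileNameSource(String folder) {
        this.folder = folder;
    }

    @Override
    public void run(SourceContext<String> ctx) throws Exception {
        for (File f : new File(folder).listFiles()) {
            ctx.collect(f.getAbsolutePath());
        }
        // a continuously monitoring variant would loop here and remember already-emitted files
    }

    @Override
    public void cancel() {
        // nothing to cancel for a bounded listing
    }
}

// in the job: the source runs with parallelism 1, the reading runs in parallel
DataStream<Read> reads = env
        .addSource(new FileNameSource("path/to/monitored/folder")).setParallelism(1)
        .flatMap((String path, Collector<Read> out) -> {
            try (BufferedReader reader = new BufferedReader(new FileReader(path))) {
                String header;
                while ((header = reader.readLine()) != null) {
                    Read r = new Read();
                    r.setHeader(header);
                    r.setSequence(reader.readLine());
                    out.collect(r);
                }
            }
        })
        .returns(Read.class)   // type hint needed because the lambda's output type is erased
        .setParallelism(4);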
I have the following rest controller, which receives requests, transforms them into JSON strings and puts them into a concurrent queue.
I would like to make a Flux out of this queue and subscribe to it.
Unfortunately, it doesn't work.
What am I doing wrong here?
@RestController
public class EventController {
private final ObjectMapper mapper = new ObjectMapper();
private final FirehosePutService firehosePutService;
private ConcurrentLinkedQueue<String> events = new ConcurrentLinkedQueue<>();
private int batchSize = 10;
@Autowired
public EventController(FirehosePutService firehosePutService) {
this.firehosePutService = firehosePutService;
Flux<String> eventFlux = Flux.create((FluxSink<String> sink) -> {
String next;
while (( next = events.poll()) != null) {
sink.next(next);
}
});
eventFlux.publish().autoConnect().subscribe(new BaseSubscriber<String>() {
int consumed;
List<String> batchOfEvents = new ArrayList<>(batchSize);
@Override
protected void hookOnSubscribe(Subscription subscription) {
request(batchSize);
}
@Override
protected void hookOnNext(String value) {
batchOfEvents.add(value);
consumed++;
if (consumed == batchSize) {
batchOfEvents.addAll(events);
log.info("Consume {} elements. Size of batchOfEvents={}", consumed, batchOfEvents.size());
firehosePutService.saveBulk(batchOfEvents);
consumed = 0;
batchOfEvents.clear();
events.clear();
request(batchSize);
}
}
});
}
@GetMapping(value = "/saveMany", produces = "text/html")
public ResponseEntity<Void> saveMany(@RequestParam MultiValueMap<String, String> allRequestParams) throws JsonProcessingException {
Map<String, String> paramValues = allRequestParams.toSingleValueMap();
String reignnEvent = mapper.writeValueAsString(paramValues);
events.add(reignnEvent);
return new ResponseEntity<>(HttpStatus.OK);
}
}
First of all, you use the poll method. It is not blocking and returns null if the queue is empty. You loop over the collection until the first null (i.e. while (next != null)), so your code exits the loop almost immediately because the queue is empty at startup. You should replace poll with the blocking take, which waits until an element is available (note that take only exists on a BlockingQueue such as LinkedBlockingQueue, not on ConcurrentLinkedQueue).
Secondly, hookOnNext is invoked as each event is emitted from the queue. However, you then read the events collection again with batchOfEvents.addAll(events); and, moreover, you clear all pending events with events.clear();.
I advise you to remove all direct access to the events collection from the hookOnNext method.
Why do you use Flux here at all? It seems overcomplicated. You can use a plain thread instead:
@Autowired
public EventController(FirehosePutService firehosePutService) {
    this.firehosePutService = firehosePutService;
    Thread persister = new Thread(() -> {
        List<String> batchOfEvents = new ArrayList<>(batchSize);
        try {
            while (true) {
                String next = events.take(); // blocks until an event is available
                batchOfEvents.add(next);
                if (batchOfEvents.size() == batchSize) {
                    log.info("Consumed {} elements", batchOfEvents.size());
                    firehosePutService.saveBulk(batchOfEvents);
                    batchOfEvents.clear();
                }
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    });
    persister.start();
}
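For the take()-based loop above to compile, the events field from the question would have to be declared as a blocking queue, for example:

// take() is only defined on BlockingQueue, so swap the ConcurrentLinkedQueue for e.g.:
private final BlockingQueue<String> events = new LinkedBlockingQueue<>();
// saveMany() can keep calling events.add(reignnEvent) unchanged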
I am writing a controller that I need to make asynchronous. How can I deal with a list of ListenableFuture? I have a list of URLs that I need to send GET requests to one by one; what is the best solution for this?
@RequestMapping(value = "/repositories", method = RequestMethod.GET)
private void getUsername(@RequestParam(value = "username") String username) {
    System.out.println(username);
    List<ListenableFuture> futureList = githubRestAsync.getRepositoryLanguages(username);
    System.out.println(futureList.size());
}
In the service I use List<ListenableFuture>, which does not seem to work: since it is asynchronous, in the controller method I cannot get the size of futureList to run a for loop over it for the callbacks.
public List<ListenableFuture> getRepositoryLanguages(String username){
return getRepositoryLanguages(username, getUserRepositoriesFuture(username));
}
private ListenableFuture getUserRepositoriesFuture(String username) throws HttpClientErrorException {
HttpEntity entity = new HttpEntity(httpHeaders);
ListenableFuture future = restTemplate.exchange(githubUsersUrl + username + "/repos", HttpMethod.GET, entity, String.class);
return future;
}
private List<ListenableFuture> getRepositoryLanguages(final String username, ListenableFuture<ResponseEntity<String>> future) {
final List<ListenableFuture> futures = new ArrayList<>();
future.addCallback(new ListenableFutureCallback<ResponseEntity<String>>() {
@Override
public void onSuccess(ResponseEntity<String> response) {
ObjectMapper mapper = new ObjectMapper();
try {
repositories = mapper.readValue(response.getBody(), new TypeReference<List<Repositories>>() {
});
HttpEntity entity = new HttpEntity(httpHeaders);
System.out.println("Repo size: " + repositories.size());
for (int i = 0; i < repositories.size(); i++) {
futures.add(restTemplate.exchange(githubReposUrl + username + "/" + repositories.get(i).getName() + "/languages", HttpMethod.GET, entity, String.class));
}
} catch (IOException e) {
e.printStackTrace();
}
}
@Override
public void onFailure(Throwable throwable) {
System.out.println("FAILURE in getRepositoryLanguages: " + throwable.getMessage());
}
});
return futures;
}
Should I use something like ListenableFuture<List> instead of List<ListenableFuture> ?
It seems like you have a List<ListenableFuture<Result>>, but you want a ListenableFuture<List<Result>>, so you can take one action when all of the futures are complete.
public static <T> ListenableFuture<List<T>> allOf(final List<? extends ListenableFuture<? extends T>> futures) {
// we will return this ListenableFuture, and modify it from within callbacks on each input future
final SettableListenableFuture<List<T>> groupFuture = new SettableListenableFuture<>();
// use a defensive shallow copy of the futures list, to avoid errors that could be caused by
// someone inserting/removing a future from `futures` list after they call this method
final List<? extends ListenableFuture<? extends T>> futuresCopy = new ArrayList<>(futures);
// Count the number of completed futures with an AtomicInt (to avoid race conditions)
final AtomicInteger resultCount = new AtomicInteger(0);
for (int i = 0; i < futuresCopy.size(); i++) {
futuresCopy.get(i).addCallback(new ListenableFutureCallback<T>() {
@Override
public void onSuccess(final T result) {
int thisCount = resultCount.incrementAndGet();
// if this is the last result, build the ArrayList and complete the GroupFuture
if (thisCount == futuresCopy.size()) {
List<T> resultList = new ArrayList<T>(futuresCopy.size());
try {
for (ListenableFuture<? extends T> future : futuresCopy) {
resultList.add(future.get());
}
groupFuture.set(resultList);
} catch (Exception e) {
// this should never happen, but future.get() forces us to deal with this exception.
groupFuture.setException(e);
}
}
}
@Override
public void onFailure(final Throwable throwable) {
groupFuture.setException(throwable);
// if one future fails, don't waste effort on the others
for (ListenableFuture future : futuresCopy) {
future.cancel(true);
}
}
});
}
return groupFuture;
}
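A possible way to use this helper from the question's code, once getRepositoryLanguages returns typed futures (the variable name languageFutures and the callback body here are illustrative only):

ListenableFuture<List<ResponseEntity<String>>> combined = allOf(languageFutures);
combined.addCallback(new ListenableFutureCallback<List<ResponseEntity<String>>>() {
    @Override
    public void onSuccess(List<ResponseEntity<String>> responses) {
        // every per-repository request has finished; process all bodies at once
        responses.forEach(r -> System.out.println(r.getBody()));
    }

    @Override
    public void onFailure(Throwable throwable) {
        System.out.println("At least one languages request failed: " + throwable.getMessage());
    }
});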
I'm not quite sure whether you are starting a new project or working on a legacy one, but if your main requirement is a non-blocking, asynchronous REST service, I would suggest having a look at the upcoming Spring Framework 5 and its integration with Reactive Streams. In particular, Spring 5 will let you create fully reactive and asynchronous web services with very little code.
For example, a fully functional version of your code can be written with this small snippet.
@RestController
public class ReactiveController {
@GetMapping(value = "/repositories")
public Flux<String> getUsername(@RequestParam(value = "username") String username) {
WebClient client = WebClient.create(new ReactorClientHttpConnector());
ClientRequest<Void> listRepoRequest = ClientRequest.GET("https://api.github.com/users/{username}/repos", username)
.accept(MediaType.APPLICATION_JSON).header("user-agent", "reactive.java").build();
return client.exchange(listRepoRequest).flatMap(response -> response.bodyToFlux(Repository.class)).flatMap(
repository -> client
.exchange(ClientRequest
.GET("https://api.github.com/repos/{username}/{repo}/languages", username,
repository.getName())
.accept(MediaType.APPLICATION_JSON).header("user-agent", "reactive.java").build())
.map(r -> r.bodyToMono(String.class)))
.concatMap(Flux::merge);
}
static class Repository {
private String name;
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
}
}
To run this code locally just clone the spring-boot-starter-web-reactive and copy the code into it.
The result is something like {"Java":50563,"JavaScript":11541,"CSS":1177}{"Java":50469}{"Java":130182}{"Shell":21222,"Makefile":7169,"JavaScript":1156}{"Java":30754,"Shell":7058,"JavaScript":5486,"Batchfile":5006,"HTML":4865}; still, you can map it to something more usable in an asynchronous way :)
Considering the following function:
public void execute4() {
File filePath = new File(filePathData);
File[] files = filePath.listFiles((File filePathData) -> filePathData.getName().endsWith("CDR"));
List<CDR> cdrs = new ArrayList<CDR>();
Arrays.asList(files).parallelStream().forEach(file -> readCDRP(cdrs, file));
cdrs.sort(cdrsorter);
}
which reads a list of files containing CDRs and executes readCDRP(), which is this:
private void readCDRP(List<CDR> cdrs, File file) {
final CDR cdr = new CDR(file.getName());
try (BufferedReader bfr = new BufferedReader(new FileReader(file))) {
List<String> lines = bfr.lines().collect(Collectors.toList());
lines.parallelStream().forEach(e -> {
String[] data = e.split(",", -1);
CDREntry entry = new CDREntry(file.getName());
for (int i = 0; i < data.length; i++) {
entry.setField(i, data[i]);
}
cdr.addEntry(entry);
});
if (cdr != null) {
cdrs.add(cdr);
}
} catch (IOException e) {
e.printStackTrace();
}
}
What I observe is that occasionally, and NOT all the time, I get an ArrayIndexOutOfBoundsException in the readCDRP function at the following line (which is puzzling, since the list inside the CDR is an ArrayList):
cdr.addEntry(entry);
or
at the last line in execute4() where I apply the sorting.
I think the issue is that the first parallelStream in execute4() does not run in a separate memory space from the second parallelStream inside readCDRP(), and that they wrongly share data. I say "seems" because I can't confirm it; it's just a hunch.
The questions are:
Is my code fundamentally buggy from a JDK 8 perspective?
Is there a workaround using the same flow, for example using a CountDownLatch?
Is it a limitation of the ForkJoinPool?
Thanks for any response.
EDIT(1):
The addEntry method is part of the CDR class itself:
class CDR {
public final String fileName;
private final List<CDREntry> entries = new ArrayList<CDREntry>();
public CDR(String fileName) {
super();
this.fileName = fileName;
}
public List<CDREntry> getEntries() {
return entries;
}
public List<CDREntry> addEntry(CDREntry e) {
entries.add(e);
return entries;
}
public String getFileName() {
return this.fileName;
}
}
Your code is broken from a thread-safety point of view. In readCDRP you add elements to the cdrs list, which is an ArrayList that does not support concurrent writes. That is why it breaks.
A better approach would be to have readCDRP return a CDR object and do something like:
List<CDR> cdrs = Arrays.stream(files)
.parallel()
.map(this::readCDR)
.collect(Collectors.toList());
Also, using parallel streams for IO related operations is generally a bad idea, but that is another discussion.
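If the file reading really is I/O-bound, one alternative sketch (not from the answer above) is to size an explicit executor independently of the CPU count instead of relying on a parallel stream:

// pool size of 8 chosen only for illustration; the enclosing method would need to
// handle or declare InterruptedException/ExecutionException from get()
ExecutorService io = Executors.newFixedThreadPool(8);
List<Future<CDR>> tasks = new ArrayList<>();
for (File f : files) {
    tasks.add(io.submit(() -> readCDRP(f)));  // assumes readCDRP is changed to return a CDR
}
List<CDR> cdrs = new ArrayList<>();
for (Future<CDR> t : tasks) {
    cdrs.add(t.get());
}
io.shutdown();
cdrs.sort(cdrsorter);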
When you start programming in a functional style, you should prefer immutable objects that can be fully created via their constructor (or perhaps via a builder pattern or a factory method). So your CDREntry class may look like this:
class CDREntry {
private final String[] fields;
private final String name;
public CDREntry(String name, String[] fields) {
this.name = name;
this.fields = fields;
}
// Add getters and whatever
}
And your CDR class may look like this:
class CDR {
private final String fileName;
private final List<CDREntry> entries;
public CDR(String fileName, List<CDREntry> entries) {
this.fileName = fileName;
this.entries = entries;
}
public List<CDREntry> getEntries() {
return entries;
}
public String getFileName() {
return this.fileName;
}
}
With such classes, things become easier. The rest of the code can be rewritten like this:
public void execute4() {
File filePath = new File(filePathData);
File[] files = filePath.listFiles((File dir, String name) ->
        name.endsWith("CDR")); // fixed this line: it had a compilation error
List<CDR> cdrs = Arrays.stream(files).parallel()
.map(this::readCDRP).sorted(cdrsorter)
.collect(Collectors.toList());
}
private CDR readCDRP(File file) {
try (BufferedReader bfr = new BufferedReader(new FileReader(file))) {
// I'm not sure that collecting lines into list
// before main processing was actually necessary
return bfr.lines().parallelStream()
.map(e -> new CDREntry(file.getName(), e.split(",", -1)))
.collect(Collectors.collectingAndThen(
Collectors.toList(), list -> new CDR(file.getName(), list)));
} catch (IOException e) {
throw new UncheckedIOException(e);
}
}
In general, remember that forEach is usually not the cleanest way to solve such tasks. It may be helpful when integrating streams into legacy code, but it should generally be avoided.
You are using a parallel stream with a lambda that has side effects (the lambda updates the ArrayList 'cdrs').
Try to use a Collector or a reduction operation instead, for example as sketched below.
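For instance, the inner loop over lines could gather its entries with a collector instead of mutating shared state (a sketch, reusing the CDREntry(String, String[]) constructor from the previous answer):

List<CDREntry> entries = lines.parallelStream()
        .map(line -> new CDREntry(file.getName(), line.split(",", -1)))
        .collect(Collectors.toList());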
I have programmed a JAX-RS web service with Jersey that queries prices from different websites and gives the result back as XML through JAXB-annotated classes. Unfortunately, some websites take up to 15 seconds to respond, so I am using multiple threads to query those prices.
I would like to write a client for this web service now, and my users will not want to wait 30 seconds after they hit "search" for the result to arrive, so my idea is to dynamically update the result table as the results from my JAX-RS web service come back.
After 30 seconds, or once all threads have completed, my web service should time out and close the <result> element.
Right now my web service runs all threads and returns the result only after all threads have completed. I would like to dynamically add results to the XML output as they come in; how can I accomplish that?
The structure of the XML response is:
<result>
<articles>
<article>
content of article
</article>
</articles>
<!-- as the web service gets results from the websites, it adds new article elements -->
</result>
RequestController.java
#Path("/request")
public class RequestController {
#GET
#Produces("application/xml")
public Response getRequest(#QueryParam("part") String part) {
response = new Response();
driverController = new DriverController(this.response, this.part);
this.response = driverController.query();
return this.response;
}
}
DriverController.java
public class DriverController {
public Response query() {
CompletionService<Deque<Article>> completionService = new ExecutorCompletionService<Deque<Article>>(
Worker.getThreadPool());
final Deque<Article> articleQueue = new LinkedList<Article>();
int submittedTasks = 0;
// This thread will take about 4 seconds to finish
Driver driverA = new DriverA(this.part,
this.currency, this.language);
// This thread will take about 15 seconds to finish
Driver driverN = new DriverN(this.part,
this.currency, this.language);
completionService.submit(driverA);
submittedTasks++;
completionService.submit(driverN);
submittedTasks++;
for (int i = 0; i < submittedTasks; i++) {
log.info("Tasks: " + submittedTasks);
try {
Future<Deque<Article>> completedFuture = completionService.take();
try {
Deque<Article> articleQueueFromThread = completedFuture.get();
if (articleQueueFromThread != null) {
articleQueue.addAll(articleQueueFromThread);
response.setStatus("OK");
}
} catch (ExecutionException e) {
log.error(e.getMessage());
e.printStackTrace();
}
} catch (InterruptedException e) {
log.error(e.getMessage());
e.printStackTrace();
}
}
for (Article article : articleQueue) {
this.response.addArticle(article);
}
return this.response;
}
}
Response.java
@XmlRootElement
public class Response {
Queue<Article> queue = new ConcurrentLinkedQueue<Article>();
private String status;
private String code;
private String message;
private List<Article> articles = new ArrayList<Article>();
public Response(){
}
public void setMessage(String message) {
this.message = message;
}
@XmlAttribute
public String getMessage() {
return message;
}
public void setStatus(String status) {
this.status = status;
}
@XmlAttribute
public String getStatus() {
return status;
}
public void setCode(String code) {
this.code = code;
}
@XmlAttribute
public String getCode() {
return code;
}
public void addArticle(Article article) {
this.articles.add(article);
System.out.println("Response: ADDED ARTICLE TO RESPONSE");
}
@XmlElement(name = "article")
@XmlElementWrapper(name = "articles")
public List<Article> getArticles() {
return articles;
}
}
I started to adapt your code to do it, but I decided it was easier to work up an independent example. The example starts a Grizzly+Jersey server with a single resource class in it. A GET on the resource spawns three threads that delay for 2, 4, and 6 seconds before returning some objects. After the server starts, another thread makes a request to the server. When you run it, you can plainly see that the requester receives chunks of XML as the respective threads finish their work in the server. The one thing it doesn't do is wrap separately-delivered XML chunks in a single root element since that should be relatively trivial.
The entire executable source is below, and if you have maven and git, you can clone it from github and run it with:
git clone git://github.com/zzantozz/testbed.git tmp
cd tmp
mvn compile exec:java -Dexec.mainClass=rds.jersey.JaxRsResource -pl jersey-with-streaming-xml-response
Source:
import com.sun.grizzly.http.SelectorThread;
import com.sun.jersey.api.container.grizzly.GrizzlyWebContainerFactory;
import javax.ws.rs.*;
import javax.ws.rs.core.StreamingOutput;
import javax.xml.bind.*;
import javax.xml.bind.annotation.*;
import java.io.*;
import java.net.*;
import java.util.*;
import java.util.concurrent.*;
#Path("/streaming")
public class JaxRsResource {
private static ExecutorService executorService = Executors.newFixedThreadPool(4);
private static int fooCounter;
private Marshaller marshaller;
public JaxRsResource() throws JAXBException {
marshaller = JAXBContext.newInstance(Foo.class).createMarshaller();
marshaller.setProperty("jaxb.fragment", Boolean.TRUE);
}
@GET
@Produces("application/xml")
public StreamingOutput streamStuff() {
System.out.println("Got request for streaming resource; starting delayed response threads");
final List<Future<List<Foo>>> futureFoos = new ArrayList<Future<List<Foo>>>();
futureFoos.add(executorService.submit(new DelayedFoos(2)));
futureFoos.add(executorService.submit(new DelayedFoos(4)));
futureFoos.add(executorService.submit(new DelayedFoos(6)));
return new StreamingOutput() {
public void write(OutputStream output) throws IOException {
for (Future<List<Foo>> futureFoo : futureFoos) {
writePartialOutput(futureFoo, output);
output.write("\n".getBytes());
output.flush();
}
}
};
}
private void writePartialOutput(Future<List<Foo>> futureFoo, OutputStream output) {
try {
List<Foo> foos = futureFoo.get();
System.out.println("Server sending a chunk of XML");
for (Foo foo : foos) {
marshaller.marshal(foo, output);
}
} catch (JAXBException e) {
throw new IllegalStateException("JAXB couldn't marshal. Handle it.", e);
} catch (InterruptedException e) {
throw new IllegalStateException("Task was interrupted. Handle it.", e);
} catch (ExecutionException e) {
throw new IllegalStateException("Task failed to execute. Handle it.", e);
}
}
class DelayedFoos implements Callable<List<Foo>> {
private int delaySeconds;
public DelayedFoos(int delaySeconds) {
this.delaySeconds = delaySeconds;
}
public List<Foo> call() throws Exception {
Thread.sleep(delaySeconds * 1000);
return Arrays.asList(new Foo(fooCounter++), new Foo(fooCounter++), new Foo(fooCounter++));
}
}
public static void main(String[] args) throws IOException {
System.out.println("Starting Grizzly with the JAX-RS resource");
final String baseUri = "http://localhost:9998/";
final Map<String, String> initParams = new HashMap<String, String>();
initParams.put("com.sun.jersey.config.property.packages", "rds.jersey");
SelectorThread threadSelector = GrizzlyWebContainerFactory.create(baseUri, initParams);
System.out.println("Grizzly started");
System.out.println("Starting a thread to request the streamed XML");
executorService.submit(new HttpRequester(baseUri + "streaming"));
}
}
@XmlRootElement
class Foo {
@XmlElement
private int id;
Foo() {}
public Foo(int id) {
this.id = id;
}
}
class HttpRequester implements Runnable {
private String url;
public HttpRequester(String url) {
this.url = url;
}
public void run() {
try {
System.out.println("Doing HTTP GET on " + url);
HttpURLConnection urlConnection = (HttpURLConnection) new URL(url).openConnection();
BufferedReader in = new BufferedReader(new InputStreamReader(urlConnection.getInputStream()));
String line;
while ((line = in.readLine()) != null) {
System.out.println("Client got: " + line);
}
System.exit(0);
} catch (IOException e) {
throw new IllegalStateException("Some bad I/O happened. Handle it.", e);
}
}
}
Important points/differences to take note of:
Returning a Response from your resource method indicates that the entire response is contained in that object and doesn't allow for incremental updates to the response. Return a StreamingOutput instead. That tells Jersey that you'll be sending back a stream of data, which you can append to at will until you're done. The StreamingOutput gives you access to an OutputStream, which is what you use to send incremental updates and is the key to this whole thing. Of course, that means you have to handle the marshaling yourself. Jersey can only do the marshaling if you're returning the entire response at once.
Since the OutputStream is how you send back the data a little at a time, you either have to do the threading in your JAX-RS resource or pass the OutputStream down to your DriverController and write to it there.
Be sure to invoke flush() on the OutputStream if you want to force it to send out data immediately. Otherwise, nothing will be sent to the client until whatever internal buffer is filled up. Note that invoking flush() yourself circumvents the purpose of the buffer and makes your app more chatty.
All in all, to apply this to your project, the primary thing to do is change your resource method to return a StreamingOutput implementation and invoke your DriverController from inside that implementation, passing the OutputStream to the DriverController. Then in the DriverController, when you get some Articles back from a thread, instead of adding it to a queue for later, write it to the OutputStream immediately.
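Applied to the question's resource, that change might look roughly like this. This is only a sketch: it assumes a hypothetical DriverController overload that takes the OutputStream and marshals and flushes each Article as the corresponding driver thread completes.

@GET
@Produces("application/xml")
public StreamingOutput getRequest(@QueryParam("part") final String part) {
    return new StreamingOutput() {
        public void write(OutputStream output) throws IOException {
            output.write("<result><articles>".getBytes());
            output.flush();
            // hypothetical overload: writes one <article> element per finished driver
            new DriverController(part).query(output);
            output.write("</articles></result>".getBytes());
            output.flush();
        }
    };
}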
@Ryan Stewart: how would we resolve the same issue in an Axis2.x SOAP-based web service environment with an HTML page as the web client?
What I'm thinking is that the DriverController could keep the Future objects in the session and return the first available response (article) to the client along with a unique session identifier; the client could then make another web service call (preferably through Ajax + jQuery), passing the saved session identifier, which would trigger the DriverController to fetch more results and send them back. Is that a viable solution? Would it be applicable to the above environment too?