I currently have a fluent builder, e.g.
SomeObject someObject = new SomeObject.Builder()
.field1(someWebService.getSomeValue())
.field2(someOtherService.getSomething())
.field3(anotherService.youGetThePicture())
// etc x 10
.createSomeObject();
I'd like to elegantly convert this to run in parallel, so the three service calls can run at the same time. Is this possible? I'm using Java 8 so streams are an option.
Thanks.
Well, just for reference, what you want to do could be done like this:
SomeObject.Builder someObject = new SomeObject.Builder();
CompletableFuture<SomeObject.Builder> cf1 = CompletableFuture.supplyAsync(() -> someObject.field1(someWebService.getSomeValue()));
CompletableFuture<SomeObject.Builder> cf2 = CompletableFuture.supplyAsync(() -> someObject.field2(someOtherService.getSomething()));
...
CompletableFuture.allOf(cf1, cf2, ...).join(); // wait for every call to finish
SomeObject res = someObject.createSomeObject();
But I would strongly advise you to instead call the services asynchronously, collect the responses, and then build your object synchronously.
If you want to do it using streams, you could add all the lambdas (service calls) to a collection and then run serviceCalls.stream().parallel().forEach(Consumer::accept);
That's not good either, though.
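For completeness, here is a minimal sketch of that recommendation, assuming the field values are Strings (the actual return types are not shown in the question): fire off the service calls asynchronously, wait for them, and then run the builder on a single thread.

CompletableFuture<String> f1 = CompletableFuture.supplyAsync(() -> someWebService.getSomeValue());
CompletableFuture<String> f2 = CompletableFuture.supplyAsync(() -> someOtherService.getSomething());
CompletableFuture<String> f3 = CompletableFuture.supplyAsync(() -> anotherService.youGetThePicture());

// join() blocks until each call has completed; the builder is then used from one thread only.
SomeObject someObject = new SomeObject.Builder()
        .field1(f1.join())
        .field2(f2.join())
        .field3(f3.join())
        .createSomeObject();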
Related
I've created a reactive flow in my controller endpoint addEntry, where one object inside it should be created only once per request since it holds state.
#Override
public Mono<FileResultDto> addEntry(final Flux<byte[]> body,
final String fileId) {
return keyVaultRepository.findByFiletId(fileId)
.switchIfEmpty(Mono.defer(() -> {
final KeyVault keyVault = KeyVault.of(fileId);
return keyVaultRepository.save(keyVault);
}))
.map(keyVault -> Mono
.just(encryption.createEncryption(keyVault.getKey(), ENCRYPT_MODE)) // createEncryption obj. that holds a state
.cache())
.map(encryption -> Flux
.from(body)
.map(bytes -> encryption
.share()
.block()
.update(bytes) // works with the state and changes it per byte[] going through this flux
)
)
.flatMap(flux -> persistenceService.addEntry(flux, fileId));
}
Before I asked this question I used encryption.block(), which was failing.
I found this one and updated my code accordingly (added .share()).
The test itself is working, but I am wondering if this is the proper way to work with an object that should be created and used only once in the reactive flow, provided by
encryptionService.createEncryption(keyVault.getKey(), ENCRYPT_MODE)
Happy to hear your opinion
Mono.just is only a wrapper around a pre-computed value, so there is no need to cache() or share() it: on subscription it simply gives back the value it already holds.
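As a small standalone illustration (not taken from the question's code, and expensiveCall() is just a hypothetical method): the value is computed once, when the Mono is built, and every subscriber gets that same value back.

Mono<String> mono = Mono.just(expensiveCall()); // expensiveCall() runs here, exactly once
mono.subscribe(System.out::println); // emits the already-computed value
mono.subscribe(System.out::println); // same value again, nothing is re-computed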
But, in your example, there is something I do not understand.
If we simplify / decompose it, it gives the following:
Mono<KeyVault> vault = keyVaultRepository.findByFiletId(fileId)
    .switchIfEmpty(Mono.defer(() -> keyVaultRepository.save(KeyVault.of(fileId))));
Mono<Mono<Encryption>> fileEncryption = vault
    .map(it -> Mono.just(createEncryption(it.getKey())).cache()); // <1>
Mono<Flux<Encryption>> encryptedContent = fileEncryption.map(encryption -> Flux
.from(body)
.map(bytes -> encryption
.share()
.block()
.update(bytes))); // <2>
Mono<FileResultDto> file = encryptedContent.flatMap(flux -> persistenceService.addEntry(flux, fileId));
Why are you trying to wrap your encryption object? The result is already part of a reactive pipeline. Mono.just() is redundant because you are already inside a map operation, and cache() on top of just() is also redundant, because a Mono.just is essentially a permanent cache.
What does your update(bytes) method do? Does it mutate the same object every time? Because if it does, you might have a problem here. Reactive streams cannot ensure thread-safety or proper ordering of actions on internally mutated state; that is out of their reach. You might work around the problem by using the scan operator, though, as shown in the refactoring below.
Without additional details, I would start refactoring the code like this:
Mono<KeyVault> vault = keyVaultRepository.findByFileId(fileId)
    .switchIfEmpty(Mono.defer(() -> keyVaultRepository.save(KeyVault.of(fileId))));

Mono<Encryption> fileEncryption = vault.map(it -> createEncryption(it.getKey()));

Flux<Encryption> encryptedContent = fileEncryption
    .flatMapMany(encryption -> body.scan(encryption, (it, block) -> it.update(block)));

Mono<FileResultDto> result = persistenceService.addEntry(encryptedContent, fileId);
I have 3 services which I want to chain
CompletableFuture<String> serviceA
CompletableFuture<String> serviceB(String resultFromA)
CompletableFuture<String> serviceC(String resultFromA, String resultFromB)
If I use thenCompose, I can't seem to maintain the first result
i.e.
serviceA.thenCompose(a -> serviceB(a))
.thenCompose(b -> serviceC(a, b)); // a is lost
If I use CompletableFuture.allOf(), I don't see that it allows chaining - running in sequence and passing results.
I am going to modify serviceB so that it returns a Pair, or some composite object, but is there a better way?
serviceA.thenCompose(a -> serviceB(a).thenCompose(b -> serviceC(a, b)));
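Nesting the second thenCompose inside the lambda that receives a keeps a in scope for the call to serviceC, so no Pair or composite type is needed. A self-contained sketch with stubbed services (the bodies and return values are placeholders, not from the question):

import java.util.concurrent.CompletableFuture;

public class ChainDemo {
    static CompletableFuture<String> serviceA() {
        return CompletableFuture.supplyAsync(() -> "a");
    }
    static CompletableFuture<String> serviceB(String a) {
        return CompletableFuture.supplyAsync(() -> a + "b");
    }
    static CompletableFuture<String> serviceC(String a, String b) {
        return CompletableFuture.supplyAsync(() -> a + b + "c");
    }

    public static void main(String[] args) {
        String result = serviceA()
                .thenCompose(a -> serviceB(a)
                        .thenCompose(b -> serviceC(a, b))) // 'a' is still in scope here
                .join();
        System.out.println(result); // prints "aabc"
    }
}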
I have a use case in my Spring boot application as follows:
I would like to fetch the id field value from the response with the following function:
String id = getIdFromResponse(response);
If I don't get any id in the response, then I check if the id field is present in the request argument with the following function:
String id = getIdFromRequest(request);
As of now, I am invoking them sequentially. But I would like to make these two functions run in parallel, and I would like to stop as soon as I get an id from either of them.
I am wondering if there is any way to implement this using streams in Java 8.
You can use something like this:
String id = Stream.<Supplier<String>>of(
() -> getIdFromResponse(response),
() -> getIdFromRequest(request)
)
.parallel()
.map(Supplier::get)
.filter(Objects::nonNull)
.findFirst()
.orElseThrow();
The suppliers are needed because without them both calls would already have been executed (sequentially) before the stream is even created.
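To illustrate the difference (a sketch, not part of the answer above): method arguments are evaluated before the call, so passing the results directly would defeat the purpose.

// Eager: both lookups run, one after the other, before Stream.of is even entered.
Stream.of(getIdFromResponse(response), getIdFromRequest(request));

// Lazy: nothing runs until the pipeline calls Supplier::get, which a parallel
// stream can do on different threads.
Stream.<Supplier<String>>of(
        () -> getIdFromResponse(response),
        () -> getIdFromRequest(request));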
I also assumed that your methods return null when nothing is found, so I had to filter these values out with .filter(Objects::nonNull).
Depending on your use case, you can replace .orElseThrow() with something different, like .orElse(null).
There is no need to use the Stream API when there is a method designed exactly for this:
ExecutorService::invokeAny(Collection<? extends Callable<T>>)
Executes the given tasks, returning the result of one that has completed successfully (i.e., without throwing an exception), if any do. Upon normal or exceptional return, tasks that have not completed are cancelled.
List<Callable<String>> collection = Arrays.asList(
() -> getIdFromResponse(response),
() -> getIdFromRequest(request)
);
// you want the same number of threads as the size of the collection
ExecutorService executorService = Executors.newFixedThreadPool(collection.size());
String id = executorService.invokeAny(collection);
Three notes:
There is also an overloaded method with timeout throwing TimeoutException if no result is available in time: invokeAny(Collection<? extends Callable<T>>, long, TimeUnit)
You need to handle ExecutionException and InterruptedException from the invokeAny method.
Don't forget to shut down the executor service once you are done (a minimal sketch covering the last two notes follows below).
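A minimal sketch putting those notes together (getIdFromResponse and getIdFromRequest are the question's methods; the 500 ms timeout is an arbitrary value for illustration):

List<Callable<String>> tasks = Arrays.asList(
        () -> getIdFromResponse(response),
        () -> getIdFromRequest(request));
ExecutorService executorService = Executors.newFixedThreadPool(tasks.size());
try {
    // returns the first successful result; the remaining task is cancelled
    String id = executorService.invokeAny(tasks, 500, TimeUnit.MILLISECONDS);
    // ... use id ...
} catch (TimeoutException e) {
    // neither call produced a result in time
} catch (ExecutionException e) {
    // no task completed successfully
} catch (InterruptedException e) {
    Thread.currentThread().interrupt();
} finally {
    executorService.shutdown();
}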
If you want to be in full control over when to enable the alternative evaluation, you may use CompletableFuture:
CompletableFuture<String> job
= CompletableFuture.supplyAsync(() -> getIdFromResponse(response));
String id;
try {
id = job.get(300, TimeUnit.MILLISECONDS);
}
catch(TimeoutException ex) {
// did not respond within the specified time, set up alternative
id = job.applyToEither(
CompletableFuture.supplyAsync(() -> getIdFromRequest(request)), s -> s).join();
}
catch(InterruptedException|ExecutionException ex) {
// handle error
}
The second job is only submitted when the first did not complete within the specified time. Then, whichever job responds first will provide the result value.
I have a list and I'm streaming this list to get some filtered data as:
List<Future<Accommodation>> submittedRequestList =
list.stream().filter(Objects::nonNull)
.map(config -> taskExecutorService.submit(() -> requestHandler
.handle(jobId, config))).collect(Collectors.toList());
When I wrote tests, I tried to return some data using a when():
List<Future<Accommodation>> submittedRequestList = mock(LinkedList.class);
when(list.stream().filter(Objects::nonNull)
.map(config -> executorService.submit(() -> requestHandler
.handle(JOB_ID, config))).collect(Collectors.toList())).thenReturn(submittedRequestList);
I'm getting org.mockito.exceptions.misusing.WrongTypeOfReturnValue:
LinkedList$$EnhancerByMockitoWithCGLIB$$716dd84d cannot be returned by submit() error. How may I resolve this error by using a correct when()?
You can only mock single method calls, not entire fluent interface cascades.
E.g., you could do
Stream<Future> fs = mock(Stream.class);
when(requestList.stream()).thenReturn(fs);
Stream<Future> filtered = mock(Stream.class);
when(fs.filter(Objects::nonNull)).thenReturn(filtered);
and so on.
IMO it's really not worth mocking the whole thing; just verify that the filters were applied and check the contents of the result list. A sketch of that approach is below.
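For illustration only: the wrapping service class, its constructor, and the submitAll method name are assumptions based on the snippet in the question, not real code from it. The stream code stays real, a real single-thread executor is injected, and only requestHandler is mocked:

RequestHandler requestHandler = mock(RequestHandler.class);
ExecutorService taskExecutorService = Executors.newSingleThreadExecutor();
Accommodation accommodation = mock(Accommodation.class);
when(requestHandler.handle(eq(JOB_ID), any())).thenReturn(accommodation);

// Hypothetical class under test containing the stream shown in the question.
SubmissionService service = new SubmissionService(taskExecutorService, requestHandler);
List<Future<Accommodation>> result = service.submitAll(JOB_ID, list);

assertEquals(list.size(), result.size());
assertSame(accommodation, result.get(0).get()); // test method declared with "throws Exception"
taskExecutorService.shutdown();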
I have a list of Strings, and for each of them I need to start a new thread and gather all the information into a CompletableFuture.
This is my iteration:
for (String result: results) {
candidateInfos.add(getCandidatesInfo(result));
}
I am trying threads for the first time and would appreciate some help.
You can build a Stream of suppliers, one for each method call, and then collect the results into a list as follows.
ExecutorService executor = Executors.newFixedThreadPool(results.size()); // one shared pool, not one per element
Stream.Builder<Supplier<CandidateInfo>> streamBuilder = Stream.builder();
results.forEach(string -> streamBuilder.accept(() -> this.getCandidatesInfo(string)));
List<CandidateInfo> candidateInfos = streamBuilder.build()
        .map(supplier -> CompletableFuture.supplyAsync(supplier, executor))
        .collect(Collectors.toList()) // submit everything first...
        .stream()
        .map(CompletableFuture::join) // ...then wait for the results
        .collect(Collectors.toList());
executor.shutdown();
Here I have used a separate Executor because, by default, CompletableFuture.supplyAsync uses the common ForkJoinPool, which can block other tasks if it fills up. For more info see http://fahdshariff.blogspot.in/2016/06/java-8-completablefuture-vs-parallel.html
Edit: Less syntax.
You can create a stream directly from the list (or, if you have an array, with Arrays.stream) instead of using Stream.Builder:
ExecutorService executor = Executors.newFixedThreadPool(results.size());
List<CandidateInfo> candidateInfos = results.stream()
        .map(s -> CompletableFuture.supplyAsync(() -> this.getCandidatesInfo(s), executor))
        .collect(Collectors.toList()).stream() // collect the futures first so the calls run concurrently
        .map(CompletableFuture::join).collect(Collectors.toList());
executor.shutdown();