Java Lazy Stream of Strings including List<String>

I'm creating a Stream<String> lazily for the first two simple items. However, part of my stream comes from a List<String>.
Stream<String> streamA = Stream.concat(
        Stream.generate(item::getStringA),
        Stream.generate(item::getStringB));
return Stream.concat(streamA, item.getStringList(param).stream());
The above works, but .getStringList needs to be called lazily as well. It's not clear to me how to fetch it and "merge" it in with the rest of the stream.

I think what you actually want to do is:
return Stream.<Supplier<Stream<String>>>of(
        () -> Stream.of(item.getStringA()),
        () -> Stream.of(item.getStringB()),
        () -> item.getStringList(param).stream())
    .flatMap(Supplier::get);
This produces a fully lazy Stream<String> where, e.g., .limit(0).count() will not call any method on item, and .findFirst() will only invoke getStringA(), etc.
The stream’s content will be equivalent to
Stream.concat(
    Stream.of(item.getStringA(), item.getStringB()),
    item.getStringList(param).stream())
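For illustration, here is a minimal runnable sketch of that laziness (this Item is a hypothetical stand-in with print statements; the accessor names match the question's):

import java.util.Arrays;
import java.util.List;
import java.util.function.Supplier;
import java.util.stream.Stream;

public class LazyStreamDemo {

    // hypothetical Item that prints when each accessor is called
    static class Item {
        String getStringA() { System.out.println("getStringA called"); return "A"; }
        String getStringB() { System.out.println("getStringB called"); return "B"; }
        List<String> getStringList(String param) {
            System.out.println("getStringList called");
            return Arrays.asList("C", "D");
        }
    }

    public static void main(String[] args) {
        Item item = new Item();
        Stream<String> s = Stream.<Supplier<Stream<String>>>of(
                () -> Stream.of(item.getStringA()),
                () -> Stream.of(item.getStringB()),
                () -> item.getStringList("param").stream())
            .flatMap(Supplier::get);
        // Prints "getStringA called" and then "A"; the other two
        // suppliers are never invoked because findFirst() short-circuits.
        System.out.println(s.findFirst().orElse("none"));
    }
}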

I don't think any of this does what you think it does. Stream.generate always generates an infinite stream. But the closest thing to what you want is going to be
StreamSupport.stream(() -> item.getStringList(param).spliterator(), 0, false)
...which will lazily call item.getStringList(param). (What you want isn't really an intended use case of Stream, so it's not very well supported.)
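For illustration, a minimal sketch of that approach (getStringList here is a stand-in for the question's method; the print statements just make the deferred call visible):

import java.util.Arrays;
import java.util.List;
import java.util.stream.Stream;
import java.util.stream.StreamSupport;

public class DeferredSpliteratorDemo {

    static List<String> getStringList(String param) {
        System.out.println("getStringList called");
        return Arrays.asList("C", "D", "E");
    }

    public static void main(String[] args) {
        // The Supplier<Spliterator<String>> is only invoked when a
        // terminal operation starts traversing the stream.
        Stream<String> lazy = StreamSupport.stream(
                () -> getStringList("param").spliterator(), 0, false);
        System.out.println("stream built, list not fetched yet");
        lazy.forEach(System.out::println); // getStringList runs here
    }
}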

What you could do is return a Stream from your item.getStringList():
public static void main(String[] args) {
    Item item = new Item();
    Stream<String> a = Stream.concat(Stream.of("A"), Stream.of("B"));
    Stream<String> anotherStream = Stream.concat(a, item.getStringList());
    anotherStream.forEach(System.out::println);
}

private static class Item {
    public Stream<String> getStringList() {
        List<String> l = new ArrayList<>();
        l.add("C");
        l.add("D");
        l.add("E");
        l.add("F");
        final AtomicInteger i = new AtomicInteger(0);
        // seed with l.get(0), then advance the index on each step
        return Stream.iterate(l.get(i.get()), (f) -> {
            // proof of laziness
            try {
                Thread.sleep(1000);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
            return l.get(i.incrementAndGet());
        })
        // iterate is unbounded by default
        .limit(l.size());
    }
}
I'm not sure how helpful that approach would be, though, since your list is still in memory.

Related

Java: the fastest way to filter List with 1m of objects

I have a List of ProductDTO, and a Product.
This list can contain 100 objects, or it can contain 1M objects.
I am reading this list from a CSV file.
Here is how I am filtering it now:
productDtos.parallelStream()
    .filter(i -> i.getName().equals(product.getName()))
    .filter(i -> Objects.equals(i.getCode(), product.getCode()))
    .map(Product::new)
    // getting object here
So, what is the best way to process it? I thought I should use multithreading: one thread starting from the beginning of the list, another from the end.
Any ideas on how to improve the speed of filtering a list in big-data cases?
Thank you
First of all, I see you have already loaded all productDtos into memory.
That can lead to very high memory consumption.
I suggest you read the CSV file row by row and filter the rows one by one. In that case, your code might look like the following:
public class Csv {
    public static void main(String[] args) {
        File file = new File("your.csv");
        try (final BufferedReader br = new BufferedReader(new FileReader(file))) {
            final List<String> filtered = br.lines().parallel()
                .map(Csv::toYourDTO)
                .filter(Csv::yourFilter)
                .collect(Collectors.toList());
            System.out.println(filtered);
        } catch (IOException e) {
            // todo: handle the error
        }
    }

    private static boolean yourFilter(String s) {
        return true; // todo
    }

    private static String toYourDTO(String s) {
        return ""; // todo
    }
}
I tend to construct a map and call get on it instead of filtering in a loop.
For instance, if you have N codes per product, you can do:
Map<String, Map<String, List<ProductDTO>>> productDtoByNameAndCode =
    productDtos.stream()
               .collect(groupingBy(ProductDTO::getName,
                        groupingBy(ProductDTO::getCode)));
Then, for each product, you just have to do:
List<ProductDTO> correspondingProductDTOs =
    productDtoByNameAndCode.get(product.getName()).get(product.getCode());
That way, you don't have to filter the whole list for every product.
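A sketch of that lookup with a guard for products whose name or code is absent from the map (the getOrDefault fallbacks are my addition; ProductDTO and its getters are as in the question):

// assumes: import static java.util.stream.Collectors.groupingBy;
//          import java.util.*;

// Build the two-level index once.
Map<String, Map<String, List<ProductDTO>>> byNameAndCode =
    productDtos.stream()
               .collect(groupingBy(ProductDTO::getName,
                                   groupingBy(ProductDTO::getCode)));

// getOrDefault avoids a NullPointerException when nothing matches.
List<ProductDTO> matches = byNameAndCode
    .getOrDefault(product.getName(), Collections.emptyMap())
    .getOrDefault(product.getCode(), Collections.emptyList());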

Add a default item to a stream collection

I am performing some actions on a stream and returning an ArrayList. This is working without a problem, but I need a final step that adds an element if the list is empty (nothing to do with Optionals/nulls, just part of the requirement). My way is a bit clunky, and I wondered if it can be done in the stream operation instead.
public ArrayList<String> getArrayList() {
    ArrayList<String> aL = setOfStrings.stream()
        .filter(remove some)
        .filter(remove some more)
        .map(i -> createStringAbout(i))
        .collect(Collectors.toCollection(ArrayList::new));
    if (aL.size() < 1) {
        aL.add("No items passed the test");
    }
    return aL;
}
So really I would like to do
return set.stream()...
Is this possible?
Use Collectors.collectingAndThen. Note that its first argument must itself be a Collector, so wrap the list creation in Collectors.toCollection:
.collect(Collectors.collectingAndThen(
    Collectors.toCollection(ArrayList::new),
    rs -> {
        if (rs.isEmpty()) {
            rs.add("something");
        }
        return rs;
    }));
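In the context of the question's getArrayList method, that looks roughly like this (the two filter steps stay as the question's placeholders):

public ArrayList<String> getArrayList() {
    return setOfStrings.stream()
        .filter(/* remove some */)
        .filter(/* remove some more */)
        .map(i -> createStringAbout(i))
        .collect(Collectors.collectingAndThen(
            Collectors.toCollection(ArrayList::new),
            list -> {
                // the finisher runs once, after the collection is complete
                if (list.isEmpty()) {
                    list.add("No items passed the test");
                }
                return list;
            }));
}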

Group and Reduce list of objects

I have a list of objects with many duplicates and some fields that need to be merged. I want to reduce this down to a list of unique objects using only Java 8 Streams (I know how to do this via old-skool means, but this is an experiment.)
This is what I have right now. I don't really like this because the map-building seems extraneous and the values() collection is a view of the backing map, and you need to wrap it in a new ArrayList<>(...) to get a more specific collection. Is there a better approach, perhaps using the more general reduction operations?
@Test
public void reduce() {
    Collection<Foo> foos = Stream.of("foo", "bar", "baz")
        .flatMap(this::getfoos)
        .collect(Collectors.toMap(f -> f.name, f -> f, (l, r) -> {
            l.ids.addAll(r.ids);
            return l;
        })).values();
    assertEquals(3, foos.size());
    foos.forEach(f -> assertEquals(10, f.ids.size()));
}
private Stream<Foo> getfoos(String n) {
    return IntStream.range(0, 10).mapToObj(i -> new Foo(n, i));
}

public static class Foo {
    private String name;
    private List<Integer> ids = new ArrayList<>();

    public Foo(String n, int i) {
        name = n;
        ids.add(i);
    }
}
If you break the grouping and reducing steps up, you can get something cleaner:
Stream<Foo> input = Stream.of("foo", "bar", "baz").flatMap(this::getfoos);
Map<String, Optional<Foo>> collect = input.collect(
    Collectors.groupingBy(f -> f.name, Collectors.reducing(Foo::merge)));
Collection<Optional<Foo>> collected = collect.values();
This assumes a few convenience methods in your Foo class:
public Foo(String n, List<Integer> ids) {
    this.name = n;
    this.ids.addAll(ids);
}

public static Foo merge(Foo src, Foo dest) {
    List<Integer> merged = new ArrayList<>();
    merged.addAll(src.ids);
    merged.addAll(dest.ids);
    return new Foo(src.name, merged);
}
As already pointed out in the comments, a map is a very natural thing to use when you want to identify unique objects. If all you needed to do was find the unique objects, you could use the Stream::distinct method. This method hides the fact that there is a map involved, but apparently it does use a map internally, as hinted by this question that shows you should implement a hashCode method or distinct may not behave correctly.
In the case of the distinct method, where no merging is necessary, it is possible to return some of the results before all of the input has been processed. In your case, unless you can make additional assumptions about the input that haven't been mentioned in the question, you do need to finish processing all of the input before you return any results. Thus this answer does use a map.
It is easy enough to use streams to process the values of the map and turn it back into an ArrayList, though. I show that in this answer, as well as providing a way to avoid the appearance of an Optional<Foo>, which shows up in one of the other answers.
public void reduce() {
    ArrayList<Foo> foos = Stream.of("foo", "bar", "baz")
        .flatMap(this::getfoos)
        .collect(Collectors.collectingAndThen(
            Collectors.groupingBy(f -> f.name,
                Collectors.reducing(Foo.identity(), Foo::merge)),
            map -> map.values().stream()
                      .collect(Collectors.toCollection(ArrayList::new))));
    assertEquals(3, foos.size());
    foos.forEach(f -> assertEquals(10, f.ids.size()));
}

private Stream<Foo> getfoos(String n) {
    return IntStream.range(0, 10).mapToObj(i -> new Foo(n, i));
}

public static class Foo {
    private String name;
    private List<Integer> ids = new ArrayList<>();
    private static final Foo BASE_FOO = new Foo("", 0);

    public static Foo identity() {
        return BASE_FOO;
    }

    // use only if side effects to the argument objects are okay
    public static Foo merge(Foo fooOne, Foo fooTwo) {
        if (fooOne == BASE_FOO) {
            return fooTwo;
        } else if (fooTwo == BASE_FOO) {
            return fooOne;
        }
        fooOne.ids.addAll(fooTwo.ids);
        return fooOne;
    }

    public Foo(String n, int i) {
        name = n;
        ids.add(i);
    }
}
If the input elements are supplied in random order, then having an intermediate map is probably the best solution. However, if you know in advance that all the foos with the same name are adjacent (this condition is actually met in your test), the algorithm can be greatly simplified: you just need to compare the current element with the previous one and merge them if the name is the same.
Unfortunately there's no Stream API method which would allow you do to such thing easily and effectively. One possible solution is to write custom collector like this:
public static List<Foo> withCollector(Stream<Foo> stream) {
    return stream.collect(Collector.<Foo, List<Foo>>of(ArrayList::new,
        (list, t) -> {
            Foo f;
            if (list.isEmpty() || !(f = list.get(list.size() - 1)).name.equals(t.name))
                list.add(t);
            else
                f.ids.addAll(t.ids);
        },
        (l1, l2) -> {
            if (l1.isEmpty())
                return l2;
            if (l2.isEmpty())
                return l1;
            if (l1.get(l1.size() - 1).name.equals(l2.get(0).name)) {
                l1.get(l1.size() - 1).ids.addAll(l2.get(0).ids);
                l1.addAll(l2.subList(1, l2.size()));
            } else {
                l1.addAll(l2);
            }
            return l1;
        }));
}
My tests show that this collector is always faster than collecting to map (up to 2x depending on average number of duplicate names), both in sequential and parallel mode.
Another approach is to use my StreamEx library which provides a bunch of "partial reduction" methods including collapse:
public static List<Foo> withStreamEx(Stream<Foo> stream) {
    return StreamEx.of(stream)
        .collapse((l, r) -> l.name.equals(r.name), (l, r) -> {
            l.ids.addAll(r.ids);
            return l;
        }).toList();
}
This method accepts two arguments: a BiPredicate which is applied to two adjacent elements and should return true if the elements should be merged, and a BinaryOperator which performs the merging. This solution is a little bit slower in sequential mode than the custom collector (in parallel the results are very similar), but it's still significantly faster than the toMap solution, and it's simpler and somewhat more flexible, as collapse is an intermediate operation, so you can collect in another way.
Again, both of these solutions work only if foos with the same name are known to be adjacent. It's a bad idea to sort the input stream by foo name and then use these solutions, because the sorting will drastically reduce performance, making it slower than the toMap solution.
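For illustration, here is how the adjacency precondition is met by the question's own input (a sketch using the withCollector method above):

// "foo" x10, then "bar" x10, then "baz" x10: equal names are adjacent,
// so partial reduction merges them without any sorting or intermediate map.
Stream<Foo> input = Stream.of("foo", "bar", "baz")
    .flatMap(n -> IntStream.range(0, 10).mapToObj(i -> new Foo(n, i)));
List<Foo> merged = withCollector(input); // 3 Foos, each holding 10 ids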
As already pointed out by others, an intermediate Map is unavoidable, as that’s the way of finding the objects to merge. Further, you should not modify source data during reduction.
Nevertheless, you can achieve both without creating multiple Foo instances:
List<Foo> foos = Stream.of("foo", "bar", "baz")
    .flatMap(n -> IntStream.range(0, 10).mapToObj(i -> new Foo(n, i)))
    .collect(collectingAndThen(groupingBy(f -> f.name),
        m -> m.entrySet().stream()
              .map(e -> new Foo(e.getKey(),
                  e.getValue().stream().flatMap(f -> f.ids.stream()).collect(toList())))
              .collect(toList())));
This assumes that you add a constructor
public Foo(String n, List<Integer> l) {
    name = n;
    ids = l;
}
to your Foo class, as it should have if Foo is really supposed to be capable of holding a list of IDs. As a side note, having a type which serves as a single item as well as a container for merged results seems unnatural to me. This is exactly why the code turns out to be so complicated.
If the source items had a single id, using something like groupingBy(f -> f.name, mapping(f -> f.id, toList())), followed by mapping the entries of (String, List<Integer>) to the merged items, would be sufficient.
Since this is not the case and Java 8 lacks the flatMapping collector, the flat-mapping step is moved to the second step, making it look much more complicated.
But in both cases, the second step is not superfluous, as it is where the result items are actually created, and converting the map to the desired list type comes for free.
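For completeness, Java 9 added Collectors.flatMapping, which restores the simpler first-step shape even for multi-id items; a sketch, assuming the same Foo(String, List<Integer>) constructor as above:

// Java 9+; assumes: import static java.util.stream.Collectors.*;
List<Foo> foos = Stream.of("foo", "bar", "baz")
    .flatMap(n -> IntStream.range(0, 10).mapToObj(i -> new Foo(n, i)))
    .collect(collectingAndThen(
        groupingBy(f -> f.name, flatMapping(f -> f.ids.stream(), toList())),
        m -> m.entrySet().stream()
              .map(e -> new Foo(e.getKey(), e.getValue()))
              .collect(toList())));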

Lambdas, multiple forEach with casting

Need some help thinking in lambdas from my fellow StackOverflow luminaries.
Standard case of picking through a list of a list of a list to collect some children deep in a graph. What awesome ways could Lambdas help with this boilerplate?
public List<ContextInfo> list() {
    final List<ContextInfo> list = new ArrayList<ContextInfo>();
    final StandardServer server = getServer();
    for (final Service service : server.findServices()) {
        if (service.getContainer() instanceof Engine) {
            final Engine engine = (Engine) service.getContainer();
            for (final Container possibleHost : engine.findChildren()) {
                if (possibleHost instanceof Host) {
                    final Host host = (Host) possibleHost;
                    for (final Container possibleContext : host.findChildren()) {
                        if (possibleContext instanceof Context) {
                            final Context context = (Context) possibleContext;
                            // copy to another object -- not the important part
                            final ContextInfo info = new ContextInfo(context.getPath());
                            info.setThisPart(context.getThisPart());
                            info.setNotImportant(context.getNotImportant());
                            list.add(info);
                        }
                    }
                }
            }
        }
    }
    return list;
}
Note the list itself is going to the client as JSON, so don't focus on what is returned. Must be a few neat ways I can cut down the loops.
Interested to see what my fellow experts create. Multiple approaches encouraged.
EDIT
The findServices and the two findChildren methods return arrays
EDIT - BONUS CHALLENGE
The "not important part" did turn out to be important. I actually need to copy a value only available in the host instance. This seems to ruin all the beautiful examples. How would one carry state forward?
final ContextInfo info = new ContextInfo(context.getPath());
info.setHostname(host.getName()); // The Bonus Challenge
It's fairly deeply nested but it doesn't seem exceptionally difficult.
The first observation is that if a for-loop translates into a stream, nested for-loops can be "flattened" into a single stream using flatMap. This operation takes a single element and returns an arbitrary number of elements in a stream. I looked it up and found that StandardServer.findServices() returns an array of Service, so we turn this into a stream using Arrays.stream(). (I make similar assumptions for Engine.findChildren() and Host.findChildren().)
Next, the logic within each loop does an instanceof check and a cast. This can be modeled using streams as a filter operation to do the instanceof followed by a map operation that simply casts and returns the same reference. This is actually a no-op but it lets the static typing system convert a Stream<Container> to a Stream<Host> for example.
Applying these transformations to the nested loops, we get the following:
public List<ContextInfo> list() {
    final List<ContextInfo> list = new ArrayList<ContextInfo>();
    final StandardServer server = getServer();
    Arrays.stream(server.findServices())
        .filter(service -> service.getContainer() instanceof Engine)
        .map(service -> (Engine) service.getContainer())
        .flatMap(engine -> Arrays.stream(engine.findChildren()))
        .filter(possibleHost -> possibleHost instanceof Host)
        .map(possibleHost -> (Host) possibleHost)
        .flatMap(host -> Arrays.stream(host.findChildren()))
        .filter(possibleContext -> possibleContext instanceof Context)
        .map(possibleContext -> (Context) possibleContext)
        .forEach(context -> {
            // copy to another object -- not the important part
            final ContextInfo info = new ContextInfo(context.getPath());
            info.setThisPart(context.getThisPart());
            info.setNotImportant(context.getNotImportant());
            list.add(info);
        });
    return list;
}
But wait, there's more.
The final forEach operation is a slightly more complicated map operation that converts a Context into a ContextInfo. Furthermore, these are just collected into a List, so we can use collectors to do this instead of creating an empty list up front and then populating it. Applying these refactorings results in the following:
public List<ContextInfo> list() {
    final StandardServer server = getServer();
    return Arrays.stream(server.findServices())
        .filter(service -> service.getContainer() instanceof Engine)
        .map(service -> (Engine) service.getContainer())
        .flatMap(engine -> Arrays.stream(engine.findChildren()))
        .filter(possibleHost -> possibleHost instanceof Host)
        .map(possibleHost -> (Host) possibleHost)
        .flatMap(host -> Arrays.stream(host.findChildren()))
        .filter(possibleContext -> possibleContext instanceof Context)
        .map(possibleContext -> (Context) possibleContext)
        .map(context -> {
            // copy to another object -- not the important part
            final ContextInfo info = new ContextInfo(context.getPath());
            info.setThisPart(context.getThisPart());
            info.setNotImportant(context.getNotImportant());
            return info;
        })
        .collect(Collectors.toList());
}
I usually try to avoid multi-line lambdas (such as in the final map operation) so I'd refactor it into a little helper method that takes a Context and returns a ContextInfo. This doesn't shorten the code at all, but I think it does make it clearer.
UPDATE
But wait, there's still more.
Let's extract the call to service.getContainer() into its own pipeline element:
return Arrays.stream(server.findServices())
    .map(service -> service.getContainer())
    .filter(container -> container instanceof Engine)
    .map(container -> (Engine) container)
    .flatMap(engine -> Arrays.stream(engine.findChildren()))
    // ...
This exposes the repetition of filtering on instanceof followed by a mapping with a cast. This is done three times in total. It seems likely that other code is going to need to do similar things, so it would be nice to extract this bit of logic into a helper method. The problem is that filter can change the number of elements in the stream (dropping ones that don't match) but it can't change their types. And map can change the types of elements, but it can't change their number. Can something change both the number and types? Yes, it's our old friend flatMap again! So our helper method needs to take an element and return a stream of elements of a different type. That return stream will contain a single casted element (if it matches) or it will be empty (if it doesn't match). The helper function would look like this:
<T, U> Stream<U> toType(T t, Class<U> clazz) {
    if (clazz.isInstance(t)) {
        return Stream.of(clazz.cast(t));
    } else {
        return Stream.empty();
    }
}
(This is loosely based on C#'s OfType construct mentioned in some of the comments.)
While we're at it, let's extract a method to create a ContextInfo:
ContextInfo makeContextInfo(Context context) {
    // copy to another object -- not the important part
    final ContextInfo info = new ContextInfo(context.getPath());
    info.setThisPart(context.getThisPart());
    info.setNotImportant(context.getNotImportant());
    return info;
}
After these extractions, the pipeline looks like this:
return Arrays.stream(server.findServices())
    .map(service -> service.getContainer())
    .flatMap(container -> toType(container, Engine.class))
    .flatMap(engine -> Arrays.stream(engine.findChildren()))
    .flatMap(possibleHost -> toType(possibleHost, Host.class))
    .flatMap(host -> Arrays.stream(host.findChildren()))
    .flatMap(possibleContext -> toType(possibleContext, Context.class))
    .map(this::makeContextInfo)
    .collect(Collectors.toList());
Nicer, I think, and we've removed the dreaded multi-line statement lambda.
UPDATE: BONUS CHALLENGE
Once again, flatMap is your friend. Take the tail of the stream and migrate it into the last flatMap before the tail. That way the host variable is still in scope, and you can pass it to a makeContextInfo helper method that's been modified to take host as well.
return Arrays.stream(server.findServices())
    .map(service -> service.getContainer())
    .flatMap(container -> toType(container, Engine.class))
    .flatMap(engine -> Arrays.stream(engine.findChildren()))
    .flatMap(possibleHost -> toType(possibleHost, Host.class))
    .flatMap(host -> Arrays.stream(host.findChildren())
        .flatMap(possibleContext -> toType(possibleContext, Context.class))
        .map(ctx -> makeContextInfo(ctx, host)))
    .collect(Collectors.toList());
This would be my version of your code using JDK 8 streams, method references, and lambda expressions:
Arrays.stream(server.findServices())
    .map(Service::getContainer)
    .filter(Engine.class::isInstance)
    .map(Engine.class::cast)
    .flatMap(engine -> Arrays.stream(engine.findChildren()))
    .filter(Host.class::isInstance)
    .map(Host.class::cast)
    .flatMap(host -> Arrays.stream(host.findChildren()))
    .filter(Context.class::isInstance)
    .map(Context.class::cast)
    .map(context -> {
        ContextInfo info = new ContextInfo(context.getPath());
        info.setThisPart(context.getThisPart());
        info.setNotImportant(context.getNotImportant());
        return info;
    })
    .collect(Collectors.toList());
In this approach, I replace your if-statements with filter predicates. Take into account that an instanceof check can be replaced with a Predicate<T>:
Predicate<Object> isEngine = someObject -> someObject instanceof Engine;
which can also be expressed as
Predicate<Object> isEngine = Engine.class::isInstance;
Similarly, your casts can be replaced by Function<T,R>.
Function<Object,Engine> castToEngine = someObject -> (Engine) someObject;
Which is pretty much the same as
Function<Object,Engine> castToEngine = Engine.class::cast;
And adding items manually to a list in the for loop can be replaced with a collector. In production code, the lambda that transforms a Context into a ContextInfo can (and should) be extracted into a separate method, and used as a method reference.
Solution to bonus challenge
Inspired by @EdwinDalorzo's answer.
public List<ContextInfo> list() {
    final StandardServer server = getServer();
    return Arrays.stream(server.findServices())
        .map(Service::getContainer)
        .filter(Engine.class::isInstance)
        .map(Engine.class::cast)
        .flatMap(engine -> Arrays.stream(engine.findChildren()))
        .filter(Host.class::isInstance)
        .map(Host.class::cast)
        .flatMap(host -> mapContainers(
            Arrays.stream(host.findChildren()), host.getName())
        )
        .collect(Collectors.toList());
}
private static Stream<ContextInfo> mapContainers(Stream<Container> containers,
                                                 String hostname) {
    return containers
        .filter(Context.class::isInstance)
        .map(Context.class::cast)
        .map(context -> {
            ContextInfo info = new ContextInfo(context.getPath());
            info.setThisPart(context.getThisPart());
            info.setNotImportant(context.getNotImportant());
            info.setHostname(hostname); // The Bonus Challenge
            return info;
        });
}
First attempt beyond ugly. It will be years before I find this readable. Has to be a better way.
Note the findChildren methods return arrays, which of course work with the for (N n : array) syntax, but not with the new Iterable.forEach method. I had to wrap them with Arrays.asList.
public List<ContextInfo> list() {
    final List<ContextInfo> list = new ArrayList<ContextInfo>();
    final StandardServer server = getServer();
    asList(server.findServices()).forEach(service -> {
        if (!(service.getContainer() instanceof Engine)) return;
        final Engine engine = (Engine) service.getContainer();
        instanceOf(Host.class, asList(engine.findChildren())).forEach(host -> {
            instanceOf(Context.class, asList(host.findChildren())).forEach(context -> {
                // copy to another object -- not the important part
                final ContextInfo info = new ContextInfo(context.getPath());
                info.setThisPart(context.getThisPart());
                info.setNotImportant(context.getNotImportant());
                list.add(info);
            });
        });
    });
    return list;
}
The utility methods
public static <T> Iterable<T> instanceOf(final Class<T> type, final Collection collection) {
    final Iterator iterator = collection.iterator();
    return () -> new SlambdaIterator<>(() -> {
        while (iterator.hasNext()) {
            final Object object = iterator.next();
            if (object != null && type.isAssignableFrom(object.getClass())) {
                return (T) object;
            }
        }
        throw new NoSuchElementException();
    });
}
And finally, a Lambda-powerable implementation of Iterator:
public static class SlambdaIterator<T> implements Iterator<T> {

    // Ya put your Lambdas in there
    public static interface Advancer<T> {
        T advance() throws NoSuchElementException;
    }

    private final Advancer<T> advancer;
    private T next;

    protected SlambdaIterator(final Advancer<T> advancer) {
        this.advancer = advancer;
    }

    @Override
    public boolean hasNext() {
        if (next != null) return true;
        try {
            next = advancer.advance();
            return next != null;
        } catch (final NoSuchElementException e) {
            return false;
        }
    }

    @Override
    public T next() {
        if (!hasNext()) throw new NoSuchElementException();
        final T v = next;
        next = null;
        return v;
    }

    @Override
    public void remove() {
        throw new UnsupportedOperationException();
    }
}
Lots of plumbing and no doubt 5x the byte code. Must be a better way.

Java 8 Iterating Stream Operations

I want to perform a stream where the output from the stream is then used as the source for the same stream, in the same operation.
I currently perform this sort of operation using a queue; I remove an item, process it, and add any results that need further processing back to the queue. Here are two examples of this sort of thing:
// Queue is an interface; ArrayDeque is one concrete choice
Queue<WorkItem> workQueue = new ArrayDeque<>(workToDo);
while (!workQueue.isEmpty()) {
    WorkItem item = workQueue.remove();
    item.doOneWorkUnit();
    if (!item.isDone()) workQueue.add(item);
}
Queue<Node> nodes = new ArrayDeque<>(rootNodes);
while (!nodes.isEmpty()) {
    Node node = nodes.remove();
    process(node);
    nodes.addAll(node.children());
}
I would imagine that the first could be performed concurrently like this:
try {
    LinkedBlockingQueue<WorkItem> workQueue = new LinkedBlockingQueue<>();
    Stream<WorkItem> reprocess = Stream.generate(() -> workQueue.remove()).parallel();
    Stream.concat(workToDo.parallelStream(), reprocess)
        .filter(item -> { item.doOneWorkUnit(); return !item.isDone(); })
        .collect(Collectors.toCollection(() -> workQueue));
} catch (NoSuchElementException e) {}
And the second as:
try {
    LinkedBlockingQueue<Node> reprocessQueue = new LinkedBlockingQueue<>();
    Stream<Node> reprocess = Stream.generate(() -> reprocessQueue.remove()).parallel();
    Stream.concat(rootNodes.parallelStream(), reprocess)
        .filter(item -> { process(item); return true; })
        .flatMap(node -> node.children().parallelStream())
        .collect(Collectors.toCollection(() -> reprocessQueue));
} catch (NoSuchElementException e) {}
However, these feel like kludgy workarounds, and I dislike having to resort to using exceptions. Does anyone have a better way to do this sort of thing?
To make the work parallel, I would use a standard java.util.concurrent.Executor. To return a task to the work queue, add executor.execute(this) at the end of each task's code.
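A minimal sketch of that pattern, assuming the question's WorkItem API (the pool size and the usage lines are my own simplifications):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

class WorkTask implements Runnable {
    private final WorkItem item;
    private final ExecutorService executor;

    WorkTask(WorkItem item, ExecutorService executor) {
        this.item = item;
        this.executor = executor;
    }

    @Override
    public void run() {
        item.doOneWorkUnit();
        if (!item.isDone()) {
            // put this task back on the executor's queue for another unit
            executor.execute(this);
        }
    }
}

// usage:
// ExecutorService executor = Executors.newFixedThreadPool(4);
// workToDo.forEach(item -> executor.execute(new WorkTask(item, executor)));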
