I have a Set with items, and I want to send it off for parallel processing.
However, I want to modify the original set afterwards, which would cause concurrency issues, so I think it would be best to take a snapshot of the Set and send that for processing.
Will clone() work well?
Or should I make a new Set of it myself?
Or is there some nice way I'm missing?
Edit: I'm now using this; it seems to work pretty nicely:
public class BufferedHashSet<E> extends HashSet<E> {
private List<E> toAdd = new LinkedList<E>();
private List<Object> toRemove = new LinkedList<Object>();
@Override
public boolean add(E e)
{
synchronized (this) {
toAdd.add(e);
return true;
}
}
@Override
public boolean remove(Object e)
{
synchronized (this) {
toRemove.add(e);
return true;
}
}
public void flush()
{
synchronized (this) {
for (E e : toAdd) {
super.add(e);
}
for (Object e : toRemove) {
super.remove(e);
}
toAdd.clear();
toRemove.clear();
}
}
}
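For reference, a minimal usage sketch of the class above (the worker/owner split is my own reading of the intent, not from the original post):
BufferedHashSet<String> set = new BufferedHashSet<>();
// ... fill the set and flush() before handing it out to the workers ...

// Worker threads may iterate the set and call add()/remove(): structural changes are only
// buffered, so the backing HashSet is never modified while the workers are running.

// Once all workers have finished, apply the buffered changes on the owning thread:
set.flush();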
In my opinion the most elegant solution is to use the Set.addAll() method:
Set set;
Set snapshot = new TreeSet<>(); //or any Set implementation you use
snapshot.addAll(set);
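For what it's worth, the copy constructor achieves the same in one step (a small sketch, shown with String elements, assuming a HashSet snapshot is acceptable):
Set<String> snapshot = new HashSet<>(set); // copies the current contents into a new, independent set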
I just started playing around with Java 8 and lambda expressions, and I am curious whether I can stop the stream generation from inside the lambda expression by returning a specific value
(like null). Is this possible with Stream.generate()?
private int counter;
private void generate()
{
System.out.println(Stream.generate(() -> {
if (counter < 10) {
counter++;
return RandomUtils.nextInt(100);
} else {
return null;
}
}).count());
}
Unfortunately this code does not terminate, so simply returning null does not stop the stream.
Java 9 and later includes this method:
Stream<T> takeWhile(Predicate<? super T> predicate);
to limit a stream by a condition, so the workaround below is no longer needed.
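A small sketch of what that looks like on Java 9+ (the counter-based supplier mirrors the question's example; the exact names are mine):
AtomicInteger counter = new AtomicInteger();
long count = Stream.generate(counter::getAndIncrement)
        .takeWhile(i -> i < 10)   // stops the otherwise infinite stream
        .count();
System.out.println(count);        // prints 10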
Original answer (for Java versions earlier than 9):
With Stream.generate this is by definition not possible from within a lambda closure; the generated stream is endless. Using limit() you are able to make your stream fixed-size, but this will not help you with conditions like:
if random>10 then stop
There is a way to limit a potentially endless stream by a condition. This is useful if you do not know the size in advance. Your friend here is a Spliterator, and your sample code would look like:
System.out.println( StreamSupport.stream(Spliterators.spliteratorUnknownSize(new Iterator<Integer>() {
int counter = 0;
@Override
public boolean hasNext() {
return counter < 10;
}
@Override
public Integer next() {
counter++;
return RandomUtils.nextInt(100);
}
}, Spliterator.IMMUTABLE), false).count());
Basically you are able to build a Stream from an Iterator. I am using this construct, for example, for a stream of XMLEvents from StAX XML parsing.
I know this is not done with lambda constructs, but IMHO it works around the missing ability to stop stream item generation by condition.
I would be very interested if there is a better way to achieve this (I mean this stream construct, not the XML processing ;)) or if there is a fundamental flaw in using streams in this way.
This is not possible with lambdas; you cannot control the flow from inside the expression.
Even the API docs say that Stream.generate generates an infinite stream.
However, you can limit the Stream and achieve the desired functionality simply by using the limit() method:
System.out.println(Stream.generate(() -> RandomUtils.nextInt(100)).limit(10).count());
If you are not looking for parallelism, you can use the following method:
public static <T> Stream<T> breakStream(Stream<T> stream, Predicate<T> terminate) {
final Iterator<T> original = stream.iterator();
Iterable<T> iter = () -> new Iterator<T>() {
T t;
boolean hasValue = false;
@Override
public boolean hasNext() {
if (hasValue) {
return true;
}
if (!original.hasNext()) {
return false;
}
t = original.next();
if (terminate.test(t)) {
return false;
}
hasValue = true;
return true;
}
@Override
public T next() {
if (!hasValue && !hasNext()) {
throw new NoSuchElementException();
}
hasValue = false;
return t;
}
};
return StreamSupport.stream(iter.spliterator(), false);
}
Use StreamSupport.stream(Spliterator, boolean)
See the JavaDoc on Spliterator.
Here is an example spliterator:
public class GeneratingSpliterator<T> implements Spliterator<T>
{
private Supplier<T> supplier;
private Predicate<T> predicate;
public GeneratingSpliterator(final Supplier<T> newSupplier, final Predicate<T> newPredicate)
{
supplier = newSupplier;
predicate = newPredicate;
}
@Override
public int characteristics()
{
return 0;
}
@Override
public long estimateSize()
{
return Long.MAX_VALUE;
}
@Override
public boolean tryAdvance(final Consumer<? super T> action)
{
T newObject = supplier.get();
boolean ret = predicate.test(newObject);
if(ret) action.accept(newObject);
return ret;
}
@Override
public Spliterator<T> trySplit()
{
return null;
}
}
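A possible way to wire this spliterator into a stream (the supplier and the stop predicate here are my own illustration, and ThreadLocalRandom replaces the answer's RandomUtils to keep the snippet self-contained):
Stream<Integer> stream = StreamSupport.stream(
        new GeneratingSpliterator<Integer>(
                () -> ThreadLocalRandom.current().nextInt(100), // supplier of new elements
                i -> i < 90),                                   // keep generating while this holds
        false);                                                 // sequential, not parallel
System.out.println(stream.count());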
This is another solution for Java 8 (it needs a Stream.Builder; maybe it is not optimal, but it is quite simple):
@SuppressWarnings("ResultOfMethodCallIgnored")
public static <T> Stream<T> streamBreakable(Stream<T> stream, Predicate<T> stopCondition) {
Stream.Builder<T> builder = Stream.builder();
stream.map(t -> {
boolean stop = stopCondition.test(t);
if (!stop) {
builder.add(t);
}
return stop;
})
.filter(result -> result)
.findFirst();
return builder.build();
}
And the test:
@Test
public void shouldStop() {
AtomicInteger count = new AtomicInteger(0);
Stream<Integer> stream = Stream.generate(() -> {
if (count.getAndIncrement() < 10) {
return (int) (Math.random() * 100);
} else {
return null;
}
});
List<Integer> list = streamBreakable(stream, Objects::isNull)
.collect(Collectors.toList());
System.out.println(list);
}
It is possible, you just need to think outside the box.
The following idea is borrowed from Python, the language which introduced me to generator functions...
Just throw an instance of RuntimeException when you are done from within the Supplier<T> closure and catch-and-ignore it at the call site.
An example excerpt (note I have added a safety catch of Stream.limit(Long.MAX_VALUE) to cover the unexpected, though it should never be triggered):
static <T> Stream<T> read(String path, FieldSetMapper<T> fieldSetMapper) throws IOException {
ClassPathResource resource = new ClassPathResource(path);
DefaultLineMapper<T> lineMapper = new DefaultLineMapper<>();
lineMapper.setFieldSetMapper(fieldSetMapper);
lineMapper.setLineTokenizer(getTokenizer(resource));
return Stream.generate(new Supplier<T>() {
FlatFileItemReader<T> itemReader = new FlatFileItemReader<>();
int line = 1;
{
itemReader.setResource(resource);
itemReader.setLineMapper(lineMapper);
itemReader.setRecordSeparatorPolicy(new DefaultRecordSeparatorPolicy());
itemReader.setLinesToSkip(1);
itemReader.open(new ExecutionContext());
}
@Override
public T get() {
T item = null;
++line;
try {
item = itemReader.read();
if (item == null) {
throw new StopIterationException();
}
} catch (StopIterationException ex) {
throw ex;
} catch (Exception ex) {
LOG.log(WARNING, ex,
() -> format("%s reading line %d of %s", ex.getClass().getSimpleName(), line, resource));
}
return item;
}
}).limit(Long.MAX_VALUE).filter(Objects::nonNull);
}
static class StopIterationException extends RuntimeException {}
public void init() {
if (repository.count() == 0) {
Level logLevel = INFO;
try {
read("providers.csv", fields -> new Provider(
fields.readString("code"),
fields.readString("name"),
LocalDate.parse(fields.readString("effectiveStart"), DateTimeFormatter.ISO_LOCAL_DATE),
LocalDate.parse(fields.readString("effectiveEnd"), DateTimeFormatter.ISO_LOCAL_DATE)
)).forEach(repository::save);
} catch (IOException e) {
logLevel = WARNING;
LOG.log(logLevel, "Initialization was interrupted");
} catch (StopIterationException ignored) {}
LOG.log(logLevel, "{} providers imported.", repository.count());
}
}
My solution was to generate a null when done and then apply a filter:
Stream
.generate(() -> newObject())
.filter(o -> o != null)
.forEach(...)
I am trying to use a ConcurrentSkipListMap. I had problems with how to use a synchronized LinkedHashMap correctly, so I decided to give ConcurrentSkipListMap a try.
I have the same sort of problem: the unit test below fails because, when I get the entry set, it contains null values even though size() indicates that the map is not empty. As far as I can tell, all access to the map is synchronized.
I would have thought one would not need to do this (synchronize), since this is a concurrent map.
The server just puts the numbers 0, 1, 2, 3, ... into the map, keeping its size below a threshold. It tries to put one number in for each millisecond that has passed since the server was started.
Any pointers will be appreciated.
Thanks.
import static org.junit.Assert.*;
import java.util.*;
import java.util.Map.Entry;
import java.util.concurrent.ConcurrentSkipListMap;
import org.junit.*;
class DummyServer implements Runnable {
DummyServer(int pieces) {
t0=System.currentTimeMillis();
this.pieces=pieces;
max=pieces;
lruMap=new ConcurrentSkipListMap<Long,Long>();
}
Set<Map.Entry<Long,Long>> entrySet() {
Set<Entry<Long,Long>> entries=null;
synchronized(lruMap) {
entries=Collections.unmodifiableSet(lruMap.entrySet());
}
return entries;
}
Set<Long> keySet() {
Set<Long> entries=null;
synchronized(lruMap) {
entries=Collections.unmodifiableSet(lruMap.keySet());
}
return entries;
}
@Override public void run() {
int n=0;
while(piece<stopAtPiece) {
long target=piece(System.currentTimeMillis()-t0);
long n0=piece;
for(;piece<target;piece++,n++)
put(piece);
if(n>max+max/10) {
Long[] keys=keySet().toArray(new Long[0]);
synchronized(lruMap) {
for(int i=0;n>max;i++,n--)
lruMap.remove(keys[i]);
}
}
try {
Thread.sleep(10);
} catch(InterruptedException e) {
e.printStackTrace();
break;
}
}
}
private void put(long piece) {
synchronized(lruMap) {
lruMap.put(piece,piece);
}
}
public long piece() {
return piece;
}
public Long get(long piece) {
synchronized(lruMap) {
return lruMap.get(piece);
}
}
public int size() {
synchronized(lruMap) {
return lruMap.size();
}
}
public long piece(long dt) {
return dt/period*pieces+dt%period*pieces/period;
}
private long piece;
int period=2000;
private volatile Map<Long,Long> lruMap;
public final long t0;
protected final int pieces;
public final int max;
public long stopAtPiece=Long.MAX_VALUE;
}
public class DummyServerTestCase {
void checkMap(Long n) {
if(server.size()>0) {
final Set<Map.Entry<Long,Long>> mapValues=server.entrySet();
@SuppressWarnings("unchecked") final Map.Entry<Long,Long>[] entries=new Map.Entry[mapValues.size()];
mapValues.toArray(entries);
try {
if(entries[0]==null)
System.out.println(server.piece());
assertNotNull(entries[0]);
} catch(Exception e) {
fail(e.toString());
}
}
}
@Test public void testRunForFirstIsNotZero() {
server.stopAtPiece=1*server.pieces;
Thread thread=new Thread(server);
thread.start();
while(thread.isAlive()) {
for(long i=0;i<server.piece();i++) {
server.get(i);
Thread.yield();
checkMap(server.piece());
Thread.yield();
}
}
}
DummyServer server=new DummyServer(1000);
}
The problem is that you are performing
final Map.Entry<Long,Long>[] entries=new Map.Entry[mapValues.size()]; // size>0
mapValues.toArray(entries); // size is 0.
Between creating the array and calling toArray you are clearing the map.
If you take a copy using the Iterator you will not get this race condition.
void checkMap(Long n) {
final Set<Map.Entry<Long, Long>> mapValues = server.entrySet();
Set<Map.Entry<Long, Long>> entries = new LinkedHashSet<>(mapValues);
for (Entry<Long, Long> entry : entries) {
assertNotNull(entry);
}
}
or
void checkMap(Long n) {
for (Entry<Long, Long> entry : server.entrySet())
assertNotNull(entry);
}
First, you shouldn't ever have to synchronize on a thread-safe collection implementation unless you need some compound operation; ConcurrentMap offers good atomic compound operations, so even then you shouldn't have to.
Second, you should never rely on the size method being exact while concurrent operations are in progress. The javadoc notes:
Beware that, unlike in most collections, the size method is not a
constant-time operation. Because of the asynchronous nature of these
maps, determining the current number of elements requires a traversal
of the elements.
The size can be different from when you start the invocation to when it returns.
In short, your test isn't a valid concurrent test. Can you elaborate on what you're trying to achieve?
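To illustrate the point about atomic compound operations, here are a few that the ConcurrentMap interface already provides (my own examples, not taken from the question's code):
ConcurrentMap<Long, Long> map = new ConcurrentSkipListMap<>();
map.putIfAbsent(1L, 100L);            // insert only if the key is absent, atomically
map.computeIfAbsent(2L, k -> k * 10); // compute and insert in a single atomic step (Java 8+)
map.replace(1L, 100L, 200L);          // compare-and-set style replace
map.remove(1L, 200L);                 // remove only if still mapped to this value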
It's useful to me to have a data structure in Java that has all the functionality of a List, but has a maximum storage capacity and drops older data when newer data is added. Conceivably, at some point I might want to implement a fixed-size Queue which keeps a more general ordering of the data and drops the old data lowest in that ordering, but that's for the future.
At the moment I'm implementing it like this:
public class FixedSizeList<T> {
private final int maxSize;
private final LinkedList<T> list = new LinkedList<T>();
public FixedSizeList(int maxSize) {
this.maxSize = maxSize < 0 ? 0 : maxSize;
}
public T add(T t) {
list.add(t);
return list.size() > maxSize ? list.remove() : null;
}
// add remaining methods...
}
Is there either (a) an existing data structure that serves my needs, or (b) a better way of implementing this data structure?
I would use an array and two indexes for the head and the tail of the list. Make sure that head is always < tail and you're safe.
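A minimal sketch of that ring-buffer idea (entirely my own illustration; with a fixed array, "dropping the oldest" just means overwriting the slot the head index points to):
class RingBuffer<T> {
    private final Object[] data;
    private int head;   // index of the oldest element
    private int size;

    RingBuffer(int capacity) {
        data = new Object[capacity];
    }

    void add(T t) {
        int tail = (head + size) % data.length;
        data[tail] = t;
        if (size < data.length) {
            size++;                           // still filling up
        } else {
            head = (head + 1) % data.length;  // full: the oldest element was just overwritten
        }
    }

    @SuppressWarnings("unchecked")
    T get(int i) {                            // i = 0 is the oldest remaining element
        return (T) data[(head + i) % data.length];
    }

    int size() {
        return size;
    }
}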
Here's a List with a size limit, based on Guava's ForwardingList:
A list which forwards all its method
calls to another list. Subclasses
should override one or more methods to
modify the behavior of the backing
list as desired per the decorator
pattern.
Guava has base classes like this for all JDK-5 Collection types. Each of them fulfills the same purpose: making it easy to add value, while delegating all default functionality to the underlying collection.
public class LimitingList<E> extends ForwardingList<E> {
private final class LimitingListIterator extends ForwardingListIterator<E> {
private final ListIterator<E> innerListIterator;
private LimitingListIterator(final ListIterator<E> innerListIterator) {
this.innerListIterator = innerListIterator;
}
/**
* {@inheritDoc}
*/
@Override
public void add(final E element) {
if (inner.size() < maxSize)
innerListIterator.add(element);
else
throw new IndexOutOfBoundsException();
}
@Override
protected ListIterator<E> delegate() {
return innerListIterator;
}
}
public LimitingList(final int maxSize) {
this(new ArrayList<E>(), maxSize);
}
public LimitingList(final List<E> inner, final int maxSize) {
super();
this.inner = inner;
this.maxSize = maxSize;
}
@Override
public boolean addAll(final Collection<? extends E> collection) {
boolean changed = false;
for (final E item : collection) {
final boolean tmpChanged = add(item);
changed = changed || tmpChanged;
if (!tmpChanged)
break;
}
return changed;
}
@Override
public boolean add(final E e) {
if (inner.size() < maxSize)
return super.add(e);
else
return false;
}
@Override
public ListIterator<E> listIterator() {
return new LimitingListIterator(inner.listIterator());
}
@Override
public void add(final int index, final E element) {
throw new UnsupportedOperationException();
}
@Override
public boolean addAll(final int index, final Collection<? extends E> elements) {
throw new UnsupportedOperationException();
}
@Override
public ListIterator<E> listIterator(final int index) {
return new LimitingListIterator(inner.listIterator(index));
}
private final int maxSize;
private final List<E> inner;
@Override
protected List<E> delegate() {
return inner;
}
}
It delegates all real functionality to an underlying list, which is an ArrayList by default (single-argument constructor), but you can also supply your own backing list (two-argument constructor).
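A possible usage example for the class above (illustrative only):
List<String> recent = new LimitingList<>(3);
recent.add("a");
recent.add("b");
recent.add("c");
boolean added = recent.add("d"); // false: the list is already at its maximum size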
Unless you want to use an actual array, I don't believe there is a list-type data structure you can use.
Personally, I would extend one of the existing list classes to get the functionality and override the add methods. This way you get all the other list operations for free, i.e. something like the following...
public class FixedSizeArrayList<T> extends ArrayList<T> {
private final int maxSize;
public FixedSizeArrayList(int maxSize) {
super();
this.maxSize = maxSize;
}
public boolean add(T t) {
if (size() >= maxSize) {
remove(0);
}
return super.add(t);
}
// implementation of remaining add methods....
}
If you extend the LinkedList class you will have direct access to all its methods. Instead of having to write stuff like
fixedList.getList().pop()
you could just write
fixedList.pop()
You could then override the methods where you need to enforce the maxSize limit.
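A rough sketch of that suggestion (untested and with names of my own choosing; note that LinkedList has several other add/offer/push methods that a complete implementation would also need to override):
public class FixedSizeLinkedList<T> extends LinkedList<T> {
    private final int maxSize;

    public FixedSizeLinkedList(int maxSize) {
        this.maxSize = maxSize;
    }

    @Override
    public boolean add(T t) {
        boolean added = super.add(t);
        while (size() > maxSize) {
            removeFirst(); // drop the oldest element
        }
        return added;
    }
}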
Maybe this is a silly question, but I cannot seem to find an obvious answer.
I need a concurrent FIFO queue that contains only unique values. Attempting to add a value that already exists in the queue simply ignores that value. Which, if not for the thread safety, would be trivial. Is there a data structure in Java, or maybe a code snippet on the interwebs, that exhibits this behavior?
If you want better concurrency than full synchronization, there is one way I know of to do it, using a ConcurrentHashMap as the backing map. The following is a sketch only.
public final class ConcurrentHashSet<E> extends ForwardingSet<E>
implements Set<E>, Queue<E> {
private enum Dummy { VALUE }
private final ConcurrentMap<E, Dummy> map;
ConcurrentHashSet(ConcurrentMap<E, Dummy> map) {
super(map.keySet());
this.map = Preconditions.checkNotNull(map);
}
@Override public boolean add(E element) {
return map.put(element, Dummy.VALUE) == null;
}
@Override public boolean addAll(Collection<? extends E> newElements) {
// just the standard implementation
boolean modified = false;
for (E element : newElements) {
modified |= add(element);
}
return modified;
}
@Override public boolean offer(E element) {
return add(element);
}
@Override public E remove() {
E polled = poll();
if (polled == null) {
throw new NoSuchElementException();
}
return polled;
}
@Override public E poll() {
for (E element : this) {
// Not convinced that removing via iterator is viable (check this?)
if (map.remove(element) != null) {
return element;
}
}
return null;
}
@Override public E element() {
return iterator().next();
}
@Override public E peek() {
Iterator<E> iterator = iterator();
return iterator.hasNext() ? iterator.next() : null;
}
}
All is not sunshine with this approach. We have no decent way to select a head element other than using the backing map's entrySet().iterator().next(), the result being that the map gets more and more unbalanced as time goes on. This unbalancing is a problem both due to greater bucket collisions and greater segment contention.
Note: this code uses Guava in a few places.
There's no built-in collection that does this. There are some concurrent Set implementations that could be used together with a concurrent Queue.
For example, an item is added to the queue only after it was successfully added to the set, and each item removed from the queue is removed from the set. In this case, the contents of the queue, logically, are really whatever is in the set, and the queue is just used to track the order and provide efficient take() and poll() operations found only on a BlockingQueue.
I would use a synchronized LinkedHashSet until there is enough justification to consider alternatives. The primary benefit a more concurrent solution could offer is lock splitting.
The simplest concurrent approach would be a ConcurrentHashMap (acting as a set) and a ConcurrentLinkedQueue. The ordering of operations provides the desired constraint: an offer() first performs a CHM#putIfAbsent() and, if successful, inserts into the CLQ; a poll() takes from the CLQ and then removes the element from the CHM. In other words, an entry is considered to be in our queue if it is in the map, and the CLQ provides the ordering. Performance can then be tuned by increasing the map's concurrencyLevel. If you are tolerant of additional raciness, a cheap CHM#get() could act as a reasonable precondition (but it can suffer from a slightly stale view).
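A minimal sketch of that pairing (class and method names are mine; note the small window in offer() where an element is already registered in the map but not yet visible in the queue):
public class UniqueConcurrentQueue<E> {
    private final ConcurrentMap<E, Boolean> seen = new ConcurrentHashMap<>();
    private final ConcurrentLinkedQueue<E> queue = new ConcurrentLinkedQueue<>();

    public boolean offer(E e) {
        if (seen.putIfAbsent(e, Boolean.TRUE) == null) { // first writer wins
            queue.add(e);
            return true;
        }
        return false;                                    // duplicate: silently ignored
    }

    public E poll() {
        E e = queue.poll();
        if (e != null) {
            seen.remove(e);                              // the value may be queued again later
        }
        return e;
    }
}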
A java.util.concurrent.ConcurrentLinkedQueue gets you most of the way there.
Wrap the ConcurrentLinkedQueue with your own class that checks for the uniqueness of an add. Your code has to be thread safe.
What do you mean by a concurrent queue with Set semantics? If you mean a truly concurrent structure (as opposed to a thread-safe structure) then I would contend that you are asking for a pony.
What happens, for instance, if you call put(element) and detect that something is already there, but it is immediately removed afterwards? For instance, what does it mean in your case if offer(element) || queue.contains(element) returns false?
These kinds of things often need to be thought about slightly differently in a concurrent world, as often nothing is as it seems unless you stop the world (lock it down). Otherwise you are usually looking at something in the past. So, what are you actually trying to do?
Perhaps extend ArrayBlockingQueue. In order to get access to the (package-access) lock, I had to put my sub-class within the same package. Caveat: I haven't tested this.
package java.util.concurrent;
import java.util.Collection;
import java.util.concurrent.locks.ReentrantLock;
public class DeDupingBlockingQueue<E> extends ArrayBlockingQueue<E> {
public DeDupingBlockingQueue(int capacity) {
super(capacity);
}
public DeDupingBlockingQueue(int capacity, boolean fair) {
super(capacity, fair);
}
public DeDupingBlockingQueue(int capacity, boolean fair, Collection<? extends E> c) {
super(capacity, fair, c);
}
@Override
public boolean add(E e) {
final ReentrantLock lock = this.lock;
lock.lock();
try {
if (contains(e)) return false;
return super.add(e);
} finally {
lock.unlock();
}
}
@Override
public boolean offer(E e) {
final ReentrantLock lock = this.lock;
lock.lock();
try {
if (contains(e)) return true;
return super.offer(e);
} finally {
lock.unlock();
}
}
@Override
public void put(E e) throws InterruptedException {
final ReentrantLock lock = this.lock;
lock.lockInterruptibly(); //Should this be lock.lock() instead?
try {
if (contains(e)) return;
super.put(e); //if it blocks, it does so without holding the lock.
} finally {
lock.unlock();
}
}
@Override
public boolean offer(E e, long timeout, TimeUnit unit) throws InterruptedException {
final ReentrantLock lock = this.lock;
lock.lock();
try {
if (contains(e)) return true;
return super.offer(e, timeout, unit); //if it blocks, it does so without holding the lock.
} finally {
lock.unlock();
}
}
}
A simple answer for a queue of unique objects could be as follows:
import java.util.concurrent.ConcurrentLinkedQueue;
public class FinalQueue {
class Bin {
private int a;
private int b;
public Bin(int a, int b) {
this.a = a;
this.b = b;
}
@Override
public int hashCode() {
return a * b;
}
public String toString() {
return a + ":" + b;
}
@Override
public boolean equals(Object obj) {
if (this == obj)
return true;
if (obj == null)
return false;
if (getClass() != obj.getClass())
return false;
Bin other = (Bin) obj;
if ((a != other.a) || (b != other.b))
return false;
return true;
}
}
private ConcurrentLinkedQueue<Bin> queue;
public FinalQueue() {
queue = new ConcurrentLinkedQueue<Bin>();
}
public synchronized void enqueue(Bin ipAddress) {
if (!queue.contains(ipAddress))
queue.add(ipAddress);
}
public Bin dequeue() {
return queue.poll();
}
public String toString() {
return "" + queue;
}
/**
* @param args
*/
public static void main(String[] args) {
FinalQueue queue = new FinalQueue();
Bin a = queue.new Bin(2,6);
queue.enqueue(a);
queue.enqueue(queue.new Bin(13, 3));
queue.enqueue(queue.new Bin(13, 3));
queue.enqueue(queue.new Bin(14, 3));
queue.enqueue(queue.new Bin(13, 9));
queue.enqueue(queue.new Bin(18, 3));
queue.enqueue(queue.new Bin(14, 7));
Bin x= queue.dequeue();
System.out.println(x.a);
System.out.println(queue.toString());
System.out.println("Dequeue..." + queue.dequeue());
System.out.println("Dequeue..." + queue.dequeue());
System.out.println(queue.toString());
}
}
The snippet below is C#; does Java have an equivalent of C#'s SelectMany? I need it for a Matches-SearchTerm-Files relationship.
foreach(var i in BunchOfItems.SelectMany(k => k.Items)) {}
[Why not for-loops?]
I have built such structures with nested for loops, but they soon become bloated, so I would prefer something more succinct, like the above.
public static Stack<Integer[]> getPrintPoss(String s, List<File> files, Integer maxViewsPerF)
{
Stack<Integer[]> possPrint = new Stack<Integer[]>();
Integer[] poss = new Integer[4];
int u, size;
for (File f : files)
{
size = f2S(f).length(); // f2S(...) is a helper from the original post
u = Math.min(maxViewsPerF, size);
for (int i = 0; i < u; i++)
{
// Do something --- bloated, and soon out of control;
// I want something more succinct
}
}
return possPrint;
}
for (List<Object> lo : list) {
for (Object o : lo) {
// etc etc
}
}
I don't think there's a simpler solution.
If you can get the data into an Iterable<Iterable<T>>, then you can get from that to a flattened Iterable<T> using Guava's Iterables.concat method. If what you have is really an Iterable<S>, with some way to get from an S to an Iterable<T>, well, then you have to first use Iterables.transform to view that as the Iterable<Iterable<T>> needed by concat.
All this will look a lot nicer if and when Java has something resembling closures, but at least today it's possible.
http://guava-libraries.googlecode.com
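Roughly, that combination looks like this (a sketch written against pre-closure Guava; Match, Item, matches and getItems() are placeholder names for your own types):
Iterable<Iterable<Item>> nested = Iterables.transform(matches,
        new Function<Match, Iterable<Item>>() {
            @Override
            public Iterable<Item> apply(Match match) {
                return match.getItems(); // S -> Iterable<T>
            }
        });

for (Item item : Iterables.concat(nested)) {
    // operate on each item of the flattened view
}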
With Java 8, you can say
Collection bunchOfItems = ...;
bunchOfItems.stream().flatMap(k -> k.getItems().stream()).forEach(i -> /* operate on i */);
or
Item[] bunchOfItems = ...;
Stream.of(bunchOfItems).flatMap(k -> k.getItems().stream()).forEach(i -> /* operate on i */);
depending upon whether you have a Collection or an Array.
Have about half a year of patience until JDK 7 is final, which will include closures. This will provide similar syntax and the same possibilities as the LINQ approach demonstrated in the answer you're talking about.
I have my own version, while waiting desperately for closures in Java:
public static <T, E> Iterable<T> transformMany(Iterable<E> iterable, Func<E, Iterable<T>> f) {
if (null == iterable)
throw new IllegalArgumentException("null iterable");
if (null == f)
throw new IllegalArgumentException("null f");
return new TransformManyIterable<E, T>(iterable, f);
}
public interface Func<E, T> {
T execute(E e);
}
public class TransformManyIterable<TOriginal, TResult> implements Iterable<TResult> {
private Iterable<TOriginal> iterable;
private Func<TOriginal, Iterable<TResult>> func;
public TransformManyIterable(Iterable<TOriginal> iterable,
Func<TOriginal, Iterable<TResult>> func) {
super();
this.iterable = iterable;
this.func = func;
}
class TransformIterator implements Iterator<TResult> {
private Iterator<TOriginal> iterator;
private Iterator<TResult> currentIterator;
public TransformIterator() {
iterator = iterable.iterator();
}
@Override
public boolean hasNext() {
if (currentIterator != null && currentIterator.hasNext())
return true;
else {
while (iterator.hasNext()) {
Iterable<TResult> iterable = func.execute(iterator.next());
if (iterable == null)
continue;
currentIterator = iterable.iterator();
if (currentIterator.hasNext())
return true;
}
}
return false;
}
@Override
public TResult next() {
if (currentIterator != null && currentIterator.hasNext())
return currentIterator.next();
else {
while (iterator.hasNext()) {
Iterable<TResult> iterable = func.execute(iterator.next());
if (iterable == null)
continue;
currentIterator = iterable.iterator();
if (currentIterator.hasNext())
return currentIterator.next();
}
}
throw new NoSuchElementException();
}
@Override
public void remove() {
throw new UnsupportedOperationException();
}
}
@Override
public Iterator<TResult> iterator() {
return new TransformIterator();
}
}
Usage:
Iterable<SomeType> result = transformMany(input, new Func<InputType, Iterable<SomeType>>() {
@Override
public Iterable<SomeType> execute(InputType e) {
return new ArrayList<SomeType>();
}
});
The SelectMany method is part of LINQ, which is .NET-specific. This question asks about a LINQ equivalent for Java. Unfortunately, it doesn't look like there is a direct equivalent.