What is the best way in Java to write a recursive function that gets all combinations of elements taken from several sets of candidates?
In general the number of candidate sets is not fixed, so a recursive solution seems appropriate for this task. As an example, for the given candidate sets
[1,2]
[3,4]
[5,6,7]
I should get 12 combinations:
[1,3,5] [2,3,5]
[1,4,5] [2,4,5]
[1,3,6] [2,3,6]
[1,4,6] [2,4,6]
[1,3,7] [2,3,7]
[1,4,7] [2,4,7]
The candidate sets are represented as a list of lists: List<List<T>>.
I encountered this same problem several years ago. I solved it by iterating through the result list with an odometer.
The number of wheels in the odometer is the number of input sets. The figures on each wheel are the members of the corresponding set. To get the next combination, roll the rightmost odometer wheel. If it has turned all the way around, roll the one to its left as well, and so on.
For example:
Wheel 0 values: [1,2]
Wheel 1 values: [3,4]
Wheel 2 values: [5,6,7]
Start with odometer reading (1,3,5). Advance to (1,3,6), (1,3,7). Then roll the next wheel as well, to (1,4,5), (1,4,6) and (1,4,7). Continue.
Odometer wheels as indices
Alternatively, you can represent the wheels as indices into the corresponding list.
Wheel 0 values: [0,1]
Wheel 1 values: [0,1]
Wheel 2 values: [0,1,2]
Start with odometer reading (0,0,0). Advance to (0,0,1), (0,0,2). Then roll the next wheel as well, to (0,1,0), (0,1,1) and (0,1,2). Continue. For each reading, translate to the result list by using the odometer wheel readings as indices into the input lists.
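For concreteness, here is a minimal sketch of this index-based variant, assuming the List<List<T>> representation from the question (the class and method names are illustrative, not part of any answer's original code):
import java.util.ArrayList;
import java.util.List;

public class IndexOdometer {
    // Produces every combination by treating an int[] of indices as odometer wheels.
    public static <T> List<List<T>> combinations(List<List<T>> candidates) {
        List<List<T>> result = new ArrayList<>();
        int[] wheels = new int[candidates.size()];   // every wheel starts at index 0 (assumes non-empty sets)
        while (true) {
            // Translate the current odometer reading into a combination.
            List<T> combination = new ArrayList<>();
            for (int w = 0; w < wheels.length; w++) {
                combination.add(candidates.get(w).get(wheels[w]));
            }
            result.add(combination);
            // Roll the rightmost wheel; carry to the left whenever a wheel wraps around.
            int w = wheels.length - 1;
            while (w >= 0 && ++wheels[w] == candidates.get(w).size()) {
                wheels[w] = 0;
                w--;
            }
            if (w < 0) {
                return result;   // every wheel wrapped: we are back at the starting reading
            }
        }
    }
}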
Odometer wheels as iterators
As another alternative, you can represent the wheels as iterators into the input collections. This is more general than the prior two approaches: it works even if the input collections are not accessible by index, and it scales well. This is the approach I used several years ago; a rough sketch follows.
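One way this iterator-based variant could look (a sketch, not the original implementation): each wheel keeps an Iterator over its collection plus the current element, and rolling a wheel means advancing its iterator and restarting it when it runs out.
import java.util.ArrayList;
import java.util.Collection;
import java.util.Iterator;
import java.util.List;

public class IteratorOdometer<T> {
    private final List<Collection<T>> sets;
    private final List<Iterator<T>> wheels = new ArrayList<>();
    private final List<T> current = new ArrayList<>();

    public IteratorOdometer(List<Collection<T>> sets) {
        this.sets = sets;
        for (Collection<T> set : sets) {
            Iterator<T> it = set.iterator();
            wheels.add(it);
            current.add(it.next());   // assumes every input collection is non-empty
        }
    }

    // Current odometer reading.
    public List<T> read() {
        return new ArrayList<>(current);
    }

    // Roll to the next reading; returns false once every wheel has wrapped around.
    public boolean roll() {
        for (int w = wheels.size() - 1; w >= 0; w--) {
            if (wheels.get(w).hasNext()) {
                current.set(w, wheels.get(w).next());
                return true;
            }
            // This wheel wrapped: restart its iterator and carry to the wheel on its left.
            Iterator<T> fresh = sets.get(w).iterator();
            wheels.set(w, fresh);
            current.set(w, fresh.next());
        }
        return false;
    }
}
Iterating is then a simple do { ... use read() ... } while (odometer.roll()); loop.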
The total number of combinations is the product of the sizes of the candidate sets. Each result set's size is equal to the number of candidate sets.
You don't need recursion for the solution. Just go through each candidate set. In this example, the first has two values, 1 and 2. The first 6 result sets (half of them) get the value 1; the next half get 2.
On to the next candidate set: it has two values, 3 and 4. This time, alternate assigning them in groups of 3 rather than 6. So the first 3 result sets get 3, the next 3 get 4, the next 3 get 3, and so on.
The next candidate set has three values: 5, 6, and 7. Now you'll be rotating which value you assign for each result set (rotating after every single assignment). If there were more candidate sets, or different numbers of values in them, the amount you assign before rotating to the next value would change, but you can figure that out programmatically.
You don't need recursion: just use the size of the list of sets and then the size of each set. You can also keep the result lists open to further additions, in case you get more sets to mix in later, if that's what you need.
Thank you all for your replies.
Andy Thomas, the odometer idea is quite interesting; I will give it a try a bit later. For now I've implemented it as ThatOneCloud suggested.
Here's what I've got (for Integer items; it can be generalized if needed):
public List<List<Integer>> makeCombinations(List<List<Integer>> candidates) {
List<List<Integer>> result = new ArrayList<List<Integer>>();
// calculate result size
int size = 1;
for (List<Integer> candidateSet : candidates)
size *= candidateSet.size();
// make result
for (int i = 0; i < size; i++)
result.add(new ArrayList<Integer>());
// fill result
int pos = 1;
for (List<Integer> candidateSet : candidates)
fillPosition(candidateSet, result, countRepeats(candidates, pos++));
// return
return result;
}
public int countRepeats(List<List<Integer>> candidates, int pos) {
int repeats = 1;
for (int i = pos; i < candidates.size(); i++)
repeats *= candidates.get(i).size();
return repeats;
}
public void fillPosition( List<Integer> candidateSet,
List<List<Integer>> result,
int repeats) {
int idx = 0;
while (idx < result.size()) {
for (int item : candidateSet) {
for (int i = 0; i < repeats; i++) {
result.get(idx++).add(item);
}
}
}
}
And here's another version (Odometer, as Andy Thomas suggested)
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
public class Odometer<T> implements Iterable<List<T>> {
private class Wheel {
List<T> values;
int idx = 0;
/**
* Create an odometer wheel from list of values
* @param v
*/
protected Wheel (List<T> v) {
if (v == null) throw new NullPointerException("can't create an instance of Wheel.class with null values");
if (v.isEmpty()) throw new IllegalArgumentException("can't create an instance of Wheel.class with no values");
this.values = v;
}
/**
* Get wheel value
* @return
*/
protected T value() {
return values.get(idx);
}
/**
* Switch the odometer wheel one step
* @return TRUE if the wheel has made a full cycle and switched back to the first item
*/
protected boolean next() {
if (idx >= values.size() - 1) {
idx = 0;
return true;
} else {
idx++;
return false;
}
}
}
/**
* list of wheels
*/
private List<Wheel> wheels = new ArrayList<Wheel>();
/**
* Create an odometer from several lists of values
* (each List<T> is a list of values for one odometer wheel)
* @param values
*/
public Odometer(List<List<T>> values) {
for (List<T> v : values)
wheels.add(new Wheel(v));
}
/**
* Get odometer value
* @return
*/
public List<T> get() {
List<T> result = new ArrayList<T>();
for (Wheel wheel : wheels) {
result.add(wheel.value());
}
return result;
}
/**
* Switch to next value
* @return TRUE if a full cycle has finished
*/
public boolean next() {
for (int i = wheels.size() - 1; i >= 0; i--)
if (!wheels.get(i).next()) return false;
return true;
}
/**
* Reset odometer
*/
public void reset() {
for (Wheel wheel : wheels)
wheel.idx = 0;
}
/**
* Iterator
*/
@Override
public Iterator<List<T>> iterator() {
reset();
Iterator<List<T>> it = new Iterator<List<T>>() {
private boolean last = false;
@Override
public boolean hasNext() {
return !last;
}
@Override
public List<T> next() {
List<T> result = get();
last = Odometer.this.next();
return result;
}
@Override
public void remove() {
throw new UnsupportedOperationException();
}
};
return it;
}
public static void main(String [] args) {
List<Integer> l1 = new ArrayList<Integer>(); l1.add(1); l1.add(2);
List<Integer> l2 = new ArrayList<Integer>(); l2.add(3); l2.add(4); l2.add(5);
List<Integer> l3 = new ArrayList<Integer>(); l3.add(6); l3.add(7);
List<List<Integer>> list = new ArrayList<List<Integer>>(); list.add(l1); list.add(l2); list.add(l3);
Odometer<Integer> odometer = new Odometer<Integer>(list);
for (List<Integer> value : odometer) {
System.out.println(value);
}
}
}
Related
I am working on a sharding problem.
Imagine I have 10 lists.
Each list has a series of items that are independently sorted.
I want to get the Nth item as if all the lists were sorted together in one large list.
Do I need to sort the lists overall to get an item at a particular index?
I solved a similar but not equivalent problem where there are:
10 lists
Each list represents a range of items that come after the previous list.
Here's the code to iterate through all the indexes of the lists:
/* code to iterate through all items in order
* threads refers to one of the lists */
int sizes[] = new int[threads.size()];
for (int i = 0 ; i < threads.size(); i++) {
sizes[i] = threads.get(i).data2.size();
}
int n = 0;
int thread = 0;
int size = threads.size();
int offset = 0;
long iterationStart = System.nanoTime();
while (thread < size) {
// System.out.println(String.format("%d %d", thread, offset + threads.get(thread).data.get(n)));
int current = offset + threads.get(thread).data.get(n);
n = n + 1;
if (n == sizes[thread]) {
offset += sizes[thread];
thread++;
n = 0;
}
}
long iterationEnd = System.nanoTime();
long iterationTime = iterationEnd - iterationStart;
Here's the code to look up an item by index.
long lookupStart = System.nanoTime();
int lookupKey = 329131;
int current = lookupKey;
int currentThread = 0;
int total = 0;
while (current >= 0 && currentThread <= size - 1) {
int next = current - sizes[currentThread];
if (next >= 0) {
total += sizes[currentThread];
current -= sizes[currentThread];
currentThread++;
} else {
break;
}
}
long lookupEnd = System.nanoTime();
long lookupTime = lookupEnd - lookupStart;
System.out.println(String.format("%d %d",
currentThread,
total + threads.get(currentThread).data.get(current)));
I'm hoping there's some property of sorted collections that I can use to retrieve the Nth item of the overall sorted order.
What I have, in effect, is multiple partial orders.
I have some other code that does an N-way merge between multiple sorted lists. Is the fastest option to run this in a loop up to lookupIndex?
int size1 = threads.size();
int[] positions = new int[size1];
Arrays.fill(positions, 0);
PriorityQueue<Tuple> pq = new PriorityQueue<>(new Comparator<Tuple>() {
@Override
public int compare(Tuple o1, Tuple o2) {
return o1.value.compareTo(o2.value);
}
});
long startOrderedIteration = System.nanoTime();
for (ShardedTotalRandomOrder thread : threads) {
for (int i = 0; i < 10; i++) {
// System.out.println(thread.data2.get(i));
pq.add(thread.data2.get(i));
}
}
List<Integer> overall = new ArrayList<>();
while (!pq.isEmpty()) {
Tuple poll = pq.poll();
ArrayList<Tuple> data2 = threads.get(poll.thread).data2;
if (positions[poll.thread] < data2.size()) {
Tuple nextValue = data2.get(positions[poll.thread]++);
pq.offer(nextValue);
}
overall.add(poll.value);
// System.out.println(String.format("%d %d", poll.thread, poll.value));
}
System.out.println(overall);
long endOrderedIteration = System.nanoTime();
long orderedIterationTime = endOrderedIteration - startOrderedIteration;
You don't need to re-sort them. Since each list is already sorted, you can merge them as follows. This uses a single method to merge two lists based on their relative values; it then returns that list and feeds it back into the method to merge it with the next list.
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
public class Merging {
public static void main(String[] args) {
List<Integer> list1 = List.of(5,10,15,20,25,30,35,40,45,50);
List<Integer> list2 = List.of(2,4,6,8,10);
List<Integer> list3 = List.of(1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20);
int nth = 10;
List<List<Integer>> lists = List.of(list1,list2,list3);
List<Integer> merged = lists.get(0);
for (int i = 1; i < lists.size(); i++) {
merged = mergeLists(merged, lists.get(i));
}
System.out.println(merged.get(nth));
}
prints
7
This works with any type that implements the Comparable interface.
It will loop until one list is exhausted or until both indices exceed the combined list size.
Once either list is finished, the other can be appended via the sublist.
public static <T extends Comparable<? super T>> List<T> mergeLists(List<T> list1, List<T> list2) {
List<T> merged = new ArrayList<>();
int i1 = 0;
int i2 = 0;
while (i1 + i2 < list1.size() + list2.size()) {
if (i1 >= list1.size()) {
merged.addAll(list2.subList(i2,list2.size()));
break;
}
if (i2 >= list2.size()) {
merged.addAll(list1.subList(i1,list1.size()));
break;
}
if(list1.get(i1).compareTo(list2.get(i2)) <= 0) {
merged.add(list1.get(i1++));
} else {
merged.add(list2.get(i2++));
}
}
return merged;
}
}
Here is a relatively efficient (linear with respect to the number of lists) algorithm that leverages some of the power of streams, but avoids a full list merge.
EDIT: To address shortcomings such as array length checking, array destruction, and readability I have improved this example. For better comparison, I have used the same integer test data as the other answer.
This virtual queue, backed by the (presumably) immutable list, will not mutate or otherwise alter the original data.
public class VirtualQueue<T> {
private List<T> list;
private int index=0;
public VirtualQueue(List<T> list) { this.list = list; }
public boolean hasMore() { return index < list.size(); }
public T pop() { return list.get(index++); }
public T peek() { return list.get(index);}
}
(I suspect that there is an easier way to do this with standard collections)
List<Integer> list1 = List.of(5,10,15,20,25,30,35,40,45,50);
List<Integer> list2 = List.of(2,4,6,8,10);
List<Integer> list3 = List.of(1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20);
List<VirtualQueue<Integer>> listList = List.of(
new VirtualQueue<>(list1),
new VirtualQueue<>(list2),
new VirtualQueue<>(list3));
int n=10;
var value = IntStream.range(0,n)
.mapToObj(i -> listList.stream()
.filter(VirtualQueue::hasMore)
.min(Comparator.comparing(l -> l.peek()))
.get().pop())
.skip(n-1).findFirst().get();
//value is now the nth item in a hypothetical merged list.
Assuming that you have k sorted Lists and you need to obtain the n-th element of the aggregated List (but the merged list itself isn't needed), this problem can be solved in O(n * log k) time, using O(k) additional space.
Note:
If the code below looks too involved, here's the rationale behind it. This solution is more performant than the straightforward comparison of elements from every list at each step, which can be seen in the other answers here and has time complexity O(n * k) (as opposed to O(n * log k)). A moderate amount of additional complexity is the cost of that performance gain, and note that the code is still maintainable.
In case you need to materialize the merged sorted list (the solution below does not do this), you can simply combine the lists and make use of the built-in Timsort implementation via List.sort(). Timsort is very good at spotting sorted runs, so sorting a list that consists of sorted chunks is close to linear in practice. For example:
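A minimal sketch of that materializing approach (the method name is illustrative; n is 1-based, matching getNElement() below):
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public static <T extends Comparable<T>> T getNElementBySorting(List<List<T>> lists, int n) {
    List<T> merged = new ArrayList<>();
    for (List<T> list : lists) {
        merged.addAll(list);                    // concatenate the k sorted runs
    }
    merged.sort(Comparator.naturalOrder());     // List.sort() delegates to Timsort
    return merged.get(n - 1);                   // 1-based n
}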
To address the problem in O(n * log k) time, we can maintain a PriorityQueue which would always have a size of k or less (therefore enqueue/dequeue operation would have a cost of O(log k)). At the beginning, the Queue should be initialized by adding the very first element from each List.
Then we need to perform n iterations (to find the target element). At each iteration step the head element of the Queue is removed, and the next element originating from the same list is added to the Queue (i.e. if, say, the 7th element from the 3rd List turns out to be the head of the Queue, then after removing it we need to enqueue the 8th element from the 3rd List).
In order to be able to track from which List each element is coming from and what was its index in the List, we can define a custom type:
public class ElementWrapper<V extends Comparable<V>> implements Comparable<ElementWrapper<V>> {
private V value;
private int listIndex;
private int elementIndex;
// all-args constructor, getters
public ElementWrapper(V value, int listIndex, int elementIndex) {
this.value = value;
this.listIndex = listIndex;
this.elementIndex = elementIndex;
}
public V getValue() { return value; }
public int getListIndex() { return listIndex; }
public int getElementIndex() { return elementIndex; }
@Override
public int compareTo(ElementWrapper<V> o) {
return value.compareTo(o.getValue());
}
}
And here's how this algorithm for finding the n-th element can be implemented. As I've said, the time complexity is O(n * log k), since we need n iteration steps, each with a cost of O(log k). Additional memory is required only to maintain a Queue of k elements.
public static <T extends Comparable<T>> T getNElement(List<List<T>> lists, int n) {
Queue<ElementWrapper<T>> queue = initializeWithFirstElements(lists);
T result = null;
int count = 1;
while (!queue.isEmpty()) {
ElementWrapper<T> current = queue.remove();
if (count == n) { // target index was reached
result = current.getValue();
break;
}
count++;
if (hasNext(current, lists)) {
addNext(current, lists, queue);
}
}
return result;
}
public static <T extends Comparable<T>> Queue<ElementWrapper<T>>
initializeWithFirstElements(List<List<T>> lists) {
Queue<ElementWrapper<T>> queue = new PriorityQueue<>();
for (int i = 0; i < lists.size(); i++) {
if (lists.get(i).isEmpty()) continue;
queue.add(new ElementWrapper<>(lists.get(i).get(0), i, 0));
}
return queue;
}
public static <T extends Comparable<T>> boolean
hasNext(ElementWrapper<T> current, List<List<T>> lists) {
return current.getElementIndex() + 1 < lists.get(current.getListIndex()).size();
}
public static <T extends Comparable<T>> void
addNext(ElementWrapper<T> current, List<List<T>> lists,
Queue<ElementWrapper<T>> queue) {
ElementWrapper<T> next = new ElementWrapper<>(
lists.get(current.getListIndex()).get(current.getElementIndex() + 1),
current.getListIndex(),
current.getElementIndex() + 1
);
queue.add(next);
}
Usage example:
public static void main(String[] args) {
List<List<Integer>> input =
List.of(List.of(1, 3), List.of(),
List.of(2, 6, 7), List.of(10), List.of(4, 5, 8, 9)
);
System.out.println(getNElement(input, 1));
System.out.println(getNElement(input, 3));
System.out.println(getNElement(input, 9));
}
Output:
1 // 1st
3 // 3rd
9 // 9th
Note: depending on how you want the n-th element to be indexed, the count variable in the getNElement() method should be initialized accordingly, i.e. with 1 if you want 1-based indexes, and with 0 if you want n to be 0-based.
I want to choose a random item from a set, but the chance of choosing any item should be proportional to the associated weight
Example inputs:
item weight
---- ------
sword of misery 10
shield of happy 5
potion of dying 6
triple-edged sword 1
So, if I have 4 possible items, the chance of getting any one item without weights would be 1 in 4.
In this case, a user should be 10 times more likely to get the sword of misery than the triple-edged sword.
How do I make a weighted random selection in Java?
I would use a NavigableMap
public class RandomCollection<E> {
private final NavigableMap<Double, E> map = new TreeMap<Double, E>();
private final Random random;
private double total = 0;
public RandomCollection() {
this(new Random());
}
public RandomCollection(Random random) {
this.random = random;
}
public RandomCollection<E> add(double weight, E result) {
if (weight <= 0) return this;
total += weight;
map.put(total, result);
return this;
}
public E next() {
double value = random.nextDouble() * total;
return map.higherEntry(value).getValue();
}
}
Say I have a list of animals dog, cat, horse with probabilities as 40%, 35%, 25% respectively
RandomCollection<String> rc = new RandomCollection<>()
.add(40, "dog").add(35, "cat").add(25, "horse");
for (int i = 0; i < 10; i++) {
System.out.println(rc.next());
}
There is now a class for this in Apache Commons: EnumeratedDistribution
Item selectedItem = new EnumeratedDistribution<>(itemWeights).sample();
where itemWeights is a List<Pair<Item, Double>>, like (assuming Item interface in Arne's answer):
final List<Pair<Item, Double>> itemWeights = new ArrayList<>();
for (Item i: itemSet) {
itemWeights.add(new Pair(i, i.getWeight()));
}
or in Java 8:
itemSet.stream().map(i -> new Pair(i, i.getWeight())).collect(toList());
Note: Pair here needs to be org.apache.commons.math3.util.Pair, not org.apache.commons.lang3.tuple.Pair.
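A self-contained sketch, assuming commons-math3 (3.6+) is on the classpath and using the example items from the question (the class name is illustrative):
import java.util.List;
import org.apache.commons.math3.distribution.EnumeratedDistribution;
import org.apache.commons.math3.util.Pair;

public class LootRoll {
    public static void main(String[] args) {
        // Each Pair holds an outcome and its (unnormalized) weight.
        List<Pair<String, Double>> itemWeights = List.of(
                new Pair<>("sword of misery", 10.0),
                new Pair<>("shield of happy", 5.0),
                new Pair<>("potion of dying", 6.0),
                new Pair<>("triple-edged sword", 1.0));
        // sample() picks an outcome with probability proportional to its weight.
        System.out.println(new EnumeratedDistribution<>(itemWeights).sample());
    }
}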
You will not find a framework for this kind of problem, as the requested functionality is nothing more than a simple function. Do something like this:
interface Item {
double getWeight();
}
class RandomItemChooser {
public Item chooseOnWeight(List<Item> items) {
double completeWeight = 0.0;
for (Item item : items)
completeWeight += item.getWeight();
double r = Math.random() * completeWeight;
double countWeight = 0.0;
for (Item item : items) {
countWeight += item.getWeight();
if (countWeight >= r)
return item;
}
throw new RuntimeException("Should never be shown.");
}
}
There is a straightforward algorithm for picking an item at random, where items have individual weights (a sketch follows the steps):
calculate the sum of all the weights
pick a random number that is 0 or greater and is less than the sum of the weights
go through the items one at a time, subtracting their weight from your random number until you get the item where the random number is less than that item's weight
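A minimal sketch of those three steps, reusing the Item interface (with getWeight()) shown in the answer above:
import java.util.List;
import java.util.Random;

class WeightedPicker {
    static Item pick(List<Item> items, Random random) {
        double totalWeight = 0.0;
        for (Item item : items) {
            totalWeight += item.getWeight();              // 1. sum of all the weights
        }
        double r = random.nextDouble() * totalWeight;     // 2. random value in [0, total)
        for (Item item : items) {
            if (r < item.getWeight()) {                   // 3. r landed inside this item's slice
                return item;
            }
            r -= item.getWeight();
        }
        return items.get(items.size() - 1);               // guard against floating-point rounding
    }
}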
Use an alias method
If you're going to roll a lot of times (as in a game), you should use an alias method.
The code below is a rather long implementation of such an alias method, but that is because of the initialization part. The retrieval of elements is very fast (see the next and applyAsInt methods: they don't loop).
Usage
Set<Item> items = ... ;
ToDoubleFunction<Item> weighter = ... ;
Random random = new Random();
RandomSelector<Item> selector = RandomSelector.weighted(items, weighter);
Item drop = selector.next(random);
Implementation
This implementation:
uses Java 8;
is designed to be as fast as possible (well, at least, I tried to do so using micro-benchmarking);
is totally thread-safe (keep one Random in each thread for maximum performance, use ThreadLocalRandom?);
fetches elements in O(1), unlike what you mostly find on the internet or on StackOverflow, where naive implementations run in O(n) or O(log(n));
keeps the items independent from their weight, so an item can be assigned various weights in different contexts.
Anyway, here's the code. (Note that I maintain an up-to-date version of this class.)
import static java.util.Objects.requireNonNull;
import java.util.*;
import java.util.function.*;
public final class RandomSelector<T> {
public static <T> RandomSelector<T> weighted(Set<T> elements, ToDoubleFunction<? super T> weighter)
throws IllegalArgumentException {
requireNonNull(elements, "elements must not be null");
requireNonNull(weighter, "weighter must not be null");
if (elements.isEmpty()) { throw new IllegalArgumentException("elements must not be empty"); }
// Array is faster than anything. Use that.
int size = elements.size();
T[] elementArray = elements.toArray((T[]) new Object[size]);
double totalWeight = 0d;
double[] discreteProbabilities = new double[size];
// Retrieve the probabilities
for (int i = 0; i < size; i++) {
double weight = weighter.applyAsDouble(elementArray[i]);
if (weight < 0.0d) { throw new IllegalArgumentException("weighter may not return a negative number"); }
discreteProbabilities[i] = weight;
totalWeight += weight;
}
if (totalWeight == 0.0d) { throw new IllegalArgumentException("the total weight of elements must be greater than 0"); }
// Normalize the probabilities
for (int i = 0; i < size; i++) {
discreteProbabilities[i] /= totalWeight;
}
return new RandomSelector<>(elementArray, new RandomWeightedSelection(discreteProbabilities));
}
private final T[] elements;
private final ToIntFunction<Random> selection;
private RandomSelector(T[] elements, ToIntFunction<Random> selection) {
this.elements = elements;
this.selection = selection;
}
public T next(Random random) {
return elements[selection.applyAsInt(random)];
}
private static class RandomWeightedSelection implements ToIntFunction<Random> {
// Alias method implementation O(1)
// using Vose's algorithm to initialize O(n)
private final double[] probabilities;
private final int[] alias;
RandomWeightedSelection(double[] probabilities) {
this.probabilities = probabilities; // keep a reference so applyAsInt can use the adjusted table
this.alias = new int[probabilities.length]; // alias table, filled in below
int size = probabilities.length;
double average = 1.0d / size;
int[] small = new int[size];
int smallSize = 0;
int[] large = new int[size];
int largeSize = 0;
// Describe a column as either small (below average) or large (above average).
for (int i = 0; i < size; i++) {
if (probabilities[i] < average) {
small[smallSize++] = i;
} else {
large[largeSize++] = i;
}
}
// For each column, saturate a small probability to average with a large probability.
while (largeSize != 0 && smallSize != 0) {
int less = small[--smallSize];
int more = large[--largeSize];
probabilities[less] = probabilities[less] * size;
alias[less] = more;
probabilities[more] += probabilities[less] - average;
if (probabilities[more] < average) {
small[smallSize++] = more;
} else {
large[largeSize++] = more;
}
}
// Flush unused columns.
while (smallSize != 0) {
probabilities[small[--smallSize]] = 1.0d;
}
while (largeSize != 0) {
probabilities[large[--largeSize]] = 1.0d;
}
}
@Override public int applyAsInt(Random random) {
// Call random once to decide which column will be used.
int column = random.nextInt(probabilities.length);
// Call random a second time to decide which will be used: the column or the alias.
if (random.nextDouble() < probabilities[column]) {
return column;
} else {
return alias[column];
}
}
}
}
public class RandomCollection<E> {
private final NavigableMap<Double, E> map = new TreeMap<Double, E>();
private double total = 0;
public void add(double weight, E result) {
if (weight <= 0 || map.containsValue(result))
return;
total += weight;
map.put(total, result);
}
public E next() {
double value = ThreadLocalRandom.current().nextDouble() * total;
return map.ceilingEntry(value).getValue();
}
}
A simple (even naive?), but (as I believe) straightforward method:
/**
* Draws an integer from a given range (excluding the upper limit).
* <p>
* Similar to Python's randrange method.
*
* @param min: the smallest value that can be drawn.
* @param max: the upper limit (exclusive).
* @return The value drawn.
*/
public static int randomInt(int min, int max)
{return (int) (min + Math.random()*(max - min));}
/**
* Tests whether all the inner vectors of a given matrix
* have the same expected length.
* @param matrix: the matrix whose vectors' lengths will be measured.
* @param expectedLength: the length each vector should have.
* @return false if at least one vector has a different length.
*/
public static boolean haveAllVectorsEqualLength(int[][] matrix, int expectedLength){
for(int[] vector: matrix){if (vector.length != expectedLength) {return false;}}
return true;
}
/**
* Draws an integer from a given range
* using weighted values.
*
* @param ticketBlock: matrix with values and weights for the drawing. All its
* vectors should have length two. The weights, instead of percentages, should be
* measured as integers, according to how rare each one should be drawn, the rarest
* receiving the smallest value.
* @return The value drawn.
*/
public static int weightedRandomInt(int[][] ticketBlock) throws RuntimeException {
boolean theVectorsHaventAllLengthTwo = !(haveAllVectorsEqualLength(ticketBlock, 2));
if (theVectorsHaventAllLengthTwo)
{throw new RuntimeException("The given matrix has, at least, one vector with length lower or higher than two.");}
// Need to test for duplicates or null values in ticketBlock!
// Raffle urn building:
int raffleUrnSize = 0, urnIndex = 0, blockIndex = 0, repetitionCount = 0;
for(int[] ticket: ticketBlock){raffleUrnSize += ticket[1];}
int[] raffleUrn = new int[raffleUrnSize];
// Raffle urn filling:
while (urnIndex < raffleUrn.length){
do {
raffleUrn[urnIndex] = ticketBlock[blockIndex][0];
urnIndex++; repetitionCount++;
} while (repetitionCount < ticketBlock[blockIndex][1]);
repetitionCount = 0; blockIndex++;
}
return raffleUrn[randomInt(0, raffleUrn.length)];
}
I need to compress a list of intervals into a smaller list. Let me explain:
For example I have a list containing the intervals [1,4],[2,5],[5,7],[10,11],[13,20],[19,21], and I want to join the intersecting intervals and return the list [1,7],[10,11],[13,21], i.e. transform each group of intersecting intervals into a single longer interval.
For this I wrote this method:
public List compress(List<Interval> intervals) {
for (int j = 0; j < intervals.size(); j++) {
Interval a = intervals.get(j);
int aIndex = j;
for (int i = 1 + aIndex; i < intervals.size(); i++) {
Interval b = intervals.get(i);
if (a.intersects(b)) {
//the union method return a union of two intervals. For example returns [1,7] for [1,4] and [2,5]
intervals.add(j, a.union(b));
intervals.remove(j+1);
intervals.remove(i);
}
}
}
return intervals;
}
This seems to work fine for the first pair of intervals that are checked, but it stops there. That is, the final output is a list containing [1, 5],[5, 7],[10, 11],[13, 20],[19, 21].
I have found that this may be a problem with illegally removing elements from a list while iterating over it: https://codereview.stackexchange.com/questions/64011/removing-elements-on-a-list-while-iterating-through-it?newreg=cc3f30e670e24cc2b05cd1fa2492906f
But I have no idea how to get around this.
Please can anyone give me a hint.
Note: sorry if I did anything wrong, as this is my first post to Stack Overflow. And thanks to anyone who tries to help.
UPDATE:
Here is the solution I found after Maraboc proposed to create a copy of the list and manipulate that one.
That seems to work.
public List compress(List<Interval> intervals) {
List<Interval> man = intervals;
for (int j = 0; j < intervals.size(); j++) {
Interval a = intervals.get(j);
int aIndex = j;
for (int i = 1 + aIndex; i < intervals.size(); i++) {
Interval b = intervals.get(i);
if (a.intersects(b)) {
a = a.union(b);
man.add(j,a);
man.remove(j+1);
man.remove(i);
i--;
}
}
}
return intervals;
}
Thank you everyone.
You are actually NOT using an iterator; you are using for loops and selecting elements from the list by position, therefore you do not have to be afraid of the "cannot remove while iterating" issue.
I posted this question first to Code Review by mistake. They redirected me to this place and the question was put on hold. But before that happened, Maraboc ([link](https://codereview.stackexchange.com/users/87685/maraboc)) helped with an idea: he told me to create a new list and modify that one. I did that and it seems to work. The updated solution is in the updated question.
Just for the fun of it I took an existing Interval Tree and added a minimise method that seems to work nicely.
/**
* Title: IntervalTree
*
* Description: Implements a static Interval Tree. i.e. adding and removal are not possible.
*
* This implementation uses longs to bound the intervals but could just as easily use doubles or any other linear value.
*
* @author OldCurmudgeon
* @version 1.0
* @param <T> - The Intervals to work with.
*/
public class IntervalTree<T extends IntervalTree.Interval> {
// My intervals.
private final List<T> intervals;
// My center value. All my intervals contain this center.
private final long center;
// My interval range.
private final long lBound;
private final long uBound;
// My left tree. All intervals that end below my center.
private final IntervalTree<T> left;
// My right tree. All intervals that start above my center.
private final IntervalTree<T> right;
public IntervalTree(List<T> intervals) {
if (intervals == null) {
throw new NullPointerException();
}
// Initially, my root contains all intervals.
this.intervals = intervals;
// Find my center.
center = findCenter();
/*
* Builds lefts out of all intervals that end below my center.
* Builds rights out of all intervals that start above my center.
* What remains contains all the intervals that contain my center.
*/
// Lefts contains all intervals that end below my center point.
final List<T> lefts = new ArrayList<>();
// Rights contains all intervals that start above my center point.
final List<T> rights = new ArrayList<>();
long uB = Long.MIN_VALUE;
long lB = Long.MAX_VALUE;
for (T i : intervals) {
long start = i.getStart();
long end = i.getEnd();
if (end < center) {
lefts.add(i);
} else if (start > center) {
rights.add(i);
} else {
// One of mine.
lB = Math.min(lB, start);
uB = Math.max(uB, end);
}
}
// Remove all those not mine.
intervals.removeAll(lefts);
intervals.removeAll(rights);
uBound = uB;
lBound = lB;
// Build the subtrees.
left = lefts.size() > 0 ? new IntervalTree<>(lefts) : null;
right = rights.size() > 0 ? new IntervalTree<>(rights) : null;
// Build my ascending and descending arrays.
/**
* @todo Build my ascending and descending arrays.
*/
}
/*
* Returns a list of all intervals containing the point.
*/
List<T> query(long point) {
// Check my range.
if (point >= lBound) {
if (point <= uBound) {
// Gather all intersecting ones.
List<T> found = intervals
.stream()
.filter((i) -> (i.getStart() <= point && point <= i.getEnd()))
.collect(Collectors.toList());
// Gather others.
if (point < center && left != null) {
found.addAll(left.query(point));
}
if (point > center && right != null) {
found.addAll(right.query(point));
}
return found;
} else {
// To right.
return right != null ? right.query(point) : Collections.<T>emptyList();
}
} else {
// To left.
return left != null ? left.query(point) : Collections.<T>emptyList();
}
}
/**
* Blends the two lists together.
*
* If the ends touch then make them one.
*
* @param a
* @param b
* @return
*/
static List<Interval> blend(List<Interval> a, List<Interval> b) {
// Either empty - return the other.
if (a.isEmpty()) {
return b;
}
if (b.isEmpty()) {
return a;
}
Interval aEnd = a.get(a.size() - 1);
Interval bStart = b.get(0);
ArrayList<Interval> blended = new ArrayList<>();
// Do they meet?
if (aEnd.getEnd() >= bStart.getStart() - 1) {
// Yes! merge them.
// Remove the last.
blended.addAll(a.subList(0, a.size() - 1));
// Add a combined one.
blended.add(new SimpleInterval(aEnd.getStart(), bStart.getEnd()));
// Add all but the first.
blended.addAll(b.subList(1, b.size()));
} else {
// Just join them.
blended.addAll(a);
blended.addAll(b);
}
return blended;
}
static List<Interval> blend(List<Interval> a, List<Interval> b, List<Interval>... more) {
List<Interval> blended = blend(a, b);
for (List<Interval> l : more) {
blended = blend(blended, l);
}
return blended;
}
List<Interval> minimise() {
// Calculate min of left and right.
List<Interval> minLeft = left != null ? left.minimise() : Collections.EMPTY_LIST;
List<Interval> minRight = right != null ? right.minimise() : Collections.EMPTY_LIST;
// My contribution.
long meLeft = minLeft.isEmpty() ? lBound : Math.max(lBound, minLeft.get(minLeft.size() - 1).getEnd());
long meRight = minRight.isEmpty() ? uBound : Math.min(uBound, minRight.get(0).getEnd());
return blend(minLeft,
Collections.singletonList(new SimpleInterval(meLeft, meRight)),
minRight);
}
private long findCenter() {
//return average();
return median();
}
protected long median() {
if (intervals.isEmpty()) {
return 0;
}
// Choose the median of all centers. Could choose just ends etc or anything.
long[] points = new long[intervals.size()];
int x = 0;
for (T i : intervals) {
// Take the mid point.
points[x++] = (i.getStart() + i.getEnd()) / 2;
}
Arrays.sort(points);
return points[points.length / 2];
}
/*
* What an interval looks like.
*/
public interface Interval {
public long getStart();
public long getEnd();
}
/*
* A simple implementation of an interval.
*/
public static class SimpleInterval implements Interval {
private final long start;
private final long end;
public SimpleInterval(long start, long end) {
this.start = start;
this.end = end;
}
@Override
public long getStart() {
return start;
}
@Override
public long getEnd() {
return end;
}
@Override
public String toString() {
return "{" + start + "," + end + "}";
}
}
/**
* Test code.
*
* @param args
*/
public static void main(String[] args) {
/**
* @todo Needs MUCH more rigorous testing.
*/
// Test data.
long[][] data = {
{1, 4}, {2, 5}, {5, 7}, {10, 11}, {13, 20}, {19, 21},};
List<Interval> intervals = new ArrayList<>();
for (long[] pair : data) {
intervals.add(new SimpleInterval(pair[0], pair[1]));
}
// Build it.
IntervalTree<Interval> test = new IntervalTree<>(intervals);
// Check minimise.
List<Interval> min = test.minimise();
System.out.println("Minimise test: ---");
System.out.println(min);
}
}
For your algorithm to work, the intervals must be sorted, say by start.
Then the for-i loop can grow a into the longest possible interval.
if (a.intersects(b)) {
a = a.union(b);
intervals.remove(i);
--i; // So we remain at old i value.
}
} // for i
intervals.set(j, a);
The reason for these requirements is that intervals A, B, C might form one long interval ABC when processed in that order, whereas processing them in the order C, B, A might not merge them.
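Putting that together, a sketch of the whole method under the sorted-by-start requirement (Interval is the type from the question with intersects() and union(); a getStart() accessor returning int is assumed here purely for the sort):
import java.util.Comparator;
import java.util.List;

public List<Interval> compress(List<Interval> intervals) {
    // Sort by start so any interval that can still merge with a appears after it.
    intervals.sort(Comparator.comparingInt(Interval::getStart));
    for (int j = 0; j < intervals.size(); j++) {
        Interval a = intervals.get(j);
        for (int i = j + 1; i < intervals.size(); i++) {
            Interval b = intervals.get(i);
            if (a.intersects(b)) {
                a = a.union(b);
                intervals.remove(i);
                --i;                     // stay at the same index after the removal
            }
        }
        intervals.set(j, a);             // write back the grown interval
    }
    return intervals;
}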
Indeed, the problem is that when you remove an element from the list, all subsequent elements get shifted. Around index j it probably doesn't matter, because you insert and then remove an item at the same location, but the removal at position i shifts all the elements after it.
What you could do, instead of removing the elements, is to put a null value at that position, so that the indices remain the same. You will then have to perform a final pass to remove the null elements from the list (and check for nulls before comparing).
You could also run your inner loop backwards (from the maximum i down to j) so that any element that gets shifted sits after i and has already been processed, as in the sketch below.
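A sketch of that backwards inner loop applied to the method from the question; note that it only fixes the index-shifting issue, so (as the previous answer points out) the list still needs to be sorted for all chains of intervals to merge in one pass:
public List<Interval> compress(List<Interval> intervals) {
    for (int j = 0; j < intervals.size(); j++) {
        Interval a = intervals.get(j);
        // Walk backwards: a removal at i never shifts an element we still have to visit.
        for (int i = intervals.size() - 1; i > j; i--) {
            Interval b = intervals.get(i);
            if (a.intersects(b)) {
                a = a.union(b);
                intervals.remove(i);
            }
        }
        intervals.set(j, a);
    }
    return intervals;
}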
I want to generate all distinct permutations of an array of integers. The array may contain duplicates, but I still want only the distinct permutations. I have tried next-permutation and recursive methods, which tend to be very slow. Please suggest an approach.
There are n! different permutations of n elements. Emitting a single permutation costs n (strictly), so the minimum cost of any permutation-generation algorithm is O(n*n!).
The Steinhaus–Johnson–Trotter algorithm is one of those algorithms. There are improvements like Shimon Even's, and other algorithms like Heap's, but none gets below O(n*n!).
Googling "permutation algorithm" turns up several different algorithms you can implement, although most use recursion and that means extra stack work. Steinhaus–Johnson–Trotter is defined iteratively, so it doesn't have that problem.
Here's a Java implementation
import java.util.Arrays;
import java.util.Iterator;
/**
* This implementation is based on the Steinhaus–Johnson–Trotter algorithm and
* Shimon Even's improvement;
*
* @see https://en.wikipedia.org/wiki/Steinhaus%E2%80%93Johnson%E2%80%93Trotter_algorithm
*
*/
public class Permutations implements Iterator<int[]> {
/**
* direction[i] = -1 if the element i has to move to the left, +1 to the
* right, 0 if it does not need to move
*/
private int[] direction;
/**
* inversePermutation[i] is the position of element i in permutation; It's
* called inverse permutation because if p2 is the inverse permutation of
* p1, then p1 is the inverse permutation of p2
*/
private int[] inversePermutation;
/**
* current permutation
*/
private int[] permutation;
/**
* @param numElements
* >= 1
*/
public Permutations(int numElements) {
// initial permutation
permutation = new int[numElements];
for (int i = 0; i < numElements; i++) {
permutation[i] = i;
}
// the support elements
inversePermutation = Arrays.copyOf(permutation, numElements);
direction = new int[numElements];
Arrays.fill(direction, -1);
direction[0] = 0;
}
/**
* Swaps the elements in array at positions i1 and i2
*
* @param array
* @param i1
* @param i2
*/
private static void swap(int[] array, int i1, int i2) {
int temp = array[i1];
array[i1] = array[i2];
array[i2] = temp;
}
/**
* prepares permutation to be the next one to return
*/
private void buildNextPermutation() {
// find the largest element with a nonzero direction, and swaps it in
// the indicated direction
int index = -1;
for (int i = 0; i < direction.length; i++) {
if (direction[permutation[i]] != 0
&& (index < 0 || permutation[index] < permutation[i])) {
index = i;
}
}
if (index < 0) {
// there are no more permutations
permutation = null;
} else {
// element we're moving
int chosenElement = permutation[index];
// direction we're moving
int dir = direction[chosenElement];
// index2 is the new position of chosenElement
int index2 = index + dir;
// we'll swap positions elements permutation[index] and
// permutation[index2] in permutation, to keep inversePermutation we
// have to swap inversePermutation's elements at index
// permutation[index] and permutation[index2]
swap(inversePermutation, permutation[index], permutation[index2]);
swap(permutation, index, index2);
// update directions
if (index2 == 0 || index2 == permutation.length - 1
|| permutation[index2 + dir] > permutation[index2]) {
// direction of chosen element
direction[chosenElement] = 0;
}
// all elements greater than chosenElement get their direction set to +1
// if they're before index2, or -1 if they're after
for (int i = chosenElement + 1; i < direction.length; i++) {
if (inversePermutation[i] > index2) {
direction[i] = -1;
} else {
direction[i] = 1;
}
}
}
}
@Override
public boolean hasNext() {
return permutation != null;
}
@Override
public int[] next() {
int[] result = Arrays.copyOf(permutation, permutation.length);
buildNextPermutation();
return result;
}
}
import java.util.*;
public class main {
/**
* @param args
*/
public static void main(String[] args) {
// TODO Auto-generated method stub
int[] quiz = new int[10];
int mean = 0,mode = 0,median,range;
Scanner scan = new Scanner(System.in);
for(int x=0;x<=9;x++){
System.out.print("Enter quiz["+(x+1)+"]:");
quiz[x]= scan.nextInt();
}
Arrays.sort(quiz);
for(int x=0;x<=9;x++){
mean = mean+quiz[x];
}
mean = mean/10;
median = (quiz[4]+quiz[5])/2;
range = quiz[9]-quiz[0];
int[] cntr = new int[10];
for(int x=0;x<=9;x++){
for(int y=0;y<=9;y++){
if (quiz[x]==quiz[y]&&x!=y){
cntr[x]++;
}
}
}
int[] sortcntr = cntr;
int ndx = 0;
Arrays.sort(sortcntr);
for(int z=0;z<=9;z++){
if(cntr[z]==sortcntr[9]){
ndx = z;
}
else
mode=0;
}
mode = quiz[ndx];
System.out.println("Mean: "+mean);
System.out.println("Median: "+median);
System.out.println("Range: "+range);
if(mode==0){
System.out.println("Mode: none");
}
else
System.out.println("Mode: "+mode);
System.out.print(sortcntr[9]);
System.out.print(cntr[9]);
System.out.println(ndx);
}
}
This is the code I used. Everything is right except for the mode: the mode variable always ends up as the highest number from the input. The latter part was just for debugging and is not for use. Please help.
The main problem of your code is that you obviously think that the line
int[] sortcntr = cntr;
creates a copy of the array cntr. However, arrays have reference semantics in Java. Thus, you simply create a second reference to the same array. If you then sort sortcntr, it applies to cntr as well since it's the same array.
To create a copy of the array:
int[] sortcntr = new int[ cntr.length ];
System.arraycopy(cntr, 0, sortcntr, 0, cntr.length);
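A shorter equivalent, if you prefer (for a primitive array, clone() also produces an independent copy):
int[] sortcntr = cntr.clone();   // sorting this copy no longer affects cntr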
BTW: Wouldn't it make more sense to work with floating-point numbers (double) instead of integer numbers?
for(int x=0;x<=9;x++){
for(int y=0;y<=9;y++){
The inner loop should start at x+1, otherwise you count everything twice.
Just to help you out: if you decide to further generalize (as Raffaele said) the process of getting the mode of a given set of data, here is a method I developed a while ago which will even return multiple modes if more than one value has the same number of occurrences. (It uses the Java 8 Stream API.)
/**
* Computes the mode of the passed integers.
*
* @param args Numbers to find the mode of.
* @return Mode of the passed numbers.
*/
public static int[] mode(int... args) {
/* Create a map of integers to their frequencies */
Map<Integer, Integer> frequencies = IntStream.of(args).collect(
HashMap::new,//Indicated that this collector will result in a HashMap
(integerIntegerMap, value) -> integerIntegerMap.merge(value, 1, Integer::sum), //For each value in the arguments, merge it into the current map, adding up the frequency
(integerIntegerMap, integerIntegerMap2) -> integerIntegerMap.putAll(integerIntegerMap2) //While this is not used, it simply combines 2 HashMaps. (I think this is only used when in parallel)
);
//Here we get the maximum number of occurrences for any number, we could return the mode here; but there could be multiple modes
int maxOccurrences = frequencies.entrySet().stream().mapToInt(Map.Entry::getValue).max().getAsInt();
//Here we simply go through the entry set again, filtering out only the numbers with a frequency equal to the max, then returning them as an array
return frequencies.entrySet().stream().filter(entry -> entry.getValue() == maxOccurrences).mapToInt(Map.Entry::getKey).toArray();
}
-Thomas
Since the input is already sorted to compute range and median, you can use the following code to get the mode after a single loop and without any extra memory (live on ideone):
// this must be sorted
int[] values = {1, 1, 2, 3, 4, 5, 5, 5, 6, 7, 8, 8};
int mode = values[0];
int modeOccurrences = 1;
int occurrences = 1;
int current = values[0];
for (int i = 1; i < values.length; i++) {
int value = values[i];
if (value == current) {
occurrences++;
} else {
if (occurrences > modeOccurrences) {
mode = current;
modeOccurrences = occurrences;
}
occurrences = 1;
current = value;
}
}
if (occurrences > modeOccurrences) {
mode = current;
modeOccurrences = occurrences;
}
You can even generify this piece of code to work with plain objects instead of numerical types, provided the modes can be sorted and compared (I used enums in my demo); a sketch follows below.
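A sketch of such a generified version of the loop above, assuming a non-empty list that is already sorted so that equal elements are adjacent:
import java.util.List;

public static <T extends Comparable<? super T>> T mode(List<T> sortedValues) {
    T mode = sortedValues.get(0);
    int modeOccurrences = 1;
    T current = sortedValues.get(0);
    int occurrences = 1;
    for (int i = 1; i < sortedValues.size(); i++) {
        T value = sortedValues.get(i);
        if (value.compareTo(current) == 0) {
            occurrences++;                        // still inside the same run of equal values
        } else {
            if (occurrences > modeOccurrences) {  // a run just ended; keep it if it is the longest so far
                mode = current;
                modeOccurrences = occurrences;
            }
            current = value;
            occurrences = 1;
        }
    }
    if (occurrences > modeOccurrences) {          // don't forget the final run
        mode = current;
    }
    return mode;
}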