The expectation is to derive 3 lists itemIsBoth, aItems, and bItems from the input list items.
How can I convert code like the one below to a functional style? (I understand this code is clear enough in an imperative style, but I want to know whether declarative style really fails to handle such a simple example.) Thanks.
for (Item item: items) {
if (item.isA() && item.isB()) {
itemIsBoth.add(item);
} else if (item.isA()) {
aItems.add(item);
} else if (item.isB()){
bItems.add(item);
}
}
The question title is quite broad (convert if-else ladder), but since the actual question asks about a specific scenario, let me offer a sample that can at least illustrate what can be done.
Because the if-else structure creates three distinct lists based on predicates applied to the item, we can express this behavior more declaratively as a grouping operation. The only extra piece needed to make this work out of the box is to collapse the multiple boolean predicates into a single tag. For example:
class Item {
enum Category {A, B, AB}
public Category getCategory() {
return /* ... */;
}
}
Then the logic can be expressed simply as:
Map<Item.Category, List<Item>> categorized =
items.stream().collect(Collectors.groupingBy(Item::getCategory));
where each list can be retrieved from the map given its category.
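For example, pulling the individual lists back out could look like this (a small sketch; getOrDefault is used so that a missing category yields an empty list):
List<Item> itemIsBoth = categorized.getOrDefault(Item.Category.AB, Collections.emptyList());
List<Item> aItems = categorized.getOrDefault(Item.Category.A, Collections.emptyList());
List<Item> bItems = categorized.getOrDefault(Item.Category.B, Collections.emptyList());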
If it's not possible to change class Item, the same effect can be achieved by moving the enum declaration and the categorization method outside the Item class (the method would become a static method).
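A minimal sketch of that variant, assuming Item cannot be touched (the Category enum, the NONE constant for items that are neither A nor B, and the categorize helper are all additions for illustration; SomeHelper stands for whatever class hosts the static method):
enum Category {A, B, AB, NONE}

static Category categorize(Item item) {
    if (item.isA() && item.isB()) return Category.AB;
    if (item.isA()) return Category.A;
    if (item.isB()) return Category.B;
    return Category.NONE;
}

Map<Category, List<Item>> categorized =
    items.stream().collect(Collectors.groupingBy(SomeHelper::categorize));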
Another solution, using Vavr and doing only one iteration over the list of items, can be achieved with foldLeft:
list.foldLeft(
Tuple.of(List.empty(), List.empty(), List.empty()), //we declare 3 lists for results
(lists, item) -> Match(item).of(
//both predicates pass, add to first list
Case($(allOf(Item::isA, Item::isB)), lists.map1(l -> l.append(item))),
//is a, add to second list
Case($(Item::isA), lists.map2(l -> l.append(item))),
//is b, add to third list
Case($(Item::isB), lists.map3(l -> l.append(item)))
))
);
It will return a tuple containing three lists with results.
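For instance, the three result lists can be unpacked from the Vavr Tuple3 like this (a sketch; result stands for the value returned by the foldLeft above, and List is io.vavr.collection.List):
Tuple3<List<Item>, List<Item>, List<Item>> result = /* the foldLeft above */;
List<Item> itemIsBoth = result._1;
List<Item> aItems = result._2;
List<Item> bItems = result._3;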
Of course you can. The functional way is to express the logic declaratively.
Mathematically, you are defining an equivalence relation, so you can write:
Map<String, List<Item>> ys = xs
.stream()
.collect(groupingBy(x -> /* your equivalence relation here */))
A simple example shows this:
import static java.util.Arrays.asList;
import static java.util.stream.Collectors.groupingBy;

import java.util.List;
import java.util.Map;

public class Main {
static class Item {
private final boolean a;
private final boolean b;
Item(boolean a, boolean b) {
this.a = a;
this.b = b;
}
public boolean isB() {
return b;
}
public boolean isA() {
return a;
}
}
public static void main(String[] args) {
List<Item> xs = asList(new Item(true, true), new Item(true, true), new Item(false, true));
Map<String, List<Item>> ys = xs.stream().collect(groupingBy(x -> x.isA() + "," + x.isB()));
ys.entrySet().forEach(System.out::println);
}
}
With output
true,true=[com.foo.Main$Item@64616ca2, com.foo.Main$Item@13fee20c]
false,true=[com.foo.Main$Item@4e04a765]
Another way you can get rid of the if-else is to replace it with Predicate and Consumer:
Map<Predicate<Item>, Consumer<Item>> actions =
Map.of(Item.predicateA(), aItems::add, Item.predicateB(), bItems::add);
actions.forEach((key, value) -> items.stream().filter(key).forEach(value));
Therefore you need to enhance your Item with both methods predicateA() and predicateB(), using the logic you have implemented in your isA() and isB().
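A minimal sketch of what those two methods could look like, assuming they are added as static factory methods on Item (the exclusive conditions mirror the else-if chain in the question; a third map entry would be needed if you also want the "both" list):
public static Predicate<Item> predicateA() {
    return item -> item.isA() && !item.isB(); // "only A"
}

public static Predicate<Item> predicateB() {
    return item -> item.isB() && !item.isA(); // "only B"
}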
Btw, I would still suggest using your if-else logic.
Since you've mentioned vavr as a tag, I'm gonna provide a solution using vavr collections.
import static io.vavr.Predicates.allOf;
import static io.vavr.Predicates.not;
...
final Array<Item> itemIsBoth = items.filter(allOf(Item::isA, Item::isB));
final Array<Item> aItems = items.filter(allOf(Item::isA, not(Item::isB)));
final Array<Item> bItems = items.filter(allOf(Item::isB, not(Item::isA)));
The advantage of this solution is that it's simple to understand at a glance and it's about as functional as you can get with Java. The drawback is that it will iterate over the original collection three times instead of once. That's still O(n), but with a constant factor of 3. On non-critical code paths and with small collections, it might be worth trading a few CPU cycles for code clarity.
Of course, this works with all the other vavr collections too, so you can replace Array with List, Vector, Stream, etc.
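If your items start out in a java.util.List, converting to and from the Vavr collection is a one-liner (a sketch, assuming a plain List<Item> named javaItems):
Array<Item> items = Array.ofAll(javaItems);       // java.util.List -> Vavr Array
java.util.List<Item> plain = aItems.toJavaList(); // Vavr collection -> java.util.List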
Not functional in the sense of using lambdas and the like, but quite functional in the sense of using only functions (as per mathematics) and no local state/variables anywhere:
/* returns 0, 1, 2 or 3 according to isA/isB */
int getCategory(Item item) {
    return (item.isA() ? 1 : 0) + 2 * (item.isB() ? 1 : 0);
}

// one bucket per category; index 0 holds the items that are neither A nor B
LinkedList<Item>[] lists = new LinkedList[4];
for (int i = 0; i < lists.length; i++) {
    lists[i] = new LinkedList<>();
}
for (Item item : items) {
    lists[getCategory(item)].addLast(item);
}
The question is somewhat controversial, as it seems (+5/-3 at the time of writing this).
As you mentioned, the imperative solution here is most likely the most simple, appropriate and readable one.
The functional or declarative style does not really "fail". It's rather raising questions about the exact goals, conditions and context, and maybe even philosophical questions about language details (like why there is no standard Pair class in core Java).
You can apply a functional solution here. One simple, technical question is then whether you really want to fill the existing lists, or whether it's OK to create new lists. In both cases, you can use the Collectors#groupingBy method.
The grouping criterion is the same in both cases: Namely, any "representation" of the specific combination of isA and isB of one item. There are different possible solutions for that. In the examples below, I used an Entry<Boolean, Boolean> as the key.
(If you had further conditions, like isC and isD, then you could in fact also use a List<Boolean>).
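A sketch of that variant, with a hypothetical isC added next to isA and isB:
Map<List<Boolean>, List<Item>> grouped = items.stream()
    .collect(Collectors.groupingBy(
        item -> Arrays.asList(item.isA(), item.isB(), item.isC())));
List<Item> aAndCButNotB = grouped.get(Arrays.asList(true, false, true));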
The example shows how you can either add the item to existing lists (as in your question), or create new lists (which is a tad simpler and cleaner).
import java.util.AbstractMap.SimpleEntry;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
import java.util.stream.Collectors;
public class FunctionalIfElse
{
public static void main(String[] args)
{
List<Item> items = new ArrayList<Item>();
items.add(new Item(false, false));
items.add(new Item(false, true));
items.add(new Item(true, false));
items.add(new Item(true, true));
fillExistingLists(items);
createNewLists(items);
}
private static void fillExistingLists(List<Item> items)
{
System.out.println("Filling existing lists:");
List<Item> itemIsBoth = new ArrayList<Item>();
List<Item> aItems = new ArrayList<Item>();
List<Item> bItems = new ArrayList<Item>();
Map<Entry<Boolean, Boolean>, List<Item>> map =
new LinkedHashMap<Entry<Boolean, Boolean>, List<Item>>();
map.put(entryWith(true, true), itemIsBoth);
map.put(entryWith(true, false), aItems);
map.put(entryWith(false, true), bItems);
items.stream().collect(Collectors.groupingBy(
item -> entryWith(item.isA(), item.isB()),
() -> map, Collectors.toList()));
System.out.println("Both");
itemIsBoth.forEach(System.out::println);
System.out.println("A");
aItems.forEach(System.out::println);
System.out.println("B");
bItems.forEach(System.out::println);
}
private static void createNewLists(List<Item> items)
{
System.out.println("Creating new lists:");
Map<Entry<Boolean, Boolean>, List<Item>> map =
items.stream().collect(Collectors.groupingBy(
item -> entryWith(item.isA(), item.isB()),
LinkedHashMap::new, Collectors.toList()));
List<Item> itemIsBoth = map.get(entryWith(true, true));
List<Item> aItems = map.get(entryWith(true, false));
List<Item> bItems = map.get(entryWith(false, true));
System.out.println("Both");
itemIsBoth.forEach(System.out::println);
System.out.println("A");
aItems.forEach(System.out::println);
System.out.println("B");
bItems.forEach(System.out::println);
}
private static <K, V> Entry<K, V> entryWith(K k, V v)
{
return new SimpleEntry<K, V>(k, v);
}
static class Item
{
private boolean a;
private boolean b;
public Item(boolean a, boolean b)
{
this.a = a;
this.b = b;
}
public boolean isA()
{
return a;
}
public boolean isB()
{
return b;
}
@Override
public String toString()
{
return "(" + a + ", " + b + ")";
}
}
}
I have a list of objects with many duplicated and some fields that need to be merged. I want to reduce this down to a list of unique objects using only Java 8 Streams (I know how to do this via old-skool means but this is an experiment.)
This is what I have right now. I don't really like this because the map-building seems extraneous and the values() collection is a view of the backing map, and you need to wrap it in a new ArrayList<>(...) to get a more specific collection. Is there a better approach, perhaps using the more general reduction operations?
@Test
public void reduce() {
Collection<Foo> foos = Stream.of("foo", "bar", "baz")
.flatMap(this::getfoos)
.collect(Collectors.toMap(f -> f.name, f -> f, (l, r) -> {
l.ids.addAll(r.ids);
return l;
})).values();
assertEquals(3, foos.size());
foos.forEach(f -> assertEquals(10, f.ids.size()));
}
private Stream<Foo> getfoos(String n) {
return IntStream.range(0,10).mapToObj(i -> new Foo(n, i));
}
public static class Foo {
private String name;
private List<Integer> ids = new ArrayList<>();
public Foo(String n, int i) {
name = n;
ids.add(i);
}
}
If you break the grouping and reducing steps up, you can get something cleaner:
Stream<Foo> input = Stream.of("foo", "bar", "baz").flatMap(this::getfoos);
Map<String, Optional<Foo>> collect = input.collect(Collectors.groupingBy(f -> f.name, Collectors.reducing(Foo::merge)));
Collection<Optional<Foo>> collected = collect.values();
This assumes a few convenience methods in your Foo class:
public Foo(String n, List<Integer> ids) {
this.name = n;
this.ids.addAll(ids);
}
public static Foo merge(Foo src, Foo dest) {
List<Integer> merged = new ArrayList<>();
merged.addAll(src.ids);
merged.addAll(dest.ids);
return new Foo(src.name, merged);
}
As already pointed out in the comments, a map is a very natural thing to use when you want to identify unique objects. If all you needed to do was find the unique objects, you could use the Stream::distinct method. This method hides the fact that there is a map involved, but apparently it does use a map internally, as hinted by this question that shows you should implement a hashCode method or distinct may not behave correctly.
In the case of the distinct method, where no merging is necessary, it is possible to return some of the results before all of the input has been processed. In your case, unless you can make additional assumptions about the input that haven't been mentioned in the question, you do need to finish processing all of the input before you return any results. Thus this answer does use a map.
It is easy enough to use streams to process the values of the map and turn it back into an ArrayList, though. I show that in this answer, as well as providing a way to avoid the appearance of an Optional<Foo>, which shows up in one of the other answers.
public void reduce() {
ArrayList<Foo> foos = Stream.of("foo", "bar", "baz").flatMap(this::getfoos)
.collect(Collectors.collectingAndThen(Collectors.groupingBy(f -> f.name,
Collectors.reducing(Foo.identity(), Foo::merge)),
map -> map.values().stream().
collect(Collectors.toCollection(ArrayList::new))));
assertEquals(3, foos.size());
foos.forEach(f -> assertEquals(10, f.ids.size()));
}
private Stream<Foo> getfoos(String n) {
return IntStream.range(0, 10).mapToObj(i -> new Foo(n, i));
}
public static class Foo {
private String name;
private List<Integer> ids = new ArrayList<>();
private static final Foo BASE_FOO = new Foo("", 0);
public static Foo identity() {
return BASE_FOO;
}
// use only if side effects to the argument objects are okay
public static Foo merge(Foo fooOne, Foo fooTwo) {
if (fooOne == BASE_FOO) {
return fooTwo;
} else if (fooTwo == BASE_FOO) {
return fooOne;
}
fooOne.ids.addAll(fooTwo.ids);
return fooOne;
}
public Foo(String n, int i) {
name = n;
ids.add(i);
}
}
If the input elements are supplied in random order, then having an intermediate map is probably the best solution. However, if you know in advance that all the foos with the same name are adjacent (this condition is actually met in your test), the algorithm can be greatly simplified: you just need to compare the current element with the previous one and merge them if the name is the same.
Unfortunately there's no Stream API method which would allow you to do such a thing easily and efficiently. One possible solution is to write a custom collector like this:
public static List<Foo> withCollector(Stream<Foo> stream) {
return stream.collect(Collector.<Foo, List<Foo>>of(ArrayList::new,
(list, t) -> {
Foo f;
if(list.isEmpty() || !(f = list.get(list.size()-1)).name.equals(t.name))
list.add(t);
else
f.ids.addAll(t.ids);
},
(l1, l2) -> {
if(l1.isEmpty())
return l2;
if(l2.isEmpty())
return l1;
if(l1.get(l1.size()-1).name.equals(l2.get(0).name)) {
l1.get(l1.size()-1).ids.addAll(l2.get(0).ids);
l1.addAll(l2.subList(1, l2.size()));
} else {
l1.addAll(l2);
}
return l1;
}));
}
My tests show that this collector is always faster than collecting to map (up to 2x depending on average number of duplicate names), both in sequential and parallel mode.
Another approach is to use my StreamEx library which provides a bunch of "partial reduction" methods including collapse:
public static List<Foo> withStreamEx(Stream<Foo> stream) {
return StreamEx.of(stream)
.collapse((l, r) -> l.name.equals(r.name), (l, r) -> {
l.ids.addAll(r.ids);
return l;
}).toList();
}
This method accepts two arguments: a BiPredicate which is applied for two adjacent elements and should return true if elements should be merged and the BinaryOperator which performs merging. This solution is a little bit slower in sequential mode than the custom collector (in parallel the results are very similar), but it's still significantly faster than toMap solution and it's simpler and somewhat more flexible as collapse is an intermediate operation, so you can collect in another way.
Again, both these solutions work only if foos with the same name are known to be adjacent. It's a bad idea to sort the input stream by foo name and then use these solutions, because the sorting will drastically reduce the performance, making it slower than the toMap solution.
As already pointed out by others, an intermediate Map is unavoidable, as that’s the way of finding the objects to merge. Further, you should not modify source data during reduction.
Nevertheless, you can achieve both without creating multiple Foo instances:
List<Foo> foos = Stream.of("foo", "bar", "baz")
.flatMap(n->IntStream.range(0,10).mapToObj(i -> new Foo(n, i)))
.collect(collectingAndThen(groupingBy(f -> f.name),
m->m.entrySet().stream().map(e->new Foo(e.getKey(),
e.getValue().stream().flatMap(f->f.ids.stream()).collect(toList())))
.collect(toList())));
This assumes that you add a constructor
public Foo(String n, List<Integer> l) {
name = n;
ids=l;
}
to your Foo class, as it should have if Foo is really supposed to be capable of holding a list of IDs. As a side note, having a type which serves as a single item as well as a container for merged results seems unnatural to me. This is exactly why the code turns out to be so complicated.
If the source items had a single id, using something like groupingBy(f -> f.name, mapping(f -> f.id, toList())), followed by mapping the entries of (String, List<Integer>) to the merged items, would be sufficient.
Since this is not the case and Java 8 lacks the flatMapping collector, the flat-mapping step is moved to the second step, making it look much more complicated.
But in both cases, the second step is not obsolete as it is where the result items are actually created and converting the map to the desired list type comes for free.
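For reference, on Java 9 and later, where Collectors.flatMapping is available, the flat-mapping can move back into the grouping step. A sketch (foos stands for the input Stream<Foo>, and the Foo(String, List<Integer>) constructor from above is assumed):
Map<String, List<Integer>> idsByName = foos
    .collect(Collectors.groupingBy(f -> f.name,
        Collectors.flatMapping(f -> f.ids.stream(), Collectors.toList())));
List<Foo> merged = idsByName.entrySet().stream()
    .map(e -> new Foo(e.getKey(), e.getValue()))
    .collect(Collectors.toList());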
I have a project where, in different scenarios, I have to work on different subsets of a large dataset. The way I have written the code, there is a Collector interface, and a class DataCollector implements Collector. The class DataCollector is instantiated with the condition of the subset-creation, and these conditions are enums.
Let's say the dataset is a set of 1 million English words, and I want to work on the subset of words consisting of odd number of letters. Then, I do the following:
DataCollector dataCollector = new DataCollector(CollectionType.WORDS_OF_ODD_LENGTH);
Set<String> fourLetteredWords = dataCollector.collect();
where CollectionType is the enum class
enum CollectionType {
WORDS_OF_ODD_LENGTH,
WORDS_OF_EVEN_LENGTH,
STARTING_WITH_VOWEL,
STARTING_WITH_CONSONANT,
....
}
The data collector calls a java.util.Predicate depending on the enum with which it was instantiated.
So far, this approach has been robust and flexible enough, but now I am facing increasingly complex scenarios (e.g., collect words of even length starting with a vowel). I would like to avoid adding new CollectionType for every such scenario. What I have noticed is that many of these complex scenarios are just logical operations on the simpler ones (e.g., condition_1 && (condition_2 || condition_3)).
The end-user is the one who specifies these conditions, and the only control I have is that I can specify the set of such conditions. As in, the end-user can only select from CollectionType. Right now, I am trying to generalize from the ability to select only one condition to the ability to select one or more. For that, I need something like
DataCollector dataCollector = new DataCollector(WORDS_OF_ODD_LENGTH &&
STARTING_WITH_VOWEL);
Is there a way I can model my enums to carry out such operations? I am open to other ideas (as in, should I just scrap this enum-based approach for something else, etc.).
I suggest you use Java 8 which has Predicate and operations supporting predicates.
enum CollectionType implements Predicate<String> {
WORDS_OF_ODD_LENGTH(s -> s.length() % 2 != 0),
WORDS_OF_EVEN_LENGTH(WORDS_OF_ODD_LENGTH.negate()),
STARTING_WITH_VOWEL(s -> isVowel(s.charAt(0))),
STARTING_WITH_CONSONANT(STARTING_WITH_VOWEL.negate()),
COMPLEX_CHECK(CollectionType::complexCheck);
private final Predicate<String> predicate;
CollectionType(Predicate<String> predicate) {
this.predicate = predicate;
}
static boolean isVowel(char c) {
return "AEIOUaeiou".indexOf(c) >= 0;
}
public boolean test(String s) {
return predicate.test(s);
}
public static boolean complexCheck(String s) {
// many lines of code, calling many methods
}
}
Then you can write a Predicate like
Predicate<String> p = WORDS_OF_ODD_LENGTH.and(STARTING_WITH_CONSONANT);
or even five letter words starting with a vowel
Predicate<String> p = STARTING_WITH_VOWEL.and(s -> s.length() == 5);
Say you wanted to use this filter on reading the file, you can do
List<String> oddWords = Files.lines(path).filter(WORDS_OF_ODD_LENGTH).collect(toList());
Or you could index them as you load them with
Map<Integer, List<String>> wordsBySize = Files.lines(path)
.collect(groupingBy(s -> s.length()));
Even though you have made your enum a Predicate, you can optimise its usage like this.
if (predicate == WORDS_OF_ODD_LENGTH || predicate == WORDS_OF_EVEN_LENGTH) {
// all words in a given list have the same length, so testing
// the first word is enough to accept or reject the whole list.
return wordsBySize.values().stream()
.filter(l -> predicate.test(l.get(0)))
.flatMap(l -> l.stream()).collect(toList());
} else {
return wordsBySize.values().stream()
.flatMap(l -> l.stream())
.filter(predicate)
.collect(toList());
}
i.e. by using enum you can recognise some predicates and optimise for them. (Whether that is a good idea or not I will leave to you)
How can I make an equality assertion between lists in a JUnit test case? Equality should be between the content of the list.
For example:
List<String> numbers = Arrays.asList("one", "two", "three");
List<String> numbers2 = Arrays.asList("one", "two", "three");
List<String> numbers3 = Arrays.asList("one", "two", "four");
// numbers should be equal to numbers2
//numbers should not be equal to numbers3
For JUnit 4! This question deserves a new answer written for JUnit 5.
I realise this answer was written a couple of years after the question; this feature probably wasn't around then. But now, it's easy to just do this:
@Test
public void test_array_pass()
{
List<String> actual = Arrays.asList("fee", "fi", "foe");
List<String> expected = Arrays.asList("fee", "fi", "foe");
assertThat(actual, is(expected));      // passes: the lists are equal
assertThat(actual, is(not(expected))); // would fail here; included only to show the negation syntax
}
If you have a recent version of JUnit installed with Hamcrest, just add these imports:
import static org.junit.Assert.*;
import static org.hamcrest.CoreMatchers.*;
http://junit.org/junit4/javadoc/latest/org/junit/Assert.html#assertThat(T, org.hamcrest.Matcher)
http://junit.org/junit4/javadoc/latest/org/hamcrest/CoreMatchers.html
http://junit.org/junit4/javadoc/latest/org/hamcrest/core/Is.html
For JUnit 5
you can use assertIterableEquals:
List<String> numbers = Arrays.asList("one", "two", "three");
List<String> numbers2 = Arrays.asList("one", "two", "three");
Assertions.assertIterableEquals(numbers, numbers2);
or assertArrayEquals, converting the lists to arrays:
List<String> numbers = Arrays.asList("one", "two", "three");
List<String> numbers2 = Arrays.asList("one", "two", "three");
Assertions.assertArrayEquals(numbers.toArray(), numbers2.toArray());
Don't transform the lists to strings and compare them. This is not good for performance.
In JUnit's CoreMatchers, there's a matcher for this: hasItems.
List<Integer> yourList = Arrays.asList(1, 2, 3, 4);
assertThat(yourList, CoreMatchers.hasItems(1, 2, 3)); // passes: hasItems checks containment, not full equality
This is the best way that I know of to check elements in a list.
assertEquals(Object, Object) from JUnit 4/JUnit 5, or assertThat(actual, is(expected)); from Hamcrest as proposed in the other answers, will work only if both equals() and toString() are overridden for the classes (deeply) of the compared objects.
It matters because the equality test in the assertion relies on equals() and the test failure message relies on toString() of the compared objects.
For built-in classes such as String, Integer and so on there is no problem, as these override both equals() and toString(). So it is perfectly valid to assert List<String> or List<Integer> with assertEquals(Object, Object).
And about this matter: you have to override equals() in a class because it makes sense in terms of object equality, not only to make assertions easier in a test with JUnit.
To make assertions easier you have other ways.
As a good practice I favor assertion/matcher libraries.
Here is an AssertJ solution.
org.assertj.core.api.ListAssert.containsExactly() is what you need: it verifies that the actual group contains exactly the given values and nothing else, in order, as stated in the javadoc.
Suppose a Foo class where you add elements and can retrieve them.
A unit test of Foo that asserts that the two lists have the same content could look like:
import org.assertj.core.api.Assertions;
import org.junit.jupiter.api.Test;
@Test
void add() throws Exception {
Foo foo = new Foo();
foo.add("One", "Two", "Three");
Assertions.assertThat(foo.getElements())
.containsExactly("One", "Two", "Three");
}
A good point of AssertJ is that declaring a List as the expected value is needless: it makes the assertion more direct and the code more readable:
Assertions.assertThat(foo.getElements())
.containsExactly("One", "Two", "Three");
But assertion/matcher libraries are a must because they will take you much further.
Suppose now that Foo doesn't store Strings but Bar instances.
That is a very common need.
With AssertJ the assertion is still simple to write. Better, you can assert that the list contents are equal even if the class of the elements doesn't override equals()/hashCode(), while the JUnit way requires that:
import org.assertj.core.api.Assertions;
import static org.assertj.core.groups.Tuple.tuple;
import org.junit.jupiter.api.Test;
@Test
void add() throws Exception {
Foo foo = new Foo();
foo.add(new Bar(1, "One"), new Bar(2, "Two"), new Bar(3, "Three"));
Assertions.assertThat(foo.getElements())
.extracting(Bar::getId, Bar::getName)
.containsExactly(tuple(1, "One"),
tuple(2, "Two"),
tuple(3, "Three"));
}
This is a legacy answer, suitable for JUnit 4.3 and below. The modern version of JUnit includes built-in readable failure messages in the assertThat method. Prefer the other answers on this question, if possible.
List<E> a = resultFromTest();
List<E> expected = Arrays.asList(new E(), new E(), ...);
assertTrue("Expected 'a' and 'expected' to be equal."+
"\n 'a' = "+a+
"\n 'expected' = "+expected,
expected.equals(a));
For the record, as #Paul mentioned in his comment to this answer, two Lists are equal:
if and only if the specified object is also a list, both lists have the same size, and all corresponding pairs of elements in the two lists are equal. (Two elements e1 and e2 are equal if (e1==null ? e2==null : e1.equals(e2)).) In other words, two lists are defined to be equal if they contain the same elements in the same order. This definition ensures that the equals method works properly across different implementations of the List interface.
See the JavaDocs of the List interface.
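A quick illustration of that contract, showing that order matters but the concrete List implementation does not:
List<Integer> arrayList = new ArrayList<>(Arrays.asList(1, 2, 3));
List<Integer> linkedList = new LinkedList<>(Arrays.asList(1, 2, 3));
assertTrue(arrayList.equals(linkedList));              // same elements, same order
assertFalse(arrayList.equals(Arrays.asList(3, 2, 1))); // same elements, different order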
If you don't care about the order of the elements, I recommend ListAssert.assertEquals in junit-addons.
Link: http://junit-addons.sourceforge.net/
For lazy Maven users:
<dependency>
<groupId>junit-addons</groupId>
<artifactId>junit-addons</artifactId>
<version>1.4</version>
<scope>test</scope>
</dependency>
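Usage is then a one-liner; if I remember correctly the class lives in the junitx.framework package (treat the exact package and signature as an assumption to verify against the library's docs):
import junitx.framework.ListAssert;

ListAssert.assertEquals(expected, actual); // passes regardless of element order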
You can use assertEquals in JUnit.
import org.junit.Assert;
import org.junit.Test;
@Test
public void test_array_pass()
{
List<String> actual = Arrays.asList("fee", "fi", "foe");
List<String> expected = Arrays.asList("fee", "fi", "foe");
Assert.assertEquals(expected, actual);
}
If the order of elements is different, the assertion will fail.
If you are asserting a list of model objects, then you should
override the equals method in the specific model class.
@Override
public boolean equals(Object obj) {
if (obj == this) {
return true;
}
if (obj instanceof ModelName) {
ModelName other = (ModelName) obj;
return this.getItem().equals(other.getItem()) ;
}
return false;
}
If you don't want to build up an ArrayList, you can also try this:
@Test
public void test_array_pass()
{
List<String> list = Arrays.asList("fee", "fi", "foe");
String listToString = list.toString();
Assert.assertTrue(listToString.contains("[fee, fi, foe]")); // passes
}
List<Integer> figureTypes = new ArrayList<Integer>(Arrays.asList(1, 2));
List<Integer> figureTypes2 = new ArrayList<Integer>(Arrays.asList(1, 2));
assertTrue(figureTypes.equals(figureTypes2));
I know there are already many options to solve this issue, but I would rather do the following to assert two lists in any order:
assertTrue(result.containsAll(expected) && expected.containsAll(result));
You mentioned that you're interested in the equality of the contents of the list (and didn't mention order). So containsExactlyInAnyOrder from AssertJ is a good fit. It comes packaged with spring-boot-starter-test, for example.
From the AssertJ docs ListAssert#containsExactlyInAnyOrder:
Verifies that the actual group contains exactly the given values and nothing else, in any order.
Example:
// an Iterable is used in the example but it would also work with an array
Iterable<Ring> elvesRings = newArrayList(vilya, nenya, narya, vilya);
// assertion will pass
assertThat(elvesRings).containsExactlyInAnyOrder(vilya, vilya, nenya, narya);
// assertion will fail as vilya is contained twice in elvesRings.
assertThat(elvesRings).containsExactlyInAnyOrder(nenya, vilya, narya);
assertEquals(expected, result); works for me.
Since this function gets two objects, you can pass anything to it.
public static void assertEquals(Object expected, Object actual) {
AssertEquals.assertEquals(expected, actual);
}
If there are no duplicates, following code should do the job
Assertions.assertTrue(firstList.size() == secondList.size()
&& firstList.containsAll(secondList)
&& secondList.containsAll(firstList));
Note: in case of duplicates, the assertion will pass as long as the number of elements is the same in both lists (even if different elements are duplicated in each list).
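For example, this is the kind of case that the note refers to:
List<String> firstList = Arrays.asList("a", "a", "b");
List<String> secondList = Arrays.asList("a", "b", "b");
// same size, and each list contains all elements of the other,
// so the assertion above passes even though the lists are not equal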
I don't think all of the above answers give an exact solution for comparing two lists of objects.
Most of the above approaches help only with the following kinds of comparison:
- size comparison
- reference comparison
But if we have lists of objects of the same size with different data at the object level, these comparison approaches won't help.
I think the following approach will work perfectly, with equals and hashCode overridden on the user-defined object.
I used the XStream lib to implement equals and hashCode, but we can also override equals and hashCode with our own logic/comparison.
Here is an example for your reference:
import com.thoughtworks.xstream.XStream;
import java.text.ParseException;
import java.util.ArrayList;
import java.util.List;
class TestClass {
private String name;
private String id;
public void setName(String value) {
this.name = value;
}
public String getName() {
return this.name;
}
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
/**
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object o) {
XStream xstream = new XStream();
String oxml = xstream.toXML(o);
String myxml = xstream.toXML(this);
return myxml.equals(oxml);
}
/**
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
XStream xstream = new XStream();
String myxml = xstream.toXML(this);
return myxml.hashCode();
}
}
public class XstreamCompareTest {
public static void main(String[] args) throws ParseException {
checkObjectEquals();
}
private static void checkObjectEquals() {
List<TestClass> testList1 = new ArrayList<TestClass>();
TestClass tObj1 = new TestClass();
tObj1.setId("test3");
tObj1.setName("testname3");
testList1.add(tObj1);
TestClass tObj2 = new TestClass();
tObj2.setId("test2");
tObj2.setName("testname2");
testList1.add(tObj2);
testList1.sort((TestClass t1, TestClass t2) -> t1.getId().compareTo(t2.getId()));
List<TestClass> testList2 = new ArrayList<TestClass>();
TestClass tObj3 = new TestClass();
tObj3.setId("test3");
tObj3.setName("testname3");
testList2.add(tObj3);
TestClass tObj4 = new TestClass();
tObj4.setId("test2");
tObj4.setName("testname2");
testList2.add(tObj4);
testList2.sort((TestClass t1, TestClass t2) -> t1.getId().compareTo(t2.getId()));
if (isNotMatch(testList1, testList2)) {
System.out.println("The list are not matched");
} else {
System.out.println("The list are matched");
}
}
private static boolean isNotMatch(List<TestClass> clist1, List<TestClass> clist2) {
return clist1.size() != clist2.size() || !clist1.equals(clist2);
}
}
The most important thing is that you can ignore fields with the @XStreamOmitField annotation if you don't want to include them in the equality check of the objects. There are many annotations like this to configure, so have a deeper look at this lib's annotations.
I am sure this answer will save you time in identifying the correct approach for comparing two lists of objects :). Please comment if you see any issues with it.
Since Java doesn't allow passing methods as parameters, what trick do you use to implement Python-like list comprehension in Java?
I have a list (ArrayList) of Strings. I need to transform each element by using a function so that I get another list. I have several functions which take a String as input and return another String as output. How do I make a generic method which can be given the list and the function as parameters so that I can get a list back with each element processed? It is not possible in the literal sense, but what trick should I use?
The other option is to write a new function for each smaller String-processing function which simply loops over the entire list, which is kinda not so cool.
In Java 8 you can use method references:
List<String> list = ...;
list.replaceAll(String::toUpperCase);
Or, if you want to create a new list instance:
List<String> upper = list.stream().map(String::toUpperCase).collect(Collectors.toList());
Basically, you create a Function interface:
public interface Func<In, Out> {
public Out apply(In in);
}
and then pass in an anonymous subclass to your method.
Your method could either apply the function to each element in-place:
public static <T> void applyToListInPlace(List<T> list, Func<T, T> f) {
ListIterator<T> itr = list.listIterator();
while (itr.hasNext()) {
T output = f.apply(itr.next());
itr.set(output);
}
}
// ...
List<String> myList = ...;
applyToListInPlace(myList, new Func<String, String>() {
public String apply(String in) {
return in.toLowerCase();
}
});
or create a new List (basically creating a mapping from the input list to the output list):
public static <In, Out> List<Out> map(List<In> in, Func<In, Out> f) {
List<Out> out = new ArrayList<Out>(in.size());
for (In inObj : in) {
out.add(f.apply(inObj));
}
return out;
}
// ...
List<String> myList = ...;
List<String> lowerCased = map(myList, new Func<String, String>() {
public String apply(String in) {
return in.toLowerCase();
}
});
Which one is preferable depends on your use case. If your list is extremely large, the in-place solution may be the only viable one; if you wish to apply many different functions to the same original list to make many derivative lists, you will want the map version.
The Google Collections library (now part of Guava) has lots of classes for working with collections and iterators at a much higher level than plain Java supports, and in a functional manner (filter, map, fold, etc.). It defines Function and Predicate interfaces and methods that use them to process collections so that you don't have to. It also has convenience functions that make dealing with Java generics less arduous.
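For example, mapping over a list with its Function interface could look roughly like this (a sketch; words is a hypothetical List<String>, and Lists.transform returns a lazy view of the original list):
import com.google.common.base.Function;
import com.google.common.collect.Lists;

List<String> upper = Lists.transform(words, new Function<String, String>() {
    @Override
    public String apply(String input) {
        return input.toUpperCase();
    }
});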
I also use Hamcrest** for filtering collections.
The two libraries are easy to combine with adapter classes.
** Declaration of interest: I co-wrote Hamcrest
Apache Commons' CollectionUtils.transform(Collection, Transformer) is another option.
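A small sketch of what that can look like with commons-collections4, where Transformer is a functional interface, so a lambda or method reference fits (note that transform modifies the collection in place):
import org.apache.commons.collections4.CollectionUtils;

List<String> words = new ArrayList<>(Arrays.asList("foo", "bar"));
CollectionUtils.transform(words, String::toUpperCase); // words is now ["FOO", "BAR"]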
I'm building a project to write list comprehensions in Java; right now it's a proof of concept at https://github.com/farolfo/list-comprehension-in-java
Examples
// { x | x E {1,2,3,4} ^ x is even }
// gives {2,4}
Predicate<Integer> even = x -> x % 2 == 0;
List<Integer> evens = new ListComprehension<Integer>()
.suchThat(x -> {
x.belongsTo(Arrays.asList(1, 2, 3, 4));
x.is(even);
});
// evens = {2,4};
And if we want to transform the output expression in some way like
// { x * 2 | x E {1,2,3,4} ^ x is even }
// gives {4,8}
List<Integer> duplicated = new ListComprehension<Integer>()
.giveMeAll((Integer x) -> x * 2)
.suchThat(x -> {
x.belongsTo(Arrays.asList(1, 2, 3, 4));
x.is(even);
});
// duplicated = {4,8}
You can use lambdas for the function, like so:
class Comprehension<T> {
/**
 * in:   the input list
 * func: the function to apply to each entry
 */
public List<T> comp(List<T> in, Function<T, T> func) {
List<T> out = new ArrayList<T>();
for(T o: in) {
out.add(func.apply(o));
}
return out;
}
}
the usage:
List<String> stuff = new ArrayList<String>();
stuff.add("a");
stuff.add("b");
stuff.add("c");
stuff.add("d");
stuff.add("cheese");
// the <String> tells the comprehension to return a List<String>
List<String> newStuff = new Comprehension<String>().comp(stuff, a ->
        a.equals("a") ? "1" :
        a.equals("b") ? "2" :
        a.equals("c") ? "3" :
        a.equals("d") ? "4" : a
);
will return:
["1", "2", "3", "4", "cheese"]
import java.util.Arrays;
class Soft{
public static void main(String[] args){
int[] nums=range(9, 12);
System.out.println(Arrays.toString(nums));
}
static int[] range(int low, int high){
int[] a=new int[high-low];
for(int i=0,j=low;i<high-low;i++,j++){
a[i]=j;
}
return a;
}
}
Hope that helps you :)