I have a class
class ColumnTags {
    String name;
    Collection<String> columnSemanticTags;
    // constructor, getters, setters and other relevant attributes
}
I want to get the columnSemanticTags from a list of ColumnTags for a given name.
The corresponding method is as follows
public Collection<String> getTags(String colName, List<ColumnTags> colList) {
    Collection<String> tags = new ArrayList<>();
    for (ColumnTags col : colList) {
        if (colName.equals(col.getName())) {
            tags = col.getColumnSemanticTags();
            break;
        }
    }
    return tags;
}
I want to convert the for loop to a Java stream. I have tried:
tags = colList.stream().filter(col -> colName.equals(col.getName()))
.map(col -> col.getColumnSemanticTags())
.collect(Collectors.toCollection());
I am getting a compilation error. I am not sure what the Supplier should be. I have tried ArrayList::new, and I have also tried casting to ArrayList, but with no success.
Can someone advise me on what I am assuming wrong, or what the expected way to handle this scenario should be?
Along with the solution, can someone explain why .collect() is the wrong way to tackle this scenario?
public Collection<String> getTags(String colName, List<ColumnTags> colList) {
return colList.stream().filter(col -> colName.equals(col.getName()))
.map(col -> col.getColumnSemanticTags())
.findFirst().orElse(new ArrayList<>());
}
An easier way of going about this would be to simply filter a Stream to find exactly what you're looking for. If it is found, then return it, otherwise return an empty ArrayList:
return colList.stream()
.filter(c -> colName.equals(c.getName()))
.map(ColumnTags::getColumnSemanticTags)
.findFirst()
.orElseGet(ArrayList::new);
If you really want to use collect, you must call flatMap. That merges all of the lists (which come from map(col -> col.getColumnSemanticTags())) into a single stream containing all of the items.
List<String> tags = colList.stream()
.filter(col -> colName.equals(col.getName()))
.map(col -> col.getColumnSemanticTags())
.flatMap(collection -> collection.stream())
.collect(Collectors.toList());
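As for the original compilation error: Collectors.toCollection has no zero-argument form; it takes a Supplier for the target collection. If you do want the merged result in a specific collection type, a sketch (same pipeline as above, only the collector changes) would be:
Collection<String> tags = colList.stream()
        .filter(col -> colName.equals(col.getName()))
        .map(col -> col.getColumnSemanticTags())
        .flatMap(collection -> collection.stream())
        // the Supplier tells toCollection which collection to build
        .collect(Collectors.toCollection(ArrayList::new));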
I have an entity Employee
class Employee {
    private String name;
    private String addr;
    private String sal;
}
Now I have a list of these employees. I want to filter out the objects which have name == null, and set addr = "A" on the rest. I was able to achieve this like below:
List<Employee> list2= list.stream()
.filter(l -> l.getName() != null)
.peek(l -> l.setAddr("A"))
.collect(Collectors.toList());
Now list2 will have all those employees whose name is not null, with addr set to "A" for those employees.
What I also want is to find the employees which were filtered out (name == null) and save them in the DB. One way I achieved this is like below:
List<Employee> list2= list.stream()
.filter(l -> filter(l))
.peek(l -> l.setAddr("A"))
.collect(Collectors.toList());
private static boolean filter(Employee l) {
    boolean j = l.getName() != null;
    if (!j) {
        // save in db
    }
    return j;
}
1) Is this the right way?
2) Can we do this directly in a lambda expression instead of writing a separate method?
Generally, you should not use side effects in behavioral parameters. See the sections “Stateless behaviors” and “Side-effects” of the package documentation. Also, it’s not recommended to use peek for non-debugging purposes; see “In Java streams is peek really only for debugging?”
There’s not much advantage in trying to squeeze all these different operations into a single Stream pipeline. Consider the clean alternative:
Map<Boolean,List<Employee>> m = list.stream()
.collect(Collectors.partitioningBy(l -> l.getName() != null));
m.get(false).forEach(l -> {
// save in db
});
List<Employee> list2 = m.get(true);
list2.forEach(l -> l.setAddr("A"));
Regarding your second question, a lambda expression allows almost everything a method does. The differences are in the declaration, i.e. you can’t declare additional type parameters nor annotate the return type. Still, you should avoid writing too much code into a lambda expression, as, of course, you can’t create test cases directly calling that code. But that’s a matter of programming style, not a technical limitation.
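As a small illustration of that point (the names below are made up for the example): a method can declare its own type parameter, while a lambda can only implement an already-parameterized target type:
// A method can introduce its own type parameter...
static <T> T firstNonNull(T a, T b) {
    return a != null ? a : b;
}

// ...a lambda cannot; its signature is fixed by the target functional interface:
BinaryOperator<String> firstNonNullString = (a, b) -> a != null ? a : b;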
If you are okay with using peek to implement your logic (though it is not recommended except for learning), you can do the following:
List<Employee> list2 = list.stream()
        .peek(l -> {                    // add this peek to do persistence
            if (l.getName() == null) {
                persistInDB(l);
            }
        })
        .filter(l -> l.getName() != null)
        .peek(l -> l.setAddr("A"))
        .collect(Collectors.toList());
You can also do something like this:
List<Employee> list2 = list.stream()
        .filter(l -> {
            boolean condition = l.getName() != null;
            if (condition) {
                l.setAddr("A");
            } else {
                persistInDB(l);
            }
            return condition;
        })
        .collect(Collectors.toList());
Hope this helps!
How can I iterate over a list of POJOs to collect the results of some methods in a standard way, avoiding copy-paste?
I want to have code like this:
//class 'Person' has methods: getNames(), getEmails()
List<Person> people = requester.getPeople(u.getId());
String names = merge(people, Person::getNames);
String emails = merge(people, Person::getEmails);
instead of such copy-pasted logic:
List<Person> people = requester.getPeople(u.getId());
Set<String> namesAll = new HashSet<>();
Set<String> emailsAll = new HashSet<>();
for (Person p : people) {
    if (p.getNames() != null) {
        namesAll.addAll(p.getNames());
    }
    if (p.getEmails() != null) {
        emailsAll.addAll(p.getEmails());
    }
}
String names = Joiner.on(", ").skipNulls().join(namesAll);
String emails = Joiner.on(", ").skipNulls().join(emailsAll);
Thus, is it possible to implement some standard, reusable approach for iterating over the list and processing a particular method of each POJO?
If I understand you correctly, you want something like this:
String names = people.stream()
        .flatMap(p -> p.getNames().stream())
        .distinct()
        .collect(Collectors.joining(", "));
Now, if you want to save typing that line for each property, you can have this merge method as you suggested:
public static String merge(List<Person> people, Function<Person, Collection<String>> mapper) {
    return people.stream()
            .flatMap(p -> mapper.apply(p).stream())
            .distinct()
            .collect(Collectors.joining(", "));
}
This would make your first snippet work.
Now, you can make this method generic:
public static <T> String merge(List<T> list, Function<T, Collection<String>> mapper) {
    return list.stream()
            .flatMap(p -> mapper.apply(p).stream())
            .distinct()
            .collect(Collectors.joining(", "));
}
I think this should work (haven't tested it).
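One thing worth keeping from the original loop is the null handling: it skipped collections that were null, and the Joiner used skipNulls(). A sketch of the generic merge with the same guards, assuming you still want that behaviour:
public static <T> String merge(List<T> list, Function<T, Collection<String>> mapper) {
    return list.stream()
            .map(mapper)
            .filter(Objects::nonNull)      // skip null collections, like the p.getNames() != null check
            .flatMap(Collection::stream)
            .filter(Objects::nonNull)      // mirrors Joiner.on(", ").skipNulls()
            .distinct()
            .collect(Collectors.joining(", "));
}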
I have a list of objects with many duplicated and some fields that need to be merged. I want to reduce this down to a list of unique objects using only Java 8 Streams (I know how to do this via old-skool means but this is an experiment.)
This is what I have right now. I don't really like this because the map-building seems extraneous and the values() collection is a view of the backing map, and you need to wrap it in a new ArrayList<>(...) to get a more specific collection. Is there a better approach, perhaps using the more general reduction operations?
@Test
public void reduce() {
    Collection<Foo> foos = Stream.of("foo", "bar", "baz")
            .flatMap(this::getfoos)
            .collect(Collectors.toMap(f -> f.name, f -> f, (l, r) -> {
                l.ids.addAll(r.ids);
                return l;
            })).values();
    assertEquals(3, foos.size());
    foos.forEach(f -> assertEquals(10, f.ids.size()));
}

private Stream<Foo> getfoos(String n) {
    return IntStream.range(0, 10).mapToObj(i -> new Foo(n, i));
}
public static class Foo {
    private String name;
    private List<Integer> ids = new ArrayList<>();

    public Foo(String n, int i) {
        name = n;
        ids.add(i);
    }
}
If you break the grouping and reducing steps up, you can get something cleaner:
Stream<Foo> input = Stream.of("foo", "bar", "baz").flatMap(this::getfoos);
Map<String, Optional<Foo>> collect = input.collect(Collectors.groupingBy(f -> f.name, Collectors.reducing(Foo::merge)));
Collection<Optional<Foo>> collected = collect.values();
This assumes a few convenience methods in your Foo class:
public Foo(String n, List<Integer> ids) {
this.name = n;
this.ids.addAll(ids);
}
public static Foo merge(Foo src, Foo dest) {
List<Integer> merged = new ArrayList<>();
merged.addAll(src.ids);
merged.addAll(dest.ids);
return new Foo(src.name, merged);
}
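If you would rather end up with a plain List<Foo> than a Collection<Optional<Foo>>, one option is to unwrap afterwards; this is safe here because groupingBy never creates an empty group, so every Optional is present:
List<Foo> merged = collect.values().stream()
        .map(Optional::get)
        .collect(Collectors.toList());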
As already pointed out in the comments, a map is a very natural thing to use when you want to identify unique objects. If all you needed to do was find the unique objects, you could use the Stream::distinct method. This method hides the fact that there is a map involved, but apparently it does use a map internally, as hinted by this question that shows you should implement a hashCode method or distinct may not behave correctly.
In the case of the distinct method, where no merging is necessary, it is possible to return some of the results before all of the input has been processed. In your case, unless you can make additional assumptions about the input that haven't been mentioned in the question, you do need to finish processing all of the input before you return any results. Thus this answer does use a map.
It is easy enough to use streams to process the values of the map and turn it back into an ArrayList, though. I show that in this answer, as well as providing a way to avoid the appearance of an Optional<Foo>, which shows up in one of the other answers.
public void reduce() {
    ArrayList<Foo> foos = Stream.of("foo", "bar", "baz").flatMap(this::getfoos)
            .collect(Collectors.collectingAndThen(
                    Collectors.groupingBy(f -> f.name,
                            Collectors.reducing(Foo.identity(), Foo::merge)),
                    map -> map.values().stream()
                            .collect(Collectors.toCollection(ArrayList::new))));
    assertEquals(3, foos.size());
    foos.forEach(f -> assertEquals(10, f.ids.size()));
}

private Stream<Foo> getfoos(String n) {
    return IntStream.range(0, 10).mapToObj(i -> new Foo(n, i));
}
public static class Foo {
    private String name;
    private List<Integer> ids = new ArrayList<>();

    private static final Foo BASE_FOO = new Foo("", 0);

    public static Foo identity() {
        return BASE_FOO;
    }

    // use only if side effects to the argument objects are okay
    public static Foo merge(Foo fooOne, Foo fooTwo) {
        if (fooOne == BASE_FOO) {
            return fooTwo;
        } else if (fooTwo == BASE_FOO) {
            return fooOne;
        }
        fooOne.ids.addAll(fooTwo.ids);
        return fooOne;
    }

    public Foo(String n, int i) {
        name = n;
        ids.add(i);
    }
}
If the input elements are supplied in random order, then having an intermediate map is probably the best solution. However, if you know in advance that all the foos with the same name are adjacent (this condition is actually met in your test), the algorithm can be greatly simplified: you just need to compare the current element with the previous one and merge them if the name is the same.
Unfortunately there's no Stream API method which would allow you to do such a thing easily and effectively. One possible solution is to write a custom collector like this:
public static List<Foo> withCollector(Stream<Foo> stream) {
    return stream.collect(Collector.<Foo, List<Foo>>of(ArrayList::new,
            (list, t) -> {
                Foo f;
                if (list.isEmpty() || !(f = list.get(list.size() - 1)).name.equals(t.name))
                    list.add(t);
                else
                    f.ids.addAll(t.ids);
            },
            (l1, l2) -> {
                if (l1.isEmpty())
                    return l2;
                if (l2.isEmpty())
                    return l1;
                if (l1.get(l1.size() - 1).name.equals(l2.get(0).name)) {
                    l1.get(l1.size() - 1).ids.addAll(l2.get(0).ids);
                    l1.addAll(l2.subList(1, l2.size()));
                } else {
                    l1.addAll(l2);
                }
                return l1;
            }));
}
My tests show that this collector is always faster than collecting to map (up to 2x depending on average number of duplicate names), both in sequential and parallel mode.
Another approach is to use my StreamEx library which provides a bunch of "partial reduction" methods including collapse:
public static List<Foo> withStreamEx(Stream<Foo> stream) {
    return StreamEx.of(stream)
            .collapse((l, r) -> l.name.equals(r.name), (l, r) -> {
                l.ids.addAll(r.ids);
                return l;
            }).toList();
}
This method accepts two arguments: a BiPredicate which is applied to two adjacent elements and should return true if the elements should be merged, and a BinaryOperator which performs the merging. This solution is a little bit slower in sequential mode than the custom collector (in parallel the results are very similar), but it's still significantly faster than the toMap solution, and it's simpler and somewhat more flexible, as collapse is an intermediate operation, so you can collect in another way.
Again, both these solutions work only if foos with the same name are known to be adjacent. It's a bad idea to sort the input stream by foo name and then use these solutions, because the sorting will drastically reduce the performance, making it slower than the toMap solution.
As already pointed out by others, an intermediate Map is unavoidable, as that’s the way of finding the objects to merge. Further, you should not modify source data during reduction.
Nevertheless, you can achieve both without creating multiple Foo instances:
List<Foo> foos = Stream.of("foo", "bar", "baz")
        .flatMap(n -> IntStream.range(0, 10).mapToObj(i -> new Foo(n, i)))
        .collect(collectingAndThen(groupingBy(f -> f.name),
                m -> m.entrySet().stream()
                        .map(e -> new Foo(e.getKey(),
                                e.getValue().stream().flatMap(f -> f.ids.stream()).collect(toList())))
                        .collect(toList())));
This assumes that you add a constructor
public Foo(String n, List<Integer> l) {
name = n;
ids=l;
}
to your Foo class, as it should have if Foo is really supposed to be capable of holding a list of IDs. As a side note, having a type which serves as a single item as well as a container for merged results seems unnatural to me. This is exactly why the code turns out to be so complicated.
If the source items had a single id, using something like groupingBy(f -> f.name, mapping(f -> f.id, toList())), followed by mapping the entries of (String, List<Integer>) to the merged items, would be sufficient.
Since this is not the case and Java 8 lacks the flatMapping collector, the flat-mapping step is moved to the second step, making it look much more complicated.
But in both cases, the second step is not superfluous, as it is where the result items are actually created, and converting the map to the desired list type comes for free.
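For comparison, Java 9 adds the flatMapping collector mentioned above, which moves the flat-mapping back into the grouping step; a sketch of the same pipeline on Java 9+ (with flatMapping added to the static imports) might look like:
List<Foo> foos = Stream.of("foo", "bar", "baz")
        .flatMap(n -> IntStream.range(0, 10).mapToObj(i -> new Foo(n, i)))
        .collect(collectingAndThen(
                // group the single-id Foos by name and flatten their ids in one pass
                groupingBy(f -> f.name, flatMapping(f -> f.ids.stream(), toList())),
                m -> m.entrySet().stream()
                        .map(e -> new Foo(e.getKey(), e.getValue()))
                        .collect(toList())));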
I am trying to understand Lambdas in Java 8.
Say I have a Person class that looks like this:
public class Person {
    String name;
    GenderEnum gender;
    int age;
    List<Person> children;
}
Now what I want to do is find all persons who are female and have children younger than 10 years old.
Pre-Java 8, I would do it like this:
List<Person> allPersons = somePeople();
List<Person> allFemaleWithChildren = new ArrayList<>();
for (Person p : allPersons) {
    for (Person child : p.getChildren()) {
        if (child.getAge() < 10 && p.getGender() == GenderEnum.Female) {
            allFemaleWithChildren.add(p);
        }
    }
}
Now allFemaleWithChildren should have what I want.
I have been trying to do the same using streams. I think I need to use some sort of map, filter and reduce:
allPersons.stream()
//filter females
.filter(p -> p.getGender() == GenderEnum.Female)
//get the children
.map(c -> c.getChildren())
//filter the ones that are less than 10 years
.filter(c -> c.getAge() < 10)
//return a list with the result
.collect(Collectors.toList())
But this code does not compile.
What am I missing?
Also, I don't understand what the reduce method can be used for.
The compiler says
cannot resolve method getAge(). This is because c is apparently a collection and not the items in the collection, which is really what I want.
At the moment (once you fix the compilation error) you would be returning a list of children rather than of mothers. Assuming that in your original code you meant to break as soon as you find a child under 10, the equivalent could look like:
allPersons.stream()
//filter females
.filter(p -> p.getGender() == GenderEnum.Female)
//only keep females with at least one child < 10
.filter(f -> f.getChildren().stream()
.anyMatch(c -> c.getAge() < 10))
//return a list with the result
.collect(Collectors.toList())
And indeed as commented below, you could use a few static imports, add helper methods and refactor the original code to make it more readable:
allPersons.stream()
.filter(this::female)
.filter(this::hasChildrenUnder10)
.collect(toList())
//...
private boolean female(Person p) { return p.getGender() == Female; }

private boolean hasChildrenUnder10(Person parent) {
    return parent.getChildren().stream()
            .anyMatch(c -> c.getAge() < 10);
}
You have two for loops, which means at some point you need another stream (or a flatMap). Here, when you call map, you map your mothers to lists of children. You then carry on as if you had a stream of children, but what you actually have is a stream of collections of children.
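If what you actually wanted were the under-10 children themselves rather than their mothers, flatMap is the operation that flattens that stream of collections; a sketch, using the same getters as above:
List<Person> youngChildrenOfFemales = allPersons.stream()
        .filter(p -> p.getGender() == GenderEnum.Female)
        // flatMap turns Stream<List<Person>> into Stream<Person>
        .flatMap(p -> p.getChildren().stream())
        .filter(c -> c.getAge() < 10)
        .collect(Collectors.toList());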