Say I have the following two maps:
Map<Member, List<Message>> one = ...;//one constructed somehow
Map<Member, List<Message>> two = ...;//two also constructed somehow
I would like to obtain a third map containing the content of one and two.
So if a member key is in both one and two, the list value from the entry in two will be added to the list value from the entry in one.
What is the best and cleanest way to achieve this, possibly using Java 8?
You can use the merge method.
Map<Member, List<Message>> third = new HashMap<>(one);
two.forEach((k, v) -> third.merge(k, v, (v1, v2) -> {
    v1.addAll(v2);
    return v1;
}));
Note that this will also modify the lists in the map one, since you're manipulating the same list references. If you don't want that, i.e. you want to create a new list, you can do it like this:
Map<Member, List<Message>> third = new HashMap<>(one);
two.forEach((k, v) -> third.merge(k, v,
        (v1, v2) -> Stream.concat(v1.stream(), v2.stream()).collect(toList())));
The merged lists will be new lists, but be careful: this won't be the case for keys that are only in one of the maps, whose lists are still shared with the source map. You'd need to copy all the lists to avoid that.
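If you need the result to be completely independent of both maps, a sketch that shallow-copies every list (the Message objects themselves are still shared) could look like this:
Map<Member, List<Message>> third = new HashMap<>();
one.forEach((k, v) -> third.put(k, new ArrayList<>(v)));    // copy every list from one
two.forEach((k, v) -> third.merge(k, new ArrayList<>(v),    // copy every list from two
        (v1, v2) -> { v1.addAll(v2); return v1; }));        // and concatenate on key collisions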
Just added another solution with the Stream API:
Map<Member, List<Message>> third = Stream.of(one, two)
        .flatMap(m -> m.entrySet().stream())
        .collect(toMap(Map.Entry::getKey,
                       Map.Entry::getValue,
                       (l1, l2) -> Stream.concat(l1.stream(), l2.stream()).collect(toList())));
I also added my contribution to the proton-pack library. With it, you could achieve the same like this:
Map<Member, List<Message>> third = MapStream.ofMaps(one, two)
        .mergeKeys((l1, l2) -> Stream.concat(l1.stream(), l2.stream()).collect(toList()))
        .collect();
I have a map inside a map which looks like this:
Map<String, Map<Integer, BigDecimal>> mapInMap; //--> with values like:
/*
"a": (1: BigDecimal.ONE),
"b": (2: BigDecimal.TEN),
"c": (1: BigDecimal.ZERO)
*/
And I would like to combine the inner maps, expecting the following result:
Map<Integer, BigDecimal> innerMapCombined; //--> with values:
/*
1: BigDecimal.ZERO,
2: BigDecimal.TEN
*/
This is my solution with predefining the combined map and using forEach:
Map<Integer, BigDecimal> combined = new HashMap<>();
mapInMap.forEach((str, innerMap) -> {
    innerMap.forEach(combined::putIfAbsent);
});
But this will ignore (1: BigDecimal.ZERO).
Could you provide a one-line solution with Java 8 streams?
The issue is that as soon as you initialize your inner maps and add duplicate keys to them, you overwrite those keys, since a Map does not accept duplicate keys. Therefore, you first need to change this:
Map<String, Map<Integer, BigDecimal>> mapInMap;
to a Map that allows duplicated keys, for instance Multimap from Google Guava:
Map<String, Multimap<Integer, BigDecimal>> mapInMap = new HashMap<>();
where the inner maps are created like this:
Multimap<Integer, BigDecimal> x1 = ArrayListMultimap.create();
x1.put(1, BigDecimal.ONE);
mapInMap.put("a", x1);
Only then can you try to solve your problem using the Java 8 Stream API. For instance:
Map<Integer, BigDecimal> map = mapInMap.values()
        .stream()
        .flatMap(inner -> inner.entries().stream())
        .collect(Collectors.toMap(Map.Entry::getKey,
                                  Map.Entry::getValue,
                                  (v1, v2) -> v2));
Conflicts between duplicate keys are resolved with the mergeFunction parameter of the toMap method. We explicitly choose to take the second value, (v1, v2) -> v2, in case of duplicates.
Problem:
The reason your current solution doesn't work is that Map#putIfAbsent only adds a value; it doesn't replace a value that is already present in the map.
Solution using for-each:
Map#put is a way to go; however, its limitation is that you cannot decide whether to always keep the first value for a key, compute a new one, or always use the last value. For that reason I recommend either a combination of Map#computeIfPresent and Map#putIfAbsent, or better, a method that does all of that at once: Map#merge(K, V, BiFunction) with a BiFunction remappingFunction:
remappingFunction - the function to recompute a value if present
Map<Integer, BigDecimal> resultMap = new HashMap<>();
for (Map<Integer, BigDecimal> map : mapInMap.values()) {
    for (Map.Entry<Integer, BigDecimal> entry : map.entrySet()) {
        resultMap.merge(entry.getKey(), entry.getValue(), (l, r) -> r);
    }
}
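For comparison, a sketch of the computeIfPresent/putIfAbsent combination mentioned above (it keeps the last value, just like the merge version):
Map<Integer, BigDecimal> resultMap = new HashMap<>();
for (Map<Integer, BigDecimal> map : mapInMap.values()) {
    for (Map.Entry<Integer, BigDecimal> entry : map.entrySet()) {
        // replace the value if the key is already present ...
        resultMap.computeIfPresent(entry.getKey(), (k, oldValue) -> entry.getValue());
        // ... otherwise insert it
        resultMap.putIfAbsent(entry.getKey(), entry.getValue());
    }
}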
Solution using Stream API:
To rewrite it as a Stream-based solution, the approach is identical. The only difference is the declarative syntax of the Stream API; the idea is the same.
Just flatMap the structure and collect it to a map with Collectors.toMap(Function, Function, BinaryOperator), using the BinaryOperator mergeFunction to merge duplicated keys.
mergeFunction - a merge function, used to resolve collisions between values associated with the same key, as supplied to Map.merge(Object, Object, BiFunction)
Map<Integer, BigDecimal> resultMap = mapInMap.values().stream()
        .flatMap(entries -> entries.entrySet().stream())
        .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue, (l, r) -> r));
Note: @dreamcrash also deserves credit for his good Stream API answer in terms of speed.
Result:
{1=0, 2=10} is the result when you print out such a map (note that a BigDecimal is printed as a plain number). This output matches your expected output:
1=BigDecimal.ZERO
2=BigDecimal.TEN
Notice the similarity between Map#merge(K, V, BiFunction) and Collectors.toMap(Function, Function, BinaryOperator): they use a very similar approach to reach the same result.
I am trying to rewrite the method below using streams but I am not sure what the best approach is? If I use flatMap on the values of the entrySet(), I lose the reference to the current key.
private List<String> asList(final Map<String, List<String>> map) {
    final List<String> result = new ArrayList<>();
    for (final Entry<String, List<String>> entry : map.entrySet()) {
        final List<String> values = entry.getValue();
        values.forEach(value -> result.add(String.format("%s-%s", entry.getKey(), value)));
    }
    return result;
}
The best I managed to do is the following:
return map.keySet().stream()
        .flatMap(key -> map.get(key).stream()
                .map(value -> new AbstractMap.SimpleEntry<>(key, value)))
        .map(e -> String.format("%s-%s", e.getKey(), e.getValue()))
        .collect(Collectors.toList());
Is there a simpler way without resorting to creating new Entry objects?
A stream is a sequence of values (possibly unordered / parallel). map() is what you use when you want to map a single value in the sequence to some single other value. Say, map "alturkovic" to "ALTURKOVIC". flatMap() is what you use when you want to map a single value in the sequence to 0, 1, or many other values. Hence, a flatMap lambda needs to turn a value into a stream of values. flatMap can thus be used to take, say, a list of lists of strings, and turn that into a stream of just strings.
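To illustrate that last point, a minimal sketch (the names are made up for the example):
List<List<String>> nested = Arrays.asList(Arrays.asList("a", "b"), Arrays.asList("c"));
List<String> flat = nested.stream()
        .flatMap(List::stream)            // each inner list becomes a stream of its elements
        .collect(Collectors.toList());    // ["a", "b", "c"]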
Here, you want to map a single entry from your map (a single key/value pair) into a single element (a string describing it). 1 value to 1 value. That means flatMap is not appropriate. You're looking for just map.
Furthermore, you need both key and value to perform your mapping op, so keySet() is also not appropriate. You're looking for entrySet(), which gives you a set of all k/v pairs, just what we need.
That gets us to:
map.entrySet().stream()
        .map(e -> String.format("%s-%s", e.getKey(), e.getValue()))
        .collect(Collectors.toList());
Your original code makes no effort to treat a single value from a map (which is a List<String>) as separate values; you just call .toString() on the entire ordeal, and be done with it. This means the produced string looks like, say, [Hello, World] given a map value of List.of("Hello", "World"). If you don't want this, you still don't want flatMap, because streams are also homogeneous - the values in a stream are all of the same kind, and thus a stream of 'key1 value1 value2 key2 valueA valueB' is not what you'd want:
map.entrySet().stream()
        .map(e -> String.format("%s-%s", e.getKey(), myPrint(e.getValue())))
        .collect(Collectors.toList());

public static String myPrint(List<String> in) {
    // write your own algorithm here
}
Stream API just isn't the right tool to replace that myPrint method.
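Just as an illustration, one possible (entirely arbitrary) myPrint could join the values with a separator:
public static String myPrint(List<String> in) {
    // e.g. ["Hello", "World"] becomes "Hello/World"
    return String.join("/", in);
}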
A third alternative is that you want to smear out the map; you want each string in a mapvalue's List<String> to first be matched with the key (so that's re-stating that key rather a lot), and then do something to that. NOW flatMap IS appropriate - you want a stream of k/v pairs first, and then do something to that, and each element is now of the same kind. You want to turn the map:
key1 = [value1, value2]
key2 = [value3, value4]
first into a stream:
key1:value1
key1:value2
key2:value3
key2:value4
and take it from there. This explodes a single k/v entry in your map into more than one, thus, flatmapping needed:
return map.entrySet().stream()
        .flatMap(e -> e.getValue().stream()
                .map(v -> String.format("%s-%s", e.getKey(), v)))
        .collect(Collectors.toList());
Going inside-out, it maps a single entry within a list that belongs to a single k/v pair into the string Key-SingleItemFromItsList.
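As a quick sanity check of the flatMap version, given a hypothetical input map:
Map<String, List<String>> map = new HashMap<>();
map.put("key1", Arrays.asList("value1", "value2"));
map.put("key2", Arrays.asList("value3", "value4"));
// asList(map) yields ["key1-value1", "key1-value2", "key2-value3", "key2-value4"]
// (the key order depends on the HashMap)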
Adding my two cents to the excellent answer by @rzwitserloot; map and flatMap are already explained in his answer.
List<String> resultLists = myMap.entrySet().stream()
        .flatMap(mapEntry -> printEntries(mapEntry.getKey(), mapEntry.getValue()))
        .collect(Collectors.toList());
System.out.println(resultLists);
Splitting this into a separate method improves readability, IMO:
private static Stream<String> printEntries(String key, List<String> values) {
    return values.stream().map(val -> String.format("%s-%s", key, val));
}
I would like to know how to convert a Java List to a Map, where the key in the map is some property of the list elements (different elements might have the same property) and the value is a list of those elements (the ones having the same property).
E.g. List<Owner> --> Map<Item, List<Owner>>. I found a few List-to-Map questions, but they were not what I want to do.
What I came with is:
List<Owner> owners = new ArrayList<>(); // populate from file
Map<Item, List<Owner>> map = new HashMap<>();

owners.parallelStream()
        .map(Owner::getPairStream)
        .flatMap(Function.identity())
        .forEach(pair -> {
            map.computeIfPresent(pair.getItem(), (k, v) -> {
                v.add(pair.getOwner());
                return v;
            });
            map.computeIfAbsent(pair.getItem(), k -> {
                List<Owner> list = new ArrayList<>();
                list.add(pair.getOwner());
                return list;
            });
        });
I can put the forEach part into a separate method, but it still feels too verbose. Plus, I made a Pair class just to make it work. I tried to look into Collectors but couldn't get my head around doing what I wanted.
Starting from where you are, you can simplify your code by using groupingBy:
Map<Item, List<Owner>> map = owners.stream()
        .flatMap(Owner::getPairStream)
        .collect(Collectors.groupingBy(Pair::getItem,
                Collectors.mapping(Pair::getOwner, Collectors.toList())));
You can also dispense with the Pair class by using SimpleEntry:
Map<Item, List<Owner>> map = owners.stream()
        .flatMap(owner -> owner.getItems()
                .stream()
                .map(item -> new AbstractMap.SimpleEntry<>(item, owner)))
        .collect(Collectors.groupingBy(Entry::getKey,
                Collectors.mapping(Entry::getValue, Collectors.toList())));
Note that I'm assuming that Item has equals and hashCode overridden accordingly.
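For reference, a minimal sketch of what that could look like, assuming (purely for illustration) that an Item is identified by a single name field:
class Item {
    private final String name;

    Item(String name) { this.name = name; }

    @Override
    public boolean equals(Object o) {
        // two Items are equal when their names are equal
        return o instanceof Item && name.equals(((Item) o).name);
    }

    @Override
    public int hashCode() {
        return name.hashCode();
    }
}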
Side notes:
You can use map.merge instead of successively calling map.computeIfPresent and map.computeIfAbsent (a sketch follows after these notes)
HashMap and parallelStream make a bad combination (HashMap isn't thread-safe)
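A sketch of that map.merge variant, reusing the Pair and Owner types from the question and a sequential stream (per the note above):
owners.stream()
        .flatMap(Owner::getPairStream)
        .forEach(pair -> {
            List<Owner> single = new ArrayList<>();
            single.add(pair.getOwner());
            // put the single-element list, or append it to the existing one
            map.merge(pair.getItem(), single, (oldList, newList) -> {
                oldList.addAll(newList);
                return oldList;
            });
        });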
So I have a piece of code where I'm iterating over a list of data. Each one is a ReportData that contains a case with a Long caseId and one Ruling. Each Ruling has one or more Payment. I want to have a Map with the caseIds as keys and sets of payments as values (i.e. a Map<Long, Set<Payment>>).
Cases are not unique across rows, but rulings are.
In other words, I can have several rows with the same case, but they will have unique rulings.
The following code gets me a Map<Long, Set<Set<Payments>>> which is almost what I want, but I've been struggling to find the correct way to flatMap the final set in the given context. I've been doing workarounds to make the logic work correctly using this map as is, but I'd very much like to fix the algorithm to correctly combine the set of payments into one single set instead of creating a set of sets.
I've searched around and couldn't find a problem with the same kind of iteration, although flatMapping with Java streams seems like a somewhat popular topic.
rowData.stream()
        .collect(Collectors.groupingBy(
                r -> r.getCase().getCaseId(),
                Collectors.mapping(
                        r -> r.getRuling(),
                        Collectors.mapping(
                                ruling -> ruling.getPayments(),
                                Collectors.toSet()))));
Another JDK8 solution:
Map<Long, Set<Payment>> resultSet = rowData.stream()
        .collect(Collectors.toMap(p -> p.getCase().getCaseId(),
                p -> new HashSet<>(p.getRuling().getPayments()),
                (l, r) -> { l.addAll(r); return l; }));
or as of JDK9 you can use the flatMapping collector:
rowData.stream()
        .collect(Collectors.groupingBy(r -> r.getCase().getCaseId(),
                Collectors.flatMapping(e -> e.getRuling().getPayments().stream(),
                        Collectors.toSet())));
The cleanest solution is to define your own collector:
Map<Long, Set<Payment>> result = rowData.stream()
        .collect(Collectors.groupingBy(
                ReportData::getCaseId,
                Collector.of(HashSet::new,
                        (s, r) -> s.addAll(r.getRuling().getPayments()),
                        (s1, s2) -> { s1.addAll(s2); return s1; })));
Two other solutions that I thought of first, but which are actually less efficient and less readable; they still avoid constructing an intermediate map:
Merging the inner sets using Collectors.reducing():
Map<Long, Set<Payment>> result = rowData.stream()
        .collect(Collectors.groupingBy(
                ReportData::getCaseId,
                Collectors.reducing(Collections.emptySet(),
                        r -> r.getRuling().getPayments(),
                        (s1, s2) -> {
                            Set<Payment> r = new HashSet<>(s1);
                            r.addAll(s2);
                            return r;
                        })));
where the reducing operation will merge the Set<Payment> of entries with the same caseId. This can, however, create a lot of copies of the sets if many merges are needed.
Another solution is with a downstream collector that flatmaps the nested collections:
Map<Long, Set<Payment>> result = rowData.stream()
        .collect(Collectors.groupingBy(
                ReportData::getCaseId,
                Collectors.collectingAndThen(
                        Collectors.mapping(r -> r.getRuling().getPayments(), Collectors.toList()),
                        s -> s.stream().flatMap(Set::stream).collect(Collectors.toSet()))));
Basically it puts all sets of matching caseId together in a List, then flatmaps that list into a single Set.
There are probably better ways to do this, but this is the best I found:
Map<Long, Set<Payment>> result = rowData.stream()
        // First group by caseIds.
        .collect(Collectors.groupingBy(r -> r.getCase().getCaseId()))
        .entrySet().stream()
        // By streaming over the entrySet, I map the values to the set of payments.
        .collect(Collectors.toMap(
                Map.Entry::getKey,
                entry -> entry.getValue().stream()
                        .flatMap(r -> r.getRuling().getPayments().stream())
                        .collect(Collectors.toSet())));
I have a structure such as Map<String, List<Map<String, Object>>>. I want to apply a function to the map as follows: the method takes a key and one Map<String, Object> from the list. Each key has several Map<String, Object> entries in its list. How can I apply the process method to the map's key for each Map<String, Object> value? I was able to use two forEach loops (see below), but I have a feeling this is not the best way to solve the problem in a functional way.
TypeProcessor p = TypeProcessor.instance();

// Apply this function to the key and each map from the list,
// then collect the returned Results in a list.
Result process(String key, Map<String, Object> dataPoints);

List<Result> list = new ArrayList<>();
map.forEach((key, value) -> {
    value.forEach(innerVal -> {
        Result r = p.process(key, innerVal);
        list.add(r);
    });
});
It seems from your code that you want to apply process for the entire Map, so you could do it like this:
List<Result> l = map.entrySet()
        .stream()
        .flatMap(e -> e.getValue().stream().map(value -> p.process(e.getKey(), value)))
        .collect(Collectors.toList());
Well, assuming map contains key, you don't need any forEach. Just obtain the value from the outer map, stream it, map to your new object and collect to a List:
List<Result> list = map.get(key)
        .stream()
        .map(v -> p.process(key, v))
        .collect(Collectors.toList());