I'm trying to learn Java 8 Streams, and while converting some functions to Java 8 for practice I ran into a problem.
I'm curious how I can convert the following code to the Java stream format.
/*
* input example:
* [
{
"k1": { "kk1": 1, "kk2": 2},
"k2": {"kk1": 3, "kk2": 4}
},
{
"k1": { "kk1": 10, "kk2": 20},
"k2": {"kk1": 30, "kk2": 40}
}
]
* output:
* {
"k1": { "kk1": 11, "kk2": 22},
"k2": {"kk1": 33, "kk2": 44}
}
*
*
*/
private static Map<String, Map<String, Long>> mergeMapsValue(List<Map<String, Map<String, Long>>> valueList) {
Set<String> keys_1 = valueList.get(0).keySet();
Set<String> keys_2 = valueList.get(0).entrySet().iterator().next().getValue().keySet();
Map<String, Map<String, Long>> result = new HashMap<>();
for (String k1: keys_1) {
result.put(k1, new HashMap<>());
for (String k2: keys_2) {
long total = 0;
for (Map<String, Map<String, Long>> mmap: valueList) {
Map<String, Long> m = mmap.get(k1);
if (m != null && m.get(k2) != null) {
total += m.get(k2);
}
}
result.get(k1).put(k2, total);
}
}
return result;
}
The trick here is to collect the inner maps correctly. The workflow would be:
Flat-map the list of maps List<Map<String, Map<String, Long>>> into a stream of map entries Stream<Map.Entry<String, Map<String, Long>>>.
Group by the key of each of those entries, and for the values mapped to the same key, merge the two maps together.
Collecting maps by merging them would ideally warrant a flatMapping collector, which unfortunately doesn't exist in Java 8, although it will exist in Java 9 (see JDK-8071600). For Java 8, it is possible to use the one provided by the StreamEx library (and use MoreCollectors.flatMapping in the following code).
private static Map<String, Map<String, Long>> mergeMapsValue(List<Map<String, Map<String, Long>>> valueList) {
return valueList.stream()
.flatMap(e -> e.entrySet().stream())
.collect(Collectors.groupingBy(
Map.Entry::getKey,
Collectors.flatMapping(
e -> e.getValue().entrySet().stream(),
Collectors.<Map.Entry<String,Long>,String,Long>toMap(Map.Entry::getKey, Map.Entry::getValue, Long::sum)
)
));
}
Without using this convenient collector, we can still build our own with equivalent semantics:
private static Map<String, Map<String, Long>> mergeMapsValue2(List<Map<String, Map<String, Long>>> valueList) {
return valueList.stream()
.flatMap(e -> e.entrySet().stream())
.collect(Collectors.groupingBy(
Map.Entry::getKey,
Collector.of(
HashMap::new,
(r, t) -> t.getValue().forEach((k, v) -> r.merge(k, v, Long::sum)),
(r1, r2) -> { r2.forEach((k, v) -> r1.merge(k, v, Long::sum)); return r1; }
)
));
}
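For a quick sanity check, either variant can be fed the sample input from the question, built by hand here (a sketch with the usual java.util imports omitted; the printed order may differ because HashMap is unordered):
Map<String, Map<String, Long>> first = new HashMap<>();
first.put("k1", new HashMap<>());
first.get("k1").put("kk1", 1L);
first.get("k1").put("kk2", 2L);
first.put("k2", new HashMap<>());
first.get("k2").put("kk1", 3L);
first.get("k2").put("kk2", 4L);

Map<String, Map<String, Long>> second = new HashMap<>();
second.put("k1", new HashMap<>());
second.get("k1").put("kk1", 10L);
second.get("k1").put("kk2", 20L);
second.put("k2", new HashMap<>());
second.get("k2").put("kk1", 30L);
second.get("k2").put("kk2", 40L);

// expected sums: k1 -> {kk1=11, kk2=22}, k2 -> {kk1=33, kk2=44}
System.out.println(mergeMapsValue2(Arrays.asList(first, second)));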
As a starting point, converting to use computeIfAbsent and merge gives us the following:
private static <K1, K2> Map<K1, Map<K2, Long>> mergeMapsValue(List<Map<K1, Map<K2, Long>>> valueList) {
final Map<K1, Map<K2, Long>> result = new HashMap<>();
for (final Map<K1, Map<K2, Long>> map : valueList) {
for (final Map.Entry<K1, Map<K2, Long>> sub : map.entrySet()) {
for (final Map.Entry<K2, Long> subsub : sub.getValue().entrySet()) {
result.computeIfAbsent(sub.getKey(), k1 -> new HashMap<>())
.merge(subsub.getKey(), subsub.getValue(), Long::sum);
}
}
}
return result;
}
This removes much of the logic from your inner loop.
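If computeIfAbsent and merge are new to you, here is a minimal standalone sketch of what a single accumulation step does:
Map<String, Map<String, Long>> result = new HashMap<>();

// computeIfAbsent: create the inner map for "k1" only if it does not exist yet
// merge: insert 5 under "kk1", or combine it with an existing value using Long::sum
result.computeIfAbsent("k1", k -> new HashMap<>()).merge("kk1", 5L, Long::sum);
result.computeIfAbsent("k1", k -> new HashMap<>()).merge("kk1", 7L, Long::sum);

System.out.println(result); // {k1={kk1=12}}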
The code below is wrong; I leave it here for reference.
Converting to the Stream API is not going to make it neater, but let's give it a go.
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
import static java.util.stream.Collectors.collectingAndThen;
import static java.util.stream.Collectors.groupingBy;
import static java.util.stream.Collectors.mapping;
import static java.util.stream.Collectors.toList;
private static <K1, K2> Map<K1, Map<K2, Long>> mergeMapsValue(List<Map<K1, Map<K2, Long>>> valueList) {
return valueList.stream()
.flatMap(v -> v.entrySet().stream())
.collect(groupingBy(Entry::getKey, collectingAndThen(mapping(Entry::getValue, toList()), l -> l.stream()
.reduce(new HashMap<>(), (l2, r2) -> {
r2.forEach((k, v) -> l2.merge(k, v, Long::sum));
return l2;
}))));
}
This is what I've managed to come up with - it's horrible. The problem is that with the foreach approach, you have a reference to each level of the iteration - this makes the logic simple. With the functional approach, you need to consider each folding operation separately.
How does it work?
We first stream() our List<Map<K1, Map<K2, Long>>>, giving a Stream<Map<K1, Map<K2, Long>>>. Next we flatMap each element, giving a Stream<Entry<K1, Map<K2, Long>>> - so we flatten the first dimension. But we cannot flatten further as we need the K1 value.
So we then use collect(groupingBy) on the K1 value giving us a Map<K1, SOMETHING> - what is something?
Well, first we use a mapping(Entry::getValue, toList()) to give us a Map<K1, List<Map<K2, Long>>>. We then use collectingAndThen to take that List<Map<K2, Long>> and reduce it. Note that this means we produce an intermediate List, which is wasteful - you could get around this by using a custom Collector.
For this we use List.stream().reduce(a, b) where a is the initial value and b is the "fold" operation. a is set to new HashMap<>() and b takes two values: the accumulated map (initially the empty HashMap, thereafter the result of the previous application of the function) and the current item in the List. So, for each item in the List, we use Map.merge to combine the values.
I would say that this approach is more or less illegible - you won't be able to decipher it in a few hours time, let alone a few days.
I took the flatMap(e -> e.entrySet().stream()) part from Tunaki, but used a shorter variant for the collector:
Map<String, Integer> merged = maps.stream()
.flatMap(map -> map.entrySet().stream())
.collect(Collectors.toMap(
Map.Entry::getKey, Map.Entry::getValue, Integer::sum));
More elaborate example:
Map<String, Integer> a = new HashMap<String, Integer>() {{
put("a", 2);
put("b", 5);
}};
Map<String, Integer> b = new HashMap<String, Integer>() {{
put("a", 7);
}};
List<Map<String, Integer>> maps = Arrays.asList(a, b);
Map<String, Integer> merged = maps.stream()
.flatMap(map -> map.entrySet().stream())
.collect(Collectors.toMap(
Map.Entry::getKey, Map.Entry::getValue, Integer::sum));
assert merged.get("a") == 9;
assert merged.get("b") == 5;
Related
HashMap<String, String> map = new HashMap<String, String>();
HashMap<String, String> newMap = new HashMap<String, String>();
map.put("A","1");
map.put("B","2");
map.put("C","2");
map.put("D","1");
Expected output: "AD" -> "1" and "BC" -> "2" should be present inside newMap. In other words, if two entries have the same value, their keys should be combined into a single key that maps to that shared value in the new map. How can I achieve this in Java?
You want to group by the "integer" value using Collectors.groupingBy and collect the former keys as the new value. By default, grouping yields a List. You can further use the downstream collector Collectors.mapping and another downstream collector, Collectors.reducing, to map and concatenate the individual items (the former keys) into a single String.
Map<String, String> groupedMap = map.entrySet().stream()
.collect(Collectors.groupingBy(
Map.Entry::getValue,
Collectors.mapping(
Map.Entry::getKey,
Collectors.reducing("", (l, r) -> l + r))));
{1=AD, 2=BC}
Now you can swap keys with values for the final result, though I think what you really need is already in groupedMap, since further processing might cause an error on duplicate keys:
Map<String, String> newMap = groupedMap.entrySet().stream()
.collect(Collectors.toMap(
Map.Entry::getValue,
Map.Entry::getKey));
{BC=2, AD=1}
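The duplicate-key error mentioned above is the IllegalStateException that Collectors.toMap throws when no merge function is supplied; a contrived sketch with two entries that collide after swapping:
Map<String, String> clash = new HashMap<>();
clash.put("A", "1");
clash.put("B", "1");

// both entries end up under the key "1" after swapping, so this throws
// java.lang.IllegalStateException: Duplicate key ... (exact message depends on the JDK version)
Map<String, String> swapped = clash.entrySet().stream()
        .collect(Collectors.toMap(Map.Entry::getValue, Map.Entry::getKey));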
It is possible to put it all together using Collectors.collectingAndThen (a matter of taste):
Map<String, String> newMap = map.entrySet().stream()
.collect(Collectors.collectingAndThen(
Collectors.groupingBy(
Map.Entry::getValue,
Collectors.mapping(
Map.Entry::getKey,
Collectors.reducing("", (l, r) -> l + r))),
m -> m.entrySet().stream()
.collect(Collectors.toMap(
Map.Entry::getValue,
Map.Entry::getKey))));
Based on this logic:
Loop through your map.
For each value, get the corresponding key from the new map (based on the value).
If that key exists in the new map, remove it and put it back with the extra letter appended.
If it does not exist, just put the entry without any concatenation.
for (var entry : map.entrySet())
{
String newMapKey = getKey(newMap, entry.getValue());
if (newMapKey != null)
{
newMap.remove(newMapKey);
newMap.put(newMapKey + entry.getKey(), entry.getValue());
continue;
}
newMap.put(entry.getKey(), entry.getValue());
}
The extra method:
private static String getKey(HashMap<String, String> map, String value)
{
for (String key : map.keySet())
if (value.equals(map.get(key)))
return key;
return null;
}
{BC=2, AD=1}
Using Java 8
You can try the below approach in order to get the desired result.
Code:
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

public class Test {
public static void main(String[] args) {
HashMap<String, String> map = new HashMap<>();
Map<String, String> newMap;
map.put("A","1");
map.put("B","2");
map.put("C","2");
map.put("D","1");
Map<String, String> tempMap = map.entrySet().stream()
.collect(Collectors.groupingBy(Map.Entry::getValue,
Collectors.mapping(Map.Entry::getKey,Collectors.joining(""))));
newMap = tempMap.entrySet().stream().sorted(Map.Entry.comparingByValue())
.collect(Collectors.toMap(Map.Entry::getValue, Map.Entry::getKey,(a,b) -> a, LinkedHashMap::new));
System.out.println(newMap);
}
}
Output:
{AD=1, BC=2}
If you want the keys of the source map to be concatenated in alphabetical order like in your example "AD", "BC" (and not "DA" or "CB"), then you can ensure that by creating an intermediate map of type Map<String,List<String>> associating each distinct value in the source map with a List of keys. Then sort each list and generate a string from it.
Here is how it might be implemented:
Map<String, String> map = Map.of(
"A", "1", "B", "2","C", "2","D", "1"
);
Map<String, String> newMap = map.entrySet().stream()
.collect(Collectors.groupingBy( // intermediate Map<String, List<String>>
Map.Entry::getValue,
Collectors.mapping(Map.Entry::getKey, Collectors.toList())
))
.entrySet().stream()
.collect(Collectors.toMap(
e -> e.getValue().stream().sorted().collect(Collectors.joining()),
Map.Entry::getKey
));
newMap.forEach((k, v) -> System.out.println(k + " -> " + v));
Output:
BC -> 2
AD -> 1
I'd like to know how I can insert a map into another map using streams in Java.
I have two maps:
Map<String, List<Character>> map1
Map<String, List<Integer>> map2
I'd like to merge both maps such that we have
Map<String, Map<Character, Integer>> finalmap
if map1 is something like
{String1 = [Character1, Character2], String2 = [Character3, Character4], etc}
and map2 is
{String1 = [Integer1, Integer2], String2 = [Integer3, Integer4], etc}
I want them to merge such that the inner map maps Character1 to Integer1, and so on.
Does someone have an idea how to solve this problem? :)
Map<String, Map<Character, Integer>> map3 = map1.entrySet()
.stream()
.flatMap(entry -> {
if (map2.containsKey(entry.getKey())) {
List<Integer> integers = map2.get(entry.getKey());
List<Character> characters = entry.getValue();
Map<Character, Integer> innerMap = IntStream.range(0, Math.min(integers.size(), characters.size()))
.mapToObj(i -> Map.entry(characters.get(i), integers.get(i)))
.collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));
return Stream.of(Map.entry(entry.getKey(), innerMap));
}
return Stream.empty();
})
.collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));
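A small usage sketch of the pipeline above, using made-up data (the String1/String2 placeholders from the question are used as literal keys here):
Map<String, List<Character>> map1 = new HashMap<>();
map1.put("String1", Arrays.asList('a', 'b'));
map1.put("String2", Arrays.asList('c', 'd'));

Map<String, List<Integer>> map2 = new HashMap<>();
map2.put("String1", Arrays.asList(1, 2));
map2.put("String2", Arrays.asList(3, 4));

// map3 ends up as {String1={a=1, b=2}, String2={c=3, d=4}} (HashMap order may vary)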
This is a bit late, but in case the lengths of the Character and Integer lists differ, it is possible to build an inner map containing all Character keys with null for the missing Integer values:
// class MyClass
static Map<String, Map<Character, Integer>> joinMaps(
Map<String, List<Character>> map1, Map<String, List<Integer>> map2)
{
return map1
.entrySet()
.stream()
.filter(e -> map2.containsKey(e.getKey())) // keep the keys from both maps
.map(e -> Map.entry(
e.getKey(), // String key for result
IntStream.range(0, e.getValue().size()) // build map <Character, Integer>
.mapToObj(i -> new AbstractMap.SimpleEntry<>(
e.getValue().get(i),
i < map2.get(e.getKey()).size() ? map2.get(e.getKey()).get(i) : null
))
// use custom collector to allow null Integer values
.collect(
MyClass::innerMap,
(hm, e2) -> hm.put(e2.getKey(), e2.getValue()),
Map::putAll
)
))
.collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));
}
static LinkedHashMap<Character, Integer> innerMap() {
return new LinkedHashMap<>();
}
A custom collector for the inner map is needed to allow null values, which is not possible with Collectors.toMap, where an NPE would be thrown.
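For reference, this is the failure the custom collector avoids; a minimal sketch with a null value in the entry stream:
// Collectors.toMap rejects null values (in Java 8 it accumulates via Map.merge),
// so this throws a NullPointerException:
Map<Character, Integer> broken = Stream.of(
                new AbstractMap.SimpleEntry<Character, Integer>('a', 1),
                new AbstractMap.SimpleEntry<Character, Integer>('b', null))
        .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));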
Tests:
Map<String, List<Character>> map1 = Map.of(
"S0", Arrays.asList('#', '#'),
"S1", Arrays.asList('a', 'b'),
"S2", Arrays.asList('c', 'd'),
"S3", Arrays.asList('e', 'f')
);
System.out.println("map1=" + map1);
Map<String, List<Integer>> map2 = Map.of(
"S1", Arrays.asList(1),
"S2", Arrays.asList(3, 4, 5),
"S3", Arrays.asList(5, 6),
"S4", Arrays.asList(7, 8)
);
System.out.println("map2=" + map2);
Map<String, Map<Character, Integer>> res = joinMaps(map1, map2);
System.out.println("----\nResult:");
res.forEach((k, v) -> System.out.printf("%s -> %s%n", k, v));
Output:
map1={S0=[#, #], S1=[a, b], S2=[c, d], S3=[e, f]}
map2={S1=[1], S2=[3, 4, 5], S3=[5, 6], S4=[7, 8]}
----
Result:
S1 -> {a=1, b=null}
S2 -> {c=3, d=4}
S3 -> {e=5, f=6}
Update
Another solution using Stream::flatMap and groupingBy + mapping collectors with a custom collector used as a downstream collector is shown below:
static Map<String, Map<Character, Integer>> joinMaps(
Map<String, List<Character>> map1, Map<String, List<Integer>> map2)
{
return map1
.entrySet()
.stream()
.filter(e -> map2.containsKey(e.getKey()))
.flatMap(e -> IntStream.range(0, e.getValue().size())
.mapToObj(i -> Map.entry(
e.getKey(),
new AbstractMap.SimpleEntry<>(
e.getValue().get(i),
i < map2.get(e.getKey()).size() ? map2.get(e.getKey()).get(i) : null
)
))) // Stream<Map.Entry<String, Map.Entry<Character, Integer>>>
.collect(Collectors.groupingBy(
Map.Entry::getKey, // use String as key in outer map
Collectors.mapping(e -> e.getValue(), // build inner map
Collector.of( // using custom collector
MyClass::innerMap, // supplier
(hm, e2) -> hm.put(e2.getKey(), e2.getValue()), // accumulator
MyClass::mergeMaps // combiner
))
));
}
static <T extends Map> T mergeMaps(T acc, T map) {
acc.putAll(map);
return acc;
}
Here mergeMaps serves as the BinaryOperator<A> combiner argument of Collector.of, which differs slightly from the Stream::collect overload used above, where a BiConsumer<R, R> is used as the combiner.
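To make the difference concrete, a contrived side-by-side sketch of the two shapes (only the parameter types matter here):
// Stream.collect(supplier, accumulator, combiner): the combiner is a BiConsumer<R, R>
// that folds the second container into the first and returns nothing.
Map<Character, Integer> viaCollect = Stream.of('x', 'y')
        .collect(HashMap::new, (m, c) -> m.put(c, 0), Map::putAll);

// Collector.of(supplier, accumulator, combiner): the combiner is a BinaryOperator<A>
// that must return the merged container.
Collector<Character, Map<Character, Integer>, Map<Character, Integer>> col =
        Collector.of(HashMap::new,
                     (m, c) -> m.put(c, 0),
                     (m1, m2) -> { m1.putAll(m2); return m1; });
Map<Character, Integer> viaCollector = Stream.of('x', 'y').collect(col);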
I have an issue with streams
I have a Map<LocalDateTime, Set<Vote>> map = new HashMap<>();
I have to count the number of votes and put them in a new map grouped by LocalDateTime, but I need to count only the votes filtered by type. I don't know how to do this with a stream.
I have tried the following, but it doesn't work and doesn't even compile:
public Map<LocalDateTime, Integer> numbersOfVotes(TypeVote typeVote) {
return map.entrySet().stream().filter(en -> en.getValue().stream().filter(v -> v.getTypeVote().equals(typeVote)))
.collect(Collectors.toMap(Entry::getKey, e -> e.getValue().size()));
}
The Vote class has two attributes: name and TypeVote (an enum).
public Map<LocalDateTime, Integer> numbersOfVotes(TypeVote typeVote) {
return null;
}
My return value must be Map<LocalDateTime, Integer>.
How can I do this using streams in Java 8?
final Map<LocalDateTime, Integer> result = map.entrySet().stream()
.collect(Collectors.toMap(Map.Entry::getKey,
(Map.Entry<LocalDateTime, Set<Vote>> entry) -> (int) entry.getValue().stream()
.filter((Vote vote) -> true) // Here insert real filtering
.count()));
Something like:
Function<Map.Entry<LocalDateTime, Set<Vote>>, Integer> fun = e -> (int) e.getValue()
.stream().filter(vote -> typeVote.equals(vote.getTypeVote()))
.count();
Map<LocalDateTime, Integer> collect = map.entrySet().stream()
.collect(Collectors.toMap(Map.Entry::getKey, fun));
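For completeness, a hedged end-to-end sketch; the Vote class and the enum constants below are assumptions based only on the question's description (a name field plus a TypeVote):
enum TypeVote { FOR, AGAINST }            // assumed enum constants

class Vote {                              // assumed minimal shape of the Vote class
    private final String name;
    private final TypeVote typeVote;
    Vote(String name, TypeVote typeVote) { this.name = name; this.typeVote = typeVote; }
    TypeVote getTypeVote() { return typeVote; }
}

Map<LocalDateTime, Set<Vote>> map = new HashMap<>();
map.put(LocalDateTime.of(2021, 1, 1, 12, 0),
        new HashSet<>(Arrays.asList(new Vote("a", TypeVote.FOR), new Vote("b", TypeVote.AGAINST))));

Map<LocalDateTime, Integer> numberOfForVotes = map.entrySet().stream()
        .collect(Collectors.toMap(
                Map.Entry::getKey,
                e -> (int) e.getValue().stream()
                        .filter(v -> v.getTypeVote() == TypeVote.FOR)
                        .count()));
// numberOfForVotes: {2021-01-01T12:00=1}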
How would you use Java 8 streams to swap keys in this map of maps? Or at least clean up this mess a little bit...
Map<Type1, Map<Type2, String>> to Map<Type2, Map<Type1, String>>
Using nested for loops (untested):
Map<Type1, Map<Type2, String>> map = ...;
Map<Type2, Map<Type1, String>> map2 = new HashMap<>();
for (Type1 type1 : map.keySet()) {
    for (Entry<Type2, String> entry : map.get(type1).entrySet()) {
        if (map2.get(entry.getKey()) == null) {
            map2.put(entry.getKey(), new HashMap<Type1, String>());
        }
        map2.get(entry.getKey()).put(type1, entry.getValue());
    }
}
So far I think you would need to flat-map into all unique combinations of Type1, Type2, and String and store this set in some sort of intermediate collection.
Definitely wrong:
map.entrySet().stream().flatMap(t -> <Type1, Type2,
String>).collect(Collectors.toMap(t -> t.Type2, Collectors.toMap(t ->
t.type1, t->t.String))
Streams aren't well-suited for this type of problem. Instead, consider using other Java 8 additions -- Map#forEach and Map#computeIfAbsent:
Map<Type2, Map<Type1, String>> result = new HashMap<>();
map.forEach( (t1, e) ->
e.forEach( (t2, v) ->
result.computeIfAbsent(t2, x -> new HashMap<>()).put(t1, v)
)
);
Misha already showed you the straight forward solution. If you really want to use Streams it could look like this:
public static <S, T> Map<T, Map<S, String>> convertStream(Map<S, Map<T, String>> map) {
return map.entrySet().stream().flatMap(m1 -> m1.getValue().entrySet()
.stream().map(e -> new Object() {
final T outer = e.getKey();
final Map<S, String> map;
{
map = new HashMap<>();
map.put(m1.getKey(), e.getValue());
}
})).collect(Collectors.toMap(o -> o.outer, o -> o.map, (m1, m2) -> {
m1.putAll(m2);
return m1;
}));
}
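A brief usage sketch of convertStream, with the type parameters instantiated as String and Integer just for the demo:
Map<String, Map<Integer, String>> source = new HashMap<>();
source.put("row1", new HashMap<>());
source.get("row1").put(1, "a");
source.get("row1").put(2, "b");
source.put("row2", new HashMap<>());
source.get("row2").put(1, "c");

// keys of the inner and outer maps are swapped:
Map<Integer, Map<String, String>> swapped = convertStream(source);
System.out.println(swapped); // {1={row1=a, row2=c}, 2={row1=b}} (HashMap order may vary)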
Map<Type2, Map<Type1, Object>> finalAnswer = map.entrySet().stream()
    .collect(() -> new HashMap<Type2, Map<Type1, Object>>(),
        (mapAccumulator, left) -> {
            for (Entry<Type2, String> leftEntry : left.getValue().entrySet()) {
                // merge into any existing inner map instead of overwriting it
                mapAccumulator
                    .computeIfAbsent(leftEntry.getKey(), k -> new HashMap<>())
                    .put(left.getKey(), leftEntry.getValue());
            }
        /*accumulator*/},
        (mapLeft, mapRight) -> { mapLeft.putAll(mapRight); /*combiner*/ });
finalAnswer.entrySet().forEach(System.out::println);
I have a list of nested maps (List<Map<String, Map<String, Long>>>) and the goal is to reduce the list to a single map, where the merging is done as follows: if map1 contains x->{y->10, z->20} and map2 contains x->{y->20, z->20}, then the two should be merged to x->{y->30, z->40}.
I tried to do it as follow and this is working fine.
import java.io.IOException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
import java.util.function.BinaryOperator;
import java.util.stream.Collectors;
public class Test {
public static void main(String args[]) throws IOException {
Map<String, Map<String, Long>> data1 = new HashMap<>();
Map<String, Long> innerData1 = new HashMap<>();
innerData1.put("a", 10L);
innerData1.put("b", 20L);
data1.put("x", innerData1);
Map<String, Long> innerData2 = new HashMap<>();
innerData2.put("b", 20L);
innerData2.put("a", 10L);
data1.put("x", innerData1);
Map<String, Map<String, Long>> data2 = new HashMap<>();
data2.put("x", innerData2);
List<Map<String, Map<String, Long>>> mapLists = new ArrayList<>();
mapLists.add(data1);
mapLists.add(data2);
Map<String, Map<String, Long>> result = mapLists.stream().flatMap(map -> map.entrySet().stream()).
collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue, new BinaryOperator<Map<String, Long>>() {
@Override
public Map<String, Long> apply(Map<String, Long> t,
Map<String, Long> u) {
Map<String, Long> result = t;
for(Entry<String, Long> entry: u.entrySet()) {
Long val = t.getOrDefault(entry.getKey(), 0L);
result.put(entry.getKey(), val + entry.getValue());
}
return result;
}
}));
}
}
Is there a better or more efficient approach to solve this?
How would you do it more cleanly if the nesting level is more than 2? Suppose the List is like List<Map<String, Map<String, Map<String, Long>>>> and we have to reduce it to a single Map<String, Map<String, Map<String, Long>>>, assuming similar merge functionality as above.
You have the general idea; it is just possible to simplify the process of merging two maps a little. Merging two maps can be done easily with:
Map<String, Long> mx = new HashMap<>(m1);
m2.forEach((k, v) -> mx.merge(k, v, Long::sum));
This code creates the merged map mx from m1, then iterates over all entries of the second map m2 and merges each entry into mx with the help of Map.merge(key, value, remappingFunction): this method will add the given key with the given value if no mapping existed for that key, else it will remap the existing value for that key and the given value with the given remapping function. In our case, the remapping function should sum the two values together.
Code:
Map<String, Map<String, Long>> result =
mapLists.stream()
.flatMap(m -> m.entrySet().stream())
.collect(Collectors.toMap(
Map.Entry::getKey,
Map.Entry::getValue,
(m1, m2) -> {
Map<String, Long> mx = new HashMap<>(m1);
m2.forEach((k, v) -> mx.merge(k, v, Long::sum));
return mx;
}
));
If there are more "levels", you could define a merge method:
private static <K, V> Map<K, V> merge(Map<K, V> m1, Map<K, V> m2, BiFunction<? super V, ? super V, ? extends V> remappingFunction) {
Map<K, V> mx = new HashMap<>(m1);
m2.forEach((k, v) -> mx.merge(k, v, remappingFunction));
return mx;
}
and use it recursively. For example, to merge two Map<String, Map<String, Long>> m1 and m2, you could use
merge(m1, m2, (a, b) -> merge(a, b, Long::sum));
as the merge function passed to Collectors.toMap.
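For the deeper nesting asked about in the question, the same helper can be plugged in once per extra level; a sketch for the three-level case, where deepList is a hypothetical List<Map<String, Map<String, Map<String, Long>>>>:
Map<String, Map<String, Map<String, Long>>> reduced =
        deepList.stream()
                .flatMap(m -> m.entrySet().stream())
                .collect(Collectors.toMap(
                        Map.Entry::getKey,
                        Map.Entry::getValue,
                        // merge the level-2 maps, and for colliding level-2 keys
                        // merge the level-3 maps by summing the Long values
                        (m1, m2) -> merge(m1, m2, (a, b) -> merge(a, b, Long::sum))));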
Using my StreamEx library:
Map<String, Map<String, Long>> result = StreamEx.of(mapLists)
.flatMapToEntry(m -> m)
.toMap((m1, m2) -> EntryStream.of(m1).append(m2).toMap(Long::sum));
The flatMapToEntry intermediate operation flattens maps into EntryStream<String, Map<String, Long>> which extends Stream<Map.Entry<String, Map<String, Long>>>. The toMap terminal operation just creates a map from the stream of entries using the supplied merge function. To merge two maps we use EntryStream again.