I have the below multilevel map:
Map<String, List<Map<String, Map<String, Map<String, Map<String, String>>>>>> input =
        ImmutableMap.of("A",
                ImmutableList.of(ImmutableMap.of("2",
                        ImmutableMap.of("3",
                                ImmutableMap.of("4",
                                        ImmutableMap.of("5", "a"))))));
In short it'll be like
{
"A":[{"2":{"3":{"4":{"5":"a"}}}}],
"B":[{"2":{"3":{"4":{"5":"b"}}}}]
}
My requirement is to construct a map of the form
{
"A":"a",
"B":"b"
}
I tried the below code but for some reason myMap is always empty even though I'm populating it. What am I missing?
Map<String, String> myMap = new HashMap<>();
input.entrySet()
.stream()
.map(l -> l.getValue().stream().map(m -> m.get(m.keySet().toArray()[0]))
.map(n -> n.get(n.keySet().toArray()[0]))
.map(o -> o.get(o.keySet().toArray()[0]))
.map(p -> myMap.put(l.getKey(), p.get(p.keySet().toArray()[0])))).collect(Collectors.toList());
System.out.println(myMap);
Here's what I get when I add two peek calls to your pipeline:
input.entrySet().stream()
.peek(System.out::println) //<- this
.map(l -> ...)
.peek(System.out::println) //<- and this
.collect(Collectors.toList());
output:
A=[{2={3={4={5=a}}}}]
java.util.stream.ReferencePipeline$3#d041cf
Notice the problem: you're collecting streams, and those inner streams never get executed because no terminal operation is called on them. When I add something like .count() to the inner stream, your expected output is produced:
...
.map(l -> l.getValue().stream().map(m -> m.get(m.keySet().toArray()[0]))
.map(n -> n.get(n.keySet().toArray()[0]))
.map(o -> o.get(o.keySet().toArray()[0]))
.map(p -> myMap.put(l.getKey(), p.get(p.keySet().toArray()[0])))
.count()) //just an example
...
Now, I suppose you know that a terminal operation needs to be called for the intermediate ones to run.
In a rather desperate attempt to simplify this code, since the stream seems to make it simply hard to read, I thought you might be interested in this. It assumes that no collection in the tree is empty, but at least it addresses the record as one object rather than a collection of records (though I'm sure no code will look clean for that deep a map of maps).
String key = input.keySet().iterator().next();
String value = input.entrySet().iterator().next()
.getValue().get(0)
.values().iterator().next()
.values().iterator().next()
.values().iterator().next()
.values().iterator().next();
myMap.put(key, value);
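If every top-level key needs to end up in the result (to get the full {"A":"a","B":"b"} map), the same navigation can be wrapped in a forEach over the outer map. A minimal sketch, still assuming no list or nested map in the tree is empty:
Map<String, String> myMap = new HashMap<>();
input.forEach((key, list) -> {
    String value = list.get(0)               // {"2": {...}}
            .values().iterator().next()      // {"3": {...}}
            .values().iterator().next()      // {"4": {...}}
            .values().iterator().next()      // {"5": "a"}
            .values().iterator().next();     // "a"
    myMap.put(key, value);
});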
I'm writing a JUnit test where I want to import the expected result from a CSV file as a HashMap.
The following works, but I find it kind of boilerplate that I first create a MapEntry.entry(), which I then collect into a new HashMap.
csv:
#key;amount
key1;val1
key2;val2
...
keyN;valN
test:
Map<String, BigDecimal> expected = Files.readAllLines(
Paths.get("test.csv"))
.stream()
.map(line -> MapEntry.entry(line.split(",")[0], line.split(",")[1]))
.collect(Collectors.toMap(Map.Entry::getKey, item -> new BigDecimal(item.getValue())));
In particular, I'm looking for a one-liner like this. I mean: can I avoid having to create a MapEntry.entry explicitly before collecting it into a HashMap?
Can this be done better? Or is there even a JUnit utility that can already read a CSV?
You don't need to create an entry; you can split the line into an array in the map step and then use Collectors.toMap:
Map<String, BigDecimal> expected = Files.readAllLines(
        Paths.get("test.csv"))
        .stream()
        .map(line -> line.split(","))
        .filter(line -> line.length > 1)
        .collect(Collectors.toMap(key -> key[0], value -> new BigDecimal(value[1])));
If you want to collect entries into a specific map implementation, you can use the overloaded Collectors.toMap that takes a map supplier.
Returns a Collector that accumulates elements into a Map whose keys and values are the result of applying the provided mapping functions to the input elements.
HashMap<String, BigDecimal> expected = Files.readAllLines(
        Paths.get("test.csv"))
        .stream()
        .map(line -> line.split(","))
        .filter(line -> line.length > 1)
        .collect(Collectors.toMap(key -> key[0], value -> new BigDecimal(value[1]), (val1, val2) -> val1, HashMap::new));
This worked for me:
Map<String, BigDecimal> result = Files.readAllLines(Paths.get("test.csv"))
.stream()
.collect(Collectors.toMap(l -> l.split(",")[0], l -> new BigDecimal(l.split(",")[1])));
I am trying to rewrite the method below using streams, but I am not sure what the best approach is. If I use flatMap on the values of the entrySet(), I lose the reference to the current key.
private List<String> asList(final Map<String, List<String>> map) {
final List<String> result = new ArrayList<>();
for (final Entry<String, List<String>> entry : map.entrySet()) {
final List<String> values = entry.getValue();
values.forEach(value -> result.add(String.format("%s-%s", entry.getKey(), value)));
}
return result;
}
The best I managed to do is the following:
return map.keySet().stream()
.flatMap(key -> map.get(key).stream()
.map(value -> new AbstractMap.SimpleEntry<>(key, value)))
.map(e -> String.format("%s-%s", e.getKey(), e.getValue()))
.collect(Collectors.toList());
Is there a simpler way without resorting to creating new Entry objects?
A stream is a sequence of values (possibly unordered / parallel). map() is what you use when you want to map a single value in the sequence to some single other value. Say, map "alturkovic" to "ALTURKOVIC". flatMap() is what you use when you want to map a single value in the sequence to 0, 1, or many other values. Hence why a flatMap lambda needs to turn a value into a stream of values. flatMap can thus be used to take, say, a list of lists of strings, and turn that into a stream of just strings.
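For example, a minimal sketch of that list-of-lists case (hypothetical data):
List<List<String>> nested = List.of(List.of("a", "b"), List.of("c"));
List<String> flat = nested.stream()
        .flatMap(List::stream)            // each inner list contributes 0..n elements
        .collect(Collectors.toList());    // [a, b, c]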
Here, you want to map a single entry from your map (a single key/value pair) into a single element (a string describing it). 1 value to 1 value. That means flatMap is not appropriate. You're looking for just map.
Furthermore, you need both key and value to perform your mapping, so keySet() is also not appropriate. You're looking for entrySet(), which gives you a set of all k/v pairs, just what we need.
That gets us to:
map.entrySet().stream()
.map(e -> String.format("%s-%s", e.getKey(), e.getValue()))
.collect(Collectors.toList());
Your original code makes no effort to treat a single value from a map (which is a List<String>) as separate values; you just call .toString() on the entire ordeal and are done with it. This means the produced string looks like, say, [Hello, World] given a map value of List.of("Hello", "World"). If you don't want this, you still don't want flatMap, because streams are also homogeneous - the values in a stream are all of the same kind, and thus a stream of 'key1 value1 value2 key2 valueA valueB' is not what you'd want:
map.entrySet().stream()
.map(e -> String.format("%s-%s", e.getKey(), myPrint(e.getValue())))
.collect(Collectors.toList());
public static String myPrint(List<String> in) {
// write your own algorithm here
}
The Stream API just isn't the right tool to replace that myPrint method.
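For illustration, one possible (purely hypothetical) myPrint that just joins the values with a separator:
public static String myPrint(List<String> in) {
    // e.g. List.of("Hello", "World") becomes "Hello+World"; use whatever format you need
    return String.join("+", in);
}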
A third alternative is that you want to smear out the map; you want each string in a map value's List<String> to first be matched with the key (so that's re-stating that key rather a lot), and then do something to that. NOW flatMap IS appropriate - you want a stream of k/v pairs first, and then do something to that, and each element is now of the same kind. You want to turn the map:
key1 = [value1, value2]
key2 = [value3, value4]
first into a stream:
key1:value1
key1:value2
key2:value3
key2:value4
and take it from there. This explodes a single k/v entry in your map into more than one, so flatMapping is needed:
return map.entrySet().stream()
    .flatMap(e -> e.getValue().stream()
        .map(v -> String.format("%s-%s", e.getKey(), v)))
    .collect(Collectors.toList());
Going inside-out, it maps a single entry within a list that belongs to a single k/v pair into the string Key-SingleItemFromItsList.
Adding my two cents to the excellent answer by @rzwitserloot. flatMap and map are already explained in his answer.
List<String> resultLists = myMap.entrySet().stream()
.flatMap(mapEntry -> printEntries(mapEntry.getKey(),mapEntry.getValue())).collect(Collectors.toList());
System.out.println(resultLists);
Splitting this into a separate method gives good readability, IMO:
private static Stream<String> printEntries(String key, List<String> values) {
return values.stream().map(val -> String.format("%s-%s",key,val));
}
So I have a piece of code where I'm iterating over a list of data. Each element is a ReportData that contains a case with a Long caseId and one Ruling. Each Ruling has one or more Payments. I want to have a Map with the caseIds as keys and sets of payments as values (i.e. a Map<Long, Set<Payment>>).
Cases are not unique across rows, but rulings are.
In other words, I can have several rows with the same case, but they will have unique rulings.
The following code gets me a Map<Long, Set<Set<Payment>>>, which is almost what I want, but I've been struggling to find the correct way to flatMap the final set in the given context. I've been doing workarounds to make the logic work correctly using this map as is, but I'd very much like to fix the algorithm to correctly combine the sets of payments into one single set instead of creating a set of sets.
I've searched around and couldn't find a problem with the same kind of iteration, although flatMapping with Java streams seems like a somewhat popular topic.
rowData.stream()
.collect(Collectors.groupingBy(
r -> r.getCase().getCaseId(),
Collectors.mapping(
r -> r.getRuling(),
Collectors.mapping(ruling->
ruling.getPayments(),
Collectors.toSet()
)
)));
Another JDK8 solution:
Map<Long, Set<Payment>> resultSet =
rowData.stream()
.collect(Collectors.toMap(p -> p.getCase().getCaseId(),
        p -> new HashSet<>(p.getRuling().getPayments()),
        (l, r) -> { l.addAll(r); return l; }));
or as of JDK9 you can use the flatMapping collector:
rowData.stream()
.collect(Collectors.groupingBy(r -> r.getCase().getCaseId(),
Collectors.flatMapping(e -> e.getRuling().getPayments().stream(),
Collectors.toSet())));
The cleanest solution is to define your own collector:
Map<Long, Set<Payment>> result = rowData.stream()
.collect(Collectors.groupingBy(
ReportData::getCaseId,
Collector.of(HashSet::new,
(s, r) -> s.addAll(r.getRuling().getPayments()),
(s1, s2) -> { s1.addAll(s2); return s1; })
));
Two other solutions that I thought of first; they are actually less efficient and less readable, but they still avoid constructing the intermediate Map:
Merging the inner sets using Collectors.reducing():
Map<Long, Set<Payment>> result = rowData.stream()
.collect(Collectors.groupingBy(
ReportData::getCaseId,
Collectors.reducing(Collections.emptySet(),
r -> r.getRuling().getPayments(),
(s1, s2) -> {
Set<Payment> r = new HashSet<>(s1);
r.addAll(s2);
return r;
})
));
where the reducing operation merges the Set<Payment> of entries with the same caseId. This can, however, create a lot of copies of the sets if many merges are needed.
Another solution is with a downstream collector that flatmaps the nested collections:
Map<Long, Set<Payment>> result = rowData.stream()
.collect(Collectors.groupingBy(
ReportData::getCaseId,
Collectors.collectingAndThen(
Collectors.mapping(r -> r.getRuling().getPayments(), Collectors.toList()),
s -> s.stream().flatMap(Set::stream).collect(Collectors.toSet())))
);
Basically it puts all sets of matching caseId together in a List, then flatmaps that list into a single Set.
There are probably better ways to do this, but this is the best I found:
Map<Long, Set<Payment>> result =
rowData.stream()
// First group by caseIds.
.collect(Collectors.groupingBy(r -> r.getCase().getCaseId()))
.entrySet().stream()
// By streaming over the entrySet, I map the values to the set of payments.
.collect(Collectors.toMap(
Map.Entry::getKey,
entry -> entry.getValue().stream()
.flatMap(r -> r.getRuling().getPayments().stream())
.collect(Collectors.toSet())));
Can somebody help me convert a Map<String, Map<Long, Set<PanelData>>> to a List<PanelData>?
Background: as part of my task I have grouped the PanelData objects on two different attributes, and the end result is the above map. PanelData is just a POJO with getters and setters.
To convert a Map<String,Map<Long,CustomObject>> to List<CustomObject>, you can do it somewhat like this:
Map<String,Map<Long,CustomObject>> input = ...
List<CustomObject> output = new ArrayList<>();
input.forEach((key, value) -> output.addAll(value.values()));
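Applied to the exact type in the question (Map<String, Map<Long, Set<PanelData>>>), the inner sets also have to be flattened; a sketch along the same lines:
List<PanelData> output = new ArrayList<>();
// for each outer key, add the elements of every inner Set<PanelData>
input.forEach((key, inner) -> inner.values().forEach(output::addAll));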
You can get a stream from the entrySet and use flatMap to make another stream from the values:
map.entrySet()
.stream()
.map(Map.Entry::getValue)
.map(Map::entrySet)
.flatMap(Set::stream)
.map(Map.Entry::getValue)
.flatMap(Set::stream)
.collect(Collectors.toList());
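The same flattening can be written a bit more compactly by streaming the values directly instead of going through entrySet (a sketch, assuming the same Map<String, Map<Long, Set<PanelData>>> shape):
List<PanelData> result = map.values().stream()         // Stream<Map<Long, Set<PanelData>>>
        .flatMap(inner -> inner.values().stream())      // Stream<Set<PanelData>>
        .flatMap(Set::stream)                           // Stream<PanelData>
        .collect(Collectors.toList());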
I have the following situation where I need to remove entries from a map while streaming over it:
map.entrySet().stream()
        .filter(t -> t.getValue().equals("0"))
        .forEach(t -> map.remove(t.getKey()));
In pre-Java 8 code one would remove via the iterator; what's the best way to deal with this situation here?
map.entrySet().removeIf(entry -> entry.getValue().equals("0"));
You can't do it with streams, but you can do it with the other new methods.
EDIT: even better:
map.values().removeAll(Collections.singleton("0"));
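For illustration, a tiny self-contained example of that one-liner (hypothetical data):
Map<String, String> map = new HashMap<>(Map.of("a", "0", "b", "1", "c", "0"));
map.values().removeAll(Collections.singleton("0"));   // the values() view is backed by the map
System.out.println(map);                               // {b=1}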
If the values are themselves maps and you want to remove the entire key when a nested value is "0", then use:
myMap.entrySet().removeIf(entry -> entry.getValue().containsValue("0"));
I think it's not possible (or definitely shouldn't be done) due to streams' requirement of non-interference, as described here.
If you think of streams as functional programming constructs leaked into Java, then think of the objects that support them as their functional counterparts; in functional programming you operate on immutable objects.
As for the best way to deal with this: use filter, just like you did.
First time replying; I ran across this thread and thought I'd add to it in case others are searching. Using streams you can return a filtered Map (or whatever you like, really).
#Test
public void test() {
Map<String,String> map1 = new HashMap<>();
map1.put("dan", "good");
map1.put("Jess", "Good");
map1.put("Jaxon", "Bad");
map1.put("Maggie", "Great");
map1.put("Allie", "Bad");
System.out.println("\nFilter on key ...");
Map<String, String> map2 = map1.entrySet().stream()
        .filter(x -> x.getKey().startsWith("J"))
        .collect(Collectors.toMap(e -> e.getKey(), e -> e.getValue()));
map2.entrySet()
        .forEach(s -> System.out.println(s));
System.out.println("\nFilter on value ...");
map1.entrySet().stream()
.filter(x -> !x.getValue().equalsIgnoreCase("bad"))
.collect(Collectors.toMap(e -> e.getKey(), e -> e.getValue()))
.entrySet().stream()
.forEach(s -> System.out.println(s));
}
------- output -------
Filter on key ...
Jaxon=Bad
Jess=Good
Filter on value ...
dan=good
Jess=Good
Maggie=Great