I want to convert a Scala function that replaces the age of a certain person in a Scala Map[String, String] (which maps name -> age): map.map(e => if (e._1 == "Tom") (e._1, "52") else e)
Now I need to write the same function in Java. I also have a Scala map (scala.collection.Map) as input. I checked in the Scaladoc that map.map(..) has this signature:
def map[B, That](f: A => B)(implicit bf: CanBuildFrom[Repr, B, That])
So I write the function f like this:
AbstractFunction1 f = new AbstractFunction1<Tuple2<String, String>, Tuple2<String, String>>() {
    @Override
    public Tuple2<String, String> apply(Tuple2<String, String> e) {
        if (e._1.equals("Tom")) {
            return new Tuple2<>(e._1, "52");
        }
        return e;
    }
};
But I have no idea what I should pass for the CanBuildFrom. I searched some posts but never found something that works for me.
Does someone know how to do this properly with .map(), or is there some other workaround for this kind of usage in Java? Note: I could convert the Scala map into a Java map first, but that is definitely not something I want to do, because the return value of the function also has to be a Scala map.
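One possible shape for the call, sketched under the assumption of Scala 2.12.x (where CanBuildFrom still exists) and untested: a Scala method's implicit parameter list becomes an ordinary trailing parameter when called from Java, and the Map companion object (scala.collection.Map$.MODULE$) provides a canBuildFrom() that should fit there. Expect raw types and unchecked casts at this Java/Scala boundary.

import scala.Tuple2;
import scala.collection.Map;
import scala.runtime.AbstractFunction1;

public class ScalaMapFromJava {

    @SuppressWarnings({"unchecked", "rawtypes"})
    static Map<String, String> replaceTomsAge(Map<String, String> ages) {
        AbstractFunction1 f = new AbstractFunction1<Tuple2<String, String>, Tuple2<String, String>>() {
            @Override
            public Tuple2<String, String> apply(Tuple2<String, String> e) {
                // _1() / _2() are the accessor methods visible from Java
                return "Tom".equals(e._1()) ? new Tuple2<>(e._1(), "52") : e;
            }
        };
        // the implicit CanBuildFrom is supplied as an explicit second argument
        return (Map<String, String>) ((scala.collection.TraversableLike) ages)
                .map(f, scala.collection.Map$.MODULE$.canBuildFrom());
    }
}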
I have a Map<String, List<SomeClass>> someMap. I'm retrieving the value based on someKey, and for each element of the List<SomeClass> I'm performing other operations.
someMap.getOrDefault(someKey, new ArrayList<>()).forEach(...)
I also want to be able to log a message when I don't find someKey. How can I achieve this optimally? Is there any other function/way to achieve this behavior?
Map<String, List<String>> map = new HashMap<>();
List<String> l = new ArrayList<>();
l.add("b");
map.put("a", l);
Yes, you can do it in a single statement. Use .compute().
map.compute("a", (k, v) -> {
if (v == null) {
System.out.println("Key Not Found");
return new ArrayList<>();
}
return v;
}).forEach(System.out::println);
There's also computeIfAbsent() which will only compute the lambda if the key is not present.
Note, from the documentation:
Attempts to compute a mapping for the specified key and its current
mapped value (or null if there is no current mapping).
This will add the key which was not found in your map.
If you want to remove those keys later, then simply add those keys to a list inside the if and remove them in one statement like this:
map.keySet().removeAll(listToRemove);
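A minimal sketch of that clean-up idea, reusing the map built above (listToRemove is just an illustrative name):

List<String> listToRemove = new ArrayList<>();
map.compute("missingKey", (k, v) -> {
    if (v == null) {
        System.out.println("Key Not Found");
        listToRemove.add(k); // remember the key so its placeholder entry can be removed later
        return new ArrayList<>();
    }
    return v;
}).forEach(System.out::println);
map.keySet().removeAll(listToRemove); // drop the placeholder entries again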
You can create a function to do that. For example, I created a function which gets the value from the map and returns it if it is not null, or an empty list otherwise. Before returning the empty list, it runs a Runnable action. The main benefit of that is that you can do more than just logging there.
import java.util.Collection;
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

import lombok.extern.slf4j.Slf4j;

@Slf4j
public class Main {

    public static Collection<String> retrieveOrRun(Map<String, Collection<String>> map, String key, Runnable runnable) {
        final Collection<String> strings = map.get(key);
        if (strings == null) {
            runnable.run();
            return Collections.emptyList();
        } else {
            return strings;
        }
    }

    public static void main(String[] args) {
        Map<String, Collection<String>> map = new HashMap<>();
        Collection<String> strings = retrieveOrRun(map, "hello",
                () -> log.warn("Could not find a value for the key : {}", "hello"));
    }
}
I think you have two choices:
Either you use a wrapper method that does the actual call (getOrDefault, etc.) and handles missing keys:
public static <K, V> V getOrDefault(Map<K, V> map, K key, V defaultValue) {
    V value = map.get(key);
    if (value == null) {
        logMissingValue(key);
        return defaultValue;
    }
    return value;
}
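Used with the snippet from the question (someMap and someKey as defined there), the call site barely changes:

// same behavior as someMap.getOrDefault(someKey, new ArrayList<>()), plus logging on a miss
getOrDefault(someMap, someKey, new ArrayList<>()).forEach(element -> {
    // ... the other operations on each element ...
});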
Or you create a new implementation of Map that does just that, delegating everything else to the wrapped map (I won't write all the delegate methods here, but Eclipse generates them pretty well: Alt + Shift + S > Create delegate methods):
class LoggerMap<K, V> implements Map<K, V> {

    private final Map<K, V> internal;

    public LoggerMap(Map<K, V> internal) {
        this.internal = Objects.requireNonNull(internal, "internal");
    }

    @Override
    public V getOrDefault(Object key, V defaultValue) {
        V value = internal.get(key);
        if (value == null) {
            logMissingValue(key);
            return defaultValue;
        }
        return value;
    }

    // ... every other Map method simply delegates to internal ...
}
Now, about which is optimal: that depends on your needs. If you know you will always go through the wrapper method, then your missing keys will always be logged, and creating a new map implementation would be overkill.
If you need to log absolutely all missing keys - even from foreign code (for example, some API taking your map as a parameter) - then your best choice is a map implementation:
In terms of performance, I don't think you should worry about the delegation: I did not benchmark it, but the JVM should be able to optimize that away.
There are other methods where a key might turn out to be missing (e.g. remove, get, ...); such an implementation lets you easily trace those as well.
I was working on a spare-time project where I needed to read values from a YAML file and store them in a HashMap; another YAML file had to be stored in a LinkedHashMap. I used an API to do the reading; some explanation was added in the code below (though I believe it's quite redundant). Only the method that returns a LinkedHashMap is included, because the other one is practically identical.
Currently I'm using separate methods for getting a HashMap and a LinkedHashMap, but I noticed that the code is quite similar. So I wondered: would it be possible to write a general method that puts the paths and values from the YAML file into any Map implementation (HashMap, LinkedHashMap, ...)? And if so, how could one accomplish that?
public LinkedHashMap<String, Object> fileToLinkedHashMap(File yamlFile)
{
    LinkedHashMap<String, Object> fileContents = new LinkedHashMap<String, Object>();
    // Part of the API I'm using, reads from the YAML file and stores the contents
    YamlConfiguration config = YamlConfiguration.loadConfiguration(yamlFile);

    // Configuration#getKeys(true) gets all paths within the read file
    for (String path : config.getKeys(true))
    {
        // Gets the value of a path
        if (config.get(path) != null)
            fileContents.put(path, config.get(path));
    }
    return fileContents;
}
Note: I know I'm currently not checking whether the given file actually is a YAML file; that is beside the point for this question.
You can make use of functional interfaces (introduced in Java 8) for this:
public void consumeFile(File yamlFile, BiConsumer<? super String, ? super Object> consumer) {
    YamlConfiguration config = YamlConfiguration.loadConfiguration(yamlFile);
    for (String path : config.getKeys(true)) {
        if (config.get(path) != null) {
            consumer.accept(path, config.get(path));
        }
    }
}
This can then be called with literally anything; you just have to provide a lambda (or method reference) that accepts two parameters:
// collect into a map
Map<String, Object> map = /* hash map, linked hash map, tree map, you decide */;
consumeFile(yamlFile, map::put);
// just print them, why not?
consumeFile(yamlFile, (key, value) -> System.out.println(key + " = " + value));
The possible uses are nearly endless, limited only by your use case and imagination.
If you can't use Java 8 (you probably should, though), there is still hope. Since both methods return a Map, you can decide at the call site which Map implementation you'd like to collect into:
public Map<String, Object> consumeFile(File yamlFile, Map<String, Object> map) {
    YamlConfiguration config = YamlConfiguration.loadConfiguration(yamlFile);
    for (String path : config.getKeys(true)) {
        if (config.get(path) != null) {
            map.put(path, config.get(path));
        }
    }
    return map;
}
Which may be called like this:
Map<String, Object> map = consumeFile(yamlFile, new /*Linked*/HashMap<>());
Again, you can pick whichever Map implementation suits your needs.
I have a pattern in a piece of Kafka Streams code that keeps repeating: I do a map, then a groupByKey, and then a reduce. It looks like this:
KTable<ProjectKey, EventConfigurationIdsWithDeletedState> eventConfigurationsByProjectTable = eventConfigurationStream
    .map((key, value) -> {
        Map<String, Boolean> eventConfigurationUpdates = new HashMap<>();
        eventConfigurationUpdates.put(key.getEventConfigurationId(), value != null);
        ProjectKey projectKey = ProjectKey.newBuilder().setId(key.getProjectId()).build();
        EventConfigurationIdsWithDeletedState eventConfigurationIdsWithDeletedState =
            EventConfigurationIdsWithDeletedState.newBuilder().setEventConfigurations(eventConfigurationUpdates).build();
        return KeyValue.pair(projectKey, eventConfigurationIdsWithDeletedState);
    })
    .groupByKey()
    .reduce((aggValue, newValue) -> {
        Map<String, Boolean> newEventConfigurations = newValue.getEventConfigurations();
        Map<String, Boolean> aggEventConfigurations = aggValue.getEventConfigurations();
        Map.Entry<String, Boolean> newEntry = newEventConfigurations.entrySet().iterator().next();
        if (newEntry.getValue())
            aggEventConfigurations.putAll(newEventConfigurations);
        else
            aggEventConfigurations.remove(newEntry.getKey());
        if (aggEventConfigurations.size() == 0)
            return null;
        return aggValue;
    });
(with eventConfigurationStream being of type KStream<EventConfigurationKey, EventConfiguration>)
Another example that follows this pattern. Note there's a filter here too but that isn't always the case:
KTable<ProjectKey, NotificationSettingsTransition> globalNotificationSettingsPerProjectTable = notificationSettingTable.toStream()
    .filter((key, value) -> {
        return key.getEventConfigurationId() == null;
    })
    .map((key, value) -> {
        ProjectKey projectKey = ProjectKey.newBuilder().setId(key.getProjectId()).build();
        Map<String, NotificationSetting> notificationSettingsMap = new HashMap<>();
        notificationSettingsMap.put(getAsCompoundKeyString(key), value);
        NotificationSettingsTransition notificationSettingTransition = NotificationSettingsTransition
            .newBuilder()
            .setNotificationSettingCompoundKeyLastUpdate(getAsCompoundKey(key))
            .setNotificationSettingLastUpdate(value)
            .setEventConfigurationIds(new ArrayList<>())
            .setNotificationSettingsMap(notificationSettingsMap)
            .build();
        return KeyValue.pair(projectKey, notificationSettingTransition);
    })
    .groupByKey()
    .reduce((aggValue, newValue) -> {
        Map<String, NotificationSetting> notificationSettingMap = aggValue.getNotificationSettingsMap();
        String compoundKeyAsString = getAsString(newValue.getNotificationSettingCompoundKeyLastUpdate());
        if (newValue.getNotificationSettingLastUpdate() != null)
            notificationSettingMap.put(compoundKeyAsString, newValue.getNotificationSettingLastUpdate());
        else
            notificationSettingMap.remove(compoundKeyAsString);
        aggValue.setNotificationSettingCompoundKeyLastUpdate(newValue.getNotificationSettingCompoundKeyLastUpdate());
        aggValue.setNotificationSettingLastUpdate(newValue.getNotificationSettingLastUpdate());
        aggValue.setNotificationSettingsMap(notificationSettingMap);
        return aggValue;
    });
(with notificationSettingTable being of type KTable<NotificationSettingKey, NotificationSetting>, but immediately converted into a KStream as well)
How could I extract this into a function where I pass in a function for the map code and one for the reduce code, but do not have to repeat the .map().groupByKey().reduce() pattern? The return types differ, depend on the code in the map function, and should remain typed. Ideally in Java 8, but higher versions are possible. I think I have a good idea of how to do it when the inner types of the KeyValue pair within the map code wouldn't change, but not how to do it now.
You can parameterise your function to accept two generic functions, where the types will be inferred (or set explicitly if that is not possible) when the function is called.
For the input to map, you want a BiFunction<K, V, T>, and for reduce you want a BiFunction<U, U, U>, where:
K is the type of key in map's function.
V is the type of value in map's function.
T is the return type of map's function.
U is the type of the aggregator, values and return type of reduce's function.
Looking at KStream and KGroupedStream, you can get more detailed type information to constrain the functions further.
This would make your custom function something like this:
<K, V, T, U> U mapGroupReduce(final KStream<K, V> stream,
                              final BiFunction<K, V, T> mapper,
                              final BiFunction<U, U, U> reducer) {
    return stream.map(mapper).groupByKey().reduce(reducer);
}
You can then call it like so:
mapGroupReduce(yourStream,
        (key, value) -> new KeyValue<>(key, value),
        (acc, value) -> acc);
In your case, instead of using BiFunctions, you need to use:
KeyValueMapper<K, V, KeyValue<T, U>> for the mapper
Reducer<U> for the reducer.
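A hedged sketch of the same helper with those types; toProjectKey, toDeletedState and mergeStates are hypothetical helpers standing in for the lambda bodies from the first example in the question:

// Generic map -> groupByKey -> reduce helper. KR/VR are the key and value types
// produced by the mapper; reduce() then yields a KTable<KR, VR>.
static <K, V, KR, VR> KTable<KR, VR> mapGroupReduce(KStream<K, V> stream,
                                                    KeyValueMapper<K, V, KeyValue<KR, VR>> mapper,
                                                    Reducer<VR> reducer) {
    return stream.map(mapper).groupByKey().reduce(reducer);
}

// Usage with the question's first example (the helper methods are hypothetical):
KTable<ProjectKey, EventConfigurationIdsWithDeletedState> eventConfigurationsByProjectTable =
        mapGroupReduce(eventConfigurationStream,
                (key, value) -> KeyValue.pair(toProjectKey(key), toDeletedState(key, value)),
                (aggValue, newValue) -> mergeStates(aggValue, newValue));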
However, is this really all that much better than just writing stream.map(M).groupByKey().reduce(R) every time? The more verbose version is more explicit, and given the relative sizes of the mapper and reducer, you are not really saving on all that much.
I am using an accumulator within a fold function. I would like to change the value of the accumulator.
My function looks something like this:
public Tuple2<String, Long> fold(Tuple2<String, Long> acc, eventClass event)
{
    acc._1 = event.getUser();
    acc._2 += event.getOtherThing();
    return acc;
}
To me this should work, because all I am doing is changing the values of the accumulator. However, what I get is Cannot assign value to final variable _1. Same for _2. Why are these properties of acc final? How can I assign values to them?
Quick edit:
What I could do is just return a new Tuple instead, but in my opinion that is not a nice solution: return new Tuple2<String, Long>(event.getUser(), acc._2 + event.getOtherThing());
Solution for the Flink framework:
Use the Tuple2 defined in Flink. Import it using
import org.apache.flink.api.java.tuple.Tuple2;
and then use it with
acc.f0 = event.getUser();
acc.f1 += event.getByteDiff();
return acc;
I don't know what kind of Tuple2 you are using, but I assume it is a Scala Tuple2.
The Scala Tuple2 is immutable. You can't change the value of an immutable object; you must recreate it.
Why? The Scala Tuple2 is a functional programming data structure and, like all functional programming concepts, it tries to reduce side effects.
You can use the .copy function to recreate it as you want.
The following is an example:
@Test
public void test() {
    Tuple2<String, Long> tuple = new Tuple2<>("a", 1L);
    Tuple2<String, Long> actual = tuple.copy(tuple._1, tuple._2 + 1);
    Tuple2<String, Long> expected = new Tuple2<>("a", 2L);
    assertEquals(expected, actual);
}
I don't know which Tuple2 you are working with. How about returning a new object instead (the fields of the accumulator are final, so they cannot be assigned to):
Tuple2<String, Long> tuple = new Tuple2<>(event.getUser(), event.getOtherThing() + acc._2);
return tuple;
I would like to find a way to take the object-specific routine below and abstract it into a method to which you can pass a class, a list, and a field name to get back a Map.
A general pointer on the pattern to use, etc., would get me started in the right direction.
Map<String, Role> mapped_roles = new HashMap<String, Role>();
List<Role> p_roles = (List<Role>) c.list();
for (Role el : p_roles) {
    mapped_roles.put(el.getName(), el);
}
to this? (Pseudo code)
Map<String, ?> MapMe(Class clz, Collection list, String methodName)
Map<String, ?> map = new HashMap<String, ?>();
for (clz el : list) {
    map.put(el.methodName(), el);
}
is it possible?
Using Guava (formerly Google Collections):
Map<String,Role> mappedRoles = Maps.uniqueIndex(yourList, Functions.toStringFunction());
Or, if you want to supply your own method that makes a String out of the object:
Map<String, Role> mappedRoles = Maps.uniqueIndex(yourList, new Function<Role, String>() {
    public String apply(Role from) {
        return from.getName(); // or something else
    }
});
Here's what I would do. I am not entirely sure if I am handling generics right, but oh well:
public <T> Map<String, T> mapMe(Collection<T> list) {
    Map<String, T> map = new HashMap<String, T>();
    for (T el : list) {
        map.put(el.toString(), el);
    }
    return map;
}
Just pass a Collection to it, and have your classes implement toString() to return the name. Polymorphism will take care of it.
Java 8 streams and method references make this so easy you don't need a helper method for it.
Map<String, Foo> map = listOfFoos.stream()
.collect(Collectors.toMap(Foo::getName, Function.identity()));
If there may be duplicate keys, you can aggregate the values with the toMap overload that takes a value merge function, or you can use groupingBy to collect into a list:
//taken right from the Collectors javadoc
Map<Department, List<Employee>> byDept = employees.stream()
.collect(Collectors.groupingBy(Employee::getDepartment));
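And a brief sketch of the merge-function overload mentioned above, keeping the first value when two Foos share a name (Foo/getName as in the earlier snippet):

// toMap with a merge function: on duplicate keys, keep the first value encountered
Map<String, Foo> byName = listOfFoos.stream()
        .collect(Collectors.toMap(Foo::getName, Function.identity(), (first, second) -> first));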
As shown above, none of this is specific to String -- you can create an index on any type.
If you have a lot of objects to process and/or your indexing function is expensive, you can go parallel by using Collection.parallelStream() or stream().parallel() (they do the same thing). In that case you might use toConcurrentMap or groupingByConcurrent, as they allow the stream implementation to just blast elements into a ConcurrentMap instead of making separate maps for each thread and then merging them.
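For instance, a sketch of the concurrent variant, reusing the Employee/Department types from the javadoc example above:

// Parallel stream collected directly into a ConcurrentMap
ConcurrentMap<Department, List<Employee>> byDeptConcurrent = employees.parallelStream()
        .collect(Collectors.groupingByConcurrent(Employee::getDepartment));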
If you don't want to commit to Foo::getName (or any specific method) at the call site, you can use a Function passed in by a caller, stored in a field, etc. Whoever actually creates the Function can still take advantage of method reference or lambda syntax.
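For example, a small helper along these lines (indexBy is just an illustrative name, not a standard API):

// Generic indexing helper; the caller supplies the key-extraction Function.
// Throws on duplicate keys, just like the plain toMap above.
static <K, V> Map<K, V> indexBy(Collection<V> values, Function<? super V, ? extends K> keyFn) {
    return values.stream().collect(Collectors.toMap(keyFn, Function.identity()));
}

// The caller still gets to use a method reference
Map<String, Foo> foosByName = indexBy(listOfFoos, Foo::getName);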
Avoid reflection like the plague.
Unfortunately, Java's syntax for this is verbose. (A recent JDK7 proposal would make it much more concise.)
interface ToString<T> {
    String toString(T obj);
}

public static <T> Map<String, T> stringIndexOf(
    Iterable<T> things,
    ToString<T> toString
) {
    Map<String, T> map = new HashMap<String, T>();
    for (T thing : things) {
        map.put(toString.toString(thing), thing);
    }
    return map;
}
Currently call as:
Map<String, Thing> map = stringIndexOf(
    things,
    new ToString<Thing>() {
        public String toString(Thing thing) {
            return thing.getSomething();
        }
    });
In JDK7, it may be something like:
Map<String,Thing> map = stringIndexOf(
things,
{ thing -> thing.getSomething(); }
);
(Might need a yield in there.)
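For reference, with the lambda syntax that eventually shipped in Java 8, the same call to stringIndexOf can be written as either of these (Thing and getSomething() as in the example above):

// ToString<T> has a single abstract method, so a lambda or method reference fits it
Map<String, Thing> map = stringIndexOf(things, thing -> thing.getSomething());
Map<String, Thing> map2 = stringIndexOf(things, Thing::getSomething);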
Using reflection and generics:
public static <T> Map<String, T> MapMe(Class<T> clz, Collection<T> list, String methodName)
        throws Exception {
    Map<String, T> map = new HashMap<String, T>();
    Method method = clz.getMethod(methodName);
    for (T el : list) {
        map.put((String) method.invoke(el), el);
    }
    return map;
}
In your documentation, make sure you mention that the return type of the method must be a String. Otherwise, it will throw a ClassCastException when it tries to cast the return value.
If you're sure that each object in the List will have a unique index, use Guava with Jorn's suggestion of Maps.uniqueIndex.
If, on the other hand, more than one object may have the same value for the index field (which, while perhaps not true for your specific example, is true in many use cases for this sort of thing), the more general way to do this indexing is to use Multimaps.index(Iterable<V> values, Function<? super V,K> keyFunction) to create an ImmutableListMultimap<K,V> that maps each key to one or more matching values.
Here's an example that uses a custom Function that creates an index on a specific property of an object:
List<Foo> foos = ...
ImmutableListMultimap<String, Foo> index = Multimaps.index(foos,
    new Function<Foo, String>() {
        public String apply(Foo input) {
            return input.getBar();
        }
    });

// iterate over all Foos that have "baz" as their Bar property
for (Foo foo : index.get("baz")) { ... }