Java nested Map to Scala nested sequence

I'm new to Scala and our project mixes Java and Scala code together (using the Play Framework). I'm trying to write a Scala method that can take a nested Java Map such as:
LinkedHashMap<String, LinkedHashMap<String, String>> groupingA = new LinkedHashMap<String, LinkedHashMap<String,String>>();
And have that Java object passed to a Scala function that can loop through it. I have the following Scala type to try to support the Java nested map above:
Seq[(String, Seq[(String,String)])]
Both the Java file and the Scala file compile fine individually, but when my Java code tries to create a new instance of my Scala class and pass in the nested map, I get a compiler error with the following details:
[error] ..... overloaded method value apply with alternatives:
[error] (options: java.util.List[String])scala.collection.mutable.Buffer[(String, String)] <and>
[error] (options: scala.collection.immutable.List[String])List[(String, String)] <and>
[error] (options: java.util.Map[String,String])Seq[(String, String)] <and>
[error] (options: scala.collection.immutable.Map[String,String])Seq[(String, String)] <and>
[error] (options: (String, String)*)Seq[(String, String)]
[error] cannot be applied to (java.util.LinkedHashMap[java.lang.String,java.util.LinkedHashMap[java.lang.String,java.lang.String]])
Any ideas on how I can pass a nested Java LinkedHashMap like the one above into a Scala method and generically iterate over the nested collection? I'm trying to write this generically enough that it would also work for a nested Scala collection, in case we ever switch to writing our Play Framework controllers in Scala instead of Java.

Seq is a base trait defined in the Scala collections hierarchy. While Java and Scala offer bytecode compatibility, Scala defines a number of its own types, including its own collection library. The rub is that if you want to write idiomatic Scala, you need to convert your Java data to Scala data. The way I see it, you have a few options.
You can use Richard's solution and convert the Java types to Scala types in your Scala code. I think this is ugly because it assumes your input will always be coming from Java land.
You can write a beautiful, perfect Scala handler and provide a companion object that offers the ugly Java conversion behavior (see the sketch after the code below). This disentangles your Scala implementation from the Java details.
Or you could write an implicit def like the one below, genericizing it to your heart's content.
import java.util.LinkedHashMap
import scala.collection.JavaConversions.mapAsScalaMap

object App {
  implicit def wrapLhm[K, V, G](i: LinkedHashMap[K, LinkedHashMap[G, V]]): LHMWrapper[K, V, G] =
    new LHMWrapper[K, V, G](i)

  def main(args: Array[String]) {
    println("Hello World!")
    val lhm = new LinkedHashMap[String, LinkedHashMap[String, String]]()
    val inner = new LinkedHashMap[String, String]()
    inner.put("one", "one")
    lhm.put("outer", inner)
    val s = lhm.getSeq()
    println(s.toString())
  }

  class LHMWrapper[K, V, G](value: LinkedHashMap[K, LinkedHashMap[G, V]]) {
    def getSeq(): Seq[(K, Seq[(G, V)])] =
      mapAsScalaMap(value).mapValues(mapAsScalaMap(_).toSeq).toSeq
  }
}
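For the companion-object approach (option two), a minimal sketch might look like this. The Handler class and its field are invented for illustration; the point is that the conversion lives in the companion, so the class itself stays purely Scala:

import java.util.{LinkedHashMap => JLinkedHashMap}
import scala.collection.JavaConversions.mapAsScalaMap

// Hypothetical handler: the Scala side only ever sees Scala collections.
class Handler(val groupings: Seq[(String, Seq[(String, String)])])

object Handler {
  // Java callers go through this factory; Scala callers use the constructor directly.
  def fromJava(m: JLinkedHashMap[String, JLinkedHashMap[String, String]]): Handler =
    new Handler(mapAsScalaMap(m).mapValues(mapAsScalaMap(_).toSeq).toSeq)
}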

Try this:
import scala.collection.JavaConversions.mapAsScalaMap
val lhm: LinkedHashMap[String, LinkedHashMap[String, String]] = getLHM()
val scalaMap = mapAsScalaMap(lhm).mapValues(mapAsScalaMap(_).toSeq).toSeq
I tested this, and got a result of type Seq[(String, Seq[(String, String)])].
(The conversions wrap the original Java object rather than copying the values into a new Scala collection. So the conversions to Seq aren't strictly necessary; you could leave it as a Map, and the iteration order will be the same.)
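For instance, if you only need to walk the structure, you can iterate over the wrappers directly; insertion order is preserved because the underlying maps are still LinkedHashMaps:

import scala.collection.JavaConversions.mapAsScalaMap

// lhm is the nested java.util.LinkedHashMap from above.
for ((outerKey, innerJava) <- mapAsScalaMap(lhm);
     (innerKey, value) <- mapAsScalaMap(innerJava))
  println(s"$outerKey / $innerKey -> $value")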
Let me guess, are you processing query parameters?

Related

Jython - Scala array/list not JSON serializable within Python script

I have a Scala class that wraps an Avro record with getters and setters. I'm using Jython to allow users to write Python scripts that process the Avro record and ultimately do a json.dumps on the new processed record.
The issue is, if the user wants to grab a value that is an Array from the record, the interpreter complains that the object is not JSON serializable.
import json
json.dumps(<AClass>.getArray('myArray'))
AClass is made available to any given Python script at run time. The Scala AClass:
class AClass {
  def getArray(fieldName: String): Array[Integer] = {
    val value: GenericData.Array[Integer] = [....]
    value
      .asInstanceOf[GenericData.Array[Integer]]
      .asScala
      .toArray[Integer]
  }
}
I've tried a few other return types: 1) List[Integer], 2) mutable.Buffer[Integer], and 3) the plain Avro generic array, GenericData.Array[T]. All give the same serialization error, with slightly varying objects:
Runtime exception occurred during Python processing. TypeError: List(1, 2, 3) is not JSON serializable.
... Buffer(1, 2, 3) is not JSON serializable
... [1, 2, 3] is not JSON serializable.
Now it seems that if we convert it to a list() from within the Python script, it works fine. This gave me some leads, but I need it to happen at the Scala level.
import json
json.dumps(list(<AClass>.getArray('myArray')))
Is there any way to achieve this? What Scala/Java list type would translate directly into the Python list type and/or be JSON serializable within the Jython interpreter?
The json module in Jython only accepts standard Python data types (that's why converting with list() works). See: https://docs.python.org/3/library/json.html#py-to-json-table
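One possible workaround at the Scala level (my sketch, not from the original answer) is to hand Jython a plain string, so the json module never sees a Java or Scala collection. The lookup helper below is a stand-in for the real Avro field access:

class AClass {
  // Stand-in for the real Avro field lookup; only the return type matters here.
  private def lookup(fieldName: String): Seq[Integer] = Seq(1, 2, 3).map(Int.box)

  // Serialize on the Scala side; Jython receives a java.lang.String,
  // which it can embed or parse with its own json module.
  def getArrayAsJson(fieldName: String): String =
    lookup(fieldName).mkString("[", ",", "]")
}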

How to convert a Scala Iterable into Java util.List?

I have written a method in Scala that uses a method written in Java - the processSale() method takes util.List<Sale> as a parameter.
But after groupByKey() I'm getting an RDD[(String, Iterable[Sale])]. I've tried to import scala.collection.JavaConverters._ and do SaleParser.processSale(a.asJava).
However, that gives me a Java Iterable[Sale], not a util.List. How is it possible to convert it into a Java util.List?
val parseSales: RDD[(String, Sale)] = rawSales
  .map(sale => sale.Id -> sale)
  .groupByKey()
  .mapValues(a => SaleParser.processSale(???))
a.toSeq.asJava
Note that if this Iterable is actually a Seq, toSeq just returns the same object.
See the API doc for the complete list of conversions.
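Applied to the original snippet, the fix would look roughly like this (assuming processSale returns whatever value you want in the resulting RDD):

import scala.collection.JavaConverters._

val parseSales = rawSales
  .map(sale => sale.Id -> sale)
  .groupByKey()
  .mapValues(a => SaleParser.processSale(a.toSeq.asJava))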

Convert Java to Scala code - change of method signatures

Trying to convert some Java code to Scala, I face the problem of a different method signature, even though the Java version compiled fine:
The following code in java (from https://github.com/DataSystemsLab/GeoSpark/blob/master/babylon/src/main/java/org/datasyslab/babylon/showcase/Example.java#L122-L126)
visualizationOperator = new ScatterPlot(1000,600,USMainLandBoundary,false,-1,-1,true,true);
visualizationOperator.CustomizeColor(255, 255, 255, 255, Color.GREEN, true);
visualizationOperator.Visualize(sparkContext, spatialRDD);
imageGenerator = new SparkImageGenerator();
imageGenerator.SaveAsFile(visualizationOperator.distributedVectorImage, "file://"+outputPath,ImageType.SVG);
Is translated to https://github.com/geoHeil/geoSparkScalaSample/blob/master/src/main/scala/myOrg/visualization/Vis.scala#L45-L57
val vDistributedVector = new ScatterPlot(1000, 600, USMainLandBoundary, false, -1, -1, true, true)
vDistributedVector.CustomizeColor(255, 255, 255, 255, Color.GREEN, true)
vDistributedVector.Visualize(s, spatialRDD)
sparkImageGenerator.SaveAsFile(vDistributedVector.distributedVectorImage, outputPath + "distributedVector", ImageType.SVG)
Which will throw the following error:
overloaded method value SaveAsFile with alternatives:
[error] (x$1: java.util.List[String],x$2: String,x$3: org.datasyslab.babylon.utils.ImageType)Boolean <and>
[error] (x$1: java.awt.image.BufferedImage,x$2: String,x$3: org.datasyslab.babylon.utils.ImageType)Boolean <and>
[error] (x$1: org.apache.spark.api.java.JavaPairRDD,x$2: String,x$3: org.datasyslab.babylon.utils.ImageType)Boolean
[error] cannot be applied to (org.apache.spark.api.java.JavaPairRDD[Integer,String], String, org.datasyslab.babylon.utils.ImageType)
[error] sparkImageGenerator.SaveAsFile(vDistributedVector.distributedVectorImage, outputPath + "distributedVector", ImageType.SVG)
Unfortunately, I am not really sure how to fix this / how to properly call the method in scala.
This is a problem in ImageGenerator, inherited by SparkImageGenerator. As you can see here, it has a method
public boolean SaveAsFile(JavaPairRDD distributedImage, String outputPath, ImageType imageType)
which uses a raw type (JavaPairRDD without <...>). They exist primarily for compatibility with pre-Java 5 code and shouldn't normally be used otherwise. For this code, there is certainly no good reason, as it actually expects specific type parameters. Using raw types merely loses type-safety. Maybe some subclasses (current or potential) might override it and expect different type parameters, but this would be a misuse of inheritance and there must be a better solution.
Scala doesn't support raw types in any way, so you can't call this method from it (AFAIK). As a workaround, you could write a wrapper in Java that uses correct types and call this wrapper from Scala. (Correction: I misremembered - it's extending Java classes that extend raw types which is impossible, and even then there are workarounds.)
You might be able to call it by explicit type ascription (preferable to casting):
sparkImageGenerator.SaveAsFile(
  (vDistributedVector.distributedVectorImage: JavaPairRDD[_, _]),
  outputPath + "distributedVector", ImageType.SVG)
But given the error message shows just JavaPairRDD, I don't particularly expect it to work. If this fails, I'd still go with a Java wrapper.
The accepted answer is correct in saying that raw types should be avoided. However Scala can interoperate with Java code that has raw types. Scala interprets the raw type java.util.List as the existential type java.util.List[_].
Take for example this Java code:
// Test.java
import java.util.Map;

public class Test {
    public boolean foo(Map map, String s) {
        return true;
    }
}
Then try to call it from Scala:
Welcome to Scala 2.12.1 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_131).
Type in expressions for evaluation. Or try :help.
scala> import java.util.{Map,HashMap}
import java.util.{Map,HashMap}
scala> new Test().foo(new HashMap[String,Integer], "a")
res0: Boolean = true
scala> val h: Map[_,_] = new HashMap[String,Integer]
h: java.util.Map[_, _] = {}
scala> new Test().foo(h, "a")
res1: Boolean = true
So it looks like there must be some other problem.

How To Convert Scala Case Class to Java HashMap

I'm using Mule ESB (Java based) and I have some Scala components that modify and create data. My data is represented in case classes. I'm trying to convert them to Java; however, just getting them to convert to Scala types is a challenge. Here's a simplified example of what I'm trying to do:
package com.echostar.ese.experiment

import scala.collection.JavaConverters

case class Resource(guid: String, filename: String)
case class Blackboard(name: String, guid: String, resource: Resource)

object CCC extends App {
  val res = Resource("4alskckd", "test.file")
  val bb = Blackboard("Test", "123asdfs", res)
  val myMap = getCCParams(bb)
  val result = new java.util.HashMap[String, Object](myMap)
  println("Result:" + result)

  def getCCParams(cc: AnyRef) =
    (Map[String, Any]() /: cc.getClass.getDeclaredFields) { (a, f) =>
      f.setAccessible(true)
      val value = f.get(cc) match {
        // this covers tuples as well as case classes, so there may be a more specific way
        case caseClassInstance: Product => getCCParams(caseClassInstance): Map[String, Any]
        case x => x
      }
      a + (f.getName -> value)
    }
}
Current error: "recursive method needs return type".
My Scala foo isn't very strong. I grabbed this method from another answer here, and I basically know what it's doing, but not enough to change it to use java.util.HashMap and java.util.List.
Expected Output:
Result:{"name"="Test", "guid"="123asdfs", "resource"= {"guid"="4alskckd", "filename"="test.file"}}
UPDATE 1:
1. Added getCCParams(caseClassInstance): Map[String, Any] to line 22 above, per @cem-catikkas. The IDE syntax error still says "recursive method ... needs result type" and "overloaded method java.util.HashMap cannot be applied to scala.collection.immutable.Map".
2. Changed to java.util.HashMap[String, Object].
You should follow what the error tells you. Since getCCParams is a recursive method, you need to declare its return type.
def getCCParams(cc: AnyRef): Map[String, Any]
Answering this in case anyone else going through the issue ends up here (as happened to me).
I believe the error you were getting had to do with the fact that the return type was being declared at the method invocation (line 22), whereas the compiler was expecting it at the method's declaration (in your case, line 17). The following seems to have worked:
def getCCParams(cc: AnyRef): Map[String, Any] = ...
Regarding the conversion from a Scala Map to a Java HashMap: by adding the ._ wildcard to the JavaConverters import statement, you import all the methods of the object as single identifiers, which is a requirement for implicit conversions. This includes the asJava method, which can be used to convert the Scala Map to a Java one; that result can then be passed to the java.util.HashMap(Map<? extends K, ? extends V> m) constructor to instantiate a HashMap:
import scala.collection.JavaConverters._
import java.util.{HashMap => JHashMap}
...
val myMap = getCCParams(bb)
val r = myMap.asJava // converting to java.util.Map[String, Any]
val result: JHashMap[String,Any] = new JHashMap(r)
I wonder if you've considered going at it the other way around, by implementing the java.util.Map interface in your case class? Then you wouldn't have to convert back and forth, and any downstream consumers that use the Map interface will just work (for example, if you're using Groovy's field dot-notation).
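A rough sketch of that idea, assuming extending java.util.AbstractMap (which only requires entrySet) is acceptable; note that the generated case-class equals/hashCode will take precedence over AbstractMap's:

import java.util.{AbstractMap, LinkedHashSet}
import java.util.Map.Entry

case class Resource(guid: String, filename: String)
    extends AbstractMap[String, Object] {
  // LinkedHashSet keeps the field order stable when iterating.
  override def entrySet(): java.util.Set[Entry[String, Object]] = {
    val entries = new LinkedHashSet[Entry[String, Object]]()
    entries.add(new AbstractMap.SimpleImmutableEntry[String, Object]("guid", guid))
    entries.add(new AbstractMap.SimpleImmutableEntry[String, Object]("filename", filename))
    entries
  }
}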

Get Java reflection representation of Scala type

This seems like a simple question, but it's very challenging to search for, so I'm asking a new question. My apologies if it's already been asked.
Due to the compiler bug described in "Scala 2.11.5 compiler crash with type aliases and manifests" (also filed as https://issues.scala-lang.org/browse/SI-9155), I need to use Scala TypeTags and friends to discover the type parameters of methods. However, I then need to use that type information with a Java library that relies on java.lang.Class and java.lang.reflect.Type.
How can I convert a scala.reflect.runtime.universe Type into a java.lang.reflect.Type or java.lang.Class?
Put concretely, how would I fill out the body of this method:
def typeFor[T](implicit tag: TypeTag[T]): java.lang.reflect.Type = ...
or, if that's not possible:
def typeFor[T](implicit tag: TypeTag[T]): java.lang.Class[T] = ...
And note, due to the bug posted above, I cannot use scala.reflect.Manifest.
The short answer is no, but you can try to do something similar to this SO question. However, there is an open ticket....
This may have some limitations I'm not aware of, but you could drop down to Java reflection and try something like:
import scala.reflect.ClassTag
import scala.reflect.runtime.universe._
import scala.util.control.Exception._

def typeMe[T](implicit t: TypeTag[T]) =
  catching(classOf[Exception]) opt Class.forName(t.tpe.typeSymbol.asClass.fullName)

println(typeMe[String])
println(typeMe[ClassTag[_]])
Results in:
Some(class java.lang.String)
Some(interface scala.reflect.ClassTag)
The way I solved it with Manifests was:
import java.lang.reflect.{ParameterizedType, Type}

private def typeFromManifest(m: Manifest[_]): Type = {
  if (m.typeArguments.isEmpty) m.runtimeClass
  else new ParameterizedType {
    def getRawType = m.runtimeClass
    def getActualTypeArguments = m.typeArguments.map(typeFromManifest).toArray
    def getOwnerType = null
  }
}
Right now I'm trying to solve this using something other than Manifest, since Manifest is supposed to be removed from the Scala runtime.
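For completeness, a hedged sketch of the same shape using a TypeTag instead of a Manifest (the mirror-based class lookup is my assumption, not part of the original answer):

import java.lang.reflect.{ParameterizedType, Type => JType}
import scala.reflect.runtime.universe._

def typeFor[T](implicit tag: TypeTag[T]): JType = {
  def convert(t: Type): JType = {
    val raw = tag.mirror.runtimeClass(t.typeSymbol.asClass)
    if (t.typeArgs.isEmpty) raw
    else new ParameterizedType {
      def getRawType: JType = raw
      def getActualTypeArguments: Array[JType] = t.typeArgs.map(convert).toArray
      def getOwnerType: JType = null
    }
  }
  convert(tag.tpe)
}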
