Is there a way I can explicitly cast one Java object to another Java class from JRuby?
Sometimes I want to be able to invoke SomeJavaClass#aMethod(MySuperClass) rather than SomeJavaClass#aMethod(MyClass) from JRuby.
From Java, I'd do this:
someJavaObject.aMethod( (MySuperClass) myObj );
but I didn't see a #cast Ruby method or anything similar to do the equivalent from JRuby.
Note that the question Casting Java Objects From JRuby lacks an answer for the general case, which is why I'm re-asking the question.
You need to use either the #java_send or the #java_alias feature (available since JRuby 1.4) to select the overload you wish to call. Example:
class Java::JavaUtil::Arrays
  boolean_array_class = [false].to_java(:boolean).java_class
  java_alias :boolean_equals, :equals, [boolean_array_class, boolean_array_class]
end
a1 = [false, true]
Java::JavaUtil::Arrays.boolean_equals a1, a1
# => TypeError: for method Arrays.equals expected [class [Z, class [Z]; got: [org.jruby.RubyArray,org.jruby.RubyArray]; error: argument type mismatch
Java::JavaUtil::Arrays.boolean_equals a1.to_java(:boolean), a1.to_java(:boolean)
# => true
a2 = [true, false]
Java::JavaUtil::Arrays.boolean_equals a1.to_java(:boolean), a2.to_java(:boolean)
# => false
I wrote a method to parse metric data and at first hit a problem with the type of transactionMap, which is a java.util.Map. I solved that using JavaConverters:
def parseMetrics(metric: Metric) = {
  import scala.collection.JavaConverters._
  metric.transactionMap.asScala.values.map {
    case false => "N"
    case true => "Y"
  }.toList
}
But after that I got an error while pattern matching on the true and false values: pattern type is incompatible with expected type; found: Boolean, required: java.lang.Boolean.
As far as I understand Scala does not chain two implicit conversions. Is there a way to fix it using JavaConverters?
The other answer provides a reasonable way to solve this problem, but doesn't show why you're running into it or how the approach it proposes works.
The Scala standard library does provide an implicit conversion from java.lang.Boolean to scala.Boolean, which you can see in action by using reify in a REPL to desugar some code that uses a Java boolean in a context where a Scala boolean is expected:
scala> val x: java.lang.Boolean = true
x: Boolean = true
scala> import scala.reflect.runtime.universe.reify
import scala.reflect.runtime.universe.reify
scala> reify(if (x) 1 else 0)
res0: reflect.runtime.universe.Expr[Int] =
Expr[Int](if (Predef.Boolean2boolean($read.x))
1
else
0)
The problem is that simply trying to match a java.lang.Boolean value against true or false isn't sufficient to trigger the conversion. You can check this by defining your own types where you can be sure you know exactly what conversions are in play:
scala> case class Foo(i: Int); case class Bar(i: Int)
defined class Foo
defined class Bar
scala> implicit def foo2bar(foo: Foo): Bar = Bar(foo.i)
foo2bar: (foo: Foo)Bar
scala> Foo(100) match { case Bar(x) => x }
<console>:17: error: constructor cannot be instantiated to expected type;
found : Bar
required: Foo
Foo(100) match { case Bar(x) => x }
^
This is a language design decision. It would probably be possible to have the implicit conversions applied in these scenarios, but there's also probably a good reason that they aren't (off the top of my head I'm not familiar with any relevant discussions or issues, but that doesn't mean they don't exist).
The reason Andy's solution works is that the java.lang.Boolean is in a position where the compiler expects a scala.Boolean (a condition) and is willing to apply the Predef.Boolean2boolean conversion. You could do this manually if you really wanted to:
def parseMetrics(metric: Metric) = {
  import scala.collection.JavaConverters._
  metric.transactionMap.asScala.values.map(Predef.Boolean2boolean).map {
    case false => "N"
    case true => "Y"
  }.toList
}
…but to my eye at least pattern matching on Boolean is a little clunkier than using a conditional.
Use if/else rather than a match statement for Boolean checking:
def parseMetrics(metric: Metric) = {
  import scala.collection.JavaConverters._
  metric.transactionMap.asScala.values.map {
    x => if (x) "Y" else "N"
  }.toList
}
My suspicion is that within the if expression the java.lang.Boolean (which I presume x is here) can be coerced to a Scala Boolean, but the match statement doesn't perform the same coercion; the conversion would have to be made explicit (or the match would have to be on the java.lang.Boolean values themselves).
I am trying to use Elasticsearch's Java API.
I am trying to create a RestClientBuilder.
Host=createObject("java", "org.apache.http.HttpHost").init(variables.HostName, variables.Port);
Node=createObject("java", "org.elasticsearch.client.Node").init(Host);
RestClient=createObject("java", "org.elasticsearch.client.RestClient").builder(Javacast("org.elasticsearch.client.Node[]", [Node])).build();
I get the error
Cannot convert the value to Java array because type org.elasticsearch.client.Node is unknown.
Also if I just try to use:
RestClient=createObject("java", "org.elasticsearch.client.RestClient").builder(Javacast("org.apache.http.HttpHost[]", [Host]));
I get the following error
Either there are no methods with the specified method name and
argument types or the builder method is overloaded with argument types
that ColdFusion cannot decipher reliably. ColdFusion found 0 methods
that match the provided arguments. If this is a Java object and you
verified that the method exists, use the javacast function to reduce
ambiguity.
This, I assume, is because ColdFusion doesn't play nicely with varargs.
I found a workaround using the approach described here:
https://www.bennadel.com/blog/1980-tojava---a-coldfusion-user-defined-function-for-complex-java-casting.htm
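For reference, here is a plain-Java sketch of what that UDF does, building the typed array through java.lang.reflect.Array instead of javacast. The host name, port, and the RestClient call are illustrative, and it assumes the Elasticsearch low-level REST client and HttpCore jars are on the classpath:

import java.lang.reflect.Array;

import org.apache.http.HttpHost;
import org.elasticsearch.client.RestClient;

public class HostArrayDemo {
    public static void main(String[] args) throws Exception {
        // Example host/port; substitute your own values.
        HttpHost host = new HttpHost("localhost", 9200);

        // Build a typed HttpHost[] reflectively, element by element,
        // the same way the UDF does.
        HttpHost[] hosts = (HttpHost[]) Array.newInstance(HttpHost.class, 1);
        Array.set(hosts, 0, host);

        // A properly typed array satisfies the builder(HttpHost...) vararg.
        RestClient restClient = RestClient.builder(hosts).build();
        restClient.close();
    }
}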
I believe there is a bug with Javacast and javaSettings loadPaths not being used.
coldfusion.runtime.Cast$UnknownTypeException: Cannot convert the value
to Java array because type org.elasticsearch.client.Node is unknown.
at coldfusion.runtime.Cast.toJavaArray(Cast.java:1602)
Additionally, if I try to perform the actions that the UDF takes:
local.javaClass = createObject("java", "org.apache.http.HttpHost");
local.HostArrayReflect = createObject("java", "java.lang.reflect.Array");
local.HostArray = local.HostArrayReflect.newInstance(
    local.javaClass.GetClass()
    , JavaCast("int", ArrayLen(local.Hosts))
);
for (i = 0; i LT ArrayLen(local.Hosts); i = i + 1) {
    local.HostArrayReflect.Set(local.HostArray, JavaCast("int", i), local.Hosts[i]);
}
I get the error
An exception occurred while instantiating a Java object. The class
must not be an interface or an abstract class. If the class has a
constructor that accepts an argument, you must call the constructor
explicitly using the init(args) method. Error :
org.apache.http.HttpHost
java.lang.NoSuchMethodException: org.apache.http.HttpHost.<init>() at
java.lang.Class.getConstructor0(Class.java:3082) at
java.lang.Class.newInstance(Class.java:412) at
coldfusion.runtime.java.JavaProxy.createObjectWithDefaultConstructor(JavaProxy.java:209)
at coldfusion.runtime.java.JavaProxy.invoke(JavaProxy.java:92)
This happens when I try to run getClass(), but in the UDF there is no issue. A coworker tried to run this on Lucee and it seems to have worked, so I believe there is a bug in CF related to this.
While converting some Java code to Scala, I ran into a problem with a method signature that compiled fine in the Java world.
The following Java code (from https://github.com/DataSystemsLab/GeoSpark/blob/master/babylon/src/main/java/org/datasyslab/babylon/showcase/Example.java#L122-L126)
visualizationOperator = new ScatterPlot(1000,600,USMainLandBoundary,false,-1,-1,true,true);
visualizationOperator.CustomizeColor(255, 255, 255, 255, Color.GREEN, true);
visualizationOperator.Visualize(sparkContext, spatialRDD);
imageGenerator = new SparkImageGenerator();
imageGenerator.SaveAsFile(visualizationOperator.distributedVectorImage, "file://"+outputPath,ImageType.SVG);
Is translated to https://github.com/geoHeil/geoSparkScalaSample/blob/master/src/main/scala/myOrg/visualization/Vis.scala#L45-L57
val vDistributedVector = new ScatterPlot(1000, 600, USMainLandBoundary, false, -1, -1, true, true)
vDistributedVector.CustomizeColor(255, 255, 255, 255, Color.GREEN, true)
vDistributedVector.Visualize(s, spatialRDD)
sparkImageGenerator.SaveAsFile(vDistributedVector.distributedVectorImage, outputPath + "distributedVector", ImageType.SVG)
Which will throw the following error:
overloaded method value SaveAsFile with alternatives:
[error] (x$1: java.util.List[String],x$2: String,x$3: org.datasyslab.babylon.utils.ImageType)Boolean <and>
[error] (x$1: java.awt.image.BufferedImage,x$2: String,x$3: org.datasyslab.babylon.utils.ImageType)Boolean <and>
[error] (x$1: org.apache.spark.api.java.JavaPairRDD,x$2: String,x$3: org.datasyslab.babylon.utils.ImageType)Boolean
[error] cannot be applied to (org.apache.spark.api.java.JavaPairRDD[Integer,String], String, org.datasyslab.babylon.utils.ImageType)
[error] sparkImageGenerator.SaveAsFile(vDistributedVector.distributedVectorImage, outputPath + "distributedVector", ImageType.SVG)
Unfortunately, I am not really sure how to fix this / how to properly call the method in Scala.
This is a problem in ImageGenerator, inherited by SparkImageGenerator. As you can see here, it has a method
public boolean SaveAsFile(JavaPairRDD distributedImage, String outputPath, ImageType imageType)
which uses a raw type (JavaPairRDD without <...>). They exist primarily for compatibility with pre-Java 5 code and shouldn't normally be used otherwise. For this code, there is certainly no good reason, as it actually expects specific type parameters. Using raw types merely loses type-safety. Maybe some subclasses (current or potential) might override it and expect different type parameters, but this would be a misuse of inheritance and there must be a better solution.
Scala doesn't support raw types in any way, so you can't call this method from it (AFAIK). As a workaround, you could write a wrapper in Java that uses correct types and call that wrapper from Scala. (Edit: I misremembered; it's extending Java classes that extend raw types that was impossible, and even then there are workarounds.)
You might be able to call it by explicit type ascription (preferable to casting):
sparkImageGenerator.SaveAsFile(
  (vDistributedVector.distributedVectorImage: JavaPairRDD[_, _]),
  outputPath + "distributedVector", ImageType.SVG)
But given the error message shows just JavaPairRDD, I don't particularly expect it to work. If this fails, I'd still go with a Java wrapper.
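If you do go the wrapper route, a minimal sketch might look like the following. The adapter class is mine, the ImageGenerator import path is an assumption based on the babylon sources linked above, and only the SaveAsFile signature is taken from the question:

import org.apache.spark.api.java.JavaPairRDD;
import org.datasyslab.babylon.core.ImageGenerator; // assumed package; adjust to the actual one
import org.datasyslab.babylon.utils.ImageType;

// Hypothetical adapter: accepts a properly parameterized RDD and forwards
// it to the raw-typed SaveAsFile, which Java permits without complaint.
public class RawSaveAdapter {
    public static boolean saveAsFile(ImageGenerator generator,
                                     JavaPairRDD<Integer, String> distributedImage,
                                     String outputPath,
                                     ImageType imageType) {
        return generator.SaveAsFile(distributedImage, outputPath, imageType);
    }
}

From Scala you would then call RawSaveAdapter.saveAsFile(sparkImageGenerator, vDistributedVector.distributedVectorImage, outputPath + "distributedVector", ImageType.SVG).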
The accepted answer is correct in saying that raw types should be avoided. However Scala can interoperate with Java code that has raw types. Scala interprets the raw type java.util.List as the existential type java.util.List[_].
Take for example this Java code:
// Test.java
import java.util.Map;

public class Test {
    public boolean foo(Map map, String s) {
        return true;
    }
}
Then try to call it from Scala:
Welcome to Scala 2.12.1 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_131).
Type in expressions for evaluation. Or try :help.
scala> import java.util.{Map,HashMap}
import java.util.{Map,HashMap}
scala> new Test().foo(new HashMap[String,Integer], "a")
res0: Boolean = true
scala> val h: Map[_,_] = new HashMap[String,Integer]
h: java.util.Map[_, _] = {}
scala> new Test().foo(h, "a")
res1: Boolean = true
So it looks like there must be some other problem.
Executing the following code in Java 7:
ScriptEngine scriptEngine = new ScriptEngineManager().getEngineByName("js");
Bindings b = scriptEngine.createBindings();
b.put("x", true);
scriptEngine.eval("x&y", b);
I get the error
sun.org.mozilla.javascript.internal.EcmaError: ReferenceError: "y" is not defined. (<Unknown Source>#1) in <Unknown Source> at line number 1
Is there an option to evaluate to null/false for undefined objects, like in JavaScript?
I know that one option would be to do something like "this.x&this.y" instead of "x&y", but I don't have control over that string (it is user-entered).
I browsed a little bit through the Rhino code and it seems that there's no such option.
In the end I will append "this." in front of each variable. This is far from a desirable solution (I will not even accept my own answer :) ), but for the time being I have no other.
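For what it's worth, a rough sketch of that rewrite (the class name is mine and the regex is deliberately naive; it only copes with bare identifiers):

import javax.script.Bindings;
import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;
import javax.script.ScriptException;

public class UndefinedVars {
    public static void main(String[] args) throws ScriptException {
        ScriptEngine scriptEngine = new ScriptEngineManager().getEngineByName("js");
        Bindings b = scriptEngine.createBindings();
        b.put("x", true);

        // Qualify each bare identifier with "this." so unbound names
        // evaluate to undefined instead of throwing ReferenceError.
        // This placeholder regex will mangle anything more complex
        // than simple variable names.
        String expr = "x&y";
        String rewritten = expr.replaceAll("([a-zA-Z_][a-zA-Z0-9_]*)", "this.$1");

        // true & undefined is 1 & 0 under JavaScript's bitwise coercion,
        // so this evaluates to 0 rather than throwing.
        System.out.println(scriptEngine.eval(rewritten, b));
    }
}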
So I'm trying to explore Clojure's internals and I've come across something I'm not quite sure I understand:
From the REPL, I can access RT.var("clojure.core","require") just fine (this is supposed to return the var associated with the "require" symbol in the "clojure.core" namespace):
user=> (clojure.lang.RT/var "clojure.core" "require")
#'clojure.core/require
However, if I try to access it in what I thought was the same way:
user=> (clojure.lang.Var/intern (clojure.lang.Namespace/findOrCreate (clojure.lang.Symbol/intern nil "clojure.main")) (clojure.lang.Symbol/intern nil "require"))
java.lang.IllegalStateException: require already refers to: #'clojure.core/require in namespace: clojure.main (NO_SOURCE_FILE:0)
I get an error that require already refers to something that exists. This is very strange because RT.var is the same as Var.intern, except with the arguments converted to a Namespace and Symbol respectively.
static public Var var(String ns, String name){
    return Var.intern(Namespace.findOrCreate(Symbol.intern(null, ns)), Symbol.intern(null, name));
}
I'll do some more digging, but I'm pretty stumped on this one. I've already checked:
1. nil is the same as null
2. I created var2, which returns the namespace argument sent to Var.intern, and var3, which returns the name argument sent to Var.intern. I then pass those two to Var.intern:
user=> (clojure.lang.Var/intern
(clojure.lang.RT/var2 "clojure.main" "require")
(clojure.lang.RT/var3 "clojure.main" "require"))
java.lang.IllegalStateException: require already refers to: #'clojure.core/require in namespace: clojure.main (NO_SOURCE_FILE:0)
Could this be a bug?
This works fine:
(clojure.lang.Var/intern
  (clojure.lang.Namespace/findOrCreate
    (clojure.lang.Symbol/create "clojure.core"))
  (clojure.lang.Symbol/create "require"))
Symbol/intern works also:
(clojure.lang.Var/intern
  (clojure.lang.Namespace/findOrCreate
    (clojure.lang.Symbol/intern nil "clojure.core"))
  (clojure.lang.Symbol/intern nil "require"))
The REPL runs in clojure.main, so from the REPL we cannot intern clojure.main/require (that namespace already refers to clojure.core/require); we can only intern clojure.core/require, I think. Note that the failing calls above pass "clojure.main" where the working RT.var call passes "clojure.core".
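As a cross-check from the Java side (the class below is mine; it assumes Clojure is on the classpath), RT.var and Var.intern hand back the very same Var when given the same namespace string:

import clojure.lang.Namespace;
import clojure.lang.RT;
import clojure.lang.Symbol;
import clojure.lang.Var;

public class InternCheck {
    public static void main(String[] args) {
        // RT.var delegates to Var.intern with identical arguments...
        Var viaRt = RT.var("clojure.core", "require");
        Var viaIntern = Var.intern(
                Namespace.findOrCreate(Symbol.intern(null, "clojure.core")),
                Symbol.intern(null, "require"));
        System.out.println(viaRt == viaIntern); // true: the same interned var

        // ...so the IllegalStateException in the question comes from passing
        // "clojure.main" (where require is merely referred) rather than "clojure.core".
    }
}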