Using:
Apache Spark 2.0.1
Java 7
The Apache Spark Java API documentation for the Dataset class shows an example of using the join method with a scala.collection.Seq parameter to specify the column names, but I'm not able to use it.
The documentation provides the following example:
df1.join(df2, Seq("user_id", "user_name"))
Error: Can not find Symbol Method Seq(String)
My Code:
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import scala.collection.Seq;
public class UserProfiles {
    public static void calcTopShopLookup() {
        Dataset<Row> udp = Spark.getDataFrameFromMySQL("my_schema", "table_1");
        Dataset<Row> result = Spark.getSparkSession().table("table_2").join(udp, Seq("col_1", "col_2"));
    }
}
Seq(x, y, ...) is the Scala way to create a sequence. Seq has a companion object with an apply method, which lets you avoid writing new each time.
It should be possible to write:
import scala.collection.JavaConversions;
import scala.collection.Seq;
import static java.util.Arrays.asList;
Dataset<Row> result = Spark.getSparkSession().table("table_2").join(udp, JavaConversions.asScalaBuffer(asList("col_1","col_2")));
Or you can write your own small helper method:
public static <T> Seq<T> asSeq(T... values) {
    return JavaConversions.asScalaBuffer(asList(values));
}
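With that helper in place, the join from the question can be written like this (same hypothetical Spark helper class and table names as above):
Dataset<Row> result = Spark.getSparkSession()
        .table("table_2")
        .join(udp, asSeq("col_1", "col_2"));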
Using compile 'io.github.classgraph:classgraph:4.8.65'
https://github.com/classgraph/classgraph/wiki/ClassGraph-API
Java 8
ScanResult scanResult =
new ClassGraph().enableAllInfo()
.whitelistPackages("abc1")
.whitelistPackages("abc2")
.whitelistPackages("java")
.scan();
When I encounter ClassInfo objects for classes from the packages abc1 or abc2, they are able to reference things like java.util.HashMap; I can see them in the FieldInfo.
But when I then proceed to do scanResult.getClassInfo("java.util.HashMap"), it returns null.
(Following FieldInfos for other classes within the abc1 or abc2 packages does return further ClassInfo objects.)
My question is: is it correct to think I should be able to get ClassInfo objects for the Java JRE classes via the ClassGraph method chaining shown above?
I added this test, which fails; surprisingly it prints only one class rather than the expected dozens:
package abc;
import io.github.classgraph.ScanResult;
import io.github.classgraph.ClassGraph;
import io.github.classgraph.ClassInfo;
import java.util.*;
import java.io.*;
import java.util.function.*;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit.jupiter.SpringJUnitConfig;
@SpringJUnitConfig
@SpringBootTest(classes = {})
public class ExamplesSpec {
    @org.junit.jupiter.api.Test
    @org.junit.jupiter.api.DisplayName(value = "test_for_built_in_java_jre_classes")
    public void test_on_line_42() throws Exception {
        System.out.println("test_for_built_in_java_jre_classes");
        ClassInfo found = null;
        try (
            ScanResult result = new ClassGraph().enableAllInfo().whitelistPackages("java.util").scan()
        ) {
            System.out.println("here all the classes....");
            for (ClassInfo item : result.getAllClasses()) {
                System.out.println("here classinfo: " + item);
            }
            found = result.getClassInfo("java.util.HashMap");
        }
        assert found != null;
    }
}
The only class found is this:
here classinfo: public class java.util.zip.VFSZipFile implements java.util.zip.ZipConstants
Found the answer!
In the ClassGraph setup, in order to scan the JRE-provided classes, you need to add this to the method chaining:
.enableSystemJarsAndModules()
For example:
new ClassGraph().enableAllInfo()
.whitelistPackages("abc1")
.whitelistPackages("abc2")
.whitelistPackages("java")
.enableSystemJarsAndModules()
.scan();
This is detailed in the documentation found here:
https://github.com/classgraph/classgraph/wiki/API:-ClassGraph-Constructor#configuring-the-classgraph-instance
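Putting it together, a minimal sketch of the failing test with the extra call added (the java.util package and the HashMap lookup are taken from the question; the class and variable names here are just for illustration):
import io.github.classgraph.ClassGraph;
import io.github.classgraph.ClassInfo;
import io.github.classgraph.ScanResult;

public class JreClassScan {
    public static void main(String[] args) {
        // enableSystemJarsAndModules() lets ClassGraph see classes shipped with the JRE/JDK
        try (ScanResult result = new ClassGraph()
                .enableAllInfo()
                .whitelistPackages("java.util")
                .enableSystemJarsAndModules()
                .scan()) {
            ClassInfo info = result.getClassInfo("java.util.HashMap");
            System.out.println(info != null ? info.getName() : "not found");
        }
    }
}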
I have a Scala case class which looks like this:
case class AddressView(id: Option[Long],
address: Address,
purpose: Seq[String])
I have to invoke this class from Java.
This does not seem to work:
AddressView billToAddress = new AddressView
(BusinessFieldValue.ShipToAddressId,
shipAddress,
(Seq<String>)Arrays.asList("BILLING"));
Can anyone tell me the right way of doing this?
There are two things you need to fix: you need to wrap your Long in an Option, and you need to properly convert your List to a Scala Seq.
Your AddressView class takes an Option[Long] as a parameter. It would be nice if you could just pass a Long to the constructor, but you must wrap it in an Option.
Option has two subclasses, Some and None, so you need to import Some into your Java program. You must also have the Scala library on your classpath.
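For reference, this is roughly what the two cases look like when constructed from Java (a small illustration, separate from the code below; it assumes scala.Option and scala.Some are both imported):
Option<Long> present = new Some<>(42L);   // a value is present
Option<Long> absent = Option.empty();     // the None case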
For the conversion, Scala provides built-in functionality in scala.collection.JavaConversions.
So the final result is:
import scala.Some;
import scala.collection.JavaConversions;
import java.util.Arrays;
public class Test {
    public static void main(String[] args) {
        AddressView billToAddress = new AddressView(
                new Some<>(BusinessFieldValue.ShipToAddressId),
                shipAddress,
                JavaConversions.asScalaBuffer(Arrays.asList("BILLING")));
    }
}
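As a side note, if BusinessFieldValue.ShipToAddressId could ever be null on the Java side, scala.Option.apply is a convenient alternative to new Some<>(...), because it maps null to None instead of wrapping it (a small sketch under that assumption):
// Option.apply returns Some(value), or None if the value is null
scala.Option<Long> maybeId = scala.Option.apply(BusinessFieldValue.ShipToAddressId);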
I have written code that has Suite information with TestCase information embedded inside it. I have written TestCase.java and Suite.java, and they seem to have no errors. But with the MongoMapper.java I have written, I am getting this error:
The method fromDBObject(Class, BasicDBObject) in the type Morphia is not applicable for the arguments (Class, DBObject). Kindly help me with this, and also suggest how I can check whether my collections have been updated in the MongoDB shell. Thanks in advance. Here is my code.
package com.DrAssist.Morphia.model;
import com.google.code.morphia.Morphia;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.Mongo;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import java.net.UnknownHostException;
import static junit.framework.Assert.assertNotNull;
import static junit.framework.Assert.assertNull;
public class MongoMapper {
    Morphia morph;
    Mongo mongo;
    DBCollection DrAssistReport;

    @Before
    public void setUp() throws UnknownHostException {
        morph = new Morphia();
        mongo = new Mongo("127.0.0.1", 27017);
        // This is where we map Persons and addresses
        // But shouldn't the annotation be able to handle that?
        morph.map(Suite.class).map(TestCase.class);
        DB testDb = mongo.getDB("test");
        DrAssistReport = testDb.getCollection("DrAssistReport");
    }

    @Test
    public void storePersonThroughMorphiaMapping() {
        Suite suite = new Suite(new TestCase("1", new String[]{"Test1", "Test2", "Test3", "Test4"}, "1", "5", "6", "7", "889"));
        suite.setSID("1");
        suite.setsuiteName("Suite1");
        suite.setnoOfTests("5");
        DrAssistReport.save(morph.toDBObject(suite));
        Suite suite2 = morph.fromDBObject(Suite.class, DrAssistReport.findOne());
        assertNotNull(suite2.getSID());
    }
}
The error I am getting is: The method fromDBObject(Class, BasicDBObject) in the type Morphia is not applicable for the arguments (Class, DBObject).
It looks like you're using a version of Morphia from before this change: http://code.google.com/p/morphia/issues/detail?id=15
You could try either casting the return value from DrAssistReport.findOne() to a BasicDBObject, or upgrading to a version of Morphia whose Morphia.fromDBObject method takes a DBObject rather than requiring a BasicDBObject.
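For the casting route, the change is confined to the line that rebuilds the entity (a sketch against the test above; it assumes findOne() actually returns a BasicDBObject at runtime, which the stock driver normally does):
// Requires: import com.mongodb.BasicDBObject;
Suite suite2 = morph.fromDBObject(Suite.class, (BasicDBObject) DrAssistReport.findOne());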
I want to make use of the new functionality in the latest build of JUnit in order to name my parameterized tests.
I have the following two tests written in Java and Scala, but the Scala test generates a compiler error:
error: unknown annotation argument name: name
@Parameters(name = "{0}") def data: util.Collection[Array[AnyRef]] =
  util.Arrays.asList(Array("x"), Array("y"), Array("z"))
What is the difference in implementation causing this error?
java
import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
import org.junit.runners.Parameterized.Parameters;
import java.util.Arrays;
import java.util.Collection;
import static org.junit.Assert.fail;
@RunWith(Parameterized.class)
public class ParameterizedTest {
    @Parameters(name = "{0}")
    public static Collection<Object[]> data() {
        return Arrays.asList(new Object[]{"x"}, new Object[]{"y"}, new Object[]{"z"});
    }

    @Test
    public void foo() {
        fail("bar");
    }
}
scala
import java.util
import org.junit.Assert._
import org.junit.Test
import org.junit.runner.RunWith
import org.junit.runners.Parameterized
import org.junit.runners.Parameterized._
@RunWith(classOf[Parameterized])
class ParameterizedScalaTest {
  @Test def foo() {
    fail("bar")
  }
}

object ParameterizedScalaTest {
  @Parameters(name = "{0}") def data: util.Collection[Array[AnyRef]] = util.Arrays.asList(Array("x"), Array("y"), Array("z"))
}
Because @Parameters is defined as a nested annotation, you seem to need to give the full name.
Try
@Parameters(Parameters.name = "{0}")
At least, that is the only significant difference I can observe in the definitions of @Parameters and @Test, and this works:
@Test(timeout = 10)
It turns out the issue here is due to junit-dep.jar being on the classpath through a transitive dependency on jMock 2.4.0.
Removing that fixed the compiler error; it's odd that this is an issue for scalac but not javac.
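If you run into the same thing, one quick way to confirm which jar the @Parameters annotation is actually being loaded from is to print its code source (plain JDK reflection, just as a throwaway check):
// Prints the jar that Parameterized.Parameters was loaded from, e.g. junit-4.x.jar vs junit-dep.jar
System.out.println(org.junit.runners.Parameterized.Parameters.class
        .getProtectionDomain().getCodeSource().getLocation());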
Hi, I would like to use the Play framework Java Extensions* in a Play framework model.
Specifically, I'd like to have this in my model:
package models;
// various import
import play.templates.JavaExtensions;
@Entity
public class Product extends Model {
    @PrePersist
    public void save_slug() {
        slug = title.slugify();
    }
}
But I'm receiving the following error:
The method slugify() is undefined for the type String
What am I doing wrong?
*references:
- http://www.playframework.org/documentation/1.1/javaextensions#aslugifya
- http://www.playframework.org/documentation/api/1.2.4/play%2Ftemplates%2FJavaExtensions.html
Java extensions are static methods of the JavaExtensions class; you can use them as follows:
slug = JavaExtensions.slugify(title);
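So the @PrePersist hook from the question would look roughly like this (only the slugify call changes; the rest is as posted):
import play.templates.JavaExtensions;

@Entity
public class Product extends Model {
    @PrePersist
    public void save_slug() {
        // JavaExtensions.slugify is a static method, not a method on String
        slug = JavaExtensions.slugify(title);
    }
}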