I have the following dataframe:
+-----------+---------+-------------------+
|longitude  |latitude |geom               |
+-----------+---------+-------------------+
|-7.07378166|33.826661|[00 00 00 00 01 0..|
|-7.5952683 |33.544191|[00 00 00 00 01 0..|
+-----------+---------+-------------------+
I'm using the following code:
Dataset<Row> result_f = sparkSession.sql("select * from data_f where ST_WITHIN(ST_GeomFromText(CONCAT('POINT(',longitude_f,' ',latitude_f,')',4326)),geom)");
result_f.show();
But I get the following error:
java.lang.ClassCastException: [B cannot be cast to org.apache.spark.sql.catalyst.util.ArrayData
at org.apache.spark.sql.geosparksql.expressions.ST_Within.eval(Predicates.scala:105)
EDIT
longitude : Double type
latitude : Double type
geom : Binary type
Any idea? I need your help.
Thank you.
I don't think ST_GeomFromText is available for constructing a geometry from text; however, there are:
ST_GeomFromWKT
ST_GeomFromWKB
ST_GeomFromGeoJSON
ST_Point
ST_PointFromText
ST_PolygonFromText
ST_LineStringFromText
ST_PolygonFromEnvelope
ST_Circle
I suggest using either ST_Point or ST_PointFromText, and after that the predicate ST_WITHIN.
Something like this:
Dataset<Row> result_f = sparkSession.sql("select * from data_f where ST_WITHIN(ST_Point(CAST(data_f.latitude AS Decimal(24,20)), CAST(data_f.longitude AS Decimal(24,20))),geom)");
result_f.show();
I would like to avoid null values for a field of a class used in a dataset. I tried to use Scala's Option and Java's Optional, but both failed:
@AllArgsConstructor // lombok
@NoArgsConstructor // mutable type is required in java :(
@Data // see https://stackoverflow.com/q/59609933/1206998
public static class TestClass {
    String id;
    Option<Integer> optionalInt;
}

@Test
public void testDatasetWithOptionField() {
    Dataset<TestClass> ds = spark.createDataset(Arrays.asList(
            new TestClass("item 1", Option.apply(1)),
            new TestClass("item .", Option.empty())
    ), Encoders.bean(TestClass.class));

    ds.collectAsList().forEach(x -> System.out.println("Found " + x));
}
It fails at runtime with the message File 'generated.java', Line 77, Column 47: Cannot instantiate abstract "scala.Option"
Question: Is there a way to encode optional fields without null in a dataset, using java?
Subsidiary question: by the way, I haven't used datasets much in Scala either; can you confirm that it is actually possible in Scala to encode a case class containing Option fields?
Note: This is used in an intermediate dataset, i.e. something that is neither read nor written (only used for Spark's internal serialization).
This is fairly simple to do in Scala.
Scala Implementation
import org.apache.spark.sql.{Encoders, SparkSession}

object Test {

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder
      .appName("Stack-scala")
      .master("local[2]")
      .getOrCreate()

    val ds = spark.createDataset(Seq(
      TestClass("Item 1", Some(1)),
      TestClass("Item 2", None)
    ))(Encoders.product[TestClass])

    ds.collectAsList().forEach(println)

    spark.stop()
  }

  case class TestClass(
    id: String,
    optionalInt: Option[Int])
}
Java
There are various Option classes available in Java. However, none of them work out-of-the-box.
java.util.Optional : Not serializable
scala.Option -> Serializable but abstract, so when CodeGenerator generates the following code, it fails!
/* 081 */ // initializejavabean(newInstance(class scala.Option))
/* 082 */ final scala.Option value_9 = false ?
/* 083 */ null : new scala.Option(); // ---> Such initialization is not possible for abstract classes
/* 084 */ scala.Option javaBean_1 = value_9;
org.apache.spark.api.java.Optional -> Spark's implementation of Optional, which is serializable but has private constructors. So it fails with the error: No applicable constructor/method found for zero actual parameters. Since this is a final class, it's not possible to extend it.
/* 081 */ // initializejavabean(newInstance(class org.apache.spark.api.java.Optional))
/* 082 */ final org.apache.spark.api.java.Optional value_9 = false ?
/* 083 */ null : new org.apache.spark.api.java.Optional();
/* 084 */ org.apache.spark.api.java.Optional javaBean_1 = value_9;
/* 085 */ if (!false) {
One option is to use normal Java Optionals in the data class and then use Kryo as serializer.
Encoder en = Encoders.kryo(TestClass.class);
Dataset<TestClass> ds = spark.createDataset(Arrays.asList(
new TestClass("item 1", Optional.of(1)),
new TestClass("item .", Optional.empty())
), en);
ds.collectAsList().forEach(x -> System.out.println("Found " + x));
Output:
Found TestClass(id=item 1, optionalInt=Optional[1])
Found TestClass(id=item ., optionalInt=Optional.empty)
There is a downside when using Kryo: this encoder encodes in a binary format:
ds.printSchema();
ds.show(false);
prints
root
|-- value: binary (nullable = true)
+-------------------------------------------------------------------------------------------------------+
|value |
+-------------------------------------------------------------------------------------------------------+
|[01 00 4A 61 76 61 53 74 61 72 74 65 72 24 54 65 73 74 43 6C 61 73 F3 01 01 69 74 65 6D 20 B1 01 02 02]|
|[01 00 4A 61 76 61 53 74 61 72 74 65 72 24 54 65 73 74 43 6C 61 73 F3 01 01 69 74 65 6D 20 AE 01 00] |
+-------------------------------------------------------------------------------------------------------+
This answer describes a UDF-based solution to get the normal output columns of a dataset encoded with Kryo.
Maybe a bit off-topic, but a good starting point for a long-term solution is to look at the code of JavaTypeInference. The methods serializerFor and deserializerFor are used by ExpressionEncoder.javaBean to create the serializer and deserializer parts of the encoder for Java beans.
In this pattern matching block
typeToken.getRawType match {
case c if c == classOf[String] => createSerializerForString(inputObject)
case c if c == classOf[java.time.Instant] => createSerializerForJavaInstant(inputObject)
case c if c == classOf[java.sql.Timestamp] => createSerializerForSqlTimestamp(inputObject)
case c if c == classOf[java.time.LocalDate] => createSerializerForJavaLocalDate(inputObject)
case c if c == classOf[java.sql.Date] => createSerializerForSqlDate(inputObject)
[...]
the handling for java.util.Optional is missing. It could probably be added here, as well as in the corresponding deserializer method. This would allow Java beans to have properties of type Optional.
My widgets.proto:
option java_package = "example";
option java_outer_classname = "WidgetsProtoc";

message Widget {
    required string name = 1;
    required int32 id = 2;
}

message WidgetList {
    repeated Widget widget = 1;
}
My REST endpoint (Path: /widgets):
@GET
@Produces("application/x-protobuf")
public WidgetsProtoc.WidgetList getAllWidgets() {
    Widget widget1 = Widget.newBuilder().setId(1).setName("testing").build();
    Widget widget2 = Widget.newBuilder().setId(100).setName("widget 2").build();
    System.err.println("widgets1: " + widget1);
    System.err.println("widgets2: " + widget2);
    WidgetsProtoc.WidgetList list = WidgetsProtoc.WidgetList.newBuilder()
            .addWidget(widget1).addWidget(widget2).build();
    System.err.println("list: " + list.toByteArray());
    return list;
}
And when I use Postman I get this response:
(emptyline)
(emptyline)
testing
(emptyline)
widget 2d
Is this normal? I don't think so. In my MessageBodyWriter class I override writeTo like this:
@Override
public void writeTo(WidgetsProtoc.WidgetList widgetList, Class<?> type, Type genericType, Annotation[] annotations, MediaType mediaType, MultivaluedMap<String, Object> httpHeaders, OutputStream entityStream) throws IOException, WebApplicationException {
    entityStream.write(widgetList.toByteArray());
}
I thought it would be good to send a byte array, but it's a bit weird that it doesn't seem to serialize anything, or maybe just the id. Thanks for the help :)
It already is in binary.
Have you noticed that your message consists of a name and an id, but only the name gets displayed?
That's what I get displayed when I just save the response as a "test" file from the browser's GET call:
$ cat test
testing
widget 2d%
You can see that cat is barely able to display it, but there is more content in it.
Now if I open it in bless (hex editor in ubuntu) I can get more out of it
0A 0B 0A 07 74 65 73 74 69 6E 67 10 01 0A 0C 0A 08 77 69 64 67 65 74 20 32 10 64
In general I wouldn't trust Postman to display binary data. You need to know how to decode it, and apparently Postman doesn't.
And here's the final proof:
$ cat test | protoc --decode_raw
1 {
1: "testing"
2: 1
}
1 {
1: "widget 2"
2: 100
}
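The same decoding can be reproduced in plain Java. Below is a minimal sketch of reading the protobuf wire format for this particular payload; the hex dump and field numbers come from above, but the class and method names are made up for the illustration, and the parser only handles the single-byte lengths and varints this payload actually uses:

```java
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class WireDecode {

    // Hex dump of the response body, copied from the hex editor output above.
    static final String HEX =
            "0A0B0A0774657374696E6710010A0C0A0877696467657420321064";

    static byte[] hexToBytes(String hex) {
        byte[] out = new byte[hex.length() / 2];
        for (int i = 0; i < out.length; i++) {
            out[i] = (byte) Integer.parseInt(hex.substring(2 * i, 2 * i + 2), 16);
        }
        return out;
    }

    /** Decodes each repeated Widget (field 1, length-delimited) into "name:id". */
    static List<String> decodeWidgets(byte[] data) {
        List<String> widgets = new ArrayList<>();
        int pos = 0;
        while (pos < data.length) {
            pos++;                               // tag 0x0A: field 1, wire type 2
            int len = data[pos++] & 0xFF;        // message length (single-byte varint here)
            int end = pos + len;
            String name = null;
            long id = 0;
            while (pos < end) {
                int innerTag = data[pos++] & 0xFF;
                if (innerTag == 0x0A) {          // name = 1, string
                    int slen = data[pos++] & 0xFF;
                    name = new String(data, pos, slen, StandardCharsets.UTF_8);
                    pos += slen;
                } else if (innerTag == 0x10) {   // id = 2, varint (single byte here)
                    id = data[pos++] & 0xFF;
                }
            }
            widgets.add(name + ":" + id);
        }
        return widgets;
    }

    public static void main(String[] args) {
        System.out.println(decodeWidgets(hexToBytes(HEX)));
    }
}
```

Running it prints [testing:1, widget 2:100], matching the protoc --decode_raw output above.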
I have multiple text files that contain information about the popularity of different programming languages in different countries, based on Google searches. I have one text file for each year from 2004 to 2015. I also have a text file that breaks this down by week (called iot.txt), but this file does not include the country.
Example data from 2004.txt:
Region java c++ c# python JavaScript
Argentina 13 14 10 0 17
Australia 22 20 22 64 26
Austria 23 21 19 31 21
Belgium 20 14 17 34 25
Bolivia 25 0 0 0 0
etc
example from iot.txt:
Week java c++ c# python JavaScript
2004-01-04 - 2004-01-10 88 23 12 8 34
2004-01-11 - 2004-01-17 88 25 12 8 36
2004-01-18 - 2004-01-24 91 24 12 8 36
2004-01-25 - 2004-01-31 88 26 11 7 36
2004-02-01 - 2004-02-07 93 26 12 7 37
My problem is that I am trying to write code that will output the number of countries that have exhibited 0 interest in python.
This is my current code that I use to read the text files, but I'm not sure of the best way to count the number of regions that have 0 interest in python across all the years 2004-2015. At first I thought the best way would be to create a list from all the text files (not including iot.txt) and then search it for any entries that have 0 interest in python, but I have no idea how to do that.
Can anyone suggest a way to do this?
import java.io.BufferedReader;
import java.io.FileReader;
import java.util.*;

public class Starter {
    public static void main(String[] args) throws Exception {
        BufferedReader fh =
                new BufferedReader(new FileReader("iot.txt"));

        // First line contains the language names
        String s = fh.readLine();
        List<String> langs =
                new ArrayList<>(Arrays.asList(s.split("\t")));
        langs.remove(0); // Throw away the first word - "week"

        Map<String, HashMap<String, Integer>> iot = new TreeMap<>();
        while ((s = fh.readLine()) != null) {
            String[] wrds = s.split("\t");
            HashMap<String, Integer> interest = new HashMap<>();
            for (int i = 0; i < langs.size(); i++)
                interest.put(langs.get(i), Integer.parseInt(wrds[i + 1]));
            iot.put(wrds[0], interest);
        }
        fh.close();

        HashMap<Integer, HashMap<String, HashMap<String, Integer>>>
                regionsByYear = new HashMap<>();
        for (int i = 2004; i < 2016; i++) {
            BufferedReader fh1 =
                    new BufferedReader(new FileReader(i + ".txt"));
            String s1 = fh1.readLine(); // Throw away the first line
            HashMap<String, HashMap<String, Integer>> year = new HashMap<>();
            while ((s1 = fh1.readLine()) != null) {
                String[] wrds = s1.split("\t");
                HashMap<String, Integer> langMap = new HashMap<>();
                for (int j = 1; j < wrds.length; j++) {
                    langMap.put(langs.get(j - 1), Integer.parseInt(wrds[j]));
                }
                year.put(wrds[0], langMap);
            }
            regionsByYear.put(i, year);
            fh1.close();
        }
    }
}
Create a Map<String, Integer> using a HashMap, and each time you find a new country while scanning the incoming data, add it to the map as country -> 0. Each time you find a usage of python, increment the value.
At the end, loop through the entrySet of the map and, for each entry where e.getValue() is zero, output e.getKey().
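Applied to the regionsByYear structure built in the question, that suggestion might look like the following sketch. The class and method names are made up for the illustration, and the small sample map stands in for the parsed 2004-2015 files; a country is counted only if its python value is 0 in every year it appears:

```java
import java.util.HashMap;
import java.util.Map;

public class ZeroInterestCounter {

    /**
     * Counts countries whose "python" interest is 0 in every year they appear.
     * yearData maps year -> (country -> (language -> interest)),
     * i.e. the same shape as regionsByYear in the question.
     */
    static long countZeroPythonCountries(
            Map<Integer, Map<String, Map<String, Integer>>> yearData) {
        // country -> python interest summed across all years
        Map<String, Integer> pythonTotals = new HashMap<>();
        for (Map<String, Map<String, Integer>> year : yearData.values()) {
            for (Map.Entry<String, Map<String, Integer>> e : year.entrySet()) {
                int python = e.getValue().getOrDefault("python", 0);
                pythonTotals.merge(e.getKey(), python, Integer::sum);
            }
        }
        // a total of zero means the country never showed any interest
        return pythonTotals.values().stream().filter(v -> v == 0).count();
    }

    // tiny hand-made sample standing in for the parsed files
    static Map<Integer, Map<String, Map<String, Integer>>> sample() {
        Map<Integer, Map<String, Map<String, Integer>>> data = new HashMap<>();
        data.put(2004, Map.of(
                "Argentina", Map.of("python", 0),
                "Australia", Map.of("python", 64),
                "Bolivia", Map.of("python", 0)));
        data.put(2005, Map.of(
                "Argentina", Map.of("python", 5),
                "Australia", Map.of("python", 70),
                "Bolivia", Map.of("python", 0)));
        return data;
    }

    public static void main(String[] args) {
        // only Bolivia stays at 0 across both sample years
        System.out.println(countZeroPythonCountries(sample()));
    }
}
```

Summing across years means a country that showed interest in even one year is excluded, which matches "0 interest across all the years 2004-2015".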
I have the following code:
public static void main(String[] args) {
    try {
        //String ticket = "Negotiate YIGCBg...==";
        //byte[] kerberosTicket = ticket.getBytes();
        byte[] kerberosTicket = Base64.decode("YIGCBg...==");
        GSSContext context = GSSManager.getInstance().createContext((GSSCredential) null);
        context.acceptSecContext(kerberosTicket, 0, kerberosTicket.length);
        String user = context.getSrcName().toString();
        context.dispose();
    } catch (GSSException e) {
        e.printStackTrace();
    } catch (Base64DecodingException e) {
        e.printStackTrace();
    }
}
Of course it fails. Here's the exception:
GSSException: Defective token detected (Mechanism level: GSSHeader did not find the right tag)
I don't know what I'm supposed to do to solve this. Honestly, I don't really understand Kerberos.
I got this ticket by sending a 401 with the appropriate header WWW-Authenticate with 'Negotiate' as the value. The browser immediately issued the same request again with an authorization header containing this ticket.
I was hoping I could validate the ticket and determine who the user is.
Do I need a keytab file? If so, what credentials would I run this under? I'm trying to use the Kerberos ticket for auth for a web-site. Would the credentials be the credentials from IIS?
What am I missing?
Update 1
From Michael-O's reply, I did a bit more googling and found this article, which led me to this article.
On table 3, I found 1.3.6.1.5.5.2 SPNEGO.
I have now added that to my credentials following the example from the first article. Here's my code:
public static void main(String[] args) {
    try {
        Oid mechOid = new Oid("1.3.6.1.5.5.2");
        GSSManager manager = GSSManager.getInstance();
        GSSCredential myCred = manager.createCredential(null,
                GSSCredential.DEFAULT_LIFETIME,
                mechOid,
                GSSCredential.ACCEPT_ONLY);
        GSSContext context = manager.createContext(myCred);

        byte[] ticket = Base64.decode("YIGCBg...==");
        context.acceptSecContext(ticket, 0, ticket.length);
        String user = context.getSrcName().toString();
        context.dispose();
    } catch (GSSException e) {
        e.printStackTrace();
    } catch (Base64DecodingException e) {
        e.printStackTrace();
    }
}
But now the code is failing on createCredential with this error:
GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos credentails)
Here's the entire ticket: YIGCBgYrBgEFBQKgeDB2oDAwLgYKKwYBBAGCNwICCgYJKoZIgvcSAQICBgkqhkiG9xIBAgIGCisGAQQBgjcCAh6iQgRATlRMTVNTUAABAAAAl7II4g4ADgAyAAAACgAKACgAAAAGAbEdAAAAD0xBUFRPUC0yNDVMSUZFQUNDT1VOVExMQw==
Validating an SPNEGO ticket from Java is a somewhat convoluted process. Here's a brief overview but bear in mind that the process can have tons of pitfalls. You really need to understand how Active Directory, Kerberos, SPNEGO, and JAAS all operate to successfully diagnose problems.
Before you start, make sure you know your kerberos realm name for your windows domain. For the purposes of this answer I'll assume it's MYDOMAIN. You can obtain the realm name by running echo %userdnsdomain% from a cmd window. Note that kerberos is case sensitive and the realm is almost always ALL CAPS.
Step 1 - Obtain a Kerberos Keytab
In order for a kerberos client to access a service, it requests a ticket for the Service Principal Name [SPN] that represents that service. SPNs are generally derived from the machine name and the type of service being accessed (e.g. HTTP/www.my-domain.com). In order to validate a kerberos ticket for a particular SPN, you must have a keytab file that contains a shared secret known to both the Key Distribution Center [KDC] and the service provider (you).
In terms of Active Directory, the KDC is the Domain Controller, and the shared secret is just the plain text password of the account that owns the SPN. A SPN may be owned by either a Computer or a User object within the AD.
The easiest way to setup a SPN in AD if you are defining a service is to setup a user-based SPN like so:
Create an unprivileged service account in AD whose password doesn't expire, e.g. SVC_HTTP_MYSERVER with password ReallyLongRandomPass
Bind the service SPN to the account using the windows setspn utility. Best practice is to define multiple SPNs for both the short name and the FQDN of the host:
setspn -U -S HTTP/myserver@MYDOMAIN SVC_HTTP_MYSERVER
setspn -U -S HTTP/myserver.my-domain.com@MYDOMAIN SVC_HTTP_MYSERVER
Generate a keytab for the account using Java's ktab utility.
ktab -k FILE:http_myserver.ktab -a HTTP/myserver@MYDOMAIN ReallyLongRandomPass
ktab -k FILE:http_myserver.ktab -a HTTP/myserver.my-domain.com@MYDOMAIN ReallyLongRandomPass
If you are trying to authenticate a pre-existing SPN that is bound to a Computer account or to a User account you do not control, the above will not work. You will need to extract the keytab from Active Directory itself. The Wireshark Kerberos Page has some good pointers for this.
Step 2 - Setup your krb5.conf
In %JAVA_HOME%/jre/lib/security create a krb5.conf that describes your domain. Make sure the realm you define here matches what you setup for your SPN. If you don't put the file in the JVM directory, you can point to it by setting -Djava.security.krb5.conf=C:\path\to\krb5.conf on the command line.
Example:
[libdefaults]
    default_realm = MYDOMAIN

[realms]
    MYDOMAIN = {
        kdc = dc1.my-domain.com
        default_domain = my-domain.com
    }

[domain_realm]
    .my-domain.com = MYDOMAIN
    my-domain.com = MYDOMAIN
Step 3 - Setup JAAS login.conf
Your JAAS login.conf should define a login configuration that sets up the Krb5LoginModule as an acceptor. Here's an example that assumes that the keytab we created above is in C:\http_myserver.ktab. Point to the JAAS config file by setting -Djava.security.auth.login.config=C:\path\to\login.conf on the command line.
http_myserver_mydomain {
    com.sun.security.auth.module.Krb5LoginModule required
    principal="HTTP/myserver.my-domain.com@MYDOMAIN"
    doNotPrompt="true"
    useKeyTab="true"
    keyTab="C:/http_myserver.ktab"
    storeKey="true"
    isInitiator="false";
};
Alternatively, you can generate a JAAS config at runtime like so:
public static Configuration getJaasKrb5TicketCfg(
        final String principal, final String realm, final File keytab) {
    return new Configuration() {
        @Override
        public AppConfigurationEntry[] getAppConfigurationEntry(String name) {
            Map<String, String> options = new HashMap<String, String>();
            options.put("principal", principal);
            options.put("keyTab", keytab.getAbsolutePath());
            options.put("doNotPrompt", "true");
            options.put("useKeyTab", "true");
            options.put("storeKey", "true");
            options.put("isInitiator", "false");
            return new AppConfigurationEntry[] {
                new AppConfigurationEntry(
                    "com.sun.security.auth.module.Krb5LoginModule",
                    LoginModuleControlFlag.REQUIRED, options)
            };
        }
    };
}
You would create a LoginContext for this configuration like so:
LoginContext ctx = new LoginContext("doesn't matter", subject, null,
        getJaasKrb5TicketCfg("HTTP/myserver.my-domain.com@MYDOMAIN", "MYDOMAIN",
                new File("C:/path/to/my.ktab")));
Step 4 - Accepting the ticket
This is a little off-the-cuff, but the general idea is to define a PrivilegedExceptionAction that performs the SPNEGO protocol using the ticket. Note that this example does not check that the SPNEGO protocol is complete. For example, if the client requested server authentication, you would need to return the token generated by acceptSecContext() in the authentication header of the HTTP response.
public class Krb5TicketValidateAction implements PrivilegedExceptionAction<String> {

    private final byte[] ticket;
    private final String spn;

    public Krb5TicketValidateAction(byte[] ticket, String spn) {
        this.ticket = ticket;
        this.spn = spn;
    }

    @Override
    public String run() throws Exception {
        final Oid spnegoOid = new Oid("1.3.6.1.5.5.2");
        GSSManager gssmgr = GSSManager.getInstance();

        // tell the GSSManager the Kerberos name of the service
        GSSName serviceName = gssmgr.createName(this.spn, GSSName.NT_USER_NAME);

        // get the service's credentials. note that this run() method was called by
        // Subject.doAs(), so the service's credentials (Service Principal Name and
        // password) are already available in the Subject
        GSSCredential serviceCredentials = gssmgr.createCredential(serviceName,
                GSSCredential.INDEFINITE_LIFETIME, spnegoOid, GSSCredential.ACCEPT_ONLY);

        // create a security context for decrypting the service ticket
        GSSContext gssContext = gssmgr.createContext(serviceCredentials);

        // decrypt the service ticket
        System.out.println("Entering acceptSecContext...");
        gssContext.acceptSecContext(this.ticket, 0, this.ticket.length);

        // get the client name from the decrypted service ticket
        // note that Active Directory created the service ticket, so we can trust it
        String clientName = gssContext.getSrcName().toString();

        // clean up the context
        gssContext.dispose();

        // return the authenticated client name
        return clientName;
    }
}
Then to authenticate the ticket, you would do something like the following. Assume that ticket contains the already-base-64-decoded ticket from the authentication header. The spn should be derived from the Host header in the HTTP request, in the format HTTP/<HOST>@<REALM>. E.g. if the Host header was myserver.my-domain.com, then spn should be HTTP/myserver.my-domain.com@MYDOMAIN.
public boolean isTicketValid(String spn, byte[] ticket) {
    LoginContext ctx = null;
    try {
        // this is the name from login.conf. This could also be a parameter
        String ctxName = "http_myserver_mydomain";

        // define the principal who will validate the ticket
        Principal principal = new KerberosPrincipal(spn, KerberosPrincipal.KRB_NT_SRV_INST);
        Set<Principal> principals = new HashSet<Principal>();
        principals.add(principal);

        // define the subject to execute our secure action as
        Subject subject = new Subject(false, principals, new HashSet<Object>(),
                new HashSet<Object>());

        // login the subject
        ctx = new LoginContext(ctxName, subject);
        ctx.login();

        // create a validator for the ticket and execute it
        Krb5TicketValidateAction validateAction = new Krb5TicketValidateAction(ticket, spn);
        String username = Subject.doAs(subject, validateAction);
        System.out.println("Validated service ticket for user " + username
                + " to access service " + spn);
        return true;
    } catch (PrivilegedActionException e) {
        System.out.println("Invalid ticket for " + spn + ": " + e);
    } catch (LoginException e) {
        System.out.println("Error creating validation LoginContext for "
                + spn + ": " + e);
    } finally {
        try {
            if (ctx != null) { ctx.logout(); }
        } catch (LoginException e) { /* noop */ }
    }
    return false;
}
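The SPN derivation from the Host header described above is simple enough to sketch. The class and method names here are made up for the illustration, and MYDOMAIN is the placeholder realm used throughout this answer; stripping the port is an assumption that holds for typical browser behavior, since browsers generally ignore the port when requesting the service ticket:

```java
public class SpnUtil {

    /**
     * Derives a SPN of the form HTTP/<host>@<REALM> from an HTTP Host header.
     * A ":port" suffix, if present, is stripped.
     */
    static String spnFromHostHeader(String hostHeader, String realm) {
        String host = hostHeader.split(":", 2)[0];
        return "HTTP/" + host + "@" + realm;
    }

    public static void main(String[] args) {
        System.out.println(spnFromHostHeader("myserver.my-domain.com", "MYDOMAIN"));
        System.out.println(spnFromHostHeader("myserver.my-domain.com:8080", "MYDOMAIN"));
    }
}
```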
This is not a Kerberos ticket but a SPNEGO ticket. Your context has the wrong mechanism.
Edit: Though you now have the correct mech, your client is sending you an NTLM token, which the GSS-API is not able to process. Take the Base64 token, decode it to raw bytes, and display the ASCII chars. If it starts with NTLMSSP, it won't work for sure and you have a broken Kerberos setup.
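With the full token from the question, that check takes only a few lines of plain JDK code (java.util.Base64 here, rather than the internal Base64 class used in the question; the class and method names are made up for the illustration):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class TokenCheck {

    // the full token from the question
    static final String TOKEN =
            "YIGCBgYrBgEFBQKgeDB2oDAwLgYKKwYBBAGCNwICCgYJKoZIgvcSAQICBgkq"
          + "hkiG9xIBAgIGCisGAQQBgjcCAh6iQgRATlRMTVNTUAABAAAAl7II4g4ADgAy"
          + "AAAACgAKACgAAAAGAbEdAAAAD0xBUFRPUC0yNDVMSUZFQUNDT1VOVExMQw==";

    /** Returns true if the decoded token carries an embedded NTLM message. */
    static boolean containsNtlmssp(String base64Token) {
        byte[] raw = Base64.getDecoder().decode(base64Token);
        // NTLM messages always start with the ASCII signature "NTLMSSP\0"
        String ascii = new String(raw, StandardCharsets.ISO_8859_1);
        return ascii.contains("NTLMSSP");
    }

    public static void main(String[] args) {
        // prints true: the client fell back to NTLM inside the SPNEGO wrapper
        System.out.println(containsNtlmssp(TOKEN));
    }
}
```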
Edit 2: This is your ticket:
60 81 82 06 06 2B 06 01 05 05 02 A0 78 30 76 A0 30 30 2E 06 `..+..... x0v 00..
0A 2B 06 01 04 01 82 37 02 02 0A 06 09 2A 86 48 82 F7 12 01 .+....7.....*H÷..
02 02 06 09 2A 86 48 86 F7 12 01 02 02 06 0A 2B 06 01 04 01 ....*H÷......+....
82 37 02 02 1E A2 42 04 40 4E 54 4C 4D 53 53 50 00 01 00 00 7...¢B.#NTLMSSP....
00 97 B2 08 E2 0E 00 0E 00 32 00 00 00 0A 00 0A 00 28 00 00 .².â....2.......(..
00 06 01 B1 1D 00 00 00 0F 4C 41 50 54 4F 50 2D 32 34 35 4C ...±.....LAPTOP-245L
49 46 45 41 43 43 4F 55 4E 54 4C 4C 43 IFEACCOUNTLLC
This is a wrapped NTLM token inside a SPNEGO token, which simply means that Kerberos has failed for some reason, e.g.:
SPN not registered
Clockskew
Not allowed for Kerberos
Incorrect DNS records
Best option is to use Wireshark on the client to find the root cause.
Please note that Java does not support NTLM as a SPNEGO submechanism. NTLM is only supported by SSPI and Heimdal.
If the server does not have a keytab and an associated key registered with the KDC, you will never be able to use kerberos to validate a ticket.
Getting SPNEGO to work is tricky at best and will be next to impossible without at least a cursory understanding of how kerberos works. Try reading this dialog and see if you can get a better understanding.
http://web.mit.edu/kerberos/dialogue.html
SPNEGO requires an SPN of the form HTTP/server.example.com and you'll need to tell the GSS libraries where that keytab is when you start the server.
Why doesn't this work? I'm trying to test that an empty database is the same before doing nothing as after doing nothing. In other words, this is the simplest dbunit test with a database that I can think of. And it doesn't work. The test methods are practically lifted from http://www.dbunit.org/howto.html
The error message I'm getting for comparing empty database is:
java.lang.AssertionError: expected:
org.dbunit.dataset.xml.FlatXmlDataSet<AbstractDataSet[_orderedTableNameMap=null]>
but was:
org.dbunit.database.DatabaseDataSet<AbstractDataSet[_orderedTableNameMap=null]>
The error message I'm getting for comparing empty table is:
java.lang.AssertionError: expected:
<org.dbunit.dataset.DefaultTable[_metaData=tableName=test, columns=[], keys=[], _rowList.size()=0]>
but was:
<org.dbunit.database.CachedResultSetTable[_metaData=table=test, cols=[(id, DOUBLE, noNulls), (txt, VARCHAR, nullable)], pk=[(id, DOUBLE, noNulls)], _rowList.size()=0]>
(I've added newlines for readability)
I can edit in the full stack trace (or anything else) if it'll be useful. Or you can browse through the public git repo: https://bitbucket.org/djeikyb/simple_dbunit
Do I need to somehow convert my actual IDataSet to xml then back to IDataSet to properly compare? What am I doing/expecting wrong?
public class TestCase
{
    private IDatabaseTester database_tester;

    @Before
    public void setUp() throws Exception
    {
        database_tester = new JdbcDatabaseTester("com.mysql.jdbc.Driver",
                                                 "jdbc:mysql://localhost/cal",
                                                 "cal",
                                                 "cal");

        IDataSet data_set = new FlatXmlDataSetBuilder().build(
                new FileInputStream("src/simple_dbunit/dataset.xml"));
        database_tester.setDataSet(data_set);

        database_tester.onSetup();
    }

    @Test
    public void testDbNoChanges() throws Exception
    {
        // expected
        IDataSet expected_data_set = new FlatXmlDataSetBuilder().build(
                new FileInputStream("src/simple_dbunit/dataset.xml"));

        // actual
        IDatabaseConnection connection = database_tester.getConnection();
        IDataSet actual_data_set = connection.createDataSet();

        // test
        assertEquals(expected_data_set, actual_data_set);
    }

    @Test
    public void testTableNoChanges() throws Exception
    {
        // expected
        IDataSet expected_data_set = new FlatXmlDataSetBuilder().build(
                new FileInputStream("src/simple_dbunit/dataset.xml"));
        ITable expected_table = expected_data_set.getTable("test");

        // actual
        IDatabaseConnection connection = database_tester.getConnection();
        IDataSet actual_data_set = connection.createDataSet();
        ITable actual_table = actual_data_set.getTable("test");

        // test
        assertEquals(expected_table, actual_table);
    }
}
When you compare an IDataSet or other DbUnit components, you have to use the assert methods provided by DbUnit, e.g. org.dbunit.Assertion.assertEquals(ITable, ITable).
If you use the assert methods provided by JUnit, the objects are only compared via the equals method of Object.
That's why you get an error complaining about different object types.