My widgets.proto:
option java_package = "example";
option java_outer_classname = "WidgetsProtoc";

message Widget {
  required string name = 1;
  required int32 id = 2;
}

message WidgetList {
  repeated Widget widget = 1;
}
My REST resource (@Path("/widgets")):
@GET
@Produces("application/x-protobuf")
public WidgetsProtoc.WidgetList getAllWidgets() {
    Widget widget1 = Widget.newBuilder().setId(1).setName("testing").build();
    Widget widget2 = Widget.newBuilder().setId(100).setName("widget 2").build();
    System.err.println("widgets1: " + widget1);
    System.err.println("widgets2: " + widget2);
    WidgetsProtoc.WidgetList list = WidgetsProtoc.WidgetList.newBuilder()
            .addWidget(widget1)
            .addWidget(widget2)
            .build();
    System.err.println("list: " + list.toByteArray());
    return list;
}
And when I use Postman I get this response:
(emptyline)
(emptyline)
testing
(emptyline)
widget 2d
Is this normal? I don't think so... In my MessageBodyWriter class I override writeTo like this:
@Override
public void writeTo(WidgetsProtoc.WidgetList widgetList, Class<?> type, Type genericType,
                    Annotation[] annotations, MediaType mediaType,
                    MultivaluedMap<String, Object> httpHeaders, OutputStream entityStream)
        throws IOException, WebApplicationException {
    entityStream.write(widgetList.toByteArray());
}
I thought it would be fine to send a byte array... but it's a bit weird that it doesn't seem to serialize everything, or maybe it's just the id that's missing... Thanks for the help :)
It already is in binary.
Have you noticed that your message consists of a name and an id, but only the name is displayed?
That's what I get displayed when I just save the response as a "test" file from the browser's GET call:
$ cat test
testing
widget 2d%
You can see that cat is barely able to display it, but there is more content in it.
Now if I open it in Bless (a hex editor on Ubuntu) I can get more out of it:
0A 0B 0A 07 74 65 73 74 69 6E 67 10 01 0A 0C 0A 08 77 69 64 67 65 74 20 32 10 64
In general I wouldn't trust Postman to display binary data. You need to know how to decode it, and apparently Postman doesn't.
And here's the final proof:
$ cat test | protoc --decode_raw
1 {
1: "testing"
2: 1
}
1 {
1: "widget 2"
2: 100
}
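If you would rather verify the payload from Java than from the shell, here is a minimal client-side sketch. The endpoint URL and class name are placeholders; it assumes the generated example.WidgetsProtoc classes are on the classpath:

import java.io.InputStream;
import java.net.URL;
import java.net.URLConnection;

import example.WidgetsProtoc;

public class WidgetClient {
    public static void main(String[] args) throws Exception {
        // hypothetical endpoint; adjust host/port/path to your deployment
        URLConnection conn = new URL("http://localhost:8080/widgets").openConnection();
        conn.setRequestProperty("Accept", "application/x-protobuf");
        try (InputStream in = conn.getInputStream()) {
            // parseFrom reads the same binary wire format that toByteArray() produced
            WidgetsProtoc.WidgetList list = WidgetsProtoc.WidgetList.parseFrom(in);
            for (WidgetsProtoc.Widget w : list.getWidgetList()) {
                System.out.println(w.getId() + " -> " + w.getName());
            }
        }
    }
}

If both fields print, the writer is serializing the full message and only Postman's display was misleading.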
I have the following dataframe:
+-------------+-----------+---------------------+
|longitude    |latitude   |geom                 |
+-------------+-----------+---------------------+
|-7.07378166  |33.826661  |[00 00 00 00 01 0..  |
|-7.5952683   |33.544191  |[00 00 00 00 01 0..  |
+-------------+-----------+---------------------+
I'm using the following code:
Dataset<Row> result_f = sparkSession.sql("select * from data_f where ST_WITHIN(ST_GeomFromText(CONCAT('POINT(',longitude_f,' ',latitude_f,')',4326)),geom)");
result_f.show();
But I get the following error:
java.lang.ClassCastException: [B cannot be cast to org.apache.spark.sql.catalyst.util.ArrayData
at org.apache.spark.sql.geosparksql.expressions.ST_Within.eval(Predicates.scala:105)
EDIT
longitude: Double type
latitude: Double type
geom: Binary type
Any idea? I need your help.
Thank you
I don't think ST_GeomFromText is available for constructing a geometry from text; however, there are:
ST_GeomFromWKT
ST_GeomFromWKB
ST_GeomFromGeoJSON
ST_Point
ST_PointFromText
ST_PolygonFromText
ST_LineStringFromText
ST_PolygonFromEnvelope
ST_Circle
I suggest using either ST_Point or ST_PointFromText, and then applying the ST_WITHIN predicate.
Something like this:
Dataset<Row> result_f = sparkSession.sql("select * from data_f where ST_WITHIN(ST_Point(CAST(data_f.latitude AS Decimal(24,20)), CAST(data_f.longitude AS Decimal(24,20))),geom)");
result_f.show();
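If ST_Point is not registered in your GeoSpark version, ST_PointFromText is the other option mentioned above. This is only a sketch: it assumes GeoSpark's documented (coordinates, delimiter) signature for ST_PointFromText and the same column names as before.

Dataset<Row> result_f = sparkSession.sql(
    "select * from data_f " +
    "where ST_WITHIN(ST_PointFromText(CONCAT(data_f.latitude, ',', data_f.longitude), ','), geom)");
result_f.show();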
I would like to avoid null values for fields of a class used in a Dataset. I tried to use Scala's Option and Java's Optional, but both failed:
@AllArgsConstructor // lombok
@NoArgsConstructor // mutable type is required in java :(
@Data // see https://stackoverflow.com/q/59609933/1206998
public static class TestClass {
    String id;
    Option<Integer> optionalInt;
}
@Test
public void testDatasetWithOptionField() {
    Dataset<TestClass> ds = spark.createDataset(Arrays.asList(
            new TestClass("item 1", Option.apply(1)),
            new TestClass("item .", Option.empty())
    ), Encoders.bean(TestClass.class));

    ds.collectAsList().forEach(x -> System.out.println("Found " + x));
}
It fails at runtime with the message: File 'generated.java', Line 77, Column 47: Cannot instantiate abstract "scala.Option"
Question: Is there a way to encode optional fields without null in a Dataset, using Java?
Subsidiary question: by the way, I haven't used Datasets much in Scala either; can you confirm that it is actually possible in Scala to encode a case class containing Option fields?
Note: This is used in an intermediate dataset, i.e. something that is neither read nor written (except for Spark's internal serialization).
This is fairly simple to do in Scala.
Scala Implementation
import org.apache.spark.sql.{Encoders, SparkSession}

object Test {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder
      .appName("Stack-scala")
      .master("local[2]")
      .getOrCreate()

    val ds = spark.createDataset(Seq(
      TestClass("Item 1", Some(1)),
      TestClass("Item 2", None)
    ))(Encoders.product[TestClass])

    ds.collectAsList().forEach(println)

    spark.stop()
  }

  case class TestClass(
    id: String,
    optionalInt: Option[Int])
}
Java
There are various Option classes available in Java. However, none of them work out-of-the-box.
java.util.Optional: not serializable.
scala.Option: serializable but abstract, so when the CodeGenerator generates the following code, it fails:
/* 081 */ // initializejavabean(newInstance(class scala.Option))
/* 082 */ final scala.Option value_9 = false ?
/* 083 */ null : new scala.Option(); // ---> Such initialization is not possible for abstract classes
/* 084 */ scala.Option javaBean_1 = value_9;
org.apache.spark.api.java.Optional: Spark's implementation of Optional, which is serializable but has private constructors. So it fails with the error: No applicable constructor/method found for zero actual parameters. Since this is a final class, it's not possible to extend it.
/* 081 */ // initializejavabean(newInstance(class org.apache.spark.api.java.Optional))
/* 082 */ final org.apache.spark.api.java.Optional value_9 = false ?
/* 083 */ null : new org.apache.spark.api.java.Optional();
/* 084 */ org.apache.spark.api.java.Optional javaBean_1 = value_9;
/* 085 */ if (!false) {
One option is to use normal Java Optionals in the data class and then use Kryo as serializer.
Encoder<TestClass> en = Encoders.kryo(TestClass.class);

Dataset<TestClass> ds = spark.createDataset(Arrays.asList(
        new TestClass("item 1", Optional.of(1)),
        new TestClass("item .", Optional.empty())
), en);

ds.collectAsList().forEach(x -> System.out.println("Found " + x));
Output:
Found TestClass(id=item 1, optionalInt=Optional[1])
Found TestClass(id=item ., optionalInt=Optional.empty)
There is a downside when using Kryo: this encoder encodes in a binary format:
ds.printSchema();
ds.show(false);
prints
root
|-- value: binary (nullable = true)
+-------------------------------------------------------------------------------------------------------+
|value |
+-------------------------------------------------------------------------------------------------------+
|[01 00 4A 61 76 61 53 74 61 72 74 65 72 24 54 65 73 74 43 6C 61 73 F3 01 01 69 74 65 6D 20 B1 01 02 02]|
|[01 00 4A 61 76 61 53 74 61 72 74 65 72 24 54 65 73 74 43 6C 61 73 F3 01 01 69 74 65 6D 20 AE 01 00] |
+-------------------------------------------------------------------------------------------------------+
A UDF-based solution for getting the normal output columns back from a dataset encoded with Kryo is described in this answer.
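As a lightweight alternative for inspection only (it does not restore the real columns), you can map the Kryo-encoded dataset to plain strings just for display. A minimal sketch, reusing the TestClass bean from above:

import org.apache.spark.api.java.function.MapFunction;
import org.apache.spark.sql.Encoders;

// Renders each deserialized row via toString() so show() prints something readable
ds.map((MapFunction<TestClass, String>) TestClass::toString, Encoders.STRING())
  .show(false);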
Maybe a bit off-topic, but a probable starting point for a long-term solution is to look at the code of JavaTypeInference. The methods serializerFor and deserializerFor are used by ExpressionEncoder.javaBean to create the serializer and deserializer parts of the encoder for Java beans.
In this pattern-matching block
typeToken.getRawType match {
case c if c == classOf[String] => createSerializerForString(inputObject)
case c if c == classOf[java.time.Instant] => createSerializerForJavaInstant(inputObject)
case c if c == classOf[java.sql.Timestamp] => createSerializerForSqlTimestamp(inputObject)
case c if c == classOf[java.time.LocalDate] => createSerializerForJavaLocalDate(inputObject)
case c if c == classOf[java.sql.Date] => createSerializerForSqlDate(inputObject)
[...]
the handling for java.util.Optional is missing. It could probably be added here, as well as in the corresponding deserializer method. This would allow Java beans to have properties of type Optional.
Using java-9 build 9-ea+149 and jol 0.6.
Running this simple code:
ArrayList<Integer> list = new ArrayList<>();
list.add(12);
System.out.println(ClassLayout.parseInstance(list).toPrintable());
Output:
OFFSET SIZE TYPE DESCRIPTION VALUE
0 4 (object header) 01 00 00 00 (00000001 00000000 00000000 00000000) (1)
4 4 (object header) 00 00 00 00 (00000000 00000000 00000000 00000000) (0)
8 4 (object header) 0e 8d 00 f8 (00001110 10001101 00000000 11111000) (-134181618)
12 4 int AbstractList.modCount (access denied)
16 4 int ArrayList.size (access denied)
20 4 Object[] ArrayList.elementData (access denied)
This "(access denied)" part comes from FieldData.java, in the method:
public String safeValue(Object object) {
    if (refField != null) {
        try {
            return ObjectUtils.safeToString(refField.get(object));
        } catch (IllegalAccessException iae) {
            // exception, try again
        }
        try {
            refField.setAccessible(true);
            return ObjectUtils.safeToString(refField.get(object));
        } catch (Exception e) {
            return "(access denied)";
        }
    } else {
        return "N/A";
    }
}
And the actual exception is:
Unable to make field protected transient int java.util.AbstractList.modCount accessible: module java.base does not "opens java.util" to unnamed module @479d31f3.
I think that this has to do with Unsafe features being locked down. The question is how do I get this to run?
I've looked at options like:
-XaddExports:java.base/sun.security.provider=ALL-UNNAMED
But I can't really tell what it is supposed to look like.
The solution was indeed to pass the correct argument:
--add-opens java.base/java.util=ALL-UNNAMED
as suggested here
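For reference, a complete invocation could look like the line below. The class name and classpath entries are placeholders for your own setup; the same flag can also go into an IDE's VM options or a build tool's test-runner arguments.

java --add-opens java.base/java.util=ALL-UNNAMED -cp .:jol-core-0.6.jar JolExample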
I have the following code:
public static void main(String args[]) {
    try {
        //String ticket = "Negotiate YIGCBg...==";
        //byte[] kerberosTicket = ticket.getBytes();
        byte[] kerberosTicket = Base64.decode("YIGCBg...==");
        GSSContext context = GSSManager.getInstance().createContext((GSSCredential) null);
        context.acceptSecContext(kerberosTicket, 0, kerberosTicket.length);
        String user = context.getSrcName().toString();
        context.dispose();
    } catch (GSSException e) {
        e.printStackTrace();
    } catch (Base64DecodingException e) {
        e.printStackTrace();
    }
}
Of course it fails. Here's the exception:
GSSException: Defective token detected (Mechanism level: GSSHeader did not find the right tag)
I don't know what I'm supposed to do to solve this. Honestly, I don't really understand Kerberos.
I got this ticket by sending a 401 with the appropriate header WWW-Authenticate with 'Negotiate' as the value. The browser immediately issued the same request again with an authorization header containing this ticket.
I was hoping I could validate the ticket and determine who the user is.
Do I need a keytab file? If so, what credentials would I run this under? I'm trying to use the Kerberos ticket for auth for a web-site. Would the credentials be the credentials from IIS?
What am I missing?
Update 1
From Michael-O's reply, I did a bit more googling and found this article, which led me to this article.
On table 3, I found 1.3.6.1.5.5.2 SPNEGO.
I have now added that to my credentials following the example from the first article. Here's my code:
public static void main(String args[]) {
    try {
        Oid mechOid = new Oid("1.3.6.1.5.5.2");
        GSSManager manager = GSSManager.getInstance();
        GSSCredential myCred = manager.createCredential(null,
                GSSCredential.DEFAULT_LIFETIME,
                mechOid,
                GSSCredential.ACCEPT_ONLY);
        GSSContext context = manager.createContext(myCred);
        byte[] ticket = Base64.decode("YIGCBg...==");
        context.acceptSecContext(ticket, 0, ticket.length);
        String user = context.getSrcName().toString();
        context.dispose();
    } catch (GSSException e) {
        e.printStackTrace();
    } catch (Base64DecodingException e) {
        e.printStackTrace();
    }
}
But now the code is failing on createCredential with this error:
GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos credentails)
Here's the entire ticket: YIGCBgYrBgEFBQKgeDB2oDAwLgYKKwYBBAGCNwICCgYJKoZIgvcSAQICBgkqhkiG9xIBAgIGCisGAQQBgjcCAh6iQgRATlRMTVNTUAABAAAAl7II4g4ADgAyAAAACgAKACgAAAAGAbEdAAAAD0xBUFRPUC0yNDVMSUZFQUNDT1VOVExMQw==
Validating an SPNEGO ticket from Java is a somewhat convoluted process. Here's a brief overview but bear in mind that the process can have tons of pitfalls. You really need to understand how Active Directory, Kerberos, SPNEGO, and JAAS all operate to successfully diagnose problems.
Before you start, make sure you know your kerberos realm name for your windows domain. For the purposes of this answer I'll assume it's MYDOMAIN. You can obtain the realm name by running echo %userdnsdomain% from a cmd window. Note that kerberos is case sensitive and the realm is almost always ALL CAPS.
Step 1 - Obtain a Kerberos Keytab
In order for a kerberos client to access a service, it requests a ticket for the Service Principal Name [SPN] that represents that service. SPNs are generally derived from the machine name and the type of service being accessed (e.g. HTTP/www.my-domain.com). In order to validate a kerberos ticket for a particular SPN, you must have a keytab file that contains a shared secret known to both the Kerberos Domain Controller [KDC] Ticket Granting Ticket [TGT] service and the service provider (you).
In terms of Active Directory, the KDC is the Domain Controller, and the shared secret is just the plain text password of the account that owns the SPN. A SPN may be owned by either a Computer or a User object within the AD.
The easiest way to set up an SPN in AD, if you are defining the service yourself, is to set up a user-based SPN like so:
Create an unprivileged service account in AD whose password doesn't expire, e.g. SVC_HTTP_MYSERVER with password ReallyLongRandomPass
Bind the service SPN to the account using the windows setspn utility. Best practice is to define multiple SPNs for both the short name and the FQDN of the host:
setspn -U -S HTTP/myserver@MYDOMAIN SVC_HTTP_MYSERVER
setspn -U -S HTTP/myserver.my-domain.com@MYDOMAIN SVC_HTTP_MYSERVER
Generate a keytab for the account using Java's ktab utility.
ktab -k FILE:http_myserver.ktab -a HTTP/myserver@MYDOMAIN ReallyLongRandomPass
ktab -k FILE:http_myserver.ktab -a HTTP/myserver.my-domain.com@MYDOMAIN ReallyLongRandomPass
If you are trying to authenticate a pre-existing SPN that is bound to a Computer account or to a User account you do not control, the above will not work. You will need to extract the keytab from ActiveDirectory itself. The Wireshark Kerberos Page has some good pointers for this.
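If you do control the account, one common way to produce the keytab on the Windows side instead of ktab is Microsoft's ktpass tool. This is only a sketch using the example account from above; double-check the flags against your AD environment:

ktpass /princ HTTP/myserver.my-domain.com@MYDOMAIN /mapuser SVC_HTTP_MYSERVER /pass ReallyLongRandomPass /ptype KRB5_NT_PRINCIPAL /out http_myserver.ktab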
Step 2 - Setup your krb5.conf
In %JAVA_HOME%/jre/lib/security create a krb5.conf that describes your domain. Make sure the realm you define here matches what you setup for your SPN. If you don't put the file in the JVM directory, you can point to it by setting -Djava.security.krb5.conf=C:\path\to\krb5.conf on the command line.
Example:
[libdefaults]
    default_realm = MYDOMAIN

[realms]
    MYDOMAIN = {
        kdc = dc1.my-domain.com
        default_domain = my-domain.com
    }

[domain_realm]
    .my-domain.com = MYDOMAIN
    my-domain.com = MYDOMAIN
Step 3 - Setup JAAS login.conf
Your JAAS login.conf should define a login configuration that sets up the Krb5LoginModule as an acceptor. Here's an example that assumes the keytab we created above is in C:\http_myserver.ktab. Point to the JAAS config file by setting -Djava.security.auth.login.config=C:\path\to\login.conf on the command line.
http_myserver_mydomain {
    com.sun.security.auth.module.Krb5LoginModule required
    principal="HTTP/myserver.my-domain.com@MYDOMAIN"
    doNotPrompt="true"
    useKeyTab="true"
    keyTab="C:/http_myserver.ktab"
    storeKey="true"
    isInitiator="false";
};
Alternatively, you can generate a JAAS config at runtime like so:
public static Configuration getJaasKrb5TicketCfg(
        final String principal, final String realm, final File keytab) {
    return new Configuration() {
        @Override
        public AppConfigurationEntry[] getAppConfigurationEntry(String name) {
            Map<String, String> options = new HashMap<String, String>();
            options.put("principal", principal);
            options.put("keyTab", keytab.getAbsolutePath());
            options.put("doNotPrompt", "true");
            options.put("useKeyTab", "true");
            options.put("storeKey", "true");
            options.put("isInitiator", "false");
            return new AppConfigurationEntry[] {
                new AppConfigurationEntry(
                    "com.sun.security.auth.module.Krb5LoginModule",
                    LoginModuleControlFlag.REQUIRED, options)
            };
        }
    };
}
You would create a LoginContext for this configuration like so:
LoginContext ctx = new LoginContext("doesn't matter", subject, null,
        getJaasKrb5TicketCfg("HTTP/myserver.my-domain.com@MYDOMAIN", "MYDOMAIN",
                new File("C:/path/to/my.ktab")));
Step 4 - Accepting the ticket
This is a little off-the-cuff, but the general idea is to define a PrivilegedExceptionAction that performs the SPNEGO protocol using the ticket. Note that this example does not check that the SPNEGO protocol is complete. For example, if the client requested server authentication, you would need to return the token generated by acceptSecContext() in the authentication header of the HTTP response.
public class Krb5TicketValidateAction implements PrivilegedExceptionAction<String> {

    public Krb5TicketValidateAction(byte[] ticket, String spn) {
        this.ticket = ticket;
        this.spn = spn;
    }

    @Override
    public String run() throws Exception {
        final Oid spnegoOid = new Oid("1.3.6.1.5.5.2");
        GSSManager gssmgr = GSSManager.getInstance();

        // tell the GSSManager the Kerberos name of the service
        GSSName serviceName = gssmgr.createName(this.spn, GSSName.NT_USER_NAME);

        // get the service's credentials. note that this run() method was called by Subject.doAs(),
        // so the service's credentials (Service Principal Name and password) are already
        // available in the Subject
        GSSCredential serviceCredentials = gssmgr.createCredential(serviceName,
                GSSCredential.INDEFINITE_LIFETIME, spnegoOid, GSSCredential.ACCEPT_ONLY);

        // create a security context for decrypting the service ticket
        GSSContext gssContext = gssmgr.createContext(serviceCredentials);

        // decrypt the service ticket
        System.out.println("Entering acceptSecContext...");
        gssContext.acceptSecContext(this.ticket, 0, this.ticket.length);

        // get the client name from the decrypted service ticket
        // note that Active Directory created the service ticket, so we can trust it
        String clientName = gssContext.getSrcName().toString();

        // clean up the context
        gssContext.dispose();

        // return the authenticated client name
        return clientName;
    }

    private final byte[] ticket;
    private final String spn;
}
Then, to authenticate the ticket, you would do something like the following. Assume that ticket contains the already-Base64-decoded ticket from the authentication header. The spn should be derived from the Host header in the HTTP request, in the format HTTP/<HOST>@<REALM>. E.g. if the Host header was myserver.my-domain.com then spn should be HTTP/myserver.my-domain.com@MYDOMAIN.
public boolean isTicketValid(String spn, byte[] ticket) {
    LoginContext ctx = null;
    try {
        // this is the name from login.conf. This could also be a parameter
        String ctxName = "http_myserver_mydomain";

        // define the principal who will validate the ticket
        Principal principal = new KerberosPrincipal(spn, KerberosPrincipal.KRB_NT_SRV_INST);
        Set<Principal> principals = new HashSet<Principal>();
        principals.add(principal);

        // define the subject to execute our secure action as
        Subject subject = new Subject(false, principals, new HashSet<Object>(),
                new HashSet<Object>());

        // login the subject
        ctx = new LoginContext(ctxName, subject);
        ctx.login();

        // create a validator for the ticket and execute it
        Krb5TicketValidateAction validateAction = new Krb5TicketValidateAction(ticket, spn);
        String username = Subject.doAs(subject, validateAction);
        System.out.println("Validated service ticket for user " + username
                + " to access service " + spn);
        return true;
    } catch (PrivilegedActionException e) {
        System.out.println("Invalid ticket for " + spn + ": " + e);
    } catch (LoginException e) {
        System.out.println("Error creating validation LoginContext for "
                + spn + ": " + e);
    } finally {
        try {
            if (ctx != null) { ctx.logout(); }
        } catch (LoginException e) { /* noop */ }
    }
    return false;
}
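As noted in step 4, if the client asked the server to authenticate itself you would also have to send back the token returned by acceptSecContext(). A minimal sketch of that last step, assuming a servlet environment and a hypothetical responseToken byte array captured from the acceptSecContext() call:

import java.util.Base64;
import javax.servlet.http.HttpServletResponse;

// responseToken is the (possibly null) byte[] returned by gssContext.acceptSecContext(...)
void completeNegotiation(HttpServletResponse response, byte[] responseToken) {
    if (responseToken != null && responseToken.length > 0) {
        // hand the server's half of the handshake back to the client
        response.setHeader("WWW-Authenticate",
                "Negotiate " + Base64.getEncoder().encodeToString(responseToken));
    }
}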
This is not a Kerberos ticket but a SPNEGO ticket. Your context has the wrong mechanism.
Edit: Though you now have the correct mech, your client is sending you an NTLM token, which the GSS-API is not able to process. Take the Base64 token, decode it to raw bytes and display the ASCII chars. If it starts with NTLMSSP, it won't work for sure and you have a broken Kerberos setup.
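A quick way to do that check from Java, as a rough sketch (token here stands for whatever followed "Negotiate " in the Authorization header):

import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Decode the Authorization header value and look for the NTLM signature
byte[] raw = Base64.getDecoder().decode(token);
String ascii = new String(raw, StandardCharsets.US_ASCII);
if (ascii.contains("NTLMSSP")) {
    System.out.println("Client fell back to NTLM - fix SPN/DNS/clock issues before retrying Kerberos");
}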
Edit 2: This is your ticket:
60 81 82 06 06 2B 06 01 05 05 02 A0 78 30 76 A0 30 30 2E 06 `..+..... x0v 00..
0A 2B 06 01 04 01 82 37 02 02 0A 06 09 2A 86 48 82 F7 12 01 .+....7.....*H÷..
02 02 06 09 2A 86 48 86 F7 12 01 02 02 06 0A 2B 06 01 04 01 ....*H÷......+....
82 37 02 02 1E A2 42 04 40 4E 54 4C 4D 53 53 50 00 01 00 00 7...¢B.@NTLMSSP....
00 97 B2 08 E2 0E 00 0E 00 32 00 00 00 0A 00 0A 00 28 00 00 .².â....2.......(..
00 06 01 B1 1D 00 00 00 0F 4C 41 50 54 4F 50 2D 32 34 35 4C ...±.....LAPTOP-245L
49 46 45 41 43 43 4F 55 4E 54 4C 4C 43 IFEACCOUNTLLC
This is an NTLM token wrapped inside a SPNEGO token, which simply means that Kerberos has failed for some reason, e.g.:
SPN not registered
Clockskew
Not allowed for Kerberos
Incorrect DNS records
Best option is to use Wireshark on the client to find the root cause.
Please note that Java does not support NTLM as a SPNEGO submechanism. NTLM is only supported by SSPI and Heimdal.
If the server does not have a keytab and an associated key registered with the KDC, you will never be able to use Kerberos to validate a ticket.
Getting SPNEGO to work is tricky at best and will be next to impossible without at least a cursory understanding of how Kerberos works. Try reading this dialogue and see if you can get a better understanding.
http://web.mit.edu/kerberos/dialogue.html
SPNEGO requires an SPN of the form HTTP/server.example.com and you'll need to tell the GSS libraries where that keytab is when you start the server.
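For a typical Java server, "telling the GSS libraries where things are" usually comes down to a couple of system properties at startup. A sketch with placeholder paths and jar name (the debug flag is optional but handy while troubleshooting):

java -Djava.security.krb5.conf=/etc/krb5.conf \
     -Djava.security.auth.login.config=/etc/login.conf \
     -Dsun.security.krb5.debug=true \
     -jar my-server.jar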
Why doesn't this work? I'm trying to test that an empty database is the same before doing nothing as after doing nothing. In other words, this is the simplest dbunit test with a database that I can think of. And it doesn't work. The test methods are practically lifted from http://www.dbunit.org/howto.html
The error message I'm getting when comparing the empty database is:
java.lang.AssertionError: expected:
org.dbunit.dataset.xml.FlatXmlDataSet<AbstractDataSet[_orderedTableNameMap=null]>
but was:
org.dbunit.database.DatabaseDataSet<AbstractDataSet[_orderedTableNameMap=null]>
The error message I'm getting when comparing the empty table is:
java.lang.AssertionError: expected:
<org.dbunit.dataset.DefaultTable[_metaData=tableName=test, columns=[], keys=[], _rowList.size()=0]>
but was:
<org.dbunit.database.CachedResultSetTable[_metaData=table=test, cols=[(id, DOUBLE, noNulls), (txt, VARCHAR, nullable)], pk=[(id, DOUBLE, noNulls)], _rowList.size()=0]>
(I've added newlines for readability)
I can edit in the full stack trace (or anything else) if it'll be useful. Or you can browse through the public git repo: https://bitbucket.org/djeikyb/simple_dbunit
Do I need to somehow convert my actual IDataSet to xml then back to IDataSet to properly compare? What am I doing/expecting wrong?
public class TestCase
{

    private IDatabaseTester database_tester;

    @Before
    public void setUp() throws Exception
    {
        database_tester = new JdbcDatabaseTester("com.mysql.jdbc.Driver",
                                                 "jdbc:mysql://localhost/cal",
                                                 "cal",
                                                 "cal");

        IDataSet data_set = new FlatXmlDataSetBuilder().build(
                new FileInputStream("src/simple_dbunit/dataset.xml"));
        database_tester.setDataSet(data_set);

        database_tester.onSetup();
    }

    @Test
    public void testDbNoChanges() throws Exception
    {
        // expected
        IDataSet expected_data_set = new FlatXmlDataSetBuilder().build(
                new FileInputStream("src/simple_dbunit/dataset.xml"));

        // actual
        IDatabaseConnection connection = database_tester.getConnection();
        IDataSet actual_data_set = connection.createDataSet();

        // test
        assertEquals(expected_data_set, actual_data_set);
    }

    @Test
    public void testTableNoChanges() throws Exception
    {
        // expected
        IDataSet expected_data_set = new FlatXmlDataSetBuilder().build(
                new FileInputStream("src/simple_dbunit/dataset.xml"));
        ITable expected_table = expected_data_set.getTable("test");

        // actual
        IDatabaseConnection connection = database_tester.getConnection();
        IDataSet actual_data_set = connection.createDataSet();
        ITable actual_table = actual_data_set.getTable("test");

        // test
        assertEquals(expected_table, actual_table);
    }

}
When you compare IDataSet objects and other DBUnit components, you have to use the assert methods provided by DBUnit.
If you use the assert methods provided by JUnit, the objects will only be compared via the equals method of Object.
That's why you get the error complaining about different object types.
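Concretely, a minimal sketch of the fix for the two tests above would swap JUnit's assertEquals for DBUnit's own Assertion class (available in DBUnit 2.x):

import org.dbunit.Assertion;

// compare whole datasets
Assertion.assertEquals(expected_data_set, actual_data_set);

// or compare individual tables
Assertion.assertEquals(expected_table, actual_table);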