Using java-9 build 9-ea+149 and jol 0.6.
Running this simple code:
ArrayList<Integer> list = new ArrayList<>();
list.add(12);
System.out.println(ClassLayout.parseInstance(list).toPrintable());
Output:
OFFSET SIZE TYPE DESCRIPTION VALUE
0 4 (object header) 01 00 00 00 (00000001 00000000 00000000 00000000) (1)
4 4 (object header) 00 00 00 00 (00000000 00000000 00000000 00000000) (0)
8 4 (object header) 0e 8d 00 f8 (00001110 10001101 00000000 11111000) (-134181618)
12 4 int AbstractList.modCount (access denied)
16 4 int ArrayList.size (access denied)
20 4 Object[] ArrayList.elementData (access denied)
This access denied part comes from FieldData.java in the method:
public String safeValue(Object object) {
if (refField != null) {
try {
return ObjectUtils.safeToString(refField.get(object));
} catch (IllegalAccessException iae) {
// exception, try again
}
try {
refField.setAccessible(true);
return ObjectUtils.safeToString(refField.get(object));
} catch (Exception e) {
return "(access denied)";
}
} else {
return "N/A";
}
}
And the actual exception is:
Unable to make field protected transient int java.util.AbstractList.modCount accessible: module java.base does not "opens java.util" to unnamed module #479d31f3.
I think this has to do with the module system locking down reflective access. The question is: how do I get this to run?
I've looked at options like:
-XaddExports:java.base/sun.security.provider=ALL-UNNAMED
but I can't really tell what it is supposed to look like.
The solution was indeed to pass the correct argument:
--add-opens java.base/java.util=ALL-UNNAMED
as suggested here.
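For example, the launch could look like this (the main class name and classpath are just illustrative placeholders, not from the original post):
java --add-opens java.base/java.util=ALL-UNNAMED -cp .:jol-core-0.6.jar JolLayoutDemo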
I would like to avoid using null values for a field of a class used in a Dataset. I tried to use Scala Option and Java Optional, but it failed:
@AllArgsConstructor // lombok
@NoArgsConstructor // a mutable type is required in java :(
@Data // see https://stackoverflow.com/q/59609933/1206998
public static class TestClass {
String id;
Option<Integer> optionalInt;
}
@Test
public void testDatasetWithOptionField(){
Dataset<TestClass> ds = spark.createDataset(Arrays.asList(
new TestClass("item 1", Option.apply(1)),
new TestClass("item .", Option.empty())
), Encoders.bean(TestClass.class));
ds.collectAsList().forEach(x -> System.out.println("Found " + x));
}
It fails at runtime with the message: File 'generated.java', Line 77, Column 47: Cannot instantiate abstract "scala.Option"
Question: Is there a way to encode optional fields without null in a dataset, using java?
Subsidiary question: by the way, I haven't used Datasets much in Scala either; can you confirm that it is actually possible in Scala to encode a case class containing Option fields?
Note: this is used in an intermediate dataset, i.e. something that is neither read nor written externally (it only goes through Spark's internal serialization).
This is fairly simple to do in Scala.
Scala Implementation
import org.apache.spark.sql.{Encoders, SparkSession}
object Test {
def main(args: Array[String]): Unit = {
val spark = SparkSession.builder
.appName("Stack-scala")
.master("local[2]")
.getOrCreate()
val ds = spark.createDataset(Seq(
TestClass("Item 1", Some(1)),
TestClass("Item 2", None)
))( Encoders.product[TestClass])
ds.collectAsList().forEach(println)
spark.stop()
}
case class TestClass(
id: String,
optionalInt: Option[Int] )
}
Java
There are various Option classes available in Java. However, none of them work out-of-the-box.
java.util.Optional : Not serializable
scala.Option -> Serializable but abstract, so when CodeGenerator generates the following code, it fails!
/* 081 */ // initializejavabean(newInstance(class scala.Option))
/* 082 */ final scala.Option value_9 = false ?
/* 083 */ null : new scala.Option(); // ---> Such initialization is not possible for abstract classes
/* 084 */ scala.Option javaBean_1 = value_9;
org.apache.spark.api.java.Optional -> Spark's implementation of Optional, which is serializable but has private constructors. So it fails with the error: No applicable constructor/method found for zero actual parameters. Since this is a final class, it's not possible to extend it.
/* 081 */ // initializejavabean(newInstance(class org.apache.spark.api.java.Optional))
/* 082 */ final org.apache.spark.api.java.Optional value_9 = false ?
/* 083 */ null : new org.apache.spark.api.java.Optional();
/* 084 */ org.apache.spark.api.java.Optional javaBean_1 = value_9;
/* 085 */ if (!false) {
One option is to use a normal java.util.Optional in the data class and then use Kryo as the serializer.
Encoder<TestClass> en = Encoders.kryo(TestClass.class);
Dataset<TestClass> ds = spark.createDataset(Arrays.asList(
new TestClass("item 1", Optional.of(1)),
new TestClass("item .", Optional.empty())
), en);
ds.collectAsList().forEach(x -> System.out.println("Found " + x));
Output:
Found TestClass(id=item 1, optionalInt=Optional[1])
Found TestClass(id=item ., optionalInt=Optional.empty)
There is a downside when using Kryo: this encoder encodes in a binary format:
ds.printSchema();
ds.show(false);
prints
root
|-- value: binary (nullable = true)
+-------------------------------------------------------------------------------------------------------+
|value |
+-------------------------------------------------------------------------------------------------------+
|[01 00 4A 61 76 61 53 74 61 72 74 65 72 24 54 65 73 74 43 6C 61 73 F3 01 01 69 74 65 6D 20 B1 01 02 02]|
|[01 00 4A 61 76 61 53 74 61 72 74 65 72 24 54 65 73 74 43 6C 61 73 F3 01 01 69 74 65 6D 20 AE 01 00] |
+-------------------------------------------------------------------------------------------------------+
A UDF-based solution to get the normal output columns from a dataset encoded with Kryo is described in this answer.
Maybe a bit off-topic, but a starting point for a long-term solution is to look at the code of JavaTypeInference. The methods serializerFor and deserializerFor are used by ExpressionEncoder.javaBean to create the serializer and deserializer parts of the encoder for Java beans.
In this pattern matching block
typeToken.getRawType match {
case c if c == classOf[String] => createSerializerForString(inputObject)
case c if c == classOf[java.time.Instant] => createSerializerForJavaInstant(inputObject)
case c if c == classOf[java.sql.Timestamp] => createSerializerForSqlTimestamp(inputObject)
case c if c == classOf[java.time.LocalDate] => createSerializerForJavaLocalDate(inputObject)
case c if c == classOf[java.sql.Date] => createSerializerForSqlDate(inputObject)
[...]
the handling for java.util.Optional is missing. It could probably be added there, as well as in the corresponding deserializer method. This would allow Java beans to have properties of type Optional.
my widgets.proto:
option java_package = "example";
option java_outer_classname = "WidgetsProtoc";
message Widget {
required string name = 1;
required int32 id = 2;
}
message WidgetList {
repeated Widget widget = 1;
}
My REST resource (Path: /widgets):
@GET
@Produces("application/x-protobuf")
public WidgetsProtoc.WidgetList getAllWidgets() {
Widget widget1 =
Widget.newBuilder().setId(1).setName("testing").build();
Widget widget2 =
Widget.newBuilder().setId(100).setName("widget 2").build();
System.err.println("widgets1: " + widget1);
System.err.println("widgets2: " + widget2);
WidgetsProtoc.WidgetList list = WidgetsProtoc.WidgetList.newBuilder().addWidget(widget1).addWidget(widget2).build();
System.err.println("list: " + list.toByteArray());
return list;
}
And when I use Postman I get this response:
(emptyline)
(emptyline)
testing
(emptyline)
widget 2d
Is this normal? I think not really... In my MessageBodyWriter class I override writeTo like this:
@Override
public void writeTo(WidgetsProtoc.WidgetList widgetList, Class<?> type, Type genericType, Annotation[] annotations, MediaType mediaType, MultivaluedMap<String, Object> httpHeaders, OutputStream entityStream) throws IOException, WebApplicationException {
entityStream.write(widgetList.toByteArray());
}
I thought it was fine to send a byte array... but it is a little bit weird that it doesn't seem to serialize everything... or maybe it is just the id that is missing... Thanks for the help :)
It already is in binary.
Have you noticed that your message consists of a name and an id, but only the name gets displayed?
Here's what I get when I just save the response as a "test" file from the browser's GET call:
$ cat test
testing
widget 2d%
You can see that cat is barely able to display it, but there is more content in it.
Now if I open it in bless (a hex editor for Ubuntu), I can get more out of it:
0A 0B 0A 07 74 65 73 74 69 6E 67 10 01 0A 0C 0A 08 77 69 64 67 65 74 20 32 10 64
In general I wouldn't trust Postman to display binary data. You need to know how to decode it, and apparently Postman doesn't.
And here's the final proof:
$ cat test | protoc --decode_raw
1 {
1: "testing"
2: 1
}
1 {
1: "widget 2"
2: 100
}
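If you want to read the response programmatically instead of eyeballing it, let the generated class do the decoding. Here is a minimal client sketch (the endpoint URL is an assumption, not taken from the post):
import java.io.InputStream;
import java.net.URL;
import example.WidgetsProtoc;

public class WidgetClient {
    public static void main(String[] args) throws Exception {
        // Fetch the protobuf response and let the generated parser decode it.
        try (InputStream in = new URL("http://localhost:8080/widgets").openStream()) {
            WidgetsProtoc.WidgetList list = WidgetsProtoc.WidgetList.parseFrom(in);
            for (WidgetsProtoc.Widget w : list.getWidgetList()) {
                System.out.println(w.getId() + " -> " + w.getName());
            }
        }
    }
}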
I have the following:
javax.swing.Timer timer = new javax.swing.Timer(3000, new java.awt.event.ActionListener() {
@Override
public void actionPerformed(ActionEvent e) {
jLabelMsgL.setText("");
NitgenSwingWorker sWorker = new NitgenSwingWorker();
sWorker.execute();
}
});
timer.start();
private final class NitgenSwingWorker extends SwingWorker<Boolean, Void> {
@Override
protected Boolean doInBackground() throws Exception {
return nitgen.checkFinger();
}
@Override
protected void done() {
try {
Boolean isCheckFinger = get();
if(isCheckFinger){
delegate.getListaByIdEmpleado(123);
}else{
delegate.getListaByIdEmpleado(123);
}
} catch (InterruptedException | ExecutionException e) {
System.err.println("NitgenSwingWorker Error: " + e.getMessage());
}
}
}
but when isCheckFinger is true, it throws:
Mon Jun 29 10:44:14 CDT 2015 INFO: ** Performance Metrics Report **
Longest reported query: 0 ms
Shortest reported query: 9223372036854775807 ms
Average query execution time: NaN ms
Number of statements executed: 0
Number of result sets created: 0
Number of statements prepared: 0
Number of prepared statement executions: 0
Mon Jun 29 10:44:14 CDT 2015 TRACE: send() packet payload:
0a 00 00 00 03 73 65 6c . . . . . s e l
65 63 74 20 31 3b e c t . 1 ;
org.hibernate.exception.GenericJDBCException: Cannot open connection
at org.hibernate.exception.SQLStateConverter.handledNonSpecificException(SQLStateConverter.java:103)
at org.hibernate.exception.SQLStateConverter.convert(SQLStateConverter.java:91)
...
Caused by: java.sql.SQLException: Connections could not be acquired from the underlying database!
at com.mchange.v2.sql.SqlUtils.toSQLException(SqlUtils.java:106)
at com.mchange.v2.c3p0.impl.C3P0PooledConnectionPool.checkoutPooledConnection(C3P0PooledConnectionPool.java:529)
...
Caused by: com.mchange.v2.resourcepool.CannotAcquireResourceException: A ResourcePool could not acquire a resource from its primary factory or source.
at com.mchange.v2.resourcepool.BasicResourcePool.awaitAvailable(BasicResourcePool.java:1319)
at com.mchange.v2.resourcepool.BasicResourcePool.prelimCheckoutResource(BasicResourcePool.java:557)
...
It only gives me the error when nitgen.checkFinger returns true.
nitgen is a library for a fingerprint reader. I guess that when it returns
true, it changes something in Swing and Hibernate can no longer be accessed.
The exception is not thrown until it tries to do this:
getSession().beginTransaction()
Otherwise there are no problems. Can anyone help me?
I solved it.
The problem was that I was using a JNI library, and this happened when a value was changed. To fix it, I changed the default value, and that solved the problem.
public boolean checkFinger() throws Exception {
//Boolean thereIsFinger = Boolean.FALSE; //original line
Boolean thereIsFinger = Boolean.TRUE; //solution
//bsp is a JNI Library
bsp.CheckFinger(thereIsFinger);
return thereIsFinger;
}
I'm trying to patch a class with ASM. I need to add some logic in a function. This logic needs a new local variable. Here is what I've done:
class CreateHashTableMethodAdapter extends MethodAdapter {
@Override
public void visitMethodInsn(int opcode, String owner, String name, String desc) {
System.out.println(opcode + "/" + owner + "/" + name + "/" + desc);
if(opcode == Opcodes.INVOKESPECIAL &&
"javax/naming/InitialContext".equals(owner) &&
"<init>".equals(name) &&
"()V".equals(desc)){
System.out.println("In mod");
// 83: new #436; //class javax/naming/InitialContext
// 86: dup
mv.visitMethodInsn(Opcodes.INVOKESPECIAL, "javax/naming/InitialContext", "<init>", "()V");
mv.visitVarInsn(Opcodes.ASTORE, 1);
Label start_patch = new Label();
Label end_patch = new Label();
mv.visitLabel(start_patch);
mv.visitTypeInsn(Opcodes.NEW,"java/util/Hashtable");
mv.visitInsn(Opcodes.DUP);
mv.visitMethodInsn(Opcodes.INVOKESPECIAL, "java/util/Hashtable", "<init>", "()V");
mv.visitVarInsn(Opcodes.ASTORE,9);
// ........ sNip ..........
mv.visitLabel(end_patch);
mv.visitLocalVariable("env","Ljava/util/Hashtable;",null,start_patch,end_patch,9);
// 127: astore_1
}
else {
mv.visitMethodInsn(opcode, owner, name, desc);
}
}
}
When I run this method adapter against CheckClassAdapter it states:
org.objectweb.asm.tree.analysis.AnalyzerException: Error at instruction 51: Trying to access an inexistant local variable 9
.... sNiP ....
00050 R R . . . : R R : INVOKESPECIAL java/util/Hashtable.<init> ()V
00051 R R . . . : R : ASTORE 9
I think I'm misusing visitLocalVariable, but I cannot figure out where I'm supposed to call it.
When I javap the generated bytecode (without checking), I get the following local variable table:
LocalVariableTable:
Start Length Slot Name Signature
91 40 9 env Ljava/util/Hashtable;
0 343 0 this Lpmu/jms/ServerJMS;
132 146 1 initialContext Ljavax/naming/InitialContext;
153 125 2 topicConnectionFactory Ljavax/jms/TopicConnectionFactory;
223 55 3 topic Ljavax/jms/Topic;
249 29 4 topicSubscriber Ljavax/jms/TopicSubscriber;
279 55 1 ex Ljava/lang/Exception;
281 53 2 codeMessage I
289 45 3 params Lpmu/data/Parameters;
325 9 4 messageError Ljava/lang/String;
As you may notice, my variable is there, but it appears at the top of the table?!
Any idea?
One convenient way to create new local variables is to extend LocalVariablesSorter instead of MethodAdapter. Then you can allocate local variables as needed using newLocal() without interfering with existing variables. See section 3.3.3 of the ASM 4.0 guide ("A Java bytecode engineering library") on the ASM homepage for details.
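Here is a minimal sketch of the adapter from the question rewritten on top of LocalVariablesSorter (ASM 4.x API). The interesting change is that the slot for env comes from newLocal() instead of the hard-coded 9; the rest of the patch body is elided just like in the question:
import org.objectweb.asm.Label;
import org.objectweb.asm.MethodVisitor;
import org.objectweb.asm.Opcodes;
import org.objectweb.asm.Type;
import org.objectweb.asm.commons.LocalVariablesSorter;

class CreateHashTableMethodAdapter extends LocalVariablesSorter {

    CreateHashTableMethodAdapter(int access, String desc, MethodVisitor mv) {
        super(Opcodes.ASM4, access, desc, mv);
    }

    @Override
    public void visitMethodInsn(int opcode, String owner, String name, String desc) {
        if (opcode == Opcodes.INVOKESPECIAL
                && "javax/naming/InitialContext".equals(owner)
                && "<init>".equals(name)
                && "()V".equals(desc)) {
            // Re-emit the original constructor call; the store into the existing
            // local 1 goes through super so the sorter can remap it consistently.
            mv.visitMethodInsn(Opcodes.INVOKESPECIAL,
                    "javax/naming/InitialContext", "<init>", "()V");
            super.visitVarInsn(Opcodes.ASTORE, 1);

            // Ask the sorter for a fresh slot instead of guessing an index.
            int envSlot = newLocal(Type.getObjectType("java/util/Hashtable"));

            Label startPatch = new Label();
            Label endPatch = new Label();
            mv.visitLabel(startPatch);
            mv.visitTypeInsn(Opcodes.NEW, "java/util/Hashtable");
            mv.visitInsn(Opcodes.DUP);
            mv.visitMethodInsn(Opcodes.INVOKESPECIAL,
                    "java/util/Hashtable", "<init>", "()V");
            mv.visitVarInsn(Opcodes.ASTORE, envSlot);
            // ........ sNip ..........
            mv.visitLabel(endPatch);
            // envSlot is already an index in the rewritten frame, so the debug
            // entry is written directly to the delegate visitor.
            mv.visitLocalVariable("env", "Ljava/util/Hashtable;", null,
                    startPatch, endPatch, envSlot);
        } else {
            mv.visitMethodInsn(opcode, owner, name, desc);
        }
    }
}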
Why doesn't this work? I'm trying to test that an empty database is the same before doing nothing as after doing nothing. In other words, this is the simplest dbunit test with a database that I can think of. And it doesn't work. The test methods are practically lifted from http://www.dbunit.org/howto.html
The error message I'm getting for comparing empty database is:
java.lang.AssertionError: expected:
org.dbunit.dataset.xml.FlatXmlDataSet<AbstractDataSet[_orderedTableNameMap=null]>
but was:
org.dbunit.database.DatabaseDataSet<AbstractDataSet[_orderedTableNameMap=null]>
The error message I'm getting for comparing empty table is:
java.lang.AssertionError: expected:
<org.dbunit.dataset.DefaultTable[_metaData=tableName=test, columns=[], keys=[], _rowList.size()=0]>
but was:
<org.dbunit.database.CachedResultSetTable[_metaData=table=test, cols=[(id, DOUBLE, noNulls), (txt, VARCHAR, nullable)], pk=[(id, DOUBLE, noNulls)], _rowList.size()=0]>
(I've added newlines for readability)
I can edit in the full stack trace (or anything else) if it'll be useful. Or you can browse through the public git repo: https://bitbucket.org/djeikyb/simple_dbunit
Do I need to somehow convert my actual IDataSet to XML and then back to an IDataSet to compare them properly? What am I doing or expecting wrong?
public class TestCase
{

    private IDatabaseTester database_tester;

    @Before
    public void setUp() throws Exception
    {
        database_tester = new JdbcDatabaseTester("com.mysql.jdbc.Driver",
                                                 "jdbc:mysql://localhost/cal",
                                                 "cal",
                                                 "cal");

        IDataSet data_set = new FlatXmlDataSetBuilder().build(
                new FileInputStream("src/simple_dbunit/dataset.xml"));
        database_tester.setDataSet(data_set);

        database_tester.onSetup();
    }

    @Test
    public void testDbNoChanges() throws Exception
    {
        // expected
        IDataSet expected_data_set = new FlatXmlDataSetBuilder().build(
                new FileInputStream("src/simple_dbunit/dataset.xml"));

        // actual
        IDatabaseConnection connection = database_tester.getConnection();
        IDataSet actual_data_set = connection.createDataSet();

        // test
        assertEquals(expected_data_set, actual_data_set);
    }

    @Test
    public void testTableNoChanges() throws Exception
    {
        // expected
        IDataSet expected_data_set = new FlatXmlDataSetBuilder().build(
                new FileInputStream("src/simple_dbunit/dataset.xml"));
        ITable expected_table = expected_data_set.getTable("test");

        // actual
        IDatabaseConnection connection = database_tester.getConnection();
        IDataSet actual_data_set = connection.createDataSet();
        ITable actual_table = actual_data_set.getTable("test");

        // test
        assertEquals(expected_table, actual_table);
    }

}
When you compare IDataSet and other DbUnit components, you have to use the assert methods provided by DbUnit (org.dbunit.Assertion).
If you use the assert methods provided by JUnit, the objects are only compared via the equals method of Object.
That's why you get the error complaining about the different object types.
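For example, the table comparison from the question could be rewritten with DbUnit's own Assertion class (a sketch; it reuses the database_tester field and dataset.xml path from the question):
import static org.dbunit.Assertion.assertEquals; // DbUnit's assertion, not JUnit's

import java.io.FileInputStream;

import org.dbunit.IDatabaseTester;
import org.dbunit.database.IDatabaseConnection;
import org.dbunit.dataset.IDataSet;
import org.dbunit.dataset.ITable;
import org.dbunit.dataset.xml.FlatXmlDataSetBuilder;
import org.junit.Test;

public class DbUnitAssertionExample {

    private IDatabaseTester database_tester; // initialized in setUp() as in the question

    @Test
    public void testTableNoChanges() throws Exception {
        // expected: the "test" table as described in dataset.xml
        IDataSet expected_data_set = new FlatXmlDataSetBuilder().build(
                new FileInputStream("src/simple_dbunit/dataset.xml"));
        ITable expected_table = expected_data_set.getTable("test");

        // actual: the "test" table as it currently exists in the database
        IDatabaseConnection connection = database_tester.getConnection();
        ITable actual_table = connection.createDataSet().getTable("test");

        // DbUnit's Assertion compares the table data instead of object identity
        assertEquals(expected_table, actual_table);
    }
}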