Error when running Z3 java binding - java

I am running JavaExample.java provided with Z3 4.4.2, but I get the following output and then an error:
Z3 Major Version: 4
Z3 Full Version: 4.3.2.0
SimpleExample
Opt
Exception in thread "main" java.lang.UnsatisfiedLinkError: com.microsoft.z3.Native.INTERNALmkOptimize(J)J
at com.microsoft.z3.Native.INTERNALmkOptimize(Native Method)
at com.microsoft.z3.Native.mkOptimize(Native.java:5208)
at com.microsoft.z3.Optimize.<init>(Optimize.java:262)
at com.microsoft.z3.Context.mkOptimize(Context.java:3043)
at Z3Example.optimizeExample(Z3Example.java:2323)
at Z3Example.main(Z3Example.java:2362)
To be fair, I am using the 64-bit libz3java.dll provided with 4.3.2, while using the jar file com.microsoft.z3.jar from version 4.4.2, because that was the only combination that I managed to get working (this details those problems). Could the version difference be the reason for this error, or is there something else?

Yes, the version difference is responsible for this issue: 4.3.2 did not support optimization and thus doesn't come with mkOptimize. I'll take a look at the other issue separately.
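To make the failure mode explicit, here is a minimal pure-Java sketch of the kind of guard you could put in front of the Optimize API. The version string is hard-coded to the one from your output; in the real binding you would obtain it from com.microsoft.z3.Version.getFullVersion() instead:

```java
// Sketch: guard against calling Optimize against a native Z3 library that
// predates it. Optimize (mkOptimize) only exists from Z3 4.4 onwards.
public class Z3VersionGuard {
    /** Compares two dotted version strings numerically, like compareTo. */
    static int compareVersions(String a, String b) {
        String[] as = a.split("\\."), bs = b.split("\\.");
        int n = Math.max(as.length, bs.length);
        for (int i = 0; i < n; i++) {
            int ai = i < as.length ? Integer.parseInt(as[i]) : 0;
            int bi = i < bs.length ? Integer.parseInt(bs[i]) : 0;
            if (ai != bi) return Integer.compare(ai, bi);
        }
        return 0;
    }

    public static void main(String[] args) {
        String nativeVersion = "4.3.2.0"; // as printed by the failing run
        if (compareVersions(nativeVersion, "4.4.0") < 0) {
            System.out.println("Optimize unavailable: native Z3 " + nativeVersion + " predates 4.4");
        } else {
            System.out.println("Safe to call ctx.mkOptimize()");
        }
    }
}
```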


Java 11 Jasper report compilation error: org.eclipse.jdt.internal.compiler.classfmt.ClassFormatException

Getting following error when jasper report compile in Java 11:
ERROR [net.sf.jasperreports.engine.design.JRJdtCompiler] (default
task-94) Compilation error:
org.eclipse.jdt.internal.compiler.classfmt.ClassFormatException
[Server:app-node-00] at
deployment.app.ear//org.eclipse.jdt.internal.compiler.classfmt.ClassFileReader.(ClassFileReader.java:329)
[Server:app-node-00] at
deployment.app.ear//net.sf.jasperreports.engine.design.JRJdtCompiler$1.findType(JRJdtCompiler.java:251)
[Server:app-node-00] at
deployment.app.ear//net.sf.jasperreports.engine.design.JRJdtCompiler$1.findType(JRJdtCompiler.java:187)
[Server:app-node-00] at
deployment.app.ear//org.eclipse.jdt.internal.compiler.lookup.LookupEnvironment.askForType(LookupEnvironment.java:97)
We are using jasperreports-javaflow-6.5.1.jar.
org.tolven.library.jboss-rules.core-3.2.3.v_686_R32x.jar, assuming it's the same file as the one here, seems to be a very old JDT compiler implementation (released in 2007) that's not able to read classes compiled for Java 11.
For Java 11 you'll need a more recent JDT version, for instance 4.4.2. But then there's a risk that the code that depends on the 3.2.3 JDT implementation no longer works with the newer version, in which case I don't know what you can do.
You might also need to upgrade the JasperReports version; according to the change log, support for Java 11 was introduced in 6.8.0.
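For reference, the check that fails inside ClassFileReader boils down to reading the class-file header: bytes 0-3 are the magic number 0xCAFEBABE, bytes 4-5 the minor version, and bytes 6-7 the major version, which must be one the compiler knows (Java 11 classes use major version 55). A minimal sketch with a fabricated header, not JDT's actual code:

```java
import java.nio.ByteBuffer;

public class ClassFileMajor {
    /** Returns the class-file major version, or -1 if the magic number is wrong. */
    static int majorVersion(byte[] classBytes) {
        ByteBuffer buf = ByteBuffer.wrap(classBytes); // big-endian by default
        if (buf.getInt() != 0xCAFEBABE) return -1;
        buf.getShort();               // minor version, ignored here
        return buf.getShort() & 0xFFFF;
    }

    public static void main(String[] args) {
        // A fabricated 8-byte header standing in for a real Java 11 class file.
        byte[] header = { (byte) 0xCA, (byte) 0xFE, (byte) 0xBA, (byte) 0xBE,
                          0, 0,        // minor = 0
                          0, 55 };     // major = 55 -> Java 11
        int major = majorVersion(header);
        System.out.println("major=" + major + " -> Java " + (major - 44));
        // prints: major=55 -> Java 11
    }
}
```

A compiler built before a given major version existed has no choice but to reject the file, which is exactly the ClassFormatException above.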

How do I solve this NPE from trying to use Journey Browser

Using the Journey Browser project, I have set up a simple Maven project in Eclipse with the dependency provided on the page, and I have tried to run the code example (also provided on the page).
I get an NPE:
Exception in thread "main" java.lang.ExceptionInInitializerError
at java.base/java.lang.Class.forName0(Native Method)
at java.base/java.lang.Class.forName(Class.java:315)
at java.desktop/java.awt.Toolkit$2.run(Toolkit.java:588)
at java.desktop/java.awt.Toolkit$2.run(Toolkit.java:583)
at java.base/java.security.AccessController.doPrivileged(Native Method)
at java.desktop/java.awt.Toolkit.getDefaultToolkit(Toolkit.java:582)
at java.desktop/java.awt.Toolkit.getEventQueue(Toolkit.java:1494)
at java.desktop/java.awt.EventQueue.isDispatchThread(EventQueue.java:1086)
at java.desktop/javax.swing.SwingUtilities.isEventDispatchThread(SwingUtilities.java:1493)
at com.codebrig.journey.JourneyBrowserView.<init>(JourneyBrowserView.java:78)
at com.codebrig.journey.JourneyBrowserView.<init>(JourneyBrowserView.java:71)
at JourneyBrowser.main(JourneyBrowser.java:13)
Caused by: java.lang.NullPointerException
at java.base/java.lang.ClassLoader.loadLibrary(ClassLoader.java:2646)
at java.base/java.lang.Runtime.loadLibrary0(Runtime.java:830)
at java.base/java.lang.System.loadLibrary(System.java:1870)
at java.desktop/sun.awt.windows.WToolkit$1.run(WToolkit.java:118)
at java.desktop/sun.awt.windows.WToolkit$1.run(WToolkit.java:115)
at java.base/java.security.AccessController.doPrivileged(Native Method)
at java.desktop/sun.awt.windows.WToolkit.loadLibraries(WToolkit.java:114)
at java.desktop/sun.awt.windows.WToolkit.<clinit>(WToolkit.java:129)
... 12 more
Can anyone explain what is happening here and why? I have tried this on 64-bit Windows with Java Corretto 11 (jdk11.0.7_10). I initially found a bug here, relating to loadLibrary in OpenJDK, and thought that might be the problem, but I just don't have a good enough understanding to work out how to get around it.
I'm not sure, but I think that Corretto bug is probably the one causing the problem. As noted, it comes from their "upstream", i.e. the OpenJDK codebase. It was due to a regression that appeared in jdk11.0.7 as a result of a backport of a fix for another problem. Apparently, the fix changes some internal JDK fields, and that breaks application code. As JDK-8240521 puts it:
The backport of JDK-8231584 changes internal JDK fields processing. The problem is that many third-party applications copy-pasted a hack that depends on a particular JDK implementation.
If I am reading the Oracle bug entries correctly, the reversion of the broken fix should be in JDK 11.0.8. Alternatively, an earlier JDK 11 release than 11.0.7 shouldn't have the broken fix.
Let me know if changing your Java 11 install solves the problem. (If not, I'll see if I can get the line numbers to match up.)
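If it helps to confirm exactly which update you are on before and after swapping installs, here is a minimal sketch; the parsing is a simplification that assumes an "11.0.x"-style version string, not a general-purpose version parser:

```java
// Quick check for the affected update: the broken backport of JDK-8231584
// shipped in 11.0.7 and should be reverted in 11.0.8, so the exact update
// level of the running JVM is what matters.
public class JdkUpdateCheck {
    /** Extracts the update number from a version string like "11.0.7". */
    static int updateLevel(String version) {
        String[] parts = version.split("[._+-]");
        return parts.length >= 3 ? Integer.parseInt(parts[2]) : 0;
    }

    public static void main(String[] args) {
        String version = System.getProperty("java.version");
        System.out.println("Running " + version);
        if (version.startsWith("11.0.") && updateLevel(version) == 7) {
            System.out.println("This is 11.0.7 -- the release with the loadLibrary regression");
        }
    }
}
```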

Spark based application failing in JDK 8?

I am running the built-in sample example that comes as part of the Spark installation, on Hadoop 2.7 + Spark with JDK 8. However, it gives me the following error:
Exception in thread "main" java.lang.OutOfMemoryError: Cannot allocate
new DoublePointer(10000000): totalBytes = 363M, physicalBytes = 911M
at
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) Caused by: java.lang.OutOfMemoryError: Physical memory usage is too
high: physicalBytes = 911M > maxPhysicalBytes = 911M
at org.bytedeco.javacpp.Pointer.deallocator(Pointer.java:572)
at org.bytedeco.javacpp.Pointer.init(Pointer.java:121)
I followed this SO question as well and made the configuration changes.
In addition to this, I referred to these links as well: YARN-4714, HADOOP-11090.
Are there any issues with running Spark on JDK 8?
These are the versions of the software that I am running in my simple cluster:
jdk-8u131-linux-x64
scala-2.12.2
spark-2.1.1-bin-without-hadoop
hadoop-2.7.0
One thing to note: when I run the program with JDK 7, it works fine, but it fails with JDK 8.
Has anyone encountered this problem, and if so, what is the fix? Are Hadoop, Spark, and Scala not yet compatible with JDK 8?
Can anyone please help me?
The OOM error you are receiving indicates a shortage of memory for Java to launch. As you mentioned, it worked fine with JDK 7; JDK 8 requires more memory than JDK 7. Please check the JDK 8 memory requirements here: https://dzone.com/articles/java-8-permgen-metaspace
Spark has not yet been released for Scala 2.12. I don't know if it solves the original problem, but you should switch to Scala 2.11 anyway.
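One more note on the memory angle: the limit in your stack trace (maxPhysicalBytes) is enforced by JavaCPP (org.bytedeco.javacpp), which the failing code path is pulling in; as far as I know it can be raised with the -Dorg.bytedeco.javacpp.maxphysicalbytes system property. As a first diagnostic step, a minimal sketch to see what the driver JVM itself has actually been given:

```java
// Quick diagnostic: print the heap the JVM was launched with. If maxMemory is
// far below what the job needs, raise it (e.g. via spark-submit
// --driver-memory / --executor-memory) before tuning anything else.
public class HeapReport {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024 * 1024;
        System.out.println("max heap:   " + rt.maxMemory() / mb + " MB");
        System.out.println("total heap: " + rt.totalMemory() / mb + " MB");
        System.out.println("free heap:  " + rt.freeMemory() / mb + " MB");
    }
}
```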

BusinessObjects semantic layer results in Unsupported Major.Minor Version 51.0

I'm using business objects to connect to a new database using JDBC. I receive the error:
Database error: (CS) “Java Exception : java.lang.UnsupportedClassVersionError: Unsupported major.minor version 51.0”. (IES 10901) (Error: INF)
From what I understand, this is telling me that there is an incompatibility with the Java version used to compile the jar. I'm assuming from the error that the jar was compiled for Java 7. However, I'm not quite sure how to go about fixing this. I can find information about this error for other applications, but not BusinessObjects. Does anyone have any ideas?
The reason is that your SAP JVM or JDK version is too old to read class file format 51.0, which corresponds to Java 7.
Try upgrading your Java version (to Java 7 or later) to fix this problem.
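You can reproduce the check in plain Java. A minimal sketch, relying only on the standard java.class.version system property, which reports the highest class-file format the running JVM accepts (51.0 is Java 7's format):

```java
// Sketch: check whether the running JVM can load class format 51.0 (Java 7).
public class ClassVersionCheck {
    public static void main(String[] args) {
        double supported = Double.parseDouble(System.getProperty("java.class.version"));
        System.out.println("JVM supports class format up to " + supported);
        if (supported < 51.0) {
            System.out.println("Too old for a Java 7 jar -- upgrade the JVM");
        }
    }
}
```

Running this under the JVM that BusinessObjects uses would confirm whether that JVM is the one rejecting the jar.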

Android, GoogleAnalytics - java.lang.NullPointerException

I'm using GoogleAnalyticsServicesAndroid_3.0 and have a min SDK of Android API 9.
I keep seeing the following error in my bug report system :
java.lang.NullPointerException
at android.app.ContextImpl.openFileOutput(ContextImpl.java:431)
at android.content.ContextWrapper.openFileOutput(ContextWrapper.java:158)
at com.google.analytics.tracking.android.ClientIdDefaultProvider.storeClientId(ClientIdDefaultProvider.java:102)
at com.google.analytics.tracking.android.ClientIdDefaultProvider.generateClientId(ClientIdDefaultProvider.java:123)
at com.google.analytics.tracking.android.ClientIdDefaultProvider.initializeClientId(ClientIdDefaultProvider.java:179)
at com.google.analytics.tracking.android.ClientIdDefaultProvider$1.run(ClientIdDefaultProvider.java:134)
I haven't been able to find a solution for this; any help is appreciated.
This seems to only happen on 2.3.x OS versions.
Thanks,
Shimi
