Keytool - java.lang.ExceptionInInitializerError - java

I have installed IBM Java 8 and I am getting the exception below when trying to run the keytool command. It even happens when just executing ./keytool.
Exception in thread "main" java.lang.ExceptionInInitializerError
at java.lang.J9VMInternals.ensureError(J9VMInternals.java:147)
at java.lang.J9VMInternals.recordInitializationFailure(J9VMInternals.java:136)
Caused by: java.lang.NullPointerException
at com.ibm.oti.vm.AbstractClassLoader.findResourceImpl(AbstractClassLoader.java:210)
at com.ibm.oti.vm.AbstractClassLoader.access$100(AbstractClassLoader.java:41)
at com.ibm.oti.vm.AbstractClassLoader$2.run(AbstractClassLoader.java:250)
at java.security.AccessController.doPrivileged(AccessController.java:678)
at com.ibm.oti.vm.AbstractClassLoader.findResources(AbstractClassLoader.java:246)
at java.lang.ClassLoader.getResources(ClassLoader.java:683)
at java.lang.ClassLoader.getResources(ClassLoader.java:680)
at java.util.ServiceLoader$LazyIterator.hasNextService(ServiceLoader.java:359)
at java.util.ServiceLoader$LazyIterator.hasNext(ServiceLoader.java:404)
at java.util.ServiceLoader$1.hasNext(ServiceLoader.java:485)
at sun.util.locale.provider.SPILocaleProviderAdapter$1.run(SPILocaleProviderAdapter.java:92)
at sun.util.locale.provider.SPILocaleProviderAdapter$1.run(SPILocaleProviderAdapter.java:86)
at java.security.AccessController.doPrivileged(AccessController.java:734)
at sun.util.locale.provider.SPILocaleProviderAdapter.findInstalledProvider(SPILocaleProviderAdapter.java:86)
at sun.util.locale.provider.AuxLocaleProviderAdapter.getLocaleServiceProvider(AuxLocaleProviderAdapter.java:82)
at sun.util.locale.provider.LocaleProviderAdapter.findAdapter(LocaleProviderAdapter.java:296)
at sun.util.locale.provider.LocaleProviderAdapter.getAdapter(LocaleProviderAdapter.java:266)
at java.text.Collator.getInstance(Collator.java:274)
at java.text.Collator.getInstance(Collator.java:238)
at com.ibm.crypto.tools.KeyTool.<clinit>(Unknown Source)
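
The trace shows keytool never reaches its own logic: the NullPointerException happens in the class loader's resource lookup while Collator.getInstance() (called from KeyTool's static initializer) is searching for a locale provider through the ServiceLoader. A minimal diagnostic sketch, assuming it is compiled and run with the same IBM Java 8 installation (the class name DiagCollator is made up), to check whether the JRE's locale lookup is broken on its own, independent of keytool:

// Compile and run with the same JRE, e.g.
//   <ibm-java-home>/bin/javac DiagCollator.java
//   <ibm-java-home>/bin/java DiagCollator
import java.text.Collator;

public class DiagCollator {
    public static void main(String[] args) {
        // Same call that keytool's static initializer makes (see the trace above).
        Collator c = Collator.getInstance();
        System.out.println("Collator OK: " + c.getClass().getName());
    }
}

If this small program fails with the same ExceptionInInitializerError, the problem lies in the JRE installation (or whatever is on its classpath) rather than in keytool itself.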

Liquibase ClassCastException on LiquibaseConfiguration when running update on MongoDB

The following jars are present on the classpath:
mongo-java-driver-3.12.11.jar
liquibase-mongodb-4.11.0.jar
liquibase-core-4.11.0.jar
[2022-06-17 12:54:30] SEVERE [liquibase.integration] Unexpected error running Liquibase: java.lang.ClassCastException: liquibase.configuration.LiquibaseConfiguration cannot be cast to liquibase.SingletonObject
java.lang.ExceptionInInitializerError
at liquibase.ext.mongodb.database.MongoLiquibaseDatabase.getAdjustTrackingTablesOnStartup(MongoLiquibaseDatabase.java:111)
at liquibase.ext.mongodb.lockservice.MongoLockService.adjustRepository(MongoLockService.java:97)
at liquibase.nosql.lockservice.AbstractNoSqlLockService.init(AbstractNoSqlLockService.java:102)
at liquibase.nosql.lockservice.AbstractNoSqlLockService.acquireLock(AbstractNoSqlLockService.java:155)
at liquibase.nosql.lockservice.AbstractNoSqlLockService.waitForLock(AbstractNoSqlLockService.java:119)
at liquibase.Liquibase$1.run(Liquibase.java:188)
at liquibase.Scope.lambda$child$0(Scope.java:159)
at liquibase.Scope.child(Scope.java:170)
at liquibase.Scope.child(Scope.java:158)
at liquibase.Scope.child(Scope.java:137)
at liquibase.Liquibase.runInScope(Liquibase.java:1790)
at liquibase.Liquibase.update(Liquibase.java:183)
at liquibase.Liquibase.update(Liquibase.java:179)
at liquibase.integration.commandline.Main.doMigration(Main.java:1543)
at liquibase.integration.commandline.Main$1.lambda$run$0(Main.java:316)
at liquibase.Scope.lambda$child$0(Scope.java:159)
at liquibase.Scope.child(Scope.java:170)
at liquibase.Scope.child(Scope.java:158)
at liquibase.Scope.child(Scope.java:137)
at liquibase.Scope.child(Scope.java:183)
at liquibase.Scope.child(Scope.java:187)
at liquibase.integration.commandline.Main$1.run(Main.java:315)
at liquibase.integration.commandline.Main$1.run(Main.java:166)
at liquibase.Scope.child(Scope.java:170)
at liquibase.Scope.child(Scope.java:144)
at liquibase.integration.commandline.Main.run(Main.java:166)
at liquibase.integration.commandline.Main.main(Main.java:145)
Caused by: liquibase.exception.UnexpectedLiquibaseException: java.lang.ClassCastException: liquibase.configuration.LiquibaseConfiguration cannot be cast to liquibase.SingletonObject
at liquibase.Scope.getSingleton(Scope.java:269)
at liquibase.Scope.getSingleton(Scope.java:252)
at liquibase.Scope.getSingleton(Scope.java:252)
at liquibase.Scope.getSingleton(Scope.java:252)
at liquibase.configuration.ConfigurationDefinition$Building.build(ConfigurationDefinition.java:323)
at liquibase.ext.mongodb.configuration.MongoConfiguration.<clinit>(MongoConfiguration.java:23)
... 27 more
Caused by: java.lang.ClassCastException: liquibase.configuration.LiquibaseConfiguration cannot be cast to liquibase.SingletonObject
at liquibase.Scope.getSingleton(Scope.java:266)
... 32 more
I'm not sure if this is a dependency/classpath issue. Directions appreciated.
Running Liquibase 4.0.0
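
Given that the jars listed above are 4.11.0 while the run reports Liquibase 4.0.0, one thing worth checking is whether two different liquibase-core versions end up on the classpath: a ClassCastException between two Liquibase core types usually means the classes were loaded from mismatched jars. A small sketch, assumed to run with exactly the same classpath as the failing command (the class name WhichJar is made up), that prints which jar each class actually comes from:

import liquibase.SingletonObject;
import liquibase.configuration.LiquibaseConfiguration;

public class WhichJar {
    public static void main(String[] args) {
        // If LiquibaseConfiguration is loaded from an older liquibase-core jar
        // than SingletonObject (or from a second copy), that jar is the suspect.
        print(LiquibaseConfiguration.class);
        print(SingletonObject.class);
    }

    static void print(Class<?> c) {
        System.out.println(c.getName() + " -> "
                + c.getProtectionDomain().getCodeSource().getLocation());
    }
}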

Scala in VS Code: Exception in thread "main" java.lang.IllegalArgumentException: name

I am trying to install Scala in VS Code, but it's not working (or running) and I get this error.
This is my code:
object assd extends App {
  print("dfdfdfdfdf")
}
And this is the error:
[Running] scala "c:\Users\ahmed\Desktop\scala\tempCodeRunnerFile.scala"
Exception in thread "main" java.lang.IllegalArgumentException: name
at java.base/jdk.internal.loader.URLClassPath$Loader.getResource(URLClassPath.java:636)
at java.base/jdk.internal.loader.URLClassPath.getResource(URLClassPath.java:314)
at java.base/java.net.URLClassLoader$1.run(URLClassLoader.java:455)
at java.base/java.net.URLClassLoader$1.run(URLClassLoader.java:452)
at java.base/java.security.AccessController.doPrivileged(Native Method)
at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:451)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:589)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
at java.base/java.lang.Class.forName0(Native Method)
at java.base/java.lang.Class.forName(Class.java:398)
at scala.reflect.internal.util.RichClassLoader$.$anonfun$tryClass$extension$1(ScalaClassLoader.scala:47)
at scala.util.control.Exception$Catch.$anonfun$opt$1(Exception.scala:245)
at scala.util.control.Exception$Catch.apply(Exception.scala:227)
at scala.util.control.Exception$Catch.opt(Exception.scala:245)
at scala.reflect.internal.util.RichClassLoader$.tryClass$extension(ScalaClassLoader.scala:47)
at scala.reflect.internal.util.ScalaClassLoader.tryToLoadClass(ScalaClassLoader.scala:41)
at scala.reflect.internal.util.ScalaClassLoader.tryToLoadClass$(ScalaClassLoader.scala:119)
at scala.reflect.internal.util.ScalaClassLoader$URLClassLoader.tryToLoadClass(ScalaClassLoader.scala:161)
at scala.reflect.internal.util.ScalaClassLoader$.classExists(ScalaClassLoader.scala:189)
at scala.tools.nsc.GenericRunnerCommand.guessHowToRun(GenericRunnerCommand.scala:43)
at scala.tools.nsc.GenericRunnerCommand.<init>(GenericRunnerCommand.scala:62)
at scala.tools.nsc.GenericRunnerCommand.<init>(GenericRunnerCommand.scala:25)
at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:45)
at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:108)
at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)
[Done] exited with code=1 in 0.99 seconds
And this is the PATH window (screenshot).
These are the extensions that I added:
Scala
Code Runner
If you use Code Runner, you can change its scala executor entry:
"scala": "cd $dir && scala $fileName",

Jetty error "IllegalStateException: No Method: <Call name="addBefore"> ... on class ...Configuration$ClassList"

I am using an external Jetty, version 9.4.35.v20201120, and I am getting this error in Eclipse. Not sure why! I have researched a lot but am not able to fix it. Can anyone help here?
Exception in thread "main" java.lang.IllegalStateException: No Method: <Call name="addBefore"><Arg name="addBefore" type="java.lang.String">org.eclipse.jetty.webapp.JettyWebXmlConfiguration</Arg><Arg><Array type="java.lang.String"><Item>org.eclipse.jetty.annotations.AnnotationConfiguration</Item></Array></Arg></Call> on class org.eclipse.jetty.webapp.Configuration$ClassList
at org.eclipse.jetty.xml.XmlConfiguration$JettyXmlConfiguration.call(XmlConfiguration.java:950)
at org.eclipse.jetty.xml.XmlConfiguration$JettyXmlConfiguration.configure(XmlConfiguration.java:515)
at org.eclipse.jetty.xml.XmlConfiguration$JettyXmlConfiguration.call(XmlConfiguration.java:945)
at org.eclipse.jetty.xml.XmlConfiguration$JettyXmlConfiguration.configure(XmlConfiguration.java:515)
at org.eclipse.jetty.xml.XmlConfiguration$JettyXmlConfiguration.configure(XmlConfiguration.java:431)
at org.eclipse.jetty.xml.XmlConfiguration.configure(XmlConfiguration.java:364)
at net.sourceforge.eclipsejetty.starter.jetty9.Jetty9LauncherMain.configure(Jetty9LauncherMain.java:85)
at net.sourceforge.eclipsejetty.starter.common.AbstractJettyLauncherMain.configure(AbstractJettyLauncherMain.java:144)
at net.sourceforge.eclipsejetty.starter.common.AbstractJettyLauncherMain.launch(AbstractJettyLauncherMain.java:75)
at net.sourceforge.eclipsejetty.starter.jetty9.Jetty9LauncherMain.main(Jetty9LauncherMain.java:42)
Caused by: java.lang.NoSuchMethodException: addBefore
at org.eclipse.jetty.xml.XmlConfiguration$JettyXmlConfiguration.call(XmlConfiguration.java:987)
at org.eclipse.jetty.xml.XmlConfiguration$JettyXmlConfiguration.call(XmlConfiguration.java:942)
... 9 more
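
The XML in the error is calling Configuration.ClassList.addBefore(...) to insert the annotation configuration before JettyWebXmlConfiguration. For reference, roughly the same call written as embedded Jetty 9.4 code looks like the sketch below (assuming jetty-webapp and jetty-annotations 9.4.x jars are on the classpath); if this does not compile or run against the Jetty jars your Eclipse launcher actually uses, the launcher's generated XML and the server version are out of sync:

import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.webapp.Configuration;

public class ClassListCheck {
    public static void main(String[] args) {
        Server server = new Server(8080);
        // The programmatic equivalent of <Call name="addBefore"> in the failing XML.
        Configuration.ClassList classlist = Configuration.ClassList.setServerDefault(server);
        classlist.addBefore(
                "org.eclipse.jetty.webapp.JettyWebXmlConfiguration",
                "org.eclipse.jetty.annotations.AnnotationConfiguration");
        System.out.println("ClassList loaded from: "
                + Configuration.ClassList.class.getProtectionDomain()
                        .getCodeSource().getLocation());
    }
}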

Spark 1.6 java.io.IOException: Filesystem closed

I have a strange issue. I'm not closing the file system with .close() or .abort().
Locally my application works perfectly, but when I do spark-submit on the cluster I get a lot of exceptions:
> ERROR scheduler.LiveListenerBus: Listener EventLoggingListener threw an exception
java.lang.reflect.InvocationTargetException
at sun.reflect.GeneratedMethodAccessor14.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.scheduler.EventLoggingListener$$anonfun$logEvent$3.apply(EventLoggingListener.scala:150)
at org.apache.spark.scheduler.EventLoggingListener$$anonfun$logEvent$3.apply(EventLoggingListener.scala:150)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.EventLoggingListener.logEvent(EventLoggingListener.scala:150)
at org.apache.spark.scheduler.EventLoggingListener.onApplicationEnd(EventLoggingListener.scala:196)
at org.apache.spark.scheduler.SparkListenerBus$class.onPostEvent(SparkListenerBus.scala:54)
at org.apache.spark.scheduler.LiveListenerBus.onPostEvent(LiveListenerBus.scala:31)
at org.apache.spark.scheduler.LiveListenerBus.onPostEvent(LiveListenerBus.scala:31)
at org.apache.spark.util.ListenerBus$class.postToAll(ListenerBus.scala:55)
at org.apache.spark.util.AsynchronousListenerBus.postToAll(AsynchronousListenerBus.scala:38)
at org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(AsynchronousListenerBus.scala:87)
at org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(AsynchronousListenerBus.scala:72)
at org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(AsynchronousListenerBus.scala:72)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
at org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(AsynchronousListenerBus.scala:71)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1182)
at org.apache.spark.util.AsynchronousListenerBus$$anon$1.run(AsynchronousListenerBus.scala:70)
Caused by: java.io.IOException: Filesystem closed
at org.apache.hadoop.hdfs.DFSClient.checkOpen(DFSClient.java:858)
at org.apache.hadoop.hdfs.DFSOutputStream.flushOrSync(DFSOutputStream.java:2350)
at org.apache.hadoop.hdfs.DFSOutputStream.hflush(DFSOutputStream.java:2296)
at org.apache.hadoop.fs.FSDataOutputStream.hflush(FSDataOutputStream.java:130)
... 20 more
But the application keeps running, until I get the following error, which kills my app:
> ERROR util.Utils: Uncaught exception in thread Thread-3
java.io.IOException: Filesystem closed
at org.apache.hadoop.hdfs.DFSClient.checkOpen(DFSClient.java:858)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2120)
at org.apache.hadoop.hdfs.DistributedFileSystem$20.doCall(DistributedFileSystem.java:1253)
at org.apache.hadoop.hdfs.DistributedFileSystem$20.doCall(DistributedFileSystem.java:1249)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1249)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1417)
at org.apache.spark.scheduler.EventLoggingListener.stop(EventLoggingListener.scala:224)
at org.apache.spark.SparkContext$$anonfun$stop$8$$anonfun$apply$mcV$sp$5.apply(SparkContext.scala:1744)
at org.apache.spark.SparkContext$$anonfun$stop$8$$anonfun$apply$mcV$sp$5.apply(SparkContext.scala:1744)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext$$anonfun$stop$8.apply$mcV$sp(SparkContext.scala:1744)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1220)
at org.apache.spark.SparkContext.stop(SparkContext.scala:1743)
at org.apache.spark.SparkContext$$anonfun$3.apply$mcV$sp(SparkContext.scala:604)
at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:267)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:239)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:239)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:239)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1818)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:239)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:239)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:239)
at scala.util.Try$.apply(Try.scala:161)
at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:239)
at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:218)
at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
The last error is totally random, but it occurs 90% of the time when I'm using my app. Could there be something wrong with the cluster configuration, or is my code wrong?
Thanks
If the FileSystem is closed somewhere, the error will also occur when you use it again.
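
One common way the FileSystem gets closed without an explicit .close() in your own job code: Hadoop's FileSystem.get(conf) returns a cached instance that is shared across the whole JVM, so closing it anywhere (in a utility class, a finally block, or a library) also closes the handle Spark's EventLoggingListener is writing the event log with. A sketch of the risky pattern and a safer alternative, with made-up URI and path values:

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class FsUsage {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Risky: this is the JVM-wide cached instance; closing it closes it
        // for every other user, which is consistent with "Filesystem closed".
        FileSystem shared = FileSystem.get(conf);
        // shared.close();  // <- avoid closing the cached instance

        // Safer if you really need to close: request a private instance.
        try (FileSystem own = FileSystem.newInstance(URI.create("hdfs:///"), conf)) {
            own.exists(new Path("/tmp/example"));  // made-up path
        }
    }
}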

Exception in thread "main" java.lang.ClassFormatError: (unrecognized class file version)

Hi, I am facing the exception below when running in a Unix environment. Can anybody please help?
Exception in thread "main" java.lang.ClassFormatError: com.polaris.treasury.bloomberg.marketdata.test.SpringWebServiceTest (unrecognized class file version)
at java.lang.VMClassLoader.defineClass(libgcj.so.7rh)
at java.lang.ClassLoader.defineClass(libgcj.so.7rh)
at java.security.SecureClassLoader.defineClass(libgcj.so.7rh)
at java.net.URLClassLoader.findClass(libgcj.so.7rh)
at java.lang.ClassLoader.loadClass(libgcj.so.7rh)
at java.lang.ClassLoader.loadClass(libgcj.so.7rh)
at gnu.java.lang.MainThread.run(libgcj.so.7rh)
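
The trace shows the class is being loaded by libgcj (the GNU gij/gcj runtime), which only understands older class file versions, so a class compiled by a newer javac fails with "unrecognized class file version". A quick way to see which version the .class file was actually compiled for is to read its header; a small sketch (the default file name is just an example):

import java.io.DataInputStream;
import java.io.FileInputStream;

public class ClassVersion {
    public static void main(String[] args) throws Exception {
        String file = args.length > 0 ? args[0] : "SpringWebServiceTest.class";
        try (DataInputStream in = new DataInputStream(new FileInputStream(file))) {
            // Class file header: magic (0xCAFEBABE), minor_version, major_version.
            // Major 49 = Java 5, 50 = Java 6, 51 = Java 7, 52 = Java 8.
            int magic = in.readInt();
            int minor = in.readUnsignedShort();
            int major = in.readUnsignedShort();
            System.out.printf("magic=0x%X minor=%d major=%d%n", magic, minor, major);
        }
    }
}

Comparing that major version with what java -version reports on the Unix box usually shows the mismatch; running with a standard Oracle/OpenJDK/IBM JVM instead of gij, or compiling with an older -source/-target, are the usual ways out.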
