I have a Spring Boot application.
I run it with "gradle bootRun" and it works correctly for all developers.
To build the jar or war I previously used Maven and currently use Gradle.
With both build tools, running the application from the generated jar or war does not work properly in one of hundreds of use cases.
The error is:
Error: Error while starting plugin CheckGebaeudeVersicherungsSystemIoxPlugin
Error: ch/interlis/iox_j/validator/InterlisFunction
Error: ch.interlis.iox_j.validator.InterlisFunction
Error: Error while starting plugin org.interlis2.validator.plugins.IntersectsIoxPlugin
Error: ch/interlis/iox_j/validator/InterlisFunction
Error: ch.interlis.iox_j.validator.InterlisFunction
Error: Error while starting plugin org.interlis2.validator.plugins.no_overlaps_2_1_6_IoxPlugin
Error: ch/interlis/iox_j/validator/InterlisFunction
Error: ch.interlis.iox_j.validator.InterlisFunction
Error: Error while starting plugin org.interlis2.validator.plugins.no_overlaps_2_2_0_IoxPlugin
Error: ch/interlis/iox_j/validator/InterlisFunction
Error: ch.interlis.iox_j.validator.InterlisFunction
This looks like a missing-dependency error.
It is not caused by the code itself: in Eclipse, both running and debugging work properly!
What is the difference between running with "gradle bootRun"/"mvn spring-boot:run" and running the packaged artifact with the java command (the traditional "java -jar" on the jar/war)?
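One quick way to narrow this down is to check whether the class from the error is actually loadable from the packaged artifact. A minimal diagnostic sketch (the class name below is taken from the error messages above; the helper is mine, not part of any library):

```java
public class ClassCheck {
    // Returns true if the named class can be loaded on the current classpath.
    static boolean isLoadable(String name) {
        try {
            Class.forName(name);
            return true;
        } catch (ClassNotFoundException | NoClassDefFoundError e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String name = args.length > 0 ? args[0]
                : "ch.interlis.iox_j.validator.InterlisFunction";
        System.out.println((isLoadable(name) ? "FOUND " : "MISSING ") + name);
    }
}
```

Running this temporarily inside the application (or with the artifact on the classpath) shows whether the validator interface was packaged at all. The usual source of such differences: bootRun/spring-boot:run use the full build-tool runtime classpath, while "java -jar" sees only what was packaged into the archive, so a dependency that is present at develop time can be missing from the artifact.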
Related
I'm trying to build and run this project:
https://github.com/NN-Minhas/inception
but when I run it I get this error:
java: package org.dkpro.core.api.xml.type does not exist
I tried to run mvn clean install, but then I got this error:
Failed to execute goal com.github.eirslett:frontend-maven-plugin:1.12.1:npm (npm build) on project inception-project-export: Failed to run task: 'npm run build' failed. org.apache.commons.exec.ExecuteException: Process exited with an error
Any suggestions on how this could be solved?
When using JMH with Maven to benchmark a project that was previously building and benchmarking as intended, I am experiencing the following error:
java -jar target/benchmarks.jar
Exception in thread "main" java.lang.RuntimeException: ERROR: Unable to find the resource: /META-INF/BenchmarkList
at org.openjdk.jmh.runner.AbstractResourceReader.getReaders(AbstractResourceReader.java:98)
at org.openjdk.jmh.runner.BenchmarkList.find(BenchmarkList.java:122)
at org.openjdk.jmh.runner.Runner.internalRun(Runner.java:260)
at org.openjdk.jmh.runner.Runner.run(Runner.java:209)
at org.openjdk.jmh.Main.main(Main.java:71)
I have seen other similar posts that were solved by running mvn clean install, but I have already done this repeatedly.
Thanks.
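For anyone hitting the same thing: /META-INF/BenchmarkList is generated at compile time by JMH's annotation processor, so this error usually means the processor did not run (for example because annotation processing is disabled, or the processor dependency was dropped from the pom). A typical wiring looks like this sketch (the version number is an example, not taken from the project above):

```xml
<dependencies>
    <dependency>
        <groupId>org.openjdk.jmh</groupId>
        <artifactId>jmh-core</artifactId>
        <version>1.37</version>
    </dependency>
    <!-- Annotation processor that generates /META-INF/BenchmarkList at compile time -->
    <dependency>
        <groupId>org.openjdk.jmh</groupId>
        <artifactId>jmh-generator-annprocess</artifactId>
        <version>1.37</version>
        <scope>provided</scope>
    </dependency>
</dependencies>
```

If the dependency is already present, it is worth checking that the compiler plugin is not configured with annotation processing turned off (e.g. -proc:none).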
I'm trying to run a Spark Maven Scala project.
The mvn install did not succeed (java.lang.UnsatisfiedLinkError):
*** RUN ABORTED ***
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:609)
at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:996)
at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:490)
at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverStorageDirs(FSImage.java:321)
at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:215)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:976)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:681)
at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:584)
at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:643)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:810)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:794)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1487)
I don't understand the cause of the error, since the Hadoop path is set and contains winutils.exe, and I have already run another Java/Spark project successfully.
Update: I just added hadoop.dll to the Hadoop path and the error disappeared.
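For reference, an UnsatisfiedLinkError on NativeIO$Windows.access0 means the JVM could not find the native implementation, which on Windows lives in hadoop.dll. A small sketch for checking whether the DLL is visible on java.library.path (the helper name is mine, not part of Hadoop):

```java
import java.io.File;

public class NativeLibCheck {
    // Returns true if libFile exists in any directory listed in pathValue.
    static boolean findInPath(String libFile, String pathValue) {
        for (String dir : pathValue.split(File.pathSeparator)) {
            if (!dir.isEmpty() && new File(dir, libFile).isFile()) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        String path = System.getProperty("java.library.path", "");
        System.out.println("java.library.path = " + path);
        System.out.println("hadoop.dll visible: " + findInPath("hadoop.dll", path));
    }
}
```

This explains why winutils.exe alone was not enough: winutils.exe is invoked as an external process, while hadoop.dll must be loadable by the JVM itself.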
Has anyone tried running the Cucumber-JVM CLI without using Maven?
If I try to execute the command below, I get this error:
java cucumber.api.cli.Main
Error: Could not find or load main class cucumber.api.cli.Main
I even tried this:
java -cp "/Users/jreijn/.m2/repository/info/cukes/cucumber-core/1.2.5/cucumber-core-1.2.5.jar:/Users/jreijn/.m2/repository/info/cukes/gherkin/2.12.2/gherkin-2.12.2.jar:/Users/jreijn/.m2/repository/info/cukes/cucumber-java/1.2.5/cucumber-java-1.2.5.jar:/Users/jreijn/.m2/repository/info/cukes/cucumber-jvm-deps/1.0.5/cucumber-jvm-deps-1.0.5.jar" cucumber.api.cli.Main
I am getting the same error.
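One way to rule out a broken download is to verify that the jar on the classpath actually contains the class being invoked. A small sketch (the class name and jar path are examples, taken from the command above):

```java
import java.util.jar.JarFile;

public class JarEntryCheck {
    // Returns true if the jar at jarPath contains the given entry.
    static boolean hasEntry(String jarPath, String entry) throws Exception {
        try (JarFile jar = new JarFile(jarPath)) {
            return jar.getJarEntry(entry) != null;
        }
    }

    public static void main(String[] args) throws Exception {
        // e.g. java JarEntryCheck cucumber-core-1.2.5.jar cucumber/api/cli/Main.class
        System.out.println(hasEntry(args[0], args[1]));
    }
}
```

In the 1.2.x line, cucumber.api.cli.Main should live in cucumber-core. Also note that ":" is the classpath separator on macOS/Linux only; on Windows it must be ";".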
I'm trying to run an application in Grails, but I have a problem with the execution. I'm using Grails version 2.5.4 and Java version 1.7.
I checked my environment variables and they turned out to be OK.
This is the error:
Error occurred during initialization of VM agent library failed to init: instrument.
Error opening zip file or JAR manifest missing : C:\Program%20Files\Grails\grails-2.5.4\lib\org.springframework\springloaded\jars\springloaded-1.2.4.RELEASE.jar
Error | Forked Grails VM exited with error
What's going on?
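One detail stands out in the error above: the path contains %20 where a literal space should be, i.e. a URL-encoded path was handed to the JVM agent option as-is, so the file genuinely cannot be opened. The encoding involved can be seen with a quick sketch (the path is copied from the error message):

```java
import java.net.URLDecoder;

public class PathDecode {
    public static void main(String[] args) throws Exception {
        String raw = "C:\\Program%20Files\\Grails\\grails-2.5.4\\lib"
                + "\\org.springframework\\springloaded\\jars"
                + "\\springloaded-1.2.4.RELEASE.jar";
        // %20 is the URL escape for a space; the file system expects a literal space.
        System.out.println(URLDecoder.decode(raw, "UTF-8"));
    }
}
```

This pattern usually points at the install location: a common workaround in such reports is installing Grails in a path without spaces (e.g. not under "C:\Program Files").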