I'm working on an open-source project called "CloudNet-v3". On my local machine, /data is a symlink that points to the data directory in my IntelliJ projects folder.
I have the following startup command:
[java, -XX:+UseG1GC, -XX:MaxGCPauseMillis=50, -XX:-UseAdaptiveSizePolicy, -XX:CompileThreshold=100, -XX:+UnlockExperimentalVMOptions, -XX:+UseCompressedOops, -Dcom.mojang.eula.agree=true, -Djline.terminal=jline.UnsupportedTerminal, -Dfile.encoding=UTF-8, -Dio.netty.noPreferDirect=true, -Dclient.encoding.override=UTF-8, -Dio.netty.maxDirectMemory=0, -Dio.netty.leakDetectionLevel=DISABLED, -Dio.netty.recycler.maxCapacity=0, -Dio.netty.recycler.maxCapacity.default=0, -DIReallyKnowWhatIAmDoingISwear=true, -Dcloudnet.wrapper.receivedMessages.language=english, -Xmx372M, -javaagent: "/data/temp/caches/wrapper.jar", -cp, "/data/launcher/libs/io/kubernetes/client-java/4.0.0/client-java-4.0.0.jar:/data/launcher/libs/io/netty/netty-codec-http/4.1.36.Final/netty-codec-http-4.1.36.Final.jar:/data/launcher/libs/io/netty/netty-handler/4.1.36.Final/netty-handler-4.1.36.Final.jar:/data/launcher/libs/io/netty/netty-transport-native-epoll/4.1.36.Final/netty-transport-native-epoll-4.1.36.Final-linux-x86_64.jar:/data/launcher/libs/io/netty/netty-transport-native-kqueue/4.1.36.Final/netty-transport-native-kqueue-4.1.36.Final-osx-x86_64.jar:/data/launcher/libs/io/kubernetes/client-java-api/4.0.0/client-java-api-4.0.0.jar:/data/launcher/libs/io/kubernetes/client-java-proto/4.0.0/client-java-proto-4.0.0.jar:/data/launcher/libs/org/yaml/snakeyaml/1.19/snakeyaml-1.19.jar:/data/launcher/libs/commons-codec/commons-codec/1.11/commons-codec-1.11.jar:/data/launcher/libs/org/apache/commons/commons-compress/1.18/commons-compress-1.18.jar:/data/launcher/libs/org/apache/commons/commons-lang3/3.7/commons-lang3-3.7.jar:/data/launcher/libs/com/squareup/okhttp/okhttp-ws/2.7.5/okhttp-ws-2.7.5.jar:/data/launcher/libs/com/google/guava/guava/25.1-jre/guava-25.1-jre.jar:/data/launcher/libs/org/slf4j/slf4j-api/1.7.25/slf4j-api-1.7.25.jar:/data/launcher/libs/org/bouncycastle/bcprov-ext-jdk15on/1.59/bcprov-ext-jdk15on-1.59.jar:/data/launcher/libs/org/bouncycastle/bcpkix-jdk15on/1.59/bcpkix-jdk15on-1.59.jar:/data/launcher/libs/com/google/protobuf/protobuf-java/3.4.0/protobuf-java-3.4.0.jar:/data/launcher/libs/com/google/code/gson/gson/2.8.2/gson-2.8.2.jar:/data/launcher/libs/io/netty/netty-codec/4.1.36.Final/netty-codec-4.1.36.Final.jar:/data/launcher/libs/io/netty/netty-transport-native-unix-common/4.1.36.Final/netty-transport-native-unix-common-4.1.36.Final.jar:/data/launcher/libs/io/netty/netty-transport/4.1.36.Final/netty-transport-4.1.36.Final.jar:/data/launcher/libs/io/netty/netty-buffer/4.1.36.Final/netty-buffer-4.1.36.Final.jar:/data/launcher/libs/io/netty/netty-resolver/4.1.36.Final/netty-resolver-4.1.36.Final.jar:/data/launcher/libs/io/netty/netty-common/4.1.36.Final/netty-common-4.1.36.Final.jar:/data/launcher/libs/io/sundr/builder-annotations/0.9.2/builder-annotations-0.9.2.jar:/data/launcher/libs/io/swagger/swagger-annotations/1.5.12/swagger-annotations-1.5.12.jar:/data/launcher/libs/com/squareup/okhttp/logging-interceptor/2.7.5/logging-interceptor-2.7.5.jar:/data/launcher/libs/com/squareup/okhttp/okhttp/2.7.5/okhttp-2.7.5.jar:/data/launcher/libs/joda-time/joda-time/2.9.3/joda-time-2.9.3.jar:/data/launcher/libs/org/joda/joda-convert/1.2/joda-convert-1.2.jar:/data/launcher/libs/com/google/code/findbugs/jsr305/3.0.2/jsr305-3.0.2.jar:/data/launcher/libs/org/checkerframework/checker-qual/2.0.0/checker-qual-2.0.0.jar:/data/launcher/libs/com/google/errorprone/error_prone_annotations/2.1.3/error_prone_annotations-2.1.3.jar:/data/launcher/libs/com/google/j2objc/j2objc-annotations/1.1/j2objc-annotations-1.1.jar:/data/launcher/libs/org
/codehaus/mojo/animal-sniffer-annotations/1.14/animal-sniffer-annotations-1.14.jar:/data/launcher/libs/org/bouncycastle/bcprov-jdk15on/1.59/bcprov-jdk15on-1.59.jar:/data/launcher/libs/io/sundr/sundr-core/0.9.2/sundr-core-0.9.2.jar:/data/launcher/libs/io/sundr/sundr-codegen/0.9.2/sundr-codegen-0.9.2.jar:/data/launcher/libs/io/sundr/resourcecify-annotations/0.9.2/resourcecify-annotations-0.9.2.jar:/data/launcher/libs/com/squareup/okio/okio/1.6.0/okio-1.6.0.jar:/data/launcher/versions/3.0.0-RELEASE-e48128a/driver.jar:/data/temp/caches/wrapper.jar", de.dytanic.cloudnet.wrapper.Main, nogui]
And my current workdir is: /data/temp/services/Lobby-1#4a517311-09e6-4f77-89a5-64b4bc15399a
So whenever I am in that working directory and execute the given command, it fails with the following error:
Error opening zip file or JAR manifest missing :
Error occurred during initialization of VM
agent library failed to init: instrument
Now I am wondering, because it works in the automatic environment even though there are no changes to the master-branch source, e.g. no changed path to /data/launcher instead of launcher (System.getProperty("cloudnet.launcher.dir", "/data/launcher"), see https://github.com/CloudNetService/CloudNet-v3/blob/master/cloudnet-launcher/src/main/java/de/dytanic/cloudnet/launcher/Constants.java).
A short lookup: ls -laR /Users/.../Documents/IdeaProjects/cloudnet-parent/data
The -javaagent option is misused; the correct syntax is:
-javaagent:/data/temp/caches/wrapper.jar
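For illustration, a minimal before/after sketch (all other flags elided, paths taken from the command above); the point is that -javaagent: and the jar path must form a single argument, with no space or extra quoting in between:
# fails: the space after -javaagent: leaves the option with no jar path,
# which produces "Error opening zip file or JAR manifest missing :"
java ... -javaagent: "/data/temp/caches/wrapper.jar" ... de.dytanic.cloudnet.wrapper.Main nogui
# works: option and path are one token
java ... -javaagent:/data/temp/caches/wrapper.jar ... de.dytanic.cloudnet.wrapper.Main nogui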
Related
I have developed a REST API using Java. It works perfectly on my local computer and is compiled with Java 11.0.2. It also uses Hibernate.
When I try to upload it to AWS Elastic Beanstalk, it always fails with a severe error. I downloaded the log files, and eb-engine.log says the following:
2021/08/06 07:36:01.851410 [ERROR] update processes [web healthd cfn-hup nginx] pid symlinks failed with error Read pid source file /var/pids/web.pid failed with error:open /var/pids/web.pid: no such file or directory
2021/08/06 07:36:01.851423 [ERROR] An error occurred during execution of command [app-deploy] - [Track pids in healthd]. Stop running the command. Error: update processes [web healthd cfn-hup nginx] pid symlinks failed with error Read pid source file /var/pids/web.pid failed with error:open /var/pids/web.pid: no such file or directory
I am not using any specific configuration in my Java app. The WAR file is a clean build directly from the NetBeans IDE.
How can I fix this issue?
I am using the NCache Node.js client package and writing tests that call the NCache APIs. The package has a dependency on Java. After I installed it, I used the package to write a simple test. The test execution failed with a "module not found" error:
Error message:
The specified module could not be found.
\\D:\GitHome\Test\node_modules\java\build\Release\nodejavabridge_bindings.node
at Runtime._loadModule (node_modules/jest-runtime/build/index.js:893:29)
at Object.<anonymous> (node_modules/java/lib/nodeJavaBridge.js:21:16)
The file in the above-mentioned directory does exist.
Note: when I used the package in a plain index.js file it worked fine as expected, but I am unable to run it with Jest.
No major Jest configuration has been done in the package.json file.
Java JDK 11.0.6 is installed on the box and JAVA_HOME is set.
Package.json
test.specs.js
I am unable to identify the exact reason for the failure and am therefore stuck on how to resolve it.
Environment:
OS: Windows
node: 14.15.3
jest: 26.6.3
C:\Program Files\OpenJDK\jdk-16.0.1\bin\server (the directory that contains jvm.dll) needs to be on your PATH. On Windows you can open Edit the system environment variables > Environment Variables... > System variables and add C:\Program Files\OpenJDK\jdk-16.0.1\bin\server to the Path variable.
The other option is to add it to your test runner. For example, in WebStorm edit the Run Configuration and add the environment variable PATH=C:\Program Files\OpenJDK\jdk-16.0.1\bin\server.
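Either way, for a quick check you can append that directory for a single shell session before running the tests (a sketch; it assumes the JDK path above and that Jest is started via npx):
REM cmd.exe: append the directory containing jvm.dll for this session only
set PATH=%PATH%;C:\Program Files\OpenJDK\jdk-16.0.1\bin\server
npx jest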
I downloaded the source code of OpenJDK 14 and put it in /home/yuanfang/jdk14. After running bash configure --disable-warnings-as-errors and make images, I built OpenJDK 14 successfully. The newly built JDK is in /home/yuanfang/jdk14/build/linux-x86_64-server-release/jdk. By the way, I am using Ubuntu 20.04 LTS.
Then I wanted to debug OpenJDK 14 using the CLion IDE. I am on CLion 2020.2 and took the following steps:
After opening CLion I chose Create New CMake Project from Sources and selected the directory /home/yuanfang/jdk14, which is the root directory of the JDK project.
I altered the Run/Debug Configurations to look like this:
CLion creates a CMakeLists.txt automatically, but that file doesn't work, so after googling I found the correct CMakeLists.txt at https://github.com/ojdkbuild/ojdkbuild/blob/master/src/java-14-openjdk/CMakeLists.txt and replaced the old CMakeLists.txt with it.
I downloaded the whole repository (that is, github.com/ojdkbuild/ojdkbuild), unzipped it and put it into /home/yuanfang/jdk14.
It looks as follows; ojdkbuild-master is the newly added folder.
I reloaded the CMake project, but some CMake errors occurred (see below). Why can't CLion find those files? I googled but couldn't find any effective solution. Is there anything I can do or refer to? Thanks in advance.
CMake Error at CMakeLists.txt:19 (include):
include could not find load file:
/home/yuanfang/jdk14/../../resources/cmake/ojdkbuild_common.cmake
CMake Error at CMakeLists.txt:21 (include):
include could not find load file:
/home/yuanfang/jdk14/../../resources/cmake/version.cmake
CMake Error at CMakeLists.txt:93 (add_subdirectory):
add_subdirectory given source
"/home/yuanfang/jdk14/../../deps/rhino/scripting_tasks" which is not an
existing directory.
CMake Error at CMakeLists.txt:98 (ojdkbuild_add_subdirectory):
Unknown CMake command "ojdkbuild_add_subdirectory".
-- Configuring incomplete, errors occurred!
See also "/home/yuanfang/jdk14/cmake-build-debug/CMakeFiles/CMakeOutput.log".
Cannot get compiler information:
Compiler exited with error code 1: /usr/bin/cc -xobjective-c -I/home/yuanfang/jdk14/build/linux-x86_64-server-release/hotspot/variant-server/gensrc/adfiles......./loading/LibraryLoader/jar_src -g -fpch-preprocess -v -dD -E
Using built-in specs.
COLLECT_GCC=/usr/bin/cc
OFFLOAD_TARGET_NAMES=nvptx-none:hsa
OFFLOAD_TARGET_DEFAULT=1
Target: x86_64-linux-gnu
Configured with: ../src/configure -v --with-pkgversion='Ubuntu 9.3.0-10ubuntu2' --with-bugurl=file:///usr/share/doc/gcc-9/README.Bugs --enable-languages=c,ada,c++,go,brig,d,fortran,objc,obj-c++,gm2 --prefix=/usr --with-gcc-major-version-only --program-suffix=-9 --program-prefix=x86_64-linux-gnu- --enable-shared --enable-linker-build-id --libexecdir=/usr/lib --without-included-gettext --enable-threads=posix --libdir=/usr/lib --enable-nls --enable-clocale=gnu --enable-libstdcxx-debug --enable-libstdcxx-time=yes --with-default-libstdcxx-abi=new --enable-gnu-unique-object --disable-vtable-verify --enable-plugin --enable-default-pie --with-system-zlib --with-target-system-zlib=auto --enable-objc-gc=auto --enable-multiarch --disable-werror --with-arch-32=i686 --with-abi=m64 --with-multilib-list=m32,m64,mx32 --enable-multilib --with-tune=generic --enable-offload-targets=nvptx-none,hsa --without-cuda-driver --enable-checking=release --build=x86_64-linux-gnu --host=x86_64-linux-gnu --target=x86_64-linux-gnu
Thread model: posix
gcc version 9.3.0 (Ubuntu 9.3.0-10ubuntu2)
COLLECT_GCC_OPTIONS='-I' '/home/yuanfang/jdk14/build/linux-x86_64-server-release/hotspot/variant-server/gensrc/adfiles' '-I' '......'-g' '-fpch-preprocess' '-v' '-dD' '-E' '-D' '___CIDR_DEFINITIONS_END' '-mtune=generic' '-march=x86-64'
cc1obj -E -quiet -v #/tmp/cci3XM6r -imultiarch x86_64-linux-gnu -D ___CIDR_DEFINITIONS_END /tmp/compiler-file5929385022787926768 -mtune=generic -march=x86-64 -fpch-preprocess -g -fworking-directory -fasynchronous-unwind-tables -fstack-protector-strong -Wformat -Wformat-security -fstack-clash-protection -fcf-protection -dD
cc: fatal error: cannot execute 'cc1obj': execvp: No such file or directory
compilation terminated.
[Failed to reload]
Step 1: check out this project: https://github.com/ojdkbuild/ojdkbuild.git
Step 2: put the source code of OpenJDK 14 in ojdkbuild/src/
Step 3: copy the resources directory and CMakeLists.txt to the OpenJDK 14 source directory
Step 4: recompile OpenJDK and import the project into CLion
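A rough shell sketch of these steps, assuming the paths used earlier in the question (the target directory name jdk14 under src/ is my own choice):
# check out the ojdkbuild project
git clone https://github.com/ojdkbuild/ojdkbuild.git
# put the OpenJDK 14 source tree under ojdkbuild/src/ so that the relative
# ../../resources/... includes in the CMakeLists.txt resolve against the checkout
cp -r /home/yuanfang/jdk14 ojdkbuild/src/jdk14
# reuse the CMakeLists.txt from java-14-openjdk for that tree
cp ojdkbuild/src/java-14-openjdk/CMakeLists.txt ojdkbuild/src/jdk14/
# recompile OpenJDK, then open ojdkbuild/src/jdk14 in CLion and reload the CMake project
cd ojdkbuild/src/jdk14 && bash configure --disable-warnings-as-errors && make images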
I have successfully compiled the JNI based Apache libhdfs (C++) on my Hadoop Sandbox / CentOS - no compilation errors or warnings:
g++ test.cpp -o test -I/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.151.x86_64/include/ \
  -I/usr/hdp/2.6.3.0-235/usr/include/ -I/usr/hdp/2.6.3.0-235/hadoop/bin \
  -I/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el6_9.x86_64/include/ \
  -I/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el6_9.x86_64/jre/lib/amd64/ \
  -L/usr/hdp/2.6.3.0-235/hadoop/lib/ -L/usr/hdp/2.6.3.0-235/hadoop/lib/native \
  -L/usr/hdp/2.6.3.0-235/hadoop/lib/ -L/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el6_9.x86_64/jre/lib/amd64/ \
  -L/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el6_9.x86_64/jre/lib/amd64/server/ \
  -lhdfs -pthread -ljvm
Once I try to run the code, I get the following errors:
[root@sandbox-hdp ~]# ./test
Environment variable CLASSPATH not set!
getJNIEnv: getGlobalJNIEnv failed
Environment variable CLASSPATH not set!
getJNIEnv: getGlobalJNIEnv failed
If I run hadoop classpath in the terminal, I get the following output:
[root@sandbox-hdp ~]# hadoop classpath
/usr/hdp/2.6.3.0-235/hadoop/conf:/usr/hdp/2.6.3.0-235/hadoop/lib/:/usr/hdp/2.6.3.0-235/hadoop/.//:/usr/hdp/2.6.3.0-235/hadoop-hdfs/./:/usr/hdp/2.6.3.0-235/hadoop-hdfs/lib/:/usr/hdp/2.6.3.0-235/hadoop-hdfs/.//:/usr/hdp/2.6.3.0-235/hadoop-yarn/lib/:/usr/hdp/2.6.3.0-235/hadoop-yarn/.//:/usr/hdp/2.6.3.0-235/hadoop-mapreduce/lib/:/usr/hdp/2.6.3.0-235/hadoop-mapreduce/.//::jdbc-mysql.jar:mysql-connector-java-5.1.17.jar:mysql-connector-java-5.1.37.jar:mysql-connector-java.jar:/usr/hdp/2.6.3.0-235/tez/:/usr/hdp/2.6.3.0-235/tez/lib/:/usr/hdp/2.6.3.0-235/tez/conf
On the Apache libhdfs page it says:
The most common problem is the CLASSPATH is not set properly when calling a program that uses libhdfs. Make sure you set it to all the Hadoop jars needed to run Hadoop itself as well as the right configuration directory containing hdfs-site.xml. It is not valid to use wildcard syntax for specifying multiple jars. It may be useful to run hadoop classpath --glob or hadoop classpath --jar to generate the correct classpath for your deployment. See Hadoop Commands Reference for more information on this command.
However, after many trial-and-error attempts I still do not get how to proceed, so I would appreciate any help that could solve this problem.
Edit: I tried the following: CLASSPATH=`hadoop classpath` ./test
...which gave me the following error: libjvm.so: cannot open shared object file: No such file or directory
I tried the following: export LD_LIBRARY_PATH=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el6_9.x86_64/jre/lib/amd64/server
...and now the error is:
[root@sandbox-hdp ~]# CLASSPATH=$CLASSPATH:`hadoop classpath` ./test
loadFileSystems error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
hdfsBuilderConnect(forceNewInstance=0, nn=default, port=0, kerbTicketCachePath=(NULL), userName=(NULL)) error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
hdfsOpenFile(/tmp/testfile.txt): constructNewObjectOfPath error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
Maybe the following could work for you:
CLASSPATH=$CLASSPATH:`hadoop classpath` ./test
or only this:
CLASSPATH=`hadoop classpath` ./test
Also check the JAVA_HOME environment variable; it could change which Java libraries are used.
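For example (a sketch; the JDK path is taken from the compile command above and may differ on your box):
# point JAVA_HOME at the same JDK whose libraries you link against
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el6_9.x86_64
echo "$JAVA_HOME"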
And finally, a wrapper like the script below could be useful:
#!/bin/bash
export CLASSPATH="AllTheJARs"   # e.g. the output of: hadoop classpath --glob
ARG0="$0"
EXEC_PATH="$( dirname "$ARG0" )"
"${EXEC_PATH}/test" "$@"   # "$@" forwards all arguments; "$#" would only be the argument count
I am new to Broadleaf Commerce and installed it a couple of days back. The Maven build is successful; however, I get the error below when starting the startadmin.bat file. I have tried increasing the heap memory, but it didn't help. Has any of you faced a similar error?
Error opening zip file or JAR manifest missing : target/agents/spring-instrument.jar
Error occurred during initialization of VM
agent library failed to init: instrument
Error occurred during initialization of VM
Too small initial heap
Error occurred during initialization of VM
Too small initial heap
ERROR: Cannot load this JVM TI agent twice, check your java command line for duplicate jdwp options.
Error occurred during initialization of VM
agent library failed to init: jdwp
The error seems to indicate that the spring-instrument.jar file was not downloaded. That means that your local Maven build failed on the "copy-agent" step that is responsible for installing that jar in the correct directory. Try doing a mvn install with Maven 3.3.9, and turn command echoing on (the @echo setting) when running the .bat script to make sure that jar is installed.
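For example (a sketch; the jar path is taken from the error above and assumes you run this from the directory the .bat script is started in):
mvn install
dir target\agents\spring-instrument.jar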
Also change the %TOMCAT_MEMORY% line to be this:
IF "%TOMCAT_MEMORY%"=="" SET TOMCAT_MEMORY=-Xmx1536M -XX:MaxPermSize=512M
This will change the heap size from 1536 bytes (an -Xmx value without a unit suffix is interpreted as bytes, which is what triggers the "Too small initial heap" error) to a more suitable 1536 MB.