I used the package com.github.nscala_time.time.Imports in my code, and I'm running the code with Spark.
Here is my stream.sh file:
#!/bin/bash
JARS_HOME=$HOME/spark-job/lib
JARS=$JARS_HOME/job-server-api_2.10-0.6.0.jar,$JARS_HOME/httpmime-4.4.1.jar,$JARS_HOME/noggit-0.6.jar,$JARS_HOME/nscala-time_2.10-2.0.0.jar
export SPARK_IP=`ifconfig | grep eth0 -1 | grep -i inet | awk '{ print $2 }' | cut -d':' -f2`
APP_JAR=$JARS_HOME/spark-jobs-tests.jar
export SPARK_LOCAL_IP=$SPARK_IP
dse spark-submit --conf "spark.cassandra.input.consistency.level=LOCAL_QUORUM" \
--total-executor-cores 2 \
--jars=$JARS \
--class "my file classpath" $APP_JAR "$1" --files $1
I have added $JARS_HOME/nscala-time_2.10-2.0.0.jar in the .sh file, but I'm still getting the following error:
Exception in thread "main" scala.reflect.internal.MissingRequirementError: object com.github.nscala_time.time.Imports not found.
at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
at scala.reflect.internal.Mirrors$RootsBase.ensureModuleSymbol(Mirrors.scala:126)
at scala.reflect.internal.Mirrors$RootsBase.staticModule(Mirrors.scala:161)
at scala.reflect.internal.Mirrors$RootsBase.staticModule(Mirrors.scala:21)
How can I resolve this?
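One avenue worth trying (a sketch, not a verified fix): reflection-time errors like MissingRequirementError often mean the jar is missing from the driver's own classpath, and --jars alone may not put it there, so passing the jar via --driver-class-path as well can help. Using the same variables as stream.sh:
dse spark-submit --conf "spark.cassandra.input.consistency.level=LOCAL_QUORUM" \
  --total-executor-cores 2 \
  --jars $JARS \
  --driver-class-path $JARS_HOME/nscala-time_2.10-2.0.0.jar \
  --class "my file classpath" $APP_JAR "$1"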
I'm trying to build OpenCV 4.5.5 on a Mac mini M1 as follows:
installed OpenJDK 17
created the folder ~/build_opencv in my user home and cd'd into it
from this folder, called
cmake -DCMAKE_SYSTEM_PROCESSOR=arm64 \
-DCMAKE_OSX_ARCHITECTURES=arm64 \
-DWITH_OPENJPEG=OFF \
-DWITH_IPP=OFF \
-D CMAKE_BUILD_TYPE=RELEASE \
-D CMAKE_INSTALL_PREFIX=/usr/local/opencv \
-D JAVA_INCLUDE_PATH=$JAVA_HOME/include \
-D JAVA_AWT_LIBRARY=$JAVA_HOME/jre/lib/amd64/libawt.so \
-D JAVA_JVM_LIBRARY=$JAVA_HOME/jre/lib/arm/server/libjvm.so \
-D BUILD_opencv_python2=OFF \
-D BUILD_opencv_java=ON \
-D INSTALL_PYTHON_EXAMPLES=OFF \
-D INSTALL_C_EXAMPLES=OFF \
-D OPENCV_ENABLE_NONFREE=OFF \
-D BUILD_EXAMPLES=ON ../opencv-4.5.5
then called
make -j7
I got a lot of warnings like:
/Users/testuser/opencv-4.5.5/3rdparty/libpng/pngpriv.h:606:29: note: expanded from macro 'png_isaligned'
(((type)((const char*)ptr-(const char*)0) & \
^~~~~~~~~~~~~~~
/Users/testuser/opencv-4.5.5/3rdparty/libpng/pngrutil.c:3563:23: warning: performing pointer subtraction with a null pointer has undefined behavior [-Wnull-pointer-subtraction]
png_isaligned(sp, png_uint_32) &&
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/Users/testuser/opencv-4.5.5/3rdparty/libpng/pngpriv.h:606:29: note: expanded from macro 'png_isaligned'
(((type)((const char*)ptr-(const char*)0) & \
^~~~~~~~~~~~~~~
/Users/testuser/opencv-4.5.5/3rdparty/libpng/pngrutil.c:4622:34: warning: performing pointer subtraction with a null pointer has undefined behavior [-Wnull-pointer-subtraction]
int extra = (int)((temp - (png_bytep)0) & 0x0f);
^ ~~~~~~~~~~~~
/Users/user/opencv-4.5.5/3rdparty/libpng/pngrutil.c:4626:30: warning: performing pointer subtraction with a null pointer has undefined behavior [-Wnull-pointer-subtraction]
extra = (int)((temp - (png_bytep)0) & 0x0f);
and the final message:
[ 71%] Building CXX object modules/gapi/CMakeFiles/opencv_gapi.dir/src/backends/python/gpythonbackend.cpp.o
[ 71%] Building CXX object modules/gapi/CMakeFiles/opencv_gapi.dir/src/utils/itt.cpp.o
[ 71%] Linking CXX shared library ../../lib/libopencv_gapi.dylib
[ 71%] Built target opencv_gapi
make: *** [all] Error 2
I also tried OpenJDK 15 and 18, but they led to the same result.
Any ideas to fix this would be appreciated.
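For reference, the JAVA_AWT_LIBRARY and JAVA_JVM_LIBRARY values in the cmake call above point at a Linux JDK layout (.../amd64/libawt.so, .../arm/server/libjvm.so), which does not exist on macOS. A hedged sketch of what those flags might look like with a modern macOS JDK; verify the exact paths under your $JAVA_HOME before using them:
# macOS JDKs ship .dylib files, not .so; check that these files exist first
-D JAVA_INCLUDE_PATH=$JAVA_HOME/include \
-D JAVA_AWT_LIBRARY=$JAVA_HOME/lib/libawt.dylib \
-D JAVA_JVM_LIBRARY=$JAVA_HOME/lib/server/libjvm.dylib \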
Foreword
Hi, I am new to Stack Overflow. If anything is unclear, please point it out. Thank you!
Question
I just started to study Hyperledger Fabric. As a Java programmer, I chose to use the fabric-java-sdk.
After I got the test case End2endIT.java to run, I wanted to change the chaincode. I found example_cc.go at fabric-sdk-java/src/test/fixture/sdkintegration/gocc/sample1/src/github.com/example_cc/example_cc.go. However, after I changed the chaincode, it didn't work. Even after I deleted this code, the test case could still run.
Therefore, I guess I looked in the wrong place. Can anyone tell me where to change the chaincode? Thanks!
Additional
The code that loads the chaincode:
if (isFooChain) {
// on foo chain install from directory.
// For the Go language, serving just a single user, chaincodeSource is most likely the user's GOPATH
installProposalRequest.setChaincodeSourceLocation(new File(TEST_FIXTURES_PATH + "/sdkintegration/gocc/sample1"));
//[output]: src/test/fixture/sdkintegration/gocc/sample1
System.out.println(TEST_FIXTURES_PATH + "/sdkintegration/gocc/sample1");
} else {
// On bar chain install from an input stream.
installProposalRequest.setChaincodeInputStream(Util.generateTarGzInputStream(
(Paths.get(TEST_FIXTURES_PATH, "/sdkintegration/gocc/sample1", "src", CHAIN_CODE_PATH).toFile()),
Paths.get("src", CHAIN_CODE_PATH).toString()));
}
In the end I solved the problem when I noticed fabric.sh in fabric-sdk-java:
./fabric.sh up to force-recreate the Docker containers
./fabric.sh clean to clean the peers
The reason I could run the invoke request without the chaincode is that I hadn't cleaned the peers' volumes.
The source code is as follows:
#!/usr/bin/env bash
#
# Copyright IBM Corp. All Rights Reserved.
#
# SPDX-License-Identifier: Apache-2.0
#
# simple batch script making it easier to cleanup and start a relatively fresh fabric env.
if [ ! -e "docker-compose.yaml" ];then
echo "docker-compose.yaml not found."
exit 8
fi
ORG_HYPERLEDGER_FABRIC_SDKTEST_VERSION=${ORG_HYPERLEDGER_FABRIC_SDKTEST_VERSION:-}
function clean(){
rm -rf /var/hyperledger/*
if [ -e "/tmp/HFCSampletest.properties" ];then
rm -f "/tmp/HFCSampletest.properties"
fi
lines=`docker ps -a | grep 'dev-peer' | wc -l`
if [ "$lines" -gt 0 ]; then
docker ps -a | grep 'dev-peer' | awk '{print $1}' | xargs docker rm -f
fi
lines=`docker images | grep 'dev-peer' | wc -l`
if [ "$lines" -gt 0 ]; then
docker images | grep 'dev-peer' | awk '{print $1}' | xargs docker rmi -f
fi
}
function up(){
if [ "$ORG_HYPERLEDGER_FABRIC_SDKTEST_VERSION" == "1.0.0" ]; then
docker-compose up --force-recreate ca0 ca1 peer1.org1.example.com peer1.org2.example.com ccenv
else
docker-compose up --force-recreate
fi
}
function down(){
docker-compose down;
}
function stop (){
docker-compose stop;
}
function start (){
docker-compose start;
}
for opt in "$@"
do
case "$opt" in
up)
up
;;
down)
down
;;
stop)
stop
;;
start)
start
;;
clean)
clean
;;
restart)
down
clean
up
;;
*)
echo $"Usage: $0 {up|down|start|stop|clean|restart}"
exit 1
esac
done
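A quick usage sketch for getting a fresh environment with this script:
./fabric.sh down     # stop and remove the compose services
./fabric.sh clean    # wipe /var/hyperledger and any dev-peer containers/images
./fabric.sh up       # force-recreate the containers
or, equivalently, ./fabric.sh restart, which runs down, clean, and up in sequence.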
Is it possible to read console input just before the embedded Tomcat in Spring Boot starts? The intended application flow is: request a username and password from the user, which are then used to start the application. It works when using the java -jar command; the problem is that when I close the console (SSH on Linux), the process stops. I searched and found out that the process is tied to the console, so I tried nohup, but then I cannot request console input. Is there any other way?
I think this can help you.
public static void main(String[] args) {
Scanner scanner = new Scanner(System.in);
// prompt for the user's name
System.out.print("Enter your name: ");
// get their input as a String
String username = scanner.next();
System.out.println(username);
SpringApplication.run(Application.class, args);
}
Get the username and password in a shell script before executing your Java program, and pass them as arguments to your application.
#!/bin/bash
# Ask login details
read -p 'user: ' uservar
read -sp 'password: ' passvar
echo
Now that you have the user and password, you can run the java command with nohup and pass them as JVM system properties. You can also pass them as program arguments, as suggested in the other answer.
For example: nohup java -Duser=$uservar -Dpassword=$passvar -jar abc.jar & (note that the -D options must come before -jar, or they are treated as program arguments).
And fetch these properties using:
String user = System.getProperty("user");
String password = System.getProperty("password");
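Putting the two pieces together, a sketch of a complete launcher script (the jar name abc.jar and the log file name are placeholders):
#!/bin/bash
read -p 'user: ' uservar
read -sp 'password: ' passvar
echo
# system properties (-D) precede -jar; run detached and capture output
nohup java -Duser="$uservar" -Dpassword="$passvar" -jar abc.jar > app.log 2>&1 &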
You can use nohup to start the jar with the parameters; the user just won't be prompted for them in the terminal. Instead, the user adds them as arguments when starting the jar. See the details below.
Example:
Main Class
public static void main(String[] args) {
String username = args[0];
String password = args[1];
if(username.equals("admin") && password.equals("password")) {
SpringApplication.run(NohupApplication.class, args);
} else {
System.out.println("You are not authorized to start this application.");
}
}
With Invalid Credentials
Command
nohup java -jar example.jar user password
nohup.out
You are not authorized to start this application.
With Valid Credentials
Command
nohup java -jar example.jar admin password
nohup.out
. ____ _ __ _ _
/\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \
( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
\\/ ___)| |_)| | | | | || (_| | ) ) ) )
' |____| .__|_| |_|_| |_\__, | / / / /
=========|_|==============|___/=/_/_/_/
:: Spring Boot :: (v1.5.2.RELEASE)
If you want the full log, including logger and System.out.println() output, then you have to use
nohup java -jar yourJarFile.jar admin password >> fullLogOutput.log &
Another issue:
the problem is when I close the console(SSH on linux) the process
stops.
So you want to run your jar file as a background process that does not stop when you close the console. Just add the & symbol after the full command:
nohup java -jar yourJarFile.jar &
To capture the full log with console output:
nohup java -jar yourJarFile.jar >> fullLogOutput.log &
The & symbol is used to run the program in the background.
The nohup utility makes the command ignore hangup signals, so it keeps running even after you log out.
Stopping/killing the background process:
To stop the background process, use ps aux to get the process id, then kill it:
ps aux | grep yourJarFile.jar
You will get the process id. To kill it:
sudo kill -9 <pid>
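Alternatively, pkill can match against the full command line in one step (sketch):
pkill -f yourJarFile.jar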
Resource Link: https://stackoverflow.com/a/12108646/2293534
I have a Spark application packaged with Maven. At run time, I have to pass three arguments (the paths of three files used to create RDDs), so I used the spark-submit command as the official Spark website indicates:
./bin/spark-submit \
--class <main-class> \
--master <master-url> \
--deploy-mode <deploy-mode> \
--conf <key>=<value> \
.. # other options
<application-jar> \
[application-arguments]
My submit command looks like:
\bin\spark-submit --class myapp.Main --master local[*] file:///C:\Users\pc\Desktop\eclipse\myapp\target\myapp-0.0.1-SNAPSHOT.jar ["C:\Users\pc\Desktop\pathToFile1.csv", "C:\Users\pc\Desktop\pathToFile2.csv", "C:\Users\pc\Desktop\pathToFile3.csv"]
I modified my Main class as follows to get the paths at runtime:
String pathToFile1=args[0];
String pathToFile2=args[1];
String pathToFile3=args[2];
But I get an error message that says that the specified path does not exist. What am I doing wrong here?
@bradimus, you were right: I don't have to use []. I have to write it as:
\bin\spark-submit --class myapp.Main --master local[*] file:///C:\Users\pc\Desktop\eclipse\myapp\target\myapp-0.0.1-SNAPSHOT.jar C:\Users\pc\Desktop\pathToFile1.csv C:\Users\pc\Desktop\pathToFile2.csv C:\Users\pc\Desktop\pathToFile3.csv
I am trying to find which .jar file reported this error so I can figure out the issue. This is running on a Hyperion server.
[2015-03-15T15:18:35.352+08:00] [Planning0] [WARNING] [] [oracle.EPMHSP.calcmgr_execution] [tid: 144] [userId: <anonymous>] [ecid: 00iRyJJB65hDOd5LzQL6iW000ly40016YL,0:1] [APP: PLANNING#11.1.2.0] [URI: /HyperionPlanning/faces/RunTimePromptTF/BgImage] [SRC_CLASS: com.hyperion.planning.adf.artifact.datacontrol.HspManageArtifactsDC] [SRC_METHOD: executeCalcScript] Error detected while attempting to run job Test_Rule [[
com.hyperion.planning.HspRuntimeException: Error detected while attempting to run job: Test_Rule.
at com.hyperion.planning.HspAsyncJobsManager.completeJobExceution(HspAsyncJobsManager.java:101)
at com.hyperion.planning.db.HspFMDBImpl$CalcMgrWrapper.runRule(HspFMDBImpl.java:10411)
at com.hyperion.planning.db.HspFMDBImpl.runHBRRule(HspFMDBImpl.java:2254)
at com.hyperion.planning.db.HspFMDBImpl.runCalcScript(HspFMDBImpl.java:2218)
at com.hyperion.planning.HyperionPlanningBean.runCalcScript(HyperionPlanningBean.java:4028)
at com.hyperion.planning.adf.artifact.datacontrol.HspManageArtifactsDC.executeCalcScript(HspManageArtifactsDC.java:3518)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at oracle.adf.model.binding.DCInvokeMethod.invokeMethod(DCInvokeMethod.java:677)
at oracle.adf.model.bean.DCBeanDataControl.invokeMethod(DCBeanDataControl.java:445)
If you are running a Linux/Unix flavor, I usually find jars via something like the following bash script:
for i in $( find LIB_FOLDERS -iname '*.jar' ); do
  ( zipinfo "$i" | grep -i PATTERN ) && echo "$i"
done
Where LIB_FOLDERS is the place where your jars live, and PATTERN is a characteristic part of the name of the class you are looking for. This will print the names of all jar files that match the pattern. Most IDEs let you search for a class in the classpath without all that command-line hassle, but I don't know whether you have everything loaded up in one.
Use JarScan. It's one of my favorite tools for searching for a class buried in some jar in some directory. It works on any platform and is simple and easy to use: https://java.net/projects/jarscan/pages/Tutorial/text
On Linux systems I create ~/bin/findjar with the following, then chmod 700 it and add ~/bin to my PATH:
#!/bin/bash
# Usage: findjar <classname or string to search for> [path to search under]
#
class=$1
path=$2
if [[ "$path" = "" ]]; then
path=.
fi
echo searching for $class in $path
for f in `find $path -name "*.jar"`; do
match=$(jar tf $f | grep $class);
if [[ -n "$match" ]]; then
echo
echo $f;
echo "$match"
fi;
done
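A usage example (the class name and search path here are illustrative):
findjar MissingRequirementError ~/lib
This prints each jar under ~/lib whose table of contents matches, followed by the matching entries.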