Groovy script to run a Java main class

I have the following main class that I am trying to run with Gradle. The sample Java main class and the Gradle script are shown below.
import java.io.IOException;
import java.nio.file.FileSystems;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.WatchEvent;
import java.nio.file.WatchKey;
import java.nio.file.WatchService;

import static java.nio.file.StandardWatchEventKinds.ENTRY_CREATE;
import static java.nio.file.StandardWatchEventKinds.OVERFLOW;

public class Hello {
    public static void main(String[] args) throws IOException {
        System.out.println("This is file system watch service");
        Path dir = Paths.get("c:\\sid\\");
        WatchService service = FileSystems.getDefault().newWatchService();
        WatchKey key = dir.register(service, ENTRY_CREATE);
        System.out.println("Watching directory: " + dir.toString());
        // file creation logic
        while (true) {
            for (WatchEvent<?> event : key.pollEvents()) {
                WatchEvent.Kind<?> kind = event.kind();
                if (kind == OVERFLOW) {
                    continue;
                }
                WatchEvent<Path> ev = (WatchEvent<Path>) event;
                Path filename = ev.context();
            }
        }
    }
}
My Gradle script is as follows:
apply plugin: 'java'

task hello1 << {
    def process = ['java', '-cp', 'sourceSets.main.runtimeClasspath', 'com.test.gradle.Hello'].execute()
    process.in.close()
    process.out.close()
    process.err.close()
}

task hello << {
    ant.java(classpath: 'sourceSets.main.runtimeClasspath', classname: 'com.test.gradle.Hello', fork: 'true')
    ant.echo('Done')
}
When I run gradle hello1 from the command prompt it reports BUILD SUCCESSFUL, but my main program never seems to execute. To verify that the main program runs, I added sample System.out statements and file-creation logic with PrintWriter.
I have also tried the Ant approach with gradle hello; that does not work either.
Now, when I use the Gradle task below (the previous one with fork: 'true' removed),
task hello << {
    ant.java(classpath: 'sourceSets.main.runtimeClasspath.asPath', classname: 'com.test.gradle.Hello.class')
    ant.echo('Done')
}
it generates the following exception:
[ant:java] Could not find com.test.gradle.Hello.class. Make sure you have it in your classpath
at org.apache.tools.ant.taskdefs.ExecuteJava.execute(ExecuteJava.java:138)
at org.apache.tools.ant.taskdefs.Java.run(Java.java:771)
at org.apache.tools.ant.taskdefs.Java.executeJava(Java.java:221)
at org.apache.tools.ant.taskdefs.Java.executeJava(Java.java:135)
at org.apache.tools.ant.taskdefs.Java.execute(Java.java:108)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:292)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
at groovy.util.AntBuilder.performTask(AntBuilder.java:319)
at groovy.util.AntBuilder.nodeCompleted(AntBuilder.java:264)
at org.gradle.api.internal.project.ant.BasicAntBuilder.nodeCompleted(BasicAntBuilder.java:72)
at groovy.util.BuilderSupport.doInvokeMethod(BuilderSupport.java:147)
at groovy.util.AntBuilder.doInvokeMethod(AntBuilder.java:203)
at org.gradle.api.internal.project.ant.BasicAntBuilder.doInvokeMethod(BasicAntBuilder.java:87)
at groovy.util.BuilderSupport.invokeMethod(BuilderSupport.java:64)
at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:45)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:116)
at build_1dcr9mg141bsc4tjjgplovccq0$_run_closure2.doCall(C:\Users\sumit\workspace1\gradleTest\build.gradle:13)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)

Try this:
javaexec {
    main = 'com.pack.YourClass'
    // 'src' here is the name of a source set; with the default 'java' plugin layout
    // this would be sourceSets.main.output.classesDir and sourceSets.main.compileClasspath
    classpath(sourceSets.src.output.classesDir, sourceSets.src.compileClasspath)
    args(arg1, arg2)
}

Related

How to read data from smart card using jmrtd and scuba

I'm trying to read data from ePassports using jmrtd and scuba, but it doesn't work. Almost the same code works on Android; reading from a smart card on the desktop does not.
My method for reading:
public static void reader() throws Exception {
    try {
        CardTerminal terminal = TerminalFactory.getDefault().terminals().list().get(0);
        CardService cs = CardService.getInstance(terminal);
        PassportService service = new PassportService(cs);
        service.open();
        BACKeySpec bacKey = new BACKey("12312312312312", "31121990", "311230");
        boolean paceSucceeded = false;
        try {
            CardAccessFile cardAccessFile = new CardAccessFile(service.getInputStream(PassportService.EF_CARD_ACCESS));
            Collection<PACEInfo> paceInfos = cardAccessFile.getPACEInfos();
            if (paceInfos != null && paceInfos.size() > 0) {
                PACEInfo paceInfo = paceInfos.iterator().next();
                service.doPACE(bacKey, paceInfo.getObjectIdentifier(), PACEInfo.toParameterSpec(paceInfo.getParameterId()));
                paceSucceeded = true;
            } else {
                paceSucceeded = true;
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
        service.sendSelectApplet(paceSucceeded);
        if (!paceSucceeded) {
            try {
                service.getInputStream(PassportService.EF_COM).read();
            } catch (Exception e) {
                service.doBAC(bacKey);
            }
        }
        LDS lds = new LDS();
        CardFileInputStream dg1In = service.getInputStream(PassportService.EF_DG1);
        lds.add(PassportService.EF_DG1, dg1In, dg1In.getLength());
        DG1File dg1File = lds.getDG1File();
        System.out.println(dg1File.getMRZInfo().toString());
    } catch (CardServiceException e) {
        e.printStackTrace();
    }
}
pom.xml
<dependency>
    <groupId>org.jmrtd</groupId>
    <artifactId>jmrtd</artifactId>
    <version>0.5.2</version>
</dependency>
<dependency>
    <groupId>net.sf.scuba</groupId>
    <artifactId>scuba-sc-j2se</artifactId>
    <version>0.0.13</version>
</dependency>
<dependency>
    <groupId>net.sf.scuba</groupId>
    <artifactId>scuba-smartcards</artifactId>
    <version>0.0.13</version>
</dependency>
Stack trace:
D:\Java\jdk\bin\java -Didea.launcher.port=7539 "-Didea.launcher.bin.path=C:\Program Files (x86)\JetBrains\IntelliJ IDEA 14.0\bin" -Dfile.encoding=UTF-8 -classpath "D:\Java\jdk\jre\lib\charsets.jar;D:\Java\jdk\jre\lib\deploy.jar;D:\Java\jdk\jre\lib\javaws.jar;D:\Java\jdk\jre\lib\jce.jar;D:\Java\jdk\jre\lib\jfr.jar;D:\Java\jdk\jre\lib\jfxswt.jar;D:\Java\jdk\jre\lib\jsse.jar;D:\Java\jdk\jre\lib\management-agent.jar;D:\Java\jdk\jre\lib\plugin.jar;D:\Java\jdk\jre\lib\resources.jar;D:\Java\jdk\jre\lib\rt.jar;D:\Java\jdk\jre\lib\ext\access-bridge-32.jar;D:\Java\jdk\jre\lib\ext\cldrdata.jar;D:\Java\jdk\jre\lib\ext\dnsns.jar;D:\Java\jdk\jre\lib\ext\jaccess.jar;D:\Java\jdk\jre\lib\ext\jfxrt.jar;D:\Java\jdk\jre\lib\ext\localedata.jar;D:\Java\jdk\jre\lib\ext\nashorn.jar;D:\Java\jdk\jre\lib\ext\sunec.jar;D:\Java\jdk\jre\lib\ext\sunjce_provider.jar;D:\Java\jdk\jre\lib\ext\sunmscapi.jar;D:\Java\jdk\jre\lib\ext\sunpkcs11.jar;D:\Java\jdk\jre\lib\ext\zipfs.jar;C:\Users\maksat\IdeaProjects\smart_card\target\classes;C:\Users\maksat\.m2\repository\org\jmrtd\jmrtd\0.5.2\jmrtd-0.5.2.jar;C:\Users\maksat\.m2\repository\org\bouncycastle\bcprov-jdk15on\1.52\bcprov-jdk15on-1.52.jar;C:\Users\maksat\.m2\repository\net\sf\scuba\scuba-sc-j2se\0.0.11\scuba-sc-j2se-0.0.11.jar;C:\Users\maksat\.m2\repository\net\sf\scuba\scuba-smartcards\0.0.9\scuba-smartcards-0.0.9.jar;C:\Program Files (x86)\JetBrains\IntelliJ IDEA 14.0\lib\idea_rt.jar" com.intellij.rt.execution.application.AppMain paket.ReadFromSmartCard
net.sf.scuba.smartcards.CardServiceException: File not found, CAPDU = 00A4020C02011C, RAPDU = 6A82 (SW = 0x6A82: FILE NOT FOUND)
at org.jmrtd.PassportApduService.checkStatusWordAfterFileOperation(Unknown Source)
at org.jmrtd.PassportApduService.sendSelectFile(Unknown Source)
at org.jmrtd.PassportService.sendSelectFile(Unknown Source)
at org.jmrtd.MRTDFileSystem.getFileInfo(Unknown Source)
at org.jmrtd.MRTDFileSystem.getSelectedPath(Unknown Source)
at net.sf.scuba.smartcards.CardFileInputStream.<init>(CardFileInputStream.java:56)
at org.jmrtd.PassportService.getInputStream(Unknown Source)
at paket.ReadFromSmartCard.sdvsdv(ReadFromSmartCard.java:121)
at paket.ReadFromSmartCard.readData(ReadFromSmartCard.java:89)
at paket.ReadFromSmartCard.main(ReadFromSmartCard.java:23)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
Nov 19, 2019 10:45:51 AM org.jmrtd.PassportService doBAC
WARNING: BAC failed for BAC key "21009199301195, 100993, 120229"
net.sf.scuba.smartcards.CardServiceException: Mutual authentication failed: expected length: 40 + 2, actual length: 2 (SW = 0x6985: CONDITIONS NOT SATISFIED)
at org.jmrtd.PassportApduService.sendMutualAuth(Unknown Source)
at org.jmrtd.PassportService.doBAC(Unknown Source)
at org.jmrtd.PassportService.doBAC(Unknown Source)
at paket.ReadFromSmartCard.sdvsdv(ReadFromSmartCard.java:140)
at paket.ReadFromSmartCard.readData(ReadFromSmartCard.java:89)
at paket.ReadFromSmartCard.main(ReadFromSmartCard.java:23)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
Process finished with exit code 0

Issue deserializing protobuf events in Apache Flink

I am reading events from Kinesis in my Flink app; the events are in protobuf format. If I use 'com.google.protobuf:protobuf-java:3.7.1' in the Flink app I have no issues. However, if I change that to 'com.google.protobuf:protobuf-java:3.10.0' I get the exception below:
java.lang.IncompatibleClassChangeError: class com.google.protobuf.Descriptors$OneofDescriptor has interface com.google.protobuf.Descriptors$GenericDescriptor as super class
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.privateGetPublicMethods(Class.java:2902)
at java.lang.Class.privateGetPublicMethods(Class.java:2917)
at java.lang.Class.getMethods(Class.java:1615)
at org.apache.flink.api.java.typeutils.TypeExtractor.isValidPojoField(TypeExtractor.java:1786)
at org.apache.flink.api.java.typeutils.TypeExtractor.analyzePojo(TypeExtractor.java:1856)
at org.apache.flink.api.java.typeutils.TypeExtractor.privateGetForClass(TypeExtractor.java:1746)
at org.apache.flink.api.java.typeutils.TypeExtractor.privateGetForClass(TypeExtractor.java:1643)
at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfoWithTypeHierarchy(TypeExtractor.java:921)
at org.apache.flink.api.java.typeutils.TypeExtractor.privateCreateTypeInfo(TypeExtractor.java:781)
at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfo(TypeExtractor.java:735)
at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfo(TypeExtractor.java:731)
at org.apache.flink.api.common.typeinfo.TypeInformation.of(TypeInformation.java:211)
at org.apache.flink.api.java.typeutils.ListTypeInfo.<init>(ListTypeInfo.java:45)
at com.bagi.streaming.serialization.ProtoSchema.getProducedType(ProtoSchema.java:40)
at org.apache.flink.streaming.connectors.kinesis.serialization.KinesisDeserializationSchemaWrapper.getProducedType(KinesisDeserializationSchemaWrapper.java:57)
at org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer.getProducedType(FlinkKinesisConsumer.java:363)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.addSource(StreamExecutionEnvironment.java:1456)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.addSource(StreamExecutionEnvironment.java:1414)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.addSource(StreamExecutionEnvironment.java:1396)
at com.bagi.streaming.StreamProcessor.getKinesisTrackingStream(StreamProcessor.java:101)
at com.bagi.streaming.StreamProcessor.getKinesisTrackingStream(StreamProcessor.java:110)
at com.bagi.streaming.StreamProcessor.consumeKinesis(StreamProcessor.java:117)
at com.bagi.streaming.StreamProcessor.main(StreamProcessor.java:80)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:529)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:421)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:423)
at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:813)
at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:287)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:213)
at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1050)
at org.apache.flink.client.cli.CliFrontend.lambda$main$11(CliFrontend.java:1126)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1844)
at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1126)
I am using Flink 1.8.0 and 'com.twitter:chill-protobuf:0.9.3'. I am building the Flink app jar locally on my Mac. I have tried using protoc at both 3.10.0 and 3.7.1 with protobuf-java at 3.10.0, in case that matters.
Here is my deserializer:
public class ProtoSchema implements DeserializationSchema<List<Event>> {
    @Override
    public List<Event> deserialize(byte[] message) throws IOException {
        List<Event> events = new LinkedList<>();
        InputStream inputStream = new ByteArrayInputStream(message);
        while (true) {
            Event event = Event.parseDelimitedFrom(inputStream);
            if (event != null) {
                events.add(event);
            } else {
                break;
            }
        }
        return events;
    }

    @Override
    public boolean isEndOfStream(List<Event> nextElement) {
        return false;
    }

    @Override
    public TypeInformation<List<Event>> getProducedType() {
        return new ListTypeInfo<>(Event.class);
    }
}
which I am plugging in as follows:
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
Properties consumerConfig = new Properties();
consumerConfig.put(AWSConfigConstants.AWS_CREDENTIALS_PROVIDER, "AUTO");
consumerConfig.put(AWSConfigConstants.AWS_REGION, region);
consumerConfig.put(ConsumerConfigConstants.SHARD_GETRECORDS_INTERVAL_MILLIS, "300");
consumerConfig.put(ConsumerConfigConstants.SHARD_GETRECORDS_RETRIES, "10");
consumerConfig.put(ConsumerConfigConstants.SHARD_GETRECORDS_MAX, "5000");
consumerConfig.put(ConsumerConfigConstants.STREAM_INITIAL_POSITION, "LATEST");
env.addSource(new FlinkKinesisConsumer<>(name, new ProtoSchema(), consumerConfig)).name("KinesisSource");
env.getConfig().registerTypeWithKryoSerializer(Event.class, ProtobufSerializer.class);
Event.class is compiled from the protobuf schema using protoc 3.10.0 and protobuf-java 3.10.0.
As you said in the comments, protobuf-java 3.9.0 introduced a binary-incompatible change relative to lower versions (3.8 and below): the class Descriptors.OneofDescriptor was given the super-class Descriptors.GenericDescriptor. A static field from a super-interface of a client class may hide a field (with the same name) inherited from the new super-class and cause an IncompatibleClassChangeError.
So if you have protobuf-java 3.9.0+ on your classpath and some lower version (3.8 or below) also provides this class, you will get this error. (In my case it came from Hadoop, which ships protobuf-java 2.5, while my fat jar contained 3.10.)
Solution:
Shade the protobuf-java dependency in one of the incompatible artifacts (see "how to shade a dependency with Gradle"),
or use version 3.8 or lower as a temporary, short-sighted workaround.
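A quick way to confirm which jar the conflicting classes are actually loaded from at runtime is to print their code source. This is only an illustrative sketch (the class name ProtobufVersionCheck is made up); run it with the same classpath the Flink job ends up with on the cluster:

import java.security.CodeSource;

public class ProtobufVersionCheck {
    public static void main(String[] args) throws ClassNotFoundException {
        // Inspect the two classes named in the IncompatibleClassChangeError. If
        // GenericDescriptor reports isInterface=true, an older protobuf-java (3.8 or
        // lower) is winning on the classpath; loading OneofDescriptor afterwards may
        // then reproduce the error itself, which is equally telling.
        for (String name : new String[] {
                "com.google.protobuf.Descriptors$GenericDescriptor",
                "com.google.protobuf.Descriptors$OneofDescriptor" }) {
            Class<?> clazz = Class.forName(name);
            CodeSource source = clazz.getProtectionDomain().getCodeSource();
            System.out.println(name + " isInterface=" + clazz.isInterface()
                    + " loadedFrom=" + (source != null ? source.getLocation() : "unknown"));
        }
    }
}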

Launch JavaFX project using reflection

I have a program that downloads a Git repository, builds it and launches a defined main class. It works properly with ordinary projects, but when I want to launch a JavaFX project, I get strange errors like:
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at Main.main(Main.java:31)
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: app.UI_Main
at javafx.application.Application.launch(Application.java:260)
at app.UI_Main.main(UI_Main.java:31)
... 5 more
Caused by: java.lang.ClassNotFoundException: app.UI_Main
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at javafx.application.Application.launch(Application.java:248)
... 6 more
My Main class is:
import java.io.File;
import java.io.IOException;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
import java.lang.reflect.Modifier;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.Collection;

import org.eclipse.jgit.api.Git;
import org.eclipse.jgit.api.errors.GitAPIException;

public class Main {
    private static final String GIT_ADDRESS = "https://github.com/lerryv/CheckCheckerDesktop";
    private static final String MAIN_PATH = "app/";
    private static final String MAIN_CLASS = "app.UI_Main";

    public static void main(String[] args) throws GitAPIException, IOException, ClassNotFoundException,
            NoSuchMethodException, InvocationTargetException, IllegalAccessException {
        Git.cloneRepository().setURI(GIT_ADDRESS).setDirectory(Paths.get("./dir/").toFile()).call();
        Collection<String> result = compile(Paths.get("./dir/src/").toFile());
        String command = System.getProperty("java.home") + "/../bin/javac -d dirOut -cp \".:json-simple-1.1.jar\" "
                + String.join(" ", result);
        Runtime.getRuntime().exec(command);
        URLClassLoader urlClassLoader = URLClassLoader.newInstance(
                new URL[]{
                        new File("dirOut/").toURI().toURL()
                }
        );
        Class<?> clazz = urlClassLoader.loadClass(MAIN_CLASS);
        Method main = clazz.getDeclaredMethod("main", String[].class);
        assert Modifier.isStatic(main.getModifiers());
        main.invoke(null, (Object) args);
    }

    private static Collection<String> compile(File directory) {
        assert directory.isDirectory();
        Collection<String> result = new ArrayList<>();
        boolean hasFiles = false;
        for (File file : directory.listFiles()) {
            if (file.isDirectory()) {
                result.addAll(compile(file));
            } else {
                if (!hasFiles) {
                    String path = file.getAbsolutePath();
                    String extension = path.substring(path.lastIndexOf(".") + 1);
                    if (extension.equals("java")) hasFiles = true;
                }
            }
        }
        if (hasFiles) result.add(directory.getAbsolutePath() + "/*.java");
        return result;
    }
}
At first I thought it could not find the class, but when I removed the Method.invoke call the errors disappeared. Why does this happen, and are there any workarounds?
Runtime.getRuntime().exec(command)
This starts another process, so after this line executes the compilation is not yet finished; you need to wait for that process to end, and you should probably also handle the process's output/error streams to check whether it succeeded.
Process compileProc = Runtime.getRuntime().exec(command);
compileProc.waitFor();
Also, I don't know exactly what you are trying to do, but remember that not everyone has a compiler available and the java.home property configured, or it may be configured for a different Java version (an older one and your code will not compile, or a newer one and your code will not run).
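For completeness, a minimal sketch of that idea, under the same assumptions as the question (the command string built for javac; the helper name compileAndWait is invented here): it drains the compiler's error stream and checks the exit code instead of silently moving on.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class CompileRunner {

    // Run the given javac command line, surface its diagnostics and fail fast
    // if the compilation did not succeed.
    static void compileAndWait(String command) throws IOException, InterruptedException {
        Process compileProc = Runtime.getRuntime().exec(command);
        try (BufferedReader err = new BufferedReader(
                new InputStreamReader(compileProc.getErrorStream()))) {
            String line;
            while ((line = err.readLine()) != null) {
                System.err.println("javac: " + line); // compiler warnings and errors
            }
        }
        int exitCode = compileProc.waitFor(); // block until compilation finishes
        if (exitCode != 0) {
            throw new IllegalStateException("javac exited with code " + exitCode);
        }
    }
}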
The program starts a new process to build the project but executes the next line without monitoring its completion, so that step can be dropped if its result is not needed. If it is needed, you have to monitor the process (for example from a dedicated watching thread) so that the main thread only continues with its tasks after the work has finished.

program using reflection run as hadoop jar throws java.lang.NoClassDefFoundError

I have a base and a sub class. The requirement is to invoke the sub class's method using reflection. Both programs are below.
base.java
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;
public class base {
    public static void main(String[] args) {
        URLClassLoader loader = null;
        Class<?> cls;
        try {
            File file = new File("sub.jar");
            URL[] urls = { file.toURI().toURL() };
            loader = new URLClassLoader(urls);
            cls = loader.loadClass("sub");
            base obj = (base) cls.newInstance();
            obj.print();
        }
        catch (Exception e) {
            System.out.println("Exception occured:" + e);
            e.printStackTrace();
        }
    }

    public void print() {
        System.out.println("In Base class");
    }
}
sub.java
public class sub extends base {
    public void print() {
        System.out.println("In subclass");
    }
}
Compile both into jars:
javac base.java;
jar cvf base.jar base.class
javac sub.java;
jar cvf sub.jar sub.class
If I run base.jar with "java -cp", it works fine:
java -cp base.jar base
output: "In subclass"
But if I invoke it with the "hadoop jar" command, I get:
hadoop jar base.jar base
Exception in thread "main" java.lang.NoClassDefFoundError: base
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at base.main(base.java:15)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.ClassNotFoundException: base
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
Any help is greatly appreciated.
I believe your issue is on line 15, where you try to cast the loader as your base class; I believe the compiler doesn't know how to handle that. Additionally, I'm not sure why you're trying to run this program with Hadoop, since it does not use any of the Hadoop implementations or classes.
The problem is solved by placing base.jar on the Hadoop classpath. Run the "hadoop classpath" command, which lists all classpath directories, and place the jar in one of those directories.
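A closely related sketch (an alternative to changing the Hadoop classpath, based on the same diagnosis; not part of the answer above): under "hadoop jar", the base class is loaded by RunJar's own classloader rather than from the system classpath, so a URLClassLoader created with the default parent cannot resolve the superclass of sub. Passing the current classloader as the parent should avoid that:

import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

public class base {
    public static void main(String[] args) throws Exception {
        File file = new File("sub.jar");
        URL[] urls = { file.toURI().toURL() };
        // Use the classloader that loaded base (RunJar's loader under "hadoop jar")
        // as the parent instead of the default system classloader.
        URLClassLoader loader = new URLClassLoader(urls, base.class.getClassLoader());
        base obj = (base) loader.loadClass("sub").newInstance();
        obj.print();
    }

    public void print() {
        System.out.println("In Base class");
    }
}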

ProcessBuilder does not find the executable after moving to Spring Boot

I have a Java class running as a task (as a plugin in a Java CMS) which used Quartz.
I am now moving it out of the CMS and into a Spring Boot (v1.0.2) project and am using the @Scheduled annotation.
The problem is that ProcessBuilder reports a missing executable file, but File.exists() returns true.
It is the same Java version (OpenJDK 7) and the same application container (Tomcat 7), on FreeBSD 9.2.
The test code is as follows:
package test;

import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Service;

@Service
public class SchedulerTest {

    @Scheduled(fixedDelay = 1800000, initialDelay = 5000)
    public void test() {
        if ((new File("/usr/local/bin/exiftool")).exists()) {
            System.out.println("Found exiftool");
        }
        else {
            System.out.println("Did not find exiftool");
        }
        List<String> args = new ArrayList<String>();
        args.add("/usr/local/bin/exiftool");
        args.add("-n");
        args.add("-S");
        args.add("-DateTimeOriginal");
        args.add("/image/TEST.JPG");
        Process proc = null;
        try {
            proc = new ProcessBuilder(args).start();
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
        if (proc != null) {
            System.out.println("ExifTool output: " + proc.getOutputStream());
            proc.destroy();
        }
    }
}
The output is:
2014-04-28 12:30:49.471 INFO 22473 --- [ost-startStop-7] o.s.boot.SpringApplication : Started application in 10.997 seconds (JVM running for 4022.804)
Found exiftool
java.io.IOException: Cannot run program "/usr/local/bin/exiftool": error=2, No such file or directory
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1041)
at no.hoanglai.service.SchedulerTest.test(SchedulerTest.java:32)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.springframework.scheduling.support.ScheduledMethodRunnable.run(ScheduledMethodRunnable.java:65)
at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:54)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:304)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:178)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: error=2, No such file or directory
at java.lang.UNIXProcess.forkAndExec(Native Method)
at java.lang.UNIXProcess.<init>(UNIXProcess.java:184)
at java.lang.ProcessImpl.start(ProcessImpl.java:130)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1022)
... 14 more
What could be the problem?
My guess is the move from Quartz to @Scheduled.
I have also tried to create a custom bean and set the thread pool size to 2, without any luck.
Maybe it is a file format problem (not really executable?). See, for example, ""aapt" IOException error=2, No such file or directory" - why can't I build my gradle on jenkins?
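One way to narrow this down, as a hedged diagnostic sketch rather than a fix (the class name ExecCheck is made up): exiftool is normally a Perl script, so error=2 can also mean that its shebang interpreter, not the file itself, cannot be found in the new runtime environment. Launching it through the shell usually yields a more specific message than ProcessBuilder's generic IOException:

import java.io.File;
import java.io.IOException;

public class ExecCheck {
    public static void main(String[] args) throws IOException, InterruptedException {
        File tool = new File("/usr/local/bin/exiftool");
        System.out.println("exists=" + tool.exists() + " canExecute=" + tool.canExecute());
        // Running via the shell surfaces "bad interpreter" / PATH problems that the
        // direct ProcessBuilder call only reports as error=2, No such file or directory.
        Process p = new ProcessBuilder("/bin/sh", "-c", "/usr/local/bin/exiftool -ver")
                .inheritIO()
                .start();
        System.out.println("exit code: " + p.waitFor());
    }
}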
