I'm working with a project that contains both Java classes and Clojure files. The objective is to test the Clojure files from Java.
I'm using cljunit for this purpose: https://github.com/mikera/cljunit
The code I use is the following:
public class DemoClojureTest extends ClojureTest {
    @Override
    public List<String> namespaces() {
        List<String> ns = new ArrayList<String>();
        ns.add("com.example.demo.helloWorld");
        return ns;
    }
}
And the Clojure file (helloWorld.clj) is:
(ns com.example.demo.helloWorld
  (:use clojure.test))

(deftest test1
  (is (= 1 3)))

(deftest test2
  (is (= 2 2)))
When I try to execute DemoClojureTest, I get this error:
Error attempting to get var names for namespace [com.example.demo.helloWorld]
java.io.FileNotFoundException: Could not locate com/example/demo/helloWorld__init.class or com/example/demo/helloWorld.clj on classpath:
at clojure.lang.RT.load(RT.java:432)
at clojure.lang.RT.load(RT.java:400)
at clojure.core$load$fn__4890.invoke(core.clj:5415)
at clojure.core$load.doInvoke(core.clj:5414)
at clojure.lang.RestFn.invoke(RestFn.java:408)
at clojure.core$load_one.invoke(core.clj:5227)
at clojure.core$load_lib.doInvoke(core.clj:5264)
at clojure.lang.RestFn.applyTo(RestFn.java:142)
at clojure.core$apply.invoke(core.clj:603)
at clojure.core$load_libs.doInvoke(core.clj:5298)
at clojure.lang.RestFn.applyTo(RestFn.java:137)
at clojure.core$apply.invoke(core.clj:603)
at clojure.core$require.doInvoke(core.clj:5381)
at clojure.lang.RestFn.invoke(RestFn.java:408)
at mikera.cljunit.core$get_test_var_names.invoke(core.clj:67)
at clojure.lang.Var.invoke(Var.java:415)
at mikera.cljunit.Clojure.getTestVars(Clojure.java:29)
at mikera.cljunit.NamespaceTester.<init>(NamespaceTester.java:19)
at mikera.cljunit.ClojureTester.<init>(ClojureTester.java:21)
at mikera.cljunit.ClojureRunner.<init>(ClojureRunner.java:16)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
at org.junit.internal.builders.AnnotatedBuilder.buildRunner(AnnotatedBuilder.java:29)
at org.junit.internal.builders.AnnotatedBuilder.runnerForClass(AnnotatedBuilder.java:21)
at org.junit.runners.model.RunnerBuilder.safeRunnerForClass(RunnerBuilder.java:59)
at org.junit.internal.builders.AllDefaultPossibilitiesBuilder.runnerForClass(AllDefaultPossibilitiesBuilder.java:26)
at org.junit.runners.model.RunnerBuilder.safeRunnerForClass(RunnerBuilder.java:59)
at org.junit.internal.requests.ClassRequest.getRunner(ClassRequest.java:26)
at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:44)
at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:195)
at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:63)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
What am I doing wrong?
From the stack trace, it looks like the IntelliJ test runner is either running your tests with a classpath that does not include your Clojure source files, or not copying them to the build output.
Make sure the folder your Clojure files live in is under a "Content Root" and marked as a "Sources" or "Test Sources" folder in the module settings.
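If in doubt, you can verify that the .clj file is actually visible on the runtime classpath before cljunit tries to load the namespace. A minimal sketch, assuming only the resource path implied by the namespace in the question:

public class ClasspathCheck {
    public static void main(String[] args) {
        // the namespace com.example.demo.helloWorld must resolve to this resource path
        java.net.URL url = Thread.currentThread().getContextClassLoader()
                .getResource("com/example/demo/helloWorld.clj");
        // null means the folder containing your .clj files is not on the classpath
        System.out.println("helloWorld.clj on classpath: " + url);
    }
}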
See the quickstart guide on installing sampleclean:
http://sampleclean.org/release.html
I have followed the steps provided in the quick start and have the correct versions:
$ javac -version
javac 1.7.0_95
$ scala
Welcome to Scala version 2.10.5 (OpenJDK 64-Bit Server VM, Java 1.7.0_95)
I wasn't able to install spark-1.2.2, so I got spark-1.6.1 instead.
After building Spark, updating the Hive conf file, and running the Scala commands, I get a Java error:
~/sampleclean/spark-1.6.1$ ./bin/spark-shell --jars sampleclean-v0.1.jar
scala> import sampleclean.api.SampleCleanContext
import sampleclean.api.SampleCleanContext
scala> val scc = new SampleCleanContext(sc)
scc: sampleclean.api.SampleCleanContext = sampleclean.api.SampleCleanContext@2d15f9f9
scala> scc.hql("CREATE TABLE restaurant(id String, entity String, name String, category String, city String) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n'")
java.lang.NoSuchMethodError: sampleclean.api.SampleCleanContext.hql(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:36)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38)
at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
at $iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
at $iwC$$iwC$$iwC.<init>(<console>:44)
at $iwC$$iwC.<init>(<console>:46)
at $iwC.<init>(<console>:48)
at <init>(<console>:50)
at .<init>(<console>:54)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
I think I've followed all of the steps correctly. In addition, I can see the hql function defined in sampleclean/api/SampleCleanContext.scala.
Can somebody help me resolve this error?
Thanks!
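A NoSuchMethodError at a call site generally means the code was compiled against a different version of a library than the one on the runtime classpath, which is consistent with the version substitution mentioned above (the quickstart expects spark-1.2.2, while spark-1.6.1 is being used). One way to see which hql signature the jar actually contains is to reflect over the class from plain Java with sampleclean-v0.1.jar on the classpath; a hypothetical sketch:

import java.lang.reflect.Method;

public class HqlSignatureCheck {
    public static void main(String[] args) throws Exception {
        // run with sampleclean-v0.1.jar (and the matching Spark jars) on the classpath
        Class<?> scc = Class.forName("sampleclean.api.SampleCleanContext");
        for (Method m : scc.getMethods()) {
            if (m.getName().equals("hql")) {
                // prints the return type the jar was actually compiled with
                System.out.println(m);
            }
        }
    }
}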
I have a Java project containing a class MyFileLoader (among others) that successfully loads a file from resources using:

public static List<String> loadFile() throws IOException {
    // URL.getFile() returns a String, so wrap it in a File to obtain a Path;
    // UTF_8 is a static import of java.nio.charset.StandardCharsets.UTF_8
    Path path = new File(System.class.getResource("/my/path/model.bin").getFile()).toPath();
    return Files.readAllLines(path, UTF_8);
}
and then does some processing.
After adding this project/jar as a dependency of a Scala project, I tried to call MyFileLoader.loadFile. Unfortunately, this throws a java.lang.NullPointerException, as the resource isn't found.
To debug, I ran this command in spark-shell, showing that this resource indeed exists:
scala> getClass.getResource("/my/path/model.bin").getFile
res32: String = file:/some-local-path/my-jar-with-dependencies.jar!/my/path/model.bin
I then tried:
scala> Files.readAllLines(new File(getClass.getResource("/my/path/model.bin").getPath).toPath)
java.nio.file.NoSuchFileException: file:/some-local-path/my-jar-with-dependencies.jar!/my/path/model.bin
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
at sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:214)
at java.nio.file.Files.newByteChannel(Files.java:361)
at java.nio.file.Files.newByteChannel(Files.java:407)
at java.nio.file.spi.FileSystemProvider.newInputStream(FileSystemProvider.java:384)
at java.nio.file.Files.newInputStream(Files.java:152)
at java.nio.file.Files.newBufferedReader(Files.java:2784)
at java.nio.file.Files.readAllLines(Files.java:3202)
at java.nio.file.Files.readAllLines(Files.java:3242)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:20)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:25)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:27)
at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
at $iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
at $iwC$$iwC$$iwC.<init>(<console>:33)
at $iwC$$iwC.<init>(<console>:35)
at $iwC.<init>(<console>:37)
at <init>(<console>:39)
at .<init>(<console>:43)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)
at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Why am I not able to load the resource in either of these ways?
Since your file is now packaged inside a jar, you need to use Class.getResourceAsStream(). You are trying to read the URL as a regular file, which isn't supported; it likely worked before because the resource wasn't packaged inside a jar yet and could be opened as a regular file.
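A minimal sketch of reading the same resource through a stream; the path, charset, and class name are taken from the question:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.List;
import java.util.stream.Collectors;

public static List<String> loadFile() throws IOException {
    // getResourceAsStream works whether the resource is a loose file or packed in a jar
    try (InputStream in = MyFileLoader.class.getResourceAsStream("/my/path/model.bin");
         BufferedReader reader = new BufferedReader(
                 new InputStreamReader(in, StandardCharsets.UTF_8))) {
        return reader.lines().collect(Collectors.toList());
    }
}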
I am trying to create a workflow using the Oozie dashboard provided by the Hue interface. To take it step by step, my workflow has only one Java step. The relevant piece of code for this step is as follows:
import java.io.IOException;
import java.util.List;

import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.FileUtil;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.JobConf;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class InputPathsCalculator {

    private static final Logger LOGGER = LoggerFactory.getLogger(InputPathsCalculator.class);

    public static void main(String[] args) throws IOException {
        System.out.println("sout-ing");
        LOGGER.info("putting something in the log");

        JobConf jobConf = new JobConf();
        jobConf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));
        // fileSystem was not shown in the original snippet; presumably obtained like this
        FileSystem fileSystem = FileSystem.get(jobConf);

        Path outputPath = new Path(args[1]);
        // calculateInputPaths(...) is defined elsewhere (tested in isolation, see below)
        List<Path> inputPaths = calculateInputPaths(args[0], jobConf);

        FileUtil.copy(fileSystem,
                inputPaths.toArray(new Path[0]),
                fileSystem,
                outputPath,
                false,  // deleteSource
                true,   // overwrite
                jobConf);
    }
}
calculateInputPaths(...) is a method that has been tested in isolation and works just fine. The arguments I pass to it are a config file and a String with the value /usr/myUser/outputs/.
I have two problems here:
1. I can't see anything in any logs: neither what I print to the console nor what I write through the logger shows up.
2. The outputs directory exists, but I get the following stack trace:
org.apache.oozie.action.hadoop.JavaMainException: java.io.IOException: `/user/eliasg/outputs/output': specified destination directory does not exist
at org.apache.oozie.action.hadoop.JavaMain.run(JavaMain.java:58)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:39)
at org.apache.oozie.action.hadoop.JavaMain.main(JavaMain.java:36)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:226)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.io.IOException: `/user/eliasg/outputs/output': specified destination directory does not exist
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:306)
at com.ig.hadoop.jsonextractor.InputPathsCalculator.main(InputPathsCalculator.java:37)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.oozie.action.hadoop.JavaMain.run(JavaMain.java:55)
... 15 more
I have the feeling that, for point 2, my jobConf is missing something that will let it work with HDFS, but I don't know what. About point 1, I am completely lost.
I was looking in the wrong place: the logs were there, but I needed to look at the map task container logs instead of the application list logs.
I have also discovered that adding hdfs-site.xml to the jobConf is not enough. There is a property that is not set by default and needs to be:
jobConf.set("fs.default.name", String.format("hdfs://%1$s", jobConf.get("dfs.nameservices")));
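For context, a minimal sketch of how that fix slots into the main method above (fs.default.name is deprecated in favor of fs.defaultFS on newer Hadoop versions; the property name from the fix is kept here):

JobConf jobConf = new JobConf();
jobConf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));
// without a default filesystem, new Path("/user/...") resolves against the local
// filesystem, which is why an existing HDFS directory appeared not to exist
jobConf.set("fs.default.name",
        String.format("hdfs://%1$s", jobConf.get("dfs.nameservices")));
FileSystem fileSystem = FileSystem.get(jobConf); // now an HDFS filesystem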
I am using the Groovy ConfigSlurper to load a large Groovy file (741 KB) from a Groovy script, and I consistently receive a RuntimeException when it tries to compile the file.
Groovy 2.1.1, Java 1.6 (Apple/MacOSX)
I call it like so:
conf = new ConfigSlurper().parse(new File('conf.groovy').toURL())
And I always get the exception below. Is there a known limitation on the size of file that ConfigSlurper can compile?
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
General error during class generation: Class file too large!
java.lang.RuntimeException: Class file too large!
at org.objectweb.asm.ClassWriter.toByteArray(Unknown Source)
at org.codehaus.groovy.control.CompilationUnit$15.call(CompilationUnit.java:797)
at org.codehaus.groovy.control.CompilationUnit.applyToPrimaryClassNodes(CompilationUnit.java:1036)
at org.codehaus.groovy.control.CompilationUnit.doPhaseOperation(CompilationUnit.java:573)
at org.codehaus.groovy.control.CompilationUnit.processPhaseOperations(CompilationUnit.java:551)
at org.codehaus.groovy.control.CompilationUnit.compile(CompilationUnit.java:528)
at groovy.lang.GroovyClassLoader.doParseClass(GroovyClassLoader.java:279)
at groovy.lang.GroovyClassLoader.parseClass(GroovyClassLoader.java:258)
at groovy.lang.GroovyClassLoader.parseClass(GroovyClassLoader.java:244)
at groovy.lang.GroovyClassLoader.parseClass(GroovyClassLoader.java:202)
at groovy.lang.GroovyClassLoader.parseClass(GroovyClassLoader.java:212)
at groovy.lang.GroovyClassLoader$parseClass.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:116)
at groovy.util.ConfigSlurper.parse(ConfigSlurper.groovy:146)
at groovy.util.ConfigSlurper$parse.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:116)
at write_users.run(write_users.groovy:19)
at groovy.lang.GroovyShell.runScriptOrMainOrTestOrRunnable(GroovyShell.java:257)
at groovy.lang.GroovyShell.run(GroovyShell.java:220)
at groovy.lang.GroovyShell.run(GroovyShell.java:150)
at groovy.ui.GroovyMain.processOnce(GroovyMain.java:588)
at groovy.ui.GroovyMain.run(GroovyMain.java:375)
at groovy.ui.GroovyMain.process(GroovyMain.java:361)
at groovy.ui.GroovyMain.processArgs(GroovyMain.java:120)
at groovy.ui.GroovyMain.main(GroovyMain.java:100)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.codehaus.groovy.tools.GroovyStarter.rootLoader(GroovyStarter.java:106)
at org.codehaus.groovy.tools.GroovyStarter.main(GroovyStarter.java:128)
1 error
There is a known limitation, but it is arguably not Groovy's; it comes from the ASM library and, ultimately, from the JVM class-file format.
As Stuart Halloway has mentioned in talks, it is often insightful to know what is happening one layer below your level of abstraction.
For example, this link shows that this code:
public byte[] toByteArray() {
    if (index > Short.MAX_VALUE) {
        throw new RuntimeException("Class file too large!");
    }
... is likely what produces the exception shown here:
java.lang.RuntimeException: Class file too large!
at org.objectweb.asm.ClassWriter.toByteArray(Unknown Source)
Why does the ASM method throw this exception? This post states:
It turns out that the magic number for the "code too large" error is
65535 bytes (compiled byte code, not source code).
It is probable that the file is too large for Groovy's internal implementation: the whole config script is compiled into a single class, which can produce a method or constant pool that exceeds the JVM's class-file limits.
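If that is the cause, a common workaround is to split the config into several smaller files and merge the results. A minimal sketch in Java (the part file names are hypothetical, and this assumes the config can be split along top-level sections):

import groovy.util.ConfigObject;
import groovy.util.ConfigSlurper;

import java.io.File;

public class SplitConfigExample {
    public static void main(String[] args) throws Exception {
        ConfigSlurper slurper = new ConfigSlurper();
        ConfigObject conf = new ConfigObject();
        // each part compiles to its own, smaller class, staying under the limit
        for (String name : new String[] { "conf-part1.groovy", "conf-part2.groovy" }) {
            conf.merge(slurper.parse(new File(name).toURI().toURL()));
        }
        System.out.println(conf.flatten());
    }
}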
As part of some simulations that I am running with a tool called JiST/SWANS, I am getting some weird errors. This simulator was written for Java 1.4, and I am attempting to port it to 1.5.
What I am trying to do is compile the original code with the 1.5 SDK. The problem is that the simulator uses BCEL to rewrite the bytecode so that the JVM can be used for simulations. When I compile the code under the new SDK, I get the error given below. Can someone point me in the right direction? I know the bytecode produced by 1.4 and 1.5 differs somewhat, but I have no clue where to start looking.
java.lang.ArrayIndexOutOfBoundsException: -1
at java.util.ArrayList.remove(ArrayList.java:390)
at org.apache.bcel.verifier.structurals.OperandStack.pop(OperandStack.java:135)
at org.apache.bcel.verifier.structurals.ExecutionVisitor.visitPUTFIELD(ExecutionVisitor.java:1048)
at org.apache.bcel.generic.PUTFIELD.accept(PUTFIELD.java:78)
at jist.runtime.RewriterFlow.execute(RewriterFlow.java:235)
at jist.runtime.RewriterFlow.doFlow(RewriterFlow.java:187)
at jist.runtime.RewriterTraversalContinuableMethods.doMethod(Rewriter.java:3059)
at jist.runtime.ClassTraversal.processMethodGen(ClassTraversal.java:136)
at jist.runtime.ClassTraversal.processClassGen(ClassTraversal.java:96)
at jist.runtime.ClassTraversal.processClass(ClassTraversal.java:63)
at jist.runtime.Rewriter.rewriteClass(Rewriter.java:621)
at jist.runtime.Rewriter.findClass(Rewriter.java:410)
at jist.runtime.Rewriter.loadClass(Rewriter.java:367)
at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2427)
at java.lang.Class.getDeclaredMethod(Class.java:1935)
at jist.swans.app.AppJava.findMain(AppJava.java:86)
at jist.swans.app.AppJava.<init>(AppJava.java:61)
at driver.aodvtest.createNode(aodvtest.java:192)
at driver.aodvtest.createSim(aodvtest.java:235)
at driver.aodvtest.main(aodvtest.java:277)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at jist.runtime.Bootstrap$JavaMain.startSimulation(Bootstrap.java:163)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at jist.runtime.Controller.processEvent(Controller.java:650)
at jist.runtime.Controller.eventLoop(Controller.java:428)
at jist.runtime.Controller.run(Controller.java:457)
at java.lang.Thread.run(Thread.java:619)
java.lang.NullPointerException
at driver.aodvtest.createNode(aodvtest.java:198)
at driver.aodvtest.createSim(aodvtest.java:235)
at driver.aodvtest.main(aodvtest.java:277)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at jist.runtime.Bootstrap$JavaMain.startSimulation(Bootstrap.java:163)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at jist.runtime.Controller.processEvent(Controller.java:650)
at jist.runtime.Controller.eventLoop(Controller.java:428)
at jist.runtime.Controller.run(Controller.java:457)
at java.lang.Thread.run(Thread.java:619)
UPDATE:
I narrowed it down to this line, from which the exception is thrown:

private Method findMain(Class<?> c) throws NoSuchMethodException {
    return c.getDeclaredMethod("main", new Class<?>[] { String[].class });
}

There is a main method in the class that is passed to this method, so I have no clue why the lookup fails.
UPDATE 2:
There is one function:
createNode(..., ..., ..., ..., MyClient.class, ..., ...)
which passes MyClient.class to a function that uses it in the findMain method posted above. Using the debugger, I can see that declaredMethods is null, so obviously the getDeclaredMethods call is dying. The MyClient class is defined as a static nested class in the following way:

public static class MyClient {
    public static void main(String[] args) {
        ...
    }
}

I am not sure if this has anything to do with declaredMethods being null, so I tried extracting the class into a separate top-level class, but with no luck.
UPDATE 3:
OK, I narrowed it down. The following throws an exception even in the main class:
System.out.println(MyClient.class.getDeclaredMethods());
Perhaps downloading BCEL's source code and putting a breakpoint at line 135 of this method:
org.apache.bcel.verifier.structurals.OperandStack.pop(OperandStack.java:135)
will tell you more. Obviously, there's a problem somewhere with the array index manipulation.
From what I can see, BCEL is no longer under active development, so if this is a bug, it may be difficult to get it fixed even if you report it.
ASM is a more modern bytecode generation library that is still actively developed. You probably already know this, but if you have the simulator's source code and it doesn't use BCEL too heavily, you might be able to port it to ASM. Of course, a rewrite like that is usually overkill.
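One quick check that could narrow things down further: load the class through the plain system class loader, bypassing the JiST rewriter, and reflect on it there. If this works, the class file itself is fine and the BCEL-based rewriting is what corrupts it. A hypothetical sketch (the binary name of the nested class is illustrative):

import java.lang.reflect.Method;

public class ReflectionCheck {
    public static void main(String[] args) throws Exception {
        // load MyClient without initializing it and without the rewriting class loader
        Class<?> c = Class.forName("driver.aodvtest$MyClient", false,
                ClassLoader.getSystemClassLoader());
        for (Method m : c.getDeclaredMethods()) {
            System.out.println(m); // should list main(String[]) if the class file is sound
        }
    }
}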