specified destination directory does not exist - java

I am trying to create a workflow using the Oozie dashboard provided by the Hue interface. To take it step by step, my workflow has only one Java step. The relevant piece of code of this Java step is as follows:
import java.io.IOException;
import java.util.List;

import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.FileUtil;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.JobConf;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class InputPathsCalculator {

    private static final Logger LOGGER = LoggerFactory.getLogger(InputPathsCalculator.class);

    public static void main(String[] args) throws IOException {
        System.out.println("sout-ing");
        LOGGER.info("putting something in the log");

        JobConf jobConf = new JobConf();
        jobConf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));
        FileSystem fileSystem = FileSystem.get(jobConf); // declaration omitted in the original snippet

        Path outputPath = new Path(args[1]);
        List<Path> inputPaths = calculateInputPaths(args[0], jobConf); // defined elsewhere in this class
        FileUtil.copy(fileSystem,
                      inputPaths.toArray(new Path[0]),
                      fileSystem,
                      outputPath,
                      false,  // deleteSource
                      true,   // overwrite
                      jobConf);
    }
}
calculateInputPaths(...) is a method that has been tested in isolation and works just fine. The arguments that I pass to the method are a config file and a String with the value /usr/myUser/outputs/.
I have two problems here:
1. I can't see anything in any logs: neither what I print to the console nor what I write via the logger.
2. The outputs directory exists, but I get the following stack trace:
org.apache.oozie.action.hadoop.JavaMainException: java.io.IOException: `/user/eliasg/outputs/output': specified destination directory does not exist
at org.apache.oozie.action.hadoop.JavaMain.run(JavaMain.java:58)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:39)
at org.apache.oozie.action.hadoop.JavaMain.main(JavaMain.java:36)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:226)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.io.IOException: `/user/eliasg/outputs/output': specified destination directory does not exist
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:306)
at com.ig.hadoop.jsonextractor.InputPathsCalculator.main(InputPathsCalculator.java:37)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.oozie.action.hadoop.JavaMain.run(JavaMain.java:55)
... 15 more
I have the feeling that for point 2 my jobConf is missing something that will let it work with HDFS, but I don't know what. About point 1, I am completely lost.

I was looking in the wrong place. The logs were there, but apparently I needed to look at the map task container logs instead of the application list logs.
I have discovered that adding hdfs-site.xml to the jobConf is not enough. There is a property that is not set by default and needs to be set explicitly:
jobConf.set("fs.default.name", String.format("hdfs://%1$s", jobConf.get("dfs.nameservices")));
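For context, a minimal sketch of how the pieces fit together; the hdfs-site.xml path and the nameservice lookup come from this answer, while obtaining the FileSystem from the JobConf is an assumed addition:
// Build a JobConf that can talk to HDFS from inside an Oozie java action.
JobConf jobConf = new JobConf();
jobConf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));
// hdfs-site.xml alone is not enough: fs.default.name must point at the nameservice.
jobConf.set("fs.default.name", String.format("hdfs://%1$s", jobConf.get("dfs.nameservices")));
// With that set, FileSystem.get resolves to HDFS instead of the local filesystem.
FileSystem fileSystem = FileSystem.get(jobConf);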

Related

How to use System Property values as part of a logger handler pattern

I'm using java.util.logging.Logger as the logging engine for my application. I want to use a system property as part of the value of the handler pattern in the logging.properties file, like below:
java.util.logging.FileHandler.pattern = path/${custom.home}/logs/server.log
I tried setting the above value, but it doesn't resolve the custom.home property; instead it treats it as a literal string and gives the error below while initializing the handler:
Can't load log handler "java.util.logging.FileHandler"
java.nio.file.NoSuchFileException: path/${custom.home}/logs/server.log.lck
java.nio.file.NoSuchFileException: path/${custom.home}/logs/server.log.lck
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
at sun.nio.fs.UnixFileSystemProvider.newFileChannel(UnixFileSystemProvider.java:177)
at java.nio.channels.FileChannel.open(FileChannel.java:287)
at java.nio.channels.FileChannel.open(FileChannel.java:335)
at java.util.logging.FileHandler.openFiles(FileHandler.java:459)
at java.util.logging.FileHandler.<init>(FileHandler.java:263)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)
at java.util.logging.LogManager$5.run(LogManager.java:966)
at java.security.AccessController.doPrivileged(Native Method)
at java.util.logging.LogManager.loadLoggerHandlers(LogManager.java:958)
at java.util.logging.LogManager.initializeGlobalHandlers(LogManager.java:1578)
at java.util.logging.LogManager.access$1500(LogManager.java:145)
at java.util.logging.LogManager$RootLogger.accessCheckedHandlers(LogManager.java:1667)
at java.util.logging.Logger.getHandlers(Logger.java:1777)
at java.util.logging.Logger.log(Logger.java:735)
at java.util.logging.Logger.doLog(Logger.java:765)
at java.util.logging.Logger.log(Logger.java:788)
Is there any way of achieving this requirement?
The path path/${custom.home}/logs/server.log doesn't exist!
Use:
java.util.logging.FileHandler.pattern = "path/" + System.getProperty("user.home") + "/logs/server.log";
instead.
But Eclipse says: "FileHandler.pattern is not visible!"
From your question I'm assuming you have manually set a system property called custom_home as a launch argument. You can then subclass java.util.logging.FileHandler to recognize the new pattern syntax:
import java.io.IOException;
import java.util.logging.FileHandler;
import java.util.logging.LogManager;

public class EnvFileHandler extends FileHandler {

    // Reads this handler's pattern from logging.properties and expands ${custom_home}.
    private static String pattern() throws IOException {
        String prefix = EnvFileHandler.class.getName();
        String v = LogManager.getLogManager().getProperty(prefix + ".pattern");
        return v.replace("${custom_home}", System.getProperty("custom_home", "%hjava.log"));
    }

    public EnvFileHandler() throws IOException {
        super(pattern());
    }
}
Then instead of installing the FileHandler, you install the FileHandler subclass using your logging.properties.
.handlers=package.of.EnvFileHandler
package.of.EnvFileHandler.pattern=path/${custom_home}/logs/server.log
If you have not set this custom property, this code example will default to the home directory.
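For completeness, the custom property would then be supplied when launching the JVM; the path and main class in this example are hypothetical:
java -Dcustom_home=/opt/myapp -Djava.util.logging.config.file=logging.properties my.package.Main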

Cannot Load Java Resources in Spark from Scala

I have a Java project containing a class MyFileLoader (among others) that successfully loads a file from resources using:
// (assumes a static import of java.nio.charset.StandardCharsets.UTF_8)
public static List<String> loadFile() throws IOException {
    // getResource(...).getFile() returns a String, so wrap it in a File to get a Path
    Path path = new File(System.class.getResource("/my/path/model.bin").getFile()).toPath();
    return Files.readAllLines(path, UTF_8);
}
and then does some processing.
After adding this project/jar as a dependency in Scala, I tried to access MyFileLoader.loadFile. Unfortunately, this gives a java.lang.NullPointerException, as the resource isn't found.
To debug, I ran this command in spark-shell, showing that this resource indeed exists:
scala> getClass.getResource("/my/path/model.bin").getFile
res32: String = file:/some-local-path/my-jar-with-dependencies.jar!/my/path/model.bin
I then tried:
scala> Files.readAllLines(new File(getClass.getResource("/my/path/model.bin").getPath).toPath)
java.nio.file.NoSuchFileException: file:/some-local-path/my-jar-with-dependencies.jar!/my/path/model.bin
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
at sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:214)
at java.nio.file.Files.newByteChannel(Files.java:361)
at java.nio.file.Files.newByteChannel(Files.java:407)
at java.nio.file.spi.FileSystemProvider.newInputStream(FileSystemProvider.java:384)
at java.nio.file.Files.newInputStream(Files.java:152)
at java.nio.file.Files.newBufferedReader(Files.java:2784)
at java.nio.file.Files.readAllLines(Files.java:3202)
at java.nio.file.Files.readAllLines(Files.java:3242)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:20)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:25)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:27)
at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
at $iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
at $iwC$$iwC$$iwC.<init>(<console>:33)
at $iwC$$iwC.<init>(<console>:35)
at $iwC.<init>(<console>:37)
at <init>(<console>:39)
at .<init>(<console>:43)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)
at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Why am I not able to load the resource in these ways?
Since your file is now packaged inside a jar, you will need to use Class.getResourceAsStream(). It seems you are trying to read the URL as a regular file, which isn't supported (it likely worked before because the file wasn't packaged inside a jar and could be loaded as a regular file).
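A minimal sketch of what that might look like, reusing the class name and resource path from the question; the null check and explicit UTF-8 charset are my additions:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class MyFileLoader {
    public static List<String> loadFile() throws IOException {
        // Works both when the resource is a plain file and when it lives inside a jar.
        InputStream in = MyFileLoader.class.getResourceAsStream("/my/path/model.bin");
        if (in == null) {
            throw new IOException("resource not found: /my/path/model.bin");
        }
        List<String> lines = new ArrayList<String>();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(in, StandardCharsets.UTF_8))) {
            String line;
            while ((line = reader.readLine()) != null) {
                lines.add(line);
            }
        }
        return lines;
    }
}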

Mongo Hadoop Connector Issue

I am trying to run a MapReduce job: I pull from Mongo and then write to HDFS, but I cannot seem to get the job to run. I could not find an example, but one issue I am having is that if I set a Mongo input path, the job looks for a Mongo output path as well. And now I am getting an authentication error, even though my MongoDB instance does not have authentication enabled.
final Configuration conf = getConf();
final Job job = new Job(conf, "sort");
MongoConfig config = new MongoConfig(conf);
MongoConfigUtil.setInputFormat(getConf(), MongoInputFormat.class);
FileOutputFormat.setOutputPath(job, new Path("/trythisdir"));
MongoConfigUtil.setInputURI(conf,"mongodb://localhost:27017/fake_data.file");
//conf.set("mongo.output.uri", "mongodb://localhost:27017/fake_data.file");
job.setJarByClass(imageExtractor.class);
job.setMapperClass(imageExtractorMapper.class);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(Text.class);
job.setInputFormatClass( MongoInputFormat.class );
// Execute job and return status
return job.waitForCompletion(true) ? 0 : 1;
Edit: This is the current error I am having:
Exception in thread "main" java.lang.IllegalArgumentException: Couldn't connect and authenticate to get collection
at com.mongodb.hadoop.util.MongoConfigUtil.getCollection(MongoConfigUtil.java:353)
at com.mongodb.hadoop.splitter.MongoSplitterFactory.getSplitterByStats(MongoSplitterFactory.java:71)
at com.mongodb.hadoop.splitter.MongoSplitterFactory.getSplitter(MongoSplitterFactory.java:107)
at com.mongodb.hadoop.MongoInputFormat.getSplits(MongoInputFormat.java:56)
at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:1079)
at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1096)
at org.apache.hadoop.mapred.JobClient.access$600(JobClient.java:177)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:995)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:948)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:948)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:566)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:596)
at com.orbis.image.extractor.mongo.imageExtractor.run(imageExtractor.java:103)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at com.orbis.image.extractor.mongo.imageExtractor.main(imageExtractor.java:78)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
Caused by: java.lang.NullPointerException
at com.mongodb.MongoURI.<init>(MongoURI.java:148)
at com.mongodb.MongoClient.<init>(MongoClient.java:268)
at com.mongodb.hadoop.util.MongoConfigUtil.getCollection(MongoConfigUtil.java:351)
... 22 more
Late answer, but it may be helpful for people. I encountered the same problem while playing with Apache Spark.
I think you should correctly set mongo.input.uri and mongo.output.uri, which will be used by Hadoop, and also the input and output formats:
/* Correct input and output URI settings on Spark (Hadoop) */
conf.set("mongo.input.uri", "mongodb://localhost:27017/dbName.inputColName");
conf.set("mongo.output.uri", "mongodb://localhost:27017/dbName.outputColName");

/* Set input and output formats */
job.setInputFormatClass(MongoInputFormat.class);
job.setOutputFormatClass(MongoOutputFormat.class);
By the way, if the "mongo.input.uri" or "mongo.output.uri" strings have typos, it causes the same error.
Replace:
MongoConfigUtil.setInputURI(conf, "mongodb://localhost:27017/fake_data.file");
by:
MongoConfigUtil.setInputURI(job.getConfiguration(), "mongodb://localhost:27017/fake_data.file");
The conf object is already 'consumed' by your job, so you need to set it directly on the configuration of the job.
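In other words, construct the Job first and then apply every Mongo setting through the Job's own Configuration. A short sketch under that assumption, using the input URI from the question:
Configuration conf = getConf();
Job job = new Job(conf, "sort"); // the Job copies conf at this point

// Later changes must go through the Job's own copy to be visible:
MongoConfigUtil.setInputURI(job.getConfiguration(), "mongodb://localhost:27017/fake_data.file");
job.getConfiguration().set("mongo.output.uri", "mongodb://localhost:27017/fake_data.file");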
You haven't shared the complete code, so it's hard to tell, but what you've got there does not look consistent with typical usage of the MongoDB Connector for Hadoop.
I would suggest that you start with the examples on GitHub.

How to use the MultipleInputs class in MapReduce?

I have a question.
I need two files as input to my MapReduce program.
@Override
public int run(String[] args) throws Exception {
    // (argument skip)
    Job job1 = new Job();
    job1.setJarByClass(CFRecommenderDriver.class);
    job1.setMapperClass(CFRecommenderMapper.class);
    //job1.setReducerClass(CFRecommenderReducer.class);

    job1.setMapOutputKeyClass(Text.class);
    job1.setMapOutputValueClass(TextDoublePairWritableComparable.class);
    //job1.setOutputKeyClass(TextTwoWritableComparable.class);
    //job1.setOutputValueClass(TextDoubleTwoPairsWritableComparable.class);

    MultipleInputs.addInputPath(job1, new Path(args[0]), FileInputFormat.class);
    MultipleInputs.addInputPath(job1, new Path(args[1]), FileInputFormat.class);

    job1.setNumReduceTasks(0);

    boolean step1 = job1.waitForCompletion(true);
    if (!step1) return -1;
If I run the program with the following command:
hadoop jar mapreduce-0.1.jar cf /input/cf-re/data1 /input/cf-re/data2 /output/cf-r/data1
I get the following error:
2013-07-01 13:13:44.822 java[45783:1603] Unable to load realm info from SCDynamicStore
13/07/01 13:13:45 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
13/07/01 13:13:45 INFO mapred.JobClient: Cleaning up the staging area hdfs://127.0.0.1:9000/tmp/hadoop- suhyunjeon/mapred/staging/suhyunjeon/.staging/job_201306191432_0218
java.lang.RuntimeException: java.lang.InstantiationException
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:115)
at org.apache.hadoop.mapreduce.lib.input.MultipleInputs.getInputFormatMap(MultipleInputs.java:109)
at org.apache.hadoop.mapreduce.lib.input.DelegatingInputFormat.getSplits(DelegatingInputFormat.java:58)
at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:1024)
at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1041)
at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:959)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:912)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:912)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
at org.ankus.hadoop.mapreduce.algorithm.cf.recommender.CFRecommenderDriver.run(CFRecommenderDriver.java:86)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
at org.ankus.hadoop.mapreduce.algorithm.cf.recommender.CFRecommenderDriver.main(CFRecommenderDriver.java:49)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
at org.ankus.hadoop.mapreduce.MapReduceDriver.main(MapReduceDriver.java:64)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: java.lang.InstantiationException
at sun.reflect.InstantiationExceptionConstructorAccessorImpl.newInstance(InstantiationExceptionConstructorAccessorImpl.java:30)
at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:113)
... 29 more
I don't know what exactly the problem is. Please help me.
You cannot use the abstract class FileInputFormat directly. If your inputs are text, you can use org.apache.hadoop.mapreduce.lib.input.TextInputFormat. For example:
MultipleInputs.addInputPath(job1, new Path(args[0]), TextInputFormat.class);
MultipleInputs.addInputPath(job1, new Path(args[1]), TextInputFormat.class);
This is how you can use multiple input files in your job:
job1.setInputFormatClass(TextInputFormat.class);
FileInputFormat.setInputPaths(job1, input_1 + "," + input_2);
I think it's worth mentioning another method that can be used for adding multiple input paths, and to my mind the prettiest and simplest: FileInputFormat.setInputPaths(Job job, Path... inputPaths)
The Path... signature tells you that you can give any number of Path objects to this call. Example:
FileInputFormat.setInputPaths(job, new Path(args[0]), new Path(args[1]), new Path(args[2]));

Getting weird errors on stack manipulation

As part of some simulations that I am running using a tool called JIST/SWANS, I am getting some weird errors. This simulator was written for Java 1.4 and I am attempting to port it to 1.5.
What I am trying to do is compile the original code with the 1.5 SDK. The problem is that the simulator uses BCEL to rewrite the bytecode so that the JVM can be used for simulations. When I compile the code under the new SDK, I get the error given below. Can someone point me in the right direction to fix this? I know the bytecode produced by 1.4 and 1.5 is somewhat different, but I have no clue where to start looking.
java.lang.ArrayIndexOutOfBoundsException: -1
at java.util.ArrayList.remove(ArrayList.java:390)
at org.apache.bcel.verifier.structurals.OperandStack.pop(OperandStack.java:135)
at org.apache.bcel.verifier.structurals.ExecutionVisitor.visitPUTFIELD(ExecutionVisitor.java:1048)
at org.apache.bcel.generic.PUTFIELD.accept(PUTFIELD.java:78)
at jist.runtime.RewriterFlow.execute(RewriterFlow.java:235)
at jist.runtime.RewriterFlow.doFlow(RewriterFlow.java:187)
at jist.runtime.RewriterTraversalContinuableMethods.doMethod(Rewriter.java:3059)
at jist.runtime.ClassTraversal.processMethodGen(ClassTraversal.java:136)
at jist.runtime.ClassTraversal.processClassGen(ClassTraversal.java:96)
at jist.runtime.ClassTraversal.processClass(ClassTraversal.java:63)
at jist.runtime.Rewriter.rewriteClass(Rewriter.java:621)
at jist.runtime.Rewriter.findClass(Rewriter.java:410)
at jist.runtime.Rewriter.loadClass(Rewriter.java:367)
at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2427)
at java.lang.Class.getDeclaredMethod(Class.java:1935)
at jist.swans.app.AppJava.findMain(AppJava.java:86)
at jist.swans.app.AppJava.<init>(AppJava.java:61)
at driver.aodvtest.createNode(aodvtest.java:192)
at driver.aodvtest.createSim(aodvtest.java:235)
at driver.aodvtest.main(aodvtest.java:277)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at jist.runtime.Bootstrap$JavaMain.startSimulation(Bootstrap.java:163)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at jist.runtime.Controller.processEvent(Controller.java:650)
at jist.runtime.Controller.eventLoop(Controller.java:428)
at jist.runtime.Controller.run(Controller.java:457)
at java.lang.Thread.run(Thread.java:619)
java.lang.NullPointerException
at driver.aodvtest.createNode(aodvtest.java:198)
at driver.aodvtest.createSim(aodvtest.java:235)
at driver.aodvtest.main(aodvtest.java:277)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at jist.runtime.Bootstrap$JavaMain.startSimulation(Bootstrap.java:163)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at jist.runtime.Controller.processEvent(Controller.java:650)
at jist.runtime.Controller.eventLoop(Controller.java:428)
at jist.runtime.Controller.run(Controller.java:457)
at java.lang.Thread.run(Thread.java:619)
UPDATE:
I narrowed it down to this line, from which the exception is being thrown:
private Method findMain(Class<?> c) throws NoSuchMethodException {
    return c.getDeclaredMethod("main", new Class<?>[] { String[].class });
}
There is a main method in the class that is being passed to this function, so I have no clue why it is returning null.
UPDATE 2:
There is one function:
createNode(..., ..., ..., ..., MyClient.class, ..., ...)
which passes MyClient.class to a function that uses it in the findMain method posted above. Using the debugger, I can see that declaredMethods is null, so obviously the getDeclaredMethods call is dying. The MyClient class is defined as a static nested class in the following way:
public static class MyClient {
    public static void main(String[] args[]) {
        ...
    }
}
I am not sure if this has anything to do with declaredMethods being null, so I tried extracting the class into a separate top-level class, but with no luck.
UPDATE 3:
OK, I narrowed it down. The following throws an exception even in the main class:
System.out.println(MyClient.class.getDeclaredMethods());
Perhaps downloading BCEL's source code and putting a breakpoint at line 135 of this method
org.apache.bcel.verifier.structurals.OperandStack.pop(OperandStack.java:135)
will tell you more. Obviously, there's a problem somewhere with array index manipulation.
From what I see, BCEL is no longer in development, so if a bug exists, it might be difficult to get it fixed if you report it.
ASM is a more modern bytecode generation library that is under active development. You probably already know this, but if you have the simulator's source code and it isn't using BCEL too heavily, you might be able to rewrite it using ASM. Of course, stuff like this is usually overkill.
