Java Spark RDD inside another RDD? - java

I am trying to create a JavaRDD that contains another series of RDDs inside.
RDDMachine.foreach(machine -> startDetectionNow())
Inside, each machine starts a query to Elasticsearch and gets another RDD. I collect all of this (1200 hits) and convert it to Lists. After that, the Machine starts working with this list.
Firstly: is it possible to do this or not? If not, in which way can I try to do something different?
Let me show what I am trying to do:
SparkConf conf = new SparkConf().setAppName("Algo").setMaster("local");
conf.set("es.index.auto.create", "true");
conf.set("es.nodes", "IP_ES");
conf.set("es.port", "9200");
sparkContext = new JavaSparkContext(conf);
MyAlgoConfig config_algo = new MyAlgoConfig(Detection.byPrevisionMerge);
Machine m1 = new Machine("AL-27", "IP1", config_algo);
Machine m2 = new Machine("AL-20", "IP2", config_algo);
Machine m3 = new Machine("AL-24", "IP3", config_algo);
Machine m4 = new Machine("AL-21", "IP4", config_algo);
ArrayList<Machine> Machines = new ArrayList<>();
Machines.add(m1);
Machines.add(m2);
Machines.add(m3);
Machines.add(m4);
JavaRDD<Machine> machineRDD = sparkContext.parallelize(Machines);
machineRDD.foreach(machine -> machine.startDetectNow());
I am trying to start my algorithm on each machine, which must learn from data located in Elasticsearch.
public boolean startDetectNow() {
    // huge ELK query
    JavaRDD dataForLearn = Elastic.loadElasticsearch(
            Algo.sparkContext
            , "logstash-*/Collector"
            , Elastic.req_AvgOfCall(
                    getIP()
                    , "hour"
                    , "2016-04-16T00:00:00"
                    , "2016-06-10T00:00:00"));
    JavaRDD<Hit> RDD_hits = Elastic.mapToHit(dataForLearn);
    List<Hit> hits = Elastic.RddToListHits(RDD_hits);
    // ... (rest of the method)
}
So I am trying to fetch all the data for a query inside every "Machine".
My question is: is it possible to do this with Spark? Or maybe in another way?
When I run it in Spark, it seems to lock up as soon as the code reaches the second RDD.
The error message is:
16/08/17 00:17:13 INFO SparkContext: Starting job: collect at Elastic.java:94
16/08/17 00:17:13 INFO DAGScheduler: Got job 1 (collect at Elastic.java:94) with 1 output partitions
16/08/17 00:17:13 INFO DAGScheduler: Final stage: ResultStage 1 (collect at Elastic.java:94)
16/08/17 00:17:13 INFO DAGScheduler: Parents of final stage: List()
16/08/17 00:17:13 INFO DAGScheduler: Missing parents: List()
16/08/17 00:17:13 INFO DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[4] at map at Elastic.java:106), which has no missing parents
16/08/17 00:17:13 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 4.3 KB, free 7.7 KB)
16/08/17 00:17:13 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 2.5 KB, free 10.2 KB)
16/08/17 00:17:13 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on localhost:46356 (size: 2.5 KB, free: 511.1 MB)
16/08/17 00:17:13 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1006
16/08/17 00:17:13 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 1 (MapPartitionsRDD[4] at map at Elastic.java:106)
16/08/17 00:17:13 INFO TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
^C16/08/17 00:17:22 INFO SparkContext: Invoking stop() from shutdown hook
16/08/17 00:17:22 INFO SparkUI: Stopped Spark web UI at http://192.168.10.23:4040
16/08/17 00:17:22 INFO DAGScheduler: ResultStage 0 (foreach at GuardConnect.java:60) failed in 10,292 s
16/08/17 00:17:22 INFO DAGScheduler: Job 0 failed: foreach at GuardConnect.java:60, took 10,470974 s
Exception in thread "main" org.apache.spark.SparkException: Job 0 cancelled because SparkContext was shut down
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:806)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:804)
at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:804)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1658)
at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1581)
at org.apache.spark.SparkContext$$anonfun$stop$9.apply$mcV$sp(SparkContext.scala:1740)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1229)
at org.apache.spark.SparkContext.stop(SparkContext.scala:1739)
at org.apache.spark.SparkContext$$anonfun$3.apply$mcV$sp(SparkContext.scala:596)
at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:267)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:239)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:239)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:239)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1765)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:239)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:239)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:239)
at scala.util.Try$.apply(Try.scala:161)
at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:239)
at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:218)
at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:620)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1832)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1845)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1858)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1929)
at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:912)
at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:910)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
at org.apache.spark.rdd.RDD.foreach(RDD.scala:910)
at org.apache.spark.api.java.JavaRDDLike$class.foreach(JavaRDDLike.scala:332)
at org.apache.spark.api.java.AbstractJavaRDDLike.foreach(JavaRDDLike.scala:46)
at com.seigneurin.spark.GuardConnect.main(GuardConnect.java:60)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/08/17 00:17:22 ERROR LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerStageCompleted(org.apache.spark.scheduler.StageInfo#4a7e0846)
16/08/17 00:17:22 INFO DAGScheduler: ResultStage 1 (collect at Elastic.java:94) failed in 9,301 s
16/08/17 00:17:22 ERROR LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerStageCompleted(org.apache.spark.scheduler.StageInfo#6c6b4cb8)
16/08/17 00:17:22 ERROR LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerJobEnd(0,1471385842813,JobFailed(org.apache.spark.SparkException: Job 0 cancelled because SparkContext was shut down))
16/08/17 00:17:22 INFO DAGScheduler: Job 1 failed: collect at Elastic.java:94, took 9,317650 s
16/08/17 00:17:22 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
org.apache.spark.SparkException: Job 1 cancelled because SparkContext was shut down
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:806)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:804)
at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:804)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1658)
at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1581)
at org.apache.spark.SparkContext$$anonfun$stop$9.apply$mcV$sp(SparkContext.scala:1740)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1229)
at org.apache.spark.SparkContext.stop(SparkContext.scala:1739)
at org.apache.spark.SparkContext$$anonfun$3.apply$mcV$sp(SparkContext.scala:596)
at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:267)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:239)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:239)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:239)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1765)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:239)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:239)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:239)
at scala.util.Try$.apply(Try.scala:161)
at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:239)
at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:218)
at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:620)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1832)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1845)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1858)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1929)
at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:927)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
at org.apache.spark.rdd.RDD.collect(RDD.scala:926)
at org.apache.spark.api.java.JavaRDDLike$class.collect(JavaRDDLike.scala:339)
at org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:46)
at com.seigneurin.spark.Elastic.RddToListHits(Elastic.java:94)
at com.seigneurin.spark.OXO.prepareDataAndLearn(OXO.java:126)
at com.seigneurin.spark.OXO.startDetectNow(OXO.java:148)
at com.seigneurin.spark.GuardConnect.lambda$main$1282d8df$1(GuardConnect.java:60)
at org.apache.spark.api.java.JavaRDDLike$$anonfun$foreach$1.apply(JavaRDDLike.scala:332)
at org.apache.spark.api.java.JavaRDDLike$$anonfun$foreach$1.apply(JavaRDDLike.scala:332)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at org.apache.spark.InterruptibleIterator.foreach(InterruptibleIterator.scala:28)
at org.apache.spark.rdd.RDD$$anonfun$foreach$1$$anonfun$apply$32.apply(RDD.scala:912)
at org.apache.spark.rdd.RDD$$anonfun$foreach$1$$anonfun$apply$32.apply(RDD.scala:912)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
at org.apache.spark.scheduler.Task.run(Task.scala:89)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
16/08/17 00:17:22 ERROR LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerJobEnd(1,1471385842814,JobFailed(org.apache.spark.SparkException: Job 1 cancelled because SparkContext was shut down))
16/08/17 00:17:22 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/08/17 00:17:22 INFO MemoryStore: MemoryStore cleared
16/08/17 00:17:22 INFO BlockManager: BlockManager stopped
16/08/17 00:17:22 INFO BlockManagerMaster: BlockManagerMaster stopped
16/08/17 00:17:22 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/08/17 00:17:22 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/08/17 00:17:22 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
16/08/17 00:17:22 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, localhost, partition 0,ANY, 6751 bytes)
16/08/17 00:17:22 ERROR Inbox: Ignoring error
java.util.concurrent.RejectedExecutionException: Task org.apache.spark.executor.Executor$TaskRunner#65fd4104 rejected from java.util.concurrent.ThreadPoolExecutor#4387a1bf[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 1]
at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2047)
at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:823)
at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1369)
at org.apache.spark.executor.Executor.launchTask(Executor.scala:128)
at org.apache.spark.scheduler.local.LocalEndpoint$$anonfun$reviveOffers$1.apply(LocalBackend.scala:86)
at org.apache.spark.scheduler.local.LocalEndpoint$$anonfun$reviveOffers$1.apply(LocalBackend.scala:84)
at scala.collection.immutable.List.foreach(List.scala:318)
at org.apache.spark.scheduler.local.LocalEndpoint.reviveOffers(LocalBackend.scala:84)
at org.apache.spark.scheduler.local.LocalEndpoint$$anonfun$receive$1.applyOrElse(LocalBackend.scala:69)
at org.apache.spark.rpc.netty.Inbox$$anonfun$process$1.apply$mcV$sp(Inbox.scala:116)
at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:204)
at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100)
at org.apache.spark.rpc.netty.Dispatcher$MessageLoop.run(Dispatcher.scala:215)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
16/08/17 00:17:22 INFO SparkContext: Successfully stopped SparkContext
16/08/17 00:17:22 INFO ShutdownHookManager: Shutdown hook called
16/08/17 00:17:22 INFO ShutdownHookManager: Deleting directory /tmp/spark-8bf65e78-a916-4cc0-b4d1-1b0ec9a07157
16/08/17 00:17:22 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
16/08/17 00:17:22 INFO ShutdownHookManager: Deleting directory /tmp/spark-8bf65e78-a916-4cc0-b4d1-1b0ec9a07157/httpd-6d3aeb80-808c-4749-8f8b-ac9341f990a7
Thank you if you can give me some advice.

You cannot create an RDD inside another RDD, whatever the type of the RDD.
This is the first rule, because an RDD is only an abstraction that points to your data: the SparkContext that builds and schedules RDDs exists only on the driver, so code running inside an RDD action such as foreach (in a task on an executor, or in a task thread in local mode) cannot launch a second Spark job, which is why the nested query appears to lock up.
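One possible restructuring (a minimal sketch only, reusing the Machines list and startDetectNow() from the question) is to drive the per-machine work from the driver itself rather than from inside an RDD action, so that each Elasticsearch query can still go through the SparkContext, which only exists on the driver:

for (Machine machine : Machines) {
    // Still on the driver here, so startDetectNow() may create and collect
    // its own RDD from Elasticsearch for this machine.
    machine.startDetectNow();
}

If the per-machine work really has to run on executors, then the Elasticsearch query inside startDetectNow() would have to use a plain Elasticsearch client instead of creating a second RDD.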

Related

spark submit java.lang.IllegalArgumentException: Can not create a Path from an empty string

I am getting this error when I do spark-submit:
java.lang.IllegalArgumentException: Can not create a Path from an empty string
I am using:
Spark version 2.4.7
Hadoop version 3.3.0
IntelliJ IDE
JDK 8
First I was getting a class-not-found error, which I solved; now I am getting this error.
Is it because of the dataset or something else?
Link to the dataset: https://www.kaggle.com/datasnaek/youtube-new?select=INvideos.csv
The error:
C:\spark\spark-2.4.7-bin-hadoop2.7\bin>spark-submit --class org.example.TopViewedCategories --master local C:\Users\Piyush\IdeaProjects\BDA\target\BDA-1.0-SNAPSHOT.jar
Started Processing
21/05/04 06:56:04 INFO SparkContext: Running Spark version 2.4.7
21/05/04 06:56:04 INFO SparkContext: Submitted application: YouTubeDM
21/05/04 06:56:04 INFO SecurityManager: Changing view acls to: Piyush
21/05/04 06:56:04 INFO SecurityManager: Changing modify acls to: Piyush
21/05/04 06:56:04 INFO SecurityManager: Changing view acls groups to:
21/05/04 06:56:04 INFO SecurityManager: Changing modify acls groups to:
21/05/04 06:56:04 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(Piyush); groups with view permissions: Set(); users with modify permissions: Set(Piyush); groups with modify permissions: Set()
21/05/04 06:56:04 INFO Utils: Successfully started service 'sparkDriver' on port 63708.
21/05/04 06:56:04 INFO SparkEnv: Registering MapOutputTracker
21/05/04 06:56:04 INFO SparkEnv: Registering BlockManagerMaster
21/05/04 06:56:04 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
21/05/04 06:56:04 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
21/05/04 06:56:04 INFO DiskBlockManager: Created local directory at C:\Users\Piyush\AppData\Local\Temp\blockmgr-9f91b0fe-b655-422e-b0bf-38172b70dff0
21/05/04 06:56:05 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
21/05/04 06:56:05 INFO SparkEnv: Registering OutputCommitCoordinator
21/05/04 06:56:05 INFO Utils: Successfully started service 'SparkUI' on port 4040.
21/05/04 06:56:05 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://DESKTOP-IBFFKH9:4040
21/05/04 06:56:05 INFO SparkContext: Added JAR file:/C:/Users/Piyush/IdeaProjects/BDA/target/BDA-1.0-SNAPSHOT.jar at spark://DESKTOP-IBFFKH9:63708/jars/BDA-1.0-SNAPSHOT.jar with timestamp 1620091565160
21/05/04 06:56:05 INFO Executor: Starting executor ID driver on host localhost
21/05/04 06:56:05 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 63723.
21/05/04 06:56:05 INFO NettyBlockTransferService: Server created on DESKTOP-IBFFKH9:63723
21/05/04 06:56:05 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
21/05/04 06:56:05 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, DESKTOP-IBFFKH9, 63723, None)
21/05/04 06:56:05 INFO BlockManagerMasterEndpoint: Registering block manager DESKTOP-IBFFKH9:63723 with 366.3 MB RAM, BlockManagerId(driver, DESKTOP-IBFFKH9, 63723, None)
21/05/04 06:56:05 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, DESKTOP-IBFFKH9, 63723, None)
21/05/04 06:56:05 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, DESKTOP-IBFFKH9, 63723, None)
Exception in thread "main" java.lang.IllegalArgumentException: Can not create a Path from an empty string
at org.apache.hadoop.fs.Path.checkPathArg(Path.java:126)
at org.apache.hadoop.fs.Path.<init>(Path.java:183)
at org.apache.hadoop.fs.Path.getParent(Path.java:356)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:517)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:504)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:531)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:504)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:531)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:504)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:531)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:504)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:531)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:504)
at org.apache.hadoop.fs.ChecksumFileSystem.mkdirs(ChecksumFileSystem.java:694)
at org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.setupJob(FileOutputCommitter.java:313)
at org.apache.hadoop.mapred.FileOutputCommitter.setupJob(FileOutputCommitter.java:131)
at org.apache.hadoop.mapred.OutputCommitter.setupJob(OutputCommitter.java:265)
at org.apache.spark.internal.io.HadoopMapReduceCommitProtocol.setupJob(HadoopMapReduceCommitProtocol.scala:162)
at org.apache.spark.internal.io.SparkHadoopWriter$.write(SparkHadoopWriter.scala:74)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1.apply$mcV$sp(PairRDDFunctions.scala:1096)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1.apply(PairRDDFunctions.scala:1094)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1.apply(PairRDDFunctions.scala:1094)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:385)
at org.apache.spark.rdd.PairRDDFunctions.saveAsHadoopDataset(PairRDDFunctions.scala:1094)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$4.apply$mcV$sp(PairRDDFunctions.scala:1067)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$4.apply(PairRDDFunctions.scala:1032)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$4.apply(PairRDDFunctions.scala:1032)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:385)
at org.apache.spark.rdd.PairRDDFunctions.saveAsHadoopFile(PairRDDFunctions.scala:1032)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$1.apply$mcV$sp(PairRDDFunctions.scala:958)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$1.apply(PairRDDFunctions.scala:958)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$1.apply(PairRDDFunctions.scala:958)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:385)
at org.apache.spark.rdd.PairRDDFunctions.saveAsHadoopFile(PairRDDFunctions.scala:957)
at org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1.apply$mcV$sp(RDD.scala:1544)
at org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1.apply(RDD.scala:1523)
at org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1.apply(RDD.scala:1523)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:385)
at org.apache.spark.rdd.RDD.saveAsTextFile(RDD.scala:1523)
at org.apache.spark.api.java.JavaRDDLike$class.saveAsTextFile(JavaRDDLike.scala:550)
at org.apache.spark.api.java.AbstractJavaRDDLike.saveAsTextFile(JavaRDDLike.scala:45)
at org.example.TopViewedCategories.main(TopViewedCategories.java:46)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
21/05/04 06:56:06 ERROR ShutdownHookManager: Exception while deleting Spark temp dir: C:\Users\Piyush\AppData\Local\Temp\spark-2bac840b-8170-477d-a9ec-dd5f1f9283c2
java.io.IOException: Failed to delete: C:\Users\Piyush\AppData\Local\Temp\spark-2bac840b-8170-477d-a9ec-dd5f1f9283c2\userFiles-897873ea-324a-432c-85a1-786e5797243a\BDA-1.0-SNAPSHOT.jar
at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:144)
at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:128)
at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:128)
at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:91)
at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1062)
at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:65)
at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:62)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
at org.apache.spark.util.ShutdownHookManager$$anonfun$1.apply$mcV$sp(ShutdownHookManager.scala:62)
at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:214)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1945)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
at scala.util.Try$.apply(Try.scala:192)
at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
21/05/04 06:56:06 ERROR ShutdownHookManager: Exception while deleting Spark temp dir: C:\Users\Piyush\AppData\Local\Temp\spark-2bac840b-8170-477d-a9ec-dd5f1f9283c2\userFiles-897873ea-324a-432c-85a1-786e5797243a
java.io.IOException: Failed to delete: C:\Users\Piyush\AppData\Local\Temp\spark-2bac840b-8170-477d-a9ec-dd5f1f9283c2\userFiles-897873ea-324a-432c-85a1-786e5797243a\BDA-1.0-SNAPSHOT.jar
at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:144)
at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:128)
at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:91)
at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1062)
at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:65)
at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:62)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
at org.apache.spark.util.ShutdownHookManager$$anonfun$1.apply$mcV$sp(ShutdownHookManager.scala:62)
at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:214)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1945)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
at scala.util.Try$.apply(Try.scala:192)
at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
This is the code:
package org.example;

import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import scala.Tuple2;

import java.util.List;

public class TopViewedCategories {
    public static void main(String[] args) throws Exception {
        long timeElapsed = System.currentTimeMillis();
        System.out.println("Started Processing");
        SparkConf conf = new SparkConf()
                .setMaster("local")
                .setAppName("YouTubeDM");
        JavaSparkContext sc = new JavaSparkContext(conf);
        // Valid log levels include: ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN
        sc.setLogLevel("ERROR");
        JavaRDD<String> mRDD = sc.textFile("C:/Users/Piyush/Desktop/bda/INvideos"); // directory where the files are
        JavaPairRDD<Double, String> sortedRDD = mRDD
                // .filter(line -> line.split("\t").length > 6)
                .mapToPair(line -> {
                    String[] lineArr = line.split("\t");
                    String category = lineArr[5];
                    Double views = Double.parseDouble(lineArr[1]);
                    Tuple2<Double, Integer> viewsTuple = new Tuple2<>(views, 1);
                    return new Tuple2<>(category, viewsTuple);
                })
                .reduceByKey((x, y) -> new Tuple2<>(x._1 + y._1, x._2 + y._2))
                .mapToPair(x -> new Tuple2<>(x._1, (x._2._1 / x._2._2)))
                .mapToPair(Tuple2::swap)
                .sortByKey(false);
        // .take(10);
        long count = sortedRDD.count();
        List<Tuple2<Double, String>> topTenTuples = sortedRDD.take(10);
        JavaPairRDD<Double, String> topTenRdd = sc.parallelizePairs(topTenTuples);
        String output_dir = "C:output/spark/TopViewedCategories";
        // remove output directory if already there
        FileSystem fs = FileSystem.get(sc.hadoopConfiguration());
        fs.delete(new Path(output_dir), true); // delete dir, true for recursive
        topTenRdd.saveAsTextFile(output_dir);
        timeElapsed = System.currentTimeMillis() - timeElapsed;
        System.out.println("Done.Time taken (in seconds): " + timeElapsed / 1000f);
        System.out.println("Processed Records: " + count);
        sc.stop();
        sc.close();
    }
}
Please help me solve this.
It seems that the output_dir variable contains an incorrect path:
String output_dir = "C:output/spark/TopViewedCategories";
Because of that, fs.delete(new Path(output_dir), true) throws
java.lang.IllegalArgumentException: Can not create a Path from an empty string
at org.apache.hadoop.fs.Path.checkPathArg(Path.java:126)
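One hypothetical fix (illustration only, not from the original answer) is to give Hadoop a path it can fully parse, for example by adding the separator after the drive letter:

// Hypothetical fix (assumption): avoid the drive-relative form "C:output/..."
// by including the separator after the drive letter, so parent-directory
// resolution never ends up with an empty path component.
String output_dir = "C:/output/spark/TopViewedCategories";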

How to resolve "Error running job streaming job" in a Spark streaming program?

I created a Spark Master and a worker. Then I created a Spark streaming job and I am trying to submit it, but the Master shows a long list of Java errors.
Used this command to start the master:
spark-class org.apache.spark.deploy.master.Master
Used this to start the worker:
spark-class org.apache.spark.deploy.worker.Worker spark://ip:port
For submitting the Spark job, I tried commands with different parameters:
spark-submit --class com.rba.boston.SparkHome RBA-jar-with-dependencies.jar
spark-submit --class com.rba.boston.SparkHome --master spark://10.220.45.105:7077 --driver-memory 2G --driver-cores 2 --conf spark.driver.port=9998 --executor-memory 2G --deploy-mode cluster --total-executor-cores 4 RBA-jar-with-dependencies.jar
spark-submit --class com.rba.boston.SparkHome --master spark://10.220.45.105:7077 --driver-memory 2G --driver-cores 2 --conf spark.driver.port=9997 --conf spark.driver.host=10.220.45.105 --executor-memory 2G --deploy-mode client --total-executor-cores 4 RBA-jar-with-dependencies.jar
public static void main(String[] args) throws InterruptedException {
    SparkConf conf = new SparkConf().setMaster("local[3]").setAppName("NetworkWordCount");
    JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(3));
    JavaReceiverInputDStream<String> lines = jssc.socketTextStream("10.220.45.105", 9998);
    JavaDStream<String> words = lines.flatMap(x -> Arrays.asList(x.split(" ")).iterator());
    JavaPairDStream<String, Integer> pairs = words.mapToPair(s -> new Tuple2<>(s, 1));
    JavaPairDStream<String, Integer> wordCounts = pairs.reduceByKey((i1, i2) -> i1 + i2);
    wordCounts.print();
    jssc.start();
    jssc.awaitTermination(); // Wait for the computation to terminate
}
Logs after submitting the job are:
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/C:/Data/Softwares/spark-2.4.3-bin-hadoop2.6/jars/spark-unsafe_2.11-2.4.3.jar) to method java.nio.Bits.unaligned()
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
SparkHome
19/08/01 15:16:06 INFO SparkContext: Running Spark version 2.4.3
19/08/01 15:16:06 INFO SparkContext: Submitted application: NetworkWordCount
19/08/01 15:16:06 INFO SecurityManager: Changing view acls to: SachdeJ
19/08/01 15:16:06 INFO SecurityManager: Changing modify acls to: SachdeJ
19/08/01 15:16:06 INFO SecurityManager: Changing view acls groups to:
19/08/01 15:16:06 INFO SecurityManager: Changing modify acls groups to:
19/08/01 15:16:06 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(SachdeJ); groups with view permissions: Set(); users with modify permissions: Set(SachdeJ); groups with modify permissions: Set()
19/08/01 15:16:07 INFO Utils: Successfully started service 'sparkDriver' on port 54035.
19/08/01 15:16:07 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
19/08/01 15:16:07 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
19/08/01 15:16:07 INFO DiskBlockManager: Created local directory at C:\Users\sachdej\AppData\Local\Temp\blockmgr-cccaa7a1-cfdc-45fb-8ff5-9a800def11ff
19/08/01 15:16:07 INFO MemoryStore: MemoryStore started with capacity 434.4 MB
19/08/01 15:16:08 INFO Utils: Successfully started service 'SparkUI' on port 4040.
19/08/01 15:16:08 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://GGN1037742.bsci.bossci.com:4040
19/08/01 15:16:08 INFO SparkContext: Added JAR file:/C:/Data/SharkTank/com.rba.boston/target/RBA-jar-with-dependencies.jar at spark://GGN1037742.bsci.bossci.com:54035/jars/RBA-jar-with-dependencies.jar with timestamp 1564652768180
19/08/01 15:16:08 INFO Executor: Starting executor ID driver on host localhost
19/08/01 15:16:08 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 54057.
19/08/01 15:16:08 INFO NettyBlockTransferService: Server created on GGN1037742.bsci.bossci.com:54057
19/08/01 15:16:08 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
19/08/01 15:16:08 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, GGN1037742.bsci.bossci.com, 54057, None)
19/08/01 15:16:08 INFO BlockManagerMasterEndpoint: Registering block manager GGN1037742.bsci.bossci.com:54057 with 434.4 MB RAM, BlockManagerId(driver, GGN1037742.bsci.bossci.com, 54057, None)
19/08/01 15:16:08 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, GGN1037742.bsci.bossci.com, 54057, None)
19/08/01 15:16:08 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, GGN1037742.bsci.bossci.com, 54057, None)
19/08/01 15:16:08 INFO ReceiverTracker: Starting 1 receivers
19/08/01 15:16:08 INFO ReceiverTracker: ReceiverTracker started
19/08/01 15:16:08 INFO SocketInputDStream: Slide time = 3000 ms
19/08/01 15:16:08 INFO SocketInputDStream: Storage level = Serialized 1x Replicated
19/08/01 15:16:08 INFO SocketInputDStream: Checkpoint interval = null
19/08/01 15:16:08 INFO SocketInputDStream: Remember interval = 3000 ms
19/08/01 15:16:08 INFO SocketInputDStream: Initialized and validated org.apache.spark.streaming.dstream.SocketInputDStream#628a467a
19/08/01 15:16:08 INFO FlatMappedDStream: Slide time = 3000 ms
19/08/01 15:16:08 INFO FlatMappedDStream: Storage level = Serialized 1x Replicated
19/08/01 15:16:08 INFO FlatMappedDStream: Checkpoint interval = null
19/08/01 15:16:08 INFO FlatMappedDStream: Remember interval = 3000 ms
19/08/01 15:16:08 INFO FlatMappedDStream: Initialized and validated org.apache.spark.streaming.dstream.FlatMappedDStream#4013fae0
19/08/01 15:16:08 INFO MappedDStream: Slide time = 3000 ms
19/08/01 15:16:08 INFO MappedDStream: Storage level = Serialized 1x Replicated
19/08/01 15:16:08 INFO MappedDStream: Checkpoint interval = null
19/08/01 15:16:08 INFO MappedDStream: Remember interval = 3000 ms
19/08/01 15:16:08 INFO MappedDStream: Initialized and validated org.apache.spark.streaming.dstream.MappedDStream#2e5a119c
19/08/01 15:16:08 INFO ShuffledDStream: Slide time = 3000 ms
19/08/01 15:16:08 INFO ShuffledDStream: Storage level = Serialized 1x Replicated
19/08/01 15:16:08 INFO ShuffledDStream: Checkpoint interval = null
19/08/01 15:16:08 INFO ShuffledDStream: Remember interval = 3000 ms
19/08/01 15:16:08 INFO ShuffledDStream: Initialized and validated org.apache.spark.streaming.dstream.ShuffledDStream#2141cddb
19/08/01 15:16:08 INFO ForEachDStream: Slide time = 3000 ms
19/08/01 15:16:08 INFO ForEachDStream: Storage level = Serialized 1x Replicated
19/08/01 15:16:08 INFO ForEachDStream: Checkpoint interval = null
19/08/01 15:16:08 INFO ForEachDStream: Remember interval = 3000 ms
19/08/01 15:16:08 INFO ForEachDStream: Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream#1148f61f
19/08/01 15:16:08 INFO RecurringTimer: Started timer for JobGenerator at time 1564652769000
19/08/01 15:16:08 INFO JobGenerator: Started JobGenerator at 1564652769000 ms
19/08/01 15:16:08 INFO JobScheduler: Started JobScheduler
19/08/01 15:16:08 INFO ReceiverTracker: Receiver 0 started
19/08/01 15:16:08 INFO StreamingContext: StreamingContext started
19/08/01 15:16:08 INFO DAGScheduler: Got job 0 (start at SparkHome.java:40) with 1 output partitions
19/08/01 15:16:08 INFO DAGScheduler: Final stage: ResultStage 0 (start at SparkHome.java:40)
19/08/01 15:16:08 INFO DAGScheduler: Parents of final stage: List()
19/08/01 15:16:08 INFO DAGScheduler: Missing parents: List()
19/08/01 15:16:09 INFO DAGScheduler: Submitting ResultStage 0 (Receiver 0 ParallelCollectionRDD[0] at makeRDD at ReceiverTracker.scala:614), which has no missing parents
19/08/01 15:16:09 INFO JobScheduler: Added jobs for time 1564652769000 ms
19/08/01 15:16:09 INFO JobScheduler: Starting job streaming job 1564652769000 ms.0 from job set of time 1564652769000 ms
19/08/01 15:16:09 INFO JobScheduler: Finished job streaming job 1564652769000 ms.0 from job set of time 1564652769000 ms
19/08/01 15:16:09 ERROR JobScheduler: Error running job streaming job 1564652769000 ms.0
java.lang.IllegalArgumentException: Unsupported class file major version 56
at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:166)
at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:148)
at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:136)
at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:237)
at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:49)
at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:517)
at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:500)
at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:134)
at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:500)
at org.apache.xbean.asm6.ClassReader.readCode(ClassReader.java:2175)
at org.apache.xbean.asm6.ClassReader.readMethod(ClassReader.java:1238)
at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:631)
at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:355)
at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:307)
at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:306)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:306)
at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:162)
at org.apache.spark.SparkContext.clean(SparkContext.scala:2326)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2100)
at org.apache.spark.rdd.RDD$$anonfun$take$1.apply(RDD.scala:1364)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
at org.apache.spark.rdd.RDD.take(RDD.scala:1337)
at org.apache.spark.streaming.dstream.DStream$$anonfun$print$2$$anonfun$foreachFunc$3$1.apply(DStream.scala:735)
at org.apache.spark.streaming.dstream.DStream$$anonfun$print$2$$anonfun$foreachFunc$3$1.apply(DStream.scala:734)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
at scala.util.Try$.apply(Try.scala:192)
at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:257)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:835)
Exception in thread "main" java.lang.IllegalArgumentException: Unsupported class file major version 56
at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:166)
at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:148)
at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:136)
at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:237)
at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:49)
at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:517)
at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:500)
at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:134)
at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:500)
at org.apache.xbean.asm6.ClassReader.readCode(ClassReader.java:2175)
at org.apache.xbean.asm6.ClassReader.readMethod(ClassReader.java:1238)
at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:631)
at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:355)
at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:307)
at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:306)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:306)
at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:162)
at org.apache.spark.SparkContext.clean(SparkContext.scala:2326)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2100)
at org.apache.spark.rdd.RDD$$anonfun$take$1.apply(RDD.scala:1364)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
at org.apache.spark.rdd.RDD.take(RDD.scala:1337)
at org.apache.spark.streaming.dstream.DStream$$anonfun$print$2$$anonfun$foreachFunc$3$1.apply(DStream.scala:735)
at org.apache.spark.streaming.dstream.DStream$$anonfun$print$2$$anonfun$foreachFunc$3$1.apply(DStream.scala:734)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
at scala.util.Try$.apply(Try.scala:192)
at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:257)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:835)
19/08/01 15:16:09 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 46.6 KB, free 434.4 MB)
19/08/01 15:16:09 INFO StreamingContext: Invoking stop(stopGracefully=false) from shutdown hook
19/08/01 15:16:09 INFO ReceiverTracker: Sent stop signal to all 1 receivers
19/08/01 15:16:09 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 16.0 KB, free 434.3 MB)
19/08/01 15:16:09 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on GGN1037742.bsci.bossci.com:54057 (size: 16.0 KB, free: 434.4 MB)
19/08/01 15:16:09 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1161
19/08/01 15:16:09 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (Receiver 0 ParallelCollectionRDD[0] at makeRDD at ReceiverTracker.scala:614) (first 15 tasks are for partitions Vector(0))
19/08/01 15:16:09 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
19/08/01 15:16:09 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 8459 bytes)
19/08/01 15:16:09 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
19/08/01 15:16:09 INFO Executor: Fetching spark://GGN1037742.bsci.bossci.com:54035/jars/RBA-jar-with-dependencies.jar with timestamp 1564652768180
19/08/01 15:16:09 INFO TransportClientFactory: Successfully created connection to GGN1037742.bsci.bossci.com/10.220.45.105:54035 after 31 ms (0 ms spent in bootstraps)
19/08/01 15:16:09 INFO Utils: Fetching spark://GGN1037742.bsci.bossci.com:54035/jars/RBA-jar-with-dependencies.jar to C:\Users\sachdej\AppData\Local\Temp\spark-5b831314-0918-401a-88d8-1c6a135d554c\userFiles-99183ed0-5ef0-4d7d-8ea1-8e928e026847\fetchFileTemp6554652685231905365.tmp
19/08/01 15:16:09 INFO Executor: Adding file:/C:/Users/sachdej/AppData/Local/Temp/spark-5b831314-0918-401a-88d8-1c6a135d554c/userFiles-99183ed0-5ef0-4d7d-8ea1-8e928e026847/RBA-jar-with-dependencies.jar to class loader
19/08/01 15:16:09 INFO RecurringTimer: Started timer for BlockGenerator at time 1564652770000
19/08/01 15:16:09 INFO BlockGenerator: Started BlockGenerator
19/08/01 15:16:09 INFO BlockGenerator: Started block pushing thread
19/08/01 15:16:09 INFO ReceiverSupervisorImpl: Stopping receiver with message: Registered unsuccessfully because Driver refused to start receiver 0:
19/08/01 15:16:09 WARN ReceiverSupervisorImpl: Skip stopping receiver because it has not yet stared
19/08/01 15:16:09 INFO BlockGenerator: Stopping BlockGenerator
19/08/01 15:16:10 INFO RecurringTimer: Stopped timer for BlockGenerator after time 1564652770200
19/08/01 15:16:10 INFO BlockGenerator: Waiting for block pushing thread to terminate
19/08/01 15:16:10 INFO BlockGenerator: Pushing out the last 0 blocks
19/08/01 15:16:10 INFO BlockGenerator: Stopped block pushing thread
19/08/01 15:16:10 INFO BlockGenerator: Stopped BlockGenerator
19/08/01 15:16:10 INFO ReceiverSupervisorImpl: Waiting for receiver to be stopped
19/08/01 15:16:10 INFO ReceiverSupervisorImpl: Stopped receiver without error
19/08/01 15:16:10 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 794 bytes result sent to driver
19/08/01 15:16:10 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 965 ms on localhost (executor driver) (1/1)
19/08/01 15:16:10 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
19/08/01 15:16:10 INFO DAGScheduler: ResultStage 0 (start at SparkHome.java:40) finished in 1.229 s
19/08/01 15:16:10 INFO ReceiverTracker: All of the receivers have deregistered successfully
19/08/01 15:16:10 INFO ReceiverTracker: ReceiverTracker stopped
19/08/01 15:16:10 INFO JobGenerator: Stopping JobGenerator immediately
19/08/01 15:16:10 INFO RecurringTimer: Stopped timer for JobGenerator after time 1564652769000
19/08/01 15:16:10 INFO JobGenerator: Stopped JobGenerator
19/08/01 15:16:10 INFO JobScheduler: Stopped JobScheduler
19/08/01 15:16:10 INFO StreamingContext: StreamingContext stopped successfully
19/08/01 15:16:10 INFO SparkContext: Invoking stop() from shutdown hook
19/08/01 15:16:10 INFO SparkUI: Stopped Spark web UI at http://GGN1037742.bsci.bossci.com:4040
19/08/01 15:16:10 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/08/01 15:16:10 INFO MemoryStore: MemoryStore cleared
19/08/01 15:16:10 INFO BlockManager: BlockManager stopped
19/08/01 15:16:10 INFO BlockManagerMaster: BlockManagerMaster stopped
19/08/01 15:16:10 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/08/01 15:16:10 INFO SparkContext: Successfully stopped SparkContext
Master logs after submitting the job are:
19/08/01 14:03:10 ERROR TransportRequestHandler: Error while invoking RpcHandler#receive() for one-way message.
java.io.InvalidClassException: org.apache.spark.rpc.netty.NettyRpcEndpointRef; local class incompatible: stream classdesc serialVersionUID = -4186747031772874359, local class serialVersionUID = 6257082371135760434
at java.base/java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:689)
at java.base/java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1903)
at java.base/java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1772)
at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1594)
at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2355)
at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2249)
at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2087)
at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1594)
at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:430)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108)
at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:271)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:320)
at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:270)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:269)
at org.apache.spark.rpc.netty.RequestMessage$.apply(NettyRpcEnv.scala:611)
at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:662)
at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:654)
at org.apache.spark.network.server.TransportRequestHandler.processOneWayMessage(TransportRequestHandler.java:274)
at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:105)
19/08/01 14:03:10 ERROR TransportRequestHandler: Error while invoking RpcHandler#receive() for one-way message.
java.io.InvalidClassException: org.apache.spark.rpc.netty.NettyRpcEndpointRef; local class incompatible: stream classdesc serialVersionUID = -4186747031772874359, local class serialVersionUID = 6257082371135760434
at java.base/java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:689)
at java.base/java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1903)
at java.base/java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1772)
at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1594)
at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2355)
at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2249)
at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2087)
at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1594)
at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:430)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108)
at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:271)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:320)
at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:270)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:269)
at org.apache.spark.rpc.netty.RequestMessage$.apply(NettyRpcEnv.scala:611)
at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:662)
at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:654)
at org.apache.spark.network.server.TransportRequestHandler.processOneWayMessage(TransportRequestHandler.java:274)
at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:105)19/08/01 14:03:10 ERROR TransportRequestHandler: Error while invoking RpcHandler#receive() for one-way message.
java.io.InvalidClassException: org.apache.spark.rpc.netty.NettyRpcEndpointRef; local class incompatible: stream classdesc serialVersionUID = -4186747031772874359, local class serialVersionUID = 6257082371135760434
at java.base/java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:689)
at java.base/java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1903)
at java.base/java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1772)
at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1594)
Errors like
java.lang.IllegalArgumentException: Unsupported class file major version 56
mean the class was compiled with a newer version of Java than the one it is being run with.
Major version 56 means you compiled with Java 12, and I do not know which version of Java your Spark uses. According to this ticket, it looks like Spark is not even compatible with Java 11 yet: https://issues.apache.org/jira/browse/SPARK-24417
The solution is to compile your code with an earlier JDK (Java 8, for example).
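If you are unsure which JDK produced a given artifact, the class-file header answers it directly: after the 4-byte magic number, two bytes hold the minor version and the next two the major version (52 = Java 8, 55 = Java 11, 56 = Java 12). A minimal sketch of a hypothetical helper that prints these values for a compiled .class file:
import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;
public class ClassVersionCheck {
    public static void main(String[] args) throws IOException {
        // Pass the path of a compiled .class file from your job's jar or target/classes.
        try (DataInputStream in = new DataInputStream(new FileInputStream(args[0]))) {
            int magic = in.readInt();            // always 0xCAFEBABE for a valid class file
            int minor = in.readUnsignedShort();
            int major = in.readUnsignedShort();  // 52 = Java 8, 55 = Java 11, 56 = Java 12
            System.out.printf("magic=%08X major=%d minor=%d%n", magic, major, minor);
        }
    }
}
If the reported major version is higher than what your Spark cluster's JVM supports, rebuild with the older JDK (or set the compiler's target/release accordingly).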

Spark collect() network failure

I am just starting out with Spark. I am trying to use it to implement distributed processing for a deduplication application. The part I am working on now is supposed to take an RDD of Pair<String[],String[]>, which holds the columns of the records. This process should be highly parallelisable, but currently I am just running it locally. When debugging, everything seems to work as expected in the map function; when the collect tries to execute, though, everything breaks :( and I have no idea why; it is not even running on a cluster.
This is the relevant part of the code; let me know if you need to see more:
JavaRDD<Pair<String[], String[]>> rddData = javaSparkContext.parallelize(data.stream()
        .map(p -> Pair.of(p.getLeft().fields(), p.getRight().fields()))
        .collect(Collectors.toList()), 2);
final int featureVecSize = featureCombinationToFeatureIndex.size();
List<double[]> distributedArchFeatures = rddData.map(recordPair -> {
    String[] valuesA = recordPair.getLeft();
    String[] valuesB = recordPair.getRight();
    double[] scores = new double[featureVecSize];
    for (Map.Entry<Triple<Integer, ComparisonFeature, ComparisonModifier>, Integer> t : featureCombinationToFeatureIndex.entrySet()) {
        int index = t.getKey().getLeft();
        ComparisonFeature feature = t.getKey().getMiddle();
        ComparisonModifier modifier = t.getKey().getRight();
        double score = modifier.calculateScore(feature, valuesA[index], valuesB[index]);
        scores[t.getValue()] = score;
    }
    return scores;
}).collect();
And this is the stack trace I get; it seems to retry retrieving the data, but no dice:
[analyzerbeans-pool1-thread-27] INFO com.hi.identify7.execution.simple.SimpleMatchingExecutionContext - Scoring all matches...
[analyzerbeans-pool1-thread-27] INFO org.apache.spark.SparkContext - Starting job: collect at SimpleMatchingExecutionContext.java:177
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Got job 0 (collect at SimpleMatchingExecutionContext.java:177) with 2 output partitions
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Final stage: ResultStage 0 (collect at SimpleMatchingExecutionContext.java:177)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Parents of final stage: List()
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Missing parents: List()
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ResultStage 0 (MapPartitionsRDD[1] at map at SimpleMatchingExecutionContext.java:165), which has no missing parents
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_0 stored as values in memory (estimated size 4.9 KB, free 2.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_0_piece0 stored as bytes in memory (estimated size 2.7 KB, free 2.5 GB)
[dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_0_piece0 in memory on 10.209.1.88:52600 (size: 2.7 KB, free: 2.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 0 from broadcast at DAGScheduler.scala:1006
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 2 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at map at SimpleMatchingExecutionContext.java:165) (first 15 tasks are for partitions Vector(0, 1))
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 0.0 with 2 tasks
[dispatcher-event-loop-1] WARN org.apache.spark.scheduler.TaskSetManager - Stage 0 contains a task of very large size (11664 KB). The maximum recommended task size is 100 KB.
[dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 11944196 bytes)
[Executor task launch worker for task 0] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 0.0 (TID 0)
[Executor task launch worker for task 0] INFO org.apache.spark.storage.memory.MemoryStore - Block taskresult_0 stored as bytes in memory (estimated size 185.3 MB, free 2.3 GB)
[dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added taskresult_0 in memory on 10.209.1.88:52600 (size: 185.3 MB, free: 2.3 GB)
[Executor task launch worker for task 0] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 0.0 (TID 0). 194298473 bytes result sent via BlockManager)
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 0.0 (TID 1, localhost, executor driver, partition 1, PROCESS_LOCAL, 11932301 bytes)
[Executor task launch worker for task 1] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 0.0 (TID 1)
[task-result-getter-0] INFO org.apache.spark.network.client.TransportClientFactory - Successfully created connection to /10.209.1.88:52600 after 59 ms (0 ms spent in bootstraps)
[shuffle-client-4-1] ERROR org.apache.spark.network.client.TransportClient - Failed to send RPC 7173534709356817937 to /10.209.1.88:52600: java.lang.AbstractMethodError: org.apache.spark.network.protocol.MessageWithHeader.touch(Ljava/lang/Object;)Lio/netty/util/ReferenceCounted;
java.lang.AbstractMethodError: org.apache.spark.network.protocol.MessageWithHeader.touch(Ljava/lang/Object;)Lio/netty/util/ReferenceCounted;
at io.netty.util.ReferenceCountUtil.touch(ReferenceCountUtil.java:73)
at io.netty.channel.DefaultChannelPipeline.touch(DefaultChannelPipeline.java:107)
at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:810)
at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:723)
at io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:111)
at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:738)
at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:730)
at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:816)
at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:723)
at io.netty.handler.timeout.IdleStateHandler.write(IdleStateHandler.java:302)
at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:738)
at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:730)
at io.netty.channel.AbstractChannelHandlerContext.access$1900(AbstractChannelHandlerContext.java:38)
at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.write(AbstractChannelHandlerContext.java:1089)
at io.netty.channel.AbstractChannelHandlerContext$WriteAndFlushTask.write(AbstractChannelHandlerContext.java:1136)
at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.run(AbstractChannelHandlerContext.java:1078)
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute$$$capture(AbstractEventExecutor.java:163)
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:462)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Thread.java:748)
[shuffle-client-4-1] ERROR org.apache.spark.network.shuffle.OneForOneBlockFetcher - Failed while starting block fetches
java.io.IOException: Failed to send RPC 7173534709356817937 to /10.209.1.88:52600: java.lang.AbstractMethodError: org.apache.spark.network.protocol.MessageWithHeader.touch(Ljava/lang/Object;)Lio/netty/util/ReferenceCounted;
at org.apache.spark.network.client.TransportClient.lambda$sendRpc$2(TransportClient.java:237)
at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:507)
at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:481)
at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:420)
at io.netty.util.concurrent.DefaultPromise.tryFailure(DefaultPromise.java:122)
at io.netty.util.internal.PromiseNotificationUtil.tryFailure(PromiseNotificationUtil.java:64)
at io.netty.channel.AbstractChannelHandlerContext.notifyOutboundHandlerException(AbstractChannelHandlerContext.java:837)
at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:740)
at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:730)
at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:816)
at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:723)
at io.netty.handler.timeout.IdleStateHandler.write(IdleStateHandler.java:302)
at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:738)
at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:730)
at io.netty.channel.AbstractChannelHandlerContext.access$1900(AbstractChannelHandlerContext.java:38)
at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.write(AbstractChannelHandlerContext.java:1089)
at io.netty.channel.AbstractChannelHandlerContext$WriteAndFlushTask.write(AbstractChannelHandlerContext.java:1136)
at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.run(AbstractChannelHandlerContext.java:1078)
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute$$$capture(AbstractEventExecutor.java:163)
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:462)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.AbstractMethodError: org.apache.spark.network.protocol.MessageWithHeader.touch(Ljava/lang/Object;)Lio/netty/util/ReferenceCounted;
at io.netty.util.ReferenceCountUtil.touch(ReferenceCountUtil.java:73)
at io.netty.channel.DefaultChannelPipeline.touch(DefaultChannelPipeline.java:107)
at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:810)
at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:723)
at io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:111)
at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:738)
... 17 more
[shuffle-client-4-1] INFO org.apache.spark.network.shuffle.RetryingBlockFetcher - Retrying fetch (1/3) for 1 outstanding blocks after 5000 ms
[Block Fetch Retry-6-1] INFO org.apache.spark.network.client.TransportClientFactory - Found inactive connection to /10.209.1.88:52600, creating a new one.
[Block Fetch Retry-6-1] INFO org.apache.spark.network.client.TransportClientFactory - Successfully created connection to /10.209.1.88:52600 after 1 ms (0 ms spent in bootstraps)
(the retried fetch fails again with the same AbstractMethodError on org.apache.spark.network.protocol.MessageWithHeader.touch and an identical stack trace)
This is due to a Netty dependency conflict: something on your classpath pulls in a Netty version that does not match the one Spark needs. Check the dependency tree of your project (for example with mvn dependency:tree) and make sure the Netty version Spark requires is the one that gets used.
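To see which Netty build actually ends up on the runtime classpath, Netty exposes its own version metadata; a small diagnostic sketch (a hypothetical helper, not part of the original answer) assuming Netty 4.x:
import io.netty.util.Version;
import java.util.Map;
public class NettyVersionCheck {
    public static void main(String[] args) {
        // Version.identify() lists every Netty artifact found on the classpath
        // together with the version recorded in its manifest.
        Map<String, Version> versions = Version.identify();
        versions.forEach((artifact, version) -> System.out.println(artifact + " -> " + version));
    }
}
If the printed version differs from the one bundled with your Spark distribution, exclude or pin the conflicting transitive dependency in your build.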

Sorting RDD by key [duplicate]

As per the Spark documentation, only RDD actions can trigger a Spark job; transformations are lazily evaluated and only run when an action is called on the resulting RDD.
Yet I see the sortBy transformation being applied immediately, and it shows up as a job in the Spark UI. Why?
sortBy is implemented using sortByKey, which depends on a RangePartitioner (JVM) or a partitioning function (Python). When you call sortBy / sortByKey, the partitioner (partitioning function) is initialized eagerly and samples the input RDD to compute partition boundaries. The job you see corresponds to this sampling process.
The actual sorting is performed only when you execute an action on the newly created RDD or its descendants.
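To make this concrete in the Java API, here is a minimal sketch in local mode (illustrative values only): the sortBy call alone already triggers a small sampling job, and the full sort only runs when the action is invoked.
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import java.util.Arrays;
import java.util.List;
public class SortByDemo {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("SortByDemo").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);
        JavaRDD<Integer> numbers = sc.parallelize(Arrays.asList(5, 1, 4, 2, 3), 4);
        // The RangePartitioner samples the input here, so a small job runs immediately.
        JavaRDD<Integer> sorted = numbers.sortBy(x -> x, true, 4);
        // The actual sort is only executed by this action.
        List<Integer> result = sorted.collect();
        System.out.println(result);
        sc.stop();
    }
}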
As per the Spark documentation, only actions trigger a job in Spark; the transformations are lazily evaluated when an action is called.
In general you're right, but as you've just experienced, there are a few exceptions, and sortBy is among them (together with zipWithIndex).
As a matter of fact, this was reported in Spark's JIRA and closed with a Won't Fix resolution. See SPARK-1021 sortByKey() launches a cluster job when it shouldn't.
You can see the job running with DAGScheduler logging enabled (and later in web UI):
scala> sc.parallelize(0 to 8).sortBy(identity)
INFO DAGScheduler: Got job 1 (sortBy at <console>:25) with 8 output partitions
INFO DAGScheduler: Final stage: ResultStage 1 (sortBy at <console>:25)
INFO DAGScheduler: Parents of final stage: List()
INFO DAGScheduler: Missing parents: List()
DEBUG DAGScheduler: submitStage(ResultStage 1)
DEBUG DAGScheduler: missing: List()
INFO DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[4] at sortBy at <console>:25), which has no missing parents
DEBUG DAGScheduler: submitMissingTasks(ResultStage 1)
INFO DAGScheduler: Submitting 8 missing tasks from ResultStage 1 (MapPartitionsRDD[4] at sortBy at <console>:25)
DEBUG DAGScheduler: New pending partitions: Set(0, 1, 5, 2, 6, 3, 7, 4)
INFO DAGScheduler: ResultStage 1 (sortBy at <console>:25) finished in 0.013 s
DEBUG DAGScheduler: After removal of stage 1, remaining stages = 0
INFO DAGScheduler: Job 1 finished: sortBy at <console>:25, took 0.019755 s
res1: org.apache.spark.rdd.RDD[Int] = MapPartitionsRDD[6] at sortBy at <console>:25

How to integrate Spark Streaming (spark-2.1.0) with Kafka (2.11-0.10.2.0) correctly in Java?

I tried using Spark Streaming to process Kafka messages, following this guide: https://spark.apache.org/docs/latest/streaming-kafka-0-10-integration.html
My code is below:
SparkConf sparkConf = new SparkConf().setAppName("JavaDirectKafkaWordCount").setMaster("spark://sl:7077");
JavaStreamingContext jssc = new JavaStreamingContext(sparkConf, Durations.seconds(10));
Map<String, Object> kafkaParams = new HashMap<>();
kafkaParams.put("bootstrap.servers", "10.0.1.5:9092");
kafkaParams.put("key.deserializer", StringDeserializer.class);
kafkaParams.put("value.deserializer", StringDeserializer.class);
kafkaParams.put("group.id", "group1");
kafkaParams.put("auto.offset.reset", "earliest");
kafkaParams.put("enable.auto.commit", false);
Collection<String> topics = Collections.singletonList("test");
final JavaInputDStream<ConsumerRecord<String, String>> stream = KafkaUtils.createDirectStream(
        jssc,
        LocationStrategies.PreferConsistent(),
        ConsumerStrategies.<String, String>Subscribe(topics, kafkaParams));
stream.print();
After submitting, it returns:
17/04/05 22:43:10 INFO SparkContext: Starting job: print at JavaDirectKafkaWordCount.java:47
17/04/05 22:43:10 INFO DAGScheduler: Got job 0 (print at JavaDirectKafkaWordCount.java:47) with 1 output partitions
17/04/05 22:43:10 INFO DAGScheduler: Final stage: ResultStage 0 (print at JavaDirectKafkaWordCount.java:47)
17/04/05 22:43:10 INFO DAGScheduler: Parents of final stage: List()
17/04/05 22:43:10 INFO DAGScheduler: Missing parents: List()
17/04/05 22:43:10 INFO DAGScheduler: Submitting ResultStage 0 (KafkaRDD[0] at createDirectStream at JavaDirectKafkaWordCount.java:44), which has no missing parents
17/04/05 22:43:10 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 2.3 KB, free 366.3 MB)
17/04/05 22:43:10 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 1529.0 B, free 366.3 MB)
17/04/05 22:43:10 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 10.245.226.155:15258 (size: 1529.0 B, free: 366.3 MB)
17/04/05 22:43:10 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:996
17/04/05 22:43:10 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (KafkaRDD[0] at createDirectStream at JavaDirectKafkaWordCount.java:44)
17/04/05 22:43:10 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
17/04/05 22:43:10 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Registered executor NettyRpcEndpointRef(null) (10.245.226.155:53448) with ID 0
17/04/05 22:43:10 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, 10.245.226.155, executor 0, partition 0, PROCESS_LOCAL, 7295 bytes)
17/04/05 22:43:10 INFO BlockManagerMasterEndpoint: Registering block manager 10.245.226.155:14669 with 366.3 MB RAM, BlockManagerId(0, 10.245.226.155, 14669, None)
17/04/05 22:43:10 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Registered executor NettyRpcEndpointRef(null) (10.245.226.155:53447) with ID 1
17/04/05 22:43:10 INFO BlockManagerMasterEndpoint: Registering block manager 10.245.226.155:33754 with 366.3 MB RAM, BlockManagerId(1, 10.245.226.155, 33754, None)
17/04/05 22:43:11 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, 10.245.226.155, executor 0): java.lang.NullPointerException
at org.apache.spark.util.Utils$.decodeFileNameInURI(Utils.scala:409)
at org.apache.spark.util.Utils$.fetchFile(Utils.scala:434)
at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:508)
at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:500)
at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:500)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:257)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Can someone help with this? Thanks very much.
Can you provide the parameters you passed to spark-submit?
You might have passed a jar file name instead of the absolute path to the jar file. The class org.apache.spark.executor.Executor tries to load the "Added Jars" and "Added Files" in its updateDependencies method, but the URI path is not in the form Spark expects (hence the NullPointerException in decodeFileNameInURI).
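A minimal sketch of a submission that avoids this, with a hypothetical main class and path; the key point is that the application jar is given as an absolute path:
spark-submit \
  --class com.example.JavaDirectKafkaWordCount \
  --master spark://sl:7077 \
  /absolute/path/to/kafka-wordcount.jar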
