Java spark-submit with Google Secret Manager: NoSuchMethodError

I have a Java Spark program that uses the Google Secret Manager client libraries (https://cloud.google.com/secret-manager/docs/reference/libraries#client-libraries-install-java).
While doing spark-submit I got the following error. I am not sure what is going on here; it looks like some dependency issue, but I could not find a solution yet. Any help is appreciated. MyJar.jar is an uber-jar that I created with Maven.
spark-submit --class com.myProgram.MyMain --master local[2] --jars libs/spark-bigquery-latest_2.12.jar target/MyJar.jar

Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;CLjava/lang/Object;)V
at io.grpc.Metadata$Key.validateName(Metadata.java:754)
at io.grpc.Metadata$Key.<init>(Metadata.java:762)
at io.grpc.Metadata$Key.<init>(Metadata.java:671)
at io.grpc.Metadata$AsciiKey.<init>(Metadata.java:971)
at io.grpc.Metadata$AsciiKey.<init>(Metadata.java:966)
at io.grpc.Metadata$Key.of(Metadata.java:708)
at io.grpc.Metadata$Key.of(Metadata.java:704)
at com.google.api.gax.grpc.GrpcHeaderInterceptor.<init>(GrpcHeaderInterceptor.java:60)
at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:321)
at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1900(InstantiatingGrpcChannelProvider.java:82)
at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:240)
at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:250)
at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:228)
at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:205)
at com.google.cloud.secretmanager.v1.stub.GrpcSecretManagerServiceStub.create(GrpcSecretManagerServiceStub.java:248)
at com.google.cloud.secretmanager.v1.stub.SecretManagerServiceStubSettings.createStub(SecretManagerServiceStubSettings.java:342)
at com.google.cloud.secretmanager.v1.SecretManagerServiceClient.<init>(SecretManagerServiceClient.java:152)
at com.google.cloud.secretmanager.v1.SecretManagerServiceClient.create(SecretManagerServiceClient.java:133)
at com.google.cloud.secretmanager.v1.SecretManagerServiceClient.create(SecretManagerServiceClient.java:124)
at com.myProgram.shared.AccessSecretVersion.getPrivateKey(AccessSecretVersion.java:12)
at com.myProgram.MyMain.main(MyMain.java:33)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:951)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1039)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1048)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
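For reference, getPrivateKey in AccessSecretVersion follows the documented Secret Manager client usage, roughly like the sketch below (the project, secret, and version IDs are placeholders, not the real values):

import com.google.cloud.secretmanager.v1.AccessSecretVersionResponse;
import com.google.cloud.secretmanager.v1.SecretManagerServiceClient;
import com.google.cloud.secretmanager.v1.SecretVersionName;

public class AccessSecretVersion {
    // Sketch only: "my-project", "my-secret" and "latest" stand in for the real IDs.
    public static String getPrivateKey() throws Exception {
        try (SecretManagerServiceClient client = SecretManagerServiceClient.create()) {
            SecretVersionName name = SecretVersionName.of("my-project", "my-secret", "latest");
            AccessSecretVersionResponse response = client.accessSecretVersion(name);
            return response.getPayload().getData().toStringUtf8();
        }
    }
}

The NoSuchMethodError is thrown inside this create() call on a Guava Preconditions.checkArgument overload, which is consistent with an older Guava on the runtime classpath shadowing the newer one the client libraries expect.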

Related

Vertx run results in resource not found error

I am trying to run a Vert.x server with DynamoDB and Lombok annotations, and I am unable to get it to work properly. I have added the .jar files for Vert.x, DynamoDB, and Lombok to my classpath. Everything compiles fine in IntelliJ, but I cannot get it to run properly from my terminal.
I get:
Users-MacBook-Pro:Server User$ vertx run Server.java
java.lang.RuntimeException: Resource not found: Server.java
at io.vertx.core.impl.verticle.CompilingClassLoader.<init>(CompilingClassLoader.java:73)
at io.vertx.core.impl.JavaVerticleFactory.createVerticle(JavaVerticleFactory.java:38)
at io.vertx.core.impl.DeploymentManager.createVerticles(DeploymentManager.java:184)
at io.vertx.core.impl.DeploymentManager.lambda$doDeployVerticle$2(DeploymentManager.java:157)
at io.vertx.core.impl.FutureImpl.checkCallHandler(FutureImpl.java:158)
at io.vertx.core.impl.FutureImpl.setHandler(FutureImpl.java:100)
at io.vertx.core.impl.DeploymentManager.doDeployVerticle(DeploymentManager.java:130)
at io.vertx.core.impl.DeploymentManager.doDeployVerticle(DeploymentManager.java:102)
at io.vertx.core.impl.DeploymentManager.deployVerticle(DeploymentManager.java:90)
at io.vertx.core.impl.VertxImpl.deployVerticle(VertxImpl.java:574)
at io.vertx.core.impl.launcher.commands.VertxIsolatedDeployer.deploy(VertxIsolatedDeployer.java:46)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at io.vertx.core.impl.launcher.commands.ClasspathHandler.deploy(ClasspathHandler.java:160)
at io.vertx.core.impl.launcher.commands.RunCommand.deploy(RunCommand.java:389)
at io.vertx.core.impl.launcher.commands.RunCommand.run(RunCommand.java:262)
at io.vertx.core.impl.launcher.VertxCommandLauncher.execute(VertxCommandLauncher.java:230)
at io.vertx.core.impl.launcher.VertxCommandLauncher.dispatch(VertxCommandLauncher.java:365)
at io.vertx.core.impl.launcher.VertxCommandLauncher.dispatch(VertxCommandLauncher.java:328)
at io.vertx.core.Launcher.main(Launcher.java:49)
Failed in deploying verticle
java.lang.RuntimeException: Resource not found: Server.java
(the same stack trace is printed twice more)
Would appreciate some help with this!
This error simply tells you that the Vert.x CLI could not find the Server.java file in the current working directory.
Given that you use Lombok (which modifies code at compile time), I would recommend running your verticle after compiling it with your IDE or build tool.
There are several possibilities for starting a verticle; one of them is sketched below.
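One possibility, sketched here on the assumption that Server is (or can be made) a subclass of AbstractVerticle, is to compile first and then deploy the verticle from a plain main method rather than handing vertx run a .java source file:

import io.vertx.core.AbstractVerticle;
import io.vertx.core.Vertx;

// Hypothetical sketch: Server stands in for your own verticle class.
public class Server extends AbstractVerticle {
    @Override
    public void start() {
        vertx.createHttpServer()
             .requestHandler(req -> req.response().end("ok"))
             .listen(8080);
    }

    // Deploy the already-compiled verticle; Lombok and the rest of the
    // compile-time processing have already run by this point.
    public static void main(String[] args) {
        Vertx.vertx().deployVerticle(new Server());
    }
}

Once it is compiled by your IDE or build tool, you can start it with a plain java command and the right classpath, or package it as a fat jar as described in the next answer.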
I was able to fix this issue by creating a fat jar module out of my project. I followed this for help: http://vertx.io/blog/my-first-vert-x-3-application/

Access to HDFS Oozie java action with Kerberos

I have developed a Java application to connect to an LDAP server and get the details in CSV format based on user arguments. The resulting CSV file is saved in HDFS (Hadoop Distributed File System).
In order to write to HDFS, I have imported org.apache.hadoop.security.UserGroupInformation and set the Kerberos configuration properly. Below is the code snippet:
config.set("hadoop.security.authentication","Kerberos");
UserGroupInformation.setConfiguration(config);
UserGroupInformation.loginUserFromKeytab(Principal,KeyTabfile);
We kept the keytab file on the edge server of the POC environment. When I run the Java application from the edge server, it reads the keytab file, runs fine, and writes the results to HDFS.
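For context, the working edge-server flow looks roughly like the sketch below; the principal, keytab path, and output path are placeholders, not the values used in the actual job:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class LdapToHdfsSketch {
    public static void main(String[] args) throws Exception {
        // Log in from the keytab before touching HDFS.
        Configuration config = new Configuration();
        config.set("hadoop.security.authentication", "Kerberos");
        UserGroupInformation.setConfiguration(config);
        UserGroupInformation.loginUserFromKeytab("svc-ldap@EXAMPLE.COM",
                "/home/svc-ldap/svc-ldap.keytab");

        // Write the CSV result to HDFS with the logged-in credentials.
        try (FileSystem fs = FileSystem.get(config);
             FSDataOutputStream out = fs.create(new Path("/data/ldap/result.csv"))) {
            out.writeBytes("uid,cn,mail\n");
        }
    }
}

This works on the edge server because the keytab path passed as the second argument is a local file there.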
But my issue starts when I try to schedule this application using Oozie. Oozie launches Java actions on any of the data nodes in the cluster based on resource availability, and those nodes cannot access the edge server. Because of this, my Java action in Oozie fails with a security exception, since it cannot read the keytab file on the edge server.
Below are the exception details:
java.io.IOException: Login failure for hdfs://namenode:8020 from keytab xxxxx#zz.yy.COM: javax.security.auth.login.LoginException: java.lang.IllegalArgumentException: Empty nameString not allowed
at sun.security.krb5.PrincipalName.validateNameStrings(PrincipalName.java:171)
at sun.security.krb5.PrincipalName.<init>(PrincipalName.java:393)
at sun.security.krb5.PrincipalName.<init>(PrincipalName.java:460)
at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:650)
at com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:617)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at javax.security.auth.login.LoginContext.invoke(LoginContext.java:755)
at javax.security.auth.login.LoginContext.access$000(LoginContext.java:195)
at javax.security.auth.login.LoginContext$4.run(LoginContext.java:682)
at javax.security.auth.login.LoginContext$4.run(LoginContext.java:680)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680)
at javax.security.auth.login.LoginContext.login(LoginContext.java:587)
at org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytab(UserGroupInformation.java:967)
at RunLdap_Utility.ldapLookupLoop(RunLdap_Utility.java:142)
at RunLdap_Utility.main(RunLdap_Utility.java:72)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.oozie.action.hadoop.JavaMain.run(JavaMain.java:56)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
at org.apache.oozie.action.hadoop.JavaMain.main(JavaMain.java:35)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:241)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Kindly suggest a solution for this issue.

NoClassDefFound Exception Spark Streaming

I am trying to run the sample JavaStatefulNetworkWordCount algorithm provided by the Apache Spark examples, but I am experiencing a problem when I try to run the program using spark-submit; I get the following exception:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/StateSpec
at JavaStatefulNetworkWordCount.main(JavaStatefulNetworkWordCount.java:109)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:674)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.StateSpec
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
I have imported the StateSpec classes and the code is the same as the one provided here: https://github.com/apache/spark/blob/master/examples/src/main/java/org/apache/spark/examples/streaming/JavaStatefulNetworkWordCount.java
I would appreciate any help in understanding why this problem arises and how I can fix it.
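For reference, the part of the linked example that depends on StateSpec (the class the loader cannot find) is a mapping function along these lines; the sketch below mirrors the upstream example rather than anything specific to my setup, with the DStream wiring shown only in a comment:

import org.apache.spark.api.java.Optional;
import org.apache.spark.api.java.function.Function3;
import org.apache.spark.streaming.State;
import org.apache.spark.streaming.StateSpec;
import scala.Tuple2;

public class StatefulWordCountSketch {
    // Adds each incoming count for a word to the running total kept in Spark's state store.
    static final Function3<String, Optional<Integer>, State<Integer>, Tuple2<String, Integer>> MAPPING_FUNC =
        (word, one, state) -> {
            int sum = one.orElse(0) + (state.exists() ? state.get() : 0);
            state.update(sum);
            return new Tuple2<>(word, sum);
        };

    // The example then calls something like wordsDstream.mapWithState(SPEC), so
    // spark-streaming, which provides StateSpec, must be on the runtime classpath,
    // either bundled into the application jar or passed via --jars.
    static final StateSpec<String, Integer, Integer, Tuple2<String, Integer>> SPEC =
        StateSpec.function(MAPPING_FUNC);
}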
You should probably add the actual submit command as well.
I highly suspect that you are either missing the --class parameter altogether when submitting the Spark jar, or you did not use the correct fully qualified class name in it.
I tried using the following command:
spark-submit --class JavaStatefulNetworkWordCount --master local[2] target/SparkStreaming.jar localhost 2222

BSONFileInputFormat not found even after adding libs to hadoop folder

I was working on the movie recommendations example using crcsmnky's repository: https://github.com/crcsmnky/mongodb-spark-demo
I have compiled mongo-hadoop and mongo-java-driver and stored the jars mongo-hadoop-core-1.3.2-SNAPSHOT.jar and mongo-java-driver-2.13.3.jar in the $HADOOP_HOME/lib folder.
After doing all this, I built the project and ran it as per the given instructions on the README file.
I get the error:
Exception in thread "main" java.lang.NoClassDefFoundError: com/mongodb/hadoop/BSONFileInputFormat
at com.mongodb.spark.demo.Recommender.main(Recommender.java:59)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.mongodb.hadoop.BSONFileInputFormat
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
What could have possibly gone wrong? I followed all instructions correctly.
I had the exact same problem and it took me forever to solve. Try this:
Locate your mongo-hadoop-core-1.4.1-SNAPSHOT.jar and mongo-java-driver-2.12.3.jar.
Add them to --jars in the spark-submit command before your --master option and the application jar location. This is the crucial step: if you put --jars after those two, you will keep getting the BSONFileInputFormat exception. So effectively your spark-submit command would be:
./bin/spark-submit --class "com.mongodb.spark.demo.Recommender" --jars /home/killshot/Downloads/mongo-hadoop/core/build/libs/mongo-hadoop-core-1.4.1-SNAPSHOT.jar,/home/killshot/Downloads/mongo-hadoop/work/mongodb-spark-demo/target/lib/mongo-java-driver-2.12.3.jar --master local[4]

Issues with accumulo 1.4.3 helloworld example on CDH4.3 QuickStart

I am trying to put together an Accumulo/Cloudera quickstart. Accumulo is running, but I am having problems attempting to execute the samples, namely helloworld.
It appears it is finding Hadoop but not Accumulo classes?
Following are the execution and error messages. I appreciate your assistance!
./bin/accumulo org.apache.accumulo.examples.simple.helloworld.InsertWithOutputFormat "instance" localhost:2181 "username" "password" hellotable
Thread "org.apache.accumulo.examples.simple.helloworld.InsertWithOutputFormat" died nulljava.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.accumulo.start.Main$1.run(Main.java:89)
at java.lang.Thread.run(Thread.java:662)
Caused by: java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
at org.apache.accumulo.core.util.ContextFactory.createTaskAttemptContext(ContextFactory.java:131)
at org.apache.accumulo.examples.simple.helloworld.InsertWithOutputFormat.run(InsertWithOutputFormat.java:56)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.accumulo.examples.simple.helloworld.InsertWithOutputFormat.main(InsertWithOutputFormat.java:76)
... 6 more
Problem solved by:
1) Using the CDH4 accumulo-1.4.3 tar
2) Adding additional users
3) Ensuring write permissions on the supporting jars
