Spark-Scala-IntelliJ: java.lang.IllegalStateException after installing the macOS Big Sur update

I am working on Spark Scala in the IntelliJ IDE. I recently installed Scala and Spark locally, and there was also a macOS system update, so I am not sure what broke things.
I now get an error when I try to build my project, which was working fine a day earlier.
I checked JRE vs. JDK as suggested in other answers, and I am sure my project points to JDK 1.8. I also removed Scala and Spark from the machine to bring it back to its earlier state, but I still get the error. I have checked existing answers for the same error, with no luck.
[INFO] --- maven-surefire-plugin:2.7:test (default-test) @ dotcom-jobs ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- scalatest-maven-plugin:2.0.0:test (small-tests) @ dotcom-jobs ---
*** RUN ABORTED ***
java.lang.IllegalStateException: Could not initialize plugin: interface org.mockito.plugins.MockMaker (alternate: null)
at org.mockito.internal.configuration.plugins.PluginLoader$1.invoke(PluginLoader.java:74)
at com.sun.proxy.$Proxy2.isTypeMockable(Unknown Source)
at org.mockito.internal.util.MockUtil.typeMockabilityOf(MockUtil.java:29)
at org.mockito.internal.util.MockCreationValidator.validateType(MockCreationValidator.java:22)
at org.mockito.internal.creation.MockSettingsImpl.validatedSettings(MockSettingsImpl.java:241)
at org.mockito.internal.creation.MockSettingsImpl.build(MockSettingsImpl.java:229)
at org.mockito.internal.MockitoCore.mock(MockitoCore.java:62)
at org.mockito.Mockito.spy(Mockito.java:1992)
at com.homelabs.sc.rbac.utils.ConfigHelper$class.configUtil(ConfigHelper.scala:26)
at com.homelabs.sc.rbac.base.BaseSmallTest.configUtil$lzycompute(BaseSmallTest.scala:7)
...
Cause: java.lang.IllegalStateException: Failed to load interface org.mockito.plugins.MockMaker implementation declared in sun.misc.CompoundEnumeration@517d4a0d
at org.mockito.internal.configuration.plugins.PluginInitializer.loadImpl(PluginInitializer.java:54)
at org.mockito.internal.configuration.plugins.PluginLoader.loadPlugin(PluginLoader.java:57)
at org.mockito.internal.configuration.plugins.PluginLoader.loadPlugin(PluginLoader.java:44)
at org.mockito.internal.configuration.plugins.PluginRegistry.<init>(PluginRegistry.java:22)
at org.mockito.internal.configuration.plugins.Plugins.<clinit>(Plugins.java:19)
at org.mockito.internal.util.MockUtil.<clinit>(MockUtil.java:24)
at org.mockito.internal.util.MockCreationValidator.validateType(MockCreationValidator.java:22)
at org.mockito.internal.creation.MockSettingsImpl.validatedSettings(MockSettingsImpl.java:241)
at org.mockito.internal.creation.MockSettingsImpl.build(MockSettingsImpl.java:229)
at org.mockito.internal.MockitoCore.mock(MockitoCore.java:62)
...
Cause: org.mockito.exceptions.base.MockitoInitializationException: Could not initialize inline Byte Buddy mock maker. (This mock maker is not supported on Android.)
Are you running a JRE instead of a JDK? The inline mock maker needs to be run on a JDK.
Java : 1.8
JVM vendor name : Oracle Corporation
JVM vendor version : 25.221-b11
JVM name : Java HotSpot(TM) 64-Bit Server VM
JVM version : 1.8.0_221-b11
JVM info : mixed mode
OS name : Mac OS X
OS version : 10.16
at org.mockito.internal.creation.bytebuddy.InlineByteBuddyMockMaker.<init>(InlineByteBuddyMockMaker.java:170)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)
at org.mockito.internal.configuration.plugins.PluginInitializer.loadImpl(PluginInitializer.java:49)
at org.mockito.internal.configuration.plugins.PluginLoader.loadPlugin(PluginLoader.java:57)
at org.mockito.internal.configuration.plugins.PluginLoader.loadPlugin(PluginLoader.java:44)
at org.mockito.internal.configuration.plugins.PluginRegistry.<init>(PluginRegistry.java:22)
...
Cause: java.lang.IllegalStateException: No compatible attachment provider is available
at net.bytebuddy.agent.ByteBuddyAgent.install(ByteBuddyAgent.java:597)
at net.bytebuddy.agent.ByteBuddyAgent.install(ByteBuddyAgent.java:581)
at net.bytebuddy.agent.ByteBuddyAgent.install(ByteBuddyAgent.java:533)
at net.bytebuddy.agent.ByteBuddyAgent.install(ByteBuddyAgent.java:510)
at org.mockito.internal.creation.bytebuddy.InlineByteBuddyMockMaker.<clinit>(InlineByteBuddyMockMaker.java:104)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)
...

The problem was due to the macOS Big Sur update, not (as I had assumed) to installing Scala.
I solved it by following this answer on the Apple developer forum: https://developer.apple.com/forums/thread/666681
sudo rm -fr /Library/Internet\ Plug-Ins/JavaAppletPlugin.plugin
sudo rm -fr /Library/PreferencePanes/JavaControlPanel.prefpane
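
For anyone comparing notes: the Oracle JRE on macOS lives inside /Library/Internet Plug-Ins/JavaAppletPlugin.plugin, and if the build resolves that runtime instead of a full JDK, ByteBuddy has no attachment provider, which matches the "Are you running a JRE instead of a JDK?" hint above. A quick way to check which installation a build is really using; a minimal diagnostic sketch, nothing in it is project-specific:

public class WhichJava {
    public static void main(String[] args) {
        // Print the runtime this JVM resolved to; for Mockito's inline mock
        // maker this should be a JDK home, not the JRE inside the applet plugin.
        System.out.println("java.home    = " + System.getProperty("java.home"));
        System.out.println("java.version = " + System.getProperty("java.version"));
        System.out.println("JAVA_HOME    = " + System.getenv("JAVA_HOME"));
    }
}

If java.home points under /Library/Internet Plug-Ins, the two rm commands above remove the stray runtime so the real JDK is picked up again.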

Related

Smiles2Monomers build fails

I am trying to use the Java software package https://github.com/yoann-dufresne/Smiles2Monomers on a Windows 10 computer. The installation worked fine until I ran the ant preCompute command, which returns:
C:\Users\elabi\Smiles2Monomers>ant preCompute
Buildfile: C:\Users\elabi\Smiles2Monomers\build.xml
preCompute:
[java] JVM args ignored when same JVM is used.
BUILD FAILED
C:\Users\elabi\Smiles2Monomers\build.xml:81: java.lang.UnsupportedOperationException: The Security Manager is deprecated and will be removed in a future release
at java.base/java.lang.System.setSecurityManager(System.java:425)
at org.apache.tools.ant.types.Permissions.setSecurityManager(Permissions.java:103)
at org.apache.tools.ant.taskdefs.ExecuteJava.run(ExecuteJava.java:216)
at org.apache.tools.ant.taskdefs.ExecuteJava.execute(ExecuteJava.java:155)
at org.apache.tools.ant.taskdefs.Java.run(Java.java:891)
at org.apache.tools.ant.taskdefs.Java.executeJava(Java.java:231)
at org.apache.tools.ant.taskdefs.Java.executeJava(Java.java:135)
at org.apache.tools.ant.taskdefs.Java.execute(Java.java:108)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:299)
at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:104)
at java.base/java.lang.reflect.Method.invoke(Method.java:578)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:99)
at org.apache.tools.ant.Task.perform(Task.java:350)
at org.apache.tools.ant.Target.execute(Target.java:449)
at org.apache.tools.ant.Target.performTasks(Target.java:470)
at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1401)
at org.apache.tools.ant.Project.executeTarget(Project.java:1374)
at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:41)
at org.apache.tools.ant.Project.executeTargets(Project.java:1264)
at org.apache.tools.ant.Main.runBuild(Main.java:818)
at org.apache.tools.ant.Main.startAnt(Main.java:223)
at org.apache.tools.ant.launch.Launcher.run(Launcher.java:284)
at org.apache.tools.ant.launch.Launcher.main(Launcher.java:101)
Total time: 0 seconds
I posted an issue on their GitHub, but I doubt there will be a solution since the package has not been updated in 5 years; if they answer, I will also post the solution here. Does anybody know why the build fails and how I could still get it running?
I have Java version 8 (build 1.8.0_351-b10), jdk-19, and jre1.8.0_351.
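
Not from the package authors, but the trace itself points at the cause: Ant's <java> task with nested permissions calls System.setSecurityManager, and on JDK 18 and newer that method throws unless the JVM was started with -Djava.security.manager=allow (the Security Manager was deprecated for removal by JEP 411). A minimal probe showing the behavior; this is my own sketch, not code from the Smiles2Monomers build:

public class SecurityManagerProbe {
    public static void main(String[] args) {
        try {
            // Succeeds on JDK 17 and older, or on 18+ started with
            // -Djava.security.manager=allow; throws otherwise, which is the
            // exact exception the Ant build above dies with.
            System.setSecurityManager(new SecurityManager());
            System.out.println("SecurityManager installed");
        } catch (UnsupportedOperationException e) {
            System.out.println("Rejected: " + e.getMessage());
        }
    }
}

So running Ant under the Java 8 installation instead of jdk-19, or starting it with ANT_OPTS=-Djava.security.manager=allow, should get past this line.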

jna4412392371053342294.dll: Can't find dependent libraries

While I was installing Cassandra I encountered this error:
INFO [main] 2021-05-02 00:16:18,144 DatabaseDescriptor.java:775 - Back-pressure is disabled with strategy org.apache.cassandra.net.RateBasedBackPressure{high_ratio=0.9, factor=5, flow=FAST}.
Exception (java.lang.UnsatisfiedLinkError) encountered during startup: C:\Users\ASUS\AppData\Local\Temp\jna-2018896\jna4412392371053342294.dll: Can't find dependent libraries
java.lang.UnsatisfiedLinkError: C:\Users\ASUS\AppData\Local\Temp\jna-2018896\jna4412392371053342294.dll: Can't find dependent libraries
at java.lang.ClassLoader$NativeLibrary.load(Native Method)
at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1934)
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1817)
at java.lang.Runtime.load0(Runtime.java:810)
at java.lang.System.load(System.java:1088)
at com.sun.jna.Native.loadNativeDispatchLibraryFromClasspath(Native.java:851)
at com.sun.jna.Native.loadNativeDispatchLibrary(Native.java:826)
at com.sun.jna.Native.<clinit>(Native.java:140)
at org.apache.cassandra.utils.WindowsTimer.<clinit>(WindowsTimer.java:35)
at org.apache.cassandra.service.CassandraDaemon.activate(CassandraDaemon.java:630)
at org.apache.cassandra.service.CassandraDaemon.main(CassandraDaemon.java:786)
ERROR [main] 2021-05-02 00:16:18,258 CassandraDaemon.java:803 - Exception encountered during startup
java.lang.UnsatisfiedLinkError: C:\Users\ASUS\AppData\Local\Temp\jna-2018896\jna4412392371053342294.dll: Can't find dependent libraries
at java.lang.ClassLoader$NativeLibrary.load(Native Method) ~[na:1.8.0_282]
at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1934) ~[na:1.8.0_282]
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1817) ~[na:1.8.0_282]
at java.lang.Runtime.load0(Runtime.java:810) ~[na:1.8.0_282]
at java.lang.System.load(System.java:1088) ~[na:1.8.0_282]
at com.sun.jna.Native.loadNativeDispatchLibraryFromClasspath(Native.java:851) ~[jna-4.2.2.jar:4.2.2 (b0)]
at com.sun.jna.Native.loadNativeDispatchLibrary(Native.java:826) ~[jna-4.2.2.jar:4.2.2 (b0)]
at com.sun.jna.Native.<clinit>(Native.java:140) ~[jna-4.2.2.jar:4.2.2 (b0)]
at org.apache.cassandra.utils.WindowsTimer.<clinit>(WindowsTimer.java:35) ~[apache-cassandra-3.11.10.jar:3.11.10]
at org.apache.cassandra.service.CassandraDaemon.activate(CassandraDaemon.java:630) [apache-cassandra-3.11.10.jar:3.11.10]
at org.apache.cassandra.service.CassandraDaemon.main(CassandraDaemon.java:786) [apache-cassandra-3.11.10.jar:3.11.10]
cd C:\Users\ASUS\Downloads\apache-cassandra-3.11.10-bin\apache-cassandra-3.11.10
Java 1.8 and Python 2.7 are installed.
How can I solve this problem, please?
So I was able to find a conversation about how to deal with this on GitHub:
https://github.com/MarkusBernhardt/proxy-vole/issues/35
Basically, older versions of the JNA DLL are dynamically linked to msvcr100.dll. To get around this issue, the latest version of the JNA libraries should be installed (it looks like this was fixed in JNA 4.3+).
Also, running Apache Cassandra on Windows can be difficult and fraught with strange errors (as you are seeing). I highly recommend running it on Linux. If you must use Windows as your base OS, VirtualBox or Docker can help.
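
If you want to confirm which JNA version Cassandra actually picks up before and after swapping jars, a few lines against the same classpath will do; a sketch, assuming the JNA jar from Cassandra's lib directory is on the classpath:

public class JnaVersionCheck {
    public static void main(String[] args) {
        // Read the version from the jar manifest at runtime (this may print
        // null if the manifest omits it). Per the linked proxy-vole issue,
        // versions before 4.3 link the Windows dispatch DLL against msvcr100.dll.
        Package p = com.sun.jna.Native.class.getPackage();
        System.out.println("JNA jar version: " + p.getImplementationVersion());
    }
}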
I resolved this problem this way:
Watch the temp directory and quickly copy the DLL while it is there, before it gets deleted (in the file's properties you can see its natural name).
dumpbin /dependents {name}.dll shows the DLLs it depends on.
Step by step, find the problematic DLL in the Windows directory and copy it to {JDK_HOME}/bin. In my case it was msvcr100.dll, which exists in several versions (x86, x64); on my first try the error only changed its look, so I chose the other version.

How to fix "Could not initialize class org.apache.ignite.IgniteJdbcThinDriver" error in Apache Ignite?

The Problem
I'm trying to connect to an Apache Ignite server with Apache Ignite's built-in tool, SQLLine. I get the error: java.lang.NoClassDefFoundError: Could not initialize class org.apache.ignite.IgniteJdbcThinDriver
Background
I have Apache Ignite running in one container and CentOS 7 running in another. Both containers run on the same network (pinging works both ways). The connection attempt goes from CentOS 7 to Apache Ignite.
Apache Ignite seems to be running fine with just the default configuration. In the CentOS 7 container, I have installed Oracle JDK 12.0.1 and I have the Apache Ignite 2.7.0 binary files in a folder. I have also set the IGNITE_HOME environment variable.
Here (https://apacheignite-sql.readme.io/docs/sqlline) it says I can connect to my cluster with just: ./sqlline.sh --verbose=true -u jdbc:ignite:thin://127.0.0.1/. However, this throws the previously mentioned error.
SQLLine should come with the Ignite JDBC drivers. I have also tried downloading them manually (https://apacheignite-sql.readme.io/docs/jdbc-driver#section-multiple-endpoints). When I downloaded the driver, which is said to be ignite-core-{version}.jar, I put it in the same folder as the sqlline.jar files.
Output
[root@bc72c4fbf47e bin]# ./sqlline.sh
sqlline version 1.3.0
sqlline> !connect jdbc:ignite:thin://172.19.0.2/
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.ignite.internal.util.GridUnsafe$2 (file:/var/tmp/apache-ignite/libs/ignite-core-2.7.0.jar) to field java.nio.Buffer.address
WARNING: Please consider reporting this to the maintainers of org.apache.ignite.internal.util.GridUnsafe$2
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
java.lang.NoClassDefFoundError: Could not initialize class org.apache.ignite.IgniteJdbcThinDriver
at java.base/java.lang.Class.forName0(Native Method)
at java.base/java.lang.Class.forName(Class.java:415)
at java.sql/java.sql.DriverManager.isDriverAllowed(DriverManager.java:555)
at java.sql/java.sql.DriverManager.isDriverAllowed(DriverManager.java:547)
at java.sql/java.sql.DriverManager.getDrivers(DriverManager.java:449)
at java.sql/java.sql.DriverManager.getDrivers(DriverManager.java:426)
at sqlline.SqlLine.findRegisteredDriver(SqlLine.java:1568)
at sqlline.SqlLine.scanForDriver(SqlLine.java:1542)
at sqlline.Commands.connect(Commands.java:1074)
at sqlline.Commands.connect(Commands.java:1001)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:567)
at sqlline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:38)
at sqlline.SqlLine.dispatch(SqlLine.java:791)
at sqlline.SqlLine.begin(SqlLine.java:668)
at sqlline.SqlLine.start(SqlLine.java:373)
at sqlline.SqlLine.main(SqlLine.java:265)
Conclusion
I should be able to connect to my Ignite server with the !connect jdbc:ignite:thin://172.19.0.2/ command in sqlline.
This does not work, and throws Could not initialize class org.apache.ignite.IgniteJdbcThinDriver.
IgniteJdbcThinDriver is installed/available.
Adding JVM argument
--add-opens java.base/java.nio=ALL-UNNAMED
solved the problem for me.
It is recommended to use Java 8, as Apache Ignite 2.7.0 does not have full Java 12 support. Otherwise you can try tinkering with JVM options.
Thank you @alamar, that worked!
I uninstalled the JDK 12 that I had installed with RPM.
Check the package name:
rpm -qa | grep jdk.
Delete the package:
rpm -e jdk-12.0.1-12.0.1-ga.x86_64.
I'm working on an isolated system, so I downloaded the JDK 8 RPM on another machine and transferred it over.
Install JDK8:
rpm -ihv jdk-8u211-linux-x64.rpm.
Now when I run:
./sqlline.sh --verbose=true -u jdbc:ignite:thin://172.19.0.2,
I get:
issuing: !connect jdbc:ignite:thin://172.19.0.2/ '' '' org.apache.ignite.IgniteJdbcTh
Connecting to jdbc:ignite:thin://172.19.0.2/
Connected to: Apache Ignite (version 2.7.0#20181130-sha1:256ae401)
Driver: Apache Ignite Thin JDBC Driver (version 2.7.0#20181130-sha1:256ae401)
Autocommit status: true
Transaction isolation: TRANSACTION_REPEATABLE_READ
sqlline version 1.3.0.
I can now query my database.
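
For completeness, you can also take sqlline out of the picture and exercise the thin driver with plain JDBC; a minimal sketch, assuming ignite-core-2.7.0.jar (which contains the driver) is on the classpath and the server listens on the default thin port:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class IgniteThinCheck {
    public static void main(String[] args) throws Exception {
        // Explicitly load the same class sqlline failed to initialize above.
        Class.forName("org.apache.ignite.IgniteJdbcThinDriver");
        try (Connection conn = DriverManager.getConnection("jdbc:ignite:thin://172.19.0.2/");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT 1")) {
            while (rs.next()) {
                System.out.println("Server answered: " + rs.getLong(1));
            }
        }
    }
}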
I got this error in the following very strange situation:
A dependency of my Maven Java project had Apache Ignite as a dependency. The moment I opened a connection to a SQLite database at an invalid path (e.g. ./result/r.sqlite where the folder ./result does not exist), this error occurred (using the driver org.xerial.sqlite-jdbc:v3.30.1). So this call
Class.forName("org.sqlite.JDBC");
Connection conn = DriverManager.getConnection("jdbc:sqlite:result.sqlite");
would result in this stacktrace:
Exception in thread "main" java.lang.NoClassDefFoundError: Could not initialize class org.apache.ignite.IgniteJdbcThinDriver
at java.base/java.lang.Class.forName0(Native Method)
at java.base/java.lang.Class.forName(Class.java:398)
at java.sql/java.sql.DriverManager.isDriverAllowed(DriverManager.java:555)
at java.sql/java.sql.DriverManager.getConnection(DriverManager.java:674)
at java.sql/java.sql.DriverManager.getConnection(DriverManager.java:251)
at MyClass.myMethod(MyClass.java:154)
So the solution is obviously to make sure the folder in which the SQLite database is to be created exists.
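
In code, a small guard before opening the connection avoids the misleading stack trace entirely; a sketch using the hypothetical ./result/r.sqlite path from the example above:

import java.io.File;
import java.sql.Connection;
import java.sql.DriverManager;

public class SqliteOpen {
    public static void main(String[] args) throws Exception {
        File db = new File("result/r.sqlite");
        File parent = db.getParentFile();
        // Create the missing ./result directory up front, so a bad path cannot
        // surface later as an unrelated NoClassDefFoundError from DriverManager.
        if (parent != null && !parent.isDirectory() && !parent.mkdirs()) {
            throw new IllegalStateException("Could not create directory " + parent);
        }
        Class.forName("org.sqlite.JDBC");
        try (Connection conn = DriverManager.getConnection("jdbc:sqlite:" + db.getPath())) {
            System.out.println("Opened " + db.getPath());
        }
    }
}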

Cannot setup Apache Spark 2.1.1 on Windows 10

I have installed Apache Spark 2.1.1 on Windows 10, with Java 1.8 and Python 3.6 (Anaconda 4.3.1). I have also downloaded winutils.exe, set up the environment variables JAVA_HOME, HADOOP_HOME and SPARK_HOME, and updated the path variable. I have also run winutils.exe chmod -R 777 \tmp\hive. But I get the error below when running pyspark at the cmd prompt.
Please can someone help? Let me know if I missed any important detail.
Thanks in advance!
c:\Spark>bin\pyspark
Python 3.6.0 |Anaconda 4.3.1 (64-bit)| (default, Dec 23 2016, 11:57:41) [MSC v.1900 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Traceback (most recent call last):
File "c:\Spark\python\pyspark\sql\utils.py", line 63, in deco
return f(*a, **kw)
File "c:\Spark\python\lib\py4j-0.10.4-src.zip\py4j\protocol.py", line 319, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o22.sessionState.
: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
I still get errors when launching spark-shell, but it looks like Spark launches, since I get the 'Welcome to Spark' banner. The error I get is
C:\Spark>bin\spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/06/23 12:20:15 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Spark/jars/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/bin/../jars/datanucleus-api-jdo-3.2.6.jar."
17/06/23 12:20:15 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Spark/jars/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/bin/../jars/datanucleus-rdbms-3.2.9.jar."
17/06/23 12:20:15 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Spark/bin/../jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/jars/datanucleus-core-3.2.10.jar."
java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
at org.apache.spark.repl.Main$.createSparkSession(Main.scala:96)
... 47 elided
Caused by: java.lang.reflect.InvocationTargetException:
java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
... 58 more
Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:169)
at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
... 63 more
Caused by: java.lang.reflect.InvocationTargetException: java.lang.reflect.InvocationTargetException: java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
... 71 more
Caused by: java.lang.reflect.InvocationTargetException:
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:358)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:262)
at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:66)
... 76 more
Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode(NativeIO.java:524)
at org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode(RawLocalFileSystem.java:478)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:532)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:509)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:305)
at org.apache.hadoop.hive.ql.session.SessionState.createPath(SessionState.java:639)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:561)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:188)
... 84 more
14: error: not found: value spark
import spark.implicits._
^
14: error: not found: value spark
import spark.sql
^
The setup that worked for me is as follows (I didn't use winutils.exe):
Install pyspark and findspark using the "Anaconda Command Prompt":
pip3 install pyspark
and
pip3 install findspark
Since you have already downloaded the Spark setup, unzip it and keep it on the C drive, i.e. C:\spark-2.2.0-bin-hadoop2.7. Create a new environment variable SPARK_HOME, set it to C:\spark-2.2.0-bin-hadoop2.7\bin, and open the path variable in the system variables and add the same there as well.
Now open your command prompt, go from C:\Users\* to C:\ by doing cd.. twice, and run the following command:
set SPARK_HOME='spark-2.2.0-bin-hadoop2.7'
and you are good to go.
Now you just have to point findspark at the Spark location before importing pyspark in your Jupyter notebook. Use the following code:
import findspark
findspark.init(r'C:\spark-2.2.0-bin-hadoop2.7')
import pyspark
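
Separately, the innermost error in the question (UnsatisfiedLinkError on NativeIO$Windows.createDirectoryWithMode0) is commonly reported when hadoop.dll is missing from HADOOP_HOME\bin or is older than the Hadoop classes Spark ships. A small probe to check the missing-DLL case; my own diagnostic sketch, not part of Spark:

public class HadoopNativeCheck {
    public static void main(String[] args) {
        System.out.println("java.library.path = " + System.getProperty("java.library.path"));
        try {
            // On Windows this resolves hadoop.dll via java.library.path/PATH.
            // A successful load does not rule out a version mismatch, but a
            // failure here would explain the UnsatisfiedLinkError above.
            System.loadLibrary("hadoop");
            System.out.println("hadoop native library loaded");
        } catch (UnsatisfiedLinkError e) {
            System.out.println("Could not load hadoop.dll: " + e.getMessage());
        }
    }
}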

Sonar does not download plugins

Does anyone know why Sonar cannot download any plugins? I have a fresh v3.7.4 installation. This is the first time I'm seeing this awesome server behave this way.
Trace:
2014.04.25 05:13:26 WARN o.s.s.p.PluginDownloader Fail to download the plugin (jira, version 1.2) from http://repository.codehaus.org/org/codehaus/sonar-plugins/sonar-jira-plugin/1.2/sonar-jira-plugin-1.2.jar
org.sonar.api.utils.SonarException: Fail to download http://repository.codehaus.org/org/codehaus/sonar-plugins/sonar-jira-plugin/1.2/sonar-jira-plugin-1.2.jar (no proxy)
at org.sonar.api.utils.HttpDownloader.failToDownload(HttpDownloader.java:143) ~[sonar-plugin-api-3.7.4.jar:na]
at org.sonar.api.utils.HttpDownloader.download(HttpDownloader.java:138) ~[sonar-plugin-api-3.7.4.jar:na]
at org.sonar.server.plugins.PluginDownloader.downloadRelease(PluginDownloader.java:126) ~[classes/:na]
at org.sonar.server.plugins.PluginDownloader.download(PluginDownloader.java:105) ~[classes/:na]
at org.sonar.server.ui.JRubyFacade.downloadPlugin(JRubyFacade.java:158) [classes/:na]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.7.0_55]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[na:1.7.0_55]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.7.0_55]
...
Environment:
Debian 7 x86
Java version "1.7.0_55"
Java(TM) SE Runtime Environment (build 1.7.0_55-b13)
Java HotSpot(TM) Server VM (build 24.55-b03, mixed mode)
Internet validation:
root@machine:/opt# wget http://repository.codehaus.org/org/codehaus/sonar-plugins/sonar-jira-plugin/1.2/sonar-jira-plugin-1.2.jar
--2014-04-25 05:21:58-- http://repository.codehaus.org/org/codehaus/sonar-plugins/sonar-jira-plugin/1.2/sonar-jira-plugin-1.2.jar
Resolving repository.codehaus.org (repository.codehaus.org)... 199.193.192.103
Connecting to repository.codehaus.org (repository.codehaus.org)|199.193.192.103|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 1864762 (1.8M) [application/java-archive]
Saving to: `sonar-jira-plugin-1.2.jar'
100%[============================================================================>] 1,864,762 3.78M/s in 0.5s
2014-04-25 05:22:18 (3.78 MB/s) - `sonar-jira-plugin-1.2.jar' saved [1864762/1864762]
Deploy your jar files manually to the repository (mvn deploy:deploy-file), or if you have already done that:
Take a look in your local .m2 repository at the path /org/codehaus/sonar-plugins/sonar-jira-plugin/1.2/.
This path should be empty. If you find files like *lastUpdated there, delete all the content of the directory. The next time you build the project, Maven will download the files.
I assume you have a Nexus, Archiva, or whatever repository locally (writable for you) and are not working only with a settings.xml and .m2.
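
Since wget on the same box succeeds, it may also be worth reproducing the download from inside a JVM, because Sonar's "(no proxy)" failure can come from JVM-level proxy settings rather than the network; a probe using only the JDK, no Sonar code:

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class DownloadProbe {
    public static void main(String[] args) throws Exception {
        // If this fails while wget works, check JVM settings such as
        // http.proxyHost/http.proxyPort instead of the network itself.
        URL url = new URL("http://repository.codehaus.org/org/codehaus/sonar-plugins/sonar-jira-plugin/1.2/sonar-jira-plugin-1.2.jar");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setConnectTimeout(10000);
        conn.setReadTimeout(10000);
        System.out.println("HTTP " + conn.getResponseCode());
        long total = 0;
        try (InputStream in = conn.getInputStream()) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                total += n;
            }
        }
        System.out.println("Downloaded " + total + " bytes");
    }
}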
