Maven Java Compile - java

I'm new to Maven and Java, but while searching for a good EID reader this looked like the best choice (https://github.com/grakic/jfreesteel/).
I ran "mvn install" and built the .jar file; running it results in the following:
The terminal waits for me to insert the card.
When the card is inserted, the terminal recognises it and continues.
And finally it crashes.
This is the full log of what happens:
[main] INFO net.devbase.jfreesteel.nativemessaging.EidWebExtensionApp - Starting web extensions native messaging background app...
[main] INFO net.devbase.jfreesteel.nativemessaging.EidWebExtensionApp - Using terminal factory type PC/SC
[Thread-0] INFO net.devbase.jfreesteel.nativemessaging.EidWebExtensionApp - Card inserted
[Thread-0] INFO net.devbase.jfreesteel.EidCard - exclusive
[Thread-0] INFO net.devbase.jfreesteel.EidCard - exclusive free
[Thread-0] INFO net.devbase.jfreesteel.EidCard - photo exclusive
[Thread-0] INFO net.devbase.jfreesteel.EidCard - photo exclusive free
Exception in thread "Thread-0" java.lang.NoClassDefFoundError: javax/xml/bind/DatatypeConverter
at net.devbase.jfreesteel.Utils.image2Base64String(Utils.java:220)
at net.devbase.jfreesteel.nativemessaging.EidWebExtensionApp.inserted(EidWebExtensionApp.java:261)
at net.devbase.jfreesteel.nativemessaging.EidWebExtensionApp.access$400(EidWebExtensionApp.java:25)
at net.devbase.jfreesteel.nativemessaging.EidWebExtensionApp$2.run(EidWebExtensionApp.java:155)
at java.base/java.lang.Thread.run(Thread.java:830)
Caused by: java.lang.ClassNotFoundException: javax.xml.bind.DatatypeConverter
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:602)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
... 5 more
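The failing frame is Utils.image2Base64String, which uses javax.xml.bind.DatatypeConverter. That class shipped with the JDK through Java 8, but the java.xml.bind module was deprecated (and no longer resolved for classpath code) in Java 9 and removed entirely in Java 11, so on a modern JDK it has to come from a library such as jaxb-api; the duplicate question further down gives the Maven/Gradle coordinates. A minimal sketch to check that the class resolves on your classpath (the jar file name is an assumption):

import javax.xml.bind.DatatypeConverter;

public class JaxbCheck {
    public static void main(String[] args) {
        // Resolves javax.xml.bind.DatatypeConverter at runtime. On Java 11+
        // this only works with a JAXB jar on the classpath, e.g.
        //   javac -cp jaxb-api-2.3.0.jar JaxbCheck.java
        //   java  -cp jaxb-api-2.3.0.jar:. JaxbCheck   (use ; instead of : on Windows)
        System.out.println(DatatypeConverter.printBase64Binary("ok".getBytes()));
    }
}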

Related

Gradle: java.io.IOException: The pipe is being closed

On my CI machine (Windows 10) I run Gradle tests.
Basically, a git pull and a Java compile are performed on the machine before the test task.
This task...
test --tests my.package.SomeTestClass
...failed with this exception:
00:00:32,989 INFO - Successfully started process 'Gradle Test Executor 1'
00:00:35,925 INFO - Error occurred during initialization of VM
00:00:35,927 INFO - Unable to allocate 16384KB bitmaps for parallel garbage collection for the requested 524288KB heap.
00:00:35,927 ERROR - Error: Could not create the Java Virtual Machine.
00:00:35,929 ERROR - Error: A fatal exception has occurred. Program will exit.
00:00:35,930 INFO -
00:00:35,930 ERROR - Could not write standard input to Gradle Test Executor 1.
00:00:35,932 ERROR - java.io.IOException: The pipe is being closed
00:00:35,932 ERROR - at java.io.FileOutputStream.writeBytes(Native Method)
00:00:35,934 ERROR - at java.io.FileOutputStream.write(FileOutputStream.java:326)
00:00:35,934 ERROR - at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
00:00:35,934 ERROR - at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
00:00:35,934 ERROR - at org.gradle.process.internal.streams.ExecOutputHandleRunner.forwardContent(ExecOutputHandleRunner.java:67)
00:00:35,934 ERROR - at org.gradle.process.internal.streams.ExecOutputHandleRunner.run(ExecOutputHandleRunner.java:52)
00:00:35,934 ERROR - at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
00:00:35,934 ERROR - at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
00:00:35,934 ERROR - at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
00:00:35,934 ERROR - at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
00:00:35,934 ERROR - at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
00:00:35,934 ERROR - at java.lang.Thread.run(Thread.java:748)
00:00:35,934 INFO -
00:00:35,935 INFO - > Task :test FAILED
I am using the Gradle wrapper 5.4.1.
My gradle.properties looks like:
org.gradle.daemon=false
org.gradle.jvmargs=-Xmx2g
I experienced the same issue some months back when I enabled the Gradle daemon.
I am monitoring my CI machines, and I can see that at the time the machine had about 750MB of free memory, which is more than the required 512MB (see the exception).
Not sure if this is relevant, but the node is running the QuickBuild agent.
What exactly does this error mean, and how can I get rid of it?
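The "Unable to allocate 16384KB bitmaps for parallel garbage collection" line means the forked test JVM died at startup: the parallel collector could not reserve its GC bitmaps on top of the requested 512MB heap, and the IOException is just Gradle failing to write to the already-dead process's stdin. A JVM needs committable memory well beyond the heap itself (GC structures, metaspace, thread stacks), so 750MB free can still be too little. A minimal build.gradle sketch to try (the property names are standard for the Test task; the values are assumptions to tune):

test {
    maxHeapSize = '256m'          // shrink the forked test JVM's heap
    jvmArgs '-XX:+UseSerialGC'    // skip the parallel-GC bitmap allocation entirely
}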

Why does Apache Spark not work with Java 10? We get an illegal reflective access warning, then java.lang.IllegalArgumentException

Is there any technical reason why Spark 2.3 does not work with Java 10 (as of July 2018)?
Here is the output when I run SparkPi example using spark-submit.
$ ./bin/spark-submit ./examples/src/main/python/pi.py
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
2018-07-13 14:31:30 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2018-07-13 14:31:31 INFO SparkContext:54 - Running Spark version 2.3.1
2018-07-13 14:31:31 INFO SparkContext:54 - Submitted application: PythonPi
2018-07-13 14:31:31 INFO Utils:54 - Successfully started service 'sparkDriver' on port 58681.
2018-07-13 14:31:31 INFO SparkEnv:54 - Registering MapOutputTracker
2018-07-13 14:31:31 INFO SparkEnv:54 - Registering BlockManagerMaster
2018-07-13 14:31:31 INFO BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2018-07-13 14:31:31 INFO BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2018-07-13 14:31:31 INFO DiskBlockManager:54 - Created local directory at /private/var/folders/mp/9hp4l4md4dqgmgyv7g58gbq0ks62rk/T/blockmgr-d24fab4c-c858-4cd8-9b6a-97b02aa630a5
2018-07-13 14:31:31 INFO MemoryStore:54 - MemoryStore started with capacity 434.4 MB
2018-07-13 14:31:31 INFO SparkEnv:54 - Registering OutputCommitCoordinator
...
2018-07-13 14:31:32 INFO StateStoreCoordinatorRef:54 - Registered StateStoreCoordinator endpoint
Traceback (most recent call last):
File "~/Documents/spark-2.3.1-bin-hadoop2.7/./examples/src/main/python/pi.py", line 44, in <module>
count = spark.sparkContext.parallelize(range(1, n + 1), partitions).map(f).reduce(add)
File "~/Documents/spark-2.3.1-bin-hadoop2.7/python/lib/pyspark.zip/pyspark/rdd.py", line 862, in reduce
File "~/Documents/spark-2.3.1-bin-hadoop2.7/python/lib/pyspark.zip/pyspark/rdd.py", line 834, in collect
File "~/Documents/spark-2.3.1-bin-hadoop2.7/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
File "~/Documents/spark-2.3.1-bin-hadoop2.7/python/lib/pyspark.zip/pyspark/sql/utils.py", line 63, in deco
File "~/Documents/spark-2.3.1-bin-hadoop2.7/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.collectAndServe.
: java.lang.IllegalArgumentException
at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:46)
at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:449)
at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:432)
at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:103)
at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:103)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:103)
at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:432)
at org.apache.xbean.asm5.ClassReader.a(Unknown Source)
at org.apache.xbean.asm5.ClassReader.b(Unknown Source)
at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:262)
at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:261)
at scala.collection.immutable.List.foreach(List.scala:381)
at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:261)
at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:159)
at org.apache.spark.SparkContext.clean(SparkContext.scala:2299)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2073)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2099)
at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:939)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
at org.apache.spark.rdd.RDD.collect(RDD.scala:938)
at org.apache.spark.api.python.PythonRDD$.collectAndServe(PythonRDD.scala:162)
at org.apache.spark.api.python.PythonRDD.collectAndServe(PythonRDD.scala)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.base/java.lang.Thread.run(Thread.java:844)
2018-07-13 14:31:33 INFO SparkContext:54 - Invoking stop() from shutdown hook
...
I resolved the issue by switching to Java 8 instead of Java 10, as mentioned here.
The primary technical reason is that Spark depends heavily on direct access to native memory via sun.misc.Unsafe, which was encapsulated behind the module system in Java 9.
https://issues.apache.org/jira/browse/SPARK-24421
http://apache-spark-developers-list.1001551.n3.nabble.com/Java-9-td20875.html
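For what it's worth, the IllegalArgumentException at the top of the trace comes from the ASM 5 bytecode reader that Spark 2.3 bundles (shaded as org.apache.xbean.asm5): ASM 5 only accepts class files up to Java 8 (major version 52), while JDK 10 emits major version 54. A minimal sketch that reproduces the check, assuming plain asm 5.x on the classpath and the class compiled with a JDK newer than 8:

import org.objectweb.asm.ClassReader;
import java.io.InputStream;

public class AsmVersionDemo {
    public static void main(String[] args) throws Exception {
        // Feed this class's own bytecode to ASM 5. If it was compiled by a
        // JDK newer than 8, the ClassReader constructor throws a bare
        // java.lang.IllegalArgumentException, exactly like Spark's
        // ClosureCleaner does in the trace above.
        try (InputStream in =
                AsmVersionDemo.class.getResourceAsStream("/AsmVersionDemo.class")) {
            new ClassReader(in);
            System.out.println("class file version accepted");
        }
    }
}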
Committer here. It's actually a fair bit of work to support Java 9+: SPARK-24417
It's also almost done and should be ready for Spark 3.0, which should run on Java 8 through 11 and beyond.
The goal (well, mine) is to make it work without opening up module access. The key issues include:
sun.misc.Unsafe usage has to be removed or worked around
Changes to the structure of the boot classloader
Scala support for Java 9+
A bunch of dependency updates to work with Java 9+
JAXB no longer automatically available
Spark depends on memory APIs that were changed in JDK 9, so they are no longer available starting with JDK 9. That is the reason for this error.
Please check the issue:
https://issues.apache.org/jira/browse/SPARK-24421

How to fix Exception in thread "main" java.lang.NoClassDefFoundError: javax/xml/bind/DatatypeConverter [duplicate]

This question already has answers here:
How to resolve java.lang.NoClassDefFoundError: javax/xml/bind/JAXBException
(43 answers)
Closed 1 year ago.
I've seen some threads about it, but they didn't really help me...
I have Java JDK 10, running on Windows 10 64-bit. The thing I'm trying to use is a bot; the GitHub link is here.
Here are the logs of the script I'm executing:
C:\Users\administrator\Downloads\vHackOSBot>java -jar vHackOSBot.jar
20:15:09 INFO [UpdateService] Creating UpdateService...
20:15:09 INFO [MiscService] Creating MiscService...
20:15:09 INFO [NetworkingService] Creating NetworkingService...
20:15:09 INFO [MainService] Creating MainService...
20:15:09 INFO [ServerService] Creating ServerService...
20:15:09 INFO [vHackOSBot-Config] Creating ConfigFile...
20:15:09 INFO [vHackOSBot-ConfigAdv] Creating ConfigFile...
20:15:09 WARN [io.sentry.DefaultSentryClientFactory] No 'stacktrace.app.packages' was configured, this option is highly recommended as it affects stacktrace grouping and display on Sentry. See documentation: https://docs.sentry.io/clients/java/config/#in-application-stack-frames
20:15:09 WARN [io.sentry.DefaultSentryClientFactory] No 'stacktrace.app.packages' was configured, this option is highly recommended as it affects stacktrace grouping and display on Sentry. See documentation: https://docs.sentry.io/clients/java/config/#in-application-stack-frames
20:15:10 INFO [vHackOSBot-ConfigAdv] Loading advanced config...
20:15:10 INFO [vHackOSBot-ConfigAdv] Loaded advanced config in 223ms.
20:15:10 INFO [vHackOSBot-ConfigAdv] Saving advanced config...
20:15:10 INFO [vHackOSBot-ConfigAdv] Saved advanced config in 29ms.
20:15:10 INFO [vHackOSBot-Config] Loading config...
20:15:10 INFO [vHackOSBot-Config] Loaded config in 47ms.
20:15:10 INFO [vHackOSBot-Config] Saving config...
20:15:10 INFO [vHackOSBot-Config] Saved config in 2ms.
Exception in thread "main" java.lang.NoClassDefFoundError: javax/xml/bind/DatatypeConverter
at net.olympiccode.vhackos.api.utils.Encryption.md5Hash(Encryption.java:15)
at net.olympiccode.vhackos.api.requests.Route.compile(Route.java:59)
at net.olympiccode.vhackos.api.entities.impl.vHackOSAPIImpl.verifyDetails(vHackOSAPIImpl.java:108)
at net.olympiccode.vhackos.api.entities.impl.vHackOSAPIImpl.login(vHackOSAPIImpl.java:83)
at net.olympiccode.vhackos.api.vHackOSAPIBuilder.buildAsync(vHackOSAPIBuilder.java:92)
at net.olympiccode.vhackos.api.vHackOSAPIBuilder.buildBlocking(vHackOSAPIBuilder.java:104)
at net.olympiccode.vhackos.bot.core.vHackOSBot.run(vHackOSBot.java:107)
at net.olympiccode.vhackos.bot.core.vHackOSBot.main(vHackOSBot.java:51)
Caused by: java.lang.ClassNotFoundException: javax.xml.bind.DatatypeConverter
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(Unknown Source)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(Unknown Source)
at java.base/java.lang.ClassLoader.loadClass(Unknown Source)
... 8 more
20:15:10 INFO [vHackOSBot] Shutting down...
20:15:10 INFO [vHackOSBot-Config] Saving config...
20:15:10 INFO [vHackOSBot-Config] Saved config in 2ms.
20:15:10 INFO [vHackOSBot-ConfigAdv] Saving advanced config...
20:15:10 INFO [vHackOSBot-ConfigAdv] Saved advanced config in 1ms.
With JDK 9, java.xml.bind is deprecated and removed from the standard classpath.
It's still shipped with the JDK (through Java 10), so as a workaround you can use --add-modules to put the module back on your classpath.
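For example, for the jar above (a hedged one-liner; this works on Java 9 and 10 only, since the module was removed entirely in Java 11):
java --add-modules java.xml.bind -jar vHackOSBot.jar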
But for me the easiest solution was adding the dependency (Gradle/Maven):
gradle:
compile 'javax.xml.bind:jaxb-api:2.3.0'
maven:
<dependency>
    <groupId>javax.xml.bind</groupId>
    <artifactId>jaxb-api</artifactId>
    <version>2.3.0</version>
</dependency>
link to deprecation summary: https://docs.oracle.com/javase/9/docs/api/java.xml.bind-summary.html
I found out why it wasn't working: I needed to downgrade to JRE and JDK 8.

Gradle 'android' project refresh failed after installing Android Studio 2.2.1

Gradle 'android' project refresh failed
Caused by: org.gradle.api.GradleException: Unable to start the daemon process.
This problem might be caused by incorrect configuration of the daemon.
For example, an unrecognized jvm option is used.
Please refer to the user guide chapter on the daemon at https://docs.gradle.org/2.14.1/userguide/gradle_daemon.html
Please read the following process output to find out more:
-----------------------
at org.gradle.launcher.daemon.bootstrap.DaemonGreeter.parseDaemonOutput(DaemonGreeter.java:34)
at org.gradle.launcher.daemon.client.DefaultDaemonStarter.startProcess(DefaultDaemonStarter.java:153)
at org.gradle.launcher.daemon.client.DefaultDaemonStarter.startDaemon(DefaultDaemonStarter.java:136)
at org.gradle.launcher.daemon.client.DefaultDaemonConnector.startDaemon(DefaultDaemonConnector.java:111)
at org.gradle.launcher.daemon.client.DefaultDaemonConnector.connect(DefaultDaemonConnector.java:89)
at org.gradle.launcher.daemon.client.DaemonClient.execute(DaemonClient.java:122)
at org.gradle.launcher.daemon.client.DaemonClient.execute(DaemonClient.java:80)
at org.gradle.tooling.internal.provider.DaemonBuildActionExecuter.execute(DaemonBuildActionExecuter.java:77)
at org.gradle.tooling.internal.provider.DaemonBuildActionExecuter.execute(DaemonBuildActionExecuter.java:46)
at org.gradle.tooling.internal.provider.LoggingBridgingBuildActionExecuter.execute(LoggingBridgingBuildActionExecuter.java:60)
at org.gradle.tooling.internal.provider.LoggingBridgingBuildActionExecuter.execute(LoggingBridgingBuildActionExecuter.java:34)
at org.gradle.tooling.internal.provider.ProviderConnection.run(ProviderConnection.java:150)
at org.gradle.tooling.internal.provider.ProviderConnection.run(ProviderConnection.java:135)
at org.gradle.tooling.internal.provider.DefaultConnection.run(DefaultConnection.java:202)
at org.gradle.tooling.internal.consumer.connection.CancellableConsumerConnection$CancellableActionRunner.run(CancellableConsumerConnection.java:116)
at org.gradle.tooling.internal.consumer.connection.AbstractConsumerConnection.run(AbstractConsumerConnection.java:64)
at org.gradle.tooling.internal.consumer.DefaultBuildActionExecuter$1.run(DefaultBuildActionExecuter.java:59)
at org.gradle.tooling.internal.consumer.connection.LazyConsumerActionExecutor.run(LazyConsumerActionExecutor.java:79)
at org.gradle.tooling.internal.consumer.connection.CancellableConsumerActionExecutor.run(CancellableConsumerActionExecutor.java:45)
at org.gradle.tooling.internal.consumer.connection.ProgressLoggingConsumerActionExecutor.run(ProgressLoggingConsumerActionExecutor.java:58)
at org.gradle.tooling.internal.consumer.connection.RethrowingErrorsConsumerActionExecutor.run(RethrowingErrorsConsumerActionExecutor.java:38)
at org.gradle.tooling.internal.consumer.async.DefaultAsyncConsumerActionExecutor$1$1.run(DefaultAsyncConsumerActionExecutor.java:55)
at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:54)
at org.gradle.internal.concurrent.StoppableExecutorImpl$1.run(StoppableExecutorImpl.java:40)
... 3 more
2016-10-20 18:48:29,661 [ 41611] WARN - nal.AbstractExternalSystemTask - Unable to start the daemon process.
This problem might be caused by incorrect configuration of the daemon.
For example, an unrecognized jvm option is used.
Please refer to the user guide chapter on the daemon at https://docs.gradle.org/2.14.1/userguide/gradle_daemon.html
Please read the following process output to find out more:
-----------------------
com.intellij.openapi.externalSystem.model.ExternalSystemException: Unable to start the daemon process.
This problem might be caused by incorrect configuration of the daemon.
For example, an unrecognized jvm option is used.
Please refer to the user guide chapter on the daemon at https://docs.gradle.org/2.14.1/userguide/gradle_daemon.html
Please read the following process output to find out more:
-----------------------
at org.jetbrains.plugins.gradle.service.project.AbstractProjectImportErrorHandler.createUserFriendlyError(AbstractProjectImportErrorHandler.java:106)
at org.jetbrains.plugins.gradle.service.project.BaseProjectImportErrorHandler.getUserFriendlyError(BaseProjectImportErrorHandler.java:158)
at org.jetbrains.plugins.gradle.service.project.BaseGradleProjectResolverExtension.getUserFriendlyError(BaseGradleProjectResolverExtension.java:579)
at org.jetbrains.plugins.gradle.service.project.AbstractProjectResolverExtension.getUserFriendlyError(AbstractProjectResolverExtension.java:158)
at com.android.tools.idea.gradle.project.AndroidGradleProjectResolver.getUserFriendlyError(AndroidGradleProjectResolver.java:405)
at org.jetbrains.plugins.gradle.service.project.GradleProjectResolver$ProjectConnectionDataNodeFunction.fun(GradleProjectResolver.java:772)
at org.jetbrains.plugins.gradle.service.project.GradleProjectResolver$ProjectConnectionDataNodeFunction.fun(GradleProjectResolver.java:752)
at org.jetbrains.plugins.gradle.service.execution.GradleExecutionHelper.execute(GradleExecutionHelper.java:238)
at org.jetbrains.plugins.gradle.service.project.GradleProjectResolver.resolveProjectInfo(GradleProjectResolver.java:112)
at org.jetbrains.plugins.gradle.service.project.GradleProjectResolver.resolveProjectInfo(GradleProjectResolver.java:73)
at com.intellij.openapi.externalSystem.service.remote.RemoteExternalSystemProjectResolverImpl$1.produce(RemoteExternalSystemProjectResolverImpl.java:41)
at com.intellij.openapi.externalSystem.service.remote.RemoteExternalSystemProjectResolverImpl$1.produce(RemoteExternalSystemProjectResolverImpl.java:37)
at com.intellij.openapi.externalSystem.service.remote.AbstractRemoteExternalSystemService.execute(AbstractRemoteExternalSystemService.java:59)
at com.intellij.openapi.externalSystem.service.remote.RemoteExternalSystemProjectResolverImpl.resolveProjectInfo(RemoteExternalSystemProjectResolverImpl.java:37)
at com.intellij.openapi.externalSystem.service.remote.wrapper.ExternalSystemProjectResolverWrapper.resolveProjectInfo(ExternalSystemProjectResolverWrapper.java:49)
at com.intellij.openapi.externalSystem.service.internal.ExternalSystemResolveProjectTask.doExecute(ExternalSystemResolveProjectTask.java:51)
at com.intellij.openapi.externalSystem.service.internal.AbstractExternalSystemTask.execute(AbstractExternalSystemTask.java:138)
at com.intellij.openapi.externalSystem.service.internal.AbstractExternalSystemTask.execute(AbstractExternalSystemTask.java:124)
at com.intellij.openapi.externalSystem.util.ExternalSystemUtil$3.execute(ExternalSystemUtil.java:419)
at com.intellij.openapi.externalSystem.util.ExternalSystemUtil$4$2.run(ExternalSystemUtil.java:500)
at com.intellij.openapi.progress.impl.CoreProgressManager$TaskRunnable.run(CoreProgressManager.java:563)
at com.intellij.openapi.progress.impl.CoreProgressManager$2.run(CoreProgressManager.java:142)
at com.intellij.openapi.progress.impl.CoreProgressManager.registerIndicatorAndRun(CoreProgressManager.java:446)
at com.intellij.openapi.progress.impl.CoreProgressManager.executeProcessUnderProgress(CoreProgressManager.java:392)
at com.intellij.openapi.progress.impl.ProgressManagerImpl.executeProcessUnderProgress(ProgressManagerImpl.java:54)
at com.intellij.openapi.progress.impl.CoreProgressManager.runProcess(CoreProgressManager.java:127)
at com.intellij.openapi.progress.impl.ProgressManagerImpl$1.run(ProgressManagerImpl.java:126)
at com.intellij.openapi.application.impl.ApplicationImpl$8.run(ApplicationImpl.java:369)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
2016-10-20 18:48:29,664 [ 41614] WARN - radle.project.ProjectSetUpTask -
2016-10-20 18:48:29,664 [ 41614] INFO - radle.project.ProjectSetUpTask - Unable to start the daemon process.
This problem might be caused by incorrect configuration of the daemon.
For example, an unrecognized jvm option is used.
Please refer to the user guide chapter on the daemon at https://docs.gradle.org/2.14.1/userguide/gradle_daemon.html
Please read the following process output to find out more:
-----------------------
Consult IDE log for more details (Help | Show Log)
2016-10-20 18:48:29,664 [ 41614] INFO - ls.idea.gradle.GradleSyncState - Sync with Gradle for project 'android' failed: Unable to start the daemon process.
This problem might be caused by incorrect configuration of the daemon.
For example, an unrecognized jvm option is used.
Please refer to the user guide chapter on the daemon at https://docs.gradle.org/2.14.1/userguide/gradle_daemon.html
Please read the following process output to find out more:
-----------------------
Consult IDE log for more details (Help | Show Log)
2016-10-20 18:48:29,733 [ 41683] WARN - roid.tools.ndk.GradleWorkspace - NDK support for project 'android' is disabled because the project doesn't contain any valid native configurations.
2016-10-20 18:48:29,823 [ 41773] INFO - #com.jetbrains.cidr.lang - Clearing symbols finished in 0 s.
2016-10-20 18:48:29,823 [ 41773] INFO - #com.jetbrains.cidr.lang - Loading symbols finished in 0 s.
2016-10-20 18:48:29,823 [ 41773] INFO - #com.jetbrains.cidr.lang - Building symbols finished in 0 s.
2016-10-20 18:48:29,824 [ 41774] INFO - #com.jetbrains.cidr.lang - Saving symbols finished in 0 s.
This issue happens only with Cordova/PhoneGap projects; it works fine on native Android (Java) projects.
I checked all the questions and answers on Stack Overflow, and tried installing/uninstalling many times, but nothing fixed it.
OS: Windows 10 x64, Android Studio 2.2.1.
Does anyone know how to fix this?
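One frequent trigger for this exact message is a daemon JVM option the local JVM cannot honor, for example an -Xmx larger than a 32-bit JVM can allocate. A hedged first step (the value is an assumption to tune) is overriding the daemon arguments in gradle.properties, either in the project root or in C:\Users\<you>\.gradle\gradle.properties:

# keep the daemon heap small enough for the installed JVM to allocate
org.gradle.jvmargs=-Xmx1024m

Deleting the stale daemon state under C:\Users\<you>\.gradle\daemon before retrying is another common companion step.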

Hadoop "Spill Failed" Exception in an ec2 instance with 420GB of instance storage

I am using Hadoop 2.3.0 and have installed it as a single-node cluster (pseudo-distributed mode) on a CentOS 6.4 Amazon EC2 instance with 420GB of instance storage and 7.5GB of RAM. My understanding is that the "Spill Failed" exception occurs only when the node runs out of disk space; however, after running map/reduce tasks for only a short amount of time (nowhere near 420GB of data), I get the following exception.
I would like to mention that I moved the Hadoop installation on the same node from an 8GB EBS volume (where I had installed it originally) to the 420GB instance store volume, changed the $HADOOP_HOME environment variable and other properties to point to the instance store volume accordingly, and Hadoop 2.3.0 is now completely contained on the 420GB drive.
However, I still see the following exception. Can you please let me know if there is anything besides disk space that can cause the Spill Failed exception?
2014-02-28 15:35:07,630 ERROR [IPC Server handler 12 on 58189] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1393591821307_0013_m_000000_0 - exited :
java.io.IOException: Spill failed
at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.checkSpillException(MapTask.java:1533)
at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:1442)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:437)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Unknown Source)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: org.apache.hadoop.util.DiskChecker$DiskErrorException: Could not find any valid local directory for attempt_1393591821307_0013_m_000000_0_spill_26.out
at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:402)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:150)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:131)
at org.apache.hadoop.mapred.YarnOutputFiles.getSpillFileForWrite(YarnOutputFiles.java:159)
at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpill(MapTask.java:1564)
at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.access$900(MapTask.java:853)
at org.apache.hadoop.mapred.MapTask$MapOutputBuffer$SpillThread.run(MapTask.java:1503)
2014-02-28 15:35:07,604 WARN [main] org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:root (auth:SIMPLE) cause:java.io.IOException: Spill failed
2014-02-28 15:35:07,605 WARN [main] org.apache.hadoop.mapred.YarnChild: Exception running child : java.io.IOException: Spill failed
at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.checkSpillException(MapTask.java:1533)
at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:1442)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:437)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Unknown Source)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: org.apache.hadoop.util.DiskChecker$DiskErrorException: Could not find any valid local directory for attempt_1393591821307_0013_m_000000_0_spill_26.out
at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:402)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:150)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:131)
at org.apache.hadoop.mapred.YarnOutputFiles.getSpillFileForWrite(YarnOutputFiles.java:159)
at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpill(MapTask.java:1564)
at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.access$900(MapTask.java:853)
at org.apache.hadoop.mapred.MapTask$MapOutputBuffer$SpillThread.run(MapTask.java:1503)
I was able to solve this by setting the hadoop.tmp.dir value to a path on the instance storage; by default it was pointing to the EBS-backed root volume.
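For reference, the map-side spill files go to the NodeManager's local directories, which in Hadoop 2.x default to ${hadoop.tmp.dir}/nm-local-dir (via yarn.nodemanager.local-dirs), so repointing that one property moves the spills as well. A hedged sketch of the core-site.xml change (the mount point is an assumption; use wherever your instance store is mounted):

<property>
    <name>hadoop.tmp.dir</name>
    <!-- point temp, local and spill directories at the 420GB instance store -->
    <value>/mnt/hadoop-tmp</value>
</property>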
