I am getting the error shown in the title. I have searched on Stack Overflow, and other people have run into the same problem in previous versions. One answer said it would be fixed in a later version of DL4J, but that does not seem to have happened.
Below is the pom.xml with the dependencies I am using.
Please, can anybody help me?
Thank you in advance.
pom.xml:
<properties>
    <dl4j-master.version>1.0.0-M1.1</dl4j-master.version>
    <logback.version>1.2.3</logback.version>
    <java.version>1.8</java.version>
    <maven-shade-plugin.version>2.4.3</maven-shade-plugin.version>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencies>
    <dependency>
        <groupId>org.deeplearning4j</groupId>
        <artifactId>deeplearning4j-core</artifactId>
        <version>${dl4j-master.version}</version>
    </dependency>
    <dependency>
        <groupId>org.deeplearning4j</groupId>
        <artifactId>deeplearning4j-nlp</artifactId>
        <version>${dl4j-master.version}</version>
    </dependency>
    <dependency>
        <groupId>org.datavec</groupId>
        <artifactId>datavec-api</artifactId>
        <version>${dl4j-master.version}</version>
    </dependency>
    <dependency>
        <groupId>org.nd4j</groupId>
        <artifactId>nd4j-cuda-11.0-platform</artifactId>
        <version>${dl4j-master.version}</version>
    </dependency>
    <dependency>
        <groupId>org.bytedeco</groupId>
        <artifactId>cuda-platform-redist</artifactId>
        <version>11.0-8.0-1.5.4</version>
    </dependency>
    <dependency>
        <groupId>org.deeplearning4j</groupId>
        <artifactId>deeplearning4j-cuda-11.0</artifactId>
        <version>${dl4j-master.version}</version>
    </dependency>
    <dependency>
        <groupId>org.bytedeco.javacpp-presets</groupId>
        <artifactId>cuda</artifactId>
        <version>10.0-7.4-1.4.4</version>
    </dependency>
    <dependency>
        <groupId>ch.qos.logback</groupId>
        <artifactId>logback-classic</artifactId>
        <version>${logback.version}</version>
    </dependency>
</dependencies>
Error:
11:11:35.720 [main] INFO org.nd4j.linalg.factory.Nd4jBackend - Loaded [JCublasBackend] backend
11:11:37.543 [main] INFO org.nd4j.nativeblas.NativeOpsHolder - Number of threads used for linear algebra: 32
11:11:37.675 [main] INFO org.nd4j.linalg.api.ops.executioner.DefaultOpExecutioner - Backend used: [CUDA]; OS: [Windows 10]
11:11:37.676 [main] INFO org.nd4j.linalg.api.ops.executioner.DefaultOpExecutioner - Cores: [4]; Memory: [3,5GB];
11:11:37.676 [main] INFO org.nd4j.linalg.api.ops.executioner.DefaultOpExecutioner - Blas vendor: [CUBLAS]
11:11:37.702 [main] INFO org.nd4j.linalg.jcublas.JCublasBackend - ND4J CUDA build version: 11.0.221
11:11:37.705 [main] INFO org.nd4j.linalg.jcublas.JCublasBackend - CUDA device 0: [NVIDIA GeForce 930M]; cc: [5.0]; Total memory: [4294836224]
11:11:37.705 [main] INFO org.nd4j.linalg.jcublas.JCublasBackend - Backend build information:
MSVC: 192930038
STD version: 201703L
CUDA: 11.0.221
DEFAULT_ENGINE: samediff::ENGINE_CUDA
HAVE_FLATBUFFERS
11:11:37.782 [main] INFO org.deeplearning4j.models.sequencevectors.SequenceVectors - Starting vocabulary building...
11:11:37.783 [main] DEBUG org.deeplearning4j.models.word2vec.wordstore.VocabConstructor - Target vocab size before building: [0]
11:11:37.814 [main] DEBUG org.deeplearning4j.models.word2vec.wordstore.VocabConstructor - Trying source iterator: [0]
11:11:37.814 [main] DEBUG org.deeplearning4j.models.word2vec.wordstore.VocabConstructor - Target vocab size before building: [0]
11:11:51.450 [main] DEBUG org.deeplearning4j.models.word2vec.wordstore.VocabConstructor - Waiting till all processes stop...
11:11:51.457 [main] DEBUG org.deeplearning4j.models.word2vec.wordstore.VocabConstructor - Vocab size before truncation: [168165], NumWords: [1952392], sequences parsed: [318], counter: [1952389]
11:11:51.457 [main] DEBUG org.deeplearning4j.models.word2vec.wordstore.VocabConstructor - Vocab size after truncation: [168165], NumWords: [1952392], sequences parsed: [318], counter: [1952389]
11:11:54.179 [main] INFO org.deeplearning4j.models.word2vec.wordstore.VocabConstructor - Sequences checked: [318], Current vocabulary size: [168165]; Sequences/sec: [19,39];
11:11:54.248 [main] INFO org.deeplearning4j.models.embeddings.loader.WordVectorSerializer - Projected memory use for model: [128,30 MB]
Exception in thread "main" java.lang.RuntimeException: cudaGetSymbolAddress(...) failed; Error code: [13]
at org.nd4j.linalg.jcublas.ops.executioner.CudaExecutioner.createShapeInfo(CudaExecutioner.java:2173)
at org.nd4j.linalg.api.shape.Shape.createShapeInformation(Shape.java:3279)
at org.nd4j.linalg.api.ndarray.BaseShapeInfoProvider.createShapeInformation(BaseShapeInfoProvider.java:75)
at org.nd4j.jita.constant.ProtectedCudaShapeInfoProvider.createShapeInformation(ProtectedCudaShapeInfoProvider.java:96)
at org.nd4j.jita.constant.ProtectedCudaShapeInfoProvider.createShapeInformation(ProtectedCudaShapeInfoProvider.java:77)
at org.nd4j.linalg.jcublas.CachedShapeInfoProvider.createShapeInformation(CachedShapeInfoProvider.java:46)
at org.nd4j.linalg.api.ndarray.BaseNDArray.<init>(BaseNDArray.java:180)
at org.nd4j.linalg.api.ndarray.BaseNDArray.<init>(BaseNDArray.java:174)
at org.nd4j.linalg.api.ndarray.BaseNDArray.<init>(BaseNDArray.java:316)
at org.nd4j.linalg.jcublas.JCublasNDArray.<init>(JCublasNDArray.java:135)
at org.nd4j.linalg.jcublas.JCublasNDArrayFactory.createUninitialized(JCublasNDArrayFactory.java:1533)
at org.nd4j.linalg.factory.Nd4j.createUninitialized(Nd4j.java:4379)
at org.nd4j.linalg.factory.Nd4j.rand(Nd4j.java:2957)
at org.nd4j.linalg.factory.Nd4j.rand(Nd4j.java:2946)
at org.deeplearning4j.models.embeddings.inmemory.InMemoryLookupTable.resetWeights(InMemoryLookupTable.java:145)
at org.deeplearning4j.models.sequencevectors.SequenceVectors.fit(SequenceVectors.java:278)
at org.deeplearning4j.models.paragraphvectors.ParagraphVectors.fit(ParagraphVectors.java:667)
at gov.rfb.cocaj.dl4jGPU.DocumentClassifier.main(DocumentClassifier.java:44)
This is almost always due to an incompatible CUDA version. Make sure that the CUDA version installed locally is the same one you are using with DL4J. Note that the pom.xml above actually mixes two CUDA versions: the nd4j/deeplearning4j CUDA 11.0 artifacts and the 11.0-8.0-1.5.4 redist on one hand, and an org.bytedeco.javacpp-presets:cuda dependency pinned to 10.0-7.4-1.4.4 on the other.
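As a rough sketch (not an official dependency list), staying on CUDA 11.0 to match dl4j-master.version 1.0.0-M1.1 would mean keeping only the 11.0 artifacts already present in the question's pom and dropping the stray 10.0 preset entirely:

<dependencies>
    <!-- ND4J CUDA backend and DL4J CUDA support, both on 11.0 -->
    <dependency>
        <groupId>org.nd4j</groupId>
        <artifactId>nd4j-cuda-11.0-platform</artifactId>
        <version>${dl4j-master.version}</version>
    </dependency>
    <dependency>
        <groupId>org.deeplearning4j</groupId>
        <artifactId>deeplearning4j-cuda-11.0</artifactId>
        <version>${dl4j-master.version}</version>
    </dependency>
    <!-- Redistributable CUDA 11.0 + cuDNN 8.0 binaries matching the backend above -->
    <dependency>
        <groupId>org.bytedeco</groupId>
        <artifactId>cuda-platform-redist</artifactId>
        <version>11.0-8.0-1.5.4</version>
    </dependency>
    <!-- No org.bytedeco.javacpp-presets:cuda:10.0-7.4-1.4.4 entry -->
</dependencies>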
Related
Here I am trying to run Structured Streaming with Apache Kafka, but it does not work and fails with an error (ERROR MicroBatchExecution: Query [id = daae4c34-9c8a-4c28-9e2e-88e5fcf3d614, runId = ca57d90c-d584-41d3-a8de-6f9534ead0a0] terminated with error
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z).
How can I solve this issue? I work on a Windows 10 machine.
App Class:
package com.rakib;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.OutputMode;
import org.apache.spark.sql.streaming.StreamingQuery;
import org.apache.spark.sql.streaming.StreamingQueryException;
import java.util.concurrent.TimeoutException;
import java.util.logging.Level;
import java.util.logging.Logger;
public class App {
    public static void main(String[] args) throws TimeoutException, StreamingQueryException {
        System.setProperty("hadoop.home.dir", "c:/hadoop");
        Logger.getLogger("org.apache").setLevel(Level.WARNING);

        SparkSession sparkSession = SparkSession.builder()
                .appName("SparkSQL")
                .master("local[*]")
                .getOrCreate();

        Dataset<Row> rowDataset = sparkSession
                .readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "localhost:9091,localhost:9092,localhost:9093")
                .option("subscribe", "student")
                .option("startingOffsets", "earliest")
                .load();

        rowDataset.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)");
        //rowDataset.createOrReplaceTempView("student_info");
        //Dataset<Row> dataset = sparkSession.sql("SELECT value FROM student_info");

        StreamingQuery query = rowDataset
                .writeStream()
                .format("console")
                .outputMode(OutputMode.Append())
                .start();

        query.awaitTermination();
    }
}
pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>org.example</groupId>
    <artifactId>Test_One</artifactId>
    <version>1.0-SNAPSHOT</version>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <configuration>
                    <source>8</source>
                    <target>8</target>
                </configuration>
            </plugin>
        </plugins>
    </build>

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.12</artifactId>
            <version>3.0.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.12</artifactId>
            <version>3.0.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.12</artifactId>
            <version>3.0.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-kafka-0-10_2.12</artifactId>
            <version>3.0.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql-kafka-0-10_2.12</artifactId>
            <version>3.0.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs-client</artifactId>
            <version>3.3.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>3.3.0</version>
        </dependency>
        <dependency>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
            <version>1.2.17</version>
        </dependency>
    </dependencies>
</project>
Error:
20/08/20 23:37:21 INFO MicroBatchExecution: Reading table [org.apache.spark.sql.kafka010.KafkaSourceProvider$KafkaTable@71202043] from DataSourceV2 named 'kafka' [org.apache.spark.sql.kafka010.KafkaSourceProvider@1a4d79db]
20/08/20 23:37:21 ERROR MicroBatchExecution: Query [id = daae4c34-9c8a-4c28-9e2e-88e5fcf3d614, runId = ca57d90c-d584-41d3-a8de-6f9534ead0a0] terminated with error
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:793)
at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:1221)
at org.apache.hadoop.fs.FileUtil.list(FileUtil.java:1426)
at org.apache.hadoop.fs.RawLocalFileSystem.listStatus(RawLocalFileSystem.java:495)
at org.apache.hadoop.fs.DelegateToFileSystem.listStatus(DelegateToFileSystem.java:177)
at org.apache.hadoop.fs.ChecksumFs.listStatus(ChecksumFs.java:548)
at org.apache.hadoop.fs.FileContext$Util$1.next(FileContext.java:1905)
at org.apache.hadoop.fs.FileContext$Util$1.next(FileContext.java:1901)
at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1907)
at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1866)
at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1825)
at org.apache.spark.sql.execution.streaming.FileContextBasedCheckpointFileManager.list(CheckpointFileManager.scala:299)
at org.apache.spark.sql.execution.streaming.HDFSMetadataLog.getLatest(HDFSMetadataLog.scala:186)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.populateStartOffsets(MicroBatchExecution.scala:272)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runActivatedStream$2(MicroBatchExecution.scala:194)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken(ProgressReporter.scala:352)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken$(ProgressReporter.scala:350)
at org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:69)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runActivatedStream$1(MicroBatchExecution.scala:191)
at org.apache.spark.sql.execution.streaming.ProcessingTimeExecutor.execute(TriggerExecutor.scala:57)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.runActivatedStream(MicroBatchExecution.scala:185)
at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:334)
at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:245)
Exception in thread "stream execution thread for [id = daae4c34-9c8a-4c28-9e2e-88e5fcf3d614, runId = ca57d90c-d584-41d3-a8de-6f9534ead0a0]" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:793)
at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:1221)
at org.apache.hadoop.fs.FileUtil.list(FileUtil.java:1426)
at org.apache.hadoop.fs.RawLocalFileSystem.listStatus(RawLocalFileSystem.java:495)
at org.apache.hadoop.fs.DelegateToFileSystem.listStatus(DelegateToFileSystem.java:177)
at org.apache.hadoop.fs.ChecksumFs.listStatus(ChecksumFs.java:548)
at org.apache.hadoop.fs.FileContext$Util$1.next(FileContext.java:1905)
at org.apache.hadoop.fs.FileContext$Util$1.next(FileContext.java:1901)
at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1907)
at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1866)
at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1825)
at org.apache.spark.sql.execution.streaming.FileContextBasedCheckpointFileManager.list(CheckpointFileManager.scala:299)
at org.apache.spark.sql.execution.streaming.HDFSMetadataLog.getLatest(HDFSMetadataLog.scala:186)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.populateStartOffsets(MicroBatchExecution.scala:272)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runActivatedStream$2(MicroBatchExecution.scala:194)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken(ProgressReporter.scala:352)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken$(ProgressReporter.scala:350)
at org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:69)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runActivatedStream$1(MicroBatchExecution.scala:191)
at org.apache.spark.sql.execution.streaming.ProcessingTimeExecutor.execute(TriggerExecutor.scala:57)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.runActivatedStream(MicroBatchExecution.scala:185)
at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:334)
at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:245)
Exception in thread "main" org.apache.spark.sql.streaming.StreamingQueryException: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
=== Streaming Query ===
Identifier: [id = daae4c34-9c8a-4c28-9e2e-88e5fcf3d614, runId = ca57d90c-d584-41d3-a8de-6f9534ead0a0]
Current Committed Offsets: {}
Current Available Offsets: {}
Current State: ACTIVE
Thread State: RUNNABLE
Logical Plan:
WriteToMicroBatchDataSource ConsoleWriter[numRows=20, truncate=true]
+- StreamingDataSourceV2Relation [key#7, value#8, topic#9, partition#10, offset#11L, timestamp#12, timestampType#13], org.apache.spark.sql.kafka010.KafkaSourceProvider$KafkaScan@8774409, KafkaV2[Subscribe[student]]
at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:355)
at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:245)
Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:793)
at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:1221)
at org.apache.hadoop.fs.FileUtil.list(FileUtil.java:1426)
at org.apache.hadoop.fs.RawLocalFileSystem.listStatus(RawLocalFileSystem.java:495)
at org.apache.hadoop.fs.DelegateToFileSystem.listStatus(DelegateToFileSystem.java:177)
at org.apache.hadoop.fs.ChecksumFs.listStatus(ChecksumFs.java:548)
at org.apache.hadoop.fs.FileContext$Util$1.next(FileContext.java:1905)
at org.apache.hadoop.fs.FileContext$Util$1.next(FileContext.java:1901)
at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1907)
at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1866)
at org.apache.hadoop.fs.FileContext$Util.listStatus(FileContext.java:1825)
at org.apache.spark.sql.execution.streaming.FileContextBasedCheckpointFileManager.list(CheckpointFileManager.scala:299)
at org.apache.spark.sql.execution.streaming.HDFSMetadataLog.getLatest(HDFSMetadataLog.scala:186)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.populateStartOffsets(MicroBatchExecution.scala:272)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runActivatedStream$2(MicroBatchExecution.scala:194)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken(ProgressReporter.scala:352)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken$(ProgressReporter.scala:350)
at org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:69)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runActivatedStream$1(MicroBatchExecution.scala:191)
at org.apache.spark.sql.execution.streaming.ProcessingTimeExecutor.execute(TriggerExecutor.scala:57)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.runActivatedStream(MicroBatchExecution.scala:185)
at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:334)
... 1 more
20/08/20 23:37:21 INFO SparkContext: Invoking stop() from shutdown hook
20/08/20 23:37:21 INFO SparkUI: Stopped Spark web UI at http://DESKTOP-3147U79:4040
20/08/20 23:37:21 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
20/08/20 23:37:21 INFO MemoryStore: MemoryStore cleared
20/08/20 23:37:21 INFO BlockManager: BlockManager stopped
20/08/20 23:37:21 INFO BlockManagerMaster: BlockManagerMaster stopped
20/08/20 23:37:21 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
20/08/20 23:37:21 INFO SparkContext: Successfully stopped SparkContext
20/08/20 23:37:21 INFO ShutdownHookManager: Shutdown hook called
20/08/20 23:37:21 INFO ShutdownHookManager: Deleting directory C:\Users\itc\AppData\Local\Temp\temporary-850444d9-5110-4c13-881f-a6e0ba7153d8
20/08/20 23:37:21 INFO ShutdownHookManager: Deleting directory C:\Users\itc\AppData\Local\Temp\spark-813cc4f1-9d4b-44f2-99ae-435d9e99f566
Process finished with exit code 1
This error generally occurs due to a mismatch between the native binaries in your %HADOOP_HOME%\bin folder and your Hadoop version. Get hadoop.dll and winutils.exe built for your specific Hadoop version (3.3.0 here) and copy them into your %HADOOP_HOME%\bin folder.
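As a quick sanity check before starting Spark, a minimal sketch like the following (a hypothetical helper; the c:/hadoop fallback is taken from the question's hadoop.home.dir setting) can confirm both files are where Hadoop expects them:

import java.io.File;

public class HadoopHomeCheck {
    public static void main(String[] args) {
        // Resolve HADOOP_HOME the same way the question's code does as a fallback.
        String home = System.getenv("HADOOP_HOME");
        if (home == null) {
            home = System.getProperty("hadoop.home.dir", "c:/hadoop");
        }
        File bin = new File(home, "bin");
        // Both files must match the Hadoop version on the classpath (3.3.0 here).
        for (String name : new String[]{"winutils.exe", "hadoop.dll"}) {
            File f = new File(bin, name);
            System.out.println(f.getAbsolutePath() + (f.isFile() ? " - found" : " - MISSING"));
        }
    }
}

In some setups hadoop.dll must additionally be visible to the JVM's native library loader (on java.library.path or in C:\Windows\System32) for the native access0 call to link.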
My project works perfectly when I start it in IntelliJ IDEA.
I created a jar with IntelliJ and changed the directory of MANIFEST.MF from java to resources, but when I start it with:
java -jar window-projectNew.jar
an error occurs:
java.lang.IllegalArgumentException: No auto configuration classes found in META-INF/spring.factories
Here is the full log:
Microsoft Windows [Version 6.3.9600]
(c) Microsoft Corporation, 2013. All rights reserved.
C:\Users\UserOld.Laptop>cd C:\Users\UserOld.Laptop\Desktop\фото
C:\Users\UserOld.Laptop\Desktop\фото>java -cp window-projectNew.jar com.eurodesign.windowproject.WindowProjectApplication
14:13:30.014 [main] DEBUG org.springframework.web.context.support.StandardServletEnvironment - Activating profiles []

  .   ____          _            __ _ _
 /\\ / ___'_ __ _ _(_)_ __  __ _ \ \ \ \
( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
 \\/  ___)| |_)| | | | | || (_| |  ) ) ) )
  '  |____| .__|_| |_|_| |_\__, | / / / /
 =========|_|==============|___/=/_/_/_/
 :: Spring Boot ::

14:13:30.217 [main] INFO com.eurodesign.windowproject.WindowProjectApplication - Starting WindowProjectApplication on Laptop with PID 2316 (C:\Users\UserOld.Laptop\Desktop\фото\window-projectNew.jar started by UserOld in C:\Users\UserOld.Laptop\Desktop\фото)
14:13:30.217 [main] DEBUG com.eurodesign.windowproject.WindowProjectApplication - Running with Spring Boot, Spring
14:13:30.217 [main] INFO com.eurodesign.windowproject.WindowProjectApplication - No active profile set, falling back to default profiles: default
14:13:30.217 [main] DEBUG org.springframework.boot.SpringApplication - Loading source class com.eurodesign.windowproject.WindowProjectApplication
14:13:30.310 [main] DEBUG org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext - Refreshing org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@158a8276
14:13:30.326 [main] DEBUG org.springframework.beans.factory.support.DefaultListableBeanFactory - Creating shared instance of singleton bean 'org.springframework.context.annotation.internalConfigurationAnnotationProcessor'
14:13:30.404 [main] DEBUG org.springframework.context.annotation.ClassPathBeanDefinitionScanner - Identified candidate component class: URL [jar:file:/C:/Users/UserOld.Laptop/Desktop/%d1%84%d0%be%d1%82%d0%be/window-projectNew.jar!/com/eurodesign/windowproject/config/WebSecurityConfig.class]
14:13:30.435 [main] DEBUG org.springframework.context.annotation.ClassPathBeanDefinitionScanner - Identified candidate component class: URL [jar:file:/C:/Users/UserOld.Laptop/Desktop/%d1%84%d0%be%d1%82%d0%be/window-projectNew.jar!/com/eurodesign/windowproject/controller/AdminPageController.class]
14:13:30.435 [main] DEBUG org.springframework.context.annotation.ClassPathBeanDefinitionScanner - Identified candidate component class: URL [jar:file:/C:/Users/UserOld.Laptop/Desktop/%d1%84%d0%be%d1%82%d0%be/window-projectNew.jar!/com/eurodesign/windowproject/controller/HomePageController.class]
14:13:30.435 [main] DEBUG org.springframework.context.annotation.ClassPathBeanDefinitionScanner - Ignored because not a concrete top-level class: URL [jar:file:/C:/Users/UserOld.Laptop/Desktop/%d1%84%d0%be%d1%82%d0%be/window-projectNew.jar!/com/eurodesign/windowproject/dao/CallMeFormRepository.class]
14:13:30.435 [main] DEBUG org.springframework.context.annotation.ClassPathBeanDefinitionScanner - Ignored because not a concrete top-level class: URL [jar:file:/C:/Users/UserOld.Laptop/Desktop/%d1%84%d0%be%d1%82%d0%be/window-projectNew.jar!/com/eurodesign/windowproject/dao/FeedbackApprovedRepository.class]
14:13:30.435 [main] DEBUG org.springframework.context.annotation.ClassPathBeanDefinitionScanner - Ignored because not a concrete top-level class: URL [jar:file:/C:/Users/UserOld.Laptop/Desktop/%d1%84%d0%be%d1%82%d0%be/window-projectNew.jar!/com/eurodesign/windowproject/dao/FeedbackRepository.class]
14:13:30.451 [main] DEBUG org.springframework.context.annotation.ClassPathBeanDefinitionScanner - Ignored because not a concrete top-level class: URL [jar:file:/C:/Users/UserOld.Laptop/Desktop/%d1%84%d0%be%d1%82%d0%be/window-projectNew.jar!/com/eurodesign/windowproject/dao/WorkDoneRepository.class]
14:13:30.451 [main] DEBUG org.springframework.context.annotation.ClassPathBeanDefinitionScanner - Identified candidate component class: URL [jar:file:/C:/Users/UserOld.Laptop/Desktop/%d1%84%d0%be%d1%82%d0%be/window-projectNew.jar!/com/eurodesign/windowproject/mailSender/MailConfig.class]
14:13:30.451 [main] DEBUG org.springframework.context.annotation.ClassPathBeanDefinitionScanner - Identified candidate component class: URL [jar:file:/C:/Users/UserOld.Laptop/Desktop/%d1%84%d0%be%d1%82%d0%be/window-projectNew.jar!/com/eurodesign/windowproject/service/CallMeFormService.class]
14:13:30.467 [main] DEBUG org.springframework.context.annotation.ClassPathBeanDefinitionScanner - Identified candidate component class: URL [jar:file:/C:/Users/UserOld.Laptop/Desktop/%d1%84%d0%be%d1%82%d0%be/window-projectNew.jar!/com/eurodesign/windowproject/service/FeedbackApprovedService.class]
14:13:30.467 [main] DEBUG org.springframework.context.annotation.ClassPathBeanDefinitionScanner - Identified candidate component class: URL [jar:file:/C:/Users/UserOld.Laptop/Desktop/%d1%84%d0%be%d1%82%d0%be/window-projectNew.jar!/com/eurodesign/windowproject/service/FeedbackService.class]
14:13:30.467 [main] DEBUG org.springframework.context.annotation.ClassPathBeanDefinitionScanner - Identified candidate component class: URL [jar:file:/C:/Users/UserOld.Laptop/Desktop/%d1%84%d0%be%d1%82%d0%be/window-projectNew.jar!/com/eurodesign/windowproject/service/WorkDoneService.class]
14:13:30.514 [main] ERROR org.springframework.boot.SpringApplication - Application run failed
java.lang.IllegalArgumentException: No auto configuration classes found in META-INF/spring.factories. If you are using a custom packaging, make sure that file is correct.
    at org.springframework.util.Assert.notEmpty(Assert.java:467)
    at org.springframework.boot.autoconfigure.AutoConfigurationImportSelector.getCandidateConfigurations(AutoConfigurationImportSelector.java:180)
    at org.springframework.boot.autoconfigure.AutoConfigurationImportSelector.getAutoConfigurationEntry(AutoConfigurationImportSelector.java:123)
    at org.springframework.boot.autoconfigure.AutoConfigurationImportSelector$AutoConfigurationGroup.process(AutoConfigurationImportSelector.java:434)
    at org.springframework.context.annotation.ConfigurationClassParser$DeferredImportSelectorGrouping.getImports(ConfigurationClassParser.java:878)
    at org.springframework.context.annotation.ConfigurationClassParser$DeferredImportSelectorGroupingHandler.processGroupImports(ConfigurationClassParser.java:808)
    at org.springframework.context.annotation.ConfigurationClassParser$DeferredImportSelectorHandler.process(ConfigurationClassParser.java:779)
    at org.springframework.context.annotation.ConfigurationClassParser.parse(ConfigurationClassParser.java:192)
    at org.springframework.context.annotation.ConfigurationClassPostProcessor.processConfigBeanDefinitions(ConfigurationClassPostProcessor.java:319)
    at org.springframework.context.annotation.ConfigurationClassPostProcessor.postProcessBeanDefinitionRegistry(ConfigurationClassPostProcessor.java:236)
    at org.springframework.context.support.PostProcessorRegistrationDelegate.invokeBeanDefinitionRegistryPostProcessors(PostProcessorRegistrationDelegate.java:280)
    at org.springframework.context.support.PostProcessorRegistrationDelegate.invokeBeanFactoryPostProcessors(PostProcessorRegistrationDelegate.java:96)
    at org.springframework.context.support.AbstractApplicationContext.invokeBeanFactoryPostProcessors(AbstractApplicationContext.java:707)
    at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:533)
    at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:143)
    at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:758)
    at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:750)
    at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:397)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:315)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:1237)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:1226)
    at com.eurodesign.windowproject.WindowProjectApplication.main(WindowProjectApplication.java:10)
14:13:30.529 [main] DEBUG org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext - Closing org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@158a8276, started on Thu Aug 06 14:13:30 MSK 2020
I tried the solution from a similar question. A test failed during the build, so I skipped the tests and the jar was exported to the /target folder. But even after that the program didn't start outside of IntelliJ.
Here is my pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.3.2.RELEASE</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>
    <groupId>com.eurodesign</groupId>
    <artifactId>window-project</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>window-project</name>
    <description>A project</description>

    <properties>
        <java.version>11</java.version>
        <start-class>com.eurodesign.windowproject.WindowProjectApplication</start-class>
    </properties>

    <dependencies>
        <dependency>
            <groupId>commons-io</groupId>
            <artifactId>commons-io</artifactId>
            <version>2.1</version>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-mail</artifactId>
            <version>2.2.6.RELEASE</version>
        </dependency>
        <dependency>
            <groupId>org.thymeleaf.extras</groupId>
            <artifactId>thymeleaf-extras-springsecurity5</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-data-jpa</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-security</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-thymeleaf</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
            <exclusions>
                <exclusion>
                    <groupId>org.junit.vintage</groupId>
                    <artifactId>junit-vintage-engine</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.springframework.security</groupId>
            <artifactId>spring-security-test</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
</project>
What is my mistake? Thank you!
Finally I managed to solve my problem. For some reason IntelliJ fails to generate the jar file correctly, but when I built the jar with Maven the problem disappeared.
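For reference, a Maven build goes through the spring-boot-maven-plugin already declared in the pom above, whose repackage goal produces an executable fat jar with the auto-configuration metadata laid out correctly. Assuming the default jar name derived from the pom's artifactId and version (check your target folder for the actual name):
mvn clean package -DskipTests
java -jar target/window-project-0.0.1-SNAPSHOT.jar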
I'm trying to integrate EmbeddedKafka (https://github.com/spring-projects/spring-kafka/blob/master/src/reference/asciidoc/testing.adoc) with my unit tests.
Not always, but very often, I get errors during EmbeddedKafka startup.
2019-10-08T11:23:43.894Z INFO [main] org.apache.zookeeper.server.ZooKeeperServer: Server environment:java.io.tmpdir=C:\tmp\
2019-10-08T11:23:43.894Z INFO [main] org.apache.zookeeper.server.ZooKeeperServer: Server environment:java.compiler=<NA>
2019-10-08T11:23:43.894Z INFO [main] org.apache.zookeeper.server.ZooKeeperServer: Server environment:os.name=Windows 10
2019-10-08T11:23:43.894Z INFO [main] org.apache.zookeeper.server.ZooKeeperServer: Server environment:os.arch=amd64
2019-10-08T11:23:43.894Z INFO [main] org.apache.zookeeper.server.ZooKeeperServer: Server environment:os.version=10.0
2019-10-08T11:23:43.894Z INFO [main] org.apache.zookeeper.server.ZooKeeperServer: Server environment:user.name=user
2019-10-08T11:23:43.894Z INFO [main] org.apache.zookeeper.server.ZooKeeperServer: Server environment:user.home=C:\Users\user
2019-10-08T11:23:43.894Z INFO [main] org.apache.zookeeper.server.ZooKeeperServer: Server environment:user.dir=C:\work\
2019-10-08T11:23:43.913Z INFO [main] org.apache.zookeeper.server.ZooKeeperServer: Created server with tickTime 500 minSessionTimeout 1000 maxSessionTimeout 10000 datadir C:\tmp\kafka-2406612557331641452\version-2 snapdir C:\tmp\kafka-919479945966258903\version-2
2019-10-08T11:23:43.923Z INFO [main] org.apache.zookeeper.server.NIOServerCnxnFactory: binding to port /127.0.0.1:0
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.146 sec <<< FAILURE! - in kafka.KafkaTopicUtilsTest
kafka.KafkaTopicUtilsTest Time elapsed: 0.146 sec <<< ERROR!
org.I0Itec.zkclient.exception.ZkInterruptedException: java.lang.InterruptedException
Caused by: java.lang.InterruptedException
pom.xml:
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.11</artifactId>
    <version>2.2.1</version>
</dependency>
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-core</artifactId>
    <version>5.0.7.RELEASE</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-test</artifactId>
    <version>5.0.7.RELEASE</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka-test</artifactId>
    <version>2.2.9.RELEASE</version>
    <scope>test</scope>
</dependency>
KafkaTopicUtilsTest.java, server initialization through a JUnit rule:
@RunWith(MockitoJUnitRunner.class)
public class KafkaTopicUtilsTest {
    static final String INITIAL_TOPIC = "initial_topic";

    @ClassRule
    public static EmbeddedKafkaRule embeddedKafka = new EmbeddedKafkaRule(1, true, 5, INITIAL_TOPIC);
    ...
}
As mentioned, it almost always works well when I run the test in IntelliJ.
Executing from IntelliJ (Run 'KafkaTopicUtilsTest') works fine.
Executing the tests via Maven with mvn clean install fails.
An explicit test execution with mvn -Dtest=KafkaTopicUtilsTest test works fine.
Has anyone faced such issues? Any clue what could be wrong?
Issue solved.
The problem was related to other test cases. Another test (not using EmbeddedKafka) was throwing an InterruptedException and checking whether the code reacted to it correctly. The interrupted state was set through a call to Thread.currentThread().interrupt(). It looks like the JVM kept the thread's interrupted state, and EmbeddedKafka reacted to it during startup.
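A minimal sketch of one way to guard against this (assuming JUnit 4, as in the question): clear any leftover interrupt flag after each test so later tests, including EmbeddedKafka startup, don't inherit it.

import org.junit.After;

public class InterruptSimulatingTest {
    // ... the test that deliberately interrupts the current thread goes here ...

    @After
    public void clearInterruptFlag() {
        // Thread.interrupted() returns the current flag AND clears it,
        // unlike Thread.currentThread().isInterrupted(), which only reads it.
        Thread.interrupted();
    }
}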
I am using Dropwizard version 0.9.2, and the configuration yml looks somewhat like this:
server:
  applicationConnectors:
    - type: http
      port: 8090
  adminConnectors:
    - type: http
      port: 8091
  requestLog:
    timeZone: UTC
    appenders:
      - type: file
        currentLogFilename: file
        threshold: ALL
        archive: true
        archivedLogFilenamePattern: some-pattern
        archivedFileCount: 5
        maxFileSize: 10MB
While executing, I get the following error:
* Unrecognized field at: server.requestLog
Did you mean?:
- adminConnectors
- adminContextPath
- adminMaxThreads
Searching around, it seems this error was known in Jackson and fixed in 2.7.3. So I upgraded Dropwizard to the latest 1.0.2, but the problem persists.
I also tried excluding Jackson explicitly and including the latest 2.8.3, which didn't help either. Any inputs on solving this issue?
The pom I tried:
<dependency>
    <groupId>io.dropwizard</groupId>
    <artifactId>dropwizard-core</artifactId>
    <version>0.9.2</version>
    <exclusions>
        <exclusion>
            <groupId>io.dropwizard</groupId>
            <artifactId>dropwizard-jackson</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>io.dropwizard</groupId>
    <artifactId>dropwizard-jackson</artifactId>
    <version>1.0.2</version>
</dependency>
Logging is not part of the server configuration.
server:
  applicationConnectors:
    - type: http
      port: 8090
  adminConnectors:
    - type: http
      port: 8091
logging:
  level: INFO
  loggers:
    requestLog: INFO
  appenders:
Use "logging" instead
When using the RTC SDK normally in an application, I can turn off the logging in that layer using Log4j with the following code:
// Only show warnings for IBM dependencies
Logger.getLogger("com.ibm").setLevel(Level.WARN);
Logger.getLogger("com.ibm").setAdditivity(false);
Logger.getRootLogger().setLevel(Level.DEBUG);
When trying to convert over to Spring Boot, I add just the basic Spring Boot package and I get all sorts of debug information from the RTC SDK, even if I have only the root logger set to FATAL and no settings anywhere else for logging.
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
    <version>1.3.2.RELEASE</version>
</dependency>
As soon as I add the dependency (without even having the @SpringBootApplication annotation or calling SpringApplication.run(Main.class, args)), it starts spewing out RTC log information like the following:
16:14:20.161 [main] DEBUG c.i.t.r.c.i.u.InternalTeamPlatform - Thread[main,5,main]
16:14:20.164 [main] DEBUG c.i.t.r.c.i.u.InternalTeamPlatform - start asBundlefalse
16:14:20.164 [main] DEBUG c.i.t.r.c.i.u.InternalTeamPlatform - set start true
16:14:22.387 [main] DEBUG c.i.t.r.t.client.RemoteTeamServer - Entering setCredentials(userid=, password=)
16:14:22.387 [main] DEBUG c.i.t.r.t.client.RemoteTeamServer - Entering closeHttpClient
16:14:22.387 [main] DEBUG c.i.t.r.t.client.RemoteTeamServer - Value of _httpclient: null
16:14:22.408 [main] DEBUG c.i.t.r.t.client.RemoteTeamServer - httpclient already closed
16:14:22.410 [main] DEBUG c.i.t.r.t.client.RemoteTeamServer - Entering createTeamService
16:14:22.410 [main] DEBUG c.i.t.r.t.client.RemoteTeamServer - creating RemoteTeamService from com.ibm.team.repository.common.internal.IRepositoryRemoteService
16:14:22.420 [main] DEBUG c.i.t.r.t.client.RemoteTeamServer - Entering createTeamService
16:14:22.420 [main] DEBUG c.i.t.r.t.client.RemoteTeamServer - creating RemoteTeamService from com.ibm.team.repository.common.service.IQueryService
16:14:22.424 [main] DEBUG c.i.t.r.t.client.RemoteTeamServer - Entering createTeamService
16:14:22.424 [main] DEBUG c.i.t.r.t.client.RemoteTeamServer - creating RemoteTeamService from com.ibm.team.repository.common.service.IExternalUserRegistryService
My question is, how can I turn this excess logging off? It is quite annoying and not useful to me.
As my colleague suggested in his comment, you have to include this inside your pom, underneath the dependency tag:
<exclusions>
    <exclusion>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-logging</artifactId>
    </exclusion>
</exclusions>
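Applied to the spring-boot-starter-web dependency shown in the question, the complete block would look like this. Excluding spring-boot-starter-logging removes Spring Boot's default Logback setup, so the application's existing Log4j configuration takes effect; whether an SLF4J-to-Log4j bridge is still needed depends on the rest of the classpath:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
    <version>1.3.2.RELEASE</version>
    <exclusions>
        <exclusion>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-logging</artifactId>
        </exclusion>
    </exclusions>
</dependency>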