To write data to HDFS, I added flink-connector-filesystem_2.11 to my POM so that I can use BucketingSink.
When I submit my jar to the Flink cluster, it does write some messages to HDFS. However, after a few minutes, the following exception is thrown.
Running jar tvf show-event-to-kafka/target/show-event-to-kafka-1.0-SNAPSHOT.jar | grep HdfsConstants.class shows that HdfsConstants does exist in the jar.
How can I fix it?
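For reference, the sink is wired up roughly like this (a minimal sketch with a hypothetical HDFS path and a dummy source, assuming the BucketingSink API from flink-connector-filesystem 1.6; the real job reads show events from Kafka):
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.fs.bucketing.BucketingSink;
import org.apache.flink.streaming.connectors.fs.bucketing.DateTimeBucketer;

public class ShowEventToHdfsSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Dummy source standing in for the Kafka consumer.
        DataStream<String> events = env.fromElements("event-1", "event-2");

        // Bucket part files by date under a hypothetical HDFS base path.
        BucketingSink<String> sink = new BucketingSink<>("hdfs://namenode:8020/flink/show-events");
        sink.setBucketer(new DateTimeBucketer<>("yyyy-MM-dd"));
        sink.setBatchSize(128 * 1024 * 1024); // roll part files at roughly 128 MB

        events.addSink(sink);
        env.execute("show-event-to-hdfs");
    }
}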
StackTrace:
TimerException{java.io.IOException: DataStreamer Exception: }
at org.apache.flink.streaming.runtime.tasks.SystemProcessingTimeService$TriggerTask.run(SystemProcessingTimeService.java:288)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: DataStreamer Exception:
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:695)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.hdfs.protocol.HdfsConstants
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1413)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587)
POM:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.mafengwo.recommend.rtp</groupId>
<artifactId>parent-project</artifactId>
<packaging>pom</packaging>
<version>1.0-SNAPSHOT</version>
<modules>
<module>quick-start</module>
<module>flink-template</module>
<module>feature-calculate</module>
<module>show-event-to-kafka</module>
<module>monitor</module>
<module>page-event-to-redis</module>
</modules>
<properties>
<applicationName>recommend-rtp</applicationName>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<flink.version>1.6.0</flink.version>
<scala.binary.version>2.11</scala.binary.version>
<java.version>1.8</java.version>
<maven.compiler.source>${java.version}</maven.compiler.source>
<maven.compiler.target>${java.version}</maven.compiler.target>
</properties>
<dependencies>
<!-- flink -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-core</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka-0.8_2.11</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-filesystem_2.11</artifactId>
<version>${flink.version}</version>
</dependency>
<!-- json -->
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.8.5</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.datatype</groupId>
<artifactId>jackson-datatype-jsr310</artifactId>
<version>2.9.7</version>
</dependency>
<!-- test -->
<dependency>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter-engine</artifactId>
<version>5.1.0</version>
<scope>test</scope>
</dependency>
<!-- log -->
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-api</artifactId>
<version>2.11.1</version>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-core</artifactId>
<version>2.11.1</version>
</dependency>
<!-- redis -->
<dependency>
<groupId>redis.clients</groupId>
<artifactId>jedis</artifactId>
<version>2.9.0</version>
</dependency>
<!-- statistic -->
<dependency>
<groupId>com.tdunning</groupId>
<artifactId>t-digest</artifactId>
<version>3.1</version>
</dependency>
<dependency>
<groupId>com.mafengwo.recommend</groupId>
<artifactId>common</artifactId>
<version>1.30</version>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-text</artifactId>
<version>1.6</version>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<version>1.16.18</version>
<scope>provided</scope>
</dependency>
</dependencies>
<build>
<resources>
<resource>
<!-- Replace ${...} placeholders in .properties and .xml files with the values defined in this POM -->
<directory>src/main/resources</directory>
<filtering>true</filtering>
<includes>
<include>**/*.properties</include>
<include>**/*.xml</include>
</includes>
</resource>
</resources>
<plugins>
<!-- We use the maven-shade plugin to create a fat jar that contains all necessary dependencies. -->
<!-- Change the value of <mainClass>...</mainClass> if your program entry point changes. -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.0.0</version>
<executions>
<!-- Run shade goal on package phase -->
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<artifactSet>
<excludes>
<exclude>org.apache.flink:force-shading</exclude>
<exclude>com.google.code.findbugs:jsr305</exclude>
<exclude>org.slf4j:*</exclude>
<exclude>log4j:*</exclude>
</excludes>
</artifactSet>
<filters>
<filter>
<!-- Do not copy the signatures in the META-INF folder.
Otherwise, this might cause SecurityExceptions when using the JAR. -->
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
<transformers>
<transformer
implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<mainClass>
com.mafengwo.FlinkEntry
</mainClass>
</transformer>
</transformers>
</configuration>
</execution>
</executions>
<configuration>
<createDependencyReducedPom>false</createDependencyReducedPom>
</configuration>
</plugin>
</plugins>
</build>
<distributionManagement>
<repository>
<id>nexus-releases</id>
<name>Nexus Release Repository</name>
<url>https://nexus.mfwdev.com/repository/recommend-release/</url>
</repository>
<snapshotRepository>
<id>nexus-snapshots</id>
<name>Nexus Snapshots Repository</name>
<url>https://nexus.mfwdev.com/repository/recommend-snapshots/</url>
</snapshotRepository>
</distributionManagement>
<repositories>
<repository>
<id>nexus-releases</id>
<name>Nexus Release Repository</name>
<url>https://nexus.mfwdev.com/repository/recommend-release/</url>
</repository>
</repositories>
</project>
I solved this problem by explicitly specifying the checkpoint configuration in flink-conf.yaml. The version of Flink is 1.8.2.
state.backend: filesystem
state.checkpoints.dir: hdfs://<ip>:<port>/flink-checkpoints
state.savepoints.dir: hdfs://<ip>:<port>/flink-checkpoints
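For comparison, a minimal sketch of the equivalent programmatic setup (hypothetical HDFS address and checkpoint interval; assuming the FsStateBackend API available in Flink 1.8):
import org.apache.flink.runtime.state.filesystem.FsStateBackend;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class CheckpointConfigSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Same effect as state.backend: filesystem plus state.checkpoints.dir in flink-conf.yaml,
        // using a hypothetical NameNode address.
        env.setStateBackend(new FsStateBackend("hdfs://namenode:8020/flink-checkpoints"));

        // Checkpointing must also be enabled; 60 seconds is an arbitrary example interval.
        env.enableCheckpointing(60_000);

        env.fromElements(1, 2, 3).print();
        env.execute("checkpoint-config-sketch");
    }
}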
Related
I have a problem downloading cas-server-core from https://mvnrepository.com/. When I try to add the repository https://mvnrepository.com, Maven ignores it and tries to download the artifact from Artifactory, where it isn't present. My pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.dhl.cas</groupId>
<artifactId>cas</artifactId>
<packaging>war</packaging>
<name>cas</name>
<version>3.0.10.0</version>
<parent>
<groupId>com.cleverlance.dhl.spc</groupId>
<artifactId>spc-parent</artifactId>
<version>0.3</version>
</parent>
<scm>
<developerConnection>
scm:svn:https://teamforge.dhl.com/svn/repos/SPL_APPL_TOOLBOX/cas-server/src/tags/cas-3.0.10.0
</developerConnection>
<connection>scm:svn:https://teamforge.dhl.com/svn/repos/SPL_APPL_TOOLBOX/cas-server/src/tags/cas-3.0.10.0
</connection>
<url>https://teamforge.dhl.com/svn/repos/SPL_APPL_TOOLBOX/cas-server/src/tags/cas-3.0.10.0</url>
</scm>
<prerequisites>
<maven>3.0.4</maven>
</prerequisites>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<hibernate.version>4.1.0.Final</hibernate.version>
<spring.version>4.2.3.RELEASE</spring.version>
<cas.version>3.5.0</cas.version>
<netbeans.hint.deploy.server>Tomcat</netbeans.hint.deploy.server>
<ojdbc7>12.1.0.1.0</ojdbc7>
<authentication.core.version>1.0.9.0</authentication.core.version>
<mysql.connector.version>5.1.44</mysql.connector.version>
</properties>
<build>
<resources>
<resource>
<filtering>false</filtering>
<directory>${basedir}/src/main/java</directory>
<excludes>
<exclude>**/*.java</exclude>
</excludes>
</resource>
<resource>
<filtering>true</filtering>
<directory>${basedir}/src/main/resources</directory>
<includes>
<include>dct.properties</include>
</includes>
</resource>
<resource>
<filtering>false</filtering>
<directory>${basedir}/src/main/resources</directory>
</resource>
</resources>
<testResources>
<testResource>
<filtering>false</filtering>
<directory>${basedir}/src/test/resources</directory>
</testResource>
</testResources>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-javadoc-plugin</artifactId>
<version>2.9.1</version>
<configuration>
<sourcepath>${basedir}/dummy</sourcepath>
</configuration>
</plugin>
<!-- add src/main/generated for maven -->
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<version>1.7</version>
<executions>
<execution>
<phase>generate-sources</phase>
<goals>
<goal>add-source</goal>
</goals>
<configuration>
<sources>
<source>src/main/generated</source>
</sources>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
<pluginManagement>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<version>2.0.2</version>
<inherited>true</inherited>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.6</source>
<target>1.6</target>
<encoding>UTF-8</encoding>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>2.1.1</version>
<configuration>
<warName>cas</warName>
<webResources>
<resource>
<directory>${basedir}/src/main/webapp</directory>
<filtering>true</filtering>
</resource>
</webResources>
<nonFilteredFileExtensions>
<nonFilteredFileExtension>xls</nonFilteredFileExtension>
</nonFilteredFileExtensions>
<overlays>
<overlay>
<groupId>org.jasig.cas</groupId>
<artifactId>cas-server-webapp</artifactId>
<excludes>
<!-- <exclude>**/spring-configuration,**/unused-spring-configuration,**/cas-servlet.xml</exclude> -->
<exclude>WEB-INF/unused-spring-configuration/**</exclude>
<exclude>WEB-INF/spring-configuration/**</exclude>
<exclude>**/cas-servlet.xml</exclude>
<exclude>**/deployerConfigContext.xml</exclude>
<exclude>**/login-webflow.xml</exclude>
<exclude>**/log4j.xml</exclude>
</excludes>
</overlay>
</overlays>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.9</version>
<configuration>
<argLine>-Duser.timezone=Etc/GMT</argLine>
</configuration>
</plugin>
<!-- <plugin> -->
<!-- <groupId> org.jasig.cas</groupId> -->
<!-- <artifactId>cas-server-webapp</artifactId> -->
<!-- <configuration> -->
<!-- <skip>true</skip> -->
<!-- </configuration> -->
<!-- </plugin> -->
</plugins>
</pluginManagement>
</build>
<dependencies>
<!-- Spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-core</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-jdbc</artifactId>
</dependency>
<dependency>
<groupId>com.dhl.resp</groupId>
<artifactId>authentication-core</artifactId>
<version>${authentication.core.version}</version>
</dependency>
<!-- test -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<scope>test</scope>
</dependency>
<!-- hibernate -->
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-core</artifactId>
<version>${hibernate.version}</version>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-entitymanager</artifactId>
<version>${hibernate.version}</version>
</dependency>
<!-- CAS SSO -->
<dependency>
<groupId>org.jasig.cas</groupId>
<artifactId>cas-server-webapp</artifactId>
<type>war</type>
</dependency>
<dependency>
<groupId>com.github.inspektr</groupId>
<artifactId>inspektr-audit</artifactId>
<version>1.0.7.GA</version>
</dependency>
<dependency>
<groupId>com.github.inspektr</groupId>
<artifactId>inspektr-common</artifactId>
<version>1.0.7.GA</version>
</dependency>
<dependency>
<groupId>com.github.inspektr</groupId>
<artifactId>inspektr-support-spring</artifactId>
<version>1.0.7.GA</version>
</dependency>
<dependency>
<groupId>org.springframework.security</groupId>
<artifactId>spring-security-cas</artifactId>
<version>${spring.version}</version>
</dependency>
<!-- JSP -->
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
</dependency>
<dependency>
<groupId>jstl</groupId>
<artifactId>jstl</artifactId>
</dependency>
<!-- database -->
<dependency>
<groupId>commons-dbcp</groupId>
<artifactId>commons-dbcp</artifactId>
<version>1.4</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId>
</dependency>
<!-- oracle driver -->
<dependency>
<groupId>com.oracle</groupId>
<artifactId>ojdbc7</artifactId>
<version>${ojdbc7}</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.hsqldb</groupId>
<artifactId>hsqldb</artifactId>
<version>2.2.9</version>
<scope>test</scope>
</dependency>
<!-- tools -->
<dependency>
<groupId>net.sf.opencsv</groupId>
<artifactId>opencsv</artifactId>
<version>2.0</version>
</dependency>
<!-- SAML -->
<dependency>
<groupId>org.jasig.cas</groupId>
<artifactId>cas-server-core</artifactId>
<version>6.5.5</version>
<exclusions>
<exclusion>
<groupId>org.opensaml</groupId>
<artifactId>opensaml</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.springframework.security</groupId>
<artifactId>spring-security-config</artifactId>
<version>${spring.version}</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.springframework.security.extensions</groupId>
<artifactId>spring-security-saml2-core</artifactId>
<version>1.0.0.RELEASE</version>
<scope>compile</scope>
</dependency>
<!-- http client utils -->
<dependency>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpclient</artifactId>
<version>4.4</version>
</dependency>
</dependencies>
<distributionManagement>
<repository>
<id>releases</id>
<url>http://czcholstc000112.prg-dc.dhl.com:8272/nexus/content/repositories/releases</url>
</repository>
<snapshotRepository>
<id>snapshots</id>
<url>http://czcholstc000112.prg-dc.dhl.com:8272/nexus/content/repositories/snapshots</url>
</snapshotRepository>
</distributionManagement>
<repositories>
<repository>
<snapshots>
<enabled>false</enabled>
</snapshots>
<id>central</id>
<name>maven-release</name>
<url>https://artifactory.dhl.com/maven-release</url>
</repository>
<repository>
<snapshots/>
<id>snapshots</id>
<name>maven-snapshot</name>
<url>https://artifactory.dhl.com/maven-snapshot</url>
</repository>
<repository>
<snapshots/>
<id>mavenrepository</id>
<name>maven-repository</name>
<url>https://mvnrepository.com/</url>
</repository>
</repositories>
<profiles>
<profile>
<id>mysql</id>
<dependencies>
<!-- mysql driver -->
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>${mysql.connector.version}</version>
</dependency>
</dependencies>
</profile>
</profiles>
</project>
Can someone give me a pointer on how to solve this issue? I thought it would be enough to add the repository in pom.xml, but it seems there is some problem that I don't understand. Thank you.
I think you have the wrong groupId:
<groupId>org.jasig.cas</groupId>
<artifactId>cas-server-core</artifactId>
<version>6.5.5</version>
For that groupId, the available versions are listed here:
https://mvnrepository.com/artifact/org.jasig.cas/cas-server-core
If you want version 6.5.5, shouldn't it be this one?
https://mvnrepository.com/artifact/org.apereo.cas/cas-server-core/6.5.5
Software versions:
Flink 1.11
Hive 1.2.1
Hadoop 2.7.1
I use flink run to submit the jar, and it fails with the following exceptions:
org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:116)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:78)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:192)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:185)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:179)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:503)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:386)
at sun.reflect.GeneratedMethodAccessor70.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:284)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:199)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor$$Lambda$98/618592213.apply(Unknown Source)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: org/apache/orc/storage/ql/exec/vector/VectorizedRowBatch
at org.apache.flink.orc.nohive.OrcNoHiveSplitReaderUtil.genPartColumnarRowReader(OrcNoHiveSplitReaderUtil.java:67)
at org.apache.flink.connectors.hive.read.HiveVectorizedOrcSplitReader.<init>(HiveVectorizedOrcSplitReader.java:67)
at org.apache.flink.connectors.hive.read.HiveTableInputFormat.open(HiveTableInputFormat.java:137)
at org.apache.flink.connectors.hive.read.HiveTableInputFormat.open(HiveTableInputFormat.java:66)
at org.apache.flink.streaming.api.functions.source.InputFormatSourceFunction.run(InputFormatSourceFunction.java:85)
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:100)
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:63)
at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:213)
Caused by: java.lang.ClassNotFoundException: org.apache.orc.storage.ql.exec.vector.VectorizedRowBatch
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at org.apache.flink.util.FlinkUserCodeClassLoader.loadClassWithoutExceptionHandling(FlinkUserCodeClassLoader.java:61)
at org.apache.flink.util.ChildFirstClassLoader.loadClassWithoutExceptionHandling(ChildFirstClassLoader.java:74)
at org.apache.flink.util.FlinkUserCodeClassLoader.loadClass(FlinkUserCodeClassLoader.java:48)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 8 more
I added the following dependency in Maven:
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-storage-api</artifactId>
<version>2.0.0</version>
</dependency>
The program still reports the error, and I am not quite clear about what caused it.
My POM configuration is as follows:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.credit.analyze</groupId>
<artifactId>fast-analyze</artifactId>
<version>0.1</version>
<packaging>jar</packaging>
<name>Fast Analyze Job</name>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<flink.version>1.11.0</flink.version>
<java.version>1.8</java.version>
<scala.binary.version>2.11</scala.binary.version>
<maven.compiler.source>${java.version}</maven.compiler.source>
<maven.compiler.target>${java.version}</maven.compiler.target>
<log4j.version>2.12.1</log4j.version>
<hive.version>1.2.1</hive.version>
<hadoop.version>2.7.1</hadoop.version>
<mysql.version>5.1.16</mysql.version>
<junit.version>4.12</junit.version>
</properties>
<repositories>
<repository>
<id>apache.snapshots</id>
<name>Apache Development Snapshot Repository</name>
<url>https://repository.apache.org/content/repositories/snapshots/</url>
<releases>
<enabled>false</enabled>
</releases>
<snapshots>
<enabled>true</enabled>
</snapshots>
</repository>
</repositories>
<dependencies>
<!-- Apache Flink dependencies -->
<!-- These dependencies are provided, because they should not be packaged into the JAR file. -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>${flink.version}</version>
<!--<scope>provided</scope>-->
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
<!--<scope>provided</scope>-->
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
<!--<scope>provided</scope>-->
</dependency>
<!-- Add connector dependencies here. They must be in the default scope (compile). -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-api-java-bridge_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
<!--<scope>provided</scope>-->
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-planner_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<!-- or.. (for the new Blink planner) -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-planner-blink_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-scala_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
<!--<scope>provided</scope>-->
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-hive_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
<!--<scope>provided</scope>-->
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-api-java-bridge_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
<!--<scope>provided</scope>-->
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-exec</artifactId>
<version>${hive.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
<version>${hadoop.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-jdbc_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>${mysql.version}</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>${junit.version}</version>
</dependency>
<dependency>
<groupId>org.apache.orc</groupId>
<artifactId>orc-core</artifactId>
<version>1.6.5</version>
</dependency>
<!-- Add logging framework, to produce console output when running in the IDE. -->
<!-- These dependencies are excluded from the application JAR by default. -->
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-slf4j-impl</artifactId>
<version>${log4j.version}</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-api</artifactId>
<version>${log4j.version}</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-core</artifactId>
<version>${log4j.version}</version>
<scope>runtime</scope>
</dependency>
</dependencies>
<build>
<plugins>
<!-- Java Compiler -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>${java.version}</source>
<target>${java.version}</target>
</configuration>
</plugin>
<!-- We use the maven-shade plugin to create a fat jar that contains all necessary dependencies. -->
<!-- Change the value of <mainClass>...</mainClass> if your program entry point changes. -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.1.1</version>
<executions>
<!-- Run shade goal on package phase -->
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<artifactSet>
<excludes>
<exclude>org.apache.flink:force-shading</exclude>
<exclude>com.google.code.findbugs:jsr305</exclude>
<exclude>org.slf4j:*</exclude>
<exclude>org.apache.logging.log4j:*</exclude>
</excludes>
</artifactSet>
<filters>
<filter>
<!-- Do not copy the signatures in the META-INF folder.
Otherwise, this might cause SecurityExceptions when using the JAR. -->
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
<transformers>
<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<mainClass>com.credit.analyze.job.batch.market.channel.MarketMonitorHiveJob</mainClass>
</transformer>
</transformers>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
<pluginManagement>
<plugins>
<!-- This improves the out-of-the-box experience in Eclipse by resolving some warnings. -->
<plugin>
<groupId>org.eclipse.m2e</groupId>
<artifactId>lifecycle-mapping</artifactId>
<version>1.0.0</version>
<configuration>
<lifecycleMappingMetadata>
<pluginExecutions>
<pluginExecution>
<pluginExecutionFilter>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<versionRange>[3.1.1,)</versionRange>
<goals>
<goal>shade</goal>
</goals>
</pluginExecutionFilter>
<action>
<ignore/>
</action>
</pluginExecution>
<pluginExecution>
<pluginExecutionFilter>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<versionRange>[3.1,)</versionRange>
<goals>
<goal>testCompile</goal>
<goal>compile</goal>
</goals>
</pluginExecutionFilter>
<action>
<ignore/>
</action>
</pluginExecution>
</pluginExecutions>
</lifecycleMappingMetadata>
</configuration>
</plugin>
</plugins>
</pluginManagement>
</build>
</project>
I have added the following jars to the Flink lib directory:
flink-sql-connector-hive-1.2.2
hive-metastore-1.2.1.jar
hive-exec-1.2.1.jar
libfb303-0.9.2.jar
orc-core-1.4.3-nohive.jar
aircompressor-0.8.jar
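For context, the job reads the Hive ORC table roughly like this (a minimal sketch with hypothetical catalog, database, and table names, assuming the Flink 1.11 HiveCatalog and Blink planner APIs; the real entry point is MarketMonitorHiveJob):
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.catalog.hive.HiveCatalog;

public class HiveReadSketch {
    public static void main(String[] args) {
        EnvironmentSettings settings = EnvironmentSettings.newInstance()
                .useBlinkPlanner()
                .inBatchMode()
                .build();
        TableEnvironment tableEnv = TableEnvironment.create(settings);

        // Hypothetical catalog name and Hive conf dir; the Hive version matches the cluster (1.2.1).
        HiveCatalog hive = new HiveCatalog("myhive", "default", "/etc/hive/conf", "1.2.1");
        tableEnv.registerCatalog("myhive", hive);
        tableEnv.useCatalog("myhive");

        // Scanning an ORC-backed table goes through HiveTableInputFormat, which is where
        // org.apache.orc.storage.ql.exec.vector.VectorizedRowBatch is needed at runtime.
        Table result = tableEnv.sqlQuery("SELECT * FROM some_orc_table");
        result.execute().print();
    }
}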
I have solved the problem. I needed to add the following dependency in pom.xml:
<!-- flink orc nohive -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-orc-nohive_2.12</artifactId>
<version>${flink.version}</version>
</dependency>
Your error clearly indicates the missing class; make sure you have the proper orc-core jar imported as a dependency:
Caused by: java.lang.ClassNotFoundException: org.apache.orc.storage.ql.exec.vector.VectorizedRowBatch
You should add the following artifact:
<dependency>
<groupId>org.apache.orc</groupId>
<artifactId>orc-core</artifactId>
<version>1.6.5</version>
</dependency>
I have tried all the available versions, but I am still not able to resolve this.
[ERROR] -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:2.3.2:compile (default-compile) on project projectname: Compilation failure
error: cannot access ServiceFeatureTable
pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.myfarmony</groupId>
<artifactId>farmsystem</artifactId>
<version>0.0.1-SNAPSHOT</version>
<properties>
<zk.version>8.5.0</zk.version>
<commons-io>1.3.1</commons-io>
<log4j.version>1.2.17</log4j.version>
<maven.build.timestamp.format>yyyy-MM-dd</maven.build.timestamp.format>
<packname>-${project.version}-FL-${maven.build.timestamp}</packname>
<spring.version>3.1.2.RELEASE</spring.version>
</properties>
<packaging>war</packaging>
<name>The farmpro Project</name>
<description>The farmpro Project</description>
<repositories>
<repository>
<id>ZK EVAL</id>
<name>ZK Evaluation Repository</name>
<url>http://mavensync.zkoss.org/eval</url>
</repository>
<repository>
<id>esri</id>
<name>Geotools repository arcgis-java </name>
<url>https://dl.bintray.com/esri/arcgis/</url>
</repository>
<repository>
<id>osgeo</id>
<name>Geotools repository</name>
<url>http://download.osgeo.org/webdav/geotools</url>
</repository>
</repositories>
<licenses>
<license>
<name>GNU LESSER GENERAL PUBLIC LICENSE, Version 3</name>
<url>http://www.gnu.org/licenses/lgpl.html</url>
<distribution>repo</distribution>
</license>
</licenses>
<dependencies>
<dependency>
<groupId>com.myfarmony</groupId>
<artifactId>comfarmpro</artifactId>
<version>1.0.0</version>
</dependency>
<dependency>
<groupId>com.myfarmony</groupId>
<artifactId>farmtheme</artifactId>
<version>1.0</version>
</dependency>
<dependency>
<groupId>com.myfarmony</groupId>
<artifactId>farmadmintheme</artifactId>
<version>1.0</version>
</dependency>
<dependency>
<groupId>calendar</groupId>
<artifactId>colorbox</artifactId>
<version>4.0.0</version>
</dependency>
<!-- ZK Calendar -->
<!-- My Sql -->
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>5.1.6</version>
</dependency>
<!-- jpeg Reader dependency for CMYK image -->
<dependency>
<groupId>org.apache.sanselan</groupId>
<artifactId>sanselan</artifactId>
<version>0.97-incubator</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
<version>2.5</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.itextpdf</groupId>
<artifactId>itextpdf</artifactId>
<version>5.5.9</version>
</dependency>
<dependency>
<groupId>com.itextpdf.tool</groupId>
<artifactId>xmlworker</artifactId>
<version>5.5.9</version>
</dependency>
<dependency>
<groupId>org.quartz-scheduler</groupId>
<artifactId>quartz</artifactId>
<version>1.8.6</version>
</dependency>
<!-- https://mvnrepository.com/artifact/commons-io/commons-io -->
<dependency>
<groupId>commons-io</groupId>
<artifactId>commons-io</artifactId>
<version>2.5</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.esri.arcgisruntime/arcgis-java -->
<dependency>
<groupId>com.esri.arcgisruntime</groupId>
<artifactId>arcgis-java</artifactId>
<version>100.7.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.geotools/gt-shapefile -->
<dependency>
<groupId>org.geotools</groupId>
<artifactId>gt-shapefile</artifactId>
<version>18.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.geotools/gt-data -->
<dependency>
<groupId>org.geotools</groupId>
<artifactId>gt-data</artifactId>
<version>18.0</version>
</dependency>
<dependency>
<groupId>org.gdal</groupId>
<artifactId>gdal</artifactId>
<version>1.11.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.geotools/gt-geojson -->
<dependency>
<groupId>org.geotools</groupId>
<artifactId>gt-geojson</artifactId>
<version>18.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.esri.geometry/esri-geometry-api -->
<dependency>
<groupId>com.esri.geometry</groupId>
<artifactId>esri-geometry-api</artifactId>
<version>2.2.3</version>
</dependency>
<dependency>
<groupId>org.zkoss.chart</groupId>
<artifactId>zkcharts</artifactId>
<version>3.0.3-Eval</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-framework-bom</artifactId>
<version>4.0.3.RELEASE</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
<build>
<finalName>${project.artifactId}</finalName>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>2.6</version>
<executions>
<execution>
<id>purge-local-dependencies</id>
<phase>clean</phase>
<goals>
<goal>purge-local-repository</goal>
</goals>
<configuration>
<manualIncludes>
<manualInclude>org.zkoss.zk:zkex</manualInclude>
<manualInclude>org.zkoss.zk:zkmax</manualInclude>
</manualIncludes>
</configuration>
</execution>
</executions>
</plugin>
<!-- Run with Tomcat -->
<plugin>
<groupId>org.apache.tomcat.maven</groupId>
<artifactId>tomcat7-maven-plugin</artifactId>
<version>2.0</version>
<configuration>
<path>/</path>
<port>8080</port>
<contextFile>${basedir}/src/main/resources/context.xml</contextFile>
</configuration>
</plugin>
<!-- Compile java -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.3.2</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
<!-- Build war -->
<plugin>
<artifactId>maven-war-plugin</artifactId>
<groupId>org.apache.maven.plugins</groupId>
<version>2.1.1</version>
</plugin>
<!-- Pack zips -->
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<version>2.2</version>
<executions>
<execution>
<id>webapp</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
<configuration>
<finalName>farmpro${packname}</finalName>
<appendAssemblyId>false</appendAssemblyId>
<descriptors>
<descriptor>src/main/assembly/webapp.xml</descriptor>
</descriptors>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
Feature Controller:
import com.esri.arcgisruntime.data.ServiceFeatureTable;
import com.esri.arcgisruntime.layers.FeatureLayer;
public class FeatureController {
public static void main(String []args) {
ServiceFeatureTable featureTable = new ServiceFeatureTable("url");
FeatureLayer featureLayer = new FeatureLayer(featureTable);
}
}
I am trying to get a very simple JavaFX application to run with maven and Java 10 in IntelliJ.
The project:
https://github.com/ClanWolf/C3-Client_Phoenix
The structure:
The module-info.java:
module net.clanwolf.c3.client {
requires javafx.graphics;
requires javafx.fxml;
requires javafx.controls;
requires javafx.base;
requires org.apache.logging.log4j;
exports net.clanwolf.c3.client;
}
The pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>net.clanwolf.c3</groupId>
<artifactId>C3-Client_Phoenix</artifactId>
<version>4.6.2</version>
<packaging>jar</packaging>
<name>C3-Client Phoenix</name>
<url>http://c3.clanwolf.net</url>
<organization>
<name>ClanWolf W-7</name>
<url>http://www.clanwolf.net</url>
</organization>
<description>Starsystem map of the Inner Sphere, Periphery and Clan space (BattleTech).</description>
<prerequisites>
<maven>3.5.3</maven>
<!--<maven>3.3.9</maven>-->
</prerequisites>
<!-- Repositories ############################################################################################## -->
<repositories>
<repository>
<id>mvnrepository</id>
<name>mvnrepository</name>
<url>http://www.mvnrepository.com</url>
</repository>
<repository>
<id>mvncentral</id>
<name>mvncentral</name>
<url>http://central.maven.org/maven2/</url>
</repository>
</repositories>
<pluginRepositories>
<pluginRepository>
<id>oss-sonatype-snapshots</id>
<url>https://oss.sonatype.org/content/groups/public/</url>
<snapshots>
<enabled>true</enabled>
</snapshots>
</pluginRepository>
</pluginRepositories>
<!-- Properties ################################################################################################ -->
<properties>
<!-- __________________________________________________________ Versions -->
<java.version>10</java.version>
<maven.compiler.plugin.version>3.7.0</maven.compiler.plugin.version>
<maven.surefire.plugin.version>2.22.0</maven.surefire.plugin.version>
<junit.version>4.12</junit.version>
<junit.jupiter.version>5.2.0</junit.jupiter.version>
<junit.platform.surefire.provider.version>1.2.0</junit.platform.surefire.provider.version>
<asm.version>6.2</asm.version>
<!-- __________________________________________________________ Encoding -->
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<mainClass>net.clanwolf.c3.client.MainFrame</mainClass>
<maven.compiler.source>${java.version}</maven.compiler.source>
<maven.compiler.target>${java.version}</maven.compiler.target>
</properties>
<!-- Dependencies ############################################################################################## -->
<dependencies>
<!-- _____________________________________________________________ Maven -->
<dependency>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>${maven.compiler.plugin.version}</version>
</dependency>
<!-- _____________________________________________________________ JUnit -->
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>${junit.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter-api</artifactId>
<version>${junit.jupiter.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter-params</artifactId>
<version>${junit.jupiter.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apiguardian</groupId>
<artifactId>apiguardian-api</artifactId>
<version>1.0.0</version>
<scope>test</scope>
</dependency>
<!-- ___________________________________________________________ Logging -->
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-api</artifactId>
<version>2.11.0</version>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-core</artifactId>
<version>2.11.0</version>
</dependency>
<!-- _______________________________________________ Tektosyne / Voronoi -->
<dependency>
<groupId>org.kynosarges</groupId>
<artifactId>tektosyne</artifactId>
<version>6.2.0</version>
</dependency>
<!-- ______________________________________________________ C3-Preloader -->
<dependency>
<groupId>net.clanwolf</groupId>
<artifactId>C3-Preloader</artifactId>
<version>1.0.0</version>
</dependency>
<!-- ____________________________________________________________ Nadron -->
<!--<dependency>-->
<!--<groupId>com.github.menacher</groupId>-->
<!--<artifactId>nadron</artifactId>-->
<!--<version>0.8-SNAPSHOT</version>-->
<!--</dependency>-->
<!--<dependency>-->
<!--<groupId>com.github.menacher</groupId>-->
<!--<artifactId>nadclient</artifactId>-->
<!--<version>0.8-SNAPSHOT</version>-->
<!--</dependency>-->
<!-- _________________________________________________________ Hibernate -->
<!--<dependency>-->
<!--<groupId>org.hibernate.javax.persistence</groupId>-->
<!--<artifactId>hibernate-jpa-2.1-api</artifactId>-->
<!--<version>1.0.0.Final</version>-->
<!--</dependency>-->
<!--<dependency>-->
<!--<groupId>org.hibernate</groupId>-->
<!--<artifactId>hibernate-core</artifactId>-->
<!--<version>5.0.3.Final</version>-->
<!--</dependency>-->
<!-- ___________________________________________________________________________ C H E C K I F N E E D E D -->
<!--<dependency>-->
<!--<groupId>net.sourceforge.collections</groupId>-->
<!--<artifactId>collections-generic</artifactId>-->
<!--<version>4.01</version>-->
<!--</dependency>-->
<!--<dependency>-->
<!--<groupId>com.google.code.gson</groupId>-->
<!--<artifactId>gson</artifactId>-->
<!--<version>2.4</version>-->
<!--</dependency>-->
<!--<dependency>-->
<!--<groupId>commons-codec</groupId>-->
<!--<artifactId>commons-codec</artifactId>-->
<!--<version>1.10</version>-->
<!--</dependency>-->
<!--<dependency>-->
<!--<groupId>dom4j</groupId>-->
<!--<artifactId>dom4j</artifactId>-->
<!--<version>1.6.1</version>-->
<!--</dependency>-->
<!--<dependency>-->
<!--<groupId>commons-net</groupId>-->
<!--<artifactId>commons-net</artifactId>-->
<!--<version>3.3</version>-->
<!--</dependency>-->
</dependencies>
<!-- Build ##################################################################################################### -->
<build>
<resources>
<resource>
<directory>src/main/resources</directory>
<filtering>true</filtering>
<includes>
<include>**/version.number</include>
</includes>
</resource>
<resource>
<directory>src/main/resources</directory>
<filtering>false</filtering>
<excludes>
<exclude>**/version.number</exclude>
</excludes>
</resource>
</resources>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>${maven.compiler.plugin.version}</version>
<configuration>
<source>${java.version}</source>
<target>${java.version}</target>
</configuration>
<dependencies>
<dependency>
<groupId>org.ow2.asm</groupId>
<artifactId>asm</artifactId>
<version>${asm.version}</version>
</dependency>
</dependencies>
</plugin>
<plugin>
<artifactId>maven-surefire-plugin</artifactId>
<version>${maven.surefire.plugin.version}</version>
<dependencies>
<dependency>
<groupId>org.junit.platform</groupId>
<artifactId>junit-platform-surefire-provider</artifactId>
<version>${junit.platform.surefire.provider.version}</version>
</dependency>
<dependency>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter-engine</artifactId>
<version>${junit.jupiter.version}</version>
</dependency>
<dependency>
<groupId>org.junit.platform</groupId>
<artifactId>junit-platform-surefire-provider</artifactId>
<version>1.2.0</version>
</dependency>
<dependency>
<groupId>org.ow2.asm</groupId>
<artifactId>asm</artifactId>
<version>${asm.version}</version>
</dependency>
</dependencies>
<configuration>
<parallel>methods</parallel>
<threadCount>10</threadCount>
<includes>
<include>**/Test*.java</include>
<include>**/*Test.java</include>
<include>**/*Tests.java</include>
<include>**/*TestCase.java</include>
</includes>
</configuration>
</plugin>
<plugin>
<groupId>com.zenjava</groupId>
<artifactId>javafx-maven-plugin</artifactId>
<version>8.9.0-SNAPSHOT</version>
<configuration>
<mainClass>net.clanwolf.c3.client.MainFrame</mainClass>
<preLoader>c3_preloader.C3_Preloader</preLoader>
<verbose>true</verbose>
<bundleArguments>
<!-- MSI installer Options -->
<!-- https://docs.oracle.com/javase/9/tools/javapackager.htm#GUID-E51F9601-E121-4A50-BCA7-C7F8730078B2__WINDOWSEXEBUNDLERARGUMENTS-26C9A39C -->
<!-- Customize MSI installer -->
<!-- http://wixtoolset.org/documentation/manual/v3/wixui/wixui_customizations.html -->
<icon>target/classes/icon.ico</icon>
<installdirChooser>true</installdirChooser>
<module-path>target/classes</module-path>
<module>net.clanwolf.c3.client</module>
<!--<add-modules>module1,module2,module3</add-modules>-->
<!--<limit-modules>module1,module2,module3</limit-modules>-->
<!--<runtime /> Not working together with the above commands -->
</bundleArguments>
<!--<jfxMainAppJarName>${project.build.finalName}.jar</jfxMainAppJarName>-->
<identifier>${project.artifactId}</identifier>
<vendor>ClanWolf.net</vendor>
<!-- win.app | linux.app | mac.app | exe | msi | rpm | deb -->
<bundler>msi</bundler>
<nativeReleaseVersion>${project.version}</nativeReleaseVersion>
<needShortcut>true</needShortcut>
<needMenu>true</needMenu>
<appName>${project.artifactId}</appName>
<jvmArgs>
<jvmArg>-Xmx2g</jvmArg>
<jvmArg>-Djavafx.verbose=true</jvmArg>
<!--<jvmProperty>-DMyProperty=true</jvmProperty>-->
</jvmArgs>
<!--<jvmProperties>-->
<!--<UserProperty>foo</UserProperty>-->
<!--</jvmProperties>-->
<!--<userJvmArgs>-->
<!--<Argument3>AppCommand</Argument3>-->
<!--</userJvmArgs>-->
<!--<keyStoreAlias>example-user</keyStoreAlias>-->
<!--<keyStorePassword>example-password</keyStorePassword>-->
<allPermissions>false</allPermissions>
<manifestAttributes>
<Specification-Title>${project.name}</Specification-Title>
<Specification-Version>${project.version}</Specification-Version>
<Specification-Vendor>${project.organization.name}</Specification-Vendor>
<Implementation-Title>${project.name}</Implementation-Title>
<Implementation-Version>${project.version}</Implementation-Version>
<Implementation-Vendor-Id>${project.groupId}</Implementation-Vendor-Id>
<Implementation-Vendor>${project.organization.name}</Implementation-Vendor>
</manifestAttributes>
</configuration>
<executions>
<execution>
<id>create-jfxjar</id>
<phase>package</phase>
<goals>
<goal>build-jar</goal>
</goals>
</execution>
<execution>
<id>create-native</id>
<phase>package</phase>
<goals>
<goal>build-native</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
This does not run. The maven build completes without problems, but if I start it in IntelliJ, it gives me this:
Error occurred during initialization of boot layer
java.lang.module.ResolutionException: Modules maven.core and maven.artifact export package org.apache.maven.artifact.resolver.filter to module aether.impl
If I remove the dependency on log4j-core, it runs, but then it complains at runtime that there is no log4j implementation and that I should add log4j-core. If I add it back, the resolution error appears again. How can this be resolved, if it is possible at all?
Java 9 modularization came with a couple of new rules. For example, split packages, i.e. classes in the same package but in different jars, are not allowed. This is only an issue when working on the module path, not with the traditional classpath.
It seems that IntelliJ decides, for the wrong reason, to switch to the module path, probably because log4j has a module descriptor.
In the end, the problem was that maven-compiler-plugin was listed in the pom.xml as a dependency. That is not necessary, since it is only used during the build. After I removed it, the errors went away.
I'm trying to extend the business logic of a Flink process using the Reflections library. I'm stuck with this exception at runtime.
Has anybody experienced the same issue? I suspect a Maven conflict, but I have no idea. I have pasted the exception and the POM file below.
The pom file:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>it.almaviva.wtf.mms</groupId>
<artifactId>mms-integretedmobilitystatusevent-flink-process</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>Passing Event Process</name>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<flink.version>1.4.0</flink.version>
<!--<slf4j.version>1.7.7</slf4j.version>-->
<log4j.version>1.2.17</log4j.version>
<scala.binary.version>2.11</scala.binary.version>
<junit.version>4.12</junit.version>
<cassandra.version>3.2.0</cassandra.version>
<flink.connector.elastic.version>1.4.0</flink.connector.elastic.version>
<thyco.compiler.version>0.21.0</thyco.compiler.version>
<jackson.version>2.9.4</jackson.version>
<cassandra.unit.version>3.3.0.2</cassandra.unit.version>
</properties>
<repositories>
<repository>
<id>apache.snapshots</id>
<name>Apache Development Snapshot Repository</name>
<url>https://repository.apache.org/content/repositories/snapshots/</url>
<releases>
<enabled>false</enabled>
</releases>
<snapshots>
</snapshots>
</repository>
</repositories>
<!-- Execute "mvn clean package -Pbuild-jar" to build a jar file out of
this project! How to use the Flink Quickstart pom: a) Adding new dependencies:
You can add dependencies to the list below. Please check if the maven-shade-plugin
below is filtering out your dependency and remove the exclude from there.
b) Build a jar for running on the cluster: There are two options for creating
a jar from this project b.1) "mvn clean package" -> this will create a fat
jar which contains all dependencies necessary for running the jar created
by this pom in a cluster. The "maven-shade-plugin" excludes everything that
is provided on a running Flink cluster. b.2) "mvn clean package -Pbuild-jar"
-> This will also create a fat-jar, but with much nicer dependency exclusion
handling. This approach is preferred and leads to much cleaner jar files. -->
<dependencies>
<!-- Apache Flink dependencies -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-core</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>${flink.version}</version>
</dependency>
<!-- This dependency is required to actually execute jobs. It is currently pulled in by
flink-streaming-java, but we explicitly depend on it to safeguard against future changes. -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka-0.10_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<!-- explicitly add a standard loggin framework, as Flink does not have
a hard dependency on one specific framework by default -->
<!-- https://mvnrepository.com/artifact/ch.qos.logback/logback-classic -->
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>${log4j.version}</version>
</dependency>
<!-- TEST PURPOSE -->
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>${junit.version}</version>
</dependency>
<dependency>
<groupId>org.cassandraunit</groupId>
<artifactId>cassandra-unit-shaded</artifactId>
<version>3.3.0.2</version>
</dependency>
<!-- DATASTAX -->
<dependency>
<groupId>com.datastax.cassandra</groupId>
<artifactId>cassandra-driver-extras</artifactId>
<version>${cassandra.version}</version>
<exclusions>
<exclusion>
<groupId>io.netty</groupId>
<artifactId>*</artifactId>
</exclusion>
<exclusion>
<artifactId>netty-handler</artifactId>
<groupId>io.netty</groupId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>com.datastax.cassandra</groupId>
<artifactId>cassandra-driver-mapping</artifactId>
<version>${cassandra.version}</version>
<exclusions>
<exclusion>
<groupId>io.netty</groupId>
<artifactId>*</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-elasticsearch5_2.11</artifactId>
<version>${flink.connector.elastic.version}</version>
<scope>compile</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/org.elasticsearch/elasticsearch-core -->
<dependency>
<groupId>org.elasticsearch.client</groupId>
<artifactId>transport</artifactId>
<version>5.1.2</version>
</dependency>
<dependency>
<groupId>org.elasticsearch</groupId>
<artifactId>elasticsearch</artifactId>
<version>5.1.2</version>
</dependency>
<dependency>
<groupId>com.carrotsearch</groupId>
<artifactId>hppc</artifactId>
<version>0.7.0</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-scala_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-scala_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-cep_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-statebackend-rocksdb_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-cassandra_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>${jackson.version}</version>
</dependency>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.6.1</version>
</dependency>
<dependency>
<groupId>io.flinkspector</groupId>
<artifactId>flinkspector-datastream_2.11</artifactId>
<version>0.8.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.reflections</groupId>
<artifactId>reflections</artifactId>
<version>0.9.11</version>
<exclusions>
<exclusion>
<artifactId>guava</artifactId>
<groupId>com.google.guava</groupId>
</exclusion>
</exclusions>
</dependency>
</dependencies>
<profiles>
<profile>
<!-- Profile for packaging correct JAR files -->
<id>build-jar</id>
<activation>
<activeByDefault>false</activeByDefault>
</activation>
<dependencies>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-core</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>${log4j.version}</version>
<scope>provided</scope>
</dependency>
<!-- DTO -->
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>${jackson.version}</version>
</dependency>
<!-- APACHE CASSANDRA -->
<dependency>
<groupId>com.datastax.cassandra</groupId>
<artifactId>cassandra-driver-extras</artifactId>
<version>${cassandra.version}</version>
<exclusions>
<exclusion>
<groupId>io.netty</groupId>
<artifactId>*</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>com.datastax.cassandra</groupId>
<artifactId>cassandra-driver-mapping</artifactId>
<version>${cassandra.version}</version>
<exclusions>
<exclusion>
<groupId>io.netty</groupId>
<artifactId>*</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.reflections</groupId>
<artifactId>reflections</artifactId>
<version>0.9.11</version>
</dependency>
</dependencies>
<build>
<plugins>
<!-- disable the exclusion rules -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.1</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<artifactSet>
<excludes combine.self="override"/>
</artifactSet>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
</profiles>
<build>
<plugins>
<!-- We use the maven-shade plugin to create a fat jar that contains all
dependencies except Flink and its transitive dependencies. The resulting
fat jar can be executed on a cluster. Change the value of Program-Class if
your program entry point changes. -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.1.0</version>
<executions>
<!-- Run shade goal on package phase -->
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<artifactSet>
<excludes>
<!-- This list contains all dependencies of flink-dist. Everything
else will be packaged into the fat jar. -->
<exclude>org.apache.flink:flink-annotations</exclude>
<exclude>org.apache.flink:flink-shaded-hadoop1_2.10</exclude>
<exclude>org.apache.flink:flink-shaded-hadoop2</exclude>
<exclude>org.apache.flink:flink-shaded-curator-recipes</exclude>
<exclude>org.apache.flink:flink-core</exclude>
<exclude>org.apache.flink:flink-java</exclude>
<!-- <exclude>org.apache.flink:flink-scala_2.10</exclude> -->
<exclude>org.apache.flink:flink-runtime_2.10</exclude>
<exclude>org.apache.flink:flink-optimizer_2.10</exclude>
<exclude>org.apache.flink:flink-clients_2.10</exclude>
<exclude>org.apache.flink:flink-avro_2.10</exclude>
<exclude>org.apache.flink:flink-examples-batch_2.10</exclude>
<exclude>org.apache.flink:flink-examples-streaming_2.10</exclude>
<exclude>org.apache.flink:flink-streaming-java_2.10</exclude>
<!-- Also exclude very big transitive dependencies of Flink. WARNING:
you have to remove these excludes if your code relies on other versions of
these dependencies. -->
<exclude>org.scala-lang:scala-library</exclude>
<exclude>org.scala-lang:scala-compiler</exclude>
<exclude>org.scala-lang:scala-reflect</exclude>
<exclude>com.typesafe.akka:akka-actor_*</exclude>
<exclude>com.typesafe.akka:akka-remote_*</exclude>
<!--<exclude>io.netty:netty-all</exclude>-->
<exclude>io.netty:netty</exclude>
<exclude>commons-fileupload:commons-fileupload</exclude>
<exclude>org.apache.avro:avro</exclude>
<exclude>commons-collections:commons-collections</exclude>
<exclude>org.codehaus.jackson:jackson-core-asl</exclude>
<exclude>org.codehaus.jackson:jackson-mapper-asl</exclude>
<exclude>com.thoughtworks.paranamer:paranamer</exclude>
<exclude>org.xerial.snappy:snappy-java</exclude>
<exclude>org.apache.commons:commons-compress</exclude>
<exclude>org.tukaani:xz</exclude>
<exclude>com.esotericsoftware.kryo:kryo</exclude>
<exclude>com.esotericsoftware.minlog:minlog</exclude>
<exclude>org.objenesis:objenesis</exclude>
<exclude>com.twitter:chill_*</exclude>
<exclude>com.twitter:chill-java</exclude>
<exclude>commons-lang:commons-lang</exclude>
<exclude>junit:junit</exclude>
<exclude>org.apache.commons:commons-lang3</exclude>
<exclude>log4j:log4j</exclude>
<exclude>org.apache.commons:commons-math</exclude>
<exclude>org.apache.sling:org.apache.sling.commons.json</exclude>
<exclude>commons-logging:commons-logging</exclude>
<exclude>commons-codec:commons-codec</exclude>
<exclude>stax:stax-api</exclude>
<exclude>com.typesafe:config</exclude>
<exclude>org.uncommons.maths:uncommons-maths</exclude>
<exclude>com.github.scopt:scopt_*</exclude>
<exclude>commons-io:commons-io</exclude>
<exclude>commons-cli:commons-cli</exclude>
</excludes>
</artifactSet>
<filters>
<filter>
<artifact>org.apache.flink:*</artifact>
<excludes>
<!-- exclude shaded google but include shaded curator -->
<exclude>org/apache/flink/shaded/com/**</exclude>
<exclude>web-docs/**</exclude>
</excludes>
</filter>
<filter>
<!-- Do not copy the signatures in the META-INF folder. Otherwise,
this might cause SecurityExceptions when using the JAR. -->
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
<!-- If you want to use ./bin/flink run <quickstart jar> uncomment
the following lines. This will add a Main-Class entry to the manifest file -->
<transformers>
<transformer
implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<mainClass>it.almaviva.wtf.mms.integratemobilitystatusevent.PassingEventProcess
</mainClass>
</transformer>
</transformers>
<createDependencyReducedPom>false</createDependencyReducedPom>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.8</source>
<target>1.8</target>
<compilerId>jdt</compilerId>
</configuration>
<dependencies>
<dependency>
<groupId>org.eclipse.tycho</groupId>
<artifactId>tycho-compiler-jdt</artifactId>
<version>0.21.0</version>
</dependency>
</dependencies>
</plugin>
</plugins>
<pluginManagement>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.7.0</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
<compilerId>jdt</compilerId>
</configuration>
<dependencies>
<dependency>
<groupId>org.eclipse.tycho</groupId>
<artifactId>tycho-compiler-jdt</artifactId>
<version>0.21.0</version>
</dependency>
</dependencies>
</plugin>
<plugin>
<groupId>org.eclipse.m2e</groupId>
<artifactId>lifecycle-mapping</artifactId>
<version>1.0.0</version>
<configuration>
<lifecycleMappingMetadata>
<pluginExecutions>
<pluginExecution>
<pluginExecutionFilter>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
<versionRange>[2.4,)</versionRange>
<goals>
<goal>single</goal>
</goals>
</pluginExecutionFilter>
<action>
<ignore/>
</action>
</pluginExecution>
<pluginExecution>
<pluginExecutionFilter>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<versionRange>[3.1,)</versionRange>
<goals>
<goal>testCompile</goal>
<goal>compile</goal>
</goals>
</pluginExecutionFilter>
<action>
<ignore/>
</action>
</pluginExecution>
</pluginExecutions>
</lifecycleMappingMetadata>
</configuration>
</plugin>
</plugins>
</pluginManagement>
</build>
</project>
The error:
22:24:56.552 [main] INFO org.reflections.Reflections - Reflections took 370 ms to scan 1 urls, producing 16 keys and 52 values
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.collect.Sets$SetView.iterator()Lcom/google/common/collect/UnmodifiableIterator;
at org.reflections.Reflections.expandSuperTypes(Reflections.java:380)
at org.reflections.Reflections.<init>(Reflections.java:126)
at org.reflections.Reflections.<init>(Reflections.java:168)
at org.reflections.Reflections.<init>(Reflections.java:141)
at it.almaviva.wtf.mms.integratemobilitystatusevent.PassingEventProcess.preferredCustomLogicMethod(PassingEventProcess.java:213)
at it.almaviva.wtf.mms.integratemobilitystatusevent.PassingEventProcess.main(PassingEventProcess.java:81)
I'm pretty sure this is not a Flink-related issue. The problematic method Sets.SetView#iterator, invoked by reflections 0.9.11, comes from Guava 20.0, which you are excluding in your pom, while cassandra-driver-extras pulls in Guava 19.0 (which doesn't have this method). You can check that by executing:
mvn dependency:tree
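If the full tree is too noisy, you can narrow it to Guava; as far as I recall, the -Dincludes filter and the -Dverbose flag of the maven-dependency-plugin's tree goal make the conflicting versions (and which ones get evicted) easier to spot:
mvn dependency:tree -Dincludes=com.google.guava:guava -Dverbose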
You have to resolve this conflict between the cassandra and reflections dependencies somehow. You could try to force the Guava version to 20.0, drop the cassandra-driver-extras dependency, or find a cassandra-driver-extras version without a Guava dependency (Flink shades its own Guava dependency precisely so that Flink users do not have to deal with this problem).
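As a sketch of the first option (assuming Guava 20.0 is compatible with the Cassandra driver version you use, which you should verify against its release notes), you could pin the version in the parent pom's dependencyManagement so that every module, including the transitive pull from cassandra-driver-extras, resolves the same Guava:
<!-- Sketch only: pin Guava to 20.0 for the whole build. Assumes your
     Cassandra driver still works against Guava 20.0. -->
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
            <version>20.0</version>
        </dependency>
    </dependencies>
</dependencyManagement>
With the version managed centrally, you could also drop the guava exclusion under the reflections dependency, since Maven's dependency mediation will then pick 20.0 everywhere.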