How to resolve package conflicts among varieties of log4j? - java

I wrote this pom.xml based on a Spring Boot sample.
When I started my app, I got this error:
SLF4J: Detected both log4j-over-slf4j.jar AND bound slf4j-log4j12.jar on the class path, preempting StackOverflowError.
I'm totally new to Java and Maven and have been stuck here for quite a while.
I tried writing some exclusions inside a dependency, but it didn't work out. I have no idea which package should be excluded from which. And if I exclude it, how can the package that depends on it still work normally?
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<!-- Your own application should inherit from spring-boot-starter-parent -->
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-samples</artifactId>
<version>2.0.0.BUILD-SNAPSHOT</version>
</parent>
<artifactId>spring-boot-sample-web-secure-jdbc</artifactId>
<name>Spring Boot Web Secure JDBC Sample</name>
<description>Spring Boot Web Secure JDBC Sample</description>
<url>http://projects.spring.io/spring-boot/</url>
<organization>
<name>Pivotal Software, Inc.</name>
<url>http://www.spring.io</url>
</organization>
<properties>
<main.basedir>${basedir}/../..</main.basedir>
</properties>
<dependencies>
<!-- Compile -->
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>3.0.0-alpha2</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-security</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-thymeleaf</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-jdbc</artifactId>
</dependency>
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
</dependency>
<!-- Test -->
<dependency>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpclient</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>

This happens because spring-boot-starter is pulling in the log4j-over-slf4j dependency, and one of your other dependencies is pulling in log4j.
Run mvn dependency:tree and you should be able to find which dependency is pulling in the unwanted log4j, then exclude it with:
<exclusions>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
</exclusion>
<exclusion>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
</exclusion>
</exclusions>
It may be only one of those, depending on what you see in the mvn dependency:tree output.
If you would rather use log4j, then just exclude log4j-over-slf4j from spring-boot-starter instead.
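Applied to this particular pom, the likely source is hadoop-common, which ships its own log4j binding. A sketch of how the exclusions would look, assuming dependency:tree confirms that hadoop-common is the culprit:
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>3.0.0-alpha2</version>
<exclusions>
<!-- drop the log4j binding and log4j itself so only the Spring Boot logging stack remains -->
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
</exclusion>
<exclusion>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
</exclusion>
</exclusions>
</dependency>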

This problem is caused by an endless delegation loop between the logging facade and the logging framework:
https://www.slf4j.org/codes.html#log4jDelegationLoop
My scenario was slightly more complicated: besides the loop problem, I had two logging frameworks on the classpath.
I ended up excluding log4j-over-slf4j and logback-classic.
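A sketch of what that can look like, assuming the extra logging stack arrives via spring-boot-starter (check your own dependency:tree to find the right place for the exclusions):
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
<exclusions>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>log4j-over-slf4j</artifactId>
</exclusion>
<exclusion>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-classic</artifactId>
</exclusion>
</exclusions>
</dependency>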

I see that your pom.xml has no dependency for log4j. You should add this to your pom.xml:
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.17</version>
</dependency>

Related

Dependency conflict between Apache Spark and Spring Boot

I'm building a full-stack backtesting application for trading strategies, and I'm currently planning to use Spring Boot as the server and Apache Spark for data processing.
I tried creating Spring Boot and Apache Spark in the same project, but with no luck: I couldn't solve a dependency conflict, and I'm getting this error:
Exception in thread "main" java.lang.NoClassDefFoundError: javax/servlet/Servlet
at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:223)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:484)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2704)
at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:953)
at scala.Option.getOrElse(Option.scala:201)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:947)
at com.samurai.lab.marketdata.MarketDataApplication.main(MarketDataApplication.java:12)
Caused by: java.lang.ClassNotFoundException: javax.servlet.Servlet
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:641)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:520)
... 7 more
If I add the Hadoop dependencies, then I get a different error:
Exception in thread "main" javax.servlet.UnavailableException: Servlet class org.glassfish.jersey.servlet.ServletContainer is not a javax.servlet.Servlet
at org.sparkproject.jetty.servlet.ServletHolder.checkServletType(ServletHolder.java:514)
at org.sparkproject.jetty.servlet.ServletHolder.doStart(ServletHolder.java:386)
at org.sparkproject.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:73)
at org.sparkproject.jetty.servlet.ServletHandler.lambda$initialize$0(ServletHandler.java:749)
at java.base/java.util.stream.SortedOps$SizedRefSortingSink.end(SortedOps.java:357)
at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:510)
at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
at java.base/java.util.stream.StreamSpliterators$WrappingSpliterator.forEachRemaining(StreamSpliterators.java:310)
at java.base/java.util.stream.Streams$ConcatSpliterator.forEachRemaining(Streams.java:735)
at java.base/java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:762)
at org.sparkproject.jetty.servlet.ServletHandler.initialize(ServletHandler.java:774)
at org.sparkproject.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:379)
at org.sparkproject.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:916)
at org.sparkproject.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:288)
at org.sparkproject.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:73)
at org.apache.spark.ui.ServerInfo.addHandler(JettyUtils.scala:491)
at org.apache.spark.ui.SparkUI.$anonfun$attachAllHandler$2(SparkUI.scala:76)
at org.apache.spark.ui.SparkUI.$anonfun$attachAllHandler$2$adapted(SparkUI.scala:76)
at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:563)
at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:561)
at scala.collection.AbstractIterable.foreach(Iterable.scala:926)
at org.apache.spark.ui.SparkUI.$anonfun$attachAllHandler$1(SparkUI.scala:76)
at org.apache.spark.ui.SparkUI.$anonfun$attachAllHandler$1$adapted(SparkUI.scala:74)
at scala.Option.foreach(Option.scala:437)
at org.apache.spark.ui.SparkUI.attachAllHandler(SparkUI.scala:74)
at org.apache.spark.SparkContext.$anonfun$new$31(SparkContext.scala:648)
at org.apache.spark.SparkContext.$anonfun$new$31$adapted(SparkContext.scala:648)
at scala.Option.foreach(Option.scala:437)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:648)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2704)
at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:953)
at scala.Option.getOrElse(Option.scala:201)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:947)
at com.samurai.lab.marketdata.MarketDataApplication.main(MarketDataApplication.java:12)
Process finished with exit code 1
I tried excluding the servlet API from Spark like so:
<exclusions>
<exclusion>
<groupId>javax.servlet</groupId>
<artifactId>javax.servlet-api</artifactId>
</exclusion>
<exclusion>
<groupId>org.glassfish</groupId>
<artifactId>javax.servlet</artifactId>
</exclusion>
<exclusion>
<groupId>org.eclipse.jetty.orbit</groupId>
<artifactId>javax.servlet</artifactId>
</exclusion>
</exclusions>
It doesn't help.
This is the clean pom.xml file that I'm using right now:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>3.0.2</version>
<relativePath/> <!-- lookup parent from repository -->
</parent>
<groupId>com.samurai.lab</groupId>
<artifactId>market-data</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>market-data</name>
<description>Demo project for Spring Boot</description>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<java.version>17</java.version>
<apache-spark.version>3.3.1</apache-spark.version>
<hadoop.version>3.3.2</hadoop.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
<exclusions>
<exclusion>
<groupId>log4j</groupId>
<artifactId>*</artifactId>
</exclusion>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>*</artifactId>
</exclusion>
<exclusion>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>*</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.13</artifactId>
<version>${apache-spark.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.13</artifactId>
<version>${apache-spark.version}</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
Any ideas on how to make it work and solve the conflict?
Update v1
I have the dependency as part of spark-core:
If you use IntelliJ, you can reload all Maven projects. Also, I don't see a version for your javax.servlet dependency; first, check the project dependencies for this library.
Add a scope to your dependency and retry:
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>javax.servlet-api</artifactId>
<scope>provided</scope>
</dependency>
or with the test scope.
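To see where the servlet API is actually coming from (a generic check, assuming it is published under the javax.servlet group in your tree), you can filter the dependency tree:
mvn dependency:tree -Dincludes=javax.servlet
or simply grep the full output:
mvn dependency:tree | grep -i servlet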

How come Scala code gets compiled with Java?

I'm trying to understand how Scala code works with Java in a Java IDE. I got this doubt while working with Spark in Java, where I saw Scala packages in the code too and used their classes and methods.
My understanding is that Scala code needs the Scala compiler to be converted into Java .class files, and from there onwards the JDK does its part in the JVM to convert them into binaries and perform actions. Please correct me if I am wrong.
That said, in my Spark Java project in Eclipse, I couldn't see anywhere that a Scala compiler was being referenced.
This is my pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>spark</groupId>
<artifactId>Project</artifactId>
<version>0.0.1-SNAPSHOT</version>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.4.0</version>
<relativePath />
</parent>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<java.version>1.8</java.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
<!-- <exclusions> <exclusion> <groupId>ch.qos.logback</groupId> <artifactId>logback-classic</artifactId>
</exclusion> </exclusions> -->
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.12</artifactId>
<version>3.0.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.12</artifactId>
<version>3.0.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql-kafka-0-10_2.12</artifactId>
<version>3.0.0</version>
</dependency>
<!-- Gson -->
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.8.6</version>
<scope>provided</scope>
</dependency>
<!-- Avro Messages -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-avro_2.12</artifactId>
<version>3.1.2</version>
</dependency>
<!-- Spark MLLIB -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-mllib_2.12</artifactId>
<version>3.0.0</version>
<scope>provided</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/org.codehaus.janino/janino -->
<dependency>
<groupId>org.codehaus.janino</groupId>
<artifactId>commons-compiler</artifactId>
</dependency>
<dependency>
<groupId>org.codehaus.janino</groupId>
<artifactId>janino</artifactId>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<dependencyManagement>
<dependencies>
<!--Spark java.lang.NoClassDefFoundError: org/codehaus/janino/InternalCompilerException -->
<dependency>
<groupId>org.codehaus.janino</groupId>
<artifactId>commons-compiler</artifactId>
<version>3.0.16</version>
</dependency>
<dependency>
<groupId>org.codehaus.janino</groupId>
<artifactId>janino</artifactId>
<version>3.0.16</version>
</dependency>
</dependencies>
</dependencyManagement>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
And, for example, I am using the Scala import below in my Java class file. But I literally have no idea from which jars it is picked up and where it gets compiled into .class files for further processing.
import scala.collection.JavaConverters;
Can anyone please help me understand this magic, which is unknown to me?
Dependencies ship in class-file form. That JavaConverters class must indeed be compiled by scalac. However, the maintainers of the Scala standard library (scala-library) have done this on their hardware and shipped the compiled result to Maven Central's servers, which distributed it to all mirrors, which is how it ended up on your system's disk, and which is why you do not need scalac to use it.
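As a quick check (a generic Maven command, not specific to this project), you can ask Maven which dependency drags in the Scala standard library; it should show scala-library arriving transitively via the Spark artifacts:
mvn dependency:tree -Dincludes=org.scala-lang:scala-library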
Jars contain compiled files only; they do not contain the source files. If you are using a dependency, you are not using its source but its compiled output. So, in your case, JavaConverters is compiled too, and it is from the package scala.collection, which is part of the Scala standard library.
May I ask whether you have Scala installed on your system?

Exception when connecting to Dataproc Hive Server using Java and Spark in Eclipse

I am trying to access the Hive server in GCP Dataproc from my local machine (Eclipse) using Java and Spark, but I am getting the error below while starting the application. I tried to find the problem but was unable to solve it.
Exception in thread "main" java.lang.IllegalArgumentException: Unable to instantiate SparkSession with Hive support because Hive classes are not found.
at org.apache.spark.sql.SparkSession$Builder.enableHiveSupport(SparkSession.scala:870)
at com.hadoop.Application.main(Application.java:22)
Pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.5.1</version>
<relativePath/> <!-- lookup parent from repository -->
</parent>
<groupId>com.hadoop</groupId>
<artifactId>hadoop</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>hadoop</name>
<description>Demo project for Spring Boot</description>
<properties>
<java.version>1.8</java.version>
</properties>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.google.cloud</groupId>
<artifactId>libraries-bom</artifactId>
<version>20.6.0</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.google.cloud</groupId>
<artifactId>google-cloud-dataproc</artifactId>
<version>1.5.2</version>
</dependency>
<dependency>
<groupId>com.google.cloud</groupId>
<artifactId>google-cloud-storage</artifactId>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.4.7</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>2.4.7</version>
<scope>provided</scope>
<exclusions>
<exclusion>
<groupId>io.netty</groupId>
<artifactId>netty-all</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-all</artifactId>
<version>4.1.47.Final</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-hdfs</artifactId>
<version>2.10.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.11</artifactId>
<version>2.4.7</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.sun.jersey</groupId>
<artifactId>jersey-client</artifactId>
<version>1.9</version>
</dependency>
<dependency>
<groupId>org.objenesis</groupId>
<artifactId>objenesis</artifactId>
<version>2.5.1</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
The problem is with the scope of the following dependency:
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.11</artifactId>
<version>2.4.7</version>
<scope>provided</scope>
</dependency>
According to the Maven doc:
compile: This is the default scope, used if none is specified. Compile dependencies are available in all classpaths of a project. Furthermore, those dependencies are propagated to dependent projects.
provided: This is much like compile, but indicates you expect the JDK or a container to provide the dependency at runtime. For example, when building a web application for the Java Enterprise Edition, you would set the dependency on the Servlet API and related Java EE APIs to scope provided because the web container provides those classes. A dependency with this scope is added to the classpath used for compilation and test, but not the runtime classpath. It is not transitive.
You might want to change the scope to compile or simply remove the scope line, or download the jar and add it to the classpath yourself.
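In other words, a sketch of the likely fix, letting Maven put the Hive classes on the runtime classpath (keep the version in sync with your other Spark artifacts):
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.11</artifactId>
<version>2.4.7</version>
<!-- scope defaults to compile, so the jar is available at compile time and runtime -->
</dependency>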
Also see this doc on how to create a Spark application uber jar which includes its dependencies.
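For reference, a minimal sketch of a maven-shade-plugin configuration that builds such an uber jar during the package phase (the plugin version shown here is only an example):
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.2.4</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
</plugin>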

How to resolve a problem with Tomcat? Spring Boot "Failed to scan"

This is my first Spring Boot project and I have a problem with Tomcat: the server doesn't want to start. GlassFish is not used in the project, which makes this message all the more surprising. The project was created with Maven.
I have been sitting on this problem for many hours and I can't find the cause. Can anybody help me?
2019-12-11 16:21:30.775 WARN 10436 --- [ main] o.a.tomcat.util.scan.StandardJarScanner : Failed to scan [file:/C:/Users/maxwell/.m2/repository/org/glassfish/hk2/hk2/2.6.1/hk2-utils.jar] from classloader hierarchy
java.io.IOException: java.lang.reflect.InvocationTargetException
at org.apache.tomcat.util.compat.Jre9Compat.jarFileNewInstance(Jre9Compat.java:236) ~[tomcat-embed-core-9.0.29.jar:9.0.29]
This is my pom.xml file:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.2.2.RELEASE</version>
<relativePath /> <!-- lookup parent from repository -->
</parent>
<groupId>maxwell</groupId>
<artifactId>appdemo</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>appdemo</name>
<description>Demo project for Spring Boot</description>
<properties>
<java.version>11</java.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jdbc</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-jdbc</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-jersey</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-security</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-webflux</artifactId>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<scope>runtime</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/org.springframework.security/spring-security-taglibs -->
<dependency>
<groupId>org.springframework.security</groupId>
<artifactId>spring-security-taglibs</artifactId>
</dependency>
<!-- https://mvnrepository.com/artifact/org.springframework.boot/spring-boot-starter-data-rest -->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-rest</artifactId>
</dependency>
<!-- https://mvnrepository.com/artifact/org.springframework.boot/spring-boot-starter-tomcat -->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-tomcat</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.tomcat.embed</groupId>
<artifactId>tomcat-embed-jasper</artifactId>
<scope>providet</scope>
</dependency>
<dependency>
<groupId>javax.xml.bind</groupId>
<artifactId>jaxb-api</artifactId>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>javax.servlet-api</artifactId>
<scope>provided</scope>
</dependency>
<!-- Tests -->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
<exclusions>
<exclusion>
<groupId>org.junit.vintage</groupId>
<artifactId>junit-vintage-engine</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>io.projectreactor</groupId>
<artifactId>reactor-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework.security</groupId>
<artifactId>spring-security-test</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
I have also faced this problem. You can't use spring-boot-starter-web and spring-boot-starter-jersey concurrently, because GlassFish components come in as part of Jersey, so just remove one of the two dependencies. In my case I removed the Jersey starter:
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-jersey</artifactId>
</dependency>
Finally the problem was gone. Please try it.
You should set the packaging to war in the pom.xml:
<packaging>war</packaging>
It seems something is wrong with your dependencies: can you check this typo in the tomcat-embed-jasper dependency?
providet
Another question: why do you have two Tomcat dependencies? If you are using Tomcat, one will be fine.
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-tomcat</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.tomcat.embed</groupId>
<artifactId>tomcat-embed-jasper</artifactId>
<scope>providet</scope>
</dependency>
Check that example pom.
The spring-boot-starter-web dependency is enough; it already includes Tomcat, so you don't need to define a Tomcat dependency explicitly:
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
If it is a new project, I would try to run a standard project generated on https://start.spring.io/. This way you can be sure the project should be runnable and any failure is to blame on something other than the project or the IDE. If that does not run, I would reinstall Tomcat.

m2eclipse Missing Artifact

I am trying to get started with Maven and m2eclipse, but I keep getting missing artifact errors:
log4j is in my local repository. I have m2eclipse set up to use an external installation of Maven, although I realize that for dependency resolution it uses the embedded Maven. I don't have any custom settings for Maven; this is a plain install of m2eclipse and Maven. I am able to add the dependencies I want through m2eclipse (such as log4j) and it adds them to my pom file. I am at home and not behind a corporate or particularly restrictive firewall.
Can anybody help me figure out what is going on?
Pom file:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.xonatype.mavenbook.ch04</groupId>
<artifactId>simple-weather</artifactId>
<version>1.0</version>
<name>Simple Weather</name>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.8.2</version>
<type>jar</type>
<scope>test</scope>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.16</version>
<type>bundle</type>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>dom4j</groupId>
<artifactId>dom4j</artifactId>
<version>20040902.021138</version>
<type>jar</type>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>jaxen</groupId>
<artifactId>jaxen</artifactId>
<version>1.1.1</version>
<type>jar</type>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>velocity</groupId>
<artifactId>velocity</artifactId>
<version>1.5</version>
<type>pom</type>
<scope>compile</scope>
</dependency>
</dependencies>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
</project>
change <type>bundle</type> to <type>jar</type>
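That is, the log4j dependency would read as follows; the bundle type generally only resolves when an OSGi-aware plugin such as the maven-bundle-plugin is configured, so plain jar is the safe choice here:
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.16</version>
<type>jar</type>
<scope>compile</scope>
</dependency>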
