Unable to run unit tests with Snakeyaml in IntelliJ - java

I have created a Java component which reads a YAML file using SnakeYaml. The environment I use is IntelliJ with a Maven plug-in and my project is built using a Maven pom file. When I run the Maven test project, my unit tests all pass. However, when I run the unit tests within IntelliJ directly, they fail.
Specifically, the call new Yaml(myConstructor) below is throwing an exception:
Constructor myConstructor = new Constructor(....);
Yaml yaml = new Yaml(myConstructor);
The specific exception is:
java.lang.NoSuchMethodError:
org.yaml.snakeyaml.Yaml.<init>(Lorg/yaml/snakeyaml/constructor/BaseConstructor;)V
Any ideas?

This happens because of the TestNG plugin: the TestNG jar shipped with the plugin puts an older copy of SnakeYaml on the test classpath ahead of the one your project declares. It's an issue for both IntelliJ and Eclipse.
Easily solved in two ways:
Update to the latest TestNG plugin version for your IDE of choice, hoping that the shipped version does not conflict with the one required for your project
Enable "Use Project TestNG jar" in Eclipse (or the IntelliJ equivalent). This setting is available in the TestNG section of the project-specific settings.
The second way is preferable because the TestNG dependency is then managed by your build tool (you are using a build tool, right?!) and you get a lot more flexibility.
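If you go with the "Use Project TestNG jar" route, the IDE runs your tests with whatever TestNG your POM declares, so make sure the project actually declares it. A minimal sketch (the version shown is only illustrative; use whatever your project needs):
<dependency>
    <groupId>org.testng</groupId>
    <artifactId>testng</artifactId>
    <version>6.2</version>
    <scope>test</scope>
</dependency>
With this in place, the plugin's bundled jar no longer decides which SnakeYaml ends up on the test classpath.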

Take a look at File -> Project Structure -> Artifacts. There may be an error flagged there; just click the "Fix" button. Also try re-creating the IntelliJ project from the Maven configuration. Usually the trouble is with library scopes, such as TEST/PROVIDED.

The following worked for me:
<dependency>
    <groupId>org.testng</groupId>
    <artifactId>testng</artifactId>
    <version>6.2</version>
    <type>jar</type>
    <exclusions>
        <exclusion>
            <groupId>org.yaml</groupId>
            <artifactId>snakeyaml</artifactId>
        </exclusion>
    </exclusions>
</dependency>
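Note that the exclusion only removes the SnakeYaml copy that TestNG drags in transitively. If your own code uses SnakeYaml directly, as in the question, you should still declare it explicitly at the version your code was compiled against; a sketch with an illustrative version number:
<dependency>
    <groupId>org.yaml</groupId>
    <artifactId>snakeyaml</artifactId>
    <version>1.8</version>
</dependency>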

Related

Make IntelliJ IDEA remember the updates/changes of modules' dependency scopes?

I am running a Java/Spark project from IntelliJ IDEA (Community 2019.2) on a MacBook Pro, and I use Maven.
A module dependency can have one of four scopes:
compile
test
runtime
provided
When I build and run the modules, I always get the error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/api/java/JavaRDDLike
or the same error for other classes.
I have to change the scope of the dependency that provides the class from
provided
to
compile
But there are so many dependencies that changing them one by one is tedious.
Also, whenever I use
Invalidate Caches / Restart
all of the scope settings are reset and I have to repeat them.
How can I change the scopes for all dependencies just once and make IntelliJ remember the changes?
Thanks
If you are using Maven, you should not change the dependency configuration or other project structure settings (directories, compiler, etc.) in the IDE UI; change them in the Maven pom.xml file instead. To set the dependency scope, use the <scope> element in the dependency configuration, e.g.:
<dependencies>
    <dependency>
        <groupId>log4j</groupId>
        <artifactId>log4j</artifactId>
        <version>1.2.14</version>
        <scope>compile</scope>
    </dependency>
</dependencies>
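For the Spark case in the question, the same idea applies. A sketch with hypothetical version numbers (use the Spark/Scala versions your project actually targets); compile scope puts the classes on the runtime classpath when launching from the IDE, and you can switch back to provided when packaging for spark-submit:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.4.3</version>
    <scope>compile</scope>
</dependency>
Recent IDEA versions also offer an "Include dependencies with 'Provided' scope" checkbox in the Application run configuration, which avoids touching the POM at all.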

Apache Spark Maven POM errors

I am trying out Apache Spark and have a little problem that I am stuck with. I have the JavaSparkHiveExample.java example from the Apache Spark GitHub repository, together with a pom.xml that I created as a Maven project in IntelliJ IDEA.
I am able to run other Spark examples (using another, simpler POM), but this one is giving me errors about unresolved Spark dependencies.
Scala is installed in IDEA.
I am new to Maven and would therefore appreciate suggestions for things I could try in order to solve this problem.
The issue is with the value of ${project.version}. It refers to your own project's version (2.3.0-SNAPSHOT), and no Spark artifact with that version exists in the Maven Central repository, hence the errors. Instead of using the project version, add a new property like this and reference it for all the Spark dependency versions:
<properties>
    <spark.version>1.6.2</spark.version>
</properties>
and then use it in the dependency
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
</dependency>
Make sure the version you are using is actually available in the Maven repository:
https://mvnrepository.com/

NoClassDefFoundError on Maven dependency present locally

I am new to Maven. I am making changes to one of the open-source Maven projects and am facing a problem adding a library to it. So far I have done this:
I added a library named jni4net.j-0.8.8.0.jar to the resources folder of the project.
I right-clicked the jar (in IntelliJ) and clicked "Add as Library".
Then in the pom.xml I added:
<dependency>
    <groupId>jar.0.8.8.0.jni4net</groupId>
    <artifactId>jar.0.8.8.0.jni4net</artifactId>
    <version>0.8.8.0</version>
    <scope>system</scope>
    <systemPath>${basedir}/src/main/resources/jni4net.j-0.8.8.0.jar</systemPath>
</dependency>
But when I build this project (the build is successful and the test cases run) and use it, it throws the following error:
java.lang.NoClassDefFoundError: net/sf/jni4net/Bridge
Please help me resolve it. I am new to Maven and POMs. I have looked at various answers but am not getting it right.
PS: I made up the groupId and artifactId by simply reversing the jar file name.
This is not the right way to add that dependency.
All you need is:
<dependency>
    <groupId>net.sf.jni4net</groupId>
    <artifactId>jni4net.j</artifactId>
    <version>0.8.8.0</version>
</dependency>
The dependency will be retrieved from Maven Central when you build.
Using <systemPath>...</systemPath> is highly discouraged as it usually ties your project to a local environment.
Since the jni4net.j dependency is available in Maven Central, you don't have to download and add the jar manually. Maven will download it and store it locally in the '.m2' folder. Just add the dependency as below.
<dependency>
    <groupId>net.sf.jni4net</groupId>
    <artifactId>jni4net.j</artifactId>
    <version>0.8.8.0</version>
</dependency>

Could not resolve dependencies for maven project

I get this kind of error when I try to run my app in Eclipse:
Failed to execute goal on project. Could not resolve dependencies for project pl.wyk:Game-Logic:war:0.0.1-SNAPSHOT: Could not find artifact com.sun:tools:jar:1.8.0_45 at specified path C:\java\jdk/../lib/tools.jar.
But when I try to run the app from the console, everything is fine. So what is wrong with Eclipse? I use the same external Maven runtime.
Try adding your system JDK as a library in Eclipse and using it for the project.
If this doesn't work, you may have to add a system-scoped dependency, but that can cause portability issues:
<dependency>
    <groupId>com.sun</groupId>
    <artifactId>tools</artifactId>
    <version>1.6</version>
    <scope>system</scope>
    <systemPath>${java.home}/../lib/tools.jar</systemPath>
</dependency>
The problem was with the Alternate JRE setting. You need to choose the proper path. In my case:
C:\java\jdk\jre

Maven Java project in eclipse and mongodb libraries

I have a Maven project in Java, and I am new to all of these concepts. I created a RESTful project which works well with a file repository, but I want to change that to a Mongo repository.
So I added my repository class, and now I need to add the Mongo libraries. I right-click on the project and select Maven --> Update, but the libraries are not downloaded. So I added them myself via the project build path, and this makes my project compile.
However, at runtime I get a ClassNotFound exception for the Mongo classes.
I read some posts and added these lines to the pom.xml:
<dependency>
    <groupId>org.mongodb</groupId>
    <artifactId>mongo-java-driver</artifactId>
    <version>1.3</version>
</dependency>
It is still not compiling. How should I add the libraries so that the project compiles and my program can also find those classes at runtime?
Where did you get 1.3 for a version? The latest is 2.12.1.
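For reference, the corrected dependency with the version mentioned above would look like this:
<dependency>
    <groupId>org.mongodb</groupId>
    <artifactId>mongo-java-driver</artifactId>
    <version>2.12.1</version>
</dependency>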
