Adding folders in a Maven project - Java

Let us say I have a standard Maven project with the four standard directories:
src/main/java
src/main/resources
src/test/java
src/test/resources
Now let us suppose I create a subdirectory named "clojure" under "src/main".
Are the source files under "src/main/clojure" then compiled automatically when a build is run, or do I somehow have to tell Maven, via the configuration of some plugin (e.g. build-helper-maven-plugin), that it also has to compile the sources under "src/main/clojure"?
In other words, does the creation of any folder that is not ".../java" or ".../resources" require explicit configuration in the pom.xml so that the sources there are taken into account by Maven?
Any help would be appreciated.
Regards,
Horace

A Maven project is usually built with a single compiler, which looks for its source files in the folders Maven knows as source folders. Depending on the project, such source folders may be added automatically, e.g. src/main/java. If a different compiler is used, additional folders may be added automatically, e.g. src/main/groovy.
Sometimes Maven integrations in IDEs (like Eclipse or IntelliJ) do not pick up such folders for non-Java projects, even though the correct Maven plugins are in the POM, e.g. for building a Groovy project.
So even though a build on the command line may run nicely with files in src/main/groovy, the folder may not be detected as a source folder when importing the project into an IDE. In such cases you may have to add the additional source folders explicitly, e.g.
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <version>1.5</version>
  <executions>
    <execution>
      <id>add-source</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>add-source</goal>
      </goals>
      <configuration>
        <sources>
          <source>src/main/groovy</source>
        </sources>
      </configuration>
    </execution>
  </executions>
</plugin>

Yes, Maven needs to "know" what those directories mean, though a Clojure build plugin may use that directory by convention; see for example: https://github.com/talios/clojure-maven-plugin
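As a rough illustration only (untested; check the plugin's documentation for the current version and goal names), such a plugin is typically wired into the build like this and compiles src/main/clojure by convention:
<plugin>
  <groupId>com.theoryinpractise</groupId>
  <artifactId>clojure-maven-plugin</artifactId>
  <!-- version and goal names are assumptions; consult the plugin's docs -->
  <version>1.8.4</version>
  <executions>
    <execution>
      <id>compile-clojure</id>
      <phase>compile</phase>
      <goals>
        <goal>compile</goal>
      </goals>
    </execution>
  </executions>
</plugin>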

Apache Maven has a Standard Directory Layout which it understands out of the box.
To make Maven understand any structure other than the above, you'll have to override these settings in the pom.xml.
Look at this section of the POM reference.
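For example, the default locations can be overridden in the <build> section; a minimal sketch (the paths are placeholders for whatever layout you actually use, and note that this only moves the single source root rather than adding a second one):
<build>
  <!-- replaces the default src/main/java and src/test/java locations -->
  <sourceDirectory>src/main/clojure</sourceDirectory>
  <testSourceDirectory>src/test/clojure</testSourceDirectory>
</build>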

Related

What do I need to add to my pom.xml to run this specific Java file?

I'm not a Java dev and am unfamiliar with the packaging and building of Java programs. I'm trying to run this file: https://github.com/CodinGame/SpringChallenge2020/blob/master/src/test/java/Spring2020Main.java
by doing
mvn clean install
java -jar .\target\spring-2020-1.0-SNAPSHOT.jar
but I get this error:
no main manifest attribute, in .\target\spring-2020-1.0-SNAPSHOT.jar
I can't figure out for the life of me what I need to add to the pom.xml or whatever else I need to do to get this to work.
Any help will be appreciated.
A few things to understand about Java:
1) If you have a Maven project like this, code is divided between src/main/ and src/test/ directories. src/test/ is intended for unit tests. In your case, Spring2020Main is not actually set up as a unit test, so I'm not sure what the author intended here.
2) When you compile using mvn clean install, a jar (library) is built, but nothing from src/test will be included in the output.
Generally, tests are executed during the build, and this one would have been too, except it's not set up as a real JUnit test, so it didn't run during the build.
3) You can move the file from src/test/java to src/main/java and it will be built into your resulting jar.
4) In this case, when you run the JVM, you need to specify a classpath. This is a list of all libraries to include when the application starts. You also need to specify the (fully qualified) name of the class to run:
java -cp target/spring-2020-1.0-SNAPSHOT.jar Spring2020Main
...the above won't work directly since there are more unsatisfied dependencies (the top level pom.xml brings in at least 3 other deps you'd also need to provide on the classpath).
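For illustration only (the extra jar names below are placeholders; the real ones come from the project's pom.xml), supplying several libraries on the classpath on Windows would look like:
java -cp "target\spring-2020-1.0-SNAPSHOT.jar;libs\dependency-one.jar;libs\dependency-two.jar" Spring2020Main
(On Linux/macOS the separator is ':' instead of ';'.)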
As others pointed out, a solution could be to build a self-executing jar, but the simplest option for you would be to run this from an IDE:
Run this from IntelliJ. If you haven't installed it, install it.
1) File > New From Existing Sources, find the directory where this is cloned to.
2) When asked, Import Project from Existing Model (Maven)
3) When the Project view is available (alt-1), or View > Tool Windows > Project, you can expand the structure till you find Spring2020Main in the test directory.
4) Right-click it and select Run.
For me, it exposed a web server running at http://localhost:8888/test.html
You can follow the steps below:
Move Spring2020Main.java to src/main/java/com/codingame directory
Add the following to your pom.xml after the </dependencies>:
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.2.3</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <transformers>
              <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                <mainClass>com.codingame.Spring2020Main</mainClass>
              </transformer>
            </transformers>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
Run maven build using mvn clean install
Execute the program using java -jar target/spring-2020-1.0-SNAPSHOT.jar
Info: the Apache Maven Shade Plugin helps in building what is called an uber-jar or fat-jar. This means that all the dependencies are packaged as part of the resulting jar file, without the need to add any libraries to the classpath when executing it. As part of the final jar, we need to specify which class is to be treated as the main class to be executed. This is typically done via the META-INF/MANIFEST.MF file inside the uber-jar, and that's what the transformer specified inside the configuration of the plugin does for us.
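The resulting entry in META-INF/MANIFEST.MF inside the uber-jar is simply:
Main-Class: com.codingame.Spring2020Main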
The project you've linked has only a basic setup for compilation (though that would be enough to run it from an IDE).
What you need is an executable jar. Check this thread.
As others mentioned (and I failed to notice), the class you linked to is a test class, so it will not be included in the jar by default. Run it through the IDE or move it to a proper source directory.

Strange Maven nullpointer

I have modified this question since initially asking it. Please refer to sections UPDATE 1 and, specifically, UPDATE 2 at the end.
I am building a large JavaFX application with a lot of dependencies. I am using IntelliJ to do the job and am at the point of deployment. I am following IntelliJ's own "Working with artifacts" tutorial on jetbrains.com to build an executable jar.
I have built my executable jar and it is working as it should, with one caveat, however:
I have to manually mimic the directory structure of my IntelliJ project for the executable jar file to find the resources necessary for the program to function properly.
This is where my question comes in: shouldn't IntelliJ include these files in the artifact, so it can run in and on its own?
My directory structure in IntelliJ looks like this:
Project root
  .idea
  .out
  .src
    .main
      .java
        .com
          .myCompany
            .package-with-classes1
              .class1 ... N
            .package-with-classes2
              .class1 ... N
            .package-with-files
              .file1.someExtension
              .file2.someExtension
            .other-package-classes
            .and-so-on
When I build the artifact under Project Structure - Artifacts - Output Layout, I then manually add the directory structure as can be seen above, and then place the files where they belong.
As per my question above, I would expect these files to be automatically included with the executable jar file.
UPDATE 1: Added Maven to project
Due to Andrey's comment I have now implemented Maven in my project. I have a lot of external dependencies which I have added to my pom.xml file like so:
<dependency>
  <groupId>some.group.id</groupId>
  <artifactId>some-artifact-id</artifactId>
  <scope>system</scope>
  <version>1.0.0</version>
  <systemPath>${basedir}\path\to\jar\jarfile.jar</systemPath>
</dependency>
I then do:
mvn clean
mvn compile
mvn package
All runs with no errors.
It places 2 jar files in my \target folder: (1) name-of-jar.jar and (2) name-of-jar-with-dependencies.jar.
Running (1) throws the error: no main manifest attribute. Running (2) throws ClassNotFoundException and NoClassDefFoundError errors. Why is this? The classes throwing the errors are included as dependencies using the above approach.
UPDATE 2: Progress with Maven, but...
I solved the issue in section UPDATE 1 by installing all my third-party jar libraries into my local machine's Maven repository at C:\Users\$USER$\.m2\repository. However, I am now getting a NullPointerException...
I changed my dependency declarations in my pom.xml to the following:
<dependency>
  <groupId>some.group.id</groupId>
  <artifactId>some-artifact-id</artifactId>
  <version>some.version.number</version>
</dependency>
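For reference, installing a third-party jar into the local repository is typically done with a command along these lines (the file path and coordinates are placeholders):
mvn install:install-file -Dfile=path\to\jarfile.jar -DgroupId=some.group.id -DartifactId=some-artifact-id -Dversion=some.version.number -Dpackaging=jar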
I am currently building my fat jar using the maven assembly plugin (I have also tried using the shade plugin but am having the same issue). Here's the excerpt of the assembly plugin from my pom.xml:
<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <archive>
      <manifest>
        <mainClass>com.myCompany.myMainClass</mainClass>
      </manifest>
    </archive>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
This produces the same two jar files in the \target directory as described in section UPDATE 1.
Now, if I run jar -tf name-of-jar-with-dependencies.jar I can see from the contents of this jar that it does in fact contain all the third-party jar libraries that were missing; and running the jar file using java -jar name-of-jar-with-dependencies.jar no longer throws the errors described in section UPDATE 1. So far, so good.
However, it does throw a NullPointerException, which puzzles me. Specifically, it is complaining that a certain class is missing. This seems a little strange to me, since this class is part of a third-party jar library which I did add as a dependency in my pom.xml. That this class is indeed included in the final jar was confirmed by the approach above, printing out the contents of name-of-jar-with-dependencies.jar, which - among a lot of other files - includes this very jar file.
Any thoughts?

Create Jar Library Without a Main Class

Today I created a Java library. I created it with a Main class, since IntelliJ IDEA 14 asked me to add one. However, I want it to be a normal library, without any Main classes. Is it possible to create a jar file from such a project without having a single class with a main method? If so, how do you create such a jar?
It just seems a bit silly to have a main method if you never use it.
Use a build tool like Maven (no IDE dependencies, but it can be called from the IDE for convenience) with the shade plugin to create an 'uber' JAR (one that bundles all needed dependencies into one final JAR for the project)...
"pom.xml"
...
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.3</version>
  <executions>
    <!-- Run shade goal on package phase -->
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>
Documentation to Shade plugin:
https://maven.apache.org/plugins/maven-shade-plugin/
You can do it in a few ways, for example from the command line, from the IDE, or with Maven or another build tool. I describe two ways:
Command line:
You can create a jar file from the command line (without an IDE). Here is the reference: https://docs.oracle.com/javase/tutorial/deployment/jar/build.html
jar cf jar-file input-file(s)
where jar-file is the .jar file name you want and input-file(s) are the files you want to put inside your library (this can be a wildcard, e.g. *.class)
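For example, assuming the compiled .class files are in the current directory (the jar name is just a placeholder):
jar cf my-library.jar *.class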
IntelliJ IDEA:
Create an Artifact as in this article, but without specifying a Main class: http://blog.jetbrains.com/idea/2010/08/quickly-create-jar-artifact/
Then click Build > Build artifact > Build.
This works even if there is no Main class.

Using Maven for non-Java projects (overriding clean/compile/install goals)

I have a good(ish) understanding of using Maven for Java/WebApp projects but only to the point of following the default goals/lifecycle.
However, I now have a Backup project which is not a Java project at all. I was thinking of configuring it with Maven to keep some consistency, but I'm not sure how to override the main Maven goals/phases for my bespoke processing.
The Backup project needs to do the following:
'build' - initially, the backup outputs will be a MySQL database dump file and a zip exported from an existing WebApp. But I want it to be flexible, so calling an Ant file to do the actual work (creating the dump, calling the WebApp, or doing whatever in the future) seems sensible. The output files could then be copied into the target directory.
'install' - publish the output files to a local repository, preferably providing a date/timestamp version number instead of the usual 1.0.0-SNAPSHOT version. I'd like to think that Maven can cope with an artefact being a collection of files, rather than a single jar/war, but I'm not sure about this.
My pom.xml declares the packaging as 'pom', as 'jar' and 'war' don't seem appropriate here.
I then want other projects to be able to have a dependency on this Backup project so they can get the lastest backup artefacts if required.
1) How do I override the Maven 'compile' goal to call an Ant build file?
2) How do I override the Maven 'install' goal to publish all files in the target directory but as a single artifact?
Any help/guidance appreciated.
You can use the maven-antrun-plugin to achieve this.
One example usage:
<plugin>
  <artifactId>maven-antrun-plugin</artifactId>
  <executions>
    <execution>
      <id>bundle-virgo</id>
      <phase>package</phase>
      <configuration>
        <tasks>
          <ant antfile="<path to build.xml>" target="compile"/>
        </tasks>
      </configuration>
      <goals>
        <goal>run</goal>
      </goals>
    </execution>
  </executions>
</plugin>
You can change the phase parameter to bind it to different Maven phases so it runs at different points in the build, e.g. package, compile, etc.

How can I share clover coverage data between maven and IntelliJ

I have a multi-module maven project. I'm using intellij-idea as my IDE.
I have Maven configured with the clover plugin to automatically instrument on build.
How can I get IntelliJ to recognize those changes and refresh its coverage data? (NOTE: having to click the "Refresh Coverage" toolbar button is fine.)
I've tried configuring maven-clover2-plugin like so:
<plugin>
  <groupId>com.atlassian.maven.plugins</groupId>
  <artifactId>maven-clover2-plugin</artifactId>
  <version>3.2.2</version>
  <configuration>
    <baseDir>${project.basedir}</baseDir>
    <cloverMergeDatabase>
      ${project.basedir}.clover\cloverMerge.db
    </cloverMergeDatabase>
  </configuration>
  <executions>
    <execution>
      <id>main</id>
      <phase>package</phase>
      <goals>
        <goal>instrument</goal>
        <goal>aggregate</goal>
        <goal>check</goal>
      </goals>
    </execution>
    <execution>
      <id>site</id>
      <phase>pre-site</phase>
      <goals>
        <goal>instrument</goal>
        <goal>aggregate</goal>
        <goal>check</goal>
      </goals>
    </execution>
    <execution>
      <id>clean</id>
      <phase>clean</phase>
      <goals>
        <goal>clean</goal>
      </goals>
    </execution>
  </executions>
</plugin>
I then configured my project settings to use:
.clover\cloverMerge.db and checked the "Relative to project directory" checkbox.
But that didn't work.
NOTE:
At the bottom of Configuring Instrumentation it says
Do not set these locations explicitly if you have a multi-module project.
So I also tried leaving the location as the default for both Maven and IDEA and that didn't work either.
Also in the Clover for IDEA installation GUIDE - Known Issues
If you are using the Maven build tool, you should avoid using the same IntelliJ output directory as Maven does. As Maven uses the target/classes and target/test-classes directories,
avoid specifying these ones. The clover.db location for IntelliJ should also be distinct from that used by Maven.
WHY should they be distinct? Is there some file corruption issue? If they're kept distinct, then HOW can I get coverage highlighting etc. without having to repeat builds in a completely separate process?
Well I finally figured out an answer. I'm leaving this here for posterity.
The solution is complicated and somewhat of a hack, but it WORKS.
Update the parent project's pom.xml file:
Clover DB:
<cloverDatabase>${project.basedir}.clover\clover.db</cloverDatabase>
Merge Clover DB:
<cloverMergeDatabase>
  ${project.basedir}.clover\cloverMerge.db
</cloverMergeDatabase>
Create your unit tests to run in IntelliJ IDEA:
Set up a Before launch - Run Maven Goal:
clean clover2:setup prepare-package -DskipTests
Create a Maven Run Configuration:
Make the unit tests a Before launch condition.
In the command line, have Maven run clover2:aggregate.
Update the IntelliJ project settings for Clover to point to the merge file:
Make sure the "Relative to project directory" checkbox is checked.
Set the InitString to "User specified", with the same value as in your pom file;
in my case: .clover\cloverMergeDB
Once the command is run, just click the Refresh Coverage icon to see and work with the coverage data in IDEA.
If the tests fail you will also have the nice IntelliJ Test runner Tab to figure out why.
At the bottom of Configuring Instrumentation it says
Do not set these locations explicitly if you have a multi-module project.
The documentation actually says: Do not set these locations explicitly (using an absolute path) if you have a multi-module project. The reason is simple: if you use an absolute path, then you will not have a separate clover.db for every module, but only a single clover.db file.
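For illustration, a per-module (relative) configuration would look roughly like the snippet below; the element name is the one used earlier in this thread, and the exact path and resolution behaviour are assumptions to be checked against the Clover documentation:
<configuration>
  <!-- a relative path is resolved against each module's own directory, so every module gets its own clover.db -->
  <cloverDatabase>target/clover/clover.db</cloverDatabase>
</configuration>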
"If you are using the Maven build tool, you should avoid using the same IntelliJ output directory as Maven does. As Maven uses the target/classes and target/test-classes directories, avoid specifying these ones" [...] WHY should they be distinct is there some file corruption issue?
The problem is as follows: IntelliJ IDEA uses its own engine to compile sources. This means that it does not have to call the original project's build system (Maven, for instance) to compile sources.
It means that:
- if you have a Maven-based project and it has the Clover-for-Maven plugin installed and
- at the same time you have the Clover-for-IDEA installed in the IntelliJ IDE
- and these two Clover integrations use the same output folders for classes and databases
... then these two Clover integrations may start overwriting their files.
In most cases this is not the desired behaviour, because any source code modification or project rebuild action in IDEA will trigger source recompilation, which can delete results obtained previously by Clover-for-Maven.
