Can a Kotlin controller class access a Java service class in Spring Boot?

I am working on a project in Spring Boot. Is it possible to use both Kotlin and Java in a Spring Boot project?
I have tried making a Kotlin @Controller class and calling a Java @Service class from it, but it isn't working.

Yes, it will definitely work.
You simply need to configure it properly.
Check your build.gradle source sets and targets (a minimal Gradle sketch follows after the folder structure below).
Follow the folder structure below:
src
  main
    java
    kotlin
    resources
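For Gradle users, here is a minimal build.gradle.kts sketch of the kind of configuration meant above. It is only an illustration: the Spring Boot and Kotlin versions shown (2.6.3 and 1.6.10) are assumptions to adjust to your project. With the Kotlin Gradle plugin applied, sources under both src/main/java and src/main/kotlin should be picked up, so mixed Java/Kotlin code compiles together.

import org.jetbrains.kotlin.gradle.tasks.KotlinCompile

plugins {
    id("org.springframework.boot") version "2.6.3"             // assumed version, use your own
    id("io.spring.dependency-management") version "1.0.11.RELEASE"
    kotlin("jvm") version "1.6.10"                              // Kotlin compiler for the JVM
    kotlin("plugin.spring") version "1.6.10"                    // opens Spring-annotated Kotlin classes (Gradle counterpart of kotlin-maven-allopen)
}

repositories {
    mavenCentral()
}

dependencies {
    implementation("org.springframework.boot:spring-boot-starter-web")
    implementation("org.jetbrains.kotlin:kotlin-reflect")
    implementation("org.jetbrains.kotlin:kotlin-stdlib-jdk8")
}

tasks.withType<KotlinCompile> {
    kotlinOptions {
        freeCompilerArgs = listOf("-Xjsr305=strict")            // strict null-safety for Spring APIs
        jvmTarget = "11"                                        // match the JVM target of your Java code
    }
}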

I have done the following to successfully use Kotlin in my existing Spring Boot App.
My IDE is IntelliJ.
(1)
Ensure that Kotlin plugin is installed in the IDE.
Select Tools -> Kotlin -> Configure Kotlin Plugin Updates.
In the Update channel list, select the Stable channel.
Click Check again. The latest Stable build version appears.
Click Install. Apply / OK.
(2)
Get the required dependencies.
Open Spring Initializr (https://start.spring.io/).
Select language as Kotlin.
Keep the other options as they are. Here we are doing it for a Maven project.
Click on Explore. Sample pom.xml will appear.
Now copy the following from the sample pom.xml to the pom.xml of your existing Spring Boot Project.
…
<properties>
…
<kotlin.version>1.6.10</kotlin.version>
<!--Edit the version to the latest stable version, if applicable-->
</properties>
…
<dependencies>
…
<dependency>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-reflect</artifactId>
</dependency>
<dependency>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-stdlib-jdk8</artifactId>
</dependency>
</dependencies>
…
<build>
<sourceDirectory>${project.basedir}/src/main/kotlin</sourceDirectory>
<testSourceDirectory>${project.basedir}/src/test/kotlin</testSourceDirectory>
…
</build>
…
<plugins>
…
<plugin>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-maven-plugin</artifactId>
<configuration>
<args>
<arg>-Xjsr305=strict</arg>
</args>
<compilerPlugins>
<plugin>spring</plugin>
</compilerPlugins>
</configuration>
<dependencies>
<dependency>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-maven-allopen</artifactId>
<version>${kotlin.version}</version>
</dependency>
</dependencies>
</plugin>
</plugins>
(3)
Create the required directories in your project.
In the src/main directory of your project, create a new directory and name it as kotlin (this directory should be at the same level as src/main/java)
Now right click this directory -> Mark Directory as -> Sources Root
Also, in the src/test directory of your project, create a new directory and name it as kotlin (this directory should be at the same level as src/test/java)
Now right click this directory -> Mark Directory as -> Test Sources Root
(4)
Now update / rebuild the project.
If any Maven / build errors show up then do one / all of the following:
Right click pom.xml -> Maven -> Download sources and documentation
Right click pom.xml -> Maven -> Generate sources and update folders
Right click pom.xml -> Maven -> Reload project
(5)
Now you can create a Kotlin class under the src/main/kotlin package.
For this,
Right click the package -> New -> Kotlin Class/File
Give the class a name and double-click Class.
Note:
If the newly created Kotlin class does not have a package statement then it will not be visible from your Java classes.
To fix this,
Go to your Java class which has the main method. (This is the class which is annotated with @SpringBootApplication.)
Copy the package statement of this class. This should most likely be the first statement of the Java class and should be something like
package com.xyz.myApplication;
Paste this statement in the Kotlin class so that this becomes the first statement of your Kotlin class as well.
Once everything is done, you can simply import and instantiate your Kotlin classes from inside your Java classes.
You can also do vice versa, that is, import and instantiate your Java classes from inside your Kotlin classes.
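To make the original question concrete, here is a minimal, hypothetical sketch of a Kotlin controller under src/main/kotlin calling a Java service. The names are made up: it assumes a plain Java class GreetingService in the same package, annotated with @Service and exposing a String greet() method.

// src/main/kotlin/com/xyz/myApplication/GreetingController.kt
package com.xyz.myApplication   // same package statement as the Java @SpringBootApplication class (see the note above)

import org.springframework.web.bind.annotation.GetMapping
import org.springframework.web.bind.annotation.RestController

// GreetingService is assumed to be a Java @Service class living under src/main/java
@RestController
class GreetingController(private val greetingService: GreetingService) {

    // constructor injection works across the Kotlin/Java boundary like for any other Spring bean
    @GetMapping("/greet")
    fun greet(): String = greetingService.greet()
}

If the bean is not picked up at startup, the usual suspects are a missing package statement in the Kotlin file (see the note above) or src/main/kotlin not being marked as a source root.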

Related

IntelliJ doesn't load javafx packages from Maven dependencies (JavaFX 17)

I'm trying to get a Maven/JavaFX project, created from the javafx-archetype-fxml archetype and unedited, to run in the latest version of IntelliJ. To be clear, the project is a direct copy of that archetype; I'm just trying to get an example working.
Suffice it to say I'm a complete beginner with Maven, so I could just be missing an obvious step here.
Maven build went smoothly, and the project's pom.xml looks the way the JavaFX documentation says it should.
I left it unchanged except for updating the maven.compiler.source and maven.compiler.target properties, as well as the release property in the maven-compiler-plugin, to 16, the JDK version I'm using for the project:
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.epre</groupId>
<artifactId>jfx-sandbox</artifactId>
<version>0.0.1-SNAPSHOT</version>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>16</maven.compiler.source>
<maven.compiler.target>16</maven.compiler.target>
</properties>
<dependencies>
<dependency>
<groupId>org.openjfx</groupId>
<artifactId>javafx-controls</artifactId>
<version>17</version>
</dependency>
<dependency>
<groupId>org.openjfx</groupId>
<artifactId>javafx-fxml</artifactId>
<version>17</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.8.0</version>
<configuration>
<release>16</release>
</configuration>
</plugin>
<plugin>
<groupId>org.openjfx</groupId>
<artifactId>javafx-maven-plugin</artifactId>
<version>0.0.6</version>
<executions>
<execution>
<!-- Default configuration for running -->
<!-- Usage: mvn clean javafx:run -->
<id>default-cli</id>
<configuration>
<mainClass>com.epre.App</mainClass>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
The dependencies show up in the Maven tab, and I'm able to reload the project with no problems. Similarly, I can see that the javafx-base, -controls, -fxml, -graphics, and their corresponding :win libraries have been added to the project's External Libraries (pic):
However, when I try to run the project's Main class, IntelliJ throws ~15 errors telling me that many of the packages I'm trying to import from don't exist.
java: package javafx.application does not exist
java: package javafx.fxml does not exist
java: package javafx.scene does not exist
java: package javafx.scene does not exist
java: package javafx.stage does not exist
java: cannot find symbol class Application
java: cannot find symbol class Scene
java: method does not override or implement a method from a supertype
java: cannot find symbol class Stage
java: cannot find symbol class Scene
java: cannot find symbol class Parent
java: cannot find symbol class FXMLLoader
java: cannot find symbol class FXMLLoader
java: cannot find symbol method launch()
This is the Main class, just to show what sort of packages I'm trying to import from:
package com.epre;
import javafx.application.Application;
import javafx.fxml.FXMLLoader;
import javafx.scene.Parent;
import javafx.scene.Scene;
import javafx.stage.Stage;
import java.io.IOException;
/**
* JavaFX App
*/
public class App extends Application {
private static Scene scene;
@Override
public void start(Stage stage) throws IOException {
scene = new Scene(loadFXML("primary"), 640, 480);
stage.setScene(scene);
stage.show();
}
static void setRoot(String fxml) throws IOException {
scene.setRoot(loadFXML(fxml));
}
private static Parent loadFXML(String fxml) throws IOException {
FXMLLoader fxmlLoader = new FXMLLoader(App.class.getResource(fxml + ".fxml"));
return fxmlLoader.load();
}
public static void main(String[] args) {
launch();
}
}
I've done some searching and tried fixing every unrelated problem with the project, just to try and isolate the issue, i.e.
Changing the project language level to 16
Changing the project module's Per-module bytecode version to 16
Replacing instances of "11" in the pom.xml with "16" as mentioned earlier
Opening the project in an older version of IntelliJ
None of these have produced any change, though.
In previous non-Maven projects that also used JavaFX, I had to add all the packages the project needed to its module-info.java. This is the only step I can think of that I haven't taken, since it's my understanding that I shouldn't have to deal with it if I'm declaring those packages as dependencies in the pom.xml?
EDIT: I'm assuming I don't need to set up a module-info.java because the JavaFX documentation never mentions it as a step for creating a Maven+JavaFX project: https://openjfx.io/openjfx-docs/.
The JavaFX and IntelliJ > Non-modular with Maven section of the documentation simply states that upon loading the project, "The JavaFX classes will be recognized."
EDIT 2: I managed to solve the package errors and get the program running by changing the JavaFX dependencies in the pom.xml to version 16 instead of 17. Not sure why a single version would break the program like it did, though I suspect there was probably a change in how JavaFX is bundled/distributed.
Update for JavaFX 17.0.0.1 release
The release of JavaFX 17.0.0.1 has resolved this issue, and when using JavaFX with Maven this is the version that should be used instead of 17 (which remains broken in Maven).
When defining a dependency on JavaFX 17 using Maven, ensure that the version defined for JavaFX dependencies is not 17 and is at least 17.0.0.1. Here is an example of a working dependency definition for JavaFX 17.0.0.1 Maven modules.
<dependency>
<groupId>org.openjfx</groupId>
<artifactId>javafx-controls</artifactId>
<version>17.0.0.1</version>
</dependency>
Info on release versions and contents and the version number for the current latest release is available in the JavaFX release notes hosted at gluon.
For now, I will leave the rest of the original answer, which discusses some of the background information, as it was when it was originally created.
Maven modules for JavaFX versions prior to 17 (e.g. 16) still continue to function without issue. However, if you are able to upgrade and use JavaFX 17.0.0.1 or higher for your application, I encourage you to do so.
JavaFX 17 will be maintained as a stable long-term release of JavaFX. It will maintain a stable feature set and receive bug and security fix support for many years.
Background
I was able to replicate this issue.
This is a known issue only affecting projects which rely on the initial JavaFX 17 release artifacts currently available in the Maven central repository.
See related question:
JavaFX lib can not be build any more since JavaFX 17
Discussion of the issue on the openjfx-dev mailing list:
https://mail.openjdk.java.net/pipermail/openjfx-dev/2021-September/031934.html
Workaround
One current workaround is, if your application relies on JavaFX artifacts from Maven central, to use JavaFX 16 rather than JavaFX 17 until this issue is fixed.
As this is quite a critical issue with the JavaFX 17 release, I would expect it will likely be addressed in an update to the JavaFX 17 release in the near future.
Environment
Mac OS (Catalina) 10.15.7
$ java -version
openjdk version "16.0.2" 2021-07-20
OpenJDK Runtime Environment Temurin-16.0.2+7 (build 16.0.2+7)
OpenJDK 64-Bit Server VM Temurin-16.0.2+7 (build 16.0.2+7, mixed mode, sharing)
IntelliJ IDEA 2021.2 (Ultimate Edition)
Build #IU-212.4746.92, built on July 27, 2021
Steps to replicate
In Idea create a new JavaFX Project
Select New | Project
Choose JavaFX in the left tab.
Choose Language: Java, Build System: Maven, Project SDK: 16
Select Next -> Finish
Test the new project.
Right click on HelloApplication and select Run 'HelloApplication.main()'
The application should run and display a JavaFX application window with a "Hello!" button.
Change the JavaFX version
Edit pom.xml
Change the JavaFX dependency versions from 16 to 17
The release configured for the maven-compiler-plugin can stay at 16.
In the Maven tab, click the refresh icon to "Reload All Maven Projects"
Test the updated project.
Right click on HelloApplication and select Run 'HelloApplication.main()'
Execution will now fail with the error message:
java: package javafx.fxml does not exist

Different behaviour between Maven & Eclipse to launch a JavaFX 11 app

I'm starting to dig into Java 11 migration for a large app (includes Java FX parts) and I need your help to understand the difference between Maven (3.5.4) on the command-line and Eclipse (2018-09 with Java11 upgrade).
I have a simple Java 11 class
import java.util.stream.Stream;
import javafx.application.Application;
import javafx.scene.Scene;
import javafx.scene.control.Label;
import javafx.stage.Stage;
public class HelloFX extends Application {
@Override
public void start(Stage stage) {
String javaVersion = System.getProperty("java.version");
String javafxVersion = System.getProperty("javafx.version");
Label l = new Label("Hello, JavaFX " + javafxVersion + ", running on Java " + javaVersion + ".");
Scene scene = new Scene(l, 640, 480);
stage.setScene(scene);
stage.show();
}
public static void main(String[] args) {
Stream.of("jdk.module.path",
"jdk.module.upgrade.path",
"jdk.module.main",
"jdk.module.main.class").forEach(key -> System.out.println(key + " : " + System.getProperty(key)));
Application.launch();
}
}
and a simple pom
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.gluonhq</groupId>
<artifactId>hellofx</artifactId>
<version>1.0-SNAPSHOT</version>
<dependencies>
<dependency>
<groupId>org.openjfx</groupId>
<artifactId>javafx-controls</artifactId>
<version>11</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.8.0</version>
<configuration>
<release>11</release>
</configuration>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>1.2.1</version>
<executions>
<execution>
<goals>
<goal>java</goal>
</goals>
</execution>
</executions>
<configuration>
<mainClass>HelloFX</mainClass>
</configuration>
</plugin>
</plugins>
</build>
</project>
When I run 'mvn compile exec:java' I think nothing uses the new module-path and the program displays the JavaFX panel as expected.
The system out is:
jdk.module.path : null
jdk.module.upgrade.path : null
jdk.module.main : null
jdk.module.main.class : null
When run from an Eclipse launcher, I have to add the following VM arguments to the launcher:
--module-path=${env_var:JAVAFX_PATH} --add-modules=javafx.controls
and the panel is also displayed but the output is:
jdk.module.path : C:\dev\tools\javafx-sdk-11\lib
jdk.module.upgrade.path : null
jdk.module.main : null
jdk.module.main.class : null
I cannot make it work in Eclipse the way it works from the command line: I am forced to mess with the modules and module path. If I do not add the VM parameters, I get either "Error: JavaFX runtime components are missing, and are required to run this application" or "Error occurred during initialization of boot layer java.lang.module.FindException: Module javafx.controls not found".
How can it work from the command line without any more configuration? To my knowledge, Maven does not automagically add anything to the module path...
Any ideas? What am I missing?
Update 1: I realized that when importing the project in Eclipse "as Maven project" (which is what I always do), the JRE gets added to the module path (which is not the case for my classic projects). See the screenshot.
When you run from the command line with the Maven build system (the same works for Gradle), you let the plugins do the work for you.
When you run the main class from your IDE instead, not from the built-in Maven/Gradle tool window, you are, on the contrary, running a plain java command with command-line options.
And these result in two different things (but with the same final result, of course), as you have already figured out via the properties printout.
As already covered by this answer for IntelliJ (but it applies to any other IDE), or this other one for Eclipse, there are two ways of running a JavaFX 11 project, depending on whether or not you use the Maven/Gradle build system.
JavaFX project, without build tools
To run your JavaFX project from your IDE, you have to download the JavaFX SDK and add a library with the different javafx jars to your IDE, with a path like /Users/<user>/Downloads/javafx-sdk-11/lib/.
Now, to run that project, even if it is not modular, you have to add the path to those modules, and add the modules you are using, to the VM options/arguments of the project.
Whether you run the project from your IDE or from command line, you will be running something like:
java --module-path /Users/<user>/Downloads/javafx-sdk-11/lib/ \
--add-modules=javafx.controls org.openjfx.hellofx.HelloFX
Note that even if your project is not modular, you are still using the JavaFX modules, and since you are not using any build tool, you have to take care of downloading the SDK in the first place.
JavaFX project, build tools
If you use Maven or Gradle build tools, the first main difference is that you don't need to download the JavaFX SDK. You will include in your pom (or build.gradle file) what modules you need, and Maven/Gradle will manage to download just those modules (and dependencies) to your local .m2/.gradle repository.
When you run your main class from Maven exec:java goal you are using a plugin, and the same goes for the run task on Gradle.
At this point, it looks like when you run:
mvn compile exec:java
or
gradle run
you are not adding the above VM arguments, but the fact is that Maven/Gradle are taking care of it for you.
Gradle
In the Gradle case, this is more evident, since you have to set them in the run task:
run {
doFirst {
jvmArgs = [
'--module-path', classpath.asPath,
'--add-modules', 'javafx.controls'
]
}
}
While you don't need the SDK, the classpath contains the path to your .m2 or .gradle repository where the javafx artifacts have been downloaded.
Maven
For Maven, while the pom manages the dependencies of the different javafx modules, and sets the classifier to download the platform-specific modules (see for instance /Users/<User>/.m2/repository/org/openjfx/javafx-controls/11/javafx.controls-11.pom), the plugin manages to configure the classpath and create the required options to run the project.
In short, a new class that doesn't extend Application is used to call your application class: HelloFX.main(args).
EDIT
See this answer for a more detailed explanation on why launching a JavaFX application without module-path fails. But in short:
This error comes from sun.launcher.LauncherHelper in the java.base module. The reason for this is that the Main app extends Application and has a main method. If that is the case, the LauncherHelper will check for the javafx.graphics module to be present as a named module. If that module is not present, the launch is aborted.
A more detailed explanation on how the maven plugin works without setting the module-path:
If you add debug level (default is info) when running the Maven goals, you will get more detailed information on what is going on behind the scenes.
Running mvn compile exec:java shows:
...
[DEBUG] (f) mainClass = org.openjfx.hellofx.HelloFX
...
[DEBUG] Invoking : org.openjfx.hellofx.HelloFX.main()
...
And if you check the exec-maven-plugin source code, you can find at ExecJavaMojo::execute how the main method of the Application class is called from a thread.
This is exactly what allows launching an Application class from an external class that does not extend Application class, to skip the checks.
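To make that concrete, here is a minimal, hypothetical sketch of the same idea applied by hand: a separate launcher whose main method does not live in a class extending Application, so the LauncherHelper module check quoted above never triggers. Launcher is a made-up name and HelloFX is the application class from the question; the sketch is in Kotlin, but an equivalent plain Java class with a static main calling Application.launch(HelloFX.class, args) behaves the same way. The JavaFX jars still have to be on the classpath, of course.

import javafx.application.Application

// This object does NOT extend Application, so the java launcher skips its
// check for the javafx.graphics named module and the JavaFX classes are
// resolved from the classpath instead.
object Launcher {
    @JvmStatic
    fun main(args: Array<String>) {
        Application.launch(HelloFX::class.java, *args)   // delegate to the real Application subclass
    }
}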
Conclusion
It is up to you whether to use build tools or not, though nowadays using them is the preferred option, of course. Either way, the end result will be the same.
But it is important to understand what the differences between those approaches are, and how your IDE deals with them.

Manually create jar for use with maven [duplicate]

Maven 2 is driving me crazy during the experimentation / quick and dirty mock-up phase of development.
I have a pom.xml file that defines the dependencies for the web-app framework I want to use, and I can quickly generate starter projects from that file. However, sometimes I want to link to a 3rd party library that doesn't already have a pom.xml file defined, so rather than create the pom.xml file for the 3rd party lib by hand and install it, and add the dependency to my pom.xml, I would just like to tell Maven: "In addition to my defined dependencies, include any jars that are in /lib too."
It seems like this ought to be simple, but if it is, I am missing something.
Any pointers on how to do this are greatly appreciated. Short of that, if there is a simple way to point maven to a /lib directory and easily create a pom.xml with all the enclosed jars mapped to a single dependency which I could then name / install and link to in one fell swoop would also suffice.
Problems of popular approaches
Most of the answers you'll find around the internet will suggest you to either install the dependency to your local repository or specify a "system" scope in the pom and distribute the dependency with the source of your project. But both of these solutions are actually flawed.
Why you shouldn't apply the "Install to Local Repo" approach
When you install a dependency to your local repository it remains there. Your distribution artifact will do fine as long as it has access to this repository. The problem is in most cases this repository will reside on your local machine, so there'll be no way to resolve this dependency on any other machine. Clearly making your artifact depend on a specific machine is not a way to handle things. Otherwise this dependency will have to be locally installed on every machine working with that project which is not any better.
Why you shouldn't apply the "System Scope" approach
The jars you depend on with the "System Scope" approach neither get installed to any repository nor attached to your target packages. That's why your distribution package won't have a way to resolve that dependency when used. That, I believe, was the reason why the use of system scope even got deprecated. Anyway, you don't want to rely on a deprecated feature.
The static in-project repository solution
After putting this in your pom:
<repository>
<id>repo</id>
<releases>
<enabled>true</enabled>
<checksumPolicy>ignore</checksumPolicy>
</releases>
<snapshots>
<enabled>false</enabled>
</snapshots>
<url>file://${project.basedir}/repo</url>
</repository>
for each artifact with a group id of form x.y.z Maven will include the following location inside your project dir in its search for artifacts:
repo/
| - x/
| | - y/
| | | - z/
| | | | - ${artifactId}/
| | | | | - ${version}/
| | | | | | - ${artifactId}-${version}.jar
To elaborate more on this you can read this blog post.
Use Maven to install to project repo
Instead of creating this structure by hand, I recommend using a Maven plugin to install your jars as artifacts. So, to install an artifact into an in-project repository under the repo folder, execute:
mvn install:install-file -DlocalRepositoryPath=repo -DcreateChecksum=true -Dpackaging=jar -Dfile=[your-jar] -DgroupId=[...] -DartifactId=[...] -Dversion=[...]
If you choose this approach, you'll be able to simplify the repository declaration in the pom to:
<repository>
<id>repo</id>
<url>file://${project.basedir}/repo</url>
</repository>
A helper script
Since executing the installation command for each lib is kinda annoying and definitely error prone, I've created a utility script which automatically installs all the jars from a lib folder into a project repository, resolving all metadata (groupId, artifactId, etc.) from the file names. The script also prints out the dependencies XML for you to copy-paste into your pom.
Include the dependencies in your target package
Once you have created your in-project repository, you will have solved the problem of distributing the project's dependencies along with its source, but your project's target artifact will still depend on non-published jars, so when you install it to a repository it will have unresolvable dependencies.
To beat this problem, I suggest including these dependencies in your target package. You can do this with either the Assembly Plugin or, better, with the OneJar Plugin. The official documentation on OneJar is easy to grasp.
For throw away code only
set scope == system and just make up a groupId, artifactId, and version
<dependency>
<groupId>org.swinglabs</groupId>
<artifactId>swingx</artifactId>
<version>0.9.2</version>
<scope>system</scope>
<systemPath>${project.basedir}/lib/swingx-0.9.3.jar</systemPath>
</dependency>
Note: system dependencies are not copied into resulted jar/war
(see How to include system dependencies in war built using maven)
You may create a local repository in your project.
For example, if you have a libs folder in the project structure:
In the libs folder you should create a directory structure like /groupId/artifactId/version/artifactId-version.jar.
In your pom.xml you should register the repository:
<repository>
<id>ProjectRepo</id>
<name>ProjectRepo</name>
<url>file://${project.basedir}/libs</url>
</repository>
and add dependency as usual
<dependency>
<groupId>groupId</groupId>
<artifactId>artifactId</artifactId>
<version>version</version>
</dependency>
That is all.
For detailed information: How to add external libraries in Maven (archived)
Note: When using the System scope (as mentioned on this page), Maven needs absolute paths.
If your jars are under your project's root, you'll want to prefix your systemPath values with ${basedir}.
This is what I have done; it also works around the package issue and it works with checked-out code.
I created a new folder in the project; in my case I used repo, but feel free to use src/repo.
In my POM I had a dependency that is not in any public Maven repositories:
<dependency>
<groupId>com.dovetail</groupId>
<artifactId>zoslog4j</artifactId>
<version>1.0.1</version>
<scope>runtime</scope>
</dependency>
I then created the following directories repo/com/dovetail/zoslog4j/1.0.1 and copied the JAR file into that folder.
I created the following POM file to represent the downloaded file (this step is optional, but it removes a WARNING and helps the next guy figure out where I got the file to begin with).
<?xml version="1.0" encoding="UTF-8" ?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.dovetail</groupId>
<artifactId>zoslog4j</artifactId>
<packaging>jar</packaging>
<version>1.0.1</version>
<name>z/OS Log4J Appenders</name>
<url>http://dovetail.com/downloads/misc/index.html</url>
<description>Apache Log4j Appender for z/OS Logstreams, files, etc.</description>
</project>
Two optional files I create are the SHA1 checksums for the POM and the JAR to remove the missing checksum warnings.
shasum -b < repo/com/dovetail/zoslog4j/1.0.1/zoslog4j-1.0.1.jar \
> repo/com/dovetail/zoslog4j/1.0.1/zoslog4j-1.0.1.jar.sha1
shasum -b < repo/com/dovetail/zoslog4j/1.0.1/zoslog4j-1.0.1.pom \
> repo/com/dovetail/zoslog4j/1.0.1/zoslog4j-1.0.1.pom.sha1
Finally I add the following fragment to my pom.xml that allows me to refer to the local repository
<repositories>
<repository>
<id>project</id>
<url>file:///${basedir}/repo</url>
</repository>
</repositories>
This is how we add or install a local jar
<dependency>
<groupId>org.example</groupId>
<artifactId>iamajar</artifactId>
<version>1.0</version>
<scope>system</scope>
<systemPath>${project.basedir}/lib/iamajar.jar</systemPath>
</dependency>
i gave some default groupId and artifactId because they are mandatory :)
You really ought to get a framework in place via a repository, identifying your dependencies up front. Using the system scope is a common mistake, made because people "don't care about the dependency management." The trouble is that by doing this you end up with a contorted Maven build that does not behave the way Maven normally does. You would be better off following an approach like this.
The Maven install plugin has a command-line usage for installing a jar into the local repository; the POM is optional, but you will have to specify the groupId, artifactId, version and packaging (all the POM stuff).
Using <scope>system</scope> is a terrible idea for reasons explained by others, installing the file manually to your local repository makes the build unreproducible, and using <url>file://${project.basedir}/repo</url> is not a good idea either because (1) that may not be a well-formed file URL (e.g. if the project is checked out in a directory with unusual characters), (2) the result is unusable if this project’s POM is used as a dependency of someone else’s project.
Assuming you are unwilling to upload the artifact to a public repository, Simeon’s suggestion of a helper module does the job. But there is an easier way now…
The Recommendation
Use non-maven-jar-maven-plugin. Does exactly what you were asking for, with none of the drawbacks of the other approaches.
I found another way to do this; see here, from a Heroku post.
To summarize (sorry about some copy & paste):
Create a repo directory under your root folder:
yourproject
+- pom.xml
+- src
+- repo
Run this to install the jar to your local repo directory
mvn deploy:deploy-file -Durl=file:///path/to/yourproject/repo/ -Dfile=mylib-1.0.jar -DgroupId=com.example -DartifactId=mylib -Dpackaging=jar -Dversion=1.0
Add this to your pom.xml:
<repositories>
<!--other repositories if any-->
<repository>
<id>project.local</id>
<name>project</name>
<url>file:${project.basedir}/repo</url>
</repository>
</repositories>
<dependency>
<groupId>com.example</groupId>
<artifactId>mylib</artifactId>
<version>1.0</version>
</dependency>
What seems simplest to me is just to configure your maven-compiler-plugin to include your custom jars. This example will load any jar files in a lib directory.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<includes>
<include>lib/*.jar</include>
</includes>
</configuration>
</plugin>
After a really long discussion with the CloudBees guys about how to properly package this kind of JAR with Maven, they made an interesting proposal for a solution:
Create a fake Maven project which attaches a pre-existing JAR as its primary artifact by binding an install:install-file execution in its POM. Here is an example of such a POM:
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-install-plugin</artifactId>
<version>2.3.1</version>
<executions>
<execution>
<id>image-util-id</id>
<phase>install</phase>
<goals>
<goal>install-file</goal>
</goals>
<configuration>
<file>${basedir}/file-you-want-to-include.jar</file>
<groupId>${project.groupId}</groupId>
<artifactId>${project.artifactId}</artifactId>
<version>${project.version}</version>
<packaging>jar</packaging>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
But in order to implement it, the existing project structure has to be changed. First, keep in mind that for each such JAR a separate fake Maven project (module) has to be created. And a parent Maven project has to be created that includes all sub-modules, which are: all the JAR wrappers and the existing main project. The structure could be:
root project (this contains the parent POM file, which includes all sub-modules via module XML elements) (POM packaging)
JAR 1 wrapper Maven child project (POM packaging)
JAR 2 wrapper Maven child project (POM packaging)
main existing Maven child project (WAR, JAR, EAR .... packaging)
When the parent is built via mvn install or mvn package, all sub-modules are built as well. Having to change the project structure could be considered a minus here, but it offers a non-static solution in the end.
The problem with systemPath is that the dependencies' jars won't get distributed along your artifacts as transitive dependencies. Try what I've posted here: Is it best to Mavenize your project jar files or put them in WEB-INF/lib?
Then declare dependencies as usual.
And please read the footer note.
If you want a quick and dirty solution, you can do the following (though I do not recommend this for anything except test projects; Maven will complain at length that this is not proper).
Add a dependency entry for each jar file you need, preferably generated with a Perl script or something similar, and copy/paste that into your pom file.
#! /usr/bin/perl
# Emits a system-scoped <dependency> block for each jar path given as an argument
foreach my $n (@ARGV) {
    $n =~ s#.*/##;
    print "<dependency>
    <groupId>local.dummy</groupId>
    <artifactId>$n</artifactId>
    <version>0.0.1</version>
    <scope>system</scope>
    <systemPath>\${project.basedir}/lib/$n</systemPath>
</dependency>
";
}
A quick&dirty batch solution (based on Alex's answer):
libs.bat
@ECHO OFF
FOR %%I IN (*.jar) DO (
echo ^<dependency^>
echo ^<groupId^>local.dummy^</groupId^>
echo ^<artifactId^>%%I^</artifactId^>
echo ^<version^>0.0.1^</version^>
echo ^<scope^>system^</scope^>
echo ^<systemPath^>${project.basedir}/lib/%%I^</systemPath^>
echo ^</dependency^>
)
Execute it like this: libs.bat > libs.txt.
Then open libs.txt and copy its content as dependencies.
In my case, I only needed the libraries to compile my code, and this solution was the best for that purpose.
To install a 3rd party jar which is not in the Maven repository, use the maven-install-plugin.
Below are steps:
Download the jar file manually from the source (website)
Create a folder and place your jar file in it
Run the below command to install the 3rd party jar in your local maven repository
mvn install:install-file -Dfile=<path-to-file> -DgroupId=<group-id> -DartifactId=<artifact-id> -Dversion=<version> -Dpackaging=<packaging>
Below is an example, the one I used for the simonsite log4j rolling appender:
mvn install:install-file
-Dfile=/Users/athanka/git/MyProject/repo/log4j-rolling-appender.jar -DgroupId=uk.org.simonsite -DartifactId=log4j-rolling-appender -Dversion=20150607-2059 -Dpackaging=jar
In the pom.xml include the dependency as below
<dependency>
<groupId>uk.org.simonsite</groupId>
<artifactId>log4j-rolling-appender</artifactId>
<version>20150607-2059</version>
</dependency>
Run the mvn clean install command to create your packaging
Below is the reference link:
https://maven.apache.org/guides/mini/guide-3rd-party-jars-local.html
A strange solution I found:
using Eclipse
create simple (non-maven) java project
add a Main class
add all the jars to the classpath
export Runnable JAR (it's important, because no other way here to do it)
select Extract required libraries into generated JAR
decide the licence issues
tadammm...install the generated jar to your m2repo
add this single dependency to your other projects.
cheers,
Balint
Even though it does not exactly fit your problem, I'll drop this here. My requirements were:
Jars that can not be found in an online maven repository should be in the SVN.
If one developer adds another library, the other developers should not be bothered with manually installing them.
The IDE (NetBeans in my case) should be able to find the sources and javadocs to provide autocompletion and help.
Let's talk about (3) first: just having the jars in a folder and somehow merging them into the final jar will not work here, since the IDE will not understand this. This means all libraries have to be installed properly. However, I don't want everyone to have to install them using "mvn install:install-file".
In my project I needed metawidget. Here we go:
Create a new maven project (name it "shared-libs" or something like that).
Download metawidget and extract the zip into src/main/lib.
The folder doc/api contains the javadocs. Create a zip of the content (doc/api/api.zip).
Modify the pom like this
Build the project and the library will be installed.
Add the library as a dependency to your project, or (if you added the dependency in the shared-libs project) add shared-libs as dependency to get all libraries at once.
Every time you have a new library, just add a new execution and tell everyone to build the project again (you can improve this process with project hierarchies).
For those that didn't find a good answer here, this is what we are doing to get a jar with all the necessary dependencies in it. This answer (https://stackoverflow.com/a/7623805/1084306) mentions using the Maven Assembly plugin but doesn't actually give an example. And if you don't read all the way to the end of that answer (it's pretty lengthy), you may miss it. Adding the below to your pom.xml will generate target/${PROJECT_NAME}-${VERSION}-jar-with-dependencies.jar
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
<version>2.4.1</version>
<configuration>
<!-- get all project dependencies -->
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
<!-- MainClass in manifest makes an executable jar -->
<archive>
<manifest>
<mainClass>my.package.mainclass</mainClass>
</manifest>
</archive>
</configuration>
<executions>
<execution>
<id>make-assembly</id>
<!-- bind to the packaging phase -->
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
I alluded to some Python code in a comment on @alex lehmann's answer, so I am posting it here.
def AddJars(jarList):
    s1 = ''
    for elem in jarList:
        s1 += """
    <dependency>
        <groupId>local.dummy</groupId>
        <artifactId>%s</artifactId>
        <version>0.0.1</version>
        <scope>system</scope>
        <systemPath>${project.basedir}/manual_jars/%s</systemPath>
    </dependency>\n""" % (elem, elem)
    return s1
This doesn't answer how to add them to your POM, and may be a no brainer, but would just adding the lib dir to your classpath work? I know that is what I do when I need an external jar that I don't want to add to my Maven repos.
Hope this helps.
What works in our project is what Archimedes Trajano wrote, but we had in our .m2/settings.xml something like this:
<mirror>
<id>nexus</id>
<mirrorOf>*</mirrorOf>
<url>http://url_to_our_repository</url>
</mirror>
and the * should be changed to central. So if his answer doesn't work for you, you should check your settings.xml
I just wanted a quick and dirty workaround... I couldn't run the script from Nikita Volkov: syntax error + it requires a strict format for the jar names.
I made this Perl script, which works with whatever format the jar file names have, and generates the dependencies in an XML file so they can be copy-pasted directly into a pom.
If you want to use it, make sure you understand what the script is doing; you may need to change the lib folder and the values for the groupId or artifactId...
#!/usr/bin/perl
use strict;
use warnings;
open(my $fh, '>', 'dependencies.xml') or die "Could not open file 'dependencies.xml' $!";
foreach my $file (glob("lib/*.jar")) {
print "$file\n";
my $groupId = "my.mess";
my $artifactId = "";
my $version = "0.1-SNAPSHOT";
if ($file =~ /\/([^\/]*?)(-([0-9v\._]*))?\.jar$/) {
$artifactId = $1;
if (defined($3)) {
$version = $3;
}
`mvn install:install-file -Dfile=$file -DgroupId=$groupId -DartifactId=$artifactId -Dversion=$version -Dpackaging=jar`;
print $fh "<dependency>\n\t<groupId>$groupId</groupId>\n\t<artifactId>$artifactId</artifactId>\n\t<version>$version</version>\n</dependency>\n";
print " => $groupId:$artifactId:$version\n";
} else {
print "##### BEUH...\n";
}
}
close $fh;
The solution for the scope='system' approach, in Java:
public static void main(String[] args) {
String filepath = "/Users/Downloads/lib/";
try (Stream<Path> walk = Files.walk(Paths.get(filepath))) {
List<String> result = walk.filter(Files::isRegularFile)
.map(x -> x.toString()).collect(Collectors.toList());
String indentation = " ";
for (String s : result) {
System.out.println(indentation + indentation + "<dependency>");
System.out.println(indentation + indentation + indentation + "<groupId>"
+ s.replace(filepath, "").replace(".jar", "")
+ "</groupId>");
System.out.println(indentation + indentation + indentation + "<artifactId>"
+ s.replace(filepath, "").replace(".jar", "")
+ "</artifactId>");
System.out.println(indentation + indentation + indentation + "<version>"
+ s.replace(filepath, "").replace(".jar", "")
+ "</version>");
System.out.println(indentation + indentation + indentation + "<scope>system</scope>");
System.out.println(indentation + indentation + indentation + "<systemPath>" + s + "</systemPath>");
System.out.println(indentation + indentation + "</dependency>");
}
} catch (IOException e) {
e.printStackTrace();
}
}

Maven , How to let dependency of local system jar be included into the output jar? [duplicate]

Maven 2 is driving me crazy during the experimentation / quick and dirty mock-up phase of development.
I have a pom.xml file that defines the dependencies for the web-app framework I want to use, and I can quickly generate starter projects from that file. However, sometimes I want to link to a 3rd party library that doesn't already have a pom.xml file defined, so rather than create the pom.xml file for the 3rd party lib by hand and install it, and add the dependency to my pom.xml, I would just like to tell Maven: "In addition to my defined dependencies, include any jars that are in /lib too."
It seems like this ought to be simple, but if it is, I am missing something.
Any pointers on how to do this are greatly appreciated. Short of that, if there is a simple way to point maven to a /lib directory and easily create a pom.xml with all the enclosed jars mapped to a single dependency which I could then name / install and link to in one fell swoop would also suffice.
Problems of popular approaches
Most of the answers you'll find around the internet will suggest you to either install the dependency to your local repository or specify a "system" scope in the pom and distribute the dependency with the source of your project. But both of these solutions are actually flawed.
Why you shouldn't apply the "Install to Local Repo" approach
When you install a dependency to your local repository it remains there. Your distribution artifact will do fine as long as it has access to this repository. The problem is in most cases this repository will reside on your local machine, so there'll be no way to resolve this dependency on any other machine. Clearly making your artifact depend on a specific machine is not a way to handle things. Otherwise this dependency will have to be locally installed on every machine working with that project which is not any better.
Why you shouldn't apply the "System Scope" approach
The jars you depend on with the "System Scope" approach neither get installed to any repository or attached to your target packages. That's why your distribution package won't have a way to resolve that dependency when used. That I believe was the reason why the use of system scope even got deprecated. Anyway you don't want to rely on a deprecated feature.
The static in-project repository solution
After putting this in your pom:
<repository>
<id>repo</id>
<releases>
<enabled>true</enabled>
<checksumPolicy>ignore</checksumPolicy>
</releases>
<snapshots>
<enabled>false</enabled>
</snapshots>
<url>file://${project.basedir}/repo</url>
</repository>
for each artifact with a group id of form x.y.z Maven will include the following location inside your project dir in its search for artifacts:
repo/
| - x/
| | - y/
| | | - z/
| | | | - ${artifactId}/
| | | | | - ${version}/
| | | | | | - ${artifactId}-${version}.jar
To elaborate more on this you can read this blog post.
Use Maven to install to project repo
Instead of creating this structure by hand I recommend to use a Maven plugin to install your jars as artifacts. So, to install an artifact to an in-project repository under repo folder execute:
mvn install:install-file -DlocalRepositoryPath=repo -DcreateChecksum=true -Dpackaging=jar -Dfile=[your-jar] -DgroupId=[...] -DartifactId=[...] -Dversion=[...]
If you'll choose this approach you'll be able to simplify the repository declaration in pom to:
<repository>
<id>repo</id>
<url>file://${project.basedir}/repo</url>
</repository>
A helper script
Since executing installation command for each lib is kinda annoying and definitely error prone, I've created a utility script which automatically installs all the jars from a lib folder to a project repository, while automatically resolving all metadata (groupId, artifactId and etc.) from names of files. The script also prints out the dependencies xml for you to copy-paste in your pom.
Include the dependencies in your target package
When you'll have your in-project repository created you'll have solved a problem of distributing the dependencies of the project with its source, but since then your project's target artifact will depend on non-published jars, so when you'll install it to a repository it will have unresolvable dependencies.
To beat this problem I suggest to include these dependencies in your target package. This you can do with either the Assembly Plugin or better with the OneJar Plugin. The official documentaion on OneJar is easy to grasp.
For throw away code only
set scope == system and just make up a groupId, artifactId, and version
<dependency>
<groupId>org.swinglabs</groupId>
<artifactId>swingx</artifactId>
<version>0.9.2</version>
<scope>system</scope>
<systemPath>${project.basedir}/lib/swingx-0.9.3.jar</systemPath>
</dependency>
Note: system dependencies are not copied into resulted jar/war
(see How to include system dependencies in war built using maven)
You may create local repository on your project
For example if you have libs folder in project structure
In libs folder you should create directory structure like: /groupId/artifactId/version/artifactId-version.jar
In your pom.xml you should register repository
<repository>
<id>ProjectRepo</id>
<name>ProjectRepo</name>
<url>file://${project.basedir}/libs</url>
</repository>
and add dependency as usual
<dependency>
<groupId>groupId</groupId>
<artifactId>artifactId</artifactId>
<version>version</version>
</dependency>
That is all.
For detailed information: How to add external libraries in Maven (archived)
Note: When using the System scope (as mentioned on this page), Maven needs absolute paths.
If your jars are under your project's root, you'll want to prefix your systemPath values with ${basedir}.
This is what I have done, it also works around the package issue and it works with checked out code.
I created a new folder in the project in my case I used repo, but feel free to use src/repo
In my POM I had a dependency that is not in any public maven repositories
<dependency>
<groupId>com.dovetail</groupId>
<artifactId>zoslog4j</artifactId>
<version>1.0.1</version>
<scope>runtime</scope>
</dependency>
I then created the following directories repo/com/dovetail/zoslog4j/1.0.1 and copied the JAR file into that folder.
I created the following POM file to represent the downloaded file (this step is optional, but it removes a WARNING) and helps the next guy figure out where I got the file to begin with.
<?xml version="1.0" encoding="UTF-8" ?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.dovetail</groupId>
<artifactId>zoslog4j</artifactId>
<packaging>jar</packaging>
<version>1.0.1</version>
<name>z/OS Log4J Appenders</name>
<url>http://dovetail.com/downloads/misc/index.html</url>
<description>Apache Log4j Appender for z/OS Logstreams, files, etc.</description>
</project>
Two optional files I create are the SHA1 checksums for the POM and the JAR to remove the missing checksum warnings.
shasum -b < repo/com/dovetail/zoslog4j/1.0.1/zoslog4j-1.0.1.jar \
> repo/com/dovetail/zoslog4j/1.0.1/zoslog4j-1.0.1.jar.sha1
shasum -b < repo/com/dovetail/zoslog4j/1.0.1/zoslog4j-1.0.1.pom \
> repo/com/dovetail/zoslog4j/1.0.1/zoslog4j-1.0.1.pom.sha1
Finally I add the following fragment to my pom.xml that allows me to refer to the local repository
<repositories>
<repository>
<id>project</id>
<url>file:///${basedir}/repo</url>
</repository>
</repositories>
This is how we add or install a local jar
<dependency>
<groupId>org.example</groupId>
<artifactId>iamajar</artifactId>
<version>1.0</version>
<scope>system</scope>
<systemPath>${project.basedir}/lib/iamajar.jar</systemPath>
</dependency>
i gave some default groupId and artifactId because they are mandatory :)
You really ought to get a framework in place via a repository and identifying your dependencies up front. Using the system scope is a common mistake people use, because they "don't care about the dependency management." The trouble is that doing this you end up with a perverted maven build that will not show maven in a normal condition. You would be better off following an approach like this.
Maven install plugin has command line usage to install a jar into the local repository, POM is optional but you will have to specify the GroupId, ArtifactId, Version and Packaging (all the POM stuff).
Using <scope>system</scope> is a terrible idea for reasons explained by others, installing the file manually to your local repository makes the build unreproducible, and using <url>file://${project.basedir}/repo</url> is not a good idea either because (1) that may not be a well-formed file URL (e.g. if the project is checked out in a directory with unusual characters), (2) the result is unusable if this project’s POM is used as a dependency of someone else’s project.
Assuming you are unwilling to upload the artifact to a public repository, Simeon’s suggestion of a helper module does the job. But there is an easier way now…
The Recommendation
Use non-maven-jar-maven-plugin. Does exactly what you were asking for, with none of the drawbacks of the other approaches.
I found another way to do this, see here from a Heroku post
To summarize (sorry about some copy & paste)
Create a repo directory under your root folder:
yourproject
+- pom.xml
+- src
+- repo
Run this to install the jar to your local repo directory
mvn deploy:deploy-file -Durl=file:///path/to/yourproject/repo/ -Dfile=mylib-1.0.jar -DgroupId=com.example -DartifactId=mylib -Dpackaging=jar -Dversion=1.0
Add this your pom.xml:
<repositories>
<!--other repositories if any-->
<repository>
<id>project.local</id>
<name>project</name>
<url>file:${project.basedir}/repo</url>
</repository>
</repositories>
<dependency>
<groupId>com.example</groupId>
<artifactId>mylib</artifactId>
<version>1.0</version>
</dependency>
What seems simplest to me is just configure your maven-compiler-plugin to include your custom jars. This example will load any jar files in a lib directory.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<includes>
<include>lib/*.jar</include>
</includes>
</configuration>
</plugin>
After having really long discussion with CloudBees guys about properly maven packaging of such kind of JARs, they made an interesting good proposal for a solution:
Creation of a fake Maven project which attaches a pre-existing JAR as a primary artifact, running into belonged POM install:install-file execution. Here is an example of such kinf of POM:
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-install-plugin</artifactId>
<version>2.3.1</version>
<executions>
<execution>
<id>image-util-id</id>
<phase>install</phase>
<goals>
<goal>install-file</goal>
</goals>
<configuration>
<file>${basedir}/file-you-want-to-include.jar</file>
<groupId>${project.groupId}</groupId>
<artifactId>${project.artifactId}</artifactId>
<version>${project.version}</version>
<packaging>jar</packaging>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
But in order to implement it, existing project structure should be changed. First, you should have in mind that for each such kind of JAR there should be created different fake Maven project (module). And there should be created a parent Maven project including all sub-modules which are : all JAR wrappers and existing main project. The structure could be :
root project (this contains the parent POM file includes all sub-modules with module XML element) (POM packaging)
JAR 1 wrapper Maven child project (POM packaging)
JAR 2 wrapper Maven child project (POM packaging)
main existing Maven child project (WAR, JAR, EAR .... packaging)
When parent running via mvn:install or mvn:packaging is forced and sub-modules will be executed. That could be concerned as a minus here, since project structure should be changed, but offers a non static solution at the end
The problem with systemPath is that the dependencies' jars won't get distributed along your artifacts as transitive dependencies. Try what I've posted here: Is it best to Mavenize your project jar files or put them in WEB-INF/lib?
Then declare dependencies as usual.
And please read the footer note.
If you want a quick and dirty solution, you can do the following (though I do not recommend this for anything except test projects, maven will complain in length that this is not proper).
Add a dependency entry for each jar file you need, preferably with a perl script or something similar and copy/paste that into your pom file.
#! /usr/bin/perl
foreach my $n (#ARGV) {
$n=~s#.*/##;
print "<dependency>
<groupId>local.dummy</groupId>
<artifactId>$n</artifactId>
<version>0.0.1</version>
<scope>system</scope>
<systemPath>\${project.basedir}/lib/$n</systemPath>
</dependency>
";
A quick&dirty batch solution (based on Alex's answer):
libs.bat
#ECHO OFF
FOR %%I IN (*.jar) DO (
echo ^<dependency^>
echo ^<groupId^>local.dummy^</groupId^>
echo ^<artifactId^>%%I^</artifactId^>
echo ^<version^>0.0.1^</version^>
echo ^<scope^>system^</scope^>
echo ^<systemPath^>${project.basedir}/lib/%%I^</systemPath^>
echo ^</dependency^>
)
Execute it like this: libs.bat > libs.txt.
Then open libs.txt and copy its content as dependencies.
In my case, I only needed the libraries to compile my code, and this solution was the best for that purpose.
To install the 3rd party jar which is not in maven repository use maven-install-plugin.
Below are steps:
Download the jar file manually from the source (website)
Create a folder and place your jar file in it
Run the below command to install the 3rd party jar in your local maven repository
mvn install:install-file -Dfile= -DgroupId=
-DartifactId= -Dversion= -Dpackaging=
Below is the e.g one I used it for simonsite log4j
mvn install:install-file
-Dfile=/Users/athanka/git/MyProject/repo/log4j-rolling-appender.jar -DgroupId=uk.org.simonsite -DartifactId=log4j-rolling-appender -Dversion=20150607-2059 -Dpackaging=jar
In the pom.xml include the dependency as below
<dependency>
<groupId>uk.org.simonsite</groupId>
<artifactId>log4j-rolling-appender</artifactId>
<version>20150607-2059</version>
</dependency>
Run the mvn clean install command to create your packaging
Below is the reference link:
https://maven.apache.org/guides/mini/guide-3rd-party-jars-local.html
A strange solution I found:
using Eclipse
create simple (non-maven) java project
add a Main class
add all the jars to the classpath
export Runnable JAR (it's important, because no other way here to do it)
select Extract required libraries into generated JAR
decide the licence issues
tadammm...install the generated jar to your m2repo
add this single dependency to your other projects.
cheers,
Balint
Even though it does not exactly fit to your problem, I'll drop this here. My requirements were:
Jars that can not be found in an online maven repository should be in the SVN.
If one developer adds another library, the other developers should not be bothered with manually installing them.
The IDE (NetBeans in my case) should be able find the sources and javadocs to provide autocompletion and help.
Let's talk about (3) first: Just having the jars in a folder and somehow merging them into the final jar will not work for here, since the IDE will not understand this. This means all libraries have to be installed properly. However, I dont want to have everyone installing it using "mvn install-file".
In my project I needed metawidget. Here we go:
Create a new Maven project (name it "shared-libs" or something like that).
Download metawidget and extract the zip into src/main/lib.
The folder doc/api contains the javadocs. Create a zip of its content (doc/api/api.zip).
Modify the pom along the lines of the sketch shown after this list.
Build the project and the library will be installed.
Add the library as a dependency to your project, or (if you added the dependency in the shared-libs project) add shared-libs as a dependency to get all libraries at once.
Every time you have a new library, just add a new execution and tell everyone to build the project again (you can improve this process with project hierarchies).
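For illustration, the pom modification mentioned in the list above could look roughly like the snippet below, using the maven-install-plugin's install-file goal bound to an early lifecycle phase; the file names and coordinates here are placeholders rather than the real metawidget ones:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-install-plugin</artifactId>
<executions>
<execution>
<id>install-metawidget</id>
<!-- run before compilation so the library is already in the local repository -->
<phase>validate</phase>
<goals>
<goal>install-file</goal>
</goals>
<configuration>
<file>${project.basedir}/src/main/lib/metawidget.jar</file>
<javadoc>${project.basedir}/src/main/lib/doc/api/api.zip</javadoc>
<groupId>org.metawidget</groupId>
<artifactId>metawidget</artifactId>
<version>1.0</version>
<packaging>jar</packaging>
</configuration>
</execution>
</executions>
</plugin>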
For those who didn't find a good answer here, this is what we are doing to get a jar with all the necessary dependencies in it. This answer (https://stackoverflow.com/a/7623805/1084306) mentions using the Maven Assembly plugin but doesn't actually give an example, and if you don't read all the way to the end of the answer (it's pretty lengthy), you may miss it. Adding the below to your pom.xml will generate target/${PROJECT_NAME}-${VERSION}-jar-with-dependencies.jar
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
<version>2.4.1</version>
<configuration>
<!-- get all project dependencies -->
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
<!-- MainClass in manifest makes an executable jar -->
<archive>
<manifest>
<mainClass>my.package.mainclass</mainClass>
</manifest>
</archive>
</configuration>
<executions>
<execution>
<id>make-assembly</id>
<!-- bind to the packaging phase -->
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
I alluded to some Python code in a comment on @alex lehmann's answer, so I am posting it here.
def AddJars(jarList):
    s1 = ''
    for elem in jarList:
        s1 += """
    <dependency>
        <groupId>local.dummy</groupId>
        <artifactId>%s</artifactId>
        <version>0.0.1</version>
        <scope>system</scope>
        <systemPath>${project.basedir}/manual_jars/%s</systemPath>
    </dependency>\n""" % (elem, elem)
    return s1
This doesn't answer how to add them to your POM, and may be a no-brainer, but would just adding the lib dir to your classpath work? I know that is what I do when I need an external jar that I don't want to add to my Maven repos.
Hope this helps.
What works in our project is what Archimedes Trajano wrote, but we had in our .m2/settings.xml something like this:
<mirror>
<id>nexus</id>
<mirrorOf>*</mirrorOf>
<url>http://url_to_our_repository</url>
</mirror>
and the * should be changed to central. So if his answer doesn't work for you, check your settings.xml.
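For clarity, the adjusted entry would then look like this (the URL stays whatever your repository manager uses):
<mirror>
<id>nexus</id>
<mirrorOf>central</mirrorOf>
<url>http://url_to_our_repository</url>
</mirror>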
I just wanted a quick and dirty workaround... I couldn't run the script from Nikita Volkov's answer: it had a syntax error, and it requires a strict format for the jar names.
I made this Perl script, which works with whatever format the jar file names have, and it generates the dependencies in an XML file so they can be copy-pasted directly into a pom.
If you want to use it, make sure you understand what the script is doing; you may need to change the lib folder and the values for the groupId or artifactId...
#!/usr/bin/perl
use strict;
use warnings;
open(my $fh, '>', 'dependencies.xml') or die "Could not open file 'dependencies.xml' $!";
foreach my $file (glob("lib/*.jar")) {
print "$file\n";
my $groupId = "my.mess";
my $artifactId = "";
my $version = "0.1-SNAPSHOT";
if ($file =~ /\/([^\/]*?)(-([0-9v\._]*))?\.jar$/) {
$artifactId = $1;
if (defined($3)) {
$version = $3;
}
`mvn install:install-file -Dfile=$file -DgroupId=$groupId -DartifactId=$artifactId -Dversion=$version -Dpackaging=jar`;
print $fh "<dependency>\n\t<groupId>$groupId</groupId>\n\t<artifactId>$artifactId</artifactId>\n\t<version>$version</version>\n</dependency>\n";
print " => $groupId:$artifactId:$version\n";
} else {
print "##### BEUH...\n";
}
}
close $fh;
A Java solution for the scope=system approach, here wrapped in a class with the required imports so it compiles as-is:
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class SystemScopeDependencyPrinter {

    public static void main(String[] args) {
        // Folder containing the jar files to declare as system-scope dependencies
        String filepath = "/Users/Downloads/lib/";
        try (Stream<Path> walk = Files.walk(Paths.get(filepath))) {
            // Collect the full path of every regular file under the folder
            List<String> result = walk.filter(Files::isRegularFile)
                    .map(x -> x.toString()).collect(Collectors.toList());
            String indentation = "  ";
            for (String s : result) {
                // The bare file name (path and ".jar" stripped) is reused for groupId, artifactId and version
                String name = s.replace(filepath, "").replace(".jar", "");
                System.out.println(indentation + indentation + "<dependency>");
                System.out.println(indentation + indentation + indentation + "<groupId>" + name + "</groupId>");
                System.out.println(indentation + indentation + indentation + "<artifactId>" + name + "</artifactId>");
                System.out.println(indentation + indentation + indentation + "<version>" + name + "</version>");
                System.out.println(indentation + indentation + indentation + "<scope>system</scope>");
                System.out.println(indentation + indentation + indentation + "<systemPath>" + s + "</systemPath>");
                System.out.println(indentation + indentation + "</dependency>");
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
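Running it prints a <dependency> block for every jar under the folder; note that the bare file name is reused for the groupId, artifactId, and version, so you will probably want to adjust those values before pasting the output into your pom.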

AspectJ: How to get pointcuts to advise classes located in other projects

This should be simple.
Question
How do you get a pointcut in one project to advise the code/classes within another project?
Context
I'm working in eclipse with two projects. For ease of explanation, let's call one science project and the other math project and say the science project relies on the math project and I'm developing in both projects, concurrently. The math project is a core product, in production, and life will be easier if I don't modify the code much.
Currently, I'm debugging the interaction between these two projects. To assist with that, I'm writing an Aspect (within the science project) to log key information as the math code (and science code) executes.
Example
I'm running a simple example aspect along the lines of:
package org.science.example;

public aspect ScientificLog {

    public pointcut testCut() : execution(public * *.*(..));

    before() : testCut() {
        //do stuff
    }
}
Problem
The problem is, no matter what pointcut I create, it only advises code from the science project. No classes from org.math.example are crosscut, at all! I tried adding the math project to the inpath of the science project by going to project properties > AspectJ Build > Inpath, clicking Add Project and choosing the math project. That didn't work, but it seems like I need to do something along those lines.
Thanks, in advance, for any suggestions...
-gMale
EDIT 1:
Since writing this, I've noticed the project is giving the following error:
Caused by: org.aspectj.weaver.BCException: Unable to continue, this version of AspectJ
supports classes built with weaver version 6.0 but the class
com.our.project.adapter.GenericMessagingAdapter is version 7.0
when batch building BuildConfig[null] #Files=52 AopXmls=#0
So maybe this is set up properly and the error is more subtle. BTW, the class mentioned is from the "science project," so to speak. This happens even after I clean the project. I'm currently googling this error...
EDIT 2:
I found the solution to the error above in
comment #5 here
The problem is that the aspectj-maven-plugin's pom file declares a dependency on aspectjtools version 1.6.7, so when configuring the plugin, that transitive dependency has to be overridden. Here's the related snippet for the pom file that fixes the problem by specifying version 1.6.9 instead of 1.6.7:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>aspectj-maven-plugin</artifactId>
<version>1.3</version>
<dependencies>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjtools</artifactId>
<version>1.6.9</version>
</dependency>
</dependencies>
<configuration>
<source>1.6</source>
<target>1.6</target>
</configuration>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>test-compile</goal>
</goals>
</execution>
</executions>
</plugin>
Your second problem is unrelated to the first. It is saying that com.our.project.adapter.GenericMessagingAdapter was originally compiled and woven with a newer version of AspectJ, but is now being used for binary weaving with an older version of AspectJ.
This is essentially the same problem as when you try to run Java classes compiled under 1.6 on a 1.5 VM.
The version number was revved up for the release of AspectJ 1.6.8 (I think, or maybe it was 1.6.7).
The solution is to make sure you are using the latest version of AspectJ for all of your projects (e.g. 1.6.9, or dev builds of 1.6.10).
When you add the math project to the inpath of the science project, all of the math project's code is sent through the AspectJ weaver and properly woven. The results of that weave are written to the science project's output folder (not the math project's). So, if you look in the science project's bin folder, you should see the woven classes there.
If you want to keep the inpath files separate from the regular files, you can specify an inpath out folder. This folder should also be added to the classpath as a binary folder. Also, this folder should be placed above the project dependency on the math project in the "Order and Export" tab of the Java Build Path page for the science project.
Finally, if you run the main class from the science project, rather than from the math project, you will be executing the woven code.
