Maven mojo plugin to load class from hosting project - java

I have a custom plugin that loads classes, e.g.
Class<?> clazz = Class.forName(NAME_OF_CLASS_FROM_HOST_DEPENDENCIES);
NAME_OF_CLASS_FROM_HOST_DEPENDENCIES is a class that exists in the dependencies of the project in which this plugin is used.
In the hosting project's pom, I invoke the plugin like this:
<plugin>
<groupId>com.plugins</groupId>
<artifactId>the_plugin</artifactId>
<version>1.0-SNAPSHOT</version>
<executions>
<execution>
<id>do</id>
<phase>process-classes</phase>
<goals>
<goal>do</goal>
</goals>
</execution>
</executions>
</plugin>
I am getting a ClassNotFoundException.
It's important to note that those dependencies are defined in the pom with
<scope>provided</scope>

I ended up with the following.
// Build a URLClassLoader over the host project's dependency artifacts
List<URL> listUrl = new ArrayList<URL>();
Set<Artifact> deps = project.getDependencyArtifacts();
for (Artifact artifact : deps) {
    final URL url = artifact.getFile().toURI().toURL();
    listUrl.add(url);
}
newClassLoader = new URLClassLoader(listUrl.toArray(new URL[listUrl.size()]), Thread.currentThread().getContextClassLoader());
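For completeness, here is a rough sketch of how that snippet might sit inside the mojo. The class name and the loaded class name are placeholders; the important parts are requiresDependencyResolution on @Mojo (without it, artifact files may not be resolved, especially for provided-scope dependencies) and the injected MavenProject.

import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.ArrayList;
import java.util.List;
import org.apache.maven.artifact.Artifact;
import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.MojoExecutionException;
import org.apache.maven.plugins.annotations.LifecyclePhase;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;
import org.apache.maven.plugins.annotations.ResolutionScope;
import org.apache.maven.project.MavenProject;

// Hypothetical mojo; the goal "do" matches the execution bound in the hosting pom above.
@Mojo(name = "do", defaultPhase = LifecyclePhase.PROCESS_CLASSES,
      requiresDependencyResolution = ResolutionScope.COMPILE_PLUS_RUNTIME)
public class DoMojo extends AbstractMojo {

    @Parameter(defaultValue = "${project}", readonly = true, required = true)
    private MavenProject project;

    @Override
    public void execute() throws MojoExecutionException {
        try {
            List<URL> urls = new ArrayList<URL>();
            // Resolved dependency artifacts of the hosting project (includes provided scope)
            for (Artifact artifact : project.getArtifacts()) {
                urls.add(artifact.getFile().toURI().toURL());
            }
            // The hosting project's own compiled classes
            urls.add(new File(project.getBuild().getOutputDirectory()).toURI().toURL());
            ClassLoader loader = new URLClassLoader(urls.toArray(new URL[0]), getClass().getClassLoader());
            // "com.example.SomeHostClass" is a placeholder for NAME_OF_CLASS_FROM_HOST_DEPENDENCIES
            Class<?> clazz = Class.forName("com.example.SomeHostClass", false, loader);
            getLog().info("Loaded " + clazz.getName());
        } catch (Exception e) {
            throw new MojoExecutionException("Failed to load class from the hosting project", e);
        }
    }
}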

Related

maven plugin api: @Parameter using setters doesn't work

I am writing a custom maven-plugin for my project. Following the instructions mentioned here
https://maven.apache.org/guides/plugin/guide-java-plugin-development.html#using-setters I added a @Parameter using setters as shown below.
@Parameter(property = "destinationDirectory", defaultValue = "${project.build.directory}/generated-resources")
private String _destinationDirectory;
private Path dstDirRoot;
public void setDestinationDirectory(String destinationDirectory) {
Path dstDir = Paths.get(destinationDirectory);
if (dstDir.isAbsolute()) {
this._destinationDirectory = dstDir.toString();
} else {
this._destinationDirectory = Paths.get(baseDir, dstDir.toString()).toString();
}
dstDirRoot = Paths.get(this._destinationDirectory);
}
Pom.xml entries on the usage side
<plugin>
<groupId>com.me.maven</groupId>
<artifactId>my-maven-plugin</artifactId>
<version>${project.version}</version>
<executions>
<execution>
<goals>
<goal>run</goal>
</goals>
<phase>generate-resources</phase>
</execution>
</executions>
<configuration>
<destinationDirectory>${project.build.directory}/myDir</destinationDirectory>
</configuration>
</plugin>
Now, I was expecting that during plugin execution it would call the setDestinationDirectory method, but it doesn't. @Parameter(property = "...") doesn't seem to have any effect.
Is this a bug? Or am I missing something?
Since maven-plugin-plugin version 3.7.0 you can simply add the @Parameter annotation to public setter methods.
Your code can look like:
@Parameter(...)
public void setDestinationDirectory(String destinationDirectory) {
...
}
You also need to define the versions of the maven-plugin-plugin plugin and the maven-plugin-annotations dependency in your pom.xml; both should use the same version.
<project>
<properties>
<maven-plugin-tools.version>3.7.1</maven-plugin-tools.version>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.maven.plugin-tools</groupId>
<artifactId>maven-plugin-annotations</artifactId>
<scope>provided</scope>
<version>${maven-plugin-tools.version}</version>
</dependency>
</dependencies>
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-plugin-plugin</artifactId>
<version>${maven-plugin-tools.version}</version>
<executions>
<execution>
<id>help-mojo</id>
<goals>
<goal>helpmojo</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</pluginManagement>
</build>
</project>
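With those versions in place, a minimal sketch of a mojo with the annotation on the public setter could look like the following (the goal name matches the <goal>run</goal> execution above; the logging is purely illustrative):

import java.nio.file.Path;
import java.nio.file.Paths;
import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugins.annotations.LifecyclePhase;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;

// Requires maven-plugin-plugin / maven-plugin-annotations 3.7.0 or newer.
@Mojo(name = "run", defaultPhase = LifecyclePhase.GENERATE_RESOURCES)
public class RunMojo extends AbstractMojo {

    private Path dstDirRoot;

    // The annotation sits on the public setter instead of on a field.
    @Parameter(property = "destinationDirectory",
               defaultValue = "${project.build.directory}/generated-resources")
    public void setDestinationDirectory(String destinationDirectory) {
        dstDirRoot = Paths.get(destinationDirectory).toAbsolutePath();
    }

    @Override
    public void execute() {
        getLog().info("destinationDirectory resolved to " + dstDirRoot);
    }
}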
If I remember correctly, when the annotation has property = "destinationDirectory", the value is read from a system property (e.g. passed with -D) or from pom properties, unless a configuration section is specified in the XML.
mvn generate-resources -DdestinationDirectory=/path/to/dir
If a configuration is specified in the XML, which is the case in your example, the configuration element name must match either the name of the field or the specified alias, if any. You can try the following options and check whether one of them solves the issue:
Setting an alias:
@Parameter(alias = "destinationDirectory", defaultValue = "${project.build.directory}/generated-resources")
private String _destinationDirectory;
Renaming the variable:
@Parameter(defaultValue = "${project.build.directory}/generated-resources")
private String destinationDirectory;
It's usually good practice to keep the configuration names and the field names consistent, for easier maintenance.

Quarkus 2.0 maven build is not creating uber-jar for AWS lambda

I'm using Quarkus 2.0 to build uber-jar to be used as AWS lambda.
Maven build script is as follows:
<properties>
<quarkus.package.type>uber-jar</quarkus.package.type>
</properties>
<dependencies>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-amazon-lambda</artifactId>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-maven-plugin</artifactId>
<version>2.0.3.Final</version>
<executions>
<execution>
<goals>
<goal>build</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
application.properties also contains the quarkus.package.type=uber-jar config.
When I debug the Maven build, I see that at the moment of making the decision, quarkus-maven-plugin executes this code:
@BuildStep
public JarBuildItem buildRunnerJar(CurateOutcomeBuildItem curateOutcomeBuildItem, OutputTargetBuildItem outputTargetBuildItem, TransformedClassesBuildItem transformedClasses, ApplicationArchivesBuildItem applicationArchivesBuildItem, ApplicationInfoBuildItem applicationInfo, PackageConfig packageConfig, ClassLoadingConfig classLoadingConfig, List<GeneratedClassBuildItem> generatedClasses, List<GeneratedResourceBuildItem> generatedResources, List<UberJarRequiredBuildItem> uberJarRequired, List<UberJarMergedResourceBuildItem> uberJarMergedResourceBuildItems, List<UberJarIgnoredResourceBuildItem> uberJarIgnoredResourceBuildItems, List<LegacyJarRequiredBuildItem> legacyJarRequired, QuarkusBuildCloseablesBuildItem closeablesBuildItem, List<AdditionalApplicationArchiveBuildItem> additionalApplicationArchiveBuildItems, MainClassBuildItem mainClassBuildItem, Optional<AppCDSRequestedBuildItem> appCDS) throws Exception {
if (appCDS.isPresent()) {
this.handleAppCDSSupportFileGeneration(transformedClasses, generatedClasses, (AppCDSRequestedBuildItem)appCDS.get());
}
if (!uberJarRequired.isEmpty() && !legacyJarRequired.isEmpty()) {
throw new RuntimeException("Extensions with conflicting package types. One extension requires uber-jar another requires legacy format");
} else if (legacyJarRequired.isEmpty() && (!uberJarRequired.isEmpty() || packageConfig.type.equalsIgnoreCase("uber-jar"))) {
/* I want it to get here, but it doesn't, because "legacyJarRequired" contains an item ("packageConfig" is "uber-jar" as expected) */
return this.buildUberJar(curateOutcomeBuildItem, outputTargetBuildItem, transformedClasses, applicationArchivesBuildItem, packageConfig, applicationInfo, generatedClasses, generatedResources, uberJarMergedResourceBuildItems, uberJarIgnoredResourceBuildItems, mainClassBuildItem);
} else {
/* execution ends up here because "legacyJarRequired" contains an item */
return legacyJarRequired.isEmpty() && !packageConfig.isLegacyJar() && !packageConfig.type.equalsIgnoreCase("legacy") ? this.buildThinJar(curateOutcomeBuildItem, outputTargetBuildItem, transformedClasses, applicationArchivesBuildItem, packageConfig, classLoadingConfig, applicationInfo, generatedClasses, generatedResources, additionalApplicationArchiveBuildItems, mainClassBuildItem) : this.buildLegacyThinJar(curateOutcomeBuildItem, outputTargetBuildItem, transformedClasses, applicationArchivesBuildItem, packageConfig, applicationInfo, generatedClasses, generatedResources, mainClassBuildItem);
}
}
And the item in legacyJarRequired is added here:
@BuildStep(onlyIf = IsNormal.class, onlyIfNot = NativeBuild.class)
public void requireLegacy(BuildProducer<LegacyJarRequiredBuildItem> required) {
required.produce(new LegacyJarRequiredBuildItem());
}
How can I prevent this element from being added to the build config, so that I get a versioned xxx-yyy-zzz-runner.jar from my application build?
function.zip is built all right, but it's not an option for me, because I'd like to push the results of the build to a Maven repo.
I also needed to deploy an uber-jar to Artifactory for further deployment as an AWS Lambda. In the end I solved it with the build-helper-maven-plugin:attach-artifact goal: it attaches function.zip to the main artifact in Nexus, so Jenkins was able to fetch the archive and deploy it to AWS.
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<version>3.2.0</version>
<executions>
<execution>
<id>attach-artifacts</id>
<phase>package</phase>
<goals>
<goal>attach-artifact</goal>
</goals>
<configuration>
<artifacts>
<artifact>
<file>./target/function.zip</file>
<type>zip</type>
</artifact>
</artifacts>
</configuration>
</execution>
</executions>
</plugin>

Share properties between maven plugin and the calling maven project pom

I have created a Maven plugin P, which I want to use in another Maven project A. I am providing some parameters to plugin P from the pom of Maven project A.
I want to set some properties in plugin P based on the parameters provided by project A, and I want those properties to be referenceable in the pom of project A. How can I do that?
I have tried setting properties on the MavenProject in plugin P. How can I refer to them in the pom of project A?
Project A pom snippet:
<plugin>
<groupId>sample.plugin</groupId>
<artifactId>sample-plugin</artifactId>
<version>1.0.0-SNAPSHOT</version>
<executions>
<execution>
<goals>
<goal>testing</goal>
</goals>
<configuration>
<param1>value1</param1>
<param2>value2</param2>
</configuration>
</execution>
</executions>
</plugin>
Plugin P code snippet
@Mojo(name = "testing")
public class TestMojo extends AbstractMojo
{
.
.
#Parameter(property = "param1")
private String param1;
#Parameter(property = "param2")
private String param2;
#Parameter(defaultValue = "${project}")
private org.apache.maven.project.MavenProject project;
public void execute() throws MojoExecutionException
{
if(param1.equalsIgnoreCase("value1")){
project.getProperties().setProperty("PROP1","val1");
} else{
project.getProperties().setProperty("PROP1","val3");
}
if(param2.equalsIgnoreCase("value2")){
project.getProperties().setProperty("PROP2","val2");
} else{
project.getProperties().setProperty("PROP2","val3");
}
}
}
I expect PROP1 and PROP2 to be usable in project A.
Found the solution: if we pass ${project} as a parameter in the plugin configuration, we can add properties to it, which can then be referenced in project A's pom.
Example:
<plugin>
<groupId>sample.plugin</groupId>
<artifactId>sample-plugin</artifactId>
<version>1.0.0-SNAPSHOT</version>
<executions>
<execution>
<goals>
<goal>testing</goal>
</goals>
<configuration>
<param1>value1</param1>
<param2>value2</param2>
<project>${project}</project>
</configuration>
</execution>
</executions>
</plugin>
In the plugin one can then set properties on this MavenProject:
project.getProperties().setProperty("projectProperty", propertyValue);
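For what it's worth, a common alternative is to let Maven inject the current project rather than passing it through <configuration>. A minimal sketch under that assumption, reusing the goal and property names from above:

import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;
import org.apache.maven.project.MavenProject;

@Mojo(name = "testing")
public class TestMojo extends AbstractMojo {

    @Parameter(property = "param1")
    private String param1;

    // Injected by Maven; no <project>${project}</project> entry is strictly required in <configuration>.
    @Parameter(defaultValue = "${project}", readonly = true, required = true)
    private MavenProject project;

    @Override
    public void execute() {
        // Visible as ${PROP1} to plugins bound to later phases of the same build.
        project.getProperties().setProperty("PROP1",
                "value1".equalsIgnoreCase(param1) ? "val1" : "val3");
    }
}

Keep in mind that properties set this way are only visible to plugin executions that run later in the same build; they are not interpolated into parts of the pom that Maven has already evaluated.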
If I'm understanding this question correctly, try adding:
<dependencies>
<dependency>
<groupId>sample.plugin</groupId>
<artifactId>sample-plugin</artifactId>
<version>1.0.0-SNAPSHOT</version>
</dependency>
</dependencies>
at the bottom of plugin P's pom.xml file, right before the closing </project> tag.
I am not entirely sure this will even work, as I have limited knowledge of Maven, but please let me know.
Best of luck to you.

How to run multiple inter-dependent maven modules

I have a multi-module maven OSGi project. I am using the maven-assembly-plugin to organise the different jars into a central folder, from which the OSGi container will be loading the various project modules:
dist pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<artifactId>dist</artifactId>
<packaging>jar</packaging>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.0.2</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<executions>
<execution>
<id>distro-assembly</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
<configuration>
<descriptors>
<descriptor>src/main/assembly/bin.xml</descriptor>
</descriptors>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
<parent>
<groupId>rev</groupId>
<artifactId>parent</artifactId>
<version>1.0-SNAPSHOT</version>
</parent>
</project>
The module jars get put into the central folder as I want. Over time, however, I can't really keep track of how the module dependencies relate to each other. For example, a certain module might require that another module be started before it can execute properly. How can I guarantee that module-A is started before module-B? I would like to configure this in a way that some code handles the order of execution.
This is the error I get when the order of execution is not right. I don't think the bundles get installed.
Exception in thread "main" org.osgi.framework.BundleException: Unable to resolve OSGiDmHelloWorldConsumer [2](R 2.0): missing requirement [OSGiDmHelloWorldConsumer [2](R 2.0)] osgi.wiring.package; (&(osgi.wiring.package=com.bw.osgi.provider.able)(version>=1.0.0)(!(version>=2.0.0))) Unresolved requirements: [[OSGiDmHelloWorldConsumer [2](R 2.0)] osgi.wiring.package; (&(osgi.wiring.package=com.bw.osgi.provider.able)(version>=1.0.0)(!(version>=2.0.0)))]
at org.apache.felix.framework.Felix.resolveBundleRevision(Felix.java:4111)
at org.apache.felix.framework.Felix.startBundle(Felix.java:2117)
at org.apache.felix.framework.BundleImpl.start(BundleImpl.java:998)
at org.apache.felix.framework.BundleImpl.start(BundleImpl.java:984)
at main.App.initialize(App.java:46)
at main.App.main(App.java:22)
Below is the App class that throws the error:
App
public class App {
public static void main(String[] args) throws BundleException, URISyntaxException {
App app = new App();
app.initialize();
}
private void initialize() throws BundleException, URISyntaxException {
Map<String, String> map = new HashMap<String, String>();
// make sure the cache is cleaned
map.put(Constants.FRAMEWORK_STORAGE_CLEAN, Constants.FRAMEWORK_STORAGE_CLEAN_ONFIRSTINIT);
map.put("ds.showtrace", "true");
map.put("ds.showerrors", "true");
FrameworkFactory frameworkFactory = ServiceLoader.load(FrameworkFactory.class).iterator().next();
Framework framework = frameworkFactory.newFramework(map);
System.out.println("Starting OSGi Framework");
framework.init();
loadScrBundle(framework);
String baseDir = "/D:/Maven-Assembly-Plugin-MM/dist/target/dist-1.0-SNAPSHOT-bin/plugins/";
framework.getBundleContext().installBundle("file:" + baseDir + "core-1.0.jar");
framework.getBundleContext().installBundle("file:" + baseDir + "clientfile-plugin-1.0-SNAPSHOT.jar");
framework.getBundleContext().installBundle("file:" + baseDir + "dist-1.0-SNAPSHOT.jar");
List<Bundle> bundles = new ArrayList<Bundle>();
for (Bundle bundle : framework.getBundleContext().getBundles()) {
bundle.start();
bundles.add(bundle);
System.out.println("Bundle Name: " + bundle.getSymbolicName());
System.out.println("Bundle ID: " + bundle.getBundleId());
if (bundle.getRegisteredServices() != null) {
for (ServiceReference<?> serviceReference : bundle.getRegisteredServices())
System.out.println("\tRegistered service: " + serviceReference);
}
}
System.out.println("Total Bundles: " + bundles.size());
}
private void loadScrBundle(Framework framework) throws URISyntaxException, BundleException {
URL url = getClass().getClassLoader().getResource("org/apache/felix/scr/ScrService.class");
if (url == null)
throw new RuntimeException("Could not find the class org.apache.felix.scr.ScrService");
String jarPath = url.toURI().getSchemeSpecificPart().replaceAll("!.*", "");
System.out.println("Found declarative services implementation: " + jarPath);
framework.getBundleContext().installBundle(jarPath).start();
}
}
How can I go about resolving this? Thank you all in advance.
UPDATE
The module jars get put into the central folder as I want. However, when I try to run the project after calling mvn clean install, I get the above error for all modules except the Felix ones; i.e. the modules from the central Maven repository, such as org.apache.felix.framework and org.apache.felix.scr, run in the OSGi container, but those I wrote myself do not.
The Problem In Greater Detail
I have published a very short version of the problem project HERE, Maven-Assembly-Plugin-MM. The tutorial I followed is OSGi - Simple Hello World with services.
Eclipse:
Import > Existing Maven Projects > C:\***Path***\Maven-Assembly-Plugin-MM
Bundle loading order should not matter in OSGi applications.
But OSGi services might have dependencies on other services.
You can use a framework such as Declarative Services to manage those dependencies easily (e.g. using the SCR annotations).
You'll need the following plugin:
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-scr-plugin</artifactId>
<executions>
<execution>
<id>generate-scr-scrdescriptor</id>
<goals>
<goal>scr</goal>
</goals>
</execution>
</executions>
</plugin>
And the following dependencies:
<dependency>
<groupId>org.apache.felix</groupId>
<artifactId>org.apache.felix.scr.annotations</artifactId>
<!-- only needed at compile time, not at runtime -->
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.felix</groupId>
<artifactId>org.apache.felix.scr</artifactId>
<scope>runtime</scope>
</dependency>
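Once SCR is in place, each component declares the services it needs, and the runtime activates it only when those references can be satisfied, so bundle start order stops mattering. A rough sketch using the Felix SCR annotations follows; the service interface name and its hello() method are assumptions based on the tutorial, only the package name comes from the error message above.

import org.apache.felix.scr.annotations.Activate;
import org.apache.felix.scr.annotations.Component;
import org.apache.felix.scr.annotations.Reference;
import com.bw.osgi.provider.able.HelloWorldService; // assumed provider interface

// The consumer is only activated once a HelloWorldService is available,
// regardless of the order in which the bundles were started.
@Component(immediate = true)
public class HelloWorldConsumer {

    @Reference
    private HelloWorldService helloWorldService;

    @Activate
    protected void activate() {
        helloWorldService.hello();
    }
}

The provider side would similarly publish its implementation with @Component and @Service, and the maven-scr-plugin generates the component descriptors at build time.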

Including shared object in maven assembly

I'm currently trying to build my project with Maven and sqlite4java, which is available in the official Maven repositories.
The official sqlite4java page on Google Code does have an example configuration, but it's a bit outdated and does not suit my needs. I want to end up with a single .jar file which I can deploy elsewhere. The problem here is the shared-object dependency. I am using the configuration from their page to copy the .so into the build directory's lib folder, but my assembly goal crashes with:
[INFO] Failed to create assembly: Error adding file-set for 'com.almworks.sqlite4java:libsqlite4java-linux-i386:so:0.282' to archive: Error adding archived file-set. PlexusIoResourceCollection not found for: /home/lhw/.m2/repository/com/almworks/sqlite4java/libsqlite4java-linux-i386/0.282/libsqlite4java-linux-i386-0.282.so
No such archiver: 'so'.
What am I doing wrong? Here is my current pom.xml, stripped of some dependencies unrelated to this topic:
<?xml version="1.0"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>de.ring0.lhw</groupId>
<artifactId>system</artifactId>
<version>0.0.1-SNAPSHOT</version>
<dependencies>
<dependency>
<groupId>com.almworks.sqlite4java</groupId>
<artifactId>sqlite4java</artifactId>
<version>${sqlite4java.version}</version>
</dependency>
<dependency>
<groupId>com.almworks.sqlite4java</groupId>
<artifactId>libsqlite4java-linux-i386</artifactId>
<version>${sqlite4java.version}</version>
<type>so</type>
</dependency>
</dependencies>
<properties>
<sqlite4java.version>0.282</sqlite4java.version>
</properties>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<executions>
<execution>
<id>copy</id>
<phase>compile</phase>
<goals>
<goal>copy</goal>
</goals>
<configuration>
<artifactItems>
<artifactItem>
<groupId>com.almworks.sqlite4java</groupId>
<artifactId>libsqlite4java-linux-i386</artifactId>
<version>${sqlite4java.version}</version>
<type>so</type>
<overWrite>true</overWrite>
<outputDirectory>${project.build.directory}/lib</outputDirectory>
</artifactItem>
</artifactItems>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.5.1</version>
<configuration>
<source>1.6</source>
<target>1.6</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.12.2</version>
<configuration>
<skipTests>true</skipTests>
<systemProperties>
<property>
<name>sqlite4java.library.path</name>
<value>${project.build.directory}/lib</value>
</property>
</systemProperties>
</configuration>
</plugin>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
<configuration>
<archive>
<manifest>
<mainClass>de.ring0.lhw.Init</mainClass>
</manifest>
</archive>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
Edit:
I think that the jar-with-dependencies assembly descriptor tries to unpack the dependencies.
See the link:
http://maven.apache.org/plugins/maven-assembly-plugin/descriptor-refs.html
<unpack>true</unpack>
And of course it fails to unpack the .so.
So you might have to use a custom assembly descriptor to do what you want.
It is possible to create an executable jar with the stock "jar-with-dependencies" assembly descriptor and without using any startup shell/batch scripts. However, it requires a dirty workaround, though one that doesn't involve much Maven configuration.
We need to place all native libraries (included in the sqlite4java zip download) in the src/main/resources directory. Also remove the sqlite4java native-library dependency from your Maven POM file.
Because sqlite4java's native-library loader doesn't look at your classpath or inside the JAR file, you have to extract the native libraries at startup and set the "sqlite4java.library.path" system property at runtime. See the following sample code:
// Imports needed by the snippets below (assumes Apache Commons IO and sqlite4java on the classpath)
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import org.apache.commons.io.FileUtils;
import com.almworks.sqlite4java.SQLiteConnection;

/** List of native libraries you put in src/main/resources */
public static final String[] NATIVE_LIB_FILENAMES = {
"libsqlite4java-linux-amd64.so",
"libsqlite4java-linux-i386.so",
"libsqlite4java-osx.jnilib",
"libsqlite4java-osx-10.4.jnilib",
"libsqlite4java-osx-ppc.jnilib",
"sqlite4java-win32-x64.dll",
"sqlite4java-win32-x86.dll",
};
/**
* Extract native libraries to the current directory.
* This example needs Apache Commons IO (https://commons.apache.org/proper/commons-io/)
*/
public static void extractNativeResources() {
for(String filename: NATIVE_LIB_FILENAMES) {
// Change "DemoSQLite2" to your class name
final InputStream in = DemoSQLite2.class.getResourceAsStream("/"+filename);
if(in != null) {
try {
System.out.println("Extracting " + filename);
FileUtils.copyInputStreamToFile(in, new File(filename));
} catch (IOException e) {
System.err.println("Can't extract " + filename);
e.printStackTrace();
}
}
}
}
/**
* Delete native libraries in the current directory
*/
public static void removeNativeResources() {
for(String filename: NATIVE_LIB_FILENAMES) {
File file = new File(filename);
file.delete();
}
}
public static void main(String[] args) throws Exception {
boolean deleteNativesOnExit = false; // Delete natives on exit
// Extract native libraries if sqlite4java.library.path property is not set
String sqlitePath = System.getProperty("sqlite4java.library.path");
if(sqlitePath == null) {
System.setProperty("sqlite4java.library.path", "."); // Read natives from current directory
extractNativeResources();
deleteNativesOnExit = true;
}
// Do SQLite jobs here
final SQLiteConnection db = new SQLiteConnection(new File("test.db"));
try {
db.open();
db.dispose();
System.out.println("Success");
} catch (Exception e) {
e.printStackTrace();
System.err.println("FAILED");
}
// Delete the native libraries we extracted
if(deleteNativesOnExit) removeNativeResources();
}
Now your app should be buildable with the standard "jar-with-dependencies" descriptor, and it is runnable with the standard "java -jar your_jar.jar" command.
Of course, if sqlite4java gets updated in the future, you have to update the native libraries in your resources directory manually.
If you have a better, less dirty solution, please let me know!
