I want to build a Java project using Maven with parameters that change on every build. One parameter is, for example, a key that is checked inside the program.
The parameters should not be readable once the project is built. I have tried different approaches, including plugins from org.codehaus.mojo…, but I keep running into "plugin execution not covered by lifecycle"....
/****************************/
/** Read property values    */
/****************************/
// Create a new property list
Properties properties = new Properties();
// Create an input stream
InputStream inputStream = null;
// Try to read the property file
try {
    String filename = "restApi.properties";
    inputStream = Main.class.getClassLoader().getResourceAsStream(filename);
    if (inputStream == null) {
        System.out.println("Unable to read required properties");
    }
    properties.load(inputStream);
    System.out.println(properties.getProperty("property_mainUrlValue"));
} catch (IOException ex) {
    ex.printStackTrace();
} finally {
    if (inputStream != null) {
        try {
            inputStream.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
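Note that the snippet above falls through to properties.load(inputStream) even when the stream is null, which is exactly what produces the NullPointerException in the stack trace further down. A minimal null-safe variant (a sketch only, reusing the question's restApi.properties name and property key) would fail fast instead:

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class PropertiesDemo {

    // Resource name taken from the question
    static final String FILENAME = "restApi.properties";

    static Properties loadProperties() throws IOException {
        Properties properties = new Properties();
        // try-with-resources closes the stream automatically
        try (InputStream in = PropertiesDemo.class.getClassLoader()
                .getResourceAsStream(FILENAME)) {
            if (in == null) {
                // Fail fast instead of passing null to load(), which is
                // what triggers the NullPointerException in the question
                throw new IOException("Unable to read required properties: " + FILENAME);
            }
            properties.load(in);
        }
        return properties;
    }

    public static void main(String[] args) {
        try {
            Properties p = loadProperties();
            System.out.println(p.getProperty("property_mainUrlValue"));
        } catch (IOException ex) {
            System.out.println(ex.getMessage());
        }
    }
}
```

This way a missing resource surfaces as a clear IOException message rather than an NPE from deep inside Properties.load.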
My pom.xml:
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<property_mainUrlValue>property_mainUrlValue</property_mainUrlValue>
</properties>
<build>
<sourceDirectory>src</sourceDirectory>
<pluginManagement>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.7.0</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>properties-maven-plugin</artifactId>
<version>1.0-alpha-2</version>
<executions>
<execution>
<phase>generate-resources</phase>
<goals>
<goal>write-project-properties</goal>
</goals>
<configuration>
<outputFile>${project.build.outputDirectory}/restApi.properties</outputFile>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</pluginManagement>
</build>
Error shown in Eclipse:
Unable to read required properties
Exception in thread "main" java.lang.NullPointerException
at java.util.Properties$LineReader.readLine(Unknown Source)
at java.util.Properties.load0(Unknown Source)
at java.util.Properties.load(Unknown Source)
at itAsset.Main.main(Main.java:58)
I guess you are searching for what Maven calls (for some reason which I do not understand) "filtering".
The basic idea is this:
You turn on the feature by including the following configuration into pom.xml:
<build>
<resources>
<resource>
<directory>src/main/resources</directory>
<filtering>true</filtering>
</resource>
</resources>
</build>
With this in place, when you run mvn resources:resources -Dkey="mmm123" and you have a resource src/main/resources/restApi.properties containing the line
appKey=${key}
then Maven will create a resource output in target/classes/restApi.properties which contains
appKey=mmm123
The detailed description is here: http://maven.apache.org/plugins/maven-resources-plugin/examples/filter.html
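If the build should also work without passing -Dkey on the command line, a default value can be declared in the POM's <properties> section; a value passed via -D takes precedence over it. A sketch (the placeholder value is hypothetical):

```xml
<properties>
  <!-- default used when -Dkey=... is not given; the command-line value wins otherwise -->
  <key>dev-placeholder</key>
</properties>
```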
I am new to Cucumber. I've gone through a blog post which explains how to generate an advanced Maven report. Here is the link - https://www.linkedin.com/pulse/creating-cucumber-extent-report-right-way-praveen-mathew
After following it, I am able to generate the report, BUT only with Maven 'install'.
I don't know what I am doing wrong, such that the Maven 'test' command does not generate the advanced report, although it does run the test scenarios.
In short, the 'mvn install' command works fine and generates the advanced report, but the 'mvn test' command only executes the scenarios and does not generate the report.
Below are some code snippets:
My TestListener file:
import io.cucumber.plugin.ConcurrentEventListener;
import io.cucumber.plugin.event.EventPublisher;
import io.cucumber.plugin.event.Result;
import io.cucumber.plugin.event.Status;
import io.cucumber.plugin.event.TestCase;
import io.cucumber.plugin.event.TestCaseFinished;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class MyTestListener implements ConcurrentEventListener {
    private static final Logger LOG = LogManager.getLogger(MyTestListener.class);

    @Override
    public void setEventPublisher(EventPublisher publisher) {
        publisher.registerHandlerFor(TestCaseFinished.class, this::handleTestCaseFinished);
    }

    private void handleTestCaseFinished(TestCaseFinished event) {
        TestCase testCase = event.getTestCase();
        Result result = event.getResult();
        Status status = result.getStatus();
        Throwable error = result.getError();
        String scenarioName = testCase.getName();
        if (error != null) {
            LOG.info(error);
        }
        LOG.info("*****************************************************************************************");
        LOG.info(" Scenario: " + scenarioName + " --> " + status.name());
        LOG.info("*****************************************************************************************");
    }
}
My TestRunner file:
@RunWith(Cucumber.class)
@CucumberOptions(
    features = {"src/test/resources/features/editOrganization.feature"},
    glue = {"com.testproject.api.stepdefinition"},
    plugin = {"pretty:target/cucumber/cucumber.txt",
        "json:target/cucumber/cucumber.json",
        // "html:target/cucumber/report.html",
        "com.test.api.utils.MyTestListener"
    },
    // dryRun = true,
    monochrome = true,
    snippets = SnippetType.CAMELCASE,
    // tags = "@Regression",
    publish = true
)
public class TestRunner {
}
My pom.xml for generating reports using the Surefire plugin:
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>${maven.compiler.version}</version>
<configuration>
<encoding>UTF-8</encoding>
<source>${java.version}</source>
<target>${java.version}</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>${maven.surefire.version}</version>
<configuration>
<testFailureIgnore>true</testFailureIgnore>
<includes>
<include>**/*Runner.java</include>
</includes>
</configuration>
</plugin>
<plugin>
<groupId>net.masterthought</groupId>
<artifactId>maven-cucumber-reporting</artifactId>
<version>5.5.4</version>
<executions>
<execution>
<id>execution</id>
<phase>verify</phase>
<goals>
<goal>generate</goal>
</goals>
<configuration>
<projectName>cucumber-api</projectName>
<!-- optional, per documentation set this to "true" to bypass generation of Cucumber Reports entirely, defaults to false if not specified -->
<skip>false</skip>
<!-- output directory for the generated report -->
<outputDirectory>${project.build.directory}</outputDirectory>
<!-- optional, defaults to outputDirectory if not specified -->
<inputDirectory>${project.build.directory}/cucumber</inputDirectory>
<jsonFiles>
<!-- supports wildcard or name pattern -->
<param>**/*.json</param>
</jsonFiles>
<!-- optional, defaults to outputDirectory if not specified -->
<!-- <parallelTesting>false</parallelTesting> -->
<!-- optional, set true to group features by its Ids -->
<mergeFeaturesById>false</mergeFeaturesById>
<!-- optional, set true to get a final report with latest results of the same test from different test runs -->
<mergeFeaturesWithRetest>false</mergeFeaturesWithRetest>
<!-- optional, set true to fail build on test failures -->
<checkBuildResult>false</checkBuildResult>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
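For what it's worth, the behaviour described here matches the Maven lifecycle: maven-cucumber-reporting is bound to the verify phase, which runs after test (and before install), so mvn test stops before the report goal is ever reached. A sketch of the relevant commands:

```shell
mvn test     # runs Surefire only - stops before 'verify', so no report
mvn verify   # runs test, then package, then 'verify' - report is generated
mvn install  # runs everything up to 'install', including 'verify' - report too
```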
Any help would be really appreciated.
Thanks!
I'm using Quarkus 2.0 to build uber-jar to be used as AWS lambda.
Maven build script is as follows:
<properties>
<quarkus.package.type>uber-jar</quarkus.package.type>
</properties>
<dependencies>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-amazon-lambda</artifactId>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-maven-plugin</artifactId>
<version>2.0.3.Final</version>
<executions>
<execution>
<goals>
<goal>build</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
application.properties also contains the quarkus.package.type=uber-jar config.
When I debug the Maven build, I see that at the moment the decision is made, quarkus-maven-plugin executes this code:
@BuildStep
public JarBuildItem buildRunnerJar(CurateOutcomeBuildItem curateOutcomeBuildItem, OutputTargetBuildItem outputTargetBuildItem,
        TransformedClassesBuildItem transformedClasses, ApplicationArchivesBuildItem applicationArchivesBuildItem,
        ApplicationInfoBuildItem applicationInfo, PackageConfig packageConfig, ClassLoadingConfig classLoadingConfig,
        List<GeneratedClassBuildItem> generatedClasses, List<GeneratedResourceBuildItem> generatedResources,
        List<UberJarRequiredBuildItem> uberJarRequired, List<UberJarMergedResourceBuildItem> uberJarMergedResourceBuildItems,
        List<UberJarIgnoredResourceBuildItem> uberJarIgnoredResourceBuildItems, List<LegacyJarRequiredBuildItem> legacyJarRequired,
        QuarkusBuildCloseablesBuildItem closeablesBuildItem, List<AdditionalApplicationArchiveBuildItem> additionalApplicationArchiveBuildItems,
        MainClassBuildItem mainClassBuildItem, Optional<AppCDSRequestedBuildItem> appCDS) throws Exception {
    if (appCDS.isPresent()) {
        this.handleAppCDSSupportFileGeneration(transformedClasses, generatedClasses, (AppCDSRequestedBuildItem) appCDS.get());
    }
    if (!uberJarRequired.isEmpty() && !legacyJarRequired.isEmpty()) {
        throw new RuntimeException("Extensions with conflicting package types. One extension requires uber-jar another requires legacy format");
    } else if (legacyJarRequired.isEmpty() && (!uberJarRequired.isEmpty() || packageConfig.type.equalsIgnoreCase("uber-jar"))) {
        /* I want execution to get here, but it doesn't, because "legacyJarRequired" contains an item ("packageConfig == uber-jar" as expected) */
        return this.buildUberJar(curateOutcomeBuildItem, outputTargetBuildItem, transformedClasses, applicationArchivesBuildItem, packageConfig, applicationInfo, generatedClasses, generatedResources, uberJarMergedResourceBuildItems, uberJarIgnoredResourceBuildItems, mainClassBuildItem);
    } else {
        /* execution gets here because "legacyJarRequired" contains an item */
        return legacyJarRequired.isEmpty() && !packageConfig.isLegacyJar() && !packageConfig.type.equalsIgnoreCase("legacy") ? this.buildThinJar(curateOutcomeBuildItem, outputTargetBuildItem, transformedClasses, applicationArchivesBuildItem, packageConfig, classLoadingConfig, applicationInfo, generatedClasses, generatedResources, additionalApplicationArchiveBuildItems, mainClassBuildItem) : this.buildLegacyThinJar(curateOutcomeBuildItem, outputTargetBuildItem, transformedClasses, applicationArchivesBuildItem, packageConfig, applicationInfo, generatedClasses, generatedResources, mainClassBuildItem);
    }
}
And the item in legacyJarRequired is added here:
@BuildStep(onlyIf = IsNormal.class, onlyIfNot = NativeBuild.class)
public void requireLegacy(BuildProducer<LegacyJarRequiredBuildItem> required) {
    required.produce(new LegacyJarRequiredBuildItem());
}
How can I avoid the addition of this element to the build config, so that I get a versioned xxx-yyy-zzz-runner.jar from my application build?
function.zip is built all right, but it's not an option for me, because I'd like to push the results of the build to a Maven repo.
I also needed to deploy an uber-jar to Artifactory for further deployment as an AWS lambda. I finally solved it with the build-helper-maven-plugin:attach-artifact goal. It attached function.zip to the artifact in Nexus, so Jenkins was able to get the archive and deploy it to AWS.
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<version>3.2.0</version>
<executions>
<execution>
<id>attach-artifacts</id>
<phase>package</phase>
<goals>
<goal>attach-artifact</goal>
</goals>
<configuration>
<artifacts>
<artifact>
<file>./target/function.zip</file>
<type>zip</type>
</artifact>
</artifacts>
</configuration>
</execution>
</executions>
</plugin>
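Once the zip is attached and deployed, other builds (or Jenkins) can pull it from the repository like any other artifact by declaring the type. A sketch of the consuming dependency (the coordinates are hypothetical):

```xml
<dependency>
  <groupId>com.example</groupId>
  <artifactId>my-lambda</artifactId>
  <version>1.0.0</version>
  <type>zip</type>
</dependency>
```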
package Bots;

public class FirstBot {
    public static void main(String[] args) {
        // Insert your bot's token here
        String token = "TheToken";
        DiscordApi api = new DiscordApiBuilder().setToken(token).login().join();
        String prefix = "!";
        // Add a listener which answers with "Pong!" if someone writes "!ping"
        api.addMessageCreateListener(event -> {
            if (event.getMessageContent().equalsIgnoreCase(prefix + "ping")) {
                event.getChannel().sendMessage("Pong!");
            }
        });
        // Print the invite url of your bot
        System.out.println("You can invite the bot by using the following url: " + api.createBotInvite());
    }
}
I am new to creating Discord bots in Java. I am using Eclipse, and I used this starter code ^
It gives me an error that DiscordApi cannot be resolved to a type and DiscordApiBuilder cannot be resolved to a type.
The first thing you need to do is make sure that you have the JavaCord Maven dependency set up correctly.
Add this inside the <dependencies> section of your pom.xml:
<dependency>
<groupId>org.javacord</groupId>
<artifactId>javacord</artifactId>
<version>3.3.0</version>
<type>pom</type>
</dependency>
The next step is to shade the JavaCord package into your final jar, so that you can run it directly. Add this to your pom.xml:
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.2.4</version>
<configuration>
<relocations>
<relocation>
<pattern>org.javacord</pattern>
<shadedPattern>your.package.name.here.dependencies.javacord</shadedPattern>
</relocation>
</relocations>
</configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
If you already have a <build> or <plugins> field, put it within that.
The final step is to import the relevant JavaCord classes into your main class. If you try to type out the class names again, Eclipse should offer the option to import them.
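For reference, the imports the starter code above needs should look like this (assuming the standard Javacord 3.x package layout):

```java
import org.javacord.api.DiscordApi;
import org.javacord.api.DiscordApiBuilder;
```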
I'm trying to generate Java code from a locally stored group of WSDLs, to avoid calling a remote server at runtime.
I want to be able to generate the code on a dev machine and run it in production, so I need the path to the WSDL in the generated code to be relative.
Using two plugins I've managed either to use a relative path or to point to a folder location, but I haven't managed to get both.
Using the Codehaus plugin I can use a relative path, but I cannot point to a folder:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>jaxws-maven-plugin</artifactId>
<version>1.9</version>
<executions>
<execution>
<goals>
<goal>wsimport</goal>
</goals>
<configuration>
<sourceDestDir>${project.build.directory}/generated-sources/jaxb</sourceDestDir>
<wsdlDirectory>${basedir}/wsdl</wsdlDirectory>
<wsdlLocation>../../../../../wsdl/*</wsdlLocation>
</configuration>
</execution>
</executions>
</plugin>
This is the generated service:
try {
URL baseUrl;
baseUrl = path.to.package.BarService.class.getResource(".");
url = new URL(baseUrl, "../../../../../wsdl/*");
} catch (MalformedURLException e) {
logger.warning("Failed to create URL for the wsdl Location: '../../../../../wsdl/*', retrying as a local file");
logger.warning(e.getMessage());
}
The relative path is as expected, but the wsdl name appears as *, which causes the code to fail.
Using jax-ws-commons, I can point to a folder, but the relative path is gone (I also tried using the classpath, as in the commented-out line):
<plugin>
<groupId>org.jvnet.jax-ws-commons</groupId>
<artifactId>jaxws-maven-plugin</artifactId>
<version>2.3</version>
<executions>
<execution>
<goals>
<goal>wsimport</goal>
</goals>
<configuration>
<sourceDestDir>${project.build.directory}/generated-sources/jaxb</sourceDestDir>
<wsdlDirectory>${basedir}/wsdl</wsdlDirectory>
<wsdlLocation>../../../../../wsdl/*</wsdlLocation>
<!--<wsdlLocation>classpath:wsdl/*</wsdlLocation>-->
</configuration>
</execution>
</executions>
</plugin>
This is the generated service:
try {
url = new URL("file:/Users/username/Dev/company/project/wsdl/bar.wsdl");
} catch (MalformedURLException ex) {
e = new WebServiceException(ex);
}
This will work on my machine but will fail on any other.
The generated code I wish for is:
try {
url = new URL("../../../../../wsdl/bar.wsdl");
} catch (MalformedURLException ex) {
e = new WebServiceException(ex);
}
I'm currently trying to build my project with Maven and sqlite4java, which is available in the official Maven repositories.
The official sqlite4java page on Google Code has an example configuration, but it's a bit outdated and does not suit my needs. I want to end up with a single .jar file which I can deploy elsewhere. The problem here is the shared-object dependency. I am using the official build goal from their page to copy the .so to build.dir/lib, but my assembly goal crashes with:
[INFO] Failed to create assembly: Error adding file-set for 'com.almworks.sqlite4java:libsqlite4java-linux-i386:so:0.282' to archive: Error adding archived file-set. PlexusIoResourceCollection not found for: /home/lhw/.m2/repository/com/almworks/sqlite4java/libsqlite4java-linux-i386/0.282/libsqlite4java-linux-i386-0.282.so
No such archiver: 'so'.
What am I doing wrong? Here is my current pom.xml, stripped of some dependencies unrelated to this topic:
<?xml version="1.0"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>de.ring0.lhw</groupId>
<artifactId>system</artifactId>
<version>0.0.1-SNAPSHOT</version>
<dependencies>
<dependency>
<groupId>com.almworks.sqlite4java</groupId>
<artifactId>sqlite4java</artifactId>
<version>${sqlite4java.version}</version>
</dependency>
<dependency>
<groupId>com.almworks.sqlite4java</groupId>
<artifactId>libsqlite4java-linux-i386</artifactId>
<version>${sqlite4java.version}</version>
<type>so</type>
</dependency>
</dependencies>
<properties>
<sqlite4java.version>0.282</sqlite4java.version>
</properties>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<executions>
<execution>
<id>copy</id>
<phase>compile</phase>
<goals>
<goal>copy</goal>
</goals>
<configuration>
<artifactItems>
<artifactItem>
<groupId>com.almworks.sqlite4java</groupId>
<artifactId>libsqlite4java-linux-i386</artifactId>
<version>${sqlite4java.version}</version>
<type>so</type>
<overWrite>true</overWrite>
<outputDirectory>${project.build.directory}/lib</outputDirectory>
</artifactItem>
</artifactItems>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.5.1</version>
<configuration>
<source>1.6</source>
<target>1.6</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.12.2</version>
<configuration>
<skipTests>true</skipTests>
<systemProperties>
<property>
<name>sqlite4java.library.path</name>
<value>${project.build.directory}/lib</value>
</property>
</systemProperties>
</configuration>
</plugin>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
<configuration>
<archive>
<manifest>
<mainClass>de.ring0.lhw.Init</mainClass>
</manifest>
</archive>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
Edit :
I think that the jar-with-dependencies assembly descriptor tries to unpack the dependencies.
See the link:
http://maven.apache.org/plugins/maven-assembly-plugin/descriptor-refs.html
The descriptor contains:
<unpack>true</unpack>
And of course it fails to unpack the .so.
So you might have to use a custom assembly descriptor to do what you want.
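A custom descriptor along these lines might work: unpack the regular jar dependencies as jar-with-dependencies does, but copy the .so in as a plain file instead of trying to unpack it. This is a sketch only; the descriptor id and the /lib output path are arbitrary choices:

```xml
<assembly>
  <id>jar-with-dependencies-keep-so</id>
  <formats>
    <format>jar</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <dependencySets>
    <!-- unpack regular jar dependencies, as jar-with-dependencies does -->
    <dependencySet>
      <outputDirectory>/</outputDirectory>
      <unpack>true</unpack>
      <excludes>
        <exclude>com.almworks.sqlite4java:libsqlite4java-linux-i386</exclude>
      </excludes>
    </dependencySet>
    <!-- copy the .so as-is instead of trying to unpack it -->
    <dependencySet>
      <outputDirectory>/lib</outputDirectory>
      <unpack>false</unpack>
      <includes>
        <include>com.almworks.sqlite4java:libsqlite4java-linux-i386</include>
      </includes>
    </dependencySet>
  </dependencySets>
</assembly>
```

The descriptor is referenced from the assembly plugin with <descriptors> instead of <descriptorRefs>.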
It is possible to create an executable jar with the stock "jar-with-dependencies" assembly descriptor and without using any startup shell/batch scripts. However, it requires a dirty workaround that doesn't involve much Maven configuration.
We need to place all native libraries (included in the sqlite4java zip download) in the src/main/resources directory. Also remove the sqlite4java native library dependency from your Maven POM file.
Because sqlite4java's native library loader doesn't look at your classpath or inside the JAR file, you have to extract the native libraries at startup and set the "sqlite4java.library.path" system property at runtime. Please see the following sample code:
/** List of native libraries you put in src/main/resources */
public static final String[] NATIVE_LIB_FILENAMES = {
    "libsqlite4java-linux-amd64.so",
    "libsqlite4java-linux-i386.so",
    "libsqlite4java-osx.jnilib",
    "libsqlite4java-osx-10.4.jnilib",
    "libsqlite4java-osx-ppc.jnilib",
    "sqlite4java-win32-x64.dll",
    "sqlite4java-win32-x86.dll",
};

/**
 * Extract native libraries to the current directory.
 * This example needs Apache Commons IO (https://commons.apache.org/proper/commons-io/)
 */
public static void extractNativeResources() {
    for (String filename : NATIVE_LIB_FILENAMES) {
        // Change "DemoSQLite2" to your class name
        final InputStream in = DemoSQLite2.class.getResourceAsStream("/" + filename);
        if (in != null) {
            try {
                System.out.println("Extracting " + filename);
                FileUtils.copyInputStreamToFile(in, new File(filename));
            } catch (IOException e) {
                System.err.println("Can't extract " + filename);
                e.printStackTrace();
            }
        }
    }
}

/**
 * Delete native libraries in the current directory
 */
public static void removeNativeResources() {
    for (String filename : NATIVE_LIB_FILENAMES) {
        File file = new File(filename);
        file.delete();
    }
}

public static void main(String[] args) throws Exception {
    boolean deleteNativesOnExit = false; // Delete natives on exit
    // Extract native libraries if the sqlite4java.library.path property is not set
    String sqlitePath = System.getProperty("sqlite4java.library.path");
    if (sqlitePath == null) {
        System.setProperty("sqlite4java.library.path", "."); // Read natives from current directory
        extractNativeResources();
        deleteNativesOnExit = true;
    }
    // Do SQLite jobs here
    final SQLiteConnection db = new SQLiteConnection(new File("test.db"));
    try {
        db.open();
        db.dispose();
        System.out.println("Success");
    } catch (Exception e) {
        e.printStackTrace();
        System.err.println("FAILED");
    }
    // Delete the native libraries we extracted
    if (deleteNativesOnExit) removeNativeResources();
}
Now your app should be buildable with the standard "jar-with-dependencies" descriptor, and runnable with the standard "java -jar your_jar.jar" command.
Of course, if sqlite4java gets updated in the future, you will have to manually update the native libraries in your resources directory.
If you have a better, less dirty solution, please let me know!
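One possible incremental cleanup of the sketch above (untested against sqlite4java itself, and using only the JDK): extract into a fresh temp directory and register everything for deletion on exit, which avoids littering the current directory and removes the need for the manual removeNativeResources() pass. The library and class names below are illustrative:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class NativeLibExtractor {

    /**
     * Extract the named classpath resources (if present) into a fresh temp
     * directory and return its path, suitable for the
     * "sqlite4java.library.path" system property. Files are registered for
     * deletion on JVM exit, so no manual cleanup pass is needed.
     */
    public static Path extractToTempDir(String... filenames) throws IOException {
        Path dir = Files.createTempDirectory("sqlite4java-natives");
        dir.toFile().deleteOnExit();
        for (String filename : filenames) {
            try (InputStream in = NativeLibExtractor.class.getResourceAsStream("/" + filename)) {
                if (in == null) {
                    continue; // this native library is not bundled; skip it
                }
                Path target = dir.resolve(filename);
                Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
                // deleteOnExit runs in reverse registration order,
                // so files are removed before the directory
                target.toFile().deleteOnExit();
            }
        }
        return dir;
    }

    public static void main(String[] args) throws IOException {
        Path dir = extractToTempDir("libsqlite4java-linux-amd64.so");
        System.setProperty("sqlite4java.library.path", dir.toString());
        System.out.println("Natives dir: " + dir);
    }
}
```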