Platform: Windows 10 PC;
Java: 1.8.0_201;
gRPC: 1.21.x
I have recently been looking into the Google RPC (gRPC) package to evaluate it for use with Java. I have read through a lot of the information available on the site (grpc.io) and have attempted to download, build, and install it for evaluation, but I have not had much luck.
It's a little unclear (to me anyway) what exactly is needed in order to use gRPC. There seem to be a number of moving parts and it's hard to tell everything that is needed. I know that it uses Google Protobuf, so I followed directions to install the Protobuf compiler, protoc. I'm not sure if I need something else for Protobuf besides the compiler. I assume that there is a "core" to gRPC and then a language-specific module (e.g. for Java) that implements the necessary logic to interface with Java programs. I don't know if there are any other dependencies.
I am unable to clone the git repo due to security policies where I work, but I downloaded the distro (v1.21.x from https://github.com/grpc/grpc-java) and unpacked it into a directory. I followed the directions to build the sample client and server. The process failed due to missing files. Below is an excerpt of the failure.
> Task :grpc-compiler:compileJava_pluginExecutableJava_pluginCpp
java_generator.h
C:\Users\jo24447\workspace\gRPC\grpc-java-1.21.x\compiler\src\java_plugin\cpp\java_generator.h(8): fatal error C1083: Cannot open include file: 'google/protobuf/io/zero_copy_stream.h': No such file or directory
java_plugin.cpp
c:\users\jo24447\workspace\grpc\grpc-java-1.21.x\compiler\src\java_plugin\cpp\java_generator.h(8): fatal error C1083: Cannot open include file: 'google/protobuf/io/zero_copy_stream.h': No such file or directory
java_generator.cpp
c:\users\jo24447\workspace\grpc\grpc-java-1.21.x\compiler\src\java_plugin\cpp\java_generator.h(8): fatal error C1083: Cannot open include file: 'google/protobuf/io/zero_copy_stream.h': No such file or directory
I joined the gRPC mailing list and submitted some questions describing the issues I'm having. The short reply indicated that I should pull down the latest version (which I have) - I had originally pulled down the master branch and apparently that was the wrong thing to do. It was implied that it came with a pre-built code generator plugin (codegen). I was not given a reason for the build failure.
A link to instructions for building the codegen plugin was also provided. The site documentation seems to indicate that I should not need to build the codegen unless I'm actually changing the code, which I am not. Regardless, I looked through the instructions. It would appear that this is intended to be done on a *nix platform. What are people who are doing this on a Windows platform supposed to do?
In any case, I attempted to do the build with the new distro and it fails with the same issue. I responded back and have heard nothing.
At this point I am stuck and have no idea what to do next.
Any ideas about the following would be very helpful and appreciated:
The minimum required distributions to permit a successful build and subsequent usage
Unambiguous directions on the steps needed to take it from the downloaded distro(s) to a fully functional application/utility
I have responded in part in what I assume is the mailing list thread you referenced.
grpc-java is its own complete implementation. It is not based on "C core" at https://github.com/grpc/grpc.
In short, you have gotten off the beaten path; you should not need to compile grpc-java yourself. We have binaries already available on Maven Central.
When you downloaded grpc-java, you should have downloaded v1.21.0, which is the release tag, not v1.21.x. Released versions are easily downloaded from the releases section.
We don't have unambiguous instructions when git clone is unavailable, because that is rare and the answers will vary for each environment.
If you want to make changes to gRPC-Java or build it yourself, see the instructions.
But you do not need to do this to use it.
To use it in a Maven or Gradle project, just add the necessary dependencies:
Maven
<dependency>
<groupId>io.grpc</groupId>
<artifactId>grpc-netty-shaded</artifactId>
<version>1.20.0</version>
</dependency>
<dependency>
<groupId>io.grpc</groupId>
<artifactId>grpc-protobuf</artifactId>
<version>1.20.0</version>
</dependency>
<dependency>
<groupId>io.grpc</groupId>
<artifactId>grpc-stub</artifactId>
<version>1.20.0</version>
</dependency>
Gradle
compile 'io.grpc:grpc-netty-shaded:1.20.0'
compile 'io.grpc:grpc-protobuf:1.20.0'
compile 'io.grpc:grpc-stub:1.20.0'
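With those dependencies in place, client code is just a channel plus a generated stub. Here is a minimal sketch, assuming your .proto defines a Greeter service with a SayHello RPC as in the grpc-java hello-world example; GreeterGrpc, HelloRequest and HelloReply are produced by the codegen described below, not hand-written:
import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;
import java.util.concurrent.TimeUnit;

public class HelloClient {
    public static void main(String[] args) throws InterruptedException {
        // Plaintext is fine for a local test server; use TLS for anything real.
        ManagedChannel channel = ManagedChannelBuilder.forAddress("localhost", 50051)
                .usePlaintext()
                .build();
        try {
            // Blocking stub generated by protoc-gen-grpc-java from the Greeter service.
            GreeterGrpc.GreeterBlockingStub stub = GreeterGrpc.newBlockingStub(channel);
            HelloReply reply = stub.sayHello(HelloRequest.newBuilder().setName("world").build());
            System.out.println(reply.getMessage());
        } finally {
            channel.shutdownNow().awaitTermination(5, TimeUnit.SECONDS);
        }
    }
}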
For protobuf-based codegen, you can use plugins integrated with your build system:
protobuf-maven-plugin
<build>
<extensions>
<extension>
<groupId>kr.motd.maven</groupId>
<artifactId>os-maven-plugin</artifactId>
<version>1.5.0.Final</version>
</extension>
</extensions>
<plugins>
<plugin>
<groupId>org.xolstice.maven.plugins</groupId>
<artifactId>protobuf-maven-plugin</artifactId>
<version>0.5.1</version>
<configuration>
<protocArtifact>com.google.protobuf:protoc:3.7.1:exe:${os.detected.classifier}</protocArtifact>
<pluginId>grpc-java</pluginId>
<pluginArtifact>io.grpc:protoc-gen-grpc-java:1.20.0:exe:${os.detected.classifier}</pluginArtifact>
</configuration>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>compile-custom</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
protobuf-gradle-plugin
apply plugin: 'com.google.protobuf'
buildscript {
repositories {
mavenCentral()
}
dependencies {
classpath 'com.google.protobuf:protobuf-gradle-plugin:0.8.8'
}
}
protobuf {
protoc {
artifact = "com.google.protobuf:protoc:3.7.1"
}
plugins {
grpc {
artifact = 'io.grpc:protoc-gen-grpc-java:1.20.0'
}
}
generateProtoTasks {
all()*.plugins {
grpc {}
}
}
}
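To sketch what you then do with the generated code, here is a minimal server under the same assumption as above (a Greeter service from the hello-world .proto, so GreeterGrpc.GreeterImplBase and the message classes come from the codegen):
import io.grpc.Server;
import io.grpc.ServerBuilder;
import io.grpc.stub.StreamObserver;

public class HelloServer {
    // Extend the base class generated by protoc-gen-grpc-java and override the RPC methods.
    static class GreeterImpl extends GreeterGrpc.GreeterImplBase {
        @Override
        public void sayHello(HelloRequest request, StreamObserver<HelloReply> responseObserver) {
            HelloReply reply = HelloReply.newBuilder()
                    .setMessage("Hello " + request.getName())
                    .build();
            responseObserver.onNext(reply);
            responseObserver.onCompleted();
        }
    }

    public static void main(String[] args) throws Exception {
        Server server = ServerBuilder.forPort(50051)
                .addService(new GreeterImpl())
                .build()
                .start();
        server.awaitTermination();
    }
}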
You can find more details in the README.
Related
I have Spring Boot projects with lots of .json files in addition to the .java files.
For Java formatting, we are using pre-commit hooks with google-java-format. However, I am struggling a bit with formatting the .json files.
I have used the Spotless Maven plugin:
<plugin>
<groupId>com.diffplug.spotless</groupId>
<artifactId>spotless-maven-plugin</artifactId>
<version>${spotless.version}</version>
<configuration>
<formats>
<format>
<includes>
<include>*.json</include>
</includes>
<prettier>
<!-- Specify at most one of the following 3 configs: either 'prettierVersion' (2.0.5 is default) , 'devDependencies' or 'devDependencyProperties' -->
<prettierVersion>2.0.5</prettierVersion>
<!-- Specify config file and/or inline config, the inline always trumps file -->
<config>
<useTabs>true</useTabs>
</config>
</prettier>
</format>
</formats>
</configuration>
<executions>
<execution>
<goals>
<goal>check</goal>
</goals>
</execution>
</executions>
</plugin>
The problem with this approach is that it requires npm to be installed on the machine, otherwise mvn clean install will fail. Many machines on the Jenkins server don't have npm pre-installed, so the build fails there.
Is there an easy way to solve this?
PS: This project uses Git for version control.
Introduction
Let's consider the limitation: the Spotless formatter steps must be Java-based, i.e. avoid using external programs (npm, etc.).
Let's consider Spotless Maven plugin 2.23.0 as the current version.
Let's call Java-based JSON document formatting and validation the feature.
Analysis
According to the feature matrix table, the following formatter steps are available for the Spotless Gradle plugin, but not for the Spotless Maven plugin:
json.gson.GsonStep.
json.JsonSimpleStep.
These feature steps are Java-based.
Summary
Currently, there are no such Spotless formatter steps available for Spotless Maven plugin — the feature is absent.
Possible solutions
Implement feature
Implement the feature.
Create a pull request to propose and collaborate on changes to the GitHub repository.
Request feature
Request the feature by creating an issue in the GitHub repository.
It looks like there was a quick attempt to request it. Please see the comment on:
GitHub: Add JVM-based JSON formatter by jamietanna · Pull Request #853 · diffplug/spotless.
Find and use an additional formatter Maven plugin
Find and use an additional formatter Maven plugin that has the feature.
For example, it seems that the following formatter Maven plugin supports JSON document formatting and validation:
<plugin>
<groupId>net.revelc.code.formatter</groupId>
<artifactId>formatter-maven-plugin</artifactId>
<version>2.19.0</version>
</plugin>
Some related references:
formatter-maven-plugin – Introduction.
formatter-maven-plugin – formatter:format - configJsonFile parameter.
formatter-maven-plugin – formatter:validate - configJsonFile parameter.
formatter-maven-plugin/JsonFormatter.java at formatter-maven-plugin-2.19.0 · revelc/formatter-maven-plugin.
formatter-maven-plugin/JsonFormatterTest.java at formatter-maven-plugin-2.19.0 · revelc/formatter-maven-plugin.
I am trying to integrate Adobe AEM 6.3 (running on Java 1.8) with the Cloudinary SDK. I have done the following, but I keep hitting an exception that I am not able to resolve. Has anyone integrated Cloudinary with AEM and run into similar issues?
Add the dependencies in pom.xml to compile the code:
<dependency>
<groupId>com.cloudinary</groupId>
<artifactId>cloudinary-core</artifactId>
<version>1.24.0</version>
</dependency>
<dependency>
<groupId>com.cloudinary</groupId>
<artifactId>cloudinary-http44</artifactId>
<version>1.24.0</version>
</dependency>
Build an OSGi bundle to ensure AEM gets the right jar files. For this purpose, I followed the steps to create a third-party RESTful service example. To build the bundle, I had to explicitly download the following jar files: cloudinary-1.0.14.jar, cloudinary-core-1.21.0.jar, cloudinary-http44-1.21.0.jar, commons-codec-1.10.jar, commons-collections-3.2.2.jar, commons-lang3-3.1.jar, commons-logging-1.2.jar, httpclient-4.4.jar, httpmime-4.4.jar, jsp-api-2.0.jar
Despite creating a bundle that has httpclient, I get the following exception when trying to upload an image to Cloudinary. Here's the code and the exception.
Code snippet
import com.cloudinary.*;
import com.cloudinary.utils.ObjectUtils; // needed for ObjectUtils.emptyMap() below
..
Cloudinary cloudinary = new Cloudinary("<<credentials>>");
...
File toUpload = new File("/Users/akshayranganath/Downloads/background-2633962_1280.jpg");
try {
Map uploadResult = cloudinary.uploader().upload(toUpload, ObjectUtils.emptyMap());
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
Exception
Caused by: java.lang.NoClassDefFoundError: javax/net/ssl/HostnameVerifier
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
at org.apache.felix.framework.BundleWiringImpl$BundleClassLoader.defineClass(BundleWiringImpl.java:2370)
at org.apache.felix.framework.BundleWiringImpl$BundleClassLoader.findClass(BundleWiringImpl.java:2154)
at org.apache.felix.framework.BundleWiringImpl.findClassOrResourceByDelegation(BundleWiringImpl.java:1542)
at org.apache.felix.framework.BundleWiringImpl.access$400(BundleWiringImpl.java:79)
at org.apache.felix.framework.BundleWiringImpl$BundleClassLoader.loadClass(BundleWiringImpl.java:2018)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at org.apache.http.impl.conn.SchemeRegistryFactory.createDefault(SchemeRegistryFactory.java:52)
at org.apache.http.impl.client.AbstractHttpClient.createClientConnectionManager(AbstractHttpClient.java:321)
at org.apache.http.impl.client.AbstractHttpClient.getConnectionManager(AbstractHttpClient.java:484)
at org.apache.http.impl.client.AbstractHttpClient.createHttpContext(AbstractHttpClient.java:301)
at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:818)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:107)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:55)
at com.cloudinary.Uploader.callApi(Uploader.java:317)
at com.cloudinary.Uploader.upload(Uploader.java:57)
at com.aem.community.core.models.HelloWorldModel.init(HelloWorldModel.java:59)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.sling.models.impl.ModelAdapterFactory.invokePostConstruct(ModelAdapterFactory.java:792)
at org.apache.sling.models.impl.ModelAdapterFactory.createObject(ModelAdapterFactory.java:607)
... 211 common frames omitted
Caused by: java.lang.ClassNotFoundException: javax.net.ssl.HostnameVerifier not found by MyBundle [550]
at org.apache.felix.framework.BundleWiringImpl.findClassOrResourceByDelegation(BundleWiringImpl.java:1574)
at org.apache.felix.framework.BundleWiringImpl.access$400(BundleWiringImpl.java:79)
at org.apache.felix.framework.BundleWiringImpl$BundleClassLoader.loadClass(BundleWiringImpl.java:2018)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 236 common frames omitted
This is the first time I am working with AEM and I may not be following the right steps. Please let me know if anyone has been able to get past this issue.
Update
Based on Alexander's suggestion and a pointer from another source, I added the following code to the parent pom.xml file.
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-bundle-plugin</artifactId>
<version>3.5.0</version>
<configuration>
<instructions>
<Embed-Dependency>*;scope=compile|runtime</Embed-Dependency>
<Embed-Directory>OSGI-INF/lib</Embed-Directory>
<Embed-Transitive>true</Embed-Transitive>
</instructions>
</configuration>
</plugin>
After making this change, the Cloudinary libraries were added to the bundle. Here's the output from AEM: http://localhost:4502/system/console/bundles
Embedded-Artifacts: OSGI-INF/lib/cloudinary-http44-1.21.0.jar; g="com.cloudinary"; a="cloudinary-http44"; v="1.21.0", OSGI-INF/lib/commons-lang3-3.1.jar; g="org.apache.commons"; a="commons-lang3"; v="3.1", OSGI-INF/lib/httpclient-4.4.jar; g="org.apache.httpcomponents"; a="httpclient"; v="4.4", OSGI-INF/lib/httpcore-4.4.jar; g="org.apache.httpcomponents"; a="httpcore"; v="4.4", OSGI-INF/lib/commons-logging-1.2.jar; g="commons-logging"; a="commons-logging"; v="1.2", OSGI-INF/lib/commons-codec-1.9.jar; g="commons-codec"; a="commons-codec"; v="1.9", OSGI-INF/lib/httpmime-4.4.jar; g="org.apache.httpcomponents"; a="httpmime"; v="4.4", OSGI-INF/lib/cloudinary-core-1.21.0.jar; g="com.cloudinary"; a="cloudinary-core"; v="1.21.0"
However, I now get an error with this message:
org.apache.avalon.framework.logger -- Cannot be resolved
org.apache.log -- Cannot be resolved
I am able to resolve the org.apache.avalon.framework.logger error by adding a dependency on the Avalon framework. But I am not able to get past the org.apache.log issue. It looks like there is a version conflict that is causing the problem.
This new error starts when I include the Cloudinary http44 library. This library doesn't appear to directly reference logging (see here for dependencies). Due to this error, the application still fails to go from the Installed to the Active state.
The Cloudinary libraries are available as Maven artifacts. Such JAR files can be put into your bundle as private libraries with the maven-bundle-plugin.
The following sample works for me (even with a Cloudinary test account):
...
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-bundle-plugin</artifactId>
<extensions>true</extensions>
<executions>
<execution>
<!-- Create the bundle late in the compile-phase instead of the package-phase.
So the generated OSGi meta-data is available during JUnit tests. -->
<id>run-before-tests</id>
<phase>process-classes</phase>
<goals>
<goal>bundle</goal>
</goals>
</execution>
</executions>
<configuration>
<instructions>
<Bundle-Name>Test Bundle</Bundle-Name>
<Embed-Dependency>*;groupId=com.cloudinary;scope=compile|runtime</Embed-Dependency>
<Embed-Directory>OSGI-INF/lib</Embed-Directory> <!-- not needed, but nice -->
<Embed-Transitive>true</Embed-Transitive>
</instructions>
</configuration>
</plugin>
...
<dependencies>
<dependency>
<groupId>com.cloudinary</groupId>
<artifactId>cloudinary-core</artifactId>
<version>1.24.0</version>
</dependency>
<dependency>
<groupId>com.cloudinary</groupId>
<artifactId>cloudinary-http44</artifactId>
<version>1.24.0</version>
</dependency>
...
In general, embedding an external library can range from simple to cumbersome to impossible. It depends on the dependencies of the imported artifacts.
Check the dependency tree manually! (e.g. https://mvnrepository.com/)
You have to fiddle with 3 instructions:
Embed-Dependency
These are the libraries that are put into your bundle. Be careful with the asterisk operator, otherwise you may include far too many dependencies (in the case of AEM, easily half of the internet). But do not include too few either! Extract the built bundle .jar to see what is actually included (in the case of Cloudinary it was easy).
Import-Package
Often the libraries have far too many dependencies, especially if they come from another ecosystem (like Spring or JEE containers), or have a lot of semi-optional dependencies. With this setting you can tell OSGi that a bundle can be activated even if certain dependencies are not available.
This is a real-world example:
<Import-Package>
!com.sun.msv.*,
!org.apache.log4j.jmx.*,
!sun.misc.*,
!org.jboss.logging.*,
!org.apache.zookeeper.*,
*
</Import-Package>
Export-Package
Normally the library should be bundle-private. But sometimes you have to import it differently, or the library does something automatically. So you should always check in the system console what your bundle is exporting. If it is not right, you have to fiddle with this setting manually:
Here is an example:
<Export-Package>
!*.internal,
!*.internal.*,
!*.impl,
!*.impl.*,
com.mycompany.myproject.mybundle.*
</Export-Package>
By default all packages (*) are exported, except those named impl or internal. Their child packages are also private (the !*.impl.* rule). If the default doesn't work, then use this instruction to export only what you need.
Whatever you export goes into the global OSGi space. As the AEM and Sling bundles are neither perfect nor 100% bug-free, please make sure that:
the startup/shutdown order of out-of-the-box AEM bundles is not changed
a deployment, re-deployment or un-deployment of your code does not start/stop any out-of-the-box AEM bundles.
If you don't ensure this, you might experience strange deployment issues that are very difficult to find and solve.
So it is best NOT to export anything that is imported by any AEM out-of-the-box bundle. Everything else is for experts only - and even they overestimate themselves and underestimate the long-term cost of patching AEM manually.
PS: the _removeheaders instruction can remove all OSGi instructions that are not needed at runtime. But only do this if you want to provide a bundle to the public and make it totally shiny. I would leave them in, as they are a kind of documentation.
I'm using the maven-release-plugin. I'm trying to release a branch and it's failing when it tries to execute this command:
cmd.exe /X /C "svn --non-interactive copy --file C:\Users\USER~1\AppData\Local\Temp\maven-scm-711744598.commit --parents --revision 0 https://domain/svn/app/branches/2.4.8.x https://domain/svn/app/tags/App-2.4.8.1"
It gives this error:
svn: E195012: Unable to find repository location for 'https://domain/svn/app/branches/2.4.8.x' in revision 0
I think this is happening in the prepare goal because when it fails it says:
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-release-plugin:2.5:prepare
I asked a svn expert about this, and he said:
wait, why is it trying to copy something from r0? By definition there is nothing in r0. r0 is always an empty repository, the first objects are added in r1. That's why it fails. the question is why maven tried it. If you supply a revision argument to 'svn copy' then the branch / tag you create is based on the source from the revision you specify so the source has to exist in that revision (if you don't specify, you get HEAD, i.e., the newest revision) ...and as for that, I know nothing about maven or its plugins
So why is maven trying to copy from revision 0? This is the maven command I ran:
mvn --batch-mode release:prepare release:perform
And my root pom has the maven-release-plugin defined like this:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-release-plugin</artifactId>
<version>2.5</version>
<configuration>
<autoVersionSubmodules>true</autoVersionSubmodules>
<developmentVersion>2.4.8.2-SNAPSHOT</developmentVersion>
<releaseVersion>2.4.8.1</releaseVersion>
<branchBase>https://domain/svn/app/branches</branchBase>
<tagBase>https://domain/svn/app/tags</tagBase>
</configuration>
</plugin>
Also, my scm tag looks like this:
<scm>
<connection>scm:svn:https://domain/svn/app/branches/2.4.8.x</connection>
</scm>
My svn version is 1.8.5 (r1542147)
Just wanted to add this late answer in case anyone has the same problem and the solution in the comment doesn't work.
We had the same problem in a multi-module application where only our parent POM had the SCM tag (which worked perfectly in our other applications). We got the same error but solved it by adding the corresponding SCM tag to each child POM. We never found out why this was...
As I said as a comment above:
I cleaned up EVERYTHING and ran just release:prepare by itself and it succeeded without issue. Perhaps this is a bug where running release:prepare and release:perform together will cause this
I have not run into this issue since running these commands separately.
I also had this problem. In the affected project I had a custom search-and-replace of some files during the validate phase, and I wanted to check in the changes to SVN before tagging, so I added a custom check-in action like this:
<plugin>
<artifactId>maven-release-plugin</artifactId>
<configuration>
<preparationGoals>clean verify scm:checkin -Dmessage="perform release"</preparationGoals>
</configuration>
</plugin>
This had the consequence that when the release plugin tried to check in the changes to the POM file, there were no changes, since they had already been committed by the custom action - thus causing this error.
I added an "includes" file list to my custom scm:checkin which only included the files I had been tampering with, and this fixed the problem for me.
The resulting configuration looked like this:
<plugin>
<artifactId>maven-release-plugin</artifactId>
<configuration>
<preparationGoals>clean verify scm:checkin -Dmessage="perform release" -Dincludes="TwogWebUtilsGrailsPlugin.groovy,plugin.xml" -DconnectionType="connection"</preparationGoals>
</configuration>
</plugin>
The reason for my custom replace action is because the project is a Grails plugin and I was following the guidelines in this blog post.
LATE EDIT: After upgrading to Maven 3.2, this solution seems to break. I am back to where I started.
What is the simplest way to retrieve the version number from Maven's pom.xml in code, i.e., programmatically?
Assuming you're using Java, you can:
Create a .properties file in (most commonly) your src/main/resources directory (but in step 4 you could tell it to look elsewhere).
Set the value of some property in your .properties file using the standard Maven property for project version:
foo.bar=${project.version}
In your Java code, load the value from the properties file as a resource from the classpath (google for copious examples of how to do this, but here's an example for starters).
In Maven, enable resource filtering. This will cause Maven to copy that file into your output classes and translate the resource during that copy, interpreting the property. You can find some info here but you mostly just do this in your pom:
<build>
<resources>
<resource>
<directory>src/main/resources</directory>
<filtering>true</filtering>
</resource>
</resources>
</build>
You can also get at other standard properties like project.name, project.description, or even arbitrary properties you put in your pom's <properties>, etc. Resource filtering, combined with Maven profiles, can give you variable build behavior at build time. When you specify a profile at runtime with -PmyProfile, it can enable properties that then show up in your build.
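For step 3, here is a minimal sketch of reading the filtered value back from the classpath, assuming the file from step 1 was named version.properties and uses the foo.bar key from step 2:
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public final class VersionReader {
    public static String readVersion() throws IOException {
        Properties props = new Properties();
        // version.properties was copied (and filtered) from src/main/resources onto the classpath.
        try (InputStream in = VersionReader.class.getResourceAsStream("/version.properties")) {
            if (in == null) {
                throw new IOException("version.properties not found on the classpath");
            }
            props.load(in);
        }
        return props.getProperty("foo.bar");
    }
}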
The accepted answer may be the best and most stable way to get a version number into an application statically, but does not actually answer the original question: How to retrieve the artifact's version number from pom.xml? Thus, I want to offer an alternative showing how to do it dynamically during runtime:
You can use Maven itself. To be more exact, you can use a Maven library.
<dependency>
<groupId>org.apache.maven</groupId>
<artifactId>maven-model</artifactId>
<version>3.3.9</version>
</dependency>
And then do something like this in Java:
package de.scrum_master.app;
import org.apache.maven.model.Model;
import org.apache.maven.model.io.xpp3.MavenXpp3Reader;
import org.codehaus.plexus.util.xml.pull.XmlPullParserException;
import java.io.FileReader;
import java.io.IOException;
public class Application {
public static void main(String[] args) throws IOException, XmlPullParserException {
MavenXpp3Reader reader = new MavenXpp3Reader();
Model model = reader.read(new FileReader("pom.xml"));
System.out.println(model.getId());
System.out.println(model.getGroupId());
System.out.println(model.getArtifactId());
System.out.println(model.getVersion());
}
}
The console log is as follows:
de.scrum-master.stackoverflow:my-artifact:jar:1.0-SNAPSHOT
de.scrum-master.stackoverflow
my-artifact
1.0-SNAPSHOT
Update 2017-10-31: In order to answer Simon Sobisch's follow-up question I modified the example like this:
package de.scrum_master.app;
import org.apache.maven.model.Model;
import org.apache.maven.model.io.xpp3.MavenXpp3Reader;
import org.codehaus.plexus.util.xml.pull.XmlPullParserException;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.io.InputStreamReader;
public class Application {
public static void main(String[] args) throws IOException, XmlPullParserException {
MavenXpp3Reader reader = new MavenXpp3Reader();
Model model;
if ((new File("pom.xml")).exists())
model = reader.read(new FileReader("pom.xml"));
else
model = reader.read(
new InputStreamReader(
Application.class.getResourceAsStream(
"/META-INF/maven/de.scrum-master.stackoverflow/aspectj-introduce-method/pom.xml"
)
)
);
System.out.println(model.getId());
System.out.println(model.getGroupId());
System.out.println(model.getArtifactId());
System.out.println(model.getVersion());
}
}
Packaged artifacts contain a META-INF/maven/${groupId}/${artifactId}/pom.properties file whose content looks like:
#Generated by Maven
#Sun Feb 21 23:38:24 GMT 2010
version=2.5
groupId=commons-lang
artifactId=commons-lang
Many applications use this file to read the application/jar version at runtime; zero setup is required.
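For example, a minimal sketch of reading it, assuming you know your own groupId and artifactId (they are part of the path inside the jar):
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public final class PomPropertiesVersion {
    public static String read(String groupId, String artifactId) throws IOException {
        String path = "/META-INF/maven/" + groupId + "/" + artifactId + "/pom.properties";
        try (InputStream in = PomPropertiesVersion.class.getResourceAsStream(path)) {
            if (in == null) {
                // Not running from the packaged jar (IDE, tests) - see the caveat below.
                return null;
            }
            Properties props = new Properties();
            props.load(in);
            return props.getProperty("version");
        }
    }
}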
The only problem with the above approach is that this file is (currently) generated during the package phase and will thus not be present during tests, etc (there is a Jira issue to change this, see MJAR-76). If this is an issue for you, then the approach described by Alex is the way to go.
There is also the method described in Easy way to display your apps version number using Maven:
Add this to pom.xml
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<configuration>
<archive>
<manifest>
<mainClass>test.App</mainClass>
<addDefaultImplementationEntries>
true
</addDefaultImplementationEntries>
</manifest>
</archive>
</configuration>
</plugin>
</plugins>
</build>
Then use this:
App.class.getPackage().getImplementationVersion()
I have found this method to be simpler.
If you use mvn packaging such as jar or war, use:
getClass().getPackage().getImplementationVersion()
It reads the "Implementation-Version" property of the generated META-INF/MANIFEST.MF in the archive (which is set to the pom.xml version).
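Note that the entry only exists in the packaged archive (and, depending on your plugin versions, only if addDefaultImplementationEntries is enabled as in the previous answer), so the call returns null when the code runs unpackaged, e.g. from the IDE or in plain unit tests. A small sketch with a fallback:
public final class AppVersion {
    public static String get() {
        // Implementation-Version is read from META-INF/MANIFEST.MF of the packaged jar/war.
        String version = AppVersion.class.getPackage().getImplementationVersion();
        // Outside a packaged archive (IDE, plain unit tests) the entry is missing and null is returned.
        return version != null ? version : "unknown";
    }
}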
To complement what @kieste has posted, which I think is the best way to have Maven build information available in your code if you're using Spring Boot: the documentation at http://docs.spring.io/spring-boot/docs/current/reference/htmlsingle/#production-ready-application-info is very useful.
You just need to activate the actuators and add the properties you need in your application.properties or application.yml:
Automatic property expansion using Maven
You can automatically expand info properties from the Maven project using resource filtering. If you use the spring-boot-starter-parent, you can then refer to your Maven ‘project properties’ via @..@ placeholders, e.g.
project.artifactId=myproject
project.name=Demo
project.version=X.X.X.X
project.description=Demo project for info endpoint
info.build.artifact=@project.artifactId@
info.build.name=@project.name@
info.build.description=@project.description@
info.build.version=@project.version@
When using Spring Boot, this link might be useful: https://docs.spring.io/spring-boot/docs/2.3.x/reference/html/howto.html#howto-properties-and-configuration
With spring-boot-starter-parent you just need to add the following to your application config file:
# get values from pom.xml
pom.version=@project.version@
After that the value is available like this:
@Value("${pom.version}")
private String pomVersion;
Sometimes the Maven command line is sufficient when scripting something related to the project version, e.g. for artifact retrieval via URL from a repository:
mvn help:evaluate -Dexpression=project.version -q -DforceStdout
Usage example:
VERSION=$( mvn help:evaluate -Dexpression=project.version -q -DforceStdout )
ARTIFACT_ID=$( mvn help:evaluate -Dexpression=project.artifactId -q -DforceStdout )
GROUP_ID_URL=$( mvn help:evaluate -Dexpression=project.groupId -q -DforceStdout | sed -e 's#\.#/#g' )
curl -f -S -O http://REPO-URL/mvn-repos/${GROUP_ID_URL}/${ARTIFACT_ID}/${VERSION}/${ARTIFACT_ID}-${VERSION}.jar
Use this library for a simple solution. Add whatever you need to the manifest and then query it by string.
System.out.println("JAR was created by " + Manifests.read("Created-By"));
http://manifests.jcabi.com/index.html
<build>
<finalName>${project.artifactId}-${project.version}</finalName>
<pluginManagement>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>3.2.2</version>
<configuration>
<failOnMissingWebXml>false</failOnMissingWebXml>
<archive>
<manifest>
<addDefaultImplementationEntries>true</addDefaultImplementationEntries>
<addDefaultSpecificationEntries>true</addDefaultSpecificationEntries>
</manifest>
</archive>
</configuration>
</plugin>
</plugins>
</pluginManagement>
</build>
Get the version using this.getClass().getPackage().getImplementationVersion()
PS: Don't forget to add:
<manifest>
<addDefaultImplementationEntries>true</addDefaultImplementationEntries>
<addDefaultSpecificationEntries>true</addDefaultSpecificationEntries>
</manifest>
Step 1: If you are using Spring Boot, your pom.xml should already contain spring-boot-maven-plugin. You just need to add the following configuration.
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<executions>
<execution>
<id>build-info</id>
<goals>
<goal>build-info</goal>
</goals>
</execution>
</executions>
</plugin>
It instructs the plugin to also execute the build-info goal, which is not run by default. This generates build metadata about your application, including the artifact version, build time and more.
Step 2: Access the build properties with the BuildProperties bean. In our case we create a REST resource to expose this build info in our webapp:
@RestController
@RequestMapping("/api")
public class BuildInfoResource {
    @Autowired
    private BuildProperties buildProperties;

    @GetMapping("/build-info")
    public ResponseEntity<Map<String, String>> getBuildInfo() {
        Map<String, String> buildInfo = new HashMap<>();
        buildInfo.put("appName", buildProperties.getName());
        buildInfo.put("appArtifactId", buildProperties.getArtifact());
        buildInfo.put("appVersion", buildProperties.getVersion());
        buildInfo.put("appBuildDateTime", String.valueOf(buildProperties.getTime()));
        return ResponseEntity.ok().body(buildInfo);
    }
}
I hope this will help
I had the same problem in my day job. Even though many of the answers help find the version of a specific artifact, we needed to get the version of modules/jars that are not a direct dependency of the application. The classpath is assembled from multiple modules when the application starts; the main application module has no knowledge of how many jars are added later.
That's why I came up with a different solution, which may be a little more elegant than having to read XML or properties from jar files.
The idea
Use a Java service loader approach so that any number of components/artifacts can be added later, each contributing its own version at runtime. Create a very lightweight library with just a few lines of code to read, find, filter and sort all of the artifact versions on the classpath.
Create a Maven source code generator plugin that generates the service implementation for each of the modules at compile time, packaging a very simple service in each of the jars.
The solution
Part one of the solution is the artifact-version-service library, which can now be found on GitHub and Maven Central. It covers the service definition and a few ways to get the artifact versions at runtime.
Part two is the artifact-version-maven-plugin, which can also be found on GitHub and Maven Central. It is used to have a hassle-free generator implementing the service definition for each of the artifacts.
Examples
Fetching all modules with coordinates
No more reading jar manifests, just a simple method call:
// iterate list of artifact dependencies
for (Artifact artifact : ArtifactVersionCollector.collectArtifacts()) {
// print simple artifact string example
System.out.println("artifact = " + artifact);
}
A sorted set of artifacts is returned. To modify the sorting order, provide a custom comparator:
new ArtifactVersionCollector(Comparator.comparing(Artifact::getVersion)).collect();
This way the list of artifacts is returned sorted by version numbers.
Find a specific artifact
ArtifactVersionCollector.findArtifact("de.westemeyer", "artifact-version-service");
Fetches the version details for a specific artifact.
Find artifacts with matching groupId(s)
Find all artifacts with groupId de.westemeyer (exact match):
ArtifactVersionCollector.findArtifactsByGroupId("de.westemeyer", true);
Find all artifacts where groupId starts with de.westemeyer:
ArtifactVersionCollector.findArtifactsByGroupId("de.westemeyer", false);
Sort result by version number:
new ArtifactVersionCollector(Comparator.comparing(Artifact::getVersion)).artifactsByGroupId("de.", false);
Implement custom actions on list of artifacts
By supplying a lambda, the very first example could be implemented like this:
ArtifactVersionCollector.iterateArtifacts(a -> {
System.out.println(a);
return false;
});
Installation
Add these two tags to all pom.xml files, or maybe to a company master pom somewhere:
<build>
<plugins>
<plugin>
<groupId>de.westemeyer</groupId>
<artifactId>artifact-version-maven-plugin</artifactId>
<version>1.1.0</version>
<executions>
<execution>
<goals>
<goal>generate-service</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>de.westemeyer</groupId>
<artifactId>artifact-version-service</artifactId>
<version>1.1.0</version>
</dependency>
</dependencies>
Feedback
It would be great if maybe some people could give the solution a try. Getting feedback about whether you think the solution fits your needs would be even better. So please don't hesitate to add a new issue on any of the github projects if you have any suggestions, feature requests, problems, whatsoever.
Licence
All of the source code is open source, free to use even for commercial products (MIT licence).
It's very easy and no configuration is needed if you use Spring with Maven.
According to the “Automatic Property Expansion Using Maven” official documentation, you can automatically expand properties from the Maven project by using resource filtering. If you use the spring-boot-starter-parent, you can then refer to your Maven ‘project properties’ with @..@ placeholders, as shown in the following example:
project.version=@project.version@
project.artifactId=@project.artifactId@
And you can retrieve it with the @Value annotation in any class:
@Value("${project.artifactId}@${project.version}")
private String RELEASE;
I hope this helps!
With reference to ketankk's answer:
Unfortunately, adding this messed with how my application dealt with resources:
<build>
<resources>
<resource>
<directory>src/main/resources</directory>
<filtering>true</filtering>
</resource>
</resources>
</build>
But using this inside the maven-assembly-plugin's <manifest> tag did the trick:
<addDefaultImplementationEntries>true</addDefaultImplementationEntries>
<addDefaultSpecificationEntries>true</addDefaultSpecificationEntries>
So I was able to get version using
String version = getClass().getPackage().getImplementationVersion();
Preface: Because I remembered this often-referenced question after having answered it a few years ago (showing a dynamic approach that actually accesses the Maven POM info at runtime, e.g. also during tests), today I found a similar question which involved accessing module A's Maven info from another module B.
I thought about it for a moment and spontaneously had the idea to use a special annotation, applying it to a package declaration in package-info.java. I also created a multi-module example project on GitHub. I do not want to repeat the whole answer, so please see solution B in that answer. The Maven setup involves the Templating Maven Plugin, but it could also be solved in a more verbose way using a combination of resource filtering and adding the generated-sources directory to the build via the Build Helper Maven Plugin. I wanted to avoid that, so I simply used the Templating Maven Plugin.
The accepted answer worked for me once I changed ${project.version} to ${pom.version} in step 2.
This should be simple.
Question
How do you get a pointcut in one project to advise the code/classes within another project?
Context
I'm working in Eclipse with two projects. For ease of explanation, let's call one the science project and the other the math project, and say the science project relies on the math project; I'm developing in both projects concurrently. The math project is a core product in production, and life will be easier if I don't modify its code much.
Currently, I'm debugging the interaction between these two projects. To assist with that, I'm writing an Aspect (within the science project) to log key information as the math code (and science code) executes.
Example
I'm running a simple example aspect along the lines of:
package org.science.example;
public aspect ScientificLog {
public pointcut testCut() : execution (public * *.*(..));
before() : testCut() {
//do stuff
}
}
Problem
The problem is, no matter what pointcut I create, it only advises code from the science project. No classes from org.math.example are crosscut, AT ALL! I tried adding the math project to the inpath of the science project by going to project properties > AspectJ Build > Inpath, clicking add project, and choosing the math project. That didn't work, but it seems like I need to do something along those lines.
Thanks, in advance, for any suggestions...
-gMale
EDIT 1:
Since writing this, I've noticed the project is giving the following error:
Caused by: org.aspectj.weaver.BCException: Unable to continue, this version of AspectJ
supports classes built with weaver version 6.0 but the class
com.our.project.adapter.GenericMessagingAdapter is version 7.0
when batch building BuildConfig[null] #Files=52 AopXmls=#0
So maybe this is set up properly and the error is more subtle. BTW, the class mentioned is from the "science project," so to speak. This happens even after I clean the project. I'm currently googling this error...
EDIT 2:
I found the solution to the error above in
comment #5 here
The problem is that the aspectj-maven-plugin's POM file declares a dependency on aspectjtools version 1.6.7. So, when configuring the plugin, that transitive dependency has to be overridden. Here's the related snippet for the POM file that fixes the problem by specifying version 1.6.9 instead of 1.6.7:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>aspectj-maven-plugin</artifactId>
<version>1.3</version>
<dependencies>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjtools</artifactId>
<version>1.6.9</version>
</dependency>
</dependencies>
<configuration>
<source>1.6</source>
<target>1.6</target>
</configuration>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>test-compile</goal>
</goals>
</execution>
</executions>
</plugin>
Your second problem is unrelated to the first. It is saying that com.our.project.adapter.GenericMessagingAdapter was originally compiled and woven with a newer version of AspectJ, but is now being binary-woven with an older version of AspectJ.
This is essentially the same problem as when you try to run Java classes compiled under 1.6 on a 1.5 VM.
The version number was revved up for the release of AspectJ 1.6.8 (I think, or maybe it was 1.6.7).
The solution is to make sure you are using the latest version of AspectJ for all of your projects (eg- 1.6.9, or dev builds of 1.6.10).
When you add the math project to the inpath of the science project, all of the math project's code is sent through the AspectJ weaver and properly woven. The results of that weave are written to the science project's output folder (not the math project's). So, if you look in the science project's bin folder, you should see the woven classes there.
If you want to keep the inpath files separate from the regular files, you can specify an inpath out folder. This folder should also be added to the classpath as a binary folder. Also, this folder should be placed above the project dependency on the math project in the "Order and Export" tab of the Java Build Path page for the science project.
Finally, if you run the main class from the science project, rather than from the math project, you will be executing the woven code.
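If you then want the logging aspect to advise only the math code rather than every public method, you can also narrow the pointcut. A sketch, assuming the math classes live in org.math.example and its subpackages as in the question:
package org.science.example;

public aspect MathProjectLog {
    // Match only public methods of types in org.math.example and its subpackages.
    pointcut mathCut() : execution(public * org.math.example..*.*(..));

    before() : mathCut() {
        System.out.println("entering " + thisJoinPointStaticPart.getSignature());
    }
}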