I'm creating a custom Gradle plugin for internal company use. It will add a few tasks to the project, and the behaviour of one task can be customized by plugin users. The idea is to have a plugin property that contains the name of an external class. This class must implement an appropriate interface to be used correctly. The plugin's task will instantiate an object of this class and use it during execution.
The reason for this: there are several reasonably different patterns used by different teams in the company, so a set of these "external classes" will be created and published. Each team can choose which one to use for their build configuration, or even create a new one if there are reasons to. So I want this to be configurable at the build level.
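For context, the contract those external classes are expected to implement could look something like this (a minimal sketch only; the interface name BuildManager and its single method are hypothetical, not the real plugin API):
package my.company.api; // hypothetical package, for illustration only

// Hypothetical contract each team's "external class" would implement;
// the plugin task depends only on this interface, not on the concrete classes.
public interface BuildManager {
    void run();
}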
I'm failing to set up this kind of dependency in the build.gradle script. Here is the code with which I'm trying to reproduce and solve the issue:
buildscript {
    repositories {
        mavenCentral()
        maven {
            url "http://our-internal-nexus/repository/maven-releases/"
        }
    }
    dependencies {
        classpath 'my.company:myplugin:0.1'
        classpath 'my.other.company:extClass:0.1'
    }
}
apply plugin: 'my.company.myplugin'
MyInput {
    managerClass = "ExtClass"
}
myplugin is the artifact of my plugin, and extClass is the external class that should be instantiated by the plugin's task.
When I try to execute the plugin's task with gradle helloTask, I receive the error: java.lang.ClassNotFoundException: ExtClass
I put some code into the HelloTask class definition to print the classpath. The only thing it shows is C:/work/Projects/development/gradle-4.0.1/lib/gradle-launcher-4.0.1.jar, so it looks like Gradle does not pass the path of the extClass jar to the plugin at runtime, which is why it can't be found.
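For debugging, printing the classloader chain of the task class itself (instead of the system classloader) is more telling; a rough sketch of such a helper, which could be called from the task action:
// Debugging helper (not part of the plugin): walks the classloader chain of the
// task class and prints the URLs of every URLClassLoader in it. The buildscript
// classpath (the myplugin and extClass jars) should appear on one of these
// loaders rather than on the system classloader.
private void printClassLoaderChain() {
    ClassLoader cl = getClass().getClassLoader();
    while (cl != null) {
        System.out.println(cl);
        if (cl instanceof java.net.URLClassLoader) {
            for (java.net.URL url : ((java.net.URLClassLoader) cl).getURLs()) {
                System.out.println("  " + url);
            }
        }
        cl = cl.getParent();
    }
}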
Below you can find the source code of the plugin and extClass, in case it helps.
MyPlugin
MyPlugin.java
package my.company;

import org.gradle.api.*;

// Plugin definition
public class MyPlugin implements Plugin<Project> {
    @Override
    public void apply(Project project) {
        project.getExtensions().create("MyInput", MyPluginExtension.class);
        HelloTask helloTask = project.getTasks().create("helloTask", HelloTask.class);
    }
}
HelloTask.java
package my.company;

import java.net.URL;
import java.net.URLClassLoader;

import org.gradle.api.*;
import org.gradle.api.tasks.*;

// Plugin task
public class HelloTask extends DefaultTask {
    @TaskAction
    public void action() {
        // Print the system classloader's classpath
        ClassLoader sysClassLoader = ClassLoader.getSystemClassLoader();
        URL[] urls = ((URLClassLoader) sysClassLoader).getURLs();
        for (int i = 0; i < urls.length; i++) {
            System.out.println(urls[i].getFile());
        }

        // Try to instantiate the configured class
        try {
            MyPluginExtension extension = getProject().getExtensions().findByType(MyPluginExtension.class);
            Object instance = Class.forName(extension.getManagerClass()).newInstance();
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
            throw new GradleException("Class not found");
        } catch (IllegalAccessException e) {
            e.printStackTrace();
            throw new GradleException("IllegalAccessException");
        } catch (InstantiationException e) {
            e.printStackTrace();
            throw new GradleException("InstantiationException");
        }
    }
}
MyPluginExtension.java
package my.company;

public class MyPluginExtension {
    private String managerClass = null;

    public String getManagerClass() { return this.managerClass; }
    public void setManagerClass(String managerClass) { this.managerClass = managerClass; }
}
extClass
ExtClass.java
package my.other.company;

public class ExtClass {
    public ExtClass() {
        System.out.println("Show me how it works!");
    }
}
Even though you already answered your own question (you can also accept it), I would like to add a small remark:
If you want to provide an option to set a class in your plugin extension, why not let the user set the class directly by specifying a Class<?> instead of a String? Every class added through one of the classpath dependencies is available in the build.gradle file. You would still need to specify the package, but you can import it just like in Java. Also, Groovy does not require the .class suffix, so you can simply assign the class to the extension property:
import my.other.company.ExtClass
[...]
MyInput {
    managerClass = ExtClass
}
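On the plugin side this means the extension property (and the task) would work with a Class<?> instead of a String — a rough sketch of that variant, not the code from the question:
package my.company;

// Sketch: extension that accepts the class object directly instead of its name.
public class MyPluginExtension {
    private Class<?> managerClass;

    public Class<?> getManagerClass() { return this.managerClass; }
    public void setManagerClass(Class<?> managerClass) { this.managerClass = managerClass; }
}

The task then no longer needs Class.forName() at all; extension.getManagerClass().newInstance() is enough, and a typo in the class name already fails when the build script is evaluated rather than deep inside the task.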
OK, as always, the answer comes as soon as you post the question.
I needed to change managerClass = "ExtClass" to managerClass = "my.other.company.ExtClass" (the fully qualified class name), and now everything works as expected.
Related
I want to create a custom Gradle plugin that will encapsulate Checkstyle and PMD configurations, so other projects can just apply one custom plugin without bothering with any additional configuration.
I applied the checkstyle plugin.
plugins {
    id 'java-gradle-plugin'
    id 'checkstyle'
}
And then I applied it inside my custom plugin.
public class CustomPlugin implements Plugin<Project> {
    public void apply(Project project) {
        project.getPluginManager().apply(CheckstylePlugin.class);
    }
}
When I try to build the project I get an error.
Unable to find: config/checkstyle/checkstyle.xml
How can I override another plugin's properties? For example, I want to change the default checkstyle.xml path. I can do it manually inside the build.gradle of the plugin project itself, but in that case, other projects that apply the plugin won't have this configuration defined by default (I tested it).
EDIT 1:
I managed to configure the checkstyle plugin with CheckstyleExtension.
public class MetricCodingRulesGradlePluginPlugin implements Plugin<Project> {
    public void apply(Project project) {
        project.getPluginManager().apply("checkstyle");
        project.getExtensions().configure(CheckstyleExtension.class, checkstyleExtension -> {
            checkstyleExtension.setConfigFile(new File("style/checkstyle.xml"));
        });
    }
}
checkstyle.xml is placed in the plugin project. When I apply the plugin within any other project, checkstyle searches for it inside the current project directory, not the plugin's one. Is it possible to overcome this issue? I don't want users of the plugin to have to put any additional files inside their project.
EDIT 2:
I put the config files into the resources folder and tried to read the content.
public class MetricCodingRulesGradlePluginPlugin implements Plugin<Project> {
    public void apply(Project project) {
        project.getPluginManager().apply("checkstyle");
        project.getExtensions().configure(CheckstyleExtension.class, checkstyleExtension -> {
            URL url = getClass().getClassLoader().getResource("style/checkstyle.xml");
            System.out.println("URL: " + url);
            try {
                checkstyleExtension.setConfigFile(
                        Paths.get(url.toURI()).toFile()
                );
            } catch (URISyntaxException e) {
                throw new RuntimeException(e);
            }
        });
    }
}
When I apply the plugin to another project, I get the following error:
URL: jar:file:/Users/user/.gradle/caches/jars-9/8f4176a8ae146bf601f1214b287eb805/my-plugin-0.0.1-SNAPSHOT.jar!/style/checkstyle.xml
Caused by: java.nio.file.FileSystemNotFoundException
at com.sun.nio.zipfs.ZipFileSystemProvider.getFileSystem(ZipFileSystemProvider.java:171)
at com.sun.nio.zipfs.ZipFileSystemProvider.getPath(ZipFileSystemProvider.java:157)
Java cannot read the file from the jar archive for some reason. Any approaches to overcome this error?
You'd need to bundle the checkstyle.xml within your plugin's resources folder, so when you ship it, you can always access it from within the plugin code.
Basically, you need to put the config under src/main/resources/checkstyle.xml of the plugin and then access it like this:
URL resourceURL = getClass().getClassLoader().getResource("checkstyle.xml");
if (resourceURL != null) {
    File resourceFile = new File(resourceURL.getFile());
    checkstyleExtension.setConfigFile(resourceFile);
}
Also remember, if you ship your plugin as a .jar, you'd need to unpack the checkstyle.xml into a temp file beforehand. Roughly:
File temp = File.createTempFile(".checkstyle", ".xml");
try (FileOutputStream out = new FileOutputStream(temp)) {
    try (InputStream resourceStream = getClass().getClassLoader().getResourceAsStream("checkstyle.xml")) {
        // Copy the bundled resource from the plugin jar into the temp file
        byte[] buffer = new byte[1024];
        int bytes = resourceStream.read(buffer);
        while (bytes >= 0) {
            out.write(buffer, 0, bytes);
            bytes = resourceStream.read(buffer);
        }
    }
}
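Putting the pieces together, the plugin's apply() method could look roughly like this (a sketch only — the package and class names are made up, and it assumes the config is bundled as src/main/resources/checkstyle.xml):
package com.example.rules; // hypothetical package, for illustration

import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.StandardCopyOption;

import org.gradle.api.GradleException;
import org.gradle.api.Plugin;
import org.gradle.api.Project;
import org.gradle.api.plugins.quality.CheckstyleExtension;

public class CodingRulesPlugin implements Plugin<Project> {
    @Override
    public void apply(Project project) {
        project.getPluginManager().apply("checkstyle");
        project.getExtensions().configure(CheckstyleExtension.class, checkstyle -> {
            try {
                // Extract the bundled config from the plugin jar into a temp file,
                // because the Checkstyle extension expects a real file on disk.
                File temp = File.createTempFile("checkstyle", ".xml");
                temp.deleteOnExit();
                try (InputStream in = getClass().getClassLoader().getResourceAsStream("checkstyle.xml")) {
                    if (in == null) {
                        throw new GradleException("Bundled checkstyle.xml not found on the plugin classpath");
                    }
                    Files.copy(in, temp.toPath(), StandardCopyOption.REPLACE_EXISTING);
                }
                checkstyle.setConfigFile(temp);
            } catch (IOException e) {
                throw new GradleException("Could not extract the bundled checkstyle.xml", e);
            }
        });
    }
}

Extracting to a temp file keeps user projects free of extra config files, at the cost of re-writing the file on each configuration run.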
I'm running into a problem with an AntTask run within the maven-antrun-plugin. Unfortunately, the AntTask uses the plugin classloader to locate a file from the project, but when run from within a plugin, the build output is not included in the plugin's classpath.
From the Guide to Maven Classloading:
Please note that the plugin classloader does neither contain the
dependencies of the current project nor its build output.
...
Plugins are free to create further classloaders on their discretion.
For example, a plugin might want to create a classloader that combines
the plugin class path and the project class path.
Can anyone point me in the right direction on how to create my own version of the maven-antrun-plugin in which I can create a classloader that combines the plugin classpath and the project classpath? I need to update the classloader such that when a class executed by my custom antrun plugin calls:
getClass().getClassLoader().getResource()
the classloader will search the build output folder as well.
After several hours trying to work my way around this issue with configuration, I bit the bullet and simply wrote my own plugin that extends the AntRun plugin. This was done using Maven 3.2.5:
@Mojo(name = "run", threadSafe = true, requiresDependencyResolution = ResolutionScope.TEST)
public class CustomAntMojo extends AntRunMojo {

    @Component
    private PluginDescriptor pluginDescriptor;

    public void execute() throws MojoExecutionException {
        File buildDirectory = new File(getMavenProject().getBuild().getOutputDirectory());

        // add the build directory to the classpath for the classloader
        try {
            ClassRealm realm = pluginDescriptor.getClassRealm();
            realm.addURL(buildDirectory.toURI().toURL());
        } catch (MalformedURLException e1) {
            e1.printStackTrace();
        }

        // configure the log4j logger to output the ant logs to the maven log
        BasicConfigurator.configure(new MavenLoggerLog4jBridge(getLog()));

        super.execute();
    }
}
The MavenLoggerLog4jBridge class is used to convert my Ant task's Log4j output to the Maven logger (https://stackoverflow.com/a/6948208/827480):
import org.apache.log4j.AppenderSkeleton;
import org.apache.log4j.Level;
import org.apache.log4j.spi.LoggingEvent;
import org.apache.maven.plugin.logging.Log;

public class MavenLoggerLog4jBridge extends AppenderSkeleton {
    private Log logger;

    public MavenLoggerLog4jBridge(Log logger) {
        this.logger = logger;
    }

    protected void append(LoggingEvent event) {
        int level = event.getLevel().toInt();
        String msg = event.getMessage().toString();
        if (level <= Level.DEBUG_INT) {
            this.logger.debug(msg);
        } else if (level == Level.INFO_INT) {
            this.logger.info(msg);
        } else if (level == Level.WARN_INT) {
            this.logger.warn(msg);
        } else if (level == Level.ERROR_INT || level == Level.FATAL_INT) {
            this.logger.error(msg);
        }
    }

    public void close() {
    }

    public boolean requiresLayout() {
        return false;
    }
}
Hopefully it might be of some use or assistance to someone in the future.
I have a simple parent project with modules/applications within it. My build tool of choice is gradle. The parent build.gradle is defined below.
apply plugin: 'groovy'

dependencies {
    compile gradleApi()
    compile localGroovy()
}

allprojects {
    repositories {
        mavenCentral()
    }

    version "0.1.0-SNAPSHOT"
}
What I would like to do is utilize the version attribute (0.1.0-SNAPSHOT) within my Swing application. Specifically, I'd like it to display in the title bar of the main JFrame. I expect to be able to do something like this.setTitle("My Application - v." + ???.version);
The application is a plain Java project, but I'm not opposed to adding Groovy support if it will help.
I like creating a properties file during the build. Here's a way to do that from Gradle directly:
task createProperties(dependsOn: processResources) {
    doLast {
        new File("$buildDir/resources/main/version.properties").withWriter { w ->
            Properties p = new Properties()
            p['version'] = project.version.toString()
            p.store w, null
        }
    }
}

classes {
    dependsOn createProperties
}
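On the application side, the generated file can then be read from the classpath and used for the frame title — a small sketch, assuming the file ends up in the jar as /version.properties (the class name MainFrame is just illustrative):
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;
import javax.swing.JFrame;

public class MainFrame extends JFrame {

    public MainFrame() {
        setTitle("My Application - v." + loadVersion());
    }

    // Reads the version written by the createProperties task; falls back to a
    // placeholder when the resource is missing (e.g. when running from the IDE
    // before the file has been generated).
    private static String loadVersion() {
        Properties props = new Properties();
        try (InputStream in = MainFrame.class.getResourceAsStream("/version.properties")) {
            if (in != null) {
                props.load(in);
            }
        } catch (IOException ignored) {
            // fall back to the default below
        }
        return props.getProperty("version", "DEVELOPMENT");
    }
}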
You can always use brute force, as somebody suggested, and generate a properties file during the build. A more elegant answer, which only works partially, is to use
getClass().getPackage().getImplementationVersion()
The problem is that this only works if you run your application from the generated jar; if you run it directly from the IDE/expanded classes, getPackage() above will return null. That is good enough for many cases: just display 'DEVELOPMENT' when running from the IDE (getting null), and it will work for actual client deployments.
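A minimal sketch of that fallback (the class name VersionInfo is illustrative, and it assumes the jar manifest actually carries an Implementation-Version entry):
public final class VersionInfo {
    // Returns the Implementation-Version from the jar manifest, or "DEVELOPMENT"
    // when running from the IDE / expanded classes, where it is null.
    public static String current() {
        String version = VersionInfo.class.getPackage().getImplementationVersion();
        return version != null ? version : "DEVELOPMENT";
    }
}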
A better idea is to keep the project version in the gradle.properties file. All properties from this file are automatically loaded and can be used in the build.gradle script.
Then, if you need the version in your Swing application, add a version.properties file under the src/main/resources folder and filter this file during the application build; here is a post that shows how it should be done.
version.properties will be included in the final jar, so it can be read via a ClassLoader and the properties from this file can be displayed in the application.
A simpler and updated version of @Craig Trader's solution (ready for Gradle 4.0/5.0):
task createProperties {
    doLast {
        def version = project.version.toString()
        def file = new File("$buildDir/resources/main/version.txt")
        file.write(version)
    }
}

war {
    dependsOn createProperties
}
I used @Craig Trader's answer, but had to make quite a few changes to get it to work (it also adds Git details):
task createProperties() {
    doLast {
        def details = versionDetails()
        new File("$buildDir/resources/main/version.properties").withWriter { w ->
            Properties p = new Properties()
            p['version'] = project.version.toString()
            p['gitLastTag'] = details.lastTag
            p['gitCommitDistance'] = details.commitDistance.toString()
            p['gitHash'] = details.gitHash.toString()
            p['gitHashFull'] = details.gitHashFull.toString() // full 40-character Git commit hash
            p['gitBranchName'] = details.branchName // is null if the repository is in detached HEAD mode
            p['gitIsCleanTag'] = details.isCleanTag.toString()
            p.store w, null
        }
        // copy needed, otherwise the bean VersionController can't load the file at startup when running complete-app tests.
        copy {
            from "$buildDir/resources/main/version.properties"
            into "bin/main/"
        }
    }
}

classes {
    dependsOn createProperties
}
And load it from the constructor of the VersionController class:
import static net.logstash.logback.argument.StructuredArguments.v;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.info.BuildProperties;
import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStream;
import java.util.Map;
import java.util.Properties;
import java.util.Set;

@RestController
public class VersionController {

    final static Logger log = LoggerFactory.getLogger(VersionController.class);

    private Properties versionProperties = new Properties();
    private String gitLastTag;
    private String gitHash;
    private String gitBranchName;
    private String gitIsCleanTag;

    VersionController() {
        String AllGitVersionProperties = "";
        // Note: plain ClassLoader resource paths do not use the "classpath:" prefix.
        InputStream inputStream = getClass().getClassLoader().getResourceAsStream("version.properties");
        if (inputStream == null) {
            // When running unit tests, no jar is built, so we load a copy of the file that we saved during build.gradle.
            // Possibly this also is the case during debugging, therefore we save in bin/main instead of bin/test.
            try {
                inputStream = new FileInputStream("bin/main/version.properties");
            } catch (FileNotFoundException e) {
                e.printStackTrace();
            }
        }
        try {
            versionProperties.load(inputStream);
        } catch (IOException e) {
            AllGitVersionProperties += e.getMessage() + ":";
            log.error("Could not load version.properties from the classpath", e);
        }
        gitLastTag = versionProperties.getProperty("gitLastTag", "last-tag-not-found");
        gitHash = versionProperties.getProperty("gitHash", "git-hash-not-found");
        gitBranchName = versionProperties.getProperty("gitBranchName", "git-branch-name-not-found");
        gitIsCleanTag = versionProperties.getProperty("gitIsCleanTag", "git-isCleanTag-not-found");

        Set<Map.Entry<Object, Object>> mainPropertiesSet = versionProperties.entrySet();
        for (Map.Entry oneEntry : mainPropertiesSet) {
            AllGitVersionProperties += "+" + oneEntry.getKey() + ":" + oneEntry.getValue();
        }
        log.info("All Git Version-Properties:", v("GitVersionProperties", AllGitVersionProperties));
    }
}
Using @Craig Trader's solution to save the properties in a version.properties file, add to build.gradle:
task createProperties() {
    doLast {
        def details = versionDetails()
        new File("$buildDir/resources/main/version.properties").withWriter { w ->
            Properties p = new Properties()
            p['version'] = project.version.toString()
            p['gitLastTag'] = details.lastTag
            p['gitCommitDistance'] = details.commitDistance.toString()
            p['gitHash'] = details.gitHash.toString()
            p['gitHashFull'] = details.gitHashFull.toString() // full 40-character Git commit hash
            p['gitBranchName'] = details.branchName // is null if the repository is in detached HEAD mode
            p['gitIsCleanTag'] = details.isCleanTag.toString()
            p.store w, null
        }
        // copy needed, otherwise the bean VersionController can't load the file at startup when running complete-app tests.
        copy {
            from "$buildDir/resources/main/version.properties"
            into "bin/main/"
        }
    }
}

classes {
    dependsOn createProperties
}
To load the properties from version.properties at runtime, you need to annotate your class with @PropertySource({"classpath:version.properties"}).
Then you can assign a property to a private variable with an annotation like:
@Value("${gitLastTag}")
private String gitLastTag;
Full example:
package com.versioncontroller;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.PropertySource;

import javax.annotation.PostConstruct;
import java.util.Properties;

@PropertySource({"classpath:version.properties"})
public class VersionController {

    @Value("${gitLastTag}")
    private String gitLastTag;

    @Value("${gitHash}")
    private String gitHash;

    @Value("${gitBranchName}")
    private String gitBranchName;

    @Value("${gitIsCleanTag}")
    private String gitIsCleanTag;

    @PostConstruct // properties are only set after the constructor has run
    private void logVersion() {
        // when called during the constructor, all values are null.
        System.out.println("All Git Version-Properties:");
        System.out.println("gitLastTag: " + gitLastTag);
        System.out.println("gitHash: " + gitHash);
        System.out.println("gitBranchName: " + gitBranchName);
        System.out.println("gitIsCleanTag: " + gitIsCleanTag);
    }
}
I would like to distribute a jar of a library I created with all my dependencies bundled inside. However, I would like to avoid version conflicts between those dependencies and the adopting project.
I think the Maven Shade plugin can do this, but I could not find a way to do it with Scala / SBT. I found One-JAR; however, from my experiments it seems to work only for executables.
How could I achieve this?
Thanks!
You can do this with your own classloader.
The classloader:
Write a classloader which loads class files from a different classloader using a path rewrite.
For example, you could add a library prefix to the path when fetching the resource.
I have created a classloader using this technique:
https://github.com/espenbrekke/dependent/blob/master/src/main/java/no/dependent/hacks/PathRewritingClassLoader.java
It replaces the findClass method of URLClassLoader with one that adds a prefix.
protected Class<?> findClass(final String name) throws ClassNotFoundException {
    Class result;
    try {
        result = (Class) AccessController.doPrivileged(new PrivilegedExceptionAction() {
            public Class<?> run() throws ClassNotFoundException {
                // This is where the prefix is added:
                String path = PathRewritingClassLoader.this.prefix + name.replace('.', '/').concat(".class");
                Resource res = PathRewritingClassLoader.this._ucp.getResource(path, false);
                if (res != null) {
                    try {
                        return PathRewritingClassLoader.this._defineClass(name, res);
                    } catch (IOException var4) {
                        throw new ClassNotFoundException(name, var4);
                    }
                } else {
                    return null;
                }
            }
        }, this._acc);
    } catch (PrivilegedActionException var4) {
        throw (ClassNotFoundException) var4.getException();
    }
    if (result == null) {
        throw new ClassNotFoundException(name);
    } else {
        return result;
    }
}
We also have to rewrite resource loading:
@Override
public URL getResource(String name) {
    return super.getResource(prefix + name);
}
Here is how it is used:
_dependentClassLoader = new PathRewritingClassLoader("private", (URLClassLoader)DependentFactory.class.getClassLoader());
Class myImplementationClass=_dependentClassLoader.loadClass("my.hidden.Implementation");
Building your jar:
In your build, you place all the library and private classes under your selected prefix. In my Gradle build I have a simple loop collecting all the dependencies:
task packageImplementation {
    dependsOn cleanImplementationClasses
    doLast {
        def paths = project.configurations.runtime.asPath
        paths.split(':').each { dependencyJar ->
            println "unpacking " + dependencyJar
            ant.unzip(src: dependencyJar,
                    dest: "build/classes/main/private/",
                    overwrite: "true")
        }
    }
}
ProGuard can rename packages inside a jar and obfuscate code. It is a bit complicated, but you can achieve your goal with it. The sbt-proguard plugin is actively maintained.
Also, you can check the answers from a similar thread:
maven-shade like plugin for SBT
UPDATE:
Since version 0.14.0, the sbt-assembly plugin seems to have shading ability.
Have you tried the sbt-assembly plugin? It has a set of merge strategies in case of conflicts and a pretty nice getting-started guide.
So I am developing a Maven plugin where I need to modify the classloaders for it to work correctly. The problem is that I am not sure I am modifying the correct classloader. What I'm doing is the following:
@Mojo(name = "aggregate", requiresDependencyResolution = ResolutionScope.TEST)
public class AcceptanceTestMojo extends AbstractMojo {

    private static final String SYSTEM_CLASSLOADER_FIELD_NAME = "scl";

    @Parameter
    private String property;

    @Component
    public PluginDescriptor pluginDescriptor;

    @Component
    public MavenProject mavenProject;

    @Override
    public void execute() throws MojoExecutionException, MojoFailureException {

        ClassLoader newClassLoader = null;
        List<String> runtimeClassPathElements;
        try {
            runtimeClassPathElements = mavenProject.getTestClasspathElements();
        } catch (DependencyResolutionRequiredException e) {
            throw new MojoFailureException(MojoFailureMessages.UNRESOLVED_DEPENDENCIES_MESSAGE);
        }

        ClassRealm realm = pluginDescriptor.getClassRealm();
        ClassRealm modifiedRealm = new ClassRealm(realm.getWorld(), realm.getId(), realm.getParentClassLoader());
        try {
            for (String element : runtimeClassPathElements) {
                File elementFile = new File(element);
                modifiedRealm.addURL(elementFile.toURI().toURL());
            }
        } catch (MalformedURLException e) {
            throw new MojoFailureException(MojoFailureMessages.UNRESOLVED_CLASSES_MESSAGE);
        }
        pluginDescriptor.setClassRealm(modifiedRealm);

So I am getting the ClassRealm, making slight changes to the UCP (removing some jars), and after that setting the newly created ClassRealm on the plugin descriptor. I am also changing the ContextClassLoader and the SystemClassLoader, as the project I am executing my plugin on uses them for some interactions. These two work fine: they are changed and the plugin works correctly with them. The problem is the plugin classloader. For some reason, when executing my plugin on one project, it looks in the plugin ClassRealm to find the needed jars. But the code I put above is not fully correct, because by the time the execution looks into the plugin ClassRealm, it is not the modified one; it gets another reference, and I don't know where that comes from. What I think is that I am not setting the ClassRealm correctly, or I am missing something else.
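For reference, the context-classloader switch mentioned above can be sketched as a small helper inside the mojo (illustrative only — the method name is made up, and whether the plugin realm lookup picks it up is exactly what this question is about):
// Sketch: run a piece of work with the modified realm as the thread context
// classloader, restoring the original loader afterwards. ClassRealm extends
// URLClassLoader, so it can be used directly here.
private void runWithRealm(ClassRealm modifiedRealm, Runnable work) {
    ClassLoader original = Thread.currentThread().getContextClassLoader();
    Thread.currentThread().setContextClassLoader(modifiedRealm);
    try {
        work.run();
    } finally {
        Thread.currentThread().setContextClassLoader(original);
    }
}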
So I am getting the ClassRealm and I'am making slight changes to the UCP(removing some jars) and after that I set the newly created ClassRealm to the project descriptor. I am also changing the ContextClassLoader and the SystemClassLoader as the project I am executing my plugin on are using them for some interactions. These two are working fine- they are changed and the plugin is working fine with them. The problem is the plugin classloader. Because for some reason when executing my plugin on one project it is looking in the plugin ClassRealm and searching for the needed jars from there. But the code I put above is not fully correct, because when I come to the part where the execution of the plugin is looking in the plugin ClassRealm it is not the modified one- it gets another reference, which I don't know where it comes from. What I think is that I am not setting the ClassRealm correctly or I am missing something else.