Gradle: getting properties from another project

Is it possible to define an extra property in project A and have it visible in project B? The root project obviously includes both.
I tried putting this in project A's build.gradle:
ext {
    myProps = 'something to say'
}
And this in project B's build.gradle:
task('X', dependsOn: [':A:someTask']) {
    println(project('A').myProps)
}
but I get:
FAILURE: Build failed with an exception.
...
* What went wrong:
A problem occurred evaluating project ':B'.
> Could not find property 'myProps' on project ':A'.
How can I achieve this?

An extra property is accessible from anywhere its owning object (A's Project object in this case) is accessible. However, it isn't considered good style to reach into the project model of a sibling project. One reason is that this can make it necessary to tweak the configuration order of projects, but there are others. Instead, it's better to either declare the extra property in a common parent project, or in a script plugin that gets applied to all projects that need to access the extra property.
PS: In the same vein, an explicit cross-project task dependency should be avoided whenever possible. Also note that your task tries to print the extra property in the configuration phase (rather than the execution phase), which may or may not be what you want.
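For illustration, here is a minimal sketch of the first suggestion, assuming the property is declared once in the root project's build.gradle (subprojects can read extra properties declared on a parent project):
// root project's build.gradle
ext {
    myProps = 'something to say'
}
Project B can then read the inherited property directly, and do so in the execution phase:
// B's build.gradle
task('X', dependsOn: [':A:someTask']) {
    doLast {
        // runs in the execution phase; myProps is inherited from the root project
        println myProps
    }
}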

Related

PsiClass to java.lang.Class

I'm developing a plugin for IntelliJ IDEA. How can the plugin get the name and version of the libraries that are imported into the project being checked by the plugin? I have a PsiClass from the project, but cannot convert it to java.lang.Class. Maybe there's a way to get a ClassLoader from a PsiElement?
super.visitImportStatement(psiImport);
Class importedClass = Class.forName(psiImport.getQualifiedName(), true, psiImport.getClass().getClassLoader());
psiImport.getClass().getClassLoader() returns the ClassLoader of the class PsiImportStatementImpl instead of the ClassLoader of the class that I've imported.
IntelliJ does mostly static analysis on your code. In fact, the IDE and the projects you run/debug have completely different classpaths. When you open a project, your dependencies are not added to the IDE classpath. Instead, the IDE will index the JARs, meaning it will automatically discover all the declarations (classes, methods, interfaces etc) and save them for later in a cache.
When you write code in your editor, the static analysis tool will leverage the contents of this index to validate your code and show errors when you're trying to use unknown definitions for example.
On the other hand, when you run a Main class from your project, it will spawn a new java process that has its own classpath. This classpath will likely contain every dependency declared in your module.
Knowing this, you should now understand why you can't "transform" a PsiClass to a corresponding Class.
Back to your original question:
How can a plugin get the name and version of the libraries that are imported into the project being checked by the plugin?
You don't need to access Class objects for this. Instead, you can use IntelliJ SDK libraries. Here's an example:
Module mod = ModuleUtil.findModuleForFile(virtualFile, myProject);
ModuleRootManager.getInstance(mod).orderEntries().forEachLibrary(library -> {
    // do your thing here with `library`
    return true;
});
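If you also need the name and version, one hedged possibility: for libraries imported from Maven or Gradle, Library.getName() usually embeds the coordinates, though the exact format depends on how the library was imported, so treat this sketch as an assumption to verify:
ModuleRootManager.getInstance(mod).orderEntries().forEachLibrary(library -> {
    // getName() may return null for anonymous libraries; for imported ones it
    // typically looks like "Gradle: com.google.guava:guava:28.1-jre"
    String nameAndCoordinates = library.getName();
    System.out.println(nameAndCoordinates);
    return true; // true = keep iterating over the remaining libraries
});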

how to reuse pom elements across modules

So we have 6 child POM modules, 4 of which share common build elements with a few differences that would be parameterized.
Can we move these common things into a util POM and reuse them across those modules?
I do not want to move them to the parent, since they do not apply to all the child modules.
Please suggest if there is a way to do this, or alternatives. Thanks
Not possible.
You cannot import parts of POMs. This would be desirable in some cases (I know that) but we probably need to wait for Maven 5.0.0 for that.
What I would recommend:
Define all your plugins and executions in the parent POM.
Use or define properties to skip them if necessary.
Override the properties in the modules.
This means
If your plugin has a property myplugin.skip that skips the execution, set it to true in the parent POM and set it to false in all modules that need the execution.
If your plugin has a configuration parameter skip, but no property, add something like <skip>${myplugin.skip}</skip> to the configuration and then use the property myplugin.skip.
Do the same if you have more than one execution of a plugin and you need separate skip parameters.
If a plugin does not have a skip parameter at all (luckily, most plugins have one), you can help yourself by using <phase>${myplugin.phase}</phase> and put either none or the correct phase in there.
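Putting these points together, a minimal sketch of the pattern (the plugin coordinates and property name are placeholders, assuming a plugin whose execution supports a skip configuration parameter):
<!-- parent POM: declare the execution once, skipped by default -->
<properties>
    <myplugin.skip>true</myplugin.skip>
</properties>
<build>
    <plugins>
        <plugin>
            <groupId>org.example</groupId><!-- placeholder coordinates -->
            <artifactId>my-maven-plugin</artifactId>
            <executions>
                <execution>
                    <id>shared-execution</id>
                    <configuration>
                        <!-- wire the plugin's skip parameter to the property -->
                        <skip>${myplugin.skip}</skip>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
Each of the 4 modules that actually needs the execution then overrides the property:
<!-- child POM of a module that needs the execution -->
<properties>
    <myplugin.skip>false</myplugin.skip>
</properties>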

Gradle is unable to find zip artifact in composite build if java plugin is applied

I have a Gradle project which creates a zip artifact. I define the artifact via artifacts.add('default', zipTask). I add this project to another project via includeBuild and use the zip as a dependency (dependencies { myConfiguration 'org.example:testA:+#zip' }).
So far so good. It works.
The problem starts when I add the plugin java to the first project. For some reason it prevents Gradle from finding the zip artifact.
The error is:
Execution failed for task ':doubleZipTask'.
> Could not resolve all files for configuration ':myConfiguration'.
> Could not find testA.zip (project :testA).
Why? How to fix it?
Complete example:
Project testA
settings.gradle:
rootProject.name = 'testA'
build.gradle:
plugins {
    id 'base'
    // Uncomment the line below to break the zip artifact
    //id 'java'
}
group = 'org.test'
version = '0.0.0.1_test'
task zipTask(type: Zip) {
    from './settings.gradle' // just so the zip isn't empty
}
artifacts.add('default', zipTask)
Project testB
settings.gradle:
rootProject.name = 'testB'
// This line may be commented out in some cases and then the artifact should be downloaded from Maven repository.
// For this question it should be always uncommented, though.
includeBuild('../testA')
build.gradle:
plugins {
    id 'base'
}
configurations {
    myConfiguration
}
dependencies {
    myConfiguration 'org.test:testA:0.0.0.+#zip'
}
task doubleZipTask(type: Zip) {
    from configurations.myConfiguration
}
Update 1
I've added some diagnostic code at the end of the build.gradle:
configurations.default.allArtifacts.each {
    println it.toString() + ' -> name: ' + it.getName() + ', extension: ' + it.getExtension()
}
and in the version with java plugin it prints:
ArchivePublishArtifact_Decorated testA:zip:zip: -> name: testA, extension: zip
org.gradle.api.internal.artifacts.dsl.LazyPublishArtifact#2c6aaa5 -> name: testA, extension: jar
However, I'm not sure if an additional artifact can break something.
It doesn't seem to be a problem when I add a second artifact myself.
Update 2
Maybe the zip file isn't the best representation of my intentions. After all, I could build java related files in one project and zip them in another.
However, the problem also applies to war files. (The War plugin applies the Java plugin internally, so it cannot be used without it.)
The issue seems to be a bug in Gradle where composite builds break references to artifacts.
Some discussion is here: https://discuss.gradle.org/t/composite-build-cant-use-included-artifact-in-buildsrc-build-gradle/24978
Bug report: https://github.com/gradle/gradle/issues/3768
A workaround would be to move the artifact dependency to a task dependency:
plugins {
    id 'base'
}
configurations {
    myConfiguration
}
dependencies {
}
task doubleZipTask(type: Zip) {
    dependsOn gradle.includedBuild('testA').task(':zipTask')
    from configurations.myConfiguration
}
The following setup should work with Gradle 5.6 (when using another attribute it will likely also work with previous versions). It mostly corresponds to your original setup with the exception of the changes indicated by XXX.
Project testA
settings.gradle:
rootProject.name = 'testA'
build.gradle:
plugins {
    id 'base'
    // Uncomment the line below to break the zip artifact
    //id 'java'
}
group = 'org.test'
version = '0.0.0.1_test'
task zipTask(type: Zip) {
    from './settings.gradle' // just so the zip isn't empty
}
// XXX added an attribute to the configuration
configurations.default.attributes {
    attribute(LibraryElements.LIBRARY_ELEMENTS_ATTRIBUTE,
            project.objects.named(LibraryElements, 'my-zipped-lib'))
}
artifacts.add('default', zipTask)
Project testB
settings.gradle:
rootProject.name = 'testB'
// This line may be commented out in some cases and then the artifact should be downloaded from Maven repository.
// For this question it should be always uncommented, though.
includeBuild('../testA')
build.gradle:
plugins {
    id 'base'
}
configurations {
    // XXX added the same attribute as in the testA project
    myConfiguration {
        attributes {
            attribute(LibraryElements.LIBRARY_ELEMENTS_ATTRIBUTE,
                    project.objects.named(LibraryElements, 'my-zipped-lib'))
        }
    }
}
dependencies {
    myConfiguration 'org.test:testA:0.0.0.+#zip'
}
task doubleZipTask(type: Zip) {
    from configurations.myConfiguration
}
I have tested this setup both with and without the java plugin. I’ve also tested publishing to a Maven repository and letting testB take its dependency from there instead of from the included build of testA. Adding an additional dependency from testB on the JAR artifact of testA (myConfiguration 'org.test:testA:0.0.0.+#jar') worked, too.
Some explanation on what (I believe) is going on: Gradle needs a way of determining automatically which local component/artifact in testA it can use for substituting the external dependency of testB.
Without applying the java plugin, there is only a single component/artifact and my guess is that Gradle then just selects that single one without further ado.
As you saw, by applying the java plugin, another artifact is added to testA. Now which one should Gradle take? One would expect that it would look at the file extension specified on the dependency in testB but that doesn’t seem to be the case. It seems that Gradle isn’t actually working at the artifacts level when substituting dependencies but rather at the module/component level. You might say we only have one component with two artifacts, so selecting the one component should be straightforward. But it seems that we actually have two variants of the same component and Gradle wants to select one of these variants. In your own testB setup, there is no clue given that would tell Gradle which variant to select; so it fails (with an admittedly bad/misleading error message). In my changed testB setup I provide the clue: I tell Gradle that we want the variant that has a specific attribute with the value my-zipped-lib. Since I’ve added the same attribute on the published configuration of testA, Gradle is now able to select the right variant (because there is only one that has the required attribute). The file extension on the dependency is still relevant in a second step: once Gradle has selected the component variant, it still needs to select the right artifact – but only then.
Note that we’re actually working on the rim of what is supported with composite builds today. See also Gradle issue #2529 which states that “projects that publish non-jar artifacts” were not terribly well supported. When I first saw your question I honestly thought we’d be out of luck here … but it seems there’s a way to again step a little closer to the rim ;-)
In the comments, the question came up why adding multiple custom artifacts works while applying the java plugin breaks the build. As I’ve tried to explain above, it’s not a question of multiple artifacts but of multiple component variants. IIUIC, these variants originate from different attributes on configurations. When you don’t add such attributes, then you will not have different components variants. However, the Java plugin does add such attributes which consequently leads to different component variants in your project(/component). If you’re interested, you can see the different attributes by adding something like the following to your build.gradle:
configurations.each { conf ->
    println "Attributes of $conf:"
    conf.attributes.keySet().each { attr ->
        println "\t$attr -> ${conf.attributes.getAttribute(attr)}"
    }
}
Now where and when to add which attributes? That depends on your setup. I wouldn’t blindly add attributes to all configurations in the hope that that’ll solve things magically. While it might work (depending on your setup), it’d certainly not be clean. If your project setup is as complex and/or special as your question suggests, then it’ll probably make sense to think more deeply about which configurations you need and what attributes they should carry. The first sections of the Gradle documentation page that I have linked above are probably a good starting point if you are not intricately familiar with these nuances of configurations in Gradle, yet. And yes, I agree that such logic would best live in a Gradle plugin.
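For illustration only, a rough sketch of such a script plugin, under the assumption that both builds share one file (the file name and the helper closure are made up; the attribute value is taken from the example above):
// zipped-lib.gradle -- hypothetical script plugin, applied with: apply from: '../zipped-lib.gradle'
// Exposes a helper that marks exactly one, deliberately chosen configuration.
ext.markAsZippedLib = { Configuration conf ->
    conf.attributes {
        attribute(LibraryElements.LIBRARY_ELEMENTS_ATTRIBUTE,
                project.objects.named(LibraryElements, 'my-zipped-lib'))
    }
}
testA could then call markAsZippedLib(configurations.default) and testB markAsZippedLib(configurations.myConfiguration), keeping the attribute logic in one place.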

Context.getCompilerVariable() returns null for compiler variable, defined in merged project

I am evaluating Install4j (version 7.0.8) and exploring its features: merged projects and custom code in a separate JAR.
My tryout code base consists of 3 entities -
Project Main.install4j
Project SubMain.install4j - merged in project 'Main'.
customcode.jar - Added as a resource in project 'Main'.
A compiler variable 'CV_Var1' is defined in Project 'SubMain' and is accessed in a function, defined in customcode.jar, as follows:
String strTemp = InstContext.getCompilerVariable("CV_Var1");
Upon executing the code, 'strTemp' is found to be null.
As per my observation, if the variable 'CV_Var1' is defined in 'Main' instead of 'SubMain', the value is retrieved successfully.
How can I access a compiler variable defined in a merged project from custom code (placed in a JAR)?
Kindly help to resolve the issue.
As of 7.0.8, compiler variables from merged projects are indeed not available at runtime. This will be fixed in 7.0.9. Please contact support@ej-technologies.com to get a build where this is already implemented.

How to execute a specific plugin/Mojo from a pom.xml programmatically?

I'm the author of one of the Maven plugins (not Apache/Codehaus, completely indie). Sometimes I get support requests or test cases where I'd really need to debug the execution of my plugin with an existing pom.xml. Basically the test cases I get are sample/test projects (a pom.xml with src/main/resources, src/main/java and so on).
What I need is a way to:
Load an existing pom.xml.
Find a specific execution of my plugin there (usually it's the only one).
Get an instance of MyMojo - fully initialized/configured, with all the components and parameters correctly injected.
Execute MyMojo.
What's important is that test projects are separate projects, I don't want to copy them into the Maven module of my plugin.
I'd like to be able to do this without remote debugging.
By debugging I mean to be able to set and halt on breakpoints (also conditional), step in/out/over on the source code.
Ideally I'd like to be able to executeMyMojoFrom(new File("pom.xml")) - for instance in a JUnit test or a main method of some class. (I can supply groupId, artifactId etc. All other definitions should just be loaded from that pom.xml.)
How can I achieve this?
What I've tried so far:
Debug As... on pom.xml in Eclipse - does not work well enough (source code not found, breakpoints don't work as it's not a Java project context)
Maven Embedder/Invoker solutions - spawn things in separate processes via CLI. Forget breakpoints, no debugging.
Remote debugging with mvnDebug and then remote debugging from Eclipse as suggested by Pascal Thivent here. This is so far the best option. However, remote debugging means starting mvnDebug separately, and there's also no guarantee that the JARs I have in Eclipse are exactly the same ones that mvnDebug is using. So there's a certain distance here.
maven-plugin-testing-harness - I actually thought this would do the task. But first I was jumping through hoops for a few hours just to make it start. All of the important dependencies are "provided" so I first had to figure out the right combination of versions of these artifacts. And then - only to discover that AbstractMojoTestCase only works within the plugin module you want to test. Probably I was mistaken when I thought that maven-plugin-testing-harness was a testing harness for Maven plugins. It seems that it's a testing harness for the plugin from that plugin's own module. Which is not illogical but does not help my case. I'd like to test my plugin in other modules.
So right now I've got the best results with the remote debugging solution. But what I'm looking for is really something like maven-plugin-testing-harness but not hardwired to the plugin module. Does anyone happen to have a hint, if such a method exists somewhere in Maven artifacts?
To be even more specific, I'd like to write something like:
public void testSomething()
    throws Exception
{
    File pom = getTestFile( "pom.xml" );
    assertNotNull( pom );
    assertTrue( pom.exists() );
    MyMojo myMojo = (MyMojo) lookupMojo( "myGroupId", "myArtifactId", ...,
                                         "myGoal", pom );
    assertNotNull( myMojo );
    myMojo.execute();
    ...
}
Compare it to the MyMojoTest here - it's almost there. It should just not be hardwired into the mymojo Maven module (as it is in maven-plugin-testing-harness).
Update
Few answers to the questions in comments:
You mean you don't want such a test class, i.e. MyMojoTest, to reside inside the same project as MyMojo, i.e. your plugin project? Why is that?
Exactly. I want to debug the plugin execution in an existing Maven project, I don't want to move that project into my plugin project first to be able to run a test. I want to be able to test/debug an existing project. Ideally, I'd just need to add a my-maven-plugin-testing dependency and subclass MyMojoTest in the project's src/test/java. This would be a good instrument to debug executions. Dragging the target project into my Mojo project is just too much overhead - and mostly these aren't really the test cases I want to keep long-term. I hope this answers why.
Anyway, it's merely a convention to keep the project-to-test/pom.xml inside the src/test/resources of your plugin module, not a rule...
My problem is not the location of the pom.xml of the project-to-test, that is easily configurable. My difficulty is that maven-plugin-testing-harness is somehow hardcoded to be in the Mojo's project. It uses the pom.xml of the Mojo and looks for other special files/descriptors in the containing project. So I somehow can't use it in a non-Mojo project, or can I? This is my question.
And I'm not sure why Debug as... didn't help you...
Not sure either, but (1) breakpoints did not work and (2) the source code was not "attached" for some reason.
If Debug As... didn't work for you as well as it should, you can try to use mojo-executor with a bit of work.
https://github.com/TimMoore/mojo-executor
This is how you would execute the copy-dependencies goal of the Maven Dependency Plugin programmatically:
// requires the static imports from org.twdata.maven.mojoexecutor.MojoExecutor.*
executeMojo(
    plugin(
        groupId("org.apache.maven.plugins"),
        artifactId("maven-dependency-plugin"),
        version("2.0")
    ),
    goal("copy-dependencies"),
    configuration(
        element(name("outputDirectory"), "${project.build.directory}/foo")
    ),
    executionEnvironment(
        mavenProject,
        mavenSession,
        pluginManager
    )
);
The project, session, and pluginManager variables should be injected via the normal Mojo injection. Yes, that means this should be executed from the context of another maven plugin. Now that I think about it, whether this would help you in any way is still a question because this still relies on injection of such components by the underlying plexus container.
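For reference, a minimal sketch of such an enclosing Mojo (class name and goal name are made up), using the standard maven-plugin-annotations to obtain the three objects:
import org.apache.maven.execution.MavenSession;
import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.BuildPluginManager;
import org.apache.maven.plugin.MojoExecutionException;
import org.apache.maven.plugins.annotations.Component;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;
import org.apache.maven.project.MavenProject;

@Mojo(name = "invoke") // hypothetical goal name
public class InvokerMojo extends AbstractMojo {

    @Parameter(defaultValue = "${project}", readonly = true, required = true)
    private MavenProject mavenProject;

    @Parameter(defaultValue = "${session}", readonly = true, required = true)
    private MavenSession mavenSession;

    @Component
    private BuildPluginManager pluginManager;

    @Override
    public void execute() throws MojoExecutionException {
        // call executeMojo(...) from here, as shown above
    }
}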
My original idea, though, was to have you build a Maven plugin that would invoke your jaxb2 plugin through the mojo-executor as above, then serialize the mavenProject, mavenSession, and pluginManager, i.e., all the Plexus-injected components, and then use those objects to invoke your jaxb2 plugin in the future from a standalone class, without the plugin that you built.
