I am new to Maven; we are converting an Ant-based project into a Maven project. Everything is working fine. Additionally, we need to compile the source code package by package.
To be more clear: we have three packages in the src/main/java folder, namely dao, svc and controller. I want to compile dao first, then compile svc with a reference to dao, and similarly compile controller with a reference to svc only, not dao.
Finally, the goal is to make sure that the controllers are not using any of the dao classes; they may use svc classes only. If this condition fails, the Maven build has to fail.
Please suggest.
It sounds like you need a multi-module Maven project. Create a parent project whose task is simply to aggregate your three modules and provide a single thing to build. Create one module for each of your packages, then define dependencies between those modules in the individual POM files.
The Maven build system is clever enough to know in which order to build the modules, based on the dependencies you declare between them. In cases where you don't define a dependency (e.g. between controller and dao), the controller module cannot access classes in the dao module.
The final source layout will be something like:
your-project
your-project/pom.xml <--- parent POM
your-project/dao
your-project/dao/pom.xml
your-project/dao/src/main/...
your-project/svc
your-project/svc/pom.xml
your-project/svc/src/main/...
your-project/controller
your-project/controller/pom.xml
your-project/controller/src/main/...
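A minimal sketch of the POMs (groupId and versions are assumed for illustration): the parent aggregates the modules, and svc declares its dependency on dao:

<!-- your-project/pom.xml -->
<project>
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>your-project</artifactId>
    <version>1.0</version>
    <packaging>pom</packaging>
    <modules>
        <module>dao</module>
        <module>svc</module>
        <module>controller</module>
    </modules>
</project>

<!-- your-project/svc/pom.xml -->
<project>
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>com.example</groupId>
        <artifactId>your-project</artifactId>
        <version>1.0</version>
    </parent>
    <artifactId>svc</artifactId>
    <dependencies>
        <dependency>
            <groupId>com.example</groupId>
            <artifactId>dao</artifactId>
            <version>1.0</version>
        </dependency>
    </dependencies>
</project>

The controller POM looks like svc's, but depends on svc instead of dao. Since controller never declares dao, any controller class that imports a dao class fails to compile, which fails the build - exactly the enforcement asked for.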
Judging by your requirements, I think you need to set up a Maven-based multi-module project.
We have a lot of ejb-artifacts that are split into client and impl artifacts like
a-client, a-impl, b-client, b-impl, c-client,...
If a needs to call b, we need to add a compile dependency a-impl -> b-client. When we run the ear, classes from b-impl are injected to actually do the work.
The problem:
To run an ear, we need to make sure that for every client, the corresponding impl artifact is present. When we build the artifact with Maven, this is not guaranteed. If I add a-impl to my pom, Maven adds b-client to the ear (it is a compile dependency), but it does not add b-impl (because there is no static connection). b-impl has to be added to the pom as a dependency explicitly.
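To illustrate with the artifacts above (groupId and versions assumed), the ear's pom ends up like this:

<!-- ear/pom.xml, dependencies section -->
<dependencies>
    <dependency>
        <groupId>com.example</groupId>
        <artifactId>a-impl</artifactId>
        <version>1.0</version>
    </dependency>
    <!-- b-client arrives transitively through a-impl, but nothing
         points at b-impl, so it has to be listed by hand: -->
    <dependency>
        <groupId>com.example</groupId>
        <artifactId>b-impl</artifactId>
        <version>1.0</version>
    </dependency>
</dependencies>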
This frequently leads to problems because of "forgotten" impl artifacts. Furthermore, there may be abandoned impl artifacts that will never be deleted from the pom. Possible solutions:
Add a runtime dependency from client to impl. This solves the problem for Maven, but ties the client to the impl. In "ejb with client artifact - runtime dependency?", people advised against it.
Use scripts to update and check the pom to make sure that every client has an impl.
Manually check the dependency:list before every build to make sure that every client has an impl.
I do not really like any of the possibilities, but the first seems to produce the least hassle. Is there a better way?
I would create unit tests that run the client - then you can see in the test phase of the build whether everything is working.
I have an Android Studio project with 2 modules: A and B. (I'm not including the annotation processor and annotations modules here.)
B depends on A.
B is an Android library module, and A is a plain Java library module. I also have an annotation processor running on module B.
The problem I'm facing is:
I want to generate some code based on annotated files placed in both modules, A and B. The problem comes from the way the annotation processor works: only with source files (*.java), not with compiled (*.class) ones. Unfortunately, during the compilation of B, the annotation processor doesn't have access to those source files from A...
The only thing I was able to think of as a kind of solution, ugly as it is, was to include the folder with the annotated classes from module A as a source set of module B. This way I give module B access to those files during compilation:
sourceSets {
    main {
        java {
            srcDirs = ['src/main/java', '../module_A/src/main/java/path/to/annotated/classes/folder']
        }
    }
}
That solves the problem - now the Annotation Processor has access to all the annotated classes from both modules, but...
Unfortunately, it introduces another issue... those annotated classes from module A are now compiled twice, and they are included both in module A's JAR file and in module B's AAR file.
Question 1: Is there another way to access those source files of module A from the annotation processor running on B? (From what I was able to find, the answer is NO, but just checking...)
Question 2: How can I exclude those compiled files (the repeated ones) from the AAR final package of module B?
Question 3: Maybe... that's an absolutely wrong approach? Any suggestions?
Thanks in advance!
Nope, you cannot achieve what you want using just the java.lang.model API. At least not without some additional tricks.
The issue is not binary-vs-source. Annotation processors can use Elements#getTypeElement to introspect compiled classes as well as source-defined classes:
Elements elementUtil = processingEnvironment.getElementUtils();
// Works for a precompiled class on the classpath...
TypeElement integerClass = elementUtil.getTypeElement("java.lang.Integer");
// ...as well as for a class in the current compilation
TypeElement myClass = elementUtil.getTypeElement("currently.compiled.Class");
But you still need to have the class on the compilation classpath to observe it, and the class must be in the process of being compiled to be visible to getElementsAnnotatedWith.
You can work around the latter limitation by using a tool like FastClasspathScanner: it will use its own mechanisms to find annotations in compiled bytecode and report them to you separately from the compilation process. But you cannot work around the classpath issue: if you don't have some dependency on the compilation classpath, it cannot be processed. So you have to compile the modules together, either by merging them into one (as you did) or by declaring one to depend on the other. In the latter case you might not be able to use getElementsAnnotatedWith, but getTypeElement and FastClasspathScanner will work.
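A rough sketch of the FastClasspathScanner route (assuming its 2.x API; the package prefix and annotation are placeholders, and elementUtil is the Elements instance from the snippet above):

import io.github.lukehutch.fastclasspathscanner.FastClasspathScanner;

// Find annotated classes in compiled bytecode on the classpath,
// independently of the current annotation-processing round
new FastClasspathScanner("com.example")
        .matchClassesWithAnnotation(MyAnnotation.class, matchedClass -> {
            // Hand the class back to the javax.lang.model API
            TypeElement element = elementUtil.getTypeElement(matchedClass.getName());
            // ... generate code from 'element' as usual
        })
        .scan();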
I have a strange problem that I cannot figure out, regarding libraries and dependencies.
I have an app called MyApp, which calls a class called MyLib1Class1.
MyLib1Class1 implements MyLib2Interface, and MyLib1Class1 calls MyLib2Class, passing the interface to it.
MyLib1... and MyLib2... are two separate library projects that are published to a local Maven repository.
MyLib1 declares a dependency on MyLib2 in its Gradle file.
MyApp declares a dependency on MyLib1.
The two libraries compile fine.
When I compile MyApp I get the error:
class file for MyLib2Interface not found
Note that the app only references MyLib1, and MyLib1 references MyLib2, which contains the interface.
If I add the dependency on MyLib2 to MyApp it works, but I don't want to have to do this. I want to be able to include the dependency on MyLib1, which automatically brings in MyLib2, without needing to reference it again in the app.
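For reference, the declarations described above read roughly like this (coordinates assumed). If the libraries use Gradle's implementation configuration, MyLib2's types are deliberately hidden from MyApp's compile classpath; declaring the dependency with api in MyLib1 (available with the java-library plugin) exposes them transitively:

// MyLib1/build.gradle
dependencies {
    // 'api' puts MyLib2 on the compile classpath of anything
    // that depends on MyLib1; 'implementation' would hide it
    api 'com.example:mylib2:1.0'
}

// MyApp/build.gradle
dependencies {
    implementation 'com.example:mylib1:1.0'
}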
I'm the author of one of the Maven plugins (not Apache/Codehaus, completely indie). Sometimes I get support requests or test cases where I'd really need to debug the execution of my plugin against an existing pom.xml. Basically, the test cases I get are sample/test projects (a pom.xml with src/main/resources, src/main/java and so on).
What I need is a way to:
Load an existing pom.xml.
Find a specific execution of my plugin there (usually it's the only one).
Get an instance of MyMojo - fully initialized/configured, with all the components and parameters correctly injected.
Execute MyMojo.
What's important is that test projects are separate projects, I don't want to copy them into the Maven module of my plugin.
I'd like to be able to do this without remote debugging.
By debugging I mean to be able to set and halt on breakpoints (also conditional), step in/out/over on the source code.
Ideally I'd like to be able to executeMyMojoFrom(new File("pom.xml")) - for instance in a JUnit test or a main method of some class. (I can supply groupId, artifactId etc. All other definitions should just be loaded from that pom.xml.)
How can I achieve this?
What I've tried so far:
Debug As... on pom.xml in Eclipse - does not work well enough (source code not found, breakpoints don't work as it's not a Java project context)
Maven Embedder/Invoker solutions - spawn things in separate processes via CLI. Forget breakpoints, no debugging.
Remote debugging with mvnDebug and then attaching from Eclipse, as suggested by Pascal Thivent here. This is so far the best option (see the commands just after this list). However, remote debugging means starting mvnDebug separately, and there's also no guarantee that the JARs I have in Eclipse are exactly the same ones that mvnDebug is using. So there's a certain distance here.
maven-plugin-testing-harness - I actually thought this would do the task. But first I was jumping through hoops for a few hours just to make it start: all of the important dependencies are "provided", so I first had to figure out the right combination of versions of those artifacts. And then - only to discover that AbstractMojoTestCase only works within the plugin module you want to test. Probably I was mistaken when I thought that maven-plugin-testing-harness was a testing harness for Maven plugins; it seems to be a testing harness for the plugin from that plugin's own module. Which is not illogical, but does not help my case - I'd like to test my plugin in other modules.
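For reference, the remote-debugging workflow from the third option is just:

mvnDebug clean install

which waits with "Listening for transport dt_socket at address: 8000" until a debugger attaches; in Eclipse that means a "Remote Java Application" debug configuration pointed at localhost:8000.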
So right now I've got the best results with the remote debugging solution. But what I'm looking for is really something like maven-plugin-testing-harness but not hardwired to the plugin module. Does anyone happen to have a hint, if such a method exists somewhere in Maven artifacts?
To be even more specific, I'd like to write something like:
public void testSomething()
    throws Exception
{
    File pom = getTestFile( "pom.xml" );
    assertNotNull( pom );
    assertTrue( pom.exists() );

    MyMojo myMojo = (MyMojo) lookupMojo( "myGroupId", "myArtifactid", ...,
                                         "myGoal", pom );
    assertNotNull( myMojo );
    myMojo.execute();
    ...
}
Compare it to the MyMojoTest here - it's almost there. It should just not be hardwired into the mymojo Maven module (as it is in maven-plugin-testing-harness).
Update
A few answers to the questions from the comments:
You mean you don't want such a test class, i.e. MyMojoTest, to reside inside the same project as MyMojo, i.e. your plugin project? Why is that?
Exactly. I want to debug the plugin execution in an existing Maven project; I don't want to move that project into my plugin project first to be able to run a test. I want to be able to test/debug an existing project. Ideally, I'd just need to add a my-maven-plugin-testing dependency and subclass MyMojoTest in the project's src/test/java. This would be a good instrument to debug executions. Dragging the target project into my Mojo project is just too much overhead - and mostly these aren't really the test cases I want to keep long-term. I hope this answers why.
Anyway, it's merely a convention to keep the project-to-test/pom.xml inside the src/test/resources of your plugin module, not a rule...
My problem is not the location of the pom.xml of the project-to-test; that is easily configurable. My difficulty is that maven-plugin-testing-harness is somehow hardcoded to be in the Mojo's project: it uses the pom.xml of the Mojo and looks for other special files/descriptors in the containing project. So I somehow can't use it in a non-Mojo project, or can I? This is my question.
And I'm not sure why Debug as... didn't help you...
Not sure either, but (1) breakpoints did not work and (2) the source code was not "attached" for some reason.
If Debug as... didn't work for you as well as it should, you can try to use mojo-executor, with a bit of work.
https://github.com/TimMoore/mojo-executor
This is how you would execute the copy-dependencies goal of the Maven Dependency Plugin programmatically:
executeMojo(
    plugin(
        groupId("org.apache.maven.plugins"),
        artifactId("maven-dependency-plugin"),
        version("2.0")
    ),
    goal("copy-dependencies"),
    configuration(
        element(name("outputDirectory"), "${project.build.directory}/foo")
    ),
    executionEnvironment(
        mavenProject,
        mavenSession,
        pluginManager
    )
);
The project, session, and pluginManager variables should be injected via the normal Mojo injection. Yes, that means this should be executed from the context of another Maven plugin. Now that I think about it, whether this would help you in any way is still a question, because this still relies on injection of those components by the underlying Plexus container.
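For completeness, a sketch of how those three variables are typically injected in the host Mojo (using maven-plugin-annotations; HostMojo is a hypothetical name, and the exact annotations depend on your Maven version):

import org.apache.maven.execution.MavenSession;
import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.BuildPluginManager;
import org.apache.maven.plugins.annotations.Component;
import org.apache.maven.plugins.annotations.Parameter;
import org.apache.maven.project.MavenProject;

public abstract class HostMojo extends AbstractMojo {

    // Injected by Maven when the host plugin runs
    @Parameter(defaultValue = "${project}", readonly = true)
    private MavenProject mavenProject;

    @Parameter(defaultValue = "${session}", readonly = true)
    private MavenSession mavenSession;

    @Component
    private BuildPluginManager pluginManager;
}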
My original idea, though, was to have you build a Maven plugin that would invoke your jaxb2 plugin through mojo-executor as above, then serialize the mavenProject, mavenSession and pluginManager, i.e. all the Plexus-injected components, and then use those objects to invoke your jaxb2 plugin in the future from a standalone class, without the plugin that you built.
Is it possible to define an extra property in project A and have it visible in project B? The root project obviously includes both.
I tried putting this in project A's build.gradle:
ext {
    myProps = 'something to say'
}
And this in project B's build.gradle:
task('X', dependsOn: [':A:someTask']) {
    println(project('A').myProps)
}
but I get:
FAILURE: Build failed with an exception.
...
* What went wrong:
A problem occurred evaluating project ':B'.
> Could not find property 'myProps' on project ':A'.
How can I achieve this?
An extra property is accessible from anywhere the owning object (A's Project object in this case) is accessible from. However, it isn't considered good style to reach out into the project model of a sibling project. One reason is that this can make it necessary to tweak the configuration order of projects, but there are others. Instead, it's better to either declare the extra property in a common parent project, or in a script plugin that gets applied to all projects that need to access the extra property.
PS: In the same vein, an explicit cross-project task dependency should be avoided whenever possible. Also note that your task tries to print the extra property in the configuration phase (rather than the execution phase), which may or may not be what you want.
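For example, declared once in the root build.gradle and read from B in the execution phase (a minimal sketch):

// root build.gradle
ext {
    myProps = 'something to say'
}

// B/build.gradle
task X {
    doLast {
        // execution phase, so the root project is fully configured
        println rootProject.myProps
    }
}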