I've just fought for a whole day with a strange maven problem:
I had a custom property called "deployment.name" that was never resolved to what I configured for it; instead, the Maven filtering mechanism always replaced it with the project's name.
I tried the goal "help:expressions" to find out whether this is a preconfigured property, but that goal only throws exceptions in m2eclipse. Google does not seem to know a pre-configured
property by that name.
The strangest bit: deployment.somethingelse works perfectly fine, so I ended up replacing ".name" with ".depname", and then it worked ;-)
The Maven Super POM defines the common configuration for all Maven projects. The values in it are accessible as properties, so that is where most of the properties you generally use come from (e.g. ${project.build.directory}); these are pretty much the same as the output of help:expressions.
There is no deployment section in the super POM. The only thing I can think of is that the property is being set somewhere else, e.g. in a profile, or overridden by a plugin (though that seems unlikely). You could try running mvn help:effective-pom to see if the property is being set by a profile.
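If a profile is the culprit, the offending definition in the POM (or in settings.xml) would look something like this hypothetical sketch (the profile id and value are placeholders; a property defined in an active profile overrides the one you set at project level):

<profiles>
  <profile>
    <id>some-profile</id>
    <properties>
      <!-- hypothetical: when this profile is active, this value wins over yours -->
      <deployment.name>some-other-value</deployment.name>
    </properties>
  </profile>
</profiles>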
Are you able to post your POM? That might help diagnose it.
... I just ran help:effective-pom, and there is no trace of "deployment.name" in the output.
I can see all the other properties that I defined though (e.g. "deployment.depname").
Maybe "name" is a reserved attribute of some sort? Maybe debugging into m2eclipse will shed light on this riddle.
What is the best way to find the right dependency for a class you use that is part of the online Maven repository?
As far as I can see, the approach is this:
Look up the import (e.g. org.whatever.X) from your code in the online Maven repository (search.maven.org).
Pick one from the result list and include it in the dependency section of the POM (a snippet showing this follows below).
Hope that the chosen version and artifact of the dependency match your requirements (compile time, runtime). If not, try another artifact or version.
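For illustration, a hypothetical entry in the dependency section (commons-io is just an example of what such a search might turn up; your coordinates and version will differ):

<dependency>
  <groupId>commons-io</groupId>
  <artifactId>commons-io</artifactId>
  <version>1.4</version>
</dependency>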
I'd like to share my way of doing it. What do you mean by "finding the ... for a used class that are part of the ..."? Do you mean that the dependency is already used somewhere else, or that you only know the package name that you may need?
I would first check which version I need for the current project.
If I'm working on a team project and someone has used the dependency somewhere else, I would check their POM (to ensure we are using the same dependency).
Then I would look up the dependency in Maven repo and include it in my pom.
Hope this helps.
Essentially, yes, this is what you have to do to obtain libraries/modules for your project.
Something that's helped me out with this specific problem: versioning. You can set the versions you need for each of your dependencies with <properties> -> <gson.version>2.8.1</gson.version> (for example). That way, you can guarantee that your build matches the requirements of the class or type of code you're trying to implement.
Maven doc ref: https://maven.apache.org/pom.html#Properties
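A minimal sketch of that pattern (Gson is only an example; the property name gson.version is just a convention):

<properties>
  <gson.version>2.8.1</gson.version>
</properties>

<dependencies>
  <dependency>
    <groupId>com.google.code.gson</groupId>
    <artifactId>gson</artifactId>
    <version>${gson.version}</version>
  </dependency>
</dependencies>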
I'm the author of one of the Maven plugins (not Apache/Codehaus, completely indie). Sometimes I get support requests or test cases where I'd really need to debug the execution of my plugin with an existing pom.xml. Basically the test cases I get are sample/test projects (pom.xml with src/main/resources, src/main/java and so on).
What I need is a way to:
Load an existing pom.xml.
Find a specific execution of my plugin there (usually it's the only one).
Get an instance of MyMojo - fully initialized/configured, with all the components and parameters correctly injected.
Execute MyMojo.
What's important is that test projects are separate projects, I don't want to copy them into the Maven module of my plugin.
I'd like to be able to do this without remote debugging.
By debugging I mean to be able to set and halt on breakpoints (also conditional), step in/out/over on the source code.
Ideally I'd like to be able to executeMyMojoFrom(new File("pom.xml")) - for instance in a JUnit test or a main method of some class. (I can supply groupId, artifactId etc. All other definitions should just be loaded from that pom.xml.)
How can I achieve this?
What I've tried so far:
Debug As... on pom.xml in Eclipse - does not work well enough (source code not found, breakpoints don't work as it's not a Java project context)
Maven Embedder/Invoker solutions - spawn things in separate processes via CLI. Forget breakpoints, no debugging.
Remote debugging with mvnDebug and then remote debugging from Eclipse as suggested by Pascal Thivent here. This is so far the best option. However, remote debugging means starting mvnDebug separately, and there's also no guarantee that the JARs I have in Eclipse are exactly the same ones that mvnDebug is using. So there's a certain distance here.
maven-plugin-testing-harness - I actually thought this would do the task. But first I was jumping through hoops for a few hours just to make it start. All of the important dependencies are "provided" so I first had to figure out the right combination of versions of these artifacts. And then - only to discover that AbstractMojoTestCase only works within the plugin module you want to test. Probably I was mistaken when I thought that maven-plugin-testing-harness was a testing harness for Maven plugins. It seems that it's a testing harness for the plugin from that plugin's own module. Which is not illogical but does not help my case. I'd like to test my plugin in other modules.
So right now I've got the best results with the remote debugging solution. But what I'm looking for is really something like maven-plugin-testing-harness but not hardwired to the plugin module. Does anyone happen to have a hint, if such a method exists somewhere in Maven artifacts?
To be even more specific, I'd like to write something like:
public void testSomething()
    throws Exception
{
    File pom = getTestFile( "pom.xml" );
    assertNotNull( pom );
    assertTrue( pom.exists() );
    MyMojo myMojo = (MyMojo) lookupMojo( "myGroupId", "myArtifactId", ...,
                                         "myGoal", pom );
    assertNotNull( myMojo );
    myMojo.execute();
    ...
}
Compare it to the MyMojoTest here - it's almost there. Should just not be hardwired into the mymojo Maven module (as it is in maven-plugin-testing-harness).
Update
Few answers to the questions in comments:
You mean you don't want such a test class, i.e MyMojoTest to reside inside the same project as the MyMojo, i.e your plugin project? Why is that?
Exactly. I want to debug the plugin execution in an existing Maven project, I don't want to move that project into my plugin project first to be able to run a test. I want to be able to test/debug an existing project. Ideally, I'd just need to add a my-maven-plugin-testing dependency and subclass MyMojoTest in the project's src/test/java. This would be a good instrument to debug executions. Dragging the target project into my Mojo project is just too much overhead - and mostly these aren't really the test cases I want to keep long-term. I hope this answers why.
Anyway, it's merely a convention to keep the project-to-test/pom.xml inside the src/test/resources of your plugin module, not a rule...
My problem is not the location of the pom.xml of the project-to-test, that is easily configurable. My difficulty is that maven-plugin-testing-harness is somehow hardcoded to be in the Mojo's project. It uses the pom.xml of the Mojo and looks for other special files/descriptors in the containing project. So I somehow can't use it in a non-Mojo project, or can I? This is my question.
And I'm not sure why Debug as... didn't help you...
Not sure either, but (1) breakpoints did not work and (2) the source code was not "attached" for some reason.
If the Debug as didn't work for you as well as it should, you can try to use the mojo-executor with a bit of work.
https://github.com/TimMoore/mojo-executor
This is how you would execute the copy-dependencies goal of the Maven Dependency Plugin programmatically:
// all of the helpers below are static imports from org.twdata.maven.mojoexecutor.MojoExecutor
executeMojo(
    plugin(
        groupId("org.apache.maven.plugins"),
        artifactId("maven-dependency-plugin"),
        version("2.0")
    ),
    goal("copy-dependencies"),
    configuration(
        element(name("outputDirectory"), "${project.build.directory}/foo")
    ),
    executionEnvironment(
        mavenProject,
        mavenSession,
        pluginManager
    )
);
The project, session, and pluginManager variables should be injected via the normal Mojo injection. Yes, that means this should be executed from the context of another maven plugin. Now that I think about it, whether this would help you in any way is still a question because this still relies on injection of such components by the underlying plexus container.
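For illustration, a minimal sketch of what such a wrapper Mojo could look like (the class name and goal are hypothetical; the annotations are the standard maven-plugin-annotations ones, and the injected pluginManager is a BuildPluginManager in newer mojo-executor versions):

import org.apache.maven.execution.MavenSession;
import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.BuildPluginManager;
import org.apache.maven.plugin.MojoExecutionException;
import org.apache.maven.plugins.annotations.Component;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;
import org.apache.maven.project.MavenProject;

// Hypothetical wrapper Mojo: gets the components mojo-executor needs injected,
// then delegates to executeMojo(...) exactly as in the snippet above.
@Mojo(name = "debug-wrapper")
public class DebugWrapperMojo extends AbstractMojo {

    @Parameter(defaultValue = "${project}", readonly = true)
    private MavenProject mavenProject;

    @Parameter(defaultValue = "${session}", readonly = true)
    private MavenSession mavenSession;

    @Component
    private BuildPluginManager pluginManager;

    public void execute() throws MojoExecutionException {
        // call MojoExecutor.executeMojo(...) here, passing mavenProject,
        // mavenSession and pluginManager via executionEnvironment(...)
    }
}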
My original idea, though, was to have you build a Maven plugin that would invoke your jaxb2 plugin through the mojo-executor like above, then serialize the mavenProject, mavenSession, pluginManager, i.e. all the Plexus-injected components, and then use those objects to invoke your jaxb2 plugin in the future from a standalone class, without the plugin that you built.
I'm getting:
NoSuchMethodError: com.foo.SomeService.doSmth()Z
Am I understanding correctly that this 'Z' means that return type of doSmth() method is boolean? If true, then that kind of method really does not exist because this method returns some Collection. But on the other hand if I call this method, I'm not assigning its return value to any variable. I just call this method like this:
service.doSmth();
Any ideas why this error occurs? All necessary JAR files exist and all other methods from this class seem to exist.
Looks like the method exists on the classpath at compile time, but not at run time.
I don't think the return type is the problem. If it were, it wouldn't compile; the compiler would already complain if two methods differed only by return type.
Normally, this error is caught by the compiler; this error can only occur at run time if the definition of a class has incompatibly changed.
In short - a class/jar file at runtime is not the same that you used at compile time.
This is probably a difference between your compile-time classpath and your run-time classpath.
Here is what seems to be going on:
The code is compiled with a class path that defines the doSmth() method returning a boolean. The byte-code refers to the doSmth()Z method.
At runtime, the doSmth()Z method isn't found. A method returning a Collection is found instead.
To correct this problem, check your (compile time) classpath.
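If you want to see what the JVM actually finds at run time, a small diagnostic sketch like this can help (com.foo.SomeService is the class from the question; everything else is plain reflection):

import java.lang.reflect.Method;

public class NoSuchMethodDiagnostic {
    public static void main(String[] args) throws Exception {
        Class<?> cls = Class.forName("com.foo.SomeService");

        // Where was the class actually loaded from? (may be null for bootstrap classes)
        System.out.println(cls.getProtectionDomain().getCodeSource());

        // Which doSmth() signatures does the runtime version really declare?
        for (Method m : cls.getMethods()) {
            if (m.getName().equals("doSmth")) {
                System.out.println(m);
            }
        }
    }
}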
The current replies just tell you why it is failing. Usually it is even nicer to know how to fix the problem. As mentioned, the problem usually is that you built your program but when running or exporting it, the library is not included. So the solution is...
If you are running, check the run configuration:
Select Run tab -> Run configurations -> Select the configuration you are running -> Check the Classpath tab -> Ensure the libraries you need are there
If you are exporting (for example a war file), follow this
Select project -> Select properties -> Select Deployment Assembly -> Press Add -> Select Java Build Path Entries -> Select the libraries you want to be included in your exported file (for example a war file)
In both cases, ensure the library you are referencing is included.
Other frequent causes of this error are wrong parameter types or visibility, but then the compiler will detect the error before running. In that case, just check the documentation to match the function and package visibility, and ensure that the library is found in the Java Build Path in your project properties.
Maybe this can still help somebody: this exception can also happen when you have two classes with the exact same fully qualified name on the classpath, in different jar files, but without the same public methods.
For example:
On file mylibrary1.jar you have class com.mypackage.mysubpackage.MyClass with method doSmth()
On file mylibrary2.jar you have class com.mypackage.mysubpackage.MyClass without method doSmth()
When searching for the class, the classloader may find mylibrary2.jar first, depending on classpath precedence, but then can't find the method on that class.
Be sure you don't have the same package + class on two different files.
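One quick way to check for that situation is to ask the classloader for every copy of the class on the classpath (the class name below is the one from the example above):

import java.net.URL;
import java.util.Enumeration;

public class DuplicateClassCheck {
    public static void main(String[] args) throws Exception {
        // Lists every jar/directory that provides the class; more than one hit
        // means the classloader's pick depends on classpath order.
        Enumeration<URL> copies = Thread.currentThread().getContextClassLoader()
                .getResources("com/mypackage/mysubpackage/MyClass.class");
        while (copies.hasMoreElements()) {
            System.out.println(copies.nextElement());
        }
    }
}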
I noticed this problem occurring while testing some experimental changes in multiple linked projects, after updating them from SVN in Eclipse.
Specifically, I updated all projects from SVN, and reverted the .classpath file rather than edit it manually to keep things simple.
Then I re-added the linked projects to the path, but forgot to remove the related jars. This was how the problem occurred for me.
So apparently the run time used the jar file while the compiler used the project files.
Another way this can happen and is difficult to find:
If the signature of a method in an external jar changes in a way that produces no error in the IDE (because it's still compatible with how you call it), the class might not be re-compiled.
If your build checks the files for changes and only then recompiles them, the class might not be recompiled during the build process.
So when you run it, this might lead to that problem. Although you have the new jar, your own code still expects the old one but never complains.
To make it harder, whether such cases can be handled depends on the JVM. So in the worst case it runs on the test server but not on the live machine.
I am using Maven with the one-jar plugin, but when I run the one-jar executable, I'm greeted with a wall of warnings, which is unacceptable for use.
I've looked at every available resource on one-jar and see no instruction on how to keep the jar from spewing out tons of warnings when run; has anyone solved this?
JarClassLoader: Warning: META-INF/LICENSE.txt in lib/commons-io-1.4.jar is hidden by lib/commons-collections-3.2.1.jar (with different bytecode)
JarClassLoader: Warning: META-INF/NOTICE.txt in lib/commons-io-1.4.jar is hidden by lib/commons-collections-3.2.1.jar (with different bytecode)
JarClassLoader: Warning: META-INF/LICENSE.txt in lib/commons-lang-2.4.jar is hidden by lib/commons-collections-3.2.1.jar (with different bytecode)
JarClassLoader: Warning: META-INF/NOTICE.txt in lib/commons-lang-2.4.jar is hidden by lib/commons-collections-3.2.1.jar (with different bytecode)
I found that if you create a one-jar.properties file and put it in the root of your runtime classpath (i.e., where your project .class files end up), it will be read by the one-jar Boot class. An entry in this properties file such as:
one-jar.silent=true
will suppress the one-jar log messages altogether.
Other values that the Boot class looks for are one-jar.info and one-jar.verbose.
The default level is INFO. As Pascal Thivent indicated above, you can also set a System property via the command line with the -D parameter of the java command, but if you do not want to have to stipulate or remember this, the properties file approach works great.
It seems that these messages are printed when running in "verbose" mode. What I don't get is that the verbose mode doesn't seem to be activated by default.
Anyway, could you try to set the one-jar.verbose system property to false when running your one-jar:
java -Done-jar.verbose=false -jar <one-jar.jar>
Regarding the latest-and-greatest One-Jar v0.97: The problem is there. The 'one-jar.properties' file actually needs to be put into the root of the final jar. It will, of course, have one line that reads, one-jar.silent=true. This can be done in Ant by setting something like <fileset dir="${build.dir}" includes="**/*.properties" /> inside the <one-jar ...> task.
It can also, just as easily, be placed into the command line using the java -Done-jar.silent=true -jar foo-jar-made-by-one-jar.jar command.
Nevertheless, it will still report a single line that it's loading properties from the One-Jar internal Boot class before going quiet. There is no way to get around this without changing source code starting at line 317 in Boot.java where the method initializeProperties logs the loading/merging operations. See Bug ID 3609329 at SourceForge in the One-Jar bug tracker where I provided the quick fix.
Summary: By adding the one-jar.properties file all but one line of extraneous logging is removed. This should help Maven users find a workaround.
This is much better in the new version of the Maven one-jar plugin.
Add the plugin repository:
<pluginRepository>
<id>one-jar</id>
<url>http://onejar-maven-plugin.googlecode.com/svn/mavenrepo</url>
</pluginRepository>
and use version 1.4.4 in the plugin definition.
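A minimal plugin definition would then look roughly like this (the coordinates below are the ones I believe the googlecode plugin uses - org.dstovall:onejar-maven-plugin - so verify them against the repository above):

<plugin>
  <groupId>org.dstovall</groupId>
  <artifactId>onejar-maven-plugin</artifactId>
  <version>1.4.4</version>
</plugin>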
I found I needed to use version 1.4.5 (1.4.4 did not work) and then the suggestion to place a one-jar.properties file at the root of my jar file with a single line reading one-jar.silent=true worked for me.
I upgraded from 1.4.3 to 1.4.4 as someone suggested above and that did the trick.
There are two places to get the one-jar plugin from.
https://github.com/jolira/onejar-maven-plugin
http://code.google.com/p/onejar-maven-plugin/
The 1st one claims to be just a copy of the 2nd one that's served from Maven's main repository. I was encouraged to use this one as it doesn't require specifying an additional plugin repository that the 2nd one requires. However, when I switched to using the 2nd one (the official one), this problem went away for me.
Note - passing -Done-jar.verbose=false worked for me, but not when set in the one-jar.properties file as someone stated above.
I submitted a patch for this quite some time ago that merely makes the default behavior silent.
public static final int LOGLEVEL_VERBOSE = 5;
// Loglevel for all loggers.
- private static int loglevel = LOGLEVEL_INFO;
+ private static int loglevel = LOGLEVEL_NONE;
private final String prefix;
AFAIK, it never got applied. Recently I fixed another issue, so I put my fixes out here:
https://github.com/nsoft/uno-jar
Please Re-read the "as is, no warranty" part of the license several times :)
There is no way to do this without modifying the source code.
I would like to be able to determine what versions I am running of a dependency at runtime as well as the version of the web application itself.
Each web application I deploy is packaged with a pom.xml which I can read from, that part is trivial. The next part is parsing the pom without much effort.
As the web application is running, I want to be able to understand what version I am running, and what versions my dependencies are.
Ideally, I would like to do something like:
MavenPom pom = new MavenPom(webApplicationPomInputStream);
pom.getVersion();
pom.getArtifactId();
pom.getGroupId();
for (Dependency dependency : pom.getDependencies())
{
    dependency.getVersion();
    dependency.getArtifactId();
    dependency.getGroupId();
}
Should I just use XPath notation here, or is there a library I can call to do this type of thing?
After these posts, I am thinking the quickest/most reliable way is to generate a text file with the dependency tree in it: mvn dependency:tree. Then I will parse the text file, separate the groupId, artifactId, and version, and then determine the structure by the indentation level.
If I do that, can I export to XML instead of text? I can then use JAXB and easily parse that file without doing any/much work.
It is a hack, but looks promising.
Walter
I will just use mvn dependency:tree to generate a text file with the dependency tree. Then I will parse that in and create the dependency tree/graph from that. I will get the scope of the artifact, groupId, artifactId, version, and its parent.
I successfully implemented this type of lookup, it simply takes the dependency output, parses it and organizes dependencies simply using the indentation, nothing fancy. The artifact, group, version, and scope are easily parsed since the separator is a :.
Walter
Maven does of course have such an API. Have a look at org.apache.maven.project.MavenProject. But, to be honest, I don't think it will be that easy to create a MavenProject instance. The source code will be helpful here; check for example MavenProjectTest or maybe the Maven Plugin API (actually, this task would be much, really much, simpler to achieve from a Mojo) for some guidance.
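If a fully resolved MavenProject turns out to be overkill, the plain maven-model reader might already be close to the pseudo-code above. A minimal sketch, with the caveat that it reads the raw pom.xml as-is, so versions inherited from a parent or set via properties will not be resolved:

import java.io.FileReader;
import org.apache.maven.model.Dependency;
import org.apache.maven.model.Model;
import org.apache.maven.model.io.xpp3.MavenXpp3Reader;

public class PomInfo {
    public static void main(String[] args) throws Exception {
        // Parses the raw pom.xml; no inheritance or property interpolation happens here.
        Model model = new MavenXpp3Reader().read(new FileReader("pom.xml"));
        System.out.println(model.getGroupId() + ":" + model.getArtifactId()
                + ":" + model.getVersion());
        for (Dependency dependency : model.getDependencies()) {
            System.out.println("  " + dependency.getGroupId() + ":"
                    + dependency.getArtifactId() + ":" + dependency.getVersion());
        }
    }
}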
I'd suggest to search for or ask this question on the Maven Mailing Lists, org.apache.maven.dev would be appropriate here IMHO.