Maven Webservice Plugin working with install, not deploy

I have a Maven module that uses the jaxws-maven-plugin. The webservice is up and running, and when browsing to .../myWebservice?wsdl I get the WSDL. No problem.
Running the wsimport Maven goal also works, configured through:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>jaxws-maven-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <goal>wsimport</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <sourceDestDir>src/main/java</sourceDestDir>
    <wsdlUrls>
      <wsdlUrl>http://host/f/soap/fWeb?wsdl</wsdlUrl>
    </wsdlUrls>
  </configuration>
</plugin>
When running
mvn clean install
All is fine... However, it doesn't work when I run
mvn clean deploy
Looking at the logged parameters, they are identical in both cases:
[INFO] jaxws:wsimport args: [-s, D:\works2\f-service\src\main\java, -d, D:\works2\f-
service\target\classes, -Xnocompile, http://host/f/soap/fWeb?wsdl]
parsing WSDL...
After a really long wait, which feels like an HTTP timeout, it fails with this message:
[ERROR] Unexpected end of file from server
Failed to read the WSDL document: http://host/f/soap/fWeb?wsdl, because
1) could not find the document;
2) the document could not be read;
3) the root element of the document is not wsdl:definitions.
ERROR failed.noservice=Could not find wsdl:service in the provided WSDL(s):
At least one WSDL with at least one service definition needs to be provided.
Failed to parse the WSDL.
It's a bit confusing, since it takes so long... In fact, the wsdl prefix is not used in the WSDL (the root element is <definitions>, not <wsdl:definitions>), but then why does it work with mvn clean install...?
Thanks!
Raoul

Actually, I have no idea why mvn clean install and mvn clean deploy don't produce a consistent result. First, wsimport is bound to the generate-sources phase and is executed much earlier in both cases. Second, the deploy phase, which occurs right after install, doesn't do much more, as documented:
done in an integration or release environment, copies the final package to the remote repository for sharing with other developers and projects.
So, really, I don't see how copying an artifact to a repository can impact wsimport or make the build fail on something related to the WSDL. Very, very weird. Maybe run mvn -X clean deploy to see if you can get more information.
Anyway, I have a few suggestions about the jaxws-maven-plugin configuration.
The first one would be to not generate sources in src/main/java. IMO, generated sources should go under the target directory, since you want to be able to delete them during a clean. So I'd suggest using the default value, which is ${project.build.directory}/jaxws/wsimport/java, or something like ${project.build.directory}/generated-sources/jaxws (this is the standard Maven pattern for generated code); see the sketch below. But this is a side note, it won't solve your issue :)
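For illustration, a minimal sketch of that change, keeping the rest of the plugin block from the question as-is (the exact directory name is a convention, not something the plugin requires):
<configuration>
  <!-- generate into target/ so that "mvn clean" removes the generated stubs -->
  <sourceDestDir>${project.build.directory}/generated-sources/jaxws</sourceDestDir>
  <wsdlUrls>
    <wsdlUrl>http://host/f/soap/fWeb?wsdl</wsdlUrl>
  </wsdlUrls>
</configuration>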
The second suggestion is about the <wsdlUrls> configuration. Instead of using <wsdlUrls>, why don't you fetch the WSDL as a file and put it in src/wsdl (or another location, in which case you'll have to point the plugin at it with the <wsdlDirectory> element)? A sketch follows. This should help to work around the timeout issue.
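A minimal sketch of that alternative, assuming the WSDL has been saved as src/wsdl/fWeb.wsdl (the file name is made up, and the wsdlDirectory/wsdlFiles parameters are the plugin options as I recall them, so double-check them against your plugin version):
<configuration>
  <!-- read the WSDL from the project instead of over HTTP at build time -->
  <wsdlDirectory>${basedir}/src/wsdl</wsdlDirectory>
  <wsdlFiles>
    <wsdlFile>fWeb.wsdl</wsdlFile>
  </wsdlFiles>
</configuration>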

Related

How can I run a program compiled by Maven?

I am starting to learn Maven by reading https://spring.io/guides/gs/maven/.
In the examples, after running mvn compile successfully, how can I run the program via Maven? This part seems to be missing from the article.
Thanks.
You can invoke a Java program (i.e. one with a public static void main(String[] args) method) with the classpath of the combined dependencies of the current pom.xml using
mvn -q exec:java
You need to configure the main class to invoke in your pom.xml similar to
<build>
  <plugins>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>exec-maven-plugin</artifactId>
      <version>1.6.0</version>
      <executions>
        <execution>
          <goals>
            <goal>java</goal>
          </goals>
        </execution>
      </executions>
      <configuration>
        <mainClass>demo.Main</mainClass>
      </configuration>
    </plugin>
  </plugins>
</build>
This is useful for testing and development, but not for deployment.
See http://www.mojohaus.org/exec-maven-plugin/usage.html for full details.
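As a usage note (not part of the original answer), the main class can also be supplied on the command line instead of in the pom, via the plugin's exec.mainClass property; demo.Main is just the example class from above:
mvn -q compile exec:java -Dexec.mainClass=demo.Main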
The Maven build process is organized into lifecycles, each made up of a sequence of phases. compile is one of those phases, but most likely running the following would resolve your issue:
mvn clean package
This would generate a JAR file under the target folder. You can then try running this JAR file using java, as sketched below.
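For example, a rough sketch with placeholder names (the artifact name, version and main class below are made up, not taken from the guide); the first form works only if the JAR's manifest declares a Main-Class, the second names the class explicitly:
java -jar target/my-app-1.0-SNAPSHOT.jar
java -cp target/my-app-1.0-SNAPSHOT.jar com.example.App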
Generally, Maven is not used for running code. It is a build tool that you can use for compiling, running unit or integration tests, deploying your code locally and remotely, etc.
It is based around the idea of a build lifecycle, which in turn is defined by a list of build phases. For example, the default lifecycle has the following phases:
validate - validate the project is correct and all necessary information is available
compile - compile the source code of the project
test - test the compiled source code using a suitable unit testing framework. These tests should not require the code be packaged or deployed
package - take the compiled code and package it in its distributable format, such as a JAR.
verify - run any checks on results of integration tests to ensure quality criteria are met
install - install the package into the local repository, for use as a dependency in other projects locally
deploy - done in the build environment, copies the final package to the remote repository for sharing with other developers and projects.
For more information you can refer to this.
UPDATE:
Having said that, it is possible, as mentioned in Thorbjørn Ravn Andersen's answer above.

Maven + Clover + Jenkins - How to get Coverage report and non-instrumented artifact in one command

I am trying to run the Maven Clover plugin to generate a report as well as a NON-instrumented artifact.
<plugin>
  <groupId>com.atlassian.maven.plugins</groupId>
  <artifactId>maven-clover2-plugin</artifactId>
  <version>3.1.3</version>
  <configuration>
    <generatePdf>true</generatePdf>
    <generateHtml>true</generateHtml>
    <licenseLocation>clover.license</licenseLocation>
    <!-- the contextFilters element has to be specified within the reporting section and will not work if you specify it in the build section. -->
    <!-- contextFilters>try,static,catch</contextFilters -->
  </configuration>
</plugin>
mvn clean clover2:instrument clover2:clover install
If I run the above then, according to the Clover docs, the instrument goal runs in a separate lifecycle and does not affect the default build lifecycle. That is indeed what happens, but the problem is that I want to skip the tests during the default build lifecycle.
I tried the following, but it skipped the tests in both lifecycles.
mvn clean clover2:instrument clover2:clover install -DskipTests
If the above worked, I could simply set it up on Jenkins without creating multiple jobs for multiple Maven commands.
It is probably not the best idea to do everything in a single cryptic Maven command (in the same way that it is not the best idea to put all your code in one procedure). Why not split the command into several steps, or even into several jobs that trigger one another? A sketch of such a split follows. Moreover, from a CI point of view, different kinds of jobs have different priorities when it comes to failing fast. I do understand that this is not exactly an answer.
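Purely as an illustration of such a split, reusing the commands from the question (this is an assumption about the intent, not something prescribed by Clover): run the instrumented build with tests for the coverage report in one step or job, then build and install the non-instrumented artifact with the tests skipped in another:
# step/job 1: instrumented build, runs the tests and produces the Clover report
mvn clean clover2:instrument clover2:clover
# step/job 2: non-instrumented artifact, tests were already run above
mvn install -DskipTests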

Why during the install phase of maven cycle, package phase is called as well?

I have in my pom.xml a section
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-install-plugin</artifactId>
  <configuration>
    <pomFile>./lib/pom.xml</pomFile>
    <file>./lib/wls-maven-plugin.jar</file>
  </configuration>
  <executions>
    <execution>
      <phase>install</phase>
      <goals>
        <goal>install-file</goal>
      </goals>
    </execution>
  </executions>
</plugin>
where I want to install the WebLogic plugin into my local repository. Note that I indicated I want this to be done in the install phase. I then want to use this plugin, but in the package and deploy phases. However, when I try to run mvn install, the package phase is invoked as well and I get an error because my WebLogic plugin is not installed yet. So why is this happening? I want my plugin to be installed first and then used. Sorry for my poor English.
The install phase tells Maven to install the artifact(s) produced by the project or module in question. Maven has to package them into a JAR or another suitable artifact in order to have anything to install.
You really shouldn't be manually twiddling plugins like this. Instead, you should declare a proper Maven dependency on that WebLogic plugin, if it's actually even necessary; a possible alternative is sketched below.
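As one possible alternative (my suggestion, not part of the original answer): install the JAR into the local repository once, outside the project build, and then reference the plugin normally; the paths below are the ones from the question:
mvn install:install-file -Dfile=./lib/wls-maven-plugin.jar -DpomFile=./lib/pom.xml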
The Maven build lifecycle is composed of a sequence of phases. When you execute a certain phase, all phases prior to it in the lifecycle will be called in order until the phase you invoked. The default build lifecycle starts with the validate phase and ends with the deploy phase, and in between, the package phase comes prior to install.
More information can be obtained here.
The package phase comes just before install in Maven's lifecycle, so if you ask Maven to invoke a particular phase, it invokes all the phases up to and including that phase by default.
Skipping the package phase effectively means not running the plugins that are bound to it (usually the jar plugin). You can find all of these plugins with mvn help:effective-pom, and then create another build profile that skips their execution; a sketch of such a profile follows.
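For illustration, a sketch of what such a profile might look like, assuming Maven 3 and the default-jar execution id used by the default lifecycle bindings (binding an execution to the phase none disables it); adapt the plugin list to whatever mvn help:effective-pom shows for your build:
<profiles>
  <profile>
    <id>no-package</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-jar-plugin</artifactId>
          <executions>
            <execution>
              <!-- disable the default jar execution within this profile -->
              <id>default-jar</id>
              <phase>none</phase>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>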
I found this https://maven.apache.org/guides/introduction/introduction-to-the-lifecycle.html#usual-command-line-calls
You should select the phase that matches your outcome. If you want your jar, run package. If you want to run the unit tests, run test.
If you are uncertain what you want, the preferred phase to call is
mvn verify
This command executes each default lifecycle phase in order (validate, compile, package, etc.) before executing verify. You only need to call the last build phase to be executed, in this case verify. In most cases the effect is the same as package. However, if there are integration tests, these will be executed as well. And during the verify phase some additional checks can be done, e.g. whether your code is written according to the predefined Checkstyle rules.

How can I share clover coverage data between maven and IntelliJ

I have a multi-module maven project. I'm using intellij-idea as my IDE.
I have Maven configured with the clover plugin to automatically instrument on build.
How can I get IntelliJ to recognize those changes and refresh its coverage data? (NOTE: having to click the "Refresh Coverage" toolbar button is fine.)
I've tried configuring maven-clover2-plugin like so:
<plugin>
  <groupId>com.atlassian.maven.plugins</groupId>
  <artifactId>maven-clover2-plugin</artifactId>
  <version>3.2.2</version>
  <configuration>
    <baseDir>${project.basedir}</baseDir>
    <cloverMergeDatabase>
      ${project.basedir}.clover\cloverMerge.db
    </cloverMergeDatabase>
  </configuration>
  <executions>
    <execution>
      <id>main</id>
      <phase>package</phase>
      <goals>
        <goal>instrument</goal>
        <goal>aggregate</goal>
        <goal>check</goal>
      </goals>
    </execution>
    <execution>
      <id>site</id>
      <phase>pre-site</phase>
      <goals>
        <goal>instrument</goal>
        <goal>aggregate</goal>
        <goal>check</goal>
      </goals>
    </execution>
    <execution>
      <id>clean</id>
      <phase>clean</phase>
      <goals>
        <goal>clean</goal>
      </goals>
    </execution>
  </executions>
</plugin>
I then configured my project settings to use .clover\cloverMerge.db and checked the "Relative to project directory" checkbox.
But that didn't work.
NOTE:
At the bottom of Configuring Instrumentation it says
Do not set these locations explicitly if you have a multi-module project.
So I also tried leaving the location as the default for both Maven and IDEA and that didn't work either.
Also, the Clover for IDEA installation guide's Known Issues section says:
If you are using the Maven build tool, you should avoid using the same IntelliJ output directory as Maven does. As Maven uses the target/classes and target/test-classes directories, avoid specifying these ones. The clover.db location for IntelliJ should also be distinct from that used by Maven.
WHY should they be distinct? Is there some file corruption issue? If they're kept distinct, then HOW can I get awesome coverage highlighting etc. without having to repeat builds in a completely separate process?
Well, I finally figured out an answer. I'm leaving this here for posterity.
The solution is complicated and somewhat of a hack, but it WORKS.
1. Update the parent project's pom.xml file (see the pom excerpt after these steps):
Clover DB: <cloverDatabase>${project.basedir}.clover\clover.db</cloverDatabase>
Merge Clover DB: <cloverMergeDatabase>${project.basedir}.clover\cloverMerge.db</cloverMergeDatabase>
2. Create a run configuration for your unit tests in IntelliJ IDEA, and set up a "Before launch" Run Maven Goal of:
clean clover2:setup prepare-package -DskipTests
3. Create a Maven run configuration:
Make the unit-test run configuration a "Before launch" condition.
In the command line, have Maven run clover2:aggregate.
4. Update the IntelliJ project settings for Clover to point to the merge file:
Make sure the "Relative to project directory" checkbox is checked.
Set the InitString to "User specified", with the same value as in your pom file; in my case: .clover\cloverMerge.db
Once the command is run, just click the Refresh Coverage icon to see and work with the coverage data in IDEA.
If the tests fail, you will also have the nice IntelliJ test runner tab to figure out why.
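For reference, a sketch of how the two settings from step 1 might sit inside the existing maven-clover2-plugin block; the paths are copied verbatim from the steps above, so adjust them to your own layout:
<plugin>
  <groupId>com.atlassian.maven.plugins</groupId>
  <artifactId>maven-clover2-plugin</artifactId>
  <version>3.2.2</version>
  <configuration>
    <baseDir>${project.basedir}</baseDir>
    <!-- per-module Clover database -->
    <cloverDatabase>${project.basedir}.clover\clover.db</cloverDatabase>
    <!-- merged database that the IDEA Clover plugin is pointed at -->
    <cloverMergeDatabase>${project.basedir}.clover\cloverMerge.db</cloverMergeDatabase>
  </configuration>
</plugin>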
At the bottom of Configuring Instrumentation it says
Do not set these locations explicitly if you have a multi-module project.
The documentation actually says: Do not set these locations explicitly (using an absolute path) if you have a multi-module project. The reason is simple: if you use an absolute path, you will not have a separate clover.db for every module, but only a single clover.db file.
"If you are using the Maven build tool, you should avoid using the same IntelliJ output directory as Maven does. As Maven uses the target/classes and target/test-classes directories, avoid specifying these ones" [...] WHY should they be distinct? Is there some file corruption issue?
The problem is as follows: IntelliJ IDEA uses its own engine to compile sources. It does not have to call the original project's build system (Maven, for instance) to compile them.
This means that:
- if you have a Maven-based project and it has the Clover-for-Maven plugin installed and
- at the same time you have the Clover-for-IDEA installed in the IntelliJ IDE
- and these two Clover integrations use the same output folders for classes and databases
... then these two Clover integrations may start overwriting their files.
In most cases this is not desired behaviour, because any source code modification or project rebuild in IDEA will trigger recompilation, which can delete results obtained previously by Clover-for-Maven.

How to get maven yuicompressor and tomcat plugins to play nice

I'd like to compress all of my JavaScript files and aggregate them using YUICompressor, and I saw that there is a Maven plugin that lets me do this. I got it working for the most part.
I am also using the Mojo Tomcat plugin. When I run the tomcat:run goal, Tomcat does not read from the target output directory (which is where the YUI Compressor put my JavaScript files); rather, it reads from the actual source files in my src/main/webapp/scripts directory. Of course, the aggregated JavaScript file (all.js) is not there.
I have a few questions:
How can I get the Tomcat plugin to read the target output folder that the YUI Compressor plugin created?
Do I have to run the YUI Compressor Maven goal every time I want to update my JavaScript files during development?
Is there a better way to achieve this? Essentially, my end goal is to be able to develop and test my JavaScript source files in development mode, but to compress and aggregate the files and use the all.js script when the application is running in production mode.
While the Rails people have certainly figured this out, this seems to be a non-trivial thing to do with Maven and Spring.
I would appreciate any and all assistance on how I can get this running correctly. Thanks!
I was just investigating this very problem and found my answer by looking at the plugin documentation.
mvn tomcat:run - Runs the current project as a dynamic web application using an embedded Tomcat server.
What this means in practice is that the package execution phase has not been reached when the embedded tomcat runs.
The answer is to instead use:
mvn tomcat:run-war - Runs the current project as a packaged web application using an embedded Tomcat server.
This allows the maven build to get as far as packaging the WAR file and therefore allows the yuicompressor-maven-plugin to do what it needs to before the embedded tomcat starts up.
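In practice the change is just the goal name; for example (adding clean is optional):
mvn clean tomcat:run-war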
As for having to run it every time, you should attach the run of the yui plugin to the "generate-sources" execution phase.
Add the following to your plugin (the important part is the "phase" element to attach it to the lifecycle):
<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>yuicompressor-maven-plugin</artifactId>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <goals>
        <goal>compress</goal>
      </goals>
    </execution>
  </executions>
  <configuration>...</configuration>
</plugin>
This way the plugin will run in every build, during the generate-sources phase. So any time you change the JavaScript files you have configured the plugin for, the output .js file will be updated as soon as you run something like:
mvn compile
mvn test
mvn install
mvn package
and so forth.
The above does cause the minified (and possibly aggregated) files to be created earlier in the lifecycle but tomcat:run cannot find them!
