This question came up because our main war file is about 40 MB (and it is not the only war file in the project), while all the remaining jar files together are about 20 MB, so every release takes roughly three times as much space as it would if we did not deploy the wars.
So is there an option to deploy only the resources rather than whole war files, so that the deployment team could build the war from Nexus? If I skip deployment for the module that builds the war file, the deployment team will not be able to build it without access to the source code.
Is it common practice to deploy war files to a company's local repository?
There are two camps. One camp says: deploy (actually upload) every artifact to Nexus. This way the process for every artifact is the same and everybody knows where to find it.
The other camp says: use Nexus only as a dependency repository. If your war file is the end product and it isn't a dependency for another project, such as an ear file, then there's no reason to upload it to Nexus. In that case you could upload it to a share or to an SCM location and distribute it from there.
For every change, push to Nexus as a snapshot.
With continuous integration and parallel development, updating Nexus with the latest development changes is a never-ending process, irrespective of artifact size.
It is best practice to run the deploy goal so that the updated artifacts are available to the teams working on them: the deployment team, the testing team, and other development teams can then start consuming the change.
In general I would suggest just uploading the war to Nexus and having your operations team fetch it from Nexus for deployment. After all, Nexus is already the server running for managing your binaries. Why would you split things up and use different storage for your build outputs just because they are a war file rather than a jar or something else? That just makes things more complicated for no good reason.
Setting this property in the module's pom.xml fixes the problem:

    <maven.deploy.skip>true</maven.deploy.skip>
The maven deploy plugin has a skip property: http://maven.apache.org/plugins/maven-deploy-plugin/deploy-mojo.html
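If you prefer to configure it on the plugin rather than via the property, a minimal sketch of the equivalent plugin-level configuration in the war module's pom.xml might look like this (only the skip parameter comes from the plugin documentation; the surrounding structure is standard boilerplate):

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-deploy-plugin</artifactId>
                <configuration>
                    <!-- skip uploading this module's artifact during mvn deploy -->
                    <skip>true</skip>
                </configuration>
            </plugin>
        </plugins>
    </build>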
We are implementing Maven + Jenkins and we are trying some artifact repositories like Artifactory. We currently deploy with svn (yes, it's awful), and we get a lot of requirements from our clients that should be done instantly, like "please add this button to my js form"; for now it is enough to commit the JS file and update it on the client's deployed Tomcat.
I need to know whether there is a way in Maven to install on a server just an incremental part of our deployable files, because the complete war file is about 600 MB and it is not viable to download the whole war file for a small change of two lines of JavaScript.
We can now create a new artifact on each commit, but we cannot work out whether it is possible to achieve continuous delivery that allows us to send an instant fix to a client server.
All the examples we have seen refer to deployable artifacts, but there is no case in which someone runs something like mvn install -mySpecificVersion and only the changed files are downloaded, or anything similar.
Thanks.
After months of thinking about solutions, we figured out that the answer is modules in our project, so that the dynamic, instant-fix requirements are confined to the core or to a specific small module. This approach gives us a reproducible pipeline and a cheap transfer of data.
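As a rough illustration of that idea (the module names here are hypothetical), the parent POM just declares the split, so the small fast-moving module can be rebuilt and shipped on its own:

    <project xmlns="http://maven.apache.org/POM/4.0.0">
        <modelVersion>4.0.0</modelVersion>
        <groupId>com.example</groupId>
        <artifactId>parent</artifactId>
        <version>1.0-SNAPSHOT</version>
        <packaging>pom</packaging>
        <modules>
            <!-- core: the large, rarely changing part -->
            <module>core</module>
            <!-- ui-fixes: the small module carrying the frequent JS/resource changes -->
            <module>ui-fixes</module>
            <!-- webapp: the WAR, which depends on the modules above -->
            <module>webapp</module>
        </modules>
    </project>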
I have a big war file, oversized due to lots of external dependencies, and I also have internet connection speed issues, so I don't want to keep the dependency jars in my war; that way I can reduce the war size and upload my updated wars faster from my dev machine to the remote server.
Instead, I would like the Maven project to download the dependencies on the remote Tomcat server itself once the war has been uploaded there and starts running. How do I configure Maven to do that?
There is a pretty simple solution: Build the project on the server.
An easy way to do this is to put all the sources into a version control system like Mercurial or Git.
In addition to giving you a history and an automated backup, DVCS have insanely efficient algorithms to update remote copies (they just transfer the changes, so if you change a single line, only one line is sent over the wire).
Building on your server also means that you get the very fast download of dependencies on the server (which has probably very good download rates). And local deployment will be very, very fast.
Last but not least: When you use version control, you will be able to go back to the last stable version quickly when something goes wrong.
As Aarom says, you should build the project on the server directly.
There are two requirements:
You need to have a command line access on the remote server.
Maven must be installed on the remote server.
Then you can upload the sources of your project to the remote server (without the dependencies).
Go into the root directory of your project and run your build command (mvn package, or whatever custom build command you use).
That's it: you have the .war on the remote server, loaded with all the dependencies; you can then remove the source files.
@user01:
1. Install all desired third-party jars into Tomcat's lib folder.
2. Set the scope of those dependencies to "provided" in your Maven pom.xml (see the sketch after this list).
3. Install Maven on your remote server.
4. Install a CI server such as Jenkins, Continuum, Bamboo, Hudson, CruiseControl, etc. I'd suggest Jenkins.
5. Hopefully, you are using revision control software such as SVN, Git, Mercurial, Bazaar, or CVS. If not, then I'd suggest setting up Git or SVN for your source code repository.
6. Configure the scm tag in your pom.xml to point to your project's location within your source code repository (see the sketch after this list).
7. Configure your CI server to get your pom.xml from your source code repository. Your CI server will read the scm tag and the URLs you've configured within it, check your project out, and then build your project.
8. You can either have Jenkins deploy your built war artifact to Tomcat via the Jenkins Deploy Plugin, or you can use a Maven plugin such as the tomcat7-maven-plugin or Cargo.
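As a rough sketch of steps 2 and 6 (the repository URLs are hypothetical, and the dependency is only an example of a jar you might move into Tomcat's lib folder):

    <!-- pom.xml fragment -->
    <scm>
        <connection>scm:git:https://scm.example.com/myproject.git</connection>
        <developerConnection>scm:git:ssh://git@scm.example.com/myproject.git</developerConnection>
        <url>https://scm.example.com/myproject</url>
    </scm>
    <dependencies>
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-lang3</artifactId>
            <version>3.12.0</version>
            <!-- provided: the jar lives in Tomcat's lib folder, so it is kept out of the WAR -->
            <scope>provided</scope>
        </dependency>
    </dependencies>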
I have two Maven projects which I need to deploy through an automated deployment process (like a nightly build or similar).
The scenario is as follows:
mv-proj1
-dependency-1.jar
-dependency-2.jar
-dependency-3.jar
mv-proj2
-dependency-3.jar
-dependency-4.jar
-mv-proj1.jar
The sources of mv-proj1 and mv-proj2 cannot be disclosed.
mv-proj2 is an executable jar and provides services to other application modules.
So what is the standard way of deploying these to a production machine, or let's say a UAT machine?
Do I need to set up an intra-organization Maven repository?
Do I need to install a Maven repository on the UAT machine?
One possible approach I can think of is to set up and host an intra-organization Maven repository, configure Maven on the UAT machine to fetch artifacts from that repository, and deploy only the pom.xml.
I would let the choice depend on who the consumers of your artifacts are.
If the consumers are also Maven projects that can pull in your JARs from said intra-organizational Maven repository, that's definitely a great way to go. I believe that every organization that is serious about using Maven will sooner or later have use cases for such a repository of its own anyway. I've worked with Artifactory and Nexus and feel that both are great products (and free as in beer for the use case stated here). They're both easy to install, and setting one up should not be daunting; go for it!
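For a consumer build, a minimal sketch of pulling from such an internal repository is a repositories entry in its pom.xml or settings.xml (the id and URL are placeholders for whatever your Artifactory or Nexus instance exposes):

    <!-- consumer pom.xml: also resolve dependencies from the internal repository -->
    <repositories>
        <repository>
            <id>internal-releases</id>
            <url>https://nexus.example.com/repository/releases</url>
        </repository>
    </repositories>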
If your UAT machines use Maven to build and install anything that pulls in your artifacts as a dependency, they would be running Maven client-side. There would be a local repository (artifact cache) on these clients, but that's a different beast from the organizational repository mentioned above, which you would likely not deploy on the actual UAT machines.
If mv-proj2 is rather a "final delivery", executable as you say, you may want to pack it all up as a nice, single JAR (Maven can do that for you) and distribute that to your users. You could again do that through an organizational repository, or you could ultimately release it to some network drive or web server. There are many ways to do so, e.g. use the maven-jar-plugin with outputDirectory pointing to wherever you want to release.
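For instance, a minimal sketch of that last option, with an illustrative release path:

    <!-- write the built jar straight to a shared release location instead of target/ -->
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-jar-plugin</artifactId>
        <configuration>
            <outputDirectory>/mnt/releases/mv-proj2</outputDirectory>
        </configuration>
    </plugin>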
For a new project which uses Maven, I would like to add a distributionManagement configuration to the pom.xml which will connect the project with the SourceForge.net file upload system.
I have found this information (from 2007); is it still valid, or do you know of more up-to-date resources?
http://docs.codehaus.org/display/MAVENUSER/MavenAndSourceforge
Related question: How can I deploy artifacts from a Maven build to the SourceForge File Release System?
This looks correct. However, note that it only describes deploying the site artifacts, not the project artifacts (JAR and POM). And while it's possible that you could use Maven to deploy your artifacts, I'm not sure that you'd want to -- the Maven directory structure is different from the SourceForge structure (of one directory per release).
If you're looking to deploy your project releases to Maven Central, read this: http://maven.apache.org/guides/mini/guide-central-repository-upload.html
The process has changed in the last year or so. At one time you could request that your project be added to a nightly rsync job, but apparently now you have to deploy directly to a recognized repository. Given the number of times that rsync job would fail, it's no wonder they decided to change the process ...
I'm confused about the use of Maven in development and production environments - I'm sure it's something simple that I'm missing. Grateful for any help.
I set up maven inside eclipse on my local machine and wrote some software. I really like how it's made things like including dependent jars very easy.
So that's my development environment. But now I want to release the project to production on a remote server. I've searched the documentation, but I can't figure out how it's supposed to work or what the Maven best practice is. Are you supposed to:
a) Also be running maven on your production environment, and upload all your files to your production environment and rebuild your project there? (Something in me baulks at the idea of rebuilding 'released' code on the production server, so I'm fairly sure this isn't right..)
b) Use mvn package to create your jar file and then copy that up to production? (But then what of all those nice dependencies? Isn't there a danger that your tested code is now going to be running against different versions of the dependent jars in the production environment, possibly breaking your code? Or missing a jar?)
c) Something else that I'm not figuring out..
Thanks in advance for any help!
You're supposed to have your code under version control (and you never "upload" files to another machine, you "download" them from the Version Control System if required).
You're supposed to package your code in a format (a WAR, an EAR, another kind of bundle) that can be deployed on the production environment for execution. Such bundles typically include the dependencies. To build more complex bundles, the Maven Assembly Plugin can help.
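As a sketch of one such bundle, the Assembly Plugin's predefined jar-with-dependencies descriptor packs the project and all its dependencies into a single executable jar (the main class name here is hypothetical):

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-assembly-plugin</artifactId>
        <configuration>
            <descriptorRefs>
                <!-- predefined descriptor: compiled classes plus all runtime dependencies -->
                <descriptorRef>jar-with-dependencies</descriptorRef>
            </descriptorRefs>
            <archive>
                <manifest>
                    <mainClass>com.example.Main</mainClass>
                </manifest>
            </archive>
        </configuration>
        <executions>
            <execution>
                <phase>package</phase>
                <goals>
                    <goal>single</goal>
                </goals>
            </execution>
        </executions>
    </plugin>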
Maven generated artifacts (JARs, WARs, whatever) should be shared via a remote repository (and thus deployed - I mean mvn deploy here - to this remote repository). A remote repository can be a simple file system served via a web server or a more advanced solution like Nexus.
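A rough sketch of the distributionManagement section that makes mvn deploy upload to such a repository (the ids and URLs are placeholders; the credentials go into a matching server entry in settings.xml):

    <distributionManagement>
        <repository>
            <id>internal-releases</id>
            <url>https://nexus.example.com/repository/releases</url>
        </repository>
        <snapshotRepository>
            <id>internal-snapshots</id>
            <url>https://nexus.example.com/repository/snapshots</url>
        </snapshotRepository>
    </distributionManagement>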
Development is usually done using SNAPSHOT dependencies (e.g. 1.0-SNAPSHOT). At release time, you're supposed to change the version into a "fixed" version (e.g. 1.0) and some other bits from your pom.xml, run the build to check that everything is ok, commit the modified pom.xml, create a tag in the VCS, promote the versions to a new SNAPSHOT (e.g. 1.1-SNAPSHOT) in the pom.xml, commit the new pom.xml in the VCS. The entire process require some work but this can be automated using the Maven Release Plugin.
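With the Release Plugin, that whole sequence typically boils down to running mvn release:prepare followed by mvn release:perform. A minimal configuration sketch (the tag format is just an example):

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-release-plugin</artifactId>
        <configuration>
            <!-- how the VCS tag created for each release is named -->
            <tagNameFormat>v@{project.version}</tagNameFormat>
        </configuration>
    </plugin>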
On the production environment, get the artifacts to be deployed from the remote repository and deploy them (some projects automate the deployment to the production server using Maven but that's another story).
Of course, there are variations around this (deployment to production is most of time company specific) but the general idea is there.
You need to look into the Maven Assembly Plugin and the Maven Release Plugin.
When building an artifact you usually state what scope each dependency has. With the default scope it is packaged into your archive. If you do not want that, use the scope "provided"; in that case you have to prepare a runtime environment that provides the dependency. It's generally a bad idea to rebuild a package only for deployment.
As for deploying, you can use Maven's antrun plugin to copy files locally or via scp.
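For example, a rough antrun sketch that copies the packaged war into a local Tomcat after packaging (the paths are illustrative; copying over scp would additionally require the Ant ssh/jsch dependencies on the plugin):

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-antrun-plugin</artifactId>
        <executions>
            <execution>
                <phase>package</phase>
                <goals>
                    <goal>run</goal>
                </goals>
                <configuration>
                    <target>
                        <!-- copy the freshly built war to the local Tomcat webapps directory -->
                        <copy file="${project.build.directory}/${project.build.finalName}.war"
                              todir="/opt/tomcat/webapps"/>
                    </target>
                </configuration>
            </execution>
        </executions>
    </plugin>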