I have user-level access to Jenkins and cannot change Maven's settings.xml due to access restrictions. When I run mvn deploy, it expects a distributionManagement section in the pom.xml. The code, however, comes from the internet, and I'd rather not change the pom.xml every time. Is there something I can do in Jenkins? I am considering the Artifactory plugin, because that is where I want to deploy.
I'd recommend using mvn deploy from Jenkins, but specifying the repository to deploy to with the -DaltDeploymentRepository=id::layout::url system property.
Format: id::layout::url
id is the repository id used to look up credentials in settings.xml (e.g. central, snapshots)
layout should be "default", unless you are still using Maven 1 (in which case it should be "legacy")
url is the URL for the repository you want to deploy to.
This is specified in the Maven documentation here: https://maven.apache.org/plugins/maven-deploy-plugin/deploy-mojo.html
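Putting that together, a Jenkins build step could invoke something like the following; the repository id and URL are placeholders, and the id must match a <server> entry in settings.xml that holds the credentials:

# Deploy without a distributionManagement section in the pom.xml;
# "releases" and the URL below are assumptions for illustration.
mvn deploy -DaltDeploymentRepository=releases::default::https://repo.example.com/artifactory/libs-release-local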
The Artifactory plugin seems a good solution if you cannot use the Maven deploy goal.
With a simple mvn clean install command plus the Artifactory plugin, you will be able to deploy wherever you want (provided the Jenkins server has the relevant read/write access).
I tried with dependency:get or dependency:copy, but those goals will also resolve from my local repo.
Next I tried a plain download from https://repo.company.com/repo/<path-to-group>/<artifactId>/<version>/<artifactId>-<version>.pom, but that fails because of missing permissions. That is why I tried to go through a plugin, so that the existing Maven credentials are used.
Context: Writing a deployment script that should avoid overwriting existing artifacts in our company repo.
A simple approach would be dependency:get, but overriding the local repository on command line with an empty (temporary) directory.
But, as was already said, Artifactory and Nexus are usually configured so that they do not allow existing artifacts to be overwritten.
local repo override: -Dmaven.repo.local=...
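A sketch of that check, with hypothetical coordinates (com.example:my-artifact:1.0.0); the repository URL is the one from the question, and the throwaway directory guarantees nothing is satisfied from the usual local cache:

# Resolve against an empty, temporary local repository so a cached copy
# cannot mask whether the artifact already exists in the remote repo.
mvn dependency:get -Dmaven.repo.local=$(mktemp -d) \
    -DremoteRepositories=https://repo.company.com/repo \
    -Dartifact=com.example:my-artifact:1.0.0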
I am building a jar in Jenkins and uploading it to an Artifactory repository. I've verified that the jar, the pom, and the hash files are present in the repository. When I try to build a project on my machine that has a dependency on the jar, the jar downloads correctly, but then I get a "POM file is missing" message and the Maven build fails. I don't see any rhyme or reason why this should fail: I've done an Artifactory trace on the jar and the pom in Chrome and Firefox, and the response says that it found the files. So I don't understand what could be causing the issue. We were running Artifactory 5.2.1 and upgraded over the weekend to 5.5.1, but that hasn't changed anything. What should I be looking at?
Thanks.
EDIT: This question is about to be moot. Discussions are in progress about setting up a generic maven repository and avoiding the use of Artifactory altogether, since it won't do what is needed.
First, make sure your groupId and artifactId are correct. I've lost a lot of time thinking something was a Maven problem when really I had just reversed a couple of letters in a long groupId.
Next, have you tried 'forcing' maven to bypass its local cache? Try running mvn -U <your tasks>
If that doesn't work, try deleting the ~/.m2/repository/path/to/the/artifact/with/the/missing/pom and use mvn -U again
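In shell terms that amounts to something like the following; the artifact path is made up:

# Remove the possibly broken local copy, then force Maven to re-check remote repositories.
rm -rf ~/.m2/repository/com/example/my-artifact
mvn -U clean install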
OK, problem solved. It turned out to be a Maven problem, not an Artifactory problem. Our Maven settings contain a proxy setting so we can pull down jars from Maven Central. The problem was that our company URL was incorrectly configured in the nonProxyHosts tag: it was set to domain.org instead of *.domain.org, so Maven was trying to retrieve the artifact through the proxy instead of going directly to the Artifactory server. My apologies to the Artifactory devs for blaming the problem on Artifactory.
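For reference, the relevant part of a settings.xml proxy definition looks roughly like this; the host, port, and ids are placeholders:

<proxies>
  <proxy>
    <id>corporate</id>
    <active>true</active>
    <protocol>http</protocol>
    <host>proxy.domain.org</host>
    <port>8080</port>
    <!-- The wildcard was the missing piece: "domain.org" alone does not match a host such as artifactory.domain.org -->
    <nonProxyHosts>*.domain.org</nonProxyHosts>
  </proxy>
</proxies>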
Websphere offers a set of provided jars, including com.ibm.ws.ejb.thinclient_8.5.0.jar, com.ibm.ws.batch.runtime.jar, com.ibm.ws.orb_8.5.0.jar, etc.
In the ANT build process, some people had these files on the classpath. Now we are moving to Maven, and I am not sure what I should do with these files:
If they should be part of the build process, I need to put them into the repository. But how should I get or generate proper POMs for them?
If they should not be part of the build process, what are proper replacements?
If you are using a company maven repository as a proxy to maven central, the best thing to do is to make these jar files available there:
mvn install:install-file -Dfile=<path to the jarfile> -DgeneratePom=true -Dpackaging=jar -Dversion=<version> -DgroupId=<groupId> -DartifactId=<artifactId>
In such a case the groupId is usually composed of your company prefix followed by the base package of the artifact, and the artifactId is the last part without the version. For example, for com.ibm.ws.ejb.thinclient_8.5.0.jar, the version is 8.5.0, the artifactId thinclient, and the groupId something like com.example.thirdparty.com.ibm.ws.ejb.
The same approach works as well if you are the sole developer and install these artifacts in your local repository.
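If the jars are meant to end up in the shared company repository rather than only in a local one, deploy:deploy-file is the corresponding goal; a sketch, where the repository id and URL are assumptions and the id must match a <server> entry in settings.xml:

mvn deploy:deploy-file -Dfile=com.ibm.ws.ejb.thinclient_8.5.0.jar \
    -DgroupId=com.example.thirdparty.com.ibm.ws.ejb -DartifactId=thinclient -Dversion=8.5.0 \
    -Dpackaging=jar -DgeneratePom=true \
    -DrepositoryId=thirdparty -Durl=https://repo.example.com/repository/thirdparty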
See also the official documentation
Another approach would be to keep these files as part of the project, reference them through a local path, and install them from there, either using the maven-install-plugin or by issuing the steps from the first approach as part of the build process. See Maven and adding JARs to system scope and Maven: add a dependency to a jar by relative path.
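A minimal sketch of the system-scope variant, assuming the jar is kept in a lib/ folder inside the project:

<dependency>
  <groupId>com.ibm.ws.ejb</groupId>
  <artifactId>thinclient</artifactId>
  <version>8.5.0</version>
  <scope>system</scope>
  <!-- Points at a jar committed next to the project; system scope is generally discouraged. -->
  <systemPath>${project.basedir}/lib/com.ibm.ws.ejb.thinclient_8.5.0.jar</systemPath>
</dependency>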
Disclaimer: I always used the first option, as this seems to be the proper way.
Try the "was_public" JAR and POM shipped along with WebSphere Application Server traditional, starting with Version 8.
See here.
I have a Java project I'm working on, and so far I've used mvn deploy to upload artifacts to Nexus. As far as I can tell, Maven looks at the distributionManagement element in the POM and, if the current version is a snapshot, uploads to the configured snapshot repository; otherwise it uploads to the release repository. For this to work, both need to be configured in the POM.
What I'd like to know is if this behavior is the same with Jenkins Maven Integration Plugin. Do I need to set both repositories within the POM? If not, how can it know when to upload to snapshot repo and when to release repo (since it only asks for 1 URL or ID)?
It should be the same with the Jenkins Maven integration plugin.
You are supposed to specify the same things in the pom.xml.
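A sketch of such a distributionManagement section, with placeholder ids and URLs; Maven (and therefore the Jenkins Maven build) picks the snapshotRepository automatically whenever the project version ends in -SNAPSHOT:

<distributionManagement>
  <repository>
    <id>nexus-releases</id>
    <url>https://nexus.example.com/repository/maven-releases/</url>
  </repository>
  <snapshotRepository>
    <id>nexus-snapshots</id>
    <url>https://nexus.example.com/repository/maven-snapshots/</url>
  </snapshotRepository>
</distributionManagement>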
I have a Java-based GitHub project, fitnessjiffy-spring (I'm currently focused on the "bootstrap" branch). It depends on a library built from another GitHub project, fitnessjiffy-etl. I am trying to configure both of these to be built by Travis CI.
Unfortunately, Travis is not as sophisticated as Jenkins or Hudson in dealing with Maven-based Java projects. Jenkins can easily handle dependencies between projects, but the same concept doesn't seem to exist with Travis. If one project depends on another, then that other project must already be built previously... and its artifact uploaded to some Maven repo where the first project can download it later.
My "fitnessjiffy-etl" library is building and deploying just fine. I'm using Bintray for Maven repository hosting, and you can clearly see my artifacts over plain HTTP at:
http://dl.bintray.com/steve-perkins/maven/
In my "fitnessjiffy-spring" project, I am adding this Maven repo location directly in the pom.xml, so that Travis will be able to find that artifact dependency. Here is the state of my POM at the time of this writing. Note the <repositories> element at the bottom of the file.
When I build this project locally, it works just fine. I can see it downloading the Maven artifact from "http://dl.bintray.com/...". However, when I try to build on Travis CI it fails every time. I can see in the console log that Travis is still trying to download the artifact from Maven Central rather than my specified repo.
Does this make sense to anyone else? Why does Maven utilize a custom repository location in a POM file when building locally, but ignores this configuration when running on a Travis CI build?
From digging into this further, I discovered that Travis uses its own proxy for Maven Central, and has configured Maven to force ALL dependency requests through their proxy. In other words, it does not seem possible at this time to use additional Maven repos specified in the POM file of a project built on Travis.
In my case, I ended up refactoring so that the project would not need the outside JAR dependency. I also switched to Drone.io, so I could manage my settings on the build server rather than having to carry a YAML file in my repository (which always struck me as a bit daft).
However, even on Drone it's still a major hassle to manage dependencies between multiple projects (extremely common with Java development). For Java, I just don't think there's currently an adequate substitute for Jenkins or Hudson, maybe running on a cheap Digital Ocean droplet or some other VPS provider instance.
In your install phase, add a $HOME/.m2/settings.xml that defines your custom repository:
cache:
  directories:
    - "$HOME/.m2"

install:
  - curl -o $HOME/.m2/settings.xml https://raw.githubusercontent.com/trajano/trajano/master/src/site/resources/settings.xml
  - mvn dependency:go-offline

script:
  - mvn clean install site
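A minimal sketch of what that settings.xml could contain to make an extra repository visible on Travis; the profile and repository ids are assumptions, and the URL is the Bintray repo from the question:

<settings>
  <profiles>
    <profile>
      <id>custom-repos</id>
      <repositories>
        <repository>
          <id>bintray-steve-perkins-maven</id>
          <url>http://dl.bintray.com/steve-perkins/maven/</url>
        </repository>
      </repositories>
    </profile>
  </profiles>
  <activeProfiles>
    <activeProfile>custom-repos</activeProfile>
  </activeProfiles>
</settings>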