How to configure Maven2 to publish to Artifactory?

Currently I have a Maven2 project that builds a JAR when you run:
mvn clean package
I need to now tweak the pom.xml to publish this JAR (myapp.jar) to an Artifactory server running at:
http://myartifactory/artifactory/simple/myorg/myapp/0.1
I tried adding a <repositories> element to my pom.xml but nothing is being published with this config:
<repositories>
<repository>
<id>myartifactory</id>
<url>http://myartifactory/artifactory/simple/</url>
</repository>
</repositories>
Any ideas as to how I could get publishing to work? For simplicity's sake, pretend that this Artifactory repo is authenticated to accept publishes/writes from a user with a username=foo and password=bar.

You have two options (please note that the latter is the recommended one):
Add a distributionManagement section to your pom.xml and a server entry to your settings.xml
Let's say you want to deploy to the libs-snapshot-local repository. In this case you need to go to the tree browser in Artifactory, focus on the repository level, copy the Distribution Management snippet and paste it into your pom.xml:
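(The generated snippet looks roughly like the following; the server id, name and URL below are placeholder values for whatever your Artifactory instance generates:)
<distributionManagement>
    <snapshotRepository>
        <id>artifactory</id>
        <name>libs-snapshot-local</name>
        <url>http://myartifactory/artifactory/libs-snapshot-local</url>
    </snapshotRepository>
</distributionManagement>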
Next, you need to give Maven the credentials. For that, click on your username in the top right corner, enter your password to unlock the screen, and copy the server tag from the Maven Settings panel:
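(Again, roughly what the generated snippet looks like; with the credentials from the question it would be something along these lines:)
<servers>
    <server>
        <id>${server-id}</id>
        <username>foo</username>
        <password>bar</password>
    </server>
</servers>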
This one you paste in your settings.xml. Don't forget to replace the ${server-id} with the real server id (the one you have in Distribution Management now).
Now, just run mvn deploy and enjoy.
Working with Maven Artifactory Plugin:
Add the relevant <plugin> part as described in the wiki to your pom.xml. It includes both the target repository and the credentials (please use external credentials source, like environment variables or system properties).
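A minimal sketch of such a plugin section, assuming the org.jfrog.buildinfo:artifactory-maven-plugin described in the wiki (the version, repository keys and environment variable names are placeholders to adapt):
<plugin>
    <groupId>org.jfrog.buildinfo</groupId>
    <artifactId>artifactory-maven-plugin</artifactId>
    <version>3.6.1</version>
    <inherited>false</inherited>
    <executions>
        <execution>
            <id>build-info</id>
            <goals>
                <goal>publish</goal>
            </goals>
            <configuration>
                <publisher>
                    <contextUrl>http://myartifactory/artifactory</contextUrl>
                    <repoKey>libs-release-local</repoKey>
                    <snapshotRepoKey>libs-snapshot-local</snapshotRepoKey>
                    <username>${env.ARTIFACTORY_USER}</username>
                    <password>${env.ARTIFACTORY_PASSWORD}</password>
                </publisher>
            </configuration>
        </execution>
    </executions>
</plugin>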
Run mvn deploy and enjoy not only the deployment to Artifactory, but also additional features as described below.
Additional features of Artifactory Maven Plugin (on top of regular Maven deployment):
Allow adding custom properties to the deployed files
Provide the build bill of materials (the buildInfo), allowing Build Integration with any build server (even those not supported by JFrog) or even with standalone builds (without build server at all).

Related

Maven does not update/download SNAPSHOT jars when executed with the '-U,--update-snapshots' parameter

I execute a Maven build with 'mvn clean deploy' on a SNAPSHOT version of my, say, dependency-lib. The build was successful and the artifact was successfully deployed to Artifactory.
Then I execute Maven with "mvn -U clean package" on my, say, dependee-proj, and it does not download the newest deployed version of dependency-lib. It just downloads the maven-metadata.xml from Artifactory and skips downloading the jar files. I checked the local Maven repository, and only some XMLs were updated; there are no new jar files.
Is there something wrong with what I am doing?
PS: Deleting dependency-lib from the local repository worked, but somehow I feel that this is not a sane thing to do.
Set an updatePolicy for the repository you push your snapshots to. This is normally how snapshots are updated and later pulled by Maven.
<repositories>
<repository>
<id>you-snapshots</id>
<url>http://host/repos/snapshots</url>
<snapshots>
<updatePolicy>always</updatePolicy>
</snapshots>
</repository>
</repositories>
In fact, you can also run mvn -U (--update-snapshots).
A Maven build is made up of phases. These phases are:
validate - validate the project is correct and all necessary information is available
compile - compile the source code of the project
test - test the compiled source code using a suitable unit testing framework. These tests should not require the code be packaged or deployed
package - take the compiled code and package it in its distributable format, such as a JAR.
verify - run any checks on results of integration tests to ensure quality criteria are met
install - install the package into the local repository, for use as a dependency in other projects locally
deploy - done in the build environment, copies the final package to the remote repository for sharing with other developers and projects.
Since the deploy phase is executed last, all preceding phases have already been executed, including the install phase, which installs the artifact in your local repository.
So when deploy finishes, both your local repository and the remote one have the latest version you just uploaded, so there is no need to download it from the remote since it is already present in your local repository.
Is there something wrong with what I am doing?
Finally, to answer this: nothing strange is happening, just the normal behavior.
Deleting the artifact from the local repository of course forces Maven to download it again from the remote, since that is the way Maven works.

What's the correct way to generate a .pom file from my pom.xml?

I've been learning how to publish a Java library to jcenter. jcenter/bintray wants me to upload the following artifacts:
Binaries: {groupId}/{artifactId}-{version}.jar
Source: {groupId}/{artifactId}-{version}-sources.jar
(Optionally) Javadoc: {groupId}/{artifactId}-{version}-javadoc.jar
POM: {groupId}/{artifactId}-{version}.pom
I can generate the first three no problem (binaries with a standard mvn package and sources and javadoc using the plugins described here).
Currently I'm just manually copying my pom.xml to {groupId}/{artifactId}-{version}.pom and that works fine, but it smells. I'm sure there must be an automated Maven way of doing this but I can't find it. Can anyone help?
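(For reference, the sources and javadoc jars mentioned above are typically attached with the standard Apache plugins; a sketch, plugin versions omitted:)
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-source-plugin</artifactId>
    <executions>
        <execution>
            <id>attach-sources</id>
            <goals>
                <goal>jar-no-fork</goal>
            </goals>
        </execution>
    </executions>
</plugin>
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-javadoc-plugin</artifactId>
    <executions>
        <execution>
            <id>attach-javadocs</id>
            <goals>
                <goal>jar</goal>
            </goals>
        </execution>
    </executions>
</plugin>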
You can upload your Maven project directly to Bintray just by adding some code to your pom.xml and then running the appropriate mvn command:
First, add a distributionManagement section to your pom.xml and specify the URL to which your project will be deployed:
<distributionManagement>
<repository>
<id>bintray-repo-maven-example</id>
<url>https://api.bintray.com/maven/tamarjfrog/maven-repo/maven-example/;publish=1</url>
</repository>
</distributionManagement>
In order to work with Bintray you need to provide your Bintray username and API Key as upload credentials in the username and password tags of your Maven settings.xml file. The API Key can be found when editing your Bintray profile page:
<server>
<id>bintray-repo-maven-example</id>
<username>tamarjfrog</username>
<password>***my-top-secret-api-key***</password>
</server>
Then you just run this simple command:
mvn deploy
The project will be built, uploaded to the Bintray target repository URL you provided, and published. For more information take a look here.
mvn deploy should automatically push your pom as the .pom artifact.

How to import a deployed SNAPSHOT jar from a local Artifactory repository into another project?

I am a newbie with Artifactory. I have 2 projects, one depending on the other...
I set up Artifactory on a server, deployed the first jar into libs-snapshot, changed C:\Users\…\.m2\settings.xml, and added this tag to the pom of the deployed project:
<distributionManagement>
<snapshotRepository>
<id>serverId</id>
<name>serverName</name>
<url>serverUrl/artifactory/libs-snapshot/</url>
</snapshotRepository>
</distributionManagement>
How can I access the first project from the second one via the Artifactory repository?
I am working with NetBeans 8.2, GlassFish 4 and Artifactory 4.
By default, Maven doesn't know to look anywhere except your local repo and Maven Central. You'll need to tell it about any additional repos it can look in, either via a pom setting or settings.xml.
You can see some example and additional details in the Maven docs.
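For example, a sketch of such a repository entry in the consuming project's pom.xml, assuming the libs-snapshot repository from the question (the id and URL are placeholders):
<repositories>
    <repository>
        <id>artifactory-snapshots</id>
        <url>http://serverUrl/artifactory/libs-snapshot</url>
        <snapshots>
            <enabled>true</enabled>
        </snapshots>
    </repository>
</repositories>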

Download maven project dependencies to build offline later

I've used Maven for a while in the usual way with out-of-the-box settings, so I understand what it is, but I'm a newbie when it comes to Maven configuration.
I need to organize such workflow:
The developer writes Java code, using some dependencies from the internet.
The developer commits his work.
TeamCity can build his work automatically, without any manual steps and without internet access.
I have an idea of how to do it:
The developer uses Maven. A "common" directory acts as a repository for certain Java projects.
After the work is complete, the developer commits his project and the common directory into SVN.
TeamCity updates the project and the common directory from SVN and runs "mvn package". Anything it needs is taken from the common directory, without worrying about an internet connection or starting up Nexus or other repo services.
My question is:
How can I use a simple directory on the filesystem as a proxy repository for certain projects?
Please tell me how to realize this idea, or give me another idea for achieving such a workflow.
I could just commit the local repository, but there are some limitations:
The local repo would be committed as a zip of artifacts. If I make even small changes to it, the whole cache file must be uploaded to and downloaded from SVN, which takes a long time.
The local repo stores artifacts for all projects. I want only certain projects to use this repo, because developers don't want to review changes and filter out unused dependencies.
I tested deploying projects to a local directory simply by writing "file://testRespoDir" as the repo URL, but I can't understand how to make this directory proxy all remote artifacts for the project (the project must not use the local repo and should use only the common directory).
I found a simple solution that works perfectly:
The POM includes 2 repositories:
<repositories>
<repository>
<id>commonDep</id>
<name>common dependency</name>
<url>file://../common/repository</url>
</repository>
<!-- used when collecting dependencies before commit. If the developer has already downloaded a dependency, it is taken from the local repo, saving traffic and time -->
<repository>
<id>localPlugins</id>
<name>local plugins</name>
<url>file://${user.home}/.m2/repository</url>
</repository>
</repositories>
<pluginRepositories>
<pluginRepository>
<id>commonDep</id>
<name>common dependency</name>
<url>file://../common/repository</url>
</pluginRepository>
<!-- used when collecting dependencies before commit. If the developer has already downloaded a dependency, it is taken from the local repo, saving traffic and time -->
<pluginRepository>
<id>localPlugins</id>
<name>local plugins</name>
<url>file://${user.home}/.m2/repository</url>
</pluginRepository>
</pluginRepositories>
When the developer opens the project, he uses the local, common and central repositories in that order, which saves traffic. When he has finished his work, he calls this script:
mvn dependency:go-offline -Dmaven.repo.local=../common/repository
All current dependencies of the project are copied from his default local repository to the common repository. Then the developer commits common.
When we run the build on TeamCity, we check out the project and common and run the build. There is no internet connection, but all dependencies exist in the common repository.
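For the CI build, Maven can also be forced to stay offline against the committed repository; a sketch, assuming the directory layout above:
mvn -o package -Dmaven.repo.local=../common/repository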
Yes, you can do this. Personally I wouldn't recommend it, especially if you're using SNAPSHOTs, but you should be able to.
What you want to do is create a network drive (I don't know whether you're on Windows or Linux, but it doesn't matter).
Then mount that network drive on all systems which require it.
Then, in the Maven config file, specify the local Maven repo location:
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0
http://maven.apache.org/xsd/settings-1.0.0.xsd" >
<localRepository>c:/mvn-repo/</localRepository>
...
</settings>
Replace c:/mvn-repo/ with the path to the directory on the network drive you wish to use.
You can place this config in various places, but I would suggest putting it in the global config file, which lives at ${MAVEN_HOME}/conf/settings.xml.
You will need to specify this on each computer which is using Maven.
That should do it, and all your Maven runs will share the same local repo.
So how do you get around different projects using different directories? That's a tricky one; you could use different directories on the network drive and change the localRepository variable at run time by specifying it as a runtime property.
mvn -Dmaven.repo.local=$HOME/.my/other/repository clean install
That way you would have it all parceled up nicely on one network drive with a directory for each project; then simply specify that variable at run time to set which local repo to use.
The flow you propose won't scale. I would rather set up a local corporate mirror of the central repository and have both developers and automation servers (TeamCity etc.) use it. It is trivial to set up, easy to maintain, and makes it easy to track dependencies on third-party software and put restrictions in place.
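A minimal sketch of such a mirror in settings.xml (the URL is a placeholder for whatever repository manager you run internally):
<settings>
    <mirrors>
        <mirror>
            <id>corporate-mirror</id>
            <mirrorOf>*</mirrorOf>
            <url>http://repo.mycompany.example/maven2</url>
        </mirror>
    </mirrors>
</settings>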
I want to try this solution:
Add the repository file://testRespoDir to the POM.
In the POM there is a plugin bound to the initialize phase which copies the project's dependencies to this repo if they do not already exist there.
Disable Central on the TeamCity server.
Only one question remains: which plugin can copy dependencies in this way?
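One candidate is the maven-dependency-plugin's copy-dependencies goal, which can lay the copied files out in repository format; a minimal sketch, assuming the common directory from the answer above:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <executions>
        <execution>
            <id>copy-deps-to-common</id>
            <phase>initialize</phase>
            <goals>
                <goal>copy-dependencies</goal>
            </goals>
            <configuration>
                <outputDirectory>../common/repository</outputDirectory>
                <useRepositoryLayout>true</useRepositoryLayout>
                <copyPom>true</copyPom>
            </configuration>
        </execution>
    </executions>
</plugin>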

multiple repositories with the same artifact in maven

I have made some modifications to log4j and would like my project to use my local version rather than the one from the remote Maven repo, so I declared a local repo in my pom.xml in addition to my remote repo for other dependencies:
<repository>
<id>my-log4j</id>
<name>my log4j</name>
<url>file:///...</url>
</repository>
<repository>
<id>remote</id>
<name>remote repo</name>
<url>http://...</url>
</repository>
Maven copied the files from my local repo as expected, but then it downloaded log4j again from the remote repo and overwrote the earlier files. Is there a way to exclude certain artifacts from being downloaded from the remote repo?
Also, how does Maven detect changes to my-log4j? How can I make Maven copy the my-log4j artifacts each time during compilation?
If you make a custom version of something, you give it a custom version number.
For example, if you modify log4j-1.2.17 for your own use, give it the version 1.2.17.JRR.1 and following numbers as you work on it.
You build them on your computer and when you run the install goal, it will put them in your local repo. If you have a shared repo for your group, it can be deployed there as well and never confused with the Apache releases.
This will never be found in the remote repo, just in yours.
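For example, the project would then depend on the patched coordinates rather than the Apache release (a sketch using the version scheme suggested above):
<dependency>
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
    <version>1.2.17.JRR.1</version>
</dependency>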
When Maven looks for artifacts, it always looks in your local repository first; you do not have to specify it (you can set the location of your local repository in your settings.xml).
You have already answered your own question: if you had to change a third-party artifact, rename it (in its pom.xml) to something like my-log4j or log4j-my-patch. Then it won't collide with the original artifacts.
