Our build is a standard Maven 2 build running Surefire on JDK 1.5. I know that some of the tests are connecting to the internet to resolve schemas/DTDs, because they take a long time when the internet connection is down. How do I stop this?
The connection is made by your tests while parsing XML, not by Maven itself. To check, just run them outside of Maven, e.g. in Eclipse.
If you control the files you parse and it is acceptable, or just as a quick check, you could remove the DOCTYPE definitions from your XML files (a dirty hack). A better approach is to configure the CatalogManager through a CatalogManager.properties file, where you can specify custom XML catalogs, e.g. pointing to local copies of the resources.
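As a sketch, assuming the Apache xml-commons resolver is available to the tests (the file names and the system ID below are illustrative, not from the original question): CatalogManager.properties on the classpath would point at a catalog file, e.g. `catalogs=catalog.xml`, and the catalog maps remote identifiers to local copies so the parser never goes to the network:

```xml
<!-- catalog.xml: maps a remote system ID to a local copy (entries are illustrative) -->
<catalog xmlns="urn:oasis:names:tc:entity:xmlns:xml:catalog">
  <!-- resolve the XHTML strict DTD from a file shipped with the tests -->
  <system systemId="http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd"
          uri="dtds/xhtml1-strict.dtd"/>
</catalog>
```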
You should run it in offline mode. That's a -o on the command line.
Otherwise you could mirror the remote repository locally and add the local repo to your settings.xml:
<project>
  ...
  <repositories>
    <repository>
      <id>my-internal-site</id>
      <url>http://myserver/repo</url>
    </repository>
  </repositories>
  ...
</project>
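If you would rather force every remote lookup through the internal server instead of adding per-project repositories, a `<mirrors>` entry in settings.xml does that (a sketch; the id is a placeholder and the URL reuses the example above):

```xml
<settings>
  <mirrors>
    <mirror>
      <id>internal-mirror</id>
      <!-- mirrorOf "*" routes every repository request to the internal server -->
      <mirrorOf>*</mirrorOf>
      <url>http://myserver/repo</url>
    </mirror>
  </mirrors>
</settings>
```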
edit: I could be answering the wrong question here, so please clarify.
How can I be sure that my JAR files are NOT uploaded to the Maven central repo? I am asking this question because I have seen, several times, exceptions like "error while uploading to central repo". I was shocked, as I hadn't made any such configuration in my pom and hadn't applied for central repo administration. That's why I decided to ask this question.
So, how can I check that the absence or presence of some code guarantees that my JAR is not uploaded to the central repo?
You can specify which repository your project should be deployed to via the Distribution Management section of pom.xml. I think there is no default. However, it's possible that you have a parent pom.xml specified and it contains such a setting. If that is the case, you can modify the parent. Failing that, you can override it by putting your own private repository details in this section to avoid deploying your artifacts anywhere else. It can even be an invalid URL, in which case deployment will simply always fail.
Example:
<distributionManagement>
  <snapshotRepository>
    <id>fake-snapshots</id>
    <url>https://fake/snapshots</url>
  </snapshotRepository>
  <repository>
    <id>fake-releases</id>
    <url>https://fake/releases</url>
  </repository>
</distributionManagement>
Check whether you are using the deploy:deploy-file goal of the maven-deploy-plugin in your pom.xml.
That goal is used to deploy JAR files to a remote repo.
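For reference, a deploy-file binding in a pom looks roughly like this (the repository id, URL, and file path are placeholders, not taken from the question); if something like it appears in your build, artifacts are being pushed to the listed repository:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-deploy-plugin</artifactId>
  <executions>
    <execution>
      <phase>deploy</phase>
      <goals><goal>deploy-file</goal></goals>
      <configuration>
        <!-- placeholders: where and what would be uploaded -->
        <repositoryId>some-repo</repositoryId>
        <url>http://some-repo/releases</url>
        <file>${project.build.directory}/${project.build.finalName}.jar</file>
      </configuration>
    </execution>
  </executions>
</plugin>
```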
I am trying to get the vertx library to compile in IntelliJ (w/ maven)
I had a lot of trouble getting Maven to recognize/download the vertx dependency, but I was able to resolve the dependency issues by downloading and installing the vertx libraries via the command line. Now I no longer have issues with the vertx dependencies, and when I check the project dependencies in IntelliJ it has the correct path and shows no errors.
The Maven libraries in 'project structure' show up correctly, and I've checked the directories for the .jar and .java files; they're all there.
I've also set my Maven home directory to a fresh Maven install:
C:/Program Files/apache-maven-3.3.3
and the settings file to the correct file:
C:\Users\[User]\.m2\settings.xml
and the local repository (where Maven has installed the requisite libraries):
C:\Users\[User]\.m2\repository
These directories line up with the libraries I've installed and with the Maven directories, which seem to be working correctly (the little red line that was there before I reinstalled the libraries and reset the caches is gone).
However, I am still not able to use the vertx libraries.
Also, this code works on a friend's Mac, but I'm running Windows and can't seem to get it to compile ("java: package does not exist" error).
I was able to install the module with Maven by going to File - Project Structure - Add Libraries - Maven, and after googling the name of the module (it's io.vertx:vertx-core; io.vertx.core won't bring it up) I was able to install it fully using Maven and add it as a dependency from within IntelliJ. HTH.
So it took me forever, but I figured this out.
This was a caching issue, combined with IntelliJ not recognizing that I was importing a snapshot library with Maven.
1: For some reason the settings.xml file from the vertx download website does not have snapshots enabled, but (from what I understand) the most recent build of their system is a snapshot that Maven updates from time to time. If snapshots are not enabled, Maven/IntelliJ won't update/recognize the libraries. This was happening even though I had installed the libraries with mvn install.
settings.xml should look like:
<profiles>
  <profile>
    <id>allow-snapshots</id>
    <activation><activeByDefault>true</activeByDefault></activation>
    <repositories>
      <repository>
        <id>snapshots-repo</id>
        <url>https://oss.sonatype.org/content/repositories/snapshots</url>
        <releases><enabled>false</enabled></releases>
        <snapshots><enabled>true</enabled></snapshots>
      </repository>
    </repositories>
  </profile>
</profiles>
2: After I fixed the above issue, I still had the same problem until I deleted the locally cached libraries I had previously installed with Maven (~/.m2/repository). For some reason IntelliJ wouldn't recognize them. Once deleted, Maven/IntelliJ re-downloaded them and recognized them from that point on.
Currently I have a Maven 2 project that builds a JAR when you run:
mvn clean package
I need to now tweak the pom.xml to publish this JAR (myapp.jar) to an Artifactory server running at:
http://myartifactory/artifactory/simple/myorg/myapp/0.1
I tried adding a <repositories> element to my pom.xml, but nothing is published with this config:
<repositories>
  <repository>
    <id>myartifactory</id>
    <url>http://myartifactory/artifactory/simple/</url>
  </repository>
</repositories>
Any ideas as to how I could get publishing to work? For simplicity's sake, assume that this Artifactory repo accepts publishes/writes from a user with username=foo and password=bar.
You have two options (please note that the latter is the recommended one):
Add a DistributionManagement part to your pom and a server part to your settings.xml
Let's say you want to deploy to the libs-snapshot-local repository. In this case you need to go to the tree browser in Artifactory, focus on the repository level, copy the Distribution Management snippet, and paste it into your pom.xml.
Next, you need to tell Maven the credentials. For that, click on your username in the top right corner, enter your password to unlock the screen, and copy the server tag from the Maven Settings panel.
Paste this one into your settings.xml. Don't forget to replace ${server-id} with the real server ID (the one you now have in Distribution Management).
Now, just run mvn deploy and enjoy.
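Put together, the two snippets look roughly like this. This is a sketch, not the exact output Artifactory generates: the repository URL and server id are placeholders, and the credentials reuse the foo/bar example from the question. The server id in settings.xml must match the repository id in the pom:

```xml
<!-- pom.xml (placeholder values; Artifactory generates the real snippet) -->
<distributionManagement>
  <snapshotRepository>
    <id>artifactory</id>
    <url>http://myartifactory/artifactory/libs-snapshot-local</url>
  </snapshotRepository>
</distributionManagement>

<!-- settings.xml: the server id must match the id above -->
<servers>
  <server>
    <id>artifactory</id>
    <username>foo</username>
    <password>bar</password>
  </server>
</servers>
```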
Working with Maven Artifactory Plugin:
Add the relevant <plugin> part, as described in the wiki, to your pom.xml. It includes both the target repository and the credentials (please use an external credentials source, like environment variables or system properties).
Run mvn deploy and enjoy not only the deployment to Artifactory, but also the additional features described below.
Additional features of Artifactory Maven Plugin (on top of regular Maven deployment):
Allows adding custom properties to the deployed files
Provides the build bill of materials (the buildInfo), enabling Build Integration with any build server (even those not supported by JFrog), or even with standalone builds (without a build server at all).
I've used Maven for a while in the common way, with out-of-the-box settings, so I understand what it is, but I'm a newbie at Maven configuration.
I need to organize the following workflow:
The developer writes Java code, using some dependencies from the internet.
The developer commits his work.
TeamCity can build his work automatically, without any manual steps and without internet access.
I have an idea of how to do it:
The developer uses Maven. A "common" directory acts as the repository for certain Java projects.
After the work is complete, the developer commits his project and the common directory into SVN.
TeamCity updates the project and the common directory from SVN and runs "mvn package". Everything it needs is taken from the common directory, without worrying about an internet connection or running Nexus or another repository service.
My question is:
How can I use a simple directory on the filesystem as a proxy repository for certain projects?
Please tell me how to realize this idea, or suggest another way to achieve such a workflow.
I could just commit the local repository, but there are some limitations:
The local repo stores zipped artifacts. If I make even a little change to it, the whole cache must be uploaded and downloaded to/from SVN, which takes a long time.
The local repo stores artifacts for all projects. I want only certain projects to use this repo, because developers don't want to check changes and filter out unused dependencies.
I tested deploying projects to a local directory, simply by writing "file://testRespoDir" as the repo URL, but I can't figure out how to make this directory proxy all remote artifacts for a project (the project must not use the local repo, only the common directory).
I found a simple and perfect solution:
The POM includes two repositories:
<repositories>
  <repository>
    <id>commonDep</id>
    <name>common dependency</name>
    <url>file://../common/repository</url>
  </repository>
  <!-- used when collecting dependencies before commit; if the developer has already downloaded a dependency, it comes from the local repo, saving traffic and time -->
  <repository>
    <id>localPlugins</id>
    <name>local plugins</name>
    <url>file://${user.home}/.m2/repository</url>
  </repository>
</repositories>
<pluginRepositories>
  <pluginRepository>
    <id>commonDep</id>
    <name>common dependency</name>
    <url>file://../common/repository</url>
  </pluginRepository>
  <!-- used when collecting dependencies before commit; if the developer has already downloaded a dependency, it comes from the local repo, saving traffic and time -->
  <pluginRepository>
    <id>localPlugins</id>
    <name>local plugins</name>
    <url>file://${user.home}/.m2/repository</url>
  </pluginRepository>
</pluginRepositories>
When a developer opens the project, he uses the local, common, and central repositories, in that order, which saves traffic. When he has finished work, he runs this script:
mvn dependency:go-offline -Dmaven.repo.local=../common/repository
All current dependencies of the project are copied from his default local repository to the common repository. Then the developer commits common.
When we run the build on TeamCity, we check out the project and common and run it. There is no internet connection, but all dependencies exist in the common repository.
Yes, you can do this. Personally I wouldn't recommend it, especially if you're using SNAPSHOTs, but it should work.
What you want to do is create a network drive (whether you're on Windows or Linux doesn't matter).
Then mount that network drive on all systems which require it.
Then, in the Maven config file, specify the local Maven repository location:
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0
                              http://maven.apache.org/xsd/settings-1.0.0.xsd">
  <localRepository>c:/mvn-repo/</localRepository>
  ...
</settings>
Replace c:/mvn-repo/ with the path to the directory on the network drive you wish to use.
You can place this config in various places, but I would suggest putting it in the root config file, which lives at ${MAVEN_HOME}/conf/settings.xml.
You will need to specify this on each computer which is using Maven.
That should do it, and all your Maven runtimes will share the same local repo.
So how do you get around different projects using different directories? That's a tricky one: you could use different directories on the network drive and change the localRepository location at run time by specifying it as a runtime property:
mvn -Dmaven.repo.local=$HOME/.my/other/repository clean install
That way you have it all parceled up nicely: one network drive with a directory for each project. Then simply set that property at run time to choose which local repo to use.
The flow you propose won't scale. I would rather set up a local corporate mirror of the central repository and have both developers and automation servers (TeamCity etc.) use it. It is trivial to set up, easy to maintain, and makes it easy to track dependencies on third-party software and put restrictions in place.
I want to try the following solution:
Add the repository file://testRespoDir to the POM.
In the POM there is a plugin, bound to an early phase, which copies the project's dependencies to this repo if they don't already exist there.
Disable Central on the TeamCity server.
Only one question remains: which plugin can copy dependencies in such a way?
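One candidate (an assumption on my part, not confirmed in the thread) is the maven-dependency-plugin's copy-dependencies goal bound to an early phase; the phase choice and output path here are illustrative:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-deps-to-common</id>
      <!-- runs early in the build; the phase choice is illustrative -->
      <phase>initialize</phase>
      <goals><goal>copy-dependencies</goal></goals>
      <configuration>
        <outputDirectory>../common/repository</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
```

Note that copy-dependencies produces a flat directory of JARs rather than a repository layout, so the dependency:go-offline approach shown in the accepted answer may be a better fit if the directory must remain usable as a Maven repository.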
My pom.xml uses the following code to define the company's internal Maven repository, so that dependencies are downloaded from this repository if they cannot be found in my local repository.
<repositories>
  <repository>
    <id>XXXXXX</id>
    <name>Internal Repository</name>
    <url>http://private.ip/nexus-webapp/content/groups/public/</url>
  </repository>
</repositories>
When I add some dependencies to pom.xml, I find that the dependencies I add are also added to that internal repository. Besides deleting the <repositories> section in pom.xml, can I configure its attributes such that the dependencies added in pom.xml will not be added to this internal repository?
It sounds like what you're describing is Nexus' proxying mechanism. You request artifacts from Nexus, and it looks at the configured outside repos for the artifacts, caches them locally, and returns them to you. That assumes the repositories in question are configured to be proxied through Nexus, of course.
If someone set it up that way, why do you want to circumvent it? You'd use Nexus in this way so the artifacts are closer to you and your builds run faster. The only way to stop this is to change the settings in Nexus or else stop using it. You don't have to remove the repo entirely from the pom; just put other repos ahead of it, and Maven will look in those first. But again, why would you not want to use Nexus as it was designed, as a near cache for artifacts?
You need to configure it in your repository software (Artifactory, Nexus, ...).
I think you have set up a proxy repository here, which downloads every artifact requested. You might want to try running a 'hosted repository' instead. More info here.
The equivalent concept in Artifactory is a 'local repository' (read here).
Download and install the dependencies you need manually using the following command. It will add the package to your local repository so that you can use it. Read here:
mvn install:install-file -Dfile=<path-to-file> -DgroupId=<group-id> \
-DartifactId=<artifact-id> -Dversion=<version> -Dpackaging=<packaging>