I have a question about the correct flow for dockerizing a Java/Spring Boot/Maven application in order to use Docker layers.
Currently it looks like this: Maven updates the versions (using the Maven Release Plugin), then the Docker image is built with a custom Dockerfile.
However, the problem is that all of the dependencies are downloaded every time, even when the POMs are otherwise unmodified, because every new version I want to build has a different version value in the POMs (it is updated by this plugin).
I would like to use Docker layers so that our builds are faster than they currently are. Is that possible, and if so, how? Should I use a different plugin, or are there other options to take into account (maybe the whole flow is bad and should be done differently)?
One possibility is to set up a local proxy, like a Nexus or Artifactory instance, which you can use to cache content from the internet. You'll still be downloading the artifacts, but hopefully it will be faster, and only the first time.
The second possibility is to create a Docker layer that downloads the dependencies into the local repository (inside the Docker image), followed by the actual build. So it would look something like:
COPY pom.xml .
RUN mvn dependency:go-offline
COPY src ./src
RUN mvn package
That way, your Docker image captures all of the dependencies and plugins in the local ~/.m2/repository cache, and the build resolves them locally. When you re-run the build step, it should reuse the previously cached layer.
You'll still need to rebuild the dependency layer over time as your project evolves, but it will speed up your builds by not downloading the dependencies again.
You might want to have a final step that creates a new runtime Docker image from the built contents, rather than depending on this build layer for all your production content, though; see the multi-stage sketch below.
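A rough sketch of that flow as a multi-stage Dockerfile; the image tags, paths, and a standard Maven layout are assumptions, not part of the original answer:

# Build stage: dependencies are resolved in their own cacheable layer
FROM maven:3-eclipse-temurin-17 AS build
WORKDIR /app
COPY pom.xml .
RUN mvn dependency:go-offline
COPY src ./src
RUN mvn package -DskipTests

# Runtime stage: ships only the built artifact, not the ~/.m2 cache
FROM eclipse-temurin:17-jre
WORKDIR /app
COPY --from=build /app/target/*.jar app.jar
ENTRYPOINT ["java", "-jar", "app.jar"]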
You might want to look at David Delabasee's presentation at QCon London last year:
https://www.infoq.com/presentations/openjdk-containers/
I recommend either mounting the local Maven repository as a volume and using it across Docker images, or using the special local repository (/usr/share/maven/ref/), whose contents are copied into the container's local repository on startup.
The documentation of the official Maven Docker image also points out different ways to achieve better caching of dependencies.
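A minimal sketch of the volume approach with the official image (the tag and paths are illustrative); /usr/share/maven/ref/ works similarly, except the image's entrypoint copies its contents into the container's local repository on startup:

# Share the host's local Maven repository with the build container
docker run -it --rm \
  -v "$HOME/.m2":/root/.m2 \
  -v "$PWD":/usr/src/app \
  -w /usr/src/app \
  maven:3-eclipse-temurin-17 mvn package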
Related
I have a multi-module Maven project. It is quite ancient, and building it requires a special song and dance.
Project structure
root
|__api
|__build
|__flash
|__gwt
|__server
|__service
|__shared
|__target
|__toolset
To build such a project, I have a special script that needs to be executed while at the root of the project.
./build/build_and_deploy.sh
When building on Windows there are a lot of problems (long paths, characters and line separators getting mangled, etc.), so I want to build this project in Docker.
At first I wanted to use the docker-maven-plugin from io.fabric8 as a Maven plugin, but as I understand it, it cannot run the Maven build itself inside Docker.
So I tried to write a Dockerfile and ran into the following problems:
I don't want to copy the .m2 folder into the image; there are a lot of dependencies there, and it would take quite a long time.
I don't want to copy the project sources into the container.
I couldn't run the script ./build/build_and_deploy.sh
How I see the solution to this problem:
Create a Dockerfile with Maven, Java 8, and bash in it.
Use volumes to connect the sources and the Maven repository.
Because I work through a VPN and the script deploys the result, I need a solution that works through it (proxy/port forwarding?).
If you have experience with or examples of a similar setup, or sound advice, I will be glad to hear it.
You can perform the build with Maven inside Docker.
For that you basically trigger something like docker build ., and the rest is inside the Dockerfile.
Start from an image that has Maven, such as the official maven image.
Add your whole project structure
Run your build script
Save your build result
To save your build result, you might want to upload it to some repository, or store it in a mounted volume that is still available after the container run. Alternatively, copy it to the next stage if you use a multi-stage Docker build.
If you want to prevent repeated downloads into the .m2 directory, or you have many other dependencies in there, also mount it as a volume when running the container; a sketch follows below.
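A rough sketch of that setup for the project in the question; the image tag and the project-builder name are assumptions, and the script is the one from the question:

FROM maven:3-jdk-8
# Maven, Java 8, and bash are all present in this base image
WORKDIR /usr/src/project
# Sources and the local Maven repository are mounted at run time
CMD ["bash", "./build/build_and_deploy.sh"]

Build the image once, then run the build with the sources and ~/.m2 mounted as volumes:

docker build -t project-builder .
docker run --rm \
  -v "$PWD":/usr/src/project \
  -v "$HOME/.m2":/root/.m2 \
  project-builder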
I'm trying to build an efficient Docker image that leverages Docker's image layering to decrease the duration and the required bandwidth for uploading to or downloading from the registry. Is there any way to separate my compiled code from the dependencies (external libs) using the Gradle build tool?
My current Dockerfile copies the fat jar to the container, while the most significant part of the jar file is libraries that don’t change between releases.
My current Dockerfile:
FROM openjdk:11-jre-slim
VOLUME /tmp
WORKDIR /app
# Add the application's jar to the container
COPY /build/libs/app-*.jar app.jar
# Run the jar file
ENTRYPOINT ["java","-Djava.security.egd=file:/dev/./urandom","-jar","app.jar"]
Edit 1: I'm trying to achieve this goal without using any plugin (if it's possible).
Well, maybe a tool like https://github.com/GoogleContainerTools/jib is more what you are looking for? It has been made precisely for this kind of use case.
With "pure" Docker, the whole build is done in your Dockerfile, with at the beginning a command to dowload the dependencies, and THEN adding your code to the image with a copy. This way, the first layers can be cached. You should probably start from an official Gradle image then. And since your dependencies will be included, you can even get away with running your software by directly calling Gradle. The image size will be bigger, but only your code would be sent over the network each time, unless your dependencies change.
There are a lot of other options to achieve something similar with Docker, but many of them break the reproducibility of the build, by using local caches, or downloading the dependencies directly from the target machine. They work, but then you lose the "universal container" approach of Docker.
I have a library and a program, both under my control and built using Gradle. What's the best way to develop these two at the same time?
I have set up a private maven repository to distribute the library and that's working, but I don't want to release to that repository every little experiment I make during development. It's slow and disruptive to users of the library.
I tried installing the jar to the local maven repository as explained here: Gradle alternate to mvn install but the project that's using the library is not picking up that newly installed version.
I think you can try to use multi-project builds for that, if possible. But you will likely need to restructure both your current projects to become modules of the same new project.
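A minimal sketch of such a restructuring, with hypothetical module names:

// settings.gradle of the new umbrella project
rootProject.name = 'umbrella'
include 'library', 'application'

// build.gradle of the application module
dependencies {
    // resolved directly from the sibling module, no repository involved
    implementation project(':library')
}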
What's the best way to develop these two at the same time?
It depends on how the team is organized and what your policies are.
Of course, if the team can use a Git repo and has access to the source code, you can just share via Git without pushing a new version to the Maven server for every commit or push.
Otherwise, if other users can only consume the final library, you have to push each version to the Maven server.
I have set up a private repository to distribute the library and that's working, but I don't want to release to that repository every little experiment I make during development. It's slow and disruptive to users of the library.
Every Maven repository manager hosts two different repositories:
release
snapshot
Usually the release repo is used only for stable releases, and the snapshot repo is used to publish small changes, beta releases, and so on.
In any case, it is not required that every change in the code is pushed to the Maven repo (it is your choice).
It's slow
The time to upload artifacts is usually not that big; in any case, you can consider pushing releases to the Maven repo from a CI server.
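A minimal sketch of publishing development builds to the snapshot repo with Gradle's maven-publish plugin; the coordinates and URL are illustrative:

// build.gradle of the library
plugins {
    id 'java'
    id 'maven-publish'
}

group = 'com.example'
version = '1.2.0-SNAPSHOT' // the -SNAPSHOT suffix marks a development build

publishing {
    publications {
        maven(MavenPublication) {
            from components.java
        }
    }
    repositories {
        maven {
            // consumers pick up new snapshots without a formal release
            url = 'https://repo.example.com/snapshots'
        }
    }
}

Running ./gradlew publish then uploads the snapshot; stable versions drop the suffix and go to the release repo.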
The best method seems to be to make one project include the other one when present by adding:
if (file("../libraryproject").exists()) {
includeBuild "../libraryproject"
}
to the settings.gradle file of the project that uses the library. That can be committed to the source code repo, because when that directory doesn't exist, the dependency is resolved in the traditional way.
This is called a composite build in the Gradle world. IntelliJ seems to handle it properly, and there's a recorded webcast showing the whole setup: https://www.youtube.com/watch?v=grPJanXfRPg
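For context, the consuming project keeps its normal dependency declaration; when the included build is present, Gradle substitutes it for the published artifact (the coordinates here are illustrative):

// build.gradle of the project that uses the library
dependencies {
    // from the composite build when ../libraryproject exists,
    // otherwise from the private Maven repository
    implementation 'com.example:libraryproject:1.0.0'
}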
I currently have a Jenkins instance installed on a Development box. This builds fine and deploys to our development environment without any issues.
During the build process my project makes use of a single properties file containing details such as a database connection URL (Details such as these will obviously vary depending on the environment I'm pointing to).
What I would like to know is: what is the best way to configure my project so that when I release to production, the WAR file built by Jenkins contains the production properties instead of the development ones?
(Note I am also using Maven in my project).
I know 3 options:
We have used Maven profiles for that in the past, but they have the disadvantage that Maven's Release Plugin doesn't work with profiles, so we had to change the versions manually and were unable to deploy the artifacts to a remote repository like Nexus.
Another option is Maven's Assembly Plugin. That can be used together with the Release Plugin, as far as I know.
We decided to write a simple tool that changes the WAR files after the Maven build; it runs in a separate Jenkins job. The idea is that building and configuring are two separate steps: the artifacts coming out of Maven are always in a default configuration, and when we need the configuration for a production release, we start a Jenkins job that reconfigures the WAR files. A sketch of that step follows below.
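A minimal sketch of such a post-build configuration step, assuming the properties file sits under WEB-INF/classes inside the WAR (all paths here are illustrative):

# Replace the default properties inside the already-built WAR
# with the production variant, without rebuilding the project
mkdir -p staging/WEB-INF/classes
cp config/prod/project.properties staging/WEB-INF/classes/
jar uf target/app.war -C staging WEB-INF/classes/project.properties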
You can create different Maven profiles, like dev and prod, and in each profile's settings use/filter the corresponding resource files, e.g. .../(dev|test|prod)/project.properties. In Jenkins, when you build for a given platform, build with -Pdev or -Pprod to get the WAR for the right target.
You may want to read up on Maven profiles and Maven resource filtering for the detailed configuration.
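A minimal sketch of such a profile in the pom.xml; the directory layout is illustrative:

<profiles>
  <profile>
    <id>prod</id>
    <build>
      <resources>
        <resource>
          <!-- used only when building with -Pprod -->
          <directory>src/main/resources/prod</directory>
        </resource>
      </resources>
    </build>
  </profile>
</profiles>

Then mvn clean package -Pprod builds the WAR with the production properties.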
Something not related: connect to the database via JNDI if possible.
I am on NetBeans and don't know Maven much. Whenever I import or open some Maven project, it starts downloading something from some central repository, sometimes huge. It downloads things into .m2\repository.cache\m2e. I have limited bandwidth and don't want this. How do I stop it?
I have set Options > Java > Maven > Dependency Download Strategy to never. I also tried mvn -o install and mvn -o for offline mode. Not solved.
The Maven way is to fetch whatever the project says it needs that you have not already downloaded to your local repository.
The huge file is the index of what is actually available in Maven Central, and for some reason unknown to me it is downloaded on a regular basis. If you download it once, it should be kept for future sessions.
Maven downloads each dependency to the local repository only once, not again and again.
Whether you have limited or unlimited bandwidth, you have to download the dependencies to build and run your project.
Maven has a very modular architecture. That means that what you get when you download the Maven distribution is really just a small core.
The rest is downloaded from a Maven artifact repository, like Maven Central (which is the default repo).
Note that this applies not only to dependencies (the libraries your project uses), but also to plugins (i.e. the things that compile, package, and otherwise build the project). Hence the large number of downloads.
Like the other answers said, if you don't delete your local repository, it should eventually contain all the artifacts (dependencies and plugins) you need, without re-downloading. The only exception is SNAPSHOT dependencies, which can get re-downloaded periodically, depending on what's in your POM and settings.
Ultimately, you have two possibilities:
If you have access to a higher-bandwidth connection somewhere, you can build the projects while using it, and your local repo will keep the needed artifacts.
If you have several computers/configurations behind a local network, you can set up a Maven repository manager, like Nexus or Artifactory, and use it as a local mirror (see the sketch below). Note that it still needs to download the artifacts the first time as well.
But there isn't much else you can do. "Maven downloading the Internet" is, unfortunately in your case, by design.
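A minimal sketch of pointing Maven at such a local mirror in ~/.m2/settings.xml; the host name and repository path are illustrative:

<settings>
  <mirrors>
    <mirror>
      <id>local-mirror</id>
      <!-- route all requests for Maven Central through the local manager -->
      <mirrorOf>central</mirrorOf>
      <url>http://nexus.internal:8081/repository/maven-public/</url>
    </mirror>
  </mirrors>
</settings>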