I have a multi-module Maven project. It is quite ancient, and building it requires a special song and dance.
Project structure
root
|__api
|__build
|__flash
|__gwt
|__server
|__service
|__shared
|__target
|__toolset
To build such a project, I have a special script that must be executed from the root of the project.
./build/build_and_deploy.sh
When building on Windows there are a lot of problems (long paths, mangled characters and line separators, and so on), so I want to build this project in Docker.
At first I wanted to hook up the docker-maven-plugin from io.fabric8 as a Maven plugin, but as I understand it, it cannot run the Maven build itself inside Docker.
So I tried to write a Dockerfile and ran into the following problems:
I don't want to copy the .m2 folder into the image; there are a lot of dependencies in it, and copying them takes quite a long time.
I don't want to copy the project sources into the container either.
I couldn't get the script ./build/build_and_deploy.sh to run.
Here is how I see the solution to this problem:
Create a Dockerfile based on an image with Maven, Java 8, and bash.
Use volumes to mount the sources and the Maven repository.
Because I work through a VPN and the script deploys to a remote server, I need a way to route the container's traffic through it (a proxy or port forwarding?).
If you have experience with a similar setup, example scripts, or sound advice, I would be glad to hear it.
You can perform the build with Maven inside Docker.
For that you basically trigger something like docker build ., and the rest is inside the Dockerfile.
Start off from an image that has Maven, such as the official maven image.
Add your whole project structure
Run your build script
Save your build result
To save your build result, you might want to upload it to some repository, or store it in a mounted volume that is still available after the container run. Alternatively, copy it to the next stage if you use a multi-stage Docker build.
If you want to prevent repeated downloads into the .m2 directory, or have many other dependencies in there, also mount it as a volume when running the container.
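A minimal sketch of that volume-based variant, assuming the official maven:3-jdk-8 image (which already ships Java 8 and bash) and a root user inside the container, so the repository mounts at /root/.m2; adjust the paths to your setup:

docker run --rm \
    -w /usr/src/project \
    -v "$PWD":/usr/src/project \
    -v "$HOME/.m2":/root/.m2 \
    maven:3-jdk-8 \
    bash ./build/build_and_deploy.sh

Run from the project root, this mounts the sources and the host Maven repository instead of copying them, which covers the first two concerns. For the VPN question, running the container with --network host (Linux hosts only) or passing HTTP_PROXY/HTTPS_PROXY environment variables with -e are the usual starting points.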
Related
I'm trying to build an efficient Docker image that leverages Docker’s image layering to decrease the duration and the required bandwidth for uploading to or downloading from the repository. Is there any way to separate my compiled code and dependencies (external libs) using Gradle build tools?
My current Dockerfile copies the fat jar to the container, while the most significant part of the jar file is libraries that don’t change between releases.
My current Dockerfile:
FROM openjdk:11-jre-slim
VOLUME /tmp
WORKDIR /app
# Add the application's jar to the container
COPY /build/libs/app-*.jar app.jar
# Run the jar file
ENTRYPOINT ["java","-Djava.security.egd=file:/dev/./urandom","-jar","app.jar"]
Edit 1: I'm trying to achieve this goal without using any plugin (if it's possible).
Well, maybe a tool like https://github.com/GoogleContainerTools/jib is more what you are looking for? It has been made precisely for this kind of use case.
With "pure" Docker, the whole build is done in your Dockerfile: first a command that downloads the dependencies, and THEN a copy that adds your code to the image. This way, the first layers can be cached. You should probably start from an official Gradle image, then. And since your dependencies will be included, you can even get away with running your software by directly calling Gradle. The image size will be bigger, but only your code will be sent over the network each time, unless your dependencies change.
There are a lot of other options to achieve something similar with Docker, but many of them break the reproducibility of the build, by using local caches, or downloading the dependencies directly from the target machine. They work, but then you lose the "universal container" approach of Docker.
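A rough sketch of that ordering with an official Gradle image; the file names, the image tag, and the dependency-resolution step are assumptions about your build:

FROM gradle:7-jdk11
WORKDIR /home/gradle/app
# Build scripts first: this layer, and the dependency download below,
# stay cached as long as the dependency declarations do not change
COPY build.gradle settings.gradle ./
RUN gradle --no-daemon dependencies
# Sources last: a code change only invalidates the layers from here on
COPY src ./src
RUN gradle --no-daemon build

Whether gradle dependencies pulls everything your build needs varies by project; some builds use a dedicated resolution task instead. The point is only that dependency resolution happens before the sources are copied in.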
I have a question with the correct flow for dockerizing Java/Spring Boot/Maven application in order to use docker layers.
Currently it looks like this -> Maven updates the versions(using the maven release plugin) -> docker image is building with the custom dockerfile.
However the problem is, that all of the dependencies are being downloaded every time(even when the poms are not modified, due to the fact, that every new version which I want to build has different version value in the pom's(it is being updated using this plugin).
I would want to use the docker layers, so that our builds could be faster than currently it is. Is it possible to do so? If it is, how? Should I use different plugin or are there other options that should be taken into account(maybe the whole flow is bad and it should be done in a different way?)
One possibility is to set up a local proxy, like a Nexus or Artifactory instance, which you can use to cache contents from the internet. You'll still be downloading them, but hopefully that will be faster.
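If you go this way, pointing Maven at the proxy is a one-time settings.xml change; the URL below is a placeholder for your Nexus/Artifactory instance:

<settings>
  <mirrors>
    <mirror>
      <id>internal-proxy</id>
      <mirrorOf>*</mirrorOf>
      <url>https://nexus.example.com/repository/maven-public/</url>
    </mirror>
  </mirrors>
</settings>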
The second possibility is to create a Docker layer that downloads the dependencies into the local repository inside the Docker image, followed by the actual build. It would look something like:
COPY pom.xml .
RUN mvn dependency:go-offline
COPY src ./src
RUN mvn package
That way, your Docker image would capture all of the dependencies and plugins in the local ~/.m2/repository cache, and the build would resolve them locally. When you re-run the build step, it should reuse the previously cached layer.
You'll need to rebuild the dependency layer over time as your project evolves, but it speeds the build up by not downloading everything again.
You might want to have a final step that creates a new runtime Docker image from the built contents rather than depending on this layer for all your production content though.
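A hedged sketch of the whole thing as a multi-stage build; the image tags and the jar name are assumptions about your project:

FROM maven:3-jdk-8 AS build
WORKDIR /usr/src/app
# Dependency layer: cached until the pom changes
COPY pom.xml .
RUN mvn dependency:go-offline
# Build layer: rebuilt when the sources change
COPY src ./src
RUN mvn package

# Runtime image: only the artifact, no ~/.m2 cache or sources
FROM openjdk:8-jre-slim
COPY --from=build /usr/src/app/target/myapp.jar /app.jar
ENTRYPOINT ["java", "-jar", "/app.jar"]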
You might want to look at David Delabasee's presentation at QCon London last year:
https://www.infoq.com/presentations/openjdk-containers/
I recommend either mounting the local Maven repository as a volume and reusing it across Docker images, or using a special local repository (/usr/share/maven/ref/) whose contents are copied into the local repository on container startup.
The documentation of the official Maven Docker images also points out different ways to achieve better caching of dependencies.
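In shorthand, the volume variant looks like this, assuming the official maven image and a root container user:

docker run --rm -w /usr/src/app \
    -v "$PWD":/usr/src/app \
    -v "$HOME/.m2":/root/.m2 \
    maven:3-jdk-8 mvn clean package

For the second variant, anything your Dockerfile copies into /usr/share/maven/ref/ (a settings.xml, a pre-populated repository/) is copied into the local repository by the image's entrypoint when the container starts.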
I'm currently building a desktop java application in a very clumsy manner. The application is deployed on Windows, Mac and Linux. Here's my build process now:
On Windows:
Update local repository
Fire up Eclipse
Refresh the project
Double click the .jardesc file to generate an executable jar file
Commit the executable jar to source control
Open up the .nsi script and click the build button (I have the NSIS plugin installed) to produce the .exe installer
Upload installer to ftp server to publish
On Mac:
Update local repository
Run shell script to generate .dmg file using .jar in source control
Upload to ftp server to publish
On Linux:
Update local repository
Run shell script to generate .deb file using .jar in source control
Upload to ftp server to publish
I'd also like to include some extra steps in my build in the future, such as:
Setting build date
Setting the HEAD git commit-id
Performing some code obfuscation
Any suggestions on how I can streamline and speed up this process?
If you are serious about having a good build system, then I'd recommend learning and using Maven, which provides:
Comprehensive project build lifecycle management based on a declarative project definition (pom.xml)
A huge range of plugins, which I expect will be able to handle all the specific build steps you require
Very good integration with Eclipse
Full dependency management (including automatic resolution and download of dependencies)
This is not for the faint-hearted (Maven is a complex beast), but in the long run it is a great solution.
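To give a flavour of the declarative definition, a minimal pom.xml is little more than coordinates plus dependencies (the names below are placeholders):

<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>desktop-app</artifactId>
  <version>1.0.0</version>
  <packaging>jar</packaging>
  <dependencies>
    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-api</artifactId>
      <version>1.7.36</version>
    </dependency>
  </dependencies>
</project>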
First step would be to just get everything building without Eclipse.
You might also want to consider using something like Jenkins to automate some of this. You'll still require build scripts.
A solution could look like
Update repository.
Jenkins detects update and builds the jar.
Jenkins saves the jar to some location.
Then you can have separate builds for each OS, also running in Jenkins. These could be triggered automatically on successful completion of the first build. These would each:
Pick up the jar from the previous build.
Publish the OS specific binary to an FTP site.
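As a skeleton, and assuming the build has been moved to Maven as suggested above, the first Jenkins job could be a declarative pipeline like this (stage names and the artifact path are placeholders):

pipeline {
    agent any
    stages {
        stage('Build jar') {
            steps {
                sh 'mvn clean package'
            }
        }
        stage('Archive') {
            steps {
                // makes the jar available to the downstream OS-specific jobs
                archiveArtifacts artifacts: 'target/*.jar'
            }
        }
    }
}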
Ant is a good start, but you may also want to look at Apache Ivy or Maven, as these will help a bit with managing your build outputs and dependencies.
You should have a look at Ant: https://ant.apache.org/
Apache Ant is a Java library and command-line tool whose mission is to drive processes described in build files as targets and extension points dependent upon each other. The main known usage of Ant is the build of Java applications.
Also, a long list of build systems: https://en.wikipedia.org/wiki/List_of_build_automation_software
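For a flavour of Ant, a minimal build.xml that compiles and jars a project looks like this (directory names and the main class are placeholders):

<project name="desktop-app" default="jar">
  <target name="compile">
    <mkdir dir="build/classes"/>
    <javac srcdir="src" destdir="build/classes" includeantruntime="false"/>
  </target>
  <target name="jar" depends="compile">
    <jar destfile="build/app.jar" basedir="build/classes">
      <manifest>
        <attribute name="Main-Class" value="com.example.Main"/>
      </manifest>
    </jar>
  </target>
</project>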
I have a multi-module Maven project, and I created a new module that depends on 3 other modules. (I already have a web-app Maven module that produces a .war file; now I need this new one.)
This module's output is a .jar, and it also has a few resources:
spring context xml file
properties file
Now I want to produce a production ready folder so I can upload it to my server. I am hoping maven can do this for me.
I need the following layout:
myjar.jar
/libs/ (the 3 other Maven modules that are dependencies)
/resources
Also, there are some common dependencies that my parent pom.xml has, like slf4j/log4j, that I also need to package.
It would be cool if I could pass a switch to mvn that produces this, like:
mvn clean install production
I plan on running this on my server via the command line.
I think what you are looking for is a Maven Assembly:
https://maven.apache.org/plugins/maven-assembly-plugin/
You can use profiles to disable the generation of the assembly by default (can speed up the development process).
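A sketch of a descriptor that produces roughly the layout you describe; the id and output paths are illustrative, not canonical:

<assembly>
  <id>production</id>
  <formats>
    <format>dir</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <dependencySets>
    <dependencySet>
      <outputDirectory>libs</outputDirectory>
      <useProjectArtifact>false</useProjectArtifact>
    </dependencySet>
  </dependencySets>
  <fileSets>
    <fileSet>
      <directory>target</directory>
      <outputDirectory>.</outputDirectory>
      <includes>
        <include>*.jar</include>
      </includes>
    </fileSet>
    <fileSet>
      <directory>src/main/resources</directory>
      <outputDirectory>resources</outputDirectory>
    </fileSet>
  </fileSets>
</assembly>

With format dir you get the folder on disk; switch to zip or tar.gz if you would rather upload a single archive.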
@puce is right in that you may be best off using the Assembly Plugin. What you can't do easily is add another lifecycle 'production' to Maven. If you have time you could write a plugin to do this, but you might be better off using a profile called 'production' or 'prod-deploy' to enable the copying into place on the server.
mvn clean install -Pprod-deploy
One thing to remember with Maven is that it is very good at building projects using its conventions, but it is pretty bad at scripting things that happen outside of the build lifecycle.
I have on several occasions used external scripting tools such as Ant/Python/Bash or Groovy to first run the build using mvn and then script the deployment in a more natural language.
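The profile itself is only a few lines in the pom, with the assembly execution bound inside it so it runs only when the profile is active (the plugin version and descriptor path are placeholders):

<profiles>
  <profile>
    <id>prod-deploy</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-assembly-plugin</artifactId>
          <version>3.6.0</version>
          <executions>
            <execution>
              <phase>package</phase>
              <goals>
                <goal>single</goal>
              </goals>
              <configuration>
                <descriptors>
                  <descriptor>src/assembly/production.xml</descriptor>
                </descriptors>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>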
Maven is intended for building, not for deployment to production. For that purpose I would recommend tools like Chef or Puppet. From a technical point of view it is of course possible to handle such things via Maven. It is also possible to build on a CI solution like Jenkins; furthermore, Jenkins can run a script that does the deployment to production.
I have a small Maven project that builds a Java jar file. I added a plugin (maven-antrun-plugin) in order to run it during Maven's build phase. This also works on the build server (Continuum), which is good.
Now I would also like to copy the artifact jar to another server. What is the best way to do that? I saw that you can make Maven execute a bash script; would that be a good way?
thanks!
It depends on your server and what options you have for uploading jars to it. One of the options is the Maven Wagon plugin, which supports a number of protocols, including SSH, FTP, and WebDAV.
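A sketch of the Wagon route with the upload-single goal; the server id, URL, and file name are placeholders, and the credentials for the server id go in your settings.xml. For scp you may also need the wagon-ssh provider as a plugin dependency:

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>wagon-maven-plugin</artifactId>
  <version>2.0.2</version>
  <executions>
    <execution>
      <id>upload-jar</id>
      <phase>deploy</phase>
      <goals>
        <goal>upload-single</goal>
      </goals>
      <configuration>
        <serverId>my-server</serverId>
        <fromFile>target/myapp.jar</fromFile>
        <url>scp://my-server.example.com/opt/apps</url>
      </configuration>
    </execution>
  </executions>
</plugin>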