Small question regarding Maven, the mvn deploy command, and pushing to Docker Hub, please.
I have a small Spring Boot project, very straightforward.
I build and create a Docker image from this project using mvn spring-boot:build-image. After this command is run, I get a jar, but I also get a Docker image which I can see using docker images, so I am very happy.
My next step consists of running a small two-line shell script which does:
docker tag myapp dockerhub/myapp
docker push dockerhub/myapp
Therefore, my pipeline can be summarized as:
step 1: Maven to clean install, build the jar, and generate the Docker image
step 2: docker tag + docker push to Docker Hub.
I am wondering: is it possible to simply use the mvn deploy command and have the deploy do the tagging and pushing to Docker Hub, please?
What I tried:
Running the deploy command, but it is not pushing to Docker Hub at all.
What would be a way to leverage mvn deploy, or maybe other Maven commands, to have a pipeline that builds the jar + generates the Docker image + tags it + pushes it, please?
I cannot afford any CI/CD pipeline, and this is light enough, so I am hoping for a Maven command which can also push to Docker Hub.
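One direction I have been looking at, but have not yet verified, is the publish support built into spring-boot:build-image itself (available since Spring Boot 2.4); a rough sketch of the configuration I have in mind, where the image name and registry credentials are placeholders:

<plugin>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-maven-plugin</artifactId>
    <configuration>
        <image>
            <!-- placeholder image name; ${project.version} keeps the tag in sync with the pom -->
            <name>docker.io/myuser/myapp:${project.version}</name>
            <!-- push the image whenever the build-image goal runs -->
            <publish>true</publish>
        </image>
        <docker>
            <publishRegistry>
                <!-- placeholder Docker Hub credentials -->
                <username>myuser</username>
                <password>${env.DOCKERHUB_TOKEN}</password>
            </publishRegistry>
        </docker>
    </configuration>
    <executions>
        <execution>
            <goals>
                <goal>build-image</goal>
            </goals>
        </execution>
    </executions>
</plugin>

If I understand the plugin documentation correctly, with that execution in place mvn deploy (which runs the package phase first) would build and push the image in one go.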
Thank you
Related
I have a multi-module Maven project. It is quite ancient, and building it involves a fair amount of ceremonial dancing.
Project structure
root
|__api
|__build
|__flash
|__gwt
|__server
|__service
|__shared
|__target
|__toolset
To build such a project, I have a special script that needs to be executed from the root of the project.
./build/build_and_deploy.sh
When building on Windows there are a lot of problems (long paths, lost symbols and line separators, etc.), so I want to build this project in Docker.
At first I wanted to hook up the docker-maven-plugin from io.fabric8 as a plugin in Maven, but as I understand it, it cannot run the Maven build itself inside Docker.
So I tried to write a Dockerfile and ran into the following problems:
I don't want to copy the .m2 folder into the image; there are a lot of dependencies there, and it would take quite a long time.
I don't want to copy the project sources into the container.
I couldn't run the script ./build/build_and_deploy.sh
How I see the solution to this problem:
Create a Dockerfile with Maven, Java 8, and bash in it
Use volumes to mount the sources and the Maven repository
Because I work through a VPN and the script also deploys, I need to find a way to make that work from inside the container (proxy/port forwarding?)
If you have experience with, or examples of, a similar setup, or any solid advice, I would be glad to hear it.
You can perform the build with Maven inside Docker.
For that you basically trigger something like docker build ., and the rest is inside the Dockerfile.
Start off from an image that has Maven, such as the official maven image.
Add your whole project structure
Run your build script
Save your build result
To save your build result, you might want to upload it to some repository, or store it in a mounted volume that is still available after the container has run. Alternatively, copy it to the next stage if you use a multi-stage Docker build.
If you want to prevent repeated downloads into the .m2 directory, or you have many other dependencies in there, also mount it as a volume when running the container.
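A minimal sketch of this setup, assuming the build script is executable in the container and using the official maven image (image tags and paths are illustrative):

FROM maven:3.6-jdk-8
WORKDIR /workspace
# the project sources and the local Maven repository are mounted at run time instead of being copied in
CMD ["./build/build_and_deploy.sh"]

Build and run it from the project root, mounting the sources and the .m2 cache:

docker build -t project-builder .
docker run --rm \
    -v "$PWD":/workspace \
    -v "$HOME/.m2":/root/.m2 \
    project-builder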
I am trying to run a spring-boot maven project inside a docker environment. So the setup is as follows:
Docker is set up and installs Java, etc. (done only once)
App is run (can be any number of times)
What I am experiencing
Every time I run the Spring Boot project with mvn spring-boot:run, it downloads all the required dependencies from the pom.xml (Java, Maven, etc. are preinstalled in the Docker image) and then runs the project.
What I am trying to do
This process of reinstalling every time is redundant and time-consuming, so I want to delegate this installation thing to the docker as well. Ideally, using the pom.xml to do the installations, though alternative ways are also welcome.
What I have tried so far
Installing npm by following a good tutorial, but it fails in Docker since we can't restart the terminal during docker build, and source ~/.bash_profile doesn't seem to work either.
Building the project directly in Docker (with RUN mvn clean install --fail-never) and copying both the npm and node folders to the directory where I run the app. But that doesn't seem to work either, as it installs them every time even without any change.
Can anyone please help me here? This problem has the project stuck. Thanks a lot!
From your question I understand that, in the Dockerfile, you just install Java, Maven, etc. but do not build your project using mvn clean install before executing mvn spring-boot:run (which would be redundant anyway, because mvn spring-boot:run does the build for you before starting the application).
You cannot skip downloading the Maven dependencies when running in containers, because containers start from a clean state every time they are spun up. So the dependencies will be downloaded either when you call mvn clean install or when you call mvn spring-boot:run.
The most you can do is build the jar beforehand, using your DevOps pipeline, and in the Dockerfile just copy the built jar and execute it.
Example Dockerfile in this case:
FROM openjdk:8-jdk-alpine
ARG JAR_FILE=target/*.jar
COPY ${JAR_FILE} app.jar
ENTRYPOINT ["java","-jar","/app.jar"]
Here the previously built artifact is already available under target/.
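If the build itself has to happen inside Docker, another option (not part of the answer above, just a sketch assuming a standard Maven layout) is a multi-stage Dockerfile that downloads the dependencies in their own layer, so they are fetched again only when the pom.xml changes:

# build stage: the dependency download is cached in its own layer
FROM maven:3.6-jdk-8 AS build
WORKDIR /workspace
COPY pom.xml .
RUN mvn -B dependency:go-offline
COPY src ./src
RUN mvn -B package -DskipTests

# runtime stage: copy only the built jar
FROM openjdk:8-jdk-alpine
COPY --from=build /workspace/target/*.jar app.jar
ENTRYPOINT ["java","-jar","/app.jar"]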
I would like to ask for some recommendations for a development workflow for an application with the stack mentioned in the title. Before I switched to using Docker, all I had to do was:
Go to start.spring.io and download project starter
Import it into intelliJ
Develop features, hit the green arrow to start the app or the red square to stop it, and repeat with every change in the code
Now that I have switched to Docker, after step 2 I do this:
Create Dockerfile and docker-compose.yml (where I start my app and also mysql service).
Right-click on docker-compose and hit run. It then builds my app image (I use the --build flag in my run configuration so it rebuilds the images every time I hit run on docker-compose) and starts the two services, app and mysql, and everything works.
The problem is that when I change something in my code I then have to:
Execute the mvn clean and install steps manually, to produce a new jar under the /target folder
Stop the previous docker-compose run and start it again, so it builds new images from what is in /target
I would rather have something like a one-click solution, as it was before I started to use Docker: when I change code, I press only one button and a new image is generated and run with all changes applied. Is that possible? Am I missing something? Could you tell me if your workflow is similar to mine? Maybe you could recommend some tools or a different config?
You can set up Spring Boot dev tools to live reload inside a Docker container.
Ensure the spring-boot-devtools dependency is in your pom.xml:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-devtools</artifactId>
    <scope>runtime</scope>
    <optional>true</optional>
</dependency>
Then create a docker-compose.yml file with
version: '3.1'
services:
  backend:
    image: maven:3.6.3-jdk-8
    command: mvn spring-boot:run
    ports:
      - 8000:8000
    volumes:
      - .:/usr/src/mymaven:rw
    working_dir: /usr/src/mymaven
This uses the Maven Docker image, so when you run docker-compose up it starts that image and maps your source code folder into the container as a volume. Maven then runs the application using mvn spring-boot:run.
Whenever you make a change to the source code, the application reloads, with the same behaviour as running mvn spring-boot:run locally.
Create a separate release module - project-name-release - which brings down the old image, builds the new image, and runs/publishes it.
Use the Docker Maven plugin in its config.
Sample project - https://github.com/spring-guides/gs-spring-boot-docker
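For what it is worth, a rough sketch of what the docker-maven-plugin configuration in such a release module might look like (plugin version, image name, and phase bindings are illustrative, not taken from the sample project):

<plugin>
    <groupId>io.fabric8</groupId>
    <artifactId>docker-maven-plugin</artifactId>
    <version>0.40.2</version>
    <configuration>
        <images>
            <image>
                <name>myorg/project-name:${project.version}</name>
                <build>
                    <dockerFileDir>${project.basedir}</dockerFileDir>
                </build>
            </image>
        </images>
    </configuration>
    <executions>
        <execution>
            <id>rebuild-image</id>
            <phase>package</phase>
            <!-- intended to stop any running container and clear the old image before rebuilding -->
            <goals>
                <goal>stop</goal>
                <goal>remove</goal>
                <goal>build</goal>
            </goals>
        </execution>
        <execution>
            <id>push-image</id>
            <phase>deploy</phase>
            <goals>
                <goal>push</goal>
            </goals>
        </execution>
    </executions>
</plugin>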
I created a Docker container whose purpose is to run a Docker image that implements a REST API. The REST API is created with Java (using the Eclipse IDE, Maven and Spring Boot). When the jar file is created, it is titled: workserver-0.0.1-SNAPSHOT.jar
The code is committed to a GitLab server. When this takes place, a Jenkins job is created. The Jenkins job pulls down the code from the GitLab repository, creates a .jar file and then goes through the actions to turn the .jar file into a Docker image (or rather a .zip version of the Docker image).
"scp" is used move the zip file to a target system - where - the .zip file is unpacked (revealing the Docker image) and a container is started. The thing is, the Docker image being used has a version of "latest" (ex: imagename:latest).
I would like to use versions in this scenario, starting from Eclipse (i.e. a pom.xml file producing a target workserver-2.2.13.jar file would eventually lead to a Docker image named imagename:2.2.13).
I have seen here how one can assign a version number in Docker:
Adding tags to docker image from jenkins
I have also seen that one can use tags and version numbers in Git :
ex: git tag -a v2.5 -m 'Version 2.5'
As mentioned above, the Maven pom.xml file contains instructions to produce a .jar file called:
workserver-0.0.1-SNAPSHOT.jar
The system is working fine. I can commit a change in Eclipse and in a few minutes, a new version of the Docker container has been spun up on the delivery system - ready for use.
The issue I have now is setting up the version numbers.
Any guidance in this area would be greatly appreciated.
TIA
I would recommend using the fabric8io docker maven plugin to create the docker image using maven. This plugin allows you to build, run, and push docker images.
In particular, you can setup the name field in the plugin configuration to be:
<name>workserver:%l</name>
The %l will be resolved to the Maven project version (or to latest for -SNAPSHOT versions), which matches the jar version. You can run the plugin explicitly using:
mvn io.fabric8:docker-maven-plugin:build
Or you can set the packaging in the pom to be:
<packaging>docker-build</packaging>
This will build the image whenever you run mvn package and will push the image on mvn deploy.
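A minimal sketch of that image configuration (plugin version and Dockerfile location are assumptions on my part):

<plugin>
    <groupId>io.fabric8</groupId>
    <artifactId>docker-maven-plugin</artifactId>
    <version>0.40.2</version>
    <!-- extensions are required if you switch the packaging to docker-build -->
    <extensions>true</extensions>
    <configuration>
        <images>
            <image>
                <!-- %l resolves to the project version, or to latest for -SNAPSHOT versions -->
                <name>workserver:%l</name>
                <build>
                    <dockerFileDir>${project.basedir}</dockerFileDir>
                </build>
            </image>
        </images>
    </configuration>
</plugin>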
I would like to build a test environment with Docker, where I can remotely send JUnit test classes (including the code that is tested), execute the tests and retrieve the results.
I found some articles which explained how to use Docker for testing a database connection / writing into a Redis instance, but not how I can simply run my tests in Docker and retrieve the results.
Do you have any recommendations for how you would actually achieve this?
I don't know much about Jenkins, but might this solve my problem?
Is there any good framework out there for this?
In a Dockerfile, check out your code and run a "maven test" command, redirecting the result to a file that is in a mounted directory.
Each time you build the Dockerfile, you run the unit tests on your project.
With Docker you also have a "docker test" command. I don't know if there is a plugin to use it on Jenkins.
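One way I can see to combine "maven test" with a mounted results directory (volumes are not available during the image build, so in this sketch the tests run when the container starts; the image tag, file names, and paths are just placeholders):

# Dockerfile.test (sketch)
FROM maven:3.6-jdk-8
WORKDIR /app
COPY . ./
# keep going even when tests fail, so the reports are always produced,
# then copy them into the mounted results directory
CMD mvn test -Dmaven.test.failure.ignore=true && cp -r target/surefire-reports /results/

Build it once, then run it with a host directory mounted for the results:

docker build -f Dockerfile.test -t my-tests .
docker run --rm -v "$PWD/test-results":/results my-tests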
One way I found that works (using Gradle) is as follows. I know you are specifically referencing JUnit as your testing framework, but I actually think something similar to this could work.
Dockerfile (I called mine Dockerfile.UnitTests):
FROM gradle:jdk8 AS test-stage
WORKDIR /app
COPY . ./
RUN gradle clean
RUN gradle test
FROM scratch AS export-stage
COPY --from=test-stage /app/build/reports/tests/test/* /
I then run this with (in Gitbash on Windows 10):
> DOCKER_BUILDKIT=1 docker build -f Dockerfile.UnitTests --output type=tar,dest=UnitTests.tar .
This results in a tar file containing the test results displayed in an html file.
I executed the above in a Gitlab CI/CD pipeline and then sent the results to a web API for analysis.
A couple of assumptions:
My project is set up for Gradle builds, so from the root of my project I have the structure src/test/java/groupname/projectname/testfile.java
I am working in Windows 10, targeting Linux containers, and using Git Bash.
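If the project were built with Maven instead of Gradle, I would expect a roughly equivalent Dockerfile to look like the sketch below (untested; the report path is Maven's default surefire-reports directory):

FROM maven:3.6-jdk-8 AS test-stage
WORKDIR /app
COPY . ./
# ignore test failures so the reports are still exported when tests fail
RUN mvn test -Dmaven.test.failure.ignore=true

FROM scratch AS export-stage
COPY --from=test-stage /app/target/surefire-reports/ /

The same DOCKER_BUILDKIT=1 docker build ... --output command should then produce a tar file of the surefire reports.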