Using Docker in development for Java EE applications - java

I will add 300 points as bounty
I have recently started taking a closer look at Docker and how I can use it to get new team members up and running with the development environment faster, as well as to ship new versions of the software to production.
I have some questions regarding how and at what stage I should add the Java EE application to the container. As I see it there are multiple ways of doing this.
This WAS the typical workflow (in my team) before Docker:
Developer writes code
Developer builds the code with Maven producing a WAR
Developer uploads the WAR in the JBoss admin console / or with Maven plugin
Now that Docker has come around, I am a little confused about whether I should create and configure the images I need so that all that is left to do, when you run the JBoss WildFly container, is to deploy the application through the web admin console. Or should I create a new container each time I build the application in Maven, adding the WAR with the ADD command in the Dockerfile, and then just run the container without ever deploying to it once it is started?
In production I guess the last approach is what is preferred? Correct me if I am wrong.
But in development how should it be done? Are there other workflows?

With the latest version of Docker, you can achieve this easily with Docker Links, Docker Volumes and Docker Compose. More information about these tools is available on the Docker site.
Back to the workflow you have mentioned: a typical Java EE application requires an application server and a database server. Since you do not mention in your post how the database is set up, I will assume that your development environment has a separate database server for each developer.
With those assumptions, I would suggest the following workflow:
Pull the base WildFly application server image from the official repository with the docker pull command:
docker pull jboss/wildfly
Run the base application server with:
docker run -d -it -p 8080:8080 -p 9990:9990 --name baseWildfly jboss/wildfly
The application server is now running; you need to configure it to connect to your database server, set up the datasource, and apply any other configuration necessary to start your Java EE application.
For this, you need to log into a bash terminal inside the JBoss container:
docker exec -it baseWildfly /bin/bash
You are now in the container's terminal. You can configure the application server as you would in any Linux environment.
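As a sketch, the datasource setup can also be scripted with the WildFly CLI rather than done in the web console. Every name, credential and the "database" hostname below are placeholder assumptions, and the PostgreSQL JDBC driver is assumed to be installed already:

```
# Sketch only: register a Postgres datasource from inside the container.
# AppDS, appdb, app/secret and the "database" host are assumptions.
/opt/jboss/wildfly/bin/jboss-cli.sh --connect \
  --command="data-source add --name=AppDS \
    --jndi-name=java:jboss/datasources/AppDS \
    --driver-name=postgresql \
    --connection-url=jdbc:postgresql://database:5432/appdb \
    --user-name=app --password=secret"
```

Scripting this instead of clicking through the admin console makes the configuration repeatable before you commit the container as an image.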
You can test the configuration by manually deploying the WAR file to WildFly. This can be done easily with the admin console, the Maven plugin, or the ADD command as you said. I usually do it with the admin console, just to test quickly. When you have verified that the configuration works, you can remove the WAR file and create a snapshot of your container:
docker commit -m "add base settings and configurations" baseWildfly yourRepository:tag
You can now push the created image to your private repository and share that with your developer team. They can now pull the image and run the application server to deploy right away.
We don't want to deploy the WAR file through the admin console for every Maven build, as that is too cumbersome, so the next task is to automate it with a Docker volume.
Assuming you have configured Maven to build the WAR file into "your_project/deployments/", you can mount that directory onto the deployment directory of the JBoss container as follows (note that -v requires an absolute host path, and the image to run goes at the end):
docker run -d -p 8080:8080 -v /path/to/your_project/deployments:/opt/jboss/wildfly/standalone/deployments yourRepository:tag
Now, every time you rebuild the application with Maven, the application server will scan for changes and redeploy your WAR file.
It is also quite problematic to have a separate database server for each developer, as each developer has to configure it in the container themselves and they might end up with different settings (e.g. the db's URL, username, password, etc.). So it is good to dockerize that eventually.
Assuming you use Postgres as your db server, you can pull it from postgres official repository. When you have the image ready, you can run the db server:
docker run -d -p 5432:5432 -t --name postgresDB postgres
or run the database server with the linked "data" directory:
docker run -d -p 5432:5432 -v /path/to/your_postgres/data:/var/lib/postgresql/data -t --name postgresDB postgres
The first command keeps your data inside the container, while the second keeps it in the host environment.
Now you can link your database container with the Wildfly:
docker run -d -p 8080:8080 --link postgresDB:database -t baseWildfly
Now all members of the development team can have the same environment and start coding with minimal setup.
The same base images can be used for the production environment, so whenever you want to release a new version you just need to copy the WAR file to the "your_deployment" folder on the host.
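Alternatively, for production you can bake the WAR into an immutable image with the ADD approach you mentioned, so the container needs no deployment step at runtime. In this sketch the base image tag and the WAR name are placeholders:

```dockerfile
# Hypothetical production Dockerfile: start from the configured
# base image and layer the application WAR into the autodeploy dir.
FROM yourRepository:tag
ADD your_project/deployments/your_app.war /opt/jboss/wildfly/standalone/deployments/
```

Building and pushing this image per release gives you a single versioned artifact to run anywhere.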
The good thing about dockerizing the application server and the db server is that you can easily cluster them in the future, to scale out or to provide high availability.
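All of the pieces above (the link and the volumes) can also be captured in a single Docker Compose file, mentioned at the beginning, so the whole team shares one declarative setup. Image names and paths below are placeholders matching the commands above:

```yaml
# Hypothetical docker-compose.yml for the setup described above
wildfly:
  image: yourRepository:tag
  ports:
    - "8080:8080"
  volumes:
    - ./deployments:/opt/jboss/wildfly/standalone/deployments
  links:
    - postgresDB:database
postgresDB:
  image: postgres
  ports:
    - "5432:5432"
```

A single `docker-compose up` then brings up both containers with the wiring described above.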

I've used Docker with GlassFish extensively for a long time now, and wrote a blog post on the subject a while ago here.
It's a great tool for Java EE development.
For your production image I prefer to bundle everything together, building off the static base image and layering in the new WAR. I like to use the CI server to do the work, with a CI configuration for production branches that grabs a base, layers in the release build, and then publishes the artifact. Typically we manually deploy into production, but if you really want to get fancy you can even automate that, with the CI server deploying into a production environment and using proxy servers to ensure new sessions that come in get the updated version.
In development I like to take the same approach when it comes time to locally run anything that relies on the container (e.g. Arquillian integration tests) prior to checking in code. That keeps the environment as close to production as possible, which I think is important when it comes to testing. That's one big reason I am against approaches like testing with embedded containers but deploying to non-embedded ones; I've seen plenty of cases where a test passes in the embedded environment and fails in the production/non-embedded one.
During a develop/deploy/hand-test cycle, prior to committing code, I think the approach of deploying into a container (which is part of a base image) is more cost effective, in terms of the speed of that dev cycle, than building a new image with your WAR each time. It's also a better approach if your dev environment uses a tool like JRebel or XRebel, where you can hot deploy your code and simply refresh your browser to see the changes.

You might want to have a look at rhuss/docker-maven-plugin. It allows seamless integration of Docker as your deployment unit:
Use a standard Maven assembly descriptor for building images with docker:build, so your generated WAR file or your microservice can be easily added to a Docker image.
You can push the created image with docker:push
With docker:start and docker:stop you can utilize your image during unit tests.
This plugin comes with comprehensive documentation; if there are any open questions, please open an issue.
And as you might have noticed, I'm the author of this plugin ;-). And frankly, there are other docker-maven-plugins out there, which all have a slightly different focus. For a simple check, you can have a look at shootout-docker-maven which provides sample configurations for the four most active maven-docker-plugins.
The workflow then simply shifts the artifact boundary from WAR/EAR files to Docker images. mvn docker:push moves them to a Docker registry, from where they are pulled during the various testing stages of a continuous delivery pipeline.
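For reference, a minimal image-build configuration for this plugin might look like the sketch below (plugin coordinates as published at the time; the image name and base image are placeholder assumptions):

```xml
<plugin>
  <groupId>org.jolokia</groupId>
  <artifactId>docker-maven-plugin</artifactId>
  <configuration>
    <images>
      <image>
        <name>yourRepository/your-app</name>
        <build>
          <from>jboss/wildfly</from>
          <assembly>
            <!-- "artifact" is a predefined descriptor adding the build artifact -->
            <descriptorRef>artifact</descriptorRef>
          </assembly>
        </build>
      </image>
    </images>
  </configuration>
</plugin>
```

With this in the pom, mvn docker:build produces the image and mvn docker:push publishes it.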

The way you would normally deploy anything with Docker is by producing a new image on top of the platform base image. This way you follow Docker's dependency-bundling philosophy.
In terms of Maven, you can produce a tarball assembly (let's say it's called jars.tar) and then call ADD jars.tar /app/lib in Dockerfile. You might also implement a Maven plugin that generates a Dockerfile as well.
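As a sketch, assuming the Maven assembly produced jars.tar with your application jars inside (the base image and main class name are placeholder assumptions), the Dockerfile can be as small as:

```dockerfile
# ADD auto-extracts local tar archives, so /app/lib ends up
# containing the jars from the Maven assembly.
FROM openjdk:8-jre
ADD jars.tar /app/lib
CMD ["java", "-cp", "/app/lib/*", "com.example.Main"]
```

The tarball, rather than individual jars, keeps the Dockerfile stable across builds while the layer content changes.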
This is the most sane approach with Docker today; other approaches, such as building an image FROM scratch, are not really applicable to Java applications.
See also Java JVM on Docker/CoreOS.

The blog post about setting up JRebel with Docker by Arun Gupta would probably be handy here: http://blog.arungupta.me/configure-jrebel-docker-containers/

I have tried a similar scenario, using Docker to run my application. In my situation I wanted to start Docker with Tomcat running the WAR, and then, in the integration-test phase of Maven, run the Cucumber/PhantomJS integration tests against the Docker container.
The example implementation is documented at https://github.com/abroer/cucumber-integration-test. You could extend this example to push the Docker image to your private repo when the tests are successful. The pushed image can be used in any environment, from development to production.

For my current deployment process I use glassfish and this trick, which works very nicely.
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <version>${plugin.exec.version}</version>
  <executions>
    <execution>
      <id>docker</id>
      <phase>package</phase>
      <goals>
        <goal>exec</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <executable>docker</executable>
    <arguments>
      <argument>cp</argument>
      <argument>${project.build.directory}/${project.build.finalName}.war</argument>
      <argument>glassfish:/glassfish4/glassfish/domains/domain1/autodeploy</argument>
    </arguments>
  </configuration>
</plugin>
Once you run mvn clean package (with a container named glassfish already running), the copy kicks in and GlassFish autodeploys the latest WAR.

Related

How to deploy a spring boot application jar from Jenkins to another EC2 instance machine

I'm seeing so many different sources on how to achieve CI with Jenkins and EC2, and strangely none seem to fit my needs.
I have 2 EC2 Ubuntu instances. One is empty and the other has Jenkins installed on it. I want to perform a build on the Jenkins machine and copy the jar to the other Ubuntu machine. Once the jar is there I want to run mvn spring-boot:run
That's it - a very simple flow which I can't find a good source for that doesn't involve slaves, Docker, etc.

spring-boot-devtools Automatic Restart not working

I have a working Spring Boot 2.25 application built with mvn. As per this documentation I add
<dependencies>
  <dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-devtools</artifactId>
    <optional>true</optional>
  </dependency>
</dependencies>
From the documentation:
As DevTools monitors classpath resources, the only way to trigger a restart is to update the classpath. The way in which you cause the classpath to be updated depends on the IDE that you are using. In Eclipse, saving a modified file causes the classpath to be updated and triggers a restart. In IntelliJ IDEA, building the project (Build -> Build Project) has the same effect.
With the application running I tried a simple
touch /path/to/app.jar
expecting the application to restart but nothing happened.
Okay, so maybe it's doing something smarter. I modified some source .java, recompiled the .jar, and cp'd it to replace the running .jar file and... nothing happened.
Also from the documentation
DevTools relies on the application context’s shutdown hook to close it during a restart. It does not work correctly if you have disabled the shutdown hook (SpringApplication.setRegisterShutdownHook(false)).
I am not doing this.
DevTools needs to customize the ResourceLoader used by the ApplicationContext. If your application provides one already, it is going to be wrapped. Direct override of the getResource method on the ApplicationContext is not supported.
I am not doing this.
I am running this in a Docker container, if that matters. From the documentation:
Developer tools are automatically disabled when running a fully packaged application. If your application is launched from java -jar or if it is started from a special classloader, then it is considered a “production application”. If that does not apply to you (i.e. if you run your application from a container), consider excluding devtools or set the -Dspring.devtools.restart.enabled=false system property.
I don't understand what this means or if it is relevant.
I want to recompile a .jar and replace it in the running docker container and trigger and application restart without restarting the container. How can I do this?
EDIT: I am using mvn to rebuild the jar, then docker cp to replace it in the running container. (IntelliJ IDEA claims to rebuild the project, but the jar files are actually not touched, but that's another story.) I am looking for a non-IDE-specific solution.
Spring Boot Devtools offers, for Spring Boot applications, functionality that is usually available in IDEs like IntelliJ: the ability to, for example, restart an application or force a live browser reload when certain classes or resources change. This can be very useful in the development phase of your application.
It is typically used in conjunction with an IDE: Spring Boot launches it with the rest of your application when it is detected on the classpath and has not been disabled.
Although you can configure it to monitor further resources, it will usually look for changes in your application code, in your classes and resources.
It is important to say that, AFAIK, Devtools monitors your own classes and resources in an exploded way; I mean, the restart process will not work if you overwrite your whole application jar, only if you overwrite some resources in your classes directory.
This functionality can be tested with Maven. Please, consider downloading a simple blueprint from Spring Initializr, with Spring Boot, Spring Boot Devtools and Spring Web, for example, in order to keep the application running. From a terminal, in the directory that contains the pom.xml file, run your application, for instance, with the help of the spring-boot-maven-plugin plugin included in the pom.xml:
mvn spring-boot:run
The command will download the project dependencies, compile and run your application.
Now, make any modification to your source code, either in your classes or in your resources, and, from another terminal in the same directory, recompile your resources:
mvn compile
If you look at the first terminal window you will see that the application is restarted to reflect the changes.
If you are using Docker for your application deployment, trying to reproduce this behavior can be tricky.
On one hand, I do not know if it makes sense, but you can try creating a Maven-based image and running your code inside it, just as described above. Your Dockerfile can look similar to this:
FROM maven:3.5-jdk-8 as maven
WORKDIR /app
# Copy project pom
COPY ./pom.xml ./pom.xml
# Fetch (and cache) dependencies
RUN mvn dependency:go-offline -B
# Copy source files
COPY ./src ./src
# Run your application when the container starts
CMD ["mvn", "spring-boot:run"]
With this setup, you can copy with docker cp your resources to the /app/target directory and it will trigger an application restart. As an alternative, consider mounting a volume in your container instead of using docker cp.
Much better, and taking into account the fact that overwriting your application jar will probably not work, you can try to copy both your classes and library dependencies, and run your application in an exploded way. Consider the following Dockerfile:
FROM maven:3.5-jdk-8 as maven
WORKDIR /app
# Copy your project pom
COPY ./pom.xml ./pom.xml
# Fetch (and cache) dependencies
RUN mvn dependency:go-offline -B
# Copy source files
COPY ./src ./src
# Compile application and library dependencies
# The dependencies will, by default, be copied to target/dependency
RUN mvn clean compile dependency:copy-dependencies -Dspring-boot.repackage.skip=true
# Final run image (based on https://stackoverflow.com/questions/53691781/how-to-cache-maven-dependencies-in-docker)
FROM openjdk:8u171-jre-alpine
# OPTIONAL: copy dependencies so the thin jar won't need to re-download them
# COPY --from=maven /root/.m2 /root/.m2
# Change working directory
WORKDIR /app
# Copy classes from maven image
COPY --from=maven /app/target/classes ./classes
# Copy dependent libraries
COPY --from=maven /app/target/dependency ./lib
EXPOSE 8080
# Please, modify your main class name as appropriate
ENTRYPOINT ["java", "-cp", "/app/classes:/app/lib/*", "com.example.demo.DemoApplication"]
The important line in the Dockerfile is this:
mvn clean compile dependency:copy-dependencies -Dspring-boot.repackage.skip=true
It instructs Maven to compile your resources and copy the required libraries. The spring-boot.repackage.skip=true flag instructs the spring-boot-maven-plugin not to repackage the application, which its repackage goal would otherwise do during the package phase.
With this Dockerfile, build your image (let's tag it devtools-demo, for example):
docker build -t devtools-demo .
And run it:
docker run devtools-demo:latest
With this setup, if you now change your classes and/or resources and run mvn locally:
mvn compile
you should be able to force the restart mechanism in your container with the following docker cp command:
docker cp classes <container name>:/app/classes
Please, again, consider mounting a volume in your container instead of using docker cp.
I tested the setup and it worked properly.
The important thing to keep in mind is to replace your exploded resources, not the whole application jar.
As another option, you can take an approach similar to the one indicated in your comments and run your Devtools in remote mode:
FROM maven:3.5-jdk-8 as maven
WORKDIR /app
# Copy project pom
COPY ./pom.xml ./pom.xml
# Fetch (and cache) dependencies
RUN mvn dependency:go-offline -B
# Copy source files
COPY ./src ./src
# Build jar
RUN mvn package && cp target/your-app-version.jar app.jar
# Final run image (based on https://stackoverflow.com/questions/53691781/how-to-cache-maven-dependencies-in-docker)
FROM openjdk:8u171-jre-alpine
# OPTIONAL: copy dependencies so the thin jar won't need to re-download them
# COPY --from=maven /root/.m2 /root/.m2
# Change working directory
WORKDIR /app
# Copy artifact from the maven image
COPY --from=maven /app/app.jar ./app.jar
ENV JAVA_DOCKER_OPTS "-agentlib:jdwp=transport=dt_socket,server=y,address=*:8000,suspend=n"
ENV JAVA_OPTS "-Dspring.devtools.restart.enabled=true"
EXPOSE 8000
EXPOSE 8080
ENTRYPOINT ["/bin/bash", "-lc", "exec java $JAVA_DOCKER_OPTS $JAVA_OPTS -jar /app/app.jar"]
For the Spring Boot Devtools remote mode to work properly, you need several things (some of them pointed out by Opri as well in his/her answer).
First, you need to configure the spring-boot-maven-plugin to include the devtools in your application jar (it will be excluded otherwise, by default):
<plugin>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-maven-plugin</artifactId>
  <configuration>
    <excludeDevtools>false</excludeDevtools>
  </configuration>
</plugin>
Then, you need to setup a value for the configuration property spring.devtools.remote.secret. This property has to do with the way remote debugging works in Spring Boot Devtools.
The remote debugging functionality consists of two parts, a client and a server. Basically, the client is a copy of your server code, and it uses the value of the spring.devtools.remote.secret configuration property to authenticate itself against the server.
This client code should be run from an IDE, and you attach your IDE debugging process to a local server exposed from that client.
Every change to the client's monitored resources (remember, the same resources as on your server) is pushed to the remote server and triggers a restart if necessary.
As you can see, this functionality is again more appropriate from a development point of view.
If you need to actually restart your application by overwriting your application jar file, a better approach may be to configure your Docker container to run a shell script in its ENTRYPOINT or CMD. This shell script would monitor a copy of your jar in a certain directory; if that file changes, as a consequence of your docker cp, the script would stop the currently running application version (which should run from a different location to avoid problems when updating the jar), replace the current jar with the new one, and then start the new version. Not the same, but please consider reading this related SO answer.
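A minimal sketch of such an entrypoint script, assuming the jar is dropped at /deploy/app.jar by docker cp and run from /app/run.jar (both paths, and the 5-second poll interval, are assumptions):

```shell
#!/bin/sh
# Sketch of the jar-watching entrypoint idea described above.
# The drop location, run location and poll interval are assumptions.
DROP_JAR=${DROP_JAR:-/deploy/app.jar}
RUN_JAR=${RUN_JAR:-/app/run.jar}

# True when a jar has been dropped in that is newer than the running copy.
new_jar_available() {
  [ -f "$DROP_JAR" ] && [ "$DROP_JAR" -nt "$RUN_JAR" ]
}

# Stop the old process (if any), swap the jar, start the new version.
restart_app() {
  [ -z "$APP_PID" ] || kill "$APP_PID" 2>/dev/null || true
  cp "$DROP_JAR" "$RUN_JAR"   # run from a separate path so the live jar
  java -jar "$RUN_JAR" &      # is never overwritten while executing
  APP_PID=$!
}

# The actual entrypoint loop would be:
#   restart_app
#   while sleep 5; do new_jar_available && restart_app; done
```

The app is deliberately started from a copy, so docker cp never replaces a jar that the JVM currently holds open.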
In any case, when you run an application in a container, you are trying to provide a consistent and platform-independent way of deploying it. From this perspective, instead of monitoring changes inside your Docker container, a more convenient approach may be to generate and deploy a new version of your container image with those changes. This process can be greatly automated with tools like Jenkins, Travis, etcetera. These tools let you define CI/CD pipelines that, in response to a code commit, for example, generate a Docker image with your code on the fly and, if configured accordingly, deploy that image to services like some Docker flavor or Kubernetes, on premises or in the cloud. Some of them, especially Kubernetes, but Swarm and even Docker Compose as well, will allow you to perform rolling updates with no or minimal application service interruption.
To conclude, it probably will not fit your needs, but be aware that you can use spring-boot-starter-actuator, directly or with Spring Boot Admin, for instance, to restart your application.
Finally, as already indicated in the Spring Boot Devtools documentation, you can try a different option, based not on restart but on application reload: hot swapping. This functionality is offered by commercial products like JRebel, although there are some open source alternatives as well, mainly DCEVM and HotswapAgent. This related article provides some insight into how these last two products work. This GitHub project provides complementary information about how to run them in Docker containers.
I had a similar problem when using IntelliJ IDEA; I saw somewhere that you have to use the Build button for it to work.
With JSP, the application reloads the files, but it is not completely automatic, because IntelliJ saves automatically. That behavior is the default, but I think there is a way to change it: save manually, and then it reloads automatically.
This works for JSP apps only; if you try it with standard (Swing) apps it will create a double-frame execution.
I am not sure, because you are not saying explicitly whether you tried these things, but:
try setting this to true: SpringApplication.setRegisterShutdownHook(true)
try adding the -Dspring.devtools.restart.enabled=true property manually in the Dockerfile
I know the documentation says it defaults to true, but try setting it
manually.
Maybe show us the Dockerfile.
Later Edit:
Saw this in documentation:
repackaged archives do not contain devtools by default. If you want to use certain remote devtools features, you'll need to disable the excludeDevtools build property to include it. The property is supported with both the Maven and Gradle plugins.
The Spring Boot developer tools are not just limited to local development. You can also use several features when running applications remotely. Remote support is opt-in, to enable it you need to make sure that devtools is included in the repackaged archive:
<build>
  <plugins>
    <plugin>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-maven-plugin</artifactId>
      <configuration>
        <excludeDevtools>false</excludeDevtools>
      </configuration>
    </plugin>
  </plugins>
</build>
Then you need to set a spring.devtools.remote.secret property, for example:
spring.devtools.remote.secret=mysecret
Remote devtools support is provided in two parts; there is a server side endpoint that accepts connections, and a client application that you run in your IDE. The server component is automatically enabled when the spring.devtools.remote.secret property is set. The client component must be launched manually.
Documents from spring
In order to restart the app with devtools you need to make sure of the following things.
Use an IDE, or a build tool like Maven or Gradle, to start the app.
Using java -jar will not work, because devtools is disabled when the app runs as a packaged jar.
Using Maven you can run the app with mvn spring-boot:run
Refer to the official documentation for more details.
I had a similar issue: after adding the dependency, Spring Boot was still not picking up the devtools configuration, so I did the following steps in Eclipse.
installed Eclipse (assuming you have already installed it)
installed the STS plugin from the Eclipse marketplace (since I prefer the generic Eclipse version, I installed the STS plugin on top of it)
Project --> Build Automatically
Debug As --> Spring Boot Application
done.

jni4net on Docker Ubuntu Host

I have an application developed in Java 8 with Spring Boot that uses jni4net to consume a DLL library.
Is it possible to make a Docker container on Ubuntu to run this application?
Thanks
Spring Boot with Docker
This guide walks you through the process of building a Docker image for running a Spring Boot application.
What you’ll build
Docker is a Linux container management toolkit with a "social" aspect, allowing users to publish container images and consume those published by others. A Docker image is a recipe for running a containerized process, and in this guide we will build one for a simple Spring Boot application.
What you’ll need
About 15 minutes
A favorite text editor or IDE
JDK 1.8 or later
Gradle 2.3+ or Maven 3.0+
You can also import the code straight into your IDE:
Spring Tool Suite (STS)
IntelliJ IDEA
If you are NOT using a Linux machine, you will need a virtualized server. If you install VirtualBox, other tools like the Mac's boot2docker can seamlessly manage it for you. Visit VirtualBox's download site and pick the version for your machine. Download and install it. Don't worry about actually running it.
You will also need Docker, which only runs on 64-bit machines. See https://docs.docker.com/installation/#installation for details on setting Docker up for your machine. Before proceeding further, verify you can run docker commands from the shell. If you are using boot2docker you need to run that first.

Jenkins Octopus Integration

I am using Jenkins as a CI tool and Octopus to deploy my Java application. But when I searched, I could only find solutions for deploying a .NET application using OctoPack. How can I pack my Java application and automatically deploy it to the Octopus server from my Jenkins instance?
You can pack it with NuGet (with the nuget pack command, documented here). That's essentially all that Octopack does. Create a .nuspec file, and in your <files> section, include the files you want with an empty target. For example, this will include all files in your package:
...
<files>
<file src="path/to/output/**" target="" />
</files>
...
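Putting it together, a complete minimal .nuspec might look like this (the id, version, authors and description are placeholder values; the files section matches the fragment above):

```xml
<?xml version="1.0"?>
<package>
  <metadata>
    <id>MyJavaApp</id>
    <version>1.0.0</version>
    <authors>your-team</authors>
    <description>Java application packaged for Octopus Deploy</description>
  </metadata>
  <files>
    <file src="path/to/output/**" target="" />
  </files>
</package>
```

nuget pack on this file produces the .nupkg that Octopus consumes.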
You can then push it to your Octopus Deploy system using nuget push. Instructions are on your Octopus Deploy Package Library page.
Since Octopus 3.3 you can also package in tar and zip, in addition to NuGet.
You can configure the machine where you want your code to be deployed to as a deployment target. Listening Tentacles are the most oft-used ones.
Once your deployment target is configured, set up Octo.exe on your Jenkins server and use the script console in your Jenkins job to automatically deploy your package to the intended target using Octo.exe.
You can also write the code to a script on the Jenkins server and call that directly from the console in the Jenkins job. We do this in our setup because Octo.exe uses the API-KEY which we'd rather keep secret from the developers.
Note: Octopus Deploy is also currently working on native Java support. See this RFC.

Continuous deployment/integration on Heroku for Java web app

I have a Java web application that I have managed to successfully deploy and get running on Heroku using the 'git push heroku master' method, but I would like to automate deployment and have a full CI setup. I've had a go at using Atlassian Bamboo with the Heroku plugin but it's really only suitable for standalone .war files - I need to be able to specify additional config via the Procfile definition in my project.
What have other people used for CI/CD of Java web applications to Heroku?
Jenkins has a good Heroku plugin that allows you to deploy WARs and interact with Heroku in many ways, including setting variables, scaling your dynos and running one-off processes:
https://github.com/heroku/heroku-jenkins-plugin/blob/master/README.md
To change the Procfile on Heroku, you need to commit and push the new file. You can do that as a step on your CI build. Jenkins can run scripts as part of your build, where you could easily push a new Procfile if that is needed.
