I was wondering if IntelliJ has a built-in application server (like Tomcat) that I can use without having to download Tomcat directly?
Right now when I go to Run | Edit Configurations, Defaults, Tomcat Server, Local, it asks me to specify the Tomcat home directory.
Previously I used MyEclipse, and it came packaged with Tomcat, so I would be surprised if the Ultimate edition of IntelliJ does not have this.
IntelliJ does not include a built-in application server. It does have a simple built-in web server (for serving static files), but that is not an application server.
Here is an excellent resource (the official docs) for working with application servers in IntelliJ, which you may find useful, including tips on integrating servers with the IDE via plugins:
Working with Application Servers
You need to download an application server manually.
Or you could use a Maven/Gradle dependency to start Tomcat programmatically from your main method, or a plugin for starting it via the command line, e.g. mvn tomcat:run.
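For example, a minimal sketch with the tomcat7-maven-plugin; once it is in your POM, mvn tomcat7:run starts an embedded Tomcat (on port 8080 by default) with no local Tomcat installation:

<plugin>
  <groupId>org.apache.tomcat.maven</groupId>
  <artifactId>tomcat7-maven-plugin</artifactId>
  <version>2.2</version>
</plugin>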
I will add 300 points as bounty
I have recently started to take a closer look at Docker and how I can use it to get new team members up and running with the development environment faster, as well as to ship new versions of the software to production.
I have some questions regarding how and at what stage I should add the Java EE application to the container. As I see it there are multiple ways of doing this.
This WAS the typical workflow (in my team) before Docker:
Developer writes code
Developer builds the code with Maven producing a WAR
Developer uploads the WAR in the JBoss admin console / or with Maven plugin
Now that Docker has come around, I am a little confused about whether I should create and configure the images I need so that all that is left to do when you run the JBoss WildFly container is to deploy the application through the web admin console. Or should I create a new container each time I build the application in Maven, add the WAR with the ADD command in the Dockerfile, and then just run the container without ever deploying to it once it is started?
In production I guess the latter approach is preferred? Correct me if I am wrong.
But in development how should it be done? Are there other workflows?
With the latest version of Docker, you can achieve this easily with Docker Links, Docker Volumes and Docker Compose. More information about these tools is available on the Docker site.
Back to the workflow you mentioned: any typical Java EE application requires an application server and a database server. Since you do not mention in your post how the database is set up, I will assume that your development environment has a separate database server for each developer.
Under these assumptions, I would suggest the following workflow:
Build the base WildFly application server from the official image. You can fetch it with the docker pull command:
docker pull jboss/wildfly
Run the base application server with:
docker run -d -it -p 8080:8080 -p 9990:9990 --name baseWildfly jboss/wildfly
The application server is now running; next you need to configure it to connect to your database server, set up the datasource, and apply any other configuration necessary to start your Java EE application.
For this, you need to log into the bash terminal of the JBoss container:
docker exec -i -t baseWildfly /bin/bash
You are now inside the container's terminal and can configure the application server just as you would in any Linux environment.
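As one hedged example of such configuration, datasources are usually set up through the JBoss CLI; the exact commands vary by WildFly version, and the driver jar path, database host and credentials below are placeholders:

# inside the container: start the CLI and connect to the running server
/opt/jboss/wildfly/bin/jboss-cli.sh --connect

# register the JDBC driver as a module (assumes the driver jar was copied into the container first)
module add --name=org.postgresql --resources=/tmp/postgresql.jar --dependencies=javax.api,javax.transaction.api
/subsystem=datasources/jdbc-driver=postgresql:add(driver-name=postgresql,driver-module-name=org.postgresql)

# create the datasource your application will look up via JNDI
data-source add --name=AppDS --jndi-name=java:jboss/datasources/AppDS --driver-name=postgresql --connection-url=jdbc:postgresql://your-db-host:5432/yourdb --user-name=yourUser --password=yourPassword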
You can test the configuration by manually deploying the WAR file to WildFly. This can be done easily with the admin console, or a Maven plugin, or the ADD command as you said. I usually do it with the admin console, just to test quickly. When you have verified that the configuration works, remove the WAR file and create a snapshot of your container:
docker commit -m "add base settings and configurations" baseWildfly yourRepository:tag
You can now push the created image to your private repository and share it with your development team. They can then pull the image and run the application server, ready to deploy right away.
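For example (the registry host is a placeholder for your private registry):

docker tag yourRepository:tag registry.example.com/yourRepository:tag
docker push registry.example.com/yourRepository:tag

Your team members then only need:

docker pull registry.example.com/yourRepository:tag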
We don't want to deploy the WAR file through the admin console for every Maven build, as that is too cumbersome, so the next task is to automate deployment with a Docker volume.
Assuming that you have configured Maven to build the WAR file into "/your_project/deployments/", you can mount that directory onto the deployment directory of the JBoss container as follows (note that the host path given to -v must be absolute):
docker run -d -p 8080:8080 -v /your_project/deployments:/opt/jboss/wildfly/standalone/deployments yourRepository:tag
Now, every time you rebuild the application with Maven, the application server will scan for changes and redeploy your WAR file.
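That Maven part could look like the following minimal sketch (the path mirrors the placeholder used above):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-war-plugin</artifactId>
  <configuration>
    <!-- write the WAR into the directory that is mounted into the container -->
    <outputDirectory>/your_project/deployments</outputDirectory>
  </configuration>
</plugin>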
It is also quite problematic to have a separate database server for each developer, since everyone has to configure it themselves in the container and the settings may differ (e.g. the db's URL, username, password, etc.), so it is worth dockerizing that eventually too.
Assuming you use Postgres as your database server, you can pull it from the official postgres repository. When you have the image ready, you can run the db server:
docker run -d -p 5432:5432 -t --name postgresDB postgres
or run the database server with the "data" directory mounted from the host:
docker run -d -p 5432:5432 -v /your_postgres/data:/var/lib/postgresql/data -t --name postgresDB postgres
The first command keeps your data inside the container (so it is lost if the container is removed), while the second keeps your data on the host.
Now you can link your database container with the WildFly container:
docker run -d -p 8080:8080 -v /your_project/deployments:/opt/jboss/wildfly/standalone/deployments --link postgresDB:database -t yourRepository:tag
Linking makes the database reachable from the WildFly container under the alias database: Docker adds a database entry to the container's /etc/hosts and injects environment variables such as DATABASE_PORT_5432_TCP_ADDR and DATABASE_PORT_5432_TCP_PORT, so your datasource URL can simply be jdbc:postgresql://database:5432/yourdb.
Now all members of the development team can have the same environment and can start coding with minimal setup.
The same base images can be used for the production environment, so that whenever you want to release a new version, you just need to copy the WAR file into the "/your_project/deployments" folder on the host.
The nice thing about dockerizing the application server and the db server is that you can easily cluster them later, to scale out or to provide high availability.
I've used Docker with GlassFish extensively for a long time now, and I wrote a blog post on the subject a while ago here.
It's a great tool for Java EE development.
For your production image I prefer to bundle everything together, building off the static base image and layering in the new WAR. I like to use the CI server to do the work: a CI configuration for production branches grabs a base image, layers in the release build, and then publishes the artifact. Typically we deploy into production manually, but if you really want to get fancy you can automate even that, with the CI server deploying into a production environment and proxy servers ensuring that new sessions that come in get the updated version.
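For illustration, a minimal sketch of that layering step (the base image name and WAR path are placeholders; the autodeploy path is the same one used in the exec-maven-plugin answer further down):

# start from the team's static base image
FROM your-registry/glassfish-base:latest
# layer in the release build produced by the CI server
COPY target/app.war /glassfish4/glassfish/domains/domain1/autodeploy/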
In development I like to take the same approach whenever it comes time to locally run anything that relies on the container (e.g. Arquillian integration tests) prior to checking in code. That keeps the environment as close to production as possible, which I think is important for testing. That's one big reason I am against approaches like testing with embedded containers but deploying to non-embedded ones: I've seen plenty of cases where a test passes in the embedded environment and fails in the production/non-embedded one.
During a develop/deploy/manual-test cycle, prior to committing code, I think deploying into a container that is part of a base image is more cost effective in terms of cycle speed than building your WAR into a new image each time. It's also a better approach if your dev environment uses a tool like JRebel or XRebel, where you can hot-deploy your code and simply refresh your browser to see the changes.
You might want to have a look at rhuss/docker-maven-plugin. It allows seamless integration for using Docker images as your deployment unit:
Use a standard Maven assembly descriptor for building images with docker:build, so your generated WAR file or your microservice can easily be added to a Docker image.
You can push the created image with docker:push
With docker:start and docker:stop you can utilize your image during unit tests.
This plugin comes with comprehensive documentation; if there are any open questions, please open an issue.
And as you might have noticed, I'm the author of this plugin ;-). And frankly, there are other docker-maven-plugins out there, which all have a slightly different focus. For a simple check, you can have a look at shootout-docker-maven which provides sample configurations for the four most active maven-docker-plugins.
The workflow then simply shifts the artifact boundary from WAR/EAR files to Docker images. mvn docker:push moves them to a Docker registry, from where they are pulled during the various testing stages of a continuous delivery pipeline.
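A rough sketch of a minimal image configuration (coordinates and assembly options are from memory and may differ between plugin versions; treat this as illustrative and check the plugin documentation):

<plugin>
  <groupId>org.jolokia</groupId>
  <artifactId>docker-maven-plugin</artifactId>
  <configuration>
    <images>
      <image>
        <name>yourRepository/yourApp</name>
        <build>
          <from>jboss/wildfly</from>
          <!-- the predefined "artifact" descriptor adds the build's main artifact -->
          <assembly>
            <descriptorRef>artifact</descriptorRef>
          </assembly>
        </build>
      </image>
    </images>
  </configuration>
</plugin>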
The way you would normally deploy anything with Docker is by producing a new image on top of a platform base image. This way you follow Docker's dependency-bundling philosophy.
In terms of Maven, you can produce a tarball assembly (let's say it's called jars.tar) and then call ADD jars.tar /app/lib in the Dockerfile (ADD auto-extracts local tar archives). You might also implement a Maven plugin that generates the Dockerfile as well.
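A minimal Dockerfile sketch under those assumptions (the base image and main class are placeholders):

FROM java:8
# ADD auto-extracts a local tar archive into the target directory
ADD jars.tar /app/lib
# java expands the * wildcard in classpath entries itself
CMD ["java", "-cp", "/app/lib/*", "com.example.Main"]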
This is the sanest approach with Docker today; other approaches, such as building an image FROM scratch, are not really applicable to Java applications.
See also Java JVM on Docker/CoreOS.
The blog post about setting up JRebel with Docker by Arun Gupta would probably be handy here: http://blog.arungupta.me/configure-jrebel-docker-containers/
I have tried a similar scenario using Docker to run my application. In my case I wanted to start Docker with Tomcat running the WAR, and then, in Maven's integration-test phase, run the cucumber/phantomjs integration tests against the Docker container.
The example implementation is documented at https://github.com/abroer/cucumber-integration-test. You could extend this example to push the Docker image to your private repo when the tests are successful. The pushed image can be used in any environment, from development to production.
For my current deployment process I use glassfish and this trick, which works very nicely.
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <version>${plugin.exec.version}</version>
  <executions>
    <execution>
      <!-- run the docker binary during the package phase -->
      <id>docker</id>
      <phase>package</phase>
      <goals>
        <goal>exec</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <executable>docker</executable>
    <arguments>
      <!-- docker cp the freshly built artifact into the autodeploy
           directory of the running container named "glassfish" -->
      <argument>cp</argument>
      <argument>${project.build.directory}/${project.build.finalName}</argument>
      <argument>glassfish:/glassfish4/glassfish/domains/domain1/autodeploy</argument>
    </arguments>
  </configuration>
</plugin>
Once you run mvn clean package, the copy kicks in and GlassFish starts deploying the latest WAR (this assumes a container named glassfish is already running).
I'm new to AngularJS and I'm trying to run my first Angular program with Spring. I downloaded the code from here: http://javahonk.com/spring-mvc-angularjs-integration/ and imported it as a Maven project. I downloaded the JSDT and AngularJS plugins from the Eclipse Marketplace, but it's still not running. There is no error message. I'm running this on Tomcat 7. What am I missing?
Any help would be appreciated.
I'm not an Eclipse user, so I have no idea about that part. But you can use Node.js as a static server, as explained here:
1) Install Node.js for your OS
2) Run this command in a terminal (console)
> npm install http-server -g
3) Start server
> cd /path/to/your/project
> http-server -o --cors
Now you can access your project in the browser at http://localhost:8080/yourfile.html.
Work in Eclipse and just refresh the page in the browser to see your changes.
If you are working on a Spring application within the Eclipse IDE, your best alternative would be to start your application using an embedded application server (the best I would recommend are Jetty or Tomcat, if you don't need EE-level components).
You can follow this link, which covers the basic steps for adding a new application server within the Eclipse IDE.
Once you have added a new application server, you can deploy your application to it and launch it, and your application should be reachable at http://localhost:8080/SpringMVCAngularJS.
A good alternative when using Maven as a build tool is an embedded AS plugin such as the Tomcat7 plugin or the Jetty plugin. Such a plugin makes it possible to start your application via Maven goals (which does not require adding a new AS to the Eclipse IDE).
I've pushed a sample module based on the tutorial you mentioned. You can test the above-described plugin as follows in a *nix shell (you may need to set up git if you have not already done so):
git clone https://github.com/tmarwen/stackoverflow-showcase.git
cd stackoverflow-showcase/springmvc-angularjs
mvn tomcat7:run
I've got a Java web application that builds with Maven. My project uses RequireJS, and I use a Maven plugin at build time to compress the JS artifacts (https://github.com/bringking/requirejs-maven-plugin). The plugin calls out to Node.js (with the r.js compressor) to do the actual work.
Local builds work wonderfully.
On Heroku, however, NodeJS is not available using the Heroku Java buildpack (the default for Java/Maven applications).
For now, I run the RequireJS Maven plugin locally, using an active Maven profile that isn't present on the Heroku server; this prevents the plugin from running on Heroku. It is less than ideal because it requires me to run the plugin locally and then check in the resulting build artifact. It would be far better to generate the compressed JS file at build time on the Heroku system.
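For reference, the profile-based workaround looks roughly like this (plugin coordinates and goal name abridged from the plugin's README; treat as a sketch):

<profile>
  <!-- only activated on developer machines, e.g. with mvn package -P requirejs -->
  <id>requirejs</id>
  <build>
    <plugins>
      <plugin>
        <groupId>com.github.bringking</groupId>
        <artifactId>requirejs-maven-plugin</artifactId>
        <executions>
          <execution>
            <phase>process-resources</phase>
            <goals>
              <goal>optimize</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</profile>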
I'm looking for a good solution. Thanks in advance.
The best solution is to use the Heroku multi buildpack with the Node.js and Java buildpacks. This is described in an article on using Grunt with Java and Maven, but the same principles apply to Require.js.
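A hedged sketch of the setup from that era (buildpack URLs as they were published at the time):

# point the app at the multi buildpack
heroku config:set BUILDPACK_URL=https://github.com/ddollar/heroku-buildpack-multi.git

# .buildpacks in the repo root lists the buildpacks to run, in order
https://github.com/heroku/heroku-buildpack-nodejs
https://github.com/heroku/heroku-buildpack-java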
I have a few jobs that automatically build a Java app, and I would like the result to be pushed automatically to another server. I found a plugin that copies artifacts over SSH, but using it I end up with app-1.0-SNAPSHOT.jar, app-1.1-SNAPSHOT.jar and so on on the remote server.
I would like to have it as app.jar instead, overwriting the old one every time. Is there an "intelligent" way of doing this, or should I just write a shell script that looks for the newest one and overwrites it?
If you are using a Maven project, I would recommend Mojo's Ship Maven Plugin for doing the transfer from your build scripts.
If you want to do this via Jenkins plugins, there are the following plugin options:
Publish via SSH
Publish to a FTP server
Publish to a Windows file share
I just do it right in my build scripts. Why all the extra management? In Ant,
<copy todir="${remote}">
  <globmapper from="*" to="app.jar"/>
  ...
</copy>
works perfectly fine.