I wanted to ask if anyone has gained experience using Jenkins for starting (Java/Maven) processes. How does your company start processes on its application servers?
Currently we are using Ant to start the application, but we are searching for a replacement that offers
traceability / logging
ease of access (web interface or GUI for the production department)
Jenkins came to mind because I know it, but I'm not sure whether it's the right tool for this purpose.
What do you think?
Have a look at Rundeck
It can run local and remote tasks, jobs, etc.
Yes, you can use Jenkins for both.
traceability / logging
Using the Audit To Database plugin (https://wiki.jenkins.io/display/JENKINS/Audit+To+Database+Plugin), you can log all build/job details to a database and fetch them to create reports. Customize the schema as required.
ease of access (web interface or GUI for the production department)
Again, yes. By implementing proper folders/views and access control it is possible.
Moreover, I have used a Jenkins server as a centralized build/deploy/test orchestrator to initiate a build, deploy to remote hosts (Tomcat-based Java apps), and run test cases.
Related
I am developing a product using microservices and am running into a bit of an issue. In order to do any work, I need to have all 9 services running on my local development environment. I am using Cloud Foundry to run the applications, but when running locally I am just running the Spring Boot jars themselves. Is there any way to set up a more lightweight environment so that I don't need everything running? Ideally, only the service I am currently working on would have to be real.
I believe this is a matter of your testing strategy. If you have a lot of microservices in your system, it is not wise to always perform end-to-end testing at development time; it costs you productivity, and the setup is usually complex (as you observed).
You should really think about what it is you want to test. Within one service, it is usually good to decouple the core logic from the integration points with other services. Ideally, you should be able to write simple unit tests for your core logic. If you want to test integration points with other services, use a mocking library (a quick Google search shows this to be promising: http://spring.io/blog/2007/01/15/unit-testing-with-stubs-and-mocks/).
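For illustration, a minimal Mockito sketch; the InventoryClient interface and OrderService class are made-up placeholders for an integration point and the core logic that depends on it:

    import static org.junit.Assert.*;
    import static org.mockito.Mockito.*;

    import org.junit.Test;

    public class OrderServiceTest {

        // Hypothetical wrapper around a call to another microservice
        interface InventoryClient {
            int availableStock(String productId);
        }

        // Core logic under test; it depends only on the interface,
        // so no real service needs to be running
        static class OrderService {
            private final InventoryClient inventory;
            OrderService(InventoryClient inventory) { this.inventory = inventory; }
            boolean canOrder(String productId, int quantity) {
                return inventory.availableStock(productId) >= quantity;
            }
        }

        @Test
        public void checksStockBeforeAcceptingAnOrder() {
            InventoryClient inventory = mock(InventoryClient.class);
            when(inventory.availableStock("sku-1")).thenReturn(3);

            OrderService service = new OrderService(inventory);

            assertTrue(service.canOrder("sku-1", 2));
            assertFalse(service.canOrder("sku-1", 5)); // only 3 in stock
        }
    }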
If you don't have one already, I would highly recommend setting up a separate staging area with all microservices running. You should perform all your end-to-end testing there, before deploying to production.
This post from Martin Fowler has a more comprehensive take on microservice testing strategy:
https://martinfowler.com/articles/microservice-testing
It boils down to the testing technique that you use. Here is my recent answer in another topic that you may find useful: https://stackoverflow.com/a/44486519/2328781.
In general, I think that WireMock is a good choice for the following reasons:
It has out-of-the-box support in Spring Boot.
It has out-of-the-box support in Spring Cloud Contract, which makes it possible to use a very powerful technique called Consumer Driven Contracts.
It has a recording feature. Set up your WireMock as a proxy and make requests through it. This will automatically generate stubs for you based on your requests and responses.
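As a minimal sketch (the port, URL, and JSON body are made up for illustration), a programmatically started WireMock stub for one dependent service could look like this:

    import com.github.tomakehurst.wiremock.WireMockServer;

    import static com.github.tomakehurst.wiremock.client.WireMock.*;

    public class UserServiceStub {
        public static void main(String[] args) {
            // Start a standalone WireMock server on a fixed port
            WireMockServer server = new WireMockServer(8089);
            server.start();

            // Serve a canned JSON response for one endpoint of the dependency
            server.stubFor(get(urlEqualTo("/users/42"))
                    .willReturn(aResponse()
                            .withStatus(200)
                            .withHeader("Content-Type", "application/json")
                            .withBody("{\"id\": \"42\", \"name\": \"Alice\"}")));

            // Point the service you are developing at http://localhost:8089
            // instead of the real user service.
        }
    }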
There are multiple tools out there that let you create mocked versions of your microservices.
When I encountered this exact problem myself, I decided to create my own tool, tailored for microservice testing. The goal is to never have to run all microservices at once, only the one that you are working on.
You can read more about the tool and how to use it to mock microservices here: https://mocki.io/mock-api-microservices. If you only want to run them locally, it is possible using the open-source CLI tool.
It can be solved if your microservices allow passing metadata along with requests.
A good microservice architecture should use central service discovery, and every service should be able to accept a metadata map along with the request payload. Known fields of this map can be interpreted and modified by a service, then passed on to the next service.
The most popular use of per-request metadata is request tracing (i.e. collecting the tree of nodes used to process the request, with timings for every node), but it can also be used to tell the entire system which nodes to use.
Thus the plan is:
register your local node in the dev environment's service discovery
send a request to the entry node of your system along with metadata telling everyone to use your local service instance instead of the default one
the metadata will propagate, your local node will be called by the dev environment, and the local node will then pass the processed results back to the dev environment
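As a minimal sketch of the propagation side, assuming Spring's RestTemplate and a made-up header name X-Service-Override (the servlet filter that copies the incoming header into the ThreadLocal is omitted):

    import java.io.IOException;

    import org.springframework.http.HttpRequest;
    import org.springframework.http.client.ClientHttpRequestExecution;
    import org.springframework.http.client.ClientHttpRequestInterceptor;
    import org.springframework.http.client.ClientHttpResponse;

    // Copies the routing metadata of the request being processed onto every
    // outgoing call, so the override instruction travels down the call tree.
    public class RoutingMetadataInterceptor implements ClientHttpRequestInterceptor {

        public static final ThreadLocal<String> OVERRIDE = new ThreadLocal<>();

        @Override
        public ClientHttpResponse intercept(HttpRequest request, byte[] body,
                ClientHttpRequestExecution execution) throws IOException {
            String override = OVERRIDE.get();
            if (override != null) {
                // e.g. "pricing-service=http://my-laptop:8080"
                request.getHeaders().add("X-Service-Override", override);
            }
            return execution.execute(request, body);
        }
    }

Register it with restTemplate.getInterceptors().add(new RoutingMetadataInterceptor()); each node's service-discovery lookup would then honor the override when it is present.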
Alternatively:
use code generation for inter-service communication to reduce the risk of failures caused by mistakes in RPC code
resort to integration tests, mocking all client APIs for the microservice under development
fully automate the deployment of your system to your local machine. You will possibly need to run nodes with reduced memory (which is generally OK, as memory is mostly consumed only under load) or buy more RAM.
One approach would be to deploy an app which maps paths/URLs to JSON response files. I personally haven't used it, but I believe http://wiremock.org/ might help you.
For Java microservices, you should try stubby4j. This will mock the JSON responses of other microservices using a Stubby server. If you feel that mocking is not enough to cover all the features of your microservices, you should set up a local Docker environment to deploy the dependent microservices.
Question
Together with my friends from university, I'm building a web application, and we recently ran into the following problem. The server is synchronized with a remote repository (git). Everyone can run the application locally and has their own local database on their machine. There is a database on the web hosting plugged into the application on the server. When someone wants to change something in the database, they write an SQL script, push it to the repository, run it locally, then run it on the server and make sure that every other developer executes it too. That seems very inconvenient to us.
Bad idea
One solution would be plugging everyone into the same database. But IMHO this is a bad idea because:
We would need to buy another web host for SQL, because the one currently running serves worldwide users. For safety and testing reasons we would need a separate one.
Having a database that is visible to the world, protected only with a simple password, seems dangerous to me. The current database is configured to be visible only locally (locally relative to the server, of course), so it is visible only to the web server and to developers via SSH if needed.
Performance reasons. Connecting to a remote database instead of a local one would be over a dozen times slower; considering developer usage (more complicated queries, testing the site, lots of JUnit testing), it would be an incredibly painful solution.
Good idea
Some time ago I worked at a company where this problem was resolved as follows. There was a Maven plugin configured to run each SQL script in a specified directory exactly once during the application build (mvn clean install), i.e. it remembered which scripts had already been executed and skipped them. Suppose someone wants to change something in the database, a new column for example. They write a script and push it to the repository, and then they don't have to worry about anything, because the script is automatically executed for them, the server, and every other developer during the application build.
How to do it
Unfortunately I can't find that plugin or configuration. To be honest, I cannot find anything related to my problem on the web, which is surprising because it seems like a common problem to me. So, can I do it with some Maven plugin? Maybe there is a way to do it with the proper Spring configuration. If I were forced to do it manually (in Java at application start), what tools would I need, and do you have any advice or class patterns?
Looking forward to your help. Also, sorry for my English; I'm not a native speaker.
Just a guess, but maybe the company you worked for used Liquibase or Flyway.
In the case of Liquibase, which can be used via Maven as well, information can be found here: http://www.liquibase.org/, and specifically for the Maven integration here: http://www.liquibase.org/documentation/maven/index.html
Spring comes with a Liquibase integration as well; information can be found here: http://www.liquibase.org/documentation/spring.html, or, if you're using Spring Boot: https://docs.spring.io/spring-boot/docs/current/reference/html/howto-database-initialization.html
Another possible solution for database migration is Flyway; your entry point for the documentation is http://flywaydb.org/
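For example, a minimal Flyway sketch using its fluent Java API (Flyway 5+; the connection settings are placeholders). Flyway records every executed script in its own history table, so each migration under src/main/resources/db/migration runs exactly once per database, which matches the behavior you described:

    import org.flywaydb.core.Flyway;

    public class DatabaseMigrator {
        public static void main(String[] args) {
            // Placeholder connection settings -- replace with your own
            Flyway flyway = Flyway.configure()
                    .dataSource("jdbc:mysql://localhost:3306/app", "dev", "secret")
                    .load();

            // Applies any not-yet-executed scripts (e.g. V2__add_column.sql)
            // and records them so they are skipped on the next run
            flyway.migrate();
        }
    }

The same can be bound to mvn clean install via the flyway-maven-plugin, or run automatically at startup by Spring Boot when Flyway is on the classpath.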
I'm working on an application which is composed of many web containers with WARs installed on them.
Currently I have a farm of ~ 10 servers like this.
I'm about to start integrating jolokia/hawtio to track the JMX MBeans exposed on each of these servers.
For this purpose (I think) I'll install a Jolokia agent on each of these servers (just putting the WAR into the deployment directory).
Now I'm thinking of putting hawtio on a dedicated node and connecting to the remote agents deployed on the rest of my 10 servers.
My question is whether it's possible to somehow provide a list of predefined agents (host/port/credentials)?
I have a lot of farms like this to manage, and I would prefer to use a predefined list generated per farm rather than dealing with auto-discovery.
Thanks in advance
No, currently this is not supported, but it's a good idea. You are welcome to log a ticket about this:
https://github.com/hawtio/hawtio/issues
Also, how would you like that list to be configured? Would you need to edit the web.xml and repackage the WAR, or how should that configuration be made easy? That is something to think about.
I need to simulate JMS behavior while performing automated tests via Maven/Hudson. I was thinking about using a mock framework, e.g. Mockito, to achieve that goal, but maybe there is an easier tool which can accomplish this task? I have read a little bit about ActiveMQ, but from what I have found out, it requires installing a broker before using it. In my case it is important to have everything run by Maven only, because I don't have any privileges to install anything on the build server.
You can run ActiveMQ in embedded mode: the broker starts within your application and queues are created on the fly. You just need to add activemq.jar and run a few lines of code.
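A minimal sketch (the queue name is made up): the vm:// transport starts a non-persistent broker inside the JVM on first use, so nothing has to be installed on the build server:

    import javax.jms.Connection;
    import javax.jms.MessageProducer;
    import javax.jms.Queue;
    import javax.jms.Session;
    import javax.jms.TextMessage;

    import org.apache.activemq.ActiveMQConnectionFactory;

    public class EmbeddedJmsExample {
        public static void main(String[] args) throws Exception {
            // vm:// creates an in-JVM broker; persistent=false keeps it in memory
            ActiveMQConnectionFactory factory =
                    new ActiveMQConnectionFactory("vm://localhost?broker.persistent=false");

            Connection connection = factory.createConnection();
            connection.start();
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);

            // Queues are created on the fly on first use
            Queue queue = session.createQueue("test.queue");
            MessageProducer producer = session.createProducer(queue);
            producer.send(session.createTextMessage("hello"));

            TextMessage received =
                    (TextMessage) session.createConsumer(queue).receive(1000);
            System.out.println(received.getText());

            connection.close();
        }
    }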
On the other hand, there is the Mockrunner library, which has support for JMS, although it was designed mainly for unit tests, not integration tests.
We have a Java portal connected to a MySQL database containing about 70 tables.
When we prepare a new client on it, we test it on a DEV server, and if all works well,
we apply THE SAME configuration on PRODUCTION.
Well, we want to build a simple tool to EXPORT this configuration from DEV and IMPORT it into PRODUCTION (to avoid doing it by hand every time).
We are thinking about doing this with REST: GET from DEV and POST to PRODUCTION.
This configuration involves about 7-8 tables.
What do you recommend? Do you think REST is the best decision?
I think REST is a bit of a strange choice for this, as you would need to build and maintain the client and server software for handling the file uploads, and have it installed correctly on both machines.
I would use an automated secure copy (SCP) script to copy your build artefacts.
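If you would rather drive the copy from Java than from a shell script, here is a minimal sketch using the JSch library over SFTP (functionally equivalent to scp for this purpose; the host, user, key path, and file paths are placeholders):

    import com.jcraft.jsch.ChannelSftp;
    import com.jcraft.jsch.JSch;
    import com.jcraft.jsch.Session;

    public class ConfigCopy {
        public static void main(String[] args) throws Exception {
            JSch jsch = new JSch();
            jsch.setKnownHosts("/home/dev/.ssh/known_hosts"); // trust the target host
            jsch.addIdentity("/home/dev/.ssh/id_rsa");        // key-based auth

            Session session = jsch.getSession("deploy", "production.example.com", 22);
            session.connect();

            // Upload the configuration dump exported from DEV
            ChannelSftp sftp = (ChannelSftp) session.openChannel("sftp");
            sftp.connect();
            sftp.put("/tmp/dev-config-export.sql", "/opt/app/import/dev-config-export.sql");
            sftp.disconnect();

            session.disconnect();
        }
    }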