We have a Java portal connected to a MySQL db containing about 70 tables.
When we prepare a new client on it, we test it on a DEV server, and if everything works well
we apply THE SAME configuration on PRODUCTION.
Well, we want to build some simple tool to EXPORT this configuration from DEV and IMPORT it to PRODUCTION (to avoid doing it by hand every time).
We are thinking about doing this with REST: GET from DEV and POST to PRODUCTION.
This configuration involves about 7-8 tables.
What do you recommend? Do you think REST is the best decision?
I think REST is a bit of a strange choice for this, as you would need to build and maintain both the client and the server software for handling the file uploads, and have it installed correctly on both machines.
I would use an automated secure copy (SCP) script to copy your build artefacts.
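If you do want a small purpose-built tool rather than a shell script, a plain JDBC dump of the handful of configuration tables produces a file you can copy over with that same SCP step and replay with the mysql client. A minimal sketch, with placeholder connection details and table names (and naive value quoting, so treat it as a starting point only):

    import java.io.PrintWriter;
    import java.sql.*;

    // Dumps selected configuration tables as INSERT statements. Copy the
    // resulting file to PRODUCTION (e.g. via scp) and replay it with mysql.
    public class ConfigExport {
        // Placeholder names: substitute the real 7-8 configuration tables.
        private static final String[] TABLES = {"client_config", "client_settings"};

        public static void main(String[] args) throws Exception {
            try (Connection con = DriverManager.getConnection(
                         "jdbc:mysql://dev-host:3306/portal", "user", "secret");
                 PrintWriter out = new PrintWriter("config-export.sql")) {
                for (String table : TABLES) {
                    try (Statement st = con.createStatement();
                         ResultSet rs = st.executeQuery("SELECT * FROM " + table)) {
                        int cols = rs.getMetaData().getColumnCount();
                        while (rs.next()) {
                            StringBuilder row = new StringBuilder("INSERT INTO " + table + " VALUES (");
                            for (int i = 1; i <= cols; i++) {
                                Object v = rs.getObject(i);
                                // Naive: quotes everything as a string (MySQL coerces numerics).
                                row.append(i > 1 ? ", " : "")
                                   .append(v == null ? "NULL"
                                                     : "'" + v.toString().replace("'", "''") + "'");
                            }
                            out.println(row.append(");"));
                        }
                    }
                }
            }
        }
    }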
I have a Java microservice and I would like to use DynamoDB as the database.
I have multiple profiles: dev, test, stage and prod.
For the internal profiles (dev and test) I would like to use a local database via the DynamoDB Local Docker image.
My idea is to automate the setup process for the test profile (because I'll use Jenkins to run the tests for CI/CD) and also for the dev profile, to avoid running a lot of commands manually (I would like to run just a setup script that creates the db, the tables and the indexes).
I saw that, after initializing the database via docker compose, it's possible to use the AWS CLI to create the tables, indexes, etc.
The idea is to run these commands in a bash script in order to set everything up.
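For example, something like this is what I mean (just a sketch; since the service is Java, the same setup could also be scripted with the AWS SDK v2 instead of the CLI; the "orders" table and its key are made-up examples):

    import java.net.URI;
    import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
    import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
    import software.amazon.awssdk.regions.Region;
    import software.amazon.awssdk.services.dynamodb.DynamoDbClient;
    import software.amazon.awssdk.services.dynamodb.model.*;

    // Creates a table against DynamoDB Local (default port 8000).
    public class LocalDynamoSetup {
        public static void main(String[] args) {
            DynamoDbClient db = DynamoDbClient.builder()
                    .endpointOverride(URI.create("http://localhost:8000")) // the Docker container
                    .region(Region.EU_WEST_1) // any region works locally
                    .credentialsProvider(StaticCredentialsProvider.create(
                            AwsBasicCredentials.create("dummy", "dummy"))) // local mode ignores real creds
                    .build();

            db.createTable(CreateTableRequest.builder()
                    .tableName("orders")
                    .attributeDefinitions(AttributeDefinition.builder()
                            .attributeName("id").attributeType(ScalarAttributeType.S).build())
                    .keySchema(KeySchemaElement.builder()
                            .attributeName("id").keyType(KeyType.HASH).build())
                    .billingMode(BillingMode.PAY_PER_REQUEST)
                    .build());
        }
    }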
I also saw that there is this project: https://github.com/derjust/spring-data-dynamodb
The idea there is to reuse the Hibernate approach: write just the Java entity classes and get automatic creation/update of the database tables/indexes.
My only objection to this approach is that it is not an 'official' project; just that.
For the production environment, the idea is to integrate this configuration step into the CloudFormation script.
Could this be a good way, or is it better to configure everything using the AWS console?
Honestly, this is the first time I'm using DynamoDB and I don't have a clear idea of the best practices.
Can you point me to some good examples?
We maintain our server once a week.
Sometimes the customer asks us to change some settings that are already cached in the server.
My colleague always writes some JSP code to change these settings, which are stored in memory.
Is this kind of methodology a good approach?
And if our project is not running in a web container, which tools can help me?
Usually, in my experience, the server configuration is not stored only in the server's memory:
What happens after a configuration change if the server is restarted, or just goes down for some system reason?
What happens if you have more than one instance of the same server (in other words, a cluster of servers)?
So, usually, people opt for various "externalized configuration" options. These range from file-based configuration plus a redeploy of the whole cluster upon each configuration change, to configuration management servers (like Consul, etcd, etc.). There are also some solutions that come from (and are used in) the Java world: Apache ZooKeeper and Spring Cloud Config Server, to name a few; there are others. In addition, it's sometimes convenient to store the configuration in a database.
Now to your question: if your project is not a web container, you don't care that the configuration will "disappear" after a server restart, and you're not running a distributed cluster of servers, then using JSP indeed doesn't seem appropriate in this case.
Maybe you should take a look at JMX (Java Management Extensions), which has a built-in solution for this, so you will probably be able to get rid of the web container (which your team doesn't seem to use for anything other than the JSP modifications you've described).
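A minimal sketch of what that can look like with a standard MBean (all names here are illustrative); once registered, the value can be changed at runtime from jconsole or VisualVM:

    // RuntimeSettingsMBean.java -- the standard MBean convention requires the
    // interface to be named <ImplementationClass>MBean
    public interface RuntimeSettingsMBean {
        int getCacheTtlSeconds();
        void setCacheTtlSeconds(int ttl);
    }

    // RuntimeSettings.java -- holds the live, in-memory settings
    public class RuntimeSettings implements RuntimeSettingsMBean {
        private volatile int cacheTtlSeconds = 300;
        public int getCacheTtlSeconds() { return cacheTtlSeconds; }
        public void setCacheTtlSeconds(int ttl) { this.cacheTtlSeconds = ttl; }
    }

    // Main.java -- register the MBean with the platform MBean server
    import java.lang.management.ManagementFactory;
    import javax.management.ObjectName;

    public class Main {
        public static void main(String[] args) throws Exception {
            ManagementFactory.getPlatformMBeanServer().registerMBean(
                    new RuntimeSettings(),
                    new ObjectName("com.example:type=RuntimeSettings"));
            Thread.sleep(Long.MAX_VALUE); // keep the JVM alive for the demo
        }
    }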
You basically need an in-memory cache. There are multiple solutions (found in other answers) that include creating your own implementation or using an existing Java library. You can also fetch the data from the database and add a cache over the database layer.
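A minimal read-through sketch using only the JDK (the database calls are placeholders):

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    // Simple settings cache layered over a (hypothetical) database table.
    class SettingsCache {
        private final Map<String, String> cache = new ConcurrentHashMap<>();

        // Read-through: load from the database on first access, then serve from memory.
        String get(String key) {
            return cache.computeIfAbsent(key, this::loadFromDatabase);
        }

        // Write-through: persist first, then refresh the cached value, so a
        // restart or a second instance reads the same state from the database.
        void update(String key, String value) {
            saveToDatabase(key, value);
            cache.put(key, value);
        }

        private String loadFromDatabase(String key) { /* JDBC lookup goes here */ return ""; }
        private void saveToDatabase(String key, String value) { /* JDBC update goes here */ }
    }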
I wanted to ask if anyone has experience using Jenkins for starting (Java/Maven) processes. How does your company start processes on its application servers?
Currently we are using Ant to start the application, but we are searching for a replacement that offers:
traceability / logging
ease of access (Webinterface or GUI for the production department)
Jenkins came to mind because I know it, but I'm not sure if this is the right tool for this purpose.
What do you think?
Have a look at Rundeck.
It can run local and remote tasks, jobs, etc.
Yes, you can use Jenkins for both.
traceability / logging
Using the Audit2DB plugin (https://wiki.jenkins.io/display/JENKINS/Audit+To+Database+Plugin), you can log all build/job details into a database and fetch/create reports from it. Customize the schema as per your requirements.
ease of access (Webinterface or GUI for the production department)
Again a yes. By implementing proper folders/views and access control it is possible.
Moreover, I have used a Jenkins server as a centralized build/deploy/test orchestrator to initiate a build, deploy on remote hosts (Tomcat-based Java apps) and run test cases.
I am developing a product using microservices and am running into a bit of an issue. In order to do any work, I need to have all 9 services running on my local development environment. I am using Cloud Foundry to run the applications, but when running locally I am just running the Spring Boot jars themselves. Is there any way to set up a more lightweight environment so that I don't need everything running? Ideally, I would like only the service I am currently working on to have to be real.
I believe this is a matter of your testing strategy. If you have a lot of microservices in your system, it is not wise to always perform end-to-end testing at development time -- it costs you productivity, and the setup is usually complex (like what you observed).
You should really think about what it is you want to test. Within one service, it is usually good to decouple the core logic from the integration points with other services. Ideally, you should be able to write simple unit tests for your core logic. If you want to test the integration points with other services, use a mock library (a quick Google search shows this to be promising: http://spring.io/blog/2007/01/15/unit-testing-with-stubs-and-mocks/).
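For example, a small Mockito sketch (InventoryClient and OrderService are hypothetical names); the neighbouring service is replaced by a stub, so nothing else needs to be running:

    import static org.junit.jupiter.api.Assertions.assertTrue;
    import static org.mockito.Mockito.mock;
    import static org.mockito.Mockito.when;

    import org.junit.jupiter.api.Test;

    // Hypothetical client for a neighbouring microservice.
    interface InventoryClient {
        boolean isInStock(String sku);
    }

    // Core logic under test, decoupled from the real service behind an interface.
    class OrderService {
        private final InventoryClient inventory;
        OrderService(InventoryClient inventory) { this.inventory = inventory; }
        boolean canOrder(String sku) { return inventory.isInStock(sku); }
    }

    class OrderServiceTest {
        @Test
        void ordersOnlyStockedItems() {
            InventoryClient inventory = mock(InventoryClient.class);
            when(inventory.isInStock("sku-1")).thenReturn(true); // stub the integration point
            assertTrue(new OrderService(inventory).canOrder("sku-1"));
        }
    }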
If you don't have one already, I would highly recommend setting up a separate staging area with all microservices running. You should perform all your end-to-end testing there, before deploying to production.
This post from Martin Fowler has a more comprehensive take on microservice testing strategy:
https://martinfowler.com/articles/microservice-testing
It boils down to the test technique that you use. Here is my recent answer on another topic that you might find useful: https://stackoverflow.com/a/44486519/2328781.
In general, I think that WireMock is a good choice for the following reasons:
It has out-of-the-box support in Spring Boot.
It has out-of-the-box support in Spring Cloud Contract, which gives you the possibility to use a very powerful technique called Consumer Driven Contracts.
It has a recording feature. Set up your WireMock as a proxy and make requests through it. This will generate stubs for you automatically, based on your requests and responses.
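A minimal example of using WireMock programmatically to stand in for a neighbouring service (the port and payload are arbitrary):

    import com.github.tomakehurst.wiremock.WireMockServer;
    import static com.github.tomakehurst.wiremock.client.WireMock.*;

    // Stands in for a neighbouring microservice on a fixed local port.
    public class UserServiceStub {
        public static void main(String[] args) {
            WireMockServer server = new WireMockServer(8089); // arbitrary port
            server.start();

            // Any GET on /users/42 now returns a canned JSON payload.
            server.stubFor(get(urlEqualTo("/users/42"))
                    .willReturn(aResponse()
                            .withStatus(200)
                            .withHeader("Content-Type", "application/json")
                            .withBody("{\"id\": 42, \"name\": \"Test User\"}")));

            // Point the service under development at http://localhost:8089
            // instead of the real dependency.
        }
    }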
There are multiple tools out there that let you create mocked versions of your microservices.
When I encountered this exact problem myself, I decided to create my own tool tailored for microservice testing. The goal is to never have to run all microservices at once, only the one that you are working on.
You can read more about the tool and how to use it to mock microservices here: https://mocki.io/mock-api-microservices. If you only want to run them locally, that is possible using the open-source CLI tool.
It can be solved if your microservices allow passing metadata along with requests.
A good microservice architecture should use central service discovery, and every service should be able to accept a metadata map along with the request payload. Known fields of this map can be interpreted and modified by a service and then passed on to the next service.
The most popular use of per-request metadata is request tracing (i.e. collecting the tree of nodes used to process a request, with timings for every node), but it can also be used to tell the entire system which nodes to use.
So the plan is (see the sketch after this list):
register your local node in the dev environment's service discovery
send a request to the entry node of your system, along with metadata telling everyone to use your local service instance instead of the default one
the metadata will propagate, your local node will be called by the dev environment, and the local node will then pass the processed results back to the dev environment
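A sketch of what the second step could look like from the caller's side. The X-Route-Override header is hypothetical: every service in the chain would have to copy it onto outgoing calls and consult it before falling back to service discovery:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    // Send a request into the dev environment with a (hypothetical) routing
    // override that downstream services propagate and honour.
    public class RoutingOverrideDemo {
        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://dev.example.com/api/orders")) // entry node (placeholder URL)
                    // Hypothetical metadata field: "for order-service, call my
                    // machine instead of the instance in service discovery".
                    .header("X-Route-Override", "order-service=http://my-laptop.dev.internal:8080")
                    .GET()
                    .build();
            HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.body());
        }
    }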
Alternatively:
use code generation for inter-service communication to reduce the risk of failures caused by mistakes in RPC code
resort to integration tests, mocking all client APIs for the microservice under development
fully automate deployment of your system to your local machine. You will possibly need to run nodes with reduced memory (which is generally OK, as memory is commonly consumed only under load) or buy more RAM.
An approach would be to use/deploy an app which maps paths/URLs to JSON response files. I personally haven't used it, but I believe http://wiremock.org/ might help you.
For Java microservices, you should try Stubby4j. It mocks the JSON responses of other microservices using a Stubby server. If you feel that mocking is not enough to cover all the features of your microservices, you should set up a local Docker environment to deploy the dependent microservices.
Question
Together with my friends from university, I'm making a web application, and we recently faced the following problem. The server is synchronized with a remote repository (git). Everyone can run the application locally and has their own local database on their machine. There is a database on the web host plugged into the application on the server. When someone wants to change something in the database, he writes an SQL script, pushes it to the repository, runs it locally, then runs it on the server and makes sure that every other developer executes it too. That seems very uncomfortable to us.
Bad idea
The solution would be to plug everyone into the same database, but IMHO this is a bad idea because:
We would need to buy another web host for SQL, because the one currently running serves worldwide users. For safety and testing reasons we would need a separate one.
Having a database that is visible to the world, protected only with a simple password, seems dangerous to me. The current database is configured to be visible only locally (locally relative to the server, of course), so generally it is visible to the web server, and to developers via SSH if needed.
Performance reasons. Connecting to a remote database instead of a local one would be over a dozen times slower; considering developer usage (more complicated queries, testing the site, lots of JUnit tests), it would be an incredibly painful solution.
Good idea
Some time ago I worked at a company where this problem was solved as follows. There was a Maven plugin configured to run each SQL script in a specified directory exactly once during the application build (mvn clean install), i.e. it remembered which scripts had already been executed and skipped them. Suppose someone wants to change something in the database, a new column for example. He writes a script and pushes it to the repository, and then he doesn't have to worry about anything: the script will be executed automatically for him, for the server, and for every other developer during the application build.
How to do it
Unfortunately, I can't find that plugin or its configuration. To be honest, I cannot find anything related to my problem on the web, which is surprising because it seems like a common problem to me. So, can I do this with some Maven plugin? Maybe there is a way to do it with the proper Spring configuration. In case I'm forced to do it manually (in Java, at application start), what tools do I need? Any advice or class patterns?
Looking forward to your help. Also, sorry for my English; I'm not a native speaker.
Just a guess, but maybe the company you worked for used Liquibase or Flyway.
In the case of Liquibase, which can be used via Maven as well, information can be found here: http://www.liquibase.org/, and specifically for the Maven integration here: http://www.liquibase.org/documentation/maven/index.html
Spring comes with a Liquibase integration as well; information can be found here: http://www.liquibase.org/documentation/spring.html, or, if you're using Spring Boot: https://docs.spring.io/spring-boot/docs/current/reference/html/howto-database-initialization.html
Another possible solution for database migration is Flyway; your entry point for the documentation: http://flywaydb.org/
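To give a feel for how Flyway maps to the workflow described in the question (each versioned script, e.g. V1__create_users.sql, runs exactly once per database; Flyway records applied versions in its own history table), here is a minimal sketch of its Java API. It can just as well be run from the Maven plugin during mvn install; the JDBC URL and credentials are placeholders:

    import org.flywaydb.core.Flyway;

    // Runs all pending SQL migrations found on the classpath
    // (db/migration by default) against the given database.
    public class Migrate {
        public static void main(String[] args) {
            Flyway flyway = Flyway.configure()
                    .dataSource("jdbc:mysql://localhost:3306/appdb", "app", "secret") // placeholders
                    .load();
            flyway.migrate(); // already-applied versions are skipped automatically
        }
    }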