I have three web projects that use the same database and the same models. These systems need partly the same bootstrap data in the database in order to run properly. All systems share library code that reads the data from the database and updates it to match the bootstrap data in the code (add new, remove unused, update changed). Every application performs this step when it starts, and most of the time nothing needs to be done since the data is already correct. This data is also used by some of the integration tests.
The problem is that when some of the common data needs to change, all three applications have to be redeployed with the new bootstrap data; otherwise, if they are restarted (after a server reboot, for example), they will bootstrap with the old data.
I'm looking for the best way to manage shared bootstrap data for multiple projects.
You could create a plugin that contains a service that does what you need and include the plugin in all projects. Then simply call the plugin service within each bootstrap.
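As a rough illustration of what such a shared plugin service could do, here is a minimal, in-memory sketch of the sync step each application would run at startup. The class and method names are made up, and the "database" is just a map; real code would go through your shared models/ORM:

```java
import java.util.Map;

// Hypothetical sketch of the shared bootstrap service: given the bootstrap
// data defined in code and the rows currently in the database (both modelled
// here as key -> value maps), bring the current data in line with the code.
public class BootstrapSynchronizer {

    // Mutates `current` so it matches `bootstrap`: removes unused entries,
    // adds new ones, and updates changed values. Usually a no-op in practice
    // because the data is already correct.
    public static void sync(Map<String, String> current, Map<String, String> bootstrap) {
        current.keySet().retainAll(bootstrap.keySet()); // remove unused
        current.putAll(bootstrap);                      // add new + update changed
    }
}
```

Because the logic lives in one shared artifact, changing the bootstrap data still means releasing a new version of the plugin, but only that one artifact has to change.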
I have two modules; the database is their common point:
Module 1: collects information and stores it in the database
Module 2: performs further actions using the database results
But there is no connection between Module 1 and Module 2. My question is: how can the second module be triggered automatically when the database values are updated?
Is there any way to know whether the database has been updated?
You have three options:
Create a procedure in the database that listens for new updates
Store in the consuming module a flag with the last update seen, and check every t seconds whether any rows have changed
Create a messaging interface between the two services.
My comments:
On option 1: avoid it. It is a BAD SOLUTION! Logic buried in database procedures is difficult to track from the code.
On option 2: it can be a good solution if you know that your system will not grow.
On option 3: if you foresee more modules communicating with each other in the future, do that! Choose a good message queue (RabbitMQ, ActiveMQ, etc.) or use your cloud provider's offering (AWS has SQS).
Communication between two independent modules through the database is not a good solution, because if you change the database schema you will need two deployments. If you only need the data, you can use a messaging solution with Protobuf as the serializer.
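For completeness, here is a minimal sketch of option 2 (the polling flag). The "table" is modelled as a list of (last-updated timestamp, row) pairs purely for illustration; a real implementation would run a `WHERE updated_at > ?` query on a timer:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Sketch of option 2: the consuming module remembers the newest
// last-updated timestamp it has seen and, on each poll, picks up
// only the rows that are newer than that marker.
public class UpdatePoller {
    private long lastSeen = Long.MIN_VALUE;

    // Returns the rows updated since the previous poll and advances the marker.
    public List<String> pollNewRows(List<Map.Entry<Long, String>> table) {
        List<String> fresh = new ArrayList<>();
        long newest = lastSeen;
        for (Map.Entry<Long, String> row : table) {
            if (row.getKey() > lastSeen) {
                fresh.add(row.getValue());
                newest = Math.max(newest, row.getKey());
            }
        }
        lastSeen = newest;
        return fresh;
    }
}
```

This is the simplest of the three options to build, but note the trade-off named above: it only scales while the system stays small.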
I am going to start a new project using the Spring framework. As I don't have much experience with Spring, I need your help to sort out a few confusions.
Let's look at the use case.
My application uses Spring integration framework. The core functionality of my app is,
I need to poll multiple directories on the file system,
read the files (mostly CSV),
perform some operations on them, and insert them into the database.
Currently I have set up a Spring Integration flow for this: it has an inbound-channel-adapter for polling, the file then traverses the channels, and at the end it is inserted into the database.
My concerns are
The number of directories the application is supposed to poll is decided at runtime. Hence I need to create inbound-channel-adapters at runtime (as one channel adapter can poll only one directory) and can't define them statically in my Spring context XML (as I don't know how many I will need).
Each directory has certain properties which should be applied to the file while it goes through the integration flow.
So right now I am loading a new ClassPathXmlApplicationContext("/applicationContext.xml") for each directory, caching the required properties in that newly created context, and using them at processing time (in <int:service-activator>).
Drawbacks of the current design:
A separate context is created for each directory.
Unnecessary beans are duplicated (database session factories and the like).
So is there any way to design the application so that the context is not duplicated, while I can still use each directory's properties throughout the integration flow at the same time?
Thanks in advance.
See the dynamic FTP sample and the links in its README about creating child contexts when needed, containing the new inbound components.
Also see my answer to a similar question for multiple IMAP mail adapters using Java configuration and then a follow-up question.
You can also use a message source advice to reconfigure the FileReadingMessageSource on each poll to look at different directories. See Smart polling.
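Setting the Spring Integration specifics aside, the round-robin idea behind reconfiguring the message source on each poll can be sketched in plain Java. The class below is hypothetical and only illustrates pairing each runtime-discovered directory with its own properties, with one directory scanned per poll:

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.Properties;

// Illustrative round-robin poller: each poll scans the next directory in
// turn and returns the CSV files found together with that directory's
// processing properties, so one poller covers N runtime-configured dirs.
public class RotatingDirectoryPoller {
    private final List<Path> directories;
    private final Map<Path, Properties> propsByDir;
    private int next = 0;

    public RotatingDirectoryPoller(Map<Path, Properties> propsByDir) {
        this.directories = new ArrayList<>(propsByDir.keySet());
        this.propsByDir = propsByDir;
    }

    // One poll: scan the next directory and return (its properties, its files).
    public Map.Entry<Properties, List<Path>> poll() throws IOException {
        Path dir = directories.get(next);
        next = (next + 1) % directories.size();
        List<Path> files = new ArrayList<>();
        try (DirectoryStream<Path> stream = Files.newDirectoryStream(dir, "*.csv")) {
            for (Path p : stream) files.add(p);
        }
        return Map.entry(propsByDir.get(dir), files);
    }
}
```

In the Spring Integration version, the rotation would happen inside the advice around the FileReadingMessageSource rather than in your own loop, but the state you carry is the same: the directory list and the per-directory properties.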
I am creating a library (a Java jar file) to provide the solution to a problem. The library is mainly targeted at web applications (Java EE applications) and can be used with Spring and other frameworks.
The targeted Java EE application will be deployed in a clustered environment. Users will use the library by adding it to the application classpath.
The library depends on some configuration which is packaged inside the library (jar) itself and used at runtime.
The configuration can be modified at runtime.
As the library is targeted at clustered environments, any modification to the configuration must be replicated to all nodes of the cluster.
As I understand it, there are two ways to hold configuration for use at runtime (I am not sure; correct me if I am wrong):
1. Store the configuration in a file
2. Store the configuration in a database
In the first approach (store the configuration in a file):
There will be a property file in the library holding the initial configuration.
At server startup, the configuration from the property file is copied to a file (abc.xml) at a physical location on the server.
There will be a set of APIs to perform CRUD actions on the abc.xml file in the user's home location.
From then on, the abc.xml file is always used.
Holding the data this way is possible, but in a clustered environment I don't see how a modification would reach all the nodes of the cluster.
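The file-based flow described above (seed an external file from the defaults packaged in the jar, then CRUD against the external copy) might look roughly like this. File names and keys are assumptions, and a .properties file stands in for the abc.xml from the question to keep the sketch short:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;

// Sketch of approach 1: copy the packaged defaults to an external file once,
// at startup, then read and write the external copy through a small API.
public class FileConfigStore {
    private final Path externalFile;

    public FileConfigStore(Path externalFile) {
        this.externalFile = externalFile;
    }

    // Seed the external location from the defaults shipped inside the jar
    // (e.g. getClass().getResourceAsStream("/defaults.properties")).
    public void seedFromDefaults(InputStream packagedDefaults) throws IOException {
        if (Files.notExists(externalFile)) {
            Files.copy(packagedDefaults, externalFile);
        }
    }

    public Properties load() throws IOException {
        Properties props = new Properties();
        try (InputStream in = Files.newInputStream(externalFile)) {
            props.load(in);
        }
        return props;
    }

    // One of the CRUD operations: create or update a single entry.
    public void put(String key, String value) throws IOException {
        Properties props = load();
        props.setProperty(key, value);
        try (OutputStream out = Files.newOutputStream(externalFile)) {
            props.store(out, "library configuration");
        }
    }
}
```

This addresses a single node only; as noted above, the open problem with this approach is propagating a change to the other nodes of the cluster.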
In the second approach (store the configuration in a database table):
When the toolkit (jar file) is published, SQL table-creation queries are published with it.
Users have to create the tables using those queries.
There will be a property file in the library holding the initial configuration.
At server startup, the configuration from the property file is copied to the database.
There will be a set of APIs to perform CRUD actions on the database.
Whenever the configuration is modified, all nodes of the cluster can be updated with the latest data using some third-party tool (Hazelcast or anything else).
In my analysis I found that Quartz uses the database approach to hold its configuration.
So when you download the Quartz distribution, it also contains the SQL queries to create the required tables in the database, which Quartz itself then uses.
I want to know the standard design practices for holding configuration in a library (jar), and the factors that need to be considered in such cases.
There are other solutions as well. Use a cluster-aware caching technology like EhCache, Apache JCS, or Hazelcast, and use the cache API to retrieve the configuration data from the library. You could add a listener within your library that polls the configuration file and updates the cache.
If you are planning to use solution 1, you could set up a listener within your library which watches the configuration file and updates the server copy whenever there is a change. The same goes for solution 2, but if I were in your situation I would rather use a caching technology for frequently accessed data (configuration). The advantage is that I would not have to update the configuration on every node, because the cache replicates itself.
I know this is a very subjective question, but I will try to make it as specific as possible. The different classes of data are:
Installation/Initialisation Data (e.g. path of installation, serial number, server port, server address...)
Application/Purpose Data (data which the application was built to work with)
Configuration Data (settings like colors, certain thresholds for calculations, language, ...)
The installation data has to be saved in an ini file because it should be editable without code and without accessing the database.
The application data is stored in a database.
The question now is: should I store the configuration data in the same database (with Hibernate) and mix the different classes of data, or save the config in a separate file (which works with the Preferences API) and keep the classes apart?
The project in question is a web-server running multiple modules which should all have access to the same configuration.
If multiple modules access the same config, it is better to put it in a database (assuming it's transactional), so that it can be updated easily. If it is kept on the file system, you need some mechanism to make read-write operations safe, like semaphores.
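If the config does stay on the file system, one concrete safety mechanism is an OS-level file lock. A minimal sketch using java.nio file locks (advisory, but honored across cooperating processes); a database would give you the same guarantee via transactions:

```java
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.channels.FileLock;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

// Writes the config file while holding an exclusive file lock, so two
// processes rewriting the shared config cannot interleave their writes.
public class LockedConfigWriter {
    public static void write(Path config, String content) throws Exception {
        try (FileChannel channel = FileChannel.open(config,
                StandardOpenOption.CREATE, StandardOpenOption.WRITE,
                StandardOpenOption.TRUNCATE_EXISTING)) {
            try (FileLock lock = channel.lock()) { // blocks until exclusive
                channel.write(ByteBuffer.wrap(content.getBytes("UTF-8")));
            }
        }
    }
}
```

Readers would take a shared lock (`channel.lock(0, Long.MAX_VALUE, true)`) in the same way before reading.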
I'm looking for a Java web container (like Jetty or Tomcat), or a tool, in which I can create and remove server instances through a management console.
The problem is that my organization needs to create different instances of a test server for quality-control testing (against different database configurations). Currently I have to manually copy a Tomcat "catalina_base" template directory and make any changes needed for the test being run. It would be nice to have a unified interface where I could click a button to create a new instance (and click another to remove it).
Edit 1
Must be able to run on Windows Server 2003.
Edit 2
I'm getting a lot of answers that have to do with builds, so I'm going to add some extra information about the application. The application is a standard Java EE web application built with an Ant script. We use a container-managed JNDI DataSource for connecting to the database. Our procedures, which are holdovers from 20+ years ago, dictate that every new database change goes in a new schema.
So, say a customer reports that our application displays a calculation wrong: the first thing we do is create a new database schema, then run the create scripts for all of the database objects, and lastly copy the data from production into that new schema for testing. When we've fixed the bug (either on the application side or the database side), our quality-control person needs the fixed application, with the schema in the DataSource changed to that of the new test environment. When they've completed their testing, we stage the code for inclusion in the next scheduled release.
The catch is that this process is multiplied by the number of developers and the number of concurrent bug fixes and features. Currently there are 20+ Tomcat instances managing different versions of the application and database objects, and I'm constantly having to create new instances and remove old ones as features are added and quality control completes its work.
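Until a better tool is in place, the manual template-copy step above can at least be scripted. A hedged sketch for a Unix-like shell (the directory layout and the @PORT@ placeholder in the template's server.xml are assumptions, not taken from the question; on Windows Server 2003 the same idea would be a batch script):

```shell
#!/bin/sh
# Hypothetical instance cloner: copy the catalina_base template into a new
# instance directory and patch the HTTP port so instances do not collide.
new_instance() {
  template="$1"; instance="$2"; port="$3"
  cp -r "$template" "$instance"
  # The template's conf/server.xml is assumed to carry a @PORT@ placeholder.
  sed -i "s/@PORT@/$port/" "$instance/conf/server.xml"
}
```

Removing an instance is then just deleting its directory once its Tomcat process is stopped.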
It sounds like what you really need is a build-and-deployment tool like Continuum.
You can do so with Jetty. You can create your own Java class that starts an embedded server with the specified configuration and run it from the prompt or through some web interface.
You can check this out:
http://docs.codehaus.org/display/JETTY/Embedding+Jetty
Have you thought about a configuration management tool like Chef?
It sounds like you should just update your application's build script to accept different parameters for things like "use test1.properties for database settings" or "use prod2.properties", and then deploy the rebuilt application.