How to read Tomcat JDBC Data Sources Resource Factory properties?

I have Tomcat 6 with a configured JNDI Tomcat JDBC Data Sources resource factory. My task is to verify that the connection pool has at least a certain maximum size.
(If it is smaller, I need to disable some functionality, or at least print a warning.)
But I don't know how to get access to that value.
The (Spring) application accesses the DataSource via JNDI, but that only yields the data source itself (org.apache.tomcat.dbcp.dbcp.BasicDataSource), whereas I think I need the factory (org.apache.tomcat.dbcp.dbcp.BasicDataSourceFactory), because only the factory knows the value.
So how can I read the org.apache.tomcat.dbcp.dbcp.BasicDataSourceFactory maxActive property from within an application?

Not the right thing to do, but if you insist: you can cast the DataSource to org.apache.tomcat.dbcp.dbcp.BasicDataSource and then call the getMaxActive method on it. The value from the resource definition is set on the factory, which then initializes the corresponding properties of the DataSource it creates. Note that the DataSource object returned by the lookup might not be an instance of BasicDataSource; it might get wrapped.
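If compiling against Tomcat's repackaged DBCP classes is inconvenient, the same check can be done reflectively. This is only a sketch: the getter name "getMaxActive" matches BasicDataSource, and the -1 convention for "not the expected pool" is my own.

```java
import java.lang.reflect.Method;

// Reflection-based sketch: reads an int bean property (e.g. "getMaxActive")
// without a compile-time dependency on Tomcat's repackaged DBCP classes.
class PoolIntrospector {

    // Returns the value of the named getter, or -1 when the object
    // does not expose it (e.g. the lookup returned a wrapper).
    static int readIntProperty(Object dataSource, String getterName) {
        try {
            Method getter = dataSource.getClass().getMethod(getterName);
            getter.setAccessible(true); // allow anonymous/non-public implementations
            return ((Number) getter.invoke(dataSource)).intValue();
        } catch (ReflectiveOperationException e) {
            return -1;
        }
    }
}
```

With the DataSource obtained from the JNDI lookup, readIntProperty(ds, "getMaxActive") would return the configured pool size, or -1 when the lookup returned a wrapper that does not expose the getter.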

Related

Does anyone know if EclipseLink supports database per tenant?

We are currently using EclipseLink as our JPA implementation for new code. The old code implemented multitenancy as follows:
Every customer (tenant) must have its own database, i.e. separate JDBC resources.
Information about the selected tenant is carried across method invocations in a ThreadLocal TENANT variable, declared in the com.aaa.Instance class.
JDBC resources are handled by com.aaa.TenantDataSource. During initialization, TenantDataSource looks for DataSources registered under the java:comp/env/jdbc JNDI context and checks whether their names start with 'abc_'.
References to these DataSources are stored in an internal map, keyed by the tenant code taken from the JNDI resource name (a DataSource named 'abc_xx' is registered as the DataSource for the 'xx' instance).
When getConnection is invoked, TenantDataSource examines the value of Instance.TENANT to select the appropriate DataSource.
TenantDataSource falls back to a default DataSource if Instance.TENANT is null; this could be the default behavior for single-tenant deployments.
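The routing described above can be sketched in plain Java. This is only an illustration of the ThreadLocal mechanism: TenantRouter is a hypothetical stand-in for com.aaa.TenantDataSource, and plain JDBC URLs stand in for the real DataSources.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of the ThreadLocal-based routing described above.
// Real code would store javax.sql.DataSource instances looked up from
// java:comp/env/jdbc; here plain JDBC URLs stand in for them.
class TenantRouter {

    // Mirrors the Instance.TENANT variable from the question.
    static final ThreadLocal<String> TENANT = new ThreadLocal<>();

    private final Map<String, String> dataSourcesByTenant = new HashMap<>();
    private final String defaultDataSource;

    TenantRouter(String defaultDataSource) {
        this.defaultDataSource = defaultDataSource;
    }

    // A JNDI resource named 'abc_xx' would be registered here under 'xx'.
    void register(String tenantCode, String dataSource) {
        dataSourcesByTenant.put(tenantCode, dataSource);
    }

    // Mirrors TenantDataSource.getConnection(): pick the DataSource for the
    // current tenant, falling back to the default when no tenant is set.
    String current() {
        String tenant = TENANT.get();
        return tenant == null
                ? defaultDataSource
                : dataSourcesByTenant.get(tenant);
    }
}
```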
Now that we are using EclipseLink, I googled to see whether I can implement multitenancy as required. It seems EclipseLink can do:
* Single-table multi-tenancy
* Table-per-tenant multi-tenancy
* VPD (Virtual Private Database) multi-tenancy
unlike Hibernate, which also supports database-per-tenant.
Could someone tell me if my research is right?
Thanks

Spring Boot: setting a PostgreSQL run-time parameter when database connection is open

I am looking for the right way to set a run-time parameter when a database connection is opened. My run-time parameter is actually a time zone, but I think this should work for any parameter.
I've found the following solutions, but I feel like none of them is the right one.
JdbcInterceptor
Because Spring Boot uses the Apache Tomcat connection pool by default, I can use org.apache.tomcat.jdbc.pool.JdbcInterceptor to intercept connections.
However, I don't think this interceptor provides a reliable way to execute a statement when a connection is opened. The ability to intercept every statement that this interceptor provides is unnecessary for setting a parameter that should be set only once.
initSQL property
Apache's pooled connection has a built-in ability to initialise itself with a statement provided by the PoolProperties.initSQL parameter. This is executed in the ConnectionPool.createConnection(...) method.
Unfortunately, official support for this parameter has been removed from Spring Boot, and no equivalent functionality has been introduced since.
I mean, I can still use a datasource builder as in the example below, and then hack the property into the connection pool, but this is not a good-looking solution.
// Thanks to the property binders used while creating the custom datasource,
// the datasource.initSQL parameter will be passed to the underlying connection pool.
@Bean
@ConfigurationProperties(prefix = "datasource")
public DataSource dataSource() {
    return DataSourceBuilder.create().build();
}
Update
I was testing this in a Spring Boot 1.x application. The statements above are no longer valid for Spring Boot 2 applications, because:
The default Tomcat datasource was replaced by Hikari, which supports the spring.datasource.hikari.connection-init-sql property. Its documentation says: "Get the SQL string that will be executed on all new connections when they are created, before they are added to the pool."
It seems that a similar property was reintroduced for the Tomcat datasource as spring.datasource.tomcat.init-s-q-l.
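For reference, in a Spring Boot 2 application.properties these could look roughly like the following; the SET TIME ZONE statement is just an example and is PostgreSQL-specific:

```properties
# HikariCP (the Spring Boot 2 default pool)
spring.datasource.hikari.connection-init-sql=SET TIME ZONE 'UTC'

# Tomcat JDBC pool, if explicitly selected instead of Hikari
spring.datasource.tomcat.init-s-q-l=SET TIME ZONE 'UTC'
```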
ConnectionPreparer & AOP
This is not an actual solution; it is more of an inspiration. The connection preparer was a mechanism used to initialise Oracle connections in the Spring Data JDBC Extensions project. It has its own problems and is no longer maintained, but it could possibly serve as a base for a similar solution.
If your parameter is actually a time zone, why don't you find a way to set that specific property?
For example, if you want to store or read a DateTime with a predefined time zone, the right way to do this is to set the hibernate.jdbc.time_zone property on the Hibernate entity manager, or spring.jpa.properties.hibernate.jdbc.time_zone in application.properties.
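In application.properties that could look like this (UTC is an example value):

```properties
# Hibernate converts dates to/from this zone when binding JDBC parameters
spring.jpa.properties.hibernate.jdbc.time_zone=UTC
```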

Spring: Define AbstractRoutingDataSource whose targetDataSources are filled from a separate (H2) DataSource

I am trying to set up a spring/spring-boot project which has an AbstractRoutingDataSource whose targetDataSources should be filled from a separate DataSource (actually an embedded H2 DataSource).
I have tried many different things (configuring multiple EntityManagerFactories and TransactionManagers), but I always either get a circular reference, or the repository bean which should provide the details for my targetDataSources is null when it is autowired into the RoutingDataSource.
The problem seems to be that I can't use any DataSource dependency during the initialization of my RoutingDataSource, because it is a DataSource itself and is therefore created before any repository beans.
Can you give me a hint what an approach would look like to configure:
A H2-Datasource which is initialized first
A Repository for this (and only this) H2-DataSource
A RoutingDataSource which depends on the Repository and loads its targetDataSources from it
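One way around this chicken-and-egg situation is to defer the repository access until the routing datasource is first used, rather than at construction time. A plain-Java sketch of the idea (the Supplier stands in for a call to the repository, and plain strings stand in for the target DataSources):

```java
import java.util.Map;
import java.util.function.Supplier;

// Sketch: the routing datasource does not touch its target-datasource
// repository at construction time; it asks for the targets lazily, on
// first use, after the H2 DataSource and the repository bean exist.
class LazyRoutingTargets {

    private final Supplier<Map<String, String>> targetLoader;
    private volatile Map<String, String> targets; // loaded on first access

    LazyRoutingTargets(Supplier<Map<String, String>> targetLoader) {
        this.targetLoader = targetLoader;
    }

    String resolve(String lookupKey) {
        Map<String, String> t = targets;
        if (t == null) {
            synchronized (this) {
                if (targets == null) {
                    targets = targetLoader.get(); // e.g. repository.findAll()
                }
                t = targets;
            }
        }
        return t.get(lookupKey);
    }
}
```

In a real AbstractRoutingDataSource subclass, the same lazy load would sit behind determineCurrentLookupKey()/determineTargetDataSource(), so the repository is only consulted once the application context is fully started.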

How to dynamically manage multiple datasources

Similar topics have been covered in other threads, but I couldn't find a definitive solution to my problem.
What we're trying to achieve is to design a web app which is able to:
read a datasource configuration at startup (an XML file containing multiple datasource definitions, which is placed outside the WAR file and is not the application-context or Hibernate configuration file)
create a session factory for each one of them (considering that each datasource is a database with a different schema)
switch at runtime to different datasources based on user input (users can select which datasource they want to use)
provide the correct DAO object to manage user requests.
At the moment we have a DAO Manager object which is able to read the datasource configuration file and instantiate multiple session factories, saving them in a map. Each session factory is created with a configuration containing the proper hibernate mapping classes (different for each database schema). Moreover we have multiple DAO interfaces with their implementations, used to access "their database".
At this point we would need a way to get from the DAO Manager a specific DAO object, containing the right session factory attached, all based on the user request (basically a call from the above service containing the datasource id or a custom datasource object).
Ideally the service layer should use the DAO Manager to get a DAO object based on the datasource id (for instance), without worrying about its actual implementation: the DAO Manager would take care of it, by creating the correct DAO object and injecting into it the right session factory, based on the datasource id.
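The lookup described above can be sketched without Hibernate; SessionFactory and UserDao here are hypothetical stand-ins for the real session factories and DAO interfaces:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-ins for the real Hibernate SessionFactory and a DAO
// interface; only the id-based lookup/wiring is sketched here.
interface SessionFactory { String name(); }
interface UserDao { SessionFactory sessionFactory(); }

class DaoManager {
    private final Map<String, SessionFactory> factoriesById = new HashMap<>();

    void register(String dataSourceId, SessionFactory factory) {
        factoriesById.put(dataSourceId, factory);
    }

    // The service layer asks for a DAO by datasource id; the manager wires
    // the matching session factory into the DAO it hands back.
    UserDao userDao(String dataSourceId) {
        SessionFactory sf = factoriesById.get(dataSourceId);
        if (sf == null) {
            throw new IllegalArgumentException("Unknown datasource: " + dataSourceId);
        }
        return () -> sf;
    }
}
```

The service layer then only calls manager.userDao(dataSourceId) and never sees which session factory was injected.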
My questions are:
Is this a good approach to follow?
How can I use Spring to dynamically inject in the DAO Manager multiple DAO implementations for each DAO interface?
Once the session factories are created, is there a way to let Spring handle them, as I would normally do with dependency injection inside the application-context.xml?
Would the 2nd Level Cache still work for each Session Factory?
Is this a good approach to follow?
It's probably the only possible approach. So, yes.
How can I use Spring to dynamically inject in the DAO Manager multiple DAO implementations for each DAO interface?
Dynamically? I thought you wanted to do it at startup time. If so, just provide an accessor with a list or array:
public void setMyDaos(List<MyDao> daos) {
    this.daos = daos;
}
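In application-context.xml that list could be wired like this (the class and bean names are examples):

```xml
<bean id="daoManager" class="com.example.DaoManager">
    <property name="myDaos">
        <list>
            <ref bean="customerDao"/>
            <ref bean="orderDao"/>
        </list>
    </property>
</bean>
```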
Once the session factories are created, is there a way to let Spring handle them, as I would normally do with dependency injection inside the application-context.xml?
This one's tough. I'd say you will probably have to store your sessionFactory bean in scope=session.

JNDI ClassCastException

I am attempting to use JNDI with a custom DataSource called CEDataSource. From my understanding, for this to work I would have to create a custom factory as well.
So I created a custom factory that returns the CEDataSource object, but now when I attempt to use it in Java with
Context initCtx = new InitialContext();
Context envCtx = (Context) initCtx.lookup("java:comp/env");
// Look up our data source
CEDataSource ds = (CEDataSource) envCtx.lookup("jdbc/cePu");
I get a ClassCastException: "CEDataSource cannot be mapped to CEDataSource". I added CEDataSource and CEDataSourceFactory to the TOMCAT/lib folder, and also referenced the same jar from my deployed application.
Any help on why this error may occur would be greatly appreciated. Thanks
"CEDataSource cannot be mapped to CEDataSource" seems to point to the fact that it's not the same "CEDataSource" in both places.
What could be different is the classloader and this usually happens if you have the same jars/.class(es) in multiple locations.
Do you have multiple copies of your jar?
Try to have a single copy, maybe in the shared tomcat lib so it's loaded by the same classloader no matter from where you access it from.
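A quick way to confirm the two-classloaders theory is to log where each side's class actually comes from before casting. A small helper, just a sketch:

```java
// Describes a class by name, classloader, and the location it was loaded
// from; two "identical" classes loaded by different classloaders will
// show different loaders/locations here.
class ClassOrigin {

    static String describe(Class<?> type) {
        ClassLoader loader = type.getClassLoader(); // null for bootstrap classes
        Object location = type.getProtectionDomain().getCodeSource() == null
                ? "bootstrap/unknown"
                : type.getProtectionDomain().getCodeSource().getLocation();
        return type.getName() + " loaded by " + loader + " from " + location;
    }
}
```

Printing ClassOrigin.describe(envCtx.lookup("jdbc/cePu").getClass()) next to ClassOrigin.describe(CEDataSource.class) from the webapp will show whether the loaders or jar locations differ.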
It is actually not too difficult to start Tomcat under an Eclipse debug session (just put Bootstrap.jar in a project and add the system properties in the JVM parameters). I've done that many times, if only to dissect the bowels of that feline. Once this is done, you can break on the ClassCastException in the JNDI connection factory, and you will then be able to see whether your factory is called or not.
From what I remember, Tomcat uses the DBCP DataSource, actually repackaged under org.apache.tomcat.dbcp.dbcp (IIRC).
So I would not be surprised if this is what you end up with as a result of your lookup.
With hindsight, I now realize I also forgot to mention that if any underlying class (for instance a JDBC driver) needed to create the instance of your CEDataSource is missing, you can also get this ClassCastException. Fair enough, but you always focus on the class itself and not on the other jars...
CEDataSource ds = (CEDataSource)envCtx.lookup("jdbc/cePu");
The lookup you are doing on jdbc/cePu is not of class type CEDataSource; it belongs to some other class type, which is why you are getting the ClassCastException. If you could show the configuration for jdbc/cePu, that would be helpful.
