Reading 2 property files having same variable names in Spring - java

I am reading property files with the following entry in my Spring XML:
<context:property-placeholder
location="classpath:resources/database1.properties,
classpath:resources/licence.properties"/>
I am injecting these values into variables using XML entries or the @Value annotation.
<bean id="myClass" class="MyClass">
<property name="driverClassName" value="${database.driver}" />
<property name="url" value="${database.url}" />
<property name="name" value="${database.name}" />
</bean>
I want to add a new property file (database2.properties) which shares some variable names with database1.properties.
database1.properties:
database.driver=com.mysql.jdbc.Driver
database.url=jdbc:mysql://192.168.1.10/
database.name=dbname
database2.properties:
database.url=jdbc:mysql://192.168.1.50/
database.name=anotherdbname
database.user=sampleuser
As you can see, some properties, like database.url and database.name, have the same name in both files.
Is it possible to inject database.url of database2.properties?
Or do I have to change the variable names?
Thank you.

You can do it by configuring two PropertyPlaceholderConfigurer beans. Usually there's only one instance that serves out all the properties; however, if you change the placeholderPrefix you can use two instances, something like:
<bean id="firstPropertyGroup" class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
<property name="locations" value="classpath:resources/database1.properties,
classpath:resources/licence.properties" />
<property name="placeholderPrefix" value="${db1."/>
</bean>
<bean id="secondPropertyGroup" class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
<property name="locations" value="classpath:resources/database2.properties" />
<property name="placeholderPrefix" value="${db2."/>
</bean>
Then you would access your properties like ${db1.database.url} or ${db2.database.url}.
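With both configurers in place, a consuming bean can select either file by prefix. A minimal sketch (the class name is hypothetical; it assumes the two configurers above are registered and component scanning is enabled):

```java
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;

// Hypothetical consumer: the same key resolves from different files
// depending on the placeholder prefix.
@Component
public class DualDatabaseSettings {

    @Value("${db1.database.url}")   // from database1.properties
    private String primaryUrl;

    @Value("${db2.database.url}")   // from database2.properties
    private String secondaryUrl;
}
```

Each configurer only processes placeholders starting with its own prefix, so the two groups don't clash even though the underlying keys are identical.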

There might be a solution similar to what you want to achieve. Check the second answer to this question: Multiple properties access. It basically explains how to access the properties of the second file using another expression, defined by you.
Otherwise, the simplest solution would be just changing the key values (the variable names).

You will sooner or later switch to Spring Boot. With Spring Boot you can have a POJO like this:
public class Database {
@NotBlank
private String driver;
@NotBlank
private String url;
@NotBlank
private String dbname;
public String getDriver() {
return driver;
}
public void setDriver(String driver) {
this.driver = driver;
}
public String getUrl() {
return url;
}
public void setUrl(String url) {
this.url = url;
}
public String getDbname() {
return dbname;
}
public void setDbname(String dbname) {
this.dbname = dbname;
}
}
and use @ConfigurationProperties to fill it:
@Bean
@ConfigurationProperties(locations = "classpath:database1.properties", prefix = "database")
public Database database1() {
return new Database();
}
@Bean
@ConfigurationProperties(locations = "classpath:database2.properties", prefix = "database")
public Database database2() {
return new Database();
}
The downside of this is that it's mutable. With the Lombok library, you can eliminate the nasty getters and setters.
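For completeness, a sketch of how the two beans above might be injected side by side (DatabaseHealthCheck is a hypothetical consumer; the bean names database1 and database2 come from the @Bean method names):

```java
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.stereotype.Service;

// Hypothetical consumer of the two Database beans defined above.
@Service
public class DatabaseHealthCheck {

    private final Database primary;
    private final Database secondary;

    // Both beans share the type Database, so disambiguate by bean name.
    public DatabaseHealthCheck(@Qualifier("database1") Database primary,
                               @Qualifier("database2") Database secondary) {
        this.primary = primary;
        this.secondary = secondary;
    }
}
```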


How to define a bean in ApplicationContext.xml that has no constructor?

I have a class
public class DataStore {
public String name;
public String username;
public String password;
public String token;
public String connectionString;
public String type;
public String scheme;
public boolean usesBasicAuth;
public boolean usesBearerAuth;
}
I need to create a bean for it in another project, but I need to fill the fields somehow. The problem is I cannot use <constructor-arg ... /> because there is no constructor.
The code below results in BeanCreationException: "Could not resolve matching constructor"
<bean id="dataStore"
class="com.fressnapf.sdk.dataaccess.services.DataStore">
<constructor-arg index="0" value="${spring.datastore.name}"/>
...
</bean>
Assuming you have public properties (getters and setters) and only the default (no-args) constructor, you can change your configuration to:
<bean id="dataStore" class="com.fressnapf.sdk.dataaccess.services.DataStore">
<property name="connectionString" value="..."/>
<!-- ... -->
</bean>
Using property instead of constructor-arg.
Docs (Spring 4.2): https://docs.spring.io/spring/docs/4.2.x/spring-framework-reference/html/xsd-configuration.html
Yes, there is a constructor.
If you don't explicitly declare one, Java automatically adds a default (no-argument) constructor.
Use that one.
It will set all the instance variables to the default value of their type.
In your case: null for every String variable, false for every boolean.
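A quick stand-alone illustration of that behavior (DataStoreDemo is a trimmed-down stand-in for the DataStore class above):

```java
// No constructor is declared, so the compiler generates a no-args one
// that leaves every field at its type's default value.
public class DataStoreDemo {
    public String name;           // defaults to null
    public boolean usesBasicAuth; // defaults to false

    public static void main(String[] args) {
        DataStoreDemo ds = new DataStoreDemo(); // compiler-generated constructor
        System.out.println(ds.name);            // prints "null"
        System.out.println(ds.usesBasicAuth);   // prints "false"
    }
}
```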

SpringBatch : dynamic datasource values

I have found this subject that answered what I was looking for:
how to pass values dynamically in config file
The thing is, when I try it, I get an exception:
Error creating bean with name 'jobOperator' defined in class path resource [atlentic-Spring-Batch-common.xml]: Cannot resolve reference to bean 'jobExplorer' while setting bean property 'jobExplorer' [...]
Error creating bean with name 'connex' defined in class path resource [batch-calendar-context.xml]: Error setting property values;[...] Bean property 'dataSource' is not writable or has an invalid setter method. Does the parameter type of the setter match the return type of the getter?
I'm trying to read a .ini file containing DB info, then inject those values into my XML datasource config.
Here is my xml,
<beans:bean id="dataSource"
class="org.springframework.jdbc.datasource.DriverManagerDataSource" >
<beans:property name="driverClassName" value="${DB_DRIVER}" />
<beans:property name="url"
value="${DB_PROTOCOL}:@${DB_HOST}:${DB_PORT}:${DB_NAME}" />
<beans:property name="username" value="#{connex.user}" />
<beans:property name="password" value="#{connex.pass}" />
</beans:bean>
<beans:bean id="connex" class="com.sponge.bob.calendar.entity.CustomConnexion">
<beans:property name="dataSource" ref="dataSource" />
</beans:bean>
Then my CustomConnexion class, where I use the constructor to initialize my attributes (it is not pretty, but I'm starting with Spring Batch):
@Component
@Scope("step")
public class CustomConnexion {
public String user;
public String pass;
public String base;
@Autowired
private static final Logger LOGGER = LoggerFactory.getLogger(CustomConnexion.class);
public CustomConnexion() {
initConnexion();
}
public void initConnexion() {
IniReader reader = new IniReader();
setUser(reader.getProperty(Constants.MYCOMMON, Constants.USER));
setBase(reader.getProperty(Constants.MYCOMMON, Constants.BASE));
setPass(reader.getProperty(Constants.MYCOMMON, Constants.PASS));
}
/* getters and setters after this line (not printed here but they have the default name */
}
Is it possible to get this password and user dynamically this way? I'm beginning to lose my mind.
Deinum, thank you for your answer! I tried to use UserCredentialsDataSourceAdapter, but I didn't manage to make it work. However, your observation about the scope made me retry something I had tried before writing this post.
Finally I used this:
<beans:bean id="connex" class="com.sponge.bob.calendar.entity.CustomConnexion">
</beans:bean>
<beans:bean id="dataSource"
class="org.springframework.jdbc.datasource.DriverManagerDataSource" >
<beans:property name="driverClassName" value="${DB_DRIVER}" />
<beans:property name="url" value="${DB_PROTOCOL}:@${DB_HOST}:${DB_PORT}:${DB_NAME}" />
<beans:property name="username" value="#{connex.user}"/>
<beans:property name="password" value="#{connex.pass}"/>
</beans:bean>
and
@Component
@Scope("singleton") // <-- I changed this (it was "step" before)
public class CustomConnexion {
public String user;
public String pass;
public String base;
@Autowired
private static final Logger LOGGER = LoggerFactory.getLogger(CustomConnexion.class);
public CustomConnexion() {
initConnexion();
}
public void initConnexion() {
IniReader reader = new IniReader();
setUser(reader.getProperty(Constants.MYCOMMON, Constants.USER));
setBase(reader.getProperty(Constants.MYCOMMON, Constants.BASE));
setPass(reader.getProperty(Constants.MYCOMMON, Constants.PASS));
}
/* getters and setters after this line (not printed here but they have the default name */
}
my IniReader() just parses the .ini file
I think you are getting username and password as null.
Remove the call to initConnexion() from the constructor, and add the @PostConstruct annotation on top of initConnexion().
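A sketch of the suggested change, reusing the IniReader and Constants from the question (assignments go straight to the public fields to keep the sketch short):

```java
import javax.annotation.PostConstruct;
import org.springframework.context.annotation.Scope;
import org.springframework.stereotype.Component;

@Component
@Scope("singleton")
public class CustomConnexion {
    public String user;
    public String pass;
    public String base;

    public CustomConnexion() {
        // intentionally empty: don't read the .ini here
    }

    @PostConstruct // runs once Spring has fully constructed the bean
    public void initConnexion() {
        IniReader reader = new IniReader();
        user = reader.getProperty(Constants.MYCOMMON, Constants.USER);
        base = reader.getProperty(Constants.MYCOMMON, Constants.BASE);
        pass = reader.getProperty(Constants.MYCOMMON, Constants.PASS);
    }
}
```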

How to add column alias names to output csv file spring batch

I need to add the aliases defined in the SQL query while generating the CSV file.
I see some examples using FlatFileHeaderCallback, but there I don't have a way to pass the aliases.
Is there any way to get the column aliases in the write(List<? extends T> items) method of FlatFileItemWriter?
For starters, I think you could simply use a custom FlatFileHeaderCallback which takes a String as a parameter and writes it:
public class CustomHeaderWriter implements FlatFileHeaderCallback {
private String header;
@Override
public void writeHeader(Writer writer) throws IOException {
writer.write(header);
}
public void setHeader(String header) {
this.header = header;
}
}
To use it, declare it in your FlatFileItemWriter and give it a String containing the names of your columns/aliases separated by your flat-file delimiter:
<bean class="org.springframework.batch.item.file.FlatFileItemWriter" scope="step">
<property name="headerCallback">
<bean class="xx.xx.xx.CustomHeaderWriter">
<property name="header" value="${columns.or.aliases}"></property>
</bean>
</property>
</bean>
Now, I suppose you don't want to write the columns/aliases a second time for the header, and would like to extract them from the SQL query. This could be accomplished, for example, by fiddling with the CustomHeaderWriter:
Instead of passing the columns/aliases directly, you could give it the actual SQL query
Using a regular expression or manual parsing, you could then extract the aliases or column names (the strings between SELECT and FROM, split on commas, with quotes stripped, etc.)
You would then need to pass the delimiter of the FlatFileItemWriter (or use a constant)
Finally, write the String you just built
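The steps above can be sketched roughly as follows (the class and regex are mine, not part of Spring Batch; it assumes a flat query with no sub-selects or commas inside function calls):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Builds a header line from a simple "SELECT ... FROM ..." query by
// keeping, for each column, its alias if present, else the column name.
public class SqlHeaderExtractor {

    public static String extractHeader(String sql, String delimiter) {
        Matcher m = Pattern.compile("(?is)select\\s+(.*?)\\s+from\\s").matcher(sql);
        if (!m.find()) {
            throw new IllegalArgumentException("Not a simple SELECT query: " + sql);
        }
        List<String> names = new ArrayList<>();
        for (String column : m.group(1).split(",")) {
            // split on "AS" or plain whitespace; the last token is the alias
            String[] parts = column.trim().split("(?i)\\s+as\\s+|\\s+");
            names.add(parts[parts.length - 1].replaceAll("[\"'`]", ""));
        }
        return String.join(delimiter, names);
    }

    public static void main(String[] args) {
        System.out.println(extractHeader(
            "SELECT a.id AS user_id, a.name FROM users a", ","));
        // prints "user_id,a.name"
    }
}
```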
Create a custom class (assuming 5 CSV columns):
public class MyFlatFileWriter implements FlatFileHeaderCallback {
@Override
public void writeHeader(Writer writer) throws IOException {
writer.write("Col1,Col2,Col3,Col4,Col5");
}
}
Add the bean and a reference to it in the writer bean:
<bean id="flatFileWriter" class="org.springframework.batch.item.file.FlatFileItemWriter">
<property name="resource" value="file:csv/outputs/name.csv" />
<property name="headerCallback" ref="headerCallback" />
<property name="lineAggregator">
............
</property>
</bean>
<bean id="headerCallback" class="com.whatever.model.MyFlatFileWriter" />

DynamoDB and TableNameOverride with prefix

I am testing DynamoDB tables and want to set up different table names for the prod and dev environments, using the prefix "dev_" for development.
I made this test to print the table name:
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapperConfig.TableNameOverride;
TableNameOverride tbl = new TableNameOverride("test").withTableNamePrefix("dev_");
System.out.println("name=" + tbl.getTableName() + " prefix=" + tbl.getTableNamePrefix());
This prints: name=null prefix=dev_
How come the name here is null?
TableNameOverride tbl = new TableNameOverride("test");//.withTableNamePrefix("dev_");
System.out.println("name=" + tbl.getTableName() + " prefix=" + tbl.getTableNamePrefix());
This prints: name=test prefix=null
How can I get the table name to be "dev_test"?
I want to use this later to get a "dev_" prefix for all tables in development mode like this:
DynamoDBTable annotation = (DynamoDBTable) myclass.getClass().getAnnotation(DynamoDBTable.class);
TableNameOverride tbl = new TableNameOverride(annotation.tableName()).withTableNamePrefix("dev_");
Or is there another solution to separate between dev and prod tables?
I first thought of putting them in separate regions but not sure about this.
Could also use this:
mapper.save(ck, new DynamoDBMapperConfig(new TableNameOverride((isDev ? "dev_" : "") + annotation.tableName())));
withTableNamePrefix is a static method. So this line is creating a new instance of TableNameOverride with the String "test", and then throwing that instance away by using it to call the static withTableNamePrefix method:
TableNameOverride tbl = new TableNameOverride("test").withTableNamePrefix("dev_");
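The pitfall can be reproduced without the AWS SDK. NameOverride below is a hypothetical stand-in for TableNameOverride, with the same instance-constructor/static-factory split:

```java
// Stand-in for TableNameOverride: the constructor sets a table name,
// while withTableNamePrefix is a STATIC factory that builds a fresh,
// prefix-only instance and cannot see any existing instance's state.
public class NameOverride {
    private final String tableName;
    private final String tableNamePrefix;

    public NameOverride(String tableName) {
        this(tableName, null);
    }

    private NameOverride(String tableName, String tableNamePrefix) {
        this.tableName = tableName;
        this.tableNamePrefix = tableNamePrefix;
    }

    public static NameOverride withTableNamePrefix(String prefix) {
        return new NameOverride(null, prefix); // name is deliberately null
    }

    public String getTableName() { return tableName; }
    public String getTableNamePrefix() { return tableNamePrefix; }

    public static void main(String[] args) {
        // Legal but misleading: the static method is resolved at compile
        // time, and the "test" instance is silently discarded.
        NameOverride tbl = new NameOverride("test").withTableNamePrefix("dev_");
        System.out.println("name=" + tbl.getTableName()
                + " prefix=" + tbl.getTableNamePrefix());
        // prints "name=null prefix=dev_", matching the question
    }
}
```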
To answer the deeper question of separating test from prod, I would recommend having 2 separate AWS Accounts entirely, one for dev and one for prod. This is the only way you can:
See billing separately
Ensure you never leak data between prod and test systems
Ensure high scaling on a dev table never prevents you from scaling a prod table higher
I've faced the same situation and struggled for a couple of days to get it working.
Just in case you're using DynamoDB + Spring, here is what worked for me:
POJO class:
@DynamoDBTable(tableName = "APP-ACCESSKEY")
public class AccessKey {
@NotBlank
@Size(min = 1, max = 36)
private String accessToken;
@NotNull
@Size(min = 3, max = 15)
private String userName;
private Date dateInsertion;
public AccessKey() {
// ... All POJO stuff
}
}
Spring configuration:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans-3.1.xsd">
<!-- Amazon Credentials -->
<bean id="tableNameOverride" class="org.springframework.beans.factory.config.MethodInvokingFactoryBean">
<property name="staticMethod" value="com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapperConfig.TableNameOverride.withTableNamePrefix"/>
<property name="arguments" value="DES-" />
</bean>
<bean id="dynamoDBMapperConfig" class="com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapperConfig">
<constructor-arg index="0" ref="tableNameOverride" />
</bean>
<bean id="BasicAWSCredentials" class="com.amazonaws.auth.BasicAWSCredentials">
<constructor-arg index="0" value="${amazon.accessKey}" />
<constructor-arg index="1" value="${amazon.secretKey}" />
</bean>
<bean id="amazonDynamoDBClient" class="com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient">
<constructor-arg index="0" ref="BasicAWSCredentials" />
<property name="endpoint" value="http://dynamodb.us-west-2.amazonaws.com" />
</bean>
<bean id="dynamoDBMapper" class="com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper">
<constructor-arg index="0" ref="amazonDynamoDBClient" />
<constructor-arg index="1" ref="dynamoDBMapperConfig" />
</bean>
</beans>
Explanation:
Taking into account that my AccessKey object points to the APP-ACCESSKEY table on AWS DynamoDB, after running this your application will start to point to DES-APP-ACCESSKEY.
Hope it helps someone facing a similar situation.
Cheers
Same as Paolo Almeida's solution, but with Spring Boot annotations.
Just wanted to share it and maybe save someone some time:
I have DynamoDB tables for each namespace, e.g. myApp-dev-UserTable and myApp-prod-UserTable, and I am using the EKS_NAMESPACE env variable, which in my case gets injected into the pods by Kubernetes.
import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapperConfig;
@Configuration
@EnableDynamoDBRepositories(basePackages = "de.dynamodb")
public class DynamoDBConfig {
@Value("${EKS_NAMESPACE}")
String eksNamespace;
@Bean
public AmazonDynamoDB amazonDynamoDB() {
return AmazonDynamoDBClientBuilder.standard()
.withEndpointConfiguration(new AwsClientBuilder.EndpointConfiguration(
"dynamodb.eu-central-1.amazonaws.com", "eu-central-1"))
.withCredentials(awsCredentials())
.build();
}
@Bean
public AWSCredentialsProvider awsCredentials() {
return WebIdentityTokenCredentialsProvider.builder().build();
}
// Table name override:
@Bean
public DynamoDBMapperConfig.TableNameOverride tableNameOverride() {
return DynamoDBMapperConfig.TableNameOverride.withTableNamePrefix("myApp-" + eksNamespace + "-");
}
@Bean
public DynamoDBMapperConfig dynamoDBMapperConfig() {
return DynamoDBMapperConfig.builder().withTableNameOverride(tableNameOverride()).build();
}
@Bean
// Marked as primary bean to override the default bean.
@Primary
public DynamoDBMapper dynamoDBMapper() {
return new DynamoDBMapper(amazonDynamoDB(), dynamoDBMapperConfig());
}
}
With a table like this:
@Data
@DynamoDBTable(tableName = "UserTable")
public class User {
@DynamoDBHashKey
private String userId;
@DynamoDBAttribute
private String foo;
@DynamoDBAttribute
private String bar;
}

How to configure a Spring beans with properties that are stored in a database table

In my project we'd like to externalize the properties of our Spring-managed beans; that is very easy to do with standard Java .properties files. However, we want to be able to read those properties from a DB table that behaves like a Map (the key is the property name, the value is the value assigned to that property).
I found this post that suggests the usage of Commons Configuration, but I don't know if there's a better way to do the same with Spring 3.x. Maybe implementing my own PropertyResource or something.
Any clues?
I'd use a FactoryBean<Properties> that I'd implement using JdbcTemplate. You can then use the generated Properties object with the <context:property-placeholder> mechanism.
Sample code:
public class JdbcPropertiesFactoryBean
extends AbstractFactoryBean<Properties>{
@Required
public void setJdbcTemplate(final JdbcTemplate jdbcTemplate){
this.jdbcTemplate = jdbcTemplate;
}
private JdbcTemplate jdbcTemplate;
@Required
public void setTableName(final String tableName){
this.tableName = tableName;
}
private String tableName;
@Required
public void setKeyColumn(final String keyColumn){
this.keyColumn = keyColumn;
}
private String keyColumn;
@Required
public void setValueColumn(final String valueColumn){
this.valueColumn = valueColumn;
}
private String valueColumn;
@Override
public Class<?> getObjectType(){
return Properties.class;
}
@Override
protected Properties createInstance() throws Exception{
final Properties props = new Properties();
jdbcTemplate.query("Select " + keyColumn + ", " + valueColumn
+ " from " + tableName, new RowCallbackHandler(){
@Override
public void processRow(final ResultSet rs) throws SQLException{
props.put(rs.getString(1), rs.getString(2));
}
});
return props;
}
}
XML Configuration:
<bean id="props" class="foo.bar.JdbcPropertiesFactoryBean">
<property name="jdbcTemplate">
<bean class="org.springframework.jdbc.core.JdbcTemplate">
<!-- reference to a defined data source -->
<constructor-arg ref="dataSource" />
</bean>
</property>
<property name="tableName" value="TBL_PROPERTIES" />
<property name="keyColumn" value="COL_KEY" />
<property name="valueColumn" value="COL_VAL" />
</bean>
<context:property-placeholder properties-ref="props" />
In addition to Sean's suggestion, you can extend PropertyPlaceholderConfigurer. Look at the two current implementations, PreferencesPlaceholderConfigurer and ServletContextPropertyPlaceholderConfigurer, and roll out your own, JDBC-based one.
There are ways to create a PropertyPlaceholderConfigurer programmatically; please see below.
Write a DAO which reads the properties and creates a PropertyPlaceholderConfigurer as shown below.
XmlBeanFactory factory = new XmlBeanFactory(new FileSystemResource("beans.xml"));
PropertyPlaceholderConfigurer cfg = new PropertyPlaceholderConfigurer();
cfg.setProperties(yourProperties);
cfg.postProcessBeanFactory(factory);
