I am using Spring Batch to read from Oracle and write into Elasticsearch.
My code works fine for static queries.
Example: select emp_id, emp_name from employee_table
I have a RowMapper class that maps the values from the ResultSet to the Employee POJO.
My requirement is this:
The query will be input by the user, so it might be any of the following:
select emp_id, emp_name from employee_table
select cust_id, cust_name, cust_age from customer_table
select door_no, street_name, loc_name, city from address_table
or other similar queries.
My questions are:
Is there a way to dynamically create a POJO according to the query given by the user?
Will the RowMapper concept work if the query keeps changing, as in my case?
Is there something like a generic RowMapper?
Sample code would be much appreciated.
If you have objects you need to map to...
Consider aliasing your SQL columns to match your object field names, using a custom implementation of RowMapper that actually extends BeanWrapperFieldSetMapper.
So if your POJO looks like this:
public class Employee {
    private String employeeId;
    private String employeeName;
    ...
    // getters and setters
}
Then your SQL can look like this:
SELECT emp_id employeeId, emp_name employeeName from employee_table
Then your wrapped RowMapper would look something like this:
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

import org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper;
import org.springframework.batch.item.file.transform.DefaultFieldSet;
import org.springframework.batch.item.file.transform.FieldSet;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.validation.BindException;

public class BeanWrapperRowMapper<T> extends BeanWrapperFieldSetMapper<T> implements RowMapper<T> {

    @Override
    public T mapRow(final ResultSet rs, final int rowNum) throws SQLException {
        final FieldSet fs = getFieldSet(rs);
        try {
            return super.mapFieldSet(fs);
        } catch (final BindException e) {
            throw new IllegalArgumentException("Could not bind bean to FieldSet", e);
        }
    }

    // Build a FieldSet from the current row, using the result set's column
    // names (i.e. your aliases) as the field names
    private FieldSet getFieldSet(final ResultSet rs) throws SQLException {
        final ResultSetMetaData metaData = rs.getMetaData();
        final int columnCount = metaData.getColumnCount();
        final List<String> tokens = new ArrayList<>();
        final List<String> names = new ArrayList<>();
        for (int i = 1; i <= columnCount; i++) {
            tokens.add(rs.getString(i));
            names.add(metaData.getColumnName(i));
        }
        return new DefaultFieldSet(tokens.toArray(new String[0]), names.toArray(new String[0]));
    }
}
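For completeness, here is one way the wrapped mapper could be hooked up to a reader. This is a minimal sketch under my own assumptions (the employeeReader method name and the dataSource parameter are illustrative, not from the original answer):

public JdbcCursorItemReader<Employee> employeeReader(DataSource dataSource) throws Exception {
    // The mapper needs a target type; the setter is inherited from BeanWrapperFieldSetMapper
    BeanWrapperRowMapper<Employee> mapper = new BeanWrapperRowMapper<>();
    mapper.setTargetType(Employee.class);
    mapper.afterPropertiesSet();

    // Aliased SQL, so the column names line up with the Employee field names
    JdbcCursorItemReader<Employee> reader = new JdbcCursorItemReader<>();
    reader.setDataSource(dataSource);
    reader.setSql("SELECT emp_id employeeId, emp_name employeeName FROM employee_table");
    reader.setRowMapper(mapper);
    return reader;
}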
Alternatively...
If you don't have any POJOs to map to, use the out-of-box ColumnMapRowMapper to get back a map (Map<String, Object>) of column names (let's call them COL_A, COL_B, COL_C) to values. Then if your writer is something like a JdbcBatchItemWriter, you can set your named parameters as:
INSERT INTO ${schema}.TARGET_TABLE (COL_1, COL_2, COL_3) VALUES (:COL_A, :COL_B, :COL_C)
and then your ItemSqlParameterSourceProvider implementation could look like so:
import java.util.Map;
import org.springframework.batch.item.database.ItemSqlParameterSourceProvider;
import org.springframework.jdbc.core.namedparam.MapSqlParameterSource;
import org.springframework.jdbc.core.namedparam.SqlParameterSource;

public class MapItemSqlParameterSourceProvider implements ItemSqlParameterSourceProvider<Map<String, Object>> {
    @Override
    public SqlParameterSource createSqlParameterSource(Map<String, Object> item) {
        // MapSqlParameterSource exposes each map entry as a named parameter
        return new MapSqlParameterSource(item);
    }
}
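To round that out, here is a sketch of how the writer side might be wired up (the method name and TARGET_TABLE are assumptions for illustration, and I've dropped the ${schema} placeholder for simplicity):

public JdbcBatchItemWriter<Map<String, Object>> mapWriter(DataSource dataSource) throws Exception {
    JdbcBatchItemWriter<Map<String, Object>> writer = new JdbcBatchItemWriter<>();
    writer.setDataSource(dataSource);
    // The named parameters are filled from the row map produced by ColumnMapRowMapper
    writer.setSql("INSERT INTO TARGET_TABLE (COL_1, COL_2, COL_3) VALUES (:COL_A, :COL_B, :COL_C)");
    writer.setItemSqlParameterSourceProvider(new MapItemSqlParameterSourceProvider());
    writer.afterPropertiesSet();
    return writer;
}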
To answer your questions:
Is there a way to dynamically create a POJO based on the user's query? - Even if there were, I'm not sure how much help it would be. For your use case, I'd suggest just using a Map.
Will the RowMapper concept work if the query keeps changing? - If you use a Map, you can use the column names as the keys and the column values as the values, and a single RowMapper implementation can handle that no matter what the query looks like.
Is there something like a generic RowMapper? - There is, but it's intended for POJOs, so you'd need to create your own for this (or use the Map-based ColumnMapRowMapper mentioned above).
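For reference, a quick sketch of the Map-based approach with a plain JdbcTemplate (the customer query is just an example, and dataSource is assumed to be configured):

// Each row comes back as a Map<String, Object>, keyed by column name
List<Map<String, Object>> rows = new JdbcTemplate(dataSource)
        .query("select cust_id, cust_name, cust_age from customer_table",
                new ColumnMapRowMapper());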
You can do it simply, as below:
SettingsDto settings = settingsDao.getById(1, new BeanPropertyRowMapper<>(SettingsDto.class));
This is generic: you can pass any DTO class, but note that the DTO field names must match the SQL column names, or you must use aliases in the SQL query that match the DTO.
@Data // Lombok: generates getters and setters
public class SettingsDto {
private int id;
private int retryCount;
private int batchSize;
private int retryPeriod;
private int statusInitialDelay;
}
My DAO method is below:
SettingsDto getById(int id, final RowMapper<SettingsDto> mapper);
Its implementation is below:
@Override
public SettingsDto getById(final int id, final RowMapper<SettingsDto> mapper) {
    return new JdbcTemplate(yourDataSource).queryForObject(QUERY_SETTINGS_BY_ID, new Object[]{id}, mapper);
}
The SQL is below; as noted, the aliases have to match the field names in the DTO:
private static final String QUERY_SETTINGS_BY_ID = "SELECT id, retry_count AS retryCount FROM settings WHERE id = ?";
I found a solution to my problem by using Spring's ColumnMapRowMapper. Below is a snippet from the XML configuration file. I didn't generate any POJO class; I managed with a Map and inserted it into Elasticsearch. The map's key names should match the field names present in the index.
<step id="slave" xmlns="http://www.springframework.org/schema/batch">
<tasklet>
<chunk reader="pagingItemReader" writer="elasticSearchItemWriter"
processor="itemProcessor" commit-interval="10" />
</tasklet>
</step>
<bean id="pagingItemReader"
class="org.springframework.batch.item.database.JdbcPagingItemReader"
scope="step">
<property name="dataSource" ref="dataSource" />
<property name="queryProvider">
<bean
class="org.springframework.batch.item.database.support.SqlPagingQueryProviderFactoryBean">
<property name="dataSource" ref="dataSource" />
<property name="selectClause" value="*******" />
<property name="fromClause" value="*******" />
<property name="whereClause" value="*******" />
<property name="sortKey" value="*******" />
</bean>
</property>
<!-- Inject via the ExecutionContext in rangePartitioner -->
<property name="parameterValues">
<map>
<entry key="fromId" value="#{stepExecutionContext[fromId]}" />
<entry key="toId" value="#{stepExecutionContext[toId]}" />
</map>
</property>
<property name="pageSize" value="10" />
<property name="rowMapper">
<bean class="org.springframework.jdbc.core.ColumnMapRowMapper" />
</property>
</bean>
And inside my ElasticSearchItemWriter class:
public class ElasticSearchItemWriter<T> extends AbstractItemStreamItemWriter<T>
implements ResourceAwareItemWriterItemStream<T>, InitializingBean {
....
....
....
@Override
public void write(List<? extends T> items) throws Exception {
client = jestClient.getJestClient();
if (items.size() > 0) {
for (Object item : items) {
@SuppressWarnings("unchecked")
Map<String, Object> map = (Map<String, Object>) item;
// Asynch index
Index index = new Index.Builder(map).index(Start.prop.getProperty(Constants.ES_INDEX_NAME))
.type(Start.prop.getProperty(Constants.ES_INDEX_TYPE)).build();
client.executeAsync(index, new JestResultHandler<JestResult>() {
public void failed(Exception ex) {
}
public void completed(JestResult result) {
}
});
}
}
}
.....
....
}
Related
I am new to Spring beans. I am trying to set a map entry using a beans.xml file and access that value using a REST GET request.
beans.xml
<bean name ="book" id="book" class=" org.test.model.Book" scope = "singleton">
<property name="id" value="123" />
<property name="bookName" value="FirstBeanBook"></property>
</bean>
<bean name="bookservice2" id = "bookservice" class="org.test.service.BookService" scope="singleton">
<property name="bookMap">
<map><entry key="123" value-ref="book" /></map>
</property>
</bean>`
In Main class,
BookService bookService = (BookService) context.getBean("bookservice2");
bookService.getMap().toString(); // here it is working fine
I guess that when I try to access this map using a GET request, another instance of the BookService class is created, which has an empty bookMap.
How can I get the same result when I use a REST GET request?
Edit:
I handle the GET request as follows:
@GET
@Produces(MediaType.APPLICATION_JSON)
@Path("/getBook/{id}")
public Book getBook(@PathParam("id") String id) {
return bookService.getBook(id);
}
BookService.java
public class BookService {
    static Map<Integer, Book> bookMap = new HashMap<Integer, Book>();
    // This class has getters and setters for bookMap too.

    public BookService() {}

    public Book getBook(String id) {
        return bookMap.get(Integer.parseInt(id));
    }
}
I need to add the aliases defined in the SQL query as the header while generating the CSV file.
I have seen some examples using FlatFileHeaderCallback, but there I don't have a way to pass the aliases.
Is there any way to get the column aliases in the write(List<? extends T> items) method of FlatFileItemWriter?
For starters, I think you could simply use a custom FlatFileHeaderCallback which takes a String as a parameter and writes it:
import java.io.IOException;
import java.io.Writer;

import org.springframework.batch.item.file.FlatFileHeaderCallback;

public class CustomHeaderWriter implements FlatFileHeaderCallback {

    private String header;

    @Override
    public void writeHeader(Writer writer) throws IOException {
        writer.write(header);
    }

    public void setHeader(String header) {
        this.header = header;
    }
}
To use it, declare it in your FlatFileItemWriter and give it a String that contains the names of your columns/aliases separated by your flat file delimiter:
<bean class="org.springframework.batch.item.file.FlatFileItemWriter" scope="step">
<property name="headerCallback">
<bean class="xx.xx.xx.CustomHeaderWriter">
<property name="header" value="${columns.or.aliases}"></property>
</bean>
</property>
</bean>
Now, I suppose you don't want to write the columns/aliases a second time for the header, and would like to "extract" them from the SQL query instead. This could be accomplished, for example, by fiddling with the CustomHeaderWriter, as sketched after the following list:
Instead of passing the columns/aliases directly, you could give it the actual SQL query
Using a regular expression or manual parsing, you could then extract the aliases or the column names (the strings between SELECT and FROM, split on the commas, quotes stripped, etc.)
You would then need to pass in the delimiter of the FlatFileItemWriter (or use a constant)
Finally, write the String you just built
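A rough sketch of that idea follows. This is a naive parser of my own, assuming simple queries (no nested SELECTs, no quoted identifiers):

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SqlHeaderExtractor {

    // Builds a header line from the column expressions between SELECT and FROM,
    // keeping the last token of each expression: the alias if present,
    // otherwise the plain column name
    public static String buildHeader(String sql, String delimiter) {
        Matcher m = Pattern.compile("select\\s+(.*?)\\s+from\\s",
                Pattern.CASE_INSENSITIVE | Pattern.DOTALL).matcher(sql);
        if (!m.find()) {
            throw new IllegalArgumentException("Not a simple SELECT query: " + sql);
        }
        StringBuilder header = new StringBuilder();
        for (String column : m.group(1).split(",")) {
            String[] tokens = column.trim().split("\\s+");
            if (header.length() > 0) {
                header.append(delimiter);
            }
            header.append(tokens[tokens.length - 1]);
        }
        return header.toString();
    }
}

For example, buildHeader("SELECT emp_id employeeId, emp_name employeeName FROM employee_table", ",") returns employeeId,employeeName, which could then be handed to the CustomHeaderWriter above.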
Create a custom class (assuming 5 CSV columns):
public class MyFlatFileWriter implements FlatFileHeaderCallback {
@Override
public void writeHeader(Writer writer) throws IOException {
writer.write("Col1,Col2,Col3,Col4,Col5");
}
}
Add the bean and its reference in the writer bean:
<bean id="flatFileWriter" class="org.springframework.batch.item.file.FlatFileItemWriter">
<property name="resource" value="file:csv/outputs/name.csv" />
<property name="headerCallback" ref="headerCallback" />
<property name="lineAggregator">
............
</property>
</bean>
<bean id="headerCallback" class="com.whatever.model.MyFlatFileWriter" />
I am new to the Hibernate world, and I am getting the following error message when trying to execute a query with Hibernate and Postgres.
org.postgresql.util.PSQLException: ERROR: operator does not exist: text = bytea
Hint: No operator matches the given name and argument type(s). You might
need to add explicit type casts.
Here is my Hibernate mapping (car.hbm.xml):
<hibernate-mapping>
<class name="Car" table="car"
schema="someSchema">
<id name="id" type="int" column="car_id">
<generator class="sequence">
<param name="sequence">car_seq</param>
</generator>
</id>
<property name="carMake">
<column name="car_make" sql-type="string"/>
</property>
<property name="carModel">
<column name="car_model" sql-type="string"/>
</property>
<property name="carVin" >
<column name="car_vin" sql-type="int" />
</property>
<property name="datePurchased">
<column name="date_purchased" sql-type="date"/>
</property>
<property name="retiredModel">
<column name="retired_model" sql-type="boolean"/>
</property>
</class>
</hibernate-mapping>
On Postgres, here is what my table looks like:
CREATE TABLE car (
car_vin INTEGER NOT NULL DEFAULT nextval('car_seq'::regclass) PRIMARY KEY,
car_make TEXT NOT NULL,
car_model TEXT DEFAULT NULL,
date_purchased DATE DEFAULT now() NOT NULL,
retired_model BOOLEAN DEFAULT FALSE NOT NULL
);
Here is my model class (Car.java):
public class Car {
private int id;
private String carMake;
private String carModel;
private int carVin;
private Date datePurchased;
private boolean retiredModel;
public int getId() {
return id;
}
public void setId(int id) {
this.id = id;
}
public String getCarModel() {
return carModel;
}
public void setcarModel(String carModel) {
this.carModel = carModel;
}
public String getcarMake() {
return carMake;
}
public void setcarMake(String carMake) {
this.carMake = carMake;
}
public Date getDatePurchased() {
return datePurchased;
}
public void setDatePurchased(Date datePurchased) {
this.datePurchased = datePurchased;
}
public boolean isRetired() {
return retiredModel;
}
public void setRetired(boolean retiredModel) {
this.retiredModel = retiredModel;
}
}
In my DAO layer, I am using the following line to query:
Query query = getSession().createQuery("from Car as c where " +
"c.carModel = ? AND c.carMake = ?").setParameter(0, carModel).setParameter(1, carMake);
carMake and carModel are both Strings, passed as method parameters to the DAO method.
Note that the strings in my hbm.xml are mapped to TEXT in Postgres, so I am wondering whether that is the problem. If it is, how do I solve it?
It is weird, but the query does not handle null very well. When I changed the query to:
Query query = getSession().createQuery("from Car as c where " +
"c.carModel = ? AND c.carMake is null").setParameter(0, carModel);
it works fine, since the DAO needs to query for a make that is NULL. So if the make may or may not be null, I need two sets of queries: one hardcoded to select null as above, and the other calling setParameter(1, carMake).
Weird, but I think this works.
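If you'd rather not maintain two hardcoded queries, one option is to build the HQL conditionally. A minimal sketch, switching to named parameters (the parameter names are my own, not from the original post):

// "c.carMake = ?" never matches a NULL column under SQL comparison semantics,
// so switch to an IS NULL predicate when the value is absent
StringBuilder hql = new StringBuilder("from Car as c where c.carModel = :model");
hql.append(carMake == null ? " and c.carMake is null" : " and c.carMake = :make");

Query query = getSession().createQuery(hql.toString()).setParameter("model", carModel);
if (carMake != null) {
    query.setParameter("make", carMake);
}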
Usually this error is from Hibernate serializing a not-otherwise-mapped class (resulting in a bytea) and comparing that to a String (probably given by you in a query).
Map the Date! Use @Temporal(TemporalType.DATE) on the Date attribute. I don't know how to express that in hbm.xml notation.
Using Query.setParameterList instead of setParameter solved my problem for an integer array (integer = bytea).
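For example, a sketch with made-up entity and parameter names:

// setParameterList expands the :ids placeholder into individual bind values
// instead of serializing the whole collection (which is where bytea comes from)
List<Integer> ids = Arrays.asList(1, 2, 3);
List<Car> cars = session.createQuery("from Car c where c.id in (:ids)")
        .setParameterList("ids", ids)
        .list();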
I had the same problem and in my case it was caused by the fact that I was trying to use a Java enum class as a named parameter within a native query.
The solution was to go all the way with Hibernate: use a non-native query with the Java class and member names instead of the table and column names.
I'm having problems trying to use batchUpdate, from Spring's JdbcTemplate.
The problem is that I want to execute two SQL operations: a DELETE (to clear my table) and then an INSERT. It works fine the first time I make the call (from a JSP), but from the second attempt on, the DELETE statement isn't executed, only the INSERT, causing a unique constraint exception.
First I tried this:
public class MyTableDAOStoredProcedure extends JdbcDaoSupport implements MyTableDAO {
...
....
public void insert(final List<MyObject> myObjectList) {
...
String deleteSql = "DELETE FROM ......";
String insertSql = "INSERT INTO ......";
// Delete Procedure
jdbcTemplate.execute(deleteSql);
//Insert Procedure
jdbcTemplate.batchUpdate(insertSql, new BatchPreparedStatementSetter() {
@Override
public int getBatchSize() {
return myObjectList.size();
}
@Override
public void setValues(PreparedStatement ps, int i) throws SQLException {
MyObject myObject = myObjectList.get(i);
ps.setString(1, myObject.getA());
ps.setInt(2, myObject.getB());
}
});
}
}
Then I tried this:
public class MyTableDAOStoredProcedure extends JdbcDaoSupport implements MyTableDAO {
...
...
String deleteSql = "DELETE FROM MY_OBJECT";
String insertSql = "INSERT INTO MY_OBJECT values(1,2)";
jdbcTemplate.batchUpdate(new String[] { deleteSql, insertSql });
}
I think it might be some Spring transaction problem. Here is the configuration of my DAO in applicationContext.xml; it's quite simple:
<bean id="txManager"
class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
<property name="dataSource" ref="dataSource" />
</bean>
...
<bean id="myTableDAOStoredProcedure" class="....dao.spring.MyTableDAOStoredProcedure">
<property name="dataSource">
<ref bean="dataSource" />
</property>
</bean>
Any ideas or suggestions?
In my project we'd like to externalize the properties of our Spring-managed beans. That is very easy to do with standard Java .properties files; however, we want to be able to read those properties from a DB table that behaves like a Map (the key is the property name, the value is the value assigned to that property).
I found a post that suggests using Commons Configuration, but I don't know if there's a better way to do the same with Spring 3.x. Maybe implementing my own PropertyResource or something.
Any clues?
I'd use a FactoryBean<Properties>, implemented using JdbcTemplate. You can then use the generated Properties object with the <context:property-placeholder> mechanism.
Sample code:
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.Properties;

import org.springframework.beans.factory.annotation.Required;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowCallbackHandler;

public class JdbcPropertiesFactoryBean extends AbstractFactoryBean<Properties> {

    private JdbcTemplate jdbcTemplate;
    private String tableName;
    private String keyColumn;
    private String valueColumn;

    @Required
    public void setJdbcTemplate(final JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    @Required
    public void setTableName(final String tableName) {
        this.tableName = tableName;
    }

    @Required
    public void setKeyColumn(final String keyColumn) {
        this.keyColumn = keyColumn;
    }

    @Required
    public void setValueColumn(final String valueColumn) {
        this.valueColumn = valueColumn;
    }

    @Override
    public Class<?> getObjectType() {
        return Properties.class;
    }

    @Override
    protected Properties createInstance() throws Exception {
        final Properties props = new Properties();
        jdbcTemplate.query("select " + keyColumn + ", " + valueColumn
                + " from " + tableName, new RowCallbackHandler() {
            @Override
            public void processRow(final ResultSet rs) throws SQLException {
                props.put(rs.getString(1), rs.getString(2));
            }
        });
        return props;
    }
}
XML Configuration:
<bean id="props" class="foo.bar.JdbcPropertiesFactoryBean">
<property name="jdbcTemplate">
<bean class="org.springframework.jdbc.core.JdbcTemplate">
<!-- reference to a defined data source -->
<constructor-arg ref="dataSource" />
</bean>
</property>
<property name="tableName" value="TBL_PROPERTIES" />
<property name="keyColumn" value="COL_KEY" />
<property name="valueColumn" value="COL_VAL" />
</bean>
<context:property-placeholder properties-ref="props" />
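With that configuration in place, any ${...} placeholder in the context resolves against the rows of TBL_PROPERTIES. For example, <property name="greeting" value="${greeting.text}" /> would pick up the row whose COL_KEY is greeting.text (the greeting.text key is just an illustrative assumption).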
In addition to Sean's suggestion, you can extend PropertyPlaceholderConfigurer. Look at the two current implementations, PreferencesPlaceholderConfigurer and ServletContextPropertyPlaceholderConfigurer, and roll your own, JDBC-based version.
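A minimal sketch of such a JDBC-based configurer, under my own assumptions (the table and column names are made up, and overriding mergeProperties is just one convenient hook):

import java.io.IOException;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.Properties;

import org.springframework.beans.factory.config.PropertyPlaceholderConfigurer;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowCallbackHandler;

public class JdbcPropertyPlaceholderConfigurer extends PropertyPlaceholderConfigurer {

    private JdbcTemplate jdbcTemplate;

    public void setJdbcTemplate(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    @Override
    protected Properties mergeProperties() throws IOException {
        // Start from any locally configured properties, then overlay the DB rows
        final Properties props = super.mergeProperties();
        jdbcTemplate.query("select col_key, col_val from tbl_properties",
                new RowCallbackHandler() {
                    @Override
                    public void processRow(ResultSet rs) throws SQLException {
                        props.setProperty(rs.getString(1), rs.getString(2));
                    }
                });
        return props;
    }
}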
There are also ways to create a PropertyPlaceholderConfigurer programmatically; please see below.
Write a DAO which reads the properties, then create a PropertyPlaceholderConfigurer as shown below:
// Load the bean definitions, then apply the properties read by your DAO
XmlBeanFactory factory = new XmlBeanFactory(new FileSystemResource("beans.xml"));
PropertyPlaceholderConfigurer cfg = new PropertyPlaceholderConfigurer();
cfg.setProperties(yourProperties); // the Properties object built by your DAO
cfg.postProcessBeanFactory(factory);