Is there any way to access caller-scoped variables from an anonymous inner class in Java?
Here's the sample code to understand what I need:
public Long getNumber(final String type, final String refNumber, final Long year) throws ServiceException {
Long result = null;
try {
Session session = PersistenceHelper.getSession();
session.doWork(new Work() {
public void execute(Connection conn) throws SQLException {
CallableStatement st = conn.prepareCall("{ CALL PACKAGE.procedure(?, ?, ?, ?) }");
st.setString(1, type);
st.setString(2, refNumber);
st.setLong(3, year);
st.registerOutParameter(4, OracleTypes.NUMBER);
st.execute();
result = st.getLong(4) ;
}
});
} catch (Exception e) {
log.error(e);
}
return result;
}
The code is in a DAO service class. Obviously it doesn't compile: the compiler demands that result be final, and if I make it final, it doesn't compile because I try to modify a final variable. I'm bound to JDK 5. Other than dropping doWork() altogether, is there a way to set the result value from within doWork()?
Java doesn't know that doWork is going to be synchronous and that the stack frame result lives in will still be there. You need to alter something that isn't on the stack.
I think this would work
final Long[] result = new Long[1];
and then
result[0] = st.getLong(4);
in execute(). At the end, you need to return result[0];
You might want to make a small holder class if you don't like how the array looks, but this is the basic idea.
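Applied to the original method, the whole idea looks like this (a sketch based on the question's own code; error handling elided):
final Long[] result = new Long[1];
session.doWork(new Work() {
    public void execute(Connection conn) throws SQLException {
        CallableStatement st = conn.prepareCall("{ CALL PACKAGE.procedure(?, ?, ?, ?) }");
        st.setString(1, type);
        st.setString(2, refNumber);
        st.setLong(3, year);
        st.registerOutParameter(4, OracleTypes.NUMBER);
        st.execute();
        result[0] = st.getLong(4); // legal: we write through the final reference, not to it
    }
});
return result[0];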
This situation arises a lot in Java, and the cleanest way to handle it is with a simple value container class. It's the same idea as the array approach, but it's cleaner IMO.
public class ValContainer<T> {
private T val;
public ValContainer() {
}
public ValContainer(T v) {
this.val = v;
}
public T getVal() {
return val;
}
public void setVal(T val) {
this.val = val;
}
}
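Used with the code from the question, it would look something like this (a sketch; the parameter binding inside execute() is unchanged and elided):
final ValContainer<Long> result = new ValContainer<Long>();
session.doWork(new Work() {
    public void execute(Connection conn) throws SQLException {
        CallableStatement st = conn.prepareCall("{ CALL PACKAGE.procedure(?, ?, ?, ?) }");
        // ... bind parameters and register the OUT parameter as in the question ...
        st.execute();
        result.setVal(st.getLong(4));
    }
});
return result.getVal();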
You need a 'container' to hold your value. You do not, however, have to create the container class yourself. You can use the classes in the java.util.concurrent.atomic package. They provide a mutable holder for a value along with set and get methods: AtomicInteger, AtomicBoolean, AtomicReference<V> (for your own objects), etc.
In the outer method:
final AtomicLong resultHolder = new AtomicLong();
In the anonymous inner class method
long result = getMyLongValue();
resultHolder.set(result);
Later in your outer method
return resultHolder.get();
Here's an example.
public Long getNumber() {
final AtomicLong resultHolder = new AtomicLong();
Session session = new Session();
session.doWork(new Work() {
public void execute() {
//Inside anonymous inner class
long result = getMyLongValue();
resultHolder.set(result);
}
});
return resultHolder.get(); //Returns the value of result
}
Long is immutable. If you use a mutable class holding a long value, you can change the value. For example:
public class Main {
public static void main( String[] args ) throws Exception {
Main a = new Main();
System.out.println( a.getNumber() );
}
public void doWork( Work work ) {
work.doWork();
}
public Long getNumber() {
final LongHolder result = new LongHolder();
doWork( new Work() {
public void doWork() {
result.value = 1L;
}
} );
return result.value;
}
private static class LongHolder {
public Long value;
}
private static abstract class Work {
public abstract void doWork();
}
}
If the containing class is MyClass:
MyClass.this.variable = value;
I don't remember whether this works with a private variable (I think it does).
This only works for fields of the class (instance variables), not for local variables of a method. In Java SE 7 there will probably be closures for that kind of thing.
Anonymous classes/methods are not closures - this is exactly the difference.
The problem is that doWork() could create a new thread to call execute() and getNumber() could return before the result is set - and even more problematically: where should execute() write the result when the stack frame that contains the variable is gone? Languages with closures have to introduce a mechanism to keep such variables alive outside their original scope (or ensure that the closure is not executed in a separate thread).
A workaround:
Long[] result = new Long[1];
...
result[0] = st.getLong(4) ;
...
return result[0];
The standard solution to this is to return a value. See, for instance, ye olde java.security.AccessController.doPrivileged.
So the code would look something like this:
public Long getNumber(
final String type, final String refNumber, final Long year
) throws ServiceException {
try {
Session session = PersistenceHelper.getSession();
return session.doWork(new Work<Long>() {
public Long execute(Connection conn) throws SQLException {
CallableStatement st = conn.prepareCall("{ CALL PACKAGE.procedure(?, ?, ?, ?) }");
try {
st.setString(1, type);
st.setString(2, refNumber);
st.setLong(3, year);
st.registerOutParameter(4, OracleTypes.NUMBER);
st.execute();
return st.getLong(4);
} finally {
st.close();
}
}
});
} catch (Exception e) {
throw new ServiceException(e);
}
}
(This also fixes the potential resource leak, and no longer returns null on every error.)
Update: So apparently Work is from a third-party library and can't be altered. I suggest not using it directly; at the least, isolate your application from it. Something like:
public interface WithConnection<T> {
T execute(Connection connnection) throws SQLException;
}
public class SessionWrapper {
private final Session session;
public SessionWrapper(Session session) {
this.session = nonnull(session);
}
public <T> T withConnection(final WithConnection<T> task) throws ServiceException {
nonnull(task);
return new Work() {
T result;
{
session.doWork(this);
}
public void execute(Connection connection) throws SQLException {
result = task.execute(connection);
}
}.result;
}
}
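The original method could then be written against the wrapper; a sketch, assuming the SessionWrapper above:
SessionWrapper wrapper = new SessionWrapper(PersistenceHelper.getSession());
return wrapper.withConnection(new WithConnection<Long>() {
    public Long execute(Connection conn) throws SQLException {
        CallableStatement st = conn.prepareCall("{ CALL PACKAGE.procedure(?, ?, ?, ?) }");
        try {
            st.setString(1, type);
            st.setString(2, refNumber);
            st.setLong(3, year);
            st.registerOutParameter(4, OracleTypes.NUMBER);
            st.execute();
            return st.getLong(4);
        } finally {
            st.close();
        }
    }
});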
As of Hibernate 4, the method Session#doReturningWork(ReturningWork<T> work) returns the value returned from the inner method:
public Long getNumber(final String type, final String refNumber, final Long year) throws ServiceException {
try {
Session session = PersistenceHelper.getSession();
return session.doReturningWork(conn -> {
CallableStatement st = conn.prepareCall("{ CALL PACKAGE.procedure(?, ?, ?, ?) }");
st.setString(1, type);
st.setString(2, refNumber);
st.setLong(3, year);
st.registerOutParameter(4, OracleTypes.NUMBER);
st.execute();
return st.getLong(4);
});
} catch (Exception e) {
log.error(e);
}
return null;
}
(Cleaned up using a Java 8 lambda)
Using AtomicLong helped me in a very similar situation and the code looked clean.
// Create a new final AtomicLong variable with the initial value 0.
final AtomicLong YOUR_VARIABLE = new AtomicLong(0);
...
// set long value to the variable within inner class
YOUR_VARIABLE.set(LONG_VALUE);
...
// get the value even outside the inner class
YOUR_VARIABLE.get();
Related
I'm not sure whether I'm using JMockit incorrectly, or there's something amiss in my setup. I'm using JMockit 1.32 with JUnit 4.12 in Eclipse.
My problem seems to be that interfaces aren't being captured. Specifically in the java.sql package. For example:
public class Dto {
private int id;
public Dto(){}
public Dto(ResultSet rs) {
try {
id = rs.getInt(1);
} catch (SQLException e) { }
}
public int getId() {return id;}
void setId(int id) {this.id = id;}
}
public class ClassUnderTest {
public static Dto loadObject(Connection conn, String tablename, int id) {
Dto result = null;
ResultSet rs = null;
PreparedStatement ps = null;
try {
String sql = "select * from " + tablename + " where id = ?";
ps = conn.prepareStatement(sql);
ps.setInt(1, id);
rs = ps.executeQuery();
if (rs.next()) {
result = new Dto(rs);
}
} catch (SQLException e) {
} finally {
try {
if (ps != null) ps.close();
} catch (SQLException e) { }
}
return result;
}
}
public class ResultSetTest extends junit.framework.TestCase {
private static final int OBJ_ID = 5;
@Capturing
ResultSet mockResultSet;
@Capturing
PreparedStatement mockStatement;
@Capturing
Connection mockConn;
@Test
public void testGetDtoById() throws SQLException {
new Expectations() {{
mockConn.prepareStatement(anyString); result = mockStatement;
mockStatement.setInt(anyInt, OBJ_ID);
mockResultSet.next(); result = true;
new Dto(mockResultSet); result = new Dto();
mockResultSet.next(); result = true;
}};
Dto dto = ClassUnderTest.loadObject(mockConn, "", OBJ_ID);
assertEquals(dto.getId(), OBJ_ID);
}
}
In this setup, test execution fails with an NPE on the first line of the Expectations() {} block. But from the tutorials etc. I'm expecting a mocked instance to have been created. (e.g. tutorial)
Trying to move past this, I created explicit mocked classes like so:
public class ResultSetMockup extends MockUp<ResultSet> { }
public class PreparedStatementMockup extends MockUp<PreparedStatement>
{
@Mock ResultSet executeQuery() {return new ResultSetMockup().getMockInstance();}
}
public class ConnectionMockup extends MockUp<Connection>
{
@Mock PreparedStatement prepareStatement(String sql) throws SQLException {
return new PreparedStatementMockup().getMockInstance();
}
}
@Capturing
ResultSet mockResultSet = new ResultSetMockup().getMockInstance();
@Capturing
PreparedStatement mockStatement = new PreparedStatementMockup().getMockInstance();
@Capturing
Connection mockConn = new ConnectionMockup().getMockInstance();
At this point, the Expectations() {} block is happy, but it appears that result is never actually being set. By setting a breakpoint I can see that rs.next() always fails. So I presume nothing is actually being captured.
What am I doing wrong? Or is something in my setup preventing JMockit from actually running?
The actual problem in the test is that it's mixing the APIs from JUnit 3 (anything from junit.framework) and JUnit 4+ (org.junit). This should never be done.
JMockit only supports the JUnit 4 API, not the obsolete JUnit 3 API. So simply remove extends junit.framework.TestCase and it will be OK.
BTW, your Java IDE should have warned against this mistake. IntelliJ, at least, promptly displays "Method 'testGetDtoById()' annotated with '@Test' inside class extending JUnit 3 TestCase".
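In other words, the corrected test skeleton looks like this (a sketch; the expectations body is unchanged):
// JUnit 4 style: no TestCase superclass; the @Test annotation drives execution.
public class ResultSetTest {
    @Capturing ResultSet mockResultSet;
    @Capturing PreparedStatement mockStatement;
    @Capturing Connection mockConn;

    @Test
    public void testGetDtoById() throws SQLException {
        // ... record expectations and call ClassUnderTest.loadObject as before ...
    }
}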
Also, the test (and the code under test) has several other mistakes...
My problem appears to have been the use of JUnit. I went as far as to try tutorial examples verbatim without luck. But by converting over to TestNG all my problems went away.
It seems as though JMockit's Expectations block wasn't able to hook into the code properly with JUnit. Either calls weren't recognized or the faking wasn't happening. I'm curious now, does anyone have it working with JUnit?
I have had some trouble using a generic type in a static method.
All comments on the source code are welcome, especially ones that significantly improve the code. I am currently not planning on using any external framework apart from JDBC, to keep things simple, so please do not put too much emphasis on that.
My view on not using external frameworks is also supported by the fact that the operations I will be using on the database are very minimal:
Inserting data
Updating data
Retrieving all fields (and simply by putting in a different SQL query you could already select which fields to retrieve)
I do not plan on making a full framework, so I know that it will not support everything. The speed of retrieving all fields is not a real issue either, as this will pretty much only be done on server bootup; if it is used at any other time, it will run in a background task for which I do not really care when it finishes.
Entity.java:
abstract public class Entity<KeyType, DataType> {
protected KeyType key;
protected List<Object> data;
public Entity() {
data = new ArrayList<>();
}
//abstract public static Map<KeyType, DataType> getAll();
protected List<Object> createData(final DataAction dataAction) {
List<Object> list = new ArrayList<>();
if (dataAction == DataAction.INSERT) {
list.add(key);
}
list.addAll(data);
if (dataAction == DataAction.UPDATE) {
list.add(key);
}
return list;
}
abstract public void insert();
abstract public void update();
protected static <KeyType, DataType> Map<KeyType, DataType> getData(final Class<DataType> dataTypeClass, final String query) {
Map<KeyType, DataType> map = new HashMap<>();
try {
PreparedStatement preparedStatement = DatabaseConnection.getConnection().prepareStatement(query);
ResultSet resultSet = preparedStatement.executeQuery();
while (resultSet.next()) {
KeyType key = (KeyType)resultSet.getObject(1);
int index = 2;
List<Object> dataList = new ArrayList<>();
while (resultSet.getObject(index) != null) {
dataList.add(resultSet.getObject(index));
index++;
}
DataType dataObject = null;
try {
dataObject = dataTypeClass.getConstructor(List.class).newInstance(dataList);
} catch (InstantiationException | IllegalAccessException | IllegalArgumentException | InvocationTargetException | NoSuchMethodException | SecurityException ex) {
Logger.getLogger(Entity.class.getName()).log(Level.SEVERE, null, ex);
}
map.put(key, dataObject);
}
} catch (SQLException ex) {
Logger.getLogger(Entity.class.getName()).log(Level.SEVERE, null, ex);
}
return map;
}
protected void executeQuery(final String query, final List<Object> data) {
try {
PreparedStatement preparedStatement = DatabaseConnection.getConnection().prepareStatement(query);
int dataIndex = 1; // JDBC parameter indices are 1-based
for (Object dataObject : data) {
preparedStatement.setObject(dataIndex, dataObject);
dataIndex++;
}
preparedStatement.execute();
preparedStatement.close();
} catch (SQLException ex) {
Logger.getLogger(Entity.class.getName()).log(Level.SEVERE, null, ex);
}
}
}
A concrete implementation, Account.java:
public class Account extends Entity<String, Account> {
private final static String SELECT_ALL_QUERY = "SELECT * FROM accounts";
private final static String INSERT_QUERY = "INSERT INTO accounts (username, password) VALUES(?, ?)";
private final static String UPDATE_QUERY = "UPDATE accounts SET password=? WHERE username=?";
private String username;
private String password;
public Account(final String username, final String password) {
this.username = username;
this.password = password;
key = username;
data.add(password);
}
public Account(final List<Object> data) {
this((String)data.get(0), (String)data.get(1));
}
public String getUsername() {
return username;
}
public void setUsername(final String username) {
this.username = username;
}
public String getPassword() {
return password;
}
public void setPassword(final String password) {
this.password = password;
}
public static Map<String, Account> selectAll() {
return getData(Account.class, SELECT_ALL_QUERY);
}
@Override
public void insert() {
executeQuery(INSERT_QUERY, createData(DataAction.INSERT));
}
@Override
public void update() {
executeQuery(UPDATE_QUERY, createData(DataAction.UPDATE));
}
}
I am generally happy about the concrete implementation, it seems like I have managed to bring it down to a bare minimum, except public Account(final List<Object> data) does not seem that nice, but I can live with it.
However, as you may have guessed, the getData() in Entity is definitely not nice, and I would like to improve it if possible.
What I would like to use is something like DataType dataObject = new DataType(dataList), but it seems like Generic Type Arguments cannot be instantiated.
So are there any ways of optimizing my current code in my current view? And is it possible to decouple the concrete classes and abstract classes even more?
EDIT:
Added a relevant question (I don't think I should make a fully new question for this thing, right?):
Is there a way to move the static Strings (SQL Queries) and the insert() and update() out of the Account class, into the Entity class?
To avoid the use of reflection in your getData method, you should accept a factory that, given a ResultSet, creates instances of the specific type. Your selectAll method would then be something like:
public static Map<String, Account> selectAll()
{
return getData(
new EntityFactory<Account>()
{
public Account newInstance(ResultSet resultSet) throws SQLException
{
return new Account(resultSet.getString(1), resultSet.getString(2));
}
},
SELECT_ALL_QUERY
);
}
The getData method then ends up something like:
protected static <K, T extends Entity<K>> Map<K, T> getData(EntityFactory<T> entityFactory, String query) throws SQLException
{
Connection connection = null;
PreparedStatement preparedStatement = null;
ResultSet resultSet = null;
try
{
connection = dataSource.getConnection();
preparedStatement = connection.prepareStatement(query);
resultSet = preparedStatement.executeQuery();
Map<K, T> entities = new HashMap<>();
while (resultSet.next())
{
T entity = entityFactory.newInstance(resultSet);
entities.put(entity.getKey(), entity);
}
return entities;
}
finally
{
closeQuietly(resultSet);
closeQuietly(preparedStatement);
closeQuietly(connection);
}
}
And assumes the Entity looks like:
public interface Entity<K>
{
public K getKey();
}
This allows you to remove the reflection and keeps the code that understands the database structure in one place. You should also use a similar template pattern to map from the domain object to the prepared statement when doing inserts and updates.
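A sketch of what that write-side template might look like (the StatementBinder name is illustrative, not from the original code):
// Hypothetical binder interface: maps a domain object onto statement parameters.
public interface StatementBinder<T> {
    void bind(T entity, PreparedStatement statement) throws SQLException;
}

// Usage sketch for Account inserts:
StatementBinder<Account> insertBinder = new StatementBinder<Account>() {
    public void bind(Account account, PreparedStatement statement) throws SQLException {
        statement.setString(1, account.getUsername());
        statement.setString(2, account.getPassword());
    }
};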
Now you've asked for comments on the code in general.
First off, code like this violates the Single Responsibility Principle and Separation of Concerns. A domain class should be a domain class and not contain persistence logic. Look at patterns like the Data Access Object for how this should be done.
Second, while I'm all for keeping it simple, Hibernate solved this problem a long time ago and JPA standardized it - you need a very good reason not to use one or both of these APIs.
Finally, your use of database resources - if you are going to use JDBC directly you have to clean up properly. Database connections are expensive resources and should be handled as such, the basic template for any JDBC call should be:
Connection connection = null;
PreparedStatement preparedStatement = null;
ResultSet resultSet = null;
try
{
connection = //get connection from pool or single instance.
preparedStatement = connection.prepareStatement("SELECT * FROM table WHERE column = ?");
preparedStatement.setString(1, "some string");
resultSet = preparedStatement.executeQuery();
while (resultSet.next())
{
//logic goes here.
}
}
catch (SQLException e)
{
//Handle exceptions.
}
finally
{
closeQuietly(resultSet);
closeQuietly(preparedStatement);
closeQuietly(connection);
}
The closeQuietly method has to be overloaded but should take the general form:
try
{
if (resultSet != null)
{
resultSet.close();
}
}
catch (SQLException e)
{
//Log exceptions but don't re-throw.
}
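As an aside, the posted code already uses Java 7 syntax (the diamond operator), so try-with-resources can replace most of this boilerplate; a sketch:
try (Connection connection = dataSource.getConnection();
     PreparedStatement preparedStatement = connection.prepareStatement("SELECT * FROM table WHERE column = ?")) {
    preparedStatement.setString(1, "some string");
    try (ResultSet resultSet = preparedStatement.executeQuery()) {
        while (resultSet.next()) {
            // logic goes here; all three resources are closed automatically
        }
    }
} catch (SQLException e) {
    // Handle exceptions.
}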
Well, as Darwind and Nick Holt told you, in a normal situation you should use JPA, which is the Java standard specification for object-relational mapping. You can use Hibernate, EclipseLink or any other framework behind it. They can manage connections and transactions for you. In addition, using standards rather than exotic frameworks means you can get help from the community more easily.
Another option is using Spring JDBC, which is quite light and facilitates many things.
Anyway, I suppose you did this for learning purposes, so let's try to go further.
First, I think you should separate the classes in charge of retrieving the data (call them managers or Data Access Objects, DAOs) from the entities representing the data themselves.
For me, using a class to get all the data as you did isn't a problem in itself. The problem is that the position of the key is hardcoded; it should not be assumed to be the same for every Entity implementation. This makes queries subject to bugs when the first field is not the key (are you sure a select * from ... will ALWAYS return the key in the first position?) or when there is a composite key.
I think a better solution is to create a Mapper interface and implement it for each entity.
public interface RecordMapper<KeyType, DataType> {
public void appendToMap(ResultSet resultSet, Map<KeyType, DataType> map) throws SQLException;
}
The implementation of the mapper is in charge of instantiating your entity, retrieving the key from the result set, populating the entity, and putting it in the map you expect.
public class AccountMapper implements RecordMapper<String, Account> {
public void appendToMap(ResultSet resultSet, Map<String, Account> accounts) throws SQLException {
String user= resultSet.getString("userName");
String pwd= resultSet.getString("passWord");
Account account = new Account(user, pwd);
accounts.put(user, account);
}
}
As I said, you should move your data access methods into a DAO:
public class DAO{
public <KeyType, DataType> Map<KeyType, DataType> getData(final RecordMapper<KeyType, DataType> mapper, final String query) {
Map<KeyType, DataType> map = new HashMap<>();
PreparedStatement preparedStatement = null;
ResultSet resultSet = null;
try {
preparedStatement = DatabaseConnection.getConnection().prepareStatement(query);
resultSet = preparedStatement.executeQuery();
while (resultSet.next()) {
mapper.appendToMap(resultSet, map);
}
} catch (SQLException ex) {
Logger.getLogger(Entity.class.getName()).log(Level.SEVERE, null, ex);
} finally {
if (resultSet != null) {
try { resultSet.close(); } catch (Exception e) {}
}
if (preparedStatement != null) {
try { preparedStatement.close(); } catch (Exception e) {}
}
}
return map;
}
public void executeQuery(final String query, final List<Object> data) {
PreparedStatement preparedStatement = null;
try {
preparedStatement = DatabaseConnection.getConnection().prepareStatement(query);
int dataIndex = 1; // JDBC parameter indices are 1-based
for (Object dataObject : data) {
preparedStatement.setObject(dataIndex, dataObject);
dataIndex++;
}
preparedStatement.execute();
} catch (SQLException ex) {
Logger.getLogger(Entity.class.getName()).log(Level.SEVERE, null, ex);
} finally {
if (preparedStatement != null) {
try { preparedStatement.close(); } catch (Exception e) {}
}
}
}
}
To answer your second question, I think that putting your query strings in the abstract parent is certainly not a good idea: each time you create a new entity, you would have to add a new query to the parent. Weird... unless I haven't understood your question properly.
Personally I think the queries should be built dynamically, using reflection and annotations, but that answer would get a bit long. Once again, you can take a look at JPA to see what creating an entity looks like. By the way, it would be even better if the entities didn't have to extend a parent Entity class.
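For reference, a minimal JPA entity for the same table would look roughly like this (a sketch using javax.persistence annotations):
@Entity
@Table(name = "accounts")
public class AccountEntity {
    @Id
    private String username;
    private String password;

    protected AccountEntity() {} // no-arg constructor required by JPA

    public AccountEntity(String username, String password) {
        this.username = username;
        this.password = password;
    }
}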
I have an ExecutorService that is used to handle a stream of tasks. The tasks are represented by my DaemonTask class, and each task builds a response object which is passed to a response call (outside the scope of this question). I am using a switch statement to spawn the appropriate task based on a task id int. It looks something like:
//in my api listening thread
executorService.submit(DaemonTask.buildTask(taskID));
//daemon task class
public abstract class DaemonTask implements Runnable {
public static DaemonTask buildTask(int taskID) {
switch(taskID) {
case TASK_A_ID: return new WiggleTask();
case TASK_B_ID: return new WobbleTask();
// ...very long list ...
case TASK_ZZZ_ID: return new WaggleTask();
default: throw new IllegalArgumentException("Unknown task id: " + taskID);
}
}
public void run() {
respond(execute());
}
public abstract Response execute();
}
All of my task classes (such as WiggleTask() ) extend DaemonTask and provide an implementation for the execute() method.
My question is simply: is this pattern reasonable? Something feels wrong when I look at my huge switch case with all its return statements. I have tried to come up with a more elegant lookup-table solution using reflection in some way, but can't seem to figure out an approach that would work.
Do you really need so many classes? You could have one method per taskId.
final ResponseHandler handler = ... // has many methods.
// use a map or array or enum to translate transIds into method names.
final Method method = handler.getClass().getMethod(taskArray[taskID]);
executorService.submit(new Callable<Void>() {
public Void call() throws Exception {
method.invoke(handler);
return null;
}
});
If you have to have many classes, you can do
// use a map or array or enum to translate transIds into methods.
final Runnable runs = (Runnable) Class.forName(taskClassArray[taskID]).newInstance();
executorService.submit(new Callable<Void>() {
public Void call() throws Exception {
runs.run();
return null;
}
});
You can use an enum:
public enum TaskBuilder
{
// Task definitions
TASK_A_ID(1) {
@Override
public DaemonTask newTask()
{
return new WiggleTask();
}
},
// etc
; // end of constant list
// Build lookup map
private static final Map<Integer, TaskBuilder> LOOKUP_MAP
= new HashMap<Integer, TaskBuilder>();
static {
for (final TaskBuilder builder: values())
LOOKUP_MAP.put(builder.taskID, builder);
}
private final int taskID;
public abstract DaemonTask newTask();
TaskBuilder(final int taskID)
{
this.taskID = taskID;
}
// Note: null needs to be handled somewhat
public static TaskBuilder fromTaskID(final int taskID)
{
return LOOKUP_MAP.get(taskID);
}
}
With such an enum, you can then do:
TaskBuilder.fromTaskID(taskID).newTask();
Another possibility is to store a Constructor in a field instead of overriding a method; that is, you use reflection. It is much easier to write and it works OK, but exception handling then becomes nothing short of a nightmare:
private enum TaskBuilder
{
TASK_ID_A(1, WiggleTask.class),
// others
; // end of constant list
// Build lookup map
private static final Map<Integer, TaskBuilder> LOOKUP_MAP
= new HashMap<Integer, TaskBuilder>();
static {
for (final TaskBuilder builder: values())
LOOKUP_MAP.put(builder.index, builder);
}
private final int index;
private final Constructor<? extends DaemonTask> constructor;
TaskBuilder(final int index, final Class<? extends DaemonTask> c)
{
this.index = index;
// This can fail...
try {
constructor = c.getConstructor();
} catch (NoSuchMethodException e) {
throw new ExceptionInInitializerError(e);
}
}
// Ewww, three exceptions :(
public DaemonTask newTask()
throws IllegalAccessException, InvocationTargetException,
InstantiationException
{
return constructor.newInstance();
}
// Note: null needs to be handled somewhat
public static TaskBuilder fromTaskID(final int taskID)
{
return LOOKUP_MAP.get(taskID);
}
}
This enum can be used the same way as the other one.
I am trying to invoke a stored procedure which has default (optional) arguments without passing them and it is not working. Essentially the same problem as described here.
My code:
SqlParameterSource in = new MapSqlParameterSource()
.addValue("ownname", "USER")
.addValue("tabname", cachedTableName)
.addValue("estimate_percent", 20)
.addValue("method_opt", "FOR ALL COLUMNS SIZE 1")
.addValue("degree", 0)
.addValue("granularity", "AUTO")
.addValue("cascade", Boolean.TRUE)
.addValue("no_invalidate", Boolean.FALSE)
.addValue("force", Boolean.FALSE);
And I get an exception:
Caused by: org.springframework.dao.InvalidDataAccessApiUsageException: Required input parameter 'PARTNAME' is missing
at org.springframework.jdbc.core.CallableStatementCreatorFactory$CallableStatementCreatorImpl.createCallableStatement(CallableStatementCreatorFactory.java:209)
Where PARTNAME is an optional parameter according to this. Also confirmed by the fact that I can run this procedure w/o the PARTNAME argument manually.
After giving up on this question and just passing all the parameters, including optional ones, I ran into JDBC's inability to pass boolean arguments, because BOOLEAN is not an SQL data type, only a PL/SQL one.
So my current position is that JDBC is not suited for running stored procedures, and this is how I'm working around it:
jdbcTemplate.execute(
new CallableStatementCreator() {
public CallableStatement createCallableStatement(Connection con) throws SQLException{
CallableStatement cs = con.prepareCall("{call sys.dbms_stats.gather_table_stats(ownname=>user, tabname=>'" + cachedMetadataTableName + "', estimate_percent=>20, method_opt=>'FOR ALL COLUMNS SIZE 1', degree=>0, granularity=>'AUTO', cascade=>TRUE, no_invalidate=>FALSE, force=>FALSE) }");
return cs;
}
},
new CallableStatementCallback() {
public Object doInCallableStatement(CallableStatement cs) throws SQLException{
cs.execute();
return null; // Whatever is returned here is returned from the jdbcTemplate.execute method
}
}
);
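The same workaround can at least bind the non-boolean values instead of concatenating them into the SQL. This is an untested sketch: Oracle's driver accepts PL/SQL named notation with ? placeholders, but the BOOLEAN arguments still have to stay as PL/SQL literals:
jdbcTemplate.execute(
    new CallableStatementCreator() {
        public CallableStatement createCallableStatement(Connection con) throws SQLException {
            CallableStatement cs = con.prepareCall(
                "{call sys.dbms_stats.gather_table_stats(ownname=>user, tabname=>?, "
                + "estimate_percent=>?, method_opt=>?, degree=>?, granularity=>?, "
                + "cascade=>TRUE, no_invalidate=>FALSE, force=>FALSE) }");
            cs.setString(1, cachedMetadataTableName);
            cs.setInt(2, 20);
            cs.setString(3, "FOR ALL COLUMNS SIZE 1");
            cs.setInt(4, 0);
            cs.setString(5, "AUTO");
            return cs;
        }
    },
    new CallableStatementCallback() {
        public Object doInCallableStatement(CallableStatement cs) throws SQLException {
            cs.execute();
            return null;
        }
    }
);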
Came up with a decent solution to this today, that copes with non-null defaults, and does not use fruity reflection techniques. It works by creating the metadata context for the function externally to retrieve all the parameter types and so forth, then constructing the SimpleJdbcCall manually from that.
First, create a CallMetaDataContext for the function:
CallMetaDataContext context = new CallMetaDataContext();
context.setFunction(true);
context.setSchemaName(schemaName);
context.setProcedureName(functionName);
context.initializeMetaData(jdbcTemplate.getDataSource());
context.processParameters(Collections.emptyList());
Next, create the SimpleJdbcCall, but force it to not do its own metadata lookup:
SimpleJdbcCall simpleJdbcCall = new SimpleJdbcCall(jdbcTemplate);
// This forces the call object to skip metadata lookup, which is the part that forces all parameters
simpleJdbcCall.setAccessCallParameterMetaData(false);
// Now go back to our previously created context and pull the parameters we need from it.
// Parameter 0 is the function's return value; the input parameters follow it.
simpleJdbcCall.addDeclaredParameter(context.getCallParameters().get(0));
for (int i = 0; i < params.length; ++i) {
simpleJdbcCall.addDeclaredParameter(context.getCallParameters().get(i + 1));
}
// Call the function and retrieve the result
Map<String, Object> resultsMap = simpleJdbcCall
.withSchemaName(schemaName)
.withFunctionName(functionName)
.execute(params);
Object returnValue = resultsMap.get(context.getScalarOutParameterName());
I found a solution for my case with SimpleJdbcCall and Spring 5.2.1, Java 8, and Oracle 12.
You need to:
Use .withoutProcedureColumnMetaDataAccess()
Use .withNamedBinding()
Declare the parameters you know about in the .declareParameters() call. The procedure will be called with only the parameters declared in this method; default parameters you don't want to set are simply not declared here.
An example call is below:
final String dataParamName = "P_DATA";
final String ageParamName = "P_AGE";
final String genderParamName = "P_GENDER";
final String acceptedParamName = "P_ACCEPTED";
SimpleJdbcCall simpleJdbcCall = new SimpleJdbcCall(getJdbcTemplate())
.withCatalogName("PKG_USER")
.withProcedureName("USER_CHECK")
.withoutProcedureColumnMetaDataAccess()
.withNamedBinding()
.declareParameters(
new SqlParameter(dataParamName, OracleTypes.VARCHAR),
new SqlParameter(ageParamName, OracleTypes.NUMBER),
new SqlParameter(genderParamName, OracleTypes.VARCHAR),
new SqlOutParameter(acceptedParamName, OracleTypes.NUMBER)
);
SqlParameterSource parameterSource = new MapSqlParameterSource()
.addValue(dataParamName, data)
.addValue(ageParamName, age)
.addValue(genderParamName, gender);
Map<String, Object> out = simpleJdbcCall.execute(parameterSource);
Here is a different approach that I have taken. I added the ability for the user to set the number of parameters they will be providing on the call. These will be the first n positional parameters. Any remaining parameters available in the stored proc will have to be set via the database's default-value handling. This allows new parameters to be added to the end of the list with default values, or as nullable, without breaking code that does not know to provide a value.
I sub-classed SimpleJdbcCall and added the methods to set the "maxParamCount". I also used a bit of evil reflection to set my sub-classed version of CallMetaDataContext.
public class MySimpleJdbcCall extends SimpleJdbcCall
{
private final MyCallMetaDataContext callMetaDataContext = new MyCallMetaDataContext();
public MySimpleJdbcCall(DataSource dataSource)
{
this(new JdbcTemplate(dataSource));
}
public MySimpleJdbcCall(JdbcTemplate jdbcTemplate)
{
super(jdbcTemplate);
try
{
// Access private field
Field callMetaDataContextField = AbstractJdbcCall.class.getDeclaredField("callMetaDataContext");
callMetaDataContextField.setAccessible(true);
// Make it non-final
Field modifiersField = Field.class.getDeclaredField("modifiers");
modifiersField.setAccessible(true);
modifiersField.setInt(callMetaDataContextField, callMetaDataContextField.getModifiers() & ~Modifier.FINAL);
// Set field
callMetaDataContextField.set(this, this.callMetaDataContext);
}
catch (NoSuchFieldException | IllegalAccessException ex)
{
throw new RuntimeException("Exception thrown overriding AbstractJdbcCall.callMetaDataContext field", ex);
}
}
public MySimpleJdbcCall withMaxParamCount(int maxInParamCount)
{
setMaxParamCount(maxInParamCount);
return this;
}
public int getMaxParamCount()
{
return this.callMetaDataContext.getMaxParamCount();
}
public void setMaxParamCount(int maxInParamCount)
{
this.callMetaDataContext.setMaxParamCount(maxInParamCount);
}
}
In my CallMetaDataContext sub-class, I store the maxInParamCount, and use it to trim the list of parameters known to exist in the stored-proc.
public class MyCallMetaDataContext extends CallMetaDataContext
{
private int maxParamCount = Integer.MAX_VALUE;
public int getMaxParamCount()
{
return maxParamCount;
}
public void setMaxParamCount(int maxInParamCount)
{
this.maxParamCount = maxInParamCount;
}
@Override
protected List<SqlParameter> reconcileParameters(List<SqlParameter> parameters)
{
List<SqlParameter> limitedParams = new ArrayList<>();
int paramCount = 0;
for (SqlParameter param : super.reconcileParameters(parameters))
{
if (!param.isResultsParameter())
{
paramCount++;
if (paramCount > this.maxParamCount)
continue;
}
limitedParams.add(param);
}
return limitedParams;
}
}
Use is basically the same, except for setting the max parameter count.
SimpleJdbcCall call = new MySimpleJdbcCall(jdbcTemplate)
.withMaxParamCount(3)
.withProcedureName("MayProc");
SMALL RANT: It's funny that Spring is well known for its IoC container, yet within its utility classes I have to resort to reflection to provide an alternate implementation of a dependent class.
I was also struggling with this problem and didn't want to deal with strings.
A more interesting solution might be possible by taking the default values from the metadata, which Spring ignores in its default implementation, but I simply put nulls there.
The solution came out as follows.
The overridden SimpleJdbcCall:
private class JdbcCallWithDefaultArgs extends SimpleJdbcCall {
CallableStatementCreatorFactory callableStatementFactory;
public JdbcCallWithDefaultArgs(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
#Override
protected CallableStatementCreatorFactory getCallableStatementFactory() {
return callableStatementFactory;
}
#Override
protected void onCompileInternal() {
callableStatementFactory =
new CallableStatementCreatorWithDefaultArgsFactory(getCallString(), this.getCallParameters());
callableStatementFactory.setNativeJdbcExtractor(getJdbcTemplate().getNativeJdbcExtractor());
}
#Override
public Map<String, Object> execute(SqlParameterSource parameterSource) {
((CallableStatementCreatorWithDefaultArgsFactory)callableStatementFactory).cleanupParameters(parameterSource);
return super.doExecute(parameterSource);
}
}
And the overridden CallableStatementCreatorFactory:
public class CallableStatementCreatorWithDefaultArgsFactory extends CallableStatementCreatorFactory {
private final Logger logger = LoggerFactory.getLogger(getClass());
private final List<SqlParameter> declaredParameters;
public CallableStatementCreatorWithDefaultArgsFactory(String callString, List<SqlParameter> declaredParameters) {
super(callString, declaredParameters);
this.declaredParameters = declaredParameters;
}
protected void cleanupParameters(SqlParameterSource sqlParameterSource) {
MapSqlParameterSource mapSqlParameterSource = (MapSqlParameterSource) sqlParameterSource;
Iterator<SqlParameter> declaredParameterIterator = declaredParameters.iterator();
Set<String> parameterNameSet = mapSqlParameterSource.getValues().keySet();
while (declaredParameterIterator.hasNext()) {
SqlParameter parameter = declaredParameterIterator.next();
if (!(parameter instanceof SqlOutParameter) &&
(!mapContainsParameterIgnoreCase(parameter.getName(), parameterNameSet))) {
logger.warn("Missing value parameter "+parameter.getName() + " will be replaced by null!");
mapSqlParameterSource.addValue(parameter.getName(), null);
}
}
}
private boolean mapContainsParameterIgnoreCase(String parameterName, Set<String> parameterNameSet) {
String lowerParameterName = parameterName.toLowerCase();
for (String parameter : parameterNameSet) {
if (parameter.toLowerCase().equals(lowerParameterName)) {
return true;
}
}
return false;
}
@Override
public void addParameter(SqlParameter param) {
this.declaredParameters.add(param);
}
}
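Usage is then the same as with a plain SimpleJdbcCall; any IN parameter missing from the source is logged and bound as null (a sketch; procedure and parameter names are illustrative):
SimpleJdbcCall call = new JdbcCallWithDefaultArgs(jdbcTemplate)
        .withProcedureName("MY_PROC");
SqlParameterSource in = new MapSqlParameterSource()
        .addValue("p_required", 42); // omitted parameters are nulled by cleanupParameters()
Map<String, Object> out = call.execute(in);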
I use this util method:
public <T> void setOptionalParameter(MapSqlParameterSource parameters, String name, T value) {
if (value == null)
parameters.addValue(name, value, Types.NULL);
else
parameters.addValue(name, value);
}
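Used with the original example, an optional parameter is then bound explicitly as a typed null rather than omitted (sketch):
MapSqlParameterSource in = new MapSqlParameterSource();
setOptionalParameter(in, "partname", null); // binds Types.NULL instead of leaving it out
setOptionalParameter(in, "ownname", "USER");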
Is it possible to lazily instantiate a final field?
The following code does not compile:
public class Test{
private final Connection conn;
public Connection getConnection(){
if(conn==null){
conn = new Connection();
}
return conn;
}
}
Is there an alternative?
No. The point of a final field is that it's set once, during construction, and will never change thereafter. How could the compiler or the VM know anything useful about conn in your case? How would it know that only that property should be able to set it, and not some other method?
Perhaps if you explained what you want the semantics to be, we could come up with an alternative. You could potentially have a "provider" interface representing a way to fetch a value, and then a MemoizingProvider which proxies to another provider, calling it only once and caching the value thereafter. That wouldn't be able to have a final field for the cached value either, but at least it would only be in one place.
Here's one way you can do it using Memoisation (with Callables):
Class Memo:
public class Memo<T> {
private T result;
private final Callable<T> callable;
private boolean established;
public Memo(final Callable<T> callable) {
this.callable = callable;
}
public T get() {
if (!established) {
try {
result = callable.call();
established = true;
}
catch (Exception e) {
throw new RuntimeException("Failed to get value of memo", e);
}
}
return result;
}
}
Now we can create a final conn!
private final Memo<Connection> conn = new Memo<Connection>(
new Callable<Connection>() {
public Connection call() throws Exception {
return new Connection();
}
});
public Connection getConnection() {
return conn.get();
}
dhiller's answer is the classic double-checked locking bug; do not use it.
As Jon Skeet said, no, there isn't.
Interpreting your code sample you may want to do something like this:
public class Test{
private final Object mutex = new Object(); // No public locking
private Connection conn;
public Connection getConnection(){
if(conn==null){
synchronized (mutex) {
if(conn==null){
conn = new Connection();
}
}
}
return conn;
}
}
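Note that this is the double-checked locking idiom flagged in the comment above; under the Java 5 memory model it is only correct if the field is declared volatile:
private volatile Connection conn; // without volatile, readers may see a partially constructed object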
As a side note, it is possible to change a final field, at least an instance field. You just need some reflection:
import java.lang.reflect.Field;
public class LazyFinalField {
private final String finalField = null;
public static void main(String[] args) throws Exception {
LazyFinalField o = new LazyFinalField();
System.out.println("Original Value = " + o.finalField);
Field finalField = LazyFinalField.class.getDeclaredField("finalField");
finalField.setAccessible(true);
finalField.set(o, "Hello World");
System.out.println("New Value = " + o.finalField);
}
}
This prints:
Original Value = null
New Value = Hello World