Overriding transaction propagation levels for methods annotated with Spring's @Transactional

I have multiple methods in my codebase annotated with Spring's @Transactional with different propagation levels (let's ignore the idea behind choosing the propagation levels). Example -
public class X {

    @Transactional(propagation = Propagation.NOT_SUPPORTED)
    public void A() { do_something; }

    @Transactional(propagation = Propagation.REQUIRED)
    public void B() { do_something; }

    @Transactional(propagation = Propagation.REQUIRES_NEW)
    public void C() { do_something; }
}
Now I have a new use case where I want to perform all these operations in a single transaction (for this specific use case only, without modifying existing behavior), overriding any annotated propagation levels. Example -
public class Y {

    private X x;

    // Stores application's global state
    private GlobalState globalState;

    @Transactional
    public void newOperation() {
        // Set current operation as the new operation in the global state,
        // in case this info might be required somewhere
        globalState.setCurrentOperation("newOperation");
        // For this new operation A, B, C should be performed in the current
        // transaction regardless of the propagation level defined on them
        x.A();
        x.B();
        x.C();
    }
}
Does Spring provide some way to achieve this? Or is this not possible?
One way I could think of is to split the original methods:
@Transactional(propagation = Propagation.NOT_SUPPORTED)
public void A() { A_actual(); }

// Call A_actual from both A and newOperation
public void A_actual() { do_something; }
But this might not be as simple as in this example (there can be a lot of such methods, so this approach might not scale), and it does not look very clean either.
The use case might also appear counter-intuitive, but let's keep that out of the scope of this question.

I believe the only option is to replace the TransactionInterceptor via a BeanPostProcessor, something like:
public class TransactionInterceptorExt extends TransactionInterceptor {

    @Override
    public Object invoke(MethodInvocation invocation) throws Throwable {
        // here some logic determining how to proceed invocation
        return super.invoke(invocation);
    }
}
public class TransactionInterceptorPostProcessor implements BeanFactoryPostProcessor, BeanPostProcessor, BeanFactoryAware {

    @Setter
    private BeanFactory beanFactory;

    @Override
    public void postProcessBeanFactory(@NonNull ConfigurableListableBeanFactory beanFactory) throws BeansException {
        beanFactory.addBeanPostProcessor(this);
    }

    @Override
    public Object postProcessBeforeInitialization(@NonNull Object bean, @NonNull String beanName) throws BeansException {
        if (bean instanceof TransactionInterceptor) {
            TransactionInterceptor interceptor = (TransactionInterceptor) bean;
            TransactionInterceptor result = new TransactionInterceptorExt();
            result.setTransactionAttributeSource(interceptor.getTransactionAttributeSource());
            result.setTransactionManager(interceptor.getTransactionManager());
            result.setBeanFactory(beanFactory);
            return result;
        }
        return bean;
    }
}
@Configuration
public class CustomTransactionConfiguration {

    @Bean
    //@ConditionalOnBean(TransactionInterceptor.class)
    public static BeanFactoryPostProcessor transactionInterceptorPostProcessor() {
        return new TransactionInterceptorPostProcessor();
    }
}
However, I would agree with @jim-garrison's suggestion to refactor your Spring beans.
UPD.
But you favour refactoring the beans instead of following this approach. So, for the sake of completeness, can you please mention any issues/shortcomings with this approach?
Well, there are plenty of things/concepts/ideas in the Spring framework which were implemented without understanding/anticipating the consequences (I believe the goal was to make the framework attractive to inexperienced developers), and the @Transactional annotation is one of those things. Let's consider the following code:
@Transactional(propagation = Propagation.REQUIRED)
public void doSomething() {
    do_something;
}
The question is: why do we put the @Transactional(Propagation.REQUIRED) annotation above that method? Someone might say something like this:
that method modifies multiple rows/tables in the DB and we would like to avoid inconsistencies; moreover, Propagation.REQUIRED does not hurt anything because, according to the contract, it either starts a new transaction or joins the existing one.
and that would be wrong:
the @Transactional annotation poisons stack traces with irrelevant information
in case of an exception, it marks the existing transaction it joined as rollback-only - after that, the caller has no option to compensate for that exception
In most cases developers should not use @Transactional(Propagation.REQUIRED) - technically we just need a simple assertion about the transaction status.
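For illustration, a minimal sketch (my own, not part of the original answer) of what such an assertion could look like, using Spring's TransactionSynchronizationManager:

import org.springframework.transaction.support.TransactionSynchronizationManager;
import org.springframework.util.Assert;

public void doSomething() {
    // Fail fast if the caller did not open a transaction, instead of silently
    // starting one via @Transactional(Propagation.REQUIRED).
    Assert.state(TransactionSynchronizationManager.isActualTransactionActive(),
            "doSomething() must be called within an active transaction");
    do_something;
}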
Using @Transactional(Propagation.REQUIRES_NEW) is even more harmful:
in case of an existing transaction, it acquires another JDBC connection from the connection pool, and hence you start getting 2+ connections per thread - this hurts performance and sizing
you need to carefully watch the data you are working with - data corruption and self-locks are the consequences of using @Transactional(Propagation.REQUIRES_NEW), because now you have two incarnations of the same data within the same thread
In most cases, @Transactional(Propagation.REQUIRES_NEW) is an indicator that your code requires refactoring.
So, the general idea about the @Transactional annotation is: do not use it everywhere just because we can. Your question actually confirms this idea: you have failed to tie 3 methods together just because the developer had some assumptions about how those methods should be executed.

Related

How to add SynchronizationCallbacks to @TransactionalEventListener during Spring Boot application startup?

I have a Spring Boot application that uses a few @TransactionalEventListener(phase = TransactionPhase.AFTER_COMMIT) listeners. I noticed that Spring Boot doesn't do any exception logging for them when they end up throwing an exception.
Because of this I wanted to add some generic logging facility for such exceptions. I found that TransactionalApplicationListener.SynchronizationCallback is the interface I need to implement. However, it seems complicated to register these callbacks. I didn't find any call to TransactionalApplicationListener#addCallback in the Spring dependencies that would achieve this.
Trying to get a list of TransactionalApplicationListeners and the SynchronizationCallbacks injected, and then calling addCallback in a @PostConstruct method, didn't get me any further, because no listeners were ever injected even though the application did make successful use of them.
So how do I add SynchronizationCallbacks to TransactionalApplicationListeners during Spring Boot application startup?
The first thing to note is that TransactionalApplicationListeners, like all ApplicationListeners, are not beans in the Spring context. They live somewhat outside of it (see org.springframework.context.ConfigurableApplicationContext#addApplicationListener). So the application context cannot inject them.
While debugging and looking through the Spring sources, one finds that these listeners are created by org.springframework.transaction.event.TransactionalEventListenerFactory. And that is where my solution steps in: we decorate that factory with another one that is aware of SynchronizationCallbacks:
public class SynchronizationCallbackAwareFactory implements EventListenerFactory, Ordered {

    private final TransactionalEventListenerFactory delegate;
    private final Provider<List<SynchronizationCallback>> synchronizationCallbacks;
    private final int order;

    public SynchronizationCallbackAwareFactory(TransactionalEventListenerFactory transactionalEventListenerFactory,
                                               Provider<List<SynchronizationCallback>> synchronizationCallbacks,
                                               int order) {
        this.delegate = transactionalEventListenerFactory;
        this.synchronizationCallbacks = synchronizationCallbacks;
        this.order = order;
    }

    @Override
    public boolean supportsMethod(Method method) {
        return delegate.supportsMethod(method);
    }

    @Override
    public ApplicationListener<?> createApplicationListener(String beanName, Class<?> type, Method method) {
        ApplicationListener<?> applicationListener = delegate.createApplicationListener(beanName, type, method);
        if (applicationListener instanceof TransactionalApplicationListener) {
            TransactionalApplicationListener<?> listener = (TransactionalApplicationListener<?>) applicationListener;
            Collection<SynchronizationCallback> callbacks = this.synchronizationCallbacks.get();
            callbacks.forEach(listener::addCallback);
        }
        return applicationListener;
    }

    @Override
    public int getOrder() {
        return order;
    }
}
Note that I use a javax.inject.Provider in my case to delay the retrieval of the callbacks until the latest possible time.
The decorator has to be Ordered because Spring will use the first factory it comes across that supports the method. Therefore the order of an instance of this class has to have higher precedence than the order value 50 of TransactionalEventListenerFactory.
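For completeness, registering the decorator could look something like the sketch below. This wiring is my assumption, not part of the original answer; the configuration and bean method names are illustrative, and order 40 is simply any value with higher precedence than 50.

@Configuration
public class SynchronizationCallbackAwareConfig {

    @Bean
    public SynchronizationCallbackAwareFactory synchronizationCallbackAwareFactory(
            Provider<List<TransactionalApplicationListener.SynchronizationCallback>> callbacks) {
        // Delegate to a fresh default factory; order 40 takes precedence over the
        // default TransactionalEventListenerFactory, which is ordered at 50.
        return new SynchronizationCallbackAwareFactory(new TransactionalEventListenerFactory(), callbacks, 40);
    }
}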
I had a similar problem with code like the one below
@Transactional(propagation = Propagation.REQUIRES_NEW)
public class SomeListenerFacade {

    @TransactionalEventListener
    public void onSomething(SomeEvent event) {
        throw new RuntimeException("some cause");
    }
}
I followed your solution and it worked. Along the way I found an alternative way to at least see that exception in the log file:
# application.properties
logging.level.org.springframework.transaction.support.TransactionSynchronizationUtils = DEBUG

@Transactional annotation does not roll back on RuntimeException even if @EnableTransactionManagement is provided

I have the following application setup:
@SpringBootApplication
@EnableTransactionManagement
public class MyApp extends SpringBootServletInitializer {
    ...
}
with a class which has the following:
public class DoStaff {

    public void doStaffOnAll(List<MyObject> myObjects) {
        for (int i = 0; i < myObjects.size(); i++) {
            try {
                doStaffOnSingle(myObjects.get(i), i);
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }

    @Transactional
    public void doStaffOnSingle(MyObject myObject, int i) {
        repository.save(myObject);
        if (i % 2 == 0) {
            throw new RuntimeException();
        }
    }
}
So if I call DoStaff.doStaffOnAll with a list of MyObjects, the code saves all elements from the list but also throws a runtime exception for every second element.
Since doStaffOnSingle has the @Transactional annotation, I would expect that every second element will be rolled back.
But if I run this code, every element is saved in the DB successfully. Why is that? What am I doing wrong?
Quoting Spring Documentation:
In proxy mode (which is the default), only external method calls coming in through the proxy are intercepted. This means that self-invocation (in effect, a method within the target object calling another method of the target object) does not lead to an actual transaction at runtime even if the invoked method is marked with @Transactional. Also, the proxy must be fully initialized to provide the expected behavior, so you should not rely on this feature in your initialization code (that is, @PostConstruct).
Move the doStaffOnAll() to a different Spring component, and it'll work.
Or change to AspectJ mode.
I would recommend moving the method, and designing the code so that transaction boundaries are clear and distinct, i.e. either all public methods on the class start a transaction, or no methods on the class start a transaction.
It should always be very clear where your transaction boundaries are, e.g. in a layered design, you would normally make the @Service layer also be the transaction layer, i.e. any call from a higher layer to the service layer is an atomic transaction.
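A minimal sketch of the first suggestion, moving the looping method into a separate Spring bean so the call to doStaffOnSingle goes through the proxy (the orchestrator class name is made up here, and DoStaff is assumed to be a Spring bean):

@Service
public class DoStaffOrchestrator {

    @Autowired
    private DoStaff doStaff;

    public void doStaffOnAll(List<MyObject> myObjects) {
        for (int i = 0; i < myObjects.size(); i++) {
            try {
                // External call through the Spring proxy, so @Transactional applies.
                doStaff.doStaffOnSingle(myObjects.get(i), i);
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
}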
The @Transactional annotation is able to do its magic because of a proxy object.
Since you call the method directly, you don't get that magic: in the doStaffOnAll method you are directly invoking the doStaffOnSingle method, so none of the transactional behaviour gets added.
Try invoking the method through a self-injected reference, so the call goes through the proxy:
@Service
public class DoStaff {

    @Autowired
    private DoStaff doStaff;

    public void doStaffOnAll(List<MyObject> myObjects) {
        for (int i = 0; i < myObjects.size(); i++) {
            doStaff.doStaffOnSingle(myObjects.get(i), i); // invoke through the injected proxy like this
        }
    }

    @Transactional
    public void doStaffOnSingle(MyObject myObject, int i) {
    }
}
Since doStaffOnSingle has the @Transactional annotation, I would expect that every second element will be rolled back.
The default transaction propagation will commit everything or nothing. I think you would want to use REQUIRES_NEW propagation.
Look here for supported propagation types.
https://docs.spring.io/spring-framework/docs/current/javadoc-api/org/springframework/transaction/annotation/Propagation.html#REQUIRED
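If you go that route, the change would look something like this (a sketch of that suggestion; note it only takes effect once the call actually goes through the Spring proxy, as explained in the other answers):

@Transactional(propagation = Propagation.REQUIRES_NEW)
public void doStaffOnSingle(MyObject myObject, int i) {
    // Each element is now saved in its own transaction, so a RuntimeException
    // here rolls back only this element's save.
    repository.save(myObject);
    if (i % 2 == 0) {
        throw new RuntimeException();
    }
}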

Dynamic dependency injection for multiple implementations of the same interface with Spring MVC

I am working on a REST API where I have an interface that defines a list of methods which are implemented by 4 different classes, with the possibility of adding many more in the future.
When I receive an HTTP request from the client there is some information included in the URL which will determine which implementation needs to be used.
Within my controller, I would like to have the end-point method contain a switch statement that checks the URL path variable and then uses the appropriate implementation.
I know that I can define and inject the concrete implementations into the controller and then select which one I would like to use in each particular case in the switch statement, but this doesn't seem very elegant or scalable, for 2 reasons:
I now have to instantiate all of the services, even though I only need to use one.
The code seems like it could be much leaner, since I am literally calling the same method that is defined in the interface with the same parameters. In this example it is not really an issue, but as the list of implementations grows, so does the number of cases and the redundant code.
Is there a better way to solve this type of situation? I am using Spring Boot 2 and JDK 10; ideally, I'd like to implement the most modern solution.
My Current Approach
@RequestMapping(Requests.MY_BASE_API_URL)
public class MyController {

    //== FIELDS ==
    private final ConcreteServiceImpl1 concreteService1;
    private final ConcreteServiceImpl2 concreteService2;
    private final ConcreteServiceImpl3 concreteService3;

    //== CONSTRUCTORS ==
    @Autowired
    public MyController(ConcreteServiceImpl1 concreteService1, ConcreteServiceImpl2 concreteService2,
                        ConcreteServiceImpl3 concreteService3) {
        this.concreteService1 = concreteService1;
        this.concreteService2 = concreteService2;
        this.concreteService3 = concreteService3;
    }

    //== REQUEST MAPPINGS ==
    @GetMapping(Requests.SPECIFIC_REQUEST)
    public ResponseEntity<?> handleSpecificRequest(@PathVariable String source,
                                                   @RequestParam String start,
                                                   @RequestParam String end) {
        source = source.toLowerCase();
        if (MyConstants.SOURCES.contains(source)) {
            switch (source) {
                case "value1":
                    concreteService1.doSomething(start, end);
                    break;
                case "value2":
                    concreteService2.doSomething(start, end);
                    break;
                case "value3":
                    concreteService3.doSomething(start, end);
                    break;
            }
        } else {
            // An invalid source path variable was received
        }
        // Return something after additional processing
        return null;
    }
}
In Spring you can get all implementations of an interface (say T) by injecting a List<T> or a Map<String, T> field. In the second case the names of the beans will become the keys of the map. You could consider this if there are a lot of possible implementations or if they change often: it lets you add or remove an implementation without changing the controller.
Injecting a List and injecting a Map each have some benefits and drawbacks in this case. If you inject a List, you would probably need to add some method to map the name to the implementation. Something like:
interface MyInterface {
    // (...)
    String name();
}
This way you could transform it into a Map<String, MyInterface>, for example using the Streams API. While this would be more explicit, it would pollute your interface a bit (why should it be aware that there are multiple implementations?).
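A small sketch of that transformation (the variable names are illustrative; it uses java.util.stream.Collectors and java.util.function.Function):

// Build a lookup map from the injected implementations, keyed by their self-declared name.
Map<String, MyInterface> byName = implementations.stream()
        .collect(Collectors.toMap(MyInterface::name, Function.identity()));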
When using the Map you should probably name the beans explicitly, or even introduce an annotation, to follow the principle of least astonishment. If you name the beans by the class name or the method name of the configuration class, you could break the app by renaming those (and in effect changing the URL), even though renaming is usually a safe operation.
A simplistic implementation in Spring Boot could look like this:
@SpringBootApplication
public class DynamicDependencyInjectionForMultipleImplementationsApplication {

    public static void main(String[] args) {
        SpringApplication.run(DynamicDependencyInjectionForMultipleImplementationsApplication.class, args);
    }

    interface MyInterface {
        Object getStuff();
    }

    class Implementation1 implements MyInterface {
        @Override public Object getStuff() {
            return "foo";
        }
    }

    class Implementation2 implements MyInterface {
        @Override public Object getStuff() {
            return "bar";
        }
    }

    @Configuration
    class Config {
        @Bean("getFoo")
        Implementation1 implementation1() {
            return new Implementation1();
        }

        @Bean("getBar")
        Implementation2 implementation2() {
            return new Implementation2();
        }
    }

    @RestController
    class Controller {

        private final Map<String, MyInterface> implementations;

        Controller(Map<String, MyInterface> implementations) {
            this.implementations = implementations;
        }

        @GetMapping("/run/{beanName}")
        Object runSelectedImplementation(@PathVariable String beanName) {
            return Optional.ofNullable(implementations.get(beanName))
                    .orElseThrow(UnknownImplementation::new)
                    .getStuff();
        }

        @ResponseStatus(BAD_REQUEST)
        class UnknownImplementation extends RuntimeException {
        }
    }
}
It passes the following tests:
@RunWith(SpringRunner.class)
@SpringBootTest
@AutoConfigureMockMvc
public class DynamicDependencyInjectionForMultipleImplementationsApplicationTests {

    @Autowired
    private MockMvc mockMvc;

    @Test
    public void shouldCallImplementation1() throws Exception {
        mockMvc.perform(get("/run/getFoo"))
                .andExpect(status().isOk())
                .andExpect(content().string(containsString("foo")));
    }

    @Test
    public void shouldCallImplementation2() throws Exception {
        mockMvc.perform(get("/run/getBar"))
                .andExpect(status().isOk())
                .andExpect(content().string(containsString("bar")));
    }

    @Test
    public void shouldRejectUnknownImplementations() throws Exception {
        mockMvc.perform(get("/run/getSomethingElse"))
                .andExpect(status().isBadRequest());
    }
}
Regarding two of your doubts:
1. Instantiating the service objects should not be an issue, as this is a one-time job and the controller is going to need them to serve all types of requests.
2. You can use exact path mappings to get rid of the switch case. For example:
#GetMapping("/specificRequest/value1")
#GetMapping("/specificRequest/value2")
#GetMapping("/specificRequest/value3")
Each of the above mappings will be on a separate method which deals with a specific source value and invokes the respective service method (see the sketch below).
Hopefully this will help make the code cleaner and more elegant.
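A hedged sketch of what those per-source mappings could look like, reusing the services from the question (the return handling is simplified and purely illustrative):

@GetMapping("/specificRequest/value1")
public ResponseEntity<?> handleValue1(@RequestParam String start, @RequestParam String end) {
    concreteService1.doSomething(start, end);
    return ResponseEntity.ok().build();
}

@GetMapping("/specificRequest/value2")
public ResponseEntity<?> handleValue2(@RequestParam String start, @RequestParam String end) {
    concreteService2.doSomething(start, end);
    return ResponseEntity.ok().build();
}

@GetMapping("/specificRequest/value3")
public ResponseEntity<?> handleValue3(@RequestParam String start, @RequestParam String end) {
    concreteService3.doSomething(start, end);
    return ResponseEntity.ok().build();
}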
There is one more option: separating this at the service layer and having only one endpoint serve all types of source. But since, as you said, there is a different implementation for each source value, the source is effectively a resource of your application, and having a separate URI/separate method makes perfect sense here. A few advantages that I see with this are:
Makes it easy to write the test cases.
Scaling one source without impacting any other source/service.
Your code deals with each source as an entity separate from the other sources.
The above approach should be fine when you have a limited set of source values. If you have no control over the source values, then further redesign is needed, e.g. differentiating sources by another value such as sourceType and having a separate controller for each group of sources.

Best practice to 'rollback' REST method calls inside method

The title might be incorrect, but I will try to explain my issue. My project is a Spring Boot project. I have services which do calls to external REST endpoints.
I have a service method which contains several method calls to other services I have. Every individual method call can be successful or not. Every method call is done to a REST endpoint, and there can be issues, for example that the web service is not available or that it throws an unknown exception in rare cases. Whatever happens, I need to be able to track which method calls were successful, and if any one of them fails, I want to roll back to the original state as if nothing happened - see it a bit like the @Transactional annotation. All REST calls are different endpoints and need to be called separately, and they belong to an external party which I don't have influence on. Example:
public class MyServiceImpl implements MyService {

    @Autowired
    private Process1Service process1Service;
    @Autowired
    private Process2Service process2Service;
    @Autowired
    private Process3Service process3Service;
    @Autowired
    private Process4Service process4Service;

    public void bundledProcess() {
        process1Service.createFileRESTcall();
        process2Service.addFilePermissionsRESTcall();
        process3Service.addFileMetadataRESTcall(); // <-- might fail for example
        process4Service.addFileTimestampRESTcall();
    }
}
If, for example, process3Service.addFileMetadataRESTcall fails, I want to do something like an undo (in reverse order) of every step before process3:
process2Service.removeFilePermissionsRESTcall();
process1Service.deleteFileRESTcall();
I read about the Command pattern, but that seems to be used for undo actions inside an application, as a sort of history of actions performed, not inside a Spring web application. Is it appropriate for my use case too, or should I track per method/web service call whether it was successful? Is there a best practice for doing this?
I guess however I track it, I need to know which method call failed and from there on perform my 'undo' method REST calls. Although in theory even these calls might also fail of course.
My main goal is to not end up with files created (in my example) on which the further processing steps have not been performed. It should either all be successful or nothing. A sort of transaction.
Update 1: improved pseudo-implementation based on comments:
public class Process1ServiceImpl implements Process1Service {

    public void createFileRESTcall() throws MyException {
        // Call an external REST api, pseudo code:
        if (REST-call fails) {
            throw new MyException("External REST api failed");
        }
    }
}
public class BundledProcessEvent {

    private boolean createFileSuccess;
    private boolean addFilePermissionsSuccess;
    private boolean addFileMetadataSuccess;
    private boolean addFileTimestampSuccess;

    // Getters and setters
}
public class MyServiceImpl implements MyService {

    @Autowired
    private Process1Service process1Service;
    @Autowired
    private Process2Service process2Service;
    @Autowired
    private Process3Service process3Service;
    @Autowired
    private Process4Service process4Service;

    @Autowired
    private ApplicationEventPublisher applicationEventPublisher;

    @Transactional(rollbackOn = MyException.class)
    public void bundledProcess() {
        BundledProcessEvent bundledProcessEvent = new BundledProcessEvent();
        this.applicationEventPublisher.publishEvent(bundledProcessEvent);
        bundledProcessEvent.setCreateFileSuccess(process1Service.createFileRESTcall());
        bundledProcessEvent.setAddFilePermissionsSuccess(process2Service.addFilePermissionsRESTcall());
        bundledProcessEvent.setAddFileMetadataSuccess(process3Service.addFileMetadataRESTcall());
        bundledProcessEvent.setAddFileTimestampSuccess(process4Service.addFileTimestampRESTcall());
    }

    @TransactionalEventListener(phase = TransactionPhase.AFTER_ROLLBACK)
    public void rollback(BundledProcessEvent bundledProcessEvent) {
        // If the last process step is successful, we should not
        // be in this rollback method at all
        //if (bundledProcessEvent.isAddFileTimestampSuccess()) {
        //    remove timestamp
        //}
        if (bundledProcessEvent.isAddFileMetadataSuccess()) {
            // remove metadata
        }
        if (bundledProcessEvent.isAddFilePermissionsSuccess()) {
            // remove file permissions
        }
        if (bundledProcessEvent.isCreateFileSuccess()) {
            // remove file
        }
    }
}
Your operation looks like a transaction, so you can use the @Transactional annotation. From your code I can't really tell how you are managing the HTTP responses for each of those operations, but you should consider having your service methods return them and then doing a rollback depending on the response codes. You can create an array of methods like so, but exactly how you want your logic to work is up to you.
private Process[] restCalls = new Process[] {
        new Process() { public void call() { process1Service.createFileRESTcall(); } },
        new Process() { public void call() { process2Service.addFilePermissionsRESTcall(); } },
        new Process() { public void call() { process3Service.addFileMetadataRESTcall(); } },
        new Process() { public void call() { process4Service.addFileTimestampRESTcall(); } },
};

interface Process {
    void call();
}

@Transactional(rollbackOn = Exception.class)
public void bundledProcess() {
    restCalls[0].call();
    ... // say, see which process returned a wrong response code
}

@TransactionalEventListener(phase = TransactionPhase.AFTER_ROLLBACK)
public void rollback() {
    // handle rollback according to failed method index
}
Check this article. Might come in handy.
The answer to this question is quite broad. There are too many ways to do distributed transactions to go through them all here. However, since you are using Java and Spring, your best bet is to use something like JTA (Java Transaction API), which enables distributed transactions across multiple services/instances/etc. Fortunately, Spring Boot supports JTA using either Atomikos or Bitronix. You can read the doc here.
One approach to enabling distributed transactions is through a message broker such as JMS, RabbitMQ, Kafka, ActiveMQ, etc., using a protocol like XA transactions (two-phase commit). For external services that do not support distributed transactions, one approach is to write a wrapper service that handles the XA transaction on behalf of that external service.

JPA sync/commit error on a POJO DTO even though I do not want to save it

Due to a lack of keywords to capture this scenario, let me just describe it. The classes have been simplified.
Given this:
public class ItemController {

    @Autowired
    ItemDtoService itemDtoService;

    @Autowired
    DiscountService discountService;

    @RequestMapping(value = "/viewItems", method = RequestMethod.POST)
    public void process() {
        List<ItemDto> itemDtos = itemDtoService.getItemDtos();
        for (ItemDto i : itemDtos) {
            boolean isDiscounted = discountService.hasDiscount(i); // throws exception here on a subsequent iteration, when the previous ItemDto was discounted
            if (isDiscounted) {
                i.setPrice(discountService.getDiscountedPrice(i));
                // do some other i.setter, basically modify the POJO
            }
        }
    }
}
An exception is thrown at discountService.hasDiscount when:
it is a subsequent iteration,
and in the previous iteration the ItemDto was discounted.
Exception is:
Caused by: org.hibernate.exception.SQLGrammarException: could not update: [somepackage.ItemDto#364]
And somewhere in the stacktrace you will see this:
at org.springframework.orm.jpa.JpaTransactionManager.doCommit(JpaTransactionManager.java:456)"
The problem is that this method call uses a DAO method underneath that is @Transactional (maybe for a good reason, even though it's only a query, a complicated one). When the JPA transaction manager does its job at the end of the method call, it sees the POJO as modified and tries to sync it. The ItemDto POJO does have @Entity, because ItemDtoService.getItemDtos uses getEntityManager().createNativeQuery(nativeSql, ItemDto.class). The 5 other class details are here:
@Entity
public class ItemDto {
    // body
}

@Service
public class ItemService {

    @Autowired
    ItemDao itemDao;

    public List<ItemDto> getItems() {
        return itemDao.getItems(); // for the sake of simplicity
    }
}

@Repository
@Transactional
public class ItemDaoImpl {

    public List<ItemDto> getItems() {
        String nativeSql = "select ....";
        return getEntityManager().createNativeQuery(nativeSql, ItemDto.class).getResultList();
    }
}

@Service
public class DiscountService {

    @Autowired
    DiscountDao discountDao;

    public boolean hasDiscount(ItemDto i) {
        boolean hasDiscount = discountDao.hasDiscount(i);
        // do other service stuff that might influence the hasDiscount flag
        return hasDiscount;
    }
}

@Repository
@Transactional
public class DiscountDaoImpl {

    public boolean hasDiscount(ItemDto i) {
        String nativeSql = "select ....";
        boolean hasDiscount;
        // in reality the query is a complicated join; it executes and returns whether there is a discount or not
        return hasDiscount;
    }
}
What am I doing wrong?
Some of the options I tried that worked include:
add (readOnly = true) to the @Transactional on the DAO methods, since they are only queries (a negative effect, though, is that those might be intentionally transactional due to complex queries and may need locking to prevent dirty reads) - see the sketch after this list
in the Controller, create a separate loop for modification: have 2 loops, one for looping through the items and seeing which are discounted, storing that info somewhere to be referenced later by a 2nd loop, which does the modification of said POJOs
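A sketch of the first option, applied to one of the DAOs from the question (whether readOnly is acceptable depends on the locking concerns noted above):

@Repository
@Transactional(readOnly = true)
public class DiscountDaoImpl {
    // With readOnly = true the persistence context is not flushed on commit,
    // so the modified ItemDto instances are not written back to the DB.
    ...
}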
I am looking at other options, and please comment if you see something wrong with the way it was coded.
Another option I just found: inside the DAO that returns the list of ItemDto, before returning the list, I would execute this:
getEntityManager().clear();
It works fine because the list contains DTOs anyway, and one would expect that these require no DB syncing; at the same time, the @Transactional is retained for the locking necessary for consistent reads.
That's one more alternative, but what is the most appropriate way really?
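For reference, the clear() approach placed inside the DAO from the question could look like this (a sketch, reusing the question's getEntityManager() accessor):

@Repository
@Transactional
public class ItemDaoImpl {

    @SuppressWarnings("unchecked")
    public List<ItemDto> getItems() {
        String nativeSql = "select ....";
        List<ItemDto> items = getEntityManager().createNativeQuery(nativeSql, ItemDto.class).getResultList();
        // Detach everything so later setter calls on the DTOs are not
        // flushed back to the DB when another @Transactional method commits.
        getEntityManager().clear();
        return items;
    }
}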
