Is there any way to add an entry to the ExecutionContext other than in the read(), update(), and open() methods?
Like in the code below, I'm trying to add an entry in the close() method.
public class MyFileReader extends FlatFileItemReader<AccountDetails> {

    private long currentRowProcessedCount = 0;

    @Autowired
    private ExecutionContext executionContext;

    @Override
    public synchronized AccountDetails read() throws Exception, UnexpectedInputException, ParseException {
        AccountDetails accDetailsObj = super.read();
        currentRowProcessedCount++;
        return accDetailsObj;
    }

    @Override
    public void open(ExecutionContext executionContext) throws ItemStreamException {
        super.open(executionContext);
        currentRowProcessedCount = executionContext.getLong(Constants.CONTEXT_COUNT_KEY.getStrValue(), 0);
        this.executionContext = executionContext;
    }

    @Override
    public void update(ExecutionContext executionContext) throws ItemStreamException {
        executionContext.putLong(Constants.CONTEXT_COUNT_KEY.getStrValue(), currentRowProcessedCount);
    }

    @Override
    public void close() throws ItemStreamException {
        System.out.println("close --------------" + currentRowProcessedCount);
        System.out.println(executionContext.getLong(Constants.CONTEXT_COUNT_KEY.getStrValue()));
        this.executionContext.putLong(Constants.CONTEXT_COUNT_KEY.getStrValue(), currentRowProcessedCount);
    }
}
In the example above I'm not able to write a new entry.
It only works as read-only: I can read data from the context, but not write to it.
class abc {

    @Autowired
    private ExecutionContext executionContext;

    public AccountDetails mapFieldSet(FieldSet fieldSet) throws BindException {
        executionContext.putLong(Constants.CONTEXT_COUNT_KEY.getStrValue(), 47);
        return accDetailsObj;
    }
}
I need to update the ExecutionContext from other classes as well.
Is there any way?
You just use put(String key, Object value) to override an already existing value.
ExecutionContext is backed by a ConcurrentHashMap, so if you really want to, you can get a reference to it via reflection and then use computeIfAbsent, etc.
Also, counting is already implemented in AbstractItemCountingItemStreamItemReader, and if you inherit from it (and you do, since FlatFileItemReader extends it), this should already be solved.
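To update the step's ExecutionContext from another class (such as a FieldSetMapper), one option is to capture the StepExecution in a @BeforeStep callback instead of trying to @Autowire the ExecutionContext, which is not a Spring bean. A minimal sketch, assuming the mapper is registered as a listener on the step; the class name and the placeholder mapping are made up here:

public class CountingFieldSetMapper implements FieldSetMapper<AccountDetails> {

    private StepExecution stepExecution;

    // Spring Batch calls this before the step starts, as long as this object
    // is registered as a listener on the step.
    @BeforeStep
    public void saveStepExecution(StepExecution stepExecution) {
        this.stepExecution = stepExecution;
    }

    @Override
    public AccountDetails mapFieldSet(FieldSet fieldSet) throws BindException {
        // putLong() overrides any value already stored under the same key
        stepExecution.getExecutionContext()
                .putLong(Constants.CONTEXT_COUNT_KEY.getStrValue(), 47L);
        AccountDetails accDetailsObj = new AccountDetails(); // placeholder for the actual mapping
        return accDetailsObj;
    }
}

Any other collaborator can use the same @BeforeStep trick, as long as it is registered as a step listener so the callback fires.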
I am trying to create a unit test for the following code. The code uses the AWS Java SDK v2 and calls selectObjectContent on the S3AsyncClient class, which returns a CompletableFuture (https://sdk.amazonaws.com/java/api/latest/software/amazon/awssdk/services/s3/S3AsyncClient.html). My test throws a NullPointerException when invoking future.get().
Here is the method I want to unit test.
public <T> Collection<T> queryWithS3Select(
        List<String> s3Keys,
        String s3SelectQuery,
        InputSerialization inputSerialization,
        Class<T> modelObject,
        Comparator<T> comparator
) throws ExecutionException, InterruptedException, IOException {
    TreeSet<T> collection = new TreeSet<>(comparator);
    List<SelectObjectContentRequest> selectObjectContentRequest =
            buildS3SelectRequests(s3Keys, s3SelectQuery, inputSerialization);
    S3SelectContentHandler s3SelectContentHandler = new S3SelectContentHandler();
    StringBuilder selectionResult = new StringBuilder();

    for (SelectObjectContentRequest socr : selectObjectContentRequest) {
        CompletableFuture<Void> future = s3AsyncClient.selectObjectContent(socr, s3SelectContentHandler);
        future.get();
        s3SelectContentHandler.getReceivedEvents().forEach(e -> {
            if (e.sdkEventType() == SelectObjectContentEventStream.EventType.RECORDS) {
                RecordsEvent response = (RecordsEvent) e;
                selectionResult.append(response.payload().asUtf8String());
            }
        });
    }

    JsonParser parser = objectMapper.createParser(selectionResult.toString());
    collection.addAll(Lists.newArrayList(objectMapper.readValues(parser, modelObject)));
    return collection;
}
Here is my unit test so far. Running this code, I get a NullPointerException at the future.get() line above. How can I make the mocked s3AsyncClient return a valid future?
@Mock
private S3AsyncClient s3AsyncClient;

@Test
public void itShouldReturnQueryResults() throws IOException, ExecutionException, InterruptedException {
    List<String> keysToQuery = List.of("key1", "key2");
    InputSerialization inputSerialization = InputSerialization.builder()
            .json(JSONInput.builder().type(JSONType.DOCUMENT).build())
            .compressionType(String.valueOf(CompressionType.GZIP))
            .build();
    Comparator<S3SelectObject> comparator =
            Comparator.comparing((S3SelectObject e) -> e.getStartTime());

    underTest.queryWithS3Select(keysToQuery, S3_SELECT_QUERY, inputSerialization, S3SelectObject.class, comparator);
}
Here is the S3SelectContentHandler
public class S3SelectContentHandler implements SelectObjectContentResponseHandler {

    private SelectObjectContentResponse response;
    private List<SelectObjectContentEventStream> receivedEvents = new ArrayList<>();
    private Throwable exception;

    @Override
    public void responseReceived(SelectObjectContentResponse response) {
        this.response = response;
    }

    @Override
    public void onEventStream(SdkPublisher<SelectObjectContentEventStream> publisher) {
        publisher.subscribe(receivedEvents::add);
    }

    @Override
    public void exceptionOccurred(Throwable throwable) {
        exception = throwable;
    }

    @Override
    public void complete() {}

    public List<SelectObjectContentEventStream> getReceivedEvents() {
        return receivedEvents;
    }
}
I will share a unit test for similar functionality and show you how to work with a CompletableFuture when your code calls .join() to continue execution.
Code under test
private final S3AsyncClient s3AsyncClient;

public long getSize(final S3AsyncClient client, final String bucket, final String key) {
    return client.headObject(HeadObjectRequest.builder().bucket(bucket).key(key).build()).join().contentLength();
}
In this code, client.headObject() returns a CompletableFuture<HeadObjectResponse>, which we are going to mock and verify in the unit test shown below.
@Test
@DisplayName("Verify getSize returns the size of the given key in the bucket")
void verifyGetSizeReturnsSizeOfFileInS3() {
    CompletableFuture<HeadObjectResponse> headObjectResponseCompletableFuture =
            CompletableFuture.completedFuture(HeadObjectResponse.builder().contentLength(20000L).build());

    when(s3AsyncClient.headObject(headObjectRequestArgumentCaptor.capture()))
            .thenReturn(headObjectResponseCompletableFuture);

    long size = s3Service.getSize(s3AsyncClient, "somebucket", "someFile");

    assertThat(headObjectRequestArgumentCaptor.getValue()).hasFieldOrPropertyWithValue("bucket", "somebucket")
            .hasFieldOrPropertyWithValue("key", "someFile");
    assertThat(size).isEqualTo(20000L);
}
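The same pattern should work for the selectObjectContent call in the original question: stub it to return an already-completed future so future.get() no longer dereferences a null. A rough sketch, assuming Mockito's static imports (when, any) as above; the test method name is made up. Note that with this stub the handler receives no events, so the parsed collection will be empty unless you additionally use thenAnswer(...) to push RecordsEvents into the handler passed as the second argument:

@Test
public void itShouldCompleteWhenSelectObjectContentIsStubbed() throws Exception {
    // Return an already-completed CompletableFuture<Void> so future.get()
    // in queryWithS3Select() returns immediately instead of throwing an NPE.
    when(s3AsyncClient.selectObjectContent(
            any(SelectObjectContentRequest.class),
            any(SelectObjectContentResponseHandler.class)))
            .thenReturn(CompletableFuture.<Void>completedFuture(null));

    // ... then call underTest.queryWithS3Select(...) and assert on the result,
    // as in the test shown earlier.
}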
In order to update the updatedTime property of the parent when a child is changed, I decided to use event listeners in Hibernate. I have read many Stack Overflow questions and Vlad Mihalcea's blog posts and thought that I could do it, however I am stuck.
@Component
public class HibernateEventListenerConfig {

    @PersistenceUnit
    private EntityManagerFactory emf;

    @PostConstruct
    protected void init() {
        SessionFactoryImpl sessionFactory = emf.unwrap(SessionFactoryImpl.class);
        EventListenerRegistry registry = sessionFactory.getServiceRegistry().getService(EventListenerRegistry.class);
        // registry.getEventListenerGroup(EventType.PERSIST).prependListener(CustomEventListener.INSTANCE);
        // registry.getEventListenerGroup(EventType.SAVE_UPDATE).prependListener(CustomEventListener.INSTANCE);
        // registry.getEventListenerGroup(EventType.DELETE).prependListener(CustomEventListener.INSTANCE);
        registry.getEventListenerGroup(EventType.PRE_UPDATE).prependListener(CustomEventListener.INSTANCE);
        // registry.getEventListenerGroup(EventType.FLUSH_ENTITY).prependListener(CustomEventListener.INSTANCE);
    }
}
What I observe is that when I add my event listener for PERSIST and DELETE, everything works as expected (I see that the time is changed and the change is propagated to the database). However, I have a problem with updates. I have tried using SAVE_UPDATE, UPDATE, and FLUSH_ENTITY from the Vlad Mihalcea blog, but I am still unable to see the change being propagated to the database (I have logging turned on).
This is the code from my CustomEventListener. (The updateChain and getRelated methods work correctly, as I see the update when I persist or delete a child entity.)
public class CustomEventListener implements PersistEventListener, DeleteEventListener, SaveOrUpdateEventListener, PreUpdateEventListener {

    public static final CustomEventListener INSTANCE = new CustomEventListener();

    @Override
    public void onSaveOrUpdate(SaveOrUpdateEvent event) throws HibernateException {
        updateRelatedObjects(event.getObject());
    }

    @Override
    public boolean onPreUpdate(PreUpdateEvent event) {
        updateRelatedObjects(event.getEntity());
        return false;
    }

    @Override
    public void onPersist(PersistEvent event) throws HibernateException {
        updateRelatedObjects(event.getObject());
    }

    @Override
    public void onDelete(DeleteEvent event) throws HibernateException {
        updateRelatedObjects(event.getObject());
    }

    @Override
    public void onDelete(DeleteEvent event, Set transientEntities) throws HibernateException {
        onDelete(event);
    }

    @Override
    public void onPersist(PersistEvent event, Map createdAlready) throws HibernateException {
        onPersist(event);
    }

    private void updateRelatedObjects(Object object) {
        if (object instanceof BaseClass) {
            BaseClass rootAware = (BaseClass) object;
            rootAware.getRelated().ifPresent(BaseClass::updateChain);
        }
    }
}
Do you know what I forgot about when updating the entity? Have I missed something? If that's how it works, how can I update my related object?
I have a Spring MVC web application. I set the value of a static field in a HandlerInterceptorAdapter:
public class SpringMVCFilter extends HandlerInterceptorAdapter {

    @Override
    public boolean preHandle(HttpServletRequest request, HttpServletResponse response, Object handler) throws Exception {
        Interceptor.ss = "foo"; // it's a static field
        return super.preHandle(request, response, handler);
    }

    ...more code hidden
}
Then I get the value in the MySQL QueryInterceptor:
public class Interceptor implements QueryInterceptor {

    public static String ss = null;

    @Override
    public <T extends Resultset> T postProcess(Supplier<String> sql, Query interceptedQuery, T originalResultSet, ServerSession serverSession) {
        System.out.println(ss); // I have set the value for 'ss' before
        return null;
    }

    ...more code hidden
}
ss is obviously a static variable, and I have assigned it a value, so why can't I read that value in the QueryInterceptor? They run in the same thread, but ss is always null, even though the assignment happens first.
Environment:
mysql:mysql-connector-java:8.0.13
Spring Boot 2.0.2.RELEASE
In a batch service, I read multiple XML files using a MultiResourceItemReader, which delegates to a StaxEventItemReader.
If an error is raised while reading a file (a parsing exception, for example), I would like to tell Spring to start reading the next matching file, using the @OnReadError annotation and/or a SkipPolicy for example.
Currently, when a reading exception is raised, the batch stops.
Does anyone have an idea how to do it?
EDIT: I see MultiResourceItemReader has a method readNextItem(), but it's private -_-
I haven't used Spring Batch for a while, but looking at the MultiResourceItemReader code I suppose you can write your own ResourceAwareItemReaderItemStream wrapper where you check a flag that is set to move to the next file, or otherwise perform a standard read using a delegate.
This flag can be stored in the execution context or in your wrapper, and should be cleared after moving on.
class MoveNextReader<T> implements ResourceAwareItemReaderItemStream<T> {

    private ResourceAwareItemReaderItemStream<T> delegate;
    private boolean skipThisFile = false;

    public void setSkipThisFile(boolean value) {
        skipThisFile = value;
    }

    @Override
    public void setResource(Resource resource) {
        skipThisFile = false;
        delegate.setResource(resource);
    }

    @Override
    public T read() throws Exception {
        if (skipThisFile) {
            skipThisFile = false;
            // Returning null forces MultiResourceItemReader to move to the next resource
            return null;
        }
        return delegate.read();
    }

    // open(), update() and close() omitted here: they simply delegate to the wrapped reader
}
Use this class as the delegate for MultiResourceItemReader, and in @OnReadError inject MoveNextReader and set MoveNextReader.skipThisFile.
I can't test the code myself, but I hope this can be a good starting point.
Here are my final classes to read multiple XML files and jump to the next file when a read error occurs (thanks to Luca's idea).
My custom ItemReader, extending MultiResourceItemReader:
public class MyItemReader extends MultiResourceItemReader<InputElement> {

    private SkippableResourceItemReader<InputElement> reader;

    public MyItemReader() throws IOException {
        super();

        // Resources
        PathMatchingResourcePatternResolver resourceResolver = new PathMatchingResourcePatternResolver();
        this.setResources(resourceResolver.getResources("classpath:input/inputFile*.xml"));

        // Delegate reader
        reader = new SkippableResourceItemReader<InputElement>();
        StaxEventItemReader<InputElement> delegateReader = new StaxEventItemReader<InputElement>();
        delegateReader.setFragmentRootElementName("inputElement");
        Jaxb2Marshaller unmarshaller = new Jaxb2Marshaller();
        unmarshaller.setClassesToBeBound(InputElement.class);
        delegateReader.setUnmarshaller(unmarshaller);
        reader.setDelegate(delegateReader);
        this.setDelegate(reader);
    }

    [...]

    @OnReadError
    public void onReadError(Exception exception) {
        reader.setSkipResource(true);
    }
}
And the ItemReader-in-the-middle used to skip the current resource:
public class SkippableResourceItemReader<T> implements ResourceAwareItemReaderItemStream<T> {

    private ResourceAwareItemReaderItemStream<T> delegate;
    private boolean skipResource = false;

    @Override
    public void close() throws ItemStreamException {
        delegate.close();
    }

    @Override
    public T read() throws UnexpectedInputException, ParseException, NonTransientResourceException, Exception {
        if (skipResource) {
            skipResource = false;
            return null;
        }
        return delegate.read();
    }

    @Override
    public void setResource(Resource resource) {
        skipResource = false;
        delegate.setResource(resource);
    }

    @Override
    public void open(ExecutionContext executionContext) throws ItemStreamException {
        delegate.open(executionContext);
    }

    @Override
    public void update(ExecutionContext executionContext) throws ItemStreamException {
        delegate.update(executionContext);
    }

    public void setDelegate(ResourceAwareItemReaderItemStream<T> delegate) {
        this.delegate = delegate;
    }

    public void setSkipResource(boolean skipResource) {
        this.skipResource = skipResource;
    }
}
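One wiring note: depending on the Spring Batch version, the @OnReadError method on the reader may not be picked up automatically, so it can be safer to register the reader explicitly as a listener on the step. A possible Java-config sketch, inside a @Configuration class; the step name, chunk size and writer bean are made up here and not part of the original setup:

@Bean
public Step readXmlFilesStep(StepBuilderFactory stepBuilderFactory,
                             MyItemReader myItemReader,
                             ItemWriter<InputElement> writer) {
    return stepBuilderFactory.get("readXmlFilesStep")
            .<InputElement, InputElement>chunk(10)
            .reader(myItemReader)
            // registering the reader as a listener lets its @OnReadError callback fire
            .listener(myItemReader)
            .writer(writer)
            .build();
}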
I am trying to create new language support for NetBeans 7.4 and higher.
When files are saved locally, I need to deploy them to a server, so I need to handle the save event. I did this by implementing Savable:
public class VFDataObject extends MultiDataObject implements Savable {

    .......

    @Override
    public void save() throws IOException {
        .......
    }
}
And it worked perfectly for the Save event. But then I realized I need to extend HtmlDataObject instead of MultiDataObject:
public class VFDataObject extends HtmlDataObject implements Savable {

    .......

    @Override
    public void save() throws IOException {
        .......
    }
}
And now save() doesn't get executed. Why? HtmlDataObject extends MultiDataObject, after all. What should be done to make this work?
Also, is there a way to catch the Save All event in NetBeans as well? Do you have any info on whether anything changed in 8.0 in this regard?
Thanks a lot.
Have you tried the OnSaveTask SPI (https://netbeans.org/bugzilla/show_bug.cgi?id=140719)? The API can be used to perform tasks when files of a given type are saved.
Something like this can be used to listen to all save events on a given MIME type (in this case "text/x-sieve-java"):
public static class CustomOnSaveTask implements OnSaveTask {

    private final Context context;

    public CustomOnSaveTask(Context ctx) {
        context = ctx;
    }

    @Override
    public void performTask() {
        System.out.println(">>> Save performed on " +
                NbEditorUtilities.getDataObject(context.getDocument()).toString());
    }

    @Override
    public void runLocked(Runnable r) {
        r.run();
    }

    @Override
    public boolean cancel() {
        return true;
    }

    @MimeRegistration(mimeType = "text/x-sieve-java", service = OnSaveTask.Factory.class, position = 1600)
    public static class CustomOnSaveTaskFactory implements OnSaveTask.Factory {

        @Override
        public OnSaveTask createTask(Context cntxt) {
            return new CustomOnSaveTask(cntxt);
        }
    }
}