SFTP inbound channel adapter hangs at PollableChannel receive method - java

My context xml reads as follows:
<bean id="defaultSftpSessionFactory"
class="org.springframework.integration.sftp.session.DefaultSftpSessionFactory">
<property name="host" value="${host}"/>
<property name="password" value="${password}"/>
<property name="port" value="${port}"/>
<property name="user" value="${username}"/>
</bean>
<int-sftp:inbound-channel-adapter id="sftpInbondAdapter"
session-factory="sftpSessionFactory"
channel="receiveChannel"
filename-pattern="*.txt"
remote-directory="/loblawln"
local-directory="/local-dir"
auto-create-local-directory="true"
temporary-file-suffix=".writing"
delete-remote-files="false">
<int:poller fixed-rate="1000" max-messages-per-poll="1"/>
</int-sftp:inbound-channel-adapter>
<int:channel id="receiveChannel">
<int:queue/>
</int:channel>
The Java file reads like this:
PollableChannel localFileChannel = context.getBean("receiveChannel", PollableChannel.class);
SourcePollingChannelAdapter adapter = context.getBean(SourcePollingChannelAdapter.class);
adapter.start();
System.out.println("Adapter started...");
Message<?> received = localFileChannel.receive();
System.out.println("Received first file message: " + received);
I have referred to the implementation provided by Gary Russell on GitHub and have also followed the Spring docs for SFTP integration. I think I am missing something important.
Moreover, the logs are not showing anything at all.

The problem was solved by changing the key used to read the username.
This seemed a bit weird to me. When I debugged the code and inspected the values in the defaultSftpSessionFactory instance, I found it was not reading the username from the properties file; instead it read the value corresponding to System.getProperty("user.name"). When I changed the key from ${username} to something else, like ${sftp.username}, it was read correctly.
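The underlying reason is that Spring's placeholder resolution also consults system and environment properties, so a generic key such as username can be shadowed by the OS-level value. A minimal sketch of the fix, assuming a properties file named sftp.properties (the file name and key prefix are just illustrative):
# sftp.properties -- prefixed keys avoid clashing with system/environment properties
sftp.host=sftp.example.com
sftp.port=22
sftp.username=myuser
sftp.password=secret

<!-- requires the context namespace to be declared in the XML header -->
<context:property-placeholder location="classpath:sftp.properties"/>

<bean id="defaultSftpSessionFactory"
      class="org.springframework.integration.sftp.session.DefaultSftpSessionFactory">
    <property name="host" value="${sftp.host}"/>
    <property name="port" value="${sftp.port}"/>
    <property name="user" value="${sftp.username}"/>
    <property name="password" value="${sftp.password}"/>
</bean>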

Related

Put operation does not work for IBM MQ from Camel when using JmsPoolXAConnectionFactory

We are implementing an XA transaction between MQ and a database, and we are trying to create a connection factory as a service in Karaf as per the link below.
https://access.redhat.com/documentation/fr-fr/red_hat_fuse/7.2/html/apache_karaf_transaction_guide/using-jms-connection-factories#manual-deployment-connection-factories
The MQ we are using is IBM MQ, and we are connecting to it through Camel.
The Karaf service is exposed from the same bundle that is going to use it. This is done through a Blueprint XML file in the src/main/resources/OSGI-INF/blueprint folder.
When we use (through JNDI) the connection factory exposed as a service to set the connection factory used by Camel's JmsComponent, we are able to get messages from the queue but not able to put messages onto the queue. The put operation fails without any error, so the database still gets updated as if everything succeeded. This happens specifically when using JmsPoolXAConnectionFactory as the pooled connection factory. If we change it to JmsPoolConnectionFactory, the put operation works and the message is added to the queue.
Below are the sample routes for get and put to queue.
GET:
from("mq:queue:{{queueName}}")
    .process(new CustomProcessor1())
    .to("direct:call-sp")
    .end();

from("direct:call-sp")
    .to("sql-stored:call-sp")
    .end();
PUT:
from("vm:send")
    .process(new CustomProcessor2())
    .to("mq:queue:{{queueName}}")
    .to("sql-stored:update-sp")
    .to("vm:nextroute")
    .end();
Camel JmsComponent Configuration in camel-context.xml:
<reference id="ptm" interface="org.springframework.transaction.PlatformTransactionManager" />
<reference id="connectionFactory" interface="javax.jms.ConnectionFactory" filter="(osgi.jndi.service.name=jms/mq)" availability="optional" />
<bean id="jmsConfig" class="org.apache.camel.component.jms.JmsConfiguration">
<property name="transacted" value="false" />
<property name="connectionFactory" ref="connectionFactory" />
<property name="transactionManager" ref="ptm" />
</bean>
<bean id="mq" class="org.apache.camel.component.jms.JmsComponent">
<property name="configuration" ref="jmsConfig" />
<property name="destinationResolver" ref="customDestinationResolver" />
</bean>
<bean id="customDestinationResolver" class="com.example.CustomDestinationResolver">
</bean>
Is there any put-specific configuration that we are missing?
To coordinate XA transactions, you need a transaction manager that implements the Java Transaction API (JTA).
Therefore, I think you need to use a JtaTransactionManager rather than a plain org.springframework.transaction.PlatformTransactionManager.
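A minimal Blueprint sketch of that wiring, assuming Karaf's Aries transaction manager is available as a javax.transaction.TransactionManager OSGi service (the bean ids here are illustrative):
<reference id="ariesTransactionManager" interface="javax.transaction.TransactionManager"/>

<!-- Wrap the JTA transaction manager behind Spring's PlatformTransactionManager API,
     so the same XA-aware bean can be referenced from the Camel JmsConfiguration -->
<bean id="jtaTransactionManager" class="org.springframework.transaction.jta.JtaTransactionManager">
    <property name="transactionManager" ref="ariesTransactionManager"/>
</bean>

<bean id="jmsConfig" class="org.apache.camel.component.jms.JmsConfiguration">
    <property name="connectionFactory" ref="connectionFactory"/>
    <property name="transactionManager" ref="jtaTransactionManager"/>
</bean>
How the routes should be marked transacted and how the remaining JmsConfiguration flags should be set is covered in the checklist linked below.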
Check this out:
https://tomd.xyz/camel-xa-transactions-checklist/

JMS message failure with retry advice: response sent to failure channel and then passed again to success channel

Below is the config
<jms:outbound-channel-adapter id="someId" channel="inputChannel"
connection-factory="${connection.factory}" destination="queue">
<jms:request-handler-advice-chain>
<bean class="org.springframework.integration.handler.advice.ExpressionEvaluatingRequestHandlerAdvice">
<property name="onSuccessExpression" value="T(Boolean).TRUE"/>
<property name="successChannelName" value="afterSuccessDeliveryMessageChannel"/>
<property name="onFailureExpression" value="T(Boolean).FALSE"/>
<property name="failureChannelName" value="failureChannel"/>
</bean>
<bean id="retryWithBackoffAdviceSession"
class="org.springframework.integration.handler.advice.RequestHandlerRetryAdvice">
<property name="retryTemplate">
<bean class="org.springframework.retry.support.RetryTemplate">
<property name="retryPolicy">
<bean class="org.springframework.retry.policy.SimpleRetryPolicy">
<property name="maxAttempts" value="5"/>
</bean>
</property>
</bean>
</property>
<property name="recoveryCallback">
<bean class="org.springframework.integration.handler.advice.ErrorMessageSendingRecoverer">
<constructor-arg ref="failureChannel"/>
</bean>
</property>
</bean>
</jms:request-handler-advice-chain>
</jms:outbound-channel-adapter>
I am retrying the message 5 times and then using the recoveryCallback to log the message to a DB.
The retry works fine: it retries 5 times and then calls the failureChannel, but once the failureChannel has been called, the message is passed again to afterSuccessDeliveryMessageChannel.
I am not sure what I am doing wrong here.
I expect that once the message has failed it should go to the failureChannel, NOT back to afterSuccessDeliveryMessageChannel.
Your problem is this:
Your retry advice sends to the failureChannel from the recoveryCallback and then exits successfully.
Then, when we look into the ExpressionEvaluatingRequestHandlerAdvice code, we see this logic:
try {
    Object result = callback.execute();
    if (this.onSuccessExpression != null) {
        evaluateSuccessExpression(message);
    }
    return result;
}
So, since there is no exception from the callback, it goes to the configured successChannel.
To make it fail and go to the configured failureChannel, you should consider not using that recoveryCallback. Then the RequestHandlerRetryAdvice will throw an exception, which will be caught by the ExpressionEvaluatingRequestHandlerAdvice and sent to that failureChannel. There won't be an onSuccessExpression evaluation, since we end up with an exception.
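As a sketch, here is the same advice chain with only the recoveryCallback removed (everything else mirrors your config); after the fifth failed attempt the exception propagates to the outer advice, which then sends an ErrorMessage to failureChannel:
<jms:request-handler-advice-chain>
    <bean class="org.springframework.integration.handler.advice.ExpressionEvaluatingRequestHandlerAdvice">
        <property name="onSuccessExpression" value="T(Boolean).TRUE"/>
        <property name="successChannelName" value="afterSuccessDeliveryMessageChannel"/>
        <property name="onFailureExpression" value="T(Boolean).FALSE"/>
        <property name="failureChannelName" value="failureChannel"/>
    </bean>
    <bean class="org.springframework.integration.handler.advice.RequestHandlerRetryAdvice">
        <property name="retryTemplate">
            <bean class="org.springframework.retry.support.RetryTemplate">
                <property name="retryPolicy">
                    <bean class="org.springframework.retry.policy.SimpleRetryPolicy">
                        <property name="maxAttempts" value="5"/>
                    </bean>
                </property>
            </bean>
        </property>
        <!-- no recoveryCallback: the final failure is rethrown instead of being swallowed -->
    </bean>
</jms:request-handler-advice-chain>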

How to get the file from service-activator Message object in listener class

I need to pass the file that I receive on the SFTP path to the service layer.
Below is my configuration; the message arriving in my service-activator looks like this:
GenericMessage [payload=com.jcraft.jsch.ChannelSftp$2#322906a2,
headers={closableResource=org.springframework.integration.sftp.session.SftpSession#379662a7,
id=704c58e7-1d93-3bef-0330-233c0f9af55c, file_remoteDirectory=/tmp/charge/,
file_remoteFile=Charge.txt, timestamp=1594158522576}]
<bean id="sftpSessionFactory"
class="org.springframework.integration.sftp.session.DefaultSftpSessionFactory">
<property name="host" value="hostname"/>
<property name="port" value="22"/>
<property name="user" value="vkp"/>
<property name="password" value="1234"/>
<property name="allowUnknownKeys" value="true"/>
</bean>
<int-sftp:inbound-streaming-channel-adapter id="sftpAdapterAutoCreate"
session-factory="sftpSessionFactory"
filename-pattern="*.txt"
channel="receiveChannel"
remote-directory="/tmp/charge/">
</int-sftp:inbound-streaming-channel-adapter>
<int:poller fixed-rate="25000" max-messages-per-poll="1" id="shippingChargePoller" default="true"/>
<int:channel id="receiveChannel">
<int:queue/>
</int:channel>
<int:stream-transformer id="withCharset" charset="UTF-8" input-channel="receiveChannel" />
<int:service-activator id="FeedListener" input-channel="receiveChannel" method="onMessage">
<bean class="com.listener.ChargeFeedListener"/>
</int:service-activator>
public void onMessage(Message<?> message) {
    System.out.println(message.toString());
    System.out.println(" Received File is " + message.getPayload());
}
But I am not receiving the file in my Java class. What do I need to do to get the file?
Please read the documentation: https://docs.spring.io/spring-integration/docs/current/reference/html/sftp.html#sftp-streaming. The <int-sftp:inbound-streaming-channel-adapter> is not about files. It opens an InputStream for a remote entry (probably a file on the SFTP server) and lets you do with that stream whatever you want. For example (also according to those docs), there is a StreamTransformer which lets you read that stream into a byte[] (or into a String if you provide a charset). If you really want to deal with files, then you need to consider switching to the <int-sftp:inbound-channel-adapter>. That one pulls the remote entry and stores its content in a local file. Then that java.io.File is sent to the channel for you to consume.
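For the streaming option, a minimal sketch (transformedChannel is just an illustrative name): give the stream-transformer its own output channel and subscribe the service activator there, so it receives the file content as a String instead of the raw stream:
<int:stream-transformer id="withCharset" charset="UTF-8"
    input-channel="receiveChannel" output-channel="transformedChannel"/>

<int:channel id="transformedChannel"/>

<int:service-activator id="FeedListener" input-channel="transformedChannel" method="onMessage">
    <bean class="com.listener.ChargeFeedListener"/>
</int:service-activator>

public void onMessage(Message<String> message) {
    // payload is now the remote file's text content; the original name is still in the headers
    String content = message.getPayload();
    System.out.println("Received " + message.getHeaders().get("file_remoteFile")
            + " with " + content.length() + " characters");
}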
I think we already had a chat with you on this matter in your other question: Spring SFTP Integration is not polling the file.
Please let us know what is wrong with our docs that confuses you, so that you have to raise questions like this over here.

Spring Integration RecursiveDirectoryScanner gives "Too many open files" exception

I am using the Spring Integration RecursiveDirectoryScanner to scan a directory recursively and process incoming files placed under the configured directory (/home/test).
I am frequently getting the error below:
ERROR org.springframework.integration.handler.LoggingHandler - java.lang.IllegalArgumentException: java.nio.file.FileSystemException: /home/test: Too many open files
at org.springframework.integration.file.RecursiveDirectoryScanner.listFiles(RecursiveDirectoryScanner.java:89)
at org.springframework.integration.file.FileReadingMessageSource.scanInputDirectory(FileReadingMessageSource.java:387)
at org.springframework.integration.file.FileReadingMessageSource.doReceive(FileReadingMessageSource.java:361)
at org.springframework.integration.file.FileReadingMessageSource.doReceive(FileReadingMessageSource.java:90)
at org.springframework.integration.endpoint.AbstractMessageSource.receive(AbstractMessageSource.java:134)
at org.springframework.integration.endpoint.SourcePollingChannelAdapter.receiveMessage(SourcePollingChannelAdapter.java:224)
at org.springframework.integration.endpoint.AbstractPollingEndpoint.doPoll(AbstractPollingEndpoint.java:245)
at org.springframework.integration.endpoint.AbstractPollingEndpoint.access$000(AbstractPollingEndpoint.java:58)
at org.springframework.integration.endpoint.AbstractPollingEndpoint$1.call(AbstractPollingEndpoint.java:190)
at org.springframework.integration.endpoint.AbstractPollingEndpoint$1.call(AbstractPollingEndpoint.java:186)
at org.springframework.integration.endpoint.AbstractPollingEndpoint$Poller$1.run(AbstractPollingEndpoint.java:353)
at org.springframework.integration.util.ErrorHandlingTaskExecutor$1.run(ErrorHandlingTaskExecutor.java:55)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.nio.file.FileSystemException: /home/test: Too many open files
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
at sun.nio.fs.UnixFileSystemProvider.newDirectoryStream(UnixFileSystemProvider.java:427)
at java.nio.file.Files.newDirectoryStream(Files.java:457)
at java.nio.file.FileTreeWalker.visit(FileTreeWalker.java:300)
at java.nio.file.FileTreeWalker.walk(FileTreeWalker.java:322)
at java.nio.file.FileTreeIterator.<init>(FileTreeIterator.java:72)
at java.nio.file.Files.walk(Files.java:3574)
at org.springframework.integration.file.RecursiveDirectoryScanner.listFiles(RecursiveDirectoryScanner.java:73)
My Spring Integration flow is as below:
Configuration in XML
<task:executor id="pollerPool"
pool-size="${pollerThreadPoolSize}"
queue-capacity="${pollerThreadQueueCapacity}" rejection-policy="ABORT" />
<task:executor id="fileHandlerPool"
pool-size="${fileHandlerPoolSize}"
queue-capacity="${fileHandlerPoolThreadQueueCapacity}" rejection-policy="CALLER_RUNS" />
<bean id="iFilter" class="org.springframework.integration.file.filters.ChainFileListFilter">
<constructor-arg>
<list>
<bean id="lastModifiedFileListFilter" class="org.springframework.integration.file.filters.LastModifiedFileListFilter">
<property name="age" value="120" />
</bean>
<ref bean="acceptOnceFileListFilter"/>
<bean class="org.springframework.integration.file.filters.RegexPatternFileListFilter">
<constructor-arg value="^.*\.(txt|csv|xls|xlsx|asc)$"/>
</bean>
</list>
</constructor-arg>
</bean>
<bean id="acceptOnceFileListFilter" name="acceptOnceFileListFilter" class="org.springframework.integration.file.filters.AcceptOnceFileListFilter" primary="true" />
<bean id="recursiveDirectoryScanner" class="org.springframework.integration.file.RecursiveDirectoryScanner">
<property name="filter" ref="iFilter" />
<property name="locker" ref="nioFileLocker" />
</bean>
<bean id="nioFileLocker" class="org.springframework.integration.file.locking.NioFileLocker" />
<int-file:inbound-channel-adapter
id="fileSource" channel="fileReceivedChannel" auto-startup="true"
directory="file:${polling.directory}"
scanner="recursiveDirectoryScanner" >
<int:poller task-executor="pollerPool"
fixed-rate="${pollerFixedRate}"
receive-timeout="${pollerReceiveTimeout}">
</int:poller>
</int-file:inbound-channel-adapter>
Dynamic parameters are as below:
polling.directory=/home/test
pollerThreadPoolSize=1
pollerThreadQueueCapacity=10
pollerFixedRate=5000
pollerReceiveTimeout=5000
fileHandlerPoolSize=2
fileHandlerPoolThreadQueueCapacity=100
EDIT:
I do unlock the file in a service activator that comes into the picture when a file is picked up. I read some information from the file and then unlock it.
@Autowired
NioFileLocker nioFileLocker;

protected void doTransform(Message<?> message) throws Exception {
    MessageBuilder<File> payload = (MessageBuilder<File>) message.getPayload();
    File inFile = payload.getPayload();
    try {
        nioFileLocker.unlock(inFile);
    } catch (Exception e) {
        LOGGER.error("file not unlocked", e);
    }
}
Is there any issue with the configuration? How do I make sure this exception never appears again?
Thank you in advance.
I would suggest testing your solution without the NioFileLocker. It doesn't look like you are using it for unlocking files, but lock(File fileToLock) really does keep a file marker (an open handle) in the OS.
On the other hand, the file locker doesn't work reliably on UNIX systems: it still allows access to the files, at least for reading.
For better exclusive file access I would recommend using a FileSystemPersistentAcceptOnceFileListFilter with an external MetadataStore instead of the in-memory AcceptOnceFileListFilter. This way only one instance of your application will get access to the file, and it won't be processed again at all.
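A minimal sketch of that filter wiring (the store choice and key prefix are illustrative; for several application instances you would point it at a shared store, e.g. a JDBC- or Redis-backed MetadataStore):
<bean id="metadataStore" class="org.springframework.integration.metadata.PropertiesPersistingMetadataStore">
    <property name="baseDirectory" value="${metadata.directory}"/>
</bean>

<bean id="persistentAcceptOnceFilter"
      class="org.springframework.integration.file.filters.FileSystemPersistentAcceptOnceFileListFilter">
    <constructor-arg ref="metadataStore"/>
    <constructor-arg value="recursive-poller-"/>
</bean>
You would then reference persistentAcceptOnceFilter from the ChainFileListFilter in place of acceptOnceFileListFilter.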

Migrating from the ServiceMix file poller to the Apache Camel file poller

I am new to Apache Camel. I want to migrate from the ServiceMix file poller to the Camel file poller. I am trying to do it, but currently I have nothing to test against, as I have to write the code and hand it to someone else for testing. So can someone help me check whether I am going about it the right way?
ServiceMix file poller code:
<sm:activationSpec componentName="abcFilePoller"
destinationService="b:destinationA"
service="b:abcFilePoller">
<sm:component>
<bean class="org.apache.servicemix.components.file.FilePoller">
<property name="file" value="file://D:/input" />
<property name="period" value="20000"/>
<property name="archive" value="file://D:/archive" />
<property name="filter" ref="abcFileFilter" />
<property name="marshaler">
<bean class="org.apache.servicemix.components.util.BinaryFileMarshaler" />
</property>
</bean>
</sm:component>
</sm:activationSpec>
<sm:activationSpec componentName="destinationA"
service="b:destinationA">
<sm:component>
<bean
class="com.abc.file.ABCReceiverComponent">
</bean>
</sm:component>
</sm:activationSpec>
<bean id="abcFileFilter" class="org.apache.commons.io.filefilter.WildcardFileFilter">
<constructor-arg value="A*.ID" />
Apache Camel file poller:
<camel:route id="abcFilePoller">
<camel:from uri="timer://time?period=20000"/>
<camel:pollEnrich uri="file://D:/input"/>
<camel:filter ref="abcFileFilter"></camel:filter>
<camel:to uri="file://D:/archive" />
<camel:to uri="" />
</camel:route>
<bean id="abcFileFilter" class="org.apache.commons.io.filefilter.WildcardFileFilter">
<constructor-arg value="A*.ID" />
</bean>
I have not completed the Camel coding; I am left with the destination part. And I have no idea about the marshaler that is used in the ServiceMix part. How do I implement that BinaryFileMarshaler using Camel?
You can do this even more easily in Apache Camel, where you can configure the filtering on the file endpoint, so it just becomes:
<route>
<from uri="file:D:/input?delay=20000&include=A.*ID"/>
<to uri="file:D:/archive"/>
</route>
Just mind that the include option uses a regular expression, so if you are not familiar with that, it can take a few tries to get the expression to work as expected. But it is standard Java regular expressions.
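For instance, a quick local sanity check of the pattern (the class name is just for illustration; note that the exact regex equivalent of the old wildcard A*.ID is A.*\.ID, with the dot escaped):
public class IncludePatternCheck {
    public static void main(String[] args) {
        String include = "A.*\\.ID"; // regex form of the old wildcard "A*.ID"
        System.out.println("ABC123.ID".matches(include)); // true
        System.out.println("XYZ123.ID".matches(include)); // false, does not start with A
    }
}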
See more at: https://camel.apache.org/components/latest/file-component.html
And for users new to Apache Camel, see: http://java.dzone.com/articles/open-source-integration-apache
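If you also need the equivalent of the destinationA part of the ServiceMix config, a hedged sketch would be to hand the picked-up file to your receiver logic before archiving it, assuming ABCReceiverComponent can be reworked as a plain Camel bean (the original is a JBI component, so it will need adapting):
<route>
    <from uri="file:D:/input?delay=20000&amp;include=A.*ID"/>
    <!-- hand the picked-up file to your receiver logic, then write it to the archive -->
    <to uri="bean:abcReceiverComponent"/>
    <to uri="file:D:/archive"/>
</route>

<bean id="abcReceiverComponent" class="com.abc.file.ABCReceiverComponent"/>
There is no direct counterpart to the BinaryFileMarshaler: in Camel the message body is the picked-up file, and inside your bean you can ask for it as a byte[] or InputStream via Camel's type converters.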
