Spring Integration is not detecting files on FTP Folder - java

Currently I'm facing a problem where I don't know what to do to resolve it. I'm developing a simple application that transfers files between different FTP and SFTP servers to be processed. At the beginning those servers weren't ready, so I used Apache Mina and RebexTinySftpServer to set up an FTP server and an SFTP server on my computer to test my development.
With those applications I completed and tested my application locally, so it was time to test it against the servers that will be used in production. But something is wrong with the FTP server, and Spring Integration is not detecting the files that are put in the folder to be transferred to the SFTP server.
I have two folders on each server: one for Input files and another for Output files. When Spring Integration detects a new file in the Input folder on the SFTP server, it transfers the file to the Input folder on the FTP server to be processed by another service. That part works fine.
After that service processes the file, it generates an output file and stores it in the Output folder on the FTP server, to be transferred to the Output folder on the SFTP server. But for some reason Spring Integration is not detecting those files and doesn't transfer any of them. This is the part I don't know how to solve, because no Exception is being thrown, so I don't know what part of my code I have to modify.
Here is my code where I define the DefaultFtpSessionFactory:
public DefaultFtpSessionFactory ftpSessionFactory() {
    DefaultFtpSessionFactory session = new DefaultFtpSessionFactory();
    session.setUsername(username);
    session.setPassword(password);
    session.setPort(port);
    session.setHost(host);
    // ASCII transfer mode; FTP.BINARY_FILE_TYPE would be needed for binary files
    session.setFileType(FTP.ASCII_FILE_TYPE);
    // passive mode: the client opens the data connection to the server
    session.setClientMode(FTPClient.PASSIVE_LOCAL_DATA_CONNECTION_MODE);
    return session;
}
And here is the code where I define the FTP Inbound Adapter:
@Bean
FtpInboundChannelAdapterSpec salidaAS400InboundAdapter() {
    return Ftp.inboundAdapter(as400Session.ftpSessionFactory())
            .preserveTimestamp(true)
            .remoteDirectory(as400Session.getPathSalida())
            .deleteRemoteFiles(true)
            .localDirectory(new File("tmp/as400/salida"))
            .temporaryFileSuffix(".tmp");
}
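For completeness, this spec only polls once it is wired into a flow; a minimal sketch of that wiring, with an assumed 5-second poll interval and a logging handler standing in for my real one, looks like this:

import org.springframework.context.annotation.Bean;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.dsl.Pollers;

@Bean
IntegrationFlow salidaAS400Flow() {
    return IntegrationFlows
            // poll the remote Output directory through the adapter defined above
            .from(salidaAS400InboundAdapter(),
                  e -> e.poller(Pollers.fixedDelay(5000)))
            // placeholder handler: log each file fetched to the local directory
            .handle(message -> System.out.println("Received: " + message.getPayload()))
            .get();
}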
I should mention that the FTP Server is running on an AS/400 system, so maybe that has something to do with the situation I'm facing.

I found the solution to my problem. I'm posting it here in case something similar happens to someone else.
My project uses spring-integration-ftp 5.5.16, which pulls in commons-net 3.8.0 as a transitive dependency. For some reason, that version of commons-net wasn't retrieving the files inside the directory on the AS/400, so I excluded that dependency from spring-integration-ftp and added commons-net 3.9.0 to my project. Now everything works fine.
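For reference, assuming a Maven build, the exclusion looks roughly like this (a Gradle build would use the equivalent exclude declarations):

<dependency>
    <groupId>org.springframework.integration</groupId>
    <artifactId>spring-integration-ftp</artifactId>
    <version>5.5.16</version>
    <exclusions>
        <!-- drop the transitive commons-net 3.8.0 -->
        <exclusion>
            <groupId>commons-net</groupId>
            <artifactId>commons-net</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<!-- pin the newer commons-net that lists the AS/400 directories correctly -->
<dependency>
    <groupId>commons-net</groupId>
    <artifactId>commons-net</artifactId>
    <version>3.9.0</version>
</dependency>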

Related

How to edit the /etc/hosts file to open an AngularJS/Spring Boot project at a specific website address

I have an AngularJS application with Spring Boot, and it runs on port 8090 with an "index.htm" file, like below:
192.168.1.25:8090/index.htm
I googled how to change the port and changed it to port 80, so I can open the web page with
192.168.1.25/index.htm
But on Google there are plenty of Angular results, and they say to change those settings in the angular.json file. I created a temporary Angular project and did that successfully. But I could not figure out the AngularJS part: I checked the project and could not find any file like angular.json. After checking the Spring Boot side, I found the part that lets me open the web page at the second URL address above:
192.168.1.25/index.htm
The Spring Boot code is below (this is where I can change the port number); these lines did the trick, by the way:
config.getMemberAttributeConfig().setStringAttribute(ServerService.MANAGEMENT_URL_PREFIX, ":80/index.htm");
config.getMemberAttributeConfig().setStringAttribute(ServerService.MANAGEMENT_URL_PREFIX, ":80");
After those steps I can easily serve my AngularJS app on port 80, and I started thinking about editing the /etc/hosts file on the server machine as:
192.168.1.25/index.htm xyz.com
I rebooted when I finished editing, but when I typed xyz.com into the Firefox browser it did not go to my application.
I am still searching Google and could not find any solution to this problem.
Any help will be really appreciated.
The format for the hosts file is:
#<ip> <hostname that resolve to the ip>
192.168.1.25 xyz.com
# or a list of names
192.168.1.25 xyz.com myapp.xyz.com
You do not put any port numbers or path parts in it. This will obviously only work if you edit the hosts file on every computer you intend to access the site from, and not necessarily on the host running the application itself.
That being said, you should probably read up on the supported Spring Boot properties, because starting an application server on a specific port should be as easy as adding an application.properties file to your Java resources with the following line:
server.port=80
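The same property can also be passed on the command line, with no properties file; for example (yourapp.jar is a placeholder for the actual artifact):

java -jar yourapp.jar --server.port=80

Keep in mind that on Linux, binding to ports below 1024 normally requires elevated privileges, so port 80 may need root or a capability such as CAP_NET_BIND_SERVICE.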

Having problems connecting to SFTP after moving SFTP to AWS, through Camel

I'm having problems connecting to the SFTP through the Spring Boot Camel app. This started happening after we moved our SFTP to AWS. Now I have a temporary server host which looks like this: s-add03ac9b.server.transfer.eu-west-1.amazonaws.com. I can connect to it using, for instance, FileZilla, but if I try to connect using the app, this is the error I get:
Caused by: org.apache.camel.NoSuchEndpointException: No endpoint could be found for: s-add03ac9b.server.transfer.eu-west-1.amazonaws.com/testFolder?username=myUser&password=myPassword&disconnect=true&maxMessagesPerPoll=50&initialDelay=1s&delay=1s&timeout=3000&move=done&moveFailed=failed, please check your classpath contains the needed Camel component jar.
And here is the route itself (I changed it a bit to be more readable):
from("s-add03ac9b.server.transfer.eu-west-1.amazonaws.com/testFolder?username=myUser&password=myPassword&disconnect=true&maxMessagesPerPoll=50&initialDelay=1s&delay=1s&timeout=3000&move=done&moveFailed=failed")
.setHeader(Headers.CONFIGURATION.name(), constant(routeConfiguration))
.setHeader("filenameModify").constant(modifyFileNames).setHeader("fileExtension")
.constant(fileExtension).choice().when(PredicateBuilder.and(header("filenameModify").isEqualTo(true), header("fileExtension").isNotNull()))
.setHeader(Exchange.FILE_NAME,
simple("${file:name.noext}-${date:in.header.CamelFileLastModified:ddMMyyyy-HHmmss}-${file:length}.${in.header.fileExtension}"))
.end().idempotentConsumer(simple("${file:name}-${file:length}"), MemoryIdempotentRepository.memoryIdempotentRepository(1000))
.log("Processing ${file:name}")
.process(rawDataProcessor)
.to((String) routeConfiguration.get(ConfigKey.END)).otherwise().log("File ${file:name} processed.").stop().end();
Do I need to add something else, maybe some dependency or...?
If anyone is having the same issue: I fixed it by adding sftp:// as a prefix in the from(...) part.
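In other words, the URI needs a scheme so Camel can resolve the component. The corrected consumer, with the same parameters as above, would start roughly like this:

from("sftp://s-add03ac9b.server.transfer.eu-west-1.amazonaws.com/testFolder"
        + "?username=myUser&password=myPassword&disconnect=true"
        + "&maxMessagesPerPoll=50&initialDelay=1s&delay=1s"
        + "&timeout=3000&move=done&moveFailed=failed")

If a NoSuchEndpointException still appears after adding the scheme, the sftp component itself may be missing from the classpath; as far as I know it ships in the camel-ftp artifact.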

SSH from inside a Docker container

So, I haven't worked with Docker for very long, but this is the first time I've had a requirement to SSH OUT of a Docker container. It should be straightforward, because I know I can connect to databases and pull files from repositories. But for some reason I cannot seem to connect to a remote SFTP server. Interestingly, on my local machine it runs fine (no Docker), but when building on Jenkins the tests cannot connect, even to a MOCK server that I set up and put a test file on before the tests run. Running on Jenkins also makes it difficult to debug what the issue is.
I'm using JCraft's JSch to get the connection below:
public static Session getSession(String host, String user) throws JSchException {
    JSch jsch = new JSch(); // was null in the original paste, which would throw a NullPointerException
    int port = 22;
    if (JunitHelperContext.getInstance().isJunit()) {
        port = JunitHelperContext.getInstance().getSftpPort();
    }
    Session session = jsch.getSession(user, host, port);
    java.util.Properties config = new java.util.Properties();
    config.put("StrictHostKeyChecking", "no");
    if (!JunitHelperContext.getInstance().isJunit()) {
        // real (non-test) connections authenticate with the user's SSH key
        config.put("PreferredAuthentications", "publickey");
        jsch.setKnownHosts("~/.ssh/known_hosts");
        jsch.addIdentity("~/.ssh/id_rsa");
    }
    session.setConfig(config);
    session.connect();
    return session;
}
My requirement is to go out, read a file, and process it. I can build the kit fine using a non-Docker template; the file is found and processed. Running it inside a Docker container, though, I get this error when I try to connect:
Invalid Key Exception: the security strength of SHA-1 digest algorithm is not sufficient for this key size.
com.jcraft.jsch.JSchException: Session.connect: java.io.IOException: End of IO Stream Read
So this seems like a security issue. In production, the certificates are on the server and they can be read from that ~/.ssh directory. But this is a mock JCraft server, and I shouldn't need to authenticate.
Is there a piece I am missing here? Some configuration in the Dockerfile? Thanks in advance.
You probably need to enable Java's JCE unlimited-strength policy in the Docker container:
http://www.oracle.com/technetwork/java/javase/downloads/jce8-download-2133166.html
There are export restrictions on this cryptography stuff, and I'll bet your Docker container has the weak-strength exportable jars.
Copy local_policy.jar and US_export_policy.jar into the Docker container with the Dockerfile, overwriting what's there.
Follow the instructions at that link.
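A minimal Dockerfile sketch of that copy step, assuming a Java 8 base image that sets JAVA_HOME (the exact security directory varies by image):

# overwrite the export-restricted policy files with the unlimited-strength ones
# downloaded from the Oracle link above
COPY local_policy.jar US_export_policy.jar $JAVA_HOME/jre/lib/security/

Note that on Java 8u161 and later the unlimited policy is enabled by default, so this step should only be necessary on older JREs.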

ftp.ControlChannelIOException: Connection reset (Talend/Java)

I have a requirement to pull files from and push files to an FTP server, but I am getting the below error while pulling/pushing the files. I am using the Talend Open Source Data Integration tool, and I also tried custom Java code and got the same error. Meanwhile, I am able to pull and push the files using the FileZilla FTP client.
Error Start:
com.enterprisedt.net.ftp.ControlChannelIOException: Connection reset
    at com.enterprisedt.net.ftp.FTPControlSocket.readLine(FTPControlSocket.java:1014)
    at com.enterprisedt.net.ftp.FTPControlSocket.readReply(FTPControlSocket.java:1049)
    at com.enterprisedt.net.ftp.FTPControlSocket.sendCommand(FTPControlSocket.java:973)
    at com.enterprisedt.net.ftp.FTPControlSocket.createDataSocketPASV(FTPControlSocket.java:807)
    at com.enterprisedt.net.ftp.FTPControlSocket.createDataSocket(FTPControlSocket.java:563)
    at com.enterprisedt.net.ftp.FTPClient.setupDataSocket(FTPClient.java:2561)
    at com.enterprisedt.net.ftp.FTPClient.dir(FTPClient.java:3468)
    at vikas_sir.ftp_salesforce_and_vice_verasa_0_1.FTP_Salesforce_AND_Vice_Verasa.tFTPFileList_2Process(FTP_Salesforce_AND_Vice_Verasa.java:488)
    at vikas_sir.ftp_salesforce_and_vice_verasa_0_1.FTP_Salesforce_AND_Vice_Verasa.tFTPConnection_1Process(FTP_Salesforce_AND_Vice_Verasa.java:396)
    at vikas_sir.ftp_salesforce_and_vice_verasa_0_1.FTP_Salesforce_AND_Vice_Verasa.runJobInTOS(FTP_Salesforce_AND_Vice_Verasa.java:1085)
    at vikas_sir.ftp_salesforce_and_vice_verasa_0_1.FTP_Salesforce_AND_Vice_Verasa.main(FTP_Salesforce_AND_Vice_Verasa.java:942)
Error End
I have already tried the following:
Allowed Java through my firewall, e.g. java.exe and javaw.exe.
Also tried disabling the firewall.
Ran the netsh advfirewall set global StatefulFTP disable command as admin.
Disabled my antivirus.
Added TLS 1.1 and TLS 1.2 to the .ini file.
Tried on different local machines.
Tried writing custom Java code (a sketch of that attempt follows below).
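A minimal sketch of that kind of custom code, using Apache Commons Net as a stand-in for the EnterpriseDT client shown in the stack trace (host and credentials are placeholders):

import java.io.IOException;
import org.apache.commons.net.ftp.FTPClient;
import org.apache.commons.net.ftp.FTPFile;

public class FtpListTest {
    public static void main(String[] args) throws IOException {
        FTPClient ftp = new FTPClient();
        ftp.connect("ftp.example.com", 21); // placeholder host
        ftp.login("user", "pass");          // placeholder credentials
        // passive mode: the same mode in which the job fails (createDataSocketPASV)
        ftp.enterLocalPassiveMode();
        for (FTPFile f : ftp.listFiles()) { // the listing is where "Connection reset" occurs
            System.out.println(f.getName());
        }
        ftp.logout();
        ftp.disconnect();
    }
}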
Below is a link to a screenshot of the Talend job:
Talend Job to pull the files from FTP server
Also, when I try another set of FTP credentials, I don't get any error (I can push and pull the files from/into that FTP server).
Please let me know what I am doing wrong, and where.
Any help will be greatly appreciated.
Thanks in Advance.
Amit

Would it be a RedHat permission issue for an executable jar to get an SSLHandshakeException between servers?

I'm troubleshooting someone else's problem over the phone.
The error occurs when trying to connect from one server to another server's DB over SSL.
The error message is: 'requires a valid client certificate'
I've done everything to ensure that they have created the correct cert.
I had them enable SSL handshake debugging (javax.net.debug=ssl:handshake) and send me the logs.
In their logs, nothing is set for the keystore, and the truststore path is wrong.
Herein, I think, lies the issue. We have an executable jar that reads the paths and passwords for the keystore/truststore and sets the values via calls such as System.setProperty("javax.net.ssl.keyStore", config.getKeyStore())
from within the executable. I've verified, at least locally, that I can communicate over SSL from one server to another's database in our test environment, and all is fine. If I remove all of those settings, the error log generated by the ssl:handshake debug gives me the same 'requires a valid client certificate'.
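For context, a sketch of the kind of property-setting the jar performs before opening the connection (config.getKeyStore() is from the real code; the other accessor names are hypothetical stand-ins):

// set the JSSE key material before any SSL connection is created
System.setProperty("javax.net.ssl.keyStore", config.getKeyStore());
System.setProperty("javax.net.ssl.keyStorePassword", config.getKeyStorePassword());     // hypothetical accessor
System.setProperty("javax.net.ssl.trustStore", config.getTrustStore());                 // hypothetical accessor
System.setProperty("javax.net.ssl.trustStorePassword", config.getTrustStorePassword()); // hypothetical accessor

One thing worth checking: the default SSL context reads these properties when it is first initialized, so if anything opens an SSL connection before these calls run, the values can be silently ignored.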
I'm told that the destination server with the DB has its pg_hba file set properly to accept the communication over SSL from the specified id.
I've verified that the cert is valid from the CA.
The only thing I can think of is that the executable jar is not actually setting the values on the System via those System.setProperty(...) calls.
Is this a configuration that needs to occur on the Red Hat server to allow the jar to set system properties, or is it at the file-permission level, or something else?
I'm at a loss here and would appreciate any guidance on this issue.
It's a very difficult environment: I'm in the North-East, tasked with assisting someone in the Midwest to troubleshoot their connection to our database in the South-West. I have no access to the actual servers and have to try to debug over the phone.
