When I try to connect to our Alfresco over SFTP, it is not able to connect. The client hangs and no error appears in the log file either.
import java.io.FileOutputStream;
import java.io.IOException;
import java.net.SocketException;
import java.security.NoSuchAlgorithmException;
import org.apache.commons.net.ftp.FTPClient;
import org.apache.commons.net.ftp.FTPFile;
import org.apache.commons.net.ftp.FTPSClient;

public void FTPTest() throws SocketException, IOException, NoSuchAlgorithmException {
    FTPSClient ftp = new FTPSClient("SSL");
    System.out.println("1");
    ftp.connect("172.17.178.144", 2121); // or "localhost" in your case
    System.out.println("2" + ftp.getReplyString());
    System.out.println("login: " + ftp.login("admin", "admin"));
    System.out.println("3" + ftp.getReplyString());
    ftp.changeWorkingDirectory("/alfresco");
    // list the files of the current directory
    FTPFile[] files = ftp.listFiles();
    System.out.println("Listed " + files.length + " files.");
    for (FTPFile file : files) {
        System.out.println(file.getName());
    }
    // let's pretend there is a JPEG image in the current folder that we
    // want to copy to the desktop (on a Windows machine)
    ftp.setFileType(FTPClient.BINARY_FILE_TYPE); // don't forget to switch to binary mode, or you will get a scrambled image!
    FileOutputStream br = new FileOutputStream("C:\\Documents and Settings\\casonkl\\Desktop\\my_downloaded_image_new_name.jpg");
    ftp.retrieveFile("name_of_image_on_server.jpg", br);
    br.close(); // close the output stream so the file is fully flushed
    ftp.disconnect();
}
I only get this output in the console:
1
At the execution of ftp.connect("172.17.178.144", 2121); the system hangs, and no error appears in the console.
I am able to connect to my Alfresco over SFTP with the FileZilla FTP client. Can anyone help me resolve this issue?
If I'm not mistaken, Alfresco went with FTPS.
So try it with the following code: http://alvinalexander.com/java/jwarehouse/commons-net-2.2/src/main/java/examples/ftp/FTPSExample.java.shtml
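For orientation, here is a minimal sketch of the explicit-FTPS handshake that example performs; the host, port, and credentials are the placeholders from your question, and your server may want different protocol/mode settings:

import org.apache.commons.net.ftp.FTPSClient;

public class FtpsSketch {
    public static void main(String[] args) throws Exception {
        // "TLS" + explicit mode (false) is the usual combination; adjust
        // to whatever your Alfresco FTPS endpoint actually expects.
        FTPSClient ftps = new FTPSClient("TLS", false);
        ftps.connect("172.17.178.144", 2121); // placeholder host/port from the question
        ftps.execPBSZ(0);   // required before PROT on most servers
        ftps.execPROT("P"); // encrypt the data channel as well
        ftps.enterLocalPassiveMode(); // passive mode usually plays nicer with firewalls
        if (!ftps.login("admin", "admin")) {
            throw new IllegalStateException("Login failed: " + ftps.getReplyString());
        }
        ftps.logout();
        ftps.disconnect();
    }
}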
I am new to FTPSClient; I am trying to connect to an FTPS server set up on my laptop. I don't know exactly how some of the methods work or what their parameters mean.
For example,
In my code I have created an FTPSClient as below:
FTPSClient ftps =new FTPSClient();
Then I connected to the server using the connect() method with an IP address:
ftps.connect("172.xx.xx.xxx");
After every step I check the reply code using:
ftps.getReplyCode();
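For reference, a minimal sketch of how that reply code is usually interpreted, using Commons Net's FTPReply helper (ftps is the client created above):

// A non-positive completion code means the last command failed.
int reply = ftps.getReplyCode();
if (!org.apache.commons.net.ftp.FTPReply.isPositiveCompletion(reply)) {
    ftps.disconnect();
    throw new java.io.IOException("FTPS server refused the last command, code: " + reply);
}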
In the code below, I know that
username = system username
password = the password to login
ftps.login(username, password);
On my system, in Internet Information Services (IIS), I created an FTP server with SSL and shared the directory below:
C:\Users\karan-pt2843\Desktop\FTPS
I want to send the file below to the server:
D:\sam.txt
Now I want to store a file on the server in the directory given above, and I tried using:
remote="";
local="";
InputStream input;
input = new FileInputStream(local);
ftps.storeFile(remote, input);
input.close();
I don't know what values to give for remote and local. Please help me with the values to give them, and explain what happens internally.
// Use passive mode as default because most of us are
// behind firewalls these days.
ftps.enterLocalPassiveMode();
...
String remote = "samFromClient.txt"; // place on the FTP server
String local = "D:/sam.txt";         // place on your client

// The FTP client reads from the input stream and stores the file
// under the remote path on the server.
InputStream input = new BufferedInputStream(new FileInputStream(local));
ftps.storeFile(remote, input);
input.close();
ftps.logout();
...
Taken from: Apache example
I have a remote Linux server (Debian 9.2) with Tomcat 9. I loaded a web app on the server that generates a .csv file; the user can connect to the server and download it.
When the program runs on localhost it works well; however, when it runs on the remote server, the browser says: file not found.
Here is my code:
private void writeFile(String nomeFile, String content, HttpServletResponse response) throws IOException {
    response.setContentType("text/csv");
    response.setHeader("Content-disposition", "attachment; filename=" + nomeFile);
    String filename = nomeFile;
    // Note: a relative path here resolves against the JVM's working
    // directory, which can differ between a local run and the remote Tomcat.
    try {
        File file = new File(filename);
        FileWriter fw = new FileWriter(file);
        BufferedWriter bw = new BufferedWriter(fw);
        bw.write(content);
        bw.flush();
        bw.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
    // This should send the file to the browser
    ServletOutputStream out = response.getOutputStream();
    FileInputStream in = new FileInputStream(filename);
    byte[] buffer = new byte[4096];
    int length;
    while ((length = in.read(buffer)) > 0) {
        out.write(buffer, 0, length);
    }
    in.close();
    out.flush();
}
I'm trying to debug this, but I do not know where the error could be. The servlet that implements the code runs fine on localhost. Why does it fail on the remote server?
If you have the same code on the local and remote machines and it works on one but not the other, it means one of two things:
1) The Tomcat configuration on localhost might be different from the one on the remote host; this might cause Tomcat to look in a "different" folder for the file, etc. OR
2) The file is present only on localhost and is not present on the remote host.
In your browser's DevTools, on the Network tab, you can see the exact request string sent to the remote server to fetch the file, and check it.
Also check your application context on the remote server; it may differ from your localhost when you deploy the project. Check where the servlet actually writes the csv file and what path it uses.
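As a side note, one way to rule out file-system paths entirely (a sketch, not the original design; it reuses the question's method signature) is to stream the CSV straight into the response and never touch the disk:

import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.http.HttpServletResponse;

private void writeFile(String nomeFile, String content, HttpServletResponse response) throws IOException {
    response.setContentType("text/csv");
    response.setHeader("Content-disposition", "attachment; filename=" + nomeFile);
    // Write the CSV content directly to the response body: no temp file,
    // so working-directory differences between localhost and the remote
    // Tomcat can no longer cause "file not found".
    try (PrintWriter out = response.getWriter()) {
        out.write(content);
    }
}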
I'm using Hadoop 2.6, and I have a cluster of virtual machines where I installed HDFS. I'm trying to remotely read a file from HDFS through some Java code running on my local machine, in the basic way, with a BufferedReader:
FileSystem fs = null;
String hadoopLocalPath = "/path/to/my/hadoop/local/folder/etc/hadoop";
Configuration hConf = new Configuration();
hConf.addResource(new Path(hadoopLocalPath + File.separator + "core-site.xml"));
hConf.addResource(new Path(hadoopLocalPath + File.separator + "hdfs-site.xml"));
try {
    fs = FileSystem.get(URI.create("hdfs://10.0.0.1:54310/"), hConf);
} catch (IOException e1) {
    e1.printStackTrace();
    System.exit(-1);
}
Path startPath = new Path("/user/myuser/path/to/my/file.txt");
FileStatus[] fileStatus;
try {
    fileStatus = fs.listStatus(startPath);
    Path[] paths = FileUtil.stat2Paths(fileStatus);
    for (Path path : paths) {
        BufferedReader br = new BufferedReader(new InputStreamReader(fs.open(path)));
        String line;
        while ((line = br.readLine()) != null) {
            System.out.println(line);
        }
        br.close();
    }
} catch (FileNotFoundException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
The program can access HDFS correctly (no exceptions are raised). If I list the files and directories via code, it reads them without problems.
Now, the issue is that if I try to read a file (as in the code shown), it gets stuck while reading (in the while loop) until it raises a BlockMissingException:
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-2005327120-10.1.1.55-1467731650291:blk_1073741836_1015 file=/user/myuser/path/to/my/file.txt
at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:888)
at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:568)
at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:800)
at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:847)
at java.io.DataInputStream.read(DataInputStream.java:149)
at sun.nio.cs.StreamDecoder.readBytes(StreamDecoder.java:284)
at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:326)
at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:178)
at java.io.InputStreamReader.read(InputStreamReader.java:184)
at java.io.BufferedReader.fill(BufferedReader.java:161)
at java.io.BufferedReader.readLine(BufferedReader.java:324)
at java.io.BufferedReader.readLine(BufferedReader.java:389)
at uk.ou.kmi.med.datoolkit.tests.access.HDFSAccessTest.main(HDFSAccessTest.java:55)
What I already know:
- I tried the same code directly on the machine running the namenode, and it works perfectly.
- I already checked the namenode log and added the user of my local machine to the group managing HDFS (as suggested by this thread and other related threads).
- There should be no issues with fully qualified domain names, as suggested by this thread, since I'm using static IPs. On the other hand, "Your cluster runs in a VM and its virtualized network access to the client is blocked" could be the case, although I'd expect that to block every action on HDFS, not just reads (see next point).
- The cluster runs on a network with a firewall, and I have correctly opened and forwarded port 54310 (I can access HDFS for other purposes, such as creating files and directories and listing their content). I wonder whether other ports need to be open for reading files.
Can you make sure that the datanodes are also accessible from the client? I had a similar issue when connecting to a Hadoop cluster configured in AWS. I was able to resolve it by confirming connectivity between all the datanodes and my client system.
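One detail that matches these symptoms: metadata operations (listing, creating files and directories) only talk to the namenode on port 54310, but actual block reads go directly to the datanodes (data-transfer port, 50010 by default in Hadoop 2.x), so the client must be able to reach those addresses too. If the namenode hands back internal VM IPs, a possible client-side workaround (a sketch; it assumes the datanode hostnames are resolvable from your machine, e.g. via /etc/hosts) is:

Configuration hConf = new Configuration();
hConf.addResource(new Path(hadoopLocalPath + File.separator + "core-site.xml"));
hConf.addResource(new Path(hadoopLocalPath + File.separator + "hdfs-site.xml"));
// Make the client connect to datanodes by hostname instead of the
// (possibly unroutable) internal IPs the namenode advertises.
hConf.setBoolean("dfs.client.use.datanode.hostname", true);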
Hi friends, I am working on a Java application to store files on a server, but when I upload a file it shows:
org.apache.commons.net.io.CopyStreamException: IOException caught while copying.
QUIT
550 can't access file.
Can anybody tell me how to solve this?
public void uploadtxtFile(String localFileFullName, String fileName, String hostDir)
        throws Exception {
    FTPClient ftpclient = DBConnection.connect();
    File file = new File(localFileFullName);
    if (!(file.isDirectory())) {
        if (file.exists()) {
            FileInputStream input = null;
            BufferedInputStream bis = null;
            try {
                input = new FileInputStream(new File(localFileFullName));
                if (input != null) {
                    hostDir = hostDir.replaceAll("//", "/");
                    logger.info("uploading host dir : " + hostDir);
                    boolean bool = false;
                    logger.error("Reply of the ftp store file is 1111 " + ftpclient.getReplyCode());
                    try {
                        ftpclient.enterLocalPassiveMode();
                        logger.error("Reply of the ftp store file is 2222 " + ftpclient.getReplyCode());
                        if (ftpclient.isConnected()) {
                            // here the server timeout error occurs
                            logger.error("here the server timeout error occurs"); // new
                            bis = new BufferedInputStream(input);
                            logger.error("Reply of the ftp store file is 3333 " + ftpclient.getReplyCode());
                            bool = ftpclient.storeFile(hostDir, bis);
                        } else {
                            logger.error("here the server timeout error occurs"); // new
                            bis = new BufferedInputStream(input);
                            logger.error("Reply of the ftp store file is 6666 " + ftpclient.getReplyCode());
                            ftpclient.enterLocalPassiveMode();
                            bool = ftpclient.storeFile(hostDir, bis);
                        }
                    } finally {
                        bis.close();
                        input.close();
                    }
                    logger.error("Reply of the ftp store file is 4444 " + ftpclient.getReplyCode());
                    if (bool) {
                        logger.info("Success uploading file on host dir: " + hostDir);
                    } else {
                        logger.error("file not uploaded.");
                    }
                } else {
                    logger.error("uploading file input null.");
                }
            } catch (Exception ex) {
                logger.error("Error in connection = " + ex);
            } finally {
                ftpclient.logout();
                ftpclient.disconnect();
            }
        } else {
            logger.info("uploading file does not exist.");
        }
    }
}
and also the response for the file folder is not shown.
SYMPTOMS
When attempting to upload a file to a remote FTP site, a 550 error code is encountered, resulting in an error message similar to one of the following examples:
Example 1:
STATUS:> Transferring file "/pub/yourfile.txt"...
COMMAND:> SIZE yourfile.txt
550 yourfile.txt: No such file.
STATUS:> Requested action not taken (e.g., file or directory not found, no access).
COMMAND:> CWD /pub/yourfile.txt
550 /pub/yourfile.txt: No such file or folder.
STATUS:> Requested action not taken (e.g., file or directory not found, no access).
COMMAND:> STOR yourfile.txt
Example 2:
COMMAND:> STOR yourfile.txt
550 Permission Denied.
ERROR:> Requested action not taken (e.g., file or directory not found, no access).
CAUSE
Example 1:
In this example the 550 code returned by the remote FTP server is for information purposes only. It is not an error and should be ignored by the user. In this case an upload command has already been given, but before the upload can be started CuteFTP needs to determine whether or not the file being transferred already exists on the remote site as either a file or a folder.
First, the SIZE command is sent in an attempt to determine if a file with the same name exists on the remote site. The server responds with a 550, indicating that the file does not already exist there.

Next, the CWD command is sent in an attempt to determine if a folder with the same name exists on the remote site. The server responds with a 550, indicating that a folder by that name does not exist.
Finally, the STOR command is given and the file upload begins.
Example 2:
A file upload is being attempted, but the remote server has denied the needed permission. The 550 error code is a result of insufficient account privileges on the remote FTP server. The error is not caused by CuteFTP.
RESOLUTION
Example 1:
Not applicable. In this example the 550 code returned by the remote FTP server is for information purposes only. It is not an error and should be ignored by the user.
Example 2:
If you believe that your FTP account privileges or permissions are configured incorrectly, contact the technical support department at the remote FTP site or your Web hosting company for help.
Or you can check FTP "550 Access is denied" Error
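Relating this back to the code in the question (an assumption on my part, since the value of hostDir isn't shown): FTPClient.storeFile expects the remote file name, not a directory, and passing a directory path is a classic way to get a 550. A sketch of the usual pattern, reusing the question's variables:

// cd into the target directory first, then store under the bare file name.
if (!ftpclient.changeWorkingDirectory(hostDir)) {
    logger.error("Cannot cd to " + hostDir + ": " + ftpclient.getReplyString());
}
ftpclient.setFileType(FTP.BINARY_FILE_TYPE); // avoid corrupting non-text files
boolean stored = ftpclient.storeFile(fileName, bis); // e.g. "sam.txt", not the directory path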
I am using the com.jcraft.jsch library to read .xls files from an SFTP server. The following code connects to the server:
session = jsch.getSession(username, host);
session.setConfig("StrictHostKeyChecking", "no");
session.setPassword(password);
session.connect();
sftpChannel = (ChannelSftp) session.openChannel("sftp");
sftpChannel.connect();
I am using sftpChannel.get(file) to retrieve an InputStream for the file. This input stream is then used to instantiate an XSSFWorkbook, as shown below:
XSSFWorkbook workbook = new XSSFWorkbook(in);
Problem 1:
When I run the app, it seems to get stuck on the above line for some time (say 5 minutes) and then throws a java.io.IOException: Pipe closed error.
The xls file I am trying to read is 800 KB, and it works fine when run from the local machine.
Problem 2:
The app is designed to process files sequentially. So, if the first file fails with the IOException, the rest of the files also fail because the connection has timed out. To prevent this, I added the code below to check and re-connect:
if(null == session || !session.isConnected()){
log.debug("Session is not connected/timed out. Creating a new session");
openSftpSession();
log.debug("New session is created");
}
//openSftpSession() is the code to create a new session as explained in the beginning of the question.
When this code gets executed, the following exception is thrown:
java.io.IOException: error: 4: RequestQueue: unknown request id 1028332337
at com.jcraft.jsch.ChannelSftp$2.read(ChannelSftp.java:1407)
at java.io.FilterInputStream.read(FilterInputStream.java:133)
at java.io.PushbackInputStream.read(PushbackInputStream.java:186)
//More lines
Edit: code to retrieve the input stream:
public InputStream getInputStream(String folder, String file) throws Exception {
    sftpChannel.cd(root + folder);
    log.debug("current directory: " + sftpChannel.pwd());
    log.debug("File: " + folder + " " + file);
    return sftpChannel.get(file);
}
Can anyone please help me get past this? I believe an alternative approach to prevent the timeout would be to download the file into a temp directory and process it from there; however, I don't really want to do that.
Thanks in advance.
Have you checked whether the approach you describe (downloading into a temp file) works? Just to verify that your input stream is OK. How long does it take to download the file into a local file over that connection?
If you don't want to manage a temp file, you could always pull it into a byte[] in memory, as long as you don't have to scale to much more than 800 KB. Use Apache Commons like this:
InputStream in = sftpChannel.get(file);
byte[] inBytes = org.apache.commons.io.IOUtils.toByteArray(in);
ByteArrayInputStream inByteStream = new ByteArrayInputStream(inBytes);
XSSFWorkbook workbook = new XSSFWorkbook(inByteStream);
As for the request id error, it looks like the old session/channel is still trying to read but is no longer able to. Maybe you aren't closing that session/channel properly; from the openSftpSession() code, it looks like you would only be overwriting the references without properly shutting them down.
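For example, openSftpSession() could tear the old objects down before reconnecting; a sketch, assuming session and sftpChannel are the fields from your question:

// Disconnect any stale channel/session first, so the old reader
// stops polling a request queue that no longer exists.
if (sftpChannel != null && sftpChannel.isConnected()) {
    sftpChannel.disconnect();
}
if (session != null && session.isConnected()) {
    session.disconnect();
}
session = jsch.getSession(username, host);
session.setConfig("StrictHostKeyChecking", "no");
session.setPassword(password);
session.connect();
sftpChannel = (ChannelSftp) session.openChannel("sftp");
sftpChannel.connect();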