Downloading a file from a URL -- Nothing works - java

I run a Minecraft server for my friends, and a while back I very shoddily put together an installer which would download mods and install them to the correct location.
After about 6 hours of figuring out how to do what I was trying to do, I successfully got a working installer. I tried it again today and found that it has stopped working. I've been trying to fix it, but to no avail. I cannot find any solution that actually downloads the whole file; instead, an empty 4 KB file is created. I tried on a different computer and the same thing happened. I'm running the latest version of Java.
This is the method I'm using to download the file. I have tried solutions using a ReadableByteChannel, but that yielded similar results, with an empty 1 KB file.
public void download(String filename, URL url) {
    try {
        String fileName = filename;
        URL link = url;
        InputStream in = new BufferedInputStream(link.openStream());
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[1024];
        int n = 0;
        StringBuffer stringBuffer = new StringBuffer();
        while (-1 != (n = in.read(buf))) {
            out.write(buf, 0, n);
        }
        out.close();
        in.close();
        byte[] response = out.toByteArray();
        FileOutputStream fos = new FileOutputStream(fileName);
        fos.write(response);
        fos.close();
        System.out.println("Download complete!");
    } catch (Exception e) {
        System.out.println("Download failed, something went wrong!");
    }
}
Is there something wrong with the method above? I can't really wrap my head around how downloading a file works, or else I'd troubleshoot it myself. Any methods I can find on Google go about it in pretty much the same way.
Printing the stack trace:
at Install.download(Install.java:93)
at Install.downloadFile(Install.java:41)
at Install.main(Install.java:106)
at sun.reflect.NativeMethodAccessorImpl.invoke0(NativeMethodAccessorImpl.java:-2)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
EDIT: Problem solved; I used a different file hosting site to host the file. Now I just need to figure out how I can update the file without changing the URL.
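For anyone hitting the same symptom: a tiny file of a few KB usually means the host answered with a small HTML page (an error, a redirect, or a "click here to download" page) instead of the actual binary. Below is a minimal sketch that checks the HTTP status and Content-Type before writing anything to disk; the URL and file name are hypothetical placeholders, not the ones from the question.

import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class DownloadCheck {
    public static void main(String[] args) throws IOException {
        // Hypothetical URL and target name, purely for illustration.
        URL url = new URL("https://example.com/mods/some-mod.jar");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setInstanceFollowRedirects(true);

        int status = conn.getResponseCode();
        String contentType = conn.getContentType();
        System.out.println("HTTP " + status + ", Content-Type: " + contentType);

        // A text/html response for a .jar download is a strong hint that the
        // host served an interstitial page rather than the file itself.
        if (status != HttpURLConnection.HTTP_OK) {
            throw new IOException("Server returned HTTP " + status);
        }

        try (InputStream in = conn.getInputStream()) {
            Files.copy(in, Paths.get("some-mod.jar"), StandardCopyOption.REPLACE_EXISTING);
        }
        System.out.println("Download complete!");
    }
}

Note that HttpURLConnection does not follow a redirect that switches protocol (for example http to https); if the host does that, the new Location has to be requested explicitly.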

Related

How to write a Zip File with the "DFLT-X" Method

I am working on software which bundles and exports XML files as a zip. The compression method is "Deflate" (code snippet included below).
These zip files are needed in another (older) piece of software, which is built on Qt (code snippet also below).
The problem is that the zip files are not accepted by the second piece of software. If these zip files get re-zipped manually, they suddenly work.
To find any differences between the generated zip and the manual one, I loaded both into PowerArchiver and saw that they are exactly the same except for the "Method", which is "DFLT-X" on the working zip and "DFLT-N" on the non-working one (note: "working" refers to the import into the second piece of software; both zip files can be extracted manually without problems).
Any ideas how I can get the "DFLT-X" method with the Java util libraries?
I tried all settings and variants (.setLevel(), .setMethod()) for ZipOutputStream, Deflater and DeflaterOutputStream, but I only got the "DFLT-N" format.
An explanation of what these formats are is not to be found in the PowerArchiver forums or anywhere else. "DFLT-N" seems to refer to "Deflate, Normal" and the X variant to some higher compression, but not Deflate64.
Software 1, generating the zip:
final byte[] buffer = new byte[1024];
FileOutputStream fos = null;
ZipOutputStream zos = null;
try {
    fos = new FileOutputStream(zipFile);
    zos = new ZipOutputStream(fos);
    FileInputStream inputStream = null;
    for (final String file : this.fileList) {
        if (file.toString().contains(".xml")) {
            final ZipEntry ze = new ZipEntry(File.separator + file);
            zos.putNextEntry(ze);
            try {
                inputStream = new FileInputStream(sourceFolder + File.separator + file);
                int len;
                while ((len = inputStream.read(buffer)) > 0) {
                    zos.write(buffer, 0, len);
                }
            }
            finally {
                if (inputStream != null) {
                    inputStream.close();
                }
            }
        }
    }
    zos.closeEntry();
}
catch (final IOException ex) {
    ex.printStackTrace();
}
Software 2, reading the zip:
bool WfControlDataStorage::load(const QString& identifier, QByteArray& outZipFileContent) const
{
    QFile dataFile(identifierToFilepath(identifier));
    if (dataFile.open(QFile::ReadOnly)) {
        outZipFileContent = dataFile.readAll();
        dataFile.close();
        return true;
    }
    return false;
}
@Holger Thank you for your time; it sounds like you did exactly what I did too.
Solution:
In my project the zip entry name had a leading "/", like "/someName". This was not visible in PowerArchiver and also didn't prevent decompressing, but my receiving software had trouble resolving this name.
Repacking with PowerArchiver removed that slash, so a bunch of undocumented behavior made my life hell.
In terms of DFLT-X and DFLT-N, these are strange PowerArchiver-specific namings and I still cannot say how they determine the difference. But I can say that Deflate has no different "methods" next to the simple levels from 0-9 and "Deflate64", which is basically never used and obsolete. The DFLT-X naming is unrelated to both and, in my opinion, completely irrelevant.
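For reference, here is a minimal sketch of the same loop without the leading separator; the file names, source folder and list are placeholders, and try-with-resources also makes sure the ZipOutputStream is closed so the central directory gets written.

import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

class ZipWithoutLeadingSlash {
    static void zipXmlFiles(String zipFile, String sourceFolder, List<String> fileList) throws IOException {
        try (ZipOutputStream zos = new ZipOutputStream(new FileOutputStream(zipFile))) {
            for (String file : fileList) {
                if (file.endsWith(".xml")) {
                    // Entry name without a leading "/" or File.separator; zip entry
                    // names should use forward slashes regardless of the OS.
                    zos.putNextEntry(new ZipEntry(file));
                    Files.copy(Paths.get(sourceFolder, file), zos);
                    zos.closeEntry();
                }
            }
        }
    }
}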

Java Socket File Transfer Doesn't Work Properly

I am new to networking in Java and I want to build a file transfer system where the server downloads a file from the client (a file on the client computer is sent to the server). I searched the internet for a bunch of tutorials, but all of them seem to give me an error or don't do anything. However, I found a ten-year-old tutorial that worked; here is the link:
Click me. I made a small change to it that made it more stable, but there is a problem I cannot solve: when the file has been downloaded on the server, the file is still in use by the server, and I noticed that the loop does not quit and doesn't close the file stream when it's done. Here is the code:
Client:
File fl = new File(splitedstreamcommand[2]);
FileInputStream fin = new FileInputStream(fl);
int i = -1;
while ((i = fin.read()) != -1) {
    output.write(i);
    output.flush();
    //System.out.print((char) i);
}
fin.close();
The output is an ObjectOutputStream.
Server:
File fl = new File("C:\\testfolder\\thetestfiledownloaded.txt");
//FileOutputStream fos = new FileOutputStream("C:\\testfolder\\thetestfiledownloaded.txt");
int i = -1;
if(fl.createNewFile()){
FileOutputStream fout = new FileOutputStream(fl);
while((i=input.read()) != -1){
//System.out.print((char) i);
fout.write(i);
System.out.println("Downloading....");
}
//System.out.println("Still downloading....");
fout.close();
//fin.close();
System.out.println("File downloaded.");
}
The input is an ObjectInputStream. The problem is the while loop on the server that reads the bytes: it stops reading when the transfer is done, but it never exits the loop and never reaches the rest of the code, like printing the "File downloaded." string, and the file stays in use. I think this code is broken because it's old, but for some reason it's the only code that has worked for me, so how can I fix this issue? Thanks. Sorry about the bad question format and explanation; if you cannot understand it, tell me and I will try to reformat it.
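The read loop only sees -1 when the stream reaches end-of-file, and on a socket that only happens when the client closes (or shuts down) its side of the connection; as long as the connection stays open, read() simply blocks, which is why the server never gets past the loop and the file stays open. A common workaround is to send the file length first and have the server read exactly that many bytes. A minimal sketch, assuming plain DataInputStream/DataOutputStream over the socket rather than the Object streams used above; all names are placeholders:

import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.EOFException;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.net.Socket;

class FileTransferSketch {
    // Client side: announce the size, then stream the bytes.
    static void sendFile(Socket socket, File file) throws IOException {
        DataOutputStream out = new DataOutputStream(socket.getOutputStream());
        out.writeLong(file.length());
        try (FileInputStream in = new FileInputStream(file)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        }
        out.flush();
    }

    // Server side: read exactly the announced number of bytes, then stop.
    static void receiveFile(Socket socket, File target) throws IOException {
        DataInputStream in = new DataInputStream(socket.getInputStream());
        long remaining = in.readLong();
        try (FileOutputStream out = new FileOutputStream(target)) {
            byte[] buf = new byte[8192];
            while (remaining > 0) {
                int n = in.read(buf, 0, (int) Math.min(buf.length, remaining));
                if (n == -1) {
                    throw new EOFException("Connection closed mid-transfer");
                }
                out.write(buf, 0, n);
                remaining -= n;
            }
        }
        System.out.println("File downloaded.");
    }
}

The receiving loop then ends without waiting for the socket to close, and closing the FileOutputStream releases the file.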

How to download a file from the internet using Java

Hi, I am trying to write some code in my program so I can grab a file from the internet, but it seems that it is not working. Can someone give me some advice, please? Here is my code. In this case I try to download an mp3 file from the last.fm website; my code runs perfectly fine, but when I open my downloads directory the file is not there. Any idea?
import java.io.BufferedInputStream;
import java.io.ByteArrayOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;

public class download {
    public static void main(String[] args) throws IOException {
        String fileName = "Death Grips - Get Got.mp3";
        URL link = new URL("http://www.last.fm/music/+free-music-downloads");
        InputStream in = new BufferedInputStream(link.openStream());
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[1024];
        int n = 0;
        while (-1 != (n = in.read(buf))) {
            out.write(buf, 0, n);
        }
        out.close();
        in.close();
        byte[] response = out.toByteArray();
        FileOutputStream fos = new FileOutputStream(fileName);
        fos.write(response);
        fos.close();
        System.out.println("Finished");
    }
}
Every executing program has a current working directory. Often it is the directory where the executable lives (if it was launched in a "normal" way).
Since you didn't specify a path (in fileName), the file will be saved with that name in the current working directory.
If you want the file to be saved in your downloads directory, specify the full path. E.g.
String fileName = "C:\\Users\\YOUR_USERNAME\\Downloads\\Death Grips - Get Got.mp3";
Note how I've escaped the backslashes. Also note that Java has methods for joining paths, and a way to get the current working directory, as sketched below.
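A minimal sketch of both, assuming the file should land in the Downloads folder under the user's home directory (adjust the path to taste):

import java.nio.file.Path;
import java.nio.file.Paths;

public class PathExample {
    public static void main(String[] args) {
        // "Downloads" under the home directory is an assumption; adjust as needed.
        Path target = Paths.get(System.getProperty("user.home"), "Downloads",
                "Death Grips - Get Got.mp3");
        System.out.println("Working directory: " + Paths.get("").toAbsolutePath());
        System.out.println("Target path:       " + target);
    }
}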

Automatically download file from URL

I am attempting to download a file automatically. I know the link, as I have already parsed it from the RSS XML file. Is there a simple, noob-friendly way of doing this?
Since my previous edit I have been informed that as long as I keep the file name the same I will be able to do this. This is the code I have so far (I should have mentioned previously that this is for a Bukkit plugin, however the plugin)
public void getFile(String url) {
    try {
        BufferedInputStream in = new BufferedInputStream(
                new URL("http://dev.bukkit.org/media/files/706/595/Kustom-Warn.jar").openStream());
        FileOutputStream fileOutputStream = new FileOutputStream(plugin.getDataFolder().getAbsolutePath() + "/KustomWarn.jar");
        logger.severe(String.valueOf(plugin.getDataFolder().getAbsolutePath()));
        BufferedOutputStream outputStream = new BufferedOutputStream(fileOutputStream, 1024);
        byte[] data = new byte[1024];
        int count;
        // Write only the bytes actually read, not the whole buffer each time.
        while ((count = in.read(data, 0, 1024)) != -1) {
            outputStream.write(data, 0, count);
        }
        outputStream.close();
        in.close();
    } catch (Exception e) {
        logger.severe("Error: " + e.getMessage());
    }
}
If you mean to copy a file from a site to a local file then you can use java.nio.file
Files.copy(new URL("http://host/site/filename").openStream(), Paths.get(localfile));
Use URL.openStream to open the stream and Java NIO (New I/O) to read efficiently.
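Spelled out with the stream closed and an existing file replaced, that one-liner could look like the sketch below; the target file name here is just a placeholder for plugin.getDataFolder() + "/KustomWarn.jar".

import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class CopyFromUrl {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://dev.bukkit.org/media/files/706/595/Kustom-Warn.jar");
        try (InputStream in = url.openStream()) {
            // Placeholder path; in the plugin this would be the data folder.
            Files.copy(in, Paths.get("KustomWarn.jar"),
                    StandardCopyOption.REPLACE_EXISTING);
        }
    }
}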

zip the files which are present at one FTP location and copy to another FTP location directly

I want to create a zip file of files which are present at one FTP location and copy this zip file to another FTP location without saving it locally.
I am able to handle this for small files; it works well for small files of around 1 MB.
But if the file size is big, like 100 MB, 200 MB or 300 MB, then it gives an error:
java.io.FileNotFoundException: STOR myfile.zip : 550 The process cannot access the
file because it is being used by another process.
at sun.net.ftp.FtpClient.readReply(FtpClient.java:251)
at sun.net.ftp.FtpClient.issueCommand(FtpClient.java:208)
at sun.net.ftp.FtpClient.openDataConnection(FtpClient.java:398)
at sun.net.ftp.FtpClient.put(FtpClient.java:609)
My code is:
URLConnection urlConnection = null;
ZipOutputStream zipOutputStream = null;
InputStream inputStream = null;
byte[] buf;
int ByteRead, ByteWritten = 0;
// Destination where the file will be zipped
URL url = new URL("ftp://" + ftpuser + ":" + ftppass + "#" + ftppass + "/" + fileNameToStore + ";type=i");
urlConnection = url.openConnection();
OutputStream outputStream = urlConnection.getOutputStream();
zipOutputStream = new ZipOutputStream(outputStream);
buf = new byte[size];
for (int i = 0; i < li.size(); i++) {
    try {
        // Source from where the file will be read
        URL u = new URL((String) li.get(i)); // li holds values like http://xyz.com/folder/myPDF.pdf
        URLConnection uCon = u.openConnection();
        inputStream = uCon.getInputStream();
        String name = (String) li.get(i);
        zipOutputStream.putNextEntry(new ZipEntry(name.substring(name.lastIndexOf("/") + 1).trim()));
        while ((ByteRead = inputStream.read(buf)) != -1) {
            zipOutputStream.write(buf, 0, ByteRead);
            ByteWritten += ByteRead;
        }
        zipOutputStream.closeEntry();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
if (inputStream != null) {
    try {
        inputStream.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
if (zipOutputStream != null) {
    try {
        zipOutputStream.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Can anybody let me know how I can avoid this error and handle large files?
This is unrelated to file sizes; as the error says, you can't replace the file because some other process is currently locking it.
The reason why you see it more often with large files is because these take longer to transfer hence the chance of concurrent accesses is higher.
So the only solution is to make sure that no one uses the file when you try to transfer it. Good luck with that.
Possible other solutions:
Don't use Windows on the server.
Transfer the file under a temporary name and rename it when it's complete (see the sketch after this list). That way, other processes won't see incomplete files. Always a good thing.
Use rsync instead of reinventing the wheel.
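A minimal sketch of that temporary-name idea using Apache Commons Net (a third-party library, chosen here only for illustration; the question's code uses the internal sun.net.ftp classes instead). Host, credentials and file names are placeholders:

import java.io.InputStream;
import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;

public class UploadThenRename {
    public static void upload(String host, String user, String pass,
                              InputStream data) throws Exception {
        FTPClient ftp = new FTPClient();
        ftp.connect(host);
        ftp.login(user, pass);
        ftp.enterLocalPassiveMode();
        ftp.setFileType(FTP.BINARY_FILE_TYPE);
        try {
            // Upload under a temporary name, then rename it on the server once
            // the transfer is done, so other processes never see a half-written
            // myfile.zip.
            ftp.storeFile("myfile.zip.part", data);
            ftp.rename("myfile.zip.part", "myfile.zip");
        } finally {
            ftp.logout();
            ftp.disconnect();
        }
    }
}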
Back in the day, before we had network security, there were FTP servers that allowed 3rd party transfers. You could use site specific commands and send a file to another FTP server directly. Those days are long gone. Sigh.
Ok, maybe not long gone. Some FTP servers support the proxy command. There is a discussion here: http://www.math.iitb.ac.in/resources/manuals/Unix_Unleashed/Vol_1/ch27.htm
