Do you know of any way to save a WMS tile as an image (especially .png) using Java?
I have a tile, for example:
http://mapy.geoportal.gov.pl/wss/service/img/guest/ORTO/MapServer/WMSServer?VERSION=1.1.1&SERVICE=WMS&REQUEST=GetMap&LAYERS=Raster&SRS=EPSG:4326&WIDTH=500&HEIGHT=500&TRANSPARENT=TRUE&FORMAT=image/png&BBOX=23.805441,50.98483844444444,23.807441,50.98594955555556&styles=
My code looks like:
public static void saveImage(String imageUrl, String destinationFile) throws IOException {
    URL url = new URL(imageUrl);
    InputStream is = url.openStream();
    OutputStream os = new FileOutputStream(destinationFile);
    byte[] b = new byte[2048];
    int length;
    while ((length = is.read(b)) != -1) {
        os.write(b, 0, length);
    }
    is.close();
    os.close();
}
It works for normal images like http://www.delaval.com/ImageVaultFiles/id_15702/cf_5/st_edited/AYAbVD33cXEhPNEqWOOd.jpg
Should I use any special library?
It seems that mapy.geoportal.gov.pl checks the User-Agent header, and when you connect like this, no User-Agent header is sent to the server. If you set this header to some accepted value (e.g. Mozilla/5.0 seems to be valid), you will get the image you want.
So instead of
InputStream is = url.openStream();
try
URLConnection connection = url.openConnection();
connection.setRequestProperty("User-Agent", "Mozilla/5.0");
InputStream is = connection.getInputStream();
and it should work.
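For reference, here is a minimal sketch of the original saveImage method with that fix applied; the header value is just an example, and any value the server accepts will do:

import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URL;
import java.net.URLConnection;

public static void saveImage(String imageUrl, String destinationFile) throws IOException {
    URL url = new URL(imageUrl);
    URLConnection connection = url.openConnection();
    // Some servers (like this WMS endpoint) reject requests without a User-Agent.
    connection.setRequestProperty("User-Agent", "Mozilla/5.0");
    try (InputStream is = connection.getInputStream();
         OutputStream os = new FileOutputStream(destinationFile)) {
        byte[] b = new byte[2048];
        int length;
        while ((length = is.read(b)) != -1) {
            os.write(b, 0, length);
        }
    }
}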
For my project I need to download a PDF file from Google Drive using Java.
I get HTTP response code 200, and using the following method I store it in an abc.pdf file:
String url = "https://docs.google.com/uc?id="+fileid+"&export=download";
URL obj = new URL(url);
HttpURLConnection conn = (HttpURLConnection) obj.openConnection();
// optional default is GET
conn.setRequestMethod("GET");
//add request header
conn.setRequestProperty("User-Agent", USER_AGENT);
int responseCode = conn.getResponseCode();
System.out.println("\nSending 'GET' request to URL : " + url);
System.out.println("Response Code : " + responseCode);
BufferedReader in = new BufferedReader(
new InputStreamReader(conn.getInputStream()));
String inputLine;
OutputStream f0 = new FileOutputStream("C:\\Users\\Darshil\\Desktop\\abc.pdf",true);
while ((inputLine = in.readLine()) != null) {
//System.out.println(inputLine);
byte b[]=inputLine.getBytes();
//System.out.println(b);
f0.write(b);
}
in.close();
f0.close();
But when I try to open abc.pdf in my Adobe Reader X, I get the following error:
There was an error opening this document. The file is damaged and could not be repaired.
You seem to be accessing Google Drive directly using raw HTTP requests.
You may be better off using the Google Drive SDK; its documentation contains good examples that address the use case you describe in your question.
However, if you do want to stick to your technique, you should not be using BufferedReader.readLine(). A PDF is a binary file, and reading it correctly in PDF reader software depends on the exact byte sequence being preserved. Hopefully the technique below helps you:
// read in chunks of 2 KB
byte[] buffer = new byte[2048];
int bytesRead = 0;
try (InputStream is = conn.getInputStream()) {
    try (DataOutputStream os = new DataOutputStream(new FileOutputStream("file.pdf"))) {
        while ((bytesRead = is.read(buffer)) != -1) {
            os.write(buffer, 0, bytesRead);
        }
    }
} catch (Exception ex) {
    // handle exception
}
Note that I am using the try-with-resources statement in Java 7
Hope this helps.
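Putting the connection setup from the question together with the binary copy above, a complete sketch could look like the following; fileid and the User-Agent value are assumed to come from the question's code, and the destination path is just an example:

import java.io.DataOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public static void downloadPdf(String fileid, String userAgent, String destination) throws IOException {
    URL url = new URL("https://docs.google.com/uc?id=" + fileid + "&export=download");
    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
    conn.setRequestMethod("GET");
    conn.setRequestProperty("User-Agent", userAgent);
    System.out.println("Response Code : " + conn.getResponseCode());
    byte[] buffer = new byte[2048];
    int bytesRead;
    // Copy the raw bytes; no Reader is involved, so the binary content is preserved.
    try (InputStream is = conn.getInputStream();
         DataOutputStream os = new DataOutputStream(new FileOutputStream(destination))) {
        while ((bytesRead = is.read(buffer)) != -1) {
            os.write(buffer, 0, bytesRead);
        }
    }
}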
I have this issue with GZIP compression:
I need to send a huge JSON string by POST. It is too big to be accepted as part of the URL (e.g. http://localhost/app/send/JSON STRING ENCODED BY BASE64), which results in an HTTP 403 error.
So I need to compress my JSON, and I found a way to do it with GZIP compression, which I can decompress with gzdecode() in PHP.
But it doesn't work...
My compress() and decompress() functions work fine inside my Java app, but when I send the data to the web service, something goes wrong and gzdecode() doesn't work.
I have no idea what I'm missing; I need some help.
Functions used in the Java app (client):
public String Post() {
    String retorno = "";
    String u = compress(getInput());
    u = URLEncoder.encode(URLEncoder.encode(u, "UTF-8"));
    URL uri = new URL(url + u);
    HttpURLConnection conn = (HttpURLConnection) uri.openConnection();
    conn.setDoOutput(false);
    conn.setRequestMethod(getMethod());
    conn.setRequestProperty("Content-encoding", "gzip");
    conn.setRequestProperty("Content-type", "application/octet-stream");
    BufferedReader buffer = new BufferedReader(
            new InputStreamReader((conn.getInputStream())));
    String r = "";
    while ((r = buffer.readLine()) != null) {
        retorno = r + "\n";
    }
    return retorno;
}
GZIP compress function (client)
public static String compress(String str) throws IOException {
    byte[] blockcopy = ByteBuffer
            .allocate(4)
            .order(java.nio.ByteOrder.LITTLE_ENDIAN)
            .putInt(str.length())
            .array();
    ByteArrayOutputStream os = new ByteArrayOutputStream(str.length());
    GZIPOutputStream gos = new GZIPOutputStream(os);
    gos.write(str.getBytes());
    gos.close();
    os.close();
    byte[] compressed = new byte[4 + os.toByteArray().length];
    System.arraycopy(blockcopy, 0, compressed, 0, 4);
    System.arraycopy(os.toByteArray(), 0, compressed, 4,
            os.toByteArray().length);
    return Base64.encode(compressed);
}
PHP method used to receive the URL (server, using the Slim PHP framework):
init::$app->post('/enviar/:obj/', function( $obj ) {
$dec = base64_decode(urldecode( $obj ));//decode url and decode base64 tostring
$dec = gzdecode($dec);//here is my problem, gzdecode() doesn't work
}
POST request setup (client):
public Sender() throws JSONException {
    //
    url = "http://192.168.0.25/api/index.php/enviar/";
    method = "POST";
    output = true;
    //
}
As noted in some of the comments, bigger data should be sent in the body of a POST request instead of in the URL. URL parameters should be used only for small, single values. As you noticed, URL length is limited to a few kB, and it's not a good idea to send larger data this way (even GZIP-compressed).
Your GZIP compression code also seems to be wrong. Please try this:
public static String compress(String str) throws IOException {
    ByteArrayOutputStream os = new ByteArrayOutputStream(str.length());
    GZIPOutputStream gos = new GZIPOutputStream(os);
    gos.write(str.getBytes());
    gos.close();
    os.close();
    return Base64.encodeToString(os.toByteArray(), Base64.DEFAULT);
}
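To actually send the compressed payload in the POST body rather than in the URL (as suggested above), a minimal client-side sketch could look like this. The endpoint URL is the one from the question; skipping Base64 and sending the raw GZIP bytes, and reading the body from php://input on the server, are my assumptions rather than something in the original code:

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.zip.GZIPOutputStream;

public static int postGzipped(String json) throws IOException {
    // Compress the JSON into raw GZIP bytes (no Base64, no length prefix).
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    try (GZIPOutputStream gos = new GZIPOutputStream(baos)) {
        gos.write(json.getBytes("UTF-8"));
    }
    byte[] body = baos.toByteArray();

    URL url = new URL("http://192.168.0.25/api/index.php/enviar/");
    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
    conn.setRequestMethod("POST");
    conn.setDoOutput(true); // we are writing a request body
    conn.setRequestProperty("Content-Encoding", "gzip");
    conn.setRequestProperty("Content-Type", "application/octet-stream");
    try (OutputStream os = conn.getOutputStream()) {
        os.write(body);
    }
    return conn.getResponseCode();
}

On the server side, the body would then be read with something like gzdecode(file_get_contents('php://input')) instead of from the :obj URL segment.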
I can't figure out why my file download function works perfectly on Linux, but on Windows it only downloads 1-2 KB of the file and finishes. What am I doing wrong? I've already tried about 3 examples from Stack Overflow, but no result. Big thanks, you'll save my mind!
public static void get(String URL, String filename) throws IOException, ArithmeticException {
    URL connection = new URL(URL);
    HttpURLConnection conn;
    conn = (HttpURLConnection) connection.openConnection();
    conn.setRequestMethod("GET");
    conn.connect();
    InputStream in = conn.getInputStream();
    OutputStream writer = new FileOutputStream(filename);
    byte buffer[] = new byte[55000];
    int c = in.read(buffer);
    while (c > 0) {
        writer.write(buffer, 0, c);
        c = in.read(buffer);
    }
    writer.flush();
    writer.close();
    in.close();
}
I can only blame my own stupidity :) I used File.separator to build the URL instead of a plain "/". Since the Linux separator is the same slash that URLs use, everything was OK there, but not on Windows. Thanks for contributing!
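To illustrate the problem, here is a small hypothetical sketch; the host and path names are made up, but they show how File.separator produces a backslash on Windows and breaks the URL, while a literal "/" works everywhere:

import java.io.File;

public class SeparatorDemo {
    public static void main(String[] args) {
        String host = "http://example.com";
        String dir = "files";
        String name = "data.bin";

        // On Windows File.separator is "\", so this builds an invalid URL
        // like "http://example.com\files\data.bin".
        String broken = host + File.separator + dir + File.separator + name;

        // URLs always use forward slashes, regardless of the operating system.
        String correct = host + "/" + dir + "/" + name;

        System.out.println("broken : " + broken);
        System.out.println("correct: " + correct);
    }
}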
Hi all. I want to make an app that downloads something from a website and puts it on the desktop.
This code downloads it, but only temporarily. How would I go about saving it?
Here's my code:
private static void grabItem() throws ClassNotFoundException,
        InstantiationException, IllegalAccessException, IOException,
        UnsupportedLookAndFeelException {
    final URL url = new URL("sampleurl");
    final InputStream is = url.openStream();
    final byte[] b = new byte[2048];
    int length;
    final HttpURLConnection connection = (HttpURLConnection) url
            .openConnection();
    // Specify what portion of file to download.
    connection.setRequestProperty("Range", "bytes=" + downloaded + "-");
    // Connect to server.
    connection.connect();
    // Make sure response code is in the 200 range.
    if ((connection.getResponseCode() / 100) != 2) {
        logger.info("Unable to find file");
        return;
    }
    // set content length.
    size = connection.getContentLength();
    while ((length = is.read(b)) != -1) {
        downloaded += length;
        progressBar.setValue((int) getProgress()); // set progress bar
    }
    is.close();
    setFrameTheme();
}
Thanks.
You never write any data at all to your computer... but anyways...
This is how I download and save a file. It needs to be a direct download, but it's easy enough to change it to work the way you want:
URL url = new URL("direct link goes here");
URLConnection connection = url.openConnection();
InputStream inputstream = connection.getInputStream();
To get it to save, you would then do:
BufferedOutputStream bufferedoutputstream = new BufferedOutputStream(
        new FileOutputStream(new File("location to save downloaded file")));
byte[] buffer = new byte[1024];
int bytesRead = 0;
while ((bytesRead = inputstream.read(buffer)) != -1) {
    bufferedoutputstream.write(buffer, 0, bytesRead);
}
bufferedoutputstream.flush();
bufferedoutputstream.close();
inputstream.close();
That should download and save it.
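Combined into a single method, a sketch under the same assumptions (a direct download link, and a destination file you choose) might look like this:

import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.net.URLConnection;

public static void downloadTo(String directLink, File destination) throws IOException {
    URL url = new URL(directLink);
    URLConnection connection = url.openConnection();
    try (InputStream in = connection.getInputStream();
         BufferedOutputStream out = new BufferedOutputStream(new FileOutputStream(destination))) {
        byte[] buffer = new byte[1024];
        int bytesRead;
        while ((bytesRead = in.read(buffer)) != -1) {
            out.write(buffer, 0, bytesRead);
        }
    }
}

For the desktop mentioned in the question, a call like downloadTo("http://example.com/some/file.bin", new File(System.getProperty("user.home"), "Desktop/file.bin")) would work, assuming the desktop is the Desktop folder under the user's home directory.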
I have a URL like the one below:
http://blah.com/download.zip
I want Java code to download this ZIP file from the URL and save it in my server directory as a ZIP file only. I would also like to know the most efficient way to do this.
First, make sure the URL uses forward slashes: it is http://blah.com/download.zip, not http:\\blah.com\download.zip.
Second, it is simple: perform an HTTP GET request, take the stream, and copy it to a FileOutputStream. Here is a code sample:
URL url = new URL("http://blah.com/download.zip");
HttpURLConnection connection = (HttpURLConnection) url.openConnection();
connection.setRequestMethod("GET");
InputStream in = connection.getInputStream();
FileOutputStream out = new FileOutputStream("download.zip");
copy(in, out, 1024);
out.close();
public static void copy(InputStream input, OutputStream output, int bufferSize) throws IOException {
    byte[] buf = new byte[bufferSize];
    int n = input.read(buf);
    while (n >= 0) {
        output.write(buf, 0, n);
        n = input.read(buf);
    }
    output.flush();
}
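Regarding the "most efficient way" part of the question: on Java 7 and later, the same download can also be written with java.nio.file.Files.copy, which handles the buffering and copying for you. A minimal sketch, assuming the same example URL and output file name:

import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public static void downloadZip() throws IOException {
    URL url = new URL("http://blah.com/download.zip");
    // Files.copy streams the bytes to disk and overwrites any existing file.
    try (InputStream in = url.openStream()) {
        Files.copy(in, Paths.get("download.zip"), StandardCopyOption.REPLACE_EXISTING);
    }
}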