I am creating a program that has to take prices of some products from the web. I managed to do this for the first few products, but then I hit a URL that either returns a 503 response from the server or is not fully read (the tags with the price are not included in the output). Here is my code:
import java.net.*;
import java.io.*;
import java.util.Properties;

public class Test {

    public static void main(String[] args) throws Exception {
        new Test().connect();
    }

    public void connect() {
        try {
            String url = "https://antoshka.ua/ua/nabir-lakiv-dlya-nigtiv-make-it-real-rusalonka-3-sht6282464.html",
                   proxy = "proxy.mydomain.com",
                   port = "8080";
            URL server = new URL(url);
            Properties systemProperties = System.getProperties();
            // Note: these properties only affect plain-HTTP URLs; for an https://
            // URL, Java reads https.proxyHost and https.proxyPort instead.
            systemProperties.setProperty("http.proxyHost", proxy);
            systemProperties.setProperty("http.proxyPort", port);
            HttpURLConnection connection = (HttpURLConnection) server.openConnection();
            connection.connect();
            InputStream in = connection.getInputStream();
            readResponse(in);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public void readResponse(InputStream is) throws IOException {
        BufferedInputStream bis = new BufferedInputStream(is);
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        int result = bis.read();
        while (result != -1) {
            buf.write((byte) result);
            result = bis.read();
        }
        System.out.println(buf.toString());
    }
}
And here is the URL I am trying to read: https://antoshka.ua/ua/nabir-lakiv-dlya-nigtiv-make-it-real-rusalonka-3-sht6282464.html
If you browse it in incognito mode you will see this, which might be the cause of the problem. This might also mean that the page is protected against bots.
Also, waiting for 6 seconds after this command
server.openConnection();
might solve your problem.
My advice is to use a REST API (if one exists). I'm not Russian, so I can't find this web page's REST API for you.
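Putting the two suggestions above together, here is a minimal, untested sketch to drop into the try block of connect(), reusing the server URL and readResponse() from the question; the User-Agent value, the retry count, and the 6-second delay are arbitrary choices, not something the site documents:
HttpURLConnection connection = null;
for (int attempt = 0; attempt < 3; attempt++) {
    connection = (HttpURLConnection) server.openConnection();
    // some anti-bot filters reject requests without a browser-like User-Agent
    connection.setRequestProperty("User-Agent",
            "Mozilla/5.0 (Windows NT 10.0; Win64; x64)");
    if (connection.getResponseCode() != 503) {
        break; // got a real answer, stop retrying
    }
    Thread.sleep(6000); // wait 6 seconds before the next attempt
}
readResponse(connection.getInputStream());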
I am taking a screenshot of my desktop, and I want to know how to send it to a PHP site and then display it.
I have made this, but have found nothing about streaming.
import java.awt.Dimension;
import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.PrintStream;
import java.net.MalformedURLException;
import java.net.URL;
import java.net.URLConnection;
import java.net.URLEncoder;
import java.util.Base64;
import javax.imageio.ImageIO;

public class Stream {

    static public void captureScreen() throws Exception {
        Dimension screenSize = java.awt.Toolkit.getDefaultToolkit().getScreenSize();
        Rectangle screenRectangle = new Rectangle(screenSize);
        Robot robot = new Robot();
        BufferedImage image = robot.createScreenCapture(screenRectangle);
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        ImageIO.write(image, "png", buffer);
        byte[] data = buffer.toByteArray();
        try {
            // open a connection to the site
            URL url = new URL("http://futuretechs.eu/stream.php");
            URLConnection con = url.openConnection();
            // activate the output
            con.setDoOutput(true);
            PrintStream ps = new PrintStream(con.getOutputStream());
            // send your parameters to your site; the Base64 string must be
            // URL-encoded because it can contain '+' and '/' characters
            ps.print("image=" + URLEncoder.encode(encodeArray(data), "UTF-8"));
            // System.out.println(encodeArray(data)); // debug: dump the Base64 payload
            // we have to get the input stream in order to actually send the request
            con.getInputStream();
            // close the print stream
            ps.close();
        } catch (MalformedURLException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        try {
            System.out.println("[ Stream Started ]");
            while (true) {
                Thread.sleep(100); // sleep() is static, so call it on Thread directly
                Stream.captureScreen();
            }
            // System.out.println("[ Stream Ended ]");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    static String encodeArray(byte[] in) {
        // java.util.Base64 (Java 8+) replaces the third-party Base64Coder class
        return Base64.getEncoder().encodeToString(in);
    }
}
How would I now send the byte[] from Java to PHP and display it?
So it would go like this:
the Java client program sends the byte[] content to the PHP site, and then PHP shows it to the user who is on the site?
Thank you!
EDIT: CODE UPDATED
What is the site you want to upload the screenshot content to? Is that site on the internet?
There are different approaches.
- You could have a PHP page which waits for an HTTP POST request with the screenshot in the payload, while the site itself has a PHP module running on that server that gets invoked by the web request.
- The server might support WebDAV; then you could upload your screenshot via HTTP PUT and invoke a PHP page with HTTP GET (passing the filename in your GET arguments). See the sketch after this list.
It's hard to tell if we don't know the PHP site, its API, and/or its behaviour.
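For the PUT idea, a minimal, untested sketch; the URL is hypothetical and assumes the server actually accepts PUT at that path:
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class PutUpload {
    public static void upload(byte[] pngBytes) throws Exception {
        // hypothetical WebDAV-enabled path; adjust to your server
        URL url = new URL("http://example.com/uploads/screenshot.png");
        HttpURLConnection con = (HttpURLConnection) url.openConnection();
        con.setDoOutput(true);
        con.setRequestMethod("PUT");
        con.setRequestProperty("Content-Type", "image/png");
        try (OutputStream os = con.getOutputStream()) {
            os.write(pngBytes); // the raw PNG bytes from ImageIO.write
        }
        System.out.println("Server answered: " + con.getResponseCode());
    }
}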
Well, it's better to convert the image bytes into Base64 before sending them to PHP.
When you send them to PHP, you can use the function imagecreatefromstring($image_data) to display the image.
display.php
<?php
$image = $_POST['image'];
$data = base64_decode($image);
$im = imagecreatefromstring($data);
if ($im !== false) {
    header('Content-Type: image/png');
    imagepng($im);
    imagedestroy($im);
} else {
    echo 'An error occurred.';
}
?>
This should work with PHP 4 >= 4.0.4 and PHP 5.
Let me know if it works :)
Edit :
I am not really good with Java, but try the code below.
As asked, the Java code:
try {
    // open a connection to the site
    URL url = new URL("http://www.yourdomain.com/yourphpscript.php");
    URLConnection con = url.openConnection();
    // activate the output
    con.setDoOutput(true);
    PrintStream ps = new PrintStream(con.getOutputStream());
    // send your parameters to your site; remember to URL-encode the Base64
    // string (e.g. with URLEncoder.encode) since it may contain '+' and '/'
    ps.print("image=BASE64_ENCODED_STRING_HERE");
    // we have to get the input stream in order to actually send the request
    con.getInputStream();
    // close the print stream
    ps.close();
} catch (MalformedURLException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
Or use Apache Commons HttpClient in Java:
HttpClient client = new HttpClient();
PostMethod method = new PostMethod("http://www.yourdomain.com/yourphpscript.php");
method.addParameter("image", "BASE64_ENCODED_STRING"); // or whatever
int statusCode = client.executeMethod(method);
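For what it's worth, one advantage of the HttpClient route is that addParameter builds a properly URL-encoded form body for you, so a Base64 payload containing '+' and '/' characters arrives intact without manual encoding.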
I'm trying to make a Java program that changes a text document on my website. The permissions are set so that everyone can edit it. I've tried, and reading works perfectly, but writing doesn't.
Here's the code for the writing:
import java.net.*;
import java.io.*;

public class Main {
    public static void main(String[] args) throws Exception {
        URL infoThing = new URL("http://www.[name of my website]/infoThing.txt");
        URLConnection con = infoThing.openConnection();
        con.setDoOutput(true);
        OutputStreamWriter out = new OutputStreamWriter(con.getOutputStream());
        out.write("Change to this.");
        out.close();
    }
}
For a Java program to interact with a server-side process it simply must be able to write to a URL, thus providing data to the server. It can do this by following these steps:
1. Create a URL.
2. Retrieve the URLConnection object.
3. Set output capability on the URLConnection.
4. Open a connection to the resource.
5. Get an output stream from the connection.
6. Write to the output stream.
7. Close the output stream.
If you want to write to the URL, you have to use the concepts above plus the concepts for servlets.
An example program that runs the backwards script over the network through a URLConnection:
import java.io.*;
import java.net.*;

public class ReverseTest {
    public static void main(String[] args) {
        try {
            if (args.length != 1) {
                System.err.println("Usage: java ReverseTest string_to_reverse");
                System.exit(1);
            }
            String stringToReverse = URLEncoder.encode(args[0], "UTF-8");
            URL url = new URL("http://java.sun.com/cgi-bin/backwards");
            URLConnection connection = url.openConnection();
            // without this, getOutputStream() throws an exception
            connection.setDoOutput(true);
            PrintStream outStream = new PrintStream(connection.getOutputStream());
            outStream.println("string=" + stringToReverse);
            outStream.close();
            BufferedReader inStream = new BufferedReader(
                    new InputStreamReader(connection.getInputStream()));
            String inputLine;
            while ((inputLine = inStream.readLine()) != null) {
                System.out.println(inputLine);
            }
            inStream.close();
        } catch (MalformedURLException me) {
            System.err.println("MalformedURLException: " + me);
        } catch (IOException ioe) {
            System.err.println("IOException: " + ioe);
        }
    }
}
I'm writing a Java program which hits a list of URLs and first needs to know whether each URL exists. I don't know how to go about this and can't find the Java code to use.
The URL is like this:
http://ip:port/URI?Attribute=x&attribute2=y
These are URLs on our internal network that would return an XML if valid.
Can anyone suggest some code?
You could just use HttpURLConnection. If the URL is not valid you won't get anything back.
HttpURLConnection connection = null;
try {
    URL myurl = new URL("http://www.myURL.com");
    connection = (HttpURLConnection) myurl.openConnection();
    // Use a HEAD request to reduce load, as Subirkumarsao said.
    connection.setRequestMethod("HEAD");
    int code = connection.getResponseCode();
    System.out.println("" + code);
} catch (IOException e) {
    // Handle invalid URL
}
Or you could ping it like you would from CMD and record the response.
String myurl = "google.com";
String ping = "ping " + myurl;
try {
    Runtime r = Runtime.getRuntime();
    Process p = r.exec(ping);
    BufferedReader in = new BufferedReader(new InputStreamReader(p.getInputStream()));
    String inLine;
    BufferedWriter write = new BufferedWriter(new FileWriter("C:\\myfile.txt"));
    while ((inLine = in.readLine()) != null) {
        write.write(inLine);
        write.newLine();
    }
    write.flush();
    write.close();
    in.close();
} catch (Exception ex) {
    // Code here for what you want to do with invalid URLs
}
A malformed URL will give you an exception.
To know whether the URL is active or not, you have to hit it. There is no other way.
You can reduce the load by requesting only the header from the URL.
package com.my;

import java.io.IOException;
import java.io.InputStream;
import java.net.MalformedURLException;
import java.net.URL;
import java.net.UnknownHostException;

public class StrTest {
    public static void main(String[] args) throws IOException {
        try {
            URL url = new URL("http://www.yaoo.coi");
            InputStream i = null;
            try {
                i = url.openStream();
            } catch (UnknownHostException ex) {
                System.out.println("THIS URL IS NOT VALID");
            }
            if (i != null) {
                System.out.println("Its working");
            }
        } catch (MalformedURLException e) {
            e.printStackTrace();
        }
    }
}
Output: THIS URL IS NOT VALID
Open a connection and check whether the response contains valid XML? Or was that too obvious, and are you looking for some other magic?
You may want to use HttpURLConnection and check for error status:
HttpURLConnection javadoc
I would like to be able to fetch a web page's HTML and save it to a String, so I can do some processing on it. Also, how could I handle various types of compression?
How would I go about doing that using Java?
I'd use a decent HTML parser like Jsoup. It's then as easy as:
String html = Jsoup.connect("http://stackoverflow.com").get().html();
It handles GZIP and chunked responses and character encoding fully transparently. It offers more advantages as well, like HTML traversing and manipulation with CSS selectors, as jQuery does. You only have to grab it as a Document, not as a String.
Document document = Jsoup.connect("http://google.com").get();
You really don't want to run basic String methods or even regex on HTML to process it.
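For example, once you have the Document, pulling data out is one CSS query away (the selector here is just an illustration):
Document document = Jsoup.connect("http://google.com").get();
// select() takes a jQuery-style CSS selector; text() strips the tags
String title = document.select("title").text();
System.out.println(title);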
See also:
What are the pros and cons of leading HTML parsers in Java?
Here's some tested code using Java's URL class. I'd recommend doing a better job than I do here of handling the exceptions or passing them up the call stack, though.
public static void main(String[] args) {
    URL url;
    InputStream is = null;
    BufferedReader br;
    String line;
    try {
        url = new URL("http://stackoverflow.com/");
        is = url.openStream(); // throws an IOException
        br = new BufferedReader(new InputStreamReader(is));
        while ((line = br.readLine()) != null) {
            System.out.println(line);
        }
    } catch (MalformedURLException mue) {
        mue.printStackTrace();
    } catch (IOException ioe) {
        ioe.printStackTrace();
    } finally {
        try {
            if (is != null) is.close();
        } catch (IOException ioe) {
            // nothing to see here
        }
    }
}
Bill's answer is very good, but you may want to do some things with the request, like compression or user-agents. The following code shows how you can apply various types of compression to your requests.
URL url = new URL(urlStr);
HttpURLConnection conn = (HttpURLConnection) url.openConnection(); // cast shouldn't fail
HttpURLConnection.setFollowRedirects(true);
// allow both GZip and Deflate (ZLib) encodings
conn.setRequestProperty("Accept-Encoding", "gzip, deflate");
String encoding = conn.getContentEncoding();
InputStream inStr = null;
// create the appropriate stream wrapper based on the encoding type
if (encoding != null && encoding.equalsIgnoreCase("gzip")) {
    inStr = new GZIPInputStream(conn.getInputStream());
} else if (encoding != null && encoding.equalsIgnoreCase("deflate")) {
    inStr = new InflaterInputStream(conn.getInputStream(), new Inflater(true));
} else {
    inStr = conn.getInputStream();
}
To also set the user-agent add the following code:
conn.setRequestProperty("User-agent", "my agent name");
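Since the question also asks for the page as a String, here is one way to finish the job, reusing the inStr stream from above (assuming UTF-8; a robust version would parse the charset out of the Content-Type header):
BufferedReader reader = new BufferedReader(new InputStreamReader(inStr, "UTF-8"));
StringBuilder page = new StringBuilder();
String line;
while ((line = reader.readLine()) != null) {
    page.append(line).append('\n'); // keep the line breaks from the source
}
reader.close();
String html = page.toString();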
Well, you could go with the built-in libraries such as URL and URLConnection, but they don't give very much control.
Personally I'd go with the Apache HTTPClient library.
Edit: HTTPClient has been set to end of life by Apache. The replacement is HttpComponents.
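With HttpComponents (HttpClient 4.x), fetching a page into a String might look roughly like this; stackoverflow.com stands in for whatever URL you need:
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.util.EntityUtils;

public class FetchPage {
    public static void main(String[] args) throws Exception {
        try (CloseableHttpClient client = HttpClients.createDefault();
             CloseableHttpResponse response = client.execute(new HttpGet("http://stackoverflow.com"))) {
            // EntityUtils drains the response body into a String
            String html = EntityUtils.toString(response.getEntity());
            System.out.println(html.length() + " characters downloaded");
        }
    }
}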
All the above-mentioned approaches do not download the web page text as it looks in the browser. These days a lot of data is loaded into browsers through scripts in HTML pages. None of the techniques mentioned above supports scripts; they just download the HTML text. HtmlUnit supports JavaScript, so if you are looking to download the web page text as it looks in the browser, then you should use HtmlUnit.
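A minimal, untested sketch with HtmlUnit might look like this (assuming a reasonably recent HtmlUnit version where WebClient is AutoCloseable; the URL is a placeholder, and real pages often need extra WebClient options to silence JavaScript errors):
import com.gargoylesoftware.htmlunit.WebClient;
import com.gargoylesoftware.htmlunit.html.HtmlPage;

public class HtmlUnitFetch {
    public static void main(String[] args) throws Exception {
        try (WebClient webClient = new WebClient()) {
            // getPage runs the page's JavaScript before returning
            HtmlPage page = webClient.getPage("http://stackoverflow.com");
            String html = page.asXml(); // the DOM after scripts have run
            System.out.println(html);
        }
    }
}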
Chances are you need to extract code from a secure web page (the https protocol). In the following example, the HTML file is saved into c:\temp\filename.html. Enjoy!
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.URL;
import javax.net.ssl.HttpsURLConnection;

/**
 * <b>Get the Html source from the secure url</b>
 */
public class HttpsClientUtil {
    public static void main(String[] args) throws Exception {
        String httpsURL = "https://stackoverflow.com";
        String FILENAME = "c:\\temp\\filename.html";
        BufferedWriter bw = new BufferedWriter(new FileWriter(FILENAME));
        URL myurl = new URL(httpsURL);
        HttpsURLConnection con = (HttpsURLConnection) myurl.openConnection();
        con.setRequestProperty("User-Agent",
                "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:63.0) Gecko/20100101 Firefox/63.0");
        InputStream ins = con.getInputStream();
        InputStreamReader isr = new InputStreamReader(ins, "Windows-1252");
        BufferedReader in = new BufferedReader(isr);
        String inputLine;
        // Write each line into the file
        while ((inputLine = in.readLine()) != null) {
            System.out.println(inputLine);
            bw.write(inputLine);
        }
        in.close();
        bw.close();
    }
}
To do so using NIO.2's powerful Files.copy(InputStream in, Path target):
URL url = new URL( "http://download.me/" );
Files.copy( url.openStream(), Paths.get("downloaded.html" ) );
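One caveat: Files.copy throws FileAlreadyExistsException if the target file is already there; pass StandardCopyOption.REPLACE_EXISTING to overwrite instead:
Files.copy( url.openStream(), Paths.get("downloaded.html"), StandardCopyOption.REPLACE_EXISTING );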
On a Unix/Linux box you could just run 'wget' but this is not really an option if you're writing a cross-platform client. Of course this assumes that you don't really want to do much with the data you download between the point of downloading it and it hitting the disk.
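If you do go the wget route on a platform that has it, shelling out might look like this (the output filename and URL are arbitrary):
import java.io.IOException;

public class WgetFetch {
    public static void main(String[] args) throws IOException, InterruptedException {
        // runs: wget -O downloaded.html http://example.com
        Process p = new ProcessBuilder("wget", "-O", "downloaded.html", "http://example.com")
                .inheritIO()   // show wget's progress in our console
                .start();
        int exit = p.waitFor();
        System.out.println("wget exited with code " + exit);
    }
}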
This class may help: it gets the code and filters out some information.
public class MainActivity extends AppCompatActivity {

    EditText url;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        url = ((EditText) findViewById(R.id.editText));
        DownloadCode obj = new DownloadCode();
        try {
            String tag1 = "<div class=\"description\">";
            String l = obj.execute("http://www.nu.edu.pk/Campus/Chiniot-Faisalabad/Faculty").get();
            String[] t1 = l.split(tag1);
            // t1[1] is the part of the page after the opening tag;
            // cutting it at the next </div> leaves just the description
            String[] t2 = t1[1].split("</div>");
            url.setText(t2[0]);
        } catch (Exception e) {
            Toast.makeText(this, e.toString(), Toast.LENGTH_SHORT).show();
        }
    }

    // input, extra function run in parallel, output
    class DownloadCode extends AsyncTask<String, Void, String> {
        @Override
        protected String doInBackground(String... WebAddress) { // strings of web addresses separated by ','
            String htmlcontent = "";
            try {
                URL url = new URL(WebAddress[0]);
                HttpURLConnection c = (HttpURLConnection) url.openConnection();
                c.connect();
                InputStream input = c.getInputStream();
                InputStreamReader reader = new InputStreamReader(input);
                int data = reader.read();
                while (data != -1) {
                    htmlcontent += (char) data; // a StringBuilder would be faster here
                    data = reader.read();
                }
            } catch (Exception e) {
                Log.i("Status : ", e.toString());
            }
            return htmlcontent;
        }
    }
}
Jetty has an HTTP client which can be used to download a web page.
package com.zetcode;

import org.eclipse.jetty.client.HttpClient;
import org.eclipse.jetty.client.api.ContentResponse;

public class ReadWebPageEx5 {
    public static void main(String[] args) throws Exception {
        HttpClient client = null;
        try {
            client = new HttpClient();
            client.start();
            String url = "http://example.com";
            ContentResponse res = client.GET(url);
            System.out.println(res.getContentAsString());
        } finally {
            if (client != null) {
                client.stop();
            }
        }
    }
}
The example prints the contents of a simple web page.
In a Reading a web page in Java tutorial I have written six examples of downloading a web page programmatically in Java, using URL, JSoup, HtmlCleaner, Apache HttpClient, Jetty HttpClient, and HtmlUnit.
I used the accepted answer to this post (url), writing the output into a file.
package test;

import java.net.*;
import java.io.*;

public class PDFTest {
    public static void main(String[] args) throws Exception {
        try {
            URL oracle = new URL("http://www.fetagracollege.org");
            BufferedReader in = new BufferedReader(new InputStreamReader(oracle.openStream()));
            String fileName = "D:\\a_01\\output.txt";
            PrintWriter writer = new PrintWriter(fileName, "UTF-8");
            String inputLine;
            while ((inputLine = in.readLine()) != null) {
                System.out.println(inputLine);
                writer.println(inputLine);
            }
            in.close();
            writer.close(); // flushes and releases the file handle
        } catch (Exception e) {
            e.printStackTrace(); // don't swallow errors silently
        }
    }
}