This is my code; currently it can only connect to the URL. How can I confirm that the file (getdata.php) exists? (The URL is not real; I changed the real URL for security reasons.)
URLConnection conn = new URL("http://testweb/trueweb.com.my/getdata.php").openConnection();
conn.connect();
Any ideas, folks?
EDIT: ANSWERED (credit to Hanlet Escaño for providing the code):
URLConnection conn = new URL("http://www.google.com.my").openConnection();
conn.connect();
int code = ((java.net.HttpURLConnection)conn).getResponseCode();
if (code == 404) {
    System.out.println("URL does not exist");
} else {
    System.out.println("URL exists!");
}
Try this:
int code = ((java.net.HttpURLConnection)conn).getResponseCode();
Then, if the code is 404, you know your page does not exist:
if (code == 404)
{
...
}
You can check the HTTP headers for a status code. You really only need to look for 200: 200 means the page is OK, and 404 means the page was not found.
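Putting those pieces together, here is a minimal sketch that checks a page with a HEAD request so the body is never downloaded (the URL below is a placeholder, not the asker's real one):

```java
import java.net.HttpURLConnection;
import java.net.URL;

public class PageCheck {
    // Pure helper: a 404 status means the page does not exist.
    static boolean pageExists(int code) {
        return code != HttpURLConnection.HTTP_NOT_FOUND;
    }

    public static void main(String[] args) throws Exception {
        // Placeholder URL; substitute the page you actually want to test.
        URL url = new URL("http://www.example.com/getdata.php");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("HEAD"); // status line and headers only, no body
        System.out.println(pageExists(conn.getResponseCode())
                ? "URL exists" : "URL does not exist");
    }
}
```

Note that some servers answer 403 or 500 for pages that do exist, so treating "anything but 404" as existence is an approximation.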
I've been afraid for some time that I might be doxxed. For this reason I want to write an automation in Java so I don't have to manually search all possible keywords every day. However, I always get a 200 response code because of the site's DDoS protection. Is there any way to get around this?
Here's my Code:
URL u = new URL("https://doxbin.org/upload/afafgFsg/");
HttpURLConnection huc = (HttpURLConnection) u.openConnection();
huc.setRequestMethod("GET"); // or huc.setRequestMethod("HEAD");
huc.connect();
int code = huc.getResponseCode();
System.out.println(code);
if (code == 200) {
System.out.println("Success");
} else if (code == 404) {
System.out.println("Not Found");
} else {
System.out.println("Error");
}
The link is currently still provisional; I just want to get the system working once.
I have tried every method I can think of, but each time it fails because of the DDoS protection.
I'm using Selenium to get a link from a page element, and I wanted to check whether it is a download link.
For that I used this code, built with URL and URLConnection:
final WebElement element = driver.findElement(By.xpath(pathToFile));
URL url = null;
final String urlFileToDownload = element.getAttribute("href");
URLConnection myCon = null;
String contentDisposition = "";
try {
    url = new URL(urlFileToDownload);
    myCon = url.openConnection();
    contentDisposition = myCon.getHeaderField("Content-Disposition");
    if (!contentDisposition.contains("attachment;filename=")) {
        assertTrue(false, "The link isn't a download link.");
    }
} catch (final MalformedURLException e) {
    throw new TestIntegrationException("Error while creating URL: " + e.getMessage());
} catch (final IOException e) {
    throw new TestIntegrationException("Error while connecting to the URL: " + e.getMessage());
}
assertTrue(true, "Link is a download link.");
The problem is that my link really is a download link, as you can see in this picture: Image-link-download (the picture is a screenshot of the console).
Yet when I open the connection with url.openConnection(), myCon.getHeaderField("Content-Disposition") is null.
I've searched for a way to fix this, but every time the header field is empty, and I can't find the problem, because when I check in the console the header field isn't empty...
EDIT: I'm running my Selenium test on a Docker server; I think that's an important point to know.
try this:
driver.get("https://i.stack.imgur.com/64qFG.png");
WebElement img = wait5s.until(ExpectedConditions.elementToBeClickable(By.xpath("/html/body/img")));
Dimension h = img.getSize();
Assert.assertNotEquals(0, h.getHeight()); // a zero-height image was not loaded
Instead of looking for attachments, why don't you look at the MIME type?
String contentType = myCon.getContentType();
if (contentType != null && contentType.startsWith("text/")) {
    assertTrue("The link isn't a download link.", false);
}
My problem was caused by my session being different from the one used by url.openConnection().
To fix the problem, I collected my JSESSIONID cookie using Selenium, like this:
String cookieTarget = null;
for (final Cookie cookie : this.kSupTestCase.getDriver().manage().getCookies()) {
    if (StringUtils.equalsIgnoreCase(cookie.getName(), "JSESSIONID")) {
        cookieTarget = cookie.getName() + "=" + cookie.getValue();
        break;
    }
}
Then I passed the cookie to the opened connection:
try {
    url = new URL(urlFileToDownload);
    myCon = url.openConnection();
    myCon.setRequestProperty("Cookie", cookieTarget);
    contentDisposition = myCon.getHeaderField("Content-Disposition");
    if (!contentDisposition.contains("attachment;filename=")) {
        assertTrue(false, "The link isn't a download link.");
    }
} catch [...]
That way I had the right session, and my URL was recognized as a download link.
I'm using the code below to get the cache-control header value for a given URL. I don't want to fetch the body of the URL. The request below takes 800 ms to process. Is there any change that can be made to the code below? I'm using Google App Engine for development. Please suggest. Thanks. I'd prefer not to add an extra jar.
URL obj;
URLConnection conn = null;
String noTransform = "";
obj = new URL(url);
conn = obj.openConnection();
noTransform = conn.getHeaderField("cache-control");
if (noTransform != null && (noTransform.contains("no-transform") || noTransform.contains("private"))) {
    news.setIsGoogleLiteURL("false");
    return news;
} else {
    news.setIsGoogleLiteURL("true");
    return news;
}
Instead of making a GET request, try making a HEAD request.
https://www.w3.org/Protocols/rfc2616/rfc2616-sec9.html#sec9.4
I've been working on some Java code in which a string is converted into a URL and then used to download and output the corresponding web page. Unfortunately, when I run the program, it just hangs. Does anyone have any suggestions?
Note: I've used import java.io.* and import java.net.*
public static boolean htmlOutput(String testURL) throws Exception {
    URL myPage2 = new URL(testURL); // converting String to URL
    System.out.println(myPage2);
    BufferedReader webInput2 = new BufferedReader(
            new InputStreamReader(myPage2.openStream()));
    String individualLine = null;
    String completeInput = ""; // start empty, not null, to avoid a "null" prefix
    while ((individualLine = webInput2.readLine()) != null) {
        System.out.println(individualLine);
        completeInput = completeInput + individualLine;
    }
    webInput2.close();
    return true;
}
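One common cause of an apparent hang is a server that accepts the connection but never sends data; by default, java.net connections block indefinitely. A sketch with explicit timeouts, which turn the hang into a catchable exception (the 5-second values are arbitrary):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class PageFetch {
    public static String fetch(String testURL) throws Exception {
        HttpURLConnection conn =
                (HttpURLConnection) new URL(testURL).openConnection();
        conn.setConnectTimeout(5000); // give up connecting after 5 s
        conn.setReadTimeout(5000);    // give up on a silent server after 5 s
        StringBuilder completeInput = new StringBuilder();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                completeInput.append(line);
            }
        }
        return completeInput.toString();
    }
}
```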
[Though this answer helped the OP, it is wrong: HttpURLConnection does follow redirects, so this could not have been the OP's problem. I will remove it as soon as the OP removes the accepted mark.]
My guess is that you don't get anything back in the response stream because the page you are trying to connect to sends a redirect response (e.g. 302).
Try to verify that by reading the response code and iterating over the response headers. There should be a header named Location with the new URL that you need to follow:
HttpURLConnection connection = (HttpURLConnection) url.openConnection();
int code = connection.getResponseCode();
Map<String, List<String>> map = connection.getHeaderFields();
// iterate over the map and find the new url
If you are having trouble getting the above snippet to work, take a look at a working example.
You could do yourself a favor and use a third-party HTTP client like Apache HttpClient, which can handle redirects; otherwise you have to do this manually.
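A sketch of following the Location header manually (note that HttpURLConnection already follows same-protocol redirects by default, so doing this by hand matters mainly for http-to-https hops; the helper assumes absolute Location values):

```java
import java.net.HttpURLConnection;
import java.net.URL;

public class RedirectFollow {
    // Pure helper: redirect statuses are 301, 302, 303, 307, 308
    // (304 is Not Modified, not a redirect).
    static boolean isRedirect(int code) {
        return code >= 300 && code < 400 && code != 304;
    }

    public static int finalStatus(String start) throws Exception {
        String current = start;
        for (int hops = 0; hops < 5; hops++) { // guard against redirect loops
            HttpURLConnection conn =
                    (HttpURLConnection) new URL(current).openConnection();
            conn.setInstanceFollowRedirects(false); // handle Location ourselves
            int code = conn.getResponseCode();
            if (!isRedirect(code)) {
                return code;
            }
            current = conn.getHeaderField("Location"); // follow the next hop
        }
        throw new Exception("Too many redirects");
    }
}
```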
I need to make a program that takes a valid URL of a webpage, like www.stackoverflow.com/questions, or its IP-address equivalent. The program will then fetch that webpage and return the status code of the page, such as 200 OK or 404 NOT FOUND. If the webpage isn't reachable, a message should be returned explaining the situation.
Here’s what I have done so far:
interface Result {
    public boolean ok();
    public String message();
}

class Page {
    public Result check(String wholeURL) throws Exception {
        throw new Exception("Not sure about the rest");
    }
}
Also, if I were to check a page like http://www.stackoverflow.com, I'd create an instance of Page and then do something like this:
Page page = new PageImplementation();
Result result = page.check("http://www.stackoverflow.com:60");
if (result.ok()) { ... }
else { ... }
The object returned is an instance of Result; the ok() method should return true when the status code is 200 OK and false otherwise. The message() method should return the status code as a string.
Have a look at the HttpURLConnection class in the JDK, or use Apache HttpComponents.
Basically, you try to connect to the URL and check the response header, or wait for a timeout if the server isn't reachable at all.
With HttpURLConnection it might look like this:
URL url = new URL("http://www.stackoverflow.com");
HttpURLConnection connection = (HttpURLConnection)url.openConnection();
connection.connect();
int httpStatusCode = connection.getResponseCode(); //200, 404 etc.
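Building on that snippet, one possible PageImplementation satisfying the Result interface from the question (the message wording and the anonymous-class design are my choices, not part of the assignment):

```java
import java.net.HttpURLConnection;
import java.net.URL;

interface Result {
    boolean ok();
    String message();
}

class PageImplementation {
    public Result check(String wholeURL) {
        try {
            HttpURLConnection connection =
                    (HttpURLConnection) new URL(wholeURL).openConnection();
            connection.connect();
            final int code = connection.getResponseCode();     // e.g. 200, 404
            final String reason = connection.getResponseMessage(); // "OK", "Not Found"
            return new Result() {
                public boolean ok() { return code == 200; }
                public String message() { return code + " " + reason; }
            };
        } catch (final Exception e) {
            // Malformed URL, unreachable host, timeout, ...
            return new Result() {
                public boolean ok() { return false; }
                public String message() { return "Not reachable: " + e.getMessage(); }
            };
        }
    }
}
```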
You can use an API like Apache Commons HttpClient:
import org.apache.commons.httpclient.*;
import org.apache.commons.httpclient.methods.*;
import org.apache.commons.httpclient.params.HttpMethodParams;
..........
public Result check(String fullURL) throws Exception {
    HttpClient client = new HttpClient();
    GetMethod method = new GetMethod(fullURL);
    int statusCode = client.executeMethod(method);
    // Update your result object based on the status code
}