Java Get URL Header Alone, Body Not Required - java

I'm using the code below to get the cache-control value from the header of a given URL. I don't want to fetch the body of the URL. The request below takes 800 ms to process. Is there any alteration that can be made to this code? I'm using Google App Engine for development. Please suggest. Thanks. I'd prefer not to add an extra jar.
URL obj = new URL(url);
URLConnection conn = obj.openConnection();
// getHeaderField() issues a GET by default, so the server still sends a body
String noTransform = conn.getHeaderField("cache-control");
if (noTransform != null && (noTransform.contains("no-transform") || noTransform.contains("private"))) {
    news.setIsGoogleLiteURL("false");
    return news;
} else {
    news.setIsGoogleLiteURL("true"); // no no-transform/private directive found
    return news;
}

Instead of making a GET request, try making a HEAD request.
https://www.w3.org/Protocols/rfc2616/rfc2616-sec9.html#sec9.4
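A minimal sketch of what that might look like with HttpURLConnection, which ships with the JDK so no extra jar is needed; the example URL and the timeout values are assumptions, not taken from the question:
import java.net.HttpURLConnection;
import java.net.URL;

public class CacheControlHeadCheck {
    public static void main(String[] args) throws Exception {
        URL url = new URL("https://example.com/"); // placeholder URL
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("HEAD");   // ask the server for headers only, no body
        conn.setConnectTimeout(5000);    // assumed timeouts to avoid hanging
        conn.setReadTimeout(5000);

        String cacheControl = conn.getHeaderField("cache-control");
        boolean blocked = cacheControl != null
                && (cacheControl.contains("no-transform") || cacheControl.contains("private"));

        System.out.println("cache-control: " + cacheControl);
        System.out.println("no-transform or private: " + blocked);
        conn.disconnect();
    }
}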

Related

Check if page on d0xbin exists?

I've been afraid for some time that I might be doxxed. For this reason I want to write an automated check in Java so that I don't have to manually search all possible keywords every day. However, I always get a 200 response code due to DDoS protection. Is there any way to get around this?
Here's my Code:
URL u = new URL("https://doxbin.org/upload/afafgFsg/");
HttpURLConnection huc = (HttpURLConnection) u.openConnection();
huc.setRequestMethod("GET"); // or huc.setRequestMethod("HEAD");
huc.connect();
int code = huc.getResponseCode();
System.out.println(code);
if (code == 200) {
    System.out.println("Success");
} else if (code == 404) {
    System.out.println("Not Found");
} else {
    System.out.println("Error");
}
The link is currently still provisional. I just want to get the system working once.
I have tried every conceivable method so far, but every time it fails due to DDoS protection.

retrieve data inside cookieContainer in java

I'm developing a game using the Unity engine that has to send a cookie from the client side (C#) to the server side (Java), and I'm facing this problem (maybe a cross-platform problem? I'm not sure).
I wrote the client-side code like this:
private HttpWebRequest request() {
    try {
        string url = "http://localhost:8080/...";
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Timeout = 15000;
        request.KeepAlive = true;
        request.Method = "GET";
        CookieContainer cookieContainer = new CookieContainer();
        Cookie Authentication = new Cookie("Session", "09iubasd");
        Authentication.Domain = url;
        cookieContainer.Add(Authentication);
        request.CookieContainer = cookieContainer;
        request.Headers.Add("testting", "hascome");
        return request;
    } catch (System.Exception ex) {
        Debug.Log("[Exception]" + ex);
        throw ex;
    }
}
The server side is written in Java Spring. I can't retrieve the cookie data from the CookieContainer on the server side. Can anyone give me a suggestion or a solution to this problem, or point me to something similar to CookieContainer in Java? I have googled but found no way. If this is a silly question then please teach me. Many thanks.
Vince
I just found out the reason: my cookie domain was set the wrong way.
Here is the fixed test code. Hope this helps anyone who has the same problem in the future (of course it would be great if no one ran into this silly problem).
private HttpWebRequest request() {
    try {
        System.Uri uri = new System.Uri("http://localhost:8080/...");
        var request = (HttpWebRequest)WebRequest.Create(uri);
        request.Timeout = 15000;
        request.KeepAlive = true;
        request.Method = "GET";
        Cookie Authentication = new Cookie("Session", "09iubasd");
        Authentication.Domain = uri.Host; // use the host name, not the full URL, as the cookie domain
        request.CookieContainer = new CookieContainer();
        request.CookieContainer.Add(Authentication);
        request.Headers.Add("testting", "hascome");
        return request;
    } catch (System.Exception ex) {
        Debug.Log("[Exception]" + ex);
        throw ex;
    }
}
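On the Java Spring side, a hedged sketch of how the "Session" cookie and the "testting" header could then be read; the controller class and the mapping path are assumptions, only the cookie and header names come from the client code above:
import org.springframework.web.bind.annotation.CookieValue;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestHeader;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class SessionController {

    // Reads the "Session" cookie sent by the Unity client; defaults to "none" if absent.
    @GetMapping("/session-check")
    public String checkSession(
            @CookieValue(value = "Session", defaultValue = "none") String session,
            @RequestHeader(value = "testting", required = false) String testHeader) {
        return "cookie=" + session + ", header=" + testHeader;
    }
}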

How to confirm the file getdata.php exists at a given URL

This is my code; currently it can only connect to the URL. How can I confirm that the file (getdata.php) exists? (The URL is not real; I changed the real URL for security reasons.)
URLConnection conn = new URL("http://testweb/trueweb.com.my/getdata.php").openConnection();
conn.connect();
Any ideas, folks?
EDIT ANSWERED (credit to Hanlet Escaño for providing code):
URLConnection conn = new URL("http://www.google.com.my").openConnection();
conn.connect();
int code = ((java.net.HttpURLConnection) conn).getResponseCode();
if (code == 404) {
    System.out.println("URL not exist");
} else {
    System.out.println("URL Exist!");
}
Try this:
int code = ((java.net.HttpURLConnection)conn).getResponseCode();
Then, if the code is 404, you know your page does not exist:
if (code == 404)
{
...
}
You can check the HTTP response for a status code. You really only need to look for 200: 200 means the page is OK, and 404 means the page was not found.
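For completeness, a hedged sketch of the same existence check using a HEAD request, so the body of getdata.php is never downloaded; the URL is the placeholder from the question:
import java.net.HttpURLConnection;
import java.net.URL;

public class FileExistsCheck {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://testweb/trueweb.com.my/getdata.php"); // placeholder from the question
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("HEAD"); // status line and headers only, no body
        int code = conn.getResponseCode();
        if (code == 200) {
            System.out.println("URL Exist!");
        } else if (code == 404) {
            System.out.println("URL not exist");
        } else {
            System.out.println("Unexpected status: " + code);
        }
    }
}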

Reading and printing HTML from website hangs up

I've been working on some Java code in which a string is converted into a URL and then used to download and output the corresponding web page. Unfortunately, when I run the program, it just hangs. Does anyone have any suggestions?
Note: I've used import java.io.* and import java.net.*
public static boolean htmlOutput(String testURL) throws Exception {
    URL myPage2 = new URL(testURL); // converting String to URL
    System.out.println(myPage2);
    BufferedReader webInput2 = new BufferedReader(
            new InputStreamReader(myPage2.openStream()));
    String individualLine = null;
    String completeInput = ""; // start empty so the text "null" isn't prepended
    while ((individualLine = webInput2.readLine()) != null) {
        System.out.println(individualLine);
        completeInput = completeInput + individualLine;
    } // end while
    webInput2.close();
    return true;
} // end htmlOutput()
[Though this answer helped the OP it is wrong. HttpURLConnection does follow redirects, so this could not be the OP's problem. I will remove it as soon as the OP removes the accepted mark.]
My guess is that you don't get anything back in the response stream because the page you are trying to connect sends you a redirect response (i.e. 302).
Try to verify that by reading the response code and iterate over the response headers. There should be a header named Location with a new url that you need to follow
HttpURLConnection connection = (HttpURLConnection) url.openConnection();
int code = connection.getResponseCode();
Map<String, List<String>> map = connection.getHeaderFields();
// iterate over the map and find the new url in the "Location" header
If you are having trouble getting the above snippet to work, take a look at a working example.
You could do yourself a favor and use a third-party HTTP client like Apache HttpClient that can handle redirects; otherwise you have to do this manually.
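A hedged sketch of handling a redirect manually with HttpURLConnection; the URL is a placeholder, and automatic redirect following is switched off only so the 3xx response and its Location header become visible:
import java.net.HttpURLConnection;
import java.net.URL;

public class ManualRedirect {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://example.com/"); // placeholder URL
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        connection.setInstanceFollowRedirects(false); // expose the redirect instead of following it

        int code = connection.getResponseCode();
        if (code >= 300 && code < 400) {
            // The Location header carries the redirect target; it may be relative,
            // so resolve it against the original URL.
            String location = connection.getHeaderField("Location");
            URL next = new URL(url, location);
            System.out.println("Redirected (" + code + ") to " + next);
            HttpURLConnection follow = (HttpURLConnection) next.openConnection();
            System.out.println("Follow-up status: " + follow.getResponseCode());
        } else {
            System.out.println("Status: " + code);
        }
    }
}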

Checking the status of a web page [duplicate]

This question already has answers here:
How to use java.net.URLConnection to fire and handle HTTP requests
(12 answers)
Closed 9 years ago.
I need to make a program that takes a valid URL of a webpage, like www.stackoverflow.com/questions, and its IP address equivalent. The program will then find that webpage and return the status code of the page, such as 200 OK or 404 NOT FOUND. If the webpage isn't reachable, a message should be returned explaining the situation.
Here’s what I have done so far:
interface Result {
    public boolean ok();
    public String message();
}

class Page {
    public Result check(String wholeURL) throws Exception {
        throw new Exception("Not sure about the rest");
    }
}
Also, if I were to check a page like http://www.stackoverflow.com, I'd create an instance of Page and then do something like this:
Page page = new PageImplementation();
Result result = page.check("http://www.stackoverflow.com:60");
if (result.ok()) { ... }
else { ... }
The object that is returned is an instance of Result, and the "ok" method should return true when the status code is 200 OK, and false otherwise. The "message" method should return the status code as a string.
Have a look at the HttpURLConnection class within the JDK or use Apache Http Components.
Basically you try to connect to the url and check the response header or wait for a timeout if the server isn't reachable at all.
With HttpURLConnection it might look like this:
URL url = new URL("http://www.stackoverflow.com");
HttpURLConnection connection = (HttpURLConnection)url.openConnection();
connection.connect();
int httpStatusCode = connection.getResponseCode(); //200, 404 etc.
You can use an API like Apache Commons HttpClient:
import org.apache.commons.httpclient.*;
import org.apache.commons.httpclient.methods.*;
import org.apache.commons.httpclient.params.HttpMethodParams;
..........
public Result check(String fullURL) throws Exception {
    HttpClient client = new HttpClient();
    GetMethod method = new GetMethod(fullURL);
    int statusCode = client.executeMethod(method);
    // Update your result object based on statusCode
}
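Tying either answer back to the question's interface, here is a hedged sketch of a PageImplementation built only on HttpURLConnection (no extra jar); the class name PageImplementation comes from the question, while the timeout values and the unreachable-host message are assumptions:
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

class PageImplementation extends Page {
    @Override
    public Result check(String wholeURL) throws Exception {
        try {
            HttpURLConnection connection =
                    (HttpURLConnection) new URL(wholeURL).openConnection();
            connection.setConnectTimeout(5000); // assumed timeouts
            connection.setReadTimeout(5000);
            final int code = connection.getResponseCode();
            final String text = code + " " + connection.getResponseMessage();
            return new Result() {
                public boolean ok() { return code == 200; }
                public String message() { return text; }
            };
        } catch (IOException e) {
            final String text = "Page not reachable: " + e.getMessage();
            return new Result() {
                public boolean ok() { return false; }
                public String message() { return text; }
            };
        }
    }
}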
