Java HTTP 403 error on upload

I am definitely not a computer guru. :)
I get an HTTP 403 error when trying to upload a file to my website through an applet.
403 means the request is forbidden. Is that normal behavior, i.e. is uploading over HTTP simply not allowed? If so, how should I do it? I would like my applet to upload small files to a specific folder on the server.
Here is the code:
private static String folder = "http://..."; // URL of the folder to upload to

public void saveScore(Item hs) { // Item is a serializable object to save
    String filename = "s" + Integer.toString(hs.getScore()) + ".sco"; // name of the file
    System.out.println("*** Trying to save file to : " + filename);
    try {
        // set up the connection
        HttpURLConnection con = (HttpURLConnection) new URL(folder + "/" + filename).openConnection();
        con.setDoInput(true);
        con.setDoOutput(true);
        con.setRequestProperty("Content-Type", "multipart/form-data");
        con.setRequestProperty("User-Agent", "Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.4; en-US; rv:1.9.2.2) Gecko/20100316 Firefox/3.6.2");
        con.setChunkedStreamingMode(1024);
        con.setRequestMethod("PUT");
        ObjectOutputStream oos = new ObjectOutputStream(con.getOutputStream());
        // upload the object
        oos.writeObject(hs);
        oos.flush();
        oos.close();
        // read the answer
        DataInputStream is = new DataInputStream(con.getInputStream());
        String s = is.readLine();
        is.close();
        System.out.println("** Answer **");
        System.out.println(s);
    } catch (IOException e) {
        e.printStackTrace(System.out); // this is where I see the 403 error
    }
}
Thanks for helping...

Go through the link below; I hope it helps you.
http://stackoverflow.com/questions/1599018/java-applet-to-upload-a-file?rq=1
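As background for the 403: a plain PUT to a folder URL is usually rejected unless the server is explicitly configured to accept uploads that way; the more common approach is to POST the file to a server-side script that stores it. Below is a rough sketch of such a multipart/form-data POST from Java. The upload.php endpoint is hypothetical, and you would first serialize your Item to a byte[] (for example with an ObjectOutputStream over a ByteArrayOutputStream):

    import java.io.DataOutputStream;
    import java.io.IOException;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class UploadSketch {
        public static void upload(byte[] fileBytes, String filename) throws IOException {
            String boundary = "----JavaUploadBoundary" + System.currentTimeMillis();
            // Hypothetical server-side endpoint that accepts the upload and saves it in the target folder.
            URL url = new URL("http://www.example.com/upload.php");
            HttpURLConnection con = (HttpURLConnection) url.openConnection();
            con.setDoOutput(true);
            con.setRequestMethod("POST");
            con.setRequestProperty("Content-Type", "multipart/form-data; boundary=" + boundary);
            try (DataOutputStream out = new DataOutputStream(con.getOutputStream())) {
                out.writeBytes("--" + boundary + "\r\n");
                out.writeBytes("Content-Disposition: form-data; name=\"file\"; filename=\"" + filename + "\"\r\n");
                out.writeBytes("Content-Type: application/octet-stream\r\n\r\n");
                out.write(fileBytes);                           // the file body itself
                out.writeBytes("\r\n--" + boundary + "--\r\n"); // closing boundary
            }
            int responseCode = con.getResponseCode(); // e.g. 200 if the script accepted the upload
            System.out.println("Server answered: " + responseCode);
        }
    }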


Does the website not like Java?

I'm trying to open my university's website to read their menu. I've written a version that reads the menu when given the direct link to the menu page, but I want to pull back a level so I can reach the menu from the main website rather than the direct link (in case that link ever changes).
Here is the URL I am opening:
https://nccudining.sodexomyway.com/dining-choices/index.html
Whenever I open the link to the website, this is the output that I get:
302
<html><head><title>Object moved</title></head><body>
<h2>Object moved to here.</h2>
</body></html>
The URL it outputs is the mobile version of the website, but when I try to use that URL, it outputs nothing.
This is my code:
import java.io.*;
import java.net.*;

public class test
{
    public static void main( String[] args )
    {
        URL url = null;
        try
        {
            url = new URL("https://nccudining.sodexomyway.com/dining-choices/index.html");
            HttpURLConnection test = (HttpURLConnection) url.openConnection();
            test.setInstanceFollowRedirects(true);
            test.connect();
            System.out.println(test.getResponseCode());
        } catch ( MalformedURLException e1 )
        {
            System.out.println("URL cannot be opened.");
            return;
        } catch ( IOException e1 )
        {
            System.out.println("Error");
            return;
        }
        BufferedReader in = null;
        try
        {
            in = new BufferedReader(new InputStreamReader(url.openStream()));
        } catch ( IOException e )
        {
            System.out.println("Error");
        }
        String inputLine;
        try
        {
            while ((inputLine = in.readLine()) != null)
            {
                System.out.println(inputLine);
            }
        } catch ( IOException e )
        {
            System.out.println("Error");
        }
    }
}
I apologize for all the try/catch blocks. I don't want to just declare that main throws IOException from the get-go because I've heard that's bad practice. Anyway, this code opens the URL, sets up a connection so I can make sure the URL actually exists, and tries to read its HTML. It works on every other site I've tried it on, including Google.
My question is: why won't my code read the correct source code of this website? Is something wrong with my code (I figured using HttpURLConnection and allowing redirects would be enough), or is it the website itself, and is there anything I can do to work around it other than opening the weekly menu's page directly?
Solution found! Thanks to @ShayHaned for the fixes. I added the following lines to the HttpURLConnection setup, so now I get a 200 response code rather than a 302:
test = (HttpURLConnection) url.openConnection();
test.setRequestMethod("GET");
test.setRequestProperty("User-Agent", "Mozilla/5.0");
test.setInstanceFollowRedirects(true);
Then I changed the input stream so that it comes from the HttpURLConnection rather than from opening the URL directly, as shown:
BufferedReader in = new BufferedReader(new InputStreamReader(test.getInputStream()));
That gave me the HTML I was looking for.
You are just missing the request headers the site expects. You can add a few headers to make sure that you get the desired response:
HttpURLConnection test = (HttpURLConnection) url.openConnection();
test.addRequestProperty( "User-Agent", "Mozilla/5.0 (Windows NT 6.3; WOW64; Trident/7.0; rv:11.0) like Gecko" );
test.addRequestProperty( "Accept", "text/html,application/xhtml+xml,application/xml;q=0.9,image/png,image/svg+xml,*/*;q=0.8" );
test.addRequestProperty( "Accept-Charset", "ISO-8859-1,utf-8;q=0.7,*;q=0.3" );
test.addRequestProperty( "Accept-Language", "en-US,en;q=0.8" );
test.addRequestProperty( "Connection", "close" );
test.setRequestMethod("GET");
test.setInstanceFollowRedirects(true);
test.connect();
// Don't open a second, header-less stream straight from the URL:
//in = new BufferedReader(new InputStreamReader(url.openStream()));
// Read from the connection that actually carries the headers set above:
in = new BufferedReader( new InputStreamReader( test.getInputStream() ) );
String htmlContent = "";
for( String inputLine = ""; ( inputLine = in.readLine() ) != null; )
    htmlContent += inputLine;
System.out.println( htmlContent );
Instead of in = new BufferedReader(new InputStreamReader(url.openStream())), please try in = new BufferedReader(new InputStreamReader(test.getInputStream())), because the InputStream should come from the actual HttpURLConnection object that carries the headers you just set.
If you really want to understand the HTTP header part, see https://en.wikipedia.org/wiki/List_of_HTTP_header_fields for a detailed description of HTTP headers and their usage.
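One more thing worth knowing: HttpURLConnection will not automatically follow a redirect that switches protocol (for example http to https), even with setInstanceFollowRedirects(true). If you still see a 302 after adding the headers, you can follow the Location header yourself. A rough sketch (the starting URL is the one from the question; the hop limit is an arbitrary safety bound):

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class FollowRedirectSketch {
        public static void main(String[] args) throws IOException {
            String address = "https://nccudining.sodexomyway.com/dining-choices/index.html";
            HttpURLConnection con;
            int hops = 0;
            while (true) {
                con = (HttpURLConnection) new URL(address).openConnection();
                con.setRequestProperty("User-Agent", "Mozilla/5.0");
                con.setInstanceFollowRedirects(false); // we handle redirects ourselves
                int code = con.getResponseCode();
                if (code != HttpURLConnection.HTTP_MOVED_PERM
                        && code != HttpURLConnection.HTTP_MOVED_TEMP
                        && code != HttpURLConnection.HTTP_SEE_OTHER) {
                    break; // not a redirect, read the body below
                }
                String location = con.getHeaderField("Location");
                if (location == null || ++hops > 5) {
                    throw new IOException("Redirect loop or missing Location header");
                }
                // Resolve the Location header against the current URL in case it is relative.
                address = new URL(new URL(address), location).toString();
            }
            try (BufferedReader in = new BufferedReader(new InputStreamReader(con.getInputStream()))) {
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(line);
                }
            }
        }
    }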

Facebook login with pure HttpsURLConnection, without an API

Is there some way to save tokens without the Facebook API? I want to log in to Facebook using only native Java libraries, the way Mozilla does!
If I send my correct login info, facebook.com returns the home page. If I send the wrong login, it returns a page saying 'your email or password is incorr...'.
Why does it not return my home page, with friends and so on?
Here is the code:
String httpsURL = "https://www.facebook.com/login.php";
String query;
try {
String MY_MAIL = "YOUR MAIL";
String MY_PASS = "YOUR PASS";
query = "lsd=AVpivkhO&email=" + URLEncoder.encode(MY_MAIL, "UTF-8") + "&pass=" + URLEncoder.encode(MY_PASS, "UTF-8") + "&default_persistent=0&timezone=180&lgndim=eyJ3IjoxMzY2LCJoIjo3NjgsImF3IjoxMzY2LCJhaCI6NzM4LCJjIjoyNH0%3D&lgnrnd=135513_yy32&lgnjs=1441832114&locale=pt_BR&qsstamp=W1tbMywxMSwyOSwzMSw1Niw3Miw5NywxNDYsMTUxLDE2OSwxNzEsMTkwLDE5NSwyMzcsMjU1LDI2NiwyODEsMjk0LDMwMSwzMDgsMzM3LDM0MSwzNDYsMzk0LDQxMyw0MzEsNDUyLDQ1NCw0NzIsNDc0LDQ3NSw0OTAsNTMzLDU1MCw1NTYsNTgxLDU4NCw2NDYsNjQ5LDY4NCw3OTMsODUzXV0sIkFabG1OQ2l1WFBTS194bWlmUndYdWlrMVFYT25MODBLMGhoTnpjSmxRTXlGd3lZR0o0ckZZRXp6ZjkwX201aUxuZTJ3VUxsTEdtTW90NExSUzI1LTZyekV6U0RFRE1DWHFuRU11RllZSGstaGE3bTE5UWtRMk5xem9WT3pSVXFNdHBJZU9LNkYzV1EyNFlzVnJhNi16NUVrMzJFOTZGYy1RUDRpTUZic1dGVHRSMzFsY0hRX3l4eXZPdm90ZEhEdXM1UmphcFc5UENVSFNoYmp5RmFNZGxGSyJd";
//query = "";
URL myurl = new URL(httpsURL);
HttpsURLConnection con = (HttpsURLConnection)myurl.openConnection();
con.setRequestMethod("POST");
con.setRequestProperty("Host", "facebook.com");
con.setRequestProperty("User-Agent", "Mozilla/4.0 (compatible; MSIE 5.0;Windows98;DigExt)");
con.setRequestProperty("Content-Type","Content-Type: application/x-www-form-urlencoded");
con.setRequestProperty("Accept", "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8");
con.setRequestProperty("Accept-Language", "pt-BR,pt;q=0.8,en-US;q=0.5,en;q=0.3");
//con.setRequestProperty("Accept-Encoding", "gzip, deflate");
con.setRequestProperty("Cookie:", "fr=0DlGgNm7j7OSrxwX6.AWVtGD2R_8r7u9v6SmEp-u_cTWA.BVmtC5.sI.AAA.0.AWV7nB8J; lu=RAJEHG-pWh91KhTp4yKcf40A; datr=n9CaVRge8gdmQM4fbYPCgerZ; locale=pt_BR; a11y=%7B%22sr%22%3A0%2C%22sr-ts%22%3A1441880544006%2C%22jk%22%3A0%2C%22jk-ts%22%3A1441880544006%2C%22kb%22%3A1%2C%22kb-ts%22%3A1441880544006%2C%22hcm%22%3A0%2C%22hcm-ts%22%3A1441880544006%7D; reg_fb_ref=https%3A%2F%2Fwww.facebook.com%2F%3Fstype%3Dlo%26jlou%3DAfeJdlzS3ToJP2-zmWuo749IArqE_LKyQPhyMJUZGzXJ04e7NsDHRllozvz1i-L_gVMnR55t3_-GIBPa9s1jrq1eNyoO43bYSUV4YqOC04TPGg%26smuh%3D50695%26lh%3DAc9epmfRhrK0i_WU; reg_fb_gate=https%3A%2F%2Fwww.facebook.com%2F%3Fstype%3Dlo%26jlou%3DAfeJdlzS3ToJP2-zmWuo749IArqE_LKyQPhyMJUZGzXJ04e7NsDHRllozvz1i-L_gVMnR55t3_-GIBPa9s1jrq1eNyoO43bYSUV4YqOC04TPGg%26smuh%3D50695%26lh%3DAc9epmfRhrK0i_WU; wd=1525x268");
con.setRequestProperty("Connection", "keep-alive");
con.setDoInput(true);
con.setDoOutput(true);
DataOutputStream output = new DataOutputStream(con.getOutputStream());
output.writeBytes(query);
output.close();
DataInputStream input = new DataInputStream( con.getInputStream() );
for( int c = input.read(); c != -1; c = input.read()){
System.out.print( (char)c );
}
System.out.println();
System.out.println(">> " + con.getHeaderField("Set-Cookie"));
input.close();
System.out.println("Resp Code:"+con .getResponseCode());
System.out.println("Resp Message:"+ con .getResponseMessage());
//System.out.println("coo: " + coo.get(1));
} catch (UnsupportedEncodingException | MalformedURLException e) {
e.printStackTrace();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
I'm using Mozilla to analyze the network HTTP headers/requests and to get the cookies.
Further questions:
1 - Does the browser generate the cookie?
2 - Who sends cookies first, the client or the server?
3 - What is needed for a successful login? Tokens, cookies...?
Note: I already know how to program; I'm a decent programmer. This is a question about the protocol and its standards, so please take the question as asked.
________________________ EDIT ____________________
This code is similar to mine. I wanted to do something in that style: mkyong
_______________________ EDIT 2 _____________________
I do not want to use the API, but HtmlUnit looks interesting and no-frills.
See: How to log into Facebook programmatically using Java?
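On the cookie questions: the server sends cookies first, via Set-Cookie response headers, and the client echoes them back on later requests in a Cookie header. A minimal sketch of letting Java handle that automatically with java.net.CookieManager follows; it only illustrates the cookie flow, and is not a working Facebook login, which also needs the hidden form tokens from the login page:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.CookieHandler;
    import java.net.CookieManager;
    import java.net.CookiePolicy;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class CookieFlowSketch {
        public static void main(String[] args) throws Exception {
            // Install a default cookie handler: Set-Cookie headers from responses are stored
            // and automatically replayed as Cookie headers on subsequent requests.
            CookieManager cookies = new CookieManager(null, CookiePolicy.ACCEPT_ALL);
            CookieHandler.setDefault(cookies);

            // First request: the server answers with Set-Cookie; the manager stores the cookies.
            HttpURLConnection first = (HttpURLConnection) new URL("https://www.facebook.com/login.php").openConnection();
            first.setRequestProperty("User-Agent", "Mozilla/5.0");
            first.getResponseCode();
            System.out.println("Cookies received: " + cookies.getCookieStore().getCookies());

            // Second request: the stored cookies are sent back automatically.
            HttpURLConnection second = (HttpURLConnection) new URL("https://www.facebook.com/").openConnection();
            second.setRequestProperty("User-Agent", "Mozilla/5.0");
            try (BufferedReader in = new BufferedReader(new InputStreamReader(second.getInputStream()))) {
                System.out.println(in.readLine()); // just to show the request goes through with cookies attached
            }
        }
    }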

I'm getting an IllegalStateException: Already connected and I cannot figure out why

I'm trying to write a program that connects to a site and pulls data from its source code. Whenever I call this method, as soon as it reaches the line connection.setRequestProperty("Cookie", cookie); it goes no further and throws "IllegalStateException: Already connected". I'm cycling through 123 different URLs, so the URL changes each time the method is called; I'm not sure why it tells me it's already connected when I'm attempting to connect to a different URL. I've searched everywhere for a solution and cannot find one. Can any of you help? Thanks!
private void getUrlData(String u, String championName) throws IOException {
    List<String> data = new ArrayList<String>();
    try {
        BufferedWriter out = new BufferedWriter(new FileWriter("Other Stuff/Champion Data Test.txt"));
        out.write(championName);
        out.newLine();
        URL url = new URL(u);
        URLConnection connection = url.openConnection();
        String cookie = connection.getHeaderField("Set-Cookie");
        connection.setRequestProperty("Cookie", cookie);
        connection.setRequestProperty("User-Agent", "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/42.0.2311.135 Safari/537.36");
        connection.connect();
        Scanner in = new Scanner(connection.getInputStream());
        String inputLine;
        while (in.hasNext()) {
            inputLine = in.nextLine();
            if (inputLine.contains("stat-label")) {
                out.write(in.nextLine());
                in.nextLine();
                in.nextLine();
                out.write(" " + in.nextLine());
            }
        }
    }
    catch (Exception e) {
        System.out.println(e);
    }
}
I found the problem, though solving it raised new problems. The issue was my use of connection.getHeaderField("Set-Cookie"): reading a response header forces the connection to be made, so the later setRequestProperty() and connect() calls fail because the connection is already open.
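In other words, any call that needs the response (getHeaderField, getResponseCode, getInputStream) implicitly connects, and request properties can only be set before that point. A minimal sketch, assuming it is acceptable to make a separate request just to pick up the cookie (the helper name is hypothetical):

    import java.io.IOException;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.util.Scanner;

    public class CookieThenRequest {
        static Scanner openWithCookie(String address) throws IOException {
            // First connection: only used to read the Set-Cookie response header.
            // Reading any header implicitly connects, so no properties may be set on it afterwards.
            HttpURLConnection probe = (HttpURLConnection) new URL(address).openConnection();
            String cookie = probe.getHeaderField("Set-Cookie");
            probe.disconnect();

            // Second connection: set all request properties BEFORE anything triggers the connect.
            HttpURLConnection connection = (HttpURLConnection) new URL(address).openConnection();
            if (cookie != null) {
                connection.setRequestProperty("Cookie", cookie);
            }
            connection.setRequestProperty("User-Agent", "Mozilla/5.0");
            connection.connect();
            return new Scanner(connection.getInputStream());
        }
    }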

Retrieving XML over HTTP in Java 7

Some devices (e.g. web relays) return raw XML in response to HTTP GET requests. That is, the reply contains no valid HTTP header. For many years I have retrieved information from such devices using code like this:
private InputStream doRawGET(String url) throws MalformedURLException, IOException
{
    try
    {
        URL u = new URL(url); // renamed so it doesn't clash with the String parameter
        HttpURLConnection con = (HttpURLConnection) u.openConnection();
        con.setConnectTimeout(5000);
        con.setReadTimeout(5000);
        return con.getInputStream();
    }
    catch (SocketTimeoutException ex)
    {
        throw new IOException("Timeout attempting to contact Web Relay at " + url);
    }
}
In OpenJDK 7 the following lines were added to sun.net.www.protocol.http.HttpURLConnection, which means any HTTP response with an invalid header now generates an IOException:
1325    respCode = getResponseCode();
1326    if (respCode == -1) {
1327        disconnectInternal();
1328        throw new IOException ("Invalid Http response");
1329    }
How do I get 'headless' XML from a server that expects HTTP GET requests in the new world of Java 7?
You could always do it the "socket way" and talk HTTP directly to the host:
private InputStream doRawGET( String url ) throws IOException
{
    URL u = new URL( url );
    Socket s = new Socket( u.getHost(), 80 );
    PrintWriter out = new PrintWriter( s.getOutputStream(), true );
    out.print( "GET " + u.getPath() + " HTTP/1.0\r\n" ); // request line, CRLF-terminated
    out.print( "Host: " + u.getHost() + "\r\n" );
    out.print( "\r\n" ); // blank line ends the request headers
    out.flush();
    return s.getInputStream();
}
Not exactly elegant, but it will work. Strange that JRE 7 introduces such a "regression", though.
Cheers,
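For completeness, a usage fragment for that helper (the device address below is hypothetical) would simply read the returned stream, which in this case is the bare XML the device sends:

    // Hypothetical web relay address; the device answers with bare XML and no HTTP header.
    try (BufferedReader reader = new BufferedReader(new InputStreamReader(doRawGET("http://192.168.1.50/state.xml")))) {
        String line;
        while ((line = reader.readLine()) != null) {
            System.out.println(line);
        }
    }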

Connect to a website that requires a user/password

I'm a bit new to Java, and even more so to its networking classes. I'm trying to create a program that connects to a website ("www.buybackprofesional.com") from which I would like to download pictures and get some text about cars (after logging in, you have to enter a plate number to access a car's file).
This is what I have right now, but it always says the session has expired. I need a way to log in using the username and password from the main page, right? Can someone give me some advice? Thanks.
Note: I want to do it in Java; maybe I was not clear about that in the question.
//URL web = new URL("http://www.buybackprofesional.com/DetallePeri.asp?mat=9073FCV&fec=27/07/2010&tipo=C&modelo=4582&Foto=0");
URL web = new URL("http://www.buybackprofesional.com/");
HttpURLConnection con = (HttpURLConnection) web.openConnection();
con.setRequestMethod("GET");
con.setRequestProperty("User-Agent", "Mozilla/4.0 (compatible; JVM)");
con.setRequestProperty("Pragma", "no-cache");
con.connect();
BufferedReader reader = new BufferedReader(new InputStreamReader(con.getInputStream()));
String line = null;
while ((line = reader.readLine()) != null) {
System.out.println(line);
}
A colleague helped me with this, so I'll post the code that works:
public static URLConnection login(String _url, String _username, String _password) throws IOException, MalformedURLException {
    String data = URLEncoder.encode("Usuario", "UTF-8") + "=" + URLEncoder.encode(_username, "UTF-8");
    data += "&" + URLEncoder.encode("Contrase", "UTF-8") + "=" + URLEncoder.encode(_password, "UTF-8");
    // Send data
    URL url = new URL(_url);
    URLConnection conn = url.openConnection();
    conn.setDoOutput(true);
    OutputStreamWriter wr = new OutputStreamWriter(conn.getOutputStream());
    wr.write(data);
    wr.flush();
    wr.close();
    return conn;
}
This submits the form info on the page I need, and after that, by keeping the cookies, I can stay logged in!
To connect to a website using Java, consider using HttpUnit or Apache HttpComponents (HttpCore/HttpClient). They handle sessions much better than you (or I) could on your own.
Edit: Fixed the location of the link. Thanks for the correction!
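To illustrate that suggestion, here is a rough login sketch with Apache HttpClient 4.x. The form field names (Usuario, Contrase) and the URLs are taken from the code above, but the exact login endpoint and the placeholder credentials are assumptions; check the site's login form for the real action URL and field names:

    import java.util.Arrays;
    import org.apache.http.client.entity.UrlEncodedFormEntity;
    import org.apache.http.client.methods.CloseableHttpResponse;
    import org.apache.http.client.methods.HttpGet;
    import org.apache.http.client.methods.HttpPost;
    import org.apache.http.impl.client.BasicCookieStore;
    import org.apache.http.impl.client.CloseableHttpClient;
    import org.apache.http.impl.client.HttpClients;
    import org.apache.http.message.BasicNameValuePair;
    import org.apache.http.util.EntityUtils;

    public class LoginSketch {
        public static void main(String[] args) throws Exception {
            BasicCookieStore cookieStore = new BasicCookieStore(); // keeps the session cookie between requests
            try (CloseableHttpClient client = HttpClients.custom()
                    .setDefaultCookieStore(cookieStore)
                    .build()) {

                // Assumed login endpoint: the form on the main page may post somewhere else.
                HttpPost login = new HttpPost("http://www.buybackprofesional.com/");
                login.setEntity(new UrlEncodedFormEntity(Arrays.asList(
                        new BasicNameValuePair("Usuario", "myUser"),        // field names from the answer above
                        new BasicNameValuePair("Contrase", "myPassword")),  // placeholder credentials
                        "UTF-8"));
                try (CloseableHttpResponse response = client.execute(login)) {
                    EntityUtils.consume(response.getEntity()); // release the connection
                }

                // Subsequent requests automatically reuse the cookies stored during login.
                HttpGet page = new HttpGet("http://www.buybackprofesional.com/DetallePeri.asp?mat=9073FCV&fec=27/07/2010&tipo=C&modelo=4582&Foto=0");
                try (CloseableHttpResponse response = client.execute(page)) {
                    System.out.println(EntityUtils.toString(response.getEntity()));
                }
            }
        }
    }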
