I hope someone can help me. I'm a bit of a newbie to Java, but I have a question about calling a web service from Java. The question is actually simple: one way works, the other does not.
If I call a web service from Java like this, it works:
try {
    String parameters =
            "<soap:Envelope xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xmlns:xsd=\"http://www.w3.org/2001/XMLSchema\" xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">" +
            "<soap:Body>" +
            "  <HelloWorld xmlns=\"http://np-challenger\" />" +
            "</soap:Body>" +
            "</soap:Envelope>";
    java.net.URL url = new java.net.URL("http://localhost:50217/WebSite3/Service.asmx");
    java.net.HttpURLConnection connjava = (java.net.HttpURLConnection) url.openConnection();
    connjava.setRequestMethod("GET");
    connjava.setRequestProperty("Content-Length", Integer.toString(parameters.getBytes().length));
    connjava.setRequestProperty("Content-Language", "en-US");
    connjava.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
    connjava.setRequestProperty("SOAPAction", "http://np-challenger/HelloWorld");
    connjava.setDoInput(true);
    connjava.setDoOutput(true);
    connjava.setUseCaches(false);
    connjava.setAllowUserInteraction(true);
    java.io.DataOutputStream printout = new java.io.DataOutputStream(connjava.getOutputStream());
    printout.writeBytes(parameters);
    printout.flush();
    printout.close();
    java.io.BufferedReader in = new java.io.BufferedReader(new java.io.InputStreamReader(connjava.getInputStream()));
    String line;
    while ((line = in.readLine()) != null) {
        System.out.println(line);
    }
    in.close();
} catch (Exception e) {
    System.out.println("Error: " + e);
}
However, if I try to do it like this, I keep getting a bad request. I'm just about ready to pull my hair out.
try {
    String xmlData =
            "<soap:Envelope xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xmlns:xsd=\"http://www.w3.org/2001/XMLSchema\" xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">" +
            "<soap:Body>" +
            "  <HelloWorld xmlns=\"http://np-challenger\" />" +
            "</soap:Body>" +
            "</soap:Envelope>";
    // Create socket
    String hostname = "localhost";
    int port = 50217;
    InetAddress addr = InetAddress.getByName(hostname);
    Socket sock = new Socket(addr, port);
    FileWriter fstream = new FileWriter("out.txt");
    // Send header
    String path = "/WebSite3/Service.asmx";
    BufferedWriter bw = new BufferedWriter(new OutputStreamWriter(sock.getOutputStream(), "UTF8"));
    bw.write("POST " + path + " HTTP/1.1\r\n");
    bw.write("Host: localhost\r\n");
    bw.write("Content-Type: text/xml; charset=\"utf-8\"\r\n");
    bw.write("Content-Length: " + xmlData.length() + "\r\n");
    bw.write("SOAPAction: \"http://np-challenger/HelloWorld\"");
    bw.write("\r\n");
    // Send POST data string
    bw.write(xmlData);
    bw.flush();
    // Process the response from the web service
    BufferedReader br = new BufferedReader(new InputStreamReader(sock.getInputStream()));
    String line;
    while ((line = br.readLine()) != null) {
        System.out.println(line);
    }
    bw.close();
    br.close();
} catch (Exception e) {
    System.err.println(e.getMessage());
    e.printStackTrace(System.err);
}
I'm a bit suspicious about whether the way you calculate the content length is correct, but more importantly:
Use a testing tool.
You can use a testing tool to compare good and bad requests. One such tool is soapUI; it's very convenient for showing you the exact contents of the requests and responses.
Create a new project in soapUI, based on the WSDL of your web service. Make sure to mark the checkboxes "Create sample requests for all operations" and "Create a Web Service Simulation of the imported WSDL". This way, soapUI will be able to act both as a client for your actual .NET web service, and as a server to which your Java client will connect.
Make sure that when soapUI acts as a client and connects to your web service, the request is processed correctly. Then run it as a server, send a request from Java, and compare it to the request that was processed successfully.
I chose to emphasize the role of a testing tool instead of addressing the specific problems in your code, because I believe that the ability to analyze the contents of your requests and responses will prove to be valuable time after time.
Use a WS framework.
Working with web services on such a low level requires a lot of unnecessary work from you. There are several frameworks and tools in Java that allow you to work on a higher abstraction level, eliminating the need to handle sockets and HTTP headers yourself. Take a look at the JAX-WS standard. This tutorial shows how to create a client for an existing web service. You'll notice that it's much simpler than your code sample.
Other popular WS frameworks in Java are Apache Axis2 and Apache CXF.
It's actually a difference in the data that is going to the server. Monitor the data that you are actually posting with TCP Monitor, and compare it: the MIME headers, the request XML, and so on.
You will find the mistake. As far as I can see, the first method uses GET while the second uses POST. I'm not saying that this is the error; just monitor the actual data going to the server and you will get your problem resolved.
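On the content-length suspicion above: Content-Length must count encoded bytes, while String.length() counts UTF-16 chars. For a pure-ASCII envelope the two happen to agree, which can hide the bug; any multi-byte character makes them diverge. A small sketch (the strings here are just illustrations):

```java
import java.nio.charset.StandardCharsets;

public class ContentLengthDemo {
    public static void main(String[] args) {
        // ASCII-only: char count equals UTF-8 byte count
        String ascii = "<soap:Envelope/>";
        System.out.println(ascii.length() + " chars, "
                + ascii.getBytes(StandardCharsets.UTF_8).length + " bytes");

        // One non-ASCII char ("é") encodes to two UTF-8 bytes,
        // so length() under-reports the Content-Length by one.
        String accented = "<name>José</name>";
        System.out.println(accented.length() + " chars, "
                + accented.getBytes(StandardCharsets.UTF_8).length + " bytes");
    }
}
```

So if the raw-socket version is kept, it should send xmlData.getBytes("UTF-8").length as the Content-Length and write those exact bytes.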
Related
I am new to programming (especially in Java) and I most likely lack knowledge about server work in Java. My question is how to send a request to the server and at the same time receive the response code, for example 404 (file not found). Please, someone tell me how to implement this correctly.
The code we currently have:
public static void Connection(int portNumber, String addr, String request) throws UnknownHostException, IOException {
    URL url = new URL(addr);
    String postData = request; // HTTP request body
    URLConnection conn = url.openConnection();
    conn.setDoOutput(true);
    conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
    conn.setRequestProperty("Content-Length", Integer.toString(postData.length()));
    //<------------------------------------- Add a response code ------------------------------------->//
    try (DataOutputStream dos = new DataOutputStream(conn.getOutputStream())) {
        dos.writeBytes(postData);
    }
    try (BufferedReader bf = new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
        String line;
        while ((line = bf.readLine()) != null) {
            System.out.println(line);
        }
    }
}
Honestly, I've been scouring the internet and Java books trying to find this, but I haven't been able to find a proper answer.
If you just want to send an HTTP request and receive the data back, you can use third-party HTTP clients. The most popular are Apache HttpClient, with a good tutorial (Apache HttpClient Tutorial), and OkHttp, also with a good tutorial (A Guide to OkHttp). However, if you want to learn how to use Java classes such as URLConnection so you can write your own code, then you can look at the source code of my own HTTP client that I wrote using those classes. This HttpClient can also be used as a third-party client (although it is simplistic and not as well known as the clients mentioned above), but the source code is not that big and (I hope) is clearly written, so it can serve as a tutorial as well. It comes as part of the MgntUtils open-source library, written and maintained by me. Here is the source code of HttpClient, and here is its Javadoc. If you want the source code of the whole library, you can get it on GitHub, and the library alone is available as a Maven artifact from Maven Central.
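To get the numeric status code with only the JDK classes the question already uses, cast the URLConnection to HttpURLConnection and call getResponseCode(). A self-contained sketch, assuming nothing beyond the JDK (it starts a throwaway local server with the built-in com.sun.net.httpserver package just so there is something that answers 404):

```java
import com.sun.net.httpserver.HttpServer;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.URL;

public class ResponseCodeDemo {
    public static void main(String[] args) throws Exception {
        // Tiny local server that answers every request with 404 and no body
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/", exchange -> {
            exchange.sendResponseHeaders(404, -1);
            exchange.close();
        });
        server.start();

        try {
            URL url = new URL("http://localhost:" + server.getAddress().getPort() + "/missing");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            // getResponseCode() performs the request and returns the status line's code
            System.out.println("HTTP status: " + conn.getResponseCode());
            conn.disconnect();
        } finally {
            server.stop(0);
        }
    }
}
```

In the code above, that value is what would replace the unused response placeholder. Note that for codes >= 400, getInputStream() throws; the error body, if any, is available from getErrorStream().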
I'm using SOAP to request some information from a server.
In order to test whether my SOAP call is correct, I tried both the soapUI Pro 4.6.3 program and Java code.
When I use the soapUI program, I get the response to my request from the server. But when I use Java code, I can't get a response.
I see error code 500. As I understand it, 500 means an internal server error, so isn't this a problem on the server side?
I want to know what the difference between them is.
My Java code is below. The XML is the same in the soapUI program and in my Java code.
HttpClient client = new HttpClient();
PostMethod method = new PostMethod("My URL");
int status = 0;
String result = "";
try {
    method.setRequestBody(MySoapXML);
    method.getParams().setParameter("http.socket.timeout", new Integer(5000));
    method.getParams().setParameter("http.protocol.content-charset", "UTF-8");
    method.getParams().setParameter("SOAPAction", "My Soap Action URL");
    method.getParams().setParameter("Content-Type", MySoapXML.length());
    status = client.executeMethod(method);
    BufferedReader br = new BufferedReader(new InputStreamReader(method.getResponseBodyAsStream()));
    String readLine;
    while ((readLine = br.readLine()) != null) {
        System.out.println(readLine);
    }
} catch (Exception e) {
    e.printStackTrace();
} finally {
    method.releaseConnection();
}
I have already tried the URLConnection class and the HttpClient class, but the result was the same.
If you know a way to solve this problem or have had the same experience, please let me know how to solve it. Thank you for reading ^_^
soapUI will parse poorly defined web services (e.g. a WSDL that is not well defined). From my experience, working in soapUI is not proof that your web service is well defined.
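One thing worth checking in the snippet above: with Commons HttpClient 3.x, SOAPAction and Content-Type must travel as HTTP request headers (method.setRequestHeader(...)); putting them into getParams() only sets client-side parameters, so the server may never see a SOAPAction header, and some .NET SOAP stacks answer exactly that with a 500. The same idea sketched with JDK-only classes, verified against a throwaway local server (the action URI and envelope are placeholders):

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class SoapHeaderDemo {
    public static void main(String[] args) throws Exception {
        // Local stand-in for the SOAP endpoint: 200 only if a SOAPAction header arrived
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/", exchange -> {
            boolean hasAction = exchange.getRequestHeaders().containsKey("SOAPAction");
            exchange.sendResponseHeaders(hasAction ? 200 : 500, -1);
            exchange.close();
        });
        server.start();

        try {
            byte[] body = "<soap:Envelope/>".getBytes(StandardCharsets.UTF_8);
            URL url = new URL("http://localhost:" + server.getAddress().getPort() + "/service");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            // Sent as real HTTP headers, not client parameters:
            conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
            conn.setRequestProperty("SOAPAction", "\"http://example.com/MyAction\"");
            conn.setDoOutput(true);
            try (OutputStream out = conn.getOutputStream()) {
                out.write(body);
            }
            System.out.println("status: " + conn.getResponseCode());
        } finally {
            server.stop(0);
        }
    }
}
```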
I am trying to create a proxy server.
I want to read websites byte by byte so that I can display images and all other content. I tried readLine, but I can't display images. Do you have any suggestions on how I can change my code and send all the data to the browser with a DataOutputStream object?
try {
    Socket s = new Socket(InetAddress.getByName(req.hostname), 80);
    String file = parcala(req.url);
    DataOutputStream out = new DataOutputStream(clientSocket.getOutputStream());
    BufferedReader dis = new BufferedReader(new InputStreamReader(s.getInputStream()));
    PrintWriter socketOut = new PrintWriter(s.getOutputStream());
    socketOut.print("GET " + req.url + "\n\n");
    //socketOut.print("Host: " + req.hostname);
    socketOut.flush();
    String line;
    while ((line = dis.readLine()) != null) {
        System.out.println(line);
    }
} catch (Exception e) {
}
Edited Part
This is what I have to do: I can block banned web sites, but I can't allow other web sites in my program.
In the filter program, you will open a TCP socket at the specified port and wait for connections. If a
request comes (i.e. the client types a URL to access a web site), the application will process it to
decide whether access is allowed or not and then, using the same socket, it will send the reply back
to the client. After the client opened her connection to WebPolice (and her request has been checked
and is allowed), the real web page needs to be shown to the client. Therefore, since the user already gave her request, now it is WebPolice’s turn to forward the request so that the user can get the web page. Thus, WebPolice acts as a client and requests the web page. This means you need to open a connection to the web server (without closing the connection to the user), forward the request over this connection, get the reply and forward it back to the client. You will use threads to handle multiple connections (at the same time and/or at different times).
I don't know what exactly you're trying to do, but crafting an HTTP request and reading its response involves somewhat more than you have done here. readLine won't work on binary data anyway.
You can take a look at the URLConnection class (stolen here):
URL oracle = new URL("http://www.oracle.com/");
URLConnection yc = oracle.openConnection();
BufferedReader in = new BufferedReader(new InputStreamReader(yc.getInputStream()));
Then you can read textual or binary data from the in object.
readLine treats what it reads as a String, so unless you want to mess around with conversions back to bytes, I wouldn't recommend it.
I would just read bytes until you can't read any more, then write them out to a file. This should allow you to grab the images, keeping file headers intact, which can be important when dealing with files other than text.
Hope this helps.
Instead of using BufferedReader, you can try using InputStream.
It has several methods for reading bytes.
http://docs.oracle.com/javase/6/docs/api/java/io/InputStream.html
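A minimal sketch of the byte-for-byte copy the answers above describe: read from the server's InputStream into a buffer and write the same bytes, untouched, to the browser's OutputStream. Here the two sockets are simulated with in-memory streams so the loop itself is the only moving part:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class ByteCopyDemo {
    // Copies everything from in to out without interpreting it -- works for images too
    static void copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[8192];
        int n;
        while ((n = in.read(buffer)) != -1) {
            out.write(buffer, 0, n);
        }
        out.flush();
    }

    public static void main(String[] args) throws IOException {
        // Simulated binary payload (the first bytes of a PNG file)
        byte[] payload = {(byte) 0x89, 'P', 'N', 'G', '\r', '\n', 0x1a, '\n'};
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        copy(new ByteArrayInputStream(payload), sink);
        System.out.println("copied " + sink.size() + " bytes intact");
    }
}
```

In the proxy, `in` would be `s.getInputStream()` and `out` the DataOutputStream already opened on `clientSocket`.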
I am sending a request to a server URL, but I am getting a FileNotFoundException, even though when I browse to this file through a web browser it seems fine.
URL url = new URL(serverUrl);
connection = getSecureConnection(url);
// Connect to server
connection.connect();
// Send parameters to server
writer = new BufferedWriter(new OutputStreamWriter(connection.getOutputStream(), "UTF-8"));
writer.write(parseParameters(CoreConstants.ACTION_PREFIX + actionName, parameters));
writer.flush();
// Read server's response
reader = new BufferedReader(new InputStreamReader(connection.getInputStream()));
When I call getInputStream, it throws a file-not-found error.
The target is an .aspx controller page.
If the request works fine in a browser but not in code, and you've verified that the URL is the same, then the problem probably has something to do with how you are sending your parameters to the server. Specifically, this part:
writer.write(parseParameters(CoreConstants.ACTION_PREFIX + actionName, parameters));
Perhaps there is a bug in the parseParameters() function?
But more generally, I would recommend using something a bit higher-level than a raw URLConnection. HtmlUnit and HttpClient are both fine choices, particularly since it seems like your request is a fairly simple one. I've used both to perform similar client/server interaction in a number of apps. I suggest revising your code to use one of these libraries, and then see if it still produces the error.
OK, finally I found that the problem was on the IIS side; it has been resolved in .NET 4.0. For previous versions, go to your web.config and specify validateRequest="false".
I want to access forms on HTML pages through the Java programming language without involving a real browser.
At present I am doing it through HtmlUnit, but it takes a bit more time to load a page. When it comes to accessing millions of pages, that extra bit of time matters most.
Are there any other methods for doing this?
I've used something similar called HttpUnit before, but I have no idea how it compares performance-wise.
If you have millions of pages to process, I would recommend throwing more threads at it. Just a guess, but I think that if you scale this up to multiple threads, you'll run out of bandwidth before you run out of CPU power (in which case it won't matter how much faster it could be).
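That fan-out can be sketched with a fixed thread pool; the fetch body below is just a stand-in for the real HtmlUnit or HTTP call:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelFetchDemo {
    // Placeholder for the real page fetch (HtmlUnit, HttpClient, ...)
    static String fetch(String url) {
        return "fetched " + url;
    }

    public static void main(String[] args) throws Exception {
        List<String> urls = List.of("http://a.example", "http://b.example", "http://c.example");

        // A bounded pool keeps the number of concurrent connections sane
        ExecutorService pool = Executors.newFixedThreadPool(2);
        List<Future<String>> results = new ArrayList<>();
        for (String url : urls) {
            results.add(pool.submit((Callable<String>) () -> fetch(url)));
        }
        for (Future<String> f : results) {
            System.out.println(f.get()); // blocks until that page is done
        }
        pool.shutdown();
    }
}
```

The pool size is the tuning knob: raise it until bandwidth, not CPU, becomes the bottleneck.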
Accessing a web page using a browser, even HtmlUnit, is going to be slow. A better method is to test the layer just below the web interface, so that you don't need to access millions of pages -- instead you test enough to make sure that the web interface is using the lower layer correctly.
Most of the interaction in a browser comes down to an HTTP GET or an HTTP POST.
You need to figure out exactly the operation you need, and then you can construct the URL and/or form data. Then you can use something like this:
try {
    // Construct data
    String data = URLEncoder.encode("key1", "UTF-8") + "=" + URLEncoder.encode("value1", "UTF-8");
    data += "&" + URLEncoder.encode("key2", "UTF-8") + "=" + URLEncoder.encode("value2", "UTF-8");

    // Send data
    URL url = new URL("http://hostname:80/cgi");
    URLConnection conn = url.openConnection();
    conn.setDoOutput(true);
    OutputStreamWriter wr = new OutputStreamWriter(conn.getOutputStream());
    wr.write(data);
    wr.flush();

    // Get the response
    BufferedReader rd = new BufferedReader(new InputStreamReader(conn.getInputStream()));
    String line;
    while ((line = rd.readLine()) != null) {
        // Process line...
    }
    wr.close();
    rd.close();
} catch (Exception e) {
    // Handle the error
}