Java + Jersey - sending UTF-8 encoded data

In my application I need to use the REST API of a web service. For now I need to send an XML message. The problem is that some of the characters in this XML are Polish diacritics. Currently, my message-sending code looks like this:
WebResource r = client.resource(resourceAddress);
String response = r.accept(
        MediaType.APPLICATION_XML_TYPE,
        MediaType.APPLICATION_JSON_TYPE,
        MediaType.TEXT_HTML_TYPE)
    .type(MediaType.TEXT_XML_TYPE)
    .header("Authorization", authorizationString)
    .post(String.class, event);
Java Strings are UTF-16 internally, and my XML should be UTF-8 encoded. Is there a way to tell Jersey to change the encoding before serialization? Or is there some other way to send this String data as UTF-8 rather than UTF-16 using the Jersey client API?
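A minimal sketch of one common approach with the Jersey 1.x client (not a confirmed fix for this exact setup; resourceAddress, authorizationString and event are the names from the snippet above): either declare the charset on the Content-Type so the String writer emits UTF-8 bytes, or serialize to bytes yourself.
// Variant 1: name the charset explicitly in the Content-Type.
String response1 = r.accept(MediaType.APPLICATION_XML_TYPE)
    .type(MediaType.valueOf(MediaType.TEXT_XML + ";charset=UTF-8"))
    .header("Authorization", authorizationString)
    .post(String.class, event);

// Variant 2: convert to UTF-8 bytes up front, so no charset guessing happens.
byte[] utf8Body = event.getBytes(java.nio.charset.StandardCharsets.UTF_8);
String response2 = r.type(MediaType.valueOf(MediaType.TEXT_XML + ";charset=UTF-8"))
    .header("Authorization", authorizationString)
    .post(String.class, utf8Body);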

Related

Fortify Cross-Site Scripting Persistent on Java Rest API response (JSON string & XML string)

I understand that to fix cross-site scripting I need to validate the user input and encode the output, so the browser doesn't execute malicious data.
However, my application is a pure REST API that returns JSON and XML strings. Fortify reported persistent (stored) cross-site scripting because the code queries data from the database and returns it in the response.
#Java Code
@PostMapping(path = "${api.abc.endpoint}")
public ResponseEntity<String> processRequest(@RequestBody String requestStr,
                                             HttpServletRequest servletRequest) {
    String responseStr = processRequest(requestStr, servletRequest);
    ResponseEntity<String> response = ResponseEntity.ok().body(responseStr);
    return response; // response can be JSON or XML
}
#Original JSON Response
{
"type":"order",
"responseCode":"001",
"responseText":"Success",
"transDesc":"Value from DB"
}
#Original XML Response
<abc:res xmlns:abc="http://sample.com/abc/">
<type>order</type>
<responseCode>001</responseCode>
<responseText>Success</responseText>
<transDesc>Value from DB</transDesc>
</abc:res>
I tried to encode the output string using the OWASP Java Encoder, and got the encoded strings below, which changed the response format.
#Encoded JSON Response
{\"type\":\"order\",\"responseCode\":\"001\",\"responseText\":\"Success\",\"transDesc\":\"Value from DB\"}
#Encoded XML Response
<data contentType="application/xml;charset=UTF-8" contentLength="241">
<![CDATA[<abc:res xmlns:abc="http://sample.com/abc/"><type>order</type><responseCode>001</responseCode><responseText>Success</responseText><transDesc>Value from DB</transDesc></abc:res>]]></data>
How can I actually fix the persistent cross-site scripting finding in Fortify for the JSON and XML strings?
Thanks.
Fortify may be too eager to detect XSS, as it assumes any data you produce could end up directly interpreted as HTML. Content sent back to the browser with XML or JSON content types isn't vulnerable to XSS by itself, though. Check that the Content-Type header being sent back isn't text/html.
The issue may be that a client reads part of the response and outputs it as-is onto the page. The encoding there would be the client's responsibility, though, as the right encoding depends on the output context.
Many client-side frameworks will HTML-encode data as necessary by default. If you control the client, you should check whether it's doing its own encoding here.
Input validation can help in general too, either here or in the related requests that write to the database. Input can be validated according to what its content should be.
How is the above persistent cross-site scripting issue in Fortify solved for the database call when sending the output as a ResponseEntity?
Leaving my solution in case this helps peeps in the future.
My app security team needed Fortify to completely resolve the issue.
What worked for me was grabbing all the keys and values in the JSON and running them through the HTML-escaping function from the org.apache.commons.lang3 library (StringEscapeUtils; the escaping functions are not in StringUtils).
As the answer above mentioned, Fortify tries to make sure the user input is HTML-encoded.
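A minimal sketch of that approach, assuming Jackson for the JSON handling and commons-lang3's StringEscapeUtils for the escaping (the class name JsonHtmlEscaper and the flat-object assumption are mine, not the answerer's):
import java.util.ArrayList;
import java.util.List;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;
import org.apache.commons.lang3.StringEscapeUtils;

public class JsonHtmlEscaper {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    // HTML-escape every string value in a flat JSON object before it is
    // returned in the response body.
    public static String escapeValues(String json) throws Exception {
        ObjectNode root = (ObjectNode) MAPPER.readTree(json);
        List<String> names = new ArrayList<>();
        root.fieldNames().forEachRemaining(names::add);
        for (String name : names) {
            if (root.get(name).isTextual()) {
                // e.g. "<script>" becomes "&lt;script&gt;"
                root.put(name, StringEscapeUtils.escapeHtml4(root.get(name).asText()));
            }
        }
        return MAPPER.writeValueAsString(root);
    }
}
Whether this clears the finding depends on how the Fortify taint rules are configured; escaping at the output boundary is generally what the scanner looks for.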

Cyrillic symbols in jsonrpc response

Using Android Studio and the alexd-jsonrpc client, I receive a response where Cyrillic symbols look like:
{..."ticket_info=ÐÐ¾ÐºÑ 1"...}
instead of:
{..."ticket_info=Мойщик 1"...}
How can I decode this to Cyrillic?
JSONRPC request code:
JSONRPCClient client = JSONRPCClient.create(_server, JSONRPCParams.Versions.VERSION_2);
client.setConnectionTimeout(2000);
client.setSoTimeout(2000);
_workplaceList = client.callJSONArray("GetWorkplaceList", companyID);
It looks like an encoding problem. Verify that the service is encoding the JSON-RPC response as UTF-8 and that JSONRPCClient is configured to expect UTF-8.
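The garbled text above is the classic pattern of UTF-8 bytes being decoded as ISO-8859-1. As a client-side workaround only (a sketch; fixing the charset at the source is the real solution), the mis-decoding can often be reversed:
import java.nio.charset.StandardCharsets;

public class MojibakeFix {
    // Re-encode the mis-decoded String back to its raw bytes, then decode
    // those bytes as UTF-8. Works only if the ISO-8859-1 round trip did
    // not drop any bytes along the way.
    public static String fix(String garbled) {
        byte[] rawBytes = garbled.getBytes(StandardCharsets.ISO_8859_1);
        return new String(rawBytes, StandardCharsets.UTF_8);
    }
}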

Encoding Turkish characters in an HTTP POST method with Java

I have a problem with Turkish character encoding.
I send an XML document to a web service using HTTP POST, but when Turkish characters (Ğ, Ş, ı, ...) are encoded, Java translates them into numeric character references such as &#230;.
The URL connection then cuts off the rest of the data, because & marks the start of a new parameter.
How can I solve this before sending, in Java?
It seems you are sending the XML as part of the URL? In that case you'll need to percent-encode it (see RFC 3986)
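A sketch of that, assuming the XML travels as a form parameter (the xml variable and the document parameter name are hypothetical):
import java.net.URLEncoder;

public class FormEncodeExample {
    public static void main(String[] args) throws Exception {
        String xml = "<doc>Ğ Ş ı</doc>"; // hypothetical payload
        // Percent-encoding turns "&" into "%26", so it can no longer be
        // mistaken for a parameter separator in the request.
        String body = "document=" + URLEncoder.encode(xml, "UTF-8");
        System.out.println(body);
        // The server side reverses this with URLDecoder.decode(value, "UTF-8").
    }
}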

Sending UTF-8 values in HTTP headers results in Mojibake

I want to send Arabic data from a servlet to a client using HttpServletResponse.
I am trying this:
response.setCharacterEncoding("UTF-8");
response.setHeader("Info", arabicWord);
and I receive the word like this:
String arabicWord = response.getHeader("Info");
On the client (receiving) side I also tried this:
byte[] d = response.getHeader("Info").getBytes("UTF-8");
arabicWord = new String(d);
but it seems there is no Unicode, because I receive strange English words. So please, how can I send and receive Arabic UTF-8 words?
HTTP headers don't support UTF-8. They officially support ISO-8859-1 only. See also RFC 2616 section 2:
Words of *TEXT MAY contain characters from character sets other than ISO-8859-1 [22] only when encoded according to the rules of RFC 2047 [14].
Your best bet is to URL-encode and decode them.
response.setHeader("Info", URLEncoder.encode(arabicWord, "UTF-8"));
and
String arabicWord = URLDecoder.decode(response.getHeader("Info"), "UTF-8");
URL-encoding will transform them into the %nn format, which is perfectly valid ISO-8859-1. Note that data sent in headers may have size limitations. You'd better send it in the response body instead, in plain text, JSON, CSV or XML format. Using custom HTTP headers this way is a design smell.
I don't know where the word variable is coming from, but try this:
arabicWord = new String(d, "UTF-8");
UPDATE: Looks like the problem is with UTF-8 encoded data in HTTP headers; see HTTP headers encoding/decoding in Java for a detailed discussion.

Safe Data serialization for Plain HTTP GET & POST communication

I'm using the client's browser to submit HTTP requests.
For report generation the securityToken is submitted via POST; for report download the same token needs to be submitted by the user's browser, this time using GET.
What encoding would you recommend for the securityToken, which actually represents encrypted data?
I've tried Base64, but this fails because the standard alphabet can include the "+" character, which gets translated in an HTTP GET to ' ' (a blank space).
Then I tried URL encoding, but this fails because in an HTTP POST sequences such as %3d are transmitted without translation, while in an HTTP GET the browser converts %3d to '='.
What encoding would you recommend to allow safe transmission over HTTP POST and GET without the data being misinterpreted?
The environment is Java, Tomcat.
Thank you,
Maxim.
Hex string.
Apache commons-codec has a Hex class that provides this functionality.
It will look like this:
http://youraddress.com/context/servlet?param=ac7432be432b21
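A sketch of that (the token bytes are made up; Hex.encodeHexString and Hex.decodeHex are the relevant commons-codec calls):
import org.apache.commons.codec.binary.Hex;

public class HexTokenExample {
    public static void main(String[] args) throws Exception {
        byte[] token = {(byte) 0xac, 0x74, 0x32, (byte) 0xbe}; // made-up token bytes
        // Hex uses only [0-9a-f], so the value survives both GET and POST
        // without any browser or container re-interpretation.
        String param = Hex.encodeHexString(token);           // "ac7432be"
        byte[] decoded = Hex.decodeHex(param.toCharArray()); // back to raw bytes
        System.out.println(param);
    }
}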
Well, you can keep Base64 and use this solution:
Code for decoding/encoding a modified base64 URL
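For reference, a sketch of the same idea using the JDK's built-in URL-safe variant (java.util.Base64, available since Java 8, which postdates this question):
import java.util.Base64;

public class UrlSafeBase64Example {
    public static void main(String[] args) {
        byte[] token = {(byte) 0xac, 0x74, 0x32, (byte) 0xbe}; // made-up token bytes
        // The URL-safe alphabet replaces "+" and "/" with "-" and "_",
        // avoiding the GET/POST translation problems described above.
        String param = Base64.getUrlEncoder().withoutPadding().encodeToString(token);
        byte[] decoded = Base64.getUrlDecoder().decode(param);
        System.out.println(param);
    }
}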
