JSONObject.write(<a writer>) vs. aWriter.write(jSONObj.toString()) - java

It seems like there are two ways to write the content of a JSON object to a writer. I can either do
myWriter.write(myJSONObj.toString());
or
myJSONObj.write(myWriter);
Is there any reason why anyone would choose one way over the other?

According to the source code:
public String toString() {
    try {
        return this.toString(0);
    } catch (Exception e) {
        return null;
    }
}

public String toString(int indentFactor) throws JSONException {
    StringWriter w = new StringWriter();
    synchronized (w.getBuffer()) {
        return this.write(w, indentFactor, 0).toString();
    }
}

public Writer write(Writer writer) throws JSONException {
    return this.write(writer, 0, 0);
}
So basically, the first approach:
myWriter.write(myJSONObj.toString());
1. Creates a StringWriter.
2. Passes that writer to write(Writer writer, int indentFactor, int indent).
3. The JSON content gets written to that writer.
4. The content of that writer is converted to a String via StringWriter#toString().
5. The final String gets written to myWriter.
The second approach:
myJSONObj.write(myWriter);
1. Passes the writer to write(Writer writer, int indentFactor, int indent).
2. The JSON content gets written to the writer.
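In other words, the only practical difference is the intermediate String: the second approach streams the JSON straight into your writer, which mainly matters for large objects. Another small difference visible in the quoted source is that toString() swallows exceptions and returns null, while write(Writer) propagates a JSONException. A minimal sketch of both, assuming the org.json implementation quoted above (the file names are just examples):

import org.json.JSONObject;

import java.io.FileWriter;
import java.io.Writer;

public class WriteComparison {
    public static void main(String[] args) throws Exception {
        JSONObject obj = new JSONObject().put("name", "example").put("size", 42);

        // Approach 1: build the full JSON text in memory, then write it out.
        try (Writer w = new FileWriter("first.json")) {
            w.write(obj.toString());
        }

        // Approach 2: stream the JSON directly into the writer, skipping the
        // intermediate String (and the StringWriter hidden behind toString()).
        try (Writer w = new FileWriter("second.json")) {
            obj.write(w);
        }
    }
}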

Related

How to use Mockito/PowerMockito to trigger IOException for Unit/Integration Testing

I am testing the following piece of code, and I am having trouble getting this method to throw an IOException so I can get 100% coverage.
I have tried to mock the CharArrayReader and StringWriter classes, but to no avail.
Would appreciate any help!
Class to test
public static final String getValue(String content) {
    if (content == null) return null;
    CharArrayReader reader = new CharArrayReader(content.toCharArray());
    StringWriter writer = new StringWriter();
    try {
        int c;
        while ((c = reader.read()) != -1) {
            if (isChinese((char) c)) {
                writer.write(c);
            } else {
                if ((char) c > 0x20 && (char) c < 0x7f) {
                    writer.write(c);
                } else {
                    writer.write(' ');
                }
            }
        }
    } catch (IOException e) {
        return null;
    } finally {
        reader.close();
    }
    return writer.toString();
}
My Attempts
@Test
public void getValue_Exception() throws IOException {
    String content = "asd";
    char[] chara = null;
    CharArrayReader reader = mock(CharArrayReader.class);
    when(content.toCharArray()).thenReturn(chara);
    when(reader.read()).thenThrow(IOException.class);
    StringWriter writer = mock(StringWriter.class);
    doThrow(IOException.class).when(writer).write(anyInt());
    spyController.getValue(content);
}
Looking inside the CharArrayReader class, the IOException is thrown when the char[] buf passed to the constructor is null:
private void ensureOpen() throws IOException {
    if (buf == null)
        throw new IOException("Stream closed");
}
One approach (even if it is a bad idea to mock the String class) is to mock the call to toCharArray() on the String class so that it returns a null value.
The only scenario in which CharArrayReader::read throws IOException is when the stream is closed. In your example that seems rather impossible.
Nevertheless, if you really want to get that 100% coverage, or just want to make sure your class behaves well in unexpected situations, my advice would be to create a factory for your reader, e.g.:
class ReaderFactory {
    Reader create(String content) {
        return new CharArrayReader(content.toCharArray());
    }
}
With this class in place, you could use it in your code and mock the create method. This way you would have control over the instance of Reader being used in your test case.
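For instance, a sketch of how a test could then force the IOException, assuming getValue is refactored to obtain its Reader from an injected ReaderFactory (MyController and its constructor are hypothetical names for the class under test):

import static org.junit.Assert.assertNull;
import static org.mockito.Mockito.anyString;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import java.io.IOException;
import java.io.Reader;

import org.junit.Test;

public class GetValueExceptionTest {

    @Test
    public void getValue_returnsNullWhenReaderFails() throws IOException {
        // A Reader whose read() always fails, standing in for the closed-stream case.
        Reader failingReader = mock(Reader.class);
        when(failingReader.read()).thenThrow(new IOException("Stream closed"));

        // The factory hands the failing Reader to the class under test.
        ReaderFactory factory = mock(ReaderFactory.class);
        when(factory.create(anyString())).thenReturn(failingReader);

        MyController controller = new MyController(factory); // hypothetical constructor injection
        assertNull(controller.getValue("asd"));
    }
}

Since getValue catches the IOException and returns null, asserting on null exercises the catch branch without mocking String or CharArrayReader at all.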

InputStream for CSV exporter

I have following method:
public String exportAsCsv(CqlQuery query) {
    Iterator<String> result = queryService.execute(.....);
    StringBuilder buf = new StringBuilder();
    while (result.hasNext()) {
        buf.append(result.next());
    }
    return buf.toString();
}
It executes some query which returns Iterator<String> - it contains gigabytes of data, so appending it to StringBuilder is not the best idea...
I would like to change my method so that it returns InputStream instead.
This could be one possible implementation (pseudo code):
public InputStream exportAsCsv(CqlQuery query) {
    final Iterator<String> result = queryService.execute(query, false);
    return new MagicalInputStream() {
        @Override
        byte[] readNext() {
            if (!result.hasNext()) {
                return null;
            } else {
                return result.next().getBytes();
            }
        }
    };
}
I am looking for an InputStream where I have to implement an abstract method (like byte[] readNext()) that will be used to read data chunks one by one. This input stream has to buffer the read chunk, stream it back, and when its buffer is empty it should read the next chunk.
The idea is that I read the next element from the Iterator ONLY when the "client" reads the next bytes from the input stream.
Or there might be another possibility to change my method so that it returns an InputStream instead of a String - any ideas?
The whole InputStream implementation could be avoided if you allow your method to accept a java.io.Writer. Instead of appending Strings to the in-memory StringBuilder, you append them to the provided Writer.
public void exportAsCsv(CqlQuery query, Writer writer) throws IOException {
    Iterator<String> result = queryService.execute(.....);
    while (result.hasNext()) {
        writer.append(result.next());
    }
}
If you really want an InputStream, though, you could try something like this:
public InputStream exportAsCsv(CqlQuery query) {
    Iterator<String> result = queryService.execute(.....);
    return new SequenceInputStream(asStreamEnum(result));
}

private Enumeration<InputStream> asStreamEnum(final Iterator<String> it) {
    return new Enumeration<InputStream>() {
        @Override
        public boolean hasMoreElements() {
            return it.hasNext();
        }

        @Override
        public InputStream nextElement() {
            try {
                return new ByteArrayInputStream(it.next().getBytes("UTF-8"));
            } catch (UnsupportedEncodingException ex) {
                throw new RuntimeException(ex);
            }
        }
    };
}
I haven't actually tested this approach yet, so be warned; conceptually, though, I think this is what you're after.
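If you do want the exact shape described in the question - an InputStream that pulls a chunk from the Iterator only when the client has drained the previous one - a minimal, untested sketch could look like this (the class name is made up):

import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.Iterator;

// Reads bytes lazily: the next Iterator element is only fetched once the
// bytes of the current element have been fully consumed.
class IteratorBackedInputStream extends InputStream {
    private final Iterator<String> source;
    private byte[] chunk = new byte[0];
    private int pos = 0;

    IteratorBackedInputStream(Iterator<String> source) {
        this.source = source;
    }

    @Override
    public int read() throws IOException {
        while (pos >= chunk.length) {      // current chunk exhausted
            if (!source.hasNext()) {
                return -1;                 // end of stream
            }
            chunk = source.next().getBytes(StandardCharsets.UTF_8);
            pos = 0;
        }
        return chunk[pos++] & 0xFF;        // return the byte as an unsigned value
    }
}

Wrapping it in a BufferedInputStream, or overriding read(byte[], int, int), would avoid the per-byte overhead.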

Converting POJO to JsonNode using a JsonView

I'm writing a typical Play Framework app where I want to return a JsonNode from my Controller's methods, using Jackson.
This is how I'm doing it right now:
public static Result foo() throws JsonProcessingException {
    MyPojoType myPojo = new MyPojoType();
    String tmp = new ObjectMapper().writerWithView(JSONViews.Public.class).writeValueAsString(myPojo);
    JsonNode jsonNode = Json.parse(tmp);
    return ok(jsonNode);
}
Is it possible to avoid the "String tmp" copy and convert directly from MyPojoType to JsonNode using a view?
Maybe I can use ObjectMapper.valueToTree, but I don't know how to specify a JSonView to it.
Interesting question: off-hand, I don't think there is a specific method for this, and your code is the most straightforward way to do it; the valueToTree method does not apply any views.
So the code is fine as is.
After more investigation, this is what I did in the end to avoid the redundant work:
public Result toResult() {
    Content ret = null;
    try {
        final String jsonpayload = new ObjectMapper().writerWithView(JsonViews.Public.class).writeValueAsString(payload);
        ret = new Content() {
            @Override public String body() { return jsonpayload; }
            @Override public String contentType() { return "application/json"; }
        };
    } catch (JsonProcessingException exc) {
        Logger.error("toResult: ", exc);
    }
    if (ret == null)
        return Results.badRequest();
    return Results.ok(ret);
}
In summary: the methods ok, badRequest, etc. accept a play.mvc.Content instance, so you can simply use one to wrap your serialized JSON.
As far as I know, with JAX-RS you can do this:
public Response toResult() throws JsonProcessingException {
    final ObjectWriter writer = new ObjectMapper()
            .writerWithView(JSONViews.Public.class);
    return Response.ok(new StreamingOutput() {
        @Override
        public void write(OutputStream outputStream) throws IOException, WebApplicationException {
            writer.writeValue(outputStream, /* Pojo */ payload);
        }
    }).build();
}
So you have to find a class in the Play framework that is able to stream the result (through an OutputStream).
I think this is a more efficient way:
public Result toResult() {
    MyPojo result = new MyPojo();
    JsonNode node = objectMapper.valueToTree(result);
    return ok(node);
}
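Note that valueToTree, as pointed out above, does not apply views. If you want the view applied without the intermediate String, one option (a sketch, not tested against the question's Play/Jackson versions) is to serialize into Jackson's in-memory TokenBuffer and read it back as a tree:

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.util.TokenBuffer;

import java.io.IOException;

public class ViewToTree {
    // Serializes the POJO with the given view into a TokenBuffer, then reads
    // it back as a JsonNode, skipping the String round-trip.
    static JsonNode toTreeWithView(ObjectMapper mapper, Class<?> view, Object pojo) throws IOException {
        TokenBuffer buffer = new TokenBuffer(mapper, false);
        mapper.writerWithView(view).writeValue(buffer, pojo);
        return mapper.readTree(buffer.asParser());
    }
}

The tokens are still buffered once in memory, but no JSON text is built and re-parsed.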

Parsing XML without document start and end tags

I'm parsing a document that I cannot change from the internet using a SAX Parser. It was working just fine when the documents came formatted as such:
<outtertag>
<innertag>data</innertag>
<innertag>moreData</innertag>
</outtertag>
However, there are certain calls I make where the XML comes formatted without the outer tags, so I essentially get just a list of data, like such:
<innertag>data</innertag>
<innertag>moreData</innertag>
This seems silly to me, but I don't get to choose how the XML is formatted and it can't be changed for now. The problem is that the SAX parser seems to hit the endDocument event as soon as it reaches the first closing innertag.
I have a rather hacky solution of converting the InputStream into a String, throwing tags around it, and then converting it back to an InputStream. It actually parses fine that way. But surely there's a better way. I'd also prefer not to write a whole other parser; most of the tags are the same aside from the missing outer opening and closing tags.
Just for the heck of it, I'll post the code, but it's pretty standard SAX Parser. The original is actually parsing about 30 some tags:
try {
    SAXParserFactory factory = SAXParserFactory.newInstance();
    SAXParser saxParser = factory.newSAXParser();
    XMLReader xmlReader = saxParser.getXMLReader();
    MyHandler handler = new MyHandler();
    xmlReader.setContentHandler(handler);
    InputSource inputSource = new InputSource(url.openStream());
    xmlReader.parse(inputSource);
}
catch (SAXException e) { e.printStackTrace(); }
catch (ParserConfigurationException e) { e.printStackTrace(); }
catch (Exception e) { e.printStackTrace(); }
}
private class MyHandler extends DefaultHandler {
    private StringBuilder content;

    public MyHandler() {
        content = new StringBuilder();
    }

    public void startElement(String uri, String localName, String qName,
            Attributes atts) throws SAXException {
        content = new StringBuilder();
        if (localName.equalsIgnoreCase("innertag")) {
            // Doing stuff
        }
    }

    public void endElement(String uri, String localName, String qName)
            throws SAXException {
        // Doing stuff
    }

    public void characters(char[] ch, int start, int length)
            throws SAXException {
        content.append(ch, start, length);
    }

    public void endDocument() throws SAXException {
        // When parsing the second type of document, hits this event almost immediately after parsing the first tag
    }
}
And, if it matters, here's my hacky code I'm using, but just feels wrong, yet it works:
BufferedReader reader = new BufferedReader(new InputStreamReader(url.openStream()));
StringBuilder sb = new StringBuilder("<tag>");
String line = null;
while ((line = reader.readLine()) != null) {
    sb.append(line);
}
sb.append("</tag>");
String xml = sb.toString();
InputStream is = new ByteArrayInputStream(xml.getBytes());
InputSource source = new InputSource(is);
xmlReader.parse(source);
I'd say what you're doing now is about as good as you'll get. The one thing to consider improving is the stream -> string -> stream conversion, especially if the documents are large. You could use something like Guava's ByteStreams.join(), which lets you concatenate streams together instead of strings. Something like the following:
import com.google.common.io.*;
import java.io.*;
public class ConcatenateStreams {
    public static void main(String[] args) throws Exception {
        InputStream malformedXmlContent = externalXmlStream();
        InputSupplier<InputStream> joined = ByteStreams.join(
                inputSupplier("<root>"),
                inputSupplier(malformedXmlContent),
                inputSupplier("</root>"));
        ByteStreams.copy(joined, System.out);
    }

    private static InputStream externalXmlStream() {
        return new ByteArrayInputStream("<foo>5</foo><bar>10</bar>".getBytes());
    }

    private static InputSupplier<InputStream> inputSupplier(final String text) {
        return inputSupplier(new ByteArrayInputStream(text.getBytes()));
    }

    private static InputSupplier<InputStream> inputSupplier(final InputStream inputStream) {
        return new InputSupplier<InputStream>() {
            @Override
            public InputStream getInput() throws IOException {
                return inputStream;
            }
        };
    }
}
which outputs:
<root><foo>5</foo><bar>10</bar></root>
The XML you have is not a well-formed document, but it is a well-formed external parsed entity, which means it can be referenced from a well-formed document by means of an entity reference. So create a skeleton document like this:
<!DOCTYPE doc [
<!ENTITY e SYSTEM "data.xml">
]>
<doc>&e;</doc>
where data.xml is your XML, and pass this document to the XML parser in place of the original. Beats writing dozens of lines of Java code.
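A rough sketch of wiring that into the SAX setup above; here the SYSTEM identifier points straight at the URL the data is fetched from, so the parser pulls the entity itself (this assumes external general entities are enabled in your parser configuration):

import java.io.StringReader;
import org.xml.sax.InputSource;

// Build a wrapper document whose single entity reference points at the
// external data; the parser expands &e; by fetching the URL.
String skeleton =
        "<!DOCTYPE doc [\n" +
        "  <!ENTITY e SYSTEM \"" + url.toExternalForm() + "\">\n" +
        "]>\n" +
        "<doc>&e;</doc>";
xmlReader.parse(new InputSource(new StringReader(skeleton)));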

how to convert PrintWriter to String or write to a File?

I am generating a dynamic page using JSP, and I want to save this dynamically generated, complete page in a file as an archive.
In JSP, everything is written to PrintWriter out = response.getWriter();
At the end of the page, before sending the response to the client, I want to save the page, either in a file or in a buffer as a String for later treatment.
How can I save the PrintWriter content or convert it to a String?
To get a string from the output of a PrintWriter, you can pass a StringWriter to a PrintWriter via the constructor:
@Test
public void writerTest() {
    StringWriter out = new StringWriter();
    PrintWriter writer = new PrintWriter(out);

    // use writer, e.g.:
    writer.print("ABC");
    writer.print("DEF");

    writer.flush(); // flush is really optional here, as Writer calls the empty StringWriter.flush
    String result = out.toString();
    assertEquals("ABCDEF", result);
}
Why not use StringWriter instead? I think this should be able to provide what you need.
So for example:
StringWriter strOut = new StringWriter();
...
String output = strOut.toString();
System.out.println(output);
It will depend on how the PrintWriter is constructed and then used.
If the PrintWriter is constructed first and then passed to code that writes to it, you could use the Decorator pattern: create a subclass of Writer that takes the PrintWriter as a delegate, forwards calls to the delegate, and also maintains a copy of the content that you can then archive.
public class DecoratedWriter extends Writer {
    private final Writer delegate;
    private final StringWriter archive = new StringWriter();

    // pass in the original PrintWriter here
    public DecoratedWriter(Writer delegate) {
        this.delegate = delegate;
    }

    public String getForArchive() {
        return this.archive.toString();
    }

    public void write(char[] cbuf, int off, int len) throws IOException {
        this.delegate.write(cbuf, off, len);
        this.archive.write(cbuf, off, len);
    }

    public void flush() throws IOException {
        this.delegate.flush();
        this.archive.flush();
    }

    public void close() throws IOException {
        this.delegate.close();
        this.archive.close();
    }
}
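A hypothetical usage sketch (generatePage and the file name are placeholders; in a servlet/JSP setting you would normally install such a wrapper via an HttpServletResponseWrapper):

import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

// Wrap the real writer, let the page render through the wrapper,
// then persist the captured copy.
DecoratedWriter archiving = new DecoratedWriter(response.getWriter());
generatePage(archiving);  // whatever writes the page
Files.write(Paths.get("archive.html"),
        archiving.getForArchive().getBytes(StandardCharsets.UTF_8));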
You cannot get it with just your PrintWriter object. It flushes the data and does not hold any content within itself, so it isn't the object you should be looking at to get the entire string.
The best way, I think, is to prepare your response in another object such as a StringBuffer, flush its content to the response, and afterwards save the content stored in that variable to a file.
This helped me for obtaining a SOAP-able object as an XML string:
JAXBContext jc = JAXBContext.newInstance(o.getClass());
Marshaller m = jc.createMarshaller();
StringWriter writer = new StringWriter();
m.marshal( o, new PrintWriter(writer) );
return writer.toString();
Along similar lines to what cdc is doing - you can extend PrintWriter and then create and pass around an instance of this new class.
Call getArchive() to get a copy of the data that's passed through the writer.
public class ArchiveWriter extends PrintWriter {
    private StringBuilder data = new StringBuilder();

    public ArchiveWriter(Writer out) {
        super(out);
    }

    public ArchiveWriter(Writer out, boolean autoFlush) {
        super(out, autoFlush);
    }

    public ArchiveWriter(OutputStream out) {
        super(out);
    }

    public ArchiveWriter(OutputStream out, boolean autoFlush) {
        super(out, autoFlush);
    }

    public ArchiveWriter(String fileName) throws FileNotFoundException {
        super(fileName);
    }

    public ArchiveWriter(String fileName, String csn) throws FileNotFoundException, UnsupportedEncodingException {
        super(fileName, csn);
    }

    public ArchiveWriter(File file) throws FileNotFoundException {
        super(file);
    }

    public ArchiveWriter(File file, String csn) throws FileNotFoundException, UnsupportedEncodingException {
        super(file, csn);
    }

    @Override
    public void write(char[] cbuf, int off, int len) {
        super.write(cbuf, off, len);
        data.append(cbuf, off, len);
    }

    @Override
    public void write(String s, int off, int len) {
        super.write(s, off, len);
        data.append(s, off, len);
    }

    public String getArchive() {
        return data.toString();
    }
}
