Testing writing into a csv file using JUnit [duplicate] - java

This question already has an answer here:
JUnit testing for IO
(1 answer)
Closed 8 years ago.
I am new to JUnit and I want to write unit tests. The method does not return anything; it takes a list of signals and writes them to a CSV file. I am not sure how to test methods with void return types.
Can anyone help me?
public void createCSV( final ArrayList<Signal> messages, File file )
{
    try
    {
        // Use the FileWriter constructor that overwrites the file (append = false)
        csvOutput = new MyWriter( new FileWriter( file, false ), ',' );
        // Create header for CSV
        csvOutput.writeRecord( "Message Source" );
        csvOutput.writeRecord( "Message Name" );
        csvOutput.writeRecord( "Component" );
        csvOutput.writeRecord( "Occurance" );
        csvOutput.writeRecord( "Message Payload with Header" );
        csvOutput.writeRecord( "Bandwidth(with Header %)" );
        csvOutput.writeRecord( "Message Payload" );
        csvOutput.writeRecord( "Bandwidth(%)" );
        csvOutput.endOfRecord();
        for ( Signal signal : messages )
        {
            csvOutput.writeRecord( signal.getSource() );
            csvOutput.writeRecord( signal.getName() );
            csvOutput.writeRecord( signal.getComponent() );
            csvOutput.writeRecord( Integer.toString( signal.getOccurance() ) );
            csvOutput.writeRecord( Integer.toString( signal.getSizewithHeader() ) );
            csvOutput.writeRecord( Float.toString( signal.getBandwidthWithHeader() ) );
            csvOutput.writeRecord( Integer.toString( signal.getSize() ) );
            csvOutput.writeRecord( Float.toString( signal.getBandwidth() ) );
            csvOutput.endOfRecord();
        }
    }
    catch ( IOException e )
    {
        logger.error( "Error in writing CSV file for messages", e );
    }
    finally
    {
        try
        {
            if ( csvOutput != null )
            {
                csvOutput.flush();
                csvOutput.close();
            }
            messages.clear();
        }
        catch ( IOException ex )
        {
            ex.printStackTrace();
        }
    }
}

One method takes a map and sorts it.
Pass in a map with known, unsorted values. Verify the map has been sorted after the method was called.
The other takes the sorted map and writes it to a CSV file. I am not sure how to test methods with void return types.
Two options:
Pass in a temporary file path, e.g. see JUnit temporary folders, then read that file after the method has been called and test it for correctness.
Adjust your method to accept an OutputStream instead of a File. Then you can pass a ByteArrayOutputStream and verify its contents by calling toByteArray() and inspecting the bytes.
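The first option can be sketched without any test framework: write to a temporary file, read it back, and compare line by line. Since the Signal and MyWriter classes aren't shown, writeCsv below is a hypothetical stand-in for createCSV; the verification is the same one a JUnit test would do with assertEquals.

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class CsvRoundTrip {

    // Hypothetical stand-in for createCSV: writes one CSV line per String[]
    static void writeCsv( List<String[]> rows, Path file ) throws IOException {
        try ( BufferedWriter w = Files.newBufferedWriter( file ) ) {
            for ( String[] row : rows ) {
                w.write( String.join( ",", row ) );
                w.newLine();
            }
        }
    }

    public static void main( String[] args ) throws IOException {
        // Write a header plus one data row to a temp file, then verify both
        Path tmp = Files.createTempFile( "signals", ".csv" );
        writeCsv( List.of( new String[] { "Message Source", "Message Name" },
                           new String[] { "ECU1", "Sig1" } ), tmp );
        List<String> lines = Files.readAllLines( tmp );
        if ( !lines.get( 0 ).equals( "Message Source,Message Name" ) ) throw new AssertionError();
        if ( !lines.get( 1 ).equals( "ECU1,Sig1" ) ) throw new AssertionError();
        Files.deleteIfExists( tmp );
        System.out.println( "ok" );
    }
}
```

In a real JUnit test, the temp path would come from a TemporaryFolder rule (JUnit 4) or @TempDir (JUnit 5), which also handles cleanup.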

Unit test for File
If you don't want to change the source code:
In the unit test, pass a file in a temp path, call the createCSV method, then open the file and, depending on how much effort you want to invest, check:
1) that the file exists (use a generated filename that contains the current time)
2) that its length is more than 0 bytes
3) the first and last lines for expected content
But in most cases, an OutputStream is more flexible than a File parameter.
In production code you pass a FileOutputStream; in your unit test a ByteArrayOutputStream, which you can read back through a ByteArrayInputStream.
This is the cleaner solution, since it does not create files which should be cleaned up, and it runs faster.
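A sketch of that OutputStream variant, with writeCsv again standing in hypothetically for a createCSV whose signature was changed to accept an OutputStream instead of a File:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.nio.charset.StandardCharsets;

public class StreamCsv {

    // Hypothetical variant of createCSV that writes a header row to any OutputStream
    static void writeCsv( String[] header, OutputStream out ) throws IOException {
        Writer w = new OutputStreamWriter( out, StandardCharsets.UTF_8 );
        w.write( String.join( ",", header ) );
        w.write( '\n' );
        w.flush();
    }

    public static void main( String[] args ) throws IOException {
        // Production code would pass a FileOutputStream; the test passes a ByteArrayOutputStream
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        writeCsv( new String[] { "Message Source", "Message Name" }, buf );
        String written = new String( buf.toByteArray(), StandardCharsets.UTF_8 );
        System.out.println( written ); // the header line that would have gone to disk
    }
}
```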
Unit test for sorting
Just create an unsorted map, call your sort, and check that the result is sorted:
Iterate and check that each element is greater than the previous one (or smaller, depending on the sort order).
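That order check can be sketched as follows; isSortedByKey and the TreeMap copy are stand-ins for whatever map type and sort method are actually under test:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.TreeMap;

public class SortedCheck {

    // True if the map's keys iterate in ascending order
    static boolean isSortedByKey( Map<Integer, String> map ) {
        Integer prev = null;
        for ( Integer key : map.keySet() ) {
            if ( prev != null && prev.compareTo( key ) > 0 ) {
                return false;
            }
            prev = key;
        }
        return true;
    }

    public static void main( String[] args ) {
        Map<Integer, String> unsorted = new LinkedHashMap<>();
        unsorted.put( 3, "c" );
        unsorted.put( 1, "a" );
        unsorted.put( 2, "b" );
        // Stand-in for the sort method under test: copy into a TreeMap
        Map<Integer, String> sorted = new TreeMap<>( unsorted );
        System.out.println( isSortedByKey( unsorted ) ); // false
        System.out.println( isSortedByKey( sorted ) );   // true
    }
}
```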

Related

MultiResourceItemReader - Skip entire file if header is invalid

My Spring Batch job reads a list of csv files containing two types of headers. I want the reader to skip the entire file if its header does not match one of the two possible header types.
I've taken a look at Spring Boot batch - MultiResourceItemReader : move to next file on error.
But I don't see how to validate the header tokens to ensure they match in count and content.
I was able to figure this out by doing the following:
public FlatFileItemReader<RawFile> reader() {
    return new FlatFileItemReaderBuilder<RawFile>()
            .skippedLinesCallback( line -> {
                // Verify the file header is what we expect
                if ( !StringUtils.equals( line, header ) ) {
                    throw new IllegalArgumentException( String.format( "Bad header: %s", line ) );
                }
            } )
            .name( "myReader" )
            .linesToSkip( 1 )
            .lineMapper( new DefaultLineMapper() {
                {
                    setLineTokenizer( lineTokenizer );
                    setFieldSetMapper( fieldSetMapper );
                }
            } )
            .build();
}
I call the reader() method when setting the delegate in my MultiResourceItemReader.
Note that header, lineTokenizer, and fieldSetMapper are all variables that I set depending on which type of file (and hence which set of headers) my job is expected to read.
Can we do this in XML-based configuration?

Fast non-blocking read/writes using MappedByteBuffer?

I am processing messages from a vendor as a stream of data and want to store msgSeqNum locally in a local file. Reason:
They send msgSeqNum to uniquely identify each message. And they provide a 'sync-and-stream' functionality to stream messages on reconnecting from a given sequence number. Say if the msgSeqNum starts from 1 and my connection went down at msgSeqNum 50 and missed the next 100 messages (vendor server's current msgSeqNum is now 150), then when I reconnect to the vendor, I need to call 'sync-and-stream' with msgSeqNum=50 to get the missed 100 messages.
So I want to understand how I can persist the msgSeqNum locally for fast access. I assume
1) Since the read/writes happen frequently i.e. while processing every message (read to ignore dups, write to update msgSeqNum after processing a msg), I think it's best to use Java NIO's 'MappedByteBuffer'?
2) Could someone confirm if the below code is best for this where I expose the mapped byte buffer object to be reused for reads and writes and leave the FileChannel open for the lifetime of the process? Sample Junit code below:
I know this could be achieved with general Java file operations, but I need something fast, equivalent to non-IO, as I am using a single-writer pattern and want to process these messages in a non-blocking manner.
private FileChannel fileChannel = null;
private MappedByteBuffer mappedByteBuffer = null;
private Charset utf8Charset = null;
private CharBuffer charBuffer = null;

@Before
public void setup() {
    try {
        charBuffer = CharBuffer.allocate( 24 ); // Long max/min fit within 20 characters anyway
        System.out.println( "charBuffer length: " + charBuffer.length() );
        Path pathToWrite = getFileURIFromResources();
        // Assign to the field (not a shadowing local) so destroy() can close the channel
        fileChannel = (FileChannel) Files
                .newByteChannel( pathToWrite, EnumSet.of(
                        StandardOpenOption.READ,
                        StandardOpenOption.WRITE,
                        StandardOpenOption.TRUNCATE_EXISTING ) );
        mappedByteBuffer = fileChannel
                .map( FileChannel.MapMode.READ_WRITE, 0, charBuffer.length() );
        utf8Charset = Charset.forName( "utf-8" );
    } catch ( Exception e ) {
        // handle it
    }
}

@After
public void destroy() {
    try {
        fileChannel.close();
    } catch ( IOException e ) {
        // handle it
    }
}

@Test
public void testWriteAndReadUsingSharedMappedByteBuffer() {
    if ( mappedByteBuffer != null ) {
        mappedByteBuffer.put( utf8Charset.encode( charBuffer.wrap( "101" ) ) ); // TODO reuse the same buffer instead of creating a new one
    } else {
        System.out.println( "mappedByteBuffer null" );
        fail();
    }
    mappedByteBuffer.flip();
    assertEquals( "101", utf8Charset.decode( mappedByteBuffer ).toString() );
}
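For reference, the map-write-flip-read cycle from the test above can be exercised standalone; a minimal sketch against a throwaway temp file (not the vendor code):

```java
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class SeqNumStore {

    // Map a small region, write a value, flip, and read it back
    static String writeAndReadBack( String value ) throws Exception {
        Path path = Files.createTempFile( "seqnum", ".dat" );
        try ( FileChannel ch = FileChannel.open( path,
                StandardOpenOption.READ, StandardOpenOption.WRITE ) ) {
            MappedByteBuffer buf = ch.map( FileChannel.MapMode.READ_WRITE, 0, 24 );
            buf.put( value.getBytes( StandardCharsets.UTF_8 ) );
            buf.flip(); // limit = bytes written, position = 0
            byte[] back = new byte[ buf.limit() ];
            buf.get( back );
            return new String( back, StandardCharsets.UTF_8 );
        } finally {
            Files.deleteIfExists( path );
        }
    }

    public static void main( String[] args ) throws Exception {
        System.out.println( writeAndReadBack( "101" ) ); // 101
    }
}
```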

How are these methods allowing / causing data to be lost on disk?

I have a program that writes its settings and data out to disk every so often (15 seconds or so).
If the program is running and the computer is shut off abruptly -- for example, with the power being cut at the wall -- somehow all of my data files on disk are changed to empty files.
Here is my code, which I thought I designed to protect against this failure, but based on testing the failure still exists:
SaveAllData -- Called every so often, and also when JavaFX.Application.stop() is called.
public void saveAllData () {
    createNecessaryFolders();
    saveAlbumsAndTracks();
    saveSources();
    saveCurrentList();
    saveQueue();
    saveHistory();
    saveLibraryPlaylists();
    saveSettings();
    saveHotkeys();
}
CreateNecessaryFolders
private void createNecessaryFolders () {
    if ( !playlistsDirectory.exists() ) {
        boolean playlistDir = playlistsDirectory.mkdirs();
    }
}
Save Functions -- they all look just like this
public void saveCurrentList () {
    File tempCurrentFile = new File( currentFile.toString() + ".temp" );
    try ( ObjectOutputStream currentListOut = new ObjectOutputStream( new FileOutputStream( tempCurrentFile ) ) ) {
        currentListOut.writeObject( player.getCurrentList().getState() );
        currentListOut.flush();
        currentListOut.close();
        Files.move( tempCurrentFile.toPath(), currentFile.toPath(), StandardCopyOption.REPLACE_EXISTING );
    } catch ( Exception e ) {
        LOGGER.warning( e.getClass().getCanonicalName() + ": Unable to save current list to disk, continuing." );
    }
}
GitHub repository at the commit where this problem exists; see Persister.java.
As I said, when the power is cut abruptly, all settings files saved by this method are blanked. This makes no sense to me, since the methods are called in sequence and I make sure the file is written to disk and flushed before calling move().
Any idea how this could be happening? I thought that by calling flush, close, then move, I would ensure the data is written to disk before overwriting the old data. Somehow this isn't the case, but I am clueless. Any suggestions?
Note: these files are only written to by these functions, and only read from by corresponding load() functions. There is no other access to the files any where else in my program.
Note 2: I am experiencing this on Ubuntu Linux 16.10. I have not tested it on other platforms yet.
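One hardening worth trying on this pattern is forcing the temp file's contents to the device before the rename; saveAtomically below is a hypothetical helper sketching the idea, not the actual Persister code. flush() and close() only hand the bytes to the OS page cache; force(true) asks the OS to push them to the device, which matters when the power can be cut.

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class DurableSave {

    // Write to a sibling temp file, force it to disk, then atomically replace the target
    static void saveAtomically( byte[] data, Path target ) throws IOException {
        Path tmp = target.resolveSibling( target.getFileName() + ".temp" );
        try ( FileOutputStream out = new FileOutputStream( tmp.toFile() ) ) {
            out.write( data );
            out.flush();
            out.getChannel().force( true ); // flush file contents and metadata to the device
        }
        Files.move( tmp, target,
                StandardCopyOption.REPLACE_EXISTING, StandardCopyOption.ATOMIC_MOVE );
    }

    public static void main( String[] args ) throws IOException {
        Path target = Files.createTempFile( "settings", ".dat" );
        saveAtomically( "hello".getBytes(), target );
        System.out.println( new String( Files.readAllBytes( target ) ) ); // hello
        Files.deleteIfExists( target );
    }
}
```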
Adding StandardCopyOption.ATOMIC_MOVE to the Files.move() call solves the problem:
public void saveCurrentList () {
    File tempCurrentFile = new File( currentFile.toString() + ".temp" );
    try ( ObjectOutputStream currentListOut = new ObjectOutputStream( new FileOutputStream( tempCurrentFile ) ) ) {
        currentListOut.writeObject( player.getCurrentList().getState() );
        currentListOut.flush();
        currentListOut.close();
        Files.move( tempCurrentFile.toPath(), currentFile.toPath(), StandardCopyOption.REPLACE_EXISTING, StandardCopyOption.ATOMIC_MOVE );
    } catch ( Exception e ) {
        LOGGER.warning( e.getClass().getCanonicalName() + ": Unable to save current list to disk, continuing." );
    }
}

Calling a jsp multiple times from single doPost() method

I am using PD4ML libraries for converting my .jsp to pdf files and I need to call the same jsp file for a List of values.
I am doing this in my doPost()
protected void doPost( HttpServletRequest request, HttpServletResponse response ) throws ServletException, IOException {
    String[] posSelected = request.getParameterValues( "selectPOs" );
    for ( String eachPO : posSelected ) {
        // This does not work: a request cannot be forwarded more than once.
        request.getRequestDispatcher( "CreateInvoices.jsp" ).forward( request, response );
    }
}
I get java.lang.IllegalStateException: Cannot forward after response has been committed exception.
How can I invoke same JSP multiple times?
Thanks
MekaM
How can I invoke same JSP multiple times?
By including it multiple times.
request.getRequestDispatcher("CreateInvoices.jsp").include(request, response);
By using Include instead of Forward on Request Dispatcher.
You can call the same JSP multiple times using include.
It will look something like this.
request.getRequestDispatcher("CreateInvoices.jsp").include(request,response);
As mentioned by others, you can only send one response per request in the HTTP protocol, so you need another approach.
In your case PD4ML is mandatory and you need multiple PDFs, so creating multiple JSPs is not the ideal way. Rather than converting the JSP to PDF, you should create the multiple PDFs through code, as shown in the examples at
http://pd4ml.com/examples.
private void runConverter( String urlstring, File output ) throws IOException {
    if ( urlstring.length() > 0 ) {
        if ( !urlstring.startsWith( "http://" ) && !urlstring.startsWith( "file:" ) ) {
            urlstring = "http://" + urlstring;
        }
        java.io.FileOutputStream fos = new java.io.FileOutputStream( output );
        if ( proxyHost != null && proxyHost.length() != 0 && proxyPort != 0 ) {
            System.getProperties().setProperty( "proxySet", "true" );
            System.getProperties().setProperty( "proxyHost", proxyHost );
            System.getProperties().setProperty( "proxyPort", "" + proxyPort );
        }
        PD4ML pd4ml = new PD4ML();
        try {
            pd4ml.setPageSize( landscapeValue ? pd4ml.changePageOrientation( format ) : format );
        } catch ( Exception e ) {
            e.printStackTrace();
        }
        if ( unitsValue.equals( "mm" ) ) {
            pd4ml.setPageInsetsMM( new Insets( topValue, leftValue, bottomValue, rightValue ) );
        } else {
            pd4ml.setPageInsets( new Insets( topValue, leftValue, bottomValue, rightValue ) );
        }
        pd4ml.setHtmlWidth( userSpaceWidth );
        pd4ml.render( urlstring, fos );
    }
}
I have used jsoup.connect().get() to achieve what I wanted.

Java response content returns &lt; or &gt; instead of < or >

I have a little problem: I'm writing the content of a file to the response and returning it to the client as an AJAX response.
But an HTML substitution occurs: > becomes &gt;, etc.
What do I have to do to turn this substitution off?
res.setHeader( "Cache-Control", "must-revalidate, post-check=0, pre-check=0" );
res.setHeader( "Pragma", "public" );
res.setContentType( "text/html" );
TIA
update
// import com.ibm.useful.http.PostData;
PostData pd = new PostData( req );
final FileData data;
try {
    data = pd.getFileData( "sqlFile" );
    ByteArrayOutputStream buf = new ByteArrayOutputStream();
    for ( byte b : data.getByteData() ) {
        buf.write( b );
    }
    res.getWriter().print( buf.toString() );
}
I watched buf.toString() through the debugger; it's fine there. The substitution happens further along, but where...
The HTML special characters are being escaped into HTML entities.
If you are sure that this happened right after you wrote it to the response and right before the response data arrives at the client, then there's possibly a filter in the chain which has escaped the HTML entities for some reason. Check the declared filters in web.xml and adjust the url-pattern if necessary.
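For illustration, the substitution described above behaves like this minimal stand-in helper (not the actual filter code, which isn't shown):

```java
public class EntityDemo {

    // Minimal illustration of HTML entity escaping; order matters: '&' must go first
    static String escape( String s ) {
        return s.replace( "&", "&amp;" )
                .replace( "<", "&lt;" )
                .replace( ">", "&gt;" );
    }

    public static void main( String[] args ) {
        System.out.println( escape( "<b>ok</b>" ) ); // &lt;b&gt;ok&lt;/b&gt;
    }
}
```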
This usually happens when HTML characters (<, >, " amongst others) are being escaped.
Try setting escape to false or similar.
I can't find the API documentation for com.ibm.useful.http.PostData.
Try using the snippet below:
res.setContentType("text/html; charset=UTF-8");
Please make sure your database is set to UTF-8 encoding as well, if you're using one.
If this doesn't solve it, please read this article.
