Polygon annotation in PDFBox PDAnnotation - java

I want to add a polygon to the PDF at given coordinates. I referred to this link for adding circle and rectangle annotations, but it does not contain anything about polygons. Does anyone know how to do it? Or does anyone know where I can find full documentation about PDFBox annotations?
Here I am sharing what I've done until now, but I couldn't proceed further.
import java.io.File;
import java.io.IOException;
import java.util.List;
import org.apache.pdfbox.pdmodel.PDDocument;
import org.apache.pdfbox.pdmodel.PDPage;
import org.apache.pdfbox.pdmodel.graphics.color.PDColor;
import org.apache.pdfbox.pdmodel.graphics.color.PDDeviceRGB;
import org.apache.pdfbox.pdmodel.interactive.annotation.PDAnnotation;
import org.apache.pdfbox.pdmodel.interactive.annotation.PDAnnotationSquareCircle;
import org.apache.pdfbox.pdmodel.interactive.annotation.PDBorderStyleDictionary;
public class Polygon {
    public static void main(String[] args) throws IOException {
        // Load the PDF file
        File file = new File("abc.pdf");
        PDDocument document = PDDocument.load(file);
        System.out.println("PDF Loaded.");
        PDPage page = document.getPage(0);
        List<PDAnnotation> annotations = page.getAnnotations();
        // Color of the polygon
        PDColor color = new PDColor(new float[] {0, 0, 1}, PDDeviceRGB.INSTANCE);
        // Define border thickness
        PDBorderStyleDictionary thickness = new PDBorderStyleDictionary();
        thickness.setWidth(2f);
        float[] vertices = {418, 110, 523, 110, 522, 132, 419, 133};
        PDAnnotationSquareCircle lines = new PDAnnotationSquareCircle(PDAnnotationSquareCircle.SUB_TYPE_POLYGON);
        lines.setColor(color);
        lines.setBorderStyle(thickness);
        /*****************
         *
         * ????
         * *************************************/
        // Save annotations
        document.save(file);
        // Close document
        document.close();
    }
}
As far as I have seen, there isn't any method for adding vertices to a polygon in the PDAnnotation API. So is there any way to draw a polygon here?
Thanks.

Here's some code that will soon be added to the AddAnnotations.java example from the source code download (COSName, COSArray and COSFloat below come from the org.apache.pdfbox.cos package):
static final float INCH = 72;
float pw = page1.getMediaBox().getUpperRightX();
float ph = page1.getMediaBox().getUpperRightY();
PDAnnotationMarkup polygon = new PDAnnotationMarkup();
polygon.getCOSObject().setName(COSName.SUBTYPE, PDAnnotationMarkup.SUB_TYPE_POLYGON);
PDRectangle position = new PDRectangle();
position.setLowerLeftX(pw - INCH);
position.setLowerLeftY(ph - INCH);
position.setUpperRightX(pw - 2 * INCH);
position.setUpperRightY(ph - 2 * INCH);
polygon.setRectangle(position);
polygon.setColor(blue); // border color
polygon.getCOSObject().setItem(COSName.IC, red.toCOSArray()); // interior color
COSArray triangleVertices = new COSArray();
triangleVertices.add(new COSFloat(pw - INCH));
triangleVertices.add(new COSFloat(ph - 2 * INCH));
triangleVertices.add(new COSFloat(pw - INCH * 1.5f));
triangleVertices.add(new COSFloat(ph - INCH));
triangleVertices.add(new COSFloat(pw - 2 * INCH));
triangleVertices.add(new COSFloat(ph - 2 * INCH));
polygon.getCOSObject().setItem(COSName.VERTICES, triangleVertices);
polygon.setBorderStyle(borderThick);
annotations.add(polygon);
To adjust your own code, you need to adjust the rectangle and pass your vertices:
position.setLowerLeftX(418);
position.setLowerLeftY(110);
position.setUpperRightX(523);
position.setUpperRightY(133);
polygon.setRectangle(position);
float[] vertices = {418, 110, 523, 110, 522, 132, 419, 133};
COSArray verticesArray = new COSArray();
for (float v : vertices)
verticesArray.add(new COSFloat(v));
polygon.getCOSObject().setItem(COSName.VERTICES, verticesArray);
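Note that the annotation rectangle has to enclose all the vertices. Instead of hard-coding the box, it can be derived from the vertex array; a small plain-Java helper (hypothetical name, independent of PDFBox) might look like this:

```java
// Compute the enclosing [llx, lly, urx, ury] box for a flat x,y vertex array.
public class VertexBounds {
    static float[] bounds(float[] vertices) {
        float minX = Float.MAX_VALUE, minY = Float.MAX_VALUE;
        float maxX = -Float.MAX_VALUE, maxY = -Float.MAX_VALUE;
        for (int i = 0; i + 1 < vertices.length; i += 2) {
            minX = Math.min(minX, vertices[i]);     // even indices are x
            maxX = Math.max(maxX, vertices[i]);
            minY = Math.min(minY, vertices[i + 1]); // odd indices are y
            maxY = Math.max(maxY, vertices[i + 1]);
        }
        return new float[] { minX, minY, maxX, maxY };
    }

    public static void main(String[] args) {
        float[] vertices = {418, 110, 523, 110, 522, 132, 419, 133};
        float[] b = bounds(vertices);
        System.out.println(b[0] + " " + b[1] + " " + b[2] + " " + b[3]); // 418.0 110.0 523.0 133.0
    }
}
```

The four returned values can then be passed to position.setLowerLeftX(b[0]), setLowerLeftY(b[1]), setUpperRightX(b[2]) and setUpperRightY(b[3]).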
This is for 2.0 only. In 3.0 there will be a PDAnnotationPolygon type with appropriate methods. That version will also support the construction of appearance streams, i.e. you will be able to show the PDF in viewers other than Adobe Reader. Most viewers, e.g. PDF.js and PDFBox, don't build missing appearances, so you'll see nothing.
If you need the appearance for 2.0 you can try with the code in the file ShowAnnotation-6.java in https://issues.apache.org/jira/browse/PDFBOX-3353 .
To test with the 3.0 version, get the jar here:
https://repository.apache.org/content/groups/snapshots/org/apache/pdfbox/pdfbox-app/3.0.0-SNAPSHOT/
To build the appearance, call polygon.constructAppearances();

Related

Powershell: Write-Host is very slow

I have a Java application which has a functionality to take a screenshot. It does it by running a PowerShell script:
Add-Type -AssemblyName System.Windows.Forms,System.Drawing
$screens = [Windows.Forms.Screen]::AllScreens
$top = ($screens.Bounds.Top | Measure-Object -Minimum).Minimum
$left = ($screens.Bounds.Left | Measure-Object -Minimum).Minimum
$width = ($screens.Bounds.Right | Measure-Object -Maximum).Maximum
$height = ($screens.Bounds.Bottom | Measure-Object -Maximum).Maximum
$bounds = [Drawing.Rectangle]::FromLTRB($left, $top, $width, $height)
$bmp = New-Object System.Drawing.Bitmap ([int]$bounds.width), ([int]$bounds.height)
$graphics = [Drawing.Graphics]::FromImage($bmp)
$graphics.CopyFromScreen($bounds.Location, [Drawing.Point]::Empty, $bounds.size)
$memStream = New-Object System.IO.MemoryStream
$bmp.Save($memStream, [Drawing.Imaging.ImageFormat]::Jpeg)
Write-Host $memStream.ToArray()
$graphics.Dispose()
$bmp.Dispose()
$memStream.Dispose()
The Java application listens to the output of it and does some operations on it. The problem is that sometimes Write-Host $memStream.ToArray() takes too much time (sometimes 2 minutes, sometimes 3, or even 5). I'm not familiar with PowerShell. Is there any alternative to Write-Host which is faster? Or maybe I can take a screenshot faster using some other functionality? Thanks
You stated a solution using other functionality would be acceptable, so why not perform the screen capture directly with the Java application instead? Java is fully capable of this natively:
import java.awt.Robot;
import java.awt.Rectangle;
import java.awt.image.BufferedImage;
import java.awt.GraphicsDevice;
import java.awt.GraphicsEnvironment;
import java.awt.AWTException;
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;
// Set up Robot and other vars
Robot robot = new Robot();
String imgFormat = "jpg";
BufferedImage screenBuffer;
Rectangle screenBounds;
// Enumerate all screens
GraphicsEnvironment graphEnv = GraphicsEnvironment.getLocalGraphicsEnvironment();
GraphicsDevice[] screens = graphEnv.getScreenDevices();
// Variables only used for generating filename
String fnameFormat = "%s-%s-screencap.%s";
String dtNowString = new SimpleDateFormat("yyyyMMddHHmmss").format(Calendar.getInstance().getTime());
String filename = String.format(fnameFormat, dtNowString, "all", imgFormat);
Rectangle allScreenBounds = new Rectangle();
int num = 0;
for (GraphicsDevice screen : screens) {
    screenBounds = screen.getDefaultConfiguration().getBounds();
    allScreenBounds.x = Math.min(allScreenBounds.x, screenBounds.x);
    allScreenBounds.y = Math.min(allScreenBounds.y, screenBounds.y);
    // Make sure we only add extra pixels to the total width and height, subtracting overlapping dimensions
    // Does not take into account non-continuous display area, normally impossible on Windows
    allScreenBounds.width += Math.abs(allScreenBounds.width - (screenBounds.width + screenBounds.x));
    allScreenBounds.height += Math.abs(allScreenBounds.height - (screenBounds.height + screenBounds.y));
    System.out.println(String.format("Display %d: X=%d, Y=%d, Height=%d, Width=%d", num++, screenBounds.x, screenBounds.y, screenBounds.height, screenBounds.width));
}
System.out.println(String.format("Screen Area: X=%d, Y=%d, Height=%d, Width=%d", allScreenBounds.x, allScreenBounds.y, allScreenBounds.height, allScreenBounds.width));
screenBuffer = robot.createScreenCapture(allScreenBounds);
// Save the screencap to file
ImageIO.write(screenBuffer, imgFormat, new File(filename));
There is file-writing code there for testing, but if this is performed by your application you can remove the filename variables, the javax.imageio.ImageIO import, and the ImageIO.write call, as you'll have the screenshot data in screenBuffer instead.
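As an aside, the width/height bookkeeping with Math.abs can be replaced by java.awt.Rectangle.union, which also copes with monitors positioned left of or above the primary display (negative origins). A minimal headless sketch, with made-up display geometry for illustration:

```java
import java.awt.Rectangle;

// Union per-screen bounds into one virtual-desktop rectangle.
public class ScreenUnion {
    static Rectangle union(Rectangle[] screens) {
        Rectangle all = new Rectangle(screens[0]); // start from the first screen
        for (int i = 1; i < screens.length; i++) {
            all = all.union(screens[i]);           // grow to cover each further screen
        }
        return all;
    }

    public static void main(String[] args) {
        // Hypothetical dual-monitor layout: secondary display to the left of the primary.
        Rectangle primary = new Rectangle(0, 0, 1920, 1080);
        Rectangle secondary = new Rectangle(-1280, 0, 1280, 1024);
        Rectangle all = union(new Rectangle[] { primary, secondary });
        // x=-1280, y=0, width=3200, height=1080
        System.out.println(all.x + " " + all.y + " " + all.width + " " + all.height);
    }
}
```

The resulting rectangle can be passed straight to robot.createScreenCapture(...).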

How to write rotation metadata to video in xuggler?

I have created a class to resize a video. I tested this with a video filmed on my mobile in portrait mode. The video is correctly displayed in portrait mode in any media player, both on the phone and on the computer. However, when using my program, the video is correctly resized but the rotation metadata is lost.
Querying the metadata of my input video via mediainfo gives the following output:
General
Complete name : /home/sneese/Downloads/20190708_095754.mp4
Format : MPEG-4
Format profile : Base Media / Version 2
Codec ID : mp42 (isom/mp42)
File size : 22.7 MiB
Duration : 6s 741ms
Overall bit rate : 28.3 Mbps
Encoded date : UTC 2019-07-08 07:58:02
Tagged date : UTC 2019-07-08 07:58:02
com.android.version : 8.0.0
Video
ID : 1
Format : AVC
Format/Info : Advanced Video Codec
Format profile : High#L4.2
Format settings, CABAC : Yes
Format settings, ReFrames : 1 frame
Format settings, GOP : M=1, N=60
Codec ID : avc1
Codec ID/Info : Advanced Video Coding
Duration : 6s 732ms
Bit rate : 28.0 Mbps
Width : 1 920 pixels
Height : 1 080 pixels
Display aspect ratio : 16:9
Rotation : 90°
Frame rate mode : Variable
Frame rate : 60.000 fps
Minimum frame rate : 59.094 fps
Maximum frame rate : 60.934 fps
Color space : YUV
Chroma subsampling : 4:2:0
Bit depth : 8 bits
Scan type : Progressive
Bits/(Pixel*Frame) : 0.225
Stream size : 22.5 MiB (99%)
Title : VideoHandle
Language : English
Encoded date : UTC 2019-07-08 07:58:02
Tagged date : UTC 2019-07-08 07:58:02
mdhd_Duration : 6732
Audio
ID : 2
Format : AAC
Format/Info : Advanced Audio Codec
Format profile : LC
Codec ID : 40
Duration : 6s 741ms
Source duration : 6s 745ms
Source_Duration_FirstFrame : 3ms
Bit rate mode : Constant
Bit rate : 256 Kbps
Channel(s) : 2 channels
Channel positions : Front: L R
Sampling rate : 48.0 KHz
Frame rate : 46.875 fps (1024 spf)
Compression mode : Lossy
Stream size : 211 KiB (1%)
Source stream size : 211 KiB (1%)
Title : SoundHandle
Language : English
Encoded date : UTC 2019-07-08 07:58:02
Tagged date : UTC 2019-07-08 07:58:02
mdhd_Duration : 6741
As you can see, the rotation is set to 90 degrees.
This is my code:
Main Class:
package test.video;
import com.xuggle.mediatool.IMediaReader;
import com.xuggle.mediatool.IMediaWriter;
import com.xuggle.mediatool.ToolFactory;
import com.xuggle.xuggler.ICodec;
import com.xuggle.xuggler.IContainer;
import com.xuggle.xuggler.IMetaData;
import com.xuggle.xuggler.IStream;
import com.xuggle.xuggler.IStreamCoder;
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.attribute.BasicFileAttributes;
import java.util.Arrays;
public class MediaConverter {
    private static final String INPUT_FILE = "/home/user/input_video.mp4";
    private static final String OUTPUT_FILE = "/home/user/output_video.mp4";

    public static void main(String[] args) throws IOException {
        convertVideo(INPUT_FILE, OUTPUT_FILE, 640, 360);
    }

    private static void convertVideo(String inputFile, String outputFile, int width, int height) {
        // reader
        IMediaReader reader = ToolFactory.makeReader(inputFile);
        Resizer resizer = new Resizer(width, height);
        reader.addListener(resizer);
        // writer
        IMediaWriter writer = ToolFactory.makeWriter(outputFile, reader);
        resizer.addListener(writer);
        while (reader.readPacket() == null) {
            // nothing to do here, just let it read
        }
        writer.flush();
    }
}
MediaToolAdapter:
package test.video;
import com.xuggle.mediatool.MediaToolAdapter;
import com.xuggle.mediatool.event.IAddStreamEvent;
import com.xuggle.mediatool.event.IVideoPictureEvent;
import com.xuggle.mediatool.event.IWriteHeaderEvent;
import com.xuggle.mediatool.event.VideoPictureEvent;
import com.xuggle.xuggler.ICodec;
import com.xuggle.xuggler.IMetaData;
import com.xuggle.xuggler.IStream;
import com.xuggle.xuggler.IStreamCoder;
import com.xuggle.xuggler.IVideoPicture;
import com.xuggle.xuggler.IVideoResampler;
import java.util.Collection;
public class Resizer extends MediaToolAdapter {
    private Integer width;
    private Integer height;
    private IVideoResampler videoResampler = null;

    public Resizer(Integer aWidth, Integer aHeight) {
        this.width = aWidth;
        this.height = aHeight;
    }

    @Override
    public void onAddStream(IAddStreamEvent event) {
        int streamIndex = event.getStreamIndex();
        IStreamCoder streamCoder = event.getSource().getContainer().getStream(streamIndex).getStreamCoder();
        if (streamCoder.getCodecType() == ICodec.Type.CODEC_TYPE_VIDEO) {
            IStream stream = event.getSource().getContainer().getStream(streamIndex);
            streamCoder.setWidth(width);
            streamCoder.setHeight(height);
            // checked whether this writes the metadata - but it does not
            streamCoder.setProperty(stream.getMetaData(), null);
            event.getSource().getContainer().setMetaData(stream.getMetaData());
        }
        super.onAddStream(event);
    }

    @Override
    public void onVideoPicture(IVideoPictureEvent event) {
        IVideoPicture pic = event.getPicture();
        if (videoResampler == null) {
            videoResampler = IVideoResampler.make(width, height, pic.getPixelType(), pic.getWidth(), pic.getHeight(), pic.getPixelType());
        }
        IVideoPicture out = IVideoPicture.make(pic.getPixelType(), width, height);
        videoResampler.resample(out, pic);
        IVideoPictureEvent asc = new VideoPictureEvent(event.getSource(), out, event.getStreamIndex());
        super.onVideoPicture(asc);
        out.delete();
    }
}
The expected result would be that the video is displayed in portrait mode after conversion if it was portrait before the conversion, and in landscape mode otherwise. Thank you
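Xuggler's API predates rotation side data, and as far as I can tell it offers no direct way to write a display matrix. If the metadata can't be carried through, one workaround is to bake the rotation into the pixels and feed rotated frames (with width and height swapped) to the writer. The image-level part of that idea can be sketched with plain java.awt and a hypothetical helper; the Xuggler wiring is left out:

```java
import java.awt.geom.AffineTransform;
import java.awt.image.AffineTransformOp;
import java.awt.image.BufferedImage;

// Rotate a frame 90 degrees clockwise: a WxH image becomes HxW.
public class FrameRotator {
    static BufferedImage rotate90(BufferedImage src) {
        int w = src.getWidth(), h = src.getHeight();
        // Rotate about the origin, then shift right so the result lands at (0, 0).
        AffineTransform tx = new AffineTransform();
        tx.translate(h, 0);
        tx.rotate(Math.toRadians(90));
        AffineTransformOp op = new AffineTransformOp(tx, AffineTransformOp.TYPE_NEAREST_NEIGHBOR);
        BufferedImage dst = new BufferedImage(h, w, src.getType());
        return op.filter(src, dst);
    }

    public static void main(String[] args) {
        BufferedImage landscape = new BufferedImage(1920, 1080, BufferedImage.TYPE_3BYTE_BGR);
        BufferedImage portrait = rotate90(landscape);
        System.out.println(portrait.getWidth() + "x" + portrait.getHeight()); // 1080x1920
    }
}
```

This trades a lossless metadata flag for a pixel transform, so it only makes sense if preserving the rotation field really is impossible in your pipeline.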

Lag a value with Datavec transform

I'm trying to figure out how to get a lagged value of a field as part of a datavec transform step.
Here is a little example built off the dl4j examples:
import org.datavec.api.records.reader.RecordReader;
import org.datavec.api.records.reader.impl.csv.CSVRecordReader;
import org.datavec.api.split.FileSplit;
import org.datavec.api.transform.TransformProcess;
import org.datavec.api.transform.schema.Schema;
import org.datavec.api.writable.Writable;
import org.datavec.local.transforms.LocalTransformExecutor;
import org.nd4j.linalg.io.ClassPathResource;
import java.io.File;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
public class myExample {
    public static void main(String[] args) throws Exception {
        Schema inputDataSchema = new Schema.Builder()
                .addColumnString("DateTimeString")
                .addColumnsString("CustomerID", "MerchantID")
                .addColumnInteger("NumItemsInTransaction")
                .addColumnCategorical("MerchantCountryCode", Arrays.asList("USA", "CAN", "FR", "MX"))
                .addColumnDouble("TransactionAmountUSD", 0.0, null, false, false) // $0.0 or more, no maximum limit, no NaN and no Infinite values
                .addColumnCategorical("FraudLabel", Arrays.asList("Fraud", "Legit"))
                .build();

        TransformProcess tp = new TransformProcess.Builder(inputDataSchema)
                .removeAllColumnsExceptFor("DateTimeString", "TransactionAmountUSD")
                .build();

        File inputFile = new ClassPathResource("BasicDataVecExample/exampledata.csv").getFile();

        // Define input reader and output writer:
        RecordReader rr = new CSVRecordReader(1, ',');
        rr.initialize(new FileSplit(inputFile));

        // Process the data:
        List<List<Writable>> originalData = new ArrayList<>();
        while (rr.hasNext()) {
            originalData.add(rr.next());
        }
        List<List<Writable>> processedData = LocalTransformExecutor.execute(originalData, tp);

        int numRows = 5;
        System.out.println("=== BEFORE ===");
        for (int i = 0; i <= numRows; i++) {
            System.out.println(originalData.get(i));
        }
        System.out.println("=== AFTER ===");
        for (int i = 0; i <= numRows; i++) {
            System.out.println(processedData.get(i));
        }
    }
}
I'm looking to get a lagged value (ordered by DateTimeString) of TransactionAmountUSD.
I was looking at sequenceMovingWindowReduce from the docs but could not figure it out, and I could not really find any examples in the examples repo that do anything similar to this.
Thanks to some help from Alex Black on the dl4j gitter channel I can post my own answer.
Tip for anyone new to dl4j: there are lots of good things to look at in the test code too, in addition to the examples and tutorials.
Here is my updated toy example code:
package org.datavec.transform.basic;
import org.datavec.api.records.reader.RecordReader;
import org.datavec.api.records.reader.impl.csv.CSVRecordReader;
import org.datavec.api.split.FileSplit;
import org.datavec.api.transform.TransformProcess;
import org.datavec.api.transform.schema.Schema;
import org.datavec.api.transform.sequence.comparator.NumericalColumnComparator;
import org.datavec.api.transform.transform.sequence.SequenceOffsetTransform;
import org.datavec.api.writable.Writable;
import org.datavec.local.transforms.LocalTransformExecutor;
import org.joda.time.DateTimeZone;
import org.nd4j.linalg.io.ClassPathResource;
import java.io.File;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
public class myExample {
    public static void main(String[] args) throws Exception {
        Schema inputDataSchema = new Schema.Builder()
                .addColumnString("DateTimeString")
                .addColumnsString("CustomerID", "MerchantID")
                .addColumnInteger("NumItemsInTransaction")
                .addColumnCategorical("MerchantCountryCode", Arrays.asList("USA", "CAN", "FR", "MX"))
                .addColumnDouble("TransactionAmountUSD", 0.0, null, false, false) // $0.0 or more, no maximum limit, no NaN and no Infinite values
                .addColumnCategorical("FraudLabel", Arrays.asList("Fraud", "Legit"))
                .build();

        TransformProcess tp = new TransformProcess.Builder(inputDataSchema)
                .removeAllColumnsExceptFor("CustomerID", "DateTimeString", "TransactionAmountUSD")
                .stringToTimeTransform("DateTimeString", "YYYY-MM-DD HH:mm:ss.SSS", DateTimeZone.UTC)
                .convertToSequence(Arrays.asList("CustomerID"), new NumericalColumnComparator("DateTimeString"))
                .offsetSequence(Arrays.asList("TransactionAmountUSD"), 1, SequenceOffsetTransform.OperationType.NewColumn)
                .build();

        File inputFile = new ClassPathResource("BasicDataVecExample/exampledata.csv").getFile();

        // Define input reader and output writer:
        RecordReader rr = new CSVRecordReader(0, ',');
        rr.initialize(new FileSplit(inputFile));

        // Process the data:
        List<List<Writable>> originalData = new ArrayList<>();
        while (rr.hasNext()) {
            originalData.add(rr.next());
        }
        List<List<List<Writable>>> processedData = LocalTransformExecutor.executeToSequence(originalData, tp);

        System.out.println("=== BEFORE ===");
        for (int i = 0; i < originalData.size(); i++) {
            System.out.println(originalData.get(i));
        }
        System.out.println("=== AFTER ===");
        for (int i = 0; i < processedData.size(); i++) {
            System.out.println(processedData.get(i));
        }
    }
}
This should give some output like below, where you can see that a new column with the previous transaction amount for each customer ID has been added.
"C:\Program Files\Java\jdk1.8.0_201\bin\java.exe" "-javaagent:C:\Program Files\JetBrains\IntelliJ IDEA Community Edition 2019.1\lib\idea_rt.jar=56103:C:\Program Files\JetBrains\IntelliJ IDEA Community Edition 2019.1\bin" -Dfile.encoding=UTF-8 -classpath "C:\Program Files\Java\jdk1.8.0_201\jre\lib\charsets.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\deploy.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\ext\access-bridge-64.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\ext\cldrdata.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\ext\dnsns.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\ext\jaccess.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\ext\jfxrt.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\ext\localedata.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\ext\nashorn.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\ext\sunec.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\ext\sunjce_provider.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\ext\sunmscapi.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\ext\sunpkcs11.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\ext\zipfs.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\javaws.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\jce.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\jfr.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\jfxswt.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\jsse.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\management-agent.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\plugin.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\resources.jar;C:\Program 
Files\Java\jdk1.8.0_201\jre\lib\rt.jar;C:\Users\amaguire\Documents\java_learning\dl4j-examples\datavec-examples\target\classes;C:\Users\amaguire\.m2\repository\org\datavec\datavec-api\1.0.0-beta3\datavec-api-1.0.0-beta3.jar;C:\Users\amaguire\.m2\repository\org\jetbrains\annotations\13.0\annotations-13.0.jar;C:\Users\amaguire\.m2\repository\org\apache\commons\commons-lang3\3.6\commons-lang3-3.6.jar;C:\Users\amaguire\.m2\repository\commons-io\commons-io\2.5\commons-io-2.5.jar;C:\Users\amaguire\.m2\repository\commons-codec\commons-codec\1.10\commons-codec-1.10.jar;C:\Users\amaguire\.m2\repository\org\slf4j\slf4j-api\1.7.21\slf4j-api-1.7.21.jar;C:\Users\amaguire\.m2\repository\joda-time\joda-time\2.2\joda-time-2.2.jar;C:\Users\amaguire\.m2\repository\org\yaml\snakeyaml\1.12\snakeyaml-1.12.jar;C:\Users\amaguire\.m2\repository\org\nd4j\jackson\1.0.0-beta3\jackson-1.0.0-beta3.jar;C:\Users\amaguire\.m2\repository\org\codehaus\woodstox\stax2-api\3.1.4\stax2-api-3.1.4.jar;C:\Users\amaguire\.m2\repository\org\freemarker\freemarker\2.3.23\freemarker-2.3.23.jar;C:\Users\amaguire\.m2\repository\org\nd4j\nd4j-common\1.0.0-beta3\nd4j-common-1.0.0-beta3.jar;C:\Users\amaguire\.m2\repository\org\nd4j\nd4j-api\1.0.0-beta3\nd4j-api-1.0.0-beta3.jar;C:\Users\amaguire\.m2\repository\com\google\flatbuffers\flatbuffers-java\1.9.0\flatbuffers-java-1.9.0.jar;C:\Users\amaguire\.m2\repository\com\github\os72\protobuf-java-shaded-351\0.9\protobuf-java-shaded-351-0.9.jar;C:\Users\amaguire\.m2\repository\com\github\os72\protobuf-java-util-shaded-351\0.9\protobuf-java-util-shaded-351-0.9.jar;C:\Users\amaguire\.m2\repository\com\google\code\gson\gson\2.7\gson-2.7.jar;C:\Users\amaguire\.m2\repository\org\objenesis\objenesis\2.6\objenesis-2.6.jar;C:\Users\amaguire\.m2\repository\uk\com\robust-it\cloning\1.9.3\cloning-1.9.3.jar;C:\Users\amaguire\.m2\repository\org\nd4j\nd4j-buffer\1.0.0-beta3\nd4j-buffer-1.0.0-beta3.jar;C:\Users\amaguire\.m2\repository\org\bytedeco\javacpp\1.4.3\javacpp-1.4.3.jar;C:\Use
rs\amaguire\.m2\repository\org\nd4j\nd4j-context\1.0.0-beta3\nd4j-context-1.0.0-beta3.jar;C:\Users\amaguire\.m2\repository\net\ericaro\neoitertools\1.0.0\neoitertools-1.0.0.jar;C:\Users\amaguire\.m2\repository\com\clearspring\analytics\stream\2.7.0\stream-2.7.0.jar;C:\Users\amaguire\.m2\repository\net\sf\opencsv\opencsv\2.3\opencsv-2.3.jar;C:\Users\amaguire\.m2\repository\com\tdunning\t-digest\3.2\t-digest-3.2.jar;C:\Users\amaguire\.m2\repository\it\unimi\dsi\fastutil\6.5.7\fastutil-6.5.7.jar;C:\Users\amaguire\.m2\repository\org\datavec\datavec-spark_2.11\1.0.0-beta3_spark_1\datavec-spark_2.11-1.0.0-beta3_spark_1.jar;C:\Users\amaguire\.m2\repository\org\scala-lang\scala-library\2.11.12\scala-library-2.11.12.jar;C:\Users\amaguire\.m2\repository\org\scala-lang\scala-reflect\2.11.12\scala-reflect-2.11.12.jar;C:\Users\amaguire\.m2\repository\org\codehaus\jackson\jackson-core-asl\1.9.13\jackson-core-asl-1.9.13.jar;C:\Users\amaguire\.m2\repository\org\codehaus\jackson\jackson-mapper-asl\1.9.13\jackson-mapper-asl-1.9.13.jar;C:\Users\amaguire\.m2\repository\org\apache\spark\spark-sql_2.11\1.6.3\spark-sql_2.11-1.6.3.jar;C:\Users\amaguire\.m2\repository\org\apache\spark\spark-core_2.11\1.6.3\spark-core_2.11-1.6.3.jar;C:\Users\amaguire\.m2\repository\org\apache\avro\avro-mapred\1.7.7\avro-mapred-1.7.7-hadoop2.jar;C:\Users\amaguire\.m2\repository\org\apache\avro\avro-ipc\1.7.7\avro-ipc-1.7.7.jar;C:\Users\amaguire\.m2\repository\org\apache\avro\avro\1.7.7\avro-1.7.7.jar;C:\Users\amaguire\.m2\repository\org\apache\avro\avro-ipc\1.7.7\avro-ipc-1.7.7-tests.jar;C:\Users\amaguire\.m2\repository\com\twitter\chill_2.11\0.5.0\chill_2.11-0.5.0.jar;C:\Users\amaguire\.m2\repository\com\esotericsoftware\kryo\kryo\2.21\kryo-2.21.jar;C:\Users\amaguire\.m2\repository\com\esotericsoftware\reflectasm\reflectasm\1.07\reflectasm-1.07-shaded.jar;C:\Users\amaguire\.m2\repository\com\esotericsoftware\minlog\minlog\1.2\minlog-1.2.jar;C:\Users\amaguire\.m2\repository\com\twitter\chill-java\0.5.0\chill-
java-0.5.0.jar;C:\Users\amaguire\.m2\repository\org\apache\xbean\xbean-asm5-shaded\4.4\xbean-asm5-shaded-4.4.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-client\2.2.0\hadoop-client-2.2.0.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-common\2.2.0\hadoop-common-2.2.0.jar;C:\Users\amaguire\.m2\repository\commons-cli\commons-cli\1.2\commons-cli-1.2.jar;C:\Users\amaguire\.m2\repository\org\apache\commons\commons-math\2.1\commons-math-2.1.jar;C:\Users\amaguire\.m2\repository\xmlenc\xmlenc\0.52\xmlenc-0.52.jar;C:\Users\amaguire\.m2\repository\commons-configuration\commons-configuration\1.6\commons-configuration-1.6.jar;C:\Users\amaguire\.m2\repository\commons-digester\commons-digester\1.8\commons-digester-1.8.jar;C:\Users\amaguire\.m2\repository\commons-beanutils\commons-beanutils\1.7.0\commons-beanutils-1.7.0.jar;C:\Users\amaguire\.m2\repository\commons-beanutils\commons-beanutils-core\1.8.0\commons-beanutils-core-1.8.0.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-auth\2.2.0\hadoop-auth-2.2.0.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-hdfs\2.2.0\hadoop-hdfs-2.2.0.jar;C:\Users\amaguire\.m2\repository\org\mortbay\jetty\jetty-util\6.1.26\jetty-util-6.1.26.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-app\2.2.0\hadoop-mapreduce-client-app-2.2.0.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-common\2.2.0\hadoop-mapreduce-client-common-2.2.0.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-yarn-client\2.2.0\hadoop-yarn-client-2.2.0.jar;C:\Users\amaguire\.m2\repository\com\sun\jersey\jersey-test-framework\jersey-test-framework-grizzly2\1.9\jersey-test-framework-grizzly2-1.9.jar;C:\Users\amaguire\.m2\repository\com\sun\jersey\jersey-test-framework\jersey-test-framework-core\1.9\jersey-test-framework-core-1.9.jar;C:\Users\amaguire\.m2\repository\com\sun\jersey\jersey-client\1.9\jersey-client-1.9.jar;C:\Users\amaguire\.m2\repository\com\
sun\jersey\jersey-grizzly2\1.9\jersey-grizzly2-1.9.jar;C:\Users\amaguire\.m2\repository\org\glassfish\grizzly\grizzly-http\2.1.2\grizzly-http-2.1.2.jar;C:\Users\amaguire\.m2\repository\org\glassfish\grizzly\grizzly-framework\2.1.2\grizzly-framework-2.1.2.jar;C:\Users\amaguire\.m2\repository\org\glassfish\gmbal\gmbal-api-only\3.0.0-b023\gmbal-api-only-3.0.0-b023.jar;C:\Users\amaguire\.m2\repository\org\glassfish\external\management-api\3.0.0-b012\management-api-3.0.0-b012.jar;C:\Users\amaguire\.m2\repository\org\glassfish\grizzly\grizzly-http-server\2.1.2\grizzly-http-server-2.1.2.jar;C:\Users\amaguire\.m2\repository\org\glassfish\grizzly\grizzly-rcm\2.1.2\grizzly-rcm-2.1.2.jar;C:\Users\amaguire\.m2\repository\org\glassfish\grizzly\grizzly-http-servlet\2.1.2\grizzly-http-servlet-2.1.2.jar;C:\Users\amaguire\.m2\repository\org\glassfish\javax.servlet\3.1\javax.servlet-3.1.jar;C:\Users\amaguire\.m2\repository\com\sun\jersey\jersey-json\1.9\jersey-json-1.9.jar;C:\Users\amaguire\.m2\repository\org\codehaus\jettison\jettison\1.1\jettison-1.1.jar;C:\Users\amaguire\.m2\repository\stax\stax-api\1.0.1\stax-api-1.0.1.jar;C:\Users\amaguire\.m2\repository\org\codehaus\jackson\jackson-jaxrs\1.8.3\jackson-jaxrs-1.8.3.jar;C:\Users\amaguire\.m2\repository\org\codehaus\jackson\jackson-xc\1.8.3\jackson-xc-1.8.3.jar;C:\Users\amaguire\.m2\repository\com\sun\jersey\contribs\jersey-guice\1.9\jersey-guice-1.9.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-yarn-server-common\2.2.0\hadoop-yarn-server-common-2.2.0.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-shuffle\2.2.0\hadoop-mapreduce-client-shuffle-2.2.0.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-yarn-api\2.2.0\hadoop-yarn-api-2.2.0.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-core\2.2.0\hadoop-mapreduce-client-core-2.2.0.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-yarn-common\2.2.0\hadoop-yarn-common-2.2.0.jar;C:\Users\
amaguire\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-jobclient\2.2.0\hadoop-mapreduce-client-jobclient-2.2.0.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-annotations\2.2.0\hadoop-annotations-2.2.0.jar;C:\Users\amaguire\.m2\repository\org\apache\spark\spark-launcher_2.11\1.6.3\spark-launcher_2.11-1.6.3.jar;C:\Users\amaguire\.m2\repository\org\apache\spark\spark-network-common_2.11\1.6.3\spark-network-common_2.11-1.6.3.jar;C:\Users\amaguire\.m2\repository\org\apache\spark\spark-network-shuffle_2.11\1.6.3\spark-network-shuffle_2.11-1.6.3.jar;C:\Users\amaguire\.m2\repository\org\fusesource\leveldbjni\leveldbjni-all\1.8\leveldbjni-all-1.8.jar;C:\Users\amaguire\.m2\repository\org\apache\spark\spark-unsafe_2.11\1.6.3\spark-unsafe_2.11-1.6.3.jar;C:\Users\amaguire\.m2\repository\net\java\dev\jets3t\jets3t\0.7.1\jets3t-0.7.1.jar;C:\Users\amaguire\.m2\repository\commons-httpclient\commons-httpclient\3.1\commons-httpclient-3.1.jar;C:\Users\amaguire\.m2\repository\org\eclipse\jetty\orbit\javax.servlet\3.0.0.v201112011016\javax.servlet-3.0.0.v201112011016.jar;C:\Users\amaguire\.m2\repository\com\google\code\findbugs\jsr305\1.3.9\jsr305-1.3.9.jar;C:\Users\amaguire\.m2\repository\org\slf4j\jul-to-slf4j\1.7.10\jul-to-slf4j-1.7.10.jar;C:\Users\amaguire\.m2\repository\org\slf4j\jcl-over-slf4j\1.7.10\jcl-over-slf4j-1.7.10.jar;C:\Users\amaguire\.m2\repository\log4j\log4j\1.2.17\log4j-1.2.17.jar;C:\Users\amaguire\.m2\repository\org\slf4j\slf4j-log4j12\1.7.10\slf4j-log4j12-1.7.10.jar;C:\Users\amaguire\.m2\repository\com\ning\compress-lzf\1.0.3\compress-lzf-1.0.3.jar;C:\Users\amaguire\.m2\repository\org\xerial\snappy\snappy-java\1.1.2.6\snappy-java-1.1.2.6.jar;C:\Users\amaguire\.m2\repository\net\jpountz\lz4\lz4\1.3.0\lz4-1.3.0.jar;C:\Users\amaguire\.m2\repository\org\roaringbitmap\RoaringBitmap\0.5.11\RoaringBitmap-0.5.11.jar;C:\Users\amaguire\.m2\repository\org\json4s\json4s-jackson_2.11\3.2.10\json4s-jackson_2.11-3.2.10.jar;C:\Users\amaguire\.m2\repository\
[... long Maven classpath of local-repository JARs elided ...] org.datavec.transform.basic.myExample
log4j:WARN No appenders could be found for logger (io.netty.util.internal.logging.InternalLoggerFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
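The log4j warnings above only mean that no logging configuration was found on the classpath; they can be silenced with a minimal `log4j.properties` (a sketch using the standard log4j 1.2 appender classes; adjust the level and pattern to taste):

```properties
# Minimal log4j 1.2 configuration: INFO and above to the console
log4j.rootLogger=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{ISO8601} %-5p %c{1} - %m%n
```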
=== BEFORE ===
[2016-01-01 17:00:00.000, 830a7u3, u323fy8902, 1, USA, 100.00, Legit]
[2016-01-01 18:03:01.256, 830a7u3, 9732498oeu, 3, FR, 73.20, Legit]
[2016-01-03 02:53:32.231, 78ueoau32, w234e989, 1, USA, 1621.00, Fraud]
[2016-01-03 09:30:16.832, t842uocd, 9732498oeu, 4, USA, 43.19, Legit]
[2016-01-04 23:01:52.920, t842uocd, cza8873bm, 10, MX, 159.65, Legit]
[2016-01-05 02:28:10.648, t842uocd, fgcq9803, 6, CAN, 26.33, Fraud]
[2016-01-05 10:15:36.483, rgc707ke3, tn342v7, 2, USA, -0.90, Legit]
=== AFTER ===
[[1451948512920, t842uocd, 159.65, 43.19], [1451960890648, t842uocd, 26.33, 159.65]]
[[1451671381256, 830a7u3, 73.20, 100.00]]
[]
[]
Process finished with exit code 0
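As a sanity check, the first column of the AFTER rows appears to be the BEFORE timestamp converted to epoch milliseconds; converting one value back (assuming UTC) recovers the original date string:

```java
import java.time.Instant;

public class EpochCheck {
    public static void main(String[] args) {
        // 1451948512920 is the first field of the first AFTER row;
        // it maps back to the 2016-01-04 23:01:52.920 BEFORE entry (UTC assumed).
        System.out.println(Instant.ofEpochMilli(1451948512920L));
        // prints 2016-01-04T23:01:52.920Z
    }
}
```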

jPOS connectivity with Posiflex printer through Java application

I want to integrate a Posiflex printer with my application; its logical name is "PP Demo" and the device model is PP-8900.
The printer is connected to my system and prints fine through the Posiflex software. How can I integrate it with my application?
In the end, all I had to do was copy the required DLL files into my Java installation's lib directory; the jpos.xml configuration file is referenced from the code...
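For reference, the jpos.xml entry for the printer looks roughly like the sketch below. This is only an illustration of the JposEntry shape: the `factoryClass` and `serviceClass` values are placeholders and must be replaced with the class names documented in the Posiflex JavaPOS driver package.

```xml
<JposEntries>
  <JposEntry logicalName="PP Demo">
    <!-- Placeholder class names: use the factory/service classes
         shipped with the Posiflex JavaPOS driver -->
    <creation factoryClass="com.example.posiflex.POSPrinterServiceFactory"
              serviceClass="com.example.posiflex.PP8900Service"/>
    <jpos category="POSPrinter" version="1.9"/>
    <product description="Posiflex PP-8900 receipt printer"
             name="PP-8900" url="http://www.posiflex.com"/>
  </JposEntry>
</JposEntries>
```

The `logicalName` here is what the application passes to `POSPrinter.open(...)`.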
package practice;
import java.awt.BasicStroke;
import java.awt.Color;
import java.awt.Font;
import java.awt.FontMetrics;
import java.awt.Graphics;
import java.awt.Graphics2D;
import java.awt.geom.Rectangle2D;
import java.awt.print.Book;
import java.awt.print.PageFormat;
import java.awt.print.Printable;
import java.awt.print.PrinterJob;
public class PrintableComponent {
public static void main(String[] args) {
PrintableComponent example3 = new PrintableComponent();
System.exit(0);
}
// --- Private instances declarations
private final static int POINTS_PER_INCH = 72;
/**
* Constructor: PrintableComponent
*/
public PrintableComponent() {
// --- Create a new PrinterJob object
PrinterJob printJob = PrinterJob.getPrinterJob();
// --- Create a new book to add pages to
Book book = new Book();
// --- Add the cover page using the default page format for this print
// job
book.append(new IntroPage(), printJob.defaultPage());
// --- Add the document page using a landscape page format
PageFormat documentPageFormat = new PageFormat();
documentPageFormat.setOrientation(PageFormat.LANDSCAPE);
book.append(new Document(), documentPageFormat);
// --- Add a third page using the same painter
book.append(new Document(), documentPageFormat);
// --- Tell the printJob to use the book as the pageable object
printJob.setPageable(book);
// --- Show the print dialog box. If the user clicks the
// --- print button we proceed to print; otherwise we cancel
// --- the process.
if (printJob.printDialog()) {
try {
printJob.print();
} catch (Exception e) {
e.printStackTrace();
}
}
}
/**
* Class: IntroPage
* <p>
*
* This class defines the painter for the cover page by implementing the
* Printable interface.
* <p>
*
* @author Jean-Pierre Dube <jpdube@videotron.ca>
* @version 1.0
* @since 1.0
* @see Printable
*/
private class IntroPage implements Printable {
/**
* Method: print
* <p>
*
* @param g
* a value of type Graphics
* @param pageFormat
* a value of type PageFormat
* @param page
* a value of type int
* @return a value of type int
*/
public int print(Graphics g, PageFormat pageFormat, int page) {
// --- Create the Graphics2D object
Graphics2D g2d = (Graphics2D) g;
// --- Translate the origin to 0,0 for the top left corner
g2d.translate(pageFormat.getImageableX(), pageFormat.getImageableY());
// --- Set the default drawing color to black
g2d.setPaint(Color.black);
// --- Draw a border around the page
Rectangle2D.Double border = new Rectangle2D.Double(0, 0, pageFormat.getImageableWidth(),
pageFormat.getImageableHeight());
g2d.draw(border);
// --- Print the title
String titleText = "Printing in Java Part 2";
Font titleFont = new Font("helvetica", Font.BOLD, 36);
g2d.setFont(titleFont);
// --- Compute the horizontal center of the page
FontMetrics fontMetrics = g2d.getFontMetrics();
double titleX = (pageFormat.getImageableWidth() / 2) - (fontMetrics.stringWidth(titleText) / 2);
double titleY = 3 * POINTS_PER_INCH;
g2d.drawString(titleText, (int) titleX, (int) titleY);
return (PAGE_EXISTS);
}
}
/**
* Class: Document
* <p>
*
* This class is the painter for the document content.
* <p>
*
*
* @author Jean-Pierre Dube <jpdube@videotron.ca>
* @version 1.0
* @since 1.0
* @see Printable
*/
private class Document implements Printable {
/**
* Method: print
* <p>
*
* @param g
* a value of type Graphics
* @param pageFormat
* a value of type PageFormat
* @param page
* a value of type int
* @return a value of type int
*/
public int print(Graphics g, PageFormat pageFormat, int page) {
// --- Create the Graphics2D object
Graphics2D g2d = (Graphics2D) g;
// --- Translate the origin to 0,0 for the top left corner
g2d.translate(pageFormat.getImageableX(), pageFormat.getImageableY());
// --- Set the drawing color to black
g2d.setPaint(Color.black);
// --- Draw a border around the page using a 12 point border
g2d.setStroke(new BasicStroke(12));
Rectangle2D.Double border = new Rectangle2D.Double(0, 0, pageFormat.getImageableWidth(),
pageFormat.getImageableHeight());
g2d.draw(border);
// --- Print page 1
if (page == 1) {
// --- Print the text one inch from the top and left margins
g2d.drawString("This is the content of page: " + page, POINTS_PER_INCH, POINTS_PER_INCH);
return (PAGE_EXISTS);
}
// --- Print page 2
else if (page == 2) {
// --- Print the text one inch from the top and left margins
g2d.drawString("This is the content of the second page: " + page, POINTS_PER_INCH, POINTS_PER_INCH);
return (PAGE_EXISTS);
}
// --- Validate the page
return (NO_SUCH_PAGE);
}
}
} // Example3

cannot access neo4j database with py2neo after creating it with the java BatchInserter

SOLVED
OK, I had just messed up the neo4j-server.properties config file: I shouldn't have written the db path wrapped in quotes ("...").
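Concretely, the path value in neo4j-server.properties must be unquoted, because the quotes are taken literally as part of the path (the path below is a hypothetical example):

```properties
# Wrong: the quotes become part of the path
# org.neo4j.server.database.location="/home/me/neo4j-batchinserter-store"

# Right
org.neo4j.server.database.location=/home/me/neo4j-batchinserter-store
```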
I have created a neo4j database using Java's BatchInserter and am trying to access it with py2neo. Here's my Java code:
///opt/java/64/jdk1.6.0_45/bin/javac -classpath $HOME/opt/usr/neo4j-community-1.8.2/lib/*:. neo_batch.java
///opt/java/64/jdk1.6.0_45/bin/java -classpath $HOME/opt/usr/neo4j-community-1.8.2/lib/*:. neo_batch
import org.neo4j.graphdb.GraphDatabaseService;
import org.neo4j.graphdb.Node;
import org.neo4j.graphdb.Transaction;
import org.neo4j.graphdb.factory.GraphDatabaseFactory;
import org.neo4j.graphdb.index.Index;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.io.Writer;
import java.util.HashMap;
import java.util.Map;
import java.lang.Long;
import org.neo4j.graphdb.Direction;
import org.neo4j.graphdb.DynamicRelationshipType;
import org.neo4j.graphdb.RelationshipType;
import org.neo4j.helpers.collection.MapUtil;
import org.neo4j.unsafe.batchinsert.BatchInserter;
import org.neo4j.unsafe.batchinsert.BatchInserterImpl;
import org.neo4j.unsafe.batchinsert.BatchInserters;
import org.neo4j.unsafe.batchinsert.BatchInserterIndex;
import org.neo4j.unsafe.batchinsert.BatchInserterIndexProvider;
import org.neo4j.unsafe.batchinsert.LuceneBatchInserterIndexProvider;
public class neo_batch{
private static final String KEY = "id";
public static void main(String[] args) {
//create & connect 2 neo db folder
String batch_dir = "neo4j-batchinserter-store";
BatchInserter inserter = BatchInserters.inserter( batch_dir );
//set up neo index
BatchInserterIndexProvider indexProvider =
new LuceneBatchInserterIndexProvider( inserter );
BatchInserterIndex OneIndex =
indexProvider.nodeIndex( "one", MapUtil.stringMap( "type", "exact" ) );
OneIndex.setCacheCapacity( "id", 100000 );
//batchin graph, index and relationships
RelationshipType linked = DynamicRelationshipType.withName( "linked" );
for (int i=0;i<10;i++){
System.out.println(i);
long Node1 = createIndexedNode(inserter, OneIndex, i);
long Node2 = createIndexedNode(inserter, OneIndex, i+10);
inserter.createRelationship(Node1, Node2, linked, null);
}
indexProvider.shutdown();
inserter.shutdown();
}
// START SNIPPET: helperMethods
private static long createIndexedNode(BatchInserter inserter,BatchInserterIndex OneIndex,final long id)
{
Map<String, Object> properties = new HashMap<String, Object>();
properties.put( KEY, id );
long node = inserter.createNode( properties );
OneIndex.add( node, properties);
return node;
}
// END SNIPPET: helperMethods
}
Then I modify the neo4j-server.properties config file accordingly and run `neo4j start`.
The following python code suggests the graph is empty
from py2neo import neo4j
graph = neo4j.GraphDatabaseService("http://localhost:7474/db/data/")
graph.size()
Out[8]: 0
graph.get_indexed_node("one",'id',1)
What's wrong with my approach? Thanks
EDIT
Nor can I count the nodes with a Cypher query:
neo4j-sh (?)$ START n=node(*)
> return count(*);
+----------+
| count(*) |
+----------+
| 0 |
+----------+
1 row
0 ms
EDIT 2
I can check with the java api that the indexes and nodes exist
private static void query_batched_db(){
GraphDatabaseService graphDb = new GraphDatabaseFactory().newEmbeddedDatabase( batch_dir);
IndexManager indexes = graphDb.index();
boolean oneExists = indexes.existsForNodes("one");
System.out.println("Does the 'one' index exists: "+oneExists);
System.out.println("list indexes: "+graphDb.index().nodeIndexNames());
//search index 'one'
Index<Node> oneIndex = graphDb.index().forNodes( "one" );
for (int i=0;i<25;i++){
IndexHits<Node> hits = oneIndex.get( KEY, i );
System.out.println(hits.size());
}
graphDb.shutdown();
}
Where the output is
Does the 'one' index exists: true
list indexes: [Ljava.lang.String;@26ae533a
1
1
...
1
1
0
0
0
0
0
Now if I populate the graph using Python, I cannot access those nodes with the previous Java method (it will count the original 20 again).
from py2neo import neo4j
graph = neo4j.GraphDatabaseService("http://localhost:7474/db/data/")
idx=graph.get_or_create_index(neo4j.Node,"idx")
for k in range(100):
graph.get_or_create_indexed_node('idx', 'id', k, {'id': k})
EDIT 3
Now I delete the store I created with the batchinserter, namely neo4j-test-store while the neo4j-server.properties config file continues to point to the deleted store, namely org.neo4j.server.database.location="{some_path}/neo4j-test-store".
Now if I run a Cypher count, I get 100, which is the number of nodes I inserted using py2neo.
I am going crazy with this stuff!
