Eclipse String List is not printed properly - java
I'm trying to print long size String list.
but elements not printed properly.
I'm not able to figure out what's happening.
output of console :
look, head of line is not visible
but if I have slice list into 200, list printed properly.
please help me.
I'm using STS :
Spring Tool Suite
Version: 3.6.4.RELEASE
Build Id: 201503100339
Platform: Eclipse Luna SR1 (4.4.2)
code :
import java.util.Arrays;
import java.util.List;
public class Test {
public static void main(String[] args) {
List<String> asList = Arrays.asList("testLongLongLongColumnName1",
"testLongLongLongColumnName2", "testLongLongLongColumnName3",
"testLongLongLongColumnName4", "testLongLongLongColumnName5",
"testLongLongLongColumnName6", "testLongLongLongColumnName7",
"testLongLongLongColumnName8", "testLongLongLongColumnName9",
"testLongLongLongColumnName10", "testLongLongLongColumnName11",
"testLongLongLongColumnName12", "testLongLongLongColumnName13",
"testLongLongLongColumnName14", "testLongLongLongColumnName15",
"testLongLongLongColumnName16", "testLongLongLongColumnName17",
"testLongLongLongColumnName18", "testLongLongLongColumnName19",
"testLongLongLongColumnName20", "testLongLongLongColumnName21",
"testLongLongLongColumnName22", "testLongLongLongColumnName23",
"testLongLongLongColumnName24", "testLongLongLongColumnName25",
"testLongLongLongColumnName26", "testLongLongLongColumnName27",
"testLongLongLongColumnName28", "testLongLongLongColumnName29",
"testLongLongLongColumnName30", "testLongLongLongColumnName31",
"testLongLongLongColumnName32", "testLongLongLongColumnName33",
"testLongLongLongColumnName34", "testLongLongLongColumnName35",
"testLongLongLongColumnName36", "testLongLongLongColumnName37",
"testLongLongLongColumnName38", "testLongLongLongColumnName39",
"testLongLongLongColumnName40", "testLongLongLongColumnName41",
"testLongLongLongColumnName42", "testLongLongLongColumnName43",
"testLongLongLongColumnName44", "testLongLongLongColumnName45",
"testLongLongLongColumnName46", "testLongLongLongColumnName47",
"testLongLongLongColumnName48", "testLongLongLongColumnName49",
"testLongLongLongColumnName50", "testLongLongLongColumnName51",
"testLongLongLongColumnName52", "testLongLongLongColumnName53",
"testLongLongLongColumnName54", "testLongLongLongColumnName55",
"testLongLongLongColumnName56", "testLongLongLongColumnName57",
"testLongLongLongColumnName58", "testLongLongLongColumnName59",
"testLongLongLongColumnName60", "testLongLongLongColumnName61",
"testLongLongLongColumnName62", "testLongLongLongColumnName63",
"testLongLongLongColumnName64", "testLongLongLongColumnName65",
"testLongLongLongColumnName66", "testLongLongLongColumnName67",
"testLongLongLongColumnName68", "testLongLongLongColumnName69",
"testLongLongLongColumnName70", "testLongLongLongColumnName71",
"testLongLongLongColumnName72", "testLongLongLongColumnName73",
"testLongLongLongColumnName74", "testLongLongLongColumnName75",
"testLongLongLongColumnName76", "testLongLongLongColumnName77",
"testLongLongLongColumnName78", "testLongLongLongColumnName79",
"testLongLongLongColumnName80", "testLongLongLongColumnName81",
"testLongLongLongColumnName82", "testLongLongLongColumnName83",
"testLongLongLongColumnName84", "testLongLongLongColumnName85",
"testLongLongLongColumnName86", "testLongLongLongColumnName87",
"testLongLongLongColumnName88", "testLongLongLongColumnName89",
"testLongLongLongColumnName90", "testLongLongLongColumnName91",
"testLongLongLongColumnName92", "testLongLongLongColumnName93",
"testLongLongLongColumnName94", "testLongLongLongColumnName95",
"testLongLongLongColumnName96", "testLongLongLongColumnName97",
"testLongLongLongColumnName98", "testLongLongLongColumnName99",
"testLongLongLongColumnName100",
"testLongLongLongColumnName101",
"testLongLongLongColumnName102",
"testLongLongLongColumnName103",
"testLongLongLongColumnName104",
"testLongLongLongColumnName105",
"testLongLongLongColumnName106",
"testLongLongLongColumnName107",
"testLongLongLongColumnName108",
"testLongLongLongColumnName109",
"testLongLongLongColumnName110",
"testLongLongLongColumnName111",
"testLongLongLongColumnName112",
"testLongLongLongColumnName113",
"testLongLongLongColumnName114",
"testLongLongLongColumnName115",
"testLongLongLongColumnName116",
"testLongLongLongColumnName117",
"testLongLongLongColumnName118",
"testLongLongLongColumnName119",
"testLongLongLongColumnName120",
"testLongLongLongColumnName121",
"testLongLongLongColumnName122",
"testLongLongLongColumnName123",
"testLongLongLongColumnName124",
"testLongLongLongColumnName125",
"testLongLongLongColumnName126",
"testLongLongLongColumnName127",
"testLongLongLongColumnName128",
"testLongLongLongColumnName129",
"testLongLongLongColumnName130",
"testLongLongLongColumnName131",
"testLongLongLongColumnName132",
"testLongLongLongColumnName133",
"testLongLongLongColumnName134",
"testLongLongLongColumnName135",
"testLongLongLongColumnName136",
"testLongLongLongColumnName137",
"testLongLongLongColumnName138",
"testLongLongLongColumnName139",
"testLongLongLongColumnName140",
"testLongLongLongColumnName141",
"testLongLongLongColumnName142",
"testLongLongLongColumnName143",
"testLongLongLongColumnName144",
"testLongLongLongColumnName145",
"testLongLongLongColumnName146",
"testLongLongLongColumnName147",
"testLongLongLongColumnName148",
"testLongLongLongColumnName149",
"testLongLongLongColumnName150",
"testLongLongLongColumnName151",
"testLongLongLongColumnName152",
"testLongLongLongColumnName153",
"testLongLongLongColumnName154",
"testLongLongLongColumnName155",
"testLongLongLongColumnName156",
"testLongLongLongColumnName157",
"testLongLongLongColumnName158",
"testLongLongLongColumnName159",
"testLongLongLongColumnName160",
"testLongLongLongColumnName161",
"testLongLongLongColumnName162",
"testLongLongLongColumnName163",
"testLongLongLongColumnName164",
"testLongLongLongColumnName165",
"testLongLongLongColumnName166",
"testLongLongLongColumnName167",
"testLongLongLongColumnName168",
"testLongLongLongColumnName169",
"testLongLongLongColumnName170",
"testLongLongLongColumnName171",
"testLongLongLongColumnName172",
"testLongLongLongColumnName173",
"testLongLongLongColumnName174",
"testLongLongLongColumnName175",
"testLongLongLongColumnName176",
"testLongLongLongColumnName177",
"testLongLongLongColumnName178",
"testLongLongLongColumnName179",
"testLongLongLongColumnName180",
"testLongLongLongColumnName181",
"testLongLongLongColumnName182",
"testLongLongLongColumnName183",
"testLongLongLongColumnName184",
"testLongLongLongColumnName185",
"testLongLongLongColumnName186",
"testLongLongLongColumnName187",
"testLongLongLongColumnName188",
"testLongLongLongColumnName189",
"testLongLongLongColumnName190",
"testLongLongLongColumnName191",
"testLongLongLongColumnName192",
"testLongLongLongColumnName193",
"testLongLongLongColumnName194",
"testLongLongLongColumnName195",
"testLongLongLongColumnName196",
"testLongLongLongColumnName197",
"testLongLongLongColumnName198",
"testLongLongLongColumnName199",
"testLongLongLongColumnName200",
"testLongLongLongColumnName201",
"testLongLongLongColumnName202",
"testLongLongLongColumnName203",
"testLongLongLongColumnName204",
"testLongLongLongColumnName205",
"testLongLongLongColumnName206",
"testLongLongLongColumnName207",
"testLongLongLongColumnName208",
"testLongLongLongColumnName209",
"testLongLongLongColumnName210",
"testLongLongLongColumnName211",
"testLongLongLongColumnName212",
"testLongLongLongColumnName213",
"testLongLongLongColumnName214",
"testLongLongLongColumnName215",
"testLongLongLongColumnName216",
"testLongLongLongColumnName217",
"testLongLongLongColumnName218",
"testLongLongLongColumnName219",
"testLongLongLongColumnName220",
"testLongLongLongColumnName221",
"testLongLongLongColumnName222",
"testLongLongLongColumnName223",
"testLongLongLongColumnName224",
"testLongLongLongColumnName225",
"testLongLongLongColumnName226",
"testLongLongLongColumnName227",
"testLongLongLongColumnName228",
"testLongLongLongColumnName229",
"testLongLongLongColumnName230",
"testLongLongLongColumnName231",
"testLongLongLongColumnName232",
"testLongLongLongColumnName233",
"testLongLongLongColumnName234",
"testLongLongLongColumnName235",
"testLongLongLongColumnName236",
"testLongLongLongColumnName237",
"testLongLongLongColumnName238",
"testLongLongLongColumnName239",
"testLongLongLongColumnName240",
"testLongLongLongColumnName241",
"testLongLongLongColumnName242",
"testLongLongLongColumnName243",
"testLongLongLongColumnName244",
"testLongLongLongColumnName245",
"testLongLongLongColumnName246",
"testLongLongLongColumnName247",
"testLongLongLongColumnName248",
"testLongLongLongColumnName249",
"testLongLongLongColumnName250",
"testLongLongLongColumnName251",
"testLongLongLongColumnName252",
"testLongLongLongColumnName253",
"testLongLongLongColumnName254",
"testLongLongLongColumnName255",
"testLongLongLongColumnName256",
"testLongLongLongColumnName257",
"testLongLongLongColumnName258",
"testLongLongLongColumnName259",
"testLongLongLongColumnName260",
"testLongLongLongColumnName261",
"testLongLongLongColumnName262",
"testLongLongLongColumnName263",
"testLongLongLongColumnName264",
"testLongLongLongColumnName265",
"testLongLongLongColumnName266",
"testLongLongLongColumnName267",
"testLongLongLongColumnName268",
"testLongLongLongColumnName269",
"testLongLongLongColumnName270",
"testLongLongLongColumnName271",
"testLongLongLongColumnName272",
"testLongLongLongColumnName273",
"testLongLongLongColumnName274",
"testLongLongLongColumnName275",
"testLongLongLongColumnName276",
"testLongLongLongColumnName277",
"testLongLongLongColumnName278",
"testLongLongLongColumnName279",
"testLongLongLongColumnName280",
"testLongLongLongColumnName281",
"testLongLongLongColumnName282",
"testLongLongLongColumnName283",
"testLongLongLongColumnName284",
"testLongLongLongColumnName285",
"testLongLongLongColumnName286",
"testLongLongLongColumnName287",
"testLongLongLongColumnName288",
"testLongLongLongColumnName289",
"testLongLongLongColumnName290",
"testLongLongLongColumnName291",
"testLongLongLongColumnName292",
"testLongLongLongColumnName293",
"testLongLongLongColumnName294",
"testLongLongLongColumnName295",
"testLongLongLongColumnName296",
"testLongLongLongColumnName297",
"testLongLongLongColumnName298",
"testLongLongLongColumnName299",
"testLongLongLongColumnName300",
"testLongLongLongColumnName301",
"testLongLongLongColumnName302",
"testLongLongLongColumnName303",
"testLongLongLongColumnName304",
"testLongLongLongColumnName305",
"testLongLongLongColumnName306",
"testLongLongLongColumnName307",
"testLongLongLongColumnName308",
"testLongLongLongColumnName309",
"testLongLongLongColumnName310",
"testLongLongLongColumnName311",
"testLongLongLongColumnName312",
"testLongLongLongColumnName313",
"testLongLongLongColumnName314",
"testLongLongLongColumnName315",
"testLongLongLongColumnName316",
"testLongLongLongColumnName317",
"testLongLongLongColumnName318",
"testLongLongLongColumnName319",
"testLongLongLongColumnName320",
"testLongLongLongColumnName321",
"testLongLongLongColumnName322",
"testLongLongLongColumnName323",
"testLongLongLongColumnName324",
"testLongLongLongColumnName325",
"testLongLongLongColumnName326",
"testLongLongLongColumnName327",
"testLongLongLongColumnName328",
"testLongLongLongColumnName329",
"testLongLongLongColumnName330",
"testLongLongLongColumnName331",
"testLongLongLongColumnName332",
"testLongLongLongColumnName333",
"testLongLongLongColumnName334",
"testLongLongLongColumnName335",
"testLongLongLongColumnName336",
"testLongLongLongColumnName337",
"testLongLongLongColumnName338",
"testLongLongLongColumnName339",
"testLongLongLongColumnName340",
"testLongLongLongColumnName341",
"testLongLongLongColumnName342",
"testLongLongLongColumnName343",
"testLongLongLongColumnName344",
"testLongLongLongColumnName345",
"testLongLongLongColumnName346",
"testLongLongLongColumnName347",
"testLongLongLongColumnName348",
"testLongLongLongColumnName349",
"testLongLongLongColumnName350");
System.out.println(asList);
}
}
Related
Trying to retrieve a userCertificate from Windows Active directory using java CertStore but no success
I am trying to retrive a userCertificate associated with a domain name from Windows Active Directory but having difficulties by using Java API for example when I use 'ldapsearch' command tool, I am able to retrieve the certificate as you can see below ldapsearch -h 192.xx.2.xx -D "CN=Administrator,CN=Users,DC=mmo,DC=co,DC=ca" -w Password -b "CN=rsa0,CN=Users,DC=mmo,DC=co,DC=ca" "userCertificate" # extended LDIF # # LDAPv3 # base <CN=rsa0,CN=Users,DC=mmo,DC=co,DC=ca> with scope subtree # filter: (objectclass=*) # requesting: userCertificate # # rsa0, Users, mmo.co.ca dn: CN=rsa0,CN=Users,DC=mmo,DC=co,DC=ca userCertificate:: MIIDbTCCAlWgAwIBAgIEFbvHazANBgkqhkiG9w0BAQsFADBnMQswCQYDVQQG EwJ1azEQMA4GA1UECBMHVW5rbm93bjEWMBQGA1UEBxMNcmlja21hbnN3b3J0aDERMA8GA1UEChMId m9jYWxpbmsxDDAKBgNVBAsTA2lwczENMAsGA1UEAxMEcnNhMDAeFw0xOTExMjExNDUwNDNaFw0yOT ExMTgxNDUwNDNaMGcxCzAJBgNVBAYTAnVrMRAwDgYDVQQIEwdVbmtub3duMRYwFAYDVQQHEw1yaWN rbWFuc3dvcnRoMREwDwYDVQQKEwh2b2NhbGluazEMMAoGA1UECxMDaXBzMQ0wCwYDVQQDEwRyc2Ew MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEA0R0yCr0uU80oFG3Zg0vTbR4NSR2St+w4f DOmoHQ27z1Q2JwhiNh1XkwC8MtVeGzRJw0pe+jXc2fMVbIqONHImOZuX6p1UMWof7fxMAIEfWq98u OqVbvbXVLeCE9+BJGsOaiJ70Q76e8tDNTH3vg1orXAvb0O7R0Vz9I0iXjJKzUtmFEBju/m3eoa+WI 6OaBr64hJw7oz1CzPIKj0OcapFypFjr4+QKpRsHA4Nn21XrYSsT00Dk9SVK3NTjHm661crvTR6jSx j1GrCpVdQGCQ25a2RrHIi0cmclNJmy81PngW0cpdO3p9ZsZ2vPUy5/CNbVwqPEPSlIjJtVa0Xf9O1 QIDAQABoyEwHzAdBgNVHQ4EFgQU1U7VOM/vAHL0pqZgi6TS1f0SAt8wDQYJKoZIhvcNAQELBQADgg EBAC7fK81BHDbF8PSQO2YznZtnzCMs46TwCezyqIFzQljwYA5wxKIytV7GtV4aEUrfIFIeQIMW812 pMol9xIotULSl1I/2WI18QTIJfRAnkCZZPJIa9MU6nEGCouF1LwW9bzQzHOeI07NgCIyBryojXaxc L/epJtVxYialdI9mBWB8KDytINrylOcP9sXYaUtkOOiU7h0sBF9XBfzXgtTkF8pB7ObX9YJnyvzTn y2zVfeZD8Q7BtDL7AvIDcUjoHtYx5B0oD86aCNTSShmtB/ZEyqt8Kynqf+QUYQIWA3wVFjgZjCCwc NxiXuf6H8KGW8hP+ETKnc7u9XP9GCHINf9K0I= # search result search: 2 result: 0 Success # numResponses: 2 # numEntries: 1 however when I try to use the Java program, I am unable to retrive it, below is the sample java program package CertStore; import javax.naming.AuthenticationException; import javax.naming.AuthenticationNotSupportedException; import javax.naming.Context; import javax.naming.NamingException; import javax.naming.directory.DirContext; import javax.naming.directory.InitialDirContext; import javax.security.auth.x500.X500Principal; import java.security.cert.*; import java.util.*; import java.io.*; class CertStoreTest { CertStoreTest() { try { LDAPCertStoreParameters lcsp = new LDAPCertStoreParameters("192.xx.2.xx", 389); String referenceID = "CN=rsa0,CN=Users,DC=bmo,DC=co,DC=ca"; X509CertSelector xcs = new X509CertSelector(); xcs.setSubject(referenceID); CertStore cs = CertStore.getInstance("LDAP", lcsp); Collection certificates = cs.getCertificates((CertSelector)xcs); System.out.println("size: "+ certificates.size()); Iterator certificate = certificates.iterator(); while(certificate.hasNext()) { System.out.println(certificate.next()); } } catch(Exception e) { e.printStackTrace(); } } public static void main(String[] args) { System.out.println("main() called."); CertStoreTest test = new CertStoreTest(); } } When I run this program, I get the size as 0 where I am expecting as 1. main() called. size: 0 I also have openldap running on a linux system, and in the above java program if I point to that server and with appropriate domain name information, java is able to pull the certificate associated with that domain name. Not sure what I am missing when I try to retrive certificate from Windows Active Directory. Can anyone shed some light on this as I have been stuck for few days now.
Lag a value with Datavec transform
I'm trying to figure out how to get a lagged value of a field as part of a datavec transform step. Here is a little example built off the dl4j examples: import org.datavec.api.records.reader.RecordReader; import org.datavec.api.records.reader.impl.csv.CSVRecordReader; import org.datavec.api.split.FileSplit; import org.datavec.api.transform.TransformProcess; import org.datavec.api.transform.schema.Schema; import org.datavec.api.writable.Writable; import org.datavec.local.transforms.LocalTransformExecutor; import org.nd4j.linalg.io.ClassPathResource; import java.io.File; import java.util.ArrayList; import java.util.Arrays; import java.util.List; public class myExample { public static void main(String[] args) throws Exception { Schema inputDataSchema = new Schema.Builder() .addColumnString("DateTimeString") .addColumnsString("CustomerID", "MerchantID") .addColumnInteger("NumItemsInTransaction") .addColumnCategorical("MerchantCountryCode", Arrays.asList("USA","CAN","FR","MX")) .addColumnDouble("TransactionAmountUSD",0.0,null,false,false) //$0.0 or more, no maximum limit, no NaN and no Infinite values .addColumnCategorical("FraudLabel", Arrays.asList("Fraud","Legit")) .build(); TransformProcess tp = new TransformProcess.Builder(inputDataSchema) .removeAllColumnsExceptFor("DateTimeString","TransactionAmountUSD") .build(); File inputFile = new ClassPathResource("BasicDataVecExample/exampledata.csv").getFile(); //Define input reader and output writer: RecordReader rr = new CSVRecordReader(1, ','); rr.initialize(new FileSplit(inputFile)); //Process the data: List<List<Writable>> originalData = new ArrayList<>(); while(rr.hasNext()){ originalData.add(rr.next()); } List<List<Writable>> processedData = LocalTransformExecutor.execute(originalData, tp); int numRows = 5; System.out.println("=== BEFORE ==="); for (int i=0;i<=numRows;i++) { System.out.println(originalData.get(i)); } System.out.println("=== AFTER ==="); for (int i=0;i<=numRows;i++) { System.out.println(processedData.get(i)); } } } I'm looking to get a lagged value (ordered by DateTimeString) of TransactionAmountUSD I was looking at sequenceMovingWindowReduce from the docs but could not figure it out. Also could not really find any examples in the examples repo that seemed to do anything similar to this.
Thanks to some help from Alex Black on the dl4j gitter channel i can post my own answer. Tip to anyone new to dl4j - there is lots of good things to look at in the tests code too in addition to the examples and tutorials. Here is my updated toy example code: package org.datavec.transform.basic; import org.datavec.api.records.reader.RecordReader; import org.datavec.api.records.reader.impl.csv.CSVRecordReader; import org.datavec.api.split.FileSplit; import org.datavec.api.transform.TransformProcess; import org.datavec.api.transform.schema.Schema; import org.datavec.api.transform.sequence.comparator.NumericalColumnComparator; import org.datavec.api.transform.transform.sequence.SequenceOffsetTransform; import org.datavec.api.writable.Writable; import org.datavec.local.transforms.LocalTransformExecutor; import org.joda.time.DateTimeZone; import org.nd4j.linalg.io.ClassPathResource; import java.io.File; import java.util.ArrayList; import java.util.Arrays; import java.util.List; public class myExample { public static void main(String[] args) throws Exception { Schema inputDataSchema = new Schema.Builder() .addColumnString("DateTimeString") .addColumnsString("CustomerID", "MerchantID") .addColumnInteger("NumItemsInTransaction") .addColumnCategorical("MerchantCountryCode", Arrays.asList("USA","CAN","FR","MX")) .addColumnDouble("TransactionAmountUSD",0.0,null,false,false) //$0.0 or more, no maximum limit, no NaN and no Infinite values .addColumnCategorical("FraudLabel", Arrays.asList("Fraud","Legit")) .build(); TransformProcess tp = new TransformProcess.Builder(inputDataSchema) .removeAllColumnsExceptFor("CustomerID", "DateTimeString","TransactionAmountUSD") .stringToTimeTransform("DateTimeString","YYYY-MM-DD HH:mm:ss.SSS", DateTimeZone.UTC) .convertToSequence(Arrays.asList("CustomerID"), new NumericalColumnComparator("DateTimeString")) .offsetSequence(Arrays.asList("TransactionAmountUSD"),1, SequenceOffsetTransform.OperationType.NewColumn) .build(); File inputFile = new ClassPathResource("BasicDataVecExample/exampledata.csv").getFile(); //Define input reader and output writer: RecordReader rr = new CSVRecordReader(0, ','); rr.initialize(new FileSplit(inputFile)); //Process the data: List<List<Writable>> originalData = new ArrayList<>(); while(rr.hasNext()){ originalData.add(rr.next()); } List<List<List<Writable>>> processedData = LocalTransformExecutor.executeToSequence(originalData, tp); System.out.println("=== BEFORE ==="); for (int i=0;i<originalData.size();i++) { System.out.println(originalData.get(i)); } System.out.println("=== AFTER ==="); for (int i=0;i<processedData.size();i++) { System.out.println(processedData.get(i)); } } } This should give some output like below where you can see a now col with the last value for the transaction amount for each customer id is added. "C:\Program Files\Java\jdk1.8.0_201\bin\java.exe" "-javaagent:C:\Program Files\JetBrains\IntelliJ IDEA Community Edition 2019.1\lib\idea_rt.jar=56103:C:\Program Files\JetBrains\IntelliJ IDEA Community Edition 2019.1\bin" -Dfile.encoding=UTF-8 -classpath "C:\Program Files\Java\jdk1.8.0_201\jre\lib\charsets.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\deploy.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\ext\access-bridge-64.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\ext\cldrdata.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\ext\dnsns.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\ext\jaccess.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\ext\jfxrt.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\ext\localedata.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\ext\nashorn.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\ext\sunec.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\ext\sunjce_provider.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\ext\sunmscapi.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\ext\sunpkcs11.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\ext\zipfs.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\javaws.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\jce.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\jfr.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\jfxswt.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\jsse.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\management-agent.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\plugin.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\resources.jar;C:\Program Files\Java\jdk1.8.0_201\jre\lib\rt.jar;C:\Users\amaguire\Documents\java_learning\dl4j-examples\datavec-examples\target\classes;C:\Users\amaguire\.m2\repository\org\datavec\datavec-api\1.0.0-beta3\datavec-api-1.0.0-beta3.jar;C:\Users\amaguire\.m2\repository\org\jetbrains\annotations\13.0\annotations-13.0.jar;C:\Users\amaguire\.m2\repository\org\apache\commons\commons-lang3\3.6\commons-lang3-3.6.jar;C:\Users\amaguire\.m2\repository\commons-io\commons-io\2.5\commons-io-2.5.jar;C:\Users\amaguire\.m2\repository\commons-codec\commons-codec\1.10\commons-codec-1.10.jar;C:\Users\amaguire\.m2\repository\org\slf4j\slf4j-api\1.7.21\slf4j-api-1.7.21.jar;C:\Users\amaguire\.m2\repository\joda-time\joda-time\2.2\joda-time-2.2.jar;C:\Users\amaguire\.m2\repository\org\yaml\snakeyaml\1.12\snakeyaml-1.12.jar;C:\Users\amaguire\.m2\repository\org\nd4j\jackson\1.0.0-beta3\jackson-1.0.0-beta3.jar;C:\Users\amaguire\.m2\repository\org\codehaus\woodstox\stax2-api\3.1.4\stax2-api-3.1.4.jar;C:\Users\amaguire\.m2\repository\org\freemarker\freemarker\2.3.23\freemarker-2.3.23.jar;C:\Users\amaguire\.m2\repository\org\nd4j\nd4j-common\1.0.0-beta3\nd4j-common-1.0.0-beta3.jar;C:\Users\amaguire\.m2\repository\org\nd4j\nd4j-api\1.0.0-beta3\nd4j-api-1.0.0-beta3.jar;C:\Users\amaguire\.m2\repository\com\google\flatbuffers\flatbuffers-java\1.9.0\flatbuffers-java-1.9.0.jar;C:\Users\amaguire\.m2\repository\com\github\os72\protobuf-java-shaded-351\0.9\protobuf-java-shaded-351-0.9.jar;C:\Users\amaguire\.m2\repository\com\github\os72\protobuf-java-util-shaded-351\0.9\protobuf-java-util-shaded-351-0.9.jar;C:\Users\amaguire\.m2\repository\com\google\code\gson\gson\2.7\gson-2.7.jar;C:\Users\amaguire\.m2\repository\org\objenesis\objenesis\2.6\objenesis-2.6.jar;C:\Users\amaguire\.m2\repository\uk\com\robust-it\cloning\1.9.3\cloning-1.9.3.jar;C:\Users\amaguire\.m2\repository\org\nd4j\nd4j-buffer\1.0.0-beta3\nd4j-buffer-1.0.0-beta3.jar;C:\Users\amaguire\.m2\repository\org\bytedeco\javacpp\1.4.3\javacpp-1.4.3.jar;C:\Users\amaguire\.m2\repository\org\nd4j\nd4j-context\1.0.0-beta3\nd4j-context-1.0.0-beta3.jar;C:\Users\amaguire\.m2\repository\net\ericaro\neoitertools\1.0.0\neoitertools-1.0.0.jar;C:\Users\amaguire\.m2\repository\com\clearspring\analytics\stream\2.7.0\stream-2.7.0.jar;C:\Users\amaguire\.m2\repository\net\sf\opencsv\opencsv\2.3\opencsv-2.3.jar;C:\Users\amaguire\.m2\repository\com\tdunning\t-digest\3.2\t-digest-3.2.jar;C:\Users\amaguire\.m2\repository\it\unimi\dsi\fastutil\6.5.7\fastutil-6.5.7.jar;C:\Users\amaguire\.m2\repository\org\datavec\datavec-spark_2.11\1.0.0-beta3_spark_1\datavec-spark_2.11-1.0.0-beta3_spark_1.jar;C:\Users\amaguire\.m2\repository\org\scala-lang\scala-library\2.11.12\scala-library-2.11.12.jar;C:\Users\amaguire\.m2\repository\org\scala-lang\scala-reflect\2.11.12\scala-reflect-2.11.12.jar;C:\Users\amaguire\.m2\repository\org\codehaus\jackson\jackson-core-asl\1.9.13\jackson-core-asl-1.9.13.jar;C:\Users\amaguire\.m2\repository\org\codehaus\jackson\jackson-mapper-asl\1.9.13\jackson-mapper-asl-1.9.13.jar;C:\Users\amaguire\.m2\repository\org\apache\spark\spark-sql_2.11\1.6.3\spark-sql_2.11-1.6.3.jar;C:\Users\amaguire\.m2\repository\org\apache\spark\spark-core_2.11\1.6.3\spark-core_2.11-1.6.3.jar;C:\Users\amaguire\.m2\repository\org\apache\avro\avro-mapred\1.7.7\avro-mapred-1.7.7-hadoop2.jar;C:\Users\amaguire\.m2\repository\org\apache\avro\avro-ipc\1.7.7\avro-ipc-1.7.7.jar;C:\Users\amaguire\.m2\repository\org\apache\avro\avro\1.7.7\avro-1.7.7.jar;C:\Users\amaguire\.m2\repository\org\apache\avro\avro-ipc\1.7.7\avro-ipc-1.7.7-tests.jar;C:\Users\amaguire\.m2\repository\com\twitter\chill_2.11\0.5.0\chill_2.11-0.5.0.jar;C:\Users\amaguire\.m2\repository\com\esotericsoftware\kryo\kryo\2.21\kryo-2.21.jar;C:\Users\amaguire\.m2\repository\com\esotericsoftware\reflectasm\reflectasm\1.07\reflectasm-1.07-shaded.jar;C:\Users\amaguire\.m2\repository\com\esotericsoftware\minlog\minlog\1.2\minlog-1.2.jar;C:\Users\amaguire\.m2\repository\com\twitter\chill-java\0.5.0\chill-java-0.5.0.jar;C:\Users\amaguire\.m2\repository\org\apache\xbean\xbean-asm5-shaded\4.4\xbean-asm5-shaded-4.4.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-client\2.2.0\hadoop-client-2.2.0.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-common\2.2.0\hadoop-common-2.2.0.jar;C:\Users\amaguire\.m2\repository\commons-cli\commons-cli\1.2\commons-cli-1.2.jar;C:\Users\amaguire\.m2\repository\org\apache\commons\commons-math\2.1\commons-math-2.1.jar;C:\Users\amaguire\.m2\repository\xmlenc\xmlenc\0.52\xmlenc-0.52.jar;C:\Users\amaguire\.m2\repository\commons-configuration\commons-configuration\1.6\commons-configuration-1.6.jar;C:\Users\amaguire\.m2\repository\commons-digester\commons-digester\1.8\commons-digester-1.8.jar;C:\Users\amaguire\.m2\repository\commons-beanutils\commons-beanutils\1.7.0\commons-beanutils-1.7.0.jar;C:\Users\amaguire\.m2\repository\commons-beanutils\commons-beanutils-core\1.8.0\commons-beanutils-core-1.8.0.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-auth\2.2.0\hadoop-auth-2.2.0.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-hdfs\2.2.0\hadoop-hdfs-2.2.0.jar;C:\Users\amaguire\.m2\repository\org\mortbay\jetty\jetty-util\6.1.26\jetty-util-6.1.26.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-app\2.2.0\hadoop-mapreduce-client-app-2.2.0.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-common\2.2.0\hadoop-mapreduce-client-common-2.2.0.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-yarn-client\2.2.0\hadoop-yarn-client-2.2.0.jar;C:\Users\amaguire\.m2\repository\com\sun\jersey\jersey-test-framework\jersey-test-framework-grizzly2\1.9\jersey-test-framework-grizzly2-1.9.jar;C:\Users\amaguire\.m2\repository\com\sun\jersey\jersey-test-framework\jersey-test-framework-core\1.9\jersey-test-framework-core-1.9.jar;C:\Users\amaguire\.m2\repository\com\sun\jersey\jersey-client\1.9\jersey-client-1.9.jar;C:\Users\amaguire\.m2\repository\com\sun\jersey\jersey-grizzly2\1.9\jersey-grizzly2-1.9.jar;C:\Users\amaguire\.m2\repository\org\glassfish\grizzly\grizzly-http\2.1.2\grizzly-http-2.1.2.jar;C:\Users\amaguire\.m2\repository\org\glassfish\grizzly\grizzly-framework\2.1.2\grizzly-framework-2.1.2.jar;C:\Users\amaguire\.m2\repository\org\glassfish\gmbal\gmbal-api-only\3.0.0-b023\gmbal-api-only-3.0.0-b023.jar;C:\Users\amaguire\.m2\repository\org\glassfish\external\management-api\3.0.0-b012\management-api-3.0.0-b012.jar;C:\Users\amaguire\.m2\repository\org\glassfish\grizzly\grizzly-http-server\2.1.2\grizzly-http-server-2.1.2.jar;C:\Users\amaguire\.m2\repository\org\glassfish\grizzly\grizzly-rcm\2.1.2\grizzly-rcm-2.1.2.jar;C:\Users\amaguire\.m2\repository\org\glassfish\grizzly\grizzly-http-servlet\2.1.2\grizzly-http-servlet-2.1.2.jar;C:\Users\amaguire\.m2\repository\org\glassfish\javax.servlet\3.1\javax.servlet-3.1.jar;C:\Users\amaguire\.m2\repository\com\sun\jersey\jersey-json\1.9\jersey-json-1.9.jar;C:\Users\amaguire\.m2\repository\org\codehaus\jettison\jettison\1.1\jettison-1.1.jar;C:\Users\amaguire\.m2\repository\stax\stax-api\1.0.1\stax-api-1.0.1.jar;C:\Users\amaguire\.m2\repository\org\codehaus\jackson\jackson-jaxrs\1.8.3\jackson-jaxrs-1.8.3.jar;C:\Users\amaguire\.m2\repository\org\codehaus\jackson\jackson-xc\1.8.3\jackson-xc-1.8.3.jar;C:\Users\amaguire\.m2\repository\com\sun\jersey\contribs\jersey-guice\1.9\jersey-guice-1.9.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-yarn-server-common\2.2.0\hadoop-yarn-server-common-2.2.0.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-shuffle\2.2.0\hadoop-mapreduce-client-shuffle-2.2.0.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-yarn-api\2.2.0\hadoop-yarn-api-2.2.0.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-core\2.2.0\hadoop-mapreduce-client-core-2.2.0.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-yarn-common\2.2.0\hadoop-yarn-common-2.2.0.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-jobclient\2.2.0\hadoop-mapreduce-client-jobclient-2.2.0.jar;C:\Users\amaguire\.m2\repository\org\apache\hadoop\hadoop-annotations\2.2.0\hadoop-annotations-2.2.0.jar;C:\Users\amaguire\.m2\repository\org\apache\spark\spark-launcher_2.11\1.6.3\spark-launcher_2.11-1.6.3.jar;C:\Users\amaguire\.m2\repository\org\apache\spark\spark-network-common_2.11\1.6.3\spark-network-common_2.11-1.6.3.jar;C:\Users\amaguire\.m2\repository\org\apache\spark\spark-network-shuffle_2.11\1.6.3\spark-network-shuffle_2.11-1.6.3.jar;C:\Users\amaguire\.m2\repository\org\fusesource\leveldbjni\leveldbjni-all\1.8\leveldbjni-all-1.8.jar;C:\Users\amaguire\.m2\repository\org\apache\spark\spark-unsafe_2.11\1.6.3\spark-unsafe_2.11-1.6.3.jar;C:\Users\amaguire\.m2\repository\net\java\dev\jets3t\jets3t\0.7.1\jets3t-0.7.1.jar;C:\Users\amaguire\.m2\repository\commons-httpclient\commons-httpclient\3.1\commons-httpclient-3.1.jar;C:\Users\amaguire\.m2\repository\org\eclipse\jetty\orbit\javax.servlet\3.0.0.v201112011016\javax.servlet-3.0.0.v201112011016.jar;C:\Users\amaguire\.m2\repository\com\google\code\findbugs\jsr305\1.3.9\jsr305-1.3.9.jar;C:\Users\amaguire\.m2\repository\org\slf4j\jul-to-slf4j\1.7.10\jul-to-slf4j-1.7.10.jar;C:\Users\amaguire\.m2\repository\org\slf4j\jcl-over-slf4j\1.7.10\jcl-over-slf4j-1.7.10.jar;C:\Users\amaguire\.m2\repository\log4j\log4j\1.2.17\log4j-1.2.17.jar;C:\Users\amaguire\.m2\repository\org\slf4j\slf4j-log4j12\1.7.10\slf4j-log4j12-1.7.10.jar;C:\Users\amaguire\.m2\repository\com\ning\compress-lzf\1.0.3\compress-lzf-1.0.3.jar;C:\Users\amaguire\.m2\repository\org\xerial\snappy\snappy-java\1.1.2.6\snappy-java-1.1.2.6.jar;C:\Users\amaguire\.m2\repository\net\jpountz\lz4\lz4\1.3.0\lz4-1.3.0.jar;C:\Users\amaguire\.m2\repository\org\roaringbitmap\RoaringBitmap\0.5.11\RoaringBitmap-0.5.11.jar;C:\Users\amaguire\.m2\repository\org\json4s\json4s-jackson_2.11\3.2.10\json4s-jackson_2.11-3.2.10.jar;C:\Users\amaguire\.m2\repository\org\json4s\json4s-core_2.11\3.2.10\json4s-core_2.11-3.2.10.jar;C:\Users\amaguire\.m2\repository\org\json4s\json4s-ast_2.11\3.2.10\json4s-ast_2.11-3.2.10.jar;C:\Users\amaguire\.m2\repository\org\scala-lang\scalap\2.11.0\scalap-2.11.0.jar;C:\Users\amaguire\.m2\repository\org\scala-lang\scala-compiler\2.11.0\scala-compiler-2.11.0.jar;C:\Users\amaguire\.m2\repository\org\scala-lang\modules\scala-xml_2.11\1.0.1\scala-xml_2.11-1.0.1.jar;C:\Users\amaguire\.m2\repository\org\scala-lang\modules\scala-parser-combinators_2.11\1.0.1\scala-parser-combinators_2.11-1.0.1.jar;C:\Users\amaguire\.m2\repository\com\sun\jersey\jersey-server\1.9\jersey-server-1.9.jar;C:\Users\amaguire\.m2\repository\asm\asm\3.1\asm-3.1.jar;C:\Users\amaguire\.m2\repository\com\sun\jersey\jersey-core\1.9\jersey-core-1.9.jar;C:\Users\amaguire\.m2\repository\org\apache\mesos\mesos\0.21.1\mesos-0.21.1-shaded-protobuf.jar;C:\Users\amaguire\.m2\repository\io\netty\netty-all\4.0.29.Final\netty-all-4.0.29.Final.jar;C:\Users\amaguire\.m2\repository\io\dropwizard\metrics\metrics-core\3.1.2\metrics-core-3.1.2.jar;C:\Users\amaguire\.m2\repository\io\dropwizard\metrics\metrics-jvm\3.1.2\metrics-jvm-3.1.2.jar;C:\Users\amaguire\.m2\repository\io\dropwizard\metrics\metrics-json\3.1.2\metrics-json-3.1.2.jar;C:\Users\amaguire\.m2\repository\io\dropwizard\metrics\metrics-graphite\3.1.2\metrics-graphite-3.1.2.jar;C:\Users\amaguire\.m2\repository\com\fasterxml\jackson\module\jackson-module-scala_2.11\2.5.1\jackson-module-scala_2.11-2.5.1.jar;C:\Users\amaguire\.m2\repository\com\thoughtworks\paranamer\paranamer\2.6\paranamer-2.6.jar;C:\Users\amaguire\.m2\repository\org\apache\ivy\ivy\2.4.0\ivy-2.4.0.jar;C:\Users\amaguire\.m2\repository\oro\oro\2.0.8\oro-2.0.8.jar;C:\Users\amaguire\.m2\repository\org\tachyonproject\tachyon-client\0.8.2\tachyon-client-0.8.2.jar;C:\Users\amaguire\.m2\repository\org\tachyonproject\tachyon-underfs-hdfs\0.8.2\tachyon-underfs-hdfs-0.8.2.jar;C:\Users\amaguire\.m2\repository\org\tachyonproject\tachyon-underfs-s3\0.8.2\tachyon-underfs-s3-0.8.2.jar;C:\Users\amaguire\.m2\repository\org\tachyonproject\tachyon-underfs-local\0.8.2\tachyon-underfs-local-0.8.2.jar;C:\Users\amaguire\.m2\repository\net\razorvine\pyrolite\4.9\pyrolite-4.9.jar;C:\Users\amaguire\.m2\repository\net\sf\py4j\py4j\0.9\py4j-0.9.jar;C:\Users\amaguire\.m2\repository\org\apache\spark\spark-catalyst_2.11\1.6.3\spark-catalyst_2.11-1.6.3.jar;C:\Users\amaguire\.m2\repository\org\codehaus\janino\janino\2.7.8\janino-2.7.8.jar;C:\Users\amaguire\.m2\repository\org\codehaus\janino\commons-compiler\2.7.8\commons-compiler-2.7.8.jar;C:\Users\amaguire\.m2\repository\org\apache\parquet\parquet-column\1.7.0\parquet-column-1.7.0.jar;C:\Users\amaguire\.m2\repository\org\apache\parquet\parquet-common\1.7.0\parquet-common-1.7.0.jar;C:\Users\amaguire\.m2\repository\org\apache\parquet\parquet-encoding\1.7.0\parquet-encoding-1.7.0.jar;C:\Users\amaguire\.m2\repository\org\apache\parquet\parquet-generator\1.7.0\parquet-generator-1.7.0.jar;C:\Users\amaguire\.m2\repository\org\apache\parquet\parquet-hadoop\1.7.0\parquet-hadoop-1.7.0.jar;C:\Users\amaguire\.m2\repository\org\apache\parquet\parquet-format\2.3.0-incubating\parquet-format-2.3.0-incubating.jar;C:\Users\amaguire\.m2\repository\org\apache\parquet\parquet-jackson\1.7.0\parquet-jackson-1.7.0.jar;C:\Users\amaguire\.m2\repository\org\spark-project\spark\unused\1.0.0\unused-1.0.0.jar;C:\Users\amaguire\.m2\repository\com\google\guava\guava\20.0\guava-20.0.jar;C:\Users\amaguire\.m2\repository\com\google\inject\guice\4.0\guice-4.0.jar;C:\Users\amaguire\.m2\repository\javax\inject\javax.inject\1\javax.inject-1.jar;C:\Users\amaguire\.m2\repository\aopalliance\aopalliance\1.0\aopalliance-1.0.jar;C:\Users\amaguire\.m2\repository\com\google\protobuf\protobuf-java\2.6.1\protobuf-java-2.6.1.jar;C:\Users\amaguire\.m2\repository\commons-collections\commons-collections\3.2.2\commons-collections-3.2.2.jar;C:\Users\amaguire\.m2\repository\commons-lang\commons-lang\2.6\commons-lang-2.6.jar;C:\Users\amaguire\.m2\repository\commons-net\commons-net\3.1\commons-net-3.1.jar;C:\Users\amaguire\.m2\repository\com\sun\xml\bind\jaxb-core\2.2.11\jaxb-core-2.2.11.jar;C:\Users\amaguire\.m2\repository\com\sun\xml\bind\jaxb-impl\2.2.11\jaxb-impl-2.2.11.jar;C:\Users\amaguire\.m2\repository\com\typesafe\akka\akka-actor_2.11\2.3.16\akka-actor_2.11-2.3.16.jar;C:\Users\amaguire\.m2\repository\com\typesafe\akka\akka-remote_2.11\2.3.16\akka-remote_2.11-2.3.16.jar;C:\Users\amaguire\.m2\repository\org\uncommons\maths\uncommons-maths\1.2.2a\uncommons-maths-1.2.2a.jar;C:\Users\amaguire\.m2\repository\com\typesafe\akka\akka-slf4j_2.11\2.3.16\akka-slf4j_2.11-2.3.16.jar;C:\Users\amaguire\.m2\repository\io\netty\netty\3.10.4.Final\netty-3.10.4.Final.jar;C:\Users\amaguire\.m2\repository\com\fasterxml\jackson\core\jackson-core\2.5.1\jackson-core-2.5.1.jar;C:\Users\amaguire\.m2\repository\com\fasterxml\jackson\core\jackson-databind\2.5.1\jackson-databind-2.5.1.jar;C:\Users\amaguire\.m2\repository\com\fasterxml\jackson\core\jackson-annotations\2.5.1\jackson-annotations-2.5.1.jar;C:\Users\amaguire\.m2\repository\javax\servlet\javax.servlet-api\3.1.0\javax.servlet-api-3.1.0.jar;C:\Users\amaguire\.m2\repository\org\apache\commons\commons-compress\1.16.1\commons-compress-1.16.1.jar;C:\Users\amaguire\.m2\repository\org\apache\commons\commons-math3\3.5\commons-math3-3.5.jar;C:\Users\amaguire\.m2\repository\org\apache\curator\curator-recipes\2.8.0\curator-recipes-2.8.0.jar;C:\Users\amaguire\.m2\repository\org\apache\curator\curator-framework\2.8.0\curator-framework-2.8.0.jar;C:\Users\amaguire\.m2\repository\org\apache\curator\curator-client\2.8.0\curator-client-2.8.0.jar;C:\Users\amaguire\.m2\repository\org\apache\zookeeper\zookeeper\3.4.6\zookeeper-3.4.6.jar;C:\Users\amaguire\.m2\repository\jline\jline\0.9.94\jline-0.9.94.jar;C:\Users\amaguire\.m2\repository\com\typesafe\config\1.3.0\config-1.3.0.jar;C:\Users\amaguire\.m2\repository\org\datavec\datavec-hadoop\1.0.0-beta3\datavec-hadoop-1.0.0-beta3.jar;C:\Users\amaguire\.m2\repository\org\datavec\datavec-local\1.0.0-beta3\datavec-local-1.0.0-beta3.jar;C:\Users\amaguire\.m2\repository\com\codepoetics\protonpack\1.15\protonpack-1.15.jar;C:\Users\amaguire\.m2\repository\org\datavec\datavec-arrow\1.0.0-beta3\datavec-arrow-1.0.0-beta3.jar;C:\Users\amaguire\.m2\repository\org\nd4j\nd4j-arrow\1.0.0-beta3\nd4j-arrow-1.0.0-beta3.jar;C:\Users\amaguire\.m2\repository\com\fasterxml\jackson\dataformat\jackson-dataformat-yaml\2.6.5\jackson-dataformat-yaml-2.6.5.jar;C:\Users\amaguire\.m2\repository\com\fasterxml\jackson\dataformat\jackson-dataformat-xml\2.6.5\jackson-dataformat-xml-2.6.5.jar;C:\Users\amaguire\.m2\repository\com\fasterxml\jackson\module\jackson-module-jaxb-annotations\2.6.5\jackson-module-jaxb-annotations-2.6.5.jar;C:\Users\amaguire\.m2\repository\com\fasterxml\jackson\datatype\jackson-datatype-joda\2.6.5\jackson-datatype-joda-2.6.5.jar;C:\Users\amaguire\.m2\repository\com\carrotsearch\hppc\0.8.1\hppc-0.8.1.jar;C:\Users\amaguire\.m2\repository\org\apache\arrow\arrow-vector\0.11.0\arrow-vector-0.11.0.jar;C:\Users\amaguire\.m2\repository\io\netty\netty-buffer\4.1.22.Final\netty-buffer-4.1.22.Final.jar;C:\Users\amaguire\.m2\repository\io\netty\netty-common\4.1.22.Final\netty-common-4.1.22.Final.jar;C:\Users\amaguire\.m2\repository\org\apache\arrow\arrow-memory\0.11.0\arrow-memory-0.11.0.jar;C:\Users\amaguire\.m2\repository\org\apache\arrow\arrow-format\0.11.0\arrow-format-0.11.0.jar" org.datavec.transform.basic.myExample log4j:WARN No appenders could be found for logger (io.netty.util.internal.logging.InternalLoggerFactory). log4j:WARN Please initialize the log4j system properly. log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info. === BEFORE === [2016-01-01 17:00:00.000, 830a7u3, u323fy8902, 1, USA, 100.00, Legit] [2016-01-01 18:03:01.256, 830a7u3, 9732498oeu, 3, FR, 73.20, Legit] [2016-01-03 02:53:32.231, 78ueoau32, w234e989, 1, USA, 1621.00, Fraud] [2016-01-03 09:30:16.832, t842uocd, 9732498oeu, 4, USA, 43.19, Legit] [2016-01-04 23:01:52.920, t842uocd, cza8873bm, 10, MX, 159.65, Legit] [2016-01-05 02:28:10.648, t842uocd, fgcq9803, 6, CAN, 26.33, Fraud] [2016-01-05 10:15:36.483, rgc707ke3, tn342v7, 2, USA, -0.90, Legit] === AFTER === [[1451948512920, t842uocd, 159.65, 43.19], [1451960890648, t842uocd, 26.33, 159.65]] [[1451671381256, 830a7u3, 73.20, 100.00]] [] [] Process finished with exit code 0
Issue in R arules package using java
For a university project I have to implement arules(package of R) in java. I have successfully integrated R and java using JRI. I did not understand how to get output of "inspect(Groceries[1:1])". I have tried with asString(),asString[]() but this gives me following error: Exception in thread "main" java.lang.NullPointerException at TestR.main(TestR.java:11) Also, how can implement summary(Groceries) in java? How to get output of summary in String array or string? R code: >data(Groceries) >inspect(Groceries[1:1]) >summary(Groceries) Java code: import org.rosuda.JRI.Rengine; import org.rosuda.JRI.REXP; public class TestR { public static void main(String[] args){ Rengine re = new Rengine(new String[]{"--no-save"}, false, null); re.eval("library(arules)"); re.eval("data(Groceries)"); REXP result = re.eval("inspect(Groceries[1:1])"); System.out.println(result.asString()); } }
Appears that the inspect function in pkg:arules returns NULL. The output you see is a "side-effect". You can attempt to "capture output" but this is untested since I don't have experience with this integration across languages. Try instead.: REXP result = re.eval("capture.output( inspect(Groceries[1:1]) )"); In an R console session you will get: library(arules) data("Adult") rules <- apriori(Adult) val <- inspect(rules[1000]) > str(val) NULL > val.co <- capture.output(inspect(rules[1000])) > val.co [1] " lhs rhs support confidence lift" [2] "1 {education=Some-college, " [3] " sex=Male, " [4] " capital-loss=None} => {native-country=United-States} 0.1208181 0.9256471 1.031449" But I haven't tested this in a non-interactive session. May need to muck with the file argument to capture.output, ... or it may not work at all.
java.lang.NullPointerException in OpenNLP using RJB (Ruby Java Bridge)
I am trying to use the open-nlp Ruby gem to access the Java OpenNLP processor through RJB (Ruby Java Bridge). I am not a Java programmer, so I don't know how to solve this. Any recommendations regarding resolving it, debugging it, collecting more information, etc. would be appreciated. The environment is Windows 8, Ruby 1.9.3p448, Rails 4.0.0, JDK 1.7.0-40 x586. Gems are rjb 1.4.8 and louismullie/open-nlp 0.1.4. For the record, this file runs in JRuby but I experience other problems in that environment and would prefer to stay native Ruby for now. In brief, the open-nlp gem is failing with java.lang.NullPointerException and Ruby error method missing. I hesitate to say why this is happening because I don't know, but it appears to me that the dynamic loading of the Jars file opennlp.tools.postag.POSTaggerME#1b5080a cannot be accessed, perhaps because OpenNLP::Bindings::Utils.tagWithArrayList isn't being set up correctly. OpenNLP::Bindings is Ruby. Utils, and its methods, are Java. And Utils is supposedly the "default" Jars and Class files, which may be important. What am I doing wrong, here? Thanks! The code I am running is copied straight out of github/open-nlp. My copy of the code is: class OpennlpTryer $DEBUG=false # From https://github.com/louismullie/open-nlp # Hints: Dir.pwd; File.expand_path('../../Gemfile', __FILE__); # Load the module require 'open-nlp' #require 'jruby-jars' =begin # Alias "write" to "print" to monkeypatch the NoMethod write error java_import java.io.PrintStream class PrintStream java_alias(:write, :print, [java.lang.String]) end =end =begin # Display path of jruby-jars jars... puts JRubyJars.core_jar_path # => path to jruby-core-VERSION.jar puts JRubyJars.stdlib_jar_path # => path to jruby-stdlib-VERSION.jar =end puts ENV['CLASSPATH'] # Set an alternative path to look for the JAR files. # Default is gem's bin folder. # OpenNLP.jar_path = '/path_to_jars/' OpenNLP.jar_path = File.join(ENV["GEM_HOME"],"gems/open-nlp-0.1.4/bin/") puts OpenNLP.jar_path # Set an alternative path to look for the model files. # Default is gem's bin folder. # OpenNLP.model_path = '/path_to_models/' OpenNLP.model_path = File.join(ENV["GEM_HOME"],"gems/open-nlp-0.1.4/bin/") puts OpenNLP.model_path # Pass some alternative arguments to the Java VM. # Default is ['-Xms512M', '-Xmx1024M']. # OpenNLP.jvm_args = ['-option1', '-option2'] OpenNLP.jvm_args = ['-Xms512M', '-Xmx1024M'] # Redirect VM output to log.txt OpenNLP.log_file = 'log.txt' # Set default models for a language. # OpenNLP.use :language OpenNLP.use :english # Make sure this is lower case!!!! # Simple tokenizer OpenNLP.load sent = "The death of the poet was kept from his poems." tokenizer = OpenNLP::SimpleTokenizer.new tokens = tokenizer.tokenize(sent).to_a # => %w[The death of the poet was kept from his poems .] puts "Tokenize #{tokens}" # Maximum entropy tokenizer, chunker and POS tagger OpenNLP.load chunker = OpenNLP::ChunkerME.new tokenizer = OpenNLP::TokenizerME.new tagger = OpenNLP::POSTaggerME.new sent = "The death of the poet was kept from his poems." tokens = tokenizer.tokenize(sent).to_a # => %w[The death of the poet was kept from his poems .] puts "Tokenize #{tokens}" tags = tagger.tag(tokens).to_a # => %w[DT NN IN DT NN VBD VBN IN PRP$ NNS .] puts "Tags #{tags}" chunks = chunker.chunk(tokens, tags).to_a # => %w[B-NP I-NP B-PP B-NP I-NP B-VP I-VP B-PP B-NP I-NP O] puts "Chunks #{chunks}" # Abstract Bottom-Up Parser OpenNLP.load sent = "The death of the poet was kept from his poems." parser = OpenNLP::Parser.new parse = parser.parse(sent) =begin parse.get_text.should eql sent parse.get_span.get_start.should eql 0 parse.get_span.get_end.should eql 46 parse.get_child_count.should eql 1 =end child = parse.get_children[0] child.text # => "The death of the poet was kept from his poems." child.get_child_count # => 3 child.get_head_index #=> 5 child.get_type # => "S" puts "Child: #{child}" # Maximum Entropy Name Finder* OpenNLP.load # puts File.expand_path('.', __FILE__) text = File.read('./spec/sample.txt').gsub!("\n", "") tokenizer = OpenNLP::TokenizerME.new segmenter = OpenNLP::SentenceDetectorME.new puts "Tokenizer: #{tokenizer}" puts "Segmenter: #{segmenter}" ner_models = ['person', 'time', 'money'] ner_finders = ner_models.map do |model| OpenNLP::NameFinderME.new("en-ner-#{model}.bin") end puts "NER Finders: #{ner_finders}" sentences = segmenter.sent_detect(text) puts "Sentences: #{sentences}" named_entities = [] sentences.each do |sentence| tokens = tokenizer.tokenize(sentence) ner_models.each_with_index do |model, i| finder = ner_finders[i] name_spans = finder.find(tokens) name_spans.each do |name_span| start = name_span.get_start stop = name_span.get_end-1 slice = tokens[start..stop].to_a named_entities << [slice, model] end end end puts "Named Entities: #{named_entities}" # Loading specific models # Just pass the name of the model file to the constructor. The gem will search for the file in the OpenNLP.model_path folder. OpenNLP.load tokenizer = OpenNLP::TokenizerME.new('en-token.bin') tagger = OpenNLP::POSTaggerME.new('en-pos-perceptron.bin') name_finder = OpenNLP::NameFinderME.new('en-ner-person.bin') # etc. puts "Tokenizer: #{tokenizer}" puts "Tagger: #{tagger}" puts "Name Finder: #{name_finder}" # Loading specific classes # You may want to load specific classes from the OpenNLP library that are not loaded by default. The gem provides an API to do this: # Default base class is opennlp.tools. OpenNLP.load_class('SomeClassName') # => OpenNLP::SomeClassName # Here, we specify another base class. OpenNLP.load_class('SomeOtherClass', 'opennlp.tools.namefind') # => OpenNLP::SomeOtherClass end The line which is failing is line 73: (tokens == the sentence being processed.) tags = tagger.tag(tokens).to_a # # => %w[DT NN IN DT NN VBD VBN IN PRP$ NNS .] tagger.tag calls open-nlp/classes.rb line 13, which is where the error is thrown. The code there is: class OpenNLP::POSTaggerME < OpenNLP::Base unless RUBY_PLATFORM =~ /java/ def tag(*args) OpenNLP::Bindings::Utils.tagWithArrayList(#proxy_inst, args[0]) # <== Line 13 end end end The Ruby error thrown at this point is: `method_missing': unknown exception (NullPointerException). Debugging this, I found the error java.lang.NullPointerException. args[0] is the sentence being processed. #proxy_inst is opennlp.tools.postag.POSTaggerME#1b5080a. OpenNLP::Bindings sets up the Java environment. For example, it sets up the Jars to be loaded and the classes within those Jars. In line 54, it sets up defaults for RJB, which should set up OpenNLP::Bindings::Utils and its methods as follows: # Add in Rjb workarounds. unless RUBY_PLATFORM =~ /java/ self.default_jars << 'utils.jar' self.default_classes << ['Utils', ''] end utils.jar and Utils.java are in the CLASSPATH with the other Jars being loaded. They are being accessed, which is verified because the other Jars throw error messages if they are not present. The CLASSPATH is: .;C:\Program Files (x86)Java\jdk1.7.0_40\lib;C:\Program Files (x86)Java\jre7\lib;D:\BitNami\rubystack-1.9.3-12\ruby\lib\ruby\gems\1.9.1\gems\open-nlp-0.1.4\bin The applications Jars are in D:\BitNami\rubystack-1.9.3-12\ruby\lib\ruby\gems\1.9.1\gems\open-nlp-0.1.4\bin and, again, if they are not there I get error messages on other Jars. The Jars and Java files in ...\bin include: jwnl-1.3.3.jar opennlp-maxent-3.0.2-incubating.jar opennlp-tools-1.5.2-incubating.jar opennlp-uima-1.5.2-incubating.jar utils.jar Utils.java Utils.java is as follows: import java.util.Arrays; import java.util.ArrayList; import java.lang.String; import opennlp.tools.postag.POSTagger; import opennlp.tools.chunker.ChunkerME; import opennlp.tools.namefind.NameFinderME; // interface instead? import opennlp.tools.util.Span; // javac -cp '.:opennlp.tools.jar' Utils.java // jar cf utils.jar Utils.class public class Utils { public static String[] tagWithArrayList(POSTagger posTagger, ArrayList[] objectArray) { return posTagger.tag(getStringArray(objectArray)); } public static Object[] findWithArrayList(NameFinderME nameFinder, ArrayList[] tokens) { return nameFinder.find(getStringArray(tokens)); } public static Object[] chunkWithArrays(ChunkerME chunker, ArrayList[] tokens, ArrayList[] tags) { return chunker.chunk(getStringArray(tokens), getStringArray(tags)); } public static String[] getStringArray(ArrayList[] objectArray) { String[] stringArray = Arrays.copyOf(objectArray, objectArray.length, String[].class); return stringArray; } } So, it should define tagWithArrayList and import opennlp.tools.postag.POSTagger. (OBTW, just to try, I changed the incidences of POSTagger to POSTaggerME in this file. It changed nothing...) The tools Jar file, opennlp-tools-1.5.2-incubating.jar, includes postag/POSTagger and POSTaggerME class files, as expected. Error messages are: D:\BitNami\rubystack-1.9.3-12\ruby\bin\ruby.exe -e $stdout.sync=true;$stderr.sync=true;load($0=ARGV.shift) D:/BitNami/rubystack-1.9.3-12/projects/RjbTest/app/helpers/opennlp_tryer.rb .;C:\Program Files (x86)\Java\jdk1.7.0_40\lib;C:\Program Files (x86)\Java\jre7\lib;D:\BitNami\rubystack-1.9.3-12\ruby\lib\ruby\gems\1.9.1\gems\open-nlp-0.1.4\bin D:/BitNami/rubystack-1.9.3-12/ruby/lib/ruby/gems/1.9.1/gems/open-nlp-0.1.4/bin/ D:/BitNami/rubystack-1.9.3-12/ruby/lib/ruby/gems/1.9.1/gems/open-nlp-0.1.4/bin/ Tokenize ["The", "death", "of", "the", "poet", "was", "kept", "from", "his", "poems", "."] Tokenize ["The", "death", "of", "the", "poet", "was", "kept", "from", "his", "poems", "."] D:/BitNami/rubystack-1.9.3-12/ruby/lib/ruby/gems/1.9.1/gems/open-nlp-0.1.4/lib/open-nlp/classes.rb:13:in `method_missing': unknown exception (NullPointerException) from D:/BitNami/rubystack-1.9.3-12/ruby/lib/ruby/gems/1.9.1/gems/open-nlp-0.1.4/lib/open-nlp/classes.rb:13:in `tag' from D:/BitNami/rubystack-1.9.3-12/projects/RjbTest/app/helpers/opennlp_tryer.rb:73:in `<class:OpennlpTryer>' from D:/BitNami/rubystack-1.9.3-12/projects/RjbTest/app/helpers/opennlp_tryer.rb:1:in `<top (required)>' from -e:1:in `load' from -e:1:in `<main>' Modified Utils.java: import java.util.Arrays; import java.util.Object; import java.lang.String; import opennlp.tools.postag.POSTagger; import opennlp.tools.chunker.ChunkerME; import opennlp.tools.namefind.NameFinderME; // interface instead? import opennlp.tools.util.Span; // javac -cp '.:opennlp.tools.jar' Utils.java // jar cf utils.jar Utils.class public class Utils { public static String[] tagWithArrayList(POSTagger posTagger, Object[] objectArray) { return posTagger.tag(getStringArray(objectArray)); }f public static Object[] findWithArrayList(NameFinderME nameFinder, Object[] tokens) { return nameFinder.find(getStringArray(tokens)); } public static Object[] chunkWithArrays(ChunkerME chunker, Object[] tokens, Object[] tags) { return chunker.chunk(getStringArray(tokens), getStringArray(tags)); } public static String[] getStringArray(Object[] objectArray) { String[] stringArray = Arrays.copyOf(objectArray, objectArray.length, String[].class); return stringArray; } } Modified error messages: Uncaught exception: uninitialized constant OpennlpTryer::ArrayStoreException D:/BitNami/rubystack-1.9.3-12/projects/RjbTest/app/helpers/opennlp_tryer.rb:81:in `rescue in <class:OpennlpTryer>' D:/BitNami/rubystack-1.9.3-12/projects/RjbTest/app/helpers/opennlp_tryer.rb:77:in `<class:OpennlpTryer>' D:/BitNami/rubystack-1.9.3-12/projects/RjbTest/app/helpers/opennlp_tryer.rb:1:in `<top (required)>' Revised error with Utils.java revised to "import java.lang.Object;": Uncaught exception: uninitialized constant OpennlpTryer::ArrayStoreException D:/BitNami/rubystack-1.9.3-12/projects/RjbTest/app/helpers/opennlp_tryer.rb:81:in `rescue in <class:OpennlpTryer>' D:/BitNami/rubystack-1.9.3-12/projects/RjbTest/app/helpers/opennlp_tryer.rb:77:in `<class:OpennlpTryer>' D:/BitNami/rubystack-1.9.3-12/projects/RjbTest/app/helpers/opennlp_tryer.rb:1:in `<top (required)>' Rescue removed from OpennlpTryer shows error trapped in classes.rb: Uncaught exception: uninitialized constant OpenNLP::POSTaggerME::ArrayStoreException D:/BitNami/rubystack-1.9.3-12/ruby/lib/ruby/gems/1.9.1/gems/open-nlp-0.1.4/lib/open-nlp/classes.rb:16:in `rescue in tag' D:/BitNami/rubystack-1.9.3-12/ruby/lib/ruby/gems/1.9.1/gems/open-nlp-0.1.4/lib/open-nlp/classes.rb:13:in `tag' D:/BitNami/rubystack-1.9.3-12/projects/RjbTest/app/helpers/opennlp_tryer.rb:78:in `<class:OpennlpTryer>' D:/BitNami/rubystack-1.9.3-12/projects/RjbTest/app/helpers/opennlp_tryer.rb:1:in `<top (required)>' Same error but with all rescues removed so it's "native Ruby" Uncaught exception: unknown exception D:/BitNami/rubystack-1.9.3-12/ruby/lib/ruby/gems/1.9.1/gems/open-nlp-0.1.4/lib/open-nlp/classes.rb:15:in `method_missing' D:/BitNami/rubystack-1.9.3-12/ruby/lib/ruby/gems/1.9.1/gems/open-nlp-0.1.4/lib/open-nlp/classes.rb:15:in `tag' D:/BitNami/rubystack-1.9.3-12/projects/RjbTest/app/helpers/opennlp_tryer.rb:78:in `<class:OpennlpTryer>' D:/BitNami/rubystack-1.9.3-12/projects/RjbTest/app/helpers/opennlp_tryer.rb:1:in `<top (required)>' Revised Utils.java: import java.util.Arrays; import java.util.ArrayList; import java.lang.String; import opennlp.tools.postag.POSTagger; import opennlp.tools.chunker.ChunkerME; import opennlp.tools.namefind.NameFinderME; // interface instead? import opennlp.tools.util.Span; // javac -cp '.:opennlp.tools.jar' Utils.java // jar cf utils.jar Utils.class public class Utils { public static String[] tagWithArrayList( System.out.println("Tokens: ("+objectArray.getClass().getSimpleName()+"): \n"+objectArray); POSTagger posTagger, ArrayList[] objectArray) { return posTagger.tag(getStringArray(objectArray)); } public static Object[] findWithArrayList(NameFinderME nameFinder, ArrayList[] tokens) { return nameFinder.find(getStringArray(tokens)); } public static Object[] chunkWithArrays(ChunkerME chunker, ArrayList[] tokens, ArrayList[] tags) { return chunker.chunk(getStringArray(tokens), getStringArray(tags)); } public static String[] getStringArray(ArrayList[] objectArray) { String[] stringArray = Arrays.copyOf(objectArray, objectArray.length, String[].class); return stringArray; } } I ran cavaj on Utils.class that I unzipped from util.jar and this is what I found. It differs from Utils.java by quite a bit. Both come installed with the open-nlp 1.4.8 gem. I don't know if this is the root cause of the problem, but this file is the core of where it breaks and we have a major discrepancy. Which should we use? import java.util.ArrayList; import java.util.Arrays; import opennlp.tools.chunker.ChunkerME; import opennlp.tools.namefind.NameFinderME; import opennlp.tools.postag.POSTagger; public class Utils { public Utils() { } public static String[] tagWithArrayList(POSTagger postagger, ArrayList aarraylist[]) { return postagger.tag(getStringArray(aarraylist)); } public static Object[] findWithArrayList(NameFinderME namefinderme, ArrayList aarraylist[]) { return namefinderme.find(getStringArray(aarraylist)); } public static Object[] chunkWithArrays(ChunkerME chunkerme, ArrayList aarraylist[], ArrayList aarraylist1[]) { return chunkerme.chunk(getStringArray(aarraylist), getStringArray(aarraylist1)); } public static String[] getStringArray(ArrayList aarraylist[]) { String as[] = (String[])Arrays.copyOf(aarraylist, aarraylist.length, [Ljava/lang/String;); return as; } } Utils.java in use as of 10/07, compiled and compressed into utils.jar: import java.util.Arrays; import java.util.ArrayList; import java.lang.String; import opennlp.tools.postag.POSTagger; import opennlp.tools.chunker.ChunkerME; import opennlp.tools.namefind.NameFinderME; // interface instead? import opennlp.tools.util.Span; // javac -cp '.:opennlp.tools.jar' Utils.java // jar cf utils.jar Utils.class public class Utils { public static String[] tagWithArrayList(POSTagger posTagger, ArrayList[] objectArray) { return posTagger.tag(getStringArray(objectArray)); } public static Object[] findWithArrayList(NameFinderME nameFinder, ArrayList[] tokens) { return nameFinder.find(getStringArray(tokens)); } public static Object[] chunkWithArrays(ChunkerME chunker, ArrayList[] tokens, ArrayList[] tags) { return chunker.chunk(getStringArray(tokens), getStringArray(tags)); } public static String[] getStringArray(ArrayList[] objectArray) { String[] stringArray = Arrays.copyOf(objectArray, objectArray.length, String[].class); return stringArray; } } Failures are occurring in BindIt::Binding::load_klass in line 110 here: # Private function to load classes. # Doesn't check if initialized. def load_klass(klass, base, name=nil) base += '.' unless base == '' fqcn = "#{base}#{klass}" name ||= klass if RUBY_PLATFORM =~ /java/ rb_class = java_import(fqcn) if name != klass if rb_class.is_a?(Array) rb_class = rb_class.first end const_set(name.intern, rb_class) end else rb_class = Rjb::import(fqcn) # <== This is line 110 const_set(name.intern, rb_class) end end The messages are as follows, however they are inconsistent in terms of the particular method that is identified. Each run may display a different method, any of POSTagger, ChunkerME, or NameFinderME. D:/BitNami/rubystack-1.9.3-12/ruby/lib/ruby/gems/1.9.1/gems/bind-it-0.2.7/lib/bind-it/binding.rb:110:in `import': opennlp/tools/namefind/NameFinderME (NoClassDefFoundError) from D:/BitNami/rubystack-1.9.3-12/ruby/lib/ruby/gems/1.9.1/gems/bind-it-0.2.7/lib/bind-it/binding.rb:110:in `load_klass' from D:/BitNami/rubystack-1.9.3-12/ruby/lib/ruby/gems/1.9.1/gems/bind-it-0.2.7/lib/bind-it/binding.rb:89:in `block in load_default_classes' from D:/BitNami/rubystack-1.9.3-12/ruby/lib/ruby/gems/1.9.1/gems/bind-it-0.2.7/lib/bind-it/binding.rb:87:in `each' from D:/BitNami/rubystack-1.9.3-12/ruby/lib/ruby/gems/1.9.1/gems/bind-it-0.2.7/lib/bind-it/binding.rb:87:in `load_default_classes' from D:/BitNami/rubystack-1.9.3-12/ruby/lib/ruby/gems/1.9.1/gems/bind-it-0.2.7/lib/bind-it/binding.rb:56:in `bind' from D:/BitNami/rubystack-1.9.3-12/ruby/lib/ruby/gems/1.9.1/gems/open-nlp-0.1.4/lib/open-nlp.rb:14:in `load' from D:/BitNami/rubystack-1.9.3-12/projects/RjbTest/app/helpers/opennlp_tryer.rb:54:in `<class:OpennlpTryer>' from D:/BitNami/rubystack-1.9.3-12/projects/RjbTest/app/helpers/opennlp_tryer.rb:1:in `<top (required)>' from -e:1:in `load' from -e:1:in `<main>' The interesting point about these errors are that they are originating in OpennlpTryer line 54 which is: OpenNLP.load At this point, OpenNLP fires up RJB which uses BindIt to load the jars and classes. This is well before the errors that I was seeing at the beginning of this question. However, I can't help but think it is all related. I really don't understand the inconsistency of these errors at all. I was able to add the logging function in to Utils.java, compile it after adding in an "import java.io.*" and compress it. However, I pulled it out because of these errors as I didn't know if or not it was involved. I don't think it was. However, because these errors are occurring during load, the method is never called anyway so logging there won't help... For each of the other jars, the jar is loaded then each class is imported using RJB. Utils is handled differently and is specified as the "default". From what I can tell, Utils.class is executed to load its own classes? Later update on 10/07: Here is where I am, I think. First, I have some problem replacing Utils.java, as I described earlier today. That problem probably needs solved before I can install a fix. Second, I now understand the difference between POSTagger and POSTaggerME because the ME means Maximum Entropy. The test code is trying to call POSTaggerME but it looks to me like Utils.java, as implemented, supports POSTagger. I tried changing the test code to call POSTagger, but it said it couldn't find an initializer. Looking at the source for each of these, and I am guessing here, I think that POSTagger exists for the sole purpose to support POSTaggerME which implements it. The source is opennlp-tools file opennlp-tools-1.5.2-incubating-sources.jar. What I don't get is the whole reason for Utils in the first place? Why aren't the jars/classes provided in bindings.rb enough? This feels like a bad monkeypatch. I mean, look what bindings.rb does in the first place: # Default JARs to load. self.default_jars = [ 'jwnl-1.3.3.jar', 'opennlp-tools-1.5.2-incubating.jar', 'opennlp-maxent-3.0.2-incubating.jar', 'opennlp-uima-1.5.2-incubating.jar' ] # Default namespace. self.default_namespace = 'opennlp.tools' # Default classes. self.default_classes = [ # OpenNLP classes. ['AbstractBottomUpParser', 'opennlp.tools.parser'], ['DocumentCategorizerME', 'opennlp.tools.doccat'], ['ChunkerME', 'opennlp.tools.chunker'], ['DictionaryDetokenizer', 'opennlp.tools.tokenize'], ['NameFinderME', 'opennlp.tools.namefind'], ['Parser', 'opennlp.tools.parser.chunking'], ['Parse', 'opennlp.tools.parser'], ['ParserFactory', 'opennlp.tools.parser'], ['POSTaggerME', 'opennlp.tools.postag'], ['SentenceDetectorME', 'opennlp.tools.sentdetect'], ['SimpleTokenizer', 'opennlp.tools.tokenize'], ['Span', 'opennlp.tools.util'], ['TokenizerME', 'opennlp.tools.tokenize'], # Generic Java classes. ['FileInputStream', 'java.io'], ['String', 'java.lang'], ['ArrayList', 'java.util'] ] # Add in Rjb workarounds. unless RUBY_PLATFORM =~ /java/ self.default_jars << 'utils.jar' self.default_classes << ['Utils', ''] end
SEE FULL CODE AT END FOR THE COMPLETE CORRECTED CLASSES.RB MODULE I ran into the same problem today. I didn't quite understand why the Utils class were being used, so I modified the classes.rb file in the following way: unless RUBY_PLATFORM =~ /java/ def tag(*args) #proxy_inst.tag(args[0]) #OpenNLP::Bindings::Utils.tagWithArrayList(#proxy_inst, args[0]) end end In that way I can make the following test to pass: sent = "The death of the poet was kept from his poems." tokens = tokenizer.tokenize(sent).to_a # => %w[The death of the poet was kept from his poems .] tags = tagger.tag(tokens).to_a # => ["prop", "prp", "n", "v-fin", "n", "adj", "prop", "v-fin", "n", "adj", "punc"] R_G Edit: I tested that change and it eliminated the error. I am going to have to do more testing to ensure the outcome is what should be expected. However, following that same pattern, I made the following changes in classes.rb as well: def chunk(tokens, tags) chunks = #proxy_inst.chunk(tokens, tags) # chunks = OpenNLP::Bindings::Utils.chunkWithArrays(#proxy_inst, tokens,tags) chunks.map { |c| c.to_s } end ... class OpenNLP::NameFinderME < OpenNLP::Base unless RUBY_PLATFORM =~ /java/ def find(*args) #proxy_inst.find(args[0]) # OpenNLP::Bindings::Utils.findWithArrayList(#proxy_inst, args[0]) end end end This allowed the entire sample test to execute without failure. I will provide a later update regarding verification of the results. FINAL EDIT AND UPDATED CLASSES.RB per Space Pope and R_G: As it turns out, this answer was key to the desired solution. However, the results were inconsistent as it was corrected. We continued to drill down into it and implemented strong typing during the calls, as specified by RJB. This converts the call to use of the _invoke method where the parameters include the desired method, the strong type, and the additional parameters. Andre's recommendation was key to the solution, so kudos to him. Here is the complete module. It eliminates the need for the Utils.class that was attempting to make these calls but failing. We plan to issue a github pull request for the open-nlp gem to update this module: require 'open-nlp/base' class OpenNLP::SentenceDetectorME < OpenNLP::Base; end class OpenNLP::SimpleTokenizer < OpenNLP::Base; end class OpenNLP::TokenizerME < OpenNLP::Base; end class OpenNLP::POSTaggerME < OpenNLP::Base unless RUBY_PLATFORM =~ /java/ def tag(*args) #proxy_inst._invoke("tag", "[Ljava.lang.String;", args[0]) end end end class OpenNLP::ChunkerME < OpenNLP::Base if RUBY_PLATFORM =~ /java/ def chunk(tokens, tags) if !tokens.is_a?(Array) tokens = tokens.to_a tags = tags.to_a end tokens = tokens.to_java(:String) tags = tags.to_java(:String) #proxy_inst.chunk(tokens,tags).to_a end else def chunk(tokens, tags) chunks = #proxy_inst._invoke("chunk", "[Ljava.lang.String;[Ljava.lang.String;", tokens, tags) chunks.map { |c| c.to_s } end end end class OpenNLP::Parser < OpenNLP::Base def parse(text) tokenizer = OpenNLP::TokenizerME.new full_span = OpenNLP::Bindings::Span.new(0, text.size) parse_obj = OpenNLP::Bindings::Parse.new( text, full_span, "INC", 1, 0) tokens = tokenizer.tokenize_pos(text) tokens.each_with_index do |tok,i| start, stop = tok.get_start, tok.get_end token = text[start..stop-1] span = OpenNLP::Bindings::Span.new(start, stop) parse = OpenNLP::Bindings::Parse.new(text, span, "TK", 0, i) parse_obj.insert(parse) end #proxy_inst.parse(parse_obj) end end class OpenNLP::NameFinderME < OpenNLP::Base unless RUBY_PLATFORM =~ /java/ def find(*args) #proxy_inst._invoke("find", "[Ljava.lang.String;", args[0]) end end end
I don't think you're doing anything wrong at all. You're also not the only one with this problem. It looks like a bug in Utils. Creating an ArrayList[] in Java doesn't make much sense - it's technically legal, but it would be an array of ArrayLists, which a) is just plain odd and b) terrible practice with regard to Java generics, and c) won't cast properly to String[] like the author intends in getStringArray(). Given the way the utility's written and the fact that OpenNLP does, in fact, expect to receive a String[] as input for its tag() method, my best guess is that the original author meant to have Object[] where they have ArrayList[] in the Utils class. Update To output to a file in the root of your project directory, try adjusting the logging like this (I added another line for printing the contents of the input array): try { File log = new File("log.txt"); FileWriter fileWriter = new FileWriter(log); BufferedWriter bufferedWriter = new BufferedWriter(fileWriter); bufferedWriter.write("Tokens ("+objectArray.getClass().getSimpleName()+"): \r\n"+objectArray.toString()+"\r\n"); bufferedWriter.write(Arrays.toString(objectArray)); bufferedWriter.close(); } catch (Exception e) { e.printStackTrace(); }
Calling R in java-Rcaller
I am trying to implement clustering using R in java by employing R caller. I am trying to run sample code for clustering validation and I get that common error faced by most of the users: Premature end of file package test; import rcaller.RCaller; import java.io.File; import java.lang.*; import java.util.*; import java.awt.image.DataBuffer; public class test3 { public static void main(String[] args) { new test3(); } public test3() { try{ RCaller caller = new RCaller(); caller.cleanRCode(); caller.setRscriptExecutable("C:/Program Files/R/R-2.15.1/bin/x64/Rscript"); caller.cleanRCode(); caller.addRCode("library(clvalid)"); caller.addRCode("data(mouse)"); caller.addRCode("express <- mouse [,c(M1,M2,M3,NC1,NC2,NC3)]"); caller.addRCode("rownames (express) <- mouse$ID "); caller.addRCode("intern <- clValid(express, 2:6 , clMethods = c( hierarchical,kmeans,diana,clara,model) ,validation = internal)"); caller.addRCode("b <- summary(intern) "); caller.runAndReturnResult("b"); } catch (Exception e){ e.printStackTrace(); } } }
You have some spelling mistakes in you code. like clValid not clvalid , and you miss many quotes like "hierarchical",.... I think it is better to put your code in a script, and call it from java like this : Runtime.getRuntime().exec("Rscript myScript.R"); where myScript.R is : library(clValid) data(mouse) express <- mouse [,c('M1','M2','M3','NC1','NC2','NC3')] rownames (express) <- mouse$ID intern <- clValid(express, 2:6 , clMethods = c( 'hierarchical','kmeans', 'diana','clara','model') , validation = 'internal') b <- summary(intern)