Calling R in Java with RCaller
I am trying to implement clustering in Java using R via RCaller. When I run the sample code for clustering validation, I get the common error many users face: Premature end of file
package test;

import rcaller.RCaller;
import java.io.File;
import java.lang.*;
import java.util.*;
import java.awt.image.DataBuffer;

public class test3 {
    public static void main(String[] args) {
        new test3();
    }

    public test3() {
        try {
            RCaller caller = new RCaller();
            caller.cleanRCode();
            caller.setRscriptExecutable("C:/Program Files/R/R-2.15.1/bin/x64/Rscript");
            caller.cleanRCode();
            caller.addRCode("library(clvalid)");
            caller.addRCode("data(mouse)");
            caller.addRCode("express <- mouse [,c(M1,M2,M3,NC1,NC2,NC3)]");
            caller.addRCode("rownames (express) <- mouse$ID ");
            caller.addRCode("intern <- clValid(express, 2:6 , clMethods = c( hierarchical,kmeans,diana,clara,model) ,validation = internal)");
            caller.addRCode("b <- summary(intern) ");
            caller.runAndReturnResult("b");
        }
        catch (Exception e) {
            e.printStackTrace();
        }
    }
}
You have some spelling mistakes in your code: it is clValid, not clvalid, and you are missing many quotes, e.g. "hierarchical", ...
I think it is better to put your R code in a script and call it from Java like this:
Runtime.getRuntime().exec("Rscript myScript.R");
where myScript.R is :
library(clValid)
data(mouse)
express <- mouse [,c('M1','M2','M3','NC1','NC2','NC3')]
rownames (express) <- mouse$ID
intern <- clValid(express, 2:6 , clMethods = c( 'hierarchical','kmeans',
'diana','clara','model') ,
validation = 'internal')
b <- summary(intern)
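A minimal sketch of the Java side of that call, assuming Rscript is on the PATH and myScript.R exists in the working directory (the class and helper names are my own). Unlike a bare exec(), it also captures the script's output and checks the exit status, which makes failures visible:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class RunScript {
    // Launch an external command, merge stderr into stdout, and return
    // everything the process printed. Throws if the exit code is non-zero.
    static String run(String... command) throws Exception {
        ProcessBuilder pb = new ProcessBuilder(command);
        pb.redirectErrorStream(true);            // merge stderr into stdout
        Process p = pb.start();
        StringBuilder out = new StringBuilder();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                out.append(line).append('\n');
            }
        }
        int code = p.waitFor();                  // check exit status
        if (code != 0) throw new RuntimeException("exit code " + code);
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        // Assumes Rscript is installed and on the PATH.
        System.out.print(run("Rscript", "myScript.R"));
    }
}
```

Reading the output stream matters: if the R script prints a lot and nothing consumes the stream, the child process can block on a full pipe buffer.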
Lag a value with a DataVec transform
I'm trying to figure out how to get a lagged value of a field as part of a DataVec transform step. Here is a little example built off the dl4j examples:
import org.datavec.api.records.reader.RecordReader;
import org.datavec.api.records.reader.impl.csv.CSVRecordReader;
import org.datavec.api.split.FileSplit;
import org.datavec.api.transform.TransformProcess;
import org.datavec.api.transform.schema.Schema;
import org.datavec.api.writable.Writable;
import org.datavec.local.transforms.LocalTransformExecutor;
import org.nd4j.linalg.io.ClassPathResource;
import java.io.File;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class myExample {
    public static void main(String[] args) throws Exception {
        Schema inputDataSchema = new Schema.Builder()
                .addColumnString("DateTimeString")
                .addColumnsString("CustomerID", "MerchantID")
                .addColumnInteger("NumItemsInTransaction")
                .addColumnCategorical("MerchantCountryCode", Arrays.asList("USA","CAN","FR","MX"))
                .addColumnDouble("TransactionAmountUSD",0.0,null,false,false) //$0.0 or more, no maximum limit, no NaN and no Infinite values
                .addColumnCategorical("FraudLabel", Arrays.asList("Fraud","Legit"))
                .build();

        TransformProcess tp = new TransformProcess.Builder(inputDataSchema)
                .removeAllColumnsExceptFor("DateTimeString","TransactionAmountUSD")
                .build();

        File inputFile = new ClassPathResource("BasicDataVecExample/exampledata.csv").getFile();

        //Define input reader and output writer:
        RecordReader rr = new CSVRecordReader(1, ',');
        rr.initialize(new FileSplit(inputFile));

        //Process the data:
        List<List<Writable>> originalData = new ArrayList<>();
        while (rr.hasNext()) {
            originalData.add(rr.next());
        }
        List<List<Writable>> processedData = LocalTransformExecutor.execute(originalData, tp);

        int numRows = 5;
        System.out.println("=== BEFORE ===");
        for (int i = 0; i <= numRows; i++) {
            System.out.println(originalData.get(i));
        }
        System.out.println("=== AFTER ===");
        for (int i = 0; i <= numRows; i++) {
            System.out.println(processedData.get(i));
        }
    }
}
I'm looking to get a lagged value (ordered by DateTimeString) of TransactionAmountUSD. I was looking at sequenceMovingWindowReduce from the docs but could not figure it out. I also could not find any examples in the examples repo that do anything similar to this.
Thanks to some help from Alex Black on the dl4j Gitter channel, I can post my own answer. Tip to anyone new to dl4j: there are lots of good things to look at in the test code too, in addition to the examples and tutorials. Here is my updated toy example code:
package org.datavec.transform.basic;

import org.datavec.api.records.reader.RecordReader;
import org.datavec.api.records.reader.impl.csv.CSVRecordReader;
import org.datavec.api.split.FileSplit;
import org.datavec.api.transform.TransformProcess;
import org.datavec.api.transform.schema.Schema;
import org.datavec.api.transform.sequence.comparator.NumericalColumnComparator;
import org.datavec.api.transform.transform.sequence.SequenceOffsetTransform;
import org.datavec.api.writable.Writable;
import org.datavec.local.transforms.LocalTransformExecutor;
import org.joda.time.DateTimeZone;
import org.nd4j.linalg.io.ClassPathResource;
import java.io.File;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class myExample {
    public static void main(String[] args) throws Exception {
        Schema inputDataSchema = new Schema.Builder()
                .addColumnString("DateTimeString")
                .addColumnsString("CustomerID", "MerchantID")
                .addColumnInteger("NumItemsInTransaction")
                .addColumnCategorical("MerchantCountryCode", Arrays.asList("USA","CAN","FR","MX"))
                .addColumnDouble("TransactionAmountUSD",0.0,null,false,false) //$0.0 or more, no maximum limit, no NaN and no Infinite values
                .addColumnCategorical("FraudLabel", Arrays.asList("Fraud","Legit"))
                .build();

        TransformProcess tp = new TransformProcess.Builder(inputDataSchema)
                .removeAllColumnsExceptFor("CustomerID", "DateTimeString","TransactionAmountUSD")
                .stringToTimeTransform("DateTimeString","YYYY-MM-DD HH:mm:ss.SSS", DateTimeZone.UTC)
                .convertToSequence(Arrays.asList("CustomerID"), new NumericalColumnComparator("DateTimeString"))
                .offsetSequence(Arrays.asList("TransactionAmountUSD"),1, SequenceOffsetTransform.OperationType.NewColumn)
                .build();

        File inputFile = new ClassPathResource("BasicDataVecExample/exampledata.csv").getFile();

        //Define input reader and output writer:
        RecordReader rr = new CSVRecordReader(0, ',');
        rr.initialize(new FileSplit(inputFile));

        //Process the data:
        List<List<Writable>> originalData = new ArrayList<>();
        while (rr.hasNext()) {
            originalData.add(rr.next());
        }
        List<List<List<Writable>>> processedData = LocalTransformExecutor.executeToSequence(originalData, tp);

        System.out.println("=== BEFORE ===");
        for (int i = 0; i < originalData.size(); i++) {
            System.out.println(originalData.get(i));
        }
        System.out.println("=== AFTER ===");
        for (int i = 0; i < processedData.size(); i++) {
            System.out.println(processedData.get(i));
        }
    }
}
This should give some output like below, where you can see a new column with the previous transaction amount for each customer ID has been added:
log4j:WARN No appenders could be found for logger (io.netty.util.internal.logging.InternalLoggerFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
=== BEFORE ===
[2016-01-01 17:00:00.000, 830a7u3, u323fy8902, 1, USA, 100.00, Legit]
[2016-01-01 18:03:01.256, 830a7u3, 9732498oeu, 3, FR, 73.20, Legit]
[2016-01-03 02:53:32.231, 78ueoau32, w234e989, 1, USA, 1621.00, Fraud]
[2016-01-03 09:30:16.832, t842uocd, 9732498oeu, 4, USA, 43.19, Legit]
[2016-01-04 23:01:52.920, t842uocd, cza8873bm, 10, MX, 159.65, Legit]
[2016-01-05 02:28:10.648, t842uocd, fgcq9803, 6, CAN, 26.33, Fraud]
[2016-01-05 10:15:36.483, rgc707ke3, tn342v7, 2, USA, -0.90, Legit]
=== AFTER ===
[[1451948512920, t842uocd, 159.65, 43.19], [1451960890648, t842uocd, 26.33, 159.65]]
[[1451671381256, 830a7u3, 73.20, 100.00]]
[]
[]
Process finished with exit code 0
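For intuition: convertToSequence groups records by CustomerID and orders each group by time, and offsetSequence then pairs each value with the previous one in that group. A plain-Java sketch of the same per-group lag (toy row format and helper names are my own, not DataVec API):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class LagSketch {
    // rows: {customerId (String), time (Long), amount (Double)}.
    // Groups by customer, sorts each group by time, then emits
    // {currentAmount, previousAmount} pairs; each group's first row is
    // dropped because it has no previous value, mirroring the offset trim.
    public static Map<String, List<double[]>> lagByCustomer(List<Object[]> rows) {
        Map<String, List<Object[]>> groups = new LinkedHashMap<>();
        for (Object[] r : rows) {
            groups.computeIfAbsent((String) r[0], k -> new ArrayList<>()).add(r);
        }
        Map<String, List<double[]>> out = new LinkedHashMap<>();
        for (Map.Entry<String, List<Object[]>> e : groups.entrySet()) {
            List<Object[]> g = e.getValue();
            g.sort(Comparator.comparingLong(r -> (Long) r[1]));   // order by time
            List<double[]> lagged = new ArrayList<>();
            for (int i = 1; i < g.size(); i++) {
                lagged.add(new double[] {
                        (Double) g.get(i)[2],        // current amount
                        (Double) g.get(i - 1)[2] }); // previous (lagged) amount
            }
            out.put(e.getKey(), lagged);
        }
        return out;
    }
}
```

Run against the sample data above, customer t842uocd yields the pairs (159.65, 43.19) and (26.33, 159.65), matching the AFTER output.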
Weka output predictions
I've used the Weka GUI for training and testing a file (making predictions), but can't do the same with the API. The error I'm getting says there is a different number of attributes in the train and test files. In the GUI, this can be solved by checking "Output predictions". How do I do something similar using the API? Do you know of any samples out there?
import weka.classifiers.bayes.NaiveBayes;
import weka.classifiers.meta.FilteredClassifier;
import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;
import weka.filters.Filter;
import weka.filters.unsupervised.attribute.NominalToBinary;
import weka.filters.unsupervised.attribute.Remove;

public class WekaTutorial {
    public static void main(String[] args) throws Exception {
        DataSource trainSource = new DataSource("/tmp/classes - edited.arff"); // training
        Instances trainData = trainSource.getDataSet();
        DataSource testSource = new DataSource("/tmp/classes_testing.arff");
        Instances testData = testSource.getDataSet();
        if (trainData.classIndex() == -1) {
            trainData.setClassIndex(trainData.numAttributes() - 1);
        }
        if (testData.classIndex() == -1) {
            testData.setClassIndex(testData.numAttributes() - 1);
        }
        String[] options = weka.core.Utils.splitOptions("weka.filters.unsupervised.attribute.StringToWordVector -R first-last -W 1000 -prune-rate -1.0 -N 0 -stemmer weka.core.stemmers.NullStemmer -M 1 "
                + "-tokenizer \"weka.core.tokenizers.WordTokenizer -delimiters \" \\r\\n\\t.,;:\\\'\\\"()?!\"");
        Remove remove = new Remove();
        remove.setOptions(options);
        remove.setInputFormat(trainData);
        NominalToBinary filter = new NominalToBinary();
        NaiveBayes nb = new NaiveBayes();
        FilteredClassifier fc = new FilteredClassifier();
        fc.setFilter(filter);
        fc.setClassifier(nb);
        // train and make predictions
        fc.buildClassifier(trainData);
        for (int i = 0; i < testData.numInstances(); i++) {
            double pred = fc.classifyInstance(testData.instance(i));
            System.out.print("ID: " + testData.instance(i).value(0));
            System.out.print(", actual: " + testData.classAttribute().value((int) testData.instance(i).classValue()));
            System.out.println(", predicted: " + testData.classAttribute().value((int) pred));
        }
    }
}
Error:
Exception in thread "main" java.lang.IllegalArgumentException: Src and Dest differ in # of attributes: 2 != 17152
This was not an issue in the GUI.
You need to ensure that the attributes in the train and test sets are compatible. Try to:
- combine the train and test sets
- preprocess them
- save them as ARFF
- open two empty files
- copy the header (everything from the top down to the "@data" line) into each
- copy the training set into the first file and the test set into the second file
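The header-sharing steps above can be sketched with plain file I/O (no Weka needed): split one combined ARFF file into train and test files that share byte-for-byte identical headers, which is what makes the attribute sets compatible. The 70/30 split and file names are arbitrary placeholders:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class ArffSplit {
    // Copies the header (everything up to and including the @data line)
    // into both output files, then distributes the data rows between them.
    public static void split(Path combined, Path trainOut, Path testOut)
            throws IOException {
        List<String> lines = Files.readAllLines(combined);
        int dataIdx = -1;
        for (int i = 0; i < lines.size(); i++) {
            if (lines.get(i).trim().equalsIgnoreCase("@data")) {
                dataIdx = i;
                break;
            }
        }
        if (dataIdx < 0) throw new IOException("no @data line found");

        List<String> header = lines.subList(0, dataIdx + 1);
        List<String> rows = lines.subList(dataIdx + 1, lines.size());
        int cut = (int) (rows.size() * 0.7);     // 70% train, 30% test

        List<String> train = new ArrayList<>(header);
        train.addAll(rows.subList(0, cut));
        List<String> test = new ArrayList<>(header);
        test.addAll(rows.subList(cut, rows.size()));

        Files.write(trainOut, train);
        Files.write(testOut, test);
    }
}
```

Because both files begin with the same @attribute declarations, Weka sees the same number of attributes in train and test, which avoids the "Src and Dest differ in # of attributes" error.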
Integrating R with Java using Rserve
I have built an application connecting R and Java using the Rserve package. In this project I use neuralnet to predict output. The R source code I use is as follows:
myneuralnetscript = function() {
    trainingData = read.csv("D:\\Kuliah\\Semester V\\TA\\Implementasi\\training.csv")
    testingData = read.csv("D:\\Kuliah\\Semester V\\TA\\Implementasi\\testing.csv")

    X1training <- trainingData$open
    X2training <- trainingData$high
    X3training <- trainingData$low
    X4training <- trainingData$close
    X5training <- trainingData$volume
    targetTraining <- trainingData$target

    X1testing <- testingData$open
    X2testing <- testingData$high
    X3testing <- testingData$low
    X4testing <- testingData$close
    X5testing <- testingData$volume
    targetTesting <- testingData$target

    xTraining <- cbind(X1training,X2training,X3training,X4training,X5training)
    sum.trainingData <- data.frame(xTraining,targetTraining)
    net.sum <- neuralnet(targetTraining~X1training+X2training+X3training+X4training+X5training, sum.trainingData, hidden=5, act.fct="logistic")

    xTesting <- cbind(X1testing,X2testing,X3testing,X4testing,X5testing)
    sum.testingData <- data.frame(xTesting,targetTesting)
    result <- compute(net.sum, sum.testingData[,1:5])
    return(result)
}
The R script generates its output as expected. Here is the Java program that accesses the results from R:
public static void main(String[] args) {
    RConnection connection = null;
    try {
        /* Create a connection to the Rserve instance running on default port 6311 */
        connection = new RConnection();
        // Directory of R script
        connection.eval("source('D:\\\\Kuliah\\\\Semester V\\\\TA\\\\Implementasi\\\\R\\\\neuralNet.R')");
        // Call method
        double output = connection.eval("myneuralnetscript()").asDouble();
        System.out.println(output);
    } catch (RserveException | REXPMismatchException e) {
        System.out.println("There is some problem indeed...");
    }
}
However, the output that appears is "There is some problem indeed...".
Please do not catch exceptions just to print a useless message. Remove your try/catch and declare main to throw Exception; that way you'll see the actual error. Either Rserve is not running locally on port 6311, or the source() call is failing, or the result of the second evaluation cannot be coerced into a single double (compute() returns a list, not a number). When you eval, wrap the R code in tryCatch({CODE}, error = function(e) e) instead, then check whether the returned value inherits from "error" and get its message.
What is this parameter used for?
Main class:
import java.util.ArrayList;

public class readingfiles {
    public static void main(String args[]) {
        ArrayList<Double> result2 = lol2016.ReadNumberFile("lel");
        System.out.println("Result 2: " + result2);
    }
}
Regardless of what I pass as the parameter here, nothing changes:
ArrayList result2 = lol2016.ReadNumberFile("lel");
import java.util.*;
import java.io.*;

public class lol2016 {
    static public ArrayList<Double> ReadNumberFile(String filename) {
        ArrayList<Double> res = new ArrayList<Double>();
        Reader r;
        try {
            r = new BufferedReader(new FileReader("C:\\Users\\Documents\\Primes.txt"));
            StreamTokenizer stok = new StreamTokenizer(r);
            stok.parseNumbers();
            stok.nextToken();
            while (stok.ttype != StreamTokenizer.TT_EOF) {
                if (stok.ttype == StreamTokenizer.TT_NUMBER) {
                    res.add(stok.nval);
                }
                stok.nextToken();
            }
        } catch (Exception E) {
            System.out.println("+++ReadFile: " + E.getMessage());
        }
        return (res);
    }
}
The output remains the same regardless of what I put between those parentheses. My question is: what is supposed to go there? By the way, this is part of a much bigger project, but I believe this is enough code to help me understand what I could put in that parameter that would affect my output.
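The argument has no effect because ReadNumberFile opens a hard-coded path and never reads its filename parameter. A sketch of the method actually honoring the parameter (same tokenizer logic; only the FileReader argument changes, plus try-with-resources so the file is closed):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.Reader;
import java.io.StreamTokenizer;
import java.util.ArrayList;

public class NumberFileReader {
    // The caller-supplied path is the one that gets opened, so different
    // arguments now read different files.
    public static ArrayList<Double> readNumberFile(String filename) {
        ArrayList<Double> res = new ArrayList<>();
        try (Reader r = new BufferedReader(new FileReader(filename))) {
            StreamTokenizer stok = new StreamTokenizer(r);
            stok.parseNumbers();
            stok.nextToken();
            while (stok.ttype != StreamTokenizer.TT_EOF) {
                if (stok.ttype == StreamTokenizer.TT_NUMBER) {
                    res.add(stok.nval);   // collect each numeric token
                }
                stok.nextToken();
            }
        } catch (Exception e) {
            System.out.println("+++ReadFile: " + e.getMessage());
        }
        return res;
    }
}
```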
Output: Result 2: [2.0, 3.0, 5.0, 7.0, 11.0, 13.0, 17.0, 19.0, 23.0, 29.0, ..., 7879.0, 7883.0, 7901.0, 7907.0, 7919.0] (the full list of primes up to 7919, elided here). Bear in mind that the text file being read literally just contains all the prime numbers, the same as the output.
"Regardless of what I input in the parameter for this, nothing changes" — that is because the filename argument is not used anywhere in the ReadNumberFile(String filename) method. This parameter (filename) is meant to hold the name (or the fully qualified path) of the file that should be read. If that is the case, you should change this line:

r = new BufferedReader(new FileReader("C:\\Users\\Documents\\Primes.txt"));

to:

r = new BufferedReader(new FileReader(filename));
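A self-contained sketch of the corrected method, with the hard-coded path replaced by the parameter (the class name and the temp-file demo are illustrative, not from the original project):

```java
import java.io.*;
import java.nio.file.*;
import java.util.*;

public class NumberFileDemo {
    // Corrected version: the filename parameter is actually used to open the file.
    static ArrayList<Double> readNumberFile(String filename) {
        ArrayList<Double> res = new ArrayList<>();
        try (Reader r = new BufferedReader(new FileReader(filename))) {
            StreamTokenizer stok = new StreamTokenizer(r);
            stok.parseNumbers();
            stok.nextToken();
            while (stok.ttype != StreamTokenizer.TT_EOF) {
                if (stok.ttype == StreamTokenizer.TT_NUMBER) {
                    res.add(stok.nval);  // nval holds the parsed number as a double
                }
                stok.nextToken();
            }
        } catch (IOException e) {
            System.out.println("+++ReadFile: " + e.getMessage());
        }
        return res;
    }

    public static void main(String[] args) throws IOException {
        // Demo: write a small file, then read it back through the parameter.
        Path tmp = Files.createTempFile("primes", ".txt");
        Files.write(tmp, "2 3 5 7".getBytes());
        System.out.println(readNumberFile(tmp.toString()));  // [2.0, 3.0, 5.0, 7.0]
    }
}
```

Now passing a different path to readNumberFile produces a different result, which is exactly the behavior the parameter was intended to provide.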
Issue in R arules package using Java
For a university project I have to implement arules (an R package) in Java. I have successfully integrated R and Java using JRI. I did not understand how to get the output of "inspect(Groceries[1:1])". I have tried asString() and asStringArray(), but this gives me the following error:

Exception in thread "main" java.lang.NullPointerException at TestR.main(TestR.java:11)

Also, how can I implement summary(Groceries) in Java? How can I get the output of summary as a String or String array?

R code:

> data(Groceries)
> inspect(Groceries[1:1])
> summary(Groceries)

Java code:

import org.rosuda.JRI.Rengine;
import org.rosuda.JRI.REXP;

public class TestR {
    public static void main(String[] args) {
        Rengine re = new Rengine(new String[]{"--no-save"}, false, null);
        re.eval("library(arules)");
        re.eval("data(Groceries)");
        REXP result = re.eval("inspect(Groceries[1:1])");
        System.out.println(result.asString());
    }
}
It appears that the inspect function in pkg:arules returns NULL; the output you see is a "side-effect" (it is printed to the console, not returned). You can attempt to "capture output", but this is untested since I don't have experience with this integration across languages. Try instead:

REXP result = re.eval("capture.output( inspect(Groceries[1:1]) )");

In an R console session you will get:

library(arules)
data("Adult")
rules <- apriori(Adult)
val <- inspect(rules[1000])
> str(val)
NULL
> val.co <- capture.output(inspect(rules[1000]))
> val.co
[1] " lhs                          rhs                              support   confidence lift"
[2] "1 {education=Some-college, "
[3] "   sex=Male, "
[4] "   capital-loss=None} => {native-country=United-States} 0.1208181 0.9256471 1.031449"

But I haven't tested this in a non-interactive session. You may need to muck with the file argument to capture.output, or it may not work at all.
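On the Java side, capture.output() yields a character vector, which JRI exposes as a String[] via asStringArray(). A minimal sketch of joining those lines back into printable text; the class name and helper are illustrative, and the actual JRI calls are shown in comments because they need a native R installation:

```java
// Turn the lines returned by R's capture.output() (a String[] through JRI)
// back into a single printable block of text.
public class CaptureDemo {
    static String joinCapturedOutput(String[] lines) {
        StringBuilder sb = new StringBuilder();
        for (String line : lines) {
            sb.append(line).append(System.lineSeparator());
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // With a working JRI setup (assumed), this would be:
        //   Rengine re = new Rengine(new String[]{"--no-save"}, false, null);
        //   re.eval("library(arules)");
        //   re.eval("data(Groceries)");
        //   String[] lines = re.eval("capture.output(inspect(Groceries[1:1]))").asStringArray();
        //   System.out.print(joinCapturedOutput(lines));
        String[] sample = {"    items", "1   {citrus fruit, ...}"};  // placeholder lines
        System.out.print(joinCapturedOutput(sample));
    }
}
```

The same pattern works for summary(Groceries): wrap it in capture.output(...) and read the result with asStringArray().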