What is this parameter used for? - java
Main Class:
import java.util.ArrayList;

public class readingfiles {
    public static void main(String args[])
    {
        ArrayList<Double> result2 = lol2016.ReadNumberFile("lel");
        System.out.println("Result 2: " + result2);
    }
}
Regardless of what I pass as the argument here, nothing changes:
ArrayList<Double> result2 = lol2016.ReadNumberFile("lel");
import java.util.*;
import java.io.*;

public class lol2016
{
    static public ArrayList<Double> ReadNumberFile(String filename)
    {
        ArrayList<Double> res = new ArrayList<Double>();
        Reader r;
        try
        {
            r = new BufferedReader(new FileReader("C:\\Users\\Documents\\Primes.txt"));
            StreamTokenizer stok = new StreamTokenizer(r);
            stok.parseNumbers();
            stok.nextToken();
            while (stok.ttype != StreamTokenizer.TT_EOF)
            {
                if (stok.ttype == StreamTokenizer.TT_NUMBER)
                {
                    res.add(stok.nval);
                }
                stok.nextToken();
            }
        }
        catch (Exception E)
        {
            System.out.println("+++ReadFile: " + E.getMessage());
        }
        return (res);
    }
}
The output remains the same regardless of what I put between those parentheses. My question is: what is supposed to go there? This is part of a much bigger project, but I believe this is enough code to help me understand what I could put in that parameter that would affect my output.
Output:
Result 2: [2.0, 3.0, 5.0, 7.0, 11.0, 13.0, 17.0, 19.0, 23.0, 29.0, 31.0, 37.0, 41.0, 43.0, 47.0, 53.0, 59.0, 61.0, 67.0, 71.0, ..., 7873.0, 7877.0, 7879.0, 7883.0, 7901.0, 7907.0, 7919.0]
Bear in mind the text file being read contains exactly the prime numbers shown in the output.
Regardless of what I input in the parameter for this, nothing changes??
It's because the filename argument is never used inside the ReadNumberFile(String filename) method.
Judging by its name, this parameter (filename) represents the name (or perhaps the fully qualified path) of the file that should be read. If that's the case, you should change this line to:
r = new BufferedReader(new FileReader(filename));
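Putting that fix into the full method, here is a sketch (the class and method are renamed for illustration; the asker's originals are lol2016 and ReadNumberFile, and try-with-resources is used here so the reader gets closed):

```java
import java.io.*;
import java.util.ArrayList;

public class NumberFileReader {

    // Same logic as ReadNumberFile, but the filename parameter is
    // actually passed to the FileReader instead of a hard-coded path.
    public static ArrayList<Double> readNumberFile(String filename) {
        ArrayList<Double> res = new ArrayList<>();
        try (Reader r = new BufferedReader(new FileReader(filename))) {
            StreamTokenizer stok = new StreamTokenizer(r);
            stok.parseNumbers();
            stok.nextToken();
            while (stok.ttype != StreamTokenizer.TT_EOF) {
                if (stok.ttype == StreamTokenizer.TT_NUMBER) {
                    res.add(stok.nval);
                }
                stok.nextToken();
            }
        } catch (IOException e) {
            System.out.println("+++ReadFile: " + e.getMessage());
        }
        return res;
    }

    public static void main(String[] args) throws IOException {
        // Write a small demo file so the example is self-contained.
        File tmp = File.createTempFile("primes", ".txt");
        try (PrintWriter pw = new PrintWriter(tmp)) {
            pw.println("2 3 5 7");
        }
        System.out.println(readNumberFile(tmp.getAbsolutePath())); // [2.0, 3.0, 5.0, 7.0]
    }
}
```

With this change, passing a different path to the method reads a different file, so the argument finally affects the output.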
Related
Trained neural network outputs the same results for all evaluation rows
There seems to be no problem when training my network, because it converges and falls below 0.01 error. However, when I load my trained network and introduce the evaluation set, it outputs the same result for every evaluation row (the actual prediction, not the training phase). I trained my network with resilient propagation, 9 inputs, 1 hidden layer with 7 hidden neurons, and 1 output neuron.

UPDATE: My data is normalized using min-max. I am trying to predict electric load data. Here is the sample data; the first 9 columns are the inputs, while the 10th is the ideal value:

0.5386671932975533, 1100000.0, 0.0, 1.0, 40.0, 1.0, 30.0, 9.0, 2014.0, 0.5260616667545941
0.5260616667545941, 1100000.0, 0.0, 1.0, 40.0, 2.0, 30.0, 9.0, 2014.0, 0.5196499668339777
0.5196499668339777, 1100000.0, 0.0, 1.0, 40.0, 3.0, 30.0, 9.0, 2014.0, 0.5083828048375548
0.5083828048375548, 1100000.0, 0.0, 1.0, 40.0, 4.0, 30.0, 9.0, 2014.0, 0.49985462144799725
0.49985462144799725, 1100000.0, 0.0, 1.0, 40.0, 5.0, 30.0, 9.0, 2014.0, 0.49085956670499675
0.49085956670499675, 1100000.0, 0.0, 1.0, 40.0, 6.0, 30.0, 9.0, 2014.0, 0.485008112408512

Here's the full code:

public class ANN {
    //training
    //public final static String SQL = "SELECT load_input, day_of_week, weekend_day, type_of_day, week_num, time, day_date, month, year, ideal_value FROM sample WHERE (year,month,day_date,time) between (2012,4,1,1) and (2014,9,29,96) ORDER BY ID";
    //testing
    public final static String SQL = "SELECT load_input, day_of_week, weekend_day, type_of_day, week_num, time, day_date, month, year, ideal_value FROM sample WHERE (year,month,day_date,time) between (2014,9,30,1) and (2014,9,30,92) ORDER BY ID";
    //validation
    //public final static String SQL = "SELECT load_input, day_of_week, weekend_day, type_of_day, week_num, time, day_date, month, year, ideal_value FROM sample WHERE (year,month,day_date,time) between (2014,9,30,93) and (2014,9,30,96) ORDER BY ID";

    public final static int INPUT_SIZE = 9;
    public final static int IDEAL_SIZE = 1;
    public final static String SQL_DRIVER = "org.postgresql.Driver";
    public final static String SQL_URL = "jdbc:postgresql://localhost/ANN";
    public final static String SQL_UID = "postgres";
    public final static String SQL_PWD = "";

    public static void main(String args[]) {
        Mynetwork();
        //train network. will add customizable params later.
        //train(trainingData());
        //evaluate network
        evaluate(trainingData());
        Encog.getInstance().shutdown();
    }

    public static void evaluate(MLDataSet testSet) {
        BasicNetwork network = (BasicNetwork) EncogDirectoryPersistence.loadObject(new File("directory"));
        // test the neural network
        System.out.println("Neural Network Results:");
        for (MLDataPair pair : testSet) {
            final MLData output = network.compute(pair.getInput());
            System.out.println(pair.getInput().getData(0) + "," + pair.getInput().getData(1) + ","
                    + pair.getInput().getData(2) + "," + pair.getInput().getData(3) + ","
                    + pair.getInput().getData(4) + "," + pair.getInput().getData(5) + ","
                    + pair.getInput().getData(6) + "," + pair.getInput().getData(7) + ","
                    + pair.getInput().getData(8) + ","
                    + "Predicted=" + output.getData(0) + ", Actual=" + pair.getIdeal().getData(0));
        }
    }

    public static BasicNetwork Mynetwork() {
        //basic neural network template. Inputs shouldn't have activation functions
        //because it affects data coming from the previous layer, and there is no previous layer before the input.
        BasicNetwork network = new BasicNetwork();
        //input layer with 9 neurons.
        //The 'true' parameter means that it should have a bias neuron. The bias neuron affects the next layer.
        network.addLayer(new BasicLayer(null, true, 9));
        //hidden layer with 5 neurons
        network.addLayer(new BasicLayer(new ActivationSigmoid(), true, 5));
        //output layer with 1 neuron
        network.addLayer(new BasicLayer(new ActivationSigmoid(), false, 1));
        network.getStructure().finalizeStructure();
        network.reset();
        return network;
    }

    public static void train(MLDataSet trainingSet) {
        //Backpropagation(network, dataset, learning rate, momentum)
        //final Backpropagation train = new Backpropagation(Mynetwork(), trainingSet, 0.1, 0.9);
        final ResilientPropagation train = new ResilientPropagation(Mynetwork(), trainingSet);
        //final QuickPropagation train = new QuickPropagation(Mynetwork(), trainingSet, 0.9);
        int epoch = 1;
        do {
            train.iteration();
            System.out.println("Epoch #" + epoch + " Error:" + train.getError());
            epoch++;
        } while (train.getError() > 0.01);
        System.out.println("Saving network");
        System.out.println("Saving Done");
        EncogDirectoryPersistence.saveObject(new File("directory"), Mynetwork());
    }

    public static MLDataSet trainingData() {
        MLDataSet trainingSet = new SQLNeuralDataSet(ANN.SQL, ANN.INPUT_SIZE, ANN.IDEAL_SIZE,
                ANN.SQL_DRIVER, ANN.SQL_URL, ANN.SQL_UID, ANN.SQL_PWD);
        return trainingSet;
    }
}

Here is my result:

Predicted=0.4451817588640455, Actual=0.5260616667545941
Predicted=0.4451817588640455, Actual=0.5196499668339777
Predicted=0.4451817588640455, Actual=0.5083828048375548
Predicted=0.4451817588640455, Actual=0.49985462144799725
Predicted=0.4451817588640455, Actual=0.49085956670499675
Predicted=0.4451817588640455, Actual=0.485008112408512
Predicted=0.4451817588640455, Actual=0.47800504210686795
Predicted=0.4451817588640455, Actual=0.4693212349328293
(...and so on, with the same "predicted")

Results I'm expecting (I changed the "predicted" values to something random for demonstration purposes, indicating that the network is actually predicting):

Predicted=0.4451817588640455, Actual=0.5260616667545941
Predicted=0.5123312331212122, Actual=0.5196499668339777
Predicted=0.435234234234254365, Actual=0.5083828048375548
Predicted=0.673424556563455, Actual=0.49985462144799725
Predicted=0.2344673345345544235, Actual=0.49085956670499675
Predicted=0.123346457544324, Actual=0.485008112408512
Predicted=0.5673452342342342, Actual=0.47800504210686795
Predicted=0.678435234423423423, Actual=0.4693212349328293
The first thing to check when a neural network gives weird results is normalization. Your data must be normalized; otherwise training produces a skewed network that gives the same outcome for every input, which is a common symptom. Always normalize your data before feeding it into a neural network. This matters because the sigmoid activation function is essentially flat for large values (positive and negative), which makes the network's output constant. Try normalizing like this:

input = (input - median(input)) / std(input)
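A minimal sketch of that formula in Java, under the assumption that you normalize one input column at a time, using statistics computed on the training data (Normalize and normalize are illustrative names, not part of Encog):

```java
import java.util.Arrays;

public class Normalize {

    // (x - median) / std, applied element-wise to one column of data.
    public static double[] normalize(double[] x) {
        // Median from a sorted copy (original array is left untouched).
        double[] sorted = x.clone();
        Arrays.sort(sorted);
        int n = sorted.length;
        double median = (n % 2 == 1)
                ? sorted[n / 2]
                : (sorted[n / 2 - 1] + sorted[n / 2]) / 2.0;

        // Population standard deviation.
        double mean = Arrays.stream(x).average().orElse(0.0);
        double var = Arrays.stream(x).map(v -> (v - mean) * (v - mean)).average().orElse(0.0);
        double std = Math.sqrt(var);

        double[] out = new double[n];
        for (int i = 0; i < n; i++) {
            out[i] = (x[i] - median) / std;
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(normalize(new double[]{1.0, 2.0, 3.0})));
    }
}
```

In practice you would compute the median and standard deviation on the training set only and reuse those same constants when normalizing evaluation rows, so train and test data live on the same scale.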
Weka output predictions
I've used the Weka GUI for training and testing a file (making predictions), but I can't do the same with the API. The error I'm getting says there's a different number of attributes in the train and test files. In the GUI this can be solved by checking "Output predictions". How can I do something similar using the API? Do you know of any samples out there?

import weka.classifiers.bayes.NaiveBayes;
import weka.classifiers.meta.FilteredClassifier;
import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;
import weka.filters.Filter;
import weka.filters.unsupervised.attribute.NominalToBinary;
import weka.filters.unsupervised.attribute.Remove;

public class WekaTutorial {
    public static void main(String[] args) throws Exception {
        DataSource trainSource = new DataSource("/tmp/classes - edited.arff"); // training
        Instances trainData = trainSource.getDataSet();
        DataSource testSource = new DataSource("/tmp/classes_testing.arff");
        Instances testData = testSource.getDataSet();
        if (trainData.classIndex() == -1) {
            trainData.setClassIndex(trainData.numAttributes() - 1);
        }
        if (testData.classIndex() == -1) {
            testData.setClassIndex(testData.numAttributes() - 1);
        }
        String[] options = weka.core.Utils.splitOptions(
                "weka.filters.unsupervised.attribute.StringToWordVector -R first-last -W 1000 -prune-rate -1.0 -N 0 -stemmer weka.core.stemmers.NullStemmer -M 1 "
                + "-tokenizer \"weka.core.tokenizers.WordTokenizer -delimiters \" \\r\\n\\t.,;:\\\'\\\"()?!\"");
        Remove remove = new Remove();
        remove.setOptions(options);
        remove.setInputFormat(trainData);
        NominalToBinary filter = new NominalToBinary();
        NaiveBayes nb = new NaiveBayes();
        FilteredClassifier fc = new FilteredClassifier();
        fc.setFilter(filter);
        fc.setClassifier(nb);
        // train and make predictions
        fc.buildClassifier(trainData);
        for (int i = 0; i < testData.numInstances(); i++) {
            double pred = fc.classifyInstance(testData.instance(i));
            System.out.print("ID: " + testData.instance(i).value(0));
            System.out.print(", actual: " + testData.classAttribute().value((int) testData.instance(i).classValue()));
            System.out.println(", predicted: " + testData.classAttribute().value((int) pred));
        }
    }
}

Error:

Exception in thread "main" java.lang.IllegalArgumentException: Src and Dest differ in # of attributes: 2 != 17152

This was not an issue in the GUI.
You need to ensure that the categories in the train and test sets are compatible. Try to:

1. combine the train and test sets
2. preprocess them
3. save them as ARFF
4. open two empty files
5. copy the header from the top down to the "@data" line into both
6. copy the training rows into the first file and the test rows into the second
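The header-copying steps above can be sketched in plain Java with no Weka dependency (ArffSplitter is a made-up helper name): split one combined, preprocessed ARFF into train and test line lists that share an identical header.

```java
import java.util.ArrayList;
import java.util.List;

public class ArffSplitter {

    // Split the lines of one combined, preprocessed ARFF file into two
    // line lists (train, test) that share the exact same header, i.e.
    // everything down to and including the "@data" line.
    public static List<List<String>> split(List<String> arffLines, int trainRows) {
        int dataIdx = -1;
        for (int i = 0; i < arffLines.size(); i++) {
            if (arffLines.get(i).trim().equalsIgnoreCase("@data")) {
                dataIdx = i;
                break;
            }
        }
        List<String> header = arffLines.subList(0, dataIdx + 1);
        List<String> rows = arffLines.subList(dataIdx + 1, arffLines.size());

        // Both outputs start with the identical header block.
        List<String> train = new ArrayList<>(header);
        train.addAll(rows.subList(0, trainRows));
        List<String> test = new ArrayList<>(header);
        test.addAll(rows.subList(trainRows, rows.size()));
        return java.util.Arrays.asList(train, test);
    }
}
```

Writing the two lists back to disk then gives two ARFF files whose attribute declarations match exactly, which is what the "Src and Dest differ in # of attributes" error is complaining about.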
Exception in thread "main" java.util.UnknownFormatConversionException: Conversion = 'ti'
package chapterreader;

import java.util.Scanner;
import java.io.File;

public class ChapterReader {
    public static void main(String[] args) throws Exception {
        Chapter myChapter = new Chapter();
        File chapterFile = new File("toc.txt");
        Scanner chapterScanner;
        //check to see if the file exists to read the data
        if (chapterFile.exists()) {
            System.out.printf("%7Chapter %14Title %69Page %80Length");
            chapterScanner = new Scanner(chapterFile);
            //Set Delimiter as ';' & 'new line'
            chapterScanner.useDelimiter(";|\r\n");
            while (chapterScanner.hasNext()) {
                //Reads all the data from the file and sets it on the Chapter object
                myChapter.setChapterNumber(chapterScanner.nextInt());
                myChapter.setChapterTitle(chapterScanner.next());
                myChapter.setStartingPageNumber(chapterScanner.nextInt());
                myChapter.setEndingPageNumber(chapterScanner.nextInt());
                displayProduct(myChapter);
            }
            chapterScanner.close();
        } else {
            System.out.println("Missing Chapter File");
        }
    }

    //Display the Chapter Information in a correct Format
    public static void displayProduct(Chapter reportProduct) {
        System.out.printf("%7d", reportProduct.getChapterNumber());
        System.out.printf("%-60s", reportProduct.getChapterTitle());
        System.out.printf("%-6d", reportProduct.getStartingPageNumber());
        System.out.printf("%-7d%n", reportProduct.getEndingPageNumber());
    }
}

But then I got an error:

run:
Exception in thread "main" java.util.UnknownFormatConversionException: Conversion = 'ti'
    at java.util.Formatter$FormatSpecifier.checkDateTime(Formatter.java:2915)
    at java.util.Formatter$FormatSpecifier.<init>(Formatter.java:2678)
    at java.util.Formatter.parse(Formatter.java:2528)
    at java.util.Formatter.format(Formatter.java:2469)
    at java.io.PrintStream.format(PrintStream.java:970)
    at java.io.PrintStream.printf(PrintStream.java:871)
    at chapterreader.ChapterReader.main(ChapterReader.java:17)
Java Result: 1
BUILD SUCCESSFUL (total time: 0 seconds)

What is wrong here? Please help!
The statement below is not a valid format string; that is why it throws UnknownFormatConversionException:

System.out.printf("%7Chapter %14Title %69Page %80Length");

If you want to print those words as padded columns, do it this way:

System.out.printf("%7s %14s %69s %80s", "Chapter", "Title", "Page", "Length");
Instead of

System.out.printf("%7Chapter %14Title %69Page %80Length");

I think you wanted something like

System.out.printf("%7s %14s %69s %80s%n", "Chapter", "Title", "Page", "Length");

and your message is telling you that your format string isn't valid (%14Ti). The Formatter#syntax javadoc says (in part):

't', 'T' - date/time - Prefix for date and time conversion characters. See Date/Time Conversions.
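A runnable sketch of a corrected header next to a data row formatted with the question's own displayProduct widths (aligning the header widths with the row widths is an assumption; the answers above kept the asker's original %7/%14/%69/%80 numbers):

```java
public class ChapterHeaderDemo {
    public static void main(String[] args) {
        // After a width such as %7, Java expects a conversion character
        // like s or d. A bare word, as in %7Chapter, gets parsed as
        // conversion characters instead (the "Ti" in %14Title is what
        // triggers Conversion = 'ti').
        System.out.printf("%7s %-60s %-6s %-7s%n", "Chapter", "Title", "Page", "Length");
        // A data row using the same widths as displayProduct.
        System.out.printf("%7d %-60s %-6d %-7d%n", 1, "Introduction", 1, 12);
    }
}
```

With matching widths in header and rows, the columns line up on every line of output.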
Calling R in Java - RCaller
I am trying to implement clustering using R in Java by employing RCaller. When I run the sample code for clustering validation, I get that common error faced by most users: Premature end of file

package test;

import rcaller.RCaller;
import java.io.File;
import java.lang.*;
import java.util.*;
import java.awt.image.DataBuffer;

public class test3 {

    public static void main(String[] args) {
        new test3();
    }

    public test3() {
        try {
            RCaller caller = new RCaller();
            caller.cleanRCode();
            caller.setRscriptExecutable("C:/Program Files/R/R-2.15.1/bin/x64/Rscript");
            caller.cleanRCode();
            caller.addRCode("library(clvalid)");
            caller.addRCode("data(mouse)");
            caller.addRCode("express <- mouse [,c(M1,M2,M3,NC1,NC2,NC3)]");
            caller.addRCode("rownames (express) <- mouse$ID ");
            caller.addRCode("intern <- clValid(express, 2:6 , clMethods = c( hierarchical,kmeans,diana,clara,model) ,validation = internal)");
            caller.addRCode("b <- summary(intern) ");
            caller.runAndReturnResult("b");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
You have some spelling mistakes in your code, like clValid (not clvalid), and you are missing many quotes, like "hierarchical". I think it is better to put your code in a script and call it from Java like this:

Runtime.getRuntime().exec("Rscript myScript.R");

where myScript.R is:

library(clValid)
data(mouse)
express <- mouse[, c('M1', 'M2', 'M3', 'NC1', 'NC2', 'NC3')]
rownames(express) <- mouse$ID
intern <- clValid(express, 2:6, clMethods = c('hierarchical', 'kmeans', 'diana', 'clara', 'model'), validation = 'internal')
b <- summary(intern)
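One caveat with Runtime.exec alone is that the script's output and error messages are silently discarded, which makes R failures easy to miss. A sketch of running an external script and capturing its output (runCommand is a made-up helper; the demo runs echo so it is self-contained, but for the real case you would pass "Rscript" and "myScript.R", assuming Rscript is on your PATH):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class RunScript {

    // Run an external command, wait for it, and return everything it
    // printed (stderr merged into stdout so errors are visible too).
    public static String runCommand(String... cmd) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(cmd);
        pb.redirectErrorStream(true); // merge stderr into stdout
        Process p = pb.start();
        StringBuilder out = new StringBuilder();
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                out.append(line).append('\n');
            }
        }
        p.waitFor();
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        // Real use case: runCommand("Rscript", "myScript.R")
        System.out.println(runCommand("echo", "hello"));
    }
}
```

Reading the captured output is also how you would get the summary(intern) results back into Java with this approach, e.g. by having the script print or write them to a file.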
Why do I get class, enum, interface expected?
/**
 * @(#)b.java
 *
 * @author
 * @version 1.00 2012/5/4
 */

import java.util.*;
import java.io.*;
import java.*;

public class b {

    static void lireBddParcs(String nomFichier) throws IOException {
        LinkedHashMap parcMap = new LinkedHashMap<Parc, Collection<Manege>>();
        boolean existeFichier = true;
        FileReader fr = null;
        try {
            fr = new FileReader(nomFichier);
        } catch (java.io.FileNotFoundException erreur) {
            System.out.println("Probleme rencontree a l'ouverture du fichier" + nomFichier);
            existeFichier = false;
        }
        if (existeFichier) {
            Scanner scan = new Scanner(new File(nomFichier));
            while (scan.hasNextLine()) {
                String[] line = scan.nextLine().split("\t");
                Parc p = new Parc(line[0], line[1], line[2]);
                parcMap.put(p, null);
            }
        }
        }
        scan.close();
    }
}

/**
 * @param args the command line arguments
 */
public static void main(String[] args) throws IOException {
    lireBddParcs("parcs.txt");
}
}

parc.txt contains:

Great America	Chicago	Illinois
Magic mountain	Los Ageles	Californie
Six Flags over Georgia	Atlanta	Georgie
Darien Lake	Buffalo	New York
La Ronde	Montreal	Quebec
The Great Escape	Lake Georges	New York
Six Flags New Orleans	New Orleans	Louisiane
Elitch Gardens	Denver	Colorado
Six Flags over Texas	Arlington	Texas
Six Flags New England	Springfield	Massachusetts
Six Flags America	Washington	D.C.
Great Adventure	Jackson	New Jersey

error: class, interface, or enum expected line 94
error: class, interface, or enum expected line 99

I decided to change my code because something didn't work out as expected, but now I am getting this and can't get through compilation. Any idea why it doesn't work? I am a complete noob about to abandon my Java course.
Although the indentation is confusing, the main method is outside of the class, while it should be inside it. The extra brace also makes the line scan.close(); invalid, as scan is not defined at that point. Remove the } before scan.close(); and move main inside the class.
It's just because there's an extraneous closing brace in your first method, here:

} scan.close();

If you use an IDE such as Eclipse or NetBeans to edit your source files, it will help a lot with automatic brace matching and will highlight these kinds of errors.
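For reference, here is a stripped-down sketch of the corrected structure (Parc is simplified to a String[] since its definition isn't shown, and ParcReader is an illustrative name): the extra brace is gone, scan.close() is back in scope, and main sits inside the class body.

```java
import java.io.*;
import java.util.*;

public class ParcReader {

    // Each row of the tab-separated file becomes one String[] entry.
    static List<String[]> lireBddParcs(String nomFichier) throws IOException {
        List<String[]> parcs = new ArrayList<>();
        File fichier = new File(nomFichier);
        if (!fichier.exists()) {
            System.out.println("Probleme rencontre a l'ouverture du fichier " + nomFichier);
            return parcs;
        }
        Scanner scan = new Scanner(fichier);
        while (scan.hasNextLine()) {
            parcs.add(scan.nextLine().split("\t"));
        }
        scan.close(); // in scope now: same method, after the loop
        return parcs;
    }

    // main is INSIDE the class body, which is what fixes
    // "class, interface, or enum expected".
    public static void main(String[] args) throws IOException {
        for (String[] parc : lireBddParcs("parcs.txt")) {
            System.out.println(Arrays.toString(parc));
        }
    }
}
```

Once the braces balance (each method closed once, then one final } for the class), the compiler stops seeing stray code at the top level.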