Below is my program. I want to use an ExecutorService to run it once a day. However, the program is not Runnable; please advise on the changes needed to make it so.
package priceCollector;
import java.io.BufferedInputStream;
import java.io.ByteArrayOutputStream;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.net.URL;
import java.text.DateFormat;
import java.util.Date;
public class App extends myTimerTask {
public static void main(String[] args) {
Date now = new Date();
DateFormat df = DateFormat.getDateInstance();
String s = df.format(now);
String fileName = new String();
fileName = "/Users/Desktop/" + s + ".csv";
URL link = null;
try {
link = new URL("http://finance.yahoo.com/d/quotes.csv?s=III.L+ADM.L+AAL.L+ANTO.L+AHT.L+ABF.L+AZN.L+AV.L+BAB.L+BA.L+BARC.L+BDEV.L+BLT.L+BP.L+BATS.L+BLND.L+BTA.L+BNZL.L+BRBY.L+CPI.L+CCL.L+CNA.L+CCH.L+CPG.L+CRH.L+CRDA.L+DCC.L+DGE.L+DLG.L+DC.L+EZJ.L+EXPN.L+FRES.L+GKN.L+GSK.L+GLEN.L+HMSO.L+HL.L+HIK.L+HSBA.L+IMB.L+INF.L+IHG.L+IAG.L+ITRK.L+INTU.L+ITV.L+JMAT.L+KGF.L+LAND.L+LGEN.L+LLOY.L+LSE.L+MKS.L+MDC.L+MERL.L+MCRO.L+MNDI.L+MRW.L+NG.L+NXT.L+OML.L+PPB.L+PSON.L+PSN.L+POLY.L+PFG.L+PRU.L+RRS.L+RB.L+REL.L+RIO.L+RR.L+RBS.L+RDSA.L+RDSB.L+RMG.L+RSA.L+SGE.L+SBRY.L+SDR.L+SVT.L+SHP.L+SKY.L+SN.L+SMIN.L+SSE.L+STJ.L+STAN.L+SL.L+TW.L+TSCO.L+TPK.L+TUI.L+ULVR.L+UU.L+VOD.L+WTB.L+WOS.L+WPG.L+WPP.L&f=np");
InputStream in = new BufferedInputStream(link.openStream());
ByteArrayOutputStream out = new ByteArrayOutputStream();
byte[] buf = new byte[1024];
int n = 0;
while (-1!=(n=in.read(buf))) {
out.write(buf, 0, n);
}
out.close();
in.close();
byte[] response = out.toByteArray();
FileOutputStream fos = new FileOutputStream(fileName);
fos.write(response);
fos.close();
} catch (Exception e) {
System.out.println("not available");
}
}
}
I am guessing you would need to implement the run() method, but I am not sure how to move the program into it.
You need to follow the steps below, which use a ScheduledExecutorService:
Step (1): You need to move your code into a separate class as shown below:
public class TaskRunner implements Runnable {
    @Override
    public void run() {
        // your code here (the logic from your main method)
    }
}
Step (2): Now use a ScheduledExecutorService as shown below:
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class MyScheduler {
    private static final ScheduledExecutorService scheduler =
            Executors.newScheduledThreadPool(1);

    public static void main(String[] args) {
        scheduler.scheduleAtFixedRate(new TaskRunner(), 0, 1, TimeUnit.DAYS);
    }
}
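If the task should run at a fixed time of day rather than one day after JVM startup, a common variation (just a sketch, assuming Java 8+ and a hypothetical 02:00 run time; it needs the additional imports java.time.Duration, java.time.LocalDateTime and java.time.LocalTime) is to compute the initial delay before scheduling. You could replace the main method above with:
public static void main(String[] args) {
    LocalDateTime now = LocalDateTime.now();
    LocalDateTime nextRun = now.toLocalDate().atTime(LocalTime.of(2, 0));
    if (!nextRun.isAfter(now)) {
        nextRun = nextRun.plusDays(1);   // 02:00 has already passed today
    }
    long initialDelayMinutes = Duration.between(now, nextRun).toMinutes();
    // run the first time at the next 02:00, then every 24 hours
    scheduler.scheduleAtFixedRate(new TaskRunner(),
            initialDelayMinutes, TimeUnit.DAYS.toMinutes(1), TimeUnit.MINUTES);
}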
You can look at the ScheduledExecutorService Javadoc for more details.
Related
I am currently working on making a pipe between two files in Java, and I would like to transmit a float as a stream of bytes. However, I don't know how I can receive it and convert it back into a float. Here is what I have done so far:
(3 files)
Consumi.java:
package tryout5_stream_bytes;
import java.io.Serializable;
public class Consumi implements Serializable{
private float consumi = 0.0F;
public Consumi(float consumi){
this.consumi = consumi;
}
public float getConsumi(){
return consumi;
}
public byte[] getBytes(String encode){
return String.valueOf(consumi).getBytes();
}
}
SimulaConsumi.java
package tryout5_stream_bytes;
import java.io.PipedOutputStream;
import java.io.IOException;
import java.io.UnsupportedEncodingException;
import java.util.concurrent.atomic.AtomicBoolean;
public class SimulaConsumi implements Runnable {
private AtomicBoolean isRunning = new AtomicBoolean(false);
private PipedOutputStream pos = null;
public SimulaConsumi(PipedOutputStream pos){
this.pos = pos;
}
@Override
public void run(){
isRunning.set(true);
while(isRunning.get()){
Consumi c = new Consumi((float) (30 * Math.random()));
byte[] message = null;
message = c.getBytes("UTF-8");
try{
pos.write(message);
pos.flush();
} catch(IOException e){
e.printStackTrace();
}
try{
Thread.sleep(1000);
} catch(InterruptedException e){
e.printStackTrace();
}
}
}
public void terminaSimulaConsumi(){
isRunning.set(false);
}
}
Main.java
package tryout5_stream_bytes;
import java.io.IOException;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;
import java.io.UnsupportedEncodingException;
import java.nio.ByteOrder;
import java.nio.charset.Charset;
import java.io.*;
import java.lang.object;
import java.nio.ByteBuffer;
public class Main {
public static void main(String[] args){
PipedInputStream pis = new PipedInputStream();
PipedOutputStream pos = null;
try{
pos = new PipedOutputStream(pis);
}catch(IOException e){
e.printStackTrace();
}
SimulaConsumi sc = new SimulaConsumi(pos);
Thread tsc = new Thread();
tsc.start();
while(true){
try{
Thread.sleep(900);
}catch(InterruptedException e){
e.printStackTrace();
}
byte[] buffer = new byte[256];
try{
pis.read(buffer);
}catch(IOException e){
e.printStackTrace();
}
float received = //Get a float from a stream bytes???
System.out.println("Value:"+received);
}
}
}
I believe that the sending of the float in the file "SimulaConsumi" is done correctly (though I might still be wrong). On the other hand, I really have no idea how to receive it!
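Since Consumi sends the float as its decimal text, a minimal sketch for the receiving side (assuming the bytes decode as plain ASCII digits and using the count returned by read() rather than the whole 256-byte buffer) could be:
byte[] buffer = new byte[256];
int len = pis.read(buffer);                 // number of bytes actually read, -1 at end of stream
if (len > 0) {
    String text = new String(buffer, 0, len, "UTF-8");
    float received = Float.parseFloat(text.trim());
    System.out.println("Value:" + received);
}
Two things to watch for: the worker thread is started with new Thread() instead of new Thread(sc), so as posted nothing is ever written to the pipe; and because the pipe is a plain byte stream, two writes can arrive in a single read, in which case parsing fails. A sturdier alternative is to wrap the pipe in DataOutputStream/DataInputStream and use writeFloat()/readFloat().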
Can anyone help me with searching for a particular string in an HTML file using Jsoup or any other method? There are built-in methods, but they help with extracting the title or the script text inside specific tags, not with searching for a string in general.
In this code I have used one such built-in method to extract the title from the HTML page.
But I want to search for a string instead.
package dynamic_tester;
import java.io.File;
import java.io.IOException;
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
public class tester {
public static void main(String args[])
{
Document htmlFile = null;
{
try {
htmlFile = Jsoup.parse(new File("x.html"), "ISO-8859-1");
}
catch (IOException e)
{
e.printStackTrace();
}
String title = htmlFile.title();
System.out.println("Title = "+title);
}
}
}
Here's a sample. It reads the HTML file as a plain text String and then performs the search on that String.
package com.example;
import java.io.FileInputStream;
import java.nio.charset.Charset;
public class SearchTest {
public static void main(String[] args) throws Exception {
StringBuffer htmlStr = getStringFromFile("test.html", "ISO-8859-1");
boolean isPresent = htmlStr.indexOf("hello") != -1;
System.out.println("is Present ? : " + isPresent);
}
private static StringBuffer getStringFromFile(String fileName, String charSetOfFile) {
StringBuffer strBuffer = new StringBuffer();
try(FileInputStream fis = new FileInputStream(fileName)) {
byte[] buffer = new byte[10240]; //10K buffer;
int readLen = -1;
while( (readLen = fis.read(buffer)) != -1) {
strBuffer.append( new String(buffer, 0, readLen, Charset.forName(charSetOfFile)));
}
} catch(Exception ex) {
ex.printStackTrace();
strBuffer = new StringBuffer();
}
return strBuffer;
}
}
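If you would rather stay with Jsoup, one option (a small sketch reusing the htmlFile document parsed in your code) is to search the text of the parsed document instead of the raw file:
boolean isPresent = htmlFile.text().contains("hello");
System.out.println("is Present ? : " + isPresent);
Note that Document.text() returns only the visible text with the tags stripped, so if the string you are looking for may appear inside markup or attributes, search htmlFile.html() or read the raw file as in the sample above.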
I have this code:
public static void write() throws IOException{
ObjectOutputStream out = new ObjectOutputStream(
new FileOutputStream("ips.txt")
);
for ( int i = 0; i < Main.ipList.length; i++){
out.writeObject(ipList[i]);
}
out.flush();
out.close();
}
This writes the string array to a text file:
static String[] ipList = {"127.0.0.1", "173.57.51.111"};
I was wondering how it would be possible to read the text file and edit the ipList with the new ips.
If you want to write String objects to a file, it's better to use a FileWriter instead of an ObjectOutputStream. Similarly, use a FileReader to read from the file. See https://docs.oracle.com/javase/tutorial/essential/io/charstreams.html for how to use these Reader objects.
ObjectOutputStream is usually suitable for writing more complex objects that implement the java.io.Serializable interface.
Here's an example:
BufferedReader inputStream = null;
List<String> ipList = new ArrayList<>();
try {
inputStream = new BufferedReader(new FileReader("ips.txt"));
String l;
while ((l = inputStream.readLine()) != null) {
ipList.add(l);
}
} finally {
if (inputStream != null) {
inputStream.close();
}
}
// get an array from the ArrayList
String[] ipArray = ipList.toArray(new String[ipList.size()]);
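For the writing side, a minimal sketch in the same style (file name ips.txt kept from your code) could use a BufferedWriter around a FileWriter and write one address per line, which is what the reading loop above expects:
BufferedWriter outputStream = null;
try {
    outputStream = new BufferedWriter(new FileWriter("ips.txt"));
    for (String ip : ipList) {
        outputStream.write(ip);
        outputStream.newLine();      // one IP per line
    }
} finally {
    if (outputStream != null) {
        outputStream.close();
    }
}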
You can try something like this
package a;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
public class A {
static String[] ipList = { "127.0.0.1", "173.57.51.111" };
public static void main(String[] args) {
try {
write();
update();
} catch (IOException e) {
System.err.println(e);
}
Arrays.asList(ipList).stream().forEach(System.out::println);
}
// Your method
public static void write() throws IOException {
ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream("ips.txt"));
for (int i = 0; i < A.ipList.length; i++) {
out.writeObject(ipList[i]);
}
out.flush();
out.close();
}
public static void update() throws IOException {
List<String> lines = Files.readAllLines(Paths.get(".", "newIps.txt"));
List<String> newIps = new ArrayList<>();
newIps.addAll(Arrays.asList(ipList));
newIps.addAll(lines);
ipList = newIps.toArray(ipList);
}
}
The content of the newIps.txt file is
0.0.0.0
192.168.1.1
The output of the program is
127.0.0.1
173.57.51.111
0.0.0.0
192.168.1.1
Note that Arrays.asList(ipList) returns a fixed-size list backed by the array (changes to the list write through to the array, and elements cannot be added to it), which is why we copy everything into a new ArrayList with addAll.
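A standalone illustration of that behaviour (a hypothetical snippet, not part of the program above):
String[] ips = { "127.0.0.1", "173.57.51.111" };
List<String> view = Arrays.asList(ips);
view.set(0, "10.0.0.1");          // writes through: ips[0] is now "10.0.0.1"
// view.add("192.168.1.1");       // would throw UnsupportedOperationException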
I am using the Java Spark API to write a test application. I am using a class that doesn't extend the Serializable interface, so to make the application work I am using the Kryo serializer to serialize the class. The problem I observed while debugging is that during deserialization the returned object becomes null, which in turn throws a NullPointerException. It seems to be a closure problem, but I am not sure. Since I am new to this kind of serialization, I don't know where to start digging.
Here is the code I am testing:
package org.apache.spark.examples;
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.net.InetAddress;
import java.net.UnknownHostException;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;
/**
* Spark application to test the Serialization issue in spark
*/
public class Test {
static PrintWriter outputFileWriter;
static FileWriter file;
static JavaSparkContext ssc;
public static void main(String[] args) {
String inputFile = "/home/incubator-spark/examples/src/main/scala/org/apache/spark/examples/InputFile.txt";
String master = "local";
String jobName = "TestSerialization";
String sparkHome = "/home/test/Spark_Installation/spark-0.7.0";
String sparkJar = "/home/test/TestSerializationIssesInSpark/TestSparkSerIssueApp/target/TestSparkSerIssueApp-0.0.1-SNAPSHOT.jar";
SparkConf conf = new SparkConf();
conf.set("spark.closure.serializer","org.apache.spark.serializer.KryoSerializer");
conf.set("spark.kryo.registrator", "org.apache.spark.examples.MyRegistrator");
// create the Spark context
if(master.equals("local")){
ssc = new JavaSparkContext("local", jobName,conf);
//ssc = new JavaSparkContext("local", jobName);
} else {
ssc = new JavaSparkContext(master, jobName, sparkHome, sparkJar);
}
JavaRDD<String> testData = ssc.textFile(inputFile).cache();
final NotSerializableJavaClass notSerializableTestObject= new NotSerializableJavaClass("Hi ");
@SuppressWarnings({ "serial", "unchecked"})
JavaRDD<String> classificationResults = testData.map(
new Function<String, String>() {
@Override
public String call(String inputRecord) throws Exception {
if(!inputRecord.isEmpty()) {
//String[] pointDimensions = inputRecord.split(",");
String result = "";
try {
FileWriter file = new FileWriter("/home/test/TestSerializationIssesInSpark/results/test_result_" + (int) (Math.random() * 100));
PrintWriter outputFile = new PrintWriter(file);
InetAddress ip;
ip = InetAddress.getLocalHost();
outputFile.println("IP of the server: " + ip);
result = notSerializableTestObject.testMethod(inputRecord);
outputFile.println("Result: " + result);
outputFile.flush();
outputFile.close();
file.close();
} catch (UnknownHostException e) {
e.printStackTrace();
}
catch (IOException e1) {
e1.printStackTrace();
}
return result;
} else {
System.out.println("End of elements in the stream.");
String result = "End of elements in the input data";
return result;
}
}
}).cache();
long processedRecords = classificationResults.count();
ssc.stop();
System.out.println("sssssssssss"+processedRecords);
}
}
Here is the KryoRegistrator class
package org.apache.spark.examples;
import org.apache.spark.serializer.KryoRegistrator;
import com.esotericsoftware.kryo.Kryo;
public class MyRegistrator implements KryoRegistrator {
public void registerClasses(Kryo kryo) {
kryo.register(NotSerializableJavaClass.class);
}
}
Here is the class I am serializing:
package org.apache.spark.examples;
public class NotSerializableJavaClass {
public String testVariable;
public NotSerializableJavaClass(String testVariable) {
super();
this.testVariable = testVariable;
}
public String testMethod(String vartoAppend){
return this.testVariable + vartoAppend;
}
}
This is because spark.closure.serializer only supports the Java serializer. See http://spark.apache.org/docs/latest/configuration.html for details on spark.closure.serializer.
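If you can modify the class, a minimal workaround (a sketch, assuming nothing else captured by the closure is unserializable) is to make it implement java.io.Serializable so the default Java closure serializer can ship it to the workers; Kryo can still be used for data serialization via spark.serializer:
package org.apache.spark.examples;

import java.io.Serializable;

public class NotSerializableJavaClass implements Serializable {
    private static final long serialVersionUID = 1L;

    public String testVariable;

    public NotSerializableJavaClass(String testVariable) {
        this.testVariable = testVariable;
    }

    public String testMethod(String vartoAppend) {
        return this.testVariable + vartoAppend;
    }
}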
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileWriter;
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.charset.Charset;
import java.util.ArrayList;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
public class Test6 implements Runnable {
private File file;
private int totalNumberOfFiles = 0;
private static int nextFile = -1;
private static ArrayList<String> allFilesArrayList = new ArrayList<String>();
private static ExecutorService executorService = null;
public Test6(File file) {
this.file = file;
}
private String readFileToString(String fileAddress) {
FileInputStream stream = null;
MappedByteBuffer bb = null;
String stringFromFile = "";
try {
stream = new FileInputStream(new File(fileAddress));
FileChannel fc = stream.getChannel();
bb = fc.map(FileChannel.MapMode.READ_ONLY, 0, fc.size());
/* Instead of using default, pass in a decoder. */
stringFromFile = Charset.defaultCharset().decode(bb).toString();
} catch (IOException e) {
System.out.println("readFileToString IOException");
e.printStackTrace();
} finally {
try {
stream.close();
} catch (IOException e) {
System.out.println("readFileToString IOException");
e.printStackTrace();
}
}
return stringFromFile;
}
private void toFile(String message, String fileName) {
try {
FileWriter fstream = new FileWriter("C:/Users/Nomi/Desktop/Workspace2/Test6/TestWritten/" + fileName);
System.out.println("printing to file: ".concat(fileName));
BufferedWriter out = new BufferedWriter(fstream);
out.write(message);
out.close();
} catch (Exception e) {
System.out.println("toFile() Exception");
System.err.println("Error: " + e.getMessage());
}
}
// private void listFilesForFolder(final File fileOrFolder) {
// String temp = "";
// if (fileOrFolder.isDirectory()) {
// for (final File fileEntry : fileOrFolder.listFiles()) {
// if (fileEntry.isFile()) {
// temp = fileEntry.getName();
// toFile(readFileToString(temp), "Copy".concat(temp));
// }
// }
// }
// if (fileOrFolder.isFile()) {
// temp = fileOrFolder.getName();
// toFile(readFileToString(temp), "Copy".concat(temp));
// }
// }
public void getAllFilesInArrayList(final File fileOrFolder) {
String temp = "";
System.out.println("getAllFilesInArrayList fileOrFolder.getAbsolutePath()" + fileOrFolder.getAbsolutePath());
if (fileOrFolder.isDirectory()) {
for (final File fileEntry : fileOrFolder.listFiles()) {
if (fileEntry.isFile()) {
temp = fileEntry.getAbsolutePath();
allFilesArrayList.add(temp);
}
}
}
if (fileOrFolder.isFile()) {
temp = fileOrFolder.getAbsolutePath();
allFilesArrayList.add(temp);
}
totalNumberOfFiles = allFilesArrayList.size();
for (int i = 0; i < allFilesArrayList.size(); i++) {
System.out.println("getAllFilesInArrayList path: " + allFilesArrayList.get(i));
}
}
public synchronized String getNextFile() {
nextFile++;
if (nextFile < allFilesArrayList.size()) {
// File tempFile = new File(allFilesArrayList.get(nextFile));
return allFilesArrayList.get(nextFile);
} else {
return null;
}
}
@Override
public void run() {
getAllFilesInArrayList(file);
executorService = Executors.newFixedThreadPool(allFilesArrayList.size());
while(nextFile < totalNumberOfFiles)
{
String tempGetFile = getNextFile();
File tempFile = new File(allFilesArrayList.get(nextFile));
toFile(readFileToString(tempFile.getAbsolutePath()), "Copy".concat(tempFile.getName()));
}
}
public static void main(String[] args) {
Test6 test6 = new Test6(new File("C:/Users/Nomi/Desktop/Workspace2/Test6/Test Files/"));
Thread thread = new Thread(test6);
thread.start();
// executorService.execute(test6);
// test6.listFilesForFolder(new File("C:/Users/Nomi/Desktop/Workspace2/Test6/"));
}
}
The program is doing what's expected. It goes into the folder, grabs a file, reads it into a string, and then writes the contents to a new file.
I would like to do this multi-threaded. If the folder has N files, I need N threads. I would also like to use the executor framework if possible. I'm thinking there could be a method along these lines:
public synchronized void getAllFilesInArrayList() {
return nextFile;
}
So each new thread could pick the next file.
Thank you for your help.
Error:
Exception in thread "Thread-0" java.lang.IllegalArgumentException
at java.util.concurrent.ThreadPoolExecutor.<init>(ThreadPoolExecutor.java:589)
at java.util.concurrent.ThreadPoolExecutor.<init>(ThreadPoolExecutor.java:480)
at java.util.concurrent.Executors.newFixedThreadPool(Executors.java:59)
at Test6.run(Test6.java:112)
at java.lang.Thread.run(Thread.java:662)
Firstly, your approach to the problem will result in more synchronization and race condition worries than seems necessary. A simple strategy to keep your threads from racing would be this:
1) Have a dispatcher thread read all the file names in your directory.
2) For each file, have the dispatcher thread spawn a worker thread and hand off the file reference
3) Have the worker thread process the file
4) Make sure you have some sane naming convention for your output file names so that you don't get threads overwriting each other.
As for using an executor, a ThreadPoolExecutor would probably work well. Go take a look at the javadoc: http://docs.oracle.com/javase/1.5.0/docs/api/java/util/concurrent/ThreadPoolExecutor.html
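A minimal sketch of that strategy (paths and the "Copy" naming kept from your code; the class name FileCopyDispatcher is made up, and each worker simply writes its copy next to the original). Note that Executors.newFixedThreadPool(0) throws the IllegalArgumentException you are seeing, which is probably what happens here when no files are found, so the size check matters:
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class FileCopyDispatcher {

    public static void main(String[] args) throws InterruptedException {
        File folder = new File("C:/Users/Nomi/Desktop/Workspace2/Test6/Test Files/");
        File[] files = folder.listFiles();
        if (files == null || files.length == 0) {
            System.out.println("No files found");    // avoids newFixedThreadPool(0)
            return;
        }
        // dispatcher: one worker task per file, N threads for N files
        ExecutorService pool = Executors.newFixedThreadPool(files.length);
        for (final File f : files) {
            if (!f.isFile()) {
                continue;
            }
            pool.execute(new Runnable() {
                @Override
                public void run() {
                    copy(f);
                }
            });
        }
        pool.shutdown();                              // accept no new tasks; let workers finish
        pool.awaitTermination(1, TimeUnit.MINUTES);
    }

    private static void copy(File source) {
        // "Copy" + original name keeps workers from overwriting each other
        File target = new File(source.getParentFile(), "Copy".concat(source.getName()));
        try {
            String content = new String(Files.readAllBytes(source.toPath()));
            BufferedWriter out = new BufferedWriter(new FileWriter(target));
            out.write(content);
            out.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}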