I want to run a Kafka producer using multiple threads. Below is the code that I have tried. I am unsure how to implement threading in a Kafka producer since I am not well versed in thread programming.
Below is the code for my producer.
import org.apache.kafka.clients.producer.Callback;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;
import org.apache.kafka.common.Metric;
import org.apache.kafka.common.MetricName;
import org.apache.kafka.common.serialization.StringSerializer;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.io.BufferedReader;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.IOException;
import java.util.Map;
import java.util.Properties;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
public class KafkaProducerWithThread {
//init params
final String bootstrapServer = "127.0.0.1:9092";
final String topicName = "spark-data-topic";
final String csvFileName = "unique_products.csv";
final static int MAX_THREAD = 2; //created number of threads
//Logger
final Logger logger = LoggerFactory.getLogger(KafkaProducerWithThread.class);
public KafkaProducerWithThread() throws FileNotFoundException {
}
public static void main(String[] args) throws IOException {
new KafkaProducerWithThread().runProducer();
}
public void runProducer() throws IOException {
//Read the CSV file from Resources folder as BufferedReader
ClassLoader classLoader = getClass().getClassLoader();
BufferedReader reader = new BufferedReader(new FileReader(classLoader.getResource(csvFileName).getFile()));
//Create a Kafka Producer
org.apache.kafka.clients.producer.KafkaProducer<String, String> producer = createKafkaProducer();
//Kafka Producer Metrics
Metric requestTotalMetric = null;
for (Map.Entry<MetricName, ? extends Metric> entry : producer.metrics().entrySet()) {
if ("request-total".equals(entry.getKey().name())) {
requestTotalMetric = entry.getValue();
}
}
//Thread
ExecutorService executorService = Executors.newFixedThreadPool(MAX_THREAD);
//Read the CSV file line by line
String line = "";
int i = 0;
while ((line = reader.readLine()) != null) {
i++;
String key = "products_" + i;
//Create a ProducerRecord
ProducerRecord<String, String> csvProducerRecord = new ProducerRecord<>(topicName, key, line.trim());
//Send the data - Asynchronously
producer.send(csvProducerRecord, new Callback() {
@Override
public void onCompletion(RecordMetadata recordMetadata, Exception e) {
//executes every time a record is sent successfully or an exception is thrown
if (e == null) {
//the record was sent successfully
// logger.info("Received new metadata. \n" +
// "Topic: " + recordMetadata.topic() + "\n" +
// "Partition: " + recordMetadata.partition() + "\n" +
// "Offset: " + recordMetadata.offset() + "\n" +
// "Timestamp: " + recordMetadata.timestamp());
} else {
logger.error("Error while producing", e);
}
}
});
if (i % 1000 == 0){
logger.info("Record #: " + i + " Request rate: " + requestTotalMetric.metricValue());
}
}
//Adding a shutdown hook
Runtime.getRuntime().addShutdownHook(new Thread(() -> {
logger.info("Stopping the Producer!");
producer.flush();
producer.close();
logger.info("Stopped the Producer!");
}));
}
public org.apache.kafka.clients.producer.KafkaProducer<String, String> createKafkaProducer() {
//Create Producer Properties
Properties properties = new Properties();
properties.setProperty(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServer);
properties.setProperty(ProducerConfig.ACKS_CONFIG, "all");
properties.setProperty(ProducerConfig.RETRIES_CONFIG, "5");
properties.setProperty(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
properties.setProperty(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
properties.setProperty(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true"); // For an idempotent producer
//kafka can detect whether it's a duplicate data based on the producer request id.
//Create high throughput Producer at the expense of latency & CPU
properties.setProperty(ProducerConfig.COMPRESSION_TYPE_CONFIG, "snappy");
properties.setProperty(ProducerConfig.LINGER_MS_CONFIG, "60");
properties.setProperty(ProducerConfig.BATCH_SIZE_CONFIG, Integer.toString(32 * 1024)); //32KB batch size
//Create Kafka Producer
org.apache.kafka.clients.producer.KafkaProducer<String, String> csvProducer = new org.apache.kafka.clients.producer.KafkaProducer<String, String>(properties);
return csvProducer;
}
}
Can anyone help me implement threads in my Kafka producer program?
My producer will be producing over a million records, so I want to use multiple threads. I am aware that ExecutorService is used for thread programming, but I am not sure how to apply it in this case.
Thanks.
Create a MessageSender class as given below.
After creating the producer, create a new MessageSender object taking the producer record and the producer as constructor args, then invoke executorService.submit() to run the task. Note that KafkaProducer is thread safe, so a single producer instance can be shared by all the pool threads.
//Inside runProducer(), replace the direct producer.send(...) loop with a thread pool:
ExecutorService executorService = Executors.newFixedThreadPool(MAX_THREAD);
//Read the CSV file line by line
String line;
int i = 0;
while ((line = reader.readLine()) != null) {
    i++;
    String key = "products_" + i;
    //Create a producer record
    ProducerRecord<String, String> csvProducerRecord = new ProducerRecord<>(topicName, key, line.trim());
    //Submit the send task to the pool
    executorService.submit(new MessageSender(csvProducerRecord, producer));
}
executorService.shutdown();

//Thread class
class MessageSender implements Runnable {
    private final ProducerRecord<String, String> record;
    private final KafkaProducer<String, String> producer;

    MessageSender(ProducerRecord<String, String> record, KafkaProducer<String, String> producer) {
        //store the record and producer in class-level variables of the thread class
        this.record = record;
        this.producer = producer;
    }

    @Override
    public void run() {
        producer.send(record);
    }
}
My objective is to count the frequencies of each word while reading a large file using multiple threads.
I am implementing the Runnable interface to achieve multithreading, but while executing the program I do not get the correct answer every time: sometimes the output is correct and sometimes not. Using the Callable interface instead of Runnable, the program executes correctly every time without any error.
This is the main class:
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
public class WordFrequencyRunnableTest {
public static void main(String[] args) throws IOException {
long startTime = System.currentTimeMillis();
String filePath = "C:/Users/Mukesh Kumar/Desktop/data.txt";
WordFrequencyRunnableTest runnableTest = new WordFrequencyRunnableTest();
Map<String, Integer> wordFrequencies = runnableTest.parseLines(filePath);
runnableTest.printResult(wordFrequencies);
long elapsedTime = System.currentTimeMillis() - startTime;
System.out.println("Total execution time in millis: " + elapsedTime);
}
public Map<String, Integer> parseLines(String filePath) throws IOException {
Map<String, Integer> wordFrequencies = new HashMap<>();
try (BufferedReader bufferedReader = new BufferedReader(new FileReader(filePath))) {
String eachLine = bufferedReader.readLine();
while (eachLine != null) {
List<String> linesForEachThread = new ArrayList<>();
while (linesForEachThread.size() != 100 && eachLine != null) {
linesForEachThread.add(eachLine);
eachLine = bufferedReader.readLine();
}
WordFrequencyUsingRunnable task = new WordFrequencyUsingRunnable(linesForEachThread, wordFrequencies);
Thread thread = new Thread(task);
thread.start();
}
}
return wordFrequencies;
}
public void printResult(Map<String, Integer> wordFrequencies) {
wordFrequencies.forEach((key, value) -> System.out.println(key + " " + value));
}
}
And this is the logic class:
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
public class WordFrequencyUsingRunnable implements Runnable {
private final List<String> linesForEachThread;
private final Map<String, Integer> wordFrequencies;
public WordFrequencyUsingRunnable(List<String> linesForEachThread, Map<String, Integer> wordFrequencies) {
this.linesForEachThread = linesForEachThread;
this.wordFrequencies = wordFrequencies;
}
@Override
public void run() {
List<String> currentThreadLines = new ArrayList<>(linesForEachThread);
for (String eachLine : currentThreadLines) {
String[] eachLineWords = eachLine.toLowerCase().split("([,.\\s]+)");
synchronized (wordFrequencies) {
for (String eachWord : eachLineWords) {
if (wordFrequencies.containsKey(eachWord)) {
wordFrequencies.replace(eachWord, wordFrequencies.get(eachWord) + 1);
}
wordFrequencies.putIfAbsent(eachWord, 1);
}
}
}
}
}
I am hoping for good responses, and thanks in advance for the help.
You should wait for all threads to finish before printing the results.
public class WordFrequencyRunnableTest {
static List<Thread> threads = new ArrayList<>(); // static so main() can join them
public static void main(String[] args) throws IOException, InterruptedException {
...
...
Map<String, Integer> wordFrequencies = runnableTest.parseLines(filePath);
for(Thread thread: threads)
{
thread.join();
}
runnableTest.printResult(wordFrequencies);
...
...
}
public Map<String, Integer> parseLines(String filePath) throws IOException {
Map<String, Integer> wordFrequencies = new HashMap<>();
try (BufferedReader bufferedReader = new BufferedReader(new FileReader(filePath))) {
String eachLine = bufferedReader.readLine();
while (eachLine != null) {
List<String> linesForEachThread = new ArrayList<>();
while (linesForEachThread.size() != 100 && eachLine != null) {
linesForEachThread.add(eachLine);
eachLine = bufferedReader.readLine();
}
WordFrequencyUsingRunnable task = new WordFrequencyUsingRunnable(linesForEachThread, wordFrequencies);
Thread thread = new Thread(task);
thread.start();
threads.add(thread); // Add thread to the list.
}
}
return wordFrequencies;
}
}
PS - You can use ConcurrentHashMap<String, AtomicInteger> to avoid having to synchronize access to the hashmap. The program will run faster that way.
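A minimal sketch of that variant (assuming the chunking and joining code stays as above; printResult would then take a Map<String, AtomicInteger>):

import java.util.List;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.atomic.AtomicInteger;

public class WordFrequencyUsingRunnable implements Runnable {
    private final List<String> linesForEachThread;
    private final ConcurrentMap<String, AtomicInteger> wordFrequencies;

    public WordFrequencyUsingRunnable(List<String> linesForEachThread,
                                      ConcurrentMap<String, AtomicInteger> wordFrequencies) {
        this.linesForEachThread = linesForEachThread;
        this.wordFrequencies = wordFrequencies;
    }

    @Override
    public void run() {
        for (String eachLine : linesForEachThread) {
            for (String eachWord : eachLine.toLowerCase().split("[,.\\s]+")) {
                // computeIfAbsent and incrementAndGet are each atomic,
                // so no synchronized block is needed
                wordFrequencies.computeIfAbsent(eachWord, k -> new AtomicInteger(0))
                               .incrementAndGet();
            }
        }
    }
}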
I am working on a Kafka Custom partitioner class. Here I am trying to push the data into separate partitions.
My Kafka producer class:
import java.util.Date;
import java.util.Properties;
import java.util.Random;
import kafka.javaapi.producer.Producer;
import kafka.producer.KeyedMessage;
import kafka.producer.ProducerConfig;
public class KafkaCustomPartitioner {
public static void main(String[] args) {
long events = Long.parseLong(args[0]);
int blocks = Integer.parseInt(args[1]);
Random rnd = new Random();
Properties props = new Properties();
props.put("metadata.broker.list", "localhost:9092");
props.put("serializer.class","kafka.serializer.StringEncoder");
props.put("key.serializer.class", "kafka.serializer.StringEncoder");
props.put("partitioner.class","com.kafka.partdecider.CustomPartitioner");
props.put("producer.type", "sync");
props.put("request.required.acks","1");
ProducerConfig config = new ProducerConfig(props);
Producer producer = new Producer(config);
for(int nBlocks=0; nBlocks<blocks; nBlocks++) {
for(long nEvents=0; nEvents<events; nEvents++) {
long runTime = new Date().getTime();
String msg = runTime + ": " + (50+nBlocks) + ": " + nEvents + ": " + rnd;
KeyedMessage<String, String> data = new KeyedMessage<String, String>("CustPartTopic",String.valueOf(nBlocks),msg);
producer.send(data);
}
}
producer.close();
}
}
Custom partitioner class:
import kafka.producer.Partitioner;
public class CustomPartitioner implements Partitioner {
public int partition(Object key, int arg1) {
String receivingkey = (String) key;
long id = Long.parseLong(receivingkey);
return (int) (id%arg1);
}
}
The project's arguments section has the values: 3 2
I am getting an "ArrayIndexOutOfBoundsException" when I run the class:
Exception in thread "main" java.lang.ArrayIndexOutOfBoundsException: 0
at com.kafka.custompartitioner.KafkaCustomPartitioner.main(KafkaCustomPartitioner.java:13)
The error is shown at the line: long events = Long.parseLong(args[0]);
But I don't understand why that line is giving the error.
Could anyone let me know how I can fix this?
This works for me; the APIs are quite different:
package mypackage.io;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import java.util.Date;
import java.util.Properties;
import java.util.Random;
import java.util.concurrent.ExecutionException;
public class KafkaCustomPartitioner {
public static void main(String[] args) throws InterruptedException, ExecutionException {
long events = Long.parseLong(args[0]);
int blocks = Integer.parseInt(args[1]);
Random rnd = new Random();
Properties props = new Properties();
props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer");
props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer");
props.put(ProducerConfig.PARTITIONER_CLASS_CONFIG, "mypackage.io.CustomPartitioner");
props.put(ProducerConfig.ACKS_CONFIG, "1");
KafkaProducer<String, String> producer = new KafkaProducer<String, String>(props);
for(int nBlocks=0; nBlocks<blocks; nBlocks++) {
for(long nEvents=0; nEvents<events; nEvents++) {
long runTime = new Date().getTime();
String msg = runTime + ": " + (50+nBlocks) + ": " + nEvents + ": " + rnd;
producer.send(new ProducerRecord<String, String>("CustPartTopic", String.valueOf(nBlocks), msg)).get();
}
}
producer.close();
}
}
Then the custom partitioner:
package mypackage.io;
import org.apache.kafka.clients.producer.Partitioner;
import org.apache.kafka.common.Cluster;
import java.util.Map;
public class CustomPartitioner implements Partitioner {
public int partition(String topic, Object key, byte[] keyBytes, Object value, byte[] valueBytes, Cluster cluster) {
String receivingkey = (String) key;
long id = Long.parseLong(receivingkey);
int numPartitions = cluster.availablePartitionsForTopic(topic).size();
return (int) (id % numPartitions);
}
public void close() {
}
public void configure(Map<String, ?> map) {
}
}
I have tried to run this code but it doesn't work because producer.send() doesn't accept the KeyedMessage type.
I tried to import kafka.javaapi.producer.Producer instead of kafka.producer.Producer, but it still doesn't work.
The code is:
package sources;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.Properties;
//import kafka.javaapi.producer.Producer;
import kafka.producer.KeyedMessage;
import kafka.producer.ProducerConfig;
import kafka.javaapi.producer.Producer;
//import kafka.producer.Producer;
public class ProducerCode {
private static Producer<Integer, String> producer;
private static final String topic= "mytopic";
public void initialize() {
Properties producerProps = new Properties();
producerProps.put("metadata.broker.list", "localhost:9092");
producerProps.put("serializer.class", "kafka.serializer.StringEncoder");
producerProps.put("request.required.acks", "1");
// ProducerConfig producerConfig = new ProducerConfig(producerProps);
// have a change here **
producer = new Producer<Integer, String>(new ProducerConfig(producerProps));
}
public void publishMesssage() throws Exception{
BufferedReader reader = new BufferedReader(new InputStreamReader(System.in));
while (true){
System.out.print("Enter message to send to kafka broker (Press 'Y' to close producer): ");
String msg = null;
msg = reader.readLine(); // Read message from console
//Define topic name and message
KeyedMessage<Integer, String> keyedMsg = new KeyedMessage<Integer, String>(topic, msg);
producer.send(keyedMsg);
// producer.send(keyedMsg); // This publishes message on given topic
if("Y".equals(msg)){ break; }
System.out.println("--> Message [" + msg + "] sent.Check message on Consumer's program console");
}
return;
}
public static void main(String[] args) throws Exception {
KafkaProducer kafkaProducer = new KafkaProducer();
// Initialize producer
kafkaProducer.initialize();
// Publish message
kafkaProducer.publishMesssage();
//Close the producer
producer.close();
}
}
You have to use ProducerRecord (instead of KeyedMessage) with the constructor ProducerRecord(String topic, K key, V value):
Producer<String, String> producer = new KafkaProducer<>(props);
producer.send(new ProducerRecord<String, String>("my-topic", "key", "value"));
See https://kafka.apache.org/0100/javadoc/index.html?org/apache/kafka/clients/producer/KafkaProducer.html
I am spawning 5 threads using a thread pool executor to execute 5 different commands in parallel. After each thread completes, I am updating a ConcurrentHashMap with the thread id as key and "TERMINATED" as value. But my thread pool is not updating the hashmap on successful completion of the command execution.
Main Class:
package com.cisco.executor;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.Executors;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
public class MainExecutor {
static String element;
static ConcurrentHashMap<Integer, String> map = new ConcurrentHashMap<Integer, String>();
static Integer array[] = { 1, 2, 3, 4, 5 };
// static Integer array[] = { 1 };
static List<Integer> threadid = Arrays.asList(array);
static String SQOOP_XXCCS_DS_SAHDR_CORE = ReadProperties.getInstance().getProperty("SQOOP_XXCCS_DS_SAHDR_CORE");
static String SQOOP_XXCCS_DS_CVDPRDLINE_DETAIL = ReadProperties.getInstance()
.getProperty("SQOOP_XXCCS_DS_CVDPRDLINE_DETAIL");
static String SQOOP_XXCCS_DS_INSTANCE_DETAIL = ReadProperties.getInstance()
.getProperty("SQOOP_XXCCS_DS_INSTANCE_DETAIL");
static String SQOOP_XXCCS_SCDC_PRODUCT_PROFILE = ReadProperties.getInstance()
.getProperty("SQOOP_XXCCS_SCDC_PRODUCT_PROFILE");
static String SQOOP_MTL_SYSTEM_ITEMS_B = ReadProperties.getInstance().getProperty("SQOOP_MTL_SYSTEM_ITEMS_B");
public static void main(String[] args) {
ThreadPoolExecutor executors = (ThreadPoolExecutor) Executors.newFixedThreadPool(5);
// ThreadPoolExecutor executors = (ThreadPoolExecutor) Executors.newFixedThreadPool(1);
System.out.println("at executors step");
List<String> getlist = getList();
Iterator<Integer> itr2 = threadid.iterator();
for (Iterator<String> itr = getlist.iterator(); itr.hasNext() && itr2.hasNext();) {
String element = (String) itr.next();
int thread_id = itr2.next();
String[] command = { "ssh", "hddev-c01-edge-02", "\"" + element + "\"" };
System.out.println("the command is as below ");
System.out.println(Arrays.toString(command));
System.out.println("inside the iterator");
ParallelExecutor pe = new ParallelExecutor(command, thread_id, map);
executors.execute(pe);
}
// executors.shutdown();
for(Map.Entry<Integer, String> entry: map.entrySet())
{
Integer key = entry.getKey();
String value = entry.getValue();
System.out.println("The key is " + key + " The value is " + value);
System.out.println("Thread " + key + " is terminated");
}
}
public static List<String> getList() {
List<String> commandlist = new ArrayList<String>();
System.out.println("inside getList");
commandlist.add(SQOOP_XXCCS_DS_SAHDR_CORE);
commandlist.add(SQOOP_XXCCS_DS_CVDPRDLINE_DETAIL);
commandlist.add(SQOOP_XXCCS_DS_INSTANCE_DETAIL);
commandlist.add(SQOOP_XXCCS_SCDC_PRODUCT_PROFILE);
commandlist.add(SQOOP_MTL_SYSTEM_ITEMS_B);
return commandlist;
}
}
Runnable Class:
package com.cisco.executor;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.concurrent.ConcurrentHashMap;
import org.apache.log4j.Logger;
public class ParallelExecutor implements Runnable {
private static Logger LOGGER = Logger.getLogger(ParallelExecutor.class);
String[] command;
int threadid;
ConcurrentHashMap<Integer, String> map;
public ParallelExecutor(String[] command, int threadid, ConcurrentHashMap<Integer, String> map) {
this.command = command;
this.threadid = threadid;
this.map = map;
}
@Override
public void run() {
ProcessBuilder processbuilder = new ProcessBuilder(command);
LOGGER.info(command);
try {
Process process = processbuilder.inheritIO().start();
System.out.println("inside process builder ");
process.waitFor();
BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()));
String readline;
while ((readline = reader.readLine()) != null) {
LOGGER.info(readline);
}
// getting the thread state and adding it to a collection
Thread.State state = Thread.currentThread().getState();
if (state == Thread.State.TERMINATED) {
map.put(threadid, "TERMINATED");
}
} catch (Exception e) {
LOGGER.error(e.getMessage());
}
}
}
Is my implementation wrong? Can someone help me with it?
Instead of trying to capture the outcome of a thread inside the thread (which is error prone, especially if an exception/error is thrown), I suggest you retain the Future objects and inspect those.
ExecutorService exec = Executors.newFixedThreadPool(5);
System.out.println("at executors step");
Map<String, Future<?>> results = new HashMap<>();
int threadId = 0;
for (String element : getList()) {
    String[] command = { "ssh", "hddev-c01-edge-02", "\"" + element + "\"" };
    results.put(element, exec.submit(new ParallelExecutor(command, ++threadId, map)));
}
for (Map.Entry<String, Future<?>> entry : results.entrySet()) {
    try {
        entry.getValue().get(); // blocks until that task has completed
        System.out.println(entry.getKey() + " is complete");
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
    } catch (ExecutionException e) {
        System.out.println(entry.getKey() + " failed with");
        e.getCause().printStackTrace(System.out);
    }
}
exec.shutdown();
ThreadPoolExecutor does not terminate until it is asked to do so.
So, first of all you have to call
executors.shutdown();
which you kept commented out.
Second, you need to wait for the tasks to terminate properly. For that, add a loop before for (Map.Entry<Integer, String> entry : map.entrySet()):
while (!executors.isTerminated()) {
}
(or better, call executors.awaitTermination(timeout, unit) instead of busy-waiting).
But one thread will probably run many Runnables, and if I get you correctly you want to update the ConcurrentHashMap once each Runnable is done with its execution. To do that, subclass ThreadPoolExecutor and override just one method, afterExecute(Runnable r, Throwable t), from which you update the map with the Runnable's id and terminated status. But remember this means completion of the passed Runnable's run() method, not the underlying thread's termination.
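A minimal sketch of that approach, assuming a getThreadId() getter is added to ParallelExecutor (hypothetical; the question stores the id in a field without a getter). Note this relies on executors.execute(pe): with submit() the Runnable is wrapped in a FutureTask and the instanceof check would fail.

import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class NotifyingExecutor extends ThreadPoolExecutor {
    private final ConcurrentHashMap<Integer, String> map;

    public NotifyingExecutor(int poolSize, ConcurrentHashMap<Integer, String> map) {
        super(poolSize, poolSize, 0L, TimeUnit.MILLISECONDS, new LinkedBlockingQueue<Runnable>());
        this.map = map;
    }

    @Override
    protected void afterExecute(Runnable r, Throwable t) {
        super.afterExecute(r, t);
        // runs after each task finishes, normally or with a Throwable
        if (r instanceof ParallelExecutor) {
            ParallelExecutor pe = (ParallelExecutor) r;
            map.put(pe.getThreadId(), t == null ? "TERMINATED" : "FAILED");
        }
    }
}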
I have a problem trying to implement thread synchronization with PipedInputStream and PipedOutputStream in Java.
There are three threads T1, T2, T3 that edit a file named toto.txt concurrently. The content of toto.txt should be something like:
T1 : 1
T2 : 1
T3 : 1
T1 : 2
T2 : 2
T3 : 2
T1 : 3
T2 : 3
T3 : 3
....
My idea is: each thread can access toto.txt only when it has a key variable key = true. After editing the file, thread A writes the key into a PipedOutputStream connected to a PipedInputStream. Thread B reads the key from the PipedInputStream; if key = true, B can edit the file. There is a starting thread that writes to the file first; each other thread waits for the key -> writes to the file -> writes the key to its pipe. With 3 threads there are 3 connected pipes: T1-T2, T2-T3, T3-T1.
My thread code:
import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.BufferedWriter;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;
import java.util.logging.Level;
import java.util.logging.Logger;
public class threadFlux implements Runnable {
public String _threadName;
public boolean _key;
public boolean _stratingThread;
public int _count;
public int _maxCount;
public String _fileName;
public DataInputStream _is;
public DataOutputStream _os;
public threadFlux(String threadName, String fileName, boolean starting, int maxCount) {
this._threadName = threadName;
this._maxCount = maxCount;
this._count = 1;
this._fileName = fileName;
this._stratingThread = starting;
this._key = (starting == true);
}
@Override
public void run() {
while (this._count <= this._maxCount) {
if (this._stratingThread == true) {
try {
/* starting thread write to file */
System.out.println("startint thread");
System.out.println(this._threadName + ": " + this._count);
this.writeToFile(this._threadName + ": " + this._count + "\n");
this._count++;
/* write key to pipe */
this.writeKeyToPipe(this._key);
System.out.println("key written");
/* set key = false */
this._key = false;
this._stratingThread = false;
} catch (IOException ex) {
Logger.getLogger(threadFlux.class.getName()).log(Level.SEVERE, null, ex);
}
} else {
try {
/* read key from pipe */
System.out.println(this._threadName + " Clef " + this._key);
this._key = this.readKeyFromPipe();
System.out.println(this._threadName + " Clef " + this._key);
/* write key to pipe */
System.out.println(this._threadName + ": " + this._count);
this.writeToFile(this._threadName + ": " + this._count + "\n");
this._count++;
/* write key to pipe for another thread */
this.writeKeyToPipe(this._key);
this._key = false;
} catch (IOException ex) {
Logger.getLogger(threadFlux.class.getName()).log(Level.SEVERE, null, ex);
}
}
}
System.out.println(this._threadName + " finish!");
}
public void setPipedStream(PipedOutputStream pos, PipedInputStream pis) throws IOException {
this._os = new DataOutputStream(new BufferedOutputStream(pos));
this._is = new DataInputStream(new BufferedInputStream(pis));
}
private void writeToFile(String string) throws IOException {
File file = new File(this._fileName);
//if the file doesn't exist, then create it
if (!file.exists()) {
file.createNewFile();
}
//true = append file
FileWriter fileWritter = new FileWriter(file.getName(), true);
try (BufferedWriter bufferWritter = new BufferedWriter(fileWritter)) {
bufferWritter.write(string);
bufferWritter.close();
}
}
private void writeKeyToPipe(boolean _key) throws IOException {
this._os.writeBoolean(_key);
}
private boolean readKeyFromPipe() throws IOException {
return this._is.readBoolean();
}
}
My main program
import java.io.IOException;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;
import java.util.logging.Level;
import java.util.logging.Logger;
public class Test {
public static void main(String[] args) {
try {
// TODO code application logic here
threadFlux runnableThread1 = new threadFlux("T1", "toto.txt", true, 3);
threadFlux runnableThread2 = new threadFlux("T2", "toto.txt", false, 3);
threadFlux runnableThread3 = new threadFlux("T3", "toto.txt", false, 3);
PipedOutputStream pos1 = new PipedOutputStream();
PipedOutputStream pos2 = new PipedOutputStream();
PipedOutputStream pos3 = new PipedOutputStream();
PipedInputStream pis2 = new PipedInputStream(pos1);
PipedInputStream pis1 = new PipedInputStream(pos3);
PipedInputStream pis3 = new PipedInputStream(pos2);
runnableThread1.setPipedStream(pos1, pis1);
runnableThread2.setPipedStream(pos2, pis2);
runnableThread3.setPipedStream(pos3, pis3);
Thread thread1 = new Thread(runnableThread1);
Thread thread2 = new Thread(runnableThread2);
Thread thread3 = new Thread(runnableThread3);
thread1.start();
thread2.start();
thread3.start();
} catch (IOException ex) {
Logger.getLogger(Test.class.getName()).log(Level.SEVERE, null, ex);
} finally {
}
}
}
The problem when I run this code is that it blocks after the starting thread has written to the file and written the key to the PipedOutputStream.
Thanks for any help.
PipedOutputStream has a fixed-size buffer, 4KB last time I looked. When it fills, write() blocks until the reading thread reads something, so a blocked writer means its reader isn't reading. Note also that you wrap the pipe in a BufferedOutputStream and never flush it, so a single written boolean can sit in that buffer and never reach the reader at all.
Don't do this. I/O pipes between threads are basically unnecessary. You don't need to move data like this. Find another design.
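For instance, here is a sketch of one possible alternative (assumed, not part of the original answer): pass the turn around a ring of BlockingQueues instead of pipes. Each thread takes a token from its input queue, writes its line, and hands the token to the next thread.

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class TurnWriter implements Runnable {
    private final String name;
    private final BlockingQueue<Boolean> in;   // this thread's turn arrives here
    private final BlockingQueue<Boolean> out;  // the turn is handed on here
    private final int maxCount;

    TurnWriter(String name, BlockingQueue<Boolean> in, BlockingQueue<Boolean> out, int maxCount) {
        this.name = name;
        this.in = in;
        this.out = out;
        this.maxCount = maxCount;
    }

    @Override
    public void run() {
        try {
            for (int count = 1; count <= maxCount; count++) {
                in.take();                                 // block until it is our turn
                System.out.println(name + " : " + count);  // append to toto.txt here instead
                out.put(Boolean.TRUE);                     // pass the turn to the next thread
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<Boolean> q1 = new LinkedBlockingQueue<>();
        BlockingQueue<Boolean> q2 = new LinkedBlockingQueue<>();
        BlockingQueue<Boolean> q3 = new LinkedBlockingQueue<>();
        new Thread(new TurnWriter("T1", q1, q2, 3)).start();
        new Thread(new TurnWriter("T2", q2, q3, 3)).start();
        new Thread(new TurnWriter("T3", q3, q1, 3)).start();
        q1.put(Boolean.TRUE); // seed the first turn so T1 starts
    }
}

This produces the interleaving from the question (T1 : 1, T2 : 1, T3 : 1, T1 : 2, ...) without any pipe buffering to worry about.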