I'd like to create one Stream with data from several files. How can I do it? Here is my Java class. Or should I use something other than BufferedReader? Thanks!
import java.io.BufferedReader;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.util.stream.Stream;

public class BuffReader {
    public static void main(String[] args) throws FileNotFoundException {
        File file1 = new File("src/page_1.txt");
        File file2 = new File("src/page_2.txt");
        File file3 = new File("src/page_3.txt");
        BufferedReader bufferedReader = new BufferedReader(new FileReader(file1));
        //*** I'd like to get one bufferedReader with file1 + file2 + file3.
        Stream<String> stream = bufferedReader.lines(); // get Stream
        stream.forEach(e -> System.out.println(e)); // work with the Stream
    }
}
You can create one Stream from the BufferedReader of each file, combine them into a stream of streams, and then use the Stream#flatMap method to create a single stream that is a concatenation of all of them.
import java.util.function.Function;
import java.util.stream.Stream;

public class CombinedStreams
{
    public static void main(String[] args)
    {
        Stream<String> stream0 = Stream.of("line0", "line1");
        Stream<String> stream1 = Stream.of("line2", "line3");
        Stream<String> stream2 = Stream.of("line4", "line5");
        Stream<String> stream = Stream.of(stream0, stream1, stream2)
            .flatMap(Function.identity());
        stream.forEach(e -> System.out.println(e));
    }
}
(Kudos to diesieben07 for the suggested improvement!)
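Applied to files rather than literal streams, the same flatMap idea can be sketched like this. Note this is an assumption on my part: it uses Files.lines instead of an explicit BufferedReader, and the helper name is hypothetical.

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.function.Function;
import java.util.stream.Stream;

public class CombinedFileStreams {
    // Concatenate the lines of all given files into one lazy Stream.
    // flatMap closes each per-file stream once it is exhausted.
    static Stream<String> concatLines(List<Path> files) {
        return files.stream()
                .map(p -> {
                    try {
                        return Files.lines(p);
                    } catch (IOException e) {
                        throw new UncheckedIOException(e);
                    }
                })
                .flatMap(Function.identity());
    }

    public static void main(String[] args) {
        // hypothetical file names, matching the question
        List<Path> pages = List.of(Path.of("src/page_1.txt"),
                Path.of("src/page_2.txt"), Path.of("src/page_3.txt"));
        try (Stream<String> lines = concatLines(pages)) {
            lines.forEach(System.out::println);
        }
    }
}
```

The try-with-resources block around the combined stream makes sure any file handle still open (e.g. after a short-circuiting operation) is released.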
If you don't need a BufferedReader and the Stream solution is enough, use that.
If you absolutely need a Reader you can use SequenceInputStream to concatenate the InputStreams and then create a BufferedReader from that.
The API is a little clunky since SequenceInputStream takes an Enumeration, so you would have to use one of the old collection types like Vector, or wrap a List with Collections.enumeration, to construct it, but it works.
Basically I have to do the following:
1) Read the CSV input file.
2) Filter the CSV data based on the blacklist.
3) Sort the input based on the country names in ascending order.
4) Print the records which are not blacklisted.
The CSV file contains:
id,country-short,country
1,AU,Australia
2,CN,China
3,AU,Australia
4,CN,China
The blacklist file contains:
AU
JP
And the desired output is
2,CN,China
4,CN,China
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Optional;
import java.util.stream.Stream;

public class StreamProcessing {
    public static void filterCsv(String fileName, String blacklistFile) {
        try (Stream<String> stream1 = Files.lines(Paths.get(fileName))) {
            Stream<String> stream2 = Files.lines(Paths.get(blacklistFile));
            // does not compile: String#contains expects a CharSequence, not a Stream
            Optional<String> hasBlackList = stream1.filter(s -> s.contains(stream2)).findFirst();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        StreamProcessing sp = new StreamProcessing();
        sp.filterCsv("Data.csv", "blacklist.txt");
    }
}
I want to remove the entries that are present in the second Stream from the first Stream, without converting either stream into an array. How can I do that?
You can consume a stream only once. Since you need access to all the members of the blacklist while evaluating each member of the main file, you must first consume the blacklist stream in entirety. For efficiency reasons, don't convert to an array, but to a HashSet.
static boolean hasBlacklistedWord(String fileName, String blacklistFile) throws IOException {
    Set<String> blacklist = Files.lines(Paths.get(blacklistFile)).collect(Collectors.toSet());
    return Files.lines(Paths.get(fileName)).anyMatch(blacklist::contains);
}
Unfortunately you can't use a stream more than once so code that checks the blacklist stream for each line will fail. The easiest solution would be to store the blacklist in a collection and then check each line against it.
List<String> blacklist = Files.readAllLines(Paths.get(blacklistFile));
boolean hasBlacklist = Files.lines(Paths.get(filename)).anyMatch(blacklist::contains);
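The whole task from the question (filter out blacklisted records, then sort by country) could be sketched like this. The column layout is assumed from the sample data (id,country-short,country), as is the decision to skip the header row:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Comparator;
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class CsvFilter {
    // Return the non-blacklisted records, sorted by country name (third column).
    static List<String> filterCsv(Path csv, Path blacklist) throws IOException {
        Set<String> blocked;
        try (Stream<String> bl = Files.lines(blacklist)) {
            blocked = bl.collect(Collectors.toSet());
        }
        try (Stream<String> lines = Files.lines(csv)) {
            return lines.skip(1) // skip the header row
                    .filter(line -> !blocked.contains(line.split(",")[1]))
                    .sorted(Comparator.comparing((String line) -> line.split(",")[2]))
                    .collect(Collectors.toList());
        }
    }
}
```

Both streams come from Files.lines, so both are declared in try-with-resources; the blacklist is fully collected into a Set before the main file is processed.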
From the docs:
Streams have a BaseStream.close() method and implement AutoCloseable, but nearly all stream instances do not actually need to be closed after use. Generally, only streams whose source is an IO channel (such as those returned by Files.lines(Path, Charset)) will require closing. Most streams are backed by collections, arrays, or generating functions, which require no special resource management. (If a stream does require closing, it can be declared as a resource in a try-with-resources statement.)
When I create a Stream<String> using the lines() method on a BufferedReader as seen below, does closing the Stream also close the BufferedReader?
try (Stream<String> lines = new BufferedReader(new InputStreamReader(process.getInputStream())).lines()) {
// Do stuff
}
// Is the BufferedReader, InputStreamReader and InputStream closed?
Some really quick tests I've tried say no (the in field of the BufferedReader is not null), but then I'm confused by the following sentence, since this example is I/O as well, right?
Generally, only streams whose source is an IO channel (such as those returned by Files.lines(Path, Charset)) will require closing.
If not, do I need to close both instances, or will closing the BufferedReader suffice?
Ideally, I'd like to return a Stream<String> from some method, without having the client worry about the readers. At the moment, I've created a Stream decorator which also closes the reader, but it's easier if that isn't necessary.
If you want to defer closing of the reader to the delivered Stream you need to invoke Stream.onClose():
static Stream<String> toStream(BufferedReader br) {
    return br.lines().onClose(asUncheckedAutoCloseable(br));
}

static Runnable asUncheckedAutoCloseable(AutoCloseable ac) {
    return () -> {
        try {
            ac.close();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    };
}
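A self-contained sketch of how a caller would use this pattern (a StringReader stands in for a real file so the example runs on its own; the class name is hypothetical):

```java
import java.io.BufferedReader;
import java.io.StringReader;
import java.util.stream.Stream;

public class DeferredClose {
    // Close the reader when (and only when) the stream is closed.
    static Stream<String> toStream(BufferedReader br) {
        return br.lines().onClose(() -> {
            try {
                br.close();
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        });
    }

    public static void main(String[] args) {
        BufferedReader br = new BufferedReader(new StringReader("line1\nline2"));
        // try-with-resources closes the Stream, which in turn closes br
        try (Stream<String> lines = toStream(br)) {
            lines.forEach(System.out::println);
        }
    }
}
```

The client only sees a Stream<String> and can use ordinary try-with-resources; the reader's lifetime is tied to the stream's.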
No, it seems it doesn't. The stream is created using

return StreamSupport.stream(Spliterators.spliteratorUnknownSize(
        iter, Spliterator.ORDERED | Spliterator.NONNULL), false);

which doesn't pass any reference to the BufferedReader.
In your question you don't show how you create the Reader that is the argument of new BufferedReader(in). But from my own tests there is no reason to assume that the Stream closes this argument.
Doing the following should close everybody:
import java.io.BufferedReader;
import java.io.File;
import java.io.FileInputStream;
import java.io.InputStreamReader;
import java.io.Reader;
import java.util.stream.Stream;

public class SOPlayground {
    public static void main(String[] args) throws Exception {
        try (Reader in = new InputStreamReader(new FileInputStream(new File("/tmp/foo.html")));
             BufferedReader reader = new BufferedReader(in);
             Stream<String> lines = reader.lines()) {
            lines.forEach(System.out::println);
        }
    }
}
I'm trying to write some text to a file. I have a while loop that is supposed to just take some text and write the exact same text back to the file.
I discovered that the while loop is never entered because Scanner thinks there's no more text to read. But there is.
import java.util.Scanner;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.PrintWriter;

public class WriteToFile {
    public static void main(String[] args) throws FileNotFoundException {
        String whatToWrite = "";
        File theFile = new File("C:\\test.txt");
        Scanner readinput = new Scanner(theFile);
        PrintWriter output = new PrintWriter(theFile);
        while (readinput.hasNext()) { // why is this false initially?
            String whatToRead = readinput.next();
            whatToWrite = whatToRead;
            output.print(whatToWrite);
        }
        readinput.close();
        output.close();
    }
}
The text file just contains random words. Dog, cat, etc.
When I run the code, text.txt becomes empty.
There was a similar question: https://stackoverflow.com/questions/8495850/scanner-hasnext-returns-false which pointed to encoding issues. I use Windows 7 and U.S. language. Can I find out how the text file is encoded somehow?
Update:
Indeed, as Ph.Voronov commented, the PrintWriter line erases the file contents! user2115021 is right, if you use PrintWriter you should not work on one file. Unfortunately, for the assignment I had to solve, I had to work with a single file. Here's what I did:
import java.util.ArrayList;
import java.util.Scanner;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.PrintWriter;

public class WriteToFile {
    public static void main(String[] args) throws FileNotFoundException {
        ArrayList<String> theWords = new ArrayList<String>();
        File theFile = new File("C:\\test.txt");
        Scanner readinput = new Scanner(theFile);
        while (readinput.hasNext()) {
            theWords.add(readinput.next());
        }
        readinput.close();
        PrintWriter output = new PrintWriter(theFile); // we already got all of
        // the file content, so it's safe to erase it now
        for (int a = 0; a < theWords.size(); a++) {
            output.print(theWords.get(a));
            if (a != theWords.size() - 1) {
                output.print(" ");
            }
        }
        output.close();
    }
}
PrintWriter output = new PrintWriter(theFile);
It erases your file.
You are trying to read the file using Scanner and write the content back using PrintWriter, but both are working on the same file. PrintWriter clears the content of the file in order to write to it. The two classes need to work on different files.
I have a file, and I know that the file will always contain only one word.
So what is the most efficient way to read it?
Do I have to create an InputStreamReader even for small files, or are there other options available?
Well something's got to convert bytes to characters.
Personally I'd suggest using Guava which will allow you to write something like this:
String text = Files.toString(new File("..."), Charsets.UTF_8);
Obviously Guava contains much more than just this. It wouldn't be worth it for this single method, but it's a positive treasure trove of utility classes. Guava and Joda Time are two libraries I couldn't do without :)
Use Scanner
File file = new File("filename");
Scanner sc = new Scanner(file);
System.out.println(sc.next()); //it will give you the first word
If the first word is an int, float, etc., you can use the corresponding method, such as nextInt(), nextFloat(), and so on.
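For example (the token values are made up, and a string source is used so the snippet runs as-is; useLocale pins the decimal format):

```java
import java.util.Locale;
import java.util.Scanner;

public class FirstToken {
    public static void main(String[] args) {
        // Scanner can parse tokens from any source, not just a File
        Scanner sc = new Scanner("42 3.14 hello").useLocale(Locale.US);
        int i = sc.nextInt();     // first token as an int
        float f = sc.nextFloat(); // second token as a float
        String s = sc.next();     // third token as a plain string
        System.out.println(i + " " + f + " " + s);
        sc.close();
    }
}
```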
By efficient, do you mean performance-wise or code simplicity (lazy programmer)?
If it is the second, then nothing I know beats:

String fileContent = org.apache.commons.io.FileUtils.readFileToString(new File("/your/file/name.txt"));
- Use InputStream and Scanner for reading the file.
Eg:
import java.io.File;
import java.io.FileNotFoundException;
import java.util.Scanner;

public class Pass {
    public static void main(String[] args) {
        File f = new File("E:\\karo.txt");
        Scanner scan;
        try {
            scan = new Scanner(f);
            while (scan.hasNextLine()) {
                System.out.println(scan.nextLine());
            }
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }
    }
}
- Guava Library handles this beautifully and efficiently.
Use the BufferedReader and FileReader classes. Two lines of code suffice to read a one-word or one-line file.
BufferedReader br = new BufferedReader(new FileReader("Demo.txt"));
System.out.println(br.readLine());
Here is a small program to do so. An empty file will cause 'null' to be printed as output.
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class SmallFileReader {
    public static void main(String[] args) throws IOException {
        BufferedReader br = new BufferedReader(new FileReader("Demo.txt"));
        System.out.println(br.readLine());
        br.close();
    }
}
I am working on a basic encryption program for a school project, and I want to have easily interchangable keys. As it stands, I have a encryption class and a decryption class, with multiple methods. One of those methods is the key that I want to print to a file. Because I will be making many changes to those two classes (apart from the keys), I want to be able to print just that one method to a file. I also need to be able to load it again. Is there any easy way to do this?
You could use Java serialization. As you commented that you have a String array to save, you could do something like this:

// save object to file
ObjectOutputStream oos = new ObjectOutputStream(new FileOutputStream(new File("/tmp/file")));
oos.writeObject(myArray); // where myArray is String[]
oos.close();

// load object from file
ObjectInputStream ois = new ObjectInputStream(new FileInputStream(new File("/tmp/file")));
String[] read = (String[]) ois.readObject();
ois.close();
A working example ;) — it saves the arguments received when the application is executed.
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.util.Arrays;

public class TestSerialization {
    public static void main(final String[] array) throws FileNotFoundException, IOException, ClassNotFoundException {
        // save object to file
        ObjectOutputStream oos = new ObjectOutputStream(new FileOutputStream(new File("/tmp/file")));
        oos.writeObject(array);
        oos.close(); // flush the buffered object data to disk before reading it back

        // load object from file
        ObjectInputStream ois = new ObjectInputStream(new FileInputStream(new File("/tmp/file")));
        String[] read = (String[]) ois.readObject();
        ois.close();
        System.out.println(Arrays.toString(read));
    }
}
Serialization, pointed out by user Speath, is a very good approach, I find.
If you want to be more selective in what you write into your files, you can use simple file I/O as follows:
create a new file on the file system with the FileWriter class and then initiate an I/O stream with BufferedWriter to write into this file:
// create a new file with the specified file name
FileWriter fw = new FileWriter("myFile.log");
// create the I/O stream on that file
BufferedWriter bw = new BufferedWriter(fw);
// write a string into the stream
bw.write("my log entry");
// don't forget to close the stream!
bw.close();

The whole thing must be surrounded with a try/catch in order to catch IOException.
Hope this helps.
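A complete version using try-with-resources, which handles the close automatically even if writing throws (the file name is the same hypothetical one as above):

```java
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;

public class LogWriter {
    public static void main(String[] args) {
        // the writer is closed automatically when the block exits
        try (BufferedWriter bw = new BufferedWriter(new FileWriter("myFile.log"))) {
            bw.write("my log entry");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
```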