Java 8 - program not reading file but seems to be writing though - java

import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.SeekableByteChannel;
import java.nio.file.Files;
import java.nio.file.OpenOption;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;
import java.nio.file.attribute.FileAttribute;
import java.nio.file.attribute.PosixFilePermission;
import java.nio.file.attribute.PosixFilePermissions;
import java.util.HashSet;
import java.util.Set;
public class RAFRead {
    public static void main(String[] args) {
        create();
        read();
    }

    public static void create() {
        // Create the set of options for appending to the file.
        Set<OpenOption> options = new HashSet<OpenOption>();
        options.add(StandardOpenOption.APPEND);
        options.add(StandardOpenOption.CREATE);
        // Create the custom permissions attribute.
        Set<PosixFilePermission> perms = PosixFilePermissions
                .fromString("rw-r-----");
        FileAttribute<Set<PosixFilePermission>> attr = PosixFilePermissions
                .asFileAttribute(perms);
        Path file = Paths.get("./outfile.log");
        ByteBuffer buffer = ByteBuffer.allocate(4);
        try {
            SeekableByteChannel sbc = Files.newByteChannel(file, options, attr);
            for (int i = 9; i >= 0; --i) {
                sbc = sbc.position(i * 4);
                buffer.clear();
                buffer.put(new Integer(i).byteValue());
                buffer.flip();
                sbc.write(buffer);
            }
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
    }

    public static void read() {
        // Create the set of options for reading from the file.
        Set<OpenOption> options = new HashSet<OpenOption>();
        options.add(StandardOpenOption.READ);
        Path file = Paths.get("./outfile.log");
        ByteBuffer buffer = ByteBuffer.allocate(4);
        try {
            SeekableByteChannel sbc = Files.newByteChannel(file, options);
            int nread;
            do {
                nread = sbc.read(buffer);
                if (nread != -1) {
                    buffer.flip();
                    System.out.println(buffer.getInt());
                }
            } while (nread != -1 && buffer.hasRemaining());
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
    }
}
I first create the file. I am trying to put 9, then 8, then 7 and so on into the file, but I am writing to the file in reverse order using random access, so the file's contents should actually end up in ascending order. I am only writing in reverse to try out random-access writing.
After that I try to read the file and print the data (numbers). It prints only 0; I was expecting it to print 1-9. I couldn't figure out the reason. Any help is appreciated.
I followed this tutorial from the Oracle site: https://docs.oracle.com/javase/tutorial/essential/io/file.html
The file has a non-zero size after I run this program, so the program does seem to be writing. Since the data is written as raw bytes, I can't inspect it with vi or cat.

You need to flip() the buffer before calling write() or get() (and friends), and compact() it afterwards so it is ready to be filled again by the next read.
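For example, here is a minimal sketch of a read loop that follows that pattern. It assumes the same ./outfile.log and 4-byte buffer as in the question, and only illustrates the buffer handling in read(); it does not address how create() positions its writes:

try (SeekableByteChannel sbc = Files.newByteChannel(Paths.get("./outfile.log"),
        StandardOpenOption.READ)) {
    ByteBuffer buffer = ByteBuffer.allocate(4);
    while (sbc.read(buffer) != -1) {
        buffer.flip();                      // switch the buffer from filling to draining
        while (buffer.remaining() >= 4) {   // only consume complete 4-byte ints
            System.out.println(buffer.getInt());
        }
        buffer.compact();                   // keep any partial int and make room for the next read
    }
} catch (IOException e) {
    System.out.println(e.getMessage());
}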

Related

Need to count the number of files located within each zip file in a directory folder in JAVA

I have a folder which has a series of zip files within it. I am trying to iterate through the folder and count the number of files that are in each zip file. I have created two pieces of code; I am just not sure how to put them together to get my desired result. Both are placed in try/catch blocks and both work perfectly independently. This is in Eclipse, written in Java.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.stream.Collectors;
import java.util.zip.ZipFile;
import java.io.File;
import java.util.List;
public class KZF {
    public static void main(String[] args) {
        // TODO Auto-generated method stub
        // Try/catch block counts the number of files within a given zip file
        try {
            ZipFile zipFile = new ZipFile(
                    "C:\\Users\\username\\Documents\\Temp\\AllKo\\Policy.zip");
            int NumberOfFiles = zipFile.size() - 1;
            // String name = zipFile.getName();
            Path path = Paths
                    .get("C:\\Users\\username\\Documents\\Temp\\AllKo\\Policy.zip");
            Path filename = path.getFileName();
            System.out.print("The number of files in: ");
            // System.out.print(name);
            System.out.print(filename.toString());
            System.out.print(" are: ");
            System.out.print(NumberOfFiles + " file(s)");
            zipFile.close();
        } catch (IOException ioe) {
            System.out.println("Error opening zip file" + ioe);
        }
        // ---------------------------------------------------------------------
        // Creates a list of every file in the specified folder
        String dirLocation = "C:\\Users\\username\\Documents\\Temp\\AllKo";
        try {
            List<File> files = Files.list(Paths.get(dirLocation))
                    .map(Path::toFile)
                    .collect(Collectors.toList());
            files.forEach(System.out::println);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
You must be careful about opening/closing streams, so you can try something like this:
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Enumeration;
import java.util.stream.Collectors;
import java.util.stream.Stream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;
public class KZF {
    static int findNumberOfFiles(File file) {
        try (ZipFile zipFile = new ZipFile(file)) {
            // count() returns a long, so narrow it to int for the return value
            return (int) zipFile.stream().filter(z -> !z.isDirectory()).count();
        } catch (Exception e) {
            return -1;
        }
    }

    static String createInfo(File file) {
        int tot = findNumberOfFiles(file);
        return file.getName() + ": " + (tot >= 0 ? tot + " files" : "Error reading zip file");
    }

    public static void main(String[] args) throws IOException {
        String dirLocation = "C:\\Users\\username\\Documents\\Temp\\AllKo";
        try (Stream<Path> files = Files.list(Paths.get(dirLocation))) {
            files
                .filter(path -> path.toFile().isFile())
                .filter(path -> path.toString().toLowerCase().endsWith(".zip"))
                .map(Path::toFile)
                .map(KZF::createInfo)
                .forEach(System.out::println);
        }
    }
}

Java IO outperforms Java NIO when it comes to file reading

I believed that the new nio package would outperform the old io package when it comes to the time required to read the contents of a file. However, based on my results, the io package seems to outperform the nio package. Here's my test:
import java.io.*;
import java.lang.reflect.Array;
import java.nio.ByteBuffer;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.util.ArrayList;
import java.util.List;
public class FileTestingOne {
    public static void main(String[] args) {
        long startTime = System.nanoTime();
        File file = new File("hey2.txt");
        try {
            byte[] a = direct(file);
            String s = new String(a);
        } catch (IOException err) {
            err.printStackTrace();
        }
        long endTime = System.nanoTime();
        long totalTime = (endTime - startTime);
        System.out.println(totalTime);
    }

    public static ByteBuffer readFile_NIO(File file) throws IOException {
        RandomAccessFile rFile = new RandomAccessFile(file.getName(), "rw");
        FileChannel inChannel = rFile.getChannel();
        ByteBuffer _buffer = ByteBuffer.allocate(1024);
        int bytesRead = inChannel.read(_buffer);
        while (bytesRead != -1) {
            _buffer.flip();
            while (_buffer.hasRemaining()) {
                byte b = _buffer.get();
            }
            _buffer.clear();
            bytesRead = inChannel.read(_buffer);
        }
        inChannel.close();
        rFile.close();
        return _buffer;
    }

    public static byte[] direct(File file) throws IOException {
        byte[] buffer = Files.readAllBytes(file.toPath());
        return buffer;
    }

    public static byte[] readFile_IO(File file) throws IOException {
        byte[] _buffer = new byte[(int) file.length()];
        InputStream in = null;
        try {
            in = new FileInputStream(file);
            if (in.read(_buffer) == -1) {
                throw new IOException(
                        "EOF reached while reading file. File is probably empty");
            }
        } finally {
            try {
                if (in != null)
                    in.close();
            } catch (IOException err) {
                // TODO Logging
                err.printStackTrace();
            }
        }
        return _buffer;
    }
}
// Small file
//7566395 -> readFile_NIO
//10790558 -> direct
//707775 -> readFile_IO
// Large file
//9228099 -> readFile_NIO
//737674 -> readFile_IO
//10903324 -> direct
// Very large file
//13700005 -> readFile_NIO
//2837188 -> readFile_IO
//11020507 -> direct
Results are:
Small file:
nio implementation: 7,566,395ns
io implementation: 707,775ns
direct implementation: 10,790,558ns
Large file:
nio implementation: 9,228,099ns
io implementation: 737,674ns
direct implementation: 10,903,324ns
Very large file:
nio implementation: 13,700,005ns
io implementation: 2,837,188ns
direct implementation: 11,020,507ns
I wanted to ask this question because (I believe) the nio package is non-blocking, so it should be faster, right?
Thank you,
Edit:
Changed ms to ns
Memory mapped files (or MappedByteBuffer) are a part of Java NIO and could help improve performance.
The non-blocking part of Java NIO means that a thread does not have to wait for the next data to read; it does not necessarily affect the performance of a full operation (like reading and processing a file) at all.
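For illustration, here is a minimal sketch of reading a file through a memory-mapped buffer (the file name is just a placeholder, and whether this is actually faster depends on file size and access pattern):

import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class MappedRead {
    public static void main(String[] args) throws IOException {
        try (FileChannel channel = FileChannel.open(Paths.get("hey2.txt"), StandardOpenOption.READ)) {
            // Map the whole file; the OS pages it in lazily as the buffer is accessed.
            MappedByteBuffer mapped = channel.map(FileChannel.MapMode.READ_ONLY, 0, channel.size());
            byte[] bytes = new byte[mapped.remaining()];
            mapped.get(bytes); // copy the mapped region into a heap array
            System.out.println(bytes.length + " bytes read");
        }
    }
}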

using Java arraylist for storing data from scan from file

I am new to Java, but not to coding. I am trying to figure out Java because it's part of my class this term, and I am having a really hard time grasping the ideas and implementing things in Java.
My problem is that I am not sure whether I am correctly using the ArrayList to grab data from the scan of the file and put it into an ArrayList to sort and print at a later time. I am just having trouble picking up Java, so any help would be great since I am new to it.
import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;
import java.io.FileNotFoundException;
import java.util.Scanner;
import java.util.regex.Pattern;
import java.util.ArrayList;
import java.util.*;
public class MissionCount {
    private static ArrayList<String> list = new ArrayList<String>();

    // returns an InputStream that gets data from the named file
    private static InputStream getFileInputStream(String fileName) throws Exception {
        InputStream inputStream;
        try {
            inputStream = new FileInputStream(new File(fileName));
        } catch (FileNotFoundException e) { // no file with this name exists
            inputStream = null;
            throw new Exception("unable to open the file -- " + e.getMessage());
        }
        return inputStream;
    }

    public static void main(String[] args) {
        if (args.length != 1) {
            System.out.println("USage: MissionCount <datafile>");
            // System.exit(1);
        }
        try {
            System.out.printf("CS261 - MissionCount - Chad Dreher%n%n");
            int crewcount = 0;
            int misscount = 0;
            InputStream log = getFileInputStream(args[0]);
            Scanner sc = new Scanner(log);
            sc.useDelimiter(Pattern.compile(",|\n"));
            while (sc.hasNext()) {
                String crewMember = sc.next();
                list.add(crewMember);
                String mission = sc.next();
                list.add(mission);
            }
            sc.close();
            // Add code to print the report here
        } catch (Exception e) {
            System.out.println("Error: " + e.getMessage());
        }
    }
}
InputStream log = getFileInputStream(args[0]);
Change that line as follows:
File log = new File(args[0]);
That should work!
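For context, a minimal sketch of the change inside main() (only the lines around the Scanner change; the rest stays the same). The Scanner(File) constructor throws FileNotFoundException, which the existing catch (Exception e) block already handles:

File log = new File(args[0]);
Scanner sc = new Scanner(log); // Scanner(File) throws FileNotFoundException
sc.useDelimiter(Pattern.compile(",|\n"));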

Read different portion of a file with multiple threads in Java

I have a 10GB PDF file that I would like to break up into 10 files, each 1GB in size. I need to do this operation in parallel, which means spinning up 10 threads, each of which starts from a different position, reads up to 1GB of data, and writes it to a file. Basically the final result should be 10 files that each contain a portion of the original 10GB file.
I looked at FileChannel, but the position is shared, so once I modify the position in one thread, it affects the other threads. I also looked at AsynchronousFileChannel in Java 7, but I'm not sure if that's the way to go. I appreciate any suggestions on this issue.
I wrote this simple program that reads a small text file to test the FileChannel idea; it doesn't seem to work for what I'm trying to achieve.
package org.cas.filesplit;
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Path;
import java.nio.file.Paths;
public class ConcurrentRead implements Runnable {
    private int myPosition = 0;

    public int getPosition() {
        return myPosition;
    }

    public void setPosition(int position) {
        this.myPosition = position;
    }

    static final String filePath = "C:\\Users\\temp.txt";

    @Override
    public void run() {
        try {
            readFile();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    private void readFile() throws IOException {
        Path path = Paths.get(filePath);
        FileChannel fileChannel = FileChannel.open(path);
        fileChannel.position(myPosition);
        ByteBuffer buffer = ByteBuffer.allocate(8);
        int noOfBytesRead = fileChannel.read(buffer);
        while (noOfBytesRead != -1) {
            buffer.flip();
            System.out.println("Thread - " + Thread.currentThread().getId());
            while (buffer.hasRemaining()) {
                System.out.print((char) buffer.get());
            }
            System.out.println(" ");
            buffer.clear();
            noOfBytesRead = fileChannel.read(buffer);
        }
        fileChannel.close();
    }
}
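As a side note on the shared-position concern: FileChannel also has a positional read, read(ByteBuffer dst, long position), which neither uses nor modifies the channel's own position, so several threads can read different regions through a single shared channel. A minimal sketch (the path and offset are placeholders):

import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Paths;

public class PositionalRead {
    public static void main(String[] args) throws IOException {
        try (FileChannel channel = FileChannel.open(Paths.get("C:\\Users\\temp.txt"))) {
            ByteBuffer buffer = ByteBuffer.allocate(8);
            long offset = 16;                        // placeholder offset for this thread's region
            int read = channel.read(buffer, offset); // does not touch the channel's position
            buffer.flip();
            System.out.println(read + " bytes read at offset " + offset);
        }
    }
}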

I would like to print the output that comes on the console to a text file. I tried the below but it didn't work out. Could anyone please look at this?

Below is the code I tried in order to print the output that comes on the console to a text file. The main idea of the code is to fetch values from a CSV file and print the output to a text file. Could someone let me know how this can be achieved?
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import com.csvreader.CsvReader;
public class projectInfo {
    public static void main(String[] args) {
        StringBuffer buffer = new StringBuffer();
        Connection conn;
        int count = 0;
        String sampleIddisp = null;
        String sample_name = null;
        String compound_name = null;
        String registration_date = null;
        }
        products.close();
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            System.out.println("\n Total number of records processed:" + count);
        }
    }
}
You never place any content in your StringBuffer buffer, so it is empty when you write it to file:
bw.write(buffer.toString());
Even apart from that, buffer could potentially consume a large amount of memory here. A better approach would be to write the data to the file as you read it from the database:
while (rs.next()) {
    sampleIddisp = rs.getString(1);
    ...
    bw.write(....);
}
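A minimal, self-contained sketch of that approach (the output file name and column indexes are placeholders; adjust them to the real query):

import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;
import java.sql.ResultSet;
import java.sql.SQLException;

public class ResultSetToFile {
    // Writes each row to the output file as it is read, so nothing accumulates in memory.
    static int writeRows(ResultSet rs, String outFile) throws IOException, SQLException {
        int count = 0;
        try (BufferedWriter bw = new BufferedWriter(new FileWriter(outFile))) {
            while (rs.next()) {
                bw.write(rs.getString(1) + "," + rs.getString(2)); // e.g. sample id and sample name
                bw.newLine();
                count++;
            }
        }
        return count;
    }
}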
