I have a file similar to this:
...
# The hotspot server JVM has specific code-path optimizations
# which yield an approximate 10% gain over the client version.
export CATALINA_OPTS="$CATALINA_OPTS -server"
#############HDK1001#############
# Disable remote (distributed) garbage collection by Java clients
# and remove ability for applications to call explicit GC collection
export CATALINA_OPTS="$CATALINA_OPTS -XX:+DisableExplicitGC"
# Check for application specific parameters at startup
if [ -r "$CATALINA_BASE/bin/appenv.sh" ]; then
. "$CATALINA_BASE/bin/appenv.sh"
fi
#############HDK7564#############
# Disable remote (distributed) garbage collection by Java clients
# and remove ability for applications to call explicit GC collection
export CATALINA_OPTS="$CATALINA_OPTS -XX:+DisableExplicitGC"
I want to begin reading at the line containing the word "HDK1001" and stop at the line containing the word "HDK7564".
I tried with this code, but I am unable to implement that limitation.
public static HashMap<String, String> getEnvVariables(String scriptFile, String config) {
    HashMap<String, String> vars = new HashMap<String, String>();
    try {
        FileInputStream fstream = new FileInputStream(scriptFile);
        BufferedReader br = new BufferedReader(new InputStreamReader(fstream));
        String strLine;
        String var = "HDK1001";
        while ((strLine = br.readLine()) != null) {
            if (strLine.startsWith("export") && !strLine.contains("$")) {
                strLine = strLine.substring(7);
                Scanner scanner = new Scanner(strLine);
                scanner.useDelimiter("=");
                if (scanner.hasNext()) {
                    String name = scanner.next();
                    String value = scanner.next();
                    System.out.println(name + "=" + value);
                    vars.put(name, value);
                }
            }
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
    return vars;
}
Help me please
Try this code:
public static HashMap<String, String> getEnvVariables(String scriptFile,
        String config) {
    HashMap<String, String> vars = new HashMap<String, String>();
    BufferedReader br = null;
    try {
        FileInputStream fstream = new FileInputStream(scriptFile);
        br = new BufferedReader(new InputStreamReader(fstream));
        String strLine = null;
        String stopvar = "HDK7564";
        String startvar = "HDK1001";
        String keyword = "export";
        do {
            if (strLine != null && strLine.contains(startvar)) {
                if (strLine.contains(stopvar)) {
                    return vars;
                }
                // Read each line between the markers and collect the exports.
                while (strLine != null && !strLine.contains(stopvar)) {
                    strLine = br.readLine();
                    // Guard against hitting end-of-file before the stop marker.
                    if (strLine != null && strLine.startsWith(keyword)) {
                        strLine = strLine.substring(keyword.length()).trim();
                        String[] split = strLine.split("=");
                        String name = split[0];
                        String value = split[1];
                        System.out.println(name + "=" + value);
                        vars.put(name, value);
                    }
                }
            }
        } while ((strLine = br.readLine()) != null);
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        if (br != null) {
            try {
                br.close();
            } catch (IOException ignored) {
            }
        }
    }
    return vars;
}
Your example code is quite far off, and I don't intend to rewrite all of it, but I will give you some pointers. You are already doing:
if (strLine.startsWith("export") && !strLine.contains("$"))
This is the conditional that should be testing for the "HDK1001" string instead of whatever it's doing right now. I'm not sure why you are checking for the word "export" there when it seems like it doesn't matter for what you are trying to do.
There isn't a way to just magically start and end at specific words in the file; you must start at the beginning and go line by line, checking each one, until you find your desired first and last lines. Once you find the first line, you can continue reading until you reach your desired end line and then bail out.
Here is pseudocode that follows the kind of logic you would need to accomplish this task; a rough Java sketch of the same idea follows it.
flag = false
inside a loop
{
    read in a line
    if (line != #############HDK1001############# && flag == false)   // check to see if you found your starting place
        nothing useful here, let's go around the loop and try again
    else                                                               // if I found it, set the flag to true
        flag = true

    if (flag == true)   // am I after my starting place but before my end place?
    {
        if (line == #############HDK1001#############)
            do nothing and go back through the loop, this line is not useful to us
        else if (line == #############HDK7564#############)   // did I find my end place?
            flag = false   // yes I did, so let's not assign anything any more
        else   // I'm not at the start, not at the end, so I must be in between: read the data and assign it
            read in the line and assign it to the variables that you want
    }
}
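For concreteness, here is a rough Java sketch of that flag-based approach. It is untested and not a drop-in replacement for your method; the marker strings and the "export" parsing are taken from your snippets, everything else is illustrative.

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.HashMap;

public class MarkerReader {
    // Collect "export NAME=VALUE" lines that appear between the HDK1001 and
    // HDK7564 marker lines. Lines outside that region are skipped.
    public static HashMap<String, String> readBetweenMarkers(String scriptFile) throws IOException {
        HashMap<String, String> vars = new HashMap<String, String>();
        boolean insideRegion = false;
        try (BufferedReader br = new BufferedReader(new FileReader(scriptFile))) {
            String line;
            while ((line = br.readLine()) != null) {
                if (line.contains("HDK1001")) {
                    insideRegion = true;   // start marker found
                    continue;
                }
                if (line.contains("HDK7564")) {
                    break;                 // end marker found, stop reading
                }
                if (insideRegion && line.startsWith("export")) {
                    String[] parts = line.substring("export".length()).trim().split("=", 2);
                    if (parts.length == 2) {
                        vars.put(parts[0], parts[1]);
                    }
                }
            }
        }
        return vars;
    }
}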
I have a CSV file that doesn't always have the same number of lines. However, I want a method that only reads the last line, so I can access the first column of that last line. So far I haven't found a solution that does exactly that.
Right now I'm just at the point where I would read every single line with BufferedReader and save it into an array.
public void readPreviousEntryID(){
    String path = "csvs/cartEntries.csv";
    try {
        BufferedReader br = new BufferedReader(new FileReader(path));
        String line;
        while ((line = br.readLine()) != null) {
            String[] values = line.split(",");
        }
    } catch (IOException e) {
        throw new RuntimeException(e);
    }
}
Normally I would then access the first entry of every line by using values[0]. But I just want the first value of the last line.
I thought about counting the number of lines in the while loop by incrementing an integer and then using the final value of that integer to access the corresponding line, but I'm not sure if this would work or how I would implement that.
I hope I included enough information to make the problem understandable. This is my first question here and I'm quite new to Java.
Simply read the lines of the file in a loop and save the values of the last line read. After the loop terminates, values contains the contents of the last line in the file.
public void readPreviousEntryID() throws IOException {
String path = "csvs/cartEntries.csv";
try (FileReader fr = new FileReader(path);
BufferedReader br = new BufferedReader(fr)) {
String[] values = null;
String line = br.readLine();
while (line != null) {
values = line.split(",");
line = br.readLine();
}
if (values == null) {
throw new IOException("File is empty.");
}
// Handle 'values[0]'
}
}
The advantage of the above code is that you don't need to store the entire file contents in memory; if the CSV file is very large, holding it all could cause an OutOfMemoryError.
Note that it is important to close a file after you have finished reading it. Since Java 7 you can use try-with-resources for that.
Rather than catch the IOException and wrap it in a RuntimeException, I suggest that you simply declare that method readPreviousEntryID may throw an IOException. Refer to Unchecked Exceptions — The Controversy.
It is probably also a good idea to check, after the loop terminates, that values contains the expected number of elements, e.g.
if (values.length == 5) {
// Handle 'values[0]'
}
else {
throw new IOException("Invalid last line.");
}
Edit
Alternatively, no need to split every line. Just save the last line read and split that last line after the loop terminates.
public void readPreviousEntryID() throws IOException {
String path = "csvs/cartEntries.csv";
try (FileReader fr = new FileReader(path);
BufferedReader br = new BufferedReader(fr)) {
String lastLine = null;
String line = br.readLine();
while (line != null) {
lastLine = line;
line = br.readLine();
}
if (lastLine == null) {
throw new IOException("File is empty.");
}
String[] values = lastLine.split(",");
// Handle 'values[0]'
}
}
Why not store all the lines in a List and get the last line's details, as follows?
private List<String[]> readLines = new ArrayList<>();
public void readPreviousEntryID(){
String path = "csvs/cartEntries.csv";
try {
String line;
BufferedReader br = new BufferedReader(new FileReader(path));
while ((line = br.readLine()) != null) {
String[] values = line.split(",");
readLines.add(values);
}
br.close();
} catch (IOException e) {
e.printStackTrace();
}
}
public String[] getLastLine() {
return readLines.get(readLines.size()-1);
}
The getLastLine() function above returns the last row of the CSV file.
In Linux one would use the tail command to print the n last lines. Searching for "java tail" will turn up some implementations.
A good fast implementation for large files would use a RandomAccessFile, maybe a MemoryMappedByteBuffer, and search back from the end for a \n.
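A rough, untested sketch of that backwards search (assuming a UTF-8-compatible encoding and \n line endings; it needs java.io.RandomAccessFile and java.nio.charset.StandardCharsets):

// Scan backwards from the end of the file for the last '\n' and return
// everything after it, without reading the whole file.
static String lastLine(String path) throws IOException {
    try (RandomAccessFile raf = new RandomAccessFile(path, "r")) {
        long end = raf.length() - 1;
        if (end < 0) {
            return "";          // empty file
        }
        raf.seek(end);
        if (raf.read() == '\n') {
            end--;              // ignore a trailing newline
        }
        if (end < 0) {
            return "";          // file contained only a newline
        }
        long start = end;
        while (start > 0) {     // walk back to the previous '\n'
            raf.seek(start - 1);
            if (raf.read() == '\n') {
                break;
            }
            start--;
        }
        byte[] bytes = new byte[(int) (end - start + 1)];
        raf.seek(start);
        raf.readFully(bytes);
        return new String(bytes, StandardCharsets.UTF_8);
    }
}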
In your case you can keep it simple.
public String readPreviousEntryID(){
    Path path = Paths.get("csvs/cartEntries.csv");
    try (Stream<String> lines = Files.lines(path, Charset.defaultCharset())) {
        return lines
            .filter(line -> !line.isEmpty())
            .reduce("", (acc, line) -> line);
    } catch (IOException e) {
        // You gave a relative path; show the actual full path:
        System.out.println("File not found: " + path.toAbsolutePath());
        throw new RuntimeException(e);
    } // Automatically closes the Stream lines, even on exception or return.
}
Try-with-resources, try (... declarations of AutoCloseables ...) { ... }, ensures the call to .close().
Stream is the newest way to walk through items; here it skips empty lines and "accumulates" just the last line, so you need not keep all lines in memory.
Lambdas, x -> ... or (x, y) -> ..., declare an anonymous function with one or two parameters respectively.
Path is a generalisation of the disk-file-only File; a Path can also come from a URL, point inside a zip file, et cetera.
Files is a worthwhile utility class providing many Path-related goodies.
Some good Answers have been posted. Here is a variation using streams.
Also, learn about NIO.2 as the modern way to work with files in Java.
Some untested code to try:
Path path = Paths.get( "/csvs" , "cartEntries.csv" ) ;
Optional < String > lastLine =
Files
.lines( path )
.reduce( ( previousLine , currentLine ) -> currentLine ) ;
if( lastLine.isPresent() ) {
String[] parts = lastLine.get().split( "," ) ;
…
}
Or, re-organized into a one-liner:
String[] parts =
    Files
    .lines(
        Paths.get( "/csvs" , "cartEntries.csv" )
    )
    .reduce(
        ( previousLine , currentLine ) -> currentLine
    )
    .map(
        line -> line.split( "," )
    )
    .orElse(
        new String[ 0 ]
    )
;
Can someone tell me how to read every second line from a file in Java?
BufferedReader br = new BufferedReader(new FileReader(file));
String line = br.readLine();
while (line != null) {
    // Do something ..
    line = br.readLine();
}
br.close();
One simple way would be to just maintain a counter of the number of lines read:
int count = 0;
String line;
while ((line = br.readLine()) != null) {
if (count % 2 == 0) {
// do something with this line
}
++count;
}
But this still technically reads every line in the file, only choosing to process every other line. If you really only want to read every second line, then something like RandomAccessFile might be necessary.
You can do it in Java 8 fashion with very few lines:
static final int FIRST_LINE = 1;
Stream<String> lines = Files.lines(path);
String secondLine = lines.limit(2).skip(FIRST_LINE).collect(Collectors.joining("\n"));
First you stream your file's lines
You keep only the first two lines
Skip the first line
Note: In Java 8, when using Files.lines(), you are supposed to close the stream afterwards or use it in a try-with-resources block.
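For example, the snippet above could be wrapped like this (a small sketch, assuming path and FIRST_LINE are defined as before and the enclosing method declares or handles IOException):

// The stream is closed automatically when the try block exits.
try (Stream<String> lines = Files.lines(path)) {
    String secondLine = lines.limit(2)
                             .skip(FIRST_LINE)
                             .collect(Collectors.joining("\n"));
    System.out.println(secondLine);
}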
This is similar to @Tim Biegeleisen's approach, but I thought I would show an alternative that gets every other line using a boolean instead of a counter:
boolean skipOddLine = true;
String line;
while ((line = br.readLine()) != null) {
if (skipOddLine = !skipOddLine) {
//Use the String line here
}
}
This will toggle the boolean value every loop iteration, skipping every odd line. If you want to skip every even line instead you just need to change the initial condition to boolean skipOddLine = false;.
Note: This approach only works if you do not need to extend functionality to skip every 3rd line for example, where an approach like Tim's would be easier to modify. It also has the downside of being harder to read than the modulo approach.
This approach should help you do it cleanly:
You can use try-with-resources
You can use the Java 8 Stream API
You can use a stream Supplier so the stream object can be obtained again and again
I have added comments to the code below so you can follow each part
try (BufferedReader reader =
         new BufferedReader(
             new InputStreamReader(
                 new ByteArrayInputStream(x.getBytes()),
                 "UTF-8"))) { // an explicit charset helps when reading files in various languages
    Supplier<Stream<String>> fileContentStream = reader::lines; // lets you obtain the stream object again and again
    if (FilenameUtils.getExtension(x.getOriginalFilename()).equals("txt")) { // filter by file extension
        String secondLine =
            fileContentStream
                .get()
                .limit(2)
                .skip(1) // you can skip any number of lines with this action
                .collect(Collectors.joining("\n"));
    } else if (FilenameUtils.getExtension(x.getOriginalFilename()).equals("pdf")) {
        // handle other extensions here
    }
} catch (Exception ex) {
    // handle or log the exception
}
I have an application that can create backups of its information and recover them. I was working on a seventh version and found that the recovery method would not work. It acted as if it did, but did nothing. I could not figure out what was causing this, so I decided to simply start over from my sixth version.
None of my versions' recovery methods work now, despite passing all tests in the past.
It does not throw an error or anything else. It tells the user "The backup has been restored." with a toast. If external storage isn't allowed, it throws an error. If the file path doesn't exist, it throws an error. Nothing abnormal shows up in Logcat. But when everything is right, it simply doesn't work. So... here's the recovery method from a stable version. Let me know if there's anything else you'd like to examine.
Edit: Changed the method of decoding from Base64 and setting it to a string. The end result is still the same. I've narrowed the issue down to the while loop, which never runs, so no information is actually processed.
//Method025: Imports user acc settings from a file on a specified path.
public void importFile() {
//The file variable to be imported.
File file;
try {
//Used to access settings.
TinyDB database = new TinyDB(getApplicationContext());
//Sets the file equal to the file found at the specified path.
String strfilePath = database.getString("FilePath");
file = new File(strfilePath);
//To be used to arrange the imported information.
ArrayList<String> strAcc = new ArrayList<>();
ArrayList<String> strUser = new ArrayList<>();
ArrayList<String> strPass = new ArrayList<>();
ArrayList<String> strAdditionalInfo = new ArrayList<>();
//To be used to store all the information for additional info variables. This is
//due to its multi-line nature requiring a slightly different method of
//importation, the other variables are expected to be one line.
String strExtraInfo = "";
//Goes through the file and adds info to arrays for each corresponding variable.
//If the line does not have an identifier, it assumes it to be an additional
//info line, and will be processed later.
try (BufferedReader br = new BufferedReader(new FileReader(file))) {
String line;
String strLine = br.readLine();
//Decodes the line from Base64 and converts it to a string.
byte[] decodedContent = Base64.decode(strLine.getBytes(), Base64.DEFAULT);
strLine = new String (decodedContent);
while ((line = br.readLine()) != null) {
if (strLine.contains("[Acc]")) {
strLine = strLine.replace("[Acc]","");
strAcc.add(strLine);
} else if (strLine.contains("[User]")) {
strLine = strLine.replace("[User]", "");
strUser.add(strLine);
} else if (strLine.contains("[Pass]")) {
strLine = strLine.replace("[Pass]", "");
strPass.add(strLine);
} else {
strExtraInfo += strLine;
}
}
}
The issue was completely different from what I first believed, and there ended up being many problems. The main ones are listed below.
I assume that passing the tests in the past was down to human error, and the question has been changed to reflect that.
Incorrect method of decoding Base64
Converting to Base64 and back messed up the line spacing
readLine() had effectively been called once before the loop, and calling it again returned nothing, so the loop never processed any information
Not enough error checking for things such as strAdditionalInfo.size() > 0
The substring-based recovery method throws many errors; the same thing can be done more simply using split()
Here is the updated code, fully functional.
public void importFile() {
//The file variable to be imported.
File file;
try {
//Used to access settings.
TinyDB database = new TinyDB(getApplicationContext());
//Sets the file equal to the file found at the specified path.
String strfilePath = database.getString("FilePath");
file = new File(strfilePath);
//To be used to arrange the imported information.
ArrayList<String> strAcc = new ArrayList<>();
ArrayList<String> strUser = new ArrayList<>();
ArrayList<String> strPass = new ArrayList<>();
ArrayList<String> strAdditionalInfo = new ArrayList<>();
//To be used to store all the information for additional info variables. This is
//due to its multi-line nature requiring a slightly different method of
//importation, the other variables are expected to be one line.
String strExtraInfo = "";
//Goes through the file and adds info to arrays for each corresponding variable.
//If the line does not have an identifier, it assumes it to be an additional
//info line, and will be processed later.
try (BufferedReader br = new BufferedReader(new FileReader(file))) {
String line;
String strLine;
while ((line = br.readLine()) != null) {
if (line.contains("[Acc]")) {
strLine = line.replace("[Acc]","");
strAcc.add(strLine);
} else if (line.contains("[User]")) {
strLine = line.replace("[User]", "");
strUser.add(strLine);
} else if (line.contains("[Pass]")) {
strLine = line.replace("[Pass]", "");
strPass.add(strLine);
} else {
strExtraInfo += line;
}
}
}
//Gets the list of accounts.
ArrayList<String> savedInfo = new ArrayList<>(database.getListString("allSaved"));
//To be used to get the AdditionalInfo variables one line at a time.
String strSubInfo;
//Gets rid of any erroneous double spaces.
while (strExtraInfo.contains("  ")) {
strExtraInfo = strExtraInfo.replace("  ", " ");
}
Log.d("STRExtraInfo",strExtraInfo);
strExtraInfo = strExtraInfo.replace("[ExtraStart]","");
String[] array = strExtraInfo.split("\\[ExtraEnd\\]");
ArrayList<String> strRawAdditionalInfo = new ArrayList<>(Arrays.asList(array));
for (String info : strRawAdditionalInfo){
strAdditionalInfo.add(info);
Log.d("ExtraInfo",info );
}
//Arranges the information.
for (String name : strAcc) {
savedInfo.add(name);
ArrayList<String> allInfo = new ArrayList<>();
//Gets the info then adds it to database.
//Deletes the old information.
if (strUser.size() > 0) {
allInfo.add(strUser.get(0));
strUser.remove(0);
}
if (strPass.size() > 0) {
allInfo.add(strPass.get(0));
strPass.remove(0);
}
if (strAdditionalInfo.size() > 0) {
allInfo.add(strAdditionalInfo.get(0));
strAdditionalInfo.remove(0);
}
database.putListString(name,allInfo);
}
Why do I only get one entry in the map when I run this code? There are thousands of lines in the file I'm reading, but it only seems to get to the first line and stop.
public class Details {
public Map<String, String> dictionaryWords() throws IOException{
String cvsSplitBy = ",";
Collection<String> words = new TreeSet<String>();
Map<String,String> m = new TreeMap<String,String>();
BufferedReader br = new BufferedReader(new InputStreamReader(new FileInputStream("dictionary.csv")));
String line = null;
String [] word = null;
String remove = null;
String nextline = null;
String getAllLines = "-";
while ((line = br.readLine())!= null) {
if (line.startsWith("\"")) {
getAllLines = line;
while((nextline = br.readLine())!= null){
if(!nextline.startsWith("\"")){
getAllLines.concat(nextline);
}else{
}
words.add(getAllLines);
word = getAllLines.split(cvsSplitBy);
remove = word[0].replace('"', '-');
m.put(remove.toLowerCase(),Arrays.toString(word));
}
}else{
}
}
for (String key : m.keySet()) {
System.out.println(key + " " + m.get(key));}
return m;
}
Try the following code
if(!nextline.startsWith("\""))
{
getAllLines = getAllLines.concat(nextline);
}
Don't forget to reassign "getAllLines" to the return value of the .concat() function. Since Strings are immutable, the .concat() function returns a new String object, which you do not assign to anything (therefore it is lost). This leaves you with your original String still stored in "getAllLines" as if the call to .concat() was never made.
Feel free to use the StringBuilder class and its append method, which will likely be much faster than creating new Strings via .concat() thousands of times.
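For example, a rough sketch of the difference (not tied to your exact loop structure):

// concat() returns a new String that must be reassigned:
String s = "line one";
s.concat("line two");            // result is discarded; s is unchanged
s = s.concat("line two");        // s now holds both lines, but each call copies the String

// StringBuilder mutates in place, so repeated appends are much cheaper:
StringBuilder sb = new StringBuilder("line one");
sb.append("line two");           // no new String created per call
String joined = sb.toString();   // convert back to a String once, at the end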
Also: You do not need blank else{} statements.
In the following part of your code the subsequent lines (the 2nd, 3rd, ...) are lost. They are saved in the variable nextline and used as a parameter for getAllLines.concat, but the return value of String::concat is not assigned to anything.
...
while((nextline = br.readLine())!= null){
if(!nextline.startsWith("\"")){
getAllLines.concat(nextline);
}else{
...
I need to be able to read each line of the file for multiple arguments, hence the for loop. After the first one, it does not seem to be reading them anymore; it seems to skip the try statement. Any ideas? I'm sure it's something silly I am missing, but I have been playing about with it and unfortunately time is not on my side.
for (int j = 0; j < args.length; j++) {
    try {
        String nameFromFile = null;
        BufferedReader InputReader = new BufferedReader(new InputStreamReader(System.in));
        while ((nameFromFile = InputReader.readLine()) != null) {
            // Do stuff
        }
    } catch (IOException e) {
        System.out.println(e.getMessage());
    }
}
You appear to have two sources you want to compare, System.in and args. I suggest you read these individually and then compare them.
Set<String> fromIn = new HashSet<>();
try (BufferedReader br = new BufferedReader(new InputStreamReader(System.in))) {
for(String line; (line = br.readLine()) != null;)
fromIn.add(normalise(line));
}
// compare args with fromIn.
e.g.
for(String arg: args) {
if (fromIn.contains(normalise(arg))) {
// something
} else {
// something else
}
}
I need to be able to read each line of the file
What file? You're reading from System.in:
BufferedReader InputReader = new BufferedReader(new InputStreamReader(System.in));
Your code will block at this line until you enter something at the console.
You do not read a file, but the System.in stream.
Every stream has an internal pointer, so the stream knows which line was read last.
Once the System.in stream has been read, the pointer points to the end of the stream.
As long as the stream is not reset, the read command will not return anything.
try
InputStream.reset()
or, even better, only read the stream once and cache the result! This is faster and safer, because the stream's input can change during iteration.
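A rough sketch of that caching idea, reusing the loop variables from the question (it needs java.util.List/ArrayList and IOException handling around the readLine() calls; the comparison logic is left as a placeholder):

// Read System.in exactly once and cache the lines.
List<String> cachedLines = new ArrayList<>();
BufferedReader inputReader = new BufferedReader(new InputStreamReader(System.in));
String nameFromFile;
while ((nameFromFile = inputReader.readLine()) != null) {
    cachedLines.add(nameFromFile);
}

// Now the cached lines can be reused for every argument.
for (int j = 0; j < args.length; j++) {
    for (String cached : cachedLines) {
        // Do stuff with args[j] and cached
    }
}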
Your code will never exit the while loop.
while ((nameFromFile = InputReader.readLine()) != null)
In the above loop it will print only one time and, at the end of the input, it will not get out of the while loop. That's why you are getting output only once: since it never exits the while loop, it never goes back into the for loop. readLine() returns the string content of a line, which is terminated by "\n" or "\r\n". Change it as below and you will be able to read args.length times.
while ((nameFromFile = InputReader.readLine())=="\n")