I have created a Java GUI to generate .bat files.
When I write a .bat containing a string like "L’uomo più forte", Notepad++ shows this: "L?uomo pi— forte"
Here is the code:
FileOutputStream fos = new FileOutputStream(bat);
Writer w = new BufferedWriter(new OutputStreamWriter(fos, "Cp850"));
String stringa = "L’uomo più forte"
w.write(stringa);
w.write("\n");
w.write("pause");
w.write("\n");
w.flush();
w.close();
I had to use Cp850 for DOS use; with the default charset the .bat gives an error.
Solutions?
Instead of using "Cp850":
Writer w = new BufferedWriter(new OutputStreamWriter(fos, "Cp850"));
Try using "UTF-8":
Writer w = new BufferedWriter(new OutputStreamWriter(fos, "UTF-8"));
Don't forget to place a semicolon (;) at the end of your stringa variable declaration/initialization, and wrap the code in a try/catch block to handle the possible FileNotFoundException, UnsupportedEncodingException, and IOException.
Also, for Notepad to show a new line you need to supply the carriage return \r as well:
w.write("\r\n");
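Putting both suggestions together, a minimal sketch; the file name and strings are placeholders, and note that cmd.exe still interprets the .bat in its own OEM code page, so verify the result in your console:

```java
import java.io.BufferedWriter;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.Writer;

public class BatWriter {
    public static void main(String[] args) {
        String stringa = "L'uomo più forte"; // sample line, placeholder text
        try (Writer w = new BufferedWriter(new OutputStreamWriter(
                new FileOutputStream("test.bat"), "UTF-8"))) {
            w.write(stringa);
            w.write("\r\n");       // CRLF so Notepad shows a real line break
            w.write("pause");
            w.write("\r\n");
        } catch (IOException e) {  // also covers FileNotFoundException and
            e.printStackTrace();   // UnsupportedEncodingException
        }
    }
}
```

The try-with-resources block flushes and closes the writer automatically.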
You may do the following:
Open a cmd.exe command prompt session.
Execute echo L’uomo più forte>text.txt command.
Open text.txt file and copy the string there.
Paste it in your code.
For example, I created the text.txt file, renamed it to test.bat and appended an echo command. This is my test.bat file:
@echo off
echo L'uomo pi— forte
... and this is the output:
L'uomo più forte
If some character is not displayed correctly, then the code page in use does not contain that character.
Note: I suggest using the standard Windows Notepad. Notepad++ may produce strange output in these cases.
I have a Tomcat running on a Linux server.
My webapp creates text files that must be imported by another, external system that accepts only DOS/Windows-formatted files.
FileWriterWithEncoding writer;
writer = new FileWriterWithEncoding(file,"UTF-8", true);
PrintWriter printer = new PrintWriter(writer);
How can I create such DOS formatted files with Java on a Linux server?
Thank you.
Make sure that the line endings you write are "\r\n"; that is the Windows convention (a carriage return character followed by a line feed character).
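A minimal sketch of writing such a file on Linux with only the standard library (the questioner's FileWriterWithEncoding comes from Apache Commons IO, so this uses plain NIO instead; the file name and content are made up):

```java
import java.io.IOException;
import java.io.PrintWriter;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

public class DosFileWriter {
    static final String CRLF = "\r\n"; // DOS/Windows line ending, regardless of platform

    public static void main(String[] args) throws IOException {
        try (PrintWriter printer = new PrintWriter(
                Files.newBufferedWriter(Paths.get("export.txt"), StandardCharsets.UTF_8))) {
            // print + explicit CRLF instead of println, which would emit the
            // platform separator ("\n" on Linux)
            printer.print("first line" + CRLF);
            printer.print("second line" + CRLF);
        }
    }
}
```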
I want to write a new line using a FileOutputStream; I have tried the following approaches, but none of them are working:
encfileout.write('\n');
encfileout.write("\n".getBytes());
encfileout.write(System.getProperty("line.separator").getBytes());
This should work. Probably you forgot to call encfileout.flush().
However, this is not the preferred way to write text. You should wrap your output stream in a PrintWriter and use its println() methods:
PrintWriter writer = new PrintWriter(new OutputStreamWriter(encfileout, charset));
Alternatively you can use FileWriter instead of FileOutputStream from the beginning:
FileWriter fw = new FileWriter("myfile");
PrintWriter writer = new PrintWriter(fw);
Now just call
writer.println();
And do not forget to call flush() and close() when you finish your job.
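Putting those pieces together, a minimal runnable sketch (the file name and charset are placeholders):

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.PrintWriter;
import java.nio.charset.StandardCharsets;

public class NewlineDemo {
    public static void main(String[] args) throws IOException {
        // wrap the FileOutputStream as the answer suggests
        FileOutputStream encfileout = new FileOutputStream("myfile");
        PrintWriter writer = new PrintWriter(
                new OutputStreamWriter(encfileout, StandardCharsets.UTF_8));
        writer.println("first line");  // println appends the platform separator
        writer.println("second line");
        writer.flush();                // push buffered characters to the file
        writer.close();                // also closes the underlying stream
    }
}
```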
It could be a viewer problem. Try opening the file in EditPlus or Notepad++; Windows Notepad may not recognize the line endings of another operating system. Which program are you viewing the file in now?
String lineSeparator = System.getProperty("line.separator");
fos.write(lineSeparator.getBytes());
To add a line break, use
fileOutputStream.write(10);
Decimal 10 is the ASCII line feed (LF) character.
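For comparison, a small sketch that emits each of the separator variants from the answers above into one file (the file name is made up):

```java
import java.io.FileOutputStream;
import java.io.IOException;

public class SeparatorDemo {
    public static void main(String[] args) throws IOException {
        try (FileOutputStream fos = new FileOutputStream("sep.txt")) {
            fos.write("a".getBytes());
            fos.write('\n');                               // single LF byte
            fos.write("b".getBytes());
            fos.write(System.lineSeparator().getBytes());  // platform-dependent
            fos.write("c".getBytes());
            fos.write(10);                                 // 10 == '\n'
        }
    }
}
```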
I read lines from a .txt file into a String list. I show the text in a JTextPane. The encoding is fine when running from Eclipse or NetBeans, however if I create a jar, the encoding is not correct. The encoding of the file is UTF-8. Is there a way to solve this problem?
Your problem is probably that you're opening a reader using the platform encoding.
You should manually specify the encoding whenever you convert between bytes and characters. If you know that the appropriate encoding is UTF-8 you can open a file thus:
FileInputStream inputFile = new FileInputStream(myFile);
try {
    // FileReader does not accept a charset here; use InputStreamReader instead
    Reader reader = new InputStreamReader(inputFile, "UTF-8");
    // Maybe buffer the reader and do something with it.
} finally {
    inputFile.close();
}
Libraries like Guava can make this whole process easier.
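If you can use java.nio (Java 7+), the same idea works without manual stream handling; a sketch, with the file name and sample text made up (the sketch writes its own sample file first so it is self-contained):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;

public class Utf8ReadDemo {
    public static void main(String[] args) throws IOException {
        // create a small UTF-8 sample file ("input.txt" is a placeholder)
        Files.write(Paths.get("input.txt"),
                "pi\u00f9 forte".getBytes(StandardCharsets.UTF_8));
        // readAllLines decodes with the charset you pass, never the platform default
        List<String> lines = Files.readAllLines(Paths.get("input.txt"), StandardCharsets.UTF_8);
        for (String line : lines) {
            System.out.println(line);
        }
    }
}
```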
Have you tried to run your jar as
java -Dfile.encoding=utf-8 -jar xxx.jar
Given:
try {
    FileWriter fw = new FileWriter(someFileName);
    BufferedWriter bw = new BufferedWriter(fw);
    bw.write("Hello Java");
} catch ...
} finally {
    bw.close();
}
It works perfectly in windows, but not in Unix.
Remark: the created file in unix has the complete 777 rights!
What should I do to get it working in unix?
Thanks,
Roxana
Try doing a
bw.flush();
before closing the file (inside the try block).
Maybe the information is still in the buffer, so it doesn't get reflected in the file contents.
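Alternatively, a try-with-resources block (Java 7+) flushes and closes the writer for you, even if write() throws; a minimal sketch, with the file name made up:

```java
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;

public class SafeWrite {
    public static void main(String[] args) {
        String someFileName = "hello.txt"; // placeholder name
        try (BufferedWriter bw = new BufferedWriter(new FileWriter(someFileName))) {
            bw.write("Hello Java");
        } catch (IOException e) {
            e.printStackTrace();
        }
        // bw is flushed and closed automatically when the try block exits
    }
}
```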
You should give us some more code, especially the section where someFileName is built. Java treats the file separator differently per platform, so your problem could be that the file you create/open on Windows does not exist on Unix, and your catch block is swallowing the error; you didn't provide its contents.
From the system properties documentation: "file.separator" is the character that separates components of a file path. This is "/" on UNIX and "\" on Windows.
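A small sketch of building paths portably instead of hard-coding "/" or "\" (the directory and file names are made up):

```java
import java.io.File;

public class SeparatorPath {
    public static void main(String[] args) {
        // File.separator is "/" on UNIX and "\" on Windows
        String path = "data" + File.separator + "out.txt";
        System.out.println(path);

        // or let File join the components itself
        File f = new File("data", "out.txt");
        System.out.println(f.getPath());
    }
}
```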
I have an application, which proccesses some text and then saves it to file.
When I run it from the NetBeans IDE, both System.out and PrintWriter work correctly and non-ASCII characters are displayed/saved correctly. But if I run the JAR from the Windows 7 command line (which uses the cp1250 central-European encoding in this case), both the screen output and the saved file are broken.
I tried passing UTF-8 to PrintWriter's constructor, but it didn't help. And it can't affect System.out, which is corrupted even then.
Why is it working in the IDE and not in cmd.exe?
I would understand that System.out has some problems, but why is also output file affected?
How can I fix this issue?
I just had the same problem.
The actual reason is that when your code runs inside the NetBeans environment, NetBeans sets the system properties automatically.
You can see this yourself: run the code below from NetBeans and it will probably print "UTF-8", but run it from cmd and you will see "cp1250".
System.out.println(System.getProperty("file.encoding"));
Note that while setProperty changes what getProperty returns later, it has no effect on input/output, because the default streams and encodings are set up before the main function is called.
With that background in mind, when you want to read from and write to files, it is better to use the code below:
File f = new File(sourcePath);
For reading:
InputStreamReader isr = new InputStreamReader(
new FileInputStream(f), Charset.forName("UTF-8"));
and for writing (I have not tested this):
OutputStreamWriter osw = new OutputStreamWriter(
new FileOutputStream(f), Charset.forName("UTF-8"));
The main difference is that these classes take the required Charset in their constructors, while classes like FileWriter and PrintWriter don't.
I hope that works for you.
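Putting the two fragments together, a self-contained round-trip sketch; the file name and sample text are made up:

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.nio.charset.StandardCharsets;

public class Utf8RoundTrip {
    public static void main(String[] args) throws IOException {
        String text = "central european: \u010d\u00e1\u0159"; // sample non-ASCII text
        // write with an explicit charset, independent of file.encoding
        try (BufferedWriter out = new BufferedWriter(new OutputStreamWriter(
                new FileOutputStream("demo.txt"), StandardCharsets.UTF_8))) {
            out.write(text);
        }
        // read it back with the same charset
        try (BufferedReader in = new BufferedReader(new InputStreamReader(
                new FileInputStream("demo.txt"), StandardCharsets.UTF_8))) {
            System.out.println(in.readLine());
        }
    }
}
```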