I have a string in half-width characters and I want to convert it to full-width. I tried to use this code:
final String x = "01589846";
String b = "";
System.out.println(x);
int y = 0;
final char[] list = x.toCharArray();
for (int i = 0; i < list.length; i++) {
    y = Integer.parseInt(String.valueOf(list[i]));
    final char unicode = (char) (y + 65296);
    b += unicode;
}
System.out.println(b);
It actually works, but it only works for digits.
Does anyone have another way to do this? Please help!
Java Strings are Unicode. They don't need converting. Java does not natively use ASCII.
You apparently wish to map one set of Unicode characters to another. The appropriate tool for that would be a Map, but you'll have to populate the Map with your desired conversion taken from the Unicode code charts.
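For example, a minimal sketch of that Map approach might look like this (the class name and the few sample entries are my own; you would fill in whatever mappings you need from the code charts):
import java.util.HashMap;
import java.util.Map;

public class FullwidthMapDemo {
    public static void main(String[] args) {
        // Only a handful of entries shown; populate the rest from the Unicode code charts.
        Map<Character, Character> toFullwidth = new HashMap<>();
        toFullwidth.put('0', '\uFF10'); // FULLWIDTH DIGIT ZERO
        toFullwidth.put('1', '\uFF11'); // FULLWIDTH DIGIT ONE
        toFullwidth.put('A', '\uFF21'); // FULLWIDTH LATIN CAPITAL LETTER A

        StringBuilder sb = new StringBuilder();
        for (char c : "01A".toCharArray()) {
            sb.append(toFullwidth.getOrDefault(c, c)); // leave unmapped characters alone
        }
        System.out.println(sb); // prints the fullwidth form of "01A"
    }
}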
There may be some algorithmic way to do this for particular subranges; you seem to have discovered a way that works for (western) digits. Note that the fullwidth digits occupy codepoints 0xFF10 to 0xFF19, so the conversion formula is digit - '0' + 0xff10. 0xFF10 is 65296 decimal, but the hex is clearer, since it's what is used in published code charts.
Actually, it looks to me that the same thing works for all characters in the range SPACE to '~', presumably by design. Thus
for (int i = 0; i < list.length; i++)
    list[i] += 0xff00 - ' ';
Here, I simply assume without checking that list will only contain characters in the range of SPACE to '~', i.e., the Unicode range that corresponds to graphic (printable) ASCII characters. Dealing with other characters, for example Katakana, is more involved.
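If you do want to guard against characters outside that range, a variant of the same loop might look like the sketch below. The bounds check and the choice of U+3000 (the ideographic space) for SPACE are my own additions, not part of the original formula:
for (int i = 0; i < list.length; i++) {
    char c = list[i];
    if (c > ' ' && c <= '~') {
        list[i] = (char) (c + 0xFEE0); // 0xFF00 - 0x20, the same offset as above
    } else if (c == ' ') {
        list[i] = '\u3000';            // fullwidth (ideographic) space
    }
    // anything else is left untouched
}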
final String x = "012345 abcdef ABCDEF";
System.out.println(x);
String b = "";
final char[] list = x.toCharArray();
for (int i = 0; i < list.length; i++) {
if(Character.isDigit(list[i])) {
b += (char)(list[i] - 48 + 0xFF10);
} else if(Character.isUpperCase(list[i])) {
b += (char)(list[i] - 65 + 0xFF21);
} else if(Character.isLowerCase(list[i])) {
b += (char)(list[i] - 97 + 0xFF41);
} else if(Character.isWhitespace(list[i])) {
b += list[i];
} else {
b += (char)(list[i] - 33 + 0xFF01);
}
}
System.out.println(b);
Output:
012345 abcdef ABCDEF
０１２３４５ ａｂｃｄｅｆ ＡＢＣＤＥＦ
Related
I'm currently working on a QR code scanner and have come to a point where I've been stuck for a while.
What I have so far is a String of 1s and 0s such as "100010100101....". What I want to do next is turn this String into bytes by splitting off 8 bits at a time.
With these bytes I then want to decode the text using the "ISO8859_1" charset.
My problem is that my results are way off from what I expect. This is my code:
for (int i = 0; i <= numberOfInt; i++) {
    String character = "";
    for (int j = 0; j < 8; j++) {
        boolean bool = tResult.remove(0); // tResult is a List of 1s & 0s
        if (bool) {
            character = character + '1';
        } else {
            character = character + '0';
        }
    }
    allcharacter[byteCounter] = (byte) Integer.parseInt(character, 2); // I think this line is where the mistake is.
    byteCounter++; // Variable that counts where to put the next byte
}
String endresult = "";
try {
    endresult = new String(allcharacter, "ISO8859_1");
} catch (UnsupportedEncodingException e) {
    e.printStackTrace();
}
return endresult;
What I think is that the cast to (byte) doesn't work the way I understand it, and therefore different bytes are saved into the array.
Thanks for any help.
You can use the substring method of the String class to get the first 8 characters and then convert those 8 characters (treated as bits) to a byte. Instead of parsing each character as an integer and then casting it to a byte, check each character and multiply a running value by 2, adding 1 each time you hit a '1'. That way you get a value between 0 and 255 for each byte, which gives you a valid ISO-8859-1 character.
Also you might want to check the Byte class and its methods, it probably has a method that already does this.
Edit: There you go.
Edit 2: Also this question may answer why your int to byte casting does not give you the result you thought it would.
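As a rough illustration of the multiply-by-2 idea described above (variable names are mine, and it assumes the bit string's length is a multiple of 8):
String bits = "0100100001101001"; // "Hi" in ISO-8859-1
byte[] bytes = new byte[bits.length() / 8];
for (int i = 0; i < bytes.length; i++) {
    int value = 0;
    for (int j = 0; j < 8; j++) {
        value = value * 2;                   // shift the previous bits left
        if (bits.charAt(i * 8 + j) == '1') {
            value++;                         // add the new bit
        }
    }
    bytes[i] = (byte) value; // 0-255; values above 127 wrap to negative but decode correctly
}
String text = new String(bytes, java.nio.charset.StandardCharsets.ISO_8859_1);
System.out.println(text); // prints Hi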
Okay, I rarely work with bytes, so in that aspect I am useless. However, I have converted binary to string many times. The logic behind it is converting the binary string to a decimal int, then from int to char, then from char to string. Here is how I do that.
String list = "100111000110000111010011" //24 random binary digits for example
String output = "";
char letter = '';
int ascii = 0;
//Repeat while there is still something to read.
for(int i = 0; i < list.length(); i+=8){
String temp = list.substring(i,i+8); //1 character in binary.
for(int j = temp.length()-1; j >= 0; j--) //Convert binary to decimal
if(temp.charAt(j) == '1')
ascii += (int)Math.pow(2,j);
letter = (char)ascii; //Sets the char letter to it's corresponding ascii value
output = output + Character.toString(letter); //Adds the letter to the string
ascii = 0; //resets ascii
}
System.out.println(output); //outputs the converted string
I hope that you found this helpful!
int re = 0;
for (int t = 0; t < mat1.length - 1; t++) {
    if (mat1[t] == "A") {
        re = re + 00;
    } else if (mat1[t] == "T") {
        re = re + 01;
    } else if (mat1[t] == "G") {
        re = re + 10;
    } else if (mat1[t] == "C") {
        re = re + 11;
    }
    System.out.println(mat1[t]);
}
I want these codes translated from the binary we chose into ASCII, and then the ASCII values should give back the original characters.
Based on your last comment, Sam, I believe I now understand what you require, and as you can see in the code below it is relatively easy to accomplish provided specific rules are followed.
One such rule is that the binary value for each ASCII character must be 8 bits. Because lower ASCII (0 to 127) only truly represents a 7 bit binary value (i.e. A = 1000001 and z = 1111010), we must pad a 0 onto the leftmost end of the binary value so as to produce a definite 8 bit binary number. We need to do this because our ACGT translation requires two binary digits for each character (i.e. A = 00, C = 11, G = 10, T = 01), and therefore every binary value (padded or not) must split into pairs with no remainder. If we left everything as 7 bit binary values this could not be accomplished. Now, knowing that we need to pad a 0 onto the leftmost end of each ASCII binary value to establish 8 bits, we will find that the ACGT string will always start with either a 'T' or an 'A'; it will never start with a 'C' or a 'G'. If this is unacceptable then either the ACGT character to binary translation must change or the padding of our ASCII binary value must change. It should be the translation that changes, because changing the ASCII binary value would misrepresent the ASCII binary, which is not good.
Another rule is that the ACGT Character to Binary Translation always remains the same. It never changes throughout processing.
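To make those two rules concrete, here is a small worked example of my own (it is not part of the program below, but it uses the same translation table): the letter 'A' is ASCII 65, or 1000001 in 7 bits; padded to 8 bits it becomes 01000001, and splitting it into pairs gives 01 00 00 01, which the table maps to "TAAT".
String bits = String.format("%8s", Integer.toBinaryString('A')).replace(' ', '0');
String[][] table = {{"A","00"},{"T","01"},{"G","10"},{"C","11"}};
StringBuilder acgt = new StringBuilder();
for (int i = 0; i < bits.length(); i += 2) {
    String pair = bits.substring(i, i + 2);
    for (String[] row : table) {
        if (row[1].equals(pair)) { acgt.append(row[0]); }
    }
}
System.out.println(bits + " -> " + acgt); // 01000001 -> TAAT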
The new code I provide below carries out the task you described within your last comment. I will leave the previous code from my previous post as it is since someone may find that useful as well.
In this new code I have used a Scanner to receive input from a user for testing purposes. I understand that you will be retrieving strings from a database, and I will leave it up to you how you wire that into the code; placing the two conversion sections of this code into methods would be the best way to go here.
As with just about anything in Java, there are a dozen ways to do anything, however I used plain 'for loops' to process things here since they are the easiest to follow in my opinion. You can optimize this code any way you see fit once you have it working exactly the way you want.
Here is the code (copy/paste/run):
import java.util.Arrays;
import java.util.Scanner;
public class CharacterTranslation {
public static void main(String[] args) {
// Get Input from User...
Scanner in = new Scanner (System.in);
System.out.println("*** CONVERT FROM STRING TO ASCII TO BINARY TO ACGT ***\n");
System.out.println("Please enter a String to Convert to ACGT:");
String inputString = in.nextLine();
// Declare and initialize required variables...
int[] inputAscii = new int[inputString.length()];
String[] inputBinary = new String[inputString.length()];
// Translation Table made from a two dimensional Array:
String[][] ACGTtranslation = {{"A","00"},{"T","01"},{"G","10"},{"C","11"}};
// ------------------------------------------------
// -------- CONVERT FROM STRING TO ACGT ----------
// ------------------------------------------------
//Convert the input string into ASCII numbers...
for (int i = 0; i < inputString.length(); i++) {
char character = inputString.charAt(i);
inputAscii[i] = (int) character;
}
System.out.println("Conversion To ASCII: " + Arrays.toString(inputAscii)
.replace("[","").replace("]",""));
//Convert the ASCII Numbers to 8 bit Binary numbers...
for (int i = 0; i < inputAscii.length; i++) {
String bs = String.valueOf(Integer.toBinaryString(0x100 +
inputAscii[i]).substring(2));
// Pad the left end of the binary number with 0 should
// it not be 8 bits. ASCII characters will only produce
// 7 bit binary. We must have 8 bits to acquire an even
// number of digit pairs for our ACGT conversion.
while (bs.length() < 8) { bs = "0" + bs; }
inputBinary[i] = bs;
}
System.out.println("Conversion To 8bit Binary: " + Arrays.toString(inputBinary)
.replace("[","").replace("]",""));
//Convert the Binary String to ACGT format based from
// our translational Two Dimensional String Array.
// First we append all the binary data together to form
// a single string of binary numbers then starting from
// the left we break off 2 binary digits at a time to
// convert to our ACGT string format.
// Convert the inputBinary Array to a single binary String...
String binaryString = "";
for (int i = 0; i < inputBinary.length; i++) {
binaryString+= String.valueOf(inputBinary[i]);
}
// Convert the Binary String to ACGT...
String ACGTstring = "";
for (int i = 0; i < binaryString.length(); i+= 2) {
String tmp = binaryString.substring(i, i+2);
for (int j = 0; j < ACGTtranslation.length; j++) {
if (tmp.equals(ACGTtranslation[j][1])) {
ACGTstring+= ACGTtranslation[j][0];
}
}
}
System.out.println("The ACGT Translation String for the Word '" +
inputString + "' is: " + ACGTstring + "\n");
// ------------------------------------------------
// ----- CONVERT FROM ACGT BACK TO STRING --------
// ------------------------------------------------
System.out.println("*** CONVERT FROM ACGT (" + ACGTstring +
"' TO BINARY TO ASCII TO STRING ***\n");
System.out.println("Press ENTER Key To Continue...");
String tmp = in.nextLine();
// Convert ACGT back to 8bit Binary...
String translation = "";
for (int i = 0; i < ACGTstring.length(); i++) {
String c = Character.toString(ACGTstring.charAt(i));
for (int j = 0; j < ACGTtranslation.length; j++) {
if (ACGTtranslation[j][0].equals(c)) { translation+= ACGTtranslation[j][1]; break; }
}
}
// We divide the translation String by 8 so as to get
// the total number of 8 bit binary numbers that would
// be contained within that ACGT String. We then reinitialize
// our inputBinary Array to hold that many binary numbers.
inputBinary = new String[translation.length() / 8];
int cntr = 0;
for (int i = 0; i < translation.length(); i+= 8) {
inputBinary[cntr] = translation.substring(i, i+8);
cntr++;
}
System.out.println("Conversion from ACGT To 8bit Binary: " +
Arrays.toString(inputBinary).replace("[","")
.replace("]",""));
//Convert 8bit Binary To ASCII...
inputAscii = new int[inputBinary.length];
for (int i = 0; i < inputBinary.length; i++) {
inputAscii[i] = Integer.parseInt(inputBinary[i], 2);
}
System.out.println("Conversion from Binary To ASCII: " + Arrays.toString(inputAscii)
.replace("[","").replace("]",""));
// Convert ASCII to Character String...
inputString = "";
for (int i = 0; i < inputAscii.length; i++) {
inputString+= Character.toString ((char) inputAscii[i]);
}
System.out.println("Conversion from ASCII to Character String: " + inputString);
System.out.println("** Process Complete ***");
}
}
EDIT:
I want to add that I supplied a bit of a fib within the text above. Most characters used within the ASCII character set for strings (ASCII 32 to 127) are represented by a 7 bit binary value; however, characters within the upper ASCII character set (128 to 255) are represented by an actual 8 bit binary value. The code provided takes this into account. I have edited my answer to accommodate this.
Sam, you'll need to create some sort of translation table so as to know which makeshift binary bit pair relates to which character, and I say makeshift because 00 is in no way equivalent to the letter "A" in binary. The same applies to the binary representations of the other characters you provide. What the binary representation might be is irrelevant at this point, since you can do whatever you want for your specific purpose. If possible though, proper representation is the way to go for more advanced functionality later on down the road, AND you wouldn't need a translation table, because all you would need to do is convert the character's ASCII value to binary, like this:
// 65 is the ASCII value for the letter A.
String letterAis = String.valueOf(Integer.toBinaryString(0x100 + 65).substring(2));
ALPHABET IN (8 bit) BINARY, CAPITAL LETTERS
A 01000001 N 01001110
B 01000010 O 01001111
C 01000011 P 01010000
D 01000100 Q 01010001
E 01000101 R 01010010
F 01000110 S 01010011
G 01000111 T 01010100
H 01001000 U 01010101
I 01001001 V 01010110
J 01001010 W 01010111
K 01001011 X 01011000
L 01001100 Y 01011001
M 01001101 Z 01011010
ALPHABET IN (8 bit) BINARY, LOWER CASE
a 01100001 n 01101110
b 01100010 o 01101111
c 01100011 p 01110000
d 01100100 q 01110001
e 01100101 r 01110010
f 01100110 s 01110011
g 01100111 t 01110100
h 01101000 u 01110101
i 01101001 v 01110110
j 01101010 w 01110111
k 01101011 x 01111000
l 01101100 y 01111001
m 01101101 z 01111010
You already have an Array variable named mat1[] which holds your string letters and what I propose is to make it a two dimensional array, 1 column to hold the letter and a 2nd column to hold the binary translation for that letter. Once you have the translation established you can convert back and forth from letter string to Binary and Binary to letter string. Here is the code (just copy/paste and run):
public class CharacterTranslation {
public static void main(String[] args) {
// Translation Table made from a two dimensional Array:
String[][] mat1 = {{"A","00"},{"T","01"},{"G","10"},{"C","11"}};
String input = "GCAT";
System.out.println("Original Input: " + input);
// Convert each character within the supplied input string
// to our binary character translation.
String translation = "";
for (int i = 0; i < input.length(); i++) {
String c = Character.toString(input.charAt(i));
for (int j = 0; j < mat1.length; j++) {
if (mat1[j][0].equals(c)) { translation+= mat1[j][1]; break; }
}
}
// Display the translation in output console (pane).
System.out.println("Convert To Binary Translation: " + translation);
// Now, convert the binary translation back to our
// original character input. Note: this only works
// if the binary translation is only 2 bits for any
// character.
String origInput = "";
for (int i = 0; i < translation.length(); i+= 2) {
String b = translation.substring(i, i+2);
for (int j = 0; j < mat1.length; j++) {
if (mat1[j][1].equals(b)) { origInput+= mat1[j][0]; break; }
}
}
// Display the converted binary translation back to
// it original characters.
System.out.println("Convert Back To Original Input: " + origInput);
}
}
For what you really want to do Sam, we only need one button. We can make it act like a toggle button between two different functions. One Button and One Text Field. This is pretty basic stuff.
The translation table two dimensional array should be declared as a field of your GUI class (outside any method) so that it's available to all methods, something like this:
public class MyGUIClassName??? extends javax.swing.JFrame {
static String[][] ACGTtranslation = {{"A","00"},{"T","01"},{"G","10"},{"C","11"}}; // static so the static helper methods below can see it
..............................
..............................
..............................
}
Then your jButton3 ActionPerformed event could look something like:
private void jButton3ActionPerformed(java.awt.event.ActionEvent evt) {
// Skip this event if there is nothing contained
// within jTextField1.
if (jTextField1.getText().isEmpty()) { return; }
// Get the current text in jButton3.
String buttonText = jButton3.getText();
// If the button text reads "Convert To ACGT" then...
if ("Convert To ACGT".equals(buttonText)) {
// change the button text to "Convert To String".
jButton3.setText("Convert To String");
// Convert the string from database now contained in jTextField1
// to ACGT format and place that new ACGT into the same JTextfield.
// We use the StringToACGT() method for this.
jTextField1.setText(StringToACGT(jTextField1.getText()));
}
// The button text must be "Convert To String"...
else {
// so let's change the button text to now be "Convert To ACGT"
// again.
jButton3.setText("Convert To ACGT");
// Take the ACGT string now contained within jTextField1
// from the first button click and convert it back to its
// original String format. We use the ACGTtoString() method
// for this.
jTextField1.setText(ACGTtoString(jTextField1.getText()));
}
}
And here are the methods you also place into your GUI class:
// CONVERT A STRING TO ACGT FORMAT
public static String StringToACGT(String inputString) {
// Make sure the input string contains something.
if ("".equals(inputString)) { return ""; }
// Declare and initialize required variables...
int[] inputAscii = new int[inputString.length()];
String[] inputBinary = new String[inputString.length()];
//Convert the input string into ASCII numbers...
for (int i = 0; i < inputString.length(); i++) {
char character = inputString.charAt(i);
inputAscii[i] = (int) character;
}
//Convert the ASCII Numbers to 8 bit Binary numbers...
for (int i = 0; i < inputAscii.length; i++) {
String bs = String.valueOf(Integer.toBinaryString(0x100 +
inputAscii[i]).substring(2));
// Pad the left end of the binary number with 0 should
// it not be 8 bits. ASCII characters will only produce
// 7 bit binary. We must have 8 bits to acquire an even
// number of digit pairs for our ACGT conversion.
while (bs.length() < 8) { bs = "0" + bs; }
inputBinary[i] = bs;
}
//Convert the Binary String to ACGT format based from
// our translational Two Dimensional String Array.
// First we append all the binary data together to form
// a single string of binary numbers then starting from
// the left we break off 2 binary digits at a time to
// convert to our ACGT string format.
// Convert the inputBinary Array to a single binary String...
String binaryString = "";
for (int i = 0; i < inputBinary.length; i++) {
binaryString+= String.valueOf(inputBinary[i]);
}
// Convert the Binary String to ACGT...
String ACGTstring = "";
for (int i = 0; i < binaryString.length(); i+= 2) {
String tmp = binaryString.substring(i, i+2);
for (int j = 0; j < ACGTtranslation.length; j++) {
if (tmp.equals(ACGTtranslation[j][1])) {
ACGTstring+= ACGTtranslation[j][0];
}
}
}
return ACGTstring;
}
// CONVERT AN ACGT STRING BACK TO ITS ORIGINAL STRING STATE.
public static String ACGTtoString(String inputString) {
// Make sure the input string contains something.
if ("".equals(inputString)) { return ""; }
String ACGTstring = inputString;
// Declare and initialize required variables...
int[] inputAscii = new int[inputString.length()];
String[] inputBinary = new String[inputString.length()];
// Convert ACGT back to 8bit Binary...
String translation = "";
for (int i = 0; i < ACGTstring.length(); i++) {
String c = Character.toString(ACGTstring.charAt(i));
for (int j = 0; j < ACGTtranslation.length; j++) {
if (ACGTtranslation[j][0].equals(c)) { translation+= ACGTtranslation[j][1]; break; }
}
}
// We divide the translation String by 8 so as to get
// the total number of 8 bit binary numbers that would
// be contained within that ACGT String. We then reinitialize
// our inputBinary Array to hold that many binary numbers.
inputBinary = new String[translation.length() / 8];
int cntr = 0;
for (int i = 0; i < translation.length(); i+= 8) {
inputBinary[cntr] = translation.substring(i, i+8);
cntr++;
}
//Convert 8bit Binary To ASCII...
inputAscii = new int[inputBinary.length];
for (int i = 0; i < inputBinary.length; i++) {
inputAscii[i] = Integer.parseInt(inputBinary[i], 2);
}
// Convert ASCII to Character String...
inputString = "";
for (int i = 0; i < inputAscii.length; i++) {
inputString+= Character.toString ((char) inputAscii[i]);
}
return inputString;
}
Like I said earlier Sam, this is pretty basic material and you should already have a very good handle on the Java programming language to get to this point, especially if you are actually retrieving the data to convert from a database. My job here is complete. :o) I hope this has helped you (and others) to reach your goals.
So I am trying to map each letter of the alphabet to a number, like a = 1, b = 2, c = 3 and so on.
int char = "a";
int[] num = new int{26};
for (int i = 0; i <num.length; i++){
System.out.print(i);
But after this I got stuck, so please help me out if you can. When the user inputs a word like cat it should output 3-1-20.
You can subtract 'a' from each char and add 1. E.g.
String input = "cat";
for (char c : input.toCharArray()) {
    System.out.print(c - 'a' + 1);
}
The code you posted doesn't compile, as you can't assign a String to an int, and char is a reserved word (the name of a primitive type):
int char = "a";
You also mention that you want the output formatted like this: "3-1-20". This is one way to achieve that:
String input = "cat";
String[] out = new String[input.length()];
for (int i = 0; i < input.length(); ++i) {
    out[i] = Integer.toString(input.charAt(i) - 'a' + 1);
}
System.out.println(String.join("-", out));
Both versions work only for lowercase English letters (a to z).
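If you also need to accept capital letters, one possible extension (my own sketch, not part of the answer above) is to fold the input to lower case first:
String input = "Cat";
String[] out = new String[input.length()];
for (int i = 0; i < input.length(); ++i) {
    char c = Character.toLowerCase(input.charAt(i)); // 'C' becomes 'c', etc.
    out[i] = Integer.toString(c - 'a' + 1);
}
System.out.println(String.join("-", out)); // 3-1-20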
Assigning a number to a character is called an "encoding". As computers can only handle numbers internally, this happens all the time. Even this text here is encoded (probably into an encoding called "UTF-8") and then the resulting number is stored somewhere.
One very basic encoding is the so-called ASCII (American Standard Code for Information Interchange). ASCII already does what you want; it assigns a number to each character, only that the number for "A" is 65 instead of 1.
So, how does that help? We can assume that for the characters A-z, the numeric value of a char is equal to the ASCII code (it's not true for every character, but for the most basic ones it's good enough).
And this is why everyone here tells you to subtract 'A' or 'a': Your character is a char, which is a character, but also the numeric value of that character, so you can subtract 'A' (again, a char) and add 1:
'B' - 'A' + 1 = 2
because...
66 (numeric value of 'B') - 65 (numeric value of 'A') + 1 = 2
Actually, a Java char is not ASCII but UTF-16, but that's where it starts to get slightly more complex, so ASCII will suffice for the moment.
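You can see both facts in a couple of lines (my own illustration):
char b = 'B';
System.out.println((int) b);     // 66, the code of 'B'
System.out.println(b - 'A' + 1); // 2, its position in the alphabet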
The best way of doing this is to convert the String to a byte[]. Start by getting a char[], like this:
char[] buffer = str.toCharArray();
Then each of the characters can be converted to their byte-value (which are constants for a certain encoding), like this:
byte[] b = new byte[buffer.length];
for (int i = 0; i < b.length; i++) {
    b[i] = (byte) buffer[i];
}
Now look at the resulting values and subtract/add some value to get the desired results!
What I'm trying to do is create an encryption method that shifts a char array taken from an input file by a set number of letters. I'm having trouble figuring out how to convert the chars into ints and back.
This is what I've got so far:
char[] sChar = new char[line.length()];
for (int i = 0; i < sChar.length; i++) {
    String s = reader.next();
    sChar = s.toCharArray();
    if (Character.isLetter(sChar[i])) {
        char c = 'a';
        int b = c;
        sChar[i] += key;
Not sure what you are thinking.
I thought conversion from int to character and back would be easy.
Just to refresh the idea, I checked:
char xdrf = 'a';
System.out.println((int)xdrf); // output is 97
int idrf= 99;
xdrf = (char)idrf;
System.out.println(xdrf); // output is c
Also, if your key is a character you can add it directly, therefore the statement
sChar[i] += key;
should be fine.
More to it:
idrf = idrf + 'd';
System.out.println(idrf); //output is 199
further using
System.out.println(Character.getNumericValue(idrf-20)); //output is 3
This is all working by ASCII value. I am unsure whether you want ASCII values to be used.
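If the goal is the classic shift-with-wraparound, a rough sketch (my own, assuming lowercase input and a non-negative int key) could look like this:
char[] sChar = "attack".toCharArray();
int key = 3;
for (int i = 0; i < sChar.length; i++) {
    if (Character.isLetter(sChar[i])) {
        int offset = (sChar[i] - 'a' + key) % 26; // position after the shift, wrapped at 'z'
        sChar[i] = (char) ('a' + offset);
    }
}
System.out.println(new String(sChar)); // dwwdfn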
I might be somewhat stupid here, but I can't seem to think of a straightforward solution to this problem.
I've currently got an int[] that contains ASCII character codes; however, in the ASCII table any value < 32 is a control code. So what I need to do is: for any value >= 32, put the ASCII character into a char[], but if it's < 32, just put the literal integer value in as a character.
For example:
public static void main(String[] args) {
    int[] input = {57, 4, 31}; // 57 is the only valid ASCII character '9'
    char[] output = new char[3];
    for (int i = 0; i < input.length; i++) {
        if (input[i] < 32) { // If it's a control code
            System.out.println("pos " + i + " Not an ascii symbol, it's a control code");
            output[i] = (char) input[i];
        } else { // If it's an actual ASCII character
            System.out.println("pos " + i + " Ascii character, add to array");
            output[i] = (char) input[i];
        }
    }
    System.out.println("\nOutput buffer contains:");
    for (int i = 0; i < output.length; i++) {
        System.out.println(output[i]);
    }
}
Output is:
pos 0 Ascii character, add to array
pos 1 Not an ascii symbol, it's a control code
pos 2 Not an ascii symbol, it's a control code
Output buffer contains:
9 // int value 57, this is OK
As you can see, the last two entries in the array are blank, as there isn't actually a printable ASCII character for either 4 or 31. I know there are methods for converting Strings to char[], but what's the general approach when you've already got a char[] holding the values?
There is probably a really easy solution for this; I think I'm just having a dumb moment!
Any advice would be appreciated, thanks!
For classifying characters you should use the Character.getType(char) method.
To store either a character or an integer you could try using a wrapper object to do that.
Alternatively you could wrap your char like this:
static class NiceCharacter {
// The actual character.
final char ch;
public NiceCharacter ( char ch ) {
this.ch = ch;
}
@Override
public String toString () {
return stringValue(ch);
}
public static String stringValue ( char ch ) {
switch ( Character.getType(ch)) {
// See http://en.wikipedia.org/wiki/Mapping_of_Unicode_characters for what the Cc group is.
// See http://en.wikipedia.org/wiki/Control_character for a definition of what are CONTROL characters.
case Character.CONTROL:
return Integer.toString(ch);
default:
return Character.toString(ch);
}
}
}
Change how you print the output buffer:
for (int i = 0; i < output.length; i++) {
    if (output[i] < 32) {
        System.out.println("'" + (int) output[i] + "'"); // The control code is cast to int.
        // I added the ' ' around the value to show it's a control character.
    } else {
        System.out.println(output[i]); // Print the character
    }
}