Creating Text out of a String of 1s and 0s - Java

I'm currently working on a QR code scanner and have come to a point where I've been stuck for a while.
What I have so far is a String of 1s and 0s such as "100010100101....". What I want to do next is turn this String into bytes by splitting off 8 bits at a time.
I then want to decode these bytes into text using the "ISO8859_1" standard.
My problem is the following: my results are way off from what I want. This is my code:
for (int i = 0; i <= numberOfInt; i++) {
    String character = "";
    for (int j = 0; j < 8; j++) {
        boolean bool = tResult.remove(0); // tResult is a List of 1s & 0s
        if (bool) {
            character = character + '1';
        } else {
            character = character + '0';
        }
    }
    allcharacter[byteCounter] = (byte) Integer.parseInt(character, 2); // I think this line is where the mistake is.
    byteCounter++; // Variable that counts where to put the next byte
}
String endresult = "";
try {
    endresult = new String(allcharacter, "ISO8859_1");
} catch (UnsupportedEncodingException e) {
    e.printStackTrace();
}
return endresult;
What I think is that the cast to (byte) doesn't work the way I understand it, and therefore different bytes are saved into the array.
Thanks for any help.

You can use the substring method of the String class to get the first 8 characters, then convert those 8 characters (treated as bits) to a byte. Instead of parsing each group as an integer and then casting it to a byte, you can accumulate the value directly: for each character, multiply the running value by 2, and add 1 each time you hit a '1'. That way you will get a value between 0 and 255 for each byte, which should give you a valid character.
Also you might want to check the Byte class and its methods, it probably has a method that already does this.
Edit: There you go.
Edit 2: Also this question may answer why your int to byte casting does not give you the result you thought it would.
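As a sketch of the approach described above (the class and method names here are invented for illustration; this is not the asker's original code), the bit string can be accumulated byte by byte and then decoded as ISO-8859-1:

```java
import java.nio.charset.StandardCharsets;

public class BitStringDecoder {
    // Decode a string of '0'/'1' characters into text, 8 bits per byte,
    // interpreting each byte as ISO-8859-1.
    public static String decode(String bits) {
        byte[] bytes = new byte[bits.length() / 8];
        for (int i = 0; i < bytes.length; i++) {
            int value = 0;
            for (int j = 0; j < 8; j++) {
                // Multiply the running value by 2 and add the next bit.
                value = value * 2 + (bits.charAt(i * 8 + j) == '1' ? 1 : 0);
            }
            // Values 128-255 wrap to negative bytes here, but the charset
            // decoder reads the same 8 bits back as an unsigned value.
            bytes[i] = (byte) value;
        }
        return new String(bytes, StandardCharsets.ISO_8859_1);
    }

    public static void main(String[] args) {
        // "01001000 01101001" is "Hi" in ISO-8859-1
        System.out.println(decode("0100100001101001"));
    }
}
```

Because ISO-8859-1 maps every byte value 0-255 directly to the code point of the same value, this round-trips any 8-bit data without loss.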

Okay, I rarely work with bytes, so in that aspect I am useless. However, I have converted binary to string many times. The logic behind it is converting the binary string to a decimal int, then from int to char, then from char to string. Here is how I do that.
String list = "100111000110000111010011"; // 24 random binary digits for example
String output = "";
char letter;
int ascii = 0;
// Repeat while there is still something to read.
for (int i = 0; i < list.length(); i += 8) {
    String temp = list.substring(i, i + 8); // 1 character in binary.
    for (int j = temp.length() - 1; j >= 0; j--) { // Convert binary to decimal
        if (temp.charAt(j) == '1') {
            ascii += (int) Math.pow(2, temp.length() - 1 - j); // bit j counts from the left, so its weight is 2^(7-j)
        }
    }
    letter = (char) ascii; // Sets the char letter to its corresponding ascii value
    output = output + Character.toString(letter); // Adds the letter to the string
    ascii = 0; // resets ascii
}
System.out.println(output); // outputs the converted string
I hope that you found this helpful!

Related

How to convert Ascii to Unicode in java?

I have a string in halfwidth characters and I want to convert it to fullwidth. I tried to use this code
final String x = "01589846";
String b = "";
System.out.print("01589846");
int y = 0;
final char[] list = x.toCharArray();
for (int i = 0; i < list.length; i++) {
    y = Integer.parseInt(String.valueOf(list[i]));
    final char unicode = (char) (y + 65296);
    b += unicode;
}
System.out.println(b);
It actually works, but only with numbers.
Does anyone have another way to do this? Please help me!
Java Strings are Unicode. They don't need converting. Java does not natively use ASCII.
You apparently wish to map one set of Unicode characters to another. The appropriate tool for that would be a Map, but you'll have to populate the Map with your desired conversion taken from the Unicode code charts.
There may be some algorithmic way to do this for particular subranges; you seem to have discovered a way that works for (western) digits. Note that the fullwidth digits occupy codepoints 0xFF10 to 0xFF19, so the conversion formula is digit - '0' + 0xff10. 0xFF10 is 65296 decimal, but the hex is clearer, since it's what is used in published code charts.
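A minimal sketch of the Map-based approach suggested above, populated only for the western digits (the class and method names are invented for illustration; a full table would be filled in from the Unicode code charts):

```java
import java.util.HashMap;
import java.util.Map;

public class FullwidthDigits {
    // Halfwidth-to-fullwidth map; here only the ASCII digits as an
    // illustration (U+0030..U+0039 map to U+FF10..U+FF19).
    private static final Map<Character, Character> TO_FULLWIDTH = new HashMap<>();
    static {
        for (char c = '0'; c <= '9'; c++) {
            TO_FULLWIDTH.put(c, (char) (c - '0' + 0xFF10));
        }
    }

    public static String convert(String s) {
        StringBuilder sb = new StringBuilder();
        for (char c : s.toCharArray()) {
            // Characters without a mapping pass through unchanged.
            sb.append(TO_FULLWIDTH.getOrDefault(c, c));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(convert("01589846"));
    }
}
```

The map makes the conversion explicit and extensible: adding Katakana or punctuation later means adding entries, not changing the algorithm.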
Actually, it looks to me that the same thing works for all characters in the range SPACE to '~', presumably by design. Thus
for (int i = 0; i < list.length; i++)
    list[i] += 0xff00 - ' ';
Here, I simply assume without checking that list will only contain characters in the range of SPACE to '~', i.e., the Unicode range that corresponds to graphic (printable) ASCII characters. Dealing with other characters, for example Katakana, is more involved.
final String x = "012345 abcdef ABCDEF";
System.out.println(x);
String b = "";
final char[] list = x.toCharArray();
for (int i = 0; i < list.length; i++) {
    if (Character.isDigit(list[i])) {
        b += (char) (list[i] - 48 + 0xFF10);
    } else if (Character.isUpperCase(list[i])) {
        b += (char) (list[i] - 65 + 0xFF21);
    } else if (Character.isLowerCase(list[i])) {
        b += (char) (list[i] - 97 + 0xFF41);
    } else if (Character.isWhitespace(list[i])) {
        b += list[i];
    } else {
        b += (char) (list[i] - 33 + 0xFF01);
    }
}
System.out.println(b);
Output:
012345 abcdef ABCDEF
０１２３４５ ａｂｃｄｅｆ ＡＢＣＤＥＦ

Convert an array of integers (or string) of binary 1s and 0s to their alpha equivalent in Java

I have an array of integer 1s and 0s (possibly needing to be converted to byte type?). I have used an online ASCII to binary generator to get the equivalent binary of this 6 digit letter sequence:
abcdef should equal 011000010110001001100011011001000110010101100110 in binary.
My array is set up as int[] curCheckArr = new int[48]; and my string is basically just using a StringBuilder to build the same ints as strings, and calling toString() - so I have access to the code as a string or an array.
I have tried a few different methods, all of which crash the browser, including:
StringBuilder curCheckAlphaSB = new StringBuilder(); // Some place to store the chars
Arrays.stream( // Create a Stream
    curCheckString.split("(?<=\\G.{8})") // Splits the input string into 8-char sections (since a char has 8 bits = 1 byte)
).forEach(s -> // Go through each 8-char section...
    curCheckAlphaSB.append((char) Integer.parseInt(s, 2)) // ...and turn it into an int and then to a char
);
String curAlpha = curCheckAlphaSB.toString();
and
String curAlpha = "";
for (int b = 0; b < curCheckString.length() / 8; b++) {
    int a = Integer.parseInt(curAlpha.substring(8 * b, (b + 1) * 8), 2);
    curAlpha += (char) (a);
}
How can I most efficiently convert these 48 1s and 0s to a six digit alpha character sequence?
Assuming each character is represented by precisely one byte, you can iterate over the input and use Integer.parseInt() (because a byte value in the input is potentially unsigned):
String input = "011000010110001001100011011001000110010101100110";
StringBuilder sb = new StringBuilder();
for (int i = 0; i < input.length(); i += 8) {
    int c = Integer.parseInt(input.substring(i, i + 8), 2);
    sb.append((char) c);
}
System.out.println(sb); // abcdef
Using a regex is probably the slowest part of these approaches. Using a pre-compiled regex would help, but substring is faster. Not creating any Strings except the result would be faster still, e.g. use StringBuilder instead of += for String.
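As an illustration of the pre-compiled-regex suggestion above (the class and method names are invented for the example), the 8-character splitting pattern can be compiled once and reused across calls:

```java
import java.util.regex.Pattern;

public class PrecompiledSplit {
    // Compile the 8-char splitting pattern once instead of on every call.
    private static final Pattern EIGHT_BITS = Pattern.compile("(?<=\\G.{8})");

    public static String decode(String bits) {
        StringBuilder sb = new StringBuilder();
        for (String s : EIGHT_BITS.split(bits)) {
            // Each 8-char section is parsed as a binary number and cast to char.
            sb.append((char) Integer.parseInt(s, 2));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(decode("011000010110001001100011")); // abc
    }
}
```

String.split compiles its pattern on every invocation; hoisting the Pattern into a constant avoids that repeated cost when the method is called in a loop.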

Converted 4 letters to specific binary digits; how can I convert this binary to ASCII

int re = 0;
for (int t = 0; t < mat1.length - 1; t++) {
    if (mat1[t] == "A") {
        re = re + 00;
    } else if (mat1[t] == "T") {
        re = re + 01;
    } else if (mat1[t] == "G") {
        re = re + 10;
    } else if (mat1[t] == "C") {
        re = re + 11;
    }
    System.out.println(mat1[t]);
}
I want these codes translated from the binary we chose to ASCII, and then from the ASCII we will know the values.
Based from your last comment Sam I believe I now understand what you require and as you can see in the code below it is relatively easy to accomplish providing specific rules are followed.
One such rule is that the binary value for each ASCII character must be 8 bits. Because lower ASCII (0 to 127) only truly represents a 7 bit binary value (ie: A = 1000001 and z = 1111010), we must ensure that a 0 is padded onto the leftmost end of the binary value so as to produce a definite 8 bit binary number. We need to do this because our ACGT translation requires two binary digits for each character (ie: A = 00, C = 11, G = 10, T = 01), and therefore the length of every binary value (padded or not) must be divisible by 2 with no remainder. If we left everything as 7 bit binary values then this cannot be accomplished.
Now, knowing that we need to append a 0 to the leftmost end of each ASCII binary value to establish 8 bits, we will find that the ACGT string will always start with either a 'T' or an 'A'. The ACGT string will never start with a 'C' or a 'G'. If this is unacceptable then the ACGT character-to-binary translation must change, or the padding of our ASCII binary value must change. It should be the translation that changes, because a change to the ASCII binary value would misrepresent the ASCII binary, which is not good.
Another rule is that the ACGT Character to Binary Translation always remains the same. It never changes throughout processing.
The new code I provide below carries out the task you described within your last comment. I will leave the previous code from my previous post as it is since someone may find that useful as well.
In this new code I have used a Scanner to receive input from a User for testing purposes. I understand that you will be retrieving strings from a database and I will leave that up to you as to how you will implement that to the code since placing the two conversional sections of this code into methods would be the best way to go here.
As with just about anything in Java, there are about 12 ways to do anything; however, I particularly used 'for loops' to process things here since they are the easiest to follow, in my opinion. You can optimize this code any way you see fit once you have it working exactly the way you want.
Here is the code (copy/paste/run):
import java.util.Arrays;
import java.util.Scanner;

public class CharacterTranslation {
    public static void main(String[] args) {
        // Get Input from User...
        Scanner in = new Scanner(System.in);
        System.out.println("*** CONVERT FROM STRING TO ASCII TO BINARY TO ACGT ***\n");
        System.out.println("Please enter a String to Convert to ACGT:");
        String inputString = in.nextLine();
        // Declare and initialize required variables...
        int[] inputAscii = new int[inputString.length()];
        String[] inputBinary = new String[inputString.length()];
        // Translation Table made from a two dimensional Array:
        String[][] ACGTtranslation = {{"A","00"},{"T","01"},{"G","10"},{"C","11"}};

        // ------------------------------------------------
        // -------- CONVERT FROM STRING TO ACGT ----------
        // ------------------------------------------------
        // Convert the input string into ASCII numbers...
        for (int i = 0; i < inputString.length(); i++) {
            char character = inputString.charAt(i);
            inputAscii[i] = (int) character;
        }
        System.out.println("Conversion To ASCII: " + Arrays.toString(inputAscii)
                .replace("[","").replace("]",""));
        // Convert the ASCII Numbers to 8 bit Binary numbers...
        for (int i = 0; i < inputAscii.length; i++) {
            String bs = String.valueOf(Integer.toBinaryString(0x100 +
                    inputAscii[i]).substring(2));
            // Pad the left end of the binary number with 0 should
            // it not be 8 bits. ASCII characters will only produce
            // 7 bit binary. We must have 8 bits to acquire an even
            // number of digit pairs for our ACGT conversion.
            while (bs.length() < 8) { bs = "0" + bs; }
            inputBinary[i] = bs;
        }
        System.out.println("Conversion To 8bit Binary: " + Arrays.toString(inputBinary)
                .replace("[","").replace("]",""));
        // Convert the Binary String to ACGT format based on
        // our translational two dimensional String Array.
        // First we append all the binary data together to form
        // a single string of binary numbers, then starting from
        // the left we break off 2 binary digits at a time to
        // convert to our ACGT string format.
        // Convert the inputBinary Array to a single binary String...
        String binaryString = "";
        for (int i = 0; i < inputBinary.length; i++) {
            binaryString += String.valueOf(inputBinary[i]);
        }
        // Convert the Binary String to ACGT...
        String ACGTstring = "";
        for (int i = 0; i < binaryString.length(); i += 2) {
            String tmp = binaryString.substring(i, i + 2);
            for (int j = 0; j < ACGTtranslation.length; j++) {
                if (tmp.equals(ACGTtranslation[j][1])) {
                    ACGTstring += ACGTtranslation[j][0];
                }
            }
        }
        System.out.println("The ACGT Translation String for the Word '" +
                inputString + "' is: " + ACGTstring + "\n");

        // ------------------------------------------------
        // ----- CONVERT FROM ACGT BACK TO STRING --------
        // ------------------------------------------------
        System.out.println("*** CONVERT FROM ACGT ('" + ACGTstring +
                "') TO BINARY TO ASCII TO STRING ***\n");
        System.out.println("Press ENTER Key To Continue...");
        String tmp = in.nextLine();
        // Convert ACGT back to 8bit Binary...
        String translation = "";
        for (int i = 0; i < ACGTstring.length(); i++) {
            String c = Character.toString(ACGTstring.charAt(i));
            for (int j = 0; j < ACGTtranslation.length; j++) {
                if (ACGTtranslation[j][0].equals(c)) { translation += ACGTtranslation[j][1]; break; }
            }
        }
        // We divide the translation String by 8 so as to get
        // the total number of 8 bit binary numbers that would
        // be contained within that ACGT String. We then reinitialize
        // our inputBinary Array to hold that many binary numbers.
        inputBinary = new String[translation.length() / 8];
        int cntr = 0;
        for (int i = 0; i < translation.length(); i += 8) {
            inputBinary[cntr] = translation.substring(i, i + 8);
            cntr++;
        }
        System.out.println("Conversion from ACGT To 8bit Binary: " +
                Arrays.toString(inputBinary).replace("[","")
                        .replace("]",""));
        // Convert 8bit Binary To ASCII...
        inputAscii = new int[inputBinary.length];
        for (int i = 0; i < inputBinary.length; i++) {
            inputAscii[i] = Integer.parseInt(inputBinary[i], 2);
        }
        System.out.println("Conversion from Binary To ASCII: " + Arrays.toString(inputAscii)
                .replace("[","").replace("]",""));
        // Convert ASCII to Character String...
        inputString = "";
        for (int i = 0; i < inputAscii.length; i++) {
            inputString += Character.toString((char) inputAscii[i]);
        }
        System.out.println("Conversion from ASCII to Character String: " + inputString);
        System.out.println("** Process Complete ***");
    }
}
EDIT:
I want to add that I have supplied a bit of a fib within the text. Most characters used for strings within the ASCII character set (ASCII 32 to 127) are represented by a 7 bit binary value; however, characters within the upper ASCII character set (128 to 255) are represented with an actual 8 bit binary value. The code provided takes this into account. I have edited my answer to accommodate this.
Sam, you'll need to create some sort of translation table so as to know which makeshift binary bits relate to which character, and I say makeshift because 00 is nothing like the letter "A" in binary. The same applies to the binary representations for the other characters you provide. What the binary representation might be is irrelevant at this point, since you can do whatever you want for your specific purpose. If possible though, proper representation is the way to go for more advanced functionality later on down the road, AND you wouldn't need a translation table, because all you would need to do is convert the character's ASCII value to binary, like this:
// 65 is the ASCII value for the letter A.
String letterAis = String.valueOf(Integer.toBinaryString(0x100 + 65).substring(2));
ALPHABET IN (8 bit) BINARY, CAPITAL LETTERS
A 01000001 N 01001110
B 01000010 O 01001111
C 01000011 P 01010000
D 01000100 Q 01010001
E 01000101 R 01010010
F 01000110 S 01010011
G 01000111 T 01010100
H 01001000 U 01010101
I 01001001 V 01010110
J 01001010 W 01010111
K 01001011 X 01011000
L 01001100 Y 01011001
M 01001101 Z 01011010
ALPHABET IN (8 bit) BINARY, LOWER CASE
a 01100001 n 01101110
b 01100010 o 01101111
c 01100011 p 01110000
d 01100100 q 01110001
e 01100101 r 01110010
f 01100110 s 01110011
g 01100111 t 01110100
h 01101000 u 01110101
i 01101001 v 01110110
j 01101010 w 01110111
k 01101011 x 01111000
l 01101100 y 01111001
m 01101101 z 01111010
You already have an Array variable named mat1[] which holds your string letters and what I propose is to make it a two dimensional array, 1 column to hold the letter and a 2nd column to hold the binary translation for that letter. Once you have the translation established you can convert back and forth from letter string to Binary and Binary to letter string. Here is the code (just copy/paste and run):
public class CharacterTranslation {
    public static void main(String[] args) {
        // Translation Table made from a two dimensional Array:
        String[][] mat1 = {{"A","00"},{"T","01"},{"G","10"},{"C","11"}};
        String input = "GCAT";
        System.out.println("Original Input: " + input);
        // Convert each character within the supplied input string
        // to our binary character translation.
        String translation = "";
        for (int i = 0; i < input.length(); i++) {
            String c = Character.toString(input.charAt(i));
            for (int j = 0; j < mat1.length; j++) {
                if (mat1[j][0].equals(c)) { translation += mat1[j][1]; break; }
            }
        }
        // Display the translation in output console (pane).
        System.out.println("Convert To Binary Translation: " + translation);
        // Now, convert the binary translation back to our
        // original character input. Note: this only works
        // if the binary translation is only 2 bits for any
        // character.
        String origInput = "";
        for (int i = 0; i < translation.length(); i += 2) {
            String b = translation.substring(i, i + 2);
            for (int j = 0; j < mat1.length; j++) {
                if (mat1[j][1].equals(b)) { origInput += mat1[j][0]; break; }
            }
        }
        // Display the converted binary translation back to
        // its original characters.
        System.out.println("Convert Back To Original Input: " + origInput);
    }
}
For what you really want to do Sam, we only need one button. We can make it act like a toggle button between two different functions. One Button and One Text Field. This is pretty basic stuff.
The translation table 2 Dimensional Array should be placed under the constructor for your GUI class so that it's available to all methods, something like this:
public class MyGUIClassName??? extends javax.swing.JFrame {
    static String[][] ACGTtranslation = {{"A","00"},{"T","01"},{"G","10"},{"C","11"}};
    ..............................
    ..............................
    ..............................
}
Then your jButton3 ActionPerformed event could look something like:
private void jButton3ActionPerformed(java.awt.event.ActionEvent evt) {
    // Skip this event if there is nothing contained
    // within jTextField1.
    if (jTextField1.getText().isEmpty()) { return; }
    // Get the current text in jButton3.
    String buttonText = jButton3.getText();
    // If the button text reads "Convert To ACGT" then...
    if ("Convert To ACGT".equals(buttonText)) {
        // change the button text to "Convert To String".
        jButton3.setText("Convert To String");
        // Convert the string from database now contained in jTextField1
        // to ACGT format and place that new ACGT into the same JTextField.
        // We use the StringToACGT() method for this.
        jTextField1.setText(StringToACGT(jTextField1.getText()));
    }
    // The button text must be "Convert To String"...
    else {
        // so let's change the button text to now be "Convert To ACGT"
        // again.
        jButton3.setText("Convert To ACGT");
        // Take the ACGT string now contained within jTextField1
        // from the first button click and convert it back to its
        // original String format. We use the ACGTtoString() method
        // for this.
        jTextField1.setText(ACGTtoString(jTextField1.getText()));
    }
}
And here are the methods you also place into your GUI class:
// CONVERT A STRING TO ACGT FORMAT
public static String StringToACGT(String inputString) {
    // Make sure the input string contains something.
    if ("".equals(inputString)) { return ""; }
    // Declare and initialize required variables...
    int[] inputAscii = new int[inputString.length()];
    String[] inputBinary = new String[inputString.length()];
    // Convert the input string into ASCII numbers...
    for (int i = 0; i < inputString.length(); i++) {
        char character = inputString.charAt(i);
        inputAscii[i] = (int) character;
    }
    // Convert the ASCII Numbers to 8 bit Binary numbers...
    for (int i = 0; i < inputAscii.length; i++) {
        String bs = String.valueOf(Integer.toBinaryString(0x100 +
                inputAscii[i]).substring(2));
        // Pad the left end of the binary number with 0 should
        // it not be 8 bits. ASCII characters will only produce
        // 7 bit binary. We must have 8 bits to acquire an even
        // number of digit pairs for our ACGT conversion.
        while (bs.length() < 8) { bs = "0" + bs; }
        inputBinary[i] = bs;
    }
    // Convert the Binary String to ACGT format based on
    // our translational two dimensional String Array.
    // First we append all the binary data together to form
    // a single string of binary numbers, then starting from
    // the left we break off 2 binary digits at a time to
    // convert to our ACGT string format.
    // Convert the inputBinary Array to a single binary String...
    String binaryString = "";
    for (int i = 0; i < inputBinary.length; i++) {
        binaryString += String.valueOf(inputBinary[i]);
    }
    // Convert the Binary String to ACGT...
    String ACGTstring = "";
    for (int i = 0; i < binaryString.length(); i += 2) {
        String tmp = binaryString.substring(i, i + 2);
        for (int j = 0; j < ACGTtranslation.length; j++) {
            if (tmp.equals(ACGTtranslation[j][1])) {
                ACGTstring += ACGTtranslation[j][0];
            }
        }
    }
    return ACGTstring;
}

// CONVERT AN ACGT STRING BACK TO ITS ORIGINAL STRING STATE.
public static String ACGTtoString(String inputString) {
    // Make sure the input string contains something.
    if ("".equals(inputString)) { return ""; }
    String ACGTstring = inputString;
    // Declare and initialize required variables...
    int[] inputAscii = new int[inputString.length()];
    String[] inputBinary = new String[inputString.length()];
    // Convert ACGT back to 8bit Binary...
    String translation = "";
    for (int i = 0; i < ACGTstring.length(); i++) {
        String c = Character.toString(ACGTstring.charAt(i));
        for (int j = 0; j < ACGTtranslation.length; j++) {
            if (ACGTtranslation[j][0].equals(c)) { translation += ACGTtranslation[j][1]; break; }
        }
    }
    // We divide the translation String by 8 so as to get
    // the total number of 8 bit binary numbers that would
    // be contained within that ACGT String. We then reinitialize
    // our inputBinary Array to hold that many binary numbers.
    inputBinary = new String[translation.length() / 8];
    int cntr = 0;
    for (int i = 0; i < translation.length(); i += 8) {
        inputBinary[cntr] = translation.substring(i, i + 8);
        cntr++;
    }
    // Convert 8bit Binary To ASCII...
    inputAscii = new int[inputBinary.length];
    for (int i = 0; i < inputBinary.length; i++) {
        inputAscii[i] = Integer.parseInt(inputBinary[i], 2);
    }
    // Convert ASCII to Character String...
    inputString = "";
    for (int i = 0; i < inputAscii.length; i++) {
        inputString += Character.toString((char) inputAscii[i]);
    }
    return inputString;
}
Like I said earlier Sam, this is pretty basic material and you should already have a very good handle onto the Java programming language to get to this point, especially if you are actually retrieving the data to convert from a Database. My job here is Complete. :o) I hope this has helped you (and others) to obtain your goals.

Setting each of the alphabet to a number

So I am trying to set each of the letters in the alphabet to a number, like a = 1, b = 2, c = 3 and so on.
int char = "a";
int[] num = new int{26};
for (int i = 0; i < num.length; i++){
    System.out.print(i);
But after this I got stuck, so please help me out if possible. When the user inputs a word like cat, it should output 3-1-20.
You can subtract 'a' from each char and add 1. E.g.
String input = "cat";
for (char c : input.toCharArray()) {
System.out.print(c - 'a' + 1);
}
The code you posted doesn't compile, as you can't assign a String to an int, and char is a reserved word (the name of a primitive type):
int char = "a";
You also mention that you want the output formatted like this "3-1-20". This is one way to achieve that :
String input = "cat";
String[] out = new String[input.length()];
for (int i = 0; i < input.length(); ++i) {
    out[i] = Integer.toString(input.charAt(i) - 'a' + 1);
}
System.out.println(String.join("-", out));
Both versions work only for lowercase English letters (a to z)
Assigning a number to a character is called an "encoding". As computers can only handle numbers internally, this happens all the time. Even this text here is encoded (probably into an encoding called "UTF-8") and then the resulting number is stored somewhere.
One very basic encoding is the so called ASCII (American Standard Code for Information Interchange). ASCII already does what you want, it assigns a number to each character, only that the number for "A" is 65 instead 1.
So, how does that help? We can assume that for the characters A-z, the numeric value of a char is equal to the ASCII code (it's not true for every character, but for the most basic ones, it's good enough).
And this is why everyone here tells you to subtract 'A' or 'a': Your character is a char, which is a character, but also the numeric value of that character, so you can subtract 'A' (again, a char) and add 1:
'B' - 'A' + 1 = 2
because...
66 (numeric value of 'B') - 65 (numeric value of 'A') + 1 = 2
Actually, char is not ASCII, but UTF-16, but there it starts to get slightly more complex, so ASCII will suffice for the moment.
The best way of doing this is to convert the String to a char[] (and from there to a byte[]), like this:
char[] buffer = str.toCharArray();
Then each of the characters can be converted to their byte-value (which are constants for a certain encoding), like this:
byte[] b = new byte[buffer.length];
for (int i = 0; i < b.length; i++) {
    b[i] = (byte) buffer[i];
}
Now look at the resulting values and subtract/add some value to get the desired results!

(Java) Convert a string of numbers to an array of ints

I'm trying to convert a string filled with 16 digits into an array of ints where each index holds the digit of its respective index in the string. I'm writing a program where I need to do math on individual ints in the string, but all of the methods I've tried don't seem to work. I can't split by a character, either, because the user is inputting the number.
Here's what I have tried.
//Directly converting from char to int
//(returns different values like 49 instead of 1?)
//I also tried converting to an array of char, which worked,
//but then when I converted
//the array of char to an array of ints, it still gave me weird numbers.
for (int count = 0; count <= 15; count++)
{
    intArray[count] = UserInput.charAt(count);
}

//Converting the string to an int and then using division to grab each digit,
//but it throws the following error (perhaps it's too long?):
// "java.lang.NumberFormatException: For input string: "1234567890123456""
int varX = Integer.parseInt(UserInput);
int varY = 1;
for (count = 0; count <= 15; count++)
{
    intArray[count] = (varX / varY * 10);
}
Any idea what I should do?
how about this:
for (int count = 0; count < userInput.length(); ++count)
    intArray[count] = userInput.charAt(count) - '0';
I think the thing that is a bit confusing here is that ints and chars can be interpreted as each other. The int value for the character '1' is actually 49.
Here is a solution:
for (int i = 0; i < 16; i++) {
    intArray[i] = Integer.valueOf(userInput.substring(i, i + 1));
}
The substring method returns a part of the string as another string, not a character, and this can be parsed to an int.
Some tips:
I changed <= 15 to < 16. This is the convention and will tell you how many loop iterations you will actually go through (16).
I changed "count" to "i". Another convention...

Categories

Resources