For simplicity's sake, I've got the following method to check whether the input number is binary (i.e. contains only 1s and 0s):
public static void checkBinary(int BinaryNumber) {
    String bNumber = String.valueOf(BinaryNumber);
    char[] Digits = bNumber.toCharArray();
    for (int i = 0; i < Digits.length; i++) {
        if (Digits[i] > 1) {
            System.out.println("You can't have the digit " + Digits[i]);
            System.out.println("Your number is not a binary number.");
            System.exit(0);
        }
    }
}
However, when I run checkBinary(1010); I get the following output:
You can't have the digit 1
Your number is not a binary number.
Any idea why it's counting the initial 1 as greater than 1?
Thanks in advance guys!
ASCII '1' is not the same as 1. You should be comparing:
if (Digits[i] > '1') {
...
}
ASCII '1' is 0x31 or 49 decimal.
EDIT: Also, be aware that if the input number is negative, you will have an ASCII '-' (0x2D, dec 45) in your char array. Really, you should compare against '0' and '1' only, not use >.
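A minimal sketch of a version that compares against '0' and '1' only, as suggested (the isBinary name and the main demo are my own additions, not the asker's):

```java
public class BinaryCheck {
    // True only when every character of the decimal representation
    // is '0' or '1'; a '-' sign or any other digit fails the check.
    static boolean isBinary(int number) {
        for (char d : String.valueOf(number).toCharArray()) {
            if (d != '0' && d != '1') {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(isBinary(1010)); // true
        System.out.println(isBinary(1020)); // false
        System.out.println(isBinary(-101)); // false (leading '-')
    }
}
```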
Try comparing a character with a character, not with a number. Changing your comparison to:
Digits[i] > '1'
will fix this code.
The character '1' in ASCII has the decimal value 49.
You are confusing numbers with representations of numbers. It makes no sense to ask if '0' or '1' are greater than one. '0' and '1' are digits and one is a numerical value.
Digits: '0' is a digit. '1' is a digit. "Three" is not a digit. Digits are symbols that can express a number, or part of a number, in some particular base.
Numbers: '0', "zero", and "one less than one" all mean the same thing, they're the same number. Numbers are amounts and can be represented many different ways, including by sequences of digits.
Thoroughly understanding the difference between values and representations of values is a critical programming skill.
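A tiny demo of the distinction (names are illustrative):

```java
public class CharVsValue {
    public static void main(String[] args) {
        char digit = '1';
        // The character code of the digit symbol '1':
        System.out.println((int) digit);                      // 49
        // The numerical value that the digit represents:
        System.out.println(Character.getNumericValue(digit)); // 1
    }
}
```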
Digits[i] is a char; you are comparing it with 1 (an int). You need to use
Character.getNumericValue(Digits[i])
instead.
Because Digits[i] equals 49, which is greater than 1.
You're using characters, not numbers:
char[] Digits = bNumber.toCharArray();
When you compare a char to an int, the char is implicitly converted to an int using the integer value of that character. And the integer value of '1' is 49.
A simple approach would be to use characters in both sides of the comparison:
if (Digits[i] > '1')
Or maybe use the intuitive numeric value of the char:
if (Character.getNumericValue(Digits[i]) > 1)
Here is the issue: if (Digits[i] > 1)
You are comparing a char to an int. You should change this logic: either compare char to char, or int to int.
In Java, a char's numeric value is its character code (ASCII-compatible), so you should change the if statement in your function like this:
public static void checkBinary(int BinaryNumber) {
    String bNumber = String.valueOf(BinaryNumber);
    char[] Digits = bNumber.toCharArray();
    for (int i = 0; i < Digits.length; i++) {
        if (Character.getNumericValue(Digits[i]) > 1) {
            System.out.println("You can't have the digit " + Digits[i]);
            System.out.println("Your number is not a binary number.");
            System.exit(0);
        }
    }
}
Related
I've been searching for a solution to my problem for days, but I can't find a spot-on answer in previously answered questions, blogs, tutorials, etc. all over the internet.
My aim is to write a program which takes a decimal number as an input and then calculates the hexadecimal number and also prints the unicode-symbol of said hexadecimal number (\uXXXX).
My problem is I can't "convert" the hexadecimal number to unicode. (It has to be written in this format: \uXXXX)
Example:
Input:
122 (= Decimal)
Output:
Hexadecimal: 7A
Unicode: \u007A | Unicode Symbol: Latin small letter "z"
The only thing I've managed to do is print the unicode (\u007A), but I want the symbol ("z").
I thought that since the unicode is only 4 numbers/letters long, I would just need to copy the hexadecimal into the code and fill up the remaining places with 0s, and it kind of worked; but as I said, I need the symbol, not the code. So I tried and tried, but I just couldn't get the symbol.
By my understanding, if you want the symbol you need to print it as a string.
But when trying it with a string I get the error "illegal unicode escape".
It's as if you can only print pre-determined unicodes, not "random" ones generated on the spot from your input.
I'm only a couple days into Java, so apologies if I have missed anything.
Thank you for reading.
My code:
int dec;
int quotient;
int rest;
int[] hex = new int[10];
char[] chars = new char[] {
    'F',
    'E',
    'D',
    'C',
    'B',
    'A'
};
String unicode;

// Input number
System.out.println("Input decimal number:");
Scanner input = new Scanner(System.in);
dec = input.nextInt();

// "Converting" to hexadecimal
quotient = dec / 16;
rest = dec % 16;
hex[0] = rest;
int j = 1;
while (quotient != 0) {
    rest = quotient % 16;
    quotient = quotient / 16;
    hex[j] = rest;
    j++;
}

/*if (j == 1) {
    unicode = '\u000';
}
if (j == 2) {
    unicode = '\u00';
}
if (j == 3) {
    unicode = '\u0';
}*/

System.out.println("Your number: " + dec);
System.out.print("The corresponding Hexadecimal number: ");
for (int i = j - 1; i >= 0; i--) {
    if (hex[i] > 9) {
        if (j == 1) {
            unicode = "\u000" + String.valueOf(chars[16 - hex[i] - 1]);
        }
        if (j == 2) {
            unicode = "\u00" + String.valueOf(chars[16 - hex[i] - 1]);
        }
        if (j == 3) {
            unicode = "\u0" + String.valueOf(chars[16 - hex[i] - 1]);
        }
        System.out.print(chars[16 - hex[i] - 1]);
    } else {
        if (j == 1) {
            unicode = "\u000" + Character.valueOf(hex[i]);
        }
        if (j == 2) {
            unicode = "\u00" + Character.valueOf(hex[i]);
        }
        if (j == 3) {
            unicode = "\u0" + Character.valueOf(hex[i]);
        }
        System.out.print(hex[i]);
    }
}
System.out.println();
System.out.print("Unicode: " + (unicode));
}
It's not advanced code whatsoever; I wrote it exactly how I would calculate it on paper.
Dividing the number by 16 until I get 0: what remains at each step is the hexadecimal equivalent.
So I put it in a while loop: since I would divide the number n times until I got 0, the condition is to repeat the division until the quotient equals zero.
While doing so, the remainder of each division gives the numbers/letters of my hexadecimal number, so I need them to be saved. I chose an integer array to do so: rest (remainder) = hex[j].
I also threw in a variable called j, so I would know how many times the division was repeated. That way I can determine how long the hexadecimal number is.
In the example it would be 2 letters/numbers long (7A), so j = 2.
The variable is then used to determine how many 0s I need to fill up the unicode with.
If I have only 2 letters/numbers, it means there are 2 empty spots after \u, so we add two zeros to get \u007A instead of \u7A.
Also, the next if-command replaces any number higher than 9 with a character from the char array above, basically just like you would do on paper.
I'm very sorry for this insanely long question.
U+007A denotes the Unicode code point, an int value.
\u007A is the UTF-16 char escape.
A Unicode code point above U+FFFF is converted to two chars (a surrogate pair), and then the hexadecimal digits of the chars no longer agree with the code point, so using code points is best. UTF-16 is just an encoding scheme with a two-byte representation, where the surrogate pairs for the higher code points are marked by reserved bit patterns.
int hex = 0x7A;
hex = Integer.parseUnsignedInt("007A", 16);
char ch = (char) hex;
String stringWith1CodePoint = new String(new int[] { hex }, 0, 1);
int[] codePoints = stringWith1CodePoint.codePoints().toArray();
String s = "𝄞"; // U+1D11E = "\uD834\uDD1E"
You can simply use System.out.printf or String.format to do what you want.
Example:
int decimal = 122;
System.out.printf("Hexadecimal: %X\n", decimal);
System.out.printf("Unicode: \\u%04X\n", decimal);
System.out.printf("Latin small letter: %c\n", (char)decimal);
Output:
Hexadecimal: 7A
Unicode: \u007A
Latin small letter: z
I came across a code which checks whether a character is between 'a' and 'z' case insensitive. However, I don't understand what the line after that is doing which is:
alphabets[c - 'a']++;
Could someone please explain this code to me?
alphabets = new int[26];
for (int i = 0; i < str.length(); i++) {
    char c = str.charAt(i);
    if ('a' <= c && c <= 'z') {
        alphabets[c - 'a']++; // what does this do?
    }
}
This code counts the number of times each lower-case letter appears in the string. alphabets is an array where the first index (index 0) holds the number of a's, the second the number of b's, etc.
Subtracting 'a' from the character produces the relative index, and ++ then increments the counter for that letter.
A char in Java is just a small integer, 16 bits wide. Generally speaking, the values it holds are the values that Unicode [aside: Java does not represent characters as "ASCII"] assigns to characters, but fundamentally, chars are just integers. Thus 'a' is the integer 0x0061, which can also be written as 97.
So, if you have a value in the range 'a' to 'z', you have a value in the range 97 to 122. Subtracting 'a' (subtracting 97) puts it in the range 0 to 25, which is suitable for indexing the 26-element array alphabets.
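Putting the snippet into a runnable form makes this concrete (the countLetters wrapper and the "banana" input are my own additions, just for illustration):

```java
public class LetterCounts {
    // Counts of 'a'..'z' in str; index 0 is 'a', index 25 is 'z'.
    static int[] countLetters(String str) {
        int[] alphabets = new int[26];
        for (int i = 0; i < str.length(); i++) {
            char c = str.charAt(i);
            if ('a' <= c && c <= 'z') {
                alphabets[c - 'a']++; // bump the counter for this letter
            }
        }
        return alphabets;
    }

    public static void main(String[] args) {
        int[] counts = countLetters("banana");
        System.out.println(counts['a' - 'a']); // 3
        System.out.println(counts['n' - 'a']); // 2
    }
}
```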
I am trying to convert a String with preceding zeroes into an integer (or, let's say, a BigDecimal). But the zeroes are truncated when the String is converted. Please help me to convert without losing zeroes.
Here Oracle explains how integers are stored.
int, whose values are 32-bit signed two's-complement integers, and whose default value is zero
and also
The values of the integral types of the Java Virtual Machine are:
For int, from -2147483648 to 2147483647 (-2^31 to 2^31 - 1), inclusive
These show that an int stores only the numeric value; with 32 bits you can only represent numbers in the range -2147483648 to 2147483647.
So integers don't carry any information about preceding zeroes. You have to store that separately.
String s = "0002314";
int precedingZeroes = 0;
for (char c : s.toCharArray()) {
    if (c == '+' || c == '-') {
        // A sign is allowed, but it is not a zero -> continue
        continue;
    }
    if (c == '0') {
        precedingZeroes++; // We found a leading zero, increment the counter
        continue;
    }
    // This is not a sign and not a zero: we have reached non-zero
    // digits (or an ill-formed integer), so any further zeroes are
    // not preceding ones -> break the loop, that's it
    break;
}
int number = Integer.parseInt(s);
int number = Integer.parseInt(s);
So you have to store both the number and the number of zeroes. Note that this only works for decimal numbers.
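If the goal is to get the padded form back later, the stored count can be combined with the parsed value again. A sketch (the rebuild helper is my own name, and String.repeat requires Java 11+):

```java
public class RebuildPadded {
    // Rebuilds the zero-padded string from the parsed value and the
    // separately stored count of preceding zeroes.
    static String rebuild(int number, int precedingZeroes) {
        return "0".repeat(precedingZeroes) + number;
    }

    public static void main(String[] args) {
        // 2314 with 3 preceding zeroes, as counted by the loop above:
        System.out.println(rebuild(2314, 3)); // 0002314
    }
}
```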
I have written a small piece of code where you enter a 3 digit number via the command line, and then it detects how many 5's are in the code.
public class fivedet {
    public static void main(String[] args) {
        String input = args[0];
        int[] a = {0, 0, 0};
        int x = 0;
        int y = 0;
        int z = 0;
        for (int i = 0; i < input.length(); i++) {
            a[i] = input.charAt(i) - 48;
        }
        if (a[0] == 5) {
            x = 5;
        }
        if (a[1] == 5) {
            y = 5;
        }
        if (a[2] == 5) {
            z = 5;
        }
        System.out.println("5 digits here:" + x + y + z);
    }
}
My main question is why I require the -48 term after the input.charAt(i) method in order for each value in a[] to be equal to the actual number I input.
For example I enter
java fivedet 505
and without the -48 the array is a[]={53,48,53} instead of a[]={5,0,5}. I unfortunately am not experienced enough with Java (I began learning 3 months ago) to understand why this is happening.
I also do want to develop it to be able to detect different digits and for different lengths of input numbers.
I would appreciate any insight as to why this happens.
Subtracting 48 is a quick but slightly confusing way of converting from a character to an integer. It so happens that the character code for each digit is 48 away from its numeric value.
See this table of ASCII values. (Java uses Unicode, not ASCII; strings are UTF-16 internally, but the values agree in this specific case.) So the character '0' has the value 48; the character '9' has the value 57.
Another way of doing this would be to take 1-character substrings of input, then call Integer.parseInt() on that string, converting "1" to 1, "2" to 2, etc.
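A sketch of that alternative, counting 5s via one-character substrings instead of subtracting 48 (the countFives helper is my own, illustrative name):

```java
public class DigitParse {
    // Counts occurrences of the digit 5 by parsing each character
    // as a one-character substring with Integer.parseInt.
    static int countFives(String input) {
        int count = 0;
        for (int i = 0; i < input.length(); i++) {
            int digit = Integer.parseInt(input.substring(i, i + 1));
            if (digit == 5) {
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(countFives("505")); // 2
    }
}
```

This also works for inputs of any length, not just three digits.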
You don't need to convert to int in order to detect the character 5. Just count them as chars.
for (char i : args[0].toCharArray()) {
    System.out.print(i == '5' ? i : '0');
}
Also this question has good answers to count occurrences of a char in string
Can someone please explain to me what is going on here:
char c = '+';
int i = (int)c;
System.out.println("i: " + i + " ch: " + Character.getNumericValue(c));
This prints i: 43 ch:-1. Does that mean I have to rely on primitive conversions to convert char to int? So how can I convert a Character to Integer?
Edit: Yes I know Character.getNumericValue returns -1 if it is not a numeric value and that makes sense to me. The question is: why does doing primitive conversions return 43?
Edit2: 43 is the ASCII for +, but I would expect the cast to not succeed just like getNumericValue did not succeed. Otherwise that means there are two semantic equivalent ways to perform the same operation but with different results?
Character.getNumericValue(c)
The java.lang.Character.getNumericValue(char ch) returns the int value that the specified Unicode character represents. For example, the character '\u216C' (the roman numeral fifty) will return an int with a value of 50.
The letters A-Z in their uppercase ('\u0041' through '\u005A'), lowercase ('\u0061' through '\u007A'), and full width variant ('\uFF21' through '\uFF3A' and '\uFF41' through '\uFF5A') forms have numeric values from 10 through 35. This is independent of the Unicode specification, which does not assign numeric values to these char values.
This method returns the numeric value of the character, as a
nonnegative int value;
-2 if the character has a numeric value that is not a nonnegative integer;
-1 if the character has no numeric value.
And here is the link.
As the documentation clearly states, Character.getNumericValue() returns the character's value as a digit.
It returns -1 if the character is not a digit.
If you want to get the numeric Unicode code point of a boxed Character object, you'll need to unbox it first:
int value = (int)c.charValue();
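For example (illustrative names):

```java
public class Unbox {
    public static void main(String[] args) {
        Character boxed = '+';
        // Unbox to a primitive char, then widen to int:
        int value = (int) boxed.charValue();
        System.out.println(value); // 43
    }
}
```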
Try any one of the below. These should work:
int a = Character.getNumericValue('3');
int a = Integer.parseInt(String.valueOf('3'));
From the Javadoc for Character#getNumericValue:
If the character does not have a numeric value, then -1 is returned.
If the character has a numeric value that cannot be represented as a
nonnegative integer (for example, a fractional value), then -2 is
returned.
The character + does not have a numeric value, so you're getting -1.
Update:
The reason that primitive conversion gives you 43 is that the character '+' is encoded as the integer 43.
43 is the decimal ASCII number for the "+" symbol. That explains why you get 43 back.
http://en.wikipedia.org/wiki/ASCII
public class IntergerParser {
    public static void main(String[] args) {
        String number = "+123123";
        System.out.println(parseInt(number));
    }

    private static int parseInt(String number) {
        char[] numChar = number.toCharArray();
        int intValue = 0;
        int decimal = 1;
        for (int index = numChar.length; index > 0; index--) {
            if (index == 1) {
                if (numChar[index - 1] == '-') {
                    return intValue * -1;
                } else if (numChar[index - 1] == '+') {
                    return intValue;
                }
            }
            intValue = intValue + (((int) numChar[index - 1] - 48) * decimal);
            System.out.println((int) numChar[index - 1] - 48 + " " + decimal);
            decimal = decimal * 10;
        }
        return intValue;
    }
}