Java Implicit conversion char to int? - java

I was given an interview question: output the most frequently occurring character in a string.
Given the string "aaxxaabbaa", the character 'a' is the most frequent.
The following code is something I found searching on the internet. Note: I originally implemented it with two loops, which is inefficient (a different topic).
public static char findMostUsedChar(String str) {
    char maxchar = ' ';
    int maxcnt = 0;
    // If you are confident that your input will be only ASCII, this array can be size 128.
    // Create a character counter, indexed directly by char value.
    int[] charcnt = new int[Character.MAX_VALUE + 1];
    for (int i = 0; i < str.length(); i++) {
        char ch = str.charAt(i);
        // Increment this character's count and compare it to our max.
        if (charcnt[ch]++ >= maxcnt) {
            maxcnt = charcnt[ch];
            maxchar = ch;
        }
    }
    return maxchar;
}
They declared an int array, fetched the character at a specific index (e.g. 'a'), and then used it as an index into that array.
After tracing the code in the Eclipse debugger, I still don't understand how using a character as an int index works without explicitly casting it or calling Character.getNumericValue(). Even most of the S.O. char-to-int topics cast explicitly.
Thanks in advance.

Array access expressions undergo implicit unary numeric promotion, which widens the index expression to int. From the JLS:
The index expression undergoes unary numeric promotion (§5.6.1).
The char is widened to int via its Unicode value, e.g. 'A' -> 65.
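For example, here is a minimal sketch of the promotion in action (the array size and names are just illustrative):
int[] counts = new int[128];
char ch = 'a';
counts[ch]++;                    // ch is promoted to int 97; no cast needed
System.out.println(counts[97]);  // prints 1
System.out.println(ch - 'a');    // arithmetic also promotes: prints 0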

Related

Casting a specific element of an array - java

I have an array as:
String letters[] = {"a", "b", "c", "d", "e"};
Can I cast a specific element, let's say "a"? I want to get its ASCII value, so I did this, but it is not working:
for (int i = 0; i < letters.length; i++) {
    Integer iDecimal = (int) a[0]; // the a[0] is wrong!!
    System.out.print(iDecimal);
}
Any ideas about how to cast in such cases?
You are trying to cast a[0], which doesn't exist; to access your array you need letters[0] and can then try working with it. I suggest changing the type of your array to char and then casting it; you can't do that with String.
You could declare the array as an Object array, but be aware that this is bad coding style, since you are mixing different types in one array:
Object letters [] = {5, "b", 'c'};
The Object array contains an integer, a string, and a character; you may then iterate over the array and test what object type you have.
The integer element is autoboxed to an Integer object.
But again I would not recommend doing so, since element testing is expensive.
for (int i = 0; i < letters.length; i++) {
    if (letters[i] instanceof Integer) {
        Integer iDecimal = (Integer) letters[i];
    }
}
Assuming you want just the ASCII code for the first character of each string in your array, you can access that char and then cast it to an int.
for (int i = 0; i < letters.length; i++) {
    int iChar = (int) letters[i].charAt(0);
    System.out.print(iChar);
}
letters[i] is the ith string in the letters array.
letters[i].charAt(0) is the first character of the ith string, and is of type char.
(int) letters[i].charAt(0) is the integer value of that first character.
Strictly this will give the first UTF-16 code unit of the string, but for values below 128 it is the same as the ASCII value.
ints are not decimal values, so don't call them decimals. The BigDecimal type is used for decimal values.

Setting each of the alphabet to a number

So I am trying to map each letter of the alphabet to a number, like a = 1, b = 2, c = 3, and so on.
int char = "a";
int[] num = new int{26};
for (int i = 0; i < num.length; i++) {
    System.out.print(i);
But after this I got stuck, so please help me out. When the user inputs a word like cat, it should output 3-1-20.
You can subtract 'a' from each char and add 1. E.g.
String input = "cat";
for (char c : input.toCharArray()) {
    System.out.print(c - 'a' + 1);
}
The code you posted doesn't compile, as you can't assign a String to an int, and char is a reserved word (the name of a primitive type):
int char = "a";
You also mention that you want the output formatted like this: "3-1-20". This is one way to achieve that:
String input = "cat";
String[] out = new String[input.length()];
for (int i = 0; i < input.length(); ++i) {
    out[i] = Integer.toString(input.charAt(i) - 'a' + 1);
}
System.out.println(String.join("-", out));
Both versions work only for lowercase English letters (a to z).
Assigning a number to a character is called an "encoding". As computers can only handle numbers internally, this happens all the time. Even this text here is encoded (probably into an encoding called "UTF-8") and the resulting numbers are stored somewhere.
One very basic encoding is the so-called ASCII (American Standard Code for Information Interchange). ASCII already does what you want: it assigns a number to each character, only the number for "A" is 65 instead of 1.
So, how does that help? We can assume that for the characters A-Z and a-z, the numeric value of a char is equal to the ASCII code (it's not true for every character, but for the most basic ones it's good enough).
And this is why everyone here tells you to subtract 'A' or 'a': your character is a char, which is a character, but also the numeric value of that character, so you can subtract 'A' (again, a char) and add 1:
'B' - 'A' + 1 = 2
because...
66 (numeric value of 'B') - 65 (numeric value of 'A') + 1 = 2
Actually, Java's char is not ASCII but UTF-16, where it starts to get slightly more complex; but since the first 128 code points of UTF-16 match ASCII, ASCII will suffice for the moment.
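Here is that arithmetic as a runnable snippet (nothing beyond what is described above):
char letter = 'B';
int position = letter - 'A' + 1; // 'B' and 'A' are promoted to 66 and 65
System.out.println(position);    // prints 2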
The best way of doing this is to convert the String to a char[], like this:
char[] buffer = str.toCharArray();
Then each of the characters can be converted to its byte value (which is constant for a given encoding), like this:
byte[] b = new byte[buffer.length];
for (int i = 0; i < b.length; i++) {
    b[i] = (byte) buffer[i];
}
Now look at the resulting values and subtract/add some value to get the desired results!
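For example, a quick sketch using the word from the question (the subtraction is the same trick the other answers use, just applied to the byte values):
char[] buffer = "cat".toCharArray();
byte[] b = new byte[buffer.length];
for (int i = 0; i < b.length; i++) {
    b[i] = (byte) buffer[i];
    System.out.print((b[i] - 'a' + 1) + " "); // prints 3 1 20
}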

Storing Integer value as A Character Java

I tried to look at previous Stack Overflow questions but could not find anything similar.
I want to store an integer value in a char variable, but the integer value is held in another variable.
char[] ch1 = binary1.toString().toCharArray();
char[] ch2 = binary2.toString().toCharArray();
if (ch1.length >= ch2.length) {
    char[] ch = new char[ch1.length];
    int j = 0;
    int num1;
    int num2;
    int num3;
    for (int i = 0; i < ch1.length; i++) {
        if (j == ch2.length - 1)
            j = 0;
        num1 = Character.getNumericValue(ch1[i]);
        num2 = Character.getNumericValue(ch2[j]);
        num3 = num1 ^ num2;
        ch[i] = (char) num3;
        j++;
    }
    String str = new String(ch);
    return str;
}
What is happening here is that I am getting null characters in many cases.
I looked in the Character class but could not find a suitable method.
If there is any way, please tell me.
Thanks in advance.
EDIT: I need to store either 0 or 1.
EDIT: ch1[] and ch2[] are of char type.
EDIT: I got a temporary solution to my problem, but as a wider question: how should we cast an int to a char when the sizes match and we are getting null values?
You can do
char ch = num3 == 0 ? '0' : '1';
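Applied inside the loop from your question (a sketch reusing your variable names), this stores the printable digit characters instead of the unprintable code points 0 and 1:
num3 = num1 ^ num2;
ch[i] = num3 == 0 ? '0' : '1'; // '0'/'1', not (char) 0 / (char) 1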
When you call Character.getNumericValue(someChar), you are actually getting the numeric value that the character represents.
Let's say someChar = '4'; in this case, the above method will give you the value 4.
But you cannot get a digit's value this way if the char holds something other than '0' to '9'.
If you really want the integer value of whatever code unit is stored in the char variable, then you have to convert it directly to an int, as below:
char x = 'Z';
int z = x;
System.out.println(z);
The above code snippet will print 90, as 'Z' is equivalent to 90 in ASCII.
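A small snippet contrasting the two behaviors:
char digit = '7';
System.out.println(Character.getNumericValue(digit)); // prints 7  (the digit's value)
System.out.println((int) digit);                      // prints 55 (its code unit)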
Maybe your conversion char ch = (char) num3; is not right? Maybe you can try to use something like
Character.toChars(65)
See this answer for information on that: Converting stream of int's to char's in java
If you know that your integer is a single-digit number, then you can use Character's forDigit method to do the conversion from int to char:
Character.forDigit(9, 10) --> '9'
From the Official docs,
Determines the character representation for a specific digit in the
specified radix.
Parameters:
digit - the number to convert to a character.
radix - the radix.
Returns:
the char representation of the specified digit in the specified radix.
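In the question's loop that would look like this (a sketch reusing the question's variable names; radix 2 because the values are binary digits):
num3 = num1 ^ num2;
ch[i] = Character.forDigit(num3, 2); // 0 -> '0', 1 -> '1'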

Converting a lowercase char in a char array to an uppercase char (java)

Hello, I am trying to write a little segment of code that checks whether each char in a char array is lowercase or uppercase. Right now it uses the char's ASCII number to check. After checking, it should convert the char to uppercase if it is not already:
for (int counter = 0; counter < charmessage.length; counter++) {
    if (91 - charmessage[counter] <= 0 && 160 - charmessage[counter] != 0) {
        charmessage[counter] = charmessage[counter].toUpperCase();
    }
}
charmessage is already initialized earlier in the program. The 160 part is to make sure it doesn't convert a space to uppercase. How do I get the .toUpperCase method to work?
I would do it this way: first check if the character is a letter and if it is lowercase. After this, just use Character.toUpperCase(char ch):
if (Character.isLetter(charmessage[counter]) && Character.isLowerCase(charmessage[counter])) {
    charmessage[counter] = Character.toUpperCase(charmessage[counter]);
}
You can use the Character#toUpperCase for that. Example:
char a = 'a';
char upperCase = Character.toUpperCase(a);
It has some limitations, though. It's very important to know that the world has many more characters than can fit within the 16-bit range.
String s = "stackoverflow";
int stringLength = s.length();
char arr[] = s.toCharArray();
for (int i = stringLength - 1; i >= 0; i--) {
    char a = arr[i];
    char upperCase = Character.toUpperCase(a);
    System.out.print(upperCase);
}

Representing letters as numbers in java

I was wondering how you would represent letters as integers in Java. I am working on a problem where I have to find the middle letter of a two-letter word. For example, I would take the word 'go', give each letter an assigned integer value, and use those to find the midpoint letter. Can anyone help me out with this, or just point me in the right direction for how to get the midpoint letter of a two-letter word?
That is simple
int a = 'a';
int c = 'c';
char mid = (char) ((a + c) / 2);
System.out.println(mid);
prints
b
(int) str.charAt(i) will get you an integer value (the ASCII value). For "regular" letters, this should allow you to do what you want.
String str = "GO";
char[] midLetter = Character.toChars(((int) str.charAt(0) + (int) str.charAt(1)) / 2);
I think I got the brackets to match...
If by "letters" you're referring to the char primitive type, then they are already represented by integers behind the scenes. From the Java Tutorials:
The char data type is a single 16-bit Unicode character. It has a
minimum value of '\u0000' (or 0) and a maximum value of '\uffff' (or
65,535 inclusive).
So you can assign a char literal to an int variable for example:
int g = 'g';
int o = 'o';
That should be enough to get you started.
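Putting that together for the example word 'go' (a minimal sketch):
int g = 'g';                     // 103
int o = 'o';                     // 111
char mid = (char) ((g + o) / 2); // 107
System.out.println(mid);         // prints k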
In Java (and most other languages) characters are actually represented as numbers. Google for 'ASCII table' and you'll find out that lowercase a is actually 97.
Assuming you want to index lowercase a as 0, then given an arbitrary character from a string, you can subtract the 'a' character from it and you will get the index:
String str = ...;
for (int i = 0; i < str.length(); i++) {
    char c = str.charAt(i);
    int cIndex = c - 'a';
    // do something with cIndex...
}
If the given letter is a char, then convert it with a cast, (int) yourword, and then find the midpoint.
Character.getNumericValue returns a numeric value for each character, so each letter has a numeric value via the Character class.
This could be the starting point for your ordering, although you might need to consider case, etc.
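For example (these values follow the documented behavior of getNumericValue for Latin letters):
System.out.println(Character.getNumericValue('a')); // prints 10
System.out.println(Character.getNumericValue('A')); // prints 10 -- case is ignored
System.out.println(Character.getNumericValue('b')); // prints 11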
You should be able to go back and forth between int and char and perform arithmetic on your char values:
char median = 0;
for (char ch = 65; ch < 91; ch++) {
    System.out.println(ch);
    median += ch;
}
median = (char) (median / 26);
System.out.println("=====");
System.out.println("Median letter in the alphabet: " + median);
