Setting each letter of the alphabet to a number - Java

So I am trying to set each letter of the alphabet to a number, like a = 1,
b = 2, c = 3, and so on.
int char = "a";
int[] num = new int{26};
for (int i = 0; i <num.length; i++){
System.out.print(i);
But after this I got stuck, so please help me out if possible. When the user inputs a word like cat, it should output 3-1-20.

You can subtract 'a' from each char and add 1. E.g.
String input = "cat";
for (char c : input.toCharArray()) {
System.out.print(c - 'a' + 1);
}
The code you posted doesn't compile, as you can't assign a String to an int, and char is a reserved word (the name of a primitive type):
int char = "a";
You also mention that you want the output formatted like this: "3-1-20". This is one way to achieve that:
String input = "cat";
String[] out = new String[input.length()];
for (int i = 0; i < input.length(); ++i) {
out[i] = Integer.toString(input.charAt(i) - 'a' + 1);
}
System.out.println(String.join("-", out));
Both versions work only for lowercase English letters (a to z).
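If you also need to accept uppercase input, one option (my addition, not part of the answer above) is to normalize each character with Character.toLowerCase first:

```java
String input = "Cat";
String[] out = new String[input.length()];
for (int i = 0; i < input.length(); ++i) {
    // toLowerCase makes 'C' and 'c' both map to 3
    out[i] = Integer.toString(Character.toLowerCase(input.charAt(i)) - 'a' + 1);
}
System.out.println(String.join("-", out)); // 3-1-20
```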

Assigning a number to a character is called an "encoding". As computers can only handle numbers internally, this happens all the time. Even this text here is encoded (probably into an encoding called "UTF-8") and then the resulting number is stored somewhere.
One very basic encoding is the so-called ASCII (American Standard Code for Information Interchange). ASCII already does what you want: it assigns a number to each character, only that the number for "A" is 65 instead of 1.
So, how does that help? We can assume that for the characters A-Z and a-z, the numeric value of a char is equal to the ASCII code (it's not true for every character, but for the most basic ones, it's good enough).
And this is why everyone here tells you to subtract 'A' or 'a': Your character is a char, which is a character, but also the numeric value of that character, so you can subtract 'A' (again, a char) and add 1:
'B' - 'A' + 1 = 2
because...
66 (numeric value of 'B') - 65 (numeric value of 'A') + 1 = 2
Actually, char is not ASCII but UTF-16, but there it starts to get slightly more complex, so ASCII will suffice for the moment.
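You can check this yourself with a couple of lines (a small sketch illustrating the arithmetic above):

```java
char b = 'B';
System.out.println((int) b);      // 66, the numeric value of 'B'
System.out.println(b - 'A' + 1);  // 2, its position in the alphabet
```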

One way of doing this is to convert the String to a byte[]. First, get the characters, like this:
char[] buffer = str.toCharArray();
Then each of the characters can be converted to their byte-value (which are constants for a certain encoding), like this:
byte[] b = new byte[buffer.length];
for (int i = 0; i < b.length; i++) {
b[i] = (byte) buffer[i];
}
Now look at the resulting values and subtract/add some value to get the desired results!
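Putting those steps together for the original "cat" example, a minimal sketch might look like this (the dash formatting is my addition):

```java
String str = "cat";
char[] buffer = str.toCharArray();
byte[] b = new byte[buffer.length];
for (int i = 0; i < b.length; i++) {
    b[i] = (byte) buffer[i]; // narrow each char to its byte value
}
StringBuilder sb = new StringBuilder();
for (int i = 0; i < b.length; i++) {
    if (i > 0) sb.append('-');
    sb.append(b[i] - 'a' + 1); // 'a' -> 1, 'b' -> 2, ...
}
System.out.println(sb); // 3-1-20
```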

How to convert Ascii to Unicode in java?

I have a string in half-width characters and I want to convert it to full-width. I tried to use this code:
final String x = "01589846";
String b = "";
System.out.print("01589846");
int y = 0;
final char[] list = x.toCharArray();
for (int i = 0; i < list.length; i++) {
y = Integer.parseInt(String.valueOf(list[i]));
final char unicode = (char) (y + 65296);
b += unicode;
}
System.out.println(b);
}
It actually works, but only with numbers.
Does anyone have another way to do this? Please help me!
Java Strings are Unicode. They don't need converting. Java does not natively use ASCII.
You apparently wish to map one set of Unicode characters to another. The appropriate tool for that would be a Map, but you'll have to populate the Map with your desired conversion taken from the Unicode code charts.
There may be some algorithmic way to do this for particular subranges; you seem to have discovered a way that works for (western) digits. Note that the fullwidth digits occupy codepoints 0xFF10 to 0xFF19, so the conversion formula is digit - '0' + 0xff10. 0xFF10 is 65296 decimal, but the hex is clearer, since it's what is used in published code charts.
Actually, it looks to me that the same thing works for all characters in the range SPACE to '~', presumably by design. Thus
for (int i=0; i<list.length; i++)
list[i] += 0xff00 - ' ';
Here, I simply assume without checking that list will only contain characters in the range of SPACE to '~', i.e., the Unicode range that corresponds to graphic (printable) ASCII characters. Dealing with other characters, for example Katakana, is more involved.
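For example, with a hypothetical input that stays inside that range (note that a plain space would map to the unassigned U+FF00 rather than the ideographic space, so real code should special-case it):

```java
char[] list = "Abc123".toCharArray();
for (int i = 0; i < list.length; i++) {
    list[i] += 0xFF00 - ' '; // 0xFEE0, the offset from ASCII to the fullwidth forms block
}
System.out.println(new String(list)); // Ａｂｃ１２３
```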
final String x = "012345 abcdef ABCDEF";
System.out.println(x);
String b = "";
final char[] list = x.toCharArray();
for (int i = 0; i < list.length; i++) {
if(Character.isDigit(list[i])) {
b += (char)(list[i] - 48 + 0xFF10);
} else if(Character.isUpperCase(list[i])) {
b += (char)(list[i] - 65 + 0xFF21);
} else if(Character.isLowerCase(list[i])) {
b += (char)(list[i] - 97 + 0xFF41);
} else if(Character.isWhitespace(list[i])) {
b += list[i];
} else {
b += (char)(list[i] - 33 + 0xFF01);
}
}
System.out.println(b);
Output:
012345 abcdef ABCDEF
０１２３４５ ａｂｃｄｅｆ ＡＢＣＤＥＦ

Creating Text out of a String of 1s and 0s

I'm currently working on a QR code scanner and have come to a point where I've been stuck for a while.
What I have so far is a String of 1s and 0s such as "100010100101....". What I want to do next is turn this String into bytes by splitting off 8 bits at a time.
I then want to decode these bytes into text with the "ISO8859_1" standard.
My problem is the following: my results are way off from what I want. This is my code:
for(int i = 0; i <= numberOfInt; i++){
String character = "";
for(int j = 0;j < 8; j++){
boolean bool = tResult.remove(0); //tResult is a List of 1s & 0s
if(bool){
character = character + '1';
}else{
character = character + '0';
}
}
allcharacter[byteCounter] = (byte)Integer.parseInt(character,2);//I think this Line is where the mistake is.
byteCounter++; //Variable that counts where to put the next bit
}
String endresult ="";
try {
endresult = new String(allcharacter,"ISO8859_1");
} catch (UnsupportedEncodingException e) {
e.printStackTrace();
}
return endresult;
What I think is that the cast to (byte) doesn't work the way I understand it, and therefore different bytes are saved into the array.
Thanks for any help.
You can use the substring method of the String class to get the first 8 characters and then convert those 8 characters (treat them as bits) to a character (which is also 8 bits). Instead of parsing each character as an integer, and then casting it to a byte, you should check each character and multiply a byte value by 2 for each time you hit a 1. That way, you will get a value between 0-255 for each byte, which should give you a valid character.
Also you might want to check the Byte class and its methods, it probably has a method that already does this.
Edit: There you go.
Edit 2: Also this question may answer why your int to byte casting does not give you the result you thought it would.
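The multiply-by-two idea from the first paragraph could be sketched like this (my illustration, not code from the question):

```java
String bits = "01100001"; // 8 bits for one character
int value = 0;
for (int i = 0; i < bits.length(); i++) {
    value = value * 2 + (bits.charAt(i) - '0'); // shift left, then add the current bit
}
System.out.println((char) value); // a
```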
Okay, I rarely work with bytes, so in that aspect I am useless. However, I have converted binary to a string many times. The logic behind it is converting the binary string to a decimal int, then from int to char, then from char to String. Here is how I do that:
String list = "100111000110000111010011"; //24 random binary digits for example
String output = "";
char letter;
int ascii = 0;
//Repeat while there is still something to read.
for(int i = 0; i < list.length(); i+=8){
String temp = list.substring(i,i+8); //1 character in binary.
for(int j = 0; j < temp.length(); j++) //Convert binary to decimal
if(temp.charAt(j) == '1')
ascii += (int)Math.pow(2, temp.length()-1-j); //leftmost bit is most significant
letter = (char)ascii; //Sets the char letter to its corresponding character value
output = output + Character.toString(letter); //Adds the letter to the string
ascii = 0; //resets ascii
}
System.out.println(output); //outputs the converted string
I hope that you found this helpful!

How do I shift the value of my char array in java?

What I'm trying to do is create an encryption method that shifts a char array, taken from an input file, by a determined number of letters. I'm having trouble figuring out how to change the chars into ints and back.
This is what I've got so far:
char [] sChar = new char[line.length()];
for(int i = 0; i < sChar.length; i++){
String s = reader.next();
sChar = s.toCharArray();
if(Character.isLetter(sChar[i])) {
char c = 'a';
int b = c;
sChar[i] += key;
Not sure what you are thinking.
I thought conversion from int to character and back would be easy.
Just to freshen up the idea, I checked:
char xdrf = 'a';
System.out.println((int)xdrf); // output is 97
int idrf= 99;
xdrf = (char)idrf;
System.out.println(xdrf); // output is c
Also, if your key is a character, you can directly sum it, therefore the statement
sChar[i] += key;
should be good.
More to it:
idrf = idrf + 'd';
System.out.println(idrf); //output is 199
further using
System.out.println(Character.getNumericValue(idrf-20)); //output is 3
This all works by ASCII value; I am unsure whether that is what you want to use.
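A minimal sketch of the shift itself, assuming lowercase letters and wrap-around past 'z' (the question specifies neither, so treat this as one possible reading):

```java
String line = "cat";
int key = 3;
char[] sChar = line.toCharArray();
for (int i = 0; i < sChar.length; i++) {
    if (Character.isLetter(sChar[i])) {
        // shift within a..z, wrapping around past 'z'
        sChar[i] = (char) ('a' + (sChar[i] - 'a' + key) % 26);
    }
}
System.out.println(new String(sChar)); // fdw
```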

Converting a lowercase char in a char array to an uppercase char (java)

Hello, I am trying to write a little segment of code that checks whether each char in a char array is lowercase or uppercase. Right now it uses the char's ASCII number to check. After it checks, it should convert the char to uppercase if it is not already, like so:
for (int counter = 0; counter < charmessage.length; counter++) {
if (91 - charmessage[counter] <= 0 && 160 - charmessage[counter] != 0) {
charmessage[counter] = charmessage[counter].toUpperCase();
}
}
charmessage is already initialized previously in the program. The 160 part is to make sure it doesn't convert a space to uppercase. How do I get the .toUpperCase method to work?
I would do it this way: first check if the character is a letter and if it is lowercase. After this, just use Character.toUpperCase(char ch):
if(Character.isLetter(charmessage[counter]) && Character.isLowerCase(charmessage[counter])){
charmessage[counter] = Character.toUpperCase(charmessage[counter]);
}
You can use the Character#toUpperCase for that. Example:
char a = 'a';
char upperCase = Character.toUpperCase(a);
It has some limitations, though. It's very important to know that there are many more characters in the world than can fit within the 16-bit char range.
String s = "stackoverflow";
int stringLength = s.length();
char arr[] = s.toCharArray();
for (int i = 0; i < stringLength; i++) {
char a = arr[i];
char upperCase = Character.toUpperCase(a);
System.out.print(upperCase);
}

Representing letters as numbers in java

I was wondering how you would represent letters as integers in Java. I am working on a problem where I have to find the middle letter of a two-letter word. For example, I would take the word 'go' and assign each letter an integer value in order to find the midpoint letter. Can anyone help me out with this, or just point me in the right direction?
That is simple:
int a = 'a';
int c = 'c';
char mid = (char) ((a + c) / 2);
System.out.println(mid);
prints
b
(int)str.charAt(i) will get you an integer value (the ASCII value). For "regular" letters, this should allow you to do what you want.
str = "GO";
midLetter = Character.toChars(((int)str.charAt(0) + (int)str.charAt(1))/2);
I think I got the brackets to match...
If by "letters" you're referring to the char primitive type, then they are already represented by integers behind the scenes. From the Java Tutorials:
The char data type is a single 16-bit Unicode character. It has a
minimum value of '\u0000' (or 0) and a maximum value of '\uffff' (or
65,535 inclusive).
So you can assign a char literal to an int variable for example:
int g = 'g';
int o = 'o';
That should be enough to get you started.
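Continuing that idea, the midpoint for "go" can be computed directly (a small sketch):

```java
int g = 'g'; // 103
int o = 'o'; // 111
char mid = (char) ((g + o) / 2); // 107
System.out.println(mid); // k
```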
In Java (and most other languages) characters are actually represented as numbers. Google for 'ASCII table', and you'll find out lowercase a is actually 97.
Assuming you want to index lowercase a as 0, then given an arbitrary character from a string, you can subtract the 'a' character from it, and you will get the index:
String str = ...;
for(int i=0; i<str.length(); i++) {
char c = str.charAt(i);
int cIndex = c - 'a';
// do something with cIndex...
}
If the given letter is a char, then cast it with (int) yourLetter. Then find the midpoint.
Character.getNumericValue returns a numeric value for each character, so each letter can be mapped to a number via the Character class.
This could be the starting point for your ordering, although you might need to consider case, etc.
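Note that these values are not the ASCII codes, and upper and lower case share the same value, which is exactly why case needs consideration:

```java
System.out.println(Character.getNumericValue('a')); // 10
System.out.println(Character.getNumericValue('A')); // 10, same as lowercase
System.out.println(Character.getNumericValue('z')); // 35
```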
You should be able to go back and forth from int to char and perform arithmetic on your char values:
char median = 0;
for(char ch=65; ch<91; ch++) {
System.out.println(ch);
median += ch;
}
median = (char)(median/26);
System.out.println("=====");
System.out.println("Median letter in the alphabet: "+median);
