What I'm trying to do is create an encryption method that shifts each letter of a char array, read from an input file, by a fixed number of positions. I'm having trouble figuring out how to convert the chars to ints and back.
This is what I've got so far:
char[] sChar = new char[line.length()];
for (int i = 0; i < sChar.length; i++) {
    String s = reader.next();
    sChar = s.toCharArray();
    if (Character.isLetter(sChar[i])) {
        char c = 'a';
        int b = c;
        sChar[i] += key;
    }
}
I'm not sure where you're stuck; converting from int to char and back is straightforward. Just to refresh the idea, I checked:
char xdrf = 'a';
System.out.println((int)xdrf); // output is 97
int idrf= 99;
xdrf = (char)idrf;
System.out.println(xdrf); // output is c
Also, if your key is a char you can add it directly, so the statement
sChar[i] += key;
should be fine.
Going further:
idrf = idrf + 'd';
System.out.println(idrf); //output is 199
further using
System.out.println(Character.getNumericValue(idrf-20)); //output is 3
All of this works on ASCII values; I am unsure whether that is what you intended. (Note that 199 - 20 = 179 is the code for '³', whose numeric value is 3, which is why the last line prints 3.)
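Putting those pieces together, here is a minimal sketch of what the shifting could look like. The class and method names (ShiftDemo, shift) are just placeholders I made up, and it assumes lowercase input with wraparound past 'z':

```java
public class ShiftDemo {
    // Shifts only lowercase letters, wrapping around past 'z'.
    static char[] shift(char[] chars, int key) {
        for (int i = 0; i < chars.length; i++) {
            if (chars[i] >= 'a' && chars[i] <= 'z') {
                // Map to 0-25, add the key, wrap with modulo, map back.
                chars[i] = (char) ('a' + (chars[i] - 'a' + key) % 26);
            }
        }
        return chars;
    }

    public static void main(String[] args) {
        System.out.println(new String(shift("xyz abc".toCharArray(), 3))); // prints abc def
    }
}
```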
I have a string in halfwidth characters and I want to convert it to fullwidth. I tried to use this code:
final String x = "01589846";
String b = "";
System.out.print("01589846");
int y = 0;
final char[] list = x.toCharArray();
for (int i = 0; i < list.length; i++) {
    y = Integer.parseInt(String.valueOf(list[i]));
    final char unicode = (char) (y + 65296);
    b += unicode;
}
System.out.println(b);
It actually works, but only with numbers. Does anyone have another way to do this? Please help me!
Java Strings are Unicode. They don't need converting. Java does not natively use ASCII.
You apparently wish to map one set of Unicode characters to another. The appropriate tool for that would be a Map, but you'll have to populate the Map with your desired conversion taken from the Unicode code charts.
There may be some algorithmic way to do this for particular subranges; you seem to have discovered a way that works for (western) digits. Note that the fullwidth digits occupy codepoints 0xFF10 to 0xFF19, so the conversion formula is digit - '0' + 0xff10. 0xFF10 is 65296 decimal, but the hex is clearer, since it's what is used in published code charts.
Actually, it looks to me that the same offset works for all characters in the range '!' to '~', presumably by design: their fullwidth forms occupy U+FF01 through U+FF5E in the same order. Thus
for (int i = 0; i < list.length; i++)
    list[i] += 0xff00 - ' ';
Here, I simply assume without checking that list will only contain characters in the range '!' to '~', i.e., the Unicode range that corresponds to graphic (printable) ASCII characters. Note that SPACE is an exception: its fullwidth counterpart is the ideographic space U+3000, not U+FF00. Dealing with other characters, for example Katakana, is more involved.
final String x = "012345 abcdef ABCDEF";
System.out.println(x);
String b = "";
final char[] list = x.toCharArray();
for (int i = 0; i < list.length; i++) {
    if (Character.isDigit(list[i])) {
        b += (char) (list[i] - '0' + 0xFF10);
    } else if (Character.isUpperCase(list[i])) {
        b += (char) (list[i] - 'A' + 0xFF21);
    } else if (Character.isLowerCase(list[i])) {
        b += (char) (list[i] - 'a' + 0xFF41);
    } else if (Character.isWhitespace(list[i])) {
        b += list[i];
    } else {
        b += (char) (list[i] - '!' + 0xFF01);
    }
}
System.out.println(b);
Output:
012345 abcdef ABCDEF
０１２３４５ ａｂｃｄｅｆ ＡＢＣＤＥＦ
I'm trying to use a for loop to enter characters a-z into a string array, but I'm not having much luck converting characters to string values so they'll actually go into the string array. I keep getting null values as my output. Could anyone provide some tips on how to get characters into a string array?
This is what I have so far:
String[] letters = new String[26];
for (char ch = 'a'; ch <= 'z'; ch++)
{
int i = 0;
letters[i] = String.valueOf(ch);
i++;
}
System.out.println(Arrays.toString(letters));
String[] letters = new String[26];
int i = 0;
for (char ch = 'a'; ch <= 'z'; ch++)
{
letters[i] = String.valueOf(ch);
i++;
}
System.out.println(Arrays.toString(letters));
Try this. i=0 should be outside the loop.
Move int i = 0; outside the loop as Eran said, or don't use a separate counter at all and derive the index from the character's ordinal value:
String[] letters = new String[26];
for (char ch = 'a'; ch <= 'z'; ch++) {
int index = (int) ch - 97;
letters[index] = String.valueOf(ch);
}
System.out.println(Arrays.toString(letters));
Additionally, you don't have to use another local variable and could just do letters[(int) ch - 97] = ... of course.
You're inserting into letters[0] every time; declare the variable i outside the loop.
String[] letters = new String[26];
int i = 0;
for (char ch = 'a'; ch <= 'z'; ch++)
{
letters[i++] = String.valueOf(ch);
}
System.out.println(Arrays.toString(letters));
for (char ch = 'a'; ch <= 'z'; ch++){
int i = 0;
letters[i] = String.valueOf(ch);
i++;
}
You need to understand what this does: at EACH iteration you're re-initializing the variable i to 0, so you will never write anywhere other than letters[0].
You only need to set it to 0 once and then increment it, so just put that statement before the loop.
Another easy way would simply be:
char[] letters = "abcdefghijklmnopqrstuvwxyz".toCharArray();
For your particular purpose, with Java 8 Streams, you don't even need a loop.
String[] letters = IntStream.rangeClosed('a', 'z').mapToObj(i -> Character.toString((char)i)).toArray(String[]::new);
System.out.println(Arrays.toString(letters));
To break it down:
IntStream.rangeClosed(int, int) makes a Stream of ints from the first int to the second, inclusive of both endpoints. We use this because there is no CharStream class (for some reason), but we can still use chars 'a' and 'z', which will be implicitly converted to their int value.
mapToObj takes a function which will convert each int of the Stream into an object. It gets a little messy here, as there is no single step conversion from int to String, we first need the int interpreted as a character value. So, we cast each int (named i) to a char, and then wrap that in a conversion from char to String: i -> Character.toString((char)i). This will leave us with a Stream<String>.
Now, we want the output to be String[], as per your question. Stream has a toArray method, but the no-argument version would give us an unhelpful Object[]. Instead, we supply the function to be used to build the array. We don't want anything fancy, so we'll just pass the array constructor reference for String: toArray(String[]::new).
After that, letters will be equal to an array of Strings, and each one will successively be a letter from a to z.
If you don't have access to Java 8 or simply don't like the above solution, here's a simplified version of your above code that removes the need for the index:
String[] letters = new String[26];
for (char c = 'a'; c <= 'z'; c++) letters[c - 'a'] = Character.toString(c);
System.out.println(Arrays.toString(letters));
In Java, chars can be treated as ints because below the surface, they are both stored as numbers.
I'm currently working on a QR code scanner and have come to a point where I've been stuck for a while.
What I have so far is a String of 1s and 0s such as "100010100101....". What I want to do next is turn this String into bytes by grouping 8 bits at a time.
With these bytes I then want to decode the text using the "ISO8859_1" charset.
My problem is the following: my results are way off from what I want. This is my code:
for (int i = 0; i <= numberOfInt; i++) {
    String character = "";
    for (int j = 0; j < 8; j++) {
        boolean bool = tResult.remove(0); // tResult is a List of 1s & 0s
        if (bool) {
            character = character + '1';
        } else {
            character = character + '0';
        }
    }
    allcharacter[byteCounter] = (byte) Integer.parseInt(character, 2); // I think this line is where the mistake is.
    byteCounter++; // Variable that counts where to put the next byte
}
String endresult = "";
try {
    endresult = new String(allcharacter, "ISO8859_1");
} catch (UnsupportedEncodingException e) {
    e.printStackTrace();
}
return endresult;
What I think is that the cast to (byte) doesn't work the way I understand it, and therefore different bytes are saved into the array.
Thanks for any help.
You can use the substring method of the String class to get the first 8 characters, and then convert those 8 characters (treating them as bits) to a byte (which is also 8 bits). Instead of parsing each group as an integer and then casting it to a byte, you can check each character and multiply a running value by 2, adding 1 each time you hit a '1'. That way you get a value between 0 and 255 for each byte, which should give you a valid character.
Also you might want to check the Byte class and its methods, it probably has a method that already does this.
Edit: There you go.
Edit 2: Also this question may answer why your int to byte casting does not give you the result you thought it would.
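For completeness, here is one sketch of the multiply-by-2 approach described above. BitsToText and toBytes are made-up names, and it assumes the input length is a multiple of 8, most significant bit first:

```java
import java.nio.charset.StandardCharsets;

public class BitsToText {
    // Converts a string of '0'/'1' characters (length a multiple of 8)
    // into bytes, most significant bit first.
    static byte[] toBytes(String bits) {
        byte[] out = new byte[bits.length() / 8];
        for (int i = 0; i < out.length; i++) {
            int value = 0;
            for (int j = 0; j < 8; j++) {
                value = value * 2;                    // shift left one bit
                if (bits.charAt(i * 8 + j) == '1') {
                    value += 1;
                }
            }
            out[i] = (byte) value;                    // 0-255 fits after the cast
        }
        return out;
    }

    public static void main(String[] args) {
        // "01001000" is 72 ('H') and "01101001" is 105 ('i') in ISO-8859-1.
        byte[] b = toBytes("0100100001101001");
        System.out.println(new String(b, StandardCharsets.ISO_8859_1)); // prints Hi
    }
}
```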
Okay, I rarely work with bytes, so in that aspect I am useless. However, I have converted binary to string many times. The logic behind it is converting the binary string to a decimal int, then from int to char, then from char to string. Here is how I do that.
String list = "100111000110000111010011"; // 24 random binary digits for example
String output = "";
char letter;
int ascii = 0;
// Repeat while there is still something to read.
for (int i = 0; i < list.length(); i += 8) {
    String temp = list.substring(i, i + 8); // 1 character in binary.
    for (int j = temp.length() - 1; j >= 0; j--) { // Convert binary to decimal
        if (temp.charAt(j) == '1') {
            ascii += (int) Math.pow(2, temp.length() - 1 - j); // leftmost bit is most significant
        }
    }
    letter = (char) ascii; // Sets the char letter to its corresponding ascii value
    output = output + Character.toString(letter); // Adds the letter to the string
    ascii = 0; // resets ascii
}
System.out.println(output); // outputs the converted string
I hope that you found this helpful!
So I am trying to map each letter of the alphabet to a number, like a = 1,
b = 2, c = 3, and so on.
int char = "a";
int[] num = new int{26};
for (int i = 0; i <num.length; i++){
System.out.print(i);
But after this I got stuck, so please help me out if possible. When the user inputs a word like cat, it should output 3-1-20.
You can subtract 'a' from each char and add 1. E.g.
String input = "cat";
for (char c : input.toCharArray()) {
System.out.print(c - 'a' + 1);
}
The code you posted doesn't compile as you can't assign a String to an int and char is a reserved word (name of a primitive type)
int char = "a";
You also mention that you want the output formatted like this "3-1-20". This is one way to achieve that :
String input = "cat";
String[] out = new String[input.length()];
for (int i = 0; i < input.length(); ++i) {
out[i] = Integer.toString(input.charAt(i) - 'a' + 1);
}
System.out.println(String.join("-", out));
Both versions work only for lowercase English letters (a to z).
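If the input may also contain uppercase letters, one possible variant (a sketch, assuming 'A' and 'a' should both map to 1) is to lowercase each character first:

```java
String input = "Cat";
StringBuilder sb = new StringBuilder();
for (char c : input.toCharArray()) {
    if (sb.length() > 0) sb.append('-');
    // Lowercase first so 'C' and 'c' both become 3.
    sb.append(Character.toLowerCase(c) - 'a' + 1);
}
System.out.println(sb); // prints 3-1-20
```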
Assigning a number to a character is called an "encoding". As computers can only handle numbers internally, this happens all the time. Even this text here is encoded (probably into an encoding called "UTF-8") and then the resulting number is stored somewhere.
One very basic encoding is the so called ASCII (American Standard Code for Information Interchange). ASCII already does what you want, it assigns a number to each character, only that the number for "A" is 65 instead 1.
So, how does that help? We can assume that for the characters A-Z and a-z, the numeric value of a char is equal to the ASCII code (it's not true for every character, but for the most basic ones it's good enough).
And this is why everyone here tells you to subtract 'A' or 'a': Your character is a char, which is a character, but also the numeric value of that character, so you can subtract 'A' (again, a char) and add 1:
'B' - 'A' + 1 = 2
because...
66 (numeric value of 'B') - 65 (numeric value of 'A') + 1 = 2
Actually, a Java char is not ASCII but a UTF-16 code unit; there it starts to get slightly more complex, so ASCII will suffice for the moment.
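As a quick sanity check, the arithmetic above can be run directly:

```java
char input = 'B';
// 'B' is 66 and 'A' is 65, so this yields 66 - 65 + 1 = 2.
int position = input - 'A' + 1;
System.out.println(position); // prints 2
```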
One way of doing this is to first convert the String to a char[], like this:
char[] buffer = str.toCharArray();
Then each of the characters can be converted to their byte-value (which are constants for a certain encoding), like this:
byte[] b = new byte[buffer.length];
for (int i = 0; i < b.length; i++) {
b[i] = (byte) buffer[i];
}
Now look at the resulting values and subtract/add some value to get the desired results!
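For example, since 'a' is 97, subtracting 96 maps a to 1, b to 2, and so on. This is just a sketch of the "subtract some value" step; the choice of 96 assumes lowercase input:

```java
import java.util.Arrays;

String str = "cat";
char[] buffer = str.toCharArray();
byte[] b = new byte[buffer.length];
for (int i = 0; i < b.length; i++) {
    b[i] = (byte) (buffer[i] - 96); // 'a' is 97, so 'a' -> 1, 'b' -> 2, ...
}
System.out.println(Arrays.toString(b)); // prints [3, 1, 20]
```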
Hello, I am trying to write a little segment of code that checks whether each char in a char array is lowercase or uppercase. Right now it uses the char's ASCII value for the check. After checking, it should convert the char to uppercase if it is not already:
for (int counter = 0; counter < charmessage.length; counter++) {
if (91 - charmessage[counter] <= 0 && 160 - charmessage[counter] != 0) {
charmessage[counter] = charmessage[counter].toUpperCase();
}
}
charmessage is already initialized previously in the program. The 160 part is to make sure it doesn't convert a space to uppercase. How do I get the .toUpperCase method to work?
I would do it this way: first check whether the character is a letter and whether it is lowercase. After this, just use Character.toUpperCase(char ch):
if(Character.isLetter(charmessage[counter]) && Character.isLowerCase(charmessage[counter])){
charmessage[counter] = Character.toUpperCase(charmessage[counter]);
}
You can use the Character#toUpperCase for that. Example:
char a = 'a';
char upperCase = Character.toUpperCase(a);
It has some limitations, though. It's very important to know that the world uses many more characters than can fit within the 16-bit range.
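One such limitation is easy to demonstrate: Character.toUpperCase maps one char to exactly one result, so it cannot perform one-to-many case mappings like German 'ß' to "SS", while String.toUpperCase can:

```java
import java.util.Locale;

public class CaseDemo {
    public static void main(String[] args) {
        // 'ß' has no single-char uppercase form, so
        // Character.toUpperCase returns it unchanged.
        System.out.println(Character.toUpperCase('ß')); // prints ß
        // String.toUpperCase applies the full (one-to-many) mapping.
        System.out.println("straße".toUpperCase(Locale.GERMAN)); // prints STRASSE
    }
}
```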
String s = "stackoverflow";
int stringLength = s.length();
char[] arr = s.toCharArray();
for (int i = stringLength - 1; i >= 0; i--) {
    char a = arr[i];
    char upperCase = Character.toUpperCase(a);
    System.out.print(upperCase);
}