replace does not replace digits - java

I want to replace in a string every '0' with an 'F', every '1' with an 'E', and so on.
E.g. "234567890ABCDEF" should result in "DCBA9876F543210"
final char[] items = {'0', '1', '2', '3', '4', '5', '6', '7',
                      '8', '9', 'A', 'B', 'C', 'D', 'E', 'F'};
for (int i = 0; i < 16; i++) {
    newString = oldString.replace(items[i], items[15 - i]);
}
Unfortunately, this piece of code does not work. It replaces all letters but not the digits. Any suggestions why? I'm really at a loss...

Your problem is that you replace the digits with letters for i = 0 to 7 and then back again for i = 8 to 15.

If you add debug output to your code and look at the iterations, you'll notice how you overwrite the results of the first iterations with the replace() calls of the later iterations:
234567890ABCDEF
23456789FABCDEF
23456789FABCDEF
D3456789FABCDEF
DC456789FABCDEF
DCB56789FABCDEF
DCBA6789FABCDEF
DCBA9789FABCDEF
DCBA9889FABCDEF
DCBA9779FABCDEF
DCBA6776FABCDEF
DCB56776F5BCDEF
DC456776F54CDEF
D3456776F543DEF
23456776F5432EF
23456776F54321F
234567760543210

This is because the later replacements invert the results of the first eight replacements! This means
0-7 are converted back to 0-7, but 8 and 9 will be converted to their counterparts!
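One way to avoid this is to map each character exactly once in a single pass instead of chaining replace() calls; a minimal sketch (the method name is mine):

```java
// Translate each character exactly once, so earlier mappings
// can never be clobbered by later ones.
static String swapHexDigits(String s) {
    final String items = "0123456789ABCDEF";
    StringBuilder sb = new StringBuilder(s.length());
    for (char c : s.toCharArray()) {
        int pos = items.indexOf(c);               // -1 if c is not a hex digit
        sb.append(pos >= 0 ? items.charAt(15 - pos) : c);
    }
    return sb.toString();
}
```

With this, "234567890ABCDEF" maps to "DCBA9876F543210" (note the 'F' that the '0' turns into).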


How to use BigInteger as an index of an array [duplicate]

I am trying to convert a number with more than 10 digits from base 10 to base 2, but when it reaches 11 digits I get an error. Please help me fix it. Here is my code:
public static final char[] hexDigits = {
    '0', '1', '2', '3', '4', '5', '6', '7',
    '8', '9', 'A', 'B', 'C', 'D', 'E', 'F'
};

public static String convertDecimalToBinary(String decimal) {
    String binary = "";
    BigInteger deci = new BigInteger(decimal);
    BigInteger base = new BigInteger("2");
    if (deci.equals(0)) {
        return deci.toString();
    }
    while (deci.intValue() != 0) {
        binary = hexDigits[deci.intValue() % 2] + binary;
        deci = deci.divide(base);
    }
    return binary;
}
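The exception comes from deci.intValue(): for values beyond the int range it keeps only the low 32 bits, which can be negative, so deci.intValue() % 2 can be -1 and hexDigits[-1] throws ArrayIndexOutOfBoundsException. (Also, deci.equals(0) is always false, because a BigInteger never equals an Integer.) A sketch of one possible fix that stays in BigInteger arithmetic throughout (assuming non-negative input):

```java
import java.math.BigInteger;

static String convertDecimalToBinary(String decimal) {
    BigInteger deci = new BigInteger(decimal);
    if (deci.signum() == 0) {
        return "0";
    }
    BigInteger two = BigInteger.valueOf(2);
    StringBuilder binary = new StringBuilder();
    while (deci.signum() != 0) {
        // mod(two) is exactly 0 or 1 -- no truncation, unlike intValue()
        binary.insert(0, deci.mod(two));
        deci = deci.divide(two);
    }
    return binary.toString();
}
```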

How can i get a column from 2d array in java [duplicate]

char[][] isbns = {
{'9', '6', '0', '-', '4', '2', '5', '-', '0', '5', '9', '-', '0'},
{'8', '0', '-', '9', '0', '2', '7', '4', '4', '-', '1', '-', '6'},
{'0', '-', '8', '0', '4', '4', '-', '2', '9', '5', '8', '-', 'X'},
{'0', '-', '9', '4', '3', '3', '9', '6', '-', '0', '4', '-', '2'},
{'0', '-', '9', '7', '5', '2', '2', '9', '8', '-', '0', '-', '5'},
{'9', '9', '7', '1', '-', '5', '-', '0', '2', 'l', '0', '-', '0'},
{'9', '3', '-', '8', '6', '5', '4', '-', '-', '2', '1', '-', '4'},
{'9', '9', '9', '2', '1', '-', '5', '8', '8', '-', '1', '0', '7'}
};
The array is as shown above, and my method is as follows:
public static Object[] getColumn(char[][] arr, int index) {
    Object[] column = new Object[arr[0].length];
    for (int i = 0; i < column.length; i++) {
        column[i] = arr[i][index];
    }
    return column;
}
However, as there are 8 rows and 13 columns, I get an ArrayIndexOutOfBoundsException.
How can I solve this issue? Thanks.
A column has the same length as the number of rows, which is the number of elements in arr.
So change
Object[] column = new Object[arr[0].length];
to
Object[] column = new Object[arr.length];
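With that change the method allocates one slot per row; a quick sketch of it in use (with a smaller, made-up array for brevity):

```java
static Object[] getColumn(char[][] arr, int index) {
    Object[] column = new Object[arr.length];   // one entry per row
    for (int i = 0; i < column.length; i++) {
        column[i] = arr[i][index];
    }
    return column;
}
```

For a 2x3 array { {'a','b','c'}, {'d','e','f'} }, getColumn(arr, 2) yields ['c', 'f'] -- length 2, the number of rows.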

Java's BigInteger C# in Base 32

How would one convert the following line of Java to C#. It generates a random BigInteger of 130 bits in size, converts it to a string in base 32 (i.e. not decimal) and then manipulates the string:
new BigInteger(130, new SecureRandom()).toString(32).replace("/", "w").toUpperCase(Locale.US);
How can I achieve that in C#?
Generate a random 130 bit BigInteger
convert it to a string in base 32
As far as the random BigInteger I have this function:
static BigInteger RandomInteger(int bits)
{
    RNGCryptoServiceProvider secureRandom = new RNGCryptoServiceProvider();
    // make sure there is extra room for a 0-byte so our number isn't negative
    // in the case that the msb is set
    var bytes = new byte[bits / 8 + 1];
    secureRandom.GetBytes(bytes);
    // mask off excess bits
    bytes[bytes.Length - 1] &= (byte)((1 << (bits % 8)) - 1);
    return new BigInteger(bytes);
}
taken from this question which does not address the base 32 conversion: Equivalent of Java's BigInteger in C#
However, I'm not sure whether that function is correct either.
The C# code I have so far, RandomInteger being the function described above:
RandomInteger(130).ToString().Replace("/","w").ToUpper(CultureInfo.GetCultureInfo("en-US"));
The above code has quite a few bugs: if bits is a whole multiple of 8, the mask (1 << (bits % 8)) - 1 is zero and the last byte gets masked out entirely, and there's a chance that the number comes out negative, because the new BigInteger(byte[]) overload expects a little-endian signed number, so you have to append a 0 byte as the most significant byte:
static BigInteger RandomInteger(int bits)
{
    var bytes = new byte[(bits + 7) / 8 + 1];
    using (var rng = new RNGCryptoServiceProvider())
        rng.GetBytes(bytes, 0, bytes.Length - 1);
    var remainingBits = bits % 8;
    if (remainingBits > 0)
        bytes[bytes.Length - 2] &= (byte)((1 << remainingBits) - 1);
    return new BigInteger(bytes);
}
This would work, I suppose.
Base 32 string
This is how I would convert to base 32. Note that I cannot test this here and that my C# is a little rusty, but I think the following should be good enough to get you going (if someone sees a syntax error, please edit it out):
static string BigIntegerToString32(BigInteger bi)
{
    // Obvious shortcut -- avoids problems later on.
    if (bi == BigInteger.Zero)
        return "0";
    // Note: "readonly" is not valid on a local variable, so a plain array is used here.
    char[] digits = new char[] {
        '0', '1', '2', '3', '4', '5', '6', '7',
        '8', '9', 'A', 'B', 'C', 'D', 'E', 'F',
        'G', 'H', 'I', 'J', 'K', 'L', 'M', 'N',
        'O', 'P', 'Q', 'R', 'S', 'T', 'U', 'V'
    };
    // Only work with a positive BigInteger.
    BigInteger value = BigInteger.Abs(bi);
    // Collects one digit on each iteration.
    StringBuilder result = new StringBuilder();
    // This value is needed more often -- only converted once.
    BigInteger thirtyOne = 0x1F;
    while (value > BigInteger.Zero)
    {
        // get next digit: value % 32
        result.Insert(0, digits[(int)(value & thirtyOne)]);
        // shift out last digit: value = value / 32
        value >>= 5;
    }
    // prepend '-' if negative
    if (bi < BigInteger.Zero)
        result.Insert(0, '-');
    return result.ToString();
}
Note that for huge BigIntegers it might make sense to use a faster but more complicated algorithm (as I do in my Delphi BigInteger implementation). This is probably more or less how C#'s BigInteger does it for base 10 as well (unlike Java's, it does not use more sophisticated routines for large values, AFAIK).
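For comparison, the same mask-and-shift loop in Java agrees with BigInteger.toString(32) (a sketch; Java's built-in base-32 alphabet is 0-9 followed by lower-case a-v, so the digit table here is lower case):

```java
import java.math.BigInteger;

static String toBase32(BigInteger bi) {
    final String digits = "0123456789abcdefghijklmnopqrstuv";
    if (bi.signum() == 0) return "0";
    BigInteger value = bi.abs();
    StringBuilder result = new StringBuilder();
    while (value.signum() > 0) {
        result.insert(0, digits.charAt(value.intValue() & 0x1F)); // value % 32
        value = value.shiftRight(5);                              // value / 32
    }
    if (bi.signum() < 0) result.insert(0, '-');
    return result.toString();
}
```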
Random 130 bit BigInteger
The answer by @hl3mukkel does a much better job of generating an n-bit random BigInteger than the code you found and posted, so use his code to generate such a BigInteger.

hard time understanding this code for a numerology program

private int sumCharValues (String input) {
    String total = input.toLowerCase();
    int result = 0;
    for (int i = 0, n = total.length(); i < n; i++) {
        char c = total.charAt(i);
        result += (c - 'a' + 1);
    }
    return result;
}
I'm trying to understand this code. What does result += (c - 'a' + 1) mean?
Any help would be very much appreciated.
It gives the letter's numerical position in the alphabet.
String total = input.toLowerCase(); means all the letters are lower case. Then, c - 'a' subtracts the ASCII value of 'a' from the ASCII value of c. That means you'll get anything from 0-25. Adding one shifts your starting point, giving you 1-26.
Try it in your head or add a print statement in your code for this expression.
When the character in the string is 'a', the expression reads 'a' - 'a' + 1 so you can see that the result will be 1.
When you're performing arithmetic on characters, you're actually doing it on their Unicode values.
For non-accented alphabetic characters, that is the same as the ASCII value.
For 'a', this value is 97, for 'b', it's 98, etc.
So the expression above returns the index of the character in the alphabet, starting with 1 for 'a'.
The a += b operator is (more or less—see below) a shortcut for a = a + (b). The expression (c - 'a' + 1) converts the character stored in c to an integer in such a way that 'a' will have the value 1, 'b' will have the value 2, etc., based on the Unicode code points of the characters in input. When the loop exits, result will be the sum of all the numerical equivalents of the characters in the input.
The compound assignment a += b isn't exactly a shortcut for a = a + (b) in a couple of ways. First, the left side is evaluated only once, so something like vec[i++] += 3 will increment i only once. Second, there is an implicit cast (if necessary) to the type of a. Thus,
byte a = 0;
a += 1; // works
a = a + 1; // compiler error -- a + 1 is an int value
The statement result += (c - 'a' + 1) is evaluated as follows:
First, char c is converted to its ASCII value; then we take the difference between the ASCII values of c and 'a'; then we add 1 to it.
This statement result += (c - 'a' + 1) can be rewritten as result = result + (c - 'a' + 1), i.e. we are just adding to the previous value of result and assigning it again.
PS: The ASCII value of 'a' is 97 and 'z' is 122.
For example, if the input String is "stackoverflow", it will be evaluated as:
char c = 's', result = 19
char c = 't', result = 39
char c = 'a', result = 40
char c = 'c', result = 43
char c = 'k', result = 54
char c = 'o', result = 69
char c = 'v', result = 91
char c = 'e', result = 96
char c = 'r', result = 114
char c = 'f', result = 120
char c = 'l', result = 132
char c = 'o', result = 147
char c = 'w', result = 170
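The method from the question can be exercised directly to confirm that trace (reproduced here verbatim, only made static):

```java
static int sumCharValues(String input) {
    String total = input.toLowerCase();
    int result = 0;
    for (int i = 0, n = total.length(); i < n; i++) {
        char c = total.charAt(i);
        result += (c - 'a' + 1); // 'a' -> 1, 'b' -> 2, ..., 'z' -> 26
    }
    return result;
}
```

sumCharValues("stackoverflow") returns 170, matching the last line of the trace.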

Java - Change int to ascii

Is there a way for Java to convert ints to ASCII symbols?
Do you want to convert ints to chars?:
int yourInt = 33;
char ch = (char) yourInt;
System.out.println(yourInt);
System.out.println(ch);
// Output:
// 33
// !
Or do you want to convert ints to Strings?
int yourInt = 33;
String str = String.valueOf(yourInt);
Or what is it that you mean?
If you first convert the int to a char, you will have your ASCII code.
For example:
int iAsciiValue = 9; // Currently just the number 9, but we want Tab character
// Put the tab character into a string
String strAsciiTab = Character.toString((char) iAsciiValue);
There are many ways to convert an int to ASCII (depending on your needs) but here is a way to convert each integer byte to an ASCII character:
private static String toASCII(int value) {
    int length = 4;
    StringBuilder builder = new StringBuilder(length);
    for (int i = length - 1; i >= 0; i--) {
        builder.append((char) ((value >> (8 * i)) & 0xFF));
    }
    return builder.toString();
}
For example, the ASCII text for "TEST" can be represented as the byte array:
byte[] test = new byte[] { (byte) 0x54, (byte) 0x45, (byte) 0x53, (byte) 0x54 };
Then you could do the following:
int value = ByteBuffer.wrap(test).getInt(); // 1413829460
System.out.println(toASCII(value)); // outputs "TEST"
...so this essentially converts the 4 bytes in a 32-bit integer to 4 separate ASCII characters (one character per byte).
You can convert a number to ASCII in Java. For example, converting the digit 1 (base 10) to ASCII:
char k = Character.forDigit(1, 10);
System.out.println("Character: " + k);
System.out.println("Character: " + ((int) k));
Output:
Character: 1
Character: 49
In fact, in the last answer
String strAsciiTab = Character.toString((char) iAsciiValue);
the essential part is (char) iAsciiValue, which is doing the job (the Character.toString is unnecessary).
Meaning the first answer was actually correct:
char ch = (char) yourInt;
If yourInt = 49 (or 0x31), ch will be '1'.
In Java, you really want to use Integer.toString to convert an integer to its corresponding String value. If you are dealing with just the digits 0-9, then you could use something like this:
private static final char[] DIGITS =
        {'0', '1', '2', '3', '4', '5', '6', '7', '8', '9'};

private static char getDigit(int digitValue) {
    assertInRange(digitValue, 0, 9);
    return DIGITS[digitValue];
}
Or, equivalently:
private static int ASCII_ZERO = 0x30;

private static char getDigit(int digitValue) {
    assertInRange(digitValue, 0, 9);
    return ((char) (digitValue + ASCII_ZERO));
}
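Both variants produce identical characters for 0-9; a quick sketch (with a minimal stand-in for assertInRange, a helper the answer assumes exists):

```java
private static final char[] DIGITS =
        {'0', '1', '2', '3', '4', '5', '6', '7', '8', '9'};

// Hypothetical helper -- the answer above assumes one like it.
static void assertInRange(int value, int min, int max) {
    if (value < min || value > max)
        throw new IllegalArgumentException("out of range: " + value);
}

static char getDigitByTable(int d)  { assertInRange(d, 0, 9); return DIGITS[d]; }
static char getDigitByOffset(int d) { assertInRange(d, 0, 9); return (char) (d + 0x30); }
```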
The simplest way is to use a type cast:
public char toChar(int c) {
    return (char) c;
}
tl;dr
Use Character#toString, not char.
String result = Character.toString( yourAsciiNumber ) ;
Ex:
Character.toString( 97 ) // LATIN SMALL LETTER A
a
Character.toString( 128_567 ) // FACE WITH MEDICAL MASK
😷
char is legacy
The char type in Java is legacy, and is essentially broken. As a 16-bit value, char is incapable of representing most characters defined by Unicode.
This succeeds:
System.out.println( Character.toString( 128_567 )); // Unicode code points handle full-range of Unicode characters.
😷
This fails:
System.out.println( ( char ) 128_567 ); // `char` fails with most Unicode characters.
See code run live at IdeOne.com.
Code point
Use code point integer numbers to represent individual letters.
US-ASCII is a subset of Unicode. So, any US-ASCII number (0-127) is also a Unicode code point (0-1,114,111).
To change a code point number to a String object containing a single character, call Character#toString.
String x = Character.toString( 97 ) ;
a
See this code run live at IdeOne.com.
The simplest way is to take the integer and just use the casting operator:
int num = 33;
System.out.println((char) num); // Outputs !
// if you want to find the integer value of a character instead,
// just do the reverse
char ch = '%';
System.out.println((int) ch); // Outputs 37
