How can I get a column from a 2D array in Java [duplicate]

This question already has answers here:
get columns from two dimensional array in java
(7 answers)
Closed 1 year ago.
char[][] isbns = {
    {'9', '6', '0', '-', '4', '2', '5', '-', '0', '5', '9', '-', '0'},
    {'8', '0', '-', '9', '0', '2', '7', '4', '4', '-', '1', '-', '6'},
    {'0', '-', '8', '0', '4', '4', '-', '2', '9', '5', '8', '-', 'X'},
    {'0', '-', '9', '4', '3', '3', '9', '6', '-', '0', '4', '-', '2'},
    {'0', '-', '9', '7', '5', '2', '2', '9', '8', '-', '0', '-', '5'},
    {'9', '9', '7', '1', '-', '5', '-', '0', '2', 'l', '0', '-', '0'},
    {'9', '3', '-', '8', '6', '5', '4', '-', '-', '2', '1', '-', '4'},
    {'9', '9', '9', '2', '1', '-', '5', '8', '8', '-', '1', '0', '7'}
};
The array is as shown above, and my method is as follows:
public static Object[] getColumn(char[][] arr, int index) {
    Object[] column = new Object[arr[0].length];
    for (int i = 0; i < column.length; i++) {
        column[i] = arr[i][index];
    }
    return column;
}
However, as there are 8 rows and 13 columns, I get an ArrayIndexOutOfBoundsException. How can I solve this issue? Thanks.

A column has the same length as the number of rows, which is the number of elements in arr.
So change
Object[] column = new Object[arr[0].length];
to
Object[] column = new Object[arr.length];
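With that fix applied, here is a minimal runnable sketch (the small sample array and the class name are just for illustration):

```java
import java.util.Arrays;

public class ColumnDemo {
    // Corrected version: the column length equals the number of rows, arr.length.
    public static Object[] getColumn(char[][] arr, int index) {
        Object[] column = new Object[arr.length];
        for (int i = 0; i < column.length; i++) {
            column[i] = arr[i][index];
        }
        return column;
    }

    public static void main(String[] args) {
        char[][] sample = {
            {'9', '6', '0'},
            {'8', '0', '-'},
            {'0', '-', '8'}
        };
        // Column 0 of this 3x3 sample is {'9', '8', '0'}.
        System.out.println(Arrays.toString(getColumn(sample, 0)));
    }
}
```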

Related

How to use BigInteger as an index of an array [duplicate]

This question already has answers here:
What causes a java.lang.ArrayIndexOutOfBoundsException and how do I prevent it?
(26 answers)
BigDecimal.intValue() returns Unexpected Number
(5 answers)
Closed 5 days ago.
This post was edited and submitted for review 5 days ago and failed to reopen the post:
Duplicate: This question has been answered, is not unique, and doesn't differentiate itself from another question.
I am trying to convert a number with more than 10 digits from base 10 to base 2, but when it reaches 11 digits I get an error. Please help me fix it. Here is my code:
public static final char[] hexDigits = {
    '0', '1', '2', '3', '4', '5', '6', '7',
    '8', '9', 'A', 'B', 'C', 'D', 'E', 'F'
};

public static String convertDecimalToBinary(String decimal) {
    String binary = "";
    BigInteger deci = new BigInteger(decimal);
    BigInteger base = new BigInteger("2");
    if (deci.equals(0)) {
        return deci.toString();
    }
    while (deci.intValue() != 0) {
        binary = hexDigits[deci.intValue() % 2] + binary;
        deci = deci.divide(base);
    }
    return binary;
}
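A likely cause, given the linked duplicates: deci.equals(0) compares a BigInteger with an Integer and is therefore always false, and deci.intValue() truncates any value outside the int range, so deci.intValue() % 2 can come out negative and index past the bounds of hexDigits. Here is a sketch that stays in BigInteger arithmetic throughout (assuming non-negative input):

```java
import java.math.BigInteger;

public class DecimalToBinary {
    public static String convertDecimalToBinary(String decimal) {
        BigInteger deci = new BigInteger(decimal);
        BigInteger two = BigInteger.valueOf(2);
        // Test for zero via signum() instead of equals(0),
        // which compares against an Integer and is always false.
        if (deci.signum() == 0) {
            return "0";
        }
        StringBuilder binary = new StringBuilder();
        while (deci.signum() != 0) {
            // testBit(0) reads the lowest bit without truncating to int.
            binary.insert(0, deci.testBit(0) ? '1' : '0');
            deci = deci.divide(two);
        }
        return binary.toString();
    }
}
```

For this particular conversion, BigInteger also has it built in: new BigInteger(decimal).toString(2).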

Java's BigInteger C# in Base 32

How would one convert the following line of Java to C#. It generates a random BigInteger of 130 bits in size, converts it to a string in base 32 (i.e. not decimal) and then manipulates the string:
new BigInteger(130, new SecureRandom()).toString(32).replace("/", "w").toUpperCase(Locale.US);
How can I achieve that in C#?
Generate a random 130 bit BigInteger
convert it to a string in base 32
As far as the random BigInteger I have this function:
static BigInteger RandomInteger(int bits)
{
    RNGCryptoServiceProvider secureRandom = new RNGCryptoServiceProvider();
    // make sure there is extra room for a 0-byte so our number isn't negative
    // in the case that the msb is set
    var bytes = new byte[bits / 8 + 1];
    secureRandom.GetBytes(bytes);
    // mask off excess bits
    bytes[bytes.Length - 1] &= (byte)((1 << (bits % 8)) - 1);
    return new BigInteger(bytes);
}
taken from this question which does not address the base 32 conversion: Equivalent of Java's BigInteger in C#
However, I'm not sure whether that function is correct either.
The C# code I have so far, RandomInteger being the function described above:
RandomInteger(130).ToString().Replace("/","w").ToUpper(CultureInfo.GetCultureInfo("en-US"));
The above code has quite a few bugs: if bits is a whole multiple of 8, the last byte gets masked out entirely, and there's a chance that the number comes out negative, because the new BigInteger(byte[]) overload expects a little-endian signed number, so you have to append a 0 byte at the end:
static BigInteger RandomInteger(int bits)
{
    var bytes = new byte[(bits + 7) / 8 + 1];
    using (var rng = new RNGCryptoServiceProvider())
        rng.GetBytes(bytes, 0, bytes.Length - 1);
    var remainingBits = bits % 8;
    if (remainingBits > 0)
        bytes[bytes.Length - 2] &= (byte)((1 << remainingBits) - 1);
    return new BigInteger(bytes);
}
This would work I suppose
Base 32 string
This is how I would convert to base 32. Note that I cannot test this here and that my C# is a little rusty, but I think the following should be good enough to get you going (if someone sees a syntax error, please edit it out):
static string BigIntegerToString32(BigInteger bi)
{
    // Obvious shortcut -- avoids problems later on.
    if (bi == BigInteger.Zero)
        return "0";
    // (Locals cannot be readonly in C#, so this is a plain array.)
    char[] digits = {
        '0', '1', '2', '3', '4', '5', '6', '7',
        '8', '9', 'A', 'B', 'C', 'D', 'E', 'F',
        'G', 'H', 'I', 'J', 'K', 'L', 'M', 'N',
        'O', 'P', 'Q', 'R', 'S', 'T', 'U', 'V'
    };
    // Only work with a positive BigInteger.
    BigInteger value = BigInteger.Abs(bi);
    // Collects one digit on each iteration.
    StringBuilder result = new StringBuilder();
    // This value is needed on every iteration -- only converted once.
    BigInteger thirtyOne = 0x1F;
    while (value > BigInteger.Zero)
    {
        // get next digit: value % 32
        result.Insert(0, digits[(int)(value & thirtyOne)]);
        // shift out last digit: value = value / 32
        value >>= 5;
    }
    // prepend '-' if negative
    if (bi < BigInteger.Zero)
        result.Insert(0, '-');
    return result.ToString();
}
Note that for huge BigIntegers, it might make sense to use a faster but more complicated algorithm (as I do in my Delphi BigInteger implementation). That said, this is probably more or less how C#'s BigInteger (which, AFAIK, unlike Java's, does not use more sophisticated routines for large values) does it for base 10 as well.
Random 130 bit BigInteger
The answer by @hl3mukkel does a much better job of generating an n-bit random BigInteger than the code you found and posted, so use his code to generate such a BigInteger.

Javascript: Windows 7 taskbar time (almost done), but it is still in 24-hour format

I have already created a Windows 7 taskbar clock, which is almost done, but it is still in 24-hour format.
JAVASCRIPT :-
<script>
function date_time(id) {
    date = new Date;
    year = date.getFullYear();
    month = date.getMonth();
    months = new Array('1', '2', '3', '4', '5', '6', '7', '8', '9', '10', '11', '12');
    d = date.getDate();
    day = date.getDay();
    days = new Array('Sunday', 'Monday', 'Tuesday', 'Wednesday', 'Thursday', 'Friday', 'Saturday');
    h = date.getHours();
    if (h < 10) {
        h = "" + h;
    }
    m = date.getMinutes();
    if (m < 10) {
        m = "0" + m;
    }
    a = date.getHours() < 12 ? 'AM' : 'PM';
    result = '' + h + ':' + m + ' ' + a + '<br/>' + months[month] + '/' + d + '/' + year;
    document.getElementById(id).innerHTML = result;
    setTimeout('date_time("' + id + '");', '1');
    return true;
}
</script>
HTML :-
<span class="right" id="date_time"></span>
<script type="text/javascript">window.onload = date_time('date_time');</script>
I just want to change the hour format to 12-hour instead of 24-hour.
You are not converting the hours to the 12-hour format.
Add the line
h = h % 12 || 12
after the line where you set a. (A plain h % 12 would display 0 at noon and midnight; the || 12 maps that 0 back to 12.)
And remove
if (h < 10) {
    h = "" + h;
}
which does nothing useful anyway: "" + h only turns the number into a string, without padding it.

Hard time understanding this code for a numerology program

private int sumCharValues(String input) {
    String total = input.toLowerCase();
    int result = 0;
    for (int i = 0, n = total.length(); i < n; i++) {
        char c = total.charAt(i);
        result += (c - 'a' + 1);
    }
    return result;
}
I'm trying to understand this code. What does result += (c - 'a' + 1) mean?
Any help would be very much appreciated.
It gives the letter's numerical position in the alphabet.
String total = input.toLowerCase(); means all the letters are lower case. Then c - 'a' subtracts the ASCII value of 'a' from the ASCII value of c. That means you'll get anything from 0 to 25. Adding one shifts your starting point, giving you 1 to 26.
Try it in your head or add a print statement in your code for this expression.
When the character in the string is 'a', the expression reads 'a' - 'a' + 1 so you can see that the result will be 1.
When you're performing arithmetic on characters, you're actually doing it on their Unicode values.
For non-accented alphabetic characters, those are the same as the ASCII values.
For 'a', this value is 97; for 'b', it's 98; and so on.
So the expression above returns the index of the character in the alphabet, starting with 1 for 'a'.
The a += b operator is (more or less—see below) a shortcut for a = a + (b). The expression (c - 'a' + 1) converts the character stored in c to an integer in such a way that 'a' will have the value 1, 'b' will have the value 2, etc., based on the Unicode code points of the characters in input. When the loop exits, result will be the sum of all the numerical equivalents of the characters in the input.
The compound assignment a += b isn't exactly a shortcut for a = a + (b) in a couple of ways. First, the left side is evaluated only once, so something like vec[i++] += 3 will increment i only once. Second, there is an implicit cast (if necessary) to the type of a. Thus,
byte a = 0;
a += 1; // works
a = a + 1; // compiler error -- a + 1 is an int value
The statement result += (c - 'a' + 1) is evaluated as follows:
first, char c is converted to its ASCII value; then we take the difference between the ASCII values of c and 'a'; then we add 1 to it.
The statement can be rewritten as result = result + (c - 'a' + 1), i.e. we add the new term to the previous value of result and assign it back.
PS: The ASCII value of 'a' is 97, and of 'z' it is 122.
For example, if the input String is "stackoverflow", it will be evaluated as:
char c = 's', result = 19
char c = 't', result = 39
char c = 'a', result = 40
char c = 'c', result = 43
char c = 'k', result = 54
char c = 'o', result = 69
char c = 'v', result = 91
char c = 'e', result = 96
char c = 'r', result = 114
char c = 'f', result = 120
char c = 'l', result = 132
char c = 'o', result = 147
char c = 'w', result = 170
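To check the trace above end-to-end, the method from the question can be wrapped in a runnable class as-is (class name made up for the demo):

```java
public class Numerology {
    // Same method as in the question, with a comment on the key expression.
    static int sumCharValues(String input) {
        String total = input.toLowerCase();
        int result = 0;
        for (int i = 0, n = total.length(); i < n; i++) {
            char c = total.charAt(i);
            // c - 'a' + 1 maps 'a'..'z' to 1..26 via their Unicode code points.
            result += (c - 'a' + 1);
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(sumCharValues("stackoverflow")); // prints 170
    }
}
```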

replace does not replace digits

I want to replace, in a string, every '0' with an 'F', every '1' with an 'E', and so on.
e.g. "234567890ABCDEF" should result in "DCBA9876543210"
final char[] items = {'0', '1', '2', '3', '4', '5', '6', '7', '8', '9', 'A', 'B', 'C', 'D', 'E', 'F'};
for (int i = 0; i < 16; i++) {
    newString = oldString.replace(items[i], items[15 - i]);
}
Unfortunately, this piece of code does not work: it replaces all letters but not the digits. Any suggestions why? I'm really at a loss...
Your problem is that you replace the digits with letters for i = 0 to 7 and then replace them back for i = 8 to 15.
If you add debug output to your code and look at the iterations, you'll notice how the replace() calls of the later iterations overwrite the results of the first ones:
234567890ABCDEF
23456789FABCDEF
23456789FABCDEF
D3456789FABCDEF
DC456789FABCDEF
DCB56789FABCDEF
DCBA6789FABCDEF
DCBA9789FABCDEF
DCBA9889FABCDEF
DCBA9779FABCDEF
DCBA6776FABCDEF
DCB56776F5BCDEF
DC456776F54CDEF
D3456776F543DEF
23456776F5432EF
23456776F54321F
234567760543210
This is because your second eight replacements invert the result of the first eight! This means 0-7 are converted back to 0-7, while 8 and 9 do end up as their counterparts!
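One way to avoid the overwriting is to build the result character by character instead of chaining replace() calls, so every input character is translated exactly once (a sketch; class and method names are made up):

```java
public class HexMirror {
    static String mirrorHex(String input) {
        final String items = "0123456789ABCDEF";
        StringBuilder sb = new StringBuilder(input.length());
        for (char c : input.toCharArray()) {
            int idx = items.indexOf(c);
            // Translate each character exactly once; unknown characters pass through.
            sb.append(idx >= 0 ? items.charAt(15 - idx) : c);
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(mirrorHex("234567890ABCDEF")); // prints DCBA9876F543210
    }
}
```

(Note the 'F' in the middle of the output: under the stated rule the '0' in the input maps to 'F', so the full mirror of the 15-character input also has 15 characters.)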
