How can I reduce the complexity of this code?
This code returns true if there are two elements in the array whose sum equals a number K.
public static boolean methode(int c, int[] t) {
    for(int i = 0; i < t.length; i++)
        for(int j = 0; j < t.length; j++)
            if(j != i && t[i] + t[j] == c)
                return true;
    return false;
}
As one option, you can use a Set to store previously seen numbers. It reduces the time complexity from O(n*n) to O(n), but at the same time it increases the space complexity from O(1) to O(n).
public static boolean verification(int k, int[] tab) {
    Set<Integer> unique = new HashSet<>();
    for(int i = 0; i < tab.length; i++) {
        if(unique.contains(k - tab[i]))
            return true;
        unique.add(tab[i]);
    }
    return false;
}
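For example (the test values here are hypothetical, chosen just for illustration), the set-based version answers the query in a single pass:
int[] numbers = {2, 7, 11, 15};                 // hypothetical test data
System.out.println(verification(9, numbers));   // true, because 2 + 7 == 9
System.out.println(verification(6, numbers));   // false, no pair sums to 6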
If you want to check the sum of two elements then you can use the following code. It's a bit simpler:
public static boolean verification(int k, int[] tab) {
    for(int i = 0; i < tab.length - 1; i++)
        for(int j = i + 1; j < tab.length; j++)
            if(tab[i] + tab[j] == k)
                return true;
    return false;
}
Related
I am working on implementing the longest palindromic substring problem and I followed the DP approach with O(N^2) extra space (yes, I know there is an even more efficient algorithm, but I am not interested in that in this post).
My implementation, which basically uses the recurrence
P(i, j) = P(i + 1, j - 1) && (s[i] == s[j])
builds the relevant table, but the run time is much slower than expected.
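(As a small worked instance of the recurrence: for s = "aba", P(1, 1) is true because a single character is a palindrome, and s[0] == s[2], so P(0, 2) is true.)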
It does give the correct output if I run it in my IDE, after several seconds (15+), but it is rejected by any online judge as too slow. I am not sure where the issue is, since I am using memoization, so there is no recomputation of the same cases.
The strings that start to show that the algorithm has a performance issue are over 900 characters long.
Update
I am updating the question to add the full source code and a test case.
Dynamic Programming approach O(N^2) time and O(N^2) space (not accepted and too slow)
public static String longestPalindromeDP(String s) {
    Map<List<Integer>, Boolean> cache = new HashMap<>();
    for(int i = 0; i < s.length(); i++) {
        for(int j = 0; j < s.length(); j++) {
            populateTable(s, i, j, cache);
        }
    }
    int start = 0;
    int end = 0;
    for(int i = 0; i < s.length(); i++) {
        for(int j = 0; j < s.length(); j++) {
            if(cache.get(Arrays.asList(i, j))) {
                if(Math.abs(start - end) < Math.abs(i - j)) {
                    start = i;
                    end = j;
                }
            }
        }
    }
    return s.substring(start, end + 1);
}
private static boolean populateTable(String s, int i, int j, Map<List<Integer>, Boolean> cache) {
    if(i == j) {
        cache.put(Arrays.asList(i, j), true);
        return true;
    }
    if(Math.abs(i - j) == 1) {
        cache.put(Arrays.asList(i, j), s.charAt(i) == s.charAt(j));
        return s.charAt(i) == s.charAt(j);
    }
    if(cache.containsKey(Arrays.asList(i, j))) {
        return cache.get(Arrays.asList(i, j));
    }
    boolean res = populateTable(s, i + 1, j - 1, cache) && s.charAt(i) == s.charAt(j);
    cache.put(Arrays.asList(i, j), res);
    cache.put(Arrays.asList(j, i), res);
    return res;
}
The populateTable step is very slow, but once it finishes the result is correct.
Brute force O(N^3) time and O(1) space: much faster and accepted
public static String longestPalindromeBruteForce(String s) {
    if(s.length() == 1) {
        return s;
    }
    String result = "";
    for(int i = 0; i < s.length(); i++) {
        for(int j = i + 1; j <= s.length(); j++) {
            String tmp = s.substring(i, j);
            if(isPalindrome(tmp)) {
                if(tmp.length() > result.length()) {
                    result = tmp;
                    if(result.length() == s.length()) {
                        return result;
                    }
                }
            }
        }
    }
    return result;
}
private static boolean isPalindrome(String s) {
    for(int i = 0, j = s.length() - 1; i < j; i++, j--) {
        if(s.charAt(i) != s.charAt(j)) {
            return false;
        }
    }
    return true;
}
Testing and input:
public static void main(String[] args) {
final String string1 = "civilwartestingwhetherthatnaptionoranynartionsoconceivedandsodedicatedcanlongendureWeareqmetonagreatbattlefiemldoftzhatwarWehavecometodedicpateaportionofthatfieldasafinalrestingplaceforthosewhoheregavetheirlivesthatthatnationmightliveItisaltogetherfangandproperthatweshoulddothisButinalargersensewecannotdedicatewecannotconsecratewecannothallowthisgroundThebravelmenlivinganddeadwhostruggledherehaveconsecrateditfaraboveourpoorponwertoaddordetractTgheworldadswfilllittlenotlenorlongrememberwhatwesayherebutitcanneverforgetwhattheydidhereItisforusthelivingrathertobededicatedheretotheulnfinishedworkwhichtheywhofoughtherehavethusfarsonoblyadvancedItisratherforustobeherededicatedtothegreattdafskremainingbeforeusthatfromthesehonoreddeadwetakeincreaseddevotiontothatcauseforwhichtheygavethelastpfullmeasureofdevotionthatweherehighlyresolvethatthesedeadshallnothavediedinvainthatthisnationunsderGodshallhaveanewbirthoffreedomandthatgovernmentofthepeoplebythepeopleforthepeopleshallnotperishfromtheearth";
//final String string2 = "ibvjkmpyzsifuxcabqqpahjdeuzaybqsrsmbfplxycsafogotliyvhxjtkrbzqxlyfwujzhkdafhebvsdhkkdbhlhmaoxmbkqiwiusngkbdhlvxdyvnjrzvxmukvdfobzlmvnbnilnsyrgoygfdzjlymhprcpxsnxpcafctikxxybcusgjwmfklkffehbvlhvxfiddznwumxosomfbgxoruoqrhezgsgidgcfzbtdftjxeahriirqgxbhicoxavquhbkaomrroghdnfkknyigsluqebaqrtcwgmlnvmxoagisdmsokeznjsnwpxygjjptvyjjkbmkxvlivinmpnpxgmmorkasebngirckqcawgevljplkkgextudqaodwqmfljljhrujoerycoojwwgtklypicgkyaboqjfivbeqdlonxeidgxsyzugkntoevwfuxovazcyayvwbcqswzhytlmtmrtwpikgacnpkbwgfmpavzyjoxughwhvlsxsgttbcyrlkaarngeoaldsdtjncivhcfsaohmdhgbwkuemcembmlwbwquxfaiukoqvzmgoeppieztdacvwngbkcxknbytvztodbfnjhbtwpjlzuajnlzfmmujhcggpdcwdquutdiubgcvnxvgspmfumeqrofewynizvynavjzkbpkuxxvkjujectdyfwygnfsukvzflcuxxzvxzravzznpxttduajhbsyiywpqunnarabcroljwcbdydagachbobkcvudkoddldaucwruobfylfhyvjuynjrosxczgjwudpxaqwnboxgxybnngxxhibesiaxkicinikzzmonftqkcudlzfzutplbycejmkpxcygsafzkgudy";
long startTime = System.nanoTime();
String palindromic = longestPalindromeDP(string1);
long elapsed = TimeUnit.SECONDS.convert(System.nanoTime() - startTime, TimeUnit.NANOSECONDS);
System.out.println(elapsed);
System.out.println(palindromic);
}
The brute force version finishes in 0 seconds.
The dynamic programming version takes up to 9 seconds (depending on the machine).
What is the problem here?
I understand that there can be some optimization to improve the performance, but how is it possible that the O(N^3) version outperforms the O(N^2) one, given that I use memoization?
Update
Update based on the answer of @CahidEnesKeleş.
I replaced the List<Integer> key with a custom object:
class IdxPair {
    int i;
    int j;

    IdxPair(int i, int j) {
        this.i = i;
        this.j = j;
    }

    @Override
    public boolean equals(Object o) {
        if(o == null || !(o instanceof IdxPair)) return false;
        if(this == o) return true;
        IdxPair other = (IdxPair) o;
        return this.i == other.i && this.j == other.j;
    }

    @Override
    public int hashCode() {
        int h = 31;
        h = 31 * i + 37;
        h = (37 * h) + j;
        return h;
    }
}
Although a couple of test cases that previously failed now pass, it is still too slow overall and is rejected by the online judges.
I tried using C-style arrays instead of a HashMap; here is the code:
public static String longestPalindromeDP(String s) {
    int[][] cache = new int[s.length()][s.length()];
    for (int i = 0; i < s.length(); i++) {
        for (int j = 0; j < s.length(); j++) {
            cache[i][j] = -1;
        }
    }
    for(int i = 0; i < s.length(); i++) {
        for(int j = 0; j < s.length(); j++) {
            populateTable(s, i, j, cache);
        }
    }
    int start = 0;
    int end = 0;
    for(int i = 0; i < s.length(); i++) {
        for(int j = 0; j < s.length(); j++) {
            if(cache[i][j] == 1) {
                if(Math.abs(start - end) < Math.abs(i - j)) {
                    start = i;
                    end = j;
                }
            }
        }
    }
    return s.substring(start, end + 1);
}
private static boolean populateTable(String s, int i, int j, int[][] cache) {
    if(i == j) {
        cache[i][j] = 1;
        return true;
    }
    if(Math.abs(i - j) == 1) {
        cache[i][j] = s.charAt(i) == s.charAt(j) ? 1 : 0;
        return s.charAt(i) == s.charAt(j);
    }
    if (cache[i][j] != -1) {
        return cache[i][j] == 1;
    }
    boolean res = populateTable(s, i + 1, j - 1, cache) && s.charAt(i) == s.charAt(j);
    cache[i][j] = res ? 1 : 0;
    cache[j][i] = res ? 1 : 0;
    return res;
}
This code works faster than the brute force approach. On my computer the old DP finishes in ~5000 milliseconds, the new DP finishes in ~30 milliseconds, and the brute force finishes in ~100 milliseconds.
Now that we know the reason for the slowness, I conducted further experiments and measured the running time of the following pieces of code.
Map<List<Integer>, Boolean> cache = new HashMap<>();
for (int i = 0; i < 1000; i++) {
    for (int j = 0; j < 1000; j++) {
        cache.put(Arrays.asList(i, j), true);
    }
}
This code finishes in 2000 milliseconds. I divided the expression further to find exactly where the slowness comes from.
for (int i = 0; i < 1000; i++) {
    for (int j = 0; j < 1000; j++) {
        Arrays.asList(i, j);
    }
}
This code finishes in 37 milliseconds.
Map<Integer, Boolean> cache = new HashMap<>();
for (int i = 0; i < 1000; i++) {
    for (int j = 0; j < 1000; j++) {
        cache.put(i * 1000 + j, true);
    }
}
This code finishes in 97 milliseconds.
Neither Arrays.asList nor Map.put is slow on its own. Maybe the hash function of the list is slow:
for (int i = 0; i < 1000; i++) {
    for (int j = 0; j < 1000; j++) {
        Arrays.asList(i, j).hashCode();
    }
}
This code finishes in 101 milliseconds.
No, this is fast as well. So maybe the hash values collide most of the time. To test this, I put all the hash codes into a set and checked its size:
Set<Integer> hashSet = new HashSet<>();
for (int i = 0; i < 1000; i++) {
    for (int j = 0; j < 1000; j++) {
        hashSet.add(Arrays.asList(i, j).hashCode());
    }
}
System.out.println(hashSet.size());
And it gave 31969. 31969 out of 1,000,000 is about 3.2%. I think this is the source of the slowness: one million entries is too much for a HashMap with so few distinct hash codes. It starts to move away from O(1) as more and more collisions occur.
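This collision count follows directly from how List.hashCode() is specified: for a two-element list of Integers [i, j] it works out to 31 * (31 * 1 + i) + j = 961 + 31*i + j, so for 0 <= i, j < 1000 every hash lies in the range 961..32929, which holds exactly 32929 - 961 + 1 = 31969 values, the same number as the set size above. A quick way to confirm the arithmetic:
// List.hashCode() for [i, j] reduces to 961 + 31*i + j, so the hashes of all
// 1,000,000 index pairs are squeezed into only 31,969 distinct values.
Set<Integer> simulatedHashes = new HashSet<>();
for (int i = 0; i < 1000; i++) {
    for (int j = 0; j < 1000; j++) {
        simulatedHashes.add(961 + 31 * i + j);
    }
}
System.out.println(simulatedHashes.size()); // 31969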
I tried to code the naive solution for finding a sub-image inside an image: match the first pixel and then go deeper.
But I get true when I shouldn't, and I can't find out why.
This is my code (Java):
boolean contains(BufferedImage img, BufferedImage subImg, int[] coordinates) {
    boolean result = false;
    int verticalLimit = img.getWidth() - subImg.getWidth();
    int horizontalLimit = img.getHeight() - subImg.getHeight();
    for (int i = 0; i <= horizontalLimit; i++) {
        for (int j = 0; j <= verticalLimit; j++) {
            if(img.getRGB(j, i) == subImg.getRGB(0, 0)) {
                result = true;
                coordinates[0] = j; // stores the first indices for self use
                coordinates[1] = i;
                for (int k = i; k < subImg.getHeight() && result; k++) {
                    for (int l = j; l < subImg.getWidth() && result; l++) {
                        if(img.getRGB(l, k) != subImg.getRGB(l, k)) {
                            result = false;
                        }
                    }
                }
                if(result) return result;
            }
        }
    }
    return result;
}
Your search for the sub-image is off: you jump way too far into the sub-image by indexing it with k, l. I've changed those loops to start at 0 and used k, l as offsets from i, j. I also use a labeled break to avoid having to hold the "found" state. If all of the pixels match, the inner loop reaches its end and the method returns true; otherwise it breaks and tries again until all possible locations have been tried, returning false if none is found.
static boolean contains(BufferedImage img, BufferedImage subImg, int[] coordinates) {
    int verticalLimit = img.getWidth() - subImg.getWidth();
    int horizontalLimit = img.getHeight() - subImg.getHeight();
    for (int i = 0; i <= horizontalLimit; i++) {
        for (int j = 0; j <= verticalLimit; j++) {
            subSearch:
            for (int k = 0; k < subImg.getHeight(); k++) {
                for (int l = 0; l < subImg.getWidth(); l++) {
                    if (img.getRGB(l + j, k + i) != subImg.getRGB(l, k)) {
                        break subSearch;
                    }
                }
                if (k == subImg.getHeight() - 1) {
                    coordinates[0] = j;
                    coordinates[1] = i;
                    return true;
                }
            }
        }
    }
    return false;
}
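A minimal usage sketch (the file names are hypothetical and error handling is omitted; ImageIO.read throws IOException):
BufferedImage img = ImageIO.read(new File("screenshot.png"));   // hypothetical input files
BufferedImage subImg = ImageIO.read(new File("button.png"));
int[] coordinates = new int[2];
if (contains(img, subImg, coordinates)) {
    // coordinates[0] is the x (column) and coordinates[1] the y (row) of the top-left match
    System.out.println("Found at x=" + coordinates[0] + ", y=" + coordinates[1]);
} else {
    System.out.println("Sub-image not found");
}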
Complete the divisibleSumPairs function in the editor below. It should return the integer count of pairs meeting the criteria.
divisibleSumPairs has the following parameter(s):
n: the integer length of array ar
ar: an array of integers
k: the integer to divide the pair sum by
Print the number of (i, j) pairs where i < j and ar[i] + ar[j] is evenly divisible by k.
I don't know what is wrong; only some cases have worked.
static int divisibleSumPairs(int n, int k, int[] ar) {
    int count = 0;
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            if ((ar[i] < ar[j]) && ((ar[i] + ar[j]) % k) == 0) {
                count++;
            }
        }
    }
    return count;
}
The main problem is that you check for ar[i] < ar[j] while the problem statement says i < j:
static int divisibleSumPairs(int n, int k, int[] ar) {
    int count = 0;
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            if (i < j && (ar[i] + ar[j]) % k == 0) {
                count++;
            }
        }
    }
    return count;
}
The algorithm can be further optimized to:
static int divisibleSumPairs(int n, int k, int[] ar) {
    int count = 0;
    for (int i = 0; i < n; i++) {
        for (int j = i + 1; j < n; j++) {
            if ((ar[i] + ar[j]) % k == 0) {
                count++;
            }
        }
    }
    return count;
}
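Going further (this is not part of the original answer, just a sketch of a common counting trick): ar[i] + ar[j] is divisible by k exactly when the remainders of ar[i] and ar[j] modulo k sum to 0 or to k, so counting elements per remainder gives an O(n + k) solution:
// Hypothetical further optimization: count elements by remainder mod k
// (assumes non-negative values, as in the usual problem constraints).
static int divisibleSumPairsFast(int n, int k, int[] ar) {
    int[] byRem = new int[k];
    for (int x : ar) {
        byRem[x % k]++;
    }
    long count = (long) byRem[0] * (byRem[0] - 1) / 2;       // both elements divisible by k
    for (int r = 1; r <= k / 2; r++) {
        if (r == k - r) {
            count += (long) byRem[r] * (byRem[r] - 1) / 2;   // k even: remainder k/2 pairs with itself
        } else {
            count += (long) byRem[r] * byRem[k - r];         // remainder r pairs with remainder k - r
        }
    }
    return (int) count;
}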
int[] value = new int[5];
boolean result = true;
for(int i = 0; i < 5; i++) {
    value[i] = cards[i].getValue();
}
for(int i = 0; i < 5; i++) {
    for(int j = i; j < 5; j++) {
        if(value[i] == value[j + 1]) {
            result = false;
        }
    }
}
return result;
This code essentially compares the values that each card object has, to detect whether two cards in the array share the same value. We have 5 cards in each hand, which is why the array length is 5. The getValue method returns an integer, which is essentially the value of the card. I don't seem to know what I'm doing wrong to be getting errors in my method.
Your array access is incorrect when you use j + 1; that will be out of bounds when j is four (the last valid index of value). Also, I would prefer to use value.length instead of hardcoding 5. Something like:
for (int i = 0; i < value.length - 1; i++) {
    for (int j = i + 1; j < value.length; j++) {
        if (value[i] == value[j]) {
            result = false;
        }
    }
}
Additionally, as pointed out by Tom in the comments, it is pointless to continue iterating once the result becomes false. You could simply return as soon as a duplicate is found and avoid the result variable entirely. Like:
for (int i = 0; i < value.length - 1; i++) {
    for (int j = i + 1; j < value.length; j++) {
        if (value[i] == value[j]) {
            return false;
        }
    }
}
return true;
Another option, in Java 8+, would be something like
return IntStream.of(value).distinct().count() == value.length;
which is true exactly when no two cards share a value.
The following code builds a table of the factors of each number from 1 to n. Can someone help me rewrite the following O(n²) time code so that it runs in O(n·sqrt(n)) time?
I actually rewrote the algorithm to run in O(n·log n), but I can't figure it out for that complexity.
public static Vector<Vector<Integer>> factTable(int n) {
    Vector<Vector<Integer>> table = new Vector<Vector<Integer>>();
    for (int i = 1; i <= n; i++) {
        Vector<Integer> factors = new Vector<Integer>();
        for (int f = 1; f <= i; f++) {
            if ((i % f) == 0)
                factors.add(f);
        }
        table.add(factors);
    }
    return table;
}
For each factor f of i, i/f is also a factor of i. I haven't proven the complexity, but it is better than O(n²):
int count = 0;
Vector<Vector<Integer>> table = new Vector<Vector<Integer>>();
for (int i = 0; i <= n; i++) {
    table.add(new Vector<Integer>());
}
for (int i = 1; i <= n; i++) {
    for (int j = i; j <= n; j += i) {
        Vector<Integer> vj = table.get(j);
        vj.add(i);
        count++;
    }
}
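For reference, the sieve above does about n/i insertions for each i, so its total work is n·(1 + 1/2 + … + 1/n), roughly n·ln n. If the goal is specifically the O(n·sqrt(n)) bound asked for in the question, a minimal sketch that applies the hint above per number (for every factor f ≤ sqrt(i), both f and i/f are factors) could look like the following; keeping the Vector-based signature is only for consistency with the question's code:
public static Vector<Vector<Integer>> factTableSqrt(int n) {
    Vector<Vector<Integer>> table = new Vector<Vector<Integer>>();
    for (int i = 1; i <= n; i++) {
        Vector<Integer> factors = new Vector<Integer>();
        Vector<Integer> cofactors = new Vector<Integer>();   // factors above sqrt(i), collected in reverse
        for (int f = 1; f * f <= i; f++) {
            if (i % f == 0) {
                factors.add(f);                    // f is a factor at or below sqrt(i)
                if (f != i / f) {
                    cofactors.add(i / f);          // its cofactor i/f lies above sqrt(i)
                }
            }
        }
        for (int idx = cofactors.size() - 1; idx >= 0; idx--) {
            factors.add(cofactors.get(idx));       // append cofactors so the list stays sorted
        }
        table.add(factors);
    }
    return table;
}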