How to do connection pooling for Google Cloud Bigtable - Java

Is there a built-in library for this, or is it necessary to implement a custom one?
I have tried to check here, but I'm not sure how to proceed from there: Bigtable connection pool
I have tried the code below, but I'm not really sure how to progress from here:
import com.google.cloud.bigtable.config.BigtableOptions;
import com.google.cloud.bigtable.config.CredentialOptions;
import com.google.cloud.bigtable.grpc.BigtableSession;
import com.google.cloud.bigtable.grpc.io.ChannelPool;
import com.mahindra.digisense.config.AppConfig;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.security.GeneralSecurityException;
@Component
public class BigTableConnectionPoolingExample {
@Autowired
private AppConfig.BigTableConfig bigTableConfig;
private void bigTableConnectionPooling() throws IOException, GeneralSecurityException {
CredentialOptions credentialOptions = CredentialOptions.jsonCredentials(new FileInputStream(new File(bigTableConfig.getCredentialsJson())));
BigtableOptions.Builder builder = new BigtableOptions.Builder();
builder.setCredentialOptions(credentialOptions);
ChannelPool.ChannelFactory channelFactory = (ChannelPool.ChannelFactory) BigtableSession.createChannelPool(bigTableConfig.getInstanceId(), builder.build());
ChannelPool channelPool = new ChannelPool(channelFactory,3);
}
}
Here is another Stack Overflow question, which has no answers.

As noted by Solomon Duskis, we are encouraging new folks to get started with the idiomatic Bigtable client in google-cloud-java. The client is suitable for production use, however we have not finalized the client API so we may make backward-incompatible changes.
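For illustration, a minimal sketch of the idiomatic client (assuming a recent google-cloud-bigtable version where BigtableDataSettings exposes setProjectId/setInstanceId; the project and instance IDs are placeholders), which manages its gRPC channel pool internally:
import com.google.cloud.bigtable.data.v2.BigtableDataClient;
import com.google.cloud.bigtable.data.v2.BigtableDataSettings;

public class IdiomaticBigtableExample {
    public static void main(String[] args) throws Exception {
        BigtableDataSettings settings = BigtableDataSettings.newBuilder()
                .setProjectId("my-project")    // placeholder
                .setInstanceId("my-instance")  // placeholder
                .build();

        // The client maintains its own channel pool; create one instance
        // and share it across the application rather than pooling manually.
        try (BigtableDataClient dataClient = BigtableDataClient.create(settings)) {
            // dataClient.readRow(...), dataClient.mutateRow(...), etc.
        }
    }
}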
If you are using the HBase client from the Cloud Bigtable Client repo, there are options to adjust the number of data channels used underneath as well as the number of inflight RPCs per channel. But we suggest that you profile your application first, as you should be able to achieve good performance and saturate your clusters without needing to manually adjust these parameters from their defaults.
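If you stay on the HBase-based client, the usual pattern is a single shared Connection; for example (an untested sketch using BigtableConfiguration, with project, instance, and table names as placeholders, and the data-channel and inflight-RPC settings left at their defaults):
import com.google.cloud.bigtable.hbase.BigtableConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.Table;

public class HBaseBigtableExample {
    public static void main(String[] args) throws Exception {
        // One Connection per process; it manages its own channel pool internally.
        try (Connection connection = BigtableConfiguration.connect("my-project", "my-instance")) {
            Table table = connection.getTable(TableName.valueOf("my-table"));
            // table.get(...), table.put(...), etc.
        }
    }
}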

Related

How to access AEM Core models in a Sling model

Specifically, I am trying to enable .SVG files to be usable by the core image component.
Right now I am making a Sling model from which I would ideally like to access the returned values of the getSrc() and getFileReference() methods in the core AEM Image component interface located here.
I am very new to AEM development and Sling models. Am I missing some vital functionality that would let me do this?
Here is my code, which probably isn't at all helpful at this point.
package com.site.core.models;
import com.adobe.cq.wcm.core.components.models.Image;
import org.apache.sling.api.SlingHttpServletRequest;
import org.apache.sling.api.resource.Resource;
import org.apache.sling.api.resource.ResourceResolver;
import org.apache.sling.api.resource.ValueMap;
import org.apache.sling.models.annotations.*;
import org.apache.sling.models.annotations.injectorspecific.*;
import org.apache.sling.settings.SlingSettingsService;
import javax.jcr.Node;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import javax.inject.Inject;
@Model(adaptables = SlingHttpServletRequest.class)
public class ImageModel {

    // Aim: reuse the core Image model for the current request and expose its src.
    // @Self adapts the current request to the core Image model (assumes the
    // request's resource is an image component).
    @Self
    private Image image;

    public String getSrc() {
        return image != null ? image.getSrc() : null;
    }
}
As I mentioned in my comment, the link you are referring to is an interface; the implementation of that interface is here.
In order to use your own implementation, you have two options:
1. In the image component HTML, change the data-sly-use to refer to your impl: com.site.core.models.ImageModel
2. Create a separate model that implements the Image interface and give it a high ranking so it is picked up instead of the existing impl (see the sketch below).
Disclaimer: I have not tested #2, but the documentation suggests that it's possible.
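As a rough illustration of option 2, here is an untested sketch using the Sling Models delegation pattern. It assumes a Core Components version in which the Image interface provides default method implementations (otherwise every method must be overridden); the resource type and class name are placeholders:
package com.site.core.models;

import com.adobe.cq.wcm.core.components.models.Image;
import org.apache.sling.api.SlingHttpServletRequest;
import org.apache.sling.models.annotations.Model;
import org.apache.sling.models.annotations.Via;
import org.apache.sling.models.annotations.injectorspecific.Self;
import org.apache.sling.models.annotations.via.ResourceSuperType;

// Registered as the Image adapter for the project's image resource type (placeholder path).
@Model(adaptables = SlingHttpServletRequest.class,
        adapters = Image.class,
        resourceType = "myproject/components/content/image")
public class SvgAwareImage implements Image {

    // Delegate to the out-of-the-box Image implementation of the resource super type.
    @Self
    @Via(type = ResourceSuperType.class)
    private Image delegate;

    @Override
    public String getSrc() {
        // Custom .svg handling could go here; otherwise fall back to the delegate.
        return delegate.getSrc();
    }

    @Override
    public String getFileReference() {
        return delegate.getFileReference();
    }
}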

Simple Script With Interactive Brokers Java API

I am new to Java, though I have some experience with R.
I have taken part of a java course and have read a book or two as well as the API guides published by interactive brokers. This API is higher level than anything I have worked with before, obviously.
The very first thing I want to do is simply connect to the software. I have been able to do this with the test GUI that Interactive Brokers provides. However, when writing my own script, I am getting an error: Uncompilable source code - Erroneous sym type. I have imported the javaclient/com directory into my new project.
The line that is causing the error is econnect(port=7496, clientid=0);
Reading the documentation, this should work, but obviously does not.
Below is the full code. All of the import calls were copied from a sample file that IB provided. onHowToDetermineStock() is copied from another part of the documentation. Before I can do anything, I obviously need to connect.
Any ideas?
Thank you.
package ibapp;
import java.awt.BorderLayout;
import java.awt.Dimension;
import java.awt.Rectangle;
import java.util.ArrayList;
import javax.swing.Box;
import javax.swing.JFrame;
import javax.swing.JLabel;
import javax.swing.JPanel;
import javax.swing.JScrollPane;
import javax.swing.JTextArea;
import javax.swing.JTextField;
import javax.swing.SwingUtilities;
import javax.swing.border.EmptyBorder;
import com.ib.controller.ApiConnection.ILogger;
import com.ib.controller.ApiController;
import com.ib.controller.ApiController.IBulletinHandler;
import com.ib.controller.ApiController.IConnectionHandler;
import com.ib.controller.ApiController.ITimeHandler;
import com.ib.controller.Formats;
import com.ib.controller.Types.NewsType;
import com.ib.client.EClientSocket;
/**
*
* @author
*/
void onHowToDetermineStock(){
Contract contract = new Contract();
Order order = new Order();
contract.m_symbol = "IBKR";
contract.m_secType = "STK";
contract.m_exchange = "SMART";
contract.m_currency = "USD";
order.m_action = "BUY";
order.m_totalQuantity = 100;
order.m_orderType = "LMT";
order.m_lmtPrice = enteredLmtPrice;
m_client.placeOrder(GlobalOrderId, contract, order);
}
public class IBApp {
/**
* @param args the command line arguments
*/
public static void main(String[] args) {
econnect(port=7496, clientid=0);
onHowToDetermineStock();
}
}
There are a number of problems with your code that make it an invalid Java program.
In Java, all methods must be contained within a class, unlike your onHowToDetermineStock method. Also, unlike R, Java doesn't use named parameters (port=7496 is not a valid argument; it is only legal as an assignment to an already-declared variable named port). There are other problems.
Java is an object-oriented language and is very different from R. I would suggest forgetting the IB API for the time being, and spending some time learning how to code a basic Java application. There are many free tutorials on the web.
E.g.: https://docs.oracle.com/javase/tutorial/
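For reference, once the basics are in place, a connection with the plain socket client looks roughly like this. This is an untested sketch assuming an older TWS API version where EClientSocket takes an EWrapper and exposes eConnect(host, port, clientId); MyWrapper is a hypothetical class implementing the EWrapper callback interface:
import com.ib.client.EClientSocket;

public class ConnectExample {
    public static void main(String[] args) {
        // MyWrapper (hypothetical) implements com.ib.client.EWrapper and receives TWS callbacks.
        EClientSocket client = new EClientSocket(new MyWrapper());

        // Connect to a locally running TWS on the default port with client id 0.
        client.eConnect("127.0.0.1", 7496, 0);

        if (client.isConnected()) {
            System.out.println("Connected to TWS");
        }

        client.eDisconnect();
    }
}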

Create a Table/Query with JPA/Hibernate in Play Framework

So I have Play Framework running at the moment with JPA and Hibernate. I'm completely new to both and the tutorials I've found around the web are above my head.
How in the world can I send a simple query or create a table? This is example code I've written up and I get: "RuntimeException: No EntityManager bound to this thread. Try wrapping this call in JPA.withTransaction, or ensure that the HTTP context is setup on this thread."
package controllers;
import play.Logger;
import javax.persistence.Entity;
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.EntityTransaction;
import javax.persistence.Persistence;
import javax.persistence.Query;
import play.db.jpa.JPA;
import play.mvc.Controller;
import play.db.*;
public class Database {
public static void initbuild() {
Logger.info("Checking database structure. The database will be restructured if not in the correct format.");
JPA.em().createQuery("create table test");
}
}
Let's start by saying that you don't want to build the tables yourself: just write your models under the models package, annotate them with @Entity, and the JPA plugin will automagically generate the tables matching the models that you have defined.
As for the error, it is raised because you need to annotate the method with the @Transactional annotation.
As stated in the official documentation http://www.playframework.com/documentation/2.3.x/JavaJPA
"Every JPA call must be done in a transaction so, to enable JPA for a particular action, annotate it with #play.db.jpa.Transactional. This will compose your action method with a JPA Action that manages the transaction for you"
Hope it helps. By the way, reading the docs and having a look at the computer-jpa example is also suggested.

Issues with HttpClient 4.3.1 creating instance of HttpClient

I am trying to convert an HTTP upload to use the new HttpClient 4.3.1 classes. I'm new to Java. Everything I find online uses deprecated classes (i.e. HttpClient client = new DefaultHttpClient()) or an even older method for creating an instance of HttpClient. Forgive all the extra libraries below; some will be needed in the rest of my project.
I've tried umpteen different ways to create the instance; what I have below is the method I see used in the org.apache documentation for 4.3.1.
Unfortunately, I'm getting an error indicating that HttpClientBuilder is not visible. I'm not even sure what not visible means...the library has been imported. Can anyone point me in the right direction for creating an HttpClient instance?
package newHttpApiUpload;
import org.apache.http.client.HttpClient;
import org.apache.http.HttpConnection;
import org.apache.http.conn.HttpClientConnectionManager;
import org.apache.http.impl.client.HttpClientBuilder;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.impl.conn.PoolingHttpClientConnectionManager;
import org.apache.http.client.config.RequestConfig;
import org.apache.http.auth.AuthScope;
import org.apache.http.auth.UsernamePasswordCredentials;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.entity.AbstractHttpEntity;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;
public class Api {
protected static final HttpClientBuilder client = new HttpClientBuilder();
}
The constructor HttpClientBuilder() is protected (not public) so you cannot call it from your code. That is what not visible means.
You invoke the builder using a static factory method, as such:
HttpClientBuilder.create();
Or, to quickly create a client (which is the whole point):
HttpClient client = HttpClientBuilder.create()
        .setUserAgent("MyAgent")
        .setMaxConnPerRoute(4)
        .build();
You need to use static factory methods from HttpClients.
You can use them to obtain preconfigured instances of HttpClient, or use HttpClients.custom() to apply your custom configuration using HttpClientBuilder.
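For example (a minimal sketch; HttpClients.createDefault() returns a preconfigured client, and HttpClients.custom() returns an HttpClientBuilder; the user agent and connection limit shown are arbitrary):
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;

public class ClientFactoryExample {
    public static void main(String[] args) throws Exception {
        // Preconfigured client with default settings.
        CloseableHttpClient defaultClient = HttpClients.createDefault();

        // Custom configuration via the builder returned by HttpClients.custom().
        CloseableHttpClient customClient = HttpClients.custom()
                .setUserAgent("MyAgent")
                .setMaxConnTotal(20)
                .build();

        defaultClient.close();
        customClient.close();
    }
}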

Why does Alternator mock DynamoDB fail from maven but not Eclipse?

I have been in touch with the author of Alternator about this issue, and he's as puzzled as I am. The short story: I have written unit tests of code that operates against DynamoDB using the Alternator mocking framework that work fine when I invoke them from within Eclipse, but fail when invoked from maven.
The failure arises from within the AWS SDK itself:
com.amazonaws.AmazonServiceException: [java.lang.Error: property value is null.]
at com.amazonaws.http.AmazonHttpClient.handleErrorResponse(AmazonHttpClient.java:679) ~[aws-java-sdk-1.5.5.jar:na]
at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:350) ~[aws-java-sdk-1.5.5.jar:na]
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:202) ~[aws-java-sdk-1.5.5.jar:na]
at com.michelboudreau.alternatorv2.AlternatorDBClientV2.invoke(AlternatorDBClientV2.java:225) ~[alternator-0.6.4.jar:na]
at com.michelboudreau.alternatorv2.AlternatorDBClientV2.updateItem(AlternatorDBClientV2.java:99) ~[alternator-0.6.4.jar:na]
at com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper$2.executeLowLevelRequest(DynamoDBMapper.java:646) ~[aws-java-sdk-1.5.5.jar:na]
at com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper$SaveObjectHandler.execute(DynamoDBMapper.java:767) ~[aws-java-sdk-1.5.5.jar:na]
at com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper.save(DynamoDBMapper.java:658) ~[aws-java-sdk-1.5.5.jar:na]
at com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper.save(DynamoDBMapper.java:488) ~[aws-java-sdk-1.5.5.jar:na]
at com.somoglobal.eventdata.Controller.addDeviceKeys(Controller.java:531) ~[classes/:na]
which, as you can see, is not hugely useful: the stack trace just shows the point where the AWS SDK assembles an exception from the response received through the (mocked) remote service.
The relevant version numbers:
Maven 3.0.4
Java 1.7.0_10
AWS SDK 1.5.5 (although I've also tried 1.5.3 and 1.5.4)
Eclipse version "Kepler", Build id: 20130614-0229
Burrowing down a bit further, here is the (slightly elided) pom.xml for the project:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<parent>
<groupId>snip</groupId>
<artifactId>snip</artifactId>
<version>1.14.2-SNAPSHOT</version>
</parent>
<artifactId>snip</artifactId>
<version>1.6.0-SNAPSHOT</version>
<name>snip</name>
<dependencies>
...snip...
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk</artifactId>
<version>1.5.5</version>
</dependency>
<dependency>
<groupId>com.michelboudreau</groupId>
<artifactId>alternator</artifactId>
<version>0.6.4</version>
<scope>test</scope>
</dependency>
</dependencies>
</project>
and the slightly trimmed down test:
package com.xxx;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertNotNull;
import static org.junit.Assert.assertTrue;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;
import java.util.UUID;
import org.joda.time.DateTime;
import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.TestRule;
import com.amazonaws.Request;
import com.amazonaws.handlers.RequestHandler;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper;
import com.amazonaws.services.dynamodbv2.model.AttributeDefinition;
import com.amazonaws.services.dynamodbv2.model.CreateTableRequest;
import com.amazonaws.services.dynamodbv2.model.CreateTableResult;
import com.amazonaws.services.dynamodbv2.model.KeySchemaElement;
import com.amazonaws.services.dynamodbv2.model.KeyType;
import com.amazonaws.services.dynamodbv2.model.ProvisionedThroughput;
import com.amazonaws.services.dynamodbv2.model.ProvisionedThroughputExceededException;
import com.amazonaws.util.TimingInfo;
import com.michelboudreau.alternator.AlternatorDB;
import com.michelboudreau.alternatorv2.AlternatorDBClientV2;
import com.xxx.stuff;
public class ControllerTest {
private static Controller instance;
private static DynamoDBMapper mapper;
private static AlternatorDB db;
private static AlternatorDBClientV2 client;
@BeforeClass
public static void setup() throws Exception {
client = new AlternatorDBClientV2();
mapper = new DynamoDBMapper(client);
db = new AlternatorDB().start();
// code to create dynamodb was here
instance = new Controller(mapper);
}
@AfterClass
public static void tearDown() throws Exception {
db.stop();
}
@Test
public void testAddDeviceKeys() {
Collection<DeviceKey> keys = EventDataModelMapper.getDeviceKeys(ClassFixtures.event);
assertNotNull("keys should not be null", keys);
assertFalse("keys should not be empty", keys.isEmpty());
boolean result = instance.addDeviceKeys(keys);
assertTrue("result should be true", result);
}
}
The code under test is probably not particularly involved in this failure - I've done enough debug tracing to see that it behaves identically whether the tests are invoked directly through Eclipse or run from Maven.
EDIT
Actually, Alternator might be implicated in this, as the error message in question could be coming out of com.michelboudreau.alternator.validation.ValidatorUtils:
public static <T> List<Error> rejectIfNull(T property) {
List<Error> errors = new ArrayList<Error>();
if (property == null) {
errors.add(new Error("property value is null."));
}
return errors;
}
Ok, stand down: Alternator is not at fault; the problem was ultimately between one of the chairs and keyboards at my workplace. The error message was indeed arising from Alternator, and was ultimately traced to a missing table definition. For very complex reasons, when the tests were run through Maven there was a discrepancy between the name of the mock DynamoDB table created through Alternator and the table name that the code under test was trying to access.
I would like to publicly thank Michel Boudreau for taking the time to respond when I contacted him directly on this matter.
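In other words, the fix is to make sure the table created against the Alternator mock uses exactly the name the DynamoDBMapper derives from the entity's @DynamoDBTable annotation. A rough sketch of that setup step (the table and key names here are hypothetical, and it assumes AlternatorDBClientV2 supports createTable as the test's imports suggest):
// In the @BeforeClass setup, create the mock table with the exact name the mapper expects.
CreateTableRequest request = new CreateTableRequest()
        .withTableName("DeviceKey") // hypothetical; must match @DynamoDBTable(tableName = "DeviceKey")
        .withKeySchema(new KeySchemaElement().withAttributeName("id").withKeyType(KeyType.HASH))
        .withAttributeDefinitions(new AttributeDefinition().withAttributeName("id").withAttributeType("S"))
        .withProvisionedThroughput(new ProvisionedThroughput()
                .withReadCapacityUnits(1L)
                .withWriteCapacityUnits(1L));
client.createTable(request);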
Instead you can run Amazon DynamoDB locally.
http://aws.typepad.com/aws/2013/09/dynamodb-local-for-desktop-development.html
Instead, you can run DynamoDB Local using jcabi-dynamodb-maven-plugin (I'm the developer).
