Bigtable emulator: Could not find an appropriate constructor - Java

Recently, I've been trying to develop something using the Bigtable emulator with Java (Spring Boot) in IntelliJ IDEA.
What I have done:
The Bigtable emulator works well on my computer (macOS 10.15.6).
cbt works normally with the Bigtable emulator running on my Mac, like this.
I've checked that running the Bigtable emulator doesn't need real gcloud credentials.
The unit test I wrote in IDEA, shown below, works fine.
I have added the environment variable in the run configuration, like this:
My unit test code:
I. Connect init:
Configuration conf;
Connection connection = null;
conf = BigtableConfiguration.configure("fake-project", "fake-instance");
String host = "localhost";
String port = "8086";
II. Constant data to be written into the table:
final byte[] TABLE_NAME = Bytes.toBytes("Hello-Bigtable");
final byte[] COLUMN_FAMILY_NAME = Bytes.toBytes("cf1");
final byte[] COLUMN_NAME = Bytes.toBytes("greeting");
final String[] GREETINGS = {
"Hello World!", "Hello Cloud Bigtable!", "Hello!!"
};
III. Connecting:
if (!Strings.isNullOrEmpty(host)) {
    conf.set(BigtableOptionsFactory.BIGTABLE_HOST_KEY, host);
    conf.set(BigtableOptionsFactory.BIGTABLE_PORT_KEY, port);
    conf.set(BigtableOptionsFactory.BIGTABLE_USE_PLAINTEXT_NEGOTIATION, "true");
}
connection = BigtableConfiguration.connect(conf);
IV. Write & Read data:
Admin admin = connection.getAdmin();
Table table = connection.getTable(TableName.valueOf(TABLE_NAME));
if (!admin.tableExists(TableName.valueOf(TABLE_NAME))) {
    HTableDescriptor descriptor = new HTableDescriptor(TableName.valueOf(TABLE_NAME));
    descriptor.addFamily(new HColumnDescriptor(COLUMN_FAMILY_NAME));
    System.out.print("Create table " + descriptor.getNameAsString());
    admin.createTable(descriptor);
}
for (int i = 0; i < GREETINGS.length; i++) {
    String rowKey = "greeting" + i;
    Put put = new Put(Bytes.toBytes(rowKey));
    put.addColumn(COLUMN_FAMILY_NAME, COLUMN_NAME, Bytes.toBytes(GREETINGS[i]));
    table.put(put);
}
Scan scan = new Scan();
ResultScanner scanner = table.getScanner(scan);
for (Result row : scanner) {
    byte[] valueBytes = row.getValue(COLUMN_FAMILY_NAME, COLUMN_NAME);
    System.out.println('\t' + Bytes.toString(valueBytes));
}
V. Output
Hello World!
Hello Cloud Bigtable!
Hello!!
The problem came after I brought this code into my project.
When I run the code in debug mode, I get something like this when it tries to connect to Bigtable:
It seems that it can't create a new instance based on the config I created.
Eventually, it shows me an error like:
Could not find an appropriate constructor for com.google.cloud.bigtable.hbase1_x.BigtableConnection
P.S. I have tried launching IntelliJ IDEA from the command line. The reason I did so is that the environment variable was missing when I ran the unit test.
In my .zshrc:
My terminal is iTerm2 with oh-my-zsh.
Any help is appreciated!
Thanks a lot.

It seems that you are missing the constructor for BigtableConnection: BigtableConnection(org.apache.hadoop.conf.Configuration conf)
I would suggest trying to create a Connection object by following the steps mentioned in the Google documentation:
private static Connection connection = null;

public static void connect() throws IOException {
    Configuration config = BigtableConfiguration.configure(PROJECT_ID, INSTANCE_ID);
    // Include the following line if you are using app profiles.
    // If you do not include the following line, the connection uses the
    // default app profile.
    config.set(BigtableOptionsFactory.APP_PROFILE_ID_KEY, APP_PROFILE_ID);
    connection = BigtableConfiguration.connect(config);
}
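Note that the Bigtable clients are also documented to honor the BIGTABLE_EMULATOR_HOST environment variable, which avoids wiring host/port through the Configuration by hand. A minimal sketch under that assumption (the project/instance names and the 8086 port are taken from the question; verify the variable is actually visible to the test runner, which was the P.S. issue above, and that your bigtable-hbase version supports it):

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.client.Connection;
import com.google.cloud.bigtable.hbase.BigtableConfiguration;

public class EmulatorConnection {
    // Assumes BIGTABLE_EMULATOR_HOST=localhost:8086 is set in the environment,
    // e.g. in the IntelliJ run configuration of the unit test.
    public static Connection connect() throws IOException {
        Configuration config = BigtableConfiguration.configure("fake-project", "fake-instance");
        return BigtableConfiguration.connect(config);
    }
}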

Related

Insert variables into SAS using Java (IOM Bridge). Should I use CORBA stubs and JDBC, or is there any other alternative?

This is part of my code snippet:
WorkspaceConnector connector = null;
WorkspaceFactory workspaceFactory = null;
String variableListString = null;
Properties sasServerProperties = new Properties();
sasServerProperties.put("host", host);
sasServerProperties.put("port", port);
sasServerProperties.put("userName", userName);
sasServerProperties.put("password", password);
Properties[] sasServerPropertiesList = { sasServerProperties };
workspaceFactory = new WorkspaceFactory(sasServerPropertiesList, null, logWriter);
connector = workspaceFactory.getWorkspaceConnector(0L);
IWorkspace sasWorkspace = connector.getWorkspace();
ILanguageService sasLanguage = sasWorkspace.LanguageService();
//send variable list string
//continued
I need to send the "variableListString" to the SAS server through the IOM bridge. The Java SAS API doesn't give an explicit way to do it. Is using CORBA and JDBC the best way to do it? Give me a hint on how to do it, or is there an alternative method?
This was asked a while back, but this may be useful in case anyone is still looking to do the same.
One way to do this is to build a string of SAS code and submit it to the server. We use this method for setting up variables on the host for the connected session. You can also use this technique to include SAS code, with something like %include "path to my code/my sas code.sas";:
// ...continuing from the code in the question...
ILanguageService langService = sasWorkspace.LanguageService();
StringBuilder sb = new StringBuilder();
sb.append("%let mysasvar=" + javalocalvar + ";\n"); // each SAS statement needs its terminating ';'
// ...append more variables the same way...
try {
    langService.Submit(sb.toString());
} catch (GenericError e) {
    e.printStackTrace();
}
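The same Submit call can also pull in a whole file of SAS code via %include, as mentioned above; a small sketch (the path is the illustrative one from above, not a real location):

try {
    // server-side path to the SAS program to include
    langService.Submit("%include \"path to my code/my sas code.sas\";");
} catch (GenericError e) {
    e.printStackTrace();
}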

Overriding authentication behavior in jclouds

I'm looking to leverage Rackspace's Cloud Files platform for large object storage (Word docs, images, etc.). Following some of their guides, I found a useful code snippet that looks like it should work, but doesn't in my case.
Iterable<Module> modules = ImmutableSet.<Module> of(
        new Log4JLoggingModule());
Properties properties = new Properties();
properties.setProperty(LocationConstants.PROPERTY_ZONE, ZONE);
properties.setProperty(LocationConstants.PROPERTY_REGION, "ORD");
CloudFilesClient cloudFilesClient = ContextBuilder.newBuilder(PROVIDER)
        .credentials(username, apiKey)
        .overrides(properties)
        .modules(modules)
        .buildApi(CloudFilesClient.class);
The problem is that when this code executes, it tries to log me into the IAD (Virginia) instance of Cloud Files. My organization's goal is to use the ORD (Chicago) instance as primary, to be colocated with our cloud, and to use DFW as a backup environment. The login response lists the IAD instance first, so I'm assuming jclouds is using that. Browsing around, it looks like the ZONE/REGION attributes are ignored for Cloud Files. I was wondering if there is any way to override the authentication code so that it loops through the returned providers and chooses which one to log in to.
Update:
The accepted answer is mostly good, with some more info available in this snippet:
RestContext<CommonSwiftClient, CommonSwiftAsyncClient> swift = cloudFilesClient.unwrap();
CommonSwiftClient client = swift.getApi();
SwiftObject object = client.newSwiftObject();
object.getInfo().setName(FILENAME + SUFFIX);
object.setPayload("This is my payload."); // input stream.
String id = client.putObject(CONTAINER, object);
System.out.println(id);
SwiftObject obj2 = client.getObject(CONTAINER, FILENAME + SUFFIX);
System.out.println(obj2.getPayload());
We are working on the next version of jclouds (1.7.1) that should include multi-region support for Rackspace Cloud Files and OpenStack Swift. In the meantime you might be able to use this code as a workaround.
private void uploadToRackspaceRegion() {
    Iterable<Module> modules = ImmutableSet.<Module> of(new Log4JLoggingModule());
    String provider = "swift-keystone"; // Region selection is limited to the swift-keystone provider
    String identity = "username";
    String credential = "password";
    String endpoint = "https://identity.api.rackspacecloud.com/v2.0/";
    String region = "ORD";
    Properties overrides = new Properties();
    overrides.setProperty(LocationConstants.PROPERTY_REGION, region);
    overrides.setProperty(Constants.PROPERTY_API_VERSION, "2");
    BlobStoreContext context = ContextBuilder.newBuilder(provider)
            .endpoint(endpoint)
            .credentials(identity, credential)
            .modules(modules)
            .overrides(overrides)
            .buildView(BlobStoreContext.class);
    RestContext<CommonSwiftClient, CommonSwiftAsyncClient> swift = context.unwrap();
    CommonSwiftClient client = swift.getApi();
    SwiftObject uploadObject = client.newSwiftObject();
    uploadObject.getInfo().setName("test.txt");
    uploadObject.setPayload("This is my payload."); // input stream.
    String eTag = client.putObject("jclouds", uploadObject);
    System.out.println("eTag = " + eTag);
    SwiftObject downloadObject = client.getObject("jclouds", "test.txt");
    System.out.println("downloadObject = " + downloadObject.getPayload());
    context.close();
}
Use Swift as you would Cloud Files. Keep in mind that if you need the Cloud Files CDN features, the above won't work for that. Also, know that this way of doing things will eventually be deprecated.
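One small hardening note: since the context holds network resources, it is safer to close it in a finally block so a failed put/get doesn't leak it. A sketch of that shape, using the same builder and calls as above (nothing new assumed):

BlobStoreContext context = ContextBuilder.newBuilder(provider)
        .endpoint(endpoint)
        .credentials(identity, credential)
        .modules(modules)
        .overrides(overrides)
        .buildView(BlobStoreContext.class);
try {
    // ... putObject / getObject calls as above ...
} finally {
    context.close(); // always release the context
}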

BIRT error: Unable to determine the default workspace location in Java

I get the following error:
java.lang.IllegalStateException: Unable to determine the default workspace location. Check your OSGi-less platform configuration of the plugin or datatools workspace path.
which makes little sense.
Reports are created using the BIRT designer within Eclipse, and we are using code to convert the reports into PDF.
The code looks something like:
final EngineConfig config = new EngineConfig();
config.setBIRTHome("./birt");
Platform.startup(config);
final IReportEngineFactory factory = (IReportEngineFactory) Platform
        .createFactoryObject(IReportEngineFactory.EXTENSION_REPORT_ENGINE_FACTORY);
final HTMLRenderOption ho = new HTMLRenderOption();
ho.setImageHandler(new HTMLCompleteImageHandler());
config.setEmitterConfiguration(RenderOption.OUTPUT_FORMAT_HTML, ho);
// Create the engine.
this.engine = factory.createReportEngine(config);
final IReportRunnable report = this.engine.openReportDesign(reportName);
final IRunAndRenderTask task = this.engine.createRunAndRenderTask(report);
final RenderOption options = new HTMLRenderOption();
options.setOutputFormat(RenderOption.OUTPUT_FORMAT_PDF); // "pdf"
final String output = reportName.replaceFirst(".rptdesign", ".pdf");
options.setOutputFileName(output);
task.setRenderOption(options);
// Run the report.
task.run();
but it seems that during the task.run() method, the system throws the error.
This needs to be able to run standalone, without needing Eclipse. I had hoped that setting BIRT home would make it happy, but there seems to be some other connection profile I am unaware of and probably don't need.
The full error:
07-Jan-2013 14:55:31 org.eclipse.datatools.connectivity.internal.ConnectivityPlugin log
SEVERE: Unable to determine the default workspace location. Check your OSGi-less platform configuration of the plugin or datatools workspace path.
07-Jan-2013 14:55:31 org.eclipse.birt.report.engine.api.impl.EngineTask handleFatalExceptions
SEVERE: An error happened while running the report. Cause:
java.lang.IllegalStateException: Unable to determine the default workspace location. Check your OSGi-less platform configuration of the plugin or datatools workspace path.
at org.eclipse.datatools.connectivity.internal.ConnectivityPlugin.getDefaultStateLocation(ConnectivityPlugin.java:155)
at org.eclipse.datatools.connectivity.internal.ConnectivityPlugin.getStorageLocation(ConnectivityPlugin.java:191)
at org.eclipse.datatools.connectivity.internal.ConnectionProfileMgmt.getStorageLocation(ConnectionProfileMgmt.java:1060)
at org.eclipse.datatools.connectivity.oda.profile.internal.OdaProfileFactory.defaultProfileStoreFile(OdaProfileFactory.java:170)
at org.eclipse.datatools.connectivity.oda.profile.OdaProfileExplorer.defaultProfileStoreFile(OdaProfileExplorer.java:138)
at org.eclipse.datatools.connectivity.oda.profile.OdaProfileExplorer.loadProfiles(OdaProfileExplorer.java:292)
at org.eclipse.datatools.connectivity.oda.profile.OdaProfileExplorer.getProfileByName(OdaProfileExplorer.java:537)
at org.eclipse.datatools.connectivity.oda.profile.provider.ProfilePropertyProviderImpl.getConnectionProfileImpl(ProfilePropertyProviderImpl.java:184)
at org.eclipse.datatools.connectivity.oda.profile.provider.ProfilePropertyProviderImpl.getDataSourceProperties(ProfilePropertyProviderImpl.java:64)
at org.eclipse.datatools.connectivity.oda.consumer.helper.ConnectionPropertyHandler.getEffectiveProperties(ConnectionPropertyHandler.java:123)
at org.eclipse.datatools.connectivity.oda.consumer.helper.OdaConnection.getEffectiveProperties(OdaConnection.java:826)
at org.eclipse.datatools.connectivity.oda.consumer.helper.OdaConnection.open(OdaConnection.java:240)
at org.eclipse.birt.data.engine.odaconsumer.ConnectionManager.openConnection(ConnectionManager.java:165)
at org.eclipse.birt.data.engine.executor.DataSource.newConnection(DataSource.java:224)
at org.eclipse.birt.data.engine.executor.DataSource.open(DataSource.java:212)
at org.eclipse.birt.data.engine.impl.DataSourceRuntime.openOdiDataSource(DataSourceRuntime.java:217)
at org.eclipse.birt.data.engine.impl.QueryExecutor.openDataSource(QueryExecutor.java:407)
at org.eclipse.birt.data.engine.impl.QueryExecutor.prepareExecution(QueryExecutor.java:317)
at org.eclipse.birt.data.engine.impl.PreparedQuery.doPrepare(PreparedQuery.java:455)
at org.eclipse.birt.data.engine.impl.PreparedDataSourceQuery.produceQueryResults(PreparedDataSourceQuery.java:190)
at org.eclipse.birt.data.engine.impl.PreparedDataSourceQuery.execute(PreparedDataSourceQuery.java:178)
at org.eclipse.birt.data.engine.impl.PreparedOdaDSQuery.execute(PreparedOdaDSQuery.java:145)
at org.eclipse.birt.report.data.adapter.impl.DataRequestSessionImpl.execute(DataRequestSessionImpl.java:624)
at org.eclipse.birt.report.engine.data.dte.DteDataEngine.doExecuteQuery(DteDataEngine.java:152)
at org.eclipse.birt.report.engine.data.dte.AbstractDataEngine.execute(AbstractDataEngine.java:267)
at org.eclipse.birt.report.engine.executor.ExecutionContext.executeQuery(ExecutionContext.java:1939)
at org.eclipse.birt.report.engine.executor.QueryItemExecutor.executeQuery(QueryItemExecutor.java:80)
at org.eclipse.birt.report.engine.executor.TableItemExecutor.execute(TableItemExecutor.java:62)
at org.eclipse.birt.report.engine.internal.executor.dup.SuppressDuplicateItemExecutor.execute(SuppressDuplicateItemExecutor.java:43)
at org.eclipse.birt.report.engine.internal.executor.wrap.WrappedReportItemExecutor.execute(WrappedReportItemExecutor.java:46)
at org.eclipse.birt.report.engine.internal.executor.l18n.LocalizedReportItemExecutor.execute(LocalizedReportItemExecutor.java:34)
at org.eclipse.birt.report.engine.layout.html.HTMLBlockStackingLM.layoutNodes(HTMLBlockStackingLM.java:65)
at org.eclipse.birt.report.engine.layout.html.HTMLPageLM.layout(HTMLPageLM.java:92)
at org.eclipse.birt.report.engine.layout.html.HTMLReportLayoutEngine.layout(HTMLReportLayoutEngine.java:100)
at org.eclipse.birt.report.engine.api.impl.RunAndRenderTask.doRun(RunAndRenderTask.java:180)
at org.eclipse.birt.report.engine.api.impl.RunAndRenderTask.run(RunAndRenderTask.java:77)
Has anyone seen this error, and can you point me in the right direction?
When I had this issue I tried two things. The first solved this error, but then I just hit the next one.
The first thing I tried was setting the setenv.sh file to have the following line:
export CATALINA_OPTS="$CATALINA_OPTS -Djava.io.tmpdir=/opt/local/share/tomcat/apache-tomcat-8.0.8/temp/tmpdir -Dorg.eclipse.datatools_workspacepath=/opt/local/share/tomcat/apache-tomcat-8.0.8/temp/tmpdir/workspace_dtp"
This solution worked after I created the tmpdir and workspace_dtp directories in my local Tomcat server. This was done in response to the guidance here.
However, I then just got to the next error, which was a connection profile error. I can look into it again if you need; I know how to replicate the issue.
The second thing I tried ended up solving the issue completely, and had to do with our report designer selecting the wrong type of data source in the report design process. See my post on the Eclipse BIRT forums for the full story.
Basically, the report type was set to "JDBC Database Connection for Query Builder" when it should have been set to "JDBC Data Source." See the picture for reference:
Here is a tip that saved me from that pain:
Just launch Eclipse with the "-clean" option after installing the BIRT plugins.
To be clear, my project was built from BIRT Maven dependencies, and so should not use Eclipse dependencies to run (except for designing reports), but... I think there was a conflict somewhere, especially with org.eclipse.datatools.connectivity_1.2.4.v201202041105.jar.
For a global understanding, you should follow the migration guide:
http://wiki.eclipse.org/Birt_3.7_Migration_Guide#Connection_Profiles
It explains how to use a connection profile to externalize datasource parameters.
This is not required if you define JDBC parameters directly in the report design.
I used this programmatic way to initialize the workspace directory:
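(The DATATOOLS_WORKSPACE_PATH constant is not shown in the snippet below; assuming it names the same system property used in the setenv.sh answer above, it would be:)

// assumed to match the -Dorg.eclipse.datatools_workspacepath property above
private static final String DATATOOLS_WORKSPACE_PATH = "org.eclipse.datatools_workspacepath";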
@Override
public void initializeEngine() throws BirtException {
    // define eclipse datatools workspace path (required)
    String workspacePath = setDataToolsWorkspacePath();
    // set configuration
    final EngineConfig config = new EngineConfig();
    config.setLogConfig(workspacePath, Level.WARNING);
    // config.setResourcePath(getSqlDriverClassJarPath());
    // startup OSGi framework
    Platform.startup(config); // really needed ?
    IReportEngineFactory factory = (IReportEngineFactory) Platform
            .createFactoryObject(IReportEngineFactory.EXTENSION_REPORT_ENGINE_FACTORY);
    engine = factory.createReportEngine(config);
    engine.changeLogLevel(Level.WARNING);
}
private String setDataToolsWorkspacePath() {
    String workspacePath = System.getProperty(DATATOOLS_WORKSPACE_PATH);
    if (workspacePath == null) {
        workspacePath = FilenameUtils.concat(SystemUtils.getJavaIoTmpDir().getAbsolutePath(), "workspace_dtp");
        File workspaceDir = new File(workspacePath);
        if (!workspaceDir.exists()) {
            workspaceDir.mkdir();
        }
        if (!workspaceDir.canWrite()) {
            workspaceDir.setWritable(true);
        }
        System.setProperty(DATATOOLS_WORKSPACE_PATH, workspacePath);
    }
    return workspacePath;
}
I also needed to force the datasource parameters at runtime, this way:
private void generateReportOutput(InputStream reportDesignInStream, File outputFile, OUTPUT_FORMAT outputFormat,
        Map<PARAM, Object> params) throws EngineException, SemanticException {
    // Open a report design
    IReportRunnable design = engine.openReportDesign(reportDesignInStream);
    // Use data-source properties from persistence.xml
    forceDataSource(design);
    // Create RunAndRender task
    IRunAndRenderTask runTask = engine.createRunAndRenderTask(design);
    // Use data-source from JPA persistence context
    // forceDataSourceConnection(runTask);
    // Define report parameters
    defineReportParameters(runTask, params);
    // Set render options
    runTask.setRenderOption(getRenderOptions(outputFile, outputFormat, params));
    // Execute task
    runTask.run();
}

private void forceDataSource(IReportRunnable runableReport) throws SemanticException {
    DesignElementHandle designHandle = runableReport.getDesignHandle();
    Map<String, String> persistenceProperties = PersistenceUtils.getPersistenceProperties();
    String dsURL = persistenceProperties.get(AvailableSettings.JDBC_URL);
    String dsDatabase = StringUtils.substringAfterLast(dsURL, "/");
    String dsUser = persistenceProperties.get(AvailableSettings.JDBC_USER);
    String dsPass = persistenceProperties.get(AvailableSettings.JDBC_PASSWORD);
    String dsDriver = persistenceProperties.get(AvailableSettings.JDBC_DRIVER);
    SlotHandle dataSources = ((ReportDesignHandle) designHandle).getDataSources();
    int count = dataSources.getCount();
    for (int i = 0; i < count; i++) {
        DesignElementHandle dsHandle = dataSources.get(i);
        if (dsHandle != null && dsHandle instanceof OdaDataSourceHandle) {
            // replace connection properties from persistence.xml
            dsHandle.setProperty("databaseName", dsDatabase);
            dsHandle.setProperty("username", dsUser);
            dsHandle.setProperty("password", dsPass);
            dsHandle.setProperty("URL", dsURL);
            dsHandle.setProperty("driverClass", dsDriver);
            dsHandle.setProperty("jarList", getSqlDriverClassJarPath());
            // @SuppressWarnings("unchecked")
            // List<ExtendedProperty> privateProperties = (List<ExtendedProperty>) dsHandle
            //         .getProperty("privateDriverProperties");
            // for (ExtendedProperty extProp : privateProperties) {
            //     if ("odaUser".equals(extProp.getName())) {
            //         extProp.setValue(dsUser);
            //     }
            // }
        }
    }
}
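For completeness, a rough sketch of a call site for the methods above (the report file names, the OUTPUT_FORMAT.PDF value, and the empty parameter map are assumptions made to match the signatures shown):

// assumed wiring of the snippets above
initializeEngine();
Map<PARAM, Object> params = new HashMap<>(); // report parameters, keyed by the PARAM enum assumed above
try (InputStream design = new FileInputStream("my-report.rptdesign")) {
    generateReportOutput(design, new File("my-report.pdf"), OUTPUT_FORMAT.PDF, params);
}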
I was having the same issue.
Changing the data source type from "JDBC Database Connection for Query Builder" to "JDBC Data Source" solved the problem for me.

Can't connect to MySQL on localhost

I am trying to connect to a MySQL database using the code below, yet my attempt fails.
This is my attempt:
private static Connection conn = null;
private static String url = "jdbc:mysql://localhost/";
private static String dbName = "proj1";
private static String driver = "com.mysql.jdbc.Driver";
private static String userName = "root";
private static String password = "root";

public static int setupConnection() {
    try {
        Class.forName(driver).newInstance();
        conn = DriverManager.getConnection(url + dbName, userName, password);
        return 1;
    } catch (Exception e) {
        JOptionPane.showMessageDialog(null, e.getMessage());
        return 0;
    }
}
When installing MySQL I remember entering the password "root", but I'm not 100% sure if the username is automatically assigned "root". I really appreciate your help.
I get the error message: com.mysql.jdbc.Driver
You need to add the MySQL Connector/J driver to the build path/classpath of your NetBeans project. Otherwise it cannot be loaded.
Unfortunately you did not mention what kind of failure you got, but here are some tips.
My JDBC URL looks like the following: jdbc:mysql://localhost:3306/MYSCHEMA. So the port number is missing in yours (your code already appends the schema name via url + dbName).
To check your credentials, try to connect to your DB using the command line client:
mysql -uroot -proot
Read the error message if you fail. If you cannot recover the credentials, re-install MySQL; it takes 3 minutes. Do not try to connect to the DB from your code until you can do it with existing clients.
Good luck.
You should connect to port 3306; change the URL as below:
private static String url = "jdbc:mysql://localhost:3306/";
I am assuming that you are not getting any compilation errors and that you have added the MySQL Java connector.
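Putting the answers together, a minimal sketch of the fixed connection (schema name and credentials taken from the question; the Connector/J jar must be on the classpath):

import java.sql.Connection;
import java.sql.DriverManager;

public class Db {
    public static Connection connect() throws Exception {
        Class.forName("com.mysql.jdbc.Driver"); // throws ClassNotFoundException if Connector/J is missing
        return DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/proj1", "root", "root");
    }
}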
Start by trying to log into mysql from a command shell. If you can't, JDBC won't be able to, either.
This might help if you can't remember:
http://www.cyberciti.biz/tips/recover-mysql-root-password.html

Connecting to Documentum 5.3

I am new to Documentum. I am creating a standalone Java application that needs to connect to a Documentum 5.3 instance. How would I go about connecting to it?
You must install and configure the Documentum Foundation Classes (the Documentum API) on your client machine, copy dfc.jar anywhere on your classpath, and at the end read tons of documentation :-)
First, your dmcl.ini file must contain the following lines:
[DOCBROKER_PRIMARY]
host=<host address or ip of docbroker for your docbase>
port=<port # usually 1489>
Then the following code should work for you:
// Requires the DFC jars on the classpath; the classes below live in
// com.documentum.com (DfClientX, IDfClientX), com.documentum.fc.client
// (IDfClient, IDfSession, IDfSessionManager) and com.documentum.fc.common (IDfLoginInfo).
String docbaseName = "docbase";
String userName = "user";
String password = "pass";
IDfClientX clientx = new DfClientX();
IDfClient client = clientx.getLocalClient();
IDfSessionManager sMgr = client.newSessionManager();
IDfLoginInfo loginInfoObj = clientx.getLoginInfo();
loginInfoObj.setUser(userName);
loginInfoObj.setPassword(password);
loginInfoObj.setDomain(null);
sMgr.setIdentity(docbaseName, loginInfoObj);
IDfSession session = null;
try
{
    session = sMgr.getSession(docbaseName);
    // do stuff here
}
finally
{
    if (session != null)
    {
        sMgr.release(session);
    }
}
