BigQuery API Exception - Java

I'm having this problem:
java.lang.NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.directExecutor()Ljava/util/concurrent/Executor;
while using the BigQuery API from Google. I'm using JSF and GlassFish 4.1.
Here is the method that throws the exception:
public void process() throws InterruptedException, FileNotFoundException, IOException {
    GoogleCredentials credentials;
    File credentialsPath = new File("/home/jesus_miranda/Downloads/credential2.json"); // TODO: update to your key path.
    try (FileInputStream serviceAccountStream = new FileInputStream(credentialsPath)) {
        credentials = ServiceAccountCredentials.fromStream(serviceAccountStream);
    }

    // Instantiate a client.
    BigQuery bigquery = BigQueryOptions.newBuilder().setCredentials(credentials).build().getService();
    String query = "SELECT corpus FROM `bigquery-public-data.samples.shakespeare` GROUP BY corpus;";
    QueryJobConfiguration queryConfig = QueryJobConfiguration.newBuilder(query).build();
    for (FieldValueList row : bigquery.query(queryConfig).iterateAll()) { // The program fails at this line.
        for (FieldValue val : row) {
            System.out.printf("%s,", val.toString());
        }
        System.out.printf("\n");
    }
}
Please help me. I have read a lot of forum posts, and all of them talk about the Guava version. I upgraded and downgraded the Guava version and it still doesn't work.
Regards!

This problem looks like the effect of a version mismatch. Take a look at the dependency declarations of the BigQuery API (for example with mvn dependency:tree) and see which Guava version it expects, then make sure that exact version is the one on your runtime classpath.
Maybe you built your application and forgot to shade/fatjar Guava into your jar, or the container is loading its own, older Guava first.
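One way to confirm which Guava actually wins at runtime is to ask the class loader where it found MoreExecutors. This is only a hedged diagnostic sketch (the class name comes from the stack trace above; the rest is plain JDK reflection):

    import com.google.common.util.concurrent.MoreExecutors;

    public class GuavaProbe {
        public static void main(String[] args) {
            // Prints the jar that actually supplies MoreExecutors at runtime.
            // If it is not the Guava version the BigQuery client was compiled
            // against (e.g. a copy bundled by the container), that explains
            // the NoSuchMethodError.
            System.out.println(MoreExecutors.class
                    .getProtectionDomain().getCodeSource().getLocation());
        }
    }

If possible, run the same check from inside the deployed application, since the server's class loader hierarchy, not your local one, is what matters here.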


java.lang.NoSuchMethodError: org.apache.poi.xwpf.usermodel.XWPFHyperlinkRun

I am trying to:
Use a Word document with "MergeFields" to fill it with data
Convert it to a PDF document, using Java
I had this working before, and now all of a sudden I get the following error:
java.lang.NoSuchMethodError: org.apache.poi.xwpf.usermodel.XWPFHyperlinkRun
This occurs when I put the .war file on an Amazon EC2 server
(all other libraries work fine).
Here are the libraries that I use:
fr.opensagres.xdocreport.converter.odt.odfdom (v 1.0.4)
fr.opensagres.xdocreport.template.freemarker (v 1.0.4)
org.apache.poi.xwpf.converter.core (1.0.5)
org.apache.poi.xwpf.converter.pdf (1.0.5)
org.apache.poi.xwpf.converter.xhtml (1.0.5)
org.apache.poi (3.11)
Is there anything wrong with my libraries, or is this just a server deployment issue?
Very thankful for any help.
Below is my code:
public byte[] wordToPdf(RequestHelper reqHelper, Map<String, Object> values, String docPath)
        throws IOException, XDocReportException, ServiceUnavailableException, E24Exception {
    try {
        ServletContext ctx = reqHelper.getRequest().getServletContext();
        InputStream tpl = new BufferedInputStream(ctx.getResourceAsStream(docPath));
        IXDocReport report = XDocReportRegistry.getRegistry().loadReport(tpl, TemplateEngineKind.Velocity);
        Options options = Options.getTo(ConverterTypeTo.PDF).via(ConverterTypeVia.XWPF);
        ByteArrayOutputStream pdfOut = new ByteArrayOutputStream();
        report.convert(report.createContext(values), options, pdfOut);
        byte[] pdfImage = pdfOut.toByteArray();
        return pdfImage;
    } catch (FileNotFoundException ex) {
        // Swallowed: the caller gets null when the template is missing.
    }
    return null;
}
OK, I finally got to a solution that worked for me. Since this post has a lot of views and no answers, I'll answer it myself for those who are in need!
I changed the version of all libraries that have anything to do with "apache.poi" to version 1.0.4.
After that I used org.apache.poi version 3.9 instead of 3.11.
So finally, to wrap it up, this is what I used in the end:
org.apache.poi.xwpf.converter.core (1.0.4)
org.apache.poi.xwpf.converter.pdf (1.0.4)
org.apache.poi.xwpf.converter.xhtml (1.0.4)
org.apache.poi (3.9)
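For anyone debugging a mismatch like this, a class-loader probe helps here too. A hedged diagnostic sketch (plain JDK reflection; only the class name is taken from the error above) that prints which jar the deployed classpath actually resolves XWPFHyperlinkRun from:

    import org.apache.poi.xwpf.usermodel.XWPFHyperlinkRun;

    public class PoiProbe {
        public static void main(String[] args) {
            // Prints the poi-ooxml jar the class loader actually uses; run it
            // on the EC2 server to expose a stale or conflicting POI version.
            System.out.println(XWPFHyperlinkRun.class
                    .getProtectionDomain().getCodeSource().getLocation());
        }
    }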
/Marcus

Azure SDK for Java - sample program throwing InvalidKeyException

Using the Azure Storage SDK for Java, I am trying to perform basic create, read, update, and delete operations on Azure Table Storage, as described in the link below:
https://azure.microsoft.com/en-us/documentation/articles/storage-java-how-to-use-table-storage/
Sample program for creating a table:
package com.azure.test;

import java.io.UnsupportedEncodingException;

import com.microsoft.azure.storage.*;
import com.microsoft.azure.storage.table.CloudTable;
import com.microsoft.azure.storage.table.CloudTableClient;
import com.microsoft.windowsazure.core.utils.Base64;

public class App
{
    public static void main( String[] args ) throws StorageException, UnsupportedEncodingException
    {
        String storageConnectionString =
            "DefaultEndpointsProtocol=http;" +
            "AccountName=accountname;" +
            "AccountKey=storagekey;" +
            "EndpointSuffix=table.core.windows.net";
        try
        {
            // Retrieve storage account from connection string.
            CloudStorageAccount storageAccount =
                CloudStorageAccount.parse(storageConnectionString);
            CloudTableClient tableClient = storageAccount.createCloudTableClient();

            // Create the table if it doesn't exist.
            String tableName = "MyTable";
            CloudTable cloudTable = tableClient.getTableReference(tableName);
            cloudTable.createIfNotExists();
        }
        catch (Exception e)
        {
            // Output the stack trace.
            e.printStackTrace();
            System.out.println(e.getMessage());
        }
    }
}
The code seems fairly simple to understand: it connects to Azure Table Storage and, if a table with the given name does not exist, creates it. But I am getting an InvalidKeyException (full exception pasted below).
java.security.InvalidKeyException: Storage Key is not a valid base64 encoded string.
at com.microsoft.azure.storage.StorageCredentials.tryParseCredentials(StorageCredentials.java:68)
at com.microsoft.azure.storage.CloudStorageAccount.tryConfigureServiceAccount(CloudStorageAccount.java:408)
at com.microsoft.azure.storage.CloudStorageAccount.parse(CloudStorageAccount.java:259)
at com.azure.test.App.main(App.java:71)
I am surprised that not many people using Azure Storage are facing this issue. I tried to encode the storage key using Base64 and used the encoded key in the connection string, but still no use.
    String encodedKey = Base64.encode(storageKey.getBytes());
    String storageConnectionString =
        "DefaultEndpointsProtocol=http;" +
        "AccountName=accountname" +
        "AccountKey=" + encodedKey +
        "EndpointSuffix=table.core.windows.net;";
Can anyone please help me with this? I searched Google a lot and found one user who raised a similar issue on a discussion board, but no answer was provided there, or rather the answer was not helpful.
Update / resolution of the issue:
First of all, I ensured that all the properties in the connection string are separated by ';', as suggested by Gaurav (below).
It turned out that I also had to manually set the proxy settings in my program, since my company work machine uses a proxy to connect to the internet.
System.getProperties().put("http.proxyHost", "myproxyHost");
System.getProperties().put("http.proxyPort", "myProxyPort");
System.getProperties().put("http.proxyUser", "myProxyUser");
System.getProperties().put("http.proxyPassword","myProxyPassword");
Updating the proxy settings solved the issue for me.
Please change the following code:

    String storageConnectionString =
        "DefaultEndpointsProtocol=http;" +
        "AccountName=accountname" +
        "AccountKey=" + encodedKey +
        "EndpointSuffix=table.core.windows.net;";

to:

    String storageConnectionString =
        "DefaultEndpointsProtocol=http;" +
        "AccountName=accountname" +
        ";AccountKey=" + encodedKey +
        ";EndpointSuffix=core.windows.net;";

Essentially, in your code there was no separator (;) between AccountName, AccountKey, and EndpointSuffix. Also, if you're connecting to the standard endpoint (core.windows.net), you don't need to specify EndpointSuffix in your connection string at all.
Lastly, please ensure that the account key is correct.
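If it helps, here is a minimal, self-contained sketch for validating the corrected string in isolation, using the same classes as the question (the account name and key are placeholders, not real values):

    import com.microsoft.azure.storage.CloudStorageAccount;

    public class ConnectionStringCheck {
        public static void main(String[] args) throws Exception {
            String storageConnectionString =
                "DefaultEndpointsProtocol=http;" +
                "AccountName=accountname;" +
                "AccountKey=<base64-account-key>;";
            // parse() is what throws InvalidKeyException when the key is not
            // valid Base64, so a successful parse rules out the error above.
            CloudStorageAccount account = CloudStorageAccount.parse(storageConnectionString);
            System.out.println(account.getTableEndpoint());
        }
    }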

Getting file size from S3 bucket

I am trying to get the file size (content length) of an S3 object using the Amazon S3 Java SDK.
public Long getObjectSize(AmazonS3Client amazonS3Client, String bucket, String key)
        throws IOException {
    Long size = null;
    S3Object object = null;
    try {
        object = amazonS3Client.getObject(bucket, key);
        size = object.getObjectMetadata().getContentLength();
    } finally {
        if (object != null) {
            // object.close();
            // 1. With close() commented out, this works for 50 calls (the
            //    connection pool size); after that I start getting connection
            //    pool errors.
            // 2. If close() is uncommented, the calls take a very long time.
        }
    }
    return size;
}
I followed this and this, but I'm not sure what I am doing wrong here.
Any help on this?
I'm guessing at what your actual question is, but I think you can reduce your code and eliminate the need to create an S3Object at all by doing something like:
public Long getObjectSize(AmazonS3Client amazonS3Client, String bucket, String key)
        throws IOException {
    return amazonS3Client.getObjectMetadata(bucket, key).getContentLength();
}
That should remove the need to call object.close(), which you appear to be having issues with.
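If you do need the object itself and not just its size, the symptoms in the question are the classic result of never closing the S3Object: each call keeps an HTTP connection checked out until the pool (50 in your case) is exhausted. Below is a minimal sketch, assuming a v1 SDK version in which S3Object implements Closeable; the slow calls you saw with close() uncommented are plausibly the client draining unread bytes before reusing the connection, which is one more reason to prefer the metadata (HEAD) request above.

    import java.io.IOException;

    import com.amazonaws.services.s3.AmazonS3Client;
    import com.amazonaws.services.s3.model.S3Object;

    public class S3ObjectSizeExample {
        // Closing the S3Object returns its HTTP connection to the pool.
        public Long getObjectSizeViaGetObject(AmazonS3Client amazonS3Client, String bucket, String key)
                throws IOException {
            try (S3Object object = amazonS3Client.getObject(bucket, key)) {
                return object.getObjectMetadata().getContentLength();
            }
        }
    }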
For v2 of the Amazon S3 Java SDK, try something like this:
HeadObjectRequest headObjectRequest =
        HeadObjectRequest.builder()
                .bucket(bucket)
                .key(key)
                .build();

HeadObjectResponse headObjectResponse = s3Client.headObject(headObjectRequest);

Long contentLength = headObjectResponse.contentLength();
So we have two SDKs.
For v1 of the Amazon S3 Java SDK:

    client.getObjectMetadata(bucket, key).getContentLength();

where client is an instance of AmazonS3 coming from import com.amazonaws.services.s3.AmazonS3; with the Gradle dependency implementation 'com.amazonaws:aws-java-sdk-s3:1.12.353'.
For v2 of the Amazon S3 Java SDK:

    return client.headObject(HeadObjectRequest.builder().bucket(bucket).key(key).build()).contentLength();

where client is an instance of S3Client coming from import software.amazon.awssdk.services.s3.S3Client; with the Gradle dependency implementation 'software.amazon.awssdk:s3:2.18.35'.

Java - Create domain in Amazon SimpleDB

I'm working with Amazon SimpleDB and attempting to create a DB using the following tutorial. Basically it throws the error java.lang.String cannot be cast to org.apache.http.HttpHost. The full stack trace is below:
Error occured: java.lang.String cannot be cast to org.apache.http.HttpHost
java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.http.HttpHost
at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:416)
at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:906)
at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:805)
at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:784)
at com.xerox.amazonws.common.AWSQueryConnection.makeRequest(AWSQueryConnection.java:474)
at com.xerox.amazonws.sdb.SimpleDB.makeRequestInt(SimpleDB.java:231)
at com.xerox.amazonws.sdb.SimpleDB.createDomain(SimpleDB.java:155)
at com.amazonsimpledb.SDBexample1.main(SDBexample1.java:19)
My code is below (note: in my actual code, the AWS access ID and secret key placeholders are replaced with the real values):
public static void main(String[] args) {
    String awsAccessId = "My aws access id";
    String awsSecretKey = "my aws secret key";
    SimpleDB sdb = new SimpleDB(awsAccessId, awsSecretKey, true);
    try {
        Domain domain = sdb.createDomain("cars");
        System.out.println(domain);
    } catch (com.xerox.amazonws.sdb.SDBException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
}
Any ideas as to why the above-mentioned error occurs?
I appreciate any assistance.
It seems you are using the Typica client library, which has been pretty much unmaintained since mid-2011; see e.g. the rare commits and the steadily growing list of unresolved issues, where the latest one appears to be exactly yours in fact, see ClassCastException using Apache HttpClient 4.2:
According to the reporter, things appear to be functional once you downgrade back to Apache HttpClient 4.1, so that might be a temporary workaround.
Either way, I highly recommend switching to the official AWS SDK for Java (or one of the other language SDKs), which is not only supported and maintained on a regular basis, but also closely tracks all AWS API changes (admittedly this isn't that critical for Amazon SimpleDB, which is basically frozen technology-wise, but you'll have a much easier time using the plethora of AWS products and services later on).
In addition, you could benefit from the AWS Toolkit for Eclipse in case you are using that IDE.
The SDK includes a couple of samples (also available via the Eclipse Toolkit wizard), among them one for SimpleDB. Here's a condensed code excerpt adapted to your example:
BasicAWSCredentials basicAWSCredentials = new BasicAWSCredentials(
        awsAccessId, awsSecretKey);
AmazonSimpleDB sdb = new AmazonSimpleDBClient(basicAWSCredentials);
Region usWest2 = Region.getRegion(Regions.US_WEST_2);
sdb.setRegion(usWest2);

try {
    // Create a domain
    String myDomain = "MyStore";
    System.out.println("Creating domain called " + myDomain + ".\n");
    sdb.createDomain(new CreateDomainRequest(myDomain));
    // ...
    // Delete a domain
    System.out.println("Deleting " + myDomain + " domain.\n");
    sdb.deleteDomain(new DeleteDomainRequest(myDomain));
} catch (AmazonServiceException ase) {
    // ...
} catch (AmazonClientException ace) {
    // ...
}
Please try to create the SimpleDB instance with an explicit server and port, and let me know if it works.
String awsAccessKeyId = "access key";
String awsSecretAccessKey = "secret key";
boolean isSecure = true;
String server = "sdb.amazonaws.com";
int port = 443;

try {
    SimpleDB objSimpleDB = new SimpleDB(awsAccessKeyId, awsSecretAccessKey, isSecure, server, port);
    Domain domain = objSimpleDB.createDomain("cars");
} catch (com.xerox.amazonws.sdb.SDBException e) {
    // handle error
}

BIRT Error: Unable to determine the default workspace location in Java

I get the following error, which makes little sense to me:
java.lang.IllegalStateException: Unable to determine the default workspace location. Check your OSGi-less platform configuration of the plugin or datatools workspace path.
Reports are created using the BIRT designer within Eclipse, and we are using code to convert the reports into PDF.
The code looks something like this:
final EngineConfig config = new EngineConfig();
config.setBIRTHome("./birt");
Platform.startup(config);

final IReportEngineFactory factory = (IReportEngineFactory) Platform
        .createFactoryObject(IReportEngineFactory.EXTENSION_REPORT_ENGINE_FACTORY);

final HTMLRenderOption ho = new HTMLRenderOption();
ho.setImageHandler(new HTMLCompleteImageHandler());
config.setEmitterConfiguration(RenderOption.OUTPUT_FORMAT_HTML, ho);

// Create the engine.
this.engine = factory.createReportEngine(config);

final IReportRunnable report = this.engine.openReportDesign(reportName);
final IRunAndRenderTask task = this.engine.createRunAndRenderTask(report);

final RenderOption options = new HTMLRenderOption();
options.setOutputFormat(HTMLRenderOption.OUTPUT_FORMAT_PDF);
final String output = reportName.replaceFirst(".rptdesign", "." + HTMLRenderOption.OUTPUT_FORMAT_PDF);
options.setOutputFileName(output);
task.setRenderOption(options);

// Run the report.
task.run();
It seems that during the task.run() call, the system throws the error.
This needs to be able to run standalone, without needing Eclipse. I hoped that setting the BIRT home would make it happy, but there seems to be some other connection profile involved that I am unaware of and probably don't need.
The full error:
07-Jan-2013 14:55:31 org.eclipse.datatools.connectivity.internal.ConnectivityPlugin log
SEVERE: Unable to determine the default workspace location. Check your OSGi-less platform configuration of the plugin or datatools workspace path.
07-Jan-2013 14:55:31 org.eclipse.birt.report.engine.api.impl.EngineTask handleFatalExceptions
SEVERE: An error happened while running the report. Cause:
java.lang.IllegalStateException: Unable to determine the default workspace location. Check your OSGi-less platform configuration of the plugin or datatools workspace path.
at org.eclipse.datatools.connectivity.internal.ConnectivityPlugin.getDefaultStateLocation(ConnectivityPlugin.java:155)
at org.eclipse.datatools.connectivity.internal.ConnectivityPlugin.getStorageLocation(ConnectivityPlugin.java:191)
at org.eclipse.datatools.connectivity.internal.ConnectionProfileMgmt.getStorageLocation(ConnectionProfileMgmt.java:1060)
at org.eclipse.datatools.connectivity.oda.profile.internal.OdaProfileFactory.defaultProfileStoreFile(OdaProfileFactory.java:170)
at org.eclipse.datatools.connectivity.oda.profile.OdaProfileExplorer.defaultProfileStoreFile(OdaProfileExplorer.java:138)
at org.eclipse.datatools.connectivity.oda.profile.OdaProfileExplorer.loadProfiles(OdaProfileExplorer.java:292)
at org.eclipse.datatools.connectivity.oda.profile.OdaProfileExplorer.getProfileByName(OdaProfileExplorer.java:537)
at org.eclipse.datatools.connectivity.oda.profile.provider.ProfilePropertyProviderImpl.getConnectionProfileImpl(ProfilePropertyProviderImpl.java:184)
at org.eclipse.datatools.connectivity.oda.profile.provider.ProfilePropertyProviderImpl.getDataSourceProperties(ProfilePropertyProviderImpl.java:64)
at org.eclipse.datatools.connectivity.oda.consumer.helper.ConnectionPropertyHandler.getEffectiveProperties(ConnectionPropertyHandler.java:123)
at org.eclipse.datatools.connectivity.oda.consumer.helper.OdaConnection.getEffectiveProperties(OdaConnection.java:826)
at org.eclipse.datatools.connectivity.oda.consumer.helper.OdaConnection.open(OdaConnection.java:240)
at org.eclipse.birt.data.engine.odaconsumer.ConnectionManager.openConnection(ConnectionManager.java:165)
at org.eclipse.birt.data.engine.executor.DataSource.newConnection(DataSource.java:224)
at org.eclipse.birt.data.engine.executor.DataSource.open(DataSource.java:212)
at org.eclipse.birt.data.engine.impl.DataSourceRuntime.openOdiDataSource(DataSourceRuntime.java:217)
at org.eclipse.birt.data.engine.impl.QueryExecutor.openDataSource(QueryExecutor.java:407)
at org.eclipse.birt.data.engine.impl.QueryExecutor.prepareExecution(QueryExecutor.java:317)
at org.eclipse.birt.data.engine.impl.PreparedQuery.doPrepare(PreparedQuery.java:455)
at org.eclipse.birt.data.engine.impl.PreparedDataSourceQuery.produceQueryResults(PreparedDataSourceQuery.java:190)
at org.eclipse.birt.data.engine.impl.PreparedDataSourceQuery.execute(PreparedDataSourceQuery.java:178)
at org.eclipse.birt.data.engine.impl.PreparedOdaDSQuery.execute(PreparedOdaDSQuery.java:145)
at org.eclipse.birt.report.data.adapter.impl.DataRequestSessionImpl.execute(DataRequestSessionImpl.java:624)
at org.eclipse.birt.report.engine.data.dte.DteDataEngine.doExecuteQuery(DteDataEngine.java:152)
at org.eclipse.birt.report.engine.data.dte.AbstractDataEngine.execute(AbstractDataEngine.java:267)
at org.eclipse.birt.report.engine.executor.ExecutionContext.executeQuery(ExecutionContext.java:1939)
at org.eclipse.birt.report.engine.executor.QueryItemExecutor.executeQuery(QueryItemExecutor.java:80)
at org.eclipse.birt.report.engine.executor.TableItemExecutor.execute(TableItemExecutor.java:62)
at org.eclipse.birt.report.engine.internal.executor.dup.SuppressDuplicateItemExecutor.execute(SuppressDuplicateItemExecutor.java:43)
at org.eclipse.birt.report.engine.internal.executor.wrap.WrappedReportItemExecutor.execute(WrappedReportItemExecutor.java:46)
at org.eclipse.birt.report.engine.internal.executor.l18n.LocalizedReportItemExecutor.execute(LocalizedReportItemExecutor.java:34)
at org.eclipse.birt.report.engine.layout.html.HTMLBlockStackingLM.layoutNodes(HTMLBlockStackingLM.java:65)
at org.eclipse.birt.report.engine.layout.html.HTMLPageLM.layout(HTMLPageLM.java:92)
at org.eclipse.birt.report.engine.layout.html.HTMLReportLayoutEngine.layout(HTMLReportLayoutEngine.java:100)
at org.eclipse.birt.report.engine.api.impl.RunAndRenderTask.doRun(RunAndRenderTask.java:180)
at org.eclipse.birt.report.engine.api.impl.RunAndRenderTask.run(RunAndRenderTask.java:77)
Has anyone seen this error and can point me in the right direction?
When I had this issue, I tried two things. The first solved this error, but then I just hit the next one.
The first thing I tried was setting the setenv.sh file to have the following line:
export CATALINA_OPTS="$CATALINA_OPTS -Djava.io.tmpdir=/opt/local/share/tomcat/apache-tomcat-8.0.8/temp/tmpdir -Dorg.eclipse.datatools_workspacepath=/opt/local/share/tomcat/apache-tomcat-8.0.8/temp/tmpdir/workspace_dtp"
This solution worked after I created the tmpdir and workspace_dtp directories on my local Tomcat server, following the guidance here.
However, I just got to the next error, which was a connection profile error. I can look into it again if you need; I know how to replicate the issue.
The second thing I tried ended up solving the issue completely, and it had to do with our report designer selecting the wrong type of data source in the report design process. See my post on the Eclipse BIRT forums for the full story: post.
Basically, the report type was set to "JDBC Database Connection for Query Builder" when it should have been set to "JDBC Data Source." See the picture for reference:
Here is a tip that saved me from that pain: just launch Eclipse with the "-clean" option after installing the BIRT plugins.
To be clear, my project was built from BIRT Maven dependencies, and so should not need Eclipse dependencies to run (except for designing reports), but I think there was a conflict somewhere, especially with org.eclipse.datatools.connectivity_1.2.4.v201202041105.jar.
For a broader understanding, you should follow the migration guide:
http://wiki.eclipse.org/Birt_3.7_Migration_Guide#Connection_Profiles
It explains how to use a connection profile to externalize datasource parameters; this is not required if you define JDBC parameters directly in the report design.
I used this programmatic way to initialize the workspace directory:
// Property name matching the -Dorg.eclipse.datatools_workspacepath option shown above.
private static final String DATATOOLS_WORKSPACE_PATH = "org.eclipse.datatools_workspacepath";

@Override
public void initializeEngine() throws BirtException {
    // define eclipse datatools workspace path (required)
    String workspacePath = setDataToolsWorkspacePath();

    // set configuration
    final EngineConfig config = new EngineConfig();
    config.setLogConfig(workspacePath, Level.WARNING);
    // config.setResourcePath(getSqlDriverClassJarPath());

    // startup OSGi framework
    Platform.startup(config); // really needed?

    IReportEngineFactory factory = (IReportEngineFactory) Platform
            .createFactoryObject(IReportEngineFactory.EXTENSION_REPORT_ENGINE_FACTORY);
    engine = factory.createReportEngine(config);
    engine.changeLogLevel(Level.WARNING);
}

private String setDataToolsWorkspacePath() {
    String workspacePath = System.getProperty(DATATOOLS_WORKSPACE_PATH);
    if (workspacePath == null) {
        workspacePath = FilenameUtils.concat(SystemUtils.getJavaIoTmpDir().getAbsolutePath(), "workspace_dtp");
        File workspaceDir = new File(workspacePath);
        if (!workspaceDir.exists()) {
            workspaceDir.mkdir();
        }
        if (!workspaceDir.canWrite()) {
            workspaceDir.setWritable(true);
        }
        System.setProperty(DATATOOLS_WORKSPACE_PATH, workspacePath);
    }
    return workspacePath;
}
I also needed to force the datasource parameters at runtime, this way:
private void generateReportOutput(InputStream reportDesignInStream, File outputFile, OUTPUT_FORMAT outputFormat,
        Map<PARAM, Object> params) throws EngineException, SemanticException {
    // Open a report design
    IReportRunnable design = engine.openReportDesign(reportDesignInStream);

    // Use data-source properties from persistence.xml
    forceDataSource(design);

    // Create RunAndRender task
    IRunAndRenderTask runTask = engine.createRunAndRenderTask(design);

    // Use data-source from JPA persistence context
    // forceDataSourceConnection(runTask);

    // Define report parameters
    defineReportParameters(runTask, params);

    // Set render options
    runTask.setRenderOption(getRenderOptions(outputFile, outputFormat, params));

    // Execute task
    runTask.run();
}

private void forceDataSource(IReportRunnable runableReport) throws SemanticException {
    DesignElementHandle designHandle = runableReport.getDesignHandle();

    Map<String, String> persistenceProperties = PersistenceUtils.getPersistenceProperties();
    String dsURL = persistenceProperties.get(AvailableSettings.JDBC_URL);
    String dsDatabase = StringUtils.substringAfterLast(dsURL, "/");
    String dsUser = persistenceProperties.get(AvailableSettings.JDBC_USER);
    String dsPass = persistenceProperties.get(AvailableSettings.JDBC_PASSWORD);
    String dsDriver = persistenceProperties.get(AvailableSettings.JDBC_DRIVER);

    SlotHandle dataSources = ((ReportDesignHandle) designHandle).getDataSources();
    int count = dataSources.getCount();
    for (int i = 0; i < count; i++) {
        DesignElementHandle dsHandle = dataSources.get(i);
        if (dsHandle != null && dsHandle instanceof OdaDataSourceHandle) {
            // replace connection properties from persistence.xml
            dsHandle.setProperty("databaseName", dsDatabase);
            dsHandle.setProperty("username", dsUser);
            dsHandle.setProperty("password", dsPass);
            dsHandle.setProperty("URL", dsURL);
            dsHandle.setProperty("driverClass", dsDriver);
            dsHandle.setProperty("jarList", getSqlDriverClassJarPath());

            // @SuppressWarnings("unchecked")
            // List<ExtendedProperty> privateProperties = (List<ExtendedProperty>) dsHandle
            //         .getProperty("privateDriverProperties");
            // for (ExtendedProperty extProp : privateProperties) {
            //     if ("odaUser".equals(extProp.getName())) {
            //         extProp.setValue(dsUser);
            //     }
            // }
        }
    }
}
I was having the same issue.
Changing the data source type from "JDBC Database Connection for Query Builder" to "JDBC Data Source" solved the problem for me.
