Azure Table batch insert from Java using SAS fails

Trying to use a batch insert to an Azure table fails when using a SAS (Shared Access Signature). When using the account key (which is less secure, I guess) it works.
Example code:
StorageCredentialsSharedAccessSignature credentials =
        new StorageCredentialsSharedAccessSignature("sig=.....");
CloudTableClient cloudTableClient = new CloudTableClient(
        new URI("https://<storage account>.table.core.windows.net/<tablename>"), credentials);
CloudTable cloudTable = cloudTableClient.getTableReference("<tablename>");

// these two will be in a batch
TableServiceEntity d1 = new TableServiceEntity("3333333333333", "22222222222222" + System.currentTimeMillis());
TableServiceEntity d2 = new TableServiceEntity("3333333333333", "eeeeeeeeeee" + System.currentTimeMillis());
// single
TableServiceEntity d3 = new TableServiceEntity("ddddddddddddddddddd", "dddddddddd" + System.currentTimeMillis());

// prepare batch
TableBatchOperation batch = new TableBatchOperation();
batch.insert(d1);
batch.insert(d2);

try {
    // this works (not a batch, just to show that a regular insert succeeds)
    cloudTable.execute(TableOperation.insert(d3));
    // this fails
    cloudTable.execute(batch);
} catch (StorageException e) {
    // here we get "Unsupported Media Type" (415 error)
    e.printStackTrace();
    return;
}
System.out.println("OK");
The error I get is:
com.microsoft.azure.storage.StorageException: Unsupported Media Type
at com.microsoft.azure.storage.StorageException.translateException(StorageException.java:89)
at com.microsoft.azure.storage.core.StorageRequest.materializeException(StorageRequest.java:315)
at com.microsoft.azure.storage.core.ExecutionEngine.executeWithRetry(ExecutionEngine.java:175)
at com.microsoft.azure.storage.table.TableBatchOperation.execute(TableBatchOperation.java:418)
at com.microsoft.azure.storage.table.CloudTable.execute(CloudTable.java:475)
at com.microsoft.azure.storage.table.CloudTable.execute(CloudTable.java:432)
at com.bgprotect.azurestorage.Test.main(Test.java:49)
SAS
sig=<sig>&se=2020-01-01T00%3A00%3A00Z&sv=2015-04-05&tn=<table name>&sp=raud

Based on the issue on GitHub, please try changing the following line of code:
CloudTableClient cloudTableClient = new CloudTableClient(new URI("https://<storage account>.table.core.windows.net/<tablename>"), credentials);
to:
CloudTableClient cloudTableClient = new CloudTableClient(new URI("https://<storage account>.table.core.windows.net"), credentials);
Essentially don't include the name of the table in the URI. It should only be https://account-name.table.core.windows.net.
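For completeness, here is a minimal sketch of the corrected setup, assuming a table-scoped SAS with the permissions shown above (sp=raud) and placeholder account/table names:

StorageCredentialsSharedAccessSignature credentials =
        new StorageCredentialsSharedAccessSignature("sig=.....");
// Base endpoint only -- no table name in the URI
CloudTableClient cloudTableClient = new CloudTableClient(
        new URI("https://<storage account>.table.core.windows.net"), credentials);
CloudTable cloudTable = cloudTableClient.getTableReference("<tablename>");

// All entities in a batch must share the same partition key
TableBatchOperation batch = new TableBatchOperation();
batch.insert(new TableServiceEntity("pk", "rk1-" + System.currentTimeMillis()));
batch.insert(new TableServiceEntity("pk", "rk2-" + System.currentTimeMillis()));
cloudTable.execute(batch);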
P.S. I didn't realize you had also opened an issue on GitHub regarding this :).

Related

Google Drive API to get all children is not working if I dynamically pass fileId to the query

I am trying the Google Drive API to search for the parents of a folder. In the search query I have to pass the file ID dynamically instead of hardcoding it. I tried the code below, but I get a "file not found" JSON response.
Here it's not taking fileId as a value; I think it's being treated as a literal string.
If I hardcode the value, it works.
FileList result = service.files().list().setQ("name='testfile'").execute();
for (com.google.api.services.drive.model.File file : result.getFiles()) {
    System.out.printf("Found file: %s (%s)\n", file.getName(), file.getId());
    String fileId = file.getId();
    // this query fails -- the ID ends up inside the string literal
    FileList childern = service.files().list()
            .setQ(" + \"file.getId()\" in parents")
            .setFields("files(id, name, modifiedTime, mimeType)")
            .execute();
}
This should help:
String fileId = file.getId();
service.files().list().setQ("'" + fileId + "'" + " in parents").setFields("files(id, name, modifiedTime, mimeType)").execute();
Make sure you have a valid file.getId().
I know your question states Java, but the only sample of this working is in C#. Another issue is that, as far as I know, PageStreamer.cs does not have an equivalent in the Java client library.
I am hoping that C# and Java are close enough that this might give you some ideas of how to get it working in Java. My Java knowledge is quite basic, but I may be able to help you debug it if you want to try to convert this.
try
{
    // Initial validation.
    if (service == null)
        throw new ArgumentNullException("service");

    // Building the initial request.
    var request = service.Files.List();

    // Applying optional parameters to the request.
    request = (FilesResource.ListRequest)SampleHelpers.ApplyOptionalParms(request, optional);

    var pageStreamer = new Google.Apis.Requests.PageStreamer<Google.Apis.Drive.v3.Data.File, FilesResource.ListRequest, Google.Apis.Drive.v3.Data.FileList, string>(
        (req, token) => request.PageToken = token,
        response => response.NextPageToken,
        response => response.Files);

    var allFiles = new Google.Apis.Drive.v3.Data.FileList();
    allFiles.Files = new List<Google.Apis.Drive.v3.Data.File>();

    foreach (var result in pageStreamer.Fetch(request))
    {
        allFiles.Files.Add(result);
    }
    return allFiles;
}
catch (Exception Ex)
{
    throw new Exception("Request Files.List failed.", Ex);
}
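If converting the C# is a pain: in the Java client the same paging can be done with a plain pageToken loop. A minimal sketch, assuming the service and file variables from the question:

// Page through all children of a folder with the Java Drive v3 client.
String folderId = file.getId();
String pageToken = null;
do {
    FileList page = service.files().list()
            .setQ("'" + folderId + "' in parents")
            .setFields("nextPageToken, files(id, name, modifiedTime, mimeType)")
            .setPageToken(pageToken)
            .execute();
    for (com.google.api.services.drive.model.File child : page.getFiles()) {
        System.out.printf("Found child: %s (%s)%n", child.getName(), child.getId());
    }
    // null token means we have fetched the last page
    pageToken = page.getNextPageToken();
} while (pageToken != null);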

Table creation returns false in Azure Storage using Java

I created a Java program to create a new table named people in Azure Storage, but I was not able to create it. This is my Java code:
CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);
CloudTableClient tableClient = storageAccount.createCloudTableClient();
String tableName = "people";
CloudTable cloudTable = new CloudTable(tableName,tableClient);
// The line below returns false.
cloudTable.createIfNotExists();
Can someone explain this?
Thank you.
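For what it's worth, in the Azure Storage Java SDK createIfNotExists() returns false (rather than throwing) when the table already exists, so a false return may just mean the table was created earlier; real failures surface as exceptions. A small sketch to tell the cases apart:

try {
    CloudTable cloudTable = tableClient.getTableReference("people");
    if (cloudTable.createIfNotExists()) {
        System.out.println("Table created");
    } else {
        // false just means the table was already there
        System.out.println("Table already exists");
    }
} catch (StorageException | URISyntaxException e) {
    // an actual failure (bad credentials, invalid table name, ...) lands here
    e.printStackTrace();
}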

Call to get EMR results always returns empty state collection even when EMR job done

For IP reasons, I'm not able to post the full source code. However, I made a call to submit an Amazon Elastic MapReduce (EMR) job, which now runs to completion. Previously it failed with, essentially, a file-not-found error.
RunJobFlowResult result=emr.runJobFlow(request);
succeeds and I can get the job flow ID from it.
Later, I have a loop that polls for the status by first creating
DescribeJobFlowsRequest request=new DescribeJobFlowsRequest(jobFlowIdArray);
I check each state in a loop by calling
request.getJobFlowStates()
Unfortunately, that call always returns an empty collection, regardless of whether the job is running, failed or succeeded. How can I get at least some indication of what's going on?
AWSCredentials credentials = new BasicAWSCredentials(accessKey, secretKey);
AmazonElasticMapReduceClient client = new AmazonElasticMapReduceClient(credentials);
client.setEndpoint("elasticmapreduce.us-east-1.amazonaws.com");

StepFactory stepFactory = new StepFactory();
StepConfig enableDebugging = new StepConfig()
    .withActionOnFailure("TERMINATE_JOB_FLOW")
    .withHadoopJarStep(stepFactory.newEnableDebuggingStep());

String[] arguments = {...}; // Custom jar arguments
HadoopJarStepConfig jarConfig = new HadoopJarStepConfig();
jarConfig.setJar(JAR_NAME);
jarConfig.setArgs(Arrays.asList(arguments));
StepConfig runJar = new StepConfig(JAR_NAME.substring(JAR_NAME.indexOf('/') + 1), jarConfig);

RunJobFlowRequest request = new RunJobFlowRequest()
    .withName("...")
    .withSteps(runJar)
    .withLogUri("...")
    .withInstances(
        new JobFlowInstancesConfig()
            .withHadoopVersion("1.0.3")
            .withInstanceCount(5)
            .withKeepJobFlowAliveWhenNoSteps(false)
            .withMasterInstanceType("m1.small")
            .withSlaveInstanceType("m1.small"));
RunJobFlowResult result = client.runJobFlow(request);
String jobFlowID = result.getJobFlowId();

List<String> describeJobFlowIdList = new ArrayList<String>(1);
describeJobFlowIdList.add(jobFlowID);
String lastState = "";
boolean jobMonitoringNotDone = true;
while (jobMonitoringNotDone) {
    DescribeJobFlowsRequest describeJobFlowsRequest =
        new DescribeJobFlowsRequest(describeJobFlowIdList);
    // Call to describeJobFlowsRequest.getJobFlowStates() always returns an
    // empty list, even when the job succeeds or fails.
    for (String state : describeJobFlowsRequest.getJobFlowStates()) {
        if (DONE_STATES.contains(state)) {
            jobMonitoringNotDone = false;
        } else if (!lastState.equals(state)) {
            lastState = state;
            System.out.println("Job " + state + " at " + new Date().toString());
        }
    }
    try {
        Thread.sleep(10000);
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
}
The code above was missing a call similar to:
DescribeJobFlowsResult describeJobFlowsResult = client.describeJobFlows(describeJobFlowsRequest);
This got me a solution that works. Unfortunately, Amazon has deprecated the method but didn't provide an alternative, so I wish I had a non-deprecated solution; this is only a partial answer.
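For reference, here is a sketch of the corrected polling body, reusing the variables from the question and the same (now deprecated) DescribeJobFlows API:

DescribeJobFlowsRequest describeJobFlowsRequest =
    new DescribeJobFlowsRequest(describeJobFlowIdList);
// The missing piece: actually send the request, then read states off the result
DescribeJobFlowsResult describeJobFlowsResult =
    client.describeJobFlows(describeJobFlowsRequest);
for (JobFlowDetail jobFlow : describeJobFlowsResult.getJobFlows()) {
    String state = jobFlow.getExecutionStatusDetail().getState();
    if (DONE_STATES.contains(state)) {
        jobMonitoringNotDone = false;
    } else if (!lastState.equals(state)) {
        lastState = state;
        System.out.println("Job " + state + " at " + new Date());
    }
}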

WRONG_DOCUMENT_ERR error after login to SugarCRM from Java Axis 1.4

I want to import data from a Java web application into SugarCRM. I created the client stub using Axis, and then I try to connect. It seems to connect, since I can get the server information, but after login I get an error while retrieving the session ID.
The error is: "faultString: org.w3c.dom.DOMException: WRONG_DOCUMENT_ERR: A node is used in a different document than the one that created it."
Here is my code:
private static final String ENDPOINT_URL = "http://localhost/sugarcrm/service/v3/soap.php";

java.net.URL url = null;
try {
    url = new URL(ENDPOINT_URL);
} catch (MalformedURLException e1) {
    System.out.println("URL endpoint creation failed. Message: " + e1.getMessage());
    e1.printStackTrace();
}
System.out.println("URL endpoint created successfully!");

Sugarsoap service = new SugarsoapLocator();
SugarsoapPortType port = service.getsugarsoapPort(url);
Get_server_info_result result = port.get_server_info();
System.out.println(result.getGmt_time());
System.out.println(result.getVersion());
// I am getting the right answers

User_auth userAuth = new User_auth();
userAuth.setUser_name(USER_NAME);
MessageDigest md = MessageDigest.getInstance("MD5");
String password = convertToHex(md.digest(USER_PASSWORD.getBytes()));
userAuth.setPassword(password);

Name_value nameValueListLogin[] = null;
Entry_value loginResponse = null;
loginResponse = port.login(userAuth, "sugarcrm", nameValueListLogin);
String sessionID = loginResponse.getId(); // <--- Get error on this one
The nameValueListLogin could be from a different document context (coming from a different source). See if this link helps.
You may need to get more debugging/logging information so we can see what nameValueListLogin consists of and where it is coming from.

Error while creating a new "OpportunityLineItemSchedule" using SFDC Partner API

When I try to create a new OpportunityLineItemSchedule, I run into the following error:
Error code: INSUFFICIENT_ACCESS_ON_CROSS_REFERENCE_ENTITY
Error message: insufficient access rights on cross-reference id
Attached is the code snippet. Any help will be extremely useful.
SObject[] rs = new SObject[1];
MessageElement[] specificRS = new MessageElement[6];
specificRS[0] = new MessageElement(new QName("OpportunityLineItemId"), "00k7000000DFLqfAAH");
specificRS[1] = new MessageElement(new QName("Description"), "Rev Schedule Descr");
specificRS[2] = new MessageElement(new QName("Type"), "Quantity");
specificRS[3] = new MessageElement(new QName("Quantity"), (double) 2);
specificRS[4] = new MessageElement(new QName("Revenue"), (double) 400000.00);
specificRS[5] = new MessageElement(new QName("ScheduleDate"), "2010-10-30");

rs[0] = new SObject();
rs[0].setType("OpportunityLineItemSchedule");
rs[0].set_any(specificRS);

SaveResult[] sr = null;
try {
    sr = binding.create(rs);
} catch (Exception ex) {
    System.out.println("An unexpected error has occurred." + ex.getMessage());
    ex.printStackTrace();
    return;
}
The following works:
MessageElement[] specificRS2 = new MessageElement[5];
specificRS2[0] = new MessageElement(new QName("OpportunityLineItemId"), "00k7000000DFcOG");
// PricebookEntryId can be found by joining PricebookEntry and Pricebook2 tables (on Product2Id and
specificRS2[1] = new MessageElement(new QName("Description"), "Rev Schedule Descr2");
specificRS2[2] = new MessageElement(new QName("ScheduleDate"), "2010-10-31");
//specificRS[3] = new MessageElement(new QName("Quantity"), (double) 2);
specificRS2[3] = new MessageElement(new QName("Revenue"), (double) 10.00);
//specificRS[4] = new MessageElement(new QName("Type"), "Quantity"); // and/or "Revenue"
specificRS2[4] = new MessageElement(new QName("Type"), "Revenue"); // and/or "Quantity"

rs[1] = new SObject();
rs[1].setType("OpportunityLineItemSchedule");
rs[1].set_any(specificRS2);

SaveResult[] sr = null;
try {
    sr = binding.create(rs);
} catch (Exception ex) {
    System.out.println("An unexpected error has occurred." + ex.getMessage());
    ex.printStackTrace();
    return;
}
This is usually the error you get when code tries to use an ID for an object that doesn't exist, or that the user doesn't have access to. I take it the only difference between the two snippets is the OpportunityLineItem ID? Check that the user running the code can access the item with that ID.
Have a look at the Allowed Type Field Values and the Allowed Quantity and Revenue Field Values documentation for OpportunityLineItemSchedule.
"The allowed Type values for an OpportunityLineItemSchedule depend on the product-level schedule preferences and whether the line item has any existing schedules."
So you may need to check whether there are existing OpportunityLineItemSchedule records, as in the sketch below.
"The allowable Quantity and Revenue field values depend on the value of the Type field."
That is, you should only set the Quantity or the Revenue field, not both.
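A sketch of such a check, assuming the same Axis partner binding as in the question (the line item ID is the placeholder from the first snippet):

// Query existing schedules on the line item; existing rows constrain
// which Type values are allowed on new OpportunityLineItemSchedule records.
QueryResult qr = binding.query(
    "SELECT Id, Type FROM OpportunityLineItemSchedule " +
    "WHERE OpportunityLineItemId = '00k7000000DFLqfAAH'");
if (qr.getSize() > 0) {
    System.out.println("Found " + qr.getSize() + " existing schedule(s)");
}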
