I have written a Java app that synchronises Google Groups on our Google Apps for Education domain (similar in function to Google Apps School Directory Sync, but customised for some of our specific needs).
The synchronisation works, but it is slow because it is performing each task individually. I know that there are API interfaces for batching operations, but I can't find any examples of how this is implemented with the Java API.
The code I'm using looks similar to this (authentication and other setup is taken care of elsewhere):
try {
    // Add one member to the group with a single API call
    Member m = new Member();
    m.setEmail(member);
    m.setRole("MEMBER");
    service.members().insert(group, m).execute();
} catch (Exception e) {
    // ERROR handling
}
Instead of executing these operations one-by-one, I would like to batch them instead. Can anyone tell me how?
Look here: Batch Java API
For example:
BatchRequest batch = new BatchRequest(httpTransport, httpRequestInitializer);
batch.setBatchUrl(new GenericUrl(/*your customized batch URL goes here*/));
batch.queue(httpRequest1, dataClass, errorClass, callback);
batch.queue(httpRequest2, dataClass, errorClass, callback);
batch.execute();
Remember that:
The body of each part is itself a complete HTTP request, with its own
verb, URL, headers, and body. The HTTP request must only contain the
path portion of the URL; full URLs are not allowed in batch requests.
UPDATE
Also look here for how to build a batch request with the Google batch API:
https://github.com/google/google-api-java-client
UPDATE 2
Try something like this:
// Create the Storage service object
Storage storage = new Storage(httpTransport, jsonFactory, credential);
// Create a new batch request
BatchRequest batch = storage.batch();
// Add some requests to the batch request
storage.objectAccessControls().insert("bucket-name", "object-key1",
new ObjectAccessControl().setEntity("user-123423423").setRole("READER"))
.queue(batch, callback);
storage.objectAccessControls().insert("bucket-name", "object-key2",
new ObjectAccessControl().setEntity("user-guy#example.com").setRole("READER"))
.queue(batch, callback);
storage.objectAccessControls().insert("bucket-name", "object-key3",
new ObjectAccessControl().setEntity("group-foo#googlegroups.com").setRole("OWNER"))
.queue(batch, callback);
// Execute the batch request. The individual callbacks will be called when requests finish.
batch.execute();
From here: Batch request with Google Storage Json Api (JAVA)
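Applied to the Directory API members call from the question, the same pattern would look roughly like the sketch below (untested; it assumes service is your authenticated Directory client, group is the group key, and memberEmails is a hypothetical list of addresses you want to add):
// Build one batch request from the authenticated Directory client
BatchRequest batch = service.batch();

// One callback shared by all queued inserts; onSuccess/onFailure fire per part
JsonBatchCallback<Member> callback = new JsonBatchCallback<Member>() {
    @Override
    public void onSuccess(Member m, HttpHeaders responseHeaders) {
        System.out.println("Added " + m.getEmail());
    }

    @Override
    public void onFailure(GoogleJsonError e, HttpHeaders responseHeaders) {
        System.err.println("Insert failed: " + e.getMessage());
    }
};

// Queue each insert instead of executing it individually
// (memberEmails is a placeholder for your own list of addresses)
for (String email : memberEmails) {
    Member m = new Member();
    m.setEmail(email);
    m.setRole("MEMBER");
    service.members().insert(group, m).queue(batch, callback);
}

// One HTTP round trip for the whole batch; callbacks run as parts complete
batch.execute();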
Related
I'm using the Elasticsearch high-level REST client to do some custom reindexing from another cluster.
ReindexRequest reindexRequest = new ReindexRequest()
    .setMaxDocs(3000)
    .setDestIndex("my-new-index")
    .setTimeout(TimeValue.timeValueHours(1))
    .setRemoteInfo(
        new RemoteInfo("http", "otherhost", 9200, "/",
            new BytesArray(selectQuery.toString()), "user", "password",
            Map.of(), TimeValue.timeValueSeconds(30),
            TimeValue.timeValueSeconds(30)))
    .setSourceIndices("old-index")
    .setScript(new Script(ScriptType.STORED, null, "add_sc_id", Map.of("sc_id", someId)));
TaskSubmissionResponse task = esClient.submitReindexTask(reindexRequest, RequestOptions.DEFAULT);
I periodically check the task to see if it's done or not using the task API
Optional<GetTaskResponse> getTaskResponse =
esClient.tasks().get(new GetTaskRequest(nodeId, taskId), RequestOptions.DEFAULT);
Using this, I can see if the task is completed with getTaskResponse.get().isCompleted(), but I don't see any way to check whether it was successful or not.
By doing GET _tasks/nodeId:taskId with curl, I see there is a response.failures field.
Is there a way to retrieve this field with the Java High level rest api client? Or is there another way to achieve this?
Please check the List Tasks API.
You can use the Java code below to get task failure information:
ListTasksRequest request = new ListTasksRequest();
request.setActions("cluster:*");
request.setNodes("nodeId1", "nodeId2");
request.setParentTaskId(new TaskId("parentTaskId", 42));
ListTasksResponse response = client.tasks().list(request, RequestOptions.DEFAULT);
List<TaskOperationFailure> taskFailures = response.getTaskFailures();
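If the high-level GetTaskResponse doesn't surface the failures of your completed reindex task, another option is to drop down to the low-level client and read the raw task document, which contains the response.failures field you saw with curl. This is only a sketch: it assumes esClient is your RestHighLevelClient and nodeId/taskId identify the reindex task, and the exact JSON parsing helpers vary by client version.
// GET the raw task document via the low-level client
Request taskRequest = new Request("GET", "/_tasks/" + nodeId + ":" + taskId);
Response rawResponse = esClient.getLowLevelClient().performRequest(taskRequest);
String body = EntityUtils.toString(rawResponse.getEntity());

// Parse the JSON and inspect response.failures
Map<String, Object> json = XContentHelper.convertToMap(JsonXContent.jsonXContent, body, true);
Map<String, Object> taskResponse = (Map<String, Object>) json.get("response");
List<?> failures = taskResponse == null ? List.of() : (List<?>) taskResponse.get("failures");
if (!failures.isEmpty()) {
    // The reindex finished, but some documents failed
    failures.forEach(f -> System.err.println("Reindex failure: " + f));
}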
I have a function app: func1 (HttpTrigger) -> blob storage -> func2 (BlobTrigger). In Application Insights, two separate request telemetry items are generated with different operation IDs, and each has its own end-to-end transaction trace.
To get an end-to-end trace for the whole app, I would like to correlate the two functions by setting the parent ID and operation ID of func2 to the request ID and operation ID of func1, so that both show up in Application Insights as one end-to-end trace.
I have tried the following code, but it had no effect, and there is a lack of documentation on how to use the Application Insights Java SDK for customizing telemetry in general.
#FunctionName("Create-Thumbnail")
#StorageAccount(Config.STORAGE_ACCOUNT_NAME)
#BlobOutput(name = "$return", path = "output/{name}")
public byte[] generateThumbnail(
#BlobTrigger(name = "blob", path = "input/{name}")
byte[] content,
final ExecutionContext context
) {
try {
TelemetryConfiguration configuration = TelemetryConfiguration.getActive();
TelemetryClient client = new TelemetryClient(configuration);
client.getContext().getOperation().setParentId("MY_CUSTOM_PARENT_ID");
client.flush();
return Converter.createThumbnail(content);
} catch (Exception e) {
e.printStackTrace();
return content;
}
}
Can anyone with knowledge in this area provide some tips?
I'm afraid it can't be achieved, as the official doc says:
In C# and JavaScript, you can use an Application Insights SDK to write
custom telemetry data.
If you need to set custom telemetry, you would have to add an Application Insights Java SDK to your function, but I haven't found such an SDK... If there's any progress, I'll update here.
I'm trying to implement multipart upload in Java, following this sample: https://docs.aws.amazon.com/AmazonS3/latest/dev/llJavaUploadFile.html
But my actual task is a bit more complicated: I need to support resuming in case the application was shut down during uploading. Also, I can't use TransferManager - I need to use the low-level API for a particular reason.
The code there is pretty straightforward, but the problem comes with the List<PartETag> partETags part. When finalizing a resumed upload, I need to have this collection, previously filled during the upload process. And, obviously, if I'm trying to finalize the upload after an application restart, I don't have this collection anymore.
So the question is: how do I finalize a resumed upload? Is it possible to obtain the List<PartETag> partETags from the server using some API? What I have is only a MultipartUpload object.
Get the list of multipart uploads in progress
MultipartUploadListing multipartUploadListing =
s3Client.listMultipartUploads(new ListMultipartUploadsRequest(bucketName));
For each uploadId and keyName in that listing:
Get the list of parts for each uploadId and key
PartListing partListing =
    s3Client.listParts(new ListPartsRequest(bucketName, key, uploadId));
Get the list of part summaries
List<PartSummary> parts = partListing.getParts();
From each PartSummary, read getETag() and getPartNumber():
for(PartSummary part: parts)
{
part.getETag();
part.getPartNumber();
}
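To actually finalize the resumed upload, you can rebuild the List<PartETag> from those part summaries and pass it to completeMultipartUpload. A rough sketch along those lines (assuming bucketName, key and uploadId identify the upload you are resuming):
// Rebuild the PartETag list from the parts the server already has
List<PartETag> partETags = new ArrayList<>();
for (PartSummary part : parts) {
    partETags.add(new PartETag(part.getPartNumber(), part.getETag()));
}

// ... upload any missing parts here and add their PartETags as well ...

// Finalize the resumed multipart upload with the reconstructed list
CompleteMultipartUploadRequest completeRequest =
    new CompleteMultipartUploadRequest(bucketName, key, uploadId, partETags);
s3Client.completeMultipartUpload(completeRequest);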
Amazon S3 SDK Package
AmazonS3 client
I have a system with a Windows COM interface so that external applications can connect to it. It has the following details:
Interface: InterfaceName
Flags: (1234) Dual OleAutomation Dispatchable
GUID: {ABCDEFG-ABCD-1234-ABCD-ABCDE1234}
I'd like to connect to this interface from a Java Spring application; it will send a request to this interface and process the response.
I've tried the following code:
ActiveXComponent mf = new ActiveXComponent("ApplicationName.InterfaceName");
try {
Dispatch f2 = mf.QueryInterface(" {ABCDEFG-ABCD-1234-ABCD-ABCDE1234} ");
Dispatch.put(f2, 201, new Variant("Request String"));
} catch (Exception e) {
e.printStackTrace();
}
The executable file opens, but it doesn't do what I want. I want to do the following:
1) How do I make sure my interface has been registered? I can see it under Computer\HKEY_CLASSES_ROOT\ApplicationName.InterfaceName.
2) Using ActiveXComponent opens a new instance of the application, which is not required; the application is already running.
3) Call the interface with a dispid.
4) Retrieve the response from the call/put/invoke (whichever suits my requirement best) and process the response.
This is my first time working with a Java-COM interface and I don't have much experience with it. I could find very few examples on the internet, and I tried to adapt the example I found to my project. I'm also not sure whether the approach I'm taking to call the interface is correct. I'd be glad if you could give me a hand!
I have resolved this using the JACOB library.
1) Download JACOB folder from here.
2) Check that your application is working and has details under
Computer\HKEY_CLASSES_ROOT\ApplicationName.InterfaceName
3) Make sure the ApplicationName.dll file is registered. If not, use this link for more info:
regsvr32
4) Use the simple Java code below to send data to the COM interface.
Dispatch dispatch = new Dispatch("Application.InterfaceName");
Variant response = Dispatch.call(dispatch, <DISPID>, message);
System.out.println(response.getString()); // print the response
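As an aside, if you need to attach to the instance that is already running rather than launching a new one (your point 2), JACOB's ActiveXComponent.connectToActiveInstance may help. A rough sketch, assuming ApplicationName.InterfaceName is the registered ProgID and 201 is the dispid from your example:
// Try to attach to a running instance instead of starting a new one
ActiveXComponent app = ActiveXComponent.connectToActiveInstance("ApplicationName.InterfaceName");
if (app == null) {
    // No running instance found; fall back to creating one
    app = new ActiveXComponent("ApplicationName.InterfaceName");
}

// Invoke by dispid and read the result
Variant response = Dispatch.call(app, 201, new Variant("Request String"));
System.out.println(response.getString());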
Hope this helps.
I have seen multiple threads regarding the use of XStorable.storeToURL and vnd.sun.star.webdav://domain:8080//path/to/document_library to save OO documents to a WebDAV library folder. However, I have not seen a posting where someone has successfully used this in Java. While the UCB URL vnd.sun.star.webdav://domain:8080//path/to/document_library/doc.odt works when using the File, Save menu options within OO Writer, I am prompted for a username and password. Supplying the user and password via vnd.sun.star.webdav://user:password@domain:8080/ has not worked for me. I need to use this method from within a Java class to save an OO document. Has anyone had success using the following or something similar?
xStorable.storeToURL("vnd.sun.star.webdav://domain:8080/path/to/document_library/doc.odt", storeProps)
In the OO Developer's Guide, there is a paragraph regarding WebDav authentication:
DAV resources that require authentication are accessed using the interaction handler mechanism of the UCB. The DAV content calls an interaction handler supplied by the client to let it handle an authentication request. The implementation of the interaction handler collects the user name or password from a location, for example, a login dialog, and supplies this data as an interaction response.
Maybe this is related to the issue? If so, how to use an interaction handler for the authentication when trying to storeToURL via webdav?
The missing InteractionHandler was the issue. With it added to the store properties, documents can be saved via storeToURL by passing the handler in as an argument:
String oooExeFolder = "C:/OpenOffice/program";
XComponentContext xLocalContext = BootstrapSocketConnector.bootstrap(oooExeFolder);
XMultiComponentFactory xLocalServiceManager = xLocalContext.getServiceManager();
Object interactionHandler = xLocalServiceManager.createInstanceWithContext("com.sun.star.task.InteractionHandler", xLocalContext);
XInteractionHandler xHandler = (XInteractionHandler) UnoRuntime.queryInterface(XInteractionHandler.class, interactionHandler);
PropertyValue[] storeProps = new PropertyValue[1];
storeProps[0] = new PropertyValue();
storeProps[0].Name = "InteractionHandler";
storeProps[0].Value = xHandler;
xStorable.storeToURL("vnd.sun.star.webdav://domain:8080/path/to/document_library/doc.odt", storeProps);