Insert row without overwriting data - java

I end up here a lot from Google, and since I started trying to make an Android app for myself I'm stumped, because the v4 API isn't very helpful. I've searched on here, but I haven't seen an answer that answers my question.
I have a budget spreadsheet. My app is going to grab the data from the sheet and filter it down to my current pay week, so I can look at all my current and upcoming transactions and know how much extra I have to spend. Then I want to be able to insert a new transaction from the app; say I get gas, I want to add that in a couple of clicks rather than spend a few minutes editing the sheet in the Sheets app. Later I plan to customize the formula for the cell I'm putting it in, as well as copy it to the cells above and below it so the math works out.
I used the Android Quickstart to be able to read data easily enough. I've since tweaked that to filter the sheet's data down to what I need. I've even gone as far as getting it to write data to the sheet. The problem is that I can't find any example in Java/Android, in the same style as the quickstart, that shows how to insert a row in the middle of the sheet instead of overwriting the cells. I'm assuming I have to use insertDataOption=INSERT_ROWS somewhere, but I can't find anywhere to add it. The documentation suggests I use spreadsheets.values.append, but .append never shows up as an option; I only get BatchGet, BatchUpdate, Get, and Update.
I'm a beginner, and I'm sure the code I've pieced together after hours of googling is clumsy, but I'm also sure I'm just missing something easy. Any help would be appreciated.
private List<String> PostDataForApi() throws IOException {
    String spreadsheetID = getResources().getString(R.string.my_google_spreadsheet_id);
    Integer sheetID = getResources().getInteger(R.integer.my_google_sheet_id);

    // Build one row of three cells: a label, a formula, and a value
    List<RowData> rowData = new ArrayList<RowData>();
    List<CellData> cellData = new ArrayList<CellData>();
    String value = "test";
    String formula = "=IF(COUNTBLANK(C510) = 2,\"\",Sum(B511+(SUM(C510))))";
    String value2 = "999";

    CellData cell = new CellData();
    cell.setUserEnteredValue(new ExtendedValue().setStringValue(value));
    CellData cell2 = new CellData();
    cell2.setUserEnteredValue(new ExtendedValue().setFormulaValue(formula));
    CellData cell3 = new CellData();
    cell3.setUserEnteredValue(new ExtendedValue().setStringValue(value2));
    cellData.add(cell);
    cellData.add(cell2);
    cellData.add(cell3);
    rowData.add(new RowData().setValues(cellData));

    // AppendCells adds the row after the last row with data; it does not insert mid-sheet
    AppendCellsRequest appendCellReq = new AppendCellsRequest();
    appendCellReq.setSheetId(sheetID);
    appendCellReq.setRows(rowData);
    appendCellReq.setFields("userEnteredValue");

    List<Request> requests = new ArrayList<Request>();
    requests.add(new Request().setAppendCells(appendCellReq));
    BatchUpdateSpreadsheetRequest batchRequests = new BatchUpdateSpreadsheetRequest();
    batchRequests.setRequests(requests);

    BatchUpdateSpreadsheetResponse response =
            this.mService.spreadsheets().batchUpdate(spreadsheetID, batchRequests).execute();
    System.out.println(response.toPrettyString());
    return null;
}
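For reference, the spreadsheets.values.append call the documentation describes does exist in the v4 Java client, but only in newer versions of the generated library, which may be why it never shows up in autocomplete here; it is also where insertDataOption goes. A rough sketch, assuming an updated client, with a placeholder range and row values (note that append adds rows after the last row of a table, so inserting in the middle of a sheet still needs the two-step approach in the answer below):
// Sketch only: values().append() with insertDataOption, assuming a client version
// that exposes the append method. The range and row values are placeholders.
ValueRange appendBody = new ValueRange()
        .setValues(Arrays.asList(
                Arrays.<Object>asList("test", "=SUM(B511+C510)", "999")));
AppendValuesResponse appendResult = this.mService.spreadsheets().values()
        .append(spreadsheetID, "2016!A:C", appendBody)
        .setValueInputOption("USER_ENTERED")
        .setInsertDataOption("INSERT_ROWS") // append as new rows instead of overwriting
        .execute();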

I figured it out after some mind-numbing rounds of throwing things against the wall to see what sticks. I had to do it in two steps. The first request inserts a blank row into the sheet (the indexes are zero-based, so startIndex 32 / endIndex 33 gives you a new blank row 33). The second part then writes values into that blank row. I hope this helps someone in the future.
String spreadsheetID = getResources().getString(R.string.my_google_spreadsheet_id);
Integer sheetID = getResources().getInteger(R.integer.my_google_sheet_id);

// Step 1: insert a blank row. Indexes are zero-based, so startIndex 32 / endIndex 33
// inserts one new row that becomes row 33 in the sheet.
List<Request> requests = new ArrayList<Request>();
DimensionRange dimRange = new DimensionRange();
dimRange.setSheetId(sheetID);
dimRange.setDimension("ROWS");
dimRange.setStartIndex(32);
dimRange.setEndIndex(33);
InsertDimensionRequest insertDimensionRequest = new InsertDimensionRequest();
insertDimensionRequest.setRange(dimRange);
insertDimensionRequest.setInheritFromBefore(false);
requests.add(new Request().setInsertDimension(insertDimensionRequest));

BatchUpdateSpreadsheetRequest batchRequests = new BatchUpdateSpreadsheetRequest();
batchRequests.setRequests(requests);
BatchUpdateSpreadsheetResponse response =
        this.mService.spreadsheets().batchUpdate(spreadsheetID, batchRequests).execute();
System.out.println(response.toPrettyString());

// Step 2: write values into the newly inserted blank row.
List<List<Object>> argData = getData(entryTitle, entryValue);
ValueRange vRange = new ValueRange();
vRange.setRange("2016!A33");
vRange.setValues(argData);
List<ValueRange> vList = new ArrayList<>();
vList.add(vRange);
BatchUpdateValuesRequest batchRequest = new BatchUpdateValuesRequest();
batchRequest.setValueInputOption("USER_ENTERED"); // note: USER_ENTERED, not USER-ENTERED
batchRequest.setData(vList);
this.mService.spreadsheets().values().batchUpdate(spreadsheetID, batchRequest).execute();

I've been having so much unnecessary hassle with the v4 Google Sheets API that it was ridiculous. Therefore, I reverted to the gdata (v3) API, which is much easier to follow and much better documented.
Here is the link to setting it up, along with a few examples: https://developers.google.com/google-apps/spreadsheets/
The only thing missing from those notes was the authorization process, which was a hassle, but after some digging I was able to get an authorization code base working, as shown below.
public class YourClass {

    // Application name
    private static final String APPLICATION_NAME = "Your-Application-Name";

    // Service account ID and P12 key file (from the Google developer console)
    private static final String ACCOUNT_P12_ID = "Get-the-details-developer-console-google";
    private static final File P12FILE = new File("D:/path/Drive API Test-bf290e0ee314.p12");

    // Scopes
    private static final List<String> SCOPES = Arrays.asList(
            "https://docs.google.com/feeds",
            "https://spreadsheets.google.com/feeds");

    // Spreadsheet API URL
    private static final String SPREADSHEET_URL = "https://spreadsheets.google.com/feeds/spreadsheets/private/full";
    private static final URL SPREADSHEET_FEED_URL;

    static {
        try {
            SPREADSHEET_FEED_URL = new URL(SPREADSHEET_URL);
        } catch (MalformedURLException e) {
            throw new RuntimeException(e);
        }
    }

    // Authorize with the service account credentials
    private static Credential authorize() throws Exception {
        System.out.println("authorize in");
        HttpTransport httpTransport = GoogleNetHttpTransport.newTrustedTransport();
        JsonFactory jsonFactory = new JacksonFactory();
        GoogleCredential credential = new GoogleCredential.Builder()
                .setTransport(httpTransport)
                .setJsonFactory(jsonFactory)
                .setServiceAccountId(ACCOUNT_P12_ID)
                .setServiceAccountPrivateKeyFromP12File(P12FILE)
                .setServiceAccountScopes(SCOPES)
                .build();
        boolean ret = credential.refreshToken();
        // debug dump
        System.out.println("refreshToken:" + ret);
        System.out.println("AccessToken:" + credential.getAccessToken());
        System.out.println("authorize out");
        return credential;
    }

    // Get the authorized SpreadsheetService
    private static SpreadsheetService getService() throws Exception {
        System.out.println("service in");
        SpreadsheetService service = new SpreadsheetService(APPLICATION_NAME);
        service.setProtocolVersion(SpreadsheetService.Versions.V3);
        Credential credential = authorize();
        service.setOAuth2Credentials(credential);
        // debug dump
        System.out.println("Schema: " + service.getSchema().toString());
        System.out.println("Protocol: " + service.getProtocolVersion().getVersionString());
        System.out.println("ServiceVersion: " + service.getServiceVersion());
        System.out.println("service out");
        return service;
    }
}
From that onward, I was able to perform a number of inserts and appends to the sheets without any major hassle.
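To make the "inserts and appends" part concrete, here is a rough sketch of a row append via the gdata list feed (not from the original code; the worksheet index and the column keys "date" and "amount" are placeholders that must match the header row of the target worksheet):
// Sketch only: append a row through the list feed of the first worksheet.
// Column keys must match the worksheet's header row; values are placeholders.
SpreadsheetService service = getService();
SpreadsheetFeed feed = service.getFeed(SPREADSHEET_FEED_URL, SpreadsheetFeed.class);
SpreadsheetEntry spreadsheet = feed.getEntries().get(0);
WorksheetEntry worksheet = spreadsheet.getWorksheets().get(0);
URL listFeedUrl = worksheet.getListFeedUrl();
ListEntry row = new ListEntry();
row.getCustomElements().setValueLocal("date", "2016-10-01");
row.getCustomElements().setValueLocal("amount", "42.50");
service.insert(listFeedUrl, row);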

Related

Fetching a user's "other" Google contacts in Android

If I understand correctly, in order to fetch the user's Google contacts from my Android app, I should use the People API instead of the Contacts API. In my case, I want to get all of the user's contacts, including the "other contacts" (the ones listed under the Other contacts link in Google Contacts).
Up to now, I have successfully used the People API as shown below. First I provide the required scopes to the Google Sign-In options:
GoogleSignInOptions gso = new GoogleSignInOptions.Builder(GoogleSignInOptions.DEFAULT_SIGN_IN)
.requestIdToken(getString(R.string.default_web_client_id))
.requestServerAuthCode(getString(R.string.default_web_client_id))
.requestEmail()
.requestProfile()
.requestScopes(new Scope(PeopleServiceScopes.CONTACTS_READONLY))
.build();
mGoogleSignInClient = GoogleSignIn.getClient(this, gso);
Then I use my web client ID and secret to fetch the user's contacts:
public void getUserContacts () throws IOException {
HttpTransport httpTransport = new NetHttpTransport();
JacksonFactory jsonFactory = new JacksonFactory();
// Go to the Google API Console, open your application's
// credentials page, and copy the client ID and client secret.
// Then paste them into the following code.
String clientId = getString(R.string.webClientIDAutoCreated);
String clientSecret = getString(R.string.webClientIDSecretAutoCreated);
// Or your redirect URL for web based applications.
String redirectUrl = "urn:ietf:wg:oauth:2.0:oob";
String scope = "https://www.googleapis.com/auth/contacts.readonly";
String serverAuthCode = userSettings.getString(USER_PREFS_SERVER_AUTH_CODE,"");
// Step 1: Authorize -->
String authorizationUrl = new GoogleBrowserClientRequestUrl(clientId, redirectUrl, Arrays.asList(scope)).build();
// Point or redirect your user to the authorizationUrl.
System.out.println("Go to the following link in your browser:");
System.out.println(authorizationUrl);
// Read the authorization code from the standard input stream.
BufferedReader in = new BufferedReader(new InputStreamReader(System.in));
System.out.println("What is the authorization code?");
String code = in.readLine();
// End of Step 1 <--
// Step 2: Exchange -->
GoogleTokenResponse tokenResponse = new GoogleAuthorizationCodeTokenRequest(httpTransport, jsonFactory, clientId, clientSecret, serverAuthCode, redirectUrl).execute();
// End of Step 2 <--
GoogleCredential credential = new GoogleCredential.Builder()
.setTransport(httpTransport)
.setJsonFactory(jsonFactory)
.setClientSecrets(clientId, clientSecret)
.build()
.setFromTokenResponse(tokenResponse);
PeopleService peopleService = new PeopleService.Builder(httpTransport, jsonFactory, credential)
.setApplicationName(getString(R.string.app_name))
.build();
ListConnectionsResponse response = peopleService.people().connections()
.list("people/me")
.setPersonFields("names,emailAddresses")
.execute();
// Print display name of connections if available.
List<Person> connections = response.getConnections();
if (connections != null && connections.size() > 0) {
for (Person person : connections) {
List<Name> names = person.getNames();
if (names != null && names.size() > 0) {
myLog(TAG,DEBUG_OK,"Name: " + person.getNames().get(0).getDisplayName());
List<EmailAddress> emailAddresses = person.getEmailAddresses();
if (emailAddresses != null && emailAddresses.size() > 0) {
for (EmailAddress email: emailAddresses)
myLog(TAG,DEBUG_OK,"email: " + email.getValue());
}
}
else {
myLog(TAG,DEBUG_OK,"No names available for connection.");
}
}
}
else {
System.out.println("No connections found.");
}
}
I was hoping that this would get all available contacts; however, it returns only a small subset. So my question is whether I need to pass or use any other scopes to read all contacts, including the "other contacts" list.
The People API doesn't appear to support the "Other Contacts" contacts as described in this answer. You should use the Contacts API to get the data you want.
The People API now allows fetching Other Contacts, as described here:
https://developers.google.com/people/v1/other-contacts
ListOtherContactsResponse response = peopleService.otherContacts().list()
.setReadMask("metadata,names,emailAddresses")
.execute();
List<Person> otherContacts = response.getOtherContacts();
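Note that listing Other Contacts appears to require its own scope in addition to contacts.readonly; per the page linked above it would be requested roughly like this (the scope string is taken from that documentation, so treat this as an assumption):
// Assumed additional scope for Other Contacts, added to the sign-in options
GoogleSignInOptions gso = new GoogleSignInOptions.Builder(GoogleSignInOptions.DEFAULT_SIGN_IN)
        .requestIdToken(getString(R.string.default_web_client_id))
        .requestServerAuthCode(getString(R.string.default_web_client_id))
        .requestEmail()
        .requestScopes(
                new Scope(PeopleServiceScopes.CONTACTS_READONLY),
                new Scope("https://www.googleapis.com/auth/contacts.other.readonly"))
        .build();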

Importing csv data from Storage to Cloud SQL not working - status always "pending"

I am new to Java (I have experience with C#, though).
Sadly, I inherited a terrible project (the code is terrible), and what I need to accomplish is to import some CSV files into Cloud SQL.
So there's a WS which runs this task; apparently the dev followed this guide to import data, but it is not working. Here's the code (just the essential parts; the actual code is longer and uglier):
InstancesImportRequest requestBody = new InstancesImportRequest();
ImportContext ic = new ImportContext();
ic.setKind("sql#importContext");
ic.setFileType("csv");
ic.setUri(bucketPath);
ic.setDatabase(CLOUD_SQL_DATABASE);
CsvImportOptions csv = new CsvImportOptions();
csv.setTable(tablename);
List<String> list = new ArrayList<String>();
// here there is some code that populates the list with the columns
csv.setColumns(list);
ic.setCsvImportOptions(csv);
requestBody.setImportContext(ic);
SQLAdmin sqlAdminService = createSqlAdminService();
SQLAdmin.Instances.SQLAdminImport request = sqlAdminService.instances().sqladminImport(project, instance, requestBody);
Operation response = request.execute();
System.out.println("Executed : Going to sleep.>"+response.getStatus());
int c = 1;
while(!response.getStatus().equalsIgnoreCase("Done")){
Thread.sleep(10000);
System.out.println("sleeped enough >"+response.getStatus());
c++;
if(c==50){
System.out.println("timeout?");
break;
}
}
public static SQLAdmin createSqlAdminService() throws IOException, GeneralSecurityException {
HttpTransport httpTransport = GoogleNetHttpTransport.newTrustedTransport();
JsonFactory jsonFactory = JacksonFactory.getDefaultInstance();
GoogleCredential credential = GoogleCredential.getApplicationDefault();
if (credential.createScopedRequired()) {
credential =
credential.createScoped(Arrays.asList("https://www.googleapis.com/auth/cloud-platform"));
}
return new SQLAdmin.Builder(httpTransport, jsonFactory, credential)
.setApplicationName("Google-SQLAdminSample/0.1")
.build();
}
I am not quite sure how the response should be treated; it seems to be an async request. Either way, I always get status Pending; it seems the import never even starts executing.
Of course, it ends up timing out. What is wrong here, and why does the request never start? I couldn't find any actual example on the internet of using this Java SDK to import files, except the link I gave above.
Well, the thing is that the response object is a one-time snapshot: it will always return "Pending" because that status is just a string set on the object when the operation was created; it is not being updated.
To get the actual status, you have to request it from Google using the SDK. I did something like this (it would be better to use a smaller sleep time and make it grow as you retry, as sketched after the code below):
SQLAdmin.Instances.SQLAdminImport request = sqlAdminService.instances().sqladminImport(CLOUD_PROJECT, CLOUD_SQL_INSTANCE, requestBody);
// execution of our import request
Operation response = request.execute();
int tried = 0;
Operation statusOperation;
do {
// sleep one minute
Thread.sleep(60000);
// here we are requesting the status of our operation. Name is actually the unique identifier
Get requestStatus = sqlAdminService.operations().get(CLOUD_PROJECT, response.getName());
statusOperation = requestStatus.execute();
tried++;
System.out.println("status is: " + statusOperation.getStatus());
} while(!statusOperation.getStatus().equalsIgnoreCase("DONE") && tried < 10);
if (!statusOperation.getStatus().equalsIgnoreCase("DONE")) {
throw new Exception("import failed: Timeout");
}
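A small variation of that loop with a growing sleep, as suggested above; the initial delay, cap, and retry limit are arbitrary choices:
// Poll the operation with a growing delay: start at 5 seconds, double it, cap at one minute.
long delayMs = 5000;
int tried = 0;
Operation statusOperation;
do {
    Thread.sleep(delayMs);
    delayMs = Math.min(delayMs * 2, 60000);
    statusOperation = sqlAdminService.operations()
            .get(CLOUD_PROJECT, response.getName())
            .execute();
    tried++;
    System.out.println("status is: " + statusOperation.getStatus());
} while (!statusOperation.getStatus().equalsIgnoreCase("DONE") && tried < 20);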

How to authenticate emails to open a Google Sheet (v4) in Java - example

I created a Google Sheet via the Google Sheets API v4 in Java. I am trying to authenticate emails (users) which can open the Google Sheet. Say I have 4 email IDs:
abc#gmail.com
123#gmail.com
324#gmail.com
xyz#gmail.com
All 4 of these users should be able to open the Google Sheet.
I have written the code below; please help me rectify this problem.
public static void insertDataInSheet(String spreadsheetId){
System.out.print("sheet id : >> "+spreadsheetId);
String range = "Sheet1!A2:E20";
try{
Quickstart.authorize();
Sheets service = getSheetsService();
List<Object> temp = new ArrayList();
temp.add("HHHHHHNN");
temp.add("ON");
temp.add("04/09/2017");
temp.add("500305");
temp.add("2399400");
List<Object> temp1 = new ArrayList();
temp1.add("bata");
temp1.add("OFF");
temp1.add("02/09/2017");
temp1.add("1203994");
temp1.add("5536771");
List<List<Object>> values = Arrays.asList(temp,temp1);
List<ValueRange> data = new ArrayList<ValueRange>();
data.add(new ValueRange().setRange(range).setValues(values));
System.out.println(">>>>>>>>>>>");
BatchUpdateValuesRequest body = new BatchUpdateValuesRequest()
.setValueInputOption("USER_ENTERED")
.setData(data);
BatchUpdateValuesResponse result =service.spreadsheets().values().batchUpdate(spreadsheetId, body).execute();
}catch(Exception ex){
ex.printStackTrace();
}
}
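The code above only writes values; it does not control which accounts can open the sheet. One common way to grant specific users access is through the Drive API permissions resource, sketched below under the assumption that a Drive service (driveService) has been built with the same credential and a Drive scope; the email address is a placeholder:
// Sketch only: share the spreadsheet with one account via the Drive v3 API.
// driveService is an assumed com.google.api.services.drive.Drive instance.
Permission permission = new Permission()
        .setType("user")
        .setRole("writer") // or "reader" for view-only access
        .setEmailAddress("abc@gmail.com");
driveService.permissions()
        .create(spreadsheetId, permission)
        .setSendNotificationEmail(false)
        .execute();
Repeating the permissions().create() call for each of the 4 addresses would cover all of them.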

Trouble building Shapefile in Geotools

I have a project where I want to load in a given shapefile, and pick out polygons above a certain size before writing the results to a new shapefile. Maybe not the most efficient, but I've got code that successfully does all of that, right up to the point where it is supposed to write the shapefile. I get no errors, but the resulting shapefile has no usable data in it. I've followed as many tutorials as possible, but still I'm coming up blank.
The first bit of code is where I read in a shapefile, pick out the polygons I want, and put them into a feature collection. This part seems to work fine as far as I can tell.
public class ShapefileTest {
public static void main(String[] args) throws MalformedURLException, IOException, FactoryException, MismatchedDimensionException, TransformException, SchemaException {
File oldShp = new File("Old.shp");
File newShp = new File("New.shp");
//Get data from the original ShapeFile
Map<String, Object> map = new HashMap<String, Object>();
map.put("url", oldShp.toURI().toURL());
//Connect to the dataStore
DataStore dataStore = DataStoreFinder.getDataStore(map);
//Get the typeName from the dataStore
String typeName = dataStore.getTypeNames()[0];
//Get the FeatureSource from the dataStore
FeatureSource<SimpleFeatureType, SimpleFeature> source = dataStore.getFeatureSource(typeName);
SimpleFeatureCollection collection = (SimpleFeatureCollection) source.getFeatures(); //Get all of the features - no filter
//Start creating the new Shapefile
final SimpleFeatureType TYPE = createFeatureType(); //Calls a method that builds the feature type - tested and works.
DefaultFeatureCollection newCollection = new DefaultFeatureCollection(); //To hold my new collection
try (FeatureIterator<SimpleFeature> features = collection.features()) {
while (features.hasNext()) {
SimpleFeature feature = features.next(); //Get next feature
SimpleFeatureBuilder fb = new SimpleFeatureBuilder(TYPE); //Create a new SimpleFeature based on the original
Integer level = (Integer) feature.getAttribute(1); //Get the level for this feature
MultiPolygon multiPoly = (MultiPolygon) feature.getDefaultGeometry(); //Get the geometry collection
//First count how many new polygons we will have
int numNewPoly = 0;
for (int i = 0; i < multiPoly.getNumGeometries(); i++) {
double area = getArea(multiPoly.getGeometryN(i));
if (area > 20200) {
numNewPoly++;
}
}
//Now build an array of the larger polygons
Polygon[] polys = new Polygon[numNewPoly]; //Array of new geometies
int iPoly = 0;
for (int i = 0; i < multiPoly.getNumGeometries(); i++) {
double area = getArea(multiPoly.getGeometryN(i));
if (area > 20200) { //Write the new data
polys[iPoly] = (Polygon) multiPoly.getGeometryN(i);
iPoly++;
}
}
GeometryFactory gf = new GeometryFactory(); //Create a geometry factory
MultiPolygon mp = new MultiPolygon(polys, gf); //Create the MultiPolygonyy
fb.add(mp); //Add the geometry collection to the feature builder
fb.add(level);
fb.add("dBA");
SimpleFeature newFeature = SimpleFeatureBuilder.build( TYPE, new Object[]{mp, level,"dBA"}, null );
newCollection.add(newFeature); //Add it to the collection
}
} //close the try-with-resources block over the feature iterator
At this point I have a collection that looks right - it has the correct bounds and everything. The next bit of code is where I put it into a new shapefile.
//Time to put together the new Shapefile
Map<String, Serializable> newMap = new HashMap<String, Serializable>();
newMap.put("url", newShp.toURI().toURL());
newMap.put("create spatial index", Boolean.TRUE);
DataStore newDataStore = DataStoreFinder.getDataStore(newMap);
newDataStore.createSchema(TYPE);
String newTypeName = newDataStore.getTypeNames()[0];
SimpleFeatureStore fs = (SimpleFeatureStore) newDataStore.getFeatureSource(newTypeName);
Transaction t = new DefaultTransaction("add");
fs.setTransaction(t);
fs.addFeatures(newCollection);
t.commit();
ReferencedEnvelope env = fs.getBounds();
}
}
I put in that very last line to check the bounds of the FeatureStore fs, and it comes back null. Sure enough, when loading the newly created shapefile (which DOES get created and is about the right size), nothing shows up.
The solution actually had nothing to do with the code I posted - it had everything to do with my FeatureType definition. I did not include "the_geom" in my polygon feature type, so no geometry was getting written to the file.
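For anyone hitting the same problem, here is a rough sketch of what a corrected createFeatureType() could look like; the type name, CRS, and attribute names are assumptions, since the original method was not posted:
// Sketch only: a feature type that includes the geometry attribute ("the_geom").
private static SimpleFeatureType createFeatureType() {
    SimpleFeatureTypeBuilder builder = new SimpleFeatureTypeBuilder();
    builder.setName("FilteredPolygons");
    builder.setCRS(DefaultGeographicCRS.WGS84); // or the CRS of the source shapefile
    builder.add("the_geom", MultiPolygon.class); // the geometry column that was missing
    builder.add("level", Integer.class);
    builder.length(10).add("units", String.class); // e.g. "dBA"
    return builder.buildFeatureType();
}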
I believe you are missing the step to finalize/close the transaction. Try adding this after the t.commit() line.
t.close();
As an expedient alternative, you might try out the Shapefile dumper utility mentioned in the Shapefile DataStores docs. Using that may simplify your second code block into two or three lines.

How to get the Index Size in Solr using Java

I need to get the total size of an index in Apache Solr using Java. The following code gets the total number of documents, but I am looking for the size. With the use of ReplicationHandler, I was thinking I could get the index size, as suggested by someone in this link: http://lucene.472066.n3.nabble.com/cheking-the-size-of-the-index-using-solrj-API-s-td692686.html but I am not getting the index size.
BufferedWriter out1 = null;
FileWriter fstream1 = new FileWriter("src/test/resources/solr-document-id-desc.txt");
out1 = new BufferedWriter(fstream1);
ApplicationContext context = null;
context = new ClassPathXmlApplicationContext("application-context.xml");
CommonsHttpSolrServer solrServer = (CommonsHttpSolrServer) context.getBean("solrServer");
SolrQuery solrQuery = new SolrQuery().setQuery("*:*");
QueryResponse rsp = solrServer.query(solrQuery);
//I am trying to use replicationhandler but I am not able to get the index size using statistics. Is there any way to get the index size..?
ReplicationHandler handler2 = new ReplicationHandler();
System.out.println( handler2.getDescription());
NamedList statistics = handler2.getStatistics();
System.out.println("Statistics "+ statistics);
System.out.println(rsp.getResults().getNumFound());
Iterator<SolrDocument> iter = rsp.getResults().iterator();
while (iter.hasNext()) {
SolrDocument resultDoc = iter.next();
System.out.println(resultDoc.getFieldNames());
String id = (String) resultDoc.getFieldValue("numFound");
String description = (String) resultDoc.getFieldValue("description");
System.out.println(id+"~~"+description);
out1.write(id+"~~"+description);
out1.newLine();
}
out1.close();
Any suggestions will be appreciated.
Updated code:
ReplicationHandler handler2 = new ReplicationHandler();
System.out.println( handler2.getDescription());
NamedList statistics = handler2.getStatistics();
System.out.println("Statistics "+ statistics.get("indexSize"));
The index size is available in the statistics of ReplicationHandler (org.apache.solr.handler.ReplicationHandler):
public NamedList getStatistics() {
    NamedList list = super.getStatistics();
    if (core != null) {
        list.add("indexSize", NumberUtils.readableSize(getIndexSize()));
    }
    return list;
}
You can use the URL http://localhost:8983/solr/replication?command=details, which returns the index size.
<lst name="details">
<str name="indexSize">26.13 KB</str>
.....
</lst>
Not sure if that works when instantiating ReplicationHandler yourself, as it would need a reference to the core and the index.
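If you want the same details from SolrJ without instantiating the handler, something along these lines should work against the existing solrServer (a sketch; the response layout can vary between Solr versions):
// Sketch only: query the running ReplicationHandler over HTTP via SolrJ.
ModifiableSolrParams params = new ModifiableSolrParams();
params.set("command", "details");
QueryRequest detailsRequest = new QueryRequest(params);
detailsRequest.setPath("/replication");
NamedList<Object> detailsResponse = solrServer.request(detailsRequest);
NamedList<Object> details = (NamedList<Object>) detailsResponse.get("details");
System.out.println("indexSize: " + details.get("indexSize"));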
You can use this command in the data directory:
- du -kx
As said in this post, you can use the MAT tool in order to see the memory consumption. I think you could use that in your code. Enjoy Solr!
