Can I store and retrieve a .kml file from an SQLite database - java

I'm using OSMNavigator, and I'm trying to retrieve raw KML from the SQLite database, but I couldn't find any solution. Can I store only the filename and retrieve the file in parseKMLStream?

If you can get an InputStream on what you want to read, then you should be able to use the KmlDocument method:
public boolean parseKMLStream(InputStream stream, ZipFile kmzContainer)
(just pass null for the kmzContainer param)
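For instance, if the raw KML is stored as a BLOB in the SQLite database, a minimal sketch could look like the one below (the table and column names, and the helper method itself, are illustrative assumptions, not part of OSMNavigator):
import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import org.osmdroid.bonuspack.kml.KmlDocument;

// Illustrative sketch: reads raw KML bytes from a BLOB column and parses them.
// The table/column names ("kml_files", "name", "content") are assumptions.
KmlDocument loadKmlFromDb(SQLiteDatabase db, String fileName) {
    KmlDocument kmlDocument = new KmlDocument();
    Cursor cursor = db.rawQuery(
            "SELECT content FROM kml_files WHERE name = ?", new String[]{fileName});
    if (cursor.moveToFirst()) {
        byte[] kmlBytes = cursor.getBlob(0);                      // raw KML stored as a BLOB
        InputStream stream = new ByteArrayInputStream(kmlBytes);
        kmlDocument.parseKMLStream(stream, null);                 // null: not a KMZ archive
    }
    cursor.close();
    return kmlDocument;
}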

Related

Output data appears in a random order when uploaded to google cloud storage

I've been using the google-dataflow-sdk to upload CSV files to Google Cloud Storage.
When I upload a file to a Google Cloud project, my data appears in the output file in a random order. Each line of the CSV is correct, but the rows are all over the place.
The header of the CSV (i.e. attribute, attribute, attribute) ends up on some other line every time and never at the top where it should be. I stress again, the data in each column is fine; it is just the rows that are randomly positioned.
Here is the code which reads the data initially:
PCollection<String> csvData = pipeline.apply(TextIO.Read.named("ReadItems")
        .from(filename));
and this is the code that writes to the google cloud project:
csvData.apply(TextIO.Write.named("WriteToCloud")
        .to("gs://dbm-poc/" + partnerId + "/" + dateOfReport + modifiedFileName)
        .withSuffix(".csv"));
Thanks for any help.
Firstly, to fix your header, use:
public static TextIO.Write.Bound<String> withHeader(@Nullable String header)
https://cloud.google.com/dataflow/java-sdk/JavaDoc/com/google/cloud/dataflow/sdk/io/TextIO.Write#withHeader-java.lang.String-
For example:
...
TextIO.Write.withHeader("<header>").apply(..)
...
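Applied to the pipeline in the question, that could look roughly as follows (assuming the SDK version in use has TextIO.Write.Bound#withHeader; the header string is a placeholder):
// Rough sketch based on the question's write step; the header string is a placeholder.
csvData.apply(TextIO.Write.named("WriteToCloud")
        .to("gs://dbm-poc/" + partnerId + "/" + dateOfReport + modifiedFileName)
        .withHeader("attribute1,attribute2,attribute3")
        .withSuffix(".csv"));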
Secondly, Dataflow does not currently support ordered/sorted writing to Sinks. This is most likely due to its distributed/parallel architecture. You could write your own custom Sink if you really wanted to. See the similar question here for more details.
Whilst I agree that the answer provided by Graham Polley is correct, I managed to find a much simpler way to get the data written in order.
I instead used the Google Cloud Storage client library to store the files I needed in the cloud, like so:
import com.google.cloud.storage.*;   // assumes the google-cloud-java storage client
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

// Uploads the given bytes to the "dbm-poc" bucket under a per-partner path.
public static String writeFile(byte[] content, String filename, String partnerId, String dateOfReport) {
    Storage storage = StorageOptions.defaultInstance().service();
    BlobId blobId = BlobId.of("dbm-poc", partnerId + "/" + dateOfReport + "-" + filename + ".csv");
    BlobInfo blobInfo = BlobInfo.builder(blobId).contentType("binary/octet-stream").build();
    storage.create(blobInfo, content);
    return filename;
}

// Reads a local file into a byte array.
public static byte[] readFile(String filename) throws IOException {
    return Files.readAllBytes(Paths.get(filename));
}
Using these two methods in conjunction with each other, I was not only able to upload the files to the bucket I wanted without losing any of the content's ordering, but I was also able to change the format of the uploaded files from text to binary/octet-stream, which means they can be accessed and downloaded.
This approach also removes the need for a pipeline to upload the data.
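For example, a locally generated CSV could be pushed to the bucket like this (the local path is illustrative):
// Illustrative usage of the two helpers above.
byte[] csvBytes = readFile("/tmp/report.csv");       // hypothetical local path
writeFile(csvBytes, "report", partnerId, dateOfReport);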

How to generate a JSON file from a stored procedure (SQL) result

I want to get data from the DB, using stored procedures, in the form of a JSON file.
In simple words, my output should be a JSON file containing the result of the data in the DB (based on the stored procedure). How can I move forward?
You have to create an object mapper which will convert your data to an object (I think there's an Apache library that can do this). Then you can use existing APIs to convert your objects to a JSON string; examples of this are Google's Gson or Jackson. When you have the JSON string, you can write it to a file. Hope this helps.
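A rough sketch of that approach, using plain JDBC to call the stored procedure and Jackson to write the result set out as JSON (the connection URL, credentials, procedure name and output file name are all placeholder assumptions):
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.File;
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class StoredProcToJson {
    public static void main(String[] args) throws Exception {
        // Connection URL, credentials and procedure name are placeholders.
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:sqlserver://localhost;databaseName=mydb", "user", "password");
             CallableStatement stmt = conn.prepareCall("{call my_stored_procedure()}");
             ResultSet rs = stmt.executeQuery()) {

            // Map each row to column-name -> value pairs.
            ResultSetMetaData meta = rs.getMetaData();
            List<Map<String, Object>> rows = new ArrayList<>();
            while (rs.next()) {
                Map<String, Object> row = new LinkedHashMap<>();
                for (int i = 1; i <= meta.getColumnCount(); i++) {
                    row.put(meta.getColumnLabel(i), rs.getObject(i));
                }
                rows.add(row);
            }

            // Serialize the rows to a JSON file with Jackson.
            new ObjectMapper().writerWithDefaultPrettyPrinter()
                    .writeValue(new File("result.json"), rows);
        }
    }
}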

Create Download Link But Source is Java String

I want to create a download link, but the part I'm having trouble with is that the source is a Java string. The String I have is JSON data. I want people to be able to download that data.
I am using the Play! framework, so I can pass the String data using the Scala template. But I'm not sure how to let users download the String with a file type appended (.txt, .json) so that they actually download a file.
How do I go about doing this?
I can't believe how simple the solution is. This was what did it for me. Basically take the string and convert it into an InputStream.
String data = "someBigOrSmallData";
InputStream dataStream = new ByteArrayInputStream(data.getBytes());
response().setHeader("Content-disposition","attachment; filename=anyFileName.txt");
return ok(dataStream);
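Wrapped into a complete controller action, a minimal sketch for Play 2.x (Java) might look like this (the class name, file name and JSON payload are placeholders):
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import play.mvc.Controller;
import play.mvc.Result;

public class DownloadController extends Controller {

    // Serves an in-memory JSON string as a downloadable file.
    public static Result downloadJson() {
        String data = "{\"example\": \"someBigOrSmallData\"}";   // placeholder JSON
        InputStream dataStream = new ByteArrayInputStream(data.getBytes());

        response().setContentType("application/json");
        response().setHeader("Content-Disposition", "attachment; filename=data.json");
        return ok(dataStream);
    }
}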

How to read content of a blob and write to GAE datastore (Java)

How can I read the content of a blob and write it to the GAE datastore in Java?
Once you have the BlobKey for the blob you want to read, you can construct a BlobstoreInputStream:
BlobKey blobKey = ...;
InputStream is = new BlobstoreInputStream(blobKey);
You can then read the blob contents using any of the InputStream read methods.
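Putting that together with the datastore, a rough sketch might look like the following (the entity kind and property name are assumptions; note that a single datastore property is limited to roughly 1 MB):
import com.google.appengine.api.blobstore.BlobKey;
import com.google.appengine.api.blobstore.BlobstoreInputStream;
import com.google.appengine.api.datastore.Blob;
import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

// Sketch: reads all bytes of the blob and stores them on a new datastore entity.
// The entity kind ("UploadedFile") and property name ("content") are assumptions.
public static void copyBlobToDatastore(BlobKey blobKey) throws IOException {
    InputStream is = new BlobstoreInputStream(blobKey);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    byte[] buffer = new byte[8192];
    int read;
    while ((read = is.read(buffer)) != -1) {
        out.write(buffer, 0, read);
    }
    is.close();

    Entity entity = new Entity("UploadedFile");
    entity.setProperty("content", new Blob(out.toByteArray()));   // keep under ~1 MB
    DatastoreService datastore = DatastoreServiceFactory.getDatastoreService();
    datastore.put(entity);
}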
You can use the FileService API to create/write/read files in the Blobstore. When you read the byte array from the file, you can easily add it as a property to a Datastore entity and save it.

Java Read binary record into String

We are storing uploaded text files in a SQL Server database. The field type is image.
The files upload and download correctly; what I want to do now is load the actual text content into a String variable directly from the database record.
Can anyone advise on how to do this, please?
It depends on how you read the data from the DB. If you get a byte array, you could use new String(bytes).
By the way, why don't you use the CLOB datatype (or the equivalent for your server) for the field? This should normally cause the Java driver to return the String directly.
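For example, reading the image column with plain JDBC might look like the sketch below (the table and column names, and the UTF-8 assumption about the stored text, are illustrative):
import java.nio.charset.StandardCharsets;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

// Sketch: reads the image-typed column as bytes and decodes it as text.
// Table/column names ("uploaded_files", "file_data") are assumptions.
String loadFileContent(Connection conn, int fileId) throws SQLException {
    try (PreparedStatement stmt = conn.prepareStatement(
                 "SELECT file_data FROM uploaded_files WHERE id = ?")) {
        stmt.setInt(1, fileId);
        try (ResultSet rs = stmt.executeQuery()) {
            if (rs.next()) {
                byte[] bytes = rs.getBytes("file_data");           // image column as a byte array
                return new String(bytes, StandardCharsets.UTF_8);  // assumes UTF-8 encoded text
            }
            return null;
        }
    }
}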
