Schedule export of Google Analytics csv reports - java

I would like to have a copy of one of my Analytics custom reports saved as CSV on a webserver every day. I want to update some records in my database based on this CSV report.
Before I start, will it work if I:
1. find the Analytics Core Reporting Java API code for fetching reports, compile and save it,
2. set up a daily cronjob which runs a PHP file,
3. the PHP file executes a bash command that calls Java,
4. the Java application talks to Analytics, fetches the report and saves it,
5. the PHP file checks whether the new CSV exists, reads the file and extracts the information,
6. the PHP file connects to MySQL and updates the records?
Please correct me if this plan is nonsense, or if there are easier ways (an Analytics PHP/JS API, if one exists, or something else). These six points just popped into my head; I've never done anything like this before, so please help me.

The Core Reporting API is language-agnostic and there are client libraries for many languages, including PHP. So I'd say calling Java from PHP is unnecessarily complicated.
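That said, if you do want to keep the all-Java route from the original plan, a minimal sketch of fetching a report with the Core Reporting API v3 Java client (google-api-services-analytics) and writing it out as CSV could look like the following. The profile ID, metrics, dimensions and file name are made-up placeholders, and building the authorized Analytics client (OAuth2/service-account setup) is assumed to happen elsewhere:

import com.google.api.services.analytics.Analytics;
import com.google.api.services.analytics.model.GaData;
import java.io.PrintWriter;
import java.util.List;

public class ReportExporter {

    // 'analytics' is an already-authorized client; credential setup omitted here.
    static void exportDailyReport(Analytics analytics) throws Exception {
        GaData data = analytics.data().ga()
                .get("ga:12345678", "yesterday", "yesterday", "ga:sessions,ga:users")
                .setDimensions("ga:date,ga:pagePath")
                .execute();
        try (PrintWriter out = new PrintWriter("report.csv")) {
            if (data.getRows() != null) {          // getRows() is null when no rows match
                for (List<String> row : data.getRows()) {
                    out.println(String.join(",", row));
                }
            }
        }
    }
}

A daily cron entry can then run this jar directly, which removes the PHP-calls-bash-calls-Java indirection from steps 2 and 3 entirely; the remaining PHP steps (read CSV, update MySQL) stay as they are.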

Related

How to integrate Map Reduce programs with Web Application

I am currently working on a web application. The requirement is roughly this: the user will upload Excel or CSV files containing large datasets from a front-end framework.
Once uploaded, the data will be processed based on many parameters, such as duplicate checks and individual field validations.
The user should be able to download the results, based on the chosen filters, instantly in the form of newly generated CSV files.
The technologies I am using are HBase for storing user information such as name and email. Once the data is uploaded by the user, it is stored and processed in HDFS. I have written the backend in the sparkjava web framework, and the data-processing engine I am using is MapReduce.
For MapReduce, I have written multiple Mapper, Reducer and Driver classes in Java, which live inside the same project directory, but the issue is that I am not able to integrate MapReduce with my backend. Once the data is uploaded, the MapReduce programs should run, and I cannot make that happen.
Can anyone suggest ideas on how to do this? I am new to Hadoop, so please tell me if I am doing anything wrong and suggest a better alternative. Any help will be awesome. Thank you.
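One common pattern for this kind of integration is to submit the Hadoop job programmatically from the web handler using the MapReduce Job API, instead of shelling out to the hadoop CLI. A rough sketch under stated assumptions: the route path, the input/output paths, and the MyMapper/MyReducer class names are hypothetical, and the Hadoop cluster configuration files are assumed to be on the classpath:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import static spark.Spark.post;

public class JobLauncher {
    public static void main(String[] args) {
        post("/process", (req, res) -> {
            Configuration conf = new Configuration(); // picks up core-site.xml etc. from the classpath
            Job job = Job.getInstance(conf, "validate-and-dedup");
            job.setJarByClass(JobLauncher.class);
            job.setMapperClass(MyMapper.class);       // hypothetical mapper class
            job.setReducerClass(MyReducer.class);     // hypothetical reducer class
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(Text.class);
            FileInputFormat.addInputPath(job, new Path("/uploads/" + req.queryParams("file")));
            FileOutputFormat.setOutputPath(job, new Path("/results/" + System.currentTimeMillis()));
            job.submit();                             // non-blocking submission
            return "job submitted";
        });
    }
}

The key point is that job.submit() returns immediately, so the HTTP request is not held open for the lifetime of the job; the web app can report progress from another route by polling job.isComplete() on the stored Job object.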

GTFS realtime feed example script

I was able to successfully parse BART's GTFS-realtime Service Alerts and TripUpdates feeds. I also looked at the official protocol buffers tutorial page for Java and was able to compile and run the tutorial.
https://developers.google.com/protocol-buffers/docs/javatutorial
The next part for me is figuring out how to create a realtime feed for my GTFS static data, preferably Service Alerts first. From what I understand, a GTFS-realtime feed essentially serves serialized protocol-buffer data at a URL, and a consumer fetches that URL with an HTTP GET and deserializes the data. I was thinking of using Visual Studio and ASP.NET Core to do this. Is there an example project I can refer to, and am I even on the right track in the first place?
Take a look at the awesome-transit list of GTFS-realtime resources. The libraries within the OneBusAway project are probably your best bet for seeing code in action that deals with GTFS-realtime. For example, you could look at onebusaway-gtfs-realtime-exporter.
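And yes, you are on the right track: a producer just builds a FeedMessage and serves its serialized bytes over HTTP. Here is a sketch of the builder pattern, shown in Java with the gtfs-realtime-bindings package (the same protobuf pattern carries over to C#/ASP.NET Core); the route ID, alert text and output file name are invented for illustration:

import com.google.transit.realtime.GtfsRealtime.Alert;
import com.google.transit.realtime.GtfsRealtime.EntitySelector;
import com.google.transit.realtime.GtfsRealtime.FeedEntity;
import com.google.transit.realtime.GtfsRealtime.FeedHeader;
import com.google.transit.realtime.GtfsRealtime.FeedMessage;
import com.google.transit.realtime.GtfsRealtime.TranslatedString;
import com.google.transit.realtime.GtfsRealtime.TranslatedString.Translation;
import java.io.FileOutputStream;

public class AlertFeedBuilder {
    public static void main(String[] args) throws Exception {
        FeedMessage.Builder feed = FeedMessage.newBuilder();
        feed.setHeader(FeedHeader.newBuilder()
                .setGtfsRealtimeVersion("2.0")
                .setTimestamp(System.currentTimeMillis() / 1000L));

        FeedEntity.Builder entity = feed.addEntityBuilder().setId("alert-1");
        Alert.Builder alert = entity.getAlertBuilder();
        alert.addInformedEntity(EntitySelector.newBuilder()
                .setRouteId("ROUTE_1"));              // hypothetical route_id from your static GTFS
        alert.setHeaderText(TranslatedString.newBuilder()
                .addTranslation(Translation.newBuilder()
                        .setText("Elevator out of service")
                        .setLanguage("en")));

        // Serve these bytes at a URL; consumers poll it with HTTP GET
        // and parse the response with FeedMessage.parseFrom().
        try (FileOutputStream out = new FileOutputStream("alerts.pb")) {
            feed.build().writeTo(out);
        }
    }
}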

Store data between Program Runs Java

Short version: I need to store some data between runs of a Java program. The data will be in the form of a table. Is there anything that lets me do something like an SQL query in Java? THE SOLUTION MUST BE ABLE TO RUN ON AN OFFLINE COMPUTER.
Long version: The user will be entering some data daily, and I want something like an SQL table in Java. The program will run on a computer that is NOT CONNECTED TO THE INTERNET, so I need a truly local way to store data (lots of it). Preferably, the data should also be stored in such a way that it is not easily accessible to the end user (he should not be able to double-click the file and simply read its contents).
Major constraint: Searching online, I found that many people were using localhost to solve similar problems, but that facility is not available to me, as I CANNOT INSTALL ANYTHING on the target computer.
If a simple data file is not good enough, how about using SQLite through a JDBC driver? It will allow you to have an SQL database stored in a regular file, with no dependency on any kind of server. Alternatively, there are many other embedded DB engines you could use, depending on your needs.
EDIT:
By the way, most (if not all) DB engines that I know of do not obfuscate the data before storing it in the filesystem. The data will be fragmented, but parts of it will be visible if you force an editor to open the file (e.g. using "Open with..." in Windows).
There is also nothing to stop your user from accessing the data using the command-line utility of the selected DB engine. If you want to obfuscate the data, you have to do it in your code. Keep in mind that this will not stop a determined person: if your application can read it offline, so can anyone else.
Use an embedded database (like Apache Derby, HSQLDB, H2) so that you don't have to run a database server on the machine. The data will be stored locally on the target machine and it won't be human readable.
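Since the JDBC driver jar ships inside your application, nothing has to be installed on the target machine. A minimal sketch with H2 in embedded mode; the table name and columns are invented for illustration:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;

public class LocalStore {
    public static void main(String[] args) throws Exception {
        // H2 embedded mode: creates ./data/app.mv.db on first use, no server process needed
        try (Connection conn = DriverManager.getConnection("jdbc:h2:./data/app", "sa", "")) {
            try (Statement st = conn.createStatement()) {
                st.execute("CREATE TABLE IF NOT EXISTS entries (day DATE, amount INT)");
            }
            // 'entries' and its columns are hypothetical
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO entries VALUES (CURRENT_DATE, ?)")) {
                ps.setInt(1, 42);
                ps.executeUpdate();
            }
            try (Statement st = conn.createStatement();
                 ResultSet rs = st.executeQuery("SELECT day, amount FROM entries")) {
                while (rs.next()) {
                    System.out.println(rs.getDate(1) + " " + rs.getInt(2));
                }
            }
        }
    }
}

SQLite via a JDBC driver such as sqlite-jdbc works the same way; only the connection URL changes.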
You have several options:
Store it in an XML file
Store it in a locally installed database
You can install a database like MySQL, or use an embedded database like SQLite, HSQLDB, or Apache Derby, which is included with Java 6 (as Java DB)

How to access uploaded files in Ruby

I am trying to use a Java uploader in a RoR app (for its ease of uploading entire directories). The selected uploader comes with some PHP code that saves the files to the server. I am trying to translate this code to Ruby, but am stumped on this point:
PHP has a very convenient superglobal – $_FILES – that contains a hash of all files uploaded to the current script via the HTTP POST method. It appears Ruby does not have a similar resource. Lacking that, what is the best way to access and save the uploaded files?
I am using the JavaPowUpload uploader ( http://www.element-it.com/OnlineHelpJavaPowUpload/index.html ).
Ruby on Rails lets you get at the stored file (wherever you have decided to put it) via the application root directory, #{RAILS_ROOT}.
Check out this tutorial. Not the prettiest method, but it should give you an idea of what needs to be done. Once the file is uploaded, it's just a matter of getting the right path and doing your processing from there.

How to upload bulk data to Google Servers using Google App Engine running on Java?

I am not able to figure out how to upload bulk data to Google's servers, bypassing the 10 MB upload limit and the 30-second request timeout. I want to design an application that takes my standard SQL data and pushes it to Google's servers.
I might sound naive, but your help is most valuable for my project.
There's currently no native Java bulkloader, so what you need to do is use the Python one. The process goes like this:
First, you'll need to download the Python SDK and extract it. Then, create an empty directory, and in it create a file called app.yaml, containing the following:
application: yourappid
version: bulkload
runtime: python
api_version: 1
handlers:
- url: /remote_api
  script: $PYTHON_LIB/google/appengine/ext/remote_api/handler.py
  login: admin
Now, run "appcfg.py update yourdir" from the Python SDK, and enter your credentials when prompted. appcfg will upload a new version of your app, which will run side-by-side with your main version, and allow you to bulkload.
Now, to do the actual bulkloading, you need to use the Python Bulkloader. Follow the instructions here. You'll need to know a (very) little bit of Python, but it's mostly copy-and-paste. When you're done, you can run the bulkloader as described in the article, but add the "-s bulkload.latest.yourapp.appspot.com" argument to the command line, like this:
appcfg.py upload_data --config_file=album_loader.py --filename=album_data.csv --kind=Album -s bulkload.latest.yourapp.appspot.com <app-directory>
Finally, to load data directly from an SQL database instead of from a CSV file, follow the instructions in my blog post here.
I want to do the same thing, so here's my naive concept for achieving the goal.
Web Server Preparation
Create a servlet that will receive the uploaded data (e.g. of type XML or JSON); see the sketch after this answer
(optional) store it in the Blobstore
Parse the data using JAXB/Jsoup and/or Gson
Dynamically interpret the data structure
Store it using the Datastore
Client Uploader Preparation
Using a local computer, create a Java/C++/PHP script that generates the XML/JSON files and stores them locally
Create a shell script (Linux) or batch file (Windows) to programmatically upload the files using cURL
Please drop a comment on this one if you have a better idea, guys.
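For the servlet half of this concept, here is a minimal sketch using the App Engine low-level Datastore API and Gson (2.8.6+). The kind name "Album", the one-JSON-object-per-line wire format, and the URL in the usage example below are all invented for illustration:

import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import com.google.gson.JsonObject;
import com.google.gson.JsonParser;
import java.io.BufferedReader;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class BulkUploadServlet extends HttpServlet {
    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws java.io.IOException {
        DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
        BufferedReader reader = req.getReader();
        String line;
        // One JSON object per line keeps each request small and restartable.
        while ((line = reader.readLine()) != null) {
            JsonObject obj = JsonParser.parseString(line).getAsJsonObject();
            Entity row = new Entity("Album");      // kind name is hypothetical
            for (String key : obj.keySet()) {
                row.setProperty(key, obj.get(key).getAsString());
            }
            ds.put(row);
        }
        resp.setStatus(200);
    }
}

The matching client-side step could then be as simple as:

curl -X POST --data-binary @albums.jsonl https://yourappid.appspot.com/bulkupload

Splitting the data into many small requests like this is what keeps each call under the size limit and the 30-second deadline.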
