FusionCharts FusionTime: making a time-series chart from MySQL - Java

I am a beginner at all things coding, but I need some help with FusionCharts if anyone can help.
I have already followed along with the FusionCharts tutorials for linking it to a MySQL database and displaying a chart with no issues.
However, I would like to display a time-series chart, which uses FusionTime. This requires the data to be in a DataTable: "FusionTime accepts data in rows and columns as a DataTable".
I cannot find any examples online of taking SQL data and converting it into a DataTable with the data and schema it seems to require. This is different from the way FusionCharts works.
https://www.fusioncharts.com/dev/fusiontime/getting-started/create-your-first-chart-in-fusiontime
My MySQL database contains many tables, each with many columns, so I will need to select the appropriate column to display.
I would appreciate any advice anyone can provide. The main problem is that I don't know how to get the SQL data into the data and schema files needed to display it with FusionTime. This is to display on a locally hosted web page.
Many thanks for any time you can provide to help with this
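For reference, the DataTable described in the getting-started page linked above is built from two JSON pieces: a schema that names each column and gives its type, and the data itself as rows in the same column order. Roughly, with placeholder column names, the schema looks like:
[
  { "name": "Time", "type": "date", "format": "%Y-%m-%d" },
  { "name": "Sales", "type": "number" }
]
and the data like:
[
  ["2023-01-01", 120],
  ["2023-01-02", 132]
]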

FusionTime needs JSON, so you can write a JSON file from PHP, like this:
.....
// Encode the query results (here in $dataIngGast) as JSON
$result = json_encode($dataIngGast, JSON_UNESCAPED_SLASHES | JSON_UNESCAPED_UNICODE | JSON_NUMERIC_CHECK | JSON_PRETTY_PRINT);
//echo $result;

// Write the JSON to a file that the chart page can load
// (use "w" instead of "a+" if the file should be overwritten on each run)
$arquivo = "column-line-combination-data-gasto-ingreso-finanzas.json";
$fp = fopen($arquivo, "a+");
fwrite($fp, $result);
fclose($fp);
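Since the question is tagged Java, here is a minimal sketch of the same idea in Java: read one date-time column and one numeric column from MySQL over JDBC and write them out as a FusionTime-style data array. The connection details and the table/column names (readings, reading_time, value) are placeholders for your own schema, the MySQL JDBC driver is assumed to be on the classpath, and a real implementation would normally use a JSON library instead of string concatenation.

import java.io.FileWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class FusionTimeExport {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details - replace with your own
        try (Connection con = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/mydb", "user", "password");
             Statement st = con.createStatement();
             // Placeholder table/columns: one date-time column and one numeric column
             ResultSet rs = st.executeQuery(
                 "SELECT reading_time, value FROM readings ORDER BY reading_time")) {

            StringBuilder data = new StringBuilder("[\n");
            boolean first = true;
            while (rs.next()) {
                if (!first) data.append(",\n");
                data.append("  [\"").append(rs.getString("reading_time"))
                    .append("\", ").append(rs.getDouble("value")).append("]");
                first = false;
            }
            data.append("\n]");

            // Write the rows to data.json
            try (FileWriter out = new FileWriter("data.json")) {
                out.write(data.toString());
            }
        }
    }
}

The matching schema file can be written the same way, and both can then be fetched by the page and passed to FusionTime as in the getting-started guide.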

Related

Oracle SQL Developer: saving Hebrew data - converted to gibberish

I'm trying to insert Hebrew content into a table, using Oracle SQL Developer.
This is my code, using MyBatis:
<insert id="insertTaskTempImage" parameterType="com.ladpc.mobile.entities.TaskTempImage" useGeneratedKeys="false">
    INSERT INTO TASKS_TEMP_IMAGES (
        TASK_ID,
        RASHUT_ID,
        COMMENTS,
        CREATION_DATE,
        IMAGE,
        FILE_NAME
    )
    VALUES (
        #{taskId,jdbcType=NUMERIC},
        #{rashutId,jdbcType=NUMERIC},
        #{comments,jdbcType=VARCHAR},
        #{creationDate,jdbcType=TIMESTAMP},
        #{image,jdbcType=BLOB},
        #{fileName,jdbcType=VARCHAR}
    )
</insert>
After I insert the fileName into the table with Hebrew characters, I get gibberish content in the table, and when I load this content and show it in the UI it is written in gibberish.
What do I need to do to resolve this issue?
Edit:
My NLS is set to Hebrew but it's still not working...
Thank you!
You can use ALTER SESSION SET NLS_LANGUAGE, and you will have to find the right value for the language you want (for Hebrew that would be something like ALTER SESSION SET NLS_LANGUAGE = 'HEBREW').
Go to Tools > Preferences > Code Editor > Fonts.
Set the font to something friendly like Tahoma.
Query your data, save it, and then open said file.
Oracle SQL Developer is a Java app - it has full Unicode support out of the box. Nine times out of ten, display issues are due to an incompatible font.
I don't read Hebrew; I used an ipsum generator for this text, so if it's offensive, my apologies.
Sorry, I found out the solution...
My client needs to send this data in UTF-8 encoding.
The problem wasn't in the server inserts; the server was already receiving the string as gibberish...
Thanks for all the answers!
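In other words, the fix is to make sure the string leaves the client already encoded as UTF-8 rather than in the platform default charset. A minimal Java illustration of the difference (this is illustrative only, not the asker's actual client code):

import java.nio.charset.StandardCharsets;

public class Utf8Demo {
    public static void main(String[] args) {
        String fileName = "שלום.jpg";  // Hebrew file name

        // Risky: uses the platform default charset, which may not handle Hebrew
        byte[] defaultBytes = fileName.getBytes();

        // Safe: encode and decode explicitly as UTF-8 on both sides
        byte[] utf8Bytes = fileName.getBytes(StandardCharsets.UTF_8);
        String roundTripped = new String(utf8Bytes, StandardCharsets.UTF_8);

        System.out.println(roundTripped);  // prints the original Hebrew name intact
    }
}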

Fusion Tables API: query doesn't get <innerBoundaryIs> elements in Location column

I'm trying to get the contents of a Fusion Tables Location column.
This column contains a polygon with "outerBoundaryIs" and "innerBoundaryIs" elements.
I'm querying the column with the Fusion Tables API Query:sql, but the response doesn't include the "innerBoundaryIs" elements.
I'm using the Fusion Tables Java Library v1r33lv1.15.0-rc.
Any direction is appreciated.
Lluís
Set the typed parameter to false (the default is true).
Edit:
I don't know why (I haven't found anything about this in any documentation), but for me it works when I add a line break after the closing </outerBoundaryIs>.
Demo: http://jsfiddle.net/doktormolle/LsCLV/ (it's a copy of your table; the only difference is the line break).
The query from the Fusion Tables API will not return KML. Look at the documentation for what the two response types (JSON, CSV) will look like; neither includes the innerBoundaryIs (or outerBoundaryIs) elements.
If you need the KML, it will be returned if you use the Google Visualization query, but that is limited to returning 500 rows.
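For completeness, a rough Java sketch of calling the v1 SQL query endpoint directly with typed=false; the endpoint URL and parameter names are written from memory of the REST docs, so treat them as assumptions, and TABLE_ID and API_KEY are placeholders:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;

public class FusionTablesQuery {
    public static void main(String[] args) throws Exception {
        String sql = "SELECT geometry FROM TABLE_ID";  // placeholder table id
        // typed=false asks for the raw (untyped) column values
        String url = "https://www.googleapis.com/fusiontables/v1/query"
                + "?sql=" + URLEncoder.encode(sql, "UTF-8")
                + "&typed=false&key=API_KEY";          // placeholder API key

        HttpURLConnection con = (HttpURLConnection) new URL(url).openConnection();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(con.getInputStream(), "UTF-8"))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);  // JSON response with rows/columns
            }
        }
    }
}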
Query using Fusion Tables API v1.0 and parse response to native Google Maps Javascript API v3 objects

Configuring server side Datatables for an unknown number of tables

I am using DataTables with the tables being generated in a Java controller class that talks to the database. Given a category id, the controller class returns an unknown number of preformatted HTML tables, one for each section in the queried category. Ideally I would like to display each table as a DataTable on the same page, but I'm unsure whether that's possible, given that I don't know how many tables I will be getting back, so I can't set up their behaviour before the query.
Is there a way to format the tables when/as I get them from the controller? I attempted to prepend each table with its own .ready block, but that didn't seem to do the trick, though I'm fairly new to jQuery and could just be missing something. I used the barest of configuration to try to get it working first:
$(document).ready(function() {
    $("#results").dataTable({
        "bJQueryUI": true
    });
});
$(document).ready(function() {
    $('.dataTable').dataTable();
});
It turns out this works after all, but ONLY if you give the tables class="dataTable" so the class selector matches them, which isn't well documented or explained. Hopefully this un-confuses someone else!

file (not in memory) based JDBC driver for CSV files

Is there an open source, file-based (NOT in-memory) JDBC driver for CSV files? My CSV files are dynamically generated from the UI according to the user's selections, and each user will have a different CSV file. I'm doing this to reduce database hits, since the information is contained in the CSV file. I only need to perform SELECT operations.
HSQLDB allows for indexed searches if we specify an index, but I won't be able to provide a unique column that can be used as an index, so it does the SQL operations in memory.
Edit:
I've tried CsvJdbc, but that doesn't support simple operations like ORDER BY and GROUP BY. It is also still unclear whether it reads from the file or loads it into memory.
I've tried xlSQL, but that again relies on HSQLDB and only works with Excel, not CSV. Plus it's not in development or supported anymore.
I've tried H2, but that only reads the CSV; it doesn't let me query the file directly with SQL.
You can solve this problem using the H2 database.
The following Groovy script demonstrates:
Loading the data into the database
Running a "GROUP BY" and "ORDER BY" SQL query
Note: H2 supports in-memory databases, so you have the choice of persisting the data or not.
import groovy.sql.Sql

// Create (or open) a file-based database under db/csv
def sql = Sql.newInstance("jdbc:h2:db/csv", "user", "pass", "org.h2.Driver")

// Load the CSV file (the sample data below has no header row,
// so the column names are passed to CSVREAD explicitly)
sql.execute("CREATE TABLE data (id INT PRIMARY KEY, message VARCHAR(255), score INT) AS SELECT * FROM CSVREAD('data.csv', 'ID,MESSAGE,SCORE')")

// Print results
def result = sql.firstRow("SELECT message, score, count(*) FROM data GROUP BY message, score ORDER BY score")
assert result[0] == "hello world"
assert result[1] == 0
assert result[2] == 6   // six rows in the sample data have score 0

// Cleanup
sql.close()
Sample CSV data:
0,hello world,0
1,hello world,1
2,hello world,0
3,hello world,1
4,hello world,0
5,hello world,1
6,hello world,0
7,hello world,1
8,hello world,0
9,hello world,1
10,hello world,0
If you check out the SourceForge project CsvJdbc, please report your experiences. The documentation says it is useful for importing CSV files.
Project page
This was discussed on Superuser https://superuser.com/questions/7169/querying-a-csv-file.
You can use the Text Tables feature of hsqldb: http://hsqldb.org/doc/2.0/guide/texttables-chapt.html
csvsql/gcsvsql are also possible solutions (but there is no JDBC driver, you will have to run a command line program for your query).
sqlite is another solution but you have to import the CSV file into a database before you can query it.
Alternatively, there is commercial software such as http://www.csv-jdbc.com/ which will do what you want.
To do anything with a file you have to load it into memory at some point. What you could do is just open the file and read it line by line, discarding the previous line as you read in a new one. The only downside to this approach is its linearity. Have you thought about using something like memcached on a server, where you use an in-memory key-value store you can query, instead of dumping to a CSV file?
You can use either a specialized JDBC driver, like CsvJdbc (http://csvjdbc.sourceforge.net), or you may choose to configure a database engine such as MySQL to treat your CSV as a table and then manipulate your CSV through a standard JDBC driver.
The trade-off here is available SQL features vs. performance.
Direct access to CSV via CsvJdbc (or similar) will allow very quick operations on big data volumes, but without the ability to sort or group records using SQL commands;
the MySQL CSV engine can provide a rich set of SQL features, but at the cost of performance.
So if the size of your table is relatively small, go with MySQL. However, if you need to process big files (> 100 MB) without needing grouping or sorting, go with CsvJdbc.
If you need both - to handle very big files and to be able to manipulate them using SQL - then the optimal course of action is to load the CSV into a normal database table (e.g. MySQL) first and then handle the data as a usual SQL table.
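If you go the CsvJdbc route, usage looks roughly like this; a sketch assuming the driver jar is on the classpath, the directory contains a data.csv file with a header row, and only the limited SQL subset CsvJdbc supports is needed:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class CsvQuery {
    public static void main(String[] args) throws Exception {
        // Register the CsvJdbc driver and point it at the directory holding the CSV files
        Class.forName("org.relique.jdbc.csv.CsvDriver");
        try (Connection con = DriverManager.getConnection("jdbc:relique:csv:/path/to/csvdir");
             Statement st = con.createStatement();
             // The file data.csv is exposed as a table named "data"
             ResultSet rs = st.executeQuery("SELECT id, message, score FROM data WHERE score = 1")) {
            while (rs.next()) {
                System.out.println(rs.getInt("id") + " " + rs.getString("message"));
            }
        }
    }
}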

Yahoo Finance Stream Parsing to insert into MySQL database

I'm using the Yahoo Finance Streaming API to get stock quotes, and I want to save these into a DB table for historical reference.
I'm looking for something which can easily parse various strings whose format varies, like the examples below:
<script>try{parent.yfs_mktmcb({"unixtime":1310957222});}catch(e){}</script>
<script>try{parent.yfs_u1f({"ASX.AX":{c10:"-0.06"}});}catch(e){}</script>
<script>try{parent.yfs_u1f({"AWC.AX":{l10:"2.16",c10:"+0.01",p20:"+0.47"}});}catch(e){}</script>
<script>try{parent.yfs_u1f({"ALZ.AX":{l10:"2.6900",c10:"-0.1200",p20:"-4.27"}});}catch(e){}</script>
I want to parse these strings into a MySQL database, and I was thinking the easiest way would be to use Java for the parsing. These entries sit line by line in a text file. I want to extract the time, the stock code, the price and the change values into a simple table.
The table looks like StockCode | Date | Time | Price | ChangeDol | ChangePer
Are there any tools or frameworks which would make this process easy?
Thanks!
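For the streaming lines shown above, a small regular-expression parser in Java is probably the simplest approach. A sketch, assuming l10 is the last price, c10 the dollar change and p20 the percent change (which matches the examples but isn't officially documented):

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class YfsLineParser {
    // Matches lines like: parent.yfs_u1f({"AWC.AX":{l10:"2.16",c10:"+0.01",p20:"+0.47"}});
    private static final Pattern QUOTE = Pattern.compile("yfs_u1f\\(\\{\"([^\"]+)\":\\{([^}]*)\\}");

    public static void main(String[] args) {
        String line = "<script>try{parent.yfs_u1f({\"AWC.AX\":{l10:\"2.16\",c10:\"+0.01\",p20:\"+0.47\"}});}catch(e){}</script>";
        Matcher m = QUOTE.matcher(line);
        if (m.find()) {
            String stockCode = m.group(1);  // e.g. AWC.AX
            String price = null, changeDol = null, changePer = null;
            for (String field : m.group(2).split(",")) {
                String[] kv = field.split(":", 2);
                String value = kv[1].replace("\"", "");
                if (kv[0].equals("l10")) price = value;          // last price (assumed)
                else if (kv[0].equals("c10")) changeDol = value; // dollar change (assumed)
                else if (kv[0].equals("p20")) changePer = value; // percent change (assumed)
            }
            System.out.println(stockCode + " " + price + " " + changeDol + " " + changePer);
            // From here the values can be written to MySQL with a PreparedStatement INSERT
        }
    }
}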
I don't know how you get your quotes, but if you could use YQL, any XML parser would do:
YQL
<quote symbol="YHOO">
    <Ask>14.76</Ask>
    <AverageDailyVolume>28463800</AverageDailyVolume>
    <Bid>14.51</Bid>
    <AskRealtime>14.76</AskRealtime>
    <BidRealtime>14.51</BidRealtime>
    <BookValue>9.826</BookValue>
    <Change_PercentChange>0.00 - 0.00%</Change_PercentChange>
    ....
</quote>
List of XML Parsers for Java
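For example, with the standard JDK DOM parser (the XML string here is a trimmed-down stand-in for a real YQL response):

import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

public class YqlQuoteParser {
    public static void main(String[] args) throws Exception {
        // Trimmed-down stand-in for a YQL quote element
        String xml = "<quote symbol=\"YHOO\"><Ask>14.76</Ask><Bid>14.51</Bid></quote>";

        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));

        Element quote = doc.getDocumentElement();
        String symbol = quote.getAttribute("symbol");
        String ask = quote.getElementsByTagName("Ask").item(0).getTextContent();
        String bid = quote.getElementsByTagName("Bid").item(0).getTextContent();

        System.out.println(symbol + " ask=" + ask + " bid=" + bid);
    }
}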
You could have a look here:
http://www.wikijava.org/wiki/Downloading_stock_market_quotes_from_Yahoo!_finance
They get financial data as CSV from Yahoo.
