I have created a connection between my Java program and MySQL via JDBC. Now I want to populate the tables in the MySQL database. How do I parse the data from my files into the tables from the Java code?
I have two input data files. The format of each file is like:
"AAH196","17:13:00","02:49:00",287,166.03,"Austin","TX","Virginia Beach","VA"
"AAH3727","21:38:00","03:04:00",273,176.44,"Los Angeles","CA","Colorado Springs","CO"
You can use the LOAD DATA INFILE SQL command.
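For the quoted, comma-separated format shown in the question, a minimal sketch of running it through JDBC might look like this (the file path, table name and the allowLoadLocalInfile flag are assumptions for illustration; LOCAL reads the file from the client machine, plain LOAD DATA INFILE reads it from the server):
Connection con = DriverManager.getConnection(
        "jdbc:mysql://localhost/flights?allowLoadLocalInfile=true", "user", "password");
Statement stmt = con.createStatement();
String sql =
    "load data local infile 'flights.csv' \n" +
    "  into table flights \n" +
    "  fields terminated by ',' enclosed by '\"' \n" +   // matches the quoted CSV values
    "  lines terminated by '\\n'";
stmt.execute(sql);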
An easy solution: read the file line by line and use each line's content as part of an SQL INSERT statement:
List<String> lines = Files.readAllLines(file);   // file is a java.nio.file.Path
for (String line : lines) {
    // works because the string values in the file are already quoted
    String query = "INSERT INTO TABLE_NAME (COL1, ..., COL9) VALUES (" + line + ")";
    stmt.executeUpdate(query);
}
Replace TABLE_NAME with your actual table name and COL1, ..., COL9 with an enumeration of your column names. There are solutions with better database performance (such as prepared statements), but this algorithm is simple and sufficient to get some data into the database.
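If you do want the prepared-statement variant, a rough sketch could look like the following (the table and column names are placeholders, and the naive split assumes no commas inside the quoted values):
String sql = "INSERT INTO flights (col1, col2, col3, col4, col5, col6, col7, col8, col9) "
           + "VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)";
PreparedStatement ps = con.prepareStatement(sql);
for (String line : Files.readAllLines(file)) {
    String[] fields = line.split(",");
    for (int i = 0; i < 9; i++) {
        // strip the surrounding quotes; MySQL coerces the numeric columns from strings
        ps.setString(i + 1, fields[i].replace("\"", ""));
    }
    ps.addBatch();
}
ps.executeBatch();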
I am trying to update two tables at the same time, where the inserted key of the first should be inserted into the second table.
The SQL looks like this:
DECLARE #nrTable table (TXT_nr int)
IF NOT EXISTS (SELECT txt FROM tbl1 WHERE txt = (?))
INSERT INTO tbl1 (txt, new) OUTPUT INSERTED.nr INTO #nrTable VALUES((?), 1)
IF NOT EXISTS (SELECT txt FROM tbl1 WHERE txt =(?))
INSERT INTO tbl2 (TXT_nr, field1, field2)
VALUES((SELECT TXT_nr FROM #nrTable), (?), (?))
WHERE field3 = (?) AND field4 = (?)
I am trying to accomplish this using
this.jdbcTemplate.batchUpdate(sql, batch);
simply concatenating the lines in Java using basic strings. This seems to only execute the first statement, though.
Now the reason I don't want to do this transactionally is that I would have to do it using a loop, just inserting one batch object at a time because of the OUTPUT clause. This would result in loads of calls to the SQL server.
Is there any known way to accomplish something like this?
You can't use batchUpdate in this way. Please refer to this document: http://tutorials.jenkov.com/jdbc/batchupdate.html
To achieve your goal, if you are using a sequence in SQL, you need to fetch the new value in Java first and then use it in your query, like this:
long id = jdbcTemplate.queryForObject("select sequence_name.nextval from dual",Long.class);
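Building on that, a rough sketch of the two inserts (the sequence query is Oracle-style as in the example above; txt, field1 and field2 stand for the values you were previously binding to the ? placeholders, and nr is assumed to be populated from the sequence rather than being an identity column):
// fetch the key once, then reuse it in both statements
long id = jdbcTemplate.queryForObject(
        "select sequence_name.nextval from dual", Long.class);
jdbcTemplate.update(
        "INSERT INTO tbl1 (nr, txt, new) VALUES (?, ?, 1)", id, txt);
jdbcTemplate.update(
        "INSERT INTO tbl2 (TXT_nr, field1, field2) VALUES (?, ?, ?)", id, field1, field2);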
I have a Java Map (Map<String, String>) and a JDBC connection to a Hive server.
The schema of the table at the server contains a column of type Map<String, String>.
Is it possible to insert the Java Map into the Hive table column with the matching datatype using JDBC?
I tried:
"create table test(key string, value Map<String, String>)"
"insert into table test values ('keywer', map('subkey', 'subvalue')) from dummy limit 1;"
ref: Hive inserting values to an array complex type column
but the insert failed with:
"Error: Error while compiling statement: FAILED: ParseException line 1:69 missing EOF at 'from' near ')' (state=42000,code=40000)"
[EDIT]
Hive version is 0.14.0.
Thanks
The manual clearly says you cannot insert into a Map data type using SQL:
"Hive does not support literals for complex types (array, map, struct, union), so it is not possible to use them in INSERT INTO...VALUES clauses. This means that the user cannot insert data into a complex datatype column using the INSERT INTO...VALUES clause.”
I think the correct DDL and query would be:
CREATE TABLE test(key STRING, value MAP<STRING, STRING>);
INSERT INTO TABLE test VALUES('keywer', map('subkey', 'subvalue')) from dummy limit 1;
A working method to insert a complex type from a JDBC client is:
insert into table test select "key",map("key1","value1","key2","value2") from dummy limit 1;
where dummy is another table which has at least one row.
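Run through the Hive JDBC driver, that could look roughly like this (the connection URL and credentials are assumptions; dummy must already exist and contain at least one row):
Class.forName("org.apache.hive.jdbc.HiveDriver");
Connection con = DriverManager.getConnection("jdbc:hive2://localhost:10000/default", "user", "");
Statement stmt = con.createStatement();
// insert-select instead of INSERT ... VALUES, since Hive has no literals for complex types
stmt.execute("insert into table test select 'keywer', map('subkey', 'subvalue') from dummy limit 1");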
Hello guys, I need to fill a table with the result of my query, like:
SELECT FIELD1, FIELD2, X FROM OLDTABLE WHERE X=Y
I am a Java developer; my friends are RPG developers on the AS400. When they execute a query they have an option to save the query result in a file.
The option is called SELECT output, and they can choose 1 Display, 2 Printer, or 3 File.
Can I do this directly from a query, or is it a native iSeries option?
Create a table with iSeries SQL.
To create a table with data:
create table abc as (select x,y,z from sometable where x=y) with data
To create an empty table:
create table abc as (select x,y,z from sometable where x=y) data definition only
There is no output to a printer using just SQL.
Query will prompt you to replace an existing table. Straight SQL won't prompt to replace an existing table, so you have two scenarios (see note).
If the output table doesn't exist, all you need is
create table newtable as (select <...> from oldtable) with data
If the output table already exists, all you need is an insert with sub-select.
Insert into newtable
Select <...> from oldtable
NOTE
With the release of TR10 for v7.1 and TR2 for v7.2 in May 2015, IBM added support for the OR REPLACE clause on the CREATE TABLE statement. So if you happen to be on those TRs or higher, you could simply use:
create or replace table newtable as (select <...> from oldtable) with data
You could compile the SQL into a Query Manager query (CRTQMQRY) and then run the query via STRQMQRY.
To do that, put the query into some sort of source file with a member type of TXT. Get to a command line, run the CRTQMQRY command, and create the output QMQRY. STRQMQRY can be prompted, and you can save the results in an output file, send them to a printout, or look at them interactively. If you submit it as a batch job, viewing the output interactively won't work too well.
I need to insert a CSV into a MySQL database with the values in the proper columns.
Let's say the CSV has a header and then data:
A B C
and the MySQL table has its columns in the order C A B.
I need to know the best way to insert the CSV data into the MySQL table.
Use a library like opencsv (opencsv.jar).
Examples are given here: http://opencsv.sourceforge.net/
It handles large volumes of data.
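A minimal sketch of reading the file with opencsv's CSVReader (the file name is an assumption; error handling omitted):
// requires the opencsv jar on the classpath
CSVReader reader = new CSVReader(new FileReader("data.csv"));
String[] header = reader.readNext();           // first line: A, B, C
String[] row;
while ((row = reader.readNext()) != null) {
    // row[0], row[1], row[2] hold the values for A, B and C
    System.out.println(Arrays.toString(row));
}
reader.close();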
I believe you can use the following syntax for MySQL:
"INSERT INTO users (username, password, email, firstName, lastName, createDate) VALUES ('test', 'test', 'test', 'test', 'test', 'test')"
So you can build up your query, using the header and the column it falls into, like so (pseudocode):
"Insert into table (header1, header2, header3) values (column1, column2, column3)"
Regardless of the column order in the table, that will insert the data into the correct columns.
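As a rough sketch of that idea, assuming the header line has already been split into a String[] called header and each data line into a String[] called row (the table name mytable is a placeholder, and the header values are trusted to be valid column names):
// build "INSERT INTO mytable (A, B, C) VALUES (?, ?, ?)" from the header
String columns = String.join(", ", header);
String placeholders = String.join(", ", Collections.nCopies(header.length, "?"));
String sql = "INSERT INTO mytable (" + columns + ") VALUES (" + placeholders + ")";
PreparedStatement ps = con.prepareStatement(sql);
for (int i = 0; i < header.length; i++) {
    ps.setString(i + 1, row[i]);   // values follow the header order, not the table's column order
}
ps.executeUpdate();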
To load CSV data into a MySQL database in the correct order, regardless of the column order in the file, do this:
LOAD DATA INFILE 'path/to/datafile.csv' INTO TABLE tablename (colA, colB, colC)
Ordinarily, the column list (colA, colB, colC) is optional. However, when (as in this question) the column order differs between the CSV file and the DB table, list the column names in the order in which the corresponding values appear in the file, like so:
LOAD DATA INFILE 'path/to/datafile.csv' INTO TABLE tablename (colC, colB, colA)
To see other options for loading CSV data into MySQL, check out http://dev.mysql.com/doc/refman/5.1/en/load-data.html
Hi, I would like to create a table through JDBC on multiple databases like DB2, Sybase, MySQL, etc. Now I need to populate this table from a text file, say data.txt, which contains space-separated values. For example:
CustName OrderNo PhoneNo
XYZ 230 123456789
ABC 450 879641238
This data.txt contains thousands of records with space-separated values. I need to parse this file line by line using Java IO and execute an SQL INSERT query for each record.
I found there is a LOAD DATA INFILE SQL command. Does any JDBC driver support this command? If not, what would be the most efficient approach to solve this problem?
Please guide. Thanks in advance.
The following will work through JDBC. Note that to use LOAD DATA INFILE you need the FILE privilege on the server, which you don't need for LOAD DATA LOCAL INFILE.
Connection con = DriverManager.getConnection("jdbc:mysql://localhost/foobar", "root", "password");
Statement stmt = con.createStatement();
String sql =
"load data infile 'c:/temp/some_data.txt' \n" +
" replace \n" +
" into table prd \n" +
" columns terminated by '\\t' \n" +
" ignore 1 lines";
stmt.execute(sql);
If you use LOAD DATA INFILE, the file location is resolved on the server's filesystem! If you use LOAD DATA LOCAL INFILE, then it is resolved on the client's filesystem.
I think LOAD DATA INFILE is specific to MySQL, and I doubt whether a JDBC driver would support it. Other databases will have similar (but different) utilities.
If you want to do this in a database-independent way, I think you have two choices:
Parse up the input file and use SQL INSERT statements over a JDBC connection
Write a number of different, database dependent scripts, determine which dbms you are using and execute the correct one using Runtime.exec
Unless you have compelling performance reasons not to, I'd go for option 1.
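For option 1, a minimal sketch using a PreparedStatement batch, assuming the space-separated layout from the question and a hypothetical table orders(CustName, OrderNo, PhoneNo) with OrderNo as an integer column:
// read data.txt line by line and batch the inserts
List<String> lines = Files.readAllLines(Paths.get("data.txt"));
PreparedStatement ps = con.prepareStatement(
        "INSERT INTO orders (CustName, OrderNo, PhoneNo) VALUES (?, ?, ?)");
for (String line : lines.subList(1, lines.size())) {    // skip the header row
    String[] parts = line.trim().split("\\s+");          // split on whitespace
    ps.setString(1, parts[0]);
    ps.setInt(2, Integer.parseInt(parts[1]));
    ps.setString(3, parts[2]);
    ps.addBatch();
}
ps.executeBatch();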
I believe LOAD DATA INFILE is faster than parsing the file and inserting the records using Java. You can execute the LOAD DATA INFILE query through JDBC. As per this Oracle doc and the MySQL doc:
The LOAD DATA INFILE statement reads rows from a text file into a table at a very high speed.
The file should be on the server. You can try both approaches and log the time each one consumes.
"Load data local infile" does work with MySQL's JDBC driver, there are some issues with this.
When using "load data infile" or "load data local infile" the inserted records WILL NOT be added to the bin log, this means that if you are using replication the records inserted by "load data infile" will not be transferred to the slave server(s), the inserted records will not have any transactions record, and this is why load data infile is so much quicker than a standard insert and due to no validation on the inserted data.