Having some issues adding MySQL data to JTables - Java

I am learning how to connect to a database, and as a project I am making a JTable that displays database information. I am using a database called world, which has a table called city with columns id, name, countrycode, district, and population. Here is my code. I wrote it after looking at the Javadoc and various other sources, but I am not sure if I am doing it right, and I have never used Vectors before either.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.Vector;
import javax.swing.JFrame;
import javax.swing.JScrollPane;
import javax.swing.JTable;
import javax.swing.table.DefaultTableModel;
import javax.swing.table.TableModel;
public class JTableTest extends JFrame {
    Vector<String> columnNames;
    Vector<Object> row;
    JTable table;

    public static void main(String[] args) {
        JTableTest test = new JTableTest();
        test.connectDB();
    }

    public void buildGui() {
        setSize(500, 600);
        setVisible(true);
        JScrollPane scrollPane = new JScrollPane(table,
                JScrollPane.VERTICAL_SCROLLBAR_ALWAYS,
                JScrollPane.HORIZONTAL_SCROLLBAR_NEVER);
        getContentPane().add(scrollPane);
    }

    public void connectDB() {
        String driver = "com.mysql.jdbc.Driver";
        String url = "jdbc:mysql://localhost/world";
        String user = "root";
        String pass = "root";
        String sql = "Select id, name, countrycode, district, population from city where id < 100";
        Connection conn;
        Statement stmt;
        ResultSet rs;
        try {
            Class.forName(driver);
            System.out.println("connecting..");
            conn = DriverManager.getConnection(url, user, pass);
            System.out.println("connected!");
            stmt = conn.createStatement();
            rs = stmt.executeQuery(sql);
            table = new JTable(rsToTableModel(rs));
        } catch (SQLException e) {
            e.printStackTrace();
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
        }
    }

    public TableModel rsToTableModel(ResultSet rs) {
        try {
            ResultSetMetaData md = rs.getMetaData();
            int numberOfColumns = md.getColumnCount();
            columnNames = new Vector<String>();
            for (int i = 1; i < numberOfColumns; i++) {
                columnNames.addElement(md.getColumnLabel(i));
            }
            row = new Vector<Object>();
            while (rs.next()) {
                for (int i = 1; i < numberOfColumns; i++) {
                    row.addElement(rs.getObject(i));
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
        return new DefaultTableModel(row, columnNames);
    }
}
Here is my error:
connecting..
connected!
[1, Kabul, AFG, Kabol, 2, Qandahar, AFG, Qandahar, 3, Herat, AFG, Herat, 4, Mazar-e-Sharif, AFG, Balkh, 5, Amsterdam, NLD, Noord-Holland, 6, Rotterdam, NLD, Zuid-Holland, 7, Haag, NLD, Zuid-Holland, 8, Utrecht, NLD, Utrecht, 9, Eindhoven, NLD, Noord-Brabant, 10, Tilburg, NLD, Noord-Brabant, 11, Groningen, NLD, Groningen, 12, Breda, NLD, Noord-Brabant, 13, Apeldoorn, NLD, Gelderland, 14, Nijmegen, NLD, Gelderland, 15, Enschede, NLD, Overijssel, 16, Haarlem, NLD, Noord-Holland, 17, Almere, NLD, Flevoland, 18, Arnhem, NLD, Gelderland, 19, Zaanstad, NLD, Noord-Holland, 20, ´s-Hertogenbosch, NLD, Noord-Brabant, 21, Amersfoort, NLD, Utrecht, 22, Maastricht, NLD, Limburg, 23, Dordrecht, NLD, Zuid-Holland, 24, Leiden, NLD, Zuid-Holland, 25, Haarlemmermeer, NLD, Noord-Holland, 26, Zoetermeer, NLD, Zuid-Holland, 27, Emmen, NLD, Drenthe, 28, Zwolle, NLD, Overijssel, 29, Ede, NLD, Gelderland, 30, Delft, NLD, Zuid-Holland, 31, Heerlen, NLD, Limburg, 32, Alkmaar, NLD, Noord-Holland, 33, Willemstad, ANT, Curaçao, 34, Tirana, ALB, Tirana, 35, Alger, DZA, Alger, 36, Oran, DZA, Oran, 37, Constantine, DZA, Constantine, 38, Annaba, DZA, Annaba, 39, Batna, DZA, Batna, 40, Sétif, DZA, Sétif, 41, Sidi Bel Abbès, DZA, Sidi Bel Abbès, 42, Skikda, DZA, Skikda, 43, Biskra, DZA, Biskra, 44, Blida (el-Boulaida), DZA, Blida, 45, Béjaïa, DZA, Béjaïa, 46, Mostaganem, DZA, Mostaganem, 47, Tébessa, DZA, Tébessa, 48, Tlemcen (Tilimsen), DZA, Tlemcen, 49, Béchar, DZA, Béchar, 50, Tiaret, DZA, Tiaret, 51, Ech-Chleff (el-Asnam), DZA, Chlef, 52, Ghardaïa, DZA, Ghardaïa, 53, Tafuna, ASM, Tutuila, 54, Fagatogo, ASM, Tutuila, 55, Andorra la Vella, AND, Andorra la Vella, 56, Luanda, AGO, Luanda, 57, Huambo, AGO, Huambo, 58, Lobito, AGO, Benguela, 59, Benguela, AGO, Benguela, 60, Namibe, AGO, Namibe, 61, South Hill, AIA, –, 62, The Valley, AIA, –, 63, Saint John´s, ATG, St John, 64, Dubai, ARE, Dubai, 65, Abu Dhabi, ARE, Abu Dhabi, 66, Sharja, ARE, Sharja, 67, al-Ayn, ARE, Abu Dhabi, 68, Ajman, ARE, Ajman, 
69, Buenos Aires, ARG, Distrito Federal, 70, La Matanza, ARG, Buenos Aires, 71, Córdoba, ARG, Córdoba, 72, Rosario, ARG, Santa Fé, 73, Lomas de Zamora, ARG, Buenos Aires, 74, Quilmes, ARG, Buenos Aires, 75, Almirante Brown, ARG, Buenos Aires, 76, La Plata, ARG, Buenos Aires, 77, Mar del Plata, ARG, Buenos Aires, 78, San Miguel de Tucumán, ARG, Tucumán, 79, Lanús, ARG, Buenos Aires, 80, Merlo, ARG, Buenos Aires, 81, General San Martín, ARG, Buenos Aires, 82, Salta, ARG, Salta, 83, Moreno, ARG, Buenos Aires, 84, Santa Fé, ARG, Santa Fé, 85, Avellaneda, ARG, Buenos Aires, 86, Tres de Febrero, ARG, Buenos Aires, 87, Morón, ARG, Buenos Aires, 88, Florencio Varela, ARG, Buenos Aires, 89, San Isidro, ARG, Buenos Aires, 90, Tigre, ARG, Buenos Aires, 91, Malvinas Argentinas, ARG, Buenos Aires, 92, Vicente López, ARG, Buenos Aires, 93, Berazategui, ARG, Buenos Aires, 94, Corrientes, ARG, Corrientes, 95, San Miguel, ARG, Buenos Aires, 96, Bahía Blanca, ARG, Buenos Aires, 97, Esteban Echeverría, ARG, Buenos Aires, 98, Resistencia, ARG, Chaco, 99, José C. Paz, ARG, Buenos Aires]
Exception in thread "main" java.lang.ClassCastException: java.lang.Integer cannot be cast to java.util.Vector
at javax.swing.table.DefaultTableModel.justifyRows(Unknown Source)
at javax.swing.table.DefaultTableModel.setDataVector(Unknown Source)
at javax.swing.table.DefaultTableModel.<init>(Unknown Source)
at JTableTest.rsToTableModel(JTableTest.java:80)
at JTableTest.connectDB(JTableTest.java:51)
at JTableTest.main(JTableTest.java:22)
My first question is: am I doing the Vectors right? I am not sure if everything is supposed to be added to the row vector as just one huge comma-separated entry. Am I approaching this right? I feel like I am missing something. I assume I get this error because each row has 5 columns mixing ints and Strings, but I am not sure where to go from here. All I know is that a JTable accepts (Vector, Vector) or (Object[][], Object[]). I am getting confused, as it's my first time working with MySQL and JTables.

row = new Vector();
represents a single row, not a 2D set of rows.
You have to create a 2D structure, e.g. Vector<Vector<Object>> data = new Vector<Vector<Object>>();, then create a fresh row = new Vector<Object>(); for every record inside while (rs.next()) { and add it to data once it is filled.
Change return new DefaultTableModel(row, columnNames); to return new DefaultTableModel(data, columnNames);.
Override getColumnClass for the DefaultTableModel.
Don't reinvent the wheel; search for ResultSetTableModel and TableFromDatabase.

According to the method definition mentioned here, the first argument of the DefaultTableModel constructor should be a Vector of Vectors.
In your while loop, declare another Vector variable:
while (rs.next()) {
    java.util.Vector<Object> rowData = new java.util.Vector<Object>();
    for (int i = 1; i < numberOfColumns; i++) {
        rowData.addElement(rs.getObject(i));
    }
    // Here add this row's data to the outer row vector
    row.addElement(rowData);
}
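Putting the two answers together: the key change is that each database row becomes its own Vector, and those row Vectors are collected in an outer Vector that is handed to DefaultTableModel. A minimal self-contained sketch of the shape DefaultTableModel expects (the column names and the single sample row here are made up for illustration):

```java
import java.util.Vector;
import javax.swing.table.DefaultTableModel;

public class VectorModelDemo {
    // Builds the shape DefaultTableModel expects: an outer Vector of row Vectors
    public static DefaultTableModel buildModel() {
        Vector<String> columnNames = new Vector<String>();
        columnNames.addElement("id");
        columnNames.addElement("name");

        Vector<Vector<Object>> data = new Vector<Vector<Object>>();
        Vector<Object> row = new Vector<Object>();   // one Vector per row
        row.addElement(1);
        row.addElement("Kabul");
        data.addElement(row);                        // rows collect in the outer Vector

        return new DefaultTableModel(data, columnNames);
    }

    public static void main(String[] args) {
        DefaultTableModel model = buildModel();
        System.out.println(model.getRowCount() + " row(s), " + model.getColumnCount() + " column(s)");
    }
}
```

Two further notes: ResultSet columns are 1-based and getColumnCount() is inclusive, so a loop written as for (int i = 1; i < numberOfColumns; i++) silently drops the last column; i <= numberOfColumns reads all of them. Also, buildGui() is never called in the original code, so nothing would be displayed even once the model is fixed.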

Related

Phoenix framework: How to decode Phoenix Session Cookie with Java

I am trying two different ways to decode a Phoenix session cookie.
The first is Elixir's interactive shell, and the second is Java.
Please see the following examples.
IEx
iex(1)> set_cookie = "SFMyNTY.g3QAAAABbQAAAAtfY3NyZl90b2tlbm0AAAAYZFRuNUtQMkJ5YWtKT1JnWUtCeXhmNmdP.l0T3G-i8I5dMwz7lEZnQAeK_WeqEZTxcDeyNY2poz_M"
"SFMyNTY.g3QAAAABbQAAAAtfY3NyZl90b2tlbm0AAAAYZFRuNUtQMkJ5YWtKT1JnWUtCeXhmNmdP.l0T3G-i8I5dMwz7lEZnQAeK_WeqEZTxcDeyNY2poz_M"
iex(2)> [_, payload, _] = String.split(set_cookie, ".", parts: 3)
["SFMyNTY",
"g3QAAAABbQAAAAtfY3NyZl90b2tlbm0AAAAYZFRuNUtQMkJ5YWtKT1JnWUtCeXhmNmdP",
"l0T3G-i8I5dMwz7lEZnQAeK_WeqEZTxcDeyNY2poz_M"]
iex(3)> {:ok, encoded_term } = Base.url_decode64(payload, padding: false)
{:ok,
<<131, 116, 0, 0, 0, 1, 109, 0, 0, 0, 11, 95, 99, 115, 114, 102, 95, 116, 111,
107, 101, 110, 109, 0, 0, 0, 24, 100, 84, 110, 53, 75, 80, 50, 66, 121, 97,
107, 74, 79, 82, 103, 89, 75, 66, 121, 120, 102, ...>>}
iex(4)> :erlang.binary_to_term(encoded_term)
%{"_csrf_token" => "dTn5KP2ByakJORgYKByxf6gO"}
Java
public static String decodePhoenixSessionCookie(String sessionCookie) {
    String payload = sessionCookie.split("\\.")[1];
    byte[] encoded_term = Base64.getUrlDecoder().decode(payload.getBytes());
    return new String(encoded_term);
}
Java Output
�tm_csrf_tokenmdTn5KP2ByakJORgYKByxf6gO
What I wonder is: with the Java approach I can fully recover the field name and its value, but some gibberish values come along with them.
Do you know the reason for this?
Is there a way to get clean output in Java, like the Elixir way?
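The "gibberish" is not random: the decoded payload is an Erlang term serialized in the External Term Format (version byte 131, map tag 116, binary tag 109), which is what :erlang.binary_to_term understands and new String(...) does not. For a cookie like this one, a map whose keys and values are all binaries, a tiny hand-rolled parser is enough to get clean output in Java. This is a sketch for exactly this shape, not a general ETF decoder; the class and method names are made up:

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.Base64;
import java.util.HashMap;
import java.util.Map;

public class SessionCookieDecoder {
    // Parses an Erlang External Term Format binary holding a map whose
    // keys and values are all binaries (BINARY_EXT, tag 109).
    public static Map<String, String> decodeEtfMap(byte[] term) {
        ByteBuffer buf = ByteBuffer.wrap(term); // ETF integers are big-endian
        if ((buf.get() & 0xFF) != 131) throw new IllegalArgumentException("not an ETF term");
        if ((buf.get() & 0xFF) != 116) throw new IllegalArgumentException("not a map (MAP_EXT)");
        int arity = buf.getInt(); // number of key/value pairs
        Map<String, String> map = new HashMap<>();
        for (int i = 0; i < arity; i++) {
            map.put(readBinary(buf), readBinary(buf));
        }
        return map;
    }

    private static String readBinary(ByteBuffer buf) {
        if ((buf.get() & 0xFF) != 109) throw new IllegalArgumentException("expected BINARY_EXT");
        byte[] data = new byte[buf.getInt()]; // 4-byte length prefix
        buf.get(data);
        return new String(data, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        String cookie = "SFMyNTY.g3QAAAABbQAAAAtfY3NyZl90b2tlbm0AAAAYZFRuNUtQMkJ5YWtKT1JnWUtCeXhmNmdP.l0T3G-i8I5dMwz7lEZnQAeK_WeqEZTxcDeyNY2poz_M";
        String payload = cookie.split("\\.")[1];
        byte[] term = Base64.getUrlDecoder().decode(payload);
        System.out.println(decodeEtfMap(term)); // prints {_csrf_token=dTn5KP2ByakJORgYKByxf6gO}
    }
}
```

This matches the IEx output. For anything beyond a flat binary-to-binary map you would want a proper ETF library rather than extending this by hand.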

Cannot use unicode filename as argument to Java program on Windows cmd, but wildcard works

I tried to pass a filename with Unicode characters to my Java program in Windows cmd, but the filename my program received was broken: the Unicode characters showed up as ?, and reading those files threw an IOException.
However, if I use wildcard like *.txt, it works correctly.
For example, I have a file called [テスト]测试文件1.txt in my directory.
And I wrote a simple java program to show arguments it received, which also prints string bytes.
import java.util.Arrays;

public class FileArg {
    public static void main(String[] args) {
        for (String fn : args) {
            System.out.println(fn);
            System.out.println(Arrays.toString(fn.getBytes()));
        }
    }
}
My Java version is 15, I have run chcp 65001, and I am also running the program with the -Dfile.encoding=UTF-8 flag.
Then I run java -Dfile.encoding=UTF-8 FileArg "[テスト]测试文件1.txt" [テスト]测试文件1.txt *.txt.
The filename is broken if I pass it directly, but it works perfectly with the wildcard.
The output:
[???]??文件1.txt
[91, 63, 63, 63, 93, 63, 63, -26, -106, -121, -28, -69, -74, 49, 46, 116, 120, 116]
[???]??文件1.txt
[91, 63, 63, 63, 93, 63, 63, -26, -106, -121, -28, -69, -74, 49, 46, 116, 120, 116]
[テスト]测试文件1.txt
[91, -29, -125, -122, -29, -126, -71, -29, -125, -120, 93, -26, -75, -117, -24, -81, -107, -26, -106, -121, -28, -69, -74, 49, 46, 116, 120, 116]
BTW, my default code page is cp950 (BIG5).
How can I get this to work?

Apache Beam - RabbitMq Read - fail ack message and exception raised

I'm implementing a pipeline that reads a RabbitMq queue.
I'm having problems when I read it as an unbounded stream:
it says the channel is already closed, so the ack is not sent to RabbitMQ and the message stays on the queue:
WARNING: Failed to finalize Finalization{expiryTime=2020-11-21T19:33:14.909Z, callback=org.apache.beam.sdk.io.Read$UnboundedSourceAsSDFWrapperFn$$Lambda$378/0x00000001007ee440#4ae82af9} for completed bundle CommittedImmutableListBundle{PCollection=Read RabbitMQ queue/Read(RabbitMQSource)/ParDo(UnboundedSourceAsSDFWrapper)/ParMultiDo(UnboundedSourceAsSDFWrapper)/ProcessKeyedElements/SplittableParDoViaKeyedWorkItems.GBKIntoKeyedWorkItems.out [PCollection], key=org.apache.beam.repackaged.direct_java.runners.local.StructuralKey$CoderStructuralKey#3607f949, elements=[ValueInGlobalWindow{value=ComposedKeyedWorkItem{key=[-55, 41, -123, 97, 13, 104, 92, 61, 92, 122, -19, 112, -90, 16, 7, -97, 89, 107, -80, 12, 9, 120, 10, -97, 72, 114, -62, -105, 101, -34, 96, 48, 30, -96, 8, -19, 23, -115, -9, 87, 1, -58, -127, 70, -59, -24, -40, -111, -63, -119, 51, -108, 126, 64, -4, -120, -41, 9, 56, -63, -18, -18, -1, 17, -82, 90, -32, 110, 67, -12, -97, 10, -107, -110, 13, -74, -47, -113, 122, 27, 52, 46, -111, -118, -8, 118, -3, 20, 71, -109, 65, -87, -94, 107, 114, 116, -110, -126, -79, -123, -67, 18, -33, 70, -100, 9, -81, -65, -2, 98, 33, -122, -46, 23, -103, -70, 79, -23, 74, 9, 5, -9, 65, -33, -52, 5, 9, 101], elements=[], timers=[TimerData{timerId=1:1605986594072, timerFamilyId=, namespace=Window(org.apache.beam.sdk.transforms.windowing.GlobalWindow#4958d651), timestamp=2020-11-21T19:23:14.072Z, outputTimestamp=2020-11-21T19:23:14.072Z, domain=PROCESSING_TIME}]}, pane=PaneInfo.NO_FIRING}], minimumTimestamp=-290308-12-21T19:59:05.225Z, synchronizedProcessingOutputWatermark=2020-11-21T19:23:14.757Z}
com.rabbitmq.client.AlreadyClosedException: channel is already closed due to clean channel shutdown; protocol method: #method<channel.close>(reply-code=200, reply-text=OK, class-id=0, method-id=0)
at com.rabbitmq.client.impl.AMQChannel.ensureIsOpen(AMQChannel.java:258)
at com.rabbitmq.client.impl.AMQChannel.transmit(AMQChannel.java:427)
at com.rabbitmq.client.impl.AMQChannel.transmit(AMQChannel.java:421)
at com.rabbitmq.client.impl.recovery.RecoveryAwareChannelN.basicAck(RecoveryAwareChannelN.java:93)
at com.rabbitmq.client.impl.recovery.AutorecoveringChannel.basicAck(AutorecoveringChannel.java:428)
at org.apache.beam.sdk.io.rabbitmq.RabbitMqIO$RabbitMQCheckpointMark.finalizeCheckpoint(RabbitMqIO.java:433)
at org.apache.beam.runners.direct.EvaluationContext.handleResult(EvaluationContext.java:195)
at org.apache.beam.runners.direct.QuiescenceDriver$TimerIterableCompletionCallback.handleResult(QuiescenceDriver.java:287)
at org.apache.beam.runners.direct.DirectTransformExecutor.finishBundle(DirectTransformExecutor.java:189)
at org.apache.beam.runners.direct.DirectTransformExecutor.run(DirectTransformExecutor.java:126)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
BUT, if I include withMaxNumRecords, I receive the message and the ack is sent to the RabbitMQ queue; however, it then works as bounded data.
CODE
My code is like below:
Pipeline p = Pipeline.create(options);

PCollection<RabbitMqMessage> messages = p.apply("Read RabbitMQ queue",
    RabbitMqIO.read()
        .withUri("amqp://guest:guest@localhost:5672")
        .withQueue("queue")
        //.withMaxNumRecords(1) // TRANSFORM BOUND
);

PCollection<TableRow> rows = messages.apply("Transform Json to TableRow",
    ParDo.of(new DoFn<RabbitMqMessage, TableRow>() {
        @ProcessElement
        public void processElement(ProcessContext c) {
            ObjectMapper objectMapper = new ObjectMapper();
            String jsonInString = new String(c.element().getBody());
            LOG.info(jsonInString);
        }
    }));

rows.apply(
    "Write to BigQuery",
    BigQueryIO.writeTableRows()
        .to("livelo-analytics-dev:cart_idle.cart_idle_process")
        .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER)
        .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND)
);
Can someone help with this?
I sent an email to the Apache dev thread and got an awesome answer from Boyuan Zhang that worked as a workaround for me:
As a workaround, you can add --experiments=use_deprecated_read when launching your pipeline to bypass the SDF unbounded source wrapper here.
--experiments=use_deprecated_read
Put it as an argument on the command line and it worked fine for me.

How to convert byte code to a String?

I am using a 3rd-party application.
In that application, the input word "test" gets converted to byte-code output.
The byte-code value appears as [17, 17, 17, 17, 34, 34, 34, 34, 51, 51, 51, 51, 68, 68, 68, 68].
I do not know how to convert this byte code back to a readable text value (the text value "test").
I am trying this in an Android application.
Can somebody help?
Like this:
public static void main(String[] args) {
    char[] c = new char[] {17, 17, 17, 17, 34, 34, 34, 34, 51, 51, 51, 51, 68, 68, 68, 68};
    String word = String.valueOf(c);
    System.out.println(word);
}
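For what it's worth, if the bytes were in a standard character encoding, the usual route would be the String constructor with an explicit charset rather than casting values to char. A generic sketch (the sample bytes here are plain ASCII for "test", not the [17, 17, …] output above, which appears to be a proprietary encoding of the 3rd-party app):

```java
import java.nio.charset.StandardCharsets;

public class BytesToString {
    public static void main(String[] args) {
        byte[] bytes = {116, 101, 115, 116}; // ASCII/UTF-8 bytes for "test"
        String word = new String(bytes, StandardCharsets.UTF_8);
        System.out.println(word); // prints test
    }
}
```

To decode the [17, 17, …] values you would first need to know what encoding scheme the 3rd-party application actually uses; no standard charset maps those bytes to "test".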

Store a list of Strings as human-readable path

I need to store a list of Strings into a single field in Java. The order is important, and I would prefer it to be stored in a human-readable format.
A perfect solution would store it like an XPath, but I only know libraries for compiling complex XML files to XPath, not lists of Strings.
My own attempts easily get too complex, because I want to support Strings containing any character, including the one I use as the delimiter.
I currently use serialization this way:
String[] items = new String[3];
items[0] = item1;
items[1] = item2;
items[2] = item3;

byte[] bytes = SerializationUtils.serialize(items);
System.out.println("Serialized:\n" + Arrays.toString(bytes));

String[] read = (String[]) SerializationUtils.deserialize(bytes);
System.out.println("Read:");
for (String s : read) {
    System.out.println(s);
}
Output:
[-84, -19, 0, 5, 117, 114, 0, 19, 91, 76, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 83, 116, 114, 105, 110, 103, 59, -83, -46, 86, -25, -23, 29, 123, 71, 2, 0, 0, 120, 112, 0, 0, 0, 3, 116, 0, 7, 110, 117, 109, 98, 101, 114, 49, 116, 0, 8, 110, 117, 109, 98, 101, 114, 47, 50, 116, 0, 8, 110, 117, 109, 98, 101, 114, 92, 51]
This works, but apart from generating a very long String, the result is not human-readable.
How can I best store this path in a human-readable way, with as little complication in my code as possible?
Solution
This is my solution, using the OstermillerUtils as suggested by ct_ (thanks!).
String item1 = "number1";
String item2 = "number/2";
String item3 = "number\\3";
String item4 = "//number/4\\";
String item5 = ",num\"ber5,";
String item6 = "number,6";

String[] items = new String[6];
items[0] = item1;
items[1] = item2;
items[2] = item3;
items[3] = item4;
items[4] = item5;
items[5] = item6;

System.out.println("Test values");
for (String s : items) {
    System.out.println(s);
}

StringWriter writer = new StringWriter();
CSVPrinter printer = new CSVPrinter(writer);
printer.changeDelimiter('/');
printer.write(items);
System.out.println("Persisted:\n\t" + writer.toString());

String[][] results = CSVParser.parse(writer.toString(), '/');
for (int j = 0; j < results[0].length; j++) {
    System.out.println(results[0][j]);
}
So you want to serialize a String array to a string and deserialize it back? Have a look at http://ostermiller.org/utils/CSV.html - it can serialize and deserialize arrays using an arbitrary delimiter.
JAXB using only annotations would also work. It works best when you have one container class with a list field; you then get XML.
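If pulling in a CSV library feels heavy, the delimiter collision the question describes can also be handled with a small escape/unescape pair and no dependencies. A sketch (the class name is hypothetical): a backslash escapes itself and the delimiter on the way out, and the parser honors those escapes on the way back in.

```java
import java.util.ArrayList;
import java.util.List;

public class PathCodec {
    // Joins items with '/' as delimiter, escaping '\' and '/' inside items.
    public static String join(List<String> items) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < items.size(); i++) {
            if (i > 0) sb.append('/');
            for (char c : items.get(i).toCharArray()) {
                if (c == '\\' || c == '/') sb.append('\\'); // escape special chars
                sb.append(c);
            }
        }
        return sb.toString();
    }

    // Exact inverse of join: splits on unescaped '/', unescaping as it goes.
    public static List<String> split(String path) {
        List<String> out = new ArrayList<>();
        StringBuilder cur = new StringBuilder();
        boolean escaped = false;
        for (char c : path.toCharArray()) {
            if (escaped) { cur.append(c); escaped = false; }
            else if (c == '\\') escaped = true;
            else if (c == '/') { out.add(cur.toString()); cur.setLength(0); }
            else cur.append(c);
        }
        out.add(cur.toString());
        return out;
    }
}
```

For example, joining "number1" and "number/2" yields number1/number\/2, which stays human-readable, and split recovers the original list for any input Strings, including the awkward test values above.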
