How to convert blob to image with Spring - java

I have a MySQL DB with a table called User that contains a column called pct of type BLOB.
I am using Hibernate to run a native select query as follows:
public List<Map<String, Object>> queryExtraction(String sql, QWSInputParam[] qwsInputParams) {
    sql = "SELECT user.name, user.pct FROM user user WHERE user.id = user.id and user.id in :PARAM0";
    Query query = getSession().createSQLQuery(sql);
    query.setResultTransformer(CriteriaSpecification.ALIAS_TO_ENTITY_MAP);
    for (int i = 0; i < qwsInputParams.length; i++) {
        LOGGER.info("PARAM" + i + ": " + Arrays.toString(qwsInputParams[i].getValues()));
        query.setParameterList("PARAM" + i, qwsInputParams[i].getValues());
    }
    //LOGGER.info("Query extraction: " + query.toString());
    //query.setTimeout(QUERY_TIME_OUT);
    List<Map<String, Object>> list = query.list();
    Object value = null;
    for (Map<String, Object> map : list) {
        for (Map.Entry<String, Object> entry : map.entrySet()) {
            String key = entry.getKey();
            value = entry.getValue();
            System.out.println("0 " + entry.getValue());
        }
    }
    return list;
}
I cannot use an entity because this is a generic method that should cater for any table, and is therefore not tied to a specific one.
Basically, when the query is executed, the following value is displayed for the BLOB column pct:
[B@1a270232
Based on the post I understand this is a JNI type signature.
The pct value in the table is a picture.
Is it possible to convert the value [B@1a270232 to Base64 so that I can display it in my browser?
Thanks in advance

How about:
// ... once you get hold of the byte array:
byte[] bytes = (byte[]) entry.getValue();
System.out.println(Base64Utils.encodeToString(bytes));
// ...
Javadoc:
Spring-Base64Utils
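If the goal is to render the image directly in the browser, the encoded string can be embedded in a data URI; the JDK's java.util.Base64 works just as well as Spring's helper. A minimal sketch (image/jpeg is an assumption here; use whatever content type the pct column actually stores):
import java.util.Base64;

// bytes is the byte[] pulled from the result map, as above
byte[] bytes = (byte[]) entry.getValue();
String base64 = Base64.getEncoder().encodeToString(bytes);
// The browser decodes the data URI inline, no extra endpoint needed.
String imgTag = "<img src=\"data:image/jpeg;base64," + base64 + "\" />";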

An alternative solution using Apache IOUtils, which matches your exact requirement.
This is from my GitHub repo, where the entity has:
@Lob
private Byte[] userpicture;
The controller that serves the image to the view:
@GetMapping("appUser/{id}/appUserimage")
public void renderImageFromDB(@PathVariable String id, HttpServletResponse response) throws IOException {
    AppUserCommand appUserCommand = appUserService.findCommandById(Long.valueOf(id));
    if (appUserCommand.getUserpicture() != null) {
        byte[] byteArray = new byte[appUserCommand.getUserpicture().length];
        int i = 0;
        for (Byte wrappedByte : appUserCommand.getUserpicture()) {
            byteArray[i++] = wrappedByte; // auto unboxing
        }
        response.setContentType("image/jpeg");
        InputStream is = new ByteArrayInputStream(byteArray);
        IOUtils.copy(is, response.getOutputStream());
    }
}
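As a side note, if commons-lang3 is already on the classpath alongside commons-io, the manual unboxing loop above can be replaced with a one-liner (a sketch, assuming the same Byte[] field):
import org.apache.commons.lang3.ArrayUtils;

// Unboxes Byte[] to byte[]; throws if any element is null.
byte[] byteArray = ArrayUtils.toPrimitive(appUserCommand.getUserpicture());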
And finally the view where the image is displayed:
<div th:if="${appUser.userpicture == null}">
    <img th:src="@{/images/defaultuser.jpg}" width="40" height="40" />
    Upload
</div>
<div th:if="${appUser.userpicture != null}">
    <img th:src="@{'/appUser/' + ${appUser.id} + '/appUserimage'}" width="40" height="40" />
</div>

Related

Debezium - Update Operation Emits Change Event into Kafka Topic with Both Before & After Struct Values, But Ignores null Column/Field in Before Struct

I'm using Debezium to synchronize data between two PostgreSQL servers, and I'm facing an issue with the update event/operation. The change event is recorded into the Kafka topic with any null-valued column/field omitted: in the example below, the infodetcode field is missing from the before struct because it was null in the DB, while the same field is present in the after struct since its value changed from null to some value. Because the null-valued column is missing from the before struct, when I compare before with after to work out which field/column values are unique/duplicate in order to construct a dynamic query, the query is built with that column missing, and it is necessary to include that column in the query. (Please find below the configuration and the before/after struct comparison implementation, which returns its result without the null-valued column.) I'd gladly take suggestions/help on this issue.
Note : REPLICA IDENTITY is set to "FULL"
Version:
PostgreSQL - 10.9, debezium - 1.1.1.Final
Before & After Struct-Topic Record(Actual):
before=struct{accountno=01,currencycode=USD,seqno=1,informationcode=S}
after=struct{accountno=01,currencycode=USD,seqno=1 ,informationcode=M ,infodetcode=N}
Before & After Struct-Topic Record(Expected):
before=struct{accountno=01,currencycode=USD,seqno=1,informationcode=S,infodetcode=null}
after=struct{accountno=01,currencycode=USD,seqno=1 ,informationcode=M ,infodetcode=N}
Debezium configuration:
@Bean
public io.debezium.config.Configuration postgreConnectorConfiguration() {
    return io.debezium.config.Configuration.create()
            .with("name", "postgres-connector")
            .with("snapshot.mode", SnapshotMode.CUSTOM)
            .with("snapshot.custom.class", "postgresql.snapshot.CustomSnapshotter")
            .with("connector.class", "io.debezium.connector.postgresql.PostgresConnector")
            .with("database.history", "io.debezium.relational.history.FileDatabaseHistory")
            .with("database.history.file.filename", "/debezium/dbhistory.dat")
            .with("offset.storage", "org.apache.kafka.connect.storage.FileOffsetBackingStore")
            .with("offset.storage.file.filename", "/debezium/offset/postgre-offset.dat")
            .with("offset.flush.interval.ms", 60000)
            .with("snapshot.isolation.mode", "read_committed")
            .with("key.converter.schemas.enable", true)
            .with("value.converter.schemas.enable", true)
            .with("plugin.name", "pgoutput")
            .with("slot.name", "debeziumtest")
            .with("database.server.name", "server-c")
            .with("database.hostname", databaseHost)
            .with("database.port", databasePort)
            .with("database.user", databaseUserName)
            .with("database.password", databasePassword)
            .with("database.dbname", databaseName)
            .with("table.whitelist", TABLES_TO_MONITOR)
            .build();
}
Comparison of Struct (Before & After):
private void handleEvent(SourceRecord sourceRecord) {
    Struct sourceRecordEntry = (Struct) sourceRecord.value();
    if (sourceRecordEntry != null) {
        Struct sourceStruct = (Struct) sourceRecordEntry.get(FieldName.SOURCE);
        String tableName = sourceStruct.getString(TABLE);
        Date transactionDate = new Date(System.currentTimeMillis());
        Long transactionTime = (Long) sourceStruct.get(FieldName.TIMESTAMP);
        Time txnTime = new Time(transactionTime);
        Long transactionCode = (Long) sourceStruct.get(TRANSACTION_ID);
        Operation operation = Operation.forCode(sourceRecordEntry.getString(OPERATION));
        if (operation == Operation.UPDATE) {
            Map<String, Object> beforeEntryHash;
            Map<String, Object> afterEntryHash;
            List preFieldList = new ArrayList();
            List preValueList = new ArrayList();
            List postFieldList = new ArrayList();
            List postValueList = new ArrayList();
            Integer preFieldcount = 0, preValuecount = 0, postFieldcount = 0, postValuecount = 0;
            Struct beforeStruct = (Struct) sourceRecordEntry.get(BEFORE);
            Struct afterStruct = (Struct) sourceRecordEntry.get(AFTER);
            beforeEntryHash = beforeStruct.schema().fields().stream()
                    .map(Field::name)
                    .filter(fieldName -> beforeStruct.get(fieldName) != null)
                    .map(fieldName -> Pair.of(fieldName, beforeStruct.get(fieldName)))
                    .collect(toMap(Pair::getKey, Pair::getValue));
            afterEntryHash = afterStruct.schema().fields().stream()
                    .map(Field::name)
                    .filter(fieldName -> afterStruct.get(fieldName) != null)
                    .map(fieldName -> Pair.of(fieldName, afterStruct.get(fieldName)))
                    .collect(toMap(Pair::getKey, Pair::getValue));
            MapDifference<String, Object> rowDifferenceHash = Maps.difference(beforeEntryHash, afterEntryHash);
            for (Entry<String, ValueDifference<Object>> rowEntry : rowDifferenceHash.entriesDiffering().entrySet()) {
                preFieldList.add(PR_PREFIX + rowEntry.getKey());
                postFieldList.add(PO_PREFIX + rowEntry.getKey());
                preValueList.add(SQ + rowEntry.getValue().leftValue() + SQ);
                postValueList.add(SQ + rowEntry.getValue().rightValue() + SQ);
                LOGGER.info("Key : " + rowEntry.getKey() + " Left Value : " + rowEntry.getValue().leftValue() + " Right Value : " + rowEntry.getValue().rightValue());
            }
        }
    }
}
Message:
SourceRecord{sourcePartition={server=server-c}, sourceOffset={transaction_id=null, lsn_proc=4921004793408, lsn=4921004793408, txId=81939856, ts_usec=1588212060567019}} ConnectRecord{topic='server-c.a.accinfo', kafkaPartition=null, key=Struct{accountno=01 ,currencycode=USD,seqno=1 }, keySchema=Schema{server-c.a.accinfo.Key:STRUCT}, value=Struct{before=Struct{accountno=01 ,currencycode=USD,seqno=1 ,informationcode=S },after=Struct{accountno=01 ,currencycode=USD,seqno=1 ,informationcode=P ,infodetcode=I},source=Struct{version=1.2.0.Alpha1,connector=postgresql,name=server-c,ts_ms=1588212060567,db=OTATEMP,schema=a,table=accinfo,txId=81939856,lsn=4921004793408},op=u,ts_ms=1588213782961}, valueSchema=Schema{server-c.a.accinfo.Envelope:STRUCT}, timestamp=null, headers=ConnectHeaders(headers=)}
Schema:
[Field{name=before, index=0, schema=Schema{server-c.aeota.accinfo.Value:STRUCT}}, Field{name=after, index=1, schema=Schema{server-c.aeota.accinfo.Value:STRUCT}}, Field{name=source, index=2, schema=Schema{io.debezium.connector.postgresql.Source:STRUCT}}, Field{name=op, index=3, schema=Schema{STRING}}, Field{name=ts_ms, index=4, schema=Schema{INT64}}, Field{name=transaction, index=5, schema=Schema{STRUCT}}]
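For what it's worth, this behaviour follows from how Guava's Maps.difference treats a key that is absent on one side versus present with a differing value: because the null filter in the streams above drops infodetcode from the before map entirely, the field surfaces in entriesOnlyOnRight() rather than entriesDiffering(). A minimal sketch (hypothetical values):
import com.google.common.collect.MapDifference;
import com.google.common.collect.Maps;
import java.util.HashMap;
import java.util.Map;

Map<String, Object> before = new HashMap<>();
before.put("informationcode", "S");
// infodetcode was null and got filtered out, so it is absent here

Map<String, Object> after = new HashMap<>();
after.put("informationcode", "M");
after.put("infodetcode", "N");

MapDifference<String, Object> diff = Maps.difference(before, after);
System.out.println(diff.entriesDiffering());   // {informationcode=(S, M)}
System.out.println(diff.entriesOnlyOnRight()); // {infodetcode=N} <- the "missing" column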

Appending SQL data into datatables using Java in JSP

In my SQL Server I have the following result set after all the condition filtering and SUM query execution.
I would like it to be shown like this on my page (refer to the screenshot below).
I have tried the Java code below, which gave me the results that I appended into my datatables.
<%
ArrayList<String[]> rows = sqlq.querySQL();
String rowsetdate = new String();
String rowres1 = new String();
for (String[] rowset : rows) {
    rowsetdate = rowset[0];
    rowres1 = rowres1 + rowset[1] + ",";
    for (String rowres2 : rowset) {
        rowres1 = rowres1 + rowres2 + ",";
    }
    rowres1 = rowres1.substring(0, rowres1.length() - 1);
    rowres1 = rowres1 + "|";
}
rowres1 = rowres1.substring(0, rowres1.length() - 1);
%>
<tr>
    <td><% if (rowres1 == null) out.print(""); else out.print(rowres1); %></td>
</tr>
sqlq.querySQL() is used to send my SQL query through JDBC to my DB.
The photo below shows the appended data in my datatables after the code execution; on the left is the Date and on the right is the data.
I tried some different code,
<%
ArrayList<String[]> rows = sqlq.querySQL();
for (String[] rowset : rows) {
%>
    <tr>
        <td><% if (rowset[0] == null) out.print(""); else out.print(rowset[0]); %></td>
        <td><% if (rowset[1] == null) out.print(""); else out.print(rowset[1]); %></td>
    </tr>
<%
}
%>
which did not achieve my expected results either; it returns the data like how I see it in SSMS (check the screenshot below).
What did I do wrong, and how should I do it to get my expected outcome? (screenshot below)
I appreciate the help from all of you.
You can use a Map whose key is the date and whose value is again a Map. The inner Map uses trans as its key and sumtot as its value.
Map<String, Map<String, String>> mapByDate = new HashMap<>();
TreeSet<String> allTrans = new TreeSet<>();
for (String[] row : rows) {
    Map<String, String> mapByDateAndTrans = mapByDate.get(row[0]);
    if (mapByDateAndTrans == null) {
        mapByDateAndTrans = new HashMap<>();
    }
    mapByDateAndTrans.put(row[1], row[2]);
    mapByDate.put(row[0], mapByDateAndTrans);
    allTrans.add(row[1]);
}
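With Java 8+, the get/null-check/put sequence above can be condensed with Map.computeIfAbsent, which creates and registers the inner map on first access:
for (String[] row : rows) {
    // Creates the inner map for row[0] only if it is not already present.
    mapByDate.computeIfAbsent(row[0], k -> new HashMap<>()).put(row[1], row[2]);
    allTrans.add(row[1]);
}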
Here is sample code to print the data as you might expect:
System.out.println("Date/Trans: " + allTrans);
for (Map.Entry<String, Map<String, String>> mapByDateEntry : mapByDate.entrySet()) {
    System.out.print(mapByDateEntry.getKey() + ": ");
    Map<String, String> mapByTrans = mapByDateEntry.getValue();
    for (String trans : allTrans) {
        String sumtot = mapByTrans.get(trans);
        if (sumtot != null) {
            System.out.print("[ " + sumtot + " ]");
        } else {
            System.out.print("[ ]");
        }
    }
    System.out.println();
}
The output:
Date/Trans: [11200, 11201, 11202]
2019-07-02: [ 136 ][ 18 ][ 14 ]
2019-07-03: [ 164 ][ 10 ][ 8 ]
Or we can generate the HTML table content:
StringBuilder tableBuilder = new StringBuilder("<table border = 1>");
// table header
tableBuilder.append("<tr>");
tableBuilder.append("<th>date/trans</th>");
for (String trans : allTrans) {
    tableBuilder.append("<th>").append(trans).append("</th>");
}
tableBuilder.append("</tr>");
// table rows
for (Map.Entry<String, Map<String, String>> mapByDateEntry : mapByDate.entrySet()) {
    tableBuilder.append("<tr>");
    tableBuilder.append("<td>").append(mapByDateEntry.getKey()).append("</td>");
    Map<String, String> mapByTrans = mapByDateEntry.getValue();
    for (String trans : allTrans) {
        String sumtot = mapByTrans.get(trans);
        if (sumtot != null) {
            tableBuilder.append("<td>").append(sumtot).append("</td>");
        } else {
            tableBuilder.append("<td></td>");
        }
    }
    tableBuilder.append("</tr>"); // close the row with </tr>
}
tableBuilder.append("</table>");
System.out.println(tableBuilder.toString());
The output:
<table border = 1><tr><th>date/trans</th><th>11200</th><th>11201</th><th>11202</th></tr><tr><td>2019-07-02</td><td>136</td><td>18</td><td>14</td></tr><tr><td>2019-07-03</td><td>164</td><td>10</td><td>8</td></tr></table>
If we save the generated output as an HTML file, it may produce your desired result (screenshot below). You can also adapt the code for use in JSP.
To have a Map ordered by the natural order of its keys, a TreeMap can be used. So, to print the data ordered by date, we can construct a new TreeMap containing the mapByDate data:
TreeMap<String, Map<String, String>> sortedMapByDate = new TreeMap<>(mapByDate);
// table rows
for (Map.Entry<String, Map<String, String>> mapByDateEntry : sortedMapByDate.entrySet()) {
    tableBuilder.append("<tr>");
    tableBuilder.append("<td>").append(mapByDateEntry.getKey()).append("</td>");
    Map<String, String> mapByTrans = mapByDateEntry.getValue();
    for (String trans : allTrans) {
        String sumtot = mapByTrans.get(trans);
        if (sumtot != null) {
            tableBuilder.append("<td>").append(sumtot).append("</td>");
        } else {
            tableBuilder.append("<td></td>");
        }
    }
    tableBuilder.append("</tr>");
}
It's wasteful to do this in Java code; that's what window functions are for in SQL. If you have a query like SELECT datet, trans, sumtot FROM ..., you can use SUM with OVER:
SELECT DISTINCT datet, SUM(sumtot) OVER (PARTITION BY datet)
FROM ...
ORDER BY datet;

Alfresco, query with dynamic values

I have a Map with different values:
props = new HashMap<String, Object>();
props.put("cmis:objectTypeId", "D:ruc:PLICO");
props.put("cmis:name", "PLICO_1.pdf");
props.put("cmis:description", "Descr");
props.put("ruc:doc_surname", "Rossi");
props.put("ruc:doc_name", "Mario");
I want to build a query (QueryStatement or other) that dynamically reads these parameters (some of them can be missing) and builds the QueryStatement.
Is there an easy way to generate the query string for QueryStatement, or should I iterate over my Map to build a String containing all the parameters and values in my query?
My solution, but maybe somebody knows how to improve it without dynamically building the query string:
StringBuilder query = new StringBuilder("SELECT * FROM ? where ");
String folder = null;
if (path != null)
{
    folder = findPath(path);
    if (folder == null)
    {
        return null;
    }
    query.append("IN_FOLDER(?) AND ");
}
ArrayList<String> values = new ArrayList<String>();
Map<String, Object> properties = loadAnnotationAndData(doc);
String objectType = properties.remove(MyEnum.cmis_object_type_id.getValue()).toString();
for (Map.Entry<String, Object> entry : properties.entrySet())
{
    System.out.println(entry.getKey() + " - " + entry.getValue());
    query.append(entry.getKey() + "=? AND ");
    values.add(entry.getValue().toString());
}
query.delete(query.length() - 4, query.length());
query.append(" ORDER BY cmis:creationDate");
System.out.println(query.toString());
Session cmisSession = getCmisSession();
QueryStatement qs = cmisSession.createQueryStatement(query.toString());
int offset = 1;
qs.setType(offset++, objectType);
if (path != null)
{
    qs.setString(offset++, folder);
}
for (int i = 0; i < values.size(); i++)
{
    System.out.println(values.get(i).toString());
    qs.setString(i + offset, values.get(i).toString());
}
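One way to avoid the delete-the-trailing-AND bookkeeping (a sketch, assuming the same properties map and placeholder scheme as above) is to collect the clauses in a list and join them once; this also keeps the statement valid when no folder or property clause is present:
List<String> clauses = new ArrayList<>();
if (path != null)
{
    clauses.add("IN_FOLDER(?)");
}
for (Map.Entry<String, Object> entry : properties.entrySet())
{
    clauses.add(entry.getKey() + "=?");
    values.add(entry.getValue().toString());
}
StringBuilder query = new StringBuilder("SELECT * FROM ?");
if (!clauses.isEmpty())
{
    // String.join (Java 8+) places " AND " between clauses, never at the end.
    query.append(" where ").append(String.join(" AND ", clauses));
}
query.append(" ORDER BY cmis:creationDate");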

Why is the data null in my test, and how do I convert the data provider data to match my function in Java

I have read some data from Excel in array form and converted it into a 2-D array in order to feed a data provider. But now, in the @Test, when I pass the data it comes through as null.
Can you please suggest why it's null? Also, my function in @Test uses a map as well; how can I convert the data provider data to a map?
My function in @Test is like below:
public void testCategorySearch(String vendor_code, Map<Integer, List<String>> seller_sku, String upload_id, Protocol protocol)
        throws InvocationTargetException
My code is:
@DataProvider(name = "valid_parameters")
public Object[][] sendValidParameters() {
    List<ArrayList> result = td.getExcelData("C:\\Users\\ashish.gupta02\\QAAutomation\\test.xls", 1);
    Object[][] a = new String[result.size()][3];
    for (int i = 0; i < result.size(); i++) {
        Object currentObject = result.get(i);
        a[i][0] = currentObject.toString();
        System.out.println("Converted" + a[i][0]);
    }
    System.out.println("Printing data" + a);
    //return mapper.getProtocolMappedObject(a);
    //return Object ;
    return a;
}
@Test(dataProvider = "valid_parameters", groups = {"positive"})
public void testCategorySearch(String vendor_code, Map<Integer, List<String>> seller_sku, String upload_id, Protocol protocol)
        throws InvocationTargetException {
    //Protocol protocol
    //set parameter values to the api
    System.out.println("Executing the request");
    CreateSellerProductUpdateInfoRequest createReq = setRequest(vendor_code, seller_sku, upload_id, protocol);
    CreateSellerProductUpdateInfoResponse createResponse = service.createSellerProductUpdateInfo(createReq);
    System.out.print("Response is :" + createResponse);
}
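For context, TestNG maps each Object[] row from a data provider positionally onto the test method's parameters, so every row must contain exactly as many entries as the method has parameters, in matching order and with matching types. The provider above allocates three columns but only fills a[i][0], so the remaining parameters arrive as null (and since the array was created as String[][], it could not hold a Map anyway). A minimal sketch of a matching provider/test pair (hypothetical values; the Protocol parameter is omitted for brevity):
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

public class CategorySearchTest {

    @DataProvider(name = "valid_parameters")
    public Object[][] sendValidParameters() {
        Map<Integer, List<String>> sellerSku = new HashMap<>();
        sellerSku.put(1, Arrays.asList("SKU-1", "SKU-2"));
        // One inner array per test invocation; one element per parameter.
        return new Object[][] {
            { "VENDOR01", sellerSku, "UPLOAD-42" }
        };
    }

    @Test(dataProvider = "valid_parameters", groups = {"positive"})
    public void testCategorySearch(String vendorCode, Map<Integer, List<String>> sellerSku, String uploadId) {
        // Each row element lands in the matching parameter position.
        System.out.println(vendorCode + " / " + sellerSku + " / " + uploadId);
    }
}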

Dynamic data store for ExtJS chart

I am trying to use JSP on server-side to perform a variable number of queries and output the result of all of them as a single block of JSON data for an ExtJS line chart.
The reason the number of queries is variable is that each one represents a different series (a different line) on the line chart, and the number of series differs depending on the line chart the user selects.
I am using Hibernate, and my persistence class returns each query's data as a List<Map<String, Object>> (each Map represents one row).
There will always be at least one series (one line on the graph, one query to execute), so the way I was thinking of setting this up is as follows:
1) Have the initial query run and get the first series
2) Run another query to check for any other series that should be on the graph
3) For each "other" series found in the second query, run a query that gets the data for that series (same number of rows) and then merge that data into the first List<Map<String, Object>> returned in #1 as another column. The query is set up to order it properly; it just needs to be merged at the same index level.
4) Output that List as JSON.
My problem is with #3: I am not sure how to go about merging the data.
Here's what I have so far:
GenericSelectCommand graphData = new GenericSelectCommand(graphDataQuery);
GenericSelectCommand backSeriesData = new GenericSelectCommand(backSeriesQuery);
List<Map<String, Object>> graphDataList = Collections.emptyList();
List<Map<String, Object>> backSeriesList;
Persistor myPersistor = new Persistor();
try
{
    // 1) GET THE INITIAL LINE CHART SERIES
    myPersistor.executeTransact(graphData);
    graphDataList = graphData.getRows();

    // 2) LOOK FOR ANY ADDITIONAL SERIES THAT SHOULD BE ON THE LINE CHART
    myPersistor.executeTransact(backSeriesData);
    backSeriesList = backSeriesData.getRows();

    // 3) FOR EACH ADDITIONAL SERIES FOUND, RUN A QUERY AND APPEND THE DATA TO THE INITIAL LINE CHART SERIES (graphDataList)
    for (int i = 0; i < backSeriesList.size(); i++)
    {
        Map<String, Object> backSeriesBean = backSeriesList.get(i);

        // THIS QUERY RETURNS ONE COLUMN OF INT VALUES (THE LINE CHART DATA) WITH THE EXACT SAME NUMBER OF ROWS AS THE INITIAL LINE CHART SERIES (graphDataList)
        String backDataQuery = "exec runQuery 'getBackData', '" + backSeriesBean.get("series_id") + "'";
        GenericSelectCommand backData = new GenericSelectCommand(backDataQuery);
        myPersistor.executeTransact(backData);
        List<Map<String, Object>> backDataList = backData.getRows();

        // FOR EACH RECORD IN THE BACK DATA (Map<String, Object>)
        for (int j = 0; j < backDataList.size(); j++)
        {
            Map<String, Object> backDataBean = backDataList.get(j);
            // HOW DO I ADD IT TO THE RECORD AT THE SAME INDEX LEVEL IN graphDataList (List<Map<String, Object>>)?
        }
    }
}
catch (Throwable e)
{
    System.err.println("Error: ");
    System.err.println(e.getCause());
}
finally
{
    myPersistor.closeSession();
}

// 4) RETURN THE DATA AS JSON NOW THAT IT IS MERGED
for (int i = 0; i < graphDataList.size(); i++)
{
    Map<String, Object> graphDataBean = graphDataList.get(i);
    out.println(/*JSON FORMAT + graphDataBean.get('data') + JSON FORMAT*/);
}
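For reference, "merging at the same index level" in step 3 would amount to putting each back-data value into the row map at the same position; a minimal sketch inside the inner loop above (the "back_" key prefix is hypothetical; the_count is the back-data column used in the solution below):
// j is the shared row index across the two equally sized lists
Map<String, Object> graphRow = graphDataList.get(j);
graphRow.put("back_" + backSeriesBean.get("series_id"), backDataBean.get("the_count"));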
SOLUTION:
GenericSelectCommand graphData = new GenericSelectCommand(graphDataQuery);
GenericSelectCommand backSeries = new GenericSelectCommand(backSeriesQuery);
List<Map<String, Object>> graphDataList = Collections.emptyList();
List<Map<String, Object>> backSeriesList = Collections.emptyList();
List backDataListArray = new ArrayList();
try
{
    // GET THE INITIAL LINE CHART SERIES
    Persistor.instance().executeTransact(graphData);
    graphDataList = graphData.getRows();

    // LOOK FOR ANY ADDITIONAL SERIES THAT SHOULD BE ON THE LINE CHART
    Persistor.instance().executeTransact(backSeries);
    backSeriesList = backSeries.getRows();

    // FOR EACH ADDITIONAL SERIES FOUND, RUN THE QUERY AND ADD IT TO backDataListArray
    for (int i = 0; i < backSeriesList.size(); i++)
    {
        Map<String, Object> backSeriesBean = backSeriesList.get(i);
        String backDataQuery = "exec runQuery 'getBackData', " + backSeriesBean.get("series_id");
        GenericSelectCommand backData = new GenericSelectCommand(backDataQuery);
        Persistor.instance().executeTransact(backData);
        List<Map<String, Object>> backDataList = backData.getRows();
        backDataListArray.add(backDataList);
    }
}
catch (Throwable e)
{
    System.err.println("Error: ");
    System.err.println(e.getCause());
}
finally
{
    Persistor.instance().closeSession();
}

// FOR EACH RECORD IN THE ORIGINAL QUERY, WRITE THE JSON STRING
for (int i = 0; i < graphDataList.size(); i++)
{
    StringBuilder backDataString = new StringBuilder();

    // BUILD THE BACK DATA STRING (IF THERE IS ANY)
    for (int j = 0; j < backDataListArray.size(); j++)
    {
        List<Map<String, Object>> backDataList = (List<Map<String, Object>>) backDataListArray.get(j);
        Map<String, Object> backDataBean = backDataList.get(i);
        Map<String, Object> backSeriesBean = backSeriesList.get(j);
        backDataString.append(backSeriesBean.get("the_series") + ": " + backDataBean.get("the_count") + ", ");
    }
    Map<String, Object> graphDataBean = graphDataList.get(i);
    out.println("{the_quota: " + graphDataBean.get("the_quota") + ", "
            + "count_pt_year: " + graphDataBean.get("count_pt_year") + ", "
            + backDataString + "date_string: '" + graphDataBean.get("date_string") + "'}"
            + (i + 1 == graphDataList.size() ? "" : ","));
}
I would not merge the lists. I would just create an outer list for each query, then go through the outer list and return each series list. You can create the outer list as:
List outerList = new ArrayList();
I would not worry about specifying the types for the outer list, as that just makes it more complicated for little benefit.
