I am trying to maintain two MQTT connections using the Paho library. If I reuse the same client object, the second subscription hangs; if I create a new client, the second subscription works but the first one closes. Below is my code snippet. What could I be missing?
try {
logging.info(logPreString + "| "
+ Thread.currentThread().getId() + " | "
+ "Subscribing to topic: ");
// Construct the connection options object that contains connection parameters
connectOptions = new MqttConnectOptions();
connectOptions.setCleanSession(true);
connectOptions.setKeepAliveInterval(1000);
// Construct an MQTT blocking mode client
client = new MqttClient(broker, clientID, persistence);
// Set this wrapper as the callback handler
client.setCallback(this);
// Connect to the MQTT server
logging.info(logPreString + "Connecting to broker ..." + broker);
client.connect(connectOptions);
logging.info(logPreString + "Connected and subscribed ");
// Subscribe to a topic
System.out.println("Subscribe to topic(s): " + "| "
+ Thread.currentThread() + " | "
+ Arrays.toString(topic) + " With QOS of:" + props.getQoS());
client.subscribe(topic, QOS);
} catch (MqttException me) {
logging.error(logPreString + "MqttException caught "
+ "while connecting to broker. Error: "
+ me.getMessage());
}
}
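For what it's worth, the symptom described (the first connection dropping when the second connects) commonly occurs when both clients connect with the same client ID: the broker is required to disconnect the existing session that holds that ID. A minimal sketch of keeping two independent Paho clients alive, each with its own unique client ID (the broker URL and topic names are placeholders):

```java
import org.eclipse.paho.client.mqttv3.MqttClient;
import org.eclipse.paho.client.mqttv3.MqttConnectOptions;
import org.eclipse.paho.client.mqttv3.persist.MemoryPersistence;

public class DualSubscriber {
    public static void main(String[] args) throws Exception {
        String broker = "tcp://localhost:1883"; // assumed broker URL

        // Each connection needs its OWN client object and a UNIQUE client ID;
        // two connections sharing one ID make the broker drop the older one.
        MqttClient first = new MqttClient(broker,
                MqttClient.generateClientId(), new MemoryPersistence());
        MqttClient second = new MqttClient(broker,
                MqttClient.generateClientId(), new MemoryPersistence());

        MqttConnectOptions options = new MqttConnectOptions();
        options.setCleanSession(true);

        first.connect(options);
        first.subscribe("topic/a", 1);

        second.connect(options);   // does not disturb the first connection
        second.subscribe("topic/b", 1);
    }
}
```

This requires a live broker and the Paho dependency, so treat it as a pattern rather than a drop-in fix: keep one client object per connection, never two connections behind one ID.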
public void subscribeAll(String[] topic, int[] QOS) {
try {
logging.info(logPreString + "subscribing to all ..." + broker);
// Subscribe to a topic
System.out.println("Subscribing to topic(s): " + "| "
+ Thread.currentThread() + " | "
+ Arrays.toString(topic)
+ " With QOS of:" + props.getQoS());
client.subscribe(topic, QOS);
} catch (MqttException me) {
logging.error(logPreString + "MqttException caught "
+ "while connecting to broker. Error: "
+ me.getMessage());
}
}
I'm trying Elasticsearch 7.9 and wanted to run a benchmark on 1M documents. I use the 'single node' Docker image.
I use the high-level Java client to index documents using BulkRequest. I consistently get a Too Many Requests exception after 360k requests, even if I put sleep(1000) statements after every 10k docs.
I tried increasing the memory in jvm.options from 1G to 8G, but that did not affect it.
Is there an option to increase this number of requests?
My laptop has 4 cores and 16GB, and Docker is not limited in any way.
Error details:
{"error":{"root_cause":[{"type":"es_rejected_execution_exception","reason":"rejected execution of coordinating operation [coordinating_and_primary_bytes=0, replica_bytes=0, all_bytes=0, coordinating_operation_bytes=108400734, max_coordinating_and_primary_bytes=107374182]"}],"type":"es_rejected_execution_exception","reason":"rejected execution of coordinating operation [coordinating_and_primary_bytes=0, replica_bytes=0, all_bytes=0, coordinating_operation_bytes=108400734, max_coordinating_and_primary_bytes=107374182]"},"status":429}
Indexing code
CreateIndexRequest createIndexRequest = new CreateIndexRequest(index);
createIndexRequest.mapping(
"{\n" +
" \"properties\": {\n" +
" \"category\": {\n" +
" \"type\": \"keyword\"\n" +
" },\n" +
" \"title\": {\n" +
" \"type\": \"keyword\"\n" +
" },\n" +
" \"naam\": {\n" +
" \"type\": \"keyword\"\n" +
" }\n" +
" }\n" +
"}",
XContentType.JSON);
CreateIndexResponse createIndexResponse = client.indices().create(createIndexRequest, RequestOptions.DEFAULT);
for (int b=0;b<100; b++) {
List<Book> bookList = new ArrayList<>();
for (int i = 0; i < 10_000; i++) {
int item = b*100_000 + i;
bookList.add(new Book("" + item,
item % 2 == 0 ? "aap" : "banaan",
item % 4 == 0 ? "naam1" : "naam2",
"Rob" + item,
"The great start" + item/100,
item));
}
bookList.forEach(book -> {
IndexRequest indexRequest = new IndexRequest().
source(objectMapper.convertValue(book, Map.class)).index(index).id(book.id());
bulkRequest.add(indexRequest);
});
System.out.println("Ok, batch: " + b);
bulkRequest.timeout(TimeValue.timeValueSeconds(20));
try {
Thread.sleep(1_000);
} catch (InterruptedException e) {
e.printStackTrace();
}
try {
client.bulk(bulkRequest, RequestOptions.DEFAULT);
System.out.println("Ok2");
} catch (IOException e) {
e.printStackTrace();
// System.out.println(objectMapper.convertValue(book, Map.class));
}
}
OK, I found it. I just kept adding requests to the same BulkRequest instead of clearing it, so every bulk call re-sent all previously added requests.
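For anyone hitting the same 429: the fix can be sketched as allocating a fresh BulkRequest per batch. This is a sketch against the high-level REST client, reusing `client` and `index` from the question above:

```java
for (int b = 0; b < 100; b++) {
    // Fresh container per batch; reusing one BulkRequest re-sends everything
    // added so far, and the payload eventually exceeds
    // max_coordinating_and_primary_bytes (hence the HTTP 429).
    BulkRequest bulkRequest = new BulkRequest();
    for (int i = 0; i < 10_000; i++) {
        int item = b * 10_000 + i;
        bulkRequest.add(new IndexRequest(index)
                .id(String.valueOf(item))
                .source("title", "book-" + item));
    }
    client.bulk(bulkRequest, RequestOptions.DEFAULT);
}
```

Note the coordinating-bytes limit that appears in the error is roughly 10% of the heap, which is why raising jvm.options memory only moves the threshold rather than removing the underlying accumulation bug.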
I wrote a piece of JDBC template code which inserts records into a table, but execution gets stuck on this particular snippet; it seems to hang. I couldn't figure out the cause, as the query runs fine in SQL Developer.
List<SalaryDetailReport> reports = salaryDetailReportDAO.findAll(tableSuffix, regionId, circleId);
// the above line find the required data, if data is found then it proceeds
if (reports != null && reports.size() > 0) {
for (SalaryDetailReport salaryDetail : reports) {
try {
SalaryDetail sd = new SalaryDetail();
sd.setDetailReport(salaryDetail);
salaryDetailDAO.save(sd, tableSuffix);
} catch (Exception e) {
log.error("Error occurred", e);
e.printStackTrace();
throw new MyExceptionHandler(" Error :" + e.getMessage());
}
}
System.out.println("data found");
} else {
log.error("Salary Record Not Found.");
throw new MyExceptionHandler("No record Found.");
}
As you can see in the try-catch above, my execution gets stuck inside it. Here is the insertion code in my implementation class. When I comment out the above code, my application works fine, so why does it get stuck here? I can't figure it out; kindly help me.
@Override
public void save(SalaryDetail details, String tableSuffix) {
String tabName = "SALARY_DETAIL_" + tableSuffix;
// String q = "INSERT INTO " + tabName + "(ID "
String q = "INSERT INTO SALARY_DETAIL_TBL "
+ " (ID "
+ " ,EMP_NAME "
+ " ,EMP_CODE "
+ " ,NET_SALARY "
+ " ,YYYYMM "
+ " ,PAY_CODE "
+ " ,EMP_ID "
+ " ,PAY_CODE_DESC "
+ " ,REMARK "
+ " ,PAY_MODE ) "
+ " (SELECT (sd.SALARY_REPORT_ID) ID "
+ " ,(sd.emp_name) emp_name "
+ " ,(sd.EMP_CODE) EMP_CODE "
+ " ,(sd.amount) NET_SALARY "
+ " ,(sd.YYYYMM) YYYYMM "
+ " ,(sd.pay_code) pay_code "
+ " ,(sd.emp_id) emp_id "
+ " ,(sd.PAY_CODE_DESC) PAY_CODE_DESC "
+ " ,(sd.REMARK) REMARK "
+ " ,(sd.PAY_MODE)PAY_MODE "
// + " FROM SALARY_DETAIL_REPORT_" + tableSuffix + " sd "
+ " FROM SALARY_DETAIL_REPORT_TBL sd "
+ " WHERE sd.PAY_CODE = 999 "
+ " AND sd.EMP_ID IS NOT NULL "
// + " AND sd.EMP_ID NOT IN (SELECT EMP_ID FROM SALARY_DETAIL_" + tableSuffix + ") "
+ " AND sd.EMP_ID NOT IN (SELECT EMP_ID FROM SALARY_DETAIL_TBL) "
+ " ) ";
MapSqlParameterSource param = new MapSqlParameterSource();
param.addValue("id", details.getId());
param.addValue("EMP_NAME", details.getEmpName());
param.addValue("EMP_CODE", details.getEmpCode());
param.addValue("NET_SALARY", details.getNetSalary());
param.addValue("GROSS_EARNING", details.getGrossEarning());
param.addValue("GROSS_DEDUCTION", details.getGrossDeduction());
param.addValue("YYYYMM", details.getYyyymm());
param.addValue("EMP_ID", details.getEmployee() != null ? details.getEmployee().getEmpId() : null);
KeyHolder keyHolder = new GeneratedKeyHolder();
getNamedParameterJdbcTemplate().update(q, param);
// details.setId(((BigDecimal) keyHolder.getKeys().get("ID")).longValue());
}
The main problem in your query is the NOT IN condition; it will degrade performance. Try fetching "SELECT EMP_ID FROM SALARY_DETAIL_TBL" in a separate query and passing the result into the NOT IN block of the main query. That will speed things up, because as written, every call to save() fires that subquery all over again.
You have to decide whether you will insert the records with a SELECT or from the application.
If you don't need to manipulate the data after selecting it, you can simply call a single INSERT INTO ... SELECT statement without any for loop. It will be fast because only one INSERT statement is executed.
So you would implement a method like copyAllInSalaryDetail(tableSuffix, regionId, circleId) in your SalaryDetailReportDAO, and that method would execute INSERT INTO salary_detail_tbl... (...) (SELECT ... WHERE ...) using the same WHERE condition as in your findAll() method. All inserts are then done entirely on the database layer.
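A sketch of that set-based DAO method, assuming Spring's NamedParameterJdbcTemplate as in the save() method (the method name and WHERE clause mirror the question; adapt them to your findAll() criteria):

```java
public void copyAllInSalaryDetail(String tableSuffix) {
    // One round trip: the database copies all matching rows itself,
    // so there are no per-row save() calls and the NOT IN subquery
    // runs once instead of once per record.
    String q = "INSERT INTO SALARY_DETAIL_" + tableSuffix
             + " (ID, EMP_NAME, EMP_CODE, NET_SALARY, YYYYMM,"
             + "  PAY_CODE, EMP_ID, PAY_CODE_DESC, REMARK, PAY_MODE)"
             + " SELECT sd.SALARY_REPORT_ID, sd.EMP_NAME, sd.EMP_CODE,"
             + "        sd.AMOUNT, sd.YYYYMM, sd.PAY_CODE, sd.EMP_ID,"
             + "        sd.PAY_CODE_DESC, sd.REMARK, sd.PAY_MODE"
             + " FROM SALARY_DETAIL_REPORT_" + tableSuffix + " sd"
             + " WHERE sd.PAY_CODE = 999"
             + "   AND sd.EMP_ID IS NOT NULL"
             + "   AND sd.EMP_ID NOT IN (SELECT EMP_ID FROM SALARY_DETAIL_" + tableSuffix + ")";
    getNamedParameterJdbcTemplate().getJdbcOperations().update(q);
}
```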
If you want to manipulate the data before inserting it, you can keep your current approach with the SalaryDetail bean and the for loop, but you should remove the SELECT part from the INSERT statement and use the values from the provided bean. The save() method can then look like:
@Override
public void save(SalaryDetail details, String tableSuffix) {
// use tableSuffix if it is really needed
String q = "INSERT INTO SALARY_DETAIL_TBL "
+ " (ID "
+ " ,EMP_NAME "
+ " ,EMP_CODE "
+ " ,NET_SALARY "
+ " ,YYYYMM "
+ " ,PAY_CODE "
+ " ,EMP_ID "
+ " ,PAY_CODE_DESC "
+ " ,REMARK "
+ " ,PAY_MODE ) "
+ " VALUES (:id "
+ " ,:emp_name "
+ " ,:emp_code "
+ " ,:net_salary "
+ " ,:yyyymm "
+ " ,:pay_code "
+ " ,:emp_id "
+ " ,:pay_code_desc "
+ " ,:remark "
+ " ,:pay_mode)";
MapSqlParameterSource param = new MapSqlParameterSource();
// KeyHolder keyHolder = new GeneratedKeyHolder();
// details.setId(((BigDecimal) keyHolder.getKeys().get("ID")).longValue());
param.addValue("id", details.getId());
param.addValue("emp_name", details.getEmpName());
param.addValue("emp_code", details.getEmpCode());
param.addValue("net_salary", details.getNetSalary());
param.addValue("pay_code", details.getPayCode());
param.addValue("pay_code_desc", details.getPayCodeDesc());
param.addValue("pay_mode", details.getPayMode());
param.addValue("remark", details.getPayRemark());
param.addValue("yyyymm", details.getYyyymm());
param.addValue("emp_id", details.getEmployee() != null ? details.getEmployee().getEmpId() : null);
getNamedParameterJdbcTemplate().update(q, param);
}
I am running PHP code using java.lang.Process, and I want to return a value as a result from the PHP code so I can read it from the Process. How can I do that?
This is the code I am using:
public void callPhpListener(String handler, String logFile, String messageId) throws Exception {
try {
LOGGER.info("Calling PHP executor with message id:" + messageId);
// LOGGER.info("Handler:" + handler + ", logFile:" + logFile + ", message id:" + messageId);
Process exec = Runtime.getRuntime().exec(
new String[] {"/bin/bash", "-c",
"php " + handler + " " + messageId +
" >> " + logFile + " 2>&1"});
exec.waitFor();
int exitValue = exec.exitValue();
LOGGER.info("exit value is " + exitValue);
if(exitValue != 0) {
throw new Exception("Processed with error");
}
LOGGER.info("Done processing message with id:" + messageId);
} catch (Exception e) {
LOGGER.error(e.getMessage());
throw new Exception();
}
}
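Note that the snippet above redirects PHP's output into logFile with `>> logFile 2>&1`, so there is nothing left on the process's stdout for Java to read. To get a value back, drop the shell redirection and read the process's standard output instead. A minimal stdlib sketch, using /bin/echo as a stand-in for the `php handler messageId` command line:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class PhpResult {
    // Runs a command and returns its stdout as a string.
    static String run(String... command) throws Exception {
        ProcessBuilder pb = new ProcessBuilder(command);
        pb.redirectErrorStream(true);          // merge stderr into stdout
        Process process = pb.start();
        StringBuilder out = new StringBuilder();
        // Drain stdout BEFORE waitFor(): a full output buffer can
        // otherwise block the child process and deadlock both sides.
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                out.append(line).append('\n');
            }
        }
        int exit = process.waitFor();
        if (exit != 0) {
            throw new IllegalStateException("process failed: " + exit);
        }
        return out.toString().trim();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(run("/bin/echo", "42"));
    }
}
```

In your case the PHP script would `echo` its result, and `run("php", handler, messageId)` would hand it back; if you still want the log file, write it from Java after reading the output rather than via shell redirection.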
I have the following Java while loop:
while(true){
byte buffer[] = new byte[MAX_PDU_SIZE];
packet = new DatagramPacket(buffer, buffer.length);
socket.receive(packet);
Pdu pdu = pduFactory.createPdu(packet.getData());
System.out.print("Got PDU of type: " + pdu.getClass().getName());
if(pdu instanceof EntityStatePdu){
EntityID eid = ((EntityStatePdu)pdu).getEntityID();
Vector3Double position = ((EntityStatePdu)pdu).getEntityLocation();
System.out.print(" EID:[" + eid.getSite() + ", " + eid.getApplication() + ", " + eid.getEntity() + "] ");
System.out.print(" Location in DIS coordinates: [" + position.getX() + ", " + position.getY() + ", " + position.getZ() + "]");
}
System.out.println();
}
}
The intended function of the while loop is to capture any PDUs that are being sent across the network, and display information about them.
When I run the code, I get the output that I had intended in the console, at least initially. But after it has returned information about a number of the PDUs, an error is displayed in the console (I can't remember what it said now, but I thought it might be because the code was trying to capture a PDU when there wasn't one being sent).
I have tried amending my code to account for the case where a PDU may not be received over the network at the moment the code tries to capture it, by surrounding the code with the following try-catch block:
try{
socket = new MulticastSocket(EspduSender.PORT);
address = InetAddress.getByName(EspduSender.DEFAULT_MULTICAST_GROUP);
socket.joinGroup(address);
while(true){
byte buffer[] = new byte[MAX_PDU_SIZE];
packet = new DatagramPacket(buffer, buffer.length);
socket.receive(packet);
Pdu pdu = pduFactory.createPdu(packet.getData());
System.out.print("Got PDU of type: " + pdu.getClass().getName());
if(pdu instanceof EntityStatePdu){
EntityID eid = ((EntityStatePdu)pdu).getEntityID();
Vector3Double position = ((EntityStatePdu)pdu).getEntityLocation();
System.out.print(" EID:[" + eid.getSite() + ", " + eid.getApplication() + ", " + eid.getEntity() + "] ");
System.out.print(" Location in DIS coordinates: [" + position.getX() + ", " + position.getY() + ", " + position.getZ() + "]");
}
System.out.println();
}
}
catch(Exception e){
System.out.println(e);
System.out.println("This is where the error is being generated");
}
However, when I now run the code, it still displays the first x DIS packets it captures (x varies every time I run the code), but then gives me a java.lang.NullPointerException.
As I understand it, this will occur either because the PDU the code has captured does not contain any information (i.e. an 'empty' PDU), or because it is attempting to receive a PDU when there isn't one being sent over the network.
How can I make the code skip the occasions when it doesn't receive a PDU and just keep running? Or is there something else I should do to get rid of this error?
This may not fix your problem (I won't know until you post a stack trace), but you could check whether pdu is null before using it.
while(true){
byte buffer[] = new byte[MAX_PDU_SIZE];
packet = new DatagramPacket(buffer, buffer.length);
socket.receive(packet);
Pdu pdu = pduFactory.createPdu(packet.getData());
if (pdu != null) {
System.out.print("Got PDU of type: " + pdu.getClass().getName());
if(pdu instanceof EntityStatePdu){
EntityID eid = ((EntityStatePdu)pdu).getEntityID();
Vector3Double position = ((EntityStatePdu)pdu).getEntityLocation();
System.out.print(" EID:[" + eid.getSite() + ", " + eid.getApplication() + ", " + eid.getEntity() + "] ");
System.out.print(" Location in DIS coordinates: [" + position.getX() + ", " + position.getY() + ", " + position.getZ() + "]");
}
System.out.println();
}
}
If you post the stack trace, I can adjust this for your actual problem.
Your code looks perfectly alright, except for the missing null check below.
if (pdu != null) {
//execute this
}
I want to ask: I'm currently using the Logica SMPP library to develop an SMSC client, and I use session.submit to send messages to the SMSC server, sending almost 50 messages/second. I want to use threads to send in parallel, because I need every response from the server in order to get the delivery report. The question is: is session.submit thread safe? Should I synchronize on the session before sending messages from multiple threads?
the code I use to send message is
{
response = session.submit(sm);
SUBMIT_SM_RESP(response,mt);
........
}
private void SUBMIT_SM_RESP(SubmitSMResp pdu,MTData mtd) {
log.info("CommandID From PDU "+pdu.getCommandId());
//SubmitSMResp submitSMResp = new SubmitSMResp();
if (pdu.getCommandId() == Data.SUBMIT_SM_RESP ){
// submitSMResp.setData(pdu.getData());
switch (pdu.getCommandStatus())
{
case 0:
log.info("Berhasil kirim MT; SeqNo=" + pdu.getSequenceNumber() + ";" + pdu.debugString());
break;
case 1031:
log.info("Error. Service not found; SeqNo=" + pdu.getSequenceNumber() + ";" +pdu.debugString());
break;
case 1032:
log.info("Error. Invalid TX Id; SeqNo=" + pdu.getSequenceNumber() + ";" +pdu.debugString());
break;
case 1033:
log.info("Error. Push limit exceeded; SeqNo=" + pdu.getSequenceNumber() + ";" + pdu.debugString());
break;
case 4107:
log.info("Error. Content Whitelisted (Testingmode); SeqNo=" + pdu.getSequenceNumber() + ";" +pdu.debugString());
break;
case 1280:
log.info("Insuficient Balance(Charging); SeqNo=" + pdu.getSequenceNumber() + ";" +pdu.debugString());
break;
}
log.debug("MT rec status: CommandID=SUBMIT_SM_RESP; SeqNo=" + pdu.getSequenceNumber() + "; CommandSts=" + pdu.getCommandStatus() + pdu.debugString());
//MTData mt = (MTData)DataInstance.getInstance().getHmSeqMT().get(Integer.valueOf(pdu.getSequenceNumber()));
MTData mt =mtd;
String tid = null;
if (mt == null) {
log.info("MT null");
// if (DataInstance.getInstance().getHmSeqTrxID().containsKey(Integer.valueOf(pdu.getSequenceNumber()))) {
// tid = (String)DataInstance.getInstance().getHmSeqTrxID().get(Integer.valueOf(pdu.getSequenceNumber()));
// mt = DataInstance.getInstance().getDbPrs().getMTTrxID(tid);
// log.info("MT rec status: CommandID=SUBMIT_SM_RESP; SeqNo=" + pdu.getSequenceNumber() + "; CommandSts=" + pdu.getCommandStatus() + "; trxid=" + tid + "; " + pdu.debugString());
// if (mt == null) {
log.info("MT rec status: CommandID=SUBMIT_SM_RESP; SeqNo=" + pdu.getSequenceNumber() + "; CommandSts=" + pdu.getCommandStatus() + "; trxid=" + tid + " n/a; " + pdu.debugString());
mt = DataInstance.getInstance().getDbPrs().getMTSeqNo(String.valueOf(pdu.getSequenceNumber()));
// }
// }
}
else {
tid = mt.getTransid();
}
// log.info(mt.getClass());
log.info("Transaction ID >>> "+mt.getTransid());
if (mt != null) {
DataInstance.getInstance().getDbPrs().deleteMT(mt,
String.valueOf(pdu.getCommandStatus()), DataInstance.getInstance().getTransX().operator);
DataInstance.getInstance().getHmSeqMT().remove(Integer.valueOf(pdu.getSequenceNumber()));
} else {
log.info("MT Null,,,Can't Send DR");
log.info("MT n/a: SeqNo=" + pdu.getSequenceNumber() + "; CommandSts=" + pdu.getCommandStatus() + pdu.debugString());
}
}
else
{
log.info("Nilai Enquiry Link "+Data.ENQUIRE_LINK_RESP);
}
}
In synchronous mode it is always thread safe (as you are using it now). Going asynchronous is a totally different story.
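If you do fan out across sender threads over one synchronous session, a conservative pattern is to funnel every submit through a single lock. The sketch below is self-contained and uses FakeSession as a stand-in, not the Logica API (the real session object would take its place); FakeSession throws if two submits ever overlap, so the test demonstrates that the lock serializes them:

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in for an SMPP session: NOT the real Logica API, just a placeholder
// that detects overlapping calls the way shared connection state would break.
class FakeSession {
    private int inFlight = 0;
    int submit(int seqNo) throws InterruptedException {
        inFlight++;                // a real session mutates shared I/O state here
        Thread.sleep(1);           // simulate the network round-trip
        if (inFlight != 1) throw new IllegalStateException("concurrent submit");
        inFlight--;
        return seqNo;              // echo the sequence number as the "response"
    }
}

public class SubmitGuard {
    private final Object lock = new Object();
    private final FakeSession session = new FakeSession();

    // Serialize access so only one thread talks to the session at a time.
    public int submitSynchronized(int seqNo) throws InterruptedException {
        synchronized (lock) {
            return session.submit(seqNo);
        }
    }

    public static void main(String[] args) throws Exception {
        SubmitGuard guard = new SubmitGuard();
        List<Thread> threads = new ArrayList<>();
        for (int t = 0; t < 8; t++) {
            final int base = t * 100;
            Thread th = new Thread(() -> {
                for (int i = 0; i < 10; i++) {
                    try {
                        guard.submitSynchronized(base + i);
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                }
            });
            threads.add(th);
            th.start();
        }
        for (Thread th : threads) th.join();
        System.out.println("all submits serialized");
    }
}
```

The lock caps you at one in-flight submit at a time, which matches synchronous-mode semantics anyway; for higher throughput the asynchronous mode with a response listener is the usual route.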