Object automatically being set to null in reflection (Java)

I have written a reflection function which merges two objects based on some pre-specified conditions.
While iterating over the fields of the object, the values in object A look correct, but as soon as the fields of the current object are exhausted and the function starts iterating on the next object, the fields that had just been set become null.
Here are my reflection code and the main function:
public static void mergeObjectAndSet(Object objectA, Object objectB, Set<String> deleteSetA, Set<String> deleteSetB, Set<String> deleteSetC, String path) throws Exception {
    LOGGER.info("In merge Sets and Diff function : ");
    LOGGER.info("Deleted sets A are : " + deleteSetA + " B : " + deleteSetB + " C : " + deleteSetC);
    LOGGER.info("Object A is : " + objectA + " object B is : " + objectB);
    if (null == deleteSetA) {
        deleteSetA = new HashSet<>();
    } else if (null == deleteSetB) {
        deleteSetB = new HashSet<>();
    }
    Class classA = objectA.getClass();
    Class classB = objectB.getClass();
    LOGGER.info("Classes are : " + classA + " class B : " + classB);
    Field objectBFields[] = classB.getDeclaredFields();
    System.out.println(objectA);
    for (Field fieldB : objectBFields) {
        LOGGER.info("fields to be looped in mergeObjectAndSet are : " + objectBFields.toString() + " path is : " + path);
        fieldB.setAccessible(true);
        Class typeB = fieldB.getType();
        Object fieldAObject = fieldB.get(objectA);
        Object fieldBObject = fieldB.get(objectB);
        if (!Collection.class.isAssignableFrom(typeB)) {
            LOGGER.info("passes null check for field objects : ");
            if (isTypePrimitive(typeB)) {
                if (null != fieldAObject || null != fieldBObject) {
                    Field fieldA = classA.getDeclaredField(fieldB.getName());
                    fieldA.setAccessible(true);
                    LOGGER.info("field A is : " + fieldA.getName());
                    if (!(deleteSetA.contains(path + Constants.HYPHEN + fieldA.getName())) && (deleteSetB.contains(path + Constants.HYPHEN + fieldA.getName()))) {
                        LOGGER.info("In only deleted set case : Adding field : " + fieldA.getName() + " to deleted set");
                        deleteSetC.add(path + Constants.HYPHEN + fieldA.getName());
                        LOGGER.info("Merged object for the field : " + fieldB.getName() + " to object : " + fieldBObject + " in object : " + objectA);
                    } else if (deleteSetA.contains(path + Constants.HYPHEN + fieldA.getName()) && !(deleteSetB.contains(path + Constants.HYPHEN + fieldA.getName())) && null != fieldBObject && null == fieldAObject) {
                        LOGGER.info("in merge set and objects case : ");
                        fieldA.set(objectA, fieldBObject);
                        LOGGER.info("Merged object for the field for case refresh : " + fieldB.getName() + " to object : " + fieldBObject + " in object : " + objectA);
                    } else if (!(deleteSetA.contains(path + Constants.HYPHEN + fieldA.getName())) && !(deleteSetB.contains(path + Constants.HYPHEN + fieldA.getName()))) {
                        LOGGER.info("In merge case irrespective of deleted sets : ");
                        fieldA.set(objectA, fieldBObject);
                    }
                }
            } else {
                if (null == fieldAObject && null == fieldBObject) {
                    LOGGER.info("In both fields are null case : ");
                    LOGGER.info("Field here is : " + fieldB.getName());
                    for (String del : deleteSetA) {
                        if (del.startsWith(fieldB.getName())) {
                            deleteSetC.addAll(deleteSetA);
                        }
                    }
                    continue;
                }
                LOGGER.info("In non primitive type check : path here for np is : " + path);
                LOGGER.info("field name here : " + fieldB.getName());
                LOGGER.info("path here : " + path);
                LOGGER.info("object name here : " + objectA.getClass().getName());
                Field fieldA = classA.getDeclaredField(fieldB.getName());
                fieldA.setAccessible(true);
                if (null == fieldAObject) {
                    LOGGER.info("if A is null case : initialise it with an instance");
                    Constructor[] constructors = fieldA.getType().getDeclaredConstructors();
                    for (Constructor constructor : constructors) {
                        constructor.setAccessible(true);
                        if (0 == constructor.getParameterCount()) {
                            fieldAObject = constructor.newInstance();
                            break;
                        }
                    }
                }
                LOGGER.info("No test cases met, path is : " + path);
                if (null == path) {
                    LOGGER.info("when path is null new path here is : " + path);
                    mergeObjectAndSet(fieldAObject, fieldBObject, deleteSetA, deleteSetB, deleteSetC, fieldA.getName());
                } else {
                    LOGGER.info("path here when some path is there is : " + path);
                    mergeObjectAndSet(fieldAObject, fieldBObject, deleteSetA, deleteSetB, deleteSetC, path + Constants.HYPHEN + fieldA.getName());
                }
            }
        }
    }
}
The main function is :
public static void main(String args[]) {
    LeadDetailSRO leadDetailSRO1 = new LeadDetailSRO();
    LeadDetailSRO leadDetailSRO2 = new LeadDetailSRO();
    BankDetailSRO bankDetails = new BankDetailSRO();
    bankDetails.setBeneficiaryName("ars");
    bankDetails.setBranchName("noida");
    leadDetailSRO2.setBankDetails(bankDetails);
    Set<String> deleteSet1 = new HashSet<>();
    Set<String> deleteSet2 = new HashSet<>();
    Set<String> deleteSet3 = new HashSet<>();
    deleteSet1.add("bankDetails-beneficiaryName");
    try {
        System.out.println("Before deletion object 1 is : " + leadDetailSRO1 + " object 2 is : " + leadDetailSRO2 + " deleteset A is : " + deleteSet1 + " B is : " + deleteSet2 + " C is : " + deleteSet3);
        Utils.mergeObjectAndSet(leadDetailSRO1, leadDetailSRO2, deleteSet1, deleteSet2, deleteSet3, null);
        System.out.println("After deletion object 1 is : " + leadDetailSRO1 + " object 2 is : " + leadDetailSRO2 + " deleteset A is : " + deleteSet1 + " B is : " + deleteSet2 + " C is : " + deleteSet3);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
The output is :
After deletion object 1 is: LeadDetailSRO{
uploadDocumentList=null,
businessOwnerDetails=null,
businessOwnerDetailList=null,
authorizedSignatoryList=null,
businessEntityDetails=null,
leadInfo=null,
bankDetails=null,
addressDetails=null,
cfaAgent=null,
vaAgent=null,
auditTrail=null,
additionalDetails=null,
additionalQuestions=null,
}
object2 is: LeadDetailSRO{
uploadDocumentList=null,
businessOwnerDetails=null,
businessOwnerDetailList=null,
authorizedSignatoryList=null,
businessEntityDetails=null,
leadInfo=null,
bankDetails=BankDetailSRO{
bankName='null',
bankAccountNumber='null',
ifscCode='null',
bankAccountHolder='null',
beneficiaryName='ars',
branchName='noida',
status='null',
nameMatchStatus='null',
reEnterAccountNumber='null',
reEnterIfscCode='null'
},
addressDetails=null,
cfaAgent=null,
vaAgent=null,
auditTrail=null,
additionalDetails=null,
additionalQuestions=null,
documents=null,
}
deleteset A is: [bankDetails-beneficiaryName]
B is: []
C is: []
Whereas the expected output is :
After deletion object1 is: LeadDetailSRO{
uploadDocumentList=null,
businessOwnerDetails=null,
businessOwnerDetailList=null,
authorizedSignatoryList=null,
businessEntityDetails=null,
leadInfo=null,
bankDetails=BankDetailSRO{
bankName='null',
bankAccountNumber='null',
ifscCode='null',
bankAccountHolder='null',
beneficiaryName='ars',
branchName='noida',
status='null',
nameMatchStatus='null',
reEnterAccountNumber='null',
reEnterIfscCode='null'
},
addressDetails=null,
cfaAgent=null,
vaAgent=null,
auditTrail=null,
additionalDetails=null,
additionalQuestions=null,
documents=null,
}
object2 is: LeadDetailSRO{
uploadDocumentList=null,
businessOwnerDetails=null,
businessOwnerDetailList=null,
authorizedSignatoryList=null,
businessEntityDetails=null,
leadInfo=null,
bankDetails=BankDetailSRO{
bankName='null',
bankAccountNumber='null',
ifscCode='null',
bankAccountHolder='null',
beneficiaryName='ars',
branchName='noida',
status='null',
nameMatchStatus='null',
reEnterAccountNumber='null',
reEnterIfscCode='null'
},
addressDetails=null,
cfaAgent=null,
vaAgent=null,
auditTrail=null,
additionalDetails=null,
additionalQuestions=null,
documents=null,
}
deleteset A is: [bankDetails-beneficiaryName]
B is: []
C is: []
The difference between the outputs is in object1's bankDetails.
LeadDetailSRO is a class that contains a BankDetailSRO object, and BankDetailSRO contains the fields beneficiaryName and branchName.

When fieldAObject is null, you create a new instance, but you never set the field to this new instance.
The reference fieldAObject no longer points to the field's null value but to the new instance you created; objectA, however, still does not reference that object.
You have to write it back explicitly:
if (null == fieldAObject) {
    LOGGER.info("if A is null case : initialise it with an instance");
    Constructor[] constructors = fieldA.getType().getDeclaredConstructors();
    for (Constructor constructor : constructors) {
        constructor.setAccessible(true);
        if (0 == constructor.getParameterCount()) {
            fieldAObject = constructor.newInstance();
            fieldA.set(objectA, fieldAObject); // <-- set the new value
            break;
        }
    }
}
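The effect is easy to reproduce in isolation. In this sketch (Holder is a hypothetical stand-in for LeadDetailSRO), reassigning the local variable returned by Field.get does not touch the target object; only Field.set writes the new instance back:

```java
import java.lang.reflect.Field;

public class WriteBackDemo {
    // Hypothetical target class standing in for LeadDetailSRO
    static class Holder {
        Object child; // starts out null, like bankDetails on objectA
    }

    public static void main(String[] args) throws Exception {
        Holder target = new Holder();
        Field f = Holder.class.getDeclaredField("child");
        f.setAccessible(true);

        Object local = f.get(target); // local copy of the reference (null here)
        local = new Object();         // rebinds the local variable only
        if (target.child != null) throw new AssertionError("field must still be null");

        f.set(target, local);         // explicit write-back
        if (target.child != local) throw new AssertionError("field must now be set");
        System.out.println("field set after write-back: " + (target.child == local));
    }
}
```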

Related

Calling void methods with different parameters

I call two void methods one after the other; the problem is that the second call is not executed, while the first call works fine. The only difference is the parameters.
This is the method that makes the calls:
private void calculateDistributorRabat(InvoiceHeader invoice, Distributor distributor) {
insertRabatToWallet(distributor.getUser(), invoice.getPurchaseOrder());
insertRabatToWallet(distributor.getBranchOffice().getUser(), invoice.getPurchaseOrder());
}
This is the void method that is being called :
private void insertRabatToWallet(User user, PurchaseOrder purchaseOrder) {
Map<ProductType, Rabat> rabatMap = rabatService.getRabatMapPayPo(user.getUserType());
Wallet wallet = walletDao.findByUser(user);
if (wallet == null) {
wallet = new Wallet();
wallet.setUser(user);
}
for (PurchaseOrderDetail pd : purchaseOrder.getPurchaseOrderDetails()) {
Rabat rabat = rabatMap.get(pd.getProductType());
if (rabat == null) {
continue;
}
BigDecimal rabatAmount = rabat.getNominal().multiply(new BigDecimal(pd.getProductQty()));
wallet.setTotalDebet(wallet.getTotalDebet().add(rabatAmount));
wallet.setCurrentAmount(wallet.getCurrentAmount().add(rabatAmount));
WalletTransaction wt = new WalletTransaction();
wt.setAmount(rabatAmount);
wt.setDescription("Rabat " + user.getUserType() + " [" + purchaseOrder.getPurchaseOrderNumber()
+ "] (" + pd.getProductType().getCode()
+ "): Rp." + rabat.getNominal().setScale(0).toString()
+ " x " + pd.getProductQty());
wallet.addWalletTransaction(wt);
walletDao.save(wallet);
}
}

Search for specific value in a deep nested JsonArray/JsonObject

I have a deeply nested JsonObject like this; what is the best way to search for a specific value (null in this example) in this object?
{
"monitors" : ["monitor1"],
"index" : [{
"patterns" : [ "*" ],
"masked" : [ "abcd", "*ip_dest*::/[0-9]{1,3}$/::XXX"],
"allowed" : [ "123", null ]
}],
"permissions" : [ ]
}
For this example, I have a list of keys, I want to get the values for those keys, check if the value has Array type and if yes, search if there is any null in that array. Here is the code I have:
for (Entry<String, DataType> allowedKey : allowedKeys.entrySet()) {
DataType dataType = allowedKey.getValue();
JsonNode value = contentAsNode.get(allowedKey.getKey());
if (dataType == DataType.ARRAY && value != null) {
try {
List contentArray = DefaultObjectMapper.objectMapper.convertValue(value, java.util.List.class);
if (contentArray.contains(null)) {
this.errorType = ErrorType.NULL_ARRAY_ELEMENT;
return false;
}
} catch (Exception e) {
this.errorType = ErrorType.BODY_NOT_PARSEABLE;
return false;
}
}
}
However, contains() cannot find null in this case, because I have a nested array. Since the structure of the JSON object could be different each time (it could have nested arrays or maps or just an array), I was wondering what the best way is to traverse a deeply nested JsonObject to find a specific value.
More clarification: in the above example JSON, the key I am interested in is index. Its value is a map here, but it could be an array or a nested array as well; we do not know beforehand. I want to check whether there is any null among the index values (in this case, there is one).
One easy, viable solution uses JSONPath:
public static void main(String[] args) {
String json = "{\r\n" +
" \"monitors\" : [\"monitor1\"],\r\n" +
" \"index\" : [{\r\n" +
" \"patterns\" : [ \"*\" ],\r\n" +
" \"masked\" : [ \"abcd\", \"*ip_dest*::/[0-9]{1,3}$/::XXX\"],\r\n" +
" \"allowed\" : [ \"123\", null ]\r\n" +
" }],\r\n" +
" \"permissions\" : [ ]\r\n" +
"}" ;
String path = "$.index[?(null in #.allowed)]"; //Check if allowed List has null value i.e. index->allowed
DocumentContext jsonContext = JsonPath.parse(json);
List<?> list = jsonContext.read(path);
if(list.isEmpty()) { //Based on empty List u can return true or false
System.out.println("Not found");
}else {
System.out.println("Found");
}
}
As per the OP's requirement, posting another solution that iterates over the JSONObject recursively:
public static void main(String[] args) {
String json = "{\r\n" +
" \"monitors\" : [\"monitor1\"],\r\n" +
" \"index\" : [{\r\n" +
" \"patterns\" : [ \"*\" ],\r\n" +
" \"masked\" : [ \"abcd\", \"*ip_dest*::/[0-9]{1,3}$/::XXX\"],\r\n" +
" \"allowed\" : [ \"123\", null ],\r\n" +
" \"country\" : \"India\"\r\n" +
" }],\r\n" +
" \"permissions\" : [ ]\r\n" +
"}" ;
try {
iterateJson(new JSONObject(json));
} catch (Exception e) {
e.printStackTrace();
}
}
public static void iterateJson(JSONObject jsonObject) throws Exception {
ObjectMapper mapper = new ObjectMapper();
Iterator<String> iterator = jsonObject.keys();
while(iterator.hasNext()) {
String key = iterator.next();
if(jsonObject.get(key) instanceof JSONArray) {
JSONArray jsonArray = jsonObject.getJSONArray(key);
for(int i=0; i<jsonArray.length(); i++) {
if(jsonArray.get(i) instanceof JSONObject) {
iterateJson(jsonArray.getJSONObject(i));
}else if(jsonArray.get(i) instanceof String){
List<String> list = mapper.readValue(jsonArray.toString(), new TypeReference<List<String>>() {});
System.out.println(key+" :: "+list);
System.out.println("Contains null :: "+list.contains(null));
System.out.println();
break;
}
}
}else if(jsonObject.get(key) instanceof JSONObject) {
iterateJson(jsonObject.getJSONObject(key));
}
}
}
output
masked :: [abcd, *ip_dest*::/[0-9]{1,3}$/::XXX]
Contains null :: false
allowed :: [123, null]
Contains null :: true
patterns :: [*]
Contains null :: false
monitors :: [monitor1]
Contains null :: false
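The same recursive idea can also be written independently of any particular JSON library by walking the Map/List structure a parser typically produces. This is a sketch with hand-built data (not the OP's JSONObject code), but the recursion is the same:

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class NullFinder {
    // True if any value nested anywhere inside maps/lists is null
    static boolean containsNull(Object node) {
        if (node == null) return true;
        if (node instanceof Map) {
            for (Object v : ((Map<?, ?>) node).values())
                if (containsNull(v)) return true;
        } else if (node instanceof List) {
            for (Object v : (List<?>) node)
                if (containsNull(v)) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        // Mirrors the example: index -> [ { allowed: ["123", null] } ]
        Map<String, Object> index = new LinkedHashMap<>();
        index.put("allowed", Arrays.asList("123", null));
        Map<String, Object> root = new LinkedHashMap<>();
        root.put("index", Arrays.asList(index));
        root.put("permissions", Arrays.asList());

        if (!containsNull(root.get("index"))) throw new AssertionError("expected a null under index");
        if (containsNull(root.get("permissions"))) throw new AssertionError("no null expected");
        System.out.println("null found under index: " + containsNull(root.get("index")));
    }
}
```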

ERROR org.omg.CORBA.MARSHAL Sequence length too large

After successfully fetching alarms from the CORBA U2000 server, I now get the error below while reading the values:
ERROR: org.omg.CORBA.MARSHAL: Sequence length too large. Only 12 available and trying to assign 31926513 vmcid: 0x0 minor code: 0 completed: No
org.omg.CORBA.MARSHAL: Sequence length too large. Only 12 available and trying to assign 31926513 vmcid: 0x0 minor code: 0 completed: No
at org.omg.CosNotification.EventBatchHelper.read(EventBatchHelper.java:57)
at AlarmIRPConstDefs.AlarmInformationSeqHelper.read(AlarmInformationSeqHelper.java:51)
at AlarmIRPConstDefs.AlarmInformationSeqHelper.extract(AlarmInformationSeqHelper.java:26)
at com.be.u2k.Main.getAlarmsList(Main.java:144)
at com.be.u2k.Main.main(Main.java:109)
for method AlarmInformationSeqHelper.extract
// Get all active alarms list
private static void getAlarmsList(ORB orb, AlarmIRP alarmIRP) {
try {
ManagedGenericIRPConstDefs.StringTypeOpt filter = new ManagedGenericIRPConstDefs.StringTypeOpt();
filter.value("($type_name == 'x1')"); // Query new alarms and acknowledge or unacknowledge alarms
AlarmIRPConstDefs.DNTypeOpt base_object = new AlarmIRPConstDefs.DNTypeOpt();
BooleanHolder flag = new BooleanHolder();
AlarmIRPSystem.AlarmInformationIteratorHolder iter = new AlarmIRPSystem.AlarmInformationIteratorHolder();
StructuredEvent[] alarmList = alarmIRP.get_alarm_list(filter, base_object, flag, iter);
System.out.println("AlarmIRP get_alarm_list success, flag: " + flag.value + " fetched total: " + (alarmList == null? -1: alarmList.length));
for (StructuredEvent alarm: alarmList) {
if (alarm.header != null) {
System.out.println("fixed_header.event_type.name: " + alarm.header.fixed_header.event_type.type_name
+ " fixed_header.event_type.domain_name: " + alarm.header.fixed_header.event_type.domain_name);
if (alarm.header.variable_header != null) {
for (Property variableHeader: alarm.header.variable_header) {
System.out.println("variable_header.name: " + variableHeader.name + " alarm.header.variable_header.value: " + variableHeader.value);
}
}
}
if (alarm.filterable_data != null) {
for (Property filterableData: alarm.filterable_data) {
System.out.println("data.name: " + filterableData.name);
if (filterableData.value != null && filterableData.value.toString().contains("org.jacorb.orb.CDROutputStream")) {
StructuredEvent[] filterableDataValues = AlarmInformationSeqHelper.extract(filterableData.value);
} else {
System.out.println("data.value: " + filterableData.value);
}
}
}
}
} catch (ManagedGenericIRPSystem.InvalidParameter e) {
System.out.println("ERROR get_alarm_list InvalidParameter (Indicates that the parameter is invalid): " + e) ;
} catch (ManagedGenericIRPSystem.ParameterNotSupported e) {
System.out.println("ERROR get_alarm_list ParameterNotSupported (Indicates that the operation is not supported): " + e) ;
} catch (AlarmIRPSystem.GetAlarmList e) {
System.out.println("ERROR get_alarm_list ParameterNotSupported (Indicates exceptions caused by unknown reasons): " + e) ;
}
}
Or is my way of reading the alarms list incorrect? Thanks.
You can find the example method below for getAlarmList
//Connect to AlarmIRP
AlarmIRP alarmIRP = AlarmIRPHelper.narrow(orb.string_to_object(alarmIrpIOR.value));
StringTypeOpt alarmFilter = new StringTypeOpt();
alarmFilter.value("");
DNTypeOpt base_object = new DNTypeOpt();
base_object.value("");
BooleanHolder flag = new BooleanHolder(false); // false for iteration
AlarmInformationIteratorHolder iter = new AlarmInformationIteratorHolder();
List<String> alarmIds = get_alarm_list(alarmIRP, alarmFilter, base_object, flag, iter);
private List<String> get_alarm_list(org._3gppsa5_2.AlarmIRPSystem.AlarmIRP alarmIRP, org._3gppsa5_2.ManagedGenericIRPConstDefs.StringTypeOpt alarmFilter, org._3gppsa5_2.AlarmIRPConstDefs.DNTypeOpt base_object, BooleanHolder flag, org._3gppsa5_2.AlarmIRPSystem.AlarmInformationIteratorHolder iter) throws org._3gppsa5_2.AlarmIRPSystem.GetAlarmList, org._3gppsa5_2.ManagedGenericIRPSystem.ParameterNotSupported, org._3gppsa5_2.AlarmIRPSystem.NextAlarmInformations, org._3gppsa5_2.ManagedGenericIRPSystem.InvalidParameter, BAD_OPERATION {
logger.info("[get-alarm-list][start]");
alarmIRP.get_alarm_list(alarmFilter, base_object, flag, iter);
List<StructuredEvent> alarms = new ArrayList();
EventBatchHolder alarmInformation = new EventBatchHolder();
short alarmSize = 100;
List<String> alarmIds = new ArrayList();
while (iter.value.next_alarmInformations(alarmSize, alarmInformation)) {
alarms.addAll(Arrays.asList(alarmInformation.value));
logger.info("Current alarm size:" + alarms.size());
}
for (StructuredEvent event : alarms) {
try {
//printAlarm(event);
} catch (Exception ex) {
}
List<Property> rem = new ArrayList<Property>();
rem.addAll(Arrays.asList(PropertySeqHelper.extract(event.remainder_of_body)));
for (Property property : rem) {
if (!property.name.equals(org._3gppsa5_2.AlarmIRPNotifications.NotifyNewAlarm.ALARM_ID)) {
continue;
}
alarmIds.add(property.value.extract_string());
}
}
logger.info("[get-alarm-list][completed] size :" + alarms.size());
return alarmIds;
}
I managed to figure out what that filterableData.value.toString() value containing "org.jacorb.orb.CDROutputStream" is. It turns out that the property named "b" is a TimeBase::UtcT according to the docs.
To convert it to the correct value, which is a UTC timestamp, I changed the condition to:
if (filterableData.name.equals("b") && filterableData.value != null && filterableData.value.toString().contains("org.jacorb.orb.CDROutputStream")) {
long occuranceTime = TimeTHelper.read(filterableData.value.create_input_stream());
System.out.println("data.value: " + occuranceTime);
}

Request parameters coming from JSPs are changed when two different users access the same code at the same time

public String generateDataPDF() {
System.out.println("Inside generate PDF");
String filePath = "";
HttpSession sess = ServletActionContext.getRequest().getSession();
try {
sess.setAttribute("msg", "");
if (getCrnListType().equalsIgnoreCase("F")) {
try {
filePath = getModulePath("CRNLIST_BASE_LOCATION") + File.separator + getCrnFileFileName();
System.out.println("File stored path : " + filePath);
target = new File(filePath);
FileUtils.copyFile(crnFile, target);
} catch (Exception e) {
System.out.println("File path Exception " + e);
}
}
System.out.println("Values from jsp are : 1)Mode of Generation : " + getCrnListType() + " 2)Policy Number : " + getCrnNumber() + " 3)Uploaded File Name : " + getCrnFileFileName() + " 4)LogoType : " + getLogoType()
+ " 5)Output Path : " + getOutputPath() + " 6)Type of Generation : " + getOptionId() + " 7)PDF Name : " + getPdfName());
String srtVAL = "";
String arrayVaue[] = new String[]{getCrnListType(), getCrnListType().equalsIgnoreCase("S") ? getCrnNumber() : filePath, getLogoType().equalsIgnoreCase("WL") ? "0" : "1",
getOutputPath(), getGenMode(), getRenType()};
//INS DB Connection
con = getInsjdbcConnection();
ArrayList selectedCRNList = new ArrayList();
String selectedCRNStr = "";
selectedCRNStr = getSelectedVal(selectedCRNStr, arrayVaue[1]);
String[] fileRes = selectedCRNStr.split("\\,");
if (fileRes[0].equalsIgnoreCase("FAIL")) {
System.out.println("fileRes is FAIL beacause of other extension file.");
sess.setAttribute("pr", "Please upload xls or csv file.");
return SUCCESS;
}
System.out.println("List file is : " + selectedCRNStr);
String st[] = srtVAL.split("[*]");
String billDateStr = DateUtil.getStrDateProc(new Date());
Timestamp strtPasrsingTm = new Timestamp(new Date().getTime());
String minAMPM = DateUtil.getTimeDate(new Date());
String str = "";
String batchID = callSequence();
try {
System.out.println("Inside Multiple policy Generation.");
String userName=sess.getAttribute("loginName").toString();
String list = getProcessesdList(userName);
if (list != null) {
System.out.println("list is not null Users previous data is processing.....");
//setTotalPDFgNERATEDmSG("Data is processing please wait.");
sess.setAttribute("pr","Batch Id "+list+" for User " + userName + " is currently running.Please wait till this Process complete.");
return SUCCESS;
}
String[] policyNo = selectedCRNStr.split("\\,");
int l = 0, f = 0,counter=1;
for (int j = 0; j < policyNo.length; j++,counter++) {
String pdfFileName = "";
int uniqueId=counter;
globUniqueId=uniqueId;
insertData(batchID, new Date(), policyNo[j], getOptionId(), userName,uniqueId);
System.out.println("Executing Proc one by one.");
System.out.println("policyNo[j]" + policyNo[j]);
System.out.println("getOptionId()" + getOptionId());
System.out.println("seqValue i.e batchId : " + batchID);
str = callProcedure(policyNo[j], getOptionId(), batchID);
String[] procResponse = str.split("\\|");
for (int i = 0; i < procResponse.length; i++) {
System.out.println("Response is : " + procResponse[i]);
}
if (procResponse[0].equals("SUCCESS")) {
Generator gen = new Generator();
if (getPdfName().equalsIgnoreCase("true")) {
System.out.println("Checkbox is click i.e true");
pdfFileName = procResponse[1];
} else {
System.out.println("Checkbox is not click i.e false");
String POLICY_SCH_GEN_PSS = getDetailsForFileName(userName, policyNo[j], batchID);
String[] fileName = POLICY_SCH_GEN_PSS.split("\\|");
if (getLogoType().equals("0") || getLogoType().equals("2")) {
System.out.println("If logo is O or 1");
pdfFileName = fileName[1];
} else if (getLogoType().equals("1")) {
System.out.println("If logo is 2");
pdfFileName = fileName[0];
}
}
b1 = gen.genStmt(procResponse[1], procResponse[2], "2", getLogoType(), "0", pdfFileName,"1",userName,batchID);
l++;
updateData(uniqueId,batchID, "Y");
} else {
f++;
updateData(uniqueId,batchID, "F");
}
}
sess.setAttribute("pr","Total "+l+" "+getGenericModulePath("PDF_RES1") + " " + " " + getGenericModulePath("PDF_RES2") + " " + f);
}catch (Exception e) {
updateData(globUniqueId,batchID, "F");
System.out.println("Exception in procedure call");
setTotalPDFgNERATEDmSG("Fail");
e.printStackTrace();
sess.setAttribute("pr", "Server Error.");
return SUCCESS;
}
}catch (Exception ex) {
ex.printStackTrace();
sess.setAttribute("pr", "Server Error.");
return SUCCESS;
}
System.out.println("Above second return");
return SUCCESS;
}
The generateDataPDF method generates PDFs based on the request parameters, i.e. ProductType (GenMode) and the CRN list (uploaded in an Excel file). The code works fine when only a single user generates PDFs. But if two different users (users and roles are assigned in the application) start the process at the same time, the request parameters are overridden. Suppose the first user requests PDFs for 50 customers for product 1; while User1's process is still running, a second user makes a request for product 2. User1's PDFs are then generated, but for product 2! Here batchId is unique for every single request, and a table is maintained where batch_id, all PDFs, and generation flags are stored. How do I solve this?
As per your comment, this is what I would do; it's probably not the best way!
Firstly: create a function to collect all your data at the beginning. You should not modify/update/create anything while you are generating a PDF, i.e. an array/list collectPDFData() which should return an array/list.
Secondly: use a synchronized method such as synchronized boolean generatePDF(array/list).
Synchronized methods use the monitor lock (intrinsic lock) to manage synchronization, so each synchronized method shares the same monitor of the corresponding object.
NB: if you use synchronized, it's probably unnecessary to collect all your data separately, but I think it's good practice to write small functions dedicated to a specific task.
Thus, your code should be refactored a little bit.
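As a minimal illustration of the synchronized approach (hypothetical names, not the OP's action class): when generatePDF is synchronized, two requests cannot interleave, so per-request state held in shared fields is not overwritten mid-generation:

```java
import java.util.ArrayList;
import java.util.List;

public class PdfLock {
    private String currentProduct;                 // per-request state, shared by mistake
    private final List<String> results = new ArrayList<>();

    // synchronized: only one thread runs this at a time on the same object,
    // so currentProduct cannot be overwritten while a generation is running
    synchronized void generatePDF(String product) {
        currentProduct = product;
        try { Thread.sleep(10); } catch (InterruptedException ignored) {} // simulate slow work
        results.add(currentProduct);               // always the caller's own product
    }

    public static void main(String[] args) throws Exception {
        PdfLock action = new PdfLock();
        Thread t1 = new Thread(() -> action.generatePDF("product1"));
        Thread t2 = new Thread(() -> action.generatePDF("product2"));
        t1.start(); t2.start();
        t1.join(); t2.join();
        if (!action.results.contains("product1") || !action.results.contains("product2"))
            throw new AssertionError("a request saw another request's parameters");
        System.out.println("results: " + action.results);
    }
}
```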

DynamoDB Parallel Scan - Java Synchronization

I'm trying to use the DynamoDB Parallel Scan Example:
http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/LowLevelJavaScanning.html
I have 200,000 items; I've taken the sequential scan code and modified it slightly for my usage:
Map<String, AttributeValue> lastKeyEvaluated = null;
do
{
ScanRequest scanRequest = new ScanRequest()
.withTableName(tableName)
.withExclusiveStartKey(lastKeyEvaluated);
ScanResult result = client.scan(scanRequest);
double counter = 0;
for(Map<String, AttributeValue> item : result.getItems())
{
itemSerialize.add("Set:"+counter);
for (Map.Entry<String, AttributeValue> getItem : item.entrySet())
{
String attributeName = getItem.getKey();
AttributeValue value = getItem.getValue();
itemSerialize.add(attributeName
+ (value.getS() == null ? "" : ":" + value.getS())
+ (value.getN() == null ? "" : ":" + value.getN())
+ (value.getB() == null ? "" : ":" + value.getB())
+ (value.getSS() == null ? "" : ":" + value.getSS())
+ (value.getNS() == null ? "" : ":" + value.getNS())
+ (value.getBS() == null ? "" : ":" + value.getBS()));
}
counter += 1;
}
lastKeyEvaluated = result.getLastEvaluatedKey();
}
while(lastKeyEvaluated != null);
The counter gives exactly 200,000 when this code finishes; however, I also wanted to try the parallel scan.
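As an aside, stripped of DynamoDB specifics, the do/while loop above is the standard LastEvaluatedKey pagination pattern: keep the last key returned by each page and pass it as the next start key until it comes back null. A minimal generic sketch (fetchPage is a hypothetical stand-in for client.scan):

```java
import java.util.ArrayList;
import java.util.List;

public class PaginationLoop {
    // Stand-in for a paged API: returns up to pageSize items starting at startKey
    static List<Integer> fetchPage(List<Integer> table, Integer startKey, int pageSize) {
        int from = (startKey == null) ? 0 : startKey;
        int to = Math.min(from + pageSize, table.size());
        return table.subList(from, to);
    }

    public static void main(String[] args) {
        List<Integer> table = new ArrayList<>();
        for (int i = 0; i < 25; i++) table.add(i);

        List<Integer> collected = new ArrayList<>();
        Integer lastKey = null;                 // plays the role of lastKeyEvaluated
        do {
            collected.addAll(fetchPage(table, lastKey, 10));
            // next start key, or null once the final page has been read
            lastKey = (collected.size() < table.size()) ? collected.size() : null;
        } while (lastKey != null);

        if (collected.size() != 25) throw new AssertionError("missed items: " + collected.size());
        System.out.println("collected " + collected.size() + " items");
    }
}
```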
Function Call:
ScanSegmentTask task = null;
ArrayList<String> list = new ArrayList<String>();
try
{
ExecutorService executor = Executors.newFixedThreadPool(numberOfThreads);
int totalSegments = numberOfThreads;
for (int segment = 0; segment < totalSegments; segment++)
{
// Runnable task that will only scan one segment
task = new ScanSegmentTask(tableName, itemLimit, totalSegments, segment, list);
// Execute the task
executor.execute(task);
}
shutDownExecutorService(executor);
}
.......Catches something if error
return list;
Class:
I have a static list through which the data is shared with all the threads. I was able to retrieve the lists and output the amount of data.
// Runnable task for scanning a single segment of a DynamoDB table
private static class ScanSegmentTask implements Runnable
{
// DynamoDB table to scan
private String tableName;
// number of items each scan request should return
private int itemLimit;
// Total number of segments
// Equals to total number of threads scanning the table in parallel
private int totalSegments;
// Segment that will be scanned with by this task
private int segment;
static ArrayList<String> list_2;
Object lock = new Object();
public ScanSegmentTask(String tableName, int itemLimit, int totalSegments, int segment, ArrayList<String> list)
{
this.tableName = tableName;
this.itemLimit = itemLimit;
this.totalSegments = totalSegments;
this.segment = segment;
list_2 = list;
}
public void run()
{
System.out.println("Scanning " + tableName + " segment " + segment + " out of " + totalSegments + " segments " + itemLimit + " items at a time...");
Map<String, AttributeValue> exclusiveStartKey = null;
int totalScannedItemCount = 0;
int totalScanRequestCount = 0;
int counter = 0;
try
{
while(true)
{
ScanRequest scanRequest = new ScanRequest()
.withTableName(tableName)
.withLimit(itemLimit)
.withExclusiveStartKey(exclusiveStartKey)
.withTotalSegments(totalSegments)
.withSegment(segment);
ScanResult result = client.scan(scanRequest);
totalScanRequestCount++;
totalScannedItemCount += result.getScannedCount();
synchronized(lock)
{
for(Map<String, AttributeValue> item : result.getItems())
{
list_2.add("Set:"+counter);
for (Map.Entry<String, AttributeValue> getItem : item.entrySet())
{
String attributeName = getItem.getKey();
AttributeValue value = getItem.getValue();
list_2.add(attributeName
+ (value.getS() == null ? "" : ":" + value.getS())
+ (value.getN() == null ? "" : ":" + value.getN())
+ (value.getB() == null ? "" : ":" + value.getB())
+ (value.getSS() == null ? "" : ":" + value.getSS())
+ (value.getNS() == null ? "" : ":" + value.getNS())
+ (value.getBS() == null ? "" : ":" + value.getBS()));
}
counter += 1;
}
}
exclusiveStartKey = result.getLastEvaluatedKey();
if (exclusiveStartKey == null)
{
break;
}
}
}
catch (AmazonServiceException ase)
{
System.err.println(ase.getMessage());
}
finally
{
System.out.println("Scanned " + totalScannedItemCount + " items from segment " + segment + " out of " + totalSegments + " of " + tableName + " with " + totalScanRequestCount + " scan requests");
}
}
}
Executor Service Shut Down:
public static void shutDownExecutorService(ExecutorService executor)
{
executor.shutdown();
try
{
if (!executor.awaitTermination(10, TimeUnit.SECONDS))
{
executor.shutdownNow();
}
}
catch (InterruptedException e)
{
executor.shutdownNow();
Thread.currentThread().interrupt();
}
}
However, the number of items changes every time I run this piece of code (it varies around 60,000 in total, about 6,000 per thread, with 10 threads created). Removing the synchronization does not change the result either.
Is there a bug in the synchronization, or in the Amazon AWS API?
Thanks, all.
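One concrete problem worth noting in the first version above: `Object lock = new Object()` is an instance field, so every ScanSegmentTask synchronizes on a different monitor, and the synchronized block provides no mutual exclusion between the threads sharing the static list. A reduced sketch of what a single shared monitor gives you (a hypothetical counter, not the scan code; with a per-task lock the final count would typically come up short):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class LockScope {
    static int counter = 0;
    static final Object SHARED_LOCK = new Object(); // one monitor shared by all tasks

    static void run(int threads, int increments) throws InterruptedException {
        counter = 0;
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        for (int t = 0; t < threads; t++) {
            pool.execute(() -> {
                for (int i = 0; i < increments; i++) {
                    synchronized (SHARED_LOCK) {   // a per-task `new Object()` here would not exclude
                        counter++;
                    }
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(30, TimeUnit.SECONDS);
    }

    public static void main(String[] args) throws InterruptedException {
        run(10, 100_000);
        if (counter != 1_000_000) throw new AssertionError("lost updates: " + counter);
        System.out.println("counter = " + counter);
    }
}
```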
EDIT:
The new function call:
ScanSegmentTask task = null;
ArrayList<String> list = new ArrayList<String>();
try
{
ExecutorService executor = Executors.newFixedThreadPool(numberOfThreads);
int totalSegments = numberOfThreads;
for (int segment = 0; segment < totalSegments; segment++)
{
// Runnable task that will only scan one segment
task = new ScanSegmentTask(tableName, itemLimit, totalSegments, segment);
// Execute the task
Future<ArrayList<String>> future = executor.submit(task);
list.addAll(future.get());
}
shutDownExecutorService(executor);
}
The new class:
// Runnable task for scanning a single segment of a DynamoDB table
private static class ScanSegmentTask implements Callable<ArrayList<String>>
{
// DynamoDB table to scan
private String tableName;
// number of items each scan request should return
private int itemLimit;
// Total number of segments
// Equals to total number of threads scanning the table in parallel
private int totalSegments;
// Segment that will be scanned with by this task
private int segment;
ArrayList<String> list_2 = new ArrayList<String>();
static int counter = 0;
public ScanSegmentTask(String tableName, int itemLimit, int totalSegments, int segment)
{
this.tableName = tableName;
this.itemLimit = itemLimit;
this.totalSegments = totalSegments;
this.segment = segment;
}
@SuppressWarnings("finally")
public ArrayList<String> call()
{
System.out.println("Scanning " + tableName + " segment " + segment + " out of " + totalSegments + " segments " + itemLimit + " items at a time...");
Map<String, AttributeValue> exclusiveStartKey = null;
try
{
while(true)
{
ScanRequest scanRequest = new ScanRequest()
.withTableName(tableName)
.withLimit(itemLimit)
.withExclusiveStartKey(exclusiveStartKey)
.withTotalSegments(totalSegments)
.withSegment(segment);
ScanResult result = client.scan(scanRequest);
for(Map<String, AttributeValue> item : result.getItems())
{
list_2.add("Set:"+counter);
for (Map.Entry<String, AttributeValue> getItem : item.entrySet())
{
String attributeName = getItem.getKey();
AttributeValue value = getItem.getValue();
list_2.add(attributeName
+ (value.getS() == null ? "" : ":" + value.getS())
+ (value.getN() == null ? "" : ":" + value.getN())
+ (value.getB() == null ? "" : ":" + value.getB())
+ (value.getSS() == null ? "" : ":" + value.getSS())
+ (value.getNS() == null ? "" : ":" + value.getNS())
+ (value.getBS() == null ? "" : ":" + value.getBS()));
}
counter += 1;
}
exclusiveStartKey = result.getLastEvaluatedKey();
if (exclusiveStartKey == null)
{
break;
}
}
}
catch (AmazonServiceException ase)
{
System.err.println(ase.getMessage());
}
finally
{
return list_2;
}
}
}
Final EDIT:
Function Call:
ScanSegmentTask task = null;
ArrayList<String> list = new ArrayList<String>();
ArrayList<Future<ArrayList<String>>> holdFuture = new ArrayList<Future<ArrayList<String>>>();
try
{
ExecutorService executor = Executors.newFixedThreadPool(numberOfThreads);
int totalSegments = numberOfThreads;
for (int segment = 0; segment < totalSegments; segment++)
{
// Runnable task that will only scan one segment
task = new ScanSegmentTask(tableName, itemLimit, totalSegments, segment);
// Execute the task
Future<ArrayList<String>> future = executor.submit(task);
holdFuture.add(future);
}
for (int i = 0 ; i < holdFuture.size(); i++)
{
boolean flag = false;
while(flag == false)
{
Thread.sleep(1000);
if(holdFuture.get(i).isDone())
{
list.addAll(holdFuture.get(i).get());
flag = true;
}
}
}
shutDownExecutorService(executor);
}
Class:
private static class ScanSegmentTask implements Callable<ArrayList<String>>
{
// DynamoDB table to scan
private String tableName;
// number of items each scan request should return
private int itemLimit;
// Total number of segments
// Equals to total number of threads scanning the table in parallel
private int totalSegments;
// Segment that will be scanned with by this task
private int segment;
ArrayList<String> list_2 = new ArrayList<String>();
static AtomicInteger counter = new AtomicInteger(0);
public ScanSegmentTask(String tableName, int itemLimit, int totalSegments, int segment)
{
this.tableName = tableName;
this.itemLimit = itemLimit;
this.totalSegments = totalSegments;
this.segment = segment;
}
@SuppressWarnings("finally")
public ArrayList<String> call()
{
System.out.println("Scanning " + tableName + " segment " + segment + " out of " + totalSegments + " segments " + itemLimit + " items at a time...");
Map<String, AttributeValue> exclusiveStartKey = null;
try
{
while(true)
{
ScanRequest scanRequest = new ScanRequest()
.withTableName(tableName)
.withLimit(itemLimit)
.withExclusiveStartKey(exclusiveStartKey)
.withTotalSegments(totalSegments)
.withSegment(segment);
ScanResult result = client.scan(scanRequest);
for(Map<String, AttributeValue> item : result.getItems())
{
list_2.add("Set:"+counter);
for (Map.Entry<String, AttributeValue> getItem : item.entrySet())
{
String attributeName = getItem.getKey();
AttributeValue value = getItem.getValue();
list_2.add(attributeName
+ (value.getS() == null ? "" : ":" + value.getS())
+ (value.getN() == null ? "" : ":" + value.getN())
+ (value.getB() == null ? "" : ":" + value.getB())
+ (value.getSS() == null ? "" : ":" + value.getSS())
+ (value.getNS() == null ? "" : ":" + value.getNS())
+ (value.getBS() == null ? "" : ":" + value.getBS()));
}
counter.addAndGet(1);
}
exclusiveStartKey = result.getLastEvaluatedKey();
if (exclusiveStartKey == null)
{
break;
}
}
}
catch (AmazonServiceException ase)
{
System.err.println(ase.getMessage());
}
finally
{
return list_2;
}
}
}
OK, I believe the issue is in the way you synchronized.
In your case, your lock is pretty much pointless, as each thread has its own lock, and so synchronizing never actually blocks one thread from running the same piece of code. I believe that this is the reason that removing synchronization does not change the result -- because it never would have had an effect in the first place.
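To make the lock-scope point concrete, here is a standalone sketch (not the original code): a synchronized block only excludes threads that synchronize on the same lock object, which is why per-instance locks exclude nothing.

```java
import java.util.ArrayList;
import java.util.List;

public class LockDemo {
    // A plain ArrayList shared by both threads, as in the question
    static final List<Integer> shared = new ArrayList<>();

    // Each worker synchronizes on whatever lock object it is handed
    static Thread worker(Object lock) {
        return new Thread(() -> {
            for (int i = 0; i < 10_000; i++) {
                synchronized (lock) { // only excludes threads holding the SAME lock
                    shared.add(i);
                }
            }
        });
    }

    public static void main(String[] args) throws InterruptedException {
        // One common lock object: every add() is mutually excluded, so the
        // final size is exact. Handing each thread its own lock object
        // (new Object() per thread) would exclude nothing, and the count
        // could come out wrong -- the situation described in the question.
        Object common = new Object();
        Thread t1 = worker(common);
        Thread t2 = worker(common);
        t1.start(); t2.start();
        t1.join(); t2.join();
        System.out.println(shared.size()); // 20000
    }
}
```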
I believe your issue is in fact due to the static ArrayList<String> that's shared by your threads. ArrayList is not thread-safe, so concurrent operations on it are not guaranteed to succeed; as a result, you have to synchronize access to it. Without proper synchronization, two threads could each add an element to an empty ArrayList and yet leave it with a size of 1 -- one update simply overwrites the other.
As I said before, while you do have a synchronized block, it really isn't doing anything. You could synchronize on list_2 instead, but all that would do is serialize every add: your threads would constantly contend for the same lock on the ArrayList.
There are a few solutions to this. You can use Collections.synchronizedList(list_2) to create a synchronized wrapper around your ArrayList. This way, adding to the list is guaranteed to succeed. However, this incurs a synchronization cost per operation, and so isn't ideal.
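A minimal sketch of that wrapper approach (standalone code, not the original class -- the names are illustrative):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class SyncListDemo {
    // Two threads add to one wrapped list; the wrapper locks internally
    // on every operation, so no updates are lost.
    static int fillConcurrently() throws InterruptedException {
        List<String> list = Collections.synchronizedList(new ArrayList<String>());
        Runnable producer = () -> {
            for (int i = 0; i < 5_000; i++) {
                list.add("item");
            }
        };
        Thread t1 = new Thread(producer);
        Thread t2 = new Thread(producer);
        t1.start(); t2.start();
        t1.join(); t2.join();
        return list.size();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(fillConcurrently()); // 10000 -- no lost updates
    }
}
```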
What I would do is actually have ScanSegmentTask implement Callable (technically Callable<ArrayList<String>>). The Callable interface is almost exactly like the Runnable interface, except its method is call(), which returns a value.
Why is this important? I think that what would produce the best results for you is this:
Make list_2 an instance variable, initialized to a blank list
Have each thread add to this list exactly as you have done
Return list_2 when you are done
Concatenate each resulting ArrayList<String> to the original ArrayList using addAll()
This way, you have no synchronization overhead to deal with!
This will require a few changes to your executor code. Instead of calling execute(), you'll need to call submit(). This returns a Future object (Future<ArrayList<String>> in your case) that holds the result of the call() method. You'll need to store these Future objects in some collection -- an array, an ArrayList, it doesn't matter.
To retrieve the results, simply loop through the collection of Future objects and call get() on each. This call blocks until the task behind that Future is complete.
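Here is a self-contained sketch of the submit()/Future pattern, with a trivial Callable standing in for ScanSegmentTask (the segment counts and item strings are made up for illustration):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class FutureDemo {
    static List<String> scanAll() throws Exception {
        ExecutorService executor = Executors.newFixedThreadPool(4);
        List<Future<List<String>>> futures = new ArrayList<>();

        // Submit one Callable per segment; each builds its own private list,
        // so no synchronization between tasks is needed.
        for (int segment = 0; segment < 4; segment++) {
            final int seg = segment;
            Callable<List<String>> task = () -> {
                List<String> partial = new ArrayList<>();
                for (int i = 0; i < 3; i++) {
                    partial.add("segment-" + seg + "-item-" + i);
                }
                return partial;
            };
            futures.add(executor.submit(task));
        }

        // get() already blocks until its task finishes, so no
        // isDone()/sleep polling loop is needed.
        List<String> merged = new ArrayList<>();
        for (Future<List<String>> f : futures) {
            merged.addAll(f.get());
        }
        executor.shutdown();
        return merged;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(scanAll().size()); // 12 (4 segments x 3 items)
    }
}
```

Note that because get() blocks, the isDone() polling with Thread.sleep(1000) in the Final EDIT above is unnecessary.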
I think that's it. While this is more complicated, I think this is the best performance you're going to get, as with enough threads either CPU contention or your network link will become the bottleneck. Please ask if you have any questions, and I'll update as needed.