My videos.dat contains Video objects whose IDs were generated as "V" + (1000 * disk + autoIncrement):
Video<ID; Name; Disk; $Fee>
Video<V1001; Avenger; 1; $12.0>
Video<V1002; 50 First Dates; 1; $13.0>
Video<V2001; Furious 7; 2; $18.0>
My startUp() method loads them into my GUI. At startup, how can I count the existing videos and continue adding new Video objects following the auto-increment rule?
public void startUp() {
    int count = 0;
    try {
        try (ObjectInputStream ois = new ObjectInputStream(new FileInputStream("videos.dat"))) {
            objects = (Vector) ois.readObject();
            for (Object v : objects) {
                System.out.println(v.toString());
                count++;
                fireObjectCreated(v);
            }
            // ois is closed automatically by try-with-resources
        } catch (SecurityException ex) {
            Logger.getLogger(VideoManager.class.getName()).log(Level.SEVERE, null, ex);
        } catch (IllegalArgumentException ex) {
            Logger.getLogger(VideoManager.class.getName()).log(Level.SEVERE, null, ex);
        }
        System.out.println("Got " + count + " video objects");
    } catch (FileNotFoundException ex) {
        System.err.println("FileNotFoundException");
    } catch (IOException ex) {
        System.err.println("IOException");
    } catch (ClassNotFoundException ex) {
        System.err.println("ClassNotFoundException");
    }
}
I'm using a HashMap to store the per-disk counters and auto-increment them, but after startUp() my program resets and starts again from V1001.
public static int currentNumberOfStaticVideoID(int value) {
    if (listCountVideoID.containsKey(value)) {
        System.out.println("Contain disk type " + value + " : " + listCountVideoID.containsKey(value));
        int id = listCountVideoID.get(value);
        System.out.println("Get current id: " + id);
        int newid = id + 1;
        listCountVideoID.put(value, newid);
        System.out.println("New id: " + newid);
        return newid;
    }
    int newKey = value * 1000;
    listCountVideoID.put(value, newKey);
    return newKey;
}
public static Map<Integer, Integer> listCountVideoID = new HashMap<Integer,Integer>();
So, can anyone help me figure out these goals (see the sketch after the list):
+ load the data file >> load objects
+ continue storing videos with ID = "V" + (1000 * disk + autoIncrement)
+ if disk = 1 > continue from V1003
+ if disk = 2 > continue from V2002
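One way to meet these goals is to rebuild the counter map from the videos just deserialized in startUp(), so the next generated ID continues where the file left off. Below is a minimal sketch, assuming a Video class whose getId() returns strings such as "V1002" (adjust to the actual getter name); it reuses the objects vector and listCountVideoID map from the question.

// Hedged sketch: after readObject(), seed listCountVideoID with the highest
// numeric ID seen per disk, so currentNumberOfStaticVideoID() keeps counting
// instead of restarting at disk * 1000.
// Assumption: Video.getId() returns the textual ID, e.g. "V1002".
for (Object o : objects) {
    Video video = (Video) o;
    int numericId = Integer.parseInt(video.getId().substring(1)); // "V1002" -> 1002
    int disk = numericId / 1000;                                  // 1002 -> disk 1
    Integer current = listCountVideoID.get(disk);
    if (current == null || numericId > current) {
        listCountVideoID.put(disk, numericId);                    // remember the max per disk
    }
}
// With the sample file, listCountVideoID becomes {1=1002, 2=2001}, so the next
// IDs produced by currentNumberOfStaticVideoID(1) and (2) are 1003 and 2002.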
I am currently facing an issue with the method getSurroundingSumGrid(), which is supposed to take data from an earlier grid built from text-file data and use it to compute new values in the array sumGrid. The STATICGRID array is built with the correct values at first, but as the for loop continues, the STATICGRID values change to whatever I set sumGrid to. I don't have any code where STATICGRID is ever assigned another value, and if I did, it should give an error.
public double[][] getSurroundingSumGrid() {
    this.sumGrid = getBaseGrid();
    for (int rowNum = 0; rowNum < sumGrid.length; rowNum++) {
        final double[][] STATICGRID = this.getBaseGrid();
        double topNum = 0, botNum = 0, rightNum = 0, leftNum = 0;
        for (int colNum = 0; colNum < sumGrid[0].length; colNum++) {
            try {
                topNum = STATICGRID[rowNum - 1][colNum];
                System.out.println("TOPNUM : (" + (rowNum-1) + "," + colNum + ") " + STATICGRID[rowNum-1][colNum]);
            } catch (Exception e) {
                topNum = STATICGRID[rowNum][colNum];
                System.out.println("Top IndexOutOfBoundsException: " + STATICGRID[rowNum][colNum] + " used instead.");
            }
            try {
                botNum = STATICGRID[rowNum + 1][colNum];
                System.out.println("BOTNUM : (" + (rowNum+1) + "," + colNum + ") " + STATICGRID[rowNum+1][colNum]);
            } catch (Exception e) {
                botNum = STATICGRID[rowNum][colNum];
                System.out.println("Bot IndexOutOfBoundsException: " + STATICGRID[rowNum][colNum] + " used instead.");
            }
            try {
                leftNum = STATICGRID[rowNum][colNum - 1];
                System.out.println("LEFTNUM : (" + rowNum + "," + (colNum-1) + ") " + STATICGRID[rowNum][colNum-1]);
            } catch (Exception e) {
                leftNum = STATICGRID[rowNum][colNum];
                System.out.println("Left IndexOutOfBoundsException: " + STATICGRID[rowNum][colNum] + " used instead.");
            }
            try {
                rightNum = STATICGRID[rowNum][colNum + 1];
                System.out.println("RIGHTNUM : (" + rowNum + "," + (colNum+1) + ") " + STATICGRID[rowNum][colNum+1]);
            } catch (Exception e) {
                rightNum = STATICGRID[rowNum][colNum];
                System.out.println("Right IndexOutOfBoundsException: " + STATICGRID[rowNum][colNum] + " used instead.");
            }
            this.sumGrid[rowNum][colNum] = topNum + botNum + rightNum + leftNum;
            System.out.println("STATICGRID NEW NUM : " + STATICGRID[rowNum][colNum]);
            System.out.println("SUMGRID NEW NUM : " + sumGrid[rowNum][colNum]);
        }
    }
    return this.sumGrid;
}
When running these tests I can see very clearly that the data in both arrays changes over time, which in turn gives me wrong results. I've spent about two hours just moving things around and can't figure out how to get this to work properly.
As you can see, I even attempted rebuilding the STATICGRID array on every iteration of the for loop, and it made no difference. It behaves the same regardless of where I declare STATICGRID (either outside the loop or at the top of it, final or not). After looking at it for so long I'm beyond confused about why my code isn't working. I have a slight feeling it's the try-catch statements, but I wouldn't know why. I don't know much about try-catch; it's there because the lookups can throw an IndexOutOfBoundsException, and per the assignment instructions the cell should count itself whenever that happens instead of failing.
Thanks and I hope this makes sense.
Alright, thanks to FredK's suggestion to use a deep copy, I did some research and used this method to get better results. This is the unoptimized deepCopy by Philip Isehour. I don't fully understand it yet, but I'm going to use it for now and spend some time learning how it works. I'm currently a CS221 student and we haven't covered deep copies yet.
public static Object deepCopy(Object orig) {
    Object obj = null;
    try {
        // Write the object out to a byte array
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        ObjectOutputStream out = new ObjectOutputStream(bos);
        out.writeObject(orig);
        out.flush();
        out.close();
        // Make an input stream from the byte array and read
        // a copy of the object back in.
        ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()));
        obj = in.readObject();
    } catch (IOException e) {
        e.printStackTrace();
    } catch (ClassNotFoundException cnfe) {
        cnfe.printStackTrace();
    }
    return obj;
}
With the deepCopy method in place, I was able to stop the array from changing by wrapping each getBaseGrid() call in it:
public double[][] getSurroundingSumGrid() {
    this.sumGrid = (double[][]) GridMonitor.deepCopy(this.getBaseGrid());
    double[][] staticGrid = (double[][]) GridMonitor.deepCopy(this.getBaseGrid());
    double topNum, botNum, rightNum, leftNum;
    for (int rowNum = 0; rowNum < sumGrid.length; rowNum++) {
        for (int colNum = 0; colNum < sumGrid[0].length; colNum++) {
            try {
                topNum = staticGrid[rowNum - 1][colNum];
            } catch (Exception e) {
                topNum = staticGrid[rowNum][colNum];
            }
            try {
                botNum = staticGrid[rowNum + 1][colNum];
            } catch (Exception e) {
                botNum = staticGrid[rowNum][colNum];
            }
            try {
                leftNum = staticGrid[rowNum][colNum - 1];
            } catch (Exception e) {
                leftNum = staticGrid[rowNum][colNum];
            }
            try {
                rightNum = staticGrid[rowNum][colNum + 1];
            } catch (Exception e) {
                rightNum = staticGrid[rowNum][colNum];
            }
            this.sumGrid[rowNum][colNum] = topNum + botNum + rightNum + leftNum;
        }
    }
    return this.sumGrid;
}
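As an aside, the serialization round trip copies the whole object graph, which is why the two grids stop sharing row references and the values stop changing. For a rectangular double[][] specifically, a lighter alternative (a hedged sketch, not part of the assignment code) is to copy it row by row, since the rows hold primitives:

// Hypothetical helper: a row-by-row copy of a rectangular double[][].
// Each row is cloned, so the copy shares no row arrays with the original.
public static double[][] copyGrid(double[][] source) {
    double[][] copy = new double[source.length][];
    for (int row = 0; row < source.length; row++) {
        copy[row] = source[row].clone();
    }
    return copy;
}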
Thanks!
I am trying to save tweets matching keywords. I know the free API only returns results from the last 7 days, but I never get a timeline span greater than a few hours, and sometimes only an hour. I did set since() and until() on the search query. The maximum number of tweets I've gotten from a single run was fewer than 400. Can anyone tell me why it stops automatically with so few results? Thanks.
public static void main(String[] args) throws TwitterException {
    String KEY_word;
    String Exclude;
    String Since;
    String Until;
    String OPT_dir;
    String time;
    int x;
    Propertyloader confg = new Propertyloader();
    KEY_word = confg.getProperty("KEY_word");
    Exclude = confg.getProperty("Exclude");
    Since = confg.getProperty("Since");
    Until = confg.getProperty("Until");
    OPT_dir = confg.getProperty("OPT_dir");
    Twitter twitter = new TwitterFactory().getInstance();
    try {
        time = new SimpleDateFormat("yyyyMMddHHmm'.txt'").format(new Date());
        x = 1;
        Query query = new Query(KEY_word + Exclude);
        query.since(Since);
        query.until(Until);
        QueryResult result;
        do {
            result = twitter.search(query);
            List<Status> tweets = result.getTweets();
            for (Status tweet : tweets) {
                try {
                    String filedir = OPT_dir + KEY_word + time;
                    writeStringToFile(filedir, x + ". " + "#" + tweet.getUser().getScreenName() + ", At: " + tweet.getCreatedAt() + ", Rt= " + tweet.getRetweetCount() + ", Text: " + tweet.getText());
                    x += 1;
                } catch (IOException e) {
                    // TODO Auto-generated catch block
                    e.printStackTrace();
                }
            }
        } while ((query = result.nextQuery()) != null);
        System.exit(0);
    } catch (TwitterException te) {
        te.printStackTrace();
        System.out.println("Failed to search tweets: " + te.getMessage());
        System.exit(-1);
    }
}

public static void writeStringToFile(String filePathAndName, String stringToBeWritten) throws IOException {
    try {
        String filename = filePathAndName;
        boolean append = true;
        FileWriter fw = new FileWriter(filename, append);
        fw.write(stringToBeWritten); // appends the string to the file
        fw.write("\n" + "\n");
        fw.close();
    } catch (IOException ioe) {
        System.err.println("IOException: " + ioe.getMessage());
    }
}
You can get more tweets by using setMaxId. Here is an example:
long lowestTweetId = Long.MAX_VALUE;
x = 1;
Query query = new Query("stackoverflow");
query.since("2018-08-10");
query.until("2018-08-16");
query.setCount(100); // The number of tweets to return per page, up to a maximum of 100. Defaults to 15. https://developer.twitter.com/en/docs/tweets/search/api-reference/get-search-tweets.html
query.setResultType(Query.RECENT); // to get an order
int searchResultCount = 100;
QueryResult result;
do {
    result = twitter.search(query);
    List<Status> tweets = result.getTweets();
    searchResultCount = tweets.size(); // update the count so the loop ends once a page comes back empty
    for (Status tweet : tweets) {
        try {
            System.out.println("#" + tweet.getUser().getScreenName() + ", At: " + tweet.getCreatedAt());
            x += 1;
            if (tweet.getId() < lowestTweetId) {
                lowestTweetId = tweet.getId();
                query.setMaxId(lowestTweetId - 1);
            } else {
                // each new maxId should be smaller than the previous one, so break here
                // (e.g. break out of both loops, or handle it however you prefer)
            }
        } catch (Exception e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
} while (searchResultCount != 0);
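For context on why this helps: the standard search endpoint returns at most 100 tweets per call and only indexes roughly the last 7 days, so since()/until() alone cannot make a single query cover a longer span. Setting max_id to one less than the lowest ID already seen asks the next call for strictly older tweets, and looping until a page comes back empty walks backwards through everything the index still holds.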
I have to pass records to a UDF which calls an API. We want to do this in parallel, so we are using Spark, and that's why the UDF is being developed. The problem is that the UDF must receive only 100 records at a time, no more; it can't handle more than 100 records in parallel. So how do we ensure that only 100 records are passed to it in one go? Please note we don't want to use count() on the whole dataset.
I am attaching the UDF code here. It's a generic UDF which returns an array of structs. Moreover, if we pass 100 records as the batchsize each time, then with, say, 198 records we won't know (since we don't want to use count()) that the last batch holds only 98. How do we handle that?
To restate: I have a generic UDF in which a call is made to an API, but the UDF first builds a batch of 100 and only then calls the REST API. The arguments the UDF takes are x1: string, x2: string, batchsize: integer (currently the batchsize is 100), so inside the UDF the call will not happen until the batch size reaches 100, and for each earlier record it returns null.
So up to the 99th record it returns null, and at the 100th record the call happens.
Now the problem: since the batch size is 100 and the call only takes place on the 100th record, with, say, 198 records in the file the first 100 get their output but the other 98 only return null, because they never get processed.
So please help me find a way around this. The UDF takes one record at a time but keeps collecting until the 100th record. I hope this clears things up. (A batching sketch follows the UDF code below.)
public class Standardize_Address extends GenericUDF {
    private static final Logger logger = LoggerFactory.getLogger(Standardize_Address.class);
    private int counter = 0;
    Client client = null;
    private Batch batch = new Batch();

    public Standardize_Address() {
        client = new ClientBuilder().withUrl("https://ss-staging-public.beringmedia.com/street-address").build();
    }

    // StringObjectInspector streeti;
    PrimitiveObjectInspector streeti;
    PrimitiveObjectInspector cityi;
    PrimitiveObjectInspector zipi;
    PrimitiveObjectInspector statei;
    PrimitiveObjectInspector batchsizei;
    private ArrayList ret;

    @Override
    public String getDisplayString(String[] argument) {
        return "My display string";
    }

    @Override
    public ObjectInspector initialize(ObjectInspector[] args) throws UDFArgumentException {
        System.out.println("under initialize");
        if (args[0] == null) {
            throw new UDFArgumentTypeException(0, "NO Street is mentioned");
        }
        if (args[1] == null) {
            throw new UDFArgumentTypeException(0, "No Zip is mentioned");
        }
        if (args[2] == null) {
            throw new UDFArgumentTypeException(0, "No city is mentioned");
        }
        if (args[3] == null) {
            throw new UDFArgumentTypeException(0, "No State is mentioned");
        }
        if (args[4] == null) {
            throw new UDFArgumentTypeException(0, "No batch size is mentioned");
        }
        /// streeti = args[0];
        streeti = (PrimitiveObjectInspector) args[0];
        // this.streetvalue = (StringObjectInspector) streeti;
        cityi = (PrimitiveObjectInspector) args[1];
        zipi = (PrimitiveObjectInspector) args[2];
        statei = (PrimitiveObjectInspector) args[3];
        batchsizei = (PrimitiveObjectInspector) args[4];
        ret = new ArrayList();
        ArrayList structFieldNames = new ArrayList();
        ArrayList structFieldObjectInspectors = new ArrayList();
        structFieldNames.add("Street");
        structFieldNames.add("city");
        structFieldNames.add("zip");
        structFieldNames.add("state");
        structFieldObjectInspectors.add(PrimitiveObjectInspectorFactory.writableStringObjectInspector);
        structFieldObjectInspectors.add(PrimitiveObjectInspectorFactory.writableStringObjectInspector);
        structFieldObjectInspectors.add(PrimitiveObjectInspectorFactory.writableStringObjectInspector);
        structFieldObjectInspectors.add(PrimitiveObjectInspectorFactory.writableStringObjectInspector);
        StructObjectInspector si2 = ObjectInspectorFactory.getStandardStructObjectInspector(structFieldNames,
                structFieldObjectInspectors);
        ListObjectInspector li2;
        li2 = ObjectInspectorFactory.getStandardListObjectInspector(si2);
        return li2;
    }
    @Override
    public Object evaluate(DeferredObject[] args) throws HiveException {
        ret.clear();
        System.out.println("under evaluate");
        // String street1 = streetvalue.getPrimitiveJavaObject(args[0].get());
        Object oin = args[4].get();
        System.out.println("under typecasting");
        int batchsize = (Integer) batchsizei.getPrimitiveJavaObject(oin);
        System.out.println("batchsize");
        Object oin1 = args[0].get();
        String street1 = (String) streeti.getPrimitiveJavaObject(oin1);
        Object oin2 = args[1].get();
        String zip1 = (String) zipi.getPrimitiveJavaObject(oin2);
        Object oin3 = args[2].get();
        String city1 = (String) cityi.getPrimitiveJavaObject(oin3);
        Object oin4 = args[3].get();
        String state1 = (String) statei.getPrimitiveJavaObject(oin4);
        logger.info("address passed, street=" + street1 + ",zip=" + zip1 + ",city=" + city1 + ",state=" + state1);
        counter++;
        try {
            System.out.println("under try");
            Lookup lookup = new Lookup();
            lookup.setStreet(street1);
            lookup.setCity(city1);
            lookup.setState(state1);
            lookup.setZipCode(zip1);
            lookup.setMaxCandidates(1);
            batch.add(lookup);
        } catch (BatchFullException ex) {
            logger.error(ex.getMessage(), ex);
        } catch (Exception e) {
            logger.error(e.getMessage(), e);
        }
        /* batch.add(lookup); */
        if (counter == batchsize) {
            System.out.println("under if");
            try {
                logger.info("batch input street " + batch.get(0).getStreet());
                try {
                    client.send(batch);
                } catch (Exception e) {
                    logger.error(e.getMessage(), e);
                    logger.warn("skipping current batch, continuing with the next batch");
                    batch.clear();
                    counter = 0;
                    return null;
                }
                Vector<Lookup> lookups = batch.getAllLookups();
                for (int i = 0; i < batch.size(); i++) {
                    // ListObjectInspector candidates;
                    ArrayList<Candidate> candidates = lookups.get(i).getResult();
                    if (candidates.isEmpty()) {
                        logger.warn("Address " + i + " is invalid.\n");
                        continue;
                    }
                    logger.info("Address " + i + " is valid. (There is at least one candidate)");
                    for (Candidate candidate : candidates) {
                        final Components components = candidate.getComponents();
                        final Metadata metadata = candidate.getMetadata();
                        logger.info("\nCandidate " + candidate.getCandidateIndex() + ":");
                        logger.info("Delivery line 1: " + candidate.getDeliveryLine1());
                        logger.info("Last line: " + candidate.getLastLine());
                        logger.info("ZIP Code: " + components.getZipCode() + "-" + components.getPlus4Code());
                        logger.info("County: " + metadata.getCountyName());
                        logger.info("Latitude: " + metadata.getLatitude());
                        logger.info("Longitude: " + metadata.getLongitude());
                    }
                    Object[] e;
                    e = new Object[4];
                    e[0] = new Text(candidates.get(i).getComponents().getStreetName());
                    e[1] = new Text(candidates.get(i).getComponents().getCityName());
                    e[2] = new Text(candidates.get(i).getComponents().getZipCode());
                    e[3] = new Text(candidates.get(i).getComponents().getState());
                    ret.add(e);
                }
                counter = 0;
                batch.clear();
            } catch (Exception e) {
                logger.error(e.getMessage(), e);
            }
            return ret;
        } else {
            return null;
        }
    }
}
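Since the records are being processed with Spark anyway, one hedged way around the last-partial-batch problem is to do the batching outside the per-row UDF, inside a mapPartitions pass: the iterator makes it obvious when the input is exhausted, so the final, smaller batch (e.g. the last 98 of 198 rows) can still be sent without ever calling count(). The sketch below is illustrative only; the String record type and callAddressApi(...) are stand-ins for the real row type and REST call, not part of the original code. (Hive's GenericUDF also has a close() method that runs after the last row, but by then the rows that already returned null cannot be given results.)

// Hedged sketch, not the original UDF: batch records inside mapPartitions so the
// final partial batch is flushed naturally when the iterator runs out.
import org.apache.spark.api.java.JavaRDD;

import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public final class BatchedApiCall {

    private static final int BATCH_SIZE = 100;

    public static JavaRDD<String> standardize(JavaRDD<String> addresses) {
        return addresses.mapPartitions((Iterator<String> rows) -> {
            List<String> out = new ArrayList<>();
            List<String> batch = new ArrayList<>(BATCH_SIZE);
            while (rows.hasNext()) {
                batch.add(rows.next());
                if (batch.size() == BATCH_SIZE) {   // full batch: call the API now
                    out.addAll(callAddressApi(batch));
                    batch.clear();
                }
            }
            if (!batch.isEmpty()) {                 // leftover rows form the last, smaller batch
                out.addAll(callAddressApi(batch));
            }
            return out.iterator();
        });
    }

    // Placeholder for the real API call; assumed to return one result per input row.
    private static List<String> callAddressApi(List<String> batch) {
        return new ArrayList<>(batch);
    }
}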
After successfully fetching alarms from the CORBA U2000 server, I am now reading the values and getting the error below:
ERROR: org.omg.CORBA.MARSHAL: Sequence length too large. Only 12 available and trying to assign 31926513 vmcid: 0x0 minor code: 0 completed: No
org.omg.CORBA.MARSHAL: Sequence length too large. Only 12 available and trying to assign 31926513 vmcid: 0x0 minor code: 0 completed: No
at org.omg.CosNotification.EventBatchHelper.read(EventBatchHelper.java:57)
at AlarmIRPConstDefs.AlarmInformationSeqHelper.read(AlarmInformationSeqHelper.java:51)
at AlarmIRPConstDefs.AlarmInformationSeqHelper.extract(AlarmInformationSeqHelper.java:26)
at com.be.u2k.Main.getAlarmsList(Main.java:144)
at com.be.u2k.Main.main(Main.java:109)
The error is raised from the call to AlarmInformationSeqHelper.extract:
// Get all active alarms list
private static void getAlarmsList(ORB orb, AlarmIRP alarmIRP) {
    try {
        ManagedGenericIRPConstDefs.StringTypeOpt filter = new ManagedGenericIRPConstDefs.StringTypeOpt();
        filter.value("($type_name == 'x1')"); // Query new alarms and acknowledged or unacknowledged alarms
        AlarmIRPConstDefs.DNTypeOpt base_object = new AlarmIRPConstDefs.DNTypeOpt();
        BooleanHolder flag = new BooleanHolder();
        AlarmIRPSystem.AlarmInformationIteratorHolder iter = new AlarmIRPSystem.AlarmInformationIteratorHolder();
        StructuredEvent[] alarmList = alarmIRP.get_alarm_list(filter, base_object, flag, iter);
        System.out.println("AlarmIRP get_alarm_list success, flag: " + flag.value + " fetched total: " + (alarmList == null ? -1 : alarmList.length));
        for (StructuredEvent alarm : alarmList) {
            if (alarm.header != null) {
                System.out.println("fixed_header.event_type.name: " + alarm.header.fixed_header.event_type.type_name
                        + " fixed_header.event_type.domain_name: " + alarm.header.fixed_header.event_type.domain_name);
                if (alarm.header.variable_header != null) {
                    for (Property variableHeader : alarm.header.variable_header) {
                        System.out.println("variable_header.name: " + variableHeader.name + " alarm.header.variable_header.value: " + variableHeader.value);
                    }
                }
            }
            if (alarm.filterable_data != null) {
                for (Property filterableData : alarm.filterable_data) {
                    System.out.println("data.name: " + filterableData.name);
                    if (filterableData.value != null && filterableData.value.toString().contains("org.jacorb.orb.CDROutputStream")) {
                        StructuredEvent[] filterableDataValues = AlarmInformationSeqHelper.extract(filterableData.value);
                    } else {
                        System.out.println("data.value: " + filterableData.value);
                    }
                }
            }
        }
    } catch (ManagedGenericIRPSystem.InvalidParameter e) {
        System.out.println("ERROR get_alarm_list InvalidParameter (Indicates that the parameter is invalid): " + e);
    } catch (ManagedGenericIRPSystem.ParameterNotSupported e) {
        System.out.println("ERROR get_alarm_list ParameterNotSupported (Indicates that the operation is not supported): " + e);
    } catch (AlarmIRPSystem.GetAlarmList e) {
        System.out.println("ERROR get_alarm_list GetAlarmList (Indicates exceptions caused by unknown reasons): " + e);
    }
}
Or is my way of reading the alarms list incorrect? Thanks.
You can find an example method for getAlarmList below:
// Connect to AlarmIRP
AlarmIRP alarmIRP = AlarmIRPHelper.narrow(orb.string_to_object(alarmIrpIOR.value));
StringTypeOpt alarmFilter = new StringTypeOpt();
alarmFilter.value("");
DNTypeOpt base_object = new DNTypeOpt();
base_object.value("");
BooleanHolder flag = new BooleanHolder(false); // false for iteration
AlarmInformationIteratorHolder iter = new AlarmInformationIteratorHolder();
List<String> alarmIds = get_alarm_list(alarmIRP, alarmFilter, base_object, flag, iter);

private List<String> get_alarm_list(org._3gppsa5_2.AlarmIRPSystem.AlarmIRP alarmIRP,
        org._3gppsa5_2.ManagedGenericIRPConstDefs.StringTypeOpt alarmFilter,
        org._3gppsa5_2.AlarmIRPConstDefs.DNTypeOpt base_object,
        BooleanHolder flag,
        org._3gppsa5_2.AlarmIRPSystem.AlarmInformationIteratorHolder iter)
        throws org._3gppsa5_2.AlarmIRPSystem.GetAlarmList,
        org._3gppsa5_2.ManagedGenericIRPSystem.ParameterNotSupported,
        org._3gppsa5_2.AlarmIRPSystem.NextAlarmInformations,
        org._3gppsa5_2.ManagedGenericIRPSystem.InvalidParameter,
        BAD_OPERATION {
    logger.info("[get-alarm-list][start]");
    alarmIRP.get_alarm_list(alarmFilter, base_object, flag, iter);
    List<StructuredEvent> alarms = new ArrayList();
    EventBatchHolder alarmInformation = new EventBatchHolder();
    short alarmSize = 100;
    List<String> alarmIds = new ArrayList();
    while (iter.value.next_alarmInformations(alarmSize, alarmInformation)) {
        alarms.addAll(Arrays.asList(alarmInformation.value));
        logger.info("Current alarm size:" + alarms.size());
    }
    for (StructuredEvent event : alarms) {
        try {
            // printAlarm(event);
        } catch (Exception ex) {
        }
        List<Property> rem = new ArrayList<Property>();
        rem.addAll(Arrays.asList(PropertySeqHelper.extract(event.remainder_of_body)));
        for (Property property : rem) {
            if (!property.name.equals(org._3gppsa5_2.AlarmIRPNotifications.NotifyNewAlarm.ALARM_ID)) {
                continue;
            }
            alarmIds.add(property.value.extract_string());
        }
    }
    logger.info("[get-alarm-list][completed] size :" + alarms.size());
    return alarmIds;
}
I managed to figure out what that filterableData.value.toString() value of "org.jacorb.orb.CDROutputStream" is. It turns out that the property named "b" is a TimeBase::UtcT according to the docs.
To convert it to the correct value, which is a UTC timestamp, I changed the condition to:
if (filterableData.name.equals("b") && filterableData.value != null && filterableData.value.toString().contains("org.jacorb.orb.CDROutputStream")) {
    long occuranceTime = TimeTHelper.read(filterableData.value.create_input_stream());
    System.out.println("data.value: " + occuranceTime);
}
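If a human-readable timestamp is needed, the raw value can be converted further. This is a hedged sketch, assuming the value follows the CORBA Time Service convention of counting 100-nanosecond intervals since 1582-10-15 UTC (worth verifying against the U2000 documentation):

// Hypothetical helper: convert a CORBA TimeBase::TimeT value (100-ns units
// since 15 October 1582) to a java.util.Date on the Unix epoch.
private static final long GREGORIAN_TO_UNIX_OFFSET_100NS = 122192928000000000L;

public static java.util.Date timeTToDate(long timeT) {
    long epochMillis = (timeT - GREGORIAN_TO_UNIX_OFFSET_100NS) / 10_000L;
    return new java.util.Date(epochMillis);
}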
Is it possible to store all test cases which have failed in a HashMap, and then read out all the values stored in the map at the end of the class?
Variable:
private HashMap<String, Integer> serverStatusMap = new HashMap<String, Integer>();
After Method Code:
@AfterMethod
public void trackServerStatus(ITestResult testResult) {
    if (testResult.getStatus() == ITestResult.FAILURE) {
        try {
            String testName = this.getClass().getSimpleName().toString();
            int serverStatus = ServerStatus.getResponseCode(basePage.getCurrentURL());
            int i = 0;
            while (i < serverStatusMap.size()) {
                serverStatusMap.put(testName, serverStatus);
                i++;
            }
            // serverStatusMap.put(testName, serverStatus);
        } catch (MalformedURLException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
}
Reading the stored values from the map in the @AfterClass method:
@AfterClass
public void sendEmailBasedOnFailure(ITestContext context) throws WebDriverException, Exception {
    String tempTime = new SimpleDateFormat("hh.mm.ss").format(new Date());
    if (context.getFailedTests().size() > 0) {
        SendEmailFile.sendEmailReport(
                "TIME: " + tempTime + " | " + this.getClass().getPackage().toString(),
                "TIME: " + tempTime + " | " + this.getClass().getPackage().toString() + " | " + "CLASS NAME: "
                        + this.getClass().getSimpleName().toString() + "\n\n" +
                        "TOTAL NUMBER FAILED TESTS: " + context.getFailedTests().size() + "\n\n" +
                        "FAILED TEST CASES: " + context.getFailedTests().getAllMethods().toString() + "\n\n" +
                        serverStatusMap.toString());
    }
}
Look at the last line of code: 'serverStatusMap.toString()'
Current Output of the Map:
{}
I do not understand what you are trying to do.
Do you want to send an email with the failed tests?
Why not use the appropriate features, like a Listener or a Reporter?
Have a look at the documentation about logging.
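For reference, here is a minimal sketch of the listener approach (assuming TestNG's TestListenerAdapter is on the classpath; registering it via @Listeners or testng.xml is omitted):

import java.util.ArrayList;
import java.util.List;

import org.testng.ITestResult;
import org.testng.TestListenerAdapter;

// Collects the names of failed tests as they happen, independently of any one test class.
public class FailureCollector extends TestListenerAdapter {

    private final List<String> failedTests = new ArrayList<>();

    @Override
    public void onTestFailure(ITestResult result) {
        failedTests.add(result.getMethod().getMethodName());
        // The HTTP status check from the question could be added here as well.
    }

    public List<String> getFailedTests() {
        return failedTests;
    }
}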
You have only just initialized the variable "serverStatusMap". From the code below:
int i = 0;
while (i < serverStatusMap.size()) {
    serverStatusMap.put(testName, serverStatus);
    i++;
}
I can see that i = 0 and serverStatusMap.size() = 0, so the while loop is never entered. That is why, when you finally print the map, there is nothing in it. You need to change your while condition.
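In fact, the loop can be dropped entirely; a plain put() is all that is needed. Here is a hedged sketch of the @AfterMethod, keying by the failed test's method name instead of the class name so different tests do not overwrite each other (ServerStatus and basePage are the helpers from the question):

@AfterMethod
public void trackServerStatus(ITestResult testResult) {
    if (testResult.getStatus() == ITestResult.FAILURE) {
        try {
            // Key by the failed test method, not the class, so each failure gets its own entry.
            String testName = testResult.getMethod().getMethodName();
            int serverStatus = ServerStatus.getResponseCode(basePage.getCurrentURL());
            serverStatusMap.put(testName, serverStatus);
        } catch (IOException e) { // MalformedURLException is a subclass of IOException
            e.printStackTrace();
        }
    }
}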