NegativeArraySizeException, DataProvider, Excel - java

I'd like to feed data from an Excel file into different tests depending on the scenario (correct data / invalid data); however, when I try to get the cell values I get a NegativeArraySizeException.
The first row just has a title, so I don't want to read it; that's why I size the array with [rows - 1].
Could you please point out my mistake?
Thank you.
public class SignInTest extends Driver {

    @BeforeMethod
    public void setUp() {
        Driver.initConfiguration();
    }

    public Object[][] getData(String excelPath, String sheetName) {
        int rows = excel.getRowCount(sheetName);
        int cols = excel.getColumnCount(sheetName);
        Object[][] data = new Object[rows - 1][1];
        excel = new ExcelReader(excelPath);
        for (int rowNum = 1; rowNum < rows; rowNum++) {
            for (int colNum = 0; colNum < cols; colNum++) {
                data[rowNum - 1][colNum] = excel.getCellData(sheetName, colNum, rowNum);
            }
        }
        return data;
    }

    @DataProvider(name = "credentials")
    public Object[][] getCredentials() {
        Object[][] data = getData(excelPath, sheetName);
        return data;
    }

    @Test(dataProviderClass = DataProviders.class, dataProvider = "credentials")
    public void loginWithCorrectCredentials(String email, String password) {
        HomePageActions hp = new HomePageActions();
        SignInActions sign = new SignInActions();
        DataProviders dp = new DataProviders();
        dp.getData(excelPath, "correctData");
        System.out.println("email " + email);
        System.out.println("password " + password);
    }
}

The function excel.getRowCount(sheetName), on this line:
int rows = excel.getRowCount(sheetName);
is returning 0, so when you do rows - 1 you get a number less than zero, which is exactly what produces the NegativeArraySizeException. I should hope that much is obvious. So the question becomes: why?
Things to look for in troubleshooting:
Is the getColumnCount also returning zero? If so, this points to a possible error in the worksheet reference.
Is the sheetName actually correctly being passed into the function?
Can you insert an explicit value into a specific place on the worksheet? Meaning, is that reference working? Throw in a test line and see what happens.
What happens if you hard-set the array to, say:
Object[][] data = new Object[100][1];
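If hard-coding the size makes the exception go away, that pretty much confirms the row/column counts are the problem. For reference, here is a defensive sketch of the loader (only a sketch, assuming the same ExcelReader utility class used in the question) that constructs the reader before asking it for counts and fails fast when the sheet comes back empty, so the real cause shows up in a message instead of a NegativeArraySizeException:
public Object[][] getData(String excelPath, String sheetName) {
    ExcelReader excel = new ExcelReader(excelPath);   // create the reader before using it
    int rows = excel.getRowCount(sheetName);
    int cols = excel.getColumnCount(sheetName);
    if (rows < 2 || cols < 1) {
        // header-only or empty sheet: report it instead of sizing a negative array
        throw new IllegalStateException("Sheet '" + sheetName + "' returned rows=" + rows + ", cols=" + cols);
    }
    Object[][] data = new Object[rows - 1][cols];     // skip the title row, keep every column
    for (int rowNum = 1; rowNum < rows; rowNum++) {
        for (int colNum = 0; colNum < cols; colNum++) {
            data[rowNum - 1][colNum] = excel.getCellData(sheetName, colNum, rowNum);
        }
    }
    return data;
}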
My gut is telling me you have an issue with the reference to the worksheet, but without knowing more about your worksheet referencing, it's impossible to know for sure.
I hope some of this points you in the right direction and gets you going. Good luck!

Related

ClassCastException handling

I have legacy code that looks like this:
///
....
List<Object[]> reportResults = new ArrayList<Object[]>();
reportResults = query.getResultList();
logger.info("reportResults ==== "+reportResults.toString());
workbook = expService.exportExcel(fileName, reportResults, reportHeaders);
...///
So when the result set has more than one column, the logger prints something like '[[Ljava.lang.Object;@391472f5]', i.e. an object reference.
When reportResults has only one column, the result looks like '[USA, EUROPE]', and when it's one column and one row it looks like '[USA]'.
The problem comes when I iterate over reportResults in the method below; the reportData parameter is nothing but reportResults.
The inner for loop throws 'java.lang.ClassCastException: java.lang.String cannot be cast to [Ljava.lang.Object;' when reportData has just one column. How can I handle this so that it works for all the cases?
public XSSFWorkbook exportExcel(String reportName, List<Object[]> reportData, String headers) throws Exception {
    ...///
    System.out.println("... " + reportData.size());
    for (int i = 0; i < reportData.size(); i++) {
        row = sheet.createRow(rownum);
        rownum++;
        System.out.println("....... " + reportData.get(i));
        for (int j = 0; j < reportData.get(i).length; j++) {
            cell = row.createCell(j);
            Object oCellValue = reportData.get(i)[j];
            if (oCellValue != null) {
                String className = oCellValue.getClass().getName();
                if (className.equals(Timestamp.class.getName())) {
                    CellStyle cStyle = workbook.createCellStyle();
                    cStyle.setDataFormat((short) 14);
                    Timestamp tsValue = (Timestamp) oCellValue;
                    Date dtValue = new Date(tsValue.getTime());
                    cell.setCellValue(dtValue);
                    cell.setCellStyle(cStyle);
                } else {
                    cell.setCellValue("" + oCellValue);
                }
            } else {
                cell.setCellValue("");
            }
        }
    }
    ...//
}
So the first sysout prints the size as 1. It gets into the loop, and the second sysout prints 'USA'.
reportData.get(i) is supposed to be an Object[] row, but in this scenario it is just 'USA'.
Please let me know how I can get rid of this exception and handle all the cases. Any help is highly appreciated. Thank you.
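One way to make the loop tolerate both shapes (a sketch, not part of the original code) is to normalize the rows before the export loop runs. When the native query selects a single column, getResultList() returns the bare column values rather than Object[] rows, which is exactly why the cast blows up; the hypothetical helper below wraps those bare values:
import java.util.ArrayList;
import java.util.List;

public final class ReportRows {

    // Wrap single-column results so every element is an Object[],
    // matching what the multi-column export loop already expects.
    public static List<Object[]> normalize(List<?> rawResults) {
        List<Object[]> rows = new ArrayList<Object[]>();
        for (Object row : rawResults) {
            if (row instanceof Object[]) {
                rows.add((Object[]) row);         // already a full row
            } else {
                rows.add(new Object[] { row });   // bare column value, wrap it
            }
        }
        return rows;
    }
}
With something like this, exportExcel could be called with ReportRows.normalize(reportResults) and the inner loop would work whether the query returns one column or many.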

Apache POI Java - Write to excel and dynamically update cells

My Java Spring Boot app needs to create a new Excel file based on the contents of my DB. My current solution takes all the data from my DB and inserts it into my Excel sheet, but I want to improve it by not stating explicitly what each cell value is. For example, although it works, my entity has 34 fields, so I am writing the userRow.createCell(...) line 34 times, once per field, which is repetitive. Ideally I want to say "create cell n" and take all the values from each row in the DB. How can this be done? Another for loop within this for loop? Every example I have looked at online seems to state explicitly what the cell value is.
List<CaseData> cases = (List<CaseData>) model.get("cases");
Sheet sheet = workbook.createSheet("PIE Cases");
int rowCount = 1;
for (CaseData pieCase : cases) {
    Row userRow = sheet.createRow(rowCount++);
    userRow.createCell(0).setCellValue(pieCase.getCaseId());
    userRow.createCell(1).setCellValue(pieCase.getAcknowledgementReceivedDate());
}
Use the Reflection API
Example:
try {
    Class<CaseData> caseDataObj = CaseData.class;
    Method[] methods = caseDataObj.getDeclaredMethods();
    Sheet sheet = workbook.createSheet("PIE Cases");
    int rowCount = 1;
    for (CaseData cd : cases) {
        int cellIndex = 0;
        Row userRow = sheet.createRow(rowCount++);
        for (Method method : methods) {
            String methodName = method.getName();
            if (methodName.startsWith("get")) {
                // Assuming all getters return String
                userRow.createCell(cellIndex++).setCellValue((String) method.invoke(cd));
            }
        }
    }
} catch (Exception e) {
    e.printStackTrace();
}
There are probably many ways to do this. You can try something like the following; this is how I usually go about things like what you are doing.
public enum DATA {
    CASE_ID(0),
    ACK_RECIEVED(1),
    ETC(2);
    // ETC(3) and so on

    public int index;

    DATA(int index) {
        this.index = index;
    }

    public Object parse(CaseData data) throws Exception {
        switch (this) {
            case CASE_ID:
                return data.getCaseId();
            case ACK_RECIEVED:
                return data.getAcknowledgementReceivedDate();
            case ETC:
                return "etc...";
            default:
                return null;
        }
    }
}
Then, the implementation is:
List<CaseData> cases = (List<CaseData>) model.get("cases");
Sheet sheet = workbook.createSheet("PIE Cases");
int rowCount = 1;
for (CaseData pieCase : cases) {
    Row userRow = sheet.createRow(rowCount++);
    for (DATA DAT : DATA.values()) {
        try {
            // parse(...) returns Object, so write it out as a String
            userRow.createCell(DAT.index).setCellValue(String.valueOf(DAT.parse(pieCase)));
        } catch (Exception e) {
            // parse declares Exception; handle or rethrow as appropriate
            throw new RuntimeException(e);
        }
    }
}

Insert Multiple Rows realm android

Hello, I've been trying to insert multiple rows into my Realm database using values from ArrayLists. Whenever I try to insert through a for loop, it only adds the last one. If you need anything else (code, XML), please let me know.
Here is my code:
realm.executeTransactionAsync(new Realm.Transaction() { //ASYNCHRONOUS TRANSACTION TO EXECUTE THE QUERY ON A DIFFERENT THREAD
    @Override
    public void execute(Realm bgRealm) {
        // increment index
        Invoices inv = bgRealm.createObject(Invoices.class, RealmController.autoincrement(bgRealm, Invoices.class)); //METHOD THAT GIVES US THE AUTO-INCREMENT FUNCTION
        //inv.id = nextId; //THE 2ND PARAMETER IN createObject DEFINES THE PK
        //...
        //realm.insertOrUpdate(user); // using insert API
        inv.number = n;
        inv.serial = s;
        inv.client = c;
        inv.subtotal = sub;
        inv.tax = tax;
        inv.total = tot;
        Invoice_lines invl = bgRealm.createObject(Invoice_lines.class, RealmController.autoincrement(bgRealm, Invoice_lines.class)); //ID FROM ANOTHER TABLE (ROW)
        for (int i = 0; i < price.size(); i++) {
            invl.description = description.get(i);
            invl.price = price.get(i);
            invl.quantity = quantity.get(i);
            invl.invoice = inv;
            bgRealm.insert(invl);
        }
    }
});
I'm not sure, but you create only one Realm object, on this line:
Invoice_lines invl = bgRealm.createObject(Invoice_lines.class, RealmController.autoincrement(bgRealm, Invoice_lines.class)); //ID FROM ANOTHER TABLE (ROW)
In the loop you only change the fields of invl; you never create new objects.
Try creating the objects inside the loop.
Because what you wanted to do is:
Invoice_lines invl = new Invoice_lines(); // unmanaged object
for (int i = 0; i < price.size(); i++) {
    invl.setId(RealmController.autoincrement(bgRealm, Invoice_lines.class)); //ID FROM ANOTHER TABLE (ROW)
    invl.description = description.get(i);
    invl.price = price.get(i);
    invl.quantity = quantity.get(i);
    invl.invoice = inv;
    bgRealm.insert(invl);
}
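For completeness, here is a managed-object variant of the same idea (only a sketch; it assumes RealmController.autoincrement hands out a fresh key on every call inside the transaction): create a new Invoice_lines object on each iteration, so every line item becomes its own row instead of overwriting a single object.
for (int i = 0; i < price.size(); i++) {
    // one managed object per line item, each with its own primary key
    Invoice_lines line = bgRealm.createObject(Invoice_lines.class,
            RealmController.autoincrement(bgRealm, Invoice_lines.class));
    line.description = description.get(i);
    line.price = price.get(i);
    line.quantity = quantity.get(i);
    line.invoice = inv;   // link the line back to the invoice created above
}
Because these objects are created as managed objects inside the transaction, no explicit bgRealm.insert(...) call is needed.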

How to mass delete multiple rows in hbase?

I have the following rows with these keys in hbase table "mytable"
user_1
user_2
user_3
...
user_9999999
I want to use the HBase shell to delete the rows from user_500 to user_900.
I know there is no built-in way to delete a range, but is there a way I could use the "BulkDeleteProcessor" to do this?
I see here:
https://github.com/apache/hbase/blob/master/hbase-examples/src/test/java/org/apache/hadoop/hbase/coprocessor/example/TestBulkDeleteProtocol.java
I want to just paste in imports and then paste this into the shell, but have no idea how to go about this. Does anyone know how I can use this endpoint from the jruby hbase shell?
Table ht = TEST_UTIL.getConnection().getTable("my_table");
long noOfDeletedRows = 0L;
Batch.Call<BulkDeleteService, BulkDeleteResponse> callable =
        new Batch.Call<BulkDeleteService, BulkDeleteResponse>() {
    ServerRpcController controller = new ServerRpcController();
    BlockingRpcCallback<BulkDeleteResponse> rpcCallback =
            new BlockingRpcCallback<BulkDeleteResponse>();

    public BulkDeleteResponse call(BulkDeleteService service) throws IOException {
        Builder builder = BulkDeleteRequest.newBuilder();
        builder.setScan(ProtobufUtil.toScan(scan));
        builder.setDeleteType(deleteType);
        builder.setRowBatchSize(rowBatchSize);
        if (timeStamp != null) {
            builder.setTimestamp(timeStamp);
        }
        service.delete(controller, builder.build(), rpcCallback);
        return rpcCallback.get();
    }
};
Map<byte[], BulkDeleteResponse> result = ht.coprocessorService(BulkDeleteService.class,
        scan.getStartRow(), scan.getStopRow(), callable);
for (BulkDeleteResponse response : result.values()) {
    noOfDeletedRows += response.getRowsDeleted();
}
ht.close();
If there is no way to do this through JRuby, then a Java or other approach to quickly delete multiple rows is fine.
Do you really want to do it in the shell? There are various other, better ways. One way is using the native Java API:
Construct an ArrayList of Delete objects.
Pass this list to the Table.delete method.
Method 1: if you already know the range of keys.
public void massDelete(byte[] tableName) throws IOException {
    HTable table = (HTable) hbasePool.getTable(tableName);
    String tablePrefix = "user_";
    int startRange = 500;
    int endRange = 999;
    List<Delete> listOfBatchDelete = new ArrayList<Delete>();
    for (int i = startRange; i <= endRange; i++) {
        String key = tablePrefix + i;
        Delete d = new Delete(Bytes.toBytes(key));
        listOfBatchDelete.add(d);
    }
    try {
        table.delete(listOfBatchDelete);
    } finally {
        if (hbasePool != null && table != null) {
            hbasePool.putTable(table);
        }
    }
}
Method 2: If you want to do a batch delete on the basis of a scan result.
public void bulkDelete(final HTable table) throws IOException {
    Scan s = new Scan();
    List<Delete> listOfBatchDelete = new ArrayList<Delete>();
    // add your filters to the scan, e.g. s.setFilter(yourFilter);
    ResultScanner scanner = table.getScanner(s);
    for (Result rr : scanner) {
        Delete d = new Delete(rr.getRow());
        listOfBatchDelete.add(d);
    }
    try {
        table.delete(listOfBatchDelete);
    } catch (Exception e) {
        LOGGER.log(e);
    }
}
Now, coming down to using a coprocessor: only one piece of advice, DON'T USE a coprocessor unless you are an expert in HBase.
Coprocessors have many built-in issues; if you need, I can provide a detailed description.
Secondly, when you delete anything from HBase it is never directly deleted: a tombstone marker gets attached to that record, and it is only removed later during a major compaction. So there is no need to use a coprocessor, which is highly resource-intensive.
Modified code to support batch operation:
int batchSize = 50;
int batchCounter = 0;
for (int i = startRange; i <= endRange; i++) {
    String key = tablePrefix + i;
    Delete d = new Delete(Bytes.toBytes(key));
    listOfBatchDelete.add(d);
    batchCounter++;
    if (batchCounter == batchSize) {
        table.delete(listOfBatchDelete);
        listOfBatchDelete.clear();
        batchCounter = 0;
    }
}
// flush whatever is left after the last full batch
if (!listOfBatchDelete.isEmpty()) {
    table.delete(listOfBatchDelete);
}
Creating HBase conf and getting table instance.
Configuration hConf = HBaseConfiguration.create(conf);
hConf.set("hbase.zookeeper.quorum", "Zookeeper IP");
hConf.set("hbase.zookeeper.property.clientPort", ZookeeperPort);
HTable hTable = new HTable(hConf, tableName);
If you already know the row keys of the records that you want to delete from the HBase table, then you can use the following approach.
1. First, create a List of Delete objects with these row keys:
for (int rowKey = 1; rowKey <= 10; rowKey++) {
    deleteList.add(new Delete(Bytes.toBytes(rowKey + "")));
}
2. Then get the Table object using the HBase Connection:
Table table = connection.getTable(TableName.valueOf(tableName));
3. Once you have the Table object, call delete() passing the list:
table.delete(deleteList);
The complete code will look like below:
Configuration config = HBaseConfiguration.create();
config.addResource(new Path("/etc/hbase/conf/hbase-site.xml"));
config.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
String tableName = "users";
Connection connection = ConnectionFactory.createConnection(config);
Table table = connection.getTable(TableName.valueOf(tableName));
List<Delete> deleteList = new ArrayList<Delete>();
for (int rowKey = 500; rowKey <= 900; rowKey++) {
    deleteList.add(new Delete(Bytes.toBytes("user_" + rowKey)));
}
table.delete(deleteList);

jpa getresultslist column index exception

Hello. Even after perusing Stack Overflow and other sites, I have yet to solve the problem. I would think it's a configuration issue, but both the persistence.xml and tomee.xml files seem perfectly fine.
The issue seems to be this line:
returnList = mapToDtoList(getEntityManager().createNativeQuery("SELECT * FROM map_category_content_type", entityClass).getResultList());
It is generating a "Column index out of range exception 0 < 1"
In most cases this is due to an errant index. But in my case, since I never reference an index directly at all (the "magic" is supposed to do that for me) I can only blame either configuration or getResultList().
I have hit a brick wall.
Here is the exception and context.
*I am not allowed to post images (not enough "reputation points") but I can assure you that the stack trace is identical to the one found here:
Getting column index out of range, 0 < 1
persistence.java code:
@Stateless
@TransactionAttribute(TransactionAttributeType.REQUIRED)
public class CategoriesAndContentTypesMapPersistence extends AbstractPersistenceService<CategoriesAndContentTypesMap, CategoriesAndContentTypesMapDto, Integer> {

    public List<CategoriesAndContentTypesMapDto> getCategoriesAndContentTypes() {
        List<CategoriesAndContentTypesMapDto> returnList;
        try {
            returnList = mapToDtoList(getEntityManager().createNativeQuery("SELECT * FROM map_category_content_type", entityClass).getResultList());
        } catch (Exception e) {
            System.out.println("---> " + e);
            throw new PersistenceException(e, this.getClass());
        }
        return returnList;
    }
}
The caller code:
@Inject
private CategoriesAndContentTypesMapPersistence catContPersistence;

protected void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
    int categoryID = 0;
    String categoryName = "";
    int categorySortOrder = 0;
    int contentTypeID = 0;
    String contentTypeName = "";
    int contentTypeSortOrder = 0;
    int contentTypeRequired = 0;
    int contentTypeVisibility = 0;
    Map<Integer, String> categoriesContentTypes = new LinkedHashMap<Integer, String>();
    List<CategoriesAndContentTypesMapDto> dbCatsConts = catContPersistence.getCategoriesAndContentTypes();
    for (CategoriesAndContentTypesMapDto catCotItem : dbCatsConts) {
        categoryID = catCotItem.getCategoryId();
        categoryName = catCotItem.getCategoryName();
        categorySortOrder = catCotItem.getCategorySortOrder();
        contentTypeID = catCotItem.getContentTypeId();
        contentTypeName = catCotItem.getContentTypeName();
        contentTypeSortOrder = catCotItem.getContentTypeSortOrder();
        contentTypeRequired = catCotItem.getContentTypeRequired();
        contentTypeVisibility = catCotItem.getContentTypeVisible();
I suspect my problem is something along these lines:
Getting column index out of range, 0 < 1
but the conf files look fine and were working.
Any advice is greatly appreciated.
Never mind.
I didn't have a "managed bean", among other architectural EJB/JSF/... necessities.
Thanks all!
