Elasticsearch mapping settings 'not_analyzed' and grouping by field in Java

I'm trying to group the search results by category.
SearchResponse response = client.prepareSearch("search")
.addAggregation(AggregationBuilders.terms("category").field("category").size(0))
.execute()
.actionGet();
The code above creates aggregations but I'm running into a problem where strings with hyphens in them are being separated and put into their own 'Bucket'.
From what I've read I need to change the mapping so that the category field is not analyzed, but I'm not sure how to do this. Is it done when writing to Elasticsearch or when reading? How is it set exactly?

To apply an Elasticsearch mapping using the Java API (the mapping is set when writing/indexing, not when reading; and since an existing field's mapping cannot be changed in place, you may need to reindex existing documents after changing it):
STEP 1) First create your mapping for the Elasticsearch type in a json file,
e.g. resources/Customer.json
{
  "Customer" : {
    "properties" : {
      "category" : { "type" : "string", "index" : "not_analyzed" }
    }
  }
}
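If you'd rather not ship the mapping as a resource file, the same structure can be built programmatically. A minimal sketch, assuming the same pre-5.x Java client generation as the rest of this answer (where "string" and "not_analyzed" still exist) and that PutMappingRequestBuilder.setSource accepts an XContentBuilder, which it does in those client versions:
import org.elasticsearch.common.xcontent.XContentBuilder;
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;

XContentBuilder mapping = jsonBuilder()
        .startObject()
            .startObject("Customer")
                .startObject("properties")
                    .startObject("category")
                        .field("type", "string")
                        .field("index", "not_analyzed")
                    .endObject()
                .endObject()
            .endObject()
        .endObject();
// then pass it instead of the file contents: pmrb.setSource(mapping)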
STEP 2) Create a java method to apply the mapping from a json file (see complete example here):
class EsUtils {
    public static Client client;

    public static void applyMapping(String index, String type, String location) throws Exception {
        String source = readJsonDefn(location);
        if (source != null) {
            PutMappingRequestBuilder pmrb = client.admin().indices()
                    .preparePutMapping(index)
                    .setType(type);
            pmrb.setSource(source);
            MappingListener mappingListener = new MappingListener(pmrb);
            // create the type and mapping, then poll until the listener reports completion
            Thread thread = new Thread(mappingListener);
            thread.start();
            while (!mappingListener.processComplete.get()) {
                System.out.println("not complete yet. Waiting for 100 ms");
                Thread.sleep(100);
            }
        } else {
            System.out.println("mapping error");
        }
    }
    public static String readJsonDefn(String url) throws Exception {
        // implement it the way you like; this simply reads the whole file into one string
        StringBuilder bufferJSON = new StringBuilder();
        BufferedReader br = new BufferedReader(new FileReader(new File(url).getAbsolutePath()));
        String line;
        while ((line = br.readLine()) != null) {
            bufferJSON.append(line);
        }
        br.close();
        return bufferJSON.toString();
    }
}
STEP 3) Call the applyMapping() method, passing your ES client:
String index = "search"; //yourIndex
String type = "Customer";
String location = "resources/Customer.json";
EsUtils.client = yourClient; //pass your client
EsUtils.applyMapping(index, type, location);
STEP 4) Query as you want:
SearchRequestBuilder builder = client.prepareSearch("search");
builder.addAggregation(AggregationBuilders.terms("categoryterms")
        .field("category").size(0));
SearchResponse response = builder.execute().actionGet();
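Once the field is not_analyzed, each full category value (hyphens and all) becomes a single bucket. Reading them back might look like the sketch below, assuming the Terms aggregation class of the same client generation; note the bucket key accessor is getKey() in 1.x and getKeyAsString() in later clients:
import org.elasticsearch.search.aggregations.bucket.terms.Terms;

Terms categories = response.getAggregations().get("categoryterms");
for (Terms.Bucket bucket : categories.getBuckets()) {
    // one bucket per whole category value, e.g. "foo-bar" stays intact
    System.out.println(bucket.getKey() + " : " + bucket.getDocCount());
}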
Complete reference: Elasticsearch apply mapping

Related

No value present - cucumber json data driven

I've got a problem with data-driven testing in cucumber. I want to get data from a json file. I've prepared a scenario:
Feature: data provider
Scenario Outline: Data driven using json file
Given account user
And Get System Variables
And Delete TB report if already exist
When user navigates to TB report
Then Select Filters On Reports Page from <data>
Examples:
|data|
|test|
Data json object:
[
  {
    "fundName": "test",
    "currentDate": "31/12/2020"
  },
  {
    "fundName": "test2",
    "currentDate": "31/12/2020"
  }
]
Pojo class for storing data:
public class Data {
public String fundName;
public String currentDate;
}
Data json reader:
private final String path = "path/to/file";
private List<Data> dataList;
public JsonDataReader(){
    dataList = getData();
}
private List<Data> getData() {
    Gson gson = new Gson();
    BufferedReader bufferReader = null;
    try {
        bufferReader = new BufferedReader(new FileReader(path));
        Data[] data = gson.fromJson(bufferReader, Data[].class);
        return Arrays.asList(data);
    } catch (FileNotFoundException e) {
        throw new RuntimeException("Json file not found at path : " + path);
    } finally {
        try { if (bufferReader != null) bufferReader.close(); }
        catch (IOException ignore) {}
    }
}
public final Data getDataByName(String name){
return dataList.stream().filter(x -> x.fundName.equalsIgnoreCase(name)).findAny().get();
}
Step definition:
@Then("Select Filters On Reports Page from \\\"(.*)\\\"$")
public void selectMultipleFilters(String name) {
Data data = FileReaderManager.getInstance().getJsonDataReader().getDataByName(name);
reportSteps.selectMultipleFiltersForReports(data);
}
But when I try to run this I get an error on the 5th step:
java.util.NoSuchElementException: No value present
at JsonDataReader.getDataByName
Can someone tell me what I am doing wrong?
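A note beyond the original post: the stack trace points at findAny().get() inside getDataByName, and Optional.get() throws java.util.NoSuchElementException: No value present exactly when the filter matched nothing, for example when the captured step argument does not equal any fundName in the file. A sketch of a more diagnosable variant:
public final Data getDataByName(String name) {
    // orElseThrow with a message surfaces the offending lookup key,
    // instead of the bare NoSuchElementException from Optional.get()
    return dataList.stream()
            .filter(x -> x.fundName.equalsIgnoreCase(name))
            .findAny()
            .orElseThrow(() -> new IllegalArgumentException("No test data named: " + name));
}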

how to fetch and validate csv header in open csv?

I want to fetch the header from a CSV file. If I don't use skipLines I get the header as the array at index 0, but I want to fetch the header directly using HeaderColumnNameMappingStrategy, and that does not work with my code.
I also want to validate the header column list (e.g. the CSV must not be allowed to contain extra columns).
I already checked How to validate the csv headers using opencsv but it was not helpful to me.
@SuppressWarnings({ "unchecked", "rawtypes" })
public Map<String, Object> handleStockFileUpload(MultipartFile file, Long customerId) {
Map<String, Object> responseMap = new HashMap<>();
responseMap.put("datamap", "");
responseMap.put("errormap", "");
responseMap.put("errorkeys", "");
List<Map<String, Integer>> list = new ArrayList<>();
List<StockCsvDTO> csvStockList = new ArrayList<>();
try {
String fileName = new SimpleDateFormat("yyyy_MM_dd_HHmmss").format(new Date()) + "_" + file.getOriginalFilename();
responseMap.put("filename", fileName);
File stockFile = new File(productsUploadFilePath + fileName);
stockFile.getParentFile().mkdirs();
FileOutputStream fos = new FileOutputStream(stockFile);
fos.write(file.getBytes());
fos.close();
CsvTransfer csvTransfer = new CsvTransfer();
ColumnPositionMappingStrategy ms = new ColumnPositionMappingStrategy();
ms.setType(StockCsv.class);
Reader reader = Files.newBufferedReader(Paths.get(productsUploadFilePath + fileName));
CSVReader csvReader = new CSVReader(reader);
CsvToBean cb = new CsvToBeanBuilder(reader)
.withType(StockCsv.class)
.withMappingStrategy(ms)
.withSkipLines(1)
.build();
csvTransfer.setCsvList(cb.parse());
reader.close();
csvStockList = csvTransfer.getCsvList();
} catch (Exception e) {
e.printStackTrace();
responseMap.put("status", "servererror");
}
responseMap.put("datamap", csvStockList);
return responseMap;
}
I found the following solution:
Use @CsvBindByName with HeaderColumnNameMappingStrategy, e.g. annotate your bean properties with @CsvBindByName:
public static class HollywoodActor {
    private int id;

    @CsvBindByName(column = "First Name")
    private String firstName;

    @CsvBindByName(column = "Last Name")
    private String lastName;

    // getter / setter
}
Add a method like this:
public class CsvParser {
    public <T> ParseResult<T> parseByPropertyNames(Reader csvReader, Class<T> beanClass) throws IOException {
        CSVReader reader = new CSVReaderBuilder(csvReader)
                .withCSVParser(new CSVParserBuilder().build())
                .build();
        CsvToBean<T> bean = new CsvToBean<>();
        HeaderColumnNameMappingStrategy<T> mappingStrategy = new HeaderColumnNameMappingStrategy<>();
        mappingStrategy.setType(beanClass);
        bean.setMappingStrategy(mappingStrategy);
        bean.setCsvReader(reader);
        List<T> beans = bean.parse();
        return new ParseResult<>(mappingStrategy.generateHeader(), beans);
    }
}
and also don't forget to add the ParseResult class:
public class ParseResult <T> {
private final String[] headers;
private final List<T> lines;
// all-args constructor & getters
}
Then use them in your code:
String csv = "Id,First Name,Last Name\n" + "1, \"Johnny\", \"Depp\"\n" + "2, \"Al\", \"Pacino\"";
ParseResult<HollywoodActor> parseResult = parser.parseByPropertyNames(
        new InputStreamReader(new ByteArrayInputStream(csv.getBytes(StandardCharsets.UTF_8))),
        HollywoodActor.class);
From ParseResult.headers you can get the actual headers that were in your .csv file. Just compare them with what's expected.
Hope that helps!
Here I was comparing my csvHeader with originalHeader:
List<String> originalHeader = fileUploadUtility.getHeader(new StockCsv());
List<String> invalidHeader = csvHeader.stream()
        .filter(o -> originalHeader.stream().noneMatch(f -> f.equalsIgnoreCase(o)))
        .collect(Collectors.toList());
if (!invalidHeader.isEmpty()) {
    msg = "Invalid column(s) : " + invalidHeader + ". Please remove invalid column(s) from file.";
    resultMap.put(1, msg);
}
public List<String> getHeader(T pojo) {
    final CustomMappingStrategy<T> mappingStrategy = new CustomMappingStrategy<>();
    mappingStrategy.setType((Class<? extends T>) pojo.getClass());
    String[] header = mappingStrategy.generateHeader();
    return Arrays.asList(header);
}
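Separately, if all you need is the raw header row itself (the "index 0 array" the question mentions) before any bean mapping, a minimal sketch with plain opencsv; the file path is a placeholder, and in opencsv 5.x readNext() also declares CsvValidationException:
import com.opencsv.CSVReader;
import java.io.FileReader;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

try (CSVReader reader = new CSVReader(new FileReader("stock.csv"))) { // hypothetical path
    String[] headerRow = reader.readNext(); // the first row is the header
    List<String> csvHeader = headerRow == null
            ? Collections.<String>emptyList()
            : Arrays.asList(headerRow);
    // csvHeader can then feed the comparison against originalHeader shown above
    System.out.println("Header: " + csvHeader);
}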
Here is an alternative to your present problem. First, define what you expect your headers to look like. For example:
public static final ArrayList<String> fileFormat = new ArrayList<> (Arrays.asList("Values1", "Values2", "Values3", "Values4"));
Now, write a method to return custom errors if any exist:
public String validateCsvFileDetails(MultipartFile file, Set<String> requiredHeadersArray) {
Set<String> errors = new HashSet<>();
try {
InputStream stream = file.getInputStream();
BufferedReader reader = new BufferedReader(new InputStreamReader(stream));
String headerLine = reader.readLine();
if (Objects.isNull(headerLine))
return "The file has no headers, please ensure it has the correct upload format";
List<String> headersInFileList;
String[] headersInFileArray;
if (headerLine.contains(",")) {
headersInFileArray = StringUtils.split(headerLine, ",");
headersInFileList = Arrays.asList(headersInFileArray);
} else//the headerline has only one headerfield
{
headersInFileList = Collections.singletonList(headerLine);
}
for (String header : requiredHeadersArray) {
if (!headersInFileList.contains(header))
errors.add("The file has the wrong header format, please ensure " + header + " header is present");
}
//if there are errors, return it
if (!errors.isEmpty())
return sysUtils.getStringFromSet(errors);
//Ensure the csv file actually has values after the header, but don't read beyond the first data line
String line = reader.readLine();
//if line is null return validation error
if (Objects.isNull(line))
    return "Cannot upload empty file";
} catch (Exception e) {
logger.error(new Object() {
}.getClass().getEnclosingMethod().getName(), e);
return "System Error";
}
return null;
}
Now you can validate your file headers as follows:
String errors = validateCsvFileDetails(file, new HashSet<>(fileFormat));
if (errors != null)
    return errors;
//proceed
Give this a try using captureHeader as a pre-filter:
...
private class CustomHeaderColumnNameMappingStrategy<T> extends HeaderColumnNameMappingStrategy {
private String[] expectedHeadersOrdered = {"Column1", "Column2", "Column3", "Column4", "Column5"};
@Override
public void captureHeader(CSVReader reader) throws IOException, CsvRequiredFieldEmptyException {
String[] actualCsvHeaders = reader.peek();
String actualHeader, expectedHeader;
if (expectedHeadersOrdered.length > actualCsvHeaders.length) {
throw new CsvRequiredFieldEmptyException("Missing header column.");
} else if (expectedHeadersOrdered.length < actualCsvHeaders.length) {
throw new IOException("Unexpected extra header column.");
}
// Enforce strict column ordering with index
// TODO: you might want to employ simple hashMap, List, set, etc. as needed
for (int i=0; i<actualCsvHeaders.length; i++) {
actualHeader = actualCsvHeaders[i];
expectedHeader = expectedHeadersOrdered[i];
if ( ! expectedHeader.equals(actualHeader) ) {
throw new IOException("Header columns mismatch in ordering.");
}
}
super.captureHeader(reader); // Back to default processing if the headers include ordering are as expected
}
}
CustomHeaderColumnNameMappingStrategy<YourPOJO> yourMappingStrategy = new CustomHeaderColumnNameMappingStrategy<>();
yourMappingStrategy.setType(YourPOJO.class);
try {
    CsvToBean<YourPOJO> pojosFromCsv = new CsvToBeanBuilder<YourPOJO>(new FileReader(csvFile))
            .withType(YourPOJO.class)
            .withMappingStrategy(yourMappingStrategy)
            .build();
    pojosFromCsv.stream();
} catch (Exception e) {
    // header mismatches thrown in captureHeader surface here
    e.printStackTrace();
}
Inspired by Using captureHeader in OpenCSV

Adding String to a list in Java

Please have a look at the below code snippet.
I had a look at some solutions provided on stackoverflow for adding String to a list.
They did not work out well in the below case.
@RequestMapping(value = "/rest/EmployeeDept/", method = RequestMethod.GET)
// ResponseEntity is meant to represent the entire HTTP response
public ResponseEntity<EmployeeDeptResponse> getDept()
{
EmployeeDeptResponse deptResponse = new EmployeeDeptResponse();
HttpStatus httpStatus;
List<EmployeeDept> employeeDeptList = new ArrayList<EmployeeDept>();
try {
DefaultHttpClient httpClient = new DefaultHttpClient();
HttpGet getRequest = new HttpGet(
"http://localhost:8082/rest/EmployeeDept/");
getRequest.addHeader("accept", "application/json");
HttpResponse response = httpClient.execute(getRequest);
if (response.getStatusLine().getStatusCode() != 200) {
throw new RuntimeException("Failed : HTTP error code : "
+ response.getStatusLine().getStatusCode());
}
BufferedReader br = new BufferedReader(
new InputStreamReader((response.getEntity().getContent())));
String output;
while ((output = br.readLine()) != null) {
employeeDeptList.add(output);
}
deptResponse.setItems(employeeDeptList);
httpClient.getConnectionManager().shutdown();
} catch (ClientProtocolException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
httpStatus = HttpStatus.OK;
return new ResponseEntity<EmployeeDeptResponse>(deptResponse,httpStatus);
}
I am getting an error in the while loop: "add in List cannot be applied to java.lang.String".
The list is of type EmployeeDept. The EmployeeDept class looks like this:
package com.springboot.postrgres.model;
import java.io.Serializable;
public class EmployeeDept implements Serializable {
private static final long serialVersionUID = 1L;
private int id;
private String dept;
public EmployeeDept() {
}
public EmployeeDept(int id, String dept) {
this.id = id;
this.dept = dept;
}
public int getId() {
return id;
}
public void setId(int id) {
this.id = id;
}
public String getDept() {
return dept;
}
public void setDept(String dept) {
this.dept = dept;
}
}
In the above code I have a list employeeDeptList and a string output. I need to add this string to the list.
Can any of you provide suitable suggestions? Thanks in advance.
employeeDeptList is of type ArrayList<EmployeeDept>.
List<EmployeeDept> employeeDeptList = new ArrayList<EmployeeDept>();
output on the other hand is of type String
String output;
So when you do employeeDeptList.add(output);, you are trying to add a String to your employeeDeptList, when it should be an EmployeeDept.
So you either make output an EmployeeDept or you rethink what you want to do with it.
As a suggestion, I am going to assume that your output should contain the information you need to create an EmployeeDept. You probably want to parse that information and create a EmployeeDept dept = new EmployeeDept(parsedId, parsedDept); and then add it to employeeDeptList as employeeDeptList.add(dept);
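A minimal sketch of that suggestion, purely illustrative: assume, hypothetically, that each line arrives as a tab-separated id and dept pair (the answers below handle the actual JSON payload, which is what this endpoint really returns):
// hypothetical line format: "<id>\t<dept>"; adjust parsing to the real payload
String[] parts = output.split("\t");
if (parts.length >= 2) {
    EmployeeDept dept = new EmployeeDept(Integer.parseInt(parts[0].trim()), parts[1].trim());
    employeeDeptList.add(dept);
}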
employeeDeptList is a list of EmployeeDept objects. You are trying to add a String to a list of EmployeeDept, which is not possible unless you change the type of the output variable to EmployeeDept.
If your response is valid JSON (you specified the accept header), why not map it to objects?
ObjectMapper mapper = new ObjectMapper();

//assuming your response entity content is a list of objects (a json array, since you specified header 'application/json')
String jsonArray = IOUtils.toString(response.getEntity().getContent(), "UTF-8");
List<EmployeeDept> employeeDeptList = mapper.readValue(jsonArray,
        TypeFactory.defaultInstance().constructCollectionType(List.class, EmployeeDept.class));

//assuming your response is a single object
String json = IOUtils.toString(response.getEntity().getContent(), "UTF-8");
employeeDeptList.add(mapper.readValue(json, EmployeeDept.class));

//assuming every line of content is an object (does not really make sense)
BufferedReader br = new BufferedReader(new InputStreamReader(response.getEntity().getContent()));
String output;
while ((output = br.readLine()) != null) {
    employeeDeptList.add(mapper.readValue(output, EmployeeDept.class));
}
There is a problem in your code.
while ((output = br.readLine()) != null) {
employeeDeptList.add(output);
}
output is a String and you are trying to add it to a List<EmployeeDept>. You can't do that. If you want to add output to a list, it should be a list of Strings, i.e. List<String>.
As you mentioned, what you are getting is,
{
"1499921014230": {
"id": 1499921014230,
"dept": "mechanics"
},
"1499921019747": {
"id": 1499921019747,
"dept": "civil"
}
}
If you can change that, try changing it to a simple array of objects:
[
{
"id": 1499921014230,
"dept": "mechanics"
},
{
"id": 1499921019747,
"dept": "civil"
}
]
Add the dependency below if you use Maven, or just add the .jar to your lib folder:
<dependency>
<groupId>org.json</groupId>
<artifactId>json</artifactId>
<version>20090211</version>
</dependency>
Then try something like this,
while ((output = br.readLine()) != null) {
JSONArray jsonArr = new JSONArray(output);
for (int i = 0; i < jsonArr.length(); i++) {
JSONObject jsonObj = jsonArr.getJSONObject(i);
String dept = jsonObj.getString("dept");
int id = jsonObj.getInt("id");
System.out.println("id : " + id + " dept : " + dept);
employeeDeptList.add(new EmployeeDept(id, dept));
}
}

Can't read json file

Here's my method where I'm reading the json file.
private void LoadTabaksFromJson() {
InputStream raw = mContext.getResources().openRawResource(R.raw.tabaks);
Reader reader = new BufferedReader(new InputStreamReader(raw));
ListOfTabaks listOfTodos = new Gson().fromJson(reader, ListOfTabaks.class);
List<Tabak> todoList = listOfTodos.getTodoArrayList();
for (Tabak item: todoList){
mDataBase.insert(TabakTable.NAME,null,getContentValues(item));
}
}
public class ListOfTabaks {
protected ArrayList<Tabak> tabakArrayList;
public ArrayList<Tabak> getTodoArrayList(){
return tabakArrayList;
}
}
And the exception:
Caused by: java.lang.NullPointerException: Attempt to invoke interface method 'java.util.Iterator java.util.List.iterator()' on a null object reference
at com.hookah.roma.hookahmix.TabakLab.LoadTabaksFromJson(TabakLab.java:61)
at com.hookah.roma.hookahmix.TabakLab.<init>(TabakLab.java:32)
at com.hookah.roma.hookahmix.TabakLab.get(TabakLab.java:37)
at com.hookah.roma.hookahmix.TabakListFragment.updateUI(TabakListFragment.java:38)
at com.hookah.roma.hookahmix.TabakListFragment.onCreateView(TabakListFragment.java:32)
at android.support.v4.app.Fragment.performCreateView(Fragment.java:2184)
at android.support.v4.app.FragmentManagerImpl.moveToState(FragmentManager.java:1298)
at android.support.v4.app.FragmentManagerImpl.moveFragmentsToInvisible(FragmentManager.java:2323)
at android.support.v4.app.FragmentManagerImpl.executeOpsTogether(FragmentManager.java:2136)
And the json file:
{
"tabaksArrayList":[
{
"name":"Абрикос",
"description":"Со вкусом Абрикоса",
"rating":"4.1",
"favourite":"1",
"family":"Al fakher"
},
{
"name":"Ананас",
"description":"Со вкусом Ананаса",
"rating":"4.1",
"favourite":"1",
"family":"Al fakher"
},
{
"name":"Апельсин",
"description":"Со вкусом Апельсина",
"rating":"4.1",
"favourite":"1",
"family":"Al fakher"
},
{
"name":"Апельсин с мятой",
"description":"Со вкусом Апельсина с мятой",
"rating":"4.1",
"favourite":"1",
"family":"Al fakher"
},
It looks like a json schema issue; I'm guessing getTodoArrayList() returns null. You can refer to this to generate your schema.
But sometimes those tools can be confusing, so I tried to create your schema manually, like this:
TabakRoot.java
public class TabakRoot {
    @SerializedName("tabaksArrayList")
    private List<TabakItem> tabakItem = null;

    public List<TabakItem> getTabakItem() {
        return tabakItem;
    }
}
TabakItem.java
public class TabakItem {
    @SerializedName("family")
    @Expose
    private String tabakFamily;

    public String getTabakFamily() {
        return tabakFamily;
    }
}
finally
TabakRoot listOfTodos = new Gson().fromJson(reader, TabakRoot.class);
List<TabakItem> todoList = listOfTodos.getTabakItem();
Looks like you are not initialising your ArrayList, try changing:
protected ArrayList<Tabak> tabakArrayList;
for:
protected ArrayList<Tabak> tabakArrayList = new ArrayList<>();
Please put your json file in the assets folder and use an AsyncTask to protect from ANR-like situations:
@Override
protected String doInBackground(Void... params) {
    String json = null;
    try {
        InputStream stream = activity.getAssets().open("ur_json_file_in_assets_folder.json");
        int size = stream.available();
        byte[] buffer = new byte[size];
        stream.read(buffer);
        stream.close();
        json = new String(buffer, "UTF-8");
    } catch (IOException e) {
        e.printStackTrace();
        return null;
    }
    return json;
}
then parse in
@Override
protected void onPostExecute(String str) {
    try {
        JSONObject object = new JSONObject(str);
        JSONArray arr = object.getJSONArray("tabaksArrayList");
        ...
    } catch (JSONException e) {
        e.printStackTrace();
    }
}
more details at ParseJsonFileAsync.java
You're not initialising tabakArrayList; add a constructor to your ListOfTabaks as follows:
public ListOfTabaks() {
    tabakArrayList = new ArrayList<>();
}
and you should be fine

converting java object to json

I am trying to convert a java object to json. I have a java class which reads a specific column from a text file, and I want to store that column in json format.
Here is my code. I don't know where I am going wrong.
Thanks in advance.
File.java
public class File {
public File(String filename)
throws IOException {
filename = readWordsFromFile("c:/cbir-2/sample/aol.txt");
}
public String value2;
public String readWordsFromFile(String filename)
throws IOException {
filename = "c:/cbir-2/sample/aol.txt";
// Creating a buffered reader to read the file
BufferedReader bReader = new BufferedReader(new FileReader(filename));
String line;
//Looping the read block until all lines in the file are read.
while ((line = bReader.readLine()) != null) {
// Splitting the content of tabbed separated line
String datavalue[] = line.split("\t");
value2 = datavalue[1];
// System.out.println(value2);
}
bReader.close();
return "File [ list=" + value2 + "]";
}
}
GsonExample.java
import com.google.gson.Gson;
public class GsonExample {
public static void main(String[] args)
throws IOException {
File obj = new File("c:/cbir-2/sample/aol.txt");
Gson gson = new Gson();
// convert java object to JSON format,
// and returned as JSON formatted string
String json = gson.toJson(obj);
try {
//write converted json data to a file named "file.json"
FileWriter writer = new FileWriter("c:/file.json");
writer.write(json);
writer.close();
} catch (IOException e) {
e.printStackTrace();
}
System.out.println(json);
}
}
I recommend you use Jackson, a high-performance JSON processor, from http://jackson.codehaus.org/.
Here is the sample from their tutorial.
The most common usage is to take a piece of JSON and construct a Plain Old Java Object ("POJO") out of it. So let's start there, with a simple 2-property POJO like this:
// Note: can use getters/setters as well; here we just use public fields directly:
public class MyValue {
public String name;
public int age;
// NOTE: if using getters/setters, can keep fields `protected` or `private`
}
we will need a com.fasterxml.jackson.databind.ObjectMapper instance, used for all data-binding, so let's construct one:
ObjectMapper mapper = new ObjectMapper(); // create once, reuse
The default instance is fine for our use -- we will learn later on how to configure mapper instance if necessary. Usage is simple:
MyValue value = mapper.readValue(new File("data.json"), MyValue.class);
// or:
value = mapper.readValue(new URL("http://some.com/api/entry.json"), MyValue.class);
// or:
value = mapper.readValue("{\"name\":\"Bob\", \"age\":13}", MyValue.class);
And if we want to write JSON, we do the reverse:
mapper.writeValue(new File("result.json"), myResultObject);
// or:
byte[] jsonBytes = mapper.writeValueAsBytes(myResultObject);
// or:
String jsonString = mapper.writeValueAsString(myResultObject);
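Tying that back to the question, a sketch (not the asker's exact code): collect the second tab-separated column into a list and let Jackson serialize the list, rather than serializing the file-reading object itself. The path is the one from the question; the length guard avoids the missing-column case the other answers mention.
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.BufferedReader;
import java.io.FileReader;
import java.util.ArrayList;
import java.util.List;

List<String> secondColumn = new ArrayList<>();
try (BufferedReader br = new BufferedReader(new FileReader("c:/cbir-2/sample/aol.txt"))) {
    String line;
    while ((line = br.readLine()) != null) {
        String[] cols = line.split("\t");
        if (cols.length > 1) {           // guard against lines with no second column
            secondColumn.add(cols[1]);   // keep every value, not just the last one
        }
    }
}
ObjectMapper mapper = new ObjectMapper();
mapper.writeValue(new java.io.File("c:/file.json"), secondColumn); // e.g. ["foo","bar",...]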
For processing a file that has the information in columns, like a CSV, I recommend opencsv for this task. Here is an example for information in 5 columns separated by '|':
import com.opencsv.CSVReader;
import pagos.vo.UserTransfer;
import java.io.*;
import java.text.DateFormat;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.List;
/**
* Created by anquegi on
*/
public class CSVProcessor {
public List<String[]> csvdata = new ArrayList<String[]>();
public CSVProcessor(File CSVfile) {
CSVReader reader = null;
try {
reader = new CSVReader(new FileReader(CSVfile),'|');
} catch (FileNotFoundException e) {
e.printStackTrace();
Logger.error("Cannot read CSV: FileNotFoundException");
}
String[] nextLine;
if (reader != null) {
try {
while ((nextLine = reader.readNext()) != null) {
this.csvdata.add(nextLine);
}
} catch (IOException e) {
e.printStackTrace();
Logger.error("Cannot read CSV: IOException");
}
}
}
public List<TransfersResult> extractTransfers() {
    List<TransfersResult> transfersResults = new ArrayList<>();
    for (String[] csvline : this.csvdata) {
        if (csvline.length >= 5) {
            // here transfersResult is a pojo java object
            TransfersResult transfersResult = new TransfersResult(csvline[0],
                    csvline[1], csvline[2], csvline[3], csvline[4]);
            transfersResults.add(transfersResult); // collect the parsed row
        }
    }
    return transfersResults;
}
}
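A quick usage sketch, assuming the same TransfersResult POJO and a hypothetical input file:
CSVProcessor processor = new CSVProcessor(new File("transfers.csv")); // hypothetical path
List<TransfersResult> transfers = processor.extractTransfers();
System.out.println("Parsed " + transfers.size() + " transfers");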
And for returning JSON from a servlet, this is solved in this Stack Overflow question:
How do you return a JSON object from a Java Servlet
Looks like you might be overwriting value2 for each line:
value2 = datavalue[1];
EDIT: Make value2 a List and add to it:
value2.add(datavalue[1]);
EDIT2: You need to check the size of the array before using it.
if (datavalue.length >= 2){
value2.add(datavalue[1]);
}
The reason for the exception could be
value2 = datavalue[1];
meaning that during an execution of the while loop you are trying to assign the second element (datavalue[1]) of the String array to value2, and if the line has no tab-separated second column that element does not exist, so it throws the exception.
