public KustoResultSetTable executeKustoQuery(ClientImpl client, String query) {
KustoResultSetTable mainTableResult = null;
try {
KustoOperationResult results = client.execute(databaseName, query);
mainTableResult = results.getPrimaryResults();
} catch (DataServiceException | DataClientException e) {
errorHandler(e, "Error while retrieving results from kusto query!");
}
return mainTableResult;
}
The above code returns me a result of this type:
Name | Age
XYZ AAA | 29
How can I get the value of the first row under the Name column using the Azure Kusto Java mainTableResult object?
Expected String output: "XYZ AAA"
You can do:
if (mainTableResult.first()) {
int columnIndex = mainTableResult.findColumn("Name");
return mainTableResult.getString(columnIndex);
} else {
throw new UnsupportedOperationException(""); // Or any other error handling
}
A complete version is:
public String executeKustoQuery(ClientImpl client, String query) {
KustoResultSetTable mainTableResult = null;
try {
KustoOperationResult results = client.execute("databaseName", query);
mainTableResult = results.getPrimaryResults();
if (mainTableResult.first()) {
int columnIndex = mainTableResult.findColumn("Name");
return mainTableResult.getString(columnIndex);
} else {
throw new UnsupportedOperationException(""); // Or any other error handling
}
} catch (DataServiceException | DataClientException e) {
errorHandler(e, "Error while retrieving results from kusto query!");
}
return null; // only reached if errorHandler does not rethrow
}
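For completeness, here is a minimal usage sketch of the first method above; the query text is an assumption for illustration, and the null check guards the error path that can return an unset result:
// Hypothetical caller; the query string is made up for illustration
KustoResultSetTable table = executeKustoQuery(client, "MyTable | take 1");
if (table != null && table.first()) {
    String name = table.getString(table.findColumn("Name")); // e.g. "XYZ AAA"
    System.out.println(name);
}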
I have a Patients entity class which auto generates an id:
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
@Column(name = "personId", nullable = false, unique = true)
private Long personId;
public void copy (Patients patient) {
if (patient.getNationality() != null)
this.setNationality(patient.getNationality());
if (patient.getGivenName() != null)
this.setGivenName(patient.getGivenName());
if (patient.getMiddleName() != null)
this.setMiddleName(patient.getMiddleName());
if (patient.getPrefix() != null)
this.setPrefix(patient.getPrefix());
}
/**
* @return personId
*/
public Long getPersonId() {
return personId;
}
My addPerson in PersonDaoImpl:
public Patients addPerson(Patients person) {
Patients p = new Patients(person);
try {
em = factory.createEntityManager();
em.getTransaction().begin();
SimpleDateFormat sdfr = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS+05:30");
Date date = new Date();
String dateCreated = sdfr.format(date);
p.setDateCreated(dateCreated);
em.persist(p);
em.getTransaction().commit();
} catch (Exception e) {
em.getTransaction().rollback();
log.error("Exception caught :: " + e);
p = null;
}
em.close();
return p;
}
My update API in the person service class:
@PUT
@Path("/person-manager-resource/updatePersonById")
@Produces("application/json")
@Consumes("application/json")
public Response update(Patients person) {
log.info("Inside UpdatePerson");
log.info(person.getPersonId());
dao = new PersonDaoImpl();
ObjectMapper mapper = new ObjectMapper();
person1 = dao.updatePerson(person);
String result = "";
try {
result = mapper.writeValueAsString(person1);
log.info("Person updated :: " + result);
} catch (JsonProcessingException e) {
log.info("Exception Caught :: " + e);
}
if (person1 != null) {
return Response.
status(Response.Status.OK.getStatusCode()).
entity(result).
build();
} else {
return Response.
status(Response.Status.INTERNAL_SERVER_ERROR.getStatusCode()).
entity(result).
build();
}
}
UpdatePerson:
public Patients updatePerson(Patients updatedPatient) {
Patients dbPatient = new Patients();
TypedQuery<Patients> query = null;
ObjectMapper mapper = new ObjectMapper();
try {
em = factory.createEntityManager();
String identifier = updatedPatient.getPersonIdentifiers().getIdentifier();
String queryStr = "SELECT c FROM Patients c where c.personIdentifiers.identifier = '" + identifier + "'";
query = em.createQuery(queryStr, Patients.class);
dbPatient = query.getSingleResult();
dbPatient.copy(updatedPatient);
em.getTransaction().begin();
em.merge(dbPatient);
em.getTransaction().commit();
} catch (Exception e) {
log.error("Exception caught :: " + e);
em.getTransaction().rollback();
dbPatient = null;
}
em.close();
return dbPatient;
}
I pass a JSON object through my REST API to create a patient entry:
{
"personId": 5,
"prefix": null,
"givenName": "Pooja roy",
"middleName": null
}
This works fine. I then take the same object, which now contains the auto-generated personId, and pass it to an API that is supposed to update the object. I pass the JSON into the Patients entity object, but when I print this whole object, the personId is null.
Since it is null and it is the primary key, I can't do a merge; I have to manually update the database object, which is a very lengthy process.
Any ideas why it is coming through as null and how I can retrieve it?
I am using Postgres.
I think the whole problem is caused by the implementation of the updatePerson method. You should implement the method as follows, and it should work as expected, assuming the updatedPatient instance corresponds to a persisted entity (i.e., its ID field is set):
public Patients updatePerson(Patients updatedPatient) {
Patients mergedPatient = new Patients();
try {
em = factory.createEntityManager();
em.getTransaction().begin();
mergedPatient = em.merge(updatedPatient);
em.getTransaction().commit();
} catch (Exception e) {
log.error("Exception caught :: " + e);
em.getTransaction().rollback();
}
em.close();
return mergedPatient;
}
Now mergedPatient should contain the synchronized state.
Update: an alternative solution
If for whatever reason you cannot use a setter for the ID field, the following might solve your problem:
public Patients updatePerson(Patients updatedPatient) {
Patients dbPatient = new Patients();
try {
em = factory.createEntityManager();
String identifier = updatedPatient.getPersonIdentifiers().getIdentifier();
em.getTransaction().begin();
dbPatient = em.find(Patients.class, Long.parseLong(identifier));
dbPatient.copy(updatedPatient);
em.getTransaction().commit();
} catch (Exception e) {
// ...
dbPatient = null;
}
em.close();
return dbPatient;
}
As the em.find() method is executed inside of a transaction, the object returned is managed, which means any changes to that returned instance will be synchronized with the database when the transaction commits.
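To make that concrete, here is a minimal sketch, assuming an open EntityManager em and a known personId: any setter called on the instance returned by em.find() inside the transaction is written to the database at commit, with no explicit merge.
// Sketch only: personId is assumed to be the entity's primary key value
em.getTransaction().begin();
Patients managed = em.find(Patients.class, personId); // returns a managed instance
managed.setGivenName("Pooja Roy"); // change is tracked automatically
em.getTransaction().commit(); // the UPDATE is issued here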
PersonId is an auto-generated ID, so JPA doesn't allow me to write a setter for personId; we only have the getPersonId() method in the entity class.
So, in updatePerson(Patients person), when I pass the person object, every setter is called and the object is thus created. Since personId doesn't have a setter method, it comes back as null in that object.
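One caveat on that assumption: Jackson does not need a setter to populate a field during deserialization. As a sketch (assuming Jackson 2.x, which the ObjectMapper usage above suggests), either of the following lets personId survive the JSON round trip without adding a JPA-visible setter:
// Option 1: annotate the field so Jackson reads and writes it directly
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
@Column(name = "personId", nullable = false, unique = true)
@JsonProperty("personId")
private Long personId;

// Option 2: configure the shared ObjectMapper to use field access
ObjectMapper mapper = new ObjectMapper();
mapper.setVisibility(PropertyAccessor.FIELD, JsonAutoDetect.Visibility.ANY);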
I have set up the LightSIDE plugin and can run it properly, but I don't know why I can't save my data to an empty file. This is the simple structure I made.
Activity is the list of data that needs to be categorized.
I have 3 categories, and each of them has its own type.
I already define each category with a specific list of words. For example: Food ({Sushi, Food, Japan}, {Cap Jay, Food, Chinese}, {Jog, Sport, Running}, ...)
And this is how I save my prediction with LightSIDE.
public void predictSectionType(String[] sections, List<String> activityList) {
LightSideService currentLightsideHelper = new LightSideService();
Recipe newRecipe;
// Initialize SIDEPlugin
currentLightsideHelper.initSIDEPlugin();
try {
// Load Recipe with Extracted Features & Trained Models
ClassLoader myClassLoader = getClass().getClassLoader();
newRecipe = ConverterControl.readFromXML(new InputStreamReader(myClassLoader.getResourceAsStream("static/lightsideTrainingResult/trainingData.xml")));
// Predict Result Data
Recipe recipeToPredict = currentLightsideHelper.loadNewDocumentsFromCSV(sections); // DocumentList & Recipe Created
currentLightsideHelper.predictLabels(recipeToPredict, newRecipe);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
I have a LightSideService class as a summary class of the LightSIDE functions.
public class LightSideService {
// Extract Features Parameters
final String featureTableName = "1Grams";
final int featureThreshold = 2;
final String featureAnnotation = "Code";
final Type featureType = Type.NOMINAL;
// Build Models Parameters
final String trainingResultName = "Bayes_1Grams";
// Predict Labels Parameters
final String predictionColumnName = featureAnnotation + "_Prediction";
final boolean showMaxScore = false;
final boolean showDists = true;
final boolean overwrite = false;
final boolean useEvaluation = false;
public DocumentListTableModel model = new DocumentListTableModel(null);
public Map<String, Serializable> validationSettings = new TreeMap<String, Serializable>();
public Map<FeaturePlugin, Boolean> featurePlugins = new HashMap<FeaturePlugin, Boolean>();
public Map<LearningPlugin, Boolean> learningPlugins = new HashMap<LearningPlugin, Boolean>();
public Collection<ModelMetricPlugin> modelEvaluationPlugins = new ArrayList<ModelMetricPlugin>();
public Map<WrapperPlugin, Boolean> wrapperPlugins = new HashMap<WrapperPlugin, Boolean>();
// Initialize Data ==================================================
public void initSIDEPlugin() {
SIDEPlugin[] featureExtractors = PluginManager.getSIDEPluginArrayByType("feature_hit_extractor");
boolean selected = true;
for (SIDEPlugin fe : featureExtractors) {
featurePlugins.put((FeaturePlugin) fe, selected);
selected = false;
}
SIDEPlugin[] learners = PluginManager.getSIDEPluginArrayByType("model_builder");
for (SIDEPlugin le : learners) {
learningPlugins.put((LearningPlugin) le, true);
}
SIDEPlugin[] tableEvaluations = PluginManager.getSIDEPluginArrayByType("model_evaluation");
for (SIDEPlugin fe : tableEvaluations) {
modelEvaluationPlugins.add((ModelMetricPlugin) fe);
}
SIDEPlugin[] wrappers = PluginManager.getSIDEPluginArrayByType("learning_wrapper");
for (SIDEPlugin wr : wrappers) {
wrapperPlugins.put((WrapperPlugin) wr, false);
}
}
//Used to Train Models, adjust parameters according to model
public void initValidationSettings(Recipe currentRecipe) {
validationSettings.put("testRecipe", currentRecipe);
validationSettings.put("testSet", currentRecipe.getDocumentList());
validationSettings.put("annotation", "Age");
validationSettings.put("type", "CV");
validationSettings.put("foldMethod", "AUTO");
validationSettings.put("numFolds", 10);
validationSettings.put("source", "RANDOM");
validationSettings.put("test", "true");
}
// Load CSV Doc ==================================================
public Recipe loadNewDocumentsFromCSV(String filePath) {
DocumentList testDocs;
testDocs = chooseDocumentList(filePath);
if (testDocs != null) {
testDocs.guessTextAndAnnotationColumns();
Recipe currentRecipe = Recipe.fetchRecipe();
currentRecipe.setDocumentList(testDocs);
return currentRecipe;
}
return null;
}
public Recipe loadNewDocumentsFromCSV(String[] rootCauseList) {
DocumentList testDocs;
testDocs = chooseDocumentList(rootCauseList);
if (testDocs != null) {
testDocs.guessTextAndAnnotationColumns();
Recipe currentRecipe = Recipe.fetchRecipe();
currentRecipe.setDocumentList(testDocs);
return currentRecipe;
}
return null;
}
protected DocumentList chooseDocumentList(String filePath) {
TreeSet<String> docNames = new TreeSet<String>();
docNames.add(filePath);
try {
DocumentList testDocs;
Charset encoding = Charset.forName("UTF-8");
{
testDocs = ImportController.makeDocumentList(docNames, encoding);
}
return testDocs;
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (Exception e) {
e.printStackTrace();
}
return null;
}
protected DocumentList chooseDocumentList(String[] rootCauseList) {
try {
DocumentList testDocs;
testDocs = new DocumentList();
testDocs.setName("TestData.csv");
List<String> codes = new ArrayList<String>();
List<String> roots = new ArrayList<String>();
for (String s : rootCauseList) {
codes.add("");
roots.add((s != null) ? s : "");
}
testDocs.addAnnotation("Code", codes, false);
testDocs.addAnnotation("Root Cause Failure Description", roots, false);
return testDocs;
} catch (Exception e) {
e.printStackTrace();
}
return null;
}
// Save/Load XML ==================================================
public void saveRecipeToXml(Recipe currentRecipe, String filePath) {
File f = new File(filePath);
try {
ConverterControl.writeToXML(f, currentRecipe);
} catch (Exception e) {
e.printStackTrace();
}
}
public Recipe loadRecipeFromXml(String filePath) throws FileNotFoundException, IOException {
Recipe currentRecipe = ConverterControl.loadRecipe(filePath);
return currentRecipe;
}
// Extract Features ==================================================
public Recipe prepareBuildFeatureTable(Recipe currentRecipe) {
// Add Feature Plugins
Collection<FeaturePlugin> plugins = new TreeSet<FeaturePlugin>();
for (FeaturePlugin plugin : featurePlugins.keySet()) {
String pluginString = plugin.toString();
if ("Basic Features".equals(pluginString) || "Character N-Grams".equals(pluginString)) {
plugins.add(plugin);
}
}
// Generate Plugin into Recipe
currentRecipe = Recipe.addPluginsToRecipe(currentRecipe, plugins);
// Setup Plugin configurations
OrderedPluginMap currentOrderedPluginMap = currentRecipe.getExtractors();
for (SIDEPlugin plugin : currentOrderedPluginMap.keySet()) {
String pluginString = plugin.toString();
Map<String, String> currentConfigurations = currentOrderedPluginMap.get(plugin);
if ("Basic Features".equals(pluginString)) {
for (String s : currentConfigurations.keySet()) {
if (s == "Unigrams" || s == "Bigrams" || s == "Trigrams" ||
s == "Count Occurences" || s == "Normalize N-Gram Counts" ||
s == "Stem N-Grams" || s == "Skip Stopwords in N-Grams") {
currentConfigurations.put(s, "true");
} else {
currentConfigurations.put(s, "false");
}
}
} else if ("Character N-Grams".equals(pluginString)) {
for (String s : currentConfigurations.keySet()) {
if (s == "Include Punctuation") {
currentConfigurations.put(s, "true");
} else if (s == "minGram") {
currentConfigurations.put(s, "3");
} else if (s == "maxGram") {
currentConfigurations.put(s, "4");
}
}
currentConfigurations.put("Extract Only Within Words", "true");
}
}
// Build FeatureTable
currentRecipe = buildFeatureTable(currentRecipe, featureTableName, featureThreshold, featureAnnotation, featureType);
return currentRecipe;
}
protected Recipe buildFeatureTable(Recipe currentRecipe, String name, int threshold, String annotation, Type type) {
FeaturePlugin activeExtractor = null;
try {
Collection<FeatureHit> hits = new HashSet<FeatureHit>();
for (SIDEPlugin plug : currentRecipe.getExtractors().keySet()) {
activeExtractor = (FeaturePlugin) plug;
hits.addAll(activeExtractor.extractFeatureHits(currentRecipe.getDocumentList(), currentRecipe.getExtractors().get(plug)));
}
FeatureTable ft = new FeatureTable(currentRecipe.getDocumentList(), hits, threshold, annotation, type);
ft.setName(name);
currentRecipe.setFeatureTable(ft);
} catch (Exception e) {
System.err.println("Feature Extraction Failed");
e.printStackTrace();
}
return currentRecipe;
}
// Build Models ==================================================
public Recipe prepareBuildModel(Recipe currentRecipe) {
try {
// Get Learner Plugins
LearningPlugin learner = null;
for (LearningPlugin plugin : learningPlugins.keySet()) {
/* if (plugin.toString().equals("Naive Bayes")) */
if (plugin.toString().equals("Logistic Regression")) {
learner = plugin;
}
}
if (Boolean.TRUE.toString().equals(validationSettings.get("test"))) {
if (validationSettings.get("type").equals("CV")) {
validationSettings.put("testSet", currentRecipe.getDocumentList());
}
}
Map<String, String> settings = learner.generateConfigurationSettings();
currentRecipe = Recipe.addLearnerToRecipe(currentRecipe, learner, settings);
currentRecipe.setValidationSettings(new TreeMap<String, Serializable>(validationSettings));
for (WrapperPlugin wrap : wrapperPlugins.keySet()) {
if (wrapperPlugins.get(wrap)) {
currentRecipe.addWrapper(wrap, wrap.generateConfigurationSettings());
}
}
buildModel(currentRecipe, validationSettings);
} catch (Exception e) {
e.printStackTrace();
}
return currentRecipe;
}
protected void buildModel(Recipe currentRecipe,
Map<String, Serializable> validationSettings) {
try {
if (currentRecipe != null) {
FeatureTable currentFeatureTable = currentRecipe.getTrainingTable();
TrainingResult results = null;
/*
* if (validationSettings.get("type").equals("SUPPLY")) {
* DocumentList test = (DocumentList)
* validationSettings.get("testSet"); FeatureTable
* extractTestFeatures = prepareTestFeatureTable(currentRecipe,
* validationSettings, test);
* validationSettings.put("testFeatureTable",
* extractTestFeatures);
*
* // if we've already trained the exact same model, don't // do
* it again. Just evaluate. Recipe cached =
* checkForCachedModel(); if (cached != null) { results =
* evaluateUsingCachedModel(currentFeatureTable,
* extractTestFeatures, cached, currentRecipe); } }
*/
if (results == null) {
results = currentRecipe.getLearner().train(currentFeatureTable, currentRecipe.getLearnerSettings(), validationSettings, currentRecipe.getWrappers());
}
if (results != null) {
currentRecipe.setTrainingResult(results);
results.setName(trainingResultName);
currentRecipe.setLearnerSettings(currentRecipe.getLearner().generateConfigurationSettings());
currentRecipe.setValidationSettings(new TreeMap<String, Serializable>(validationSettings));
}
}
} catch (Exception e) {
e.printStackTrace();
}
}
protected static FeatureTable prepareTestFeatureTable(Recipe recipe, Map<String, Serializable> validationSettings, DocumentList test) {
prepareDocuments(recipe, validationSettings, test); // assigns classes, annotations.
Collection<FeatureHit> hits = new TreeSet<FeatureHit>();
OrderedPluginMap extractors = recipe.getExtractors();
for (SIDEPlugin plug : extractors.keySet()) {
Collection<FeatureHit> extractorHits = ((FeaturePlugin) plug).extractFeatureHits(test, extractors.get(plug));
hits.addAll(extractorHits);
}
FeatureTable originalTable = recipe.getTrainingTable();
FeatureTable ft = new FeatureTable(test, hits, 0, originalTable.getAnnotation(), originalTable.getClassValueType());
for (SIDEPlugin plug : recipe.getFilters().keySet()) {
ft = ((RestructurePlugin) plug).filterTestSet(originalTable, ft, recipe.getFilters().get(plug), recipe.getFilteredTable().getThreshold());
}
ft.reconcileFeatures(originalTable.getFeatureSet());
return ft;
}
protected static Map<String, Serializable> prepareDocuments(Recipe currentRecipe, Map<String, Serializable> validationSettings, DocumentList test) throws IllegalStateException {
DocumentList train = currentRecipe.getDocumentList();
try {
test.setCurrentAnnotation(currentRecipe.getTrainingTable().getAnnotation(), currentRecipe.getTrainingTable().getClassValueType());
test.setTextColumns(new HashSet<String>(train.getTextColumns()));
test.setDifferentiateTextColumns(train.getTextColumnsAreDifferentiated());
Collection<String> trainColumns = train.allAnnotations().keySet();
Collection<String> testColumns = test.allAnnotations().keySet();
if (!testColumns.containsAll(trainColumns)) {
ArrayList<String> missing = new ArrayList<String>(trainColumns);
missing.removeAll(testColumns);
throw new java.lang.IllegalStateException("Test set annotations do not match training set.\nMissing columns: " + missing);
}
validationSettings.put("testSet", test);
} catch (Exception e) {
e.printStackTrace();
throw new java.lang.IllegalStateException("Could not prepare test set.\n" + e.getMessage(), e);
}
return validationSettings;
}
//Predict Labels ==================================================
public void predictLabels(Recipe recipeToPredict, Recipe currentRecipe) {
DocumentList newDocs = null;
DocumentList originalDocs;
if (useEvaluation) {
originalDocs = recipeToPredict.getTrainingResult().getEvaluationTable().getDocumentList();
TrainingResult results = currentRecipe.getTrainingResult();
List<String> predictions = (List<String>) results.getPredictions();
newDocs = addLabelsToDocs(predictionColumnName, showDists, overwrite, originalDocs, results, predictions, currentRecipe.getTrainingTable());
} else {
originalDocs = recipeToPredict.getDocumentList();
Predictor predictor = new Predictor(currentRecipe, predictionColumnName);
newDocs = predictor.predict(originalDocs, predictionColumnName, showDists, overwrite);
}
// Predict Labels result
model.setDocumentList(newDocs);
}
protected DocumentList addLabelsToDocs(final String name, final boolean showDists, final boolean overwrite, DocumentList docs, TrainingResult results, List<String> predictions, FeatureTable currentFeatureTable) {
Map<String, List<Double>> distributions = results.getDistributions();
DocumentList newDocs = docs.clone();
newDocs.addAnnotation(name, predictions, overwrite);
if (distributions != null) {
if (showDists) {
for (String label : currentFeatureTable.getLabelArray()) {
List<String> dist = new ArrayList<String>();
for (int i = 0; i < predictions.size(); i++) {
dist.add(String.format("%.3f", distributions.get(label).get(i)));
}
newDocs.addAnnotation(name + "_" + label + "_score", dist, overwrite);
}
}
}
return newDocs;
}
// ==================================================
}
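For reference, the helpers above appear intended to be called in this order: initialize plugins, load documents, extract features, train, then save. A hedged sketch of one training pass (the file paths are assumptions, and exception handling is omitted):
// Hypothetical training pass wired from the methods of LightSideService
LightSideService side = new LightSideService();
side.initSIDEPlugin();
Recipe recipe = side.loadNewDocumentsFromCSV("static/lightsideTrainingResult/trainingData.csv");
side.initValidationSettings(recipe);
recipe = side.prepareBuildFeatureTable(recipe);
recipe = side.prepareBuildModel(recipe);
side.saveRecipeToXml(recipe, "static/lightsideTrainingResult/trainingData.xml");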
David, it looks like the above replicates a lot of the functionality from the edu.cmu.side.recipe package. However, it doesn't look like your predictSectionType() method actually outputs the model's predictions anywhere.
If what you're trying to do is indeed to save predictions on new data using a trained model, check out the edu.cmu.side.recipe.Predictor class. It takes a trained model path as input and is used by the scripts/predict.sh convenience script, but you could repurpose its main method if you needed to call it programmatically.
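As a rough sketch of that programmatic route, reusing the Predictor constructor and predict() call already shown in predictLabels() above (the recipe path and prediction column name are assumptions, and exception handling is omitted):
// Hypothetical wiring: load the trained recipe, predict, and keep the labeled documents
Recipe trained = ConverterControl.loadRecipe("static/lightsideTrainingResult/trainingData.xml");
Recipe toPredict = currentLightsideHelper.loadNewDocumentsFromCSV(sections);
Predictor predictor = new Predictor(trained, "Code_Prediction");
DocumentList labeled = predictor.predict(toPredict.getDocumentList(), "Code_Prediction", true, false);
// 'labeled' now carries the prediction annotation and can be written out, e.g. as CSV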
I hope this helps!
My DAO Class:
@SuppressWarnings("unchecked")
public int getRowCount(Map<String, Object> searchParam) throws DAOReadException {
List<Client> clientRow = null;
try {
Criteria criteria = Criteria.forClass(Client.class);
//set criteria search
for (String key : searchParam.keySet()) {
/*if(key.equals("ClientPK.clientId1")){
criteria.add(Restrictions.like("ClientPK.clientId", searchParam.get(key)));
}*/
if(key.equals("clientPK.clientId")){
criteria.add(Restrictions.eq(key, Integer.parseInt(searchParam.get(key).toString())));
}
if(key.equals("clientName")){
criteria.add(Restrictions.like(key, searchParam.get(key)));
}
if(key.equals("status")){
criteria.add(Restrictions.eq(key, Short.parseShort(searchParam.get(key).toString())));
}
//Bug# 12544 start
if(key.equals("orgId"))
{
criteria.add(Restrictions.eq("ClientPK.orgId", searchParam.get(key)));
}
//Bug# 12544 End
}
criteria.addOrder(Order.desc("createdDate"));
clientRow = (List<Client>) findByCriteria(criteria);
}
catch (Exception e) {
throw new DAOReadException(e);
}
int rowCount = 0;
if (clientRow != null) {
rowCount = clientRow.size();
}
return rowCount;
}
}
The error is:
java.lang.IllegalArgumentException: org.hibernate.QueryException: could not resolve property: ClientPK of: com.vin.eretail.model.client.Client [select this from com.vin.eretail.model.client.Client as this where this.ClientPK.orgId=? order by this.createdDate desc]
It seems you need to change it as below; Hibernate property paths are case-sensitive and must match the Java field name (clientPK, not ClientPK):
//Bug# 12544 start
if(key.equals("orgId"))
{
criteria.add(Restrictions.eq("clientPK.orgId", searchParam.get(key)));
}
//Bug# 12544 End
We would still need to see your Client class to confirm.
I do not know how to get the data of two columns; I only know how to do it when dealing with one column.
here is the code where the issue is:
public ArrayList<String> getData() {
ArrayList<String> list = new ArrayList<String>();
Cursor c = db.rawQuery("SELECT Column1, Column2 FROM Table where id = 1", null);
try {
if (c != null) {
if (c.moveToFirst()) {
do {
String levelData = c.getString(c.getColumnIndex("Column1"));
list.add("" + levelData);
}
while (c.moveToNext());
}
}
} catch (SQLiteException e) {
Log.e("Retrieve Data", "Unable to get Data " + e);
}
return list;
}
I know that the problem is at c.getColumnIndex("Column1"), because that is where you specify which column of the table you want to read. But what should I do if I want to read two columns?
The answer is simple. It was the first time I tried this and I didn't expect it to work, but this is what I did:
try {
if (c != null) {
if (c.moveToFirst()) {
do {
String levelData = c.getString(c.getColumnIndex("Column1"));
List.add("" + levelData);
}
while (c.moveToNext());
}
}
if (c != null) {
if (c.moveToFirst()) {
do {
String levelData = c.getString(c.getColumnIndex("Column2"));
List.add("" + levelData);
}
while (c.moveToNext());
}
}
} catch (SQLiteException e) {
Log.e("Retrieve Data", "Unable to get Data " + e);
}
I simply added another copy of the same code, but this time it reads Column2, and it worked as expected. :D
Make a Java bean class with two variables and their getters and setters, like:
public class Data {
String coloumn1;
String coloumn2;
public String getColoumn1() {
return coloumn1;
}
public void setColoumn1(String coloumn1) {
this.coloumn1 = coloumn1;
}
public String getColoumn2() {
return coloumn2;
}
public void setColoumn2(String coloumn2) {
this.coloumn2 = coloumn2;
}
}
Use it like this:
ArrayList<Data> dataList = new ArrayList<Data>();
Data data = new Data();
data.setColoumn1(yourColumn1Value);
dataList.add(data);
Do the same for coloumn2, and use the getters to read the values back.
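Combining this bean with the cursor from the question, a single pass can read both columns together. A minimal sketch (table and column names as in the question, with the cursor closed afterwards):
ArrayList<Data> dataList = new ArrayList<Data>();
Cursor c = db.rawQuery("SELECT Column1, Column2 FROM Table where id = 1", null);
if (c != null) {
    if (c.moveToFirst()) {
        do {
            Data row = new Data();
            row.setColoumn1(c.getString(c.getColumnIndex("Column1")));
            row.setColoumn2(c.getString(c.getColumnIndex("Column2")));
            dataList.add(row);
        } while (c.moveToNext());
    }
    c.close(); // release the cursor when done
}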