I have an OPL .mod model and I run it from Java code. The model needs some external data.
Currently the model loads the data from a .dat file with the
IloOplFactory.createOplRunConfiguration(String modelName, String[] dataFiles)
method.
I want to load the data directly from Java code.
I found
IloOplFactory.createOplRunConfiguration(OplModelDefinition, OplDataElements)
but I can't understand how to use it (how to define elements for OplDataElements).
Could someone provide an example of defining elements and using this method?
(Or a better way to pass data from Java to an OPL model.)
Thanks in advance.
I have done this to pass control and configuration data to a model, typically parameter values and flags. Once you create an instance of IloOplDataElements, you can simply add it as a data source for your model, e.g.
IloOplDataElements configData = new IloOplDataElements(env);
configData.addElement(configData.makeElement("modelIteration", 1));
configData.addElement(configData.makeElement("debug", 2));
// etc
myModel.addDataSource(configData);
I haven't tried this with array data, but I expect it would be similar.
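For completeness, a sketch of what the matching declarations in the .mod file might look like (the element names are taken from the Java snippet above; in OPL, `= ...;` marks an element as externally initialized, whether the source is a .dat file or Java data elements):

```
// hypothetical .mod declarations fed by the Java data elements above
int modelIteration = ...;
int debug = ...;
```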
I have several JSON files and need to read, process and write the JSON objects contained in a JSON array in each file.
The output format (more specifically: the output class) is the same for all files. Let's call it OutputClass. Hence the item processor is something like ItemProcessor<X, OutputClass>, where X is the class for the specific JSON file.
The differences between files are:
The JSON array / the information is at a different position in every JSON file
The structure of the JSON objects in the JSON array differs (the objects in file A have a different syntax than the ones in file B)
I already came across @StepScope and was able to dynamically generate a reader (depending on job parameters) which starts reading at a different position in the JSON structure.
But I have no idea how to dynamically choose an ItemProcessor depending on the job parameters. I have many different JSON files and want to reduce the amount of code to write for each file.
Since you were able to create a dynamic item reader based on job parameters by using a step-scoped bean (which is the way I would do it too), you can use the same approach to create a dynamic item processor.
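A library-free sketch of that selection logic, with a stand-in ItemProcessor interface mirroring Spring Batch's (in a real job you would put the lookup inside a @StepScope @Bean method that receives the job parameter; the file-type keys and OutputClass here are hypothetical):

```java
import java.util.Map;

public class ProcessorSelector {
    // Stand-in for Spring Batch's ItemProcessor<I, O> functional interface
    interface ItemProcessor<I, O> {
        O process(I item);
    }

    // Hypothetical output class shared by all files
    record OutputClass(String value) {}

    // One processor per input file type, keyed by a job parameter value
    static final Map<String, ItemProcessor<String, OutputClass>> PROCESSORS = Map.of(
        "fileA", item -> new OutputClass("A:" + item),
        "fileB", item -> new OutputClass("B:" + item.toUpperCase())
    );

    // In Spring Batch, this lookup would live in a @StepScope bean method
    // with the job parameter injected via @Value("#{jobParameters['fileType']}")
    static ItemProcessor<String, OutputClass> forType(String fileType) {
        ItemProcessor<String, OutputClass> p = PROCESSORS.get(fileType);
        if (p == null) {
            throw new IllegalArgumentException("unknown file type: " + fileType);
        }
        return p;
    }

    public static void main(String[] args) {
        System.out.println(forType("fileA").process("x").value()); // A:x
        System.out.println(forType("fileB").process("x").value()); // B:X
    }
}
```

This keeps the per-file code down to one map entry per file type.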
In my application I have a grid populated with some data, displaying two columns: name and noOfScripts. When I double-click a row in the grid, I also want scriptname and parameters to be displayed in the window/form (the number of times they are displayed depends on noOfScripts).
I am using Java servlets for the backend. I am not sure whether I have to add the script details to the same main class or not. How should I handle the script details?
Should I save scriptname and parameters along with the main data?
Is it possible to store multiple values if I add scriptname and parameters to the main store, or should I create a different store for scripts?
If I create a different store called Scripts, how should I map it to the data record? Are there any methods provided by ExtJS that help achieve this?
I'm not sure whether my explanation of the problem is clear; let me know if it isn't and I will try to clarify.
Any suggestions, ideas or references will help.
Thanks in advance
I suggest you create a model and a store for Script, and on row item click populate a window with the data from the Script store.
ExtJS 4 has support for model associations (more here), but the mechanism is complex, so I prefer a simpler approach:
Add a mainModelId field to the Script model and simply filter the Script store every time you show your modal window (on grid item click).
I'm totally new to MapReduce programming, and in my first MR code I have this question: in my mapper, I need access to a 2D array that was created and filled before the mapper runs, in the main class. How can I access it? Should I export it to a text file and read it in the mapper? If so, how do I pass it into the mapper? I have no idea how to make it available. My code is in Java.
You can do this a couple of ways.
After you create the 2D array, you can write it to a file, load that file into HDFS, and then use the DistributedCache in the Java M/R API to access this data in your mapper/reducer code. Take a look at this: http://developer.yahoo.com/hadoop/tutorial/module5.html
If your data is not too large and you have an object that represents it which is serializable and quite small, you can pass it along via the job Configuration: serialize it and include a Base64-encoded version of it in the Configuration. Then you can access this data in the mapper/reducer: http://hadoop.apache.org/docs/stable/api/org/apache/hadoop/conf/Configuration.html#set(java.lang.String, java.lang.String)
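A self-contained sketch of the second option's encode/decode step. To keep it runnable without Hadoop on the classpath, the Configuration round trip is represented by a plain String; the property name "my.array" and the conf.set/conf.get calls mentioned in the comments are illustrative:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.UncheckedIOException;
import java.util.Base64;

public class ConfigArrayPassing {
    // Serialize the 2D array to a Base64 string; in a real job you would then
    // call conf.set("my.array", encoded) before submitting the job.
    static String encode(int[][] data) {
        try {
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
                out.writeObject(data);
            }
            return Base64.getEncoder().encodeToString(bytes.toByteArray());
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // Decode in the mapper's setup(), after reading the string back
    // with conf.get("my.array").
    static int[][] decode(String encoded) {
        byte[] raw = Base64.getDecoder().decode(encoded);
        try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(raw))) {
            return (int[][]) in.readObject();
        } catch (IOException | ClassNotFoundException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        int[][] grid = {{1, 2}, {3, 4}};
        int[][] roundTrip = decode(encode(grid));
        System.out.println(roundTrip[1][0]); // 3
    }
}
```

Base64 encoding matters here because the serialized bytes are binary and Configuration values are strings.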
I have a Java method which retrieves data from a database, and a bean which manages this data.
Using Arrays.toString(var), I get the array [{'tom','1'},{'sawyer','2'}], but Highcharts accepts data only in this format: ['tom','1','sawyer','2']
For now I get the array and use a replace function to obtain the correct format, but all of this has to be done manually.
My question is how to convert the array to the correct format and then pass it to the Highcharts data.
Also, how do I call a Java method on page load?
Thank you in advance for your patience and help.
Data that is passed to Highcharts must be JSON, so the best way to do this is to use a JSON library: http://json.org/java/
Data can also be passed using XML. The best way I found to pass data is to build the string in the Java class itself; in the page calling this Java method, you just need to create an XML DOM parser which converts the string to XML, and then push the data to Highcharts easily using jQuery.
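If you just need the flat ['tom','1','sawyer','2'] string from the question, a small sketch of building it directly on the Java side (the flatten method and sample data are illustrative; this does no quote escaping, which is one reason a JSON library is safer for real data):

```java
import java.util.StringJoiner;

public class HighchartsFormat {
    // Flatten [{'tom','1'},{'sawyer','2'}] into the ['tom','1','sawyer','2']
    // string the chart expects, instead of post-processing Arrays.toString output
    static String flatten(String[][] data) {
        StringJoiner out = new StringJoiner(",", "[", "]");
        for (String[] row : data) {
            for (String cell : row) {
                out.add("'" + cell + "'");
            }
        }
        return out.toString();
    }

    public static void main(String[] args) {
        String[][] data = {{"tom", "1"}, {"sawyer", "2"}};
        System.out.println(flatten(data)); // ['tom','1','sawyer','2']
    }
}
```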
I'm making a program that opens a previously saved file through serialization. I want to create a new file, but the old data stays in.
How can I make the program forget the data?
If you mark a field with the transient keyword, it will not be serialized. If you're saving the data by serializing objects and writing them to files, this may be what you're looking for. Here's an example of using the transient keyword.
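A minimal, self-contained sketch (the Session class and its fields are hypothetical) showing that a transient field is skipped during serialization and comes back as its default value after deserialization:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class TransientDemo {
    static class Session implements Serializable {
        String user = "alice";
        transient String cachedSecret = "do-not-persist"; // skipped by serialization
    }

    // Serialize to a byte array and read the object back, as a file round trip would
    static Session roundTrip(Session s) {
        try {
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
                out.writeObject(s);
            }
            try (ObjectInputStream in =
                    new ObjectInputStream(new ByteArrayInputStream(bytes.toByteArray()))) {
                return (Session) in.readObject();
            }
        } catch (IOException | ClassNotFoundException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        Session restored = roundTrip(new Session());
        System.out.println(restored.user);         // alice
        System.out.println(restored.cachedSecret); // null: transient fields come back as defaults
    }
}
```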
Shouldn't creating a new instance of whichever class you are serializing give you such an 'empty data record'?
Using the text editor example: you would have, say, a Document class which completely encapsulates a text document. Assuming you use serialization to save it, simply calling new Document() would give you an empty document. Until you fill in some text (or data, in your program), you shouldn't need to open a file at all.
Assuming you meant a tree of Employee data, or a tree data structure with Employee objects at its nodes, then creating a new such tree will give you what you want.
Think about how you created the first data set that you serialized, and just repeat that process.