I'm writing a simple JavaFX application (on Java 7) where I display data pulled out of a database using a StackedBarChart. I also provide the user with the ability to filter the displayed data sets based on a specific property's value. The problem I'm facing is that there seems to be some caching issue. Consider the following scenario:
1. Initial load: display everything to the user, with no filtering involved. Say our categories are named 1, 2, 3, 4 and 5, and are rendered in that order (consider them sorted).
2. The user selects a filter value. This leads to only categories 1, 2, 4 and 5 being on screen (again, in that order; this is the expected behavior).
3. The user resets the filter to "do-not-filter".
The expected output of step 3 would be 1, 2, 3, 4 and 5, in that order. However, it is 1, 2, 4, 5, 3. Notice that the category that was filtered out is added back at the end instead of where it should be.
Things I've tried so far:
Assigning a new ObservableList via CategoryAxis.setCategories(). This doesn't work.
Same as above, but also forcing the category list to null beforehand.
Sorting the category list. This doesn't work either.
I can't (yet) update to Java 8, and I also can't just leave this as a broken feature, because it is expected to roll out to users before we upgrade to Java 8. So JavaFX 8's FilteredList is out of the question (and a backport looks very painful just from the changes to ObservableList). I also don't want to recreate the whole chart if I can avoid it.
At this point, I'm out of ideas. Any suggestions are welcome. Below is the function that populates the chart.
private void refreshContents() {
this.vaguesTotal.getData().clear();
this.vaguesDone.getData().clear();
this.vaguesPending.getData().clear();
this.xAxis.setCategories(null);
this.chartCategories = FXCollections.observableArrayList();
// Already sorted here
for (VagueLight vagueInfo : context.getVagues()) {
if (this.categoryFilter != null && this.categoryFilter != vagueInfo.getCategory())
continue;
int dossiersTraites = vagueInfo.getNbDossiersTraites();
int dossiersPending = vagueInfo.getNbDossiersATraiter();
String vagueIdentifier = Integer.toString(vagueInfo.getId());
this.vaguesTotal.getData().add(new Data<String, Number>(vagueIdentifier, 0, vagueInfo));
this.vaguesDone.getData().add(new Data<String, Number>(vagueIdentifier, dossiersTraites, vagueInfo));
this.vaguesPending.getData().add(new Data<String, Number>(vagueIdentifier, dossiersPending, vagueInfo));
this.chartCategories.add(vagueIdentifier);
}
// This just sets up event handlers and styles for the series
for (Series<String, Number> dataSeries : this.barChart.getData()) {
for (Data<String, Number> dataNode : dataSeries.getData()) {
initializeDataNode(dataNode, dataSeries);
}
}
// This is where the "bug" happens
this.xAxis.setCategories(this.chartCategories);
layout(true);
}
Related
We have a lot of grids in our Vaadin app, and every one of them has some filters on string columns.
Recently we noticed that not all of them are working. There are no errors in the log, no exceptions, nothing.
We have one method for adding all the filters to string columns in the whole app; it adds a text field to a HeaderRow of the grid with a specific listener:
public <T> void addFilterToSpecificStringColumn(ListDataProvider<T> dataProvider, HeaderRow filterRow, String columnName, FilterObject filter) {
TextField filterField = new TextField();
filterField.setValueChangeMode(ValueChangeMode.EAGER);
filterField.addValueChangeListener(event -> {
filter.setCorrectFieldByName(columnName, event.getValue());
dataProvider.refreshAll();
});
filterRow.getCell(this.getColumnByKey(columnName)).setComponent(filterField);
}
This is how we add the filter:
dataProvider.setFilter(filterObject::searchForOffer);
and this is the basic filter method of FilterObject that setFilter() registers on the DataProvider; it seems this is not always applied:
public boolean searchForOffer(Offer offer) {
return name.length() == 0 || StringUtils.containsIgnoreCase(String.valueOf(offer.getName()), name);
}
In the normal case, the dataProvider.refreshAll() called in the text field listener triggers the searchForOffer() method, which goes through the data provider's list elements one by one and keeps the offers that meet the filter condition (the name in this case). This basically stopped working in some grids and I am not sure why.
The listener is called (I am able to debug it) and dataProvider.refreshAll() is called, but it never calls the filter method, as it does in the other grids.
What could be the problem here? I've checked and dataProvider.getItems() is the same instance of the list that was set when invoking grid.setItems().
We are using Vaadin 14 (Java only) and Java 11.
OK, I asked here because I spent almost 5 hours trying to figure out what is wrong. It turns out another dev added a grid.setItems() call (which creates a new data provider) and passed it a new instance of the list with the same elements, because we do have other threads updating grids...
I really felt hopeless at some point, but the second I posted the question I found what the problem was.
I guess this question can be deleted by a mod. My bad.
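For reference, a minimal sketch of why a stray grid.setItems() call silently drops a filter, and how to change the items without replacing the data provider; this assumes Vaadin 14 with an in-memory ListDataProvider, and the variable names are illustrative:
// The filter lives on this particular ListDataProvider instance.
ListDataProvider<Offer> dataProvider = DataProvider.ofCollection(offers);
dataProvider.setFilter(filterObject::searchForOffer);
grid.setDataProvider(dataProvider);
// This wraps the new list in a brand-new ListDataProvider under the hood,
// so the filter registered above is silently gone:
grid.setItems(new ArrayList<>(updatedOffers));
// Updating the existing provider's backing collection keeps the filter:
dataProvider.getItems().clear();
dataProvider.getItems().addAll(updatedOffers);
dataProvider.refreshAll();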
I am using Vaadin 15. In my application I am using a grid, which uses a data provider (data) to show a bunch of different columns. Further, I am using a filter to filter all columns. One column consists of dates, and with my current filter function I cannot filter ranges.
I am very new to Java and programming in general. This is the filter function I currently use:
TextField cas = getFilters(casColumn, filter);
cas.addValueChangeListener(event -> data.addFilter(dataRecord -> StringUtils.containsIgnoreCase(dataRecord.getCas(), cas.getValue())));
Now I would also like to be able to filter ranges of dates: a date is entered in the filter TextField, everything from that date until today should be shown, and everything older than that date should not be shown.
Thanks for your help. Tips for literature on how to do this are also appreciated.
Just pointing out that it makes little sense to use Vaadin 15, as it is not supported anymore; the latest version is Vaadin 21.
With the newest Vaadin versions, if you are using an in-memory data provider, the Grid is set up using
GridListDataView<Data> dataView = grid.setItems(data); // where data is a collection of items
Then you can set the filter via the dataView, e.g.
textField.addValueChangeListener(event -> {
dataView.setFilter(item -> item.getProperty().equals(event.getValue()));
});
Note: if you have a callback-based data provider, the above method does not apply to you; instead, filtering is done as follows, i.e. by providing a query that can use the value of the filter:
TextField filter = new TextField("Filter");
filter.setValueChangeMode(ValueChangeMode.LAZY);
filter.addValueChangeListener(event -> {
grid.setItems(
    query -> personService.fetch(query.getOffset(), query.getLimit(), event.getValue()).stream(),
    query -> personService.count(event.getValue()));
});
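For the date-range part of the question, here is a hedged sketch built on the dataView approach above; it assumes the newest-Vaadin in-memory API and that the grid's item type exposes its date through a getter (getDate() returning a LocalDate is an illustrative assumption, not part of the original code):
DatePicker fromFilter = new DatePicker("From");
fromFilter.addValueChangeListener(event -> {
    LocalDate from = event.getValue();
    dataView.setFilter(item -> {
        if (from == null) {
            return true; // no date chosen, show everything
        }
        LocalDate date = item.getDate(); // illustrative getter on the row bean
        // keep rows between the chosen date and today (inclusive)
        return !date.isBefore(from) && !date.isAfter(LocalDate.now());
    });
});
Using a DatePicker avoids parsing text by hand; with a plain TextField the same predicate works after converting the input with LocalDate.parse() inside a try/catch.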
I'm trying to automatically select the first item in a filtered table.
I'm essentially doing the following:
table = new TableViewer(...);
table.addFilter(filter);
table.setContentProvider(contentProvider);
table.setInput(input);
first = table.getElementAt(0);
table.setSelection(new StructuredSelection(first));
The surprising thing is that (depending on the filter) I can get an element that is filtered out from getElementAt(0). The result is that ultimately, no item will be selected.
I have tried calling table.refresh() before getting the element with the same results.
If I call getElementAt(0) at a later point, I do indeed get the correct first element (that is not filtered out).
How can I make getElementAt respect the filtering immediately?
In my experience, the most reliable way to select the first (visible) element is, just this once, to bypass JFace, rely on the data it has pushed into the widget, and use the SWT API to select the first TableItem, like this:
static Object selectFirstElement(TableViewer tableViewer) {
Object firstElement = null;
Table table = tableViewer.getTable();
if (table.getItemCount() > 0) {
firstElement = table.getItem(0).getData();
tableViewer.setSelection(new StructuredSelection(firstElement), true); // true == reveal
}
return firstElement;
}
I've been using this code successfully for several years with sorted, filtered, and virtual tables.
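Using the question's variables, the helper would slot in right after the input is set; a small usage sketch:
table.setInput(input);
Object first = selectFirstElement(table);
if (first == null) {
    // the filter removed everything, so there was nothing to select
}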
Well, I found what was wrong, and it was my own fault. The filter I set is mutable, so it can filter more or less strictly. The problem was that I activated the stricter filtering after I had set the selection.
Thanks everyone for the help anyway.
I am getting messages from a Kafka topic, each of which is a JSON message. I would like to extract a field from that JSON message (for example an ID) and create 'n' sessions for 'n' unique device IDs.
I have tried creating a new session instance for every unique ID that I receive, but after creating a new session window instance, i.e. a new branch in the pipeline for each ID, I am unable to push the subsequent messages to the corresponding branch that already exists.
The expected result that I want is, suppose we are getting messages like
{ID:1,...}, {ID:2,...}, {ID:3,...},{ID:1,...}
There would be three different sessions created and the fourth message would go to the session for device ID 1.
Is there a way to do this within the Apache Beam programming paradigm, or in plain Java? Any help would be greatly appreciated.
Yes, this is possible with the Beam paradigm if you use a custom WindowFn. You can subclass the Sessions class and modify it to set gap durations differently based on the ID of each element. You can do this in assignWindows, which looks like this in Sessions:
@Override
public Collection<IntervalWindow> assignWindows(AssignContext c) {
// Assign each element into a window from its timestamp until gapDuration in the
// future. Overlapping windows (representing elements within gapDuration of
// each other) will be merged.
return Arrays.asList(new IntervalWindow(c.timestamp(), gapDuration));
}
The AssignContext class can be used to access the element being assigned this window, which will allow you to retrieve the ID of that element.
It also sounds like you want elements with different IDs to be grouped in different windows (i.e. if element A and B come in within the gap duration but with different IDs, they should still be in different windows). This can be done by performing a GroupByKey with the ID of your elements as keys. Session windows apply on a per-key basis as described in the Beam Programming Guide, so this will separate the elements by IDs.
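A minimal sketch of that keying step, assuming the incoming messages form a PCollection<String> of JSON payloads carrying an "ID" field (the variable names, field name and gap duration are illustrative):
// Key each JSON message by its device ID; session windows are applied per key,
// so every device ID ends up with its own sessions.
PCollection<KV<String, String>> keyedById = messages.apply("Key by device ID",
    MapElements.into(TypeDescriptors.kvs(TypeDescriptors.strings(), TypeDescriptors.strings()))
        .via((String json) -> KV.of(String.valueOf(new JSONObject(json).get("ID")), json)));
PCollection<KV<String, Iterable<String>>> perDeviceSessions = keyedById
    .apply(Window.<KV<String, String>>into(Sessions.withGapDuration(Duration.standardMinutes(10))))
    .apply(GroupByKey.<String, String>create());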
I have implemented Java and Python examples for this use case. The Java one follows the approach suggested by Daniel Oliveira but I think it's nice to share a working sample.
Note that the Java example is featured in the Beam common patterns docs. Custom merging windows isn't supported in Python (with fnapi).
Java version:
We can adapt the code from Session windows to fit our use case.
Briefly, when records are windowed into sessions they get assigned to a window that begins at the element’s timestamp (unaligned windows) and adds the gap duration to the start to calculate the end. The mergeWindows function will then combine all overlapping windows per key resulting in an extended session.
We’ll need to modify the assignWindows function to create a window with a data-driven gap instead of a fixed duration. We can access the element through WindowFn.AssignContext.element(). The original assignment function is:
public Collection<IntervalWindow> assignWindows(AssignContext c) {
// Assign each element into a window from its timestamp until gapDuration in the
// future. Overlapping windows (representing elements within gapDuration of
// each other) will be merged.
return Arrays.asList(new IntervalWindow(c.timestamp(), gapDuration));
}
The modified function will be:
@Override
public Collection<IntervalWindow> assignWindows(AssignContext c) {
// Assign each element into a window from its timestamp until gapDuration in the
// future. Overlapping windows (representing elements within gapDuration of
// each other) will be merged.
Duration dataDrivenGap;
JSONObject message = new JSONObject(c.element().toString());
try {
dataDrivenGap = Duration.standardSeconds(Long.parseLong(message.getString(gapAttribute)));
}
catch(Exception e) {
dataDrivenGap = gapDuration;
}
return Arrays.asList(new IntervalWindow(c.timestamp(), dataDrivenGap));
}
Note that we have added a couple extra things:
A default value for cases where the custom gap is not present in the data
A way to set the attribute from the main pipeline as a method of the custom windows.
The withDefaultGapDuration and withGapAttribute methods are:
/** Creates a {@code DynamicSessions} {@link WindowFn} with the specified gap duration. */
public static DynamicSessions withDefaultGapDuration(Duration gapDuration) {
return new DynamicSessions(gapDuration, "");
}
public DynamicSessions withGapAttribute(String gapAttribute) {
return new DynamicSessions(gapDuration, gapAttribute);
}
We will also add a new field (gapAttribute) and constructor:
public class DynamicSessions extends WindowFn<Object, IntervalWindow> {
/** Duration of the gaps between sessions. */
private final Duration gapDuration;
/** Pub/Sub attribute that modifies session gap. */
private final String gapAttribute;
/** Creates a {@code DynamicSessions} {@link WindowFn} with the specified gap duration. */
private DynamicSessions(Duration gapDuration, String gapAttribute) {
this.gapDuration = gapDuration;
this.gapAttribute = gapAttribute;
}
Then, we can window our messages into the new custom sessions with:
.apply("Window into sessions", Window.<String>into(DynamicSessions
.withDefaultGapDuration(Duration.standardSeconds(10))
.withGapAttribute("gap"))
In order to test this we'll use a simple example with controlled input. For our use case we'll consider different needs for our users depending on the device where the app is running. Desktop users can be idle for long periods (allowing for longer sessions), whereas we only expect short-span sessions for our mobile users. We generate some mock data, where some messages contain the gap attribute and others omit it (the window gap falls back to the default for these):
.apply("Create data", Create.timestamped(
TimestampedValue.of("{\"user\":\"mobile\",\"score\":\"12\",\"gap\":\"5\"}", new Instant()),
TimestampedValue.of("{\"user\":\"desktop\",\"score\":\"4\"}", new Instant()),
TimestampedValue.of("{\"user\":\"mobile\",\"score\":\"-3\",\"gap\":\"5\"}", new Instant().plus(2000)),
TimestampedValue.of("{\"user\":\"mobile\",\"score\":\"2\",\"gap\":\"5\"}", new Instant().plus(9000)),
TimestampedValue.of("{\"user\":\"mobile\",\"score\":\"7\",\"gap\":\"5\"}", new Instant().plus(12000)),
TimestampedValue.of("{\"user\":\"desktop\",\"score\":\"10\"}", new Instant().plus(12000)))
.withCoder(StringUtf8Coder.of()))
For the desktop user there are only two events, separated by 12 seconds. No gap is specified, so it defaults to 10 s, and the two scores are not added up as they belong to different sessions.
The other user, mobile, has 4 events separated by 2, 7 and 3 seconds respectively. None of the separations is greater than the default gap, so with standard sessions all events would belong to a single session with a combined score of 18:
user=desktop, score=4, window=[2019-05-26T13:28:49.122Z..2019-05-26T13:28:59.122Z)
user=mobile, score=18, window=[2019-05-26T13:28:48.582Z..2019-05-26T13:29:12.774Z)
user=desktop, score=10, window=[2019-05-26T13:29:03.367Z..2019-05-26T13:29:13.367Z)
With the new sessions we attach a "gap" attribute of 5 seconds to those events. The third message comes 7 seconds after the second one, so it now falls into a different session. The previous large session with score 18 is split into two 9-point sessions:
user=desktop, score=4, window=[2019-05-26T14:30:22.969Z..2019-05-26T14:30:32.969Z)
user=mobile, score=9, window=[2019-05-26T14:30:22.429Z..2019-05-26T14:30:30.553Z)
user=mobile, score=9, window=[2019-05-26T14:30:33.276Z..2019-05-26T14:30:41.849Z)
user=desktop, score=10, window=[2019-05-26T14:30:37.357Z..2019-05-26T14:30:47.357Z)
Full code here. Tested with Java SDK 2.13.0
Python version:
Analogously, we can extend the same approach to the Python SDK. The code for the Sessions class can be found here. We’ll define a new DynamicSessions class. Inside the assign method we can access the processed record using context.element and modify the gap according to data.
Original:
def assign(self, context):
timestamp = context.timestamp
return [IntervalWindow(timestamp, timestamp + self.gap_size)]
Extended:
def assign(self, context):
timestamp = context.timestamp
try:
gap = Duration.of(context.element[1]["gap"])
except:
gap = self.gap_size
return [IntervalWindow(timestamp, timestamp + gap)]
If the input data contains a gap field it will use it to override the default gap size. In our pipeline code we just need to window events into DynamicSessions instead of the standard Sessions:
'user_session_window' >> beam.WindowInto(DynamicSessions(gap_size=gap_size),
timestamp_combiner=window.TimestampCombiner.OUTPUT_AT_EOW)
With standard sessions the output is as follows:
INFO:root:>> User mobile had 4 events with total score 18 in a 0:00:22 session
INFO:root:>> User desktop had 1 events with total score 4 in a 0:00:10 session
INFO:root:>> User desktop had 1 events with total score 10 in a 0:00:10 session
With our custom windowing mobile events are split into two different sessions:
INFO:root:>> User mobile had 2 events with total score 9 in a 0:00:08 session
INFO:root:>> User mobile had 2 events with total score 9 in a 0:00:07 session
INFO:root:>> User desktop had 1 events with total score 4 in a 0:00:10 session
INFO:root:>> User desktop had 1 events with total score 10 in a 0:00:10 session
All files here. Tested with Python SDK 2.13.0
To expand my question, you could say that I want to program in SmartGWT instead of programming into SmartGWT ( http://msmvps.com/blogs/jon_skeet/archive/2008/04/23/programming-quot-in-quot-a-language-vs-programming-quot-into-quot-a-language.aspx ).
I have a 2-column ListGrid, populated with data from a 5-column database table. I don't use a DataSource (more on that later); instead, I get the data from the async service and populate it on success like this: predmetiGrid.setData(PredmetRecord.convertToContractRecordArray(result)). The user can edit the data and press the Save button to save it. The way I have implemented the save is:
// repeat this for each edited row
for (int i=0; i < predmetiGrid.getAllEditRows().length; i++){
int editedRowIndex = predmetiGrid.getAllEditRows()[i];
// for each edited row, get the full record
PredmetRecord editedRecord = (PredmetRecord)predmetiGrid.getRecord(editedRowIndex);
// create a new DomainObject - Predmet, and set the ID from the
// Row so I have the ID to use for update later
Integer predmetID = editedRecord.getAttributeAsInt("predmetID");
Predmet predmet = new Predmet(predmetID);
// fill Predmet object with either the edited value, or the
// original value (if only one of the fields was changed and not both)
String editedNazivPredmeta = (String)predmetiGrid.getEditValues(editedRecord).get("nazivPredmeta");
boolean isNazivChanged = editedNazivPredmeta != null;
if (!isNazivChanged){
editedNazivPredmeta = editedRecord.getAttribute("nazivPredmeta");
}
predmet.setNazivPredmeta(editedNazivPredmeta);
String editedOpisPredmeta = (String) predmetiGrid.getEditValues(editedRecord).get("opisPredmeta");
boolean isOpisChanged = editedOpisPredmeta != null;
if (!isOpisChanged){
editedOpisPredmeta = editedRecord.getAttribute("opisPredmeta");
}
predmet.setOpisPredmeta(editedOpisPredmeta);
predmetiList.add(predmet);
}
In another method I call the async service:
public void updatePredmeti(List<Predmet> predmeti) throws RpcException, IllegalArgumentException {
for (int i=0; i<predmeti.size();i++){
JdbcPredgledPredmetaDAO.getInstance().updatePredmet(predmeti.get(i));
}
}
Now there are a few problems with this implementation. The most obvious ones are:
a) I'm not using a DataSource connected to the ListGrid. I don't use one because I don't understand how to use it in my case, since the examples are written either for an XML DataSource or for the SmartGWT Pro (or higher) integrated server.
b) The async method needs a rollback mechanism if one of the updates fails, though there could be a smarter implementation of this (e.g. do all updates in one transaction; see the sketch after this list).
c) I'm "hacking" to get and update the data instead of using object methods/properties but this is, currently, the best I got form the JavaDoc; I'd prefer to see best practice way to write this and learn
I'm using SmartGWT LGPL 3.0, Tomcat 7.0, Java 1.6
You can use a custom DataSource and call DataSource.setDataFormat(DSDataFormat.CUSTOM). With this setting the DataSource will not handle the response itself; instead, you have to parse it yourself in transformResponse().