use Lombok @Builder @Default @Singular to initialise List<> - java

Disclaimer: I am kind of new to Java :)
I am running a bunch of selections on some data, and in order to keep track of what happens at each stage of the selection I use int counters. These counters are all in a data object:
public class MyCounters {
private int counter0;
private int counter1;
...
}
I also have to count how many candidates end up in each of a number of categories, which I account for with an enum. To do this I created a List<Integer> where the index of the list covers the values of the enum.
private List<Integer> myList;
And later in the code I need a dedicated method to initialise the list with zeros:
for (MyEnum i : MyEnum.values()) {
myList.add(0);
}
In the main code then, once the final category has been assigned, this happens:
myCounters.getMyList().set(myEnum.ordinal(), myCounters.getMyList().get(myEnum.ordinal()) + 1);
It was suggested to me that the declaration/initialisation steps can be improved using Lombok's @Builder.Default functionality (or maybe @Singular), but I can't really find out how: in the end I need to initialise a List<Integer> to as many zeros as there are values in the enum.
Is it really possible to do this using Lombok's extensions? Or are they targeted for something different?

Lombok's @Builder + @Singular on their own will initialize your List with an empty ArrayList, and that's it (they won't initialize this List with any elements, like zeroes). @Builder.Default could do that (you don't need @Singular then), but I would not follow that path if possible.
I don't fully understand what you want to do, e.g. I don't know if you have only one enum (MyEnum), or if there's more than one enum.
If you have only MyEnum, you'd be much better off using a different data structure than List:
An EnumMap is the easy choice, because it's native to Java:
initialization: EnumMap<MyEnum, Integer> myMap = new EnumMap<>(MyEnum.class)
incrementing: myMap.merge(myEnum, 1, Integer::sum)
final result: myMap.getOrDefault(myEnum, 0)
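Putting those three lines together, a minimal sketch (assuming your enum is called MyEnum and myEnum holds the category assigned to the current candidate):
import java.util.EnumMap;

EnumMap<MyEnum, Integer> myMap = new EnumMap<>(MyEnum.class); // initialization, no pre-filled zeros needed
myMap.merge(myEnum, 1, Integer::sum);                         // incrementing the category's counter
int count = myMap.getOrDefault(myEnum, 0);                    // final result, 0 if the category never occurred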
The best data structure for this, though, would be a multiset. One external library that supports multisets is Guava with its Multiset:
initialization: Multiset<MyEnum> myMultiset= HashMultiset.create()
incrementing: myMultiset.add(myEnum)
final result: myMultiset.count(myEnum)
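And the equivalent counting logic with Guava (a sketch only, assuming Guava is on the classpath):
import com.google.common.collect.HashMultiset;
import com.google.common.collect.Multiset;

Multiset<MyEnum> myMultiset = HashMultiset.create(); // initialization
myMultiset.add(myEnum);                              // incrementing
int count = myMultiset.count(myEnum);                // final result, 0 if never added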

Sticking to your architecture, I guess you have to initialize the List within a class that uses Lombok. That can be achieved as follows:
@Builder
public class Foo {
@Builder.Default
private List<Integer> myList = Arrays.asList(0, 0, 0);
}
Arrays.asList is how you can initialize a List with default values using the standard Java libraries. I know it can be a bit confusing to use a class called Arrays instead of List or Collection, but you can find more information in its Javadoc (here the doc for Java 8). The result of that initialization is a list with three Integers set to 0; you just need to supply as many zeros as you need.
The reason to use the annotation @Builder.Default on the myList field is to make the generated builder aware of the default initialization, which would otherwise be skipped by Lombok's builder.
For brevity, I only included the very specific code for initializing your List and the builder. Note that you'd probably also want to use the Lombok annotations @Data and @AllArgsConstructor in combination with it.
You can find more information in the official Lombok documentation.
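If the list really must contain one zero per enum constant, a variant of the above (a sketch only, assuming your enum is the MyEnum from the question) can size the default from the enum itself:
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import lombok.Builder;

@Builder
public class Foo {
    // Collections.nCopies yields an immutable list of zeros, so it is wrapped
    // in an ArrayList to keep set(index, value) working later on
    @Builder.Default
    private List<Integer> myList =
            new ArrayList<>(Collections.nCopies(MyEnum.values().length, 0));
}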

Honestly, I suggest a different architecture:
Consider not using Enum.ordinal(). It will work fine if you just care about one point in time, but if you persist your data somehow, things break apart as soon as you want to compare different persisted data (and the enum has changed in the meantime).
Maybe consider LongAdder.
Meaning: use a Map<YourEnumType, LongAdder> to count things. Retrieve the counter, call its add() method, done.
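A sketch of that idea, reusing MyEnum from the question (the EnumMap choice and the counters name are mine):
import java.util.EnumMap;
import java.util.Map;
import java.util.concurrent.atomic.LongAdder;

Map<MyEnum, LongAdder> counters = new EnumMap<>(MyEnum.class);

// incrementing: no ordinal(), no get-then-set dance
counters.computeIfAbsent(myEnum, k -> new LongAdder()).increment();

// reading the result (0 if the category never occurred)
long count = counters.getOrDefault(myEnum, new LongAdder()).sum();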

Related

Getting element properties from first list item by getting element once vs couple of times performance

Can't find any article regarding this. Does it have any impact if we use:
Fruit fruit = fruits.get(0);
FruitDto dto = new FruitDto();
dto.setPrice(fruit.getPrice());
dto.setType(fruit.getType());
... e.g. to set 10 properties using fruit.getSomeProperty
compared to:
FruitDto dto = new FruitDto();
dto.setPrice(fruits.get(0).getPrice());
dto.setType(fruits.get(0).getType());
... e.g. to set 10 properties using fruits.get(0).getSomeProperty
and is there any advantage if we use fruits.stream().findFirst() instead of get(0)?
Yes, there is some impact, but it might be negligible depending on the collection type and whether you're actually pulling item 0 or some higher index (different collection types have different scaling of get, e.g. get(0) vs. get(1000000) on a LinkedList would have very different runtimes).
I would also say the first version is preferable from a cleanliness/maintenance POV. Consider sequential calls to fruits.get(0).getProperty() for some set of properties, and then we need to change from the 0th item to the 1st (i.e. change fruits.get(0) to fruits.get(1) in many spots). That change now runs a higher risk of some typo leading to a bug (e.g. someone misses changing a 0 to a 1).
Generally it's rather about cleaner code style than about performance, and the first approach may easily be extracted into a mapper method that copies the properties of an entity into a DTO (it does not actually matter that the entity belongs to some collection at some index):
FruitDto dto = toDto(fruits.get(0));
static FruitDto toDto(Fruit fruit) {
FruitDto dto = new FruitDto();
dto.setPrice(fruit.getPrice());
dto.setType(fruit.getType());
// ...
return dto;
}
Regarding the use of Stream::findFirst instead of checking the size/emptiness of the list: it's also rather about code style and handling an empty input collection:
"imperative"
if (fruits.isEmpty()) throw new NoFruitException(); // or return "empty" dto
FruitDto dto = toDto(fruits.get(0));
"functional" using Optional
FruitDto dto = fruits.stream().findFirst() // Optional<Fruit>
.map(Mapper::toDto) // Optional<FruitDto>
.orElseThrow(NoFruitException::new); // or .orElse(EMPTY_DTO)

How to convert ArrayList to ExampleSet in Rapidminer?

I'm creating an extension for RapidMiner using Java. I have an array of elements of type Example and I need to convert it to a dataset of type ExampleSet.
RapidMiner's ExampleSet definition looks like this:
public interface ExampleSet extends ResultObject, Cloneable, Iterable<Example>
I need to pick certain elements from the dataset and send it back, still as an ExampleSet; however, casting is not working, and I can't simply create a new ExampleSet object since it's an interface.
private ExampleSet generateSet(ExampleSet dataset){
List<Example> list = new ArrayList<Example>();
// pick elements from sent dataset and add them to newly created list above
return (ExampleSet)list;
}
You will need more than a simple explicit cast.
In RapidMiner, an ExampleSet is not just a collection of Example. It contains more complex information and logic.
Therefore, you need another approach to work with ExampleSets. As you already said, it is just the interface, which leads us to the choice of the right subtype.
For starters (since RapidMiner 7.3), simply use one of the ExampleSets class's factory methods.
You also need to define each Attribute this ExampleSet is going to have, namely the columns.
Below, I create one with a single Attribute called First
Attribute attributeFirst = AttributeFactory.createAttribute("First", Ontology.POLYNOMINAL);
ExampleSetBuilder builder = ExampleSets.from(attributeFirst);
builder.addDataRow(example.getDataRow());
ExampleSet result = builder.build();
You can also get the Attributes in a more generic way using:
Attribute[] attributes = example.getAttributes().createRegularAttributeArray();
ExampleSetBuilder builder = ExampleSets.from(attributes);
...
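Putting those pieces together, a sketch of your generateSet method could look like the following; shouldKeep is a hypothetical selection predicate, and only the RapidMiner calls already shown above are used:
private ExampleSet generateSet(ExampleSet dataset) {
    // reuse the columns (Attributes) of the incoming set
    Attribute[] attributes = dataset.getAttributes().createRegularAttributeArray();
    ExampleSetBuilder builder = ExampleSets.from(attributes);
    // ExampleSet is Iterable<Example>, so the picked rows can simply be copied over
    for (Example example : dataset) {
        if (shouldKeep(example)) { // hypothetical predicate: pick your elements here
            builder.addDataRow(example.getDataRow());
        }
    }
    return builder.build();
}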
If you have many cases where you have to create or alter an ExampleSet, I encourage you to write your own ExampleSetBuilder, since the original implementation has many drawbacks.
You can also try searching for other extensions which may already meet your requirements, so that you do not need to create one of your own (believe me, it's not headache-free).
The ExampleSet class is getting deprecated (but it is still perfectly fine to use).
You might want to consider switching over to the newer data set API called Belt (https://github.com/rapidminer/belt). It's faster and more intuitive to use. It's still actively developed, so feedback is also welcome.
Also, if you have more specific questions, feel free to drop by the RapidMiner community (https://community.rapidminer.com/), where many of the developers are also very active.

Populating large set of parameters with Setter methods

I am using POJOs to create and fetch data. These POJOs represent our APIs and we use them for testing through REST Assured.
I have a RequestDTO class with 30 variables. Since this is a DTO, I am using 30 setter methods in my class to update their values.
I am calling these setter methods as below with method chaining. I use a varList variable to read data from a CSV and supply it to this DTO.
However, this looks clumsy, hard to read, and incorrect. I want to know what a good approach/design pattern is here, as I have fairly little knowledge of best practices and design patterns.
Sample code:
public static void setRequestDTO(List<Object> varList) {
MyRequestDTO myrequest = new MyRequestDTO()
.setkey1(varList.get(0).toString())
.setkey2(varList.get(1).toString())
// ........
.setkey30(varList.get(29).toString());
}
Firstly, I believe your DTO is too bloated - is there really no other way that you can perhaps break this down into smaller classes?
Secondly, you're using a List<Object> but all of the examples show that you're using String values - is there a chance that you could change the type parameter of the List to eliminate the need for all of the .toString calls?
Thirdly, you are depending heavily on your List containing all of the necessary elements that you're looking to set on your DTO and that they are all in the correct order. This will lead to exceptions being thrown if you have too few elements.
Lastly, while I would consider refactoring this, I'll leave you with one idea that you could proceed with. If you are determined to keep your current DTO structure, then consider putting your List<Object> into a constructor for MyRequestDTO and then performing all of your setters in there. That way you don't have 30 lines of setters whenever you're instantiating a new instance of this DTO and you're only setting these values on instantiation.
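A minimal sketch of that last idea (field names are the asker's, and it assumes the second point above, i.e. the caller passes a List<String>):
import java.util.List;

public class MyRequestDTO {
    private String key1;
    private String key2;
    // ... fields key3 to key30 omitted

    public MyRequestDTO(List<String> values) {
        this.key1 = values.get(0);
        this.key2 = values.get(1);
        // ... and so on, up to key30 = values.get(29)
    }
}
The call site then shrinks to a single line: MyRequestDTO myRequest = new MyRequestDTO(varList);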

stream on JPA lazy list

I have a JPA entity with a list like this:
@OneToMany(mappedBy = "scadaElement", orphanRemoval = true)
private List<ElementParameter> elementParameters;
and the mapping from ElementParameter:
@ManyToOne
@JoinColumn(name = "SCADAELEMENT_ID")
ScadaElement scadaElement;
When I get the entity with the elementParameters list and call stream() on it, the stream does nothing, even when I trigger the list with .size(); but when I do the same with a for loop, it works.
System.out.println("elements size: " + s.getElementParameters().size());
s.getElementParameters()
.stream()
.forEach(
a -> {
System.out.println("elementId: " + a.getId());
}
);
Is there any solution to make that stream work? I use EclipseLink as the JPA provider.
Apparently, you are referring to this issue. These lazy lists, using the anti-pattern of inheriting from actual implementations (here Vector), fail to adapt to the evolution of the base class. Note that there are two possible outcomes, depending on how the anti-pattern was realized:
If the lazily populated list populates itself (in terms of the inherited state) on the first use, the new inherited methods will start working as soon as a trigger property has been accessed for the first time
But if the list overrides all accessor methods to enforce delegation to another implementation, without ever updating the state of the base class, the base class’ methods which have not been overridden will never start working, even if the list has been populated (from the subclass’ point of view)
Apparently, the second case applies to you. Triggering the population of the list does not make the inherited forEach method work. Note that turning off the lazy population via configuration might be the simpler solution here.
To me, the cleanest solution would be if IndirectList inherited from AbstractList and adhered to the Collection API standard, now, almost twenty years after the Collection API superseded Vector (should I mention how much younger JPA actually is?). Unfortunately, the developers didn't go that road. Instead, the anti-pattern was maxed out by creating another class that inherits from the class which already inherits from the class not designed for inheritance. This class overrides the methods introduced in Java 8 and will perhaps get another subclass in one of the next Java releases.
So the good news is, developers expecting every List to be a Vector do not have to change their minds; the bad news is that it doesn't work yet: you will not get the extended Java 8 aware version with EclipseLink 2.6, but apparently EclipseLink 2.7 will work.
So you can derive a few alternative solutions:
Turn off lazy population
Stay with Java 7
Wait for EclipseLink 2.7
Just copy the collection, e.g.
List<ElementParameter> workList=new ArrayList<>(elementParameters);
This workList will support all Collection & Stream operations
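In the asker's code, that workaround would look roughly like this:
// copy the lazily populated list into a plain ArrayList first, then stream over the copy
List<ElementParameter> workList = new ArrayList<>(s.getElementParameters());
workList.stream()
        .forEach(a -> System.out.println("elementId: " + a.getId()));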
Why not use real JPA streaming?
Stream<User> findAllByName(String name);

Iterate list inside list inside list java

I've got a question: is there a simple solution to iterate over a list that is inside of a list that is itself inside of another list?
My point is that I have a few lists nested inside each other (based on XML unmarshalling) and I sometimes do not know how deep the structure is.
Example:
class Car {
private List<Door> doors;
}
class Door {
private List<Parts> parts;
}
class Parts {
private List<Some1> some1s;
}
class Some1 {
private List<Some2> some2s;
}
So how can I iterate from Car down to Some2, without knowing whether each list is present or empty, in a "good way"? I mean without 5 nested for loops mixed up with another 6 ifs.
DeeV makes a good suggestion about each class iterating over its respective list.
As a client using the Car class, you may want to get all the Some2s that it contains. If you do this:
car.getDoors().getParts().get...
you expose the internals of the Car class. A much cleaner solution would be to have the following method in the Car class:
public List<Some2> getSome2s()
This way, if the internals of Car change (perhaps using a different Collection type) your client code will not break as long as a list of Some2s is still returned.
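A sketch of such a method inside Car, assuming the field names added above (doors, etc.), getters like getParts(), getSome1s() and getSome2s() on the nested classes, and that the nested lists are initialized (empty rather than null); flatMap flattens the nesting:
import java.util.List;
import java.util.stream.Collectors;

public List<Some2> getSome2s() {
    return doors.stream()
            .flatMap(door -> door.getParts().stream())
            .flatMap(parts -> parts.getSome1s().stream())
            .flatMap(some1 -> some1.getSome2s().stream())
            .collect(Collectors.toList());
}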
There is no easier way than nested for loops. If you want more speed try using different structures like Maps.
First implement the composite pattern with your classes; after that you'll be able to recurse down to the leaves and aggregate them in a list.
