Usage of parameters inside <> in a class [duplicate] - java

The diamond operator in Java 7 allows code like the following:
List<String> list = new LinkedList<>();
However in Java 5/6, I can simply write:
List<String> list = new LinkedList();
My understanding of type erasure is that these are exactly the same. (The generic type information is erased at runtime anyway.)
Why bother with the diamond at all? What new functionality / type safety does it allow? If it doesn't yield any new functionality why do they mention it as a feature? Is my understanding of this concept flawed?

The issue with
List<String> list = new LinkedList();
is that on the left-hand side you are using the generic type List<String>, whereas on the right-hand side you are using the raw type LinkedList. Raw types in Java effectively exist only for compatibility with pre-generics code and should never be used in new code unless you absolutely have to.
Now, if Java had had generics from the beginning and there were no types, such as LinkedList, that predate generics, it probably could have made the constructor of a generic type infer its type parameters from the left-hand side of the assignment whenever possible. But it didn't, and it must treat raw types and generic types differently for backwards compatibility. That left the designers needing a slightly different, but equally convenient, way of declaring a new instance of a generic type without having to repeat its type parameters... the diamond operator.
As for your original example, List<String> list = new LinkedList(), the compiler generates a warning for that assignment because it must. Consider this:
List<String> strings = ... // some list that contains some strings
// Totally legal since you used the raw type and lost all type checking!
List<Integer> integers = new LinkedList(strings);
Generics exist to provide compile-time protection against doing the wrong thing. In the above example, using the raw type means you don't get this protection and will get an error at runtime. This is why you should not use raw types.
// Not legal since the right side is actually generic!
List<Integer> integers = new LinkedList<>(strings);
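To make the contrast concrete, here is a minimal, self-contained sketch (the class and variable names are illustrative): the raw constructor compiles with only a warning and fails later at runtime, while the diamond version is rejected at compile time.

import java.util.Arrays;
import java.util.LinkedList;
import java.util.List;

public class RawVsDiamond {
    public static void main(String[] args) {
        List<String> strings = new LinkedList<>(Arrays.asList("a", "b"));

        // Compiles with only an "unchecked" warning: the raw constructor
        // skips the type check on the copied elements.
        List<Integer> integers = new LinkedList(strings);

        // Would not compile: new LinkedList<>(strings) infers LinkedList<String>,
        // which cannot be assigned to List<Integer>.
        // List<Integer> rejected = new LinkedList<>(strings);

        // The mistake only surfaces here, as a ClassCastException.
        Integer first = integers.get(0);
        System.out.println(first);
    }
}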
The diamond operator, however, allows the right hand side of the assignment to be defined as a true generic instance with the same type parameters as the left side... without having to type those parameters again. It allows you to keep the safety of generics with almost the same effort as using the raw type.
I think the key thing to understand is that raw types (with no <>) cannot be treated the same as generic types. When you declare a raw type, you get none of the benefits and type checking of generics. You also have to keep in mind that generics are a general purpose part of the Java language... they don't just apply to the no-arg constructors of Collections!
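As an illustration of that last point, the diamond works with any generic class you write yourself, not just with collections. Here is a minimal sketch using a hypothetical Pair class (not a JDK type):

// A hypothetical generic class of our own -- not from the JDK.
class Pair<A, B> {
    private final A first;
    private final B second;

    Pair(A first, B second) {
        this.first = first;
        this.second = second;
    }

    A getFirst()  { return first; }
    B getSecond() { return second; }
}

public class DiamondWithOwnTypes {
    public static void main(String[] args) {
        // Java 7+: both type arguments are inferred from the left-hand side.
        Pair<String, Integer> entry = new Pair<>("answer", 42);
        System.out.println(entry.getFirst() + " = " + entry.getSecond());
    }
}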

Your understanding is slightly flawed. The diamond operator is a nice feature because you don't have to repeat yourself. It makes sense to define the type once, in the declaration, but it doesn't make sense to define it again on the right-hand side: the DRY principle.
Now to explain all the fuss about declaring types. You are right that the type is removed at runtime, but once you retrieve something from a List declared with a type, you get it back as the type you declared when creating the list. Otherwise it would lose all its specific features and offer only the features of Object, unless you cast the retrieved object back to its original type yourself, which can be tricky and can result in a ClassCastException.
Using List<String> list = new LinkedList() will get you raw-type warnings.
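To illustrate the casting point, here is a small sketch (the values are illustrative) comparing retrieval from a raw list with retrieval from a properly parameterized one:

import java.util.LinkedList;
import java.util.List;

public class RawVsGenericRetrieval {
    public static void main(String[] args) {
        // Pre-generics style: everything comes back as Object.
        List raw = new LinkedList();
        raw.add("hello");
        String s1 = (String) raw.get(0);   // manual cast, checked only at runtime

        // Generic style: the compiler verifies the element type for you.
        List<String> typed = new LinkedList<>();
        typed.add("hello");
        String s2 = typed.get(0);          // no cast needed in source code

        System.out.println(s1 + " " + s2);
    }
}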

This line causes the [unchecked] warning:
List<String> list = new LinkedList();
So the question becomes: why is the [unchecked] warning not suppressed automatically just for the case where a new collection is created?
I think that would be a much more difficult task than adding the <> feature.
UPD: I also think there would be a mess if it were legal to use raw types 'just for a few things'.

In theory, the diamond operator allows you to write more compact (and readable) code by saving repeated type arguments. In practice, it's just two more confusing characters that give you nothing. Why?
No sane programmer uses raw types in new code. So the compiler could simply assume that by writing no type arguments you want it to infer them.
The diamond operator provides no type information; it just tells the compiler, "it'll be fine". So by omitting it you can do no harm. At any place where the diamond operator is legal it could be "inferred" by the compiler.
IMHO, having a clear and simple way to mark a source file as Java 7 would be more useful than inventing such strange things. In code marked that way, raw types could be forbidden without losing anything.
Btw., I don't think it should be done with a compiler switch. The Java version of a source file is an attribute of the file, not an option at all. Using something as trivial as
package 7 com.example;
could make it clear (you may prefer something more sophisticated, including one or more fancy keywords). It would even allow sources written for different Java versions to be compiled together without any problems. It would allow introducing new keywords (e.g., "module") or dropping obsolete features (such as multiple non-public, non-nested classes in a single file) without losing any compatibility.

When you write List<String> list = new LinkedList();, the compiler produces an "unchecked" warning. You may ignore it, but if you get used to ignoring these warnings, you may also miss a warning that notifies you about a real type-safety problem.
So it's better to write code that doesn't generate extra warnings, and the diamond operator allows you to do that in a convenient way, without unnecessary repetition.

Everything said in the other responses is valid, but IMHO the use cases are not completely convincing. If one looks at Guava, and especially its collections-related code, the same has been done with static methods, e.g. Lists.newArrayList(), which allows you to write
List<String> names = Lists.newArrayList();
or with static import
import static com.google.common.collect.Lists.*;
...
List<String> names = newArrayList();
List<String> names = newArrayList("one", "two", "three");
Guava has other very powerful features like this, and I actually can't think of many uses for the <>.
It would have been more useful if they had made the diamond-operator behaviour the default, that is, if the type were inferred from the left side of the expression, or if the type of the left side were inferred from the right side. The latter is what happens in Scala.
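For reference, the Guava-style factory is easy to sketch in plain Java; the methods below are a rough imitation written for this answer, not Guava's actual implementation:

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class ListFactories {
    // The type argument E is inferred from the assignment target,
    // which is the trick Guava's Lists.newArrayList relies on.
    public static <E> ArrayList<E> newArrayList() {
        return new ArrayList<E>();
    }

    @SafeVarargs
    public static <E> ArrayList<E> newArrayList(E... elements) {
        return new ArrayList<E>(Arrays.asList(elements));
    }

    public static void main(String[] args) {
        List<String> empty = newArrayList();                      // E inferred as String
        List<String> names = newArrayList("one", "two", "three"); // E inferred from the arguments
        System.out.println(empty.size() + " " + names);
    }
}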

The point of the diamond operator is simply to reduce typing when declaring generic types. It has no effect on the runtime whatsoever.
The only difference, if in Java 5 or 6 you write
List<String> list = new ArrayList();
is that you have to add @SuppressWarnings("unchecked") for the list (otherwise you will get an unchecked conversion warning). My understanding is that the diamond operator simply makes development easier. It has nothing to do with the runtime execution of generics at all.
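To spell out that difference, here is a small sketch of both styles (assuming the warning is the only concern; a rawtypes lint warning may still appear for the legacy version):

import java.util.ArrayList;
import java.util.List;

public class SuppressVsDiamond {
    // Java 5/6 style: keep the raw constructor and silence the warning locally.
    @SuppressWarnings("unchecked")
    static List<String> legacyStyle() {
        List<String> list = new ArrayList(); // unchecked warning suppressed, not fixed
        return list;
    }

    // Java 7 style: the diamond keeps the instance properly parameterized,
    // so there is nothing to suppress.
    static List<String> diamondStyle() {
        List<String> list = new ArrayList<>(); // type argument inferred from the declaration
        return list;
    }

    public static void main(String[] args) {
        System.out.println(legacyStyle().size() + " " + diamondStyle().size());
    }
}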

Related

Do I need to use <> empty angle brackets for generics? [duplicate]

What is the difference between these java snippets? [duplicate]

Array of Lists, and workarounds

I want to create an array of ArrayLists, similar to that in this thread: How to do an array of hashmaps?. However, Java gives the error
"Cannot create a generic array of ArrayList<String>"
when I try to do the following:
ArrayList<String>[] arrayOfLists = new ArrayList<String>[size];
I have sort of understood the problems and the workarounds provided.
I have my own approach which unfortunately does not fix the problem either.
I tried creating a list of ArrayLists and then used toArray().
ArrayList<ArrayList<String>> listOfLists = new ArrayList<ArrayList<String>>();
ArrayList<String>[] arrayOfLists = (ArrayList<String>[])listOfLists.toArray();
It worked fine, but I got the warning:
Type safety: Unchecked cast from Object[] to ArrayList<String>[]
When I tried to check for type safety, using
if(listOfLists.toArray() instanceof ArrayList<String>[])
I get the error:
Cannot perform instanceof check against parameterized type ArrayList<String>[]. Use the form ArrayList<?>[] instead since further generic type information will be erased at runtime
Why can't I use this method? Why does toArray() return Object[] instead of ArrayList<String>[], since the instance was initialised with the ArrayList<String> type?
Any other workarounds/suggestions on how I can get this done? A 2D array will not work since different lists can vary greatly in size.
The currently accepted answer has a major error in describing Java's generics, so I felt I should answer to make sure there aren't any misconceptions.
Generics in Java are an entirely compile-time feature and for the most part don't exist at runtime due to erasure (you can get the runtime to cough up generic type information in some cases, but that's far from the general case). This provides the basis for the answers to your questions.
Why can't I use this method?
Because generics are erased, an ArrayList<String>[] (as well as all other parameterized ArrayList<>[] instances) at runtime is really an ArrayList[]. Thus, it is impossible for the runtime to check if something is instanceof ArrayList<String>[], as the runtime doesn't actually know that String is your type parameter -- it just sees ArrayList[].
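For example (a short sketch), the wildcard form the compiler suggests is accepted, because it only checks the erased shape of the object:

import java.util.ArrayList;

public class InstanceofErasure {
    public static void main(String[] args) {
        Object obj = new ArrayList[3];

        // Does not compile: the runtime cannot see the String type argument.
        // if (obj instanceof ArrayList<String>[]) { ... }

        // Compiles: only the erased "array of ArrayList" shape is checked.
        if (obj instanceof ArrayList<?>[]) {
            System.out.println("It is an array of ArrayLists (of something).");
        }
    }
}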
Why does toArray() return Object[] instead of ArrayList<String>[], since the instance was initialised with the ArrayList<String> type?
Again, erasure. The type parameter is erased to Object, so at runtime what you effectively have is an ArrayList<Object>. Because of this erasure, the runtime doesn't have the information necessary to return an array of the proper type; it only knows that the ArrayList holds Objects, so it returns an Object[]. This is why the toArray(T[]) overload exists -- arrays retain their type information, so an array could be used to provide the requisite type information to return an array of the right type.
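A quick sketch of that overload with a plain list of strings:

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class ToArrayDemo {
    public static void main(String[] args) {
        List<String> words = new ArrayList<>(Arrays.asList("alpha", "beta"));

        Object[] untyped = words.toArray();              // erasure: the list only knows "Object"
        String[] typed   = words.toArray(new String[0]); // the array argument supplies the type

        System.out.println(untyped.getClass().getSimpleName()); // prints Object[]
        System.out.println(typed.getClass().getSimpleName());   // prints String[]
    }
}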
Any other workarounds/suggestions on how I can get this done?
As you can see, mixing generic stuff and arrays doesn't work too well, so ideally you wouldn't mix Lists and arrays together. Therefore, if possible, you should use List<List<String>> or something of the sort instead of List<String>[]. If you want to keep an ArrayList<String>[], though, you could do this:
@SuppressWarnings("unchecked")
ArrayList<String>[] array = new ArrayList[size];
You'll still get the unchecked type warning, but you can be reasonably sure that you won't encounter heap pollution as the only reference to the object is through array. You can also use this as the parameter to toArray():
@SuppressWarnings("unchecked")
ArrayList<String>[] temp = new ArrayList[0];
ArrayList<String>[] arrayOfLists = listOfLists.toArray(temp);
or
@SuppressWarnings("unchecked")
ArrayList<String>[] arrayOfLists = listOfLists.toArray((ArrayList<String>[]) new ArrayList[0]);
For more reading on why you can't parameterize an array, see this SO question. In short, such a thing isn't safe because arrays are covariant, while generics are invariant.
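A brief sketch of that covariance/invariance difference:

import java.util.ArrayList;
import java.util.List;

public class CovarianceDemo {
    public static void main(String[] args) {
        // Arrays are covariant: this assignment is allowed at compile time...
        Object[] objects = new String[2];
        try {
            objects[0] = Integer.valueOf(42); // ...so the check happens at runtime
        } catch (ArrayStoreException e) {
            System.out.println("Caught: " + e);
        }

        // Generics are invariant: the equivalent assignment does not compile at all.
        // List<Object> objectList = new ArrayList<String>(); // compile-time error
        List<String> stringList = new ArrayList<>();          // fine
        stringList.add("ok");
    }
}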
The problem is that generics are created during runtime, but type conversions and array sizes must be checkable at compile time. The compiler cannot tell what class ArrayList<String> will be during compile time (as it will be generated later); it can only tell that it will be at least an Object, because every class in Java is at least an Object. You can do the type conversion and suppress the warning, and it might even work, but you run the risk of accidentally confusing types somewhere and messing up your code.
Java is a type-safe language by choice to prevent you from doing one of the most recurring mistakes programmers do in their daily work: confusing variable types. So while it is possible to do the type conversion, you - as an upcoming good Java programmer - should not do that. Use the ArrayList<ArrayList<String>> if you need such a construct, and use arrays only when they are necessary.
The main reason to use arrays is speed of execution, as obviously using an object will keep the runtime busy with some overhead. The main reason to not use arrays is the fact that this overhead will allow you more flexibility in coding and reduce the amount of errors you make. So as a general advice: unless you know (as in measured and determined to be a bottleneck) that you need the speed, go with Lists. Java even does some internal optimizations beyond what you would expect to speed up Lists to a point where they come very close to the execution speed of arrays.

How does new LinkedList<>() differ from new LinkedList()

I just stumbled upon the compiler treating these two terms differently. When I type:
LinkedList<String> list = new LinkedList();
I get a compiler warning about a raw type. however:
LinkedList<String> list = new LinkedList<>();
removes the warning. It seems to me as though the two statements mean essentially the same thing (i.e. create a new LinkedList with no specified object type). Why then does the compiler allow the empty generics? What is the difference here?
The statements do not mean the same thing at all.
The first statement tries to fit an untyped LinkedList into a declared generic LinkedList<String> and appropriately issues a warning.
The second statement, valid from Java 1.7 onward, uses type inference to determine the type argument from the type parameter of the declared type. In addition, this can sometimes be used in method calls. It doesn't always work, however (one well-known limitation is sketched below).
See this page for more info.
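One concrete limitation worth knowing (a sketch; LengthComparator is a made-up helper class): the diamond infers fine from a variable declaration, but it could not be combined with anonymous classes until Java 9.

import java.util.Comparator;

public class DiamondLimits {
    public static void main(String[] args) {
        // Inference from the variable declaration: fine from Java 7 on.
        Comparator<String> byLength = new LengthComparator<>();

        // Anonymous class plus diamond was only accepted from Java 9 on;
        // in Java 7/8 you must spell out new Comparator<String>() { ... }.
        // Comparator<String> anon = new Comparator<>() {
        //     public int compare(String a, String b) { return a.length() - b.length(); }
        // };

        System.out.println(byLength.compare("ab", "abc"));
    }
}

// A made-up comparator used in the example above.
class LengthComparator<T> implements Comparator<T> {
    public int compare(T a, T b) {
        return String.valueOf(a).length() - String.valueOf(b).length();
    }
}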
It's the diamond operator in Java 7, that helps you save writing the type again. In Java 7 this is equivalent to the same generic type argument that is used on the left side of the declaration. So the initialization is type safe and no warning is issued.
With LinkedList<>, you use the new diamond operator from Java 7.
The diamond operator uses the generic type given on the left side of the line.
In Java 6, this doesn't work!
The diamond operator, however, allows the right hand side of the assignment to be defined as a true generic instance with the same type parameters as the left side... without having to type those parameters again. It allows you to keep the safety of generics with almost the same effort as using the raw type.
I think the key thing to understand is that raw types (with no <>) cannot be treated the same as generic types. When you declare a raw type, you get none of the benefits and type checking of generics. You also have to keep in mind that generics are a general purpose part of the Java language... they don't just apply to the no-arg constructors of Collections!
Extracted from: https://stackoverflow.com/a/10093701/1281306
Backward compatibility (interoperating with legacy code) is the reason why Java allows the signature above. Generics are compile-time syntax only; at runtime all the generic syntax is removed, as you will see if you decompile any class file. Read this documentation.
LinkedList list = new LinkedList();
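Roughly speaking, the generic source erases to something like the raw form shown above; here is a simplified sketch (actual decompiler output varies):

import java.util.LinkedList;
import java.util.List;

public class ErasureSketch {
    public static void main(String[] args) {
        // What you write:
        List<String> names = new LinkedList<>();
        names.add("Ada");
        String first = names.get(0);

        // Roughly what the bytecode amounts to after erasure:
        // List names = new LinkedList();
        // names.add("Ada");
        // String first = (String) names.get(0);  // compiler-inserted cast

        System.out.println(first);
    }
}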

What are the implications of casting a generic List to a non-generic List?

I'm refactoring a home-grown DAO container, hoping to make the class generic. It internally uses an ArrayList to store the retrieved objects.
One usage of this class puts the container's list into a request scope, and due to a limitation of Websphere, I can't pass the generic List<Foo> to the request scope (Websphere doesn't handle generics out-of-the-box)
If I go ahead with my refactorings, I will need to convert/cast the List<Foo> into a non-generic List object..
// Boils down to this...
List<Foo> listFoo = new FooListing().findAllFoo();
List listThings = listFoo;
request.setAttribute("listThings", listThings);
What are the implications of reversing a generification like this? Should I avoid doing this kind of manipulation?
EDIT: The code snippet is verbose to explicitly demonstrate what I'm describing..
If the component type of the List does match the expected type, there is no problem.
Generics in Java are only used for type checks by the compiler; they have no effect at runtime. If you are using an older library that does not support generics, you have no choice but to ignore the generic type.
Things should continue to work, as this system has been designed with backwards compatibility in mind.
So all you are losing is the compile-time type checking (it puts you back where Java was at 1.4: if the types match, everything will work; if not, you'll get ClassCastExceptions or other unwanted behaviour at runtime).
However, I think you can just write
request.setAttribute("listThings", listFoo);
This method takes any kind of Object. Even if it wanted a List, you could still pass a List<Foo> (which is still a List).
Java uses "type erasure" for generics -- essentially that means that the compiler checks the generics, but the runtime forgets all about it and just treats it as a list of objects.*
Whenever you treat a List<Foo> as just a List, you won't get compiler checks to make sure you don't put a Bla into your list. So you could get a ClassCastException if you call List<Foo>.get() and it turns out to be a Bla hiding in the list. But that can only happen if some code puts a Bla in your list.
If you want to be cautious, then whenever you pass the List<Foo> as a List to anything that might add a non-Foo to it, don't treat it as a List<Foo> when you access it afterwards; treat it as a list of Objects and add instanceof checks, as sketched below.
*Some of the information is accessible at runtime, but let's not complicate matters.
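A sketch of that defensive style (Foo is the placeholder type from the question; the class names are illustrative):

import java.util.ArrayList;
import java.util.List;

public class DefensiveRead {
    // Treat the possibly-polluted list as a list of unknowns and check each element.
    static void printFoos(List<?> maybePolluted) {
        for (Object element : maybePolluted) {
            if (element instanceof Foo) {
                System.out.println("Got a Foo: " + element);
            } else {
                System.out.println("Skipping unexpected element: " + element);
            }
        }
    }

    @SuppressWarnings("unchecked")
    public static void main(String[] args) {
        List<Foo> foos = new ArrayList<>();
        List raw = foos;          // legacy code sees only the raw type...
        raw.add("oops");          // ...and can pollute the list unnoticed
        printFoos(foos);          // the defensive reader copes with it
    }
}

// Placeholder element type standing in for the question's Foo.
class Foo { }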
A "non-generic" version of a generic type is called a "raw type".
Passing a generic type where the raw equivalent is requested is generally ok. This is actually the main reason generics in Java work the way they do (with erasure): to enable interoperability between "generified" code and pre-generics code.
The main thing you need to be careful about is that if you pass a List<Foo> to something that asks for a List, it may put non-Foo objects into the List. You won't get any compile-time checking to help you here. You do get some runtime checks: a ClassCastException will be thrown when you use a method that returns a Foo on your List<Foo> and it has to return a non-Foo.
If you want more fail-fast behavior you can wrap your List<Foo> with Collections.checkedList() to get a List that'll check the type of elements on insertion.
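A sketch of that wrapper (Foo again stands in for the real element type):

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class CheckedListDemo {
    @SuppressWarnings("unchecked")
    public static void main(String[] args) {
        List<Foo> foos = Collections.checkedList(new ArrayList<Foo>(), Foo.class);

        List raw = foos;          // simulate legacy code that only sees the raw type
        raw.add(new Foo());       // fine
        try {
            raw.add("not a Foo"); // rejected at insertion time, not at some later get()
        } catch (ClassCastException e) {
            System.out.println("Caught at insertion: " + e.getMessage());
        }
    }
}

// Placeholder element type standing in for the question's Foo.
class Foo { }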
Things get more complicated if Foo itself is a generic type. Runtime checks are only done on reified types (ie: the type with generic type parameters removed) so if you give them a List<Set<Bar>> and they insert a Set<Baz> or just a Set, you won't know since the runtime/reified type of the element is Set either way.
First, you can't cast a generic to a non-generic list so yeah you'd have to convert it.
Second, the two main advantages of a generic list are 1) it ensures that all objects are of the specified type and 2) it allows you to directly access methods of the collection's objects without needing to recast them. This lets you write cleaner code and saves some processing cycles from having to cast back and forth.
Neither one of these advantages is a dire need however. If you can't use them you won't notice a difference in performance. Your code may look a little messier though.
I have similar problems with Weblogic Portal. Just use a non-generic type in this case.
