Here is an example of how to start Hazelcast without networking. For my purposes I need to run Hazelcast embedded, only as a simple cache. But the original question was about testing - can I use that code in production when I do not need a separate Hazelcast server?
Hazelcast is designed to be a distributed system; it wasn't designed to be an in-process cache. Because of its distributed nature, many design decisions make it a poor candidate for your use case. You will see overhead from serialization and networking (even in local single-node embedded mode).
We're planning to improve this situation by providing optimizations for the local cache use case, but there is no ETA at this point. You will see some features related to this use case in the next couple of releases.
I would suggest taking a look at Caffeine. It has JCache and Spring Boot integration. I would suggest sticking to the JCache integration because it will make your code portable. If in the future you decide to go distributed, you just need to replace the Caffeine jars with Hazelcast.
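To give an idea of what the portable route looks like, here is a minimal JCache (JSR-107) sketch; the cache name, key/value types and settings are just an illustration, and it assumes a JCache provider such as Caffeine's caffeine-jcache module is on the classpath:

    import javax.cache.Cache;
    import javax.cache.CacheManager;
    import javax.cache.Caching;
    import javax.cache.configuration.MutableConfiguration;

    public class JCacheExample {
        public static void main(String[] args) {
            // Resolves whichever JSR-107 provider is on the classpath
            // (Caffeine's caffeine-jcache here, Hazelcast later if you go distributed).
            CacheManager cacheManager = Caching.getCachingProvider().getCacheManager();

            MutableConfiguration<Long, String> config = new MutableConfiguration<Long, String>()
                    .setTypes(Long.class, String.class)
                    .setStoreByValue(false);   // in-process cache, no serialization needed

            Cache<Long, String> users = cacheManager.createCache("users", config);
            users.put(1L, "alice");
            System.out.println(users.get(1L));
        }
    }

Since only javax.cache types appear in the code, swapping the provider later is a dependency change rather than a code change.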
Feel free to ask if you have any questions.
Thank you
I recently had the pleasure of being allowed to bump the version of the Hibernate dependency (amongst others) in a medium-sized legacy code base (from 3.x to 5.2). Parts of the code are over 10 years old but still in daily use.
So even after increasing the version and porting as many API calls as possible away from now deprecated or even missing areas to their bleeding-edge counterparts (finding out how to do a SchemaExport was a particularly fun experience), I still don't see this as a complete migration.
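For reference, this is roughly the 5.2-style SchemaExport bootstrap I ended up with; I'm reproducing it from memory, so treat the exact method names as approximate:

    import java.util.EnumSet;

    import org.hibernate.boot.Metadata;
    import org.hibernate.boot.MetadataSources;
    import org.hibernate.boot.registry.StandardServiceRegistry;
    import org.hibernate.boot.registry.StandardServiceRegistryBuilder;
    import org.hibernate.tool.hbm2ddl.SchemaExport;
    import org.hibernate.tool.schema.TargetType;

    public class ExportSchema {
        public static void main(String[] args) {
            StandardServiceRegistry registry = new StandardServiceRegistryBuilder()
                    .configure("hibernate.cfg.xml")   // picks up mappings and connection settings
                    .build();
            try {
                Metadata metadata = new MetadataSources(registry).buildMetadata();

                SchemaExport export = new SchemaExport();
                export.setDelimiter(";");
                export.setOutputFile("schema.sql");
                // Write the DDL to the script only; TargetType.DATABASE would apply it directly.
                export.create(EnumSet.of(TargetType.SCRIPT), metadata);
            } finally {
                StandardServiceRegistryBuilder.destroy(registry);
            }
        }
    }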
I'm wondering what the intended upgrade path is for legacy users, as enterprise systems are often around for 10 to 15+ years, and at times you still need to jump to a newer dependency version to get necessary bug fixes or features.
The following points are still somewhat open:
There is no clear or automatic way to migrate .hbm.xml mapping information to JPA annotations. I know a manual migration will be very error prone, and not all concepts have clear or obvious counterparts.
We now get a lot of deprecation warnings (org.hibernate.orm.deprecation) about our usage of the old Criteria API, but there is also no clear upgrade path. One cannot just rewrite the whole DB access code of an application against a completely different and more verbose API that will surely behave differently in certain edge cases.
We seem to use a lot of native queries and instances of org.hibernate.transform.ResultTransformer, yet org.hibernate.query.Query#setResultTransformer() seems to be deprecated with no indication of how to work around this.
In general I find the documentation about deprecations and intended upgrade paths on Hibernate's side a little scarce. I do understand that it is an open source project and that they don't want to maintain old APIs forever, but I'm still feeling a little lost, and I don't believe this is the only legacy Java application out there still in use today.
I understand what you mean. In fact, I've been recently seeing all sorts of questions on our forum regarding migration from 3.x to 4.x and 5.x.
I think we should have a migration landing page as a starting page for every migration. This way, users will have to go to a single page and find everything they need.
We don't have an automatic HBM-to-annotations tool. However, there is an alternative: you can do HBM -> database, and then use the reverse engineering tool to generate annotations from your database schema.
The legacy Criteria API is deprecated since we can no longer afford to maintain two Criteria APIs. Plus, the JPA Criteria API is more advanced (it has type-safe queries and the Metamodel). Unfortunately, there is no automatic migration from the legacy to the JPA Criteria API either. But even if you have hundreds of such method calls, you can migrate them fairly easily, either automatically (regex/perl/vi) or manually. It's not going to take that much effort.
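To give an idea of the shape of such a migration, here is a rough before/after sketch; the Person entity and the name filter are made up for illustration:

    import java.util.List;
    import javax.persistence.criteria.CriteriaBuilder;
    import javax.persistence.criteria.CriteriaQuery;
    import javax.persistence.criteria.Root;
    import org.hibernate.Session;
    import org.hibernate.criterion.Restrictions;

    public class CriteriaMigration {

        // Legacy Criteria API (the deprecated style):
        @SuppressWarnings({"unchecked", "deprecation"})
        static List<Person> legacy(Session session) {
            return session.createCriteria(Person.class)
                    .add(Restrictions.eq("name", "John"))
                    .list();
        }

        // JPA Criteria equivalent:
        static List<Person> jpa(Session session) {
            CriteriaBuilder cb = session.getCriteriaBuilder();
            CriteriaQuery<Person> query = cb.createQuery(Person.class);
            Root<Person> root = query.from(Person.class);
            query.select(root).where(cb.equal(root.get("name"), "John"));
            return session.createQuery(query).getResultList();
        }
    }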
The ResultTransformer is going to be replaced by a new mechanism that can take better advantage of lambdas. For this reason, the new interface or interfaces will have to be functional interfaces.
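For context, this is the kind of usage that will need a new home once that lambda-friendly replacement lands; the PersonSummary DTO and the SQL here are hypothetical:

    import java.util.List;
    import org.hibernate.Session;
    import org.hibernate.query.NativeQuery;
    import org.hibernate.transform.Transformers;

    public class ReportQueries {

        // Typical setResultTransformer() usage that is currently flagged as deprecated.
        @SuppressWarnings({"unchecked", "deprecation", "rawtypes"})
        static List<PersonSummary> summaries(Session session) {
            NativeQuery query = session.createNativeQuery(
                    "select p.name as name, count(o.id) as orderCount "
                    + "from person p left join orders o on o.person_id = p.id "
                    + "group by p.name");
            query.setResultTransformer(Transformers.aliasToBean(PersonSummary.class)); // deprecated
            return query.list();
        }
    }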
I'm new to Flex development, and RIAs in general. I've got a CRUD-style Java + Spring + Hibernate service on top of which I'm writing a Flex UI. Currently I'm using BlazeDS. This is an internal application running on a local network.
It's become apparent to me that the way RIAs work is more similar to a desktop application than a web application, in that we load up the entire model (or at least the portion we're interested in) and work with it directly on the client. This doesn't really jibe well with BlazeDS, because it really only supports remoting and not data management; it can therefore become a lot of extra work to keep clients in sync and to avoid reloading the model, which can be large (especially since lazy loading is not possible).
So it feels like what I'm left with is a situation where I have to treat my Flex application more like a regular old web application where I do a lot of fine-grained loading of data.
LiveCycle is too expensive. The free version of WebOrb for Java really only does remoting.
Enter GraniteDS. As far as I can determine, it's the only free solution out there that has many of the data management features of LiveCycle. I've started to go through its documentation a bit and suddenly feel like it's yet another quagmire of framework that I'll have to learn just to get an application running.
So my questions to the StackOverflow audience are:
1) Do you recommend GraniteDS, especially if my current Java stack is Spring + Hibernate?
2) At what point do you feel it starts to pay off? That is, at what level of application complexity do you feel that using GraniteDS really starts to make development that much better? In what ways?
If you're committed to Spring and don't want to introduce Seam, then I don't think that GraniteDS will give you much beyond BlazeDS. There is a useful utility that ensures only a single instance of any one entity exists in the client at any one time, but it's actually pretty easy to do that yourself with a few instances of Dictionary with weak references and some post-processing applied to the server calls (sketched after the docs quote below). A lot of the other features are Seam-specific, as alluded to here in the docs:
http://www.graniteds.org/confluence/display/DOC/6.+Tide+Data+Framework
Generally, the Tide approach is to minimize the amount of code needed to make things work between the client and the server. Its principles are very similar to the ones of JBoss Seam, which is the main reason why the first integration of Tide has been done with this framework. Integrations with Spring and EJB 3 are also available but are a little more limited.
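To sketch the Dictionary-with-weak-references idea mentioned above: the pattern is essentially an identity map keyed by entity type and id. The Flex version would use Dictionary objects with weak keys; the Java analogue below is only meant to illustrate the shape of the post-processing step, and all names are made up:

    import java.lang.ref.WeakReference;
    import java.util.HashMap;
    import java.util.Map;

    public class EntityIdentityMap {

        private final Map<String, WeakReference<Object>> instances =
                new HashMap<String, WeakReference<Object>>();

        // Run every object coming back from a server call through this method.
        @SuppressWarnings("unchecked")
        public synchronized <T> T canonical(Class<T> type, Object id, T fresh) {
            String key = type.getName() + "#" + id;
            WeakReference<Object> ref = instances.get(key);
            Object existing = (ref != null) ? ref.get() : null;
            if (existing == null) {
                instances.put(key, new WeakReference<Object>(fresh));
                return fresh;   // first time we see this entity; it becomes the canonical instance
            }
            // Copy the fresh state onto the existing instance here so UI bindings update,
            // then hand back the one canonical instance.
            return (T) existing;
        }
    }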
I do however think that Granite's approach to data management is a big improvement over LiveCycle's, because they are indeed quite different. From the Granite docs:
All client/server interactions are done exclusively by method calls on services exposed by the server, and thus respect transaction boundaries and security defined by the remote services.
This is different from how LiveCycle DS uses "managed collections", where you invoke fill() to grab large swathes of data and then invoke commit() methods to persist changes en masse. That treats the backend like a raw data access API and starts to get complicated (or simply falls apart entirely) when you have fine-grained security requirements. I therefore think Granite's approach is far more workable.
All data management features (serialization of JPA detached entities, client entity caching, data paging...) work with Spring.
GraniteDS does not mandate anything; you only need Seam if you want to use Seam on the server.
Actually, the free version of WebORB for Java does do data management. I've recently posted a comparison of WebORB for Java, LiveCycle DS, BlazeDS and GraniteDS. You can view the comparison chart at http://bit.ly/d7RVnJ - I'd be interested in your comments and feedback, as we want this to be the most comprehensive feature comparison on the web.
Cheers,
Kathleen
Have you looked at the Spring BlazeDS Integration project?
GraniteDS with the Seam Framework, Hibernate and MySQL is a very nice combination. What I do is create the database, use seam-gen to generate the Hibernate entities, and then work from there.
I've never used a cache like this before. The problem is that I want to load 500,000+ records out of a database and do some selecting/filtering wicked fast.
I'm thinking about using a cache, and preliminarily found EHCache and OSCache, any opinions?
Judging by their releases page, OSCache has not been actively maintained since 2007. This is not a good thing. EhCache, on the other hand, is under constant development. For that reason alone, I would choose EhCache.
Edit Nov 2013: OSCache, like the rest of OpenSymphony, is dead.
They're both pretty solid projects. If you have pretty basic caching needs, either one of them will probably work as well as the other.
You may also wish to consider doing the filtering in a database query if it's feasible. Often, using a tuned query that returns a smaller result set will give you better performance than loading 500,000 rows into memory and then filtering them.
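As a rough illustration of that point (the Order entity, its columns and the status value are invented), the difference is pushing the predicate into the query instead of post-filtering in Java:

    import java.util.ArrayList;
    import java.util.List;
    import javax.persistence.EntityManager;

    public class OrderLookup {

        // Let the database do the filtering: only the matching rows cross the wire.
        static List<Order> openOrdersFor(EntityManager em, long customerId) {
            return em.createQuery(
                            "select o from Order o where o.customerId = :customerId and o.status = 'OPEN'",
                            Order.class)
                    .setParameter("customerId", customerId)
                    .getResultList();
        }

        // The alternative being warned about: load all 500,000 rows, then filter in memory.
        static List<Order> openOrdersInMemory(List<Order> allOrders, long customerId) {
            List<Order> result = new ArrayList<Order>();
            for (Order o : allOrders) {
                if (o.getCustomerId() == customerId && "OPEN".equals(o.getStatus())) {
                    result.add(o);
                }
            }
            return result;
        }
    }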
I've used JCS (http://jakarta.apache.org/jcs/) and it seems solid and easy to use programmatically.
It sort of depends on your needs. If you're doing the work in memory on one machine, then Ehcache will work perfectly, assuming you have enough RAM or a fast enough hard disk so that the overflow doesn't cause disk paging/thrashing. If you find you need to scale out, especially if this particular operation happens a lot, then you'll probably want clustering. JGroups/TreeCache from JBoss supports this, and so does Ehcache (I think); I know it definitely works if you use Ehcache with Terracotta, which is a very slick integration.
This doesn't speak directly to the merits of Ehcache versus OSCache, so here's that answer: Ehcache seems to have the most inertia (it used to be the default, is well known, and is under active development, including a new cache server), while OSCache seemed (at least at one point) to have slightly more features, but I think that with the options mentioned above those advantages are moot or superseded.
Ah, the other thing I forgot to mention: whether transactionality of the data is important to you will also narrow down the list of valid choices.
Choose a cache which complies with JSR 107; that will make your job easy when you want to migrate from one implementation to another. To be specific to the question: go for Ehcache, which is the more popular and widely used Java caching solution. We use Ehcache extensively and it works for us.
Other answers discuss the pros and cons of particular caches, but I am wondering whether you would actually benefit from a cache at all. It is not quite clear exactly what you plan on doing here and why a cache would help: if you have the data set at hand, just access it directly. A cache only helps you reuse things between otherwise independent tasks. If that is what you are doing, then yes, caching can help. But if it is one big task that can carry its data set along with it, a cache would add no value.
Either way, I recommend using them with Spring Modules.
The cache can be transparent to the application, and cache implementations are trivially easy to swap.
In addition to OSCache and EHCache, Spring Modules also supports GigaSpaces and JBoss Cache.
As for comparisons:
OSCache is easier to configure
EHCache has more configuration options
They are both rock solid; both support cache mirroring, both work with Terracotta, and both support in-memory and on-disk caching.
I have used OSCache on several Spring projects with Spring Modules, using the AOP-based configuration.
Recently I looked at using OSCache + Spring Modules on a Spring 3.x project, but found that Spring Modules' annotation-based caching is not supported there (even by the fork).
I recently found out about this project:
http://code.google.com/p/ehcache-spring-annotations/
which supports Spring 3.x with declarative, annotation-based caching using Ehcache.
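From memory, usage looks roughly like the sketch below: the project's @Cacheable annotation on a method, plus an <ehcache:annotation-driven/> element in the Spring XML and a matching cache defined in ehcache.xml. The service class, Weather type and cache name here are invented:

    import com.googlecode.ehcache.annotations.Cacheable;
    import org.springframework.stereotype.Service;

    @Service
    public class WeatherService {

        // Results go into the "weatherCache" region defined in ehcache.xml;
        // the cache key is derived from the method arguments.
        @Cacheable(cacheName = "weatherCache")
        public Weather getWeather(String zipCode) {
            return expensiveLookup(zipCode);
        }

        private Weather expensiveLookup(String zipCode) {
            // ... call the slow backend here ...
            return new Weather(zipCode);
        }
    }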
I mainly use EhCache because it used to be the default cache provider for Hibernate. There is a list of caching solutions on Java-Source.net.
I used to have a link that compared the main caching solutions. If I find it I will update this answer.
OSCache is pretty much dead as it has been abandoned a few years ago. You may take a look at Cacheonix, it's been actively developed and we've just released v.2.2.2 with support for caching in the web tier. I'm a committer so you can reach out if you have any questions.
My developers are waging a civil war. In one camp, they've embraced Hibernate and Spring. In the other camp, they've denounced frameworks - they're considering Hibernate though.
The question is: Are there any nasty surprises, weaknesses or pit-falls that newbie Hibernate-Spring converts are likely to stumble on?
PS: We have a DAO library that's not very sophisticated. I doubt it has Hibernate's richness, but it's reaching some sort of maturity (i.e. it hasn't changed in the last few projects it's been included in).
They've denounced frameworks?
That's nuts. If you don't use an off-the-shelf framework, then you create your own. It's still a framework.
I've used Hibernate a number of times in the past. Each time I've run into edge cases where determining the syntax devolved into a scavenger hunt through the documentation, Google, and old versions. It is a powerful tool but poorly documented (last I looked).
As for Spring, just about every job I've interviewed for or looked at in the past few years involved Spring, it's really become the de-facto standard for Java/web. Using it will help your developers be more marketable in the future, and it'll help you as you'll have a large pool of people who'll understand your application.
Writing your own framework is tempting, educational, and fun. Not so great on results.
Hibernate has quirks to be sure but that is because the problem it is trying to solve is complex. Every time someone complains about Hibernate I remind them of all of the boring DAO code that they would have to maintain if they weren't using it.
A few tips:
Hibernate is no substitute for good database design. Hibernate-generated schemas are OK, but you will have to tweak them occasionally.
Eventually you are going to have to understand how Hibernate lazily loads objects and how that affects things. Hibernate generates proxy classes and manipulates bytecode at runtime, and you will need to delve into the depths sooner or later, if only to explain why certain object links are null.
Use annotations if you can.
Take the time to learn the Hibernate performance tuning techniques, it will save you in the long run.
If you have a fairly complex database, Hibernate may not be for you. At work we have a fairly complex database with lots of data, and Hibernate doesn't really work for us. We've started using iBATIS instead. However, I know a lot of development shops who use Hibernate successfully - and it does do a lot of grunt work for you - so it's worth considering.
Spring is a good tool if you know how to use it properly.
I would say that frameworks are definitely a good thing - like others have pointed out, you don't want to reinvent the wheel. Spring contains a lot of modules which will mean you won't have to write so much code. Don't succumb to the "Not Invented Here" syndrome!
Lazy loading is the big gotcha in MVC applications that use Hibernate as their persistence framework. You load the object in the controller and pass it to the JSP view. Some or all of the members of the class are proxied, and everything blows up because your Hibernate session was closed when the controller completed.
You will need to read the Open Session in View article to understand the problem and its solutions. If you are using Spring, then this blog article describes the Spring solution to the Open Session in View issue.
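With Spring, the fix usually boils down to registering the filter in web.xml, roughly like the snippet below (this is the Hibernate 3-era package name; adjust it to your Spring/Hibernate versions, and it assumes your SessionFactory bean is named "sessionFactory"):

    <filter>
        <filter-name>openSessionInView</filter-name>
        <filter-class>org.springframework.orm.hibernate3.support.OpenSessionInViewFilter</filter-class>
    </filter>
    <filter-mapping>
        <filter-name>openSessionInView</filter-name>
        <url-pattern>/*</url-pattern>
    </filter-mapping>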
This is one thing (I could remember) that I fell into when I was in my Hibernate days.
When you delete (several) child objects from a collection (in a parent entity) and then add new entities to the same collection in one transaction, without flushing in between, Hibernate will execute the inserts before the deletes. If the child table has a unique constraint on one of its columns, and you are expecting not to violate it because you have already deleted the conflicting data earlier in the transaction (just like I was), then get ready to be frustrated.
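A stripped-down version of the situation looks roughly like this (the Parent/Child entities and the unique "code" column are invented, and the collection is assumed to cascade with delete-orphan):

    import org.hibernate.Session;
    import org.hibernate.Transaction;

    public class ReplaceChildren {

        // Child.code carries a unique constraint; Parent.children cascades with delete-orphan.
        static void replace(Session session, Long parentId) {
            Transaction tx = session.beginTransaction();
            Parent parent = session.get(Parent.class, parentId);

            // Remove the old child carrying code "A-1" ...
            parent.getChildren().removeIf(c -> "A-1".equals(c.getCode()));

            // Without an explicit session.flush() here, Hibernate queues the INSERT
            // before the DELETE at flush time, and the unique constraint blows up.

            // ... then add a new child reusing the same code.
            parent.getChildren().add(new Child("A-1"));

            tx.commit();   // constraint violation without the intermediate flush
        }
    }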
Hibernate forum suggests:
It was a DB design flaw, redesign;
flush (or commit if you will) in between the deletes and inserts;
I couldn't do either, and ended up tweaking the Hibernate source and recompiling. It was only one line of code, but the effort to find that one line was equal to approximately 27 cups of coffee and 3 sleepless nights.
This is just one example of the problems and quirks you might end up with when using Hibernate without a real expert on your team (expert: someone with adequate knowledge of the philosophy and internal workings of Hibernate). Your problem, solution, coffee consumption, and sleepless night count may vary. But you get the idea.
I haven't worked much with Java, but I have worked in large groups of Java developers. The impression I got was that Spring is OK, but everybody was upset with Hibernate. If you asked half the team "If you could change one thing, what would you change?", they'd say "Get rid of Hibernate." When I started to learn Hibernate it struck me as amazingly complex, but I didn't learn enough (thankfully I've moved on) to know whether the complexity was justified or not (maybe it was required to solve some complex problems).
The team got rid of Spring in favor of Guice, but that was more like a political change, at least from my point of view and other developers I've talked to.
I have always found Hibernate to be a bit complex and hard to learn. But since JPA (Java Persistence API) and EJB (Enterprise JavaBeans) 3.0 have existed for a while, things have gotten a lot easier; I much prefer annotating my classes over creating mappings via JavaDoc comments or XML. Check out the support for this in Hibernate. The added bonus is that it is possible (though not effortless) to change the underlying persistence framework later on if needed. I have used OpenJPA with great results.
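For example, a mapping that used to live in an .hbm.xml file or JavaDoc comments becomes a handful of annotations (the entity here is made up for illustration):

    import javax.persistence.Column;
    import javax.persistence.Entity;
    import javax.persistence.GeneratedValue;
    import javax.persistence.GenerationType;
    import javax.persistence.Id;
    import javax.persistence.Table;

    @Entity
    @Table(name = "customer")
    public class Customer {

        @Id
        @GeneratedValue(strategy = GenerationType.IDENTITY)
        private Long id;

        @Column(name = "full_name", nullable = false, length = 100)
        private String name;

        protected Customer() { }              // required by JPA

        public Customer(String name) { this.name = name; }

        public Long getId() { return id; }
        public String getName() { return name; }
    }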
Lately I have been using JCR (Java Content Repository) more and more. I love the way my modules can share a single data store and that I can let the structure and properties evolve. I find it a lot easier to work with nodes and properties rather than mapping my objects to a database. A good implementation is Jackrabbit.
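To illustrate the node-and-property style of working, here is a small sketch against the javax.jcr API with Jackrabbit's in-process TransientRepository; the node names, properties and credentials are just an example:

    import javax.jcr.Node;
    import javax.jcr.Repository;
    import javax.jcr.Session;
    import javax.jcr.SimpleCredentials;
    import org.apache.jackrabbit.core.TransientRepository;

    public class JcrExample {
        public static void main(String[] args) throws Exception {
            Repository repository = new TransientRepository();   // in-process Jackrabbit for demos
            Session session = repository.login(new SimpleCredentials("admin", "admin".toCharArray()));
            try {
                Node root = session.getRootNode();
                Node article = root.addNode("articles").addNode("hello-world");
                article.setProperty("title", "Hello World");
                article.setProperty("published", true);
                session.save();

                System.out.println(article.getProperty("title").getString());
            } finally {
                session.logout();
            }
        }
    }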
As for Spring, it has a lot of features I like, but the amount of XML needed to configure it means I will never use it. Instead I use Guice and absolutely love it.
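A Guice setup is roughly this much code, all in plain Java (the service interface and implementation are invented for the example):

    import com.google.inject.AbstractModule;
    import com.google.inject.Guice;
    import com.google.inject.Injector;

    public class GuiceExample {

        interface MessageService { String message(); }

        static class DefaultMessageService implements MessageService {
            public String message() { return "hello from Guice"; }
        }

        static class AppModule extends AbstractModule {
            @Override
            protected void configure() {
                // All wiring is plain Java: no XML involved.
                bind(MessageService.class).to(DefaultMessageService.class);
            }
        }

        public static void main(String[] args) {
            Injector injector = Guice.createInjector(new AppModule());
            System.out.println(injector.getInstance(MessageService.class).message());
        }
    }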
To sum up, I would show your doubting developers how Hibernate will make their lives easier. As for Spring, I would seriously check whether Guice is a viable alternative and then try to show how Spring or Guice makes development better and easier.
I've done a lot of Spring/Hibernate development. Over time the way people use the two in combination has changed a bit. The original HibernateTemplate approach has proved difficult to debug, since it swallows and wraps otherwise useful exceptions; talk to the Hibernate API directly!
Please keep looking at the generated SQL (configure your development logging to show it). Having an abstraction layer over the database doesn't mean you don't have to think in SQL anymore; you won't get good performance otherwise.
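Concretely, that usually just means flipping a couple of settings in the Hibernate configuration, or routing the SQL through your logging framework; for example (property names from memory):

    # hibernate.properties (or the equivalent <property> entries in hibernate.cfg.xml)
    hibernate.show_sql=true
    hibernate.format_sql=true

    # log4j alternative: log the SQL (and bind parameters) through the logging framework
    log4j.logger.org.hibernate.SQL=DEBUG
    log4j.logger.org.hibernate.type=TRACE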
Consider the project. I've chosen iBATIS over Hibernate on several occasions where we had stringent performance requirements, complex legacy schemas, or good DBAs capable of writing excellent SQL.
As for Hibernate: it is a very good tool for applications that deal with a rapidly changing database schema, a large number of tables, and lots of simple CRUD operations. Reports with complex queries are handled rather less well, but in those cases I prefer mixing in JDBC or native queries. So, for a short answer: I do think time spent learning Hibernate is a good investment (they also say it is compliant with the EJB 3.0 and JPA standards, but that didn't come into the equation when I evaluated it for my personal use).
As for Spring... see The Bile Blog :)
Remember: frameworks are not silver bullets, but you should not reinvent the wheel either.
I find it really helps to use well-known frameworks such as Hibernate because it fits your code into a specific mold, or a way of thinking. Meaning, since you're using Hibernate, you write code a certain way, and most if not all developers who know Hibernate will be able to follow your line of thinking quite easily.
There's a downside to this, of course. Before you become a hot-shot Hibernate developer, you're going to find that you're trying to fit a square peg into a round hole. You KNOW what you want to do, and how you would have done it before Hibernate came into the picture, but finding the Hibernate way of doing it may take... quite a bit of time.
Still, for companies that frequently hire consultants (who need to understand a lot of source code in a short amount of time) or where the developers sign on and quit frequently, or where you just don't want to bet that your key developers will stay forever and never change jobs -- Hibernate and other standard frameworks are a pretty good idea I think.
/Ace
Spring and Hibernate are frameworks that are tricky to master. It may not be a good idea to use them in projects with tight deadlines while you're still trying to figure out the frameworks.
The benefit of the frameworks is basically that they provide a platform that allows consistent code to be produced. From experience, you'd be well advised to have developers experienced with the frameworks put best practices in place.
Depending on the design of your application and/or database, there are also quirks that you'll need to circumvent to ensure that the frameworks do not hinder performance.
In my opinion, the biggest advantage of Spring is that it encourages and enables better development practices, in particular loose coupling, testing, and more interfaces. Hibernate without Spring can be really painful, but the two together are very useful.
Retrofitting an existing project to any framework is going to be painful, but the refactoring process often has serious benefits for long-term maintainability.
I have to agree with many posts on this one. I've used both, extensively, in a variety of settings. If I could undo a design decision it would be to have used Hibernate. We actually budgeted a release in one of our products to swap Hibernate for iBatis and Spring-JDBC for a best-of-all-worlds approach. I can have a new developer get up to speed using Spring-JDBC, Spring-MVC, Spring-Ioc, and iBatis faster than if I just tasked them with Hibernate.
Hibernate is just too complicated for this KISS developer. And heaven help you with Hibernate if your DBA looks at the generated SQL the database actually sees and sends you back optimized versions.
The top answer mentions that Hibernate is poorly documented. I agree that the online reference manual could be more complete. However, a book written by Hibernate's authors, 'Java persistence with Hibernate' is a must-read for every Hibernate user and very complete.
#slim - I am with you again this morning.
It sounds like a classic case of Not Invented Here syndrome. If they aren't keen on Spring, they should consider other options rather than rolling their own framework (whether they acknowledge doing so or not). Guice comes to mind as a possibility; also PicoContainer. There are others out there, depending on what you need.
Spring and Hibernate definitely make life easier.
Getting started with them might be a little time-consuming at the beginning, but you'll certainly benefit from it later. Now that XML configuration is being replaced by annotations, you don't need to type hundreds of lines of XML either.
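For instance, a context that used to be a page of XML can now be a small Java class (the bean and package names are invented):

    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.ComponentScan;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    @ComponentScan("com.example.app")     // picks up @Component/@Service/@Repository classes
    public class AppConfig {

        @Bean
        public GreetingService greetingService() {
            return new GreetingService("Hello");
        }
    }

Bootstrapping is then just new AnnotationConfigApplicationContext(AppConfig.class).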
You may want to consider AppFuse to reduce your learning-curve: generate an application, study and adapt it, and off you go.
Frameworks are not evil. Even the Java SDK is a framework.
What they probably object to is framework proliferation. You shouldn't bring a framework to a project just for the kick of it; it should bring consistent value in a reasonable time. Every framework requires a learning curve, but it should reward you with increased productivity and features later on.
If you struggle with code that is hard to debug because of inconsistent database usage, complicated cache mechanisms, or a myriad of other reasons, Hibernate will add great value.
Apart from the learning curve (which took about a month of practical work for me), there weren't any pitfalls, provided you have someone around to explain the basics to you.