I am looking for the name of a concept where you can plug any type of persistence mechanism, such as an RDBMS, an XML database, a RESTful API, or a formatted file like CSV, into a programming language such as Java.
I am quite sure that this concept has a name, but I just can't remember what it's called...
It's similar to "DataSource" in CakePHP, but as I mentioned, it wouldn't be restricted only to RDBMS.
Does anyone have an idea?
EDIT: and does anyone know of an actual implementation of this concept for Java?
I think "Data Source" already describes it very well. And data sources in CakePHP are not limited to just RDBMS. See https://github.com/cakephp/datasources/tree/2.0/Model/Datasource
I do not think there is a single established name for that; what you're looking for is more of a pattern, and this level of abstraction can be approached in different ways with different patterns. Check how CakePHP or any other good framework does it, or look directly at the Spring Framework for Java. If you want to write your own abstraction layer, let them inspire you.
How about a DataServiceProvider that abstracts persistence through one of several DataAdapter "types", which your business objects or application services can locate via a common service locator?
We have a similar implementation, though in practice it is not used much, as people tend to prefer their own tools for sourcing other data types.
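A minimal sketch of what that might look like in Java (all names here, such as DataAdapter and DataServiceProvider, are hypothetical and only illustrate the idea):

    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    // A storage-agnostic adapter: implementations could wrap a JDBC database,
    // an XML database, a RESTful API, or a CSV file.
    interface DataAdapter<T> {
        T findById(Object id);
        List<T> findAll(Map<String, Object> criteria);
        void save(T entity);
        void delete(Object id);
    }

    // A very small service locator that hands out the adapter registered for a type.
    final class DataServiceProvider {
        private static final Map<Class<?>, DataAdapter<?>> ADAPTERS = new HashMap<>();

        static <T> void register(Class<T> type, DataAdapter<T> adapter) {
            ADAPTERS.put(type, adapter);
        }

        @SuppressWarnings("unchecked")
        static <T> DataAdapter<T> adapterFor(Class<T> type) {
            return (DataAdapter<T>) ADAPTERS.get(type);
        }
    }

Business objects would then ask DataServiceProvider.adapterFor(Customer.class) for their adapter without caring whether the backing store is a CSV file, an XML database, or a REST service.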
Related
I have a POJO class and I need to call a RESTful web service using some properties from the POJO as parameters to the service. The caveat is that I won't know the endpoint and its parameters until runtime. Basically, the user will configure at runtime the endpoint, the input/output schemas, and the mappings between those schemas and the POJO class. Then I have to call the API with the appropriate values.
This is going to be a really broad answer.
It sounds like a problem that would benefit from treating 'code as data'.
What I mean by this is that the number of possibilities you have to be able to deal with at runtime is close to the complexity of using a programming language itself.
When this happens, there are generally a few choices that people either fall into by accident or choose consciously, depending on who the user is:

1. Limit the scope of the problem, and make your configuration so complex that it may as well be a programming language itself.
2. Embed a scripting language, or create some runtime loading of plugins in the native language.
3. Use an off-the-shelf library / solution.
I'd recommend 2 or 3 over 1 if your user is yourself or the configuration can be provided by another programmer.
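To make that concrete for the RESTful case in the question, here is a rough sketch (the class name ConfigurableRestCaller and the mapping format are made up, and it only handles GET requests with query parameters) of driving the call from runtime-supplied configuration:

    import java.lang.reflect.Field;
    import java.net.URI;
    import java.net.URLEncoder;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.nio.charset.StandardCharsets;
    import java.util.Map;
    import java.util.StringJoiner;

    // Builds and sends a GET request from a POJO, using a user-supplied mapping
    // of POJO field names to query-parameter names that is only known at runtime.
    final class ConfigurableRestCaller {
        private final HttpClient client = HttpClient.newHttpClient();

        String call(String endpoint, Object pojo, Map<String, String> fieldToParam) throws Exception {
            StringJoiner query = new StringJoiner("&");
            for (Map.Entry<String, String> e : fieldToParam.entrySet()) {
                Field field = pojo.getClass().getDeclaredField(e.getKey());
                field.setAccessible(true);
                String value = String.valueOf(field.get(pojo));
                query.add(e.getValue() + "=" + URLEncoder.encode(value, StandardCharsets.UTF_8));
            }
            HttpRequest request = HttpRequest.newBuilder(URI.create(endpoint + "?" + query)).GET().build();
            return client.send(request, HttpResponse.BodyHandlers.ofString()).body();
        }
    }

Anything beyond this (input/output schemas, nested mappings, non-GET verbs) quickly pushes you toward option 2 or 3.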
Does anyone have any strategies or examples of cross-framework libraries?
I am working on a project with an Android app, a Java server, and a Java desktop client, which all use different frameworks. I need to refactor some core business logic into a separate library that can be used across all of these to ensure consistent behavior, but the field annotations are killing me.
The problem is that I am using Room in the Android app (which requires the @PrimaryKey annotation on the primary key field of a database entity) and JPA in the server and JavaFX client (which requires @Id).
Given this difficulty with the models, we initially copy-pasted the fields (without annotations) between the projects whenever they changed. However, the business logic needs to make use of the models and accommodate each platform's specific ORM, HTTP client, and JSON serializer. (I know that it is technically possible to get Gson, Apache HttpClient, and Hibernate to run on all of these platforms, but actually doing any of that created too many nightmares of its own.)
As far as I can tell, there isn't a nice solution to this. Fortunately, the same @Inject annotation is used in Dagger 2 and CDI/CDI SE, so I have created some interfaces that each platform/framework will implement.
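A minimal sketch of that interface approach, using the JSR-330 javax.inject.Inject annotation; the names Order, OrderStore, and OrderService are made up, and each platform would bind its own Room- or JPA-backed OrderStore through its DI framework:

    import javax.inject.Inject;

    // Shared library: a plain model with no framework annotations.
    class Order {
        private long id;
        private String description;

        long getId() { return id; }
        void setId(long id) { this.id = id; }
        String getDescription() { return description; }
        void setDescription(String description) { this.description = description; }
    }

    // Persistence contract implemented separately on each platform
    // (e.g. wrapping a Room DAO on Android, a JPA EntityManager on the server).
    interface OrderStore {
        Order findById(long id);
        void save(Order order);
    }

    // Business logic in the shared library depends only on the interface and receives
    // the platform-specific implementation via @Inject (Dagger 2 on Android, CDI elsewhere).
    class OrderService {
        private final OrderStore store;

        @Inject
        OrderService(OrderStore store) {
            this.store = store;
        }

        void rename(long id, String description) {
            Order order = store.findById(id);
            order.setDescription(description);
            store.save(order);
        }
    }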
Does anybody have any examples or case studies I could look at which might help me arrive at a solution?
(I realize that, apart from the illustrative sketch above, this question doesn't include real code; it's more of a general programming strategy question.)
Disclaimer: I am the architect of JDX for Java and JDXA for Android ORMs.
You may consider using the JDX (Java) and JDXA (Android) ORM frameworks to share the common object model, the core business logic code, and the data integration code across the Java server, Java desktop, and Android platforms.
JDX and JDXA don't use annotations to define the mapping; instead they use an external text file to define the mapping specification, based on a simple ORM grammar. So you can use the same mapping specification for your common object model across different platforms. Also, the APIs of JDX and JDXA are similar.
So, you just need to use the appropriate JDX(A) ORM library for your target platform and an appropriate JDBC driver for your target database without needing to change your object model or business logic.
I'm a Java EE newbie developer. According to many resources on the internet, the service locator design pattern is an anti-pattern because it hides class dependencies (among other problems), so it should be avoided as much as possible in favor of dependency injection. As we know, JNDI is an implementation of the service locator pattern.
I googled to check whether JNDI really is an implementation of the service locator pattern and found this response, which claims that it is: Understanding JNDI
However, I see that JNDI is used in Java EE applications for many purposes (data sources, EJB lookup, ...). So should I use it, or should I avoid it as much as possible? And if JNDI isn't bad, does that mean the service locator pattern isn't bad either?
I think the first part of your question, whether the service locator is good or not and whether JNDI is really about this pattern, is a bit esoteric. As someone who has been a software architect for some years now, I can offer the general advice that a pattern by itself is neither good nor bad; it is just a piece of a solution that was used successfully in many cases and was therefore declared a pattern, so that it can be reused for similar future cases. Another thing: as opposed to many years ago, when one had to know the GoF book by heart in order to survive an interview, nowadays it is much more important to understand the underlying concepts of a framework like Java EE than to implement all those patterns yourself, because what you have to implement is very often simple and straightforward, but using it correctly relies on those concepts.
Concerning the second part of your question: you almost never need to use JNDI directly. Instead, use the concepts built on top of it, such as injection; that is what you should use in your application.
It's a horrible pattern IMHO, since it is a massive security flaw. If dependencies are known at compile time and do not change, then it's much easier to audit, gate, and control possible vulnerabilities. Even within an organization, JNDI is a Trojan horse waiting to be put to nefarious use: if bad actors can compromise some other area of your network, they can then load whatever they want via a poorly or unwittingly implemented app. The Log4j debacle is proof of that: don't allow apps to look up and load whatever, whenever. It's a stupid idea. It's unsafe.
In a business environment we end up needing different kinds of data across applications, so it makes sense to store them in a shared location. For instance, you may have a set of applications that share the same set of users, and we need authorization information for each of them listing what roles they have, so we know what they are allowed to access. That kind of thing goes into an LDAP data store, which you can think of as a hierarchical database optimized for fast read access.
All sorts of things can go in these datastores; it's normal for an application server to stash connection pools in them, for instance. A lot of these, like users, roles, and connection pools, are vital things you need to do your job.
JNDI is the standard Java API for accessing these naming and directory services, LDAP datastores included.
The nasty thing about the service locator design pattern is that the client code doing the lookup has to know too much about the thing it is querying (mainly, where to get it from), and having that lookup hard-coded in the client makes the code inflexible and hard to test. But if we use dependency injection (whether it's CDI, Spring, whatever) we can have the framework inject the value we want into the code, while the JNDI lookups are handled within the framework code and not in the application. That means you can use JNDI without your application code having to use the service locator pattern.
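A small sketch of that difference, assuming a hypothetical data source bound under java:comp/env/jdbc/AppDS (the class names are made up too):

    import javax.annotation.Resource;
    import javax.naming.InitialContext;
    import javax.naming.NamingException;
    import javax.sql.DataSource;

    // Service-locator style: the class performs the JNDI lookup itself, so it has to
    // know the name and environment it runs in, and is harder to test in isolation.
    class LookupStyleRepository {
        private final DataSource dataSource;

        LookupStyleRepository() throws NamingException {
            InitialContext ctx = new InitialContext();
            dataSource = (DataSource) ctx.lookup("java:comp/env/jdbc/AppDS");
        }
    }

    // Injection style: the container resolves the same JNDI resource and hands it
    // to the class; the application code contains no lookup logic at all.
    class InjectedRepository {
        @Resource(lookup = "java:comp/env/jdbc/AppDS")
        private DataSource dataSource;
    }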
I am developing an app using the MVC pattern.
Controllers: servlets
Model: I am following the DAO/DTO pattern for accessing the database
View: simple JSP with EL and JSTL
For accessing the database I am using the DAO pattern. I want to put a validation method and a HashMap of error messages inside the DTO classes for validating form data, something similar to Putting validation method and hashmap into DTO.
My question is: is this the right approach? If not, what is an ideal way of doing it?
In summary: I want to know real-world solutions for server-side form validation when using the DAO/DTO pattern. Please help me.
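For reference, a minimal sketch of the DTO-with-validation approach described above (UserDTO and its fields are made-up names):

    import java.util.HashMap;
    import java.util.Map;

    // DTO that validates its own form data and collects error messages per field.
    public class UserDTO {
        private String username;
        private String email;

        private final Map<String, String> errors = new HashMap<>();

        public boolean validate() {
            errors.clear();
            if (username == null || username.trim().isEmpty()) {
                errors.put("username", "Username is required");
            }
            if (email == null || !email.contains("@")) {
                errors.put("email", "A valid email address is required");
            }
            return errors.isEmpty();
        }

        public Map<String, String> getErrors() { return errors; }

        // setters used by the servlet when binding request parameters
        public void setUsername(String username) { this.username = username; }
        public void setEmail(String email) { this.email = email; }
    }

The servlet would populate the DTO from request parameters, call validate(), and either put the errors map into the request for the JSP or pass the DTO on to the DAO.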
I believe you need to treat separately the architecture you're implementing and the frameworks you're using to implement the architecture.
Java has a rich set of tools for working on the three standard tiers of your application, and the choice depends on factors like expected load and server resources; if you have a two- or three-user application, then it is just a matter of taste.
In terms of DAO/DTO you also have some options; for example, you can build your data access layer with Hibernate and then use DTOs for your service layer API. In this situation you probably want a tool for mapping between your domain model and your DTOs (for example jDTO Binder).
Another common approach is to use Spring's JdbcTemplate; there you can go a little more crazy and use the same domain objects as part of the service layer API.
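For example, a rough sketch of that JdbcTemplate approach, querying straight into the domain object the service layer uses (the User class and table are made up):

    import java.util.List;
    import javax.sql.DataSource;
    import org.springframework.jdbc.core.BeanPropertyRowMapper;
    import org.springframework.jdbc.core.JdbcTemplate;

    // Domain object reused directly as the service-layer type.
    class User {
        private long id;
        private String username;
        private String email;

        // getters/setters required by BeanPropertyRowMapper
        public long getId() { return id; }
        public void setId(long id) { this.id = id; }
        public String getUsername() { return username; }
        public void setUsername(String username) { this.username = username; }
        public String getEmail() { return email; }
        public void setEmail(String email) { this.email = email; }
    }

    // DAO that maps result-set columns onto the domain object by property name.
    class UserDao {
        private final JdbcTemplate jdbcTemplate;

        UserDao(DataSource dataSource) {
            this.jdbcTemplate = new JdbcTemplate(dataSource);
        }

        List<User> findAll() {
            return jdbcTemplate.query(
                    "SELECT id, username, email FROM users",
                    new BeanPropertyRowMapper<>(User.class));
        }
    }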
Finally, the truth is, you can do this by the book or you can do it completely differently; the choice depends on your scenario, taste, and experience.
I need to create a system oriented around Methods, where providers can register for the Methods they handle and consumers can do two things (for now): either get the Metadata for a method or execute it. I'm considering a REST-style architecture where methods are resources with unique URIs and an interface consisting of two operations, getMetadata and Execute.
I'll need an equivalent of @RequestMapping so that the provider handling a specific method can be located by the central dispatcher. The provider will then return either a Model or a Metadata object.
This looks pretty similar to Spring MVC, but I don't want to expose and consume my resources (methods) over the web using HTTP, as that would incur unnecessary overhead. Instead I want to use it like a standard Java API, where Java methods are called and Java objects are passed around.
I can do that by writing my own equivalent of @RequestMapping and the dispatcher logic, but I was wondering if there's a better way to do this with Spring. Any suggestions?
Thanks!
Kostadin
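A minimal sketch of the home-grown approach described in the question, an in-process analogue of @RequestMapping plus a reflective dispatcher (the annotation and class names are hypothetical):

    import java.lang.annotation.ElementType;
    import java.lang.annotation.Retention;
    import java.lang.annotation.RetentionPolicy;
    import java.lang.annotation.Target;
    import java.lang.reflect.Method;
    import java.util.HashMap;
    import java.util.Map;

    // Providers annotate their public handler methods with the URI they serve.
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.METHOD)
    @interface MethodMapping {
        String value();
    }

    // Central dispatcher: scans registered providers and routes calls by URI, no HTTP involved.
    class MethodDispatcher {
        private final Map<String, Method> handlers = new HashMap<>();
        private final Map<String, Object> providers = new HashMap<>();

        void register(Object provider) {
            for (Method m : provider.getClass().getMethods()) {
                MethodMapping mapping = m.getAnnotation(MethodMapping.class);
                if (mapping != null) {
                    handlers.put(mapping.value(), m);
                    providers.put(mapping.value(), provider);
                }
            }
        }

        Object execute(String uri, Object... args) throws Exception {
            Method handler = handlers.get(uri);
            if (handler == null) {
                throw new IllegalArgumentException("No provider registered for " + uri);
            }
            return handler.invoke(providers.get(uri), args);
        }
    }

With Spring, the register step could be replaced by scanning the application context for beans whose methods carry the custom annotation.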
You are saying you want to do this REST-style, with everything having a unique URI, but not over HTTP? Sounds like you are looking for RMI or something similar. Check out Burlap or Hessian; both of them have excellent support in Spring.
There's software out there called NetKernel that might interest you. Its literature says that it is an implementation of Resource-Oriented Computing. It looks like it rigorously separates its logical computing model from the physical details. It's RESTful, defining a resource model, a limited set of verbs, and a naming scheme. Implemented in Java. Comes with HTTP and other transports built in.
It doesn't have a Java in-process transport, but you could probably write one for it pretty easily.
Hmm...if you never need to process requests from out-of-process sources, it's probably overkill for you, but maybe it will show you some useful patterns.