OSGi services architecture: creation of service at request of consumer - java

I am developing an application in Eclipse RCP. I need help with a design decision concerning the design of a service.
I have some bundles which are used to provide an REngine object to other modules. The REngine is an interface to a calculation engine, which can be implemented in multiple ways. The bundles provide instances of REngine by connecting to a remote server or starting a local calculation thread. Some bundles require configuration by the GUI (but also need to be available on a headless platform). A client Bundle may request multiple REngine objects for parallel calculations.
I currently register these modules to provide an REngine service. The service is created by a ServiceFactory, which either launches a local calculation instance or a remote (server) instance. The client is responsible for trying out all Service registrations of the REngine class and selecting the right one.
The code to do this can be summarised as follows:
class API.REngine { ... }

class REngineProvider.Activator {
    public void start(BundleContext ctx) {
        ctx.registerService(REngine.class.getName(), new REngineFactory(), null);
    }
}

class REngineProvider.REngineFactory implements ServiceFactory {
    public Object getService(Bundle bundle, ServiceRegistration registration) {
        return new MyREngineImplementation();
    }
    public void ungetService(Bundle bundle, ServiceRegistration registration, Object service) {
        ((REngine) service).releaseAssociatedResources();
    }
}

class RConsumer.Class {
    REngine getREngine() throws InvalidSyntaxException {
        ServiceReference[] references = bundleContext.getAllServiceReferences(REngine.class.getName(), null);
        for (ServiceReference ref : references) {
            try {
                return (REngine) bundleContext.getService(ref);
            } catch (Exception e) {} // too bad, try the next one
        }
        return null; // no provider could supply an REngine
    }
}
I would like to keep this model. It is nice that the OSGi service spec matches my business requirement that REngine objects are living objects which should be released when they are not needed anymore.
However, a registered service can only provide one service instance per bundle. The second time the service is requested, a cached instance is returned (instead of creating a new one). This does not match my requirement; a bundle should be able to get multiple REngine objects from the same provider.
I have already looked at other OSGi framework classes, but nothing seems to help. The alternative is the whiteboard model, but it seems strange to register an REngineRequestService that is used by the REngineProvider bundle to give out a live REngine.
How do I implement this in OSGi? As a reminder, here is my list of requirements:
Easy enabling and disabling of REngineProvider bundles. Client code will just use another provider instead.
Configuration of REngineProvider bundles.
Multiple REngine instances per client bundle.
Explicit release of REngine instances
REngine creation can fail. The client module should be able to know the reason why.
Just to add the solution I have chosen, for future reference. It seems the OSGi Services platform is not made for "requesting a service". It is the provider bundle that creates a service, and the client bundle that can find and use the services. It is not possible to provide an automatic "Factory" for services per user request.
The solution chosen involves the OSGi whiteboard model. At first sight, this may seem very difficult to manage, but Blueprint can help a lot!
Provider blueprint.xml file:
<reference-list interface="org.application.REngineRequest"
                availability="optional">
    <reference-listener bind-method="bind" unbind-method="unbind">
        <bean class="org.provider.REngineProvider"/>
    </reference-listener>
</reference-list>
The class REngineRequest is a shared API class that allows the provider to hand in its REngine object, or to set an Exception explaining why the creation did not work.
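A minimal sketch of what this shared class could look like (the field names, the latch-based blocking and the setter names are assumptions made for illustration, not the original API):

// Hypothetical sketch of the shared REngineRequest API class.
import java.util.concurrent.CountDownLatch;

public class REngineRequest {
    private final CountDownLatch done = new CountDownLatch(1);
    private REngine engine;
    private Exception failure;

    // Called by the provider once it has created an engine for this request.
    public synchronized void setEngine(REngine engine) {
        this.engine = engine;
        done.countDown();
    }

    // Called by the provider when creation fails, so the client learns the reason.
    public synchronized void setFailure(Exception cause) {
        this.failure = cause;
        done.countDown();
    }

    // Called by the client; blocks until a provider has answered the request.
    public REngine getEngine() throws Exception {
        done.await();
        synchronized (this) {
            if (failure != null) {
                throw failure;
            }
            return engine;
        }
    }
}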
For the client, using an REngine is now as easy as doing:
REngineRequest req = new REngineRequest();
ServiceRegistration reg = bundleContext.registerService(REngineRequest.class.getName(), req, engineCreationProperties);
req.getEngine().doSomeStuff();
reg.unregister();
We make the assumption that the provider will never stop while the client is using the REngine. If it does, the REngine becomes invalid.
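On the provider side, the listener bean referenced from the blueprint.xml above could look roughly like this (the bind/unbind signatures follow the Blueprint reference-listener conventions; the creation logic is a placeholder):

// Sketch of the whiteboard listener in the provider bundle.
import java.util.Map;

public class REngineProvider {

    // Called by Blueprint whenever a client registers an REngineRequest service.
    public void bind(REngineRequest request, Map<String, Object> properties) {
        try {
            // Use the service properties (engineCreationProperties) to decide whether
            // to start a local calculation thread or connect to a remote server.
            request.setEngine(new MyREngineImplementation());
        } catch (Exception e) {
            request.setFailure(e);
        }
    }

    // Called when the client unregisters its request; release the engine here.
    public void unbind(REngineRequest request, Map<String, Object> properties) {
        // release the resources associated with the engine handed out in bind()
    }
}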

ComponentFactory from the Declarative Services is what you need. Most of the time you should rather use DS instead of manually registering and looking up the services.
The provider side should register an REngine factory service (you don't have to implement the factory itself, DS will do that for you). The consumer should declare a one-to-many dependency on the factory service. At runtime, all available factories will be injected and the consumer can go through them to create actual REngine instances.
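A rough sketch of that approach with the standard DS annotations (the factory id, the target filter and the class names are assumptions; ComponentFactory is shown in its non-generic form so it also works with older DS versions):

import java.util.List;
import org.osgi.service.component.ComponentFactory;
import org.osgi.service.component.ComponentInstance;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;

// Provider side: DS registers a ComponentFactory service for this component.
@Component(factory = "rengine.factory")
public class RemoteREngine implements REngine {
    // implementation of the calculation engine
}

// Consumer side: every available factory is injected; each newInstance() call
// creates a fresh REngine, which is released again via ComponentInstance.dispose().
@Component
public class RConsumer {

    @Reference(target = "(component.factory=rengine.factory)")
    private volatile List<ComponentFactory> factories;

    public REngine acquire() {
        ComponentInstance instance = factories.get(0).newInstance(null);
        return (REngine) instance.getInstance(); // keep 'instance' to dispose() it later
    }
}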

Two years ago I tried to create Genuine Service Factories, which later became Parameterized Services. However, after analysis it turned out that nothing was needed; just register the factory as the service.
However.
I do not know enough about your service, but it sounds very much like you could simplify things significantly by removing control from the client bundle: the client bundle should just use whatever REngine service is available in the service registry, maybe with a property signalling its use type if there are multiple bundles that need REngines and should not share the same REngine (which should rarely be the case).
If that model is possible, it usually simplifies things significantly. I generally then use DS with Configuration Admin configurations that drive the instances (one of the most useful aspects of DS, see http://www.aqute.biz/Bnd/Components). With the Metatype integration, you even get a user interface to edit your configuration properties.
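As an illustration of that pattern (the PID and property names below are made up), a single DS component class can be instantiated once per Configuration Admin factory configuration, and clients can then select an instance by property:

import java.util.Map;
import org.osgi.service.component.annotations.Activate;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.ConfigurationPolicy;
import org.osgi.service.component.annotations.Reference;

// One REngine service per factory configuration with this PID; the
// configuration properties become the service properties.
@Component(configurationPid = "org.example.rengine",
           configurationPolicy = ConfigurationPolicy.REQUIRE)
public class ConfiguredREngine implements REngine {

    @Activate
    void activate(Map<String, Object> config) {
        // e.g. read "remote.host"/"remote.port" or a "local" flag from the configuration
    }
}

// A client that only wants engines marked for a particular use type.
@Component
public class SimulationClient {

    @Reference(target = "(use.type=simulation)")
    private REngine engine;
}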

One solution would be to register the REngineFactory as the service rather than the REngine implementation itself and return the factory from the getService method. That way the clients can look up the factory and, on successfully finding one, use it to get a new REngine implementation.
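In code, that could look something like the following sketch (REngineFactory and MyREngineFactory are placeholder names; REngineFactory here is an ordinary application interface, not the OSGi ServiceFactory):

// Shared API: the factory itself is the registered service.
public interface REngineFactory {
    // Throws an exception describing why creation failed, so the client can know the reason.
    REngine createEngine() throws Exception;
}

// Provider bundle: register its factory implementation once.
public class Activator implements BundleActivator {
    public void start(BundleContext ctx) {
        ctx.registerService(REngineFactory.class.getName(), new MyREngineFactory(), null);
    }
    public void stop(BundleContext ctx) {
    }
}

// Client bundle: pick a factory and create as many engines as needed.
REngineFactory factory = bundleContext.getService(
        bundleContext.getServiceReference(REngineFactory.class));
REngine engine = factory.createEngine();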


Find out which bundle calls service

In equinox OSGi, I am using a service (DS) from several different bundles and I would like to know in the service who is using it, each time.
The service write to the database, and I want to know which bundle writes what.
The built-in LogService must be able to do this, since it knows who wrote each log line, but I can't figure out how.
The simplest solution is to pass the name of the bundle in each method call to the service, but I hope for a more elegant solution.
This is precisely what a ServiceFactory is for, see OSGi Core R6 specification Section 5.9. "Service Factory".
Updated below after the question was clarified to specify DS usage.
This can be achieved by using scope = ServiceScope.BUNDLE on your @Component annotation. Then you can access the calling bundle by allowing the ComponentContext to be injected into your activate method and calling getUsingBundle(). For example:
@Component(scope = ServiceScope.BUNDLE)
public class MyComponent implements MyService {

    private Bundle usingBundle;

    @Activate
    void activate(ComponentContext context) {
        this.usingBundle = context.getUsingBundle();
    }

    // ...
}
At the low-level this works by registering the service as an instance of ServiceFactory rather than a plain service object. When OSGi obtains the service on behalf of a consumer, it invokes the getService method, which passes the consumer Bundle object to the provider of the service. This happens entirely transparently for the consumer, i.e. they don't have to change their code at all.
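For reference, the low-level equivalent without DS is to register a ServiceFactory yourself (the service and implementation names below are placeholders):

// OSGi calls getService once per consuming bundle, handing in that bundle,
// so the implementation can remember who is using it.
context.registerService(MyService.class.getName(), new ServiceFactory<MyService>() {

    @Override
    public MyService getService(Bundle bundle, ServiceRegistration<MyService> registration) {
        return new MyServiceImpl(bundle); // the consumer bundle is known here
    }

    @Override
    public void ungetService(Bundle bundle, ServiceRegistration<MyService> registration,
            MyService service) {
        // clean up the per-bundle instance if necessary
    }
}, null);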
Add a bundle name or a Bundle object to the object you pass from service clients to the service. By doing so, you can find out the client for each service call.
You mentioned LogService. It uses a LogEntry class to pass a Bundle object with the log message to the service.

Dropwizard registering two classes/clients

I have two instances of clients with different configs that I am creating (timeout, threadpool, etc...), and would like to leverage Dropwizard's metric on both of the clients.
final JerseyClientBuilder jerseyClientBuilder = new JerseyClientBuilder(environment)
        .using(configuration.getJerseyClientConfiguration());
final Client config1Client = jerseyClientBuilder.build("config1Client");
environment.jersey().register(config1Client);
final Client config2Client = jerseyClientBuilder.build("config2Client");
environment.jersey().register(config2Client);
However, I am getting
org.glassfish.jersey.internal.Errors: The following warnings have been detected:
HINT: Cannot create new registration for component type class org.glassfish.jersey.client.JerseyClient:
Existing previous registration found for the type.
And only one client's metric shows up.
How do I track both clients' metrics or is it not common to have 2 clients in a single dropwizard app?
Never mind, turned out I was an idiot (for trying to save some resource on the ClientBuilder).
2 Things that I did wrong with my original code:
1. You don't need to register Jersey clients; registering the resource is enough. Somehow I missed the resource part in my code and was just straight up trying to register the client.
2. You need to explicitly build each individually configured client from its own JerseyClientBuilder; Dropwizard will then record metrics for each of them.
In the end, I just had to change my code to the following:
final Client config1Client = new JerseyClientBuilder(environment)
        .using(configuration.getJerseyClientConfiguration()).build("config1Client");
final Client config2Client = new JerseyClientBuilder(environment)
        .using(configuration.getJerseyClientConfiguration()).build("config2Client");
Doh.
environment.jersey().register() has a javadoc description of "Adds the given object as a Jersey singleton component", meaning that the objects registered become part of the Jersey dependency injection framework. Specifically, this method is used to add resource classes to the Jersey context, but any object with an annotation or type that Jersey looks for can be added this way. Additionally, since they are singletons, you can only have one of them per concrete type (which is why you are getting a "previous registration" error from Jersey).
I imagine that you want to have two Jersey clients to connect to two different external services via REST/HTTP. Since your service needs to talk to these others to do its work, you'll want to have the clients accessible wherever the "work" or business logic is being performed.
For example, this guide creates a resource class that requires a client to an external http service to do currency conversions. I'm not saying this is a great example (just a top google result for "dropwizard external client example"). In fact, I think this is not a good way to structure your application. I'd create several internal objects that hide from the resource class how the currency information is fetched, like a business object (BO) or data access object (DAO), etc.
For your case, you might want something like this (think of these as constructor calls). JC = jersey client, R = resource object, BO = business logic object
JC1()
JC2()
B1(JC1)
B2(JC2)
R1(B1)
R2(B2)
R3(B1, B2)
environment.jersey().register(R1)
environment.jersey().register(R2)
environment.jersey().register(R3)
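Concretely, inside your Application#run method that wiring could look something like this (the resource and business-object class names are placeholders):

// Build the two differently configured clients, hand them to business objects,
// and register only the resources with Jersey.
final Client config1Client = new JerseyClientBuilder(environment)
        .using(configuration.getJerseyClientConfiguration()).build("config1Client");
final Client config2Client = new JerseyClientBuilder(environment)
        .using(configuration.getJerseyClientConfiguration()).build("config2Client");

final ServiceOneBO b1 = new ServiceOneBO(config1Client); // placeholder business object
final ServiceTwoBO b2 = new ServiceTwoBO(config2Client); // placeholder business object

environment.jersey().register(new ResourceOne(b1));
environment.jersey().register(new ResourceTwo(b2));
environment.jersey().register(new CombinedResource(b1, b2));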
The official Dropwizard docs are somewhat helpful. They at least explain how to create a jersey client; they don't explain how to structure your application.
If you're using the Jersey client builder from dropwizard, each of the clients that you create should be automatically registered to record metrics. Make sure you're using the client builder from the dropwizard-client artifact and package io.dropwizard.client. (Looks like you are because you have the using(config) method.)

How to programmatically register a service in the SCR using OSGi standard features?

I've spent some time experimenting with and studying the OSGi enRoute site. The Quick Start, and Base tutorials were really good. Now as a learning exercise, I'm creating my own example following the principles in those tutorials.
I've decided to reproduce the StageService from the blog post "Making JavaFX better with OSGi". Rather than using the org.apache.felix.dm and org.apache.felix.dm.annotation.api packages I want to use the OSGi standard SCR packages (org.osgi.service.component.*) along with the enRoute provider template.
So far everything has worked out nicely. But I'm stuck on one point. In the "Making JavaFX better with OSGi" tutorial the service is programmatically registered into the service registry using the org.apache.felix.dm.DependencyManager like this:
@Override
public void start(Stage primaryStage) throws Exception {
    BundleContext bc = FrameworkUtil.getBundle(this.getClass()).getBundleContext();
    DependencyManager dm = new DependencyManager(bc);
    dm.add(dm.createComponent()
        .setInterface(StageService.class.getName(), null)
        .setImplementation(new StageServiceImpl(primaryStage)));
}
My assumption is that in this example the DependencyManager is an Apache Felix specific feature rather than an OSGi standard. I would like to have my enRoute provider depend only on OSGi standard features.
So my question is simply:
How would one register a service in the service registry programmatically using only OSGi standard features? (I know from following the enRoute tutorials that if my component implements the exported service that SCR will automatically register my component in the service registry when my component is activated. The problem with this solution though is that when my component is activated it has to launch the JavaFX application in a different thread so as to not block the thread in use by the SCR until the JavaFX application terminates. Because of this, my component must programmatically register the service in the service registry. Otherwise it won't be guaranteed to be available upon registration.)
For reference, here is what I currently have:
private void registerService(Stage stage) {
    DependencyManager dm = new DependencyManager(bundle().getBundleContext());
    dm.add(
        dm.createComponent()
            .setInterface(StageService.class.getName(), null)
            .setImplementation(new StageServiceImpl(stage))
    );
}
But instead I want to replace it with this:
private void registerService(Stage stage) {
// How to register service in service registry using only OSGi standard features? (not the apache felix dependency manager)
}
UPDATE 1
Following BJ Hargrave's recommendation I tried to register the service directly from the bundle context as follows:
FrameworkUtil
.getBundle(getClass())
.getBundleContext()
.registerService(StageService.class, new StageServiceImpl(primaryStage), null);
After doing this and trying to resolve the enRoute application project the following error occurs:
org.osgi.service.resolver.ResolutionException: Unable to resolve
<> version=null: missing requirement
com.github.axiopisty.osgi.javafx.launcher.application
-> Unable to resolve com.github.axiopisty.osgi.javafx.launcher.application
version=1.0.0.201608172037: missing requirement
objectClass=com.github.axiopisty.osgi.javafx.launcher.api.StageService]
I have uploaded the project to github so you can reproduce the error.
Update 2
The build tab in the bnd.bnd file in the provider module shows the following warning:
The servicefactory:=true directive is set but no service is provided, ignoring it
Might this have something to do with the application module not being able to be resolved?
In rare cases it is necessary to register a 'service by hand' using the standard OSGi API. Try very hard to avoid this case because if you start to register (and maybe depend) on services that you manually register you get a lot of responsibility that is normally hidden from view. For example, you have to ensure that the services you register are also unregistered.
One of the rare cases where this is necessary is when you have to wait for a condition before you can register your service. For example, you need to poll a piece of hardware before you register a service for the device. You will need to control the CPU but at that moment you cannot yet register a service. In that case you create an immediate component and register the service manually.
To register a service manually you require a BundleContext object. You can get that object via the activate method; just declare a BundleContext in its arguments and it is automatically injected:
@Activate
void activate(BundleContext context) {
    this.context = context;
}
You can now register a service with the bundle context:
void register(MyService service) {
    Hashtable<String, Object> properties = new Hashtable<>();
    properties.put("foo", "bar");
    this.registration = context.registerService(MyService.class, service, properties);
}
However, you now have the responsibility to unregister this service in your deactivate. If you do not clean up this service then your component might be deactivated while your service still floats around. Your service is unmanaged. (Although when the bundle is stopped it will be cleaned up.)
@Deactivate
void deactivate() {
    if (this.registration != null)
        this.registration.unregister();
}
If you create the service in a callback or background thread then you obviously have to handle the concurrency issues. You must ensure that there is no race condition where you register a service after the deactivate method has already finished.
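One way to guard against that race (a sketch extending the register/deactivate methods above, not the only possible approach) is to protect the registration and a deactivation flag with a common lock:

// Ensure a registration coming in from a background thread cannot
// survive after deactivate() has already run.
private final Object lock = new Object();
private ServiceRegistration<MyService> registration;
private boolean deactivated;

void register(MyService service) {
    synchronized (lock) {
        if (deactivated) {
            return; // too late, the component is gone; do not register
        }
        registration = context.registerService(MyService.class, service, null);
    }
}

@Deactivate
void deactivate() {
    synchronized (lock) {
        deactivated = true;
        if (registration != null) {
            registration.unregister();
            registration = null;
        }
    }
}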
This text has also been added to the DS Page of OSGi enRoute
Reading the OSGi spec would help you understand the service API.
But this should do it:
ServiceRegistration<StageService> reg = bc.registerService(StageService.class, new StageServiceImpl(primaryStage), null);

Service Provider Interface without the Provider

I am reading Bloch's Effective Java book [1] and came across the following example of an SPI:
//Service interface
public interface Service {
    //Service specific methods here
}

//Service provider interface
public interface Provider {
    Service newService();
}

//Class for service registration and access
public class Services {
    private Services() {}

    private static final Map<String, Provider> providers =
        new ConcurrentHashMap<String, Provider>();
    public static final String DEFAULT_PROVIDER_NAME = "<def>";

    //Registration
    public static void registerDefaultProvider(Provider p) {
        registerProvider(DEFAULT_PROVIDER_NAME, p);
    }
    public static void registerProvider(String name, Provider p) {
        providers.put(name, p);
    }

    //Access
    public static Service newInstance() {
        return newInstance(DEFAULT_PROVIDER_NAME);
    }
    public static Service newInstance(String name) {
        // you get the point: look up the provider in the map by name
        // and return provider.newService()
        Provider p = providers.get(name);
        if (p == null)
            throw new IllegalArgumentException("No provider registered with name: " + name);
        return p.newService();
    }
}
This is my question: why is the Provider interface necessary? Couldn't we have just as easily registered the Service(s) themselves - e.g. maintain a map of the Service implementations and then return the instance when looked up? Why the extra layer of abstraction?
Perhaps this example is just too generic - any "better" example to illustrate the point would be great too.
[1] Second edition, Chapter 2. The first edition example does not refer to the Service Provider Interfaces.
Why is the Provider interface necessary? Couldn't we have just as easily registered the Service(s) themselves - e.g. maintain a map of the Service implementations and then return the instance when looked up?
As others have stated, the purpose of a Provider is to have an AbstractFactory that can make Service instances. You don't always want to keep a reference to all the Service implementations because they might be short lived and/or might not be reusable after they have been executed.
But what, then, is the purpose of the provider, and how can you use a "provider registration API" if you don't have a provider?
One of the most powerful reasons to have a Provider interface is so you DON'T need to have an implementation at compile time. Users of your API can add their own implementations later.
Let's use JDBC as an example like Ajay used in another answer but let's take it a little further:
There are many different types of Databases and database vendors who all have slightly different ways of managing and implementing databases (and perhaps how to query them). The creators of Java can't possibly create implementations of all these different possible ways for many reasons:
When Java was first written, many of these database companies or systems didn't exist yet.
Not all these database vendors are open source so the creators of Java couldn't know how to communicate with them even if they wanted to.
Users might want to write their own custom database
So how do you solve this? By using a Service Provider.
The Driver interface is the Provider. It provides methods for interacting with a particular vendor's databases. One of the methods in Driver is a factory method to make a Connection instance (which is the Service) to the database, given a URL and other properties (like user name and password, etc.).
Each Database vendor writes their own Driver implementation for how to communicate with their own database system. These aren't included in the JDK; you must go to the company websites or some other code repository and download them as a separate jar.
To use these drivers, you must add the jar to your classpath and then use the JDK DriverManager class to register the driver.
The DriverManager class is the Service Registration.
The DriverManager class has a method registerDriver(Driver) that is used to register a Driver instance in the Service Registration so it can be used. By convention, most Driver implementations register at class loading time so all you have to do in your code is write
Class.forName("foo.bar.Driver");
to register the Driver for vendor "foo.bar" (assuming you have the jar with that class on your classpath).
Once the Database Drivers are registered, you can get a Service implementation instance that is connected to your database.
For example, if you had a mysql database on your local machine named "test" and you had a user account with username "monty" and password "greatsqldb" then you can create a Service implementation like this :
Connection conn =
DriverManager.getConnection("jdbc:mysql://localhost/test?" +
"user=monty&password=greatsqldb");
The DriverManager class sees the String you passed in and finds the registered driver that can understand what that means. (This is actually done using the Chain of Responsibility pattern, by going through all the registered Drivers and invoking their Driver.acceptsURL(String) method until the url gets accepted.)
Notice that there is no mysql-specific code in the JDK. All you had to do is register a Driver of some vendor and then pass a properly formatted String to the Service Provider. If we later decide to use a different database vendor (like Oracle or Sybase) then we just swap jars and modify our connection string. The code in the DriverManager does not change.
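For example, keeping the driver class and connection details in a properties file keeps the code vendor neutral (the file name and property keys below are made up):

// Load the vendor-specific details from configuration instead of hardcoding them.
// (Checked exceptions omitted for brevity.)
Properties props = new Properties();
try (InputStream in = new FileInputStream("db.properties")) {
    props.load(in);
}
Class.forName(props.getProperty("jdbc.driver"));   // e.g. com.mysql.jdbc.Driver
Connection conn = DriverManager.getConnection(
        props.getProperty("jdbc.url"),             // e.g. jdbc:mysql://localhost/test
        props.getProperty("jdbc.user"),
        props.getProperty("jdbc.password"));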
Why didn't we just make a connection once and keep it around? Why do we need the Service?
We might want connect/disconnect after each operation. Or we might want to keep the connection around longer. Having the Service allows us to create new connections whenever we want and does not preclude us from keeping a reference to it to re-use later.
This is a very powerful concept and is used by frameworks to allow many possible permutations and extensions without cluttering the core codebase.
EDIT
Working with multiple Providers and Providers that provide multiple Services:
There is nothing stopping you from having multiple Providers. You can connect to multiple databases created using different database vendor software at the same time. You can also connect to multiple databases produced by the same vendor at the same time.
Multiple services - Some Providers may even provide different Service implementations depending on the connection url. For example, H2 can create both file-system-based and in-memory databases. The way to tell H2 which one you want is a different url format. I haven't looked at the H2 code, but I assume the file-based and the in-memory databases are different Service implementations.
Why doesn't the DriverManager just manage Connections and Oracle could implement the OracleConnectionWrapper? No providers!
That would also require you to know that you have an Oracle connection. That is very tight coupling and I would have to change lots of code if I ever changed vendors.
The Service Registration just takes a String. Remember that it uses the Chain of Responsibility to find the first registered Provider that knows how to handle the url. The application can be vendor neutral, and it can get the connection url and Driver class name from a property file. That way I don't have to recompile my code if I change vendors. However, if I hardcoded references to "OracleConnectionWrapper" and then changed vendors, I would have to rewrite portions of my code and then recompile.
There is nothing preventing someone from supporting multiple database vendor url formats if they want. So I can make a GenericDriver that could handle mysql and oracle if I wanted to.
If you might need more than one service of each type, you can't just reuse the old Services. (Additionally, tests and the like might want to create fresh services for each test, rather than reusing services that might have been modified or updated by previous tests.)
I think the answer is mentioned in Effective Java along with an example.
An optional fourth component of a service provider framework is a
service provider interface, which providers implement to create
instances of their service implementation. In the absence of a service
provider interface, implementations are registered by class name and
instantiated reflectively (Item 53).
In the case of JDBC,
Connection plays the part of the service interface,
DriverManager.registerDriver is the provider registration API, DriverManager.getConnection is the service access API, and
Driver is the service provider interface.
So, as you have correctly noted, it is not a must to have the Provider interface; it is just a slightly cleaner approach.
So it seems like you can have multiple Providers for the same Service, and based on a specific Provider name you may get different instances of the same Service. So I would say each Provider is kind of like a factory that creates the service appropriately.
For example, suppose class PaymentService implements Service and it requires a Gateway. You have PayPal and Chase gateways that deal with those payment processors. Now you create a PayPalProvider and a ChaseProvider, each of which knows how to create the correct PaymentService instance with the right gateway.
But I agree, seems contrived.
As a synthesis of the other answers (the fourth component is the textual reason), I think this is to limit compilation dependencies. With the SPI, you have all the tools to avoid an explicit reference to the implementation:
The META-INF/services/ directory contains files mentioning the available service provider implementations
The ServiceLoader standard class allows resolution of the available implementation names and, by the way, dynamic construction [1].
The SPI was not mentioned in the first edition. It was perhaps not the right place to include it in an item about static factories. The DriverManager mentioned in the text is a hint, but Bloch does not go into depth. In a way, the platform implements a kind of ServiceLocator pattern to reduce compilation dependencies, depending on the environment. With an SPI in your abstract factory, it becomes the ServiceFactory of a ServiceLocator, with the help of the ServiceLoader for modularity.
The ServiceLoader iterator could be used to dynamically populate the services map of the example.
[1] In an OSGi environment, this is a subtle operation.
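A small sketch of that idea: each provider jar ships a META-INF/services/ file naming its implementation class, and the registration class can load and register them at startup (Services.registerProvider is the method from the example above):

// Discover every Provider implementation on the classpath and register it.
ServiceLoader<Provider> loader = ServiceLoader.load(Provider.class);
for (Provider provider : loader) {
    Services.registerProvider(provider.getClass().getName(), provider);
}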
Service Provider Interface without a provider
Let's see how it would look without a provider.
//Service interface
public interface Service {
    //Service specific methods here
}

//Class for service registration and access
public class Services {
    private Services() {}

    private static final Map<String, Service> services =
        new ConcurrentHashMap<String, Service>();
    public static final String DEFAULT_SERVICE_NAME = "<def>";

    //Registration
    public static void registerDefaultService(Service s) {
        registerService(DEFAULT_SERVICE_NAME, s);
    }
    public static void registerService(String name, Service s) {
        services.put(name, s);
    }

    //Access
    public static Service getInstance() {
        return getInstance(DEFAULT_SERVICE_NAME);
    }
    public static Service getInstance(String name) {
        // you get the point: look up the service in the map by name
        // and return it
        return services.get(name);
    }
}
As you see, it's possible to create a Service Provider Interface without a Provider interface. Callers of #getInstance(..) eventually wouldn't notice a difference.
Then why do we need a provider?
The Provider interface is an Abstract Factory and Services#newInstance(String) is a Factory Method. Both design patterns have the advantage that they decouple service implementation from service registration.
Single responsibility principle
Instead of implementing the service instantiation in a startup event handler which registers all services, you create one provider per service. This makes it loosely coupled and easier to refactor, because the Service and its Service Provider can be kept close to each other, for example in a separate JAR file.
"Factory methods are common in toolkits and frameworks, where library code needs to create objects of types that may be subclassed by applications using the framework." [1]
Lifetime management:
You might have noticed in the code above without providers that we're registering service instances instead of providers, where a provider could decide when to instantiate a new service instance.
This approach has some disadvantages:
1. Service instances have to be created before the first service call. Lazy initialization isn't possible. This will delay startup and bind resources to services which are rarely or never used.
1b. You "cannot" close services after usage, because there is no way to re-instantiate them. (With a provider you could design the service interface in a way that the caller has to call #close(), which informs the provider, and the provider decides whether to keep or finalize the service instance.)
2. All callers will use the same service instance, therefore you have to make sure that it's thread-safe. But making it thread-safe will make it slower. In contrast, a provider might choose to create a couple of service instances to reduce contention.
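As a small sketch of the difference (the class names are made up): with a provider, nothing is instantiated until a caller actually asks for a service, and every caller can get its own instance:

// DatabaseServiceProvider is a hypothetical provider; DatabaseService is a
// hypothetical Service implementation. Registered once at startup, but no
// service instance exists until newInstance() is called.
public class DatabaseServiceProvider implements Provider {
    @Override
    public Service newService() {
        return new DatabaseService(); // fresh instance, created on demand
    }
}

Services.registerProvider("database", new DatabaseServiceProvider());

// Each caller gets its own instance, so shared mutable state is not forced on them.
Service service = Services.newInstance("database");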
Conclusion
A provider interface isn't required, but it encapsulates service-specific instantiation logic and optimizes resource allocation.

Using EJBs with regular java classes. Trying to instantiate a stateless EJB

I have a large web project in Java EE 6; so far everything is working great.
Now I'm adding a new class that takes Twitter information and returns a string. So far the strings have been extracted from the JSON returned by Twitter and are ready to be persisted in my database. My problem is I'm not sure how to pass information to the EJB that normally handles all of my database calls. I'm using JPA and have a DAO class that manages all database access. I already have a method there for updateDatabase(String). I'd like to be able to call updateDatabase(String) from the class that has the strings to add, but I don't know if it's good form to instantiate a stateless bean like that. Normally you inject beans and then just call their methods on the injected reference. I could also maybe try to reference the Twitter string-generating class from inside the EJB, but then I'd have to instantiate it there and mess with main() method calls for execution. I'm not really sure how to do this. Right now my Twitter-consuming class is just a POJO with a main method. For some reason some of the library methods did not work outside of main; in fact the IOUtils API directly says "Instances should NOT be constructed in standard programming".
So on a higher level, bottom line, I'm just asking how POJOs are normally "mixed" into a Java EE project where most of your classes are EJBs and servlets.
Edit: the above seems confusing to me after rereading, so I'll try to simplify it. Basically I have a class with a main method. I'd like to call my EJB class that handles database access and call its updateDatabase(String) method and just pass in the string. How should I do this?
Edit: So it looks like a JNDI lookup and subsequent reference is the preferred way to do this rather than instantiating the EJB directly?
Edit: these classes are all in the same web project, in the same package. I could inject one or convert the POJO to an EJB. However the POJO does have a main method and some of the library classes do not like to be instantiated, so running it in main seems like the best option.
My main code:
public class Driver {

    @EJB
    static RSSbean rssbean;

    public static void main(String[] args) throws Exception {
        System.setProperty("http.proxyHost", "proxya..com");
        System.setProperty("http.proxyPort", "8080");
        /////////////auth code///////////////auth code/////////////////
        String username = System.getProperty("proxy.authentication.username");
        String password = System.getProperty("proxy.authentication.password");
        if (username == null) {
            Authenticator.setDefault(new ProxyAuthenticator("", ""));
        }
        ///////////////end auth code/////////////////////////////////end
        URL twitterSource = new URL("http://search.twitter.com/search.json?q=google");
        ByteArrayOutputStream urlOutputStream = new ByteArrayOutputStream();
        IOUtils.copy(twitterSource.openStream(), urlOutputStream);
        String urlContents = urlOutputStream.toString();
        JSONObject thisobject = new JSONObject(urlContents);
        JSONArray names = thisobject.names();
        JSONArray asArray = thisobject.toJSONArray(names);
        JSONArray resultsArray = thisobject.getJSONArray("results");
        JSONObject(urlContents.substring(urlContents.indexOf('s')));
        JSONObject jsonObject = resultsArray.getJSONObject(0);
        String twitterText = jsonObject.getString("text");
        rssbean.updateDatabase("twitterText");
    }
}
I'm also getting a java.lang.NullPointerException somewhere around rssbean.updateDatabase("twitterText");
You should use the InitialContext#lookup method to obtain an EJB reference from the application server.
For example:
@Stateless(name = "myEJB")
public class MyEJB {
    public void ejbMethod() {
        // business logic
    }
}

public class TestEJB {
    public static void main(String[] args) throws NamingException {
        MyEJB ejbRef = (MyEJB) new InitialContext().lookup("java:comp/env/myEJB");
        ejbRef.ejbMethod();
    }
}
However, note that the ejb name used for lookup may be vendor-specific. Also, EJB 3.1 introduces the idea of portable JNDI names which should work for every application server.
Use the POJO as a stateless EJB; there's nothing wrong with that approach.
From the wikipedia: EJB is a server-side model that encapsulates the business logic of an application.
Your POJO class consumes a web service, so it performs a business logic for you.
EDIT > Upon reading your comment, are you trying to access an EJB from outside of the Java EE container? Because if not, then you can inject your EJB into another EJB (they HAVE to be Stateless, both of them)
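If everything runs inside the container, that injection could look roughly like this (the bean and method names are placeholders; RSSbean is the database-access EJB from the question):

import javax.ejb.EJB;
import javax.ejb.Stateless;

// The Twitter-consuming logic becomes a stateless bean itself, so the
// container can inject the database EJB instead of leaving it null.
@Stateless
public class TwitterFetcherBean {

    @EJB
    private RSSbean rssbean; // injected by the container

    public void fetchAndStore() {
        String twitterText = fetchLatestTweetText(); // placeholder for the JSON-parsing code
        rssbean.updateDatabase(twitterText);
    }

    private String fetchLatestTweetText() {
        // same URL/JSON handling as in the original main() method
        return "";
    }
}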
If you have a stand alone program that wishes to access an EJB you have a couple of options.
One is to simply use JNDI to look up the EJB. The EJB must have a Remote interface, and you need to configure the JNDI part for your container, as well as include any container-specific jars in your stand alone application.
Another technique is to use the Java EE artifact known as the "application client". Here, there is a container-provided wrapper for your class, but it provides a run-time environment very similar to running the class within the container; notably you get things like EJB injection.
Your app still runs in a separate JVM, so you still need to reference Remote EJBs, but the app client container handles a bunch of the boilerplate in getting your app connected to the server. This, too, while a Java EE artifact, is also container dependent in how you configure and launch an app client application.
Finally, there is basically little difference in how a POJO interacts with the EJB container this way in contrast to a POJO deployed within the container. The interface is still a matter of getting the EJB injected (more easily done in Java EE 6 than before) or looking up a reference via JNDI. The only significant difference is that a POJO deployed in the container can use a Local interface instead of the Remote one.
