I am trying to create a new InitialContext in the following manner (which I believe is fairly standard):
private static InitialContext getInitialContext() throws NamingException {
    InitialContext context = null;
    try {
        Properties properties = new Properties();
        properties.put(Context.URL_PKG_PREFIXES, "org.jboss.ejb.client.naming");
        properties.put(Context.INITIAL_CONTEXT_FACTORY, "org.jboss.naming.remote.client.InitialContextFactory");
        properties.put(Context.PROVIDER_URL, "remote://localhost:4447");
        properties.put(Context.SECURITY_PRINCIPAL, "username");
        properties.put(Context.SECURITY_CREDENTIALS, "password");
        context = new InitialContext(properties);
        System.out.println("\n\tGot initial Context: " + context);
    } catch (Exception e) {
        e.printStackTrace();
    }
    return context;
}

public static void sendMessage(RoboticsParameters object_msg) throws Exception {
    InitialContext context = getInitialContext();
    // other code
}
The code "fails" at the line where the new InitialContext is created using the properties and i get a java.lang.NullPointerException. I suspect I am missing an argument. Here is the stack trace:
WARN: EJB client integration will not be available due to a problem setting up the EJB client handler java.lang.NullPointerException
at org.jboss.naming.remote.client.InitialContextFactory.<clinit>(InitialContextFactory.java:118)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at com.sun.naming.internal.VersionHelper12.loadClass(VersionHelper12.java:72)
at com.sun.naming.internal.VersionHelper12.loadClass(VersionHelper12.java:61)
at javax.naming.spi.NamingManager.getInitialContext(NamingManager.java:672)
at javax.naming.InitialContext.getDefaultInitCtx(InitialContext.java:313)
at javax.naming.InitialContext.init(InitialContext.java:244)
at javax.naming.InitialContext.<init>(InitialContext.java:216)
Any suggestions?
I am running JBoss EAP 6.4 and using EJB 3. I have jboss-client.jar in the class path.
I checked the source code for:
jboss-remote-naming/src/main/java/org/jboss/naming/remote/client/InitialContextFactory.java
and found where the log message was coming from:
public class InitialContextFactory implements javax.naming.spi.InitialContextFactory {
    // code
    private static final String REMOTE_NAMING_EJB_CLIENT_HANDLER_CLASS_NAME =
            "org.jboss.naming.remote.client.ejb.RemoteNamingStoreEJBClientHandler";
    // code
    try {
        klass = classLoader.loadClass(REMOTE_NAMING_EJB_CLIENT_HANDLER_CLASS_NAME);
        method = klass.getMethod("setupEJBClientContext", new Class<?>[] {Properties.class, List.class});
    } catch (Throwable t) {
        logger.warn("EJB client integration will not be available due to a problem setting up the EJB client handler", t);
    }
    // other code
}
The class org.jboss.naming.remote.client.ejb.RemoteNamingStoreEJBClientHandler was present in the jar I had added to the classpath, but for some reason it could not be loaded.
Then I stumbled upon this small README-EJB-JMS.txt file in the [jboss_home]/bin/client folder which states the following:
"Maven users should not use this jar, but should use the following BOM dependencies instead
<dependencies>
<dependency>
<groupId>org.jboss.as</groupId>
<artifactId>jboss-as-ejb-client-bom</artifactId>
<type>pom</type>
</dependency>
<dependency>
<groupId>org.jboss.as</groupId>
<artifactId>jboss-as-jms-client-bom</artifactId>
<type>pom</type>
</dependency>
</dependencies>
This is because using maven with a shaded jar has a very high chance of causing class version conflicts, which is why
we do not publish this jar to the maven repository."
So I added the Maven dependencies instead of having the jar on my classpath, and voilà, it works!
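For completeness, once the context is created successfully, the lookup inside sendMessage() ends up looking roughly like the sketch below. The application/module/bean names and the RoboticsServiceRemote interface are placeholders I made up, not taken from the real project.

public static void sendMessage(RoboticsParameters object_msg) throws Exception {
    InitialContext context = getInitialContext();
    try {
        // EAP 6 "ejb:" names follow app/module/distinct-name/bean!remote-interface;
        // every name below is hypothetical.
        RoboticsServiceRemote service = (RoboticsServiceRemote) context.lookup(
                "ejb:myapp/myejbmodule//RoboticsServiceBean!com.example.RoboticsServiceRemote");
        service.send(object_msg);
    } finally {
        context.close();
    }
}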
Related
I'm running a Spark job from an EMR cluster which connects to Cassandra on EC2.
The following are the dependencies which I'm using for my project.
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.6.0</version>
</dependency>
<dependency>
<groupId>com.datastax.spark</groupId>
<artifactId>spark-cassandra-connector_2.10</artifactId>
<version>1.5.0-M1</version>
</dependency>
<dependency>
<groupId>com.datastax.cassandra</groupId>
<artifactId>cassandra-driver-core</artifactId>
<version>2.1.6</version>
</dependency>
<dependency>
<groupId>com.datastax.spark</groupId>
<artifactId>spark-cassandra-connector-java_2.10</artifactId>
<version>1.5.0-M3</version>
</dependency>
The issue that I'm facing here is that if I use cassandra-driver-core 3.0.0, I get the following error:
java.lang.ExceptionInInitializerError
at mobi.vserv.SparkAutomation.DriverTester.doTest(DriverTester.java:28)
at mobi.vserv.SparkAutomation.DriverTester.main(DriverTester.java:16)
Caused by: java.lang.IllegalStateException: Detected Guava issue #1635 which indicates that a version of Guava less than 16.01 is in use. This introduces codec resolution issues and potentially other incompatibility issues in the driver. Please upgrade to Guava 16.01 or later.
at com.datastax.driver.core.SanityChecks.checkGuava(SanityChecks.java:62)
at com.datastax.driver.core.SanityChecks.check(SanityChecks.java:36)
at com.datastax.driver.core.Cluster.<clinit>(Cluster.java:67)
... 2 more
I have also tried including Guava version 19.0.0, but I'm still unable to run the job, and when I downgrade to cassandra-driver-core 2.1.6 I get the following error:
com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (tried: /EMR PUBLIC IP:9042 (com.datastax.driver.core.TransportException: [/EMR PUBLIC IP:9042] Cannot connect))
at com.datastax.driver.core.ControlConnection.reconnectInternal(ControlConnection.java:223)
at com.datastax.driver.core.ControlConnection.connect(ControlConnection.java:78)
at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1272)
at com.datastax.driver.core.Cluster.init(Cluster.java:158)
at com.datastax.driver.core.Cluster.connect(Cluster.java:248)
Please note that I have tested my code locally and it runs absolutely fine. I have followed the different combinations of dependencies mentioned here: https://github.com/datastax/spark-cassandra-connector
Code:
public class App1 {

    private static Logger logger = LoggerFactory.getLogger(App1.class);

    static SparkConf conf = new SparkConf().setAppName("SparkAutomation").setMaster("yarn-cluster");
    static JavaSparkContext sc = null;
    static {
        sc = new JavaSparkContext(conf);
    }

    public static void main(String[] args) throws Exception {
        JavaRDD<String> Data = sc.textFile("S3 PATH TO GZ FILE/*.gz");
        JavaRDD<UserSetGet> usgRDD1 = Data.map(new ConverLineToUSerProfile());
        List<UserSetGet> t3 = usgRDD1.collect();
        for (int i = 0; i < t3.size(); i++) {   // note: was i <= t3.size(), which overruns the list
            try {
                phpcallone php = new phpcallone();
                php.sendRequest(t3.get(i));
            } catch (Exception e) {
                logger.error("This Has reached ====> " + e);
            }
        }
    }
}
public class phpcallone {

    private static Logger logger = LoggerFactory.getLogger(phpcallone.class);
    static String pid;

    public void sendRequest(UserSetGet usg) throws JSONException, IOException, InterruptedException {
        UpdateCassandra uc = new UpdateCassandra();
        try {
            uc.UpdateCsrd();
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
        }
    }
}
public class UpdateCassandra {

    public void UpdateCsrd() throws ClassNotFoundException {
        Cluster.Builder clusterBuilder = Cluster.builder()
                .addContactPoint("PUBLIC IP").withPort(9042)
                .withCredentials("username", "password");
        clusterBuilder.getConfiguration().getSocketOptions().setConnectTimeoutMillis(10000);
        try {
            Session session = clusterBuilder.build().connect("dmp");
            session.execute("USE dmp");
            System.out.println("Connection established");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Assuming that you are using EMR 4.1+, you can pass the Guava jar in via the --jars option of spark-submit. Then supply a configuration file to EMR so that user classpaths are used first.
For example, in a file setup.json
[
{
"Classification": "spark-defaults",
"Properties": {
"spark.driver.userClassPathFirst": "true",
"spark.executor.userClassPathFirst": "true"
}
}
]
You would supply the --configurations file://setup.json option to the aws emr create-cluster CLI command.
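If changing the cluster configuration is not an option, a rough alternative (my own sketch, not part of the answer above) is to set the executor-side flag from the driver code and still ship the newer Guava jar with --jars; the driver-side flag has to be supplied at submit time, since the driver JVM is already running by the time the SparkConf is built.

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class UserClassPathFirstSketch {
    public static void main(String[] args) {
        // Executors launched after the context is created will prefer the
        // user-supplied Guava jar (shipped via --jars) over the cluster's copy.
        SparkConf conf = new SparkConf()
                .setAppName("SparkAutomation")
                .set("spark.executor.userClassPathFirst", "true");
        // spark.driver.userClassPathFirst cannot be set here; pass it via
        // spark-submit --conf or the EMR classification shown above.
        JavaSparkContext sc = new JavaSparkContext(conf);
        sc.stop();
    }
}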
As per the Akka docs, under "Expecting Log Messages":
Be sure to exchange the default logger with the TestEventListener in
your application.conf to enable this function: akka.loggers =
[akka.testkit.TestEventListener]
So, this works well when I put it in application.conf. My test works fine:
@Test
public void testActorForNonExistentLocation() throws Exception {
    final Map<String, String> configValues = Collections.singletonMap("tenant.assetsLocation",
            "/non/existentLocation");
    final Config config = mergeConfig(configValues);
    System.out.println(config.getList("akka.loggers"));

    new JavaTestKit(system) {{
        assertEquals("system", system.name());
        final Props props = TenantMonitorActor.props(config);
        final ActorRef supervisor = system.actorOf(props, "supervisor");
        new EventFilter<Void>(DiskException.class) {
            @Override
            protected Void run() {
                supervisor.tell(new TenantMonitorMessage(), supervisor);
                return null;
            }
        }.from("akka://system/user/supervisor/diskMonitor").occurrences(1).exec();
    }};
}
Now when I try to run my application
public class Main {

    private Main() {
    }

    public static void main(final String[] args) {
        final Config config = ConfigFactory.load();
        final ActorSystem actorSystem = ActorSystem.create(config.getString("ec.name"));
        setUpMonitoring(actorSystem, config);
    }

    private static void setUpMonitoring(final ActorSystem system, final Config config) {
        final ActorRef tenantMonitorRef = system.actorOf(TenantMonitorActor.props(config),
                "tenantMonitor");
        tenantMonitorRef.tell(new TenantMonitorMessage(), tenantMonitorRef);
    }
}
I see errors like the following:
error while starting up loggers
akka.ConfigurationException: Logger specified in config can't be loaded [akka.testkit.TestEventListener] due to [java.lang.ClassNotFoundException: akka.testkit.TestEventListener]
at akka.event.LoggingBus$$anonfun$4$$anonfun$apply$1.applyOrElse(Logging.scala:116)
at akka.event.LoggingBus$$anonfun$4$$anonfun$apply$1.applyOrElse(Logging.scala:115)
at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
at scala.util.Failure$$anonfun$recover$1.apply(Try.scala:215)
at scala.util.Try$.apply(Try.scala:191)
at scala.util.Failure.recover(Try.scala:215)
at akka.event.LoggingBus$$anonfun$4.apply(Logging.scala:115)
at akka.event.LoggingBus$$anonfun$4.apply(Logging.scala:110)
at scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:728)
at scala.collection.Iterator$class.foreach(Iterator.scala:750)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1202)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:727)
at akka.event.LoggingBus$class.startDefaultLoggers(Logging.scala:110)
at akka.event.EventStream.startDefaultLoggers(EventStream.scala:26)
at akka.actor.LocalActorRefProvider.init(ActorRefProvider.scala:622)
at akka.actor.ActorSystemImpl.liftedTree2$1(ActorSystem.scala:619)
at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:616)
at akka.actor.ActorSystemImpl._start(ActorSystem.scala:616)
at akka.actor.ActorSystemImpl.start(ActorSystem.scala:633)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:142)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:109)
at akka.actor.ActorSystem$.create(ActorSystem.scala:57)
at akka.actor.ActorSystem.create(ActorSystem.scala)
at Main.main(Main.java:16)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
Caused by: java.lang.ClassNotFoundException: akka.testkit.TestEventListener
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at akka.actor.ReflectiveDynamicAccess$$anonfun$getClassFor$1.apply(DynamicAccess.scala:67)
at akka.actor.ReflectiveDynamicAccess$$anonfun$getClassFor$1.apply(DynamicAccess.scala:66)
at scala.util.Try$.apply(Try.scala:191)
at akka.actor.ReflectiveDynamicAccess.getClassFor(DynamicAccess.scala:66)
at akka.event.LoggingBus$$anonfun$4.apply(Logging.scala:113)
... 24 more
Exception in thread "main" akka.ConfigurationException: Could not start logger due to [akka.ConfigurationException: Logger specified in config can't be loaded [akka.testkit.TestEventListener] due to [java.lang.ClassNotFoundException: akka.testkit.TestEventListener]]
at akka.event.LoggingBus$class.startDefaultLoggers(Logging.scala:144)
at akka.event.EventStream.startDefaultLoggers(EventStream.scala:26)
at akka.actor.LocalActorRefProvider.init(ActorRefProvider.scala:622)
at akka.actor.ActorSystemImpl.liftedTree2$1(ActorSystem.scala:619)
at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:616)
at akka.actor.ActorSystemImpl._start(ActorSystem.scala:616)
at akka.actor.ActorSystemImpl.start(ActorSystem.scala:633)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:142)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:109)
at akka.actor.ActorSystem$.create(ActorSystem.scala:57)
at akka.actor.ActorSystem.create(ActorSystem.scala)
at Main.main(Main.java:16)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
My application.conf looks like
akka {
  event-handlers = ["akka.event.slf4j.Slf4jEventHandler"]
  loglevel = "INFO"
  loggers = [akka.testkit.TestEventListener]
}
ec {
  name = "Connector"
}
tenant {
  assetsLocation: /Users
}
monitoring {
  tenant.disk.schedule.seconds: 2
  tenant.disk.threshold.percent: 80
}
and I have the dependencies declared as well:
<dependencies>
<dependency>
<groupId>com.typesafe.akka</groupId>
<artifactId>akka-actor_2.11</artifactId>
<version>${akka-actor_2.11.version}</version>
</dependency>
<dependency>
<groupId>com.typesafe.akka</groupId>
<artifactId>akka-testkit_2.11</artifactId>
<version>${akka-testkit_2.11.version}</version>
</dependency>
</dependencies>
and with versions
<akka-actor_2.11.version>2.3.9</akka-actor_2.11.version>
<akka-testkit_2.11.version>2.3.10</akka-testkit_2.11.version>
Where is the bug/inconsistency?
This may not be a bug at all, just a difference in understanding.
We use Maven scopes to decide which dependencies should be available and when. Usually with tests, we use <scope>test</scope> to make a dependency available only during tests and not in the final application jar when bundled.
Now if we make use of this kind of test (Expecting Log Messages), we need to remove <scope>test</scope> to let the application run correctly and the tests pass at the same time. This means that the akka-testkit dependency needs to be present in the final jar as well, since application.conf needs it to load akka.testkit.TestEventListener.
If this is all correct, it couples production and test code together, and is required in the current setup.
Again, I may have missed something very basic here, but the only way I could run the application and have the tests pass is by bundling the akka-actor and akka-testkit dependencies together.
What are your thoughts?
Add an application.conf in your test resources:
src/test/resources/application.conf
Contents could look like:
include "../../main/resources/application"
akka.loggers = [akka.testkit.TestEventListener]
This application.conf will be loaded only during tests, so you can scope your akka-testkit dependency to test only.
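If keeping akka-testkit at test scope is the priority and a second application.conf feels heavy, a variant of the same idea (my sketch, not part of the answer above) is to add the listener programmatically in test code only, so production config never mentions it:

import akka.actor.ActorSystem;
import com.typesafe.config.Config;
import com.typesafe.config.ConfigFactory;

public class TestActorSystemFactory {
    // Builds an ActorSystem for tests: the regular application.conf is loaded
    // and only akka.loggers is overridden with the TestEventListener.
    public static ActorSystem create() {
        Config testConfig = ConfigFactory
                .parseString("akka.loggers = [\"akka.testkit.TestEventListener\"]")
                .withFallback(ConfigFactory.load());
        return ActorSystem.create("test-system", testConfig);
    }
}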
So I am developing a Maven plugin where I need to modify the class loaders for it to work correctly. The problem is that I am not sure I am modifying the correct class loader. What I'm doing is the following:
@Mojo(name = "aggregate", requiresDependencyResolution = ResolutionScope.TEST)
public class AcceptanceTestMojo extends AbstractMojo {

    private static final String SYSTEM_CLASSLOADER_FIELD_NAME = "scl";

    @Parameter
    private String property;

    @Component
    public PluginDescriptor pluginDescriptor;

    @Component
    public MavenProject mavenProject;

    @Override
    public void execute() throws MojoExecutionException, MojoFailureException {

        ClassLoader newClassLoader = null;
        List<String> runtimeClassPathElements;
        try {
            runtimeClassPathElements = mavenProject.getTestClasspathElements();
        } catch (DependencyResolutionRequiredException e) {
            throw new MojoFailureException(MojoFailureMessages.UNRESOLVED_DEPENDENCIES_MESSAGE);
        }

        ClassRealm realm = pluginDescriptor.getClassRealm();
        ClassRealm modifiedRealm = new ClassRealm(realm.getWorld(), realm.getId(), realm.getParentClassLoader());
        try {
            for (String element : runtimeClassPathElements) {
                File elementFile = new File(element);
                modifiedRealm.addURL(elementFile.toURI().toURL());
            }
        } catch (MalformedURLException e) {
            throw new MojoFailureException(MojoFailureMessages.UNRESOLVED_CLASSES_MESSAGE);
        }
        pluginDescriptor.setClassRealm(modifiedRealm);
So I am getting the ClassRealm, making slight changes to the UCP (removing some jars), and then setting the newly created ClassRealm on the plugin descriptor. I am also changing the ContextClassLoader and the SystemClassLoader, as the project I am executing my plugin on uses them for some interactions. These two work fine: they are changed and the plugin works correctly with them. The problem is the plugin class loader. For some reason, when executing my plugin on one project, it looks in the plugin ClassRealm for the jars it needs. The code above is not fully correct, because by the time execution reaches the point where the plugin ClassRealm is consulted, it is not the modified one; it holds another reference, and I don't know where that comes from. What I think is that I am not setting the ClassRealm correctly, or I am missing something else.
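For reference, the context class loader swap described in the question usually amounts to a couple of lines inside execute(); whether the modified realm is the loader the inspected project really consults is an assumption on my part:

// Sketch: install the modified realm as the thread context class loader
// for the duration of the work, then restore the original loader.
ClassLoader original = Thread.currentThread().getContextClassLoader();
Thread.currentThread().setContextClassLoader(modifiedRealm);
try {
    // code that needs the test classpath goes here
} finally {
    Thread.currentThread().setContextClassLoader(original);
}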
I don't understand how I can rewrite this code, which works with Jetty 6, for Jetty 9:
import org.mortbay.jetty.*;
import org.mortbay.jetty.nio.SelectChannelConnector;
import org.mortbay.jetty.webapp.WebAppContext;

public class ApplLauncher {
    public static void main(String[] args) {
        Server server = new Server();
        Connector connector = new SelectChannelConnector();
        connector.setPort(8080);
        server.addConnector(connector);
        WebAppContext root = new WebAppContext("C:\\Users\\OZKA\\IdeaProjects\\projectname\\projectname\\web", "/");
        root.setWelcomeFiles(new String[]{"index.html"});
        //root.addServlet(new ServletHolder(new TestServlet()), "/test");
        server.setHandlers(new Handler[]{root});
        try {
            server.start();
        } catch (Exception e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
}
The code above works fine and serves the static content from the web folder as well as the servlets mapped in web.xml. Here is my attempt to use embedded Jetty 9:
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.webapp.WebAppContext;
import org.eclipse.jetty.server.handler.ResourceHandler;
import org.eclipse.jetty.server.handler.HandlerList;
import org.eclipse.jetty.server.Handler;

public class ApplLauncher {
    public static void main(String[] args) {
        System.out.println("Hello from ScalaSbt Web Project");
        Server server = new Server(8080);
        WebAppContext webapp = new WebAppContext("D:\\Dev\\Scala\\ScalaTestProject\\web\\", "/");
        ResourceHandler resource_handler = new ResourceHandler();
        resource_handler.setWelcomeFiles(new String[]{ "index.html" });
        HandlerList handlers = new HandlerList();
        handlers.setHandlers(new Handler[] { resource_handler, webapp });
        server.setHandler(handlers);
        try {
            server.start();
            server.join();
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }
}
The server starts, but requesting index.html throws an error:
"java.lang.NoSuchMethodError: javax.servlet.http.HttpServletRequest.isAsyncStarted()Z"
I tried to find a working example on Google, but found nothing useful. The official samples and documentation are very confusing, and I do not understand how I can use embedded Jetty 9.
The error message clearly indicates that you have the wrong version of the Servlet API on your classpath.
Check your dependencies; you probably have a pre-3.0 Servlet API somewhere, and it should be removed.
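A quick way to see which jar is actually supplying the Servlet API at runtime is to print its code source (a small diagnostic sketch; the jar it reports will of course differ per machine):

import javax.servlet.http.HttpServletRequest;

public class ServletApiLocator {
    public static void main(String[] args) {
        // Prints the jar or directory HttpServletRequest was loaded from,
        // which makes a stray pre-3.0 servlet-api easy to spot.
        // (getCodeSource() can be null for bootstrap-loaded classes.)
        System.out.println(HttpServletRequest.class
                .getProtectionDomain()
                .getCodeSource()
                .getLocation());
    }
}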
If you are using Gradle, then execute
gradle dependencies
analyze the dependency tree and exclude the 'servlet-api' dependencies with a version lower than 3.0. You can do the following to exclude them:
compile ('javax.servlet:jsp-api:2.0'){
exclude module : 'servlet-api'
}
There can be multiple dependencies that transitively include servlet-api 2.x. Exclude all of those.
Adding on to what @axtavt said: if you are using Maven, add the following dependency:
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
<version>3.0-alpha-1</version>
<scope>provided</scope>
</dependency>
Also run mvn dependency:tree | grep servlet and double-check that you do not have servlet-api 2.x imported.
I am new to Java EJB 3.0. Is it possible to call a (session) bean deployed on JBoss from a desktop application client?
Thanks in advance.
Yes, you can. Some specifics are here (it references EJB 2, but it is the same for EJB 3 when it comes to remote clients): http://www.theserverside.com/discussions/thread.tss?thread_id=9197
Paraphrased:
Hashtable env = new Hashtable();
env.put("java.naming.factory.initial", "org.jnp.interfaces.NamingContextFactory");
env.put("java.naming.provider.url", "jnp://localhost:1099");
env.put("java.naming.factory.url.pkgs", "org.jboss.naming:org.jnp.interfaces");
Context ctx = new InitialContext(env);
// "home name" is whatever JNDI name you gave the bean's home interface
Object o = ctx.lookup("home name");
EJBHome ejbHome = (EJBHome) PortableRemoteObject.narrow(o, EJBHome.class);
// the userID passed to create(..) should be the one supplied by the caller
EJB ejb = ejbHome.create(..);
Yes.
public static void main(String[] args) throws Exception {
    InitialContext ctx = new InitialContext();
    YourService yourService = (YourService) ctx.lookup("com.example.session.YourService");
    String time = yourService.getTime();
    System.out.println("Time is: " + time);
}
For client configuration you must provide a jndi.properties file with the following contents:
java.naming.factory.initial=org.jnp.interfaces.NamingContextFactory
java.naming.factory.url.pkgs=org.jboss.naming:org.jnp.interfaces
java.naming.provider.url=localhost
If you are looking for working examples on JBoss, try downloading the source code of Enterprise JavaBeans 3.0, Fifth Edition.
Let's assume you have the following remote interface:
@Remote
public interface HelloBeanRemote {
    public String sayHello();
}
And a session bean implementing it:
@Stateless
public class HelloBean implements HelloBeanRemote {
    ...
}
And that this EJB is correctly packaged and deployed on JBoss.
On the client side, create a jndi.properties with the following content and put it on the classpath:
java.naming.factory.initial=org.jnp.interfaces.NamingContextFactory
java.naming.factory.url.pkgs=org.jboss.naming:org.jnp.interfaces
java.naming.provider.url=localhost:1099
Then use the following code to call your EJB:
Context context;
try {
    context = new InitialContext();
    HelloBeanRemote beanRemote = (HelloBeanRemote) context.lookup("HelloBean/remote");
    beanRemote.sayHello();
} catch (NamingException e) {
    e.printStackTrace();
    throw new RuntimeException(e);
}
Alternatively, if you don't want to provide a jndi.properties file, you can explicitly set up the JNDI environment in code and create the context like this:
Properties properties = new Properties();
properties.put("java.naming.factory.initial", "org.jnp.interfaces.NamingContextFactory");
properties.put("java.naming.factory.url.pkgs", "org.jboss.naming:org.jnp.interfaces");
properties.put("java.naming.provider.url", "localhost:1099");
Context context = new InitialContext(properties);
But I'd recommend using the jndi.properties for the sake of portability.
You can also expose the bean as a web service. I believe this is available as of EJB 3. It is quite nice since you can do it with annotations. You may wish to consider this option to decrease coupling. Here is a link to a tutorial.
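For completeness, a minimal sketch of that web service route, reusing the HelloBeanRemote interface from the earlier example; the service name, class name, and greeting text below are made up for illustration:

import javax.ejb.Stateless;
import javax.jws.WebService;

// A stateless session bean exposed as a JAX-WS endpoint purely via annotations.
@Stateless
@WebService(serviceName = "HelloService")
public class HelloWsBean implements HelloBeanRemote {
    @Override
    public String sayHello() {
        return "Hello from an EJB exposed as a web service";
    }
}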