I don't want to use a jndi.properties file, so to add new properties to my JNDI settings I wrote the following:
Hashtable<String, Object> jndi_env = new Hashtable<String, Object>();
jndi_env.put(InitialContext.INITIAL_CONTEXT_FACTORY, "org.apache.activemq.jndi.ActiveMQInitialContextFactory");
jndi_env.put("connectionFactory.ConnectionFactory","vm://0");
jndi_env.put("topic.example","example");
My problem is with this call:
initialContext = new InitialContext(jndi_env);
Since I pass a name parameter in that line, a URL context factory is looked up.
This makes my code look for a tcp://localhost:61616 connection, which I actually don't want.
I see that there are
QueueConnectionFactory: org.apache.activemq.ActiveMQConnectionFactory
example: org.apache.activemq.command.ActiveMQTopic
XAConnectionFactory: org.apache.activemq.ActiveMQXAConnectionFactory
which I don't want, or at least not with the types they have.
If I instead create the context without an argument, using my jndi.properties file (where I don't get the issue of establishing a TCP connection), then I find just:
ConnectionFactory: org.apache.activemq.artemis.jms.client.ActiveMQJMSConnectionFactory
queue: org.apache.activemq.artemis.jndi.ReadOnlyContext
queue/exampleQueue: org.apache.activemq.artemis.jms.client.ActiveMQQueue
dynamicTopics: org.apache.activemq.artemis.jndi.ActiveMQInitialContextFactory$2
dynamicQueues: org.apache.activemq.artemis.jndi.ActiveMQInitialContextFactory$1
So how can I change the object type produced by my jndi_env.put("topic.example", "example"); entry so that it looks like this (but for topics, of course)?
queue: org.apache.activemq.artemis.jndi.ReadOnlyContext
queue/exampleQueue: org.apache.activemq.artemis.jms.client.ActiveMQQueue
When you create your InitialContext you're passing in the wrong factory. Currently you're passing in org.apache.activemq.jndi.ActiveMQInitialContextFactory. This is the factory for ActiveMQ 5.x, not Artemis. You need to pass in org.apache.activemq.artemis.jndi.ActiveMQInitialContextFactory instead, e.g.:
Hashtable<String, Object> jndi_env = new Hashtable<String, Object>();
jndi_env.put(InitialContext.INITIAL_CONTEXT_FACTORY, "org.apache.activemq.artemis.jndi.ActiveMQInitialContextFactory");
jndi_env.put("connectionFactory.ConnectionFactory","vm://0");
jndi_env.put("topic.example","example");
I'm building an SPI to push events into Kafka, to be deployed as an EAR inside Keycloak 6.0.1 (which uses the WildFly server), packaged in a Docker image based on jboss/keycloak:6.0.1.
I ran into: Kafka Producer - org.apache.kafka.common.serialization.StringSerializer could not be found
So I applied the suggested solution of setting Thread.currentThread().setContextClassLoader(null);.
This seems to work fine for my local Kafka on port 9092, without authentication. As soon as I authenticate as described here:
String jaasTemplate = "org.apache.kafka.common.security.scram.ScramLoginModule required username=\"%s\" password=\"%s\";";
String jaasCfg = String.format(jaasTemplate, username, password);
Properties props = new Properties();
props.put("sasl.jaas.config", jaasCfg);
// ...
Thread.currentThread().setContextClassLoader(null);
KafkaProducer<String, String> producer = new KafkaProducer<>(props);
I run into the error:
org.apache.kafka.common.KafkaException: javax.security.auth.login.LoginException: unable to find LoginModule class: org.apache.kafka.common.security.scram.ScramLoginModule
at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:160)
at org.apache.kafka.common.network.ChannelBuilders.create(ChannelBuilders.java:146)
at org.apache.kafka.common.network.ChannelBuilders.clientChannelBuilder(ChannelBuilders.java:67)
at org.apache.kafka.clients.ClientUtils.createChannelBuilder(ClientUtils.java:99)
at org.apache.kafka.clients.producer.KafkaProducer.newSender(KafkaProducer.java:441)
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:422)
I suspect that this is caused by setting the ClassLoader to null, but I'm not certain.
The JAAS string mentions this org.apache.kafka.common.security.scram.ScramLoginModule. I tried not using JAAS but instead plain username + password like so:
Properties props = new Properties();
props.put("sasl.username", username);
props.put("sasl.password", password);
// ...
Thread.currentThread().setContextClassLoader(null);
KafkaProducer<String, String> producer = new KafkaProducer<>(props);
But that also resulted in an exception, just mentioning a different class that could not be located:
org.apache.kafka.common.KafkaException: javax.security.auth.login.LoginException: unable to find LoginModule class: org.jboss.as.security.remoting.RemotingLoginModule
at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:160)
at org.apache.kafka.common.network.ChannelBuilders.create(ChannelBuilders.java:146)
at org.apache.kafka.common.network.ChannelBuilders.clientChannelBuilder(ChannelBuilders.java:67)
at org.apache.kafka.clients.ClientUtils.createChannelBuilder(ClientUtils.java:99)
at org.apache.kafka.clients.producer.KafkaProducer.newSender(KafkaProducer.java:441)
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:422)
How can I make it so that org.apache.kafka.common.security.scram.ScramLoginModule is found? Thanks!
Warning: ugly hack!
Just before you create the KafkaProducer, set the context classloader as shown below:
Thread.currentThread().setContextClassLoader(this.getClass().getClassLoader());
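A minimal sketch of how that could be wired into the snippet from the question, assuming the producer is created inside a class of your deployment (the try/finally restore is my addition, not part of the original hack):
ClassLoader original = Thread.currentThread().getContextClassLoader();
try {
    // Use the deployment's own classloader so Kafka can resolve
    // org.apache.kafka.common.security.scram.ScramLoginModule
    Thread.currentThread().setContextClassLoader(this.getClass().getClassLoader());
    KafkaProducer<String, String> producer = new KafkaProducer<>(props);
    // ... send records ...
} finally {
    // Put the original context classloader back afterwards
    Thread.currentThread().setContextClassLoader(original);
}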
I need to write a simple Java class which sends messages to a Sonic topic. I can use the JNDI values for lookup.
Here is the code:
Hashtable<String, String> properties = new Hashtable<>();
properties.put(Context.INITIAL_CONTEXT_FACTORY, "com.sonicsw.jndi.mfcontext.MFContextFactory");
properties.put(Context.PROVIDER_URL, "tcp://Devserver:7002");
properties.put(Context.SECURITY_PRINCIPAL, "username");
properties.put(Context.SECURITY_CREDENTIALS, "password");
properties.put("com.sonicsw.jndi.mfcontext.domain", "dmDEV");
Context jndiContext = new InitialContext(properties);
ConnectionFactory connectionFactory = (ConnectionFactory) jndiContext.lookup("TopicConnectionFactory");
Topic topic = (Topic) jndiContext.lookup("testtopic");
This throws an error:
javax.naming.NameNotFoundException: /testtopic not found in the specified context
When I debug the code, I can see that the "connectionFactory" variable has the following fields and values, which are totally different from the values I specified above in the properties.
brokerHostName "MyMachine" (id=55)
brokerPort 0
brokerProtocol "tcp" (id=59)
brokerURL "" (id=66)
clientID null
connectID null
defaultPassword "" (id=67)
defaultUserName "Administrator" (id=68)
I need to know how to write a simple Java client to connect to a Sonic topic.
I used the following, which resolved my issue. Here it is in case you face the same problem:
topic = session.createTopic(topicName);
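For context, a rough end-to-end sketch combining the JNDI lookup from the question with session.createTopic (the credentials and topic name are the placeholders from the question; the session settings and the test message are assumptions):
Context jndiContext = new InitialContext(properties);
ConnectionFactory connectionFactory = (ConnectionFactory) jndiContext.lookup("TopicConnectionFactory");
Connection connection = connectionFactory.createConnection("username", "password");
Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
// Create the topic from the session instead of looking it up via JNDI
Topic topic = session.createTopic("testtopic");
MessageProducer producer = session.createProducer(topic);
producer.send(session.createTextMessage("hello from the JMS client"));
connection.close();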
I'm seeing an issue where, when connected to a mailbox using IMAP, the default infinite timeout causes problems. I'm having trouble getting JavaMail to recognise my IMAP properties. I verified that IMAP did not seem to be using the properties by setting things like the port number to 1, which should not work.
This is the code snippet:
Properties props = new Properties();
props.put("mail.imap.port", "1");
props.put("mail.imap.timeout", "1");
props.put("mail.imaps.connectiontimeout", "1");
Session session = Session.getInstance(props, null);
Store store = session.getStore("imaps");
store.connect(***,***,***);
If anyone knows where the problem is arising from, that would be great; all help is appreciated.
I believe you should be using props.setProperty(key, value) instead of props.put(key, value). The documentation here: http://docs.oracle.com/javase/tutorial/essential/environment/properties.html warns you not to use Hashtable methods.
You're using the "imaps" protocol but setting properties for the "imap" protocol. Change your property names to "mail.imaps.*".
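Putting both answers together, a corrected version of the snippet might look like the sketch below (the port and timeout values are illustrative assumptions, and host/user/password are placeholders for the elided arguments):
Properties props = new Properties();
// Property names must match the "imaps" protocol passed to getStore()
props.setProperty("mail.imaps.port", "993");
props.setProperty("mail.imaps.timeout", "10000");           // read timeout in milliseconds
props.setProperty("mail.imaps.connectiontimeout", "10000"); // connect timeout in milliseconds
Session session = Session.getInstance(props, null);
Store store = session.getStore("imaps");
store.connect(host, user, password);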
I have been creating a thin browser client (in Java) that sends an RTMP stream to a specified red5 instance. I also use RTMP Researcher to monitor the traffic and events that occur between the client and the server.
Here is what I note:
There is obviously a map with options that is being exchanged between the red5 instance and the client.
You can see it here:
(direct link: http://img716.imageshack.us/img716/661/newbitmapimagelb.png)
What I am wondering is: is there a programmatic way to obtain this map on the client side, and maybe change some of the parameters or just examine them?
Edit:
I am connecting like this:
connect(host, port, app, callback);
I assume I am sending some default parameters along, because the other connect methods also take an options map as an argument. I was wondering what possible values could be put in such an options map, and where to find a list of them.
Hey,
I was also struggling with red5 and found this post. Download the red5 source and look inside this source file: src/org/red5/server/net/rtmp/BaseRTMPClientHandler.java
You should know that the connect() method has multiple signatures.
The following method in BaseRTMPClientHandler.java creates the default parameters:
public Map<String, Object> makeDefaultConnectionParams(String server, int port, String application) {
    Map<String, Object> params = new ObjectMap<String, Object>();
    params.put("app", application);
    params.put("objectEncoding", Integer.valueOf(0));
    params.put("fpad", Boolean.FALSE);
    params.put("flashVer", "WIN 9,0,115,0");
    params.put("audioCodecs", Integer.valueOf(1639));
    params.put("videoFunction", Integer.valueOf(1));
    params.put("pageUrl", null);
    params.put("path", application);
    params.put("capabilities", Integer.valueOf(15));
    params.put("swfUrl", null);
    params.put("videoCodecs", Integer.valueOf(252));
    return params;
}
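Based on those overloads, one way to examine or tweak the parameters on the client side is to build the map yourself and hand it to the connect variant that accepts connection params. A sketch (check the exact class and signatures in your Red5 version; host, port, app and callback are the values from the question, and the flashVer override is just an example):
RTMPClient client = new RTMPClient();
// Start from the defaults generated by BaseRTMPClientHandler ...
Map<String, Object> params = client.makeDefaultConnectionParams(host, port, app);
// ... then inspect or override individual entries before connecting
params.put("flashVer", "WIN 10,0,32,18");
client.connect(host, port, params, callback);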
In order to get the WebLogic initial context to query the task database I am doing the following:
Properties h = new Properties();
h.put(Context.SECURITY_PRINCIPAL, "weblogic");
h.put(Context.PROVIDER_URL, "t3://localhost:17101");
h.put(Context.SECURITY_CREDENTIALS, "weblogic");
h.put(Context.SECURITY_AUTHENTICATION, "simple");
WLInitialContextFactory test = new WLInitialContextFactory();
test.getInitialContext(h);
Context ctx = null;
ctx = getInitialContext();
WorklistContext wliContext = WorklistContextFactory.getRemoteWorklistContext(ctx, "MyTaskApplication");
I then get the TaskQuery interface with the following code:
WorklistTaskQuery taskQuery = wliContext.getInterfaceForTaskQuery();
and to get the tasks I do:
taskQuery.getTasks(query);
where query is a com.bea.wli.worklist.api.TaskQuery object.
Please note that this code is running inside the domain running the tasks.
Unfortunately I am getting the following error when I call the getTasks method:
java.lang.SecurityException: [WLI-Worklist:493103]Access denied to resource /taskplans/Manual:1.0. Applicable policy: Query Caller: principals=[] Method: com.bea.wli.worklist.security.WorklistSecurityManager.assertTaskAccessAllowed
It seems WebLogic is ignoring the user set on the new initial context and trying to use the one coming from the browser. It so happens that I might need to do query searches in background workers that don't have a browser session (obviously).
Can anyone help with this?
I've found a solution for this, though it's convoluted and ugly as hell.
Since I'm making these calls through an EJB, I can authenticate the call by grabbing the EJB implementation from an authenticated context, like so:
Hashtable<String, String> env = new Hashtable<String, String>();
env.put(Context.SECURITY_PRINCIPAL,"user");
env.put(Context.PROVIDER_URL,"t3://localhost:7001");
env.put(Context.SECURITY_CREDENTIALS,"password");
env.put(Context.SECURITY_AUTHENTICATION, "simple");
getSessionInterface(interfaceClass, new InitialContext(env));
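As a variation on the same idea (not exactly what I did above, which goes through the getSessionInterface helper), the authenticated context could also be the one handed to the worklist factory from the question. A rough sketch; the initial context factory class name is an assumption on my part, since the snippets above omit it:
Hashtable<String, String> env = new Hashtable<String, String>();
env.put(Context.INITIAL_CONTEXT_FACTORY, "weblogic.jndi.WLInitialContextFactory"); // assumed; not shown above
env.put(Context.PROVIDER_URL, "t3://localhost:7001");
env.put(Context.SECURITY_PRINCIPAL, "user");
env.put(Context.SECURITY_CREDENTIALS, "password");
env.put(Context.SECURITY_AUTHENTICATION, "simple");
Context ctx = new InitialContext(env);
// Reuse the calls from the question, but with the authenticated context
WorklistContext wliContext = WorklistContextFactory.getRemoteWorklistContext(ctx, "MyTaskApplication");
WorklistTaskQuery taskQuery = wliContext.getInterfaceForTaskQuery();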
Your best bet for this is to avoid the above example and this API altogether. Just use the regular MBean implementation, which allows authentication.
Update: this solution doesn't seem to be viable; it will screw up the transaction management. I will report back, but it seems that if you need to create tasks outside of an authenticated context, you will need to go the MBean way.