Kafka usage on Google App Engine to push messages - is it possible? - java

I am trying to push messages from a web application server hosted on Google App Engine to a Kafka cluster running on my local machine. Has anyone done this? I am running into an issue while creating the KafkaProducer instance used to send the messages. I am using Kafka 0.8.2. It fails with the error "javax.management.ObjectName is a restricted class", which is expected given App Engine's class whitelist, where that class is restricted. What is the alternative? Can anyone answer? Thanks.
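For context, the failure happens before any message is actually sent: constructing the producer is what trips the whitelist. A minimal sketch of the producer setup in question (broker address and topic name are placeholders), which runs on a normal JVM but fails on GAE with the restricted-class error:

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class ProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // placeholder broker address
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        // Constructing the producer registers JMX metrics via javax.management,
        // which is on App Engine's restricted-class list and triggers the error quoted above.
        KafkaProducer<String, String> producer = new KafkaProducer<>(props);
        producer.send(new ProducerRecord<>("my-topic", "key", "value"));  // placeholder topic
        producer.close();
    }
}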

Related

Hazelcast client subscriber not receiving message

We are running a Hazelcast Java client app in one Docker container and a Hazelcast server with a single member in another container on the same machine.
Our client app is able to connect to the hz server container and work with data, but pub/sub is not working.
Both the pub and sub threads are running in the client app itself. Data is getting published, as can be seen in Hazelcast Management Center, but it is never received.
Does anyone have any idea about this?
PS: We are using Hazelcast v5.1.3 with the default server and client config.
We resolved this issue by replacing the hazelcast-5.3.1 jar with the hazelcast-client-4.x jar in our client app Docker image.
Posting an answer here in case someone faces a similar issue.
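For anyone hitting something similar, here is a minimal sketch of the client-side topic pub/sub described above, assuming a reachable member and compatible client/server versions (the topic name "chat" is a placeholder):

import com.hazelcast.client.HazelcastClient;
import com.hazelcast.core.HazelcastInstance;
import com.hazelcast.topic.ITopic;

public class TopicSmokeTest {
    public static void main(String[] args) throws InterruptedException {
        // Default client config: assumes the member is reachable at its default address;
        // adjust ClientConfig (cluster name, member addresses) to match the containers.
        HazelcastInstance client = HazelcastClient.newHazelcastClient();

        ITopic<String> topic = client.getTopic("chat");   // placeholder topic name
        topic.addMessageListener(msg ->
                System.out.println("received: " + msg.getMessageObject()));

        topic.publish("hello");
        Thread.sleep(1000);   // give the listener a moment before shutting down
        client.shutdown();
    }
}

If the publish shows up in Management Center but never reaches the listener, mismatched client and server jars (as in the fix above) are one thing worth checking.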

Google Pub/Sub test strategy for local GAE java dev server

My team and I are really keen to include Google Pub/Sub in our application as it will solve some coupling issues.
The problem we are having is how to do local integration tests in conjunction with the Java App Engine dev server.
What I've done so far:
- start the Pub/Sub emulator and set the PUBSUB_EMULATOR_HOST environment variable
- start our application in the Java dev server, which creates topics and subscriptions and then sends some messages to the topics
I'm assuming I'm doing something wrong because:
- the topics and subscriptions are created in the cloud (surely they should have been created in the Pub/Sub emulator?)
- messages are published and we receive message IDs back, but no endpoint is reached and no errors are reported, either in the cloud or in the emulator
From this I am pretty sure that the emulator is not being picked up by the dev server.
I also have some deeper questions regarding our testing strategy. Is local integration testing really feasible in this day and age, with more and more services becoming bound to the cloud? Should we be focusing more on integration test suites running against cloud instances themselves? If so, how would one ensure that developers have confidence in their code before deploying to a cloud test environment, and wouldn't this increase the feedback loop significantly?
UPDATE
Using the Google Java API client's Pubsub builder class, I was able to inject a URL (localhost:8010) from local configuration, which now allows me to publish successfully to the local emulator.
Pubsub client = new Pubsub.Builder(httpTransport, jsonFactory, initializer)
        .setApplicationName(getProjectId())
        .setRootUrl(rootUrl)
        .build();
I forced the port used to simplify setup for the rest of my team rather than having to depend on a dynamically changing port.
gcloud beta emulators pubsub start --host-port localhost:8010
Now topics, subscriptions and messages are being created successfully on the emulator. Unfortunately I'm still not getting messages pushed to the registered endpoints.
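For reference, a rough end-to-end sketch of the emulator-pointed client, assuming the legacy google-api-services-pubsub (v1) client; the project id and topic name are placeholders, port 8010 matches the gcloud command above, and encodeData is the generated base64 helper on PubsubMessage:

import com.google.api.client.googleapis.javanet.GoogleNetHttpTransport;
import com.google.api.client.http.HttpTransport;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.services.pubsub.Pubsub;
import com.google.api.services.pubsub.model.PublishRequest;
import com.google.api.services.pubsub.model.PubsubMessage;
import com.google.api.services.pubsub.model.Topic;

import java.util.Collections;

public class EmulatorPublishSketch {
    public static void main(String[] args) throws Exception {
        // Point the client at the emulator instead of the production endpoint.
        // The emulator needs no credentials, so the request initializer is a no-op.
        HttpTransport transport = GoogleNetHttpTransport.newTrustedTransport();
        Pubsub client = new Pubsub.Builder(transport, JacksonFactory.getDefaultInstance(), request -> {})
                .setApplicationName("my-project")          // placeholder project id
                .setRootUrl("http://localhost:8010/")      // emulator from the gcloud command above
                .build();

        String topic = "projects/my-project/topics/my-topic";  // placeholder names
        client.projects().topics().create(topic, new Topic()).execute();

        PubsubMessage message = new PubsubMessage().encodeData("hello".getBytes("UTF-8"));
        client.projects().topics()
                .publish(topic, new PublishRequest().setMessages(Collections.singletonList(message)))
                .execute();
    }
}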
UPDATE 2
gcloud version 120.0.0 seems to improve things but I'm now getting the following error:
{
"code" : 400,
"message" : "Payload isn't valid for request.",
"status" : "INVALID_ARGUMENT"
}
With the latest update of the gcloud utility, the local pubsub emulator has been fixed. The resolution of this issue fixed it: https://code.google.com/p/cloud-pubsub/issues/detail?id=41

GAE: MQTT broker

So I was wondering if it's possible to run an MQTT broker on the Google App Engine platform?
Couldn't find any information about it (or maybe I'm using the wrong keywords).
I've got my GAE app running on Java, so I'd like to go in the direction of running the MQTT broker on GAE using a backend.
EDIT:
Did some further research and it seems Moquette runs on Java. Does anyone have experience running Moquette on GAE?
EDIT2:
OK, it seems the Moquette examples run inside an OSGi container, which is unavailable on GAE. I'm looking for a script to start this server on GAE.
MQTT is a protocol on top of TCP. In order to run an MQTT server, one needs to be able to open a listening socket. Those are still not supported on normal AppEngine instances.
Note: GAE backends have been replaced: now you just have automatically scaled (aka frontend) instances and manually scaled (aka backend) instances.
Back to your problem: Managed VMs have most of the benefits of GAE (access to services) but run a full JVM, which allows listening sockets.
An alternative to Moquette would be the HiveMQ broker; it also runs on Java and can be easily installed. All the documentation is available here.
We haven't tested it on GAE yet, but if you have any problems running it, you could ask in the support forum.
Update: If Peter Knego is right, then HiveMQ or any other MQTT broker won't work on GAE.
Full disclosure: I work for the company that develops HiveMQ.
Cheers,
Christian
@Peter Knego is definitely right, and all I would add to his answer is this:
If you manage to configure your application to use a custom runtime on App Engine Managed VMs or Compute Engine,
then you will be able to run your MQTT broker perfectly well,
as long as you define a firewall rule to allow TCP connections on the port your broker is listening on.
By default the ports are blocked for security reasons.
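For completeness, a rough sketch of starting Moquette embedded from plain Java, which is what you would run on a Managed VM or Compute Engine instance. This assumes a recent Moquette release where the embedded broker lives under io.moquette.broker (older releases used io.moquette.server), and 1883 is the conventional MQTT port that the firewall rule would need to open:

import io.moquette.broker.Server;
import io.moquette.broker.config.MemoryConfig;

import java.io.IOException;
import java.util.Properties;

public class EmbeddedBrokerSketch {
    public static void main(String[] args) throws IOException {
        Properties props = new Properties();
        props.setProperty("port", "1883");      // MQTT listener port; open this in the firewall
        props.setProperty("host", "0.0.0.0");   // listen on all interfaces

        Server broker = new Server();
        broker.startServer(new MemoryConfig(props));
        System.out.println("Moquette broker started on port 1883");

        // Stop the broker cleanly when the JVM shuts down.
        Runtime.getRuntime().addShutdownHook(new Thread(broker::stopServer));
    }
}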

ZeroMQ PUB/SUB not working on same machine on different JVM

I am using the ZeroMQ PUB/SUB model where PUB and SUB are two different applications deployed on the same machine on WebSphere 6.1. This model works fine on my local machine, but when I deploy it on a remote Unix box server it isn't working. My SUB never receives a message from PUB. I tried all the options I could find on the web (localhost, 127.0.0.1) but no luck. I appreciate any help on this. I am using JeroMQ 3.2.2.
Thanks
Akash
If you're using multicast, then you need to enable loopback on the socket. As a result of this, the sender app will get the data as well if it's listening for it.
We also faced the same issue and fixed it with the following settings:
On the publisher side, bind using * (star): tcp://*:port
On the subscriber side, connect using the publisher's machine name: tcp://<publisher machine name>:port
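To illustrate the bind/connect split described above, a minimal JeroMQ sketch (both sockets live in one JVM here for brevity; in the real setup the SUB side runs in the other application and connects using the publisher machine's hostname, and 5556 is a placeholder port):

import org.zeromq.ZMQ;

public class PubSubSketch {
    public static void main(String[] args) throws InterruptedException {
        ZMQ.Context context = ZMQ.context(1);

        // Publisher: bind to all interfaces rather than a specific hostname.
        ZMQ.Socket pub = context.socket(ZMQ.PUB);
        pub.bind("tcp://*:5556");

        // Subscriber: connect using the publisher's hostname (localhost here for the demo).
        ZMQ.Socket sub = context.socket(ZMQ.SUB);
        sub.connect("tcp://localhost:5556");
        sub.subscribe("".getBytes());   // empty prefix = receive everything

        Thread.sleep(500);              // give the slow-joining subscriber time to connect

        pub.send("hello".getBytes(), 0);
        System.out.println("received: " + new String(sub.recv(0)));

        pub.close();
        sub.close();
        context.term();
    }
}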

Amazon Elastic Beanstalk Broadcast

I would like to develop a simple chat, but for that I need to inform all clients that a user wrote a message. That is simple with one server instance, but how does it work with 2+ instances?
Is there a way to inform all registered instances in my Beanstalk environment, or how else can I resolve this problem?
Thanks for the help!
Regards,
Rookee
Given your setup, the best solution is to use Amazon SNS. It is equivalent to a JMS topic, so your app can subscribe to events published by other parties (probably the same app running in another instance) on the topic. Then, each consumer of the topic broadcasts the chat message to the users connected to that server.
http://aws.amazon.com/sns/
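As a rough sketch of the publishing side with the AWS SDK for Java v1 (the topic ARN and message are placeholders), each instance would additionally subscribe an endpoint of its own to the topic and relay received payloads to its connected chat users:

import com.amazonaws.services.sns.AmazonSNS;
import com.amazonaws.services.sns.AmazonSNSClientBuilder;
import com.amazonaws.services.sns.model.PublishRequest;

public class ChatBroadcastSketch {
    public static void main(String[] args) {
        String topicArn = "arn:aws:sns:us-east-1:123456789012:chat-messages"; // placeholder ARN

        // Publish the chat message once; SNS fans it out to every subscribed instance.
        AmazonSNS sns = AmazonSNSClientBuilder.defaultClient();
        sns.publish(new PublishRequest(topicArn, "user42: hello everyone"));
    }
}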
