How to consume data generated by IoT devices? - java

I must create a small IoT platform based on Spring Boot/Java 8.
Context: devices send various pieces of information to the platform. I must save them and then consume them in an analysis algorithm.
Constraint: I want it all to be asynchronous, and the platform must be based on Java 8/Spring technologies or be easily integrated into a Spring Boot app.
What I imagine: devices send their information to an async Spring REST controller, which saves it asynchronously in MongoDB.
I already have the analysis algorithm, based on Google Guava EventBus.
To sum up, I have the data from the devices in a MongoDB database and an algorithm based on Java POJOs; the missing part is transforming the data from the devices into Java POJOs.
With which technologies can I do that? Spring Reactor? RxJava? Something else? And how can I put this in place?
I'm looking for something simple to put in place that can easily scale, for example by instance duplication. For the moment, I think Spring Cloud technologies are a bit too big for my purpose.
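For illustration, here is a minimal sketch of the ingestion flow I have in mind, assuming a Spring Data MongoDB repository and a shared Guava EventBus bean (all class names are placeholders):

import java.util.concurrent.CompletableFuture;

import com.google.common.eventbus.EventBus;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

// POJO mapped from the device's JSON payload and stored as-is in MongoDB.
class DeviceData {
    @Id
    public String id;
    public String deviceId;
    public double value;
}

// Spring Data MongoDB generates the implementation at runtime.
interface DeviceDataRepository extends MongoRepository<DeviceData, String> {}

@RestController
class DeviceDataController {

    private final DeviceDataRepository repository;
    private final EventBus eventBus; // the bus my Guava-based analysis algorithm subscribes to

    DeviceDataController(DeviceDataRepository repository, EventBus eventBus) {
        this.repository = repository;
        this.eventBus = eventBus;
    }

    // Returning a CompletableFuture lets Spring MVC complete the request asynchronously.
    @PostMapping("/devices/data")
    public CompletableFuture<ResponseEntity<Void>> ingest(@RequestBody DeviceData data) {
        return CompletableFuture.supplyAsync(() -> {
            repository.save(data); // persist the POJO in MongoDB
            eventBus.post(data);   // hand the same POJO to the analysis algorithm
            return ResponseEntity.accepted().<Void>build();
        });
    }
}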

You should have a look at the Spring XD engine.
Spring XD provides different Sources (HTTP, FTP, MQTT, File, etc.), Transformers, Filters, and Sinks (HTTP, FTP, MQTT, File, etc.).
Please check this post on a small IoT project based on Spring XD and the Twitter API.

Related

Spring Boot APIs

I'm working on a challenge in which I need to do a credit analysis and apply some architectural concepts, and I'm in doubt about the "APIs". It needs to be developed in Spring Boot, which I have already done. The conditions of the challenge are:
Frontend / Backend.
Backend API concept containing Swagger documentation of the endpoints.
API for registration and consultation of proposals.
Credit engine API that will review the proposal and make the credit limit decision.
I'm in doubt about steps 3 and 4, where APIs are required. What does he mean by that? Do I need to create new Spring Boot projects that communicate with each other? What is the best way to deal with APIs?
Thank you!
Your HTML form (e.g. for registering a proposal) will call the API with all the form values as key=value pairs; your API (a Spring controller) will accept those key=value pairs, process them, apply your business logic, store to the database, etc. You can go through the Spring Boot guides to get more of an idea; here are a few guides on submitting a form:
https://spring.io/guides/gs/handling-form-submission/
https://hellokoding.com/handling-form-submission-example-with-java-spring-boot-and-freemarker/
https://medium.com/@grokwich/spring-boot-thymeleaf-html-form-handling-762ef0d51327
You can also go through the Spring PetClinic sample project, a showcase app built with most of Spring MVC's capabilities.
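If it helps, here is a minimal sketch of the "two Spring Boot projects talking over HTTP" idea for steps 3 and 4; the field names and the credit-engine URL are made up:

import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.client.RestTemplate;

// Proposal-registration endpoint (step 3). The credit engine (step 4) is assumed to be
// a separate Spring Boot app reached over plain HTTP.
@RestController
class ProposalController {

    private final RestTemplate restTemplate = new RestTemplate();

    @PostMapping("/proposals")
    public String register(@RequestParam String customerName,
                           @RequestParam double requestedAmount) {
        // business logic and persistence of the proposal would go here, then we ask
        // the credit-engine API for a limit decision
        String decision = restTemplate.getForObject(
                "http://localhost:8081/credit-engine/decision?amount=" + requestedAmount,
                String.class);
        return "Proposal for " + customerName + " -> " + decision;
    }
}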

Spring Boot distributed processing with Kafka

I have several apps developed using Spring Boot. Some apps call other apps, which in turn call yet other apps, and it is getting hard to manage and scale. I need to be able to distribute them across a network and also combine the apps into different 'flows' with minimal changes to the apps.
Ideally, I would like to wrap the apps and abstract them into components that have N inputs and M outputs. At boot time I would use some configuration to wire the inputs and outputs to real Kafka topics.
For instance, input A to an app can come from several Kafka topics, and output B from the same app can go to another set of Kafka topics.
I would like to be able to change the topics without having to recompile the apps, and with no extra network hops to send/receive to/from multiple topics; this should happen in the same process, multi-threaded.
Does anybody know if something similar already exists? Can Spring Integration do this? Apache Camel? Or am I better off writing it myself?
See Spring for Apache Kafka. There is also a spring-integration-kafka extension that sits on top of spring-kafka.
The Spring for Apache Kafka (spring-kafka) project applies core Spring concepts to the development of Kafka-based messaging solutions. It provides a "template" as a high-level abstraction for sending messages. It also provides support for message-driven POJOs with @KafkaListener annotations and a "listener container". These libraries promote the use of dependency injection and declarative programming. In all of these cases, you will see similarities to the JMS support in the Spring Framework and the RabbitMQ support in Spring AMQP.
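As a rough sketch of how those pieces map to your "N inputs, M outputs" idea (the topic names and the app.input-topics property are assumptions; keeping them in configuration is what lets you rewire the apps without recompiling):

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

// Wraps an existing app: consumes input A from configured topics, produces output B.
@Component
class AppConnector {

    private final KafkaTemplate<String, String> kafkaTemplate;

    AppConnector(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // Input A: consume from one or several topics, resolved from a property at boot time.
    @KafkaListener(topics = "${app.input-topics}", groupId = "my-app")
    public void onMessage(String payload) {
        String result = process(payload);                  // the wrapped app's existing logic
        kafkaTemplate.send("app.output-topic-b", result);  // output B goes to another topic
    }

    private String process(String payload) {
        return payload.toUpperCase();
    }
}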

Programmatically generate new sinks based on database entries

I'm building a small app that models a city public transport network. The idea is that each bus stop is a Sink that listens to messages from other bus stops, thus calculating the times at which the bus will show up.
Bus stops with unique ids are stored in the database, and I need to generate and run exactly that number of sinks, each with its unique id. How do I do that?
My guess is that this can be done using Spring Cloud Data Flow, which would launch the .jar files with an (--id) property injected via the @Value annotation. But I can't understand how to implement that.
Also found this, but it didn't help.
You got some of the concepts right, but your implementation may need some help.
Spring Cloud Data Flow is an orchestration engine that deploys Boot applications and connects them using a middleware.
Those apps can be streaming apps, which means they use Spring Cloud Stream as an abstraction layer to communicate with a middleware (Rabbit or Kafka). At its core there are three types of apps: Sources (data emitters), Processors (data transformers) and Sinks (data receivers).
You use Data Flow to combine those and deploy them to a runtime (Local, Cloud Foundry, K8s, YARN).
So yes, SCDF can be used for your assignment; however, you do not want to create one sink per bus stop, as that would waste your resources.
You can have a simple stream that captures the data from your buses (the source), maybe does some transformation, and sinks it to a DB.
You can then create a tap that listens to the messages stored in the DB if you are interested in processing them.
You can tap that information and have a client that broadcasts it downstream (your display at each bus stop).
So, for example, you can have just one single sink, but expose a WebSocket where each client connects and passes an id. You can then forward the received events, filtered by that id, to that specific client (see the sketch below).
This is a much more efficient way to deal with it.
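A rough sketch of that "one sink, filter by id" approach, assuming a Spring Cloud Stream sink and a WebSocket message broker configured elsewhere (BusEvent, the class names and the /topic/stops destination prefix are made up):

import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Sink;
import org.springframework.messaging.simp.SimpMessagingTemplate;

// One sink receives every bus event and forwards it to a per-stop WebSocket destination,
// so each display only subscribes to its own stop id.
@EnableBinding(Sink.class)
class BusStopSink {

    private final SimpMessagingTemplate messagingTemplate;

    BusStopSink(SimpMessagingTemplate messagingTemplate) {
        this.messagingTemplate = messagingTemplate;
    }

    @StreamListener(Sink.INPUT)
    public void handle(BusEvent event) {
        // clients (the bus-stop displays) subscribe to /topic/stops/{their-stop-id},
        // so the broker does the per-id filtering for us
        messagingTemplate.convertAndSend("/topic/stops/" + event.stopId, event);
    }
}

class BusEvent {
    public String stopId;
    public String busLine;
    public long etaSeconds;
}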

Existing Spring application extension by adding Camel features

I have a web application written in Spring MVC. It is quite a simple app for registering some activities and generating reports after some time. Right now it is done fully in Spring, and the only entry point is an HTTP request to the webapp. I'd like to add other entry points to allow users to trigger the application via a JMS queue, FTP files and a SOAP-based web service.
I know I can do all this using Spring's own features somehow, but I wonder whether it is desirable to involve Apache Camel in all of that?
I'm thinking of leaving the web application as it is (communicating directly with the services), only adding some Camel magic to the Spring context, exposing several endpoints from Camel, and then, after message processing and transformation, calling the existing services.
I'm also considering Camel in order to use some asynchronous processing and threading/scalability features. Is this the right way to go?
I recommend you use Apache Camel. I have used it for a similar purpose. The solution is an appropriate one from a 'separation of concerns' point of view. Camel implements the Enterprise Integration Patterns and is a good fit for integrating various protocols and interfaces. Your application should deal with functionality only and, as designed, just expose a servlet to receive requests and process them.
Handling of interfaces and protocols is well structured in Camel, and it's easy to maintain and configure in the long run.
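For example, the extra entry points can be expressed as Camel routes that delegate to your existing services; the queue name, FTP details and the "activityService" bean/method names below are assumptions:

import org.apache.camel.builder.RouteBuilder;
import org.springframework.stereotype.Component;

// Each new entry point is a route that ends up calling the existing Spring service bean.
@Component
class EntryPointRoutes extends RouteBuilder {

    @Override
    public void configure() {
        // JMS entry point: messages from a queue are handed to the existing service
        from("jms:queue:activity.requests")
            .to("bean:activityService?method=registerActivity");

        // FTP entry point: files dropped on the server are processed the same way
        from("ftp://user@ftphost/incoming?password=secret&delete=true")
            .convertBodyTo(String.class)
            .to("bean:activityService?method=registerActivity");
    }
}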

Notification Mechanism Using Spring

I currently have two WAR files, and one WAR has to send notifications to the other WAR file using Spring. Both WARs are implemented using Spring and web services.
My requirement is that the first WAR has to send notifications to the other WAR file.
Could you please provide some pointers on how to implement this using Spring?
I do not know your exact requirements, but I'd suggest you use a RESTful web service for these notifications. Spring has excellent support for this kind of service.
Internally, the first application will send an HTTP POST (or GET) request like http://thehost/webapp2/mynotification.
Another way is to communicate using JMS. This is a good fit if the communication has to be asynchronous. Spring supports JMS via JMS templates.
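A minimal sketch of the REST option, with the URL and payload purely illustrative:

import org.springframework.web.client.RestTemplate;

// The first WAR posts a notification to an endpoint exposed by the second WAR.
class NotificationClient {

    private final RestTemplate restTemplate = new RestTemplate();

    public void notifyOtherApp(String message) {
        restTemplate.postForEntity(
                "http://thehost/webapp2/mynotification", // endpoint in the second WAR
                message,                                 // notification payload
                Void.class);
    }
}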
You can use:
JMS
a web service (or the Spring HTTP invoker) in the target app, invoked from the notifier
You can use RMI to export your beans and make them visible from other modules. It is better than the other alternatives in this case because:
JMS is asynchronous and needs a middleware.
Web services are less efficient (since they are mostly designed to connect heterogeneous platforms).
Take a look here at how to do it:
http://static.springsource.org/spring/docs/3.0.x/spring-framework-reference/html/remoting.html#remoting-rmi
But first of all I would review the architecture you are using, in case you can refactor it for better integration of the business logic.
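For reference, a minimal sketch of Spring's RMI remoting (see the link above), assuming a NotificationService interface shared by both WARs; the names and the registry port are illustrative:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.remoting.rmi.RmiProxyFactoryBean;
import org.springframework.remoting.rmi.RmiServiceExporter;

// Shared interface, packaged in a jar both WARs depend on.
interface NotificationService {
    void sendNotification(String message);
}

// In the receiving WAR: export the existing bean over RMI.
@Configuration
class RmiServerConfig {

    @Bean
    RmiServiceExporter notificationExporter(NotificationService notificationService) {
        RmiServiceExporter exporter = new RmiServiceExporter();
        exporter.setServiceName("NotificationService");
        exporter.setServiceInterface(NotificationService.class);
        exporter.setService(notificationService);
        exporter.setRegistryPort(1199);
        return exporter;
    }
}

// In the notifying WAR: a client-side proxy that can be injected like a local bean.
@Configuration
class RmiClientConfig {

    @Bean
    RmiProxyFactoryBean notificationService() {
        RmiProxyFactoryBean proxy = new RmiProxyFactoryBean();
        proxy.setServiceUrl("rmi://thehost:1199/NotificationService");
        proxy.setServiceInterface(NotificationService.class);
        return proxy;
    }
}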
