I want to use the machine learning capabilities of Apache Spark through a RESTful API, so I am using the Spark Job Server. I have already developed an interface for the communication, but discovered that, even though I am using the persistent context mode, I can't keep objects like a trained model alive between different job runs. I can't find any documentation on how to actually implement a persistent job in Java.
I am also quite new to Apache Spark and have no knowledge of Scala. I don't want to start the development process over, so I would be very grateful if somebody could share their experience of persisting Java objects between Spark Job Server jobs, or point me to a good example or documentation.
To begin with, it would even be sufficient to serialize an object and save it to disk. But I wasn't successful with that inside the Spark Job Server either. I used simple code like the following, but it probably doesn't work this simply in Spark:
// Deserialize a previously trained model from disk into the LRService field
try {
    FileInputStream streamIn = new FileInputStream("D:\\LRS.ser");
    ObjectInputStream objectinputstream = new ObjectInputStream(streamIn);
    LRService = (LogisticRegressionService) objectinputstream.readObject();
    objectinputstream.close();
} catch (Exception e) {
    e.printStackTrace();
}
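For reference, the write side that would produce LRS.ser is just the mirror image. A minimal sketch, assuming LogisticRegressionService (or at least the state you want to keep) implements java.io.Serializable; note that Spark handles such as contexts and RDDs cannot be serialized this way:

import java.io.FileOutputStream;
import java.io.ObjectOutputStream;

// Serialize the trained service to disk so a later job can read it back
try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream("D:\\LRS.ser"))) {
    out.writeObject(LRService);
} catch (Exception e) {
    e.printStackTrace();
}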
I am unsure about how Micrometer and Prometheus behave in production. Prometheus pulls data from Micrometer, and we use remote storage for Prometheus, but some data is also held by Micrometer itself. My question: if my server keeps running in production, does the data held by Micrometer keep growing, or is it automatically flushed after some time? In other words, how does Micrometer store data in production?
Micrometer itself does not store data persistently. All data is kept in memory. If the application restarts, the counters start from zero.
It is the task of the time-series database to handle that. For example, Prometheus has functions like rate() and increase() that ignore these resets.
No matter which environment your application runs in (locally, dev, acceptance, or production), Micrometer behaves the same way:
- collecting and storing metrics in memory
- waiting for other metrics analysis and visualization tools to collect the data, either by:
  - publishing, when the tool uses a push model, or
  - exposing the needed endpoints for tools that use a pull model, which is the Prometheus case
Micrometer, like any other metrics collection library, cannot decide on its own when the collected data should be flushed or cleared, since it cannot know in advance which tools will collect which data and when.
Meanwhile, if you already have the full picture of your application architecture and you know that you will only be using Prometheus to collect metrics, you can configure your endpoint to clear the MeterRegistry after a successful scrape (based on the official documentation sample, since you did not provide a snippet of your implementation):
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;

import com.sun.net.httpserver.HttpServer;

import io.micrometer.prometheus.PrometheusConfig;
import io.micrometer.prometheus.PrometheusMeterRegistry;

PrometheusMeterRegistry prometheusRegistry = new PrometheusMeterRegistry(PrometheusConfig.DEFAULT);
try {
    HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
    server.createContext("/prometheus", httpExchange -> {
        String response = prometheusRegistry.scrape();
        httpExchange.sendResponseHeaders(200, response.getBytes().length);
        try (OutputStream os = httpExchange.getResponseBody()) {
            os.write(response.getBytes());
            prometheusRegistry.clear(); // clear the registry upon successful response write
        }
    });
    new Thread(server::start).start();
} catch (IOException e) {
    throw new RuntimeException(e);
}
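For completeness, any meter you register against that same prometheusRegistry will then show up in the scrape output. A minimal sketch (the meter name, tag, and description are just examples, not something from your setup):

import io.micrometer.core.instrument.Counter;

// Example counter; its value lives only in memory, so an application restart
// (or the clear() call above) resets it to zero, which Prometheus' rate()/increase() tolerate.
Counter requests = Counter.builder("app.requests.handled")
        .tag("outcome", "success")
        .description("Example counter exposed via the /prometheus endpoint")
        .register(prometheusRegistry);

requests.increment();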
I am struggling to find a fully fledged example of how to use Apache Camel in Spring Boot framework for the purpose of a polling consumer.
I have looked at this: https://camel.apache.org/manual/latest/polling-consumer.html as well as this: https://camel.apache.org/components/latest/timer-component.html but the code examples are not complete enough for me to understand what I need to do to accomplish my task in Java.
I'm typically a C# developer, so a lot of these small references to things don't make sense.
I am seeking an example in Java of the following, including all the imports and other dependencies required to get it to work.
What I am trying to do is the following:
- A web request is made to an endpoint, which should trigger the start of a polling consumer.
- The polling consumer needs to poll another web endpoint with a provided "ID" that is sent to the consumer at the time it is triggered.
- The polling consumer should poll every X seconds (let's say 5 seconds).
- Once a specific successful response is received from the endpoint we are polling, the consumer should stop polling and send a message to another web endpoint.
I would like to know if this is possible, and if so, can you provide a small example of everything that is needed to achieve this (as the documentation from the Camel website is extremely sparse in terms of imports and class structure etc.)?
After discussions with some fellow Java colleagues, they assured me that this use case is not one that Camel is designed for. This is the reason it was so difficult to find anything on the internet before I posted this question.
For those seeking this answer via Google: the best suggested approach is to use a different tool or just use standard Java.
In my case, I ended up using a plain old Java thread to achieve what was required. Once the request is received, I simply start a new thread with a Runnable that checks the result from the other service, sleeps for X seconds, and terminates once the response is successful.
A simple example is below:
Runnable runner = new Runnable() {
    @Override
    public void run() {
        boolean cont = true;
        while (cont) {
            // getResponseFromServer() returns false once the successful response arrives
            cont = getResponseFromServer();
            try {
                Thread.sleep(5000); // wait X seconds between polls
            } catch (InterruptedException e) {
                // we don't care about this, it just means this iteration didn't sleep
            }
        }
    }
};
new Thread(runner).start();
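If you would rather not manage the thread and the sleep yourself, a ScheduledExecutorService from the standard library does the same job. A minimal sketch, assuming the same hypothetical getResponseFromServer() call as above plus a placeholder sendResultToOtherEndpoint() for the final notification:

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Poll every 5 seconds until the expected response arrives, then notify and stop.
// getResponseFromServer() and sendResultToOtherEndpoint() stand in for your own HTTP calls.
ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
scheduler.scheduleWithFixedDelay(() -> {
    if (!getResponseFromServer()) {      // false = the successful response arrived
        sendResultToOtherEndpoint();     // forward the result to the other web endpoint
        scheduler.shutdown();            // stop polling
    }
}, 0, 5, TimeUnit.SECONDS);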
I have just started using OrientDB and one of the most useful features for my project was the hook.
Let me try to explain my setup:
OrientDB "Graph" database server running on machine-1, to which many
clients are connected
On Machine-2 i am running a OrientDB Java based client application
There are many web applications which are connecting to the OrientDB server for graph database manipulation
On Machine-2 java client, i have registered hooks to listen to change
of certain class records
I was expecting that whenever a record (of specified class mentioned by hook) is modified the hook will imitate the trigger on java client running on Machine-2
But in reality, i am getting those triggers only when the class gets modified by the Java client running on Machine-2 (i.e. same client)
This is my code
graph = new OrientGraph(connectionString, "root", "hello");
ODatabaseDocumentTx docDBRef = graph.getRawGraph();
docDBRef.begin();
try {
    // Register a hook on this database instance to listen for record changes
    docDBRef.registerHook(new ODocumentHookAbstract() {
        -------
        -------
    });
    docDBRef.commit();
} catch (Exception e) {
    docDBRef.rollback();
}
Is there any way I can write a client which gets notified of any change in the database?
Thanks in advance
Is it possible to create a socket with multiple outgoing (or incoming) connections using the ZeroMQExtension?
There is more about multiple connections in the ZeroMQ guide.
Update:
I can't see an equivalent sample for the ZeroMQExtension. In the ZeroMQExtension documentation I found:
newPubSocket(socketParameters: Array[SocketOption]): ActorRef
Java API factory method to create the actor representing the ZeroMQ Publisher socket. You can pass in as many configuration options as you want and the order of the configuration options doesn't matter. They are matched on type and the first one found wins.
PS: I don't know Scala and have just started reading the Akka documentation to work out whether I need Akka or not.
I found a solution (it was not intuitive, but it works):
ActorRef subSocket = ZeroMQExtension.get(getContext().system())
        .newSubSocket(null, new Listener(getSelf()), new Subscribe("health"));

@Override
public void preStart() {
    super.preStart();
    // Connecting the same SUB socket to several endpoints gives it multiple connections
    subSocket.tell(new Connect("tcp://127.0.0.1:1237"));
    subSocket.tell(new Connect("tcp://127.0.0.1:1238"));
}
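For the other direction, the same extension factory can create the publishing side. A minimal sketch based on the newPubSocket factory quoted above (the endpoint mirrors the Connect calls in my snippet; how you then publish a ZMQMessage to the socket actor depends on your Akka version, so that part is left out):

// PUB socket bound to the endpoint the subscriber above connects to.
// Bind, like Connect, is a SocketOption, so it can be passed to the factory method.
ActorRef pubSocket = ZeroMQExtension.get(getContext().system())
        .newPubSocket(new Bind("tcp://127.0.0.1:1237"));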
I'm writing an Android app and I'm looking for the fastest (in terms of setup) way to send data to a server and receive information back on request.
We're talking basic stuff here. I have a log file which records how a user is using my application (only during beta; I wouldn't ruin the user experience by constantly logging normally) and I want to send it to my server (which I haven't set up yet).
I don't need security, I don't need high throughput or concurrent connections (I have 3 phones to play with) but I do need to set it up fast!
I remember back in the day that setting up XAMPP was particularly brainless, then maybe I could use PHP to send the file from the phone to the Server?
The Server would ideally be able to respond to a GET which would allow me to send back some SQL statements which ultimately affect the UI. (It's meant to adapt the presented options depending on those most commonly used).
So there you have it: I last used PHP about 4 years ago and will go down that route if it's the best option, but if there's some kind of newfangled port-opening, binary-streaming, singing-and-dancing method that has superseded it, I would love to know.
This tutorial seems useful but I don't really need object serialization, just text files back and forth, compressed naturally.
Android comes with the Apache HTTP Client 4.0 built in, as well as java.net.URL and java.net.HttpUrlConnection; I'd rather not add too much bulk to my app with third-party libraries.
Please remember that I'm setting up the server side as well so I'm looking for an overall minimum lines of code!
private void sendData(ProfileVO pvo) {
    Log.i(getClass().getSimpleName(), "send task - start");

    // Instantiate an HttpClient
    HttpClient client = new DefaultHttpClient();

    // Instantiate a GET HTTP method, passing the name as a query parameter
    try {
        String url = "http://www.itortv.com/android/sendName.php?name="
                + URLEncoder.encode(pvo.getName(), "UTF-8");
        HttpResponse response = client.execute(new HttpGet(url));
        InputStream is = response.getEntity().getContent();
        // You can convert the InputStream to a String with:
        // http://senior.ceng.metu.edu.tr/2009/praeda/2009/01/11/a-simple-restful-client-at-android/
    } catch (ClientProtocolException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }

    Log.i(getClass().getSimpleName(), "send task - end");
}
The easiest solution for you would be to use the Apache HTTP client to GET and POST JSON requests to a PHP server.
Android already has a JSON builder/parser built in, as does PHP5, so integration is trivial, and a lot easier than using the XML/socket counterparts.
If you want an example of how to do the Android side of things, here is a Twitter API client that basically communicates with Twitter via its JSON REST API.
http://java.sun.com/docs/books/tutorial/networking/sockets/
This is a good background tutorial on Java socket communication.
Check out WWW.viewstreet.com, which is developing an Apache plugin specifically for Android server-side development for Java programmers.