Create ZNodes without cmd in Zookeeper - java

I am trying to implement configuration management through ZooKeeper. I have created a few ZNodes from the command line:
create /config ""
create /config/mypocapp ""
create /config/mypocapp/name "John Doe"
Here, name is one of the properties that I want to access in my app called mypocapp.
Since we will have a lot of properties in our application, we just can't use the command line to create each and every property like this.
Is there a way to create the properties in ZooKeeper through some UI, or directly in a file (and import it into ZooKeeper)?
I am completely new to ZooKeeper and am not finding any help in this direction. Please help.

Exhibitor is one option for inserting, modifying, or deleting properties in ZNodes.
One can follow the steps given below:
Download the pom file for the Exhibitor UI from GitHub.
Build it with Maven, which will generate a jar file.
Run the jar file as: java -jar <jar-file-name>.jar -c file
Go to your browser and open localhost:8080 to access the Exhibitor UI.
Here you can configure your ZooKeeper ensemble and edit the properties.
Please note that each ZooKeeper instance will have a corresponding Exhibitor UI.
To run Exhibitor on a different port, you can run:
java -jar <jar-file-name>.jar -c file --port <port-of-your-choice>

There are now also VS Code extensions that allow viewing and editing the ZooKeeper node hierarchy and data, such as this one:
https://marketplace.visualstudio.com/items?itemName=gaoliang.visual-zookeeper
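Since the question specifically mentions Java, you can also script the import with the official ZooKeeper Java client: read a flat key=value properties file and create one znode per key. Below is a minimal sketch, assuming a local ensemble at localhost:2181 and the /config/mypocapp layout from the question; the file name and the error handling are deliberately kept simple:

import java.io.FileInputStream;
import java.nio.charset.StandardCharsets;
import java.util.Properties;
import org.apache.zookeeper.CreateMode;
import org.apache.zookeeper.ZooDefs;
import org.apache.zookeeper.ZooKeeper;

public class ZkPropertiesImporter {
    public static void main(String[] args) throws Exception {
        // Connect to the ensemble (a production version should wait for the connected event).
        ZooKeeper zk = new ZooKeeper("localhost:2181", 15000, event -> {});

        Properties props = new Properties();
        try (FileInputStream in = new FileInputStream("app.properties")) {
            props.load(in);
        }

        // Parent znodes must exist before their children.
        for (String parent : new String[] {"/config", "/config/mypocapp"}) {
            if (zk.exists(parent, false) == null) {
                zk.create(parent, new byte[0], ZooDefs.Ids.OPEN_ACL_UNSAFE, CreateMode.PERSISTENT);
            }
        }

        // One child znode per property: /config/mypocapp/<key> holds the value as bytes.
        for (String key : props.stringPropertyNames()) {
            String path = "/config/mypocapp/" + key;
            byte[] data = props.getProperty(key).getBytes(StandardCharsets.UTF_8);
            if (zk.exists(path, false) == null) {
                zk.create(path, data, ZooDefs.Ids.OPEN_ACL_UNSAFE, CreateMode.PERSISTENT);
            } else {
                zk.setData(path, data, -1); // -1 means "any version"
            }
        }
        zk.close();
    }
}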

Related

How to read app.properties file from Java Spark application

I implemented a Java Spark application, which I run on an EMR cluster with the spark-submit command.
I want to pass app.properties, which I use in my application.
app.properties looks as follows:
local_fetcher = false
local_storage = false
local_db = true
.
.
.
I want to be able to get this data in my application.
My questions are:
Where app.properties should be located?
How can I read its content in my Spark application?
Should I be able to read it from the driver & executors?
I tried to use the --properties-file flag, but I understood it overrides the default Spark configuration, which is not what I want.
I saw that I might use the --files flag, but didn't understand where the file should be located and how I can read it inside my application.
First option: --files
--files FILES Comma-separated list of files to be placed in the working directory of each executor. File paths of these files in executors can be accessed via SparkFiles.get(fileName).
spark-submit --files /path/to/app.properties /path/to/your/fat/jar.jar
You can get the exact location of the uploaded file using SparkFiles, as sketched below.
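For example, a small helper (a sketch; the file name app.properties matches the question) that works the same way on the driver and on executors, since SparkFiles.get resolves to wherever Spark placed the shipped file on the current node:

import java.io.FileInputStream;
import java.util.Properties;
import org.apache.spark.SparkFiles;

public final class AppConfig {
    // Loads the properties file shipped via --files; SparkFiles.get returns
    // the absolute local path of the file on the current node.
    public static Properties load() throws Exception {
        Properties props = new Properties();
        try (FileInputStream in = new FileInputStream(SparkFiles.get("app.properties"))) {
            props.load(in);
        }
        return props;
    }
}

Then, for instance: boolean localDb = Boolean.parseBoolean(AppConfig.load().getProperty("local_db"));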
Second option: getResourceAsStream
Put your app.properties inside your job's JAR file and load it like this:
// YourJob stands for any class packaged in the jar
val appPropertiesStream = classOf[YourJob].getResourceAsStream("/app.properties")
val appPropertiesString = scala.io.Source.fromInputStream(appPropertiesStream).mkString
(note the leading forward slash before "app.properties": with Class.getResourceAsStream it makes the path absolute within the jar, which is what you want here)
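Since the question is about a Java application, here is the same idea in Java (a sketch; YourJob again stands for any class packaged in the jar):

import java.io.InputStream;
import java.util.Properties;

public class YourJob {
    // Reads app.properties bundled at the root of the job's JAR.
    static Properties loadProps() throws Exception {
        Properties props = new Properties();
        try (InputStream in = YourJob.class.getResourceAsStream("/app.properties")) {
            props.load(in);
        }
        return props;
    }
}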

Micronaut PropertySource for multiple configuration files

I have a Micronaut project where I want an unversioned configuration file for private data (like database connections and so on).
This information has to be loaded through the @Property annotation, but since there will be more than one .yml file (there will at least also be an application.yml), I want to be able to provide a file path to @Property, to differentiate where to look for each property.
Since it's my first Micronaut project I'm a bit lost with this stuff, but taking Spring Boot as an example, what I want to do is something like:
@PropertySource("classpath:configprops.properties")
But after reading the Micronaut documentation (https://docs.micronaut.io/latest/guide/index.html#configurationProperties) I found myself unable to do this (apart from just reading the plain file, which I guess would not be idiomatic Micronaut).
I do it by passing JVM arguments.
For example, if I am running it on my local machine using gradle run, I add the following to build.gradle:
run.jvmArgs('-Dmicronaut.environments=dev', "-Dmicronaut.config.files=${System.getProperty("user.home")}/auth-config.groovy")
For my jar deployment, I have made a deploy.sh file as follows:
#!/bin/bash
fuser -k 8181/tcp
nohup java -Xmx512m -Dmicronaut.environments=staging -Dmicronaut.config.files=<path-to-config>/config.groovy -jar application-0.1-all.jar > application.log 2>&1 &
Also note that I am passing different environment names; this lets you include development-environment config directly in the code if you want, like:
application-[environment_name].groovy
application-[environment_name].yml
application-[environment_name].properties
This will help new contributors to your project speed up project setup. I generally also include a note in my application-dev.groovy file:
DEVELOPER NOTE:
***** DO NOT COMMIT ANY CHANGES YOU MAKE IN THIS FILE
*******************************************************
***** CREATE a <config.groovy> file in your <HOME> folder and copy the content of this file into it
***** Override properties as required
*******************************************************
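Note that once Micronaut loads the files, all of them are merged into a single environment, so a bean injects a value with @Property by name only; there is no need to tell it which file the property came from. A minimal sketch (the property name datasources.default.url is an assumption, and the jakarta.inject import applies to recent Micronaut versions):

import io.micronaut.context.annotation.Property;
import jakarta.inject.Singleton;

@Singleton
public class DatabaseConfig {
    // Resolved from the merged configuration, whether it was defined in
    // application.yml, application-dev.groovy, or a micronaut.config.files entry.
    @Property(name = "datasources.default.url")
    protected String url;

    public String getUrl() {
        return url;
    }
}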

How can I dynamically use an externalized properties file with a Spring jar?

With this setup (from Eclipse on Windows 10)
I was able to start my Spring Boot application correctly. A second setup with the same directory pattern worked too.
Now I'm packaging my project as a JAR and I want to use an external properties file. I had a teste32.yml file beside my JAR in the same directory (I also tried putting it inside a /config directory, as shown here, but that didn't work either).
I want to dynamically use a properties file that is always in the same directory as the JAR, no matter which directory that is. I want to say to my client: "take this JAR and this file, put them wherever you want, and run command X, and everything will be alright". I'm trying to discover command X, but before adding any dynamic path I'm testing with absolute paths. I'm using this:
java -jar myJar.jar -Dspring.config.name=teste32 -Dspring.config.location=C:\workspace\myProject\target\
I manually copied teste32 into target\ to test this. But this didn't work. These didn't work either (only the spring.config.location variants):
-Dspring.config.location=file:C:\workspace\myProject\target\
-Dspring.config.location=classpath:/
-Dspring.config.location=file:C:/workspace/myProject/target/
I also tried with no spring.config.location, only spring.config.name.
So my questions are:
What do classpath: and file: mean? Until now I got the two correct setups by pure luck, and I would like to understand when to use each.
When my project is packaged as a JAR, what does the classpath become?
Finally, which combination is necessary to dynamically use a properties file that is always in the same directory as the JAR?
UPDATE
Using --debug on the working example got me this line at the very beginning (the Spring banner was still visible):
2018-09-25 15:45:14.480 DEBUG 11360 --- [ main] o.s.b.c.c.ConfigFileApplicationListener : Loaded config file 'file:src/main/resources/xirulei/teste32.yml' (file:src/main/resources/xirulei/teste32.yml)
But after moving myJar.jar and teste32.yml to a specific directory and running java -jar myJar.jar -Dspring.config.name=teste32 --debug (without spring.config.location, since teste32 is in the same directory as the JAR), I simply didn't get any ConfigFileApplicationListener debug line.
a) java -jar myJar.jar -Dspring.config.name=teste32 -Dspring.config.location=C:\workspace\myProject\target
Did you check the contents of the target dir? I'm pretty sure your config file was placed in target\classes\xirulei, which is why Spring cannot find it in target.
b) When you place teste32.yml in the same directory as the jar file, Spring must be able to find it (given that this directory is the working directory) without -Dspring.config.location (but you still need to provide -Dspring.config.name=teste32).
c) When you use -jar and do not provide additional class paths, classpath: points to the root of the packages inside the jar. Spring cannot find your file at classpath:/ because it is at classpath:/xirulei/.
Well, after all it was a simple mistake. As the documentation says, and as was already pointed out here, it should be
java -jar myproject.jar --spring.config.name=myproject
and not
java -jar myproject.jar -Dspring.config.name=myproject
As stated in the question, the -D form (a JVM argument) is only needed when launching from Eclipse; when using bash/cmd, the -- form (a program argument) is the correct option. This is because anything placed after the jar is handed to the application as a program argument, whereas -D flags are only treated as JVM arguments when they appear before -jar.
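For completeness, the -D form does work from the shell too, as long as it comes before -jar so the JVM consumes it:
java -Dspring.config.name=myproject -jar myproject.jar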

How to copy files from Hadoop cluster to local file system

Setup:
I have a map-reduce job. In the mapper class (which obviously runs on the cluster), I have code something like this:
try {
.
.
.
} catch (<some exception>) {
// Do some stuff
}
What I want to change:
In the catch block, I want to copy the logs from the cluster to the local file system.
Problem:
I can see the log file in the directory on the node if I check from the command line. But when I try to copy it using org.apache.hadoop.fs.FileSystem.copyToLocalFile(boolean delSrc, Path src, Path dst), it says the file does not exist.
Can anyone tell me what I am doing wrong? I am very new to Hadoop, so maybe I am missing something obvious. Please ask me any clarifying questions, if needed, as I am not sure I have given all the necessary information.
Thanks
EDIT 1: Since I am trying to copy files from the cluster to local and the Java code is also running on the cluster, can I even use copyToLocalFile()? Or do I need to do a simple scp?
The MapReduce log files are usually located on the local file system of the node where the map/reduce task ran, under HADOOP_LOG_DIR/userlogs/mapOrReduceTask. Each map/reduce task generates syslog/stdout/stderr files in that directory.
It is easier to use the TaskTracker's web UI to view the local log files, or you can ssh to the machine and look at the logs in the directories mentioned above.
By default, the TaskTracker web UI URL is http://machineName:50060/
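As for the copyToLocalFile() confusion in the question: FileSystem.get(conf) returns the cluster's default filesystem (HDFS), so it resolves the source path against HDFS, where the log file does not exist; and since the code runs on a worker node, "local" means that node's disk, not your workstation. One workaround is to copy the node-local log into HDFS and fetch it later (e.g. with hadoop fs -get). A sketch, with the paths as assumptions:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public final class LogCollector {
    // Copies a file from the worker node's local disk into HDFS so it can be
    // retrieved later from anywhere in the cluster.
    public static void saveLog(Configuration conf, String localLogPath, String hdfsDir)
            throws Exception {
        FileSystem hdfs = FileSystem.get(conf); // the default (distributed) filesystem
        hdfs.copyFromLocalFile(false, new Path(localLogPath), new Path(hdfsDir));
    }
}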

Using P4Package (Java) from Java app to validate Perforce directory

In a web-app I'm writing, the user is supposed to enter the path in the Perforce repository for the node they're entering. The application is supposed to validate that the entered directory exists in the repo.
I've got the P4Package (p4.jar) lib, and I'm configuring it correctly enough that it works for almost everything, EXCEPT this directory validation. I'm creating a DirEntry (from the p4.jar) using a configured Env and a path, but when I call DirEntry.sync(), it issues an incorrect command. Where I want it to issue the command:
p4 [config info] dirs directory_argument <-- using the dirs command to validate a dir
Instead, it issues:
p4 [config info] dirs directory_argument%1 <-- note extraneous %1
Which always fails, since none of the directories have a %1 at the end of them.
Any help? Is there a different way to check that a directory exists using this package?
Sounds like the sync command has a bug in relation to dir entries. My suggestion would be to just roll the command yourself using the Perforce command line, as that has to be set up anyway in order to use the Java library:
import java.io.BufferedReader;
import java.io.InputStreamReader;

Process p = Runtime.getRuntime().exec(new String[] {"p4", "dirs", directory_argument});
BufferedReader stdOut = new BufferedReader(new InputStreamReader(p.getInputStream()));
// Read the output of the command and process it appropriately after this
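For the validation itself, assuming p4 dirs prints matching directories to stdout and reports errors on stderr, a non-empty first line of output means the directory exists:
// If "p4 dirs" printed at least one line, the depot directory exists.
boolean dirExists = stdOut.readLine() != null;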
I would try another library, P4Java, instead:
http://tek42.com/p4java
P4Java is much newer, and I've found it works much better than P4Package. It is used in the Hudson project, and I've seen it in the Fisheye source, though I'm not sure whether they are using it or not.
So, the code I was using did have a bug, requiring me to make a change and check the fix into my repository.
However, since then Perforce has come out with their own Java wrapper for the P4 client, which works much better. I'd give that one a shot.
