I am trying to load a custom config file, myConfig.conf, for an Elasticsearch plugin, like so:
conf = ConfigFactory.load("myConfig.conf");
The file has only the following contents:
myInteger: 1234
When I try to access the variable myInteger, it fails:
int bar1 = conf.getInt("myInteger");
With error message:
com.typesafe.config.ConfigException$Missing: system properties: No configuration setting found for key 'myInteger'
When I print out the contents of the loaded config, it shows a dump of Elasticsearch settings and system properties, like so:
Config(SimpleConfigObject({"es":{"bundled_jdk":"false","distribution":{"flavor":"oss","type":"zip"},"logs":{"base_path":"/Users/me/Downloads/project/build/testclusters/integTest-0/logs","cluster_name":"integTest","node_name":"integTest-0"},"networkaddress":{"cache":{"negative":{"ttl":"10"},"ttl":"60"}},"path":{"conf":"/Users/me/Downloads/project/build/testclusters/integTest-0/config","home":"/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST"}},"file":{"encoding":"UTF-8","separator":"/"},"ftp":{"nonProxyHosts":"local|*.local|169.254/16|*.169.254/16"},"http":{"nonProxyHosts":"local|*.local|169.254/16|*.169.254/16"},"io":{"netty":{"allocator":{"numDirectArenas":"0"},"noKeySetOptimization":"true","noUnsafe":"true","recycler":{"maxCapacityPerThread":"0"}}},"java":{"awt":{"headless":"true"},"class":{"path":"/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-queries-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/hppc-0.8.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/jackson-core-2.10.4.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/log4j-api-2.11.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-suggest-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/elasticsearch-7.8.0.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-analyzers-common-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/jopt-simple-5.0.2.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-highlighter-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/jackson-dataformat-cbor-2.10.4.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-spatial3d-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/elasticsearch-secure-sm-7.8.0.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-join-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/log4j-core-2.11.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/java-version-checker-7.8.0.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/elasticsearch-cli-7.8.0.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/elasticsearch-x-content-7.8.0.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-spatial-extras-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/snakeyaml-1.26.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-queryparser-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/elasticsearch-geo-7.8.0.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/jackson-dataformat-smile-2.10.4.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/elasticsearch-plugin-classloader-7.8.0.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8
.0-INTEG_TEST/lib/t-digest-3.2.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-misc-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-sandbox-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/elasticsearch-core-7.8.0.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/jna-4.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-backward-codecs-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/spatial4j-0.7.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-grouping-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/jackson-dataformat-yaml-2.10.4.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-memory-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/elasticsearch-launchers-7.8.0.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/HdrHistogram-2.1.9.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/lucene-core-8.5.1.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/jts-core-1.15.0.jar:/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST/lib/joda-time-2.10.4.jar","version":"59.0"},"home":"/usr/local/Cellar/openjdk/15.0.1/libexec/openjdk.jdk/Contents/Home","io":{"tmpdir":"/Users/me/Downloads/project/build/testclusters/integTest-0/tmp"},"library":{"path":"/Users/me/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java:."},"locale":{"providers":"SPI,COMPAT"},"runtime":{"name":"OpenJDK Runtime Environment","version":"15.0.1+9"},"specification":{"name":"Java Platform API Specification","vendor":"Oracle Corporation","version":"15"},"vendor":{"url":{"bug":"https://bugreport.java.com/bugreport/"}},"version":"15.0.1","vm":{"compressedOopsMode":"Zero based","info":"mixed mode, sharing","name":"OpenJDK 64-Bit Server VM","specification":{"name":"Java Virtual Machine Specification","vendor":"Oracle Corporation","version":"15"},"vendor":"Oracle Corporation","version":"15.0.1+9"}},"jdk":{"debug":"release"},"jna":{"loaded":"true","nosys":"true","platform":{"library":{"path":"/usr/lib:/usr/lib"}}},"jnidispatch":{"path":"/Users/me/Downloads/project/build/testclusters/integTest-0/tmp/jna-518060194/jna19175516516881411.tmp"},"line":{"separator":"\n"},"log4j":{"shutdownHookEnabled":"false"},"log4j2":{"disable":{"jmx":"true"}},"os":{"arch":"x86_64","name":"Mac OS X","version":"10.15.5"},"path":{"separator":":"},"socksNonProxyHosts":"local|*.local|169.254/16|*.169.254/16","sun":{"arch":{"data":{"model":"64"}},"boot":{"library":{"path":"/usr/local/Cellar/openjdk/15.0.1/libexec/openjdk.jdk/Contents/Home/lib"}},"cpu":{"endian":"little"},"io":{"unicode":{"encoding":"UnicodeBig"}},"java":{"command":"org.elasticsearch.bootstrap.Elasticsearch","launcher":"SUN_STANDARD"},"jnu":{"encoding":"UTF-8"},"management":{"compiler":"HotSpot 64-Bit Tiered 
Compilers"},"nio":{"ch":{"bugLevel":""}}},"user":{"country":"GB","dir":"/Users/me/Downloads/project/build/testclusters/integTest-0/distro/7.8.0-INTEG_TEST","home":"/Users/me","language":"en","name":"me","timezone":"Europe/London"}}))
It successfully recognises that the file exists (if I use a fake file path it won't work), but it isn't recognising or reading any of the contents of myConfig.conf.
Why is this? How can I fix this?
EDIT
I should also note that, to read the configuration file without Elasticsearch complaining, I've had to do the following:
AccessController.doPrivileged((PrivilegedAction<AssignmentConfig>) () -> {
    try {
        return AssignmentConfig.configure();
    } catch (Exception e) {
        e.printStackTrace();
    }
    return null;
});
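If I understand Typesafe Config correctly, ConfigFactory.load("myConfig.conf") resolves the name as a classpath resource and then overlays system properties on top; if the resource isn't actually visible on the plugin's classpath, what remains is essentially the system-properties layer, which matches the dump above. A minimal sketch of parsing the file by an explicit path instead (the path shown is only illustrative):

import com.typesafe.config.Config;
import com.typesafe.config.ConfigFactory;
import java.io.File;

// Parse the file directly from disk instead of relying on classpath resolution;
// "/path/to/myConfig.conf" is a placeholder path.
Config conf = ConfigFactory.parseFile(new File("/path/to/myConfig.conf")).resolve();
int bar1 = conf.getInt("myInteger");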
It is a bad idea to use external configuration files in an Elasticsearch plugin. ES provides a mechanism for extending the Elasticsearch configuration: all of your custom config should be put in elasticsearch.yml, along with a custom setting registration in the plugin, like so:
public class MyESPlugin extends Plugin implements ... {
    @Override
    public List<Setting<?>> getSettings() {
        return Arrays.asList(
                new Setting<>("setting1", "", Function.identity(), Setting.Property.NodeScope),
                new Setting<>("setting2", "", Function.identity(), Setting.Property.NodeScope),
                ...);
    }
}
and then, in your elasticsearch.yml you can add:
setting1: ...
setting2: ...
Note that your plugin must be installed before you start up your node; otherwise the node will not start, because it cannot recognize the custom settings.
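For reading the values back inside the plugin, a common pattern (a sketch only; the setting names follow the example above, and exactly where you read them depends on which plugin hooks you use) is to declare the Setting objects as constants so they can be both registered and evaluated against the node Settings:

import java.util.Arrays;
import java.util.List;
import org.elasticsearch.common.settings.Setting;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.plugins.Plugin;

public class MyESPlugin extends Plugin {
    // Declare the settings once so they can be both registered and read.
    public static final Setting<String> SETTING1 =
            Setting.simpleString("setting1", Setting.Property.NodeScope);
    public static final Setting<String> SETTING2 =
            Setting.simpleString("setting2", Setting.Property.NodeScope);

    @Override
    public List<Setting<?>> getSettings() {
        return Arrays.asList(SETTING1, SETTING2);
    }

    // Wherever your plugin gets hold of the node Settings (e.g. in createComponents),
    // the values configured in elasticsearch.yml can be read like this:
    void useSettings(Settings settings) {
        String value1 = SETTING1.get(settings);
        String value2 = SETTING2.get(settings);
    }
}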
I'm trying to understand a comment that a colleague made. We're using testcontainers to create a fixture:
import org.testcontainers.containers.GenericContainer;
import org.testcontainers.utility.DockerImageName;
public class SalesforceFixture extends GenericContainer<SalesforceFixture> {

    private static final String APPLICATION_NAME = "salesforce-emulator";

    public SalesforceFixture() {
        // super(ImageResolver.resolve(APPLICATION_NAME));
        super(DockerImageName.parse("gcr.io/ad-selfserve/salesforce-emulator:latest"));
        ...
    }
    ...
The commented-out code is what it used to be; the next line is my colleague's suggestion. On that line he commented:
This is the part I don't know. The [ImageResolver] gets the specific version of the emulator, rather than the latest. You need a docker-info file for that though, which jib doesn't automatically generate (but I think it can).
This is what I know or have figured so far:
SalesforceFixture is a class that will be used by other projects to write tests. It spins up a container in Docker, running a service that emulates the real service's API. It's like a local version of the service that behaves enough like the real thing that if one writes code and tests using the fixture, it should work the same in production. (This is where my knowledge ends.)
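For context, a fixture like this typically gets used in a test roughly as follows (a hypothetical sketch: the exposed port and the way the base URL is built are assumptions, not taken from this project):

// Start the emulator container and point the code under test at it.
SalesforceFixture salesforce = new SalesforceFixture().withExposedPorts(8080); // assumed port
salesforce.start();
String baseUrl = "http://" + salesforce.getHost() + ":" + salesforce.getMappedPort(8080);
// ... run tests against baseUrl instead of the real Salesforce API ...
salesforce.stop();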
I looked into ImageResolver—it seems to be a class we wrote that searches a filesystem for something:
public static String resolve(String applicationName, File... roots) {
    Stream<File> searchPaths = Arrays.stream(roots).flatMap((value) -> {
        return Stream.of(new File(value, "../" + applicationName), new File(value, applicationName));
    });
    Optional<File> buildFile = searchPaths.flatMap((searchFile) -> {
        if (searchFile.exists()) {
            File imageFile = new File(searchFile + File.separator + "/target/docker/image-name");
            if (imageFile.exists()) {
                return Stream.of(imageFile);
            }
        }
        return Stream.empty();
    }).findAny();
    InputStream build = (InputStream) buildFile.map(ImageResolver::fileStream).orElseGet(() -> {
        return searchClasspath(applicationName);
    });
    if (build != null) {
        try {
            return IOUtils.toString(build, Charset.defaultCharset()).trim();
        } catch (IOException var6) {
            throw new RuntimeException("An exception has occurred while reading build file", var6);
        }
    } else {
        throw new RuntimeException("Could not resolve target image for application: " + applicationName);
    }
}
But I'm confused. What filesystem? Like, what is the present working directory? My local computer, wherever I ran the Java program from? Or is this from within some container? (I don't think so.) Or maybe the directory structure inside a .jar file? Or somewhere in gcr.io?
What does he mean about a "specific version number" vs. "latest"? I mean, when I build this project, whatever it built is all I have. Isn't that equivalent to "latest"? In what case would an older version of an image be present? (That's what made me think of gcr.io.)
Or does he mean that, in a project using this project's image, one will not be able to specify a version via Maven/pom.xml, so it will always spin up the latest?
Sorry this is long, just trying to "show my work." Any hints welcome. I'll keep looking.
I can't comment on the specifics of your internal implementation, but ImageResolver seems to work on your local filesystem: it looks into your target/ directory and also falls back to the classpath. I'd guess this code was written to resolve an image name (not an image itself), since it returns a String.
Regarding latest: using the latest tag for a Docker image is generally considered an anti-pattern, so your colleague is most likely commenting on that. Here is an article explaining some of the issues with the latest tag:
https://vsupalov.com/docker-latest-tag/
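If the goal is simply to stop depending on :latest, the most direct change is to pin the image to a specific, immutable tag in the constructor (a sketch; the tag shown is made up):

public SalesforceFixture() {
    // Pin the emulator to a known version instead of whatever "latest" currently points at.
    super(DockerImageName.parse("gcr.io/ad-selfserve/salesforce-emulator:1.0.0")); // hypothetical tag
}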
Besides, these questions are very specific to your project, so your colleague is probably better placed to answer them than SO.
With the same code, I want to be able to run in both a real mode and a dev mode. So I added a service-mode argument, which decides the port, database name, etc.
I also want to use a separate log file for dev mode.
But with log4j, it seems that the logging configuration is fixed when the app starts.
Is there a way to change the log file path depending on a given argument?
I expected something like:
// Main.java
if (args.serviceMode is real) {
    setLogging(log4j.properties)
} else {
    setLogging(log4j.properties.dev)
}
or
// log4j.properties
log4j.appender.file.File=/path/to/real/log
log4jdev.appender.file.File=/path/to/dev/log

// Main.java
if (args.serviceMode is real) {
    setLogPath(log4j.appender.file.File)
} else {
    setLogPath(log4jdev.appender.file.File)
}
Any link or comment appreciated.
You can add spring.profiles in your application.yml and set a new log file name:
spring.profiles: dev
logging.file: new-app.log
You can find more information here.
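If the application isn't on Spring Boot, the same effect can be had with plain log4j 1.x (a minimal sketch; the file names and the way the mode argument is read are assumptions): choose the properties file at startup, before anything logs.

import org.apache.log4j.PropertyConfigurator;

public class Main {
    public static void main(String[] args) {
        boolean realMode = args.length > 0 && "real".equals(args[0]);
        // Select the logging configuration for this run before any logger is used.
        PropertyConfigurator.configure(realMode ? "log4j.properties" : "log4j-dev.properties");
        // ... start the rest of the application
    }
}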
I am attempting to get an embedded Spring Config Server implementation working that reads configuration from GitHub. I'm following this tutorial:
https://mromeh.com/2017/12/04/spring-boot-with-embedded-config-server-via-spring-cloud-config/
I am getting the following Exception when my Spring Boot app tries to start up:
Caused by: com.jcraft.jsch.JSchException: There are not any available ciphers.
    at com.jcraft.jsch.Session.send_kexinit(Session.java:629)
    at com.jcraft.jsch.Session.connect(Session.java:307)
    at org.eclipse.jgit.transport.JschConfigSessionFactory.getSession(JschConfigSessionFactory.java:146)
    ... 23 more
The only interesting bit in my code that I see contributing to this is my bootstrap.yml file, which looks like this:
spring:
  application:
    name: DemoApplication.yml
---
spring:
  cloud:
    config:
      failFast: true
      server:
        bootstrap: true
        git:
          uri: git@github.com:mycompany/demo-config.git
I am running OpenJDK 8u212 on macOS, as shown by the following:
#> java -version
openjdk version "1.8.0_212"
OpenJDK Runtime Environment (AdoptOpenJDK)(build 1.8.0_212-b03)
OpenJDK 64-Bit Server VM (AdoptOpenJDK)(build 25.212-b03, mixed mode)
I've searched through the Spring code and documentation, and have yet to find anything about passing configuration parameters or adding code to affect how the Jsch session being used by Spring is constructed. Everything I find suggests that what I'm doing should just work.
I'm at a loss as to where to go from here. Can someone tell me what I'm missing...what I need to do to get past this problem?
To consolidate the comments earlier...
Behind the scenes, Spring is using JGit to make the Git connection, and by default JGit uses JSch for SSH, which is configured by the ~/.ssh/config file.
The wiki also has details of how to bypass JSch and use a native ssh command: the GIT_SSH environment variable can be set, e.g. to /usr/bin/ssh on OS X or Linux, or even to something like C:\Program Files\TortoiseGit\bin\TortoiseGitPlink.exe.
Following the comment about avoiding a dependency on setting the environment variable, note how the GIT_SSH environment variable is checked using a SystemReader in the TransportGitSsh.useExtSession() method.
This means one way would be to override the SystemReader class. It's not a small interface though, so it involves a fair bit of wrapping code, with the custom bit in getenv():
import org.eclipse.jgit.lib.Config;
import org.eclipse.jgit.storage.file.FileBasedConfig;
import org.eclipse.jgit.util.FS;
import org.eclipse.jgit.util.SystemReader;

public class CustomSystemReader extends SystemReader {

    private final SystemReader systemReader;

    public CustomSystemReader(SystemReader systemReader) {
        this.systemReader = systemReader;
    }

    @Override
    public String getHostname() {
        return systemReader.getHostname();
    }

    @Override
    public String getenv(String variable) {
        // Make JGit believe GIT_SSH is set, so it uses the external ssh command instead of JSch
        if ("GIT_SSH".equals(variable))
            return "/usr/bin/ssh";
        return systemReader.getenv(variable);
    }

    @Override
    public String getProperty(String key) {
        return systemReader.getProperty(key);
    }

    @Override
    public FileBasedConfig openUserConfig(Config parent, FS fs) {
        return systemReader.openUserConfig(parent, fs);
    }

    @Override
    public FileBasedConfig openSystemConfig(Config parent, FS fs) {
        return systemReader.openSystemConfig(parent, fs);
    }

    @Override
    public long getCurrentTime() {
        return systemReader.getCurrentTime();
    }

    @Override
    public int getTimezone(long when) {
        return systemReader.getTimezone(when);
    }
}
Which can then be wired in like this:
SystemReader.setInstance(
        new CustomSystemReader(SystemReader.getInstance()));
@df778899 gave me the direction I needed to figure this out, with the statement:
"As you've found, by default this uses JSch, which is configured by the ~/.ssh/config file - if that exists you may find a clue in there."
I had already looked at this file for clues, especially for anything about encryption setup. I saw that I had this commented-out line near the top of the file:
# Ciphers +aes256-cbc
What I had missed (I thought I'd done a text search for "cipher", but obviously I hadn't) was that buried way down in the middle of the file, amidst a bunch of unrelated settings, I was doing the same thing again, not commented out this time:
Host *
    ...
    Ciphers +aes256-cbc
    ...
It is this line that was killing me with JSch. If I comment out this one line, my simple setup works fine. @df778899's statement about ~/.ssh/config being critical to JSch's setup was the push I needed.
This isn't going to be part of my solution, but I'll point out that the following code seems to be how GIT_SSH can be set from Java, at the top of main():
public static void main(String[] args) {
    try {
        // Reflectively reach past the unmodifiable wrapper returned by System.getenv()
        // and put GIT_SSH into the underlying map (a fragile JDK-internal hack).
        Map<String, String> env = System.getenv();
        Field field = env.getClass().getDeclaredField("m");
        field.setAccessible(true);
        ((Map<String, String>) field.get(env)).put("GIT_SSH", "/usr/bin/ssh");
    } catch (NoSuchFieldException | IllegalAccessException e) {
        e.printStackTrace();
    }
    SpringApplication.run(DemoApplication.class, args);
}
This also "solves" my problem. That bit about getDeclaredField("m") is particularly strange. What the heck is "m"?
We're working on deploying a Java project to Heroku that uses MongoDB. According to the Heroku docs, the DB connection parameters are read from an environment variable, MONGOHQ_URL. When I run the project in Netbeans on my laptop, how do I set this variable?
I tried adding it as a VM option with -DMONGOHQ_URL=... in Run -> Set Project Configuration -> Customize -> Run, and also in Actions -> Run project and Run file via main(), but to no avail. When the program reads it with System.getenv it's not set.
You can set it in your netbeans.conf file. Add the line:
export MONGOHQ_URL=...
There's an example here: http://sunng.info/blog/2009/12/setting-environment-variables-for-netbeans/.
Ok, I figured it out. This may be obvious to Java coders, but I'm not one, so here is what I cobbled together.
String mongo_url = System.getenv("MONGOHQ_URL");
// If env var not set, try reading from Java "system properties"
if (mongo_url == null) {
    mongo_url = System.getProperty("MONGOHQ_URL");
}

MongoURI mongoURI = new MongoURI(mongo_url);
this.db = mongoURI.connectDB();

// Only authenticate if username or password provided
if (!"".equals(mongoURI.getUsername()) || mongoURI.getPassword().length > 0) {
    Boolean success = this.db.authenticate(mongoURI.getUsername(), mongoURI.getPassword());
    if (!success) {
        System.out.println("MongoDB Authentication failed");
        return;
    }
}
this.my_collection = db.getCollection("my_collection");
We need to import an SSJS library into a database using DXL. For this we have written a Java agent, and its code goes something like this:
import lotus.domino.*;

public class JavaAgent extends AgentBase {

    private DxlImporter importer = null;

    public void NotesMain() {
        try {
            Session session = getSession();
            AgentContext agentContext = session.getAgentContext();

            String filename = "C:\\tempssjslib.xml";
            Stream stream = session.createStream();
            if (stream.open(filename) & (stream.getBytes() > 0)) {
                Database importdb = session.getCurrentDatabase();
                importer = session.createDxlImporter();
                importer.setReplaceDbProperties(true);
                importer.setReplicaRequiredForReplaceOrUpdate(false);
                importer.setAclImportOption(DxlImporter.DXLIMPORTOPTION_REPLACE_ELSE_IGNORE);
                importer.setDesignImportOption(DxlImporter.DXLIMPORTOPTION_REPLACE_ELSE_CREATE);
                importer.importDxl(stream, importdb);
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            try {
                System.out.println(importer.getLog());
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
}
The file C:\tempssjslib.xml contains an SSJS library which I created in Domino Designer and then exported using "Tools > DXL Utilities > Exporter" (for testing purposes). But when I run this agent, the library does not get imported into the database, and there is no error in DxlImporter.getLog() either.
I tried a similar procedure with XPages, forms and LotusScript script libraries and was able to import them successfully, but the same agent is not able to import an SSJS library.
Is there something that I have missed in the code? Can we import an SSJS library into a database using DXL?
It looks like the exporter tool (or maybe even the DXL exporter itself) is not exporting all the needed fields. If you manually add the following inside the DXL file, just before the item name='$ServerJavaScriptLibrary'... line, it will import successfully:
<item name='$Flags'><text>.5834Q</text></item>
<item name='$TITLE'><text>...name of the SSJS library...</text></item>
If you print the imported note ID and analyse it in an appropriate tool (Ytria or NotesPeek), you'll see that the problem is with the $Flags field.
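For the "print the imported note ID" step, DxlImporter keeps track of what it created, so a small addition to the agent above (a sketch, placed right after importDxl()) will list the IDs to look up in Ytria or NotesPeek:

// After importer.importDxl(stream, importdb):
String noteId = importer.getFirstImportedNoteID();
while (noteId != null && noteId.length() > 0) {
    System.out.println("Imported note ID: " + noteId);
    noteId = importer.getNextImportedNoteID(noteId);
}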
I created a test SSJS library and its $Flags field contains ".5834Q", but the imported one has only "34Q".
I don't have the exact reference for those flags, but it may be a good start: manually overwriting this field works, although the flags may contain other valuable information.
It seems like a bug to me.
In addition, the Ytria tool has a good reference for the $Flags field content.
Make your life easier and use the Import/Export plug-in found on OpenNTF: http://www.openntf.org/blogs/openntf.nsf/d6plinks/NHEF-7YAAF6 It has an Ant API, so you can automate operations. It needs Domino Designer, so it might not fit your use case. Alternatively (I haven't checked): have you had a look at whether WebDAV exposes the script libraries?