FileNotFoundException for properties file in aws-cdk - java

I've been trying to read a properties file and want it to be dynamic; I'm doing this in aws-cdk.
My project layout:
Main Project
  resources
    config.properties
  src
    main/java/com/myorg
      xxxstage.java
The class xxxstage.java has the following code:
public class xxxstage extends Stage {

    public xxxstage(final Construct scope, final String id) {
        this(scope, id, null);
    }

    public xxxstage(final Construct scope, final String id, final StageProps props) {
        super(scope, id, props);

        String account = null;
        InputStream inputStream = null;
        try {
            Properties prop = new Properties();
            String propFileName = "resources/config.properties";
            inputStream = this.getClass().getClassLoader().getResourceAsStream(propFileName);
            System.out.println("inputStream is -> " + inputStream);
            if (inputStream != null) {
                prop.load(inputStream);
            } else {
                throw new FileNotFoundException("property file '" + propFileName + "' not found in the classpath");
            }
            // get the property value and print it out
            account = prop.getProperty("account.id");
            System.out.println("account id -> " + account);
        } catch (Exception e) {
            System.out.println("Exception: " + e);
        } finally {
            try {
                inputStream.close();
            } catch (Exception e) {
                System.out.println("Exception: " + e);
            }
        }

        new xxxStack(this, "xxxStack", StackProps.builder()
                .env(new Environment.Builder()
                        .account(account)
                        .region("us-east-1")
                        .build())
                .build());
    }
}
The line where I print System.out.println("inputStream is -> " + inputStream); shows null, and hence the FileNotFoundException.
It's worth noting that this works fine when I run it as a local Java project; it is, however, failing during the build phase of the aws-codepipeline.
In the pipeline build phase I'm getting:
inputStream is -> null
Exception: java.io.FileNotFoundException: property file 'resources/config.properties' not found in the classpath
Exception: java.lang.NullPointerException
Can someone please help?
EDIT 1 - Adding environment variables:
Environment Vars
PATH=/root/.npm/_npx/230/bin:/usr/local/bin/sbt/bin:/root/.phpenv/shims:/root/.phpenv/bin:/root/.goenv/shims:/root/.goenv/bin:/go/bin:/root/.phpenv/shims:/root/.phpenv/bin:/root/.pyenv/shims:/root/.pyenv/bin:/root/.rbenv/shims:/usr/local/rbenv/bin:/usr/local/rbenv/shims:/root/.dotnet/:/root/.dotnet/tools/:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/opt/tools:/codebuild/user/bin
AWS_CONTAINER_CREDENTIALS_RELATIVE_URI=/v2/credentials/xxx-xxx-xxx
CODEBUILD_CI=true
CODEBUILD_AUTH_TOKEN=xxx-xxx-xxx
JAVA_8_HOME=/usr/lib/jvm/java-1.8.0-amazon-corretto
JDK_8_HOME=/usr/lib/jvm/java-1.8.0-amazon-corretto
CODEBUILD_BUILD_ARN=arn:aws:codebuild:us-east-1:821518525729:build/pipelinePipelinexxx-C-xxx-xxx-xxx
CODEBUILD_GOPATH=/codebuild/output/src123245
GOLANG_15_VERSION=1.15.12
CODEBUILD_BUILD_SUCCEEDING=1
GOENV_DISABLE_GOPATH=1
JRE_HOME=/usr/lib/jvm/java-11-amazon-corretto
CDK_DEFAULT_REGION=us-east-1
JAVA_11_HOME=/usr/lib/jvm/java-11-amazon-corretto
PHP_74_VERSION=7.4.13
CODEBUILD_SOURCE_VERSION=arn:aws:s3:::pipelinexxxstack-pipelinexxxartifacts-1cacuj92rramf/ServiceDeploymentPip/(user)/c1BGqPX
RUBY_BUILD_SRC_DIR=/usr/local/rbenv/plugins/ruby-build
JDK_HOME=/usr/lib/jvm/java-11-amazon-corretto
PWD=/codebuild/output/src12345/src
CODEBUILD_CONTAINER_NAME=default
PYTHON_37_VERSION=3.7.10
CODEBUILD_START_TIME=1642556874326
CDK_DEFAULT_ACCOUNT=12345
AWS_REGION=us-east-1
PYTHON_38_VERSION=3.8.10
CODEBUILD_BUILD_URL=https://us-east-1.console.aws.amazon.com/codebuild/home?region=us-east-1#/builds/pipelinePipelinexxxSynthC-ofdfmXGrWl5m:xxx-xxx-xxx/view/new
CDK_OUTDIR=cdk.out
DOTNET_31_SDK_VERSION=3.1.404
CODEBUILD_BUILD_ID=pipelinePipelinexxxSynthC-ofdfmXGrWl5m:xxx-xxx-xxx
GOPATH=/go:/codebuild/output/src12345
CODEBUILD_RESOLVED_SOURCE_VERSION=xxx-xxx-xxx
OLDPWD=/codebuild/output/src12345/src
RUBY_26_VERSION=2.6.6
AWS_STS_REGIONAL_ENDPOINTS=regional
DOTNET_ROOT=/root/.dotnet
_PROJECT_CONFIG_HASH=xxx-xxx-xxx
CODEBUILD_AGENT_ENDPOINT=http://127.0.0.1:port
LC_CTYPE=C.UTF-8
JRE_8_HOME=/usr/lib/jvm/java-1.8.0-amazon-corretto/jre
CODEBUILD_BUILD_IMAGE=aws/codebuild/standard:5.0
PYYAML_VERSION=5.4.1
FORCE_COLOR=0
CODEBUILD_BMR_URL=https://CODEBUILD_AGENT:port
JAVA_HOME=/usr/lib/jvm/java-11-amazon-corretto
CODEBUILD_SRC_DIR=/codebuild/output/src12345/src
AWS_DEFAULT_REGION=us-east-1
AWS_EXECUTION_ENV=AWS_ECS_EC2
ECS_CONTAINER_METADATA_URI=http://169.254.170.2/v3/xxx-xxx-xxx
ECS_CONTAINER_METADATA_URI_V4=http://169.254.170.2/v4/xxx-xxx-xxx
CODEBUILD_INITIATOR=codepipeline/ServicexxxPipeline
MAVEN_OPTS= -Dmaven.wagon.httpconnectionManager.maxPerRoute=2
CDK_CONTEXT_JSON={"@aws-cdk/aws-apigateway:usagePlanKeyOrderInsensitiveId":true,"@aws-cdk/core:enableStackNameDuplicates":true,"aws-cdk:enableDiffNoFail":true,"@aws-cdk/core:stackRelativeExports":true,"@aws-cdk/aws-ecr-assets:dockerIgnoreSupport":true,"@aws-cdk/aws-secretsmanager:parseOwnedSecretName":true,"@aws-cdk/aws-kms:defaultKeyPolicies":true,"@aws-cdk/aws-s3:grantWriteWithoutAcl":true,"@aws-cdk/aws-ecs-patterns:removeDefaultDesiredCount":true,"@aws-cdk/aws-rds:lowercaseDbIdentifier":true,"@aws-cdk/aws-efs:defaultEncryptionAtRest":true,"@aws-cdk/aws-lambda:recognizeVersionProps":true,"@aws-cdk/aws-cloudfront:defaultSecurityPolicyTLSv1.2_2021":true,"@aws-cdk/core:newStyleStackSynthesis":true,"aws:cdk:enable-path-metadata":true,"aws:cdk:enable-asset-metadata":true,"aws:cdk:version-reporting":true,"aws:cdk:bundling-stacks":[]}
CODEBUILD_LOG_PATH=xxx-xxx-xxx
CODEBUILD_EXECUTION_ROLE_BUILD=
CODEBUILD_BUILD_NUMBER=31
GOLANG_16_VERSION=1.16.4
PHP_73_VERSION=7.3.25
CODEBUILD_FE_REPORT_ENDPOINT=https://codebuild.us-east-1.amazonaws.com/
CODEBUILD_LAST_EXIT=0
AWS_NODEJS_CONNECTION_REUSE_ENABLED=1
MAVEN_CMD_LINE_ARGS= -e -q compile exec:java
NUGET_XMLDOC_MODE=skip
DOTNET_5_SDK_VERSION=5.0.202
NODE_12_VERSION=12.22.2
PYTHON_39_VERSION=3.9.5
CDK_CLI_VERSION=2.8.0
NODE_14_VERSION=14.17.2
MAVEN_PROJECTBASEDIR=/codebuild/output/src12345/src
CDK_CLI_ASM_VERSION=16.0.0
JRE_11_HOME=/usr/lib/jvm/java-11-amazon-corretto
RUBY_27_VERSION=2.7.2
HOSTNAME=12345
JDK_11_HOME=/usr/lib/jvm/java-11-amazon-corretto
CODEBUILD_PROJECT_UUID=xxx-xxx-xxx
PHP_80_VERSION=8.0.0
CODEBUILD_KMS_KEY_ID=arn:aws:kms:us-east-1:(account-num):alias/aws/s3
HOME=/root

I found an answer here and it worked for me. The location of the properties file matters.
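What worked, in short: keep the properties file where the build tool actually puts it on the classpath. A minimal sketch, assuming the standard Maven layout where the file lives at src/main/resources/config.properties (so it ends up at the classpath root and is loaded without any "resources/" prefix):

Properties prop = new Properties();
// "config.properties" is resolved from the classpath root, not from a resources/ folder
try (InputStream in = getClass().getClassLoader().getResourceAsStream("config.properties")) {
    if (in == null) {
        throw new FileNotFoundException("config.properties not found on the classpath");
    }
    prop.load(in);
}
String account = prop.getProperty("account.id");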

Related

read project directory to get properties file in java web application

I am trying to read a properties file located in my project directory src/test/resources/properties/api/. But this approach is not working and it gives me a FileNotFoundException.
Please find my code below:
public Properties extractProperties() throws IOException {
    InputStream configReader = null;
    String env = getProperty("tuf.environment");
    try {
        configReader = new FileInputStream(new File("src/test/resources/properties/api/" + env + ".properties")); // throwing exception
    } catch (FileNotFoundException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
    prop.load(configReader);
    return prop;
}
I would do it the following way. Please note that the extractProperties() method will return an empty Properties object if the file was not found. Please also note the try-with-resources statement which will auto-close the InputStream.
public Properties extractProperties() throws IOException {
    String env = getProperty("tuf.environment");
    Properties prop = new Properties();
    try (InputStream in = this.getClass().getResourceAsStream("/properties/api/" + env + ".properties")) {
        prop.load(in);
    } catch (Exception e) {
        e.printStackTrace();
    }
    return prop;
}
Judging from your path, you are using either Maven or Gradle, as this looks like their default project structure. That means the contents of src/test/resources are copied to the root of the classpath, so at runtime there is no src/test/resources directory. (The same applies to src/main/resources as well!)
So if you want to load the file, you need to remove the src/test/resources part from the path you are loading.
Next, if this is run from a packaged application, loading it as a File won't work, because it no longer is one: a File must be a physical file on the filesystem, not an entry inside an archive.
Taking all that into account, you should be able to load the properties using the following:
public Properties extractProperties() throws IOException {
    String env = getProperty("tuf.environment");
    String resource = "/properties/api/" + env + ".properties";
    Properties prop = new Properties();
    try (InputStream in = getClass().getResourceAsStream(resource)) {
        prop.load(in);
        return prop;
    }
}
Try something like the below (this assumes a Spring ResourceLoader is available as resourceLoader):
public Properties extractProperties() throws IOException {
    Properties prop = new Properties();
    String env = getProperty("tuf.environment");
    String mappingFileName = "/properties/api/" + env + ".properties";
    Resource resource = resourceLoader.getResource("classpath:" + mappingFileName);
    try (InputStream inputStream = resource.getInputStream();
         BufferedReader bufferedInputStream = new BufferedReader(new InputStreamReader(inputStream))) {
        prop.load(bufferedInputStream);
    } catch (IOException ie) {
        // handle exception
    }
    return prop;
}
Probably env is not what you think it is. Why not list all the files in that directory? You can print them with Files.list: https://docs.oracle.com/en/java/javase/14/docs/api/java.base/java/nio/file/Files.html#list(java.nio.file.Path)
With the relevant directory:
Path apiDir = Paths.get("src/test/resources/properties/api/");
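For example, a minimal sketch (assuming java.nio.file.* and java.util.stream.Stream imports; Files.list throws IOException, so the caller must declare or handle it):

// Print every entry in the directory to see what is actually there
try (Stream<Path> entries = Files.list(apiDir)) {
    entries.forEach(System.out::println);
}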

How best to Impersonate a user account on hadoop

I have a Java program that copies a file from Unix to HDFS. It runs fine; however, I am looking to impersonate a different account when it runs and copies the file.
Input: Apart from the input file and the target HDFS directory path, another input should be a properties file containing the account, keytab directory, and domain.
Please kindly let me know the best way to move forward.
I am currently exploring using a shell to first issue a kinit command and then run the jar.
I am also reading about JAAS and how this can be done in Java itself - from - https://henning.kropponline.de/2016/02/14/a-secure-hdfs-client-example/
I need inputs and any references on the available options.
My Java program that copies the file is below:
public class FileCopy implements Runnable {

    @Option(names = {"-i", "--input"}, required = true, description = "file name to copy to hadoop")
    String input;

    @Option(names = {"-o", "--output"}, required = true, description = "hdfs directory path to be copied into")
    String output;

    public void run() {
        Properties hadoop_properties = new Properties();
        HdfsFileDeploy hdfsFileDeploy = new HdfsFileDeploy();
        try {
            hadoop_properties.load(FileCopy.class.getClassLoader().getResourceAsStream("hadoop.properties"));
        } catch (IOException e) {
            e.printStackTrace();
        }
        FileSystem fs = hdfsFileDeploy.configureFilesystem(hadoop_properties.getProperty("coreSitePath"), hadoop_properties.getProperty("hdfsSitePath"));
        String status = hdfsFileDeploy.writeToHDFS(fs, input, output);
        if ("SUCCESS".equals(status)) {
            System.out.println("completed copying");
        } else {
            System.out.println("copying error");
        }
        hdfsFileDeploy.closeFileSystem(fs);
    }

    public static void main(String[] args) throws IOException {
        CommandLine.run(new FileCopy(), args);
    }
}
public class HdfsFileDeploy {

    public FileSystem configureFilesystem(String coreSitePath, String hdfsSitePath) {
        FileSystem fileSystem = null;
        try {
            Configuration conf = new Configuration();
            Path hdfsCoreSitePath = new Path(coreSitePath);
            Path hdfsHDFSSitePath = new Path(hdfsSitePath);
            conf.addResource(hdfsCoreSitePath);
            conf.addResource(hdfsHDFSSitePath);
            fileSystem = FileSystem.get(conf);
            System.out.println(fileSystem);
            return fileSystem;
        } catch (Exception ex) {
            ex.printStackTrace();
            return fileSystem;
        }
    }

    public void closeFileSystem(FileSystem fileSystem) {
        try {
            fileSystem.close();
        } catch (Exception ex) {
            System.out.println("Unable to close Hadoop filesystem : " + ex);
        }
    }

    public String writeToHDFS(FileSystem fileSystem, String sourcePath, String destinationPath) {
        String failure = "FAILURE";
        String success = "SUCCESS";
        Boolean doNotDelSrc = false;
        Boolean overwrite = true;
        try {
            Path inputPath = new Path(sourcePath);
            Path outputPath = new Path(destinationPath);
            if (!fileSystem.exists(outputPath)) {
                System.out.println("Output path " + outputPath + " does not exist. Creating outputPath directory now..");
                if (fileSystem.mkdirs(outputPath)) {
                    System.out.println("Output path " + outputPath + " created...");
                }
            }
            System.out.println("about to copy from " + inputPath + " to " + outputPath);
            fileSystem.copyFromLocalFile(doNotDelSrc, overwrite, inputPath, outputPath);
            return success;
        } catch (IOException ex) {
            System.out.println("Some exception occurred while writing file to hdfs");
            ex.printStackTrace();
            return failure;
        }
    }
}
Input1: input file
Input2: target hdfs directory
Reference Input: file (say yaml) containing account, domain, keytab path.
The jar should impersonate the configured account and copy the input file to the target HDFS directory.
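For reference, one direction I am considering is Hadoop's UserGroupInformation API: log in from the keytab and then run the copy inside a proxy user. This is only a rough sketch; the impersonation.* property names are hypothetical, and proxy-user impersonation additionally requires hadoop.proxyuser.* settings on the cluster side:

// Rough sketch using org.apache.hadoop.security.UserGroupInformation
// and java.security.PrivilegedExceptionAction.
Configuration conf = new Configuration();
conf.set("hadoop.security.authentication", "kerberos");
UserGroupInformation.setConfiguration(conf);
// Log in as the service principal from its keytab (hypothetical property names)
UserGroupInformation.loginUserFromKeytab(
        hadoop_properties.getProperty("impersonation.principal"),
        hadoop_properties.getProperty("impersonation.keytab"));
// Impersonate the target account and run the copy as that user
UserGroupInformation proxyUser = UserGroupInformation.createProxyUser(
        hadoop_properties.getProperty("impersonation.account"),
        UserGroupInformation.getLoginUser());
proxyUser.doAs((PrivilegedExceptionAction<Void>) () -> {
    FileSystem fs = FileSystem.get(conf);
    fs.copyFromLocalFile(new Path(input), new Path(output));
    return null;
});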

How to make an executable jar file with .dll - RXTX

My program communicates via RS232; for that I use a .jar and two .dlls from RXTX. In the end I want to run it from a single .jar file.
To solve this problem I used this tutorial. But if I run the program from Eclipse (or from the console after exporting) I get this exception:
java.lang.UnsatisfiedLinkError: no rxtxSerial in java.library.path thrown while loading gnu.io.RXTXCommDriver
Exception in thread "main" java.lang.UnsatisfiedLinkError: no rxtxSerial in java.library.path
Here is a minimal example of my code:
private static final String LIB = "lib/";
private final static String RXTXPARALLEL = "rxtxParallel";
private final static String RXTXSERIAL = "rxtxSerial";

static {
    try {
        System.loadLibrary(RXTXSERIAL);
        System.loadLibrary(RXTXPARALLEL);
    } catch (UnsatisfiedLinkError e) {
        loadFromJar();
    }
}

public static void main(String[] args) {
    // RS232 is this class
    RS232 main = new RS232();
    main.connect("COM15");
}

private static void loadFromJar() {
    String path = "AC_" + new Date().getTime();
    loadLib(path, RXTXPARALLEL);
    loadLib(path, RXTXSERIAL);
}

private static void loadLib(String path, String name) {
    name = name + ".dll";
    try {
        InputStream in = ResourceLoader.load(LIB + name);
        File fileOut = new File(System.getProperty("java.io.tmpdir") + "/"
                + path + LIB + name);
        OutputStream out = FileUtils.openOutputStream(fileOut);
        IOUtils.copy(in, out);
        in.close();
        out.close();
        System.load(fileOut.getAbsolutePath());
    } catch (Exception e) {
        e.printStackTrace();
    }
}

private void connect(String portName) {
    CommPortIdentifier portIdentifier;
    try {
        // Here the exception is thrown
        portIdentifier = CommPortIdentifier.getPortIdentifier(portName);
    } catch (NoSuchPortException exc) {
        exc.printStackTrace();
        return;
    }
    // ... some other code
}
Is there a way to get an executable .jar file?
You have a few options. One is to copy the .dll files into the runtime folder and overwrite them at each start of your program. A second option is to copy the files into a fixed folder and add that folder's path to the environment variables in MS Windows; here, too, you can overwrite the files at each start.
Another possibility is to add the folder to java.library.path at runtime, as in the code below. But be careful with this solution; for more information read this post.
static {
    try {
        System.loadLibrary(RXTXSERIAL);
        System.loadLibrary(RXTXPARALLEL);
    } catch (UnsatisfiedLinkError exc) {
        initLibStructure();
    }
}

private static void initLibStructure() {
    try {
        // runtime path
        String runPath = new File(".").getCanonicalPath();
        // create the 'lib' folder
        File dir = new File(runPath + "/" + LIB);
        dir.mkdir();
        // get the current library path and add the path of the 'lib' folder
        String currentLibPath = System.getProperty("java.library.path");
        System.setProperty("java.library.path",
                currentLibPath + ";" + dir.getAbsolutePath());
        // force the ClassLoader to re-read java.library.path
        Field fieldSysPath = ClassLoader.class
                .getDeclaredField("sys_paths");
        fieldSysPath.setAccessible(true);
        fieldSysPath.set(null, null);
        loadLib(runPath, RXTXPARALLEL);
        loadLib(runPath, RXTXSERIAL);
    } catch (Exception e) {
        e.printStackTrace();
    }
}

private static void loadLib(String path, String name) {
    name = name + ".dll";
    try {
        InputStream in = ResourceLoader.load(LIB + name);
        File fileOut = new File(path + "/" + LIB + name);
        OutputStream out = FileUtils.openOutputStream(fileOut);
        IOUtils.copy(in, out);
        in.close();
        out.close();
        System.load(fileOut.getAbsolutePath());
    } catch (Exception e) {
        e.printStackTrace();
    }
}

Read file from SVN over https using svnkit

The SVN server is accessible over https, and I need to read a file that is located there. I followed the snippet from the svnkit wiki (http://svn.svnkit.com/repos/svnkit/tags/1.3.5/doc/examples/src/org/tmatesoft/svn/examples/repository/DisplayFile.java), but my SVNNodeKind is NONE and as a result no file is read. Nevertheless, there are no exceptions during the connection, so I can assume that I connect to the SVN server correctly, but then something goes wrong.
Here is the code:
public class SVNRepoConnector {
    private String username = "user";
    private String password = "pwd";
    private String baseUrl = "https://mysvnserver.com/svn/project/trunk";
    private String filePath = "/myproject/src/main/webapp/file.html";

    public void downloadSchema() {
        DAVRepositoryFactory.setup();
        SVNRepository repository = null;
        try {
            repository = SVNRepositoryFactory.create(SVNURL.parseURIEncoded(baseUrl));
            ISVNAuthenticationManager authManager = SVNWCUtil.createDefaultAuthenticationManager(username, password);
            repository.setAuthenticationManager(authManager);
            SVNNodeKind nodeKind = repository.checkPath(filePath, -1);
            if (nodeKind == SVNNodeKind.NONE) {
                System.err.println("There is file at: " + baseUrl + filePath);
                System.exit(1);
            } else if (nodeKind == SVNNodeKind.DIR) {
                System.err.println("The entry at " + baseUrl + filePath + " is a directory while a file was expected.");
                System.exit(1);
            }
            SVNProperties properties = new SVNProperties();
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            repository.getFile(filePath, -1, properties, out);
            System.out.println("Content:\n");
            try {
                out.writeTo(System.out);
            } catch (IOException e) {
                e.printStackTrace();
            }
        } catch (SVNException e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        SVNRepoConnector connector = new SVNRepoConnector();
        connector.downloadSchema();
    }
}
I receive "There is file at..." due to SVNNodeKind equals NONE. I cannot understand what is wrong here. How to read file from SVN over https?
Btw, my svnkit is 1.8.5.
Specify a relative path (unless baseUrl is the repository root):
private String filePath = "myproject/src/main/webapp/file.html";
instead of
private String filePath = "/myproject/src/main/webapp/file.html";
I found the solution after thoroughly debugging the sources.
In short, the problem is in the arguments passed to repository.checkPath(filePath, -1); and repository.getFile(filePath, -1, properties, out);: filePath must be just the file name, and the path to it must be part of the baseUrl field. After these changes everything started working correctly.
Regarding the snippet, in the case of www/license.html one should pass 1 as the second argument.
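Applied to the snippet above, that would mean something along these lines (assuming the file really lives under that trunk path):

private String baseUrl = "https://mysvnserver.com/svn/project/trunk/myproject/src/main/webapp";
private String filePath = "file.html";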

How a JAR file can read an external properties file

We have a connection pooling component (JAR file) for one of our applications. As of now the application connection details are bundled within the JAR file (in a .properties file).
Can we make it more generic? Can we have the client tell the properties file details (both the path and the file name) and use the JAR to get the connection?
Does it make sense to have something like this in the client code:
XyzConnection con = connectionIF.getConnection(uname, pwd);
Along with this, the client will specify (somehow???) the properties file details that have the URLs to connect, timeouts, etc.
The simplest way is to use the -D switch to define a system property on the java command line.
That system property may contain the path to your properties file.
E.g.:
java -cp ... -Dmy.app.properties=/path/to/my.app.properties my.package.App
Then, in your code you can do (exception handling is not shown for brevity):
String propPath = System.getProperty( "my.app.properties" );
final Properties myProps = new Properties();
if ( propPath != null )
{
    final FileInputStream in = new FileInputStream( propPath );
    try
    {
        myProps.load( in );
    }
    finally
    {
        in.close( );
    }
}
else
{
    // Do defaults initialization here or throw an exception telling
    // that the environment is not set
    ...
}
http://www.javaworld.com/javaworld/javaqa/2003-08/01-qa-0808-property.html
Multiple approaches are available; the article above provides more details:
ClassLoader.getResourceAsStream ("some/pkg/resource.properties");
Class.getResourceAsStream ("/some/pkg/resource.properties");
ResourceBundle.getBundle ("some.pkg.resource");
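A rough illustration of how the three lookups differ (MyApp and the resource path are just placeholders; the resource is assumed to be bundled at some/pkg/resource.properties on the classpath):

// ClassLoader lookup: no leading slash, path is relative to the classpath root
InputStream a = MyApp.class.getClassLoader().getResourceAsStream("some/pkg/resource.properties");
// Class lookup: a leading slash makes the path absolute on the classpath
InputStream b = MyApp.class.getResourceAsStream("/some/pkg/resource.properties");
// ResourceBundle: dot-separated base name, without the .properties extension
ResourceBundle bundle = ResourceBundle.getBundle("some.pkg.resource");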
Just load the properties from a file, something like:
Properties properties = new Properties();
InputStreamReader in = null;
try {
    in = new InputStreamReader(new FileInputStream("propertiesfilepathandname"), "UTF-8");
    properties.load(in);
} finally {
    if (null != in) {
        try {
            in.close();
        } catch (IOException ex) {}
    }
}
Note how the encoding is explicitly specified as UTF-8 above. It could also be left out if you accept the default ISO 8859-1 encoding, but beware of any special characters then.
This is my solution. It first looks for app.properties in the startup folder; if that does not exist, it tries to load the file from the JAR package:
Properties properties = new Properties();
File external = new File("app.properties");
if (external.exists())
    properties.load(new FileInputStream(external));
else
    properties.load(Main.class.getClassLoader().getResourceAsStream("app.properties"));
The simplest way is below. It will load application.properties from a cfg folder outside the jar file.
Directory structure:
|- cfg (folder) --> application.properties
|- somerunnable.jar
Code:
Properties mainProperties = new Properties();
mainProperties.load(new FileInputStream("./cfg/application.properties"));
System.out.println(mainProperties.getProperty("error.message"));
In NetBeans I needed to load application.properties from a conf/ folder outside the jar file.
Therefore I wrote:
public static String getProperty(String fileName, String prop)
{
    String value = null;
    try {
        FileInputStream fis = new FileInputStream("./../conf/" + fileName);
        Properties properties = new Properties();
        properties.load(fis);
        value = properties.getProperty(prop);
        if (value == null) {
            throw new Exception("Property " + prop + " was not found in the configuration file");
        }
        value = value.trim();
    } catch (IOException e) {
        e.printStackTrace();
    } catch (Exception e) {
        e.printStackTrace();
    }
    return value;
}
For Eclipse, apply the following instead:
FileInputStream fis = new FileInputStream("../conf/" + fileName);
public static String getPropertiesValue(String propValue) {
    Properties props = new Properties();
    InputStream fileType = PCLLoaderLQIOrder.class.getClassLoader().getResourceAsStream(propFileName);
    if (fileType != null) {
        try {
            props.load(fileType);
        } catch (IOException e) {
            logger.error(e);
        }
    } else {
        try {
            throw new FileNotFoundException("Property file " + propFileName + " not found in the class path");
        } catch (FileNotFoundException e) {
            logger.error(e);
        }
    }
    String propertiesValue = props.getProperty(propValue);
    return propertiesValue;
}
The above method works for me: just store your properties file in the directory from which you run your jar and provide its name in place of propFileName. When you want any value from the properties, just call getPropertiesValue("name").
