Log changes in XMLConfiguration file - java

I have a configuration file (an XML file) which I load using XMLConfiguration.
I need to make sure that this XMLConfiguration instance is refreshed every 30 seconds.
For that purpose I have the following code:
XMLConfiguration configuration = new XMLConfiguration(configFile);
configuration.setAutoSave(true);
FileChangedReloadingStrategy strategy = new FileChangedReloadingStrategy();
strategy.setRefreshDelay(getRefreshDelay());
configuration.setReloadingStrategy(strategy);
It works great, but the thing is I want to log any changes in this XML file.
Is there a way of doing it?

I got it!
All I need to do is this:
ConfigurationListener listener = new ConfigurationListener() {
    @Override
    public void configurationChanged(ConfigurationEvent event) {
        // Each change fires a before-update and an after-update event; only log the latter
        if (!event.isBeforeUpdate()) {
            System.out.println(event.getPropertyName() + " " + event.getPropertyValue());
        }
    }
};
configuration.addConfigurationListener(listener);
It works!

Related

Set the text of a Bukkit config file by default (PaperMc 1.15)

I created a Locale setting (Italian, English, ...) and need to know how to ship a predefined default config. I have obviously tried the way every good person does it, setting every entry in code, but I think it is too inefficient, and I also tried to create the files through the IDE in the same location where the files in the DataFolder are created at onEnable, but that didn't work. What I tried that turned out to be ineffective is this: customConfig.set("Hi-Message", "I'm sorry, i love you");
The way I'm doing it right now is simply keeping the config file in the source tree itself and, if the on-disk version of the config does not exist yet, creating a new config file from the copy bundled in the compiled jar.
In my onEnable method in Main, I simply call a method in another class FileManager.setup().
It looks a bit like this in setup():
public static void setup() throws IOException {
    File plugin_work_directory = new File(plugin_work_path);
    core_server_config = new File(plugin_work_path + "config.txt");
    if (!plugin_work_directory.exists()) plugin_work_directory.mkdir();

    // First run: copy the default config bundled in the jar into the plugin's work directory
    if (!core_server_config.exists()) {
        InputStream core_server_config_template = Main.class.getResourceAsStream("/config.txt");
        Files.copy(core_server_config_template, Paths.get(plugin_work_path + "config.txt"));
    }
    Config.load();

    // Outdated config on disk: replace it with the bundled default and reload
    if (Integer.parseInt(Config.getValue("config.version")) < Config.version) {
        core_server_config.delete();
        InputStream core_server_config_template = Main.class.getResourceAsStream("/config.txt");
        Files.copy(core_server_config_template, Paths.get(plugin_work_path + "config.txt"));
    }
    Config.load();
}
Config.load() parses the values into a private HashMap in the Config class, and other classes can look values up in that map through a String getValue(String string) method.
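For reference, a rough sketch of what that Config class could look like; the key=value file format, the plugin_work_path constant and the version field are assumptions for illustration, not the poster's actual code:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.HashMap;
import java.util.Map;

public class Config {

    // Hypothetical: mirrors the plugin_work_path used in FileManager.setup()
    private static final String plugin_work_path = "plugins/MyPlugin/";

    // Config version compiled into the plugin, compared against config.version from the file
    public static int version = 1;

    private static final Map<String, String> values = new HashMap<>();

    // Reads config.txt line by line and stores key=value pairs
    public static void load() throws IOException {
        values.clear();
        for (String line : Files.readAllLines(Paths.get(plugin_work_path + "config.txt"))) {
            int idx = line.indexOf('=');
            if (idx > 0) {
                values.put(line.substring(0, idx).trim(), line.substring(idx + 1).trim());
            }
        }
    }

    public static String getValue(String string) {
        return values.get(string);
    }
}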

Spring Batch creating multiple files. Gradle-based project

I need to create 3 separate files.
My batch job should read from Mongo, parse through the information and find the "business" column (3 types of business: RETAIL, HPP, SAX), then create a file for each respective business. The file name should be either RETAIL + formattedDate, HPP + formattedDate or SAX + formattedDate, with the information found in the DB written into a txt file. Also, I need to change .resource(new FileSystemResource("C:\\filewriter\\index.txt")) into something that will send the information to the right location; right now hard coding works but only creates one .txt file.
example:
@Bean
public FlatFileItemWriter<PaymentAudit> writer() {
    LOG.debug("Mongo-writer");
    FlatFileItemWriter<PaymentAudit> flatFile = new FlatFileItemWriterBuilder<PaymentAudit>()
            .name("flatFileItemWriter")
            // trying to create a path instead of hard coding it
            .resource(new FileSystemResource("C:\\filewriter\\index.txt"))
            .lineAggregator(createPaymentPortalLineAggregator())
            .build();
    String exportFileHeader = "CREATE_DTTM";
    StringHeaderWriter headerWriter = new StringHeaderWriter(exportFileHeader);
    flatFile.setHeaderCallback(headerWriter);
    return flatFile;
}
My idea would be something like this, but I'm not sure where to go from here:
public Map<String, List<PaymentAudit>> getPaymentPortalRecords() {
    List<PaymentAudit> recentlyCreated =
            PaymentPortalRepository.findByCreateDttmBetween(yesterdayMidnight, yesterdayEndOfDay);
    List<PaymentAudit> retailList = new ArrayList<>();
    List<PaymentAudit> saxList = new ArrayList<>();
    List<PaymentAudit> hppList = new ArrayList<>();
    // String exportFilePath = "C://filewriter/";??????

    // A sequential stream is used here because ArrayList is not thread-safe under parallelStream()
    recentlyCreated.stream().forEach(paymentAudit -> {
        if (paymentAudit.getBusiness().equalsIgnoreCase(RETAIL)) {
            retailList.add(paymentAudit);
        } else if (paymentAudit.getBusiness().equalsIgnoreCase(SAX)) {
            saxList.add(paymentAudit);
        } else if (paymentAudit.getBusiness().equalsIgnoreCase(HPP)) {
            hppList.add(paymentAudit);
        }
    });

    Map<String, List<PaymentAudit>> byBusiness = new HashMap<>();
    byBusiness.put(RETAIL, retailList);
    byBusiness.put(SAX, saxList);
    byBusiness.put(HPP, hppList);
    return byBusiness;
}
To create a file for each business object type, you can use the ClassifierCompositeItemWriter. In your case, you can create a writer for each type and add them as delegates in the composite item writer.
As for creating the filename dynamically, you need to use a step-scoped writer. There is an example in the Step Scope section of the reference documentation.
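A minimal sketch of that idea, not a drop-in implementation: one FlatFileItemWriter per business type, routed through a ClassifierCompositeItemWriter. The writerFor(...) helper, the date pattern and the output directory are assumptions for illustration:

@Bean
public ClassifierCompositeItemWriter<PaymentAudit> classifierWriter() {
    ClassifierCompositeItemWriter<PaymentAudit> writer = new ClassifierCompositeItemWriter<>();
    // Route each item to the delegate writer that matches its business type
    writer.setClassifier(paymentAudit -> {
        switch (paymentAudit.getBusiness().toUpperCase()) {
            case "RETAIL": return retailWriter();
            case "SAX": return saxWriter();
            default: return hppWriter();
        }
    });
    return writer;
}

@Bean
public FlatFileItemWriter<PaymentAudit> retailWriter() { return writerFor("RETAIL"); }

@Bean
public FlatFileItemWriter<PaymentAudit> saxWriter() { return writerFor("SAX"); }

@Bean
public FlatFileItemWriter<PaymentAudit> hppWriter() { return writerFor("HPP"); }

// Hypothetical helper: builds one writer per business, with the date in the file name
private FlatFileItemWriter<PaymentAudit> writerFor(String business) {
    String formattedDate = LocalDate.now().format(DateTimeFormatter.ofPattern("yyyyMMdd"));
    return new FlatFileItemWriterBuilder<PaymentAudit>()
            .name(business + "Writer")
            .resource(new FileSystemResource("C:\\filewriter\\" + business + formattedDate + ".txt"))
            .headerCallback(new StringHeaderWriter("CREATE_DTTM"))
            .lineAggregator(createPaymentPortalLineAggregator())
            .build();
}

One gotcha with this setup: delegates that are known only to the classifier are not opened and closed automatically, so register each delegate writer as a stream on the step (the step builder's .stream(...) method) or make them step-scoped as mentioned above.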
Hope this helps.

How to put multiple assets in a data event for Android Development

I am trying to make an app that sends files from my Android Watch to my Android Phone.
The problem I have is that if I record and save multiple files and send all of them at the same time, I do not get all the files back on the phone side. I only receive one file.
The code for sending the files is as follows. This code is implemented on the Watch side:
public void sendData(View v) {
    String fname = "_Activity.bin";
    int FileCounterCopy = FileCounter;
    if (mGoogleApiClient.isConnected()) {
        for (int i = 0; i < FileCounterCopy; i++) {
            String FileName = String.valueOf(i) + fname;
            File dataFile = new File(Environment.getExternalStorageDirectory(), FileName);
            Log.i("Path", Environment.getExternalStorageDirectory().toString());
            Log.i("file", dataFile.toString());
            Asset dataAsset = createAssetfromBin(dataFile);
            sensorData = PutDataMapRequest.create(SENSOR_DATA_PATH);
            sensorData.getDataMap().putAsset("File", dataAsset);
            PutDataRequest request = sensorData.asPutDataRequest();
            Wearable.DataApi.putDataItem(mGoogleApiClient, request).setResultCallback(new ResultCallback<DataApi.DataItemResult>() {
                @Override
                public void onResult(DataApi.DataItemResult dataItemResult) {
                    Log.e("SENDING IMAGE WAS SUCCESSFUL: ", String.valueOf(dataItemResult.getStatus().isSuccess()));
                }
            });
            boolean deleted = dataFile.delete();
            Log.i("Deleted", String.valueOf(deleted));
            FileCounter--;
        }
        mTextView.setText(String.valueOf(FileCounter));
        Return();
    } else {
        Log.d("Not", "Connecteddddddddd");
    }
}
The code for receiving the files is as follows and is implemented on the phone side.
@Override
public void onDataChanged(DataEventBuffer dataEvents) {
    Counter++;
    final List<DataEvent> events = FreezableUtils.freezeIterable(dataEvents);
    dataEvents.close();
    Log.e("List Size: ", String.valueOf(events.size()));
    for (DataEvent event : events) {
        if (event.getType() == DataEvent.TYPE_CHANGED) {
            Log.v("Data is changed", "========================");
            String path = event.getDataItem().getUri().getPath();
            if (SENSOR_DATA_PATH.equals(path)) {
                DataMapItem dataMapItem = DataMapItem.fromDataItem(event.getDataItem());
                fileAsset = dataMapItem.getDataMap().getAsset("File");
                myRunnable = createRunnable();
                if (checkSelfPermission(Manifest.permission.WRITE_EXTERNAL_STORAGE) == PackageManager.PERMISSION_GRANTED)
                    new Thread(myRunnable).start();
            }
        }
    }
    status.setText("Received" + " File_" + String.valueOf(Counter));
}
Right before the for loop, I check the size of the event list and it only shows a size of 1, no matter how many files I save.
I am stuck on how to implement this (tbh I used code from a YouTube video/online resources, so I am not 100% sure how some of the API works).
Thanks in advance!
You're putting all of the files at the same path, with nothing to differentiate them - so each one you put in overwrites the previous ones. The Data API works much like a filesystem in this regard.
In your sendData method, you need code something like this:
sensorData = PutDataMapRequest.create(SENSOR_DATA_PATH + '/' + dataFile.toString());
And then in onDataChanged, either only check the path prefix...
if (path.startsWith(SENSOR_DATA_PATH)) {
...or, preferably, put the value of SENSOR_DATA_PATH in your manifest declaration as an android:pathPrefix element in the intent-filter of your data receiver. You can then remove the path check from your Java code completely. Docs for that are here: https://developers.google.com/android/reference/com/google/android/gms/wearable/WearableListenerService
One other thing: it's good practice to clear stuff like these files out of the Data API when you're done using them, so that they're not taking up space there.
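A minimal sketch of that cleanup, assuming a connected GoogleApiClient (mGoogleApiClient) is also available on the phone side and reusing the per-file URI of the received item; run it once the asset has been saved:

Wearable.DataApi.deleteDataItems(mGoogleApiClient, event.getDataItem().getUri())
        .setResultCallback(new ResultCallback<DataApi.DeleteDataItemsResult>() {
            @Override
            public void onResult(DataApi.DeleteDataItemsResult result) {
                // Number of data items removed from the Data API for this URI
                Log.d("DataApi", "Deleted " + result.getNumDeleted() + " item(s)");
            }
        });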

Scriptella executor with XML as string instead of file

I am trying to use Scriptella in my project to copy data from one DB to another. The application has a frontend which users can use to create mappings between tables and create dynamic queries. Currently, once the user submits, the frontend queries are passed via a query engine and a Scriptella XML is created using a FreeMarker template.
However, to execute the XML the executor expects a file instead of an XML string. Currently I achieve this by creating an XML file in the temp directory and deleting it after the query executes. Is there any way I can skip the file creation and execute the query from an XML string?
You can create a custom URLStreamHandler that will serve streams directly from memory. This is similar to what was done in AbstractTestCase. It can be registered by calling URL.setURLStreamHandlerFactory. See Registering and using a custom java.net.URL protocol or Is it possible to create an URL pointing to an in-memory object?
After that, use
EtlExecutor.newExecutor(java.net.URL) with the new URI, e.g. new URL("memory://file")
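A rough, purely illustrative sketch of that approach; the "memory" protocol name and the runFromString wrapper are assumptions, the snippet assumes the usual java.net/java.io imports plus scriptella.execution.EtlExecutor, and URL.setURLStreamHandlerFactory can only be called once per JVM:

// Serves the in-memory ETL XML whenever a "memory:" URL is opened, then executes it
public static void runFromString(final String etlXml) throws Exception {
    URL.setURLStreamHandlerFactory(protocol -> "memory".equals(protocol) ? new URLStreamHandler() {
        @Override
        protected URLConnection openConnection(URL u) {
            return new URLConnection(u) {
                @Override public void connect() { }
                @Override public InputStream getInputStream() {
                    return new ByteArrayInputStream(etlXml.getBytes(StandardCharsets.UTF_8));
                }
            };
        }
    } : null);

    // The executor now reads the ETL definition straight from memory
    EtlExecutor.newExecutor(new URL("memory://etl")).execute();
}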
I had a similar use case. I downloaded the code and made a small change in the core; due to some private functions I had no choice.
In the class scriptella.configuration.ConfigurationFactory I added the following function:
public ConfigurationEl createConfigurationFromTxt(String xml, final ParametersCallback externalParameters) {
    try {
        // DBF, ETL_ENTITY_RESOLVER, ETL_ERROR_HANDLER and resourceURL are existing members of ConfigurationFactory
        DocumentBuilder db = DBF.newDocumentBuilder();
        db.setEntityResolver(ETL_ENTITY_RESOLVER);
        db.setErrorHandler(ETL_ERROR_HANDLER);
        final InputStream in = new ByteArrayInputStream(xml.getBytes());
        final Document document = db.parse(in);
        HierarchicalParametersCallback params = new HierarchicalParametersCallback(
                externalParameters == null ? NullParametersCallback.INSTANCE : externalParameters, null);
        PropertiesSubstitutor ps = new PropertiesSubstitutor(params);
        return new ConfigurationEl(new XmlElement(
                document.getDocumentElement(), resourceURL, ps), params);
    } catch (IOException e) {
        throw new ConfigurationException("Unable to load document: " + e, e);
    } catch (Exception e) {
        throw new ConfigurationException("Unable to parse document: " + e, e);
    }
}
Then from my code I can do something like this:
ConfigurationFactory cf = new ConfigurationFactory();
ConfigurationEl conf = cf.createConfigurationFromTxt(FETCH_ETLS, p);
EtlExecutor exec = new EtlExecutor(conf);

changing log4j file name programmatically in osgi maven bundle not working

I'm developing a maven-osgi bundle and deploying it in Karaf. In it, a piece of code should get .cfg files from karaf/etc, and I'm programmatically changing them at runtime. writeTrace() is invoked within a for loop from another class, so that I can create different files and the corresponding logging should go into each file.
public void writeLog(int i, String HostName) {
    StringBuilder sb = new StringBuilder();
    sb.append("\n HEADER : \n");
    ....
    String str = sb.toString();
    String logfile = "/home/Dev/" + HostName + i;
    logger = LoggerFactory.getLogger("TracerLog");
    updateLog4jConfiguration(logfile);
    logger.error(str + i);
}
public void updateLog4jConfiguration(String logFile) {
    Properties props = new Properties();
    try {
        // InputStream configStream = getClass().getResourceAsStream(
        //         "/home/Temp-files/NumberGenerator/src/main/java/log4j.properties");
        InputStream configStream = new FileInputStream("etc/org.ops4j.pax.logging.cfg");
        props.load(configStream);
        System.out.println(configStream);
        configStream.close();
    } catch (IOException e) {
        System.out.println("Error: Cannot load configuration file ");
    }
    props.setProperty("log4j.appender.Tracer.File", logFile);
    LogManager.resetConfiguration();
    PropertyConfigurator.configure(props);
}
I am able to see new files created with the hostname (hostname_1, hostname_2, etc.), but logging happens only in the actual appender configured in karaf/etc, that is, log.txt:
log4j.logger.TracerLog=TRACE,Tracer
log4j.appender.Tracer=org.apache.log4j.RollingFileAppender
log4j.appender.Tracer.MaxBackupIndex=10
log4j.appender.Tracer.MaxFileSize=500KB
log4j.appender.Tracer.File=/home/Dev/log.txt
I am stuck on this error. I don't know whether it has something to do with Karaf or whether it's a problem with my code.
Why aren't you just using the ConfigurationAdminService for this, instead of altering the file?
Just reference the configuration admin service from the registry and take the configuration with the PID org.ops4j.pax.logging.
With this approach you will have all configuration properties at your disposal, and it is up to your code to alter them. It's also possible for you to add new configuration entries. In the end, the combination of the ConfigurationAdminService and the Felix FileInstaller will even persist your changes back to the configuration file.
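A minimal sketch of that approach, assuming a BundleContext (bundleContext) is at hand and using the property key and logFile variable from the question; getConfiguration and update throw IOException, so this belongs in a method that handles or declares it:

// Look up the ConfigurationAdmin service from the OSGi service registry
ServiceReference<ConfigurationAdmin> ref = bundleContext.getServiceReference(ConfigurationAdmin.class);
ConfigurationAdmin configAdmin = bundleContext.getService(ref);

// Fetch the pax-logging configuration by its PID and change the appender's file
Configuration config = configAdmin.getConfiguration("org.ops4j.pax.logging", null);
Dictionary<String, Object> props = config.getProperties();
if (props == null) {
    props = new Hashtable<>();
}
props.put("log4j.appender.Tracer.File", logFile);
config.update(props);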
Btw. did you know that there is a shell command for configuring configurations, so actually also to alter the configuration for the org.ops4j.pax.logging service?
Just do a:
config:list
to retrieve all configurations available
and a
config:list "(service.pid=org.ops4j.pax.logging)"
to retrieve just this information.
