Using spring-integration-sftp: 5.1.3
I have an inbound setup with RotatingServerAdvice that is watching two directories:
--inbox
-----dir1
-----dir2
Watching: inbox/dir1 and inbox/dir2
Local directory: temp/sftp
When I put a file in inbox/dir1/file1.txt, my handler is called as expected and file1.txt is copied to temp/sftp/file1.txt (<- this is the problem, details below)
Problem:
My use case is that when I get the file, I want to know which remote subdirectory it came from. If the local file were transferred to temp/sftp/inbox/dir1/file1.txt, I could tell that it came from /inbox/dir1 and could perform some operations on that directory on the remote SFTP server.
Why is the file being transferred flat instead of into subdirectories under the local directory? Any help would be appreciated.
Inbound Adapter:
@Bean
public IntegrationFlow sftpInboundsFlow(DelegatingSessionFactory<LsEntry> delegatingSessionFactory,
        SftpServiceActivator serviceActivator) throws JSchException {
    SftpConnection config = sftpConnectionConfig.getSftpConnectionConfig(delegatingSessionFactory);
    return IntegrationFlows
            .from(Sftp.inboundAdapter(delegatingSessionFactory)
                            .preserveTimestamp(true)
                            .autoCreateLocalDirectory(true)
                            .remoteDirectory("*")
                            .filter(new CopartFileFilter(Pattern.compile(".*")))
                            .localDirectory(new File(System.getProperty("java.io.tmpdir") + "/" + config.getLocalDirectory())),
                    e -> e.id("inboundSftpChannel")
                            .autoStartup(true)
                            .poller(Pollers
                                    .fixedDelay(config.getPollerInterval())
                                    .advice(advice(delegatingSessionFactory))
                                    .maxMessagesPerPoll(1)))
            .handle(m -> serviceActivator.handleMessage(m))
            .get();
}
File info on handler:
file: file1.txt, parent: /var/folders/sd/5k6jwzgs2hj2q165b6x9pql55mp8t7/T/sftp, headers {file_originalFile=/var/folders/sd/5k6jwzgs2hj2q165b6x9pql55mp8t7/T/sftp/file1.txt, id=d2620539-ab0d-2590-9b51-f4dfb442a74a, file_name=file1.txt, file_relativePath=file1.txt, timestamp=1581371879819}
Try 1:
I think this is similar to the first approach mentioned.
.localFilenameExpression("#remoteDirectory + '/' + #this")
It correctly puts the file under temp/sftp/inbox/dir1/file1.txt. The problem now is that the message I get is for this directory:
temp/sftp/inbox
not file1.txt itself.
Something like this should work...
Subclass the StandardRotationPolicy, overriding
@Override
public void beforeReceive(MessageSource<?> source) {
    ...
}
Call super.beforeReceive(source) then cast the source to an AbstractInboundFileSynchronizingMessageSource<?>.
Then call getCurrent() to get the current KeyDirectory.
Then abstractInboundFileSynchronizingMessageSource.setLocalDirectory(new File(...));
Add your custom implementation to the advice.
However, you will have to use a RecursiveDirectoryScanner in the message source so the tree will be scanned.
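Putting those steps together, here is a minimal sketch (assumptions: the 5.1.x layout, where StandardRotationPolicy and KeyDirectory are nested in RotatingServerAdvice and the constructor takes the session factory, key directories, and a fair flag; check getCurrent()'s visibility and the exact signatures against your version):
import java.io.File;
import java.util.List;
import org.springframework.integration.core.MessageSource;
import org.springframework.integration.file.remote.aop.RotatingServerAdvice.KeyDirectory;
import org.springframework.integration.file.remote.aop.RotatingServerAdvice.StandardRotationPolicy;
import org.springframework.integration.file.remote.session.DelegatingSessionFactory;
import org.springframework.integration.file.remote.synchronizer.AbstractInboundFileSynchronizingMessageSource;

// Mirrors the current remote directory under a fixed local root before each poll.
public class DirectoryMirroringRotationPolicy extends StandardRotationPolicy {

    private final File localRoot; // e.g. temp/sftp

    public DirectoryMirroringRotationPolicy(DelegatingSessionFactory<?> factory,
            List<KeyDirectory> keyDirectories, File localRoot) {
        super(factory, keyDirectories, false); // false = not "fair" rotation
        this.localRoot = localRoot;
    }

    @Override
    public void beforeReceive(MessageSource<?> source) {
        super.beforeReceive(source); // rotate to the next server/directory first
        KeyDirectory current = getCurrent();
        // Point the synchronizer at temp/sftp/<remote-dir> so the local tree
        // mirrors the remote one and the handler can recover the remote path.
        ((AbstractInboundFileSynchronizingMessageSource<?>) source)
                .setLocalDirectory(new File(this.localRoot, current.getDirectory()));
    }
}
Pass an instance to new RotatingServerAdvice(...) in the advice(...) method your poller uses, and remember the RecursiveDirectoryScanner noted above.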
Alternatively, use an outbound gateway configured to use the MGET command using alternate directories each time (or just inbox in recursive mode). The gateway can be configured to recreate the remote tree.
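For the gateway route, something along these lines (a sketch, not verified code: the "mgetRequests" channel and localRoot are assumptions; the #remoteDirectory variable in localDirectoryExpression is what recreates the remote tree locally, and a payload such as "inbox/*" drives a recursive fetch):
@Bean
public IntegrationFlow sftpMgetFlow(DelegatingSessionFactory<LsEntry> delegatingSessionFactory) {
    String localRoot = System.getProperty("java.io.tmpdir") + "/sftp"; // assumption
    return IntegrationFlows.from("mgetRequests")
            .handle(Sftp.outboundGateway(delegatingSessionFactory,
                        AbstractRemoteFileOutboundGateway.Command.MGET, "payload")
                    .options(AbstractRemoteFileOutboundGateway.Option.RECURSIVE)
                    // recreate the remote directory structure under localRoot
                    .localDirectoryExpression("'" + localRoot + "/' + #remoteDirectory"))
            .get();
}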
I had some trouble implementing the suggested solutions because I'm new to this, so I went with a different approach that worked fine for my requirement:
In the sftpInboundsFlow
.localFilenameExpression("#remoteDirectory.replaceAll('/', '-') + '_' + #this")
This places the files in a flat structure in the local temp directory.
temp/sftp:
inbox-dir1_file1.txt
Then in the handler:
Extract remote directory from file name:
public String extractRemoteDir(File ackFile) {
    String fileName = ackFile.getName();
    return fileName.split("_")[0].replaceAll("-", "/");
}
Custom logic on the remote dir
Delete ackFile
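For illustration, the naming scheme round-trips like this (hypothetical values; note it assumes the remote directory names contain no '-' and that the original file name carries no '_' before it):
// "inbox/dir1" -> "inbox-dir1_file1.txt" -> "inbox/dir1"
File ackFile = new File("temp/sftp/inbox-dir1_file1.txt");
String remoteDir = extractRemoteDir(ackFile); // "inbox/dir1"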
One thing that would make this less hacky would be to know the remote directory in the local file handler (maybe in the headers), but I didn't find a way to get it.
I am running a spring boot application with wiremock. My files structure is like this:
project/main/
- java/package/Wiremock.java
- resources/wiremock/__files/file.json
Inside Wiremock.java I am calling WireMockServer like this:
WireMockServer wiremockServer = new WireMockServer(WireMockConfiguration.wireMockConfig()
        .withRootDirectory(getClass().getResource("/wiremock").getPath())
        .port(port));
wiremockServer.start();
wiremockServer.stubFor(get(urlEqualTo("/myurl"))
        .willReturn(aResponse()
                .withBodyFile("file.json")
                .withHeader(CONTENT_TYPE, APPLICATION_JSON_VALUE)
                .withStatus(HTTP_OK)));
When I am running it locally it works as expected.
When I compile the app to a jar file, /Users/user/project-0.0.1-SNAPSHOT.jar is generated with the structure:
BOOT-INF/classes/
- wiremock/__files/file.json
- package/Wiremock.class
But when I run the jar file, I get the following error:
java.lang.RuntimeException: java.io.FileNotFoundException: /Users/user/jar:file:/Users/user/project-0.0.1-SNAPSHOT.jar!/BOOT-INF/classes!/wiremock/__files/file.json (No such file or directory)
Please help, thanks
Is this path correct?
/Users/user/jar:file:/Users/user/project-0.0.1-SNAPSHOT.jar!/BOOT-INF/classes!/wiremock/__files/file.json (No such file or directory)
I see there is an extra "!" after XXXX.jar and after classes.
I just met the same issue today: when I run WireMock in IDEA it works, but when I run the application with java -jar, the WireMock server cannot find the JSON mock file. The root cause is that, on initialization, the WireMock server locates the JSON files via the com.github.tomakehurst.wiremock.common.ClasspathFileSource class, which recursively adds the files under the configured path to a list. The logic for adding files is shown below:
public List<TextFile> listFilesRecursively() {
    if (this.isFileSystem()) {
        this.assertExistsAndIsDirectory();
        List<File> fileList = Lists.newArrayList();
        this.recursivelyAddFilesToList(this.rootDirectory, fileList);
        return this.toTextFileList(fileList);
    } else {
        return FluentIterable.from(toIterable(this.zipFile.entries())).filter(new Predicate<ZipEntry>() {
            public boolean apply(ZipEntry jarEntry) {
                return !jarEntry.isDirectory() && jarEntry.getName().startsWith(ClasspathFileSource.this.path);
            }
        }).transform(new Function<ZipEntry, TextFile>() {
            public TextFile apply(ZipEntry jarEntry) {
                return new TextFile(ClasspathFileSource.this.getUriFor(jarEntry));
            }
        }).toList();
    }
}
It recursively adds the files whose names start with the configured path. But when you run with java -jar, jarEntry.getName() starts with 'BOOT-INF', so nothing matches. One solution is to override the method in a subclass of ClasspathFileSource and modify the match rule; that fixes it.
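Building on that diagnosis, a workaround that avoids the subclass entirely (a sketch under assumptions: WireMockConfiguration.usingFilesUnderClasspath(...) wraps the path in a ClasspathFileSource, and Spring Boot fat jars nest resources under BOOT-INF/classes/): since the matching is a plain name-prefix check, passing the prefixed root makes the lookup succeed inside the jar.
// Detect whether we are running from a jar and adjust the classpath root.
boolean inFatJar = Wiremock.class.getResource("Wiremock.class")
        .toString().startsWith("jar:");
String filesRoot = inFatJar ? "BOOT-INF/classes/wiremock" : "wiremock";

WireMockServer wiremockServer = new WireMockServer(WireMockConfiguration.wireMockConfig()
        .usingFilesUnderClasspath(filesRoot)
        .port(port));
wiremockServer.start();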
I have a problem with one of my projects. Here is a little more info about it:
Our teacher gave us a virtual machine (ubuntu) which contains Hadoop and Hbase, already setup.
The objective is pretty simple: we have a REST API with Tomcat 8.5 (RestServer project, a web project) that intercepts GET requests (our teacher only wants us to use GET requests, apparently for security reasons), and we need to perform, according to the URL (for instance, /students/{id}/{program} will return the grade summary for a particular student (id) and year of study (program)), data selection and MapReduce jobs on HBase tables. We also have a BigData project, which contains simple Java code to scan and filter HBase tables. That is the short summary of the project.
Here is the structure we use for this project : project structure
And here is the execution logic: we type our URL in the browser after launching the RestServer project (right click on RestServer -> Run as -> Run on server).
Here is what we get after doing so: RestServer in the browser.
The easy part stops there. The links seen in the previous image are just for demo; they are not what we need to do in this project. The idea is to intercept the GET request from the API, get the parameters in the method handling the request, pass them to the constructor of our response object, and return that object as the response (which will be transformed into JSON). The response object is to be built by the BigData project, so we need to make these two projects communicate.
Here is the code to intercept the request :
@GET
@Path("/students/{id}/{program}")
@Produces(MediaType.APPLICATION_JSON)
public Response getStudent(@PathParam("id") String ID, @PathParam("program") String program) throws IOException {
    System.out.println("ID : " + ID + " program" + program);
    if (ID != null) {
        System.out.println("Non nul");
        return Response.ok(new Response1(ID, program), MediaType.APPLICATION_JSON).build();
    } else {
        return Response.status(Response.Status.NOT_FOUND).entity("Student not found: " + ID).build();
    }
}
The Response1(ID, program) object is built in the BigData project. When I execute the code from the BigData project directly (as a Java application), I have absolutely no problem, no error. But the idea is to use the code from the BigData project to build the Response1 object and "give it back" to the REST API. That is where the problem lies: I tried absolutely everything I know and found on the internet, but I can't resolve it. When I type my URL (http://localhost:8080/RestServer/v1/StudentService/students/2005000033/L3), I get this error: error
From my research, I found that (correct me if I'm wrong) the program can't find the ByteArrayComparable class at runtime. I looked at all the links I could find, and here is what I tried to resolve it:
Checked that the Hadoop and HBase libraries are in both projects.
Checked that both projects contain hbase-client, which is supposed to contain the ByteArrayComparable class (yes, it is in both projects).
By right-clicking on RestServer -> Properties -> Java Build Path:
Source tab: I added the src folder from the BigData project (and the bin folder, but I can't remember where; I believe it is in one of the tabs of Java Build Path).
Projects tab: I added the BigData project.
Order and Export tab: I checked the src folder (this folder is in the RestServer project, created after I added the src folder from the BigData project in the Source tab).
Deployment Assembly: I added the BigData project.
I copied the classes used in the BigData project into the src folder of my RestServer project.
I saw that it can be caused by conflicts between libraries, so I tried removing some in one project while leaving them in the other.
I cleaned and rebuilt the projects between each change.
I tried adding the import that seems to cause the problem (import org.apache.hadoop.hbase.filter.*;) to the files involved in the execution.
I have no idea what I can do now. Some of my friends have the same problem, even though we don't have the same code, so it seems the problem comes from the configuration. At this point, I haven't performed any MapReduce job; I'm just using the HBase Java API to scan the table with some filters.
Thanks for reading; I hope I'll find the answer. I'll keep testing and searching, and I'll edit this post if I find something.
Here is the code for the Response1 class :
package bdma.bigdata.project.rest.core;

import java.io.IOException;
import org.apache.hadoop.hbase.filter.Filter.*;

public class Response1 {
    private StudentBD student;
    private Semester semesters;

    public Response1(String id, String program) throws IOException {
        System.out.println("Building student");
        this.student = new StudentBD(id);
        System.out.println("Building semester");
        this.semesters = new Semester(id, program);
    }

    @Override
    public String toString() {
        return student.toString() + " " + semesters.toString();
    }

    public static void main(String[] args) throws IOException {
        Response1 r = new Response1("2005000100", "L1");
        System.out.println("AFFICHAGE TEST");
        System.out.println(r);
    }
}
Edit
I finally managed to resolve my problem. I'm putting the solution here in case it helps someone in the same situation in the future.
Once you've linked your two projects (in the Java Build Path section of the properties of the REST API project), you need to go, still in the properties, to Deployment Assembly (above Java Build Path). There, click Add... and add all of your jar files.
For a webservice client I'd like to use Implementation-Title and Implementation-Version from the jar file as user-agent string. The question is how to read the jar's manifest.
This question has been asked multiple times, however the answer seems not applicable for me. (e.g. Reading my own Jar's Manifest)
The problem is that simply reading /META-INF/MANIFEST.MF almost always gives wrong results. In my case, it would almost always refer to JBoss.
The solution proposed in https://stackoverflow.com/a/1273196/4222206
is problematic for me, as you'd have to hardcode the library name to stop the iteration, and even then two versions of the same library may be on the classpath and you'd just return the first (not necessarily the right) hit.
The solution in https://stackoverflow.com/a/1273432/4222206
seems to work with jar:// URLs only, which completely fails within JBoss, where the application classloader produces vfs:// URLs.
Is there a way for code in a class to find its own manifest?
I tried the above-mentioned approaches, which seem to run well in small applications started from the java command line, but I'd like a portable solution, as I cannot predict where my library will be used later.
public static Manifest getManifest() {
    log.debug("getManifest()");
    synchronized (Version.class) {
        if (manifest == null) {
            try {
                // this works wrongly in JBoss
                //ClassLoader cl = Version.class.getProtectionDomain().getClassLoader();
                //log.debug("found classloader={}", cl);
                //URL manifesturl = cl.getResource("/META-INF/MANIFEST.MF");
                URL jar = Version.class.getProtectionDomain().getCodeSource().getLocation();
                log.debug("Class loaded from {}", jar);
                URL manifesturl = null;
                switch (jar.getProtocol()) {
                case "file":
                    manifesturl = new URL(jar.toString() + "META-INF/MANIFEST.MF");
                    break;
                default:
                    manifesturl = new URL(jar.toString() + "!/META-INF/MANIFEST.MF");
                }
                log.debug("Expecting manifest at {}", manifesturl);
                manifest = new Manifest(manifesturl.openStream());
            }
            catch (Exception e) {
                log.info("Could not read version", e);
            }
        }
        return manifest;
    }
}
The code detects the correct jar path. I assumed that modifying the URL to point to the manifest would give the required result; however, I get this:
Class loaded from vfs:/C:/Users/user/Documents/JavaLibs/wildfly-18.0.0.Final/bin/content/webapp.war/WEB-INF/lib/library-1.0-18.jar
Expecting manifest at vfs:/C:/Users/user/Documents/JavaLibs/wildfly-18.0.0.Final/bin/content/webapp.war/WEB-INF/lib/library-1.0-18.jar!/META-INF/MANIFEST.MF
Could not read version: java.io.FileNotFoundException: C:\Users\hiran\Documents\JavaLibs\wildfly-18.0.0.Final\standalone\tmp\vfs\temp\tempfc75b13f07296e98\content-e4d5ca96cbe6b35e\WEB-INF\lib\library-1.0-18.jar!\META-INF\MANIFEST.MF (The system cannot find the path specified)
I checked that path, and it seems even the first URL to the jar (obtained via Version.class.getProtectionDomain().getCodeSource().getLocation()) was already wrong. It should have been C:\Users\user\Documents\JavaLibs\wildfly-18.0.0.Final\standalone\tmp\vfs\temp\tempfc75b13f07296e98\content-e4d5ca96cbe6b35e\WEB-INF\lib\library-1.0.18.jar.
So this could even point to a problem in WildFly?
It seems I found a suitable solution here:
https://stackoverflow.com/a/37325538/4222206
So in the end this code can display the correct version of the jar (at least) in JBoss:
this.getClass().getPackage().getImplementationTitle();
this.getClass().getPackage().getImplementationVersion();
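For the original user-agent use case, a small sketch on top of that (the fallback literals are assumptions for when the attributes are missing, e.g. when running unpackaged from an IDE):
// Build a User-Agent string from the package's manifest-backed metadata.
Package pkg = getClass().getPackage();
String title = pkg.getImplementationTitle() != null ? pkg.getImplementationTitle() : "my-client"; // fallback is an assumption
String version = pkg.getImplementationVersion() != null ? pkg.getImplementationVersion() : "dev"; // fallback is an assumption
String userAgent = title + "/" + version;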
Hopefully I will find this answer when I search next time...
I'm running Node JS with https://github.com/apigee/trireme from Java, inside the JVM. I have a directory that looks like the following:
node/
-test_file.js
-test_somemodule.js
-somemodule/
-somemodule/index.js
-somemodule/...
I have no problem running the test_file.js using this code:
#Test
public void shouldRunTestScript() {
try {
NodeEnvironment env = new NodeEnvironment();
// Pass in the script file name, a File pointing to the actual script, and an Object[] containg "argv"
NodeScript script = env.createScript("my-test-script.js",
new File(Settings.getInstance().getNodeDir() + "/my-test-script.js"), null);
// Wait for the script to complete
ScriptStatus status = script.execute().get();
// Check the exit code
assertTrue("Exit code was not 77.", status.getExitCode() == 77);
} catch (NodeException | InterruptedException | ExecutionException ex) {
Logger.getLogger(TriremeTest.class.getName()).log(Level.SEVERE, null, ex);
fail("Trireme triggered an exception: " + ex.getMessage());
}
}
In the file test_somemodule.js I require index.js:
require('somemodule/index.js');
When I try to run that file, it can't find the file in the require.
I have no knowledge of Node JS, so I'm not familiar with module loading. I already tried setting NODE_PATH, only to get:
Error: Cannot find module 'request'
It seems I can't obtain the NODE_PATH from Trireme, and if I overwrite it, Trireme fails to run. I'm out of ideas on how to get a Node JS module loaded in Trireme. Any help is appreciated.
Edit: I changed the require to ('./somemodule/index.js'), which works, so setting the NODE_PATH would have done the job too. I just found out the error came from a missing dependency:
"dependencies": {
"request": "^2.49.0",
"tough-cookie": "^0.12.1"
},
I figured out that the best way to deal with it is to install Node JS + npm and invoke npm install some_module in the node/ folder. It automatically downloads some_module and all of its dependencies into the node/ folder.
No more require errors.
I had not specified that the file was in the working directory.
require('./somemodule/index.js');
instead of
require('somemodule/index.js');
did the job. Another possibility is to set the NODE_PATH environment variable to the node/ folder, so you can require without ./.
I also figured out that the best way to obtain modules is to install them with npm instead of downloading them from git, because the latter does not download any dependencies.
I tried a lot of ways to clone a repo with JGit (that part works). Then I wrote some files into the repository and tried to add them all (a git add *, git add -A or something like it), but it doesn't work: the files are simply not added to the staging area.
My code is like this:
FileRepositoryBuilder builder = new FileRepositoryBuilder();
Repository repository = builder.setGitDir(new File(folder))
        .readEnvironment().findGitDir().setup().build();
CloneCommand clone = Git.cloneRepository();
clone.setBare(false).setCloneAllBranches(true);
clone.setDirectory(f).setURI("git@192.168.2.43:test.git");
try {
    clone.call();
} catch (GitAPIException e) {
    e.printStackTrace();
}
Files.write("testing it...", new File(folder + "/test2.txt"),
        Charsets.UTF_8);
Git g = new Git(repository);
g.add().addFilepattern("*").call();
What am I doing wrong?
Thanks.
Exception when trying the same with addFilepattern("."):
Exception in thread "main" org.eclipse.jgit.errors.NoWorkTreeException: Bare Repository has neither a working tree, nor an index
at org.eclipse.jgit.lib.Repository.getIndexFile(Repository.java:850)
at org.eclipse.jgit.dircache.DirCache.lock(DirCache.java:264)
at org.eclipse.jgit.lib.Repository.lockDirCache(Repository.java:906)
at org.eclipse.jgit.api.AddCommand.call(AddCommand.java:138)
at net.ciphersec.git.GitTests.main(GitTests.java:110)
One easy way to debug this is to look at the tests of the AddCommand in the JGit repo: AddCommandTest.java
You will see that in order to add all files the pattern "*" is never used, but "." is.
And it is used in the test function named... testAddWholeRepo()(!)
git.add().addFilepattern(".").call();
The Exception:
Exception in thread "main" org.eclipse.jgit.errors.NoWorkTreeException:
Bare Repository has neither a working tree, nor an index
is quite explicit: you need to add files in a non-bare repo.
See test method testCloneRepository() to compare with your own clone, and see if there is any difference.
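For comparison, a minimal working clone-then-add sequence might look like this (a sketch reusing the question's URI and folder; the key differences are reusing the Git handle returned by the clone and passing "." as the pattern):
// Clone into a working directory (non-bare) and reuse the returned Git handle
// instead of building a second Repository that points at the git dir.
try (Git git = Git.cloneRepository()
        .setURI("git@192.168.2.43:test.git")
        .setDirectory(new File(folder))
        .setBare(false)
        .call()) {
    Files.write("testing it...", new File(folder, "test2.txt"), Charsets.UTF_8);
    git.add().addFilepattern(".").call();
}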
I had a situation where I had to move a file f1 from the current directory to another directory called 'temp'. After moving the file, calling git.add().addFilepattern(".").call() acted in a weird way, since git status gave the following result:
Changes to be committed:
(use "git reset HEAD <file>..." to unstage)
new file: temp/f1.html
Changes not staged for commit:
(use "git add/rm <file>..." to update what will be committed)
(use "git checkout -- <file>..." to discard changes in working directory)
deleted: f1.html
It recognized that a new file temp/f1 was created but didn't detect that the original file was deleted first. This is perhaps because moving the file can be seen as:
Deleting/Cutting the file f1
Creating a folder called temp
Creating/Pasting the file f1
Then I came across setUpdate(true), which looks for updates to files that are already tracked and will not stage new files (check the Javadoc for more info).
So I had to change my code to the following two lines for git to recognize both added and modified (which includes deleted) files:
git.add().addFilepattern(".").call();
git.add().setUpdate(true).addFilepattern(".").call();
git status now gives the expected result:
renamed: f1.html -> temp/f1.html
It might be the wildcard. I just read the Javadoc for the add command; it looks like you pass the name of a directory to add its contents, not a wildcard:
addFilepattern
public AddCommand addFilepattern(String filepattern)
Parameters: filepattern - File to add content from. Also a leading directory
name (e.g. dir to add dir/file1 and dir/file2) can be given to add all
files in the directory, recursively. Fileglobs (e.g. *.c) are not yet
supported.
Just to note a problem I ran into: I was using File.separatorChar (which gives you either "/" or "\" depending on your OS) to build the path, but JGit uses only "/" and handles the separator itself. If you use separatorChar, it will not work on Windows.
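In other words (a tiny illustrative sketch; the paths are hypothetical):
// JGit file patterns always use '/', regardless of the OS:
git.add().addFilepattern("dir/file1.txt").call();                          // works everywhere
git.add().addFilepattern("dir" + File.separatorChar + "file1.txt").call(); // breaks on Windows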