How to read a resource file? (Google Cloud Dataflow) - Java

My Dataflow pipeline needs to read a resource file, GeoLite2-City.mmdb. I added it to my project and ran the pipeline, and I confirmed that the project package zip file exists in the staging bucket on GCS.
However, when I try to read the resource file GeoLite2-City.mmdb, I get a FileNotFoundException. How can I fix this? This is my code:
String path = myClass.class.getResource("/GeoLite2-City.mmdb").getPath();
File database = new File(path);
try
{
    DatabaseReader reader = new DatabaseReader.Builder(database).build(); // <- this line throws a FileNotFoundException
}
catch (IOException e)
{
    LOG.info(e.toString());
}
My project package zip file is "classes-WOdCPQCHjW-hRNtrfrnZMw.zip"
(it contains class files and GeoLite2-City.mmdb)
The path value is "file:/dataflow/packages/staging/classes-WOdCPQCHjW-hRNtrfrnZMw.zip!/GeoLite2-City.mmdb", however it cannot be opened.
And these are the options:
--runner=BlockingDataflowPipelineRunner
--project=peak-myproject
--stagingLocation=gs://mybucket/staging
--input=gs://mybucket_log/log.68599ca3.gz
The goal is to transform the log file on GCS and insert the transformed data into BigQuery.
When I ran it locally, the import into BigQuery succeeded.
I think resource paths are resolved differently on my local PC and on GCE.

I think the issue might be that DatabaseReader does not support paths to resources located inside a .zip or .jar file.
If that's the case, then your program worked with DirectPipelineRunner not because it runs directly, but because the resource happened to be located on the local filesystem rather than inside the .zip file (as your comment says, the path was C:/Users/Jennie/workspace/DataflowJavaSDK-master/eclipse/starter/target/classes/GeoLite2-City.mmdb, while in the other case it was file:/dataflow/packages/staging/classes-WOdCPQCHjW-hRNtrfrnZMw.zip!/GeoLite2-City.mmdb).
I searched the web for what DatabaseReader class you might be talking about, and seems like it is https://github.com/maxmind/GeoIP2-java/blob/master/src/main/java/com/maxmind/geoip2/DatabaseReader.java .
In that case, there's a good chance that your code will work with the following minor change:
try
{
    InputStream stream = myClass.class.getResourceAsStream("/GeoLite2-City.mmdb");
    DatabaseReader reader = new DatabaseReader.Builder(stream).build();
}
catch (IOException e)
{
    ...
}
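For completeness, here is a minimal self-contained sketch of the stream-based approach, assuming the MaxMind GeoIP2 DatabaseReader.Builder(InputStream) constructor is the one in use (MyClass is a placeholder for your own class):

import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStream;
import com.maxmind.geoip2.DatabaseReader;

private static DatabaseReader loadReader() throws IOException {
    // Load the .mmdb from the classpath; this works whether it sits on disk
    // or inside the staged .zip/.jar, because no File path is involved.
    try (InputStream stream = MyClass.class.getResourceAsStream("/GeoLite2-City.mmdb")) {
        if (stream == null) {
            throw new FileNotFoundException("GeoLite2-City.mmdb not found on the classpath");
        }
        // Builder(InputStream) reads the database fully into memory,
        // so the stream can be closed once build() returns.
        return new DatabaseReader.Builder(stream).build();
    }
}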

Related

Store file in Spring Boot resource folder after deployment

I have deployed a Spring Boot application as a JAR file. Now I want to upload an image from Android and store it in myfolder inside the resources directory, but I am unable to get the path of the resources directory.
The error is:
java.io.FileNotFoundException: src/main/resources/static/myfolder/myimage.png
(No such file or directory)
This is the code for storing the file in the resource folder
private final String RESOURCE_PATH = "src/main/resources";
String filepath = "/myfolder/";

public String saveFile(byte[] bytes, String filepath, String filename) throws MalformedURLException, IOException {
    File file = new File(RESOURCE_PATH + filepath + filename);
    OutputStream out = new FileOutputStream(file);
    try {
        out.write(bytes);
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        out.close();
    }
    return file.getName();
}
UPDATED:
This is what I have tried
private final String RESOURCE_PATH = "config/";
controller class:
String filepath = "myfolder/";
String filename = "newfile.png";

public String saveFile(byte[] bytes, String filepath, String filename) throws MalformedURLException, IOException {
    //reading old file
    System.out.println(Files.readAllBytes(Paths.get("config", "myfolder", "oldfile.png"))); //gives NoSuchFileException
    //writing new file
    File file = new File(RESOURCE_PATH + filepath + filename);
    OutputStream out = new FileOutputStream(file); //FileNotFoundException
    try {
        out.write(bytes);
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        out.close();
    }
    return file.getName();
}
Project structure:
+springdemo-0.0.1-application.zip
+config
+myfolder
-oldfile.png
-application.properties
+lib
+springdemo-0.0.1.jar
+start.sh
-springdemo-0.0.1.jar //running this jar file
Usually when you deploy an application (or start it using Java), you start a JAR file. You don't have a resources folder. You can have one and access it, too, but it certainly won't be src/main/resources.
When you build your final artifact (your application), a JAR (or EAR or WAR) file is created, and the resources you had in your src/main/resources folder are copied to the output directory and included in the final artifact. That folder simply does not exist when the application is run (assuming you are running it standalone).
During the build process, target/ is created and contains the classes, resources, test resources and the like (assuming you are building with Maven; it is a little different if you build using Gradle, Ant or by hand).
What you can do is create a folder, e.g. docs, next to your final artifact, give it the appropriate permissions (chmod/chown) and have your application write output files into that folder. This folder must also exist on the target machine running your artifact; if writing fails, either the folder is missing or the application lacks the permissions to read from / write to it.
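A minimal sketch of that approach (the docs folder and file names are just examples, not something your project already has):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public String saveFile(byte[] bytes, String filename) throws IOException {
    // External folder next to the running JAR (relative to the working directory).
    Path docs = Paths.get("docs");
    Files.createDirectories(docs);           // create the folder if it does not exist yet
    Path target = docs.resolve(filename);    // e.g. docs/newfile.png
    Files.write(target, bytes);              // creates or overwrites the file
    return target.getFileName().toString();
}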
If you need more details, don't hesitate to ask.
Update:
To access a resource, which is bundled and hence inside your artifact (e.g. final.jar), you should be able to retrieve it by using e.g. the following:
testText = new String(ControllerClass.class.getResourceAsStream("/test.txt").readAllBytes());
This assumes your test.txt file is right under src/main/resources and was bundled so that it sits directly in the root of your JAR file (or of the target folder your application is run from). ControllerClass is the controller that accesses the file. readAllBytes does exactly that: it reads all the bytes from the stream. For accessing images inside your artifact, you might want to use ImageIO.
If, however, you want to access an external file that is not bundled and hence not inside your artifact, you may use File image = new File(...) where ... would be something like "docs/image.png". This requires you to create a folder called docs next to your JAR artifact and put a file image.png inside of it.
You can of course also work with streams, and there are various helpful libraries for working with input and output streams.
ImageIO was meant for AWT, but it works if you really want to access the bytes of your image. In a controller you usually wouldn't do that, but rather let your users access (and thus download) the file from a given available folder.
I hope this helps :).

How to upload files in a dir to S3 when files are continuously being written to the dir

So I am uploading all files from a dir into S3 using TransferManager,
and I am able to upload them.
But my issue is that files are still being written into the same dir.
So how do I call that method to write into S3?
Do I have to call that method at a fixed interval?
Please suggest what could be the best way to call that method.
public void uploadDir(Path strFile, String strFileName) {
    ArrayList<File> files = new ArrayList<File>();
    for (Path path : strFile) {
        files.add(new File(path.toString()));
    }
    TransferManager xfer_mgr = TransferManagerBuilder.standard().build();
    try {
        MultipleFileUpload xfer = xfer_mgr.uploadFileList(bucketName, strFileName, new File("."), files);
        //XferMgrProgress.showTransferProgress(xfer);
        //XferMgrProgress.waitForCompletion(xfer);
    } catch (AmazonServiceException e) {
        System.err.println(e.getErrorMessage());
        System.exit(1);
    }
}
A couple of solutions; you could try either, based on your need.
Solution 1: For a scenario like yours, instead of a time interval, you should be using file age.
File age: when the file was last modified. This is a common concept used by file pollers, for both local and remote directories.
So if your files take at most, say, 20 seconds to be written, only pull files older than 20 seconds.
Solution 2:
The other way is to ask your clients (the program generating the files) to use some extension, say .tmp, while a file is being written, to rename it to the actual extension once writing is complete, and to modify your program to skip files with the .tmp extension.
E.g. write abc.jpg as abc.jpg.tmp, and when writing is complete, rename it to abc.jpg. A minimal filtering sketch combining both ideas follows.
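Here is a minimal sketch of such a filter (the method name, directory layout and the age threshold are just examples); pass the resulting list to uploadFileList as you do today:

import java.io.File;
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;

// Collect files that are old enough and not still being written (.tmp suffix).
public List<File> selectCompletedFiles(Path dir, long minAgeSeconds) throws IOException {
    long cutoff = Instant.now().toEpochMilli() - minAgeSeconds * 1000;
    List<File> ready = new ArrayList<>();
    try (DirectoryStream<Path> stream = Files.newDirectoryStream(dir)) {
        for (Path p : stream) {
            if (Files.isRegularFile(p)
                    && !p.getFileName().toString().endsWith(".tmp")          // skip in-progress files
                    && Files.getLastModifiedTime(p).toMillis() < cutoff) {   // skip recently modified files
                ready.add(p.toFile());
            }
        }
    }
    return ready;
}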
Hope this helps.

Jar classpath resource read failing when triggered from another executable jar

With reference to the link: How do I read a resource file from a Java jar file?
I am using your code base and trying to read the content of sample.csv, which resides in my project directory src/main/resources. I am unable to read the content; it says the file can not be read. Output:
[Can not read file: sample.csv]
//This is added within your while loop, after the check /* If it is a directory, then skip it. */
I mean, when the file is detected, the code snippet below is added to read the file content:
if (entry.getName().contains("sample.csv")) {
    File f1 = new File("sample.csv");
    if (f1.canRead()) {
        List<String> lines = Files.readAllLines(f1.toPath());
        System.out.println("Lines in file: " + lines.size());
    } else {
        System.out.println("Can not read file: " + entry.getName());
    }
}
Can anyone tell me what I am doing wrong here, and how can I make it work?
My requirement is this:
(My micro-service) Service.jar imports the Parser.jar library in its pom.xml.
(My library) Parser.jar has the FnmaUtils-3.2-fieldMapping.csv file in its src/main/resources directory.
There is a FnmaUtils class that loads FnmaUtils-3.2-fieldMapping.csv in its constructor; this class is part of Parser.jar. Here I am trying to read the content of FnmaUtils-3.2-fieldMapping.csv, and this step keeps failing with the error below. I have tried all the options shown in How do I read a resource file from a Java jar file?
public FnmaUtils() {
    String mappingFileUrl = null;
    try {
        Resource resource = new ClassPathResource("FnmaUtils-3.2-fieldMapping.csv");
        mappingFileUrl = resource.getFile().getPath();
        loadFnmaTemplate(mappingFileUrl);
    } catch (Exception e) {
        e.printStackTrace();
        LOGGER.error("Error loading fnma template file ", e);
    }
}
Getting error:
java.io.FileNotFoundException: class path resource [FnmaUtils-3.2-fieldMapping.csv] cannot be resolved to absolute file path because it does not reside in the file system: jar:file:/home/ravibeli/.m2/repository/com/xxx/mismo/util/fnma-parser32/2018.1.0.0-SNAPSHOT/fnma-parser32-2018.1.0.0-SNAPSHOT.jar!/FnmaUtils-3.2-fieldMapping.csv
at org.springframework.util.ResourceUtils.getFile(ResourceUtils.java:218)
at org.springframework.core.io.AbstractFileResolvingResource.getFile(AbstractFileResolvingResource.java:52)
at com.xxx.fnma.util.FannieMaeUtils.<init>(FannieMaeUtils.java:41)
at com.xxx.fnma.processor.FNMA32Processor.<init>(FNMA32Processor.java:54)
at com.xxx.fnma.processor.FNMA32Processor.<clinit>(FNMA32Processor.java:43)
What is going wrong here?
Try
InputStream in = this.getClass().getClassLoader()
.getResourceAsStream("SomeTextFile.txt");
Be sure the resource is in your classpath.
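In other words, don't ask for a File at all; read the resource as a stream. For example, a minimal sketch of the constructor reading the CSV through ClassPathResource.getInputStream(); loadFnmaTemplate(BufferedReader) is a hypothetical overload of your existing loader:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import org.springframework.core.io.ClassPathResource;

public FnmaUtils() {
    try (BufferedReader reader = new BufferedReader(
            new InputStreamReader(
                    new ClassPathResource("FnmaUtils-3.2-fieldMapping.csv").getInputStream(),
                    StandardCharsets.UTF_8))) {
        // Read the mapping from the stream instead of resolving a file path,
        // which is impossible once the CSV is packed inside Parser.jar.
        loadFnmaTemplate(reader);   // hypothetical overload that accepts a Reader
    } catch (IOException e) {
        LOGGER.error("Error loading fnma template file ", e);
    }
}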

Writing (modifying or adding) a file inside a zip

I've followed the instructions in
this thread; using the code in there I've been able to add a file to a zip file without uncompressing and recompressing it, but I have a problem. Let me show you my code:
private void saveFileIntoProjectArchive(Path pathOfFile) {
    this.projectArchiveFile.setWritable(true, false);
    Path zipFilePath = Paths.get(this.projectArchiveFile.getAbsolutePath()),
         pathToSaveInsideZIP = null;
    FileSystem fs;
    try {
        fs = FileSystems.newFileSystem(zipFilePath, null);
        pathToSaveInsideZIP = fs.getPath(pathOfFile.toString().substring(this.transactionalProjectFolder.getAbsolutePath().length()));
        System.out.println("Coping from:\n\t" + pathOfFile + "\nto\n\t" + pathToSaveInsideZIP);
        Files.copy(pathOfFile, pathToSaveInsideZIP, REPLACE_EXISTING);
        System.out.println("Done!!!");
        fs.close();
    } catch (java.nio.file.NoSuchFileException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
    projectArchiveFile.setWritable(false, false);
}
What I'm trying to do: I have many files for the same project. The project is an archive (a ZIP, referenced in the code by a java.io.File called projectArchiveFile, an instance variable of my class) containing all of those files. When I want to work with a certain file inside my archive, I uncompress only that file into a folder whose structure is identical to the one inside the archive; the reference to that folder is a java.io.File called transactionalProjectFolder, also an instance variable of my class. But it gives me this error:
coping from:
C:\Dir1\Dir2\Dir3\Begin of Archive stucure\Another folder replica of the archive structure\An Excel File.xlsm
to
\Begin of Archive stucure\Another folder replica of the archive structure\An Excel File.xlsm
java.nio.file.NoSuchFileException: Begin of Archive stucure\Another folder replica of the archive structure\
at com.sun.nio.zipfs.ZipFileSystem.checkParents(ZipFileSystem.java:846)
at com.sun.nio.zipfs.ZipFileSystem.newOutputStream(ZipFileSystem.java:515)
at com.sun.nio.zipfs.ZipPath.newOutputStream(ZipPath.java:783)
at com.sun.nio.zipfs.ZipFileSystemProvider.newOutputStream(ZipFileSystemProvider.java:276)
at java.nio.file.Files.newOutputStream(Files.java:170)
at java.nio.file.Files.copy(Files.java:2826)
at java.nio.file.CopyMoveHelper.copyToForeignTarget(CopyMoveHelper.java:126)
at java.nio.file.Files.copy(Files.java:1222)
the rest of the stack trace are my classes.
I've been able to write to the root of the archive (zip), but whenever I try to write inside a folder that is inside the archive (zip), it fails. As you can notice in the stack trace, it says java.nio.file.NoSuchFileException: Begin of Archive stucure\Another folder replica of the archive structure\ and it stops right before the name of the file I'm trying to copy. I'm entirely sure that the path inside the zip exists and is spelled correctly; it just won't write the file inside the archive (I've tried both Files.copy and Files.move). I've been stuck on this for a month and I don't know what else to do; any suggestion will be appreciated!
Thanks in advance! :)...
Even though you are sure the path exists, for troubleshooting I would add the row below and see what directory structure it creates.
The error indicates that the directories are missing inside the zip. It could be that you are using some folder names not supported by zip, etc.
System.out.println("Coping from:\n\t"+pathOfFile+"\nto\n\t"+pathToSaveInsideZIP);
Files.createDirectories(pathToSaveInsideZIP.getParent()); // add this row: create any missing parent directories inside the zip
Files.copy(pathOfFile, pathToSaveInsideZIP, StandardCopyOption.REPLACE_EXISTING);
System.out.println("Done!!!");
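For reference, a minimal self-contained sketch of writing a file into a nested folder of a zip via the zip FileSystem provider (the method and entry names are just examples):

import java.io.IOException;
import java.nio.file.FileSystem;
import java.nio.file.FileSystems;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public static void copyIntoZip(Path zipFile, Path sourceFile, String entryPath) throws IOException {
    // Open the existing zip as a FileSystem (java.nio zipfs provider, Java 7+).
    try (FileSystem zipFs = FileSystems.newFileSystem(zipFile, (ClassLoader) null)) {
        Path target = zipFs.getPath(entryPath);              // e.g. "/some folder/file.xlsm"
        if (target.getParent() != null) {
            Files.createDirectories(target.getParent());     // make sure the folder entries exist in the zip
        }
        Files.copy(sourceFile, target, StandardCopyOption.REPLACE_EXISTING);
    }
}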
I'd recommend you try out TrueZIP; it's easy to use. It exposes any archive as a virtual file system, so you can append, delete or edit any file inside the archive easily.

Java Jar file: use resource errors: URI is not hierarchical

I have deployed my app as a jar file. When I need to copy data from a resource file to outside the jar file, I use this code:
URL resourceUrl = getClass().getResource("/resource/data.sav");
File src = new File(resourceUrl.toURI()); //ERROR HERE
File dst = new File(CurrentPath() + "data.sav"); //CurrentPath(): path of the jar file, not including the jar file name
FileInputStream in = new FileInputStream(src);
FileOutputStream out = new FileOutputStream(dst);
// some copy code here
The error I get is: URI is not hierarchical. I don't get this error when running from the IDE.
If I change the above code as suggested in other posts on StackOverflow:
InputStream in = Model.class.getClassLoader().getResourceAsStream("/resource/data.sav");
File dst = new File(CurrentPath() + "data.sav");
FileOutputStream out = new FileOutputStream(dst);
//....
byte[] buf = new byte[1024];
int len;
while ((len = in.read(buf)) > 0) { //NULL POINTER EXCEPTION
//....
}
You cannot do this
File src = new File(resourceUrl.toURI()); //ERROR HERE
because it is not a file!
When you run from the IDE you don't get any error, because you don't run a jar file. In the IDE, classes and resources are extracted onto the file system.
But you can open an InputStream in this way:
InputStream in = Model.class.getClassLoader().getResourceAsStream("data.sav");
Remove "/resource" (and note that with ClassLoader.getResourceAsStream the path should not start with "/"). Generally the IDEs separate classes and resources on the file system, but when the jar is created they are all put together, so the "/resource" folder level is used only to separate classes from resources in the source tree.
When you get a resource from the classloader you have to specify the path that the resource has inside the jar, that is, the real package hierarchy.
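Putting it together for the original goal (copying data.sav out of the jar), a minimal sketch; it assumes the resource ends up at the root of the jar, and CurrentPath() is your own existing helper:

import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

try (InputStream in = Model.class.getClassLoader().getResourceAsStream("data.sav")) {
    if (in == null) {
        throw new IOException("data.sav not found on the classpath");
    }
    Path dst = Paths.get(CurrentPath(), "data.sav");
    // Copy the stream straight to a file on disk; no java.io.File for the resource is needed.
    Files.copy(in, dst, StandardCopyOption.REPLACE_EXISTING);
}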
If for some reason you really need to create a java.io.File object to point to a resource inside of a Jar file, the answer is here: https://stackoverflow.com/a/27149287/155167
File f = new File(getClass().getResource("/MyResource").toExternalForm());
Here is a solution for Eclipse RCP / Plugin developers:
Bundle bundle = Platform.getBundle("resource_from_some_plugin");
URL fileURL = bundle.getEntry("files/test.txt");
File file = null;
try {
    URL resolvedFileURL = FileLocator.toFileURL(fileURL);
    // We need to use the 3-arg constructor of URI in order to properly escape file system chars
    URI resolvedURI = new URI(resolvedFileURL.getProtocol(), resolvedFileURL.getPath(), null);
    file = new File(resolvedURI);
} catch (URISyntaxException e1) {
    e1.printStackTrace();
} catch (IOException e1) {
    e1.printStackTrace();
}
It's very important to use FileLocator.toFileURL(fileURL) rather than resolve(fileURL), because when the plugin is packed into a jar this causes Eclipse to create an unpacked version in a temporary location so that the object can be accessed using File. For instance, I guess Lars Vogel has an error in his article - http://blog.vogella.com/2010/07/06/reading-resources-from-plugin/
I ran into a similar issue before, and I used the code:
new File(new URI(url.toString().replace(" ", "%20")).getSchemeSpecificPart());
instead of the code:
new File(url.toURI())
to solve the problem.
Since I stumbled upon this problem myself, I'd like to add another option (to the otherwise perfect explanation from #dash1e):
Export the plugin as a folder (not a jar) by adding:
Eclipse-BundleShape: dir
to your MANIFEST.MF.
At least when you export your RCP app with the export wizard (based on a *.product file) this gets respected and will produce a folder.
In addition to the general answers, you can get "URI is not hierarchical" from the Unitils library when it attempts to load a dataset from a .jar file. It may happen when you keep datasets in one Maven submodule but the actual tests in another.
There is even a bug filed for this: UNI-197.
