I'm trying to install the Java JDK, and when I try to extract the downloaded file I get this message:
tar (child): jdk-8u241-linux-i586.tar.gz: Cannot open: No such file or directory
tar (child): Error is not recoverable: exiting now
tar: Child returned status 2
tar: Error is not recoverable: exiting now
The file is downloaded and is on my desktop; this is what I entered:
root@faiq-desktop:~/Desktop# tar zxvf jdk-8u241-linux-i586.tar.gz
It seems you are seeing this because you are not in the directory that contains the archive; first cd into that directory, then extract:
tar -zxvf filename.tar.gz # replace the filename with yours
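For example, since the archive is on the desktop, something like the following should work (a minimal sketch; /opt as an extraction target is only an example):
cd ~/Desktop
tar -zxvf jdk-8u241-linux-i586.tar.gz
# or, without changing directory, give tar the full path and a target directory:
tar -zxvf ~/Desktop/jdk-8u241-linux-i586.tar.gz -C /opt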
referenced from: “Cannot open: No such file or directory” when extracting a tar file
Related
I am running my Spring application in Tomcat via a .sh file in an init container in Kubernetes, and I have set runAsUser: 1337 in the security context of the init container in the deployment.yaml file.
It gives:
cp: cannot create regular file '/usr/java/openjdk-11/conf/security/java.security.bak': permission denied
and
sed: couldn't open temporary file '': permission denied.
I used chmod to change the permissions, but I am facing the issue below:
chmod: changing permissions of '/opt/jdk/conf/security/java.security': Operation not permitted
I am also facing:
/startup.sh: line 3: exec: catalina.sh: not found
My .sh file (after adding chmod):
chmod -R 766 ${JAVA_HOME}/conf/security
/add-jce-provider.sh ${JAVA_HOME}/conf/security/java.security;
exec catalina.sh run;
If you're not able to write to the directory, then it is possible that:
the directory has the immutable flag enabled; check with lsattr.
the directory is mounted read-only: run cat /proc/mounts (or mount, or cat /etc/mtab) in a terminal and check the output to see whether the directory's filesystem is mounted read-only.
If you are in the first case, change the directory attributes with chattr:
remove the immutable flag from a file or directory with chattr -i <file/dir>
add the immutable flag back with chattr +i <file/dir>
If you're in the latter case, edit the file /etc/fstab.
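Both checks can be run in one go; here is a minimal sketch (the path is only an example taken from the error messages above, substitute the directory you cannot write to):
DIR=/usr/java/openjdk-11/conf/security    # example path, replace with yours
lsattr -d "$DIR"                          # an 'i' among the flags means the immutable bit is set
# sudo chattr -i "$DIR"                   # clears the immutable bit if it is set
findmnt -T "$DIR" -o TARGET,OPTIONS       # shows the mount the directory lives on; look for 'ro'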
I am trying to write a Java program to unzip files that were zipped by the PKZIP tool on a mainframe. I have tried the three approaches below, and none of them solves my problem.
By external tools.
I tried to open the archive with WinRAR, 7-Zip, and the Linux unzip command.
All of them failed with the error message:
The archive is either in unknown format or damaged
By the JDK API (java.util.ZipFile).
I also tried to unzip it with the JDK API, as described on this website.
However, it fails with the error message:
IO Error: java.util.zip.ZipException: error in opening zip file
By Zip4j.
I also tried Zip4j. It failed too, with the error message:
Caused by: java.io.IOException: Negative seek offset
at java.io.RandomAccessFile.seek(Native Method)
at net.lingala.zip4j.core.HeaderReader.readEndOfCentralDirectoryRecord(HeaderReader.java:117)
... 5 more
Is there any Java library or Linux command that can extract a zip file created by PKZIP on a mainframe? Thanks a lot!
I have successfully read files that were compressed with PKZIP on z/OS and transferred to Linux. I was able to read them with the java.util.zip.* classes:
import java.io.InputStream;
import java.util.Enumeration;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;

ZipFile ifile = new ZipFile(inFileName);
// faster to loop through entries than to open the zip file as a stream
Enumeration<? extends ZipEntry> entries = ifile.entries();
while (entries.hasMoreElements()) {
    ZipEntry entry = entries.nextElement();
    if (!entry.isDirectory()) { // skip directories
        String entryName = entry.getName();
        // code to decide whether to process this entry omitted
        InputStream zis = ifile.getInputStream(entry);
        // process the stream
    }
}
ifile.close();
The jar file format is just a zip file, so the "jar" command can also read such files.
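For a quick sanity check without writing any code (assuming a JDK is on the PATH; myfile.zip is a placeholder name):
jar tf myfile.zip    # lists the entry names, or fails if the archive cannot be read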
Like the others, I suspect the file was not transferred in binary mode and so was corrupted. On Linux you can use the xxd utility (piped through head) to dump the first few bytes and see whether it looks like a zip file:
# xxd myfile.zip | head
0000000: 504b 0304 2d00 0000 0800 2c66 a348 eb5e PK..-.....,f.H.^
The first 4 bytes should be as shown. See also the Wikipedia entry for the ZIP file format.
Even if the first 4 bytes are correct, if the file was truncated during transmission that could also cause the corrupt file message.
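If you want to script that check, this sketch compares the first four bytes against the PK\x03\x04 local-file-header signature (myfile.zip is again a placeholder):
if [ "$(head -c4 myfile.zip | xxd -p)" = "504b0304" ]; then
    echo "starts with a zip signature"
else
    echo "not a zip signature - possibly transferred in ASCII mode or truncated"
fi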
I am currently trying to build Hadoop 2.5 for the Windows 7 x64 platform. I am following the instructions from
https://wiki.apache.org/hadoop/Hadoop2OnWindows and have all the dependencies mentioned in https://svn.apache.org/repos/asf/hadoop/common/branches/branch-2.5/BUILDING.txt. I am getting an error in the Apache Hadoop Common project while building with the following Maven command:
mvn package -Pdist,native-win -DskipTests -Dtar. The error is:
[INFO]
[INFO] --- exec-maven-plugin:1.2:exec (compile-ms-native-dll) @ hadoop-common ---
Build started 15-11-2014 11:08:28.
Project "D:\hadoop-2.5.0-src\hadoop-common-project\hadoop-common\src\main\native\native.sln" on node 1 (default targets).
ValidateSolutionConfiguration:
Building solution configuration "Release|x64".
Project "D:\hadoop-2.5.0-src\hadoop-common-project\hadoop-common\src\main\native\native.sln" (1) is building "D:\hadoop-2.5.0-src\hadoop-common-project\hadoop-common\src\main\native\native.vcxproj" (2) on node 1 (default targets).
InitializeBuildStatus:
Touching "..\..\..\target\native\Release\native.unsuccessfulbuild".
ClCompile:
C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\bin\amd64\CL.exe /c /I..\winutils\include /I..\..\..\target\native\javah /I"C:\Progra~1\Java\jdk1.7.0_51\include" /I"C:\Progra~1\Java\jdk1.7.0_51\include\win32" /I.\src /Zi /nologo /W3 /WX- /O2 /Oi /GL /D WIN32 /D NDEBUG /D _WINDOWS /D _USRDLL /D NATIVE_EXPORTS /D _WINDLL /D _UNICODE /D UNICODE /Gm- /EHsc /MD /GS /Gy /fp:precise /Zc:wchar_t /Zc:forScope /Fo"..\..\..\target\native\Release\\" /Fd"..\..\..\target\native\Release\vcWindows7.1SDK.pdb" /Gd /TC /wd4244 /errorReport:queue src\org\apache\hadoop\io\compress\zlib\ZlibCompressor.c src\org\apache\hadoop\io\compress\zlib\ZlibDecompressor.c
ZlibCompressor.c
d:\hadoop-2.5.0-src\hadoop-common-project\hadoop-common\src\main\native\src\org\apache\hadoop\io\compress\zlib\org_apache_hadoop_io_compress_zlib.h(36): fatal error C1083: Cannot open include file: 'zlib.h': No such file or directory [D:\hadoop-2.5.0-src\hadoop-common-project\hadoop-common\src\main\native\native.vcxproj]
ZlibDecompressor.c
d:\hadoop-2.5.0-src\hadoop-common-project\hadoop-common\src\main\native\src\org\apache\hadoop\io\compress\zlib\org_apache_hadoop_io_compress_zlib.h(36): fatal error C1083: Cannot open include file: 'zlib.h': No such file or directory [D:\hadoop-2.5.0-src\hadoop-common-project\hadoop-common\src\main\native\native.vcxproj]
Done Building Project "D:\hadoop-2.5.0-src\hadoop-common-project\hadoop-common\src\main\native\native.vcxproj" (default targets) -- FAILED.
Done Building Project "D:\hadoop-2.5.0-src\hadoop-common-project\hadoop-common\src\main\native\native.sln" (default targets) -- FAILED.
Build FAILED.
"D:\hadoop-2.5.0-src\hadoop-common-project\hadoop-common\src\main\native\native.sln" (default target) (1) ->
"D:\hadoop-2.5.0-src\hadoop-common-project\hadoop-common\src\main\native\native.vcxproj" (default target) (2) ->
(ClCompile target) ->
d:\hadoop-2.5.0-src\hadoop-common-project\hadoop-common\src\main\native\src\org\apache\hadoop\io\compress\zlib\org_apache_hadoop_io_compress_zlib.h(36): fatal error C1083: Cannot open include file: 'zlib.h': No such file or directory [D:\hadoop-2.5.0-src\hadoop-common-project\hadoop-common\src\main\native\native.vcxproj]
d:\hadoop-2.5.0-src\hadoop-common-project\hadoop-common\src\main\native\src\org\apache\hadoop\io\compress\zlib\org_apache_hadoop_io_compress_zlib.h(36): fatal error C1083: Cannot open include file: 'zlib.h': No such file or directory [D:\hadoop-2.5.0-src\hadoop-common-project\hadoop-common\src\main\native\native.vcxproj]
0 Warning(s)
2 Error(s)
From the log it is obvious that zlib.h is missing. Did you set the environment variable ZLIB_HOME to the directory containing zlib.h? Note that putting the zlib.h directory in PATH is wrong and unnecessary; only the zlib bin directory needs to be on PATH, as stated in the building guide: https://svn.apache.org/viewvc/hadoop/common/branches/branch-2/BUILDING.txt?view=markup
Also, zlib.h requires some headers that are not present on Windows, so you will need to download those headers and put them in the same folder as zlib.h. For more details, see "Is there a replacement for unistd.h for Windows (Visual C)?" to get unistd.h, and https://gist.github.com/ashelly/7776712 to get getopt.h.
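For example, in the Windows SDK command prompt used for the build, the environment setup might look like this (the zlib location is only an assumption; point it at wherever you unpacked the zlib headers and binaries):
set ZLIB_HOME=C:\zlib-1.2.7
set PATH=%PATH%;C:\zlib-1.2.7\bin
mvn package -Pdist,native-win -DskipTests -Dtar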
For those on Win32 (like me), you may face even more trouble. The key is to edit the .sln and .vcxproj files of the winutils and native packages so that they are compatible with the Win32 platform.
I checked all previous threads, set LD_LIBRARY_PATH, and followed the suggestions accordingly, but the problem persists.
I am trying to run the cherrypicker software, executing it like this:
./cherrypicker.sh input.txt
Error message:
/root/Desktop/karim/software/cherrypicker1.01/tools/crf++/.libs/lt-crf_test: error while loading shared libraries: libcrfpp.so.0: cannot open shared object file: No such file or directory
Exception in thread "main" java.lang.ArrayIndexOutOfBoundsException: Array index out of range: 0
at java.util.Vector.get(Vector.java:744)
at CreateNPSpan.<init>(CreateNPSpan.java:30)
at CreateNPSpan.main(CreateNPSpan.java:81)
creating feature file....
java.io.FileNotFoundException: input.txt.npspan (No such file or directory)
at java.io.FileInputStream.open(Native Method)
at java.io.FileInputStream.<init>(FileInputStream.java:146)
at java.io.FileInputStream.<init>(FileInputStream.java:101)
at java.io.FileReader.<init>(FileReader.java:58)
at CherryPick.LoadManualEntities(CherryPick.java:111)
at CherryPick.LoadEntities(CherryPick.java:139)
at CherryPick.<init>(CherryPick.java:30)
at CherryPick.main(CherryPick.java:2188)
Exception in thread "main" java.lang.NullPointerException
at CherryPick.SortEntityMentions(CherryPick.java:171)
at CherryPick.LoadEntities(CherryPick.java:141)
at CherryPick.<init>(CherryPick.java:30)
at CherryPick.main(CherryPick.java:2188)
classifying clusters using cr joint model.....
creating output.....
Gotcha creating entities : java.lang.NumberFormatException: For input string: "no"
I checked /usr/lib but there's no such file.
In the directory cherrypicker1.01/tools/crf++/.libs I found the following files:
crf_learn feature_index.o libcrfpp.lai lt-crf_test tagger.o
crf_test feature.o libcrfpp.o node.o
encoder.o lbfgs.o libcrfpp.so.0.0.0 param.o
feature_cache.o libcrfpp.a lt-crf_learn path.o
Any suggestions?
Follow these steps:
Go to http://taku910.github.io/crfpp/#download and download CRF++-0.58.tar.gz.
Untar the file and run ./configure followed by make install (typically with sudo).
Search from the parent directory with sudo find ./ | grep libcrfpp.so.0; that will show you where the missing file is located.
Copy that file to /usr/lib and to cherrypicker1.01/tools/crf++/.libs/.
Now it should work.
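A condensed sketch of the steps above (assuming the tarball has already been downloaded into the current directory and that make install placed the library under /usr/local/lib, the default prefix):
tar -xzf CRF++-0.58.tar.gz
cd CRF++-0.58
./configure && make && sudo make install
# locate the freshly installed shared library
sudo find / -name 'libcrfpp.so.0*' 2>/dev/null
# copy it where cherrypicker expects it (paths taken from the question; adjust the relative path as needed)
sudo cp /usr/local/lib/libcrfpp.so.0* /usr/lib/
cp /usr/local/lib/libcrfpp.so.0* cherrypicker1.01/tools/crf++/.libs/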
The path in which the libraries reside depends on the value passed to --prefix on the configure script. If no prefix is passed, then according to the source code the default is /usr/local/.
However, /usr/local/lib is often not in the set of paths the system searches for dynamic libraries by default. Hence, one can do:
echo "/usr/local/lib/" | sudo tee /etc/ld.so.conf.d/CRF.conf
sudo rm -f /etc/ld.so.cache
sudo ldconfig
Now, perform:
ldd $(which crf_test)
The output should be something similar to:
linux-vdso.so.1 (0x00007ffefc1f0000)
libcrfpp.so.0 => /usr/local/lib/libcrfpp.so.0 (0x00007f6b715b4000)
libpthread.so.0 => /lib64/libpthread.so.0 (0x00007f6b71398000)
libstdc++.so.6 => /lib64/libstdc++.so.6 (0x00007f6b71016000)
libm.so.6 => /lib64/libm.so.6 (0x00007f6b70d0e000)
libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x00007f6b70af7000)
libc.so.6 => /lib64/libc.so.6 (0x00007f6b70737000)
/lib64/ld-linux-x86-64.so.2 (0x00007f6b717f3000)
The CRF++ developers may wish to hard-code /usr/local/lib/ or $PREFIX/lib as one of the directories searched inside the binaries, using an RPATH. To check whether a binary contains an RPATH, do:
objdump -x $binary | grep RPATH
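Until that happens, a user can bake the path into the binaries at build time; a sketch assuming the stock autoconf build honours LDFLAGS (it normally does):
./configure --prefix=/usr/local LDFLAGS='-Wl,-rpath,/usr/local/lib'
make
sudo make install
objdump -x /usr/local/bin/crf_test | grep -E 'RPATH|RUNPATH'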
When I try to run the Oppia install process, an error message is thrown:
Building languages list.
Compiling repl.coffee.
minifying tmp/jsrepl.js using java -Xmx1g -jar ./tools/closure-compiler/trunk/build/compiler.jar --compilation_level SIMPLE_OPTIMIZATIONS --js
Done.
Downloading file yuicompressor-2.4.8.jar to ../oppia_tools/yuicompressor-2.4.8
Downloading file ui-bootstrap-tpls-0.10.0.js to ./third_party/static/ui-bootstrap-0.10.0
Downloading file ui-bootstrap-tpls-0.10.0.min.js to ./third_party/static/ui-bootstrap-0.10.0
Downloading file jquery.js to ./third_party/static/jquery-2.0.3
Downloading file jquery.min.js to ./third_party/static/jquery-2.0.3
Downloading file jquery.min.map to ./third_party/static/jquery-2.0.3
Downloading file jquery-ui.min.js to ./third_party/static/jqueryui-1.10.3
Downloading file angular.js to ./third_party/static/angularjs-1.2.0-rc.3
Downloading file angular.min.js to ./third_party/static/angularjs-1.2.0-rc.3
Downloading file angular.min.js.map to ./third_party/static/angularjs-1.2.0-rc.3
Downloading file angular-resource.js to ./third_party/static/angularjs-1.2.0-rc.3
Downloading file angular-resource.min.js to ./third_party/static/angularjs-1.2.0-rc.3
Downloading file angular-resource.min.js.map to ./third_party/static/angularjs-1.2.0-rc.3
Downloading file angular-route.js to ./third_party/static/angularjs-1.2.0-rc.3
Downloading file angular-route.min.js to ./third_party/static/angularjs-1.2.0-rc.3
Downloading file angular-route.min.js.map to ./third_party/static/angularjs-1.2.0-rc.3
Downloading file angular-sanitize.js to ./third_party/static/angularjs-1.2.0-rc.3
Downloading file angular-sanitize.min.js to ./third_party/static/angularjs-1.2.0-rc.3
Downloading file angular-sanitize.min.js.map to ./third_party/static/angularjs-1.2.0-rc.3
Downloading file angular-mocks.js to ./third_party/static/angularjs-1.2.0-rc.3
Downloading file angular-scenario.js to ./third_party/static/angularjs-1.2.0-rc.3
Downloading file d3.min.js to ./third_party/static/d3js-3.2.8
Downloading and unzipping file select2-3.4.1 to ./third_party/static
Traceback (most recent call last):
File "scripts/install_third_party.py", line 260, in <module>
SELECT2_ZIP_ROOT_NAME, SELECT2_TARGET_ROOT_NAME)
File "scripts/install_third_party.py", line 83, in download_and_unzip_files
with zipfile.ZipFile(TMP_UNZIP_PATH, 'r') as z:
AttributeError: ZipFile instance has no attribute '__exit__'
I cannot really understand the error message. Should I install some package (for ZIP)?
Are you (or the installer) actually running Python 2.7?
The line with zipfile.ZipFile(TMP_UNZIP_PATH, 'r') as z: is a with statement, and it requires ZipFile instances to support the context manager protocol, i.e. the __enter__() and __exit__() methods must be defined.
Context manager support for ZipFile was added in Python 2.7 (and in 3.2 on the Python 3 line), so you are not running a recent enough Python version.
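A quick way to verify this from the shell (the hasattr check simply asks whether ZipFile defines __exit__ in the interpreter that will run the installer):
python --version
python -c 'import zipfile; print(hasattr(zipfile.ZipFile, "__exit__"))'   # True on 2.7+, False on 2.6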