I have some .cfm files that contain binary, corrupted-looking lines, and I used a Java decompiler to view their code by changing the extension to .class.
What I'm asking is: how can I modify these files, or at least view them as CFML templates, so I can run them and access them on my server?
One of the few readable fragments in such a file is:
SourceFile :C:\inetpub\wwwroot\Clients\ent\www\ADMIN\clips\logclip.cfm
To my knowledge there is no way to get the original source back from the compiled .cfm files, because the file is not actually encrypted or corrupted; it really is just a class file.
There is a bit more information about it from Rupesh Kumar of the CF team here:
http://coldfused.blogspot.co.uk/2008/01/encrypted-cfml-templates.html
There are two possible answers here, depending on what you mean:
If you are speaking of the compiled class files and you have the cfm files, you can safely delete the class files (in the "cfclasses" directory of your instance). ColdFusion will recompile them the next time you run the source file. The big gotcha is that if this is a high-traffic site and you delete ALL the files, you may drag your server down as CF has to recompile everything. In that case you might want to be more selective.
If you mean you are seeing nonsense encryption in a CF file and the only readable part of the file is at the top (a sort of header that tells you what it is), then you are working with encrypted CFM files - CF provides (or provided) a utility to encrypt and deploy CFM files to hide source code. The CF engine can decrypt these files into regular CFML and then compile them down into class files. The encryption used was (in the past) quite trivial and not much protection at all. I have decrypted these files in the past for site owners who owned their code but whose developer had chosen to encrypt them and not provide source copies.
You can find methods to decrypt them (and even utilities to do it) in the wild if this is your issue. My only caution would be: be sure you have a right to the code - there was a time when many devs sold encrypted custom tags and widgets, encrypted to prevent them from propagating without a license.
I'm developing a simple Java Card applet using JCDK 3.0.5u3 with Eclipse Oxygen.3. Using a simple GlobalPlatform API call such as GPSystem.getCardContentState() results in an error.
I've tried adding the globalplatform.jar file from GP API v1.1 and v1.6 to the Referenced Libraries section of the Package Explorer. I also imported org.globalplatform.* in the code.
import org.globalplatform.*;

if (GPSystem.getCardContentState() == GPSystem.APPLICATION_SELECTABLE) {
    // Do something
}
The converter returns: "export file globalplatform.exp of package org.globalplatform not found"
Java Card doesn't just require a compile stage, it also performs the linking that is usually performed as dynamic linking in the JVM of a normal Java application. Basically it orders the methods and such, and then calls the right serial ID. You don't want your Applet to contain the string names of your fields after all: it would explode the memory requirements, and dynamically looking for classes and fields is not a good idea either within such a restricted platform.
So if you call external libraries then you need to configure:
the .jar file containing the .class files for the normal compiler;
the .exp file, which contains an export of the mapping between the normal names and the IDs of the classes and fields, specific to the converted classes of the called library;
If it is not already present on the card, you may also need the version-specific .cap file for uploading. However, the GP functionality should already be present on the card.
The IDs are only unique for a specific .cap file / preloaded byte code. This is why you always need the right .exp file for the code that is loaded. If another field is added, the ordering is different and the wrong fields would be linked, if the linker executes at all. So having the right .exp file is a requirement for correct conversion to .cap for your application / library.
For the JCDK I think you just need to configure the right -exportpath, as the GP should be included with the JCDK.
I was inspecting the class file format because I wanted to add source code to the class file (which was possible in early Java versions), but all I found was the SourceFile attribute and the SourceDebugExtension attribute. I was looking for the complete source code of the class to be bundled with the class file to ease the post-processing pipeline.
Does anyone know if my memory is wrong, or how I can bundle the complete source code of a class within the class file so that I do not have to look up the .java file when I want to check the source code?
Is there a compiler switch to do that?
javac has a -g option that adds additional debug information. Can someone tell me what information it adds? Without the -g switch it generates the line-number table and source-file information.
The main problem I have is that I generate a class file but only keep a reference to a source file that might change. I simply want to bundle the source and the class file together.
In Maven I could simply copy all the source files to the target directory, but that might be incompatible with Eclipse, IntelliJ and NetBeans (and whatnot)...
Using a decompiler would also provide a way to extract a useful representation of the source code, since most decompilers honor the line-number information and position the decompiled structures accordingly within the source.
But since some scenarios require access to comments and a correct character-by-character representation, a decompiler would be a second-rate solution.
One possible solution I found is defining a new class-file attribute (which is legal) that contains the source. Since the source is huge when compared to the class file, the content might be best compressed (yielding a 1:5 to 1:10 ratio).
This way the class file and the sources stay bundled.
The JVM specification guarantees that every JVM/Tool has to ignore unknown attributes.
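As an illustration of how such an attribute could be attached after compilation - this is just a sketch using the ASM bytecode library, not an established tool; the attribute name "SourceArchive", the ASM 9 API level and the two command-line arguments (class file, then source file) are my own assumptions:

import java.io.ByteArrayOutputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.zip.Deflater;
import java.util.zip.DeflaterOutputStream;
import org.objectweb.asm.Attribute;
import org.objectweb.asm.ByteVector;
import org.objectweb.asm.ClassReader;
import org.objectweb.asm.ClassVisitor;
import org.objectweb.asm.ClassWriter;
import org.objectweb.asm.Opcodes;

public class AttachSource {

    // Hypothetical custom attribute carrying the deflate-compressed source text.
    // The JVM ignores attributes it does not know, so this is legal.
    static class SourceArchiveAttribute extends Attribute {
        private final byte[] compressedSource;

        SourceArchiveAttribute(byte[] compressedSource) {
            super("SourceArchive");
            this.compressedSource = compressedSource;
        }

        @Override
        protected ByteVector write(ClassWriter cw, byte[] code, int len, int maxStack, int maxLocals) {
            return new ByteVector().putByteArray(compressedSource, 0, compressedSource.length);
        }
    }

    public static void main(String[] args) throws Exception {
        // args[0] = the .class file to modify, args[1] = the matching .java file
        byte[] source = Files.readAllBytes(Paths.get(args[1]));
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (DeflaterOutputStream dos = new DeflaterOutputStream(bos, new Deflater(Deflater.BEST_COMPRESSION))) {
            dos.write(source);
        }
        byte[] compressed = bos.toByteArray();

        ClassReader cr = new ClassReader(Files.readAllBytes(Paths.get(args[0])));
        ClassWriter cw = new ClassWriter(0);
        // Inject the attribute just before the class is finished.
        cr.accept(new ClassVisitor(Opcodes.ASM9, cw) {
            @Override
            public void visitEnd() {
                cv.visitAttribute(new SourceArchiveAttribute(compressed));
                super.visitEnd();
            }
        }, 0);
        Files.write(Paths.get(args[0]), cw.toByteArray());
    }
}

Reading the attribute back out would require overriding Attribute.read() and passing a prototype to ClassReader.accept(), which is omitted here.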
I will invest in a wrapper around the javac application that ensures the source was not modified during compilation (and if it was, redoes the compilation), and that, once compilation is done, adds the source code as a class-file attribute.
Since this will be incompatible with the IDE build cycle of Eclipse (and most likely IntelliJ and NetBeans), it will also require a special post-processor.
So integration will also require alternatives to the JavaBuilder.
Once the source code is attached to the class file in question, it is very easy to do a lot of advanced stuff with it that helps with both maintaining and managing code. For me it's important that the source code and the class stay together and that the bundled source is 100% identical to the source it was compiled from.
I have two programs (one in Java and one in Python) that ZIP a folder, upload it to a web server and trigger an UNZIP method on it.
The Java version of the program works without issues and the file is extracted on the server without problems.
Here I'm using the ArchiveStreamFactory class, i.e. new ArchiveStreamFactory().createArchiveOutputStream(ArchiveStreamFactory.ZIP, this.currentOutputStream);
The Python version only works if I use the zipfile.ZIP_STORED method (which does not compress the files). If I use the zipfile.ZIP_DEFLATED method I get an internal server error.
I don't have access to the server so for this I can only rely on what I'm able to figure out on my side.
The Java program does seem to use the DEFLATE method as well, since the archive is compressed (smaller) and not just stored.
I've also run zipinfo on both archives (the one created with Java and the one created with Python with DEFLATE, which doesn't work) to see what's different.
Here's the output:
# Java
-rw---- 2.0 fat 14398 bl defN 4-Jun-15 13:55 somefile.txt
# Python
-rw-r--r-- 2.0 unx 183 b- defN 28-Jun-15 21:39 someotherfile.txt
Both seem to be compressed with the DEFLATE (defN) method, so why does the archive generated by Java work while the one generated by Python doesn't?
So after a lot of debugging and trial and error it looks like I found the issue, in case anyone else is interested or runs into the same problem.
I was also adding the folders to the zip, and it looks like the server didn't like the folder entries being compressed with ZIP_DEFLATED. What I did was manually set the compression to ZIP_STORED for folders and to ZIP_DEFLATED for files, and after this it worked. It's interesting how Java knew to do this automatically behind the scenes - or at least I guess it does, as the Java version is more or less the same (iterate over folders/files and add them to the ZIP) except that I just use the default values (so I never explicitly set the compression type for anything).
So basically my code (the version that didn't work) was something like this:
# f_zip is the ZipFile, opened with compression=zipfile.ZIP_DEFLATED
for dir_path, dir_names, file_names in os.walk(absolute_folder_path):
    ...
    # Add folder to ZIP
    f_zip.write(absolute_dir_path, arcname=relative_dir_path)
    for file_name in file_names:
        ...
        # Add file to ZIP
        f_zip.write(absolute_file_path, arcname=relative_file_path)
and the fix was this:
for dir_path, dir_names, file_names in os.walk(absolute_folder_path):
    ...
    # Add folder to ZIP
    f_zip.write(absolute_dir_path, arcname=relative_dir_path, compress_type=zipfile.ZIP_STORED)
    for file_name in file_names:
        ...
        # Add file to ZIP
        f_zip.write(absolute_file_path, arcname=relative_file_path, compress_type=zipfile.ZIP_DEFLATED)
I have been searching online about Java jar signing concepts for some time now to understand what actually happens when one signs a jar file. I have looked into various articles on this, but I ended up reading ones full of complex jargon that were not simple to understand. It would be really helpful if you could explain the concept in simple terms or provide a reference link.
My prime objective is to reverse engineer a signed jar file (by whatever means, such as editing the class files within the jar at the bytecode level) to convert it into a working, non-error-throwing unsigned jar file.
Please guide me if my approach is not right or if the above mentioned process is not possible.
Thanks in advance.
Generally speaking signing includes the following steps:
Create a hash value over the data to be signed
Do a private-key operation on the hash value
The result ("the signature") can then be verified by anyone who has the public key. Usually the signature is packaged in a data structure that contains the public key and information about the algorithms that were used for signing.
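Purely as an illustration, here is a minimal sketch of those two steps using the JDK's java.security.Signature API (which performs the hashing and the private-key operation in one call); the freshly generated key pair and the sample data are placeholders for a real key and real content:

import java.nio.charset.StandardCharsets;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.Signature;

public class SignDemo {
    public static void main(String[] args) throws Exception {
        byte[] data = "the data to be signed".getBytes(StandardCharsets.UTF_8);

        // Placeholder key pair; a real signer would load a private key from a keystore.
        KeyPair keyPair = KeyPairGenerator.getInstance("RSA").generateKeyPair();

        // Step 1 + 2: hash the data, then apply the private-key operation to the hash.
        Signature signer = Signature.getInstance("SHA256withRSA");
        signer.initSign(keyPair.getPrivate());
        signer.update(data);
        byte[] signature = signer.sign();

        // Anyone holding the public key can verify the signature.
        Signature verifier = Signature.getInstance("SHA256withRSA");
        verifier.initVerify(keyPair.getPublic());
        verifier.update(data);
        System.out.println("valid = " + verifier.verify(signature));
    }
}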
Signed jar files contain two additional files in the META-INF folder (open the jar file with 7-Zip or whatever file archiver you prefer to see the content), for example:
META-INF/BCKEY.DSA
META-INF/BCKEY.SF
The ".SF" file contains hash values for every file in the jar:
Signature-Version: 1.0
Created-By: 1.5.0_08 (Sun Microsystems Inc.)
SHA1-Digest-Manifest-Main-Attributes: TCwFll9z+7/6t/SlEoKf3a1SEKU=
SHA1-Digest-Manifest: tbYd5vvo/j3yIenDqYs8xdPRv4c=
Name: org/bouncycastle/asn1/ua/DSTU4145BinaryField.class
SHA1-Digest: LwFPLRwMlgwj7TOKYsDtqhS6+lE=
Name: org/bouncycastle/asn1/DEREnumerated.class
SHA1-Digest: DLc3+IOaSG+cgzW+u4KUbgyypWA=
Name: org/bouncycastle/asn1/x509/SubjectKeyIdentifier.class
SHA1-Digest: v08rbVIhc3KGIL/JlpIPqwQTvgI=
...
The ".DSA" file contains the signature and additional information in PKCS#7 format. The file extension depends on the key algorithm (".DSA", ".RSA" or ".EC").
"BCKEY" is just a name for the signature (usually the first 8 characters of the key alias used for signing). There might be several pairs of signature files in the META-INF folder.
The documentation of jarsigner contains a short passage about those files; it is titled "The Signed JAR File".
So, if you want to remove the signature from a jar file, you simply have to delete all ".SF" and ".RSA"/".DSA"/".EC" files.
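For illustration only (this is a sketch, not an official tool), the following copies a jar entry by entry and drops exactly those files; the input and output paths are command-line placeholders:

import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import java.util.zip.ZipOutputStream;

public class StripJarSignature {
    public static void main(String[] args) throws Exception {
        // args[0] = signed jar, args[1] = unsigned copy to create
        try (ZipInputStream in = new ZipInputStream(new FileInputStream(args[0]));
             ZipOutputStream out = new ZipOutputStream(new FileOutputStream(args[1]))) {
            byte[] buffer = new byte[8192];
            ZipEntry entry;
            while ((entry = in.getNextEntry()) != null) {
                String name = entry.getName().toUpperCase();
                if (name.startsWith("META-INF/")
                        && (name.endsWith(".SF") || name.endsWith(".DSA")
                            || name.endsWith(".RSA") || name.endsWith(".EC"))) {
                    continue;  // drop the signature files
                }
                out.putNextEntry(new ZipEntry(entry.getName()));
                int n;
                while ((n = in.read(buffer)) != -1) {
                    out.write(buffer, 0, n);
                }
                out.closeEntry();
            }
        }
    }
}

The per-entry digests left behind in MANIFEST.MF should do no harm, since they are only checked when signature files are present.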
I'm writing a Java class which extends the Ant Zip task to do a particular job for me. I want to create a zip file and, once that file is created, suppress the access time in the inode so it can't be modified, or find a way to keep it from changing even if the file is modified. The reason is that I compute an MD5 hash which depends on the access time. That is giving me a lot of trouble, and making the access time constant would solve my problem.
Does someone know how I would accomplish that?
Thanks!
I've had to solve a similar problem previously - perhaps this is an option for you. In my case, the problem was:
We made a jar file and then ran a secure hash algorithm on the jar file. Because the jar file is really a zip file, and a zip file internally contains file metadata including last access time, if we create a new jar file from the exact same source material, the hash on the new jar file doesn't match the original hash (because while the zip contents are the same, the metadata stored in the zip file has different file creation / access times).
Basically, we needed to be able to compute a secure hash for compliance purposes to be able to easily show that the contents of a jar was unchanged. Recompiling an equivalent jar was ok - it's just that the contents had to be identical.
We wrote a simple set of tools that performed secure hashes (and verifications) specifically for zip/jar files. It computed two hashes:
a regular secure hash of the file (which would identify the exact same jar - this would be the same as the output of your standard md5sum)
a "content only" hash which was computed by iterating over the bytes of the unpacked contents of the zip/jar (and thus could be used to identify that a recompiled jar matched the original jar)
To implement the content only hash, we used a ZipInputStream to iterate over the zip entries.
import java.io.FileInputStream;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

MessageDigest sha1 = MessageDigest.getInstance("SHA-1");
byte[] buffer = new byte[8192];
try (ZipInputStream zin = new ZipInputStream(new FileInputStream(jarFile))) {  // jarFile: path to the zip/jar being hashed
    ZipEntry entry;
    while ((entry = zin.getNextEntry()) != null) {
        if (entry.isDirectory()) {
            // directories have no content, so hash the name instead
            sha1.update(entry.getName().getBytes(StandardCharsets.UTF_8));
        } else {
            // hash the uncompressed entry bytes; the zip metadata is not part of the digest
            int n;
            while ((n = zin.read(buffer)) != -1) {
                sha1.update(buffer, 0, n);
            }
        }
    }
}
byte[] digest = sha1.digest();
See also: ZipInputStream.read()
Note, however, that some files such as the manifest can contain information such as the version of ant used to create the jar, and the version of the compiler used to compile the classes. Thus, you have to compile from an equivalent environment for the hash to match.
Finally, this doesn't cope with the fact that a zip file might itself contain other zip files. While it would be straightforward enough to make the inspection cater for this and descend into nested zip/jar/war files, our implementation does not.