I need to check the MIME type of currentFile. If the check succeeds and the file has a MIME type, return true; if the check fails, return false.
For this I use JMimeMagic.
I tried to do it according to this post.
The output from this code is net.sf.jmimemagic.MagicMatchNotFoundException.
You need JDK 7 to convert a File to byte[] this way (Files.readAllBytes(path)).
Code:
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import net.sf.jmimemagic.Magic;
import net.sf.jmimemagic.MagicException;
import net.sf.jmimemagic.MagicMatch;
import net.sf.jmimemagic.MagicMatchNotFoundException;
import net.sf.jmimemagic.MagicParseException;

class ProbeContentTypeCheker implements Checker {

    @Override
    public boolean check(File currentFile) {
        String mimeType = null;
        try {
            Path path = Paths.get(currentFile.getAbsolutePath());
            byte[] data = Files.readAllBytes(path);
            MagicMatch match = Magic.getMagicMatch(data);
            mimeType = match.getMimeType();
        } catch (MagicParseException | MagicMatchNotFoundException
                | MagicException | IOException e) {
            e.printStackTrace();
        }
        if (null != mimeType) {
            return true;
        }
        return false;
    }
}
Output (only if the file is a "wrong" type):
net.sf.jmimemagic.MagicMatchNotFoundException
at net.sf.jmimemagic.Magic.getMagicMatch(Magic.java:222)
at net.sf.jmimemagic.Magic.getMagicMatch(Magic.java:170)
at task.ProbeContentTypeCheker.check(FileScan.java:357)
at task.FolderScan.findFiles(FileScan.java:223)
at task.FolderScan.findFiles(FileScan.java:215)
at task.FolderScan.run(FileScan.java:202)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
at java.lang.Thread.run(Thread.java:722)
If file is "ok" type => output to console normal. But after some time arise another exception:
Exception in thread "pool-1-thread-1" java.lang.OutOfMemoryError: Java heap space
at java.lang.String.toCharArray(String.java:2753)
at org.apache.oro.text.perl.Perl5Util.match(Unknown Source)
at net.sf.jmimemagic.MagicMatcher.testRegex(MagicMatcher.java:663)
at net.sf.jmimemagic.MagicMatcher.testInternal(MagicMatcher.java:433)
at net.sf.jmimemagic.MagicMatcher.test(MagicMatcher.java:341)
at net.sf.jmimemagic.Magic.getMagicMatch(Magic.java:208)
at net.sf.jmimemagic.Magic.getMagicMatch(Magic.java:170)
at task.ProbeContentTypeCheking.check(FileScan.java:384)
at task.FolderScan.findFiles(FileScan.java:228)
at task.FolderScan.findFiles(FileScan.java:225)
at task.FolderScan.findFiles(FileScan.java:225)
at task.FolderScan.run(FileScan.java:209)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
at java.lang.Thread.run(Thread.java:722)
Question:
How do I solve these exceptions?
JMimeMagic 0.1.2 depends on Commons Logging 1.0.4
A NoClassDefFoundError means that the Java Virtual Machine or a ClassLoader instance tries to load in the definition of a class (as part of a normal method call or as part of creating a new instance using the new expression) and no definition of the class could be found.
The solution would be to add the commons-logging-1.0.4.jar to your classpath.
Note that JMimeMagic has other 3rd party dependencies:
Jakarta ORO 2.0.8
Log4j 1.2.8
Xerces 2.4.0 (optional)
xml-apis 2.0.2
xmlParserAPIs 2.0.2
Update - MagicMatchNotFoundException
The MagicMatchNotFoundException is thrown if no mime type match is found for the provided data.
You can set the log level of net.sf.jmimemagic to DEBUG to get more information about what is going on.
Update 2 - OutOfMemoryError
The OOM looks related to the behavior of JmimeMagic. In some cases it will try to run a regular expression against the entire byte array input to find the magic number match. See this reported issue for the Nuxeo Enterprise Platform.
I think you can solve this issue by limiting the size of the byte array you pass to getMagicMatch.
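As a rough illustration (not from the original answer; the 8 KB cap and the helper name are arbitrary choices), you could read only the head of the file and hand that prefix to jMimeMagic:

import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;
import net.sf.jmimemagic.Magic;
import net.sf.jmimemagic.MagicMatch;

// Hand jMimeMagic at most MAX_BYTES of the file, so its regex matching never
// runs over a multi-megabyte byte[] (the 8 KB cap is an arbitrary example).
private static final int MAX_BYTES = 8 * 1024;

private String detectMimeType(Path path) throws Exception {
    byte[] buffer = new byte[MAX_BYTES];
    try (InputStream in = Files.newInputStream(path)) {
        int read = in.read(buffer);
        byte[] head = (read == MAX_BYTES) ? buffer : Arrays.copyOf(buffer, Math.max(read, 0));
        // May still throw MagicMatchNotFoundException for unknown types.
        MagicMatch match = Magic.getMagicMatch(head);
        return match.getMimeType();
    }
}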
Related
We don't consider our app to be running properly if log4j2 isn't configured as specified in the config file. How do I reliably detect, from Java, whether errors occurred during log4j2 initialization and were printed to the console? There are (at least) two sources of bad configuration:
The configuration file is entirely missing (or invalid?)
The configuration file is valid, but has a problem during setConfiguration().
This question is about the second point. It looks like log4j2 internally does try {...} catch (e) { LOGGER.error("bla bla", e) } during initialization, making it impossible to detect such problems.
Errors printed to the console that no human is ever going to see is bad. We'd rather have our app crash!
I'm surprised that I can't find a way to programmatically ask log4j2: "Did I get the configuration I asked for?". Is there something I have overlooked? E.g. some way to detect whether anything has been logged yet?
Background
Here is the console output we're seeing. Yes, there is permission denied to /path/to/file - how do I detect that?
2015-10-15 17:43:50,539 main ERROR FileManager (/path/to/file) java.io.FileNotFoundException: /path/to/file (Permission denied)
2015-10-15 17:43:50,541 main ERROR Unable to invoke factory method in class class org.apache.logging.log4j.core.appender.FileAppender for element File. java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.build(PluginBuilder.java:136)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createPluginObject(AbstractConfiguration.java:794)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:734)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:726)
at org.apache.logging.log4j.core.config.AbstractConfiguration.doConfigure(AbstractConfiguration.java:383)
at org.apache.logging.log4j.core.config.AbstractConfiguration.initialize(AbstractConfiguration.java:161)
at org.apache.logging.log4j.core.config.AbstractConfiguration.start(AbstractConfiguration.java:173)
at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:422)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:494)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:510)
at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:199)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:146)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41)
at org.apache.logging.log4j.LogManager.getContext(LogManager.java:177)
at org.apache.logging.log4j.LogManager.getLogger(LogManager.java:447)
at (our first call to org.apache.logging.log4j.LogManager.getLogger())
Looking in the code for LoggerContext.reconfigure(), I see:
private void reconfigure(final URI configURI) {
    <snip>
    final Configuration instance = ConfigurationFactory.getInstance().getConfiguration(name, configURI, cl);
    setConfiguration(instance);
    <snip>
}
And during the call of setConfiguration(), there is this in org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.build:
try {
    <snip>
    final Object plugin = factory.invoke(null, params);
    <snip>
    return plugin;
} catch (final Exception e) {
    LOGGER.error("Unable to invoke factory method in class {} for element {}.", this.clazz, this.node.getName(),
            e);
    return null;
}
So the problem was not with getConfiguration() from the ConfigurationFactory, but with the subsequent setConfiguration(), and there seems to be no way to detect that there was a problem... :-(
About the first point: The configuration file is entirely missing (or invalid?)
We're setting -Dlog4j.configurationFile=<file> on the java command line, and there are already several posts on how to detect if <file> is missing. One of them suggests setting -Dlog4j.configurationFactory to our own ConfigurationFactory. But no amount of fiddling with our own ConfigurationFactory is going to help during setConfiguration() to detect the second point ("2." above), as far as I can reason.
We're running log4j2 version 2.4 on openjdk 8 on debian stable/jessie.
It turns out the Log4j2 internals use a special StatusLogger that has a getStatusData() method, so here's what to do:
import org.apache.logging.log4j.status.StatusData;
import org.apache.logging.log4j.status.StatusLogger;
...
StatusLogger statusLogger = StatusLogger.getLogger();
if (statusLogger.getStatusData().size() > 0) {
    System.out.printf(
            "Logged %d messages\n",
            statusLogger.getStatusData().size());
    // Investigate List<StatusData> if you want
    for (StatusData data : statusLogger.getStatusData()) {
        System.out.printf(
                "  Level %s message: %s\n",
                data.getLevel(),
                data.getMessage().getFormattedMessage());
    }
    System.err.println("exiting due to unexpected console status logs");
    System.exit(1);
}
I am trying to decrypt some private keys (.pfx X509Certificate) with Bouncy Castle.
If I run the code standalone (JUnit), it works fine, but when I run it on WildFly with Arquillian, deployed as a WAR file, I'm facing some issues:
org.jboss.arquillian.test.spi.ArquillianProxyException: javax.ejb.EJBException : JBAS014580: Unexpected Error
[Proxied because : Original exception caused: class java.lang.ClassFormatError: Absent Code attribute in method
that is not native or abstract in class file javax/ejb/EJBException]
I think Arquillian is encapsulating the real exception, but no more errors appear in the log file.
In the pom file I declared it as provided, to use the provided version.
The versions installed are:
$WILDFLY_HOME\modules\system\layers\base\org\bouncycastle\main\bcmail-jdk15on-1.50.jar
$WILDFLY_HOME\modules\system\layers\base\org\bouncycastle\main\bcpkix-jdk15on-1.50.jar
$WILDFLY_HOME\modules\system\layers\base\org\bouncycastle\main\bcprov-jdk15on-1.50.jar
I also tried to use the version bcprov-jdk16, specified directly in the POM file with scope compile/runtime, but it didn't work either.
The error occurs specifically in this point:
org.bouncycastle.x509.extension.X509ExtensionUtil.getIssuerAlternativeNames(java.security.cert.X509Certificate);
X509ExtensionUtil.getIssuerAlternativeNames(certificate) = >Unknown type "org.bouncycastle.x509.extension.X509ExtensionUtil"<
Has anyone else ever had this problem, or does anyone know how I can fix it? Any tips?
I solved my question using only the Java 8 API, as follows:
Collection<?> altNames = certificate.getSubjectAlternativeNames();
for (Object i : altNames) {
    List<Object> item = (java.util.List) i;
    Integer type = (Integer) item.get(0);
    try {
        if (type > 0) {
            continue;
        }
        String[] arr = StringEscapeUtils.escapeHtml(new String((byte[]) item.get(1))).split(";");
        return Arrays.asList(arr)
                .stream()
                .map(k -> k.trim())
                .filter(u -> isCNPJ(u))
                .findFirst().get();
    } catch (Exception e) {
        LOG.error(e.getMessage(), e);
    }
}
return null;
isCNPJ is just a method to filter only the values I need.
StringEscapeUtils is an Apache Commons Lang class.
There are three default security levels of NTRU implemented in Bouncy Castle:
1. NTRUSigningKeyGenerationParameters.TEST157
2. NTRUSigningKeyGenerationParameters.APR2011_439
3. NTRUSigningKeyGenerationParameters.APR2011_743
The first two are generated normally, but when I try to generate the third one, I get the following exception:
SEVERE: Servlet.service() for servlet [mvc-dispatcher] in context with path [] threw exception [Request processing failed; nested exception is java.lang.IllegalStateException: Signing failed: too many retries (max=100)] with root cause
java.lang.IllegalStateException: Signing failed: too many retries (max=100)
Here is a piece of my code:
NTRUSigningPrivateKeyParameters ntruSigningPrivateKeyParameters1 = null;
NTRUSigner ntruSigner = new NTRUSigner(ntruSigningKeyGenerationParameters.getSigningParameters());
try {
    ntruSigningPrivateKeyParameters1 = new NTRUSigningPrivateKeyParameters(ntruSigningPrivateKeyParameters.getEncoded(), ntruSigningKeyGenerationParameters);
} catch (IOException e) {
    e.printStackTrace();
}
ntruSigner.init(true, ntruSigningPrivateKeyParameters);
byte[] res = ntruSigner.generateSignature();
Calling ntruSigner.generateSignature() with the third set of parameters leads to this exception.
Does anyone know how to solve it?
Currently, it's a bug, so there are two solutions:
use another library - tbuktu's GitHub project (Bouncy Castle is using it with some modifications, as far as I can see)
download the sources, track down the bug in this generation parameter, fix it, and package it into a library for your project
It's not really a bug in the code. The problem is that the norm bound in the APR2011_743 and APR2011_743_PROD parameter sets is too low, which means that the signer is unable to generate a valid signature.
For N=743, q=2048 and beta=0.127 you should choose a norm bound of around 545 (see equation 10 in J. Hoffstein et al., Performance Improvements and a Baseline Parameter Generation Algorithm for NTRUSign), but the parameter sets in BouncyCastle use normBound=405. Changing this solves the issue.
Updating the normBound does appear to fix the issue; however, I should point out that the NTRUSigner class is now deprecated in Bouncy Castle. The NTRU signing algorithm was shown to be badly broken just over a year ago. See:
http://www.di.ens.fr/~ducas/NTRUSign_Cryptanalysis/DucasNguyen_Learning.pdf
for details.
I switched an existing code base to Java 7 and I keep getting this warning:
warning: File for type '[Insert class here]' created in the last round
will not be subject to annotation processing.
A quick search reveals that no one has hit this warning.
It's not documented in the javac compiler source either:
From OpenJDK\langtools\src\share\classes\com\sun\tools\javac\processing\JavacFiler.java
private JavaFileObject createSourceOrClassFile(boolean isSourceFile, String name) throws IOException {
    checkNameAndExistence(name, isSourceFile);
    Location loc = (isSourceFile ? SOURCE_OUTPUT : CLASS_OUTPUT);
    JavaFileObject.Kind kind = (isSourceFile ?
                                JavaFileObject.Kind.SOURCE :
                                JavaFileObject.Kind.CLASS);

    JavaFileObject fileObject =
        fileManager.getJavaFileForOutput(loc, name, kind, null);
    checkFileReopening(fileObject, true);

    if (lastRound) // <------------------------------- TRIGGERS WARNING
        log.warning("proc.file.create.last.round", name);

    if (isSourceFile)
        aggregateGeneratedSourceNames.add(name);
    else
        aggregateGeneratedClassNames.add(name);
    openTypeNames.add(name);

    return new FilerOutputJavaFileObject(name, fileObject);
}
What does this mean and what steps can I take to clear this warning?
Thanks.
The warning
warning: File for type '[Insert class here]' created in the last round
will not be subject to annotation processing
means that you were running an annotation processor that creates a new class or source file using a javax.annotation.processing.Filer implementation (provided through the javax.annotation.processing.ProcessingEnvironment) although the processing tool had already decided it is "in the last round".
This may be a problem (and thus the warning) because the generated file itself may contain annotations that will be ignored by the annotation processor (because it is not going to do a further round).
The above ought to answer the first part of your question
What does this mean and what steps can I take to clear this warning?
(you figured this out already by yourself, didn't you :-))
What possible steps to take? Check your annotation processors:
1) Do you really have to use filer.createClassFile / filer.createSourceFile in the very last round of the annotation processor? Usually one uses the filer object inside of a code block like
for (TypeElement annotation : annotations) {
...
}
(in method process). This ensures that the annotation processor will not be in its last round (the last round always being the one having an empty set of annotations).
2) If you really can't avoid writing your generated files in the last round and these files are source files, trick the annotation processor and use the method "createResource" of the filer object (take "SOURCE_OUTPUT" as location).
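A possible sketch of that trick (illustrative only; the name "Gen.java" is made up, and the Filer is assumed to come from processingEnv.getFiler()):

import java.io.IOException;
import java.io.Writer;
import javax.annotation.processing.Filer;
import javax.tools.FileObject;
import javax.tools.StandardLocation;

// Write the generated source as a plain resource under SOURCE_OUTPUT instead of
// using createSourceFile(), so javac does not warn that the new file will be
// skipped by annotation processing.
void writeGeneratedSourceAsResource(Filer filer) throws IOException {
    FileObject fo = filer.createResource(StandardLocation.SOURCE_OUTPUT, "", "Gen.java");
    Writer out = fo.openWriter();
    try {
        out.write("class Gen { }");
    } finally {
        out.close();
    }
}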
In the OpenJDK test case, this warning is produced because the processor uses processingOver() to write a new file exactly at the last round.
public boolean process(Set<? extends TypeElement> elems, RoundEnvironment renv) {
    if (renv.processingOver()) { // Write only at last round
        Filer filer = processingEnv.getFiler();
        Messager messager = processingEnv.getMessager();
        try {
            JavaFileObject fo = filer.createSourceFile("Gen");
            Writer out = fo.openWriter();
            out.write("class Gen { }");
            out.close();
            messager.printMessage(Diagnostic.Kind.NOTE, "File 'Gen' created");
        } catch (IOException e) {
            messager.printMessage(Diagnostic.Kind.ERROR, e.toString());
        }
    }
    return false;
}
I modified the original example code a bit: added the diagnostic note "File 'Gen' created", replaced the "*" mask with "org.junit.runner.RunWith", and set the return value to "true". The produced compiler log was:
Round 1:
input files: {ProcFileCreateLastRound}
annotations: [org.junit.runner.RunWith]
last round: false
Processor AnnoProc matches [org.junit.runner.RunWith] and returns true.
Round 2:
input files: {}
annotations: []
last round: true
Note: File 'Gen' created
Compilation completed successfully with 1 warning
0 errors
1 warning
Warning: File for type 'Gen' created in the last round will not be subject to annotation processing.
If we remove my custom note from the log, it's hard to tell that the file 'Gen' was actually created in 'Round 2', the last round. So the basic advice applies: if in doubt, add more logs.
There is also a little bit of useful info on this page:
http://docs.oracle.com/javase/7/docs/technotes/tools/solaris/javac.html
Read the section about "ANNOTATION PROCESSING" and try to get more info with these compiler options:
-XprintProcessorInfo
Print information about which annotations a processor is asked to process.
-XprintRounds
Print information about initial and subsequent annotation processing rounds.
I poked around the Java 7 compiler options and I found this:
-implicit:{class,none}
Controls the generation of class files for implicitly loaded source files. To automatically generate class files, use -implicit:class. To suppress class file generation, use -implicit:none. If this option is not specified, the default is to automatically generate class files. In this case, the compiler will issue a warning if any such class files are generated when also doing annotation processing. The warning will not be issued if this option is set explicitly. See Searching For Types.
Source
Can you try setting the -implicit option explicitly for your class files?
I'm trying to validate an Atom feed with Java 5 (JRE 1.5.0 update 11). The code I have works without problem in Java 6, but fails when running in Java 5 with a
org.xml.sax.SAXParseException: src-resolve: Cannot resolve the name 'xml:base' to a(n) 'attribute declaration' component.
I think I remember reading something about the version of Xerces bundled with Java 5 having some problems with some schemas, but I can't find the workaround. Is it a known problem? Do I have some error in my code?
public static void validate() throws SAXException, IOException {
    List<Source> schemas = new ArrayList<Source>();
    schemas.add(new StreamSource(AtomValidator.class.getResourceAsStream("/atom.xsd")));
    schemas.add(new StreamSource(AtomValidator.class.getResourceAsStream("/dc.xsd")));

    // Lookup a factory for the W3C XML Schema language
    SchemaFactory factory = SchemaFactory.newInstance("http://www.w3.org/2001/XMLSchema");

    // Compile the schemas.
    Schema schema = factory.newSchema(schemas.toArray(new Source[schemas.size()]));
    Validator validator = schema.newValidator();

    // load the file to validate
    Source source = new StreamSource(AtomValidator.class.getResourceAsStream("/sample-feed.xml"));

    // check the document
    validator.validate(source);
}
Update: I tried the method below, but I still have the same problem if I use Xerces 2.9.0. I also tried adding xml.xsd to the list of schemas (as xml:base is defined in xml.xsd), but this time I get:
Exception in thread "main" org.xml.sax.SAXParseException: schema_reference.4: Failed to read schema document 'null', because 1) could not find the document; 2) the document could not be read; 3) the root element of the document is not <xsd:schema>.
Update 2: I tried to configure a proxy with the VM arguments -Dhttp.proxyHost=<proxy.host.com> -Dhttp.proxyPort=8080 and now it works. I'll try to post a "real answer" from home.
And sorry, I can't reply as a comment: for security reasons XHR is disabled from work...
Indeed, people have been mentioning that the Sun-provided SchemaFactory in Java 5 gives trouble.
So: did you include Xerces in your project yourself?
After including Xerces, you need to ensure it is being used. If you like to hardcode it (well, as a minimal requirement you'd probably use some application properties file to enable and populate the following code):
String schemaFactoryProperty =
        "javax.xml.validation.SchemaFactory:" + XMLConstants.W3C_XML_SCHEMA_NS_URI;
System.setProperty(schemaFactoryProperty,
        "org.apache.xerces.jaxp.validation.XMLSchemaFactory");
SchemaFactory factory =
        SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
Or, if you don't want to hardcode, or when your troublesome code would be in some 3rd party library that you cannot change, set it on the java command line or environment options. For example (on one line of course):
set JAVA_OPTS =
"-Djavax.xml.validation.SchemaFactory:http://www.w3.org/2001/XMLSchema
=org.apache.xerces.jaxp.validation.XMLSchemaFactory"
By the way: apart from the Sun-included SchemaFactory implementation giving trouble (something like com.sun.org.apache.xerces.internal.jaxp.validation.xs.schemaFactoryImpl), it also seems that the "discovery" of non-JDK implementations fails in that version. If I understand correctly, then normally just including Xerces would in fact make SchemaFactory#newInstance find that included library and give it precedence over the Sun implementation. To my knowledge, that fails as well in Java 5, making the above configuration required.
I tried to configure a proxy with the VM arguments -Dhttp.proxyHost=<proxy.host.com> -Dhttp.proxyPort=8080 and now it works.
Ah, I didn't realize that xml.xsd is in fact the one referenced as http://www.w3.org/2001/xml.xsd or something like that. That should teach us to always show some XML and XSD fragments as well. ;-)
So, am I correct to assume that 1.) to fix the Java 5 issue, you still needed to include Xerces and set the system property, and that 2.) you did not have xml.xsd available locally?
Before you found your solution, did you happen to try using getResource rather than getResourceAsStream, to see if the exception would then have showed you some more details?
If you actually did have xml.xsd available (so: if getResource did in fact yield a URL), then I wonder what Xerces was trying to fetch from the internet. Or maybe you did not add that schema to the list prior to adding your own schemas? The order is important: dependencies must be added first.
For whoever gets to this question using the search: maybe using a custom EntityResolver could have indicated the source of the problem as well (if only writing something to the log and just returning null to tell Xerces to use the default behavior).
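A minimal sketch of such a debugging resolver (purely illustrative; how you register it depends on which parser or factory API you are using) might look like this:

import java.io.IOException;
import org.xml.sax.EntityResolver;
import org.xml.sax.InputSource;
import org.xml.sax.SAXException;

// Logs every entity/schema the parser asks for and then returns null,
// which tells Xerces to fall back to its default resolution behavior.
class LoggingEntityResolver implements EntityResolver {
    public InputSource resolveEntity(String publicId, String systemId)
            throws SAXException, IOException {
        System.out.println("resolveEntity: publicId=" + publicId + ", systemId=" + systemId);
        return null; // use default behavior
    }
}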
Hmmm, just read your "comment" -- editing does not alert people for new replies, so time to ask your boss for some iPhone or some other gadget that is connected to the net directly ;-)
Well, I assume you added:
schemas.add(
new StreamSource(AtomValidator.class.getResourceAsStream("/xml.xsd")));
If so, is xml.xsd actually to be found on the classpath then? I wonder if the getResourceAsStream did not yield null in your case, and how new StreamSource(null) would act then.
Even if getResourceAsStream did not yield null, the resulting StreamSource would still not know where it was loaded from, which may be a problem when trying to include references. So, what if you use the constructor StreamSource(String systemId) instead:
schemas.add(new StreamSource(AtomValidator.class.getResource("/atom.xsd").toExternalForm()));
schemas.add(new StreamSource(AtomValidator.class.getResource("/dc.xsd").toExternalForm()));
You might also use StreamSource(InputStream inputStream, String systemId), but I don't see any advantage over the above two lines. However, the documentation explains why passing the systemId in either of the 2 constructors seems good:
This constructor allows the systemID to be set in addition to the input stream, which allows relative URIs to be processed.
Likewise, setSystemId(String systemId) explains a bit:
The system identifier is optional if there is a byte stream or a character stream, but it is still useful to provide one, since the application can use it to resolve relative URIs and can include it in error messages and warnings (the parser will attempt to open a connection to the URI only if there is no byte stream or character stream specified).
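For instance (an illustrative sketch only, reusing the resource names from the question), the stream and the system id can be combined so that relative references inside the schema can still be resolved:

// Provide both the bytes and a system id; the latter is what allows
// relative URIs inside atom.xsd to be resolved.
StreamSource atomSource = new StreamSource(
        AtomValidator.class.getResourceAsStream("/atom.xsd"),
        AtomValidator.class.getResource("/atom.xsd").toExternalForm());
schemas.add(atomSource);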
If this doesn't work out, then maybe some custom error handler can give you more details:
ErrorHandlerImpl errorHandler = new ErrorHandlerImpl();
validator.setErrorHandler(errorHandler);
:
:
validator.validate(source);
if (errorHandler.hasErrors()) {
    LOG.error(errorHandler.getMessages());
    throw new [..];
}
if (errorHandler.hasWarnings()) {
    LOG.warn(errorHandler.getMessages());
}
...using the following ErrorHandler to capture the validation errors and continue parsing as far as possible:
import org.xml.sax.helpers.DefaultHandler;

private class ErrorHandlerImpl extends DefaultHandler {
    private String messages = "";
    private boolean validationError = false;
    private boolean validationWarning = false;

    public void error(SAXParseException exception) throws SAXException {
        messages += "Error: " + exception.getMessage() + "\n";
        validationError = true;
    }

    public void fatalError(SAXParseException exception) throws SAXException {
        messages += "Fatal: " + exception.getMessage();
        validationError = true;
    }

    public void warning(SAXParseException exception) throws SAXException {
        messages += "Warn: " + exception.getMessage();
        validationWarning = true;
    }

    public boolean hasErrors() {
        return validationError;
    }

    public boolean hasWarnings() {
        return validationWarning;
    }

    public String getMessages() {
        return messages;
    }
}