How to resolve NoSuchFieldError exception when testing Lucene 4.0 - java

I want to test my own Analyzer. The following test code is from Lucene in Action, 2nd Edition, Listing 4.2, page 121.
public class AnalyzerUtils {
    public static void displayTokens(Analyzer analyzer, String text) throws IOException {
        TokenStream tokenStream = analyzer.tokenStream("contents", new StringReader(text));
        displayTokens(tokenStream);
    }

    public static void displayTokens(TokenStream stream) throws IOException {
        CharTermAttribute term = stream.getAttribute(CharTermAttribute.class);
        while (stream.incrementToken()) {
            System.out.println(Arrays.toString(term.buffer()));
        }
    }
}
My custom Analyzer is:
static class SimpleAnalyzer extends Analyzer {
    static class SimpleFilter extends TokenFilter {
        protected SimpleFilter(TokenStream input) { super(input); }

        @Override
        public boolean incrementToken() throws IOException { return false; }
    }

    @Override
    protected TokenStreamComponents createComponents(String s, Reader reader) {
        Tokenizer tokenizer = new WhitespaceTokenizer(reader);
        return new TokenStreamComponents(tokenizer, new SimpleFilter(tokenizer));
    }
}
static class FilteringAnalyzer extends Analyzer {
    static class FilteringFilter extends FilteringTokenFilter {
        public FilteringFilter(TokenStream in) { super(in); }

        @Override
        protected boolean accept() throws IOException { return false; }
    }

    @Override
    protected TokenStreamComponents createComponents(String s, Reader reader) {
        Tokenizer tokenizer = new WhitespaceTokenizer(reader);
        return new TokenStreamComponents(tokenizer, new FilteringFilter(tokenizer));
    }
}
The problem is that running AnalyzerUtils.displayTokens(new SimpleAnalyzer(), "美国 法国 中国"); works fine; however, running AnalyzerUtils.displayTokens(new FilteringAnalyzer(), "美国 法国 中国"); throws this exception:
Exception in thread "main" java.lang.NoSuchFieldError: LATEST
at org.apache.lucene.analysis.util.FilteringTokenFilter.<init>(FilteringTokenFilter.java:70)
at cn.edu.nju.ws.miliqa.nlp.ner.index.NameEntityIndexing$FilteringFilter.<init>(NameEntityIndexing.java:62)
at cn.edu.nju.ws.miliqa.nlp.ner.index.NameEntityIndexing$FilteringAnalyzer.createComponents(NameEntityIndexing.java:83)
at org.apache.lucene.analysis.Analyzer.tokenStream(Analyzer.java:134)
at cn.edu.nju.ws.miliqa.lucene.AnalyzerUtils.displayTokens(AnalyzerUtils.java:19)
The only difference between the two test cases is whether the filter in the analyzer extends TokenFilter or FilteringTokenFilter. I have been working on this for three days but still have no idea what is wrong. What is the reason for this odd exception?

The java.lang.NoSuchFieldError runtime error means one class is attempting to access a field on another class that doesn't exist. Here the offending class is FilteringTokenFilter.
Most likely, you have multiple versions of Lucene in your classpath.
You mention Lucene 4.0 in the title, but Version.LATEST (the field this error complains is missing) was not introduced until Lucene 4.10.
That implies that perhaps you have a copy of FilteringTokenFilter.class in a Lucene 4.10+ jar file attempting to find the field "LATEST" in an older (4.0?) Version.class file.
Check that you have only one copy each of the "lucene-core" and "lucene-analyzers-common" jar files on your classpath, and that their version numbers match. If you are not sure, download them again to ensure you have matching versions.
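If it helps to confirm this, you can print which jar each class is actually loaded from (a minimal diagnostic sketch; note that getCodeSource() can be null for classes loaded by the bootstrap class loader):
import org.apache.lucene.analysis.util.FilteringTokenFilter;
import org.apache.lucene.util.Version;

public class WhichJar {
    public static void main(String[] args) {
        // Print the code source (jar location) each class was loaded from;
        // different Lucene versions in the two locations confirm the mismatch.
        System.out.println(FilteringTokenFilter.class.getProtectionDomain()
                .getCodeSource().getLocation());
        System.out.println(Version.class.getProtectionDomain()
                .getCodeSource().getLocation());
    }
}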

Related

Can we make Lucene IndexWriter serializable for ExecutionContext of Spring Batch?

This question is related to another SO question of mine.
To keep an IndexWriter open for the duration of a partitioned step, I thought of adding the IndexWriter to the partitioner's ExecutionContext and then closing it in a StepExecutionListenerSupport's afterStep(StepExecution stepExecution) method.
The challenge I am facing with this approach is that ExecutionContext needs its objects to be serializable.
In light of these two questions, Q1 and Q2, it doesn't seem feasible, because I can't add a no-arg constructor to my custom writer: IndexWriter doesn't have a no-arg constructor.
public class CustomIndexWriter extends IndexWriter implements Serializable {
    /*
    private Directory d;
    private IndexWriterConfig conf;

    public CustomIndexWriter(){
        super();
        super(this.d, this.conf);
    }
    */

    public CustomIndexWriter(Directory d, IndexWriterConfig conf) throws IOException {
        super(d, conf);
    }

    private static final long serialVersionUID = 1L;

    private void readObject(ObjectInputStream input) throws IOException, ClassNotFoundException {
        input.defaultReadObject();
    }

    private void writeObject(ObjectOutputStream output) throws IOException, ClassNotFoundException {
        output.defaultWriteObject();
    }
}
In the above code, I can't add the constructor shown commented out, because no no-arg constructor exists in the superclass, and I can't access this fields before the super call.
Is there a way to achieve this?
You can always add a parameterless constructor.
E.g.:
public class CustomWriter extends IndexWriter implements Serializable {
    private Directory lDirectory;
    private IndexWriterConfig iwConfig;

    public CustomWriter() {
        // Delegate to the parameterized constructor with default values
        this(new Directory("." + System.getProperty("path.separator")), new IndexWriterConfig());
    }

    public CustomWriter(Directory dir, IndexWriterConfig iwConf) {
        lDirectory = dir;
        iwConfig = iwConf;
    }

    public Directory getDirectory() { return lDirectory; }
    public IndexWriterConfig getConfig() { return iwConfig; }
    public void setDirectory(Directory dir) { lDirectory = dir; }
    public void setConfig(IndexWriterConfig conf) { iwConfig = conf; }
    // ...
}
EDIT:
Having taken a look at my own code (using Lucene.Net), the IndexWriter needs an analyzer, and a MaxFieldLength.
So the super-call would look something like this:
super(new Directory("." + System.getProperty("path.separator")), new StandardAnalyzer(), MaxFieldLength.UNLIMITED);
So adding these values as defaults should fix the issue. Maybe then add getter- and setter-methods for the analyzer and MaxFieldLength, so you have control over that at a later stage.
I am not sure how, but this approach works in Spring Batch, and the ExecutionContext returns a non-null object in StepExecutionListenerSupport.
public class CustomIndexWriter implements Serializable {
    private static final long serialVersionUID = 1L;

    private transient IndexWriter luceneIndexWriter;

    public CustomIndexWriter(IndexWriter luceneIndexWriter) {
        this.luceneIndexWriter = luceneIndexWriter;
    }

    public IndexWriter getLuceneIndexWriter() {
        return luceneIndexWriter;
    }

    public void setLuceneIndexWriter(IndexWriter luceneIndexWriter) {
        this.luceneIndexWriter = luceneIndexWriter;
    }
}
I put an instance of CustomIndexWriter in the step partitioner's context, the partitioned step's chunk works with the writer via getLuceneIndexWriter(), and then in StepExecutionListenerSupport I close this writer.
This way my Spring Batch partitioned step works with a single instance of the Lucene IndexWriter object.
I was expecting a NullPointerException when performing operations on the writer obtained via getLuceneIndexWriter(), but that doesn't happen (despite the field being transient). I am not sure why this works, but it does.
For Spring Batch job metadata I am using the in-memory repository, not a database-based one. I'm not sure whether this will continue to work once I start using a database for metadata.
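For reference, here is a minimal sketch of the wiring described above (class names, the "luceneIndexWriter" key, and the listener placement are illustrative assumptions, not the actual project code):
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.listener.StepExecutionListenerSupport;
import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.batch.item.ExecutionContext;

// Partitioner that puts the (transient) writer wrapper into each partition's context.
public class IndexWriterPartitioner implements Partitioner {

    private final CustomIndexWriter customIndexWriter;

    public IndexWriterPartitioner(CustomIndexWriter customIndexWriter) {
        this.customIndexWriter = customIndexWriter;
    }

    @Override
    public Map<String, ExecutionContext> partition(int gridSize) {
        Map<String, ExecutionContext> partitions = new HashMap<>();
        for (int i = 0; i < gridSize; i++) {
            ExecutionContext context = new ExecutionContext();
            context.put("luceneIndexWriter", customIndexWriter);
            partitions.put("partition" + i, context);
        }
        return partitions;
    }
}

// Listener (registered on the worker step) that closes the writer after the step finishes.
class IndexWriterClosingListener extends StepExecutionListenerSupport {

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        CustomIndexWriter wrapper = (CustomIndexWriter)
                stepExecution.getExecutionContext().get("luceneIndexWriter");
        try {
            if (wrapper != null && wrapper.getLuceneIndexWriter() != null) {
                wrapper.getLuceneIndexWriter().close();
            }
        } catch (IOException e) {
            throw new IllegalStateException("Failed to close Lucene IndexWriter", e);
        }
        return stepExecution.getExitStatus();
    }
}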

Spring Boot batch - MultiResourceItemReader: move to next file on error

In a batch service, I read multiple XML files using a MultiResourceItemReader, which delegates to a StaxEventItemReader.
If an error is raised while reading a file (a parsing exception, for example), I would like to tell Spring to start reading the next matching file, using the @OnReadError annotation and/or a SkipPolicy, for example.
Currently, when a reading exception is raised, the batch stops.
Does anyone have an idea how to do it?
EDIT: I see MultiResourceItemReader has a method readNextItem(), but it's private -_-
I haven't used Spring Batch for a while, but looking at the MultiResourceItemReader code I suppose you can write your own ResourceAwareItemReaderItemStream wrapper that checks a flag: if the flag is set, move to the next file; otherwise perform a standard read using a delegate.
This flag can be stored in the execution context or in your wrapper, and should be cleared after moving to the next file.
class MoveNextReader<T> implements ResourceAwareItemReaderItemStream<T> {

    private ResourceAwareItemReaderItemStream<T> delegate;
    private boolean skipThisFile = false;

    public void setSkipThisFile(boolean value) {
        skipThisFile = value;
    }

    public void setResource(Resource resource) {
        skipThisFile = false;
        delegate.setResource(resource);
    }

    public T read() throws Exception {
        if (skipThisFile) {
            skipThisFile = false;
            // Returning null forces MultiResourceItemReader to move to the next resource
            return null;
        }
        return delegate.read();
    }

    // open/close/update omitted here; delegate them as in the final version below
}
Use this class as the delegate for MultiResourceItemReader, and in @OnReadError inject MoveNextReader and set MoveNextReader.skipThisFile.
I can't test the code myself, but I hope this can be a good starting point.
Here are my final classes to read multiple XML files and jump to the next file when a read error occurs on one (thanks to Luca's idea).
My custom ItemReader, extending MultiResourceItemReader:
public class MyItemReader extends MultiResourceItemReader<InputElement> {

    private SkippableResourceItemReader<InputElement> reader;

    public MyItemReader() throws IOException {
        super();

        // Resources
        PathMatchingResourcePatternResolver resourceResolver = new PathMatchingResourcePatternResolver();
        this.setResources( resourceResolver.getResources( "classpath:input/inputFile*.xml" ) );

        // Delegate reader
        reader = new SkippableResourceItemReader<InputElement>();
        StaxEventItemReader<InputElement> delegateReader = new StaxEventItemReader<InputElement>();
        delegateReader.setFragmentRootElementName("inputElement");
        Jaxb2Marshaller unmarshaller = new Jaxb2Marshaller();
        unmarshaller.setClassesToBeBound( InputElement.class );
        delegateReader.setUnmarshaller( unmarshaller );
        reader.setDelegate( delegateReader );
        this.setDelegate( reader );
    }

    [...]

    @OnReadError
    public void onReadError( Exception exception ){
        reader.setSkipResource( true );
    }
}
And the ItemReader-in-the-middle used to skip the current resource:
public class SkippableResourceItemReader<T> implements ResourceAwareItemReaderItemStream<T> {

    private ResourceAwareItemReaderItemStream<T> delegate;
    private boolean skipResource = false;

    @Override
    public void close() throws ItemStreamException {
        delegate.close();
    }

    @Override
    public T read() throws UnexpectedInputException, ParseException, NonTransientResourceException, Exception {
        if( skipResource ){
            skipResource = false;
            return null;
        }
        return delegate.read();
    }

    @Override
    public void setResource( Resource resource ) {
        skipResource = false;
        delegate.setResource( resource );
    }

    @Override
    public void open( ExecutionContext executionContext ) throws ItemStreamException {
        delegate.open( executionContext );
    }

    @Override
    public void update( ExecutionContext executionContext ) throws ItemStreamException {
        delegate.update( executionContext );
    }

    public void setDelegate(ResourceAwareItemReaderItemStream<T> delegate) {
        this.delegate = delegate;
    }

    public void setSkipResource( boolean skipResource ) {
        this.skipResource = skipResource;
    }
}

How to create and configure Command objects from a dynamic context with variable (number of) arguments?

SO! I've reached an impasse regarding code-design. Here's the scenario.
I am required to either copy, move or delete files. OK, sure, no problem. I can easily write it like this.
public class SimpleFileManager {
    public void copy(String sourcePath, String targetPath) { ... }
    public void move(String sourcePath, String targetPath) { ... }
    public void delete(String filePath) { ... }
}
and call it from client code in a manner like this, for example:
public static void main(String[] args) {
    ...
    fileManager.copy(x, y);
    ...
}
However, a requirement arises that certain POJOs holding the FileManager reference perform specific operations, depending on some configuration. The actual specification of which POJO instance should do what is contained in the config.
Here's an example of what I mean:
public static void main(String[] args) {
    ...
    InvokerPojo invokerPojo1 = new InvokerPojo(invokerPojo1Config, fileManager); // this config tells it to copy files only
    InvokerPojo invokerPojo2 = new InvokerPojo(invokerPojo2Config, fileManager); // this config tells it to move files only
    InvokerPojo invokerPojo3 = new InvokerPojo(invokerPojo3Config, fileManager); // this config tells it to delete files only
}
So, FileManager provides the functionality to do actual operations, while InvokerPojo simply invokes and delegates those methods based on config.
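For illustration, a hypothetical InvokerPojo might look roughly like this (the Config holder and its getOperation() method are assumptions, not part of the original code):
// Hypothetical sketch of InvokerPojo: reads the operation from its config
// and delegates the call to the FileManager accordingly.
public class InvokerPojo {
    private final Config config;               // assumed config holder
    private final SimpleFileManager fileManager;

    public InvokerPojo(Config config, SimpleFileManager fileManager) {
        this.config = config;
        this.fileManager = fileManager;
    }

    public void invoke(String sourcePath, String targetPath) {
        String operation = config.getOperation(); // e.g. "COPY", "MOVE" or "DELETE"
        if ("COPY".equals(operation)) {
            fileManager.copy(sourcePath, targetPath);
        } else if ("MOVE".equals(operation)) {
            fileManager.move(sourcePath, targetPath);
        } else if ("DELETE".equals(operation)) {
            fileManager.delete(sourcePath);
        }
    }
}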
However, I do not want to be coupled to FileManager, because, for example, I may find some library that provides the same functionality but is much better than mine.
So, I was thinking something like this:
public interface FileManagerDelegator {
    void copy(String sourcePath, String targetPath);
    void move(String sourcePath, String targetPath);
    void delete(String filePath);
}

public class MyFileManagerDelegator implements FileManagerDelegator {
    private SimpleFileManager simpleFileManager;

    public void copy(String sourcePath, String targetPath) { /* delegate to simpleFileManager */ }
    public void move(String sourcePath, String targetPath) { /* delegate to simpleFileManager */ }
    public void delete(String filePath) { /* delegate to simpleFileManager */ }
}

public class ComplexFileManagerDelegator implements FileManagerDelegator {
    private ComplexFileManager complexFileManager;

    public void copy(String sourcePath, String targetPath) { /* delegate to complexFileManager */ }
    public void move(String sourcePath, String targetPath) { /* delegate to complexFileManager */ }
    public void delete(String filePath) { /* delegate to complexFileManager */ }
}

public interface Command {
    void execute();
}

public class CopyCommand implements Command {
    private FileManagerDelegator delegator;
    String sourcePath;
    String targetPath;

    public void execute() {
        delegator.copy(sourcePath, targetPath);
    }
}

public class MoveCommand implements Command {
    private FileManagerDelegator delegator;
    String sourcePath;
    String targetPath;

    public void execute() {
        delegator.move(sourcePath, targetPath);
    }
}

public class DeleteCommand implements Command {
    private FileManagerDelegator delegator;
    String filePath;

    public void execute() {
        delegator.delete(filePath);
    }
}

public static void main(String[] args) {
    ...
    Command c = getCommand(context);
    c.execute();
    ...
}
Now, the problem is in actually creating that particular command, because I do not want to know which command is being created. All I know is that the context contains the info from which it can be created.
I was thinking about having a factory that would create the appropriate Command from the context.
The main issue is the number of parameters each command takes.
If the only operations were copy and move, it would be easy:
public interface Command {
    void execute(String a, String b);
}
However, since the delete operation takes only one argument, I'd either have to ignore the other argument in the call or add another method to the interface, and I consider both to be bad.
I also don't want to send a variable number of arguments like this:
public interface Command {
    void execute(String... args);
}
because it's just a slightly prettier version of the aforementioned bad design.
So, would a factory based on a context be a bit cleaner, in the sense that my client code doesn't know which operation is being called, and on which receiver?
Example of 2 contexts:
Context copyContext = new Context();
copyContext.set("OPERATION", "COPY");
copyContext.set("sourcePath", sourcePath);
copyContext.set("targetPath", targetPath);
Context deleteContext = new Context();
deleteContext.set("OPERATION", "DELETE");
deleteContext.set("filePath", filePath);
And then, in the factory, I could do something like this:
Command getCommand(FileManagerDelegator delegator, Context context) {
    String operation = context.get("OPERATION");
    if (operation.equals("COPY")) {
        String sourcePath = context.get("sourcePath");
        String targetPath = context.get("targetPath");
        return new CopyCommand(sourcePath, targetPath, delegator);
    } else if (operation.equals("DELETE")) {
        String filePath = context.get("filePath");
        return new DeleteCommand(filePath, delegator);
    } else {
        ...
    }
}
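For completeness, here is a minimal sketch of the Context holder assumed in the snippets above (hypothetical; essentially a thin wrapper over a string-keyed map):
import java.util.HashMap;
import java.util.Map;

// Hypothetical Context used in the examples: just a String-to-String map.
public class Context {
    private final Map<String, String> values = new HashMap<>();

    public void set(String key, String value) {
        values.put(key, value);
    }

    public String get(String key) {
        return values.get(key);
    }
}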
Is there a cleaner, more configurable way to create parameterized command objects on the fly (dynamically configuring them from a context) that operate with different numbers of arguments?

Can I use Google Reflections in a static section of code to find subclasses?

I am trying to set up a system that allows me to subclass a class that gets exported to a text file, without having to modify the initial class. To do this, I am trying to build a list of callbacks that can tell whether they handle a particular entry, and then use the matching callback to get an instance of that class created from the file. The problem is that I get the error
java.lang.NoClassDefFoundError: com/google/common/base/Predicate when I try to run anything involving this class. What am I doing wrong?
public abstract class Visibility {
    private static final List<VisibilityCreationCallback> creationCallbacks;

    static {
        creationCallbacks = new ArrayList<VisibilityCreationCallback>();
        Reflections reflections = new Reflections(new ConfigurationBuilder()
                .setUrls(ClasspathHelper.forPackage("com.protocase.viewer.utils.visibility"))
                .setScanners(new ResourcesScanner()));
        // ... cut ...
    }

    public static Visibility importFromFile(Map<String, Object> importMap) {
        for (VisibilityCreationCallback callback : creationCallbacks) {
            if (callback.handles(importMap)) {
                return callback.create(importMap);
            }
        }
        return null;
    }
}
public class CategoryVisibility extends Visibility {
    public static VisibilityCreationCallback makeVisibilityCreationCallback() {
        return new VisibilityCreationCallback() {
            @Override
            public boolean handles(Map<String, Object> importMap) {
                return importMap.containsKey(classTag);
            }

            @Override
            public CategoryVisibility create(Map<String, Object> importMap) {
                return importPD(importMap);
            }
        };
    }

/**
 * Test of matches method, of class CategoryVisibility.
 */
@Test
public void testMatches1() {
    Visibility other = new UnrestrictedVisibility();
    CategoryVisibility instance = new CategoryVisibility("A Cat");
    boolean expResult = true;
    boolean result = instance.matches(other);
    assertEquals(expResult, result);
}
You're just missing the Guava library from your classpath, and Reflections requires it. That's the short answer.
The better solution is to use a proper build tool (Maven, Gradle, ...) and get your transitive dependencies pulled in without the hassle.

How do I handle file saves properly in a NetBeans platform project (plugin)?

I am trying to create new language support for NetBeans 7.4 and higher.
When files are saved locally I need to deploy them to a server, so I need to handle the save event. I did this by implementing Savable:
public class VFDataObject extends MultiDataObject implements Savable {
    .......

    @Override
    public void save() throws IOException {
        .......
    }
}
And it worked perfectly for the Save event. But then I realized I need to extend HtmlDataObject instead of MultiDataObject:
public class VFDataObject extends HtmlDataObject implements Savable {
    .......

    @Override
    public void save() throws IOException {
        .......
    }
}
And now save() doesn't get executed. Why, given that HtmlDataObject extends MultiDataObject? What should be done to make this work?
Also, is there a way to catch the Save All event in NetBeans as well? And do you know whether anything changed in 8.0 in this regard?
Thanks a lot.
Have you tried OnSaveTask SPI (https://netbeans.org/bugzilla/show_bug.cgi?id=140719)? The API can be used to perform tasks when files of a given type are saved.
Something like this can be used to listen to all the save events on a given MIME type (in this case "text/x-sieve-java"):
public static class CustomOnSaveTask implements OnSaveTask {

    private final Context context;

    public CustomOnSaveTask(Context ctx) {
        context = ctx;
    }

    @Override
    public void performTask() {
        System.out.println(">>> Save performed on " +
                NbEditorUtilities.getDataObject(context.getDocument()).toString());
    }

    @Override
    public void runLocked(Runnable r) {
        r.run();
    }

    @Override
    public boolean cancel() {
        return true;
    }

    @MimeRegistration(mimeType = "text/x-sieve-java", service = OnSaveTask.Factory.class, position = 1600)
    public static class CustomOnSaveTaskFactory implements OnSaveTask.Factory {

        @Override
        public OnSaveTask createTask(Context cntxt) {
            return new CustomOnSaveTask(cntxt);
        }
    }
}
