eager2learn

Reputation: 1488

CRFClassifier java.lang.NoSuchFieldError: maxAdditionalKnownLCWords

I'm trying to use the CRFClassifier of the Stanford NLP library.

The trained model is supposed to be in a .ser file, but when I open an ObjectInputStream over that file and pass it to CRFClassifier.getClassifier I get the error:

java.lang.NoSuchFieldError: maxAdditionalKnownLCWords

This is what I tried. I've also tried using the properties file that came in the same directory, but I get the same error regardless of whether I pass it or not:

import edu.stanford.nlp.process.*;
import java.util.Collection;
import edu.stanford.nlp.ling.*;
import java.util.List;
import java.io.*;
import edu.stanford.nlp.io.*;
import edu.stanford.nlp.ie.*;
import edu.stanford.nlp.ie.crf.*;
import java.util.*;

public class StanfordParserTest {

    public static void main(String[] args) {

        String propfile = "/Users/--------/Documents/Programming/Java/stanford-ner-2015-12-09/classifiers/english.all.3class.distsim.prop";
        FileReader p_file_reader = null;
        Properties prop = new Properties();
        try{
            p_file_reader = new FileReader(propfile);
        }catch(FileNotFoundException e){
            e.printStackTrace();
        }
        if (p_file_reader != null){
            try{
                prop.load(p_file_reader);
                p_file_reader.close();
            }catch(IOException e){
                e.printStackTrace();
            }

        }

        ObjectInputStream o_in = null;
        String serializedClassifier = "/Users/--------/Documents/Programming/Java/stanford-ner-2015-12-09/classifiers/english.all.3class.distsim.crf.ser";
        try{
            FileInputStream f_in = new FileInputStream(serializedClassifier);
            o_in = new ObjectInputStream(f_in);
            // don't close f_in here: the classifier is read from o_in below
        }catch(FileNotFoundException e){
            e.printStackTrace();
        }catch(IOException e){
            e.printStackTrace();
        }
        System.out.println(o_in);
        System.out.println(prop);
        AbstractSequenceClassifier<CoreLabel> classifier = null;
        try{    
            classifier = CRFClassifier.getClassifier(o_in, prop);
        }catch(ClassNotFoundException e){
            e.printStackTrace();
        }
        catch(IOException e){
            e.printStackTrace();
        }
        System.out.println(classifier);

    }

}

This is the output:

java.io.ObjectInputStream@6ff3c5b5
{useDisjunctive=true, useSequences=true, serializeTo=english.all.3class.distsim.crf.ser.gz, useOccurrencePatterns=true, unknownWordDistSimClass=0, useClassFeature=true, testFile=/u/nlp/data/ner/column_data/all.3class.test, useQN=true, useTypeSeqs=true, usePrevSequences=true, featureDiffThresh=0.05, wordFunction=edu.stanford.nlp.process.AmericanizeFunction, distSimLexicon=/u/nlp/data/pos_tags_are_useless/egw4-reut.512.clusters, wordShape=chris2useLC, usePrev=true, maxLeft=1, useNextRealWord=true, useTypeSeqs2=true, map=word=0,answer=1, disjunctionWidth=5, useWord=true, QNsize=25, useLastRealWord=true, numberEquivalenceDistSim=true, useDistSim=true, useNGrams=true, saveFeatureIndexToDisk=true, useLongSequences=true, useObservedSequencesOnly=true, readerAndWriter=edu.stanford.nlp.sequences.ColumnDocumentReaderAndWriter, maxNGramLeng=6, normalize=true, trainFileList=/u/nlp/data/ner/column_data/ace23.3class.train,/u/nlp/data/ner/column_data/muc6.3class.ptb.train,/u/nlp/data/ner/column_data/muc7.3class.ptb.train,/u/nlp/data/ner/column_data/conll.3class.train,/u/nlp/data/ner/column_data/wikiner.3class.train,/u/nlp/data/ner/column_data/ontonotes.3class.train,/u/nlp/data/ner/column_data/english.extra.3class.train, useNext=true, noMidNGrams=true, useTypeySequences=true, type=crf}
Exception in thread "main" java.lang.NoSuchFieldError: maxAdditionalKnownLCWords
    at edu.stanford.nlp.ie.AbstractSequenceClassifier.reinit(AbstractSequenceClassifier.java:185)
    at edu.stanford.nlp.ie.AbstractSequenceClassifier.<init>(AbstractSequenceClassifier.java:152)
    at edu.stanford.nlp.ie.crf.CRFClassifier.<init>(CRFClassifier.java:174)
    at edu.stanford.nlp.ie.crf.CRFClassifier.getClassifier(CRFClassifier.java:2967)
    at StanfordParserTest.main(StanfordParserTest.java:66)

Does anybody know what is going wrong here?

Upvotes: 0

Views: 323

Answers (1)

StanfordNLPHelp

Reputation: 8739

Please consult the code in NERDemo.java, included with the distribution, to see how to load a CRFClassifier programmatically.

These commands should run fine from the distribution directory:

javac -cp "*" NERDemo.java
java -mx400m -cp "*:.:lib/*" NERDemo classifiers/english.all.3class.distsim.crf.ser.gz sample.txt

In general, make sure your CLASSPATH contains only the current jars from that distribution directory. An outdated jar earlier on the CLASSPATH is exactly the kind of thing that produces a NoSuchFieldError like this one, because an old compiled class is loaded in place of the one the rest of the code was built against.
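One quick way to check which jar a class is actually being loaded from is the standard ProtectionDomain/CodeSource API. This is a generic sketch (the class name JarLocator is just illustrative); in your project you would pass a Stanford class such as edu.stanford.nlp.ie.AbstractSequenceClassifier.class:

```java
public class JarLocator {
    // Returns the jar or directory a class was loaded from.
    // JDK classes come from the bootstrap loader and have no code source.
    static String locationOf(Class<?> cls) {
        java.security.CodeSource cs = cls.getProtectionDomain().getCodeSource();
        return (cs == null || cs.getLocation() == null)
                ? "(bootstrap class loader)"
                : cs.getLocation().toString();
    }

    public static void main(String[] args) {
        // A JDK class has no code source:
        System.out.println(locationOf(java.util.Properties.class));
        // In your setup, check the Stanford class instead, e.g.:
        // System.out.println(locationOf(edu.stanford.nlp.ie.AbstractSequenceClassifier.class));
        System.out.println(locationOf(JarLocator.class));
    }
}
```

If the printed location is not the jar from the current distribution, that stale jar is the likely culprit.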

This should work if you have the proper CLASSPATH:

String serializedClassifier = "classifiers/english.all.3class.distsim.crf.ser.gz";
AbstractSequenceClassifier<CoreLabel> classifier = CRFClassifier.getClassifier(serializedClassifier);

and are deserializing the models provided with the current distribution, which are in the classifiers folder.
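Also, if you do deserialize by hand with an ObjectInputStream, keep the underlying FileInputStream open until readObject() has returned; closing it early makes later reads fail. A self-contained sketch of the pattern, using a plain HashMap as a stand-in for a model since the Stanford jars are not assumed here:

```java
import java.io.*;
import java.util.HashMap;

public class SerDemo {
    // Serialize the map to a temp file, then deserialize it back.
    static HashMap<String, Integer> roundTrip(HashMap<String, Integer> model) throws Exception {
        File f = File.createTempFile("model", ".ser");
        f.deleteOnExit();

        // try-with-resources closes the stream only after writeObject completes
        try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream(f))) {
            out.writeObject(model);
        }

        // the underlying FileInputStream stays open until readObject() returns
        try (ObjectInputStream in = new ObjectInputStream(new FileInputStream(f))) {
            @SuppressWarnings("unchecked")
            HashMap<String, Integer> loaded = (HashMap<String, Integer>) in.readObject();
            return loaded;
        }
    }

    public static void main(String[] args) throws Exception {
        HashMap<String, Integer> model = new HashMap<>();
        model.put("maxAdditionalKnownLCWords", 10000);
        System.out.println(roundTrip(model)); // prints {maxAdditionalKnownLCWords=10000}
    }
}
```

That said, the path-based getClassifier(String) call above handles all of the stream plumbing (including .gz decompression) for you, so it is the simpler option.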

Upvotes: 1
