Reputation: 11
I downloaded Stanford CoreNLP from http://nlp.stanford.edu/software//stanford-corenlp-full-2015-12-09.zip and decompressed it into my R library directory. Initializing it with initCoreNLP() produced the following:
initCoreNLP()
[main] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Searching for resource: StanfordCoreNLP.properties
[main] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator tokenize
[main] INFO edu.stanford.nlp.pipeline.TokenizerAnnotator - TokenizerAnnotator: No tokenizer type provided. Defaulting to PTBTokenizer.
[main] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator ssplit
[main] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator pos
Reading POS tagger model from edu/stanford/nlp/models/pos-tagger/english-left3words/english-left3words-distsim.tagger ... done [0,4 sec].
[main] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator lemma
[main] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator ner
Loading classifier from edu/stanford/nlp/models/ner/english.all.3class.distsim.crf.ser.gz ... done [0,7 sec].
Loading classifier from edu/stanford/nlp/models/ner/english.muc.7class.distsim.crf.ser.gz ... done [0,3 sec].
Loading classifier from edu/stanford/nlp/models/ner/english.conll.4class.distsim.crf.ser.gz ... done [0,3 sec].
[main] INFO edu.stanford.nlp.time.JollyDayHolidays - Initializing JollyDayHoliday for SUTime from classpath edu/stanford/nlp/models/sutime/jollyday/Holidays_sutime.xml as sutime.binder.1.
Error in initCoreNLP(): edu.stanford.nlp.util.ReflectionLoading$ReflectionLoadingException: Error creating edu.stanford.nlp.time.TimeExpressionExtractorImpl
Can someone help me fix my CoreNLP configuration?
I have already tried several CoreNLP versions.
Upvotes: 0
Views: 73
Reputation: 11
This issue is solved. The initCoreNLP() command was running out of JVM memory (a GC overhead limit error). A Mac workaround is available here: Cannot Initialize CoreNLP in R.
Before proceeding, make sure that you have Java 8 installed on your system. Adapted to Debian, the workaround yields the following steps:

1. In RStudio or your terminal, disable the JVM's GC overhead limit before any Java code is loaded: options(java.parameters = "-XX:-UseGCOverheadLimit")

2. Load the Java 8 JVM library into R: dyn.load('/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/amd64/server/libjvm.so')

3. Run the coreNLP configuration command: initCoreNLP()
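Put together, a minimal R session for the steps above might look like the sketch below. The libjvm.so path is an assumption for a default Debian openjdk-8 install, not something the workaround guarantees; adjust it to your machine.

```r
# Set JVM options *before* any Java-backed package is attached,
# otherwise rJava ignores them.
options(java.parameters = "-XX:-UseGCOverheadLimit")

# Assumed path for Debian's openjdk-8-jdk package; locate yours with
#   find /usr/lib/jvm -name libjvm.so
dyn.load('/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/amd64/server/libjvm.so')

library(coreNLP)
initCoreNLP()
```

Starting a fresh R session before running this matters: once a JVM has been initialized, later options(java.parameters = ...) calls have no effect.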
Finally, keep in mind that you are not entirely out of trouble: the Java GC overhead warning may still pop up, depending on your hardware and the size of your data set.
Upvotes: 0