Rubal Jain

Reputation: 87

java.nio.channels.OverlappingFileLockException on Hibernate Search

I have implemented a very basic Hibernate Search setup in my project. Below is the search method:

public List<Patients> search(String givenName, String middleName, String email, String phoneNumber)
        throws InterruptedException, AppException {

    fullTextEntityManager = Search.getFullTextEntityManager(EntityManagerUtil.getEntityManager());

    // rebuild the whole index from the database before every search
    fullTextEntityManager.createIndexer().startAndWait();

    QueryBuilder qb = fullTextEntityManager
            .getSearchFactory()
            .buildQueryBuilder()
            .forEntity(Patients.class)
            .overridesForField("givenName", "customanalyzer_query")
            .overridesForField("middleName", "customanalyzer_query")
            .overridesForField("email", "customanalyzer_query")
            .overridesForField("phoneNumber", "customanalyzer_query")
            .get();

    // all four fields must match their respective wildcard patterns
    org.apache.lucene.search.Query luceneQuery = qb.bool()
            .must(qb.keyword().wildcard().onField("givenName").matching(givenName).createQuery())
            .must(qb.keyword().wildcard().onField("middleName").matching(middleName).createQuery())
            .must(qb.keyword().wildcard().onField("email").matching(email).createQuery())
            .must(qb.keyword().wildcard().onField("phoneNumber").matching(phoneNumber).createQuery())
            .createQuery();

    javax.persistence.Query jpaQuery =
            fullTextEntityManager.createFullTextQuery(luceneQuery, Patients.class);

    @SuppressWarnings("unchecked")
    List<Patients> result = jpaQuery.getResultList();

    return result;
}

I am using PostgreSQL as the database.

The first time this is run, everything works: the indexes are created in the directory and the search completes.

Then I add a few more entries to the database and run the same search again. Sometimes I get the desired result, and sometimes the newly added entries are missing from the results and I get this exception:

ERROR [Hibernate Search: Index updates queue processor for index com.healthelife.DGS.entity.Patients-1] (LuceneBackendQueueTask.java:73) - HSEARCH000073: Error in backend
java.nio.channels.OverlappingFileLockException
at sun.nio.ch.SharedFileLockTable.checkList(FileLockTable.java:255)
at sun.nio.ch.SharedFileLockTable.add(FileLockTable.java:152)
at sun.nio.ch.FileChannelImpl.tryLock(FileChannelImpl.java:1108)
at java.nio.channels.FileChannel.tryLock(FileChannel.java:1155)
at org.apache.lucene.store.NativeFSLock.obtain(NativeFSLockFactory.java:217)
at org.apache.lucene.store.Lock.obtain(Lock.java:72)
at org.apache.lucene.index.IndexWriter.<init>(IndexWriter.java:1098)
at org.hibernate.search.backend.impl.lucene.IndexWriterHolder.createNewIndexWriter(IndexWriterHolder.java:148)
at org.hibernate.search.backend.impl.lucene.IndexWriterHolder.getIndexWriter(IndexWriterHolder.java:115)
at org.hibernate.search.backend.impl.lucene.AbstractWorkspaceImpl.getIndexWriter(AbstractWorkspaceImpl.java:117)
at org.hibernate.search.backend.impl.lucene.LuceneBackendQueueTask.applyUpdates(LuceneBackendQueueTask.java:99)
at org.hibernate.search.backend.impl.lucene.LuceneBackendQueueTask.run(LuceneBackendQueueTask.java:67)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
ERROR [Hibernate Search: Index updates queue processor for index com.healthelife.DGS.entity.Patients-1] (LogErrorHandler.java:83) - HSEARCH000058: Exception occurred java.nio.channels.OverlappingFileLockException
Primary Failure:
Entity com.healthelife.DGS.entity.Patients  Id null  Work Type  org.hibernate.search.backend.PurgeAllLuceneWork

My entity class:

@Produces(MediaType.APPLICATION_JSON)
@XmlRootElement
@AnalyzerDef(name = "customanalyzer_query",
        tokenizer = @TokenizerDef(factory = WhitespaceTokenizerFactory.class),
        filters = {
            @TokenFilterDef(factory = LowerCaseFilterFactory.class),
            @TokenFilterDef(factory = SnowballPorterFilterFactory.class,
                    params = { @Parameter(name = "language", value = "English") })
        })
@Analyzer(definition = "customanalyzer_query")
public class Patients implements Serializable {

   private static final long serialVersionUID = -6061320465621019356L;

   @Id
   @GeneratedValue(strategy = GenerationType.AUTO)
   @Column(name = "personId", nullable = false, unique = true)
   private Long personId;

   @Column(name = "prefix", nullable = true, unique = false)
   private String prefix;

   @Field(index=Index.YES, analyze=Analyze.YES, store=Store.NO, analyzer=@Analyzer(definition="customanalyzer_query"))
   @Column(name = "givenName", nullable = true, unique = false)
   private String givenName;

   @Field(index=Index.YES, analyze=Analyze.YES, store=Store.NO, analyzer=@Analyzer(definition="customanalyzer_query"))
   @Column(name = "middleName", nullable = true, unique = false)
   private String middleName;

   @OneToOne(cascade = CascadeType.ALL, fetch = FetchType.EAGER)
   @JoinColumn(name = "gender", nullable = true)
   private Gender gender;

   @Column(name = "dateOfBirth", nullable = true, unique = false)
   private String dateOfBirth;

   @Column(name = "address1", nullable = true, unique = false)
   private String address1;

   @Column(name = "address2", nullable = true, unique = false)
   private String address2;

   @Column(name = "postalCode", nullable = true, unique = false)
   private String postalCode;

   @Field(index=Index.YES, analyze=Analyze.YES, store=Store.NO, analyzer=@Analyzer(definition="customanalyzer_query"))
   @Column(name = "phoneNumber", nullable = true, unique = false)
   private String phoneNumber;

   @Column(name = "phoneExt", nullable = true, unique = false)
   private String phoneExt;

   @Field(index=Index.YES, analyze=Analyze.YES, store=Store.NO, analyzer=@Analyzer(definition="customanalyzer_query"))
   @Column(name = "email", nullable = true, unique = false)
   private String email;        

   @Column(name = "city", nullable = true, unique = false)
   private String city; 

   @Column(name = "dateChanged", nullable = true, unique = false)
   private String dateChanged;

   @Column(name = "dateCreated", nullable = false, unique = false)
   private String dateCreated;

   @OneToOne(cascade = CascadeType.ALL, fetch = FetchType.EAGER, orphanRemoval=true)
   @JoinColumn(name = "personIdentifiers", nullable = true)
   private PersonIdentifiers personIdentifiers;

   @Column(name = "profileImage", nullable = true, unique = false)
   private String profileImage;

   @Column(name = "nationality", nullable = true, unique = false)
   private String nationality;

   @OneToOne(cascade = CascadeType.ALL, fetch = FetchType.EAGER)
   @JoinColumn(name = "visa", nullable = true)
   private VisaDetails visa;

   @OneToOne(cascade = CascadeType.ALL, fetch = FetchType.EAGER)
   @JoinColumn(name = "emergencyContact", nullable = true)
   private EmergencyContact emergencyContact;

   @OneToMany(fetch = FetchType.EAGER, cascade= CascadeType.ALL)
   private List<IDProof> idproof = new ArrayList<>();

//getters and setters....
}

Upvotes: 0

Views: 4348

Answers (1)

Sanne

Reputation: 6107

You need to make sure that, for any given path used to store a Lucene index, you never have more than one instance of Hibernate Search running against it.
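
Since the question bootstraps everything through EntityManagerUtil.getEntityManager(), one common way to end up with several Hibernate Search instances is building a new EntityManagerFactory per request. The asker's EntityManagerUtil is not shown, so the class body and the persistence-unit name below are only a sketch of the idea of keeping a single shared factory:

import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

// Sketch only: keep exactly one EntityManagerFactory (and therefore one
// Hibernate Search instance per index path) for the whole application.
// "EntityManagerUtil" here is a hypothetical holder, not the asker's class,
// and "myPersistenceUnit" is a placeholder name.
public final class EntityManagerUtil {

    private static final EntityManagerFactory EMF =
            Persistence.createEntityManagerFactory("myPersistenceUnit");

    private EntityManagerUtil() {
    }

    // Hand out short-lived EntityManagers from the single shared factory
    // instead of creating a new factory (and a new index writer) per call.
    public static EntityManager getEntityManager() {
        return EMF.createEntityManager();
    }

    // Called once at application shutdown; releases the Lucene index lock.
    public static void close() {
        EMF.close();
    }
}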

This also implies that you have to ensure any Hibernate application shuts down correctly; for example, the Hibernate SessionFactory needs to be closed:

sessionFactory.close();

Never start a second copy while one is still running: the file locks exist to protect you from that mistake, which is why you see this exception.
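
In a JPA setup like the one in the question, closing the EntityManagerFactory closes the underlying SessionFactory. As a minimal sketch, assuming the application runs in a servlet container and uses the hypothetical EntityManagerUtil holder from the sketch above, the shutdown hook could look like this:

import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;
import javax.servlet.annotation.WebListener;

// Sketch: close the shared factory when the web application is undeployed,
// so the Lucene index lock held by Hibernate Search is released.
@WebListener
public class SearchShutdownListener implements ServletContextListener {

    @Override
    public void contextInitialized(ServletContextEvent sce) {
        // nothing to do on startup
    }

    @Override
    public void contextDestroyed(ServletContextEvent sce) {
        EntityManagerUtil.close(); // hypothetical holder from the sketch above
    }
}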

Upvotes: 1
