Exorcismus

Reputation: 2482

Hibernate pagination

I have a table of items whose flag column is either null or "done". I need to fetch the null-flagged items, process them, and set their flag to "done".

The thing is, I want to use pagination, fetching the null-flagged items 500 at a time. My design goes as follows:

  1. I fetch 500 items // the producer
  2. put them in a queue
  3. some thread takes these 500 items // the consumer
  4. operates on them and updates their flag to "done"

The problem I am facing is that the consumer is pretty slow, so the producer fetches the same 500 parts again. I went for tracking an offset index, but it does not seem to work properly:

    public List<Parts> getNParts(int listSize) {
        List<Parts> newPartList = Collections.emptyList();
        try {
            Criteria criteria = session.createCriteria(Parts.class);
            criteria.add(Restrictions.isNull("Status"));
            criteria.setFirstResult(DBIndexGuard.getNextIndex()); // index += 500
            criteria.setMaxResults(listSize);                     // listSize is 500
            newPartList = criteria.list();
        } catch (Exception e) {
            e.printStackTrace();
        }
        return newPartList;
    }

How can I implement pagination so that each fetch of 500 null-flagged items returns different rows?

Upvotes: 0

Views: 190

Answers (4)

Exorcismus

Reputation: 2482

I solved it as follows:

  if (newPartList.isEmpty() || newPartList.size() < DBIndexGuard.getAllowedListSize()) { // AllowedListSize = 500
      System.out.println("DataFetcher Sleeping");
      inputQueue.offer(newPartList);
      DBIndexGuard.resetIndex();  // start paging again from offset 0
      session.clear();            // drop cached entities so updated flags are re-read
      TimeUnit.MINUTES.sleep(10); // give the consumer time to catch up
  }

Upvotes: 0

hityagi

Reputation: 5256

This problem can easily be solved by introducing one more status, say 'processing'. The producer marks the records it picks as 'processing'; the consumer then works on them and sets their status to 'done'. That way the producer never picks up already-claimed records.
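A minimal sketch of this claim-then-process idea, using an in-memory map in place of the table (the class and method names here are mine, not from the question; in the real app the claim step would be a single HQL bulk UPDATE setting status to 'PROCESSING' inside one transaction):

```java
import java.util.*;
import java.util.stream.*;

public class ClaimDemo {
    // part id -> status; a null value models the NULL column
    static final Map<Integer, String> status = new HashMap<>();

    // Producer: claim up to 'limit' unflagged parts, so a second
    // fetch cannot pick the same rows up again.
    static List<Integer> claim(int limit) {
        List<Integer> claimed = status.entrySet().stream()
                .filter(e -> e.getValue() == null)
                .limit(limit)
                .map(Map.Entry::getKey)
                .collect(Collectors.toList());
        claimed.forEach(id -> status.put(id, "PROCESSING"));
        return claimed;
    }

    // Consumer: finish the work and mark the batch done.
    static void markDone(List<Integer> batch) {
        batch.forEach(id -> status.put(id, "DONE"));
    }

    public static void main(String[] args) {
        for (int i = 0; i < 5; i++) status.put(i, null);
        List<Integer> first = claim(3);
        List<Integer> second = claim(3); // never overlaps with 'first'
        System.out.println(first.size() + " " + second.size());
        markDone(first);
    }
}
```

With the intermediate status in place, the producer query simply excludes both 'PROCESSING' and 'DONE' rows, and pagination offsets are no longer needed at all.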

Upvotes: 0

Poorna Subhash

Reputation: 2128

You may try one of the following implementations:

  1. Eliminate duplicate processing on the consumer side: set the flag to 'done' only if it is still null.
  2. Eliminate duplicates on the producer side by maintaining an additional status, 'in process', set whenever items are put in the queue, and exclude those records in your producer query.
  3. While paginating, sort your records by the table's primary key, and for subsequent pages query only records whose primary key is greater than that of the last record on the previous page.
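Point 3 is keyset pagination: instead of setFirstResult, remember the last row's primary key and query for ids greater than it, ordered by id (with Criteria that would be Restrictions.gt("id", lastId) plus Order.asc("id")). A runnable sketch with the table simulated by a list of ids (all names here are illustrative):

```java
import java.util.*;
import java.util.stream.*;

public class KeysetDemo {
    // stand-in for the Parts table: ids 1..12
    static final List<Long> table =
            LongStream.rangeClosed(1, 12).boxed().collect(Collectors.toList());

    // Fetch the next 'pageSize' ids strictly greater than lastId.
    static List<Long> nextPage(long lastId, int pageSize) {
        return table.stream()
                .filter(id -> id > lastId)
                .sorted()
                .limit(pageSize)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        long lastId = 0L;
        List<Long> page;
        while (!(page = nextPage(lastId, 5)).isEmpty()) {
            System.out.println(page);
            lastId = page.get(page.size() - 1); // key of the last row seen
        }
    }
}
```

Unlike an offset, the "id > lastId" cursor stays correct even while earlier rows are being updated to 'done' underneath the producer.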

Upvotes: 0

Vishrant

Reputation: 16628

Create a synchronized handoff for this producer-consumer type of problem; this tutorial can help you.
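One way to get that synchronization without writing wait/notify by hand is a bounded BlockingQueue: put() blocks when the queue is full, so a slow consumer naturally throttles the producer instead of letting it re-fetch the same rows. A minimal sketch (the class name, the fake two-item batches, and the empty-list poison pill are my own illustration, not code from the question):

```java
import java.util.*;
import java.util.concurrent.*;

public class PipelineDemo {
    // Runs one producer and one consumer over a bounded queue and
    // returns everything the consumer processed.
    static List<Integer> runPipeline() throws InterruptedException {
        BlockingQueue<List<Integer>> queue = new ArrayBlockingQueue<>(2);
        List<Integer> done = Collections.synchronizedList(new ArrayList<>());

        Thread producer = new Thread(() -> {
            try {
                for (int batch = 0; batch < 3; batch++) {
                    // stand-in for getNParts(...): each batch is a fresh pair of ids
                    queue.put(Arrays.asList(batch * 2, batch * 2 + 1)); // blocks when full
                }
                queue.put(Collections.emptyList()); // poison pill: no more work
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        Thread consumer = new Thread(() -> {
            try {
                while (true) {
                    List<Integer> parts = queue.take(); // blocks when empty
                    if (parts.isEmpty()) break;         // poison pill received
                    done.addAll(parts);                 // stand-in for "set flag to done"
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        producer.start();
        consumer.start();
        producer.join();
        consumer.join();
        return done;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(runPipeline()); // [0, 1, 2, 3, 4, 5]
    }
}
```

The small queue capacity (2) is what enforces the back-pressure: the producer cannot race ahead of the consumer by more than two batches.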

Upvotes: 1
