blue-sky

Reputation: 53876

Does Spring @Cacheable block if accessed by more than 1 thread?

Suppose a method marked @Cacheable takes 10 minutes to complete and two threads, t1 and t2, access the method.

t1 accesses it at time 0 (the cached method now runs for the first time); t2 accesses it at t1 + 5 minutes.

Does this mean that t2 will not access the data for approximately 5 minutes, since t1 has already started the @Cacheable operation and it is due to complete in 5 minutes (as it has been running for 5 minutes)? Or will a new call to the @Cacheable method be invoked by t2?

Upvotes: 17

Views: 12879

Answers (4)

JayVeeInCorp

Reputation: 184

Since Spring 4.3, you can get the desired blocking behavior by adding the sync = true flag:

@Cacheable(value="cacheName", key="{#keyField1, #keyField2}", sync = true)
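In full context, a minimal sketch of how that might look on a method (the ReportService class, the cache name "reports", and the parameter names are illustrative, not taken from the question):

    import org.springframework.cache.annotation.Cacheable;
    import org.springframework.stereotype.Service;

    @Service
    public class ReportService {

        // With sync = true, concurrent callers that miss on the same key wait
        // for the first caller to finish and cache the value, instead of each
        // running the expensive method themselves.
        @Cacheable(value = "reports", key = "{#customerId, #month}", sync = true)
        public String buildReport(String customerId, int month) {
            return expensiveTenMinuteComputation(customerId, month);
        }

        private String expensiveTenMinuteComputation(String customerId, int month) {
            // placeholder for the long-running work
            return "report-" + customerId + "-" + month;
        }
    }

Note that sync is a hint to the cache provider: the blocking is intended to be per key, and the actual behavior depends on the underlying cache implementation supporting synchronized lookups.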

Upvotes: 11

Loki

Reputation: 941

There is no blocking on @Cacheable.

But you can use a blocking cache strategy in the cache implementation: the first query that finds a miss is responsible for rebuilding the cache, and the other queries wait until the cache has been rebuilt.

  • For a local cache implementation, use a ReadWriteLock; see net.sf.ehcache.constructs.blocking.BlockingCache (a sketch of the idea follows below).
  • For a remote cache implementation, use a makeshift central (distributed) lock.
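
As an illustration of the local-cache approach, here is a minimal blocking-cache sketch built on a ReadWriteLock. The class name BlockingLoadingCache and the loader function are invented for the example; this is not Ehcache's actual BlockingCache.

    import java.util.HashMap;
    import java.util.Map;
    import java.util.concurrent.locks.ReentrantReadWriteLock;
    import java.util.function.Function;

    class BlockingLoadingCache<K, V> {
        private final Map<K, V> store = new HashMap<>();
        private final ReentrantReadWriteLock lock = new ReentrantReadWriteLock();
        private final Function<K, V> loader;

        BlockingLoadingCache(Function<K, V> loader) {
            this.loader = loader;
        }

        V get(K key) {
            lock.readLock().lock();
            try {
                V cached = store.get(key);
                if (cached != null) {
                    return cached;                // fast path: cache hit
                }
            } finally {
                lock.readLock().unlock();
            }

            lock.writeLock().lock();              // miss: only one thread rebuilds
            try {
                V cached = store.get(key);        // re-check; another thread may have
                if (cached == null) {             // loaded it while we waited for the lock
                    cached = loader.apply(key);
                    store.put(key, cached);
                }
                return cached;
            } finally {
                lock.writeLock().unlock();
            }
        }
    }

A production implementation such as Ehcache's BlockingCache locks per key rather than with one coarse lock, so rebuilding one entry does not block reads of unrelated entries.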

Upvotes: 6

Jigish

Reputation: 1784

As kolossus explained, the cache is checked prior to the method call. So, if the item is not in the cache (as will be the case at t1 + 5 mins), the method invocation will happen for thread t2 as well.

There is no "blocking" on the @Cacheable annotation. t2 will call the method as if there was a cache-miss and hence t2 will also take 10 minutes to complete the same method.

Upvotes: 2

kolossus

Reputation: 20691

If the result of the first execution hasn't been cached, the second invocation will proceed.

You should understand that @Cacheable is centered around the content of the cache (and not specifically a thread's execution context [well, kind of; the cache still needs to be threadsafe]). On execution of a method, the cache is first checked to see if the key exists: if t1 is taking a while to complete, its result will not yet be cached; therefore, concurrent executions will proceed without regard for t1's execution.
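
Put differently, the caching interceptor effectively applies a cache-aside sequence around the method call. The sketch below is a simplified rendering of that order of operations against Spring's Cache API, not the framework's actual source:

    import java.util.concurrent.Callable;
    import org.springframework.cache.Cache;

    class CacheAsideSketch {
        // Simplified order of operations behind @Cacheable (without sync = true).
        static Object invokeCacheably(Cache cache, Object key, Callable<Object> method) throws Exception {
            Cache.ValueWrapper hit = cache.get(key);   // 1. check the cache first
            if (hit != null) {
                return hit.get();                      // 2. hit: the method is skipped
            }
            Object result = method.call();             // 3. miss: run the (slow) method
            cache.put(key, result);                    // 4. cache only after it returns;
            return result;                             //    a thread arriving before step 4
                                                       //    sees a miss and runs step 3 too
        }
    }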

Upvotes: 8
