ptjack

Reputation: 141

Multiple Spring Batch jobs executing concurrently causing deadlocks

I'm using Spring Batch 4.0.0, Spring Boot 2.2.0, JDK 12.0.2, and SQL Server 2016.

Multiple Spring Batch jobs executing concurrently are causing deadlocks in the Spring Batch metadata tables.

I tried the different solutions proposed:

  1. I added a clustered primary key to the sequence metadata tables:
ALTER TABLE [OWN].[BATCH_JOB_EXECUTION_SEQ]
ADD CONSTRAINT [PK_BATCH_JOB_EXECUTION_SEQ] PRIMARY KEY CLUSTERED ([ID] ASC);

ALTER TABLE [OWN].[BATCH_JOB_SEQ]
ADD CONSTRAINT [PK_BATCH_JOB_SEQ] PRIMARY KEY CLUSTERED ([ID] ASC);

ALTER TABLE [OWN].[BATCH_STEP_EXECUTION_SEQ]
ADD CONSTRAINT [PK_BATCH_STEP_EXECUTION_SEQ] PRIMARY KEY CLUSTERED ([ID] ASC);
  2. I overrode createJobRepository() and createJobExplorer() to configure the isolation level:
    @Override
    protected JobRepository createJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDatabaseType(DATABASE_TYPE);
        factory.setIsolationLevelForCreate(ISOLATION_REPEATABLE_READ);
        factory.setDataSource(dataSource);
        factory.setTransactionManager(getTransactionManager());
        factory.setTablePrefix(TABLE_PREFIX);
        factory.afterPropertiesSet();
        return factory.getObject();
    }

    @Override
    protected JobExplorer createJobExplorer() throws Exception {
        JobExplorerFactoryBean jobExplorerFactoryBean = new JobExplorerFactoryBean();
        jobExplorerFactoryBean.setDataSource(dataSource);
        // the prefix must be set before afterPropertiesSet(), or it is ignored
        jobExplorerFactoryBean.setTablePrefix(TABLE_PREFIX);
        jobExplorerFactoryBean.afterPropertiesSet();
        return jobExplorerFactoryBean.getObject();
    }

But when I start multiple jobs, I almost always get the same error:

Could not increment identity; nested exception is com.microsoft.sqlserver.jdbc.SQLServerException: Transaction was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.

Upvotes: 5

Views: 2500

Answers (1)

ptjack

Reputation: 141

I have found the solution and am publishing it for anyone with the same problem; I never found a complete and comprehensive write-up, so I want to explain how to solve it and why it happens. The deadlock is caused by the update and delete statements Spring Batch runs against the BATCH_STEP_EXECUTION_SEQ, BATCH_JOB_EXECUTION_SEQ, and BATCH_JOB_SEQ tables. Why does it happen? SQL Server only gained native sequences in SQL Server 2012, so Spring Batch emulates sequences with these tables, and that emulation does not hold up under concurrent jobs. I don't have a solution for versions of SQL Server older than 2012: THIS GUIDE IS ONLY FOR SQL SERVER 2012 AND LATER.

1. Create sequence in your DB.

PAY ATTENTION: start each sequence from an id not yet used. If you start from 1, you must clean the Spring Batch tables and any of your own tables where job ids are stored.
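To pick a safe START WITH value, you can first check the highest ids already issued in the metadata tables (a sketch; adjust the schema/prefix to your own installation):

```sql
-- Highest ids already recorded; start each sequence above these values.
SELECT MAX(JOB_INSTANCE_ID)   FROM [OWN].[BATCH_JOB_INSTANCE];
SELECT MAX(JOB_EXECUTION_ID)  FROM [OWN].[BATCH_JOB_EXECUTION];
SELECT MAX(STEP_EXECUTION_ID) FROM [OWN].[BATCH_STEP_EXECUTION];
```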

DROP TABLE [OWN].[BATCH_JOB_SEQ]
GO


DROP TABLE [OWN].[BATCH_JOB_EXECUTION_SEQ]
GO


DROP TABLE [OWN].[BATCH_STEP_EXECUTION_SEQ]
GO


CREATE SEQUENCE [OWN].[BATCH_JOB_SEQ] 
 AS [bigint]
 START WITH 1000
 INCREMENT BY 1
 MINVALUE -9223372036854775808
 MAXVALUE 9223372036854775807
 CACHE 
GO


CREATE SEQUENCE [OWN].[BATCH_JOB_EXECUTION_SEQ] 
 AS [bigint]
 START WITH 1000
 INCREMENT BY 1
 MINVALUE -9223372036854775808
 MAXVALUE 9223372036854775807
 CACHE 
GO

CREATE SEQUENCE [OWN].[BATCH_STEP_EXECUTION_SEQ] 
 AS [bigint]
 START WITH 1000
 INCREMENT BY 1
 MINVALUE -9223372036854775808
 MAXVALUE 9223372036854775807
 CACHE 
GO

2. Override createJobRepository() and createJobExplorer() (personally I had to override the latter too, because otherwise the custom Spring Batch table prefix was not picked up).

Set the isolation level for create to REPEATABLE_READ, because otherwise you will hit new concurrency errors on the DB (personally, without this setting I had problems on BATCH_JOB_EXECUTION or BATCH_JOB_INSTANCE, I don't remember exactly which).

@Component
public class SQLServerConfig extends DefaultBatchConfigurer {

  @Autowired
  private DataSource dataSource;

  @Value("${spring.batch.tablePrefix}")
  private String tablePrefix;

  @Override
  protected JobRepository createJobRepository() throws Exception {
      JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
      factory.setDatabaseType(DatabaseType.SQLSERVER.name());
      factory.setDataSource(dataSource);
      factory.setTransactionManager(getTransactionManager());
      factory.setTablePrefix(tablePrefix);
      factory.setIncrementerFactory(new CustomDataFieldMaxValueIncrementerFactory(dataSource));
      factory.setIsolationLevelForCreate("ISOLATION_REPEATABLE_READ");
      factory.afterPropertiesSet();
      return factory.getObject();
  }

  @Override
  protected JobExplorer createJobExplorer() throws Exception {
      JobExplorerFactoryBean jobExplorerFactoryBean = new JobExplorerFactoryBean();
      jobExplorerFactoryBean.setDataSource(dataSource);
      // set the prefix before afterPropertiesSet(), otherwise it is ignored
      jobExplorerFactoryBean.setTablePrefix(tablePrefix);
      jobExplorerFactoryBean.afterPropertiesSet();
      return jobExplorerFactoryBean.getObject();
  }

}

3. Add a CustomDataFieldMaxValueIncrementerFactory, because sequence values are now obtained through it instead of the dropped tables.

public class CustomDataFieldMaxValueIncrementerFactory extends DefaultDataFieldMaxValueIncrementerFactory {

  private DataSource dataSource;

  public CustomDataFieldMaxValueIncrementerFactory(DataSource dataSource) {
      super(dataSource);
      this.dataSource = dataSource;
  }
  @Override
  public DataFieldMaxValueIncrementer getIncrementer(String incrementerType, String incrementerName) {
      DatabaseType databaseType = DatabaseType.valueOf(incrementerType.toUpperCase());
      DataFieldMaxValueIncrementer dataFieldMaxValueIncrementer = null;
      if (databaseType == DatabaseType.SQLSERVER) {
          dataFieldMaxValueIncrementer =  new CustomSqlServerMaxValueIncrementer(dataSource, incrementerName);
      } else {
          dataFieldMaxValueIncrementer = super.getIncrementer(incrementerType, incrementerName);
      }
      return dataFieldMaxValueIncrementer;
  }
}
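The factory above returns a CustomSqlServerMaxValueIncrementer, which is never shown. A minimal sketch of what that class can look like, assuming it extends Spring's AbstractSequenceMaxValueIncrementer and reads the native sequences created in step 1 (the class name comes from the code above; this implementation is my assumption, not the original author's code):

```java
import javax.sql.DataSource;

import org.springframework.jdbc.support.incrementer.AbstractSequenceMaxValueIncrementer;

/**
 * Obtains ids from a native SQL Server sequence instead of the
 * BATCH_*_SEQ tables, avoiding the update/delete deadlocks.
 * Requires SQL Server 2012+ (NEXT VALUE FOR).
 */
public class CustomSqlServerMaxValueIncrementer extends AbstractSequenceMaxValueIncrementer {

  public CustomSqlServerMaxValueIncrementer(DataSource dataSource, String incrementerName) {
      // incrementerName is the sequence name, e.g. OWN.BATCH_JOB_SEQ
      super(dataSource, incrementerName);
  }

  @Override
  protected String getSequenceQuery() {
      // NEXT VALUE FOR atomically returns and advances the sequence,
      // so concurrent jobs never contend on a row lock
      return "select next value for " + getIncrementerName();
  }
}
```

With the default table prefix, the incrementer names passed in by Spring Batch are BATCH_JOB_SEQ, BATCH_JOB_EXECUTION_SEQ, and BATCH_STEP_EXECUTION_SEQ, matching the sequences created in step 1.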

That's all, now it works!

I hope it will be useful to someone.

Upvotes: 5
