Viestuts

Reputation: 41

Migration to Spring Boot 2 and using Spring Batch 4

I am migrating Spring Boot from 1.4.2 to 2.0.0, which also includes migrating Spring Batch from 3.0.7 to 4.0.0, and it looks like the batch process no longer works when I try to run it with the new Spring Batch version.

When I tried to debug, I found a problem when the batch tries to get data from batch_job_execution_context.

I can see that getting the data from the database works fine, but the new version of Spring Batch fails to parse the stored data

{"map":[{"entry":[{"string":["name",""]},{"string":["sender",""]},{"string":["id",""]},{"string":["nav",""]},{"string":["created",140418]}]}]}

with this error:

com.fasterxml.jackson.databind.exc.MismatchedInputException: Unexpected token (START_OBJECT), expected VALUE_STRING: need JSON String that contains type id (for subtype of java.lang.Object) at [Source: (ByteArrayInputStream); line: 1, column: 9] (through reference chain: java.util.HashMap["map"])

I have found that when I delete all batch metadata tables and recreate them from scratch, the batch seems to work again. It looks like the metadata JSON format has changed to this:

{"name":"","sender":"145844","id":"","nav":"","created":"160909"}

I do not want to delete the old data to make this work again, so is there any way to fix this?

Has anyone else tried to do this upgrade? It would be nice to know if there are any other breaking changes that I may not have noticed.

Thanks

Upvotes: 4

Views: 6576

Answers (3)

EndlosSchleife

Reputation: 597

In the solution by @anotherdave and @michael-minella, you could also replace the plain XStreamExecutionContextStringSerializer with an instance of the following class. It accepts both formats when deserializing and serializes to the new format.

import java.io.BufferedInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Map;
import com.fasterxml.jackson.core.JsonProcessingException;
import org.springframework.batch.core.repository.ExecutionContextSerializer;
import org.springframework.batch.core.repository.dao.Jackson2ExecutionContextStringSerializer;
import org.springframework.batch.core.repository.dao.XStreamExecutionContextStringSerializer;


/**
 * Enables Spring Batch 4 to read ExecutionContext entries written by earlier versions (XStream format)
 * as well as entries in the new Jackson-based format. Entries are always written in the new format.
 */
@SuppressWarnings("deprecation")
class XStreamOrJackson2ExecutionContextSerializer implements ExecutionContextSerializer {
    private final XStreamExecutionContextStringSerializer xStream = new XStreamExecutionContextStringSerializer();
    private final Jackson2ExecutionContextStringSerializer jackson = new Jackson2ExecutionContextStringSerializer();

    public XStreamOrJackson2ExecutionContextSerializer() throws Exception {
        xStream.afterPropertiesSet();
    }

    // The caller closes the stream; and the decoration by ensureMarkSupported does not need any cleanup.
    @SuppressWarnings("resource")
    @Override
    public Map<String, Object> deserialize(InputStream inputStream) throws IOException {
        InputStream repeatableInputStream = ensureMarkSupported(inputStream);
        repeatableInputStream.mark(Integer.MAX_VALUE);

        try {
            return jackson.deserialize(repeatableInputStream);
        } catch (JsonProcessingException e) {
            repeatableInputStream.reset();
            return xStream.deserialize(repeatableInputStream);
        }
    }

    private static InputStream ensureMarkSupported(InputStream in) {
        return in.markSupported() ? in : new BufferedInputStream(in);
    }

    @Override
    public void serialize(Map<String, Object> object, OutputStream outputStream) throws IOException {
        jackson.serialize(object, outputStream);
    }
}
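
If it helps, here is a minimal usage sketch (not part of the original answer): the combined serializer can be plugged in by swapping the serializer bean in @anotherdave's configuration below.

    // Hypothetical usage: return the combined serializer instead of the plain XStream one.
    @Bean
    ExecutionContextSerializer getSerializer() throws Exception {
        return new XStreamOrJackson2ExecutionContextSerializer();
    }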

Upvotes: 0

anotherdave

Reputation: 6754

Based on Michael's answer above, this code block worked for me to extend the default configuration. I had to wire the serializer into both the JobRepository and the JobExplorer:

import javax.sql.DataSource;

import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.explore.JobExplorer;
import org.springframework.batch.core.explore.support.JobExplorerFactoryBean;
import org.springframework.batch.core.repository.ExecutionContextSerializer;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.dao.XStreamExecutionContextStringSerializer;
import org.springframework.batch.core.repository.support.JobRepositoryFactoryBean;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableBatchProcessing
public class MyBatchConfigurer extends DefaultBatchConfigurer {
    private final DataSource dataSource;

    @Autowired
    public MyBatchConfigurer(final DataSource dataSource) {
        this.dataSource = dataSource;
    }

    @Bean
    ExecutionContextSerializer getSerializer() {
        return new XStreamExecutionContextStringSerializer();
    }


    @Override
    protected JobRepository createJobRepository() throws Exception {
        final JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(dataSource);
        factory.setSerializer(getSerializer());
        factory.setTransactionManager(getTransactionManager());
        factory.afterPropertiesSet();
        return factory.getObject();
    }

    @Override
    protected JobExplorer createJobExplorer() throws Exception {
        final JobExplorerFactoryBean jobExplorerFactoryBean = new JobExplorerFactoryBean();
        jobExplorerFactoryBean.setDataSource(dataSource);
        jobExplorerFactoryBean.setSerializer(getSerializer());
        jobExplorerFactoryBean.afterPropertiesSet();
        return jobExplorerFactoryBean.getObject();
    }
}

Upvotes: 2

Michael Minella

Reputation: 21463

Before Spring Batch 4, the default serialization mechanism for the ExecutionContext was XStream. It now uses Jackson by default, which is not compatible with the old serialization format. The old serializer (XStreamExecutionContextStringSerializer) is still available, but you'll need to configure it yourself by implementing a BatchConfigurer and overriding the configuration in the JobRepositoryFactoryBean.
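
A minimal sketch of that override (not from the original answer; see @anotherdave's answer above for a complete BatchConfigurer):

    // Sketch, assuming a custom DefaultBatchConfigurer subclass with an injected dataSource:
    // configure the JobRepository to keep reading/writing the pre-4.0 XStream format.
    @Override
    protected JobRepository createJobRepository() throws Exception {
        XStreamExecutionContextStringSerializer serializer = new XStreamExecutionContextStringSerializer();
        serializer.afterPropertiesSet(); // required when the serializer is not managed as a Spring bean

        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(dataSource);
        factory.setTransactionManager(getTransactionManager());
        factory.setSerializer(serializer);
        factory.afterPropertiesSet();
        return factory.getObject();
    }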

For the record, this is related to this issue: https://jira.spring.io/browse/BATCH-2575.

Upvotes: 3
