Jorge Guerreiro

Reputation: 845

How to force a BatchWriteItem failure

I'm currently writing integration tests for my BatchWriteItem logic using Spock/Groovy. For this purpose I'm running a Docker container that spins up a real DynamoDB table.
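
For reference, this is roughly how the container and client are wired up. It's only a minimal sketch assuming Testcontainers and the amazon/dynamodb-local image; the endpoint, region and credentials are placeholders:

import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.client.builder.AwsClientBuilder;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
import org.testcontainers.containers.GenericContainer;

// Starts DynamoDB Local in Docker and points a v1 SDK client at it
public class LocalDynamoDbSetup {

    public static AmazonDynamoDB startLocalDynamoDb() {
        GenericContainer<?> dynamo = new GenericContainer<>("amazon/dynamodb-local:latest")
                .withExposedPorts(8000);
        dynamo.start();

        String endpoint = "http://" + dynamo.getHost() + ":" + dynamo.getMappedPort(8000);
        return AmazonDynamoDBClientBuilder.standard()
                .withEndpointConfiguration(
                        new AwsClientBuilder.EndpointConfiguration(endpoint, "eu-west-1"))
                // DynamoDB Local accepts any credentials
                .withCredentials(new AWSStaticCredentialsProvider(
                        new BasicAWSCredentials("dummy", "dummy")))
                .build();
    }
}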

This is my Java logic for BatchWriteItem:

public Promise<Boolean> createItemsInBatch(ClientKey clientKey, String accountId, List<SrItems> srItems) {
    // Map the incoming items to DynamoDB Items
    List<Item> items = srItems.stream()
            .map(srItem -> createItemFromSrItem(clientKey, createItemRef(srItem.getId(), accountId), srItem))
            .collect(Collectors.toList());

    // BatchWriteItem accepts at most 25 items per request
    List<List<Item>> batchItems = Lists.partition(items, 25);

    // One blocking promise per batch; each yields the WriteRequests DynamoDB left unprocessed
    var promises = batchItems.stream().map(itemsList -> Blocking.get(() -> {
        TableWriteItems tableWriteItems = new TableWriteItems(table.getTableName());
        tableWriteItems.withItemsToPut(itemsList);
        BatchWriteItemOutcome outcome = dynamoDB.batchWriteItem(tableWriteItems);
        return outcome.getUnprocessedItems().values().stream()
                .flatMap(Collection::stream)
                .collect(Collectors.toList());
    })).collect(Collectors.toList());

    // Resolve all batches, log any unprocessed items and report overall success
    return ParallelPromises.yieldAll(promises).map((List<? extends ExecResult<List<WriteRequest>>> results) -> {
        if (results.isEmpty()) {
            return true;
        } else {
            results.stream().map(Result::getValue).flatMap(Collection::stream).forEach(failure -> {
                var failedItem = failure.getPutRequest().getItem();
                logger.error(append("item", failedItem), "Failed to batch write item");
            });
            return false;
        }
    });
}

And this is my current test implementation (happy path):

@Unroll
def "createItemsInBatch - #description"(description, srItemsList, createResult) {
    given:
    // the table running in the Docker container plus its associated DynamoDB client
    def dynamoItemService = new DynamoItemService(realTable, amazonDynamoDBClient1)

    when:
    def promised = ExecHarness.yieldSingle {
        dynamoItemService.createItemsInBatch(CLIENT_KEY, 'account-id', srItemsList as List<SrItem>)
    }

    then:
    promised.success == createResult

    where:
    description                                        | srItemsList | createResult
    "single batch req not reaching batch size limit"   | srItems(10) | true
    "double batch req reaching batch size limit"       | srItems(25) | true
    "triple batch req reaching batch size limit"       | srItems(51) | true
}

For context, what I want now is to test the unhappy path of my logic, i.e. get some UnprocessedItems in the outcome and verify that the code below is actually doing its job:

BatchWriteItemOutcome outcome = dynamoDB.batchWriteItem(tableWriteItems);
return outcome.getUnprocessedItems().values().stream()
        .flatMap(Collection::stream)
        .collect(Collectors.toList());

Any help would be greatly appreciated.

Upvotes: 0

Views: 597

Answers (1)

Leeroy Hannigan

Reputation: 19883

This is actually quite easy to do: we can force throttling on your DynamoDB table, which will result in UnprocessedItems.

Configure your table with 1 WCU and disable auto-scaling. Then run your BatchWriteItem calls in batches of 25 for a couple of seconds; DynamoDB will begin to throttle requests and return the throttled items in the UnprocessedItems response, exercising your unhappy path.
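
For illustration, here is a minimal sketch of dropping the table to minimal provisioned capacity with the v1 SDK before running the batch writes. The client and table name would come from your existing test setup; the class and method names are just for illustration:

import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.model.ProvisionedThroughput;
import com.amazonaws.services.dynamodbv2.model.UpdateTableRequest;

// Forces throttling by dropping the table to minimal provisioned capacity
public class ThrottleSetup {

    public static void forceLowCapacity(AmazonDynamoDB client, String tableName) {
        // 1 RCU / 1 WCU: a sustained burst of 25-item batches exceeds this almost
        // immediately, so the excess writes come back as UnprocessedItems
        client.updateTable(new UpdateTableRequest()
                .withTableName(tableName)
                .withProvisionedThroughput(new ProvisionedThroughput(1L, 1L)));
    }
}

With that in place, an extra row in your where: block with a few hundred items should exercise the UnprocessedItems branch of your code.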

Upvotes: 1
