Reputation: 642
I have a table sorted by user id, each object contains a list of objects that have their own id plus two attributes. Is it possible to use update_item to update either one of those attributes, using user_id, and obj_id. The only way I have found to do is to use GET and then PUT, which feels like it could be more expensive.
This is an example entry in my table:
{'id': '0',
 'commands': [{'id': '0', 'command': '', 'actions': []}, {'id': '1', 'command': '', 'actions': []}]
}
Say I want to update the command with id 0 for user_id 0 to have a different list of actions. I figure I can do something like this:
table = dynamodb.Table('commands')

user_id = '0'
command_id = '0'

# Read the whole item, modify the nested list in memory, then write it back.
document = table.get_item(Key={"id": user_id})['Item']
for command in document['commands']:
    if command['id'] == command_id:
        command['actions'] = ['new actions']
        break

table.put_item(Item=document)
But is there a way of doing it with update_item that may be more efficient?
The only thing I can find in the docs is to use list_append in the expression, but I don't want to append a new item; I want to edit an existing one.
I've found these two questions for reference; the first one is where I got the solution above, but the question I'm asking here hasn't been answered: can I use update_item to do this?
Boto3 DynamoDB update list attributes for an item
How to append a value to list attribute on AWS DynamoDB?
Upvotes: 0
Views: 1191
Reputation: 9675
As far as the documentation is concerned, there is no BatchUpdate API available as of now.
The only one I can see is BatchWriteItem, which is applicable to put and delete operations:
The BatchWriteItem operation puts or deletes multiple items in one or more tables. A single call to BatchWriteItem can write up to 16 MB of data, which can comprise as many as 25 put or delete requests. Individual items to be written can be as large as 400 KB.
BatchWriteItem cannot update items. To update items, use the UpdateItem action.
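For the single-item case in the question, UpdateItem can target an element of a list attribute by its index in the UpdateExpression. A minimal sketch, assuming the command to change is the first element of commands and the table and key names match the example above; note the list index has to be known up front, since the expression cannot search the list by the nested id:

import boto3

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('commands')

# Overwrite the 'actions' attribute of the first element in the 'commands' list.
# The index (0 here) must already be known; DynamoDB cannot look it up by the
# nested command id, so you may still need a read to find the right position.
table.update_item(
    Key={'id': '0'},
    UpdateExpression='SET #c[0].#a = :a',
    ExpressionAttributeNames={'#c': 'commands', '#a': 'actions'},
    ExpressionAttributeValues={':a': ['new actions']},
)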
Upvotes: 2