Boto DynamoDB: How to prevent items from being overwritten with batch_write?

I am using boto DynamoDB v2 and I am writing items to a table in batch. However, I cannot prevent DynamoDB from overwriting the attributes of existing items; I would prefer the write to fail instead.

The table has the following schema:

from boto.dynamodb2.fields import HashKey, RangeKey
from boto.dynamodb2.table import Table

conn = get_connection()  # returns a boto DynamoDB connection (defined elsewhere)
t = Table.create(
    'intervals',
    schema=[
        HashKey('id'),
        RangeKey('start')
    ],
    connection=conn
)


Let's say I insert one element:

item = {
    'id': '4920',
    'start': '20',
    'stop': '40'
}
t.put_item(data=item)


Now when I insert new items with batch_write, I want to make sure DynamoDB doesn't overwrite the existing item. According to the documentation, this should be achieved with the overwrite parameter of the put_item method of the BatchTable class (which is the one used as the context manager in the example below):

new_items = [{
    'id': '4920',
    'start': '20',
    'stop': '90'
}]

with t.batch_write() as batch:
    for i in new_items:
        batch.put_item(data=i, overwrite=False)


However, it does not work: the stop attribute in my example gets the new value 90, so the previous value (40) is overwritten.

If I use the table's own put_item method, the overwrite parameter does work. By setting it to False, instead of the value being replaced, a ConditionalCheckFailedException is raised.
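(As far as I can tell, boto translates overwrite=False into a conditional Expected clause on the underlying PutItem call, requiring that the key attributes do not already exist. The sketch below is my approximation of that translation, not boto's actual code:)

```python
def build_expected(key_attributes):
    # Approximation of what overwrite=False turns into for a PutItem
    # request: require that each key attribute does not already exist,
    # so the call fails with ConditionalCheckFailedException when an
    # item with the same key is already present.
    return dict((attr, {'Exists': False}) for attr in key_attributes)

# For the 'intervals' table, the condition would cover both keys:
expected = build_expected(['id', 'start'])
print(expected)  # → {'id': {'Exists': False}, 'start': {'Exists': False}}
```

Batch writes take no such Expected clause, which is why the condition can only be attached to individual puts.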

How can I get this exception when using batch_write?



1 answer


I don't think it can be done with DynamoDB: the underlying BatchWriteItem API does not support conditional writes. It is a bug in boto that the put_item method of the BatchTable object accepts an overwrite parameter. If you check the code, you will see that it does nothing with this parameter. It is ignored, because there is nothing boto can do with it; DynamoDB simply doesn't support conditions in batch writes. At least for now.
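A common workaround (my own sketch, not part of the answer above) is to give up the batch and issue individual conditional puts, collecting the conflicts instead of failing outright. The helper below is hypothetical: with boto2 you would pass put_fn=lambda i: t.put_item(data=i, overwrite=False) and conflict_exc=ConditionalCheckFailedException. The demo uses an in-memory stand-in so it runs without a connection:

```python
def put_without_overwrite(items, put_fn, conflict_exc):
    """Try a conditional put for each item; return the items that
    already existed instead of silently overwriting them.

    put_fn       -- performs a single conditional put (e.g. a wrapper
                    around Table.put_item(data=item, overwrite=False))
    conflict_exc -- the exception the put raises on a key collision
                    (ConditionalCheckFailedException with boto2)
    """
    conflicts = []
    for item in items:
        try:
            put_fn(item)
        except conflict_exc:
            conflicts.append(item)
    return conflicts

# Demo with an in-memory stand-in for the table:
store = {('4920', '20'): {'stop': '40'}}

def fake_put(item):
    key = (item['id'], item['start'])
    if key in store:
        raise KeyError(key)  # stands in for ConditionalCheckFailedException
    store[key] = item

conflicts = put_without_overwrite(
    [{'id': '4920', 'start': '20', 'stop': '90'},
     {'id': '5000', 'start': '10', 'stop': '30'}],
    fake_put, KeyError)
print(len(conflicts))  # → 1: the existing ('4920', '20') item is untouched
```

Note that this trades away the throughput benefit of batch_write, since each item becomes a separate request.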


