Reputation: 433
Can someone please explain how to set up dynamodb_mapper (together with boto?) to use ddbmock with the sqlite backend as an Amazon DynamoDB replacement for functional testing purposes?
Right now, I have tried out "plain" boto and managed to get it working with ddbmock (with sqlite) by starting the ddbmock server locally and connecting with boto like this:
from ddbmock import connect_boto_network

# ddbmock running as a standalone server on localhost:6543
db = connect_boto_network(host='127.0.0.1', port=6543)
...and then I use the db object for all operations against the database. However, dynamodb_mapper gets its connection like this:
from dynamodb_mapper.model import ConnectionBorg

conn = ConnectionBorg()
As I understand it, this uses boto's default way of connecting to (the real) DynamoDB. So basically I'm wondering: is there a (preferred?) way to get ConnectionBorg() to connect to my local ddbmock server, the way I've done with boto above? Thanks for any suggestions.
Upvotes: 3
Views: 665
Reputation: 1499
In library mode rather than server mode:
import boto
from ddbmock import config
from ddbmock import connect_boto_patch
# switch to sqlite backend
config.STORAGE_ENGINE_NAME = 'sqlite'
# define the database path. defaults to 'dynamo.db'
config.STORAGE_SQLITE_FILE = '/tmp/my_database.sqlite'
# Wire-up boto and ddbmock together
db = connect_boto_patch()
Any access to the DynamoDB service via boto will then use ddbmock under the hood.
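For illustration, here is a minimal sketch of how dynamodb_mapper code could run on top of that patched connection. The GameScore model and table name are made up for the example, and the exact dynamodb_mapper schema/create_table syntax may differ slightly between versions, so treat it as a sketch rather than a recipe:

import boto
from ddbmock import config, connect_boto_patch
from dynamodb_mapper.model import DynamoDBModel, ConnectionBorg

# Same wiring as above: sqlite backend, boto patched in-process
config.STORAGE_ENGINE_NAME = 'sqlite'
config.STORAGE_SQLITE_FILE = '/tmp/my_database.sqlite'
connect_boto_patch()

# Hypothetical model, only for this example
class GameScore(DynamoDBModel):
    __table__ = u"game_scores"
    __hash_key__ = u"player"
    __schema__ = {
        u"player": unicode,
        u"score": int,
    }

conn = ConnectionBorg()
# create_table goes through boto, which ddbmock intercepts,
# so the table ends up in the local sqlite file
conn.create_table(GameScore, 10, 10, wait_for_active=True)

entry = GameScore()
entry.player = u"alice"
entry.score = 42
entry.save()

assert GameScore.get(u"alice").score == 42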
If you still want to use ddbmock in server mode, I would try changing ConnectionBorg._shared_state['_region']
at the very beginning of your test setup code:
ConnectionBorg._shared_state['_region'] = RegionInfo(name='ddbmock', endpoint='localhost:6543')
As far as I understand, any access to DynamoDB via any ConnectionBorg
instance after those lines will go through the ddbmock entry point.
That said, I've never tested it. I'll make sure the authors of ddbmock give an update on this.
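For what it's worth, a minimal sketch of how that could look inside a unittest setUp, assuming a ddbmock server is already listening on localhost:6543 as in the question. The RegionInfo import path is for boto 2.x, the test class name is made up, and '_region' is dynamodb_mapper's internal shared state, so this may break across versions:

import unittest

from boto.regioninfo import RegionInfo
from dynamodb_mapper.model import ConnectionBorg

class MapperFunctionalTest(unittest.TestCase):
    def setUp(self):
        # Point every ConnectionBorg instance at the local ddbmock
        # server before any model/table code runs
        ConnectionBorg._shared_state['_region'] = RegionInfo(
            name='ddbmock', endpoint='localhost:6543')

    def test_something(self):
        conn = ConnectionBorg()
        # ... exercise your dynamodb_mapper models here ...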
Upvotes: 3