Reputation: 131
I'm trying to make my database update quicker, which is why I want to use bulk operations. I have a TEST table which contains around 200k rows. Every day I have to clean the table and load fresh data. When I do this one row at a time it takes me 2 hours.
I'd like to put all the data into dictionaries and insert them in one operation.
I use code like the one below, but something is wrong. Do you know what I should change?
sqlalchemy.exc.UnboundExecutionError: Could not locate a bind configured on mapper Mapper|TEST|TEST or this Session
my db table:
class TEST(db.Model):
    ID = db.Column(db.Integer, primary_key=True)
    PN = db.Column(db.String(45))
    AMOUNT = db.Column(db.String(6))
and insert code:
from sqlalchemy.orm import mapper, Session

s = Session()
s.bulk_insert_mappings(
    TEST,
    [dict(PN='TEST2', AMOUNT=200), dict(PN='TEST3', AMOUNT=300), dict(PN='TEST5', AMOUNT=500)],
)
Upvotes: 1
Views: 816
Reputation: 10315
I suggest using the session that flask_sqlalchemy already provides instead of creating your own:

s = db.session

If you configure flask-sqlalchemy correctly, it sets up all of the binding shown below behind the scenes.
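For example, a rough sketch assuming the usual setup where app = Flask(__name__), db = SQLAlchemy(app), and TEST is the model from your question (the app and db names are placeholders for your own objects):

# rough sketch, assuming app = Flask(__name__) and db = SQLAlchemy(app)
with app.app_context():
    db.session.query(TEST).delete()   # clear out yesterday's rows
    db.session.bulk_insert_mappings(
        TEST,
        [dict(PN='TEST2', AMOUNT='200'), dict(PN='TEST3', AMOUNT='300'), dict(PN='TEST5', AMOUNT='500')],
    )
    db.session.commit()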
When do I make a sessionmaker?
You are creating the Session the wrong way. You have to bind it to an engine created with create_engine:

from sqlalchemy import create_engine
from sqlalchemy.orm import Session

engine = create_engine('sqlite:///...')
s = Session(bind=engine)
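Putting it together as a rough sketch for your daily reload (the sqlite URL is just a placeholder for your real connection string, and TEST is the model from your question):

from sqlalchemy import create_engine
from sqlalchemy.orm import Session

engine = create_engine('sqlite:///mydb.sqlite')  # placeholder URL, use your own database
s = Session(bind=engine)

s.query(TEST).delete()  # wipe the old data
s.bulk_insert_mappings(
    TEST,
    [dict(PN='TEST2', AMOUNT='200'), dict(PN='TEST3', AMOUNT='300'), dict(PN='TEST5', AMOUNT='500')],
)
s.commit()
s.close()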
Upvotes: 1