Reputation: 21
I've looked at several answers to similar questions, and none of them fit my use case. That said, I apologize in advance if I've somehow missed the answer and this ends up being a re-posted question.
In short, I'm writing a small extension / wrapper around a legacy application that uses BDE Paradox tables as its backend. I can't change the backend, and the legacy application isn't being replaced any time soon. I can, however, access the backend and do everything I need to in SQLAlchemy using the Paradox ODBC drivers that ship with Windows.
The original creator of the legacy application organized the "database" for customer information as a series of ~28 individual Paradox DB files, all with an identical schema / design but named according to the first letter of the last name of the customer(s) they contain. As an example, the customer record for Adam Smith would be in the DB named [CUSTOMER_S], whereas the record for Johnny Appleseed would be in the DB named [CUSTOMER_A]. To further complicate things, the legacy application's file structure is such that, while all of the application's files live under a single top-level directory, the main application data sits in one subfolder and the customer data in another. The structure looks something like this:
-- /
   |
   -- Top-Level Application Directory
      |
      -- Main Application Data
      |     |
      |     ...
      |
      -- Customer Data
            |
            ...
While this is irritating, handling it is as simple as:
from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker

item_engine = create_engine(item_connection_string)
customer_engine = create_engine(customer_connection_string)

super_session = scoped_session(
    sessionmaker(binds={Item: item_engine, Customer: customer_engine})
)
super_session.configure(autoflush=False, autocommit=False, expire_on_commit=True)
session = super_session()
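For clarity, the per-letter naming scheme works out to something like this (customer_table_for is just an illustrative helper of mine, not anything from the legacy app):

```python
def customer_table_for(last_name: str) -> str:
    """Map a customer's last name to the per-letter Paradox table name,
    e.g. Smith -> CUSTOMER_S, Appleseed -> CUSTOMER_A."""
    initial = last_name.strip()[0].upper()
    return f"CUSTOMER_{initial}"
```

So Adam Smith lands in CUSTOMER_S and Johnny Appleseed in CUSTOMER_A, matching the examples above.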
There are some strange internal rules in the legacy application that determine how items are matched to customers, but that's probably not germane to the question at hand.
What I'd like to do is something like this:
from sqlalchemy import event
from sqlalchemy.orm.query import Query


@event.listens_for(Query, "before_compile", retval=True)
def fix_customer_tables(query: Query) -> Query:
    # Check if the object being queried is a Customer.
    # If so, alter the __tablename__ before it's passed
    # to the SQLAlchemy compiler so that the record is
    # pulled from or written to the correct file.
    return query
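For what it's worth, the closest I've found to introspecting a Query is its column_descriptions attribute, which does expose the mapped entity. A minimal sketch, using a throwaway SQLite engine and a stand-in Customer class rather than the real Paradox setup:

```python
from sqlalchemy import Column, Integer, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker  # SQLAlchemy 1.4+

Base = declarative_base()


class Customer(Base):
    # Stand-in for the real mapped class; the real one points at Paradox.
    __tablename__ = "CUSTOMER_A"
    id = Column(Integer, primary_key=True)


engine = create_engine("sqlite://")  # throwaway engine, illustration only
session = sessionmaker(bind=engine)()

query = session.query(Customer)
# column_descriptions exposes the mapped entity behind the query...
entity = query.column_descriptions[0]["entity"]
print(entity.__tablename__)  # -> CUSTOMER_A
```

That shows where the class-level __tablename__ lives, but not how the compiled statement picks it up, or how to swap it per-query.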
What's got me completely turned around is figuring out where / how the Query object passed into the fix_customer_tables listener actually stores / references the __tablename__ to be queried. If someone could point me in the right direction, that would be amazing.
Upvotes: 1
Views: 135