Mike Axiak

Reputation: 12004

Qualifying table names with database names in sqlalchemy

I'm using sqlalchemy with elixir on multiple databases. Currently everything works well with multiple sessions, each bound to a different database. However, there are cases where I want to use one session to query against the other database (this is to support multiple servers with different replicated databases).

The problem is that when making a query against one session, the table names are not qualified with the proper database name, and I don't know how I can tell the query which database name to prefix. Is there any way to accomplish this?

Upvotes: 3

Views: 3236

Answers (2)

zzzeek

Reputation: 75317

This may be difficult since you already have everything mapped in terms of different binds. The "schema" argument to Table is what renders the "schemaname.tablename" syntax; however, this implies you'd be using mapped classes that are distinct from the normal ones, which are mapped to the table without the schema name.

So first some platform-specific techniques that would make this a lot easier. If you're on Oracle, use Oracle CREATE SYNONYM to map "somedb.sometable" to just "sometable" in the remote schema. If you're on Postgresql, manipulate the search_path so that multiple schemas are searched for a given name (see http://www.postgresql.org/docs/8.1/static/ddl-schemas.html#DDL-SCHEMAS-PATH).

Neither of those work? OK, then you need to do something along the lines of the recipe at http://www.sqlalchemy.org/trac/wiki/UsageRecipes/EntityName (sorry, this isn't using Elixir; I'm not sure what Elixir's capabilities are as far as that goes). There are different ways to achieve it depending on specifics. Here's one way that will make a new, anonymous class mapped to the original table:

from sqlalchemy import Column, Integer
from sqlalchemy.orm import class_mapper, mapper

# regular classes
class ClassOne(Base):
    __tablename__ = 'one'
    id = Column(Integer, primary_key=True)

class ClassTwo(Base):
    __tablename__ = 'two'
    id = Column(Integer, primary_key=True)

def map_to_alt_db(cls, metadata, schemaname):
    # grab the table the class is currently mapped to
    m = class_mapper(cls)
    # new, anonymous class to carry the alternate mapping
    cls2 = type("AltClass", (object,), {})
    # copy the table into the given MetaData under the alternate schema
    table = m.local_table.tometadata(metadata, schema=schemaname)
    mapper(cls2, table)
    return cls2

alt_cls = map_to_alt_db(ClassTwo, somemetadata, "alt_schema")

map_to_alt_db() will pull out the table that ClassTwo is mapped to, change its schema to "alt_schema", then map it to a new class.

This approach loses whatever else is special about ClassTwo, though. If you needed that you could do more of the specific approach that's on that wiki page.
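To see just the schema-qualification mechanics in isolation, here is a minimal, runnable sketch. The table and schema names are made up for illustration, and to_metadata() is the SQLAlchemy 1.4+ spelling of tometadata():

```python
# Minimal demonstration of the "schema" argument: copying a Table into
# another MetaData with a schema makes rendered SQL qualify the name.
from sqlalchemy import Column, Integer, MetaData, Table, select

metadata = MetaData()
two = Table("two", metadata, Column("id", Integer, primary_key=True))

# copy the table into a second MetaData under an alternate schema
# (to_metadata() is the 1.4+ spelling of tometadata())
alt_metadata = MetaData()
alt_two = two.to_metadata(alt_metadata, schema="alt_schema")

# compiled SQL now prefixes the table with the schema name
print(select(alt_two))
```

Mapping an anonymous class to the copied table, as map_to_alt_db() does, then gives you an entity whose queries are qualified with the alternate name.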

Upvotes: 1

Nick Martin

Reputation: 172

I hit a similar issue with Oracle as the DB. The schema name differed between the DB instances, so hopefully this is analogous to your situation.

The way to resolve this was to first fire a simple query at the DB to determine the owner of the schema objects:

SELECT owner FROM ALL_OBJECTS WHERE object_name = :obj_name AND object_type = :obj_type

Then use the resulting scalar as the value for the schema parameter on your reflected tables:

mytable = Table('my_table_name',
                my_bound_metadata,
                autoload=True,
                schema=schema_owner)
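The same two-step idea can be sketched end to end. Since an Oracle server isn't assumed here, this stand-in uses SQLite's ATTACH to play the role of a second schema/owner, and the names are illustrative; autoload_with= is the modern equivalent of autoload=True with bound metadata:

```python
# Hedged, runnable stand-in for the Oracle recipe: ATTACH creates a
# second "schema", and "alt_owner" stands in for the owner value the
# ALL_OBJECTS query would return on Oracle.
from sqlalchemy import MetaData, Table, create_engine, select, text

engine = create_engine("sqlite://")  # in-memory stand-in for Oracle

with engine.connect() as conn:
    # create a second "schema" (on Oracle this is the other owner)
    conn.execute(text("ATTACH DATABASE ':memory:' AS alt_owner"))
    conn.execute(text(
        "CREATE TABLE alt_owner.my_table_name (id INTEGER PRIMARY KEY)"))

    # reflect the table, qualified with the discovered schema/owner
    metadata = MetaData()
    mytable = Table("my_table_name", metadata,
                    autoload_with=conn, schema="alt_owner")
    print(select(mytable))
```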

Upvotes: 2
