Reputation: 81
Rather than creating mixin classes that models inherit from, I have a use case that requires configuring the classes the other way around: the classes that would normally be mixins need to inherit from the models, and they are also the classes that model objects are created from. This is because the models and the mapper configurations live in a library external to the main repository. I need to pass the host for the engine from the main repository into the models library before any of the models are loaded, so that they load with the declarative base already configured. After the engine information is passed in, the session, the Base class, and everything else are created inside a sort of base class that the models inherit from. Here is a simplified example:
from sqlalchemy import create_engine, MetaData
from sqlalchemy.orm import scoped_session, sessionmaker
from sqlalchemy.ext.declarative import declarative_base

class SQLAlchemyBase(object):
    metadata = None
    Session = None
    Base = object
    sessionfactory = sessionmaker()

    def initialize(self, host):
        engine = create_engine(host)
        self.metadata = MetaData(bind=engine)
        self.Session = scoped_session(self.sessionfactory)
        self.Base = declarative_base(metadata=self.metadata)

models = SQLAlchemyBase()
(The models inherit from models.Base)
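The ordering constraint described above (initialize first, only then define or import the models) can be sketched in plain Python without SQLAlchemy; all names in this snippet are hypothetical stand-ins, not part of the actual code:

```python
class DeferredRegistry(object):
    """Hypothetical stand-in for SQLAlchemyBase: attributes stay as
    placeholders until initialize() is called with real connection info."""
    Base = object          # models would subclass whatever this becomes
    engine_url = None

    def initialize(self, host):
        # in the real code this is where create_engine(), scoped_session()
        # and declarative_base() would be invoked
        self.engine_url = host
        self.Base = type("Base", (object,), {"host": host})

models = DeferredRegistry()
models.initialize("sqlite:///:memory:")

# only *after* initialize() are the models defined (or imported):
class Foo(models.Base):
    pass

print(Foo.host)  # prints the host configured before the model was loaded
```

The point of the sketch is that `models.Base` is rebuilt inside `initialize()`, so any model class defined before that call would subclass the wrong base.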
So SQLAlchemyBase will be imported into the main repository, the initialize method will be called with the host for the engine, and the models can then be imported. The main repository has its own classes with the same names as the models; these carry the additional methods that a normal mixin class would provide to extend functionality. However, I am unable to create model objects using the classes in the main repository, because I can't get the mappers to play nice with this unusual inheritance extending from the external models library. Additionally, the models library contains models with multiple levels of inherited polymorphic relationships. Here is an example similar to one of the more basic inherited polymorphic relationships:
Models Library
class Foo(models.Base):
    __tablename__ = "foo"
    id = Column(Integer, primary_key=True)
    type = Column(String)
    foo_bar_id = Column(Integer, ForeignKey("foo_bar.id"))
    foo_bar = relationship(FooBar, backref=backref("foos"))
    __mapper_args__ = {"polymorphic_on": type}

class Bar(Foo):
    __mapper_args__ = {"polymorphic_identity": "bar"}

class FooBar(models.Base):
    __tablename__ = "foo_bar"
    id = Column(Integer, primary_key=True)
Main Repository
from separate_library.models import models, Foo as BaseFoo, Bar as BaseBar, FooBar as BaseFooBar

class Foo(BaseFoo):
    @classmethod
    def custom_create_method(cls, **kw):
        foo_obj = cls(**kw)
        models.session.add(foo_obj)
        models.session.flush()

class Bar(BaseBar):
    pass

class FooBar(BaseFooBar):
    pass
The original error I was getting was something like this:
InvalidRequestError: One or more mappers failed to initialize - can't proceed with initialization of other mappers. Original exception was: Multiple classes found for path "Foo" in the registry of this declarative base. Please use a fully module-qualified path.
So I tried putting the full path in the relationships. Then it started giving me an error like this:
FlushError: Attempting to flush an item of type <class 'main_module.models.Foo'> as a member of collection "FooBar.foos". Expected an object of type <class 'separate_library.models.Foo'> or a polymorphic subclass of this type. If <class 'main_module.models.Foo'> is a subclass of <class 'separate_library.models.Foo'>, configure mapper Mapper|Foo|foo to load this subtype polymorphically, or set enable_typechecks=False to allow any subtype to be accepted for flush.
Essentially, the main problem is getting the classes in the main module to point to, and act like, the model classes. For example, when I try to create relationships, it says it expected an object of type separate_library.models.Foo instead of main_module.models.Foo. Additionally, in the polymorphic relationships, I can't get the polymorphic_identity to populate the polymorphic_on column: for example, Bar in the main repository will have an empty type column when the object is initially created.
One idea I tried was to add a metaclass to the declarative base in the models library and modify the mappers in the __init__ method during their initialization. I made progress this way, but haven't gotten it to work completely.
Sorry for the complex explanation, but this is a complex problem. I am not able to change anything about the models or the use case, unfortunately. I have to work within these constraints. If anyone can offer ideas on how to configure the mappers for the classes in the main repository to act like the models in the model library, I would be very grateful.
Upvotes: 2
Views: 2794
Reputation: 20518
There are three problems here:

1. In foo_bar = relationship(FooBar, backref=backref("foos")), the FooBar needs to refer to the subclass FooBar, not the BaseFooBar.
2. Bar needs to inherit from Foo for the inheritance mechanism to work; it cannot inherit from BaseFoo.
3. The base classes in the models library cannot themselves be mapped, or the declarative registry ends up with two mapped classes per entity, which is what the "Multiple classes found" error complains about.

The solutions to these problems, in order:

1. Resolve the relationship target lazily, so that it points at whichever subclass ends up implementing FooBar.
2. Inject the subclasses into the bases with a metaclass on Base, because SQLAlchemy's declarative extension makes liberal use of metaclasses. We'll see that the metaclass approach can also solve problem 1 in a flexible way.
3. Mark the base classes with __abstract__ = True.
.Simplest possible example:
from sqlalchemy import *
from sqlalchemy.ext.declarative import declarative_base, declared_attr, DeclarativeMeta

class BaseMeta(DeclarativeMeta):
    def __new__(cls, name, bases, attrs):
        if not attrs.get("__abstract__"):
            if len(bases) != 1:
                # you'll need to handle multiple inheritance if you have
                # that as well
                raise NotImplementedError()
            base, = bases
            extra_bases = tuple(b._impl for b in base.__bases__
                                if hasattr(b, "_impl"))
            bases += extra_bases
            self = super(BaseMeta, cls).__new__(cls, name, bases, attrs)
            if getattr(base, "__abstract__", False):
                base._impl = self
            return self
        else:
            return super(BaseMeta, cls).__new__(cls, name, bases, attrs)

Base = declarative_base(metaclass=BaseMeta)
class BaseFoo(Base):
    __abstract__ = True
    __tablename__ = "foo"
    id = Column(Integer, primary_key=True)
    type = Column(String)

    @declared_attr
    def foo_bar_id(cls):
        return Column(Integer, ForeignKey("foo_bar.id"))

    @declared_attr
    def foo_bar(cls):
        return relationship(lambda: BaseFooBar._impl, backref=backref("foos"))

    __mapper_args__ = {"polymorphic_on": type}

class BaseBar(BaseFoo):
    __abstract__ = True
    __mapper_args__ = {"polymorphic_identity": "bar"}

class BaseFooBar(Base):
    __abstract__ = True
    __tablename__ = "foo_bar"
    id = Column(Integer, primary_key=True)
class Foo(BaseFoo):
    @classmethod
    def custom_create_method(cls, **kw):
        foo_obj = cls(**kw)
        models.session.add(foo_obj)
        models.session.flush()

class Bar(BaseBar):
    pass

class FooBar(BaseFooBar):
    pass

print(Bar.__bases__)  # (<class '__main__.BaseBar'>, <class '__main__.Foo'>)
The basic idea of the metaclass is to inject the class Foo into the bases of Bar, based on the fact that BaseBar inherits from BaseFoo, and the fact that Foo implements BaseFoo (by inheriting from it).
You can add more complicated stuff on top, such as multiple inheritance support or graceful error handling (e.g. warning the user that they are missing a subclass for one of your base classes, or that they have provided multiple subclasses for the same base class).
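One possible shape for that kind of graceful handling, again sketched in plain Python with hypothetical names: the metaclass can refuse to register a second concrete subclass for the same abstract base.

```python
class StrictMeta(type):
    def __new__(cls, name, bases, attrs):
        if bases and not attrs.get("__abstract__", False):
            base = bases[0]
            if getattr(base, "__abstract__", False):
                # check the base's own dict so we only see registrations
                # made directly on it, not ones inherited from ancestors
                if "_impl" in base.__dict__:
                    raise TypeError("%s already implemented by %s" % (
                        base.__name__, base.__dict__["_impl"].__name__))
                self = super(StrictMeta, cls).__new__(cls, name, bases, attrs)
                base._impl = self
                return self
        return super(StrictMeta, cls).__new__(cls, name, bases, attrs)

class AbstractFoo(metaclass=StrictMeta):
    __abstract__ = True

class Foo(AbstractFoo):       # registers fine as AbstractFoo._impl
    pass

try:
    class SecondFoo(AbstractFoo):  # second implementation is rejected
        pass
except TypeError as exc:
    print("rejected:", exc)
```

A "missing subclass" check would be the mirror image: walk the abstract classes at configuration time and report any that never received an _impl.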
Upvotes: 3