Jens

Reputation: 9130

SQLAlchemy: inspection of column type size gives odd result

Is it possible to query a column for the maximum size (in bytes) of the data that can be stored in it? For example, say I declare a column using

content = Column(LargeBinary)

then how can I query information about content? Following the inspection approach suggested in this question:

table = File.__table__
field = table.c["content"]
print("max=%s" % field.type.length)

I get max=None, whereas I would have expected max=65535, given that field.type is BLOB. What am I doing wrong?
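For reference, here is a minimal sketch reproducing the setup in the question (the model body and table name are assumptions, since the question only shows the column; MySQL is assumed as the target backend):

```python
from sqlalchemy import Column, Integer, LargeBinary
from sqlalchemy.dialects import mysql
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class File(Base):
    __tablename__ = "file"  # assumed; the question does not show the table name
    id = Column(Integer, primary_key=True)
    content = Column(LargeBinary)

field = File.__table__.c["content"]
print(field.type.length)                            # None - no length was given
print(field.type.compile(dialect=mysql.dialect()))  # BLOB
```

Compiling the type against the MySQL dialect shows the rendered DDL type (BLOB), while type.length only reflects what was passed to the type constructor.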

Upvotes: 5

Views: 1537

Answers (1)

alecxe

Reputation: 473853

As far as I understand, the maximum length of a column is dialect-specific, and this information is not stored in the SQLAlchemy sources. However, depending on the backend, you can retrieve it dynamically. For example, for MySQL you can read it from the INFORMATION_SCHEMA.COLUMNS table:

from sqlalchemy import and_, select, text

q = select(text("CHARACTER_MAXIMUM_LENGTH")).select_from(text("INFORMATION_SCHEMA.COLUMNS")).where(
    and_(text("table_name = :table_name"),
         text("table_schema = :table_schema"),
         text("column_name = :column_name")))
result = session.execute(q, {"table_name": File.__tablename__,
                             "table_schema": schema,
                             "column_name": column_name})
print(result.first()[0])  # prints 65535

I bet there is a more beautiful way to write the query.
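One arguably cleaner variant is to describe the slice of INFORMATION_SCHEMA.COLUMNS you need as a Core Table object, so the filters become real column comparisons with bound parameters. This is only a sketch: the table/schema/column names in the where clause are placeholder values, and only the columns the query needs are declared:

```python
from sqlalchemy import Column, Integer, MetaData, String, Table, and_, select

# Hypothetical, hand-written stub of INFORMATION_SCHEMA.COLUMNS with just
# the columns this query touches (normally you could also reflect it).
meta = MetaData()
columns = Table(
    "COLUMNS", meta,
    Column("TABLE_NAME", String),
    Column("TABLE_SCHEMA", String),
    Column("COLUMN_NAME", String),
    Column("CHARACTER_MAXIMUM_LENGTH", Integer),
    schema="INFORMATION_SCHEMA",
)

q = select(columns.c.CHARACTER_MAXIMUM_LENGTH).where(
    and_(columns.c.TABLE_NAME == "file",        # placeholder table name
         columns.c.TABLE_SCHEMA == "mydb",      # placeholder schema name
         columns.c.COLUMN_NAME == "content"))   # placeholder column name

# The rendered SQL uses bound parameters instead of string interpolation.
print(q)
```

Executing q with session.execute(q) against a MySQL connection would then return the same CHARACTER_MAXIMUM_LENGTH value, without any manual string formatting.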

field.type.length refers to the user-defined maximum length which is initialized to None if not provided explicitly:

# sqlalchemy.types.LargeBinary.__init__
def __init__(self, length=None):
    self.length = length

And since you are not providing the length argument, you are getting None.
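To illustrate: if you do pass an explicit length to the type, the inspection from the question returns it. A minimal sketch (the model name here is made up for the example):

```python
from sqlalchemy import Column, Integer, LargeBinary
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class BoundedFile(Base):  # hypothetical model, not from the question
    __tablename__ = "bounded_file"
    id = Column(Integer, primary_key=True)
    content = Column(LargeBinary(65535))  # explicit length

print(BoundedFile.__table__.c["content"].type.length)  # 65535
```

So type.length is purely a record of the constructor argument, not the backend's actual storage limit.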

Upvotes: 4
