user185887

Reputation: 713

ignoring unconsumed columns while inserting into postgres database using sqlalchemy

I want to insert data into a table from a Python dictionary, but my dictionary contains more keys than the table has columns, so the insert fails with sqlalchemy.exc.CompileError: Unconsumed column names: column_name. I want to insert only the keys that match existing columns and ignore the extra ones.

How can I do that using SQLAlchemy?

The code I am using is

from sqlalchemy import *
from sqlalchemy.dialects import postgresql

db = create_engine('postgresql+psycopg2://localhost:5432/postgres')
meta = MetaData()
meta.bind = db

data = [{
    'a': 1,
    'b': 2,
    'c': 3,
    'd': 4,
    'e': 5,
    'f': 6
}]

ins_stmt = postgresql.insert(table_object).values(data)
db.execute(ins_stmt)

where my table_object contains columns a,b,c,d,e.

P.S. - I am using SQLAlchemy 1.4.

Upvotes: 4

Views: 2995

Answers (1)

snakecharmerb

Reputation: 55669

I'm not aware of any way to make insert discard the extra items, so they must be filtered out beforehand. You can do this by comparing each dictionary's keys against the table's column names.

column_names = table_object.columns.keys()
fixed = [{k: v for k, v in d.items() if k in column_names} for d in data]

ins_stmt = postgresql.insert(table_object).values(fixed)
with db.connect() as conn:
    with conn.begin():
        conn.execute(ins_stmt)
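To see the filtering end to end, here is a minimal, self-contained sketch of the same approach. It uses an in-memory SQLite database instead of Postgres (so it runs without a server) and the dialect-neutral insert(), but the key-filtering step is identical; the table name t and the column definitions are assumptions for illustration.

```python
from sqlalchemy import Column, Integer, MetaData, Table, create_engine, insert

# In-memory SQLite stands in for Postgres here, purely for illustration.
engine = create_engine("sqlite://")
meta = MetaData()
table_object = Table(
    "t", meta,
    Column("a", Integer), Column("b", Integer), Column("c", Integer),
    Column("d", Integer), Column("e", Integer),
)
meta.create_all(engine)

# The dict has an extra key 'f' that the table lacks, as in the question.
data = [{"a": 1, "b": 2, "c": 3, "d": 4, "e": 5, "f": 6}]

# Drop any keys that are not column names before building the INSERT.
column_names = table_object.columns.keys()
fixed = [{k: v for k, v in d.items() if k in column_names} for d in data]

with engine.begin() as conn:
    conn.execute(insert(table_object).values(fixed))
    rows = conn.execute(table_object.select()).fetchall()

print(rows)  # [(1, 2, 3, 4, 5)]
```

Without the filtering step, the same execute raises the CompileError from the question; with it, only columns a-e are inserted.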

Upvotes: 4
