lukik

Reputation: 4060

How can I load initial data into a database using SQLAlchemy?

I want to be able to load data automatically upon creation of tables using SQLAlchemy.

In Django, you have fixtures, which let you easily pre-populate your database with data when a table is created. I found this useful especially for basic "lookup" tables, e.g. product_type or student_type, which contain just a few rows, or even a table like currencies, which holds all the currencies of the world that you don't want to key in over and over again whenever you destroy your models/classes.

My current app isn't using Django; I have SQLAlchemy. How can I achieve the same thing? I want the app to know that the database is being created for the first time and populate some tables with data accordingly.

Upvotes: 13

Views: 8340

Answers (3)

adrihanu

Reputation: 1433

You can use an event listener to pre-populate the database with data upon creation of a table.

Let's say you have a ProductType model in your code:

from sqlalchemy import event, Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base
Base = declarative_base()

class ProductType(Base):
    __tablename__ = 'product_type'
    id = Column(Integer, primary_key=True)
    name = Column(String(100))

First, you need to define a callback function, which will be executed when the table is created:

def insert_data(target, connection, **kw):
    connection.execute(target.insert(), [{'id': 1, 'name': 'spam'}, {'id': 2, 'name': 'eggs'}])

Then you just add the event listener:

event.listen(ProductType.__table__, 'after_create', insert_data)
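Putting the pieces together, a plain `Base.metadata.create_all()` call triggers the listener. Here is a minimal runnable sketch using an in-memory SQLite engine (the engine URL is just for the demo); because `after_create` fires only when the table is actually created, re-running `create_all()` against an existing database will not duplicate the rows:

```python
from sqlalchemy import create_engine, event, Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class ProductType(Base):
    __tablename__ = 'product_type'
    id = Column(Integer, primary_key=True)
    name = Column(String(100))

def insert_data(target, connection, **kw):
    # target is the Table being created; connection is the active DB connection
    connection.execute(target.insert(), [{'id': 1, 'name': 'spam'},
                                         {'id': 2, 'name': 'eggs'}])

event.listen(ProductType.__table__, 'after_create', insert_data)

engine = create_engine('sqlite://')   # in-memory database, for illustration only
Base.metadata.create_all(engine)      # creates the table and fires the listener

with engine.connect() as conn:
    rows = conn.execute(ProductType.__table__.select()).fetchall()
    print(rows)  # the two seeded rows
```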

Upvotes: 21

Harshal Parekh

Reputation: 6017

Background

I am migrating one of my smaller projects from Django to FastAPI. I have static data loaded into MySQL using Django fixtures and I wanted to maintain that functionality with FastAPI as well.

This basic Python script works for me (at least for now), but you may modify it according to your needs.

Folder Structure
- fixtures
    - data
        - table_name.json
        - table_name2.json
    - main.py

You can have as many .json files here as you need. The script reads all the files under this directory and performs the equivalent of loaddata in Django.

Fixtures File
[
  {
    "id": int,
    "name": str
  }
]

Every JSON object here should translate to an INSERT query in MySQL. For example, if name is a required column, you must include it here.
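As a concrete illustration of that shape, a hypothetical data/product_type.json holding two rows would parse like this (the file contents are inlined as a string here just to keep the snippet self-contained):

```python
import json

# Contents of a hypothetical fixtures/data/product_type.json file:
fixture_text = """
[
  {"id": 1, "name": "spam"},
  {"id": 2, "name": "eggs"}
]
"""

rows = json.loads(fixture_text)
# Each object maps to one INSERT, e.g.
#   INSERT INTO product_type (id, name) VALUES (1, 'spam');
print(rows)
```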

Complete Example
from json import loads
from os import listdir
from os.path import dirname

from sqlalchemy import create_engine, delete, insert, Table
from sqlalchemy.schema import MetaData


fixtures_directory: str = dirname(__file__) + "/data/"

engine = create_engine("sqlite:///your_database_path", connect_args={"check_same_thread": False})
conn = engine.connect()  # changes are committed explicitly in close_connection()

meta = MetaData()
meta.reflect(bind=engine)


def close_connection():
    conn.commit()
    conn.close()


def read_file(file_path: str):
    with open(file_path, "r") as f:
        return loads(f.read())


def insert_data(fixtures: list, table_name: str):
    table = meta.tables[table_name]  # already reflected above by meta.reflect()
    conn.execute(delete(table))
    for item in fixtures:
        conn.execute(insert(table).values(item))


def get_file_paths():
    return listdir(fixtures_directory)


def load_fixtures():
    file_paths: list = get_file_paths()
    for file_path in file_paths:
        fixtures: list = read_file(file_path=fixtures_directory + file_path)
        insert_data(fixtures=fixtures, table_name=file_path.removesuffix(".json"))
    close_connection()


load_fixtures()

Since this is a standalone script, you need to ensure there are no relative imports here.


Why am I truncating the table?
conn.execute(delete(table))

I delete all the rows in the table before inserting the newer ones. I only care about the static data in the table, and no update operations take place here. You need to ensure your insert will not fail with an IntegrityError. If you wish to update existing data instead, you may use the ORM's Session.merge() instead.
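For the update-instead-of-truncate case, one alternative (not part of the original script) is an ORM Session with merge(), which inserts a row when its primary key is absent and updates it in place when it already exists. A minimal sketch, assuming the same ProductType model and an in-memory SQLite engine for illustration:

```python
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.orm import declarative_base, Session

Base = declarative_base()

class ProductType(Base):
    __tablename__ = 'product_type'
    id = Column(Integer, primary_key=True)
    name = Column(String(100))

engine = create_engine('sqlite://')  # in-memory database, for illustration only
Base.metadata.create_all(engine)

fixtures = [{'id': 1, 'name': 'spam'}, {'id': 2, 'name': 'eggs'}]

with Session(engine) as session:
    for item in fixtures:
        # merge() inserts the row if the primary key is new,
        # otherwise updates the existing row; re-running is safe.
        session.merge(ProductType(**item))
    session.commit()

with Session(engine) as session:
    names = [pt.name for pt in session.query(ProductType).order_by(ProductType.id)]
    print(names)
```

Running the load twice is idempotent, so you avoid IntegrityError without having to delete anything first.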

Do not forget!
conn.commit()

Upvotes: 0

Hieu Nguyen

Reputation: 8623

The short answer is no: SQLAlchemy doesn't provide an equivalent of Django's dumpdata and loaddata.

There is https://github.com/kvesteri/sqlalchemy-fixtures, which might be useful for you, but the workflow is different.

Upvotes: 2
