Houy Narun

Reputation: 1725

Double-inserted records in Flask-SQLAlchemy connected to PostgreSQL?

Very rarely, I run into a problem where a record I insert into the table Tbl_CUSTOMER ends up duplicated, each copy with its own auto-generated ID from Postgres.

I have no idea why, but I suspected it could be related to Postgres vacuum running at the same time. To confirm that, I ran a vacuum while inserting records, but the problem did not occur, so I have not been able to reproduce the issue, find the root cause, and fix it.

models.py

class Tbl_CUSTOMER(db.Model):
    ID              =   db.Column(db.Numeric(25, 9), primary_key=True, autoincrement=True)
    PotentialCustomer   =   db.Column(db.String(12))
    FirstNameEn     =   db.Column(db.String(35))
    LastNameEn      =   db.Column(db.String(35))
    FirstNameKh     =   db.Column(db.String(35))
    LastNameKh      =   db.Column(db.String(35))
    Salutation      =   db.Column(db.String(4))
    Gender          =   db.Column(db.String(6))
    DateOfBirth     =   db.Column(db.String(10))
    CountryOfBirth  =   db.Column(db.String(2))
    Nationality     =   db.Column(db.String(2))
    ProvinceOfBirth =   db.Column(db.String(3))

views.py

dataInsert = Tbl_CUSTOMER(
                PotentialCustomer   =   request.form['PotentialCustomer'],
                FirstNameEn     =   request.form['FirstNameEn'],
                LastNameEn      =   request.form['LastNameEn'],
                FirstNameKh     =   request.form['FirstNameKh'],
                LastNameKh      =   request.form['LastNameKh'],
                Salutation      =   request.form['Salutation'],
                Gender          =   request.form['Gender'],
                DateOfBirth     =   request.form['DateOfBirth'],
                CountryOfBirth  =   request.form['CountryOfBirth'],
                Nationality     =   request.form['Nationality'],
                ProvinceOfBirth =   request.form['ProvinceOfBirth']
            )

db.session.add(dataInsert)
db.session.commit()

This problem does not happen frequently. So what is causing it, and how can I fix it to prevent it from happening in the future? Thanks.

Upvotes: 4

Views: 1006

Answers (1)

Thiago Mata

Reputation: 2959

If you create a unique key (or replace your primary key) with a hash computed from all the values of the row, that may help you see when this problem is happening. With this hash column you can decide what should happen when your system gets the same value (the same hash). One option, for example, is simply to ignore the new row and keep the old one; another is to overwrite it, and so on.
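
Something along these lines, for example. This is only a sketch: it assumes the same Flask-SQLAlchemy db object, model, and form fields as in your question, and the column RowHash and the helper row_hash() are names I made up for illustration.

import hashlib

from sqlalchemy.exc import IntegrityError

# All customer fields, in a fixed order, so the digest is reproducible.
FIELDS = ('PotentialCustomer', 'FirstNameEn', 'LastNameEn', 'FirstNameKh',
          'LastNameKh', 'Salutation', 'Gender', 'DateOfBirth',
          'CountryOfBirth', 'Nationality', 'ProvinceOfBirth')

def row_hash(form):
    """Return a SHA-256 hex digest built from all customer fields."""
    joined = '|'.join(form[f] for f in FIELDS)
    return hashlib.sha256(joined.encode('utf-8')).hexdigest()

# models.py: add a unique column for the digest (64 hex characters), e.g.
#     RowHash = db.Column(db.String(64), unique=True, nullable=False)

# views.py: compute the digest, insert, and let the UNIQUE constraint
# reject a second identical row (the "keep the old one" option).
dataInsert = Tbl_CUSTOMER(RowHash=row_hash(request.form),
                          **{f: request.form[f] for f in FIELDS})
db.session.add(dataInsert)
try:
    db.session.commit()
except IntegrityError:
    db.session.rollback()   # same hash already stored: drop the new row

If you would rather overwrite instead of ignore, PostgreSQL's INSERT ... ON CONFLICT (exposed in SQLAlchemy through the postgresql dialect's insert() construct and on_conflict_do_update()) can update the existing row in place.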

The chance of getting the same hash value from two different rows is so small that I would not even worry about it. See this thread https://crypto.stackexchange.com/questions/1170/best-way-to-reduce-chance-of-hash-collisions-multiple-hashes-or-larger-hash if you want more details on that.
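
For a rough sense of how small that chance is, here is a back-of-the-envelope birthday-bound estimate for SHA-256; the figure of one billion rows is just an example I picked:

n = 10**9                      # example: one billion distinct rows
p = n * (n - 1) / 2 / 2**256   # birthday-bound collision probability for SHA-256
print(p)                       # roughly 4e-60, negligible in practice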

Upvotes: 1
