BetaDev

Reputation: 4684

How to prevent multiple entries (Duplicate) in SQLite3 using Python

I am inserting data into SQLite3 using a Python script, but the script inserts every record three times: three identical rows for a single insert.

print flag
if flag == 1:
    print colored('Authorized', 'green')
    pi1.write(6, 0)
    conn1 = sqlite3.connect('db/local.db')
    c = conn1.cursor()
    i = datetime.datetime.now()
    date = i.strftime("%Y-%m-%d %H:%M:%S")
    c.execute("INSERT INTO lock_logs (msg,created_at,card,date,host_ip,door_name,state) VALUES ('Authorized card','"+date+"', "+reader_value+", '"+date+"', '"+get_ip_address('eth0')+"', 'NL_1','opened')")
    conn1.commit()
    conn1.close()
    time.sleep(3)
    pi1.write(6, 1)
    print "inserted"
    # cron will send log to server
else:
    print colored('Not Authorized', 'red')
    pi1.write(6, 1)
    # store log to local sqlite file
    conn2 = sqlite3.connect('db/local.db')
    c = conn2.cursor()
    i = datetime.datetime.now()
    date = i.strftime("%Y-%m-%d %H:%M:%S")
    c.execute("INSERT INTO lock_logs (msg,created_at,card,date,host_ip,door_name,state) VALUES ('Not Authorized card','"+date+"', "+reader_value+", '"+date+"', '"+get_ip_address('eth0')+"', 'NL_1','failed')")
    conn2.commit()
    conn2.close()
    # cron will send log to server
print "_____________________________________"

Note: the if/else block executes only once, yet three records are inserted instead of one.

Sample Data into DB file (local.db)

545|2017-01-28 20:35:44||Authorized card|43448981|2017-01-28 20:35:44|172.20.100.15|NL_1|opened|0
546|2017-01-28 20:35:44||Authorized card|43448981|2017-01-28 20:35:44|172.20.100.15|NL_1|opened|0
547|2017-01-28 20:35:44||Authorized card|43448981|2017-01-28 20:35:44|172.20.100.15|NL_1|opened|0

How can I prevent these duplicate entries?

Upvotes: 2

Views: 1812

Answers (1)

Richard

Reputation: 2355

Based on the sample data, the card id and timestamp are identical across the duplicated rows. You can create a unique index on these two fields, which will prevent duplicate entries from being inserted into the database.

CREATE UNIQUE INDEX index_card_and_created_at_on_lock_logs ON lock_logs(card, created_at);

This unique index will reject any insert where these two fields duplicate an existing row.
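As a minimal sketch of what this looks like from the Python side, assuming the table layout shown in the question (the card number and IP address here are copied from the sample rows, not real values): create the index once, and a duplicate insert will then raise sqlite3.IntegrityError, which you can catch and skip.

import sqlite3
import datetime

conn = sqlite3.connect('db/local.db')
c = conn.cursor()

# One-time setup: enforce uniqueness on (card, created_at).
c.execute("CREATE UNIQUE INDEX IF NOT EXISTS "
          "index_card_and_created_at_on_lock_logs "
          "ON lock_logs(card, created_at)")

date = datetime.datetime.now().strftime("%Y-%m-%d %H:%M:%S")
try:
    # Parameterized insert; a duplicate (card, created_at) pair
    # now raises IntegrityError instead of creating another row.
    c.execute("INSERT INTO lock_logs (msg, created_at, card, date, host_ip, door_name, state) "
              "VALUES (?, ?, ?, ?, ?, ?, ?)",
              ('Authorized card', date, '43448981', date,
               '172.20.100.15', 'NL_1', 'opened'))
    conn.commit()
except sqlite3.IntegrityError:
    # The same card was already logged at this timestamp; skip it.
    pass
conn.close()

If you would rather skip duplicates silently, SQLite also supports INSERT OR IGNORE INTO ..., which drops the conflicting row without raising an error.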

Upvotes: 3
