amans san

Reputation: 43

Python variables in mysql request

I'm quite new to Python (Python 3.4.6) :)
I'm trying to insert some rows into a MySQL database, but with variable column names.
To start with, I have a dictionary list_hosts.
Here is my code:

import mysql.connector
import time
db = mysql.connector.connect(host='localhost', user='xxxxx', passwd='xxxxx', database='xxxxx')
cursor = db.cursor()
now_db = time.strftime('%Y-%m-%d %H:%M:%S')
key_db = ""
value_ex = ""
value_db = ""
for key, value in list_hosts.items():
    key_db += key+", "
    value_ex += "%s, "
    value_db += "\""+value+"\", "
key_db = key_db.strip(" ")
key_db = key_db.strip(",")
value_ex = value_ex.strip(" ")
value_ex = value_ex.strip(",")
value_db = value_db.strip(" ")
value_db = value_db.strip(",")
add_host = ("INSERT INTO nagios_hosts (date_created, date_modified, "+key_db+") VALUES ("+value_ex+")")
data_host = ("\""+now_db+"\", \""+now_db+"\", "+value_db)
cursor.execute(add_host, data_host)
db.commit()
db.close()

Example of list_hosts:

OrderedDict([('xxxx1', 'data1'), ('xxxx2', 'data2'), ('xxxx3', 'data3'), ('xxxx4', 'data4'), ('xxxx5', 'data5'), ('xxxx6', 'data6')])

I've simplified the code, of course. I did it like this because I never have the same number of items in the dictionary. I'm trying to create something like this:

add_host - INSERT INTO TABLE (date_created, date_modified, xxxx1, xxxx2, xxxx3, xxxx4, xxxx5, xxxx6) VALUES (%s, %s, %s, %s, %s, %s)
data_host - now, now, data1, data2, data3, data4, data5, data6

The number of xxxx columns is never the same. They all exist in the DB, but I don't need to fill every column for each item in the dictionary.

When I execute it, I get this error:

mysql.connector.errors.ProgrammingError: 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near 'xxxxxxxxxxxxxxxxxxxxx' at line 1

As I'm just starting with Python, I'm sure there's a lot that could be cleaned up too... don't hesitate :)

Upvotes: 0

Views: 499

Answers (2)

bruno desthuilliers

Reputation: 77912

Here's a canonical Python 3 (Python 2 compatible) solution:

import time
from collections import OrderedDict

list_hosts = OrderedDict([("field1", "value1"), ("field2", "value2"), ("fieldN", "valueN")])

# builds a comma-separated string of db placeholders for the values:
placeholders = ", ".join(["%s"] * (len(list_hosts) + 2))

# builds a comma-separated string of field names
fields = ", ".join(("date_created", "date_modified") + tuple(list_hosts.keys()))

# builds a tuple of values including the dates
now_db = time.strftime('%Y-%m-%d %H:%M:%S')
values = (now_db, now_db) + tuple(list_hosts.values())


# build the SQL query:
sql = "INSERT INTO nagio_hosts({}) VALUES({})".format(fields, placeholders)

# and safely execute it 
cursor.execute(sql, values)
db.commit()
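
With the sample list_hosts above, the generated query and parameter tuple end up looking like this (the connector takes care of quoting and escaping the values, so nothing needs to be quoted by hand):

print(sql)
# INSERT INTO nagios_hosts(date_created, date_modified, field1, field2, fieldN) VALUES(%s, %s, %s, %s, %s)
print(values)
# (now_db, now_db, 'value1', 'value2', 'valueN')   # where now_db is the formatted timestamp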

Upvotes: 1

Ildar Akhmetov

Reputation: 1431

As @khelwood mentioned in the comments, you should use parameterized queries.

If the number of columns you're inserting varies, you can build the column list and the tuple of values dynamically and still use a parameterized query.

cursor.execute() accepts two parameters:

  • a query as a string;

  • parameters as a tuple.

The idea is to generate the string and the tuple and pass those to cursor.execute().

You'll need something like this:

list_hosts = {'xxxx1': 'data1', 'xxxx2': 'data2', 'xxxx3': 'data3', 'xxxx4': 'data4'}

keys = [] # creating a list for keys
values = () # creating a tuple for values
for key, value in list_hosts.items():
    keys.append(key)
    values = values + (value,)

keys_str = ', '.join(keys)
ps = ', '.join(['%s'] * len(list_hosts))

query = "INSERT INTO tbl (%s) VALUES (%s)" % (keys_str, ps)

print(query)
# INSERT INTO tbl (xxxx1, xxxx2, xxxx3, xxxx4) VALUES (%s, %s, %s, %s)

cursor.execute(query, values)

Just tried it on sample data, works fine!
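
If you also need the date_created and date_modified columns from the question, one option is to prepend them to the keys and values before building the query; a sketch along the same lines, with now_db built as in the question:

import time

now_db = time.strftime('%Y-%m-%d %H:%M:%S')

# prepend the two date columns and their values, then rebuild the query as above
keys = ['date_created', 'date_modified'] + keys
values = (now_db, now_db) + values

keys_str = ', '.join(keys)
ps = ', '.join(['%s'] * len(keys))
query = "INSERT INTO tbl (%s) VALUES (%s)" % (keys_str, ps)

cursor.execute(query, values)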

Upvotes: 0
