Reputation: 4740
Not sure what I'm missing here: this code runs without any error message, but nothing ends up in the table. I'm loading CSV values in three columns into a MySQL table.
import csv
import MySQLdb

mydb = MySQLdb.connect(host='localhost',
                       user='root',
                       passwd='',
                       db='mydb')
cursor = mydb.cursor()
csv_data = csv.reader(file('students.csv'))
for row in csv_data:
    cursor.execute('INSERT INTO testcsv(names, \
                    classes, mark )' \
                    'VALUES("%s", "%s", "%s")',
                    row)
#close the connection to the database.
cursor.close()
print "Done"
I'd appreciate it if someone could have a look.
Upvotes: 60
Views: 153077
Reputation: 2208
If you do not have the pandas and SQLAlchemy libraries, install them using pip:
pip install pandas
pip install sqlalchemy
We can use pandas and SQLAlchemy to insert the CSV directly into the database:
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine('mysql://root:*Enter password here*@localhost/*Enter database name here*')  # enter your password and database name here
df = pd.read_csv("Excel_file_name.csv", sep=',', quotechar='\'', encoding='utf8')  # replace Excel_file_name with your CSV file name
df.to_sql('Table_name', con=engine, index=False, if_exists='append')  # replace Table_name with your SQL table name
Upvotes: 22
Reputation: 39
The fastest way is to use MySQL's bulk loader, the LOAD DATA INFILE statement. It is far faster than anything you can come up with in Python. If you have to use Python, you can issue the LOAD DATA INFILE statement from Python itself.
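A minimal sketch of that approach, reusing students.csv and the testcsv table from the question (the local_infile connection flag, and LOCAL being allowed on the server, are assumptions about your setup):

import MySQLdb

# Sketch: bulk-load the CSV with LOAD DATA LOCAL INFILE.
# Assumes local_infile is enabled on the server as well.
mydb = MySQLdb.connect(host='localhost', user='root', passwd='',
                       db='mydb', local_infile=1)
cursor = mydb.cursor()
cursor.execute("""
    LOAD DATA LOCAL INFILE 'students.csv'
    INTO TABLE testcsv
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\\n'
""")
mydb.commit()
cursor.close()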
Upvotes: 0
Reputation:
Using pymysql, if it helps:
import pymysql
import csv

db = pymysql.connect(host="localhost", user="root",
                     password="12345678", database="data")
cursor = db.cursor()
csv_data = csv.reader(open('test.csv'))
next(csv_data)  # skip the header row
for row in csv_data:
    cursor.execute('INSERT INTO PM(col1,col2) VALUES(%s, %s)', row)
db.commit()
cursor.close()
Upvotes: 5
Reputation: 907
The above answer seems good, but another way of doing this is to add the autocommit option to the database connection. This automatically commits every operation performed on the database, so you don't have to call mydb.commit() every time. Note that with autocommit each statement is committed on its own, so you cannot roll back the whole import as a single transaction.
mydb = MySQLdb.connect(host='localhost',
                       user='root',
                       passwd='',
                       db='mydb',
                       autocommit=True)
Upvotes: 11
Reputation: 525
from __future__ import print_function
import csv
import MySQLdb

conn = MySQLdb.connect(host="localhost", port=3306, user="root", passwd="", db="database")
cursor = conn.cursor()

# (Re)create the target table. Run the statements one at a time;
# MySQLdb does not execute multiple statements per execute() by default.
cursor.execute("DROP TABLE IF EXISTS test1")
cursor.execute("CREATE TABLE test1 (policyID int, statecode varchar(255), county varchar(255))")

sql_statement = "INSERT INTO test1(policyID, statecode, county) VALUES (%s, %s, %s)"
with open('C:/Users/Desktop/Code/python/sample.csv') as csvfile:
    reader = csv.DictReader(csvfile, delimiter=',')
    rows = []
    for row in reader:
        print(row['policyID'], row['statecode'], row['county'])
        rows.append((row['policyID'], row['statecode'], row['county']))

# Insert all rows in one batch and commit once.
cursor.executemany(sql_statement, rows)
conn.commit()
cursor.close()
Upvotes: 5
Reputation: 31
If it is a pandas data frame you could do:
csv_data.to_sql(con=mydb, name='<the name of your table>',
                if_exists='replace', flavor='mysql')

to avoid the for loop.
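Note that the flavor argument has been removed in newer pandas versions, where con should be a SQLAlchemy engine rather than a raw MySQLdb connection. A minimal sketch for current pandas (the connection string, file name, and table name are placeholders for your setup):

import pandas as pd
from sqlalchemy import create_engine

# Sketch: flavor is gone; pass a SQLAlchemy engine as con instead.
engine = create_engine('mysql://root:password@localhost/mydb')
csv_data = pd.read_csv('students.csv')
csv_data.to_sql(name='testcsv', con=engine, if_exists='replace', index=False)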
Upvotes: 2
Reputation: 34698
I think you have to do mydb.commit() after all the inserts.
Something like this
import csv
import MySQLdb

mydb = MySQLdb.connect(host='localhost',
                       user='root',
                       passwd='',
                       db='mydb')
cursor = mydb.cursor()
csv_data = csv.reader(open('students.csv'))
for row in csv_data:
    # Leave the %s placeholders unquoted; the driver quotes and
    # escapes the values itself.
    cursor.execute('INSERT INTO testcsv(names, classes, mark) '
                   'VALUES(%s, %s, %s)',
                   row)
# Commit the inserts, then close the connection to the database.
mydb.commit()
cursor.close()
print "Done"
Upvotes: 86