Tushar Bharadwaj

Reputation: 309

Converting .sql file to a text table using Python

I have a .sql file exported directly from phpMyAdmin that contains the data of all the users. Is it possible to get the list of all the column names and the values from the .sql file without executing it?

I'm currently using Python's MySQLdb library to execute the .sql file and produce the output below without any hassle, but I'm trying to skip the execution step and convert the dump directly to a .txt file.
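For reference, my current approach looks roughly like this (assuming the dump has already been imported into a database; the credentials, database name and output file name below are just placeholders):

import MySQLdb

# placeholder connection details
conn = MySQLdb.connect(host='localhost', user='user', passwd='password', db='mydb')
cur = conn.cursor()
cur.execute('SELECT * FROM login_logs')

with open('login_logs.txt', 'w') as out:
    # cursor.description holds one 7-tuple per column; the first item is the column name
    out.write(' | '.join(col[0] for col in cur.description) + '\n')
    for row in cur.fetchall():
        out.write(' | '.join(str(field) for field in row) + '\n')

conn.close()

This works, but it needs a running MySQL server and the import step, which is the part I'd like to avoid.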

Example:

Input file content:

CREATE TABLE IF NOT EXISTS `login_logs` (
  `ID` int(11) NOT NULL AUTO_INCREMENT,
  `Username` varchar(20) NOT NULL,
  `Time` int(11) NOT NULL,
  `IP` varchar(20) NOT NULL,
  `Type` enum('Success','Failed') NOT NULL,
  PRIMARY KEY (`ID`)
) ENGINE=InnoDB  DEFAULT CHARSET=utf8 



INSERT INTO `login_logs` (`ID`, `Username`, `Time`, `IP`, `Type`) VALUES
(1, 'User1', 1411318511, '127.0.0.1', 'Failed'),
(2, 'User2', 1411318674, '127.0.0.1', 'Failed'),
(3, 'User3', 1411318683, '127.0.0.1', 'Success'),
(4, 'User12', 1411319092, '127.0.0.1', 'Success'),
(5, 'User3', 1411319112, '127.0.0.1', 'Success');

The output would be something like


ID  |  Username  |     Time    |     IP     |  Type
1   |    User1   | 1411318511  | 127.0.0.1  | Failed
2   |    User2   | 1411318674  | 127.0.0.1  | Failed

This would later be saved in a text file.

Upvotes: 1

Views: 3786

Answers (2)

Oliver S.

Reputation: 39

I understand you'd like to extract the column names and the values from the .sql file and write them to a .txt file with pipes as column separators. The code below should do the job. However, it's tailored exactly to your input file example, especially the text structure of the INSERT INTO section. Once that changes (e.g. a line break after 'INSERT INTO'), it won't work.

Hope this helps.

#!/usr/bin/python

def remove_stuff(inp_string):
    """Removes parentheses, quotes, backticks and semicolons."""
    pm = ['(', ')', '\'', ';', '`']
    # explicitly remove the '),' that terminates each values tuple
    out_string = inp_string.replace('),', '')
    for pmark in pm:
        out_string = out_string.replace(pmark, '')
    return out_string

f = open('sql_statements.sql', 'r')
out = open('table2.txt', 'w')
line = f.readline()
vals = 0  # set to 1 once the INSERT INTO line has been seen

while line != '':
    if vals == 1:
        # lines after INSERT INTO hold the value tuples
        values = line.strip()
        values = remove_stuff(values)
        values = values.split(',')
        values = '|'.join(values)
        out.write(values + '\n')
    if 'INSERT INTO' in line.upper():
        # the column names sit between the first pair of parentheses
        col_names = line.split('(')[1]
        col_names = col_names.split(')')[0]
        col_names = remove_stuff(col_names)
        col_names = col_names.split(',')
        col_names = '|'.join(col_names)
        out.write(col_names + '\n')
        vals = 1
    line = f.readline()

f.close()
out.close()
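Run against the example dump from the question, this should write a header line like ID| Username| Time| IP| Type followed by rows like 1| User1| 1411318511| 127.0.0.1| Failed to table2.txt; the spaces after the commas in the dump are carried over into the output.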

Upvotes: 3

Sanketh

Reputation: 1277

Try this.

import re

s = open('data.sql', 'r').read()
s = s[s.find('INSERT'):]  # skip everything before the INSERT statement
c = re.compile(r'\((.+),(.+),(.+),(.+),(.+)\)')  # one capture group per column
f = open('out.txt', 'w')
for item in c.findall(s):
    # drop quotes and backticks, and trim surrounding whitespace
    item = [a.replace("'", "").replace("`", "").strip() for a in item]
    f.write('\t|\t'.join(item) + '\n')
f.close()

I'm using a regular expression to match each tuple of values in the INSERT statement.
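For a quick check, this is roughly what the pattern captures on a single line of the example dump (it assumes exactly five comma-separated fields per tuple and no commas inside the values):

import re

pattern = re.compile(r'\((.+),(.+),(.+),(.+),(.+)\)')
line = "(3, 'User3', 1411318683, '127.0.0.1', 'Success'),"
print(pattern.findall(line))
# [('3', " 'User3'", ' 1411318683', " '127.0.0.1'", " 'Success'")]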

Upvotes: 2
