user15005160

Reputation: 23

Trying to query two MySQL databases with Python and write the result to Google Sheets (auto-updated every day)

My data is split between two MySQL databases with the same structure.

What I am trying to do is write a Python script that extracts and appends the data from both databases and stores it in a variable or a text file (say, somefile.csv). A second script then reads the data from the variable/text file and imports it into Google Sheets through the Google Sheets API. The caveat is that my data changes every day, so I also want both scripts to run automatically each day: the first to rewrite the CSV with the updated data, and the second to rewrite the Google Sheet from the CSV.

Is that possible?

What I have so far is:

The first script:

from mysql.connector import connect, Error

username = "user"
password = "pass"

connection_1 = connect(
    host="hostaddress",
    user=username,
    password=password,
    database="databasename_1"  # first database
)

connection_2 = connect(
    host="hostaddress",
    user=username,
    password=password,
    database="databasename_2"  # second database, same structure
)

cursor_1 = connection_1.cursor()
cursor_2 = connection_2.cursor()

with open('query_db1.sql', 'r') as q1:
    query1 = q1.read()

with open('query_db2.sql', 'r') as q2:
    query2 = q2.read()

try:
    with connection_1:
        with cursor_1:
            cursor_1.execute(query1)
            for row in cursor_1.fetchall():
                print(row)
    with connection_2:
        with cursor_2:
            cursor_2.execute(query2)
            for row in cursor_2.fetchall():
                print(row)
except Error as e:
    print(e)

The two problems I am facing in this script are:

  1. how to store the data from the executed queries in one variable or save it to one file?
  2. how to make that script query the databases every day and update the stored information?
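For the first problem, a minimal sketch of appending both result sets and writing them to one CSV file (the filename and the helper name are illustrative, assuming both queries return rows with the same columns):

```python
import csv

def rows_to_csv(rows_db1, rows_db2, path="somefile.csv"):
    """Append rows from both databases and write them to a single CSV file."""
    combined = list(rows_db1) + list(rows_db2)
    with open(path, "w", newline="") as f:
        csv.writer(f).writerows(combined)
    return combined

# e.g. combined = rows_to_csv(cursor_1.fetchall(), cursor_2.fetchall())
```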

In the second script, I have:

from googleapiclient.discovery import build
from google.oauth2 import service_account

SERVICE_ACCOUNT_FILE = 'googleapicredentials.json'
SCOPES = ['https://www.googleapis.com/auth/spreadsheets']

creds = service_account.Credentials.from_service_account_file(
        SERVICE_ACCOUNT_FILE, scopes=SCOPES)

SAMPLE_SPREADSHEET_ID = 'SHEET_ID'

service = build('sheets', 'v4', credentials=creds)
sheet = service.spreadsheets()

# NOTE: "values" must be a list of rows, not a filename
request = sheet.values().update(spreadsheetId=SAMPLE_SPREADSHEET_ID,
                            range="Sheet1!A1", valueInputOption="USER_ENTERED", body={"values": "somefile.csv"}).execute()
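The `values` field of the request body must be a list of rows rather than a filename, so the CSV has to be read into that shape first. A hedged sketch (the function names are mine) of converting `somefile.csv` and passing it to the update call:

```python
import csv

def csv_to_values(path):
    """Read a CSV file into the list-of-rows shape the Sheets API expects."""
    with open(path, newline="") as f:
        return [row for row in csv.reader(f)]

def push_csv_to_sheet(service, spreadsheet_id, path, rng="Sheet1!A1"):
    """Overwrite the given range with the CSV contents (needs valid creds)."""
    body = {"values": csv_to_values(path)}
    return service.spreadsheets().values().update(
        spreadsheetId=spreadsheet_id,
        range=rng,
        valueInputOption="USER_ENTERED",
        body=body,
    ).execute()
```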

Again, I don't know how to make that script update each day.

Upvotes: 1

Views: 149

Answers (2)

fullfine

Reputation: 1461

Answer

1. how to store the data from the executed queries in one variable or save it to one file?

Use the pickle module; it makes saving and loading Python variables easy.
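A minimal sketch of that (the helper names and the filename are illustrative):

```python
import pickle

def save_rows(rows, path="rows.pkl"):
    # Serialize any Python object (e.g. a list of fetched rows) to disk.
    with open(path, "wb") as f:
        pickle.dump(rows, f)

def load_rows(path="rows.pkl"):
    # Load it back in the second script.
    with open(path, "rb") as f:
        return pickle.load(f)
```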

2. how to make that script query the databases every day and update the stored information?

Use the schedule module. Once you have installed it, you can schedule a function to run every day as follows:

import schedule
import time

def job():
    print("I'm working...")

schedule.every().day.at("10:30").do(job)

while True:
    schedule.run_pending()
    time.sleep(1)


Upvotes: 1

JohnA

Reputation: 1107

I do this every day on multiple projects, although I use Google Apps Script, which is based on JavaScript.

You can set a trigger (like a cron job) to run a function on a schedule of your choosing. Triggers are set in the script editor; look for the hourglass icon. Optionally, you can create a trigger from code in the script:

/**
 * Creates time-driven triggers
 *
 * https://developers.google.com/apps-script/reference/script/clock-trigger-builder
 */
function createTimeDrivenTriggers() {
  // Trigger every day at 04:00AM CT.
  ScriptApp.newTrigger('csvDaily')
      .timeBased()
      .everyDays(1)
      .atHour(4)
      .create();
}

Here is a code snippet showing the Apps Script code to access my MySQL db (I removed the username and password):

const MAXROWS = 10000;
const HOST = '65.60.34.202';
const PORT = '3306';
const USERNAME = '';
const PASSWORD = '';
const DATABASE = 'agustafa_barnes';
const DB_TYPE = 'mysql';

function getL2s() {
  var L2s = [];
  try {
    var fullConnectionString = 'jdbc:' + DB_TYPE + '://' + HOST + ':' + PORT;
    var conn = Jdbc.getConnection(fullConnectionString, USERNAME, PASSWORD);
    var stmt = conn.createStatement();
    stmt.execute('USE ' + DATABASE);
    var query = "SELECT * FROM L2s WHERE !IsDeleted ORDER BY TeamTL";
    var rs = stmt.executeQuery(query);

    while (rs.next()) {
      L2s.push([rs.getString(1), rs.getString(2), rs.getString(3), rs.getString(4), rs.getString(5)]);
    }
    rs.close();
    stmt.close();
  } catch (e) {
    console.log(e, e.lineNumber);
  }

  return L2s;
}

So, if direct access to your db is off the table (I have the same issue with my company), perhaps you can set up some data-extract routines. Our company has hundreds of them; they require basic authorization. I issue a web request and can then write the data to Google Sheets or to my own MySQL db. Here is an example URL we use:

https://app.yourdomain.com/datadumper/report.ashx?reportid=5107&startDate=2021-04-08&endDate=2021-04-08&bu=3&currentuser=john.agusta&__action=Export#

Then you can code the Apps Script as follows:

/*
https://modjeska.us/csv-google-sheets-basic-auth/
https://redfin.engineering/when-importdata-isnt-good-enough-retrieving-csv-files-behind-basic-auth-with-a-google-apps-script-6c563f3328c5

'cm9iZXJ0Om9sZHRpcmVz' is a base64-encoded string of the form <username>:<password>.
In this case it represents the username 'robert' and the password 'oldtires',
so the string prior to encoding is: robert:oldtires

This site can be used to create a base64-encoded string:
https://www.base64encode.net/
*/

const auth = '';

var URL = "https://app.yourdomain.com/datadumper/report.ashx?reportid=5107&startDate=2021-04-08&endDate=2021-04-08&bu=3&currentuser=john.agusta&__action=Export#";
var csvContents = getCSVContents(URL);
var parsedContents = parseCsvResponse(csvContents, true);

function getCSVContents(csvUrl) {
  // request the CSV with basic auth
  var resp = UrlFetchApp.fetch(csvUrl, {
    headers: {
      'Authorization': 'Basic ' + auth
    }
  });
  return resp.getContentText();
}

// parse the CSV response
function parseCsvResponse(csvString, ignoreHeaders) {
  var retArray = [];
  var numCols = 0;
  var i = 0;
  var j = 0;
  var line = "";
  var strLines = csvString.split(/\n/g);
  if (ignoreHeaders) {
    strLines.shift();
  }
  var lenLines = strLines.length;  
  for (i = 0; i < lenLines; i++) {
    line = strLines[i];
    if (line != '') {
      retArray.push(line.split(/,(?=(?:(?:[^"]*"){2})*[^"]*$)/));
      numCols = retArray[i].length;
      // remove outer double quotes
      for (j = 0; j < numCols; j++) {
        retArray[i][j] = retArray[i][j].replace(/^"|"$/g, '');
      }
    }
  }
  return retArray;
}
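In the asker's language, the same fetch-and-parse step can be sketched with Python's standard library (the URL and credentials are placeholders); `csv.reader` handles quoted commas, so no regex is needed:

```python
import base64
import csv
import io
import urllib.request

def fetch_csv(url, username, password):
    """Fetch a CSV behind basic auth and return its text (network required)."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    req = urllib.request.Request(url, headers={"Authorization": "Basic " + token})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()

def parse_csv(csv_string, ignore_headers=False):
    """Parse CSV text into a list of rows, respecting quoted fields."""
    rows = list(csv.reader(io.StringIO(csv_string)))
    return rows[1:] if ignore_headers else rows
```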

Upvotes: 1
