kim

Reputation: 567

Upload multiple CSV files to a SharePoint folder in Python?

I am trying to save data pulled from a PostgreSQL database to a designated MS SharePoint folder. To do so, I first retrieve the data from the local database, and then I need to store/save that data in the SharePoint folder. I tried using the office365 API to do this, but no data was saved to the SharePoint folder. Does anyone have experience doing this in Python? Is there any workaround? Any thoughts?

My current attempt:

First, I pulled the data from the local PostgreSQL database as follows:

from sqlalchemy import create_engine
import pandas as pd
import os.path

hostname = 'localhost'
database_name = 'postgres'
user = 'kim'
pw = 'password123'

engine = create_engine('postgresql+psycopg2://'+user+':'+pw+'@'+hostname+'/'+database_name)

sql = """ select * from mytable """

with engine.connect() as conn:
    df = pd.read_sql_query(sql, con=conn)

Then, I tried to store/save the data to the designated SharePoint folder as follows:

from office365.runtime.auth.authentication_context import AuthenticationContext
from office365.sharepoint.client_context import ClientContext
from office365.sharepoint.files.file import File 

url_shrpt = 'https://xyzcompany.sharepoint.com/_layouts/15/sharepoint.aspx?'
username_shrpt = '[email protected]'
password_shrpt = 'password123'
folder_url_shrpt = 'https://xyzcompany.sharepoint.com/:f:/g/EnIh9jxkDVpOsTnAUbo-LvIBdsN0X_pJifX4_9Rx3rchnQ'

ctx_auth = AuthenticationContext(url_shrpt)
if ctx_auth.acquire_token_for_user(username_shrpt, password_shrpt):
    ctx = ClientContext(url_shrpt, ctx_auth)
    web = ctx.web
    ctx.load(web)
    ctx.execute_query()

else:
    print(ctx_auth.get_last_error())

response = File.open_binary(ctx, df)

with open("Your_Offline_File_Path", 'wb') as output_file:  
    output_file.write(response.content)

but the file was not saved to the SharePoint folder. How can I save the data from PostgreSQL to a SharePoint folder using Python? Is there any workaround? Any thoughts?
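For reference, File.open_binary reads a file's content from SharePoint rather than writing to it, so it cannot push a DataFrame up. A rough sketch of the upload direction with the same office365 client is below; it assumes a client release that exposes Folder.upload_file(name, content) and a server-relative folder path (for example '/sites/yoursite/Shared Documents/target') rather than a sharing link — both are assumptions, not verified here:

from office365.runtime.auth.authentication_context import AuthenticationContext
from office365.sharepoint.client_context import ClientContext

# Assumed values: the site URL and a server-relative folder path (not a sharing link).
site_url = 'https://xyzcompany.sharepoint.com/sites/yoursite'
folder_path = '/sites/yoursite/Shared Documents/target'

# username_shrpt, password_shrpt and df are reused from the snippets above.
ctx_auth = AuthenticationContext(site_url)
if ctx_auth.acquire_token_for_user(username_shrpt, password_shrpt):
    ctx = ClientContext(site_url, ctx_auth)

    # Serialize the DataFrame to CSV bytes in memory.
    csv_bytes = df.to_csv(index=False).encode('utf-8')

    # Folder.upload_file(name, content) is assumed to be available in this client version.
    target_folder = ctx.web.get_folder_by_server_relative_url(folder_path)
    target_folder.upload_file('mytable.csv', csv_bytes)
    ctx.execute_query()
else:
    print(ctx_auth.get_last_error())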

Objective:

I want to write the data pulled from the PostgreSQL database to a SharePoint folder. The attempt above did not save any data to the SharePoint folder. Can anyone suggest a possible way of doing this?

Upvotes: 0

Views: 1958

Answers (2)

Hummer

Reputation: 445

A slight variation on what @Jared posted above, for example if one wants to create a folder based on a date and upload files to it from a location other than the root folder on the user's computer. This should be handy for anyone interested in such a solution; it was a problem I had.

import glob
import os

from shareplum import Site
from shareplum import Office365
from shareplum.site import Version
import pendulum  # third-party library used here for date handling

todaysdate = pendulum.now()  # today's date
foldername1 = todaysdate.strftime('%d-%m-%Y')  # folder name in a format such as 19-06-2021

UN = "[email protected]"
PW = "hello#"

path = r"C:\xxxx\xxx\xxx"  # path where the files to be uploaded are stored
doc_library = "xxxxx/yyyy"  # document library where the new folder (foldername1) will be created

authcookie = Office365('https://xyzcompany.sharepoint.com', username=UN, password=PW).GetCookies()
site = Site('https://xyzcompany.sharepoint.com/sites/sample_data/', version=Version.v365, authcookie=authcookie)
folder = site.Folder(doc_library + '/' + foldername1)  # creates the new folder matching today's date

files = glob.glob(path + "\\*.csv")
for file in files:
    with open(file, mode='rb') as rowFile:
        fileContent = rowFile.read()
    folder.upload_file(fileContent, os.path.basename(file))

That's a solution that should work well for anyone looking for such code.
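If you prefer not to add the pendulum dependency, the same folder name can be produced with the standard library; a small equivalent sketch:

from datetime import datetime

foldername1 = datetime.now().strftime('%d-%m-%Y')  # e.g. 19-06-2021, same format as above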

Upvotes: 1

Jerry07

Reputation: 941

I think you should write the CSV files locally first, then try the following to upload them to the SharePoint folder:

import os
from pathlib import Path

from shareplum import Site
from shareplum import Office365
from shareplum.site import Version

UN = "[email protected]"
PW = "hello#"

authcookie = Office365('https://xyzcompany.sharepoint.com', username=UN, password=PW).GetCookies()
site = Site('https://xyzcompany.sharepoint.com/sites/sample_data/', version=Version.v365, authcookie=authcookie)
folder = site.Folder('Shared Documents/New Folder')

# Upload every CSV file found in the current working directory.
files = Path(os.getcwd()).glob('*.csv')
for file in files:
    with open(file, mode='rb') as rowFile:
        fileContent = rowFile.read()
    folder.upload_file(fileContent, os.path.basename(file))

This is an error-free, working solution; it should work for uploading files to a SharePoint folder.
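For the "write the CSV files locally" step this answer assumes, a minimal pandas sketch (df is the DataFrame from the question; the filename is only an illustration):

import os

# Write the query result into the current working directory so the upload loop above finds it.
df.to_csv(os.path.join(os.getcwd(), 'mytable.csv'), index=False)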

Upvotes: 1
