Reputation: 473
I have a CSV file of this form:
Username, Password_Hash
noam , ************
paz , ************
I want to import this CSV into my datastore so that the data can be accessed from Python using this model:
class Company(ndb.Model):
    Username = ndb.StringProperty()
    Password_Hash = ndb.StringProperty(indexed=False)
Of course, importing the records manually one by one is not an option, because the real file is pretty large.
I have no idea what structure the file used by gcloud preview datastore upload is expected to have, and Google's documentation on this issue is lacking.
Upvotes: 0
Views: 344
Reputation: 11370
How about something like:
from google.appengine.api import urlfetch
from models import Company

def do_it(request):
    csv_url = 'http://mysite-or-localhost/table.csv'
    csv_response = urlfetch.fetch(csv_url, allow_truncated=True)
    if csv_response.status_code == 200:
        for row in csv_response.content.split('\n'):
            # Skip blank lines and the header row; since the row is
            # lowercased first, the prefix must be lowercase too.
            if row != '' and not row.lower().startswith('username,'):
                row_values = row.split(',')
                new_record = Company(
                    Username=row_values[0].strip(),
                    Password_Hash=row_values[1].strip(),
                )
                new_record.put()
    # Response here comes from the answerer's web framework.
    return Response("Did it", mimetype='text/plain')
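One put() per row means one datastore RPC per record, which adds up on a large file. A possible variant, under the assumption that the same Company model is in scope: collect the entities first and write them with ndb.put_multi in batches (500 is the datastore's maximum number of entities per call). The batching helper itself is plain Python:

```python
def chunks(items, size=500):
    # Yield successive fixed-size batches; 500 matches the
    # datastore's per-call entity limit for put_multi.
    for i in range(0, len(items), size):
        yield items[i:i + size]

# Inside the handler you would then do something like (not run here):
# entities = [Company(Username=u, Password_Hash=h) for u, h in parsed_rows]
# for batch in chunks(entities):
#     ndb.put_multi(batch)  # one RPC per batch instead of per entity
```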
Upvotes: 2
Reputation: 15994
There is no magic way of migrating. You need to write a program that reads the file and saves the records to the datastore one by one. It's not particularly difficult to write; give it as long as it takes, it won't be forever...
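The fetch-and-split approach in the other answer works, but the standard csv module handles quoting and the stray padding spaces in the sample file more robustly. A minimal sketch of the parsing side, assuming the header row and column order shown in the question (the datastore write is the App Engine part and only runs inside a handler):

```python
import csv
import io

def parse_companies(csv_text):
    """Parse the CSV into dicts matching the Company model's properties."""
    reader = csv.reader(io.StringIO(csv_text))
    next(reader)  # skip the "Username, Password_Hash" header row
    rows = []
    for row in reader:
        if len(row) < 2:
            continue  # skip blank or malformed lines
        rows.append({
            'Username': row[0].strip(),        # sample file has padding spaces
            'Password_Hash': row[1].strip(),
        })
    return rows

# Inside an App Engine handler you would then do (not run here):
# for kwargs in parse_companies(text):
#     Company(**kwargs).put()
```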
Upvotes: 2