Reputation: 11523
I need to query the remote db and write a local csv file with the data. I tried this code:
heroku run python manage.py shell
>>> from libros.models import LibrosDisponibles
>>> x = LibrosDisponibles.objects.all()
>>>
>>> import csv
>>> with open("librosd_backup.csv", "wb") as f:
...     e = csv.writer(f, quoting=csv.QUOTE_ALL)
...     for l in x:
...         e.writerow([l.perfil, l.libro])
...
>>>
It doesn't work: the local file is empty. I'm not writing to the right place, of course. But how can I solve this?
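(Editor's note: one way the code above could be adapted, as a sketch rather than anything from the original post, is to write the CSV to stdout instead of to a file on the dyno's ephemeral filesystem, so the output can be redirected locally. The script name `export_libros.py` is hypothetical.)

```python
import csv
import sys

def dump_rows(rows, out=sys.stdout):
    # Write each (perfil, libro) pair as a quoted CSV line to `out`.
    # With out=sys.stdout, the lines go to standard output instead of
    # a file on the remote dyno.
    writer = csv.writer(out, quoting=csv.QUOTE_ALL)
    for perfil, libro in rows:
        writer.writerow([perfil, libro])

if __name__ == "__main__":
    # In the real script you would iterate the queryset, e.g.:
    #   dump_rows(LibrosDisponibles.objects.values_list("perfil", "libro"))
    dump_rows([("ana", "quijote")])
```

You could then run something like `heroku run python export_libros.py > librosd_backup.csv` locally, though `heroku run` may mix some of its own output into the stream.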
Upvotes: 0
Views: 1315
Reputation: 5065
Quick and dirty, and not exactly via `heroku run`: execute a SQL statement via `pg:psql`, and use the `COPY` command. E.g. `heroku pg:psql -c 'copy my_table to stdout with csv header' > my_table.csv`. A very quick and convenient way to generate data for reports and other analysis. You can use a table name or a full SQL statement (e.g., `copy (select foo, bar from baz) to stdout with csv header`).
See http://www.postgresql.org/docs/9.3/static/sql-copy.html for info on `COPY`. You may wonder if you can use the `to filename` version of the command... you can't. The filename is resolved on the database server, which you have no access to.
If you need to honor some aspect of your Python logic, you could either:
1) write a local script and use `heroku pg:credentials` to set `DATABASE_URL` (actually, don't do this - if you forget to unset `DATABASE_URL`, bad things can/will happen...)
2) write an API endpoint that executes the code you wrote and streams the CSV directly (see https://docs.djangoproject.com/en/1.9/howto/outputting-csv/), so you can call it locally via `cURL` (authenticated, of course)
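The streaming approach from the linked Django docs can be sketched without Django itself: a pseudo-buffer whose `write()` just returns the value lets `csv.writer` yield each encoded row as it is produced. The field names `perfil` and `libro` come from the question; the `StreamingHttpResponse` wiring in the comment is assumed from the docs, and `export_csv` is a hypothetical view name.

```python
import csv

class Echo:
    """Pseudo-buffer: write() returns the value instead of storing it."""
    def write(self, value):
        return value

def stream_csv(rows):
    # Yield one CSV-formatted line per row as it is generated,
    # so the full result set is never buffered in memory.
    writer = csv.writer(Echo(), quoting=csv.QUOTE_ALL)
    for perfil, libro in rows:
        yield writer.writerow([perfil, libro])

# In a Django view (hypothetical endpoint) you would wrap the generator:
#   from django.http import StreamingHttpResponse
#   def export_csv(request):
#       rows = LibrosDisponibles.objects.values_list("perfil", "libro")
#       return StreamingHttpResponse(stream_csv(rows), content_type="text/csv")
```

Called locally via authenticated cURL, the response body is the CSV itself, ready to redirect to a file.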
Upvotes: 2