Reputation: 13
I need to export data from one database into another database. Each data object needs to be mapped to a dictionary, and each row needs to be saved twice, once per value in the list foobar = ['foo', 'bar'].
foobar = ['foo', 'bar']
data = []
for q in queryset:
    row = {"id": q.id,
           "created_at": q.created_at}
    for f in foobar:
        row['index'] = f
        data.append(row)
bulksave(data)
This doesn't give the desired result:

print data
[{'id': 1, 'created_at': '2017-01-01', 'index': 'bar'},
 {'id': 2, 'created_at': '2017-01-02', 'index': 'bar'}]
Where the desired output would have both 'foo' and 'bar' as the index. How do I get this to work? A different approach would be to invert the loops:

for f in foobar:
    for q in queryset:
        ...

This works, but takes twice the time, because each element in the queryset will be evaluated twice.
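For what it's worth, a minimal sketch of this inverted-loop idea that avoids the double evaluation, using a hypothetical Row class and fetch() generator to stand in for the real lazy queryset: materialize the queryset into a list once, then loop over the cached rows.

```python
class Row:
    """Hypothetical stand-in for the real data objects."""
    def __init__(self, id, created_at):
        self.id = id
        self.created_at = created_at

def fetch():
    # Simulates a lazy queryset that is expensive to evaluate.
    yield Row(1, '2017-01-01')
    yield Row(2, '2017-01-02')

cached = list(fetch())  # the queryset is evaluated exactly once
data = [{'id': q.id, 'created_at': q.created_at, 'index': f}
        for f in ['foo', 'bar']
        for q in cached]
```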
Upvotes: 0
Views: 208
Reputation: 77337
The problem is that inner loop. You keep updating the same dict and re-adding it to data. data ends up with multiple references to the same dict, and that dict has been updated to the last value in foobar.
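Here is a stripped-down sketch of that aliasing effect on its own (standalone, not your original code):

```python
# Both appends store a reference to the same dict object, so the last
# assignment to row['index'] shows up in every list entry.
row = {'index': None}
data = []
for f in ['foo', 'bar']:
    row['index'] = f
    data.append(row)        # appends a reference, not a copy

print(data)                 # [{'index': 'bar'}, {'index': 'bar'}]
print(data[0] is data[1])   # True: the same object twice
```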
Just copy the dict before adding it. Also, a note on good question writing: you want a runnable example when possible. Here I mock queryset so the code runs.
class Mock:
    def __init__(self, id, created_at):
        self.id = id
        self.created_at = created_at

queryset = [Mock(1, '2017-01-01'), Mock(2, '2017-01-02')]
foobar = ['foo', 'bar']
data = []
for q in queryset:
    row = {"id": q.id,
           "created_at": q.created_at}
    for f in foobar:
        row['index'] = f
        data.append(row.copy())
bulksave(data)
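An equivalent sketch that sidesteps the copy entirely by building a fresh dict per (row, index) pair, using the same Mock stand-in (bulksave is assumed to exist in the real code, so it is omitted here):

```python
class Mock:
    def __init__(self, id, created_at):
        self.id = id
        self.created_at = created_at

queryset = [Mock(1, '2017-01-01'), Mock(2, '2017-01-02')]
foobar = ['foo', 'bar']

# A new dict literal is created on every inner iteration, so there is
# no shared object to accidentally alias.
data = [{'id': q.id, 'created_at': q.created_at, 'index': f}
        for q in queryset
        for f in foobar]

print(data[0])  # {'id': 1, 'created_at': '2017-01-01', 'index': 'foo'}
```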
UPDATE
If you really want to confuse the next person who looks at your code, collapse it all into:

import itertools

class Mock:
    def __init__(self, id, created_at):
        self.id = id
        self.created_at = created_at

queryset = [Mock(1, '2017-01-01'), Mock(2, '2017-01-02')]
foobar = ['foo', 'bar']
bulksave(list(dict(zip(('id', 'created_at', 'index'), (q.id, q.created_at, index)))
              for q, index in itertools.product(queryset, foobar)))
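A quick check of what that one-liner produces, with the same Mock stand-in (note that itertools.product reads each input iterable once up front, so the queryset is still only evaluated a single time):

```python
import itertools

class Mock:
    def __init__(self, id, created_at):
        self.id = id
        self.created_at = created_at

queryset = [Mock(1, '2017-01-01'), Mock(2, '2017-01-02')]
foobar = ['foo', 'bar']

# Same expression as above, but collected into a variable for inspection.
rows = list(dict(zip(('id', 'created_at', 'index'), (q.id, q.created_at, index)))
            for q, index in itertools.product(queryset, foobar))

print(rows[0])   # {'id': 1, 'created_at': '2017-01-01', 'index': 'foo'}
print(len(rows)) # 4
```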
(dgg32's now-deleted post got me thinking...)
Upvotes: 1