Reputation: 65
I've got a model that includes a lot of voters, which have soft deletes enabled on them. When I get a new electoral register and list of voters, I want to upload these, and bulk upsert() using Laravel's new functionality.
To do this, I delete (i.e. soft delete) all the voters in the ward, then query them back (including trashed models) and upsert them. I expected this to restore the matching rows, clearing deleted_at rather than inserting duplicates.
But this isn't actually what happens. Instead, every voter is always re-inserted. If I upload the same file twice without making any change to the SQL database, I expect all the rows to end up in exactly the same state they were in before - with no deleted_at entry. But this isn't the case. What am I doing wrong?
Voter::where('ward_id', $wardID)->delete();
$upsertArray = [];
foreach ($array as $id => $line)
{
$upsertArray[] = [
'address_id' => $line['address_id'],
'council_id' => $this->id,
'road_id' => $roadsArray[$line['parsed_address_road']],
'ward_id' => $wardID,
'forename' => $line['Elector Forename'],
'surname' => $line['Elector Surname'],
'deleted_at' => null,
];
}
$results = [];
foreach (array_chunk($upsertArray, 500) as $upsertArrayChunk)
{
$results[] = Voter::withTrashed()->upsert($upsertArrayChunk, ['address_id', 'council_id', 'road_id', 'ward_id', 'forename', 'surname'], ['deleted_at']);
}
Upvotes: 2
Views: 1106
Reputation: 17206
From the Laravel documentation:
All database systems except SQL Server require the columns in the second argument provided to the upsert method to have a "primary" or "unique" index.
So you should limit your second argument to the identifying columns ['address_id', 'council_id', 'road_id', 'ward_id']
and create a unique index on them in your database:
ALTER TABLE `voters` ADD UNIQUE `unique_index`(`address_id`, `council_id`, `road_id`, `ward_id`);
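With that index in place, the upsert call from the question would then pass only the indexed columns as the second argument, and move the remaining columns into the third so they are refreshed on a match. This is a sketch based on the question's own code and column names, not tested against your schema:

```php
// Second argument: columns covered by the unique index (used to detect a match).
// Third argument: columns to update when a matching row is found.
$results = [];
foreach (array_chunk($upsertArray, 500) as $upsertArrayChunk)
{
    $results[] = Voter::withTrashed()->upsert(
        $upsertArrayChunk,
        ['address_id', 'council_id', 'road_id', 'ward_id'], // must match the unique index
        ['forename', 'surname', 'deleted_at']               // refreshed on conflict, so deleted_at is cleared
    );
}
```

Because forename and surname are no longer part of the match, a soft-deleted voter at the same address is updated in place (with deleted_at set back to null) instead of being re-inserted.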
Upvotes: 1