Reputation: 299
I need some help with looping through a table and converting the rows into a single row in a new table.
Table A contains 100k+ rows. I want to condense this down into table B by adding the values to a JSON array.
Table A has the following structure:
+---------------+------------+------+-----+---------+----------------+
| Field | Type | Null | Key | Default | Extra |
+---------------+------------+------+-----+---------+----------------+
| id | int(11) | NO | PRI | NULL | auto_increment |
| result_id | int(11) | NO | | 0 | |
| question_id | int(11) | NO | | 0 | |
| question_type | tinyint(4) | NO | | 0 | |
| answer | tinytext | NO | | NULL | |
| user_answer | tinytext | NO | | NULL | |
| exam_id | int(11) | NO | | NULL | |
+---------------+------------+------+-----+---------+----------------+
Table B has the following structure:
+------------+-----------+------+-----+-------------------+-----------------------------+
| Field | Type | Null | Key | Default | Extra |
+------------+-----------+------+-----+-------------------+-----------------------------+
| id | int(11) | NO | PRI | NULL | auto_increment |
| exam_id | int(11) | NO | | NULL | |
| score | int(11) | NO | | NULL | |
| exam_data | text | NO | | NULL | |
| created_at | timestamp | NO | | CURRENT_TIMESTAMP | on update CURRENT_TIMESTAMP |
| updated_at | timestamp | YES | | NULL | |
| deleted_at | timestamp | YES | | NULL | |
+------------+-----------+------+-----+-------------------+-----------------------------+
Essentially, I want to loop through all the rows in Table A and add the rows that share the same result_id
to an array. I want to then take this array and insert it as a new row into Table B.
Here's what I have so far, but it does not work as intended.
$array = array();
$id = null;
// Note: without "use (&$array, &$id)" the closure cannot see these outer variables.
TableA::chunk(100, function($answers) use (&$array, &$id)
{
    foreach($answers as $old)
    {
        $id = $old->result_id;
        if($id == $old->result_id)
        {
            $array[] = array("type" => $old->question_type, "answer" => $old->user_answer);
        }
    }
    $new = new TableB;
    $new->exam_id = $id;
    $new->exam_data = json_encode($array);
    $new->save();
});
I want to use Laravel's chunk function to keep the SQL server from timing out.
Can someone point me in the right direction?
Upvotes: 1
Views: 1968
Reputation: 1189
I would suggest doing one SELECT, looping over your results, and using the result_id to create the first dimension of an array. Then use the question_id to create the second dimension of the array.
// Group answers by result_id (first dimension) and question_id (second dimension)
$aJsonResults = array();
foreach($aDatabaseResults as $key => $value){
    $aJsonResults[$value['result_id']][$value['question_id']] = $value['answer'];
}
Then just loop over the first level of $aJsonResults and json_encode the value.
$aJsonedResults = array();
foreach($aJsonResults as $key => $value){
    $aJsonedResults[$key] = json_encode($value);
}
Then create your insert the way @alexander-ravikovich suggested.
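For that insert step, here is a minimal sketch using Laravel's query builder. It assumes the target table is named table_b and that the first-level key is the result_id you want stored as exam_id; the question doesn't describe how score is derived, so 0 is used only to satisfy the NOT NULL column:
use Illuminate\Support\Facades\DB;

$rows = array();
foreach ($aJsonedResults as $resultId => $json) {
    $rows[] = array(
        'exam_id'   => $resultId, // first-level key is the result_id
        'score'     => 0,         // placeholder; scoring isn't described in the question
        'exam_data' => $json,
    );
}

// One multi-row INSERT instead of a query per row.
DB::table('table_b')->insert($rows);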
Upvotes: 1
Reputation: 1756
I hate ORMs for anything other than simple CRUD operations, so I will describe the idea in plain SQL/pseudocode.
Additionally, to save memory, you can create the TableB object once outside the loop and then change only the needed fields inside the loop. That way you save the memory and time spent creating a new object on every iteration.
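As a rough sketch of that reuse idea against the question's TableB model ($grouped is a hypothetical array of answers keyed by result_id; note that Eloquent marks an instance as existing after save(), so it has to be reset or the next save() becomes an UPDATE):
$new = new TableB;
foreach ($grouped as $resultId => $answers) {
    $new->exists = false; // force save() to INSERT rather than UPDATE
    $new->id = null;      // let auto_increment assign a fresh primary key
    $new->exam_id = $resultId;
    $new->exam_data = json_encode($answers);
    $new->save();
}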
P.S. For a more efficient approach, I would use plain SQL queries. Furthermore, it's better to build the queries dynamically, so you can insert data into the DB in batches instead of doing an INSERT for each row.
I mean: first build a query that contains multiple inserts, and then run it as one query:
INSERT INTO Table ( Column1, Column2 ) VALUES
( Value1, Value2 ), ( Value1, Value2 )
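A short sketch of building such a multi-row insert dynamically with bound parameters (plain PDO here, since this answer avoids the ORM; the table and column names come from Table B, and $batch is a hypothetical array of rows):
$placeholders = array();
$bindings = array();
foreach ($batch as $row) {
    $placeholders[] = '(?, ?, ?)'; // one value tuple per row
    $bindings[] = $row['exam_id'];
    $bindings[] = $row['score'];
    $bindings[] = $row['exam_data'];
}

// $pdo is an existing PDO connection
$stmt = $pdo->prepare(
    'INSERT INTO table_b (exam_id, score, exam_data) VALUES ' . implode(', ', $placeholders)
);
$stmt->execute($bindings);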
Upvotes: 1