Jis

Reputation: 35

How can I upload a big file into MySQL DB Laravel

I have a huge table of users. I read each user, run some calculations, and store the results in another table. The problem is that the job never finishes because the table is too big; I think the server hits its execution timeout and kills the process. I have read that there is chunk() or something like that, or that I could use paginate, but here is what I don't understand:

My query is:

  $user = Users::all();  <- this throws an error

But if I want to do it like every 25 rows:

  $user = Users::paginate(25);

The problem is that with the code above it only returns 25 rows. What I want to know is:

how can I fetch 25 rows, finish processing those 25 rows, and then automatically start on the next 25, until the whole table is done?

Thanks

Upvotes: 0

Views: 50

Answers (1)

ceejayoz

Reputation: 180065

You're looking for either the chunk feature (which will process batches of records in chunks of a size you select) or the lazy feature (which will fetch one at a time).
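A minimal sketch of both approaches, assuming a `Users` Eloquent model and a hypothetical `processUser()` helper that does the calculation and writes to the second table:

```php
use App\Models\Users;

// chunk(): fetches 25 rows per query; each batch is fully
// processed before the next query runs, so memory stays bounded.
Users::chunk(25, function ($users) {
    foreach ($users as $user) {
        processUser($user); // hypothetical: your calculation + insert
    }
});

// lazy(): gives you a flat iterator; Laravel still pages through
// the table behind the scenes, one row handed to you at a time.
Users::lazy()->each(function ($user) {
    processUser($user);
});
```

If the processing updates the same rows you are iterating over, prefer `chunkById()`, which pages by primary key and avoids skipping records when the underlying result set shifts.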

You may still run into timeouts on the server side if this is a web request. If so, you may need to do this in an Artisan command, or split it up into queue jobs.
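A hedged sketch of the Artisan-command route (the command name and class are made up for illustration); running it with `php artisan users:recalculate` from the CLI means the web server's request timeout never applies:

```php
namespace App\Console\Commands;

use App\Models\Users;
use Illuminate\Console\Command;

class RecalculateUsers extends Command
{
    // Invoked as: php artisan users:recalculate
    protected $signature = 'users:recalculate';
    protected $description = 'Recompute derived user data in chunks';

    public function handle()
    {
        // chunkById pages by primary key, which is safe even if
        // the processing writes back to the users table.
        Users::chunkById(25, function ($users) {
            foreach ($users as $user) {
                // ... your calculation and insert into the other table ...
            }
        });

        return Command::SUCCESS;
    }
}
```

For very long runs, the same `handle()` body can instead dispatch one queued job per chunk, so each job stays short and a failure only retries one batch.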

Upvotes: 1
