Brian Powell

Reputation: 3411

JSON response too large for PHP?

Working with a database I've created, I can run search queries that output data in JSON, then send them back to my webpage. Pretty standard.

$searchQuery  = "SELECT * FROM alldata WHERE $searchBy LIKE '%$searchValue%'";
$searchResult = mysqli_query($con, $searchQuery);

$output = array();
while ($row = mysqli_fetch_row($searchResult)) {
    $item = array();
    $item["dateReceived"] = convertDate($row[1]);
    $item["quoteCost"]    = $row[15];
    $item["quoteNumber"]  = $row[16];
    $item["quoteExpDate"] = convertDate($row[17]);
    $item["comments"]     = $row[21];

    $output[] = $item;
}

$out = array('aaData' => $output);
echo json_encode($out);

When my query is too general, something fails, and I'm not sure where it is.

The response I get from the server is:

Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 14680165 bytes) in C:\xampp\htdocs\....php on line 74

My memory limit is set in php.ini as:

memory_limit=128M

If I go into MySQL and run SELECT * FROM alldata WHERE name LIKE "%a%"; I get about 35,000 rows, each with about 20 columns, but it only takes a split second.

The amount that I can query against the table is prohibitively small - what can I do to fix this so my users can run more generic queries?

Upvotes: 1

Views: 5912

Answers (4)

Adon

Reputation: 345

The first thing you should do is increase PHP's memory_limit, either in your php.ini (find memory_limit = 128M and raise it) or on the fly with ini_set('memory_limit', '256M');, provided the ini_set function is not disabled (hosting providers often disable it so resources are not abused). Do not set the value TOO high, though, or a handful of requests to an inefficient script can exhaust all of your server's memory.

Increasing your memory limit in PHP will only temporarily solve your problem if your database is growing.

The way you are using your query is bad architecture. Queries should never be executed without a LIMIT, especially for large data sets.

There are a couple of things to note here beyond raising the already low memory_limit. Your current problem (assuming the query really does return quickly on its own) is just a matter of allocating enough memory for the resulting array/object and its JSON string; yet other problems may cause this now or in the future. I can identify a few here to help you fix this once and for all.

1- Add limit and page parameters to your request so the results come back paginated.

2- Use proper indexing in MySQL for the fields you search by. A FULLTEXT index is preferable to a regular B-Tree index if you are searching varchar or text fields, as you seem to be. See the MySQL manual on Full-Text Search Functions.

3- If for some reason you cannot use full-text search (for example, your MySQL version does not support FULLTEXT indexes on InnoDB), make sure you do not put the wildcard '%' at the beginning of the string, as a leading wildcard prevents a regular B-Tree index from being used.

If you absolutely HAVE to use a leading wildcard, I would suggest shifting to full-text search (if you want to stick to MySQL for search and not use a search engine like Apache Solr, Sphinx Search or Elasticsearch, to name a few).
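A sketch of the full-text approach with a LIMIT, using the question's table and a hypothetical name column (the FULLTEXT index, and mysqli's get_result(), which needs the mysqlnd driver, are assumptions, not something from the question):

```php
<?php
// Assumes a FULLTEXT index was created once, e.g.:
//   ALTER TABLE alldata ADD FULLTEXT INDEX ft_name (name);
// $con and $searchValue come from the question's code.
$stmt = $con->prepare(
    "SELECT * FROM alldata
     WHERE MATCH(name) AGAINST(? IN NATURAL LANGUAGE MODE)
     LIMIT 100"
);
$stmt->bind_param('s', $searchValue); // bound parameter, no string interpolation
$stmt->execute();
$searchResult = $stmt->get_result();  // requires the mysqlnd driver
```

Unlike LIKE '%...%', MATCH ... AGAINST can use the FULLTEXT index even though the search term is not anchored at the start of the string.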

Upvotes: 1

Dipen Shah

Reputation: 1919

There is no way a user is going to want to see all 35,000 rows at once in one page.

This would:

(1) Increase the execution time of the script.

(2) Probably crash the browser trying to display so much data.

For effective execution of the script use pagination and/or put limits on your queries.
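A minimal pagination sketch against the asker's query ($con and $searchValue come from the question; the page request parameter and the name column are assumptions):

```php
<?php
$perPage = 100;
$page    = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
$offset  = ($page - 1) * $perPage;

// LIMIT/OFFSET are bound as integers and the search term as a string,
// so nothing is interpolated into the SQL.
$stmt = $con->prepare(
    "SELECT * FROM alldata
     WHERE name LIKE CONCAT('%', ?, '%')
     LIMIT ? OFFSET ?"
);
$stmt->bind_param('sii', $searchValue, $perPage, $offset);
$stmt->execute();
$searchResult = $stmt->get_result(); // requires the mysqlnd driver
```

Each request now builds at most 100 rows in memory, no matter how generic the search is.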

Upvotes: 1

Beto López

Reputation: 91

  1. Define the output array length upfront => $output = new SplFixedArray(mysqli_num_rows($searchResult)); (plain PHP arrays cannot be pre-sized, and mysql_count() does not exist)
  2. Include a LIMIT in your SQL query
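Both points together might look like this sketch (SplFixedArray stands in for the pre-sized array, since plain PHP arrays cannot be given a fixed length; mysqli_num_rows needs a buffered result, which mysqli_query returns by default):

```php
<?php
// Cap the result set, then size the output container upfront.
$searchResult = mysqli_query($con, "SELECT * FROM alldata LIMIT 1000");

$output = new SplFixedArray(mysqli_num_rows($searchResult));
$i = 0;
while ($row = mysqli_fetch_row($searchResult)) {
    $output[$i++] = $row;
}

echo json_encode(array('aaData' => $output->toArray()));
```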

Upvotes: 2

jatyap

Reputation: 31

You can try increasing the memory by setting

memory_limit=256M

or, if you only need it for that one query, you can make the change temporarily:

<?php
    ini_set('memory_limit', '256M');

Of course replace '256M' with the appropriate amount you need. Be sure to set it back to what it was before, though. You don't want PHP to use up too much memory.
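Saving and restoring the old value is straightforward; this sketch wraps just the heavy section:

```php
<?php
$previous = ini_get('memory_limit');  // e.g. "128M"
ini_set('memory_limit', '256M');

// ... run the large query and json_encode() here ...

ini_set('memory_limit', $previous);   // back to the original limit
```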

I would also recommend the answer by 96Levels: use pagination to manage memory more efficiently. However, if you really must work on all that data at once, you will need to increase the memory limit.

Upvotes: 1
