user1738643

Reputation:

The best way to handle mysql database with 2.5M entries

I am developing a web-based program (PHP, MySQL) that includes a database of all cities around the world, about 2.5M entries (all in one table). When I try to get details from it, the server (localhost, WAMP) takes a long time to search and return the data. I searched the internet for the most efficient way to work with such a large dataset. One idea is to split this table into a separate table for each country to make the search faster.

Could anyone suggest how to handle this and share the knowledge?

EDIT: The table has two columns: 1. country 2. city. I use WHERE country='XX' to narrow the search, but it still takes a long time to process.

DATABASE: (table name: cities)

country city
US  Lakewood Estates
US  Lakewood Estates Mobile Home Park
US  Lakewood Falls
US  Lakewood Forest
US  Lakewood Gardens
US  Lakewood Harbor
US  Lakewood Heights
US  Lakewood Highlands
US  Lakewood Hills
.
.
.

PHP CODE:

// $q is assumed to hold the lowercased search term, e.g. from $_GET['q']
$data = mysql_query( "SELECT * FROM cities WHERE country='".mysql_real_escape_string($_GET['country'])."' ORDER BY country" )
        or die( mysql_error() );

$arr = array(); // initialize so count($arr) below works even with no matches
$i = 0;
while( $row = mysql_fetch_array( $data )){
    if ( strpos( strtolower( $row['city'] ), $q ) !== false ) {
        $i++;
        $arr[] = $row['city'];
        if ($i==10) {break;}
    }
}

echo "[";
for ($x=0; $x<count($arr); $x++)
  {
  echo '{"name":"'.$arr[$x].'"}';
   if ($x!=count($arr)-1) {echo ',';}
  }
echo "]";

Upvotes: 2

Views: 607

Answers (2)

Class

Reputation: 3160

It might be wiser to do the following and retrieve only 10 rows, but I could be wrong. Because you are retrieving all records WHERE country = 'XX' but only using the first 10 returned records and discarding the rest, returning every record for a country might be slowing your query down.

SELECT * FROM cities WHERE country='".mysql_real_escape_string($_GET['country'])."'
ORDER BY country LIMIT 10 -- or whatever limit you want

while( $row = mysql_fetch_array( $data )){
    if ( strpos( strtolower( $row['city'] ), $q ) !== false ) {
        $i++;
        $arr[] = $row['city'];
        if ($i==10) {break;}#remove this line
    }
}

Also, switching MySQL storage engines (e.g. between MyISAM and InnoDB) might improve performance.
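If you want to experiment with that, a sketch of checking and changing the engine (assuming the table is named cities as in the question):

```sql
-- Show the storage engine the table currently uses
SHOW TABLE STATUS LIKE 'cities';

-- Rebuild the table with a different engine (this rewrites every row,
-- so it can take a while on 2.5M entries; benchmark both before deciding)
ALTER TABLE cities ENGINE = InnoDB;
```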

Upvotes: 0

juergen d

Reputation: 204746

2.5 million entries is actually not that much. You need to add proper indexes based on how you retrieve your data (the WHERE clause); then it will be much faster.

Add an index on country like this (if the column is not already a string type, first change it to varchar):

ALTER TABLE cities MODIFY COLUMN country varchar(255);

ALTER TABLE cities
ADD INDEX country_idx (`country`);
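To confirm the index is picked up, and to let MySQL stop scanning early, something like the following should work; the 'US' value, the prefix search, and the country_city_idx name are just examples:

```sql
-- EXPLAIN shows whether country_idx is used (look at the "key" column)
EXPLAIN SELECT * FROM cities WHERE country = 'US';

-- A composite index on (country, city) also covers a prefix search,
-- so both the LIKE 'prefix%' filter and the LIMIT can use the index
ALTER TABLE cities ADD INDEX country_city_idx (country, city);
SELECT city FROM cities WHERE country = 'US' AND city LIKE 'Lakewood%' LIMIT 10;
```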

Upvotes: 3
