Reputation: 861
I'm writing a feature for an admin panel that blocks IP addresses at the Apache level. The file is called blacklist.txt
and looks like 10.0.0.1,10.0.0.2,10.0.0.3, ...
All on a single line, with each IP address separated by a comma. After reading What is the best way to write a large file to disk in PHP?, I am still unsure of the best practices on the matter.
Here's what I want to do: if an administrator presses the 'ban hammer', the file is read and searched with strpos($file, $ip); if the IP is not found, it is appended to the end of the file, and the .htaccess file blocks accordingly.
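Roughly the flow I have in mind (just a sketch; note that a bare strpos() can false-match a substring, e.g. 10.0.0.1 inside 210.0.0.10, so this compares whole entries instead):
$ip = $_POST['ip']; // the address the admin chose to ban (validated elsewhere)

// Read the whole single-line blacklist into memory.
$contents = file_exists('blacklist.txt') ? file_get_contents('blacklist.txt') : '';

// Compare against whole comma-separated entries rather than strpos(),
// which would also match 10.0.0.1 inside 210.0.0.10.
if (!in_array($ip, explode(',', $contents), true)) {
    // Append under an exclusive lock so two simultaneous bans don't collide.
    $entry = ($contents === '') ? $ip : ',' . $ip;
    file_put_contents('blacklist.txt', $entry, FILE_APPEND | LOCK_EX);
}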
Question: is a .txt file suitable for this potentially large amount of data? I do not want to execute a database query to check whether someone is banned every time a page is requested.
EDIT: The purpose is to block single IP addresses that have 10 failed login attempts in the past 12 hours. I would think that the 'recover my password' feature would prevent a normal client from doing this.
Upvotes: 2
Views: 1424
Reputation: 43
First, for reading your file in CSV format, there are many approaches. For example:
// Read every line of the CSV and parse each one into an array of fields.
$rows = array_map('str_getcsv', file('myfile.csv'));

// Treat the first row as the header.
$header = array_shift($rows);

// Combine the header with each remaining row into an associative array.
$csv = array();
foreach ($rows as $row) {
    $csv[] = array_combine($header, $row);
}
Source: http://steindom.com/articles/shortest-php-code-convert-csv-associative-array
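For your specific file, though, blacklist.txt is a single comma-separated line with no header, so (assuming that file name from your question) a single call is enough:
// One str_getcsv() call parses the whole single-line blacklist.
$ips = str_getcsv(file_get_contents('blacklist.txt'));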
To check this on each page load while minimizing reads of that file,
you can use a memory cache, something like Memcache, and then search the array for the incoming IP. Note: a memory cache is faster than a database query.
PHP shared memory: http://www.php.net/manual/en/book.shmop.php
Memcache: http://php.net/memcache
Array search: http://php.net/in_array
To return the key if the value is found: http://php.net/array_search
Note: a 1 MB file can store roughly 65K IPs, assuming each IP takes the worst-case form "255.255.255.255," (16 bytes, so 1,048,576 / 16 = 65,536).
It's even better if you make the IP the array key; then, instead of searching the array for that IP, you can check whether the key exists: http://php.net/array_key_exists
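Putting that together, a sketch (assuming a Memcached server on localhost and the blacklist.txt name from the question):
// Connect to a local Memcached server (assumed to be on the default port).
$memcache = new Memcache;
$memcache->connect('localhost', 11211);

$banned = $memcache->get('banned_ips');
if ($banned === false) {
    // Cache miss: load the single-line blacklist and key the array by IP,
    // so the lookup is O(1) instead of a linear in_array() scan.
    $ips = explode(',', file_get_contents('blacklist.txt'));
    $banned = array_fill_keys($ips, true);
    $memcache->set('banned_ips', $banned, 0, 300); // re-read the file every 5 minutes
}

if (array_key_exists($_SERVER['REMOTE_ADDR'], $banned)) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}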
Upvotes: 1
Reputation: 524
Question: is a .txt file suitable for this potentially large amount of data?
No, it is not. A database with proper indexing is.
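As a sketch of what that could look like (the table and column names here are illustrative, not prescribed):
// One-time schema (indexed so the per-request lookup stays fast):
// CREATE TABLE banned_ips (ip VARCHAR(45) PRIMARY KEY, banned_at DATETIME);

$pdo = new PDO('mysql:host=localhost;dbname=admin_panel', 'user', 'pass');

// PRIMARY KEY lookup: an index seek, not a full-table scan.
$stmt = $pdo->prepare('SELECT 1 FROM banned_ips WHERE ip = ?');
$stmt->execute(array($_SERVER['REMOTE_ADDR']));
if ($stmt->fetchColumn()) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}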
Upvotes: 3