Reputation: 2930
I have a very big array stored in memory after reading a whole file into an array (as hex), like this:
$bin_content = fread(fopen($filename, "rb"), filesize($filename));
$hex_decode = explode(" ",chunk_split(bin2hex($bin_content),2," "));
unset($bin_content);
function myfunction($i) {
    global $hex_decode;
    // stuff here
}
When I create a function that uses this $hex_decode array as a global, the script runs practically forever (very slow). But if I call the function passing $hex_decode as a parameter (myfunction($i, $hex_decode) instead of myfunction($i), where $i is an index), things are much faster.
Can anyone explain why? And is there any way to speed up the script by reading the file in a different way?
I need to have the whole file in that array rather than reading it line by line, because I'm building a custom ASN.1 decoder and I need to have it all.
Upvotes: 1
Views: 411
Reputation: 212522
And is there any way to speed up the script by reading the file in a different way?
Personally, I'd use a stream filter to chunk-read and convert the file as it is read, rather than reading the entire file in one go and then converting and fixing it with the whole file in memory. Any filtering and fixing of the ASN.1 structure would be handled within the stream filter itself.
I know this isn't a direct response to the actual question, but rather to the single quote above; it could, however, provide a less memory-hungry alternative.
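A minimal sketch of what such a filter might look like. The filter name "hexchunk" and the class name are mine, not from the original post; the demo writes a small sample file so it can run standalone, where you would use your real $filename instead:

```php
<?php
// A user-defined stream filter that hex-encodes data as it is read, so the
// raw binary and its full hex copy never both sit in memory at once.
class HexChunkFilter extends php_user_filter {
    public function filter($in, $out, &$consumed, $closing): int {
        while ($bucket = stream_bucket_make_writeable($in)) {
            $consumed += $bucket->datalen;
            // Each input byte becomes exactly one "xx " triplet, so chunk
            // boundaries can never split a byte across reads.
            $bucket->data = chunk_split(bin2hex($bucket->data), 2, ' ');
            stream_bucket_append($out, $bucket);
        }
        return PSFS_PASS_ON;
    }
}
stream_filter_register('hexchunk', 'HexChunkFilter');

// Demo with a small sample file; replace with the real $filename.
$filename = tempnam(sys_get_temp_dir(), 'asn1');
file_put_contents($filename, "\x30\x03\x02\x01\x05");

$fp = fopen($filename, 'rb');
stream_filter_append($fp, 'hexchunk');
while (!feof($fp)) {
    $hex = fread($fp, 8192);   // arrives already hex-encoded by the filter
    // ... feed $hex to the decoder piece by piece ...
    echo $hex;
}
fclose($fp);
unlink($filename);
```

Because the conversion happens per bucket inside the filter, peak memory stays roughly at the read chunk size instead of twice the file size.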
Upvotes: 2
Reputation: 111409
If your files can be huge, you could consider a memory-conservative approach. ASN.1 is usually encoded in structures of type-length-value, which means that you don't have to store the whole thing in memory, just the data that you need to process at any given time.
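As an illustration of that type-length-value idea, here is a rough sketch of reading DER headers straight from the stream, so only the current element's value is ever loaded. The helper name is mine; long-form lengths are handled, but high tag numbers (>= 31) and indefinite lengths are not:

```php
<?php
// Read one DER type-length header from an open stream, or null at EOF.
// Only the few header bytes are read here; the caller decides whether to
// fread() the value or fseek() past it.
function read_tlv_header($fp): ?array {
    $tag = fread($fp, 1);
    if ($tag === '' || $tag === false) {
        return null;                       // end of stream
    }
    $len = ord(fread($fp, 1));
    if ($len & 0x80) {                     // long form: next N bytes hold length
        $numBytes = $len & 0x7F;
        $len = 0;
        foreach (str_split(fread($fp, $numBytes)) as $byte) {
            $len = ($len << 8) | ord($byte);
        }
    }
    return ['tag' => ord($tag), 'length' => $len];
}

// Usage sketch: walk a file's elements without slurping it whole.
// $fp = fopen($filename, 'rb');
// while ($hdr = read_tlv_header($fp)) {
//     $value = fread($fp, $hdr['length']);   // or fseek() to skip it
//     // ... decode $value ...
// }
```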
Upvotes: 1
Reputation: 955
There was already a similar question on Stack Overflow; please take a look at:
The advantage / disadvantage between global variables and function parameters in PHP?
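If you want to measure the difference yourself, a rough micro-benchmark sketch of the two call styles from the question is below. Exact numbers depend heavily on the PHP version; the gap the asker saw is commonly attributed to `global` binding the variable by reference, which could defeat copy-on-write on older PHP 5 releases, whereas passing the array by value makes no real copy as long as it isn't modified:

```php
<?php
$hex_decode = array_fill(0, 100000, 'ff');

function via_global($i) {
    global $hex_decode;            // binds by reference
    return $hex_decode[$i];
}

function via_param($i, $arr) {     // copy-on-write: no real copy is made
    return $arr[$i];
}

$t = microtime(true);
for ($i = 0; $i < 100000; $i++) { via_global($i % 1000); }
printf("global: %.4fs\n", microtime(true) - $t);

$t = microtime(true);
for ($i = 0; $i < 100000; $i++) { via_param($i % 1000, $hex_decode); }
printf("param:  %.4fs\n", microtime(true) - $t);
```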
Upvotes: 1