Waku-2

Reputation: 1196

Setting a timeout for SplFileObject reading from a remote AWS S3 file

I am reading a file line by line directly from the AWS S3 server using SplFileObject.

$url = "s3://{$bucketName}/{$fileName}";
$fileHandle = new \SplFileObject($url, 'r');
$fileHandle->setFlags(\SplFileObject::READ_AHEAD);

while (!$fileHandle->eof()) {
    $line = $fileHandle->current();
    // process the line
    $fileHandle->next();
}
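
For context, the s3:// URL presumably relies on the AWS SDK for PHP stream wrapper being registered beforehand (that is an assumption; the registration is not shown above). A simplified sketch is below; region, credentials and the timeout values are placeholders, and whether the client-level http options actually govern the read loop is part of what I am unsure about.

use Aws\S3\S3Client;
use Aws\S3\StreamWrapper;

$s3Client = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',          // placeholder region; credentials come from the default provider chain
    'http'    => [
        'connect_timeout' => 5,        // seconds allowed to establish the connection (placeholder value)
        'timeout'         => 30,       // seconds allowed for the whole request (placeholder value)
    ],
]);

// Makes s3:// URLs usable with fopen()/SplFileObject
StreamWrapper::register($s3Client);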

This works perfectly fine in 99% of cases, except when a temporary network glitch occurs while the loop is running. If the script is unable to fetch the next line from S3 for x seconds, it exits prematurely. The problem is that you never know whether the script completed its job or exited because of a timeout.
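
A crude way to at least detect a premature exit would be to compare how far the stream got against the object size, roughly like this (a sketch only; it assumes the s3:// wrapper supports stat calls so that filesize() works on the URL):

$expectedBytes = filesize($url);     // relies on the s3:// wrapper implementing url_stat

// ... same read loop as above ...

$bytesRead = $fileHandle->ftell();   // byte offset reached when the loop exited
if ($bytesRead < $expectedBytes) {
    // the loop ended before the whole object was consumed,
    // so it most likely broke off on a glitch/timeout rather than real EOF
}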

My questions here are:

1- Is there a way to explicitly set a timeout on SplFileObject while accessing a remote file, so that when the loop exits I can tell whether it timed out or whether the file really reached EOF?

I checked stream_set_blocking() and stream_set_timeout(), but neither seems to work with SplFileObject (see the fopen() sketch below).

2- What timeout setting is this script currently following? Is it socket_timeout, a stream timeout, a cURL timeout, or simply the PHP script timeout (which is highly unlikely, I guess, as the script runs on the command line)?
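
For an ordinary remote stream I would try something like the following with fopen(), checking the timed_out flag from stream_get_meta_data(); but SplFileObject does not expose its underlying resource, and I am not sure the s3:// wrapper honors stream_set_timeout() at all (sketch only; the 10 seconds is an arbitrary value):

$fp = fopen($url, 'r');
stream_set_timeout($fp, 10);            // may well be ignored by a userland s3:// wrapper

while (($line = fgets($fp)) !== false) {
    // process the line
}

$meta = stream_get_meta_data($fp);
if ($meta['timed_out']) {
    // the read stopped because of a timeout, not because of real EOF
}
fclose($fp);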

Any hints?

Upvotes: 1

Views: 511

Answers (0)
