Reputation: 2418
I have a small project made in Symfony2. When I try to build it on my server, it always fails while unzipping Symfony. The build used to work, and then suddenly Composer would no longer unzip Symfony even though I didn't change anything. I tried building with Jenkins and also manually from bash, with the same result. It's not a permissions problem, and the internet connection on my server is fine.
Loading composer repositories with package information
Installing dependencies (including require-dev) from lock file
- Installing symfony/symfony (v2.3.4)
Downloading: 100%
[Symfony\Component\Process\Exception\ProcessTimedOutException]
The process "unzip '/path/vendor/symfony/symfony/6116f6f3
d4125a757858954cb107e64b' -d 'vendor/composer/b2f33269' && chmod -R u+w 'vendor/composer/b2f33269'" exceeded the timeout of 300 seconds.
Upvotes: 123
Views: 168703
Reputation: 2455
google/apiclient & repeated download attempts
I was having an issue with composer and google/apiclient, running in an Ubuntu Docker container on a Mac host machine (and not Windows like other people).
The problem: it kept retrying the download every 5 minutes. This was not noticeable at first because there are no timestamps, even when running composer install -vvv. After many 5-15 minute increments, I painfully realized I was in a programmer's hell: waiting a long time between each attempt to fix the problem. On top of that, there was no real "timeout" error message to help figure it out.
Things I tried:
- export COMPOSER_PROCESS_TIMEOUT=600 # default is 300
- In composer.json's config section: "process-timeout": 0
- "compressed": true
Example of repeating lines:
Reading /root/.cache/composer/files/watson/rememberable/61967a1ee36ae90d92cb250b916ec819f141400a.zip from cache
- Loading watson/rememberable (6.1.0) from cache
[302] https://api.github.com/repos/googleapis/google-api-php-client-services/zipball/033de899257a294ca82f52fe4296dac011d01f74
Following redirect (1) https://codeload.github.com/googleapis/google-api-php-client-services/legacy.zip/033de899257a294ca82f52fe4296dac011d01f74
- Downloading google/apiclient-services (v0.326.0)
Downloading https://api.github.com/repos/googleapis/google-api-php-client-services/zipball/033de899257a294ca82f52fe4296dac011d01f74
[302] https://api.github.com/repos/googleapis/google-api-php-client-services/zipball/033de899257a294ca82f52fe4296dac011d01f74
Following redirect (1) https://codeload.github.com/googleapis/google-api-php-client-services/legacy.zip/033de899257a294ca82f52fe4296dac011d01f74
- Downloading google/apiclient-services (v0.326.0)
Downloading https://api.github.com/repos/googleapis/google-api-php-client-services/zipball/033de899257a294ca82f52fe4296dac011d01f74
[302] https://api.github.com/repos/googleapis/google-api-php-client-services/zipball/033de899257a294ca82f52fe4296dac011d01f74
Following redirect (1) https://codeload.github.com/googleapis/google-api-php-client-services/legacy.zip/033de899257a294ca82f52fe4296dac011d01f74
- Downloading google/apiclient-services (v0.326.0)
Downloading https://api.github.com/repos/googleapis/google-api-php-client-services/zipball/033de899257a294ca82f52fe4296dac011d01f74
When I finally realized it was a 5-minute timeout, I started to question my php.ini settings.
Supported by this article on Composer's site, I discovered that I could edit default_socket_timeout, since Composer uses whichever value is larger: its own minimum (default 300) or default_socket_timeout (default 60):
It means your network is probably so slow that a request took over 300 seconds to complete. This is the minimum timeout Composer will use, but you can increase it by increasing the default_socket_timeout value in your php.ini to something higher.
✔️ Here's what I set mine to (/etc/php/8.1/cli/php.ini), and it worked:
; Default timeout for socket based streams (seconds)
; https://php.net/default-socket-timeout
; default_socket_timeout = 60
default_socket_timeout = 3600
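To confirm the CLI actually picked up the new value, a quick check from the shell (this only prints the effective setting; adjust the path if your CLI loads a different php.ini):
php -r 'echo ini_get("default_socket_timeout"), PHP_EOL;'   # should print 3600 after the edit
php --ini                                                    # shows which php.ini the CLI loaded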
compress: true was a bad idea
I got it to download (I had to set the timeout to more than 10 minutes!). However, I then hit a new problem with zip archives, so I removed compress: true from the config, deleted the vendor folder, and ran composer clear-cache in the hope that it would solve the issue. It worked!
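For reference, a rough sketch of that cleanup sequence (the composer.json edit is done by hand; the rest are standard shell/Composer commands):
# 1. remove the compress entry from composer.json's "config" section by hand
rm -rf vendor            # 2. delete the vendor folder
composer clear-cache     # 3. wipe Composer's download cache
composer install         # 4. retry the install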
Zip archive related Error messages:
Executing async command (CWD): '/usr/bin/unzip' -qq
Failed to extract google/apiclient-services
End-of-central-directory signature not found. Either this file is not
a zipfile, or it constitutes one disk of a multi-part archive. In the
latter case the central directory and zipfile comment will be found on
the last disk(s) of this archive.
The archive may contain identical file names with different capitalization (which fails on case insensitive filesystems)
Unzip with unzip command failed, falling back to ZipArchive class
Upvotes: 3
Reputation: 11
On Windows 11, and somewhat related to an answer above, adding a folder exclusion to real-time protection can stop the "Antimalware Service Executable" from scanning the folder and causing the timeout (and saves disabling real-time protection entirely).
Upvotes: 1
Reputation: 613
None of the solutions worked for me running on Win10 WSL Ubuntu (disabling the firewall, removing debuggers, clearing the cache, increasing the timeout, deleting vendor). The only thing that worked was deleting vendor and composer.lock from the main machine, copying composer.json to a fresh machine, installing PHP and Composer there, running composer install (it should take less than a second to execute), copying the vendor dir back to the original machine, and running composer update.
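A hedged sketch of that workaround, with fresh-host as a placeholder name for the second machine (which is assumed to already have PHP and Composer installed):
rm -rf vendor composer.lock                         # on the troubled WSL machine
scp composer.json fresh-host:~/project/             # copy only the manifest across
ssh fresh-host 'cd ~/project && composer install'   # resolve and download there
scp -r fresh-host:~/project/vendor ./               # copy the vendor dir back
composer update                                     # then update on the original machine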
Upvotes: 0
Reputation: 52493
Check with composer update/install -o -vvv whether the package is being loaded from Composer's cache.
If it is, try clearing Composer's cache or adding --cache-dir=/dev/null.
To force downloading an archive instead of cloning sources, use the --prefer-dist option in combination with --no-dev.
Otherwise you could try raising composer's process timeout value:
export COMPOSER_PROCESS_TIMEOUT=600 # default is 300
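Put together, that sequence might look roughly like this (flag names as in Composer's CLI; treat it as a sketch, not a fixed recipe):
composer install -o -vvv                  # check whether packages are read "from cache"
composer clear-cache                      # if so, drop the cache
composer install --prefer-dist --no-dev   # force dist archives instead of source clones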
Upvotes: 162
Reputation: 11699
Composer itself imposes a limit on how long it will allow remote git operations to run. A look at the Composer documentation confirms that the environment variable COMPOSER_PROCESS_TIMEOUT governs this. It defaults to 300 seconds, which is apparently not enough for a large clone operation over a slow internet connection.
Raise this value using:
COMPOSER_PROCESS_TIMEOUT=2000 composer install
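If you'd rather not prefix every command, the same value can be persisted via Composer's config (shown here as an alternative, not part of the original answer; this writes to the global config file):
composer config --global process-timeout 2000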
Upvotes: 45
Reputation: 1448
I agree with most of what has been suggested above, but I had the same issue, and what worked for me was deleting the vendor folder and re-running composer install.
Regards
Upvotes: 2
Reputation: 374
Old thread, but a new problem for me: none of the solutions here worked when trying to install google/apiclient (it failed on google/apiclient-services) on an Ubuntu VM inside a Windows 10 host.
After noticing Windows' "Antimalware Service Executable" taking up considerable CPU cycles during the composer install/update, I disabled "real-time protection" on the Windows 10 machine, and my composer update/install worked!
Hope that helps someone.
Upvotes: 10
Reputation: 1005
It's an old thread, but I found that the reason for the timeout was a running PHP debugger (PhpStorm was listening for Xdebug connections), which caused the process timeout. When I closed PhpStorm or disabled the Xdebug extension, no timeout occurred.
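As a quick check and workaround, assuming Xdebug 3 (which honours the XDEBUG_MODE environment variable):
php -m | grep -i xdebug            # is the extension loaded at all?
XDEBUG_MODE=off composer install   # run Composer with the debugger switched off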
Upvotes: 12
Reputation: 1971
The easiest method is to add a config option to the composer.json file: add "process-timeout": 0, and that's all. It works anywhere.
{
    .....
    "scripts": {
        "start": "php -S 0.0.0.0:8080 -t public public/index.php"
    },
    "config": {
        "process-timeout": 0
    }
}
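The same entry can also be written for you from the command line (composer config edits the "config" section of composer.json):
composer config process-timeout 0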
Upvotes: 97
Reputation: 31130
The Symfony Process component has its process timeout set to 60 seconds by default. That's why you get errors like this:
[Symfony\Component\Process\Exception\ProcessTimedOutException]
The process "composer update" exceeded the timeout of 60 seconds.
Solution
Set the timeout to 5 minutes or more:
use Symfony\Component\Process\Process;

// Note: newer versions of symfony/process expect an array, e.g. new Process(['composer', 'update'])
$process = new Process("composer update");
$process->setTimeout(300); // 5 minutes
$process->run();
Upvotes: 4