Reputation: 19186
When I try to deploy using shub deploy, I get this error:
Removing intermediate container fccf1ec715e6
Step 10 : RUN sudo -u nobody -E PYTHONUSERBASE=$PYTHONUSERBASE pip install --user --no-cache-dir -r /app/requirements.txt
 ---> Running in 729e0d414f46
Double requirement given: attrs==16.1.0 (from -r /app/requirements.txt (line 51)) (already in attrs==16.0.0 (from -r /app/requirements.txt (line 1)), name='attrs')
{"message": "The command '/bin/sh -c sudo -u nobody -E PYTHONUSERBASE=$PYTHONUSERBASE pip install --user --no-cache-dir -r /app/requirements.txt' returned a non-zero code: 1", "details": {"message": "The command '/bin/sh -c sudo -u nobody -E PYTHONUSERBASE=$PYTHONUSERBASE pip install --user --no-cache-dir -r /app/requirements.txt' returned a non-zero code: 1"}, "error": "build_error"}
{"message": "Internal build error", "status": "error"}
Deploy log location: c:\users\dr521f~1.pri\appdata\local\temp\shub_deploy_pvx7dk.log
Error: Deploy failed: {"message": "Internal build error", "status": "error"}
This is my requirements.txt:
attrs==16.1.0
beautifulsoup4==4.5.1
cffi==1.8.2
click==6.6
cryptography==1.5
cssselect==0.9.2
enum34==1.1.6
fake-useragent==0.1.2
hubstorage==0.23.1
idna==2.1
ipaddress==1.0.17
lxml==3.6.1
parsel==1.0.3
pyasn1==0.1.9
pyasn1-modules==0.0.8
pycparser==2.14
PyDispatcher==2.0.5
pyOpenSSL==16.1.0
pypiwin32==219
queuelib==1.4.2
requests==2.11.1
retrying==1.3.3
ruamel.ordereddict==0.4.9
ruamel.yaml==0.12.13
scrapinghub==1.8.0
Scrapy==1.1.2
scrapy-fake-useragent==0.0.1
service-identity==16.0.0
shub==2.4.0
six==1.10.0
Twisted==16.4.0
typing==3.5.2.2
w3lib==1.15.0
zope.interface==4.3.2
Why can't I deploy?
Upvotes: 1
Views: 747
Reputation: 19186
From the documentation here:
Note that this requirements file is an extension of the Scrapy Cloud stack, and therefore should not contain packages that are already part of the stack, such as scrapy.
As you can see in the error:
Running in 729e0d414f46
Double requirement given: attrs==16.1.0 (from -r /app/requirements.txt (line 51)) (already in attrs==16.0.0 (from -r /app/requirements.txt (line 1)), name='attrs')
It says Double requirement given: the Scrapy Cloud stack already pins attrs==16.0.0, and my requirements.txt adds attrs==16.1.0 on top of it, so pip sees two conflicting pins for the same package and aborts.
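You can reproduce the failure locally, since pip refuses any requirements file that pins one package at two different versions. For example, a file (call it dupes.txt; the name is just for illustration) containing

attrs==16.0.0
attrs==16.1.0

makes pip install -r dupes.txt abort with the same Double requirement given message before anything is installed. The build concatenates the stack's pinned requirements with yours into one file, which is why attrs shows up at both line 1 and line 51 of /app/requirements.txt in the error.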
Use separate requirements files: one for the whole project and one for Scrapinghub. I ended up creating shub-requirements.txt, which contains only this (how to point shub at it is sketched below):
beautifulsoup4==4.5.1
fake-useragent==0.1.2
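To make shub deploy pick up the trimmed file instead of requirements.txt, reference it from scrapinghub.yml. A minimal sketch, assuming a project ID of 12345 (newer shub versions spell the same setting as a nested requirements: file: entry):

projects:
  default: 12345
requirements_file: shub-requirements.txt

With that in place, the full requirements.txt stays usable for local pip installs, while only the two extra packages are shipped to Scrapy Cloud.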
Upvotes: 1