Reputation: 2942
I tried to set up Scrapy on Windows 7 following the steps described at http://doc.scrapy.org/en/latest/intro/install.html . My PC had Python 3.5.1 installed. Although Scrapy does not support this Python version, it installed successfully with the latest Anaconda, but then failed to run my spider script. I found that Scrapy only works with Python 3.3+, so I uninstalled Python 3.5.1, uninstalled Anaconda, installed Python 3.3.5, installed pywin32 and installed pip. pip install Scrapy failed,
so I installed Anaconda again and ran conda install -c scrapinghub scrapy
Scrapy was installed, but I saw that the libraries installed were built for Python 3.5, for example: scrapy: 1.1.0-py35_0
Now I run
c:\python\olxscrapy>scrapy crawl OlxCatalogSpider
and get the error:
File "C:\Anaconda3\lib\site-packages\twisted\internet\stdio.py", line 30, in
module>
from twisted.internet import _win32stdio
ImportError: cannot import name '_win32stdio'
How can I make Scrapy run with Python 3.3+?
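For reference, I assume the conda way to force a specific Python version would be to create a separate environment, roughly like this (the environment name scrapy33 is just an example, and I have not verified that the scrapinghub channel provides a Python 3.3 build):
conda create -n scrapy33 python=3.3
activate scrapy33
conda install -c scrapinghub scrapy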
Upvotes: 0
Views: 1281
Reputation: 46
Installation of Scrapy on Windows may fail with an error while installing Twisted.
Download Twisted according to your Python and Windows version from this site: http://www.lfd.uci.edu/~gohlke/pythonlibs/#twisted
pip install <downloaded filename>
pip install scrapy
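For example, on 64-bit Python 3.5 the downloaded wheel would have a name along the lines of the one below (the exact filename is a placeholder and depends on the Twisted release and your Python/Windows combination):
pip install Twisted-16.2.0-cp35-cp35m-win_amd64.whl
pip install scrapy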
Upvotes: 0
Reputation: 609
I added the following package and it worked:
pip install twisted-win==0.5.5
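A minimal sequence, assuming the Anaconda/Python 3.5 setup from the question, would be to add the package and re-run the crawl:
pip install twisted-win==0.5.5
scrapy crawl OlxCatalogSpider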
Upvotes: 1
Reputation: 21
On this blog:
https://blog.scrapinghub.com/2016/05/25/data-extraction-with-scrapy-and-python-3/
it says that Scrapy on Python 3 doesn't work in Windows environments yet.
Edit: I recently installed Scrapy on Ubuntu for Python 3.5 and received a lot of errors. The errors stopped after running "sudo apt-get install python3.5-dev".
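If other build errors show up, they usually come from the lxml and cryptography dependencies; a fuller install on Ubuntu might look roughly like this (package names based on Scrapy's platform notes, so double-check against the current docs):
sudo apt-get install python3.5-dev libxml2-dev libxslt1-dev zlib1g-dev libffi-dev libssl-dev
pip3 install scrapy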
Upvotes: 2
Reputation: 53
Try to create a virtual env:
pip install virtualenv (installation)
virtualenv -p python3.3 envName (creation with a specific Python version)
source ./envName/bin/activate (activate the virtual env)
This way you can guarantee it is the right Python version. Also, Scrapy has some requirements that can't be installed via pip, and this may cause your pip install scrapy to fail.
So install on your computer: python-dev libxslt1-dev libxslt1.1 libxml2-dev libxml2 libssl-dev
After this you should finally be able to install Scrapy via pip inside your virtual env (probably).
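Note that since the question is about Windows 7, the activation step is different there; assuming the same environment name, it would be:
envName\Scripts\activate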
Sorry for my poor English, it isn't my native language. Hope this works =]
Upvotes: 0