Reputation: 193
I am trying to learn Scrapy, and it seems that I have to write the spider in an IDE, save it as a .py file, and then go into cmd (I am on Windows) to set up the virtual environment and run it.
Is there a way to keep everything inside PyCharm or another IDE, so I don't have to deal with cmd and can have everything in one place?
Thank you
Upvotes: 0
Views: 438
Reputation: 36
You can use the terminal that comes with PyCharm, which already has your current project's venv activated, so you would not have to switch between windows to run your crawler.
Or you could add something like this at the end of your spider file and run it directly from PyCharm:
from scrapy.crawler import CrawlerProcess
if __name__ == "__main__":
    process = CrawlerProcess()
    process.crawl(MySpider)  # pass your spider class here
    process.start()  # the script blocks here until the crawl finishes
Note that process.crawl() needs your spider class as an argument, and nothing runs until you call process.start().
Upvotes: 2