Gilzy165

Reputation: 1

Running a Scrapy spider from another python script

I'm trying to run a Scrapy spider from a simple PyQt4 GUI I built. The user has to fill in his email and password to run the spider. The spider works fine if I call it from the command prompt like so:

scrapy crawl my_spider -a email -a password

Once the user has filled in his email and password, I save them in my script, but after reading the documentation and some examples I found on Google I still can't figure out how to run the spider from there:

    self.BumpPushButton.clicked.connect(self.BumpListings)

    def BumpListings(self):
        email = self.emailTextEdit.toPlainText()
        password = self.passwordTextEdit.toPlainText()
        bumpCycleInMinutes = self.MinutesTextEdit.toPlainText()

Is there a simple way to call the spider at this point?

Upvotes: 0

Views: 786

Answers (1)

costrouc

Reputation: 3175

There are two approaches to this problem. You could call the spider directly by importing it, or you could use a Python subprocess. I would recommend the subprocess route, because you don't want to block your PyQt process.
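For reference, the direct-import route would look roughly like this. This is only a sketch: it uses Scrapy's CrawlerProcess, the helper name run_spider_blocking is made up, and start() blocks the calling thread until the crawl finishes (the Twisted reactor also cannot be restarted), which is exactly why it fits a GUI poorly:

    from scrapy.crawler import CrawlerProcess
    from scrapy.utils.project import get_project_settings

    def run_spider_blocking(email, password):
        process = CrawlerProcess(get_project_settings())
        # keyword arguments become spider arguments, same as -a on the command line
        process.crawl('my_spider', email=email, password=password)
        process.start()  # blocks until the crawl is finished

The subprocess approach avoids that blocking: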

    import subprocess

    # Scrapy's -a option expects name=value pairs, so build them from the GUI fields.
    # Popen is given a list of arguments, so shell=True is not needed.
    process = subprocess.Popen(
        ['scrapy', 'crawl', 'my_spider',
         '-a', 'email=' + email,
         '-a', 'password=' + password],
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE)

You can then check on the process using wait(), communicate(), poll(), etc. See the subprocess docs for the actions you might want to perform.
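For example, from the GUI you could poll the process with a QTimer so the interface stays responsive. A minimal sketch, assuming these are methods of the question's GUI class; startSpider, checkSpider and the one-second interval are illustrative names, not part of the original code:

    import subprocess
    from PyQt4 import QtCore

    def startSpider(self):
        email = self.emailTextEdit.toPlainText()
        password = self.passwordTextEdit.toPlainText()
        self.process = subprocess.Popen(
            ['scrapy', 'crawl', 'my_spider',
             '-a', 'email=' + email,
             '-a', 'password=' + password],
            stdout=subprocess.PIPE,
            stderr=subprocess.PIPE)
        # poll the spider once per second instead of blocking the GUI thread
        self.spiderTimer = QtCore.QTimer(self)
        self.spiderTimer.timeout.connect(self.checkSpider)
        self.spiderTimer.start(1000)

    def checkSpider(self):
        # poll() returns None while the spider is still running
        if self.process.poll() is not None:
            self.spiderTimer.stop()
            out, err = self.process.communicate()  # collect any remaining output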

Upvotes: 1
