user1592380

Reputation: 36317

Importing headers and payload into Scrapy

I've been working with firebug and I've got the following dictionaries to query an api.

payload = "{\"prefixText\":\"2261\",\"count\":\"10 \"}"

headers = {
'origin': "site.com",
'x-requested-with': "XMLHttpRequest",
'user-agent': "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/49.0.2623.87 Safari/537.36",
'content-type': "application/json; charset=UTF-8",
'accept': "*/*",
'referer': "***",
'accept-encoding': "gzip, deflate",
'accept-language': "en-US,en;q=0.8",
'cookie': "ASP.NET_SessionId=tnrqoff2y; 
'cache-control': "no-cache"
}

With Python requests, using these is as simple as:

response = requests.request("POST", url, data=payload, headers=headers)
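Since the payload is JSON, it can also be built with `json.dumps` from a plain dict instead of a hand-escaped string (a minimal sketch using the keys from the question):

```python
import json

# Build the JSON body from a dict; json.dumps handles the quoting.
payload = json.dumps({"prefixText": "2261", "count": "10"})
# payload is now a str suitable for the request body
```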

How can I use these in Scrapy? I know it's something to do with middleware, but I've been reading http://doc.scrapy.org/en/latest/topics/spider-middleware.html and it's not clear to me how to do this.

Upvotes: 0

Views: 1180

Answers (1)

eLRuLL

Reputation: 18799

Scrapy isn't very useful for just "making requests"; that's what the requests module is for. Scrapy is a crawling framework, used for creating website spiders, and it's inside those spiders that you define the request logic.

Anyway, if you create a spider and need to send a POST request, you can do it like this:

...
yield scrapy.Request(url, method="POST", headers=headers, body=payload, callback=self.parse_method)
...

I would recommend first following this tutorial.

Upvotes: 2
