Sunny

Reputation: 229

BigCommerce API issue with page limit

I am working on a task in which I need to fetch BigCommerce products and collect their URLs to generate a sitemap.xml file.

There are 180,000 products on the site, so I need to create multiple sitemap XML files plus a single index (sitemap.xml) file.

I have completed the script to do that; it groups 50,000 URLs into each sitemap XML file it creates.
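For reference, 50,000 URLs per file matches the limit in the sitemaps.org protocol, and the index file simply lists the individual sitemap files. A minimal index looks roughly like this (the domain and filenames below are placeholders, not from the question):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-2.xml</loc>
  </sitemap>
</sitemapindex>
```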

When I first ran it, it executed correctly and created 4 sitemap files; everything worked fine.

But now I am unable to execute it because, after running for some time, it gives me a network error (something like "BigCommerce connection lost").

The issue is that there is a limit when calling the BigCommerce API: we have to send it a page number, and only 250 products are fetched per call.

So I asked a BigCommerce support person about extending the limit for fetching products in a single API call. He suggested using a loop and informed me that there is no other solution; we can fetch only 250 products at a time.

It is difficult to fetch 180,000 products in a single script calling the API in a loop, but in my case it is compulsory to do it in a single script (I need to set that script up as a cron job).
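As an aside, the cron side of this is simple once the script works end to end. A crontab entry to run it nightly might look like the following (the path and schedule here are made up for illustration):

```shell
# Run the sitemap generator every night at 2:00 AM (path is hypothetical)
0 2 * * * php /path/to/generate_sitemap.php
```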

Is there any solution to accomplish this task without a network error? Any BigCommerce experts here?

Any help would be greatly appreciated!

Upvotes: 1

Views: 1348

Answers (1)

Jay

Reputation: 252

I had the same problem trying to pull all the products in the store I was working on. As it stands, they do have a maximum number of products per request.

What you need to do instead is use a filter and loop; I believe there is no other way to do this.

// Total number of pages, rounded up so a final partial page is included
$pages = ceil(Bigcommerce::getProductsCount() / 250);

for ($x = 1; $x <= $pages; $x++) {
    $filter = array("page" => $x, "limit" => 250);
    $products = Bigcommerce::getProducts($filter);

    // All your code goes here
}
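Since the question mentions the script dying with a network error partway through, it may also help to wrap each page request in a retry with a short back-off, so a single dropped connection does not kill the whole run. This is a sketch, not part of the original answer; the retry count, sleep times, and the use of a generic `Exception` are assumptions, so adjust them to whatever the BigCommerce PHP client actually throws in your environment:

```php
// Fetch one page of products, retrying on transient failures.
// $maxRetries and the caught Exception class are assumptions;
// match them to the errors you actually see from the client.
function fetchPage($page, $limit = 250, $maxRetries = 3) {
    for ($attempt = 1; $attempt <= $maxRetries; $attempt++) {
        try {
            return Bigcommerce::getProducts(array("page" => $page, "limit" => $limit));
        } catch (Exception $e) {
            if ($attempt == $maxRetries) {
                throw $e; // give up after the last attempt
            }
            sleep(5 * $attempt); // back off before retrying
        }
    }
}
```

Inside the loop above you would then call `$products = fetchPage($x);` instead of calling `Bigcommerce::getProducts` directly.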

I hope this answers your question. Though this reply is a bit late, it might help someone.

Upvotes: 1
