Reputation: 1182
I have an array of objects which need processing. I need to send them to a 3rd party system via their API, which only allows me to submit 100 objects at a time.
So let's say I have an array of objects like this
myUserArray = [{first_name: 'Jon', last_name: 'Snow'}, {first_name: 'Sansa', last_name: 'Stark'}...]
I end up sending this to their API like this
intercom.users.submit_bulk_job(create_items: myUserArray)
This works fine when the number of objects is less than 100 but throws an error when it is greater than 100 due to their rate limiting, which is fair enough. I have 5000 objects to process, so I need a way of batching myUserArray into chunks of 100 until they are all done. Would appreciate any advice!
Upvotes: 3
Views: 2733
Reputation: 571
If the objects are generated continuously on the fly rather than being available up front, consider the msg-batcher gem. The code below calls intercom.users.submit_bulk_job with batches of up to 100 user objects.
require 'msg-batcher'

batcher = MsgBatcher.new 100, 1000 do |batch|
  intercom.users.submit_bulk_job(create_items: batch)
end

myUserArray.each do |user|
  batcher.push user
end
Upvotes: 0
Reputation: 121010
Enumerable#each_slice
comes to the rescue:
myUserArray.each_slice(100) do |slice|
  intercom.users.submit_bulk_job(create_items: slice)
end
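To see what each_slice does, here is a quick sketch with a small stand-in array (the users variable and chunk size of 2 are illustrative, not from your code): it yields consecutive chunks of the given size, with the final chunk holding whatever is left over, so no element is dropped.

```ruby
# Hypothetical stand-in for myUserArray, just 5 small hashes.
users = (1..5).map { |i| { first_name: "User#{i}" } }

# each_slice(2) yields chunks of 2; the last chunk holds the remainder.
chunks = users.each_slice(2).to_a

chunks.each { |chunk| p chunk.length }
# prints 2, 2, 1
```

With your 5000 objects and a slice size of 100, this would yield exactly 50 full batches.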
Upvotes: 7