Reputation: 42788
Let's say that I have a JavaScript array that looks like the following:
["Element 1","Element 2","Element 3",...]; // with close to a hundred elements.
What approach would be appropriate to chunk (split) the array into many smaller arrays of, let's say, 10 elements at most?
Upvotes: 1071
Views: 1311541
Reputation: 351039
You could also get a chunked array from an iterator (let's call it it) using take and Array.from:
Array.from(it, val => [val, ...it.take(size-1)])
So if your input is an array, get an iterator for its values first:
// Example input
const arr = [...Array(11).keys()], size = 3;
// Get iterator and let Array.from and spread consume it:
const it = arr.values();
const result = Array.from(it, val => [val, ...it.take(size-1)]);
console.log(result);
Upvotes: 0
Reputation: 11258
Object.groupBy is a "Baseline 2024" feature of JavaScript that can be used for this purpose:
const items = [1, 2, 3, 4, 5, 6, 7, 8, 9];
const size = 3;
Object.values(Object.groupBy(items, (_, i) => Math.floor(i / size)))
Or the Map.groupBy version:
[...Map.groupBy(items, (_, i) => Math.floor(i / size)).values()]
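For the input above, both forms produce the same result (a quick sanity check):
console.log(Object.values(Object.groupBy(items, (_, i) => Math.floor(i / size))));
// [[1, 2, 3], [4, 5, 6], [7, 8, 9]]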
Upvotes: 1
Reputation: 5738
const chunk = (a,n)=>[...Array(Math.ceil(a.length/n))].map((_,i)=>a.slice(n*i,n+n*i));
const chunk = <T>(arr: T[], size: number): T[][] =>
[...Array(Math.ceil(arr.length / size))].map((_, i) =>
arr.slice(size * i, size + size * i)
);
const chunk = (a,n)=>[...Array(Math.ceil(a.length/n))].map((_,i)=>a.slice(n*i,n+n*i));
document.write(JSON.stringify(chunk([1, 2, 3, 4], 2)));
const part=(a,n)=>[...Array(n)].map((_,i)=>a.slice(i*Math.ceil(a.length/n),(i+1)*Math.ceil(a.length/n)));
const partitionArray = <T>(array: T[], parts: number): T[][] => {
const itemsPerPart = Math.ceil(array.length / parts);
return [...Array(parts)].map((_, index) =>
array.slice(index * itemsPerPart, (index + 1) * itemsPerPart)
);
};
const part = (a, n) => {
const b = Math.ceil(a.length / n);
return [...Array(n)].map((_, i) => a.slice(i * b, (i + 1) * b));
};
document.write(JSON.stringify(part([1, 2, 3, 4, 5, 6], 2))+'<br/>');
document.write(JSON.stringify(part([1, 2, 3, 4, 5, 6, 7], 2)));
Upvotes: 30
Reputation: 3852
A new JavaScript feature (Baseline 2024), Object.groupBy(), makes this problem very easy to solve:
const inventory = [
{ name: "asparagus", type: "vegetables", quantity: 5 },
{ name: "bananas", type: "fruit", quantity: 0 },
{ name: "goat", type: "meat", quantity: 23 },
{ name: "cherries", type: "fruit", quantity: 5 },
{ name: "fish", type: "meat", quantity: 22 },
]
Object.values(Object.groupBy(inventory, (x, i) => Math.floor(i/3)))
Results:
[
[
{ name: 'asparagus', type: 'vegetables', quantity: 5 },
{ name: 'bananas', type: 'fruit', quantity: 0 },
{ name: 'goat', type: 'meat', quantity: 23 }
],
[
{ name: 'cherries', type: 'fruit', quantity: 5 },
{ name: 'fish', type: 'meat', quantity: 22 }
]
]
Upvotes: 1
Reputation: 8077
Here is a JSDoc-documented TypeScript implementation that accepts a sliceable ArrayLike (interface ArrayLike<T> { readonly length: number; readonly [n: number]: T }), such as String, Array, ArrayBuffer, SharedArrayBuffer, and TypedArray:
/**
* Split a sliceable array-like into slices of equal length.
* If it cannot be split evenly, the last slice will contain the remaining
* elements.
* @param arrayLike - The sliceable array-like to slice.
* @param sliceLength - The length of each slice.
* @return An array of slices of equal length.
* @exception {RangeError} Parameter sliceLength must be positive.
*/
export function splitEvenly<T, U>(
arrayLike: ArrayLike<T> & { slice: (start?: number, end?: number) => U },
sliceLength: number,
): U[] {
if (sliceLength <= 0) {
throw new RangeError('Parameter sliceLength must be positive');
}
return Array.from(
{ length: Math.ceil(arrayLike.length / sliceLength) },
(_, index) =>
arrayLike.slice(index * sliceLength, (index + 1) * sliceLength),
);
}
Examples:
> splitEvenly([1, 2, 3, 4, 5], 2)
[[1, 2], [3, 4], [5]]
> splitEvenly('abcde', 2)
['ab', 'cd', 'e']
> splitEvenly({ 0: 1, 1: 2, 2: 3, length: 3, slice: Array.prototype.slice }, 2)
[[1, 2], [3]]
And here is the lazy counterpart that accepts an Iterable (interface Iterable<T> { [Symbol.iterator](): Iterator<T> }):
/**
* Split an iterable into slices of equal length.
* If it cannot be split evenly, the last slice will contain the remaining
* elements.
* @param iterable - The iterable to slice.
* @param sliceLength - The length of each slice.
* @return A generator of slices of equal length.
* @exception {RangeError} Parameter sliceLength must be positive.
*/
export function* splitEvenly<T>(
iterable: Iterable<T>,
sliceLength: number,
): Generator<T extends string ? T : T[], undefined, undefined> {
if (sliceLength <= 0) {
throw new RangeError('Argument sliceLength must be positive');
}
let slice: T[] = [];
const isString = typeof iterable === 'string';
for (const element of iterable) {
slice.push(element);
if (slice.length === sliceLength) {
yield (isString ? slice.join('') : slice) as any;
slice = [];
}
}
if (slice.length) {
yield (isString ? slice.join('') : slice) as any;
}
}
Examples:
> [...splitEvenly([1, 2, 3, 4, 5], 2)]
[[1, 2], [3, 4], [5]]
> [...splitEvenly('abcde', 2)]
['ab', 'cd', 'e']
> [...splitEvenly({ *[Symbol.iterator]() { yield 1; yield 2; yield 3 } }, 2)]
[[1, 2], [3]]
Upvotes: 0
Reputation: 91142
Modified from an answer by dbaseman: https://stackoverflow.com/a/10456344/711085
Object.defineProperty(Array.prototype, 'chunk_inefficient', {
value: function(chunkSize) {
var array = this;
return [].concat.apply([],
array.map(function(elem, i) {
return i % chunkSize ? [] : [array.slice(i, i + chunkSize)];
})
);
}
});
console.log(
[1, 2, 3, 4, 5, 6, 7].chunk_inefficient(3)
)
// [[1, 2, 3], [4, 5, 6], [7]]
Minor addendum:
I should point out that the above is a not-that-elegant (in my mind) workaround to use Array.map. It basically does the following, where ~ is concatenation:
[[1,2,3]]~[]~[]~[] ~ [[4,5,6]]~[]~[]~[] ~ [[7]]
It has the same asymptotic running time as the method below, but perhaps a worse constant factor due to building empty lists. One could rewrite this as follows (mostly the same as Blazemonger's method, which is why I did not originally submit this answer):
More efficient method:
// refresh page if experimenting and you already defined Array.prototype.chunk
Object.defineProperty(Array.prototype, 'chunk', {
value: function(chunkSize) {
var R = [];
for (var i = 0; i < this.length; i += chunkSize)
R.push(this.slice(i, i + chunkSize));
return R;
}
});
console.log(
[1, 2, 3, 4, 5, 6, 7].chunk(3)
)
My preferred way nowadays is the above, or one of the following:
Array.range = function(n) {
// Array.range(5) --> [0,1,2,3,4]
return Array.apply(null,Array(n)).map((x,i) => i)
};
Object.defineProperty(Array.prototype, 'chunk', {
value: function(n) {
// ACTUAL CODE FOR CHUNKING ARRAY:
return Array.range(Math.ceil(this.length/n)).map((x,i) => this.slice(i*n,i*n+n));
}
});
Demo:
> JSON.stringify( Array.range(10).chunk(3) );
[[0,1,2],[3,4,5],[6,7,8],[9]]
Or if you don't want an Array.range function, it's actually just a one-liner (excluding the fluff):
var ceil = Math.ceil;
Object.defineProperty(Array.prototype, 'chunk', {value: function(n) {
return Array(ceil(this.length/n)).fill().map((_,i) => this.slice(i*n,i*n+n));
}});
or
Object.defineProperty(Array.prototype, 'chunk', {value: function(n) {
return Array.from(Array(ceil(this.length/n)), (_,i)=>this.slice(i*n,i*n+n));
}});
"Don't modify Object.prototype" --some comments
Of course that's fine; you can put the above in your own function chunk(arr, chunkSize). How you package your chunk function is immaterial to answering the OP's question (how to code such a function).
However I will mildly double down on the 'wrong' act of extending Array.prototype, and here's the justification: this isn't some horrible thing. The link shared in the comments does a good job explaining the potential pitfalls, but people may read the link, see "extending prototype bad!", and not understand the nuance. Many languages have such functionality, and the semantics are basically managed by the programmer (at import/etc. time):
myLib.chunk = function() {...}
or module-export it or whatnot, or use Object.defineProperty('myLib_chunk', ...) if you aren't going to do this often. But if you are going to be chunking your arrays everywhere in your code, go ahead and extend the prototype! Especially if your code will only be used in a small downstream (non-library) project. This is just the age-old debate of "why is from LIBRARY import * [e.g. in Python] bad?" Well, it's clearly bad if you overuse it, and it's sometimes good (who wants to write Math.floor everywhere in your JavaScript code?! Math functions are not usually implementation-dependent!). In general, you should namespace everything (myLib.yourExports), but not necessarily at the expense of readability.
The main caveats are code that iterates arrays with for...in (though using Object.defineProperty will make the property non-enumerable by default), and the possibility that a future standard defines Array.prototype.chunk itself; if browsers ever add .chunk, they should have reserved it or built better functionality sooner. New ECMAScript standards always have the risk of breaking existing code. If you want to future-proof your code because you won't be maintaining it or don't want to (an extremely important consideration!), don't code using this method, and don't add variables to the global namespace either (without some extensive future-proofing, maybe like Symbol, or prefixing). So... thank you to the commenters for pointing out this consideration. This discussion is outside the scope of this answer, because really it could be had with any answer on Stack Overflow, but that's my take.
Upvotes: 173
Reputation: 191
Here's a one-liner using the Array.from method.
const myArray = [1,2,3,4,5,6,7,8,9,10];
const chunkSize = 2;
const chunkedArray = Array.from({ length: Math.ceil(myArray.length / chunkSize) }, (_, i) => myArray.slice(i * chunkSize, i * chunkSize + chunkSize));
console.log(chunkedArray); // [ [1,2],[3,4],[5,6],[7,8],[9,10] ]
Upvotes: 4
Reputation: 94
function groupArr(arr = [], size = 0) {
if (!Array.isArray(arr) || !arr.length) return [];
if (arr.length <= size || size <= 0) return [arr];
const resultArr = [];
for (let i = 0, j = size, len = arr.length; i < len; j += size) {
let tempArr = [];
if (j > len) {
tempArr = arr.slice(i);
} else {
tempArr = arr.slice(i, j);
}
i = j;
resultArr.push(tempArr);
}
return resultArr;
}
let arr = [1, 2, 3, 4, 5];
let size = 2;
let result = groupArr(arr, size);
console.log(result);
Upvotes: 0
Reputation: 2196
Since this is the first result I came across when trying to "pull" data from an array in chunks, I'll add this here.
When the integrity of the source array is not important, or when trying to "pull" data from the source in chunks, this simplified code can be used:
const chunkedQueue = [1, 2, 3, 4, 5, 6, 7, 8, 9];
while (chunkedQueue.length) {
const chunk = chunkedQueue.splice(0, 10);
// Do something with chunk
}
If using a stack approach, rather than a queue, a negative index can be used in the splice function. Note, though, that the order of the items within each chunk is still the same as in the source array:
const chunk = chunkedStack.splice(-10, 10);
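A minimal sketch of the stack variant, assuming a chunkedStack array analogous to the queue example above:
const chunkedStack = [1, 2, 3, 4, 5, 6, 7, 8, 9];
while (chunkedStack.length) {
  // Pulls up to 10 items from the end; items within the chunk keep their source order
  const chunk = chunkedStack.splice(-10, 10);
  // Do something with chunk
}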
Upvotes: 2
Reputation: 1182
Another version using Array.reduce, but without the Math object or the remainder (%) operator:
const chunk = size => array => array.reduce((result, item) => {
if (result[result.length - 1].length < size) {
result[result.length - 1].push(item);
} else {
result.push([item]);
}
return result;
}, [[]]);
const myArray = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10];
console.log(chunk(3)(myArray)); // [[1, 2, 3], [4, 5, 6], [7, 8, 9], [10]]
console.log(chunk(4)(myArray)); // [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10]]
Upvotes: 2
Reputation: 2974
Here is a simple solution based on flatMap (I haven't seen one in the existing answers):
function splitIntoChunks(array, chunkSize) {
return array.flatMap((x, i) =>
i % chunkSize === 0 ? [array.slice(i, i + chunkSize)] : []
);
}
Upvotes: 1
Reputation: 130
This is what I just came up with. Might be a duplicate but I ain't reading through all these to check.
const toBatches = (src, batchSize) => src.reduce((pv, cv) => {
const lastBatch = pv.length > 0 ? pv[pv.length - 1] : null;
!lastBatch || lastBatch.length === batchSize
? pv.push([cv])
: lastBatch.push(cv);
return pv;
}, []);
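For example, a quick check with a sample input:
console.log(toBatches([1, 2, 3, 4, 5], 2)); // [[1, 2], [3, 4], [5]]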
Upvotes: 0
Reputation: 23602
An ES2021 one-liner with a nullish coalescing assignment:
const arr = '1234567890'.split('');
const chunkSize = 3;
const r = arr.reduce((arr, item, idx) => (arr[idx / chunkSize | 0] ??= []).push(item) && arr, []);
console.log(JSON.stringify(r));
Upvotes: 7
Reputation: 76953
You can combine .filter() and .map() to achieve it.
let array = [];
for (let i = 1; i < 95; i++) array.push(i);
let limit = 10;
console.log(array.filter((item, index) => (index % limit === 0)).map((item, index) => {
let tmp = [];
for (let i = limit * index; i < Math.min((index + 1) * limit, array.length); i++) tmp.push(array[i]);
return tmp;
}));
Upvotes: 0
Reputation: 161
An optimized function with a single loop in plain JS, using no slice, reduce, or anything else:
const splitArrayIntoN = (myArray, n) => {
if (!Number.isSafeInteger(n)) {
return myArray;
}
let count = 0;
const tempArray = [];
for (let item = 0; item < myArray.length; item++) {
if (tempArray[count] && tempArray[count].length !== n) {
tempArray[count] = [...tempArray[count], myArray[item]];
}
if (!tempArray[count]) {
tempArray[count] = [myArray[item]];
}
if (tempArray[count].length === n) {
count++;
}
}
return tempArray;
};
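For example, with a sample input of seven items and n = 3:
console.log(splitArrayIntoN([1, 2, 3, 4, 5, 6, 7], 3));
// [[1, 2, 3], [4, 5, 6], [7]]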
Upvotes: 0
Reputation: 4678
Recursive way (note that it consumes the input array via splice):
function chunk(array, chunk_size){
if(array.length == 0) return [];
else return [array.splice(0, chunk_size)].concat(chunk(array, chunk_size))
}
console.log(chunk([1,2,3,4,5,6,7,8],3))
Upvotes: 2
Reputation: 24221
OK, this is a slightly more enhanced version of Ikechukwu Eze's answer, using generators.
It's updated so the source doesn't have to be an array; it can be any iterable.
The main benefits of using generators and iterables are that they can work with more than just arrays (strings, DOM collections, custom iterators, etc.), memory usage can be kept much lower, and of course the code can be reused. It's also possible to use custom generators that you can chain.
For example:
function *chunkIterator(iterable, chunkSize) {
let i, iter = iterable[Symbol.iterator]();
function *nextChunk() {
for (let l = 0; l < chunkSize && !i.done; l++) {
yield i.value;
i = iter.next();
}
}
i = iter.next();
while (!i.done) yield [...nextChunk()];
}
const myArray = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10];
const chunkedArray = [...chunkIterator(myArray, 3)];
console.log(JSON.stringify(chunkedArray));
for (const chunk of chunkIterator('hello world!.', 3))
console.log(chunk.join(''));
It's also possible to make the chunks themselves iterable, which means no arrays need to be created at all. And because the chunks can be any iterable, I've created a simple random number generator instead of supplying an array.
Example:
function *chunkIterator(iterable, chunkSize) {
let i, iter = iterable[Symbol.iterator]();
function *nextChunk() {
for (let l = 0; l < chunkSize && !i.done; l++) {
yield i.value;
i = iter.next();
}
}
i = iter.next();
while (!i.done) yield nextChunk();
}
function *rand() {
for (let l = 0; l < 10; l++)
yield `${l} = ${(Math.random()*1000) | 0}`;
}
for (const r of chunkIterator(rand(), 3)) {
console.log('---');
for (const c of r) {
console.log(c);
}
}
Upvotes: 1
Reputation: 103
I could not find an answer here that made sure the chunks were of equal size even when the number of elements doesn't divide evenly, so I wrote my own method. It ensures every chunk has exactly the requested size, with the holes filled in by a provided default value. It also does not modify the original array.
Modern version:
// Modern version
function chunkArray(a, s, d) {
const l = a.length;
let p = 0;
if (l !== 0) {
return a.reduce((a, c, i) => {
if ((i % s) === 0)
p = a.push([]);
let r = a[p - 1].push(c);
if ((i + 1) === l)
while (r < s)
r = a[p - 1].push(d);
return a;
}, []);
} else
return [...Array(s).fill(d)];
}
const add = (v, i) => v + (i + 1);
console.log('a.length = 7, s = 3, d = 0');
console.log(chunkArray([...Array(7).fill(0)].map(add), 3, 0));
console.log('');
console.log('a.length = 12, s = 2, d = 2');
console.log(chunkArray([...Array(12).fill(0)].map(add), 2, 2));
console.log('');
console.log('a.length = 10, s = 6, d = "ADDITIONAL"');
console.log(chunkArray([...Array(10).fill('ORIGINAL')].map(add), 6, 'ADDITIONAL'));
console.log('');
console.log('a.length = 20, s = 12, d = undefined');
console.log(chunkArray([...Array(20).fill(0)].map(add), 12, undefined));
console.log('');
console.log('a.length = 30, s = 4, d = null');
console.log(chunkArray([...Array(30).fill('TEST')].map(add), 4, null));
IE10+ compatible version:
// IE10+ compatible version
function addArray(a) {
return a.map(function(v, i) { return v + (i + 1); });
}
function createArray(s, d) {
var a = [];
for (var i = 0; i < s; i++)
a.push(d);
return a;
}
function chunkArray(a, s, d) {
var l = a.length, p = 0, r = 0;
if (l !== 0) {
return a.reduce(function(a, c, i) {
if ((i % s) === 0)
p = a.push([]);
r = a[p - 1].push(c);
if ((i + 1) === l)
while (r < s)
r = a[p - 1].push(d);
return a;
}, []);
} else
return createArray(s, d);
}
console.log('a.length = 7, s = 3, d = 0');
console.log(chunkArray(addArray(createArray(7, 0)), 3, 0));
console.log('');
console.log('a.length = 12, s = 2, d = 2');
console.log(chunkArray(addArray(createArray(12, 0)), 2, 2));
console.log('');
console.log('a.length = 10, s = 6, d = "ADDITIONAL"');
console.log(chunkArray(addArray(createArray(10, 'ORIGINAL')), 6, 'ADDITIONAL'));
console.log('');
console.log('a.length = 20, s = 12, d = undefined');
console.log(chunkArray(addArray(createArray(20, 0)), 12, undefined));
console.log('');
console.log('a.length = 30, s = 4, d = null');
console.log(chunkArray(addArray(createArray(30, 'TEST')), 4, null));
Upvotes: 1
Reputation: 2124
Splice version using ES6
let [list,chunkSize] = [[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15], 6];
list = [...Array(Math.ceil(list.length / chunkSize))].map(_ => list.splice(0,chunkSize))
console.log(list);
Upvotes: 105
Reputation: 25930
The most efficient way is to treat the array as an Iterable and do lazy pagination. That way, it will produce data only when requested. The code below uses the page operator from the iter-ops library:
import {pipe, page} from 'iter-ops';
const arr = [1, 2, 3, 4, 5, 6, 7, 8, 9]; // some input data
const i = pipe(arr, page(2)); //=> Iterable<number>
console.log(...i); //=> [ 1, 2 ] [ 3, 4 ] [ 5, 6 ] [ 7, 8 ] [ 9 ]
Works the same way for any Iterable or AsyncIterable.
P.S. I'm the author of the library.
Upvotes: 2
Reputation: 3161
function* chunks(arr, n) {
for (let i = 0; i < arr.length; i += n) {
yield arr.slice(i, i + n);
}
}
let someArray = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
console.log([...chunks(someArray, 2)]) // [[0, 1], [2, 3], [4, 5], [6, 7], [8, 9]]
It can be typed with TypeScript like so:
function* chunks<T>(arr: T[], n: number): Generator<T[], void> {
for (let i = 0; i < arr.length; i += n) {
yield arr.slice(i, i + n);
}
}
Upvotes: 201
Reputation: 1665
With mutation of the source array:
let a = [ 1, 2, 3, 4, 5, 6, 7, 8, 9 ], aa = [], x
while((x = a.splice(0, 2)).length) aa.push(x)
// aa == [ [ 1, 2 ], [ 3, 4 ], [ 5, 6 ], [ 7, 8 ], [ 9 ] ]
// a == []
Without mutating the source array:
let a = [ 1, 2, 3, 4, 5, 6, 7, 8, 9 ], aa = []
for(let i = 0; i < a.length; i += 2) aa.push(a.slice(i, i + 2))
// aa == [ [ 1, 2 ], [ 3, 4 ], [ 5, 6 ], [ 7, 8 ], [ 9 ] ]
// a == [ 1, 2, 3, 4, 5, 6, 7, 8, 9 ]
Upvotes: 7
Reputation: 20796
The problem with the current top answers is that they produce lopsided chunks. For example, the currently accepted answer will distribute a 101-element array into 10 chunks of size 10, followed by 1 chunk of size 1.
Using some modular arithmetic can create uniform chunk sizes that never differ by more than 1:
function split_array(a, nparts) {
const quot = Math.floor(a.length / nparts)
const rem = a.length % nparts
var parts = []
for (var i = 0; i < nparts; ++i) {
const begin = i * quot + Math.min(rem, i)
const end = begin + quot + (i < rem)
parts.push(a.slice(begin, end))
}
return parts
}
var chunks = split_array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], 3)
console.log(JSON.stringify(chunks))
Output:
[[1,2,3,4],[5,6,7],[8,9,10]]
(Copied from a related answer.)
Upvotes: 1
Reputation: 620
In plain JavaScript:
const splitInChunks = (arr, n) => {
  let chunksArr = [];
  if (arr != null && arr != undefined) {
    for (let i = 0; i < arr.length; i += n) {
      if (arr.length - i >= n)
        chunksArr.push(arr.slice(i, i + n));
      else
        chunksArr.push(arr.slice(i, arr.length));
    }
    return chunksArr;
  }
}
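For example, with a sample input of ten items and n = 4:
console.log(splitInChunks([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], 4));
// [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10]]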
Upvotes: 0
Reputation: 5168
Here's an ES6 version using reduce:
const perChunk = 2 // items per chunk
const inputArray = ['a','b','c','d','e']
const result = inputArray.reduce((resultArray, item, index) => {
const chunkIndex = Math.floor(index/perChunk)
if(!resultArray[chunkIndex]) {
resultArray[chunkIndex] = [] // start a new chunk
}
resultArray[chunkIndex].push(item)
return resultArray
}, [])
console.log(result); // result: [['a','b'], ['c','d'], ['e']]
And you're ready to chain further map/reduce transformations. Your input array is left intact.
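For example, a hypothetical follow-up transformation chained onto the result above:
console.log(result.map(chunk => chunk.join('-'))); // ['a-b', 'c-d', 'e']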
If you prefer a shorter but less readable version, you can sprinkle some concat into the mix for the same end result:
inputArray.reduce((all,one,i) => {
const ch = Math.floor(i/perChunk);
all[ch] = [].concat((all[ch]||[]),one);
return all
}, [])
You can use the remainder operator to put consecutive items into different chunks:
const ch = (i % perChunk);
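A minimal sketch of that variant, reusing inputArray and perChunk from above; note that it produces perChunk groups rather than groups of size perChunk:
const roundRobin = inputArray.reduce((resultArray, item, index) => {
  const chunkIndex = index % perChunk;
  (resultArray[chunkIndex] = resultArray[chunkIndex] || []).push(item);
  return resultArray;
}, []);
console.log(roundRobin); // [['a', 'c', 'e'], ['b', 'd']]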
Upvotes: 317
Reputation: 4179
My favorite is the generator generateChunks, with the additional function getChunks to execute the generator.
function* generateChunks(array, size) {
let start = 0;
while (start < array.length) {
yield array.slice(start, start + size);
start += size;
}
}
function getChunks(array, size) {
return [...generateChunks(array, size)];
}
console.log(getChunks([0, 1, 2, 3, 4, 5, 6, 7, 8, 9], 3)) // [ [ 0, 1, 2 ], [ 3, 4, 5 ], [ 6, 7, 8 ], [ 9 ] ]
As an addition, here is the generator generatePartitions with the further function getPartitions to get n arrays of roughly equal size.
function generatePartitions(array, count) {
return generateChunks(array, Math.ceil(array.length / count));
}
function getPartitions(array, count) {
return [...generatePartitions(array, count)];
}
console.log(getPartitions([0, 1, 2, 3, 4, 5, 6, 7, 8, 9], 3)) // [ [ 0, 1, 2, 3 ], [ 4, 5, 6, 7 ], [ 8, 9 ] ]
An advantage of the generator compared to many other solutions is that it does not create multiple unnecessary intermediate arrays.
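For instance, the chunks can also be consumed lazily with for...of, without materializing the full result array (a small sketch reusing generateChunks from above):
for (const chunk of generateChunks([0, 1, 2, 3, 4, 5, 6, 7, 8, 9], 4)) {
  console.log(chunk); // [0, 1, 2, 3], then [4, 5, 6, 7], then [8, 9]
}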
Upvotes: 0
Reputation: 92983
The array.slice() method can extract a slice from the beginning, middle, or end of an array for whatever purposes you require, without changing the original array.
const chunkSize = 10;
for (let i = 0; i < array.length; i += chunkSize) {
const chunk = array.slice(i, i + chunkSize);
// do whatever
}
The last chunk may be smaller than chunkSize. For example, when given an array of 12 elements, the first chunk will have 10 elements and the second chunk only 2. Note that a chunkSize of 0 will cause an infinite loop.
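If you want the chunks collected into an array rather than processed inside the loop, a minimal wrapper might look like this (toChunks is a hypothetical name; the chunkSize check guards against the infinite loop mentioned above):
function toChunks(array, chunkSize) {
  if (chunkSize <= 0) throw new Error("chunkSize must be greater than 0");
  const chunks = [];
  for (let i = 0; i < array.length; i += chunkSize) {
    chunks.push(array.slice(i, i + chunkSize));
  }
  return chunks;
}
console.log(toChunks([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12], 10));
// [[1, 2, 3, 4, 5, 6, 7, 8, 9, 10], [11, 12]]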
Upvotes: 1381
Reputation: 10145
An efficient solution combines slice with assignment by chunk index to split the array into chunks:
function splitChunks(sourceArray, chunkSize) {
if(chunkSize <= 0)
throw "chunkSize must be greater than 0";
let result = [];
for (var i = 0; i < sourceArray.length; i += chunkSize) {
result[i / chunkSize] = sourceArray.slice(i, i + chunkSize);
}
return result;
}
let ar1 = [
1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20
];
console.log("Split in chunks with 4 size", splitChunks(ar1, 4));
console.log("Split in chunks with 7 size", splitChunks(ar1, 7));
Upvotes: 1
Reputation: 2699
I tried a recursive function…
const chunk = (arr, n) =>
arr.length ? [arr.slice(0, n), ...chunk(arr.slice(n), n)] : [];
…which is nice and short, but seems to take about 256× as long as @AymKdn’s answer for 1,000 elements, and 1,058× as long for 10,000 elements!
Upvotes: 3
Reputation: 2717
Try to avoid mucking with native prototypes, including Array.prototype, if you don't know who will be consuming your code (3rd parties, coworkers, yourself at a later date, etc.).
There are ways to safely extend prototypes (but not in all browsers) and there are ways to safely consume objects created from extended prototypes, but a better rule of thumb is to follow the Principle of Least Surprise and avoid these practices altogether.
If you have some time, watch Andrew Dupont's JSConf 2011 talk, "Everything is Permitted: Extending Built-ins", for a good discussion about this topic.
But back to the question: while the solutions above will work, they are overly complex and require unnecessary computational overhead. Here is my solution:
function chunk (arr, len) {
var chunks = [],
i = 0,
n = arr.length;
while (i < n) {
chunks.push(arr.slice(i, i += len));
}
return chunks;
}
// Optionally, you can do the following to avoid cluttering the global namespace:
Array.chunk = chunk;
Upvotes: 150