Reputation: 15566
class MathSet extends Set {
  constructor(arr) {
    super(arr);
  }
  union(set) {
    return new MathSet([...this, ...set]);
  }
  intersection(set) {
    return new MathSet([...this].filter(x => set.has(x)));
  }
  difference(set) {
    return new MathSet([...this].filter(x => !set.has(x)));
  }
  cartesian(set) {
    return new MathSet([...this].reduce((acc, i) => [...acc, [...set].map(j => [i, j])], []));
  }
}
let x = new MathSet([1,2,3]);
let y = new MathSet([1,2,3,4,5]);
console.log(JSON.stringify([...x.cartesian(y)]));
//[
// [[1,1],[1,2],[1,3],[1,4],[1,5]],
// [[2,1],[2,2],[2,3],[2,4],[2,5]],
// [[3,1],[3,2],[3,3],[3,4],[3,5]]
// ]
With the cartesian function, the expected result is a flattened version of the above array ([[1,1],[1,2],[1,3],[1,4],[1,5],[2,1],[2,2],[2,3],[2,4],[2,5],[3,1],[3,2],[3,3],[3,4],[3,5]]), but as you can see it is somehow getting grouped into three arrays. The reduce keeps concatenating the earlier result with a spread version of each new result. Any guess on what I am doing wrong?
Upvotes: 1
Views: 47
Reputation: 26191
This is a nice problem. I compute cartesian products with two nested reduces, just as when dealing with two arrays, but the function can in fact handle n arrays. The tricky part was flattening the nested intermediate arrays between operations. My solution would be:
Array.prototype.cartesian = function(...a) {
  return a.length ? this.reduce((p, c) => (p.push(...a[0].cartesian(...a.slice(1)).map(e => a.length > 1 ? [c, ...e] : [c, e])), p), [])
                  : this;
};
var arr = ['a', 'b', 'c'],
    brr = [1, 2, 3],
    crr = [[9], [8], [7]];
console.log(JSON.stringify(arr.cartesian(brr,crr)));
Actually, on second thought, and influenced by your code, I think using a map in place of the second reduce would in fact be more appropriate. I have modified the code accordingly.
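For instance, in the binary case the method above yields the flat list of pairs the question was after (the method is repeated here so the snippet is self-contained):

```javascript
// Self-contained copy of the n-ary cartesian method for a quick binary check.
Array.prototype.cartesian = function(...a) {
  return a.length ? this.reduce((p, c) => (p.push(...a[0].cartesian(...a.slice(1)).map(e => a.length > 1 ? [c, ...e] : [c, e])), p), [])
                  : this;
};

console.log(JSON.stringify([1, 2].cartesian([3, 4])));
// [[1,3],[1,4],[2,3],[2,4]]
```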
Upvotes: 1
Reputation: 665000
To flatten the arrays you need just one more spread:
return new MathSet( [...this].reduce((acc, i)=> [...acc, ...[...set].map(j=>[i,j])], []) )
//                                                       ^^^
(or acc.concat(Array.from(set, j=>[i,j])))
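A minimal runnable sketch of the corrected method (only cartesian is shown; the other set operations are unchanged):

```javascript
// The extra spread flattens each row of pairs into the accumulator
// instead of pushing it as a nested array.
class MathSet extends Set {
  cartesian(set) {
    return new MathSet(
      [...this].reduce((acc, i) => [...acc, ...[...set].map(j => [i, j])], [])
    );
  }
}

const x = new MathSet([1, 2]);
const y = new MathSet([3, 4]);
console.log(JSON.stringify([...x.cartesian(y)]));
// [[1,3],[1,4],[2,3],[2,4]]
```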
Upvotes: 2