Reputation: 2424
I am fetching sets of data as JSON on demand (at runtime). To avoid fetching the same set twice, I decided to save already-downloaded sets in a JS object. Would it be more efficient to store the sets as the raw response and parse it whenever needed, or to save the parsed object?
Here are the two approaches:
Approach A:
const alreadyLoaded = {};
async function f(set) {
  if (alreadyLoaded[set] !== undefined) return alreadyLoaded[set]; // Changes
  let res = await fetch("example." + set + ".json");
  let obRes = await res.json();
  alreadyLoaded[set] = obRes; // Changes
  return obRes; // Changes
}
f("one");
f("two");
f("one");
Approach B:
const alreadyLoaded = {};
async function f(set) {
  if (alreadyLoaded[set] !== undefined) return await alreadyLoaded[set].json(); // Changes
  let res = await fetch("example." + set + ".json");
  alreadyLoaded[set] = res; // Changes
  return res.json(); // Changes
}
f("one");
f("two");
f("one");
Upvotes: 0
Views: 225
Reputation: 8389
You need to cache the final result, not an intermediate one. If you go with B, you are caching a resource on which you still need to call json() (which is what creates the final object), so you are not caching the final object. If you go with A, you are caching the final object, assuming that is the form you will be reusing and you are not just calling some function on it every time (in which case you should cache the result of that function instead).
Caching in general can happen at any level, but you asked which way is more efficient processing-wise, not memory-wise. The answer to that is: the later in the flow you cache, the better.
If by efficiency you mean memory consumption, then you need to measure each object and decide which one is lighter. It's often a tradeoff.
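A minimal sketch of "cache the final result". The names getSet and loadJson are hypothetical, and the injectable loadJson parameter (defaulting to a fetch mirroring the asker's URL pattern) exists only so the caching logic can be exercised without a network:

```javascript
// Cache the final, parsed object — not the Response.
const cache = {};

async function getSet(set, loadJson = url => fetch(url).then(r => r.json())) {
  // Cache hit: return the parsed object directly, no re-download, no re-parse.
  if (cache[set] !== undefined) return cache[set];
  const obj = await loadJson("example." + set + ".json");
  cache[set] = obj; // store the final form, the latest point in the flow
  return obj;
}
```

A second call with the same set name returns the very same object without invoking the loader again.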
Upvotes: 1
Reputation: 99515
The B way doesn't really work, because you can only call .json() on a Response once. It's not possible to re-use it later.
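A quick demonstration of that, assuming an environment where Response is a global (Node 18+ or a browser); the body is a one-shot stream:

```javascript
// A Response body can be read exactly once; .json() consumes it.
const res = new Response('{"value": 1}');

res.json()
  .then(obj => {
    console.log(obj.value);    // 1
    console.log(res.bodyUsed); // true: the body stream is now consumed
    return res.json();         // second call rejects with a TypeError
  })
  .catch(err => console.log(err instanceof TypeError)); // true
```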
Upvotes: 2