
I was trying to see if there is a way to cache a JSON response from an async fetch call, possibly using an LRU cache.

I've tried using several packages, such as node-cache and lru-cache, but I don't think they worked because my function is asynchronous.

This is what my fetch function basically looks like:

const jsonFetch = async (url) => {
    try {
        const response = await fetch(url);
        const json = await response.json();
        return json;
    }
    catch (error) {
        console.log(error);
    }
}

For example, if someone hits my route 20 times in a minute, I'd like to fetch the data once and return the response within 0.03 ms instead of 0.3 ms. Currently, it always uses the URL to fetch the data.


2 Answers


This question has been here for a while, but I agree with the comment from @sleepy012: to avoid parallel calls, the trick is to cache the promise, not just the resolved value. So something like this should work:

const cache = {}
function cacheAsync(loader) {
  return async (url) => {
    if (url in cache) {                      // return the cached promise if available
      console.log("cache hit")
      return cache[url]
    }
    try {
      const responsePromise = loader(url)    // start the request
      cache[url] = responsePromise           // cache the in-flight promise so parallel
                                             // callers share the same request
      return responsePromise
    }
    catch (error) {                          // only catches synchronous throws from loader;
      console.log('Error', error.message)    // a rejected promise stays in the cache
    }
  };
}


// Simulates a slow request: resolves with a string after one second
function delayedLoader(url) {
  console.log('Loading url: ' + url)
  return new Promise((r) => setTimeout(r, 1000, 'Returning ' + url));
}

const cachedLoader = cacheAsync(delayedLoader);

cachedLoader('url1').then((d) => console.log('First load got: ' + d));
cachedLoader('url1').then((d) => console.log('Second load got: ' + d));
cachedLoader('url2').then((d) => console.log('Third load got: ' + d));
cachedLoader('url2').then((d) => console.log('Fourth load got: ' + d));
console.log('Waiting for load to complete');
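
Since the question mentions lru-cache, the same promise-caching trick can be combined with it to get eviction and a TTL. This is only a minimal sketch, assuming a recent lru-cache (v10+, whose named export is LRUCache and whose constructor takes max and ttl options); the limits and helper name are placeholders:

const { LRUCache } = require('lru-cache')    // v10+ API; older versions export the class differently

const promiseCache = new LRUCache({
  max: 500,                                  // keep at most 500 entries
  ttl: 1000 * 60                             // evict entries after one minute
})

function cachedJsonFetch(url) {
  if (promiseCache.has(url)) {
    return promiseCache.get(url)             // return the in-flight or settled promise
  }
  const promise = fetch(url)
    .then((response) => response.json())
    .catch((error) => {
      promiseCache.delete(url)               // don't keep failed requests in the cache
      throw error
    })
  promiseCache.set(url, promise)             // cache the promise so parallel calls share it
  return promise
}

With the promise cached, parallel hits share a single request, and later hits within the TTL are served from memory.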



There's nothing about async functions that prevents caching results. It's possible the libraries you're looking at can't handle promises, but here's a basic proof of concept that might help get things started:

let cache = {}
const jsonFetch = async (url) => {
    if (url in cache) {                    // return cached result if available
        console.log("cache hit")
        return cache[url]
    }
    try {
        const response = await fetch(url)
        const json = await response.json() // parse before caching
        cache[url] = json                  // cache the parsed response keyed by url
        return json
    }
    catch (error) {
        console.log(error)
    }
}

jsonFetch("https://jsonplaceholder.typicode.com/todos/1").then((user) => console.log(user.id))

// should be cached -- same url
setTimeout(() => jsonFetch("https://jsonplaceholder.typicode.com/todos/1").then((user) => console.log(user.id)), 2000)

// not in cache
setTimeout(() => jsonFetch("https://jsonplaceholder.typicode.com/todos/2").then((user) => console.log(user.id)), 2000)

You will only get cache hits on requests made after the first request has returned a value to cache.

1 Comment

Actually, this code is not 100% correct: if a second request for the same URL arrives while the first one is still being awaited, a second fetch will be fired. Depending on context, this can be anything from harmless to evil.
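
A minimal sketch of one way to close that gap, assuming failed requests should not stay cached: store the in-flight promise itself (as the other answer does) and evict it on rejection so a later call can retry:

let cache = {}
const jsonFetch = (url) => {
    if (url in cache) {
        console.log("cache hit")
        return cache[url]                  // covers in-flight requests too
    }
    const promise = fetch(url)
        .then((response) => response.json())
        .catch((error) => {
            delete cache[url]              // drop the entry so a later call can retry
            console.log(error)             // resolves to undefined on error, like the original
        })
    cache[url] = promise                   // cached before the request resolves
    return promise
}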
