Async cache with dedupe support

Overview

async-cache-dedupe

async-cache-dedupe is a cache for asynchronous fetching of resources with full deduplication: the same resource is fetched only once at any given time, and concurrent requests share the result.
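The core idea can be sketched with a hypothetical helper (not the library's implementation): while a fetch for a key is in flight, further calls for that key share a single promise.

```javascript
// Minimal sketch of promise deduplication: concurrent calls with the
// same key return the same in-flight promise.
function dedupe (fn) {
  const inflight = new Map()
  return function (key) {
    if (inflight.has(key)) return inflight.get(key)
    const p = Promise.resolve(fn(key))
      .finally(() => inflight.delete(key)) // forget the promise once settled
    inflight.set(key, p)
    return p
  }
}

const fetchOnce = dedupe(async (k) => k * 2)
```

Three concurrent calls for the same key trigger only one underlying fetch; async-cache-dedupe layers TTL-based caching and storage on top of this pattern.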

Install

npm i async-cache-dedupe

Example

import { Cache } from 'async-cache-dedupe'

const cache = new Cache({
  ttl: 5 // seconds
})

cache.define('fetchSomething', async (k) => {
  console.log('query', k)
  // query 42
  // query 24

  return { k }
})

const p1 = cache.fetchSomething(42)
const p2 = cache.fetchSomething(24)
const p3 = cache.fetchSomething(42)

const res = await Promise.all([p1, p2, p3])

console.log(res)
// [
//   { k: 42 },
//   { k: 24 },
//   { k: 42 }
// ]

CommonJS/require is also supported.

API

new Cache(opts)

Creates a new cache.

Options:

  • ttl: the maximum time a cache entry can live, in seconds; default 0
  • cacheSize: the maximum number of entries to fit in the cache for each defined method, default 1024.

cache.define(name[, opts], original(arg))

Define a new function to cache with the given name.

Options:

  • ttl: the maximum time a cache entry can live, in seconds; defaults to the value defined in the cache.
  • cacheSize: the maximum number of entries to fit in the cache for each defined method; defaults to the value defined in the cache.
  • serialize: a function to convert the given argument into a serializable object (or string).

The define method adds a cache[name] function that will call the original function if the result is not present in the cache. The cache key for arg is computed using safe-stable-stringify.
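For illustration, the key derivation can be sketched like this (a simplification: JSON.stringify stands in for safe-stable-stringify, and the helper names are hypothetical):

```javascript
// Sketch of cache-key computation: run serialize (if provided) on the
// argument, then stringify the result into the key.
function cacheKey (arg, serialize) {
  const id = serialize ? serialize(arg) : arg
  return typeof id === 'string' ? id : JSON.stringify(id)
}

// A serialize that keys only on the user id: two different argument
// objects with the same id map to the same cache entry.
const byId = (user) => user.id
```

Unlike plain JSON.stringify, safe-stable-stringify produces deterministic output regardless of property insertion order, which is why the library uses it for keys.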

cache.clear([name], [arg])

Clear the cache. If name is specified, all the cache entries from the function defined with that name are cleared. If arg is specified, only the elements cached with the given name and arg are cleared.
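The three invocation shapes can be sketched against a nested map (a hypothetical stand-in for the real storage, not the library's code):

```javascript
// name -> serialized arg -> cached value
const store = new Map([
  ['fetchSomething', new Map([['42', { k: 42 }], ['24', { k: 24 }]])]
])

function clear (name, arg) {
  if (name === undefined) {                   // clear(): everything
    store.forEach((entries) => entries.clear())
  } else if (arg === undefined) {             // clear(name): one function
    store.get(name)?.clear()
  } else {                                    // clear(name, arg): one entry
    store.get(name)?.delete(JSON.stringify(arg))
  }
}
```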

License

MIT

Comments
  • Investigate a faster hashing algorithm than JSON stable stringify

    A quick analysis with a flamegraph shows that safe-stable-stringify is the major bottleneck for this cache.

    We should investigate if we could come up with a faster algorithm for hashing objects but with the same properties.

    opened by mcollina 9
  • feat: use storage and invalidation

    proposal for use of an external storage, with optional invalidation


    TODO

    • [x] test on storage/s
    • [x] 100% coverage (and significant cases)
    • [x] use an empty logger on storage
    • [x] fix CI, add a redis instance
    • [x] restore expiration logic on memory storage / do not use Date.now
    • [x] expose explicit "invalidation" function
    • [x] improve redis storage (pipeline etc)
    • [x] ~~storage gc function~~
    • [x] memory storage with sync deletes
    • [x] redis storage with sync deletes
    • [x] optional invalidation
    • [x] benchmarks
    • [x] redis references TTL
    • [x] redis gc
    • [x] "stale-on-error" > next
    • [x] solve TODOs
    • [x] check redis pipeline
    • [x] memory references: ~~use a btree? a set?~~ arrays are good enough
    • [x] example
      • [x] basic
      • [x] mjs
    • [x] update documentation
    opened by simone-sanfratello 9
  • implement "onError" event listener

    in addition to onDedupe, onHit, and onMiss, we could have an "onError" event

    ~~also consider implementing "stale on error" and related options for when an error occurs; for example, a first implementation could serve the latest response for a configurable amount of time~~

    good first issue 
    opened by simone-sanfratello 6
  • draft: remove ttl, clear query once resolved

    To implement invalidation, I need to remove ttl and clear the deduped promise once it is resolved

    Note: tests where the cache is supposed to be used are not passing

    If this has gone too far from the original purpose, I'm happy to start a new module

    opened by simone-sanfratello 3
  • log.debug isn't defined

    Good day. When the log level in fastify is set to a level higher than debug, log.debug becomes undefined and causes a failure. I can submit a PR that checks for the level's existence and skips logging if it isn't there.

    opened by salzhrani 2
  • Dynamic TTL for each function call

    Let's say we want to cache something like an access token where expiration is set from the server.

    
    const cache = createCache();
    cache.define('fetchSomething', fetchSomethingHandler);
    
    async function fetchSomethingHandler() {
      const data = { "token": "abc", "expiresInSeconds": 60 }
    
      // something like this
      cache.fetchSomething.ttl(data.expiresInSeconds)
      return data;
    }
    

    expiresInSeconds changes on every function call
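Per the "ttl as a function" release notes below (v1.5.0/v1.6.0), this landed as support for passing a function as ttl; a sketch, assuming the callback receives the resolved result:

```javascript
// Hypothetical callback: derive the TTL from each result.
const ttlFromToken = (data) => data.expiresInSeconds

// cache.define('fetchSomething', { ttl: ttlFromToken }, fetchSomethingHandler)
```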

    opened by seanghay 2
  • fix(cache): clean up dedupes

    This fixes a memory leak in the cache for dedupes. We were previously setting the value of each key to undefined instead of deleting it, which caused a slow leak that ultimately degrades performance.

    opened by evanlucas 2
  • Bypass storage for functions whose TTL is 0

    Functions whose TTL is set to 0 will try to retrieve the cached value from storage, even when the value is never stored in the first place. We should bypass storage completely, to avoid the performance hit of the additional lookup.

    opened by dualbus 2
  • Global cache TTL overrides function TTL when the function TTL is 0

    The global cache TTL takes precedence over the more specific function TTL when the function TTL is 0. This happens in https://github.com/mcollina/async-cache-dedupe/blob/afdf82b7a6e7a2e51e596ac6f532f78f8c390381/src/cache.js#L103, since opts.ttl is falsy.

    Please see the following example, where caching should be disabled for fetchSomething. As can be seen from the output, the function is called only once.

    $ cat example.mjs
    import { createCache } from 'async-cache-dedupe'
    
    const cache = createCache({
      ttl: 5,
      storage: { type: 'memory' },
    })
    
    cache.define('fetchSomething', { ttl: 0 }, async (k) => {
      console.log('query', k)
      return { k }
    })
    
    await cache.fetchSomething(1)
    await cache.fetchSomething(1)
    await cache.fetchSomething(1)
    
    $ node example.mjs 
    query 1
    

    I've tested against version 1.2.2 of async-cache-dedupe.

    $ cat package.json 
    {
      "name": "zero-ttl",
      "version": "1.0.0",
      "dependencies": {
        "async-cache-dedupe": "1.2.2"
      }
    }
    
    bug good first issue 
    opened by dualbus 2
  • Bump tap from 15.2.3 to 16.1.0

    Bumps tap from 15.2.3 to 16.1.0.

    Release notes

    Sourced from tap's releases.

    v16.0.0

    https://node-tap.org/changelog/#160---2022-03-05

    Commits

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    dependencies 
    opened by dependabot[bot] 1
  • Bump tap from 15.2.3 to 16.0.1

    Bumps tap from 15.2.3 to 16.0.1.

    Release notes

    Sourced from tap's releases.

    v16.0.0

    https://node-tap.org/changelog/#160---2022-03-05

    Commits

    dependencies 
    opened by dependabot[bot] 1
  • Stale-while-revalidate style cache strategy

    For some use cases we don't necessarily need the latest data as soon as the ttl has passed, so it can be useful to specify a separate staleness interval: we keep serving data that's considered stale while triggering a refetch in the background.

    This feature is similar in principle to #14, and can possibly share some implementation details.

    Proposal:

    Accept a new staleWhileRevalidate: number argument (or maybe staleWhileRefetch? since we're not necessarily going to be doing any revalidation here), specifying the staleness interval.

    Between the ttl and the staleness interval, data is considered stale, and any requests for it will resolve immediately with stale data while triggering a deduped refetch of the data in the background. After the interval, stale data must not be used and requests for the data must await on the refetch of the new data before being served.

    Defaulting to 0 should preserve existing behavior where no stale data is ever served.
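The proposed behavior can be sketched as a staleness decision (hypothetical helper, not library code; times in seconds):

```javascript
// Given the entry's age, the ttl, and the staleWhileRevalidate window,
// decide how a request for that entry should be served.
function staleness (ageSeconds, ttl, staleWhileRevalidate) {
  if (ageSeconds <= ttl) return 'fresh'       // serve cached value as-is
  if (ageSeconds <= ttl + staleWhileRevalidate) {
    return 'stale'                            // serve cached value, refetch in background
  }
  return 'expired'                            // await a fresh fetch
}
```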

    Thoughts?

    I have a bunch of higher priority customer-facing stuff I have to work on over the next few weeks before I'm going to be able to work on an optimization making use of this, but I'd be happy to take a crack at a PR if nobody has tackled it by then!

    opened by lewisl9029 1
  • implement "stale on error"

    when an error occurs, we can add a "staleOnError" option to serve the latest cached response, for example

    const cache = createCache({ ttl: 60 })
    
    cache.define('fetchUser', {
      staleOnError: 10
    }, (id) => database.find({ table: 'users', where: { id } }))
    

    note that the error is raised by the defined function, so in this example the database may not be responding

    for the first version I'd go with a simple time in seconds; later we could add a function for custom logic, but I'm not sure at the moment

    we must also renew the cache ttl for stale entries

    this logic should go here https://github.com/mcollina/async-cache-dedupe/blob/main/src/cache.js#L247
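The described flow can be sketched as follows (hypothetical helper, not library code; storedAt and now are timestamps in seconds):

```javascript
// On failure, fall back to the cached entry if it is still within the
// staleOnError window; otherwise rethrow the original error.
async function fetchWithStaleOnError (fn, entry, staleOnError, now) {
  try {
    return { value: await fn(), storedAt: now }
  } catch (err) {
    if (entry && now - entry.storedAt <= staleOnError) {
      return { value: entry.value, storedAt: now } // renew ttl while staling
    }
    throw err
  }
}
```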

    good first issue 
    opened by simone-sanfratello 3
Releases(v1.6.0)
  • v1.6.0(Nov 18, 2022)

    What's Changed

    • ttl may be a function, handle appropriately by @pvogel1967 in https://github.com/mcollina/async-cache-dedupe/pull/38

    New Contributors

    • @pvogel1967 made their first contribution in https://github.com/mcollina/async-cache-dedupe/pull/38

    Full Changelog: https://github.com/mcollina/async-cache-dedupe/compare/v1.5.0...v1.6.0

  • v1.5.0(Oct 26, 2022)

    What's Changed

    • feat: ttl as a function by @seanghay in https://github.com/mcollina/async-cache-dedupe/pull/37

    New Contributors

    • @seanghay made their first contribution in https://github.com/mcollina/async-cache-dedupe/pull/37

    Full Changelog: https://github.com/mcollina/async-cache-dedupe/compare/v1.4.2...v1.5.0

  • v1.4.2(Oct 13, 2022)

    What's Changed

    • fix(cache): clean up dedupes by @evanlucas in https://github.com/mcollina/async-cache-dedupe/pull/35

    New Contributors

    • @evanlucas made their first contribution in https://github.com/mcollina/async-cache-dedupe/pull/35

    Full Changelog: https://github.com/mcollina/async-cache-dedupe/compare/v1.4.1...v1.4.2

  • v1.4.1(Sep 16, 2022)

    What's Changed

    • fix: error handling in references function by @simone-sanfratello in https://github.com/mcollina/async-cache-dedupe/pull/34

    Full Changelog: https://github.com/mcollina/async-cache-dedupe/compare/v1.4.0...v1.4.1

  • v1.4.0(Jun 30, 2022)

    What's Changed

    • Bypass storage for functions whose TTL is 0 by @dualbus in https://github.com/mcollina/async-cache-dedupe/pull/32

    Full Changelog: https://github.com/mcollina/async-cache-dedupe/compare/v1.3.0...v1.4.0

  • v1.3.0(Jun 14, 2022)

    What's Changed

    • Bump ioredis from 4.28.5 to 5.0.1 by @dependabot in https://github.com/mcollina/async-cache-dedupe/pull/26
    • fix #30 function ttl of 0 is overwritten by global ttl by @dualbus in https://github.com/mcollina/async-cache-dedupe/pull/31
    • Bump tap from 15.2.3 to 16.2.0 by @dependabot in https://github.com/mcollina/async-cache-dedupe/pull/29
    • Bump standard from 16.0.4 to 17.0.0 by @dependabot in https://github.com/mcollina/async-cache-dedupe/pull/27

    New Contributors

    • @dualbus made their first contribution in https://github.com/mcollina/async-cache-dedupe/pull/31

    Full Changelog: https://github.com/mcollina/async-cache-dedupe/compare/v1.2.2...v1.3.0

  • v1.2.2(Mar 7, 2022)

    What's Changed

    • fix: url by @liuhanqu in https://github.com/mcollina/async-cache-dedupe/pull/22
    • fix: emitting onError when defined cache fails by @ramonmulia in https://github.com/mcollina/async-cache-dedupe/pull/23

    New Contributors

    • @liuhanqu made their first contribution in https://github.com/mcollina/async-cache-dedupe/pull/22
    • @ramonmulia made their first contribution in https://github.com/mcollina/async-cache-dedupe/pull/23

    Full Changelog: https://github.com/mcollina/async-cache-dedupe/compare/v1.2.1...v1.2.2

  • v1.2.1(Mar 2, 2022)

    What's Changed

    • fix: lint by @simone-sanfratello in https://github.com/mcollina/async-cache-dedupe/pull/20
    • fix: custom storage options by @simone-sanfratello in https://github.com/mcollina/async-cache-dedupe/pull/19
    • fix: lint by @simone-sanfratello in https://github.com/mcollina/async-cache-dedupe/pull/21

    Full Changelog: https://github.com/mcollina/async-cache-dedupe/compare/v1.2.0...v1.2.1

  • v1.2.0(Feb 25, 2022)

    What's Changed

    • feat: invalidation with wildcard by @simone-sanfratello in https://github.com/mcollina/async-cache-dedupe/pull/17

    Full Changelog: https://github.com/mcollina/async-cache-dedupe/compare/v1.1.0...v1.2.0

  • v1.1.0(Jan 20, 2022)

    What's Changed

    • feat: add onError listener by @zbo14 in https://github.com/mcollina/async-cache-dedupe/pull/13

    New Contributors

    • @zbo14 made their first contribution in https://github.com/mcollina/async-cache-dedupe/pull/13

    Full Changelog: https://github.com/mcollina/async-cache-dedupe/compare/v1.0.0...v1.1.0

  • v1.0.0(Jan 13, 2022)

    What's Changed

    • Bump mnemonist from 0.38.5 to 0.39.0 by @dependabot in https://github.com/mcollina/async-cache-dedupe/pull/10
    • feat: use storage and invalidation by @simone-sanfratello in https://github.com/mcollina/async-cache-dedupe/pull/7

    New Contributors

    • @simone-sanfratello made their first contribution in https://github.com/mcollina/async-cache-dedupe/pull/7

    Full Changelog: https://github.com/mcollina/async-cache-dedupe/compare/v0.5.0...v1.0.0

  • v0.5.0(Nov 16, 2021)

    What's Changed

    • Bump safe-stable-stringify from 1.1.1 to 2.0.0 by @dependabot in https://github.com/mcollina/async-cache-dedupe/pull/5
    • Dedupe only with no ttl by @mcollina in https://github.com/mcollina/async-cache-dedupe/pull/9

    Full Changelog: https://github.com/mcollina/async-cache-dedupe/compare/v0.4.0...v0.5.0

  • v0.4.0(Aug 19, 2021)

    What's Changed

    • Added onHit option by @mcollina in https://github.com/mcollina/async-cache-dedupe/pull/4

    Full Changelog: https://github.com/mcollina/async-cache-dedupe/compare/v0.3.0...v0.4.0

  • v0.3.0(Aug 6, 2021)

    What's Changed

    • Fix repository url by @mooyoul in https://github.com/mcollina/async-cache-dedupe/pull/2
    • Pass the cache key as argument by @mcollina in https://github.com/mcollina/async-cache-dedupe/pull/3

    New Contributors

    • @mooyoul made their first contribution in https://github.com/mcollina/async-cache-dedupe/pull/2
    • @mcollina made their first contribution in https://github.com/mcollina/async-cache-dedupe/pull/3

    Full Changelog: https://github.com/mcollina/async-cache-dedupe/compare/v0.1.0...v0.3.0

  • v0.1.0(Jul 17, 2021)

Owner
Matteo Collina
Technical Director @nearform, TSC member @nodejs, IoT Expert, Conference Speaker, Ph.D.