Move redis adapter to separate npm library #87

Merged · 8 commits · Feb 9, 2024
README.md (43 changes: 13 additions, 30 deletions)

```diff
@@ -52,40 +52,23 @@ npm install @epic-web/cachified
 import { LRUCache } from 'lru-cache';
-import { cachified, CacheEntry, Cache, totalTtl } from '@epic-web/cachified';
+import { cachified, CacheEntry, Cache } from '@epic-web/cachified';
 
 /* lru cache is not part of this package but a simple non-persistent cache */
 const lruInstance = new LRUCache<string, CacheEntry>({ max: 1000 });
 
-/* defines options for the set method for lru cache */
-interface LRUishCache extends Omit<Cache, 'set'> {
-  set(
-    key: string,
-    value: CacheEntry<unknown>,
-    options?: { ttl?: number; start?: number },
-  ): void;
-}
-
-/* creates a wrapper for the lru cache so that it can easily work with cachified
-   and ensures the lru cache cleans up outdated values itself */
-function lruCacheAdapter(lruCache: LRUishCache): Cache {
-  return {
-    set(key, value) {
-      const ttl = totalTtl(value?.metadata);
-      return lruCache.set(key, value, {
-        ttl: ttl === Infinity ? undefined : ttl,
-        start: value?.metadata?.createdTime,
-      });
-    },
-    get(key) {
-      return lruCache.get(key);
-    },
-    delete(key) {
-      return lruCache.delete(key);
-    },
-  };
-}
-
-const lru = lruCacheAdapter(lruInstance);
+const lru: Cache = {
+  /* Note that value here exposes metadata that includes things such as ttl and createdTime */
+  set(key, value) {
+    return lruInstance.set(key, value);
+  },
+  get(key) {
+    return lruInstance.get(key);
+  },
+  delete(key) {
+    return lruInstance.delete(key);
+  },
+};
```
**Collaborator:**

This implementation has no benefit over:

```ts
const cache = new LRUCache<string, CacheEntry>({ max: 1000 });

cachified({ cache, /* ... */ });
```

One of the main reasons for an adapter is to set the TTL on the cached item in a way that the cache will clean it up itself, without involving cachified.

Proposing to either not create an on-the-fly adapter (example above) or do the following:

```ts
/* lru cache is not part of this package but a simple non-persistent cache */
const lruInstance = new LRUCache<string, CacheEntry>({ max: 1000 });

/* totalTtl is exported from '@epic-web/cachified' */
const cache: Cache = {
  set(key, value) {
    const ttl = totalTtl(value?.metadata);
    return lruInstance.set(key, value, {
      ttl: ttl === Infinity ? undefined : ttl,
      start: value?.metadata?.createdTime,
    });
  },
  get(key) {
    return lruInstance.get(key);
  },
  delete(key) {
    return lruInstance.delete(key);
  },
};
```
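For context, a minimal sketch of what a `totalTtl`-style helper computes. The metadata shape and field names below are assumptions for illustration, not the library's exact types:

```typescript
// Assumed shape of a cached entry's metadata (hypothetical field
// names; check the library's own types for the real ones).
interface Metadata {
  ttl: number | null; // fresh window in ms; null means "no expiry"
  swr?: number;       // stale-while-revalidate window in ms
  createdTime: number;
}

// The store should keep the entry for the fresh window plus the stale
// window. A null ttl maps to Infinity (never expire), which the adapter
// above then converts to undefined before handing it to lru-cache.
function totalTtl(metadata?: Metadata): number {
  if (!metadata) return 0;
  if (metadata.ttl == null) return Infinity;
  return metadata.ttl + (metadata.swr ?? 0);
}
```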

**Collaborator:**

Would vote for the latter option. In real apps this is quite critical in order not to bloat your cache DB.
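The bloat concern can be illustrated with a self-contained mock: a plain `Map`-based store standing in for lru-cache (the entry shape and explicit-time `sweep` are assumptions made for determinism, not lru-cache's API):

```typescript
// Mock store that, like lru-cache, can evict entries on its own when
// given a ttl at set() time. Time is passed in explicitly so the
// example is deterministic.
class MockStore {
  private entries = new Map<string, { value: unknown; expiresAt: number }>();
  set(key: string, value: unknown, opts?: { ttl?: number; start?: number }) {
    const start = opts?.start ?? 0;
    // No ttl means the entry never expires on its own.
    const expiresAt = opts?.ttl == null ? Infinity : start + opts.ttl;
    this.entries.set(key, { value, expiresAt });
  }
  // Stands in for the store's own background eviction.
  sweep(now: number) {
    for (const [key, entry] of this.entries) {
      if (entry.expiresAt <= now) this.entries.delete(key);
    }
  }
  get size() {
    return this.entries.size;
  }
}

const naive = new MockStore();   // adapter that never forwards a ttl
const withTtl = new MockStore(); // adapter that forwards the entry's ttl

naive.set('user:1', 'stale payload');
withTtl.set('user:1', 'stale payload', { ttl: 1000, start: 0 });

naive.sweep(5000);   // dead entry is retained: the store bloats
withTtl.sweep(5000); // the store evicted the entry on its own
```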

**Member:**


Yep, I agree. Let's update the example to the one that sets the ttl 👍

**Contributor Author:**

Ahh that makes sense! Thank you for the help!

I've updated it now to the above example.


```diff
 function getUserById(userId: number) {
   return cachified({
```