Add cache option #284
Conversation
Force-pushed from aadd907 to 06c0eba
Got it working.
Ok, I've done some digging and a Node.js response is an instance of http.IncomingMessage. So to make the cached version compatible with the Node.js response there are quite a few things we need to support, as well as some methods/properties that we can't support on the cached response (e.g. socket-related properties).

We'll need to either add support for all these things or accept the incompatibility. There is also the issue that we'll need to manually keep the cached response up to date with the real Node.js response API.

This seems like kind of a hacky solution. Maybe it would be better to limit the cached responses to only basic data, although this is probably not gonna be obvious to people unless they read the docs. Or alternatively just do this as a wrapper that never exposes the Node.js response.

I'll wait for feedback before going any further with this.
Maybe https://github.com/pornel/http-cache-semantics might be of some use. It seems to have pretty extensive coverage of all the caching possibilities and quirks in HTTP. Even if it's just for inspiration.
Yes, let's limit it to the basic data: `statusCode`, `headers`, `body`, `url`.
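A minimal sketch of what reducing a response to just those four fields could look like (not Got's actual code; `toCacheEntry` is a hypothetical helper and `response` is a plain object standing in for an `http.IncomingMessage`):

```javascript
// Sketch: keep only the four basic fields mentioned above.
function toCacheEntry(response, body) {
	return {
		statusCode: response.statusCode,
		headers: response.headers,
		body,
		url: response.url
	};
}

const entry = toCacheEntry(
	{statusCode: 200, headers: {etag: '"abc"'}, url: 'https://example.com'},
	Buffer.from('hello')
);
```

Everything else on the real response (sockets, trailers, stream internals) is deliberately dropped, which is exactly the trade-off being discussed.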
Seems like you need to do that regardless, as you have yet to get a response on your PR.
**index.js** (outdated)

```diff
@@ -58,6 +61,20 @@ function requestAsEventEmitter(opts) {
 		response.url = redirectUrl || requestUrl;
 		response.requestUrl = requestUrl;

+		if (opts.cache && statusCode === 200 && req.method === 'GET' && (expired(response.headers) === false || 'etag' in response.headers)) {
+			const stream = opts.encoding === null ? getStream.buffer(response) : getStream(response, opts);
```
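The `expired()` helper referenced in this hunk isn't shown in the diff. A naive stand-in, checking only the `Expires` header, might look like this (an assumption for illustration; real freshness logic would also need to parse `Cache-Control`, e.g. via http-cache-semantics):

```javascript
// Hypothetical stand-in for the expired() helper used in the hunk above.
// Only looks at the Expires header; Cache-Control: max-age etc. are ignored.
function expired(headers, now = Date.now()) {
	if (!headers.expires) {
		return true; // no freshness info: treat as already expired
	}
	return Date.parse(headers.expires) <= now;
}
```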
```js
const encoding = opts.encoding === null ? 'buffer' : opts.encoding;
const stream = getStream(response, {encoding});
```
👌
Force-pushed from fccafe0 to df09c96
Just an idea. However, this functionality is being written in a way that will allow someone to use any cache store with Got, and I think TTL is useful information to pass through. What about if we change the interface?

Then we could provide a LevelUP cache plugin for Got so using it with LevelUP and friends is still simple:

```js
const got = require('got');
const gotCacheLevelup = require('got-cache-levelup');

const db = gotCacheLevelup('mydb', {db: require('redisdown'), host: 'localhost', port: 6379});
got('foo.com', {cache: db});
```

But then if someone wanted to implement a native Redis cache plugin for Got they could make use of the TTL.

Also, I was just using callbacks so it worked OOB with a LevelUP db instance. If we are setting our own spec for cache "plugins" then we could require a Promise interface, which would be a bit neater. In my mind I see a cache plugin looking like this:

```js
module.exports = {
	set: (key, val, ttl) => new Promise(),
	get: key => new Promise(),
	del: key => new Promise()
};
```

Thoughts?
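One way such a plugin could adapt a callback-based store (LevelUP-style) to the Promise interface above. This is a sketch, not a real got-cache plugin: `promisifyStore` is a hypothetical helper and the callback store here is a Map-backed stand-in, not an actual LevelUP instance:

```javascript
// Wrap a callback-style store in the Promise-based set/get/del interface.
function promisifyStore(store) {
	return {
		set: (key, val, ttl) => new Promise((resolve, reject) =>
			store.put(key, val, {ttl}, err => (err ? reject(err) : resolve()))),
		get: key => new Promise((resolve, reject) =>
			store.get(key, (err, val) => (err ? reject(err) : resolve(val)))),
		del: key => new Promise((resolve, reject) =>
			store.del(key, err => (err ? reject(err) : resolve())))
	};
}

// Stand-in callback store for demonstration (ignores the ttl option).
const backing = new Map();
const mapStore = {
	put: (key, val, opts, cb) => { backing.set(key, val); cb(null); },
	get: (key, cb) => cb(null, backing.get(key)),
	del: (key, cb) => { backing.delete(key); cb(null); }
};

const cache = promisifyStore(mapStore);
```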
Instead of adding an additional parameter to it, we can just run the input through it.
We can still pass the TTL as the third argument to `set` and match the `Map()` API:

```js
const map = new Map();
map.set('foo', 'bar', 100);
map.get('foo');
// "bar"
```

So e.g. if the underlying store supports TTL natively:

```js
const cache = require('some-cache-store');

const gotCachePlugin = {
	set: (key, val, ttl) => cache.put(key, val, ttl),
	get: key => cache.get(key)
};

module.exports = gotCachePlugin;
```
Sounds good 👌
I was proposing we just support TTL natively, so users can use any kind of "dumb" storage and get TTL support without doing anything extra.
I see. But there's no benefit to checking the TTL when we retrieve a cached item; everything is already checked inside the cache-validation logic. The only use for storing the TTL would be to, on a regular interval, loop over all items in the cache and delete each item which has exceeded its TTL. IMHO that's not a good idea and would be a job better handled by the underlying database.
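For concreteness, the kind of periodic TTL sweep being argued against would look something like this (a sketch only; with a plain `Map` the expiry timestamp has to be tracked in the entry itself, which is an assumed store shape):

```javascript
// Periodic sweep: delete every entry whose expiry timestamp has passed.
// This is the pattern the comment above argues should be left to the
// underlying database rather than done by the HTTP client.
function sweepExpired(map, now = Date.now()) {
	for (const [key, entry] of map) {
		if (entry.expires !== undefined && entry.expires <= now) {
			map.delete(key);
		}
	}
}

const cache = new Map();
cache.set('fresh', {value: 1, expires: Date.now() + 60000});
cache.set('stale', {value: 2, expires: Date.now() - 1});
sweepExpired(cache);
```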
@sindresorhus thanks for taking the time to go over this! All the latest requested changes have been implemented. Two things I'm not quite sure about:

**New properties**

Is this ok: 4d9ab59? Pulled straight from the existing docs.

**Cache link**

There are currently two headings titled "cache", which means the `#cache` href is ambiguous. Is this ok? It obviously works fine as is, but I'm wondering if re-ordering or something in the future could break the other link.

https://github.com/lukechilds/got/tree/cache#cache
I think you can use an HTML link to force it:

```html
<a name="cache"></a>
### Cache
```
> Is this ok?

It's worth noting it won't behave exactly like the original, because the original has references to the socket and stuff, which I obviously can't do with a cached response. However, none of those core Node.js HTTP response stream methods are actually explicitly mentioned in the Got docs, so you could consider them undocumented properties from Got's perspective and argue the behaviour is exactly the same. Anyway, I linked to the responselike module so they can view more information on exactly what properties it supports.
Force-pushed from acc9557 to aacb9c9
Hey, just in case you're about to merge this: can I make a quick tweak to Keyv first? It won't change any current behaviour, but it will store the data in a way that will make it much easier for me to make changes to Keyv in the future without affecting third-party users like Got. I want to get this in before Keyv gets more usage. It's literally only a tiny change, just not backwards compatible. Working on it now.
Sorted! Details on why the change was needed here: jaredwray/keyv#29
This reverts commit 3c79205.
Brings the npm pkg from 0.75MB to 0.45MB after `yarn autoclean --force`, down to 0.25MB from this PR: sindresorhus#284. Next big savings is removing progress, from sindresorhus#322.
Resolves #281
To keep track of what's implemented:

- `Map()` API
- `If-Modified-Since`/`If-None-Match` header

Ok, I hacked this together last night. I'm aware there are a few things that need tidying up / potentially unhandled errors etc. Just submitting this PR for some feedback.
I'm caching the url/headers/body of all responses that satisfy the conditions in the diff: a 200 status code, a GET request, and either not expired or carrying an `etag` header.

This is working; I can view them in the cache. The issue is that to serve them up I need to create a Node.js response (or similar) object to be passed through Got's `asPromise()`/`asStream()` fns, and to make sure all the usual response API methods are available on the returned object. I was hoping `http-response-object` was gonna do the job here, but it looks like it only implements a basic subset of the Node.js response object.

When I emit the response-like object I get an error. I think this is coming from `getStream()` because `http-response-object` doesn't implement some stream stuff from the Node.js response object.

Any thoughts? I'm kind of out of my depth here in regards to streams and the native response object. Is there a simpler way to do this? Or maybe a straightforward way to create a fake response object?

// @sindresorhus @reconbot @alextes