Try out node-fetch instead of got #6914
Conversation
  return new URLSearchParams(qs)
}

function request2NodeFetch({ url, options }) {
There's a bit of a separation of concerns issue going on here. Part of what I am doing here is translating request concepts to node-fetch concepts.
That said, if we go with node-fetch I think there are some abstractions here that we would probably want to retain and make available to services anyway, for example the qs/form abstraction. For the purpose of evaluating performance I think this is fine, and we can clean things up if we decide node-fetch is the way forward.
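
For context, here is a minimal sketch of the kind of translation being described: a request-style `{ url, options }` pair mapped onto node-fetch's `(url, options)`. The function names match the diff, but the bodies are an illustrative reconstruction rather than the PR's actual code:

```js
// Illustrative only: a rough reconstruction of the translation idea,
// not the PR's exact implementation.
function object2URLSearchParams(qs) {
  // request accepts a plain object for query strings and url-encoded forms;
  // URLSearchParams is the fetch-world equivalent.
  return new URLSearchParams(qs)
}

function request2NodeFetch({ url, options = {} }) {
  const nodeFetchOptions = {
    method: options.method,
    headers: { ...options.headers },
  }

  // request's `qs` option becomes part of the URL itself.
  const nodeFetchUrl = new URL(url)
  if ('qs' in options) {
    nodeFetchUrl.search = object2URLSearchParams(options.qs).toString()
  }

  // request's `form` option becomes a url-encoded body
  // (multipart is discussed further down).
  if ('form' in options) {
    nodeFetchOptions.body = object2URLSearchParams(options.form)
  }

  return { nodeFetchUrl: nodeFetchUrl.toString(), nodeFetchOptions }
}
```

The real version presumably handles more request options (auth, JSON bodies, timeouts); this only shows the shape of the mapping.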
}

if ('form' in requestOptions) {
  nodeFetchOptions.body = object2URLSearchParams(requestOptions.form)
There are two ways you can do form encoding: URL-encoded and multipart. I've implemented URL-encoded here. This:
- is simpler to implement
- is easier to mock with nock (see the sketch below)
- works with all the services we've got that are sending forms

If we wanted to do multipart encoding instead, that implementation would look something like:

import FormData from 'form-data'

const form = new FormData()
for (const [key, value] of Object.entries(requestOptions.form)) {
  form.append(key, value)
}
nodeFetchOptions.body = form

// multipart bodies need the boundary header generated by form-data
const formHeaders = form.getHeaders()
nodeFetchOptions.headers['content-type'] = formHeaders['content-type']

delete requestOptions.form

One thing I haven't really got my head round is how higher-level clients decide, or allow the user to specify, when to use each encoding type. Given URL encoding works for everything we're currently doing, I reckon we can leave this how it is and worry about multipart in future if we move forward with node-fetch and find a use case that needs it.
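
To illustrate the nock point: a url-encoded body can be matched as a plain string in a test. This is a sketch with a made-up host, path and field names, not a test from the PR:

```js
import nock from 'nock'
import fetch from 'node-fetch'

// Hypothetical host and path, purely for illustration.
nock('https://example.test')
  // URLSearchParams serialises in insertion order, so the url-encoded body
  // can be matched as a simple string (or an equivalent object).
  .post('/login', 'username=badge&password=secret')
  .reply(200, 'ok')

const body = new URLSearchParams({ username: 'badge', password: 'secret' })
// node-fetch sets the application/x-www-form-urlencoded content-type
// automatically when the body is a URLSearchParams instance.
const resp = await fetch('https://example.test/login', { method: 'POST', body })
console.log(await resp.text()) // ok
```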
try {
  const resp = await fetch(nodeFetchUrl, nodeFetchOptions)
  const body = await resp.text()
  resp.statusCode = resp.status
I've just done this for compatibility/convenience. There's a bunch of code that expects the response to have a `statusCode` property, and this is the path of least resistance.
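
A hedged sketch of what that compatibility shim amounts to; the wrapper name and return shape are assumptions, only the `statusCode = status` aliasing comes from the diff:

```js
import fetch from 'node-fetch'

// Hypothetical wrapper; the PR's real function and return shape may differ.
async function fetchWithStatusCode(nodeFetchUrl, nodeFetchOptions) {
  const resp = await fetch(nodeFetchUrl, nodeFetchOptions)
  const body = await resp.text()
  // node-fetch exposes `status`, but a lot of existing code reads
  // `statusCode` (a request-era convention), so alias it on the response
  // rather than touching every caller.
  resp.statusCode = resp.status
  return { res: resp, buffer: body }
}
```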
Yup, will do.
I've had a read through the code. Changes look good to me. 👍🏻
Will test this in my env this weekend.
Working fine behind a proxy 👍
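
For anyone checking the proxy path, this is roughly how node-fetch traffic can be routed through a proxy via its `agent` option. The package choice, env var and URL here are assumptions, not details from the PR:

```js
import fetch from 'node-fetch'
import HttpsProxyAgent from 'https-proxy-agent' // assumes the v5-style default export

// Assumption: the proxy URL comes from an env var such as HTTPS_PROXY.
const proxyUrl = process.env.HTTPS_PROXY
const agent = proxyUrl ? new HttpsProxyAgent(proxyUrl) : undefined

// node-fetch accepts a standard http(s) Agent, so every request can be
// funnelled through the proxy when one is configured.
const resp = await fetch('https://img.example.test/badge', { agent })
console.log(resp.status)
```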
Updated this with a view to deploying it over the weekend. Managed to coordinate this with NPM having an outage 😄 and it turns out there are two tests in the core suite that reach out to NPM. Maybe we should mock them. Aside from that, all the service failures seem like our usual flaky friends. I'll re-run the tests once the NPM outage clears just to be sure, but I plan to deploy this, run it for a week or so, and see if we can nail down a decision on where we're going with this migration.
Right. All seems to be in order. I'll plan to deploy this tomorrow evening.
The saga continues
Refs #4655
OK, so let's start off with a summary of where we are:
We migrated the majority of our HTTP traffic to got. There's definitely some noticeable increase in memory usage compared to request, although we're now running with a bit more headroom. Before doubling down on got and moving the rest of our traffic, let's try out node-fetch and see if we get better performance with that first. An interesting side note is that right now our resource usage seems weirdly low. Maybe everyone is on summer holidays? Once reviewed, we might want to hold off deploying this until our usage looks a bit more typical.
Having got this far with node-fetch: if the resource usage seems about the same, I'm in favour of got because the higher-level abstraction is just a bit nicer to work with, but if it looks like node-fetch improves our memory footprint I think it's a fine solution to move forward with.
For the moment I'm really just interested in seeing what happens to performance with node-fetch vs got, so I've left `influx-metrics.js` using got for the moment. It's not on the 'critical path'. Once we've decided on a way forward we can standardize on one client, but while we work it out we're going to have a random mashup of some stuff using node-fetch, some stuff still using request, and `influx-metrics.js` using got.

@calebcartwright are you able to double-check this works OK through a proxy when you get a chance?
There will be a few failing service tests on the build, but they are all unrelated. It's #6913, #6912 and the usual suspects for flaky/transient failures.