maxSockets trashes global value; not server-specific #1346

Closed
garthk opened this issue Jan 21, 2014 · 1 comment · Fixed by #1458
Labels: bug (Bug or defect), feature (New functionality or improvement)

Comments

garthk commented Jan 21, 2014

> var hapi = require('hapi');
undefined
> var server = new hapi.Server({ maxSockets: 23 });
undefined
> require('http').globalAgent.maxSockets
23

This breaks my minimal-code way of throttling only some parts of a REST API. Can we have Server set this.agent to false by default, create its own http.Agent when options.maxSockets is present, and have Proxy pull it from request.server to pass to Nipple?
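
For illustration, a minimal sketch of what that could look like. The createAgent helper and the request.server._agent property are made-up names for this example, not hapi's actual API:

```js
// Sketch only — hypothetical shape of the proposed fix; createAgent and
// request.server._agent are assumptions, not hapi internals.
var Http = require('http');

function createAgent(options) {

    // Leave the agent as false unless maxSockets was configured, so the
    // process-wide global agent is never touched.
    if (options.maxSockets === undefined) {
        return false;
    }

    // A dedicated agent scoped to this server, instead of mutating
    // Http.globalAgent.maxSockets for the whole process.
    return new Http.Agent({ maxSockets: options.maxSockets });
}

// In the proxy handler, the per-server agent would then be handed to Nipple
// rather than relying on the global agent, e.g.:
//
//     var options = { agent: request.server._agent || undefined };
//     Nipple.request(request.method, uri, options, callback);
```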

jmonster pushed a commit to jmonster/hapi that referenced this issue Feb 7, 2014
jmonster pushed a commit to jmonster/hapi that referenced this issue Feb 8, 2014
jmonster pushed a commit to jmonster/hapi that referenced this issue Feb 8, 2014
hueniverse added the bug label Feb 25, 2014
Marsup added the feature (New functionality or improvement) label and removed the request label Sep 20, 2019
lock bot commented Jan 9, 2020

This thread has been automatically locked due to inactivity. Please open a new issue for related bugs or questions following the new issue template instructions.

lock bot locked as resolved and limited conversation to collaborators Jan 9, 2020