Limited to 1024 concurrent connections, and looking for suggestions. #300

Open
rmmoul opened this issue Mar 26, 2015 · 66 comments
@rmmoul

rmmoul commented Mar 26, 2015

I started a project using ratchet, and wanted to test the number of connections that could be handled at one time on our server (Digital Ocean Ubuntu 14.04, 2 cores, 4GB ram running php 5.6.7 and apache2 2.4.7).

I followed some of the suggestions on the deploy page http://socketo.me/docs/deploy to help increase the number of connections that could be handled, and managed to raise the ulimit for open files to 10,000.

I started running tests today using thor (https://github.com/observing/thor):

thor --amount 10000 ws://example.com:2600 -C 1000 -W 2 -M 100 

I got a php error when the number of connections exceeded 1024:

PHP Warning:  stream_select(): You MUST recompile PHP with a larger value of FD_SETSIZE.
It is set to 1024, but you have descriptors numbered at least as high as 1123.
 --enable-fd-setsize=2048 is recommended, but you may want to set it
to equal the maximum number of open files supported by your system,
in order to avoid seeing this error again at a later date. in
/var/www/example.com/server/vendor/react/event-loop/StreamSelectLoop.php on line 255

I was actually using php 5.5.9 at the time, so I followed some old instructions from http://ubuntuforums.org/archive/index.php/t-2130554.html and increased the FD_SETSIZE value to 10000 in the following two files and then downloaded and compiled php 5.6.7.

/usr/include/linux/posix_types.h
/usr/include/x86_64-linux-gnu/bits/typesizes.h
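
Roughly, the change in both headers looks like this (a sketch assuming stock Ubuntu headers, where FD_SETSIZE is derived from the __FD_SETSIZE macro; paths may differ on other distros):

# check the current value in both headers before rebuilding PHP
grep -n "__FD_SETSIZE" /usr/include/linux/posix_types.h \
    /usr/include/x86_64-linux-gnu/bits/typesizes.h

# then raise the define in each file, e.g.
#   #define __FD_SETSIZE    1024
# becomes
#   #define __FD_SETSIZE    10000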

That coupled with using this command to run the server through supervisor:

bash -c "ulimit -n 10000 && php /var/www/hyvly.com/server/server.php"

Seems to have allowed the number of connections to go beyond 1024, but now it causes a buffer overflow within php, showing this error in the log file before restarting the process:

*** buffer overflow detected ***: php terminated

I'm curious how other users are getting beyond 1024 concurrent connections, whether some of you have never hit this limit at all (could you share your environment details), or made certain changes to get beyond it (could you share what changes you've made)?

@kinncj
Contributor

kinncj commented Mar 26, 2015

Does this server only handle the socket application?

1024 seems to be a really low number of users.

My guess is: your code may be doing something really weird.

Try to profile your application in order to see where it's overwriting the adjacent memory.
If it's the vendor (ratchetphp), report it as a bug with more details (such as what you are really doing with it); otherwise, fix the issue in your app.

Worst case scenario: bring more nodes to your application and use shared memory (e.g. memcached, riak, etc.) to share state between the nodes in order to scale (very last option; IMHO 1024 users are nothing... unless you keep their messages in memory and those messages are large BLOBs).

@rmmoul
Author

rmmoul commented Mar 26, 2015

Yeah, I thought that seemed low too.

The application is a simple chat application, and it's all that's running on the server. When the concurrent connections are under 1024 (where php craps out) the server doesn't get strained.

@kinncj
Contributor

kinncj commented Mar 26, 2015

Are you storing the messages in memory somehow?

May be related.

@rmmoul
Author

rmmoul commented Mar 26, 2015

Not for this round of tests, just incrementing and decrementing a connection counter.

@kinncj
Contributor

kinncj commented Mar 26, 2015

Can you profile (xhprof, whatever) and share the reports? I'm interested to see the results!

@lokielse

lokielse commented Apr 6, 2015

Limited to about 50 concurrent connections, and looking for suggestions.

Run command:

echo "show info" | socat /tmp/haproxy.sock stdio

Result:
Name: HAProxy
Version: 1.5.3
Release_date: 2014/07/25
Nbproc: 1
Process_num: 1
Pid: 21515
Uptime: 0d 0h13m30s
Uptime_sec: 810
Memmax_MB: 16 #Here Memmax_MB is 16M, I don't know how to increase it.
Ulimit-n: 20032
Maxsock: 20032
Maxconn: 10000
Hard_maxconn: 10000
CurrConns: 53 #Current connections count
CumConns: 278
CumReq: 278
Maxpipes: 0
PipesUsed: 0
PipesFree: 0
ConnRate: 2
ConnRateLimit: 0
MaxConnRate: 14
SessRate: 2
SessRateLimit: 0
MaxSessRate: 14
CompressBpsIn: 0
CompressBpsOut: 0
CompressBpsRateLim: 0
Tasks: 52
Run_queue: 1
Idle_pct: 100

@lokielse

lokielse commented Apr 8, 2015

I found that my init.d/haproxy script starts haproxy with start-stop-daemon, which results in the 16M memory limit.

I changed it to this and it works:

haproxy_start()
{
    $HAPROXY -f "$CONFIG" -D -p "$PIDFILE"
    return 0
}

@hsvikum

hsvikum commented Apr 17, 2015

@rmmoul I'm having the exact same issue: my chat server fails at 1019 connections. I increased the allowed open files and compiled PHP with the necessary configuration, but it seems that PHP is still not detecting the changed amount. Any luck fixing this?

@benconnito
Contributor

@rmmoul I ran into the 1024 limit and did NOT want to compile PHP and run my own version. So to get around the 1024 limit, I ran multiple instances on several ports and had an HTTP endpoint that round-robined the ports to the client.
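
Something like this minimal sketch (the port list and the APCu counter are illustrative assumptions, not the exact code):

<?php
// ws_endpoint.php - hand each client one of the running Ratchet ports
$ports = [2601, 2602, 2603, 2604];

apcu_add('ws_port_counter', 0);        // create the shared counter on first use
$i = apcu_inc('ws_port_counter');      // atomic round-robin increment

header('Content-Type: application/json');
echo json_encode(['url' => 'ws://example.com:' . $ports[$i % count($ports)]]);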

@rmmoul
Author

rmmoul commented Jun 8, 2015

@benconnito That's what I've been considering doing as well. I was hoping to keep them all on the same port, though, to avoid needing to set up a redis pub/sub server to keep all of the clients connected. I think you've found the easiest / best solution.

@hsvikum I haven't found a way to get around this through recompiling or changing up my server's config options. @benconnito's solution is probably the way to go, though I haven't tried what @lokielse suggested to up the memory limit (his connection limit was super small though).

@lokielse

lokielse commented Jun 9, 2015

Try increasing the open files limit:

https://rtcamp.com/tutorials/linux/increase-open-files-limit/

@Mecanik

Mecanik commented Jul 7, 2017

I did everything mentioned here on Linux, and my application is limited to 254 connections no matter what. I have optimised PHP and Apache, but still no luck. I also tested on my Windows dev environment, same result. My application is a multi-room chat server... very simple. Any suggestions?

@WyriHaximus

@Mecanik use an event loop other than the default, which is limited to 1024 open file descriptors: https://github.com/reactphp/event-loop#loop-implementations

@Mecanik

Mecanik commented Jul 7, 2017

Well I do not quite understand how that works ... but I will try. This is how I have it now:

$wsServer = new WsServer($chatServer);
$wsServer->disableVersion(0); // old, bad, protocol version

$http   = new HttpServer($wsServer);
$server = IoServer::factory($http, $port, $ip);

$server->run();

@WyriHaximus

Replace that IoServer::factory call with its internal code https://github.com/ratchetphp/Ratchet/blob/master/src/Ratchet/Server/IoServer.php#L67-L72 and use the desired loop.
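
Roughly like this (a sketch assuming Ratchet 0.3.x with react/socket 0.4 and ext-event installed; newer Ratchet/react-socket versions construct the socket with an "$ip:$port" string instead of listen()):

use Ratchet\Http\HttpServer;
use Ratchet\Server\IoServer;
use Ratchet\WebSocket\WsServer;
use React\EventLoop\ExtEventLoop;
use React\Socket\Server as Reactor;

$loop   = new ExtEventLoop();          // explicit loop instead of the factory default
$socket = new Reactor($loop);          // react/socket 0.4 style, as used by Ratchet 0.3.x
$socket->listen($port, $ip);

$wsServer = new WsServer($chatServer);
$wsServer->disableVersion(0);

$server = new IoServer(new HttpServer($wsServer), $socket, $loop);
$server->run();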

@Mecanik

Mecanik commented Jul 7, 2017

Well this messes up my understanding of websocket server :) Thank you anyway, I will see what I can manage.

@Mecanik

Mecanik commented Jul 7, 2017

@WyriHaximus No luck. The server tries to start, but it automatically stops without any error :( Could you please give me a small example? I will take it from there and try to understand the "loop"...

@Mecanik

Mecanik commented Jul 19, 2017

@WyriHaximus I still cannot replace my event loop, I tried a lot... can you please give me an example ? I don't know what to do...

@WyriHaximus

@Mecanik take a look at https://github.com/reactphp/event-loop/blob/master/travis-init.sh; it is used by the event loop component to install loops on Travis for testing.

@Mecanik

Mecanik commented Jul 20, 2017

@WyriHaximus If you are talking about "pecl event", I already installed it, and it made only a small difference: from 254 connections to 1010, and it gets stuck at 1010. I honestly do not know what to do, and I need this in a production environment. I am using this "chat" example: https://github.com/pmill/php-chat

I also increased every possible limit on the server and in PHP. I am using PHP-FPM 7.1.

@Mecanik

Mecanik commented Jul 20, 2017

I managed to "debug" things: it appears that the extension "ev" is not being detected at all, thus the loop factory is using "StreamSelectLoop". I am now trying to install "event" on CentOS 7 with PHP 7.1, but failing so far.

@WyriHaximus

WyriHaximus commented Jul 20, 2017

@Mecanik did you load the .so as an extension in php.ini? When it is loaded, php -m shows it in the list of installed extensions.
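
For example (the ini path below is an assumption; check php --ini for the files your CLI actually reads):

php --ini                                        # lists the ini files the CLI loads
echo "extension=event.so" | sudo tee /etc/php.d/40-event.ini
php -m | grep -i event                           # "event" should appear here once it loads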

@Mecanik

Mecanik commented Jul 20, 2017

@WyriHaximus Of course I did, but it's still not detected :/ Is it because PHP is 7.1?

@Mecanik

Mecanik commented Jul 20, 2017

@WyriHaximus I installed (somehow) libevent-devel and then pecl install event, and now Ratchet starts with ExtEventLoop(). Hopefully my "limit" is gone now 👯‍♂️

@Mecanik

Mecanik commented Jul 20, 2017

@WyriHaximus Final result: I got past 1024 with "event", because "ev" is NOT detected in PHP 7.1 by the Ratchet lib.

@WyriHaximus

@Mecanik glad to hear 👍

@ChojinDSL

@WyriHaximus
What's the maximum number of connections you've been able to achieve so far?

@WyriHaximus

WyriHaximus commented Jul 26, 2017

@ChojinDSL I've stopped paying attention to that after a couple thousand

@i3bitcoin

i3bitcoin commented Sep 9, 2017

@kelunik sorry for my stupid questions, I'm new to this. I have php_sockets.so onboard. Which extension will solve my issue?

@kelunik

kelunik commented Sep 10, 2017

Either ev or event from pecl.

@i3bitcoin

i3bitcoin commented Sep 12, 2017

Okay, I've found that my PHP sockets server accepts more than 1024 sockets when I run it as root.

I'm starting it using this command.
sudo -u www-data php7.0 /path/to/file/

/etc/php/7.0/fpm/php-fpm.conf
rlimit_files = 1048576

/etc/php/7.0/fpm/pool.d/www.conf
rlimit_files = 1048576
listen.backlog = 65536

/etc/security/limits.conf

* soft nproc 1048576
* hard nproc 1048576
* soft nofile 1048576
* hard nofile 1048576

Any suggestions?

@kelunik

kelunik commented Sep 12, 2017

@i3bitcoin In that case your problem is probably ulimit -n, try increasing that.

@i3bitcoin

su www-data --shell /bin/bash --command "ulimit -n"
1048576

it's also increased

@i3bitcoin

Are there any limits on the sudo command?

@amadeubarbosa

amadeubarbosa commented Nov 10, 2017

You cannot override the 1024-connection limitation on Linux systems without recompiling. See this: https://access.redhat.com/solutions/488623

It's far more complex than the comments mentioning ulimit suggest. The ulimit command doesn't take effect for select() on sockets when you set values > 1024. The problem is FD_SETSIZE in libc, which is effectively impossible to override; if you try, the process will probably hang, as it's actually undefined behaviour.

Some systems such as Solaris accept FD_SETSIZE up to 65536, but vanilla Linux doesn't.

The recommendation is to use the epoll and libevent alternatives.

Further reading: the C10k problem.

Hope this helps,
Amadeu Barbosa Junior

@Gemorroj

Gemorroj commented Feb 23, 2018

For RHEL/CentOS users, see reactphp/event-loop#152.
The php-pecl-ev extension from the remi repository is not supported at this time.

@cboden added the docs label Aug 3, 2018
@AntonioDilorenzo

Hi, I'm using the socket for a web app. Now that the connections are increasing, the socket hangs more and more often when 1024 connections are reached; all the others are in the CLOSE_WAIT state. I've already tried ulimit, but nothing has changed. Do you know if there is a parameter to change?
Thanks so much for the help!

@inri13666

Possible solution is here

@i3bitcoin

@AntonioDilorenzo

My solution was to use HHVM instead of PHP for sockets. It doesn't have the 1024-connection limit.

@jupitern

@i3bitcoin how many connections did you achieve?

@i3bitcoin

i3bitcoin commented Apr 28, 2019

@jupitern

More than 3k connections right now. It's the only solution that worked for me.

I believe it's limited only by rlimit.

@josephmiller2000

@i3bitcoin is there any way I can contact you personally about setting up HHVM with the Ratchet chat? I'm stuck with the same 1024-connection limit.

@inri13666

inri13666 commented Jun 10, 2019

@josephmiller2000, this post may possibly help you: #328 (comment)

The main quick solution is to use any other event loop implementation instead of the default React\EventLoop\StreamSelectLoop.

@josephmiller2000

@inri13666 Well, I'm using the "event.so" extension and tested with this method:

#300 (comment)

"ev" is not detected by PHP, so right now I'm using "event.so" instead of StreamSelectLoop.

I increased all server-side limits and the PHP-FPM limits, but still can't get past 1024 at my peak time.

Users' sockets are in the CLOSE_WAIT state when they are connected to the chat.

So I planned to move to HHVM instead of plain PHP.

@WyriHaximus

The easiest solution to this is to install ext-uv and make sure you're running the latest react/event-loop, which has support for it. (And use the Factory::create() method to get your event loop, of course.)
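
For example, something along these lines (package names, versions, and ini paths are assumptions and vary by distro; the PECL uv package may still be marked beta):

sudo apt-get install libuv1-dev   # or: yum install libuv-devel
sudo pecl install uv              # may need: pecl install uv-beta
echo "extension=uv.so" | sudo tee /etc/php/7.3/cli/conf.d/30-uv.ini   # adjust to your PHP version
php -r "require 'vendor/autoload.php'; echo get_class(React\EventLoop\Factory::create()), PHP_EOL;"

The last command should print ExtUvLoop (rather than StreamSelectLoop) once everything is picked up.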

@josephmiller2000

josephmiller2000 commented Jun 10, 2019

@WyriHaximus thanks for the comment, I can successfully install "event", but cannot install "ext-uv".

I end up getting this error:

[screenshot: Snap_Shot_00769]

@inri13666

inri13666 commented Jun 10, 2019

OK, could you please share the result of

php -r "require_once 'vendor/autoload.php'; var_dump(\React\EventLoop\Factory::create());"

for my configuration it's

D:\_dev\sites\private\event-loop>php -r "require_once 'vendor/autoload.php'; var_dump(\React\EventLoop\Factory::create());"
Command line code:1:
class React\EventLoop\ExtEventLoop#3 (14) {
  private $eventBase =>
  class EventBase#4 (0) {
  }
  private $futureTickQueue =>
  class React\EventLoop\Tick\FutureTickQueue#5 (1) {
    private $queue =>
    class SplQueue#6 (2) {
      private $flags =>
      int(4)
      private $dllist =>
      array(0) {
        ...
      }
    }
  }
...

@josephmiller2000

josephmiller2000 commented Jun 10, 2019

Here you go @inri13666

root@vps652855:# php -r "require_once 'vendor/autoload.php'; var_dump(\React\EventLoop\Factory::create());"
object(React\EventLoop\ExtEventLoop)#3 (11) {
  ["eventBase":"React\EventLoop\ExtEventLoop":private]=>
  object(EventBase)#2 (0) {
  }
  ["nextTickQueue":"React\EventLoop\ExtEventLoop":private]=>
  object(React\EventLoop\Tick\NextTickQueue)#4 (2) {
    ["eventLoop":"React\EventLoop\Tick\NextTickQueue":private]=>
    *RECURSION*
    ["queue":"React\EventLoop\Tick\NextTickQueue":private]=>
    object(SplQueue)#5 (2) {
      ["flags":"SplDoublyLinkedList":private]=>
      int(4)
      ["dllist":"SplDoublyLinkedList":private]=>
      array(0) {
      }
    }
  }
  ["futureTickQueue":"React\EventLoop\ExtEventLoop":private]=>
  object(React\EventLoop\Tick\FutureTickQueue)#6 (2) {
    ["eventLoop":"React\EventLoop\Tick\FutureTickQueue":private]=>
    *RECURSION*
    ["queue":"React\EventLoop\Tick\FutureTickQueue":private]=>
    object(SplQueue)#7 (2) {
      ["flags":"SplDoublyLinkedList":private]=>
      int(4)
      ["dllist":"SplDoublyLinkedList":private]=>
      array(0) {
      }
    }
  }
  ["timerCallback":"React\EventLoop\ExtEventLoop":private]=>
  object(Closure)#9 (2) {
    ["this"]=>
    *RECURSION*
    ["parameter"]=>
    array(3) {
      ["$_"]=>
      string(10) "<required>"
      ["$__"]=>
      string(10) "<required>"
      ["$timer"]=>
      string(10) "<required>"
    }
  }
  ["timerEvents":"React\EventLoop\ExtEventLoop":private]=>
  object(SplObjectStorage)#8 (1) {
    ["storage":"SplObjectStorage":private]=>
    array(0) {
    }
  }
  ["streamCallback":"React\EventLoop\ExtEventLoop":private]=>
  object(Closure)#10 (2) {
    ["this"]=>
    *RECURSION*
    ["parameter"]=>
    array(2) {
      ["$stream"]=>
      string(10) "<required>"
      ["$flags"]=>
      string(10) "<required>"
    }
  }
  ["streamEvents":"React\EventLoop\ExtEventLoop":private]=>
  array(0) {
  }
  ["streamFlags":"React\EventLoop\ExtEventLoop":private]=>
  array(0) {
  }
  ["readListeners":"React\EventLoop\ExtEventLoop":private]=>
  array(0) {
  }
  ["writeListeners":"React\EventLoop\ExtEventLoop":private]=>
  array(0) {
  }
  ["running":"React\EventLoop\ExtEventLoop":private]=>
  NULL
}

@inri13666

inri13666 commented Jun 10, 2019

@josephmiller2000, I'm running the socket server behind NGINX.

nginx.conf
worker_processes auto;
worker_rlimit_nofile 40000;  # Important
events {
    worker_connections  40000;  # Important
    multi_accept        on;  # Important
    use                 epoll;  # Important
}
default.conf
server {
    server_name _;

    listen 8000 default_server;
    listen [::]:8000 default_server;

    root        /home/site/wwwroot/web;
    error_log   /home/LogFiles/nginx-error.log;
    access_log  /home/LogFiles/nginx-access.log;

    location ~ ^/ws(/|$)$ {
        proxy_pass          http://127.0.0.1:8080;
        proxy_http_version  1.1;
        proxy_set_header    Upgrade $http_upgrade;
        proxy_set_header    Connection "Upgrade";
        proxy_buffer_size       128k;
        proxy_buffers           4 256k;
        proxy_busy_buffers_size 256k;
    }
}

@WyriHaximus

> @WyriHaximus thanks for the comment, I can successfully install "event", but cannot install "ext-uv".
>
> I end up getting this error:
>
> [screenshot: Snap_Shot_00769]

Did you check config.log? To be honest, I never had issues compiling ext-uv except for the occasional missing libuv-dev (or whatever the name is on your distro).

@josephmiller2000

Anyway, I figured out how to install ext-uv and got it all up and working.

This is the maximum number of connections I can get, whatever event loop I use. I've increased the server limits and done everything on my side. The script is even using ZeroMQ now.

[screenshot: Snap_Shot_00077]

@jupitern

jupitern commented Oct 10, 2019

With a CentOS box with 2 GB RAM, uv installed, and a node socket client sending connections from another machine at my company, we are reaching 20k connections.
We just don't get more because all the RAM is in use.

node client => https://github.com/jupitern/node-socket-client

@shmeeps

shmeeps commented Apr 13, 2021

Just got hit with this and was able to eventually work around it. Wanted to share what all I went through in case it helps someone else down the line, because it took me two frustrating days with angry clients to resolve completely. For reference, we're running Ratchet with an Apache 2.4 reverse proxy on PHP 7.0, all running on Ubuntu 16.04. The Ratchet script is kept running by a supervisor task, ensuring that it restarts if it ever crashes. The Ratchet script is pretty straight forward; it interacts with an API on connection or when receiving certain messages, and contains a timer to hit the API for some data to send to specific clients (maintained by a user -> client map). Ratchet was maxing out at around 500 connections when we started.

First thing we noticed was Apache redlining both cores of the server. Ideally we'd move to a better server software like nginx, but our app currently prevents that. We also have to use a reverse proxy for SSL. We tried to use the underlying React library to run a WSS server directly without needing Apache/nginx, but weren't able to get it working correctly.

Bumping the server up to 4 cores gave enough resources to run Apache comfortably. From there we noticed that we'd still get 500 errors periodically, and some investigation into Apache revealed that it was tuned poorly and would cap out at a few hundred concurrent connections. Since the websockets count as a connection, these would quickly eat up available threads and prevent Apache from serving other traffic (other PHP scripts and static content). We were already using mpm_event, and updated our config to the following:

<IfModule mpm_event_module>
    StartServers 10
    MinSpareThreads 25
    MaxSpareThreads 750
    ThreadLimit 1000
    ThreadsPerChild 750
    # MaxRequestWorkers aka MaxClients => ServerLimit * ThreadsPerChild
    MaxRequestWorkers 15000
    MaxConnectionsPerChild 0
    ServerLimit 20
    ThreadStackSize 524288
</IfModule>

Stress testing the server after this showed we could comfortably maintain thousands of requests a minute without any issue, which is well over what we needed to serve.

From there, we noticed that while Apache was running fine, the Ratchet script was now redlining with only a few hundred connections. Various searching led to the well documented StreamSelectLoop issue. We ruled out LibEvent due to using PHP 7.0, and weren't able to get LibUv to install without errors, so settled on LibEv with the following:

sudo pecl install ev
echo 'extension=ev.so' > /etc/php/7.0/mods-available/ev.ini
sudo phpenmod ev
sudo service php7.0-fpm restart # Not needed as the Ratchet script is CLI, but better to see if this causes FPM issues now than later

Running a second instance of the Ratchet script that would initialize and then execute die(get_class($server->loop)); verified that the server was no longer running with a StreamSelectLoop and instead using a ExtEvLoop. We restarted the Ratchet script and let clients begin to auto-reconnect (our client side script will attempt to reconnect in increasing time per attempt), figuring we could watch the script as they reconnected for any performance issues. Everything ran fine, with the Ratchet script taking no more than 25% of a core until about 15 minutes later when it began to redline again. At this point, attempting to open a new connection would hang for a few minutes before failing.
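
(For reference, an equivalent standalone check, assuming the script gets its loop from the default factory, is:

php -r "require 'vendor/autoload.php'; echo get_class(React\EventLoop\Factory::create()), PHP_EOL;"   # want ExtEvLoop, not StreamSelectLoop

which should print the same class as $server->loop.)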

We attempted to connect directly to the Ratchet script from the server itself (ie, bypassing Apache) to see if we could connect.

curl \
    -o - \
    --http1.1 \
    --include \
    --no-buffer \
    --header "Connection: Upgrade" \
    --header "Upgrade: websocket" \
    --header "Host: localhost:8080" \
    --header "Origin: http://localhost:8080" \
    --header "Sec-WebSocket-Key: SGVsbG8sIHdvcmxkIQ==" \
    --header "Sec-WebSocket-Version: 13" \
    http://localhost:8080/

This would also hang and then fail. When we restarted the Ratchet script, we could use the above to connect immediately, but once it started redlining we could not. This indicated that Apache was fine, and the limit was on the Ratchet script.

We updated the script to output the number of connected clients on tick and restarted, which would get to 1017 and no higher. This was conspicuously close to 1024, so we assumed it was some form of system limit. We checked the overall system limits using ulimit -a and saw no issues. However, we checked the actual process limits with the following

ps aux | grep RatchetScript.php # Record PID from this command
cat /proc/<PID>/limits

and saw that the process was limited to 1024 soft / 4096 hard max open files. Updating this with

prlimit --pid <PID> --nofile=500000:5000000

and checking the log verified that once these limits were raised, we were able to handle an additional several thousand connections, after which we could still connect via a browser to our app or via the cURL request above with no issue.

We figured this was a user-limit issue (the script does not run as the web user) and updated /etc/security/limits.conf with the user running the script and restarted, but saw that the limits were reset. We also attempted to run sudo su - ratchetuser -c 'ulimit -a' to see if that needed to be updated for the user, but those limits also appeared fine. After some further digging, we came across an article saying the 1024 / 4096 limit is enforced by supervisor, after which we updated /etc/supervisor/supervisord.conf with the following:

[supervisord]
....
minfds=500000

Restarting verified that the limits were maintained on the Ratchet script. The Ratchet script is now handling ~2,500 connections and using about 10% of one core, with small spikes here and there (mainly on client connection, as we have to decrypt connection data).

I imagine that the redlining occurs when Ratchet basically deadlocks waiting on a file handle that can't be created, but I haven't been able to verify this yet. It would explain the vast performance decrease once those connections are able to be properly created and maintained.

@abbaasi69

I had an experience which may help somebody.
My server stopped responding after one hour, when the number of concurrent socket connections reached about 700. After trying all of the possible solutions, I realized that I had a ProxyPass in Apache which redirects port 443 (SSL) to 8080 (my socket port). Finally, I increased the ServerLimit in my Apache prefork configuration from 700 to 1700 and the problem was solved, at least temporarily.
This shows that if you use Apache's ProxyPass (or another web server's equivalent), Apache itself will become busy, as it sits between the client and the WebSocket server.
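
For illustration, the relevant pieces look roughly like this (a sketch; the values and the /ws path are examples, not my exact config, and ws:// proxying needs mod_proxy_wstunnel):

<IfModule mpm_prefork_module>
    # each proxied WebSocket ties up one worker for the life of the connection,
    # so these must exceed the expected number of concurrent clients
    ServerLimit          1700
    MaxRequestWorkers    1700
</IfModule>

# inside the :443 vhost - forward the WebSocket path to the Ratchet process
ProxyPass        /ws ws://127.0.0.1:8080/
ProxyPassReverse /ws ws://127.0.0.1:8080/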

@mr-older

mr-older commented Nov 13, 2023

I had that problem with ReactPHP; the cause is deeper. Its nature lies in PHP's way of servicing socket events. The code was rewritten in C++ using epoll instead of select.
stackoverflow
