Hello Alex,

On 21.10.24 at 08:06, Alex Song (宋文平) wrote:

Database settings

10k DB connections consume up to 21G of memory, which accounts for only 10% of the server's memory in our environment and does not pose an OOM risk.

I think that's a lot, and I would size it differently if it were my setup.
That calculation only covers the working memory that the connections themselves roughly use. It does not include, for example, the database's own buffer pool.
In most cases there are also other processes outside the database that can consume a lot of memory.
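To sanity-check such an estimate, it can help to compare the per-connection buffer sizes against the configured connection limit and the shared buffer pool. A minimal sketch, assuming a MariaDB/MySQL backend reachable via the mysql client (the process name and the choice of variables are assumptions; adjust to your setup):

# Per-connection working buffers vs. the shared buffer pool (MariaDB/MySQL)
mysql -N -e "SHOW VARIABLES WHERE Variable_name IN
  ('max_connections','sort_buffer_size','join_buffer_size',
   'read_buffer_size','read_rnd_buffer_size',
   'tmp_table_size','innodb_buffer_pool_size');"

# Current resident memory of the database process itself, for comparison
ps -o rss= -C mysqld | awk '{printf "%.1f GiB\n", $1/1024/1024}'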


RabbitMQ


The maximum number of RabbitMQ connections is 20000, which was obtained by testing in the 3000-node environment.

That is interesting. What was the limiting factor? Usually the number of available file descriptors for the RabbitMQ process is significantly higher.
(see /proc/<pid>/limits)
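For reference, a quick way to check both the kernel limit and what RabbitMQ itself reports (the beam.smp process name and the use of rabbitmqctl are assumptions based on a standard installation):

# Effective open-file limit of the Erlang VM running RabbitMQ
RABBIT_PID=$(pgrep -f beam.smp | head -n 1)
grep 'Max open files' "/proc/${RABBIT_PID}/limits"

# RabbitMQ's own view of its file descriptor budget
rabbitmqctl status | grep -A 4 -i 'file descriptors'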

Incidentally, I have recently had good experiences with RabbitMQ PerfTest during performance tests.
(https://perftest.rabbitmq.com/)

An example:

URL="amqp://openstack:mypassword@10.10.21.12:5672"
docker run -it --net host --rm pivotalrabbitmq/perf-test:latest \
    --queue-pattern-from 1 --queue-pattern-to 500 \
    --producers 500 --consumers 15 \
    --variable-size 1000:30 \
    --variable-size 10000:20 \
    --variable-size 5000:45 \
    --quorum-queue --queue perftest \
    --uri "$URL"

Kind regards
Marc