Issue #2008

closed

qpidd exceeded 2000 connections limit

Added by jluza almost 8 years ago. Updated almost 5 years ago.

Status:
CLOSED - WORKSFORME
Priority:
High
Assignee:
Category:
-
Sprint/Milestone:
-
Start date:
Due date:
Estimated time:
Severity:
3. High
Version:
2.8.0
Platform Release:
OS:
Triaged:
Yes
Groomed:
Yes
Sprint Candidate:
Yes
Tags:
Pulp 2
Sprint:
Sprint 7
Quarter:

Description

Jun 13 21:37:57 pulp-docker01 pulp: celery.worker.job:ERROR: (4062-22208) Traceback (most recent call last):
Jun 13 21:37:57 pulp-docker01 pulp: celery.worker.job:ERROR: (4062-22208)   File "/usr/lib/python2.6/site-packages/celery/app/trace.py", line 240, in trace_task
Jun 13 21:37:57 pulp-docker01 pulp: celery.worker.job:ERROR: (4062-22208)     R = retval = fun(*args, **kwargs)
Jun 13 21:37:57 pulp-docker01 pulp: celery.worker.job:ERROR: (4062-22208)   File "/usr/lib/python2.6/site-packages/pulp/server/async/tasks.py", line 473, in __call__
Jun 13 21:37:57 pulp-docker01 pulp: celery.worker.job:ERROR: (4062-22208)     return super(Task, self).__call__(*args, **kwargs)
Jun 13 21:37:57 pulp-docker01 pulp: celery.worker.job:ERROR: (4062-22208)   File "/usr/lib/python2.6/site-packages/pulp/server/async/tasks.py", line 103, in __call__
Jun 13 21:37:57 pulp-docker01 pulp: celery.worker.job:ERROR: (4062-22208)     return super(PulpTask, self).__call__(*args, **kwargs)
Jun 13 21:37:57 pulp-docker01 pulp: celery.worker.job:ERROR: (4062-22208)   File "/usr/lib/python2.6/site-packages/celery/app/trace.py", line 437, in __protected_call__
Jun 13 21:37:57 pulp-docker01 pulp: celery.worker.job:ERROR: (4062-22208)     return self.run(*args, **kwargs)
Jun 13 21:37:57 pulp-docker01 pulp: celery.worker.job:ERROR: (4062-22208)   File "/usr/lib/python2.6/site-packages/pulp/server/controllers/repository.py", line 1162, in download_deferred
Jun 13 21:37:57 pulp-docker01 pulp: celery.worker.job:ERROR: (4062-22208)     download_step.start()
Jun 13 21:37:57 pulp-docker01 pulp: celery.worker.job:ERROR: (4062-22208)   File "/usr/lib/python2.6/site-packages/pulp/server/controllers/repository.py", line 1358, in start
Jun 13 21:37:57 pulp-docker01 pulp: celery.worker.job:ERROR: (4062-22208)     self.downloader.download(self.download_requests)
Jun 13 21:37:57 pulp-docker01 pulp: celery.worker.job:ERROR: (4062-22208)   File "/usr/lib/python2.6/site-packages/nectar/downloaders/threaded.py", line 142, in download
Jun 13 21:37:57 pulp-docker01 pulp: celery.worker.job:ERROR: (4062-22208)     worker_thread.start()
Jun 13 21:37:57 pulp-docker01 pulp: celery.worker.job:ERROR: (4062-22208)   File "/usr/lib64/python2.6/threading.py", line 474, in start
Jun 13 21:37:57 pulp-docker01 pulp: celery.worker.job:ERROR: (4062-22208)     _start_new_thread(self.__bootstrap, ())
Jun 13 21:37:57 pulp-docker01 pulp: celery.worker.job:ERROR: (4062-22208) error: can't start new thread
Jun 13 21:41:43 pulp-docker01 qpidd[26643]: 2016-06-13 21:41:43 [Security] error Client max total connection count limit of 2000 exceeded by 'qpid.127.0.0.1:5672-127.0.0.1:47832', user: 'guest@QPID'. Connection refused
Jun 13 21:41:43 pulp-docker01 qpidd[26643]: 2016-06-13 21:41:43 [Security] error Client max total connection count limit of 2000 exceeded by 'qpid.127.0.0.1:5672-127.0.0.1:47833', user: 'anonymous'. Connection refused
Jun 13 21:41:43 pulp-docker01 pulp: kombu.transport.qpid:ERROR: Unable to authenticate to qpid using the following mechanisms: ['PLAIN', 'ANONYMOUS']
Jun 13 21:41:53 pulp-docker01 qpidd[26643]: 2016-06-13 21:41:53 [System] error Connection qpid.127.0.0.1:5672-127.0.0.1:47832 No protocol received closing
Jun 13 21:41:53 pulp-docker01 qpidd[26643]: 2016-06-13 21:41:53 [System] error Connection qpid.127.0.0.1:5672-127.0.0.1:47833 No protocol received closing
Jun 13 21:42:46 pulp-docker01 qpidd[26643]: 2016-06-13 21:42:46 [Security] error Client max total connection count limit of 2000 exceeded by 'qpid.127.0.0.1:5672-127.0.0.1:47842', user: 'guest@QPID'. Connection refused

That ^ repeated many times, and at one point qpidd probably crashed, because the logs changed to:

Jun 14 03:25:06 pulp-docker01 pulp: gofer.messaging.consumer:ERROR: (4179-50016) pulp.task
Jun 14 03:25:06 pulp-docker01 pulp: gofer.messaging.consumer:ERROR: (4179-50016) Traceback (most recent call last):
Jun 14 03:25:06 pulp-docker01 pulp: gofer.messaging.consumer:ERROR: (4179-50016)   File "/usr/lib/python2.6/site-packages/gofer/messaging/consumer.py", line 68, in _open
Jun 14 03:25:06 pulp-docker01 pulp: gofer.messaging.consumer:ERROR: (4179-50016)     self._reader.open()
Jun 14 03:25:06 pulp-docker01 pulp: gofer.messaging.consumer:ERROR: (4179-50016)   File "/usr/lib/python2.6/site-packages/gofer/messaging/adapter/model.py", line 39, in _fn
Jun 14 03:25:06 pulp-docker01 pulp: gofer.messaging.consumer:ERROR: (4179-50016)     return fn(*args, **keywords)
Jun 14 03:25:06 pulp-docker01 pulp: gofer.messaging.consumer:ERROR: (4179-50016)   File "/usr/lib/python2.6/site-packages/gofer/messaging/adapter/model.py", line 587, in open
Jun 14 03:25:06 pulp-docker01 pulp: gofer.messaging.consumer:ERROR: (4179-50016)     self._impl.open()
Jun 14 03:25:06 pulp-docker01 pulp: gofer.messaging.consumer:ERROR: (4179-50016)   File "/usr/lib/python2.6/site-packages/gofer/messaging/adapter/qpid/reliability.py", line 34, in _fn
Jun 14 03:25:06 pulp-docker01 pulp: gofer.messaging.consumer:ERROR: (4179-50016)     raise NotFound(*e.args)
Jun 14 03:25:06 pulp-docker01 pulp: gofer.messaging.consumer:ERROR: (4179-50016) NotFound: no such queue: pulp.task
Jun 14 03:25:06 pulp-docker01 pulp: gofer.messaging.consumer:ERROR: (4181-50016) pulp.task

Installed packages:

qpid-tools-0.30-4.el6.noarch
python-qpid-0.30-6.el6.noarch
qpid-qmf-0.30-5.el6.x86_64
qpid-cpp-server-0.30-6.el6.x86_64
python-qpid-qmf-0.30-5.el6.x86_64
python-gofer-qpid-2.6.8-1.el6.noarch
qpid-cpp-client-0.30-6.el6.x86_64
qpid-proton-c-0.7-5.el6.x86_64

pulp-nodes-admin-extensions-2.8.1.2-1.el6sat.noarch
python-pulp-client-lib-2.8.1.2-1.el6sat.noarch
pulp-server-2.8.1.2-1.el6sat.noarch
pulp-selinux-2.8.1.2-1.el6sat.noarch
python-pulp-cdn-distributor-common-2.8.0-8.el6eng.noarch
python-pulp-agent-lib-2.8.1.2-1.el6sat.noarch
pulp-docker-admin-extensions-2.0.0.2-1.el6sat.noarch
pulp-nodes-consumer-extensions-2.8.1.2-1.el6sat.noarch
python-pulp-oid_validation-2.8.1.2-1.el6sat.noarch
pulp-agent-2.8.1.2-1.el6sat.noarch
m2crypto-0.21.1.pulp-8.el6.x86_64
python-pulp-repoauth-2.8.1.2-1.el6sat.noarch
pulp-consumer-client-2.8.1.2-1.el6sat.noarch
pulp-docker-plugins-2.0.0.2-1.el6sat.noarch
python-pulp-streamer-2.8.1.2-1.el6sat.noarch
python-isodate-0.5.0-1.pulp.el6.noarch
mod_wsgi-3.4-1.pulp.el6.x86_64
python-pulp-bindings-2.8.1.2-1.el6sat.noarch
pulp-admin-client-2.8.1.2-1.el6sat.noarch
pulp-cdn-distributor-plugins-2.8.0-8.el6eng.noarch
python-kombu-3.0.24-5.pulp.el6ui.noarch
python-pulp-docker-common-2.0.0.2-1.el6sat.noarch
pulp-nodes-common-2.8.1.2-1.el6sat.noarch
pulp-nodes-child-2.8.1.2-1.el6sat.noarch
python-pulp-common-2.8.1.2-1.el6sat.noarch

python-celery-3.1.11-1.el6.noarch
Actions #1

Updated by bmbouter almost 8 years ago

This installation is not deployed correctly if pulp.task is missing. This means that the migrations were not run or failed to run.
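
(For reference, a minimal way to apply or verify the migrations is Pulp 2's pulp-manage-db tool; the command below is only a sketch and assumes the default 'apache' service user.)

# Re-run/verify the Pulp 2 database migrations; Pulp services are normally
# stopped first. Assumes the default 'apache' service user.
sudo -u apache pulp-manage-db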

Actions #3

Updated by dgregor@redhat.com almost 8 years ago

  • Version set to 2.8.0

bmbouter wrote:

This installation is not deployed correctly if pulp.task is missing. This means that the migrations were not run or failed to run.

Brian, thank you for the quick reply. However, I believe the pulp.task error is a red herring. Note that it only happens after qpidd stops working. It's the max connections error before that which is the problem.

Actions #4

Updated by bmbouter almost 8 years ago

  • Description updated (diff)
Actions #5

Updated by bmbouter almost 8 years ago

What does your `pulp-admin status` output show? That will tell us a lot about how large your deployment is.

Also, where are all of your connections going? You can determine that by counting the connections by pid from `qpid-stat -c` and labeling the pids by their process name from `ps -awfux | grep celery`. Post the count of connections to Qpid per process.
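
A rough sketch of that counting (not something run in this ticket), assuming the `qpid-stat -c` column layout shown later in this ticket (cproc in column 2, cpid in column 3, three header lines before the data rows):

qpid-stat -c | awk 'NR > 3 && NF > 3 { count[$3]++; proc[$3] = $2 }
                    END { for (p in count) print p, proc[p], count[p] }' |
while read -r pid proc n; do
    # ps prints nothing if the process has already exited
    cmd=$(ps -p "$pid" -o args= 2>/dev/null | cut -c1-60)
    printf '%-8s %-12s %4s  %s\n' "$pid" "$proc" "$n" "$cmd"
done | sort -k3,3 -rn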

Here is also a repost of something I wrote earlier outlining the expected numbers:

Each pulp httpd WSGI process has 2 connections.
Each pulp_resource_manager parent process has 2; there may be 1 or 2 connections from a child process at times.
Each pulp_worker parent process has 2; there may be 1 or 2 connections from a child process at times.
Each pulp_celerybeat should have exactly 3.

Based on my simple measurements I expect no more than 3 connections per process (note these are processes, and pulp_resource_manager and pulp_worker each consist of 2 processes). Most connections (except for celery worker child processes) are formed at startup. The only ones I've seen with shorter connection times are from the WSGI processes, which may do some process recycling that would account for that observation. There is no other process recycling, so I expect the count to stay nearly constant throughout runtime, aside from workers opening a maximum of 2 connections during their runtime.

We need to determine if your installation is very large or if Pulp is leaking Qpid connections.
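
As a very rough sanity check (a sketch, not an official formula; the process-name patterns are assumptions based on the ps output later in this ticket), that works out to roughly 3 connections per Pulp process:

# Rough ceiling: about 3 Qpid connections per Pulp process.
wsgi=$(pgrep -f 'wsgi:pulp' | wc -l)    # httpd WSGI daemon processes
celery=$(pgrep -f 'celery' | wc -l)     # workers, resource_manager, celerybeat
echo "expected ceiling: about $(( 3 * (wsgi + celery) )) qpid connections"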

Actions #6

Updated by jluza almost 8 years ago

qpid-stat -c

Connections
  connection                                 cproc        cpid   mech       auth       connected    idle         msgIn  msgOut
  ==============================================================================================================================
  qpid.10.25.119.20:5672-10.25.119.20:40897  mod_wsgi     30395  ANONYMOUS  anonymous  9h 32m 52s   6h 37m 19s    105      5
  qpid.10.25.119.20:5672-10.25.119.20:41500  __main__.py  30214  ANONYMOUS  anonymous  9h 13m 7s    9h 12m 59s      5      4
  qpid.10.25.119.20:5672-10.25.119.20:45449  celery       29415  ANONYMOUS  anonymous  21h 13m 7s   21h 12m 59s     0      0
  qpid.10.25.119.20:5672-10.25.119.20:45452  celery       29415  ANONYMOUS  anonymous  21h 13m 7s   0s              6    124k
  qpid.10.25.119.20:5672-10.25.119.20:45453  celery       29415  ANONYMOUS  anonymous  21h 13m 7s   3h 12m 59s      7      4
  qpid.10.25.119.20:5672-10.25.119.20:45460  __main__.py  29505  ANONYMOUS  anonymous  21h 13m 3s   3m 29s         15   4.74k
  qpid.10.25.119.20:5672-10.25.119.20:45461  __main__.py  29505  ANONYMOUS  anonymous  21h 13m 3s   19s          16.1k     2
  qpid.10.25.119.20:5672-10.25.119.20:45481  __main__.py  30089  ANONYMOUS  anonymous  21h 12m 54s  1m 39s         15    357
  qpid.10.25.119.20:5672-10.25.119.20:45482  __main__.py  30089  ANONYMOUS  anonymous  21h 12m 54s  9s           2.98k     2
  qpid.10.25.119.20:5672-10.25.119.20:45487  __main__.py  30114  ANONYMOUS  anonymous  21h 12m 53s  3m 29s         15    365
  qpid.10.25.119.20:5672-10.25.119.20:45488  __main__.py  30114  ANONYMOUS  anonymous  21h 12m 53s  9s           3.01k     2
  qpid.10.25.119.20:5672-10.25.119.20:45493  __main__.py  30166  ANONYMOUS  anonymous  21h 12m 53s  1m 39s         15    357
  qpid.10.25.119.20:5672-10.25.119.20:45494  __main__.py  30166  ANONYMOUS  anonymous  21h 12m 53s  9s           2.99k     2
  qpid.10.25.119.20:5672-10.25.119.20:45499  __main__.py  30200  ANONYMOUS  anonymous  21h 12m 52s  3m 19s         15    359
  qpid.10.25.119.20:5672-10.25.119.20:45500  __main__.py  30200  ANONYMOUS  anonymous  21h 12m 52s  0s           2.99k     2
  qpid.10.25.119.20:5672-10.25.119.20:45505  __main__.py  30242  ANONYMOUS  anonymous  21h 12m 52s  3m 29s         15    375
  qpid.10.25.119.20:5672-10.25.119.20:45506  __main__.py  30242  ANONYMOUS  anonymous  21h 12m 52s  0s           3.04k     2
  qpid.10.25.119.20:5672-10.25.119.20:45511  __main__.py  30272  ANONYMOUS  anonymous  21h 12m 51s  2m 29s         15    379
  qpid.10.25.119.20:5672-10.25.119.20:45512  __main__.py  30272  ANONYMOUS  anonymous  21h 12m 51s  0s           3.05k     2
  qpid.10.25.119.20:5672-10.25.119.20:45517  __main__.py  30299  ANONYMOUS  anonymous  21h 12m 51s  2m 19s         15    330
  qpid.10.25.119.20:5672-10.25.119.20:45518  __main__.py  30299  ANONYMOUS  anonymous  21h 12m 51s  0s           2.90k     2
  qpid.10.25.119.20:5672-10.25.119.20:45521  __main__.py  30339  ANONYMOUS  anonymous  21h 12m 50s  1m 39s         15    316
  qpid.10.25.119.20:5672-10.25.119.20:45522  __main__.py  30339  ANONYMOUS  anonymous  21h 12m 50s  0s           2.86k     2
  qpid.10.25.119.20:5672-10.25.119.20:45529  mod_wsgi     30397  ANONYMOUS  anonymous  21h 12m 49s  0s              1      1
  qpid.10.25.119.20:5672-10.25.119.20:45530  mod_wsgi     30395  ANONYMOUS  anonymous  21h 12m 49s  0s              1      1
  qpid.10.25.119.20:5672-10.25.119.20:45531  mod_wsgi     30397  ANONYMOUS  anonymous  21h 12m 49s  0s              0      0
  qpid.10.25.119.20:5672-10.25.119.20:45532  mod_wsgi     30396  ANONYMOUS  anonymous  21h 12m 49s  0s              1      1
  qpid.10.25.119.20:5672-10.25.119.20:45533  mod_wsgi     30395  ANONYMOUS  anonymous  21h 12m 49s  0s              0      0
  qpid.10.25.119.20:5672-10.25.119.20:45534  mod_wsgi     30396  ANONYMOUS  anonymous  21h 12m 49s  0s              0      0
  qpid.10.25.119.20:5672-10.25.119.20:45677  __main__.py  29549  ANONYMOUS  anonymous  21h 8m 10s   3m 39s       9.05k     4
  qpid.10.25.119.20:5672-10.25.119.20:45680  mod_wsgi     30397  ANONYMOUS  anonymous  21h 8m 5s    3m 29s        372      5
  qpid.10.25.119.20:5672-10.25.119.20:45682  mod_wsgi     30395  ANONYMOUS  anonymous  21h 8m 3s    3m 29s        313      5
  qpid.10.25.119.20:5672-10.25.119.20:45717  mod_wsgi     30396  ANONYMOUS  anonymous  21h 7m 26s   19h 36m 19s    79      5
  qpid.10.25.119.20:5672-10.25.119.20:48733  mod_wsgi     30396  ANONYMOUS  anonymous  19h 36m 23s  3m 39s        330      5
  qpid.10.25.119.20:5672-10.25.119.20:54502  __main__.py  30286  ANONYMOUS  anonymous  3h 13m 7s    3h 12m 59s      5      4
  qpid.10.25.119.20:5672-10.25.119.20:57476  __main__.py  30128  ANONYMOUS  anonymous  15h 13m 7s   15h 12m 59s     5      4
  qpid.10.25.119.20:5672-10.25.119.21:48010  mod_wsgi     25641  ANONYMOUS  anonymous  11h 39m 28s  3m 49s        298      5
  qpid.10.25.119.20:5672-10.25.119.21:57016  __main__.py  25407  ANONYMOUS  anonymous  21h 12m 34s  3m 29s         15    718
  qpid.10.25.119.20:5672-10.25.119.21:57017  __main__.py  25407  ANONYMOUS  anonymous  21h 12m 34s  9s           4.07k     2
  qpid.10.25.119.20:5672-10.25.119.21:57024  __main__.py  25432  ANONYMOUS  anonymous  21h 12m 34s  3m 19s         15    712
  qpid.10.25.119.20:5672-10.25.119.21:57025  __main__.py  25432  ANONYMOUS  anonymous  21h 12m 34s  9s           4.05k     2
  qpid.10.25.119.20:5672-10.25.119.21:57028  __main__.py  25446  ANONYMOUS  anonymous  21h 12m 33s  3m 29s         15    458
  qpid.10.25.119.20:5672-10.25.119.21:57031  __main__.py  25446  ANONYMOUS  anonymous  21h 12m 33s  9s           3.23k     2
  qpid.10.25.119.20:5672-10.25.119.21:57032  __main__.py  25485  ANONYMOUS  anonymous  21h 12m 33s  2m 9s          15    330
  qpid.10.25.119.20:5672-10.25.119.21:57033  __main__.py  25485  ANONYMOUS  anonymous  21h 12m 33s  9s           2.90k     2
  qpid.10.25.119.20:5672-10.25.119.21:57038  __main__.py  25513  ANONYMOUS  anonymous  21h 12m 33s  3m 29s         15    444
  qpid.10.25.119.20:5672-10.25.119.21:57039  __main__.py  25513  ANONYMOUS  anonymous  21h 12m 33s  9s           3.24k     2
  qpid.10.25.119.20:5672-10.25.119.21:57044  __main__.py  25540  ANONYMOUS  anonymous  21h 12m 32s  1m 59s         15    348
  qpid.10.25.119.20:5672-10.25.119.21:57045  __main__.py  25540  ANONYMOUS  anonymous  21h 12m 32s  9s           2.96k     2
  qpid.10.25.119.20:5672-10.25.119.21:57050  __main__.py  25568  ANONYMOUS  anonymous  21h 12m 32s  2m 59s         15   1.34k
  qpid.10.25.119.20:5672-10.25.119.21:57051  __main__.py  25568  ANONYMOUS  anonymous  21h 12m 32s  9s           5.93k     2
  qpid.10.25.119.20:5672-10.25.119.21:57060  __main__.py  25595  ANONYMOUS  anonymous  21h 12m 31s  1m 59s         15    410
  qpid.10.25.119.20:5672-10.25.119.21:57061  __main__.py  25595  ANONYMOUS  anonymous  21h 12m 31s  9s           3.15k     2
  qpid.10.25.119.20:5672-10.25.119.21:57062  mod_wsgi     25641  ANONYMOUS  anonymous  21h 12m 31s  0s              1      1
  qpid.10.25.119.20:5672-10.25.119.21:57063  mod_wsgi     25640  ANONYMOUS  anonymous  21h 12m 31s  0s              1      1
  qpid.10.25.119.20:5672-10.25.119.21:57064  mod_wsgi     25639  ANONYMOUS  anonymous  21h 12m 31s  0s              1      1
  qpid.10.25.119.20:5672-10.25.119.21:57065  mod_wsgi     25640  ANONYMOUS  anonymous  21h 12m 31s  0s              0      0
  qpid.10.25.119.20:5672-10.25.119.21:57066  mod_wsgi     25641  ANONYMOUS  anonymous  21h 12m 31s  0s              0      0
  qpid.10.25.119.20:5672-10.25.119.21:57067  mod_wsgi     25639  ANONYMOUS  anonymous  21h 12m 31s  0s              0      0
  qpid.10.25.119.20:5672-10.25.119.21:57199  mod_wsgi     25640  ANONYMOUS  anonymous  21h 8m 10s   6h 43m 9s     233      5
  qpid.10.25.119.20:5672-10.25.119.21:57204  mod_wsgi     25639  ANONYMOUS  anonymous  21h 8m 7s    6h 36m 19s    276      5
  qpid.10.25.119.20:5672-10.25.119.21:57223  mod_wsgi     25641  ANONYMOUS  anonymous  21h 7m 44s   11h 39m 19s   127      5
  qpid.10.25.119.20:5672-10.25.119.21:57901  mod_wsgi     25640  ANONYMOUS  anonymous  6h 43m 9s    3m 39s        160      5
  qpid.10.25.119.20:5672-10.25.119.21:58155  mod_wsgi     25639  ANONYMOUS  anonymous  6h 36m 20s   3m 29s        120      5
  qpid.10.25.119.20:5672-10.25.119.22:39357  mod_wsgi     32171  ANONYMOUS  anonymous  9h 32m 14s   3m 29s        241      5
  qpid.10.25.119.20:5672-10.25.119.22:44253  __main__.py  31939  ANONYMOUS  anonymous  21h 12m 17s  3m 9s          15    612
  qpid.10.25.119.20:5672-10.25.119.22:44254  __main__.py  31939  ANONYMOUS  anonymous  21h 12m 17s  0s           3.75k     2
  qpid.10.25.119.20:5672-10.25.119.22:44259  __main__.py  31964  ANONYMOUS  anonymous  21h 12m 16s  2m 59s         15    570
  qpid.10.25.119.20:5672-10.25.119.22:44260  __main__.py  31964  ANONYMOUS  anonymous  21h 12m 16s  0s           3.62k     2
  qpid.10.25.119.20:5672-10.25.119.22:44265  __main__.py  31991  ANONYMOUS  anonymous  21h 12m 16s  3m 19s         15    412
  qpid.10.25.119.20:5672-10.25.119.22:44266  __main__.py  31991  ANONYMOUS  anonymous  21h 12m 16s  0s           3.15k     2
  qpid.10.25.119.20:5672-10.25.119.22:44271  __main__.py  32018  ANONYMOUS  anonymous  21h 12m 15s  3m 19s         15    610
  qpid.10.25.119.20:5672-10.25.119.22:44272  __main__.py  32018  ANONYMOUS  anonymous  21h 12m 15s  29s          3.75k     2
  qpid.10.25.119.20:5672-10.25.119.22:44277  __main__.py  32045  ANONYMOUS  anonymous  21h 12m 15s  2m 59s         15    860
  qpid.10.25.119.20:5672-10.25.119.22:44278  __main__.py  32045  ANONYMOUS  anonymous  21h 12m 15s  29s          4.50k     2
  qpid.10.25.119.20:5672-10.25.119.22:44283  __main__.py  32072  ANONYMOUS  anonymous  21h 12m 14s  1m 9s          15    504
  qpid.10.25.119.20:5672-10.25.119.22:44284  __main__.py  32072  ANONYMOUS  anonymous  21h 12m 14s  19s          3.43k     2
  qpid.10.25.119.20:5672-10.25.119.22:44289  __main__.py  32099  ANONYMOUS  anonymous  21h 12m 14s  2m 29s         15    544
  qpid.10.25.119.20:5672-10.25.119.22:44290  __main__.py  32099  ANONYMOUS  anonymous  21h 12m 14s  19s          3.54k     2
  qpid.10.25.119.20:5672-10.25.119.22:44300  __main__.py  32127  ANONYMOUS  anonymous  21h 12m 13s  3m 29s         15    534
  qpid.10.25.119.20:5672-10.25.119.22:44301  __main__.py  32127  ANONYMOUS  anonymous  21h 12m 13s  19s          3.51k     2
  qpid.10.25.119.20:5672-10.25.119.22:44302  mod_wsgi     32173  ANONYMOUS  anonymous  21h 12m 13s  0s              1      1
  qpid.10.25.119.20:5672-10.25.119.22:44303  mod_wsgi     32172  ANONYMOUS  anonymous  21h 12m 13s  0s              1      1
  qpid.10.25.119.20:5672-10.25.119.22:44305  mod_wsgi     32171  ANONYMOUS  anonymous  21h 12m 13s  0s              1      1
  qpid.10.25.119.20:5672-10.25.119.22:44306  mod_wsgi     32173  ANONYMOUS  anonymous  21h 12m 13s  0s              0      0
  qpid.10.25.119.20:5672-10.25.119.22:44307  mod_wsgi     32172  ANONYMOUS  anonymous  21h 12m 13s  0s              0      0
  qpid.10.25.119.20:5672-10.25.119.22:44308  mod_wsgi     32171  ANONYMOUS  anonymous  21h 12m 13s  0s              0      0
  qpid.10.25.119.20:5672-10.25.119.22:44431  mod_wsgi     32171  ANONYMOUS  anonymous  21h 8m 8s    9h 32m 9s     173      5
  qpid.10.25.119.20:5672-10.25.119.22:44447  mod_wsgi     32173  ANONYMOUS  anonymous  21h 7m 50s   6h 36m 9s     256      5
  qpid.10.25.119.20:5672-10.25.119.22:44448  mod_wsgi     32172  ANONYMOUS  anonymous  21h 7m 49s   3m 49s        375      5
  qpid.10.25.119.20:5672-10.25.119.22:45857  mod_wsgi     32173  ANONYMOUS  anonymous  6h 36m 17s   3m 39s        134      5
  qpid.10.25.119.20:5672-10.25.119.23:32979  mod_wsgi     1166   ANONYMOUS  anonymous  6h 37m 49s   3m 39s        147      5
  qpid.10.25.119.20:5672-10.25.119.23:33039  mod_wsgi     1168   ANONYMOUS  anonymous  6h 36m 18s   3m 29s        163      5
  qpid.10.25.119.20:5672-10.25.119.23:57658  mod_wsgi     1167   ANONYMOUS  anonymous  8h 12m 34s   3m 29s        211      5
  qpid.10.25.119.20:5672-10.25.119.23:59015  __main__.py  926    ANONYMOUS  anonymous  21h 11m 59s  3m 29s         15    514
  qpid.10.25.119.20:5672-10.25.119.23:59016  __main__.py  926    ANONYMOUS  anonymous  21h 11m 59s  9s           3.46k     2
  qpid.10.25.119.20:5672-10.25.119.23:59021  __main__.py  951    ANONYMOUS  anonymous  21h 11m 58s  3m 19s         15    516
  qpid.10.25.119.20:5672-10.25.119.23:59022  __main__.py  951    ANONYMOUS  anonymous  21h 11m 58s  9s           3.45k     2
  qpid.10.25.119.20:5672-10.25.119.23:59027  __main__.py  978    ANONYMOUS  anonymous  21h 11m 58s  3m 29s         15    740
  qpid.10.25.119.20:5672-10.25.119.23:59028  __main__.py  978    ANONYMOUS  anonymous  21h 11m 58s  9s           4.13k     2
  qpid.10.25.119.20:5672-10.25.119.23:59033  __main__.py  1005   ANONYMOUS  anonymous  21h 11m 57s  3m 19s         15    372
  qpid.10.25.119.20:5672-10.25.119.23:59034  __main__.py  1005   ANONYMOUS  anonymous  21h 11m 57s  9s           3.03k     2
  qpid.10.25.119.20:5672-10.25.119.23:59039  __main__.py  1032   ANONYMOUS  anonymous  21h 11m 57s  3m 29s         15    364
  qpid.10.25.119.20:5672-10.25.119.23:59040  __main__.py  1032   ANONYMOUS  anonymous  21h 11m 57s  9s           3.00k     2
  qpid.10.25.119.20:5672-10.25.119.23:59045  __main__.py  1059   ANONYMOUS  anonymous  21h 11m 56s  3m 29s         15    346
  qpid.10.25.119.20:5672-10.25.119.23:59047  __main__.py  1059   ANONYMOUS  anonymous  21h 11m 56s  9s           2.95k     2
  qpid.10.25.119.20:5672-10.25.119.23:59051  __main__.py  1086   ANONYMOUS  anonymous  21h 11m 56s  0s             15    358
  qpid.10.25.119.20:5672-10.25.119.23:59052  __main__.py  1086   ANONYMOUS  anonymous  21h 11m 56s  0s           2.99k     2
  qpid.10.25.119.20:5672-10.25.119.23:59061  __main__.py  1113   ANONYMOUS  anonymous  21h 11m 55s  3m 19s         15    386
  qpid.10.25.119.20:5672-10.25.119.23:59062  __main__.py  1113   ANONYMOUS  anonymous  21h 11m 55s  9s           3.07k     2
  qpid.10.25.119.20:5672-10.25.119.23:59063  mod_wsgi     1166   ANONYMOUS  anonymous  21h 11m 55s  0s              1      1
  qpid.10.25.119.20:5672-10.25.119.23:59064  mod_wsgi     1168   ANONYMOUS  anonymous  21h 11m 55s  0s              1      1
  qpid.10.25.119.20:5672-10.25.119.23:59065  mod_wsgi     1166   ANONYMOUS  anonymous  21h 11m 55s  0s              0      0
  qpid.10.25.119.20:5672-10.25.119.23:59066  mod_wsgi     1168   ANONYMOUS  anonymous  21h 11m 55s  0s              0      0
  qpid.10.25.119.20:5672-10.25.119.23:59067  mod_wsgi     1167   ANONYMOUS  anonymous  21h 11m 55s  0s              1      1
  qpid.10.25.119.20:5672-10.25.119.23:59068  mod_wsgi     1167   ANONYMOUS  anonymous  21h 11m 55s  0s              0      0
  qpid.10.25.119.20:5672-10.25.119.23:59182  mod_wsgi     1167   ANONYMOUS  anonymous  21h 8m 9s    8h 12m 29s    206      5
  qpid.10.25.119.20:5672-10.25.119.23:59185  mod_wsgi     1168   ANONYMOUS  anonymous  21h 8m 6s    6h 36m 9s     242      5
  qpid.10.25.119.20:5672-10.25.119.23:59211  mod_wsgi     1166   ANONYMOUS  anonymous  21h 7m 37s   6h 37m 49s    269      5
  qpid.[::1]:5672-[::1]:49034                qpid-stat    26778  ANONYMOUS  anonymous  0s           0s              1      0
Actions #7

Updated by jluza almost 8 years ago

pulp server status:

{
    "api_version": "2",
    "database_connection": {
        "connected": true
    },
    "known_workers": [
        {
            "_id": "scheduler@pulp-docker01.web.stage.ext.phx2.redhat.com",
            "_ns": "workers",
            "last_heartbeat": "2016-06-14T15:05:01Z"
        },
        {
            "_id": "resource_manager@pulp-docker01.web.stage.ext.phx2.redhat.com",
            "_ns": "workers",
            "last_heartbeat": "2016-06-14T15:04:46Z"
        },
        {
            "_id": "reserved_resource_worker-0@pulp-docker01.web.stage.ext.phx2.redhat.com",
            "_ns": "workers",
            "last_heartbeat": "2016-06-14T15:04:46Z"
        },
        {
            "_id": "reserved_resource_worker-1@pulp-docker01.web.stage.ext.phx2.redhat.com",
            "_ns": "workers",
            "last_heartbeat": "2016-06-14T15:04:46Z"
        },
        {
            "_id": "reserved_resource_worker-2@pulp-docker01.web.stage.ext.phx2.redhat.com",
            "_ns": "workers",
            "last_heartbeat": "2016-06-14T15:04:46Z"
        },
        {
            "_id": "reserved_resource_worker-3@pulp-docker01.web.stage.ext.phx2.redhat.com",
            "_ns": "workers",
            "last_heartbeat": "2016-06-14T15:04:46Z"
        }
    ],
    "messaging_connection": {
        "connected": true
    },
    "versions": {
        "platform_version": "2.8.1b1"
    }
}
Actions #8

Updated by bmbouter almost 8 years ago

Your pulp server status looks very reasonable, but your qpid-stat -c does show many unexpected connections.

What are the process names that correspond to those pids from `qpid-stat -c`?

For example, here is my output on a system with 4 workers:

Connections
  connection                   cproc      cpid  mech       auth            connected      idle           msgIn  msgOut
  ======================================================================================================================
  qpid.[::1]:5672-[::1]:58096  celery     2020  ANONYMOUS  anonymous@QPID  1d 2h 51m 20s  1d 2h 51m 16s     0      0
  qpid.[::1]:5672-[::1]:58102  celery     2020  ANONYMOUS  anonymous@QPID  1d 2h 51m 19s  14s               6   15.9k
  qpid.[::1]:5672-[::1]:58108  celery     2020  ANONYMOUS  anonymous@QPID  1d 2h 51m 19s  7m 44s           60      4
  qpid.[::1]:5672-[::1]:58212  celery     2027  ANONYMOUS  anonymous@QPID  1d 2h 50m 28s  37m 44s          12     40
  qpid.[::1]:5672-[::1]:58214  celery     2027  ANONYMOUS  anonymous@QPID  1d 2h 50m 28s  14s            3.20k     2
  qpid.[::1]:5672-[::1]:58216  celery     2031  ANONYMOUS  anonymous@QPID  1d 2h 50m 28s  37m 44s          12     40
  qpid.[::1]:5672-[::1]:58218  celery     2031  ANONYMOUS  anonymous@QPID  1d 2h 50m 28s  14s            3.20k     2
  qpid.[::1]:5672-[::1]:58220  celery     2025  ANONYMOUS  anonymous@QPID  1d 2h 50m 28s  7m 44s           12     40
  qpid.[::1]:5672-[::1]:58222  celery     2025  ANONYMOUS  anonymous@QPID  1d 2h 50m 28s  14s            3.20k     2
  qpid.[::1]:5672-[::1]:58226  celery     2029  ANONYMOUS  anonymous@QPID  1d 2h 50m 28s  7m 44s           12     40
  qpid.[::1]:5672-[::1]:58228  celery     2029  ANONYMOUS  anonymous@QPID  1d 2h 50m 28s  14s            3.20k     2
  qpid.[::1]:5672-[::1]:58234  celery     2021  ANONYMOUS  anonymous@QPID  1d 2h 50m 27s  1d 2h 50m 26s    12     12
  qpid.[::1]:5672-[::1]:58238  celery     2021  ANONYMOUS  anonymous@QPID  1d 2h 50m 27s  14s            3.12k     2
  qpid.[::1]:5672-[::1]:58882  celery     2323  ANONYMOUS  anonymous@QPID  1d 2h 21m 19s  37m 44s          32      4
  qpid.[::1]:5672-[::1]:58892  celery     2327  ANONYMOUS  anonymous@QPID  1d 1h 51m 19s  7m 44s           32      4
  qpid.[::1]:5672-[::1]:58940  mod_wsgi   5650  ANONYMOUS  anonymous@QPID  23h 41m 6s     23h 41m 1s        1      1
  qpid.[::1]:5672-[::1]:58946  mod_wsgi   5650  ANONYMOUS  anonymous@QPID  23h 41m 6s     0s                0      0
  qpid.[::1]:5672-[::1]:58986  mod_wsgi   5929  ANONYMOUS  anonymous@QPID  21h 37m 51s    21h 37m 44s       1      1
  qpid.[::1]:5672-[::1]:58988  mod_wsgi   5929  ANONYMOUS  anonymous@QPID  21h 37m 51s    0s                0      0
  qpid.[::1]:5672-[::1]:58990  mod_wsgi   5930  ANONYMOUS  anonymous@QPID  21h 37m 51s    21h 37m 44s       1      1
  qpid.[::1]:5672-[::1]:58992  mod_wsgi   5930  ANONYMOUS  anonymous@QPID  21h 37m 51s    0s                0      0
  qpid.[::1]:5672-[::1]:59130  qpid-stat  7827  ANONYMOUS  anonymous@QPID  0s             0s                1      0
Actions #9

Updated by jluza almost 8 years ago

Sorted by PID (same columns as the `qpid-stat -c` output above: connection, cproc, cpid, mech, auth, connected, idle, msgIn, msgOut):

qpid.127.0.0.1:5672-127.0.0.1:51539 celery 3639 PLAIN guest@QPID 7h 48m 14s 7h 48m 14s 0 0
qpid.127.0.0.1:5672-127.0.0.1:51540 celery 3639 PLAIN guest@QPID 7h 48m 14s 4s 6 4.79k
qpid.127.0.0.1:5672-127.0.0.1:51598 celery 3639 PLAIN guest@QPID 7h 40m 2s 9m 54s 21 4
qpid.127.0.0.1:5672-127.0.0.1:51535 __main__.py 3790 PLAIN guest@QPID 7h 48m 22s 7h 48m 4s 12 13
qpid.127.0.0.1:5672-127.0.0.1:51536 __main__.py 3790 PLAIN guest@QPID 7h 48m 22s 4s 941 2
qpid.127.0.0.1:5672-127.0.0.1:51527 __main__.py 4002 PLAIN guest@QPID 7h 48m 22s 9m 54s 12 22
qpid.127.0.0.1:5672-127.0.0.1:51531 __main__.py 4002 PLAIN guest@QPID 7h 48m 22s 4s 968 2
qpid.127.0.0.1:5672-127.0.0.1:51528 __main__.py 4017 PLAIN guest@QPID 7h 48m 22s 39m 54s 12 21
qpid.127.0.0.1:5672-127.0.0.1:51533 __main__.py 4017 PLAIN guest@QPID 7h 48m 22s 4s 965 2
qpid.127.0.0.1:5672-127.0.0.1:51529 __main__.py 4036 PLAIN guest@QPID 7h 48m 22s 39m 54s 12 21
qpid.127.0.0.1:5672-127.0.0.1:51532 __main__.py 4036 PLAIN guest@QPID 7h 48m 22s 4s 965 2
qpid.127.0.0.1:5672-127.0.0.1:51599 __main__.py 4060 PLAIN guest@QPID 7h 40m 2s 9m 54s 13 4
qpid.127.0.0.1:5672-127.0.0.1:51530 __main__.py 4062 PLAIN guest@QPID 7h 48m 22s 9m 54s 12 22
qpid.127.0.0.1:5672-127.0.0.1:51534 __main__.py 4062 PLAIN guest@QPID 7h 48m 22s 4s 968 2
qpid.[::1]:5672-[::1]:38112 qpid-stat 4091 ANONYMOUS anonymous 0s 0s 1 0
qpid.127.0.0.1:5672-127.0.0.1:51890 __main__.py 4134 PLAIN guest@QPID 7h 10m 2s 39m 54s 12 4
qpid.127.0.0.1:5672-127.0.0.1:51557 mod_wsgi 4179 PLAIN guest@QPID 7h 46m 42s 7h 46m 34s 0 0
qpid.127.0.0.1:5672-127.0.0.1:51567 mod_wsgi 4179 PLAIN guest@QPID 7h 45m 15s 7h 45m 14s 0 0
qpid.127.0.0.1:5672-127.0.0.1:51606 mod_wsgi 4179 PLAIN guest@QPID 7h 39m 7s 7h 39m 4s 0 0
qpid.127.0.0.1:5672-127.0.0.1:51616 mod_wsgi 4179 PLAIN guest@QPID 7h 37m 42s 7h 37m 34s 0 0
qpid.127.0.0.1:5672-127.0.0.1:51667 mod_wsgi 4179 PLAIN guest@QPID 7h 30m 6s 7h 30m 4s 0 0
qpid.127.0.0.1:5672-127.0.0.1:51809 mod_wsgi 4179 PLAIN guest@QPID 7h 22m 43s 7h 22m 34s 0 0
qpid.127.0.0.1:5672-127.0.0.1:51849 mod_wsgi 4179 PLAIN guest@QPID 7h 16m 40s 7h 16m 34s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52122 mod_wsgi 4179 PLAIN guest@QPID 6h 35m 25s 6h 35m 24s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52160 mod_wsgi 4179 PLAIN guest@QPID 6h 29m 33s 6h 29m 24s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52251 mod_wsgi 4179 PLAIN guest@QPID 6h 27m 58s 6h 27m 54s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52301 mod_wsgi 4179 PLAIN guest@QPID 6h 22m 14s 6h 22m 14s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52316 mod_wsgi 4179 PLAIN guest@QPID 6h 20m 35s 6h 20m 34s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52461 mod_wsgi 4179 PLAIN guest@QPID 5h 58m 25s 5h 58m 24s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52557 mod_wsgi 4179 PLAIN guest@QPID 5h 43m 15s 5h 43m 14s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52572 mod_wsgi 4179 PLAIN guest@QPID 5h 41m 31s 5h 41m 24s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52800 mod_wsgi 4179 PLAIN guest@QPID 5h 20m 58s 5h 20m 54s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52911 mod_wsgi 4179 PLAIN guest@QPID 5h 5m 23s 5h 5m 14s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52924 mod_wsgi 4179 PLAIN guest@QPID 5h 3m 45s 5h 3m 44s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53029 mod_wsgi 4179 PLAIN guest@QPID 4h 48m 58s 4h 48m 54s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53039 mod_wsgi 4179 PLAIN guest@QPID 4h 47m 33s 4h 47m 24s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53079 mod_wsgi 4179 PLAIN guest@QPID 4h 41m 45s 4h 41m 44s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53372 mod_wsgi 4179 PLAIN guest@QPID 4h 12m 2s 4h 11m 54s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53434 mod_wsgi 4179 PLAIN guest@QPID 4h 2m 39s 4h 2m 34s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53478 mod_wsgi 4179 PLAIN guest@QPID 3h 56m 7s 3h 56m 4s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53491 mod_wsgi 4179 PLAIN guest@QPID 3h 54m 11s 3h 54m 4s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53839 mod_wsgi 4179 PLAIN guest@QPID 3h 15m 22s 3h 15m 14s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53881 mod_wsgi 4179 PLAIN guest@QPID 3h 8m 39s 3h 8m 34s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54043 mod_wsgi 4179 PLAIN guest@QPID 2h 43m 30s 2h 43m 24s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54053 mod_wsgi 4179 PLAIN guest@QPID 2h 42m 10s 2h 42m 4s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54100 mod_wsgi 4179 PLAIN guest@QPID 2h 36m 2s 2h 35m 54s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54927 mod_wsgi 4179 PLAIN guest@QPID 1h 2m 36s 1h 2m 34s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54941 mod_wsgi 4179 PLAIN guest@QPID 1h 1m 1s 1h 0m 54s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54983 mod_wsgi 4179 PLAIN guest@QPID 54m 16s 54m 14s 0 0
qpid.127.0.0.1:5672-127.0.0.1:55096 mod_wsgi 4179 PLAIN guest@QPID 37m 0s 36m 54s 0 0
qpid.127.0.0.1:5672-127.0.0.1:55136 mod_wsgi 4179 PLAIN guest@QPID 30m 41s 30m 34s 0 0
qpid.127.0.0.1:5672-127.0.0.1:55143 mod_wsgi 4179 PLAIN guest@QPID 29m 27s 29m 24s 0 0
qpid.[::1]:5672-[::1]:34199 mod_wsgi 4179 ANONYMOUS anonymous 7h 47m 36s 4s 0 0
qpid.127.0.0.1:5672-127.0.0.1:51656 mod_wsgi 4180 PLAIN guest@QPID 7h 31m 37s 7h 31m 34s 0 0
qpid.127.0.0.1:5672-127.0.0.1:51860 mod_wsgi 4180 PLAIN guest@QPID 7h 15m 9s 7h 15m 4s 0 0
qpid.127.0.0.1:5672-127.0.0.1:51894 mod_wsgi 4180 PLAIN guest@QPID 7h 9m 20s 7h 9m 14s 0 0
qpid.127.0.0.1:5672-127.0.0.1:51947 mod_wsgi 4180 PLAIN guest@QPID 7h 1m 31s 7h 1m 24s 0 0
qpid.127.0.0.1:5672-127.0.0.1:51999 mod_wsgi 4180 PLAIN guest@QPID 6h 52m 50s 6h 52m 44s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52015 mod_wsgi 4180 PLAIN guest@QPID 6h 51m 14s 6h 51m 4s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52351 mod_wsgi 4180 PLAIN guest@QPID 6h 15m 6s 6h 15m 4s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52361 mod_wsgi 4180 PLAIN guest@QPID 6h 13m 25s 6h 13m 24s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52398 mod_wsgi 4180 PLAIN guest@QPID 6h 7m 42s 6h 7m 34s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52414 mod_wsgi 4180 PLAIN guest@QPID 6h 6m 2s 6h 5m 54s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52451 mod_wsgi 4180 PLAIN guest@QPID 6h 0m 2s 5h 59m 54s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52810 mod_wsgi 4180 PLAIN guest@QPID 5h 19m 33s 5h 19m 24s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52854 mod_wsgi 4180 PLAIN guest@QPID 5h 13m 32s 5h 13m 24s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52971 mod_wsgi 4180 PLAIN guest@QPID 4h 57m 9s 4h 57m 4s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52985 mod_wsgi 4180 PLAIN guest@QPID 4h 55m 38s 4h 55m 34s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53089 mod_wsgi 4180 PLAIN guest@QPID 4h 40m 12s 4h 40m 4s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53123 mod_wsgi 4180 PLAIN guest@QPID 4h 34m 40s 4h 34m 34s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53133 mod_wsgi 4180 PLAIN guest@QPID 4h 33m 14s 4h 33m 14s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53265 mod_wsgi 4180 PLAIN guest@QPID 4h 27m 16s 4h 27m 14s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53280 mod_wsgi 4180 PLAIN guest@QPID 4h 25m 38s 4h 25m 34s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53318 mod_wsgi 4180 PLAIN guest@QPID 4h 19m 46s 4h 19m 44s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53421 mod_wsgi 4180 PLAIN guest@QPID 4h 4m 26s 4h 4m 24s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53952 mod_wsgi 4180 PLAIN guest@QPID 2h 57m 50s 2h 57m 44s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54004 mod_wsgi 4180 PLAIN guest@QPID 2h 49m 56s 2h 49m 54s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54152 mod_wsgi 4180 PLAIN guest@QPID 2h 28m 16s 2h 28m 14s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54263 mod_wsgi 4180 PLAIN guest@QPID 2h 26m 46s 2h 26m 44s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54302 mod_wsgi 4180 PLAIN guest@QPID 2h 20m 30s 2h 20m 24s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54359 mod_wsgi 4180 PLAIN guest@QPID 2h 12m 33s 2h 12m 24s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54374 mod_wsgi 4180 PLAIN guest@QPID 2h 10m 50s 2h 10m 44s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54416 mod_wsgi 4180 PLAIN guest@QPID 2h 4m 44s 2h 4m 34s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54430 mod_wsgi 4180 PLAIN guest@QPID 2h 3m 11s 2h 3m 4s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54474 mod_wsgi 4180 PLAIN guest@QPID 1h 57m 4s 1h 57m 4s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54485 mod_wsgi 4180 PLAIN guest@QPID 1h 55m 37s 1h 55m 34s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54622 mod_wsgi 4180 PLAIN guest@QPID 1h 34m 2s 1h 33m 54s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54633 mod_wsgi 4180 PLAIN guest@QPID 1h 32m 26s 1h 32m 24s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54773 mod_wsgi 4180 PLAIN guest@QPID 1h 25m 57s 1h 25m 54s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54784 mod_wsgi 4180 PLAIN guest@QPID 1h 24m 22s 1h 24m 14s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54995 mod_wsgi 4180 PLAIN guest@QPID 52m 16s 52m 14s 0 0
qpid.127.0.0.1:5672-127.0.0.1:55280 mod_wsgi 4180 PLAIN guest@QPID 23m 1s 22m 54s 0 0
qpid.127.0.0.1:5672-127.0.0.1:55295 mod_wsgi 4180 PLAIN guest@QPID 21m 23s 21m 14s 0 0
qpid.127.0.0.1:5672-127.0.0.1:55333 mod_wsgi 4180 PLAIN guest@QPID 15m 22s 15m 14s 0 0
qpid.127.0.0.1:5672-127.0.0.1:55343 mod_wsgi 4180 PLAIN guest@QPID 13m 55s 13m 54s 0 0
qpid.127.0.0.1:5672-127.0.0.1:55389 mod_wsgi 4180 PLAIN guest@QPID 7m 31s 7m 26s 0 0
qpid.127.0.0.1:5672-127.0.0.1:55405 mod_wsgi 4180 PLAIN guest@QPID 6m 2s 5m 54s 0 0
qpid.[::1]:5672-[::1]:34200 mod_wsgi 4180 ANONYMOUS anonymous 7h 47m 35s 4s 0 0
qpid.127.0.0.1:5672-127.0.0.1:51798 mod_wsgi 4181 PLAIN guest@QPID 7h 24m 14s 7h 24m 14s 0 0
qpid.127.0.0.1:5672-127.0.0.1:51907 mod_wsgi 4181 PLAIN guest@QPID 7h 7m 38s 7h 7m 34s 0 0
qpid.127.0.0.1:5672-127.0.0.1:51960 mod_wsgi 4181 PLAIN guest@QPID 6h 59m 23s 6h 59m 14s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52054 mod_wsgi 4181 PLAIN guest@QPID 6h 44m 29s 6h 44m 24s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52065 mod_wsgi 4181 PLAIN guest@QPID 6h 42m 59s 6h 42m 54s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52105 mod_wsgi 4181 PLAIN guest@QPID 6h 36m 55s 6h 36m 54s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52506 mod_wsgi 4181 PLAIN guest@QPID 5h 51m 18s 5h 51m 14s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52517 mod_wsgi 4181 PLAIN guest@QPID 5h 49m 53s 5h 49m 44s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52608 mod_wsgi 4181 PLAIN guest@QPID 5h 35m 46s 5h 35m 44s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52618 mod_wsgi 4181 PLAIN guest@QPID 5h 34m 13s 5h 34m 4s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52655 mod_wsgi 4181 PLAIN guest@QPID 5h 28m 22s 5h 28m 14s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52765 mod_wsgi 4181 PLAIN guest@QPID 5h 26m 45s 5h 26m 44s 0 0
qpid.127.0.0.1:5672-127.0.0.1:52869 mod_wsgi 4181 PLAIN guest@QPID 5h 11m 48s 5h 11m 44s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53328 mod_wsgi 4181 PLAIN guest@QPID 4h 18m 4s 4h 18m 4s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53385 mod_wsgi 4181 PLAIN guest@QPID 4h 10m 14s 4h 10m 14s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53534 mod_wsgi 4181 PLAIN guest@QPID 3h 47m 9s 3h 47m 4s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53547 mod_wsgi 4181 PLAIN guest@QPID 3h 45m 20s 3h 45m 14s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53583 mod_wsgi 4181 PLAIN guest@QPID 3h 39m 37s 3h 39m 34s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53593 mod_wsgi 4181 PLAIN guest@QPID 3h 38m 4s 3h 38m 4s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53629 mod_wsgi 4181 PLAIN guest@QPID 3h 32m 9s 3h 32m 4s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53644 mod_wsgi 4181 PLAIN guest@QPID 3h 30m 30s 3h 30m 24s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53776 mod_wsgi 4181 PLAIN guest@QPID 3h 24m 47s 3h 24m 44s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53786 mod_wsgi 4181 PLAIN guest@QPID 3h 23m 6s 3h 23m 4s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53824 mod_wsgi 4181 PLAIN guest@QPID 3h 17m 9s 3h 17m 4s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53898 mod_wsgi 4181 PLAIN guest@QPID 3h 6m 47s 3h 6m 44s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53938 mod_wsgi 4181 PLAIN guest@QPID 3h 0m 1s 2h 59m 54s 0 0
qpid.127.0.0.1:5672-127.0.0.1:53996 mod_wsgi 4181 PLAIN guest@QPID 2h 51m 17s 2h 51m 14s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54111 mod_wsgi 4181 PLAIN guest@QPID 2h 34m 37s 2h 34m 34s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54316 mod_wsgi 4181 PLAIN guest@QPID 2h 18m 57s 2h 18m 54s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54523 mod_wsgi 4181 PLAIN guest@QPID 1h 49m 15s 1h 49m 14s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54534 mod_wsgi 4181 PLAIN guest@QPID 1h 47m 22s 1h 47m 14s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54561 mod_wsgi 4181 PLAIN guest@QPID 1h 43m 1s 1h 42m 54s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54562 mod_wsgi 4181 PLAIN guest@QPID 1h 42m 59s 1h 42m 54s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54578 mod_wsgi 4181 PLAIN guest@QPID 1h 41m 11s 1h 41m 4s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54586 mod_wsgi 4181 PLAIN guest@QPID 1h 39m 53s 1h 39m 44s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54821 mod_wsgi 4181 PLAIN guest@QPID 1h 18m 22s 1h 18m 14s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54839 mod_wsgi 4181 PLAIN guest@QPID 1h 16m 41s 1h 16m 34s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54876 mod_wsgi 4181 PLAIN guest@QPID 1h 10m 35s 1h 10m 34s 0 0
qpid.127.0.0.1:5672-127.0.0.1:54886 mod_wsgi 4181 PLAIN guest@QPID 1h 9m 2s 1h 8m 54s 0 0
qpid.127.0.0.1:5672-127.0.0.1:55036 mod_wsgi 4181 PLAIN guest@QPID 46m 13s 46m 4s 0 0
qpid.127.0.0.1:5672-127.0.0.1:55046 mod_wsgi 4181 PLAIN guest@QPID 44m 41s 44m 34s 0 0
qpid.127.0.0.1:5672-127.0.0.1:55083 mod_wsgi 4181 PLAIN guest@QPID 38m 30s 38m 24s 0 0

Processes:

[pulp-docker01.web.stage.ext.phx2.redhat.com] [12:48:01 PM]
[root@pulp-docker01 ~]# ps aux | grep 4181
root      4110  0.0  0.0 103308   852 pts/2    R+   12:48   0:00 grep 4181
apache    4181  3.4  1.9 3039836 155976 ?      Sl   Jun08 307:59 (wsgi:pulp)    
[pulp-docker01.web.stage.ext.phx2.redhat.com] [12:48:36 PM]
[root@pulp-docker01 ~]# ps aux | grep 4180
root      4119  0.0  0.0 103308   848 pts/2    R+   12:48   0:00 grep 4180
apache    4180  3.4  1.9 3074024 153456 ?      Sl   Jun08 309:42 (wsgi:pulp)    
[pulp-docker01.web.stage.ext.phx2.redhat.com] [12:48:42 PM]
[root@pulp-docker01 ~]# ps aux | grep 4179
root      4127  0.0  0.0 103308   856 pts/2    S+   12:48   0:00 grep 4179
apache    4179  3.6  1.9 2978372 160724 ?      Sl   Jun08 323:57 (wsgi:pulp)    

[pulp-docker01.web.stage.ext.phx2.redhat.com] [12:49:09 PM]
[root@pulp-docker01 ~]# ps aux | grep 4062
apache    4062  0.0  0.6 731816 50604 ?        Sl   Jun08   6:09 /usr/bin/python -m celery.__main__ worker --heartbeat-interval=30 -c 1 -n reserved_resource_worker-3@pulp-docker01.web.stage.ext.phx2.redhat.com --events --app=pulp.server.async.app --loglevel=INFO --logfile=/var/log/pulp/reserved_resource_worker-3.log --pidfile=/var/run/pulp/reserved_resource_worker-3.pid
root      4219  0.0  0.0 103312   860 pts/2    S+   12:49   0:00 grep 4062
[pulp-docker01.web.stage.ext.phx2.redhat.com] [12:49:23 PM]
[root@pulp-docker01 ~]# ps aux | grep 4060
apache    4060  0.0  0.5 732212 47824 ?        Sl   Jun08   3:01 /usr/bin/python -m celery.__main__ worker --heartbeat-interval=30 -c 1 -n reserved_resource_worker-0@pulp-docker01.web.stage.ext.phx2.redhat.com --events --app=pulp.server.async.app --loglevel=INFO --logfile=/var/log/pulp/reserved_resource_worker-0.log --pidfile=/var/run/pulp/reserved_resource_worker-0.pid
root      4226  0.0  0.0 103308   852 pts/2    R+   12:49   0:00 grep 4060
[pulp-docker01.web.stage.ext.phx2.redhat.com] [12:49:32 PM]
[root@pulp-docker01 ~]# ps aux | grep 4036
apache    4036  0.0  0.6 732360 51272 ?        Sl   Jun08   7:38 /usr/bin/python -m celery.__main__ worker --heartbeat-interval=30 -c 1 -n reserved_resource_worker-2@pulp-docker01.web.stage.ext.phx2.redhat.com --events --app=pulp.server.async.app --loglevel=INFO --logfile=/var/log/pulp/reserved_resource_worker-2.log --pidfile=/var/run/pulp/reserved_resource_worker-2.pid
root      4286  0.0  0.0 103312   864 pts/2    S+   12:49   0:00 grep 4036
[pulp-docker01.web.stage.ext.phx2.redhat.com] [12:49:35 PM]
[root@pulp-docker01 ~]# ps aux | grep 4017
apache    4017  0.0  0.6 731960 50700 ?        Sl   Jun08   6:16 /usr/bin/python -m celery.__main__ worker --heartbeat-interval=30 -c 1 -n reserved_resource_worker-1@pulp-docker01.web.stage.ext.phx2.redhat.com --events --app=pulp.server.async.app --loglevel=INFO --logfile=/var/log/pulp/reserved_resource_worker-1.log --pidfile=/var/run/pulp/reserved_resource_worker-1.pid
root      4301  0.0  0.0 103308   856 pts/2    R+   12:49   0:00 grep 4017
[pulp-docker01.web.stage.ext.phx2.redhat.com] [12:49:38 PM]
[root@pulp-docker01 ~]# ps aux | grep 4002
apache    4002  0.0  0.5 731632 48276 ?        Sl   Jun08   6:05 /usr/bin/python -m celery.__main__ worker --heartbeat-interval=30 -c 1 -n reserved_resource_worker-0@pulp-docker01.web.stage.ext.phx2.redhat.com --events --app=pulp.server.async.app --loglevel=INFO --logfile=/var/log/pulp/reserved_resource_worker-0.log --pidfile=/var/run/pulp/reserved_resource_worker-0.pid
root      4311  0.0  0.0 103312   864 pts/2    S+   12:49   0:00 grep 4002
[pulp-docker01.web.stage.ext.phx2.redhat.com] [12:49:45 PM]
[root@pulp-docker01 ~]# ps aux | grep 3790
apache    3790  0.0  0.5 731672 45048 ?        Sl   Jun08   6:48 /usr/bin/python -m celery.__main__ worker --heartbeat-interval=30 -c 1 -n resource_manager@pulp-docker01.web.stage.ext.phx2.redhat.com --events --app=pulp.server.async.app --loglevel=INFO -Q resource_manager --logfile=/var/log/pulp/resource_manager.log --pidfile=/var/run/pulp/resource_manager.pid
root      4319  0.0  0.0 103308   856 pts/2    R+   12:49   0:00 grep 3790
[pulp-docker01.web.stage.ext.phx2.redhat.com] [12:49:56 PM]
[root@pulp-docker01 ~]# ps aux | grep 3639
apache    3639  0.1  0.3 1050456 31904 ?       Sl   Jun08  17:33 /usr/bin/python /usr/bin/celery beat --app=pulp.server.async.celery_instance.celery --scheduler=pulp.server.async.scheduler.Scheduler --workdir=/var/run/pulp/ -f /var/log/pulp/celerybeat.log -l INFO --detach --pidfile=/var/run/pulp/celerybeat.pid
root      4335  0.0  0.0 103312   864 pts/2    S+   12:50   0:00 grep 3639
Actions #10

Updated by jluza almost 8 years ago

It looks like Apache created a connection for every new fork of a worker and then forgot to close it.

Actions #11

Updated by bmbouter almost 8 years ago

This is interesting!

Your mod_wsgi process is leaking Qpid connections. Look at the large numbers of them for pids 4179, 4180, and 4181. Is httpd configured to do any process/WSGI recycling? We had talked about a workaround being put in place. Perhaps it's not closing the connections adequately somehow? Look at how it starts them over time from the timestamps. It's regularly starting these connections.

Given ^, this has to be fixed, because your installation won't run no matter what the connection limit is set to.

What are your production httpd configs for this installation?

I don't know why this isn't seen on other installations. I've not observed this behavior on my dev installation, but we need to get it to reproduce somehow. What do you think?
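
One way to catch when the new connections appear (a sketch, not something run in this ticket; the log path is an arbitrary example) is to sample the mod_wsgi connection count periodically and correlate the timestamps with system activity:

# Log the number of Qpid connections held by mod_wsgi once a minute.
while true; do
    n=$(qpid-stat -c | awk '$2 == "mod_wsgi"' | wc -l)
    echo "$(date '+%F %T') mod_wsgi connections: $n" >> /tmp/qpid-wsgi-connections.log
    sleep 60
done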

Actions #13

Updated by jluza almost 8 years ago

Here are the relevant (I hope) parts of our server.conf:

[messaging]
url: tcp://localhost:5672
transport: qpid

[tasks]
broker_url: qpid://guest@localhost/
celery_require_ssl: false

and /etc/qpid/qpidd.conf

auth=no
max-connections=2000

I'm not sure which Apache config you need, so here's /etc/httpd/conf.d/pulp.conf:

AddType application/x-pkcs7-crl .crl
AddType application/x-x509-ca-cert .crt

SSLCACertificateFile /etc/pki/pulp/ca.crt

SSLInsecureRenegotiation on

WSGIProcessGroup pulp
WSGIApplicationGroup pulp
WSGIDaemonProcess pulp user=apache group=apache processes=3 display-name=%{GROUP}

WSGISocketPrefix run/wsgi
WSGIScriptAlias /pulp/api /srv/pulp/webservices.wsgi
WSGIImportScript /srv/pulp/webservices.wsgi process-group=pulp application-group=pulp

<Files webservices.wsgi>
    WSGIPassAuthorization On
    WSGIProcessGroup pulp
    WSGIApplicationGroup pulp
    SSLRenegBufferSize 10485760
    SSLRequireSSL
    SSLVerifyDepth 3
    SSLOptions +StdEnvVars +ExportCertData
    SSLVerifyClient optional
</Files>

<VirtualHost *:80>
    Include /etc/pulp/vhosts80/*.conf
</VirtualHost>
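
For context on the recycling question in comment #11: mod_wsgi can recycle daemon processes via the maximum-requests option on WSGIDaemonProcess. The line below is only an illustration of what that would look like; it is not in the config above and is not a recommendation made in this thread.

# Hypothetical illustration only: recycle each pulp daemon process after
# 1000 requests. The config above sets no maximum-requests, so its daemon
# processes are never recycled.
WSGIDaemonProcess pulp user=apache group=apache processes=3 maximum-requests=1000 display-name=%{GROUP}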
Actions #14

Updated by bmbouter almost 8 years ago

This is a short aside and probably not related to this specific root cause, but you should move away from using:

broker_url: qpid://guest@localhost/

And instead use:

broker_url: qpid://localhost/

Otherwise you will probably have a deadlock at some point [0]. BTW, the recommended form ^ is the default for 2.8, which you are running, so leaving it commented out would also solve this for you.

[0]: https://pulp.plan.io/issues/1152

Actions #15

Updated by jluza almost 8 years ago

bmbouter wrote:

This is a short aside and probably not related to this specific root cause, but you should move away from using:

[...]

And instead use:

[...]

Otherwise you will probably have a deadlock at some point [0]. BTW, the recommended form ^ is the default for 2.8, which you are running, so leaving it commented out would also solve this for you.

[0]: https://pulp.plan.io/issues/1152

I think he had some issue with that. In old Pulp (2.5.3) we used qpid://localhost/, but then with a newer version of kombu or something it stopped working.

Actions #16

Updated by bmbouter almost 8 years ago

jluza wrote:

bmbouter wrote:

This is a short aside and probably not related to this specific root cause, but you should move away from using:

[...]

And instead use:

[...]

Otherwise you will probably have a deadlock at some point [0]. BTW, the recommended form ^ is the default for 2.8, which you are running, so leaving it commented out would also solve this for you.

[0]: https://pulp.plan.io/issues/1152

I think he had some issue with that. In old Pulp (2.5.3) we used qpid://localhost/, but then with a newer version of kombu or something it stopped working.

Yes, we did have some problems with this in the 2.5 release line. Since then, qpid://localhost/ has been compatible with kombu for several releases (including the one bundled with 2.8), so you should switch back to this form. If you don't, you will very likely experience deadlock.
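
For reference, the corresponding [tasks] section of server.conf (mirroring the excerpt in comment #13, with only the broker_url line changed) would then read:

[tasks]
broker_url: qpid://localhost/
celery_require_ssl: false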

Actions #17

Updated by bmbouter almost 8 years ago

What version of httpd are you using?

I am running httpd-2.4.18-1.fc23.x86_64, and when I run pulp-smash I don't get leaked connections. pulp-smash runs hundreds of tasks in Pulp, so I would expect it to show the leak. I am re-running now just to make sure.

Also, what is your system doing at the moments when these new Qpid connections are being formed? They don't seem to follow any specific pattern in the timestamps, so there doesn't appear to be a regular interval.

Also, is this only happening in your production environments? Can you verify that a separate test environment also shows the issue?

Actions #18

Updated by jluza almost 8 years ago

httpd-2.2.15-47.el6_7.1.x86_64
mod_wsgi-3.4-1.pulp.el6.x86_64
mod_ssl-2.2.15-47.el6_7.1.x86_64

It's hard to tell what is happening in the system when a new connection is created. All of these environments have the issue: pulp-docker qa and stage, and rhsm pulp qa, stage, and prod <- although for rhsm it doesn't take that many connections, so the 1000 limit is enough for us. I think that's because we restart services on those more often.

Actions #20

Updated by pthomas@redhat.com almost 8 years ago

pulp-smash tests have passed on el6, el7, f22 & f23

The manual testing was done mostly on el7 and some on el6.

Most of the manual testing was done on Beaker machines.

Actions #21

Updated by amacdona@redhat.com almost 8 years ago

  • Severity changed from 2. Medium to 3. High
  • Triaged changed from No to Yes
Actions #22

Updated by pthomas@redhat.com almost 8 years ago

I have set up a Pulp server on RHEL 6.8. Here is the pre-test data on the Pulp server:


[root@mgmt13 ~]# rpm -qa |grep pulp
python-kombu-3.0.33-5.pulp.el6.noarch
pulp-selinux-2.8.5-0.1.beta.el6.noarch
pulp-server-2.8.5-0.1.beta.el6.noarch
python-pulp-client-lib-2.8.5-0.1.beta.el6.noarch
pulp-rpm-admin-extensions-2.8.5-0.1.beta.el6.noarch
python-pulp-common-2.8.5-0.1.beta.el6.noarch
mod_wsgi-3.4-2.pulp.el6.x86_64
python-pulp-rpm-common-2.8.5-0.1.beta.el6.noarch
python-pulp-docker-common-2.0.2-0.3.beta.el6.noarch
python-pulp-repoauth-2.8.5-0.1.beta.el6.noarch
pulp-rpm-plugins-2.8.5-0.1.beta.el6.noarch
pulp-puppet-plugins-2.8.5-0.1.beta.el6.noarch
pulp-admin-client-2.8.5-0.1.beta.el6.noarch
pulp-puppet-admin-extensions-2.8.5-0.1.beta.el6.noarch
python-isodate-0.5.0-4.pulp.el6.noarch
python-pulp-puppet-common-2.8.5-0.1.beta.el6.noarch
python-pulp-oid_validation-2.8.5-0.1.beta.el6.noarch
pulp-docker-plugins-2.0.2-0.3.beta.el6.noarch
python-pulp-bindings-2.8.5-0.1.beta.el6.noarch
pulp-docker-admin-extensions-2.0.2-0.3.beta.el6.noarch
[root@mgmt13 ~]# 
[root@mgmt13 ~]# 
[root@mgmt13 ~]# qpid-stat -c
Connections
  connection                   cproc        cpid   mech       auth            connected  idle    msgIn  msgOut
  ==============================================================================================================
  qpid.[::1]:5672-[::1]:42406  mod_wsgi     24335  ANONYMOUS  anonymous@QPID  7m 42s     7m 35s     1      1
  qpid.[::1]:5672-[::1]:42407  mod_wsgi     24337  ANONYMOUS  anonymous@QPID  7m 42s     7m 35s     1      1
  qpid.[::1]:5672-[::1]:42408  mod_wsgi     24333  ANONYMOUS  anonymous@QPID  7m 42s     7m 35s     1      1
  qpid.[::1]:5672-[::1]:42411  mod_wsgi     24335  ANONYMOUS  anonymous@QPID  7m 40s     0s         0      0
  qpid.[::1]:5672-[::1]:42412  mod_wsgi     24337  ANONYMOUS  anonymous@QPID  7m 40s     0s         0      0
  qpid.[::1]:5672-[::1]:42413  mod_wsgi     24333  ANONYMOUS  anonymous@QPID  7m 40s     0s         0      0
  qpid.[::1]:5672-[::1]:42418  __main__.py  24640  ANONYMOUS  anonymous@QPID  7m 40s     7m 25s    12     13
  qpid.[::1]:5672-[::1]:42419  __main__.py  24640  ANONYMOUS  anonymous@QPID  7m 40s     5s        19      2
  qpid.[::1]:5672-[::1]:42424  __main__.py  24709  ANONYMOUS  anonymous@QPID  7m 39s     7m 25s    12     13
  qpid.[::1]:5672-[::1]:42425  __main__.py  24709  ANONYMOUS  anonymous@QPID  7m 39s     5s        19      2
  qpid.[::1]:5672-[::1]:42430  __main__.py  24784  ANONYMOUS  anonymous@QPID  7m 38s     7m 25s    12     13
  qpid.[::1]:5672-[::1]:42431  __main__.py  24784  ANONYMOUS  anonymous@QPID  7m 38s     0s        19      2
  qpid.[::1]:5672-[::1]:42436  __main__.py  24807  ANONYMOUS  anonymous@QPID  7m 37s     7m 25s    12     13
  qpid.[::1]:5672-[::1]:42437  __main__.py  24807  ANONYMOUS  anonymous@QPID  7m 37s     0s        19      2
  qpid.[::1]:5672-[::1]:42442  __main__.py  24830  ANONYMOUS  anonymous@QPID  7m 36s     7m 25s    12     13
  qpid.[::1]:5672-[::1]:42443  __main__.py  24830  ANONYMOUS  anonymous@QPID  7m 36s     0s        19      2
  qpid.[::1]:5672-[::1]:42448  __main__.py  24853  ANONYMOUS  anonymous@QPID  7m 36s     7m 25s    12     12
  qpid.[::1]:5672-[::1]:42449  __main__.py  24853  ANONYMOUS  anonymous@QPID  7m 36s     25s       17      2
  qpid.[::1]:5672-[::1]:42454  celery       24960  ANONYMOUS  anonymous@QPID  7m 35s     7m 25s     0      0
  qpid.[::1]:5672-[::1]:42457  __main__.py  24876  ANONYMOUS  anonymous@QPID  7m 35s     7m 25s    12     12
  qpid.[::1]:5672-[::1]:42458  __main__.py  24876  ANONYMOUS  anonymous@QPID  7m 35s     25s       17      2
  qpid.[::1]:5672-[::1]:42461  celery       24960  ANONYMOUS  anonymous@QPID  7m 34s     7m 25s     0      0
  qpid.[::1]:5672-[::1]:42464  __main__.py  24899  ANONYMOUS  anonymous@QPID  7m 34s     7m 25s    12     12
  qpid.[::1]:5672-[::1]:42465  __main__.py  24899  ANONYMOUS  anonymous@QPID  7m 34s     0s        18      2
  qpid.[::1]:5672-[::1]:42468  celery       24960  ANONYMOUS  anonymous@QPID  7m 33s     0s         6    147
  qpid.[::1]:5672-[::1]:42471  __main__.py  25054  ANONYMOUS  anonymous@QPID  7m 32s     7m 25s    12     12
  qpid.[::1]:5672-[::1]:42472  __main__.py  25054  ANONYMOUS  anonymous@QPID  7m 32s     25s       17      2
  qpid.[::1]:5672-[::1]:42480  qpid-stat    25083  ANONYMOUS  anonymous@QPID  0s         0s         1      0
[root@mgmt13 ~]# 
Actions #23

Updated by pthomas@redhat.com almost 8 years ago

I ran the following script; the `qpid-stat -c` output after it completed is as follows:

[pulp]# cat pulp-admin.sh 
#!/bin/bash
for ((i=1; i<=10; i++))
do
 echo "Iteration $i"
 pulp-admin rpm repo create --repo-id zoo --feed https://repos.fedorapeople.org/repos/pulp/pulp/fixtures/rpm/
  echo "Rpm repo created"
  pulp-admin rpm repo sync run --repo-id zoo
  echo "repo synced & published"
  pulp-admin rpm repo delete --repo-id zoo
  echo "rpm repo deleted"
  pulp-admin orphan remove --all
  echo "orphan removed"
  pulp-admin docker repo create --repo-id=synctest --feed=https://registry-1.docker.io --upstream-name=busybox --enable-v1 false --enable-v2 true
  echo "docker repo created"
  pulp-admin docker repo sync run --repo-id synctest
  echo "docker repo synced & published"
  pulp-admin docker repo delete --repo-id synctest
  echo "docker repo deleted"
  pulp-admin orphan remove --all
  echo "orphan removed"
done;

[pulp~]# 

[pulp ~]# qpid-stat -c
Connections
  connection                   cproc        cpid   mech       auth            connected  idle     msgIn  msgOut
  ===============================================================================================================
  qpid.[::1]:5672-[::1]:42406  mod_wsgi     24335  ANONYMOUS  anonymous@QPID  56m 20s    56m 12s     1      1
  qpid.[::1]:5672-[::1]:42407  mod_wsgi     24337  ANONYMOUS  anonymous@QPID  56m 20s    56m 12s     1      1
  qpid.[::1]:5672-[::1]:42408  mod_wsgi     24333  ANONYMOUS  anonymous@QPID  56m 20s    56m 12s     1      1
  qpid.[::1]:5672-[::1]:42411  mod_wsgi     24335  ANONYMOUS  anonymous@QPID  56m 18s    0s          0      0
  qpid.[::1]:5672-[::1]:42412  mod_wsgi     24337  ANONYMOUS  anonymous@QPID  56m 18s    0s          0      0
  qpid.[::1]:5672-[::1]:42413  mod_wsgi     24333  ANONYMOUS  anonymous@QPID  56m 18s    0s          0      0
  qpid.[::1]:5672-[::1]:42418  __main__.py  24640  ANONYMOUS  anonymous@QPID  56m 18s    2m 2s      12     16
  qpid.[::1]:5672-[::1]:42419  __main__.py  24640  ANONYMOUS  anonymous@QPID  56m 18s    12s       125      2
  qpid.[::1]:5672-[::1]:42424  __main__.py  24709  ANONYMOUS  anonymous@QPID  56m 17s    1m 42s     12     16
  qpid.[::1]:5672-[::1]:42425  __main__.py  24709  ANONYMOUS  anonymous@QPID  56m 17s    2s        125      2
  qpid.[::1]:5672-[::1]:42430  __main__.py  24784  ANONYMOUS  anonymous@QPID  56m 16s    1m 32s     12     16
  qpid.[::1]:5672-[::1]:42431  __main__.py  24784  ANONYMOUS  anonymous@QPID  56m 16s    2s        125      2
  qpid.[::1]:5672-[::1]:42436  __main__.py  24807  ANONYMOUS  anonymous@QPID  56m 15s    52s        12     16
  qpid.[::1]:5672-[::1]:42437  __main__.py  24807  ANONYMOUS  anonymous@QPID  56m 15s    2s        125      2
  qpid.[::1]:5672-[::1]:42442  __main__.py  24830  ANONYMOUS  anonymous@QPID  56m 14s    1m 12s     12     16
  qpid.[::1]:5672-[::1]:42443  __main__.py  24830  ANONYMOUS  anonymous@QPID  56m 14s    2s        125      2
  qpid.[::1]:5672-[::1]:42448  __main__.py  24853  ANONYMOUS  anonymous@QPID  56m 13s    42s        12     15
  qpid.[::1]:5672-[::1]:42449  __main__.py  24853  ANONYMOUS  anonymous@QPID  56m 13s    2s        124      2
  qpid.[::1]:5672-[::1]:42454  celery       24960  ANONYMOUS  anonymous@QPID  56m 13s    56m 2s      0      0
  qpid.[::1]:5672-[::1]:42457  __main__.py  24876  ANONYMOUS  anonymous@QPID  56m 13s    2m 12s     12     14
  qpid.[::1]:5672-[::1]:42458  __main__.py  24876  ANONYMOUS  anonymous@QPID  56m 13s    2s        121      2
  qpid.[::1]:5672-[::1]:42461  celery       24960  ANONYMOUS  anonymous@QPID  56m 12s    26m 2s      5      4
  qpid.[::1]:5672-[::1]:42464  __main__.py  24899  ANONYMOUS  anonymous@QPID  56m 12s    42s        12    114
  qpid.[::1]:5672-[::1]:42465  __main__.py  24899  ANONYMOUS  anonymous@QPID  56m 12s    2s        421      2
  qpid.[::1]:5672-[::1]:42468  celery       24960  ANONYMOUS  anonymous@QPID  56m 10s    2s          6   1.54k
  qpid.[::1]:5672-[::1]:42471  __main__.py  25054  ANONYMOUS  anonymous@QPID  56m 10s    42s        12     62
  qpid.[::1]:5672-[::1]:42472  __main__.py  25054  ANONYMOUS  anonymous@QPID  56m 10s    2s        265      2
  qpid.[::1]:5672-[::1]:42481  __main__.py  24802  ANONYMOUS  anonymous@QPID  26m 12s    26m 2s      5      4
  qpid.[::1]:5672-[::1]:42489  mod_wsgi     24333  ANONYMOUS  anonymous@QPID  5m 50s     52s        24      8
  qpid.[::1]:5672-[::1]:42492  __main__.py  25067  ANONYMOUS  anonymous@QPID  5m 50s     42s       104      4
  qpid.[::1]:5672-[::1]:42531  __main__.py  25051  ANONYMOUS  anonymous@QPID  5m 41s     42s        24      4
  qpid.[::1]:5672-[::1]:42535  mod_wsgi     24337  ANONYMOUS  anonymous@QPID  5m 39s     42s        24      8
  qpid.[::1]:5672-[::1]:42625  mod_wsgi     24335  ANONYMOUS  anonymous@QPID  5m 0s      52s        26      8
  qpid.[::1]:5672-[::1]:43145  qpid-stat    25964  ANONYMOUS  anonymous@QPID  0s         0s          1      0
[pulp ~]# 
Actions #24

Updated by bmbouter almost 8 years ago

Before the workload there were 6 mod_wsgi connections; afterwards there are 9, with 3 of them started roughly 50 minutes after the original ones. The new ones are only 5m 50s old. Can we see whether the number grows beyond 9? Perhaps by running a test over several hours, or even a day, with lots of work happening the whole time?
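
A minimal monitoring sketch along those lines (assuming qpid-stat is available on the box and that a log under /tmp is acceptable; the interval and path are just examples):

#!/bin/bash
# Record the number of mod_wsgi connections reported by qpid-stat every 5 minutes
# so any growth beyond the expected 9 is easy to spot afterwards.
while true; do
  count=$(qpid-stat -c | grep -c mod_wsgi)
  echo "$(date '+%F %T') mod_wsgi connections: $count" >> /tmp/mod_wsgi_connections.log
  sleep 300
done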

Actions #25

Updated by pthomas@redhat.com almost 8 years ago

I have set up the script to run every hour. Will update in the morning.
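
For reference, one way to schedule it (a sketch; /root/pulp-admin.sh and the log path are assumptions, not necessarily the exact setup used here):

# crontab entry: run the reproduction workload at the top of every hour.
0 * * * * /root/pulp-admin.sh >> /var/log/pulp-admin-loop.log 2>&1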

Actions #26

Updated by pthomas@redhat.com almost 8 years ago

After letting the script run every hour for 12+ hours, the qpid-stat -c output was the same as in #23.

I updated the script to include more docker repo syncs and let it run 2 more times:


[root@pulp ~]# qpid-stat -c
Connections
  connection                   cproc        cpid   mech       auth            connected    idle         msgIn  msgOut
  =====================================================================================================================
  qpid.[::1]:5672-[::1]:40128  qpid-stat    21147  ANONYMOUS  anonymous@QPID  0s           0s              1      0
  qpid.[::1]:5672-[::1]:42406  mod_wsgi     24335  ANONYMOUS  anonymous@QPID  21h 42m 21s  21h 42m 14s     1      1
  qpid.[::1]:5672-[::1]:42407  mod_wsgi     24337  ANONYMOUS  anonymous@QPID  21h 42m 21s  21h 42m 14s     1      1
  qpid.[::1]:5672-[::1]:42408  mod_wsgi     24333  ANONYMOUS  anonymous@QPID  21h 42m 21s  21h 42m 14s     1      1
  qpid.[::1]:5672-[::1]:42411  mod_wsgi     24335  ANONYMOUS  anonymous@QPID  21h 42m 20s  0s              0      0
  qpid.[::1]:5672-[::1]:42412  mod_wsgi     24337  ANONYMOUS  anonymous@QPID  21h 42m 20s  0s              0      0
  qpid.[::1]:5672-[::1]:42413  mod_wsgi     24333  ANONYMOUS  anonymous@QPID  21h 42m 20s  0s              0      0
  qpid.[::1]:5672-[::1]:42418  __main__.py  24640  ANONYMOUS  anonymous@QPID  21h 42m 19s  42m 4s         12     76
  qpid.[::1]:5672-[::1]:42419  __main__.py  24640  ANONYMOUS  anonymous@QPID  21h 42m 19s  14s          2.80k     2
  qpid.[::1]:5672-[::1]:42424  __main__.py  24709  ANONYMOUS  anonymous@QPID  21h 42m 18s  12m 4s         12     82
  qpid.[::1]:5672-[::1]:42425  __main__.py  24709  ANONYMOUS  anonymous@QPID  21h 42m 18s  14s          2.81k     2
  qpid.[::1]:5672-[::1]:42430  __main__.py  24784  ANONYMOUS  anonymous@QPID  21h 42m 17s  12m 4s         12     76
  qpid.[::1]:5672-[::1]:42431  __main__.py  24784  ANONYMOUS  anonymous@QPID  21h 42m 17s  14s          2.80k     2
  qpid.[::1]:5672-[::1]:42436  __main__.py  24807  ANONYMOUS  anonymous@QPID  21h 42m 16s  42m 44s        12    139
  qpid.[::1]:5672-[::1]:42437  __main__.py  24807  ANONYMOUS  anonymous@QPID  21h 42m 16s  14s          2.98k     2
  qpid.[::1]:5672-[::1]:42442  __main__.py  24830  ANONYMOUS  anonymous@QPID  21h 42m 16s  45m 44s        12     75
  qpid.[::1]:5672-[::1]:42443  __main__.py  24830  ANONYMOUS  anonymous@QPID  21h 42m 16s  14s          2.79k     2
  qpid.[::1]:5672-[::1]:42448  __main__.py  24853  ANONYMOUS  anonymous@QPID  21h 42m 15s  43m 34s        12     74
  qpid.[::1]:5672-[::1]:42449  __main__.py  24853  ANONYMOUS  anonymous@QPID  21h 42m 15s  4s           2.79k     2
  qpid.[::1]:5672-[::1]:42454  celery       24960  ANONYMOUS  anonymous@QPID  21h 42m 14s  21h 42m 4s      0      0
  qpid.[::1]:5672-[::1]:42457  __main__.py  24876  ANONYMOUS  anonymous@QPID  21h 42m 14s  42m 4s         12     74
  qpid.[::1]:5672-[::1]:42458  __main__.py  24876  ANONYMOUS  anonymous@QPID  21h 42m 14s  4s           2.79k     2
  qpid.[::1]:5672-[::1]:42461  celery       24960  ANONYMOUS  anonymous@QPID  21h 42m 13s  12m 4s         50      4
  qpid.[::1]:5672-[::1]:42464  __main__.py  24899  ANONYMOUS  anonymous@QPID  21h 42m 13s  42m 44s        12   2.46k
  qpid.[::1]:5672-[::1]:42465  __main__.py  24899  ANONYMOUS  anonymous@QPID  21h 42m 13s  14s          9.93k     2
  qpid.[::1]:5672-[::1]:42468  celery       24960  ANONYMOUS  anonymous@QPID  21h 42m 12s  4s              6   36.0k
  qpid.[::1]:5672-[::1]:42471  __main__.py  25054  ANONYMOUS  anonymous@QPID  21h 42m 11s  42m 44s        12   1.24k
  qpid.[::1]:5672-[::1]:42472  __main__.py  25054  ANONYMOUS  anonymous@QPID  21h 42m 11s  4s           6.28k     2
  qpid.[::1]:5672-[::1]:42481  __main__.py  24802  ANONYMOUS  anonymous@QPID  21h 12m 13s  4h 12m 4s      16      4
  qpid.[::1]:5672-[::1]:42489  mod_wsgi     24333  ANONYMOUS  anonymous@QPID  20h 51m 52s  42m 44s       443      8
  qpid.[::1]:5672-[::1]:42492  __main__.py  25067  ANONYMOUS  anonymous@QPID  20h 51m 51s  42m 44s      2.46k     4
  qpid.[::1]:5672-[::1]:42531  __main__.py  25051  ANONYMOUS  anonymous@QPID  20h 51m 43s  2h 40m 34s    419      8
  qpid.[::1]:5672-[::1]:42535  mod_wsgi     24337  ANONYMOUS  anonymous@QPID  20h 51m 40s  43m 34s       381      8
  qpid.[::1]:5672-[::1]:42625  mod_wsgi     24335  ANONYMOUS  anonymous@QPID  20h 51m 1s   42m 44s       421      8
  qpid.[::1]:5672-[::1]:43968  __main__.py  24871  ANONYMOUS  anonymous@QPID  20h 12m 13s  42m 54s        26      8
  qpid.[::1]:5672-[::1]:44490  __main__.py  24844  ANONYMOUS  anonymous@QPID  19h 42m 13s  2h 12m 4s      11      4
  qpid.[::1]:5672-[::1]:46170  __main__.py  24825  ANONYMOUS  anonymous@QPID  17h 42m 13s  12m 4s         12      8
  qpid.[::1]:5672-[::1]:47052  __main__.py  25038  ANONYMOUS  anonymous@QPID  16h 42m 13s  42m 4s          9      4
  qpid.[::1]:5672-[::1]:51506  __main__.py  24953  ANONYMOUS  anonymous@QPID  11h 42m 12s  11h 42m 4s      5      4
[root@pulp ~]# 
Actions #27

Updated by bmbouter almost 8 years ago

Strangely, Preethi cannot reproduce the leaking Qpid connections. On startup there are 6 mod_wsgi connections (which makes sense, with each of the 3 mod_wsgi processes holding two initial connections). Then, when the first asynchronous task is processed, each forms one additional connection, for a total of 3 connections per mod_wsgi process. It holds constant after that, even under heavy load.

@dgregor, in the environment that is showing the issue, maybe the logging of httpd should be turned way up and monitored while additional mod_wsgi connections appear in `qpid-stat -c` over time? They seem to start every few minutes or so, judging from comment 9.
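
Something along these lines could help correlate the two (a sketch; it assumes LogLevel debug has been set in the httpd configuration and that a log under /tmp is acceptable):

#!/bin/bash
# With httpd logging turned up (e.g. LogLevel debug), snapshot the mod_wsgi
# connections every 2 minutes so newly appearing ones can be matched against
# timestamps in the httpd error_log.
while true; do
  echo "=== $(date '+%F %T') ===" >> /tmp/qpid-mod_wsgi.log
  qpid-stat -c | grep mod_wsgi >> /tmp/qpid-mod_wsgi.log
  sleep 120
done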

Actions #28

Updated by bmbouter over 7 years ago

  • Groomed changed from No to Yes
  • Sprint Candidate changed from No to Yes
Actions #29

Updated by mhrivnak over 7 years ago

  • Sprint/Milestone set to 23
Actions #31

Updated by mhrivnak over 7 years ago

  • Sprint/Milestone changed from 23 to 24
Actions #33

Updated by jluza over 7 years ago

For now, worked around with

WSGIDaemonProcess pulp user=apache group=apache processes=3 display-name=%{GROUP} maximum-requests=1000

in /etc/httpd/conf.d/pulp.conf
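
maximum-requests=1000 tells mod_wsgi to shut down and restart each daemon process after it has handled 1000 requests, so any connections a process has accumulated are closed when it is recycled. A quick way to watch whether the mod_wsgi connection count now stays bounded (a sketch, assuming qpid-stat is available):

# The count should level off instead of growing once recycling is in place.
watch -n 60 'qpid-stat -c | grep -c mod_wsgi'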

Actions #34

Updated by bmbouter over 7 years ago

  • Status changed from NEW to ASSIGNED
Actions #35

Updated by bmbouter over 7 years ago

  • Assignee set to bmbouter
Actions #36

Updated by mhrivnak over 7 years ago

  • Sprint/Milestone changed from 24 to 25
Actions #37

Updated by bmbouter over 7 years ago

  • Status changed from ASSIGNED to CLOSED - WORKSFORME

I tried to reproduce this on F23 and @preethi did on EL6; neither of us could. The user indicated they are no longer experiencing it either, after putting process recycling of the httpd processes in place. Since we couldn't reproduce it, I'm closing it as WORKSFORME.

If anyone else experiences this please post a comment or reopen.

Actions #38

Updated by bmbouter about 6 years ago

  • Sprint set to Sprint 7
Actions #39

Updated by bmbouter about 6 years ago

  • Sprint/Milestone deleted (25)
Actions #40

Updated by bmbouter almost 5 years ago

  • Tags Pulp 2 added
