Bug 1511537 - nova-compute logs "No calling threads waiting for msg_id" on nova-conductor RPC [NEEDINFO]
Summary: nova-compute logs "No calling threads waiting for msg_id" on nova-conductor RPC
Keywords:
Status: CLOSED INSUFFICIENT_DATA
Alias: None
Product: Red Hat OpenStack
Classification: Red Hat
Component: python-oslo-messaging
Version: 8.0 (Liberty)
Hardware: Unspecified
OS: Unspecified
Priority: high
Severity: high
Target Milestone: ---
Target Release: 13.0 (Queens)
Assignee: Victor Stinner
QA Contact: Udi Shkalim
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2017-11-09 14:17 UTC by Victor Stinner
Modified: 2018-06-05 13:50 UTC
CC List: 10 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2018-06-05 13:50:55 UTC
vstinner: needinfo? (sbandyop)



Description Victor Stinner 2017-11-09 14:17:09 UTC

Comment 16 Shatadru Bandyopadhyay 2017-11-15 11:52:34 UTC
Hello All,

Thank you for the detailed explanation.

@Victor : So if I understand correctly, we need to suggest that the customer just set max_overflow = 50 in cinder.conf. Please correct me if I am wrong.

Comment 17 Victor Stinner 2017-11-17 15:28:42 UTC
> @Victor : So if I understand correctly, we need to suggest that the customer just set max_overflow = 50 in cinder.conf. Please correct me if I am wrong.

Yes, but also try to reduce the number of cinder-api processes via the osapi_volume_workers option in cinder.conf.

Michael Bayer: "The ps listing shows that there are a ton of cinder-api processes running - osapi_volume_workers in cinder.conf is....48!"
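
For reference, a minimal cinder.conf sketch of the two settings discussed above. Section placement follows the standard cinder / oslo.db layout; the worker value of 8 is only an illustrative lower number, since the comments only say to reduce it from 48 (the default is typically the host CPU count).

    [DEFAULT]
    # Reduce the number of cinder-api worker processes (it was 48 on the affected system).
    # 8 is an example value, not a recommendation from this bug.
    osapi_volume_workers = 8

    [database]
    # Allow more overflow connections in the SQLAlchemy connection pool.
    max_overflow = 50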

Comment 18 Shatadru Bandyopadhyay 2017-11-18 14:04:49 UTC
Hello Victor,

Thank you for your response. I have shared the feedback with the customer and have asked them to monitor whether the issue is still reproducible under heavy load.

I will share feedback from the customer when he comes back.

Comment 30 Victor Stinner 2018-04-09 13:17:52 UTC
This seems to be a configuration issue rather than a bug. What is the status of this issue? Can it be closed, or do we have new data to support the bug aspect?

Comment 31 Victor Stinner 2018-06-05 13:50:55 UTC
I am closing the issue since it has not received any concrete data since last year, and my NEEDINFO has not received a reply in two months.

