[Bug 1812935] Please test proposed package

Corey Bryant corey.bryant at canonical.com
Wed Apr 10 12:44:43 UTC 2019


Hello Michele, or anyone else affected,

Accepted python-oslo.cache into rocky-proposed. The package will build
now and be available in the Ubuntu Cloud Archive in a few hours, and
then in the -proposed repository.

Please help us by testing this new package. To enable the -proposed
repository:

  sudo add-apt-repository cloud-archive:rocky-proposed
  sudo apt-get update

Your feedback will aid us in getting this update out to other Ubuntu users.

If this package fixes the bug for you, please add a comment to this bug,
mentioning the version of the package you tested, and change the tag
from verification-rocky-needed to verification-rocky-done. If it does
not fix the bug for you, please add a comment stating that, and change
the tag to verification-rocky-failed. In either case, details of your
testing will help us make a better decision.

Further information regarding the verification process can be found at
https://wiki.ubuntu.com/QATeam/PerformingSRUVerification . Thank you in
advance!

** Changed in: cloud-archive/rocky
       Status: Triaged => Fix Committed

** Tags added: verification-rocky-needed

-- 
You received this bug notification because you are a member of Ubuntu
OpenStack, which is subscribed to Ubuntu Cloud Archive.
https://bugs.launchpad.net/bugs/1812935

Title:
  oslo cache mempool issues with python3

Status in Ubuntu Cloud Archive:
  Fix Released
Status in Ubuntu Cloud Archive rocky series:
  Fix Committed
Status in Ubuntu Cloud Archive stein series:
  Fix Released
Status in oslo.cache:
  In Progress
Status in python-oslo.cache package in Ubuntu:
  Fix Released
Status in python-oslo.cache source package in Cosmic:
  Fix Committed
Status in python-oslo.cache source package in Disco:
  Fix Released

Bug description:
  nova-conductor running on a RHEL 8 host inside f28-based containers hits the following error:
  2019-01-17 13:59:37.049 46 DEBUG oslo_concurrency.lockutils [req-284f3071-8eee-4dcb-903c-838f2e024b48 40ca1490773f49f791d3a834af3702c8 8671bdf05abf48f58a9bdcdb0ef4b740 - default default] Lock "b051d003-482d-4cb7-810c-8d256e6c879e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3.6/site-packages/oslo_concurrency/lockutils.py:327
  2019-01-17 13:59:37.050 46 DEBUG oslo_concurrency.lockutils [req-284f3071-8eee-4dcb-903c-838f2e024b48 40ca1490773f49f791d3a834af3702c8 8671bdf05abf48f58a9bdcdb0ef4b740 - default default] Lock "b051d003-482d-4cb7-810c-8d256e6c879e" released by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.002s inner /usr/lib/python3.6/site-packages/oslo_concurrency/lockutils.py:339
  2019-01-17 13:59:37.060 46 DEBUG oslo_db.sqlalchemy.engines [req-284f3071-8eee-4dcb-903c-838f2e024b48 40ca1490773f49f791d3a834af3702c8 8671bdf05abf48f58a9bdcdb0ef4b740 - default default] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3.6/site-packages/oslo_db/sqlalchemy/engines.py:307
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server [req-284f3071-8eee-4dcb-903c-838f2e024b48 40ca1490773f49f791d3a834af3702c8 8671bdf05abf48f58a9bdcdb0ef4b740 - default default] Exception during message handling: TypeError: object() takes no parameters
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 163, in _get
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server     conn = self.queue.pop().connection
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server IndexError: pop from an empty deque
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred:
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/server.py", line 166, in _process_incoming
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/dispatcher.py", line 265, in dispatch
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/dispatcher.py", line 194, in _do_dispatch
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/conductor/manager.py", line 1303, in schedule_and_build_instances
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server     instance.create()
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server     return fn(self, *args, **kwargs)
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/objects/instance.py", line 607, in create
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server     self._load_ec2_ids()
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/objects/instance.py", line 983, in _load_ec2_ids
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server     self.ec2_ids = objects.EC2Ids.get_by_instance(self._context, self)
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_versionedobjects/base.py", line 184, in wrapper
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server     result = fn(cls, context, *args, **kwargs)
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/objects/ec2.py", line 216, in get_by_instance
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server     ec2_ids = cls._get_ec2_ids(context, instance)
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/objects/ec2.py", line 200, in _get_ec2_ids
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server     ec2_ids['instance_id'] = ec2utils.id_to_ec2_inst_id(instance.uuid)
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/api/ec2/ec2utils.py", line 188, in id_to_ec2_inst_id
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server     int_id = get_int_id_from_instance_uuid(ctxt, instance_id)
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/api/ec2/ec2utils.py", line 47, in memoizer
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server     value = _CACHE.get(key)
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/nova/cache_utils.py", line 107, in get
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server     value = self.region.get(key)
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/dogpile/cache/region.py", line 645, in get
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server     value = self.backend.get(key)
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/dogpile/cache/backends/memcached.py", line 161, in get
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server     value = self.client.get(key)
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_cache/backends/memcache_pool.py", line 31, in _run_method
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server     with self.client_pool.acquire() as client:
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server   File "/usr/lib64/python3.6/contextlib.py", line 81, in __enter__
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server     return next(self.gen)
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 128, in acquire
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server     conn = self.get(timeout=self._connection_get_timeout)
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/eventlet/queue.py", line 303, in get
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server     return self._get()
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 214, in _get
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server     conn = ConnectionPool._get(self)
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 165, in _get
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server     conn = self._create_connection()
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 206, in _create_connection
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server     return _MemcacheClient(self.urls, **self._arguments)
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server TypeError: object() takes no parameters
  2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server

  nova.conf has:
  ()[nova@standalone /]$ grep -v ^# /etc/nova/nova.conf |grep '[a-zA-Z]' |grep -i mem
  backend=oslo_cache.memcache_pool
  memcache_servers=192.168.24.2:11211
  memcached_servers=192.168.24.2:11211
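
  The flattened grep output above corresponds roughly to the following nova.conf fragments (section placement is assumed from the standard nova.conf layout and is not shown by the grep; memcached_servers typically lives under [keystone_authtoken]):

```ini
[cache]
backend = oslo_cache.memcache_pool
memcache_servers = 192.168.24.2:11211

[keystone_authtoken]
memcached_servers = 192.168.24.2:11211
```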

  memcached seems to be up:
  [root@standalone ~]# podman top memcached
  USER        PID   PPID   %CPU    ELAPSED             TTY   TIME   COMMAND
  memcached   1     0      0.000   2h2m27.45050385s    ?     0s     dumb-init --single-child -- /bin/bash -c source /etc/sysconfig/memcached; /usr/bin/memcached -p ${PORT} -u ${USER} -m ${CACHESIZE} -c ${MAXCONN} $OPTIONS
  memcached   8     1      0.000   2h2m27.450722828s   ?     0s     /bin/bash -c source /etc/sysconfig/memcached; /usr/bin/memcached -p ${PORT} -u ${USER} -m ${CACHESIZE} -c ${MAXCONN} $OPTIONS
  memcached   9     8      0.000   2h2m27.450781466s   ?     0s     /usr/bin/memcached -p 11211 -u memcached -m 11970 -c 8192 -v -l 192.168.24.2 -U 0 -X -t 8 >> /var/log/memcached.log 2>&1

  Workaround:
  Disable caching in nova:
  crudini --set /var/lib/config-data/puppet-generated/nova/etc/nova/nova.conf cache enabled false
  podman restart nova_conductor

  Versions inside the f28 nova-conductor container:
  python-oslo-versionedobjects-lang-1.34.1-0.20181128123056.50474ad.fc28.noarch
  python3-oslo-config-6.7.0-0.20181108120643.64e020a.fc28.noarch
  python3-oslo-cache-1.32.0-0.20190108083242.eb68d73.fc28.noarch
  python3-oslo-versionedobjects-1.34.1-0.20181128123056.50474ad.fc28.noarch
  python-oslo-middleware-lang-3.37.0-0.20181211135004.a609e68.fc28.noarch
  puppet-oslo-14.2.0-0.20190111032249.b937844.fc28.noarch
  python3-oslo-middleware-3.37.0-0.20181211135004.a609e68.fc28.noarch
  python3-oslo-service-1.34.0-0.20190114140259.d987a4a.fc28.noarch
  python3-oslo-policy-1.44.0-0.20190108082943.c9ea8f7.fc28.noarch
  python-oslo-policy-lang-1.44.0-0.20190108082943.c9ea8f7.fc28.noarch
  python-oslo-concurrency-lang-3.29.0-0.20181128184857.0767ddf.fc28.noarch
  python3-oslo-utils-3.39.0-0.20190110184625.3823707.fc28.noarch
  python3-oslo-vmware-2.32.1-0.20181126101324.04f82ae.fc28.noarch
  python-oslo-log-lang-3.42.2-0.20190114115634.1babd44.fc28.noarch
  python3-oslo-rootwrap-5.15.1-0.20181226111925.27f2314.fc28.noarch
  python3-oslo-context-2.22.0-0.20190114134914.f65408d.fc28.noarch
  python3-oslo-privsep-1.30.1-0.20181127103633.9391cbf.fc28.noarch
  python-oslo-utils-lang-3.39.0-0.20190110184625.3823707.fc28.noarch
  python-oslo-db-lang-4.43.0-0.20190108081838.a6d2cc5.fc28.noarch
  python3-memcached-1.58-5.fc29.noarch
  python-oslo-cache-lang-1.32.0-0.20190108083242.eb68d73.fc28.noarch
  python3-oslo-i18n-3.23.0-0.20190111201133.a5fde9a.fc28.noarch
  python3-oslo-concurrency-3.29.0-0.20181128184857.0767ddf.fc28.noarch
  python3-oslo-db-4.43.0-0.20190108081838.a6d2cc5.fc28.noarch
  python3-oslo-reports-1.29.1-0.20181126102912.dde49a4.fc28.noarch
  python-oslo-i18n-lang-3.23.0-0.20190111201133.a5fde9a.fc28.noarch
  python3-oslo-messaging-9.3.0-0.20190114135848.13fa4f5.fc28.noarch
  python3-oslo-serialization-2.28.1-0.20181001122254.0371c1d.fc28.noarch
  python-oslo-vmware-lang-2.32.1-0.20181126101324.04f82ae.fc28.noarch
  python-oslo-privsep-lang-1.30.1-0.20181127103633.9391cbf.fc28.noarch
  python3-oslo-log-3.42.2-0.20190114115634.1babd44.fc28.noarch
  puppet-memcached-3.3.0-0.20180803162752.e517b44.fc28.noarch

  ------------------------------------------------

  Ubuntu SRU details
  ------------------

  [Impact]
  See description above.

  [Test Case]
  1. Deploy an HA Rocky cloud using the openstack-next-charms, with memcache related to nova-cloud-controller.
  2. Configure a simple network, upload a bionic image.
  3. Try to start an instance via the API. It will remain stuck in the
  build state.
  4. Try to use the openstack CLI to read availability zones. It will
  return an error.

  For more information, see bug 1823740.

  [Regression Potential]
  The regression potential is low. This is a minimal fix that has been reviewed upstream and has passed all upstream gate tests. It has already landed in the upstream master branch and in Ubuntu Disco, and it received three +1 reviews and a Zuul +1 on the stable/rocky Gerrit review.

To manage notifications about this bug go to:
https://bugs.launchpad.net/cloud-archive/+bug/1812935/+subscriptions



More information about the Ubuntu-openstack-bugs mailing list