[Bug 1816468] Re: [SRU] Acceleration cinder - glance with ceph not working
Edward Hope-Morley
edward.hope-morley at canonical.com
Mon Jul 29 13:53:42 UTC 2019
Hi @vvsorokin, can I get you to check something? If you upgrade to this
new package but you had previously stored an image using the old broken
package, then you will need to manually modify the db entry for that
image in the glance db to get it to work again (after upgrading). There
is a bit more info in https://bugs.launchpad.net/cloud-archive/+bug/1816721.
You also need to upgrade glance. So, to fix this problem you need to:
* upgrade cinder
* upgrade glance
* check the glance db for incorrectly formatted image locations in the image_locations table (see [Test Case] in 1816721)
As an alternative to modifying the db, you could just delete and
re-upload your glance image, which should give you the correct value in
the database.
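As a sketch of what to look for: the malformed entries embed a Python bytes repr of the fsid in the rbd URL (visible in the logs below as rbd://b%27<fsid>%27/...). The helper names here are hypothetical, and you should verify the image_locations schema and values yourself before changing anything in a live glance database.

```python
# Hypothetical helpers to spot and rewrite the malformed rbd location
# values left behind by the broken py3 package. Not part of glance or
# cinder; illustrative only.
import re
from urllib.parse import unquote

def is_broken_rbd_location(value):
    """True if the rbd URL embeds a bytes repr of the fsid,
    e.g. rbd://b%27<fsid>%27/images/<uuid>/snap"""
    return bool(re.match(r"rbd://b%27[0-9a-f-]+%27/", value))

def repair_rbd_location(value):
    """Strip the b'...' wrapper so the fsid is a plain string."""
    decoded = unquote(value)  # -> rbd://b'<fsid>'/images/<uuid>/snap
    return re.sub(r"^rbd://b'([0-9a-f-]+)'/", r"rbd://\1/", decoded)

broken = ("rbd://b%27868c2b5d-12f1-4f3f-aa2a-5701a3bb1041%27"
          "/images/eb816518-fac2-48f2-8197-ca7a83c89ada/snap")
print(repair_rbd_location(broken))
# rbd://868c2b5d-12f1-4f3f-aa2a-5701a3bb1041/images/eb816518-fac2-48f2-8197-ca7a83c89ada/snap
```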
--
You received this bug notification because you are a member of Ubuntu
OpenStack, which is subscribed to cinder in Ubuntu.
https://bugs.launchpad.net/bugs/1816468
Title:
[SRU] Acceleration cinder - glance with ceph not working
Status in Cinder:
Fix Released
Status in Ubuntu Cloud Archive:
Fix Committed
Status in Ubuntu Cloud Archive rocky series:
Fix Committed
Status in Ubuntu Cloud Archive stein series:
Fix Committed
Status in Ubuntu Cloud Archive train series:
Triaged
Status in OpenStack Compute (nova):
Fix Released
Status in cinder package in Ubuntu:
Fix Released
Status in nova package in Ubuntu:
Fix Committed
Status in cinder source package in Cosmic:
Won't Fix
Status in nova source package in Cosmic:
Won't Fix
Status in cinder source package in Disco:
Fix Released
Status in nova source package in Disco:
Fix Committed
Status in nova source package in Eoan:
Fix Committed
Bug description:
[Impact]
For >= rocky (i.e. if using py3 packages), librados.cluster.get_fsid() returns a binary string, which means the fsid can't be matched against the string version of the same value from glance when deciding whether to use an image that is stored in Ceph.
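A minimal illustration of the type mismatch (the names below are illustrative, not the actual cinder code paths):

```python
# Under py2, librados returned str; under py3 it returns bytes, so a
# plain equality check against the str fsid parsed from the glance
# image URL silently fails, and cinder falls back to the slow
# download/convert/upload path instead of cloning.

def fsid_matches(cluster_fsid, image_fsid):
    """Compare the local cluster fsid against the fsid from the
    glance image location URL."""
    return cluster_fsid == image_fsid

cluster_fsid = b"868c2b5d-12f1-4f3f-aa2a-5701a3bb1041"  # bytes (py3 librados)
image_fsid = "868c2b5d-12f1-4f3f-aa2a-5701a3bb1041"     # str (from glance URL)

assert not fsid_matches(cluster_fsid, image_fsid)  # mismatch: clone skipped

# The fix is to decode the bytes before comparing:
assert fsid_matches(cluster_fsid.decode("utf-8"), image_fsid)
```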
[Test Case]
* deploy openstack rocky (using py3 packages)
* deploy ceph and use for glance backend
* set
/etc/glance/glance-api.conf:show_multiple_locations = True
/etc/glance/glance-api.conf:show_image_direct_url = True
* upload image to glance
* attempt to boot an instance using this image
* confirm that the instance booted properly and check that it booted from a copy-on-write clone of the glance image by doing the following in ceph:
rbd -p nova info <vm uuid> | grep parent:
* confirm that you see "parent: glance/<image uuid>@snap"
[Regression Potential]
None expected
[Other Info]
None.
----------------------------------------------------------------------------
When using cinder and glance with ceph, the code supports creating volumes from images as copy-on-write clones INSIDE the ceph cluster. This option saves space in the ceph cluster and speeds up instance spawning, because the volume is created directly in ceph. <= THIS IS NOT WORKING IN PY3
If this function is not enabled, the image is copied to the compute
host, converted, a volume is created, and then uploaded to ceph (which
is time consuming, of course).
The problem is that even when glance-cinder acceleration is turned on,
the code behaves as if it were disabled: the same path as above is taken
(copy image, create volume, upload to ceph)... BUT it should create a
copy-on-write volume inside ceph instead. <= THIS IS A BUG IN PY3
Glance config (controller):
[DEFAULT]
show_image_direct_url = true <= this has to be set to true to reproduce the issue
workers = 7
transport_url = rabbit://openstack:openstack@openstack-db
[cors]
[database]
connection = mysql+pymysql://glance:Eew7shai@openstack-db:3306/glance
[glance_store]
stores = file,rbd
default_store = rbd
filesystem_store_datadir = /var/lib/glance/images
rbd_store_pool = images
rbd_store_user = images
rbd_store_ceph_conf = /etc/ceph/ceph.conf
[image_format]
[keystone_authtoken]
auth_url = http://openstack-ctrl:35357
project_name = service
project_domain_name = default
username = glance
user_domain_name = default
password = Eew7shai
www_authenticate_uri = http://openstack-ctrl:5000
auth_uri = http://openstack-ctrl:35357
cache = swift.cache
region_name = RegionOne
auth_type = password
[matchmaker_redis]
[oslo_concurrency]
lock_path = /var/lock/glance
[oslo_messaging_amqp]
[oslo_messaging_kafka]
[oslo_messaging_notifications]
[oslo_messaging_rabbit]
[oslo_messaging_zmq]
[oslo_middleware]
[oslo_policy]
[paste_deploy]
flavor = keystone
[store_type_location_strategy]
[task]
[taskflow_executor]
[profiler]
enabled = true
trace_sqlalchemy = true
hmac_keys = secret
connection_string = redis://127.0.0.1:6379
trace_wsgi_transport = True
trace_message_store = True
trace_management_store = True
Cinder conf (controller):
root@openstack-controller:/tmp# cat /etc/cinder/cinder.conf | grep -v '^#' | awk NF
[DEFAULT]
my_ip = 192.168.10.15
glance_api_servers = http://openstack-ctrl:9292
auth_strategy = keystone
enabled_backends = rbd
osapi_volume_workers = 7
debug = true
transport_url = rabbit://openstack:openstack@openstack-db
[backend]
[backend_defaults]
rbd_pool = volumes
rbd_user = volumes1
rbd_secret_uuid = b2efeb49-9844-475b-92ad-5df4a3e1300e
volume_driver = cinder.volume.drivers.rbd.RBDDriver
[barbican]
[brcd_fabric_example]
[cisco_fabric_example]
[coordination]
[cors]
[database]
connection = mysql+pymysql://cinder:EeRe3ahx@openstack-db:3306/cinder
[fc-zone-manager]
[healthcheck]
[key_manager]
[keystone_authtoken]
auth_url = http://openstack-ctrl:35357
project_name = service
project_domain_name = default
username = cinder
user_domain_name = default
password = EeRe3ahx
www_authenticate_uri = http://openstack-ctrl:5000
auth_uri = http://openstack-ctrl:35357
cache = swift.cache
region_name = RegionOne
auth_type = password
[matchmaker_redis]
[nova]
[oslo_concurrency]
lock_path = /var/lock/cinder
[oslo_messaging_amqp]
[oslo_messaging_kafka]
[oslo_messaging_notifications]
[oslo_messaging_rabbit]
[oslo_messaging_zmq]
[oslo_middleware]
[oslo_policy]
[oslo_reports]
[oslo_versionedobjects]
[sample_remote_file_source]
[service_user]
[ssl]
[vault]
[lvm]
volume_driver = cinder.volume.drivers.lvm.LVMVolumeDriver
volume_group = cinder-volumes
iscsi_protocol = iscsi
iscsi_helper = tgtadm
[profiler]
enabled = true
trace_sqlalchemy = true
hmac_keys = secret
connection_string = redis://127.0.0.1:6379
trace_wsgi_transport = True
trace_message_store = True
trace_management_store = True
[rbd]
volume_driver = cinder.volume.drivers.rbd.RBDDriver
rbd_pool = volumes
rbd_ceph_conf = /etc/ceph/ceph.conf
rbd_user = volumes1
image_volume_cache_enabled = True
volume_clear = zero
rbd_max_clone_depth = 5
rbd_flatten_volume_from_snapshot = False
Cinder conf (compute node):
root@openstack-compute2:~# cat /etc/cinder/cinder.conf | grep -v '^#' | awk NF
[DEFAULT]
my_ip = 192.168.10.6
glance_api_servers = http://openstack-ctrl:9292
auth_strategy = keystone
enabled_backends = rbd
debug = true
transport_url = rabbit://openstack:openstack@openstack-db
[backend]
[backend_defaults]
rbd_pool = volumes
rbd_user = volumes1
rbd_secret_uuid = b2efeb49-9844-475b-92ad-5df4a3e1300e
volume_driver = cinder.volume.drivers.rbd.RBDDriver
[barbican]
[brcd_fabric_example]
[cisco_fabric_example]
[coordination]
[cors]
[database]
connection = mysql+pymysql://cinder:EeRe3ahx@openstack-db:3306/cinder
[fc-zone-manager]
[healthcheck]
[key_manager]
[keystone_authtoken]
auth_url = http://openstack-ctrl:5000
project_name = service
project_domain_name = default
username = cinder
user_domain_name = default
password = EeRe3ahx
www_authenticate_uri = http://openstack-ctrl:5000
auth_uri = http://openstack-ctrl:5000
cache = swift.cache
region_name = RegionOne
auth_type = password
[matchmaker_redis]
[nova]
[oslo_concurrency]
lock_path = /var/lock/cinder
[oslo_messaging_amqp]
[oslo_messaging_kafka]
[oslo_messaging_notifications]
[oslo_messaging_rabbit]
[oslo_messaging_zmq]
[oslo_middleware]
[oslo_policy]
[oslo_reports]
[oslo_versionedobjects]
[profiler]
[sample_remote_file_source]
[service_user]
[ssl]
[vault]
[lvm]
volume_driver = cinder.volume.drivers.lvm.LVMVolumeDriver
volume_group = cinder-volumes
iscsi_protocol = iscsi
iscsi_helper = tgtadm
[rbd]
volume_driver = cinder.volume.drivers.rbd.RBDDriver
rbd_pool = volumes
rbd_ceph_conf = /etc/ceph/ceph.conf
rbd_user = volumes1
Cinder log:
root@openstack-compute2:~# cat /var/log/cinder/*
2019-02-18 15:57:29.903 29303 DEBUG cinder.volume.flows.manager.create_volume [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Volume reschedule parameters: True retry: {'num_attempts': 1, 'backends': ['openstack-compute2 at rbd#RBD'], 'hosts': ['openstack-compute2 at rbd#RBD']} get_flow /usr/lib/python3/dist-packages/cinder/volume/flows/manager/create_volume.py:1171
2019-02-18 15:57:29.921 29303 DEBUG oslo_db.sqlalchemy.engines [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:308
2019-02-18 15:57:29.929 29303 INFO cinder.rpc [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Automatically selected cinder-backup objects version 1.37 as minimum service version.
2019-02-18 15:57:29.937 29303 INFO cinder.rpc [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Automatically selected cinder-backup RPC version 2.1 as minimum service version.
2019-02-18 15:57:29.951 29303 INFO cinder.rpc [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Automatically selected cinder-volume objects version 1.37 as minimum service version.
2019-02-18 15:57:29.960 29303 INFO cinder.rpc [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Automatically selected cinder-volume RPC version 3.16 as minimum service version.
2019-02-18 15:57:30.007 29303 DEBUG cinder.volume.manager [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Flow 'volume_create_manager' (18af88d8-7ea2-41c0-8554-b231a179f492) transitioned into state 'RUNNING' from state 'PENDING' _flow_receiver /usr/lib/python3/dist-packages/taskflow/listeners/logging.py:145
2019-02-18 15:57:30.011 29303 DEBUG cinder.volume.manager [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Task 'cinder.volume.flows.manager.create_volume.ExtractVolumeRefTask;volume:create' (f8656e1d-dd4c-4c83-9464-fc1376154d4a) transitioned into state 'RUNNING' from state 'PENDING' _task_receiver /usr/lib/python3/dist-packages/taskflow/listeners/logging.py:194
2019-02-18 15:57:30.051 29303 DEBUG oslo_db.sqlalchemy.engines [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:308
2019-02-18 15:57:30.082 29303 DEBUG cinder.volume.manager [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Task 'cinder.volume.flows.manager.create_volume.ExtractVolumeRefTask;volume:create' (f8656e1d-dd4c-4c83-9464-fc1376154d4a) transitioned into state 'SUCCESS' from state 'RUNNING' with result 'Volume(_name_id=None,admin_metadata={},attach_status='detached',availability_zone='nova',bootable=False,cluster=<?>,cluster_name=None,consistencygroup=<?>,consistencygroup_id=None,created_at=2019-02-18T15:57:29Z,deleted=False,deleted_at=None,display_description='',display_name='',ec2_id=None,encryption_key_id=None,glance_metadata=<?>,group=<?>,group_id=None,host='openstack-compute2 at rbd#RBD',id=d9b22b30-cb59-488b-94bd-15eaf47fd704,launched_at=None,metadata={},migration_status=None,multiattach=False,previous_status=None,project_id='ba5ef70fd99642fdb75c9307c88b1164',provider_auth=None,provider_geometry=None,provider_id=None,provider_location=None,replication_driver_data=None,replication_extended_status=None,replication_status=None,scheduled_at=2019-02-18T15:57:30Z,service_uuid=None,shared_targets=True,size=20,snapshot_id=None,snapshots=<?>,source_volid=None,status='creating',terminated_at=None,updated_at=2019-02-18T15:57:30Z,user_id='664034a641e64922a1cc253e2dc39e76',volume_attachment=<?>,volume_type=<?>,volume_type_id=None)' _task_receiver /usr/lib/python3/dist-packages/taskflow/listeners/logging.py:183
2019-02-18 15:57:30.086 29303 DEBUG cinder.volume.manager [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Task 'cinder.volume.flows.manager.create_volume.OnFailureRescheduleTask;volume:create' (c8a584b1-5712-4043-a845-f3c24cfaa9be) transitioned into state 'RUNNING' from state 'PENDING' _task_receiver /usr/lib/python3/dist-packages/taskflow/listeners/logging.py:194
2019-02-18 15:57:30.088 29303 DEBUG cinder.volume.manager [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Task 'cinder.volume.flows.manager.create_volume.OnFailureRescheduleTask;volume:create' (c8a584b1-5712-4043-a845-f3c24cfaa9be) transitioned into state 'SUCCESS' from state 'RUNNING' with result 'None' _task_receiver /usr/lib/python3/dist-packages/taskflow/listeners/logging.py:183
2019-02-18 15:57:30.090 29303 DEBUG cinder.volume.manager [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Task 'cinder.volume.flows.manager.create_volume.ExtractVolumeSpecTask;volume:create' (d9cadc4f-5b28-48c9-bd2f-a877b0f04294) transitioned into state 'RUNNING' from state 'PENDING' _task_receiver /usr/lib/python3/dist-packages/taskflow/listeners/logging.py:194
2019-02-18 15:57:30.311 29303 DEBUG cinder.volume.manager [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Task 'cinder.volume.flows.manager.create_volume.ExtractVolumeSpecTask;volume:create' (d9cadc4f-5b28-48c9-bd2f-a877b0f04294) transitioned into state 'SUCCESS' from state 'RUNNING' with result '{'image_location': ('rbd://b%27868c2b5d-12f1-4f3f-aa2a-5701a3bb1041%27/images/eb816518-fac2-48f2-8197-ca7a83c89ada/snap', None), 'type': 'image', 'status': 'creating', 'image_meta': {'owner': 'ba5ef70fd99642fdb75c9307c88b1164', 'visibility': 'shared', 'os_hash_value': '03fee9094d9720ddd51a7eb61755503a114752fbcefb10d1e5cf48a7f261ed9e36fe20e5c5bdf7e4a956910ebc6a0c0bd130b43e541d49b3b3362dd0d491851d', 'protected': False, 'properties': {}, 'os_hidden': False, 'checksum': 'abaeff277a4e2ce7dff253ad453312a5', 'min_disk': 0, 'updated_at': datetime.datetime(2019, 2, 18, 12, 3, 22, tzinfo=<iso8601.Utc>), 'created_at': datetime.datetime(2019, 2, 18, 12, 1, 4, tzinfo=<iso8601.Utc>), 'min_ram': 0, 'tags': [], 'virtual_size': None, 'direct_url': 'rbd://b%27868c2b5d-12f1-4f3f-aa2a-5701a3bb1041%27/images/eb816518-fac2-48f2-8197-ca7a83c89ada/snap', 'name': 'debian-9', 'container_format': 'bare', 'disk_format': 'raw', 'id': 'eb816518-fac2-48f2-8197-ca7a83c89ada', 'os_hash_algo': 'sha512', 'size': 2147483648, 'status': 'active', 'file': '/v2/images/eb816518-fac2-48f2-8197-ca7a83c89ada/file'}, 'volume_id': 'd9b22b30-cb59-488b-94bd-15eaf47fd704', 'volume_name': 'volume-d9b22b30-cb59-488b-94bd-15eaf47fd704', 'volume_size': 20, 'image_service': <cinder.image.glance.GlanceImageService object at 0x7f6db2eb64e0>, 'image_id': 'eb816518-fac2-48f2-8197-ca7a83c89ada'}' _task_receiver /usr/lib/python3/dist-packages/taskflow/listeners/logging.py:183
2019-02-18 15:57:30.314 29303 DEBUG cinder.volume.manager [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Task 'cinder.volume.flows.manager.create_volume.NotifyVolumeActionTask;volume:create, create.start' (9bc7e799-7151-4451-8ed8-efef11b72c99) transitioned into state 'RUNNING' from state 'PENDING' _task_receiver /usr/lib/python3/dist-packages/taskflow/listeners/logging.py:194
2019-02-18 15:57:30.317 29303 DEBUG cinder.volume.manager [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Task 'cinder.volume.flows.manager.create_volume.NotifyVolumeActionTask;volume:create, create.start' (9bc7e799-7151-4451-8ed8-efef11b72c99) transitioned into state 'SUCCESS' from state 'RUNNING' with result 'None' _task_receiver /usr/lib/python3/dist-packages/taskflow/listeners/logging.py:183
2019-02-18 15:57:30.319 29303 DEBUG cinder.volume.manager [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Task 'cinder.volume.flows.manager.create_volume.CreateVolumeFromSpecTask;volume:create' (980d2ea5-5e92-4699-bb48-8846797b7baf) transitioned into state 'RUNNING' from state 'PENDING' _task_receiver /usr/lib/python3/dist-packages/taskflow/listeners/logging.py:194
2019-02-18 15:57:30.320 29303 INFO cinder.volume.flows.manager.create_volume [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Volume d9b22b30-cb59-488b-94bd-15eaf47fd704: being created as image with specification: {'image_location': ('rbd://b%27868c2b5d-12f1-4f3f-aa2a-5701a3bb1041%27/images/eb816518-fac2-48f2-8197-ca7a83c89ada/snap', None), 'status': 'creating', 'image_meta': {'owner': 'ba5ef70fd99642fdb75c9307c88b1164', 'visibility': 'shared', 'os_hash_value': '03fee9094d9720ddd51a7eb61755503a114752fbcefb10d1e5cf48a7f261ed9e36fe20e5c5bdf7e4a956910ebc6a0c0bd130b43e541d49b3b3362dd0d491851d', 'protected': False, 'properties': {}, 'os_hidden': False, 'checksum': 'abaeff277a4e2ce7dff253ad453312a5', 'min_disk': 0, 'updated_at': datetime.datetime(2019, 2, 18, 12, 3, 22, tzinfo=<iso8601.Utc>), 'created_at': datetime.datetime(2019, 2, 18, 12, 1, 4, tzinfo=<iso8601.Utc>), 'min_ram': 0, 'tags': [], 'virtual_size': None, 'direct_url': 'rbd://b%27868c2b5d-12f1-4f3f-aa2a-5701a3bb1041%27/images/eb816518-fac2-48f2-8197-ca7a83c89ada/snap', 'name': 'debian-9', 'container_format': 'bare', 'disk_format': 'raw', 'id': 'eb816518-fac2-48f2-8197-ca7a83c89ada', 'os_hash_algo': 'sha512', 'size': 2147483648, 'status': 'active', 'file': '/v2/images/eb816518-fac2-48f2-8197-ca7a83c89ada/file'}, 'volume_name': 'volume-d9b22b30-cb59-488b-94bd-15eaf47fd704', 'volume_size': 20, 'image_service': <cinder.image.glance.GlanceImageService object at 0x7f6db2eb64e0>, 'image_id': 'eb816518-fac2-48f2-8197-ca7a83c89ada'}
2019-02-18 15:57:30.320 29303 DEBUG cinder.volume.flows.manager.create_volume [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Cloning d9b22b30-cb59-488b-94bd-15eaf47fd704 from image eb816518-fac2-48f2-8197-ca7a83c89ada at location ('rbd://b%27868c2b5d-12f1-4f3f-aa2a-5701a3bb1041%27/images/eb816518-fac2-48f2-8197-ca7a83c89ada/snap', None). _create_from_image /usr/lib/python3/dist-packages/cinder/volume/flows/manager/create_volume.py:894
2019-02-18 15:57:30.322 29303 DEBUG cinder.volume.drivers.rbd [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] connecting to volumes1 at ceph (conf=/etc/ceph/ceph.conf, timeout=-1). _do_conn /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:336
2019-02-18 15:57:30.356 29303 DEBUG cinder.volume.drivers.rbd [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] rbd://b%27868c2b5d-12f1-4f3f-aa2a-5701a3bb1041%27/images/eb816518-fac2-48f2-8197-ca7a83c89ada/snap is in a different ceph cluster. _is_cloneable /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:1320
2019-02-18 15:57:48.356 29303 DEBUG cinder.image.image_utils [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Image fetch details: dest /var/lib/cinder/conversion/tmp2tkzwyhaopenstack-compute2 at rbd, size 2048.00 MB, duration 18.00 sec fetch /usr/lib/python3/dist-packages/cinder/image/image_utils.py:392
2019-02-18 15:57:48.356 29303 INFO cinder.image.image_utils [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Image download 2048.00 MB at 113.79 MB/s
2019-02-18 15:57:48.360 29303 DEBUG oslo_service.periodic_task [req-f4ac235a-987e-4892-9b2b-d624feec45b4 - - - - -] Running periodic task VolumeManager.publish_service_capabilities run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:215
2019-02-18 15:57:48.361 29303 DEBUG cinder.volume.drivers.rbd [req-f4ac235a-987e-4892-9b2b-d624feec45b4 - - - - -] connecting to volumes1 at ceph (conf=/etc/ceph/ceph.conf, timeout=-1). _do_conn /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:336
2019-02-18 15:57:48.393 29303 DEBUG cinder.volume.drivers.rbd [req-f4ac235a-987e-4892-9b2b-d624feec45b4 - - - - -] connecting to volumes1 at ceph (conf=/etc/ceph/ceph.conf, timeout=-1). _do_conn /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:336
2019-02-18 15:57:48.430 29303 DEBUG cinder.volume.drivers.rbd [req-f4ac235a-987e-4892-9b2b-d624feec45b4 - - - - -] connecting to volumes1 at ceph (conf=/etc/ceph/ceph.conf, timeout=-1). _do_conn /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:336
2019-02-18 15:57:48.590 29303 DEBUG oslo_concurrency.processutils [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=8 -- sudo cinder-rootwrap /etc/cinder/rootwrap.conf env LC_ALL=C qemu-img info /var/lib/cinder/conversion/tmp2tkzwyhaopenstack-compute2 at rbd execute /usr/lib/python3/dist-packages/oslo_concurrency/processutils.py:372
2019-02-18 15:57:48.765 29303 DEBUG cinder.manager [req-f4ac235a-987e-4892-9b2b-d624feec45b4 - - - - -] Notifying Schedulers of capabilities ... _publish_service_capabilities /usr/lib/python3/dist-packages/cinder/manager.py:194
2019-02-18 15:57:49.026 29303 DEBUG oslo_concurrency.processutils [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=8 -- sudo cinder-rootwrap /etc/cinder/rootwrap.conf env LC_ALL=C qemu-img info /var/lib/cinder/conversion/tmp2tkzwyhaopenstack-compute2 at rbd" returned: 0 in 0.436s execute /usr/lib/python3/dist-packages/oslo_concurrency/processutils.py:409
2019-02-18 15:57:49.027 29303 DEBUG cinder.image.image_utils [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Temporary image eb816518-fac2-48f2-8197-ca7a83c89ada is fetched for user 664034a641e64922a1cc253e2dc39e76. fetch /usr/lib/python3/dist-packages/cinder/image/image_utils.py:820
2019-02-18 15:57:49.268 29303 DEBUG oslo_concurrency.processutils [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=8 -- sudo cinder-rootwrap /etc/cinder/rootwrap.conf env LC_ALL=C qemu-img info /var/lib/cinder/conversion/tmp2tkzwyhaopenstack-compute2 at rbd execute /usr/lib/python3/dist-packages/oslo_concurrency/processutils.py:372
2019-02-18 15:57:49.699 29303 DEBUG oslo_concurrency.processutils [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=8 -- sudo cinder-rootwrap /etc/cinder/rootwrap.conf env LC_ALL=C qemu-img info /var/lib/cinder/conversion/tmp2tkzwyhaopenstack-compute2 at rbd" returned: 0 in 0.431s execute /usr/lib/python3/dist-packages/oslo_concurrency/processutils.py:409
2019-02-18 15:57:49.700 29303 DEBUG cinder.volume.drivers.rbd [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] creating volume 'volume-d9b22b30-cb59-488b-94bd-15eaf47fd704' create_volume /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:793
2019-02-18 15:57:49.702 29303 DEBUG cinder.volume.drivers.rbd [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] connecting to volumes1 at ceph (conf=/etc/ceph/ceph.conf, timeout=-1). _do_conn /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:336
2019-02-18 15:57:50.033 29303 DEBUG cinder.volume.flows.manager.create_volume [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Attempting download of eb816518-fac2-48f2-8197-ca7a83c89ada (('rbd://b%27868c2b5d-12f1-4f3f-aa2a-5701a3bb1041%27/images/eb816518-fac2-48f2-8197-ca7a83c89ada/snap', None)) to volume d9b22b30-cb59-488b-94bd-15eaf47fd704. _copy_image_to_volume /usr/lib/python3/dist-packages/cinder/volume/flows/manager/create_volume.py:513
2019-02-18 15:57:50.086 29303 DEBUG oslo_concurrency.processutils [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=8 -- sudo cinder-rootwrap /etc/cinder/rootwrap.conf env LC_ALL=C qemu-img info /var/lib/cinder/conversion/tmp3rov4mxq execute /usr/lib/python3/dist-packages/oslo_concurrency/processutils.py:372
2019-02-18 15:57:50.495 29303 DEBUG oslo_concurrency.processutils [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=8 -- sudo cinder-rootwrap /etc/cinder/rootwrap.conf env LC_ALL=C qemu-img info /var/lib/cinder/conversion/tmp3rov4mxq" returned: 0 in 0.409s execute /usr/lib/python3/dist-packages/oslo_concurrency/processutils.py:409
2019-02-18 15:57:50.497 29303 DEBUG oslo_concurrency.processutils [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=8 -- sudo cinder-rootwrap /etc/cinder/rootwrap.conf env LC_ALL=C qemu-img info /var/lib/cinder/conversion/tmp2tkzwyhaopenstack-compute2 at rbd execute /usr/lib/python3/dist-packages/oslo_concurrency/processutils.py:372
2019-02-18 15:57:50.915 29303 DEBUG oslo_concurrency.processutils [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=8 -- sudo cinder-rootwrap /etc/cinder/rootwrap.conf env LC_ALL=C qemu-img info /var/lib/cinder/conversion/tmp2tkzwyhaopenstack-compute2 at rbd" returned: 0 in 0.418s execute /usr/lib/python3/dist-packages/oslo_concurrency/processutils.py:409
2019-02-18 15:57:50.917 29303 DEBUG cinder.image.image_utils [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] eb816518-fac2-48f2-8197-ca7a83c89ada was raw, converting to raw fetch_to_volume_format /usr/lib/python3/dist-packages/cinder/image/image_utils.py:558
2019-02-18 15:57:50.919 29303 DEBUG oslo_concurrency.processutils [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Running cmd (subprocess): sudo cinder-rootwrap /etc/cinder/rootwrap.conf qemu-img convert -O raw -f raw /var/lib/cinder/conversion/tmp2tkzwyhaopenstack-compute2 at rbd /var/lib/cinder/conversion/tmpiw6lpfyj execute /usr/lib/python3/dist-packages/oslo_concurrency/processutils.py:372
2019-02-18 15:57:52.968 29303 DEBUG oslo_concurrency.processutils [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] CMD "sudo cinder-rootwrap /etc/cinder/rootwrap.conf qemu-img convert -O raw -f raw /var/lib/cinder/conversion/tmp2tkzwyhaopenstack-compute2 at rbd /var/lib/cinder/conversion/tmpiw6lpfyj" returned: 0 in 2.050s execute /usr/lib/python3/dist-packages/oslo_concurrency/processutils.py:409
2019-02-18 15:57:52.970 29303 DEBUG oslo_concurrency.processutils [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=8 -- sudo cinder-rootwrap /etc/cinder/rootwrap.conf env LC_ALL=C qemu-img info /var/lib/cinder/conversion/tmp2tkzwyhaopenstack-compute2 at rbd execute /usr/lib/python3/dist-packages/oslo_concurrency/processutils.py:372
2019-02-18 15:57:53.342 29303 DEBUG oslo_concurrency.processutils [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=8 -- sudo cinder-rootwrap /etc/cinder/rootwrap.conf env LC_ALL=C qemu-img info /var/lib/cinder/conversion/tmp2tkzwyhaopenstack-compute2 at rbd" returned: 0 in 0.372s execute /usr/lib/python3/dist-packages/oslo_concurrency/processutils.py:409
2019-02-18 15:57:53.344 29303 DEBUG cinder.image.image_utils [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Image conversion details: src /var/lib/cinder/conversion/tmp2tkzwyhaopenstack-compute2 at rbd, size 2048.00 MB, duration 2.05 sec, destination /var/lib/cinder/conversion/tmpiw6lpfyj _convert_image /usr/lib/python3/dist-packages/cinder/image/image_utils.py:263
2019-02-18 15:57:53.344 29303 INFO cinder.image.image_utils [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Converted 2048.00 MB image at 998.52 MB/s
2019-02-18 15:57:53.345 29303 DEBUG cinder.volume.drivers.rbd [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] connecting to volumes1 at ceph (conf=/etc/ceph/ceph.conf, timeout=-1). _do_conn /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:336
2019-02-18 15:57:53.635 29303 DEBUG cinder.volume.drivers.rbd [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] volume has no backup snaps _delete_backup_snaps /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:874
2019-02-18 15:57:53.636 29303 DEBUG cinder.volume.drivers.rbd [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Volume volume-d9b22b30-cb59-488b-94bd-15eaf47fd704 is not a clone. _get_clone_info /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:897
2019-02-18 15:57:53.663 29303 DEBUG cinder.volume.drivers.rbd [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] deleting rbd volume volume-d9b22b30-cb59-488b-94bd-15eaf47fd704 delete_volume /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:995
2019-02-18 15:57:54.843 29303 DEBUG oslo_concurrency.processutils [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Running cmd (subprocess): rbd import --pool volumes --order 22 /var/lib/cinder/conversion/tmpiw6lpfyj volume-d9b22b30-cb59-488b-94bd-15eaf47fd704 --new-format --id volumes1 --cluster ceph --conf /etc/ceph/ceph.conf execute /usr/lib/python3/dist-packages/oslo_concurrency/processutils.py:372
2019-02-18 15:58:48.369 29303 DEBUG oslo_service.periodic_task [req-f4ac235a-987e-4892-9b2b-d624feec45b4 - - - - -] Running periodic task VolumeManager.publish_service_capabilities run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:215
2019-02-18 15:58:48.370 29303 DEBUG cinder.volume.drivers.rbd [req-f4ac235a-987e-4892-9b2b-d624feec45b4 - - - - -] connecting to volumes1@ceph (conf=/etc/ceph/ceph.conf, timeout=-1). _do_conn /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:336
2019-02-18 15:58:48.412 29303 DEBUG cinder.volume.drivers.rbd [req-f4ac235a-987e-4892-9b2b-d624feec45b4 - - - - -] connecting to volumes1@ceph (conf=/etc/ceph/ceph.conf, timeout=-1). _do_conn /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:336
2019-02-18 15:58:48.446 29303 DEBUG cinder.volume.drivers.rbd [req-f4ac235a-987e-4892-9b2b-d624feec45b4 - - - - -] connecting to volumes1@ceph (conf=/etc/ceph/ceph.conf, timeout=-1). _do_conn /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:336
2019-02-18 15:58:48.763 29303 DEBUG cinder.manager [req-f4ac235a-987e-4892-9b2b-d624feec45b4 - - - - -] Notifying Schedulers of capabilities ... _publish_service_capabilities /usr/lib/python3/dist-packages/cinder/manager.py:194
2019-02-18 15:59:48.371 29303 DEBUG oslo_service.periodic_task [req-f4ac235a-987e-4892-9b2b-d624feec45b4 - - - - -] Running periodic task VolumeManager.publish_service_capabilities run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:215
2019-02-18 15:59:48.372 29303 DEBUG cinder.volume.drivers.rbd [req-f4ac235a-987e-4892-9b2b-d624feec45b4 - - - - -] connecting to volumes1@ceph (conf=/etc/ceph/ceph.conf, timeout=-1). _do_conn /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:336
2019-02-18 15:59:48.404 29303 DEBUG cinder.volume.drivers.rbd [req-f4ac235a-987e-4892-9b2b-d624feec45b4 - - - - -] connecting to volumes1@ceph (conf=/etc/ceph/ceph.conf, timeout=-1). _do_conn /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:336
2019-02-18 15:59:48.434 29303 DEBUG cinder.volume.drivers.rbd [req-f4ac235a-987e-4892-9b2b-d624feec45b4 - - - - -] connecting to volumes1@ceph (conf=/etc/ceph/ceph.conf, timeout=-1). _do_conn /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:336
2019-02-18 15:59:48.763 29303 DEBUG cinder.manager [req-f4ac235a-987e-4892-9b2b-d624feec45b4 - - - - -] Notifying Schedulers of capabilities ... _publish_service_capabilities /usr/lib/python3/dist-packages/cinder/manager.py:194
2019-02-18 16:00:00.674 29303 DEBUG oslo_concurrency.processutils [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] CMD "rbd import --pool volumes --order 22 /var/lib/cinder/conversion/tmpiw6lpfyj volume-d9b22b30-cb59-488b-94bd-15eaf47fd704 --new-format --id volumes1 --cluster ceph --conf /etc/ceph/ceph.conf" returned: 0 in 125.831s execute /usr/lib/python3/dist-packages/oslo_concurrency/processutils.py:409
2019-02-18 16:00:01.033 29303 DEBUG cinder.volume.drivers.rbd [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] connecting to volumes1@ceph (conf=/etc/ceph/ceph.conf, timeout=-1). _do_conn /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:336
2019-02-18 16:00:01.556 29303 DEBUG cinder.volume.flows.manager.create_volume [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Downloaded image eb816518-fac2-48f2-8197-ca7a83c89ada (('rbd://b%27868c2b5d-12f1-4f3f-aa2a-5701a3bb1041%27/images/eb816518-fac2-48f2-8197-ca7a83c89ada/snap', None)) to volume d9b22b30-cb59-488b-94bd-15eaf47fd704 successfully. _copy_image_to_volume /usr/lib/python3/dist-packages/cinder/volume/flows/manager/create_volume.py:557
2019-02-18 16:00:01.946 29303 DEBUG cinder.image.image_utils [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Temporary image eb816518-fac2-48f2-8197-ca7a83c89ada for user 664034a641e64922a1cc253e2dc39e76 is deleted. fetch /usr/lib/python3/dist-packages/cinder/image/image_utils.py:824
2019-02-18 16:00:01.948 29303 DEBUG cinder.volume.flows.manager.create_volume [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Marking volume d9b22b30-cb59-488b-94bd-15eaf47fd704 as bootable. _enable_bootable_flag /usr/lib/python3/dist-packages/cinder/volume/flows/manager/create_volume.py:476
2019-02-18 16:00:02.663 29303 DEBUG cinder.volume.flows.manager.create_volume [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Copying metadata from image eb816518-fac2-48f2-8197-ca7a83c89ada to d9b22b30-cb59-488b-94bd-15eaf47fd704. _handle_bootable_volume_glance_meta /usr/lib/python3/dist-packages/cinder/volume/flows/manager/create_volume.py:432
2019-02-18 16:00:02.664 29303 DEBUG cinder.volume.flows.manager.create_volume [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Creating volume glance metadata for volume d9b22b30-cb59-488b-94bd-15eaf47fd704 backed by image eb816518-fac2-48f2-8197-ca7a83c89ada with: {'image_id': 'eb816518-fac2-48f2-8197-ca7a83c89ada', 'image_name': 'debian-9', 'container_format': 'bare', 'disk_format': 'raw', 'checksum': 'abaeff277a4e2ce7dff253ad453312a5', 'min_disk': 0, 'size': 2147483648, 'min_ram': 0}. _capture_volume_image_metadata /usr/lib/python3/dist-packages/cinder/volume/flows/manager/create_volume.py:591
2019-02-18 16:00:04.186 29303 DEBUG cinder.volume.manager [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Task 'cinder.volume.flows.manager.create_volume.CreateVolumeFromSpecTask;volume:create' (980d2ea5-5e92-4699-bb48-8846797b7baf) transitioned into state 'SUCCESS' from state 'RUNNING' with result '{'image_location': ('rbd://b%27868c2b5d-12f1-4f3f-aa2a-5701a3bb1041%27/images/eb816518-fac2-48f2-8197-ca7a83c89ada/snap', None), 'status': 'creating', 'image_meta': {'owner': 'ba5ef70fd99642fdb75c9307c88b1164', 'visibility': 'shared', 'os_hash_value': '03fee9094d9720ddd51a7eb61755503a114752fbcefb10d1e5cf48a7f261ed9e36fe20e5c5bdf7e4a956910ebc6a0c0bd130b43e541d49b3b3362dd0d491851d', 'protected': False, 'properties': {}, 'os_hidden': False, 'checksum': 'abaeff277a4e2ce7dff253ad453312a5', 'min_disk': 0, 'updated_at': datetime.datetime(2019, 2, 18, 12, 3, 22, tzinfo=<iso8601.Utc>), 'created_at': datetime.datetime(2019, 2, 18, 12, 1, 4, tzinfo=<iso8601.Utc>), 'min_ram': 0, 'tags': [], 'virtual_size': None, 'direct_url': 'rbd://b%27868c2b5d-12f1-4f3f-aa2a-5701a3bb1041%27/images/eb816518-fac2-48f2-8197-ca7a83c89ada/snap', 'name': 'debian-9', 'container_format': 'bare', 'disk_format': 'raw', 'id': 'eb816518-fac2-48f2-8197-ca7a83c89ada', 'os_hash_algo': 'sha512', 'size': 2147483648, 'status': 'active', 'file': '/v2/images/eb816518-fac2-48f2-8197-ca7a83c89ada/file'}, 'volume_name': 'volume-d9b22b30-cb59-488b-94bd-15eaf47fd704', 'volume_size': 20, 'image_service': <cinder.image.glance.GlanceImageService object at 0x7f6db2eb64e0>, 'image_id': 'eb816518-fac2-48f2-8197-ca7a83c89ada'}' _task_receiver /usr/lib/python3/dist-packages/taskflow/listeners/logging.py:183
2019-02-18 16:00:04.188 29303 DEBUG cinder.volume.manager [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Task 'cinder.volume.flows.manager.create_volume.CreateVolumeOnFinishTask;volume:create, create.end' (afb50542-d5a4-4558-a6d7-6399ede97460) transitioned into state 'RUNNING' from state 'PENDING' _task_receiver /usr/lib/python3/dist-packages/taskflow/listeners/logging.py:194
2019-02-18 16:00:04.258 29303 INFO cinder.volume.flows.manager.create_volume [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Volume volume-d9b22b30-cb59-488b-94bd-15eaf47fd704 (d9b22b30-cb59-488b-94bd-15eaf47fd704): created successfully
2019-02-18 16:00:04.260 29303 DEBUG cinder.volume.manager [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Task 'cinder.volume.flows.manager.create_volume.CreateVolumeOnFinishTask;volume:create, create.end' (afb50542-d5a4-4558-a6d7-6399ede97460) transitioned into state 'SUCCESS' from state 'RUNNING' with result 'None' _task_receiver /usr/lib/python3/dist-packages/taskflow/listeners/logging.py:183
2019-02-18 16:00:04.263 29303 DEBUG cinder.volume.manager [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Flow 'volume_create_manager' (18af88d8-7ea2-41c0-8554-b231a179f492) transitioned into state 'SUCCESS' from state 'RUNNING' _flow_receiver /usr/lib/python3/dist-packages/taskflow/listeners/logging.py:145
2019-02-18 16:00:04.294 29303 INFO cinder.volume.manager [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Created volume successfully.
2019-02-18 16:00:06.498 29303 DEBUG oslo_concurrency.processutils [req-c944f40c-94fd-4a0c-9640-6f3fe7b9f007 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Running cmd (subprocess): ceph mon dump --format=json --id volumes1 --cluster ceph --conf /etc/ceph/ceph.conf execute /usr/lib/python3/dist-packages/oslo_concurrency/processutils.py:372
2019-02-18 16:00:06.888 29303 DEBUG oslo_concurrency.processutils [req-c944f40c-94fd-4a0c-9640-6f3fe7b9f007 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] CMD "ceph mon dump --format=json --id volumes1 --cluster ceph --conf /etc/ceph/ceph.conf" returned: 0 in 0.390s execute /usr/lib/python3/dist-packages/oslo_concurrency/processutils.py:409
2019-02-18 16:00:06.890 29303 DEBUG cinder.volume.drivers.rbd [req-c944f40c-94fd-4a0c-9640-6f3fe7b9f007 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] connection data: {'driver_volume_type': 'rbd', 'data': {'cluster_name': 'ceph', 'secret_type': 'ceph', 'discard': True, 'keyring': None, 'hosts': ['185.21.196.128', '185.21.196.146', '185.21.196.163'], 'secret_uuid': 'b2efeb49-9844-475b-92ad-5df4a3e1300e', 'name': 'volumes/volume-d9b22b30-cb59-488b-94bd-15eaf47fd704', 'auth_username': 'volumes1', 'auth_enabled': True, 'volume_id': 'd9b22b30-cb59-488b-94bd-15eaf47fd704', 'ports': ['6789', '6789', '6789']}} initialize_connection /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:1286
2019-02-18 16:00:06.891 29303 INFO cinder.volume.manager [req-c944f40c-94fd-4a0c-9640-6f3fe7b9f007 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Initialize volume connection completed successfully.
2019-02-18 16:00:08.302 29303 DEBUG cinder.coordination [req-3e789701-ed1a-47a2-8525-3c7631cbbd46 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Lock "/var/lib/cinder/cinder-d9b22b30-cb59-488b-94bd-15eaf47fd704" acquired by "attach_volume" :: waited 0.002s _synchronized /usr/lib/python3/dist-packages/cinder/coordination.py:150
2019-02-18 16:00:09.481 29303 INFO cinder.volume.manager [req-3e789701-ed1a-47a2-8525-3c7631cbbd46 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Attaching volume d9b22b30-cb59-488b-94bd-15eaf47fd704 to instance 4eeefa1e-4382-4716-8364-029c47aa8b24 at mountpoint /dev/vda on host None.
2019-02-18 16:00:09.905 29303 INFO cinder.volume.manager [req-3e789701-ed1a-47a2-8525-3c7631cbbd46 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Attach volume completed successfully.
2019-02-18 16:00:09.906 29303 DEBUG cinder.coordination [req-3e789701-ed1a-47a2-8525-3c7631cbbd46 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] Lock "/var/lib/cinder/cinder-d9b22b30-cb59-488b-94bd-15eaf47fd704" released by "attach_volume" :: held 1.604s _synchronized /usr/lib/python3/dist-packages/cinder/coordination.py:162
In the log above, this line is the important one under python3 (a str vs bytes issue):
2019-02-18 15:57:30.356 29303 DEBUG cinder.volume.drivers.rbd [req-bdc7dc4f-6fca-4358-beb9-43d3c2000b42 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] rbd://b%27868c2b5d-12f1-4f3f-aa2a-5701a3bb1041%27/images/eb816518-fac2-48f2-8197-ca7a83c89ada/snap is in a different ceph cluster. _is_cloneable /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:1320
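For reference, the mangled b%27...%27 prefix in that image location is exactly what falls out when a bytes fsid is run through str() and then percent-encoded while building the rbd:// URL. A minimal sketch of how such a location can arise (this is a hypothetical reconstruction; the actual glance_store code path may differ):

```python
from urllib.parse import quote

# Under python3, librados can hand back the cluster fsid as bytes.
fsid = b'868c2b5d-12f1-4f3f-aa2a-5701a3bb1041'

# str() on bytes yields "b'...'", and the surrounding quotes then get
# percent-encoded to %27 when the location URL is assembled.
location = 'rbd://{}/images/{}/snap'.format(
    quote(str(fsid), safe=''),
    'eb816518-fac2-48f2-8197-ca7a83c89ada')
```

The resulting string is rbd://b%27868c2b5d-12f1-4f3f-aa2a-5701a3bb1041%27/images/eb816518-fac2-48f2-8197-ca7a83c89ada/snap, matching the broken location in the log above.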
To debug why, I added the following to the _is_cloneable function in /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:
    def _is_cloneable(self, image_location, image_meta):
        try:
            # MICHAL ARBET DEBUG
            LOG.debug("MICHAL ARBET Image Location : {}".format(image_location))
            LOG.debug("MICHAL ARBET TYPE Image Location : {}".format(
                type(image_location)))
            fsid, pool, image, snapshot = self._parse_location(image_location)
        except exception.ImageUnacceptable as e:
            LOG.debug('not cloneable: %s.', e)
            return False

        # MICHAL ARBET DEBUG
        self_get_fsid = self._get_fsid()
        LOG.debug("MICHAL ARBET fsid = {}".format(fsid))
        LOG.debug("MICHAL ARBET type fsid = {}".format(type(fsid)))
        LOG.debug("MICHAL ARBET self._get_fsid() = {}".format(self_get_fsid))
        LOG.debug("MICHAL ARBET type self._get_fsid() = {}".format(
            type(self_get_fsid)))

        if self_get_fsid != fsid:
            LOG.debug('%s is in a different ceph cluster.', image_location)
            return False
root@openstack-compute2:~# cat /var/log/cinder/* | grep MICHAL\ ARBET
2019-02-18 16:58:34.398 6973 DEBUG cinder.volume.drivers.rbd [req-b90a4a29-e66c-4704-bc44-1487cfff3924 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] MICHAL ARBET Image Location : rbd://b%27868c2b5d-12f1-4f3f-aa2a-5701a3bb1041%27/images/eb816518-fac2-48f2-8197-ca7a83c89ada/snap _is_cloneable /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:1314
2019-02-18 16:58:34.398 6973 DEBUG cinder.volume.drivers.rbd [req-b90a4a29-e66c-4704-bc44-1487cfff3924 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] MICHAL ARBET TYPE Image Location : <class 'str'> _is_cloneable /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:1315
2019-02-18 16:58:34.436 6973 DEBUG cinder.volume.drivers.rbd [req-b90a4a29-e66c-4704-bc44-1487cfff3924 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] MICHAL ARBET fsid = b'868c2b5d-12f1-4f3f-aa2a-5701a3bb1041' _is_cloneable /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:1323
2019-02-18 16:58:34.437 6973 DEBUG cinder.volume.drivers.rbd [req-b90a4a29-e66c-4704-bc44-1487cfff3924 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] MICHAL ARBET type fsid = <class 'str'> _is_cloneable /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:1324
2019-02-18 16:58:34.438 6973 DEBUG cinder.volume.drivers.rbd [req-b90a4a29-e66c-4704-bc44-1487cfff3924 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] MICHAL ARBET self._get_fsid() = b'868c2b5d-12f1-4f3f-aa2a-5701a3bb1041' _is_cloneable /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:1325
2019-02-18 16:58:34.438 6973 DEBUG cinder.volume.drivers.rbd [req-b90a4a29-e66c-4704-bc44-1487cfff3924 664034a641e64922a1cc253e2dc39e76 ba5ef70fd99642fdb75c9307c88b1164 - default default] MICHAL ARBET type self._get_fsid() = <class 'bytes'> _is_cloneable /usr/lib/python3/dist-packages/cinder/volume/drivers/rbd.py:1326
The log above shows the python3 bug: self._get_fsid() returns bytes (<class 'bytes'>) while the fsid parsed from the glance image location is a str, so the equality check never matches and cinder falls back to a full image download and conversion instead of an RBD clone.
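The fix, in essence, is to normalize the fsid to str before comparing. A minimal sketch of the idea (normalize_fsid is a hypothetical helper name; the actual upstream patch may implement this differently):

```python
def normalize_fsid(fsid):
    # librados may return the cluster fsid as bytes under python3;
    # decode it so it compares equal to the str fsid parsed out of
    # the glance image location.
    if isinstance(fsid, bytes):
        return fsid.decode('utf-8')
    return fsid

# bytes and str never compare equal, which is why _is_cloneable bailed out:
assert b'868c2b5d-12f1-4f3f-aa2a-5701a3bb1041' != '868c2b5d-12f1-4f3f-aa2a-5701a3bb1041'
# after normalizing, the comparison works as intended:
assert normalize_fsid(b'868c2b5d-12f1-4f3f-aa2a-5701a3bb1041') == \
    '868c2b5d-12f1-4f3f-aa2a-5701a3bb1041'
```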
To manage notifications about this bug go to:
https://bugs.launchpad.net/cinder/+bug/1816468/+subscriptions