Ceph OSD vs. AppArmor

tpDev Tester tpdev.tester at gmail.com
Thu Nov 7 14:49:31 UTC 2024


Hi,

I would like to deploy a Ceph cluster on 24.04 with Podman. I installed the Ubuntu cephadm, ceph-common, ... packages, and the cluster deployment itself worked well, but adding OSDs fails with the error below. The Ceph mailing list suggests checking AppArmor.

aa-status lists some Ceph-related profiles in 'enforce' mode, but there are no relevant entries in kern.log or syslog. When AppArmor is turned off completely, adding OSDs works fine, but that is not my first choice. I tried to switch the enforced profiles to 'audit' mode to get more logging, but this failed with e.g.:

root at ceph-3:~# aa-audit containers-default-0.57.4-apparmor1

ERROR: Operation {'runbindable'} cannot have a source. Source = AARE('/')

aa-complain ... fails as well.


How can I find out what is going on with AppArmor and Ceph OSDs?


Kind regards
Thomas



####
root at ceph-3:~# ceph orch daemon add osd ceph-3:/dev/sda
Error EINVAL: Traceback (most recent call last):
   File "/usr/share/ceph/mgr/mgr_module.py", line 1862, in _handle_command
     return self.handle_command(inbuf, cmd)
   File "/usr/share/ceph/mgr/orchestrator/_interface.py", line 184, in handle_command
     return dispatch[cmd['prefix']].call(self, cmd, inbuf)
   File "/usr/share/ceph/mgr/mgr_module.py", line 499, in call
     return self.func(mgr, **kwargs)
   File "/usr/share/ceph/mgr/orchestrator/_interface.py", line 120, in <lambda>
     wrapper_copy = lambda *l_args, **l_kwargs: wrapper(*l_args, **l_kwargs)  # noqa: E731
   File "/usr/share/ceph/mgr/orchestrator/_interface.py", line 109, in wrapper
     return func(*args, **kwargs)
   File "/usr/share/ceph/mgr/orchestrator/module.py", line 1374, in _daemon_add_osd
     raise_if_exception(completion)
   File "/usr/share/ceph/mgr/orchestrator/_interface.py", line 241, in raise_if_exception
     raise e
RuntimeError: cephadm exited with an error code: 1, stderr:Inferring config /var/lib/ceph/8e142907-9cdc-11ef-bf79-001e06456e10/mon.ceph-3/config
Traceback (most recent call last):
   File "<frozen runpy>", line 198, in _run_module_as_main
   File "<frozen runpy>", line 88, in _run_code
   File "/var/lib/ceph/8e142907-9cdc-11ef-bf79-001e06456e10/cephadm.a58127a8eed242cae13849ddbebcb9931d7a5410f406f2d264e3b1ed31d9605e/__main__.py", line 5579, in <module>
   File "/var/lib/ceph/8e142907-9cdc-11ef-bf79-001e06456e10/cephadm.a58127a8eed242cae13849ddbebcb9931d7a5410f406f2d264e3b1ed31d9605e/__main__.py", line 5567, in main
   File "/var/lib/ceph/8e142907-9cdc-11ef-bf79-001e06456e10/cephadm.a58127a8eed242cae13849ddbebcb9931d7a5410f406f2d264e3b1ed31d9605e/__main__.py", line 409, in _infer_config
   File "/var/lib/ceph/8e142907-9cdc-11ef-bf79-001e06456e10/cephadm.a58127a8eed242cae13849ddbebcb9931d7a5410f406f2d264e3b1ed31d9605e/__main__.py", line 324, in _infer_fsid
   File "/var/lib/ceph/8e142907-9cdc-11ef-bf79-001e06456e10/cephadm.a58127a8eed242cae13849ddbebcb9931d7a5410f406f2d264e3b1ed31d9605e/__main__.py", line 437, in _infer_image
   File "/var/lib/ceph/8e142907-9cdc-11ef-bf79-001e06456e10/cephadm.a58127a8eed242cae13849ddbebcb9931d7a5410f406f2d264e3b1ed31d9605e/__main__.py", line 311, in _validate_fsid
   File "/var/lib/ceph/8e142907-9cdc-11ef-bf79-001e06456e10/cephadm.a58127a8eed242cae13849ddbebcb9931d7a5410f406f2d264e3b1ed31d9605e/__main__.py", line 3288, in command_ceph_volume
   File "/var/lib/ceph/8e142907-9cdc-11ef-bf79-001e06456e10/cephadm.a58127a8eed242cae13849ddbebcb9931d7a5410f406f2d264e3b1ed31d9605e/__main__.py", line 918, in get_container_mounts_for_type
   File "/var/lib/ceph/8e142907-9cdc-11ef-bf79-001e06456e10/cephadm.a58127a8eed242cae13849ddbebcb9931d7a5410f406f2d264e3b1ed31d9605e/cephadmlib/daemons/ceph.py", line 422, in get_ceph_mounts_for_type
   File "/var/lib/ceph/8e142907-9cdc-11ef-bf79-001e06456e10/cephadm.a58127a8eed242cae13849ddbebcb9931d7a5410f406f2d264e3b1ed31d9605e/cephadmlib/host_facts.py", line 760, in selinux_enabled
   File "/var/lib/ceph/8e142907-9cdc-11ef-bf79-001e06456e10/cephadm.a58127a8eed242cae13849ddbebcb9931d7a5410f406f2d264e3b1ed31d9605e/cephadmlib/host_facts.py", line 743, in kernel_security
   File "/var/lib/ceph/8e142907-9cdc-11ef-bf79-001e06456e10/cephadm.a58127a8eed242cae13849ddbebcb9931d7a5410f406f2d264e3b1ed31d9605e/cephadmlib/host_facts.py", line 722, in _fetch_apparmor
ValueError: too many values to unpack (expected 2)
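
For what it's worth, the last frame points at cephadm's AppArmor probe rather than at an AppArmor denial. A plausible (hypothetical, not copied from cephadm's source) reconstruction of how that ValueError can arise: if the probe reads /sys/kernel/security/apparmor/profiles, where each line normally looks like "profile-name (mode)", and unpacks each line into exactly two fields, then any profile whose name itself contains a space produces three or more fields and the two-way unpack blows up:

```python
# Hypothetical sketch of the suspected parse in _fetch_apparmor.
# Lines in /sys/kernel/security/apparmor/profiles usually have the
# form "profile-name (mode)"; a profile name containing a space
# breaks a strict two-way unpack.
lines = [
    "cri-containerd.apparmor.d (enforce)",            # two fields: OK
    "/usr/sbin/cupsd (enforce)",                      # two fields: OK
    "example profile with space (enforce)",           # made-up name, 5 fields
]

for line in lines:
    try:
        item, mode = line.split(" ")
        print(f"profile={item} mode={mode}")
    except ValueError as e:
        print(f"failed on {line!r}: {e}")
```

So it may be worth checking whether any line in /sys/kernel/security/apparmor/profiles on ceph-3 contains a space in the profile name; that would explain why the failure disappears when AppArmor is disabled entirely (the file is then absent or empty), independent of any actual enforcement decision.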
