Bug 1160808 - SELinux prevents hosted engine from being deployed on RHEL 6.6 with iSCSI support
Summary: SELinux prevents hosted engine from being deployed on RHEL 6.6 with iSCSI support
Keywords:
Status: CLOSED CURRENTRELEASE
Alias: None
Product: Red Hat Enterprise Virtualization Manager
Classification: Red Hat
Component: vdsm
Version: 3.5.0
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: urgent
Target Milestone: ---
Target Release: 3.5.0
Assignee: Nir Soffer
QA Contact: Elad
URL:
Whiteboard: storage
Depends On: 1167277 1171452
Blocks: 1159946 rhev35rcblocker rhev35gablocker
 
Reported: 2014-11-05 16:37 UTC by Simone Tiraboschi
Modified: 2016-02-10 18:24 UTC
CC List: 18 users

Fixed In Version: vdsm-4.16.8.1-4.el6ev
Doc Type: Bug Fix
Doc Text:
Cause: An outdated SELinux policy can prevent hosted engine deployment on RHEL 6.6.
Consequence: Hosted engine deployment fails with SELinux denials.
Fix: Manually run "yum upgrade selinux-policy" and make sure at least selinux-policy-3.7.19-261.el6 is installed.
Result: Hosted engine deployment succeeds.
Clone Of:
Cloned To: 1167277
Environment:
Last Closed: 2015-02-16 13:39:53 UTC
oVirt Team: Storage
Target Upstream Version:


Attachments
ovirt-hosted-engine-setup (deleted) - 2014-11-06 12:16 UTC, Simone Tiraboschi
vdsm (deleted) - 2014-11-06 12:17 UTC, Simone Tiraboschi
audit (deleted) - 2014-11-06 12:21 UTC, Simone Tiraboschi


Links
oVirt gerrit 35973 (master): MERGED - spec: fix selinux-policy requirement for EL6
oVirt gerrit 36018 (ovirt-3.5): MERGED - spec: fix selinux-policy requirement for EL6

Description Simone Tiraboschi 2014-11-05 16:37:40 UTC
Description of problem:
Deploying hosted engine via iSCSI on RHEL 6.6 hosts fails due to SELinux denials.

Version-Release number of selected component (if applicable):
# rpm -qa|egrep "(selinux-policy|libvirt|qemu)"|sort 
gpxe-roms-qemu-0.9.7-6.12.el6.noarch
libvirt-0.10.2-46.el6_6.1.x86_64
libvirt-client-0.10.2-46.el6_6.1.x86_64
libvirt-lock-sanlock-0.10.2-46.el6_6.1.x86_64
libvirt-python-0.10.2-46.el6_6.1.x86_64
qemu-img-rhev-0.12.1.2-2.448.el6.x86_64
qemu-kvm-rhev-0.12.1.2-2.448.el6.x86_64
selinux-policy-3.7.19-260.el6.noarch
selinux-policy-targeted-3.7.19-260.el6.noarch

RHEV-M 3.5.0 vt8

How reproducible:
100%

Steps to Reproduce:
1. Deploy hosted engine via iSCSI

Actual results:
From hosted-engine setup:
[ INFO  ] Engine replied: DB Up!Welcome to Health Status!
          Enter the name of the cluster to which you want to add the host (Default) [Default]: 
[ INFO  ] Waiting for the host to become operational in the engine. This may take several minutes...
[ ERROR ] The VDSM host was found in a failed state. Please check engine and bootstrap installation logs.
[ ERROR ] Unable to add hosted_engine_1 to the manager
          Please shutdown the VM allowing the system to launch it as a monitored service.
          The system will wait until the VM is down.
[ ERROR ] Failed to execute stage 'Closing up': [Errno 111] Connection refused
[ INFO  ] Stage: Clean up
[ ERROR ] Failed to execute stage 'Clean up': [Errno 111] Connection refused
[ INFO  ] Generating answer file '/var/lib/ovirt-hosted-engine-setup/answers/answers-20141105163830.conf'


From VDSM logs:
Thread-73::DEBUG::2014-11-05 16:38:13,471::domainMonitor::201::Storage.DomainMonitorThread::(_monitorLoop) Unable to release the host id 1 for domain a4eed2bb-5acc-4056-8940-5cb55ccf1b6d
Traceback (most recent call last):
  File "/usr/share/vdsm/storage/domainMonitor.py", line 198, in _monitorLoop
    self.domain.releaseHostId(self.hostId, unused=True)
  File "/usr/share/vdsm/storage/sd.py", line 480, in releaseHostId
    self._clusterLock.releaseHostId(hostId, async, unused)
  File "/usr/share/vdsm/storage/clusterlock.py", line 252, in releaseHostId
    raise se.ReleaseHostIdFailure(self._sdUUID, e)
ReleaseHostIdFailure: Cannot release host id: ('a4eed2bb-5acc-4056-8940-5cb55ccf1b6d', SanlockException(16, 'Sanlock lockspace remove failure', 'Device or resource busy'))
VM Channels Listener::INFO::2014-11-05 16:38:13,472::vmchannels::183::vds::(run) VM channels listener thread has ended.


From SELinux logs:
----
time->Wed Nov  5 16:40:08 2014
type=SYSCALL msg=audit(1415202008.743:1587): arch=c000003e syscall=6 success=yes exit=0 a0=7fffef0a8e10 a1=7fffef0a4180 a2=7fffef0a4180 a3=6 items=0 ppid=1838 pid=2074 auid=4294967295 uid=175 gid=175 euid=175 suid=175 fsuid=175 egid=175 sgid=175 fsgid=175 tty=(none) ses=4294967295 comm="python" exe="/usr/bin/python" subj=system_u:system_r:rhev_agentd_t:s0 key=(null)
type=AVC msg=audit(1415202008.743:1587): avc:  denied  { getattr } for  pid=2074 comm="python" path="/dev/.udev/db/block:sr0" dev=devtmpfs ino=9604 scontext=system_u:system_r:rhev_agentd_t:s0 tcontext=system_u:object_r:udev_tbl_t:s0 tclass=file
----
time->Wed Nov  5 16:40:08 2014
type=SYSCALL msg=audit(1415202008.743:1588): arch=c000003e syscall=2 success=yes exit=6 a0=7fffef0a8e10 a1=0 a2=1b6 a3=0 items=0 ppid=1838 pid=2074 auid=4294967295 uid=175 gid=175 euid=175 suid=175 fsuid=175 egid=175 sgid=175 fsgid=175 tty=(none) ses=4294967295 comm="python" exe="/usr/bin/python" subj=system_u:system_r:rhev_agentd_t:s0 key=(null)
type=AVC msg=audit(1415202008.743:1588): avc:  denied  { open } for  pid=2074 comm="python" name="block:sr0" dev=devtmpfs ino=9604 scontext=system_u:system_r:rhev_agentd_t:s0 tcontext=system_u:object_r:udev_tbl_t:s0 tclass=file
type=AVC msg=audit(1415202008.743:1588): avc:  denied  { read } for  pid=2074 comm="python" name="block:sr0" dev=devtmpfs ino=9604 scontext=system_u:system_r:rhev_agentd_t:s0 tcontext=system_u:object_r:udev_tbl_t:s0 tclass=file
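
These records show the RHEV guest agent (rhev_agentd_t) being denied getattr/open/read access to the udev database entry for the sr0 device. A human-readable explanation of such denials can be produced with audit2why; a minimal sketch, assuming the default audit log path:

# grep rhev_agentd /var/log/audit/audit.log | audit2why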


Expected results:
The deployment should succeed.

Additional info:
We recently faced a similar issue on EL7; see https://bugzilla.redhat.com/show_bug.cgi?id=1146529

Comment 1 Allon Mureinik 2014-11-06 08:19:14 UTC
Nir, doesn't the fix for bug 1127460 cover this one too?

Comment 2 Nir Soffer 2014-11-06 09:13:48 UTC
Simone: Why do you think this is related to storage?

Allon: I don't see any relation to bug 1127460. Did the hosted engine vm pause?

Comment 3 Elad 2014-11-06 09:29:23 UTC
Did you try to deploy the HE over a LUN which was used for a storage domain previously? 

Can you please attach the setup logs?

Comment 4 Simone Tiraboschi 2014-11-06 10:54:32 UTC
(In reply to Nir Soffer from comment #2)
> Simone: Why do you think this is related to storage?

Just because I noticed a sanlock failure; I'm not really sure about that.

ReleaseHostIdFailure: Cannot release host id: ('a4eed2bb-5acc-4056-8940-5cb55ccf1b6d', SanlockException(16, 'Sanlock lockspace remove failure', 'Device or resource busy'))

> Allon: I don't see any relation to bug 1127460. Did the hosted engine vm
> pause?

If I remember correctly, no.


(In reply to Elad from comment #3)
> Did you try to deploy the HE over a LUN which was used for a storage domain
> previously? 

No, it was a fresh one.

> Can you please attach the setup logs?

Of course.

Comment 5 Simone Tiraboschi 2014-11-06 12:16:18 UTC
Created attachment 954422 [details]
ovirt-hosted-engine-setup

Comment 6 Simone Tiraboschi 2014-11-06 12:17:39 UTC
Created attachment 954423 [details]
vdsm

Comment 7 Simone Tiraboschi 2014-11-06 12:21:17 UTC
Created attachment 954424 [details]
audit

Comment 8 Miroslav Grepl 2014-11-07 09:55:39 UTC
I see

type=AVC msg=audit(1415260556.242:265555): avc:  denied  { getattr } for  pid=23130 comm="python" path="/dev/.udev/db/block:sr0" dev=devtmpfs ino=92089 scontext=system_u:system_r:rhev_agentd_t:s0 tcontext=system_u:object_r:udev_tbl_t:s0 tclass=file
type=SYSCALL msg=audit(1415260556.242:265555): arch=c000003e syscall=6 success=yes exit=0 a0=7fff19386ff0 a1=7fff19382360 a2=7fff19382360 a3=6 items=0 ppid=1898 pid=23130 auid=4294967295 uid=175 gid=175 euid=175 suid=175 fsuid=175 egid=175 sgid=175 fsgid=175 tty=(none) ses=4294967295 comm="python" exe="/usr/bin/python" subj=system_u:system_r:rhev_agentd_t:s0 key=(null)
type=AVC msg=audit(1415260556.242:265556): avc:  denied  { read } for  pid=23130 comm="python" name="block:sr0" dev=devtmpfs ino=92089 scontext=system_u:system_r:rhev_agentd_t:s0 tcontext=system_u:object_r:udev_tbl_t:s0 tclass=file
type=AVC msg=audit(1415260556.242:265556): avc:  denied  { open } for  pid=23130 comm="python" name="block:sr0" dev=devtmpfs ino=92089 scontext=system_u:system_r:rhev_agentd_t:s0 tcontext=system_u:object_r:udev_tbl_t:s0 tclass=file


Did it work in permissive mode?

Comment 9 Simone Tiraboschi 2014-11-07 10:05:53 UTC
(In reply to Miroslav Grepl from comment #8)
> Did it work in permissive mode?

Yes, it does.

Comment 10 Miroslav Grepl 2014-11-21 15:13:49 UTC
Could you test it with

# grep rhev_agentd /var/log/audit/audit.log | audit2allow -M mypol
# semodule -i mypol.pp

in enforcing mode?
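
To confirm the generated module actually loaded before re-testing, something like this should work (mypol being the module name used above):

# semodule -l | grep mypol
# getenforce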

Comment 11 Simone Tiraboschi 2014-11-21 16:25:01 UTC
(In reply to Miroslav Grepl from comment #10)
> Could you test it with
> 
> # grep rhev_agentd /var/log/audit/audit.log | audit2allow -M mypol
> # semodule -i mypol.pp
> 
> in enforcing mode?

After that it seems to work as expected.

Comment 12 Miroslav Grepl 2014-11-24 11:09:13 UTC
diff --git a/rhev.te b/rhev.te
index eeee78a..8b7aa12 100644
--- a/rhev.te
+++ b/rhev.te
@@ -93,6 +93,10 @@ optional_policy(`
 ')
 
 optional_policy(`
+    udev_read_db(rhev_agentd_t)
+')
+
+optional_policy(`

is needed.
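
Until a selinux-policy build containing that change is available, a roughly equivalent local module can be built directly from the raw AVC records. A sketch (the module name rhev_udev is arbitrary, and the real udev_read_db() interface grants slightly more than these three permissions):

cat > rhev_udev.te <<'EOF'
module rhev_udev 1.0;

require {
    type rhev_agentd_t;
    type udev_tbl_t;
    class file { getattr open read };
}

# allow the RHEV guest agent to read udev database files,
# matching the getattr/open/read denials in the audit log
allow rhev_agentd_t udev_tbl_t:file { getattr open read };
EOF
checkmodule -M -m -o rhev_udev.mod rhev_udev.te
semodule_package -o rhev_udev.pp -m rhev_udev.mod
semodule -i rhev_udev.pp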

Comment 13 Allon Mureinik 2014-11-24 11:59:50 UTC
Miroslav, isn't the dependency reversed here?
IIUC, bug 1167277 should supply a new selinux-policy and then RHEV should consume it (this bug)?

Comment 14 Miroslav Grepl 2014-11-24 12:12:29 UTC
Feel free to edit it.

Comment 15 Michal Skrivanek 2014-12-01 16:41:13 UTC
Should this block GA? The workaround is simple: switch SELinux to permissive, then switch it back after deployment.
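
Spelled out, the workaround would look like this (a sketch; note the host stays permissive for the whole deployment window):

# setenforce 0
# hosted-engine --deploy
# setenforce 1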

Comment 16 Allon Mureinik 2014-12-01 20:02:03 UTC
(In reply to Michal Skrivanek from comment #15)
> Should this block GA? The workaround is simple: switch SELinux to
> permissive, then switch it back after deployment.
I'm fine with not blocking GA on this, but it's not my call.
Ultimately, a PM should ack/nack this.

Doron - you understand HE better than me - your two cents here?

Comment 17 Doron Fediuck 2014-12-02 13:24:42 UTC
(In reply to Allon Mureinik from comment #16)
> (In reply to Michal Skrivanek from comment #15)
> > Should this block GA? The workaround is simple: switch SELinux to
> > permissive, then switch it back after deployment.
> I'm fine with not blocking GA on this, but not my call.
> Ultimately, a PM should ack/nack this.
> 
> Doron - you understand HE better than me - your two cents here?

Since the RHEL bug 1167277 moved to MODIFIED, we should be fine now.
So there is no point in keeping this one as a blocker.

Comment 18 Allon Mureinik 2014-12-02 19:54:39 UTC
We need a patch to update vdsm.spec.in to require this rpm once it's out.
If this indeed solves the issue, a customer could simply yum upgrade selinux-policy-targeted to avoid it.
Ugly, but not a blocker - assuming RHEV's QA team can verify this.
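
For anyone hitting this before the vdsm dependency lands, the manual path would be something like (a sketch; the exact fixed version is whatever the RHEL bug ships):

# rpm -q selinux-policy selinux-policy-targeted
# yum upgrade selinux-policy selinux-policy-targeted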

Comment 19 Allon Mureinik 2014-12-04 15:31:55 UTC
Can we please verify this with selinux-policy-3.7.19-260.el6_6.1 (https://brewweb.devel.redhat.com/buildinfo?buildID=401412)?

Comment 20 Aharon Canan 2014-12-07 09:40:09 UTC
(In reply to Allon Mureinik from comment #19)
> Can we please verify this with selinux-policy-3.7.19-260.el6_6.1
> (https://brewweb.devel.redhat.com/buildinfo?buildID=401412)?

Allon, 

In case we are using a specific package version which is not part of the regular installation, I am not sure we can set it to ON_QA.
Is it going to be part of the dependencies?

Comment 21 Allon Mureinik 2014-12-07 09:45:00 UTC
(In reply to Aharon Canan from comment #20)
> (In reply to Allon Mureinik from comment #19)
> > Can we please verify this with selinux-policy-3.7.19-260.el6_6.1
> > (https://brewweb.devel.redhat.com/buildinfo?buildID=401412)?
> 
> Allon, 
> 
> In case we are using a specific package version which is not part of the
> regular installation, I am not sure we can set it to ON_QA.
> Is it going to be part of the dependencies?
Obviously.

Comment 22 Allon Mureinik 2014-12-08 21:52:58 UTC
(In reply to Aharon Canan from comment #20)
> (In reply to Allon Mureinik from comment #19)
> > Can we please verify this with selinux-policy-3.7.19-260.el6_6.1
> > (https://brewweb.devel.redhat.com/buildinfo?buildID=401412)?
> 
> Allon, 
> 
> In case we are using a specific package version which is not part of the
> regular installation, I am not sure we can set it to ON_QA.
> Is it going to be part of the dependencies?
On second thought, you're right.

We can proceed in two directions here:
1. dev - should add a dependency in VDSM (in the works, see http://gerrit.ovirt.org/#/c/35973)
2. qa - can, if they wish, test by manually yum upgrading.

Moving bug back to POST.

Comment 23 Elad 2014-12-09 08:09:59 UTC
(In reply to Allon Mureinik from comment #22)
> (In reply to Aharon Canan from comment #20)
> > (In reply to Allon Mureinik from comment #19)
> > > Can we please verify this with selinux-policy-3.7.19-260.el6_6.1
> > > (https://brewweb.devel.redhat.com/buildinfo?buildID=401412)?
> > 
> > Allon, 
> > 
> > In case we are using a specific package version which is not part of the
> > regular installation, I am not sure we can set it to ON_QA.
> > Is it going to be part of the dependencies?
> On second thought, you're right.
> 
> We can proceed in two directions here:
> 1. dev - should add a dependency in VDSM (in the works, see
> http://gerrit.ovirt.org/#/c/35973)
> 2. qa - can, if they wish, test by manually yum upgrading.
> 
> Moving bug back to POST.

Allon, I'm unable to deploy hosted-engine due to https://bugzilla.redhat.com/show_bug.cgi?id=1167074

Comment 24 Elad 2014-12-09 09:53:13 UTC
(In reply to Elad from comment #23)
> (In reply to Allon Mureinik from comment #22)
> > (In reply to Aharon Canan from comment #20)
> > > (In reply to Allon Mureinik from comment #19)
> > > > Can we please verify this with selinux-policy-3.7.19-260.el6_6.1
> > > > (https://brewweb.devel.redhat.com/buildinfo?buildID=401412)?
> > > 
> > > Allon, 
> > > 
> > > In case we are using a specific package version which is not part of the
> > > regular installation, I am not sure we can set it to ON_QA.
> > > Is it going to be part of the dependencies?
> > On second thought, you're right.
> > 
> > We can proceed in two directions here:
> > 1. dev - should add a dependency in VDSM (in the works, see
> > http://gerrit.ovirt.org/#/c/35973)
> > 2. qa - can, if they wish, test by manually yum upgrading.
> > 
> > Moving bug back to POST.
> 
> Allon, I'm unable to deploy hosted-engine due to
> https://bugzilla.redhat.com/show_bug.cgi?id=1167074

I managed to deploy using the default SELinux policy; I will try using https://brewweb.devel.redhat.com/buildinfo?buildID=401412

Comment 25 Elad 2014-12-09 10:45:19 UTC
(In reply to Allon Mureinik from comment #19)
> Can we please verify this with selinux-policy-3.7.19-260.el6_6.1
> (https://brewweb.devel.redhat.com/buildinfo?buildID=401412)?

Checked deployment using:

RHEL6.6 

libselinux-utils-2.0.94-5.8.el6.x86_64
libselinux-2.0.94-5.8.el6.x86_64
selinux-policy-targeted-3.7.19-260.el6_6.1.noarch
libselinux-ruby-2.0.94-5.8.el6.x86_64
libselinux-python-2.0.94-5.8.el6.x86_64
selinux-policy-3.7.19-260.el6_6.1.noarch

ovirt-hosted-engine-setup-1.2.1-7.el6ev.noarch
vdsm-4.16.8.1-2.el6ev.x86_64

Deployment went fine.

Comment 26 Elad 2014-12-21 16:04:31 UTC
Cannot be tested due to https://bugzilla.redhat.com/show_bug.cgi?id=1171452

Comment 27 Elad 2014-12-24 14:01:24 UTC
I managed to deploy hosted engine via iSCSI on a RHEL 6.6 host with the following packages installed:


libselinux-utils-2.0.94-5.8.el6.x86_64
libselinux-ruby-2.0.94-5.8.el6.x86_64
selinux-policy-targeted-3.7.19-260.el6_6.1.noarch
libselinux-2.0.94-5.8.el6.x86_64
libselinux-python-2.0.94-5.8.el6.x86_64
selinux-policy-3.7.19-260.el6_6.1.noarch

vdsm-4.16.8.1-4.el6ev.x86_64

ovirt-hosted-engine-ha-1.2.4-5.el6ev.noarch
ovirt-hosted-engine-setup-1.2.1-8.el6ev.noarch

sanlock-2.8-1.el6.x86_64

Comment 28 Elad 2014-12-24 14:02:49 UTC
* Used RHEV 3.5 vt13.5

