why did Ubuntu turn off kernel preemption in Edgy?

Matthew Garrett mjg59 at srcf.ucam.org
Tue Nov 28 21:50:53 GMT 2006


On Tue, Nov 28, 2006 at 03:01:35PM -0500, Phillip Susi wrote:

> That isn't because the brain can detect the flicker of the screen, it is 
> because the flicker of the screen causes a beat with the flicker of the 
> incandescent light bulbs in the room which are flickering at 60 Hz. 
> When the screen is refreshing at 65 Hz and the lights are flickering at 
> 60 Hz, you get a 5 Hz beat which is quite noticeable.  Conventional 
> televisions operate on the principle that the brain perceives only 29 
> images per second as a continuous flowing scene.

There are multiple issues here, and it's important to distinguish them.

The first is the rate at which humans begin to interpret a series of 
static images as movement. This varies - 12 frames per second may 
suffice for slowly moving objects, and 16 is adequate for most 
purposes. Cinema is shot at 24 frames per second, which will almost 
always produce the impression of smooth movement. However, this 
doesn't mean that humans are unable to tell 24 frames per second from 
continuous motion. If you point a film camera at a wheel and 
accelerate it, viewers will see the wheel appear to speed up, slow 
down and then reverse - the familiar wagon-wheel effect, a temporal 
aliasing artifact of sampling the rotation at 24Hz. The brain is able 
to process information at a greater rate than 24 frames per second, 
and the missing information is noticed under certain circumstances.
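
The wagon-wheel illusion is straightforward temporal aliasing, and 
the arithmetic is easy to sketch. The following Python snippet (my 
own illustration, not something from this thread) folds a wheel's 
true rotation rate into the range that a 24 frames-per-second camera 
can represent:

# Wagon-wheel effect as temporal aliasing: a rotation rate sampled
# at 24 frames per second folds into [-12, +12) Hz; negative values
# mean the wheel appears to turn backwards.

FPS = 24.0  # film frame rate

def apparent_rate(true_hz):
    """Fold a true rotation rate (Hz) into the representable range;
    a negative result means apparent reverse rotation."""
    return (true_hz + FPS / 2) % FPS - FPS / 2

# As the wheel accelerates, the perceived motion speeds up, slows,
# reverses, and finally looks stationary again at 24 Hz.
for hz in range(0, 25, 4):
    print("true %2d Hz -> apparent %+5.1f Hz" % (hz, apparent_rate(hz)))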

The second issue is the refresh rate at which humans will perceive 
flicker. This varies between individuals, but eliminating flicker 
requires a refresh rate well above 24Hz. Cinema projectors typically 
illuminate each frame twice, giving a refresh rate of 48Hz; more 
modern equipment may flash each frame three times, for 72Hz. Whether 
flicker is noticeable depends on a combination of factors, including 
screen brightness, screen size and how tired the viewer is.

This is certainly not primarily determined by any beat effect resulting 
from interaction with AC-powered lighting. A 60Hz CRT appears just as 
flickery in the EU, where the AC frequency is 50Hz.
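
Taking the quoted figures at face value, a beat rate is simply the 
difference between the two frequencies: a 65Hz refresh against 60Hz 
lighting gives |65 - 60| = 5Hz, while the same 60Hz CRT under 50Hz 
mains would beat at |60 - 50| = 10Hz. If the beat were the dominant 
effect, flicker would look markedly different on the two sides of the 
Atlantic, and it does not.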

But this is all pretty irrelevant when it comes to audio. Both of 
these concepts (apparent motion and refresh rate) are related to 
visual processing. There is no single notion of "response time" that 
applies to every means of sensory input. Arguing that because humans 
can perceive flicker when the interval between successive updates is 
around 17ms, they must also be able to perceive 17ms of latency in 
audio rests on a false assumption: the two things have absolutely 
nothing to do with each other. The brain is entirely capable of 
processing sound at up to around 20kHz - that doesn't mean that we 
need CRT refresh rates to be that high in order to avoid flicker.
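
To make the distinction concrete, here is a small Python sketch (my 
own illustration, not part of the original discussion) computing the 
two numbers that get conflated in that argument:

# Bandwidth and latency are different quantities. Hearing a 20kHz
# tone means resolving a 50 microsecond waveform period; it implies
# nothing about whether a 17ms event-to-sound delay is noticeable.

refresh_hz = 60.0
frame_interval_ms = 1000.0 / refresh_hz  # ~16.7ms between CRT updates
tone_period_us = 1000000.0 / 20000.0     # 50us period of a 20kHz tone

print("Interval between 60Hz screen updates: %.1f ms" % frame_interval_ms)
print("Period of a 20kHz tone: %.0f us" % tone_period_us)

# The first number is an end-to-end delay, the second a signal
# bandwidth; comparing them directly is the false assumption above.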

However, beyond dispelling various myths, I'm afraid I can't 
contribute much of use to the conversation - most of the 
neurophysiology I've studied was related to the visual system. That 
said, there's certainly published literature on what sort of audio 
latency is perceivable, and I can try to dig some of it out if people 
are interested.
-- 
Matthew Garrett | mjg59 at srcf.ucam.org


