Proposal: reduce base font size from 10pt to 9pt for Karmic Koala release
mrmazda at earthlink.net
Sun Oct 11 00:19:05 UTC 2009
On 2009/10/10 21:01 (GMT+0100) Conn composed:
> Indeed. My laptop's LCD screen should be using 91dpi, but it uses 96dpi by default:
> conn at inspiron:~$ xdpyinfo | grep -e "\(resolution\|dimensions\)"
> dimensions: 1024x768 pixels (286x214 millimeters)
> resolution: 91x91 dots per inch
Don't make the common mistake of thinking the xdpyinfo result will always be
the same as what apps and the desktop think DPI is. The X resource Xft.dpi
(which may or may not be set) and 'xrdb -query | grep dpi' may report
differently, and most modern apps will prefer either or both of the latter
two to the one xdpyinfo finds when there is a difference.
> Most of the hardware I am testing runs at 1024x768 resolution, so I am
Odds are you're also biased by youth. Computer developers average younger
than the computer-using population in general, and younger people average
better vision.
Those who prefer smaller fonts are generally more sensitive to the effects of
anti-aliasing & hinting on the tiny text they prefer, compared to those with
vision on the other end of the scale, who just want their text to be big
enough to avoid pain if not be totally comfortable.
A 1px nominal size difference is four times the percentage difference at 96
DPI that it is at 192 DPI, measured against the pixels in the character box.
At 96 DPI a 9pt font is 12px nominal, or on average about 72 pixels per
character box (6 X 12), so one pixel is roughly 1.4% of the box. At 192 DPI,
the character box for 9pt has grown to an average of about 288 pixels
(12 X 24), where one pixel is only about 0.35%.
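To make that arithmetic concrete, here is a minimal Python sketch. The pt_to_px helper and the 2:1 height-to-width character box are illustrative assumptions matching the rough averages above, not exact font metrics:

```python
# Sketch only: nominal pixel height for a point size at a given DPI.
def pt_to_px(pt, dpi):
    return pt * dpi / 72  # one PostScript point = 1/72 inch

for dpi in (96, 192):
    height = pt_to_px(9, dpi)      # 12px at 96 DPI, 24px at 192 DPI
    box = (height / 2) * height    # assumed 2:1 box: ~72px at 96 DPI, ~288px at 192
    print(f"{dpi} DPI: 9pt = {height:.0f}px, box = {box:.0f}px, "
          f"1px = {1 / box:.2%} of the box")
```

One pixel is 1/72 ≈ 1.39% of the box at 96 DPI but only 1/288 ≈ 0.35% at 192 DPI, hence the "four times" figure.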
> However, why is 1024x768 the "wrong" environment? I'm sure that many
> people use this resolution, even in 2009.
There are plenty still using 1024x768 in the population, but few of them are
using new displays, and their numbers are dwindling. The replacements average
higher resolution, which means the average resolution is higher than
1024x768. All else being equal (which is not the case here), lower resolution
means larger text, while higher resolution means smaller text. Yours is on
the end of the range (larger) that is easiest for a user to correct when
correction is needed. As I wrote previously, when perfection is impossible
it's better to err larger than ideal than smaller, as those with the
too-small problem who need to do something about it typically face an ordeal
magnitudes bigger to fix it.
The "right" environment is equal to or higher than the real world average,
whatever it may be. We can't know precisely what that average is, but what we
do know is that average DPI for new hardware is higher than it used to be, is
higher than yours, and it continues to climb. So, better the sample
environment be higher resolution than lower.
We also need to remember it is not the nominal resolution that matters.
Resolution and pixels alone aren't enough information to communicate size.
The display size has to be taken into account to do that, and the result of
that combination is DPI.
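In Python terms, that combination looks like this. The figures are the ones from the quoted xdpyinfo output; physical_dpi is a hypothetical helper, not anything Ubuntu actually runs:

```python
MM_PER_INCH = 25.4

def physical_dpi(pixels, millimeters):
    # DPI = pixels divided by physical length in inches
    return pixels * MM_PER_INCH / millimeters

# From the quoted xdpyinfo output: 1024x768 pixels, 286x214 millimeters
print(round(physical_dpi(1024, 286)), round(physical_dpi(768, 214)))  # → 91 91
```

Both axes come out at about 91, matching the "resolution: 91x91 dots per inch" line in the quote.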
So to restate a couple points above in DPI:
1-your DPI, at 91, is on the low side of theoretical optimal (often termed
"logical", and often assumed; I'll get to "optimal" below), though not by a lot
2-average DPI has been increasing, and continues to increase
To add more:
3-the spread/range between high DPI and low DPI has been increasing. The most
expensive displays have ever higher DPI, while the low end new, plus legacy,
deviates less from yesteryear's average
4-as real DPI increases, fonts based on an assumed DPI shrink
5-as the price of a display increases, real DPI generally increases along with it
What happens as a result of 4 & 5 is that the people spending the most on a
new display tend to enjoy (or suffer, depending on point of view) smaller
fonts as a consequence of their larger outlays.
As to a choice to acquire a physically larger display, room for more objects
is what some prefer. Others' interest is making everything bigger. Still
others are after some mix of the two. Invariably the demo systems in stores
are running Windows, which normally assumes 96 DPI, but often assumes 120 on
the higher resolution laptop models. It takes some effort when shopping for a
new display to ensure that the goal can be met.
> I understand what you're saying, but there is still an issue present with
> our implementation of fonts on-screen. Ubuntu is not auto-detecting the
> correct DPI setting for each monitor - it is defaulting to 96dpi.
That's no surprise. Mandriva currently assumes 96 too. A point worth adding
is that even DPI doesn't mean much when viewing distance is not taken into
account. I don't think many people would be able to read a
physically 10pt font sitting on the opposite side of an average family room
from the TV screen hooked up to their computer, and certainly none sitting
beside a projector 30 meters from its projection screen.
I think Fedora assumes 96 too, and it would come as no surprise if no small
number of other distros do as well.
> It may be
> true that a 10pt DejaVu Sans font may look fine on an expensive display
> whose dimensions are 100dpi and higher
That's no small point either. Anti-aliasing & hinting are kludges designed to
work around the limited number of pixels available per glyph at low
resolution. The higher the resolution, the less they are of any use,
eventually becoming useless.
Now couple this fact with the above discussion of increasing average DPI. It
applies across the board, which means the average DPI used by a dev is
increasing, which in turn means the numbers of devs who will actually test at
low resolutions is decreasing. It used to be one display could do this, as it
was easy in Linux to switch resolutions to test, when most displays were
CRTs. Now, native resolutions on flat panels limit that ability, as
non-native resolutions on most of them provide disproportionately poorer
results that just aren't representative of output on a lower resolution panel
or a CRT.
Also speaking of DejaVu, does your opinion of 9pt vs 10pt change if you
switch from DejaVu Sans to Liberation Sans as your systemwide preferred font?
>, but the issue remains that such
> displays are /still defaulting to 96dpi in Ubuntu/.
http://blogs.msdn.com/fontblog/archive/2005/11/08/490490.aspx explains where
96 came from in the first place. 96 is widely assumed among popular distros,
largely because that's what Windows does (& Mac too, though on Mac there is
not yet any usable option for anything other than 96), and what Windows does,
and has long done, has had a widespread trickle-down effect that makes good
justification for assuming it. It's even in the W3C CSS specs.
Because on Windows so many for so long have assumed 96, and never taken into
account that 96 may be inappropriate, many have taken to calling 96 optimal,
as use of anything else breaks the poor designs that assume it. I think this
is no small part of why it's in the CSS specs.
A common example of the problem of not assuming some minimum occurs with
projection and large TV screen users. It's not unusual for physical DPI in
those environments to be 48 or less. If you wanted a 10pt font at an enforced
48 DPI you'd need to do it with only 6.67 pixels (nominal) to work with! With
most fonts it takes a minimum of 9px nominal character box size to render an
entire glyph set legibly. With less than nine, there simply are not enough
dots in the box to draw a complete legible glyph for every character, and
even for those that can, attractive is rarely a good description of the result.
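A quick Python sketch of that arithmetic, using the same 1/72-inch point definition as above and flagging the ~9px legibility floor the text describes:

```python
def pt_to_px(pt, dpi):
    return pt * dpi / 72  # one point = 1/72 inch

for dpi in (48, 96):
    px = pt_to_px(10, dpi)
    note = " -- below the ~9px legibility floor" if px < 9 else ""
    print(f"10pt at {dpi} DPI = {px:.2f}px{note}")
```

At an enforced 48 DPI, 10pt yields only 6.67 nominal pixels, well under the nine needed to render a full glyph set legibly.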
96 is not good when actual DPI is higher. What installation configurators
should be doing is assuming 96 only when actual DPI is equal to or less than
96; when actual DPI is materially higher than 96, they should either use a
higher assumption or use the actual value.
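That policy can be sketched as follows. logical_dpi is a hypothetical helper for illustration, not a function any existing configurator provides:

```python
def logical_dpi(physical_dpi):
    # Fall back to 96 when EDID/DDC gave nothing usable, or when the
    # physical value is at or below 96; otherwise track the higher
    # physical value so text doesn't shrink on dense displays.
    if physical_dpi is None or physical_dpi <= 96:
        return 96
    return physical_dpi

print(logical_dpi(None), logical_dpi(91), logical_dpi(147))  # → 96 96 147
```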
> It seems to me that we need to configure Ubuntu to set the appropriate DPI
> computed against on the EDID-supplied dimensions of the default monitor. The
> EDID-derived values may not match 100% if you physically measure, but it's
> better than sticking to the static setting of 96dpi.
96 is a good fallback for when EDID/DDC is broken and/or missing, and for
many a good assumption if and only if physical DPI, whether from EDID/DDC or
otherwise, is lower. When physical DPI is above 96, applied/logical DPI
should be higher, whatever it takes to get there.
> I'm sympathetic to accessibility issues, but on the other hand, we don't
> ship with the High Contrast theme by default, either. I have some issues
> with eyestrain, but a 9pt DejaVu font at 96dpi (or 91dpi) on a 1024x768
> resolution does not seem too small for me, personally.
What's your age? Do you think you have average vision? What distance are your
eyes from your screen? Acuity and distance play a huge role in the
appropriate size. Laptop users tend to keep the screen closer to the eyes,
which somewhat makes up for laptops' higher average DPI.
There's yet another issue when speaking of DejaVu. Its glyphs, like
Verdana's, at least in x-height, are larger than average. When someone
chooses just about anything other than the DejaVu Sans default, it can be
like reducing pt size by as much as 2pt.
> I obviously don't know as much as you with regards to fonts, but I would
> imagine that if you take a ruler and measure the height of a 10pt font on
> two monitors with different dimensions, ideally the fonts should display
> with the same physical measurements on-screen.
If viewing distance is to be ignored, then that's probably true, but it can
be pretty difficult to ignore the viewing distance factor.
> This is not happening in
> reality, as GNOME is not setting the correct DPI for the connected monitor.
The general consensus among devs and theorists seems to be that on computer
screens logical DPI and physical DPI usually do not need to match, as M$
decided roughly two decades ago. At least with Linux, unlike on Windows and
Mac, those who do want them to match aren't so seriously obstructed in
achieving it. Note that on http://fm.no-ip.com/auth/dpi-screen-window.html I
use the word "accurate", and neither the words "properly" nor "correct", for
describing the DPI encountered.
> I dual-boot between XP and Ubuntu, and use a bookmark synchronization
> extension for Firefox. Looking at my Firefox bookmarks toolbar (which is
> identical in layout between operating systems), Windows XP can fit about 10
> bookmarks on-screen, whereas Ubuntu can fit only 7, or at most 8 with a
> short title. The entire interface of Ubuntu looks unnecessarily bloated.
You're a sample of one that seems to prefer smaller than the population at
large, when in general for web pages at least an average web user prefers a
minimum of 10pt and is more often comfortable with 12pt
<http://www.surl.org/usabilitynews/22/font.asp>. There is no single right
answer for the whole population. Whatever anyone's opinion about size may be,
it is unaffected by the fact that deference needs to be given to those less
able to see well.
This is one of the major advantages computers have given people over
newspapers, books and magazines as information sources. Adjustability to
taste is built in - personal computers can be, and are, personalized by a
significant non-zero number of their users.
> As I posted in my previous comment, my screen should use 91dpi rendering. I
> have manually set this value, and while fonts looks slightly more compact
> compared to the default setting, /in my opinion/, 10pt is still too bloated.
Well, I can't tolerate sitting with my eyes only 10 inches from my screen, so
I can't have that opinion. ;-)
" A patriot without religion . . . is as great a
paradox, as an honest man without the fear of God. . . .
2nd U.S. President, John Adams
Team OS/2 ** Reg. Linux User #211409
Felix Miata *** http://fm.no-ip.com/
More information about the Ubuntu-devel-discuss mailing list