Default Desktop Experience for 11.04

Sean McNamara smcnam at gmail.com
Sat Apr 9 16:58:22 UTC 2011


Oops -- I meant to send this to the whole list, not just to Timo!
Sorry for the double mail, Timo!


Sean

On Sat, Apr 9, 2011 at 12:57 PM, Sean McNamara <smcnam at gmail.com> wrote:
> Hi,
>
> On Sat, Apr 9, 2011 at 5:07 AM, Timo Jyrinki <timo.jyrinki at gmail.com> wrote:
>> 2011/4/8 Timo Jyrinki <timo.jyrinki at gmail.com>:
>>> There are a lot of bugs and missing features (though many have
>>> already been fixed as well), and performance is quite bad in places,
>>> but those are not as serious as a) crashers and potentially b)
>>> accessibility and the lack of any help.
>>
>> Just reflecting on the more recent posts, I'm using Unity on my work
>> machine, but I only have 1280x1024 (or 1400x900, depending on whether
>> I use the internal or external display) resolution. I mostly run
>> everything full screen, and most of the time I don't use Unity to
>> switch between windows, at least not yet - I use either alt-tab
>> (somewhat annoyingly slow) or compiz's scale plugin, which I have
>> bound to the lower right corner of the screen (works pretty nicely
>> for me).
>>
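> For reference, that corner binding can also be set from Python through
> the old gconf backend. A rough sketch only -- the key path and value
> below are from memory and may well differ (Natty's compiz 0.9 uses a
> different config backend, where ccsm is the safer route):
>
>     import gconf
>
>     # Assumed key path for the scale plugin's edge trigger (compiz
>     # 0.8-style gconf layout); adjust if your version lays it out
>     # differently.
>     KEY = "/apps/compiz/plugins/scale/allscreens/options/initiate_edge"
>
>     client = gconf.client_get_default()
>     client.set_string(KEY, "BottomRight")  # fire scale at that corner
>     print(client.get_string(KEY))          # confirm the value stuck
>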
>> As for the "what I say to others" factor, I (and the Ubuntu Finland
>> website, as we decided already around 10.10 time) continue to
>> recommend Ubuntu 10.04.2 LTS for everyone. I don't believe users
>> should generally install a non-LTS Ubuntu, even with the caveat of
>> missing support for the newest hardware. 18 months of security
>> support, and therefore the need to upgrade N times to reach the next
>> LTS, is too much for many, since upgrading is still something of a
>> hassle at times, regressions appear, et cetera.
>
> I don't think it makes sense to lower the quality bar for Ubuntu
> non-LTS stable releases. If you want to put something out there that's
> rough around the edges, that's what alphas, betas, and (to some
> extent) RCs are for. Stable releases should be just that: stable. If
> Unity can't meet everyone's expectations for quality, it shouldn't go
> into a release by default, whether it's an LTS or not.
>
> I know a lot of developers stand by the argument that, in order to get
> enough testing and popularity for their software to be successful, it
> needs to be released by default to the general public in a major
> distribution. I've heard appeals to past experience with things like
> PulseAudio and Empathy. While I agree that using the general public as
> a massive test bed can be beneficial to software quality, it also
> harms the image of the GNU/Linux desktop at the same time. When things
> don't work right, users don't understand why; they don't care why, and
> they don't want to know. They simply associate the product they were
> using with terms such as "buggy" or "unusable" and move on. This
> really hurts our progress towards resolving Bug #1.
>
> Aside from that, PulseAudio was somewhat of a special case: its
> primary growing pains could *only* be resolved by widespread testing.
> The problem was that ALSA didn't have the necessary quirks for a great
> number of sound cards out there, resulting in incorrect volume
> information and timing in PulseAudio. The key point is that *no other
> software before* had ever tried to use ALSA like that, so the bugs
> were exposed left and right. Can we say the same thing about Unity and
> the open source graphics stack? I don't think so.
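>
> To make that concrete: the "volume information" in question is the dB
> data a driver reports to ALSA, which PulseAudio relies on for its
> volume handling. A rough way to eyeball what your card reports
> (assuming Python and amixer from alsa-utils are installed; this is
> only an illustration, not how PulseAudio itself reads the data):
>
>     import subprocess
>
>     # Dump all simple mixer controls for card 0 and keep the lines
>     # carrying range/dB information; controls with missing or bogus
>     # dB ranges are the sort of thing unquirked drivers got wrong.
>     out = subprocess.check_output(["amixer", "-c", "0", "scontents"])
>     for line in out.decode("utf-8", "replace").splitlines():
>         if "Limits" in line or "dB" in line:
>             print(line.strip())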
>
> The issues you see with Unity vs. Mesa/KMS/Xorg/etc. manifest
> themselves in plenty of other programs, including compiz on GNOME 2,
> Mutter on GNOME 3, and a smattering of OpenGL 3D games. We have
> already been shipping the open source graphics stack (OSGS) by default
> for many years, so it is already exposed to a wide audience, and
> advanced 3D gets exercised to the extent that users enable compiz
> and/or play 3D games. The main benefit of widespread Unity testing in
> a stable release -- getting a wide array of hardware to test it on --
> therefore seems largely redundant, given that other programs have been
> exercising the graphics stack for years now. The consensus has always
> been that the OSGS is woefully inadequate, and any improvement we've
> seen in the past few years has been completely orthogonal to the
> development of Unity.
>
> So my opinion is: spend as much time as it takes working with a
> smaller group of enthusiasts, early adopters, and developers to
> perfect your software before releasing it through a channel where
> millions of people will try it. Ubuntu is the most-downloaded
> distribution, so the
> software quality contained therein reflects strongly on the public
> perception of the quality of GNU/Linux and FOSS in general. If Unity
> is crashing, lagging and behaving counter-intuitively on the "latest
> and greatest" Ubuntu release, this does not bode well.
>
> How much time is "as much time as it takes"? I don't have a good
> answer; the folks who QA Unity probably have the best sense of that.
> But I do know that Gnome 3 extended its release date multiple times
> and spent many, many years in development before it was finally
> released as stable. And now, running Fedora 15 *Alpha* with Gnome 3.0
> feels more polished than Unity *beta*.
>
>> Unity still maturing is just one factor that contributes to this, but
>> I wouldn't have any problem recommending 12.04 LTS with Unity to
>> everyone: since it's going to be "ok" already in 11.04 (and a
>> fabulous effort / rewrite since 10.10), it's a piece of cake to
>> believe it will keep improving. Not that I would have any problem
>> with gnome-shell either; it's becoming great nowadays as well.
>
> Goodness, I *hope* by 12.04 that Unity will be "ok"! :) It'll still be
> a big adjustment for users (even bigger than GS), but if you throw
> away 98% of the bugs, performance issues, defects triggered in the
> open source graphics drivers, etc., you're left with a rather nice DE
> -- *for some use cases*.
>
>>
>> I know that as a power user I'm from the more "adjusts to the
>> environment" end of the scale. I don't need to keep doing things the
>> way I've done them before, and I usually stick quite close to the
>> shipping defaults. I did have focus-follows-mouse though, which I
>> disabled since it worked so poorly with Unity :P Of course I wouldn't
>> keep using Unity if it hadn't improved over the last month like it
>> has, but application launching via the Super key or Alt-F2 really
>> starts to work now, better than it ever did in GNOME 2. Still too
>> laggy and it does not always "just work", but most of the time it's
>> neat.
>
> Hmm... Alt+F2 and Super seem to do the same thing on Gnome-Shell as on
> Unity. Just noting this for anyone reading up on Unity vs Gnome3 :)
>
> I haven't updated my Natty test box for 2 weeks or so, and that's on
> my multi-monitor system with 1920x1200 and 1680x1050 monitors. I think
> I will try out the latest in a virtual machine on top of my Fedora 15
> install on my 1024x768 laptop, and disregard any crashes or
> performance issues I run into -- just for the sake of testing out the
> latest UX.
>
> When Natty goes into RC, I will do more in-depth research into the UX
> differences between Unity (as it appears in Natty) and Gnome 3 + GNOME
> Shell (as it appears in Fedora 15), and publish it so others can
> compare and decide for themselves :) My intuition is that the two
> projects will converge rather than diverge (in terms of feature set)
> over time, so I'll try to update the list as both projects evolve.
>
>
> Sean
>
>>
>> -Timo
>>
>> --
>> ubuntu-desktop mailing list
>> ubuntu-desktop at lists.ubuntu.com
>> https://lists.ubuntu.com/mailman/listinfo/ubuntu-desktop
>>
>


