Ubuntu has gone!
habtool at gmail.com
Tue Apr 21 12:07:10 UTC 2009
On Tue, 2009-04-21 at 07:37 -0400, Juan De Mola wrote:
> To update my system. At home I don't have Internet, and not everything
> is a 113 kb download over GPRS.
> This is not the first time I've done an install/update with deb files
> from stable. But this time apt led to the deletion of the most
> important files, including the kernel. I think that the way
> dependencies are checked must be redesigned.
> But I want to restore my system to play what my Windows can't.
This email is a bit long, sorry for that in advance:
I feel for you not having internet at home, that must be the pits.
I would still advise you not to use Lenny deb files with Ubuntu, unless
you know for sure what you are doing. This is going to result in more
pain than it's worth, IMO.
Ubuntu and Debian are NOT binary compatible.
Let's see what others on the list come up with, but you may need to do
a fresh install. When we have ZFS (go on Oracle, do the right thing) or
Btrfs file systems with snapshots taken during apt-get upgrade, you
will be able to just roll back to a time prior to the failed update.
Trying to do it now may be very difficult.
If you have enough HDD (hard drive) space, use a program like partimage
to make a backup of an unmounted Ubuntu install.
If you are using ext4, note that partimage does not yet support ext4;
use Fsarchiver instead to make an exact backup clone.
If something goes wrong, you then restore the cloned image and you are
back to the last snapshot/clone you made.
(Please note that Fsarchiver does not restore GRUB to the partition it
is restoring to, so you would need to learn how to reinstall GRUB
yourself afterwards.)
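A minimal sketch of that backup/restore cycle (my example, not from the
original mail; device and image paths are hypothetical, and the commands
are printed rather than executed, since the real runs need root and an
unmounted partition, e.g. from a livecd):

```shell
#!/bin/sh
# Sketch of the backup/restore cycle above. The device and image paths
# are hypothetical examples -- adjust to your own layout. The commands
# are printed, not run: the real thing needs root and an unmounted
# partition (e.g. run from a livecd).
DEV=/dev/sda2                        # the Ubuntu root partition
IMG=/media/backup/ubuntu-root        # where to keep the image

cat <<EOF
# 1. Back up (partition must be unmounted):
partimage -z1 save $DEV $IMG.partimg.gz
#    ...or, on ext4, with fsarchiver:
fsarchiver savefs $IMG.fsa $DEV
# 2. Restore after a broken upgrade (partimage appends a volume
#    suffix such as .000 to the file it wrote):
partimage restore $DEV $IMG.partimg.gz.000
#    ...or:
fsarchiver restfs $IMG.fsa id=0,dest=$DEV
EOF
```

Remember to reinstall GRUB yourself after an Fsarchiver restore, as
noted above.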
You can also look at:
This has partimage and Fsarchiver on the livecd.
Another nice tool, for ext3, is:
Here you will need to add partimage:
sudo apt-get install partimage
(But not having internet is always going to be a curse!)
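Since the original problem is exactly the no-internet one, here is a
common offline-update workaround (my sketch, not from the thread; file
names are hypothetical): have apt print the URLs it would fetch,
download them on a connected machine, and install from the local files.

```shell
#!/bin/sh
# Offline-update sketch (a common workaround, not from the original
# mail). File names here are hypothetical.
URIS=uris.txt

# 1. On the offline machine: list the .debs an upgrade would fetch,
#    without downloading anything ("apt-get update" and
#    "apt-get install foo" accept --print-uris too). "|| true" keeps
#    the sketch going on a box without apt.
apt-get -qq -y --print-uris upgrade > "$URIS" 2>/dev/null || true

# 2. apt prints lines like: 'http://...deb' filename size checksum
#    Strip the quoting to get a plain URL list for wget -i.
awk -F"'" 'NF > 1 {print $2}' "$URIS" > urls.txt
echo "on a connected machine: wget -i urls.txt"

# 3. Back home, copy the .debs into apt's cache and rerun the upgrade:
echo "then: sudo cp *.deb /var/cache/apt/archives/ && sudo apt-get upgrade"
```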
Another interesting tool is 'Back In Time':
Another option would be to use LVM, but here you're on your own; I have
only played with it a bit and, as a home user, haven't stuck with it
thus far.
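If you do experiment with LVM, the snapshot idea looks roughly like
this (my sketch; volume-group and LV names are hypothetical, the
commands are printed rather than executed since lvcreate needs root,
and the rollback merge needs a newer LVM2 release):

```shell
#!/bin/sh
# LVM snapshot sketch. VG/LV names are hypothetical; the commands are
# printed, not run, since they need root and a real volume group.
VG=vg0
LV=root
SNAP=pre-upgrade

cat <<EOF
# Take a snapshot just before the risky upgrade:
lvcreate --snapshot --size 2G --name $SNAP /dev/$VG/$LV
apt-get dist-upgrade
# If the upgrade broke things, roll back by merging the snapshot
# (lvconvert --merge, in newer LVM2 releases):
lvconvert --merge /dev/$VG/$SNAP
# If all went well, drop the snapshot instead:
lvremove /dev/$VG/$SNAP
EOF
```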
Bottom line: if you are going to mix Debian repos with Ubuntu, you are
certain to run into pain, hence the need for a snapshot solution.
Having a partimage or Fsarchiver compressed snapshot is a really good
practice, especially if you have HDD space to spare.
Some info from Mark Shuttleworth, Ubuntu founder:
What about binary compatibility between distributions?
A lot has been said about the fact that Debian is not binary-compatible
with Ubuntu. Sometimes this manifests itself as "I can't install Ubuntu
packages on Debian", sometimes it's more "Why does Ubuntu use GCC 4 when
Debian is using GCC 3.3?". Or "Why is the kernel and glibc on Ubuntu
5.04 different from that in Debian Sarge?". I'll try to address all of
those.
I'll start with our general policy and approach, and then examine some
of those examples in detail.
First, "binary compatibility" means different things to different
people. If you've followed the trials and tribulations of the LSB
standards process, you'll understand how difficult it is even to
*define* binary compatibility in a meaningful way across multiple
distributions. That, in essence, is why we don't set "binary
compatibility" as a goal for Ubuntu. Sometimes it happens, but if so,
it's because there was an opportunity to make something work nicely -
not because it's a hard goal. We take opportunities for binary
compatibility across distributions where we can find them, but we don't
constrain ourselves by making that an absolute requirement.
Just to be clear, I'll say it again, for the record. We don't aim for
"binary compatibility" with any other distribution. Why?
In short, because we believe in Free Software as a collaborative process
focused on SOURCE CODE, and consider it superior to the proprietary
process which is focused on specific applications and binary bits. We
choose to devote the large majority of our energy to the improvement of
the source code that is widely and freely available, rather than trying
to work on binary bits that cannot be shared as widely. When we spend
hours of time on a feature, we want that work to be usable by as many
other distributions as possible, so we publish the source code in "real
time" as we publish new versions of the packages. We go to great lengths
to make those patches widely available, in an easy to find format, so
that they will be useful to upstreams, and other distributions. That
benefits Debian, but it also benefits SUSE and Red Hat, if any of them
are willing to take the time to study and apply the patches.
We synchronise our development with upstream, and with Debian, and with
other distributions such as SUSE and Gentoo and Mandrake and Red Hat, on
a regular basis. We draw code from the latest upstream projects (which
might not even be in Debian, or in Red Hat, or addressed in the LSB). We
try to merge with Debian Unstable (a.k.a. Sid) every six months. We have
no control over the release processes of other distributions, nor of
upstream, so it would be impossible for us to define in advance an API
or ABI for each release. We are in the hands of hundreds of other
developers every time we freeze Ubuntu in preparation for a new version.
Even though the Ubuntu community is substantial and growing rapidly, it
is still tiny compared to the total number of developers working on all
the free software applications that make up the distribution itself. Our
job is to package what is there, efficiently and cohesively, not to try
to massage it to some pre-defined state of compatibility. We focus on
delivering the newest-but-stabilised-and-polished versions of the best
open source applications for your server or desktop. If we were to set
binary compatibility (at any level) as a top priority, it would
massively diminish our ability to deliver either newer software, or
better integration and polish. And we think our users care most about
the fact that they are getting the best, and most integrated, apps on
their server or desktop.
It is worth noting that the Linux kernel itself takes the same approach,
shunning "binary compatibility" in favour of a "custom monolithic
kernel". Each release of the kernel requires that it be compiled
separately from previous releases. Modules (drivers) need to be
recompiled with the new release; they cannot just be used in their
binary form. Linus has specifically stated that the monolithic kernel -
based on source code, not trying to maintain a binary interface for
drivers across releases - is better for the kernel. We believe the same
is true for the distribution.
So the imperative to work with very current code overrides the idea of
maintaining compatibility with a specific ABI, especially if we have
little or no say in the ABI we should be trying to remain compatible
with.