Concerned about future 32 bit server support
Xen
list at xenhideout.nl
Tue Aug 8 20:41:08 UTC 2017
Robert Heller wrote on 08-08-2017 22:06:
> Just as another datapoint: Red Hat has dropped 32-bit support starting
> with RHEL 7. But the CentOS team is building 32-bit kernels and has a
> distro for i686 AND ARM based on RHEL 7, I believe mostly done by
> people who have a specific need for such systems. I expect that
> eventually all of the major distros will stop distributing *binaries*
> for 32-bit x86, just as they have stopped distributing 68K, Alpha,
> ppc/ppc-64, etc. I believe non-PAE 32-bit kernels are no longer
> available stock, as well as <i686 kernels -- is anyone still running
> 80386's? '486s or '586s? Even though I believe current kernels still
> have support for these processors, if one were willing to [cross?]
> build them.
I'm not sure what the difference is between current-day 32-bit (x86) and
those old CPUs...
But the main reason for 32-bit today, I am sure, is its lower memory
footprint.
There are many embedded systems, I believe, that still want to be
32-bit for this reason, and I would guess the memory usage of 64-bit is
easily 30% higher.
Java doesn't care much: with compressed oops, the JVM uses 32-bit
object references even on 64-bit platforms, as long as the heap stays
small enough (roughly 32 GB with the default settings).
But everything else appears to use vastly more memory: under the LP64
model used by 64-bit Linux, all pointers and long integers become
64-bit, even though a plain int stays 32-bit.
Personally I must say I never felt good about 64-bit for some reason
;-).
Of course, without 64-bit you also cannot (really) have a large address
space, although large file support itself is not constrained to 64-bit
systems.
But even today if I had a netbook or something with less than 4GB of
memory I would want to use a 32-bit version of whatever I use.
Well, that's just me, but I think it is pretty common-sensical.
Curiously enough, the C type uint_fast32_t did the opposite of what I
expected: on glibc x86-64, at least, it is actually a 64-bit integer
(unsigned long), while the fixed-width uint32_t is guaranteed to stay
32-bit. But I am not a very good C programmer.
Maybe x64 CPUs are fast enough with 32-bit code that it doesn't
matter; internally they are essentially RISC CPUs anyway.
Anyway, the push for more memory seems to me to sit in the domain of
"not really needed". Computers advanced in speed until around the 2008
generation of Athlon X2 CPUs, at which point I thought "now they are
fast enough", and that kind of platform is one many people still use
today.
E.g. graphics cards such as the GTX 1080 are so incredibly fast that
there is almost no point to them anymore. No one actually needs one.
Software can be built to match that power, but there is barely any
reason for it.
Regards.
More information about the ubuntu-users mailing list