Advice on getting a computer lab server
Gavin McCullagh
gmccullagh at gmail.com
Fri Jan 11 14:39:47 GMT 2008
Hi,
On Thu, 10 Jan 2008, Joseph Hartman wrote:
> I actually started rolling out an edubuntu lab earlier this year and it
> became painfully clear early on that the server I have (3.0 GHz P4 Dell
> with 4GB Dual Channel DDR RAM) was not going to be adequate to handle all
> the thin clients.
How many was that? I had understood from documentation that you should be
able to get up to 25-30 with that.
> I have since dedicated my efforts towards setting up MiniLANs in each
> teacher's classroom with the P4 servers handling just 3 to 6 thin
> clients. This seems to be working alright, although things are still
> pretty slow.
Another thing to consider is the disks. IDE disks and controllers are
pretty poor at handling multiple concurrent users; if you're going to have
many users, SCSI is generally the way forward. That said, 3-6 clients
running openoffice/firefox should probably be okay with IDE and a Pentium
4, I would have thought. You should perhaps try to check whether the
problem is lack of memory or CPU on the server. Highly animated
applications (tuxtype, tuxmath, gcompris, video playback, etc.) will cause
graphics slowdowns, which could be your problem.
> I was hoping to use Kino and Blender with the middle schoolers at some point
> in the future, but it isn't a big deal to postpone those plans if need be.
Intensive video stuff sounds like something which might not work well.
> In my current lab if I try to run more than about 7 thin clients they freeze
> up just playing flash games or running tuxtyping.
Problems with these applications could indicate network bandwidth trouble,
particularly if the clients all share, say, a cheap 100Mbit hub.
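If you want a rough check for that, the per-interface byte counters in
/proc/net/dev can be sampled while the clients are busy. A minimal sketch
(it defaults to "lo" so it runs anywhere; on the server you would pass the
real interface name, e.g. eth0 — that name is an assumption about your
hardware):

```shell
#!/bin/sh
# Rough received-throughput check by sampling the byte counters in
# /proc/net/dev. Defaults to "lo" so it runs anywhere; on the LTSP
# server you would pass the real interface, e.g.:  ./rxrate.sh eth0
IFACE=${1:-lo}

rx_bytes() {
    # /proc/net/dev sometimes fuses the name and first counter
    # ("eth0:12345"), so strip everything up to the colon first.
    grep "$IFACE:" /proc/net/dev | sed 's/^.*://' | awk '{print $1}'
}

BEFORE=$(rx_bytes)
sleep 2
AFTER=$(rx_bytes)

# Average received rate over the 2-second window, in KB/s.
echo "rx: $(( (AFTER - BEFORE) / 2 / 1024 )) KB/s on $IFACE"
```

Run it during a busy lesson; sustained numbers near the link's capacity
(about 12,000 KB/s for 100Mbit) suggest the network is the bottleneck.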
There have also been reports of tuxtype and tuxmath crashing thin clients
so perhaps that problem is affecting you (one which might not go away with
bigger hardware). Seemingly turning sound off on the client can remedy
this issue (I'm not saying that's the answer, but it might help you
identify the problem).
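For reference, if you want to test the sound theory, LTSP's lts.conf lets
you turn sound off globally or per client. A hedged sketch — the file
location varies by release (often under /opt/ltsp/i386/etc/ on Edubuntu of
this era) and the MAC address below is a placeholder:

```ini
# lts.conf fragment -- sketch only; path and MAC address are placeholders.
[default]
    SOUND = False        # disable sound on all thin clients

# Or disable it for a single client, keyed by its MAC address:
[00:11:22:33:44:55]
    SOUND = False
```

If the crashes stop with sound off, that points at the sound path rather
than the server hardware.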
> Like I said, I have 4GB of Dual Channel DDR RAM so is the bottleneck with
> my 10/100 switch or my P4 CPU? Until now I've thought it was the CPU
> because when I look at the system monitor the CPU maxes out pretty quick.
> (maybe I should bring my c2d in from home and try it out to see how it
> holds up)
I'm afraid it's hard to pinpoint from here. You need to look at ways to
measure RAM usage (the output of the "free" command is a start) and CPU
load ("uptime" and "top"). If you have some way to measure network load,
that would also be interesting to see.
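As a starting point, something like the following gives a one-shot snapshot
of memory, load and CPU while the clients are busy (all standard commands
on an Ubuntu/Debian server):

```shell
#!/bin/sh
# One-shot snapshot of server memory, load average and top CPU consumers.

echo "=== Memory (MB) ==="
free -m                      # check free vs buffers/cached, and swap use

echo "=== Load average ==="
uptime                       # 1/5/15-minute load vs number of CPUs

echo "=== Top CPU consumers ==="
# Batch mode, one iteration, so it works from a script or cron job.
top -b -n 1 | head -n 15
```

A load average persistently above the number of CPUs, or heavy swap use in
the "free" output, would point at CPU or RAM respectively.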
Munin is a very useful thing for monitoring usage of bandwidth, disk, cpu
load, ram usage, etc.
http://www.debian-administration.org/articles/229
http://www.howtoforge.com/server_monitoring_monit_munin
> If I want to use the above programs and keep my current curriculum will a
> single server with 2 dual core xeons and 8 GB RAM be fast or just
> adequate? I want the lab to be fast. (obviously I'm just asking your
> opinion here)
If the lab is 30-35 users I would hope so but our usage tends to be more
sporadic so I don't have enough experience to be sure. A couple of other
large installs (Jim K has 70 thin clients?) have used several bonded
Gigabit interfaces in order to give the server >1Gb/sec access to the
network.
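For what it's worth, on Debian/Ubuntu this kind of bonding is configured
with the ifenslave package and /etc/network/interfaces. A sketch under
assumptions — the interface names eth0/eth1, the address, and the bonding
mode all depend on your hardware and switch:

```
# /etc/network/interfaces fragment -- sketch only; requires the
# "ifenslave" package. eth0/eth1 and the address are placeholders.
auto bond0
iface bond0 inet static
    address 192.168.0.254
    netmask 255.255.255.0
    bond-slaves eth0 eth1
    bond-mode balance-rr      # round-robin; 802.3ad needs switch support
    bond-miimon 100           # link-check interval in milliseconds
```

Note that some modes (like 802.3ad) need matching configuration on the
switch, while balance-rr generally does not.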
> Finally, what kind of switch should I get? Should I go all gigabit or is it
> enough to just be gigabit to the server and 10/100 to the clients? I do a
> lot of flash based reading games and stuff with the lower grades like
> starfall.com and so I absolutely must have these activities perform well.
The server really should have at least one gigabit interface. Limiting
each client to 100Mb/s is perhaps not a bad thing, as it stops a rogue
client from saturating the server's network card and in principle leaves
the other 900Mb/s for the other clients.
Gavin
More information about the edubuntu-users
mailing list