[ubuntu-uk] regarding for solution
Mark Harrison
Mark at ascentium.co.uk
Wed May 16 17:18:10 BST 2007
fatma oymak wrote:
> Dear all,
>
> I have one problem....I couldn't find the right answer...do you have any
> idea? please please let me know
> please please let me know
>
> many thanks
> fatma
>
>
> "........Consider the behaviour of two machines in a distributed system.
> Both have clocks that are supposed to tick 1000 times per millisecond. One
> of them ticks 990 times per millisecond. The other ticks 1015 times per
> millisecond. If the system designer wants to guarantee that the clocks of
> these two machines never differ by more than 5 seconds, how often must the
> clocks be re-synchronized? Supposing that all machines in this distributed
> system come from the same manufacturer and the maximum drift rate is
> specified as 1.0%, how often must the clocks of this system be
> re-synchronized if the system designers want to guarantee that the clocks
> of these two machines never differ by more than 5 seconds? Describe in
> steps how you get your result......."
>
Part 1: At least every 200 seconds - the difference between them grows by
0.025 seconds (25 milliseconds) each real second, since the fast clock
gains 15 ticks per millisecond while the slow one loses 10. So it takes
200 seconds for them to drift 5 seconds apart (at which point the "fast"
clock will read 203 seconds and the "slow" clock 198).
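As a sanity check, the Part 1 arithmetic can be sketched in a few lines of Python (the variable names are mine, not from the question):

```python
# Two clocks with known tick rates; nominal rate is 1000 ticks/ms.
nominal = 1000.0            # ticks per millisecond (ideal)
fast, slow = 1015.0, 990.0  # actual rates of the two machines

# Per real second the fast clock gains 0.015 s and the slow one
# loses 0.010 s, so they drift apart at 0.025 s per real second.
drift_per_second = (fast - slow) / nominal

max_skew = 5.0              # largest allowed difference, in seconds
interval = max_skew / drift_per_second   # ~200 seconds
print(interval)
```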
Part 2: Consider the pathological case of variance, where one clock
over-ticks by 1% and the other under-ticks by 1%, so one runs at 990
ticks per millisecond and the other at 1010 - hence they diverge by
0.02 seconds each real second, and it takes 250 seconds to diverge by
five seconds. Any other clock in the system is irrelevant, since it
must lie at one of these extremes or between them, so no pair can ever
read further apart than the two pathological outliers.
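The same check for Part 2, again as a hedged sketch with my own variable names:

```python
# Worst case under a 1.0% maximum drift rate: one clock runs 1% fast,
# the other 1% slow, so any pair diverges at 2% of real time,
# i.e. 0.02 seconds per real second.
rho = 0.01                  # maximum drift rate (1.0%)
max_skew = 5.0              # largest allowed difference, in seconds

interval = max_skew / (2 * rho)   # ~250 seconds
print(interval)
```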
As an implementation issue, assuming that this can somehow be made
Ubuntu-related, I'd sync each machine every minute with a crontab job :-)
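For example, one line in root's crontab would do it - this assumes ntpdate is installed and that a reachable time server (pool.ntp.org here, purely as an illustration) is acceptable:

```shell
# m h dom mon dow  command
* * * * * /usr/sbin/ntpdate -s pool.ntp.org
```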
Mark