VCS comparison table

Jakub Narebski jnareb at gmail.com
Tue Oct 24 11:27:03 BST 2006


Matthieu Moy wrote:
> "Matthew D. Fuller" <fullermd at over-yonder.net> writes:
> 
>>> For example, how long does it take to do an arbitrary "undo" (ie
>>> forcing a branch to an earlier state) [...]
>>
>> I don't understand the thrust of this, either.  As I understand the
>> operation you're talking about, it doesn't have anything to do with a
>> branch; you'd just be whipping the working tree around to different
>> versions.  That should be O(diff) on any modern VCS.

> There are two things to do:
>
> * Mark the tree as corresponding to a different revision in the past.
[...]
> * Then, do the "merge" to make your tree up to date. You can hardly do
>   faster than git and its unpacked format, but this is at the cost of
>   disk space. But as you say, in almost any modern VCS, that's
>   O(diff). In a space-efficient format, that's just the tradeoff you
>   make between full copies of a file and delta-compression.

Actually, this would be a "checkout" (in git terminology), i.e. overwriting
the files which differ between the current revision and the revision we
rewind (undo) to. (That's of course a simplification, omitting for example
removing and creating files.) That is O(changed files), which is a lower
bound; it cannot be faster. Finding which files changed is also
O(changed files), with a little O(directory depth) on top in git, with a
very small constant.
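A minimal sketch of the above (a hypothetical throwaway repository; assumes git is installed, names and contents are made up): the rewind only has to touch the files that differ between the two revisions, and `git diff --name-only` finds exactly that set.

```shell
#!/bin/sh
# Sketch: "undo" = forcing a branch back to an earlier state.
# git only rewrites the files that differ between the two
# revisions, i.e. O(changed files) work.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email example@example.com
git config user.name 'Example User'
echo one > a.txt
echo one > b.txt
git add .
git commit -qm 'first'
echo two > a.txt            # only a.txt changes in the second commit
git commit -qam 'second'
# Finding which files a rewind would touch is itself O(changed files):
git diff --name-only HEAD~1 HEAD    # lists only a.txt
# The "undo" itself: move the branch back and check out that state.
git reset --hard -q HEAD~1
```

Note that b.txt is never rewritten here; the cost is proportional to the one changed file, not to the size of the tree or the length of history.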

And even in the case of the packed format, it wouldn't be
O(diff)/O(history), but O(delta chain length), where delta chain length is
the maximum length of a delta chain in the pack, by default limited to 10.
Well, the constant is a bit larger because git additionally
zlib-compresses objects (even in the loose, i.e. unpacked, format).
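This cap on delta chains can be sketched as follows (hypothetical repository, made-up file contents; assumes git is installed): `git repack --depth=10` limits every chain, so reconstructing any object applies at most that many deltas regardless of how long the history is, and `git verify-pack -v` reports the resulting chain lengths.

```shell
#!/bin/sh
# Sketch: pack objects are stored as delta chains whose length is
# capped by --depth, so rebuilding one object costs at most
# O(depth) delta applications, not O(history).
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email example@example.com
git config user.name 'Example User'
# 20 revisions of a ~4 KB file, so consecutive blobs deltify well.
i=1
while [ "$i" -le 20 ]; do
    seq 1 1000 > file.txt
    echo "revision $i" >> file.txt
    git add file.txt
    git commit -qm "revision $i"
    i=$((i + 1))
done
# Repack with the chain-depth limit discussed above:
git repack -a -d -q --depth=10 --window=10
# verify-pack's summary counts objects at each chain depth;
# no chain exceeds the --depth limit.
git verify-pack -v .git/objects/pack/pack-*.idx | grep 'chain length'
```

In the per-object lines of `git verify-pack -v`, deltified objects carry a depth column, and with the command above none of them exceeds 10 even though the file has 20 revisions.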
-- 
Jakub Narebski
Warsaw, Poland
ShadeHawk on #git

More information about the bazaar mailing list