Getting started with a content filter

Chris Hecker checker at d6.com
Thu Jul 28 07:07:30 UTC 2011


> time, or just live with it because it's only on one server not every
> machine?

Yeah, sorry, I wasn't being clear:  you DEFINITELY want all the 
revisions, you just want them on the server that has the 10 TB RAID 
array, not on the laptop with 1 GB free, where you'd have to start 
deleting pictures of your kids to make space.  :)

You want the server to have everything: all the history for every 
file, whether it's text, a small binary, or a large binary.  Then 
clients branch from that, and you want them to get all the code and 
small-binary revisions, but only the last n revisions of the large 
binaries.  If you have a 100 MB PSD file, you might want 2 or 3 
revisions kept locally.  If you have a 1 GB light map, you might want 
only 1, or even 0 (meaning a lightweight checkout, nothing in the 
local repo, because you know you'll always be on gigabit ethernet 
when dealing with that file).  Clients should be able to choose how 
much history they get per file, and plugins could set defaults, maybe 
bracketed by file size.
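
To make the bracket idea concrete, here's a rough Python sketch of 
what a plugin-supplied default table could look like (none of these 
names exist in bzr; it's purely illustrative):

    # (max size in bytes, revisions to keep locally); None = no limit.
    SIZE_BRACKETS = [
        (1 * 1024 * 1024,   None),  # <= 1 MB: keep full history
        (100 * 1024 * 1024, 3),     # <= 100 MB, e.g. a PSD: keep ~3
        (None,              1),     # bigger, e.g. a 1 GB light map: keep 1 (or 0)
    ]

    def default_local_revisions(file_size):
        # Walk the brackets smallest-first; return the first that fits.
        for max_size, keep in SIZE_BRACKETS:
            if max_size is None or file_size <= max_size:
                return keep

    print(default_local_revisions(500 * 1024 * 1024))  # -> 1

A client would still be able to override that per file; the table is 
just the fallback.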

If I check in a new 100 MB PSD, I want this magical perfect system to 
put it in the local branch/shared repository.  Then, when I push (or 
if I have auto-pushing turned on, can't remember what it's called), I 
want it to confirm that the PSD made it to the server, and then kill 
the n+1th copy of the PSD in my local repository (or I do this with a 
manual prune, whichever, it just needs to be easy and quick).
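
The prune step I'm picturing is roughly this (a pure sketch against a 
hypothetical local cache layout, not a real bzr API):

    import os

    def prune_local_copies(cache_dir, keep):
        # Hypothetical layout: cache_dir holds one file per locally
        # stored revision of a big binary.  After the push is
        # confirmed, sort the copies newest-first and delete
        # everything past the first `keep`.
        copies = sorted(
            (os.path.join(cache_dir, name) for name in os.listdir(cache_dir)),
            key=os.path.getmtime,
            reverse=True,
        )
        for stale in copies[keep:]:
            os.remove(stale)  # the n+1th (and older) copies go away

The important property is that nothing gets deleted locally until the 
server has confirmed it has the file.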

If I ask for an old revision and it's local, no problem; if some of 
it's remote, then it prompts, and maybe I wait.  I'm not sure what 
happens to the local repo if I ask for r1 while I'm on r1000 and 
there are large binaries involved, but that's not too common, so any 
reasonable behavior is okay here.  Best would probably be something 
like taking the history horizon and applying it around the requested 
revision.  The super fancy version might have a sparse local repo, so 
it could have holes in it and I could keep r50-55 and r998-1000 or 
something, but that's totally not necessary.
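
The fetch path I have in mind is about this simple (again a sketch; 
local_store and server are made-up stand-ins for whatever the VCS 
actually uses internally):

    def get_revision_blob(path, revno, local_store, server):
        # Local hit: free.  Remote hit: prompt, then maybe I wait.
        blob = local_store.get(path, revno)  # assumed: None on a miss
        if blob is not None:
            return blob                      # it's local, no problem
        prompt = "%s@r%d is only on the server, fetch it? [y/N] " % (path, revno)
        if input(prompt).lower() != "y":
            raise KeyError("revision not available locally")
        blob = server.fetch(path, revno)    # this is where I wait
        local_store.put(path, revno, blob)  # could respect the history horizon here
        return blob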

The repository should have all the history logs locally, just not all 
the file data for these large files.  That way I can search the logs 
quickly, see which files changed, etc.
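
Something like this per-revision record is all I mean (hypothetical 
names, just to show the metadata is always there even when the bytes 
aren't):

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class RevisionEntry:
        # The log/metadata is always local (cheap, searchable); the
        # bytes may only exist on the server.  local_blob is None for
        # revisions of large files that were pruned locally.
        revno: int
        path: str
        log_message: str
        size: int
        content_hash: str                  # enough to verify a later fetch
        local_blob: Optional[str] = None   # path into the local store, or None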

Etc.

Chris






On 2011/07/27 23:33, Martin Pool wrote:
> On 28 July 2011 15:23, Chris Hecker <checker at d6.com> wrote:
>>
>>> On the other hand, I'm not sure that autodelete of old revisions is
>>> such a good idea.
>>
>> I'd be okay with a manual prune of old history if it was fast and worked
>> well.  But, once the feature is in, there's no reason you couldn't also have
>> it run automatically for artists as an option.
>>
>> I think I know about all the workarounds with the various dvcses right now,
>> and they're all pretty bad.  If I don't get something in bzr by the time my
>> content directories grow (either the separate server hack with stub files,
>> or the Real Thing), I will just put those in svn or p4 since I know they
>> work.
>
> So, if you put those files into svn, won't svn want to keep them
> forever?  Or will you edit them out of the svn history (I forget the
> command but I believe there is one), reset the history from time to
> time, or just live with it because it's only on one server not every
> machine?
>
> m
>


