Rev 3397: comment cleanups. in http://bzr.arbash-meinel.com/branches/bzr/1.4-dev/find_differences
John Arbash Meinel
john at arbash-meinel.com
Thu Apr 24 00:15:12 BST 2008
At http://bzr.arbash-meinel.com/branches/bzr/1.4-dev/find_differences
------------------------------------------------------------
revno: 3397
revision-id: john at arbash-meinel.com-20080423230918-3dwdjgum1qm2nntb
parent: john at arbash-meinel.com-20080423222403-sqa8rs4d8eqdk0xi
committer: John Arbash Meinel <john at arbash-meinel.com>
branch nick: find_differences
timestamp: Wed 2008-04-23 18:09:18 -0500
message:
comment cleanups.
I looked at removing the extra find_seen_ancestors() calls used
when handling new_common_unique, but they turned out to be very
beneficial.
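
The benefit described above can be sketched as follows. This is a minimal
mock, not bzrlib's actual searcher class: FakeSearcher and its parent_map
argument are illustrative stand-ins, and only find_seen_ancestors() mirrors
the method name used in the diff. The point is that one call lets the
common set jump straight to the most ancestral nodes a searcher has already
seen, instead of re-walking them one step at a time.

```python
# Hypothetical stand-in for bzrlib's breadth-first searcher, used only
# to illustrate why find_seen_ancestors() is worth the extra calls.
class FakeSearcher:

    def __init__(self, seen, parent_map):
        self.seen = set(seen)          # nodes this searcher has visited
        self.parent_map = parent_map   # child -> list of parents

    def find_seen_ancestors(self, revisions):
        """Return every already-seen ancestor of `revisions`."""
        found = set()
        pending = [r for r in revisions if r in self.seen]
        while pending:
            rev = pending.pop()
            if rev in found:
                continue
            found.add(rev)
            for parent in self.parent_map.get(rev, []):
                if parent in self.seen and parent not in found:
                    pending.append(parent)
        return found


# Linear ancestry: e -> d -> c -> b -> a  (child -> parent)
parents = {'e': ['d'], 'd': ['c'], 'c': ['b'], 'b': ['a']}
searcher = FakeSearcher({'e', 'd', 'c', 'b', 'a'}, parents)

new_common_unique = {'c'}
# One call pulls in every already-seen ancestor of c at once, so no
# searcher has to step through c, b, a again node by node.
new_common_unique.update(searcher.find_seen_ancestors(new_common_unique))
print(sorted(new_common_unique))   # ['a', 'b', 'c']
```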
modified:
bzrlib/graph.py graph_walker.py-20070525030359-y852guab65d4wtn0-1
-------------- next part --------------
=== modified file 'bzrlib/graph.py'
--- a/bzrlib/graph.py 2008-04-23 22:24:03 +0000
+++ b/bzrlib/graph.py 2008-04-23 23:09:18 +0000
@@ -601,8 +601,7 @@
         for searcher in common_searchers:
             searcher.stop_searching_any(stop_searching_common)
         if new_common_unique:
-            # We found some ancestors that are common, jump all the way to
-            # their most ancestral node that we have already seen.
+            # We found some ancestors that are common
             for searcher in unique_searchers:
                 new_common_unique.update(
                     searcher.find_seen_ancestors(new_common_unique))
@@ -612,8 +611,6 @@
             new_common_unique.update(
                 searcher.find_seen_ancestors(new_common_unique))
-        # Now we have a complete set of common nodes which are
-        # ancestors of the unique nodes.
         # We can tell all of the unique searchers to start at these
         # nodes, and tell all of the common searchers to *stop*
         # searching these nodes
@@ -623,8 +620,8 @@
             searcher.stop_searching_any(new_common_unique)
         ancestor_all_unique.update(new_common_unique)
-        # Filter out searchers that don't actually search different nodes. We
-        # already have the ancestry intersection for them
+        # Filter out searchers that don't actually search different
+        # nodes. We already have the ancestry intersection for them
         next_unique_searchers = []
         unique_search_sets = set()
         for searcher in unique_searchers:
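
The last hunk ends where duplicate searchers are filtered out. A hedged
sketch of that step, assuming a hypothetical will_search() helper that
reports a searcher's pending frontier (bzrlib's real searcher exposes this
differently): two searchers about to visit the identical node set would do
redundant work, so only one representative per distinct frontier is kept.

```python
# Illustrative mock of a searcher with a fixed pending frontier; the
# name will_search() is an assumption, not bzrlib's exact API.
class FakeSearcher:

    def __init__(self, name, pending):
        self.name = name
        self._pending = frozenset(pending)   # hashable, so it can go in a set

    def will_search(self):
        """Nodes this searcher would visit next (hypothetical helper)."""
        return self._pending


unique_searchers = [
    FakeSearcher('s1', {'a', 'b'}),
    FakeSearcher('s2', {'a', 'b'}),   # same frontier as s1: redundant
    FakeSearcher('s3', {'c'}),
]

next_unique_searchers = []
unique_search_sets = set()
for searcher in unique_searchers:
    will_search_set = searcher.will_search()
    if will_search_set not in unique_search_sets:
        # Distinct frontier: keep this searcher; duplicates are dropped
        # because their ancestry intersection is already covered.
        unique_search_sets.add(will_search_set)
        next_unique_searchers.append(searcher)

print([s.name for s in next_unique_searchers])   # ['s1', 's3']
```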