[MERGE] Fix syntax error on last refactoring.
John Arbash Meinel
john at arbash-meinel.com
Thu Nov 29 22:22:42 GMT 2007
John Arbash Meinel has voted tweak.
Status is now: Conditionally approved
Comment:
You split out
self._copy_revision_texts()
self._copy_inventory_texts()
self._copy_text_texts()
But not "_copy_signature_texts()". It would seem more consistent if
it were its own helper function.
+                elif ideal_parents[0:1] == node[3][0][0:1]:
+                    # the left most parent is the same, or there are no
+                    # parents today. Either way, we can preserve the
+                    # representation as long as we change the refs to be
+                    # inserted.
+                    self._data_changed = True
+                    ok_nodes.append((node[0], node[1], node[2],
+                        (ideal_parents, node[3][1])))
+                    self._data_changed = True
^- you repeat yourself here (in the reconcile code)
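To show what I mean, here is roughly how that branch looks with the
duplicate assignment dropped (I've pulled it out as a standalone
function with a state dict, purely for illustration; the node shape is
assumed from the patch):

```python
def adjust_node(node, ideal_parents, ok_nodes, state):
    # node shape assumed: (index, key, value,
    #                      (parents, compression_parents))
    if ideal_parents[0:1] == node[3][0][0:1]:
        # leftmost parent unchanged (or no parents at all): keep the
        # stored representation, only rewrite the refs we insert
        ok_nodes.append((node[0], node[1], node[2],
                         (ideal_parents, node[3][1])))
        state['data_changed'] = True  # set once, not twice
```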
+        # 3) adhoc copy all the other texts.
+        transaction = repo.get_transaction()
+        file_id_index = GraphIndexPrefixAdapter(
+            self.new_pack.text_index,
+            ('blank', ), 1,
+            add_nodes_callback=self.new_pack.text_index.add_nodes)
+        knit_index = KnitGraphIndex(file_id_index,
+            add_callback=file_id_index.add_nodes,
+            deltas=True, parents=True)
+        output_knit = knit.KnitVersionedFile('reconcile-texts',
+            self._pack_collection.transport,
+            None,
+            index=knit_index,
+            access_method=_PackAccess(
+                {self.new_pack.text_index:self.new_pack.access_tuple()},
+                (self.new_pack._writer, self.new_pack.text_index)),
+            factory=knit.KnitPlainFactory())
+
^- You reuse the number 3.
Also, it seems like if a given node is bad, then all of its children are
suspect. Well, the annotations would be suspect, but I guess packs
wouldn't have annotations.
Also, it would only be merge nodes, because all other nodes should only
reference lines which are modified in this.
At a minimum, there should be a comment that this file_id_index is
bogus and exists only so that we can instantiate the reconciled pack,
with its index overridden each time we get a new file_id.
I'm also a little confused why you need a new KnitVersionedFile for just
the reconciled texts. It seems to share the same indexes as the current
pack, so I don't quite understand what is happening here.
+            source_weave = repo.weave_store.get_weave(key[0], transaction)
+            text_lines = source_weave.get_lines(key[1])
^- This really seems like a case where you would like to have a better
API for getting the texts for a file. Like a "text_getter" which would
know about previous extractions, or that you could plan what texts you
want (without having to load them all into memory at once).
But for now, I think this is the best we have.
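To make the idea concrete, here is a rough sketch of what I have in
mind by a "text_getter" (every name here is hypothetical; nothing like
this exists in the current API):

```python
# Hypothetical 'text_getter': plan which texts you want up front, then
# fetch them lazily, remembering previous extractions, instead of
# loading everything into memory at once.
class TextGetter(object):
    def __init__(self, fetch_lines):
        # fetch_lines: callable mapping a key -> list of text lines
        self._fetch_lines = fetch_lines
        self._cache = {}

    def get_lines(self, key):
        # remember previous extractions so repeated keys are cheap
        if key not in self._cache:
            self._cache[key] = self._fetch_lines(key)
        return self._cache[key]

    def iter_texts(self, keys):
        # stream texts one at a time, in the requested order
        for key in keys:
            yield key, self.get_lines(key)
```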
...
+        # 4) check that nothing inserted has a reference outside the keyspace.
+        missing_text_keys = self._external_compression_parents_of_new_texts()
+        if missing_text_keys:
+            raise errors.BzrError('Reference to missing compression parents %r'
+                % (refs - keys,))
^- This should be step 5, and you have a bug in that 'refs' and 'keys'
don't exist. Is it possible to write a test for this code? Or is the
whole point of reconcile that it won't let this happen, and this is just
a "did I really do what I meant to do" check?
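Presumably the raise was meant to report the missing_text_keys that
were just computed. Roughly (using ValueError here so the sketch is
self-contained; the real code would raise errors.BzrError):

```python
def check_no_external_references(missing_text_keys):
    # 'refs' and 'keys' don't exist in this scope; report the
    # already-computed set of missing compression parents instead
    if missing_text_keys:
        raise ValueError(
            'Reference to missing compression parents %r'
            % (sorted(missing_text_keys),))
```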
It still seems really weird to me that you would theoretically copy a
bunch of deltas to the output pack, and then copy the base fulltext
(because it had bad parents, so it gets copied last). I suppose as long
as the records are present (which is what this is checking) it is
"correct", just not optimal. Maybe we should recommend running "repack"
if there is bad data?
In "def _use_pack()" you have a check to see if there are any new
inventory keys, along with whether data was inserted, or we notice
"_data_changed".
How do you get data inserted if _data_changed is False? It would seem
like "_copy_text_texts()" could stop early if _data_changed is False.
(bad_texts is empty and discarded_nodes is empty.)
For details, see:
http://bundlebuggy.aaronbentley.com/request/%3C1196313895.32300.438.camel%40lifeless-64%3E