Rev 4460: (jam) Get rid of InterPackRepository in favor of PackStreamSource. in file:///home/pqm/archives/thelove/bzr/%2Btrunk/

Canonical.com Patch Queue Manager pqm at pqm.ubuntu.com
Thu Jun 18 20:13:52 BST 2009


At file:///home/pqm/archives/thelove/bzr/%2Btrunk/

------------------------------------------------------------
revno: 4460
revision-id: pqm at pqm.ubuntu.com-20090618191345-vgsr5zv78uesqsdg
parent: pqm at pqm.ubuntu.com-20090618061159-nwe8eie5p489xqss
parent: john at arbash-meinel.com-20090618180001-6f8rq6a78e8ow78c
committer: Canonical.com Patch Queue Manager <pqm at pqm.ubuntu.com>
branch nick: +trunk
timestamp: Thu 2009-06-18 20:13:45 +0100
message:
  (jam) Get rid of InterPackRepository in favor of PackStreamSource.
modified:
  NEWS                           NEWS-20050323055033-4e00b5db738777ff
  bzrlib/fetch.py                fetch.py-20050818234941-26fea6105696365d
  bzrlib/repofmt/groupcompress_repo.py repofmt.py-20080715094215-wp1qfvoo7093c8qr-1
  bzrlib/repofmt/pack_repo.py    pack_repo.py-20070813041115-gjv5ma7ktfqwsjgn-1
  bzrlib/repository.py           rev_storage.py-20051111201905-119e9401e46257e3
  bzrlib/tests/test_commit_merge.py test_commit_merge.py-20050920084723-819eeeff77907bc5
  bzrlib/tests/test_pack_repository.py test_pack_repository-20080801043947-eaw0e6h2gu75kwmy-1
  bzrlib/tests/test_repository.py test_repository.py-20060131075918-65c555b881612f4d
    ------------------------------------------------------------
    revno: 4360.4.17
    revision-id: john at arbash-meinel.com-20090618180001-6f8rq6a78e8ow78c
    parent: john at arbash-meinel.com-20090617190825-ktfk82li57rf2im6
    committer: John Arbash Meinel <john at arbash-meinel.com>
    branch nick: 1.15-pack-source
    timestamp: Thu 2009-06-18 13:00:01 -0500
    message:
      Change insert_from_broken_repo into an expectedFailure.
      This has to do with bug #389141.
    modified:
      bzrlib/tests/test_repository.py test_repository.py-20060131075918-65c555b881612f4d
    ------------------------------------------------------------
    revno: 4360.4.16
    revision-id: john at arbash-meinel.com-20090617190825-ktfk82li57rf2im6
    parent: john at arbash-meinel.com-20090617175910-7h37mzgovxh0fabn
    committer: John Arbash Meinel <john at arbash-meinel.com>
    branch nick: 1.15-pack-source
    timestamp: Wed 2009-06-17 14:08:25 -0500
    message:
      It seems that fetch() no longer returns the number of revisions fetched.
      It still does for *some* InterRepository fetch paths, but the generic one
      does not. It is also not easy to make it do so, since the Source and Sink
      are the ones that would know how many keys were transmitted, and they are
      potentially 'remote' objects.
      
      This was also only exercised as a by-product of a single 'test_commit'
      test. If we really wanted that assurance, we would have a per_repo or
      interrepo test for it. (See the sketch after this entry.)
    modified:
      bzrlib/tests/test_commit_merge.py test_commit_merge.py-20050920084723-819eeeff77907bc5
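      A hedged sketch of one way a caller can still obtain the count, by
      computing it around the fetch rather than from its return value
      (``source`` and ``target`` are hypothetical Repository objects; this
      is not code from the patch)::
      
          # Count the revisions a fetch actually added, without relying on
          # fetch() returning a (count, failures) tuple.
          before = set(target.all_revision_ids())
          target.fetch(source)
          fetched = set(target.all_revision_ids()) - before
          print '%d revisions fetched' % len(fetched)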
    ------------------------------------------------------------
    revno: 4360.4.15
    revision-id: john at arbash-meinel.com-20090617175910-7h37mzgovxh0fabn
    parent: john at arbash-meinel.com-20090617175715-p9ebpwx5rhc0qin1
    committer: John Arbash Meinel <john at arbash-meinel.com>
    branch nick: 1.15-pack-source
    timestamp: Wed 2009-06-17 12:59:10 -0500
    message:
      NEWS entry about PackStreamSource
    modified:
      NEWS                           NEWS-20050323055033-4e00b5db738777ff
    ------------------------------------------------------------
    revno: 4360.4.14
    revision-id: john at arbash-meinel.com-20090617175715-p9ebpwx5rhc0qin1
    parent: john at arbash-meinel.com-20090602025918-dd1turxkjymobs4u
    parent: pqm at pqm.ubuntu.com-20090617100437-gavn9zkum4dj5yjz
    committer: John Arbash Meinel <john at arbash-meinel.com>
    branch nick: 1.15-pack-source
    timestamp: Wed 2009-06-17 12:57:15 -0500
    message:
      Merge bzr.dev 4454 in preparation for NEWS entry.
    removed:
      bzrlib/util/tests/test_bencode.py test_bencode.py-20070713042202-qjw8rppxaz7ky6i6-1
      doc/developers/performance-contributing.txt performancecontribut-20070621063612-ac4zhhagjzkr21qp-1
    added:
      bzrlib/_bencode_pyx.h          _bencode_pyx.h-20090604155331-53bg7d0udmrvz44n-1
      bzrlib/_bencode_pyx.pyx        bencode.pyx-20070806220735-j75g4ebfnado2i60-3
      bzrlib/_known_graph_py.py      _known_graph_py.py-20090610185421-vw8vfda2cgnckgb1-1
      bzrlib/_known_graph_pyx.pyx    _known_graph_pyx.pyx-20090610194911-yjk73td9hpjilas0-1
      bzrlib/benchmarks/bench_tags.py bench_tags.py-20070812104202-0q5i0mqkt72hubof-1
      bzrlib/bencode.py              bencode.py-20070806220735-j75g4ebfnado2i60-2
      bzrlib/help_topics/en/diverged-branches.txt divergedbranches.txt-20090608035534-mb4ry8so4hw238n0-1
      bzrlib/tests/per_repository_reference/test_get_rev_id_for_revno.py test_get_rev_id_for_-20090615064050-b6mq6co557towrxh-1
      bzrlib/tests/test__known_graph.py test__known_graph.py-20090610185421-vw8vfda2cgnckgb1-2
      bzrlib/tests/test_bencode.py   test_bencode.py-20070806225234-s51cnnkh6raytxti-1
      bzrlib/tests/test_chk_serializer.py test_chk_serializer.-20090515105921-urte9wnhknlj5dyp-1
      bzrlib/util/bencode.py         bencode.py-20090609141817-jtvhqq6vyryjoeky-1
      doc/developers/bug-handling.txt bughandling.txt-20090615072247-mplym00zjq2n4s61-1
      doc/index.ru.txt               index.ru.txt-20080819091426-kfq61l02dhm9pplk-1
      doc/ru/                        ru-20080818031309-t3nyctvfbvfh4h2u-1
      doc/ru/mini-tutorial/          minitutorial-20080818031309-t3nyctvfbvfh4h2u-2
      doc/ru/mini-tutorial/index.txt index.txt-20080818031309-t3nyctvfbvfh4h2u-4
      doc/ru/quick-reference/        quickreference-20080818031309-t3nyctvfbvfh4h2u-3
      doc/ru/quick-reference/Makefile makefile-20080818031309-t3nyctvfbvfh4h2u-5
      doc/ru/quick-reference/quick-start-summary.pdf quickstartsummary.pd-20080818031309-t3nyctvfbvfh4h2u-6
      doc/ru/quick-reference/quick-start-summary.png quickstartsummary.pn-20080818031309-t3nyctvfbvfh4h2u-7
      doc/ru/quick-reference/quick-start-summary.svg quickstartsummary.sv-20080818031309-t3nyctvfbvfh4h2u-8
      doc/ru/tutorials/              docrututorials-20090427084615-toum0jo7qohd807p-1
      doc/ru/tutorials/centralized_workflow.txt centralized_workflow-20090531190825-ex3ums4bcuaf2r6k-1
      doc/ru/tutorials/tutorial.txt  tutorial.txt-20090602180629-wkp7wr27jl4i2zep-1
      doc/ru/tutorials/using_bazaar_with_launchpad.txt using_bazaar_with_la-20090427084917-b22ppqtdx7q4hapw-1
      doc/ru/user-guide/             docruuserguide-20090601191403-rcoy6nsre0vjiozm-1
      doc/ru/user-guide/branching_a_project.txt branching_a_project.-20090602104644-pjpwfx7xh2k5l0ba-1
      doc/ru/user-guide/core_concepts.txt core_concepts.txt-20090602104644-pjpwfx7xh2k5l0ba-2
      doc/ru/user-guide/images/      images-20090601201124-cruf3mmq5cfxeb1w-1
      doc/ru/user-guide/images/workflows_centralized.png workflows_centralize-20090601201124-cruf3mmq5cfxeb1w-3
      doc/ru/user-guide/images/workflows_centralized.svg workflows_centralize-20090601201124-cruf3mmq5cfxeb1w-4
      doc/ru/user-guide/images/workflows_gatekeeper.png workflows_gatekeeper-20090601201124-cruf3mmq5cfxeb1w-5
      doc/ru/user-guide/images/workflows_gatekeeper.svg workflows_gatekeeper-20090601201124-cruf3mmq5cfxeb1w-6
      doc/ru/user-guide/images/workflows_localcommit.png workflows_localcommi-20090601201124-cruf3mmq5cfxeb1w-7
      doc/ru/user-guide/images/workflows_localcommit.svg workflows_localcommi-20090601201124-cruf3mmq5cfxeb1w-8
      doc/ru/user-guide/images/workflows_peer.png workflows_peer.png-20090601201124-cruf3mmq5cfxeb1w-9
      doc/ru/user-guide/images/workflows_peer.svg workflows_peer.svg-20090601201124-cruf3mmq5cfxeb1w-10
      doc/ru/user-guide/images/workflows_pqm.png workflows_pqm.png-20090601201124-cruf3mmq5cfxeb1w-11
      doc/ru/user-guide/images/workflows_pqm.svg workflows_pqm.svg-20090601201124-cruf3mmq5cfxeb1w-12
      doc/ru/user-guide/images/workflows_shared.png workflows_shared.png-20090601201124-cruf3mmq5cfxeb1w-13
      doc/ru/user-guide/images/workflows_shared.svg workflows_shared.svg-20090601201124-cruf3mmq5cfxeb1w-14
      doc/ru/user-guide/images/workflows_single.png workflows_single.png-20090601201124-cruf3mmq5cfxeb1w-15
      doc/ru/user-guide/images/workflows_single.svg workflows_single.svg-20090601201124-cruf3mmq5cfxeb1w-16
      doc/ru/user-guide/index.txt    index.txt-20090601201124-cruf3mmq5cfxeb1w-2
      doc/ru/user-guide/introducing_bazaar.txt introducing_bazaar.t-20090601221109-6ehwbt2pvzgpftlu-1
      doc/ru/user-guide/specifying_revisions.txt specifying_revisions-20090602104644-pjpwfx7xh2k5l0ba-3
      doc/ru/user-guide/stacked.txt  stacked.txt-20090602104644-pjpwfx7xh2k5l0ba-4
      doc/ru/user-guide/using_checkouts.txt using_checkouts.txt-20090602104644-pjpwfx7xh2k5l0ba-5
      doc/ru/user-guide/zen.txt      zen.txt-20090602104644-pjpwfx7xh2k5l0ba-6
      tools/time_graph.py            time_graph.py-20090608210127-6g0epojxnqjo0f0s-1
    renamed:
      bzrlib/util/bencode.py => bzrlib/util/_bencode_py.py bencode.py-20070220044742-sltr28q21w2wzlxi-1
    modified:
      .bzrignore                     bzrignore-20050311232317-81f7b71efa2db11a
      Makefile                       Makefile-20050805140406-d96e3498bb61c5bb
      NEWS                           NEWS-20050323055033-4e00b5db738777ff
      bzr                            bzr.py-20050313053754-5485f144c7006fa6
      bzrlib/__init__.py             __init__.py-20050309040759-33e65acf91bbcd5d
      bzrlib/_dirstate_helpers_c.pyx dirstate_helpers.pyx-20070503201057-u425eni465q4idwn-3
      bzrlib/_groupcompress_pyx.pyx  _groupcompress_c.pyx-20080724041824-yelg6ii7c7zxt4z0-1
      bzrlib/benchmarks/__init__.py  __init__.py-20060516064526-eb0d37c78e86065d
      bzrlib/branch.py               branch.py-20050309040759-e4baf4e0d046576e
      bzrlib/builtins.py             builtins.py-20050830033751-fc01482b9ca23183
      bzrlib/bundle/serializer/v4.py v10.py-20070611062757-5ggj7k18s9dej0fr-1
      bzrlib/bzrdir.py               bzrdir.py-20060131065624-156dfea39c4387cb
      bzrlib/cache_utf8.py           cache_utf8.py-20060810004311-x4cph46la06h9azm-1
      bzrlib/chk_map.py              chk_map.py-20081001014447-ue6kkuhofvdecvxa-1
      bzrlib/chk_serializer.py       chk_serializer.py-20081002064345-2tofdfj2eqq01h4b-1
      bzrlib/commands.py             bzr.py-20050309040720-d10f4714595cf8c3
      bzrlib/commit.py               commit.py-20050511101309-79ec1a0168e0e825
      bzrlib/config.py               config.py-20051011043216-070c74f4e9e338e8
      bzrlib/diff.py                 diff.py-20050309040759-26944fbbf2ebbf36
      bzrlib/dirstate.py             dirstate.py-20060728012006-d6mvoihjb3je9peu-1
      bzrlib/errors.py               errors.py-20050309040759-20512168c4e14fbd
      bzrlib/filters/__init__.py     __init__.py-20080416080515-mkxl29amuwrf6uir-2
      bzrlib/graph.py                graph_walker.py-20070525030359-y852guab65d4wtn0-1
      bzrlib/groupcompress.py        groupcompress.py-20080705181503-ccbxd6xuy1bdnrpu-8
      bzrlib/help.py                 help.py-20050505025907-4dd7a6d63912f894
      bzrlib/help_topics/__init__.py help_topics.py-20060920210027-rnim90q9e0bwxvy4-1
      bzrlib/help_topics/en/configuration.txt configuration.txt-20060314161707-868350809502af01
      bzrlib/help_topics/en/eol.txt  eol.txt-20090327060429-todzdjmqt3bpv5r8-3
      bzrlib/index.py                index.py-20070712131115-lolkarso50vjr64s-1
      bzrlib/inventory.py            inventory.py-20050309040759-6648b84ca2005b37
      bzrlib/knit.py                 knit.py-20051212171256-f056ac8f0fbe1bd9
      bzrlib/lock.py                 lock.py-20050527050856-ec090bb51bc03349
      bzrlib/mail_client.py          mail_client.py-20070809192806-vuxt3t19srtpjpdn-1
      bzrlib/multiparent.py          __init__.py-20070410133617-n1jdhcc1n1mibarp-1
      bzrlib/osutils.py              osutils.py-20050309040759-eeaff12fbf77ac86
      bzrlib/plugins/launchpad/test_register.py test_register.py-20060315182712-40f5dda945c829a8
      bzrlib/progress.py             progress.py-20050610070202-df9faaab791964c0
      bzrlib/push.py                 push.py-20080606021927-5fe39050e8xne9un-1
      bzrlib/python-compat.h         pythoncompat.h-20080924041409-9kvi0fgtuuqp743j-1
      bzrlib/reconcile.py            reweave_inventory.py-20051108164726-1e5e0934febac06e
      bzrlib/remote.py               remote.py-20060720103555-yeeg2x51vn0rbtdp-1
      bzrlib/repofmt/groupcompress_repo.py repofmt.py-20080715094215-wp1qfvoo7093c8qr-1
      bzrlib/repofmt/knitrepo.py     knitrepo.py-20070206081537-pyy4a00xdas0j4pf-1
      bzrlib/repofmt/pack_repo.py    pack_repo.py-20070813041115-gjv5ma7ktfqwsjgn-1
      bzrlib/repository.py           rev_storage.py-20051111201905-119e9401e46257e3
      bzrlib/revisiontree.py         revisiontree.py-20060724012533-bg8xyryhxd0o0i0h-1
      bzrlib/serializer.py           serializer.py-20090402143702-wmkh9cfjhwpju0qi-1
      bzrlib/shelf.py                prepare_shelf.py-20081005181341-n74qe6gu1e65ad4v-1
      bzrlib/shellcomplete.py        shellcomplete.py-20050822153127-3be115ff5e70fc39
      bzrlib/smart/bzrdir.py         bzrdir.py-20061122024551-ol0l0o0oofsu9b3t-1
      bzrlib/smart/medium.py         medium.py-20061103051856-rgu2huy59fkz902q-1
      bzrlib/smart/protocol.py       protocol.py-20061108035435-ot0lstk2590yqhzr-1
      bzrlib/smart/repository.py     repository.py-20061128022038-vr5wy5bubyb8xttk-1
      bzrlib/smart/request.py        request.py-20061108095550-gunadhxmzkdjfeek-1
      bzrlib/tag.py                  tag.py-20070212110532-91cw79inah2cfozx-1
      bzrlib/tests/__init__.py       selftest.py-20050531073622-8d0e3c8845c97a64
      bzrlib/tests/blackbox/test_branch.py test_branch.py-20060524161337-noms9gmcwqqrfi8y-1
      bzrlib/tests/blackbox/test_diff.py test_diff.py-20060110203741-aa99ac93e633d971
      bzrlib/tests/blackbox/test_export.py test_export.py-20051229024010-e6c26658e460fb1c
      bzrlib/tests/blackbox/test_init.py test_init.py-20060309032856-a292116204d86eb7
      bzrlib/tests/blackbox/test_pull.py test_pull.py-20051201144907-64959364f629947f
      bzrlib/tests/blackbox/test_push.py test_push.py-20060329002750-929af230d5d22663
      bzrlib/tests/blackbox/test_split.py test_split.py-20061008023421-qy0vdpzysh5rriu8-1
      bzrlib/tests/blackbox/test_status.py teststatus.py-20050712014354-508855eb9f29f7dc
      bzrlib/tests/branch_implementations/test_check.py test_check.py-20080429151303-1sbfclxhddpz0tnj-1
      bzrlib/tests/branch_implementations/test_dotted_revno_to_revision_id.py test_dotted_revno_to-20090121014844-6x7d9jtri5sspg1o-1
      bzrlib/tests/branch_implementations/test_push.py test_push.py-20070130153159-fhfap8uoifevg30j-1
      bzrlib/tests/branch_implementations/test_reconcile.py test_reconcile.py-20080429161555-qlmccuyeyt6pvho7-1
      bzrlib/tests/branch_implementations/test_sprout.py test_sprout.py-20070521151739-b8t8p7axw1h966ws-1
      bzrlib/tests/branch_implementations/test_stacking.py test_stacking.py-20080214020755-msjlkb7urobwly0f-1
      bzrlib/tests/bzrdir_implementations/test_bzrdir.py test_bzrdir.py-20060131065642-0ebeca5e30e30866
      bzrlib/tests/inventory_implementations/basics.py basics.py-20070903044446-kdjwbiu1p1zi9phs-1
      bzrlib/tests/per_repository/test_iter_reverse_revision_history.py test_iter_reverse_re-20070217015036-spu7j5ggch7pbpyd-1
      bzrlib/tests/per_repository/test_reconcile.py test_reconcile.py-20060223022332-572ef70a3288e369
      bzrlib/tests/per_repository/test_repository.py test_repository.py-20060131092128-ad07f494f5c9d26c
      bzrlib/tests/per_repository/test_revision.py testrevprops.py-20051013073044-92bc3c68302ce1bf
      bzrlib/tests/per_repository_reference/__init__.py __init__.py-20080220025549-nnm2s80it1lvcwnc-2
      bzrlib/tests/per_repository_reference/test_initialize.py test_initialize.py-20090527083941-4rz2urcthjet5e2i-1
      bzrlib/tests/test__groupcompress.py test__groupcompress_-20080724145854-koifwb7749cfzrvj-1
      bzrlib/tests/test_bzrdir.py    test_bzrdir.py-20060131065654-deba40eef51cf220
      bzrlib/tests/test_chk_map.py   test_chk_map.py-20081001014447-ue6kkuhofvdecvxa-2
      bzrlib/tests/test_commands.py  test_command.py-20051019190109-3b17be0f52eaa7a8
      bzrlib/tests/test_config.py    testconfig.py-20051011041908-742d0c15d8d8c8eb
      bzrlib/tests/test_eol_filters.py test_eol_filters.py-20090327060429-todzdjmqt3bpv5r8-2
      bzrlib/tests/test_filters.py   test_filters.py-20080417120614-tc3zok0vvvprsc99-1
      bzrlib/tests/test_generate_docs.py test_generate_docs.p-20070102123151-cqctnsrlqwmiljd7-1
      bzrlib/tests/test_graph.py     test_graph_walker.py-20070525030405-enq4r60hhi9xrujc-1
      bzrlib/tests/test_help.py      test_help.py-20070419045354-6q6rq15j9e2n5fna-1
      bzrlib/tests/test_http.py      testhttp.py-20051018020158-b2eef6e867c514d9
      bzrlib/tests/test_knit.py      test_knit.py-20051212171302-95d4c00dd5f11f2b
      bzrlib/tests/test_mail_client.py test_mail_client.py-20070809192806-vuxt3t19srtpjpdn-2
      bzrlib/tests/test_options.py   testoptions.py-20051014093702-96457cfc86319a8f
      bzrlib/tests/test_osutils.py   test_osutils.py-20051201224856-e48ee24c12182989
      bzrlib/tests/test_plugins.py   plugins.py-20050622075746-32002b55e5e943e9
      bzrlib/tests/test_progress.py  test_progress.py-20060308160359-978c397bc79b7fda
      bzrlib/tests/test_remote.py    test_remote.py-20060720103555-yeeg2x51vn0rbtdp-2
      bzrlib/tests/test_repository.py test_repository.py-20060131075918-65c555b881612f4d
      bzrlib/tests/test_serializer.py test_serializer.py-20090403213933-q6x117y8t9fbeyoz-1
      bzrlib/tests/test_smart.py     test_smart.py-20061122024551-ol0l0o0oofsu9b3t-2
      bzrlib/tests/test_source.py    test_source.py-20051207061333-a58dea6abecc030d
      bzrlib/tests/test_transform.py test_transaction.py-20060105172520-b3ffb3946550e6c4
      bzrlib/tests/test_ui.py        test_ui.py-20051130162854-458e667a7414af09
      bzrlib/tests/tree_implementations/test_list_files.py test_list_files.py-20070216005501-cjh6fzprbe9lbs2t-1
      bzrlib/tests/workingtree_implementations/test_content_filters.py test_content_filters-20080424071441-8navsrmrfdxpn90a-1
      bzrlib/tests/workingtree_implementations/test_eol_conversion.py test_eol_conversion.-20090327060429-todzdjmqt3bpv5r8-4
      bzrlib/transform.py            transform.py-20060105172343-dd99e54394d91687
      bzrlib/transport/sftp.py       sftp.py-20051019050329-ab48ce71b7e32dfe
      bzrlib/ui/__init__.py          ui.py-20050824083933-8cf663c763ba53a9
      bzrlib/ui/text.py              text.py-20051130153916-2e438cffc8afc478
      bzrlib/versionedfile.py        versionedfile.py-20060222045106-5039c71ee3b65490
      bzrlib/weave.py                knit.py-20050627021749-759c29984154256b
      bzrlib/win32utils.py           win32console.py-20051021033308-123c6c929d04973d
      bzrlib/workingtree.py          workingtree.py-20050511021032-29b6ec0a681e02e3
      bzrlib/workingtree_4.py        workingtree_4.py-20070208044105-5fgpc5j3ljlh5q6c-1
      bzrlib/xml4.py                 xml4.py-20050916091259-db5ab55e7e6ca324
      bzrlib/xml8.py                 xml5.py-20050907032657-aac8f960815b66b1
      bzrlib/xml_serializer.py       xml.py-20050309040759-57d51586fdec365d
      doc/developers/cycle.txt       cycle.txt-20081017031739-rw24r0cywm2ok3xu-1
      doc/developers/index.txt       index.txt-20070508041241-qznziunkg0nffhiw-1
      doc/developers/performance-roadmap.txt performanceroadmap.t-20070507174912-mwv3xv517cs4sisd-2
      doc/developers/planned-change-integration.txt plannedchangeintegra-20070619004702-i1b3ccamjtfaoq6w-1
      doc/developers/releasing.txt   releasing.txt-20080502015919-fnrcav8fwy8ccibu-1
      doc/en/developer-guide/HACKING.txt HACKING-20050805200004-2a5dc975d870f78c
      doc/en/quick-reference/Makefile makefile-20070813143223-5i7bgw7w8s7l3ae2-2
      doc/en/quick-reference/quick-start-summary.png quickstartsummary.pn-20071203142852-hsiybkmh37q5owwe-1
      doc/en/tutorials/using_bazaar_with_launchpad.txt using_bazaar_with_lp-20071211073140-7msh8uf9a9h4y9hb-1
      doc/en/user-guide/images/workflows_centralized.png workflows_centralize-20071114035000-q36a9h57ps06uvnl-8
      doc/en/user-guide/images/workflows_gatekeeper.png workflows_gatekeeper-20071114035000-q36a9h57ps06uvnl-9
      doc/en/user-guide/images/workflows_localcommit.png workflows_localcommi-20071114035000-q36a9h57ps06uvnl-10
      doc/en/user-guide/images/workflows_peer.png workflows_peer.png-20071114035000-q36a9h57ps06uvnl-11
      doc/en/user-guide/images/workflows_pqm.png workflows_pqm.png-20071114035000-q36a9h57ps06uvnl-12
      doc/en/user-guide/images/workflows_shared.png workflows_shared.png-20071114035000-q36a9h57ps06uvnl-13
      doc/en/user-guide/images/workflows_single.png workflows_single.png-20071114035000-q36a9h57ps06uvnl-14
      doc/en/user-guide/introducing_bazaar.txt introducing_bazaar.t-20071114035000-q36a9h57ps06uvnl-5
      doc/index.txt                  index.txt-20070813101924-07gd9i9d2jt124bf-1
      generate_docs.py               bzrinfogen.py-20051211224525-78e7c14f2c955e55
      setup.py                       setup.py-20050314065409-02f8a0a6e3f9bc70
      bzrlib/util/_bencode_py.py     bencode.py-20070220044742-sltr28q21w2wzlxi-1
    ------------------------------------------------------------
    revno: 4360.4.13
    revision-id: john at arbash-meinel.com-20090602025918-dd1turxkjymobs4u
    parent: john at arbash-meinel.com-20090601191632-p29x4v3s6qcedjzn
    parent: pqm at pqm.ubuntu.com-20090602023451-adrz7xhxr6qimtni
    committer: John Arbash Meinel <john at arbash-meinel.com>
    branch nick: 1.15-pack-source
    timestamp: Mon 2009-06-01 21:59:18 -0500
    message:
      Merge bzr.dev 4396 to bring in official 'quick-fix'
    modified:
      Makefile                       Makefile-20050805140406-d96e3498bb61c5bb
      NEWS                           NEWS-20050323055033-4e00b5db738777ff
      bzrlib/builtins.py             builtins.py-20050830033751-fc01482b9ca23183
      bzrlib/foreign.py              foreign.py-20081112170002-olsxmandkk8qyfuq-1
      bzrlib/log.py                  log.py-20050505065812-c40ce11702fe5fb1
      bzrlib/tests/test_foreign.py   test_foreign.py-20081125004048-ywb901edgp9lluxo-1
    ------------------------------------------------------------
    revno: 4360.4.12
    revision-id: john at arbash-meinel.com-20090601191632-p29x4v3s6qcedjzn
    parent: john at arbash-meinel.com-20090601183634-ptugsr2rmk7fzs3p
    committer: John Arbash Meinel <john at arbash-meinel.com>
    branch nick: 1.15-pack-source
    timestamp: Mon 2009-06-01 14:16:32 -0500
    message:
      Work out some issues with revision_ids vs revision_keys.
      
      Mapping back and forth really sucks. Even worse, though, is when a
      variable is labeled 'revision_ids' while it actually holds
      'revision_keys' (or vice versa). For example, SearchResult.get_keys()
      may return revision_ids or may return keys... (See the sketch after
      this entry.)
    modified:
      bzrlib/repofmt/groupcompress_repo.py repofmt.py-20080715094215-wp1qfvoo7093c8qr-1
      bzrlib/repository.py           rev_storage.py-20051111201905-119e9401e46257e3
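      The id <=> key convention at stake here is simple but easy to
      mislabel: a revision_id is a plain string, while a revision_key wraps
      that string in a 1-tuple for the VersionedFiles index layer. A minimal
      illustration (plain Python; the revision names are hypothetical)::
      
          revision_ids = ['rev-1', 'rev-2']
          # id -> key: wrap each id in a 1-tuple
          revision_keys = [(r,) for r in revision_ids]
          # key -> id: the id is the last element of the key
          assert [k[-1] for k in revision_keys] == revision_ids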
    ------------------------------------------------------------
    revno: 4360.4.11
    revision-id: john at arbash-meinel.com-20090601183634-ptugsr2rmk7fzs3p
    parent: john at arbash-meinel.com-20090529151214-88xyma7slrvumx7a
    parent: john at arbash-meinel.com-20090601181346-2fxsd3o977j5bj5b
    committer: John Arbash Meinel <john at arbash-meinel.com>
    branch nick: 1.15-pack-source
    timestamp: Mon 2009-06-01 13:36:34 -0500
    message:
      Merge the 'quick-fix' tests for stacking + ghosts + smart server, etc.
    modified:
      bzrlib/repofmt/groupcompress_repo.py repofmt.py-20080715094215-wp1qfvoo7093c8qr-1
      bzrlib/repository.py           rev_storage.py-20051111201905-119e9401e46257e3
      bzrlib/smart/repository.py     repository.py-20061128022038-vr5wy5bubyb8xttk-1
      bzrlib/tests/per_repository/test_fetch.py test_fetch.py-20070814052151-5cxha9slx4c93uog-1
      bzrlib/tests/per_repository_reference/test_fetch.py test_fetch.py-20090511214909-25pkgmoam913lrji-1
    ------------------------------------------------------------
    revno: 4360.4.10
    revision-id: john at arbash-meinel.com-20090529151214-88xyma7slrvumx7a
    parent: john at arbash-meinel.com-20090529150616-m29oaesf6ekxr489
    committer: John Arbash Meinel <john at arbash-meinel.com>
    branch nick: 1.15-pack-source
    timestamp: Fri 2009-05-29 10:12:14 -0500
    message:
      Remove some of the code duplication.
      Discover a slightly broken bit of code that was finding parent inventories 2x.
    modified:
      bzrlib/repofmt/groupcompress_repo.py repofmt.py-20080715094215-wp1qfvoo7093c8qr-1
      bzrlib/repofmt/pack_repo.py    pack_repo.py-20070813041115-gjv5ma7ktfqwsjgn-1
      bzrlib/repository.py           rev_storage.py-20051111201905-119e9401e46257e3
    ------------------------------------------------------------
    revno: 4360.4.9
    revision-id: john at arbash-meinel.com-20090529150616-m29oaesf6ekxr489
    parent: john at arbash-meinel.com-20090529143455-6uswchgtywhp73lv
    parent: pqm at pqm.ubuntu.com-20090529112630-p1bfpivkz3igjzn2
    committer: John Arbash Meinel <john at arbash-meinel.com>
    branch nick: 1.15-pack-source
    timestamp: Fri 2009-05-29 10:06:16 -0500
    message:
      Merge bzr.dev, bringing in the gc stacking fixes.
    added:
      bzrlib/_rio_py.py              _rio_py.py-20090514104624-ied3d39oju8anmfz-1
      bzrlib/_rio_pyx.pyx            _rio_pyx.pyx-20090514104636-8203jcqvfny56yrd-1
      bzrlib/send.py                 send.py-20090521192735-j7cdb33ykmtmzx4w-1
      bzrlib/tests/per_repository_reference/test_fetch.py test_fetch.py-20090511214909-25pkgmoam913lrji-1
      bzrlib/tests/per_repository_reference/test_initialize.py test_initialize.py-20090527083941-4rz2urcthjet5e2i-1
      bzrlib/tests/per_repository_reference/test_unlock.py test_unlock.py-20090526160031-14lvypj5pbrndnyz-1
      bzrlib/tests/test__rio.py      test__rio.py-20090514191748-cy74k8yj46gzoeq6-1
    renamed:
      bzrlib/tests/workingtree_implementations/test_get_file_with_stat.py => bzrlib/tests/tree_implementations/test_get_file_with_stat.py test_get_file_with_s-20080922035909-lhdovrr36jpxmu0v-1
    modified:
      .bzrignore                     bzrignore-20050311232317-81f7b71efa2db11a
      NEWS                           NEWS-20050323055033-4e00b5db738777ff
      bzr                            bzr.py-20050313053754-5485f144c7006fa6
      bzrlib/__init__.py             __init__.py-20050309040759-33e65acf91bbcd5d
      bzrlib/branch.py               branch.py-20050309040759-e4baf4e0d046576e
      bzrlib/builtins.py             builtins.py-20050830033751-fc01482b9ca23183
      bzrlib/commands.py             bzr.py-20050309040720-d10f4714595cf8c3
      bzrlib/diff.py                 diff.py-20050309040759-26944fbbf2ebbf36
      bzrlib/errors.py               errors.py-20050309040759-20512168c4e14fbd
      bzrlib/foreign.py              foreign.py-20081112170002-olsxmandkk8qyfuq-1
      bzrlib/graph.py                graph_walker.py-20070525030359-y852guab65d4wtn0-1
      bzrlib/groupcompress.py        groupcompress.py-20080705181503-ccbxd6xuy1bdnrpu-8
      bzrlib/inventory.py            inventory.py-20050309040759-6648b84ca2005b37
      bzrlib/knit.py                 knit.py-20051212171256-f056ac8f0fbe1bd9
      bzrlib/mutabletree.py          mutabletree.py-20060906023413-4wlkalbdpsxi2r4y-2
      bzrlib/osutils.py              osutils.py-20050309040759-eeaff12fbf77ac86
      bzrlib/remote.py               remote.py-20060720103555-yeeg2x51vn0rbtdp-1
      bzrlib/repofmt/groupcompress_repo.py repofmt.py-20080715094215-wp1qfvoo7093c8qr-1
      bzrlib/repofmt/pack_repo.py    pack_repo.py-20070813041115-gjv5ma7ktfqwsjgn-1
      bzrlib/repository.py           rev_storage.py-20051111201905-119e9401e46257e3
      bzrlib/revisiontree.py         revisiontree.py-20060724012533-bg8xyryhxd0o0i0h-1
      bzrlib/rio.py                  rio.py-20051128032247-770b120b34dfff60
      bzrlib/smart/server.py         server.py-20061110062051-chzu10y32vx8gvur-1
      bzrlib/tests/__init__.py       selftest.py-20050531073622-8d0e3c8845c97a64
      bzrlib/tests/blackbox/test_add.py test_add.py-20060518072250-857e4f86f54a30b2
      bzrlib/tests/blackbox/test_dpush.py test_dpush.py-20090108125928-st1td6le59g0vyv2-1
      bzrlib/tests/blackbox/test_log.py test_log.py-20060112090212-78f6ea560c868e24
      bzrlib/tests/blackbox/test_send.py test_bundle.py-20060616222707-c21c8b7ea5ef57b1
      bzrlib/tests/blackbox/test_serve.py test_serve.py-20060913064329-8t2pvmsikl4s3xhl-1
      bzrlib/tests/blackbox/test_switch.py test_switch.py-20071122111948-0c5en6uz92bwl76h-1
      bzrlib/tests/branch_implementations/test_branch.py testbranch.py-20050711070244-121d632bc37d7253
      bzrlib/tests/branch_implementations/test_push.py test_push.py-20070130153159-fhfap8uoifevg30j-1
      bzrlib/tests/per_repository/test_fetch.py test_fetch.py-20070814052151-5cxha9slx4c93uog-1
      bzrlib/tests/per_repository/test_fileid_involved.py test_file_involved.py-20051215205901-728a172d1014daaa
      bzrlib/tests/per_repository/test_write_group.py test_write_group.py-20070716105516-89n34xtogq5frn0m-1
      bzrlib/tests/per_repository_reference/__init__.py __init__.py-20080220025549-nnm2s80it1lvcwnc-2
      bzrlib/tests/per_repository_reference/test_default_stacking.py test_default_stackin-20090311055345-9ajahgm58oq3wh6h-1
      bzrlib/tests/test_foreign.py   test_foreign.py-20081125004048-ywb901edgp9lluxo-1
      bzrlib/tests/test_graph.py     test_graph_walker.py-20070525030405-enq4r60hhi9xrujc-1
      bzrlib/tests/test_groupcompress.py test_groupcompress.p-20080705181503-ccbxd6xuy1bdnrpu-13
      bzrlib/tests/test_http.py      testhttp.py-20051018020158-b2eef6e867c514d9
      bzrlib/tests/test_osutils.py   test_osutils.py-20051201224856-e48ee24c12182989
      bzrlib/tests/test_pack_repository.py test_pack_repository-20080801043947-eaw0e6h2gu75kwmy-1
      bzrlib/tests/test_repository.py test_repository.py-20060131075918-65c555b881612f4d
      bzrlib/tests/test_transform.py test_transaction.py-20060105172520-b3ffb3946550e6c4
      bzrlib/tests/tree_implementations/__init__.py __init__.py-20060717075546-420s7b0bj9hzeowi-2
      bzrlib/tests/workingtree_implementations/__init__.py __init__.py-20060203003124-b2aa5aca21a8bfad
      bzrlib/transform.py            transform.py-20060105172343-dd99e54394d91687
      bzrlib/transport/__init__.py   transport.py-20050711165921-4978aa7ce1285ad5
      bzrlib/tree.py                 tree.py-20050309040759-9d5f2496be663e77
      bzrlib/win32utils.py           win32console.py-20051021033308-123c6c929d04973d
      bzrlib/workingtree.py          workingtree.py-20050511021032-29b6ec0a681e02e3
      bzrlib/workingtree_4.py        workingtree_4.py-20070208044105-5fgpc5j3ljlh5q6c-1
      doc/en/user-guide/svn_plugin.txt svn_plugin.txt-20080509065016-cjc90f46407vi9a0-2
      setup.py                       setup.py-20050314065409-02f8a0a6e3f9bc70
      bzrlib/tests/tree_implementations/test_get_file_with_stat.py test_get_file_with_s-20080922035909-lhdovrr36jpxmu0v-1
    ------------------------------------------------------------
    revno: 4360.4.8
    revision-id: john at arbash-meinel.com-20090529143455-6uswchgtywhp73lv
    parent: john at arbash-meinel.com-20090529141005-j0kkvu24tvx2g756
    committer: John Arbash Meinel <john at arbash-meinel.com>
    branch nick: 1.15-pack-source
    timestamp: Fri 2009-05-29 09:34:55 -0500
    message:
      Clean up references to InterPackRepo.
    modified:
      bzrlib/fetch.py                fetch.py-20050818234941-26fea6105696365d
      bzrlib/repofmt/pack_repo.py    pack_repo.py-20070813041115-gjv5ma7ktfqwsjgn-1
      bzrlib/repository.py           rev_storage.py-20051111201905-119e9401e46257e3
    ------------------------------------------------------------
    revno: 4360.4.7
    revision-id: john at arbash-meinel.com-20090529141005-j0kkvu24tvx2g756
    parent: john at arbash-meinel.com-20090529125220-idiknsd81ihvkacm
    committer: John Arbash Meinel <john at arbash-meinel.com>
    branch nick: 1.15-pack-source
    timestamp: Fri 2009-05-29 09:10:05 -0500
    message:
      It seems that inventory_xml_lines_for_keys really does want keys and not ids.
    modified:
      bzrlib/repofmt/pack_repo.py    pack_repo.py-20070813041115-gjv5ma7ktfqwsjgn-1
    ------------------------------------------------------------
    revno: 4360.4.6
    revision-id: john at arbash-meinel.com-20090529125220-idiknsd81ihvkacm
    parent: john at arbash-meinel.com-20090529105417-jfl3k51g6ymqsx13
    committer: John Arbash Meinel <john at arbash-meinel.com>
    branch nick: 1.15-pack-source
    timestamp: Fri 2009-05-29 07:52:20 -0500
    message:
      Change how 'missing.*parent_prevents_commit' determines what to skip.
      It was skipping everything, rather than just dev6.
      Also, fix it to always call abort_write_group, even if the test is
      skipped. Otherwise, it was failing to clean up the working dirs.
    modified:
      bzrlib/repofmt/pack_repo.py    pack_repo.py-20070813041115-gjv5ma7ktfqwsjgn-1
      bzrlib/repository.py           rev_storage.py-20051111201905-119e9401e46257e3
      bzrlib/tests/test_pack_repository.py test_pack_repository-20080801043947-eaw0e6h2gu75kwmy-1
    ------------------------------------------------------------
    revno: 4360.4.5
    revision-id: john at arbash-meinel.com-20090529105417-jfl3k51g6ymqsx13
    parent: john at arbash-meinel.com-20090528160334-ewrrmm5d92m6k8xn
    committer: John Arbash Meinel <john at arbash-meinel.com>
    branch nick: 1.15-pack-source
    timestamp: Fri 2009-05-29 05:54:17 -0500
    message:
      Implement a KnitPackStreamSource.
      This is hard-coded to support only exact format => format streaming,
      which allows the code to be written with a much simpler model of the
      data involved.
      
      There is a small bug right now around expecting line_iterator to
      return the context revision. Will work on it after lunch.
    modified:
      bzrlib/repofmt/pack_repo.py    pack_repo.py-20070813041115-gjv5ma7ktfqwsjgn-1
    ------------------------------------------------------------
    revno: 4360.4.4
    revision-id: john at arbash-meinel.com-20090528160334-ewrrmm5d92m6k8xn
    parent: john at arbash-meinel.com-20090528154433-7hgay7hq57jw7faj
    committer: John Arbash Meinel <john at arbash-meinel.com>
    branch nick: 1.15-pack-source
    timestamp: Thu 2009-05-28 11:03:34 -0500
    message:
      (broken) In the middle of creating an 'optimal' knit streamer.
    modified:
      bzrlib/repofmt/groupcompress_repo.py repofmt.py-20080715094215-wp1qfvoo7093c8qr-1
      bzrlib/repofmt/pack_repo.py    pack_repo.py-20070813041115-gjv5ma7ktfqwsjgn-1
    ------------------------------------------------------------
    revno: 4360.4.3
    revision-id: john at arbash-meinel.com-20090528154433-7hgay7hq57jw7faj
    parent: tanner at real-time.com-20090523031709-ivn0zya2afnetf8c
    committer: John Arbash Meinel <john at arbash-meinel.com>
    branch nick: 1.15-pack-source
    timestamp: Thu 2009-05-28 10:44:33 -0500
    message:
      Introduce a KnitPackStreamSource, which is used when going
      knitpack <=> knitpack and the exact same format is used on
      both sides.
    modified:
      bzrlib/repofmt/pack_repo.py    pack_repo.py-20070813041115-gjv5ma7ktfqwsjgn-1
      bzrlib/tests/test_repository.py test_repository.py-20060131075918-65c555b881612f4d
=== modified file 'NEWS'
--- a/NEWS	2009-06-18 05:13:43 +0000
+++ b/NEWS	2009-06-18 19:13:45 +0000
@@ -58,6 +58,11 @@
   for files with long ancestry and 'cherrypicked' changes.)
   (John Arbash Meinel)
 
+* pack <=> pack fetching is now done via a ``PackStreamSource`` rather
+  than the ``Packer`` code. The user visible change is that we now
+  properly fetch the minimum number of texts for non-smart fetching.
+  (John Arbash Meinel)
+
 
 Improvements
 ************

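The core of the change is the new ``_get_source`` hook shown in the
pack_repo.py diff below: the optimised stream source is selected only when
both repositories report the identical network name, and everything else
falls back to the generic path. A condensed sketch of that dispatch (class
names as added in this patch)::

    def _get_source(self, to_format):
        # Only use the optimised KnitPackStreamSource for exact format
        # matches; anything else gets the generic StreamSource.
        if to_format.network_name() == self._format.network_name():
            return KnitPackStreamSource(self, to_format)
        return super(KnitPackRepository, self)._get_source(to_format)
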
=== modified file 'bzrlib/fetch.py'
--- a/bzrlib/fetch.py	2009-06-10 03:56:49 +0000
+++ b/bzrlib/fetch.py	2009-06-17 17:57:15 +0000
@@ -51,9 +51,6 @@
         :param last_revision: If set, try to limit to the data this revision
             references.
         :param find_ghosts: If True search the entire history for ghosts.
-        :param _write_group_acquired_callable: Don't use; this parameter only
-            exists to facilitate a hack done in InterPackRepo.fetch.  We would
-            like to remove this parameter.
         :param pb: ProgressBar object to use; deprecated and ignored.
             This method will just create one on top of the stack.
         """

=== modified file 'bzrlib/repofmt/groupcompress_repo.py'
--- a/bzrlib/repofmt/groupcompress_repo.py	2009-06-12 01:11:00 +0000
+++ b/bzrlib/repofmt/groupcompress_repo.py	2009-06-17 17:57:15 +0000
@@ -48,6 +48,7 @@
     Pack,
     NewPack,
     KnitPackRepository,
+    KnitPackStreamSource,
     PackRootCommitBuilder,
     RepositoryPackCollection,
     RepositoryFormatPack,
@@ -736,21 +737,10 @@
         # make it raise to trap naughty direct users.
         raise NotImplementedError(self._iter_inventory_xmls)
 
-    def _find_parent_ids_of_revisions(self, revision_ids):
-        # TODO: we probably want to make this a helper that other code can get
-        #       at
-        parent_map = self.get_parent_map(revision_ids)
-        parents = set()
-        map(parents.update, parent_map.itervalues())
-        parents.difference_update(revision_ids)
-        parents.discard(_mod_revision.NULL_REVISION)
-        return parents
-
-    def _find_present_inventory_ids(self, revision_ids):
-        keys = [(r,) for r in revision_ids]
-        parent_map = self.inventories.get_parent_map(keys)
-        present_inventory_ids = set(k[-1] for k in parent_map)
-        return present_inventory_ids
+    def _find_present_inventory_keys(self, revision_keys):
+        parent_map = self.inventories.get_parent_map(revision_keys)
+        present_inventory_keys = set(k for k in parent_map)
+        return present_inventory_keys
 
     def fileids_altered_by_revision_ids(self, revision_ids, _inv_weave=None):
         """Find the file ids and versions affected by revisions.
@@ -767,12 +757,20 @@
         file_id_revisions = {}
         pb = ui.ui_factory.nested_progress_bar()
         try:
-            parent_ids = self._find_parent_ids_of_revisions(revision_ids)
-            present_parent_inv_ids = self._find_present_inventory_ids(parent_ids)
+            revision_keys = [(r,) for r in revision_ids]
+            parent_keys = self._find_parent_keys_of_revisions(revision_keys)
+            # TODO: instead of using _find_present_inventory_keys, change the
+            #       code paths to allow missing inventories to be tolerated.
+            #       However, we only want to tolerate missing parent
+            #       inventories, not missing inventories for revision_ids
+            present_parent_inv_keys = self._find_present_inventory_keys(
+                                        parent_keys)
+            present_parent_inv_ids = set(
+                [k[-1] for k in present_parent_inv_keys])
             uninteresting_root_keys = set()
             interesting_root_keys = set()
-            inventories_to_read = set(present_parent_inv_ids)
-            inventories_to_read.update(revision_ids)
+            inventories_to_read = set(revision_ids)
+            inventories_to_read.update(present_parent_inv_ids)
             for inv in self.iter_inventories(inventories_to_read):
                 entry_chk_root_key = inv.id_to_entry.key()
                 if inv.revision_id in present_parent_inv_ids:
@@ -846,7 +844,7 @@
         return super(CHKInventoryRepository, self)._get_source(to_format)
 
 
-class GroupCHKStreamSource(repository.StreamSource):
+class GroupCHKStreamSource(KnitPackStreamSource):
     """Used when both the source and target repo are GroupCHK repos."""
 
     def __init__(self, from_repository, to_format):
@@ -854,6 +852,7 @@
         super(GroupCHKStreamSource, self).__init__(from_repository, to_format)
         self._revision_keys = None
         self._text_keys = None
+        self._text_fetch_order = 'groupcompress'
         self._chk_id_roots = None
         self._chk_p_id_roots = None
 
@@ -898,16 +897,10 @@
             p_id_roots_set.clear()
         return ('inventories', _filtered_inv_stream())
 
-    def _find_present_inventories(self, revision_ids):
-        revision_keys = [(r,) for r in revision_ids]
-        inventories = self.from_repository.inventories
-        present_inventories = inventories.get_parent_map(revision_keys)
-        return [p[-1] for p in present_inventories]
-
-    def _get_filtered_chk_streams(self, excluded_revision_ids):
+    def _get_filtered_chk_streams(self, excluded_revision_keys):
         self._text_keys = set()
-        excluded_revision_ids.discard(_mod_revision.NULL_REVISION)
-        if not excluded_revision_ids:
+        excluded_revision_keys.discard(_mod_revision.NULL_REVISION)
+        if not excluded_revision_keys:
             uninteresting_root_keys = set()
             uninteresting_pid_root_keys = set()
         else:
@@ -915,9 +908,9 @@
             # actually present
             # TODO: Update Repository.iter_inventories() to add
             #       ignore_missing=True
-            present_ids = self.from_repository._find_present_inventory_ids(
-                            excluded_revision_ids)
-            present_ids = self._find_present_inventories(excluded_revision_ids)
+            present_keys = self.from_repository._find_present_inventory_keys(
+                            excluded_revision_keys)
+            present_ids = [k[-1] for k in present_keys]
             uninteresting_root_keys = set()
             uninteresting_pid_root_keys = set()
             for inv in self.from_repository.iter_inventories(present_ids):
@@ -948,14 +941,6 @@
             self._chk_p_id_roots = None
         yield 'chk_bytes', _get_parent_id_basename_to_file_id_pages()
 
-    def _get_text_stream(self):
-        # Note: We know we don't have to handle adding root keys, because both
-        # the source and target are GCCHK, and those always support rich-roots
-        # We may want to request as 'unordered', in case the source has done a
-        # 'split' packing
-        return ('texts', self.from_repository.texts.get_record_stream(
-                            self._text_keys, 'groupcompress', False))
-
     def get_stream(self, search):
         revision_ids = search.get_keys()
         for stream_info in self._fetch_revision_texts(revision_ids):
@@ -966,8 +951,9 @@
         # For now, exclude all parents that are at the edge of ancestry, for
         # which we have inventories
         from_repo = self.from_repository
-        parent_ids = from_repo._find_parent_ids_of_revisions(revision_ids)
-        for stream_info in self._get_filtered_chk_streams(parent_ids):
+        parent_keys = from_repo._find_parent_keys_of_revisions(
+                        self._revision_keys)
+        for stream_info in self._get_filtered_chk_streams(parent_keys):
             yield stream_info
         yield self._get_text_stream()
 
@@ -991,8 +977,8 @@
         # no unavailable texts when the ghost inventories are not filled in.
         yield self._get_inventory_stream(missing_inventory_keys,
                                          allow_absent=True)
-        # We use the empty set for excluded_revision_ids, to make it clear that
-        # we want to transmit all referenced chk pages.
+        # We use the empty set for excluded_revision_keys, to make it clear
+        # that we want to transmit all referenced chk pages.
         for stream_info in self._get_filtered_chk_streams(set()):
             yield stream_info
 

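The ``_find_present_inventory_keys`` helper introduced above is a plain
presence filter over the inventory index: any key with an entry in the
parent map is present. Its intended use, condensed from the
``fileids_altered_by_revision_ids`` hunk in this diff (``repo`` stands in
for a CHKInventoryRepository instance)::

    revision_keys = [(r,) for r in revision_ids]
    parent_keys = repo._find_parent_keys_of_revisions(revision_keys)
    # Only parent inventories actually present in the repository may be
    # treated as uninteresting; missing ones are simply not excluded.
    present_parent_inv_keys = repo._find_present_inventory_keys(parent_keys)
    present_parent_inv_ids = set(k[-1] for k in present_parent_inv_keys)
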
=== modified file 'bzrlib/repofmt/pack_repo.py'
--- a/bzrlib/repofmt/pack_repo.py	2009-06-10 03:56:49 +0000
+++ b/bzrlib/repofmt/pack_repo.py	2009-06-17 17:57:15 +0000
@@ -73,6 +73,7 @@
     MetaDirRepositoryFormat,
     RepositoryFormat,
     RootCommitBuilder,
+    StreamSource,
     )
 import bzrlib.revision as _mod_revision
 from bzrlib.trace import (
@@ -2265,6 +2266,11 @@
             pb.finished()
         return result
 
+    def _get_source(self, to_format):
+        if to_format.network_name() == self._format.network_name():
+            return KnitPackStreamSource(self, to_format)
+        return super(KnitPackRepository, self)._get_source(to_format)
+
     def _make_parents_provider(self):
         return graph.CachingParentsProvider(self)
 
@@ -2384,6 +2390,79 @@
                 repo.unlock()
 
 
+class KnitPackStreamSource(StreamSource):
+    """A StreamSource used to transfer data between same-format KnitPack repos.
+
+    This source assumes:
+        1) Same serialization format for all objects
+        2) Same root information
+        3) XML format inventories
+        4) Atomic inserts (so we can stream inventory texts before text
+           content)
+        5) No chk_bytes
+    """
+
+    def __init__(self, from_repository, to_format):
+        super(KnitPackStreamSource, self).__init__(from_repository, to_format)
+        self._text_keys = None
+        self._text_fetch_order = 'unordered'
+
+    def _get_filtered_inv_stream(self, revision_ids):
+        from_repo = self.from_repository
+        parent_ids = from_repo._find_parent_ids_of_revisions(revision_ids)
+        parent_keys = [(p,) for p in parent_ids]
+        find_text_keys = from_repo._find_text_key_references_from_xml_inventory_lines
+        parent_text_keys = set(find_text_keys(
+            from_repo._inventory_xml_lines_for_keys(parent_keys)))
+        content_text_keys = set()
+        knit = KnitVersionedFiles(None, None)
+        factory = KnitPlainFactory()
+        def find_text_keys_from_content(record):
+            if record.storage_kind not in ('knit-delta-gz', 'knit-ft-gz'):
+                raise ValueError("Unknown content storage kind for"
+                    " inventory text: %s" % (record.storage_kind,))
+            # It's a knit record, it has a _raw_record field (even if it was
+            # reconstituted from a network stream).
+            raw_data = record._raw_record
+            # read the entire thing
+            revision_id = record.key[-1]
+            content, _ = knit._parse_record(revision_id, raw_data)
+            if record.storage_kind == 'knit-delta-gz':
+                line_iterator = factory.get_linedelta_content(content)
+            elif record.storage_kind == 'knit-ft-gz':
+                line_iterator = factory.get_fulltext_content(content)
+            content_text_keys.update(find_text_keys(
+                [(line, revision_id) for line in line_iterator]))
+        revision_keys = [(r,) for r in revision_ids]
+        def _filtered_inv_stream():
+            source_vf = from_repo.inventories
+            stream = source_vf.get_record_stream(revision_keys,
+                                                 'unordered', False)
+            for record in stream:
+                if record.storage_kind == 'absent':
+                    raise errors.NoSuchRevision(from_repo, record.key)
+                find_text_keys_from_content(record)
+                yield record
+            self._text_keys = content_text_keys - parent_text_keys
+        return ('inventories', _filtered_inv_stream())
+
+    def _get_text_stream(self):
+        # Note: We know we don't have to handle adding root keys, because both
+        # the source and target are the identical network name.
+        text_stream = self.from_repository.texts.get_record_stream(
+                        self._text_keys, self._text_fetch_order, False)
+        return ('texts', text_stream)
+
+    def get_stream(self, search):
+        revision_ids = search.get_keys()
+        for stream_info in self._fetch_revision_texts(revision_ids):
+            yield stream_info
+        self._revision_keys = [(rev_id,) for rev_id in revision_ids]
+        yield self._get_filtered_inv_stream(revision_ids)
+        yield self._get_text_stream()
+
+
+
 class RepositoryFormatPack(MetaDirRepositoryFormat):
     """Format logic for pack structured repositories.
 

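The "minimum number of texts" promised in the NEWS entry comes from the set
arithmetic at the end of ``_get_filtered_inv_stream`` above: text keys
referenced by the parent inventories are subtracted from those referenced
by the inventories being fetched. A self-contained illustration with
hypothetical (file_id, revision_id) keys::

    # Keys referenced by the inventories being fetched:
    content_text_keys = set([('file-a', 'rev-2'), ('file-b', 'rev-1')])
    # Keys already referenced by the parents' inventories, and therefore
    # already present in the target:
    parent_text_keys = set([('file-b', 'rev-1')])
    # Only texts introduced by the fetched revisions are streamed:
    assert content_text_keys - parent_text_keys == set([('file-a', 'rev-2')])
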
=== modified file 'bzrlib/repository.py'
--- a/bzrlib/repository.py	2009-06-15 08:01:21 +0000
+++ b/bzrlib/repository.py	2009-06-17 17:57:15 +0000
@@ -1919,29 +1919,25 @@
                     yield line, revid
 
     def _find_file_ids_from_xml_inventory_lines(self, line_iterator,
-        revision_ids):
+        revision_keys):
         """Helper routine for fileids_altered_by_revision_ids.
 
         This performs the translation of xml lines to revision ids.
 
         :param line_iterator: An iterator of lines, origin_version_id
-        :param revision_ids: The revision ids to filter for. This should be a
+        :param revision_keys: The revision ids to filter for. This should be a
             set or other type which supports efficient __contains__ lookups, as
-            the revision id from each parsed line will be looked up in the
-            revision_ids filter.
+            the revision key from each parsed line will be looked up in the
+            revision_keys filter.
         :return: a dictionary mapping altered file-ids to an iterable of
         revision_ids. Each altered file-ids has the exact revision_ids that
         altered it listed explicitly.
         """
         seen = set(self._find_text_key_references_from_xml_inventory_lines(
                 line_iterator).iterkeys())
-        # Note that revision_ids are revision keys.
-        parent_maps = self.revisions.get_parent_map(revision_ids)
-        parents = set()
-        map(parents.update, parent_maps.itervalues())
-        parents.difference_update(revision_ids)
+        parent_keys = self._find_parent_keys_of_revisions(revision_keys)
         parent_seen = set(self._find_text_key_references_from_xml_inventory_lines(
-            self._inventory_xml_lines_for_keys(parents)))
+            self._inventory_xml_lines_for_keys(parent_keys)))
         new_keys = seen - parent_seen
         result = {}
         setdefault = result.setdefault
@@ -1949,6 +1945,33 @@
             setdefault(key[0], set()).add(key[-1])
         return result
 
+    def _find_parent_ids_of_revisions(self, revision_ids):
+        """Find all parent ids that are mentioned in the revision graph.
+
+        :return: set of revisions that are parents of revision_ids which are
+            not part of revision_ids themselves
+        """
+        parent_map = self.get_parent_map(revision_ids)
+        parent_ids = set()
+        map(parent_ids.update, parent_map.itervalues())
+        parent_ids.difference_update(revision_ids)
+        parent_ids.discard(_mod_revision.NULL_REVISION)
+        return parent_ids
+
+    def _find_parent_keys_of_revisions(self, revision_keys):
+        """Similar to _find_parent_ids_of_revisions, but used with keys.
+
+        :param revision_keys: An iterable of revision_keys.
+        :return: The parents of all revision_keys that are not already in
+            revision_keys
+        """
+        parent_map = self.revisions.get_parent_map(revision_keys)
+        parent_keys = set()
+        map(parent_keys.update, parent_map.itervalues())
+        parent_keys.difference_update(revision_keys)
+        parent_keys.discard(_mod_revision.NULL_REVISION)
+        return parent_keys
+
     def fileids_altered_by_revision_ids(self, revision_ids, _inv_weave=None):
         """Find the file ids and versions affected by revisions.
 
@@ -3453,144 +3476,6 @@
         return self.source.revision_ids_to_search_result(result_set)
 
 
-class InterPackRepo(InterSameDataRepository):
-    """Optimised code paths between Pack based repositories."""
-
-    @classmethod
-    def _get_repo_format_to_test(self):
-        from bzrlib.repofmt import pack_repo
-        return pack_repo.RepositoryFormatKnitPack6RichRoot()
-
-    @staticmethod
-    def is_compatible(source, target):
-        """Be compatible with known Pack formats.
-
-        We don't test for the stores being of specific types because that
-        could lead to confusing results, and there is no need to be
-        overly general.
-
-        InterPackRepo does not support CHK based repositories.
-        """
-        from bzrlib.repofmt.pack_repo import RepositoryFormatPack
-        from bzrlib.repofmt.groupcompress_repo import RepositoryFormatCHK1
-        try:
-            are_packs = (isinstance(source._format, RepositoryFormatPack) and
-                isinstance(target._format, RepositoryFormatPack))
-            not_packs = (isinstance(source._format, RepositoryFormatCHK1) or
-                isinstance(target._format, RepositoryFormatCHK1))
-        except AttributeError:
-            return False
-        if not_packs or not are_packs:
-            return False
-        return InterRepository._same_model(source, target)
-
-    @needs_write_lock
-    def fetch(self, revision_id=None, pb=None, find_ghosts=False,
-            fetch_spec=None):
-        """See InterRepository.fetch()."""
-        if (len(self.source._fallback_repositories) > 0 or
-            len(self.target._fallback_repositories) > 0):
-            # The pack layer is not aware of fallback repositories, so when
-            # fetching from a stacked repository or into a stacked repository
-            # we use the generic fetch logic which uses the VersionedFiles
-            # attributes on repository.
-            from bzrlib.fetch import RepoFetcher
-            fetcher = RepoFetcher(self.target, self.source, revision_id,
-                    pb, find_ghosts, fetch_spec=fetch_spec)
-        if fetch_spec is not None:
-            if len(list(fetch_spec.heads)) != 1:
-                raise AssertionError(
-                    "InterPackRepo.fetch doesn't support "
-                    "fetching multiple heads yet.")
-            revision_id = list(fetch_spec.heads)[0]
-            fetch_spec = None
-        if revision_id is None:
-            # TODO:
-            # everything to do - use pack logic
-            # to fetch from all packs to one without
-            # inventory parsing etc, IFF nothing to be copied is in the target.
-            # till then:
-            source_revision_ids = frozenset(self.source.all_revision_ids())
-            revision_ids = source_revision_ids - \
-                frozenset(self.target.get_parent_map(source_revision_ids))
-            revision_keys = [(revid,) for revid in revision_ids]
-            index = self.target._pack_collection.revision_index.combined_index
-            present_revision_ids = set(item[1][0] for item in
-                index.iter_entries(revision_keys))
-            revision_ids = set(revision_ids) - present_revision_ids
-            # implementing the TODO will involve:
-            # - detecting when all of a pack is selected
-            # - avoiding as much as possible pre-selection, so the
-            # more-core routines such as create_pack_from_packs can filter in
-            # a just-in-time fashion. (though having a HEADS list on a
-            # repository might make this a lot easier, because we could
-            # sensibly detect 'new revisions' without doing a full index scan.
-        elif _mod_revision.is_null(revision_id):
-            # nothing to do:
-            return (0, [])
-        else:
-            revision_ids = self.search_missing_revision_ids(revision_id,
-                find_ghosts=find_ghosts).get_keys()
-            if len(revision_ids) == 0:
-                return (0, [])
-        return self._pack(self.source, self.target, revision_ids)
-
-    def _pack(self, source, target, revision_ids):
-        from bzrlib.repofmt.pack_repo import Packer
-        packs = source._pack_collection.all_packs()
-        pack = Packer(self.target._pack_collection, packs, '.fetch',
-            revision_ids).pack()
-        if pack is not None:
-            self.target._pack_collection._save_pack_names()
-            copied_revs = pack.get_revision_count()
-            # Trigger an autopack. This may duplicate effort as we've just done
-            # a pack creation, but for now it is simpler to think about as
-            # 'upload data, then repack if needed'.
-            self.target._pack_collection.autopack()
-            return (copied_revs, [])
-        else:
-            return (0, [])
-
-    @needs_read_lock
-    def search_missing_revision_ids(self, revision_id=None, find_ghosts=True):
-        """See InterRepository.missing_revision_ids().
-
-        :param find_ghosts: Find ghosts throughout the ancestry of
-            revision_id.
-        """
-        if not find_ghosts and revision_id is not None:
-            return self._walk_to_common_revisions([revision_id])
-        elif revision_id is not None:
-            # Find ghosts: search for revisions pointing from one repository to
-            # the other, and vice versa, anywhere in the history of revision_id.
-            graph = self.target.get_graph(other_repository=self.source)
-            searcher = graph._make_breadth_first_searcher([revision_id])
-            found_ids = set()
-            while True:
-                try:
-                    next_revs, ghosts = searcher.next_with_ghosts()
-                except StopIteration:
-                    break
-                if revision_id in ghosts:
-                    raise errors.NoSuchRevision(self.source, revision_id)
-                found_ids.update(next_revs)
-                found_ids.update(ghosts)
-            found_ids = frozenset(found_ids)
-            # Double query here: should be able to avoid this by changing the
-            # graph api further.
-            result_set = found_ids - frozenset(
-                self.target.get_parent_map(found_ids))
-        else:
-            source_ids = self.source.all_revision_ids()
-            # source_ids is the worst possible case we may need to pull.
-            # now we want to filter source_ids against what we actually
-            # have in target, but don't try to check for existence where we know
-            # we do not have a revision as that would be pointless.
-            target_ids = set(self.target.all_revision_ids())
-            result_set = set(source_ids).difference(target_ids)
-        return self.source.revision_ids_to_search_result(result_set)
-
-
 class InterDifferingSerializer(InterRepository):
 
     @classmethod
@@ -3871,7 +3756,6 @@
 InterRepository.register_optimiser(InterSameDataRepository)
 InterRepository.register_optimiser(InterWeaveRepo)
 InterRepository.register_optimiser(InterKnitRepo)
-InterRepository.register_optimiser(InterPackRepo)
 
 
 class CopyConverter(object):

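The InterPackRepo code removed above copied revisions by running Packer
directly against the target's pack collection. After this change, exact
pack-to-pack fetches go through the generic streaming API instead: the
source repository hands back a stream source object and the sink
inserts the record stream. A minimal sketch of that selection, based on
the _get_source() calls exercised by the new tests later in this patch
(the repository paths and revision ids are placeholders):

    from bzrlib import graph, repository
    from bzrlib.repofmt import pack_repo

    source = repository.Repository.open('source')  # hypothetical paths
    target = repository.Repository.open('target')
    # For two repositories with the same exact pack format this now
    # returns the new KnitPackStreamSource; mismatched formats fall
    # back to the generic repository.StreamSource.
    stream_source = source._get_source(target._format)
    assert isinstance(stream_source, pack_repo.KnitPackStreamSource)
    # A SearchResult(start_keys, exclude_keys, key_count, keys)
    # describes what to fetch, mirroring its usage in the chk test
    # further down in this patch.
    search = graph.SearchResult(set(['rev-2']), set(['rev-1']), 1,
                                set(['rev-2']))
    for vf_name, substream in stream_source.get_stream(search):
        for record in substream:
            pass  # a sink (e.g. target._get_sink()) would insert these
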
=== modified file 'bzrlib/tests/test_commit_merge.py'
--- a/bzrlib/tests/test_commit_merge.py	2009-03-23 14:59:43 +0000
+++ b/bzrlib/tests/test_commit_merge.py	2009-06-17 19:08:25 +0000
@@ -46,7 +46,7 @@
         wtx.commit('commit one', rev_id='x@u-0-1', allow_pointless=True)
         wty.commit('commit two', rev_id='y@u-0-1', allow_pointless=True)
 
-        self.assertEqual((1, []), by.fetch(bx))
+        by.fetch(bx)
         # just having the history there does nothing
         self.assertRaises(PointlessCommit,
                           wty.commit,

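The dropped assertion reflects that the generic streaming fetch no
longer reports how many revisions were copied. If a test really needed
that count, one sketch using only the Repository API already seen in
this patch would be (branch names as in the test above):

    # Previously: self.assertEqual((1, []), by.fetch(bx))
    before = set(by.repository.all_revision_ids())
    by.fetch(bx)
    copied = set(by.repository.all_revision_ids()) - before
    # len(copied) would be 1 here: just 'x@u-0-1'.
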
=== modified file 'bzrlib/tests/test_pack_repository.py'
--- a/bzrlib/tests/test_pack_repository.py	2009-06-10 03:56:49 +0000
+++ b/bzrlib/tests/test_pack_repository.py	2009-06-17 17:57:15 +0000
@@ -38,6 +38,10 @@
     upgrade,
     workingtree,
     )
+from bzrlib.repofmt import (
+    pack_repo,
+    groupcompress_repo,
+    )
 from bzrlib.repofmt.groupcompress_repo import RepositoryFormatCHK1
 from bzrlib.smart import (
     client,
@@ -556,58 +560,43 @@
             missing_ghost.get_inventory, 'ghost')
 
     def make_write_ready_repo(self):
-        repo = self.make_repository('.', format=self.get_format())
+        format = self.get_format()
+        if isinstance(format.repository_format, RepositoryFormatCHK1):
+            raise TestNotApplicable("No missing compression parents")
+        repo = self.make_repository('.', format=format)
         repo.lock_write()
+        self.addCleanup(repo.unlock)
         repo.start_write_group()
+        self.addCleanup(repo.abort_write_group)
         return repo
 
     def test_missing_inventories_compression_parent_prevents_commit(self):
         repo = self.make_write_ready_repo()
         key = ('junk',)
-        if not getattr(repo.inventories._index, '_missing_compression_parents',
-            None):
-            raise TestSkipped("No missing compression parents")
         repo.inventories._index._missing_compression_parents.add(key)
         self.assertRaises(errors.BzrCheckError, repo.commit_write_group)
         self.assertRaises(errors.BzrCheckError, repo.commit_write_group)
-        repo.abort_write_group()
-        repo.unlock()
 
     def test_missing_revisions_compression_parent_prevents_commit(self):
         repo = self.make_write_ready_repo()
         key = ('junk',)
-        if not getattr(repo.inventories._index, '_missing_compression_parents',
-            None):
-            raise TestSkipped("No missing compression parents")
         repo.revisions._index._missing_compression_parents.add(key)
         self.assertRaises(errors.BzrCheckError, repo.commit_write_group)
         self.assertRaises(errors.BzrCheckError, repo.commit_write_group)
-        repo.abort_write_group()
-        repo.unlock()
 
     def test_missing_signatures_compression_parent_prevents_commit(self):
         repo = self.make_write_ready_repo()
         key = ('junk',)
-        if not getattr(repo.inventories._index, '_missing_compression_parents',
-            None):
-            raise TestSkipped("No missing compression parents")
         repo.signatures._index._missing_compression_parents.add(key)
         self.assertRaises(errors.BzrCheckError, repo.commit_write_group)
         self.assertRaises(errors.BzrCheckError, repo.commit_write_group)
-        repo.abort_write_group()
-        repo.unlock()
 
     def test_missing_text_compression_parent_prevents_commit(self):
         repo = self.make_write_ready_repo()
         key = ('some', 'junk')
-        if not getattr(repo.inventories._index, '_missing_compression_parents',
-            None):
-            raise TestSkipped("No missing compression parents")
         repo.texts._index._missing_compression_parents.add(key)
         self.assertRaises(errors.BzrCheckError, repo.commit_write_group)
         e = self.assertRaises(errors.BzrCheckError, repo.commit_write_group)
-        repo.abort_write_group()
-        repo.unlock()
 
     def test_supports_external_lookups(self):
         repo = self.make_repository('.', format=self.get_format())

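Two things changed in make_write_ready_repo() above: the repeated
getattr() probes for _missing_compression_parents were replaced by a
single TestNotApplicable check against the CHK format, and the explicit
abort/unlock tail of each test moved into addCleanup() calls. TestCase
cleanups run last-in first-out, so the registration order preserves the
old sequence: the write group is aborted while the lock is still held.

    repo.lock_write()
    self.addCleanup(repo.unlock)             # runs second (LIFO)
    repo.start_write_group()
    self.addCleanup(repo.abort_write_group)  # runs first, while locked
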
=== modified file 'bzrlib/tests/test_repository.py'
--- a/bzrlib/tests/test_repository.py	2009-06-10 03:56:49 +0000
+++ b/bzrlib/tests/test_repository.py	2009-06-18 18:00:01 +0000
@@ -31,7 +31,10 @@
                            UnknownFormatError,
                            UnsupportedFormatError,
                            )
-from bzrlib import graph
+from bzrlib import (
+    graph,
+    tests,
+    )
 from bzrlib.branchbuilder import BranchBuilder
 from bzrlib.btree_index import BTreeBuilder, BTreeGraphIndex
 from bzrlib.index import GraphIndex, InMemoryGraphIndex
@@ -685,6 +688,147 @@
         self.assertEqual(65536,
             inv.parent_id_basename_to_file_id._root_node.maximum_size)
 
+    def test_stream_source_to_gc(self):
+        source = self.make_repository('source', format='development6-rich-root')
+        target = self.make_repository('target', format='development6-rich-root')
+        stream = source._get_source(target._format)
+        self.assertIsInstance(stream, groupcompress_repo.GroupCHKStreamSource)
+
+    def test_stream_source_to_non_gc(self):
+        source = self.make_repository('source', format='development6-rich-root')
+        target = self.make_repository('target', format='rich-root-pack')
+        stream = source._get_source(target._format)
+        # We don't want the child GroupCHKStreamSource
+        self.assertIs(type(stream), repository.StreamSource)
+
+    def test_get_stream_for_missing_keys_includes_all_chk_refs(self):
+        source_builder = self.make_branch_builder('source',
+                            format='development6-rich-root')
+        # We have to build a fairly large tree, so that we are sure the chk
+        # pages will have split into multiple pages.
+        entries = [('add', ('', 'a-root-id', 'directory', None))]
+        for i in 'abcdefghijklmnopqrstuvwxyz123456789':
+            for j in 'abcdefghijklmnopqrstuvwxyz123456789':
+                fname = i + j
+                fid = fname + '-id'
+                content = 'content for %s\n' % (fname,)
+                entries.append(('add', (fname, fid, 'file', content)))
+        source_builder.start_series()
+        source_builder.build_snapshot('rev-1', None, entries)
+        # Now change a few of them, so we get a few new pages for the second
+        # revision
+        source_builder.build_snapshot('rev-2', ['rev-1'], [
+            ('modify', ('aa-id', 'new content for aa-id\n')),
+            ('modify', ('cc-id', 'new content for cc-id\n')),
+            ('modify', ('zz-id', 'new content for zz-id\n')),
+            ])
+        source_builder.finish_series()
+        source_branch = source_builder.get_branch()
+        source_branch.lock_read()
+        self.addCleanup(source_branch.unlock)
+        target = self.make_repository('target', format='development6-rich-root')
+        source = source_branch.repository._get_source(target._format)
+        self.assertIsInstance(source, groupcompress_repo.GroupCHKStreamSource)
+
+        # On a regular pass, getting the inventories and chk pages for rev-2
+        # would only get the newly created chk pages
+        search = graph.SearchResult(set(['rev-2']), set(['rev-1']), 1,
+                                    set(['rev-2']))
+        simple_chk_records = []
+        for vf_name, substream in source.get_stream(search):
+            if vf_name == 'chk_bytes':
+                for record in substream:
+                    simple_chk_records.append(record.key)
+            else:
+                for _ in substream:
+                    continue
+        # 4 pages: the root (InternalNode), + 3 leaf pages which actually changed
+        self.assertEqual([('sha1:91481f539e802c76542ea5e4c83ad416bf219f73',),
+                          ('sha1:4ff91971043668583985aec83f4f0ab10a907d3f',),
+                          ('sha1:81e7324507c5ca132eedaf2d8414ee4bb2226187',),
+                          ('sha1:b101b7da280596c71a4540e9a1eeba8045985ee0',)],
+                         simple_chk_records)
+        # Now, when we do a similar call using 'get_stream_for_missing_keys'
+        # we should get a much larger set of pages.
+        missing = [('inventories', 'rev-2')]
+        full_chk_records = []
+        for vf_name, substream in source.get_stream_for_missing_keys(missing):
+            if vf_name == 'inventories':
+                for record in substream:
+                    self.assertEqual(('rev-2',), record.key)
+            elif vf_name == 'chk_bytes':
+                for record in substream:
+                    full_chk_records.append(record.key)
+            else:
+                self.fail('Should not be getting a stream of %s' % (vf_name,))
+        # We have 257 records now. This is because we have 1 root page, and 256
+        # leaf pages in a complete listing.
+        self.assertEqual(257, len(full_chk_records))
+        self.assertSubset(simple_chk_records, full_chk_records)
+
+
+class TestKnitPackStreamSource(tests.TestCaseWithMemoryTransport):
+
+    def test_source_to_exact_pack_092(self):
+        source = self.make_repository('source', format='pack-0.92')
+        target = self.make_repository('target', format='pack-0.92')
+        stream_source = source._get_source(target._format)
+        self.assertIsInstance(stream_source, pack_repo.KnitPackStreamSource)
+
+    def test_source_to_exact_pack_rich_root_pack(self):
+        source = self.make_repository('source', format='rich-root-pack')
+        target = self.make_repository('target', format='rich-root-pack')
+        stream_source = source._get_source(target._format)
+        self.assertIsInstance(stream_source, pack_repo.KnitPackStreamSource)
+
+    def test_source_to_exact_pack_19(self):
+        source = self.make_repository('source', format='1.9')
+        target = self.make_repository('target', format='1.9')
+        stream_source = source._get_source(target._format)
+        self.assertIsInstance(stream_source, pack_repo.KnitPackStreamSource)
+
+    def test_source_to_exact_pack_19_rich_root(self):
+        source = self.make_repository('source', format='1.9-rich-root')
+        target = self.make_repository('target', format='1.9-rich-root')
+        stream_source = source._get_source(target._format)
+        self.assertIsInstance(stream_source, pack_repo.KnitPackStreamSource)
+
+    def test_source_to_remote_exact_pack_19(self):
+        trans = self.make_smart_server('target')
+        trans.ensure_base()
+        source = self.make_repository('source', format='1.9')
+        target = self.make_repository('target', format='1.9')
+        target = repository.Repository.open(trans.base)
+        stream_source = source._get_source(target._format)
+        self.assertIsInstance(stream_source, pack_repo.KnitPackStreamSource)
+
+    def test_stream_source_to_non_exact(self):
+        source = self.make_repository('source', format='pack-0.92')
+        target = self.make_repository('target', format='1.9')
+        stream = source._get_source(target._format)
+        self.assertIs(type(stream), repository.StreamSource)
+
+    def test_stream_source_to_non_exact_rich_root(self):
+        source = self.make_repository('source', format='1.9')
+        target = self.make_repository('target', format='1.9-rich-root')
+        stream = source._get_source(target._format)
+        self.assertIs(type(stream), repository.StreamSource)
+
+    def test_source_to_remote_non_exact_pack_19(self):
+        trans = self.make_smart_server('target')
+        trans.ensure_base()
+        source = self.make_repository('source', format='1.9')
+        target = self.make_repository('target', format='1.6')
+        target = repository.Repository.open(trans.base)
+        stream_source = source._get_source(target._format)
+        self.assertIs(type(stream_source), repository.StreamSource)
+
+    def test_stream_source_to_knit(self):
+        source = self.make_repository('source', format='pack-0.92')
+        target = self.make_repository('target', format='dirstate')
+        stream = source._get_source(target._format)
+        self.assertIs(type(stream), repository.StreamSource)
+
 
 class TestDevelopment6FindParentIdsOfRevisions(TestCaseWithTransport):
     """Tests for _find_parent_ids_of_revisions."""
@@ -825,6 +969,12 @@
         """
         broken_repo = self.make_broken_repository()
         empty_repo = self.make_repository('empty-repo')
+        # See bug https://bugs.launchpad.net/bzr/+bug/389141 for information
+        # about why this was turned into expectFailure
+        self.expectFailure('new Stream fetch fills in missing compression'
+           ' parents (bug #389141)',
+           self.assertRaises, (errors.RevisionNotPresent, errors.BzrCheckError),
+                              empty_repo.fetch, broken_repo)
         self.assertRaises((errors.RevisionNotPresent, errors.BzrCheckError),
                           empty_repo.fetch, broken_repo)
 
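The expectFailure() guard added above evaluates the wrapped assertion
immediately: when it fails (the current behaviour around bug #389141)
the test is recorded as a known failure, and when it unexpectedly
passes the test fails so the guard gets noticed and removed. Either
way, control never reaches the retained assertRaises below it. The
general call shape, with illustrative names:

    self.expectFailure('reason the assertion currently fails',
        self.assertRaises, ExpectedError,
        callable_under_test, some_arg)
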
@@ -1204,84 +1354,3 @@
         self.assertTrue(new_pack.inventory_index._optimize_for_size)
         self.assertTrue(new_pack.text_index._optimize_for_size)
         self.assertTrue(new_pack.signature_index._optimize_for_size)
-
-
-class TestGCCHKPackCollection(TestCaseWithTransport):
-
-    def test_stream_source_to_gc(self):
-        source = self.make_repository('source', format='development6-rich-root')
-        target = self.make_repository('target', format='development6-rich-root')
-        stream = source._get_source(target._format)
-        self.assertIsInstance(stream, groupcompress_repo.GroupCHKStreamSource)
-
-    def test_stream_source_to_non_gc(self):
-        source = self.make_repository('source', format='development6-rich-root')
-        target = self.make_repository('target', format='rich-root-pack')
-        stream = source._get_source(target._format)
-        # We don't want the child GroupCHKStreamSource
-        self.assertIs(type(stream), repository.StreamSource)
-
-    def test_get_stream_for_missing_keys_includes_all_chk_refs(self):
-        source_builder = self.make_branch_builder('source',
-                            format='development6-rich-root')
-        # We have to build a fairly large tree, so that we are sure the chk
-        # pages will have split into multiple pages.
-        entries = [('add', ('', 'a-root-id', 'directory', None))]
-        for i in 'abcdefghijklmnopqrstuvwxyz123456789':
-            for j in 'abcdefghijklmnopqrstuvwxyz123456789':
-                fname = i + j
-                fid = fname + '-id'
-                content = 'content for %s\n' % (fname,)
-                entries.append(('add', (fname, fid, 'file', content)))
-        source_builder.start_series()
-        source_builder.build_snapshot('rev-1', None, entries)
-        # Now change a few of them, so we get a few new pages for the second
-        # revision
-        source_builder.build_snapshot('rev-2', ['rev-1'], [
-            ('modify', ('aa-id', 'new content for aa-id\n')),
-            ('modify', ('cc-id', 'new content for cc-id\n')),
-            ('modify', ('zz-id', 'new content for zz-id\n')),
-            ])
-        source_builder.finish_series()
-        source_branch = source_builder.get_branch()
-        source_branch.lock_read()
-        self.addCleanup(source_branch.unlock)
-        target = self.make_repository('target', format='development6-rich-root')
-        source = source_branch.repository._get_source(target._format)
-        self.assertIsInstance(source, groupcompress_repo.GroupCHKStreamSource)
-
-        # On a regular pass, getting the inventories and chk pages for rev-2
-        # would only get the newly created chk pages
-        search = graph.SearchResult(set(['rev-2']), set(['rev-1']), 1,
-                                    set(['rev-2']))
-        simple_chk_records = []
-        for vf_name, substream in source.get_stream(search):
-            if vf_name == 'chk_bytes':
-                for record in substream:
-                    simple_chk_records.append(record.key)
-            else:
-                for _ in substream:
-                    continue
-        # 3 pages, the root (InternalNode), + 2 pages which actually changed
-        self.assertEqual([('sha1:91481f539e802c76542ea5e4c83ad416bf219f73',),
-                          ('sha1:4ff91971043668583985aec83f4f0ab10a907d3f',),
-                          ('sha1:81e7324507c5ca132eedaf2d8414ee4bb2226187',),
-                          ('sha1:b101b7da280596c71a4540e9a1eeba8045985ee0',)],
-                         simple_chk_records)
-        # Now, when we do a similar call using 'get_stream_for_missing_keys'
-        # we should get a much larger set of pages.
-        missing = [('inventories', 'rev-2')]
-        full_chk_records = []
-        for vf_name, substream in source.get_stream_for_missing_keys(missing):
-            if vf_name == 'inventories':
-                for record in substream:
-                    self.assertEqual(('rev-2',), record.key)
-            elif vf_name == 'chk_bytes':
-                for record in substream:
-                    full_chk_records.append(record.key)
-            else:
-                self.fail('Should not be getting a stream of %s' % (vf_name,))
-        # We have 257 records now. This is because we have 1 root page, and 256
-        # leaf pages in a complete listing.
-        self.assertEqual(257, len(full_chk_records))
-        self.assertSubset(simple_chk_records, full_chk_records)

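For reference, the arithmetic behind the relocated chk test's
expectations: the nested loops build one file per two-character name
over a 35-character alphabet, and get_stream_for_missing_keys() streams
the complete chk listing of one root page plus 256 leaf pages. A quick
sanity check in plain Python (no bzrlib required):

    alphabet = 'abcdefghijklmnopqrstuvwxyz123456789'
    assert len(alphabet) == 35           # 26 letters + 9 digits
    assert len(alphabet) ** 2 == 1225    # files added in rev-1
    assert 1 + 256 == 257                # root page + leaf pages streamed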