From 112cb6a84485e8b22227b1141fb1ca015d1455d2 Mon Sep 17 00:00:00 2001 From: Chris Fischer Date: Mon, 18 Dec 2023 16:26:45 -0700 Subject: [PATCH] Squashed 'manage_externals/' changes from fde04e4..876b344 MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit 876b344 Merge pull request #208 from jedwards4b/add_version_file 46b2eb9 improve workflow name 1506bbf Need to update file before creating version cb06623 use github workflow to update version.txt 3d13177 update version.txt ba00e50 Merge branch 'main' into add_version_file 757fed1 update version.txt cd64e76 add version.txt and pre-push hook 0f884bf Merge pull request #205 from jedwards4b/sunset_svn_git_access 82a5edf merge in billsacks:svn_testing_no_github 17532c1 Use a local svn repo for testing 9c90434 different method to determine if in tests 539952e remove debug print statement cc5434f fix submodule testing 1d7f288 remove broken tests 04e94a5 provide a meaningful error message 38bcc0a Merge pull request #201 from jedwards4b/partial_match b4466a5 remove debug print statement c3cf3ec fix issue with partial branch match 7b6d92e Merge pull request #198 from johnpaulalex/gitdir 927ce3a Merge pull request #197 from johnpaulalex/testpath a04f114 Merge pull request #196 from johnpaulalex/readmod d9c14bf Change the rest of the methods to use -C. Still some usage of getcwd in test_unit_repository_git. 332b106 Fix incorrect logged path of checkout_externals in test_sys_checkout: it was basically the parent of the current directory, which varies throughout the test. (it called abspath with '{0}/../../', which adds arbitrary and not-interpolated subdir '{0}' to the path, then removes it and removes one more level). 932a749 Remove printlog from read_gitmodules_file since read_externals_description_file() already has a nearly-the-same printlog (but add it to the other caller). 5d13719 Merge pull request #195 from johnpaulalex/check_repo 4233954 Update utest to mock _git_remote_verbose in a new way, since it is now called via the GitRepository class rather than on the specific GitRepository instance. d7a42ae Check that desired repo was actually checked out. 71596bb Merge pull request #194 from johnpaulalex/manic2 4c96e82 Make the MANIC_TEST_BARE_REPO_ROOT env var special - give it a constant for easy tracking, and automatically tear it down after each test. 259bfc0 test_sys_checkout: use actual paths in on-the-fly configs rather than MANIC_TEST_BARE_REPO_ROOT env var. This will make it easier to test (in the near future) that checkout_externals actually checked out the desired repo dir. 557bbd6 Merge pull request #193 from johnpaulalex/manic 5314eed Remove MANIC_TEST_TMP_REPO_ROOT environment variable in favor of module-level variable. 345fc1e Merge pull request #191 from johnpaulalex/test_doc12 2117b84 test_sys_checkout: verify that basic by-tag/branch/hash tests actually take us to the correct git tag/branch/hash. 94d6e5f Merge pull request #190 from johnpaulalex/test_doc11 3ff33a6 Inline local-path-creation methods 47dea7f Merge pull request #189 from johnpaulalex/test_doc10 9ea75cb Grab-bag of renamings: Remove redundant _NAME from repo constants, and consistently add _REPO suffix (This causes the majority of diffs). c0c847e Merge pull request #188 from johnpaulalex/test_doc9 2dd5ce0 test_sys_checkout.py: only check for correct 'required' or 'optional' state in the test that exercises required vs optional behavior. Removed a lot of boilerplate. 
eb30859 Merge pull request #187 from johnpaulalex/test_doc8 1832e1f test_sys_checkout: Simplify many tests to only use a single external. 8689d61 Merge pull request #186 from johnpaulalex/test_doc7 fbee425 Grab bag of test_sys_checkout cleanups: Doc inside of each test more clearly/consistently. TestSysCheckoutSVN didn’t get the inlining-of-helper-methods treatment, now it has that. Move various standalone repo helper methods (like create_branch) into a RepoUtils class. README.md was missing newlines when rendered as markdown. Doc the return value of checkout.main Fix test_container_exclude_component - it was looking for the wrong key (which is never present); now it looks for the correct key. f0ed44a Merge pull request #185 from johnpaulalex/test_doc6 a3d59f5 Merge pull request #184 from johnpaulalex/test_doc5 5329c8b test_sys_checkout: Inline config generation functions that are only called once. 464f2c7 test_sys_checkout: Inline another layer (per-config-file checks). Rename the 4 methods that are used multiple times, to reflect what they do rather than what they're called. 8872c0d Merge pull request #183 from johnpaulalex/doc_test4 c045335 Merge pull request #182 from johnpaulalex/doc_test3 c583b95 Merge pull request #181 from johnpaulalex/doc_test2 e01cfe2 test_sys_checkout: less confusing handling of return values from checkout_externals. Specifically, when doing a checkout, don't return tree_status from _before_ the checkout. Make a new wrapper to call checkout_externals a second time, to calculate the new status after a checkout (very frequent pattern). 2328681 test_sys_checkout: Remove another layer (which generates test component names) c3717b6 Merge pull request #180 from johnpaulalex/doc_test 36d7a44 test_sys_checkout.py: remove one layer of functions (that check for local status enums). No-op. 2c4584b More documentation about tests: * contents of test repositories (n a new README.md) * various constants in test_sys_checkout.py that point to those contents, and terminology like container/simple/mixed. * in each test method, the scenarios being tested. * The coupling between test methods. 55e74bd Merge pull request #179 from johnpaulalex/circ 66be842 Remove circular dependency by making _External stop doing tricky things with sourcetrees. 82d3b24 Merge pull request #178 from johnpaulalex/test_doc 3223f49 Additional documentation of system tests - global variables, method descriptions. 45b7c01 Merge pull request #177 from jedwards4b/git_workflow ace90b2 try setting credentials this way f4d6aa9 try setting credentials this way 1d61a69 use this to set git credentials 7f9d330 use this to set git credentials 5ac731b add tmate code 836847b get git workflow working dcd462d Merge pull request #176 from jedwards4b/add_github_testing 2d2479e Merge pull request #175 from johnpaulalex/fix 711a53f add github testing of prs and automatic tagging of main cfe0f88 fix typos 5665d61 Fix broken checkout behavior introduced by PR #172. 27909e2 Merge pull request #173 from johnpaulalex/readall 00ad044 Further tiny refactorings and docs of checkout API (no-op). Remove unused load_all param in _External.checkout(). Rename _External.checkout_externals() to checkout_subexternals(), to remove the ambiguity about whether the main external pointed to by the _External is itelf checked out (it is not) Clarify load_all documentation - it’s always recursive, but applies different criteria at each level. Rename variables in checkout.py (e.g. ext_description) to match the equivalent code in sourcetree.py. 
2ea3d1a Merge pull request #172 from johnpaulalex/fixit 43bf809 Merge pull request #171 from johnpaulalex/docstatus e6aa7d2 Merge pull request #170 from johnpaulalex/printdir adbd715 On checkout, refresh locally installed optional packages regardless of whether -o is passed in. add0745 Comment tweaks, and fix 'ppath' typo 696527c Document the format of various status dictionaries, and the various paths and path components within an _External. c677b94 When processing an external, print out its path in addition to the base filename (to disambiguate all the externals.cfg's) 975d7fd Merge pull request #169 from johnpaulalex/docfix_branch 09709e3 Document _Externals.status(). The original comment was apparently copy-pasted from checkout(). 1d880e0 Merge pull request #167 from billsacks/fix_svn_on_windows 3510da8 Tweak a unit test to improve coverage eb7fc13 Handle the possibility that the URL already ends with '/' 02ea87e Fix svn URLs on Windows b1c02ab Merge pull request #165 from gold2718/doc_fix 9f4be8c Add documentation about externals = None feature a3b3a03 Merge pull request #162 from ESMCI/fischer/python3 d4f1b1e Change shebang lines to python3 2fd941a Merge pull request #158 from billsacks/modified_solution de08dc2 Add another option for when an external is in a modified state e954582 Merge pull request #156 from billsacks/onbranch_show_hash 952e44d Change output: put tag/hash before branch name 1028843 Fix pre-existing pylint issues 01b13f7 When on a branch, show tag/hash, too 39ad532 Merge pull request #150 from gold2718/fix_combo_config 75f8f02 Merge pull request #152 from jedwards4b/sort_by_local_path 42687bd remove commented code 29e26af fix pylint issues 7c9f3c6 add a test for nested repo checkout 75c5353 fix spacing 24a3726 improve sorting, checkout externals with each comp 29f45b0 remove py2 test and fix super call 880a4e7 remove decode 1c53be8 no need for set call 36c56db simplier fix for issue dc67cc6 simpler solution b32c6fc fix to allow submodule name different from path 5b5e1c2 Merge pull request #144 from billsacks/improve_errmsg c983863 Add another option for dealing with modified externals 59ce252 Add some details to the error message when externals are modified be5a1a4 Merge pull request #143 from jedwards4b/add_exclude 2aa014a fix lint issue 49cd5e8 fix lint issues 418173f Added tests for ExternalsDescriptionDict afab352 fix lint issue be85b7d fix the test a580a57 push test d437108 add a test 21affe3 fix formatting issue 72e6b64 add an exclude option c33a3bd Merge pull request #139 from jedwards4b/ignore_branch_prop b124a9a ignore this silently, we use a hash so it does not matter git-subtree-dir: manage_externals git-subtree-split: 876b3440e7356fed2bf417659a5f7b5527a92a2f --- .github/workflows/bumpversion.yml | 46 + .github/workflows/tests.yml | 30 + .gitignore | 3 + .travis.yml | 3 +- README.md | 5 + checkout_externals | 11 +- manic/checkout.py | 99 +- manic/externals_description.py | 88 +- manic/externals_status.py | 30 +- manic/repository_factory.py | 1 + manic/repository_git.py | 354 +-- manic/repository_svn.py | 12 +- manic/sourcetree.py | 442 ++-- manic/utils.py | 2 +- test/README.md | 38 +- test/repos/README.md | 33 + test/repos/simple-ext.svn/README.txt | 5 + test/repos/simple-ext.svn/conf/authz | 32 + test/repos/simple-ext.svn/conf/hooks-env.tmpl | 19 + test/repos/simple-ext.svn/conf/passwd | 8 + test/repos/simple-ext.svn/conf/svnserve.conf | 81 + test/repos/simple-ext.svn/db/current | 1 + test/repos/simple-ext.svn/db/format | 3 + 
test/repos/simple-ext.svn/db/fs-type | 1 + test/repos/simple-ext.svn/db/fsfs.conf | 200 ++ test/repos/simple-ext.svn/db/min-unpacked-rev | 1 + test/repos/simple-ext.svn/db/rep-cache.db | Bin 0 -> 8192 bytes .../simple-ext.svn/db/rep-cache.db-journal | 0 test/repos/simple-ext.svn/db/revprops/0/0 | 5 + test/repos/simple-ext.svn/db/revprops/0/1 | 13 + test/repos/simple-ext.svn/db/revprops/0/2 | 13 + test/repos/simple-ext.svn/db/revprops/0/3 | 13 + test/repos/simple-ext.svn/db/revs/0/0 | Bin 0 -> 253 bytes test/repos/simple-ext.svn/db/revs/0/1 | Bin 0 -> 725 bytes test/repos/simple-ext.svn/db/revs/0/2 | Bin 0 -> 816 bytes test/repos/simple-ext.svn/db/revs/0/3 | Bin 0 -> 769 bytes test/repos/simple-ext.svn/db/txn-current | 1 + test/repos/simple-ext.svn/db/txn-current-lock | 0 test/repos/simple-ext.svn/db/uuid | 2 + test/repos/simple-ext.svn/db/write-lock | 0 test/repos/simple-ext.svn/format | 1 + .../simple-ext.svn/hooks/post-commit.tmpl | 62 + .../repos/simple-ext.svn/hooks/post-lock.tmpl | 64 + .../hooks/post-revprop-change.tmpl | 69 + .../simple-ext.svn/hooks/post-unlock.tmpl | 61 + .../simple-ext.svn/hooks/pre-commit.tmpl | 91 + test/repos/simple-ext.svn/hooks/pre-lock.tmpl | 95 + .../hooks/pre-revprop-change.tmpl | 79 + .../simple-ext.svn/hooks/pre-unlock.tmpl | 87 + .../simple-ext.svn/hooks/start-commit.tmpl | 81 + test/repos/simple-ext.svn/locks/db-logs.lock | 3 + test/repos/simple-ext.svn/locks/db.lock | 3 + test/test_sys_checkout.py | 2216 ++++++++--------- test/test_sys_repository_git.py | 60 +- test/test_unit_externals_description.py | 79 +- test/test_unit_externals_status.py | 2 +- test/test_unit_repository.py | 2 +- test/test_unit_repository_git.py | 137 +- test/test_unit_repository_svn.py | 4 +- test/test_unit_utils.py | 2 +- version.txt | 1 + 61 files changed, 3101 insertions(+), 1693 deletions(-) create mode 100644 .github/workflows/bumpversion.yml create mode 100644 .github/workflows/tests.yml create mode 100644 test/repos/README.md create mode 100644 test/repos/simple-ext.svn/README.txt create mode 100644 test/repos/simple-ext.svn/conf/authz create mode 100644 test/repos/simple-ext.svn/conf/hooks-env.tmpl create mode 100644 test/repos/simple-ext.svn/conf/passwd create mode 100644 test/repos/simple-ext.svn/conf/svnserve.conf create mode 100644 test/repos/simple-ext.svn/db/current create mode 100644 test/repos/simple-ext.svn/db/format create mode 100644 test/repos/simple-ext.svn/db/fs-type create mode 100644 test/repos/simple-ext.svn/db/fsfs.conf create mode 100644 test/repos/simple-ext.svn/db/min-unpacked-rev create mode 100644 test/repos/simple-ext.svn/db/rep-cache.db create mode 100644 test/repos/simple-ext.svn/db/rep-cache.db-journal create mode 100644 test/repos/simple-ext.svn/db/revprops/0/0 create mode 100644 test/repos/simple-ext.svn/db/revprops/0/1 create mode 100644 test/repos/simple-ext.svn/db/revprops/0/2 create mode 100644 test/repos/simple-ext.svn/db/revprops/0/3 create mode 100644 test/repos/simple-ext.svn/db/revs/0/0 create mode 100644 test/repos/simple-ext.svn/db/revs/0/1 create mode 100644 test/repos/simple-ext.svn/db/revs/0/2 create mode 100644 test/repos/simple-ext.svn/db/revs/0/3 create mode 100644 test/repos/simple-ext.svn/db/txn-current create mode 100644 test/repos/simple-ext.svn/db/txn-current-lock create mode 100644 test/repos/simple-ext.svn/db/uuid create mode 100644 test/repos/simple-ext.svn/db/write-lock create mode 100644 test/repos/simple-ext.svn/format create mode 100755 test/repos/simple-ext.svn/hooks/post-commit.tmpl create mode 100755 
test/repos/simple-ext.svn/hooks/post-lock.tmpl create mode 100755 test/repos/simple-ext.svn/hooks/post-revprop-change.tmpl create mode 100755 test/repos/simple-ext.svn/hooks/post-unlock.tmpl create mode 100755 test/repos/simple-ext.svn/hooks/pre-commit.tmpl create mode 100755 test/repos/simple-ext.svn/hooks/pre-lock.tmpl create mode 100755 test/repos/simple-ext.svn/hooks/pre-revprop-change.tmpl create mode 100755 test/repos/simple-ext.svn/hooks/pre-unlock.tmpl create mode 100755 test/repos/simple-ext.svn/hooks/start-commit.tmpl create mode 100644 test/repos/simple-ext.svn/locks/db-logs.lock create mode 100644 test/repos/simple-ext.svn/locks/db.lock mode change 100644 => 100755 test/test_sys_checkout.py mode change 100644 => 100755 test/test_unit_repository_svn.py create mode 100644 version.txt diff --git a/.github/workflows/bumpversion.yml b/.github/workflows/bumpversion.yml new file mode 100644 index 0000000000..a37084a04d --- /dev/null +++ b/.github/workflows/bumpversion.yml @@ -0,0 +1,46 @@ +name: Update file on PR merge +on: + pull_request: + branches: + - main + types: closed + +jobs: + update_version: + if: github.event.pull_request.merged == true + runs-on: ubuntu-latest + steps: + - name: checkout + uses: actions/checkout@v3 + with: + # Fetch full depth, otherwise the last step overwrites the last commit's parent, essentially removing the graph. + fetch-depth: 0 + + - name: GenerateTag + id: name_tag + uses: mathieudutour/github-tag-action@v6.1 + with: + github_token: ${{ secrets.GITHUB_TOKEN }} + create_annotated_tag: true + default_bump: patch + dry_run: true + tag_prefix: manic- + - name: Update version.txt + run: | + echo "${{ steps.name_tag.outputs.new_tag }}" > version.txt + - name: Amend the last commit + run: | + git config --global user.email "gitbot@openrct2.org" + git config --global user.name "OpenRCT2 git bot" + git commit -a --amend --no-edit + git push --force-with-lease + echo "Complete"name: Bump version + - name: Bump version and push tag + id: really_tag_version + uses: mathieudutour/github-tag-action@v6.1 + with: + github_token: ${{ secrets.GITHUB_TOKEN }} + create_annotated_tag: true + default_bump: patch + dry_run: false + tag_prefix: manic- diff --git a/.github/workflows/tests.yml b/.github/workflows/tests.yml new file mode 100644 index 0000000000..dd75b91b49 --- /dev/null +++ b/.github/workflows/tests.yml @@ -0,0 +1,30 @@ +# This is a workflow to compile the cmeps source without cime +name: Test Manic + +# Controls when the action will run. 
Triggers the workflow on push or pull request +# events but only for the master branch +on: + push: + branches: [ main ] + pull_request: + branches: [ main ] + +# A workflow run is made up of one or more jobs that can run sequentially or in parallel +jobs: + test-manic: + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v3 + - name: Test Manic + run: | + pushd test + git config --global user.email "devnull@example.com" + git config --global user.name "GITHUB tester" + git config --global protocol.file.allow always + make utest + make stest + popd + + - name: Setup tmate session + if: ${{ failure() }} + uses: mxschmitt/action-tmate@v3 diff --git a/.gitignore b/.gitignore index 411de5d96e..a71ac0cd75 100644 --- a/.gitignore +++ b/.gitignore @@ -12,3 +12,6 @@ components/ # generated python files *.pyc + +# test tmp file +test/tmp diff --git a/.travis.yml b/.travis.yml index 1990cb9604..d9b24c584d 100644 --- a/.travis.yml +++ b/.travis.yml @@ -1,7 +1,6 @@ language: python os: linux -python: - - "2.7" +python: - "3.4" - "3.5" - "3.6" diff --git a/README.md b/README.md index c931c8e213..9475301b5d 100644 --- a/README.md +++ b/README.md @@ -201,6 +201,11 @@ The root of the source tree will be referred to as `${SRC_ROOT}` below. externals description file pointed 'useful_library/sub-xternals.cfg', Then the main 'externals' field in the top level repo should point to 'sub-externals.cfg'. + Note that by default, `checkout_externals` will clone an external's + submodules. As a special case, the entry, `externals = None`, will + prevent this behavior. For more control over which externals are + checked out, create an externals file (and see the `from_submodule` + configuration entry below). * from_submodule (True / False) : used to pull the repo_url, local_path, and hash properties for this external from the .gitmodules file in diff --git a/checkout_externals b/checkout_externals index a0698baef0..536c64eb65 100755 --- a/checkout_externals +++ b/checkout_externals @@ -1,4 +1,4 @@ -#!/usr/bin/env python +#!/usr/bin/env python3 """Main driver wrapper around the manic/checkout utility. @@ -12,7 +12,7 @@ from __future__ import print_function import sys import traceback - +import os import manic if sys.hexversion < 0x02070000: @@ -26,6 +26,13 @@ if sys.hexversion < 0x02070000: if __name__ == '__main__': ARGS = manic.checkout.commandline_arguments() + if ARGS.version: + version_info = '' + version_file_path = os.path.join(os.path.dirname(__file__),'version.txt') + with open(version_file_path) as f: + version_info = f.readlines()[0].strip() + print(version_info) + sys.exit(0) try: RET_STATUS, _ = manic.checkout.main(ARGS) sys.exit(RET_STATUS) diff --git a/manic/checkout.py b/manic/checkout.py index edc5655954..25c05ea233 100755 --- a/manic/checkout.py +++ b/manic/checkout.py @@ -1,4 +1,4 @@ -#!/usr/bin/env python +#!/usr/bin/env python3 """ Tool to assemble repositories represented in a model-description file. @@ -227,6 +227,12 @@ def commandline_arguments(args=None): Now, %(prog)s will process Externals.cfg and also process Externals_LIBX.cfg as if it was a sub-external. + Note that by default, checkout_externals will clone an external's + submodules. As a special case, the entry, "externals = None", will + prevent this behavior. For more control over which externals are + checked out, create an externals file (and see the from_submodule + configuration entry below). 
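As an illustrative sketch only (component name, URL, and tag are hypothetical placeholders, not taken from this patch), an externals entry using the `externals = None` form described above would look like:

```cfg
[my_component]
local_path = components/my_component
protocol = git
repo_url = https://github.com/example/my_component
tag = v1.0.0
required = True
# Do not clone this external's own submodules:
externals = None

[externals_description]
schema_version = 1.0.0
```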
+ * from_submodule (True / False) : used to pull the repo_url, local_path, and hash properties for this external from the .gitmodules file in this repository. Note that the section name (the entry in square @@ -279,6 +285,9 @@ def commandline_arguments(args=None): help='The externals description filename. ' 'Default: %(default)s.') + parser.add_argument('-x', '--exclude', nargs='*', + help='Component(s) listed in the externals file which should be ignored.') + parser.add_argument('-o', '--optional', action='store_true', default=False, help='By default only the required externals ' 'are checked out. This flag will also checkout the ' @@ -295,6 +304,9 @@ def commandline_arguments(args=None): 'used up to two times, increasing the ' 'verbosity level each time.') + parser.add_argument('--version', action='store_true', default=False, + help='Print manage_externals version and exit.') + parser.add_argument('--svn-ignore-ancestry', action='store_true', default=False, help='By default, subversion will abort if a component is ' 'already checked out and there is no common ancestry with ' @@ -329,7 +341,34 @@ def commandline_arguments(args=None): options = parser.parse_args() return options - +def _dirty_local_repo_msg(program_name, config_file): + return """The external repositories labeled with 'M' above are not in a clean state. +The following are four options for how to proceed: +(1) Go into each external that is not in a clean state and issue either a 'git status' or + an 'svn status' command (depending on whether the external is managed by git or + svn). Either revert or commit your changes so that all externals are in a clean + state. (To revert changes in git, follow the instructions given when you run 'git + status'.) (Note, though, that it is okay to have untracked files in your working + directory.) Then rerun {program_name}. +(2) Alternatively, you do not have to rely on {program_name}. Instead, you can manually + update out-of-sync externals (labeled with 's' above) as described in the + configuration file {config_file}. (For example, run 'git fetch' and 'git checkout' + commands to checkout the appropriate tags for each external, as given in + {config_file}.) +(3) You can also use {program_name} to manage most, but not all externals: You can specify + one or more externals to ignore using the '-x' or '--exclude' argument to + {program_name}. Excluding externals labeled with 'M' will allow {program_name} to + update the other, non-excluded externals. +(4) As a last resort, if you are confident that there is no work that needs to be saved + from a given external, you can remove that external (via "rm -rf [directory]") and + then rerun the {program_name} tool. This option is mainly useful as a workaround for + issues with this tool (such as https://github.com/ESMCI/manage_externals/issues/157). +The external repositories labeled with '?' above are not under version +control using the expected protocol. If you are sure you want to switch +protocols, and you don't have any work you need to save from this +directory, then run "rm -rf [directory]" before rerunning the +{program_name} tool. +""".format(program_name=program_name, config_file=config_file) # --------------------------------------------------------------------- # # main @@ -342,9 +381,9 @@ def main(args): the --all option is passed. Returns a tuple (overall_status, tree_status). overall_status is 0 - on success, non-zero on failure. 
tree_status gives the full status - *before* executing the checkout command - i.e., the status that it - used to determine if it's safe to proceed with the checkout. + on success, non-zero on failure. tree_status is a dict mapping local path + to ExternalStatus -- if no checkout is happening. If checkout is happening, tree_status + is None. """ if args.do_logging: logging.basicConfig(filename=LOG_FILE_NAME, @@ -360,57 +399,41 @@ def main(args): load_all = True root_dir = os.path.abspath(os.getcwd()) - external_data = read_externals_description_file(root_dir, args.externals) - external = create_externals_description( - external_data, components=args.components) + model_data = read_externals_description_file(root_dir, args.externals) + ext_description = create_externals_description( + model_data, components=args.components, exclude=args.exclude) for comp in args.components: - if comp not in external.keys(): + if comp not in ext_description.keys(): + # Note we can't print out the list of found externals because + # they were filtered in create_externals_description above. fatal_error( "No component {} found in {}".format( comp, args.externals)) - source_tree = SourceTree(root_dir, external, svn_ignore_ancestry=args.svn_ignore_ancestry) - printlog('Checking status of externals: ', end='') - tree_status = source_tree.status() + source_tree = SourceTree(root_dir, ext_description, svn_ignore_ancestry=args.svn_ignore_ancestry) + if args.components: + components_str = 'specified components' + else: + components_str = 'required & optional components' + printlog('Checking local status of ' + components_str + ': ', end='') + tree_status = source_tree.status(print_progress=True) printlog('') if args.status: # user requested status-only - for comp in sorted(tree_status.keys()): + for comp in sorted(tree_status): tree_status[comp].log_status_message(args.verbose) else: # checkout / update the external repositories. safe_to_update = check_safe_to_update_repos(tree_status) if not safe_to_update: # print status - for comp in sorted(tree_status.keys()): + for comp in sorted(tree_status): tree_status[comp].log_status_message(args.verbose) # exit gracefully - msg = """The external repositories labeled with 'M' above are not in a clean state. - -The following are two options for how to proceed: - -(1) Go into each external that is not in a clean state and issue either - an 'svn status' or a 'git status' command. Either revert or commit - your changes so that all externals are in a clean state. (Note, - though, that it is okay to have untracked files in your working - directory.) Then rerun {program_name}. - -(2) Alternatively, you do not have to rely on {program_name}. Instead, you - can manually update out-of-sync externals (labeled with 's' above) - as described in the configuration file {config_file}. - - -The external repositories labeled with '?' above are not under version -control using the expected protocol. If you are sure you want to switch -protocols, and you don't have any work you need to save from this -directory, then run "rm -rf [directory]" before re-running the -checkout_externals tool. -""".format(program_name=program_name, config_file=args.externals) - printlog('-' * 70) - printlog(msg) + printlog(_dirty_local_repo_msg(program_name, args.externals)) printlog('-' * 70) else: if not args.components: @@ -418,6 +441,8 @@ def main(args): for comp in args.components: source_tree.checkout(args.verbose, load_all, load_comp=comp) printlog('') + # New tree status is unknown, don't return anything. 
+ tree_status = None logging.info('%s completed without exceptions.', program_name) # NOTE(bja, 2017-11) tree status is used by the systems tests diff --git a/manic/externals_description.py b/manic/externals_description.py index b0c4f736a7..546e7fdcb4 100644 --- a/manic/externals_description.py +++ b/manic/externals_description.py @@ -1,4 +1,4 @@ -#!/usr/bin/env python +#!/usr/bin/env python3 """Model description @@ -71,7 +71,8 @@ def read_externals_description_file(root_dir, file_name): root_dir = os.path.abspath(root_dir) msg = 'In directory : {0}'.format(root_dir) logging.info(msg) - printlog('Processing externals description file : {0}'.format(file_name)) + printlog('Processing externals description file : {0} ({1})'.format(file_name, + root_dir)) file_path = os.path.join(root_dir, file_name) if not os.path.exists(file_name): @@ -87,7 +88,7 @@ def read_externals_description_file(root_dir, file_name): externals_description = None if file_name == ExternalsDescription.GIT_SUBMODULES_FILENAME: - externals_description = read_gitmodules_file(root_dir, file_name) + externals_description = _read_gitmodules_file(root_dir, file_name) else: try: config = config_parser() @@ -150,9 +151,8 @@ def git_submodule_status(repo_dir): """Run the git submodule status command to obtain submodule hashes. """ # This function is here instead of GitRepository to avoid a dependency loop - cwd = os.getcwd() - os.chdir(repo_dir) - cmd = ['git', 'submodule', 'status'] + cmd = 'git -C {repo_dir} submodule status'.format( + repo_dir=repo_dir).split() git_output = execute_subprocess(cmd, output_to_caller=True) submodules = {} submods = git_output.split('\n') @@ -167,7 +167,6 @@ def git_submodule_status(repo_dir): submodules[items[1]] = {'hash':items[0], 'status':status, 'tag':tag} - os.chdir(cwd) return submodules def parse_submodules_desc_section(section_items, file_path): @@ -180,6 +179,9 @@ def parse_submodules_desc_section(section_items, file_path): path = item[1].strip() elif name == 'url': url = item[1].strip() + elif name == 'branch': + # We do not care about branch since we have a hash - silently ignore + pass else: msg = 'WARNING: Ignoring unknown {} property, in {}' msg = msg.format(item[0], file_path) # fool pylint @@ -187,21 +189,23 @@ def parse_submodules_desc_section(section_items, file_path): return path, url -def read_gitmodules_file(root_dir, file_name): +def _read_gitmodules_file(root_dir, file_name): # pylint: disable=deprecated-method # Disabling this check because the method is only used for python2 + # pylint: disable=too-many-locals + # pylint: disable=too-many-branches + # pylint: disable=too-many-statements """Read a .gitmodules file and convert it to be compatible with an externals description. 
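The git_submodule_status helper above builds its dictionary from the output of `git -C <repo> submodule status`. As a rough, self-contained sketch (hypothetical helper name and sample line, not the module's exact parsing code), one output line decomposes like this:

```python
def parse_submodule_status_line(line):
    """Split one 'git submodule status' output line into (path, info).

    The leading character is ' ' (in sync), '+' (checked-out commit differs
    from the recorded one), or '-' (not initialized); the rest is
    '<hash> <path> (<describe output>)'.
    """
    status = line[0] if line[0] in '+-U' else None
    items = line.lstrip(' +-U').split(' ')
    tag = items[2].strip('()') if len(items) > 2 else None
    return items[1], {'hash': items[0], 'status': status, 'tag': tag}

print(parse_submodule_status_line(
    '+a94afba34e4 components/cime (cime5.8.32-2-ga94afba)'))
```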
""" root_dir = os.path.abspath(root_dir) msg = 'In directory : {0}'.format(root_dir) logging.info(msg) - printlog('Processing submodules description file : {0}'.format(file_name)) file_path = os.path.join(root_dir, file_name) if not os.path.exists(file_name): msg = ('ERROR: submodules description file, "{0}", does not ' - 'exist at path:\n {1}'.format(file_name, file_path)) + 'exist in dir:\n {1}'.format(file_name, root_dir)) fatal_error(msg) submodules_description = None @@ -250,9 +254,21 @@ def read_gitmodules_file(root_dir, file_name): ExternalsDescription.REPO_URL, url) externals_description.set(sec_name, ExternalsDescription.REQUIRED, 'True') - git_hash = submods[sec_name]['hash'] - externals_description.set(sec_name, - ExternalsDescription.HASH, git_hash) + if sec_name in submods: + submod_name = sec_name + else: + # The section name does not have to match the path + submod_name = path + + if submod_name in submods: + git_hash = submods[submod_name]['hash'] + externals_description.set(sec_name, + ExternalsDescription.HASH, + git_hash) + else: + emsg = "submodule status has no section, '{}'" + emsg += "\nCheck section names in externals config file" + fatal_error(emsg.format(submod_name)) # Required items externals_description.add_section(DESCRIPTION_SECTION) @@ -261,18 +277,22 @@ def read_gitmodules_file(root_dir, file_name): return externals_description def create_externals_description( - model_data, model_format='cfg', components=None, parent_repo=None): + model_data, model_format='cfg', components=None, exclude=None, parent_repo=None): """Create the a externals description object from the provided data + + components: list of component names to include, None to include all. If a + name isn't found, it is silently omitted from the return value. + exclude: list of component names to skip. """ externals_description = None if model_format == 'dict': externals_description = ExternalsDescriptionDict( - model_data, components=components) + model_data, components=components, exclude=exclude) elif model_format == 'cfg': major, _, _ = get_cfg_schema_version(model_data) if major == 1: externals_description = ExternalsDescriptionConfigV1( - model_data, components=components, parent_repo=parent_repo) + model_data, components=components, exclude=exclude, parent_repo=parent_repo) else: msg = ('Externals description file has unsupported schema ' 'version "{0}".'.format(major)) @@ -339,8 +359,9 @@ class ExternalsDescription(dict): input value. """ - # keywords defining the interface into the externals description data - EXTERNALS = 'externals' + # keywords defining the interface into the externals description data; these + # are brought together by the schema below. + EXTERNALS = 'externals' # path to externals file. BRANCH = 'branch' SUBMODULE = 'from_submodule' HASH = 'hash' @@ -366,6 +387,8 @@ class ExternalsDescription(dict): _V1_BRANCH = 'BRANCH' _V1_REQ_SOURCE = 'REQ_SOURCE' + # Dictionary keys are component names. The corresponding values are laid out + # according to this schema. 
_source_schema = {REQUIRED: True, PATH: 'string', EXTERNALS: 'string', @@ -614,8 +637,11 @@ def _repo_config_from_submodule(self, field, submod_desc): ' Parent repo, "{1}" does not have submodules') fatal_error(msg.format(field, self._parent_repo.name())) - submod_file = read_gitmodules_file(repo_path, submod_file) - submod_desc = create_externals_description(submod_file) + printlog( + 'Processing submodules description file : {0} ({1})'.format( + submod_file, repo_path)) + submod_model_data= _read_gitmodules_file(repo_path, submod_file) + submod_desc = create_externals_description(submod_model_data) # Can we find our external? repo_url = None @@ -707,7 +733,7 @@ class ExternalsDescriptionDict(ExternalsDescription): """ - def __init__(self, model_data, components=None): + def __init__(self, model_data, components=None, exclude=None): """Parse a native dictionary into a externals description. """ ExternalsDescription.__init__(self) @@ -719,10 +745,15 @@ def __init__(self, model_data, components=None): self._input_patch = 0 self._verify_schema_version() if components: - for key in model_data.items(): + for key in list(model_data.keys()): if key not in components: del model_data[key] + if exclude: + for key in list(model_data.keys()): + if key in exclude: + del model_data[key] + self.update(model_data) self._check_user_input() @@ -733,10 +764,12 @@ class ExternalsDescriptionConfigV1(ExternalsDescription): """ - def __init__(self, model_data, components=None, parent_repo=None): + def __init__(self, model_data, components=None, exclude=None, parent_repo=None): """Convert the config data into a standardized dict that can be used to construct the source objects + components: list of component names to include, None to include all. + exclude: list of component names to skip. """ ExternalsDescription.__init__(self, parent_repo=parent_repo) self._schema_major = 1 @@ -746,7 +779,7 @@ def __init__(self, model_data, components=None, parent_repo=None): get_cfg_schema_version(model_data) self._verify_schema_version() self._remove_metadata(model_data) - self._parse_cfg(model_data, components=components) + self._parse_cfg(model_data, components=components, exclude=exclude) self._check_user_input() @staticmethod @@ -758,8 +791,11 @@ def _remove_metadata(model_data): """ model_data.remove_section(DESCRIPTION_SECTION) - def _parse_cfg(self, cfg_data, components=None): + def _parse_cfg(self, cfg_data, components=None, exclude=None): """Parse a config_parser object into a externals description. + + components: list of component names to include, None to include all. + exclude: list of component names to skip. """ def list_to_dict(input_list, convert_to_lower_case=True): """Convert a list of key-value pairs into a dictionary. @@ -775,7 +811,7 @@ def list_to_dict(input_list, convert_to_lower_case=True): for section in cfg_data.sections(): name = config_string_cleaner(section.lower().strip()) - if components and name not in components: + if (components and name not in components) or (exclude and name in exclude): continue self[name] = {} self[name].update(list_to_dict(cfg_data.items(section))) diff --git a/manic/externals_status.py b/manic/externals_status.py index d3d238f289..6bc29e9732 100644 --- a/manic/externals_status.py +++ b/manic/externals_status.py @@ -29,16 +29,16 @@ class ExternalStatus(object): transactions (e.g. add, remove, rename, untracked files). """ - DEFAULT = '-' + # sync_state and clean_state can be one of the following: + DEFAULT = '-' # not set yet (sync_state). 
clean_state can be this if sync_state is EMPTY. UNKNOWN = '?' EMPTY = 'e' - MODEL_MODIFIED = 's' # a.k.a. out-of-sync - DIRTY = 'M' - - STATUS_OK = ' ' + MODEL_MODIFIED = 's' # repo version != externals (sync_state only) + DIRTY = 'M' # repo is dirty (clean_state only) + STATUS_OK = ' ' # repo is clean (clean_state) or matches externals version (sync_state) STATUS_ERROR = '!' - # source types + # source_type can be one of the following: OPTIONAL = 'o' STANDALONE = 's' MANAGED = ' ' @@ -55,19 +55,21 @@ def __init__(self): def log_status_message(self, verbosity): """Write status message to the screen and log file """ - self._default_status_message() + printlog(self._default_status_message()) if verbosity >= VERBOSITY_VERBOSE: - self._verbose_status_message() + printlog(self._verbose_status_message()) if verbosity >= VERBOSITY_DUMP: - self._dump_status_message() + printlog(self._dump_status_message()) + + def __repr__(self): + return self._default_status_message() def _default_status_message(self): """Return the default terse status message string """ - msg = '{sync}{clean}{src_type} {path}'.format( + return '{sync}{clean}{src_type} {path}'.format( sync=self.sync_state, clean=self.clean_state, src_type=self.source_type, path=self.path) - printlog(msg) def _verbose_status_message(self): """Return the verbose status message string @@ -82,14 +84,12 @@ def _verbose_status_message(self): if self.sync_state != self.STATUS_OK: sync_str = '{current} --> {expected}'.format( current=self.current_version, expected=self.expected_version) - msg = ' {clean}, {sync}'.format(clean=clean_str, sync=sync_str) - printlog(msg) + return ' {clean}, {sync}'.format(clean=clean_str, sync=sync_str) def _dump_status_message(self): """Return the dump status message string """ - msg = indent_string(self.status_output, 12) - printlog(msg) + return indent_string(self.status_output, 12) def safe_to_update(self): """Report if it is safe to update a repository. Safe is defined as: diff --git a/manic/repository_factory.py b/manic/repository_factory.py index 80a92a9d8a..18c73ffc4b 100644 --- a/manic/repository_factory.py +++ b/manic/repository_factory.py @@ -15,6 +15,7 @@ def create_repository(component_name, repo_info, svn_ignore_ancestry=False): """Determine what type of repository we have, i.e. git or svn, and create the appropriate object. + Can return None (e.g. if protocol is 'externals_only'). """ protocol = repo_info[ExternalsDescription.PROTOCOL].lower() if protocol == 'git': diff --git a/manic/repository_git.py b/manic/repository_git.py index f986051001..aab1a468a8 100644 --- a/manic/repository_git.py +++ b/manic/repository_git.py @@ -7,6 +7,7 @@ import copy import os +import sys from .global_constants import EMPTY_STR, LOCAL_PATH_INDICATOR from .global_constants import VERBOSITY_VERBOSE @@ -25,7 +26,7 @@ class GitRepository(Repository): * be isolated in separate functions with no application logic * of the form: - - cmd = ['git', ...] + - cmd = 'git -C {dirname} ...'.format(dirname=dirname).split() - value = execute_subprocess(cmd, output_to_caller={T|F}, status_to_caller={T|F}) - return value @@ -39,7 +40,7 @@ class GitRepository(Repository): def __init__(self, component_name, repo): """ - Parse repo (a XML element). + repo: ExternalsDescription. 
""" Repository.__init__(self, component_name, repo) self._gitmodules = None @@ -99,45 +100,42 @@ def submodules_file(self, repo_path=None): # # ---------------------------------------------------------------- def _clone_repo(self, base_dir_path, repo_dir_name, verbosity): - """Prepare to execute the clone by managing directory location + """Clones repo_dir_name into base_dir_path. """ - cwd = os.getcwd() - os.chdir(base_dir_path) - self._git_clone(self._url, repo_dir_name, verbosity) - os.chdir(cwd) + self._git_clone(self._url, os.path.join(base_dir_path, repo_dir_name), + verbosity=verbosity) - def _current_ref(self): - """Determine the *name* associated with HEAD. + def _current_ref(self, dirname): + """Determine the *name* associated with HEAD at dirname. - If we're on a branch, then returns the branch name; otherwise, - if we're on a tag, then returns the tag name; otherwise, returns + If we're on a tag, then returns the tag name; otherwise, returns the current hash. Returns an empty string if no reference can be determined (e.g., if we're not actually in a git repository). + + If we're on a branch, then the branch name is also included in + the returned string (in addition to the tag / hash). """ ref_found = False - # If we're on a branch, then use that as the current ref - branch_found, branch_name = self._git_current_branch() - if branch_found: - current_ref = branch_name + # If we're exactly at a tag, use that as the current ref + tag_found, tag_name = self._git_current_tag(dirname) + if tag_found: + current_ref = tag_name ref_found = True - if not ref_found: - # Otherwise, if we're exactly at a tag, use that as the - # current ref - tag_found, tag_name = self._git_current_tag() - if tag_found: - current_ref = tag_name - ref_found = True - if not ref_found: # Otherwise, use current hash as the current ref - hash_found, hash_name = self._git_current_hash() + hash_found, hash_name = self._git_current_hash(dirname) if hash_found: current_ref = hash_name ref_found = True - if not ref_found: + if ref_found: + # If we're on a branch, include branch name in current ref + branch_found, branch_name = self._git_current_branch(dirname) + if branch_found: + current_ref = "{} (branch {})".format(current_ref, branch_name) + else: # If we still can't find a ref, return empty string. This # can happen if we're not actually in a git repo current_ref = '' @@ -185,17 +183,15 @@ def compare_refs(current_ref, expected_ref): status = ExternalStatus.MODEL_MODIFIED return status - cwd = os.getcwd() - os.chdir(repo_dir_path) - # get the full hash of the current commit - _, current_ref = self._git_current_hash() + _, current_ref = self._git_current_hash(repo_dir_path) if self._branch: if self._url == LOCAL_PATH_INDICATOR: expected_ref = self._branch else: - remote_name = self._determine_remote_name() + remote_name = self._remote_name_for_url(self._url, + repo_dir_path) if not remote_name: # git doesn't know about this remote. by definition # this is a modified state. 
@@ -212,7 +208,7 @@ def compare_refs(current_ref, expected_ref): fatal_error(msg) # record the *names* of the current and expected branches - stat.current_version = self._current_ref() + stat.current_version = self._current_ref(repo_dir_path) stat.expected_version = copy.deepcopy(expected_ref) if current_ref == EMPTY_STR: @@ -220,7 +216,7 @@ def compare_refs(current_ref, expected_ref): else: # get the underlying hash of the expected ref revparse_status, expected_ref_hash = self._git_revparse_commit( - expected_ref) + expected_ref, repo_dir_path) if revparse_status: # We failed to get the hash associated with # expected_ref. Maybe we should assign this to some special @@ -231,18 +227,13 @@ def compare_refs(current_ref, expected_ref): # compare the underlying hashes stat.sync_state = compare_refs(current_ref, expected_ref_hash) - os.chdir(cwd) - - def _determine_remote_name(self): - """Return the remote name. - - Note that this is for the *future* repo url and branch, not - the current working copy! + @classmethod + def _remote_name_for_url(cls, remote_url, dirname): + """Return the remote name matching remote_url (or None) """ - git_output = self._git_remote_verbose() + git_output = cls._git_remote_verbose(dirname) git_output = git_output.splitlines() - remote_name = '' for line in git_output: data = line.strip() if not data: @@ -250,10 +241,9 @@ def _determine_remote_name(self): data = data.split() name = data[0].strip() url = data[1].strip() - if self._url == url: - remote_name = name - break - return remote_name + if remote_url == url: + return name + return None def _create_remote_name(self): """The url specified in the externals description file was not known @@ -309,19 +299,16 @@ def _checkout_ref(self, repo_dir, verbosity, submodules): the repo's submodules """ # import pdb; pdb.set_trace() - cwd = os.getcwd() - os.chdir(repo_dir) if self._url.strip() == LOCAL_PATH_INDICATOR: - self._checkout_local_ref(verbosity, submodules) + self._checkout_local_ref(verbosity, submodules, repo_dir) else: - self._checkout_external_ref(verbosity, submodules) + self._checkout_external_ref(verbosity, submodules, repo_dir) if self._sparse: self._sparse_checkout(repo_dir, verbosity) - os.chdir(cwd) - def _checkout_local_ref(self, verbosity, submodules): + def _checkout_local_ref(self, verbosity, submodules, dirname): """Checkout the reference considering the local repo only. Do not fetch any additional remotes or specify the remote when checkout out the ref. @@ -335,13 +322,18 @@ def _checkout_local_ref(self, verbosity, submodules): else: ref = self._hash - self._check_for_valid_ref(ref) - self._git_checkout_ref(ref, verbosity, submodules) + self._check_for_valid_ref(ref, remote_name=None, + dirname=dirname) + self._git_checkout_ref(ref, verbosity, submodules, dirname) - def _checkout_external_ref(self, verbosity, submodules): - """Checkout the reference from a remote repository + def _checkout_external_ref(self, verbosity, submodules, dirname): + """Checkout the reference from a remote repository into dirname. if is True, recursively initialize and update - the repo's submodules + the repo's submodules. + Note that this results in a 'detached HEAD' state if checking out + a branch, because we check out the remote branch rather than the + local. See https://github.com/ESMCI/manage_externals/issues/34 for + more discussion. 
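As a hedged sketch (remote, URL, and branch names are placeholders), the sequence of git commands that _checkout_external_ref() ends up issuing for a branch external looks like the following; checking out '<remote>/<branch>' instead of a local branch is what produces the 'detached HEAD' state noted above:

```python
# Print the command sequence; in the real code each one is run via
# execute_subprocess() from the helpers shown in this diff.
steps = [
    'git -C {d} remote add {remote} {url}',
    'git -C {d} fetch --quiet --tags {remote}',
    'git -C {d} checkout --quiet {remote}/{branch}',
]
for step in steps:
    print(step.format(d='components/example', remote='example_remote',
                      url='https://github.com/example/repo', branch='feature'))
```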
""" if self._tag: ref = self._tag @@ -350,45 +342,45 @@ def _checkout_external_ref(self, verbosity, submodules): else: ref = self._hash - remote_name = self._determine_remote_name() + remote_name = self._remote_name_for_url(self._url, dirname) if not remote_name: remote_name = self._create_remote_name() - self._git_remote_add(remote_name, self._url) - self._git_fetch(remote_name) + self._git_remote_add(remote_name, self._url, dirname) + self._git_fetch(remote_name, dirname) # NOTE(bja, 2018-03) we need to send separate ref and remote # name to check_for_vaild_ref, but the combined name to # checkout_ref! - self._check_for_valid_ref(ref, remote_name) + self._check_for_valid_ref(ref, remote_name, dirname) if self._branch: + # Prepend remote name to branch. This means we avoid various + # special cases if the local branch is not tracking the remote or + # cannot be trivially fast-forwarded to match; but, it also + # means we end up in a 'detached HEAD' state. ref = '{0}/{1}'.format(remote_name, ref) - self._git_checkout_ref(ref, verbosity, submodules) + self._git_checkout_ref(ref, verbosity, submodules, dirname) def _sparse_checkout(self, repo_dir, verbosity): """Use git read-tree to thin the working tree.""" - cwd = os.getcwd() - - cmd = ['cp', self._sparse, os.path.join(repo_dir, - '.git/info/sparse-checkout')] + cmd = ['cp', os.path.join(repo_dir, self._sparse), + os.path.join(repo_dir, + '.git/info/sparse-checkout')] if verbosity >= VERBOSITY_VERBOSE: printlog(' {0}'.format(' '.join(cmd))) execute_subprocess(cmd) - os.chdir(repo_dir) - self._git_sparse_checkout(verbosity) + self._git_sparse_checkout(verbosity, repo_dir) - os.chdir(cwd) - - def _check_for_valid_ref(self, ref, remote_name=None): + def _check_for_valid_ref(self, ref, remote_name, dirname): """Try some basic sanity checks on the user supplied reference so we can provide a more useful error message than calledprocess error... + remote_name can be NOne """ - is_tag = self._ref_is_tag(ref) - is_branch = self._ref_is_branch(ref, remote_name) - is_hash = self._ref_is_hash(ref) - + is_tag = self._ref_is_tag(ref, dirname) + is_branch = self._ref_is_branch(ref, remote_name, dirname) + is_hash = self._ref_is_hash(ref, dirname) is_valid = is_tag or is_branch or is_hash if not is_valid: msg = ('In repo "{0}": reference "{1}" does not appear to be a ' @@ -398,7 +390,8 @@ def _check_for_valid_ref(self, ref, remote_name=None): fatal_error(msg) if is_tag: - is_unique_tag, msg = self._is_unique_tag(ref, remote_name) + is_unique_tag, msg = self._is_unique_tag(ref, remote_name, + dirname) if not is_unique_tag: msg = ('In repo "{0}": tag "{1}" {2}'.format( self._name, self._tag, msg)) @@ -406,7 +399,7 @@ def _check_for_valid_ref(self, ref, remote_name=None): return is_valid - def _is_unique_tag(self, ref, remote_name): + def _is_unique_tag(self, ref, remote_name, dirname): """Verify that a reference is a valid tag and is unique (not a branch) Tags may be tag names, or SHA id's. It is also possible that a @@ -417,9 +410,9 @@ def _is_unique_tag(self, ref, remote_name): error! 
""" - is_tag = self._ref_is_tag(ref) - is_branch = self._ref_is_branch(ref, remote_name) - is_hash = self._ref_is_hash(ref) + is_tag = self._ref_is_tag(ref, dirname) + is_branch = self._ref_is_branch(ref, remote_name, dirname) + is_hash = self._ref_is_hash(ref, dirname) msg = '' is_unique_tag = False @@ -450,7 +443,7 @@ def _is_unique_tag(self, ref, remote_name): return is_unique_tag, msg - def _ref_is_tag(self, ref): + def _ref_is_tag(self, ref, dirname): """Verify that a reference is a valid tag according to git. Note: values returned by git_showref_* and git_revparse are @@ -458,28 +451,30 @@ def _ref_is_tag(self, ref): error! """ is_tag = False - value = self._git_showref_tag(ref) + value = self._git_showref_tag(ref, dirname) if value == 0: is_tag = True return is_tag - def _ref_is_branch(self, ref, remote_name=None): + def _ref_is_branch(self, ref, remote_name, dirname): """Verify if a ref is any kind of branch (local, tracked remote, untracked remote). + remote_name can be None. """ local_branch = False remote_branch = False if remote_name: - remote_branch = self._ref_is_remote_branch(ref, remote_name) - local_branch = self._ref_is_local_branch(ref) + remote_branch = self._ref_is_remote_branch(ref, remote_name, + dirname) + local_branch = self._ref_is_local_branch(ref, dirname) is_branch = False if local_branch or remote_branch: is_branch = True return is_branch - def _ref_is_local_branch(self, ref): + def _ref_is_local_branch(self, ref, dirname): """Verify that a reference is a valid branch according to git. show-ref branch returns local branches that have been @@ -492,12 +487,12 @@ def _ref_is_local_branch(self, ref): """ is_branch = False - value = self._git_showref_branch(ref) + value = self._git_showref_branch(ref, dirname) if value == 0: is_branch = True return is_branch - def _ref_is_remote_branch(self, ref, remote_name): + def _ref_is_remote_branch(self, ref, remote_name, dirname): """Verify that a reference is a valid branch according to git. show-ref branch returns local branches that have been @@ -510,12 +505,12 @@ def _ref_is_remote_branch(self, ref, remote_name): """ is_branch = False - value = self._git_lsremote_branch(ref, remote_name) + value = self._git_lsremote_branch(ref, remote_name, dirname) if value == 0: is_branch = True return is_branch - def _ref_is_commit(self, ref): + def _ref_is_commit(self, ref, dirname): """Verify that a reference is a valid commit according to git. This could be a tag, branch, sha1 id, HEAD and potentially others... @@ -525,12 +520,12 @@ def _ref_is_commit(self, ref): error! """ is_commit = False - value, _ = self._git_revparse_commit(ref) + value, _ = self._git_revparse_commit(ref, dirname) if value == 0: is_commit = True return is_commit - def _ref_is_hash(self, ref): + def _ref_is_hash(self, ref, dirname): """Verify that a reference is a valid hash according to git. 
Git doesn't seem to provide an exact way to determine if user @@ -545,7 +540,7 @@ def _ref_is_hash(self, ref): """ is_hash = False - status, git_output = self._git_revparse_commit(ref) + status, git_output = self._git_revparse_commit(ref, dirname) if status == 0: if git_output.strip().startswith(ref): is_hash = True @@ -555,9 +550,7 @@ def _status_summary(self, stat, repo_dir_path): """Determine the clean/dirty status of a git repository """ - cwd = os.getcwd() - os.chdir(repo_dir_path) - git_output = self._git_status_porcelain_v1z() + git_output = self._git_status_porcelain_v1z(repo_dir_path) is_dirty = self._status_v1z_is_dirty(git_output) if is_dirty: stat.clean_state = ExternalStatus.DIRTY @@ -566,8 +559,7 @@ def _status_summary(self, stat, repo_dir_path): # Now save the verbose status output incase the user wants to # see it. - stat.status_output = self._git_status_verbose() - os.chdir(cwd) + stat.status_output = self._git_status_verbose(repo_dir_path) @staticmethod def _status_v1z_is_dirty(git_output): @@ -602,7 +594,7 @@ def _status_v1z_is_dirty(git_output): # # ---------------------------------------------------------------- @staticmethod - def _git_current_hash(): + def _git_current_hash(dirname): """Return the full hash of the currently checked-out version. Returns a tuple, (hash_found, hash), where hash_found is a @@ -610,21 +602,51 @@ def _git_current_hash(): could mean we're not in a git repository at all). (If hash_found is False, then hash is ''.) """ - status, git_output = GitRepository._git_revparse_commit("HEAD") + status, git_output = GitRepository._git_revparse_commit("HEAD", + dirname) hash_found = not status if not hash_found: git_output = '' return hash_found, git_output @staticmethod - def _git_current_branch(): - """Determines the name of the current branch. + def _git_current_remote_branch(dirname): + """Determines the name of the current remote branch, if any. + + if dir is None, uses the cwd. + + Returns a tuple, (branch_found, branch_name), where branch_found + is a bool specifying whether a branch name was found for + HEAD. (If branch_found is False, then branch_name is ''). + branch_name is in the format '$remote/$branch', e.g. 'origin/foo'. + """ + branch_found = False + branch_name = '' + + cmd = 'git -C {dirname} log -n 1 --pretty=%d HEAD'.format( + dirname=dirname).split() + status, git_output = execute_subprocess(cmd, + output_to_caller=True, + status_to_caller=True) + branch_found = 'HEAD,' in git_output + if branch_found: + # git_output is of the form " (HEAD, origin/blah)" + branch_name = git_output.split(',')[1].strip()[:-1] + return branch_found, branch_name + + @staticmethod + def _git_current_branch(dirname): + """Determines the name of the current local branch. Returns a tuple, (branch_found, branch_name), where branch_found - is a logical specifying whether a branch name was found for + is a bool specifying whether a branch name was found for HEAD. (If branch_found is False, then branch_name is ''.) + Note that currently we check out the remote branch rather than + the local, so this command does not return the just-checked-out + branch. See _git_current_remote_branch. 
""" - cmd = ['git', 'symbolic-ref', '--short', '-q', 'HEAD'] + cmd = 'git -C {dirname} symbolic-ref --short -q HEAD'.format( + dirname=dirname).split() status, git_output = execute_subprocess(cmd, output_to_caller=True, status_to_caller=True) @@ -636,15 +658,17 @@ def _git_current_branch(): return branch_found, git_output @staticmethod - def _git_current_tag(): + def _git_current_tag(dirname): """Determines the name tag corresponding to HEAD (if any). + if dirname is None, uses the cwd. + Returns a tuple, (tag_found, tag_name), where tag_found is a - logical specifying whether we found a tag name corresponding to + bool specifying whether we found a tag name corresponding to HEAD. (If tag_found is False, then tag_name is ''.) """ - # git describe --exact-match --tags HEAD - cmd = ['git', 'describe', '--exact-match', '--tags', 'HEAD'] + cmd = 'git -C {dirname} describe --exact-match --tags HEAD'.format( + dirname=dirname).split() status, git_output = execute_subprocess(cmd, output_to_caller=True, status_to_caller=True) @@ -656,53 +680,57 @@ def _git_current_tag(): return tag_found, git_output @staticmethod - def _git_showref_tag(ref): + def _git_showref_tag(ref, dirname): """Run git show-ref check if the user supplied ref is a tag. could also use git rev-parse --quiet --verify tagname^{tag} """ - cmd = ['git', 'show-ref', '--quiet', '--verify', - 'refs/tags/{0}'.format(ref), ] + cmd = ('git -C {dirname} show-ref --quiet --verify refs/tags/{ref}' + .format(dirname=dirname, ref=ref).split()) status = execute_subprocess(cmd, status_to_caller=True) return status @staticmethod - def _git_showref_branch(ref): + def _git_showref_branch(ref, dirname): """Run git show-ref check if the user supplied ref is a local or tracked remote branch. """ - cmd = ['git', 'show-ref', '--quiet', '--verify', - 'refs/heads/{0}'.format(ref), ] + cmd = ('git -C {dirname} show-ref --quiet --verify refs/heads/{ref}' + .format(dirname=dirname, ref=ref).split()) status = execute_subprocess(cmd, status_to_caller=True) return status @staticmethod - def _git_lsremote_branch(ref, remote_name): + def _git_lsremote_branch(ref, remote_name, dirname): """Run git ls-remote to check if the user supplied ref is a remote branch that is not being tracked """ - cmd = ['git', 'ls-remote', '--exit-code', '--heads', - remote_name, ref, ] - status = execute_subprocess(cmd, status_to_caller=True) + cmd = ('git -C {dirname} ls-remote --exit-code --heads ' + '{remote_name} {ref}').format( + dirname=dirname, remote_name=remote_name, ref=ref).split() + status, output = execute_subprocess(cmd, status_to_caller=True, output_to_caller=True) + if not status and not f"refs/heads/{ref}" in output: + # In this case the ref is contained in the branch name but is not the complete branch name + return -1 return status @staticmethod - def _git_revparse_commit(ref): + def _git_revparse_commit(ref, dirname): """Run git rev-parse to detect if a reference is a SHA, HEAD or other valid commit. """ - cmd = ['git', 'rev-parse', '--quiet', '--verify', - '{0}^{1}'.format(ref, '{commit}'), ] + cmd = ('git -C {dirname} rev-parse --quiet --verify {ref}^{commit}' + .format(dirname=dirname, ref=ref, commit='{commit}').split()) status, git_output = execute_subprocess(cmd, status_to_caller=True, output_to_caller=True) git_output = git_output.strip() return status, git_output @staticmethod - def _git_status_porcelain_v1z(): + def _git_status_porcelain_v1z(dirname): """Run git status to obtain repository information. 
This is run with '--untracked=no' to ignore untracked files. @@ -711,36 +739,38 @@ def _git_status_porcelain_v1z(): between git versions or *user configuration*. """ - cmd = ['git', 'status', '--untracked-files=no', '--porcelain', '-z'] + cmd = ('git -C {dirname} status --untracked-files=no --porcelain -z' + .format(dirname=dirname)).split() git_output = execute_subprocess(cmd, output_to_caller=True) return git_output @staticmethod - def _git_status_verbose(): + def _git_status_verbose(dirname): """Run the git status command to obtain repository information. """ - cmd = ['git', 'status'] + cmd = 'git -C {dirname} status'.format(dirname=dirname).split() git_output = execute_subprocess(cmd, output_to_caller=True) return git_output @staticmethod - def _git_remote_verbose(): + def _git_remote_verbose(dirname): """Run the git remote command to obtain repository information. + + Returned string is of the form: + myfork git@github.com:johnpaulalex/manage_externals_jp.git (fetch) + myfork git@github.com:johnpaulalex/manage_externals_jp.git (push) """ - cmd = ['git', 'remote', '--verbose'] - git_output = execute_subprocess(cmd, output_to_caller=True) - return git_output + cmd = 'git -C {dirname} remote --verbose'.format( + dirname=dirname).split() + return execute_subprocess(cmd, output_to_caller=True) @staticmethod - def has_submodules(repo_dir_path=None): - """Return True iff the repository at (or the current - directory if is None) has a '.gitmodules' file + def has_submodules(repo_dir_path): + """Return True iff the repository at has a + '.gitmodules' file """ - if repo_dir_path is None: - fname = ExternalsDescription.GIT_SUBMODULES_FILENAME - else: - fname = os.path.join(repo_dir_path, - ExternalsDescription.GIT_SUBMODULES_FILENAME) + fname = os.path.join(repo_dir_path, + ExternalsDescription.GIT_SUBMODULES_FILENAME) return os.path.exists(fname) @@ -751,68 +781,78 @@ def has_submodules(repo_dir_path=None): # ---------------------------------------------------------------- @staticmethod def _git_clone(url, repo_dir_name, verbosity): - """Run git clone for the side effect of creating a repository. + """Clones url into repo_dir_name. """ - cmd = ['git', 'clone', '--quiet'] - subcmd = None - - cmd.extend([url, repo_dir_name]) + cmd = 'git clone --quiet {url} {repo_dir_name}'.format( + url=url, repo_dir_name=repo_dir_name).split() if verbosity >= VERBOSITY_VERBOSE: printlog(' {0}'.format(' '.join(cmd))) execute_subprocess(cmd) - if subcmd is not None: - os.chdir(repo_dir_name) - execute_subprocess(subcmd) @staticmethod - def _git_remote_add(name, url): + def _git_remote_add(name, url, dirname): """Run the git remote command for the side effect of adding a remote """ - cmd = ['git', 'remote', 'add', name, url] + cmd = 'git -C {dirname} remote add {name} {url}'.format( + dirname=dirname, name=name, url=url).split() execute_subprocess(cmd) @staticmethod - def _git_fetch(remote_name): + def _git_fetch(remote_name, dirname): """Run the git fetch command for the side effect of updating the repo """ - cmd = ['git', 'fetch', '--quiet', '--tags', remote_name] + cmd = 'git -C {dirname} fetch --quiet --tags {remote_name}'.format( + dirname=dirname, remote_name=remote_name).split() execute_subprocess(cmd) @staticmethod - def _git_checkout_ref(ref, verbosity, submodules): + def _git_checkout_ref(ref, verbosity, submodules, dirname): """Run the git checkout command for the side effect of updating the repo Param: ref is a reference to a local or remote object in the form 'origin/my_feature', or 'tag1'. 
""" - cmd = ['git', 'checkout', '--quiet', ref] + cmd = 'git -C {dirname} checkout --quiet {ref}'.format( + dirname=dirname, ref=ref).split() if verbosity >= VERBOSITY_VERBOSE: printlog(' {0}'.format(' '.join(cmd))) execute_subprocess(cmd) if submodules: - GitRepository._git_update_submodules(verbosity) + GitRepository._git_update_submodules(verbosity, dirname) @staticmethod - def _git_sparse_checkout(verbosity): + def _git_sparse_checkout(verbosity, dirname): """Configure repo via read-tree.""" - cmd = ['git', 'config', 'core.sparsecheckout', 'true'] + cmd = 'git -C {dirname} config core.sparsecheckout true'.format( + dirname=dirname).split() if verbosity >= VERBOSITY_VERBOSE: printlog(' {0}'.format(' '.join(cmd))) execute_subprocess(cmd) - cmd = ['git', 'read-tree', '-mu', 'HEAD'] + cmd = 'git -C {dirname} read-tree -mu HEAD'.format( + dirname=dirname).split() if verbosity >= VERBOSITY_VERBOSE: printlog(' {0}'.format(' '.join(cmd))) execute_subprocess(cmd) @staticmethod - def _git_update_submodules(verbosity): + def _git_update_submodules(verbosity, dirname): """Run git submodule update for the side effect of updating this repo's submodules. """ + # due to https://vielmetti.typepad.com/logbook/2022/10/git-security-fixes-lead-to-fatal-transport-file-not-allowed-error-in-ci-systems-cve-2022-39253.html + # submodules from file doesn't work without overriding the protocol, this is done + # for testing submodule support but should not be done in practice + file_protocol = "" + if 'unittest' in sys.modules.keys(): + file_protocol = "-c protocol.file.allow=always" + # First, verify that we have a .gitmodules file - if os.path.exists(ExternalsDescription.GIT_SUBMODULES_FILENAME): - cmd = ['git', 'submodule', 'update', '--init', '--recursive'] + if os.path.exists( + os.path.join(dirname, + ExternalsDescription.GIT_SUBMODULES_FILENAME)): + cmd = ('git {file_protocol} -C {dirname} submodule update --init --recursive' + .format(file_protocol=file_protocol, dirname=dirname)).split() if verbosity >= VERBOSITY_VERBOSE: printlog(' {0}'.format(' '.join(cmd))) diff --git a/manic/repository_svn.py b/manic/repository_svn.py index 408ed84676..32a71184b4 100644 --- a/manic/repository_svn.py +++ b/manic/repository_svn.py @@ -42,11 +42,19 @@ def __init__(self, component_name, repo, ignore_ancestry=False): Parse repo (a XML element). """ Repository.__init__(self, component_name, repo) + if 'github.com' in self._url: + msg = "SVN access to github.com is no longer supported" + fatal_error(msg) self._ignore_ancestry = ignore_ancestry + if self._url.endswith('/'): + # there is already a '/' separator in the URL; no need to add another + url_sep = '' + else: + url_sep = '/' if self._branch: - self._url = os.path.join(self._url, self._branch) + self._url = self._url + url_sep + self._branch elif self._tag: - self._url = os.path.join(self._url, self._tag) + self._url = self._url + url_sep + self._tag else: msg = "DEV_ERROR in svn repository. Shouldn't be here!" fatal_error(msg) diff --git a/manic/sourcetree.py b/manic/sourcetree.py index b9c9c21082..cf2a5b7569 100644 --- a/manic/sourcetree.py +++ b/manic/sourcetree.py @@ -1,6 +1,6 @@ """ - -FIXME(bja, 2017-11) External and SourceTree have a circular dependancy! +Classes to represent an externals config file (SourceTree) and the components +within it (_External). """ import errno @@ -19,62 +19,54 @@ class _External(object): """ - _External represents an external object inside a SourceTree - """ + A single component hosted in an external repository (and any children). 
+ The component may or may not be checked-out upon construction. + """ # pylint: disable=R0902 - def __init__(self, root_dir, name, ext_description, svn_ignore_ancestry): - """Parse an external description file into a dictionary of externals. + def __init__(self, root_dir, name, local_path, required, subexternals_path, + repo, svn_ignore_ancestry, subexternal_sourcetree): + """Create a single external component (checked out or not). Input: + root_dir : string - the (checked-out) parent repo's root dir. + local_path : string - this external's (checked-out) subdir relative + to root_dir, e.g. "components/mom" + repo: Repository - the repo object for this external. Can be None (e.g. if this external just refers to another external file). - root_dir : string - the root directory path where - 'local_path' is relative to. - - name : string - name of the ext_description object. may or may not - correspond to something in the path. + name : string - name of this external (as named by the parent + reference). May or may not correspond to something in the path. ext_description : dict - source ExternalsDescription object svn_ignore_ancestry : bool - use --ignore-externals with svn switch + subexternals_path: string - path to sub-externals config file, if any. Relative to local_path, or special value 'none'. + subexternal_sourcetree: SourceTree - corresponding to subexternals_path, if subexternals_path exists (it might not, if it is not checked out yet). """ self._name = name - self._repo = None - self._externals = EMPTY_STR - self._externals_sourcetree = None - self._stat = ExternalStatus() - self._sparse = None - # Parse the sub-elements - - # _path : local path relative to the containing source tree - self._local_path = ext_description[ExternalsDescription.PATH] - # _repo_dir : full repository directory - repo_dir = os.path.join(root_dir, self._local_path) + self._required = required + + self._stat = None # Populated in status() + + self._local_path = local_path + # _repo_dir_path : full repository directory, e.g. + # "/components/mom" + repo_dir = os.path.join(root_dir, local_path) self._repo_dir_path = os.path.abspath(repo_dir) - # _base_dir : base directory *containing* the repository + # _base_dir_path : base directory *containing* the repository, e.g. + # "/components" self._base_dir_path = os.path.dirname(self._repo_dir_path) - # repo_dir_name : base_dir_path + repo_dir_name = rep_dir_path + # _repo_dir_name : base_dir_path + repo_dir_name = repo_dir_path + # e.g., "mom" self._repo_dir_name = os.path.basename(self._repo_dir_path) - assert(os.path.join(self._base_dir_path, self._repo_dir_name) - == self._repo_dir_path) + self._repo = repo - self._required = ext_description[ExternalsDescription.REQUIRED] - self._externals = ext_description[ExternalsDescription.EXTERNALS] - # Treat a .gitmodules file as a backup externals config - if not self._externals: - if GitRepository.has_submodules(self._repo_dir_path): - self._externals = ExternalsDescription.GIT_SUBMODULES_FILENAME - - repo = create_repository( - name, ext_description[ExternalsDescription.REPO], - svn_ignore_ancestry=svn_ignore_ancestry) - if repo: - self._repo = repo - - if self._externals and (self._externals.lower() != 'none'): - self._create_externals_sourcetree() + # Does this component have subcomponents aka an externals config? 
+ self._subexternals_path = subexternals_path + self._subexternal_sourcetree = subexternal_sourcetree + def get_name(self): """ @@ -88,81 +80,97 @@ def get_local_path(self): """ return self._local_path - def status(self): - """ - If the repo destination directory exists, ensure it is correct (from - correct URL, correct branch or tag), and possibly update the external. - If the repo destination directory does not exist, checkout the correce - branch or tag. - If load_all is True, also load all of the the externals sub-externals. + def get_repo_dir_path(self): + return self._repo_dir_path + + def get_subexternals_path(self): + return self._subexternals_path + + def get_repo(self): + return self._repo + + def status(self, force=False, print_progress=False): """ + Returns status of this component and all subcomponents. - self._stat.path = self.get_local_path() - if not self._required: - self._stat.source_type = ExternalStatus.OPTIONAL - elif self._local_path == LOCAL_PATH_INDICATOR: - # LOCAL_PATH_INDICATOR, '.' paths, are standalone - # component directories that are not managed by - # checkout_externals. - self._stat.source_type = ExternalStatus.STANDALONE - else: - # managed by checkout_externals - self._stat.source_type = ExternalStatus.MANAGED + Returns a dict mapping our local path (not component name!) to an + ExternalStatus dict. Any subcomponents will have their own top-level + path keys. Note the return value includes entries for this and all + subcomponents regardless of whether they are locally installed or not. - ext_stats = {} + Side-effect: If self._stat is empty or force is True, calculates _stat. + """ + calc_stat = force or not self._stat + + if calc_stat: + self._stat = ExternalStatus() + self._stat.path = self.get_local_path() + if not self._required: + self._stat.source_type = ExternalStatus.OPTIONAL + elif self._local_path == LOCAL_PATH_INDICATOR: + # LOCAL_PATH_INDICATOR, '.' paths, are standalone + # component directories that are not managed by + # checkout_subexternals. + self._stat.source_type = ExternalStatus.STANDALONE + else: + # managed by checkout_subexternals + self._stat.source_type = ExternalStatus.MANAGED + subcomponent_stats = {} if not os.path.exists(self._repo_dir_path): - self._stat.sync_state = ExternalStatus.EMPTY - msg = ('status check: repository directory for "{0}" does not ' - 'exist.'.format(self._name)) - logging.info(msg) - self._stat.current_version = 'not checked out' - # NOTE(bja, 2018-01) directory doesn't exist, so we cannot - # use repo to determine the expected version. We just take - # a best-guess based on the assumption that only tag or - # branch should be set, but not both. - if not self._repo: - self._stat.expected_version = 'unknown' - else: - self._stat.expected_version = self._repo.tag() + self._repo.branch() + if calc_stat: + # No local repository. + self._stat.sync_state = ExternalStatus.EMPTY + msg = ('status check: repository directory for "{0}" does not ' + 'exist.'.format(self._name)) + logging.info(msg) + self._stat.current_version = 'not checked out' + # NOTE(bja, 2018-01) directory doesn't exist, so we cannot + # use repo to determine the expected version. We just take + # a best-guess based on the assumption that only tag or + # branch should be set, but not both. + if not self._repo: + self._stat.expected_version = 'unknown' + else: + self._stat.expected_version = self._repo.tag() + self._repo.branch() else: - if self._repo: + # Merge local repository state (e.g. clean/dirty) into self._stat. 
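For orientation, an editorial sketch (not the package's API): `_External.status()` above now computes its ExternalStatus lazily and caches it, recomputing only when `force=True`. The shape of that pattern, with illustrative names:

```python
# Hedged sketch of the compute-once / force-recompute pattern used by
# _External.status(); class and field names here are illustrative.
class CachedStatus:
    def __init__(self):
        self._stat = None

    def status(self, force=False):
        if force or not self._stat:
            self._stat = self._compute()
        return self._stat

    def _compute(self):
        # Stand-in for the real repository/status inspection.
        return {'sync_state': 'ok', 'clean_state': 'clean'}
```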
+ if calc_stat and self._repo: self._repo.status(self._stat, self._repo_dir_path) - if self._externals and self._externals_sourcetree: - # we expect externals and they exist + # Status of subcomponents, if any. + if self._subexternals_path and self._subexternal_sourcetree: cwd = os.getcwd() - # SourceTree expects to be called from the correct + # SourceTree.status() expects to be called from the correct # root directory. os.chdir(self._repo_dir_path) - ext_stats = self._externals_sourcetree.status(self._local_path) + subcomponent_stats = self._subexternal_sourcetree.status(self._local_path, force=force, print_progress=print_progress) os.chdir(cwd) + # Merge our status + subcomponent statuses into one return dict keyed + # by component path. all_stats = {} # don't add the root component because we don't manage it # and can't provide useful info about it. if self._local_path != LOCAL_PATH_INDICATOR: - # store the stats under tha local_path, not comp name so + # store the stats under the local_path, not comp name so # it will be sorted correctly all_stats[self._stat.path] = self._stat - if ext_stats: - all_stats.update(ext_stats) + if subcomponent_stats: + all_stats.update(subcomponent_stats) return all_stats - def checkout(self, verbosity, load_all): + def checkout(self, verbosity): """ If the repo destination directory exists, ensure it is correct (from - correct URL, correct branch or tag), and possibly update the external. + correct URL, correct branch or tag), and possibly updateit. If the repo destination directory does not exist, checkout the correct branch or tag. - If load_all is True, also load all of the the externals sub-externals. + Does not check out sub-externals, see SourceTree.checkout(). """ - if load_all: - pass # Make sure we are in correct location - if not os.path.exists(self._repo_dir_path): # repository directory doesn't exist. Need to check it # out, and for that we need the base_dir_path to exist @@ -174,6 +182,10 @@ def checkout(self, verbosity, load_all): self._base_dir_path) fatal_error(msg) + if not self._stat: + self.status() + assert self._stat + if self._stat.source_type != ExternalStatus.STANDALONE: if verbosity >= VERBOSITY_VERBOSE: # NOTE(bja, 2018-01) probably do not want to pass @@ -194,120 +206,147 @@ def checkout(self, verbosity, load_all): self._repo.checkout(self._base_dir_path, self._repo_dir_name, checkout_verbosity, self.clone_recursive()) - def checkout_externals(self, verbosity, load_all): - """Checkout the sub-externals for this object - """ - if self.load_externals(): - if self._externals_sourcetree: - # NOTE(bja, 2018-02): the subtree externals objects - # were created during initial status check. Updating - # the external may have changed which sub-externals - # are needed. We need to delete those objects and - # re-read the potentially modified externals - # description file. 
- self._externals_sourcetree = None - self._create_externals_sourcetree() - self._externals_sourcetree.checkout(verbosity, load_all) - - def load_externals(self): - 'Return True iff an externals file should be loaded' - load_ex = False - if os.path.exists(self._repo_dir_path): - if self._externals: - if self._externals.lower() != 'none': - load_ex = os.path.exists(os.path.join(self._repo_dir_path, - self._externals)) - - return load_ex + def replace_subexternal_sourcetree(self, sourcetree): + self._subexternal_sourcetree = sourcetree def clone_recursive(self): 'Return True iff any .gitmodules files should be processed' - # Try recursive unless there is an externals entry - recursive = not self._externals + # Try recursive .gitmodules unless there is an externals entry + recursive = not self._subexternals_path return recursive - def _create_externals_sourcetree(self): - """ + +class SourceTree(object): + """ + SourceTree represents a group of managed externals. + + Those externals may not be checked out locally yet, they might only + have Repository objects pointing to their respective repositories. + """ + + @classmethod + def from_externals_file(cls, parent_repo_dir_path, parent_repo, + externals_path): + """Creates a SourceTree representing the given externals file. + + Looks up a git submodules file as an optional backup if there is no + externals file specified. + + Returns None if there is no externals file (i.e. it's None or 'none'), + or if the externals file hasn't been checked out yet. + + parent_repo_dir_path: parent repo root dir + parent_repo: parent repo. + externals_path: path to externals file, relative to parent_repo_dir_path. """ - if not os.path.exists(self._repo_dir_path): + if not os.path.exists(parent_repo_dir_path): # NOTE(bja, 2017-10) repository has not been checked out # yet, can't process the externals file. Assume we are # checking status before code is checkoud out and this # will be handled correctly later. - return + return None - cwd = os.getcwd() - os.chdir(self._repo_dir_path) - if self._externals.lower() == 'none': - msg = ('Internal: Attempt to create source tree for ' - 'externals = none in {}'.format(self._repo_dir_path)) - fatal_error(msg) + if externals_path.lower() == 'none': + # With explicit 'none', do not look for git submodules file. + return None - if not os.path.exists(self._externals): - if GitRepository.has_submodules(): - self._externals = ExternalsDescription.GIT_SUBMODULES_FILENAME + cwd = os.getcwd() + os.chdir(parent_repo_dir_path) + + if not externals_path: + if GitRepository.has_submodules(parent_repo_dir_path): + externals_path = ExternalsDescription.GIT_SUBMODULES_FILENAME + else: + return None - if not os.path.exists(self._externals): - # NOTE(bja, 2017-10) this check is redundent with the one + if not os.path.exists(externals_path): + # NOTE(bja, 2017-10) this check is redundant with the one # in read_externals_description_file! - msg = ('External externals description file "{0}" ' + msg = ('Externals description file "{0}" ' 'does not exist! In directory: {1}'.format( - self._externals, self._repo_dir_path)) + externals_path, parent_repo_dir_path)) fatal_error(msg) - externals_root = self._repo_dir_path + externals_root = parent_repo_dir_path + # model_data is a dict-like object which mirrors the file format. 
model_data = read_externals_description_file(externals_root, - self._externals) - externals = create_externals_description(model_data, - parent_repo=self._repo) - self._externals_sourcetree = SourceTree(externals_root, externals) + externals_path) + # ext_description is another dict-like object (see ExternalsDescription) + ext_description = create_externals_description(model_data, + parent_repo=parent_repo) + externals_sourcetree = SourceTree(externals_root, ext_description) os.chdir(cwd) - -class SourceTree(object): - """ - SourceTree represents a group of managed externals - """ - - def __init__(self, root_dir, model, svn_ignore_ancestry=False): + return externals_sourcetree + + def __init__(self, root_dir, ext_description, svn_ignore_ancestry=False): """ - Build a SourceTree object from a model description + Build a SourceTree object from an ExternalDescription. + + root_dir: the (checked-out) parent repo root dir. """ self._root_dir = os.path.abspath(root_dir) - self._all_components = {} + self._all_components = {} # component_name -> _External self._required_compnames = [] - for comp in model: - src = _External(self._root_dir, comp, model[comp], svn_ignore_ancestry) + for comp, desc in ext_description.items(): + local_path = desc[ExternalsDescription.PATH] + required = desc[ExternalsDescription.REQUIRED] + repo_info = desc[ExternalsDescription.REPO] + subexternals_path = desc[ExternalsDescription.EXTERNALS] + + repo = create_repository(comp, + repo_info, + svn_ignore_ancestry=svn_ignore_ancestry) + + sourcetree = None + # Treat a .gitmodules file as a backup externals config + if not subexternals_path: + parent_repo_dir_path = os.path.abspath(os.path.join(root_dir, + local_path)) + if GitRepository.has_submodules(parent_repo_dir_path): + subexternals_path = ExternalsDescription.GIT_SUBMODULES_FILENAME + + # Might return None (if the subexternal isn't checked out yet, or subexternal is None or 'none') + subexternal_sourcetree = SourceTree.from_externals_file( + os.path.join(self._root_dir, local_path), + repo, + subexternals_path) + src = _External(self._root_dir, comp, local_path, required, + subexternals_path, repo, svn_ignore_ancestry, + subexternal_sourcetree) + self._all_components[comp] = src - if model[comp][ExternalsDescription.REQUIRED]: + if required: self._required_compnames.append(comp) - def status(self, relative_path_base=LOCAL_PATH_INDICATOR): - """Report the status components - - FIXME(bja, 2017-10) what do we do about situations where the - user checked out the optional components, but didn't add - optional for running status? What do we do where the user - didn't add optional to the checkout but did add it to the - status. -- For now, we run status on all components, and try - to do the right thing based on the results.... - - """ + def status(self, relative_path_base=LOCAL_PATH_INDICATOR, + force=False, print_progress=False): + """Return a dictionary of local path->ExternalStatus. + + Notes about the returned dictionary: + * It is keyed by local path (e.g. 'components/mom'), not by + component name (e.g. 'mom'). + * It contains top-level keys for all traversed components, whether + discovered by recursion or top-level. + * It contains entries for all components regardless of whether they + are locally installed or not, or required or optional. +x """ load_comps = self._all_components.keys() - summary = {} + summary = {} # Holds merged statuses from all components. 
for comp in load_comps: - printlog('{0}, '.format(comp), end='') - stat = self._all_components[comp].status() + if print_progress: + printlog('{0}, '.format(comp), end='') + stat = self._all_components[comp].status(force=force, + print_progress=print_progress) + + # Returned status dictionary is keyed by local path; prepend + # relative_path_base if not already there. stat_final = {} for name in stat.keys(): - # check if we need to append the relative_path_base to - # the path so it will be sorted in the correct order. if stat[name].path.startswith(relative_path_base): - # use as is, without any changes to path stat_final[name] = stat[name] else: - # append relative_path_base to path and store under key = updated path modified_path = os.path.join(relative_path_base, stat[name].path) stat_final[modified_path] = stat[name] @@ -316,38 +355,71 @@ def status(self, relative_path_base=LOCAL_PATH_INDICATOR): return summary + def _find_installed_optional_components(self): + """Returns a list of installed optional component names, if any.""" + installed_comps = [] + for comp_name, ext in self._all_components.items(): + if comp_name in self._required_compnames: + continue + # Note that in practice we expect this status to be cached. + path_to_stat = ext.status() + + # If any part of this component exists locally, consider it + # installed and therefore eligible for updating. + if any(s.sync_state != ExternalStatus.EMPTY + for s in path_to_stat.values()): + installed_comps.append(comp_name) + return installed_comps + def checkout(self, verbosity, load_all, load_comp=None): """ - Checkout or update indicated components into the the configured - subdirs. + Checkout or update indicated components into the configured subdirs. - If load_all is True, recursively checkout all externals. - If load_all is False, load_comp is an optional set of components to load. - If load_all is True and load_comp is None, only load the required externals. + If load_all is True, checkout all externals (required + optional), recursively. + If load_all is False and load_comp is set, checkout load_comp (and any required subexternals, plus any optional subexternals that are already checked out, recursively) + If load_all is False and load_comp is None, checkout all required externals, plus any optionals that are already checked out, recursively. """ + if load_all: + tmp_comps = self._all_components.keys() + elif load_comp is not None: + tmp_comps = [load_comp] + else: + local_optional_compnames = self._find_installed_optional_components() + tmp_comps = self._required_compnames + local_optional_compnames + if local_optional_compnames: + printlog('Found locally installed optional components: ' + + ', '.join(local_optional_compnames)) + bad_compnames = set(local_optional_compnames) - set(self._all_components.keys()) + if bad_compnames: + printlog('Internal error: found locally installed components that are not in the global list of all components: ' + ','.join(bad_compnames)) + if verbosity >= VERBOSITY_VERBOSE: printlog('Checking out externals: ') else: printlog('Checking out externals: ', end='') - if load_all: - load_comps = self._all_components.keys() - elif load_comp is not None: - load_comps = [load_comp] - else: - load_comps = self._required_compnames + # Sort by path so that if paths are nested the + # parent repo is checked out first. + load_comps = sorted(tmp_comps, key=lambda comp: self._all_components[comp].get_local_path()) - # checkout the primary externals - for comp in load_comps: + # checkout. 
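A quick editorial illustration (component names and paths are made up) of the sort-by-local-path step above, which ensures a nested external is checked out after the repository that contains it:

```python
# Sorting components by their local path puts a containing repo
# ("components/mom") ahead of anything nested inside it
# ("components/mom/data"). Names and paths are illustrative only.
local_paths = {
    'cice': 'components/cice',
    'mom': 'components/mom',
    'mom_data': 'components/mom/data',
}
load_order = sorted(local_paths, key=lambda name: local_paths[name])
print(load_order)  # ['cice', 'mom', 'mom_data']
```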
+ for comp_name in load_comps: if verbosity < VERBOSITY_VERBOSE: - printlog('{0}, '.format(comp), end='') + printlog('{0}, '.format(comp_name), end='') else: # verbose output handled by the _External object, just # output a newline printlog(EMPTY_STR) - self._all_components[comp].checkout(verbosity, load_all) + c = self._all_components[comp_name] + # Does not recurse. + c.checkout(verbosity) + # Recursively check out subexternals, if any. Returns None + # if there's no subexternals path. + component_subexternal_sourcetree = SourceTree.from_externals_file( + c.get_repo_dir_path(), + c.get_repo(), + c.get_subexternals_path()) + c.replace_subexternal_sourcetree(component_subexternal_sourcetree) + if component_subexternal_sourcetree: + component_subexternal_sourcetree.checkout(verbosity, load_all) printlog('') - - # now give each external an opportunitity to checkout it's externals. - for comp in load_comps: - self._all_components[comp].checkout_externals(verbosity, load_all) diff --git a/manic/utils.py b/manic/utils.py index f57f43930c..9c63ffe65e 100644 --- a/manic/utils.py +++ b/manic/utils.py @@ -1,4 +1,4 @@ -#!/usr/bin/env python +#!/usr/bin/env python3 """ Common public utilities for manic package diff --git a/test/README.md b/test/README.md index 938a900eec..1e8f2eaa77 100644 --- a/test/README.md +++ b/test/README.md @@ -1,45 +1,25 @@ # Testing for checkout_externals -NOTE: Python2 is the supported runtime environment. Python3 compatibility is -in progress, complicated by the different proposed input methods -(yaml, xml, cfg/ini, json) and their different handling of strings -(unicode vs byte) in python2. Full python3 compatibility will be -possible once the number of possible input formats has been narrowed. - -## Setup development environment - -Development environments should be setup for python2 and python3: +## Unit tests ```SH cd checkout_externals/test - make python=python2 env - make python=python3 env + make utest ``` -## Unit tests - -Tests should be run for both python2 and python3. It is recommended -that you have seperate terminal windows open python2 and python3 -testing to avoid errors activating and deactivating environments. +## System tests ```SH cd checkout_externals/test - . env_python2/bin/activate - make utest - deactivate + make stest ``` +Example to run a single test: ```SH - cd checkout_externals/test - . env_python2/bin/activate - make utest - deactivate + cd checkout_externals + python -m unittest test.test_sys_checkout.TestSysCheckout.test_container_simple_required ``` -## System tests - -Not yet implemented. - ## Static analysis checkout_externals is difficult to test thoroughly because it relies @@ -51,9 +31,7 @@ regularly for automatic code formatting and linting. ```SH cd checkout_externals/test - . env_python2/bin/activate make lint - deactivate ``` The canonical formatting for the code is whatever autopep8 @@ -68,10 +46,8 @@ coverage, run the code coverage tool: ```SH cd checkout_externals/test - . env_python2/bin/activate make coverage open -a Firefox.app htmlcov/index.html - deactivate ``` diff --git a/test/repos/README.md b/test/repos/README.md new file mode 100644 index 0000000000..026b684ea3 --- /dev/null +++ b/test/repos/README.md @@ -0,0 +1,33 @@ +Git and svn repositories for testing git and svn-related behavior. For usage and terminology notes, see test/test_sys_checkout.py. 
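Returning to the checkout loop in sourcetree.py earlier in this hunk, here is an editorial sketch (hypothetical helper names, not the package's API) of the two-step flow it implements: each component is checked out non-recursively, then its sub-externals file, if one is now present on disk, is loaded and checked out in turn.

```python
# Editorial sketch of non-recursive per-component checkout followed by
# explicit recursion into any sub-externals; names are hypothetical.
def checkout_tree(components, load_subexternals, verbosity=0):
    for comp in components:
        comp.checkout(verbosity)              # this component only
        subtree = load_subexternals(comp)     # may return None
        if subtree is not None:
            checkout_tree(subtree, load_subexternals, verbosity)
```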
+ +For git repos: To list files and view file contents at HEAD: +``` +cd +git ls-tree --full-tree -r --name-only HEAD +git cat-file -p HEAD: +``` + +File contents at a glance: +``` +container.git/ + readme.txt + +simple-ext.git/ + (has branches: feature2, feature3) + (has tags: tag1, tag2) + readme.txt + simple_subdir/subdir_file.txt + +simple-ext-fork.git/ + (has tags: abandoned-feature, forked-feature-v1, tag1) + (has branch: feature2) + readme.txt + +mixed-cont-ext.git/ + (has branch: new-feature) + readme.txt + sub-externals.cfg ('simp_branch' section refers to 'feature2' branch in simple-ext.git/ repo) + +error/ + (no git repo here, just a readme.txt in the clear) +``` diff --git a/test/repos/simple-ext.svn/README.txt b/test/repos/simple-ext.svn/README.txt new file mode 100644 index 0000000000..9935818a1b --- /dev/null +++ b/test/repos/simple-ext.svn/README.txt @@ -0,0 +1,5 @@ +This is a Subversion repository; use the 'svnadmin' and 'svnlook' +tools to examine it. Do not add, delete, or modify files here +unless you know how to avoid corrupting the repository. + +Visit http://subversion.apache.org/ for more information. diff --git a/test/repos/simple-ext.svn/conf/authz b/test/repos/simple-ext.svn/conf/authz new file mode 100644 index 0000000000..0b9a41074e --- /dev/null +++ b/test/repos/simple-ext.svn/conf/authz @@ -0,0 +1,32 @@ +### This file is an example authorization file for svnserve. +### Its format is identical to that of mod_authz_svn authorization +### files. +### As shown below each section defines authorizations for the path and +### (optional) repository specified by the section name. +### The authorizations follow. An authorization line can refer to: +### - a single user, +### - a group of users defined in a special [groups] section, +### - an alias defined in a special [aliases] section, +### - all authenticated users, using the '$authenticated' token, +### - only anonymous users, using the '$anonymous' token, +### - anyone, using the '*' wildcard. +### +### A match can be inverted by prefixing the rule with '~'. Rules can +### grant read ('r') access, read-write ('rw') access, or no access +### (''). + +[aliases] +# joe = /C=XZ/ST=Dessert/L=Snake City/O=Snake Oil, Ltd./OU=Research Institute/CN=Joe Average + +[groups] +# harry_and_sally = harry,sally +# harry_sally_and_joe = harry,sally,&joe + +# [/foo/bar] +# harry = rw +# &joe = r +# * = + +# [repository:/baz/fuz] +# @harry_and_sally = rw +# * = r diff --git a/test/repos/simple-ext.svn/conf/hooks-env.tmpl b/test/repos/simple-ext.svn/conf/hooks-env.tmpl new file mode 100644 index 0000000000..ee965c316c --- /dev/null +++ b/test/repos/simple-ext.svn/conf/hooks-env.tmpl @@ -0,0 +1,19 @@ +### This file is an example hook script environment configuration file. +### Hook scripts run in an empty environment by default. +### As shown below each section defines environment variables for a +### particular hook script. The [default] section defines environment +### variables for all hook scripts, unless overridden by a hook-specific +### section. + +### This example configures a UTF-8 locale for all hook scripts, so that +### special characters, such as umlauts, may be printed to stderr. +### If UTF-8 is used with a mod_dav_svn server, the SVNUseUTF8 option must +### also be set to 'yes' in httpd.conf. +### With svnserve, the LANG environment variable of the svnserve process +### must be set to the same value as given here. +[default] +LANG = en_US.UTF-8 + +### This sets the PATH environment variable for the pre-commit hook. 
+[pre-commit] +PATH = /usr/local/bin:/usr/bin:/usr/sbin diff --git a/test/repos/simple-ext.svn/conf/passwd b/test/repos/simple-ext.svn/conf/passwd new file mode 100644 index 0000000000..ecaa08dcec --- /dev/null +++ b/test/repos/simple-ext.svn/conf/passwd @@ -0,0 +1,8 @@ +### This file is an example password file for svnserve. +### Its format is similar to that of svnserve.conf. As shown in the +### example below it contains one section labelled [users]. +### The name and password for each user follow, one account per line. + +[users] +# harry = harryssecret +# sally = sallyssecret diff --git a/test/repos/simple-ext.svn/conf/svnserve.conf b/test/repos/simple-ext.svn/conf/svnserve.conf new file mode 100644 index 0000000000..6cefc17b3e --- /dev/null +++ b/test/repos/simple-ext.svn/conf/svnserve.conf @@ -0,0 +1,81 @@ +### This file controls the configuration of the svnserve daemon, if you +### use it to allow access to this repository. (If you only allow +### access through http: and/or file: URLs, then this file is +### irrelevant.) + +### Visit http://subversion.apache.org/ for more information. + +[general] +### The anon-access and auth-access options control access to the +### repository for unauthenticated (a.k.a. anonymous) users and +### authenticated users, respectively. +### Valid values are "write", "read", and "none". +### Setting the value to "none" prohibits both reading and writing; +### "read" allows read-only access, and "write" allows complete +### read/write access to the repository. +### The sample settings below are the defaults and specify that anonymous +### users have read-only access to the repository, while authenticated +### users have read and write access to the repository. +# anon-access = read +# auth-access = write +### The password-db option controls the location of the password +### database file. Unless you specify a path starting with a /, +### the file's location is relative to the directory containing +### this configuration file. +### If SASL is enabled (see below), this file will NOT be used. +### Uncomment the line below to use the default password file. +# password-db = passwd +### The authz-db option controls the location of the authorization +### rules for path-based access control. Unless you specify a path +### starting with a /, the file's location is relative to the +### directory containing this file. The specified path may be a +### repository relative URL (^/) or an absolute file:// URL to a text +### file in a Subversion repository. If you don't specify an authz-db, +### no path-based access control is done. +### Uncomment the line below to use the default authorization file. +# authz-db = authz +### The groups-db option controls the location of the file with the +### group definitions and allows maintaining groups separately from the +### authorization rules. The groups-db file is of the same format as the +### authz-db file and should contain a single [groups] section with the +### group definitions. If the option is enabled, the authz-db file cannot +### contain a [groups] section. Unless you specify a path starting with +### a /, the file's location is relative to the directory containing this +### file. The specified path may be a repository relative URL (^/) or an +### absolute file:// URL to a text file in a Subversion repository. +### This option is not being used by default. +# groups-db = groups +### This option specifies the authentication realm of the repository. 
+### If two repositories have the same authentication realm, they should +### have the same password database, and vice versa. The default realm +### is repository's uuid. +# realm = My First Repository +### The force-username-case option causes svnserve to case-normalize +### usernames before comparing them against the authorization rules in the +### authz-db file configured above. Valid values are "upper" (to upper- +### case the usernames), "lower" (to lowercase the usernames), and +### "none" (to compare usernames as-is without case conversion, which +### is the default behavior). +# force-username-case = none +### The hooks-env options specifies a path to the hook script environment +### configuration file. This option overrides the per-repository default +### and can be used to configure the hook script environment for multiple +### repositories in a single file, if an absolute path is specified. +### Unless you specify an absolute path, the file's location is relative +### to the directory containing this file. +# hooks-env = hooks-env + +[sasl] +### This option specifies whether you want to use the Cyrus SASL +### library for authentication. Default is false. +### Enabling this option requires svnserve to have been built with Cyrus +### SASL support; to check, run 'svnserve --version' and look for a line +### reading 'Cyrus SASL authentication is available.' +# use-sasl = true +### These options specify the desired strength of the security layer +### that you want SASL to provide. 0 means no encryption, 1 means +### integrity-checking only, values larger than 1 are correlated +### to the effective key length for encryption (e.g. 128 means 128-bit +### encryption). The values below are the defaults. +# min-encryption = 0 +# max-encryption = 256 diff --git a/test/repos/simple-ext.svn/db/current b/test/repos/simple-ext.svn/db/current new file mode 100644 index 0000000000..00750edc07 --- /dev/null +++ b/test/repos/simple-ext.svn/db/current @@ -0,0 +1 @@ +3 diff --git a/test/repos/simple-ext.svn/db/format b/test/repos/simple-ext.svn/db/format new file mode 100644 index 0000000000..5dd0c22198 --- /dev/null +++ b/test/repos/simple-ext.svn/db/format @@ -0,0 +1,3 @@ +8 +layout sharded 1000 +addressing logical diff --git a/test/repos/simple-ext.svn/db/fs-type b/test/repos/simple-ext.svn/db/fs-type new file mode 100644 index 0000000000..4fdd95313f --- /dev/null +++ b/test/repos/simple-ext.svn/db/fs-type @@ -0,0 +1 @@ +fsfs diff --git a/test/repos/simple-ext.svn/db/fsfs.conf b/test/repos/simple-ext.svn/db/fsfs.conf new file mode 100644 index 0000000000..ac6877a727 --- /dev/null +++ b/test/repos/simple-ext.svn/db/fsfs.conf @@ -0,0 +1,200 @@ +### This file controls the configuration of the FSFS filesystem. + +[memcached-servers] +### These options name memcached servers used to cache internal FSFS +### data. See http://www.danga.com/memcached/ for more information on +### memcached. To use memcached with FSFS, run one or more memcached +### servers, and specify each of them as an option like so: +# first-server = 127.0.0.1:11211 +# remote-memcached = mymemcached.corp.example.com:11212 +### The option name is ignored; the value is of the form HOST:PORT. +### memcached servers can be shared between multiple repositories; +### however, if you do this, you *must* ensure that repositories have +### distinct UUIDs and paths, or else cached data from one repository +### might be used by another accidentally. 
Note also that memcached has +### no authentication for reads or writes, so you must ensure that your +### memcached servers are only accessible by trusted users. + +[caches] +### When a cache-related error occurs, normally Subversion ignores it +### and continues, logging an error if the server is appropriately +### configured (and ignoring it with file:// access). To make +### Subversion never ignore cache errors, uncomment this line. +# fail-stop = true + +[rep-sharing] +### To conserve space, the filesystem can optionally avoid storing +### duplicate representations. This comes at a slight cost in +### performance, as maintaining a database of shared representations can +### increase commit times. The space savings are dependent upon the size +### of the repository, the number of objects it contains and the amount of +### duplication between them, usually a function of the branching and +### merging process. +### +### The following parameter enables rep-sharing in the repository. It can +### be switched on and off at will, but for best space-saving results +### should be enabled consistently over the life of the repository. +### 'svnadmin verify' will check the rep-cache regardless of this setting. +### rep-sharing is enabled by default. +# enable-rep-sharing = true + +[deltification] +### To conserve space, the filesystem stores data as differences against +### existing representations. This comes at a slight cost in performance, +### as calculating differences can increase commit times. Reading data +### will also create higher CPU load and the data will be fragmented. +### Since deltification tends to save significant amounts of disk space, +### the overall I/O load can actually be lower. +### +### The options in this section allow for tuning the deltification +### strategy. Their effects on data size and server performance may vary +### from one repository to another. Versions prior to 1.8 will ignore +### this section. +### +### The following parameter enables deltification for directories. It can +### be switched on and off at will, but for best space-saving results +### should be enabled consistently over the lifetime of the repository. +### Repositories containing large directories will benefit greatly. +### In rarely accessed repositories, the I/O overhead may be significant +### as caches will most likely be low. +### directory deltification is enabled by default. +# enable-dir-deltification = true +### +### The following parameter enables deltification for properties on files +### and directories. Overall, this is a minor tuning option but can save +### some disk space if you merge frequently or frequently change node +### properties. You should not activate this if rep-sharing has been +### disabled because this may result in a net increase in repository size. +### property deltification is enabled by default. +# enable-props-deltification = true +### +### During commit, the server may need to walk the whole change history of +### of a given node to find a suitable deltification base. This linear +### process can impact commit times, svnadmin load and similar operations. +### This setting limits the depth of the deltification history. If the +### threshold has been reached, the node will be stored as fulltext and a +### new deltification history begins. +### Note, this is unrelated to svn log. +### Very large values rarely provide significant additional savings but +### can impact performance greatly - in particular if directory +### deltification has been activated. 
Very small values may be useful in +### repositories that are dominated by large, changing binaries. +### Should be a power of two minus 1. A value of 0 will effectively +### disable deltification. +### For 1.8, the default value is 1023; earlier versions have no limit. +# max-deltification-walk = 1023 +### +### The skip-delta scheme used by FSFS tends to repeatably store redundant +### delta information where a simple delta against the latest version is +### often smaller. By default, 1.8+ will therefore use skip deltas only +### after the linear chain of deltas has grown beyond the threshold +### specified by this setting. +### Values up to 64 can result in some reduction in repository size for +### the cost of quickly increasing I/O and CPU costs. Similarly, smaller +### numbers can reduce those costs at the cost of more disk space. For +### rarely read repositories or those containing larger binaries, this may +### present a better trade-off. +### Should be a power of two. A value of 1 or smaller will cause the +### exclusive use of skip-deltas (as in pre-1.8). +### For 1.8, the default value is 16; earlier versions use 1. +# max-linear-deltification = 16 +### +### After deltification, we compress the data to minimize on-disk size. +### This setting controls the compression algorithm, which will be used in +### future revisions. It can be used to either disable compression or to +### select between available algorithms (zlib, lz4). zlib is a general- +### purpose compression algorithm. lz4 is a fast compression algorithm +### which should be preferred for repositories with large and, possibly, +### incompressible files. Note that the compression ratio of lz4 is +### usually lower than the one provided by zlib, but using it can +### significantly speed up commits as well as reading the data. +### lz4 compression algorithm is supported, starting from format 8 +### repositories, available in Subversion 1.10 and higher. +### The syntax of this option is: +### compression = none | lz4 | zlib | zlib-1 ... zlib-9 +### Versions prior to Subversion 1.10 will ignore this option. +### The default value is 'lz4' if supported by the repository format and +### 'zlib' otherwise. 'zlib' is currently equivalent to 'zlib-5'. +# compression = lz4 +### +### DEPRECATED: The new 'compression' option deprecates previously used +### 'compression-level' option, which was used to configure zlib compression. +### For compatibility with previous versions of Subversion, this option can +### still be used (and it will result in zlib compression with the +### corresponding compression level). +### compression-level = 0 ... 9 (default is 5) + +[packed-revprops] +### This parameter controls the size (in kBytes) of packed revprop files. +### Revprops of consecutive revisions will be concatenated into a single +### file up to but not exceeding the threshold given here. However, each +### pack file may be much smaller and revprops of a single revision may be +### much larger than the limit set here. The threshold will be applied +### before optional compression takes place. +### Large values will reduce disk space usage at the expense of increased +### latency and CPU usage reading and changing individual revprops. +### Values smaller than 4 kByte will not improve latency any further and +### quickly render revprop packing ineffective. +### revprop-pack-size is 16 kBytes by default for non-compressed revprop +### pack files and 64 kBytes when compression has been enabled. 
+# revprop-pack-size = 16 +### +### To save disk space, packed revprop files may be compressed. Standard +### revprops tend to allow for very effective compression. Reading and +### even more so writing, become significantly more CPU intensive. +### Compressing packed revprops is disabled by default. +# compress-packed-revprops = false + +[io] +### Parameters in this section control the data access granularity in +### format 7 repositories and later. The defaults should translate into +### decent performance over a wide range of setups. +### +### When a specific piece of information needs to be read from disk, a +### data block is being read at once and its contents are being cached. +### If the repository is being stored on a RAID, the block size should be +### either 50% or 100% of RAID block size / granularity. Also, your file +### system blocks/clusters should be properly aligned and sized. In that +### setup, each access will hit only one disk (minimizes I/O load) but +### uses all the data provided by the disk in a single access. +### For SSD-based storage systems, slightly lower values around 16 kB +### may improve latency while still maximizing throughput. If block-read +### has not been enabled, this will be capped to 4 kBytes. +### Can be changed at any time but must be a power of 2. +### block-size is given in kBytes and with a default of 64 kBytes. +# block-size = 64 +### +### The log-to-phys index maps data item numbers to offsets within the +### rev or pack file. This index is organized in pages of a fixed maximum +### capacity. To access an item, the page table and the respective page +### must be read. +### This parameter only affects revisions with thousands of changed paths. +### If you have several extremely large revisions (~1 mio changes), think +### about increasing this setting. Reducing the value will rarely result +### in a net speedup. +### This is an expert setting. Must be a power of 2. +### l2p-page-size is 8192 entries by default. +# l2p-page-size = 8192 +### +### The phys-to-log index maps positions within the rev or pack file to +### to data items, i.e. describes what piece of information is being +### stored at any particular offset. The index describes the rev file +### in chunks (pages) and keeps a global list of all those pages. Large +### pages mean a shorter page table but a larger per-page description of +### data items in it. The latency sweetspot depends on the change size +### distribution but covers a relatively wide range. +### If the repository contains very large files, i.e. individual changes +### of tens of MB each, increasing the page size will shorten the index +### file at the expense of a slightly increased latency in sections with +### smaller changes. +### For source code repositories, this should be about 16x the block-size. +### Must be a power of 2. +### p2l-page-size is given in kBytes and with a default of 1024 kBytes. +# p2l-page-size = 1024 + +[debug] +### +### Whether to verify each new revision immediately before finalizing +### the commit. This is disabled by default except in maintainer-mode +### builds. 
+# verify-before-commit = false diff --git a/test/repos/simple-ext.svn/db/min-unpacked-rev b/test/repos/simple-ext.svn/db/min-unpacked-rev new file mode 100644 index 0000000000..573541ac97 --- /dev/null +++ b/test/repos/simple-ext.svn/db/min-unpacked-rev @@ -0,0 +1 @@ +0 diff --git a/test/repos/simple-ext.svn/db/rep-cache.db b/test/repos/simple-ext.svn/db/rep-cache.db new file mode 100644 index 0000000000000000000000000000000000000000..3193b2eaad046fba3704b0d28a50a68a94801da2 GIT binary patch literal 8192 zcmeIuK}!Nb6bJAbjiM;*RyW_VAR=4M^_JC;OPgEUhQTy(n&njm5*>#d|ehiI!uT zo%TC>#d3w1Jtzo300Izz00bZa0SG_<0uX?}pA`5U@~wkvm49vLDta+zt1RRiHZQ=L;CU@@4?#yhXLbg~~mc>fT`3nX?Ls&t(q_@;kd&|6q zPmnj!N-"j%`;)7F<&!^&p5-&NVNc`P`LVZ4M1vhI123LfW679Jk&yS}h4J?+gU zS3MZfY8^$e`Aiw@d{Ry%1dI`+SBZs%>uw7^VT{$-o>k4Xyk=x|K`9*$$d+#CX^Zv$ zf4Az(2{}^ad;78;-W-sx-=Fn>9+LNu&)3o0-}x~i0yI}C8O(!53mF)Lc9yx|g^G|c ej~cmPzoZI_YZ6{Gdr_88^Nk1hnf(AO7WJ=&dj!fDn0uL2;xcFgCGi};7P%MLA>_hL1!bWwMgk<=JI{-``-8Sx~2ler?X@>qvZhS z>jPT>)6KB%uk~`LVLed!QU%IMrh0nGt~zC~p7r~M2xW}B&Vh{`_(=}ATKsQ!Fzyy7 zc4uq7<>SMvwZ?-a) z)0B>oW87dZf4*6*J;dJdmgjkw&ZC`k2j2VA6K~^ji#5jLtJ2Wv5xP@BFE&4IU%WYq zems5i{p{Q%din14uL~NZ-LD^~H)6E)d2he;%)5-9fBCR^<=fA>YX}J*yI_JaqN6CX rO2(X~(lQ>UT$426lExq+3DoM8C~b*QiI&DOO{6l~rdCtJNj(1>+FjA$ literal 0 HcmV?d00001 diff --git a/test/repos/simple-ext.svn/db/revs/0/2 b/test/repos/simple-ext.svn/db/revs/0/2 new file mode 100644 index 0000000000000000000000000000000000000000..99a14cf4b7f76ee26eba753930dd53704627e1f5 GIT binary patch literal 816 zcmZvZJ!lj`6vt;bpA3nm2sW<~jBwfc+L_sFOu`))jUhw{0gGh577vfyac(dHi+~?R zh?R}CT|}_e+9r*qV5N;MStC{G3)m;lNk zCm~lv(?C<68Ar-%YL%o`uYzC#rkyZdVR9kD9Q-!%}%S;O#unUwLYMU_)@exaSbTJeGis^rnsBF zpQ$A6r{xuTL@SSJeWQcholp+qxI~k*M^FFKr;u_G6ohjsvpQAEc$GQHnKxDmExDoA zqy#EZIZ5|4d3k=%Gb>dAc}$Wl&XYP%bKGo< zTmjWejYqZH7n7Q%0^8yGT2lbUjl12Rz=7w$2rz4bdULXUmfC*4ic;URV zT5%(!r#bby^on`qSY{66?ef&aVF)RMbPpw2dC8>a+6zOarB*P`g_V_4QdROm3)2?cKX5He8X8kD zRTa09dUI>+4($Ie?{8&&x)?7E=6hH1%;vc$isp`@N5_6ekGpR>52FX2McUt&KhAE3 z8yY={3bg&={dRN#qtCB?emH(VI)R>Tecg&qWAyFm%XhPPH2SpjWe4}>(DxsApLgH< zK6?@&t~A)(Gf_HKN~e|J&g9-2E__xI=82{tXa=RNLR*{|XCxC^W?VX{y(YOLf>dnb E4; /dev/null || exit 1 + +# Check that the author of this commit has the rights to perform +# the commit on the files and directories being modified. +commit-access-control.pl "$REPOS" "$TXN" commit-access-control.cfg || exit 1 + +# All checks passed, so allow the commit. +exit 0 diff --git a/test/repos/simple-ext.svn/hooks/pre-lock.tmpl b/test/repos/simple-ext.svn/hooks/pre-lock.tmpl new file mode 100755 index 0000000000..148582a689 --- /dev/null +++ b/test/repos/simple-ext.svn/hooks/pre-lock.tmpl @@ -0,0 +1,95 @@ +#!/bin/sh + +# PRE-LOCK HOOK +# +# The pre-lock hook is invoked before an exclusive lock is +# created. Subversion runs this hook by invoking a program +# (script, executable, binary, etc.) named 'pre-lock' (for which +# this file is a template), with the following ordered arguments: +# +# [1] REPOS-PATH (the path to this repository) +# [2] PATH (the path in the repository about to be locked) +# [3] USER (the user creating the lock) +# [4] COMMENT (the comment of the lock) +# [5] STEAL-LOCK (1 if the user is trying to steal the lock, else 0) +# +# If the hook program outputs anything on stdout, the output string will +# be used as the lock token for this lock operation. If you choose to use +# this feature, you must guarantee the tokens generated are unique across +# the repository each time. 
+# +# If the hook program exits with success, the lock is created; but +# if it exits with failure (non-zero), the lock action is aborted +# and STDERR is returned to the client. +# +# The default working directory for the invocation is undefined, so +# the program should set one explicitly if it cares. +# +# On a Unix system, the normal procedure is to have 'pre-lock' +# invoke other programs to do the real work, though it may do the +# work itself too. +# +# Note that 'pre-lock' must be executable by the user(s) who will +# invoke it (typically the user httpd runs as), and that user must +# have filesystem-level permission to access the repository. +# +# On a Windows system, you should name the hook program +# 'pre-lock.bat' or 'pre-lock.exe', +# but the basic idea is the same. +# +# The hook program runs in an empty environment, unless the server is +# explicitly configured otherwise. For example, a common problem is for +# the PATH environment variable to not be set to its usual value, so +# that subprograms fail to launch unless invoked via absolute path. +# If you're having unexpected problems with a hook program, the +# culprit may be unusual (or missing) environment variables. +# +# CAUTION: +# For security reasons, you MUST always properly quote arguments when +# you use them, as those arguments could contain whitespace or other +# problematic characters. Additionally, you should delimit the list +# of options with "--" before passing the arguments, so malicious +# clients cannot bootleg unexpected options to the commands your +# script aims to execute. +# For similar reasons, you should also add a trailing @ to URLs which +# are passed to SVN commands accepting URLs with peg revisions. +# +# Here is an example hook script, for a Unix /bin/sh interpreter. +# For more examples and pre-written hooks, see those in +# the Subversion repository at +# http://svn.apache.org/repos/asf/subversion/trunk/tools/hook-scripts/ and +# http://svn.apache.org/repos/asf/subversion/trunk/contrib/hook-scripts/ + + +REPOS="$1" +PATH="$2" +USER="$3" +COMMENT="$4" +STEAL="$5" + +# If a lock exists and is owned by a different person, don't allow it +# to be stolen (e.g., with 'svn lock --force ...'). + +# (Maybe this script could send email to the lock owner?) +SVNLOOK=/opt/homebrew/Cellar/subversion/1.14.2_1/bin/svnlook +GREP=/bin/grep +SED=/bin/sed + +LOCK_OWNER=`$SVNLOOK lock "$REPOS" "$PATH" | \ + $GREP '^Owner: ' | $SED 's/Owner: //'` + +# If we get no result from svnlook, there's no lock, allow the lock to +# happen: +if [ "$LOCK_OWNER" = "" ]; then + exit 0 +fi + +# If the person locking matches the lock's owner, allow the lock to +# happen: +if [ "$LOCK_OWNER" = "$USER" ]; then + exit 0 +fi + +# Otherwise, we've got an owner mismatch, so return failure: +echo "Error: $PATH already locked by ${LOCK_OWNER}." 1>&2 +exit 1 diff --git a/test/repos/simple-ext.svn/hooks/pre-revprop-change.tmpl b/test/repos/simple-ext.svn/hooks/pre-revprop-change.tmpl new file mode 100755 index 0000000000..8b065d7c79 --- /dev/null +++ b/test/repos/simple-ext.svn/hooks/pre-revprop-change.tmpl @@ -0,0 +1,79 @@ +#!/bin/sh + +# PRE-REVPROP-CHANGE HOOK +# +# The pre-revprop-change hook is invoked before a revision property +# is added, modified or deleted. Subversion runs this hook by invoking +# a program (script, executable, binary, etc.) 
named 'pre-revprop-change' +# (for which this file is a template), with the following ordered +# arguments: +# +# [1] REPOS-PATH (the path to this repository) +# [2] REV (the revision being tweaked) +# [3] USER (the username of the person tweaking the property) +# [4] PROPNAME (the property being set on the revision) +# [5] ACTION (the property is being 'A'dded, 'M'odified, or 'D'eleted) +# +# [STDIN] PROPVAL ** the new property value is passed via STDIN. +# +# If the hook program exits with success, the propchange happens; but +# if it exits with failure (non-zero), the propchange doesn't happen. +# The hook program can use the 'svnlook' utility to examine the +# existing value of the revision property. +# +# WARNING: unlike other hooks, this hook MUST exist for revision +# properties to be changed. If the hook does not exist, Subversion +# will behave as if the hook were present, but failed. The reason +# for this is that revision properties are UNVERSIONED, meaning that +# a successful propchange is destructive; the old value is gone +# forever. We recommend the hook back up the old value somewhere. +# +# The default working directory for the invocation is undefined, so +# the program should set one explicitly if it cares. +# +# On a Unix system, the normal procedure is to have 'pre-revprop-change' +# invoke other programs to do the real work, though it may do the +# work itself too. +# +# Note that 'pre-revprop-change' must be executable by the user(s) who will +# invoke it (typically the user httpd runs as), and that user must +# have filesystem-level permission to access the repository. +# +# On a Windows system, you should name the hook program +# 'pre-revprop-change.bat' or 'pre-revprop-change.exe', +# but the basic idea is the same. +# +# The hook program runs in an empty environment, unless the server is +# explicitly configured otherwise. For example, a common problem is for +# the PATH environment variable to not be set to its usual value, so +# that subprograms fail to launch unless invoked via absolute path. +# If you're having unexpected problems with a hook program, the +# culprit may be unusual (or missing) environment variables. +# +# CAUTION: +# For security reasons, you MUST always properly quote arguments when +# you use them, as those arguments could contain whitespace or other +# problematic characters. Additionally, you should delimit the list +# of options with "--" before passing the arguments, so malicious +# clients cannot bootleg unexpected options to the commands your +# script aims to execute. +# For similar reasons, you should also add a trailing @ to URLs which +# are passed to SVN commands accepting URLs with peg revisions. +# +# Here is an example hook script, for a Unix /bin/sh interpreter. 
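+# To honor the back-up recommendation above, such a script could first record
+# the old value before deciding whether to allow the change. An illustrative
+# sketch only, assuming svnlook is on the PATH:
+#
+#   svnlook propget --revprop -r "$REV" "$REPOS" "$PROPNAME" >> "$REPOS"/hooks/revprop-backup.log
+#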
+# For more examples and pre-written hooks, see those in +# the Subversion repository at +# http://svn.apache.org/repos/asf/subversion/trunk/tools/hook-scripts/ and +# http://svn.apache.org/repos/asf/subversion/trunk/contrib/hook-scripts/ + + +REPOS="$1" +REV="$2" +USER="$3" +PROPNAME="$4" +ACTION="$5" + +if [ "$ACTION" = "M" -a "$PROPNAME" = "svn:log" ]; then exit 0; fi + +echo "Changing revision properties other than svn:log is prohibited" >&2 +exit 1 diff --git a/test/repos/simple-ext.svn/hooks/pre-unlock.tmpl b/test/repos/simple-ext.svn/hooks/pre-unlock.tmpl new file mode 100755 index 0000000000..9ba99d071b --- /dev/null +++ b/test/repos/simple-ext.svn/hooks/pre-unlock.tmpl @@ -0,0 +1,87 @@ +#!/bin/sh + +# PRE-UNLOCK HOOK +# +# The pre-unlock hook is invoked before an exclusive lock is +# destroyed. Subversion runs this hook by invoking a program +# (script, executable, binary, etc.) named 'pre-unlock' (for which +# this file is a template), with the following ordered arguments: +# +# [1] REPOS-PATH (the path to this repository) +# [2] PATH (the path in the repository about to be unlocked) +# [3] USER (the user destroying the lock) +# [4] TOKEN (the lock token to be destroyed) +# [5] BREAK-UNLOCK (1 if the user is breaking the lock, else 0) +# +# If the hook program exits with success, the lock is destroyed; but +# if it exits with failure (non-zero), the unlock action is aborted +# and STDERR is returned to the client. +# +# The default working directory for the invocation is undefined, so +# the program should set one explicitly if it cares. +# +# On a Unix system, the normal procedure is to have 'pre-unlock' +# invoke other programs to do the real work, though it may do the +# work itself too. +# +# Note that 'pre-unlock' must be executable by the user(s) who will +# invoke it (typically the user httpd runs as), and that user must +# have filesystem-level permission to access the repository. +# +# On a Windows system, you should name the hook program +# 'pre-unlock.bat' or 'pre-unlock.exe', +# but the basic idea is the same. +# +# The hook program runs in an empty environment, unless the server is +# explicitly configured otherwise. For example, a common problem is for +# the PATH environment variable to not be set to its usual value, so +# that subprograms fail to launch unless invoked via absolute path. +# If you're having unexpected problems with a hook program, the +# culprit may be unusual (or missing) environment variables. +# +# CAUTION: +# For security reasons, you MUST always properly quote arguments when +# you use them, as those arguments could contain whitespace or other +# problematic characters. Additionally, you should delimit the list +# of options with "--" before passing the arguments, so malicious +# clients cannot bootleg unexpected options to the commands your +# script aims to execute. +# For similar reasons, you should also add a trailing @ to URLs which +# are passed to SVN commands accepting URLs with peg revisions. +# +# Here is an example hook script, for a Unix /bin/sh interpreter. +# For more examples and pre-written hooks, see those in +# the Subversion repository at +# http://svn.apache.org/repos/asf/subversion/trunk/tools/hook-scripts/ and +# http://svn.apache.org/repos/asf/subversion/trunk/contrib/hook-scripts/ + + +REPOS="$1" +PATH="$2" +USER="$3" +TOKEN="$4" +BREAK="$5" + +# If a lock is owned by a different person, don't allow it be broken. +# (Maybe this script could send email to the lock owner?) 
+ +SVNLOOK=/opt/homebrew/Cellar/subversion/1.14.2_1/bin/svnlook +GREP=/bin/grep +SED=/bin/sed + +LOCK_OWNER=`$SVNLOOK lock "$REPOS" "$PATH" | \ + $GREP '^Owner: ' | $SED 's/Owner: //'` + +# If we get no result from svnlook, there's no lock, return success: +if [ "$LOCK_OWNER" = "" ]; then + exit 0 +fi + +# If the person unlocking matches the lock's owner, return success: +if [ "$LOCK_OWNER" = "$USER" ]; then + exit 0 +fi + +# Otherwise, we've got an owner mismatch, so return failure: +echo "Error: $PATH locked by ${LOCK_OWNER}." 1>&2 +exit 1 diff --git a/test/repos/simple-ext.svn/hooks/start-commit.tmpl b/test/repos/simple-ext.svn/hooks/start-commit.tmpl new file mode 100755 index 0000000000..1395e8315a --- /dev/null +++ b/test/repos/simple-ext.svn/hooks/start-commit.tmpl @@ -0,0 +1,81 @@ +#!/bin/sh + +# START-COMMIT HOOK +# +# The start-commit hook is invoked immediately after a Subversion txn is +# created and populated with initial revprops in the process of doing a +# commit. Subversion runs this hook by invoking a program (script, +# executable, binary, etc.) named 'start-commit' (for which this file +# is a template) with the following ordered arguments: +# +# [1] REPOS-PATH (the path to this repository) +# [2] USER (the authenticated user attempting to commit) +# [3] CAPABILITIES (a colon-separated list of capabilities reported +# by the client; see note below) +# [4] TXN-NAME (the name of the commit txn just created) +# +# Note: The CAPABILITIES parameter is new in Subversion 1.5, and 1.5 +# clients will typically report at least the "mergeinfo" capability. +# If there are other capabilities, then the list is colon-separated, +# e.g.: "mergeinfo:some-other-capability" (the order is undefined). +# +# The list is self-reported by the client. Therefore, you should not +# make security assumptions based on the capabilities list, nor should +# you assume that clients reliably report every capability they have. +# +# Note: The TXN-NAME parameter is new in Subversion 1.8. Prior to version +# 1.8, the start-commit hook was invoked before the commit txn was even +# created, so the ability to inspect the commit txn and its metadata from +# within the start-commit hook was not possible. +# +# If the hook program exits with success, the commit continues; but +# if it exits with failure (non-zero), the commit is stopped before +# a Subversion txn is created, and STDERR is returned to the client. +# +# The default working directory for the invocation is undefined, so +# the program should set one explicitly if it cares. +# +# On a Unix system, the normal procedure is to have 'start-commit' +# invoke other programs to do the real work, though it may do the +# work itself too. +# +# Note that 'start-commit' must be executable by the user(s) who will +# invoke it (typically the user httpd runs as), and that user must +# have filesystem-level permission to access the repository. +# +# On a Windows system, you should name the hook program +# 'start-commit.bat' or 'start-commit.exe', +# but the basic idea is the same. +# +# The hook program runs in an empty environment, unless the server is +# explicitly configured otherwise. For example, a common problem is for +# the PATH environment variable to not be set to its usual value, so +# that subprograms fail to launch unless invoked via absolute path. +# If you're having unexpected problems with a hook program, the +# culprit may be unusual (or missing) environment variables. 
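+#
+# As an illustration of the CAPABILITIES list described above (a sketch only;
+# CAPABILITIES is "$3" per the ordered arguments), a hook could, with the
+# caveats above in mind, refuse clients that do not report merge-tracking
+# support:
+#
+#   case ":$3:" in
+#     *:mergeinfo:*) ;;                                  # capability present
+#     *) echo "client must support mergeinfo" 1>&2; exit 1 ;;
+#   esac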
+# +# CAUTION: +# For security reasons, you MUST always properly quote arguments when +# you use them, as those arguments could contain whitespace or other +# problematic characters. Additionally, you should delimit the list +# of options with "--" before passing the arguments, so malicious +# clients cannot bootleg unexpected options to the commands your +# script aims to execute. +# For similar reasons, you should also add a trailing @ to URLs which +# are passed to SVN commands accepting URLs with peg revisions. +# +# Here is an example hook script, for a Unix /bin/sh interpreter. +# For more examples and pre-written hooks, see those in +# the Subversion repository at +# http://svn.apache.org/repos/asf/subversion/trunk/tools/hook-scripts/ and +# http://svn.apache.org/repos/asf/subversion/trunk/contrib/hook-scripts/ + + +REPOS="$1" +USER="$2" + +commit-allower.pl --repository "$REPOS" --user "$USER" || exit 1 +special-auth-check.py --user "$USER" --auth-level 3 || exit 1 + +# All checks passed, so allow the commit. +exit 0 diff --git a/test/repos/simple-ext.svn/locks/db-logs.lock b/test/repos/simple-ext.svn/locks/db-logs.lock new file mode 100644 index 0000000000..20dd6369be --- /dev/null +++ b/test/repos/simple-ext.svn/locks/db-logs.lock @@ -0,0 +1,3 @@ +This file is not used by Subversion 1.3.x or later. +However, its existence is required for compatibility with +Subversion 1.2.x or earlier. diff --git a/test/repos/simple-ext.svn/locks/db.lock b/test/repos/simple-ext.svn/locks/db.lock new file mode 100644 index 0000000000..20dd6369be --- /dev/null +++ b/test/repos/simple-ext.svn/locks/db.lock @@ -0,0 +1,3 @@ +This file is not used by Subversion 1.3.x or later. +However, its existence is required for compatibility with +Subversion 1.2.x or earlier. diff --git a/test/test_sys_checkout.py b/test/test_sys_checkout.py old mode 100644 new mode 100755 index df726f2b70..664160dc99 --- a/test/test_sys_checkout.py +++ b/test/test_sys_checkout.py @@ -1,7 +1,15 @@ -#!/usr/bin/env python +#!/usr/bin/env python3 """Unit test driver for checkout_externals +Terminology: + * 'container': a repo that has externals + * 'simple': a repo that has no externals, but is referenced as an external by another repo. + * 'mixed': a repo that both has externals and is referenced as an external by another repo. + + * 'clean': the local repo matches the version in the externals and has no local modifications. + * 'empty': the external isn't checked out at all. + Note: this script assume the path to the manic and checkout_externals module is already in the python path. This is usually handled by the makefile. If you call it directly, you may need @@ -21,7 +29,6 @@ * Erase any existing repos at the begining of the module in setUpModule. - """ # NOTE(bja, 2017-11) pylint complains that the module is too big, but @@ -66,27 +73,53 @@ # # --------------------------------------------------------------------- -# environment variable names -MANIC_TEST_BARE_REPO_ROOT = 'MANIC_TEST_BARE_REPO_ROOT' -MANIC_TEST_TMP_REPO_ROOT = 'MANIC_TEST_TMP_REPO_ROOT' -# directory names -TMP_REPO_DIR_NAME = 'tmp' +# Module-wide root directory for all the per-test subdirs we'll create on +# the fly (which are placed under wherever $CWD is when the test runs). +# Set by setupModule(). +module_tmp_root_dir = None +TMP_REPO_DIR_NAME = 'tmp' # subdir under $CWD + +# subdir under test/ that holds all of our checked-in repositories (which we +# will clone for these tests). 
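+#
+# For orientation, a rough sketch of the on-disk layout while one of these
+# tests runs (illustrative only; the per-test directory is named after the
+# running test's id, and paths are shown relative to manage_externals/ and to
+# the test's working directory, respectively):
+#
+#   test/repos/container.git             - bare 'parent' repo that gets cloned
+#   test/repos/simple-ext.git            - bare 'child' repo used as an external
+#   tmp/test_required_bytag/externals.cfg        - config written by the test
+#   tmp/test_required_bytag/externals/simp_tag/  - created by checkout_externals
+#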
BARE_REPO_ROOT_NAME = 'repos' -CONTAINER_REPO_NAME = 'container.git' -MIXED_REPO_NAME = 'mixed-cont-ext.git' -SIMPLE_REPO_NAME = 'simple-ext.git' -SIMPLE_FORK_NAME = 'simple-ext-fork.git' + +# Environment var referenced by checked-in externals file in mixed-cont-ext.git, +# which should be pointed to the fully-resolved BARE_REPO_ROOT_NAME directory. +# We explicitly clear this after every test, via tearDown(). +MIXED_CONT_EXT_ROOT_ENV_VAR = 'MANIC_TEST_BARE_REPO_ROOT' + +# Subdirs under bare repo root, each holding a repository. For more info +# on the contents of these repositories, see test/repos/README.md. In these +# tests the 'parent' repos are cloned as a starting point, whereas the 'child' +# repos are checked out when the tests run checkout_externals. +CONTAINER_REPO = 'container.git' # Parent repo +SIMPLE_REPO = 'simple-ext.git' # Child repo +SIMPLE_FORK_REPO = 'simple-ext-fork.git' # Child repo +MIXED_REPO = 'mixed-cont-ext.git' # Both parent and child +SVN_TEST_REPO = 'simple-ext.svn' # Subversion repository + +# Standard (arbitrary) external names for test configs +TAG_SECTION = 'simp_tag' +BRANCH_SECTION = 'simp_branch' +HASH_SECTION = 'simp_hash' + +# All the configs we construct check out their externals into these local paths. +EXTERNALS_PATH = 'externals' +SUB_EXTERNALS_PATH = 'src' # For mixed test repos, + +# For testing behavior with '.' instead of an explicit paths. SIMPLE_LOCAL_ONLY_NAME = '.' -ERROR_REPO_NAME = 'error' -EXTERNALS_NAME = 'externals' -SUB_EXTERNALS_PATH = 'src' -CFG_NAME = 'externals.cfg' -CFG_SUB_NAME = 'sub-externals.cfg' -README_NAME = 'readme.txt' -REMOTE_BRANCH_FEATURE2 = 'feature2' -SVN_TEST_REPO = 'https://github.com/escomp/cesm' +# Externals files. +CFG_NAME = 'externals.cfg' # We construct this on a per-test basis. +CFG_SUB_NAME = 'sub-externals.cfg' # Already exists in mixed-cont-ext repo. + +# Arbitrary text file in all the test repos. +README_NAME = 'readme.txt' + +# Branch that exists in both the simple and simple-fork repos. +REMOTE_BRANCH_FEATURE2 = 'feature2' # Disable too-many-public-methods error # pylint: disable=R0904 @@ -107,136 +140,113 @@ def setUpModule(): # pylint: disable=C0103 pass # create clean dir for this run os.mkdir(repo_root) - # set into the environment so var will be expanded in externals - # filess when executables are run - os.environ[MANIC_TEST_TMP_REPO_ROOT] = repo_root - - -class GenerateExternalsDescriptionCfgV1(object): - """Class to provide building blocks to create - ExternalsDescriptionCfgV1 files. - - Includes predefined files used in tests. - - """ - - def __init__(self): - self._schema_version = '1.1.0' - self._config = None - - def container_full(self, dest_dir): - """Create the full container config file with simple and mixed use - externals - - """ - self.create_config() - self.create_section(SIMPLE_REPO_NAME, 'simp_tag', - tag='tag1') - - self.create_section(SIMPLE_REPO_NAME, 'simp_branch', - branch=REMOTE_BRANCH_FEATURE2) - - self.create_section(SIMPLE_REPO_NAME, 'simp_opt', - tag='tag1', required=False) - self.create_section(MIXED_REPO_NAME, 'mixed_req', - branch='master', externals=CFG_SUB_NAME) + # Make available to all tests in this file. + global module_tmp_root_dir + assert module_tmp_root_dir == None, module_tmp_root_dir + module_tmp_root_dir = repo_root - self.write_config(dest_dir) - - def container_simple_required(self, dest_dir): - """Create a container externals file with only simple externals. 
- - """ - self.create_config() - self.create_section(SIMPLE_REPO_NAME, 'simp_tag', - tag='tag1') - - self.create_section(SIMPLE_REPO_NAME, 'simp_branch', - branch=REMOTE_BRANCH_FEATURE2) - - self.create_section(SIMPLE_REPO_NAME, 'simp_hash', - ref_hash='60b1cc1a38d63') - - self.write_config(dest_dir) - - def container_simple_optional(self, dest_dir): - """Create a container externals file with optional simple externals +class RepoUtils(object): + """Convenience methods for interacting with git repos.""" + @staticmethod + def create_branch(repo_base_dir, external_name, branch, with_commit=False): + """Create branch and optionally (with_commit) add a single commit. """ - self.create_config() - self.create_section(SIMPLE_REPO_NAME, 'simp_req', - tag='tag1') - - self.create_section(SIMPLE_REPO_NAME, 'simp_opt', - tag='tag1', required=False) - - self.write_config(dest_dir) - - def container_simple_svn(self, dest_dir): - """Create a container externals file with only simple externals. + # pylint: disable=R0913 + cwd = os.getcwd() + repo_root = os.path.join(repo_base_dir, EXTERNALS_PATH, external_name) + os.chdir(repo_root) + cmd = ['git', 'checkout', '-b', branch, ] + execute_subprocess(cmd) + if with_commit: + msg = 'start work on {0}'.format(branch) + with open(README_NAME, 'a') as handle: + handle.write(msg) + cmd = ['git', 'add', README_NAME, ] + execute_subprocess(cmd) + cmd = ['git', 'commit', '-m', msg, ] + execute_subprocess(cmd) + os.chdir(cwd) + @staticmethod + def create_commit(repo_base_dir, external_name): + """Make a commit to the given external. + + This is used to test sync state changes from local commits on + detached heads and tracking branches. """ - self.create_config() - self.create_section(SIMPLE_REPO_NAME, 'simp_tag', tag='tag1') - - self.create_svn_external('svn_branch', branch='trunk') - self.create_svn_external('svn_tag', tag='tags/cesm2.0.beta07') + cwd = os.getcwd() + repo_root = os.path.join(repo_base_dir, EXTERNALS_PATH, external_name) + os.chdir(repo_root) - self.write_config(dest_dir) + msg = 'work on great new feature!' + with open(README_NAME, 'a') as handle: + handle.write(msg) + cmd = ['git', 'add', README_NAME, ] + execute_subprocess(cmd) + cmd = ['git', 'commit', '-m', msg, ] + execute_subprocess(cmd) + os.chdir(cwd) - def container_sparse(self, dest_dir): - """Create a container with a full external and a sparse external + @staticmethod + def clone_test_repo(bare_root, test_id, parent_repo_name, dest_dir_in): + """Clone repo at / into dest_dir_in or local per-test-subdir. + Returns output dir. 
""" - # Create a file for a sparse pattern match - sparse_filename = 'sparse_checkout' - with open(os.path.join(dest_dir, sparse_filename), 'w') as sfile: - sfile.write('readme.txt') - - self.create_config() - self.create_section(SIMPLE_REPO_NAME, 'simp_tag', - tag='tag2') - - sparse_relpath = '../../{}'.format(sparse_filename) - self.create_section(SIMPLE_REPO_NAME, 'simp_sparse', - tag='tag2', sparse=sparse_relpath) + parent_repo_dir = os.path.join(bare_root, parent_repo_name) + if dest_dir_in is None: + # create unique subdir for this test + test_dir_name = test_id + print("Test repository name: {0}".format(test_dir_name)) + dest_dir = os.path.join(module_tmp_root_dir, test_dir_name) + else: + dest_dir = dest_dir_in - self.write_config(dest_dir) + # pylint: disable=W0212 + GitRepository._git_clone(parent_repo_dir, dest_dir, VERBOSITY_DEFAULT) + return dest_dir - def mixed_simple_base(self, dest_dir): - """Create a mixed-use base externals file with only simple externals. + @staticmethod + def add_file_to_repo(under_test_dir, filename, tracked): + """Add a file to the repository so we can put it into a dirty state """ - self.create_config() - self.create_section_ext_only('mixed_base') - self.create_section(SIMPLE_REPO_NAME, 'simp_tag', - tag='tag1') - - self.create_section(SIMPLE_REPO_NAME, 'simp_branch', - branch=REMOTE_BRANCH_FEATURE2) + cwd = os.getcwd() + os.chdir(under_test_dir) + with open(filename, 'w') as tmp: + tmp.write('Hello, world!') - self.create_section(SIMPLE_REPO_NAME, 'simp_hash', - ref_hash='60b1cc1a38d63') + if tracked: + # NOTE(bja, 2018-01) brittle hack to obtain repo dir and + # file name + path_data = filename.split('/') + repo_dir = os.path.join(path_data[0], path_data[1]) + os.chdir(repo_dir) + tracked_file = path_data[2] + cmd = ['git', 'add', tracked_file] + execute_subprocess(cmd) - self.write_config(dest_dir) + os.chdir(cwd) - def mixed_simple_sub(self, dest_dir): - """Create a mixed-use sub externals file with only simple externals. +class GenerateExternalsDescriptionCfgV1(object): + """Building blocks to create ExternalsDescriptionCfgV1 files. - """ - self.create_config() - self.create_section(SIMPLE_REPO_NAME, 'simp_tag', - tag='tag1', path=SUB_EXTERNALS_PATH) + Basic usage: create_config() multiple create_*(), then write_config(). + Optionally after that: write_with_*(). 
+ """ - self.create_section(SIMPLE_REPO_NAME, 'simp_branch', - branch=REMOTE_BRANCH_FEATURE2, - path=SUB_EXTERNALS_PATH) + def __init__(self, bare_root): + self._schema_version = '1.1.0' + self._config = None - self.write_config(dest_dir, filename=CFG_SUB_NAME) + # directory where we have test repositories (which we will clone for + # tests) + self._bare_root = bare_root def write_config(self, dest_dir, filename=CFG_NAME): - """Write the configuration file to disk + """Write self._config to disk """ dest_path = os.path.join(dest_dir, filename) @@ -258,27 +268,39 @@ def create_metadata(self): self._config.set(DESCRIPTION_SECTION, VERSION_ITEM, self._schema_version) - def create_section(self, repo_type, name, tag='', branch='', - ref_hash='', required=True, path=EXTERNALS_NAME, - externals='', repo_path=None, from_submodule=False, - sparse=''): + def url_for_repo_path(self, repo_path, repo_path_abs=None): + if repo_path_abs is not None: + return repo_path_abs + else: + return os.path.join(self._bare_root, repo_path) + + def create_section(self, repo_path, name, tag='', branch='', + ref_hash='', required=True, path=EXTERNALS_PATH, + sub_externals='', repo_path_abs=None, from_submodule=False, + sparse='', nested=False): # pylint: disable=too-many-branches - """Create a config section with autofilling some items and handling - optional items. + """Create a config ExternalsDescription section with the given name. + + Autofills some items and handles some optional items. + repo_path_abs overrides repo_path (which is relative to the bare repo) + path is a subdir under repo_path to check out to. """ # pylint: disable=R0913 self._config.add_section(name) if not from_submodule: - self._config.set(name, ExternalsDescription.PATH, - os.path.join(path, name)) + if nested: + self._config.set(name, ExternalsDescription.PATH, path) + else: + self._config.set(name, ExternalsDescription.PATH, + os.path.join(path, name)) self._config.set(name, ExternalsDescription.PROTOCOL, ExternalsDescription.PROTOCOL_GIT) # from_submodules is incompatible with some other options, turn them off if (from_submodule and - ((repo_path is not None) or tag or ref_hash or branch)): + ((repo_path_abs is not None) or tag or ref_hash or branch)): printlog('create_section: "from_submodule" is incompatible with ' '"repo_url", "tag", "hash", and "branch" options;\n' 'Ignoring those options for {}'.format(name)) @@ -287,10 +309,7 @@ def create_section(self, repo_type, name, tag='', branch='', ref_hash = '' branch = '' - if repo_path is not None: - repo_url = repo_path - else: - repo_url = os.path.join('${MANIC_TEST_BARE_REPO_ROOT}', repo_type) + repo_url = self.url_for_repo_path(repo_path, repo_path_abs) if not from_submodule: self._config.set(name, ExternalsDescription.REPO_URL, repo_url) @@ -306,8 +325,9 @@ def create_section(self, repo_type, name, tag='', branch='', if ref_hash: self._config.set(name, ExternalsDescription.HASH, ref_hash) - if externals: - self._config.set(name, ExternalsDescription.EXTERNALS, externals) + if sub_externals: + self._config.set(name, ExternalsDescription.EXTERNALS, + sub_externals) if sparse: self._config.set(name, ExternalsDescription.SPARSE, sparse) @@ -315,10 +335,8 @@ def create_section(self, repo_type, name, tag='', branch='', if from_submodule: self._config.set(name, ExternalsDescription.SUBMODULE, "True") - def create_section_ext_only(self, name, - required=True, externals=CFG_SUB_NAME): - """Create a config section with autofilling some items and handling - optional items. 
+ def create_section_reference_to_subexternal(self, name): + """Just a reference to another externals file. """ # pylint: disable=R0913 @@ -331,23 +349,22 @@ def create_section_ext_only(self, name, self._config.set(name, ExternalsDescription.REPO_URL, LOCAL_PATH_INDICATOR) - self._config.set(name, ExternalsDescription.REQUIRED, str(required)) + self._config.set(name, ExternalsDescription.REQUIRED, str(True)) - if externals: - self._config.set(name, ExternalsDescription.EXTERNALS, externals) + self._config.set(name, ExternalsDescription.EXTERNALS, CFG_SUB_NAME) - def create_svn_external(self, name, tag='', branch=''): + def create_svn_external(self, name, url, tag='', branch=''): """Create a config section for an svn repository. """ self._config.add_section(name) self._config.set(name, ExternalsDescription.PATH, - os.path.join(EXTERNALS_NAME, name)) + os.path.join(EXTERNALS_PATH, name)) self._config.set(name, ExternalsDescription.PROTOCOL, ExternalsDescription.PROTOCOL_SVN) - self._config.set(name, ExternalsDescription.REPO_URL, SVN_TEST_REPO) + self._config.set(name, ExternalsDescription.REPO_URL, url) self._config.set(name, ExternalsDescription.REQUIRED, str(True)) @@ -357,65 +374,19 @@ def create_svn_external(self, name, tag='', branch=''): if branch: self._config.set(name, ExternalsDescription.BRANCH, branch) - @staticmethod - def create_branch(dest_dir, repo_name, branch, with_commit=False): - """Update a repository branch, and potentially the remote. - """ - # pylint: disable=R0913 - cwd = os.getcwd() - repo_root = os.path.join(dest_dir, EXTERNALS_NAME) - repo_root = os.path.join(repo_root, repo_name) - os.chdir(repo_root) - cmd = ['git', 'checkout', '-b', branch, ] - execute_subprocess(cmd) - if with_commit: - msg = 'start work on {0}'.format(branch) - with open(README_NAME, 'a') as handle: - handle.write(msg) - cmd = ['git', 'add', README_NAME, ] - execute_subprocess(cmd) - cmd = ['git', 'commit', '-m', msg, ] - execute_subprocess(cmd) - os.chdir(cwd) - - @staticmethod - def create_commit(dest_dir, repo_name, local_tracking_branch=None): - """Make a commit on whatever is currently checked out. - - This is used to test sync state changes from local commits on - detached heads and tracking branches. + def write_with_git_branch(self, dest_dir, name, branch, new_remote_repo_path=None): + """Update fields in our config and write it to disk. - """ - cwd = os.getcwd() - repo_root = os.path.join(dest_dir, EXTERNALS_NAME) - repo_root = os.path.join(repo_root, repo_name) - os.chdir(repo_root) - if local_tracking_branch: - cmd = ['git', 'checkout', '-b', local_tracking_branch, ] - execute_subprocess(cmd) - - msg = 'work on great new feature!' - with open(README_NAME, 'a') as handle: - handle.write(msg) - cmd = ['git', 'add', README_NAME, ] - execute_subprocess(cmd) - cmd = ['git', 'commit', '-m', msg, ] - execute_subprocess(cmd) - os.chdir(cwd) - - def update_branch(self, dest_dir, name, branch, repo_type=None, - filename=CFG_NAME): - """Update a repository branch, and potentially the remote. + name is the key of the ExternalsDescription in self._config to update. 
""" # pylint: disable=R0913 self._config.set(name, ExternalsDescription.BRANCH, branch) - if repo_type: - if repo_type == SIMPLE_LOCAL_ONLY_NAME: + if new_remote_repo_path: + if new_remote_repo_path == SIMPLE_LOCAL_ONLY_NAME: repo_url = SIMPLE_LOCAL_ONLY_NAME else: - repo_url = os.path.join('${MANIC_TEST_BARE_REPO_ROOT}', - repo_type) + repo_url = os.path.join(self._bare_root, new_remote_repo_path) self._config.set(name, ExternalsDescription.REPO_URL, repo_url) try: @@ -424,9 +395,9 @@ def update_branch(self, dest_dir, name, branch, repo_type=None, except BaseException: pass - self.write_config(dest_dir, filename) + self.write_config(dest_dir) - def update_svn_branch(self, dest_dir, name, branch, filename=CFG_NAME): + def write_with_svn_branch(self, dest_dir, name, branch): """Update a repository branch, and potentially the remote. """ # pylint: disable=R0913 @@ -438,11 +409,11 @@ def update_svn_branch(self, dest_dir, name, branch, filename=CFG_NAME): except BaseException: pass - self.write_config(dest_dir, filename) + self.write_config(dest_dir) - def update_tag(self, dest_dir, name, tag, repo_type=None, - filename=CFG_NAME, remove_branch=True): - """Update a repository tag, and potentially the remote + def write_with_tag_and_remote_repo(self, dest_dir, name, tag, new_remote_repo_path, + remove_branch=True): + """Update a repository tag and the remote. NOTE(bja, 2017-11) remove_branch=False should result in an overspecified external with both a branch and tag. This is @@ -452,8 +423,8 @@ def update_tag(self, dest_dir, name, tag, repo_type=None, # pylint: disable=R0913 self._config.set(name, ExternalsDescription.TAG, tag) - if repo_type: - repo_url = os.path.join('${MANIC_TEST_BARE_REPO_ROOT}', repo_type) + if new_remote_repo_path: + repo_url = os.path.join(self._bare_root, new_remote_repo_path) self._config.set(name, ExternalsDescription.REPO_URL, repo_url) try: @@ -463,10 +434,9 @@ def update_tag(self, dest_dir, name, tag, repo_type=None, except BaseException: pass - self.write_config(dest_dir, filename) + self.write_config(dest_dir) - def update_underspecify_branch_tag(self, dest_dir, name, - filename=CFG_NAME): + def write_without_branch_tag(self, dest_dir, name): """Update a repository protocol, and potentially the remote """ # pylint: disable=R0913 @@ -482,10 +452,9 @@ def update_underspecify_branch_tag(self, dest_dir, name, except BaseException: pass - self.write_config(dest_dir, filename) + self.write_config(dest_dir) - def update_underspecify_remove_url(self, dest_dir, name, - filename=CFG_NAME): + def write_without_repo_url(self, dest_dir, name): """Update a repository protocol, and potentially the remote """ # pylint: disable=R0913 @@ -495,22 +464,59 @@ def update_underspecify_remove_url(self, dest_dir, name, except BaseException: pass - self.write_config(dest_dir, filename) + self.write_config(dest_dir) - def update_protocol(self, dest_dir, name, protocol, repo_type=None, - filename=CFG_NAME): + def write_with_protocol(self, dest_dir, name, protocol, repo_path=None): """Update a repository protocol, and potentially the remote """ # pylint: disable=R0913 self._config.set(name, ExternalsDescription.PROTOCOL, protocol) - if repo_type: - repo_url = os.path.join('${MANIC_TEST_BARE_REPO_ROOT}', repo_type) + if repo_path: + repo_url = os.path.join(self._bare_root, repo_path) self._config.set(name, ExternalsDescription.REPO_URL, repo_url) - self.write_config(dest_dir, filename) + self.write_config(dest_dir) +def _execute_checkout_in_dir(dirname, args, debug_env=''): + """Execute the 
checkout command in the appropriate repo dir with the + specified additional args. + + args should be a list of strings. + debug_env shuld be a string of the form 'FOO=bar' or the empty string. + + Note that we are calling the command line processing and main + routines and not using a subprocess call so that we get code + coverage results! Note this means that environment variables are passed + to checkout_externals via os.environ; debug_env is just used to aid + manual reproducibility of a given call. + + Returns (overall_status, tree_status) + where overall_status is 0 for success, nonzero otherwise. + and tree_status is set if --status was passed in, None otherwise. + + Note this command executes the checkout command, it doesn't + necessarily do any checking out (e.g. if --status is passed in). + """ + cwd = os.getcwd() + + # Construct a command line for reproducibility; this command is not + # actually executed in the test. + os.chdir(dirname) + cmdline = ['--externals', CFG_NAME, ] + cmdline += args + manual_cmd = ('Running equivalent of:\n' + 'pushd {dirname}; ' + '{debug_env} /path/to/checkout_externals {args}'.format( + dirname=dirname, debug_env=debug_env, + args=' '.join(cmdline))) + printlog(manual_cmd) + options = checkout.commandline_arguments(cmdline) + overall_status, tree_status = checkout.main(options) + os.chdir(cwd) + return overall_status, tree_status + class BaseTestSysCheckout(unittest.TestCase): """Base class of reusable systems level test setup for checkout_externals @@ -521,6 +527,7 @@ class BaseTestSysCheckout(unittest.TestCase): # cryptic. # pylint: disable=invalid-name + # Command-line args for checkout_externals, used in execute_checkout_in_dir() status_args = ['--status'] checkout_args = [] optional_args = ['--optional'] @@ -535,384 +542,74 @@ def setUp(self): self._test_id = self.id().split('.')[-1] - # path to the executable - self._checkout = os.path.join('../checkout_externals') - self._checkout = os.path.abspath(self._checkout) + # find root + if os.path.exists(os.path.join(os.getcwd(), 'checkout_externals')): + root_dir = os.path.abspath(os.getcwd()) + else: + # maybe we are in a subdir, search up + root_dir = os.path.abspath(os.path.join(os.getcwd(), os.pardir)) + while os.path.basename(root_dir): + if os.path.exists(os.path.join(root_dir, 'checkout_externals')): + break + root_dir = os.path.dirname(root_dir) + + if not os.path.exists(os.path.join(root_dir, 'checkout_externals')): + raise RuntimeError('Cannot find checkout_externals') - # directory where we have test repositories - self._bare_root = os.path.join(os.getcwd(), BARE_REPO_ROOT_NAME) - self._bare_root = os.path.abspath(self._bare_root) + # path to the executable + self._checkout = os.path.join(root_dir, 'checkout_externals') - # set into the environment so var will be expanded in externals files - os.environ[MANIC_TEST_BARE_REPO_ROOT] = self._bare_root + # directory where we have test repositories (which we will clone for + # tests) + self._bare_root = os.path.abspath( + os.path.join(root_dir, 'test', BARE_REPO_ROOT_NAME)) # set the input file generator - self._generator = GenerateExternalsDescriptionCfgV1() + self._generator = GenerateExternalsDescriptionCfgV1(self._bare_root) # set the input file generator for secondary externals - self._sub_generator = GenerateExternalsDescriptionCfgV1() + self._sub_generator = GenerateExternalsDescriptionCfgV1(self._bare_root) def tearDown(self): """Tear down for individual tests """ - # remove the env var we added in setup - del 
os.environ[MANIC_TEST_BARE_REPO_ROOT] - # return to our common starting point os.chdir(self._return_dir) - - def setup_test_repo(self, parent_repo_name, dest_dir_in=None): - """Setup the paths and clone the base test repo - - """ - # unique repo for this test - test_dir_name = self._test_id - print("Test repository name: {0}".format(test_dir_name)) - - parent_repo_dir = os.path.join(self._bare_root, parent_repo_name) - if dest_dir_in is None: - dest_dir = os.path.join(os.environ[MANIC_TEST_TMP_REPO_ROOT], - test_dir_name) - else: - dest_dir = dest_dir_in - - # pylint: disable=W0212 - GitRepository._git_clone(parent_repo_dir, dest_dir, VERBOSITY_DEFAULT) - return dest_dir - - @staticmethod - def _add_file_to_repo(under_test_dir, filename, tracked): - """Add a file to the repository so we can put it into a dirty state - - """ - cwd = os.getcwd() - os.chdir(under_test_dir) - with open(filename, 'w') as tmp: - tmp.write('Hello, world!') - - if tracked: - # NOTE(bja, 2018-01) brittle hack to obtain repo dir and - # file name - path_data = filename.split('/') - repo_dir = os.path.join(path_data[0], path_data[1]) - os.chdir(repo_dir) - tracked_file = path_data[2] - cmd = ['git', 'add', tracked_file] - execute_subprocess(cmd) - - os.chdir(cwd) + + # (in case this was set) Don't pollute environment of other tests. + os.environ.pop(MIXED_CONT_EXT_ROOT_ENV_VAR, + None) # Don't care if key wasn't set. + + def clone_test_repo(self, parent_repo_name, dest_dir_in=None): + """Clones repo under self._bare_root""" + return RepoUtils.clone_test_repo(self._bare_root, self._test_id, + parent_repo_name, dest_dir_in) + + def execute_checkout_in_dir(self, dirname, args, debug_env=''): + overall_status, tree_status = _execute_checkout_in_dir(dirname, args, + debug_env=debug_env) + self.assertEqual(overall_status, 0) + return tree_status + + def execute_checkout_with_status(self, dirname, args, debug_env=''): + """Calls checkout a second time to get status if needed.""" + tree_status = self.execute_checkout_in_dir( + dirname, args, debug_env=debug_env) + if tree_status is None: + tree_status = self.execute_checkout_in_dir(dirname, + self.status_args, + debug_env=debug_env) + self.assertNotEqual(tree_status, None) + return tree_status + + def _check_sync_clean(self, ext_status, expected_sync_state, + expected_clean_state): + self.assertEqual(ext_status.sync_state, expected_sync_state) + self.assertEqual(ext_status.clean_state, expected_clean_state) @staticmethod - def execute_cmd_in_dir(under_test_dir, args): - """Extecute the checkout command in the appropriate repo dir with the - specified additional args - - Note that we are calling the command line processing and main - routines and not using a subprocess call so that we get code - coverage results! 
- - """ - cwd = os.getcwd() - checkout_path = os.path.abspath('{0}/../../checkout_externals') - os.chdir(under_test_dir) - cmdline = ['--externals', CFG_NAME, ] - cmdline += args - repo_root = 'MANIC_TEST_BARE_REPO_ROOT={root}'.format( - root=os.environ[MANIC_TEST_BARE_REPO_ROOT]) - manual_cmd = ('Test cmd:\npushd {cwd}; {env} {checkout} {args}'.format( - cwd=under_test_dir, env=repo_root, checkout=checkout_path, - args=' '.join(cmdline))) - printlog(manual_cmd) - options = checkout.commandline_arguments(cmdline) - overall_status, tree_status = checkout.main(options) - os.chdir(cwd) - return overall_status, tree_status - - # ---------------------------------------------------------------- - # - # Check results for generic perturbation of states - # - # ---------------------------------------------------------------- - def _check_generic_empty_default_required(self, tree, name): - self.assertEqual(tree[name].sync_state, ExternalStatus.EMPTY) - self.assertEqual(tree[name].clean_state, ExternalStatus.DEFAULT) - self.assertEqual(tree[name].source_type, ExternalStatus.MANAGED) - - def _check_generic_ok_clean_required(self, tree, name): - self.assertEqual(tree[name].sync_state, ExternalStatus.STATUS_OK) - self.assertEqual(tree[name].clean_state, ExternalStatus.STATUS_OK) - self.assertEqual(tree[name].source_type, ExternalStatus.MANAGED) - - def _check_generic_ok_dirty_required(self, tree, name): - self.assertEqual(tree[name].sync_state, ExternalStatus.STATUS_OK) - self.assertEqual(tree[name].clean_state, ExternalStatus.DIRTY) - self.assertEqual(tree[name].source_type, ExternalStatus.MANAGED) - - def _check_generic_modified_ok_required(self, tree, name): - self.assertEqual(tree[name].sync_state, ExternalStatus.MODEL_MODIFIED) - self.assertEqual(tree[name].clean_state, ExternalStatus.STATUS_OK) - self.assertEqual(tree[name].source_type, ExternalStatus.MANAGED) - - def _check_generic_empty_default_optional(self, tree, name): - self.assertEqual(tree[name].sync_state, ExternalStatus.EMPTY) - self.assertEqual(tree[name].clean_state, ExternalStatus.DEFAULT) - self.assertEqual(tree[name].source_type, ExternalStatus.OPTIONAL) - - def _check_generic_ok_clean_optional(self, tree, name): - self.assertEqual(tree[name].sync_state, ExternalStatus.STATUS_OK) - self.assertEqual(tree[name].clean_state, ExternalStatus.STATUS_OK) - self.assertEqual(tree[name].source_type, ExternalStatus.OPTIONAL) - - # ---------------------------------------------------------------- - # - # Check results for individual named externals - # - # ---------------------------------------------------------------- - def _check_simple_tag_empty(self, tree, directory=EXTERNALS_NAME): - name = './{0}/simp_tag'.format(directory) - self._check_generic_empty_default_required(tree, name) - - def _check_simple_tag_ok(self, tree, directory=EXTERNALS_NAME): - name = './{0}/simp_tag'.format(directory) - self._check_generic_ok_clean_required(tree, name) - - def _check_simple_tag_dirty(self, tree, directory=EXTERNALS_NAME): - name = './{0}/simp_tag'.format(directory) - self._check_generic_ok_dirty_required(tree, name) - - def _check_simple_tag_modified(self, tree, directory=EXTERNALS_NAME): - name = './{0}/simp_tag'.format(directory) - self._check_generic_modified_ok_required(tree, name) - - def _check_simple_branch_empty(self, tree, directory=EXTERNALS_NAME): - name = './{0}/simp_branch'.format(directory) - self._check_generic_empty_default_required(tree, name) - - def _check_simple_branch_ok(self, tree, directory=EXTERNALS_NAME): - name = 
'./{0}/simp_branch'.format(directory) - self._check_generic_ok_clean_required(tree, name) - - def _check_simple_branch_modified(self, tree, directory=EXTERNALS_NAME): - name = './{0}/simp_branch'.format(directory) - self._check_generic_modified_ok_required(tree, name) - - def _check_simple_hash_empty(self, tree, directory=EXTERNALS_NAME): - name = './{0}/simp_hash'.format(directory) - self._check_generic_empty_default_required(tree, name) - - def _check_simple_hash_ok(self, tree, directory=EXTERNALS_NAME): - name = './{0}/simp_hash'.format(directory) - self._check_generic_ok_clean_required(tree, name) - - def _check_simple_hash_modified(self, tree, directory=EXTERNALS_NAME): - name = './{0}/simp_hash'.format(directory) - self._check_generic_modified_ok_required(tree, name) - - def _check_simple_req_empty(self, tree, directory=EXTERNALS_NAME): - name = './{0}/simp_req'.format(directory) - self._check_generic_empty_default_required(tree, name) - - def _check_simple_req_ok(self, tree, directory=EXTERNALS_NAME): - name = './{0}/simp_req'.format(directory) - self._check_generic_ok_clean_required(tree, name) - - def _check_simple_opt_empty(self, tree, directory=EXTERNALS_NAME): - name = './{0}/simp_opt'.format(directory) - self._check_generic_empty_default_optional(tree, name) - - def _check_simple_opt_ok(self, tree, directory=EXTERNALS_NAME): - name = './{0}/simp_opt'.format(directory) - self._check_generic_ok_clean_optional(tree, name) - - def _check_mixed_ext_branch_empty(self, tree, directory=EXTERNALS_NAME): - name = './{0}/mixed_req'.format(directory) - self._check_generic_empty_default_required(tree, name) - - def _check_mixed_ext_branch_ok(self, tree, directory=EXTERNALS_NAME): - name = './{0}/mixed_req'.format(directory) - self._check_generic_ok_clean_required(tree, name) - - def _check_mixed_ext_branch_modified(self, tree, directory=EXTERNALS_NAME): - name = './{0}/mixed_req'.format(directory) - self._check_generic_modified_ok_required(tree, name) - - def _check_simple_sparse_empty(self, tree, directory=EXTERNALS_NAME): - name = './{0}/simp_sparse'.format(directory) - self._check_generic_empty_default_required(tree, name) - - def _check_simple_sparse_ok(self, tree, directory=EXTERNALS_NAME): - name = './{0}/simp_sparse'.format(directory) - self._check_generic_ok_clean_required(tree, name) - - # ---------------------------------------------------------------- - # - # Check results for groups of externals under specific conditions - # - # ---------------------------------------------------------------- - def _check_container_simple_required_pre_checkout(self, overall, tree): - self.assertEqual(overall, 0) - self._check_simple_tag_empty(tree) - self._check_simple_branch_empty(tree) - self._check_simple_hash_empty(tree) - - def _check_container_simple_required_checkout(self, overall, tree): - # Note, this is the internal tree status just before checkout - self.assertEqual(overall, 0) - self._check_simple_tag_empty(tree) - self._check_simple_branch_empty(tree) - self._check_simple_hash_empty(tree) - - def _check_container_simple_required_post_checkout(self, overall, tree): - self.assertEqual(overall, 0) - self._check_simple_tag_ok(tree) - self._check_simple_branch_ok(tree) - self._check_simple_hash_ok(tree) - - def _check_container_simple_required_out_of_sync(self, overall, tree): - self.assertEqual(overall, 0) - self._check_simple_tag_modified(tree) - self._check_simple_branch_modified(tree) - self._check_simple_hash_modified(tree) - - def 
_check_container_simple_optional_pre_checkout(self, overall, tree): - self.assertEqual(overall, 0) - self._check_simple_req_empty(tree) - self._check_simple_opt_empty(tree) - - def _check_container_simple_optional_checkout(self, overall, tree): - self.assertEqual(overall, 0) - self._check_simple_req_empty(tree) - self._check_simple_opt_empty(tree) - - def _check_container_simple_optional_post_checkout(self, overall, tree): - self.assertEqual(overall, 0) - self._check_simple_req_ok(tree) - self._check_simple_opt_empty(tree) - - def _check_container_simple_optional_post_optional(self, overall, tree): - self.assertEqual(overall, 0) - self._check_simple_req_ok(tree) - self._check_simple_opt_ok(tree) - - def _check_container_simple_required_sb_modified(self, overall, tree): - self.assertEqual(overall, 0) - self._check_simple_tag_ok(tree) - self._check_simple_branch_modified(tree) - self._check_simple_hash_ok(tree) - - def _check_container_simple_optional_st_dirty(self, overall, tree): - self.assertEqual(overall, 0) - self._check_simple_tag_dirty(tree) - self._check_simple_branch_ok(tree) - - def _check_container_full_pre_checkout(self, overall, tree): - self.assertEqual(overall, 0) - self._check_simple_tag_empty(tree) - self._check_simple_branch_empty(tree) - self._check_simple_opt_empty(tree) - self._check_mixed_ext_branch_required_pre_checkout(overall, tree) - - def _check_container_component_post_checkout(self, overall, tree): - self.assertEqual(overall, 0) - self._check_simple_opt_ok(tree) - self._check_simple_tag_empty(tree) - self._check_simple_branch_empty(tree) - - def _check_container_component_post_checkout2(self, overall, tree): - self.assertEqual(overall, 0) - self._check_simple_opt_ok(tree) - self._check_simple_tag_empty(tree) - self._check_simple_branch_ok(tree) - - def _check_container_full_post_checkout(self, overall, tree): - self.assertEqual(overall, 0) - self._check_simple_tag_ok(tree) - self._check_simple_branch_ok(tree) - self._check_simple_opt_empty(tree) - self._check_mixed_ext_branch_required_post_checkout(overall, tree) - - def _check_container_full_pre_checkout_ext_change(self, overall, tree): - self.assertEqual(overall, 0) - self._check_simple_tag_ok(tree) - self._check_simple_branch_ok(tree) - self._check_simple_opt_empty(tree) - self._check_mixed_ext_branch_required_pre_checkout_ext_change( - overall, tree) - - def _check_container_full_post_checkout_subext_modified( - self, overall, tree): - self.assertEqual(overall, 0) - self._check_simple_tag_ok(tree) - self._check_simple_branch_ok(tree) - self._check_simple_opt_empty(tree) - self._check_mixed_ext_branch_required_post_checkout_subext_modified( - overall, tree) - - def _check_mixed_ext_branch_required_pre_checkout(self, overall, tree): - # Note, this is the internal tree status just before checkout - self.assertEqual(overall, 0) - self._check_mixed_ext_branch_empty(tree, directory=EXTERNALS_NAME) - # NOTE: externals/mixed_req/src should not exist in the tree - # since this is the status before checkout of mixed_req. 
- - def _check_mixed_ext_branch_required_post_checkout(self, overall, tree): - # Note, this is the internal tree status just before checkout - self.assertEqual(overall, 0) - self._check_mixed_ext_branch_ok(tree, directory=EXTERNALS_NAME) - check_dir = "{0}/{1}/{2}".format(EXTERNALS_NAME, "mixed_req", - SUB_EXTERNALS_PATH) - self._check_simple_branch_ok(tree, directory=check_dir) - - def _check_mixed_ext_branch_required_pre_checkout_ext_change( - self, overall, tree): - # Note, this is the internal tree status just after change the - # externals description file, but before checkout - self.assertEqual(overall, 0) - self._check_mixed_ext_branch_modified(tree, directory=EXTERNALS_NAME) - check_dir = "{0}/{1}/{2}".format(EXTERNALS_NAME, "mixed_req", - SUB_EXTERNALS_PATH) - self._check_simple_branch_ok(tree, directory=check_dir) - - def _check_mixed_ext_branch_required_post_checkout_subext_modified( - self, overall, tree): - # Note, this is the internal tree status just after change the - # externals description file, but before checkout - self.assertEqual(overall, 0) - self._check_mixed_ext_branch_ok(tree, directory=EXTERNALS_NAME) - check_dir = "{0}/{1}/{2}".format(EXTERNALS_NAME, "mixed_req", - SUB_EXTERNALS_PATH) - self._check_simple_branch_modified(tree, directory=check_dir) - - def _check_mixed_cont_simple_required_pre_checkout(self, overall, tree): - # Note, this is the internal tree status just before checkout - self.assertEqual(overall, 0) - self._check_simple_tag_empty(tree, directory=EXTERNALS_NAME) - self._check_simple_branch_empty(tree, directory=EXTERNALS_NAME) - self._check_simple_branch_empty(tree, directory=SUB_EXTERNALS_PATH) - - def _check_mixed_cont_simple_required_checkout(self, overall, tree): - # Note, this is the internal tree status just before checkout - self.assertEqual(overall, 0) - self._check_simple_tag_empty(tree, directory=EXTERNALS_NAME) - self._check_simple_branch_empty(tree, directory=EXTERNALS_NAME) - self._check_simple_branch_empty(tree, directory=SUB_EXTERNALS_PATH) - - def _check_mixed_cont_simple_required_post_checkout(self, overall, tree): - # Note, this is the internal tree status just before checkout - self.assertEqual(overall, 0) - self._check_simple_tag_ok(tree, directory=EXTERNALS_NAME) - self._check_simple_branch_ok(tree, directory=EXTERNALS_NAME) - self._check_simple_branch_ok(tree, directory=SUB_EXTERNALS_PATH) - - def _check_container_sparse_pre_checkout(self, overall, tree): - self.assertEqual(overall, 0) - self._check_simple_tag_empty(tree) - self._check_simple_sparse_empty(tree) - - def _check_container_sparse_post_checkout(self, overall, tree): - self.assertEqual(overall, 0) - self._check_simple_tag_ok(tree) - self._check_simple_sparse_ok(tree) - + def _external_path(section_name, base_path=EXTERNALS_PATH): + return './{0}/{1}'.format(base_path, section_name) + def _check_file_exists(self, repo_dir, pathname): "Check that exists in " self.assertTrue(os.path.exists(os.path.join(repo_dir, pathname))) @@ -921,9 +618,9 @@ def _check_file_absent(self, repo_dir, pathname): "Check that does not exist in " self.assertFalse(os.path.exists(os.path.join(repo_dir, pathname))) + class TestSysCheckout(BaseTestSysCheckout): """Run systems level tests of checkout_externals - """ # NOTE(bja, 2017-11) pylint complains about long method names, but # it is hard to differentiate tests without making them more @@ -935,214 +632,431 @@ class TestSysCheckout(BaseTestSysCheckout): # Run systems tests # # 
---------------------------------------------------------------- - def test_container_simple_required(self): - """Verify that a container with simple subrepos - generates the correct initial status. - + def test_required_bytag(self): + """Check out a required external pointing to a git tag.""" + cloned_repo_dir = self.clone_test_repo(CONTAINER_REPO) + self._generator.create_config() + self._generator.create_section(SIMPLE_REPO, TAG_SECTION, + tag='tag1') + self._generator.write_config(cloned_repo_dir) + + # externals start out 'empty' aka not checked out. + tree = self.execute_checkout_in_dir(cloned_repo_dir, + self.status_args) + local_path_rel = self._external_path(TAG_SECTION) + self._check_sync_clean(tree[local_path_rel], + ExternalStatus.EMPTY, + ExternalStatus.DEFAULT) + local_path_abs = os.path.join(cloned_repo_dir, local_path_rel) + self.assertFalse(os.path.exists(local_path_abs)) + + # after checkout, the external is 'clean' aka at the correct version. + tree = self.execute_checkout_with_status(cloned_repo_dir, + self.checkout_args) + self._check_sync_clean(tree[local_path_rel], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) + + # Actually checked out the desired repo. + self.assertEqual('origin', GitRepository._remote_name_for_url( + # Which url to look up + self._generator.url_for_repo_path(SIMPLE_REPO), + # Which directory has the local checked-out repo. + dirname=local_path_abs)) + + # Actually checked out the desired tag. + (tag_found, tag_name) = GitRepository._git_current_tag(local_path_abs) + self.assertEqual(tag_name, 'tag1') + + # Check existence of some simp_tag files + tag_path = os.path.join('externals', TAG_SECTION) + self._check_file_exists(cloned_repo_dir, + os.path.join(tag_path, README_NAME)) + # Subrepo should not exist (not referenced by configs). + self._check_file_absent(cloned_repo_dir, os.path.join(tag_path, + 'simple_subdir', + 'subdir_file.txt')) + + def test_required_bybranch(self): + """Check out a required external pointing to a git branch.""" + cloned_repo_dir = self.clone_test_repo(CONTAINER_REPO) + self._generator.create_config() + self._generator.create_section(SIMPLE_REPO, BRANCH_SECTION, + branch=REMOTE_BRANCH_FEATURE2) + self._generator.write_config(cloned_repo_dir) + + # externals start out 'empty' aka not checked out. + tree = self.execute_checkout_in_dir(cloned_repo_dir, + self.status_args) + local_path_rel = self._external_path(BRANCH_SECTION) + self._check_sync_clean(tree[local_path_rel], + ExternalStatus.EMPTY, + ExternalStatus.DEFAULT) + local_path_abs = os.path.join(cloned_repo_dir, local_path_rel) + self.assertFalse(os.path.exists(local_path_abs)) + + # after checkout, the external is 'clean' aka at the correct version. + tree = self.execute_checkout_with_status(cloned_repo_dir, + self.checkout_args) + self._check_sync_clean(tree[local_path_rel], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) + self.assertTrue(os.path.exists(local_path_abs)) + + # Actually checked out the desired repo. + self.assertEqual('origin', GitRepository._remote_name_for_url( + # Which url to look up + self._generator.url_for_repo_path(SIMPLE_REPO), + # Which directory has the local checked-out repo. + dirname=local_path_abs)) + + # Actually checked out the desired branch. 
+ (branch_found, branch_name) = GitRepository._git_current_remote_branch( + local_path_abs) + self.assertEquals(branch_name, 'origin/' + REMOTE_BRANCH_FEATURE2) + + def test_required_byhash(self): + """Check out a required external pointing to a git hash.""" + cloned_repo_dir = self.clone_test_repo(CONTAINER_REPO) + self._generator.create_config() + self._generator.create_section(SIMPLE_REPO, HASH_SECTION, + ref_hash='60b1cc1a38d63') + self._generator.write_config(cloned_repo_dir) + + # externals start out 'empty' aka not checked out. + tree = self.execute_checkout_in_dir(cloned_repo_dir, + self.status_args) + local_path_rel = self._external_path(HASH_SECTION) + self._check_sync_clean(tree[local_path_rel], + ExternalStatus.EMPTY, + ExternalStatus.DEFAULT) + local_path_abs = os.path.join(cloned_repo_dir, local_path_rel) + self.assertFalse(os.path.exists(local_path_abs)) + + # after checkout, the externals are 'clean' aka at their correct version. + tree = self.execute_checkout_with_status(cloned_repo_dir, + self.checkout_args) + self._check_sync_clean(tree[local_path_rel], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) + + # Actually checked out the desired repo. + self.assertEqual('origin', GitRepository._remote_name_for_url( + # Which url to look up + self._generator.url_for_repo_path(SIMPLE_REPO), + # Which directory has the local checked-out repo. + dirname=local_path_abs)) + + # Actually checked out the desired hash. + (hash_found, hash_name) = GitRepository._git_current_hash( + local_path_abs) + self.assertTrue(hash_name.startswith('60b1cc1a38d63'), + msg=hash_name) + + def test_container_nested_required(self): + """Verify that a container with nested subrepos generates the correct initial status. + Tests over all possible permutations """ - # create repo - under_test_dir = self.setup_test_repo(CONTAINER_REPO_NAME) - self._generator.container_simple_required(under_test_dir) - - # status of empty repo - overall, tree = self.execute_cmd_in_dir(under_test_dir, + # Output subdirs for each of the externals, to test that one external can be + # checked out in a subdir of another. + NESTED_SUBDIR = ['./fred', './fred/wilma', './fred/wilma/barney'] + + # Assert that each type of external (e.g. tag vs branch) can be at any parent level + # (e.g. child/parent/grandparent). + orders = [[0, 1, 2], [1, 2, 0], [2, 0, 1], + [0, 2, 1], [2, 1, 0], [1, 0, 2]] + for n, order in enumerate(orders): + dest_dir = os.path.join(module_tmp_root_dir, self._test_id, + "test"+str(n)) + cloned_repo_dir = self.clone_test_repo(CONTAINER_REPO, + dest_dir_in=dest_dir) + self._generator.create_config() + # We happen to check out each section via a different reference (tag/branch/hash) but + # those don't really matter, we just need to check out three repos into a nested set of + # directories. + self._generator.create_section( + SIMPLE_REPO, TAG_SECTION, nested=True, + tag='tag1', path=NESTED_SUBDIR[order[0]]) + self._generator.create_section( + SIMPLE_REPO, BRANCH_SECTION, nested=True, + branch=REMOTE_BRANCH_FEATURE2, path=NESTED_SUBDIR[order[1]]) + self._generator.create_section( + SIMPLE_REPO, HASH_SECTION, nested=True, + ref_hash='60b1cc1a38d63', path=NESTED_SUBDIR[order[2]]) + self._generator.write_config(cloned_repo_dir) + + # all externals start out 'empty' aka not checked out. 
+ tree = self.execute_checkout_in_dir(cloned_repo_dir, self.status_args) - self._check_container_simple_required_pre_checkout(overall, tree) - - # checkout - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.checkout_args) - self._check_container_simple_required_checkout(overall, tree) - - # status clean checked out - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.status_args) - self._check_container_simple_required_post_checkout(overall, tree) - + self._check_sync_clean(tree[NESTED_SUBDIR[order[0]]], + ExternalStatus.EMPTY, + ExternalStatus.DEFAULT) + self._check_sync_clean(tree[NESTED_SUBDIR[order[1]]], + ExternalStatus.EMPTY, + ExternalStatus.DEFAULT) + self._check_sync_clean(tree[NESTED_SUBDIR[order[2]]], + ExternalStatus.EMPTY, + ExternalStatus.DEFAULT) + + # after checkout, all the repos are 'clean'. + tree = self.execute_checkout_with_status(cloned_repo_dir, + self.checkout_args) + self._check_sync_clean(tree[NESTED_SUBDIR[order[0]]], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) + self._check_sync_clean(tree[NESTED_SUBDIR[order[1]]], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) + self._check_sync_clean(tree[NESTED_SUBDIR[order[2]]], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) + def test_container_simple_optional(self): - """Verify that container with an optional simple subrepos - generates the correct initial status. + """Verify that container with an optional simple subrepos generates + the correct initial status. """ - # create repo - under_test_dir = self.setup_test_repo(CONTAINER_REPO_NAME) - self._generator.container_simple_optional(under_test_dir) - - # check status of empty repo - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.status_args) - self._check_container_simple_optional_pre_checkout(overall, tree) - - # checkout required - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.checkout_args) - self._check_container_simple_optional_checkout(overall, tree) - - # status - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.status_args) - self._check_container_simple_optional_post_checkout(overall, tree) + # create repo and externals config. + cloned_repo_dir = self.clone_test_repo(CONTAINER_REPO) + self._generator.create_config() + self._generator.create_section(SIMPLE_REPO, 'simp_req', + tag='tag1') - # checkout optional - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.optional_args) - self._check_container_simple_optional_post_checkout(overall, tree) + self._generator.create_section(SIMPLE_REPO, 'simp_opt', + tag='tag1', required=False) - # status - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.status_args) - self._check_container_simple_optional_post_optional(overall, tree) + self._generator.write_config(cloned_repo_dir) + + # all externals start out 'empty' aka not checked out. + tree = self.execute_checkout_in_dir(cloned_repo_dir, + self.status_args) + req_status = tree[self._external_path('simp_req')] + self._check_sync_clean(req_status, + ExternalStatus.EMPTY, + ExternalStatus.DEFAULT) + self.assertEqual(req_status.source_type, ExternalStatus.MANAGED) + + opt_status = tree[self._external_path('simp_opt')] + self._check_sync_clean(opt_status, + ExternalStatus.EMPTY, + ExternalStatus.DEFAULT) + self.assertEqual(opt_status.source_type, ExternalStatus.OPTIONAL) + + # after checkout, required external is clean, optional is still empty. 
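# For reference, a sketch of the kind of on-the-fly externals config the
# create_section()/write_config() calls above are expected to produce. The key
# names follow the usual Externals.cfg conventions (local_path, protocol,
# repo_url, tag/branch/hash, required); the exact strings written by the test
# generator are an assumption, since its implementation is not shown in this diff.
import configparser

cfg = configparser.ConfigParser()
cfg['simp_req'] = {'local_path': 'externals/simp_req',
                   'protocol': 'git',
                   'repo_url': '/path/to/bare/simple-ext.git',
                   'tag': 'tag1',
                   'required': 'True'}
cfg['simp_opt'] = dict(cfg['simp_req'], local_path='externals/simp_opt',
                       required='False')
cfg['externals_description'] = {'schema_version': '1.0.0'}
with open('externals.cfg', 'w') as cfg_file:
    cfg.write(cfg_file)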
+ tree = self.execute_checkout_with_status(cloned_repo_dir, + self.checkout_args) + req_status = tree[self._external_path('simp_req')] + self._check_sync_clean(req_status, + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) + self.assertEqual(req_status.source_type, ExternalStatus.MANAGED) + + opt_status = tree[self._external_path('simp_opt')] + self._check_sync_clean(opt_status, + ExternalStatus.EMPTY, + ExternalStatus.DEFAULT) + self.assertEqual(opt_status.source_type, ExternalStatus.OPTIONAL) + + # after checking out optionals, the optional external is also clean. + tree = self.execute_checkout_with_status(cloned_repo_dir, + self.optional_args) + req_status = tree[self._external_path('simp_req')] + self._check_sync_clean(req_status, + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) + self.assertEqual(req_status.source_type, ExternalStatus.MANAGED) + + opt_status = tree[self._external_path('simp_opt')] + self._check_sync_clean(opt_status, + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) + self.assertEqual(opt_status.source_type, ExternalStatus.OPTIONAL) def test_container_simple_verbose(self): - """Verify that container with simple subrepos runs with verbose status - output and generates the correct initial status. - + """Verify that verbose status matches non-verbose. """ - # create repo - under_test_dir = self.setup_test_repo(CONTAINER_REPO_NAME) - self._generator.container_simple_required(under_test_dir) - - # checkout - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.checkout_args) - self._check_container_simple_required_checkout(overall, tree) - - # check verbose status - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.verbose_args) - self._check_container_simple_required_post_checkout(overall, tree) + cloned_repo_dir = self.clone_test_repo(CONTAINER_REPO) + self._generator.create_config() + self._generator.create_section(SIMPLE_REPO, TAG_SECTION, + tag='tag1') + self._generator.write_config(cloned_repo_dir) + + # after checkout, all externals should be 'clean'. + tree = self.execute_checkout_with_status(cloned_repo_dir, + self.checkout_args) + self._check_sync_clean(tree[self._external_path(TAG_SECTION)], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) + + # 'Verbose' status should tell the same story. + tree = self.execute_checkout_in_dir(cloned_repo_dir, + self.verbose_args) + self._check_sync_clean(tree[self._external_path(TAG_SECTION)], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) def test_container_simple_dirty(self): - """Verify that a container with simple subrepos - and a dirty status exits gracefully. - + """Verify that a container with a new tracked file is marked dirty. """ - under_test_dir = self.setup_test_repo(CONTAINER_REPO_NAME) - self._generator.container_simple_required(under_test_dir) - - # checkout - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.checkout_args) - self._check_container_simple_required_checkout(overall, tree) - - # add a file to the repo - tracked = True - self._add_file_to_repo(under_test_dir, 'externals/simp_tag/tmp.txt', - tracked) - - # checkout: pre-checkout status should be dirty, did not - # modify working copy. 
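# The canned argument lists used throughout these tests (checkout_args,
# status_args, optional_args, verbose_args) map onto checkout_externals
# command-line flags. A rough sketch of how they are presumably assembled; the
# exact construction lives in BaseTestSysCheckout, not in this hunk, and
# CFG_NAME is that module's default config filename constant:
checkout_args = ['--externals', CFG_NAME]        # plain checkout of required externals
status_args = checkout_args + ['--status']       # report status only, change nothing
optional_args = checkout_args + ['--optional']   # also check out optional externals
verbose_args = status_args + ['--verbose']       # status with per-external detail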
- overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.checkout_args) - self._check_container_simple_optional_st_dirty(overall, tree) - - # verify status is still dirty - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.status_args) - self._check_container_simple_optional_st_dirty(overall, tree) + cloned_repo_dir = self.clone_test_repo(CONTAINER_REPO) + self._generator.create_config() + self._generator.create_section(SIMPLE_REPO, TAG_SECTION, + tag='tag1') + self._generator.write_config(cloned_repo_dir) + + # checkout, should start out clean. + tree = self.execute_checkout_with_status(cloned_repo_dir, self.checkout_args) + self._check_sync_clean(tree[self._external_path(TAG_SECTION)], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) + + # add a tracked file to the simp_tag external, should be dirty. + RepoUtils.add_file_to_repo(cloned_repo_dir, + 'externals/{0}/tmp.txt'.format(TAG_SECTION), + tracked=True) + tree = self.execute_checkout_in_dir(cloned_repo_dir, self.status_args) + self._check_sync_clean(tree[self._external_path(TAG_SECTION)], + ExternalStatus.STATUS_OK, + ExternalStatus.DIRTY) + + # Re-checkout; simp_tag should still be dirty. + tree = self.execute_checkout_with_status(cloned_repo_dir, + self.checkout_args) + self._check_sync_clean(tree[self._external_path(TAG_SECTION)], + ExternalStatus.STATUS_OK, + ExternalStatus.DIRTY) def test_container_simple_untracked(self): """Verify that a container with simple subrepos and a untracked files is not considered 'dirty' and will attempt an update. """ - under_test_dir = self.setup_test_repo(CONTAINER_REPO_NAME) - self._generator.container_simple_required(under_test_dir) - - # checkout - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.checkout_args) - self._check_container_simple_required_checkout(overall, tree) - - # add a file to the repo - tracked = False - self._add_file_to_repo(under_test_dir, 'externals/simp_tag/tmp.txt', - tracked) - - # checkout: pre-checkout status should be clean, ignoring the - # untracked file. - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.checkout_args) - self._check_container_simple_required_post_checkout(overall, tree) - - # verify status is still clean - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.status_args) - self._check_container_simple_required_post_checkout(overall, tree) + cloned_repo_dir = self.clone_test_repo(CONTAINER_REPO) + self._generator.create_config() + self._generator.create_section(SIMPLE_REPO, TAG_SECTION, + tag='tag1') + self._generator.write_config(cloned_repo_dir) + + # checkout, should start out clean. + tree = self.execute_checkout_with_status(cloned_repo_dir, + self.checkout_args) + self._check_sync_clean(tree[self._external_path(TAG_SECTION)], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) + + # add an untracked file to the simp_tag external, should stay clean. + RepoUtils.add_file_to_repo(cloned_repo_dir, + 'externals/{0}/tmp.txt'.format(TAG_SECTION), + tracked=False) + tree = self.execute_checkout_in_dir(cloned_repo_dir, self.status_args) + self._check_sync_clean(tree[self._external_path(TAG_SECTION)], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) + + # After checkout, the external should still be 'clean'. 
+ tree = self.execute_checkout_with_status(cloned_repo_dir, + self.checkout_args) + self._check_sync_clean(tree[self._external_path(TAG_SECTION)], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) def test_container_simple_detached_sync(self): """Verify that a container with simple subrepos generates the correct out of sync status when making commits from a detached head - state. + state. + For more info about 'detached head' state: https://www.cloudbees.com/blog/git-detached-head """ - # create repo - under_test_dir = self.setup_test_repo(CONTAINER_REPO_NAME) - self._generator.container_simple_required(under_test_dir) - - # status of empty repo - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.status_args) - self._check_container_simple_required_pre_checkout(overall, tree) - - # checkout - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.checkout_args) - self._check_container_simple_required_checkout(overall, tree) - - # make a commit on the detached head of the tag and hash externals - self._generator.create_commit(under_test_dir, 'simp_tag') - self._generator.create_commit(under_test_dir, 'simp_hash') - self._generator.create_commit(under_test_dir, 'simp_branch') - - # status of repo, branch, tag and hash should all be out of sync! - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.status_args) - self._check_container_simple_required_out_of_sync(overall, tree) + cloned_repo_dir = self.clone_test_repo(CONTAINER_REPO) + self._generator.create_config() + self._generator.create_section(SIMPLE_REPO, TAG_SECTION, + tag='tag1') + + self._generator.create_section(SIMPLE_REPO, BRANCH_SECTION, + branch=REMOTE_BRANCH_FEATURE2) + + self._generator.create_section(SIMPLE_REPO, 'simp_hash', + ref_hash='60b1cc1a38d63') + + self._generator.write_config(cloned_repo_dir) + + # externals start out 'empty' aka not checked out. + tree = self.execute_checkout_in_dir(cloned_repo_dir, self.status_args) + self._check_sync_clean(tree[self._external_path(TAG_SECTION)], + ExternalStatus.EMPTY, + ExternalStatus.DEFAULT) + self._check_sync_clean(tree[self._external_path(BRANCH_SECTION)], + ExternalStatus.EMPTY, + ExternalStatus.DEFAULT) + self._check_sync_clean(tree[self._external_path(HASH_SECTION)], + ExternalStatus.EMPTY, + ExternalStatus.DEFAULT) # checkout - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.checkout_args) - # same pre-checkout out of sync status - self._check_container_simple_required_out_of_sync(overall, tree) - - # now status should be in-sync - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.status_args) - self._check_container_simple_required_post_checkout(overall, tree) + self.execute_checkout_in_dir(cloned_repo_dir, self.checkout_args) + + # Commit on top of the tag and hash (creating the detached head state in those two + # externals' repos) + # The branch commit does not create the detached head state, but here for completeness. 
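# Sketch of the kind of commit RepoUtils.create_commit makes below (assumption:
# it commits a new file inside the external's working copy). On the tag and hash
# externals that commit lands on a detached HEAD, moving the working copy ahead
# of the described ref, which is what flips the sync status to MODEL_MODIFIED.
import os
import subprocess

def create_commit(repo_dir, external_name):
    ext_dir = os.path.join(repo_dir, 'externals', external_name)
    with open(os.path.join(ext_dir, 'detached.txt'), 'w') as new_file:
        new_file.write('local commit on top of the described ref\n')
    subprocess.run(['git', '-C', ext_dir, 'add', 'detached.txt'], check=True)
    subprocess.run(['git', '-C', ext_dir, 'commit', '-m', 'local commit'],
                   check=True)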
+ RepoUtils.create_commit(cloned_repo_dir, TAG_SECTION) + RepoUtils.create_commit(cloned_repo_dir, HASH_SECTION) + RepoUtils.create_commit(cloned_repo_dir, BRANCH_SECTION) + + # sync status of all three should be 'modified' (uncommitted changes) + # clean status is 'ok' (matches externals version) + tree = self.execute_checkout_in_dir(cloned_repo_dir, self.status_args) + self._check_sync_clean(tree[self._external_path(TAG_SECTION)], + ExternalStatus.MODEL_MODIFIED, + ExternalStatus.STATUS_OK) + self._check_sync_clean(tree[self._external_path(BRANCH_SECTION)], + ExternalStatus.MODEL_MODIFIED, + ExternalStatus.STATUS_OK) + self._check_sync_clean(tree[self._external_path(HASH_SECTION)], + ExternalStatus.MODEL_MODIFIED, + ExternalStatus.STATUS_OK) + + # after checkout, all externals should be totally clean (no uncommitted changes, + # and matches externals version). + tree = self.execute_checkout_with_status(cloned_repo_dir, self.checkout_args) + self._check_sync_clean(tree[self._external_path(TAG_SECTION)], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) + self._check_sync_clean(tree[self._external_path(BRANCH_SECTION)], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) + self._check_sync_clean(tree[self._external_path(HASH_SECTION)], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) def test_container_remote_branch(self): """Verify that a container with remote branch change works """ - # create repo - under_test_dir = self.setup_test_repo(CONTAINER_REPO_NAME) - self._generator.container_simple_required(under_test_dir) - - # checkout - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.checkout_args) - self._check_container_simple_required_checkout(overall, tree) - - # update the config file to point to a different remote with - # the same branch - self._generator.update_branch(under_test_dir, 'simp_branch', - REMOTE_BRANCH_FEATURE2, SIMPLE_FORK_NAME) - - # status of simp_branch should be out of sync - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.status_args) - self._check_container_simple_required_sb_modified(overall, tree) - - # checkout new externals - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.checkout_args) - self._check_container_simple_required_sb_modified(overall, tree) - - # status should be synced - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.status_args) - self._check_container_simple_required_post_checkout(overall, tree) + cloned_repo_dir = self.clone_test_repo(CONTAINER_REPO) + self._generator.create_config() + self._generator.create_section(SIMPLE_REPO, BRANCH_SECTION, + branch=REMOTE_BRANCH_FEATURE2) + self._generator.write_config(cloned_repo_dir) + + # initial checkout + self.execute_checkout_in_dir(cloned_repo_dir, self.checkout_args) + + # update the branch external to point to a different remote with the same branch, + # then simp_branch should be out of sync + self._generator.write_with_git_branch(cloned_repo_dir, + name=BRANCH_SECTION, + branch=REMOTE_BRANCH_FEATURE2, + new_remote_repo_path=SIMPLE_FORK_REPO) + tree = self.execute_checkout_in_dir(cloned_repo_dir, self.status_args) + self._check_sync_clean(tree[self._external_path(BRANCH_SECTION)], + ExternalStatus.MODEL_MODIFIED, + ExternalStatus.STATUS_OK) + + # checkout new externals, now simp_branch should be clean. 
+ tree = self.execute_checkout_with_status(cloned_repo_dir, self.checkout_args) + self._check_sync_clean(tree[self._external_path(BRANCH_SECTION)], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) def test_container_remote_tag_same_branch(self): """Verify that a container with remote tag change works. The new tag @@ -1151,334 +1065,361 @@ def test_container_remote_tag_same_branch(self): the branch. """ - # create repo - under_test_dir = self.setup_test_repo(CONTAINER_REPO_NAME) - self._generator.container_simple_required(under_test_dir) + cloned_repo_dir = self.clone_test_repo(CONTAINER_REPO) + self._generator.create_config() + self._generator.create_section(SIMPLE_REPO, BRANCH_SECTION, + branch=REMOTE_BRANCH_FEATURE2) + self._generator.write_config(cloned_repo_dir) - # checkout - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.checkout_args) - self._check_container_simple_required_checkout(overall, tree) + # initial checkout + self.execute_checkout_in_dir(cloned_repo_dir, self.checkout_args) # update the config file to point to a different remote with - # the tag instead of branch. Tag MUST NOT be in the original - # repo! - self._generator.update_tag(under_test_dir, 'simp_branch', - 'forked-feature-v1', SIMPLE_FORK_NAME) - - # status of simp_branch should be out of sync - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.status_args) - self._check_container_simple_required_sb_modified(overall, tree) - - # checkout new externals - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.checkout_args) - self._check_container_simple_required_sb_modified(overall, tree) - - # status should be synced - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.status_args) - self._check_container_simple_required_post_checkout(overall, tree) + # the new tag replacing the old branch. Tag MUST NOT be in the original + # repo! status of simp_branch should then be out of sync + self._generator.write_with_tag_and_remote_repo(cloned_repo_dir, BRANCH_SECTION, + tag='forked-feature-v1', + new_remote_repo_path=SIMPLE_FORK_REPO) + tree = self.execute_checkout_in_dir(cloned_repo_dir, + self.status_args) + self._check_sync_clean(tree[self._external_path(BRANCH_SECTION)], + ExternalStatus.MODEL_MODIFIED, + ExternalStatus.STATUS_OK) + + # checkout new externals, then should be synced. + tree = self.execute_checkout_with_status(cloned_repo_dir, + self.checkout_args) + self._check_sync_clean(tree[self._external_path(BRANCH_SECTION)], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) def test_container_remote_tag_fetch_all(self): """Verify that a container with remote tag change works. The new tag should not be in the original repo, only the new remote - fork. It should also not be on a branch that will be fetch, + fork. It should also not be on a branch that will be fetched, and therefore not fetched by default with 'git fetch'. 
It will - only be retreived by 'git fetch --tags' - + only be retrieved by 'git fetch --tags' """ - # create repo - under_test_dir = self.setup_test_repo(CONTAINER_REPO_NAME) - self._generator.container_simple_required(under_test_dir) + cloned_repo_dir = self.clone_test_repo(CONTAINER_REPO) + self._generator.create_config() + self._generator.create_section(SIMPLE_REPO, BRANCH_SECTION, + branch=REMOTE_BRANCH_FEATURE2) + self._generator.write_config(cloned_repo_dir) - # checkout - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.checkout_args) - self._check_container_simple_required_checkout(overall, tree) + # initial checkout + self.execute_checkout_in_dir(cloned_repo_dir, self.checkout_args) # update the config file to point to a different remote with - # the tag instead of branch. Tag MUST NOT be in the original - # repo! - self._generator.update_tag(under_test_dir, 'simp_branch', - 'abandoned-feature', SIMPLE_FORK_NAME) - - # status of simp_branch should be out of sync - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.status_args) - self._check_container_simple_required_sb_modified(overall, tree) - - # checkout new externals - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.checkout_args) - self._check_container_simple_required_sb_modified(overall, tree) - - # status should be synced - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.status_args) - self._check_container_simple_required_post_checkout(overall, tree) + # the new tag instead of the old branch. Tag MUST NOT be in the original + # repo! status of simp_branch should then be out of sync. + self._generator.write_with_tag_and_remote_repo(cloned_repo_dir, BRANCH_SECTION, + tag='abandoned-feature', + new_remote_repo_path=SIMPLE_FORK_REPO) + tree = self.execute_checkout_in_dir(cloned_repo_dir, self.status_args) + self._check_sync_clean(tree[self._external_path(BRANCH_SECTION)], + ExternalStatus.MODEL_MODIFIED, + ExternalStatus.STATUS_OK) + + # checkout new externals, should be clean again. + tree = self.execute_checkout_with_status(cloned_repo_dir, + self.checkout_args) + self._check_sync_clean(tree[self._external_path(BRANCH_SECTION)], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) def test_container_preserve_dot(self): """Verify that after inital checkout, modifying an external git repo url to '.' and the current branch will leave it unchanged. """ - # create repo - under_test_dir = self.setup_test_repo(CONTAINER_REPO_NAME) - self._generator.container_simple_required(under_test_dir) + cloned_repo_dir = self.clone_test_repo(CONTAINER_REPO) + self._generator.create_config() + self._generator.create_section(SIMPLE_REPO, BRANCH_SECTION, + branch=REMOTE_BRANCH_FEATURE2) + self._generator.write_config(cloned_repo_dir) - # checkout - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.checkout_args) - self._check_container_simple_required_checkout(overall, tree) + # initial checkout + self.execute_checkout_in_dir(cloned_repo_dir, self.checkout_args) # update the config file to point to a different remote with - # the same branch - self._generator.update_branch(under_test_dir, 'simp_branch', - REMOTE_BRANCH_FEATURE2, SIMPLE_FORK_NAME) - # checkout - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.checkout_args) - - # verify status is clean and unmodified - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.status_args) - self._check_container_simple_required_post_checkout(overall, tree) + # the same branch. 
+ self._generator.write_with_git_branch(cloned_repo_dir, name=BRANCH_SECTION, + branch=REMOTE_BRANCH_FEATURE2, + new_remote_repo_path=SIMPLE_FORK_REPO) + # after checkout, should be clean again. + tree = self.execute_checkout_with_status(cloned_repo_dir, self.checkout_args) + self._check_sync_clean(tree[self._external_path(BRANCH_SECTION)], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) # update branch to point to a new branch that only exists in # the local fork - self._generator.create_branch(under_test_dir, 'simp_branch', - 'private-feature', with_commit=True) - self._generator.update_branch(under_test_dir, 'simp_branch', - 'private-feature', - SIMPLE_LOCAL_ONLY_NAME) - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.checkout_args) - - # verify status is clean and unmodified - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.status_args) - self._check_container_simple_required_post_checkout(overall, tree) - - def test_container_full(self): - """Verify that 'full' container with simple and mixed subrepos - generates the correct initial status. + RepoUtils.create_branch(cloned_repo_dir, external_name=BRANCH_SECTION, + branch='private-feature', with_commit=True) + self._generator.write_with_git_branch(cloned_repo_dir, name=BRANCH_SECTION, + branch='private-feature', + new_remote_repo_path=SIMPLE_LOCAL_ONLY_NAME) + # after checkout, should be clean again. + tree = self.execute_checkout_with_status(cloned_repo_dir, self.checkout_args) + self._check_sync_clean(tree[self._external_path(BRANCH_SECTION)], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) + + def test_container_mixed_subrepo(self): + """Verify container with mixed subrepo. The mixed subrepo has a sub-externals file with different sub-externals on different branches. """ - # create the test repository - under_test_dir = self.setup_test_repo(CONTAINER_REPO_NAME) - - # create the top level externals file - self._generator.container_full(under_test_dir) - - # inital checkout - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.checkout_args) - self._check_container_full_pre_checkout(overall, tree) + cloned_repo_dir = self.clone_test_repo(CONTAINER_REPO) - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.status_args) - self._check_container_full_post_checkout(overall, tree) - - # Check existance of some files - subrepo_path = os.path.join('externals', 'simp_tag') - self._check_file_exists(under_test_dir, - os.path.join(subrepo_path, 'readme.txt')) - self._check_file_absent(under_test_dir, os.path.join(subrepo_path, - 'simple_subdir', - 'subdir_file.txt')) - - # update the mixed-use repo to point to different branch - self._generator.update_branch(under_test_dir, 'mixed_req', - 'new-feature', MIXED_REPO_NAME) - - # check status out of sync for mixed_req, but sub-externals + self._generator.create_config() + self._generator.create_section(MIXED_REPO, 'mixed_req', + branch='master', sub_externals=CFG_SUB_NAME) + self._generator.write_config(cloned_repo_dir) + + # The subrepo has a repo_url that uses this environment variable. + # It'll be cleared in tearDown(). + os.environ[MIXED_CONT_EXT_ROOT_ENV_VAR] = self._bare_root + debug_env = MIXED_CONT_EXT_ROOT_ENV_VAR + '=' + self._bare_root + + # inital checkout: all requireds are clean, and optional is empty. 
+ tree = self.execute_checkout_with_status(cloned_repo_dir,
+ self.checkout_args,
+ debug_env=debug_env)
+ mixed_req_path = self._external_path('mixed_req')
+ self._check_sync_clean(tree[mixed_req_path],
+ ExternalStatus.STATUS_OK,
+ ExternalStatus.STATUS_OK)
+ sub_ext_base_path = "{0}/{1}/{2}".format(EXTERNALS_PATH, 'mixed_req', SUB_EXTERNALS_PATH)
+ # The already-checked-in subexternals file has a 'simp_branch' section
+ self._check_sync_clean(tree[self._external_path('simp_branch', base_path=sub_ext_base_path)],
+ ExternalStatus.STATUS_OK,
+ ExternalStatus.STATUS_OK)
+
+ # update the mixed-use external to point to a different branch
+ # status should become out of sync for mixed_req, but sub-externals
 # are still in sync
- overall, tree = self.execute_cmd_in_dir(under_test_dir,
- self.status_args)
- self._check_container_full_pre_checkout_ext_change(overall, tree)
-
- # run the checkout. Now the mixed use external and it's
- # sub-exterals should be changed. Returned status is
- # pre-checkout!
- overall, tree = self.execute_cmd_in_dir(under_test_dir,
- self.checkout_args)
- self._check_container_full_pre_checkout_ext_change(overall, tree)
-
- # check status out of sync for mixed_req, and sub-externals
- # are in sync.
- overall, tree = self.execute_cmd_in_dir(under_test_dir,
- self.status_args)
- self._check_container_full_post_checkout(overall, tree)
-
+ self._generator.write_with_git_branch(cloned_repo_dir, name='mixed_req',
+ branch='new-feature',
+ new_remote_repo_path=MIXED_REPO)
+ tree = self.execute_checkout_in_dir(cloned_repo_dir, self.status_args,
+ debug_env=debug_env)
+ self._check_sync_clean(tree[mixed_req_path],
+ ExternalStatus.MODEL_MODIFIED,
+ ExternalStatus.STATUS_OK)
+ self._check_sync_clean(tree[self._external_path('simp_branch', base_path=sub_ext_base_path)],
+ ExternalStatus.STATUS_OK,
+ ExternalStatus.STATUS_OK)
+
+ # run the checkout. Now the mixed use external and its sub-externals should be clean.
+ tree = self.execute_checkout_with_status(cloned_repo_dir, self.checkout_args,
+ debug_env=debug_env)
+ self._check_sync_clean(tree[mixed_req_path],
+ ExternalStatus.STATUS_OK,
+ ExternalStatus.STATUS_OK)
+ self._check_sync_clean(tree[self._external_path('simp_branch', base_path=sub_ext_base_path)],
+ ExternalStatus.STATUS_OK,
+ ExternalStatus.STATUS_OK)
+
 def test_container_component(self):
 """Verify that optional component checkout works
 """
- # create the test repository
- under_test_dir = self.setup_test_repo(CONTAINER_REPO_NAME)
+ cloned_repo_dir = self.clone_test_repo(CONTAINER_REPO)
 # create the top level externals file
- self._generator.container_full(under_test_dir)
-
- # inital checkout, first try a nonexistant component argument noref
+ self._generator.create_config()
+ # Optional external, by tag.
+ self._generator.create_section(SIMPLE_REPO, 'simp_opt',
+ tag='tag1', required=False)
+
+ # Required external, by branch.
+ self._generator.create_section(SIMPLE_REPO, BRANCH_SECTION,
+ branch=REMOTE_BRANCH_FEATURE2)
+
+ # Required external, by hash.
+ self._generator.create_section(SIMPLE_REPO, HASH_SECTION,
+ ref_hash='60b1cc1a38d63')
+ self._generator.write_config(cloned_repo_dir)
+
+ # initial checkout, first try a nonexistent component argument noref
 checkout_args = ['simp_opt', 'noref']
 checkout_args.extend(self.checkout_args)
 with self.assertRaises(RuntimeError):
- self.execute_cmd_in_dir(under_test_dir, checkout_args)
+ self.execute_checkout_in_dir(cloned_repo_dir, checkout_args)
+ # Now explicitly check out one optional component.
+ # Explicitly listed component (opt) should be present, the other two not. checkout_args = ['simp_opt'] checkout_args.extend(self.checkout_args) - - overall, tree = self.execute_cmd_in_dir(under_test_dir, - checkout_args) - - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.status_args) - self._check_container_component_post_checkout(overall, tree) - checkout_args.append('simp_branch') - overall, tree = self.execute_cmd_in_dir(under_test_dir, - checkout_args) - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.status_args) - self._check_container_component_post_checkout2(overall, tree) - - def test_mixed_simple(self): - """Verify that a mixed use repo can serve as a 'full' container, - pulling in a set of externals and a seperate set of sub-externals. - + tree = self.execute_checkout_with_status(cloned_repo_dir, + checkout_args) + self._check_sync_clean(tree[self._external_path('simp_opt')], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) + self._check_sync_clean(tree[self._external_path(BRANCH_SECTION)], + ExternalStatus.EMPTY, + ExternalStatus.DEFAULT) + self._check_sync_clean(tree[self._external_path(HASH_SECTION)], + ExternalStatus.EMPTY, + ExternalStatus.DEFAULT) + + # Check out a second component, this one required. + # Explicitly listed component (branch) should be present, the still-unlisted one (tag) not. + checkout_args.append(BRANCH_SECTION) + tree = self.execute_checkout_with_status(cloned_repo_dir, + checkout_args) + self._check_sync_clean(tree[self._external_path('simp_opt')], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) + self._check_sync_clean(tree[self._external_path(BRANCH_SECTION)], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) + self._check_sync_clean(tree[self._external_path(HASH_SECTION)], + ExternalStatus.EMPTY, + ExternalStatus.DEFAULT) + + + def test_container_exclude_component(self): + """Verify that exclude component checkout works """ - #import pdb; pdb.set_trace() - # create repository - under_test_dir = self.setup_test_repo(MIXED_REPO_NAME) - # create top level externals file - self._generator.mixed_simple_base(under_test_dir) - # NOTE: sub-externals file is already in the repo so we can - # switch branches during testing. Since this is a mixed-repo - # serving as the top level container repo, we can't switch - # during this test. + cloned_repo_dir = self.clone_test_repo(CONTAINER_REPO) + self._generator.create_config() + self._generator.create_section(SIMPLE_REPO, TAG_SECTION, + tag='tag1') + + self._generator.create_section(SIMPLE_REPO, BRANCH_SECTION, + branch=REMOTE_BRANCH_FEATURE2) + + self._generator.create_section(SIMPLE_REPO, 'simp_hash', + ref_hash='60b1cc1a38d63') + + self._generator.write_config(cloned_repo_dir) + + # inital checkout should result in all externals being clean except excluded TAG_SECTION. + checkout_args = ['--exclude', TAG_SECTION] + checkout_args.extend(self.checkout_args) + tree = self.execute_checkout_with_status(cloned_repo_dir, checkout_args) + self._check_sync_clean(tree[self._external_path(TAG_SECTION)], + ExternalStatus.EMPTY, + ExternalStatus.DEFAULT) + self._check_sync_clean(tree[self._external_path(BRANCH_SECTION)], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) + self._check_sync_clean(tree[self._external_path(HASH_SECTION)], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) + + def test_subexternal(self): + """Verify that an externals file can be brought in as a reference. 
- # checkout - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.checkout_args) - self._check_mixed_cont_simple_required_checkout(overall, tree) + """ + cloned_repo_dir = self.clone_test_repo(MIXED_REPO) - # verify status is clean and unmodified - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.status_args) - self._check_mixed_cont_simple_required_post_checkout(overall, tree) + self._generator.create_config() + self._generator.create_section_reference_to_subexternal('mixed_base') + self._generator.write_config(cloned_repo_dir) + + # The subrepo has a repo_url that uses this environment variable. + # It'll be cleared in tearDown(). + os.environ[MIXED_CONT_EXT_ROOT_ENV_VAR] = self._bare_root + debug_env = MIXED_CONT_EXT_ROOT_ENV_VAR + '=' + self._bare_root + + # After checkout, confirm required's are clean and the referenced + # subexternal's contents are also clean. + tree = self.execute_checkout_with_status(cloned_repo_dir, + self.checkout_args, + debug_env=debug_env) + + self._check_sync_clean( + tree[self._external_path(BRANCH_SECTION, base_path=SUB_EXTERNALS_PATH)], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) def test_container_sparse(self): """Verify that 'full' container with simple subrepo can run a sparse checkout and generate the correct initial status. """ - # create the test repository - under_test_dir = self.setup_test_repo(CONTAINER_REPO_NAME) + cloned_repo_dir = self.clone_test_repo(CONTAINER_REPO) - # create the top level externals file - self._generator.container_sparse(under_test_dir) - - # inital checkout - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.checkout_args) - self._check_container_sparse_pre_checkout(overall, tree) - - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.status_args) - self._check_container_sparse_post_checkout(overall, tree) + # Create a file to list filenames to checkout. + sparse_filename = 'sparse_checkout' + with open(os.path.join(cloned_repo_dir, sparse_filename), 'w') as sfile: + sfile.write(README_NAME) - # Check existance of some files - subrepo_path = os.path.join('externals', 'simp_tag') - self._check_file_exists(under_test_dir, - os.path.join(subrepo_path, 'readme.txt')) - self._check_file_exists(under_test_dir, os.path.join(subrepo_path, + self._generator.create_config() + self._generator.create_section(SIMPLE_REPO, TAG_SECTION, + tag='tag2') + + # Same tag as above, but with a sparse file too. + sparse_relpath = '../../' + sparse_filename + self._generator.create_section(SIMPLE_REPO, 'simp_sparse', + tag='tag2', sparse=sparse_relpath) + + self._generator.write_config(cloned_repo_dir) + + # inital checkout, confirm required's are clean. + tree = self.execute_checkout_with_status(cloned_repo_dir, + self.checkout_args) + self._check_sync_clean(tree[self._external_path(TAG_SECTION)], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) + self._check_sync_clean(tree[self._external_path('simp_sparse')], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) + + # Check existence of some files - full set in TAG_SECTION, and sparse set + # in 'simp_sparse'. 
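# Sketch of the git plumbing a sparse external like 'simp_sparse' relies on
# (assumption: the 'sparse' key points at a pattern file that is installed as
# .git/info/sparse-checkout before the work tree is populated, so only the
# listed paths, here just the README, appear on disk).
import os
import subprocess

def apply_sparse_patterns(ext_dir, pattern_file):
    subprocess.run(['git', '-C', ext_dir, 'config', 'core.sparsecheckout', 'true'],
                   check=True)
    with open(pattern_file) as src:
        patterns = src.read()
    with open(os.path.join(ext_dir, '.git', 'info', 'sparse-checkout'), 'w') as dst:
        dst.write(patterns)
    subprocess.run(['git', '-C', ext_dir, 'read-tree', '-mu', 'HEAD'], check=True)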
+ subrepo_path = os.path.join('externals', TAG_SECTION) + self._check_file_exists(cloned_repo_dir, + os.path.join(subrepo_path, README_NAME)) + self._check_file_exists(cloned_repo_dir, os.path.join(subrepo_path, 'simple_subdir', 'subdir_file.txt')) subrepo_path = os.path.join('externals', 'simp_sparse') - self._check_file_exists(under_test_dir, - os.path.join(subrepo_path, 'readme.txt')) - self._check_file_absent(under_test_dir, os.path.join(subrepo_path, + self._check_file_exists(cloned_repo_dir, + os.path.join(subrepo_path, README_NAME)) + self._check_file_absent(cloned_repo_dir, os.path.join(subrepo_path, 'simple_subdir', 'subdir_file.txt')) - class TestSysCheckoutSVN(BaseTestSysCheckout): """Run systems level tests of checkout_externals accessing svn repositories - SVN tests - these tests use the svn repository interface. Since - they require an active network connection, they are significantly - slower than the git tests. But svn testing is critical. So try to - design the tests to only test svn repository functionality - (checkout, switch) and leave generic testing of functionality like - 'optional' to the fast git tests. - - Example timing as of 2017-11: - - * All other git and unit tests combined take between 4-5 seconds - - * Just checking if svn is available for a single test takes 2 seconds. - - * The single svn test typically takes between 10 and 25 seconds - (depending on the network)! - - NOTE(bja, 2017-11) To enable CI testing we can't use a real remote - repository that restricts access and it seems inappropriate to hit - a random open source repo. For now we are just hitting one of our - own github repos using the github svn server interface. This - should be "good enough" for basic checkout and swich - functionality. But if additional svn functionality is required, a - better solution will be necessary. I think eventually we want to - create a small local svn repository on the fly (doesn't require an - svn server or network connection!) and use it for testing. - + SVN tests - these tests use the svn repository interface. 
""" - def _check_svn_branch_ok(self, tree, directory=EXTERNALS_NAME): - name = './{0}/svn_branch'.format(directory) - self._check_generic_ok_clean_required(tree, name) - - def _check_svn_branch_dirty(self, tree, directory=EXTERNALS_NAME): - name = './{0}/svn_branch'.format(directory) - self._check_generic_ok_dirty_required(tree, name) - - def _check_svn_tag_ok(self, tree, directory=EXTERNALS_NAME): - name = './{0}/svn_tag'.format(directory) - self._check_generic_ok_clean_required(tree, name) - - def _check_svn_tag_modified(self, tree, directory=EXTERNALS_NAME): - name = './{0}/svn_tag'.format(directory) - self._check_generic_modified_ok_required(tree, name) - - def _check_container_simple_svn_post_checkout(self, overall, tree): - self.assertEqual(overall, 0) - self._check_simple_tag_ok(tree) - self._check_svn_branch_ok(tree) - self._check_svn_tag_ok(tree) - - def _check_container_simple_svn_sb_dirty_st_mod(self, overall, tree): - self.assertEqual(overall, 0) - self._check_simple_tag_ok(tree) - self._check_svn_tag_modified(tree) - self._check_svn_branch_dirty(tree) - - def _check_container_simple_svn_sb_clean_st_mod(self, overall, tree): - self.assertEqual(overall, 0) - self._check_simple_tag_ok(tree) - self._check_svn_tag_modified(tree) - self._check_svn_branch_ok(tree) + @staticmethod + def _svn_branch_name(): + return './{0}/svn_branch'.format(EXTERNALS_PATH) @staticmethod - def have_svn_access(): + def _svn_tag_name(): + return './{0}/svn_tag'.format(EXTERNALS_PATH) + + def _svn_test_repo_url(self): + return 'file://' + os.path.join(self._bare_root, SVN_TEST_REPO) + + def _check_tag_branch_svn_tag_clean(self, tree): + self._check_sync_clean(tree[self._external_path(TAG_SECTION)], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) + self._check_sync_clean(tree[self._svn_branch_name()], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) + self._check_sync_clean(tree[self._svn_tag_name()], + ExternalStatus.STATUS_OK, + ExternalStatus.STATUS_OK) + + def _have_svn_access(self): """Check if we have svn access so we can enable tests that use svn. """ have_svn = False - cmd = ['svn', 'ls', SVN_TEST_REPO, ] + cmd = ['svn', 'ls', self._svn_test_repo_url(), ] try: execute_subprocess(cmd) have_svn = True @@ -1486,10 +1427,10 @@ def have_svn_access(): pass return have_svn - def skip_if_no_svn_access(self): + def _skip_if_no_svn_access(self): """Function decorator to disable svn tests when svn isn't available """ - have_svn = self.have_svn_access() + have_svn = self._have_svn_access() if not have_svn: raise unittest.SkipTest("No svn access") @@ -1497,60 +1438,55 @@ def test_container_simple_svn(self): """Verify that a container repo can pull in an svn branch and svn tag. """ - self.skip_if_no_svn_access() + self._skip_if_no_svn_access() # create repo - under_test_dir = self.setup_test_repo(CONTAINER_REPO_NAME) - self._generator.container_simple_svn(under_test_dir) + cloned_repo_dir = self.clone_test_repo(CONTAINER_REPO) - # checkout - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.checkout_args) + self._generator.create_config() + # Git repo. + self._generator.create_section(SIMPLE_REPO, TAG_SECTION, tag='tag1') - # verify status is clean and unmodified - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.status_args) - self._check_container_simple_svn_post_checkout(overall, tree) + # Svn repos. 
+ self._generator.create_svn_external('svn_branch', self._svn_test_repo_url(), branch='trunk') + self._generator.create_svn_external('svn_tag', self._svn_test_repo_url(), tag='tags/cesm2.0.beta07') + + self._generator.write_config(cloned_repo_dir) + + # checkout, make sure all sections are clean. + tree = self.execute_checkout_with_status(cloned_repo_dir, + self.checkout_args) + self._check_tag_branch_svn_tag_clean(tree) # update description file to make the tag into a branch and # trigger a switch - self._generator.update_svn_branch(under_test_dir, 'svn_tag', 'trunk') + self._generator.write_with_svn_branch(cloned_repo_dir, 'svn_tag', + 'trunk') - # checkout - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.checkout_args) - - # verify status is clean and unmodified - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.status_args) - self._check_container_simple_svn_post_checkout(overall, tree) + # checkout, again the results should be clean. + tree = self.execute_checkout_with_status(cloned_repo_dir, + self.checkout_args) + self._check_tag_branch_svn_tag_clean(tree) # add an untracked file to the repo tracked = False - self._add_file_to_repo(under_test_dir, - 'externals/svn_branch/tmp.txt', tracked) + RepoUtils.add_file_to_repo(cloned_repo_dir, + 'externals/svn_branch/tmp.txt', tracked) - # run a no-op checkout: pre-checkout status should be clean, - # ignoring the untracked file. - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.checkout_args) - self._check_container_simple_svn_post_checkout(overall, tree) + # run a no-op checkout. + self.execute_checkout_in_dir(cloned_repo_dir, self.checkout_args) # update description file to make the branch into a tag and # trigger a modified sync status - self._generator.update_svn_branch(under_test_dir, 'svn_tag', - 'tags/cesm2.0.beta07') + self._generator.write_with_svn_branch(cloned_repo_dir, 'svn_tag', + 'tags/cesm2.0.beta07') - # checkout: pre-checkout status should be clean and modified, - # will modify working copy. - overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.checkout_args) - self._check_container_simple_svn_sb_clean_st_mod(overall, tree) + self.execute_checkout_in_dir(cloned_repo_dir,self.checkout_args) # verify status is still clean and unmodified, last # checkout modified the working dir state. 
- overall, tree = self.execute_cmd_in_dir(under_test_dir, - self.verbose_args) - self._check_container_simple_svn_post_checkout(overall, tree) + tree = self.execute_checkout_in_dir(cloned_repo_dir, + self.verbose_args) + self._check_tag_branch_svn_tag_clean(tree) class TestSubrepoCheckout(BaseTestSysCheckout): # Need to store information at setUp time for checking @@ -1569,7 +1505,7 @@ def setUp(self): """ # Run the basic setup - super(TestSubrepoCheckout, self).setUp() + super().setUp() # create test repo # We need to do this here (rather than have a static repo) because # git submodules do not allow for variables in .gitmodules files @@ -1577,26 +1513,26 @@ def setUp(self): self._bare_branch_name = 'subrepo_branch' self._config_branch_name = 'subrepo_config_branch' self._container_extern_name = 'externals_container.cfg' - self._my_test_dir = os.path.join(os.environ[MANIC_TEST_TMP_REPO_ROOT], - self._test_id) + self._my_test_dir = os.path.join(module_tmp_root_dir, self._test_id) self._repo_dir = os.path.join(self._my_test_dir, self._test_repo_name) self._checkout_dir = 'repo_with_submodules' - check_dir = self.setup_test_repo(CONTAINER_REPO_NAME, + check_dir = self.clone_test_repo(CONTAINER_REPO, dest_dir_in=self._repo_dir) self.assertTrue(self._repo_dir == check_dir) # Add the submodules cwd = os.getcwd() - fork_repo_dir = os.path.join(self._bare_root, SIMPLE_FORK_NAME) - simple_repo_dir = os.path.join(self._bare_root, SIMPLE_REPO_NAME) - self._simple_ext_fork_name = SIMPLE_FORK_NAME.split('.')[0] - self._simple_ext_name = SIMPLE_REPO_NAME.split('.')[0] + fork_repo_dir = os.path.join(self._bare_root, SIMPLE_FORK_REPO) + simple_repo_dir = os.path.join(self._bare_root, SIMPLE_REPO) + self._simple_ext_fork_name = os.path.splitext(SIMPLE_FORK_REPO)[0] + self._simple_ext_name = os.path.join('sourc', + os.path.splitext(SIMPLE_REPO)[0]) os.chdir(self._repo_dir) # Add a branch with a subrepo cmd = ['git', 'branch', self._bare_branch_name, 'master'] execute_subprocess(cmd) cmd = ['git', 'checkout', self._bare_branch_name] execute_subprocess(cmd) - cmd = ['git', 'submodule', 'add', fork_repo_dir] + cmd = ['git', '-c', 'protocol.file.allow=always','submodule', 'add', fork_repo_dir] execute_subprocess(cmd) cmd = ['git', 'commit', '-am', "'Added simple-ext-fork as a submodule'"] execute_subprocess(cmd) @@ -1610,7 +1546,8 @@ def setUp(self): execute_subprocess(cmd) cmd = ['git', 'checkout', self._config_branch_name] execute_subprocess(cmd) - cmd = ['git', 'submodule', 'add', simple_repo_dir] + cmd = ['git', '-c', 'protocol.file.allow=always', 'submodule', 'add', '--name', SIMPLE_REPO, + simple_repo_dir, self._simple_ext_name] execute_subprocess(cmd) # Checkout feature2 os.chdir(self._simple_ext_name) @@ -1621,8 +1558,8 @@ def setUp(self): # Save the fork repo hash for comparison self._simple_hash_check = self.get_git_hash() os.chdir(self._repo_dir) - self.create_externals_file(filename=self._container_extern_name, - dest_dir=self._repo_dir, from_submodule=True) + self.write_externals_config(filename=self._container_extern_name, + dest_dir=self._repo_dir, from_submodule=True) cmd = ['git', 'add', self._container_extern_name] execute_subprocess(cmd) cmd = ['git', 'commit', '-am', "'Added simple-ext as a submodule'"] @@ -1639,9 +1576,10 @@ def get_git_hash(revision="HEAD"): git_out = execute_subprocess(cmd, output_to_caller=True) return git_out.strip() - def create_externals_file(self, name='', filename=CFG_NAME, dest_dir=None, - branch_name=None, sub_externals=None, - from_submodule=False): + def 
write_externals_config(self, name='', dest_dir=None, + filename=CFG_NAME, + branch_name=None, sub_externals=None, + from_submodule=False): # pylint: disable=too-many-arguments """Create a container externals file with only simple externals. @@ -1652,10 +1590,10 @@ def create_externals_file(self, name='', filename=CFG_NAME, dest_dir=None, dest_dir = self._my_test_dir if from_submodule: - self._generator.create_section(SIMPLE_FORK_NAME, + self._generator.create_section(SIMPLE_FORK_REPO, self._simple_ext_fork_name, from_submodule=True) - self._generator.create_section(SIMPLE_REPO_NAME, + self._generator.create_section(SIMPLE_REPO, self._simple_ext_name, branch='feature3', path='', from_submodule=False) @@ -1666,8 +1604,8 @@ def create_externals_file(self, name='', filename=CFG_NAME, dest_dir=None, self._generator.create_section(self._test_repo_name, self._checkout_dir, branch=branch_name, - path=name, externals=sub_externals, - repo_path=self._repo_dir) + path=name, sub_externals=sub_externals, + repo_path_abs=self._repo_dir) self._generator.write_config(dest_dir, filename=filename) @@ -1676,12 +1614,10 @@ def idempotence_check(self, checkout_dir): checkout_externals --status does not cause errors""" cwd = os.getcwd() os.chdir(checkout_dir) - overall, _ = self.execute_cmd_in_dir(self._my_test_dir, - self.checkout_args) - self.assertTrue(overall == 0) - overall, _ = self.execute_cmd_in_dir(self._my_test_dir, - self.status_args) - self.assertTrue(overall == 0) + self.execute_checkout_in_dir(self._my_test_dir, + self.checkout_args) + self.execute_checkout_in_dir(self._my_test_dir, + self.status_args) os.chdir(cwd) def test_submodule_checkout_bare(self): @@ -1693,17 +1629,17 @@ def test_submodule_checkout_bare(self): """ simple_ext_fork_tag = "(tag1)" simple_ext_fork_status = " " - self.create_externals_file(branch_name=self._bare_branch_name) - overall, _ = self.execute_cmd_in_dir(self._my_test_dir, - self.checkout_args) - self.assertTrue(overall == 0) + self.write_externals_config(branch_name=self._bare_branch_name) + self.execute_checkout_in_dir(self._my_test_dir, + self.checkout_args) cwd = os.getcwd() checkout_dir = os.path.join(self._my_test_dir, self._checkout_dir) fork_file = os.path.join(checkout_dir, self._simple_ext_fork_name, "readme.txt") self.assertTrue(os.path.exists(fork_file)) - os.chdir(checkout_dir) + submods = git_submodule_status(checkout_dir) + print('checking status of', checkout_dir, ':', submods) self.assertEqual(len(submods.keys()), 1) self.assertTrue(self._simple_ext_fork_name in submods) submod = submods[self._simple_ext_fork_name] @@ -1713,7 +1649,6 @@ def test_submodule_checkout_bare(self): self.assertEqual(submod['status'], simple_ext_fork_status) self.assertTrue('tag' in submod) self.assertEqual(submod['tag'], simple_ext_fork_tag) - os.chdir(cwd) self.idempotence_check(checkout_dir) def test_submodule_checkout_none(self): @@ -1722,11 +1657,10 @@ def test_submodule_checkout_none(self): externals cfg file. Correct behavior is the submodle is not checked out. 
""" - self.create_externals_file(branch_name=self._bare_branch_name, - sub_externals="none") - overall, _ = self.execute_cmd_in_dir(self._my_test_dir, - self.checkout_args) - self.assertTrue(overall == 0) + self.write_externals_config(branch_name=self._bare_branch_name, + sub_externals="none") + self.execute_checkout_in_dir(self._my_test_dir, + self.checkout_args) cwd = os.getcwd() checkout_dir = os.path.join(self._my_test_dir, self._checkout_dir) fork_file = os.path.join(checkout_dir, @@ -1744,11 +1678,10 @@ def test_submodule_checkout_config(self): # pylint: disable=too-many-locals """ tag_check = None # Not checked out as submodule status_check = "-" # Not checked out as submodule - self.create_externals_file(branch_name=self._config_branch_name, - sub_externals=self._container_extern_name) - overall, _ = self.execute_cmd_in_dir(self._my_test_dir, - self.checkout_args) - self.assertTrue(overall == 0) + self.write_externals_config(branch_name=self._config_branch_name, + sub_externals=self._container_extern_name) + self.execute_checkout_in_dir(self._my_test_dir, + self.checkout_args) cwd = os.getcwd() checkout_dir = os.path.join(self._my_test_dir, self._checkout_dir) fork_file = os.path.join(checkout_dir, @@ -1810,17 +1743,20 @@ def test_error_unknown_protocol(self): """ # create repo - under_test_dir = self.setup_test_repo(CONTAINER_REPO_NAME) - self._generator.container_simple_required(under_test_dir) + cloned_repo_dir = self.clone_test_repo(CONTAINER_REPO) + self._generator.create_config() + self._generator.create_section(SIMPLE_REPO, BRANCH_SECTION, + branch=REMOTE_BRANCH_FEATURE2) + self._generator.write_config(cloned_repo_dir) # update the config file to point to a different remote with # the tag instead of branch. Tag MUST NOT be in the original # repo! - self._generator.update_protocol(under_test_dir, 'simp_branch', - 'this-protocol-does-not-exist') + self._generator.write_with_protocol(cloned_repo_dir, BRANCH_SECTION, + 'this-protocol-does-not-exist') with self.assertRaises(RuntimeError): - self.execute_cmd_in_dir(under_test_dir, self.checkout_args) + self.execute_checkout_in_dir(cloned_repo_dir, self.checkout_args) def test_error_switch_protocol(self): """Verify that a runtime error is raised when the user switches @@ -1831,15 +1767,18 @@ def test_error_switch_protocol(self): """ # create repo - under_test_dir = self.setup_test_repo(CONTAINER_REPO_NAME) - self._generator.container_simple_required(under_test_dir) + cloned_repo_dir = self.clone_test_repo(CONTAINER_REPO) + self._generator.create_config() + self._generator.create_section(SIMPLE_REPO, BRANCH_SECTION, + branch=REMOTE_BRANCH_FEATURE2) + self._generator.write_config(cloned_repo_dir) # update the config file to point to a different remote with # the tag instead of branch. Tag MUST NOT be in the original # repo! 
- self._generator.update_protocol(under_test_dir, 'simp_branch', 'svn') + self._generator.write_with_protocol(cloned_repo_dir, BRANCH_SECTION, 'svn') with self.assertRaises(RuntimeError): - self.execute_cmd_in_dir(under_test_dir, self.checkout_args) + self.execute_checkout_in_dir(cloned_repo_dir, self.checkout_args) def test_error_unknown_tag(self): """Verify that a runtime error is raised when the user specified tag @@ -1847,17 +1786,21 @@ def test_error_unknown_tag(self): """ # create repo - under_test_dir = self.setup_test_repo(CONTAINER_REPO_NAME) - self._generator.container_simple_required(under_test_dir) + cloned_repo_dir = self.clone_test_repo(CONTAINER_REPO) + self._generator.create_config() + self._generator.create_section(SIMPLE_REPO, BRANCH_SECTION, + branch=REMOTE_BRANCH_FEATURE2) + self._generator.write_config(cloned_repo_dir) # update the config file to point to a different remote with # the tag instead of branch. Tag MUST NOT be in the original # repo! - self._generator.update_tag(under_test_dir, 'simp_branch', - 'this-tag-does-not-exist', SIMPLE_REPO_NAME) + self._generator.write_with_tag_and_remote_repo(cloned_repo_dir, BRANCH_SECTION, + tag='this-tag-does-not-exist', + new_remote_repo_path=SIMPLE_REPO) with self.assertRaises(RuntimeError): - self.execute_cmd_in_dir(under_test_dir, self.checkout_args) + self.execute_checkout_in_dir(cloned_repo_dir, self.checkout_args) def test_error_overspecify_tag_branch(self): """Verify that a runtime error is raised when the user specified both @@ -1865,18 +1808,22 @@ def test_error_overspecify_tag_branch(self): """ # create repo - under_test_dir = self.setup_test_repo(CONTAINER_REPO_NAME) - self._generator.container_simple_required(under_test_dir) + cloned_repo_dir = self.clone_test_repo(CONTAINER_REPO) + self._generator.create_config() + self._generator.create_section(SIMPLE_REPO, BRANCH_SECTION, + branch=REMOTE_BRANCH_FEATURE2) + self._generator.write_config(cloned_repo_dir) # update the config file to point to a different remote with # the tag instead of branch. Tag MUST NOT be in the original # repo! - self._generator.update_tag(under_test_dir, 'simp_branch', - 'this-tag-does-not-exist', SIMPLE_REPO_NAME, - remove_branch=False) + self._generator.write_with_tag_and_remote_repo(cloned_repo_dir, BRANCH_SECTION, + tag='this-tag-does-not-exist', + new_remote_repo_path=SIMPLE_REPO, + remove_branch=False) with self.assertRaises(RuntimeError): - self.execute_cmd_in_dir(under_test_dir, self.checkout_args) + self.execute_checkout_in_dir(cloned_repo_dir, self.checkout_args) def test_error_underspecify_tag_branch(self): """Verify that a runtime error is raised when the user specified @@ -1884,17 +1831,19 @@ def test_error_underspecify_tag_branch(self): """ # create repo - under_test_dir = self.setup_test_repo(CONTAINER_REPO_NAME) - self._generator.container_simple_required(under_test_dir) + cloned_repo_dir = self.clone_test_repo(CONTAINER_REPO) + self._generator.create_config() + self._generator.create_section(SIMPLE_REPO, BRANCH_SECTION, + branch=REMOTE_BRANCH_FEATURE2) + self._generator.write_config(cloned_repo_dir) # update the config file to point to a different remote with # the tag instead of branch. Tag MUST NOT be in the original # repo! 
- self._generator.update_underspecify_branch_tag(under_test_dir, - 'simp_branch') + self._generator.write_without_branch_tag(cloned_repo_dir, BRANCH_SECTION) with self.assertRaises(RuntimeError): - self.execute_cmd_in_dir(under_test_dir, self.checkout_args) + self.execute_checkout_in_dir(cloned_repo_dir, self.checkout_args) def test_error_missing_url(self): """Verify that a runtime error is raised when the user specified @@ -1902,17 +1851,20 @@ def test_error_missing_url(self): """ # create repo - under_test_dir = self.setup_test_repo(CONTAINER_REPO_NAME) - self._generator.container_simple_required(under_test_dir) + cloned_repo_dir = self.clone_test_repo(CONTAINER_REPO) + self._generator.create_config() + self._generator.create_section(SIMPLE_REPO, BRANCH_SECTION, + branch=REMOTE_BRANCH_FEATURE2) + self._generator.write_config(cloned_repo_dir) # update the config file to point to a different remote with # the tag instead of branch. Tag MUST NOT be in the original # repo! - self._generator.update_underspecify_remove_url(under_test_dir, - 'simp_branch') + self._generator.write_without_repo_url(cloned_repo_dir, + BRANCH_SECTION) with self.assertRaises(RuntimeError): - self.execute_cmd_in_dir(under_test_dir, self.checkout_args) + self.execute_checkout_in_dir(cloned_repo_dir, self.checkout_args) if __name__ == '__main__': diff --git a/test/test_sys_repository_git.py b/test/test_sys_repository_git.py index f6dbf84284..7e5fb5020d 100644 --- a/test/test_sys_repository_git.py +++ b/test/test_sys_repository_git.py @@ -1,4 +1,4 @@ -#!/usr/bin/env python +#!/usr/bin/env python3 """Tests of some of the functionality in repository_git.py that actually interacts with git repositories. @@ -131,12 +131,12 @@ def tearDown(self): shutil.rmtree(self._tmpdir, ignore_errors=True) @staticmethod - def make_git_repo(): + def make_cwd_git_repo(): """Turn the current directory into an empty git repository""" execute_subprocess(['git', 'init']) @staticmethod - def add_git_commit(): + def add_cwd_git_commit(): """Add a git commit in the current directory""" with open('README', 'a') as myfile: myfile.write('more info') @@ -144,17 +144,17 @@ def add_git_commit(): execute_subprocess(['git', 'commit', '-m', 'my commit message']) @staticmethod - def checkout_git_branch(branchname): + def checkout_cwd_git_branch(branchname): """Checkout a new branch in the current directory""" execute_subprocess(['git', 'checkout', '-b', branchname]) @staticmethod - def make_git_tag(tagname): + def make_cwd_git_tag(tagname): """Make a lightweight tag at the current commit""" execute_subprocess(['git', 'tag', '-m', 'making a tag', tagname]) @staticmethod - def checkout_ref(refname): + def checkout_cwd_ref(refname): """Checkout the given refname in the current directory""" execute_subprocess(['git', 'checkout', refname]) @@ -164,72 +164,72 @@ def checkout_ref(refname): def test_currentHash_returnsHash(self): """Ensure that the _git_current_hash function returns a hash""" - self.make_git_repo() - self.add_git_commit() - hash_found, myhash = self._repo._git_current_hash() + self.make_cwd_git_repo() + self.add_cwd_git_commit() + hash_found, myhash = self._repo._git_current_hash(os.getcwd()) self.assertTrue(hash_found) self.assertIsHash(myhash) def test_currentHash_outsideGitRepo(self): """Ensure that the _git_current_hash function returns False when outside a git repository""" - hash_found, myhash = self._repo._git_current_hash() + hash_found, myhash = self._repo._git_current_hash(os.getcwd()) self.assertFalse(hash_found) self.assertEqual('', 
myhash) def test_currentBranch_onBranch(self): """Ensure that the _git_current_branch function returns the name of the branch""" - self.make_git_repo() - self.add_git_commit() - self.checkout_git_branch('foo') - branch_found, mybranch = self._repo._git_current_branch() + self.make_cwd_git_repo() + self.add_cwd_git_commit() + self.checkout_cwd_git_branch('foo') + branch_found, mybranch = self._repo._git_current_branch(os.getcwd()) self.assertTrue(branch_found) self.assertEqual('foo', mybranch) def test_currentBranch_notOnBranch(self): """Ensure that the _git_current_branch function returns False when not on a branch""" - self.make_git_repo() - self.add_git_commit() - self.make_git_tag('mytag') - self.checkout_ref('mytag') - branch_found, mybranch = self._repo._git_current_branch() + self.make_cwd_git_repo() + self.add_cwd_git_commit() + self.make_cwd_git_tag('mytag') + self.checkout_cwd_ref('mytag') + branch_found, mybranch = self._repo._git_current_branch(os.getcwd()) self.assertFalse(branch_found) self.assertEqual('', mybranch) def test_currentBranch_outsideGitRepo(self): """Ensure that the _git_current_branch function returns False when outside a git repository""" - branch_found, mybranch = self._repo._git_current_branch() + branch_found, mybranch = self._repo._git_current_branch(os.getcwd()) self.assertFalse(branch_found) self.assertEqual('', mybranch) def test_currentTag_onTag(self): """Ensure that the _git_current_tag function returns the name of the tag""" - self.make_git_repo() - self.add_git_commit() - self.make_git_tag('some_tag') - tag_found, mytag = self._repo._git_current_tag() + self.make_cwd_git_repo() + self.add_cwd_git_commit() + self.make_cwd_git_tag('some_tag') + tag_found, mytag = self._repo._git_current_tag(os.getcwd()) self.assertTrue(tag_found) self.assertEqual('some_tag', mytag) def test_currentTag_notOnTag(self): """Ensure that the _git_current_tag function returns False when not on a tag""" - self.make_git_repo() - self.add_git_commit() - self.make_git_tag('some_tag') - self.add_git_commit() - tag_found, mytag = self._repo._git_current_tag() + self.make_cwd_git_repo() + self.add_cwd_git_commit() + self.make_cwd_git_tag('some_tag') + self.add_cwd_git_commit() + tag_found, mytag = self._repo._git_current_tag(os.getcwd()) self.assertFalse(tag_found) self.assertEqual('', mytag) def test_currentTag_outsideGitRepo(self): """Ensure that the _git_current_tag function returns False when outside a git repository""" - tag_found, mytag = self._repo._git_current_tag() + tag_found, mytag = self._repo._git_current_tag(os.getcwd()) self.assertFalse(tag_found) self.assertEqual('', mytag) diff --git a/test/test_unit_externals_description.py b/test/test_unit_externals_description.py index 637f760ee5..30e5288499 100644 --- a/test/test_unit_externals_description.py +++ b/test/test_unit_externals_description.py @@ -1,4 +1,4 @@ -#!/usr/bin/env python +#!/usr/bin/env python3 """Unit test driver for checkout_externals @@ -342,6 +342,40 @@ def setup_config(self): # NOTE(goldy, 2019-03) Should test other possible keywords such as # fetchRecurseSubmodules, ignore, and shallow + @staticmethod + def setup_dict_config(): + """Create the full container dictionary with simple and mixed use + externals + + """ + rdatat = {ExternalsDescription.PROTOCOL: 'git', + ExternalsDescription.REPO_URL: 'simple-ext.git', + ExternalsDescription.TAG: 'tag1'} + rdatab = {ExternalsDescription.PROTOCOL: 'git', + ExternalsDescription.REPO_URL: 'simple-ext.git', + ExternalsDescription.BRANCH: 'feature2'} + rdatam =
{ExternalsDescription.PROTOCOL: 'git', + ExternalsDescription.REPO_URL: 'mixed-cont-ext.git', + ExternalsDescription.BRANCH: 'master'} + desc = {'simp_tag': {ExternalsDescription.REQUIRED: True, + ExternalsDescription.PATH: 'simp_tag', + ExternalsDescription.EXTERNALS: EMPTY_STR, + ExternalsDescription.REPO: rdatat}, + 'simp_branch' : {ExternalsDescription.REQUIRED: True, + ExternalsDescription.PATH: 'simp_branch', + ExternalsDescription.EXTERNALS: EMPTY_STR, + ExternalsDescription.REPO: rdatab}, + 'simp_opt': {ExternalsDescription.REQUIRED: False, + ExternalsDescription.PATH: 'simp_opt', + ExternalsDescription.EXTERNALS: EMPTY_STR, + ExternalsDescription.REPO: rdatat}, + 'mixed_req': {ExternalsDescription.REQUIRED: True, + ExternalsDescription.PATH: 'mixed_req', + ExternalsDescription.EXTERNALS: 'sub-ext.cfg', + ExternalsDescription.REPO: rdatam}} + + return desc + def test_cfg_v1_ok(self): """Test that a correct cfg v1 object is created by create_externals_description @@ -379,6 +413,49 @@ def test_dict(self): ext = create_externals_description(desc, model_format='dict') self.assertIsInstance(ext, ExternalsDescriptionDict) + def test_cfg_component_dict(self): + """Verify that create_externals_description works with a dictionary + """ + # create the top level externals file + desc = self.setup_dict_config() + # Check external with all repos + external = create_externals_description(desc, model_format='dict') + self.assertIsInstance(external, ExternalsDescriptionDict) + self.assertTrue('simp_tag' in external) + self.assertTrue('simp_branch' in external) + self.assertTrue('simp_opt' in external) + self.assertTrue('mixed_req' in external) + + def test_cfg_exclude_component_dict(self): + """Verify that exclude component checkout works with a dictionary + """ + # create the top level externals file + desc = self.setup_dict_config() + # Test an excluded repo + external = create_externals_description(desc, model_format='dict', + exclude=['simp_tag', + 'simp_opt']) + self.assertIsInstance(external, ExternalsDescriptionDict) + self.assertFalse('simp_tag' in external) + self.assertTrue('simp_branch' in external) + self.assertFalse('simp_opt' in external) + self.assertTrue('mixed_req' in external) + + def test_cfg_opt_component_dict(self): + """Verify that selecting components to include works with a dictionary + """ + # create the top level externals file + desc = self.setup_dict_config() + # Test including only the named repos + external = create_externals_description(desc, model_format='dict', + components=['simp_tag', + 'simp_opt']) + self.assertIsInstance(external, ExternalsDescriptionDict) + self.assertTrue('simp_tag' in external) + self.assertFalse('simp_branch' in external) + self.assertTrue('simp_opt' in external) + self.assertFalse('mixed_req' in external) + def test_cfg_unknown_version(self): """Test that a runtime error is raised when an unknown file version is received diff --git a/test/test_unit_externals_status.py b/test/test_unit_externals_status.py index f8e953f756..f019514e9e 100644 --- a/test/test_unit_externals_status.py +++ b/test/test_unit_externals_status.py @@ -1,4 +1,4 @@ -#!/usr/bin/env python +#!/usr/bin/env python3 """Unit test driver for the manic external status reporting module.
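The dictionary model format exercised by the new setup_dict_config() helper and the test_cfg_*_dict tests above can also be driven directly from calling code. The lines below are an illustrative sketch only, assuming the manic package is importable the way these unit tests import it and reusing the same placeholder repository names; exclude= and components= filter which externals survive in the resulting description.

    from manic.externals_description import (ExternalsDescription,
                                              create_externals_description)

    # Two externals, mirroring the 'simp_tag' and 'simp_branch' entries built
    # by setup_dict_config() above (EMPTY_STR in the tests is just '').
    repo_tag = {ExternalsDescription.PROTOCOL: 'git',
                ExternalsDescription.REPO_URL: 'simple-ext.git',
                ExternalsDescription.TAG: 'tag1'}
    repo_branch = {ExternalsDescription.PROTOCOL: 'git',
                   ExternalsDescription.REPO_URL: 'simple-ext.git',
                   ExternalsDescription.BRANCH: 'feature2'}
    desc = {'simp_tag': {ExternalsDescription.REQUIRED: True,
                         ExternalsDescription.PATH: 'simp_tag',
                         ExternalsDescription.EXTERNALS: '',
                         ExternalsDescription.REPO: repo_tag},
            'simp_branch': {ExternalsDescription.REQUIRED: True,
                            ExternalsDescription.PATH: 'simp_branch',
                            ExternalsDescription.EXTERNALS: '',
                            ExternalsDescription.REPO: repo_branch}}

    # model_format='dict' bypasses config-file parsing entirely.
    external = create_externals_description(desc, model_format='dict')
    assert 'simp_tag' in external and 'simp_branch' in external

    # exclude= drops the named components, as test_cfg_exclude_component_dict
    # verifies; components= keeps only the named ones.
    filtered = create_externals_description(desc, model_format='dict',
                                            exclude=['simp_tag'])
    assert 'simp_tag' not in filtered and 'simp_branch' in filtered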
diff --git a/test/test_unit_repository.py b/test/test_unit_repository.py index 5b9c242fd3..1b93861834 100644 --- a/test/test_unit_repository.py +++ b/test/test_unit_repository.py @@ -1,4 +1,4 @@ -#!/usr/bin/env python +#!/usr/bin/env python3 """Unit test driver for checkout_externals diff --git a/test/test_unit_repository_git.py b/test/test_unit_repository_git.py index 4a0a334bb1..1c01098acf 100644 --- a/test/test_unit_repository_git.py +++ b/test/test_unit_repository_git.py @@ -1,4 +1,4 @@ -#!/usr/bin/env python +#!/usr/bin/env python3 """Unit test driver for checkout_externals @@ -67,7 +67,7 @@ def setUp(self): def _git_current_branch(branch_found, branch_name): """Return a function that takes the place of repo._git_current_branch, which returns the given output.""" - def my_git_current_branch(): + def my_git_current_branch(dirname): """mock function that can take the place of repo._git_current_branch""" return branch_found, branch_name return my_git_current_branch @@ -76,7 +76,7 @@ def my_git_current_branch(): def _git_current_tag(tag_found, tag_name): """Return a function that takes the place of repo._git_current_tag, which returns the given output.""" - def my_git_current_tag(): + def my_git_current_tag(dirname): """mock function that can take the place of repo._git_current_tag""" return tag_found, tag_name return my_git_current_tag @@ -85,7 +85,7 @@ def my_git_current_tag(): def _git_current_hash(hash_found, hash_name): """Return a function that takes the place of repo._git_current_hash, which returns the given output.""" - def my_git_current_hash(): + def my_git_current_hash(dirname): """mock function that can take the place of repo._git_current_hash""" return hash_found, hash_name return my_git_current_hash @@ -101,8 +101,8 @@ def test_ref_branch(self): True, 'feature3') self._repo._git_current_tag = self._git_current_tag(True, 'foo_tag') self._repo._git_current_hash = self._git_current_hash(True, 'abc123') - expected = 'feature3' - result = self._repo._current_ref() + expected = 'foo_tag (branch feature3)' + result = self._repo._current_ref(os.getcwd()) self.assertEqual(result, expected) def test_ref_detached_tag(self): @@ -112,7 +112,7 @@ def test_ref_detached_tag(self): self._repo._git_current_tag = self._git_current_tag(True, 'foo_tag') self._repo._git_current_hash = self._git_current_hash(True, 'abc123') expected = 'foo_tag' - result = self._repo._current_ref() + result = self._repo._current_ref(os.getcwd()) self.assertEqual(result, expected) def test_ref_detached_hash(self): @@ -123,7 +123,7 @@ def test_ref_detached_hash(self): self._repo._git_current_tag = self._git_current_tag(False, '') self._repo._git_current_hash = self._git_current_hash(True, 'abc123') expected = 'abc123' - result = self._repo._current_ref() + result = self._repo._current_ref(os.getcwd()) self.assertEqual(result, expected) def test_ref_none(self): @@ -132,7 +132,7 @@ def test_ref_none(self): self._repo._git_current_branch = self._git_current_branch(False, '') self._repo._git_current_tag = self._git_current_tag(False, '') self._repo._git_current_hash = self._git_current_hash(False, '') - result = self._repo._current_ref() + result = self._repo._current_ref(os.getcwd()) self.assertEqual(result, EMPTY_STR) @@ -206,11 +206,19 @@ def setUp(self): self._repo._current_ref = self._current_ref_empty self._create_tmp_git_dir() + # We have to override this class method rather than the self._repo + # instance method because it is called via + # GitRepository._remote_name_for_url, which is itself a @classmethod + # 
calls cls._git_remote_verbose(). + self._original_git_remote_verbose = GitRepository._git_remote_verbose + GitRepository._git_remote_verbose = self._git_remote_origin_upstream def tearDown(self): """Cleanup tmp stuff on the file system """ self._remove_tmp_git_dir() + GitRepository._git_remote_verbose = self._original_git_remote_verbose + def _create_tmp_git_dir(self): """Create a temporary fake git directory for testing purposes. """ @@ -227,29 +235,27 @@ def _remove_tmp_git_dir(self): # mock methods replacing git system calls # @staticmethod - def _current_ref_empty(): + def _current_ref_empty(dirname): """Return an empty string. + + Drop-in for GitRepository._current_ref """ return EMPTY_STR @staticmethod - def _git_remote_origin_upstream(): - """Return an info string that is a checkout hash - """ - return GIT_REMOTE_OUTPUT_ORIGIN_UPSTREAM + def _git_remote_origin_upstream(dirname): + """Return an info string that is a checkout hash. - @staticmethod - def _git_remote_none(): - """Return an info string that is a checkout hash + Drop-in for GitRepository._git_remote_verbose. """ - return EMPTY_STR + return GIT_REMOTE_OUTPUT_ORIGIN_UPSTREAM @staticmethod def _git_current_hash(myhash): """Return a function that takes the place of repo._git_current_hash, which returns the given hash """ - def my_git_current_hash(): + def my_git_current_hash(dirname): """mock function that can take the place of repo._git_current_hash""" return 0, myhash return my_git_current_hash @@ -263,7 +269,7 @@ def _git_revparse_commit(self, expected_ref, mystatus, myhash): status = 0 implies success, non-zero implies failure """ - def my_git_revparse_commit(ref): + def my_git_revparse_commit(ref, dirname): """mock function that can take the place of repo._git_revparse_commit""" self.assertEqual(expected_ref, ref) return mystatus, myhash @@ -291,9 +297,6 @@ def test_sync_dir_exist_no_git_info(self): """Test that a non-existent git repo returns an unknown status """ stat = ExternalStatus() - # Now we over-ride the _git_remote_verbose method on the repo to return - # a known value without requiring access to git.
- self._repo._git_remote_verbose = self._git_remote_origin_upstream self._repo._tag = 'tag1' self._repo._git_current_hash = self._git_current_hash('') self._repo._git_revparse_commit = self._git_revparse_commit( @@ -313,7 +316,6 @@ def test_sync_invalid_reference(self): """Test that an invalid reference returns out-of-sync """ stat = ExternalStatus() - self._repo._git_remote_verbose = self._git_remote_origin_upstream self._repo._tag = 'tag1' self._repo._git_current_hash = self._git_current_hash('abc123') self._repo._git_revparse_commit = self._git_revparse_commit( @@ -333,7 +335,6 @@ def test_sync_tag_on_same_hash(self): """ stat = ExternalStatus() - self._repo._git_remote_verbose = self._git_remote_origin_upstream self._repo._tag = 'tag1' self._repo._git_current_hash = self._git_current_hash('abc123') self._repo._git_revparse_commit = self._git_revparse_commit( @@ -348,7 +349,6 @@ def test_sync_tag_on_different_hash(self): """ stat = ExternalStatus() - self._repo._git_remote_verbose = self._git_remote_origin_upstream self._repo._tag = 'tag1' self._repo._git_current_hash = self._git_current_hash('def456') self._repo._git_revparse_commit = self._git_revparse_commit( @@ -368,7 +368,6 @@ def test_sync_hash_on_same_hash(self): """ stat = ExternalStatus() - self._repo._git_remote_verbose = self._git_remote_origin_upstream self._repo._tag = '' self._repo._hash = 'abc' self._repo._git_current_hash = self._git_current_hash('abc123') @@ -384,7 +383,6 @@ def test_sync_hash_on_different_hash(self): """ stat = ExternalStatus() - self._repo._git_remote_verbose = self._git_remote_origin_upstream self._repo._tag = '' self._repo._hash = 'abc' self._repo._git_current_hash = self._git_current_hash('def456') @@ -405,7 +403,6 @@ def test_sync_branch_on_same_hash(self): """ stat = ExternalStatus() - self._repo._git_remote_verbose = self._git_remote_origin_upstream self._repo._branch = 'feature-2' self._repo._tag = '' self._repo._git_current_hash = self._git_current_hash('abc123') @@ -421,7 +418,6 @@ def test_sync_branch_on_diff_hash(self): """ stat = ExternalStatus() - self._repo._git_remote_verbose = self._git_remote_origin_upstream self._repo._branch = 'feature-2' self._repo._tag = '' self._repo._git_current_hash = self._git_current_hash('abc123') @@ -433,11 +429,10 @@ def test_sync_branch_on_diff_hash(self): self.assertEqual(stat.clean_state, ExternalStatus.DEFAULT) def test_sync_branch_diff_remote(self): - """Test _determine_remote_name with a different remote + """Test _remote_name_for_url with a different remote """ stat = ExternalStatus() - self._repo._git_remote_verbose = self._git_remote_origin_upstream self._repo._branch = 'feature-2' self._repo._tag = '' self._repo._url = '/path/to/other/repo' @@ -449,11 +444,10 @@ def test_sync_branch_diff_remote(self): # expected argument def test_sync_branch_diff_remote2(self): - """Test _determine_remote_name with a different remote + """Test _remote_name_for_url with a different remote """ stat = ExternalStatus() - self._repo._git_remote_verbose = self._git_remote_origin_upstream self._repo._branch = 'feature-2' self._repo._tag = '' self._repo._url = '/path/to/local/repo2' @@ -469,7 +463,6 @@ def test_sync_branch_on_unknown_remote(self): """ stat = ExternalStatus() - self._repo._git_remote_verbose = self._git_remote_origin_upstream self._repo._branch = 'feature-2' self._repo._tag = '' self._repo._url = '/path/to/unknown/repo' @@ -491,7 +484,6 @@ def test_sync_branch_on_untracked_local(self): """ stat = ExternalStatus() - self._repo._git_remote_verbose = 
self._git_remote_origin_upstream self._repo._branch = 'feature3' self._repo._tag = '' self._repo._url = '.' @@ -611,24 +603,20 @@ def setUp(self): self._repo = GitRepository('test', repo) @staticmethod - def _shell_true(url, remote=None): - _ = url - _ = remote + def _shell_true(*args, **kwargs): return 0 @staticmethod - def _shell_false(url, remote=None): - _ = url - _ = remote + def _shell_false(*args, **kwargs): return 1 @staticmethod - def _mock_function_true(ref): + def _mock_revparse_commit(ref, dirname): _ = ref return (TestValidRef._shell_true, '97ebc0e0deadc0de') @staticmethod - def _mock_function_false(ref): + def _mock_revparse_commit_false(ref, dirname): _ = ref return (TestValidRef._shell_false, '97ebc0e0deadc0de') @@ -638,10 +626,11 @@ def test_tag_not_tag_branch_commit(self): self._repo._git_showref_tag = self._shell_false self._repo._git_showref_branch = self._shell_false self._repo._git_lsremote_branch = self._shell_false - self._repo._git_revparse_commit = self._mock_function_false + self._repo._git_revparse_commit = self._mock_revparse_commit_false self._repo._tag = 'something' remote_name = 'origin' - received, _ = self._repo._is_unique_tag(self._repo._tag, remote_name) + received, _ = self._repo._is_unique_tag(self._repo._tag, remote_name, + os.getcwd()) self.assertFalse(received) def test_tag_not_tag(self): @@ -650,10 +639,11 @@ def test_tag_not_tag(self): self._repo._git_showref_tag = self._shell_false self._repo._git_showref_branch = self._shell_true self._repo._git_lsremote_branch = self._shell_true - self._repo._git_revparse_commit = self._mock_function_false + self._repo._git_revparse_commit = self._mock_revparse_commit_false self._repo._tag = 'tag1' remote_name = 'origin' - received, _ = self._repo._is_unique_tag(self._repo._tag, remote_name) + received, _ = self._repo._is_unique_tag(self._repo._tag, remote_name, + os.getcwd()) self.assertFalse(received) def test_tag_indeterminant(self): @@ -662,10 +652,11 @@ def test_tag_indeterminant(self): self._repo._git_showref_tag = self._shell_true self._repo._git_showref_branch = self._shell_true self._repo._git_lsremote_branch = self._shell_true - self._repo._git_revparse_commit = self._mock_function_true + self._repo._git_revparse_commit = self._mock_revparse_commit self._repo._tag = 'something' remote_name = 'origin' - received, _ = self._repo._is_unique_tag(self._repo._tag, remote_name) + received, _ = self._repo._is_unique_tag(self._repo._tag, remote_name, + os.getcwd()) self.assertFalse(received) def test_tag_is_unique(self): @@ -674,10 +665,11 @@ def test_tag_is_unique(self): self._repo._git_showref_tag = self._shell_true self._repo._git_showref_branch = self._shell_false self._repo._git_lsremote_branch = self._shell_false - self._repo._git_revparse_commit = self._mock_function_true + self._repo._git_revparse_commit = self._mock_revparse_commit self._repo._tag = 'tag1' remote_name = 'origin' - received, _ = self._repo._is_unique_tag(self._repo._tag, remote_name) + received, _ = self._repo._is_unique_tag(self._repo._tag, remote_name, + os.getcwd()) self.assertTrue(received) def test_tag_is_not_hash(self): @@ -686,10 +678,11 @@ def test_tag_is_not_hash(self): self._repo._git_showref_tag = self._shell_false self._repo._git_showref_branch = self._shell_false self._repo._git_lsremote_branch = self._shell_false - self._repo._git_revparse_commit = self._mock_function_true + self._repo._git_revparse_commit = self._mock_revparse_commit self._repo._tag = '97ebc0e0' remote_name = 'origin' - received, _ = 
self._repo._is_unique_tag(self._repo._tag, remote_name) + received, _ = self._repo._is_unique_tag(self._repo._tag, remote_name, + os.getcwd()) self.assertFalse(received) def test_hash_is_commit(self): @@ -698,10 +691,11 @@ def test_hash_is_commit(self): self._repo._git_showref_tag = self._shell_false self._repo._git_showref_branch = self._shell_false self._repo._git_lsremote_branch = self._shell_false - self._repo._git_revparse_commit = self._mock_function_true + self._repo._git_revparse_commit = self._mock_revparse_commit self._repo._tag = '97ebc0e0' remote_name = 'origin' - received, _ = self._repo._is_unique_tag(self._repo._tag, remote_name) + received, _ = self._repo._is_unique_tag(self._repo._tag, remote_name, + os.getcwd()) self.assertFalse(received) @@ -746,13 +740,14 @@ def _shell_false(url, remote=None): return 1 @staticmethod - def _mock_function_false(ref): + def _mock_revparse_commit_false(ref, dirname): _ = ref return (TestValidRef._shell_false, '') @staticmethod - def _mock_function_true(ref): + def _mock_revparse_commit_true(ref, dirname): _ = ref + _ = dirname return (TestValidRef._shell_true, '') def test_valid_ref_is_invalid(self): @@ -761,10 +756,12 @@ def test_valid_ref_is_invalid(self): self._repo._git_showref_tag = self._shell_false self._repo._git_showref_branch = self._shell_false self._repo._git_lsremote_branch = self._shell_false - self._repo._git_revparse_commit = self._mock_function_false + self._repo._git_revparse_commit = self._mock_revparse_commit_false self._repo._tag = 'invalid_ref' with self.assertRaises(RuntimeError): - self._repo._check_for_valid_ref(self._repo._tag) + self._repo._check_for_valid_ref(self._repo._tag, + remote_name=None, + dirname=os.getcwd()) def test_valid_tag(self): """Verify a valid tag return true @@ -772,9 +769,11 @@ def test_valid_tag(self): self._repo._git_showref_tag = self._shell_true self._repo._git_showref_branch = self._shell_false self._repo._git_lsremote_branch = self._shell_false - self._repo._git_revparse_commit = self._mock_function_true + self._repo._git_revparse_commit = self._mock_revparse_commit_true self._repo._tag = 'tag1' - received = self._repo._check_for_valid_ref(self._repo._tag) + received = self._repo._check_for_valid_ref(self._repo._tag, + remote_name=None, + dirname=os.getcwd()) self.assertTrue(received) def test_valid_branch(self): @@ -783,24 +782,28 @@ def test_valid_branch(self): self._repo._git_showref_tag = self._shell_false self._repo._git_showref_branch = self._shell_true self._repo._git_lsremote_branch = self._shell_false - self._repo._git_revparse_commit = self._mock_function_true + self._repo._git_revparse_commit = self._mock_revparse_commit_true self._repo._tag = 'tag1' - received = self._repo._check_for_valid_ref(self._repo._tag) + received = self._repo._check_for_valid_ref(self._repo._tag, + remote_name=None, + dirname=os.getcwd()) self.assertTrue(received) def test_valid_hash(self): """Verify a valid hash return true """ - def _mock_revparse_commit(ref): + def _mock_revparse_commit_true(ref, dirname): _ = ref return (0, '56cc0b539426eb26810af9e') self._repo._git_showref_tag = self._shell_false self._repo._git_showref_branch = self._shell_false self._repo._git_lsremote_branch = self._shell_false - self._repo._git_revparse_commit = _mock_revparse_commit + self._repo._git_revparse_commit = _mock_revparse_commit_true self._repo._hash = '56cc0b5394' - received = self._repo._check_for_valid_ref(self._repo._hash) + received = self._repo._check_for_valid_ref(self._repo._hash, + remote_name=None, + 
dirname=os.getcwd()) self.assertTrue(received) diff --git a/test/test_unit_repository_svn.py b/test/test_unit_repository_svn.py old mode 100644 new mode 100755 index 7ff31c4218..d9309df7f6 --- a/test/test_unit_repository_svn.py +++ b/test/test_unit_repository_svn.py @@ -1,4 +1,4 @@ -#!/usr/bin/env python +#!/usr/bin/env python3 """Unit test driver for checkout_externals @@ -60,7 +60,7 @@ def setUp(self): self._name = 'component' rdata = {ExternalsDescription.PROTOCOL: 'svn', ExternalsDescription.REPO_URL: - 'https://svn-ccsm-models.cgd.ucar.edu/', + 'https://svn-ccsm-models.cgd.ucar.edu', ExternalsDescription.TAG: 'mosart/trunk_tags/mosart1_0_26', } diff --git a/test/test_unit_utils.py b/test/test_unit_utils.py index c994e58ebe..80e1636649 100644 --- a/test/test_unit_utils.py +++ b/test/test_unit_utils.py @@ -1,4 +1,4 @@ -#!/usr/bin/env python +#!/usr/bin/env python3 """Unit test driver for checkout_externals diff --git a/version.txt b/version.txt new file mode 100644 index 0000000000..cbda54c515 --- /dev/null +++ b/version.txt @@ -0,0 +1 @@ +manic-1.2.24-3-gba00e50
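For context on the dirname arguments threaded through _git_current_hash, _git_current_branch, _git_current_tag, _git_revparse_commit and the other helpers in the diffs above: the change lets callers name the repository directory explicitly instead of relying on the process working directory (the tests simply pass os.getcwd()). The sketch below illustrates that pattern with hypothetical stand-alone helpers, not the project's actual implementation, using 'git -C' so no chdir is needed.

    import subprocess

    def git_current_hash(dirname):
        """Return (found, sha) for HEAD of the repository at dirname.

        'git -C <dirname>' runs the command against dirname directly, which
        is what an explicit dirname parameter makes possible.
        """
        proc = subprocess.run(['git', '-C', dirname, 'rev-parse', 'HEAD'],
                              capture_output=True, text=True, check=False)
        if proc.returncode != 0:
            return False, ''
        return True, proc.stdout.strip()

    def git_current_branch(dirname):
        """Return (found, branch); (False, '') when detached or not a repo."""
        proc = subprocess.run(['git', '-C', dirname, 'symbolic-ref',
                               '--short', '-q', 'HEAD'],
                              capture_output=True, text=True, check=False)
        if proc.returncode != 0:
            return False, ''
        return True, proc.stdout.strip()

    # Usage mirroring the tests above:
    #   hash_found, myhash = git_current_hash(os.getcwd())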