Yaroslav Halchenko (yarikoptic) · Dartmouth College, @Debian, @DataLad, @PyMVPA, @fail2ban · Planet Earth · www.onerussian.com · Cheers!

gitpython-developers/GitPython 2849

GitPython is a Python library used to interact with Git repositories.

bids-standard/bids-specification 122

Brain Imaging Data Structure (BIDS) Specification

bids-standard/pybids 108

Python tools for querying and manipulating BIDS datasets.

brechtm/citeproc-py 87

Yet another Python CSL Processor

afni/afni 72

Official AFNI source and documentation

con/open-brain-consent 28

Making neuroimaging open from the ground (consent form) up (tools)

adswa/multimatch_gaze 16

Reimplementation of Matlab's MultiMatch toolbox (Dewhurst et al., 2012) in Python

bids-standard/bids2nda 11

A conversion tool for creating NDA compatible metadata representation from BIDS datasets.

fperez/nipy-notebooks 9

Example notebooks for basic tasks in neuroimaging and statistics with nipy

Pull request review event

Pull request review comment datalad/datalad

Nf json ld search

 def test_meta2autofield_dict():
             'extr1': {'prop1': 'value'}}),
         {'extr1.prop1': 'value'}
     )
+
+
+def test_meta2autofield_jsonld_graph():
+    """ check proper handling of JSON-LD @graph nodes """
+    # Just a test that we would obtain the value stored for that extractor
+    # instead of what unique values it already had (whatever that means)
+    eq_(
+        _meta2autofield_dict({"r": {"@graph": ["a", "b", "c"]}}),
+        {'r.graph[0]': 'a', 'r.graph[1]': 'b', 'r.graph[2]': 'c'}
+    )
+
+
+def test_meta2autofield_jsonld_list():
+    """ check proper handling of JSON-LD @list nodes """
+    # Just a test that we would obtain the value stored for that extractor
+    # instead of what unique values it already had (whatever that means)
+    eq_(
+        _meta2autofield_dict({"r": {"@list": ["a", "b", "c"]}}),
+        {'r.list[0]': 'a', 'r.list[1]': 'b', 'r.list[2]': 'c'}
+    )
+
+
+_mocked_studyminimeta_jsonld = {
+    "@context": {

Since this structure is already here, could there be a test for an actual search? Just to make sure there are no side effects, etc.?

christian-monch

comment created time in 9 hours
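
For orientation, a minimal Python sketch of the flattening behavior the new tests above exercise; flatten_jsonld is a hypothetical stand-in, not datalad's actual _meta2autofield_dict:

def flatten_jsonld(meta):
    # Reproduce only the input/output pairs shown in the tests above:
    # {"r": {"@graph": [...]}} -> {"r.graph[0]": ..., ...}
    out = {}
    for extractor, doc in meta.items():
        for marker, name in (("@graph", "graph"), ("@list", "list")):
            if isinstance(doc, dict) and marker in doc:
                for i, value in enumerate(doc[marker]):
                    out["%s.%s[%d]" % (extractor, name, i)] = value
    return out

assert flatten_jsonld({"r": {"@graph": ["a", "b", "c"]}}) == \
    {"r.graph[0]": "a", "r.graph[1]": "b", "r.graph[2]": "c"}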

push event datalad/datalad

John T. Wodder II

commit sha ed8cd5dfa71c91df193d1de9edb9fc997badceb0

Workflow for testing on macOS

view details

John T. Wodder II

commit sha 83adb0f905a9eac0c73b0e733e71d01c9512b971

Add `known_failure_githubci_osx` decorator to tests

view details

Yaroslav Halchenko

commit sha a2e70deb38061c829fec40391e3737805616ca08

Merge pull request #4947 from jwodder/gh-4942 Workflow for testing on macOS

view details

push time in 9 hours

PR merged datalad/datalad

Workflow for testing on macOS

Closes #4942.

+58 -0

4 comments

3 changed files

jwodder

pr closed time in 9 hours

pull request comment datalad/datalad

Workflow for testing on macOS

I think more green runs are good, so I will proceed with the merge. Could you please try to add codecov submission in a follow-up PR? There was something not working for the windows workflow https://github.com/datalad/datalad/blob/master/.github/workflows/test_win2019_disabled#L59 but maybe it would work for OSX?

jwodder

comment created time in 9 hours

push event yarikoptic/datalad

Yaroslav Halchenko

commit sha 9de9a9cdc74fb07b979a3a55dc887746f0e7edbf

RF: remove disable_logger in customremotes/datalad.py It was introduced in 15ecbb78a86f36b353e763273577dd369671e61b (0.9.2~76^2~6) as part of https://github.com/datalad/datalad/pull/1870, with the stated purpose of not polluting the stdout used by the special remote for communication. BUT logs go to stderr and do not interfere with the communication in any way, so the reason for the change is not clear to me. I have been trying to figure out why the heck I am still seeing failures after https://github.com/datalad/datalad/pull/4931 . I kept adding more logging but nothing appeared in the logs! I was finally brought to this piece of code -- thus the motivation and argumentation for the change.

view details
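
The premise of the commit above, as a minimal illustrative sketch (not datalad's code): the special-remote protocol talks over stdout, so a logger bound to stderr cannot pollute it.

import logging
import sys

# logging is configured to stderr; the special-remote protocol channel is stdout
logging.basicConfig(stream=sys.stderr, level=logging.DEBUG)
lgr = logging.getLogger("example.customremote")

def send(line):
    # protocol messages are written to stdout only
    sys.stdout.write(line + "\n")
    sys.stdout.flush()

lgr.debug("diagnostic output -- lands on stderr")
send("CHECKURL-FAILURE")  # protocol message, unaffected by the logging above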

Yaroslav Halchenko

commit sha 0c88b031f924e70a761df8a5d5cfb943cb381167

Merge pull request #4934 from yarikoptic/rf-donot-disable-log RF: remove disable_logger in customremotes/datalad.py

view details

Yaroslav Halchenko

commit sha 05d7d7ecd3d621df7d95f4cbb386cbcc1cf580bc

ENH(+RF): @try_multiple_dec - add exceptions_filter + doc, mv msg where it is correct Main motivation: I keep breeding custom loops (e.g. https://github.com/datalad/datalad/pull/4928 ) for similar situations. Adding an "exceptions_filter" argument makes it flexible enough to fit more use cases, where the decision should be made based on details of the exception (e.g. .status for S3ResponseError). Also I moved the log message which promises that we are about to sleep to right before the sleep, so we do not log it if we are about to re-raise instead.

view details

Yaroslav Halchenko

commit sha d2d8387570ba6ea8cde82948fa32a3dbb8734fd2

ENH: @try_multiple_dec - add "logger" kwarg Since the initial use case was to mediate problems on Windows with removing directories, we made it log at level 5. In many other use cases it would be desirable to log at a higher level, or even at warning level. Thus making it configurable.

view details
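
A minimal sketch of a retry decorator along these lines, with an exceptions_filter and a configurable logger/level; names and signature are illustrative, not datalad's actual @try_multiple_dec:

import logging
import time
from functools import wraps

lgr = logging.getLogger("example")

def try_multiple(ntrials=3, duration=2.0, exceptions=(Exception,),
                 exceptions_filter=None, logger=lgr, level=logging.DEBUG):
    def decorate(f):
        @wraps(f)
        def wrapped(*args, **kwargs):
            for trial in range(1, ntrials + 1):
                try:
                    return f(*args, **kwargs)
                except exceptions as exc:
                    # give the filter a chance to declare the error non-retriable
                    if exceptions_filter and not exceptions_filter(exc):
                        raise
                    if trial == ntrials:
                        raise  # out of trials: re-raise, and do not promise a sleep
                    logger.log(level,
                               "Caught %s on trial #%d. Sleeping %f and retrying",
                               exc, trial, duration)
                    time.sleep(duration)
        return wrapped
    return decorate

# e.g. a hypothetical S3 adapter could retry only "retriable" statuses:
# @try_multiple(exceptions=(S3ResponseError,),
#               exceptions_filter=lambda e: e.status in (400, 500, 503))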

Yaroslav Halchenko

commit sha 705ac2b9de4ba951b97c0452f224778bbbd5b761

ENH: @try_multiple_dec_s3 - adapter for re-trying upon S3 errors

view details

Yaroslav Halchenko

commit sha 767da4e611c239e032be61b2be889cf8414d53a5

ENH: Use @try_multiple_dec_s3 in get_downloader_session I kept hitting that 400 while establishing the ABCD dataset. I think this is a better alternative to the more "blind" https://github.com/datalad/datalad/pull/4928 which would retry the call to git annex addurl while the datalad special remote is in use.

view details

Yaroslav Halchenko

commit sha b26248b03b3161a808288a522b56209fb9b8cda5

ENH: Make use of try_multiple_dec_s3 in various other spots

view details

Yaroslav Halchenko

commit sha 84297fbdbeb76b07b4ff9d8c3274f93bf94aaa5a

typo fix Co-authored-by: Kyle Meyer <kyle@kyleam.com>

view details

Yaroslav Halchenko

commit sha 980e4a697d413edf38f02a1633a99da712f8a7cf

RF: make needs_authentication into a bool I do not see a reason why I made it possible to pass an instance of a credential

view details

Yaroslav Halchenko

commit sha 2ed07222e0c17dcda218599057a6b5ce11327131

BF: verify that, if there is a credential, it has not expired yet when considering whether to reuse a session

view details

Yaroslav Halchenko

commit sha 66e861584015b8b49cabbef15e3929e3ec07065f

BF: when we catch an S3 error and see that the authenticator key is expired -- raise AccessDeniedError That should cause the outer loop in .access to retry _establish_session without even allowing reuse of the previous session, and we should be all set!

view details

Yaroslav Halchenko

commit sha 0db3f18e212dd10b08f4599e9432ffe4b5ae981c

ENH: minor, removed commented-out 403 Did not squash since the branch is already used in a "deployed" merge of various PRs

view details

Yaroslav Halchenko

commit sha 593d1a5f808539f0e249e4c32b9fb9a203d40024

ENH: provide details in the assert message when asserting on having "command" in the returned record

view details

Yaroslav Halchenko

commit sha 5fc0e9bd97d5616da2cb0c03714a07f0c1fa738f

BF: Provide is_expired for CompositeCredential eh -- should use more of @abstractmethod I guess :-/ Here it is a mix of pure bools and properties, so left as is for now

view details

Yaroslav Halchenko

commit sha 0d001cd1ca6c498d107779c0de31e9cbfda841b4

Revert "RF: make needs_authentication into a bool" This reverts commit c936ed37d29b5175d49f27e020cae3498a9c5c92. I think there was some thought behind initial code since "else" clause does cause entry of credential if there is a no known credential and we do not know by then (thus None) if needs_authentication (would be bool) if there is no authenticator assigned. So to not disturb that portion of the code, I am reverting the change which was incomplete anyways since logic in "else:" would no longer trigger nested else: to entry of credential.

view details

Yaroslav Halchenko

commit sha e476355e903b92dd01445ee3e31a0dfcc9d51328

ENH+RF: CompositeCredential - add .refresh + trigger regeneration of full chain upon enter_new We need an interface to "force-trigger" regeneration of an apparently expired credential (even when, as far as we know from the expiration datetime, it is not due). .refresh() does that and is now used by enter_new(). The change in behavior is that we will now trigger the full chain to get all "tail" credentials, but I take it as a feature -- we do not delay generation, and thus possibly surface code bugs etc. now rather than only upon use. Also, it makes the is_expired assessment valid, since if we do not regenerate, is_expired ATM would not adequately report whether some previously generated credential is expired or not. A possible disadvantage: it would typically require a network connection while simply entering a new credential. But IMHO it is OK.

view details
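
The chain-regeneration idea, as an illustrative sketch; the classes are hypothetical and datalad's CompositeCredential differs in detail:

class Credential:
    def __init__(self):
        self.value = None

    def regenerate(self, parent):
        # e.g. a network call exchanging the parent credential for this one
        self.value = "derived:%s" % parent.value

class CompositeCredential:
    def __init__(self, head, tails):
        self._head = head    # entered by the user
        self._tails = tails  # derived ("tail") credentials, in chain order

    def refresh(self):
        # force-trigger regeneration of the full chain right away,
        # so code bugs etc. surface now rather than on first use
        prev = self._head
        for cred in self._tails:
            cred.regenerate(prev)
            prev = cred

    def enter_new(self, value):
        self._head.value = value
        self.refresh()

head, tail = Credential(), Credential()
cc = CompositeCredential(head, [tail])
cc.enter_new("token-123")
assert tail.value == "derived:token-123"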

Yaroslav Halchenko

commit sha 3df1ce5432f6883af3a1bd2939dd4cad6eec8d5c

RF+ENH: downloaders (s3) - raise and handle dedicated AccessPermissionExpiredError S3 error_codes MIGHT (see comment -- we do not always get them) hint at the nature of a 400 code. If it is ExpiredToken, we need to handle it more specifically than just "keep trying", and then rely on our knowledge of when it is to expire. To tell the truth, I triggered this specific use case manually: I replaced a proper token with an expired one, which in theory should not happen. But who knows -- maybe admins would explicitly expire some tokens which theoretically should still be valid (according to the initial expiration date), so we better be able to handle such scenarios as well instead of just demanding a new full credential to be entered.

view details
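
A sketch of the dispatch described above, assuming the caught exception carries .status and a possibly empty .error_code (as boto's S3ResponseError does); the helper is hypothetical:

class AccessPermissionExpiredError(Exception):
    """Dedicated error for an expired token/permission (illustrative)."""

def reraise_if_expired(exc):
    # ExpiredToken is one of the few 400s we can act on specifically;
    # error_code may be empty, in which case we learn nothing extra
    if getattr(exc, "status", None) == 400 \
            and getattr(exc, "error_code", None) == "ExpiredToken":
        raise AccessPermissionExpiredError(str(exc))
    raise exc  # anything else: let the generic retry logic deal with it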

Yaroslav Halchenko

commit sha a09ec9e2e7c9d8321e7a1260393887f0e29014b8

RF or BF?: do not claim that a credential is expired if we do not have an expiration datetime In good old 70c3c4b2f68dfc9ef6858a24eb266226992a7285 (0.3~120^2~9^2~2 !) logic was added to claim that any S3 key which has no expiration assigned "is_expired". That does not feel logical! Now that previous commits add checks for expiration in various relevant places, I think it is valuable to fix this to avoid unnecessary re-authentications etc. is_expired was used only within CompositeCredential, so I think it should be safe to "fix" this without any side effects.

view details

Yaroslav Halchenko

commit sha a590b8dc28d43db51980428df04a53e599133e28

RF+ENH: centralize the check that a credential has not expired, and do it before .get_key() This way we would not even attempt getting a key (immediately after) if we know that the credential has already expired. There can still be a race condition, and the proper reaction to expiration of the token while handling a 400 is yet to be figured out. I am also thinking about "artificially" bringing the (stored) expiration slightly closer in time, e.g. making it 1% shorter than the whole allotment. That would help to avoid any possible race as long as it is to expire just a few seconds after allotment... or maybe change it to be 2 seconds closer altogether -- it is unlikely the expiration would ever be that close in time!

view details

Yaroslav Halchenko

commit sha 95a7e6436ffd6a9e5ded7afa2e6f4414cfa7e29e

BF(workaround): is_expired -- remove 2 seconds from the allowed duration to help avoid race conditions etc. It is unlikely that an expiration would be provided so close to the request that it would expire within 2 seconds, so I think this should be ok.

view details
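
The workaround's arithmetic, as a sketch under assumed argument shapes (not datalad's actual code):

from datetime import datetime, timedelta

EXPIRATION_MARGIN = timedelta(seconds=2)  # the 2 seconds from the commit above

def is_expired(expiration):
    # `expiration` is an absolute datetime or None; an unknown expiration
    # is treated as not expired (per the "RF or BF?" commit above)
    if expiration is None:
        return False
    return datetime.utcnow() >= expiration - EXPIRATION_MARGIN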

push time in 9 hours

pull request comment datalad/datalad

Workflow for testing on macOS

Great. Please add known_failure_githubci_osx similar to known_failure_githubci_win and decorate those two failing tests with it, so we at least get green and can proceed. IMHO ideally this one should be positioned against the maint branch (not master) since it would be of benefit there.

jwodder

comment created time in 13 hours

issue closed datalad/datalad

Q: when to set/use git's splitIndex?

I thought that it would generally benefit repositories with a heavy index (large number of files). @mih reported some boost in the past in https://github.com/datalad/datalad/issues/3869 but it seems to be v5 -vs- v7+splitIndex (so no pure v7?).

<details> <summary>I wrote a simple script to time growing a pure git repo with lots of files</summary>

#!/bin/bash
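# Time growing a git/datalad repo with many files (see invocations below).
# Usage: ./time-heavy-repo-splitindex.sh "<setup command or ''>" <ncommits> <ndirs> <nfiles> <git|datalad>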
set -eu

PS4="> "
#set -x

CMD="$1"
ncommits="$2"
ndirs="$3"
nfiles="$4"
how="$5"

rm -rf ${TMPDIR:-/tmp}/dl-split*
cd "$(mktemp -d ${TMPDIR:-/tmp}/dl-split-XXXXXXX)"

mkdir src

time (
cd src
git init

if [ -n "$CMD" ]; then
  bash -c "$CMD"
fi

#set +x
fileno=1
for commit in `seq 1 $ncommits`; do
  #echo "Commit $commit"

  for d in `seq 1 $ndirs`; do
    mkdir -p "d$d"
    for f in `seq 1 $nfiles`; do
      fileno=$(( $fileno + 1 ))
      filename="d$d/$fileno"
      echo "$filename" > "$filename"
    done
  done
  case "$how" in
  git)
    git add *
    git commit --quiet -m "Committing commit #$commit"
    ;;
  datalad)
    datalad save -m "Committing save #$commit" >/dev/null
    ;;
  *)
    echo "#how" >&2
    exit 1
  esac
done
)

pwd

</details>

and to my surprise

$> ./time-heavy-repo-splitindex.sh "" 4 100 100 datalad
Initialized empty Git repository in /home/yoh/.tmp/dl-split-C35kIBR/src/.git/

real	0m28.010s
user	0m45.178s
sys	0m3.911s
/home/yoh/.tmp/dl-split-C35kIBR

$> ./time-heavy-repo-splitindex.sh "git config core.splitIndex true" 4 100 100 datalad
Initialized empty Git repository in /home/yoh/.tmp/dl-split-xeF8RnA/src/.git/

real	0m32.981s
user	0m51.165s
sys	0m4.708s
/home/yoh/.tmp/dl-split-xeF8RnA

so we only got slower, and the same applies to pure git

$> ./time-heavy-repo-splitindex.sh "" 4 100 100 git                                   
Initialized empty Git repository in /home/yoh/.tmp/dl-split-zRn2sip/src/.git/

real	0m4.935s
user	0m1.874s
sys	0m3.255s
/home/yoh/.tmp/dl-split-zRn2sip

$> ./time-heavy-repo-splitindex.sh "git config core.splitIndex true" 4 100 100 git    
Initialized empty Git repository in /home/yoh/.tmp/dl-split-VYuIbb7/src/.git/

real	0m5.164s
user	0m1.824s
sys	0m3.492s
/home/yoh/.tmp/dl-split-VYuIbb7

so the question is: when is it actually a useful feature to use, if ever?

closed time in 13 hours

yarikoptic

issue comment datalad/datalad

Q: when to set/use git's splitIndex?

Thank you @kyleam . So it should be useful for large datasets when changing only a tiny subset of files per commit.

yarikoptic

comment created time in 13 hours

push event dandi/dandi-api

Yaroslav Halchenko

commit sha 68fbd4db10eb4d23c3b0f649f73879d557be3e56

RF: move globally used regex definitions into dandiapi.consts I also removed the binding of SHA256_REGEX to Asset -- it is just a global thing. Asset.UUID_REGEX I left bound, since we might want to change, specifically for Asset, what kind of UUID it uses.

view details

push time in 17 hours

push event dandi/dandi-api

Yaroslav Halchenko

commit sha 6816e57e64ebff7b99770df93547eea31c2c82ca

BF: specify dandiapi not dandi module for pytest testing

view details

Yaroslav Halchenko

commit sha f0498fdecb11c8b6a65b169b0cdc4c59df40ed2e

RF: place testing dependencies into the "dev" extra and use that one in tox.ini

view details

push time in 17 hours

issue comment datalad/datalad

Add github workflow to test on OSX

yes, for this repo we all submit PRs from "personal" forks. Thank you in advance!

yarikoptic

comment created time in 18 hours

pull request comment datalad/datalad

ENH+RF: @try_multiple_dec_s3 to retry interaction with S3 upon "retriable" errors

grr, still not there...

datalad.downloaders.credentials {2375010}[DEBUG] Credential NDA:1 will expire in 11.96h
datalad.s3      {2375010}[Level 5] Empty error_code in S3ResponseError: 400 Bad Request

datalad.s3      {2375010}[DEBUG] Caught S3ResponseError: 400 Bad Request
 [bucket.py:_get_key_internal:231] on trial #0. Sleeping 2.000000 and retrying
datalad.s3      {2375010}[Level 5] Empty error_code in S3ResponseError: 400 Bad Request

datalad.s3      {2375010}[DEBUG] Caught S3ResponseError: 400 Bad Request
 [bucket.py:_get_key_internal:231] on trial #1. Sleeping 4.000000 and retrying
datalad.s3      {2375010}[Level 5] Empty error_code in S3ResponseError: 400 Bad Request

datalad.s3      {2375010}[DEBUG] Caught S3ResponseError: 400 Bad Request
 [bucket.py:_get_key_internal:231] on trial #2. Sleeping 8.000000 and retrying
datalad.s3      {2375010}[Level 5] Empty error_code in S3ResponseError: 400 Bad Request

datalad.downloaders.credentials {2375010}[DEBUG] Credential NDA:1 will expire in 11.96h
datalad.customremotes {2375010}[DEBUG] Failed to check url s3://NDAR_Central_3/XXX: S3ResponseError: 400 Bad Request
|  [s3.py:get_downloader_session:262]
datalad.customremotes {2375010}[Level 4] Sending 'DEBUG Failed to check url s3://NDAR_Central_3/XXX: S3 refused to provide the key for XXX from url s3://NDAR_Central_3/XXX: S3ResponseError: 400 Bad Request\\n [s3.py:get_downloader_session:262]'
datalad.customremotes {2375010}[Level 4] Sending 'CHECKURL-FAILURE'

11.96h suggests that it might still be some timezone issue -- too close to an even hour... I will check that first

yarikoptic

comment created time in 18 hours

push event datalad/datalad

Yaroslav Halchenko

commit sha 756aeef1ee5d6a904f67f9707119235abb049224

ENH: add --debug also to batched annexes if loglevel <= 8

view details

Yaroslav Halchenko

commit sha ada828214c4462b2e2016d029b504d774f1fce25

ENH: log up to 100 last lines of stderr (if log outputs) for batched processes

I am chasing some bug in datalad/git-annex where batched addurls eventually crashes with

datalad.cmd [DEBUG ] Closing stdin of <subprocess.Popen object at 0x7fa14a7cb940> and waiting process to finish
datalad.utils [WARNING] Caught Command '['git', 'annex', 'addurl', '--fast', '--with-files', '--json', '--json-error-messages', '--batch']' timed out after 3.0 seconds [subprocess.py:_wait:1616] on trial #1. Sleeping 1.000000 and retrying
datalad.utils [WARNING] Caught Command '['git', 'annex', 'addurl', '--fast', '--with-files', '--json', '--json-error-messages', '--batch']' timed out after 3.0 seconds [subprocess.py:_wait:1616] on trial #2. Sleeping 1.000000 and retrying
datalad.cmd [WARNING] Batched process <subprocess.Popen object at 0x7fa14a7cb940> did not finish, abandoning it without killing it

Traceback (most recent call last):
  File "datalad-nda/scripts/datalad-nda", line 416, in <module>
    main()
  File "/usr/lib/python3/dist-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/usr/lib/python3/dist-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/usr/lib/python3/dist-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/lib/python3/dist-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/lib/python3/dist-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "datalad-nda/scripts/datalad-nda", line 236, in add2datalad
    on_failure="stop",
  File "/mnt/scrap/tmp/abcd/datalad/datalad/distribution/dataset.py", line 503, in apply_func
    return f(**kwargs)
  File "/mnt/scrap/tmp/abcd/datalad/datalad/interface/utils.py", line 481, in eval_func
    return return_func(generator_func)(*args, **kwargs)
  File "/mnt/scrap/tmp/abcd/datalad/datalad/interface/utils.py", line 469, in return_func
    results = list(results)
  File "/mnt/scrap/tmp/abcd/datalad/datalad/interface/utils.py", line 456, in generator_func
    msg="Command did not complete successfully")
datalad.support.exceptions.IncompleteResultsError: Command did not complete successfully. 1 failed:
[{'action': 'addurls',
  'message': "AnnexBatchCommandError: 'addurl' [Error, annex reported failure "
             'for addurl '
             "(url='s3://NDAR_Central_2/submission_23229/derivatives/abcd-hcp-pipeline/sub-XXX/ses-baselineYear1Arm1/img/DVARS_and_FD_task-SST01.png'): "
             "{'command': 'addurl', 'success': False, 'error-messages': [' "
             "unable to use special remote due to protocol error'], 'file': "
             "'sub-XXX/ses-baselineYear1Arm1/img/DVARS_and_FD_task-SST01.png'}] "
             '[annexrepo.py:add_url_to_file:1879]',
  'path': '/mnt/scrap/tmp/abcd/testds-fast-full2/derivatives/abcd-hcp-pipeline/sub-XXX/ses-baselineYear1Arm1/img/DVARS_and_FD_task-SST01.png',
  'status': 'error',
  'type': 'file'}]
> /mnt/scrap/tmp/abcd/datalad/datalad/interface/utils.py(456)generator_func()
-> msg="Command did not complete successfully")

I have no clue what is going on, and since it is a sandwich of datalad/git-annex/git-annex-remote-datalad-archives here, it is hard to impossible to see what is happening. Logging the last lines from stderr might give a clue if it includes relevant log lines from the special remote log etc.

view details
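
The "last N lines" idea from the first commit above, as a sketch assuming a line-by-line stderr reader (not datalad's actual implementation):

from collections import deque

class StderrTail:
    # keep only the most recent `maxlines` lines; older ones fall off the left
    def __init__(self, maxlines=100):
        self._lines = deque(maxlen=maxlines)

    def add(self, line):
        self._lines.append(line)

    def dump(self):
        return "".join(self._lines)

tail = StderrTail(maxlines=2)
for line in ("one\n", "two\n", "three\n"):
    tail.add(line)
assert tail.dump() == "two\nthree\n"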

Yaroslav Halchenko

commit sha e821fdde028c3e97059e3b7bd75c70d447b4253a

ENH: log all stderr lines not just last 100

view details

Yaroslav Halchenko

commit sha 021aa5faeba9be73fe2bf654c7bd602c3652f294

Merge pull request #4937 from yarikoptic/enh-debug-batched-annex ENH: make it possible to debug batched annex

view details

push time in 18 hours

PR merged datalad/datalad

ENH: make it possible to debug batched annex DX
  • add --debug option to annex CMD --batch invocation when level <= 8, analogously to what we do in main AnnexRepo "good old runner"
  • incorporate and supersede #4925 - log all stderr lines if logging of outputs was requested and log level <= 5
    • 100 is just often is not enough to troubleshoot some issue which has happened some time before (crash of special remote etc) the batch process actually finished

I hope these two changes will make debugging of annex, and of the external remotes it starts, more manageable. ATM it is either a pain or just impossible.

+27 -5

1 comment

2 changed files

yarikoptic

pr closed time in 18 hours

issue opened datalad/datalad

Q: when to set/use git's splitIndex?


created time in a day

pull request comment datalad/datalad-extensions

Combined workflow for building git-annex on Ubuntu and macOS

woohoo!

  • now there are artifacts available for download for OSX as well
  • the one run (release) which did get to test datalad on OSX does get docker installed and proceeds with tests, resulting in "(SKIP=132, errors=9, failures=2)": filed https://github.com/datalad/datalad/issues/4941 and https://github.com/datalad/datalad/issues/4942 which I hope you could help with, @jwodder, while all the github/OSX activity is ongoing.

Re "docker machine": it might be due to still open https://github.com/docker/machine/issues/2296 and may be the workaround of https://github.com/docker/machine/issues/2765#issuecomment-179470417 could help?

but overall: since we know that datalad still needs fixups, while the build on OSX and the testing of annex succeed, let's just disable testing of datalad on OSX altogether for now and merge this, then initiate a subsequent PR on top to poke at periodically, to make sure that DataLad is getting green.

(Note that badges in README.md might need to be adjusted since the workflow got renamed -- see https://github.com/datalad/datalad-extensions/blob/master/CONTRIBUTING.md on how to regenerate them after the tune-up.)

jwodder

comment created time in a day

issue opened datalad/datalad

Add github workflow to test on OSX

In the light of #4941 I think it would be very useful to get at least some unittests (datalad.support and datalad.core at least, or all which are not slow or turtle?) tested on OSX on GitHub Actions. Then PRs addressing failures listed in #4941 could expand coverage to cover what is fixed up.

For now git-annex could be installed from the officially distributed .dmg, but eventually I hope that we could provide some builds straight from https://github.com/datalad/datalad-extensions/ .

created time in a day

Member event

issue opened datalad/datalad

Results from running tests on OSX: SKIP=132, errors=9, failures=2

A (lengthy) extract from the full log (until it expires) of running a freshly built git-annex on OSX within GitHub Actions, testing released (0.13.3 ATM) datalad:

<details> <summary>(SKIP=132, errors=9, failures=2)</summary>

2020-09-21T18:02:00.4944200Z ======================================================================
2020-09-21T18:02:00.4945080Z ERROR: datalad.core.distributed.tests.test_push.test_push_git_annex_branch_many_paths_same_data
2020-09-21T18:02:00.4946720Z ----------------------------------------------------------------------
2020-09-21T18:02:00.4947320Z Traceback (most recent call last):
2020-09-21T18:02:00.4949110Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/nose/case.py", line 198, in runTest
2020-09-21T18:02:00.4950020Z     self.test(*self.arg)
2020-09-21T18:02:00.4951520Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/tests/utils.py", line 554, in _wrap_with_tree
2020-09-21T18:02:00.4952610Z     return t(*(arg + (d,)), **kw)
2020-09-21T18:02:00.4955570Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/core/distributed/tests/test_push.py", line 813, in test_push_git_annex_branch_many_paths_same_data
2020-09-21T18:02:00.4957290Z     res = ds.push(to="target")
2020-09-21T18:02:00.4960850Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/distribution/dataset.py", line 503, in apply_func
2020-09-21T18:02:00.4962000Z     return f(**kwargs)
2020-09-21T18:02:00.4964130Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 494, in eval_func
2020-09-21T18:02:00.4965360Z     return return_func(generator_func)(*args, **kwargs)
2020-09-21T18:02:00.4967570Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 482, in return_func
2020-09-21T18:02:00.4968710Z     results = list(results)
2020-09-21T18:02:00.4970770Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 413, in generator_func
2020-09-21T18:02:00.4971890Z     allkwargs):
2020-09-21T18:02:00.4973910Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 552, in _process_results
2020-09-21T18:02:00.4975040Z     for res in results:
2020-09-21T18:02:00.4977040Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/core/distributed/push.py", line 263, in __call__
2020-09-21T18:02:00.4978690Z     got_path_arg=True if path else False)
2020-09-21T18:02:00.4981360Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/core/distributed/push.py", line 579, in _push
2020-09-21T18:02:00.4982490Z     got_path_arg=got_path_arg,
2020-09-21T18:02:00.4984550Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/core/distributed/push.py", line 840, in _push_data
2020-09-21T18:02:00.4985670Z     stdin=file_list)
2020-09-21T18:02:00.4987470Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/cmd.py", line 508, in run
2020-09-21T18:02:00.4988430Z     **results,
2020-09-21T18:02:00.4992110Z datalad.support.exceptions.CommandError: CommandError: 'git annex copy --batch -z --to target --json --json-error-messages --json-progress --fast' failed with exitcode 1 under /var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_push_git_annex_branch_many_paths_same_dataam8xu4if/ds [err: 'git-annex: copy: 3 failed'] [info keys: stdout_json]
2020-09-21T18:02:00.4995320Z -------------------- >> begin captured logging << --------------------
2020-09-21T18:02:00.4996050Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.4996870Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.4998060Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.4998860Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.4999660Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5001750Z datalad.utils: DEBUG: Determined class of decorated function: <class 'datalad.core.distributed.push.Push'>
2020-09-21T18:02:00.5003760Z datalad.dataset: DEBUG: Resolved dataset for pushing: /var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_push_git_annex_branch_many_paths_same_dataam8xu4if/ds
2020-09-21T18:02:00.5005900Z datalad.dataset: DEBUG: Resolved dataset for difference reporting: /var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_push_git_annex_branch_many_paths_same_dataam8xu4if/ds
2020-09-21T18:02:00.5008960Z datalad.core.local.diff: DEBUG: Diff Dataset(/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_push_git_annex_branch_many_paths_same_dataam8xu4if/ds) from 'None' to 'HEAD'
2020-09-21T18:02:00.5011140Z datalad.gitrepo: DEBUG: AnnexRepo(/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_push_git_annex_branch_many_paths_same_dataam8xu4if/ds).get_content_info(...)
2020-09-21T18:02:00.5013370Z datalad.gitrepo: DEBUG: Query repo: ['git', 'ls-tree', 'HEAD', '-z', '-r', '--full-tree', '-l']
2020-09-21T18:02:00.5015080Z datalad.gitrepo: DEBUG: Done query repo: ['git', 'ls-tree', 'HEAD', '-z', '-r', '--full-tree', '-l']
2020-09-21T18:02:00.5016670Z datalad.gitrepo: DEBUG: Done AnnexRepo(/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_push_git_annex_branch_many_paths_same_dataam8xu4if/ds).get_content_info(...)
2020-09-21T18:02:00.5018960Z datalad.core.distributed.push: DEBUG: Attempt push of Dataset at /var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_push_git_annex_branch_many_paths_same_dataam8xu4if/ds
2020-09-21T18:02:00.5021030Z datalad.core.distributed.push: INFO: Determine push target
2020-09-21T18:02:00.5022260Z datalad.core.distributed.push: INFO: Push refspecs
2020-09-21T18:02:00.5023270Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5024900Z datalad.cmd: DEBUG: Async run ['git', 'push', '--progress', '--porcelain', '--dry-run', 'target']
2020-09-21T18:02:00.5026720Z datalad.cmd: DEBUG: Launching process ['git', 'push', '--progress', '--porcelain', '--dry-run', 'target']
2020-09-21T18:02:00.5027660Z datalad.cmd: DEBUG: Process 63755 started
2020-09-21T18:02:00.5028400Z datalad.cmd: DEBUG: Waiting for process 63755 to complete
2020-09-21T18:02:00.5030280Z datalad.gitrepo: DEBUG: Non-progress stderr: b'fatal: The current branch dl-test-branch has no upstream branch.\n'
2020-09-21T18:02:00.5032790Z datalad.gitrepo: DEBUG: Non-progress stderr: b'To push the current branch and set the remote as upstream, use\n'
2020-09-21T18:02:00.5034600Z datalad.gitrepo: DEBUG: Non-progress stderr: b'\n'
2020-09-21T18:02:00.5036400Z datalad.gitrepo: DEBUG: Non-progress stderr: b'    git push --set-upstream target dl-test-branch\n'
2020-09-21T18:02:00.5038100Z datalad.gitrepo: DEBUG: Non-progress stderr: b'\n'
2020-09-21T18:02:00.5038960Z datalad.cmd: DEBUG: Process 63755 exited with return code 128
2020-09-21T18:02:00.5042190Z datalad.core.distributed.push: DEBUG: Dry-run push to check push configuration failed, assume no configuration: CommandError: 'git push --progress --porcelain --dry-run target' failed with exitcode 128 under /private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_push_git_annex_branch_many_paths_same_dataam8xu4if/ds [err: 'fatal: The current branch dl-test-branch has no upstream branch.
2020-09-21T18:02:00.5043810Z To push the current branch and set the remote as upstream, use
2020-09-21T18:02:00.5044120Z 
2020-09-21T18:02:00.5044880Z     git push --set-upstream target dl-test-branch']
2020-09-21T18:02:00.5045620Z datalad.core.distributed.push: DEBUG: No refspecs configured for push, attempting to use active branch
2020-09-21T18:02:00.5046440Z datalad.annex: DEBUG: No sync necessary, no corresponding branch detected
2020-09-21T18:02:00.5047390Z datalad.core.distributed.push: INFO: Transfer data
2020-09-21T18:02:00.5048970Z datalad.core.distributed.push: DEBUG: Push data from Dataset(/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_push_git_annex_branch_many_paths_same_dataam8xu4if/ds) to 'target'
2020-09-21T18:02:00.5050120Z datalad.core.distributed.push: DEBUG: Counted 3 bytes of annex data to transfer
2020-09-21T18:02:00.5050760Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5051850Z datalad.cmd: DEBUG: Async run ['git', 'annex', 'copy', '--batch', '-z', '--to', 'target', '--json', '--json-error-messages', '--json-progress', '--fast']
2020-09-21T18:02:00.5054660Z datalad.cmd: DEBUG: Launching process ['git', 'annex', 'copy', '--batch', '-z', '--to', 'target', '--json', '--json-error-messages', '--json-progress', '--fast']
2020-09-21T18:02:00.5055370Z datalad.cmd: DEBUG: Process 63814 started
2020-09-21T18:02:00.5055800Z datalad.annex: INFO: Start annex operation
2020-09-21T18:02:00.5056260Z datalad.cmd: DEBUG: Waiting for process 63814 to complete
2020-09-21T18:02:00.5056650Z datalad.annex: INFO: f0
2020-09-21T18:02:00.5056970Z datalad.annex: INFO: f3
2020-09-21T18:02:00.5057330Z datalad.annex: INFO: Finished annex copy
2020-09-21T18:02:00.5057780Z datalad.cmd: DEBUG: Process 63814 exited with return code 1
2020-09-21T18:02:00.5058610Z --------------------- >> end captured logging << ---------------------
2020-09-21T18:02:00.5058850Z 
2020-09-21T18:02:00.5059050Z ======================================================================
2020-09-21T18:02:00.5059640Z ERROR: datalad.distribution.tests.test_create_sibling.test_target_ssh_recursive(False,)
2020-09-21T18:02:00.5060600Z ----------------------------------------------------------------------
2020-09-21T18:02:00.5060910Z Traceback (most recent call last):
2020-09-21T18:02:00.5062670Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/nose/case.py", line 198, in runTest
2020-09-21T18:02:00.5063690Z     self.test(*self.arg)
2020-09-21T18:02:00.5077200Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/tests/utils.py", line 1056, in _wrap_with_testrepos
2020-09-21T18:02:00.5077840Z     t(*(arg + (uri,)), **kw)
2020-09-21T18:02:00.5079060Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/tests/utils.py", line 731, in _wrap_with_tempfile
2020-09-21T18:02:00.5079710Z     return t(*(arg + (filename,)), **kw)
2020-09-21T18:02:00.5080880Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/tests/utils.py", line 731, in _wrap_with_tempfile
2020-09-21T18:02:00.5082080Z     return t(*(arg + (filename,)), **kw)
2020-09-21T18:02:00.5083450Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/distribution/tests/test_create_sibling.py", line 376, in check_target_ssh_recursive
2020-09-21T18:02:00.5084170Z     ui=True)
2020-09-21T18:02:00.5085310Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/tests/utils.py", line 1516, in _wrap_assert_no_errors_logged
2020-09-21T18:02:00.5085960Z     out = func(*args, **kwargs)
2020-09-21T18:02:00.5087060Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 494, in eval_func
2020-09-21T18:02:00.5087750Z     return return_func(generator_func)(*args, **kwargs)
2020-09-21T18:02:00.5088930Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 482, in return_func
2020-09-21T18:02:00.5089560Z     results = list(results)
2020-09-21T18:02:00.5090840Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 469, in generator_func
2020-09-21T18:02:00.5091560Z     msg="Command did not complete successfully")
2020-09-21T18:02:00.5092540Z datalad.support.exceptions.IncompleteResultsError: Command did not complete successfully. 1 failed:
2020-09-21T18:02:00.5094130Z [{'action': 'create_sibling',
2020-09-21T18:02:00.5094940Z   'message': ('failed to push web interface to the remote datalad repository '
2020-09-21T18:02:00.5095640Z               '(%s)',
2020-09-21T18:02:00.5096270Z               "CommandError: 'cp --recursive "
2020-09-21T18:02:00.5097270Z               '/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/resources/website/assets '
2020-09-21T18:02:00.5098630Z               "/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_check_target_ssh_recursiven3pcqapb-False/.git/datalad/web' "
2020-09-21T18:02:00.5099720Z               "failed with exitcode 64 [err: 'cp: illegal option -- -\n"
2020-09-21T18:02:00.5100490Z               'usage: cp [-R [-H | -L | -P]] [-fi | -n] [-apvXc] source_file '
2020-09-21T18:02:00.5101130Z               'target_file\n'
2020-09-21T18:02:00.5101760Z               '       cp [-R [-H | -L | -P]] [-fi | -n] [-apvXc] source_file '
2020-09-21T18:02:00.5102480Z               "... target_directory'] [cmd.py:run:961]"),
2020-09-21T18:02:00.5103500Z   'orig_request': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_check_target_ssh_recursiveioyatda9',
2020-09-21T18:02:00.5104770Z   'path': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_check_target_ssh_recursiveioyatda9',
2020-09-21T18:02:00.5105640Z   'raw_input': True,
2020-09-21T18:02:00.5106590Z   'refds': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_check_target_ssh_recursiveioyatda9',
2020-09-21T18:02:00.5107450Z   'status': 'error',
2020-09-21T18:02:00.5108010Z   'type': 'dataset'}]
2020-09-21T18:02:00.5108670Z -------------------- >> begin captured logging << --------------------
2020-09-21T18:02:00.5109080Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5109550Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5110070Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5110840Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5111640Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5112430Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5113320Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5114120Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5114910Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5115680Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5116450Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5117220Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5118770Z --------------------- >> end captured logging << ---------------------
2020-09-21T18:02:00.5119160Z 
2020-09-21T18:02:00.5119500Z ======================================================================
2020-09-21T18:02:00.5120540Z ERROR: datalad.distribution.tests.test_create_sibling.test_replace_and_relative_sshpath(False,)
2020-09-21T18:02:00.5121610Z ----------------------------------------------------------------------
2020-09-21T18:02:00.5121910Z Traceback (most recent call last):
2020-09-21T18:02:00.5123020Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/nose/case.py", line 198, in runTest
2020-09-21T18:02:00.5123600Z     self.test(*self.arg)
2020-09-21T18:02:00.5124730Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/tests/utils.py", line 731, in _wrap_with_tempfile
2020-09-21T18:02:00.5125360Z     return t(*(arg + (filename,)), **kw)
2020-09-21T18:02:00.5126500Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/tests/utils.py", line 731, in _wrap_with_tempfile
2020-09-21T18:02:00.5127150Z     return t(*(arg + (filename,)), **kw)
2020-09-21T18:02:00.5128460Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/distribution/tests/test_create_sibling.py", line 534, in check_replace_and_relative_sshpath
2020-09-21T18:02:00.5129540Z     ds.create_sibling(url, ui=True)
2020-09-21T18:02:00.5131140Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/distribution/dataset.py", line 503, in apply_func
2020-09-21T18:02:00.5131790Z     return f(**kwargs)
2020-09-21T18:02:00.5132910Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 494, in eval_func
2020-09-21T18:02:00.5133720Z     return return_func(generator_func)(*args, **kwargs)
2020-09-21T18:02:00.5135000Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 482, in return_func
2020-09-21T18:02:00.5135700Z     results = list(results)
2020-09-21T18:02:00.5136900Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 469, in generator_func
2020-09-21T18:02:00.5137630Z     msg="Command did not complete successfully")
2020-09-21T18:02:00.5138620Z datalad.support.exceptions.IncompleteResultsError: Command did not complete successfully. 1 failed:
2020-09-21T18:02:00.5139900Z [{'action': 'create_sibling',
2020-09-21T18:02:00.5140700Z   'message': ('failed to push web interface to the remote datalad repository '
2020-09-21T18:02:00.5141400Z               '(%s)',
2020-09-21T18:02:00.5142020Z               "CommandError: 'cp --recursive "
2020-09-21T18:02:00.5143020Z               '/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/resources/website/assets '
2020-09-21T18:02:00.5144340Z               "/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_check_replace_and_relative_sshpathfjgsv4je/.git/datalad/web' "
2020-09-21T18:02:00.5145390Z               "failed with exitcode 64 [err: 'cp: illegal option -- -\n"
2020-09-21T18:02:00.5146160Z               'usage: cp [-R [-H | -L | -P]] [-fi | -n] [-apvXc] source_file '
2020-09-21T18:02:00.5146800Z               'target_file\n'
2020-09-21T18:02:00.5147460Z               '       cp [-R [-H | -L | -P]] [-fi | -n] [-apvXc] source_file '
2020-09-21T18:02:00.5148160Z               "... target_directory'] [cmd.py:run:961]"),
2020-09-21T18:02:00.5149200Z   'orig_request': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_check_replace_and_relative_sshpaths3pivliy',
2020-09-21T18:02:00.5150510Z   'path': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_check_replace_and_relative_sshpaths3pivliy',
2020-09-21T18:02:00.5151380Z   'raw_input': True,
2020-09-21T18:02:00.5152840Z   'refds': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_check_replace_and_relative_sshpaths3pivliy',
2020-09-21T18:02:00.5153960Z   'status': 'error',
2020-09-21T18:02:00.5154560Z   'type': 'dataset'}]
2020-09-21T18:02:00.5155190Z -------------------- >> begin captured logging << --------------------
2020-09-21T18:02:00.5155630Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5156100Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5156560Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5157020Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5157480Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5158230Z --------------------- >> end captured logging << ---------------------
2020-09-21T18:02:00.5158450Z 
2020-09-21T18:02:00.5158650Z ======================================================================
2020-09-21T18:02:00.5159260Z ERROR: datalad.distribution.tests.test_create_sibling.test_check_exists_interactive(False,)
2020-09-21T18:02:00.5160350Z ----------------------------------------------------------------------
2020-09-21T18:02:00.5160660Z Traceback (most recent call last):
2020-09-21T18:02:00.5161700Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/nose/case.py", line 198, in runTest
2020-09-21T18:02:00.5162550Z     self.test(*self.arg)
2020-09-21T18:02:00.5163670Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/tests/utils.py", line 1495, in _wrap_with_testsui
2020-09-21T18:02:00.5164280Z     ret = t(*args, **kwargs)
2020-09-21T18:02:00.5165340Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/tests/utils.py", line 731, in _wrap_with_tempfile
2020-09-21T18:02:00.5165980Z     return t(*(arg + (filename,)), **kw)
2020-09-21T18:02:00.5167220Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/distribution/tests/test_create_sibling.py", line 721, in check_exists_interactive
2020-09-21T18:02:00.5168390Z     origin.create_sibling(sshurl, existing='replace')
2020-09-21T18:02:00.5169590Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/distribution/dataset.py", line 503, in apply_func
2020-09-21T18:02:00.5170220Z     return f(**kwargs)
2020-09-21T18:02:00.5171260Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 494, in eval_func
2020-09-21T18:02:00.5171950Z     return return_func(generator_func)(*args, **kwargs)
2020-09-21T18:02:00.5173070Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 482, in return_func
2020-09-21T18:02:00.5173700Z     results = list(results)
2020-09-21T18:02:00.5174780Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 413, in generator_func
2020-09-21T18:02:00.5175390Z     allkwargs):
2020-09-21T18:02:00.5176460Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 552, in _process_results
2020-09-21T18:02:00.5177090Z     for res in results:
2020-09-21T18:02:00.5178170Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/distribution/create_sibling.py", line 761, in __call__
2020-09-21T18:02:00.5178800Z     inherit
2020-09-21T18:02:00.5179930Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/distribution/create_sibling.py", line 256, in _create_dataset_sibling
2020-09-21T18:02:00.5181050Z     shell("chmod +r+w -R {}".format(sh_quote(remoteds_path)))
2020-09-21T18:02:00.5182090Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/cmd.py", line 599, in __call__
2020-09-21T18:02:00.5182690Z     return self.run(cmd, *args, **kwargs)
2020-09-21T18:02:00.5183690Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/cmd.py", line 961, in run
2020-09-21T18:02:00.5184490Z     raise exc
2020-09-21T18:02:00.5186210Z datalad.support.exceptions.CommandError: CommandError: ''chmod +r+w -R /private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_check_exists_interactivep0u9h4rv/sibling'' failed with exitcode 1 [err: 'chmod: -R: No such file or directory']
2020-09-21T18:02:00.5187780Z -------------------- >> begin captured logging << --------------------
2020-09-21T18:02:00.5188200Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5188680Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5189430Z --------------------- >> end captured logging << ---------------------
2020-09-21T18:02:00.5189650Z 
2020-09-21T18:02:00.5189850Z ======================================================================
2020-09-21T18:02:00.5190470Z ERROR: datalad.distribution.tests.test_get.test_get_recurse_dirs
2020-09-21T18:02:00.5191360Z ----------------------------------------------------------------------
2020-09-21T18:02:00.5191670Z Traceback (most recent call last):
2020-09-21T18:02:00.5192700Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/nose/case.py", line 198, in runTest
2020-09-21T18:02:00.5193270Z     self.test(*self.arg)
2020-09-21T18:02:00.5194620Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/tests/utils.py", line 554, in _wrap_with_tree
2020-09-21T18:02:00.5195200Z     return t(*(arg + (d,)), **kw)
2020-09-21T18:02:00.5196420Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/tests/utils.py", line 731, in _wrap_with_tempfile
2020-09-21T18:02:00.5197140Z     return t(*(arg + (filename,)), **kw)
2020-09-21T18:02:00.5198320Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/distribution/tests/test_get.py", line 332, in test_get_recurse_dirs
2020-09-21T18:02:00.5199330Z     result = ds.get('subdir')
2020-09-21T18:02:00.5200430Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/distribution/dataset.py", line 503, in apply_func
2020-09-21T18:02:00.5201070Z     return f(**kwargs)
2020-09-21T18:02:00.5202090Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 494, in eval_func
2020-09-21T18:02:00.5202790Z     return return_func(generator_func)(*args, **kwargs)
2020-09-21T18:02:00.5203910Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 482, in return_func
2020-09-21T18:02:00.5204540Z     results = list(results)
2020-09-21T18:02:00.5205620Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 469, in generator_func
2020-09-21T18:02:00.5206330Z     msg="Command did not complete successfully")
2020-09-21T18:02:00.5207320Z datalad.support.exceptions.IncompleteResultsError: Command did not complete successfully. 4 failed:
2020-09-21T18:02:00.5208530Z [{'action': 'get',
2020-09-21T18:02:00.5209380Z   'annexkey': 'MD5E-s14--6c7ba9c5a141421e1c03cb9807c97c74.txt',
2020-09-21T18:02:00.5210570Z   'message': '../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_get_recurse_dirsjiw2wxc5/.git/annex: '
2020-09-21T18:02:00.5211630Z              'createDirectory: does not exist (No such file or directory)\n'
2020-09-21T18:02:00.5212710Z              '../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_get_recurse_dirsjiw2wxc5/.git/annex: '
2020-09-21T18:02:00.5213730Z              'createDirectory: does not exist (No such file or directory)\n'
2020-09-21T18:02:00.5214800Z              '../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_get_recurse_dirsjiw2wxc5/.git/annex: '
2020-09-21T18:02:00.5215810Z              'createDirectory: does not exist (No such file or directory)',
2020-09-21T18:02:00.5217150Z   'path': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_get_recurse_dirso34eahkx/subdir/file2.txt',
2020-09-21T18:02:00.5218440Z   'refds': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_get_recurse_dirso34eahkx',
2020-09-21T18:02:00.5219280Z   'status': 'error',
2020-09-21T18:02:00.5219830Z   'type': 'file'},
2020-09-21T18:02:00.5220480Z  {'action': 'get',
2020-09-21T18:02:00.5221310Z   'annexkey': 'MD5E-s30--c22e53b13855f9e313a02bcb7c26a7cc.txt',
2020-09-21T18:02:00.5222490Z   'message': '../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_get_recurse_dirsjiw2wxc5/.git/annex: '
2020-09-21T18:02:00.5223530Z              'createDirectory: does not exist (No such file or directory)\n'
2020-09-21T18:02:00.5224600Z              '../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_get_recurse_dirsjiw2wxc5/.git/annex: '
2020-09-21T18:02:00.5225630Z              'createDirectory: does not exist (No such file or directory)\n'
2020-09-21T18:02:00.5226720Z              '../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_get_recurse_dirsjiw2wxc5/.git/annex: '
2020-09-21T18:02:00.5227740Z              'createDirectory: does not exist (No such file or directory)',
2020-09-21T18:02:00.5229170Z   'path': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_get_recurse_dirso34eahkx/subdir/subsubdir/file3.txt',
2020-09-21T18:02:00.5230420Z   'refds': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_get_recurse_dirso34eahkx',
2020-09-21T18:02:00.5231250Z   'status': 'error',
2020-09-21T18:02:00.5231800Z   'type': 'file'},
2020-09-21T18:02:00.5232330Z  {'action': 'get',
2020-09-21T18:02:00.5233130Z   'annexkey': 'MD5E-s9--437b930db84b8079c2dd804a71936b5f.txt',
2020-09-21T18:02:00.5234290Z   'message': '../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_get_recurse_dirsjiw2wxc5/.git/annex: '
2020-09-21T18:02:00.5235340Z              'createDirectory: does not exist (No such file or directory)\n'
2020-09-21T18:02:00.5236420Z              '../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_get_recurse_dirsjiw2wxc5/.git/annex: '
2020-09-21T18:02:00.5237450Z              'createDirectory: does not exist (No such file or directory)\n'
2020-09-21T18:02:00.5238530Z              '../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_get_recurse_dirsjiw2wxc5/.git/annex: '
2020-09-21T18:02:00.5239550Z              'createDirectory: does not exist (No such file or directory)',
2020-09-21T18:02:00.5240680Z   'path': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_get_recurse_dirso34eahkx/subdir/subsubdir/file4.txt',
2020-09-21T18:02:00.5241930Z   'refds': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_get_recurse_dirso34eahkx',
2020-09-21T18:02:00.5242750Z   'status': 'error',
2020-09-21T18:02:00.5243300Z   'type': 'file'},
2020-09-21T18:02:00.5243840Z  {'action': 'get',
2020-09-21T18:02:00.5244490Z   'message': ('could not get some content in %s %s',
2020-09-21T18:02:00.5245460Z               '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_get_recurse_dirso34eahkx/subdir',
2020-09-21T18:02:00.5246700Z               ['/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_get_recurse_dirso34eahkx/subdir/file2.txt',
2020-09-21T18:02:00.5247990Z                '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_get_recurse_dirso34eahkx/subdir/subsubdir/file3.txt',
2020-09-21T18:02:00.5249320Z                '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_get_recurse_dirso34eahkx/subdir/subsubdir/file4.txt']),
2020-09-21T18:02:00.5250700Z   'path': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_get_recurse_dirso34eahkx/subdir',
2020-09-21T18:02:00.5252180Z   'refds': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_get_recurse_dirso34eahkx',
2020-09-21T18:02:00.5253120Z   'status': 'impossible',
2020-09-21T18:02:00.5253710Z   'type': 'directory'}]
2020-09-21T18:02:00.5254550Z -------------------- >> begin captured logging << --------------------
2020-09-21T18:02:00.5254980Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5255450Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5255900Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5256360Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5256820Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5257280Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5257740Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5258200Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5258650Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5259440Z --------------------- >> end captured logging << ---------------------
2020-09-21T18:02:00.5259660Z 
2020-09-21T18:02:00.5259860Z ======================================================================
2020-09-21T18:02:00.5260430Z ERROR: datalad.distribution.tests.test_install.test_install_recursive_repeat
2020-09-21T18:02:00.5266860Z ----------------------------------------------------------------------
2020-09-21T18:02:00.5267170Z Traceback (most recent call last):
2020-09-21T18:02:00.5268280Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/nose/case.py", line 198, in runTest
2020-09-21T18:02:00.5268850Z     self.test(*self.arg)
2020-09-21T18:02:00.5269920Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/tests/utils.py", line 554, in _wrap_with_tree
2020-09-21T18:02:00.5270520Z     return t(*(arg + (d,)), **kw)
2020-09-21T18:02:00.5271600Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/tests/utils.py", line 731, in _wrap_with_tempfile
2020-09-21T18:02:00.5272250Z     return t(*(arg + (filename,)), **kw)
2020-09-21T18:02:00.5273350Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/tests/utils.py", line 197, in _wrap_skip_if_on_windows
2020-09-21T18:02:00.5274000Z     return func(*args, **kwargs)
2020-09-21T18:02:00.5275230Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/distribution/tests/test_install.py", line 638, in test_install_recursive_repeat
2020-09-21T18:02:00.5276280Z     result_xfm='datasets')
2020-09-21T18:02:00.5277340Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 494, in eval_func
2020-09-21T18:02:00.5278030Z     return return_func(generator_func)(*args, **kwargs)
2020-09-21T18:02:00.5279180Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 482, in return_func
2020-09-21T18:02:00.5279810Z     results = list(results)
2020-09-21T18:02:00.5280890Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 469, in generator_func
2020-09-21T18:02:00.5281610Z     msg="Command did not complete successfully")
2020-09-21T18:02:00.5282610Z datalad.support.exceptions.IncompleteResultsError: Command did not complete successfully. 3 failed:
2020-09-21T18:02:00.5283830Z [{'action': 'get',
2020-09-21T18:02:00.5284680Z   'annexkey': 'MD5E-s4--03d59e663c1af9ac33a9949d1193505a.txt',
2020-09-21T18:02:00.5285910Z   'message': '../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_install_recursive_repeatzw3df6or/.git/annex: '
2020-09-21T18:02:00.5287010Z              'createDirectory: does not exist (No such file or directory)\n'
2020-09-21T18:02:00.5288410Z              '../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_install_recursive_repeatzw3df6or/.git/annex: '
2020-09-21T18:02:00.5289570Z              'createDirectory: does not exist (No such file or directory)\n'
2020-09-21T18:02:00.5290830Z              '../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_install_recursive_repeatzw3df6or/.git/annex: '
2020-09-21T18:02:00.5291920Z              'createDirectory: does not exist (No such file or directory)',
2020-09-21T18:02:00.5293070Z   'path': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_install_recursive_repeatud7nkz0f/top_file.txt',
2020-09-21T18:02:00.5294370Z   'refds': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_install_recursive_repeatud7nkz0f',
2020-09-21T18:02:00.5295240Z   'status': 'error',
2020-09-21T18:02:00.5295790Z   'type': 'file'},
2020-09-21T18:02:00.5296340Z  {'action': 'get',
2020-09-21T18:02:00.5297180Z   'annexkey': 'MD5E-s14--6c7ba9c5a141421e1c03cb9807c97c74.txt',
2020-09-21T18:02:00.5298410Z   'message': '../../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_install_recursive_repeatzw3df6or/sub '
2020-09-21T18:02:00.5299470Z              '1/.git/annex: createDirectory: does not exist (No such file or '
2020-09-21T18:02:00.5300190Z              'directory)\n'
2020-09-21T18:02:00.5301410Z              '../../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_install_recursive_repeatzw3df6or/sub '
2020-09-21T18:02:00.5302460Z              '1/.git/annex: createDirectory: does not exist (No such file or '
2020-09-21T18:02:00.5303180Z              'directory)\n'
2020-09-21T18:02:00.5304150Z              '../../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_install_recursive_repeatzw3df6or/sub '
2020-09-21T18:02:00.5305190Z              '1/.git/annex: createDirectory: does not exist (No such file or '
2020-09-21T18:02:00.5305900Z              'directory)',
2020-09-21T18:02:00.5306880Z   'path': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_install_recursive_repeatud7nkz0f/sub '
2020-09-21T18:02:00.5307790Z           '1/sub1file.txt',
2020-09-21T18:02:00.5308750Z   'refds': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_install_recursive_repeatud7nkz0f',
2020-09-21T18:02:00.5309640Z   'status': 'error',
2020-09-21T18:02:00.5310200Z   'type': 'file'},
2020-09-21T18:02:00.5310750Z  {'action': 'get',
2020-09-21T18:02:00.5311620Z   'annexkey': 'MD5E-s11--41fa7abec960e60a32c9d7a6e115da79.txt',
2020-09-21T18:02:00.5312880Z   'message': '../../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_install_recursive_repeatzw3df6or/sub '
2020-09-21T18:02:00.5313950Z              '2/.git/annex: createDirectory: does not exist (No such file or '
2020-09-21T18:02:00.5314670Z              'directory)\n'
2020-09-21T18:02:00.5315650Z              '../../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_install_recursive_repeatzw3df6or/sub '
2020-09-21T18:02:00.5316690Z              '2/.git/annex: createDirectory: does not exist (No such file or '
2020-09-21T18:02:00.5317390Z              'directory)\n'
2020-09-21T18:02:00.5318370Z              '../../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_install_recursive_repeatzw3df6or/sub '
2020-09-21T18:02:00.5319430Z              '2/.git/annex: createDirectory: does not exist (No such file or '
2020-09-21T18:02:00.5320260Z              'directory)',
2020-09-21T18:02:00.5321250Z   'path': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_install_recursive_repeatud7nkz0f/sub '
2020-09-21T18:02:00.5322140Z           '2/sub2file.txt',
2020-09-21T18:02:00.5323120Z   'refds': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_install_recursive_repeatud7nkz0f',
2020-09-21T18:02:00.5323990Z   'status': 'error',
2020-09-21T18:02:00.5324550Z   'type': 'file'}]
2020-09-21T18:02:00.5325430Z -------------------- >> begin captured logging << --------------------
2020-09-21T18:02:00.5325890Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5326360Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5326830Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5327290Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5327750Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5328210Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5328670Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5329110Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5329570Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5330030Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5330490Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5330950Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5331410Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5331880Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5332340Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5332790Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5333420Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5333880Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5334350Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5334810Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5335270Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5336410Z --------------------- >> end captured logging << ---------------------
2020-09-21T18:02:00.5336630Z 
2020-09-21T18:02:00.5336830Z ======================================================================
2020-09-21T18:02:00.5337320Z ERROR: datalad.interface.tests.test_rerun.test_run_inputs_outputs
2020-09-21T18:02:00.5338180Z ----------------------------------------------------------------------
2020-09-21T18:02:00.5338480Z Traceback (most recent call last):
2020-09-21T18:02:00.5339540Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/nose/case.py", line 198, in runTest
2020-09-21T18:02:00.5340120Z     self.test(*self.arg)
2020-09-21T18:02:00.5341190Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/tests/utils.py", line 554, in _wrap_with_tree
2020-09-21T18:02:00.5341780Z     return t(*(arg + (d,)), **kw)
2020-09-21T18:02:00.5342850Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/tests/utils.py", line 731, in _wrap_with_tempfile
2020-09-21T18:02:00.5343480Z     return t(*(arg + (filename,)), **kw)
2020-09-21T18:02:00.5344640Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/tests/test_rerun.py", line 626, in test_run_inputs_outputs
2020-09-21T18:02:00.5345780Z     inputs=["input.dat"], extra_inputs=["extra-input.dat"]))
2020-09-21T18:02:00.5346930Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/core/local/run.py", line 700, in run_command
2020-09-21T18:02:00.5347500Z     raise exc
2020-09-21T18:02:00.5348540Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/core/local/run.py", line 470, in _execute_command
2020-09-21T18:02:00.5349150Z     expect_fail=True,
2020-09-21T18:02:00.5350250Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/cmd.py", line 961, in run
2020-09-21T18:02:00.5350800Z     raise exc
2020-09-21T18:02:00.5352450Z datalad.support.exceptions.CommandError: CommandError: ''cat input.dat input.dat >doubled.dat'' failed with exitcode 1 under /private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_run_inputs_outputssvlxkihe
2020-09-21T18:02:00.5353990Z -------------------- >> begin captured logging << --------------------
2020-09-21T18:02:00.5354680Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5355210Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5355720Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5356210Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5356680Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5357140Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5357590Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5358050Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5358510Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5358970Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5359430Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5359890Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5360350Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5360820Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5361280Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5361730Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5362190Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5362850Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5363310Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5363770Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5364230Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5364690Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5365240Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5365700Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5366160Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5367010Z --------------------- >> end captured logging << ---------------------
2020-09-21T18:02:00.5367250Z 
2020-09-21T18:02:00.5367460Z ======================================================================
2020-09-21T18:02:00.5368050Z ERROR: datalad.metadata.tests.test_aggregation.test_aggregate_with_unavailable_objects_from_subds
2020-09-21T18:02:00.5369030Z ----------------------------------------------------------------------
2020-09-21T18:02:00.5369340Z Traceback (most recent call last):
2020-09-21T18:02:00.5370390Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/nose/case.py", line 198, in runTest
2020-09-21T18:02:00.5370960Z     self.test(*self.arg)
2020-09-21T18:02:00.5372020Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/tests/utils.py", line 554, in _wrap_with_tree
2020-09-21T18:02:00.5372620Z     return t(*(arg + (d,)), **kw)
2020-09-21T18:02:00.5373670Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/tests/utils.py", line 731, in _wrap_with_tempfile
2020-09-21T18:02:00.5374320Z     return t(*(arg + (filename,)), **kw)
2020-09-21T18:02:00.5375660Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/metadata/tests/test_aggregation.py", line 220, in test_aggregate_with_unavailable_objects_from_subds
2020-09-21T18:02:00.5376480Z     force_extraction=False)
2020-09-21T18:02:00.5377590Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/distribution/dataset.py", line 503, in apply_func
2020-09-21T18:02:00.5378230Z     return f(**kwargs)
2020-09-21T18:02:00.5379270Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 494, in eval_func
2020-09-21T18:02:00.5380030Z     return return_func(generator_func)(*args, **kwargs)
2020-09-21T18:02:00.5381210Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 482, in return_func
2020-09-21T18:02:00.5381840Z     results = list(results)
2020-09-21T18:02:00.5383200Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 413, in generator_func
2020-09-21T18:02:00.5383860Z     allkwargs):
2020-09-21T18:02:00.5384970Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 552, in _process_results
2020-09-21T18:02:00.5385610Z     for res in results:
2020-09-21T18:02:00.5386680Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/metadata/aggregate.py", line 1052, in __call__
2020-09-21T18:02:00.5387270Z     to_save)
2020-09-21T18:02:00.5388340Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/metadata/aggregate.py", line 715, in _update_ds_agginfo
2020-09-21T18:02:00.5389350Z     result_renderer='disabled')
2020-09-21T18:02:00.5390540Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/distribution/dataset.py", line 503, in apply_func
2020-09-21T18:02:00.5391180Z     return f(**kwargs)
2020-09-21T18:02:00.5392250Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 494, in eval_func
2020-09-21T18:02:00.5392940Z     return return_func(generator_func)(*args, **kwargs)
2020-09-21T18:02:00.5394300Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 482, in return_func
2020-09-21T18:02:00.5394930Z     results = list(results)
2020-09-21T18:02:00.5396020Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 469, in generator_func
2020-09-21T18:02:00.5396720Z     msg="Command did not complete successfully")
2020-09-21T18:02:00.5397710Z datalad.support.exceptions.IncompleteResultsError: Command did not complete successfully. 4 failed:
2020-09-21T18:02:00.5398910Z [{'action': 'get',
2020-09-21T18:02:00.5399730Z   'annexkey': 'MD5E-s160--e8068224242b6db0df2c869d56d1745c.xz',
2020-09-21T18:02:00.5401050Z   'message': '../../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_aggregate_with_unavailable_objects_from_subdswrehiyip/origin/.git/annex: '
2020-09-21T18:02:00.5402230Z              'createDirectory: does not exist (No such file or directory)\n'
2020-09-21T18:02:00.5403490Z              '../../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_aggregate_with_unavailable_objects_from_subdswrehiyip/origin/.git/annex: '
2020-09-21T18:02:00.5404640Z              'createDirectory: does not exist (No such file or directory)\n'
2020-09-21T18:02:00.5405880Z              '../../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_aggregate_with_unavailable_objects_from_subdswrehiyip/origin/.git/annex: '
2020-09-21T18:02:00.5407030Z              'createDirectory: does not exist (No such file or directory)',
2020-09-21T18:02:00.5408590Z   'path': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_aggregate_with_unavailable_objects_from_subdsu4bvt19l/base/.datalad/metadata/objects/b2/cn-ddef9bd12a2be8af5d11eb2f28ab8e.xz',
2020-09-21T18:02:00.5410320Z   'refds': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_aggregate_with_unavailable_objects_from_subdsu4bvt19l',
2020-09-21T18:02:00.5411270Z   'status': 'error',
2020-09-21T18:02:00.5411830Z   'type': 'file'},
2020-09-21T18:02:00.5412360Z  {'action': 'get',
2020-09-21T18:02:00.5413180Z   'annexkey': 'MD5E-s394--7463cfa697cc24eba664aedfc9119178',
2020-09-21T18:02:00.5414510Z   'message': '../../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_aggregate_with_unavailable_objects_from_subdswrehiyip/origin/.git/annex: '
2020-09-21T18:02:00.5415680Z              'createDirectory: does not exist (No such file or directory)\n'
2020-09-21T18:02:00.5417180Z              '../../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_aggregate_with_unavailable_objects_from_subdswrehiyip/origin/.git/annex: '
2020-09-21T18:02:00.5418400Z              'createDirectory: does not exist (No such file or directory)\n'
2020-09-21T18:02:00.5419640Z              '../../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_aggregate_with_unavailable_objects_from_subdswrehiyip/origin/.git/annex: '
2020-09-21T18:02:00.5420900Z              'createDirectory: does not exist (No such file or directory)',
2020-09-21T18:02:00.5422380Z   'path': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_aggregate_with_unavailable_objects_from_subdsu4bvt19l/base/.datalad/metadata/objects/11/ds-3ea0a0c381e8f0c25d767a740c8909',
2020-09-21T18:02:00.5424010Z   'refds': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_aggregate_with_unavailable_objects_from_subdsu4bvt19l',
2020-09-21T18:02:00.5424940Z   'status': 'error',
2020-09-21T18:02:00.5425510Z   'type': 'file'},
2020-09-21T18:02:00.5426050Z  {'action': 'get',
2020-09-21T18:02:00.5426850Z   'annexkey': 'MD5E-s450--11c7def491a7383a20a185f2fa030ea8',
2020-09-21T18:02:00.5428160Z   'message': '../../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_aggregate_with_unavailable_objects_from_subdswrehiyip/origin/.git/annex: '
2020-09-21T18:02:00.5429580Z              'createDirectory: does not exist (No such file or directory)\n'
2020-09-21T18:02:00.5430830Z              '../../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_aggregate_with_unavailable_objects_from_subdswrehiyip/origin/.git/annex: '
2020-09-21T18:02:00.5431970Z              'createDirectory: does not exist (No such file or directory)\n'
2020-09-21T18:02:00.5433210Z              '../../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_aggregate_with_unavailable_objects_from_subdswrehiyip/origin/.git/annex: '
2020-09-21T18:02:00.5434360Z              'createDirectory: does not exist (No such file or directory)',
2020-09-21T18:02:00.5435870Z   'path': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_aggregate_with_unavailable_objects_from_subdsu4bvt19l/base/.datalad/metadata/objects/b2/ds-ddef9bd12a2be8af5d11eb2f28ab8e',
2020-09-21T18:02:00.5437560Z   'refds': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_aggregate_with_unavailable_objects_from_subdsu4bvt19l',
2020-09-21T18:02:00.5438490Z   'status': 'error',
2020-09-21T18:02:00.5439050Z   'type': 'file'},
2020-09-21T18:02:00.5439590Z  {'action': 'get',
2020-09-21T18:02:00.5440430Z   'annexkey': 'MD5E-s160--62cc89d94c829b2dae2807e01ab5601d.xz',
2020-09-21T18:02:00.5441750Z   'message': '../../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_aggregate_with_unavailable_objects_from_subdswrehiyip/origin/.git/annex: '
2020-09-21T18:02:00.5442930Z              'createDirectory: does not exist (No such file or directory)\n'
2020-09-21T18:02:00.5444170Z              '../../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_aggregate_with_unavailable_objects_from_subdswrehiyip/origin/.git/annex: '
2020-09-21T18:02:00.5445330Z              'createDirectory: does not exist (No such file or directory)\n'
2020-09-21T18:02:00.5446560Z              '../../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_aggregate_with_unavailable_objects_from_subdswrehiyip/origin/.git/annex: '
2020-09-21T18:02:00.5447700Z              'createDirectory: does not exist (No such file or directory)',
2020-09-21T18:02:00.5449180Z   'path': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_aggregate_with_unavailable_objects_from_subdsu4bvt19l/base/.datalad/metadata/objects/11/cn-3ea0a0c381e8f0c25d767a740c8909.xz',
2020-09-21T18:02:00.5451240Z   'refds': '/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_aggregate_with_unavailable_objects_from_subdsu4bvt19l',
2020-09-21T18:02:00.5452250Z   'status': 'error',
2020-09-21T18:02:00.5452810Z   'type': 'file'}]
2020-09-21T18:02:00.5453450Z -------------------- >> begin captured logging << --------------------
2020-09-21T18:02:00.5453880Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5454350Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5454820Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5455280Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5455740Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5456190Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5456650Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5457110Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5457570Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5458030Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5458500Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5458960Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5459420Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5459870Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5460530Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5460990Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5461450Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5461910Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5462370Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5462830Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5463290Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5463740Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5464200Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5464660Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5465120Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5465700Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5466170Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5466640Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5467100Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5467560Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5468010Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5468890Z --------------------- >> end captured logging << ---------------------
2020-09-21T18:02:00.5469120Z 
2020-09-21T18:02:00.5469320Z ======================================================================
2020-09-21T18:02:00.5469780Z ERROR: datalad.metadata.tests.test_base.test_aggregation
2020-09-21T18:02:00.5470590Z ----------------------------------------------------------------------
2020-09-21T18:02:00.5470900Z Traceback (most recent call last):
2020-09-21T18:02:00.5471970Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/nose/case.py", line 198, in runTest
2020-09-21T18:02:00.5472550Z     self.test(*self.arg)
2020-09-21T18:02:00.5473610Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/tests/utils.py", line 554, in _wrap_with_tree
2020-09-21T18:02:00.5474220Z     return t(*(arg + (d,)), **kw)
2020-09-21T18:02:00.5475320Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/metadata/tests/test_base.py", line 177, in test_aggregation
2020-09-21T18:02:00.5476020Z     cloneres = clone.metadata()
2020-09-21T18:02:00.5477140Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/distribution/dataset.py", line 503, in apply_func
2020-09-21T18:02:00.5477780Z     return f(**kwargs)
2020-09-21T18:02:00.5479410Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 494, in eval_func
2020-09-21T18:02:00.5480250Z     return return_func(generator_func)(*args, **kwargs)
2020-09-21T18:02:00.5481460Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 482, in return_func
2020-09-21T18:02:00.5482090Z     results = list(results)
2020-09-21T18:02:00.5483310Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 413, in generator_func
2020-09-21T18:02:00.5483910Z     allkwargs):
2020-09-21T18:02:00.5484960Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 552, in _process_results
2020-09-21T18:02:00.5485560Z     for res in results:
2020-09-21T18:02:00.5486580Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/metadata/metadata.py", line 1065, in __call__
2020-09-21T18:02:00.5487150Z     **res_kwargs):
2020-09-21T18:02:00.5488250Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/metadata/metadata.py", line 296, in query_aggregated_metadata
2020-09-21T18:02:00.5489260Z     result_renderer='disabled')
2020-09-21T18:02:00.5490530Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 494, in eval_func
2020-09-21T18:02:00.5491410Z     return return_func(generator_func)(*args, **kwargs)
2020-09-21T18:02:00.5492590Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 482, in return_func
2020-09-21T18:02:00.5493340Z     results = list(results)
2020-09-21T18:02:00.5494410Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/interface/utils.py", line 469, in generator_func
2020-09-21T18:02:00.5495100Z     msg="Command did not complete successfully")
2020-09-21T18:02:00.5496070Z datalad.support.exceptions.IncompleteResultsError: Command did not complete successfully. 2 failed:
2020-09-21T18:02:00.5497260Z [{'action': 'get',
2020-09-21T18:02:00.5498080Z   'annexkey': 'MD5E-s152--2e6c53f4265c6882fcddb99f2472aaa1.xz',
2020-09-21T18:02:00.5499290Z   'message': '../../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_aggregationyfq1avfh/origin/.git/annex: '
2020-09-21T18:02:00.5500370Z              'createDirectory: does not exist (No such file or directory)\n'
2020-09-21T18:02:00.5501480Z              '../../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_aggregationyfq1avfh/origin/.git/annex: '
2020-09-21T18:02:00.5502530Z              'createDirectory: does not exist (No such file or directory)\n'
2020-09-21T18:02:00.5503640Z              '../../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_aggregationyfq1avfh/origin/.git/annex: '
2020-09-21T18:02:00.5504680Z              'createDirectory: does not exist (No such file or directory)',
2020-09-21T18:02:00.5506020Z   'path': '/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_aggregationyfq1avfh/clone/.datalad/metadata/objects/22/cn-6c0df9097abfd1cf3943d34702885a.xz',
2020-09-21T18:02:00.5507470Z   'refds': '/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_aggregationyfq1avfh/clone',
2020-09-21T18:02:00.5508330Z   'status': 'error',
2020-09-21T18:02:00.5509010Z   'type': 'file'},
2020-09-21T18:02:00.5509560Z  {'action': 'get',
2020-09-21T18:02:00.5510350Z   'annexkey': 'MD5E-s533--38bb2f4d151c25766544abd6629dae98',
2020-09-21T18:02:00.5511550Z   'message': '../../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_aggregationyfq1avfh/origin/.git/annex: '
2020-09-21T18:02:00.5512650Z              'createDirectory: does not exist (No such file or directory)\n'
2020-09-21T18:02:00.5514080Z              '../../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_aggregationyfq1avfh/origin/.git/annex: '
2020-09-21T18:02:00.5515200Z              'createDirectory: does not exist (No such file or directory)\n'
2020-09-21T18:02:00.5516310Z              '../../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_aggregationyfq1avfh/origin/.git/annex: '
2020-09-21T18:02:00.5517350Z              'createDirectory: does not exist (No such file or directory)',
2020-09-21T18:02:00.5518680Z   'path': '/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_aggregationyfq1avfh/clone/.datalad/metadata/objects/22/ds-6c0df9097abfd1cf3943d34702885a',
2020-09-21T18:02:00.5520190Z   'refds': '/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_aggregationyfq1avfh/clone',
2020-09-21T18:02:00.5521080Z   'status': 'error',
2020-09-21T18:02:00.5521630Z   'type': 'file'}]
2020-09-21T18:02:00.5522260Z -------------------- >> begin captured logging << --------------------
2020-09-21T18:02:00.5522680Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5523140Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5524750Z datalad.metadata.metadata: WARNING: Found no aggregated metadata info file /var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_aggregationyfq1avfh/origin/.datalad/metadata/aggregate_v1.json. You will likely need to either update the dataset from its original location or reaggregate metadata locally.
2020-09-21T18:02:00.5526340Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5526800Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5527270Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5527720Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5528160Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5528610Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5529060Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5529520Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5530110Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5530570Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5531030Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5531500Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5531950Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5532400Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5532990Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5533440Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5534020Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5534930Z --------------------- >> end captured logging << ---------------------
2020-09-21T18:02:00.5535150Z 
2020-09-21T18:02:00.5535350Z ======================================================================
2020-09-21T18:02:00.5535910Z FAIL: datalad.customremotes.tests.test_archives.test_basic_scenario
2020-09-21T18:02:00.5536840Z ----------------------------------------------------------------------
2020-09-21T18:02:00.5537140Z Traceback (most recent call last):
2020-09-21T18:02:00.5538310Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/nose/case.py", line 198, in runTest
2020-09-21T18:02:00.5538880Z     self.test(*self.arg)
2020-09-21T18:02:00.5540060Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/tests/utils.py", line 554, in _wrap_with_tree
2020-09-21T18:02:00.5540650Z     return t(*(arg + (d,)), **kw)
2020-09-21T18:02:00.5541820Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/tests/utils.py", line 731, in _wrap_with_tempfile
2020-09-21T18:02:00.5542470Z     return t(*(arg + (filename,)), **kw)
2020-09-21T18:02:00.5544510Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/customremotes/tests/test_archives.py", line 142, in test_basic_scenario
2020-09-21T18:02:00.5545370Z     assert_true(cloned_annex.file_has_content(fn_extracted))
2020-09-21T18:02:00.5545820Z AssertionError: False is not true
2020-09-21T18:02:00.5546590Z -------------------- >> begin captured logging << --------------------
2020-09-21T18:02:00.5547010Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5547470Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5547940Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5548410Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5548870Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5549330Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5549780Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5550240Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5550700Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5551150Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5551620Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5552070Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5552530Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5553540Z --------------------- >> end captured logging << ---------------------
2020-09-21T18:02:00.5553750Z 
2020-09-21T18:02:00.5553950Z ======================================================================
2020-09-21T18:02:00.5554470Z FAIL: datalad.distribution.tests.test_get.test_get_multiple_files
2020-09-21T18:02:00.5555340Z ----------------------------------------------------------------------
2020-09-21T18:02:00.5555640Z Traceback (most recent call last):
2020-09-21T18:02:00.5556670Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/nose/case.py", line 198, in runTest
2020-09-21T18:02:00.5557240Z     self.test(*self.arg)
2020-09-21T18:02:00.5558320Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/tests/utils.py", line 554, in _wrap_with_tree
2020-09-21T18:02:00.5558910Z     return t(*(arg + (d,)), **kw)
2020-09-21T18:02:00.5559990Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/tests/utils.py", line 666, in _wrap_serve_path_via_http
2020-09-21T18:02:00.5560660Z     return tfunc(*(args + (path, url)), **kwargs)
2020-09-21T18:02:00.5561770Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/tests/utils.py", line 731, in _wrap_with_tempfile
2020-09-21T18:02:00.5562410Z     return t(*(arg + (filename,)), **kw)
2020-09-21T18:02:00.5563590Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/distribution/tests/test_get.py", line 291, in test_get_multiple_files
2020-09-21T18:02:00.5564660Z     assert_status(['ok', 'notneeded'], result[1:])
2020-09-21T18:02:00.5565900Z   File "/Users/runner/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/datalad/tests/utils.py", line 1292, in assert_status
2020-09-21T18:02:00.5566580Z     dumps(results, indent=1, default=lambda x: str(x))))
2020-09-21T18:02:00.5567450Z AssertionError: Test 1/2: expected status ['ok', 'notneeded'] not found in:
2020-09-21T18:02:00.5567850Z [
2020-09-21T18:02:00.5568030Z  {
2020-09-21T18:02:00.5568230Z   "type": "file",
2020-09-21T18:02:00.5568750Z   "refds": "/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_get_multiple_files9pxn62wu",
2020-09-21T18:02:00.5569270Z   "status": "error",
2020-09-21T18:02:00.5569820Z   "path": "/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_get_multiple_files9pxn62wu/file2.txt",
2020-09-21T18:02:00.5570470Z   "action": "get",
2020-09-21T18:02:00.5571320Z   "annexkey": "MD5E-s10--06b77de7075f3e6dc5779806296c3370.txt",
2020-09-21T18:02:00.5573550Z   "message": "../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_get_multiple_filesgyldhkq1/.git/annex: createDirectory: does not exist (No such file or directory)\n../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_get_multiple_filesgyldhkq1/.git/annex: createDirectory: does not exist (No such file or directory)\n../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_get_multiple_filesgyldhkq1/.git/annex: createDirectory: does not exist (No such file or directory)"
2020-09-21T18:02:00.5575270Z  },
2020-09-21T18:02:00.5575450Z  {
2020-09-21T18:02:00.5575640Z   "type": "file",
2020-09-21T18:02:00.5576150Z   "refds": "/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_get_multiple_files9pxn62wu",
2020-09-21T18:02:00.5576680Z   "status": "error",
2020-09-21T18:02:00.5577240Z   "path": "/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_get_multiple_files9pxn62wu/file1.txt",
2020-09-21T18:02:00.5577780Z   "action": "get",
2020-09-21T18:02:00.5578660Z   "annexkey": "MD5E-s10--35a91e53e10899a6cf4012b797e2cd87.txt",
2020-09-21T18:02:00.5580590Z   "message": "../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_get_multiple_filesgyldhkq1/.git/annex: createDirectory: does not exist (No such file or directory)\n../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_get_multiple_filesgyldhkq1/.git/annex: createDirectory: does not exist (No such file or directory)\n../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_get_multiple_filesgyldhkq1/.git/annex: createDirectory: does not exist (No such file or directory)"
2020-09-21T18:02:00.5582450Z  }
2020-09-21T18:02:00.5582630Z ]
2020-09-21T18:02:00.5583290Z -------------------- >> begin captured logging << --------------------
2020-09-21T18:02:00.5583720Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5584190Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5584660Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5585120Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5585580Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5586030Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5586490Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5586960Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5587420Z asyncio: DEBUG: Using selector: KqueueSelector
2020-09-21T18:02:00.5588190Z --------------------- >> end captured logging << ---------------------
2020-09-21T18:02:00.5588410Z 
2020-09-21T18:02:00.5588960Z ----------------------------------------------------------------------
2020-09-21T18:02:00.5589210Z Ran 1173 tests in 6653.801s
2020-09-21T18:02:00.5589380Z 
2020-09-21T18:02:00.5589640Z FAILED (SKIP=132, errors=9, failures=2)
2020-09-21T18:02:00.5590160Z testing 0
2020-09-21T18:02:00.5590390Z testing 1
2020-09-21T18:02:00.5590880Z /var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_tree_test_paths_with_forward_slashesh_v756r0
2020-09-21T18:02:00.5591360Z .: here(+) [git]
2020-09-21T18:02:00.5592550Z .: origin(+) [/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_cached_datasetjkqkw2ys/https___github.com_datalad_testrepo--minimalds (git)]
2020-09-21T18:02:00.5593270Z .: here(+) [git]
2020-09-21T18:02:00.5594410Z .: origin(+) [/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_cached_datasetjkqkw2ys/https___github.com_datalad_testrepo--minimalds (git)]
2020-09-21T18:02:00.5595120Z .: here(+) [git]
2020-09-21T18:02:00.5596250Z .: origin(+) [/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_cached_datasetjkqkw2ys/https___github.com_datalad_testrepo--minimalds (git)]
2020-09-21T18:02:00.5596950Z .: here(+) [git]
2020-09-21T18:02:00.5598080Z .: origin(+) [/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/datalad_temp_test_cached_datasetjkqkw2ys/https___github.com_datalad_testrepo--minimalds (git)]
2020-09-21T18:02:00.5600110Z Versions: annexremote=1.4.3 appdirs=1.4.4 boto=2.49.0 cmd:7z=16.02 cmd:annex=8.20200909-gc084bf7 cmd:bundled-git=2.28.0 cmd:git=2.28.0 cmd:system-git=2.28.0 cmd:system-ssh=8.1p1 humanize=2.6.0 iso8601=0.1.13 keyring=21.4.0 keyrings.alt=3.4.0 msgpack=1.0.0 patoolib=1.12 requests=2.24.0 tqdm=4.49.0 wrapt=1.12.1
2020-09-21T18:02:00.5601520Z Obscure filename: str=b' "\';a&b&c\xce\x94\xd7\xa7\xd9\x85\xe0\xb9\x97\xe3\x81\x82 `| ' repr=' "\';a&b&cΔקم๗あ `| '
2020-09-21T18:02:00.5602440Z Encodings: default='utf-8' filesystem='utf-8' locale.prefered='UTF-8'
2020-09-21T18:02:00.5605990Z Environment: LC_ALL='en_US.UTF-8' PATH='/Users/runner/hostedtoolcache/Python/3.7.9/x64/bin:/Users/runner/hostedtoolcache/Python/3.7.9/x64:/Applications/git-annex.app/Contents/MacOS:/Users/runner/.cargo/bin:/usr/local/lib/ruby/gems/2.6.0/bin:/usr/local/opt/ruby/bin:/usr/local/opt/curl/bin:/usr/local/bin:/usr/local/sbin:/Users/runner/bin:/Users/runner/.yarn/bin:/usr/local/go/bin:/Users/runner/Library/Android/sdk/tools:/Users/runner/Library/Android/sdk/platform-tools:/Users/runner/Library/Android/sdk/ndk-bundle:/Library/Frameworks/Mono.framework/Versions/Current/Commands:/usr/bin:/bin:/usr/sbin:/sbin:/Users/runner/.dotnet/tools:/Users/runner/.ghcup/bin:/Users/runner/hostedtoolcache/stack/2.3.3/x64' LANG='C' LC_CTYPE='en_US.UTF-8' GIT_CONFIG_PARAMETERS="'init.defaultBranch=dl-test-branch'"
2020-09-21T18:02:00.6421230Z ##[error]Process completed with exit code 1.
2020-09-21T18:02:00.6445820Z Cleaning up orphan processes

</details>

Some of these suggest incorrect PWD or symlink handling, since I am observing a good number of relative paths like "message": "../../../../../../../var/folders/24/8k48jl6d249_n_qfxwsl6xvm...
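For illustration, a minimal sketch of the suspected mismatch, assuming it comes from macOS resolving /var into /private/var (the temp paths below are hypothetical, not the ones from the log):

    import os.path

    # On macOS /var is a symlink into /private/var, so the same location can
    # appear in two spellings depending on whether it was realpath()-resolved:
    print(os.path.realpath("/var"))  # "/private/var" on macOS, "/var" elsewhere

    # Relativizing one spelling against the other yields the long ../../..
    # chains seen in the log (relpath here is purely lexical):
    rel = os.path.relpath("/var/folders/T/origin/.git/annex",
                          "/private/var/folders/T/clone/sub")
    print(rel)  # e.g. "../../../../../../var/folders/T/origin/.git/annex"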

Then there is

2020-09-21T18:02:00.5099720Z               "failed with exitcode 64 [err: 'cp: illegal option -- -\n"
2020-09-21T18:02:00.5100490Z               'usage: cp [-R [-H | -L | -P]] [-fi | -n] [-apvXc] source_file '
2020-09-21T18:02:00.5101130Z               'target_file\n'
2020-09-21T18:02:00.5101760Z               '       cp [-R [-H | -L | -P]] [-fi | -n] [-apvXc] source_file '
2020-09-21T18:02:00.5102480Z               "... target_directory'] [cmd.py:run:961]"),

which might be due to us using "--" to separate paths in some manual cp invocation (BSD cp on macOS does not accept it, unlike GNU cp)? etc.
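If the "--" separator is indeed the culprit, a portable workaround is simply to avoid it; a hypothetical sketch (demo file names are made up):

    import pathlib
    import subprocess

    src = pathlib.Path("demo-src.txt")   # hypothetical file for the demo
    src.write_text("payload")

    # BSD cp (macOS) rejects the "--" end-of-options marker that GNU cp
    # accepts, so pass paths without it; a leading "./" protects odd names
    # on both implementations:
    subprocess.run(["cp", f"./{src}", "./demo-dest.txt"], check=True)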

created time in 2 days

issue openeddandi/dandi-cli

local_docker_compose sometimes fails

e.g. this run of master: https://github.com/dandi/dandi-cli/runs/1144609727

it seems that the initial failure causing it is

minio.error.InvalidAccessKeyId: InvalidAccessKeyId: message: The access key Id you provided does not exist in our records.

<details> <summary>more of traceback etc</summary>

Creating dandiarchive-docker_django_run ... done
Traceback (most recent call last):
  File "./manage.py", line 20, in <module>
    main()
  File "./manage.py", line 16, in main
    execute_from_command_line(sys.argv)
  File "/usr/local/lib/python3.8/site-packages/django/core/management/__init__.py", line 401, in execute_from_command_line
    utility.execute()
  File "/usr/local/lib/python3.8/site-packages/django/core/management/__init__.py", line 377, in execute
    django.setup()
  File "/usr/local/lib/python3.8/site-packages/django/__init__.py", line 24, in setup
    apps.populate(settings.INSTALLED_APPS)
  File "/usr/local/lib/python3.8/site-packages/django/apps/registry.py", line 114, in populate
    app_config.import_models()
  File "/usr/local/lib/python3.8/site-packages/django/apps/config.py", line 211, in import_models
    self.models_module = import_module(models_module_name)
  File "/usr/local/lib/python3.8/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
...
  File "/opt/django/dandi/publish/models/asset.py", line 27, in _get_asset_blob_storage
    return create_s3_storage(settings.DANDI_DANDISETS_BUCKET_NAME)
  File "/opt/django/dandi/publish/storage.py", line 119, in create_s3_storage
    storage = VerbatimNameMinioStorage(
  File "/opt/django/dandi/publish/storage.py", line 57, in __init__
    super().__init__(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/minio_storage/storage.py", line 75, in __init__
    self._init_check()
  File "/usr/local/lib/python3.8/site-packages/minio_storage/storage.py", line 89, in _init_check
    if self.auto_create_bucket and not self.client.bucket_exists(
  File "/usr/local/lib/python3.8/site-packages/minio/api.py", line 404, in bucket_exists
    self._url_open('HEAD', bucket_name=bucket_name)
  File "/usr/local/lib/python3.8/site-packages/minio/api.py", line 2185, in _url_open
    region = self._get_bucket_region(bucket_name)
  File "/usr/local/lib/python3.8/site-packages/minio/api.py", line 2063, in _get_bucket_region
    region = self._get_bucket_location(bucket_name)
  File "/usr/local/lib/python3.8/site-packages/minio/api.py", line 2105, in _get_bucket_location
    raise ResponseError(response, method, bucket_name).get_exception()
minio.error.InvalidAccessKeyId: InvalidAccessKeyId: message: The access key Id you provided does not exist in our records.
Stopping dandiarchive-docker_rabbitmq_1 ... 
Stopping dandiarchive-docker_postgres_1 ... 

</details>

which overall leads to 87 passed, 13 errors. If that is the actual issue triggering the failure, it smells like some race condition, and maybe it should then be addressed at the "-publish" level. Or could we just retry starting it a few times within our fixture? WDYT @jwodder ?
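If retrying in the fixture is the way to go, something along these lines could work. A rough sketch only: the compose directory argument and the down/up commands are assumptions, not the actual fixture code:

    import subprocess
    import time

    def up_with_retries(compose_dir, attempts=3, delay=5):
        # Retry `docker-compose up` to paper over the suspected startup race
        # (minio credentials/bucket not yet provisioned when django starts).
        for attempt in range(1, attempts + 1):
            r = subprocess.run(["docker-compose", "up", "-d"], cwd=compose_dir)
            if r.returncode == 0:
                return
            subprocess.run(["docker-compose", "down", "-v"], cwd=compose_dir)
            time.sleep(delay * attempt)
        raise RuntimeError(f"compose stack failed to start after {attempts} attempts")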

created time in 2 days

issue commentdatalad/datalad

fasteners: we need some min version higher than what in buster(0.12)

thank you @kyleam for digging this one out, I forgot about my earlier whining

yarikoptic

comment created time in 2 days

delete branch dandi/dandi-cli

delete branch : gh-243

delete time in 2 days

push eventdandi/dandi-cli

John T. Wodder II

commit sha 5f7d49d90fb311b565251e0f78a87c2f700c285a

Script for "instantiating" Dandisets from asset store

view details

John T. Wodder II

commit sha db93509ee7f5b1fa8d74d8753de3366932408321

Populate dandiset.yaml

view details

Yaroslav Halchenko

commit sha 05dff2d3d63b059643c828117cad421f02af0ecd

Merge pull request #244 from dandi/gh-243

Script for "instantiating" Dandisets from asset store

view details

push time in 2 days

PR merged dandi/dandi-cli

Script for "instantiating" Dandisets from asset store

Closes #243.

+85 -0

1 comment

1 changed file

jwodder

pr closed time in 2 days

issue closeddandi/dandi-cli

"instantiate" dandisets from the "backup"

To provide extensive testing for #226 on the dandisets we already have in the archive, we would need to download them all. But that would be increasingly prohibitive.

On drogon backup server we already have a datalad dataset with the backup of S3.

The idea is to "instantiate" the dandisets present in the archive as directories with symlinks (or actually real files via cp --reflink=always, since it is a BTRFS CoW filesystem!) into some location on the drive, where those "symlinks" would come from an asset store located under /mnt/backup/dandi/dandiarchive-s3-backup/girder-assetstore .

The culprit is that the asset store uses its own UUIDs, not the ids of girder's "file" records. So we would need to follow the redirect from dandiarchive's girder to https://girder.dandiarchive.org/api/v1/file/{file['id']}/download to get the actual assetstore path:

$> curl -I https://girder.dandiarchive.org/api/v1/file/5f176584f63d62e1dbd06946/download     
HTTP/1.1 303 See Other
Server: nginx/1.14.0 (Ubuntu)
Date: Thu, 17 Sep 2020 14:42:28 GMT
Content-Type: text/html;charset=utf-8
Content-Length: 1652
Connection: keep-alive
Allow: DELETE, GET, HEAD, OPTIONS, PATCH, POST, PUT
Girder-Request-Uid: 6e2b2cbc-c6a3-4265-8068-14151b94f9cc
Location: https://dandiarchive.s3.amazonaws.com/girder-assetstore/74/0f/740feade0d784acc8ec76bb7834d80dc?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=ASIA3GIMZPVVEYHMC7MS%2F20200917%2Fus-east-2%2Fs3%2Faws4_request&X-Amz-Date=20200917T144228Z&X-Amz-Expires=3600&X-Amz-SignedHeaders=host&X-Amz-Security-Token=FwoGZXIvYXdzEBAaDKv1lZXvP9wFZRzEdCK%2FAZFXw8ch9QU9XsbYJneN4%2BIZTHUkUdu9P8xVvYlNNECKMEA25TTmbsKywS5YSDkTeY6x%2F67QDDHhbGRH89XanXXUejXHSk%2F5vU8MajEq0WV2iGMkpbYTUw9lFIlCAXnprmcDLd7LyTWCBi9tWpycrXD8YSUto3VUXG%2FTMjHOx4%2FG8CGi3I%2F1m3siPX7SQexDrmK7YpGI0jxEVYxF9sVvUtKeYF3PZWyX1b6KB0t%2BOOy4UCL%2FPRhW8gYvtHO%2F2EnxKIrrjfsFMi1WZZe2Ye%2FEJi6jx1xSE6nG%2B%2BdKQ%2BdHoigP06wBwHoLSaCdyIhVkoNGyk%2BEMt8%3D&X-Amz-Signature=cf09a8c3a24040939b756080de942bc79f171675ab70a4530a13e271b4adbe09
Strict-Transport-Security: max-age=63072000

to get that girder-assetstore/74/0f/740feade0d784acc8ec76bb7834d80dc path which is on drogon:

$> ls -l /mnt/backup/dandi/dandiarchive-s3-backup/girder-assetstore/74/0f/740feade0d784acc8ec76bb7834d80dc                 
lrwxrwxrwx 1 yoh yoh 125 Jul 21 18:00 /mnt/backup/dandi/dandiarchive-s3-backup/girder-assetstore/74/0f/740feade0d784acc8ec76bb7834d80dc -> ../../../.git/annex/objects/z8/3w/MD5E-s18792--33318fd510094e4304868b4a481d4a5a/MD5E-s18792--33318fd510094e4304868b4a481d4a5a

which would work, albeit somewhat inefficiently (we could cache those, since the mapping should not change), or we could load the entire table back from mongodb and get all the mappings at once (more work; probably not worth it).
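For the caching variant, a minimal sketch of the resolver (requests-based, in-process cache; illustrative, not the eventual implementation):

    import functools
    from urllib.parse import urlparse

    import requests

    @functools.lru_cache(maxsize=None)
    def girder_id_to_asset_path(girder_id):
        # Follow girder's 303 redirect (without fetching the data) and cache
        # the girder-id -> assetstore-path mapping, since it should not change.
        r = requests.head(
            f"https://girder.dandiarchive.org/api/v1/file/{girder_id}/download",
            allow_redirects=False,
        )
        r.raise_for_status()
        return urlparse(r.headers["Location"]).path.lstrip("/")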

So I think the course of action could be to

  • add an option "add_resolved_url" to GirderCli.get_dandiset_and_assets so that it adds resolved URLs like the above "https://dandiarchive.s3.amazonaws.com/girder-assetstore/74/0f/740feade0d784acc..." to the returned asset records
  • add a dandi instantiate --assetstore PATH -o TOPPATH DANDISET_ID command (present only in DANDI_DEVEL mode, see the sketches above and the merged script below) which would go through all the assets of the dandiset and perform the aforementioned cp -L --reflink=always {assetstore}/{path-within-assetstore-fromurl}

I think it should work quite fast and be very efficient, since no heavy data transfer would happen and no new space would be consumed (besides filesystem-level metadata for the CoW-copied files).

closed time in 2 days

yarikoptic

Pull request review commentdandi/dandi-cli

Script for "instantiating" Dandisets from asset store

+from pathlib import Path
+import subprocess
+from urllib.parse import urlparse
+import click
+import requests
+from dandi import girder
+from dandi.consts import dandiset_metadata_file
+from dandi.dandiarchive import navigate_url
+from dandi.dandiset import Dandiset
+from dandi.utils import get_instance
+
+
+@click.command()
+@click.option("-i", "--ignore-errors", is_flag=True)
+@click.argument("assetstore", type=click.Path(exists=True, file_okay=False))
+@click.argument("target", type=click.Path(file_okay=False))
+def main(assetstore, target, ignore_errors):
+    instantiate_dandisets(Path(assetstore), Path(target), ignore_errors)
+
+
+def instantiate_dandisets(
+    assetstore_path: Path, target_path: Path, ignore_errors=False
+):
+    with requests.Session() as s:
+        for did in get_dandiset_ids():
+            dsdir = target_path / did
+            dsdir.mkdir(parents=True, exist_ok=True)
+            with navigate_url(f"https://dandiarchive.org/dandiset/{did}/draft") as (
+                _,
+                dandiset,
+                assets,
+            ):
+                try:
+                    (dsdir / dandiset_metadata_file).unlink()
+                except FileNotFoundError:
+                    pass
+                metadata = dandiset.get("metadata", {})
+                ds = Dandiset(dsdir, allow_empty=True)
+                ds.update_metadata(metadata)
+                for a in assets:
+                    gid = a["girder"]["id"]
+                    src = assetstore_path / girderid2assetpath(s, gid)
+                    dest = dsdir / a["path"].lstrip("/")
+                    dest.parent.mkdir(parents=True, exist_ok=True)
+                    print(src, "->", dest)
+                    try:
+                        mklink(src, dest)
+                    except Exception:
+                        if not ignore_errors:
+                            raise
+
+
+def get_dandiset_ids():
+    dandi_instance = get_instance("dandi")
+    client = girder.get_client(dandi_instance.girder, authenticate=False)
+    offset = 0
+    per_page = 50
+    while True:
+        dandisets = client.get(
+            "dandi", parameters={"limit": per_page, "offset": offset}
+        )
+        if not dandisets:
+            break
+        for d in dandisets:
+            yield d["meta"]["dandiset"]["identifier"]
+        offset += len(dandisets)
+
+
+def girderid2assetpath(session, girder_id):
+    r = session.head(
+        f"https://girder.dandiarchive.org/api/v1/file/{girder_id}/download"
+    )
+    r.raise_for_status()
+    return urlparse(r.headers["Location"]).path.lstrip("/")
+
+
+def mklink(src, dest):
+    subprocess.run(
+        ["cp", "-L", "--reflink=always", "--remove-destination", str(src), str(dest)],

Ok, I will keep it in mind -- we might want to add a dedicated option for that, etc., so as not to miss some "duplicated files" entries (shouldn't happen, but who knows). In general I think we might want to just always redo it "fresh" into a new folder, and then rename that into the old one after having the old one removed, since we do not yet "track what was deleted" etc. (see the sketch below). Let's proceed meanwhile, thank you!
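A rough sketch of that "build fresh, then swap" idea (hypothetical helper names, not actual dandi-cli code):

    import os
    import shutil
    import tempfile

    def rebuild_dandiset(build, final_dir: str):
        """Instantiate into a temporary directory, then swap it into place (sketch)."""
        tmp_dir = tempfile.mkdtemp(dir=os.path.dirname(final_dir) or ".")
        build(tmp_dir)                # hypothetical callback doing the instantiation
        if os.path.exists(final_dir):
            shutil.rmtree(final_dir)  # drop the stale copy, including deleted files
        os.rename(tmp_dir, final_dir)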

jwodder

comment created time in 2 days

PullRequestReviewEvent

Pull request review comment wummel/patool

Fallback to other programs if a compression type isn't supported

 def list_formats ():
                       (command, util.strlist_with_or(handlers)))
 
-def check_program_compression(archive, command, program, compression):
+def check_program_compression(program, compression):
     """Check if a program supports the given compression."""
     program = os.path.basename(program)
     if compression:
-        # check if compression is supported
+        # check if compression is supported natively
         if not program_supports_compression(program, compression):
-            if command == 'create':
-                comp_command = command
-            else:
-                comp_command = 'extract'
-            comp_prog = find_archive_program(compression, comp_command)
-            if not comp_prog:
-                msg = "cannot %s archive `%s': compression `%s' not supported"
-                raise util.PatoolError(msg % (command, archive, compression))
+            # Check if compression is supported via an external program
+            # Note that we expect the program name to be identical to the
+            # compression type
+            if program in ('tar', 'star', 'bsdtar'):
+                comp_prog = util.find_program(compression)
+                if comp_prog:
+                    return True
+            return False
+
+    return True

So if compression is None (or otherwise falsy) it would now return True, and no exception is raised in either case. I am not yet familiar enough with the code/logic and its use/assumptions, but that sounds a bit odd -- shouldn't it return False or None? In either case, the docstring should be adjusted.

Also, since it is part of the public API, this change could break someone's code which relied on catching the exception... I guess ideally this function's "signature" should remain the same, with the new code going into a new function like is_compression_supported, which would then be used within the refactored check_program_compression and throughout the code wherever it is preferable over check_program_compression.
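A minimal sketch of the suggested split, assuming patool's existing helpers (program_supports_compression, util.find_program, util.PatoolError) from the same module; this is not the actual patool code:

    import os

    def is_compression_supported(program, compression):
        """Return True if `program` can handle `compression`, natively or via a helper."""
        if not compression:
            return True  # nothing extra is needed
        program = os.path.basename(program)
        if program_supports_compression(program, compression):  # patool helper
            return True
        # fall back to an external helper program named after the compression type
        if program in ('tar', 'star', 'bsdtar'):
            return util.find_program(compression) is not None
        return False

    def check_program_compression(archive, command, program, compression):
        """Keep the public API intact: raise PatoolError as before."""
        if not is_compression_supported(program, compression):
            msg = "cannot %s archive `%s': compression `%s' not supported"
            raise util.PatoolError(msg % (command, archive, compression))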

benjaminwinger

comment created time in 2 days

PullRequestReviewEvent

pull request comment conda-forge/git-annex-feedstock

added testing of downstream packages annexremote and datalad

I have not looked into the nature of those failures -- I will try in a few days to decide whether a version restriction is indeed due or whether a quick bug fix is possible. Limiting downstream testing for now sounds like a good plan, thank you @notestaff !

notestaff

comment created time in 2 days

push event dandi/dandi-cli

Yaroslav Halchenko

commit sha 3c554256333c7d2ade7ead86340404f56fc7a5f7

RF: do not upload dandiset.yaml With RFings in master we can screw it up, and also web ui is fragile to wrong metadata records atm and there is no validation yet. Since we already have a header saying that all changes would be lost, I think the best course of action is just to not upload it at all!

view details

Yaroslav Halchenko

commit sha 3444ae330f729b73ebbea1152206f87c3eb24255

RF+BF: centralize "deduction" of dandiset identifier from the metadata

view details

Yaroslav Halchenko

commit sha 77eb4b523cf1f671d83c08eb990c91727664ff07

BF: we need to take metadata.dandiset into metadata

view details

Yaroslav Halchenko

commit sha 16617d31f5cb3f14d2eff63b9c045aecad84f758

RF: TEMP - skip validation of dandiset.yaml + report status validated if no errors

view details

John T. Wodder II

commit sha 8c33ae020c74bcf757849ff1f9dd787951a761f9

Failing download + upload test

view details

Yaroslav Halchenko

commit sha 4b64f95a039f6941e134a9b338afea8b67f99102

RF: register - just log identifier and URL, no need for full printout Also harmonized returned value to always return the metadata record, otherwise how would Python user know what was registered?

view details

Yaroslav Halchenko

commit sha 683e1e5fedc6037a077cd31287d257961cc7db8f

ENH: test_upload_external_download - do call register

view details

Yaroslav Halchenko

commit sha 7aa6877653c083ab53f9ecb865ba868734d14414

ENH: return back info msg about no dandiset being provided/detected

view details

John T. Wodder II

commit sha bf5abcff6e0c5ce6cea0d41a670e8c733ac86bf4

register command: Always print Dandiset identifier

view details

Yaroslav Halchenko

commit sha 2af6eebbdb158d3b7dac2514ef0f90c49e145604

Merge pull request #242 from dandi/bf-dandiset-meta-assumptions BF: dandiset metadata record structure/assumptions

view details

push time in 2 days

delete branch dandi/dandi-cli

delete branch : bf-dandiset-meta-assumptions

delete time in 2 days

PR merged dandi/dandi-cli

BF: dandiset metadata record structure/assumptions

In #233 (post 0.6.4 release, current master) I have tried to unify metadata records, but did that incorrectly for dandisets. I forgot that those would also be used during upload, which caused a fiasco in uploading sample 000029 (the web UI is also not tolerant of bad metadata)

As part of this PR I also

  • adjusted register to always return the record given by the server, and adjusted (and refactored a bit) the corresponding test of register
  • included the test by @jwodder from #241 (Closes #240) and adjusted it to use register
  • dandiset.yaml will no longer be uploaded to the archive at all. We already add a comment that all changes are to be lost, so we should actually make that true (i.e. not upload it, so all changes are lost)
+109 -53

7 comments

8 changed files

yarikoptic

pr closed time in 2 days

issue closed dandi/dandi-cli

"integration" test for `upload` to use main archive dandiset metadata record

I am afraid dandi upload is broken ATM since the dandiset metadata structure has changed (metadata is now under the "dandiset" key), and there was no coordination to have the client account for that + a release.

But I guess dandi-cli's organize and upload for dandisets still use the old schema. So our tests pass (since we test in isolation) -- but a real upload fails since the metadata records on the main deployment are different... heh

I think we need a test for that: download 000027 from the main archive, and upload to the local fixture instance.

Meanwhile I will look into just fixing up the code so upload works again.

closed time in 2 days

yarikoptic

pull request comment dandi/dandi-cli

BF: dandiset metadata record structure/assumptions

Thanks, let's proceed

yarikoptic

comment created time in 2 days

pull request comment datalad/datalad-extensions

Combined workflow for building git-annex on Ubuntu and macOS

Yes -- that token. It is generated for my user, not for any specific repo, so it should be usable across the entirety of GitHub. It might lack some allowed permissions, but if that happens I could add them.

jwodder

comment created time in 2 days

push event datalad/datalad

Yaroslav Halchenko

commit sha 9de9a9cdc74fb07b979a3a55dc887746f0e7edbf

RF: remove disable_logger in customremotes/datalad.py It was introduced in 15ecbb78a86f36b353e763273577dd369671e61b (0.9.2~76^2~6) as a part of https://github.com/datalad/datalad/pull/1870 with the purpose, according to the description, of not polluting the stdout used by the special remote for communication. BUT logs go to stderr and do not interfere in any way with the communication, so the reason for the change is not clear to me. I have been trying to figure out why the heck I am still seeing failures after https://github.com/datalad/datalad/pull/4931 . I kept adding more logging but nothing appeared in the logs! I was finally brought to this piece of code, thus the motivation and argumentation for the change.

view details

Yaroslav Halchenko

commit sha 0c88b031f924e70a761df8a5d5cfb943cb381167

Merge pull request #4934 from yarikoptic/rf-donot-disable-log RF: remove disable_logger in customremotes/datalad.py

view details

push time in 2 days

PR merged datalad/datalad

RF: remove disable_logger in customremotes/datalad.py

It was introduced in 15ecbb78a86f36b353e763273577dd369671e61b (0.9.2~76^2~6) as a part of https://github.com/datalad/datalad/pull/1870 with the purpose, according to the description, of not polluting the stdout used by the special remote for communication. BUT logs go to stderr and do not interfere in any way with the communication, so the reason for the change is not clear to me.

I have been trying to figure out why the heck I am still seeing failures after https://github.com/datalad/datalad/pull/4931 . I kept adding more logging but nothing appeared in the logs! I was finally brought to this piece of code, thus the motivation and argumentation for the change.

Let's see -- if CI is happy, it means that my argumentation above that there should be no interference is correct and it should be safe to merge.
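To illustrate the argument: the external special remote protocol communicates over stdout, while Python logging writes to stderr by default, so the two streams should not collide (a minimal sketch, not the actual datalad code):

    import logging
    import sys

    logging.basicConfig(level=logging.DEBUG)  # handlers default to stderr
    lgr = logging.getLogger("datalad.customremotes")

    def send(line):
        """Protocol replies must go to stdout only."""
        sys.stdout.write(line + "\n")
        sys.stdout.flush()

    lgr.debug("this lands on stderr and cannot corrupt the protocol")
    send("VERSION 1")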

+2 -4

3 comments

1 changed file

yarikoptic

pr closed time in 2 days

pull request comment datalad/datalad

RF: remove disable_logger in customremotes/datalad.py

Thank you @kyleam . I have not run into any side effects so far, so let's proceed

yarikoptic

comment created time in 2 days

pull request comment datalad/datalad

BF: declare minimal version of fasteners to be 0.14

FWIW, 0.14.1 was uploaded to NeuroDebian wherever it built: http://neuro.debian.net/pkgs/python3-fasteners.html?highlight=fasteners .

yarikoptic

comment created time in 2 days

PR opened datalad/datalad

BF: declare minimal version of fasteners to be 0.14

In master, 3ab5a14689d6575f841a05d450e2676e855d6035 added use of InterProcessLock while providing our own logger. That feature was added only in fasteners 0.14 according to https://github.com/harlowja/fasteners/blob/master/ChangeLog

and stable Debian (and NeuroDebian ATM) has only 0.12.0, which causes errors.

We could have checked the version of fasteners and provided the logger only if it is recent enough, but for now (hoping that the backport builds succeed) I am just establishing the minimal version
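The failing combination, roughly (a sketch; the logger keyword is the 0.14 feature, so on fasteners 0.12 this call would fail):

    import logging
    import fasteners

    lgr = logging.getLogger("datalad.locking")
    # `logger=` appeared only in fasteners 0.14; older versions do not accept it
    lock = fasteners.InterProcessLock("/tmp/example.lck", logger=lgr)
    with lock:
        ...  # critical section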

+1 -1

0 comment

1 changed file

pr created time in 3 days

create branch yarikoptic/datalad

branch : bf-fasteners

created branch time in 3 days

push event yarikoptic/datalad

Yaroslav Halchenko

commit sha 0a67bcba23b30a5bb8dd4a05301306f7a8fcf794

RF+ENH: centralize check of credential not expired and do before .get_key() This way we would not even attempt (immediately after) getting a key if we know that the credential is expired already. There can still be a race condition, and the proper reaction to the expiration of the token while handling a 400 is yet to be figured out. I am also thinking about "artificially" bringing the (stored) expiration slightly closer in time, e.g. making it 1% closer than the whole allotment. That would help to avoid any possible race as long as it is to expire just a few seconds after allotment... or maybe change it to be 2 seconds closer altogether -- it is unlikely expiration would ever be that close in time!

view details

Yaroslav Halchenko

commit sha c6cc33deed3bb3ca2a215015fec25440dd844948

BF(workaround): is_expired -- remove 2 seconds from allowed duration to help avoiding race conditions etc. It is unlikely that expiration would be provided so close to request that it would expire within 2 seconds, so I think this should be ok

view details

push time in 3 days
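The expiration logic described in these commits might look roughly like this (an illustrative sketch with hypothetical names, not the actual datalad code):

    import time

    EXPIRATION_SAFETY_MARGIN = 2  # seconds shaved off, as per the commit above

    def is_expired(expiration_epoch):
        """True if a credential is (about to be) expired; False if no expiration is known."""
        if expiration_epoch is None:
            return False
        return time.time() >= expiration_epoch - EXPIRATION_SAFETY_MARGIN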

pull request comment datalad/datalad

ENH+RF: @try_multiple_dec_s3 to retry interaction with S3 upon "retriable" errors

Ugh, it doesn't (at least not fully). The current stumbling point is within this question to the boto people and might have a simple resolution.

Nevertheless -- I have pushed more (IMHO relevant) fixes/refactorings, and although I believe they should be ok, I am no longer sure maint would remain the appropriate target. I have not even tried to rerun my real use case (ABCD) with those changes yet, since I think they would not help yet. Ideally I should establish a small test which would exercise all the added logic.

yarikoptic

comment created time in 3 days

push event yarikoptic/datalad

Yaroslav Halchenko

commit sha 50a0b1051a83b9bccdf0be09b0eabbb1f612bb89

Revert "RF: make needs_authentication into a bool" This reverts commit c936ed37d29b5175d49f27e020cae3498a9c5c92. I think there was some thought behind initial code since "else" clause does cause entry of credential if there is a no known credential and we do not know by then (thus None) if needs_authentication (would be bool) if there is no authenticator assigned. So to not disturb that portion of the code, I am reverting the change which was incomplete anyways since logic in "else:" would no longer trigger nested else: to entry of credential.

view details

Yaroslav Halchenko

commit sha d34dcf27ddffb9a82c1f94f06f9169040878c61f

ENH+RF: CompositeCredential - add .refresh + trigger regeneration of full chain upon enter_new We need interface to "force-trigger" regeneration of apparently (but not due to expiration datetime as we know) credential. .refresh() does that and is now used by enter_new(). Change in behavior is that we will now trigger full chain to get all "tail" credential, but I take it as a feature -- so we do not delay with generation and thus possibly surface code bugs etc only until used. Also, it would make `is_expired` assessment valid - since if we do not regenerate, is_expired ATM would not be adequately report either some old generated credential is expired or not. A possible disadvantage -- it typically would require network connection while simply entering a new credential. But IMHO it is Ok

view details

Yaroslav Halchenko

commit sha a7102b284a12bb48f70536c570e38ce4222e9585

RF+ENH: downloaders (s3) - raise and handle dedicated AccessPermissionExpiredError s3 error_codes MIGHT (see comment -- we do not always get them) hint us at the nature of the 400 code. If it is ExpiredToken, we need to handle it more specifically than just "keep trying" and then rely on our knowledge of when it is to expire. Truth be told -- I have triggered this specific use case manually: I have replaced a proper token with an expired one, which in theory should not happen. But who knows -- maybe admins would explicitly expire some tokens which theoretically should still be valid (according to the initial expiration date), so we better be able to handle such scenarios as well instead of just demanding a new full credential to be entered.

view details

Yaroslav Halchenko

commit sha 109a8f90bb263c8ba8eef01cc6516e94c530021e

RF or BF?: do not claim that credential is expired if we do not have expiration datetime In good old 70c3c4b2f68dfc9ef6858a24eb266226992a7285 (0.3~120^2~9^2~2 !) logic was added to claim that any S3 key which has no expiration assigned "is_expired". It does not feel logical! Now that previous commits add checks for expiration in various relevant places, I think it is valuable to fix this to avoid unnecessary re-authentications etc. is_expired was used only within CompositeCredentials so I think it should be safe to "fix" this without any side-effects.

view details

push time in 3 days

issue opened boto/boto3

400 Bad Request lacks any .error_code when raised by bucket.get_key


We are using boto3's (currently tried with 1.15.1) bucket.get_key to get a key and initiate its download from S3. While going through thousands of files, the token might eventually have expired, and then get_key raises S3ResponseError with status 400 and an empty .error_code. A 400 could be returned for a countless number of issues, which makes it hard to react appropriately.

Looking at the code I see that it is a HEAD request that is sent out, so I guess that is why no error_code is discovered/provided.

<details> <summary>code snippet: (sorry -- I actually failed to find corresponding code in this repo -- reporting from system wide installed python3-boto 2.49.0-2.1</summary>

    def get_key(self, key_name, headers=None, version_id=None,
                response_headers=None, validate=True):
 ...
        key, resp = self._get_key_internal(key_name, headers, query_args_l)
        return key

    def _get_key_internal(self, key_name, headers, query_args_l):
        query_args = '&'.join(query_args_l) or None
        response = self.connection.make_request('HEAD', self.name, key_name,
                                                headers=headers,
                                                query_args=query_args)
        response.read()
...

</details>

So I wondered -- is there a way to either send the request for a key differently via the boto3 interface so that there is a known error_code if it fails, or maybe to somehow request more details about this failed request "after the fact" to discover the error_code?
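For illustration of the "after the fact" idea: the error code is present in the XML body of a GET response (HEAD responses carry no body), so one conceivable workaround is to repeat the failed request as a GET and parse the <Error><Code> element. A sketch using plain requests, with a hypothetical URL:

    import requests
    import xml.etree.ElementTree as ET

    url = "https://example-bucket.s3.amazonaws.com/some/key"  # hypothetical
    r = requests.get(url)
    if r.status_code == 400:
        # S3 error bodies look like <Error><Code>ExpiredToken</Code>...</Error>
        code = ET.fromstring(r.content).findtext("Code")
        print("S3 error code:", code)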

created time in 3 days

push event yarikoptic/datalad

Yaroslav Halchenko

commit sha 756aeef1ee5d6a904f67f9707119235abb049224

ENH: add --debug also to batched annexes if loglevel <= 8

view details

Yaroslav Halchenko

commit sha ada828214c4462b2e2016d029b504d774f1fce25

ENH: log up to 100 of last lines in stderr (if log outputs) for batched processes I am chasing some bug in datalad/git-annex where batched addurls eventually crashes with datalad.cmd [DEBUG ] Closing stdin of <subprocess.Popen object at 0x7fa14a7cb940> and waiting process to finish datalad.utils [WARNING] Caught Command '['git', 'annex', 'addurl', '--fast', '--with-files', '--json', '--json-error-messages', '--batch']' timed out after 3.0 seconds [subprocess.py:_wait:1616] on trial #1. Sleeping 1.000000 and retrying datalad.utils [WARNING] Caught Command '['git', 'annex', 'addurl', '--fast', '--with-files', '--json', '--json-error-messages', '--batch']' timed out after 3.0 seconds [subprocess.py:_wait:1616] on trial #2. Sleeping 1.000000 and retrying datalad.cmd [WARNING] Batched process <subprocess.Popen object at 0x7fa14a7cb940> did not finish, abandoning it without killing it Traceback (most recent call last): File "datalad-nda/scripts/datalad-nda", line 416, in <module> main() File "/usr/lib/python3/dist-packages/click/core.py", line 764, in __call__ return self.main(*args, **kwargs) File "/usr/lib/python3/dist-packages/click/core.py", line 717, in main rv = self.invoke(ctx) File "/usr/lib/python3/dist-packages/click/core.py", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File "/usr/lib/python3/dist-packages/click/core.py", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File "/usr/lib/python3/dist-packages/click/core.py", line 555, in invoke return callback(*args, **kwargs) File "datalad-nda/scripts/datalad-nda", line 236, in add2datalad on_failure="stop", File "/mnt/scrap/tmp/abcd/datalad/datalad/distribution/dataset.py", line 503, in apply_func return f(**kwargs) File "/mnt/scrap/tmp/abcd/datalad/datalad/interface/utils.py", line 481, in eval_func return return_func(generator_func)(*args, **kwargs) File "/mnt/scrap/tmp/abcd/datalad/datalad/interface/utils.py", line 469, in return_func results = list(results) File "/mnt/scrap/tmp/abcd/datalad/datalad/interface/utils.py", line 456, in generator_func msg="Command did not complete successfully") datalad.support.exceptions.IncompleteResultsError: Command did not complete successfully. 1 failed: [{'action': 'addurls', 'message': "AnnexBatchCommandError: 'addurl' [Error, annex reported failure " 'for addurl ' "(url='s3://NDAR_Central_2/submission_23229/derivatives/abcd-hcp-pipeline/sub-XXX/ses-baselineYear1Arm1/img/DVARS_and_FD_task-SST01.png'): " "{'command': 'addurl', 'success': False, 'error-messages': [' " "unable to use special remote due to protocol error'], 'file': " "'sub-XXX/ses-baselineYear1Arm1/img/DVARS_and_FD_task-SST01.png'}] " '[annexrepo.py:add_url_to_file:1879]', 'path': '/mnt/scrap/tmp/abcd/testds-fast-full2/derivatives/abcd-hcp-pipeline/sub-XXX/ses-baselineYear1Arm1/img/DVARS_and_FD_task-SST01.png', 'status': 'error', 'type': 'file'}] > /mnt/scrap/tmp/abcd/datalad/datalad/interface/utils.py(456)generator_func() -> msg="Command did not complete successfully") I have no clue what is going on and since it is a sandwich of datalad/git-annex/git-annex-remote-datalad-archives here -- it is hard to impossible to see what is happening. Logging last lines from stderr might give a clue if would include relevant log lines from special remote log etc.

view details

Yaroslav Halchenko

commit sha e821fdde028c3e97059e3b7bd75c70d447b4253a

ENH: log all stderr lines not just last 100

view details

push time in 3 days

PR closed datalad/datalad

ENH: log up to 100 of the last stderr lines (if logging outputs) for batched processes

I am chasing some bug in datalad/git-annex where batched addurls eventually crashes with

datalad.cmd     [DEBUG  ] Closing stdin of <subprocess.Popen object at 0x7fa14a7cb940> and waiting process to finish
datalad.utils   [WARNING] Caught Command '['git', 'annex', 'addurl', '--fast', '--with-files', '--json', '--json-error-messages', '--batch']' timed out after 3.0 seconds [subprocess.py:_wait:1616] on trial #1. Sleeping 1.000000 and retrying
datalad.utils   [WARNING] Caught Command '['git', 'annex', 'addurl', '--fast', '--with-files', '--json', '--json-error-messages', '--batch']' timed out after 3.0 seconds [subprocess.py:_wait:1616] on trial #2. Sleeping 1.000000 and retrying
datalad.cmd     [WARNING] Batched process <subprocess.Popen object at 0x7fa14a7cb940> did not finish, abandoning it without killing it
Traceback (most recent call last):
  File "datalad-nda/scripts/datalad-nda", line 416, in <module>
	main()
  File "/usr/lib/python3/dist-packages/click/core.py", line 764, in __call__
	return self.main(*args, **kwargs)
  File "/usr/lib/python3/dist-packages/click/core.py", line 717, in main
	rv = self.invoke(ctx)
  File "/usr/lib/python3/dist-packages/click/core.py", line 1137, in invoke
	return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/lib/python3/dist-packages/click/core.py", line 956, in invoke
	return ctx.invoke(self.callback, **ctx.params)
  File "/usr/lib/python3/dist-packages/click/core.py", line 555, in invoke
	return callback(*args, **kwargs)
  File "datalad-nda/scripts/datalad-nda", line 236, in add2datalad
	on_failure="stop",
  File "/mnt/scrap/tmp/abcd/datalad/datalad/distribution/dataset.py", line 503, in apply_func
	return f(**kwargs)
  File "/mnt/scrap/tmp/abcd/datalad/datalad/interface/utils.py", line 481, in eval_func
	return return_func(generator_func)(*args, **kwargs)
  File "/mnt/scrap/tmp/abcd/datalad/datalad/interface/utils.py", line 469, in return_func
	results = list(results)
  File "/mnt/scrap/tmp/abcd/datalad/datalad/interface/utils.py", line 456, in generator_func
	msg="Command did not complete successfully")
datalad.support.exceptions.IncompleteResultsError: Command did not complete successfully. 1 failed:
[{'action': 'addurls',
  'message': "AnnexBatchCommandError: 'addurl' [Error, annex reported failure "
			 'for addurl '
			 "(url='s3://NDAR_Central_2/submission_23229/derivatives/abcd-hcp-pipeline/sub-XXX/ses-baselineYear1Arm1/img/DVARS_and_FD_task-SST01.png'): "
			 "{'command': 'addurl', 'success': False, 'error-messages': ['  "
			 "unable to use special remote due to protocol error'], 'file': "
			 "'sub-XXX/ses-baselineYear1Arm1/img/DVARS_and_FD_task-SST01.png'}] "
			 '[annexrepo.py:add_url_to_file:1879]',
  'path': '/mnt/scrap/tmp/abcd/testds-fast-full2/derivatives/abcd-hcp-pipeline/sub-XXX/ses-baselineYear1Arm1/img/DVARS_and_FD_task-SST01.png',
  'status': 'error',
  'type': 'file'}]

> /mnt/scrap/tmp/abcd/datalad/datalad/interface/utils.py(456)generator_func()
-> msg="Command did not complete successfully")

I have no clue what is going on, and since it is a sandwich of datalad/git-annex/git-annex-remote-datalad-archives here -- it is hard to impossible to see what is happening. Logging the last lines from stderr might give a clue if it includes relevant log lines from the special remote log etc.

TODOs

  • [ ] See if it actually provides useful information for my use case
+25 -4

2 comments

1 changed file

yarikoptic

pr closed time in 3 days

delete branch yarikoptic/datalad

delete branch : enh-log-batched

delete time in 3 days

PR opened datalad/datalad

ENH: make it possible to debug batched annex DX
  • add the --debug option to the annex CMD --batch invocation when the log level <= 8, analogously to what we do in the main AnnexRepo "good old runner"
  • incorporate and supersede #4925 - log all stderr lines if logging of outputs was requested and the log level <= 5
    • 100 lines is often just not enough to troubleshoot an issue which happened some time before the batch process actually finished (crash of a special remote, etc.)

I hope these two changes will make debugging of annex, and of the external remotes started by it, more manageable (see the sketch below). ATM it is either a pain or just impossible.
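Roughly what the first change amounts to (an illustrative sketch; the actual option handling lives in datalad's runner code):

    import logging

    lgr = logging.getLogger("datalad.annex")

    def batched_annex_cmd(cmd):
        """Assemble a batched git-annex invocation, adding --debug at high verbosity."""
        annex_cmd = ["git", "annex", cmd, "--batch", "--json"]
        if lgr.getEffectiveLevel() <= 8:
            annex_cmd.append("--debug")  # mirror git-annex debugging into stderr
        return annex_cmd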

+27 -5

0 comment

2 changed files

pr created time in 3 days

create branch yarikoptic/datalad

branch : enh-debug-batched-annex

created branch time in 3 days

pull request comment datalad/datalad

ENH+RF: @try_multiple_dec_s3 to retry interaction with S3 upon "retriable" errors

I think I figured it out (although I guess I should have figured it out without additional logging ;)): CHECKURL happens right after the token has expired, but we are reusing the previous session -- that is why we receive Bad Request. That deep in the code we do not check the validity of the credentials and thus just "keep knocking on the now-closed door". Also, interestingly (not yet sure why), since we swallow all --debug output from batched annex and, with #4925 (which I use as well), show only the tail of the log, we miss the point where the restart happens. Upon restart of the special remote it does check that the credential is still valid and does re-mint it since it is expired. Pushed some commits to address this and other gotchas discovered while at it -- it MUST work now! ;)
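In pseudo-code, the fix amounts to checking the credential before reusing a session (a sketch with hypothetical names, not datalad's actual API):

    def get_session(credential, existing=None):
        """Reuse a session only while its credential is still valid (sketch)."""
        if existing is not None and not credential.is_expired:
            return existing
        credential.refresh()             # hypothetical: re-mint the expired token
        return new_session(credential)   # hypothetical session constructor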

yarikoptic

comment created time in 4 days

push event yarikoptic/datalad

Yaroslav Halchenko

commit sha 768993f1407f96479ff1f3589f55446d61866192

BF: Provide is_expired for CompositeCredential eh -- should use more of @abstractmethod I guess :-/ Here it is a mix of pure bool and properties, so left as is for now

view details

push time in 4 days

push event yarikoptic/datalad

Yaroslav Halchenko

commit sha 3148c63e6053c96743494c2e5d0bee3a44e58c55

ENH: provide details in the assert message when assert on having "command" in returned record

view details

push time in 4 days

push event yarikoptic/datalad

Yaroslav Halchenko

commit sha c936ed37d29b5175d49f27e020cae3498a9c5c92

RF: make needs_authentication into a bool I do not see a reason why I made it possible to pass an instance of a credential

view details

Yaroslav Halchenko

commit sha 0dde78cac1e6026261493e294a9ee28ebe927b37

BF: verify that if there is a credential that it has not expired yet while considering to reuse session

view details

Yaroslav Halchenko

commit sha c084251a14ecb535526b0e59550703b89c324862

BF: when we catch S3 error and see that authenticator key is expired -- raise AccessDeniedError That should cause outside loop in .access to retry _establish_session without even allowing for reusing previous session, and we should be all set!

view details

Yaroslav Halchenko

commit sha 5c50ef1e0d4272e6d9577b273cc3405fcc7e2aca

ENH: minor, removed commented out 403 Did not squash since branch already used in a "deployed" merge of various PRs

view details

push time in 4 days

pull request comment datalad/datalad

ENH: log up to 100 of the last stderr lines (if logging outputs) for batched processes

It did provide information, but not enough -- failures lead annex to restart the underlying annex-remote process, and those lines are no longer within the last 100. I think a better solution would be to make this all configurable, so the log (if configured) is dumped to a dedicated (or our main?) log file with a {pid} placeholder (as suggested for the main log in #4930).
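For example, such a configurable log target with a {pid} placeholder could be wired up like this (a sketch; the path is a hypothetical config value):

    import logging
    import os

    log_template = "/var/log/datalad/special-remote-{pid}.log"  # hypothetical config value
    handler = logging.FileHandler(log_template.format(pid=os.getpid()))
    logging.getLogger("datalad").addHandler(handler)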

yarikoptic

comment created time in 4 days

PR closed datalad/datalad

ENH: retry individual addurl if addurl fails up to 3 times

Not yet sure if that is the best way to proceed. But I feel that there is some "fluke" which causes git annex to report a failed CHECKURL. I could not yet make sense of the available logs to see whether it is on our end or not.

  • [ ] decide whether to bother with this approach at all
+44 -26

2 comments

1 changed file

yarikoptic

pr closed time in 4 days

pull request comment datalad/datalad

ENH: retry individual addurl if addurl fails up to 3 times

Decided not to bother, and to instead address the underlying issue with #4931

yarikoptic

comment created time in 4 days

PR opened datalad/datalad

RF: remove disable_logger in customremotes/datalad.py

It was introduced in 15ecbb78a86f36b353e763273577dd369671e61b (0.9.2~76^2~6) as a part of https://github.com/datalad/datalad/pull/1870 with a purpose according to description of not pollutting stdout used by special remote for communication. BUT logs go to stderr and do not interfer anyhow with the communication, so the reason for the change is not clear to me.

I have been trying to figure out why the heck I am still seeing failures after https://github.com/datalad/datalad/pull/4931 . I kept adding more logging but nothing appeared in the logs! I finally was brought to this piece of code, thus the motivation and argumentation for the change.

Let's see -- if CI is happy, it means that my argumentation above that there should be no interference is correct and it should be safe to merge.

+2 -4

0 comment

1 changed file

pr created time in 4 days

create branch yarikoptic/datalad

branch : rf-donot-disable-log

created branch time in 4 days

issue opened pyout/pyout

Exception dump: Flank regexp unexpectedly did not match result...

I think I never saw that one before.

<details> <summary>First it was dumped while final rendering was not finished so I didn't see the end of it (overwritten by pyout)</summary>

drogon:/mnt/backup/dandi/dandiarchive-replica
PATH                             SIZE
000027
000027/sub-RAT123/sub-RAT123.nwb 18.8 kB
2020-09-18 20:39:38,393 [   ERROR] exception calling callback for <Future at 0x7fc582646790 state=finished returned dict>
Traceback (most recent call last):
  File "/home/yoh/miniconda3/lib/python3.8/concurrent/futures/_base.py", line 328, in _invoke_callbacks
    callback(self)
  File "/home/yoh/miniconda3/lib/python3.8/site-packages/pyout/interface.py", line 479, in callback
    self._write_async_result(
  File "/home/yoh/miniconda3/lib/python3.8/site-packages/pyout/interface.py", line 95, in wrapped
    return method(self, *args, **kwds)
  File "/home/yoh/miniconda3/lib/python3.8/site-packages/pyout/interface.py", line 430, in _write_async_result
    self._write(result)
  File "/home/yoh/miniconda3/lib/python3.8/site-packages/pyout/interface.py", line 345, in _write
    self._write_fn(row, style)
  File "/home/yoh/miniconda3/lib/python3.8/site-packages/pyout/interface.py", line 364, in _write_update
    content, status, summary = self._content.update(row, style)
  File "/home/yoh/miniconda3/lib/python3.8/site-packages/pyout/common.py", line 914, in update
    content, status = super(ContentWithSummary, self).update(row, style)
  File "/home/yoh/miniconda3/lib/python3.8/site-packages/pyout/common.py", line 881, in update
    line, adjusted = self.fields.render(row, style)
  File "/home/yoh/miniconda3/lib/python3.8/site-packages/pyout/common.py", line 688, in render
    return self.style["separator_"].join(proc_fields) + "\n", adjusted
  File "/home/yoh/miniconda3/lib/python3.8/site-packages/pyout/common.py", line 687, in <genexpr>
    proc_fields = (fld(val, keys=proc_keys) for fld, val in proc_fields)
  File "/home/yoh/miniconda3/lib/python3.8/site-packages/pyout/field.py", line 155, in __call__
    result = fn(value, result)
PATH                             SESSION_START... DESCRIPTION     SEX ND_TYPES SPECIES          GENOTYPE AGE             NWB  #SUBJECTS IDENTIFIER   KEYWORDS    SUBJECT_ID ORGANISM        SESSION_DESC... NAME            SIZE
000027                                            Should be ig... M                                      maximum: 12 ...      1         000027       development            species: Rat...                 Test dataset...
000027/sub-RAT123/sub-RAT123.nwb 1971-01-01/12...                 M   Subject  Rattus norveg... WT       12 mo           2.0b           TEST_Subject             RAT123                     a file to te...                 18.8 kB
Summary:                         1971-01-01/12...                                                                                                                                                                           18.8 kB
                                 1971-01-01/12...

</details>

and upon rerun (callback would return faster since result is cached):

PATH                             SPECIES          GENOTYPE NAME            SEX ND_TYPES SESSION_START... ORGANISM        IDENTIFIER   NWB  SESSION_DESC... KEYWORDS    SUBJECT_ID AGE             #SUBJECTS DESCRIPTION     SIZE
000027                                                     Test dataset... M                             species: Rat... 000027                            development            maximum: 12 ... 1         Should be ig...
000027/sub-RAT123/sub-RAT123.nwb Rattus norveg... WT                       M   Subject  1971-01-01/12...                 TEST_Subject 2.0b a file to te...             RAT123     12 mo                                     18.8 kB
Summary:                                                                                1971-01-01/12...                                                                                                                    18.8 kB
                                                                                        1971-01-01/12...
2020-09-18 20:40:01,332 [   ERROR] exception calling callback for <Future at 0x7fb515897610 state=finished returned dict>
Traceback (most recent call last):
  File "/home/yoh/miniconda3/lib/python3.8/concurrent/futures/_base.py", line 328, in _invoke_callbacks
    callback(self)
  File "/home/yoh/miniconda3/lib/python3.8/site-packages/pyout/interface.py", line 479, in callback
    self._write_async_result(
  File "/home/yoh/miniconda3/lib/python3.8/site-packages/pyout/interface.py", line 95, in wrapped
    return method(self, *args, **kwds)
  File "/home/yoh/miniconda3/lib/python3.8/site-packages/pyout/interface.py", line 430, in _write_async_result
    self._write(result)
  File "/home/yoh/miniconda3/lib/python3.8/site-packages/pyout/interface.py", line 345, in _write
    self._write_fn(row, style)
  File "/home/yoh/miniconda3/lib/python3.8/site-packages/pyout/interface.py", line 364, in _write_update
    content, status, summary = self._content.update(row, style)
  File "/home/yoh/miniconda3/lib/python3.8/site-packages/pyout/common.py", line 914, in update
    content, status = super(ContentWithSummary, self).update(row, style)
  File "/home/yoh/miniconda3/lib/python3.8/site-packages/pyout/common.py", line 881, in update
    line, adjusted = self.fields.render(row, style)
  File "/home/yoh/miniconda3/lib/python3.8/site-packages/pyout/common.py", line 688, in render
    return self.style["separator_"].join(proc_fields) + "\n", adjusted
  File "/home/yoh/miniconda3/lib/python3.8/site-packages/pyout/common.py", line 687, in <genexpr>
    proc_fields = (fld(val, keys=proc_keys) for fld, val in proc_fields)
  File "/home/yoh/miniconda3/lib/python3.8/site-packages/pyout/field.py", line 155, in __call__
    result = fn(value, result)
  File "/home/yoh/miniconda3/lib/python3.8/site-packages/pyout/field.py", line 473, in split_flanks
    raise RuntimeError(
RuntimeError: Flank regexp unexpectedly did not match result: 'Should be ignored by regular mortals.\n\nATM contains o...' (type: <class 'str'>)

Interestingly, if I shrink the window horizontally (before running the command, not during), the exception goes away. So it appears only after some width increase.

<details> <summary>pip freeze info which was ran in conda env after installing dandi from conda and then pip install -e . in its git repo </summary>

$> pip freeze
appdirs==1.4.4
attrs==20.2.0
bcrypt==3.2.0
blessings==1.7
certifi==2020.4.5.1
cffi==1.14.0
chardet==3.0.4
ci-info==0.2.0
click==7.1.2
click-didyoumean==0.0.3
conda==4.8.3
conda-package-handling==1.7.0
cryptography==2.9.2
-e git://github.com/dandi/dandi-cli@928b45b5c608c5869688ea7240e8a21f363c2137#egg=dandi
diskcache==5.0.3
distro==1.5.0
dnspython==2.0.0
email-validator==1.1.1
etelemetry==0.2.2
fabric==2.5.0
girder-client==3.1.2
h5py==2.10.0
hdmf==2.2.0
humanize==2.6.0
idna==2.9
invoke==1.4.1
jeepney==0.4.3
Jinja2==2.11.2
joblib==0.16.0
jsonschema==3.2.0
keyring==21.4.0
keyrings.alt==3.4.0
MarkupSafe==1.1.1
numpy==1.19.2
pandas==1.1.2
paramiko==2.7.2
pycosat==0.6.3
pycparser==2.20
pycrypto==2.6.1
pydantic==1.6.1
PyNaCl==1.4.0
pynwb==1.4.0
pyOpenSSL==16.2.0
pyout==0.6.1
pyrsistent==0.17.3
PySocks==1.7.1
python-dateutil==2.8.1
pytz==2020.1
PyYAML==5.3.1
reproman==0.2.1
reprozip==1.0.16
requests==2.23.0
requests-toolbelt==0.9.1
rpaths==0.13
ruamel-yaml==0.15.87
ruamel.yaml.clib==0.2.2
scipy==1.5.2
scp==0.13.2
SecretStorage==3.1.2
semantic-version==2.8.5
six==1.14.0
tqdm==4.46.0
urllib3==1.25.8
usagestats==1.0

</details>

created time in 4 days

pull request comment dandi/dandi-cli

BF: dandiset metadata record structure/assumptions

I think it is ok to print it always (unless it fails to register)

yarikoptic

comment created time in 4 days

Pull request review comment dandi/dandi-cli

Script for "instantiating" Dandisets from asset store

+from pathlib import Path
+import subprocess
+from urllib.parse import urlparse
+import click
+import requests
+from dandi import girder
+from dandi.consts import dandiset_metadata_file
+from dandi.dandiarchive import navigate_url
+from dandi.dandiset import Dandiset
+from dandi.utils import get_instance
+
+
+@click.command()
+@click.option("-i", "--ignore-errors", is_flag=True)
+@click.argument("assetstore", type=click.Path(exists=True, file_okay=False))
+@click.argument("target", type=click.Path(file_okay=False))
+def main(assetstore, target, ignore_errors):
+    instantiate_dandisets(Path(assetstore), Path(target), ignore_errors)
+
+
+def instantiate_dandisets(
+    assetstore_path: Path, target_path: Path, ignore_errors=False
+):
+    with requests.Session() as s:
+        for did in get_dandiset_ids():
+            dsdir = target_path / did
+            dsdir.mkdir(parents=True, exist_ok=True)
+            with navigate_url(f"https://dandiarchive.org/dandiset/{did}/draft") as (
+                _,
+                dandiset,
+                assets,
+            ):
+                try:
+                    (dsdir / dandiset_metadata_file).unlink()
+                except FileNotFoundError:
+                    pass
+                metadata = dandiset.get("metadata", {})
+                ds = Dandiset(dsdir, allow_empty=True)
+                ds.update_metadata(metadata)
+                for a in assets:
+                    gid = a["girder"]["id"]
+                    src = assetstore_path / girderid2assetpath(s, gid)
+                    dest = dsdir / a["path"].lstrip("/")
+                    dest.parent.mkdir(parents=True, exist_ok=True)
+                    print(src, "->", dest)
+                    try:
+                        mklink(src, dest)
+                    except Exception:
+                        if not ignore_errors:
+                            raise
+
+
+def get_dandiset_ids():
+    dandi_instance = get_instance("dandi")
+    client = girder.get_client(dandi_instance.girder, authenticate=False)
+    offset = 0
+    per_page = 50
+    while True:
+        dandisets = client.get(
+            "dandi", parameters={"limit": per_page, "offset": offset}
+        )
+        if not dandisets:
+            break
+        for d in dandisets:
+            yield d["meta"]["dandiset"]["identifier"]
+        offset += len(dandisets)
+
+
+def girderid2assetpath(session, girder_id):
+    r = session.head(
+        f"https://girder.dandiarchive.org/api/v1/file/{girder_id}/download"
+    )
+    r.raise_for_status()
+    return urlparse(r.headers["Location"]).path.lstrip("/")
+
+
+def mklink(src, dest):
+    subprocess.run(
+        ["cp", "-L", "--reflink=always", "--remove-destination", str(src), str(dest)],

Did you run into use cases where there were multiple files for the same path, so you added --remove-destination, or was it for "rerunning"?

jwodder

comment created time in 4 days

PullRequestReviewEvent

pull request comment datalad/datalad

ENH+RF: @try_multiple_dec_s3 to retry interaction with S3 upon "retriable" errors

Update: somehow this change did not help me -- yet to figure out why (difficulty is all the sandwidching of datalad/git-annex/datalad-remote + inability to enter even epdb since it dumps to stdout tripping git annex). so I let's wait to merge it until I see how to make it actually work for the target use case or why it didn't help when it should've

yarikoptic

comment created time in 4 days

push event yarikoptic/datalad

Yaroslav Halchenko

commit sha b107448d5cd11b0c32747525a1d0204d805d920c

typo fix Co-authored-by: Kyle Meyer <kyle@kyleam.com>

view details

push time in 4 days

Pull request review comment datalad/datalad

ENH+RF: @try_multiple_dec_s3 to retry interaction with S3 upon "retriable" errors

 def try_multiple_dec(f, ntrials=None, duration=0.1, exceptions=None, increment_t
     increment_type: {None, 'exponential'}
       Note that if it is exponential, duration should typically be > 1.0
       so it grows with higher power
-
+    exceptions: Exception or tuple of Exceptions, optional
+      Exception or a tuple of multiple exceptions, on which to retry
+    exceptions_filter: callable, optional
+      If provided, this unction will be called with a caught exception

THANK YOU!

yarikoptic

comment created time in 4 days

PullRequestReviewEvent

issue comment con/ference

Review of existing solutions (platforms and individual components to bolt together)

Just a note: the recording for that showandtell is available (yet to watch it myself)

yarikoptic

comment created time in 4 days

pull request comment datalad/datalad-extensions

Combined workflow for building git-annex on Ubuntu and macOS

Note: we already have a token in the secrets

jwodder

comment created time in 4 days

pull request comment datalad/datalad-extensions

Combined workflow for building git-annex on Ubuntu and macOS

Note: actions/virtual-environments is the repository for the underlying recipes of GitHub Actions environments. So I take it as a generic statement on how the macOS VMs are set up for GitHub Actions; this explains why we observe rate limiting only for macOS.

We could start by adding the token to the curl invocation.

jwodder

comment created time in 4 days

pull request comment datalad/datalad-extensions

Combined workflow for building git-annex on Ubuntu and macOS

Re the ssh failure -- not sure yet; it may indeed be some race. If you have a specific log handy, please cut/paste the details (you could upload the entire log to smaug to share) so we can comprehend what is going on. @kyleam might have ideas as well.

jwodder

comment created time in 4 days

pull request comment datalad/datalad-extensions

Combined workflow for building git-annex on Ubuntu and macOS

I think https://github.com/actions/virtual-environments/issues/602#issuecomment-602472951 sheds light on it

The issue comes from the way how GitHub counts requests for rate limit. For unauthorized requests, it limits by IP. All macOS VMs have the same IP address because of infrastructure.

and the resolution was to provide a GITHUB token

jwodder

comment created time in 4 days

pull request comment conda-forge/git-annex-feedstock

added testing of downstream packages annexremote and datalad

I think all three failing tests are ones we haven't seen fail before; filed https://github.com/datalad/datalad/issues/4933 and https://github.com/datalad/datalad/issues/4932 . We will try to look into them soonish rather than laterish

notestaff

comment created time in 5 days

issue opened datalad/datalad

2 datalad-archives tests fail on conda

The linux_64_nodepTrue downstream testing of datalad 0.13.3 against the git-annex build on conda-forge (https://github.com/conda-forge/git-annex-feedstock/pull/101) fails in two datalad-archives remote related tests:

2020-09-18T03:50:48.4181911Z ======================================================================
2020-09-18T03:50:48.4182393Z ERROR: datalad.customremotes.tests.test_archives.test_annex_get_from_subdir
2020-09-18T03:50:48.4183170Z ----------------------------------------------------------------------
2020-09-18T03:50:48.4183612Z Traceback (most recent call last):
2020-09-18T03:50:48.4184924Z   File "/home/conda/feedstock_root/build_artifacts/git-annex_1600392873041/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeh/lib/python3.8/site-packages/nose/case.py", line 197, in runTest
2020-09-18T03:50:48.4185885Z     self.test(*self.arg)
2020-09-18T03:50:48.4187203Z   File "/home/conda/feedstock_root/build_artifacts/git-annex_1600392873041/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeh/lib/python3.8/site-packages/datalad/tests/utils.py", line 554, in _wrap_with_tree
2020-09-18T03:50:48.4188120Z     return t(*(arg + (d,)), **kw)
2020-09-18T03:50:48.4189536Z   File "/home/conda/feedstock_root/build_artifacts/git-annex_1600392873041/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeh/lib/python3.8/site-packages/datalad/customremotes/tests/test_archives.py", line 178, in test_annex_get_from_subdir
2020-09-18T03:50:48.4190841Z     runner(['git', 'annex', 'get', '--', fn_in_archive_obscure])   # run git annex get
2020-09-18T03:50:48.4192225Z   File "/home/conda/feedstock_root/build_artifacts/git-annex_1600392873041/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeh/lib/python3.8/site-packages/datalad/cmd.py", line 599, in __call__
2020-09-18T03:50:48.4193116Z     return self.run(cmd, *args, **kwargs)
2020-09-18T03:50:48.4194418Z   File "/home/conda/feedstock_root/build_artifacts/git-annex_1600392873041/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeh/lib/python3.8/site-packages/datalad/cmd.py", line 961, in run
2020-09-18T03:50:48.4195255Z     raise exc
2020-09-18T03:50:48.4196205Z datalad.support.exceptions.CommandError: CommandError: 'git annex get -- ' "'"'"';a&b&cΔЙקﻡ๗あ `| '' failed with exitcode 1 [out: 'get  "';a&b&cΔЙקﻡ๗あ `|  (from datalad-archives...)
2020-09-18T03:50:48.4196862Z
2020-09-18T03:50:48.4197076Z
2020-09-18T03:50:48.4197609Z   Unable to access these remotes: datalad-archives
2020-09-18T03:50:48.4197958Z
2020-09-18T03:50:48.4198297Z   Try making some of these repositories available:
2020-09-18T03:50:48.4198967Z    c04eb54b-4b4e-5755-8436-866b043170fa -- [datalad-archives]
2020-09-18T03:50:48.4200450Z failed'] [err: 'Failed to fetch any archive containing SHA256E-s3--a665a45920422f9d417e4867efdc4fb8a04a1f3fff1fa07e998e86f7f7a27ae3. Tried: ['SHA256E-s166--44e4c670086709cf7aadd3a7618ca77f5f1d5934305a3e792cf671e7bbe1d20b.tar.gz'] [archives.py:_transfer:405]
2020-09-18T03:50:48.4201611Z git-annex: get: 1 failed']
2020-09-18T03:50:48.4202260Z -------------------- >> begin captured logging << --------------------
2020-09-18T03:50:48.4202718Z asyncio: DEBUG: Using selector: EpollSelector
2020-09-18T03:50:48.4203409Z asyncio: DEBUG: Using selector: EpollSelector
2020-09-18T03:50:48.4204670Z asyncio: DEBUG: Using selector: EpollSelector
2020-09-18T03:50:48.4205176Z asyncio: DEBUG: Using selector: EpollSelector
2020-09-18T03:50:48.4205605Z asyncio: DEBUG: Using selector: EpollSelector
2020-09-18T03:50:48.4206694Z fasteners.process_lock: Level 5: Acquired file lock `/tmp/datalad_temp_tree_test_annex_get_from_subdirxu27uco6/.git/datalad/tmp/archives/69dc725608.extract-lck` after waiting 0.000s [1 attempts were required]
2020-09-18T03:50:48.4207955Z fasteners.process_lock: Level 5: Unlocked and closed file lock open on `/tmp/datalad_temp_tree_test_annex_get_from_subdirxu27uco6/.git/datalad/tmp/archives/69dc725608.extract-lck`
2020-09-18T03:50:48.4208632Z asyncio: DEBUG: Using selector: EpollSelector
2020-09-18T03:50:48.4209036Z asyncio: DEBUG: Using selector: EpollSelector
2020-09-18T03:50:48.4209688Z --------------------- >> end captured logging << ---------------------

2020-09-18T03:50:48.4210377Z ======================================================================
2020-09-18T03:50:48.4210847Z FAIL: datalad.customremotes.tests.test_archives.test_basic_scenario
2020-09-18T03:50:48.4211510Z ----------------------------------------------------------------------
2020-09-18T03:50:48.4211960Z Traceback (most recent call last):
2020-09-18T03:50:48.4213242Z   File "/home/conda/feedstock_root/build_artifacts/git-annex_1600392873041/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeh/lib/python3.8/site-packages/nose/case.py", line 197, in runTest
2020-09-18T03:50:48.4214109Z     self.test(*self.arg)
2020-09-18T03:50:48.4215432Z   File "/home/conda/feedstock_root/build_artifacts/git-annex_1600392873041/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeh/lib/python3.8/site-packages/datalad/tests/utils.py", line 554, in _wrap_with_tree
2020-09-18T03:50:48.4216318Z     return t(*(arg + (d,)), **kw)
2020-09-18T03:50:48.4217634Z   File "/home/conda/feedstock_root/build_artifacts/git-annex_1600392873041/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeh/lib/python3.8/site-packages/datalad/tests/utils.py", line 731, in _wrap_with_tempfile
2020-09-18T03:50:48.4218554Z     return t(*(arg + (filename,)), **kw)
2020-09-18T03:50:48.4219966Z   File "/home/conda/feedstock_root/build_artifacts/git-annex_1600392873041/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeh/lib/python3.8/site-packages/datalad/customremotes/tests/test_archives.py", line 124, in test_basic_scenario
2020-09-18T03:50:48.4220955Z     assert_true(annex.file_has_content(fn_extracted))
2020-09-18T03:50:48.4221338Z AssertionError: False is not true
2020-09-18T03:50:48.4221966Z -------------------- >> begin captured logging << --------------------
2020-09-18T03:50:48.4222420Z asyncio: DEBUG: Using selector: EpollSelector
2020-09-18T03:50:48.4222824Z asyncio: DEBUG: Using selector: EpollSelector
2020-09-18T03:50:48.4223240Z asyncio: DEBUG: Using selector: EpollSelector
2020-09-18T03:50:48.4223735Z asyncio: DEBUG: Using selector: EpollSelector
2020-09-18T03:50:48.4224154Z asyncio: DEBUG: Using selector: EpollSelector
2020-09-18T03:50:48.4224625Z asyncio: DEBUG: Using selector: EpollSelector
2020-09-18T03:50:48.4225039Z asyncio: DEBUG: Using selector: EpollSelector
2020-09-18T03:50:48.4225825Z --------------------- >> end captured logging << ---------------------


created time in 5 days

issue opened datalad/datalad

test_get_flexible_source_candidates_for_submodule test failing on conda

The linux_64_nodepTrue downstream testing of datalad 0.13.3 against the git-annex build on conda-forge (https://github.com/conda-forge/git-annex-feedstock/pull/101) fails in test_get.test_get_flexible_source_candidates_for_submodule:

2020-09-18T03:50:48.4226470Z ======================================================================
2020-09-18T03:50:48.4226924Z FAIL: datalad.distribution.tests.test_get.test_get_flexible_source_candidates_for_submodule
2020-09-18T03:50:48.4227600Z ----------------------------------------------------------------------
2020-09-18T03:50:48.4228004Z Traceback (most recent call last):
2020-09-18T03:50:48.4229198Z   File "/home/conda/feedstock_root/build_artifacts/git-annex_1600392873041/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeh/lib/python3.8/site-packages/nose/case.py", line 197, in runTest
2020-09-18T03:50:48.4229994Z     self.test(*self.arg)
2020-09-18T03:50:48.4231200Z   File "/home/conda/feedstock_root/build_artifacts/git-annex_1600392873041/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeh/lib/python3.8/site-packages/datalad/tests/utils.py", line 731, in _wrap_with_tempfile
2020-09-18T03:50:48.4232066Z     return t(*(arg + (filename,)), **kw)
2020-09-18T03:50:48.4233310Z   File "/home/conda/feedstock_root/build_artifacts/git-annex_1600392873041/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeh/lib/python3.8/site-packages/datalad/tests/utils.py", line 731, in _wrap_with_tempfile
2020-09-18T03:50:48.4235528Z     return t(*(arg + (filename,)), **kw)
2020-09-18T03:50:48.4236850Z   File "/home/conda/feedstock_root/build_artifacts/git-annex_1600392873041/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeh/lib/python3.8/site-packages/datalad/tests/utils.py", line 731, in _wrap_with_tempfile
2020-09-18T03:50:48.4237697Z     return t(*(arg + (filename,)), **kw)
2020-09-18T03:50:48.4239042Z   File "/home/conda/feedstock_root/build_artifacts/git-annex_1600392873041/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeh/lib/python3.8/site-packages/datalad/distribution/tests/test_get.py", line 145, in test_get_flexible_source_candidates_for_submodule
2020-09-18T03:50:48.4240270Z     eq_(f(clone, clone.subdatasets(return_type='item-or-list')),
2020-09-18T03:50:48.4242284Z AssertionError: [{'cost': 400, 'name': 'bang', 'url': 'youredead', 'from_config': True}, {'cost': 500, 'name': 'origin', 'url': '/tmp/datalad_temp_test_get_flexible_source_candidates_for_submoduleso013kl9/sub'}, {'cost': 700, 'name': 'bang', 'url': 'pre-a2f21833-32ac-4372-b0dc-5577a931b4a3-post', 'from_config': True}] != [{'cost': 500, 'name': 'origin', 'url': '/tmp/datalad_temp_test_get_flexible_source_candidates_for_submoduleso013kl9/sub'}, {'cost': 700, 'name': 'bang', 'url': 'pre-a2f21833-32ac-4372-b0dc-5577a931b4a3-post', 'from_config': True}]
2020-09-18T03:50:48.4244024Z -------------------- >> begin captured logging << --------------------
2020-09-18T03:50:48.4244489Z asyncio: DEBUG: Using selector: EpollSelector
2020-09-18T03:50:48.4245078Z asyncio: DEBUG: Using selector: EpollSelector
2020-09-18T03:50:48.4245495Z asyncio: DEBUG: Using selector: EpollSelector
2020-09-18T03:50:48.4245904Z asyncio: DEBUG: Using selector: EpollSelector
2020-09-18T03:50:48.4246316Z asyncio: DEBUG: Using selector: EpollSelector
2020-09-18T03:50:48.4246824Z asyncio: DEBUG: Using selector: EpollSelector
2020-09-18T03:50:48.4247311Z asyncio: DEBUG: Using selector: EpollSelector
2020-09-18T03:50:48.4247730Z asyncio: DEBUG: Using selector: EpollSelector
2020-09-18T03:50:48.4248382Z --------------------- >> end captured logging << ---------------------
2020-09-18T03:50:48.4248742Z
2020-09-18T03:50:48.4249296Z ----------------------------------------------------------------------

I don't think we saw this one yet.

created time in 5 days

pull request comment datalad-handbook/book

DataLad as DVC for ML analysis

yes, let's merge!!! Thank you for this section - I think it is great and will be a valuable addition to the handbook.

adswa

comment created time in 5 days

issue commentdandi/dandi-cli

"instantiate" dandisets from the "backup"

awesome! yes please - commit it under tools/.

Due to https://github.com/dandi/dandiarchive/issues/491, though, we are lacking a dandiset.yaml in each one of those. Could you please adjust the script to use dandi download --download dandiset.yaml to instantiate all of them so we get them "more complete"?

yarikoptic

comment created time in 5 days

PullRequestReviewEvent
PullRequestReviewEvent

Pull request review commentdatalad/datalad

ENH+RF: @try_multiple_dec_s3 to retry interaction with S3 upon "retriable" errors

 def _handle_exception(e, bucket_name):
         )
 
+def try_multiple_dec_s3(func):
+    """An S3 specific adapter to @try_multiple_dec
+
+    To decorate func to try multiple times after some sleep upon encountering
+    some intermittent error from S3
+    """
+    return try_multiple_dec(
+                ntrials=4,
+                duration=2.,
+                increment_type='exponential',
+                exceptions=S3ResponseError,
+                # https://docs.aws.amazon.com/AmazonS3/latest/API/ErrorResponses.html#ErrorCodeList
+                exceptions_filter=lambda e: e.status in (
+                    307,  # MovedTemporarily -- DNS updates etc
+                    400,  # Generic Bad Request -- we kept hitting it once in a while
+                    # 403,

oh, this is stuck here from "testing", where I was quickly triggering the handling by providing an incorrect URL to a bucket forbidding listing, so a 403 was raised. I had better remove this commented-out chunk before merging. I will keep it here for the duration of the review/until the next push since CI is already running

yarikoptic

comment created time in 5 days

PR opened datalad/datalad

ENH+RF: @try_multiple_dec_s3 to retry interaction with S3 upon "retriable" errors

I think this is an alternative (and a better solution to the underlying issue) to https://github.com/datalad/datalad/pull/4928 -- I kept hitting a 400 error from S3 while running addurls on ABCD. I believe in the course of the DANDI project we had similar occasions where S3 would fail with 400 without apparent reason. Well, there are many reasons why 400 could be returned - https://docs.aws.amazon.com/AmazonS3/latest/API/ErrorResponses.html#ErrorCodeList - and some of them are "legit", i.e. a retry probably should not 'fix' them. But I do not want to sift through all of them ATM. IMHO it would be perfectly fine to retry a few times.

I have also added other codes (with comments in the code) upon which I think it is legit to retry after a sleep.

Also, @try_multiple_dec was enhanced and refactored a little to make it more generally useful.

I will now rerun addurls with this PR applied so I can see whether I finally manage to complete the addurls runs without a crash.
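For the curious, the general shape of the decorator is roughly as below -- a minimal sketch mirroring the parameter names from the diff above (defaults here are illustrative), not a verbatim copy of the DataLad implementation:

import functools
import time


def try_multiple_dec(ntrials=3, duration=1., increment_type=None,
                     exceptions=Exception, exceptions_filter=None):
    """Decorate a function to retry it upon "retriable" exceptions.

    Sleeps `duration` seconds between trials; with
    increment_type='exponential' the sleep doubles after each failure.
    exceptions_filter, if given, decides whether a caught exception
    is worth retrying at all.
    """
    def decorate(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            sleep = duration
            for trial in range(1, ntrials + 1):
                try:
                    return func(*args, **kwargs)
                except exceptions as exc:
                    # give up right away if out of trials or the error
                    # is not considered retriable
                    if trial == ntrials or \
                            (exceptions_filter and not exceptions_filter(exc)):
                        raise
                    time.sleep(sleep)
                    if increment_type == 'exponential':
                        sleep *= 2
        return wrapper
    return decorate

try_multiple_dec_s3 is then just this decorator pre-configured with S3ResponseError and the list of "retriable" S3 status codes.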

+73 -22

0 comments

3 changed files

pr created time in 5 days

create branchyarikoptic/datalad

branch : enh-try-multiple

created branch time in 5 days

issue commentdatalad/datalad-metalad

Use git-notes instead of per-branch metadata?

Some oddities (cannot get them to prune, as I think they should)

As (was) with submodules, I think the notes feature is not that popular, so I won't be surprised if it has more bugs than core git functionality - might be worth checking with the git people when running into something unexpected.

As for not needing history - the benefit of history on text files is the diff between states and thus an efficient object store. I think that generally we do want to make it possible to access metadata for any/previous releases/states of the dataset.

mih

comment created time in 5 days

issue commentdandi/dandi-cli

"instantiate" dandisets from the "backup"

hm... for now please add an option for that -- for the "investigate metadata compliance" use case it is ok to miss a few files. BUT eventually we need to figure it out. (the backup runs daily, so it might be that there were some changes to that dandiset today? will check later)

yarikoptic

comment created time in 5 days

PR opened datalad/datalad

ENH: make logtarget a format spec which understands pid

My primary use case ATM: for some reason the datalad external special remote logs NOTHING into DATALAD_LOG_TARGET -- may be due to a collision on the same file from multiple processes (i.e. the main datalad process and then the special remote). By adding {pid} into the mix I hope to tease the two log files apart.
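To illustrate the idea (a hypothetical sketch -- the exact expansion logic in the PR may differ): the log target becomes a format spec, and each process expands {pid} to its own PID before opening the file, so the main process and the special remote stop clobbering a single file:

import os

# hypothetical example: DATALAD_LOG_TARGET set to e.g. '/tmp/datalad-{pid}.log'
spec = os.environ.get('DATALAD_LOG_TARGET', '/tmp/datalad-{pid}.log')
target = spec.format(pid=os.getpid())
# each process now opens its own per-PID log file
with open(target, 'a') as f:
    f.write('hello from PID %d\n' % os.getpid())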

If we like it

  • [ ] add {datetime}
  • [ ] anything else?
+2 -1

0 comments

1 changed file

pr created time in 5 days

create branchdatalad/datalad

branch : enh-logtarget

created branch time in 5 days

pull request commentdatalad/datalad-extensions

Combined workflow for building git-annex on Ubuntu and macOS

ok, the singularity issue is fixed now, so the linux build succeeds and tests are running. So we could proceed further with this PR (the test setup on OSX seems to need a bit more work).

jwodder

comment created time in 6 days

issue commentdatalad/datalad

GitHub actions support self-hosted runners!

https://github.com/ci-for-research/self-hosted-runners provides nice, detailed instructions and some ansible recipes for configuring self-hosted runners for a number of scenarios (a docker image, a virtualbox instance). Also https://github.com/actions/virtual-environments/ provides packer recipes for the stock github actions environments and instructions on how to build them. BUT it is all possibly too tied into Azure, and the scripts which invoke packer are in powershell.

But I guess anyway this all should wait until github allows provisioning some control (via "permitted users" or label assignment on PRs from others) to avoid running potentially malicious code from PRs: see https://docs.github.com/en/actions/hosting-your-own-runners/about-self-hosted-runners#self-hosted-runner-security-with-public-repositories -- although if someone manages to configure some cloudy azure-like "start a brand new VM" setup for every workflow run, it could be done sooner.

yarikoptic

comment created time in 6 days

issue closeddatalad/datalad-extensions

singularity wows

2020-09-11T12:56:55.5859873Z > chronic singularity exec ./buildenv.sif make -C git-annex debianstandalone-dsc
2020-09-11T12:56:55.5862045Z I: Build source packages
2020-09-11T12:56:55.6789364Z ERROR  : Failed to mount squashfs image in (read only): Invalid argument
2020-09-11T12:56:55.6792440Z ABORT  : Retval = 255

Looking at the diff of the two build logs (the last successful one and the failing one, timestamps removed to ease the diff) - I do not see any possibly relevant differences

<details> <summary>differences in images setup -- nothing which strikes me as relevant</summary>

(git)lena:~/proj/github/virtual-environments[main]images/linux
$> git diff ubuntu18/20200825.1..ubuntu18/20200901.1 -- . | xsel -i
diff --git a/images/linux/Ubuntu1604-README.md b/images/linux/Ubuntu1604-README.md
index e8e99f3..56c6430 100644
--- a/images/linux/Ubuntu1604-README.md
+++ b/images/linux/Ubuntu1604-README.md
@@ -1,12 +1,12 @@
 <!--- DO NOT EDIT - This markdown file is autogenerated. -->
 # Ubuntu 16.04.7 LTS
-The following software is installed on machines with the 20200817.1 update.
+The following software is installed on machines with the 20200825.1 update.
 ***
 - 7-Zip 9.20
 - Ansible (ansible 2.9.12)
 - AzCopy7 (available by azcopy alias) 7.3.0
-- AzCopy10 (available by azcopy10 alias) 10.5.1
-- Azure CLI (azure-cli                         2.10.1)
+- AzCopy10 (available by azcopy10 alias) 10.6.0
+- Azure CLI (azure-cli                         2.10.1 *)
 - Azure CLI (azure-devops                      0.18.0)
 - Basic packages:
   - dnsutils
@@ -66,7 +66,7 @@ The following software is installed on machines with the 20200817.1 update.
   - yamllint
   - libcurl3
 - Alibaba Cloud CLI (3.0.56)
-- AWS CLI (aws-cli/1.18.120 Python/2.7.12 Linux/4.15.0-1092-azure botocore/1.17.43)
+- AWS CLI (aws-cli/1.18.125 Python/2.7.12 Linux/4.15.0-1092-azure botocore/1.17.48)
 - AWS CLI Session manager plugin (1.1.61.0)
 - build-essential
 - nvm (0.35.3)
@@ -78,7 +78,7 @@ Target: x86_64-unknown-linux-gnu
 - CMake (cmake version 3.17.0)
 - Docker Compose (docker-compose version 1.26.2, build eefe0d31)
 - Docker-Moby (Docker version 19.03.12+azure, build 0ed913b885c8919944a2e4c8d0b80a318a8dd48b)
-- Docker-Buildx (0.4.1+azure)
+- Docker-Buildx (0.4.2+azure)
 - .NET Core SDK:
   - 3.1.401
   - 3.1.302
@@ -159,14 +159,14 @@ Target: x86_64-unknown-linux-gnu
 - Git-ftp (1.0.2)
 - Hub CLI (2.14.2)
 - GitHub CLI 0.11.1
-- Google Chrome (Google Chrome 84.0.4147.125 )
-- ChromeDriver 84.0.4147.30 (48b3e868b4cc0aa7e8149519690b6f6949e110a8-refs/branch-heads/4147@{#310}); Chrome Driver is available via CHROMEWEBDRIVER environment variable
-- Google Cloud SDK (305.0.0)
+- Google Chrome (Google Chrome 85.0.4183.83 )
+- ChromeDriver 85.0.4183.38 (9047dbc2c693f044042bbec5c91401c708c7c26a-refs/branch-heads/4183@{#779}); Chrome Driver is available via CHROMEWEBDRIVER environment variable
+- Google Cloud SDK (307.0.0)
 - Haskell Cabal (cabal-install version 3.2.0.0
 compiled using version 3.2.0.0 of the Cabal library )
 - GHC (The Glorious Glasgow Haskell Compilation System, version 8.10.2)
 - Haskell Stack (Version 2.3.3, Git revision cb44d51bed48b723a5deb08c3348c0b3ccfc437e x86_64 hpack-0.33.0)
-- Heroku (heroku/7.42.6 linux-x64 node-v12.16.2)
+- Heroku (heroku/7.42.10 linux-x64 node-v12.16.2)
 - HHVM (HipHop VM 4.56.1 (rel))
 - ImageMagick
 - Azul Zulu OpenJDK:
@@ -176,12 +176,13 @@ compiled using version 3.2.0.0 of the Cabal library )
   - 11 (openjdk version "11.0.8" 2020-07-14) 
   - 12 (openjdk version "12.0.2" 2019-07-16)
 - Ant (Apache Ant(TM) version 1.9.6 compiled on July 20 2018)
-- Gradle 6.6
+- Gradle 6.6.1
 - Maven (Apache Maven 3.6.3 (cecedd343002696d0abb50b32b541b8a6ba2883f))
 - Kind (kind v0.8.1 go1.14.2 linux/amd64)
 - kubectl (Client Version: v1.18.8)
 - helm (v3.3.0+g8a4aeec)
 - minikube version: v1.12.3
+- kustomize ({kustomize/v3.8.1  2020-07-16T00:58:46Z  })
 - oc CLI Client Version: 4.5.0-202005291417-9933eb9
 - Leiningen (Leiningen 2.9.4 on Java 1.8.0_265 OpenJDK 64-Bit Server VM)
 - Mercurial (Mercurial Distributed SCM (version 4.4.1))
@@ -197,13 +198,13 @@ compiled using version 3.2.0.0 of the Cabal library )
 Local version: Unknown)
 - n (6.7.0)
 - Parcel (1.12.4)
-- TypeScript (Version 3.9.7)
+- TypeScript (Version 4.0.2)
 - Webpack (4.44.1)
 - Webpack CLI (3.3.12)
 - Yarn (1.22.4)
 - Newman (5.1.2)
 - Bazel (bazel 3.4.1)
-- Bazelisk (1.6.0)
+- Bazelisk (1.6.1)
 - ORAS CLI 0.8.1
 - PhantomJS (2.1.1)
 - PHP 5.6 (PHP 5.6.40-30+ubuntu16.04.1+deb.sury.org+1 (cli) )
@@ -228,8 +229,8 @@ Local version: Unknown)
 - rustfmt (1.4.17-stable)
 - clippy (0.0.212)
 - rustdoc (1.45.2)
-- bindgen (0.54.1)
-- cbindgen (0.14.3)
+- bindgen (0.55.1)
+- cbindgen (0.14.4)
 - cargo audit (0.12.0)
 - cargo outdated (v0.9.11)
 - Julia (julia version 1.5.0)
@@ -240,7 +241,7 @@ Local version: Unknown)
 - Terraform (Terraform v0.13.0)
 - Packer (1.6.1)
 - Vcpkg 2020.06.15-unknownhash
-- Vercel CLI (19.2.0)
+- Vercel CLI (20.0.0)
 - MongoDB on Linux v4.4.0
 - Haveged 1.9.1-3
 - Swig 3.0.8
@@ -253,9 +254,10 @@ Local version: Unknown)
 - Google APIs 21
 - CMake 3.10.2.4988404
 3.6.4111459
-- Android Support Repository 47.0.0
-- Android Solver for ConstraintLayout 1.0.2
-- Android Solver for ConstraintLayout 1.0.1
+- Android ConstraintLayout 1.0.2
+- Android ConstraintLayout 1.0.1
+- Android ConstraintLayout Solver 1.0.2
+- Android ConstraintLayout Solver 1.0.1
 - Android SDK Platform-Tools 30.0.4
 - Android SDK Platform 30
 - Android SDK Platform 29
@@ -272,6 +274,7 @@ Local version: Unknown)
 - Android SDK Platform 15
 - Android SDK Platform 10
 - Android SDK Patch Applier v4
+- Android SDK Build-Tools 30.0.2
 - Android SDK Build-Tools 30.0.1
 - Android SDK Build-Tools 30.0.0
 - Android SDK Build-Tools 29.0.3
@@ -306,8 +309,6 @@ Local version: Unknown)
 - Android SDK Build-Tools 19.1.0
 - Android SDK Build-Tools 17.0.0
 - Android NDK 21.3.6528147
-- Android ConstraintLayout 1.0.2
-- Android ConstraintLayout 1.0.1
 - Az Module (1.0.0)
 - Az Module (1.6.0)
 - Az Module (2.3.2)
@@ -346,8 +347,8 @@ Local version: Unknown)
 - Python:
   - Python 2.7.18
   - Python 3.5.9
-  - Python 3.6.11
-  - Python 3.7.8
+  - Python 3.6.12
+  - Python 3.7.9
   - Python 3.8.5
 - PyPy:
   - PyPy 2.7.13 [PyPy 7.3.1 with GCC 7.3.1 20180303 (Red Hat 7.3.1-5)]
@@ -367,5 +368,5 @@ Local version: Unknown)
   - boost 1.69.0
   - boost 1.72.0
 - AWS SAM CLI, version 1.1.0
-- Homebrew on Linux (Homebrew 2.4.11
-Homebrew/linuxbrew-core (git revision 99cdab; last commit 2020-08-16))
+- Homebrew on Linux (Homebrew 2.4.13
+Homebrew/linuxbrew-core (git revision 7d5ec; last commit 2020-08-24))
diff --git a/images/linux/Ubuntu1804-README.md b/images/linux/Ubuntu1804-README.md
index 03ef007..11cc81e 100644
--- a/images/linux/Ubuntu1804-README.md
+++ b/images/linux/Ubuntu1804-README.md
@@ -1,12 +1,12 @@
 <!--- DO NOT EDIT - This markdown file is autogenerated. -->
 # Ubuntu 18.04.5 LTS
-The following software is installed on machines with the 20200825.1 update.
+The following software is installed on machines with the 20200901.1 update.
 ***
 - 7-Zip 16.02
 - Ansible (ansible 2.9.12)
 - AzCopy7 (available by azcopy alias) 7.3.0
 - AzCopy10 (available by azcopy10 alias) 10.6.0
-- Azure CLI (azure-cli                         2.10.1 *)
+- Azure CLI (azure-cli                         2.11.1)
 - Azure CLI (azure-devops                      0.18.0)
 - Basic packages:
   - dnsutils
@@ -65,7 +65,7 @@ The following software is installed on machines with the 20200825.1 update.
   - yamllint
   - libcurl3
 - Alibaba Cloud CLI (3.0.56)
-- AWS CLI (aws-cli/1.18.125 Python/2.7.17 Linux/5.3.0-1035-azure botocore/1.17.48)
+- AWS CLI (aws-cli/1.18.129 Python/2.7.17 Linux/5.3.0-1035-azure botocore/1.17.52)
 - AWS CLI Session manager plugin (1.1.61.0)
 - build-essential
 - Clang 6.0 (6.0.0)
@@ -74,7 +74,7 @@ The following software is installed on machines with the 20200825.1 update.
 - Swift version 5.2.5 (swift-5.2.5-RELEASE)
 Target: x86_64-unknown-linux-gnu
 - CMake (cmake version 3.17.0)
-- Podman (2.0.4)
+- Podman (2.0.5)
 - Buildah (1.15.1)
 - Skopeo (1.1.1)
 - Docker Compose (docker-compose version 1.26.2, build eefe0d31)
@@ -148,7 +148,7 @@ Target: x86_64-unknown-linux-gnu
   - 2.1.301
   - 2.1.300
 - Erlang (Erlang (SMP,ASYNC_THREADS,HIPE) (BEAM) emulator version 11.0.3)
-- Firefox (Mozilla Firefox 79.0)
+- Firefox (Mozilla Firefox 80.0)
 - Geckodriver (0.27.0); Gecko Driver is available via GECKOWEBDRIVER environment variable
 - GNU C++ 7.5.0
 - GNU C++ 8.4.0
@@ -161,14 +161,14 @@ Target: x86_64-unknown-linux-gnu
 - Hub CLI (2.14.2)
 - GitHub CLI 0.11.1
 - Google Chrome (Google Chrome 85.0.4183.83 )
-- ChromeDriver 85.0.4183.38 (9047dbc2c693f044042bbec5c91401c708c7c26a-refs/branch-heads/4183@{#779}); Chrome Driver is available via CHROMEWEBDRIVER environment variable
+- ChromeDriver 85.0.4183.87 (cd6713ebf92fa1cacc0f1a598df280093af0c5d7-refs/branch-heads/4183@{#1689}); Chrome Driver is available via CHROMEWEBDRIVER environment variable
 - Google Cloud SDK (307.0.0)
 - Haskell Cabal (cabal-install version 3.2.0.0
 compiled using version 3.2.0.0 of the Cabal library )
 - GHC (The Glorious Glasgow Haskell Compilation System, version 8.10.2)
 - Haskell Stack (Version 2.3.3, Git revision cb44d51bed48b723a5deb08c3348c0b3ccfc437e x86_64 hpack-0.33.0)
-- Heroku (heroku/7.42.10 linux-x64 node-v12.16.2)
-- HHVM (HipHop VM 4.71.0 (rel))
+- Heroku (heroku/7.42.13 linux-x64 node-v12.16.2)
+- HHVM (HipHop VM 4.72.0 (rel))
 - ImageMagick
 - Azul Zulu OpenJDK:
   - 7 (openjdk version "1.7.0_272")
@@ -180,15 +180,16 @@ compiled using version 3.2.0.0 of the Cabal library )
 - Gradle 6.6.1
 - Maven (Apache Maven 3.6.3 (cecedd343002696d0abb50b32b541b8a6ba2883f))
 - Kind (kind v0.8.1 go1.14.2 linux/amd64)
-- kubectl (Client Version: v1.18.8)
-- helm (v3.3.0+g8a4aeec)
+- kubectl (Client Version: v1.19.0)
+- helm (v3.3.1+g249e521)
 - minikube version: v1.12.3
-- kustomize ({kustomize/v3.8.1  2020-07-16T00:58:46Z  })
+- kustomize ({kustomize/v3.8.2  2020-08-29T17:44:01Z  })
 - oc CLI Client Version: 4.5.0-202005291417-9933eb9
 - Leiningen (Leiningen 2.9.4 on Java 1.8.0_265 OpenJDK 64-Bit Server VM)
 - Mercurial (Mercurial Distributed SCM (version 4.5.3))
 - Miniconda (conda 4.8.3)
 - Mono (Mono JIT compiler version 6.10.0.104 (tarball Fri Jun 26 19:38:24 UTC 2020))
+- NuGet (NuGet Version: 5.5.0.6382)
 - MySQL (mysql  Ver 14.14 Distrib 5.7.31, for Linux (x86_64) using  EditLine wrapper)
 - MySQL Server (user:root password:root)
 - MS SQL Server Client Tools
@@ -203,7 +204,7 @@ Local version: Unknown)
 - TypeScript (Version 4.0.2)
 - Webpack (4.44.1)
 - Webpack CLI (3.3.12)
-- Yarn (1.22.4)
+- Yarn (1.22.5)
 - Newman (5.1.2)
 - Bazel (bazel 3.4.1)
 - Bazelisk (1.6.1)
@@ -218,34 +219,35 @@ Local version: Unknown)
 - Pollinate
 - psql (PostgreSQL) 12.4
 - Powershell (PowerShell 7.0.3)
+- Pulumi v2.9.2
 - ruby (2.5.1p57)
 - gem (3.1.4)
 - OpenSSL 1.1.1g  21 Apr 2020
 - Libssl 1.1.1g-1+ubuntu18.04.1+deb.sury.org+1
 - R 4.0.2
 - rustup (1.22.1)
-- rust (1.45.2)
-- cargo (1.45.1)
-- rustfmt (1.4.17-stable)
+- rust (1.46.0)
+- cargo (1.46.0)
+- rustfmt (1.4.18-stable)
 - clippy (0.0.212)
-- rustdoc (1.45.2)
+- rustdoc (1.46.0)
 - bindgen (0.55.1)
 - cbindgen (0.14.4)
 - cargo audit (0.12.0)
 - cargo outdated (v0.9.11)
-- Julia (julia version 1.5.0)
+- Julia (julia version 1.5.1)
 - sbt (1.3.13)
 - Selenium server standalone (available via SELENIUM_JAR_PATH environment variable)
 - Sphinx Open Source Search Server
 - Subversion (svn, version 1.9.7 (r1800392))
-- Terraform (Terraform v0.13.0)
-- Packer (1.6.1)
+- Terraform (Terraform v0.13.1)
+- Packer (1.6.2)
 - Vcpkg 2020.06.15-unknownhash
-- Vercel CLI (20.0.0)
+- Vercel CLI (20.1.0)
 - MongoDB on Linux v4.4.0
 - Haveged 1.9.1-6
 - Swig 3.0.12
-- Netlify CLI (netlify-cli/2.59.1 linux-x64 node-v12.18.3)
+- Netlify CLI (netlify-cli/2.59.3 linux-x64 node-v12.18.3)
 - Google Repository 58
 - Google Play services 49
 - Google APIs 24
@@ -314,10 +316,10 @@ Local version: Unknown)
 - Az Module (4.3.0)
 - Az Module (4.4.0)
 - Cached container images
+  - buildpack-deps:stretch (Digest: sha256:82a686ba95fdf2bf4a5f5264e8e55b8aa272ffcedaed7826777f08de0d9e1146)
+  - buildpack-deps:buster (Digest: sha256:0f7be4c25fadb2b8aee537cdac00a684b09057e66368683bbf22adf477e05faa)
   - node:10 (Digest: sha256:cf3ee6a5a1b1916c7a2e4fb51eb7ecba1afe186739677d62e9c1bb2cb1c7d6b0)
   - node:12 (Digest: sha256:d0738468dfc7cedb7d260369e0546fd7ee8731cfd67136f6023d070ad9679090)
-  - buildpack-deps:stretch (Digest: sha256:34a18637ed801407f7a17a29575e82264fb0818f9b6a0c890f8a6530afea43dc)
-  - buildpack-deps:buster (Digest: sha256:b9343e9ba16795186ab1f34825803f1d7e9b0943dba5d644d3c1de5473f0602e)
   - debian:9 (Digest: sha256:335ecf9e8d9b2206c2e9e7f8b09547faa9f868e694f7c5be14c38be15ea8a7cf)
   - debian:8 (Digest: sha256:8a0f2603166345b4d7bbf4842137b2ffcb492ece20d15f963f08aa26670f82c7)
   - node:12-alpine (Digest: sha256:9623cd396644f9b2e595d833dc0188a880333674488d939338ab5fde10ef7c43)
@@ -351,7 +353,7 @@ Local version: Unknown)
   - node 8.17.0
   - node 10.22.0
   - node 12.18.3
-  - node 14.8.0
+  - node 14.9.0
 - go:
   - go 1.11.13
   - go 1.12.17
@@ -362,5 +364,5 @@ Local version: Unknown)
   - boost 1.69.0
   - boost 1.72.0
 - AWS SAM CLI, version 1.1.0
-- Homebrew on Linux (Homebrew 2.4.13
-Homebrew/linuxbrew-core (git revision 7d5ec; last commit 2020-08-24))
+- Homebrew on Linux (Homebrew 2.4.16
+Homebrew/linuxbrew-core (git revision d0486f; last commit 2020-09-01))
diff --git a/images/linux/Ubuntu2004-README.md b/images/linux/Ubuntu2004-README.md
index ed26985..fb074ef 100644
--- a/images/linux/Ubuntu2004-README.md
+++ b/images/linux/Ubuntu2004-README.md
@@ -1,12 +1,12 @@
 <!--- DO NOT EDIT - This markdown file is autogenerated. -->
 # Ubuntu 20.04.1 LTS
-The following software is installed on machines with the 20200817.1 update.
+The following software is installed on machines with the 20200825.1 update.
 ***
 - 7-Zip 16.02
 - Ansible (ansible 2.9.6)
 - AzCopy7 (available by azcopy alias) 7.3.0
-- AzCopy10 (available by azcopy10 alias) 10.5.1
-- Azure CLI (azure-cli                         2.10.1)
+- AzCopy10 (available by azcopy10 alias) 10.6.0
+- Azure CLI (azure-cli                         2.10.1 *)
 - Azure CLI (azure-devops                      0.18.0)
 - Basic packages:
   - dnsutils
@@ -65,7 +65,7 @@ The following software is installed on machines with the 20200817.1 update.
   - yamllint
   - libcurl4
 - Alibaba Cloud CLI (3.0.56)
-- AWS CLI (aws-cli/2.0.40 Python/3.7.3 Linux/5.4.0-1022-azure exe/x86_64.ubuntu.20)
+- AWS CLI (aws-cli/2.0.42 Python/3.7.3 Linux/5.4.0-1022-azure exe/x86_64.ubuntu.20)
 - AWS CLI Session manager plugin (1.1.61.0)
 - build-essential
 - Clang 6.0 (6.0.1)
@@ -79,7 +79,7 @@ Target: x86_64-unknown-linux-gnu
 - Skopeo (1.1.1)
 - Docker Compose (docker-compose version 1.26.2, build eefe0d31)
 - Docker-Moby (Docker version 19.03.12+azure, build 0ed913b885c8919944a2e4c8d0b80a318a8dd48b)
-- Docker-Buildx (0.4.1+azure)
+- Docker-Buildx (0.4.2+azure)
 - .NET Core SDK:
   - 3.1.401
   - 3.1.302
@@ -156,26 +156,27 @@ Target: x86_64-unknown-linux-gnu
 - Git-ftp (1.6.0)
 - Hub CLI (2.14.2)
 - GitHub CLI 0.11.1
-- Google Chrome (Google Chrome 84.0.4147.125 )
-- ChromeDriver 84.0.4147.30 (48b3e868b4cc0aa7e8149519690b6f6949e110a8-refs/branch-heads/4147@{#310}); Chrome Driver is available via CHROMEWEBDRIVER environment variable
-- Google Cloud SDK (305.0.0)
+- Google Chrome (Google Chrome 85.0.4183.83 )
+- ChromeDriver 85.0.4183.38 (9047dbc2c693f044042bbec5c91401c708c7c26a-refs/branch-heads/4183@{#779}); Chrome Driver is available via CHROMEWEBDRIVER environment variable
+- Google Cloud SDK (307.0.0)
 - Haskell Cabal (cabal-install version 3.2.0.0
 compiled using version 3.2.0.0 of the Cabal library )
 - GHC (The Glorious Glasgow Haskell Compilation System, version 8.10.2)
 - Haskell Stack (Version 2.3.3, Git revision cb44d51bed48b723a5deb08c3348c0b3ccfc437e x86_64 hpack-0.33.0)
-- Heroku (heroku/7.42.6 linux-x64 node-v12.16.2)
-- HHVM (HipHop VM 4.69.1 (rel))
+- Heroku (heroku/7.42.10 linux-x64 node-v12.16.2)
+- HHVM (HipHop VM 4.71.0 (rel))
 - ImageMagick
 - Adopt OpenJDK:
   - 8 (openjdk version "1.8.0_265") 
   - 11 (openjdk version "11.0.8" 2020-07-14) (default)
 - Ant (Apache Ant(TM) version 1.10.7 compiled on October 24 2019)
-- Gradle 6.6
+- Gradle 6.6.1
 - Maven (Apache Maven 3.6.3 (cecedd343002696d0abb50b32b541b8a6ba2883f))
 - Kind (kind v0.8.1 go1.14.2 linux/amd64)
 - kubectl (Client Version: v1.18.8)
 - helm (v3.3.0+g8a4aeec)
 - minikube version: v1.12.3
+- kustomize ({kustomize/v3.8.1  2020-07-16T00:58:46Z  })
 - oc CLI Client Version: 4.5.0-202005291417-9933eb9
 - Leiningen (Leiningen 2.9.4 on Java 11.0.8 OpenJDK 64-Bit Server VM)
 - Mercurial (Mercurial Distributed SCM (version 5.3.1))
@@ -192,13 +193,13 @@ compiled using version 3.2.0.0 of the Cabal library )
 Local version: Unknown)
 - n (6.7.0)
 - Parcel (1.12.4)
-- TypeScript (Version 3.9.7)
+- TypeScript (Version 4.0.2)
 - Webpack (4.44.1)
 - Webpack CLI (3.3.12)
 - Yarn (1.22.4)
 - Newman (5.1.2)
 - Bazel (bazel 3.4.1)
-- Bazelisk (1.6.0)
+- Bazelisk (1.6.1)
 - ORAS CLI 0.8.1
 - PhantomJS (2.1.1)
 - PHP 7.4 (PHP 7.4.9 (cli) (built: Aug  7 2020 14:30:01) ( NTS ))
@@ -223,8 +224,8 @@ apt-get update
 - rustfmt (1.4.17-stable)
 - clippy (0.0.212)
 - rustdoc (1.45.2)
-- bindgen (0.54.1)
-- cbindgen (0.14.3)
+- bindgen (0.55.1)
+- cbindgen (0.14.4)
 - cargo audit (0.12.0)
 - cargo outdated (v0.9.11)
 - Julia (julia version 1.5.0)
@@ -236,7 +237,7 @@ apt-get update
 - Terraform (Terraform v0.13.0)
 - Packer (1.6.1)
 - Vcpkg 2020.06.15-unknownhash
-- Vercel CLI (19.2.0)
+- Vercel CLI (20.0.0)
 - MongoDB on Linux v4.4.0
 - Haveged 1.9.1-6ubuntu1
 - Swig 4.0.1
@@ -244,13 +245,13 @@ apt-get update
 - Google Repository 58
 - Google Play services 49
 - CMake 3.10.2.4988404
-- Android Support Repository 47.0.0
 - Android SDK Platform-Tools 30.0.4
 - Android SDK Platform 30
 - Android SDK Platform 29
 - Android SDK Platform 28
 - Android SDK Platform 27
 - Android SDK Patch Applier v4
+- Android SDK Build-Tools 30.0.2
 - Android SDK Build-Tools 30.0.1
 - Android SDK Build-Tools 30.0.0
 - Android SDK Build-Tools 29.0.3
@@ -265,7 +266,7 @@ apt-get update
 - Android SDK Build-Tools 27.0.1
 - Android SDK Build-Tools 27.0.0
 - Android NDK 21.3.6528147
-- Az Module (4.5.0)
+- Az Module (4.6.0)
 - Cached container images
   - node:10 (Digest: sha256:cf3ee6a5a1b1916c7a2e4fb51eb7ecba1afe186739677d62e9c1bb2cb1c7d6b0)
   - node:12 (Digest: sha256:d0738468dfc7cedb7d260369e0546fd7ee8731cfd67136f6023d070ad9679090)
@@ -293,8 +294,8 @@ apt-get update
 - Python:
   - Python 2.7.18
   - Python 3.5.9
-  - Python 3.6.11
-  - Python 3.7.8
+  - Python 3.6.12
+  - Python 3.7.9
   - Python 3.8.5
 - PyPy:
   - PyPy 2.7.13 [PyPy 7.3.1 with GCC 7.3.1 20180303 (Red Hat 7.3.1-5)]
@@ -308,5 +309,5 @@ apt-get update
   - go 1.14.7
   - go 1.15.0
 - AWS SAM CLI, version 1.1.0
-- Homebrew on Linux (Homebrew 2.4.11
-Homebrew/linuxbrew-core (git revision 99cdab; last commit 2020-08-16))
+- Homebrew on Linux (Homebrew 2.4.13
+Homebrew/linuxbrew-core (git revision 7d5ec; last commit 2020-08-24))
diff --git a/images/linux/scripts/installers/android.sh b/images/linux/scripts/installers/android.sh
index 725bf3d..73611e6 100644
--- a/images/linux/scripts/installers/android.sh
+++ b/images/linux/scripts/installers/android.sh
@@ -27,9 +27,6 @@ wget -O android-sdk.zip https://dl.google.com/android/repository/sdk-tools-linux
 unzip android-sdk.zip -d ${ANDROID_SDK_ROOT}
 rm -f android-sdk.zip
 
-# Add required permissions
-chmod -R a+rwx ${ANDROID_SDK_ROOT}
-
 if isUbuntu20 ; then
     # Sdk manager doesn't work with Java > 8, set version 8 explicitly
     sed -i "2i export JAVA_HOME=${JAVA_HOME_8_X64}" /usr/local/lib/android/sdk/tools/bin/sdkmanager
@@ -63,6 +60,9 @@ constraint_layout_solver_versions_list=$(echo "$extras"|awk -F';' '/constraint-l
 platform_versions_list=$(echo "$platforms"|awk -F- '{print $2}')
 buildtools_versions_list=$(echo "$buildtools"|awk -F';' '{print $2}')
 
+# Add required permissions
+chmod -R a+rwx ${ANDROID_SDK_ROOT}
+
 echo "Lastly, document what was added to the metadata file"
 DocumentInstalledItem "Google Repository $(cat ${ANDROID_SDK_ROOT}/extras/google/m2repository/source.properties 2>&1 | grep Pkg.Revision | cut -d '=' -f 2)"
 DocumentInstalledItem "Google Play services $(cat ${ANDROID_SDK_ROOT}/extras/google/google_play_services/source.properties 2>&1 | grep Pkg.Revision | cut -d '=' -f 2)"
diff --git a/images/linux/scripts/installers/java-tools.sh b/images/linux/scripts/installers/java-tools.sh
index b64c576..7ec94b7 100644
--- a/images/linux/scripts/installers/java-tools.sh
+++ b/images/linux/scripts/installers/java-tools.sh
@@ -10,6 +10,15 @@ source $HELPER_SCRIPTS/os.sh
 
 set -e
 
+function javaTool {
+    if [[ "$2" =~ ([1]{0,1}.)?$DEFAULT_JDK_VERSION.* ]]; then
+        echo "$1 $2 is equal to default one $DEFAULT_JDK_VERSION"
+    else
+        echo "$1 $2 is not equal to default one $DEFAULT_JDK_VERSION"
+        exit 1
+    fi
+}
+
 # Install GPG Key for Adopt Open JDK. See https://adoptopenjdk.net/installation.html
 wget -qO - "https://adoptopenjdk.jfrog.io/adoptopenjdk/api/gpg/key/public" | apt-key add -
 add-apt-repository --yes https://adoptopenjdk.jfrog.io/adoptopenjdk/deb/
@@ -38,6 +47,11 @@ apt-get -y install adoptopenjdk-8-hotspot=\*
 apt-get -y install adoptopenjdk-11-hotspot=\*
 
 # Set Default Java version.
+if isUbuntu16; then
+    # issue: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=825987
+    # stackoverflow: https://askubuntu.com/questions/1187136/update-java-alternatives-only-java-but-not-javac-is-changed
+    sed -i 's/(hl|jre|jdk|plugin|DUMMY) /(hl|jre|jdk|jdkhl|plugin|DUMMY) /g' /usr/sbin/update-java-alternatives
+fi
 update-java-alternatives -s /usr/lib/jvm/adoptopenjdk-${DEFAULT_JDK_VERSION}-hotspot-amd64
 
 echo "JAVA_HOME_8_X64=/usr/lib/jvm/adoptopenjdk-8-hotspot-amd64" | tee -a /etc/environment
@@ -85,6 +99,11 @@ for cmd in gradle java javac mvn ant; do
     fi
 done
 
+javaVersion=$(java -version |& head -n 1 | cut -d\" -f 2)
+javaTool "Java" $javaVersion
+javacVersion=$(javac -version |& cut -d" " -f2)
+javaTool "Javac" $javacVersion
+
 # Document what was added to the image
 echo "Lastly, documenting what we added to the metadata file"
 if isUbuntu16 || isUbuntu18 ; then
diff --git a/images/linux/scripts/installers/mono.sh b/images/linux/scripts/installers/mono.sh
index 10b9058..e0f84a1 100644
--- a/images/linux/scripts/installers/mono.sh
+++ b/images/linux/scripts/installers/mono.sh
@@ -15,7 +15,7 @@ LSB_CODENAME=$(lsb_release -cs)
 apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv-keys 3FA7E0328081BFF6A14DA29AA6A19B38D3D831EF
 echo "deb https://download.mono-project.com/repo/ubuntu stable-$LSB_CODENAME main" | tee /etc/apt/sources.list.d/mono-official-stable.list
 apt-get update
-apt-get install -y --no-install-recommends apt-transport-https mono-complete
+apt-get install -y --no-install-recommends apt-transport-https mono-complete nuget
 
 # Run tests to determine that the software installed as expected
 echo "Testing to make sure that script performed as expected, and basic scenarios work"
@@ -23,7 +23,12 @@ if ! command -v mono; then
     echo "mono was not installed"
     exit 1
 fi
+if ! command -v nuget; then
+    echo "nuget was not installed"
+    exit 1
+fi
 
 # Document what was added to the image
 echo "Lastly, documenting what we added to the metadata file"
 DocumentInstalledItem "Mono ($(mono --version | head -n 1))"
+DocumentInstalledItem "NuGet ($(nuget | tail -n +1 | head -n 1))" # Pipe to tail before piping to head because NuGet prints an ugly error if you close its stream before it's done writing.
diff --git a/images/linux/scripts/installers/pulumi.sh b/images/linux/scripts/installers/pulumi.sh
new file mode 100644
index 0000000..e041c2b
--- /dev/null
+++ b/images/linux/scripts/installers/pulumi.sh
@@ -0,0 +1,26 @@
+#!/bin/bash
+################################################################################
+##  File:  pulumi.sh
+##  Desc:  Installs Pulumi
+################################################################################
+
+# Source the helpers for use with the script
+source $HELPER_SCRIPTS/document.sh
+source $HELPER_SCRIPTS/install.sh
+
+# Install Pulumi
+VERSION=$(curl --fail --silent -L "https://www.pulumi.com/latest-version")
+TARBALL_URL="https://get.pulumi.com/releases/sdk/pulumi-v${VERSION}-linux-x64.tar.gz"
+download_with_retries ${TARBALL_URL} "/tmp" pulumi-v${VERSION}.tar.gz
+tar --strip=1 -xvf /tmp/pulumi-v${VERSION}.tar.gz -C /usr/local/bin
+
+# Run tests to determine that the software installed as expected
+echo "Testing to make sure that script performed as expected, and basic scenarios work"
+if ! command -v pulumi ; then
+    echo "Pulumi was not installed"
+    exit 1
+fi
+
+# Document what was added to the image
+echo "Lastly, documenting what we added to the metadata file"
+DocumentInstalledItem "Pulumi $(pulumi version)"
diff --git a/images/linux/toolsets/toolset-1604.json b/images/linux/toolsets/toolset-1604.json
index c2fe19c..40ea0af 100644
--- a/images/linux/toolsets/toolset-1604.json
+++ b/images/linux/toolsets/toolset-1604.json
@@ -104,7 +104,8 @@
                 "3.5.0",
                 "3.8.0",
                 "4.3.0",
-                "4.4.0"
+                "4.4.0",
+                "4.6.0"
             ]
         }
     ]
diff --git a/images/linux/toolsets/toolset-1804.json b/images/linux/toolsets/toolset-1804.json
index 6b4c9f0..1bd31b1 100644
--- a/images/linux/toolsets/toolset-1804.json
+++ b/images/linux/toolsets/toolset-1804.json
@@ -100,7 +100,8 @@
                 "3.5.0",
                 "3.8.0",
                 "4.3.0",
-                "4.4.0"
+                "4.4.0",
+                "4.6.0"
             ]
         }
     ]
diff --git a/images/linux/ubuntu1604.json b/images/linux/ubuntu1604.json
index 4a5c5be..3625b7e 100644
--- a/images/linux/ubuntu1604.json
+++ b/images/linux/ubuntu1604.json
@@ -181,6 +181,7 @@
                 "{{template_dir}}/scripts/installers/pollinate.sh",
                 "{{template_dir}}/scripts/installers/postgresql.sh",
                 "{{template_dir}}/scripts/installers/powershellcore.sh",
+                "{{template_dir}}/scripts/installers/pulumi.sh",
                 "{{template_dir}}/scripts/installers/ruby.sh",
                 "{{template_dir}}/scripts/installers/r.sh",
                 "{{template_dir}}/scripts/installers/rust.sh",
diff --git a/images/linux/ubuntu1804.json b/images/linux/ubuntu1804.json
index 91f468f..b503752 100644
--- a/images/linux/ubuntu1804.json
+++ b/images/linux/ubuntu1804.json
@@ -185,6 +185,7 @@
                 "{{template_dir}}/scripts/installers/pollinate.sh",
                 "{{template_dir}}/scripts/installers/postgresql.sh",
                 "{{template_dir}}/scripts/installers/powershellcore.sh",
+                "{{template_dir}}/scripts/installers/pulumi.sh",
                 "{{template_dir}}/scripts/installers/ruby.sh",
                 "{{template_dir}}/scripts/installers/r.sh",
                 "{{template_dir}}/scripts/installers/rust.sh",
diff --git a/images/linux/ubuntu2004.json b/images/linux/ubuntu2004.json
index f3b7a68..48d309b 100644
--- a/images/linux/ubuntu2004.json
+++ b/images/linux/ubuntu2004.json
@@ -187,6 +187,7 @@
                 "{{template_dir}}/scripts/installers/pollinate.sh",
                 "{{template_dir}}/scripts/installers/postgresql.sh",
                 "{{template_dir}}/scripts/installers/powershellcore.sh",
+                "{{template_dir}}/scripts/installers/pulumi.sh",
                 "{{template_dir}}/scripts/installers/ruby.sh",
                 "{{template_dir}}/scripts/installers/r.sh",
                 "{{template_dir}}/scripts/installers/rust.sh",

</details>

closed time in 6 days

yarikoptic

issue commentdatalad/datalad-extensions

singularity wows

yeap, that was it -- the annex build passed, closing

yarikoptic

comment created time in 6 days

PR closed datalad/datalad-extensions

TEMP: See at what step it fails to mount

(I bet it is singularity, so print its version)

+84 -13

4 comments

2 changed files

yarikoptic

pr closed time in 6 days

pull request commentdatalad/datalad-extensions

TEMP: See on what step it fails to mount

I think I might adopt some of the changes from this TEMP PR later on, but for now it could be closed; the singularity issue was addressed.

yarikoptic

comment created time in 6 days

pull request commentdatalad/datalad-extensions

TEMP: try again to figure out what is up with singularity from the package

ok -- the original issue was resolved with the upload of a patched-up singularity-container (2.6.1-2+nd1~nd*) to neurodebian

yarikoptic

comment created time in 6 days

push eventdatalad/datalad

Yaroslav Halchenko

commit sha 74af72d881b6f841a9970140f3ba2fdc787d28da

BF: use time.time (no timezone offset) while checking expiration for AWS S3 credential

calendar.timegm expects the time tuple to be a GMT time tuple. time.localtime() has time zone information, but it is not part of the time tuple; calendar.timegm then produced seconds since epoch for that in GMT, thus the zone information was not used. With time.time() we seem to be getting seconds since epoch properly. I kept running into S3 downloads failing while the credential was reported to be expiring in some hours to come

view details

Yaroslav Halchenko

commit sha 3a85e9a1603f2f47ad768eb729f984bead61d681

BF: make iso8601_to_epoch respect time zone

Thanks @kyleam for the patch

view details

Yaroslav Halchenko

commit sha 0d27a8e2262c8c937f137714c430b5957ede485a

Merge pull request #4927 from yarikoptic/bf-s3-expire

BF: use time.time (no timezone offset) while checking expiration for AWS credentials

view details

push time in 6 days

PR merged datalad/datalad

BF: use time.time (no timezone offset) while checking expiration for AWS credentials

with time.localtime() we are getting local time without any time zone information; calendar.timegm then produces seconds since epoch for that in GMT, thus again - no zone. With time.time() we seem to be getting seconds since epoch properly.

I kept running into S3 downloads failing while the credential was reported to be expiring in some hours to come
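A minimal demonstration of the mismatch (just a sketch, not the DataLad code itself):

import calendar
import time

# time.localtime() yields a *local* time tuple, but calendar.timegm()
# interprets its argument as GMT/UTC, so the result is shifted by the
# local UTC offset (e.g. 4 hours on EDT) instead of matching "now"
wrong = calendar.timegm(time.localtime())
right = time.time()
print(int(wrong - right))  # ~= local UTC offset in seconds, not ~0

so the expiration check was effectively comparing against a "now" that was off by the local UTC offset.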

+4 -5

4 comments

3 changed files

yarikoptic

pr closed time in 6 days

pull request commentdatalad/datalad

BF: use time.time (no timezone offset) while checking expiration for AWS credentials

look -- even containers is green now ;) merging!

yarikoptic

comment created time in 6 days
