profile
viewpoint
Tom Prince tomprince @mozilla-releng

mozilla-services/guardian-vpn-windows 14

Mozilla VPN for Windows

mozilla-releng/releasewarrior-2.0 3

releasewarrior is a tool that manages and provides a checklist for human decision tasks with releases in flight

tomprince/abstract_hint 3

Tactic to add proof of goal as a hint.

mozilla-releng/releasewarrior-data 2

tracks status and human decision tasks for releases

tomprince/buildbot 1

Python-based continuous integration testing framework; send pull requests for your patches!

mozilla-releng/mobile-l10n-automation 0

Automation for keeping L10n strings in mozilla-mobile repositories up-to-date.

tomprince/agda-categories 0

Categories parametrized by morphism equality, in Agda

tomprince/agentforhire 0

An attempt to get a good ResourceTraversingAgent for testing Twisted web Resources

issue comment mozilla-mobile/fenix

[Bug] logs from previous builds are being displayed in taskcluster

This is caused by https://bugzilla.mozilla.org/show_bug.cgi?id=1595808#c15 and should be fixed by updating to the docker-image there.

rpappalax

comment created time in 12 hours

issue comment taskcluster/taskcluster

Include launchSpec in worker-pool error reports

That information would also be necessary for https://github.com/taskcluster/taskcluster-rfcs/pull/161

djmitche

comment created time in a day

issue comment taskcluster/taskcluster

Include launchSpec in worker-pool error reports

It would be useful to include the entire spec in the email reports, at least, as those can out-last a given configuration of a worker-pool.

djmitche

comment created time in a day

push event tp-tc/taskcluster

Tom Prince

commit sha 4cf2d845ee0b38696fb09c76ef6b390e1ab21e06

Support docker images generated by newer docker versions.

view details

push time in 2 days

PR opened taskcluster/taskcluster

Support docker images generated by newer docker versions.

<!-- Did you remember to add a changelog snippet? See https://github.com/taskcluster/taskcluster/blob/master/dev-docs/best-practices/changelog.md

 If this is related to a Bugzilla bug, please begin your title with [Bug XXXXX]
 and update this link.  Otherwise, just remove it from your PR comment.  -->

Bugzilla Bug: XXXXX

<!-- If this is related to a GitHub Bug/Issue, Please write Issue Number after # in the next line. Otherwise, just remove it from your PR comment. -->

Github Bug/Issue: Fixes #XXXX

+32 -20

0 comment

1 changed file

pr created time in 2 days

create branch tp-tc/taskcluster

branch : docker-manifest-version

created branch time in 2 days

push event tp-tc/taskcluster

Greg Arndt

commit sha 6ae119e9cb388f7eda65ee16b9003c271f67bd57

Merge pull request #210 from gregarndt/update_docker_1.9 Update docker 1.10

view details

Greg Arndt

commit sha a9826c5ddcfbc64216f84dc8eb2d49e1c882b659

update worker ci image to run in a docker 1.10 environment

view details

Greg Arndt

commit sha da7af99be0994ec9ecda2a85da0d94138a528d79

Merge pull request #215 from djmitche/error-message-newline Always precede [taskcluster:error] with newline

view details

Greg Arndt

commit sha 055fe8487d361aef656829deba42484965153190

Merge pull request #216 from gregarndt/update_cert Update base image to have unexpired cert

view details

Greg Arndt

commit sha 5eef32d06fe81fa9427b936e21a488a80ca73580

Merge pull request #217 from gregarndt/add_docker_download_timing Add timings for image download and loading

view details

Greg Arndt

commit sha a83d0856684b46412f67aa9f7f66ca2475a2c9de

Merge pull request #220 from taskcluster/gregarndt-patch-10 Bump balrog vpn proxy version

view details

Greg Arndt

commit sha b30d52d6d00bbcb321d36800404d69e56cbe4a8f

Merge pull request #218 from gregarndt/add_timestamps Include timestamp for worker specific log messages

view details

Greg Arndt

commit sha e401a98bfa47c2a72f2a465e6a7d85c03075a16e

Merge pull request #221 from gregarndt/improve_message Change wording saying image was loaded

view details

Greg Arndt

commit sha 2f0c434652eb43187732930e8c622c5622126344

Update readme to include post deployment verification

view details

Greg Arndt

commit sha d09fc256ad49a47effd2c88630d0c13c76f2a66d

Merge pull request #203 from walac/master Bug 1220738: Use temp credentials on task behalf. r=garndt

view details

Greg Arndt

commit sha 456bfff765e8c3c12a26f58242b097367329cd7a

Bump dind service version

view details

Greg Arndt

commit sha 4dfdab77ee45abc4a0103d57cff3c5d3357397e8

Merge pull request #226 from gregarndt/add_worker_type_to_task Add instance information to the environment

view details

Brian Stack

commit sha 189ce7b6cf668252fa8c50568bb4a58411f7da60

Merge pull request #227 from Anup-Allamsetty/master Add npm-shrinkwrap

view details

Greg Arndt

commit sha 05e9b93972e0f6ff9b9b3d0d07e8d5bd0bf084ce

Merge pull request #228 from gregarndt/add_statsum_client Add lib-monitor client

view details

Greg Arndt

commit sha 5f73c856fe51d8227f21d111746b53bcbca3bc93

Add statsum client to garbage collector

view details

Greg Arndt

commit sha a60f61a4e24570bc518e8abbd51e3855618bc17f

Merge pull request #231 from walac/master Braino: remove wrong documentation about local live log.

view details

Greg Arndt

commit sha 2ca1e40e4d500455dd08e456018303296a4b62d5

Merge pull request #232 from djmitche/bug1281779 Bug 1281779: simplify docker-pull error message

view details

Greg Arndt

commit sha 62ac02ebfb4782e49e2533b55b9aefaa39b6bc59

Merge pull request #233 from gregarndt/remove_temp_image Remove temp task image directory after import

view details

Pete Moore

commit sha 443e92b44219049f56cd63887118c6a1b17ba4f7

Merge pull request #234 from mozbhearsum/fix-balrog-hack Update aus4-admin.mozilla.org DNS hack with new IP address

view details

Greg Arndt

commit sha ba3cc0f4d3776555456c84d535b3837a3d9438be

Update taskcluster proxy credentials with those from the claim

view details

push time in 2 days

push event tomprince/fenix

Tom Prince

commit sha f269099b3c77e34c3cc33bce800c5c53c3396dea

Update taskgraph.

view details

push time in 2 days

push event tomprince/fenix

Tom Prince

commit sha 345debac973f2a8c0249f40f8a164d50fbd54e71

Update taskgraph.

view details

push time in 2 days

create branch tomprince/fenix

branch : taskgraph

created branch time in 2 days

PR opened mozilla-mobile/fenix

Update taskgraph.

Pull Request checklist

<!-- Before submitting the PR, please address each item -->

  • [ ] Tests: This PR includes thorough tests or an explanation of why it does not
  • [ ] Screenshots: This PR includes screenshots or GIFs of the changes made or an explanation of why it does not
  • [ ] Accessibility: The code in this PR follows accessibility best practices or does not include any user facing features. In addition, it includes a screenshot of a successful accessibility scan to ensure no new defects are added to the product.

After merge

  • [ ] Milestone: Make sure issues finished by this pull request are added to the milestone of the version currently in development.

To download an APK when reviewing a PR:

  1. click on Show All Checks,
  2. click Details next to "Taskcluster (pull_request)" after it appears and then finishes with a green checkmark,
  3. click on the "Fenix - assemble" task, then click "Run Artifacts".
  4. the APK links should be on the left side of the screen, named for each CPU architecture
+10 -2

0 comment

1 changed file

pr created time in 2 days

push event tp-tc/kaniko

Tom Prince

commit sha 3e5d07a87c464c8ec38fdc116f3899aaa1c5cbbe

Add support for squashing.

view details

Tom Prince

commit sha c62657da32437a4ff2311161fcbf91eb6b85df85

[FIXME: revendor] Add support for squashing images.

view details

push time in 2 days

push event tomprince/firefox-tv

Tom Prince

commit sha e47cfea7012f74a83e63f6b88d5f719de159b65d

Fix release artifact path.

view details

push time in 2 days

push event tp-tc/firefox-tv

Simon Chae

commit sha 9968acad534bc185a314f9b7373d52a1dae43f2f

Issue #2995: Disable bitbar in master

view details

Simon Chae

commit sha 2ceb5cf3ab2069e6644d3ea77fff1a4618c916a9

Update version to v4.7-LAT1

view details

Simon Chae

commit sha 491e9acf94b3e7037c1184b57248f526e49d2f12

Update version to v4.7

view details

Tom Prince

commit sha 6ffa40d5989cd01f7b9f9aa460395bc5dbe238ec

Fix release artifact path.

view details

push time in 2 days

PR opened mozilla-mobile/firefox-tv

Release artifact path

Checklist

<!-- Before submitting and merging the PR, please address each item -->

  • [ ] Confirm the acceptance criteria is fully satisfied in the issue(s) this PR will close
  • [ ] Add thorough tests or an explanation of why it does not
  • [ ] Add a CHANGELOG entry if applicable
  • [ ] Add QA labels on the associated issue (not this PR; qa-ready or qa-notneeded)
+5 -4

0 comment

1 changed file

pr created time in 2 days

create branch tp-tc/firefox-tv

branch : release-artifact-path

created branch time in 2 days

delete branch tp-tc/lando-api

delete branch : mars/add-sec-approval-form-submission-endpoint

delete time in 3 days

delete branch tp-tc/lando-api

delete branch : mars/reorganize-security-properties

delete time in 3 days

delete branch tp-tc/lando-api

delete branch : mars/rename-sec-approval-api-module

delete time in 3 days

delete branch tp-tc/lando-api

delete branch : mars/move-sec-approval-comment-endpoint

delete time in 3 days

delete branch tp-tc/lando-api

delete branch : 1631971-matrix-link

delete time in 3 days

create branch tp-tc/lando-api

branch : scm_firefoxci

created branch time in 3 days

push event tp-tc/kaniko

Tom Prince

commit sha f514602a5a2fa575ab235296d5913edaa23bd10c

Adjust kaniko dockerfiles to support bootstrapping.

view details

push time in 3 days

push event tp-tc/kaniko

Tom Prince

commit sha 81808a728475c0ab869641faeec88f2d11c93f62

Add support for squashing.

view details

Tom Prince

commit sha 8a626bf1ec40cd2599f13931166c4908c4d55ea9

[FIXME: revendor] Add support for squashing images.

view details

push time in 3 days

delete branch tp-tc/kaniko

delete branch : tstromberg-patch-1

delete time in 3 days

delete branch tp-tc/kaniko

delete branch : release_15

delete time in 3 days

delete branch tp-tc/kaniko

delete branch : revert-334-fix-volume-cmd

delete time in 3 days

delete branch tp-tc/kaniko

delete branch : bootrap

delete time in 3 days

push event tp-tc/kaniko

Tom Prince

commit sha 2ff92bc5831799384bd215ed57d248eeb329e10c

bootstrap

view details

Tom Prince

commit sha bcc05a4523e78b3cf5e53ae99237845fda10cf37

Adjust kaniko dockerfiles to support bootstrapping.

view details

push time in 3 days

push event tp-tc/kaniko

Tom Prince

commit sha 2ff92bc5831799384bd215ed57d248eeb329e10c

bootstrap

view details

push time in 3 days

PR opened GoogleContainerTools/kaniko

Add an option to allow kaniko to bootstrap.

<!-- 🎉🎉🎉 Thank you for the PR!!! 🎉🎉🎉 -->

Fixes #<issue number>. in case of a bug fix, this should point to a bug and any other related issue(s)

Description

<!-- Describe your changes here- ideally you can get that description straight from your descriptive commit message(s)! -->

Submitter Checklist

These are the criteria that every PR should meet, please check them off as you review them:

  • [ ] Includes unit tests
  • [ ] Adds integration tests if needed.

See the contribution guide for more details.

Reviewer Notes

  • [ ] The code flow looks good.
  • [ ] Unit tests and or integration tests added.

Release Notes

Describe any changes here so maintainer can include it in the release notes, or delete this block.

Examples of user facing changes:
- kaniko adds a new flag `--registry-repo` to override registry

+67 -41

0 comment

16 changed files

pr created time in 3 days

push event tp-tc/kaniko

Tom Prince

commit sha edd7ad99e197d0518cd7fe64f7f05095e728336d

Adjust kaniko dockerfiles to support bootstrapping.

view details

push time in 3 days

create branch tp-tc/kaniko

branch : bootstrap

created branch time in 3 days

create branch tp-tc/kaniko

branch : bootrap

created branch time in 3 days

push event tp-tc/kaniko

Tom Prince

commit sha daab96c72fcd9dc070bc78cbb3c85c7beb94be1a

[FIXME: revendor] Add support for squashing images.

view details

push time in 3 days

push event tp-tc/go-containerregistry

Tom Prince

commit sha 54fcff58fc87de48219e255c6f8d8b7f339dd95c

Add a command to squash images.

view details

push time in 3 days

PR opened GoogleContainerTools/kaniko

Add support for squashing.

<!-- 🎉🎉🎉 Thank you for the PR!!! 🎉🎉🎉 -->

Fixes #<issue number>. in case of a bug fix, this should point to a bug and any other related issue(s)

Description

Add support for squashing created images. This depends on https://github.com/google/go-containerregistry/pull/735

Submitter Checklist

These are the criteria that every PR should meet, please check them off as you review them:

  • [ ] Includes unit tests
  • [x] Adds integration tests if needed.

See the contribution guide for more details.

Reviewer Notes

  • [ ] The code flow looks good.
  • [ ] Unit tests and or integration tests added.

Release Notes

- kaniko adds a new flag `--squash` to build images with a single layer
+17 -0

0 comment

6 changed files

pr created time in 3 days

push event tp-tc/kaniko

Tom Prince

commit sha 71b2d4596ea3c1da7365028964749293bb4a18d7

Add support for squashing.

view details

push time in 3 days

create branch tp-tc/kaniko

branch : squash

created branch time in 3 days

create branch tp-tc/go-containerregistry

branch : squash

created branch time in 3 days

issue comment taskcluster/taskcluster

Decision task ignores TASKCLUSTER_HEAD_REV

It looks like run-task was fixed in Bug 1595808 but the decision image on docker hub was not updated with that fix.

petemoore

comment created time in 7 days

push event tp-tc/kaniko

Tom Prince

commit sha bcfe48d7423ef7ea3f4613d9b68bcb27e904e2d7

Rename `--whitelist-var-run` to `--ignore-var-run`.

view details

push time in 8 days

PR opened GoogleContainerTools/kaniko

Rename `--whitelist-var-run` to `--ignore-var-run`.

<!-- 🎉🎉🎉 Thank you for the PR!!! 🎉🎉🎉 -->

Description

Rename `--whitelist-var-run` to `--ignore-var-run`.

Submitter Checklist

These are the criteria that every PR should meet, please check them off as you review them:

  • [ ] Includes unit tests
  • [ ] Adds integration tests if needed.

See the contribution guide for more details.

Reviewer Notes

  • [ ] The code flow looks good.
  • [ ] Unit tests and or integration tests added.

Release Notes

  • Rename --whitelist-var-run to --ignore-var-run.
+4 -1

0 comment

2 changed files

pr created time in 8 days

create branch tp-tc/kaniko

branch : ignore-var-run

created branch time in 8 days

issue opened mozilla-iam/dino-park-issues

Allow uploading a different profile picture for the different display levels.

I have a picture uploaded as my profile picture, set to staff only (partly for the reasons outlined in #150). However, I'd like to associate a different image with the other privacy levels (I've adopted the original identicon that I got from github, and use it across every service, including slack/matrix/bugzilla/phabricator).

created time in 9 days

create branch tp-tc/build-relengdocs

branch : machine-users

created branch time in 10 days

issue opened taskcluster/taskcluster

Provide a way for a task using taskcluster proxy to restrict the scopes a particular request can use.

I have a task that has fairly wide scopes, because it schedules a number of different other tasks based on various inputs. Each of those tasks should only have access to a subset of those scopes. I'd like to be able to enforce that in the original task.

The specific use case I have is a task that has `assume:repo:<repo>:cron:*` and, based on time and other inputs, would like to make an API request that only has `assume:repo:<repo>:cron:<specific-thing>` scopes when scheduling `<specific-thing>`.
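For comparison, when a task holds credentials and talks to Taskcluster directly (rather than through taskcluster-proxy), the Python client can already restrict what an individual call is authorized to do. A minimal sketch, assuming credentials are exposed to the task as environment variables; the scope string mirrors the example above:

```python
import os

import taskcluster

# Restrict this client to the scopes needed for one specific cron target, even
# though the underlying credentials carry the broader assume:repo:<repo>:cron:*.
queue = taskcluster.Queue({
    "rootUrl": os.environ["TASKCLUSTER_ROOT_URL"],
    "credentials": {
        "clientId": os.environ["TASKCLUSTER_CLIENT_ID"],
        "accessToken": os.environ["TASKCLUSTER_ACCESS_TOKEN"],
    },
    "authorizedScopes": ["assume:repo:<repo>:cron:specific-thing"],
})
```

The feature request is essentially to get the same kind of per-request restriction when the call goes through the proxy instead.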

created time in 13 days

Pull request review comment mozilla-releng/scriptworker-scripts

always run k8s-image tasks on dev and prod

(flattened diff context omitted: the reviewed hunk adds a `docker-hub-push` target-tasks method whose inner `filter` keeps only `k8s-image` tasks matching the `script_name` parameter, returning everything when the parameter is unset.)

(optional) Could turn this into a filter_for_script function (that returns True if the parameter is unset) and then use a style like this.
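A rough sketch of that suggestion (the helper name and shape are illustrative, not taken from the patch):

```python
def filter_for_script(task, parameters):
    # Keep every task when no script_name parameter is given; otherwise only
    # keep tasks whose script-name attribute matches it.
    script_name = parameters.get("script_name")
    if not script_name:
        return True
    return task.attributes.get("script-name") == script_name
```

The target-tasks method then only needs to combine this with the kind check when building its list of labels.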

escapewindow

comment created time in 14 days

delete branch tp-tc/cloudops-deployment-proxy

delete branch : hgmo-support

delete time in 17 days

Pull request review comment mozilla-releng/scriptworker-scripts

bug 1597598 - use taskgraph

(flattened diff context omitted: the revised `python_version` transform, with `_replace_string`/`_resolve_replace_string` helpers, a `set_script_name` transform, and `tasks_per_python_version` looping over a list of fields.)

Yeah, I definitely wasn't suggesting changing anything here, just commenting on the future consideration of non-Python things.

escapewindow

comment created time in 21 days

Pull request review comment mozilla-releng/scriptworker-scripts

bug 1597598 - use taskgraph

(flattened diff context omitted: the revised `python_version` transform, with `_replace_string`/`_resolve_replace_string` helpers, a `set_script_name` transform, and `tasks_per_python_version` looping over a list of fields.)

It is not clear to me that the python-version transform is providing value for the k8s-image kind in any case.

escapewindow

comment created time in 21 days

Pull request review comment mozilla-releng/scriptworker-scripts

bug 1597598 - use taskgraph

(flattened diff context omitted: a new parameters module that extends the taskgraph parameters schema with optional `docker_tag` and `push_docker_image` entries and defines project-specific branch prefixes and push tags.)

We don't support defaults in schemas. This comes from m-c, where --fast disables schema validation, which implements defaults by returning the new structure. Due to that, we ignore the return value, to make sure --fast doesn't change behavior.
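A small illustration of why ignoring the return value matters (voluptuous is assumed here, matching the schema in the hunk above; the default value is made up):

```python
from voluptuous import Optional, Schema

# voluptuous applies defaults by returning a *new*, validated structure.
schema = Schema({Optional("docker_tag", default="unknown"): str})

params = {}
validated = schema(params)  # {'docker_tag': 'unknown'}

# If the caller ignores the return value -- as taskgraph does, so that --fast
# (which skips validation entirely) behaves identically -- the default never
# makes it into `params`.
assert params == {}
assert validated == {"docker_tag": "unknown"}
```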

escapewindow

comment created time in 21 days

Pull request review comment mozilla-releng/scriptworker-scripts

bug 1597598 - use taskgraph

+# This Source Code Form is subject to the terms of the Mozilla Public

`taskgraph.transforms.cached_tasks` is a transform that takes a cache value on a task (with cache type, name and digest data), and turns it into the appropriate routes and optimizations so that the task will be re-used as long as the digest data is not changed.

That transform is used by the docker-image, toolchain and fetch transforms to implement their caching behavior. This transform is similar to those, in that it generates a cache value with an appropriate digest to be consumed by `taskgraph.transforms.cached_tasks`; but unlike them, it does not expect the task to have any particular artifacts, and doesn't have any other bits of taskgraph that are designed to interact with it.

> I wonder why we never added this transform in any of the mobile projects. I'll dig further.

It is not obvious to me where it would be useful for mobile projects. The most likely would be android-components or application-services. Though, given that android-components has a unified build system, I suspect it would be somewhat less useful for that project.

To a certain extent, this transform is being used to work around the fact that we don't have a push-log for this repo, which would let us see if any relevant files changed since the last push. On the other hand, this is more precise tracking, as backing something out will not rerun tasks on the old code.
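Roughly, the hand-off looks like this (field names follow the cached_tasks schema; the values are only illustrative):

```python
def add_cache(task, digest_data):
    # Sketch of what a transform sets before taskgraph.transforms.cached_tasks runs.
    task["cache"] = {
        "type": "scriptworker-scripts.v1.tox",  # namespace used for the index routes
        "name": "tox-addonscript-py38",         # per-task cache name
        "digest-data": digest_data,             # list of strings hashed into the digest
    }
    return task
```

cached_tasks then turns that into index routes and an optimization, so an existing task with the same digest is re-used instead of being re-run.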

escapewindow

comment created time in 21 days

create branch tp-tc/active-data-recipes

branch : try-syntax

created branch time in 22 days

issue comment taskcluster/taskcluster

Better estimate the number of instances to spawn

One thing that I think I've noticed, though I don't know how common it is:

  1. a task T is scheduled
  2. worker-manager starts a new worker A
  3. the worker registers with worker-manager
  4. the worker-manager estimator sees there is one instance (A) and one pending task (T) and so starts a worker B
  5. worker A claims task T

That is, I think there is a window between when a worker registers and when it starts claiming work where worker-manager will spawn a second instance to run a single task.

djmitche

comment created time in 22 days

issue comment taskcluster/taskcluster

figure out next steps for provisioning improvements

#2879 tracks some aspect of this

imbstack

comment created time in 22 days

pull request comment mozilla/community-tc-config

Allow WPT to use built-in/success

I suspect all projects should be granted access to `built-in/*` workers.

stephenmcgruer

comment created time in 23 days

issue comment taskcluster/taskcluster

Pools with `minCapacity` set interact poorly with `shutdown.afterIdleSeconds`/`idleTimeoutSecs`.

> the goal is to reduce our non task overhead as much as possible.

Time taken in vcs (and fetching other things that could be cached) is also overhead, but counted as part of tasks. I guess idle time between jobs probably dominates for most workers, so it is probably always a win in practice; but decreasing non-task overhead can lead to longer task times.

> don't know why need to have workers always ready- there might be some cases like decision task where we need to have workers available faster as delays there block everything else.

Yeah, decision workers at least want reasonably fast turn-around times; particularly as these are also used for actions.

> I say all this to highlight that I think there's a higher-level problem, or set of problems, to solve here, and that we are interested in solving. The engineering effort required to get idle timeout and minCapacity to play nicely together is high and probably does not address that larger problem very well.

Yeah, I definitely don't know what the right solution is, and it probably needs to take into account other related aspects of provisioning. I do think it is worth calling out explicitly, even if the eventual solution ends up changing things so that one or the other or both of these knobs don't exist in their current form. I can also imagine that (in the context of a larger solution), solving this might be a (comparatively) easy detour part way along the implementation of the full solution.

tomprince

comment created time in 23 days

Pull request review comment mozilla-releng/scriptworker-scripts

bug 1597598 - use taskgraph

(flattened diff context omitted: the `cached` transform that walks `digest-directories`, hashes the files it finds, and builds the `cached_task` digest so unchanged code is not re-run.)

I don't see it set for the tox kind.

escapewindow

comment created time in 23 days

issue opened taskcluster/taskcluster

Pools with `minCapacity` set interact poorly with `shutdown.afterIdleSeconds`/`idleTimeoutSecs`.

Describe the bug

A worker with `shutdown.afterIdleSeconds`/`idleTimeoutSecs` will shut down if it does not receive work from the queue in the specified amount of time. This interacts poorly with pools that have `minCapacity` set, as worker-manager will immediately spin up a new worker with cold caches. It would be much better if enough workers to satisfy `minCapacity` stayed alive, rather than endlessly cycling.

This can have a huge impact on gecko action tasks, where a fresh clone ends up taking about half the runtime of a task.

(cc: @jmaher)

created time in 23 days

Pull request review comment mozilla-releng/scriptworker-scripts

bug 1597598 - use taskgraph

(flattened diff context omitted: the tox kind definition: loader, transform chain, and job-defaults including a `public/` directory artifact on the worker.)

I think I would say, it depends? I couldn't think of any artifacts we obviously would want here, and if we have some, it isn't entirely clear that having a directory to dump them in is how we would want to collect them.

I definitely don't feel strongly about this.

escapewindow

comment created time in 24 days

Pull request review comment mozilla-releng/scriptworker-scripts

bug 1597598 - use taskgraph

+# This Source Code Form is subject to the terms of the Mozilla Public
+# License, v. 2.0. If a copy of the MPL was not distributed with this
+# file, You can obtain one at http://mozilla.org/MPL/2.0/.
+---
+loader: taskgraph.loader.transform:loader
+
+kind-dependencies:
+    - k8s-image

https://hg.mozilla.org/ci/taskgraph/file/tip/src/taskgraph/transforms/code_review.py#l21 only includes tasks from kinds listed here, and not every tox task had a dependent task in k8s-image, so if you want those tasks to be depended on here, that kind needs to be listed.

escapewindow

comment created time in 24 days

Pull request review comment mozilla-releng/scriptworker-scripts

bug 1597598 - use taskgraph

(flattened diff context omitted: the earlier `python_version` transform, which calls `_replace_string` on each field individually and sets the `script-name` attribute inside `tasks_per_python_version`.)

I am very slightly inclined to move this to its own transform function before this one.
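Concretely, that separate transform could look like this (it is essentially what a later revision of the patch adds as `set_script_name`):

```python
@transforms.add
def set_script_name(config, jobs):
    # Record which script each job is for, before fanning out over python versions.
    for job in jobs:
        job.setdefault("attributes", {}).update({
            "script-name": job["name"],
        })
        yield job
```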

escapewindow

comment created time in 24 days

Pull request review comment mozilla-releng/scriptworker-scripts

bug 1597598 - use taskgraph

+# This Source Code Form is subject to the terms of the Mozilla Public
+# License, v. 2.0. If a copy of the MPL was not distributed with this
+# file, You can obtain one at http://mozilla.org/MPL/2.0/.
+---
+loader: taskgraph.loader.transform:loader
+
+kind-dependencies:
+    - docker-image

If the intent is to have the digests here depend on the docker-image digests, I think you need to explicitly add the docker-image dependency; the in-tree specification gets turned into a dependency too late to impact digests (and this is at least partly deliberate: we don't want toolchains to depend on the digest of the image that built them).

escapewindow

comment created time in 24 days

Pull request review comment mozilla-releng/scriptworker-scripts

bug 1597598 - use taskgraph

(flattened diff context omitted: the earlier `python_version` transform, calling `_replace_string` on `label`, `worker.docker-image`, `description`, `run.command`, `worker.command`, and `docker-repo` one by one.)

I think this would be a bit clearer with a loop like here over the fields being modified.
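A sketch of that loop, assuming a helper along the lines of the `_resolve_replace_string` that a later revision of the patch adds for dotted field paths:

```python
FIELDS_TO_SUBSTITUTE = [
    "label",
    "description",
    "docker-repo",
    "run.command",
    "worker.command",
    "worker.docker-image",
]

for field in FIELDS_TO_SUBSTITUTE:
    # The helper walks the dotted path and formats the value in place,
    # leaving the task untouched when the field is absent.
    _resolve_replace_string(task, field, repl_dict)
```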

escapewindow

comment created time in 24 days

Pull request review comment mozilla-releng/scriptworker-scripts

bug 1597598 - use taskgraph

(flattened diff context omitted: the k8s-image transform's `add_dependencies` and `set_environment` functions, ending with the fallback that derives `DOCKER_TAG` from `head_ref` by stripping the `refs/heads/` prefix.)

Is the intention that this catch `refs/heads/{dev,production}`? If so, I'd be inclined to be explicit about that.

Maybe `re.match(r'refs/heads/(dev|production)(?:-[a-z]*)$')` might be closer to what we want. (Thinking about this, it might make sense to make this a graph parameter.)
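A self-contained sketch of that check (the pattern is copied from the comment above and is still up for discussion):

```python
import re

def docker_tag_for_ref(head_ref):
    # Only dev-/production-style branches yield a meaningful tag here.
    match = re.match(r"refs/heads/(dev|production)(?:-[a-z]*)$", head_ref)
    return match.group(1) if match else "unknown"
```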

escapewindow

comment created time in 24 days

Pull request review comment mozilla-releng/scriptworker-scripts

bug 1597598 - use taskgraph

(flattened diff context omitted: the `cached` transform that walks `digest-directories`, hashes the files it finds, and builds the `cached_task` digest so unchanged code is not re-run.)

I don't see this getting set as an attribute anywhere, but rather as a top-level key. I think that is correct, but this needs to be adjusted to match.

escapewindow

comment created time in 24 days

Pull request review comment mozilla-releng/scriptworker-scripts

bug 1597598 - use taskgraph

(flattened diff context omitted: the k8s-image transform's `set_environment` function, which assigns `HEAD_REV`, `REPO_URL`, `PROJECT_NAME`, `TASKCLUSTER_ROOT_URL`, `DOCKER_TAG`, and `DOCKER_REPO` one env key at a time.)

nit: this feels a bit visually noisy; I wonder if using `env.update({ ... })` would make this easier to parse.
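For illustration, the same assignments grouped into one call (values copied from the hunk above):

```python
env.update({
    "HEAD_REV": config.params["head_rev"],
    "REPO_URL": config.params["head_repository"],
    "PROJECT_NAME": project_name,
    "TASKCLUSTER_ROOT_URL": "$TASKCLUSTER_ROOT_URL",
    "DOCKER_TAG": "unknown",
    "DOCKER_REPO": job.pop("docker-repo"),
})
```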

escapewindow

comment created time in 24 days

Pull request review comment mozilla-releng/scriptworker-scripts

bug 1597598 - use taskgraph

(flattened diff context omitted: the `cached` transform, ending where it sets the cache `type` to `"{}.v2".format(repo_name)`.)

I think this should probably be `<something>.v1.<kind>`, where `<something>` is hard-coded.
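That is, something along these lines (the `scriptworker-scripts` prefix is only an example):

```python
# Hard-coded prefix, versioned, plus the kind currently being processed:
task["cache"]["type"] = "scriptworker-scripts.v1.{}".format(config.kind)
```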

escapewindow

comment created time in 24 days

Pull request review comment mozilla-releng/scriptworker-scripts

bug 1597598 - use taskgraph

+# %ARG DOCKER_IMAGE_PARENT
+FROM $DOCKER_IMAGE_PARENT
+
+VOLUME /builds/worker/checkouts
+VOLUME /builds/worker/.cache

It'd be good to get a bug on file so that we don't need to duplicate these in dependent images.

escapewindow

comment created time in 24 days

Pull request review comment mozilla-releng/scriptworker-scripts

bug 1597598 - use taskgraph

+# %ARG DOCKER_IMAGE_PARENT
+FROM $DOCKER_IMAGE_PARENT
+
+VOLUME /builds/worker/checkouts
+VOLUME /builds/worker/.cache
+
+RUN apt-get update \
+    && apt-get install -y libsodium-dev
+RUN truncate -s 0 /etc/os-release

I bet this is an existing thing, but a comment on this line would be useful. I suspect I know why this is here, but it would be better if it wasn't folk knowledge.

escapewindow

comment created time in 24 days

Pull request review comment mozilla-releng/scriptworker-scripts

bug 1597598 - use taskgraph

+# %ARG DOCKER_IMAGE_PARENT
+FROM $DOCKER_IMAGE_PARENT
+
+VOLUME /builds/worker/checkouts
+VOLUME /builds/worker/.cache
+
+RUN apt-get update && apt-get install -y default-jdk

It might be worth considering pulling out these commands into a script that can be shared with images we publish, so they don't get out-of-sync. Not a blocker.

escapewindow

comment created time in 24 days

Pull request review comment mozilla-releng/scriptworker-scripts

bug 1597598 - use taskgraph

(flattened diff context omitted: the tox kind definition, through the run-task command and the `addonscript` job's `digest-directories` list.)

I would be inclined to use resources for this, to match what toolchains use.

escapewindow

comment created time in 24 days

Pull request review comment mozilla-releng/scriptworker-scripts

bug 1597598 - use taskgraph

(flattened diff context omitted: the tox kind definition: loader, transform chain, and job-defaults including a `public/` directory artifact on the worker.)

Do we generate anything here?

escapewindow

comment created time in 24 days

Pull request review comment mozilla-releng/scriptworker-scripts

bug 1597598 - use taskgraph

(flattened diff context omitted: the earlier `python_version` transform, including the `skip_on_project_specific_branches` transform that drops jobs when the branch is project-specific for a different script.)

My inclination would be to handle this via setting target_tasks_method and script-name parameter based on the branch, I think.
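
A rough sketch of that alternative, assuming gecko-style taskgraph helpers (the _target_task decorator, the get_decision_parameters hook, and the script-name parameter are all illustrative names here, not the actual scriptworker-scripts code):

    from taskgraph.target_tasks import _target_task


    @_target_task("project-specific-branch")
    def target_tasks_project_branch(full_task_graph, parameters, graph_config):
        """Keep only tasks for the script named by the script-name parameter."""
        script_name = parameters.get("script-name")

        def keep(task):
            # Tasks not tied to a particular script (docker images, lint, ...)
            # stay in the graph.
            attr = task.attributes.get("script-name")
            return attr is None or attr == script_name

        return [label for label, task in full_task_graph.tasks.items() if keep(task)]


    def get_decision_parameters(graph_config, parameters):
        # Derive target_tasks_method and script-name from the push branch,
        # rather than filtering inside the python_version transform.
        head_ref = parameters.get("head_ref", "")
        for prefix in ("refs/heads/dev-", "refs/heads/production-"):
            if head_ref.startswith(prefix):
                parameters["target_tasks_method"] = "project-specific-branch"
                parameters["script-name"] = head_ref[len(prefix):]

With something like that, the per-branch filtering lives in one place and the transform only has to record the script-name attribute.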

escapewindow

comment created time in 24 days

Pull request review comment mozilla-releng/scriptworker-scripts

bug 1597598 - use taskgraph

+def _replace_string(obj, repl_dict):
+    if isinstance(obj, dict):
+        for k in obj.keys():
+            obj[k] = obj[k].format(**repl_dict)
        return {k: v.format(**subs) for k, v in obj.items()}

and similar?
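
Spelled out, the comprehension-based variant might look like this (just a sketch; it keeps the existing assumption that every leaf value is a format string, and returns a new object instead of mutating in place):

    def _replace_string(obj, repl_dict):
        if isinstance(obj, dict):
            return {k: v.format(**repl_dict) for k, v in obj.items()}
        if isinstance(obj, list):
            return [v.format(**repl_dict) for v in obj]
        return obj.format(**repl_dict)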

escapewindow

comment created time in 24 days

Pull request review comment mozilla-releng/scriptworker-scripts

bug 1597598 - use taskgraph

+# This Source Code Form is subject to the terms of the Mozilla Public
+# License, v. 2.0. If a copy of the MPL was not distributed with this
+# file, You can obtain one at http://mozilla.org/MPL/2.0/.
+
+from __future__ import absolute_import, print_function, unicode_literals
+
+import hashlib
+import json
+import os
+import subprocess
+
+import taskgraph
+from taskgraph.transforms.base import TransformSequence
+from taskgraph.util.hash import hash_path
+from taskgraph.util.memoize import memoize
+
+transforms = TransformSequence()
+
+BASE_DIR = os.getcwd()
+
+# Directory names we ignore, anywhere in the tree
+# We won't recurse into these directories.
+IGNORE_DIRS = (".git", "docs", "maintenance")
+
+# File extensions we ignore, anywhere in the tree
+IGNORE_EXTENSIONS = (".pyc", ".swp")
+
+ADDITIONAL_FILES = tuple([])
+
+
+@memoize
+def list_files(path):
+    files = []
+    for dir_name, subdir_list, file_list in os.walk(path):
+        for dir_ in subdir_list:
+            if dir_ in IGNORE_DIRS:
+                subdir_list.remove(dir_)
+                continue
+        for file_ in file_list:
+            (_, ext) = os.path.splitext(file_)
+            if ext in IGNORE_EXTENSIONS:
+                continue
+            files.append(
+                os.path.relpath(os.path.join(BASE_DIR, dir_name, file_), BASE_DIR)
+            )
+    return set(files)
+
+
+@transforms.add
+def add_digest_directories(config, tasks):
+    # TODO `s,digest-directories,digest-files`, where we check isdir(file) ?
+    for task in tasks:
+        digest_directories = task.pop("digest-directories", None)
+        if digest_directories:
+            task.setdefault("attributes", {})["digest-directories"] = digest_directories
+        yield task
+
+
+@transforms.add
+def build_cache(config, tasks):
+    repo_name = subprocess.check_output(["git", "remote", "get-url", "origin"]).rstrip()
+    repo_name = repo_name.replace(".git", "").rstrip("/")
+    repo_name = repo_name.split("/")[-1]
+
+    for task in tasks:
+        if task.get("cache", True) and not taskgraph.fast:
+            digest_data = []
+            h = hashlib.sha256()
+            h.update(
+                json.dumps(task.get("attributes", {}).get("digest-extra", {}), indent=2, sort_keys=True)
+            )
+            directories = task.get("attributes", {}).get("digest-directories", [])
+            files = set([])
+            for d in directories:
+                directory = os.path.join(BASE_DIR, d)
+                files.update(list_files(directory))
+            # files.update(list_files(os.path.join(BASE_DIR, "taskcluster")))
+            for path in ADDITIONAL_FILES:
+                if os.path.exists(path):
+                    files.update({path})
+            for path in sorted(list(files)):
+                h.update(
+                    "{} {}\n".format(
+                        hash_path(os.path.realpath(os.path.join(BASE_DIR, path))), path
+                    )
+                )
+            task.setdefault("attributes", {}).setdefault("cached_task", {})

I'm not sure why this is necessary?

escapewindow

comment created time in 24 days

Pull request review comment mozilla-releng/scriptworker-scripts

bug 1597598 - use taskgraph

+@transforms.add
+def build_cache(config, tasks):
+    repo_name = subprocess.check_output(["git", "remote", "get-url", "origin"]).rstrip()
+    repo_name = repo_name.replace(".git", "").rstrip("/")
+    repo_name = repo_name.split("/")[-1]
+
+    for task in tasks:
+        if task.get("cache", True) and not taskgraph.fast:
+            digest_data = []
+            h = hashlib.sha256()
+            h.update(
+                json.dumps(task.get("attributes", {}).get("digest-extra", {}), indent=2, sort_keys=True)
+            )

You don't need to pre-hash the data passed to digest_data; it takes a list of strings and generates a digest from them.
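
For illustration, a minimal helper along those lines (the "type"/"name"/"digest-data" keys are what taskgraph.transforms.cached_tasks consumed at the time; treat the exact key names and the cache-type string as assumptions to verify against the taskgraph version in use):

    import json


    def set_cache(task, cache_name, extra_digest_data):
        # Hand plain strings to the cached_tasks transform; it does the
        # hashing itself, so no hashlib pre-hashing is needed here.
        digest_data = [
            json.dumps(
                task.get("attributes", {}).get("digest-extra", {}),
                sort_keys=True,
            )
        ]
        digest_data.extend(extra_digest_data)
        task["cache"] = {
            "type": "scriptworker-scripts.v2",  # illustrative cache-type name
            "name": cache_name,
            "digest-data": digest_data,
        }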

escapewindow

comment created time in 24 days

Pull request review comment mozilla-releng/scriptworker-scripts

bug 1597598 - use taskgraph

+            directories = task.get("attributes", {}).get("digest-directories", [])
+            files = set([])
+            for d in directories:
+                directory = os.path.join(BASE_DIR, d)
+                files.update(list_files(directory))
+            # files.update(list_files(os.path.join(BASE_DIR, "taskcluster")))
+            for path in ADDITIONAL_FILES:
+                if os.path.exists(path):
+                    files.update({path})
+            for path in sorted(list(files)):
+                h.update(
+                    "{} {}\n".format(
+                        hash_path(os.path.realpath(os.path.join(BASE_DIR, path))), path
+                    )
+                )

You can use taskgraph.util.hash.hash_paths for this.
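
Something in this direction, as a sketch (hash_paths walks the given glob patterns under a base path and returns one combined hash per call; the IGNORE_DIRS/IGNORE_EXTENSIONS filtering above would have to be re-expressed as patterns if it is still wanted):

    import os

    from taskgraph.util.hash import hash_paths

    BASE_DIR = os.getcwd()


    def digest_for_directories(directories):
        # One digest string per digest-directory, suitable for digest_data.
        return [
            hash_paths(BASE_DIR, ["{}/**".format(d)])
            for d in sorted(directories)
        ]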

escapewindow

comment created time in 24 days

Pull request review comment mozilla-releng/scriptworker-scripts

bug 1597598 - use taskgraph

+# This Source Code Form is subject to the terms of the Mozilla Public
+# License, v. 2.0. If a copy of the MPL was not distributed with this
+# file, You can obtain one at http://mozilla.org/MPL/2.0/.
+---
+loader: taskgraph.loader.transform:loader
+
+kind-dependencies:
+    - k8s-image

I think this should depend on tox as well?

escapewindow

comment created time in 24 days

PR opened mozilla/community-tc-config

Use preemptible GCP instances.

I noticed that the GCP configuration wasn't using preemptible instances. I wasn't sure if this was deliberate or an oversight.

+5 -1

0 comment

1 changed file

pr created time in 25 days

create branch tp-tc/community-tc-config

branch : preempt

created branch time in 25 days

delete branch tp-tc/releng-rfcs

delete branch : ci-admin-automation

delete time in a month

push event mozilla-releng/releng-rfcs

Tom Prince

commit sha 9de9360497e799eb0187d35249475cd93d682e0f

Add ci-admin automation RFC. (#31)

view details

push time in a month

PR merged mozilla-releng/releng-rfcs

Automate running ci-admin on push by running it in cloudops' jenkins. Phase: Final Comment

Rendered

@mozilla-releng/releng @mozilla-releng/relops @djmitche @imbstack @petemoore @moz-hwine @ajvb @edunham @sciurus

+54 -0

9 comments

1 changed file

tomprince

pr closed time in a month

pull request comment taskcluster/taskcluster-rfcs

Intelligent Worker Cycling RFC

Rendered

I didn't understand this comment, sorry! Ah - is the issue that the doc name is 0161-deploymentid.md? If that is what you meant, indeed I can change that to 0161-intelligent-worker-cycling.md.

No, it is a convenient link to the rendered version of the RFC for ease of reading. I would've added it to the description if I had permission.

petemoore

comment created time in a month

Pull request review comment taskcluster/taskcluster-rfcs

Intelligent Worker Cycling RFC

+# RFC 0161 - Intelligent worker cycling on worker pool update
+* Comments: [#161](https://github.com/taskcluster/taskcluster-rfcs/pull/161)
+* Proposed by: @petemoore
+
+# Summary
+
+Workers of a worker pool should be gracefully and automatically refreshed or
+decommissioned if they are no longer consistent with the most recent worker
+pool definition.
+
+## Motivation
+
+Without checks and balances in place, worker pools may contain outdated workers
+whose configuration no longer reflects the most recent worker pool definition,
+for an unbounded period of time.
+
+With intelligent worker cycling, we can ensure that workers with outdated
+configuration are swiftly removed from worker pools.
+
+# Details
+
+The responsibility for keeping workers up-to-date is split between Worker
+Manager (for ensuring that provider launch configuration of workers is valid)
+and the workers, responsible for refreshing `workerConfig` on reregisterWorker
+calls.
+
+This RFC involves changes to both Worker Manager and Worker-Managed-spawned
+workers (via Worker Runner):
+
+* Worker Manager will keep a record of the launch parameters it used for all
+  workers that it spawns
+* Worker Manager will track changes to Worker Pool definitions that affect
+  launch configurations of workers
+* When Worker Manager receives a `workermanager.registerWorker` or
+  `workermanager.reregisterWorker` call from a worker whose launch
+  configuration is determined to be from a previous version of the current
+  worker pool definition, it will assess whether its launch configuration is
+  still valid against the latest worker pool definition, and if not, return a
+  HTTP 410 status code with a message body explaining that the worker's launch
+  configuration is no longer conformant.
+* Worker Manager will include (the most recent) `workerConfig` in
+  `workermanager.reregisterWorker` responses for workers with up-to-date launch
+  configurations (`workermanager.registerWorker` already includes `workerConfig`)
+* Workers (via Worker Runner) will refresh their configs from successful
+  `workermanager.reregisterWorker` calls, or gracefully shutdown when receiving
+  a HTTP 410 status code from either `workermanager.registerWorker` or
+  `workermanager.reregisterWorker`
+
+## Worker Manager changes
+
+Depending on the provider, Worker Pools contain different data. Regardless, the
+logic to decide if a worker pool launch configuration has changed is the same:
+
+* If Worker Manager can determine that the parameters used to launch an
+  existing running worker (instance type, disk sizes, region, ...) from a
+  previous version of the worker pool definition are still a valid combination
+  of parameters against the latest worker pool configuration, responses to
+  `workermanager.registerWorker` or `workermanager.reregisterWorker` will be
+  treated as before, and the worker will not be earmarked for decommission.
+* Otherwise, if Worker Manager is either unable to determine if the launch
+  parameters of a given worker are still valid, or is able to determine that
+  they are no longer consisten with the latest worker pool definition, it should:
+
+    * Respond to `workermanager.registerWorker` and `workermanager.reregisterWorker`
+      calls with an HTTP 410 response status code
+    * Terminate the worker (in the case of non-static workers) as soon as
+      possible, but _no earlier_ than 15 minutes after the worker's current
+      credentials have expired

I'm not sure I understand the motivation for the restriction on timing here. I also wonder if the quarantine provision below should always happen, independent of whether the worker is static or not.

petemoore

comment created time in a month

pull request comment taskcluster/taskcluster-rfcs

Intelligent Worker Cycling RFC

petemoore

comment created time in a month

push event tp-tc/releng-rfcs

Tom Prince

commit sha c075b51208bd5b97ffcf6b61d6d93eb2f35f079f

Add ci-admin automation RFC.

view details

push time in a month

push event tp-tc/releng-rfcs

Tom Prince

commit sha a1b2fb909ea88fbbdec42444b214aa88970e14cd

Add ci-admin automation RFC.

view details

push time in a month

Pull request review comment mozilla-releng/adhoc-signing

Attempt to support new manifest format and fetch-bmo.py

+# This Source Code Form is subject to the terms of the Mozilla Public
+# License, v. 2.0. If a copy of the MPL was not distributed with this
+# file, You can obtain one at http://mozilla.org/MPL/2.0/.
+
+from __future__ import absolute_import, print_function, unicode_literals
+
+from six import text_type
+
+from voluptuous import Required
+
+from taskgraph.util.schema import taskref_or_string
+from taskgraph.util import path as mozpath
+from taskgraph.transforms.fetch import fetch_builder
+
+
+@fetch_builder('bmo-attachment', schema={
+    # The URL to download.
+    Required('attachment-id'): text_type,
+
+    # The SHA-256 of the downloaded content.
+    Required('sha256'): text_type,
+
+    # Size of the downloaded entity, in bytes.
+    Required('size'): int,
+
+    # The name to give to the generated artifact.
+    Required('artifact-name'): text_type,
+
+})
+def create_fetch_url_task(config, name, fetch):
+
+    artifact_name = fetch['artifact-name']
+
+    workdir = '/builds/worker'
+
+    # Arguments that matter to the cache digest
+    args = [
+        '--sha256', fetch['sha256'],
+        '--size', '%d' % fetch['size'],
+        '--name', artifact_name,
+        fetch['attachment-id']
+    ]
+
+    cmd = [
+        'bash',
+        '-c',
+        'cd {} && '
+        '/usr/bin/python3 {} {}'.format(

I would be inclined to defer to the shebang, rather than explicitly calling python. This will require chmod +x in the dockerfile.

Callek

comment created time in a month

Pull request review comment mozilla-releng/adhoc-signing

Attempt to support new manifest format and fetch-bmo.py

+def create_fetch_url_task(config, name, fetch):
+
+    artifact_name = fetch['artifact-name']
+
+    workdir = '/builds/worker'
+
+    # Arguments that matter to the cache digest
+    args = [
+        '--sha256', fetch['sha256'],
+        '--size', '%d' % fetch['size'],
+        '--name', artifact_name,
+        fetch['attachment-id']
+    ]
+
+    cmd = [
+        'bash',
+        '-c',
+        'cd {} && '

Instead of setting the current directory, you should pass the full path to the destination here.
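
For example, something in this direction (a sketch only; the --dest flag, the script location, and the literal values stand in for the real fetch-bmo.py interface and the task's fetch config):

    import os

    workdir = '/builds/worker'
    artifact_name = 'attachment.bin'      # stand-in for fetch['artifact-name']
    dest = os.path.join(workdir, 'artifacts', artifact_name)

    cmd = [
        '/builds/worker/bin/fetch-bmo.py',
        '--sha256', 'deadbeef...',        # stand-in for fetch['sha256']
        '--size', '12345',                # stand-in for fetch['size']
        '--dest', dest,                   # hypothetical destination flag
        '9999999',                        # stand-in for the attachment id
    ]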

Callek

comment created time in a month

Pull request review comment mozilla-releng/adhoc-signing

Attempt to support new manifest format and fetch-bmo.py

 RUN apt-get update && \
 
 # %include-run-task
 
+# %include taskcluster/run-task/fetch-bmo.py
+ADD topsrcdir/taskcluster/run-task/fetch-bmo.py /builds/worker/bin/fetch-bmo.py

I would be inclined to use /usr/local/bin/fetch-bmo.

Callek

comment created time in a month

Pull request review comment mozilla-releng/adhoc-signing

Attempt to support new manifest format and fetch-bmo.py

+#!/usr/bin/python3 -u

This file should either live in taskcluster/docker/fetch or taskcluster/script/fetch.

Callek

comment created time in a month

pull request comment mozilla-services/cloudops-deployment-proxy

Bug 1619470: Add support for listening to hg.mozilla.org pulse messages.

@oremj I think this is ready for review.

tomprince

comment created time in a month

push event tp-tc/cloudops-deployment-proxy

Tom Prince

commit sha 7c62924f08dfaf7bf299bc3b0234b805719d7b59

Add some verification.

view details

push time in a month

delete branch tp-tc/treeherder

delete branch : selected-task-without-run

delete time in a month
