
GoogleChrome/lighthouse 20198

Automated auditing, performance metrics, and best practices for the web.

aFarkas/html5shiv 9688

This script is the de facto way to enable use of HTML5 sectioning elements in legacy Internet Explorer.

google/ios-webkit-debug-proxy 4883

A DevTools proxy (Chrome Remote Debugging Protocol) for iOS devices (Safari Remote Web Inspector).

ChromeDevTools/awesome-chrome-devtools 4171

Awesome tooling and resources in the Chrome DevTools & DevTools Protocol ecosystem

GoogleChrome/lighthouse-ci 3231

Automate running Lighthouse for every commit, viewing the changes, and preventing regressions

csnover/TraceKit 870

Attempts to create stack traces for unhandled JavaScript exceptions in all major browsers.

borismus/device.js 765

Semantic client-side device detection with Media Queries

GoogleChrome/chrome-launcher 631

Launch Google Chrome with ease from node.

GoogleChrome/devtools-docs 609

The legacy documentation for Chrome DevTools.

benschwarz/metaquery 328

A declarative responsive web design syntax. Breakpoints, defined in `<meta>`

push event ChromeDevTools/devtools-protocol

devtools-bot

commit sha c888b09ea683c4da68d05945c57a900435ac1007

Updates

view details

push time in 7 hours

pull request comment GoogleChrome/lighthouse

core(gather-runner): error on non-HTML

So from these snippets it sounds like Content-Type may be the stronger method.

Content-Type is provided in the response headers. mimeType is provided by the browser and is based on Content-Type.

since the browser can choose a mimeType even if a Content-Type isn't provided, we should just go with mimeType. it's the signal that's more in tune with how the browser read the content.

let's keep this check simple: if the mimeType isn't text/html we have a problem.

that's it. :)
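A minimal sketch of that check (the `mainRecord` shape and the thrown error here are illustrative assumptions, not the PR's actual code):

  // Sketch: trust the browser's post-sniff mimeType over the raw Content-Type header.
  function assertMainDocumentIsHtml(mainRecord) {
    if (mainRecord.mimeType !== 'text/html') {
      throw new Error(`expected text/html but the main document was ${mainRecord.mimeType}`);
    }
  }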

lemcardenas

comment created time in 16 hours

issue comment GoogleChrome/lighthouse

Increased 404s for lighthouse_worker in Canary Chrome DevTools

Today we ran into this sorta thing again. The 404s were fine in aggregate, but we had one notable user get one. (And he got served the updated fallback thx to connor.)

Our user updated to the latest Canary sometime this afternoon. That build (86.0.4189.0) was built from a git commit that happened at 11:24am. Chrome builders finished the Canary build at 1:18pm. (Presumably, Chrome started pushing out that update around then)

The uploader had started a new cycle at 11:19am. It fetches latest from the repo right then, so it missed our key commit by 5 minutes. The cycle ran for a few hours and a new one started right after 3:00pm. And at 3:06pm, the uploader uploaded the latest DevTools version, fixing the LH version in Canary. Our user found a very special window. :)


fixes

A while ago, connor and I made sure the uploader prioritizes release-bot commits, which are the key ones (and this was one). https://chromium.googlesource.com/chromium/tools/chrome-devtools-frontend/+/refs/heads/master/gce/uploader_iteration.sh#140

However, since uploader cycles can take 4-8 hours, there's risk for anyone who is running an hours-old build.

we can do a couple of things:

  1. make the uploader cycle shorter
  2. upload any key release-bot commits more frequently than once per cycle.

More medium-term, we could also reconsider how we handle network fallback. One alternative is serving Chrome 86 known resources to Chrome 86.xx if we don't have that particular commit. That'd also fix Brave and Yandex browsers, which get a pretty broken experience. We'd need to account for the remote-debugging case, still.

Gosh the uploader is a pain.

connorjclark

comment created time in 16 hours

issue comment GoogleChrome/lighthouse

Increased 404s for lighthouse_worker in Canary Chrome DevTools

TL;DR REMOTE_MODULE_FALLBACK_REVISION needs to be updated.

I was using another solution/hack because previous devtools TL wouldn't let me update that fallback revision. I updated the GAE commit metadata for the existing fallback revision. I just had it pretend like that hash was really a chrome 77 hash instead of a chrome 55 hash (or whatever). This basically uses the "Redirection hack" as mentioned in the DevTools Uploader guide.

connorjclark

comment created time in 17 hours

pull request comment GoogleChrome/lighthouse

core(gather-runner): error on non-HTML

nice query!

Maybe we'll need to combine mimeType with other information? It looks like checking for 'text/html' would catch a large majority of cases but would miss some. Since this will be a fatal error, we'll probably want to be more conservative in throwing. e.g. a server shouldn't be serving html with mime type 'content/html', but if Lighthouse can still run against it, we shouldn't stop them from doing so.

ehhhh. i'm happy to yell at content/html even tho technically the page will render.

Because we have the browser sniff, it complicates things. my assumption is that the mimeType from the protocol is post-sniff but i'm not confident.

broad categories we have

  1. page is served with text/html and is text/html. YAY
  2. page is served with non-text/html and isn't HTML. page (probably) renders fine?
    • we throw error.
  3. page is served with non-text/html but is HTML. page renders fine.
    • technically the page is good enough for us, but we should throw this error anyway.
  4. page is not served with a content-type, but is HTML and the browser sniffs it and it works. YAY
  5. page is not served with a content-type, but isn't HTML and the browser sniffs and who knows.
    • Tough situation. LH should throw on this. I have no idea what the .mimeType is in this scenario, but from the above comment it sounds like it's not text/html.

given that we're talking about 0.015% of pages in HA.. I think we can afford to just flag everything that's without a .mimeType of text/html.

lemcardenas

comment created time in 18 hours

issue comment GoogleChrome/lighthouse

Audit: Animations on layout inducing properties.

(a few years later..)

It looks like we already pass information on what property we're animating in the Animation trace event.

FWIW it looks like the CSS property isn't in the trace event. But https://chromium-review.googlesource.com/532435 added reasons that I think are the exact signal we want. @adamraine is looking into surfacing those reasons into the Animation trace event.
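As a rough sketch, an audit along these lines might start by filtering the trace (the event name match is based on the comment above; whether the compositing-failure reasons land in `args.data` is exactly the open question, so that shape is an assumption):

  // Hypothetical: pull Animation events out of a Lighthouse trace artifact.
  function getAnimationEvents(trace) {
    return trace.traceEvents.filter(event => event.name === 'Animation');
  }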


locations in today's codebase:

tdresser

comment created time in 19 hours

pull request comment GoogleChrome/lighthouse

core(page-functions): truncate long attribute values in HTML snippets

  • let's count the length of the attribute name + value as we go, and once it hits ~500-600, we stop and just elide the rest of the node with

do we need to? since this adds so much more complexity i'd rather defer until we come across a real example that necessitates it.
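For the record, the deferred idea would look something like this sketch (the ~500 budget and the helper name are illustrative):

  // Accumulate attribute name + value lengths while building the snippet,
  // and elide the rest of the node once the budget is crossed.
  function buildAttributesSnippet(element, budget = 500) {
    const parts = [];
    let used = 0;
    for (const attr of element.attributes) {
      used += attr.name.length + attr.value.length;
      if (used > budget) {
        parts.push('…');
        break;
      }
      parts.push(`${attr.name}="${attr.value}"`);
    }
    return parts.join(' ');
  }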

Beytoven

comment created time in 2 days

Pull request review comment GoogleChrome/lighthouse

core(renderer): display n/a as the score for categories with entirely n/a audits

 describe('CategoryRenderer', () => {
         'no manual description');
   });

-  it('renders not applicable audits if the category contains them', () => {
-    const a11yCategory = sampleResults.categories.accessibility;
-    const categoryDOM = renderer.render(a11yCategory, sampleResults.categoryGroups);
-    assert.ok(categoryDOM.querySelector(
+  describe('categories with not applicable audits', () => {
+    let a11yCategory;
+
+    beforeEach(()=> {
+      a11yCategory = JSON.parse(JSON.stringify(sampleResults.categories.accessibility));

nah, you'll see a lot of "JSON.parse(JSON.stringify(" across our tests for the exact same reason. ;)
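For context, the idiom being referenced: `JSON.parse(JSON.stringify(obj))` is a quick deep clone, so each test mutates its own copy of `sampleResults` and nothing leaks across tests. A toy example:

  const original = {score: 1, auditRefs: [{id: 'color-contrast'}]};
  const clone = JSON.parse(JSON.stringify(original));
  clone.auditRefs[0].id = 'changed';
  console.log(original.auditRefs[0].id); // still 'color-contrast'; the original is untouched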

saavan-google-intern

comment created time in 2 days

Pull request review comment GoogleChrome/lighthouse

deps(angular): update minor version of angular fixture

 const assert = require('assert').strict;
 const {computeCSSTokenLength, computeJSTokenLength} = require('../../lib/minification-estimator.js'); // eslint-disable-line max-len

 const angularFullScript = fs.readFileSync(require.resolve('angular/angular.js'), 'utf8');
+const zoneMinifiedScript = fs.readFileSync(`${__dirname}/../../../lighthouse-cli/test/fixtures/dobetterweb/third_party/aggressive-promise-polyfill.js`, 'utf8'); // eslint-disable-line max-len

we could do another of these

https://github.com/GoogleChrome/lighthouse/blob/13301339637d7d3f6ad708b2759e70c175e85122/lighthouse-cli/test/fixtures/static-server.js#L161-L164

paulirish

comment created time in 2 days

Pull request review comment GoogleChrome/lighthouse

deps(angular): update minor version of angular fixture

 const assert = require('assert').strict;
 const {computeCSSTokenLength, computeJSTokenLength} = require('../../lib/minification-estimator.js'); // eslint-disable-line max-len

 const angularFullScript = fs.readFileSync(require.resolve('angular/angular.js'), 'utf8');
+const zoneMinifiedScript = fs.readFileSync(`${__dirname}/../../../lighthouse-cli/test/fixtures/dobetterweb/third_party/aggressive-promise-polyfill.js`, 'utf8'); // eslint-disable-line max-len

i looked into it, but

https://github.com/GoogleChrome/lighthouse/blob/13301339637d7d3f6ad708b2759e70c175e85122/lighthouse-cli/test/fixtures/static-server.js#L67

makes that pretty challenging. ideas?

paulirish

comment created time in 2 days

push event GoogleChrome/lighthouse

Paul Irish

commit sha 7affb58aad876fe0f94199a5b2f74d42d457bd2d

rename

view details

push time in 2 days

push event GoogleChrome/lighthouse

Paul Irish

commit sha 304c352548496eb6d3cb0ac23292b6c498a413fb

Update lighthouse-core/test/lib/minification-estimator-test.js Co-authored-by: Patrick Hulce <patrick.hulce@gmail.com>

view details

push time in 2 days

PR opened GoogleChrome/lighthouse

deps(angular): update minor version of angular fixture

same as #10086 just 7 months later.

this gets rid of the last security advisory we have.


also added one more minification-estimator test.... to make sure it doesn't overestimate pre-minified savings.
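A sketch of what that extra test could look like, reusing the `computeJSTokenLength` helper and the `zoneMinifiedScript` fixture from the diff above (the 0.9 tolerance is an assumption, not the PR's actual assertion):

  it('does not overestimate savings for pre-minified code', () => {
    // An already-minified script should have a token length close to its raw
    // length, i.e. the estimator should report near-zero potential savings.
    const tokenLength = computeJSTokenLength(zoneMinifiedScript);
    assert.ok(tokenLength / zoneMinifiedScript.length > 0.9);
  });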

+13 -6

0 comment

2 changed files

pr created time in 2 days

create branch GoogleChrome/lighthouse

branch : angularbump

created branch time in 2 days

push event GoogleChrome/lighthouse

Paul Irish

commit sha 13301339637d7d3f6ad708b2759e70c175e85122

misc: annotate version-specific logic with COMPAT comments (#11019)

view details

push time in 2 days

delete branch GoogleChrome/lighthouse

delete branch : compatcomment

delete time in 2 days

PR merged GoogleChrome/lighthouse

misc: annotate version-specific logic with COMPAT comments cla: yes waiting4reviewer

We had a weak convention to annotate anything with branching/specific logic to handle new/old browsers with "COMPAT"

Having it makes it easier to take inventory of the special cases we have.

We were missing a few of these.

You'll also spot some stuff we can take action on. I've kept that out of this PR, but will followup with them.

+8 -8

0 comment

8 changed files

paulirish

pr closed time in 2 days

push event GoogleChrome/lighthouse

Patrick Hulce

commit sha 014820eeffe7790b2fb6f4e2e2f639acbc9527c9

tests: relax requestIdleCallback smoke expectation (#11041)

view details

push time in 2 days

delete branch GoogleChrome/lighthouse

delete branch : lantern_ric_shim_smoke_flake

delete time in 2 days

PR merged GoogleChrome/lighthouse

tests: relax requestIdleCallback smoke expectation cla: yes land-when-ci-is-green waiting4reviewer

Summary

The new requestIdleCallback smoke can be flaky in CI (and locally too, just rarer). Loosen the expectation quite a bit.

Related Issues/PRs

ref https://github.com/GoogleChrome/lighthouse/pull/11017#issuecomment-652060654

+5 -4

0 comment

2 changed files

patrickhulce

pr closed time in 2 days

push event GoogleChrome/lighthouse

Paul Irish

commit sha eb9d6c602f6aabb25999b63ac34739186c2578d0

empty

view details

push time in 2 days

PR opened GoogleChrome/lighthouse

docs: update architecture.md
+3 -3

0 comment

3 changed files

pr created time in 2 days

create branch GoogleChrome/lighthouse

branch : archupdate

created branch time in 2 days

push event GoogleChrome/lighthouse

Saavan Nanavati

commit sha 76d06f876fa3b6e4bd379698fc3246afaa08210f

deps: upgrade codecov to 3.7.0 (#11039)

view details

push time in 2 days

PR merged GoogleChrome/lighthouse

deps: upgrade codecov to 3.7.0 cla: yes waiting4reviewer

Summary

Lighthouse's codecov integration has been flaky recently, and this is likely a result of using an old version of the codecov node uploader (3.6.5) as seen in a recent commit here. This PR upgrades to the latest version (3.7.0) which, codecov support says, has a critical fix to the v4 upload path, providing more stability to the uploads.

+5 -5

1 comment

2 changed files

saavan-google-intern

pr closed time in 2 days

pull request comment GoogleChrome/lighthouse

deps: upgrade codecov to 3.7.0

Awesome thanks for fixing the uploading!

saavan-google-intern

comment created time in 2 days

push event paulirish/dotfiles

Paul Irish

commit sha cd71b7bccfa5600708c79e19a4f04ea20f6d1b07

deletemergesquashed py3

view details

Paul Irish

commit sha dd190992a49871613c38e1ec3e49e737d678c231

compare gzip and brotli

view details

push time in 2 days

push event ChromeDevTools/devtools-protocol

devtools-bot

commit sha 81d75f7666c316c0d1f0b94e41feb942ef36f4af

Updates

view details

push time in 3 days

push event ChromeDevTools/devtools-protocol

devtools-bot

commit sha b73da50bef0edd1a359d8dc3557879c2f73f1818

Updates

view details

push time in 3 days

push event ChromeDevTools/devtools-protocol

devtools-bot

commit sha e07f3f9261e0a1801c454b0fdc07dea72549c936

Updates

view details

push time in 3 days

push event GoogleChrome/lighthouse

Paul Irish

commit sha c139d72a56512e82e10f7f15cd1b47d47b2c7410

tests: parallelize all the tests (#11009) Co-authored-by: connorjclark and brendankenny

view details

push time in 7 days

delete branch GoogleChrome/lighthouse

delete branch : uberjobs

delete time in 7 days

PR merged GoogleChrome/lighthouse

tests: parallelize all the tests cla: no waiting4reviewer

This PR merges both @connorjclark's #10988 and @brendankenny's #10993. And then I tweaked the work balance of the non-smoke stuff.

smoke stuff

  • ToT Chrome is added to the mix
  • brendan's approach of halving the smoketests has a big impact on total duration.

3 major non-smoke jobs: basics + unit

  • I looked at the timings a bit. There's benefit to splitting apart all the non-smoke stuff, otherwise it'll finish after the halved smokes do. I don't see any material benefits when split into 3 pieces, so as long as we kick out the longest smoke step into its own job, we're good. Ideally these two non-smoke jobs have equal durations.
  • Contrasting with connor's, i essentially merge misc back into build
  • Contrasting with brendan's, I build-all in both basics and unit.

The last two runs of this branch finished in 6:16 and 6:08, which is 2 and 6 min faster than the other two PRs.

+113 -62

3 comments

3 changed files

paulirish

pr closed time in 7 days

Pull request review comment GoogleChrome/lighthouse

tests: parallelize all the tests

 jobs:
         # https://buildtracker.dev/docs/guides/github-actions#configuration
         BT_API_AUTH_TOKEN: ${{ secrets.BT_API_AUTH_TOKEN }}

+    - name: Upload dist
+      uses: actions/upload-artifact@v1
+      with:
+        name: dist
+        path: dist/
+
     # Fail if any changes were written to source files (ex, from: build/build-cdt-lib.js).
     - run: git diff --exit-code
+
+  # `unit` includes just unit and proto tests.
+  unit:
+    runs-on: ubuntu-latest
+
+    steps:
+    - name: git clone
+      uses: actions/checkout@v2
+
+    - name: Use Node.js 10.x
+      uses: actions/setup-node@v1
+      with:
+        node-version: 10.x
+
+    - name: Set up protoc
+      uses: arduino/setup-protoc@7ad700d
+      with:
+        version: '3.7.1'
+        repo-token: ${{ secrets.GITHUB_TOKEN }}
+
+    - name: Set up Python
+      uses: actions/setup-python@v1
+      with:
+        python-version: 2.7
+    - name: Install Python dependencies
+      run: |
+        python -m pip install --upgrade pip
+        pip install protobuf==3.7.1
+
+    - run: yarn --frozen-lockfile
+
+    - run: yarn test-proto # Run before unit-core because the roundtrip json is needed for proto tests.
+
+    - run: sudo apt-get install xvfb
+    - name: yarn unit
+      run: xvfb-run --auto-servernum yarn unit
+
+  # `smoke` runs as a matrix across 4 jobs:
+  #  * The smoketest groups are split across two runners, to parallelize.
+  #  * Then, those are run with both Chrome stable and ToT Chromium, in parallel
+  smoke:
+    runs-on: ubuntu-latest
+    strategy:
+      matrix:
+        chrome-channel: ['stable', 'ToT']
+        smoke-test-invert: [false, true]
+      # e.g. if smoke 0 fails, continue with smoke 1 anyway
+      fail-fast: false
+    env:
+      # The smokehouse tests run by job `smoke_0`. `smoke_1` will run the rest.
+      SMOKE_GROUP_1: a11y oopif pwa pwa2 pwa3 dbw redirects errors offline
+    name: smoke_${{ strategy.job-index }}_${{ matrix.chrome-channel }}

what were you thinking?

btw really glad you found this name prop tho.. since we need these to be deterministically named for branch protection.

paulirish

comment created time in 7 days

Pull request review comment GoogleChrome/lighthouse

tests: parallelize all the tests

 name: 💡🏠

 on: [pull_request]

 jobs:
-  ci:
-
+  # `basics` includes all non-smoke CI, except for unit and proto
+  basics:
     runs-on: ubuntu-latest
-    strategy:
-      # e.g. if lint fails, continue to the unit tests anyway
-      fail-fast: false

+    # A few steps are duplicated across all jobs. Can be done better when this feature lands:
+    #   https://github.community/t/reusing-sharing-inheriting-steps-between-jobs-declarations/16851
+    #   https://github.com/actions/runner/issues/438
     steps:
     - name: git clone
       uses: actions/checkout@v2

thx. i'll give that a shot.

paulirish

comment created time in 7 days

Pull request review comment GoogleChrome/lighthouse

misc: add tools to track issue response time

...
+  console.log('  By User');
+  reviewsByLogin.forEach(requests => {

the console.table() version of this is

  const byUser = Object.fromEntries(reviewsByLogin.map(requests => {
    const user = requests[0].assignee;
    const reviews = requests.filter(r => r.firstCommentByAssignee);
    const medianResponseTime = requests[Math.floor(reviews.length / 2)].reviewTimeInHours;
    return [user, {reviews: reviews.length, medianResponse: `${medianResponseTime.toFixed(1)} h`}];
  }));
  console.table(byUser);

eh. just slightly better IMO

(before/after screenshots)
patrickhulce

comment created time in 7 days

Pull request review comment GoogleChrome/lighthouse

misc: add tools to track issue response time

+/**
+ * @license Copyright 2020 The Lighthouse Authors. All Rights Reserved.
+ * Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
+ */
+'use strict';
+
+/**
+ * @fileoverview Used in conjunction with `./analyze-issues.js` to analyze our Issue and PR response times
+ * as a team. This file downloads GitHub data to `.tmp/_issues.json` for analysis.

can you add something like

_issues.json holds data on all issues for the last DAY_FILTER (90) days. Any comments and events are then fetched separately and added to their parent issue's object.

patrickhulce

comment created time in 7 days

Pull request review comment GoogleChrome/lighthouse

misc: add tools to track issue response time

...
+/**
+ * @param {string} label
+ * @param {Array<AugmentedGitHubIssue>} issues
+ */
+function computeAndLogIssueResponseStats(label, issues) {

i think it's inevitable that we'll want the compute data separately.. so could we split the compute and log bits into separate functions? (there'll still be a good amount of logic in the log bit, but i think that's fine)

patrickhulce

comment created time in 7 days

Pull request review comment GoogleChrome/lighthouse

misc: add tools to track issue response time

...
+  // Yes really do this in series to avoid hitting abuse limits of GitHub API
+  for (const issue of issues) {
+    await Promise.all([
+      fetchAndInjectEvents(issue).catch(async err => {
+        console.error('Events failed! Trying again', err); // eslint-disable-line no-console

just curious what sorta things cause the failure?

patrickhulce

comment created time in 7 days

Pull request review comment GoogleChrome/lighthouse

misc: add tools to track issue response time

...
+function computeAndLogReviewResponseStats(label, issues) {
+  const initialReviewRequests = issues
+    .map(issue => {
+      const assignEvent = issue.events.find(
+        event => event.event === 'assigned'

what is this 80 character nonsense about? :)

patrickhulce

comment created time in 7 days

Pull request review comment GoogleChrome/lighthouse

misc: add tools to track issue response time

...
+const RESPONSE_EVENTS = new Set(['labeled', 'assigned', 'renamed', 'closed']);
+
+const HOUR_IN_MS = 60 * 60 * 1000;
+const DAY_FILTER = 90;

move this to under RESPONSE_LOGINS ?

patrickhulce

comment created time in 7 days

Pull request review comment GoogleChrome/lighthouse

misc: add tools to track issue response time

...
+  const comments = await response.json();
+  issue.comments = comments;
+  if (!Array.isArray(issue.comments)) {
+    console.warn('Comments was not an array', issue.comments); // eslint-disable-line no-console

you can drop a few of these // eslint-disable-line no-console since it's at the top

patrickhulce

comment created time in 7 days

Pull request review comment GoogleChrome/lighthouse

misc: add tools to track issue response time

...
+const _ISSUES_SINCE = _ISSUES.filter(
+  issue => new Date(issue.created_at).getTime() > START_AT

is the created_at filter different than the since on the API?

patrickhulce

comment created time in 7 days

Pull request review comment GoogleChrome/lighthouse

misc: add tools to track issue response time

...
+/** @param {number} ms @return {Promise<void>} */
+function wait(ms) {

developer's best friend.

patrickhulce

comment created time in 7 days

Pull request review comment GoogleChrome/lighthouse

misc: annotate version-specific logic with COMPAT comments

 const expectations = [
       },
     },
   },
-  // TODO: Uncomment when Chrome m84 lands
+  // TODO(COMPAT): Uncomment when Chrome m84 lands

true!

but i'mma keep it because the intent is to account for all of these.

paulirish

comment created time in 7 days

push event GoogleChrome/lighthouse

Paul Irish

commit sha de273e972df016e151bdea211b764b1b61e28217

tests: move proto roundtrip json to .tmp/ (#10995)

view details

push time in 8 days

delete branch GoogleChrome/lighthouse

delete branch : protortdist

delete time in 8 days

PR merged GoogleChrome/lighthouse

tests: move proto roundtrip json to .tmp/ cla: yes waiting4reviewer

#6183 added this file and at the time it was tracked in source control

in #10557 we added the file to gitignore

now that it's ignored, seems like it can be in dist rather than nestled amongst tracked files.


~i went with dist because #10994 but i suppose if we're not sharing artifacts between builds.... perhaps~ this makes more semantic sense in .tmp ~?~

+3 -4

2 comments

4 changed files

paulirish

pr closed time in 8 days

PR opened GoogleChrome/lighthouse

misc: annotate version-specific logic with COMPAT comments

We had a weak convention to annotate anything with branching/specific logic to handle new/old browsers with "COMPAT"

Having it makes it easier to take inventory of the special cases we have.

We were missing a few of these.

You'll also spot some stuff we can take action on. I've kept that out of this PR, but will followup with them.

+8 -8

0 comment

8 changed files

pr created time in 8 days

pull request comment GoogleChrome/lighthouse

core: error if chrome version does not support lcp metric

can someone summarize the conclusion of what we wanted in #10499? :)

hah yeah. :) good call.


stepping back..

We have options:

  • run-level warning: could be general version check or multiple specific ones
  • audit-level warning (not currently supported in the report ui for metrics, but doable)
  • audit-level error: a la parameterized NO_LCP with a reason

Keeping the error specific to the audit is ideal, so someone doing a11y-only doesn't need to deal with our perf requirements. IMO this is more important than the downside of potentially showing multiple warnings.

So.. excluding the global version check... I think the NO_FCP_OLD_CHROME-ish case that motivated this is served best by an audit error. Basically we're being more specific than just plain NO_LCP (much like we want to do for NO_FCP).

@connorjclark @brendankenny work for you?

@adamraine sorry for switching things up on you but yeah we hadn't really sorted out this consensus yet. :)

adamraine

comment created time in 8 days

create branch GoogleChrome/lighthouse

branch : compatcomment

created branch time in 8 days

Pull request review comment GoogleChrome/lighthouse

tests: parallelize all the tests

 jobs:
       with:
         node-version: 10.x

+    - name: Define ToT chrome path
+      if: matrix.chrome-channel == 'ToT'
+      run: echo "::set-env name=CHROME_PATH::/home/runner/chrome-linux-tot/chrome"

yeah the wording is stupid https://help.github.com/en/actions/reference/workflow-commands-for-github-actions#setting-an-environment-variable

Creates or updates an environment variable for any actions running next in a job.

"Next", you say...

… The action that creates or updates the environment variable does not have access to the new value, but all subsequent actions in a job will have access.

so yeah. it's all subsequent.

also i hate how this doc says "actions" when it should say "steps".


tested it earlier: https://github.com/GoogleChrome/lighthouse/runs/805258701?check_suite_focus=true

I read it two steps after i set it.


paulirish

comment created time in 8 days

Pull request review comment GoogleChrome/lighthouse

core: error if chrome version does not support lcp metric

 class GatherRunner {

       await GatherRunner.setupDriver(driver, options);

+      const milestone = (await driver.getBrowserVersion()).milestone;
+      if (milestone < 83) {

@adamraine good call. right now that's the best we got. But on the bright side, we know it's a Chrome UA, so there's not much variability we have to deal with. So a pretty simple regex should do the job.
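A minimal sketch of that regex (falling back to 0 on a non-match is an assumption):

  // Extract the Chrome milestone (major version) from a user agent string.
  function getChromeMilestone(userAgent) {
    const match = userAgent.match(/Chrome\/(\d+)/);
    return match ? Number(match[1]) : 0;
  }
  // getChromeMilestone('Mozilla/5.0 ... Chrome/83.0.4103.61 Safari/537.36') === 83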

adamraine

comment created time in 8 days

push event GoogleChrome/lighthouse

Paul Irish

commit sha 0b978e711dfbb2b32bfc61e290ee11917b88ef97

define chrome_path in yaml instead of external bash script

view details

push time in 8 days

push event GoogleChrome/lighthouse

Paul Irish

commit sha 111356d284d25bd93061aee80b8c232765ea25a8

split out

view details

push time in 8 days

push event GoogleChrome/lighthouse

Paul Irish

commit sha fdc819c11eb7cdd054f91a95dc00f50e63da1dc2

paren

view details

push time in 8 days

create branch GoogleChrome/lighthouse

branch : uberjobstestore

created branch time in 8 days

push event GoogleChrome/lighthouse

Paul Irish

commit sha 6fa8db7996dd664dedbaf8d9930c1c29293b8824

parens

view details

push time in 8 days

push event GoogleChrome/lighthouse

Paul Irish

commit sha c1713b41f19110bb28004708c3ce326ca432fb8e

dolla

view details

push time in 8 days

push event GoogleChrome/lighthouse

Paul Irish

commit sha 3e14de28aeaf2135515201727df9e0864ce5ac20

export

view details

push time in 8 days

push event GoogleChrome/lighthouse

Paul Irish

commit sha b2265da6e95a96adf87567467d2c195519fc29f8

canary tot

view details

push time in 8 days

push event GoogleChrome/lighthouse

Paul Irish

commit sha d69537cefeaf6cd27a5cdac4368359837ec59cfe

bash is hilarious

view details

push time in 8 days

push event GoogleChrome/lighthouse

Paul Irish

commit sha 13b297db245a401792a51f13008b4e0f58a8f3a2

cleanup bash?

view details

push time in 8 days

push event GoogleChrome/lighthouse

Paul Irish

commit sha bad8058808036e99aaad084d9f9e32ecf4fc4a98

bash ternary

view details

push time in 8 days

create branch GoogleChrome/lighthouse

branch : uberjobssetenv

created branch time in 8 days

Pull request review comment GoogleChrome/lighthouse

core: error if chrome version does not support lcp metric

 class GatherRunner {

       await GatherRunner.setupDriver(driver, options);

+      const milestone = (await driver.getBrowserVersion()).milestone;
+      if (milestone < 83) {

Typically we place these checks closer to the intended use. (Someone using LH without the perf category doesn't have any problem)

@connorjclark so i'm thinking this could go in either the LCP audit itself or trace-processor. wdyt?

Also, oddly, our existing comment sez 78, but the bug reporter also had 78. https://github.com/GoogleChrome/lighthouse/blob/b8bc05aa669913fdc5a7eb4f11ee1b9dc229d832/lighthouse-core/computed/metrics/largest-contentful-paint.js#L10

So I suppose the 78 is wrong, but what about 79-82?
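If it lands in the computed metric, the guard could look something like this sketch (the error plumbing is illustrative; the real version would throw a proper LighthouseError rather than a bare Error):

```js
/**
 * Illustrative guard for the LCP computed metric.
 * @param {number} milestone Chrome milestone the trace came from
 */
function assertLcpSupported(milestone) {
  // 83 per this thread; whether 79-82 emit usable LCP events is still open.
  if (milestone < 83) {
    throw new Error(`LCP requires Chrome 83+; trace came from m${milestone}`);
  }
}
```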

adamraine

comment created time in 8 days

Pull request review commentGoogleChrome/lighthouse

core: error if chrome version does not support lcp metric

 class GatherRunner {
        await GatherRunner.setupDriver(driver, options);
+      const milestone = (await driver.getBrowserVersion()).milestone;
+      if (milestone < 83) {

That's a more general solution to older Chromes running against LH. I personally really like that we have specific milestone requirements scattered throughout. Also makes it much easier for us to track the age of any particular assumption.

adamraine

comment created time in 8 days

pull request commentpuppeteer/puppeteer

Remove hard coded `?hl=en` from docs

yup. best practice is to leave ?hl out of links.

petele

comment created time in 8 days

push eventpuppeteer/puppeteer

Pete LePage

commit sha 393831fa94b406a9812471ce96a831714cc474fe

Remove hard coded `?hl=en` from docs (#6097)

view details

push time in 8 days

PR merged puppeteer/puppeteer

Remove hard coded `?hl=en` from docs cla: yes

The hard-coded ?hl=en is breaking deployments on WebFu

+1 -1

0 comments

1 changed file

petele

pr closed time in 8 days

Pull request review commentGoogleChrome/lighthouse

core: remove uses of deprecated extendedInfo field

 describe('Avoids front-end JavaScript libraries with known vulnerabilities', ()
     });
     assert.equal(auditResult.score, 0);
     assert.equal(auditResult.details.items.length, 1);
-    assert.equal(auditResult.extendedInfo.jsLibs.length, 3);
+    assert.equal(auditResult.details.debugData.vulnerableLibs.length, 1);

(drop this and L93)

Beytoven

comment created time in 9 days

Pull request review commentGoogleChrome/lighthouse

core: remove uses of deprecated extendedInfo field

             }
           }
         ],
-        "summary": {}
+        "summary": {},
+        "debugData": {
+          "type": "debugdata",
+          "vulnerableLibs": [
+            {
+              "name": "jQuery",
+              "npmPkgName": "jquery",
+              "version": "2.1.1",
+              "vulns": [
+                {
+                  "severity": "Medium",
+                  "numericSeverity": 2,
+                  "library": "jQuery@2.1.1",
+                  "url": "https://snyk.io/vuln/SNYK-JS-JQUERY-567880"
+                },
+                {
+                  "severity": "Medium",
+                  "numericSeverity": 2,
+                  "library": "jQuery@2.1.1",
+                  "url": "https://snyk.io/vuln/SNYK-JS-JQUERY-565129"
+                },
+                {
+                  "severity": "Medium",
+                  "numericSeverity": 2,
+                  "library": "jQuery@2.1.1",
+                  "url": "https://snyk.io/vuln/SNYK-JS-JQUERY-174006"
+                },
+                {
+                  "severity": "Medium",
+                  "numericSeverity": 2,
+                  "library": "jQuery@2.1.1",
+                  "url": "https://snyk.io/vuln/npm:jquery:20150627"
+                }
+              ],
+              "highestSeverity": "Medium"
+            }

sounds good. let's drop.

Beytoven

comment created time in 9 days

Pull request review commentGoogleChrome/lighthouse

core: remove uses of deprecated extendedInfo field

 class NoVulnerableLibrariesAudit extends Audit {
       {key: 'highestSeverity', itemType: 'text', text: str_(UIStrings.columnSeverity)},
     ];
     const details = Audit.makeTableDetails(headings, vulnerabilityResults, {});
-
+    /** @type {LH.Audit.Details.DebugData} */
+    const debugData = {

as discussed, let's drop.

Beytoven

comment created time in 9 days

push eventGoogleChrome/lighthouse

Matt Hobbs

commit sha fb79013f152ce2b277a51a0c3976a672be32ddcf

report: don't dim disclaimer anchor links (#10981)

view details

push time in 9 days

PR merged GoogleChrome/lighthouse

report: don't dim disclaimer anchor links cla: yes waiting4reviewer

Summary: Bugfix. In the viewer it is currently hard to tell that 'See calculator' is a link, as it matches the same grey as the paragraph. This PR removes that styling to match other links in the audit results (light blue).

Before (screenshot)

After (screenshot)

+0 -3

2 comments

1 changed file

Nooshu

pr closed time in 9 days

pull request commentGoogleChrome/lighthouse

report: don't dim disclaimer anchor links

Thanks @Nooshu for tracking down the code and making a great PR. appreciate it. :)

Nooshu

comment created time in 9 days

pull request commentGoogleChrome/lighthouse

report: don't dim disclaimer anchor links

I think the underline issue is lighthouse-metrics.com's bug: https://twitter.com/paul_irish/status/1275596637050552320

but that said... I can't really give a decent justification for why these links don't get a link color while the rest do.

so yeah. let's do this. 👍

Nooshu

comment created time in 9 days

PR opened GoogleChrome/lighthouse

tests: parallelize all the tests

This PR merges both @connorjclark's #10988 and @brendankenny's #10993, then tweaks the work balance of the non-smoke stuff.

smoke stuff

  • ToT Chrome is added to the mix
  • brendan's approach of halving the smoketests has a big impact on total duration.

2 major non-smoke jobs: basics + unit

  • I looked at the timings a bit. There's benefit to splitting apart all the non-smoke stuff; otherwise it finishes after the halved smokes do. I don't see any material benefit in going to 3 pieces, so as long as we kick the longest non-smoke step out into its own job, we're good. Ideally these two non-smoke jobs have equal durations.
  • Contrasting with connor's, I essentially merge misc back into build.
  • Contrasting with brendan's, I build-all in both basics and unit.

The last run of this branch finished in 6:16, which is 2 and 6 minutes faster than the other two PRs, respectively. (I think this was a bit lucky, tho.)

+124 -61

0 comments

4 changed files

pr created time in 9 days

push eventGoogleChrome/lighthouse

Paul Irish

commit sha 414cb0425c02e7058e289bb6508a8fd6487b5a56

comments

view details

push time in 9 days

push eventGoogleChrome/lighthouse

Paul Irish

commit sha ecce98c71816da555794be0eead2c512ab14ea65

name job with chrome channel

view details

push time in 9 days

push eventGoogleChrome/lighthouse

Paul Irish

commit sha fc2d26f5cdda37df933ef4e42e8e7f697ff888e3

name job with chrome channel

view details

push time in 9 days

push eventGoogleChrome/lighthouse

Paul Irish

commit sha 6cf867fece246491cae077353eba44261ec22b28

manually copy in brendan's yaml changes

view details

Brendan Kenny

commit sha 4416fea31ae85e8ceaf5d4636941547f4fecac89

tests: run smoke tests in parallel jobs

view details

Brendan Kenny

commit sha 98ded3f2b3b2ea5d7f1ef5c673adaced3a3b5487

feedback and split out basics

view details

Paul Irish

commit sha c76e915f642440e0a38b868b7fcf7cb6ff956fab

resolve conflicts

view details

push time in 9 days

create branchGoogleChrome/lighthouse

branch : uberjobs

created branch time in 9 days

pull request commentGoogleChrome/lighthouse

tests: move proto roundtrip json to .tmp/

.tmp done

paulirish

comment created time in 9 days

push eventGoogleChrome/lighthouse

Paul Irish

commit sha dc1129b2badebe6072686306683c4cd24bee672a

.tmp

view details

push time in 9 days

push eventGoogleChrome/lighthouse

Paul Irish

commit sha 6e276a5da4c131b4e2212943925650ddae7575b6

empty

view details

push time in 9 days

issue openedjoeldenning/narn

`narn run`

I often use npm run or yarn run to get a listing of all the scripts.

narn run runs npm run run, which fails:

❯ narn run
npm run run
npm ERR! missing script: run
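A minimal sketch of a possible fix, assuming narn translates an argv array before spawning (the function and parameter names here are hypothetical, not narn's actual internals):

```js
/**
 * Hypothetical translation: `narn run` with no script name becomes the
 * bare `npm run` / `yarn run` listing command instead of `npm run run`.
 * @param {string[]} args e.g. ['run'] for `narn run`
 * @param {'npm'|'yarn'} packageManager
 * @return {string[]} argv to spawn
 */
function translateRunCommand(args, packageManager) {
  const [command, ...rest] = args;
  if (command === 'run' && rest.length === 0) {
    return [packageManager, 'run']; // lists available scripts
  }
  return [packageManager, 'run', ...rest];
}
```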

created time in 9 days

issue commentGoogleChrome/lighthouse

Integration with UI testing

@Swazimodo thanks for filing the issue.

We've been thinking about this too and agree this is underserved.

Could you provide some details on your intended use cases? Are you mostly interested in performance metrics, or accessibility details?

Swazimodo

comment created time in 9 days

issue commentGoogleChrome/lighthouse

--local-overrides

Use cases around building lots of variants and testing in a more automated way.

seems juicy.

connorjclark

comment created time in 9 days

issue commentGoogleChrome/lighthouse

Failed to load favicon.ico in console violations

We're on board with a dedicated best-practices audit for having a favicon (one that doesn't 404).

We'll skip a dedicated network-requests-failed audit for now.
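For illustration, the core check of such an audit might look like this sketch (the network-record shape and favicon matching are assumptions, not a worked-out implementation; it ignores `<link rel="icon">` variants):

```js
/**
 * Hypothetical check: did the page load a favicon successfully?
 * @param {Array<{url: string, statusCode: number}>} networkRecords
 * @return {boolean}
 */
function hasWorkingFavicon(networkRecords) {
  const faviconRecords = networkRecords.filter(record =>
    new URL(record.url).pathname === '/favicon.ico');
  return faviconRecords.some(record =>
    record.statusCode >= 200 && record.statusCode < 300);
}
```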

wardpeet

comment created time in 9 days

issue commentGoogleChrome/lighthouse

Show both resource / transfer sizes for some audits

After discussion, thinking we can do these:

  • [lantern] include CPU time impact (parse/compile/etc) as part of overall wastedMs
  • [report] include resource sizes in tooltip on transfer size
connorjclark

comment created time in 9 days

issue commentGoogleChrome/lighthouse

Sort out new solution for monitoring bundlesize.

Remaining

  • add bundlesize thresholds to buildtracker config
  • collect master branch data (on push)
paulirish

comment created time in 9 days

issue commentGoogleChrome/lighthouse

Align load-fast-enough-for-pwa with Core Web Vitals?

Options:

  1. Leave it on TTI
  2. Drop hard perf criteria entirely
  3. Move from metrics to non-variable numbers (count of render-blocking bytes or whatever)
  4. Use some combo of LCP/CLS/TBT with thresholds (see the sketch after this list)
  5. Use perf score threshold
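For option 4, a minimal sketch of what a thresholded check could look like (the threshold values are placeholders borrowed from common Core Web Vitals guidance, not agreed-upon numbers for this audit):

```js
/**
 * Hypothetical option-4 check: pass if each metric clears a threshold.
 * @param {{lcp: number, cls: number, tbt: number}} metrics lcp/tbt in ms
 * @return {boolean}
 */
function isFastEnoughForPwa(metrics) {
  return metrics.lcp <= 2500 && // ms
         metrics.cls <= 0.1 &&
         metrics.tbt <= 300;    // ms
}
```
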
kaycebasques

comment created time in 9 days

issue commentGoogleChrome/lighthouse

Align load-fast-enough-for-pwa with Core Web Vitals?

We could also align with the TWA criteria of perf score 80: https://blog.chromium.org/2019/02/introducing-trusted-web-activity-for.html

kaycebasques

comment created time in 9 days

issue commentGoogleChrome/lighthouse

For legacy-javascript advice, suggest excluding some polyfills for "modern" build

From the LH pov, we start by heavily recommending @babel/preset-env + esmodules: true + bugfixes: true.

Ideally we can resolve the promises/typed-arrays polyfilling from within the babel-preset-env bugfixes path. An alternative is recommending some exclude patterns, but that's not guaranteed to be totally fine.
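For concreteness, that recommendation as a babel.config.js sketch (the commented-out exclude list is just an example of the patterns approach mentioned above; treat those module names as hypothetical, not vetted):

```js
// babel.config.js -- sketch of the recommended starting point.
module.exports = {
  presets: [
    ['@babel/preset-env', {
      // Target browsers that support <script type="module">.
      targets: {esmodules: true},
      // Prefer narrow bugfix transforms over broad ones where possible.
      bugfixes: true,
      useBuiltIns: 'usage',
      corejs: 3,
      // Alternative: manually exclude polyfills you believe you don't need.
      // Example patterns only -- not guaranteed to be safe everywhere.
      // exclude: ['es.promise', 'es.typed-array.*'],
    }],
  ],
};
```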

connorjclark

comment created time in 9 days

issue commentGoogleChrome/lighthouse

Unclear wording in LCP audit

consensus.

"This is the largest contentful element painted within the viewport."

exterkamp

comment created time in 9 days

Pull request review commentGoogleChrome/web.dev

CrUX API announcement

+---
+layout: post
+title: Using the Chrome UX Report API
+authors:
+  - rviscomi
+  - exterkamp
+hero: hero.png
+description: |
+  Learn how to use the Chrome UX Report API to get easy, RESTful access to
+  real-user experience data across millions of websites.
+date: 2020-06-23
+tags:
+  - performance
+  - blog
+---
+
+The [Chrome UX Report](https://developers.google.com/web/tools/chrome-user-experience-report) (CrUX) dataset represents how real-world Chrome users experience popular destinations on the web. Since 2017, when the queryable dataset was first released on [BigQuery](https://web.dev/chrome-ux-report-bigquery/), field data from CrUX has been integrated into developer tools like [PageSpeed Insights](/chrome-ux-report-pagespeed-insights/), the [CrUX Dashboard](https://web.dev/chrome-ux-report-data-studio-dashboard/), and Search Console's [Core Web Vitals report](https://support.google.com/webmasters/answer/9205520), enabling developers to easily measure and monitor real-user experiences. The piece that has been missing all this time has been a tool that provides free and RESTful access to CrUX data programmatically. To help bridge that gap, we're excited to announce the release of the all new [Chrome UX Report API](https://developers.google.com/web/tools/chrome-user-experience-report/api/reference)!
+
+This API has been built from the ground up with the goals of providing developers with simple, fast, and comprehensive access to CrUX data. The CrUX API only reports _field-based_ user experience data, unlike the existing [PageSpeed Insights API](https://developers.google.com/speed/docs/insights/v5/get-started), which also runs _lab-based_ Lighthouse performance audits. The CrUX API is streamlined and can quickly serve user experience data, making it ideally suited for real-time auditing applications.
+
+To ensure that developers have access to all of the metrics that matter most—[Largest Contentful Paint](https://web.dev/lcp/) (LCP), [First Input Delay](https://web.dev/fid/) (FID), and [Cumulative Layout Shift](https://web.dev/cls/) (CLS)—the CrUX API audits and monitors these [Core Web Vitals](https://web.dev/vitals/#core-web-vitals) at both the origin and URL level.
+
+So let's dive in and see how to use it!
+
+## Querying origin data
+
+Origins in the CrUX dataset encompass all underlying page-level experiences. The example below demonstrates how to query the CrUX API for an origin's user experience data using cURL on the command line.
+
+```bash/0,3
+API_KEY="[YOUR_API_KEY]"
+curl "https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=$API_KEY" \
+  --header 'Content-Type: application/json' \
+  --data '{"origin": "https://web.dev"}'
+```
+
+Run this query interactively in the [CrUX API explorer](https://developers.google.com/web/tools/chrome-user-experience-report/api/reference/rest/v1/records/queryRecord?apix=true&apix_params=%7B%22resource%22%3A%7B%22origin%22%3A%22https%3A%2F%2Fwww.google.com%22%7D%7D).
+
+{% Aside %}
+Note that all API requests must provide a value for the `key` parameter, the placeholder for which is left as `YOUR_API_KEY`. Get your own private CrUX API key at the click of a button in the official [CrUX API documentation](https://developers.google.com/web/tools/chrome-user-experience-report/api/guides/getting-started#APIKey).
+For convenience, the interactive [CrUX API explorer](https://developers.google.com/web/tools/chrome-user-experience-report/api/reference/rest/v1/records/queryRecord?apix=true) does not require an API key.
+{% endAside %}
+
+The `curl` command is made up of three parts:
+
+1. The URL endpoint of the API, including the caller's private API key.
+2. The `Content-Type: application/json` header, indicating that the request body contains JSON.
+3. The JSON-encoded [request body](https://developers.google.com/web/tools/chrome-user-experience-report/api/reference/rest/v1/records/queryRecord#request-body), specifying the `https://web.dev` origin.
+
+To do the same thing in JavaScript, use the <a name="crux-api-util">`CrUXApiUtil`</a> utility, which makes the API call and returns the decoded response.
+
+```js/2
+const CrUXApiUtil = {};

hmmmm now this isn't the same as https://developers.google.com/web/tools/chrome-user-experience-report/api/guides/getting-started#javascript

I like some of these changes. Maybe we can sort it out in the forthcoming PR to the crux repo, and then sync up both of these other uses?
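For comparison while syncing the two, here's a sketch of what that `CrUXApiUtil` stub grows into, based only on the endpoint and request body shown in the hunk above (the method name and error handling are assumptions, not the published utility):

```js
const CrUXApiUtil = {};
// Assumed to be filled in by the caller; see the CrUX API docs for a key.
CrUXApiUtil.API_KEY = '[YOUR_API_KEY]';
CrUXApiUtil.API_ENDPOINT =
  `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${CrUXApiUtil.API_KEY}`;

/**
 * Query the CrUX API for a record, e.g. {origin: 'https://web.dev'}.
 * @param {Object} requestBody
 * @return {Promise<Object>} decoded JSON response
 */
CrUXApiUtil.query = function (requestBody) {
  return fetch(CrUXApiUtil.API_ENDPOINT, {
    method: 'POST',
    headers: {'Content-Type': 'application/json'},
    body: JSON.stringify(requestBody),
  }).then(response => response.json());
};
```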

rviscomi

comment created time in 9 days

push eventGoogleChrome/lighthouse

Connor Clark

commit sha b58bf021cadea5791899daca83dc0baafca6916c

deps: remove bundlesize (#10999)

view details

push time in 10 days

delete branch GoogleChrome/lighthouse

delete branch : bundles

delete time in 10 days

PR merged GoogleChrome/lighthouse

deps: remove bundlesize cla: yes waiting4reviewer

#10472 hasn't moved, and bundlesize continues to randomly fail builds and not be useful, so buh bye.

+7 -277

0 comments

3 changed files

connorjclark

pr closed time in 10 days
