Connor Clark (connorjclark) · @google · Mountain View · hoten.cc · Programmer. 🔦Lighthouse & Chrome Developer Tools

connorjclark/css-trimmer 9

Identify the unused properties in your CSS

connorjclark/as3unit 5

A unit testing framework for ActionScript 3

connorjclark/chrome-trace-events-tsc 3

Type all the trace events!

connorjclark/biglyurl 1

Make Uniform Resource Locators Great Again!

connorjclark/dart-life 1

Life in Dart

connorjclark/11ty-website 0

Documentation site for the Eleventy static site generator.

connorjclark/aoc-mgz 0

Age of Empires II recorded game parsing and summarization in Python 3.

pull request comment GoogleChrome/lighthouse

new_audit: third party facades

  1. Could we omit "0" from the blocking time column, to reduce the noise in the table?
  2. Should we aggregate assets <1KB into a single row? (Aside: I'd love for the table/report to be able to do this itself, and allow for expanding an "Other" row. Currently we do it manually in the audit.)
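For what it's worth, the aggregation in (2) could be a small post-processing step over the table items. A rough sketch with an illustrative item shape (`label`/`transferSize` and the 1 KB threshold are assumptions, not Lighthouse's actual Details table API):

```javascript
// Sketch: collapse table items under a byte threshold into one "Other" row.
// The item shape here is illustrative, not Lighthouse's real table item type.
function aggregateSmallItems(items, thresholdBytes = 1024) {
  const kept = items.filter(item => item.transferSize >= thresholdBytes);
  const small = items.filter(item => item.transferSize < thresholdBytes);
  if (small.length === 0) return kept;
  const other = {
    label: 'Other',
    transferSize: small.reduce((sum, item) => sum + item.transferSize, 0),
  };
  return [...kept, other];
}

const rows = aggregateSmallItems([
  {label: 'a.js', transferSize: 5000},
  {label: 'b.js', transferSize: 300},
  {label: 'c.js', transferSize: 200},
]);
// rows: [{label: 'a.js', ...}, {label: 'Other', transferSize: 500}]
```

Expanding the "Other" row in the report UI would then just mean re-rendering with the unaggregated items.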
adamraine

comment created time in 39 minutes

pull request comment GoogleChrome/lighthouse

new_audit: preload images that are used by the LCP element

Could we leave out that column then?

Beytoven

comment created time in 17 hours


pull request comment GoogleChrome/lighthouse

new_audit: preload images that are used by the LCP element

the audit will remain in experimental.

I think this could go in default to start.

Beytoven

comment created time in 18 hours

pull request comment GoogleChrome/lighthouse

report: swap locale in viewer

This is what it looks like now:

[screenshot]

The browser's built-in input element as a dropdown+filter isn't great... You must delete the current value to see any other choices (blegh), and I'm not sure if styling like in the mock is possible. What do?

connorjclark

comment created time in 18 hours

push event GoogleChrome/lighthouse

Connor Clark

commit sha 1739e8d127f551fcc2f68d9c92e10ad1c6546589

display names

view details

push time in 18 hours

push event GoogleChrome/lighthouse

Connor Clark

commit sha 54289bd3fd89a3c9e9488fc25a85432e2698c1d2

tweak

view details

push time in 18 hours

push event GoogleChrome/lighthouse

Connor Clark

commit sha b79052757d31f6c17ee3d9733761bec43f8c31ce

available locales

view details

push time in 19 hours

push event GoogleChrome/lighthouse

Connor Clark

commit sha 6673d835fa01916cedd0eb5bc03711985ea6b10b

revert old stuff

view details

push time in 19 hours

push event GoogleChrome/lighthouse

Connor Clark

commit sha 5e4b5feacc56f96d50de5bc9a562cabbb7e90591

misc: release script push tag (#10193)

view details

John

commit sha 9c5d31ac4483cd1b7713fcb52d94b6045ceca3ae

tests(report-ui-features): fix tools button suite to pass isolated run (#10199)

view details

Connor Clark

commit sha ff87d69d6378ad69116574ec537ab649f302cd66

docs(auth): use --disable-storage-reset for chrome-debug (#10189)

view details

John

commit sha 9d6590dc7734372c23ddbd647cfac833bfde4540

tests(report-ui-features): add empty list and single item test cases (#10201)

view details

Connor Clark

commit sha 9931d2438fe0d0c4cbb477bf9cec2dd52a3bad61

core: measure time for GatherRunner.runPass (#10205)

view details

Patrick Hulce

commit sha 922471b7ae274cb55a57190c2aba89d357b53083

misc: add predictive perf to lantern test set (#10209)

view details

Connor Clark

commit sha 5870cb4d7cde62326fc2bf8728ff5a892240ba01

deps(devtools-protocol): upgrade to 0.0.729809 (#10207)

view details

Connor Clark

commit sha 8a66059ea5b90efa8d612a4c92c9df831ccd9b4f

tests(driver): type check (#10123)

view details

Connor Clark

commit sha 03ef2bc86f61ec4aa3e57a89343b16b1854b6889

tests(gather-runner): replace getMockedEmulationDriver w/ mock… (#10136)

view details

Connor Clark

commit sha 1abf9b87963c1fb95654dceb7a6b5abe4477b2b9

core(resource-summary): ignore /favicon.ico (#10190)

view details

Connor Clark

commit sha a9cfbf9b5ac9662d7ac0586b3d4c1c0fd1682262

core(font-size): speed up gatherer (#10200)

view details

Matt Jared

commit sha 90ade4ff7ba316775eac1d2b47ac29ab01f7ff90

docs: update scoring.md to v5 (#10223)

view details

John

commit sha e0dbb0a38a533680c7a171d26def02d897c0d819

report: fix ghost menu in print (#10216)

view details

Connor Clark

commit sha 6fba15d5615b93d03a5e9560247a8376d58cacbd

tests(gather-runner): type check (#10215)

view details

Shane Exterkamp

commit sha e87a8b76e66208be0266c6e3a3b648dbebe79521

i18n(report): runtime settings and tools (#9166)

* i18n runtime settings & dropdown. Modify ui-feat for i18n.
* Wrongly added this localeString mod
* Messed up some HTML while editing the strings.
* Ignore report-ui-feat from collect-strings. Update proto, update samples.
* stringback
* this -> Util
* Add proto comments.
* convert \t to _ because regex is hard
* collected en-XL
* let -> const
* updated id's
* Added caching/hydration to Util
* hm, not sure how that worked, but okay
* net -> network
* data-i18n attr
* update sample json
* keyof!
* forEach -> for of
* add test
* Connor's nits
Co-Authored-By: Connor Clark <cjamcl@google.com>
* Remove backup English from html, remove useless ignorePaths
* fix tests
* Revert "fix tests"
This reverts commit 4c89c0f50a573137f4fb52610844b4b762e1a01c.
* ignore axe link-name

Co-authored-by: Connor Clark <cjamcl@gmail.com>

view details

Connor Clark

commit sha 56d1840770f7748796ef0af541ba7b009c5d4821

core(script-elements): parallelize getting request contents (#9713)

view details

John

commit sha b07dbf61e0cbfca5af5dde5e6f7638a02b0867a8

report: close drop down menu when focus is lost (#10208)

view details

Connor Clark

commit sha 5049d46fefd06a933e33add01c76097812fe6e47

core: move unused-css to computed artifact (#10160)

view details

Connor Clark

commit sha 22455f4ea8778bbaa6a8975cd7ce905206b19eb4

core: warn if document was redirected (#10157)

view details

Robert Linder

commit sha e6899455a6207b9b927f15e95f5567454d0d99bb

core(audits): Add more keywords to blocklist (#9986)

view details

push time in 19 hours

push event GoogleChrome/lighthouse

Connor Clark

commit sha 50019868f1d7833bc6a89d94bcebf33f6852917f

core(large-javascript-libraries): move to experimental (#11484)

view details

push time in 21 hours

delete branch GoogleChrome/lighthouse

delete branch: large-exp

delete time in 21 hours

PR merged GoogleChrome/lighthouse

core(large-javascript-libraries): move to experimental
Labels: cla: yes, waiting4reviewer

ref #11423

+12 -56

0 comments

7 changed files

connorjclark

pr closed time in 21 hours

issue opened evmar/webtreemap

Upstreaming various changes from fork

Hi Evan!

The Lighthouse team forked this repo: https://github.com/paulirish/webtreemap-cdt

We will be using webtreemap for a new feature in Lighthouse, and @paulirish is also working on a new feature in Chrome DevTools that uses it.

We made a few changes in that fork:

a) webpack -> rollup (not necessary to upstream imo)
b) add option for not injecting CSS with JS
c) split out a public layout() method

Would you be open to receiving some PRs for upstreaming these changes?

created time in 21 hours

Pull request review comment GoogleChrome/lighthouse

core(js-bundles): return error object when sizes cannot be determined

 declare global {
           files: Record<string, number>;
           unmappedBytes: number;
           totalBytes: number;
-        };
+        } | {error: string};

thanks for the alternative. if it's all the same, let's keep the current grossness

connorjclark

comment created time in a day


push event GoogleChrome/lighthouse

Connor Clark

commit sha 8dd649aa30553b6862229fcee57d8c2039500b70

core(js-usage): normalize url key (#11302)

view details

George Makunde Martin

commit sha db039c92fb244079495aaa5d23c9da61f9d3789d

core(autocomplete): add chrome suggestions, invalid warning (#11342)

view details

Irfan Maulana

commit sha 72f1bd47cbb6fcc95349fc936b3fb0f098a93c47

core(config): correct typo on throttling profile name (#11355)

view details

Connor Clark

commit sha 74bb627289d1ebd19d7b51fcd1209bf47e4bdfdf

core(stack-packs): move to lighthouse-stack-packs npm package (#11370)

Co-authored-by: Ward Peeters <ward@coding-tech.com>

view details

Alex Tkachuk

commit sha b5ee5ba2dc3af728ca2b9307b21dd9bec05b2801

docs: remove PageSpeed Green from integrations (#11390)

view details

Connor Clark

commit sha 2bf89fba2a5e3457b369fda653af214ec9c5ad08

docs(variability): expand on lighthouse-ci usage (#11377)

view details

Connor Clark

commit sha 3d828a303e066a8305188efb603031544a4a5bdf

core(is-on-https): remove <M84 codepaths (#11373)

view details

Adam Raine

commit sha 9c64af77f344475f15ef80367f46c8187895ac39

core(minification-estimator): minify nested template literals in JavaScript (#11395)

view details

Patrick Hulce

commit sha fe74339f1ee2b5606d5a7af19b6b52bd622e862a

i18n: code-escape <link> in preconnect and preload (#11401)

view details

Connor Clark

commit sha b142f83605e989a0024523dd5b73ad6d9c54a79e

misc: hide locale files by default in PRs (#11363)

view details

Pujitha.E

commit sha a1571ba4bc5b7713248ea53e01db085389e634f8

report(csv): add overall category scores (#11404)

view details

andreizet

commit sha a58510583acd2f796557175ac949932618af49e7

tests: add markdown link checker (#11358)

view details

Connor Clark

commit sha fa1755848a13b399e62b988da4f85c8500af8154

misc(build): fix devtools tests by making empty type files (#11418)

view details

Mohammed J. Razem

commit sha 84be010d33b2645011494498c6b26b8010641528

core(stack-packs): add Drupal pack (#10522)

view details

James Garbutt

commit sha 9b4a46bbfc5acc8cae37671e2697b7f61686e8a7

core: traverse shadow hosts in getNodePath (#10956)

Co-authored-by: Paul Irish <paulirish@google.com>

view details

Paul Irish

commit sha 3359a709dcc905b2834375931930f614b71f49f3

tests(smoke): fix preconnect flake w/ a non-locally installed font (#11425)

view details

Connor Clark

commit sha 5f2a8e27f0dbe64a383437ad1e05ed014151e942

misc: update stack packs, remove duplicated stack pack files (#11396)

view details

Connor Clark

commit sha e5dee434d2b9095e4713febec74d2f3beeebf5db

tests: hash more files for devtools test cache (#11417)

view details

Paul Irish

commit sha d86ce3421e16c2c08a81b052a97330a1793c3f81

report: let fireworks eligibility ignore PWA category (#11200)

view details

Tim van der Lippe

commit sha f59011d7d862eb4b6e2867b0b4c4a6fb66b3be98

clients(devtools): update report-generator.js to match DevTools changes (#11411)

This mirrors the changes made in https://chromium-review.googlesource.com/c/devtools/devtools-frontend/+/2398829/

view details

push time in a day

push event GoogleChrome/lighthouse

Connor Clark

commit sha 19550a4fdba4bc8ae62b9a234d6ae8d3538a338b

update

view details

push time in a day

create branch GoogleChrome/lighthouse

branch: large-exp

created branch time in a day

pull request comment GoogleChrome/lighthouse

new_audit: add treemap-data to experimental

any idea why this is happening? [screenshot]

connorjclark

comment created time in a day

Pull request review comment GoogleChrome/lighthouse

new_audit: add treemap-data to experimental

+/**
+ * @license Copyright 2020 The Lighthouse Authors. All Rights Reserved.
+ * Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
+ */
+'use strict';
+
+const TreemapData_ = require('../../audits/treemap-data.js');
+const networkRecordsToDevtoolsLog = require('../network-records-to-devtools-log.js');
+const {loadSourceMapFixture, makeParamsOptional} = require('../test-utils.js');
+
+/* eslint-env jest */
+
+const TreemapData = {
+  audit: makeParamsOptional(TreemapData_.audit),
+  prepareTreemapNodes: makeParamsOptional(TreemapData_.prepareTreemapNodes),
+};
+
+/**
+ * @param {string} name
+ */
+function load(name) {
+  const data = loadSourceMapFixture(name);
+  if (!data.usage) throw new Error('exepcted usage');
+  return {...data, usage: data.usage};
+}
+
+/**
+ * @param {string} url
+ * @param {number} transferSize
+ * @param {LH.Crdp.Network.ResourceType} resourceType
+ */
+function generateRecord(url, transferSize, resourceType) {
+  return {url, transferSize, resourceType};
+}
+
+describe('TreemapData audit', () => {
+  describe('squoosh fixture', () => {
+    /** @type {import('../../audits/treemap-data.js').TreemapData} */
+    let treemapData;
+    beforeAll(async () => {
+      const context = {computedCache: new Map()};
+      const {map, content, usage} = load('squoosh');
+      const mainUrl = 'https://squoosh.app';
+      const scriptUrl = 'https://squoosh.app/main-app.js';
+      const networkRecords = [generateRecord(scriptUrl, content.length, 'Script')];
+
+      const artifacts = {
+        URL: {requestedUrl: mainUrl, finalUrl: mainUrl},
+        JsUsage: {[usage.url]: [usage]},
+        devtoolsLogs: {defaultPass: networkRecordsToDevtoolsLog(networkRecords)},
+        SourceMaps: [{scriptUrl: scriptUrl, map}],
+        ScriptElements: [{src: scriptUrl, content}],
+      };
+      const results = await TreemapData.audit(artifacts, context);
+
+      // @ts-expect-error: Debug data.
+      treemapData = results.details.treemapData;
+    });
+
+    it('basics', () => {
+      expect(Object.keys(treemapData)).toEqual(['scripts', 'resources']);
+    });
+
+    it('js', () => {
+      expect(treemapData.scripts).toMatchSnapshot();
+    });
+
+    it('resource summary', () => {

I've made a few more specific tests.

connorjclark

comment created time in a day


push event GoogleChrome/lighthouse

Connor Clark

commit sha 9c7e3a51f57028be794c58154723f06ae14b5a3f

tests

view details

push time in a day

issue opened aseprite/aseprite

"spacing" option in grid settings

Currently the grid settings support an x and y offset.

Often spritesheets also have a constant x or y spacing: for every tile there is a gap of some pixels until the next sprite. It'd be nice if the grid settings could set such a value.
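To illustrate the arithmetic such a setting implies (all names here are illustrative, not Aseprite's actual settings):

```javascript
// Sketch: where tile (col, row) starts in a spritesheet that has both a grid
// offset and per-tile spacing. Every tile advances by its size plus the gap.
function tileOrigin(col, row, grid) {
  const {offsetX = 0, offsetY = 0, tileWidth, tileHeight, spacingX = 0, spacingY = 0} = grid;
  return {
    x: offsetX + col * (tileWidth + spacingX),
    y: offsetY + row * (tileHeight + spacingY),
  };
}

const origin = tileOrigin(2, 1, {tileWidth: 16, tileHeight: 16, spacingX: 1, spacingY: 1});
// origin: {x: 34, y: 17}
```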

created time in 2 days

push event GoogleChrome/lighthouse

Connor Clark

commit sha 1eec2e566a82e58139efa16e7b6ddd39c026f6e8

minor test

view details

push time in 4 days

Pull request review comment GoogleChrome/lighthouse

new_audit: add treemap-data to experimental

+/**
+ * @license Copyright 2020 The Lighthouse Authors. All Rights Reserved.
+ * Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
+ */
+'use strict';
+
+const TreemapData_ = require('../../audits/treemap-data.js');
+const networkRecordsToDevtoolsLog = require('../network-records-to-devtools-log.js');
+const {loadSourceMapFixture, makeParamsOptional} = require('../test-utils.js');
+
+/* eslint-env jest */
+
+const TreemapData = {
+  audit: makeParamsOptional(TreemapData_.audit),
+  prepareTreemapNodes: makeParamsOptional(TreemapData_.prepareTreemapNodes),
+};
+
+/**
+ * @param {string} name
+ */
+function load(name) {
+  const data = loadSourceMapFixture(name);
+  if (!data.usage) throw new Error('exepcted usage');

decided to just export two different functions to avoid the type issue

connorjclark

comment created time in 4 days


push event GoogleChrome/lighthouse

Connor Clark

commit sha 7008e5502ec4465d47f15d54fa25383a4b0db40a

load

view details

push time in 4 days

push event GoogleChrome/lighthouse

Adam Raine

commit sha 852017e18088d64fa41a41fa7bdf98a5a6752c33

core(non-composited-animations): add more actionable failure reasons (#11268)

view details

Pete Nykänen

commit sha 79ea8cc970ca8feab72b530b29268f134a254799

core(renderer): improve the unknown timezone checks in util.js (#9822)

view details

Connor Clark

commit sha 1a79adb1493dc8c453fa8aab5ca3db110a50f7ea

misc: fix types in duplicated-javascript (#11278)

view details

ryo

commit sha 977e05ac7158d912d5b8ac13373a977e6ef9bcdd

docs(readme): align headings with table of contents (#11288)

view details

Saavan Nanavati

commit sha 2faa6068b37deb93153405656e30efbf01b3fc0e

new_audit: add valid-source-maps audit (#11236)

view details

Warren Maresca

commit sha edecc71715bc3dbb99fbc1a4ed9dcbce424193dd

report(third-party-summary): show resources for entity (#11219)

view details

Connor Clark

commit sha fa3aa482662b01f6875fe234174f64f88ff8948f

misc: move doc link (#11300)

view details

George Makunde Martin

commit sha 35f1a6f90e9443bdf42179a4c9823e6d4848a9e6

new_audit: add autocomplete to experimental config (#11186)

view details

Kayce Basques

commit sha 0f05c0da708fd15dae80d62a8bb5cdd2aa34406a

core(non-composited-animations): update the "learn more" link (#11258)

view details

adrianaixba

commit sha fa34177e98fbaabc1437db2e13180adaa55b66d6

report: handle invalid urls for source location items (#11299)

* first commit
* added details renderer test
* adjusted tests, new: relative urls, and clearly not url
* quick clean up

view details

Patrick Hulce

commit sha 32bf4f97531c2d340442364584298d1566790264

docs: add audit naming guide (#11308)

view details

Patrick Hulce

commit sha 009c4f8096a22c09ff941cc55b8966ec77e9d437

misc(benchmark): update BenchmarkIndex for m86 changes (#11304)

view details

Jon Burger

commit sha 89bb5cee466d55a9a5f3bfb969dcb219d08b1ee8

deps: update lighthouse-plugin-publisher-ads to 1.2.0 (#11301)

view details

Stanislav Popov

commit sha 9761e2b2f188ab44ae6a4a1ec5211e5a31d3ed5b

docs(readme): add related project: site-audit-seo (#11305)

view details

Connor Clark

commit sha 780feea4ee0fa17ee8e36ba1507f34cb56cae561

core(module-duplication): ignore smaller modules (#11277)

view details

Adam Raine

commit sha 7e2ce68df6caaede7e0f5c07007e169a0b651015

core(trace-elements): do not break on unresolvable node id (#11298)

view details

lemcardenas

commit sha 764f9f3d17a80856887e70ecea61ab2de81118af

new_audit: add preload-fonts audit (#11255)

view details

Patrick Hulce

commit sha fa6ed12eee222a15eb7380b6a74e953dbe059f66

core(response-time): add time spent to details (#11307)

view details

Connor Clark

commit sha 1c0b0716cea2ca830aaf44de5a44f2a13fe4b29f

tests: run chromium webtests for devtools integration (#11176)

Co-authored-by: Federico Grandi <fgrandi30@gmail.com>

view details

lemcardenas

commit sha 27e4f6877da34990dce1e6a7e415bd1680da0466

new_audit(revert): move unsized-images to experimental due to perf impact (#11317)

* Revert "core(config): unsized-images to default (#11217)"
This reverts commit 97a2375bec7a551a4dcef2b47404c0a3cbfb9838.
* Revert "core(image-elements): collect CSS sizing, ShadowRoot, & position (#11188)"
This reverts commit e0f7d5107e022ba96105c28fdcc54d865f29a221.

view details

push time in 4 days

Pull request review comment GoogleChrome/lighthouse

new_audit: add treemap-data to experimental

+/**
+ * @license Copyright 2020 The Lighthouse Authors. All Rights Reserved.
+ * Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
+ */
+'use strict';
+
+const TreemapData_ = require('../../audits/treemap-data.js');
+const networkRecordsToDevtoolsLog = require('../network-records-to-devtools-log.js');
+const {loadSourceMapFixture, makeParamsOptional} = require('../test-utils.js');
+
+/* eslint-env jest */
+
+const TreemapData = {
+  audit: makeParamsOptional(TreemapData_.audit),
+  prepareTreemapNodes: makeParamsOptional(TreemapData_.prepareTreemapNodes),
+};
+
+/**
+ * @param {string} name
+ */
+function load(name) {
+  const data = loadSourceMapFixture(name);
+  if (!data.usage) throw new Error('exepcted usage');

there are a few other test cases that need to load a source map + code from disk

[screenshot]

was thinking we could use this function there too. It's not just a single call to readFileSync, it's two, and the correct relative path must be used to get the right folder.

connorjclark

comment created time in 4 days

Pull request review comment GoogleChrome/lighthouse

new_audit: add treemap-data to experimental

+/**
+ * @license Copyright 2020 Google Inc. All Rights Reserved.
+ * Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
+ */
+'use strict';
+
+/**
+ * @fileoverview
+ * Creates treemap data for webtreemap.
+ */
+
+const Audit = require('./audit.js');
+const JsBundles = require('../computed/js-bundles.js');
+const UnusedJavaScriptSummary = require('../computed/unused-javascript-summary.js');
+const ModuleDuplication = require('../computed/module-duplication.js');
+const NetworkRecords = require('../computed/network-records.js');
+const ResourceSummary = require('../computed/resource-summary.js');
+
+/**
+ * @typedef {Record<string, RootNode[]>} TreemapData
+ */
+
+/**
+ * @typedef RootNode
+ * @property {string} name
+ * @property {Node} node
+ */
+
+/**
+ * @typedef Node
+ * @property {string} name
+ * @property {number} resourceBytes
+ * @property {number=} unusedBytes
+ * @property {number=} executionTime
+ * @property {string=} duplicate
+ * @property {Node[]=} children
+ */
+
+/**
+ * @typedef {Omit<Node, 'name'|'children'>} SourceData
+ */
+
+class TreemapDataAudit extends Audit {
+  /**
+   * @return {LH.Audit.Meta}
+   */
+  static get meta() {
+    return {
+      id: 'treemap-data',
+      scoreDisplayMode: Audit.SCORING_MODES.INFORMATIVE,
+      title: 'Treemap Data',
+      description: 'Used for treemap app.',
+      requiredArtifacts:
+        ['traces', 'devtoolsLogs', 'SourceMaps', 'ScriptElements', 'JsUsage', 'URL'],
+    };
+  }
+
+  /**
+   * @param {string} sourceRoot
+   * @param {Record<string, SourceData>} sourcesData
+   * @return {Node}
+   */
+  static prepareTreemapNodes(sourceRoot, sourcesData) {
+    /**
+     * @param {string} name
+     * @return {Node}
+     */
+    function newNode(name) {
+      return {
+        name,
+        resourceBytes: 0,
+      };
+    }
+
+    /**
+     * Given a slash-delimited path, traverse the Node structure and increment
+     * the data provided for each node in the chain. Creates nodes as needed.
+     * Ex: path/to/file.js will find or create "path" on `node`, increment the data fields,
+     *     and continue with "to", and so on.
+     * @param {string} source
+     * @param {SourceData} data
+     * @param {Node} node
+     */
+    function addNode(source, data, node) {
+      // Strip off the shared root.
+      const sourcePathSegments = source.replace(sourceRoot, '').split(/\/+/);
+      sourcePathSegments.forEach((sourcePathSegment, i) => {
+        const isLastSegment = i === sourcePathSegments.length - 1;
+
+        let child = node.children && node.children.find(child => child.name === sourcePathSegment);
+        if (!child) {
+          child = newNode(sourcePathSegment);
+          node.children = node.children || [];
+          node.children.push(child);
+        }
+        node = child;
+
+        // Now that we've found or created the next node in the path, apply the data.
+        node.resourceBytes += data.resourceBytes;
+        if (data.unusedBytes) node.unusedBytes = (node.unusedBytes || 0) + data.unusedBytes;
+        if (data.duplicate !== undefined && isLastSegment) {
+          node.duplicate = data.duplicate;
+        }
+      });
+    }
+
+    const rootNode = newNode(sourceRoot);
+
+    // For every source file, apply the data to all components
+    // of the source path, creating nodes as necessary.
+    for (const [source, data] of Object.entries(sourcesData)) {
+      addNode(source || `<unmapped>`, data, rootNode);
+
+      // Apply the data to the rootNode.
+      rootNode.resourceBytes += data.resourceBytes;
+      if (data.unusedBytes) rootNode.unusedBytes = (rootNode.unusedBytes || 0) + data.unusedBytes;
+    }
+
+    /**
+     * Collapse nodes that have only one child.
+     * @param {Node} node
+     */
+    function collapse(node) {
+      while (node.children && node.children.length === 1) {
+        node.name += '/' + node.children[0].name;
+        node.children = node.children[0].children;
+      }
+
+      if (node.children) {
+        for (const child of node.children) {
+          collapse(child);
+        }
+      }
+    }
+    collapse(rootNode);
+
+    // TODO(cjamcl): Should this structure be flattened for space savings?
+    // Like DOM Snapshot.
+    // Less JSON (no super nested children, and no repeated property names).
+
+    return rootNode;
+  }
+
+  /**
+   * @param {LH.Artifacts} artifacts
+   * @param {LH.Audit.Context} context
+   * @return {Promise<RootNode[]>}
+   */
+  static async makeJavaScriptRootNodes(artifacts, context) {
+    /** @type {RootNode[]} */
+    const rootNodes = [];
+
+    const bundles = await JsBundles.request(artifacts, context);
+    const duplication = await ModuleDuplication.request(artifacts, context);
+
+    /** @type {Array<{src: string, length: number, unusedJavascriptSummary?: import('../computed/unused-javascript-summary.js').Summary}>} */
+    const scriptData = [];
+    const inlineScriptData = {
+      src: artifacts.URL.finalUrl,
+      length: 0,
+    };
+    for (const scriptElement of artifacts.ScriptElements) {
+      // Normalize ScriptElements so that inline scripts show up as a single entity.
+      if (!scriptElement.src) {
+        inlineScriptData.length += (scriptElement.content || '').length;
+        continue;
+      }
+
+      const url = scriptElement.src;
+      const bundle = bundles.find(bundle => url === bundle.script.src);
+      const scriptCoverages = artifacts.JsUsage[url];
+      if (!bundle || !scriptCoverages) continue;
This was accidentally skipping all scripts without a source map. Messed that up during some refactoring.

connorjclark

comment created time in 4 days


Pull request review comment GoogleChrome/lighthouse

misc: script for analyzing results from gcp data collection

 const results = {
   runResults,
 };
-writeFileSync('analyze-results.json', JSON.stringify(results, null, 2), 'utf8');
+const resultsString = JSON.stringify(results, null, 2);
+writeFileSync(saveDir + '/analyze-results.json', resultsString, 'utf8');

hmm, maybe just log to stdout and only write via bash in extract-lhr-data.sh?

Beytoven

comment created time in 6 days


Pull request review comment GoogleChrome/lighthouse

misc: script for analyzing results from gcp data collection

+#!/usr/bin/env node
+
+/**
+ * @license Copyright 2020 The Lighthouse Authors. All Rights Reserved.
+ * Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
+ */
+'use strict';
+
+/**
+ * @fileoverview This script takes the directory (of extracted batch LHR data) along with an audit name and generates
+ * a JSON file with aggregated data for that audit.
+ *
+ * USAGE: node lighthouse-core/scripts/gcp-collection/analyze-lhr-data.js [<directory of lhr data>] [<audit name>]
+ */
+
+/* eslint-disable no-console */
+const {readdirSync, readFileSync, writeFileSync} = require('fs');
+const {join} = require('path');
+
+const directory = process.argv[2];
+const audit = process.argv[3];
+if (!directory) throw new Error('No directory provided\nUsage: $0 <lhr directory> <audit id>');
+
+if (!audit) throw new Error('No audit provided');
+
+const urlDirs = readdirSync(directory, {withFileTypes: true})
+.filter(dirent => dirent.isDirectory());
+
+const passSet = new Set();
+const failSet = new Set();
+const runResults = [];
+
+for (const dir of urlDirs) {
+  const url = dir.name;
+  const path = join(directory, url);
+  const runs = readdirSync(path, {withFileTypes: true});
+  const entry = {
+    url,
+    /** @type {Array<any>} */
+    runs: [],
+  };
+  for (const run of runs) {
+    if (run.name === '.DS_Store') continue;
+
+    if (!run.isDirectory()) throw new Error(`Unexpected directory "${run.name}" encountered`);
+
+    const lhrPath = join(path, run.name, 'lhr.json');
+    console.log(lhrPath);
+    const data = readFileSync(lhrPath, 'utf8');
+
+    /** @type {LH.Result | undefined} */
+    let lhrData;
+    try {
+      lhrData = JSON.parse(data);
+    } catch (error) {
+      console.error('Error parsing: ' + url);
+    }
+
+    if (!lhrData) continue;
+
+    const auditResult = lhrData.audits[audit].score;
+    if (!auditResult) {
+      failSet.add(url);
+    }
+
+    const runData = {
+      index: run.name,
+      auditResult,
+    };
+    entry.runs.push(runData);
+  }
+
+  // When the number of runs is greater than 1, we should only count a URL
+  // as passing if the audit is passing for every run
+  if (!failSet.has(url)) {
+    passSet.add(url);
+  }
+  runResults.push(entry);
+}
+
+const results = {
+  summary: {
+    passes: passSet.size,
+    fails: failSet.size,
+    failingUrls: Array.from(failSet),
+  },
+  runResults,
+};
+
+writeFileSync('analyze-results.json', JSON.stringify(results, null, 2), 'utf8');

can you write this to .tmp folder?

Beytoven

comment created time in 6 days


pull request comment GoogleChrome/lighthouse

new_audit: third party facades

Adam pointed out that this is blocked on web.dev documentation, which he is about to start.

adamraine

comment created time in 6 days

issue comment GoogleChrome/lighthouse

Add PNG support to optimize-images

The uses-webp-images audit uses the OptimizedImages artifact (the webp part of the artifact), and uses-optimized-images uses the same artifact but looks at the jpeg size estimation ... Why are there two? It seems like currently both audits could show the same image resource... do we really want a third audit, such that we could recommend an image be png, webp, and jpeg?

If the user keeps following the advice of one audit, eventually they should reach a local minimum and all these audits would stop alerting on that image, but this doesn't seem ideal.
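One direction this reasoning suggests: instead of several audits each flagging the same resource, a single check could compare the size estimates and recommend only the smallest encoding. A rough sketch with an illustrative data shape (not the real OptimizedImages artifact):

```javascript
// Sketch: pick whichever encoding estimate is smallest for an image, so only
// one recommendation is surfaced. Field names here are hypothetical.
function bestEncoding(image) {
  const estimates = [
    {format: 'original', bytes: image.originalSize},
    {format: 'jpeg', bytes: image.jpegSize},
    {format: 'webp', bytes: image.webpSize},
  ];
  return estimates.reduce((best, e) => (e.bytes < best.bytes ? e : best));
}

const pick = bestEncoding({originalSize: 10000, jpegSize: 7000, webpSize: 5000});
// pick: {format: 'webp', bytes: 5000}
```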

FYI seems the protocol already supports png. https://source.chromium.org/chromium/chromium/src/+/master:third_party/blink/renderer/core/inspector/inspector_audits_agent.cc;l=91;drc=ee9e7e404e5a3f75a3ca0489aaf80490f625ca27

patrickhulce

comment created time in 6 days

pull request comment GoogleChrome/lighthouse-stack-packs

Joomla - Lazy Loading String Feedback

I've edited the same string in https://github.com/GoogleChrome/lighthouse-stack-packs/pull/53/

exterkamp

comment created time in 7 days

PR opened GoogleChrome/lighthouse

misc: make FormElements not a public artifact

This shouldn't have been public.

+2 -2

0 comment

1 changed file

pr created time in 7 days

create barnchGoogleChrome/lighthouse

branch : form-not-public

created branch time in 7 days

pull request commentGoogleChrome/lighthouse

core(artifacts): encapsulate node details in an object

Which part is breaking, taking into account our "PublicArtifacts"? I think it's just one or two artifacts... could we have both the object and the toplevel properties for now, and mark an item in #11207 to remove the old way?

@connorjclark Based on the PublicGathererArtifacts, it would be about 5 that use the changed artifacts: ImageElements, LinkElements, ScriptElements, IFrameElements, and FormElements. Is that what you were referring to?

Yes.

Alternatively, we can hold this entire PR until December (v7).

... FormElements

Yikes, that was a mistake, I'm going to revert this.
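If we did keep both shapes alive for one release, a hypothetical shim (names here are illustrative, not the actual artifact code) could copy the new node object's fields back to the top level:

```javascript
// Expose both the new encapsulated `node` object and the legacy top-level
// properties, so public-artifact consumers keep working for one more version.
function toBackCompatElement(element, node) {
  return {
    ...element,
    node, // new shape
    // Legacy top-level copies, to be removed in the next breaking release.
    devtoolsNodePath: node.devtoolsNodePath,
    selector: node.selector,
    snippet: node.snippet,
  };
}
```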

adrianaixba

comment created time in 7 days

pull request commentGoogleChrome/lighthouse

core: add node details object

Which part is breaking, taking into account our "PublicArtifacts"? I think it's just one or two artifacts... could we have both the object and the toplevel properties for now, and mark an item in #11207 to remove the old way?

adrianaixba

comment created time in 7 days


Pull request review commentGoogleChrome/lighthouse

new_audit: third party facades

(Flattened diff: the new unit test file for the third-party facades audit. It builds synthetic network records for Intercom and YouTube embed products and asserts the audit's output across several cases: a single product with a facade alternative, multiple products, repeated requests to the same product resource, the receiveHeadersEnd cutoff, first-party exclusion, not-applicable results when no third parties or no facade candidates are present, and expectations against a real video-embeds trace. Every blockingTime asserted in these fixtures is 0.)

can you add some tests where blockingTime is not 0?
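For context on what a nonzero value requires: blocking time follows the Total Blocking Time definition, where each main-thread task contributes only its duration beyond the 50 ms long-task threshold. A simplified sketch (not the audit's actual helper):

```javascript
// Sum the portion of each main-thread task that exceeds the 50 ms threshold.
function blockingTime(taskDurationsMs) {
  return taskDurationsMs
    .filter(duration => duration > 50)
    .reduce((sum, duration) => sum + (duration - 50), 0);
}

// blockingTime([30, 70, 150]) → 20 + 100 = 120
```

So a fixture needs tasks longer than 50 ms attributed to the product's URLs before its blockingTime can be nonzero.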

adamraine

comment created time in 8 days


Pull request review commentGoogleChrome/lighthouse

report: reuse generalized clumping for perf category

 class PerformanceCategoryRenderer extends CategoryRenderer {
     // Filmstrip
     const timelineEl = this.dom.createChildOf(element, 'div', 'lh-filmstrip-container');
-    const thumbnailAudit = category.auditRefs.find(audit => audit.id === 'screenshot-thumbnails');
-    const thumbnailResult = thumbnailAudit && thumbnailAudit.result;
-    if (thumbnailResult && thumbnailResult.details) {
-      timelineEl.id = thumbnailResult.id;
-      const filmstripEl = this.detailsRenderer.render(thumbnailResult.details);
-      filmstripEl && timelineEl.appendChild(filmstripEl);
+    // We only expect one of these, but the renderer will support multiple
+    const thumbnailAudits = category.auditRefs.filter(audit => audit.group === 'filmstrip');

Is this breaking? v7?

paulirish

comment created time in 8 days


issue commentGoogleChrome/lighthouse

PSI fetches bad url

cc @Beytoven I think this has come up before?

Lofesa

comment created time in 8 days

Pull request review commentGoogleChrome/lighthouse

new_audit: third party facades

(Flattened diff: the new third-party facades audit source. The review thread concerns these UIStrings:

  title: 'Lazy load third-party resources with facade alternatives',
  failureTitle: 'Some third-party resources can be lazy loaded with a facade alternative',
  description: 'Some third-party resources can be fetched after the page loads. ' +
    'These third-party resources are used by embedded elements which can be replaced by a facade ' +
)

I like the second option, thanks.

adamraine

comment created time in 8 days


push eventGoogleChrome/lighthouse

Connor Clark

commit sha 1d0cc991f18da773154d5f8e2cd7b3c811a1a4c2

fix smokes


push time in 8 days

pull request commentGoogleChrome/lighthouse

new_audit: json-ld audit

(2410564 - 1544755) / 1000 = +865KB to devtools bundles. Can expect slightly less if PR is updated to include recent build improvements.

mattzeunert

comment created time in 8 days

push eventGoogleChrome/lighthouse

Connor Clark

commit sha fb27bb08f3783703e952e3d8d50fcf11765a6de6

lint


push time in 8 days


push eventGoogleChrome/lighthouse

Connor Clark

commit sha b142f83605e989a0024523dd5b73ad6d9c54a79e

misc: hide locale files by default in PRs (#11363)


Pujitha.E

commit sha a1571ba4bc5b7713248ea53e01db085389e634f8

report(csv): add overall category scores (#11404)


andreizet

commit sha a58510583acd2f796557175ac949932618af49e7

tests: add markdown link checker (#11358)


Connor Clark

commit sha fa1755848a13b399e62b988da4f85c8500af8154

misc(build): fix devtools tests by making empty type files (#11418)


Mohammed J. Razem

commit sha 84be010d33b2645011494498c6b26b8010641528

core(stack-packs): add Drupal pack (#10522)


James Garbutt

commit sha 9b4a46bbfc5acc8cae37671e2697b7f61686e8a7

core: traverse shadow hosts in getNodePath (#10956)

Co-authored-by: Paul Irish <paulirish@google.com>


Paul Irish

commit sha 3359a709dcc905b2834375931930f614b71f49f3

tests(smoke): fix preconnect flake w/ a non-locally installed font (#11425)


Connor Clark

commit sha 5f2a8e27f0dbe64a383437ad1e05ed014151e942

misc: update stack packs, remove duplicated stack pack files (#11396)


Connor Clark

commit sha e5dee434d2b9095e4713febec74d2f3beeebf5db

tests: hash more files for devtools test cache (#11417)


Paul Irish

commit sha d86ce3421e16c2c08a81b052a97330a1793c3f81

report: let fireworks eligibility ignore PWA category (#11200)


Tim van der Lippe

commit sha f59011d7d862eb4b6e2867b0b4c4a6fb66b3be98

clients(devtools): update report-generator.js to match DevTools changes (#11411)

This mirrors the changes made in https://chromium-review.googlesource.com/c/devtools/devtools-frontend/+/2398829/


Paul Irish

commit sha 11cf91adee7ea932468ddcb2db0ae85dc27555da

misc(axe): rename axe types (#11432)


Connor Clark

commit sha c52552b9e01ae577284957b83139ac25d7aa0e9a

tests: check for dependencies when setting up blink tools (#11437)


adrianaixba

commit sha c6e753c191ac96300a931f2a4c434083590b265f

core: normalize node information in gathering (#11405)


adrianaixba

commit sha 039b6c6e0826d8b09e145e6ffd5253724b497021

core(password-inputs-can-be-pasted-into): add devtoolsNodePath (#11416)

Uses our snippet for axe nodes instead of aXe's.


adrianaixba

commit sha b08089659b6454bcce73b9943472b17e44f8d076

add score shapes to legend (#11440)


Connor Clark

commit sha 2e9967b1c4f8aa5bb667fffcdb4b9a0e1cafd7aa

clients(lr): enable uses-http2, add protocol override header (#11439)


Connor Clark

commit sha d4fc00cec56b80924c6f661922f1092ee25734e6

deps(inquirer): upgrade to 7.3.3 (#11441)


Brendan Kenny

commit sha 378a31f8117d20c852562514612c80ea12892c54

i18n: use IcuMessage objects instead of string IDs (#10630)


Brendan Kenny

commit sha cf8a5553fd97eef7fbef19cc5107458804078ce5

deps: update transitive lodash (#11448)


push time in 8 days

push eventGoogleChrome/lighthouse-stack-packs

Caroline Liu

commit sha a159cd2b6a67d4deae0e4cc467c62b3ec22dde41

Add informational links to AMP stack pack texts


Connor Clark

commit sha c5f83e9e89894c66b497b4d684e661b94447368a

Add informational links to AMP stack pack texts


Connor Clark

commit sha 6b571087523e239b8255ca37c6e6d37f4ff3ab46

Merge remote-tracking branch 'origin/master' into strings


push time in 8 days

issue commentGoogleChrome/lighthouse

Request for metrics that are inclusive to Assistive Technology

Do you think something TTI-like would be too noisy? I'm thinking of some kind of settling metric like "time until N seconds between accessibility tree changes"?

I think that would devolve in some common cases regarding carousels.
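To make the idea concrete, here is a hedged sketch of such a settling calculation (purely hypothetical, not a proposed Lighthouse API): given timestamps of accessibility tree changes, find when the first quiet window of at least N ms begins.

```javascript
// Return the timestamp (ms) of the last change before the first quiet window
// of at least quietMs — a TTI-like "settled" point for the a11y tree.
function settledTime(changeTimestamps, quietMs) {
  const sorted = [...changeTimestamps].sort((a, b) => a - b);
  for (let i = 0; i < sorted.length; i++) {
    const next = sorted[i + 1];
    if (next === undefined || next - sorted[i] >= quietMs) {
      return sorted[i];
    }
  }
  return 0; // no changes observed
}

// settledTime([0, 100, 200, 6000], 5000) → 200
```

A carousel that keeps mutating the tree every few seconds would push this value out indefinitely, which is exactly the failure mode described above.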

scottjehl

comment created time in 8 days

Pull request review commentGoogleChrome/lighthouse

new_audit: third party facades

(Flattened diff: the new third-party facades audit source, same UIStrings block as above. The string under discussion:

  description: 'Some third-party resources can be fetched after the page loads. ' +
    'These third-party resources are used by embedded elements which can be replaced by a facade ' +
)

marking this string as needs-editing (don't have a good suggestion atm tho)

adamraine

comment created time in 8 days


issue comment GoogleChrome/lighthouse

Request for metrics that are inclusive to Assistive Technology

It could be computed much like Speed Index, assuming there's a decent calculation for determining tree similarity.

This is the part that gives me the most pause. This won't be nearly as simple as "sum all the color values". Some cursory googling for "tree similarity algorithms" wasn't encouraging.
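One naive approach, sketched purely to show why this is hard: flatten each tree into a bag of root-to-node paths and score the overlap with a Dice coefficient. The node shape (`name`, `children`) is hypothetical, and a real metric for accessibility trees would need something much stronger, such as true tree edit distance.

```javascript
// Crude tree-similarity sketch: collect every root-to-node "name path",
// then score the overlap of the two path bags with a Dice coefficient.
// Illustrative only; not a proposed Lighthouse metric.
function collectPaths(node, prefix, out) {
  const path = prefix + '/' + node.name;
  out.push(path);
  for (const child of node.children || []) collectPaths(child, path, out);
  return out;
}

function treeSimilarity(a, b) {
  const pathsA = collectPaths(a, '', []);
  const pathsB = collectPaths(b, '', []);
  // Count paths in b so shared paths are matched at most once each.
  const countsB = new Map();
  for (const p of pathsB) countsB.set(p, (countsB.get(p) || 0) + 1);
  let shared = 0;
  for (const p of pathsA) {
    const n = countsB.get(p) || 0;
    if (n > 0) {
      shared++;
      countsB.set(p, n - 1);
    }
  }
  return (2 * shared) / (pathsA.length + pathsB.length); // Dice coefficient
}
```

A bag-of-paths score like this ignores ordering and sibling structure entirely, which is exactly the gap a Speed-Index-style metric would need to close.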

scottjehl

comment created time in 8 days

pull request comment GoogleChrome/lighthouse

misc(build): give build-tracker a shared git history on PRs

Can you add a link to the build tracker somewhere, maybe the readme?

paulirish

comment created time in 8 days

Pull request review comment GoogleChrome/lighthouse

misc(build): give build-tracker a shared git history on PRs

 jobs:
     - run: yarn i18n:checks
     - run: yarn dogfood-lhci
-    # buildtracker runs `git merge-base HEAD origin/master` which needs more history than depth=1. https://github.com/paularmstrong/build-tracker/issues/106
-    - name: Deepen git fetch (for buildtracker)
-      run: git fetch --deepen=100
+    # buildtracker needs history and a common merge commit.

If build tracker UI can't filter to just master commits, I'm hesitant to mix the two data sources.

paulirish

comment created time in 8 days


Pull request review comment GoogleChrome/lighthouse

misc(build): give build-tracker a shared git history on PRs

 jobs:
     - run: yarn i18n:checks
     - run: yarn dogfood-lhci
-    # buildtracker runs `git merge-base HEAD origin/master` which needs more history than depth=1. https://github.com/paularmstrong/build-tracker/issues/106
-    - name: Deepen git fetch (for buildtracker)
-      run: git fetch --deepen=100
+    # buildtracker needs history and a common merge commit.

Is there a way to check against a budget w/o uploading the data for the PR revision (and thus, no need to do this complex git stuff)?

paulirish

comment created time in 8 days

push event GoogleChrome/lighthouse-stack-packs

Caroline Liu

commit sha a159cd2b6a67d4deae0e4cc467c62b3ec22dde41

Add informational links to AMP stack pack texts

view details

Connor Clark

commit sha c5f83e9e89894c66b497b4d684e661b94447368a

Add informational links to AMP stack pack texts

view details

push time in 8 days

PR merged GoogleChrome/lighthouse-stack-packs

Add informational links to AMP stack pack texts
  • Adds links to references to amp-img
  • Links srcset mention to a relevant web.dev post
+3 -3

0 comments

1 changed file

caroqliu

pr closed time in 8 days


Pull request review comment GoogleChrome/lighthouse

misc(build): give build-tracker a shared git history on PRs

 jobs:
     - run: yarn i18n:checks
     - run: yarn dogfood-lhci
-    # buildtracker runs `git merge-base HEAD origin/master` which needs more history than depth=1. https://github.com/paularmstrong/build-tracker/issues/106
-    - name: Deepen git fetch (for buildtracker)
-      run: git fetch --deepen=100
+    # buildtracker needs history and a common merge commit.

Why do we want it tracking PR? It doesn't ever fail, right?

paulirish

comment created time in 8 days


Pull request review comment GoogleChrome/lighthouse

misc(build): give build-tracker a shared git history on PRs

 jobs:
     - run: yarn i18n:checks
     - run: yarn dogfood-lhci
-    # buildtracker runs `git merge-base HEAD origin/master` which needs more history than depth=1. https://github.com/paularmstrong/build-tracker/issues/106
-    - name: Deepen git fetch (for buildtracker)
-      run: git fetch --deepen=100
+    # buildtracker needs history and a common merge commit.

Could we drop the entire script by just skipping this step when the current branch isn't master? Then it only runs on commits merged to master.

paulirish

comment created time in 8 days


pull request comment GoogleChrome/lighthouse

core(is-on-https): add missing space in description

Thanks @qwright10!

qwright10

comment created time in 8 days


Pull request review comment GoogleChrome/lighthouse

misc: add missing space in description

 const UIStrings = {
   /** Description of a Lighthouse audit that tells the user *why* HTTPS use *for all resources* is important. This is displayed after a user expands the section to see more. No character length limits. 'Learn More' becomes link text to additional documentation. */
   description: 'All sites should be protected with HTTPS, even ones that don\'t handle ' +
       'sensitive data. This includes avoiding [mixed content](https://developers.google.com/web/fundamentals/security/prevent-mixed-content/what-is-mixed-content), ' +
-      'where some resources are loaded over HTTP despite the initial request being served' +
+      'where some resources are loaded over HTTP despite the initial request being served ' +

jk "word wrap" was a pretty good search term. https://github.com/GoogleChrome/lighthouse/pull/9105#discussion_r307599198

qwright10

comment created time in 8 days

Pull request review comment GoogleChrome/lighthouse

misc: add missing space in description

 const UIStrings = {
   /** Description of a Lighthouse audit that tells the user *why* HTTPS use *for all resources* is important. This is displayed after a user expands the section to see more. No character length limits. 'Learn More' becomes link text to additional documentation. */
   description: 'All sites should be protected with HTTPS, even ones that don\'t handle ' +
       'sensitive data. This includes avoiding [mixed content](https://developers.google.com/web/fundamentals/security/prevent-mixed-content/what-is-mixed-content), ' +
-      'where some resources are loaded over HTTP despite the initial request being served' +
+      'where some resources are loaded over HTTP despite the initial request being served ' +

Yes. No reasonable way to locate it. GitHub comment search is non-existent.

qwright10

comment created time in 8 days


push event GoogleChrome/lighthouse

adrianaixba

commit sha b08089659b6454bcce73b9943472b17e44f8d076

add score shapes to legend (#11440)

view details

Connor Clark

commit sha 2e9967b1c4f8aa5bb667fffcdb4b9a0e1cafd7aa

clients(lr): enable uses-http2, add protocol override header (#11439)

view details

Connor Clark

commit sha d4fc00cec56b80924c6f661922f1092ee25734e6

deps(inquirer): upgrade to 7.3.3 (#11441)

view details

Brendan Kenny

commit sha 378a31f8117d20c852562514612c80ea12892c54

i18n: use IcuMessage objects instead of string IDs (#10630)

view details

Brendan Kenny

commit sha cf8a5553fd97eef7fbef19cc5107458804078ce5

deps: update transitive lodash (#11448)

view details

Connor Clark

commit sha d41591d4b37ef7bce3c070724787348b5f8b92bb

misc(build): minify bundles with terser (#9605)

view details

Paul Irish

commit sha 0580f5742ae30d79de688bc3b387e2b447d9ef30

tests(page-functions): add test for getNodePath (#11433)

view details

Paul Irish

commit sha 4335838135b1c254d1e66bb5d732f14b3b875ccf

deps: chrome-launcher to v0.13.4 (#11434)

view details

Connor Clark

commit sha 99df40a3fe5fab4d8f5c14febd355e1ef804e444

misc: tweak typescript jsdoc for list format (#11447)

view details

Connor Clark

commit sha 02003588412cad2d9c555bd58581e2fcd90e9241

misc(build): use terser on inline assets (#11461)

view details

Connor Clark

commit sha 315c29078b3419f848d031a1c4e8f19a90fa8ee9

Merge remote-tracking branch 'origin/master' into hreflang-axe

view details

push time in 8 days

Pull request review comment GoogleChrome/lighthouse

misc: add missing space in description

 const UIStrings = {
   /** Description of a Lighthouse audit that tells the user *why* HTTPS use *for all resources* is important. This is displayed after a user expands the section to see more. No character length limits. 'Learn More' becomes link text to additional documentation. */
   description: 'All sites should be protected with HTTPS, even ones that don\'t handle ' +
       'sensitive data. This includes avoiding [mixed content](https://developers.google.com/web/fundamentals/security/prevent-mixed-content/what-is-mixed-content), ' +
-      'where some resources are loaded over HTTP despite the initial request being served' +
+      'where some resources are loaded over HTTP despite the initial request being served ' +

ahem this is why I think we should disable linting for these few lines and just use one long string.

qwright10

comment created time in 8 days


pull request comment GoogleChrome/lighthouse

misc: add missing space in description

should just need to do yarn upgrade:sample-json

qwright10

comment created time in 8 days

push event GoogleChrome/lighthouse

Paul Irish

commit sha 0580f5742ae30d79de688bc3b387e2b447d9ef30

tests(page-functions): add test for getNodePath (#11433)

view details

Paul Irish

commit sha 4335838135b1c254d1e66bb5d732f14b3b875ccf

deps: chrome-launcher to v0.13.4 (#11434)

view details

Connor Clark

commit sha 99df40a3fe5fab4d8f5c14febd355e1ef804e444

misc: tweak typescript jsdoc for list format (#11447)

view details

Connor Clark

commit sha 8c0743c747103178bcc8ba570bd042a105d9fd6e

Merge remote-tracking branch 'origin/master' into open-devtools

view details

Connor Clark

commit sha 6d46a2c0234b00b1de523e41093ef5fc9b77626a

patrick

view details

push time in 8 days

push event GoogleChrome/lighthouse

Connor Clark

commit sha 02003588412cad2d9c555bd58581e2fcd90e9241

misc(build): use terser on inline assets (#11461)

view details

push time in 8 days

delete branch GoogleChrome/lighthouse

delete branch : bundle-inline-terser

delete time in 8 days

PR merged GoogleChrome/lighthouse

misc(build): use terser on inline assets
cla: yes waiting4committer

Saves ~32 KB in CDT bundle.

We did the same thing already for the viewer https://github.com/GoogleChrome/lighthouse/pull/9823, but the build-bundle never got the same treatment.

Here's what got trimmed:

minifying /Users/cjamcl/src/lighthouse/node_modules/axe-core/axe.min.js. saved 3.195 KB
minifying /Users/cjamcl/src/lighthouse/node_modules/js-library-detector/library/libraries.js. saved 26.963 KB

axe.min.js is quite big, and surfaced the fact that minifyFileTransform only worked if the code fit within whatever chunk size Node chooses when streaming the file. Had to fix that.

+31 -9

2 comments

3 changed files

connorjclark

pr closed time in 8 days

pull request comment GoogleChrome/lighthouse

new_audit: third party facades

If the existing facades can be listed on web.dev behind the "Learn more" link, does it really help us if the category also links to a specific part of that page?

It may help if there's more than a handful of products. And having a link inline (in the table results) is good too; some users might have a UX-blindness to a "Learn more" link.

Were you thinking the third party categories would have different pages?

No, one page.

If having the category is important, I think we should put it in the product column without linking to anything or have the product name column link to web.dev. Could look like this:

Product                            Transfer size           Blocking time

YouTube Embed (Video)               600Kb                   0ms
   youtube.com/embed/...            40Kb                    0ms

This is how I imagined it (with "Youtube Embed...." being an anchor link to web.dev where we list all the ways to lazy load this product)

adamraine

comment created time in 8 days

pull request comment GoogleChrome/lighthouse

misc(build): fix mangling for tap-targets gatherer

can we add a smoketest to test-bundle that ensures no audits throw an exception?

I'm confused–our smoke testing doesn't support this. What do you mean?

We could just run all the smoke tests instead of the subset we do. If just one of them asserted on tap-target, it would have caught this.

connorjclark

comment created time in 8 days

Pull request review comment GoogleChrome/lighthouse

new_audit: preload images that are used by the LCP element

+/**
+ * @license Copyright 2020 The Lighthouse Authors. All Rights Reserved.
+ * Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
+ */
+'use strict';

Can you add a link to the design doc here + in the PR comment

Beytoven

comment created time in 8 days


Pull request review comment GoogleChrome/lighthouse

misc(build): fix mangling for tap-targets gatherer

  */
 'use strict';

-/* global document, window, getComputedStyle, getElementsInDocument, Node, getNodeDetails */
+/* global document, window, getComputedStyle, getElementsInDocument, Node, getNodeDetails, getRectCenterPoint */

 const Gatherer = require('../gatherer.js');
 const pageFunctions = require('../../../lib/page-functions.js');
-const {
-  rectContains,
-  getRectArea,
-  getRectCenterPoint,
-  getLargestRect,
-} = require('../../../lib/rect-helpers.js');
+const RectHelpers = require('../../../lib/rect-helpers.js');

This is how all other gatherers import this module. Only this one destructured, which makes the references in the "page functions" below (lexically) check out. But lexical scope is a damned lie our tools tell themselves.
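The failure mode can be sketched outside Lighthouse. Page functions ship to the browser as stringified source, so any helper they call must be serialized alongside them by name; a destructured Node-scope binding only looks resolvable lexically and disappears (or gets mangled) once the source is shipped. Everything below is illustrative; runInFakePage is a hypothetical stand-in for the driver's real evaluation path.

```javascript
// Node-side helper a gatherer wants to reuse inside the page.
function getRectCenterPoint(rect) {
  return {x: rect.left + rect.width / 2, y: rect.top + rect.height / 2};
}

// A "page function": it will run in the browser, where only names that were
// serialized with it exist. A destructured Node-scope import looks fine
// lexically here, but would not survive serialization plus mangling.
function pageFn() {
  return getRectCenterPoint({left: 0, top: 0, width: 10, height: 20});
}

// Illustrative stand-in for the driver: stringify the main function plus
// its declared dependencies and evaluate the result as a single script.
function runInFakePage(mainFn, deps) {
  const src = deps.map(fn => fn.toString()).join('\n') +
    `\n(${mainFn.toString()})();`;
  return eval(src); // the real driver evaluates this in the page instead
}
```

Passing `[getRectCenterPoint]` as an explicit dependency works because its source travels with the page function; relying on a destructured lexical reference does not.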

connorjclark

comment created time in 11 days


PR opened GoogleChrome/lighthouse

misc(build): fix mangling for tap-targets gatherer

The tap-targets gatherer was always failing in CDT (yarn open-devtools is so useful)

Overloading variables between Node/page-function contexts is bad; it causes tooling to make mistakes.

+7 -11

0 comments

1 changed file

pr created time in 11 days

create branch GoogleChrome/lighthouse

branch : fix-tap

created branch time in 11 days

push event GoogleChrome/lighthouse

Connor Clark

commit sha a9f4ee00d5017a61bddbf9a2e7e9933b8b043f07

misc: yarn open-devtools

view details

Connor Clark

commit sha a4f81cfd79086c02da3a13c9bc1ca87e74a1fe80

feedback

view details

Connor Clark

commit sha 534ae7bc5bcd4145361d4cdbf64c01f8040919d7

Merge remote-tracking branch 'origin/master' into open-devtools

view details

Connor Clark

commit sha 8c7a8bdb971a1c0d58f72fc16ad197053d7da4f3

Merge branch 'open-devtools' into minify-dt-report-code

view details

push time in 11 days

create branch GoogleChrome/lighthouse

branch : minify-dt-report-code

created branch time in 11 days

push event GoogleChrome/lighthouse

Connor Clark

commit sha 7b6e88da4984fe70e58e230848ef9d07de53fc69

comment

view details

push time in 11 days
