
addyosmani/webpack-lighthouse-plugin 266

A Webpack plugin for Lighthouse

brendankenny/libtess.js 185

Polygon tessellation library, ported from SGI's GLU implementation to JavaScript.

brendankenny/CanvasLayer 143

A <canvas> map layer for the Google Maps JavaScript API v3 for 2d and WebGL data visualization

brendankenny/crossfilter-and-v3 80

A simple example of using Crossfilter with the Google Maps JavaScript API

brendankenny/crossfilter 4

Fast n-dimensional filtering and grouping of records.

brendankenny/point-overlay 4

A simple example of using CanvasLayer via a Polymer element

brendankenny/speedline 2

Calculate the Speed Index and visual performance metrics from a Chrome DevTools timeline (recently: pmdartus -> paulirish)

brendankenny/GL-Shader-Validator 1

A GLSL and ESSL validator for Sublime Text 2

push event GoogleChromeLabs/lh-metrics-analysis

Brendan Kenny

commit sha ebd326224dbced8082eebfda71f98f965aafadd5

deps: update transitive deps (#54)


push time in 44 minutes

delete branch GoogleChromeLabs/lh-metrics-analysis

delete branch: deps

delete time in 44 minutes

create branch GoogleChromeLabs/lh-metrics-analysis

branch: deps

created branch time in an hour

push event GoogleChromeLabs/lh-metrics-analysis

Brendan Kenny

commit sha 7eda72702b43f72f0253ce1ef61c5127a9ed83af

add CPU throttling note to August 2020 report


push time in an hour

push event GoogleChromeLabs/lh-metrics-analysis

Brendan Kenny

commit sha 9e856e1a5e66df3d5bb4617d9c2d53f9f98f9537

upload test September 2020 report


push time in an hour

issue comment GoogleChrome/lighthouse

Work with "application/xhtml+xml" documents

@patrickhulce pointed out that https://github.com/GoogleChrome/lighthouse/pull/11042#issuecomment-652637567 lists application/xhtml+xml as the third most popular mime type in the June HTTP Archive run (after text/html and the empty string). It helpfully gives http://apple-store.in/ as an example of a page served with that mime type (and it is, though it doesn't validate :).

The report for it looks pretty good and doesn't have any of the problems of the pure xml document cited in #9245, so supporting this case SGTM too.

craigfrancis

comment created time in 20 hours

pull request comment GoogleChrome/lighthouse

core(benchmarkindex): add workaround for Intel microcode fixes

Sure, it could reasonably be 100 without much impact to overshooting. Prefer 100? Something else entirely?

Oh, I really meant that bikeshedding on a number isn't important :) A better way of phrasing what I was getting at: I was curious why you picked 10 and not another number (since my instinct would be considerably higher). Is it just a reasonable minimum where everything still works?

patrickhulce

comment created time in a day


Pull request review comment GoogleChrome/lighthouse

new_audit: add treemap-data to experimental

+/**
+ * @license Copyright 2020 The Lighthouse Authors. All Rights Reserved.
+ * Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
+ */
+'use strict';
+
+const TreemapData_ = require('../../audits/treemap-data.js');
+const networkRecordsToDevtoolsLog = require('../network-records-to-devtools-log.js');
+const {loadSourceMapFixture, makeParamsOptional} = require('../test-utils.js');
+
+/* eslint-env jest */
+
+const TreemapData = {
+  audit: makeParamsOptional(TreemapData_.audit),
+  prepareTreemapNodes: makeParamsOptional(TreemapData_.prepareTreemapNodes),
+};
+
+/**
+ * @param {string} name
+ */
+function load(name) {
+  const data = loadSourceMapFixture(name);
+  if (!data.usage) throw new Error('exepcted usage');

That doesn't seem more onerous than tests that need a pair of trace and devtoolsLog :) but if we're keeping these, maybe add a comment to them, then? It took me a while to realize the identity data spread wasn't for a shallow clone or something but was there for the type checking.
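For anyone else who hits the same confusion, a minimal sketch of the pattern (the parameter type is simplified here for illustration):

```js
/**
 * @param {{map: object, content: string, usage?: object}} data
 */
function requireUsage(data) {
  if (!data.usage) throw new Error('expected usage');
  // Returning `data` directly would keep `usage` optional in the inferred
  // return type; re-spreading with an explicit `usage` property uses the
  // narrowed `data.usage` (known to be defined after the throw above), so
  // tsc infers a return type where `usage` is required.
  return {...data, usage: data.usage};
}
```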

connorjclark

comment created time in 4 days


Pull request review comment GoogleChrome/lighthouse

core(driver): don't clear indexedb, websql, or localstorage before run

 class Driver {
     }
   }
 
+  /**
+   * @param {string} url
+   * @param {(LH.IcuMessage | string)[]} LighthouseRunWarnings
+   */
+  async getImportantStorageWarning(url, LighthouseRunWarnings) {
+    const usageData = await this.sendCommand('Storage.getUsageAndQuota', {
+      origin: url,
+    });
+    /** @type {Object.<string, string>} */
+    const storageTypeNames = {
+      local_storage: 'Local Storage',
+      indexeddb: 'IndexedDB',
+      websql: 'Web SQL',
+    };
+    const locations = usageData.usageBreakdown
+      .filter(usage => usage.usage)
+      .map(usage => storageTypeNames[usage.storageType] || '')
+      .filter(Boolean);
+    if (locations.length) {
+      LighthouseRunWarnings.push(str_(
+        UIStrings.warningData,
+        {locations: locations.join(', '), locationCount: locations.length}

Since we've deferred on this in the past, unless anyone else wants to chime in on this string, maybe we can bring it up in the Monday eng sync and make a decision then?

adamraine

comment created time in 4 days

Pull request review comment GoogleChrome/lighthouse

core(driver): don't clear indexedb, websql, or localstorage before run

 describe('.goOnline', () => {
   });
 });
 
+describe('.clearDataForOrigin', () => {
+  it('only clears data from certain locations', async () => {
+    let foundStorageTypes;
+    connectionStub.sendCommand = createMockSendCommandFn()
+      .mockResponse('Storage.clearDataForOrigin', ({storageTypes}) => {
+        foundStorageTypes = storageTypes;
+      });
+    await driver.clearDataForOrigin('https://example.com');
+    expect(foundStorageTypes).toMatchInlineSnapshot(

Can you add a comment for future editors of this file about the types they should be suspicious of if they appear in this snapshot? And/or we could outright assert that the new important storage types aren't in here.
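For example, something in this direction (a rough sketch reusing the mocks from the test above, not the exact assertion):

```js
it('does not clear storage types that hold user data', async () => {
  let foundStorageTypes;
  connectionStub.sendCommand = createMockSendCommandFn()
    .mockResponse('Storage.clearDataForOrigin', ({storageTypes}) => {
      foundStorageTypes = storageTypes;
    });
  await driver.clearDataForOrigin('https://example.com');

  // These are the storage types the change intentionally preserves; if any
  // of them show up in the cleared set, the behavior has regressed.
  for (const importantType of ['local_storage', 'indexeddb', 'websql']) {
    expect(foundStorageTypes).not.toContain(importantType);
  }
});
```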

adamraine

comment created time in 4 days

Pull request review comment GoogleChrome/lighthouse

core(driver): don't clear indexedb, websql, or localstorage before run

 class Driver {
     }
   }
 
+  /**
+   * @param {string} url
+   * @param {(LH.IcuMessage | string)[]} LighthouseRunWarnings
+   */
+  async getImportantStorageWarning(url, LighthouseRunWarnings) {
+    const usageData = await this.sendCommand('Storage.getUsageAndQuota', {
+      origin: url,
+    });
+    /** @type {Object.<string, string>} */
    /** @type {Record<string, string>} */

have you been doing Closure compiled code recently? :)
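For reference (just an aside, not part of the diff): tsc accepts both spellings in JSDoc, so the suggestion is about matching the Record<> style used in the other diffs above rather than correctness.

```js
/** @type {Object.<string, string>} */ // Closure-style map annotation
const closureStyle = {};

/** @type {Record<string, string>} */ // TypeScript-style equivalent, as suggested
const tsStyle = {};
```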

adamraine

comment created time in 4 days


Pull request review comment GoogleChrome/lighthouse

new_audit: add treemap-data to experimental

+/**+ * @license Copyright 2020 Google Inc. All Rights Reserved.+ * Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0+ * Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.+ */+'use strict';++/**+ * @fileoverview+ * Creates treemap data for webtreemap.+ */++const Audit = require('./audit.js');+const JsBundles = require('../computed/js-bundles.js');+const UnusedJavaScriptSummary = require('../computed/unused-javascript-summary.js');+const ModuleDuplication = require('../computed/module-duplication.js');+const NetworkRecords = require('../computed/network-records.js');+const ResourceSummary = require('../computed/resource-summary.js');++/**+ * @typedef {Record<string, RootNode[]>} TreemapData+ */++/**+ * @typedef RootNode+ * @property {string} name+ * @property {Node} node+ */++/**+ * @typedef Node+ * @property {string} name+ * @property {number} resourceBytes+ * @property {number=} unusedBytes+ * @property {number=} executionTime+ * @property {string=} duplicate+ * @property {Node[]=} children+ */++/**+ * @typedef {Omit<Node, 'name'|'children'>} SourceData+ */++class TreemapDataAudit extends Audit {+  /**+   * @return {LH.Audit.Meta}+   */+  static get meta() {+    return {+      id: 'treemap-data',+      scoreDisplayMode: Audit.SCORING_MODES.INFORMATIVE,+      title: 'Treemap Data',+      description: 'Used for treemap visualization.',+      requiredArtifacts:+        ['traces', 'devtoolsLogs', 'SourceMaps', 'ScriptElements', 'JsUsage', 'URL'],+    };+  }++  /**+   * @param {string} sourceRoot+   * @param {Record<string, SourceData>} sourcesData+   * @return {Node}+   */+  static prepareTreemapNodes(sourceRoot, sourcesData) {+    /**+     * @param {string} name+     * @return {Node}+     */+    function newNode(name) {+      return {+        name,+        resourceBytes: 0,+      };+    }++    /**+     * Given a slash-delimited path, traverse the Node structure and increment+     * the data provided for each node in the chain. 
Creates nodes as needed.+     * Ex: path/to/file.js will find or create "path" on `node`, increment the data fields,+     *     and continue with "to", and so on.+     * @param {string} source+     * @param {SourceData} data+     * @param {Node} node+     */+    function addNode(source, data, node) {+      // Strip off the shared root.+      const sourcePathSegments = source.replace(sourceRoot, '').split(/\/+/);+      sourcePathSegments.forEach((sourcePathSegment, i) => {+        const isLastSegment = i === sourcePathSegments.length - 1;++        let child = node.children && node.children.find(child => child.name === sourcePathSegment);+        if (!child) {+          child = newNode(sourcePathSegment);+          node.children = node.children || [];+          node.children.push(child);+        }+        node = child;++        // Now that we've found or created the next node in the path, apply the data.+        node.resourceBytes += data.resourceBytes;+        if (data.unusedBytes) node.unusedBytes = (node.unusedBytes || 0) + data.unusedBytes;+        if (data.duplicate !== undefined && isLastSegment) {+          node.duplicate = data.duplicate;+        }+      });+    }++    const rootNode = newNode(sourceRoot);++    // For every source file, apply the data to all components+    // of the source path, creating nodes as necessary.+    for (const [source, data] of Object.entries(sourcesData)) {+      addNode(source || `<unmapped>`, data, rootNode);++      // Apply the data to the rootNode.+      rootNode.resourceBytes += data.resourceBytes;+      if (data.unusedBytes) rootNode.unusedBytes = (rootNode.unusedBytes || 0) + data.unusedBytes;+    }++    /**+     * Collapse nodes that have only one child.+     * @param {*} node+     */+    function collapse(node) {+      while (node.children && node.children.length === 1) {+        node.name += '/' + node.children[0].name;+        node.children = node.children[0].children;+      }++      if (node.children) {+        for (const child of node.children) {+          collapse(child);+        }+      }+    }+    collapse(rootNode);++    // TODO(cjamcl): Should this structure be flattened for space savings?+    // Like DOM Snapshot.+    // Less JSON (no super nested children, and no repeated property names).++    return rootNode;+  }++  /**+   * @param {LH.Artifacts} artifacts+   * @param {LH.Audit.Context} context+   * @return {Promise<RootNode[]>}+   */+  static async makeJavaScriptRootNodes(artifacts, context) {+    /** @type {RootNode[]} */+    const rootNodes = [];++    const bundles = await JsBundles.request(artifacts, context);+    const duplication = await ModuleDuplication.request(artifacts, context);++    /** @type {Array<{src: string, length: number, unusedJavascriptSummary?: import('../computed/unused-javascript-summary.js').Summary}>} */+    const scriptData = [];+    const inlineScriptData = {+      src: artifacts.URL.finalUrl,+      length: 0,+    };+    for (const scriptElement of artifacts.ScriptElements) {+      // Normalize ScriptElements so that inline scripts show up as a single entity.+      if (!scriptElement.src) {+        inlineScriptData.length += (scriptElement.content || '').length;+        continue;+      }++      const url = scriptElement.src;+      const bundle = bundles.find(bundle => url === bundle.script.src);+      const scriptCoverages = artifacts.JsUsage[url];+      if (!bundle || !scriptCoverages) continue;++      scriptData.push({+        src: scriptElement.src,+        length: (scriptElement.content || '').length,+ 
       unusedJavascriptSummary:+          await UnusedJavaScriptSummary.request({url, scriptCoverages, bundle}, context),+      });+    }+    if (inlineScriptData.length) scriptData.unshift(inlineScriptData);++    for (const {src, length, unusedJavascriptSummary} of scriptData) {+      const bundle = bundles.find(bundle => bundle.script.src === src);+      const name = src;++      let node;+      if (bundle && unusedJavascriptSummary && unusedJavascriptSummary.sourcesWastedBytes) {+        /** @type {Record<string, SourceData>} */+        const sourcesData = {};+        for (const source of Object.keys(bundle.sizes.files)) {+          /** @type {SourceData} */+          const sourceData = {+            resourceBytes: bundle.sizes.files[source],+          };++          if (unusedJavascriptSummary && unusedJavascriptSummary.sourcesWastedBytes) {+            sourceData.unusedBytes = unusedJavascriptSummary.sourcesWastedBytes[source];+          }++          if (duplication) {+            const key = ModuleDuplication._normalizeSource(source);+            if (duplication.has(key)) sourceData.duplicate = key;+          }++          sourcesData[source] = sourceData;+        }++        node = this.prepareTreemapNodes(bundle.rawMap.sourceRoot || '', sourcesData);+      } else if (unusedJavascriptSummary) {+        node = {+          name,+          resourceBytes: unusedJavascriptSummary.totalBytes,+          unusedBytes: unusedJavascriptSummary.wastedBytes,+          executionTime: 0,+        };+      } else {+        // TODO ...?+        node = {+          name,+          resourceBytes: length,+          unusedBytes: 0,+          executionTime: 0,+        };+      }++      rootNodes.push({+        name: name,+        node,+      });+    }++    return rootNodes;+  }++  /**+   * @param {LH.Artifacts.Bundle[]} bundles+   * @param {string} url+   * @param {LH.Artifacts['JsUsage']} JsUsage+   * @param {LH.Audit.Context} context+   */+  static async getUnusedJavascriptSummary(bundles, url, JsUsage, context) {+    const bundle = bundles.find(bundle => url === bundle.script.src);+    const scriptCoverages = JsUsage[url];+    if (!scriptCoverages) return;++    const unusedJsSumary =+      await UnusedJavaScriptSummary.request({url, scriptCoverages, bundle}, context);+    return unusedJsSumary;+  }++  /**+   * @param {LH.Artifacts} artifacts+   * @param {LH.Audit.Context} context+   * @return {Promise<RootNode>}+   */+  static async makeResourceSummaryRootNode(artifacts, context) {+    const devtoolsLog = artifacts.devtoolsLogs[Audit.DEFAULT_PASS];+    const networkRecords = await NetworkRecords.request(devtoolsLog, context);+    const origin = new URL(artifacts.URL.finalUrl).origin;++    const totalCount = networkRecords.length;+    let totalSize = 0;++    /** @type {Node[]} */+    const children = [];+    for (const networkRecord of networkRecords) {+      const resourceType = ResourceSummary.determineResourceType(networkRecord);++      let child = children.find(child => child.name === resourceType);+      if (!child) {+        child = {+          name: resourceType,+          resourceBytes: 0,+          children: [],+        };+        children.push(child);+      }++      totalSize += networkRecord.resourceSize;+      child.resourceBytes += networkRecord.resourceSize;++      let name = networkRecord.url;+      // TODO ...+      if (name.startsWith(origin)) name = name.replace(origin, '/');+      child.children = child.children || [];+      child.children.push({+        name,+        resourceBytes: 
networkRecord.resourceSize,+      });+    }++    return {+      name: 'Resource Summary',+      node: {+        name: `${totalCount} requests`,+        resourceBytes: totalSize,+        children,+      },+    };+  }++  /**+   * @param {LH.Artifacts} artifacts+   * @param {LH.Audit.Context} context+   * @return {Promise<LH.Audit.Product>}+   */+  static async audit(artifacts, context) {+    /** @type {TreemapData} */+    const treemapData = {+      scripts: await TreemapDataAudit.makeJavaScriptRootNodes(artifacts, context),+      resources: [await TreemapDataAudit.makeResourceSummaryRootNode(artifacts, context)],+    };++    /** @type {LH.Audit.Details.DebugData} */

Should I create a new detail type?

It might be easiest to stay as DebugData while this lives in experimental-config, but when shipping to default-config it would probably be best as a new audit-details type, since it has a real intended use rather than just debugging/HTTP Archive debugging.
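i.e., for the experimental-config phase, roughly the shape the diff already uses (a sketch only, not the final details type):

```js
/** @type {LH.Audit.Details.DebugData} */
const details = {
  type: 'debugdata',
  treemapData,
};

return {
  score: 1, // informative audit, so the score isn't meaningful
  details,
};
```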

connorjclark

comment created time in 5 days


Pull request review comment GoogleChrome/lighthouse

new_audit: add treemap-data to experimental

 function getProtoRoundTrip() {
   };
 }
 
+/**
+ * @typedef PartialScriptCoverage
+ * @property {string} url
+ * @property {Array<{ranges: LH.Crdp.Profiler.CoverageRange[]}>} functions
+ */
+
+/**
+ * @param {string} name
+ * @return {{map: LH.Artifacts.RawSourceMap, content: string, usage?: PartialScriptCoverage}}

if the usage type is always what's given below, augment with dummy functionName/scriptId/isBlockCoverage and make it a real ScriptCoverage instead of this PartialScriptCoverage?
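i.e., something along these lines (a sketch; the filled-in values are dummies whose only job is to satisfy the full type):

```js
/**
 * @param {PartialScriptCoverage} partial
 * @return {LH.Crdp.Profiler.ScriptCoverage}
 */
function toScriptCoverage(partial) {
  return {
    scriptId: '0', // dummy, unused by the code under test
    url: partial.url,
    functions: partial.functions.map(fn => ({
      functionName: '', // dummy
      isBlockCoverage: false, // dummy
      ranges: fn.ranges,
    })),
  };
}
```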

connorjclark

comment created time in 5 days

Pull request review comment GoogleChrome/lighthouse

new_audit: add treemap-data to experimental

+/**+ * @license Copyright 2020 The Lighthouse Authors. All Rights Reserved.+ * Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0+ * Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.+ */+'use strict';++const TreemapData_ = require('../../audits/treemap-data.js');+const networkRecordsToDevtoolsLog = require('../network-records-to-devtools-log.js');+const {loadSourceMapFixture, makeParamsOptional} = require('../test-utils.js');++/* eslint-env jest */++const TreemapData = {+  audit: makeParamsOptional(TreemapData_.audit),+  prepareTreemapNodes: makeParamsOptional(TreemapData_.prepareTreemapNodes),+};++/**+ * @param {string} name+ */+function load(name) {+  const data = loadSourceMapFixture(name);+  if (!data.usage) throw new Error('exepcted usage');+  return {...data, usage: data.usage};+}++/**+ * @param {string} url+ * @param {number} transferSize+ * @param {LH.Crdp.Network.ResourceType} resourceType+ */+function generateRecord(url, transferSize, resourceType) {+  return {url, transferSize, resourceType};+}++describe('TreemapData audit', () => {+  describe('squoosh fixture', () => {+    /** @type {import('../../audits/treemap-data.js').TreemapData} */+    let treemapData;+    beforeAll(async () => {+      const context = {computedCache: new Map()};+      const {map, content, usage} = load('squoosh');+      const mainUrl = 'https://squoosh.app';+      const scriptUrl = 'https://squoosh.app/main-app.js';+      const networkRecords = [generateRecord(scriptUrl, content.length, 'Script')];++      const artifacts = {+        URL: {requestedUrl: mainUrl, finalUrl: mainUrl},+        JsUsage: {[usage.url]: [usage]},+        devtoolsLogs: {defaultPass: networkRecordsToDevtoolsLog(networkRecords)},+        SourceMaps: [{scriptUrl: scriptUrl, map}],+        ScriptElements: [{src: scriptUrl, content}],+      };+      const results = await TreemapData.audit(artifacts, context);++      // @ts-expect-error: Debug data.+      treemapData = results.details.treemapData;+    });++    it('basics', () => {+      expect(Object.keys(treemapData)).toEqual(['scripts', 'resources']);+    });++    it('js', () => {+      expect(treemapData.scripts).toMatchSnapshot();+    });++    it('resource summary', () => {

Is it possible to move these to more specific tests? With a snapshot, when something changes all that's clear is that something has changed, and it may not be clear that it's a bad thing.

It also makes debugging start from a blank slate: instead of knowing that the failure is exactly that dependent modules aren't nested under their parents, or that names aren't trimmed of the sourceRoot, you have to start with "something has changed", track down what that is and what caused it, then go read the file under test to figure out whether the original intention was that dependent modules should be nested under their parents or that names are trimmed of the sourceRoot (and also whether there are cases when those things shouldn't happen...).
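For example, an assertion along these lines would localize a failure to a specific invariant (hypothetical sketch; exact expectations would depend on the fixture):

```js
it('collapses chains of single-child nodes', () => {
  function checkCollapsed(node) {
    // After prepareTreemapNodes, a node should either be a leaf or have 2+ children.
    if (!node.children) return;
    expect(node.children.length).toBeGreaterThan(1);
    node.children.forEach(checkCollapsed);
  }
  for (const rootNode of treemapData.scripts) {
    checkCollapsed(rootNode.node);
  }
});
```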

connorjclark

comment created time in 5 days

Pull request review comment GoogleChrome/lighthouse

new_audit: add treemap-data to experimental

+/**+ * @license Copyright 2020 Google Inc. All Rights Reserved.+ * Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0+ * Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.+ */+'use strict';++/**+ * @fileoverview+ * Creates treemap data for webtreemap.+ */++const Audit = require('./audit.js');+const JsBundles = require('../computed/js-bundles.js');+const UnusedJavaScriptSummary = require('../computed/unused-javascript-summary.js');+const ModuleDuplication = require('../computed/module-duplication.js');+const NetworkRecords = require('../computed/network-records.js');+const ResourceSummary = require('../computed/resource-summary.js');++/**+ * @typedef {Record<string, RootNode[]>} TreemapData+ */++/**+ * @typedef RootNode+ * @property {string} name+ * @property {Node} node+ */++/**+ * @typedef Node+ * @property {string} name+ * @property {number} resourceBytes+ * @property {number=} unusedBytes+ * @property {number=} executionTime+ * @property {string=} duplicate+ * @property {Node[]=} children+ */++/**+ * @typedef {Omit<Node, 'name'|'children'>} SourceData+ */++class TreemapDataAudit extends Audit {+  /**+   * @return {LH.Audit.Meta}+   */+  static get meta() {+    return {+      id: 'treemap-data',+      scoreDisplayMode: Audit.SCORING_MODES.INFORMATIVE,+      title: 'Treemap Data',+      description: 'Used for treemap app.',+      requiredArtifacts:+        ['traces', 'devtoolsLogs', 'SourceMaps', 'ScriptElements', 'JsUsage', 'URL'],+    };+  }++  /**+   * @param {string} sourceRoot+   * @param {Record<string, SourceData>} sourcesData+   * @return {Node}+   */+  static prepareTreemapNodes(sourceRoot, sourcesData) {+    /**+     * @param {string} name+     * @return {Node}+     */+    function newNode(name) {+      return {+        name,+        resourceBytes: 0,+      };+    }++    /**+     * Given a slash-delimited path, traverse the Node structure and increment+     * the data provided for each node in the chain. 
Creates nodes as needed.+     * Ex: path/to/file.js will find or create "path" on `node`, increment the data fields,+     *     and continue with "to", and so on.+     * @param {string} source+     * @param {SourceData} data+     * @param {Node} node+     */+    function addNode(source, data, node) {+      // Strip off the shared root.+      const sourcePathSegments = source.replace(sourceRoot, '').split(/\/+/);+      sourcePathSegments.forEach((sourcePathSegment, i) => {+        const isLastSegment = i === sourcePathSegments.length - 1;++        let child = node.children && node.children.find(child => child.name === sourcePathSegment);+        if (!child) {+          child = newNode(sourcePathSegment);+          node.children = node.children || [];+          node.children.push(child);+        }+        node = child;++        // Now that we've found or created the next node in the path, apply the data.+        node.resourceBytes += data.resourceBytes;+        if (data.unusedBytes) node.unusedBytes = (node.unusedBytes || 0) + data.unusedBytes;+        if (data.duplicate !== undefined && isLastSegment) {+          node.duplicate = data.duplicate;+        }+      });+    }++    const rootNode = newNode(sourceRoot);++    // For every source file, apply the data to all components+    // of the source path, creating nodes as necessary.+    for (const [source, data] of Object.entries(sourcesData)) {+      addNode(source || `<unmapped>`, data, rootNode);++      // Apply the data to the rootNode.+      rootNode.resourceBytes += data.resourceBytes;+      if (data.unusedBytes) rootNode.unusedBytes = (rootNode.unusedBytes || 0) + data.unusedBytes;+    }++    /**+     * Collapse nodes that have only one child.+     * @param {Node} node+     */+    function collapse(node) {+      while (node.children && node.children.length === 1) {+        node.name += '/' + node.children[0].name;+        node.children = node.children[0].children;+      }++      if (node.children) {+        for (const child of node.children) {+          collapse(child);+        }+      }+    }+    collapse(rootNode);++    // TODO(cjamcl): Should this structure be flattened for space savings?+    // Like DOM Snapshot.+    // Less JSON (no super nested children, and no repeated property names).++    return rootNode;+  }++  /**+   * @param {LH.Artifacts} artifacts+   * @param {LH.Audit.Context} context+   * @return {Promise<RootNode[]>}+   */+  static async makeJavaScriptRootNodes(artifacts, context) {+    /** @type {RootNode[]} */+    const rootNodes = [];++    const bundles = await JsBundles.request(artifacts, context);+    const duplication = await ModuleDuplication.request(artifacts, context);++    /** @type {Array<{src: string, length: number, unusedJavascriptSummary?: import('../computed/unused-javascript-summary.js').Summary}>} */+    const scriptData = [];+    const inlineScriptData = {+      src: artifacts.URL.finalUrl,+      length: 0,+    };+    for (const scriptElement of artifacts.ScriptElements) {+      // Normalize ScriptElements so that inline scripts show up as a single entity.+      if (!scriptElement.src) {+        inlineScriptData.length += (scriptElement.content || '').length;+        continue;+      }++      const url = scriptElement.src;+      const bundle = bundles.find(bundle => url === bundle.script.src);+      const scriptCoverages = artifacts.JsUsage[url];+      if (!bundle || !scriptCoverages) continue;++      scriptData.push({+        src: scriptElement.src,+        length: (scriptElement.content || 
'').length,+        unusedJavascriptSummary:+          await UnusedJavaScriptSummary.request({url, scriptCoverages, bundle}, context),+      });+    }+    if (inlineScriptData.length) scriptData.unshift(inlineScriptData);++    for (const {src, length, unusedJavascriptSummary} of scriptData) {+      const bundle = bundles.find(bundle => bundle.script.src === src);+      const name = src;++      let node;+      if (bundle && unusedJavascriptSummary && unusedJavascriptSummary.sourcesWastedBytes) {+        /** @type {Record<string, SourceData>} */+        const sourcesData = {};+        for (const source of Object.keys(bundle.sizes.files)) {+          /** @type {SourceData} */+          const sourceData = {+            resourceBytes: bundle.sizes.files[source],+          };++          if (unusedJavascriptSummary && unusedJavascriptSummary.sourcesWastedBytes) {+            sourceData.unusedBytes = unusedJavascriptSummary.sourcesWastedBytes[source];+          }++          if (duplication) {+            const key = ModuleDuplication._normalizeSource(source);+            if (duplication.has(key)) sourceData.duplicate = key;+          }++          sourcesData[source] = sourceData;+        }++        node = this.prepareTreemapNodes(bundle.rawMap.sourceRoot || '', sourcesData);+      } else if (unusedJavascriptSummary) {+        node = {+          name,+          resourceBytes: unusedJavascriptSummary.totalBytes,+          unusedBytes: unusedJavascriptSummary.wastedBytes,+          executionTime: 0,+        };+      } else {+        node = {+          name,+          resourceBytes: length,+          unusedBytes: 0,+          executionTime: 0,+        };+      }++      rootNodes.push({+        name,+        node,+      });+    }++    return rootNodes;+  }++  /**

jsdoc here, too, including e.g. what resources we're talking about and a general description of the hierarchy generated.
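e.g., for makeResourceSummaryRootNode, possibly something like this (wording is just a starting point, inferred from the code above):

```js
/**
 * Returns a root node covering the page's network requests: one child per
 * resource type (Script, Image, Document, ...), each with a leaf per request,
 * sized by resourceSize.
 * @param {LH.Artifacts} artifacts
 * @param {LH.Audit.Context} context
 * @return {Promise<RootNode>}
 */
```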

connorjclark

comment created time in 5 days

Pull request review comment GoogleChrome/lighthouse

new_audit: add treemap-data to experimental

+/**+ * @license Copyright 2020 Google Inc. All Rights Reserved.+ * Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0+ * Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.+ */+'use strict';++/**+ * @fileoverview+ * Creates treemap data for webtreemap.+ */++const Audit = require('./audit.js');+const JsBundles = require('../computed/js-bundles.js');+const UnusedJavaScriptSummary = require('../computed/unused-javascript-summary.js');+const ModuleDuplication = require('../computed/module-duplication.js');+const NetworkRecords = require('../computed/network-records.js');+const ResourceSummary = require('../computed/resource-summary.js');++/**+ * @typedef {Record<string, RootNode[]>} TreemapData+ */++/**+ * @typedef RootNode+ * @property {string} name+ * @property {Node} node+ */++/**+ * @typedef Node+ * @property {string} name+ * @property {number} resourceBytes+ * @property {number=} unusedBytes+ * @property {number=} executionTime+ * @property {string=} duplicate+ * @property {Node[]=} children+ */++/**+ * @typedef {Omit<Node, 'name'|'children'>} SourceData+ */++class TreemapDataAudit extends Audit {+  /**+   * @return {LH.Audit.Meta}+   */+  static get meta() {+    return {+      id: 'treemap-data',+      scoreDisplayMode: Audit.SCORING_MODES.INFORMATIVE,+      title: 'Treemap Data',+      description: 'Used for treemap app.',+      requiredArtifacts:+        ['traces', 'devtoolsLogs', 'SourceMaps', 'ScriptElements', 'JsUsage', 'URL'],+    };+  }++  /**

can you add a simple jsdoc description

connorjclark

comment created time in 5 days

Pull request review comment GoogleChrome/lighthouse

new_audit: add treemap-data to experimental

+/**
+ * @license Copyright 2020 The Lighthouse Authors. All Rights Reserved.
+ * Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
+ */
+'use strict';
+
+const TreemapData_ = require('../../audits/treemap-data.js');
+const networkRecordsToDevtoolsLog = require('../network-records-to-devtools-log.js');
+const {loadSourceMapFixture, makeParamsOptional} = require('../test-utils.js');
+
+/* eslint-env jest */
+
+const TreemapData = {
+  audit: makeParamsOptional(TreemapData_.audit),
+  prepareTreemapNodes: makeParamsOptional(TreemapData_.prepareTreemapNodes),
+};
+
+/**
+ * @param {string} name
+ */
+function load(name) {
+  const data = loadSourceMapFixture(name);
+  if (!data.usage) throw new Error('exepcted usage');

just move this check into loadSourceMapFixture? The majority of that function is loading the usage...if some hypothetical test in the future just wants a source map, it could just do the readFileSync call without the utility

connorjclark

comment created time in 5 days

Pull request review comment GoogleChrome/lighthouse

new_audit: add treemap-data to experimental

+/**+ * @license Copyright 2020 Google Inc. All Rights Reserved.+ * Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0+ * Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.+ */+'use strict';++/**+ * @fileoverview+ * Creates treemap data for webtreemap.+ */++const Audit = require('./audit.js');+const JsBundles = require('../computed/js-bundles.js');+const UnusedJavaScriptSummary = require('../computed/unused-javascript-summary.js');+const ModuleDuplication = require('../computed/module-duplication.js');+const NetworkRecords = require('../computed/network-records.js');+const ResourceSummary = require('../computed/resource-summary.js');++/**+ * @typedef {Record<string, RootNode[]>} TreemapData+ */++/**+ * @typedef RootNode+ * @property {string} name+ * @property {Node} node+ */++/**+ * @typedef Node+ * @property {string} name+ * @property {number} resourceBytes+ * @property {number=} unusedBytes+ * @property {number=} executionTime+ * @property {string=} duplicate+ * @property {Node[]=} children+ */++/**+ * @typedef {Omit<Node, 'name'|'children'>} SourceData+ */++class TreemapDataAudit extends Audit {+  /**+   * @return {LH.Audit.Meta}+   */+  static get meta() {+    return {+      id: 'treemap-data',+      scoreDisplayMode: Audit.SCORING_MODES.INFORMATIVE,+      title: 'Treemap Data',+      description: 'Used for treemap app.',+      requiredArtifacts:+        ['traces', 'devtoolsLogs', 'SourceMaps', 'ScriptElements', 'JsUsage', 'URL'],+    };+  }++  /**+   * @param {string} sourceRoot+   * @param {Record<string, SourceData>} sourcesData+   * @return {Node}+   */+  static prepareTreemapNodes(sourceRoot, sourcesData) {+    /**+     * @param {string} name+     * @return {Node}+     */+    function newNode(name) {+      return {+        name,+        resourceBytes: 0,+      };+    }++    /**+     * Given a slash-delimited path, traverse the Node structure and increment+     * the data provided for each node in the chain. 
Creates nodes as needed.+     * Ex: path/to/file.js will find or create "path" on `node`, increment the data fields,+     *     and continue with "to", and so on.+     * @param {string} source+     * @param {SourceData} data+     * @param {Node} node+     */+    function addNode(source, data, node) {+      // Strip off the shared root.+      const sourcePathSegments = source.replace(sourceRoot, '').split(/\/+/);+      sourcePathSegments.forEach((sourcePathSegment, i) => {+        const isLastSegment = i === sourcePathSegments.length - 1;++        let child = node.children && node.children.find(child => child.name === sourcePathSegment);+        if (!child) {+          child = newNode(sourcePathSegment);+          node.children = node.children || [];+          node.children.push(child);+        }+        node = child;++        // Now that we've found or created the next node in the path, apply the data.+        node.resourceBytes += data.resourceBytes;+        if (data.unusedBytes) node.unusedBytes = (node.unusedBytes || 0) + data.unusedBytes;+        if (data.duplicate !== undefined && isLastSegment) {+          node.duplicate = data.duplicate;+        }+      });+    }++    const rootNode = newNode(sourceRoot);++    // For every source file, apply the data to all components+    // of the source path, creating nodes as necessary.+    for (const [source, data] of Object.entries(sourcesData)) {+      addNode(source || `<unmapped>`, data, rootNode);++      // Apply the data to the rootNode.+      rootNode.resourceBytes += data.resourceBytes;+      if (data.unusedBytes) rootNode.unusedBytes = (rootNode.unusedBytes || 0) + data.unusedBytes;+    }++    /**+     * Collapse nodes that have only one child.+     * @param {Node} node+     */+    function collapse(node) {+      while (node.children && node.children.length === 1) {+        node.name += '/' + node.children[0].name;+        node.children = node.children[0].children;+      }++      if (node.children) {+        for (const child of node.children) {+          collapse(child);+        }+      }+    }+    collapse(rootNode);++    // TODO(cjamcl): Should this structure be flattened for space savings?+    // Like DOM Snapshot.+    // Less JSON (no super nested children, and no repeated property names).++    return rootNode;+  }++  /**+   * @param {LH.Artifacts} artifacts+   * @param {LH.Audit.Context} context+   * @return {Promise<RootNode[]>}+   */+  static async makeJavaScriptRootNodes(artifacts, context) {+    /** @type {RootNode[]} */+    const rootNodes = [];++    const bundles = await JsBundles.request(artifacts, context);+    const duplication = await ModuleDuplication.request(artifacts, context);++    /** @type {Array<{src: string, length: number, unusedJavascriptSummary?: import('../computed/unused-javascript-summary.js').Summary}>} */+    const scriptData = [];+    const inlineScriptData = {+      src: artifacts.URL.finalUrl,+      length: 0,+    };+    for (const scriptElement of artifacts.ScriptElements) {+      // Normalize ScriptElements so that inline scripts show up as a single entity.+      if (!scriptElement.src) {+        inlineScriptData.length += (scriptElement.content || '').length;+        continue;+      }++      const url = scriptElement.src;+      const bundle = bundles.find(bundle => url === bundle.script.src);+      const scriptCoverages = artifacts.JsUsage[url];+      if (!bundle || !scriptCoverages) continue;++      scriptData.push({+        src: scriptElement.src,+        length: (scriptElement.content || 
'').length,+        unusedJavascriptSummary:+          await UnusedJavaScriptSummary.request({url, scriptCoverages, bundle}, context),+      });+    }+    if (inlineScriptData.length) scriptData.unshift(inlineScriptData);++    for (const {src, length, unusedJavascriptSummary} of scriptData) {+      const bundle = bundles.find(bundle => bundle.script.src === src);+      const name = src;++      let node;+      if (bundle && unusedJavascriptSummary && unusedJavascriptSummary.sourcesWastedBytes) {+        /** @type {Record<string, SourceData>} */+        const sourcesData = {};+        for (const source of Object.keys(bundle.sizes.files)) {+          /** @type {SourceData} */+          const sourceData = {+            resourceBytes: bundle.sizes.files[source],+          };++          if (unusedJavascriptSummary && unusedJavascriptSummary.sourcesWastedBytes) {+            sourceData.unusedBytes = unusedJavascriptSummary.sourcesWastedBytes[source];+          }++          if (duplication) {+            const key = ModuleDuplication._normalizeSource(source);

we should drop the leading _ on the function name if it's not going to be private to the class

connorjclark

comment created time in 5 days

Pull request review comment GoogleChrome/lighthouse

new_audit: add treemap-data to experimental

+/**+ * @license Copyright 2020 Google Inc. All Rights Reserved.+ * Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0+ * Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.+ */+'use strict';++/**+ * @fileoverview+ * Creates treemap data for webtreemap.+ */++const Audit = require('./audit.js');+const JsBundles = require('../computed/js-bundles.js');+const UnusedJavaScriptSummary = require('../computed/unused-javascript-summary.js');+const ModuleDuplication = require('../computed/module-duplication.js');+const NetworkRecords = require('../computed/network-records.js');+const ResourceSummary = require('../computed/resource-summary.js');++/**+ * @typedef {Record<string, RootNode[]>} TreemapData+ */++/**+ * @typedef RootNode+ * @property {string} name+ * @property {Node} node+ */++/**+ * @typedef Node+ * @property {string} name+ * @property {number} resourceBytes+ * @property {number=} unusedBytes+ * @property {number=} executionTime+ * @property {string=} duplicate+ * @property {Node[]=} children+ */++/**+ * @typedef {Omit<Node, 'name'|'children'>} SourceData+ */++class TreemapDataAudit extends Audit {+  /**+   * @return {LH.Audit.Meta}+   */+  static get meta() {+    return {+      id: 'treemap-data',+      scoreDisplayMode: Audit.SCORING_MODES.INFORMATIVE,+      title: 'Treemap Data',+      description: 'Used for treemap app.',+      requiredArtifacts:+        ['traces', 'devtoolsLogs', 'SourceMaps', 'ScriptElements', 'JsUsage', 'URL'],+    };+  }++  /**+   * @param {string} sourceRoot+   * @param {Record<string, SourceData>} sourcesData+   * @return {Node}+   */+  static prepareTreemapNodes(sourceRoot, sourcesData) {+    /**+     * @param {string} name+     * @return {Node}+     */+    function newNode(name) {+      return {+        name,+        resourceBytes: 0,+      };+    }++    /**+     * Given a slash-delimited path, traverse the Node structure and increment+     * the data provided for each node in the chain. 
Creates nodes as needed.+     * Ex: path/to/file.js will find or create "path" on `node`, increment the data fields,+     *     and continue with "to", and so on.+     * @param {string} source+     * @param {SourceData} data+     * @param {Node} node+     */+    function addNode(source, data, node) {+      // Strip off the shared root.+      const sourcePathSegments = source.replace(sourceRoot, '').split(/\/+/);+      sourcePathSegments.forEach((sourcePathSegment, i) => {+        const isLastSegment = i === sourcePathSegments.length - 1;++        let child = node.children && node.children.find(child => child.name === sourcePathSegment);+        if (!child) {+          child = newNode(sourcePathSegment);+          node.children = node.children || [];+          node.children.push(child);+        }+        node = child;++        // Now that we've found or created the next node in the path, apply the data.+        node.resourceBytes += data.resourceBytes;+        if (data.unusedBytes) node.unusedBytes = (node.unusedBytes || 0) + data.unusedBytes;+        if (data.duplicate !== undefined && isLastSegment) {+          node.duplicate = data.duplicate;+        }+      });+    }++    const rootNode = newNode(sourceRoot);++    // For every source file, apply the data to all components+    // of the source path, creating nodes as necessary.+    for (const [source, data] of Object.entries(sourcesData)) {+      addNode(source || `<unmapped>`, data, rootNode);++      // Apply the data to the rootNode.+      rootNode.resourceBytes += data.resourceBytes;+      if (data.unusedBytes) rootNode.unusedBytes = (rootNode.unusedBytes || 0) + data.unusedBytes;+    }++    /**+     * Collapse nodes that have only one child.+     * @param {Node} node+     */+    function collapse(node) {+      while (node.children && node.children.length === 1) {+        node.name += '/' + node.children[0].name;+        node.children = node.children[0].children;+      }++      if (node.children) {+        for (const child of node.children) {+          collapse(child);+        }+      }+    }+    collapse(rootNode);++    // TODO(cjamcl): Should this structure be flattened for space savings?+    // Like DOM Snapshot.+    // Less JSON (no super nested children, and no repeated property names).++    return rootNode;+  }++  /**+   * @param {LH.Artifacts} artifacts+   * @param {LH.Audit.Context} context+   * @return {Promise<RootNode[]>}+   */+  static async makeJavaScriptRootNodes(artifacts, context) {+    /** @type {RootNode[]} */+    const rootNodes = [];++    const bundles = await JsBundles.request(artifacts, context);+    const duplication = await ModuleDuplication.request(artifacts, context);++    /** @type {Array<{src: string, length: number, unusedJavascriptSummary?: import('../computed/unused-javascript-summary.js').Summary}>} */+    const scriptData = [];+    const inlineScriptData = {+      src: artifacts.URL.finalUrl,+      length: 0,+    };+    for (const scriptElement of artifacts.ScriptElements) {+      // Normalize ScriptElements so that inline scripts show up as a single entity.+      if (!scriptElement.src) {+        inlineScriptData.length += (scriptElement.content || '').length;+        continue;+      }++      const url = scriptElement.src;+      const bundle = bundles.find(bundle => url === bundle.script.src);+      const scriptCoverages = artifacts.JsUsage[url];+      if (!bundle || !scriptCoverages) continue;++      scriptData.push({+        src: scriptElement.src,+        length: (scriptElement.content || 
'').length,+        unusedJavascriptSummary:+          await UnusedJavaScriptSummary.request({url, scriptCoverages, bundle}, context),+      });+    }+    if (inlineScriptData.length) scriptData.unshift(inlineScriptData);++    for (const {src, length, unusedJavascriptSummary} of scriptData) {+      const bundle = bundles.find(bundle => bundle.script.src === src);+      const name = src;++      let node;+      if (bundle && unusedJavascriptSummary && unusedJavascriptSummary.sourcesWastedBytes) {+        /** @type {Record<string, SourceData>} */+        const sourcesData = {};+        for (const source of Object.keys(bundle.sizes.files)) {+          /** @type {SourceData} */+          const sourceData = {+            resourceBytes: bundle.sizes.files[source],+          };++          if (unusedJavascriptSummary && unusedJavascriptSummary.sourcesWastedBytes) {+            sourceData.unusedBytes = unusedJavascriptSummary.sourcesWastedBytes[source];+          }++          if (duplication) {

this looks like it's always truthy?

connorjclark

comment created time in 5 days

Pull request review comment GoogleChrome/lighthouse

new_audit: add treemap-data to experimental

+/**
+ * @license Copyright 2020 Google Inc. All Rights Reserved.
+ * Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
+ */
+'use strict';
+
+/**
+ * @fileoverview
+ * Creates treemap data for webtreemap.
+ */
+
+const Audit = require('./audit.js');
+const JsBundles = require('../computed/js-bundles.js');
+const UnusedJavaScriptSummary = require('../computed/unused-javascript-summary.js');
+const ModuleDuplication = require('../computed/module-duplication.js');
+const NetworkRecords = require('../computed/network-records.js');
+const ResourceSummary = require('../computed/resource-summary.js');
+
+/**
+ * @typedef {Record<string, RootNode[]>} TreemapData
+ */
+
+/**
+ * @typedef RootNode
+ * @property {string} name
+ * @property {Node} node
+ */
+
+/**
+ * @typedef Node
+ * @property {string} name

jsdoc for some/all of these would be helpful (what kinds of values do name and duplicate have, for instance), as well as maybe a short example? (either here or in the fileoverview)
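e.g., something in this direction (descriptions inferred from how the fields are set in the code above, so treat them as a starting point):

```js
/**
 * @typedef Node
 * @property {string} name Path segment within the bundle (the root uses the sourceRoot; non-bundled scripts use the script URL).
 * @property {number} resourceBytes
 * @property {number=} unusedBytes
 * @property {number=} executionTime
 * @property {string=} duplicate Normalized source path, set when this module also appears in another bundle.
 * @property {Node[]=} children
 */
```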

connorjclark

comment created time in 5 days

Pull request review comment GoogleChrome/lighthouse

new_audit: add treemap-data to experimental

+/**+ * @license Copyright 2020 Google Inc. All Rights Reserved.+ * Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0+ * Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.+ */+'use strict';++/**+ * @fileoverview+ * Creates treemap data for webtreemap.+ */++const Audit = require('./audit.js');+const JsBundles = require('../computed/js-bundles.js');+const UnusedJavaScriptSummary = require('../computed/unused-javascript-summary.js');+const ModuleDuplication = require('../computed/module-duplication.js');+const NetworkRecords = require('../computed/network-records.js');+const ResourceSummary = require('../computed/resource-summary.js');++/**+ * @typedef {Record<string, RootNode[]>} TreemapData+ */++/**+ * @typedef RootNode+ * @property {string} name+ * @property {Node} node+ */++/**+ * @typedef Node+ * @property {string} name+ * @property {number} resourceBytes+ * @property {number=} unusedBytes+ * @property {number=} executionTime+ * @property {string=} duplicate+ * @property {Node[]=} children+ */++/**+ * @typedef {Omit<Node, 'name'|'children'>} SourceData+ */++class TreemapDataAudit extends Audit {+  /**+   * @return {LH.Audit.Meta}+   */+  static get meta() {+    return {+      id: 'treemap-data',+      scoreDisplayMode: Audit.SCORING_MODES.INFORMATIVE,+      title: 'Treemap Data',+      description: 'Used for treemap app.',+      requiredArtifacts:+        ['traces', 'devtoolsLogs', 'SourceMaps', 'ScriptElements', 'JsUsage', 'URL'],+    };+  }++  /**+   * @param {string} sourceRoot+   * @param {Record<string, SourceData>} sourcesData+   * @return {Node}+   */+  static prepareTreemapNodes(sourceRoot, sourcesData) {+    /**+     * @param {string} name+     * @return {Node}+     */+    function newNode(name) {+      return {+        name,+        resourceBytes: 0,+      };+    }++    /**+     * Given a slash-delimited path, traverse the Node structure and increment+     * the data provided for each node in the chain. 
Creates nodes as needed.+     * Ex: path/to/file.js will find or create "path" on `node`, increment the data fields,+     *     and continue with "to", and so on.+     * @param {string} source+     * @param {SourceData} data+     * @param {Node} node+     */+    function addNode(source, data, node) {+      // Strip off the shared root.+      const sourcePathSegments = source.replace(sourceRoot, '').split(/\/+/);+      sourcePathSegments.forEach((sourcePathSegment, i) => {+        const isLastSegment = i === sourcePathSegments.length - 1;++        let child = node.children && node.children.find(child => child.name === sourcePathSegment);+        if (!child) {+          child = newNode(sourcePathSegment);+          node.children = node.children || [];+          node.children.push(child);+        }+        node = child;++        // Now that we've found or created the next node in the path, apply the data.+        node.resourceBytes += data.resourceBytes;+        if (data.unusedBytes) node.unusedBytes = (node.unusedBytes || 0) + data.unusedBytes;+        if (data.duplicate !== undefined && isLastSegment) {+          node.duplicate = data.duplicate;+        }+      });+    }++    const rootNode = newNode(sourceRoot);++    // For every source file, apply the data to all components+    // of the source path, creating nodes as necessary.+    for (const [source, data] of Object.entries(sourcesData)) {+      addNode(source || `<unmapped>`, data, rootNode);++      // Apply the data to the rootNode.+      rootNode.resourceBytes += data.resourceBytes;+      if (data.unusedBytes) rootNode.unusedBytes = (rootNode.unusedBytes || 0) + data.unusedBytes;+    }++    /**+     * Collapse nodes that have only one child.+     * @param {Node} node+     */+    function collapse(node) {+      while (node.children && node.children.length === 1) {+        node.name += '/' + node.children[0].name;+        node.children = node.children[0].children;+      }++      if (node.children) {+        for (const child of node.children) {+          collapse(child);+        }+      }+    }+    collapse(rootNode);++    // TODO(cjamcl): Should this structure be flattened for space savings?+    // Like DOM Snapshot.+    // Less JSON (no super nested children, and no repeated property names).++    return rootNode;+  }++  /**+   * @param {LH.Artifacts} artifacts+   * @param {LH.Audit.Context} context+   * @return {Promise<RootNode[]>}+   */+  static async makeJavaScriptRootNodes(artifacts, context) {+    /** @type {RootNode[]} */+    const rootNodes = [];++    const bundles = await JsBundles.request(artifacts, context);+    const duplication = await ModuleDuplication.request(artifacts, context);++    /** @type {Array<{src: string, length: number, unusedJavascriptSummary?: import('../computed/unused-javascript-summary.js').Summary}>} */+    const scriptData = [];+    const inlineScriptData = {+      src: artifacts.URL.finalUrl,+      length: 0,+    };+    for (const scriptElement of artifacts.ScriptElements) {+      // Normalize ScriptElements so that inline scripts show up as a single entity.+      if (!scriptElement.src) {+        inlineScriptData.length += (scriptElement.content || '').length;+        continue;+      }++      const url = scriptElement.src;+      const bundle = bundles.find(bundle => url === bundle.script.src);+      const scriptCoverages = artifacts.JsUsage[url];+      if (!bundle || !scriptCoverages) continue;++      scriptData.push({+        src: scriptElement.src,+        length: (scriptElement.content || 
'').length,+        unusedJavascriptSummary:+          await UnusedJavaScriptSummary.request({url, scriptCoverages, bundle}, context),+      });+    }+    if (inlineScriptData.length) scriptData.unshift(inlineScriptData);++    for (const {src, length, unusedJavascriptSummary} of scriptData) {+      const bundle = bundles.find(bundle => bundle.script.src === src);+      const name = src;++      let node;+      if (bundle && unusedJavascriptSummary && unusedJavascriptSummary.sourcesWastedBytes) {+        /** @type {Record<string, SourceData>} */+        const sourcesData = {};+        for (const source of Object.keys(bundle.sizes.files)) {+          /** @type {SourceData} */+          const sourceData = {+            resourceBytes: bundle.sizes.files[source],+          };++          if (unusedJavascriptSummary && unusedJavascriptSummary.sourcesWastedBytes) {

both of these are already checked by the next outer conditional

connorjclark

comment created time in 5 days

Pull request review comment GoogleChrome/lighthouse

new_audit: add treemap-data to experimental

+      scriptData.push({
+        src: scriptElement.src,
+        length: (scriptElement.content || '').length,
+        unusedJavascriptSummary:
+          await UnusedJavaScriptSummary.request({url, scriptCoverages, bundle}, context),
+      });
+    }
+    if (inlineScriptData.length) scriptData.unshift(inlineScriptData);

maybe add a comment on why this should be first?

connorjclark

comment created time in 5 days

Pull request review commentGoogleChrome/lighthouse

new_audit: add treemap-data to experimental

+    if (inlineScriptData.length) scriptData.unshift(inlineScriptData);
+
+    for (const {src, length, unusedJavascriptSummary} of scriptData) {

this might be a little simpler to follow if it separated the inline/not-inline cases, something like

let inlineScriptLength = 0;
for (const scriptElement of artifacts.ScriptElements) {
  // Normalize ScriptElements so that inline scripts show up as a single entity.
  if (!scriptElement.src) {
    inlineScriptLength += (scriptElement.content || '').length;
  }
}
const name = artifacts.URL.finalUrl;
if (inlineScriptLength) {
  rootNodes.push({
    name,
    node: {
      name,
      resourceBytes: inlineScriptLength,
      unusedBytes: 0,
      executionTime: 0,
    },
  });
}

const bundles = await JsBundles.request(artifacts, context);
const duplication = await ModuleDuplication.request(artifacts, context);

for (const scriptElement of artifacts.ScriptElements) {
  if (!scriptElement.src) {
    continue;
  }

  const src = scriptElement.src;
  const bundle = bundles.find(bundle => src === bundle.script.src);
  const scriptCoverages = artifacts.JsUsage[src];
  if (!bundle || !scriptCoverages) continue;

  const length = (scriptElement.content || '').length;
  const unusedJavascriptSummary = await UnusedJavaScriptSummary.request({url: src, scriptCoverages, bundle}, context);

  const name = src;
  // The rest of the existing second loop...
}

It would make it clearer exactly what happens to the inline script info, and it would get rid of the intermediate scriptData objects that are immediately destructured, the 2x find of the bundle (and no need to check it exists twice), etc.

connorjclark

comment created time in 5 days

Pull request review commentGoogleChrome/lighthouse

new_audit: add treemap-data to experimental

+    for (const {src, length, unusedJavascriptSummary} of scriptData) {
+      const bundle = bundles.find(bundle => bundle.script.src === src);
+      const name = src;
+
+      let node;
+      if (bundle && unusedJavascriptSummary && unusedJavascriptSummary.sourcesWastedBytes) {

style nit: totally my opinion, but reversing the order of these conditionals (and possibly early push and continues?) might help with readability here too. e.g.

if (!unusedJavascriptSummary) {
  // Don't even have coverage, so only provide the byte length.
  // (is this even possible if there's a guaranteed `scriptCoverages`?)
  // ...
} else if (!unusedJavascriptSummary.sourcesWastedBytes)  {
  // Without sources, can't delve inside, so give bundle-level stats.
  // ...
} else {
  // Full source-map-based data.
}
connorjclark

comment created time in 5 days

Pull request review commentGoogleChrome/lighthouse

new_audit: add treemap-data to experimental

+      const url = scriptElement.src;
+      const bundle = bundles.find(bundle => url === bundle.script.src);
+      const scriptCoverages = artifacts.JsUsage[url];
+      if (!bundle || !scriptCoverages) continue;

when does this happen? If it happens for decent-sized scripts, should they be included in some way still? (even if just an "other" category or whatever). If it's just for type checking or shouldn't really occur, add a comment?

connorjclark

comment created time in 5 days

Pull request review commentGoogleChrome/lighthouse

new_audit: add treemap-data to experimental

+      if (!scriptElement.src) {
+        inlineScriptData.length += (scriptElement.content || '').length;
+        continue;
+      }
+
+      const url = scriptElement.src;

nit:

      const src = scriptElement.src;

seems like the more straightforward choice here :)

connorjclark

comment created time in 5 days

Pull request review commentGoogleChrome/lighthouse

new_audit: add treemap-data to experimental

+    for (const scriptElement of artifacts.ScriptElements) {
+      // Normalize ScriptElements so that inline scripts show up as a single entity.

Normalize

Combine?

connorjclark

comment created time in 5 days

Pull request review commentGoogleChrome/lighthouse

core(driver): don't clear indexedb, websql, or localstorage before run

 class GatherRunner {
     await driver.registerPerformanceObserver();
     await driver.dismissJavaScriptDialogs();
     await driver.registerRequestIdleCallbackWrap(options.settings);
-    if (resetStorage) await driver.clearDataForOrigin(options.requestedUrl);
+    if (resetStorage) {
+      const locations = await driver.getImportantLocationsNotCleared(options.requestedUrl);
+      if (locations.length) {
+        LighthouseRunWarnings.push(str_(
+          UIStrings.warningData,
+          {locations: locations.join(', '), locationCount: locations.length}

No, the locations should be replacement values, but I mean languages don't all format lists the same way, so

"localized content: Local Storage, IndexedDB, Web SQL. localized content"

(or whatever order it ends up in) may be a nonsense way of doing a list like that in that locale.

There has been an argument that what's being stated should still be understandable by someone using that locale (especially because many technical users are used to being stuck with only English in their developer tools), but in the past we've taken the position that we should stick with English for the string until we can figure out the "right" way to do it.

Another example is the multi-check-audit "failures" list:

https://github.com/GoogleChrome/lighthouse/blob/bc6ab76a33f50403e4bfc84eb069b070ed10466f/lighthouse-core/audits/multi-check-audit.js#L56-L57

The TODO points to "i18n concat'd lists" in #7238 (maybe it's finally time to do it)

adamraine

comment created time in 5 days

Pull request review commentGoogleChrome/lighthouse

core(driver): don't clear indexedb, websql, or localstorage before run

 class GatherRunner {
     await driver.registerPerformanceObserver();
     await driver.dismissJavaScriptDialogs();
     await driver.registerRequestIdleCallbackWrap(options.settings);
-    if (resetStorage) await driver.clearDataForOrigin(options.requestedUrl);
+    if (resetStorage) {
+      const locations = await driver.getImportantLocationsNotCleared(options.requestedUrl);
+      if (locations.length) {
+        LighthouseRunWarnings.push(str_(
+          UIStrings.warningData,
+          {locations: locations.join(', '), locationCount: locations.length}
+        ));
+      }

Yeah, it's probably arguable if this code makes more sense in driver than in gather-runner. It's not agnostic driving, and the smarts for how to drive should generally live more in gather-runner, but OTOH we're not agnostic about many things in driver, including which types of storage to clear or not. Until Fraggle Rock starts changing things, though, personally I think it does make sense in driver for now, as part of a pair with clearDataForOrigin.

For UIStrings: now that #10630 and IcuMessage make it easier to carry localized strings over from the gathering stage, we'll probably start to have more driver/gathering UIStrings, so it seems ok to start a new one in this PR.

adamraine

comment created time in 5 days

Pull request review commentGoogleChrome/lighthouse

core(driver): don't clear indexedb, websql, or localstorage before run

+    /** @type {string[]} */
+    const LighthouseRunWarnings = [];
+    GatherRunner.setupDriver(driver, {settings: {}}, LighthouseRunWarnings).then(_ => {
+      expect(LighthouseRunWarnings[0]).toBeDisplayString(new RegExp(

Is this a fix I should include in this PR?

It's such a small extra change, it's only for the type on code that can only ever be used in tests, and it's additive rather than changing the type, so it seems fine to me to land in this PR, but feel free to split it out if you'd prefer.

adamraine

comment created time in 5 days

Pull request review commentGoogleChrome/lighthouse

core(driver): don't clear indexedb, websql, or localstorage before run

+    /** @type {string[]} */
+    const LighthouseRunWarnings = [];
+    GatherRunner.setupDriver(driver, {settings: {}}, LighthouseRunWarnings).then(_ => {
+      expect(LighthouseRunWarnings[0]).toBeDisplayString(new RegExp(

Ah, looks like we got the type declaration wrong and we just don't type check all the other test files that use this :)

https://github.com/GoogleChrome/lighthouse/blob/bc6ab76a33f50403e4bfc84eb069b070ed10466f/types/jest.d.ts#L23

should include string |
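
In other words, roughly this change (sketched assuming the matcher is declared in the usual jest Matchers augmentation; the exact surrounding declaration may differ):

declare namespace jest {
  interface Matchers<R> {
    toBeDisplayString: (expected: string | RegExp) => R;
  }
}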

adamraine

comment created time in 5 days

Pull request review commentGoogleChrome/lighthouse

core(driver): don't clear indexedb, websql, or localstorage before run

 class Driver {
   async clearDataForOrigin(url) {
     const origin = new URL(url).origin;
 
-    // Clear all types of storage except cookies, so the user isn't logged out.
+    // Clear some types of storage.
+    // Cookies are not cleared, so the user isn't logged out.
+    // indexeddb, websql, and localstorage are not cleared to prevent loss of potentially important data.
     //   https://chromedevtools.github.io/debugger-protocol-viewer/tot/Storage/#type-StorageType
     const typesToClear = [
       'appcache',
       // 'cookies',
       'file_systems',
-      'indexeddb',

do we have an existing test that some storage types aren't cleared that these can be added to so we get some assurance we don't accidentally regress at some point?
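
If not, a rough sketch of the kind of assertion that could cover it (this assumes clearDataForOrigin issues a Storage.clearDataForOrigin command, and borrows the createMockSendCommandFn/findInvocation helpers used by the existing driver tests):

it('does not clear indexeddb, websql, or localstorage', async () => {
  const sendCommandMock = createMockSendCommandFn()
    .mockResponse('Storage.clearDataForOrigin', {});
  connectionStub.sendCommand = sendCommandMock;

  await driver.clearDataForOrigin('https://example.com/');

  // findInvocation is assumed to return the params sent with the command.
  const {storageTypes} = sendCommandMock.findInvocation('Storage.clearDataForOrigin');
  for (const keptType of ['indexeddb', 'websql', 'local_storage']) {
    expect(storageTypes).not.toContain(keptType);
  }
});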

adamraine

comment created time in 5 days

Pull request review commentGoogleChrome/lighthouse

core(driver): don't clear indexedb, websql, or localstorage before run

 class GatherRunner {
     await driver.registerPerformanceObserver();
     await driver.dismissJavaScriptDialogs();
     await driver.registerRequestIdleCallbackWrap(options.settings);
-    if (resetStorage) await driver.clearDataForOrigin(options.requestedUrl);
+    if (resetStorage) {
+      const locations = await driver.getImportantLocationsNotCleared(options.requestedUrl);
+      if (locations.length) {
+        LighthouseRunWarnings.push(str_(
+          UIStrings.warningData,
+          {locations: locations.join(', '), locationCount: locations.length}

I don't believe we can concat a list of names like this, especially in the middle of a string. We have a few options, e.g. there are only three possible values, so having three separate strings:

const warningData1 = 'There may be important data in this location: {location}. Audit...';
const warningData2 = 'There may be important data in these locations: {location1}, {location2}. Audit...';
const warningData3 = 'There may be important data in these locations: {location1}, {location2}, {location3}. Audit...';

There's also Intl.ListFormat, but that's not available until Node 12 (and supposedly there may still be issues with that... I'm not sure how we'd communicate to the translators that it will be used, for instance).
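
For reference, a rough sketch of what the Intl.ListFormat route could look like (assuming a Node 12+ runtime; locale stands in for whatever locale the run is using, and the names are just the three values from this PR):

const listFormat = new Intl.ListFormat(locale, {style: 'long', type: 'conjunction'});
const locations = listFormat.format(['Local Storage', 'IndexedDB', 'Web SQL']);
// For 'en' this yields 'Local Storage, IndexedDB, and Web SQL'; other locales get their
// own separators and conjunctions, so only the surrounding sentence needs translating.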

Maybe there are other options, @exterkamp?

adamraine

comment created time in 5 days

Pull request review commentGoogleChrome/lighthouse

core(driver): don't clear indexedb, websql, or localstorage before run

 class Driver {
     }
   }
 
+  /**
+   * @param {string} url
+   * @return {Promise<string[]>}
+   */
+  async getImportantLocationsNotCleared(url) {
+    const usageData = await this.sendCommand('Storage.getUsageAndQuota', {
+      origin: url,
+    });
+    const locations = usageData.usageBreakdown.filter(usage => usage.usage)
+      .map(usage => {
+        switch (usage.storageType) {
+          case 'local_storage':
+            return 'Local Storage';
+          case 'indexeddb':
+            return 'IndexedDB';
+          case 'websql':
+            return 'Web SQL';
+          default:
+            return '';
+        }
+      })
+      .filter(resourceString => resourceString);

something like

const storageTypeNames = {
  local_storage: 'Local Storage',
  indexeddb: 'IndexedDB',
  websql: 'Web SQL',
};
const locations = usageData.usageBreakdown
  .filter(usage => usage.usage)
  .map(usage => storageTypeNames[usage.storageType] || '')
  .filter(Boolean);

might be a little clearer in what the code is up to? I assume there will be a tsc issue to be resolved on the storageTypeNames[usage.storageType] part, though.
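
(One way that tsc complaint could be handled, sketched here with a plain JSDoc index-signature type rather than whatever shape the PR ends up with:)

/** @type {Record<string, string|undefined>} */
const storageTypeNames = {
  local_storage: 'Local Storage',
  indexeddb: 'IndexedDB',
  websql: 'Web SQL',
};
// storageTypeNames[usage.storageType] is then string|undefined, and the `|| ''` plus the
// final filter keep the mapped result a plain string[].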

adamraine

comment created time in 5 days

Pull request review commentGoogleChrome/lighthouse

core(driver): don't clear indexedb, websql, or localstorage before run

 class GatherRunner {
     await driver.registerPerformanceObserver();
     await driver.dismissJavaScriptDialogs();
     await driver.registerRequestIdleCallbackWrap(options.settings);
-    if (resetStorage) await driver.clearDataForOrigin(options.requestedUrl);
+    if (resetStorage) {
+      const locations = await driver.getImportantLocationsNotCleared(options.requestedUrl);
+      if (locations.length) {
+        LighthouseRunWarnings.push(str_(
+          UIStrings.warningData,
+          {locations: locations.join(', '), locationCount: locations.length}
+        ));
+      }

wdyt about wrapping this up into the driver function (driver.getImportantStorageWarning(url, runWarnings) or whatever)? I don't imagine we're going to be using getImportantLocationsNotCleared in other ways, so it makes some sense to keep the warning generation code all together

adamraine

comment created time in 5 days

Pull request review commentGoogleChrome/lighthouse

core(driver): don't clear indexedb, websql, or localstorage before run

+    /** @type {string[]} */
+    const LighthouseRunWarnings = [];
+    GatherRunner.setupDriver(driver, {settings: {}}, LighthouseRunWarnings).then(_ => {

nit: generally async/await is better for new code (though this file does have its fair share of old-ish promise code), otherwise need to return this promise so jest knows what to wait on
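
Something along these lines is what that amounts to (setup elided; the assertion here is just a placeholder for the real one):

it('warns if important data not cleared may impact performance', async () => {
  // ...same driver and connectionStub setup as above...
  /** @type {string[]} */
  const LighthouseRunWarnings = [];
  await GatherRunner.setupDriver(driver, {settings: {}}, LighthouseRunWarnings);
  expect(LighthouseRunWarnings).toHaveLength(1);
});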

adamraine

comment created time in 5 days

Pull request review commentGoogleChrome/lighthouse

core(driver): don't clear indexedb, websql, or localstorage before run

 class GatherRunner {
   /**
    * @param {Driver} driver
    * @param {{requestedUrl: string, settings: LH.Config.Settings}} options
+   * @param {string[]} LighthouseRunWarnings

you'll need to update from master, LighthouseRunWarnings isn't a string[] anymore after #10630

adamraine

comment created time in 5 days

Pull request review commentGoogleChrome/lighthouse

core(driver): don't clear indexedb, websql, or localstorage before run

+    GatherRunner.setupDriver(driver, {settings: {}}, LighthouseRunWarnings).then(_ => {
+      expect(LighthouseRunWarnings[0]).toBeDisplayString(new RegExp(

why a RegExp instead of just the string literal?

adamraine

comment created time in 5 days

Pull request review commentGoogleChrome/lighthouse

docs(auth): add setCookie example

 # Using Puppeteer with Lighthouse
 
+## Recipes
+
+### [Using Puppeteer for authenticated pages](./recipes/auth/README.md)

I see the value of bumping their visual prominence a bit over regular text-sized bullets.

Do they look weird enough to you to change? :) I'll accept whatever suggestion you put up

No suggestion, sorry, and I figured that was the reasoning. It is a little awkward looking, but it's also functional and clear about what it does, so that's why I said it's not terribly important :)

patrickhulce

comment created time in 7 days

Pull request review commentGoogleChrome/lighthouse

docs(auth): add setCookie example

 # Using Puppeteer with Lighthouse
 
+## Recipes
+
+### [Using Puppeteer for authenticated pages](./recipes/auth/README.md)

would these be better as bullet points or something? IMO they look kind of weird as linked headings with no content, but the formatting is also not terribly important.

patrickhulce

comment created time in 7 days

Pull request review commentGoogleChrome/lighthouse

core(js-bundles): return error object when sizes cannot be determined

 declare global {
           files: Record<string, number>;
           unmappedBytes: number;
           totalBytes: number;
-        };
+        } | {error: string};

Ah yeah, you'd want undefined, not never (since never disappears in unions)

export type Bundle = {
  sizes: {
    // ...
  };
  errorMessage?: undefined;
} | {
  errorMessage: string;
}

but because errorMessage is a string and '' is falsy, you have to do a full check for undefined to discriminate:

/** @type {LH.Artifacts.Bundle} */
const bundle;
if (bundle.errorMessage === undefined) {
  // we know bundle.sizes exists
}

which also feels kind of gross, so pick a flavor of awkwardness for this :)

connorjclark

comment created time in 8 days

Pull request review commentGoogleChrome/lighthouse

core(js-bundles): return error object when sizes cannot be determined

 declare global {
           files: Record<string, number>;
           unmappedBytes: number;
           totalBytes: number;
-        };
+        } | {error: string};

Belay that, this doesn't even work for NumericProduct, but I think we can do better here. I'll play around with it, but per the above, the current approach works fine too.

connorjclark

comment created time in 8 days

Pull request review commentGoogleChrome/lighthouse

core(js-bundles): return error object when sizes cannot be determined

 function computeGeneratedFileSizes(map, content) {
      const line = lines[lineNum];
     if (line === null) {
-      log.error('JSBundles', `${map.url()} mapping for line out of bounds: ${lineNum + 1}`);
-      return failureResult;
+      const error = `${map.url()} mapping for line out of bounds: ${lineNum + 1}`;
+      log.error('JSBundles', error);
+      return {error};

good point, codecov, should have a test for this case :)

connorjclark

comment created time in 8 days

Pull request review commentGoogleChrome/lighthouse

core(js-bundles): return error object when sizes cannot be determined

 declare global {
           files: Record<string, number>;
           unmappedBytes: number;
           totalBytes: number;
-        };
+        } | {error: string};

open to some other format for this. ideas?

Checking with the 'in' operator is kind of gross and ancient-feeling js, but it's also not the worst thing ever.

The other approach would be something like what is done with SourceMap above this or NumericProduct/NonNumericProduct:

export interface Bundle {
  rawMap: RawSourceMap;
  script: ScriptElement;
  map: TextSourceMap;
  sizes: {
    files: Record<string, number>;
    unmappedBytes: number;
    totalBytes: number;
  };
  errorMessage?: never;
} | {
  rawMap: RawSourceMap;
  script: ScriptElement;
  map: TextSourceMap;
  errorMessage: string;
}

more verbose in the type definition, but here the if (!bundle.errorMessage) { /* we know bundle.sizes exists */ } check is more idiomatic and is still required before tsc will let code access bundle.sizes.

connorjclark

comment created time in 8 days

issue commentGoogleChrome/lighthouse

PSI fetch bad url

When I instrument afterPass network requests, for this site (at least) it happens in both the ImageElements and TraceElements gatherers due to how we create the outerHTML snippet:

https://github.com/GoogleChrome/lighthouse/blob/6062f8ba305026a75904b984332d56850090be93/lighthouse-core/lib/page-functions.js#L127-L143

The clone of an img element is still an img element, so when the src attribute is set to the truncated src.slice(0, ATTRIBUTE_CHAR_LIMIT - 1) + '…', the page immediately tries to load an image from the new (bad) url.

This request doesn't affect anything, it's just a cloned node that's not in the DOM and also isn't used by Lighthouse for anything but the snippet, but it's also kind of weird. I think we'd need to take a pretty different approach to creating the snippet (or never truncate src, which probably isn't workable) to fix it, though.
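
A stripped-down illustration of the behavior (not the actual page-functions code; the 75-character cut-off is just a stand-in for ATTRIBUTE_CHAR_LIMIT):

// A detached clone of an <img> is still an image element, so assigning a new src
// starts a fetch even though the clone is never inserted into the document.
const img = document.querySelector('img');
const clone = img.cloneNode(true);
const truncatedSrc = clone.getAttribute('src').slice(0, 74) + '…';
clone.setAttribute('src', truncatedSrc); // the browser immediately requests this bogus URL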

Lofesa

comment created time in 8 days

Pull request review commentGoogleChrome/lighthouse

misc: add missing space in description

 const UIStrings = {
   /** Description of a Lighthouse audit that tells the user *why* HTTPS use *for all resources* is important. This is displayed after a user expands the section to see more. No character length limits. 'Learn More' becomes link text to additional documentation. */
   description: 'All sites should be protected with HTTPS, even ones that don\'t handle ' +
       'sensitive data. This includes avoiding [mixed content](https://developers.google.com/web/fundamentals/security/prevent-mixed-content/what-is-mixed-content), ' +
-      'where some resources are loaded over HTTP despite the initial request being served' +
+      'where some resources are loaded over HTTP despite the initial request being served ' +

our lint settings already allow this string to break max-len because it has a url in it :)

qwright10

comment created time in 8 days

PullRequestReviewEvent

issue commentGoogleChrome/lighthouse

Eliminate without-javascript audit's dependence on the HTTP redirect pass

I propose just converting the without-javascript check to look for a <noscript> with some textContent

That does mean e.g. example.com (and any other completely static site) would fail the audit :/ I would guess that those are more common than sites not serving something on http (whether or not it's a redirect).
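For reference, the proposed check would presumably boil down to something like this (hypothetical sketch, not the current audit):

/**
 * @param {Document} doc
 * @return {boolean}
 */
function hasNoscriptFallback(doc) {
  // Pass if any <noscript> contains non-whitespace fallback content.
  return [...doc.querySelectorAll('noscript')]
    .some(noscriptEl => noscriptEl.textContent.trim().length > 0);
}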

Another option worth considering is retiring the without-javascript audit in the next major release. No one has ever really expressed interest or ideas on how to make this obviously ham-fisted audit better and actually test what it claims to be doing. Maybe it's not worth keeping unless (and until) some new approach is developed?

GlenKPeterson

comment created time in 11 days

push eventGoogleChrome/lighthouse

Brendan Kenny

commit sha 378a31f8117d20c852562514612c80ea12892c54

i18n: use IcuMessage objects instead of string IDs (#10630)

view details

push time in 12 days

delete branch GoogleChrome/lighthouse

delete branch : icuobjects

delete time in 12 days

PR merged GoogleChrome/lighthouse

i18n: use IcuMessage objects instead of string IDs cla: yes waiting4reviewer

As discussed in #10614 and offline, this moves us to using objects for localizable strings. This has several advantages (listed in that issue) and, it turns out, few disadvantages.

Proposed object:

type IcuMessage = {
  /** The id locating this message in the locale message json files. */
  i18nId: string;
  /** The dynamic values, if any, to insert into the localized string. */
  values?: Record<string, string | number>;
  /** A formatted version of the string, usually as found in a file's `UIStrings` entry, used as backup. */
  formattedDefault: string,
};

Biggest change from the proposal is formattedDefault, which will always be present. This is basically what swap-locale already does (it uses the string already in the LHR as a backup if Lighthouse has changed enough that the message ID no longer refers to a translated string), but here it also enables serialization at any stage and serves as the UIStrings backup during development (allowing use of a string without having to run yarn update:sample-json after every edit).
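Roughly, resolving an IcuMessage for display looks like this (a sketch of the lookup order only; formatMessage and the localeMessages parameter are assumed stand-ins, not actual i18n.js names):

/**
 * @param {IcuMessage} icuMessage
 * @param {Record<string, {message: string}>} localeMessages
 * @return {string}
 */
function resolveIcuMessage(icuMessage, localeMessages) {
  const {i18nId, values, formattedDefault} = icuMessage;
  const localized = localeMessages[i18nId];
  // If the id no longer exists in the locale files (or we're mid-development),
  // fall back to the pre-formatted default carried in the LHR.
  if (!localized) return formattedDefault;
  return formatMessage(localized.message, values); // assumed ICU formatting helper
}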

The PR changes are a little spread out, but the substantial changes are in

  • i18n.js
  • i18n.d.ts
  • 'swap-locale.js`
  • runner.js
  • details-renderer.js
  • audit.d.ts, audit-details.d.ts, lhr.d.ts

more or less in that order for significance of the changes.
+708 -390

10 comments

56 changed files

brendankenny

pr closed time in 12 days

PR opened GoogleChrome/lighthouse

deps: update transitive lodash

see: https://github.com/GoogleChrome/lighthouse/pull/11441#issuecomment-693699747

+4 -9

0 comments

1 changed file

pr created time in 12 days

create branch GoogleChrome/lighthouse

branch : lodeps

created branch time in 12 days

pull request commentGoogleChrome/lighthouse

deps(inquirer): upgrade to 7.3.3

(but this PR also isn't updating lodash)

y'all... :)

connorjclark

comment created time in 12 days

Pull request review commentGoogleChrome/lighthouse

i18n: use IcuMessage objects instead of string IDs

 ViewportDimensions: {
 }

-content-width: pass
+content-width: pass undefined

audit.explanation is optional, so can be undefined :)

It was the empty string before...not sure that's better :)
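i.e. the expectation line is presumably built with plain string interpolation, so an absent explanation now renders as the literal 'undefined':

const audit = {id: 'content-width', result: 'pass', explanation: undefined};
console.log(`${audit.id}: ${audit.result} ${audit.explanation}`);
// -> 'content-width: pass undefined'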

brendankenny

comment created time in 12 days

PullRequestReviewEvent

push eventGoogleChrome/lighthouse

Brendan Kenny

commit sha f377a812959c634f3caa52e544553880ee2bd560

devtools expectations

view details

push time in 12 days

issue commentGoogleChrome/lighthouse

DevTools webtests improvements

wdyt about dropping the need for wget?

Sucks to have to install it so some scripts can basically curl some things :)

I don't see any unusual things going on, seems like it would be a simple replacement?

patrickhulce

comment created time in 12 days

Pull request review commentGoogleChrome/lighthouse

misc: yarn open-devtools

+#!/usr/bin/env bash
+
+set -euo pipefail
+
+##
+# @license Copyright 2020 The Lighthouse Authors. All Rights Reserved.
+# Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0
+# Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
+##
+
+if [[ -z $CHROME_PATH ]]
+then
+  echo 'Must set $CHROME_PATH'
+  exit 1
+fi
+
+export DEVTOOLS_PATH=${DEVTOOLS_PATH:-"$HOME/src/devtools/devtools-frontend"}

+1. As an amateur devtools builder/tester, my use case will probably be opening the thing that just failed in yarn test-devtools 100% of the time.

connorjclark

comment created time in 12 days

PullRequestReviewEvent

push eventGoogleChrome/lighthouse

Brendan Kenny

commit sha 9898a9fe5af9c832c39787ea76ad33b5eb76e338

review suggestion

Co-authored-by: Patrick Hulce <patrick.hulce@gmail.com>

view details

push time in 12 days

pull request commentGoogleChrome/lighthouse

i18n: use IcuMessage objects instead of string IDs

OK:

  • moved isStringOrIcuMessage to i18n.js so config-helpers and config-plugin can use it
  • config-plugin now allows IcuMessage for the relevant config strings
  • added a config-plugin test with a localizable plugin
  • temporarily switched all the string | IcuMessage types to just IcuMessage and verified there aren't any other places that should account for IcuMessage but currently don't
brendankenny

comment created time in 12 days

push eventGoogleChrome/lighthouse

Brendan Kenny

commit sha 28d0e41ba3ee404078d7958b7f4a2f39b2c797ee

handle localized plugins

view details

push time in 12 days

pull request commentGoogleChrome/lighthouse

i18n: use IcuMessage objects instead of string IDs

weird, but that's great, that really is something that needs to be updated. Any ideas on how we could pipe startup errors like that back to github actions instead of timing out?

brendankenny

comment created time in 12 days

pull request commentGoogleChrome/lighthouse

i18n: use IcuMessage objects instead of string IDs

hmm, it's the same key and commit that passed on the latest commit on master, but caches are hard, so I'll try it :)

brendankenny

comment created time in 12 days

pull request commentGoogleChrome/lighthouse

i18n: use IcuMessage objects instead of string IDs

@connorjclark smoke-devtools times out no matter how many times I run this (and it's rebased against latest). Any ideas?

brendankenny

comment created time in 13 days

Pull request review commentGoogleChrome/lighthouse

misc(build): minify bundles with terser, emit source map

 const isDevtools = file =>
 /** @param {string} file */
 const isLightrider = file => path.basename(file).includes('lightrider');

-const BANNER = `// lighthouse, browserified. ${VERSION} (${COMMIT_HASH})\n` +
-  '// @ts-nocheck\n'; // To prevent tsc stepping into any required bundles.

it only ever worked when the entire bundle was on a single line

@ts-nocheck definitely works for multi-line files, we use it in several places for that...

connorjclark

comment created time in 13 days

Pull request review commentGoogleChrome/lighthouse

misc(build): minify bundles with terser, emit source map

     "@wardpeet/brfs": "2.1.0-0",     "angular": "^1.7.4",     "archiver": "^3.0.0",-    "babel-core": "^6.26.0",-    "babel-plugin-syntax-async-generators": "^6.13.0",-    "babel-plugin-syntax-object-rest-spread": "^6.13.0",     "browserify": "^16.2.3",+    "browserify-banner": "^1.0.15",+    "bundlesize": "^0.18.0",

what now

:)

connorjclark

comment created time in 13 days

Pull request review commentGoogleChrome/lighthouse

misc(build): minify bundles with terser, emit source map

 async function browserifyFile(entryPath, distPath) {
     writeStream.on('finish', resolve);
     writeStream.on('error', reject);

-    bundleStream.pipe(writeStream);
+    bundleStream
+      // Extract the inline source map to an external file.
+      .pipe(exorcist(`${distPath}.map`))
+      .pipe(writeStream);
   });
 }

 /**
- * Minimally minify a javascript file, in place.
+ * Minify a javascript file, in place.
  * @param {string} filePath
  */
 function minifyScript(filePath) {
-  const opts = {
-    compact: true, // Do not include superfluous whitespace characters and line terminators.
-    retainLines: true, // Keep things on the same line (looks wonky but helps with stacktraces)
-    comments: false, // Don't output comments
-    shouldPrintComment: () => false, // Don't include @license or @preserve comments either
-    plugins: [
-      'syntax-object-rest-spread',
-      'syntax-async-generators',
-    ],
-    // sourceMaps: 'both'
-  };
+  const result = terser.minify(fs.readFileSync(filePath, 'utf-8'), {
+    output: {
+      comments: /^!/,
+      // @ts-ignore - terser types are whack-a-doodle wrong.
      // @ts-expect-error - terser types are whack-a-doodle wrong.
connorjclark

comment created time in 13 days

Pull request review commentGoogleChrome/lighthouse

misc(build): minify bundles with terser, emit source map

 async function browserifyFile(entryPath, distPath) {
     writeStream.on('finish', resolve);
     writeStream.on('error', reject);

-    bundleStream.pipe(writeStream);
+    bundleStream
+      // Extract the inline source map to an external file.
+      .pipe(exorcist(`${distPath}.map`))
+      .pipe(writeStream);
   });
 }

 /**
- * Minimally minify a javascript file, in place.
+ * Minify a javascript file, in place.
  * @param {string} filePath
  */
 function minifyScript(filePath) {
-  const opts = {
-    compact: true, // Do not include superfluous whitespace characters and line terminators.
-    retainLines: true, // Keep things on the same line (looks wonky but helps with stacktraces)
-    comments: false, // Don't output comments
-    shouldPrintComment: () => false, // Don't include @license or @preserve comments either
-    plugins: [
-      'syntax-object-rest-spread',
-      'syntax-async-generators',
-    ],
-    // sourceMaps: 'both'
-  };
+  const result = terser.minify(fs.readFileSync(filePath, 'utf-8'), {
+    output: {
+      comments: /^!/,
+      // @ts-ignore - terser types are whack-a-doodle wrong.
+      max_line_len: /** @type {boolean} */ (1000),
+    },
+    // The config relies on class names for gatherers.
+    keep_classnames: true,
+    // Runtime.evaluate errors if function names are elided.
+    keep_fnames: true,
+    sourceMap: {
+      content: JSON.parse(fs.readFileSync(`${filePath}.map`, 'utf-8')),
+      url: path.basename(`${filePath}.map`),
+    },
+  });
+  if (result.error) {
+    throw result.error;
+  }

-  // Add the banner and modify globals for DevTools if necessary
-  let minified = BANNER + babel.transformFileSync(filePath, opts).code;
-  if (isDevtools(filePath)) {
-    assert.ok(minified.includes('\nrequire='), 'missing browserify require stub');
-    minified = minified.replace('\nrequire=', '\nglobalThis.require=');
-    assert.ok(!minified.includes('\nrequire='), 'contained unexpected browserify require stub');
+  // Add the banner and modify globals for DevTools if necessary.
+  // This will mess up the source map for the first line, but we don't ship
+  // the map to DevTools, so that's fine.

what does that leave that gets a source map?

connorjclark

comment created time in 13 days

PullRequestReviewEvent
PullRequestReviewEvent

push eventGoogleChrome/lighthouse

Connor Clark

commit sha c52552b9e01ae577284957b83139ac25d7aa0e9a

tests: check for dependencies when setting up blink tools (#11437)

view details

adrianaixba

commit sha c6e753c191ac96300a931f2a4c434083590b265f

core: normalize node information in gathering (#11405)

view details

adrianaixba

commit sha 039b6c6e0826d8b09e145e6ffd5253724b497021

core(password-inputs-can-be-pasted-into): add devtoolsNodePath (#11416)

Uses our snippet for axe nodes instead of aXe's.

view details

Brendan Kenny

commit sha 30c5ec224521449153b0a50571a41e93703f904a

i18n: use IcuMessage objects instead of string IDs

view details

Brendan Kenny

commit sha 4e614f01ba4a08073003dcd00161be9bbc34c063

initial feedback

view details

Brendan Kenny

commit sha 686354910f34fa6d8e60d609c93bbe35f36654b2

more tests

view details

Brendan Kenny

commit sha 5c297f85b776fca768b75b9cd9a50c06394c496c

add serialization test

view details

Brendan Kenny

commit sha 4455f242534875feeed3279da23d20cd51d7f0d2

update readme

view details

push time in 13 days

pull request commentGoogleChrome/lighthouse

report: add score shapes to legend

nice!

adrianaixba

comment created time in 13 days

pull request commentGoogleChrome/lighthouse

i18n: use IcuMessage objects instead of string IDs

outstanding TODOs are complete, ready for review :)

brendankenny

comment created time in 13 days

PullRequestReviewEvent