
GoogleChrome/lighthouse 20771

Automated auditing, performance metrics, and best practices for the web.

aFarkas/html5shiv 9717

This script is the de facto way to enable use of HTML5 sectioning elements in legacy Internet Explorer.

google/ios-webkit-debug-proxy 4958

A DevTools proxy (Chrome Remote Debugging Protocol) for iOS devices (Safari Remote Web Inspector).

ChromeDevTools/awesome-chrome-devtools 4300

Awesome tooling and resources in the Chrome DevTools & DevTools Protocol ecosystem

GoogleChrome/lighthouse-ci 3528

Automate running Lighthouse for every commit, viewing the changes, and preventing regressions

csnover/TraceKit 895

Attempts to create stack traces for unhandled JavaScript exceptions in all major browsers.

borismus/device.js 768

Semantic client-side device detection with Media Queries

GoogleChrome/chrome-launcher 663

Launch Google Chrome with ease from node.

GoogleChrome/devtools-docs 610

The legacy documentation for Chrome DevTools.

benschwarz/metaquery 329

A declarative responsive web design syntax. Breakpoints, defined in `<meta>`

push event ChromeDevTools/devtools-protocol

devtools-bot

commit sha 686ce105e0f78909b4c4b2c35d99c20b4ae77e26

Updates

view details

push time in 5 hours

Pull request review comment GoogleChrome/lighthouse

report: reuse generalized clumping for perf category

 class PerformanceCategoryRenderer extends CategoryRenderer {
     // Filmstrip
     const timelineEl = this.dom.createChildOf(element, 'div', 'lh-filmstrip-container');
-    const thumbnailAudit = category.auditRefs.find(audit => audit.id === 'screenshot-thumbnails');
-    const thumbnailResult = thumbnailAudit && thumbnailAudit.result;
-    if (thumbnailResult && thumbnailResult.details) {
-      timelineEl.id = thumbnailResult.id;
-      const filmstripEl = this.detailsRenderer.render(thumbnailResult.details);
-      filmstripEl && timelineEl.appendChild(filmstripEl);
+    // We only expect one of these, but the renderer will support multiple
+    const thumbnailAudits = category.auditRefs.filter(audit => audit.group === 'filmstrip');

yeah this would break formatting the v6 reports in a new renderer. good call. i'll add a compat bit to also handle the screenshot-thumbnails audit

paulirish

comment created time in 16 hours

PullRequestReviewEvent

issue comment GoogleChrome/lighthouse

PSI fetch bad url

image elements have some atypical characteristics when it comes to loading. eg.

img = new Image();
img instanceof HTMLImageElement // true.. yes, the Image() constructor is a full-fledged html element
img.src = 'stuff' // immediately kicks off this network request, even tho this image isn't part of the DOM

(i think part of this is legacy web stuff)

i think we can get around this with the use of a <template> element.

const clone = element.cloneNode(); 

+ const temp = document.createElement('template')
+ temp.content.append(clone)

clone.setAttribute(.....

within that .content prop, the content is inert, so we can do whatever we want with no side effects.

Lofesa

comment created time in 16 hours

Pull request review comment GoogleChrome/lighthouse

misc(build): give build-tracker a shared git history on PRs

 jobs:
     - run: yarn i18n:checks
     - run: yarn dogfood-lhci
-    # buildtracker runs `git merge-base HEAD origin/master` which needs more history than depth=1. https://github.com/paularmstrong/build-tracker/issues/106
-    - name: Deepen git fetch (for buildtracker)
-      run: git fetch --deepen=100
+    # buildtracker needs history and a common merge commit.

we want master for time-series historical data

we want PRs for budgets. once bt is running successfully i can add bundle size budgets, which is the last remaining item to complete the move from the old bundlesize module to this. the buildtracker CLI won't exit with a non-zero code even when budgets fail.. but i'll likely post to the status API to express the failure state

paulirish

comment created time in 18 hours

PullRequestReviewEvent

Pull request review comment GoogleChrome/lighthouse

misc(build): give build-tracker a shared git history on PRs

 jobs:
     - run: yarn i18n:checks
     - run: yarn dogfood-lhci
-    # buildtracker runs `git merge-base HEAD origin/master` which needs more history than depth=1. https://github.com/paularmstrong/build-tracker/issues/106
-    - name: Deepen git fetch (for buildtracker)
-      run: git fetch --deepen=100
+    # buildtracker needs history and a common merge commit.

this script is mostly necessary for PRs.

though on master we need just one command... the git fetch --deepen. the bash script exits early (line 26) in this case.

paulirish

comment created time in 19 hours

PullRequestReviewEvent
PullRequestReviewEvent

issue comment GoogleChrome/lighthouse

Request for metrics that are inclusive to Assistive Technology

The AcT (Accessibility Tree) is a well-defined thing and we have decent observability on it. Not as good as the DOM tree, but pretty good. I like the idea of getting a Time To A11y Tree First Built metric. Though keep in mind the tree will keep changing as scripts load in and content is added to the page.

Adding another possibility to the brainstorm, I can imagine a metric that considers how quickly the AcT settles into its "final" position (defining "final" is TBD, much like "fully loaded"). It could be computed much like Speed Index, assuming there's a decent calculation for determining tree similarity.
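A naive sketch of what such a tree-similarity calculation could look like (the node shape and the `treeSimilarity` helper here are made up for illustration, loosely modeled on what `Accessibility.getFullAXTree` returns):

```javascript
// Hypothetical sketch: score how "settled" an accessibility tree snapshot is
// versus the final snapshot, Speed Index-style. The {role, name, children}
// node shape is an assumption, not the real protocol type.

// Flatten a tree into a list of "role:name" strings for a cheap bag-of-nodes compare.
function flatten(node, out = []) {
  out.push(`${node.role}:${node.name || ''}`);
  for (const child of node.children || []) flatten(child, out);
  return out;
}

// Naive similarity: fraction of the final tree's nodes already present.
// (Ignores structure and duplicates; good enough for a first prototype.)
function treeSimilarity(snapshot, finalSnapshot) {
  const finalNodes = flatten(finalSnapshot);
  const seen = new Set(flatten(snapshot));
  const shared = finalNodes.filter(key => seen.has(key)).length;
  return shared / finalNodes.length;
}

const early = {role: 'WebArea', name: 'page', children: [
  {role: 'heading', name: 'Hello'},
]};
const final = {role: 'WebArea', name: 'page', children: [
  {role: 'heading', name: 'Hello'},
  {role: 'button', name: 'Sign up'},
  {role: 'link', name: 'Docs'},
]};

console.log(treeSimilarity(early, final)); // 2 of 4 final nodes present -> 0.5
```

Polling snapshots and feeding each one through a function like this would give a settle-over-time curve to integrate, much like Speed Index does with visual frames.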


A note on the instrumentation that currently exists:

puppeteer actually has some great work, culminating in the accessibility.snapshot() method. Behind the scenes, it uses Accessibility.getFullAXTree from the devtools protocol, plus some more work to flesh out a solid picture of the AcT.

The protocol (and thus pptr) doesn't have events that indicate "AccessibilityTreeChanged", so right now, in order to understand how the tree changes, it'd need to be polled. Hopefully what @LJWatson said about the perf hit indicates that polling would be decently performant. Regardless, we're in a lab scenario, so there's no user-perceivable impact anyhow. :) If this exploration works out, perhaps some "change" events could be added to the protocol so the approach could be optimized a bit.


I think some prototyping here is the next best step.

With some straightforward puppeteer scripting, someone can make a basic Time To First AcT metric and also explore the AcT Speed Index idea. Once built, there's always a good amount of metric validation necessary to understand how well the numbers we get track the intent of the metric. Testing on a variety of webpages/webapps is key here.

I'm happy to give some guidance if anyone has questions about the protocol underpinnings here.

scottjehl

comment created time in 19 hours

issue closed GoogleChrome/lighthouse

Paint performance audit


Feature request summary: Include paint performance audits in Lighthouse.

What is the motivation or use case for changing this? Besides optimizing for the network and CPU, optimizing for render performance is a valid area of performance tuning. It is also one of the hardest categories to measure and analyze, and therefore awareness is low.

I'm just raising it as a general question as to whether this category will be considered. If yes, some early ideas for a possible implementation:

  • Express paint performance score (not sure what metric can do that)
  • Reporting unusual paint metrics (for example excessive repaints)
  • Check against known anti-patterns
  • Advise on more performant approaches

How is this beneficial to Lighthouse? It would make the very difficult category of render performance visible and accessible to developers. I am aware that paint performance can be checked in devtools, yet I still find it quite hard to understand.

closed time in 19 hours

fchristant

issue comment GoogleChrome/lighthouse

Paint performance audit

This is a tricky area, mostly because there aren't widely accepted metrics around paint/rendering perf that are representative of user perception. I know some Chromium folks have been working on "frame throughput" metrics.

But most of the investigations I know of in this area aren't specific to paint itself, but rather to some portion of the rendering pipeline that excludes JS. Even doing that is hard, since one of the biggest challenges is layout thrashing, which is entirely due to JS.

Overpainting is a thing, but it's typically not very actionable by the web developer; more often it's the browser's responsibility to avoid rendering costs for pixels that are occluded.

I've seen cases where rendering a complex SVG is quite expensive and it's possible to invalidate those paints regularly, thus making paint a heavy cost for the lifetime of the page. Though I think these situations are fairly rare.

Right now we're going to focus on summarizing runtime costs effectively. the "Minimizes main-thread work" audit captures much of this. If there are cases where "Paint" is > 20% of costs, then it certainly requires a more in-depth look, but I suspect that's an edge case.

fchristant

comment created time in 19 hours

push event ChromeDevTools/devtools-protocol

devtools-bot

commit sha 88c24a68b54f8b7aee1bcba8787cd4827bb6ba9d

Updates

view details

push time in 4 days

push event ChromeDevTools/devtools-protocol

devtools-bot

commit sha b5c72990331d9174de3a7536c26e36430503f5b0

Updates

view details

push time in 5 days

pull request comment ChromeDevTools/debugger-protocol-viewer

Update path to browser_protocol.pdl.

heh, love his writing. thanks @jeremyroman !

jeremyroman

comment created time in 5 days

push event ChromeDevTools/debugger-protocol-viewer

Jeremy Roman

commit sha 1c10bbb9d5cb7f9a551383e388a184f01fbf12ee

Update path to browser_protocol.pdl. (#161)

view details

push time in 5 days

PR merged ChromeDevTools/debugger-protocol-viewer

Update path to browser_protocol.pdl.

This moved in 2019 and apparently someone has noticed:

https://mango.pdf.zone/stealing-chrome-cookies-without-a-password#sidenote-the-chrome-dev-tools-are-absolutely-loose:~:text=Finally%2C%20they%20have%20a%20section%20called,to%20read%2C%20both%20of%20which%20404.

+1 -1

1 comment

1 changed file

jeremyroman

pr closed time in 5 days

PullRequestReviewEvent

Pull request review comment GoogleChrome/lighthouse

report: reuse generalized clumping for perf category

 class CategoryRenderer {
     }
   }
 
+  /**
+   * @param {LH.ReportResult.Category} category
+   * @param {Object<string, LH.Result.ReportGroup>} [groupDefinitions]
+   * @return {Element}
+   */
+  render(category, groupDefinitions = {}) {

(the diff is a little funny here because i moved the diagram... but the method signature and top 3 lines are identical to what they had been.)

<details> <summary>git diff catches the changes as a "move", and calls out the new renderClumps method a bit more clearly</summary>

image

</details>

paulirish

comment created time in 5 days

Pull request review comment GoogleChrome/lighthouse

report: reuse generalized clumping for perf category

 class PerformanceCategoryRenderer extends CategoryRenderer {
       element.appendChild(groupEl);
     }
 
-    // Passed audits
-    const passedAudits = category.auditRefs
-        .filter(audit => (audit.group === 'load-opportunities' || audit.group === 'diagnostics') &&
-            Util.showAsPassed(audit.result));
+    // Everything else (passed, passed with warnings, n/a)
+    const renderedAudits = [...metricAudits, thumbnailAudit, budgetAudit, ...opportunityAudits,
+      ...diagnosticAudits];
+    const unrenderedAudits = category.auditRefs.filter(ref => !renderedAudits.includes(ref));
+    const remainingAudits = unrenderedAudits.filter(ref => !!ref.group);

this sounds good.

it means every auditRef needs a group. but that works for me.

the tricky thing is that manual audits are currently defined by their scoreDisplayMode, and this should be a group.

but once that's cleaned up it makes sense to make that policy change.
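To make the filtering in the diff concrete, here's a standalone sketch of that set-difference (the auditRef objects are invented for the example, not the real Lighthouse types):

```javascript
// Illustrative sketch of the "everything else" selection from the diff above.
// The auditRef shapes here are made up; real refs carry more fields.
const auditRefs = [
  {id: 'first-contentful-paint', group: 'metrics'},
  {id: 'screenshot-thumbnails', group: 'filmstrip'},
  {id: 'render-blocking-resources', group: 'load-opportunities'},
  {id: 'uses-http2', group: 'diagnostics'},
  {id: 'main-thread-tasks'}, // no group: never rendered in a clump
];

// Audits already rendered by the specialized sections (metrics, filmstrip, ...).
const renderedAudits = auditRefs.filter(ref =>
  ref.group === 'metrics' || ref.group === 'filmstrip');

// Everything not yet rendered...
const unrenderedAudits = auditRefs.filter(ref => !renderedAudits.includes(ref));
// ...that has a group, so the generic clump renderer knows where to put it.
const remainingAudits = unrenderedAudits.filter(ref => !!ref.group);

console.log(remainingAudits.map(ref => ref.id));
// ['render-blocking-resources', 'uses-http2']
```

The `!!ref.group` filter is what makes "every auditRef needs a group" matter: groupless refs silently fall out of the rendered report.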

paulirish

comment created time in 5 days

PullRequestReviewEvent
PullRequestReviewEvent

push event GoogleChrome/lighthouse

Connor Clark

commit sha 042d831c6cb1b55f3d62760bb8c95d8ee1c7094e

misc: yarn static-server (#9293)

view details

Brendan Kenny

commit sha a26d38f86a00e294f226b05222f83bc960de71c9

core(config): assert all audit requiredArtifacts will be gathered (#9284)

view details

Brendan Kenny

commit sha de820ec3f24a1fbc7a0a27cdb916b322e72d15d0

misc(runner): add assertion for devtoolsLog as requiredArtifact (#9290)

view details

PatOnTheBack

commit sha 1ab2929ef2bdf8adbad5048c49a9babc406e9c28

misc: remove duplicate colon from regex (#9295)

view details

Brendan Kenny

commit sha b671932abeee47ff055efa05b7a58a2fd7fe7f3b

misc: localize logged GatherRunner error (#9291)

view details

Connor Clark

commit sha 8bef45472a4ede831c176ce584393f86e80b296e

report: use css grid for metrics (#9273)

view details

Connor Clark

commit sha d9e012bb34b1206dbb9ba6126783acb4eb2b8977

report: make urls clickable (#9224)

view details

Paul Irish

commit sha 0da300a45d001887dd0a373c1eafcbd3e048c68d

misc(build): create error-y LHR for the deploy (#9283)

view details

Connor Clark

commit sha 29f56670d1061eedf68ba52bd8380aab9227e4b8

report: remove unnecessary attribute in svg (#9301)

view details

Connor Clark

commit sha 6794e2e603aebe9ebfe979362c83ab5a620ef638

deps(intl): move from devDep to dep (#9309)

view details

Connor Clark

commit sha b751de512b3026fe7c5d35596afb362b1287a3f6

deps(brfs): upgrade to 2.0.2 to resolve source map debug issue (#9312)

view details

remexllee

commit sha 9cef08299a2c69a3a81e8da802051b630dffb259

tests: improve drag-and-drop coverage (#9314)

view details

Brendan Kenny

commit sha 25b309fcc67a99a7e265f9b849a8a499bda8f9e2

core: localize invalid URL error message (#9334)

view details

Paul Irish

commit sha 5e52dcca72b35943d14cc7c27613517c425250b9

misc(build): adjust deployment filenames (#9338)

view details

Connor Clark

commit sha 4c658d272705cc68f1b55a898a2c7fb5f881b57c

tests(smokehouse): assert on expected array length (#9292) Co-Authored-By: Brendan Kenny <bckenny@gmail.com>

view details

Paul Irish

commit sha 7317676af233627869796ef96b66bdf405298882

deps: chrome-launcher@0.11.1 (#9339)

view details

Connor Clark

commit sha fa9bda4e3244fdb17fceaeed62fb3ca5c08e7997

report: show disabled checkbox when all/no urls are third-party (#9299)

view details

Brendan Kenny

commit sha 069d8dbe7123dea5d21d1e3127c8ce9cc442e33a

core(domstats): support an empty html body (#9340)

view details

Patrick Hulce

commit sha c4664b3cd5974afcecb19f56f7207bc3e30f207d

deps: update axe-core to 3.3.0 (#9343)

view details

Brendan Kenny

commit sha 0c78f37d2889c72119f0d450420a7c4af7fcc77f

deps: update outdated transitive deps (#9347)

view details

push time in 5 days

push event GoogleChrome/lighthouse

Paul Irish

commit sha dd19244657af942204c608ca8892ec6c26d6a843

Apply suggestions from code review Co-authored-by: Brendan Kenny <bckenny@gmail.com>

view details

push time in 5 days

push event GoogleChrome/lighthouse

Paul Irish

commit sha cddf1b5b78d2d53f42dc0d004d9cc3bd460d0e0c

report: tweak naming in element-screenshot renderer (#11152)

view details

Patrick Hulce

commit sha b7bb6c0cdfe6626a7ff7db573c148b40050a2bee

core(lantern): fallback to FCP in 0-weight SI situations (#11174)

view details

Patrick Hulce

commit sha 46b0b88778271327ea8ed1bcbb2c568addc8a1e8

misc: remove all appveyor references (#11171)

view details

Brendan Kenny

commit sha d0eacd7166ecaadd9ddebaf438644a1091251537

test: run test-viewer in github actions (#11195)

view details

Paul Irish

commit sha d87306c187012e7296585739ae8edff5623a17de

deps(angular): update minor version of angular fixture redux (#11192)

view details

Patrick Hulce

commit sha d7dfa0cbdd6377d56576b7e3886a99c75d4bfac2

cli: clearTimeout for faster exit (#11170)

view details

Paul Irish

commit sha 1897c012d6d942d89adcc940bffc26a8140b1e16

tests(minification-est): add testcase with pre-minified bundle (#11191)

view details

Patrick Hulce

commit sha d173f5ab14895d78349cf36b70f5d1f1e1915e24

misc: add GCP collection scripts (#11189)

view details

Wojciech Maj

commit sha 40baa221b87ae44c3184f9e6587ef173590ef595

deps: update dot-prop secondary dependency (#11198)

view details

Patrick Hulce

commit sha 434610e7d5a07de37f79c2082965da8f9bb6a5c6

tests: update chromestatus expecatations (#11221)

view details

Adam Raine

commit sha 5859bdcc70c9ad6cd7ba2c61538bfaa8b3de8c98

new_audit: report animations not run on compositor (#11105)

view details

George Makunde Martin

commit sha 8bc43c5bb966dc881e0b4d7c1f5408344fcd6a73

core: add FormElements gatherer (#11062)

view details

lemcardenas

commit sha e0f7d5107e022ba96105c28fdcc54d865f29a221

core(image-elements): collect CSS sizing, ShadowRoot, & position (#11188)

view details

lemcardenas

commit sha 97a2375bec7a551a4dcef2b47404c0a3cbfb9838

core(config): unsized-images to default (#11217)

view details

Patrick Hulce

commit sha f9006751a56a9dca0e62d883d34a5a58fb6b4e05

core(stacks): timeout stack detection (#11172)

view details

Patrick Hulce

commit sha 611eb5126a9eab68a5287b54b3162257232061d8

deps(snyk): update script to prune <0.0.0 and update snapshot (#11223)

view details

Patrick Hulce

commit sha 2089f49b90b5165f0da2a6cf4a95e67ffb845e6e

tests: istanbul ignore inpage function (#11229)

view details

Connor Clark

commit sha f323a32d113bac65ec3af7b71b0ffa2abc9f1853

i18n: import (#11225)

view details

lemcardenas

commit sha 47a1b472e0bc860e7dbd9ef2d19fa72fe4457e85

report: vertically center thumbnails (#11220)

view details

Michael Blasingame

commit sha 5b9a37ae0f1601d699d9eeb16fd1cd38ff94bec9

report: correctly display CLS in budget table (#11209) Co-authored-by: Connor Clark <cjamcl@google.com>

view details

push time in 5 days

PR opened GoogleChrome/lighthouse

misc(build): give build-tracker a shared git history on PRs

build-tracker relies on a common commit that's shared between HEAD and master. Lighthouse runs CI on pull_request, not push, so the checkout is not the branch with shared history, but the result of a merge.

The checkout@v2 action uses a merge remote (eg. remotes/pull/9605/merge) that often has just a single commit. (single commit on these for our repo, not the case for other smaller repos... TBH i don't know how their logic works)

This script creates a new branch that matches the current checkout, but does have a shared history.

ref #10472
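As a minimal local illustration of the constraint (plain git in a throwaway repo, no Actions specifics): merge-base can only resolve a common ancestor if the shared history is actually present in the checkout, which a depth=1 fetch doesn't guarantee.

```shell
# Sketch: merge-base needs the common ancestor commit to exist locally.
set -eu
tmp=$(mktemp -d)
cd "$tmp"
git init -q repo && cd repo

# A base commit on the default branch.
git -c user.name=t -c user.email=t@example.com commit -q --allow-empty -m base
base=$(git rev-parse HEAD)

# A feature branch with its own work on top.
git checkout -q -b feature
git -c user.name=t -c user.email=t@example.com commit -q --allow-empty -m feature-work

# With full history available, the fork point resolves to the base commit.
mb=$(git merge-base HEAD "$base")
echo "$mb"
```

In a shallow CI checkout the `base` commit may simply not be present, which is why the workflow has to deepen or re-fetch before build-tracker can run `git merge-base`.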

+78 -3

0 comment

2 changed files

pr created time in 5 days

push event GoogleChrome/lighthouse

Paul Irish

commit sha f9a5e61f5877703644145d362c83cbcab9fef315

minor

view details

push time in 5 days

push event GoogleChrome/lighthouse

Paul Irish

commit sha fca665bf2b9323474ba44af6048e7b200464e49d

protocol 2 so fast

view details

push time in 5 days

PR opened paulirish/lighthouse

i18n(import): new audits, dropdowns, and columns (#10645)

Summary

Related Issues/PRs

+96 -132

0 comment

2 changed files

pr created time in 5 days

push event paulirish/lighthouse

Patrick Hulce

commit sha 2afddc779df46717cb6c9f3303adebf9cd3493ff

misc: ignore duplicate builds in lhci dogfood (#10482)

view details

Snyk bot

commit sha 244b61ab6780cbcb11706a2165ca4e1ecd7dc87d

deps(snyk): update snyk snapshot (#10478)

view details

Alex Tkachuk

commit sha 42da38bf199b8f9bfd900f6464009c7163c0709d

docs(readme): update PageSpeed Green in related projects

view details

Sebastian Kreft

commit sha c8b71635d241d99aec0c17280cc8fb4530e6337d

new_audit: check images are big enough

view details

Michael Blasingame

commit sha 2b500fc81e5d22d40d00920206742c51b015ee87

core: remove some dead code in driver.js (#10491)

view details

AndreasKubasa

commit sha 1457b4ceccb107ce1c689bef1006790b52248077

docs(readme): add AwesomeTechStack to lighthouse integrations (#10475)

view details

Connor Clark

commit sha 882183953b949ae43df33d5fea9bb2be8b1e129c

core(unused-javascript): update doc link to web.dev

view details

Patrick Hulce

commit sha 69ef99d0b4f8467789047069e24009f26a7f12cb

docs(variability): expand on hardware recommendations

view details

Patrick Hulce

commit sha f1216f9b854c1966b7284792acbb540f8958837a

core(driver): pause after FCP event before resolving load (#10505)

view details

François Beaufort

commit sha ff4f82fc76a1e7ac6ee7e4eacf5ebaa780d5c36a

core(audits): remove audio-caption accessibility audit (#10453)

view details

Connor Clark

commit sha 6330d7799618df9e5b54b26c1b45f3110ba4e501

docs: add link to gist for using lighthouse audits directly

view details

Connor Clark

commit sha 61207bebedf98b7c74bc923c13dc12f8613f845c

docs: emphasize some points on what makes a good audit

view details

Patrick Hulce

commit sha 844317639211d14af90f035efb39344013592159

misc(driver): rename ForFCP to ForFcp (#10516)

view details

Connor Clark

commit sha 20e4bc7d8b1ed720242e62417235b107a3d60f8d

misc(compare-runs): fix filter, allow for resume, reorganize output

view details

Michael Blasingame

commit sha f6a3201e07c6287348373a518124f2e916316e6e

report: update table and inline code formatting (#10437)

view details

Snyk bot

commit sha fffc33b5f67280cd04f83431700a478354b13736

deps(snyk): update snyk snapshot (#10531)

view details

Brendan Kenny

commit sha e55fcd2f015fc8edaa00e85d88bb7e0a997b036a

core: include finished state on hidden network-requests audit (#10530)

view details

Warren Maresca

commit sha 2b614837c192c6b2023885d837f00e74a19e262f

core(lantern): add edges from initiatorRequest when there are duplicate records (#10097)

view details

Connor Clark

commit sha ec45d53f3f6542bc1817d3b84186ae97bb0f0f30

docs(lantern): add deep dive video (#10546)

view details

Patrick Hulce

commit sha 2f11010a8a35c55618c6b067390fcb3a7916bd8c

core(responsive-images): find offscreen images larger than viewport (#10506)

view details

push time in 5 days

create branch paulirish/lighthouse

branch : bundltrackernoerrorquit

created branch time in 5 days

push event GoogleChrome/lighthouse

Paul Irish

commit sha 6f2e34c6857e2e76d386acb4c1b4a62c6f070419

spaces

view details

Paul Irish

commit sha c8edf8594b41c3238293f168f1466206b80fd86a

temporary: faster ci

view details

push time in 5 days

push event GoogleChrome/lighthouse

Paul Irish

commit sha 0fd213ba474f3ddc2f72692266620f88118324b4

shellcheck

view details

push time in 5 days

create branch GoogleChrome/lighthouse

branch : bundltrackernoerrorquit

created branch time in 5 days

PR opened paularmstrong/build-tracker

chore(api-client): add child process invocation command to error output

Problem

Sometimes buildtracker fails randomly. Example: https://github.com/GoogleChrome/lighthouse/runs/647433752?check_suite_focus=true

image

since we have no idea what command triggered a nonzero exit code, it's a bit hard to debug.

Solution

since the error object that spawn() rejects with already has a few things, and is logged out to stderr... i figure why not.

fixes #200

TODO

  • [X] 🤓 Add & update tests (always try to increase test coverage)
  • [ ] 🔬 Ensure CI is passing (yarn lint:ci, yarn test, yarn tsc:ci)
  • [X] 📖 Update relevant documentation
+2 -1

0 comment

1 changed file

pr created time in 5 days

push event paulirish/build-tracker

Paul Irish

commit sha a1cfbb0619c05f94b0a59f7adb838af5e6a33208

chore(api-client): add child process invocation command to error output

view details

push time in 5 days

create branch paulirish/build-tracker

branch : logcommand

created branch time in 5 days

created tag paulirish/build-tracker

tag v1.0.0-beta.15

A set of tools to track the size of your build artifacts over time.

created time in 5 days

created tag paulirish/build-tracker

tag v1.0.0-beta.16

A set of tools to track the size of your build artifacts over time.

created time in 5 days

issue comment GoogleChrome/lighthouse

Reduce size of CDT bundle

Some additional data and exploration in this doc: https://docs.google.com/document/d/15DyId8C9bGnk1ZpgaIekYG_RDEC_KYNQW3Tsp6vs11Q/edit#

connorjclark

comment created time in 6 days

Pull request review comment GoogleChrome/lighthouse

misc(build): minify bundles with terser, emit source map

     "@wardpeet/brfs": "2.1.0-0",
     "angular": "^1.7.4",
     "archiver": "^3.0.0",
-    "babel-core": "^6.26.0",

👋 bye!

connorjclark

comment created time in 6 days

Pull request review comment GoogleChrome/lighthouse

misc(build): minify bundles with terser, emit source map

     "chrome",
     "devtools"
   ],
-  "author": "The Chromium Authors",
+  "author": "Google Inc.",

i think we do "The Lighthouse Authors"

https://github.com/GoogleChrome/lighthouse/pull/10469

connorjclark

comment created time in 6 days

Pull request review comment GoogleChrome/lighthouse

misc(build): minify bundles with terser, emit source map

 {
   "name": "lighthouse",
   "version": "6.3.0",
-  "description": "Lighthouse",
+  "description": "Automated auditing, performance metrics, and best practices for the web.",

👍

connorjclark

comment created time in 6 days

Pull request review comment GoogleChrome/lighthouse

misc(build): minify bundles with terser, emit source map

 class TapTargets extends Gatherer {
    */
   afterPass(passContext) {
     const expression = `(function() {
-      const tapTargetsSelector = "${tapTargetsSelector}";

i looked through the other uses of evalAsync for a similar situation and i think we're good.

only one that caught my eye was L116 of the start-url gatherer

https://github.com/GoogleChrome/lighthouse/blob/039b6c6e0826d8b09e145e6ffd5253724b497021/lighthouse-core/gather/gatherers/start-url.js#L116

but looking at the minified source, i think we're ok

image

ya?


in minified land this was that taptarget section:

image

connorjclark

comment created time in 6 days

Pull request review comment GoogleChrome/lighthouse

misc(build): minify bundles with terser, emit source map

 const isDevtools = file =>
 /** @param {string} file */
 const isLightrider = file => path.basename(file).includes('lightrider');
 
-const BANNER = `// lighthouse, browserified. ${VERSION} (${COMMIT_HASH})\n` +
-  '// @ts-nocheck\n'; // To prevent tsc stepping into any required bundles.

just confirming you want this gone?

i saw it in the banner for a bit, but now it's totally gone.

connorjclark

comment created time in 6 days

PullRequestReviewEvent
PullRequestReviewEvent

issue comment GoogleChrome/lighthouse

Always showing Aspect Ratio (Actual) as 300x150 (2.00) for AVIF files

https://github.com/thierryk/AVIF/commits/master indicates nothing about the test page changed after you viewed it; there's just a populated readme now: https://github.com/thierryk/AVIF

also hi @thierryk ! 👋 really good to see you.

BillGoldstein

comment created time in 7 days

issue comment GoogleChrome/lighthouse

Add service-worker-with-fetch-handler check to PWA Installable criteria

Chrome has the has-fetch-handler check, but is working on an improved one: 965802 - Implement more accurate service worker offline capability detection - chromium

I tested this out and it works and even provides this signal to Page.getInstallabilityErrors:

chrome-debug --enable-features=CheckOfflineCapability

image

so backing up... a sw with this (non-handler):

/// COMMENTED OUT! NO HANDLER! this.addEventListener('fetch', .........

and we'll see a not-offline-capable error in Chrome today.

but if it's...


this.addEventListener('fetch', function(e) {
  console.log('nothing happening...')
});

we won't have that error signal... unless the --enable-features=CheckOfflineCapability flag is flipped. wooo!

we can work on this at will, though it'd be nice to start using this signal once the experiment lands so that chrome doesn't change Lighthouse's installability signal unexpectedly. :)

star crbug.com/965802 to watch it.

connorjclark

comment created time in 7 days

Pull request review comment GoogleChrome/lighthouse

core(audits): devtoolsNodePath for password-inputs-can-be-pasted-into audit

 class AxeAudit extends Audit {
       items = rule.nodes.map(node => ({
         node: /** @type {LH.Audit.Details.NodeValue} */ ({
           type: 'node',
+          // This selector represents the aXe css selectors
+          // Node selector can be accessed with node.selector
           selector: Array.isArray(node.target) ? node.target.join(' ') : '',

@adrianaixba and I just took a deep dive into .target and what it's useful for. It addresses cases where the node is within shadow DOM or iframes (because css selectors can't express that).

Since Lighthouse can't do anything useful here (and our own getSelector doesn't handle the iframe/shadow DOM cases) I think we should ignore it.

So yeah let's use our own NodeValue selector here and discard their target selector array.
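For context on what's being discarded, a small sketch of why `.target` is an array (the example targets are invented, but shaped like axe-core results):

```javascript
// axe-core's node.target is an array because each entry steps through a
// shadow DOM / iframe boundary; a single CSS selector can't express that.
// Joining with ' ' (as in the audit code above) only approximates it.
const topLevelTarget = ['#signup-button'];            // ordinary DOM node
const shadowTarget = ['my-widget', 'button.confirm']; // host element, then inside its shadow root

// The join from the code under review:
const toSelector = target => (Array.isArray(target) ? target.join(' ') : '');

console.log(toSelector(topLevelTarget)); // '#signup-button'
console.log(toSelector(shadowTarget));   // 'my-widget button.confirm' (lossy!)
```

The joined form looks like a descendant combinator but isn't one, which is exactly why discarding it in favor of Lighthouse's own NodeValue selector is reasonable.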

adrianaixba

comment created time in 7 days

PullRequestReviewEvent

issue comment GoogleChrome/lighthouse

Align load-fast-enough-for-pwa with Core Web Vitals?

We're steering towards removing the perf audit from the PWA category. Next action was on @b1tr0t.

On the Lighthouse side we'll have to figure out what we'll do with the "fast and reliable" group.

kaycebasques

comment created time in 7 days

issue closed GoogleChrome/lighthouse

Question: Can audit url with redirects?

Question: Does lighthouse audit urls with redirects? If yes, is there a maximum number of redirects it can handle? Will the final url be audited instead?

closed time in 7 days

jmdelacerna

issue comment GoogleChrome/lighthouse

Visited links are unreadable in dark mode | CSS

Let's tweak the anchor styles to be more defensive, probably on all a elements (to handle cases where an embedder has their own styles).

TheHunterDog

comment created time in 7 days

IssuesEvent

issue closed GoogleChrome/lighthouse

Add banner to PSI notifying of 6.0 changes

Plenty of people using PSI have been caught off guard by the 6.0 changes.

We think it'd be good to add an attention-getting banner to the top of the report indicating that this is 6.0 and linking to all the changes they should know about.

closed time in 7 days

paulirish

issue closed GoogleChrome/lighthouse

Unused '@ts-expect-error' directive in 6.2.0

Summary

I use lighthouse in a typescript project with the following ts configs

"allowJs": true,
"maxNodeModuleJsDepth": 1,

The JSDoc types in lighthouse had been working nicely before 6.2.0.

On version 6.2.0, I started to see Unused '@ts-expect-error' directive errors

node_modules/lighthouse/lighthouse-core/gather/driver.js:296:5 - error TS2578: Unused '@ts-expect-error' directive.

296     // @ts-expect-error TODO(bckenny): tsc can't type event.params correctly yet,
        ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

node_modules/lighthouse/lighthouse-core/gather/gatherers/gatherer.js:25:5 - error TS2578: Unused '@ts-expect-error' directive.

25     // @ts-expect-error - assume that class name has been added to LH.GathererArtifacts.
       ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

node_modules/lighthouse/lighthouse-core/lib/dependency-graph/base-node.js:132:5 - error TS2578: Unused '@ts-expect-error' directive.

132     // @ts-expect-error - in checkJs, ts doesn't know that CPUNode and NetworkNode *are* BaseNodes.
        ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

node_modules/lighthouse/lighthouse-core/lib/dependency-graph/base-node.js:267:5 - error TS2578: Unused '@ts-expect-error' directive.

267     // @ts-expect-error - only traverses graphs of Node, so force tsc to treat `this` as one
        ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

node_modules/lighthouse/lighthouse-core/lib/dependency-graph/base-node.js:273:7 - error TS2578: Unused '@ts-expect-error' directive.

273       // @ts-expect-error - queue has length so it's guaranteed to have an item
          ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

node_modules/lighthouse/lighthouse-core/lib/dependency-graph/base-node.js:309:7 - error TS2578: Unused '@ts-expect-error' directive.

309       // @ts-expect-error - toVisit has length so it's guaranteed to have an item
          ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
...

It seems the errors are happening because I don't have the "checkJs": true ts config set; as a result, typescript raises errors for all the @ts-expect-error directives in js files. However, setting "checkJs": true would cause typescript to raise errors all over the place for other js libraries.

I'm wondering if it's possible to change @ts-expect-error back to @ts-ignore? Or are there other ways to work around it?

Thanks
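One possible workaround (a sketch, not an officially documented fix — the option values here are assumptions about this project's setup): keep allowJs but stop type-checking JS pulled in from node_modules, so the @ts-expect-error directives inside Lighthouse's sources are never evaluated by your compile:

```jsonc
{
  "compilerOptions": {
    "allowJs": true,
    // Dropping this to 0 stops TypeScript from reading JS implementations
    // (and their @ts-expect-error pragmas) out of node_modules; only
    // published .d.ts files are consulted for types.
    "maxNodeModuleJsDepth": 0,
    // Also skip re-checking declaration files shipped by dependencies.
    "skipLibCheck": true
  }
}
```

The tradeoff is losing the inferred types that maxNodeModuleJsDepth: 1 was providing.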

closed time in 7 days

oddui

issue closedGoogleChrome/lighthouse

Gatsby/Netlify Remove Unused JS + Cache Policy

We have a Gatsby site deployed through Netlify.

Our site is connected to Google Tag Manager which fires tags for our Facebook Pixel, Hotjar, Hubspot account integrations.

Upon running Google Lighthouse (which we love by the way), the output for performance has us wondering how we can go about resolving (or trying to resolve as best we can) the finding report of Google Lighthouse.

For "Remove unused Javascript", is there anything we can do about these?

For "Serve static assets with an efficient cache policy", is there anything we can do about these as well? Ultimately, these are fired via Google Tag Manager

Appreciate your help in advance!

google-lighthouse

closed time in 7 days

trueblood12

issue closedGoogleChrome/lighthouse

Disable Chrome Extensions before testing

image

Lighthouse complains about browser extensions.

Please make it possible to disable all extensions before testing

closed time in 7 days

Artik-Man

issue closedGoogleChrome/lighthouse

CPU throttling

Hello,

I can't deactivate the CPU throttling. It displays "4x slowdown (DevTools)".

Environment Information

  • Affected Channels: <!-- CLI, Node, Extension, DevTools -->
  • Lighthouse version: Lighthouse 5.7.0
  • Chrome version: 80.0.3987.106
  • Node.js version:
  • Operating System: Windows

Related issues

closed time in 7 days

jaouher2

issue closedGoogleChrome/lighthouse

data:image/png;base64 => ERR_INVALID_URL

Provide the steps to reproduce

  1. Run LH on a page containing:
<img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAQAAAC1HAwCAAAAC0lEQVR42‌​mNgYAAAAAMAASsJTYQAAAAASUVORK5CYII=" alt="" />

What is the current behavior?

Failed to load resource: net::ERR_INVALID_URL

What is the expected behavior?

No such error logged.
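For what it's worth, the pasted src appears to contain invisible zero-width characters, which by itself would make the data: URI invalid. A quick way to check for that (a sketch — findInvalidBase64Chars is made up for illustration, not a Lighthouse or Chrome API):

```typescript
// Strip a data: URI down to its base64 payload and report any characters
// outside the base64 alphabet — a common cause of net::ERR_INVALID_URL.
function findInvalidBase64Chars(dataUri: string): string[] {
  const payload = dataUri.slice(dataUri.indexOf(',') + 1);
  return payload.split('').filter((ch) => !/[A-Za-z0-9+/=]/.test(ch));
}

// U+200C (zero-width non-joiner) and U+200B (zero-width space) are
// invisible when pasted, but reject the URI all the same.
const src = 'data:image/png;base64,iVBORw0K\u200C\u200Bggg==';
console.log(findInvalidBase64Chars(src).map((c) => c.codePointAt(0)));
// logs codes 8204 and 8203 (zero-width non-joiner, zero-width space)
```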

Environment Information

  • Affected Channels: devtools
  • Lighthouse version: 6.0.0
  • Chrome version: 85.0.4183.102
  • Node.js version: ?
  • Operating System: Win 10 64bit

Related issues

closed time in 7 days

black-snow

push eventChromeDevTools/devtools-protocol

devtools-bot

commit sha 0da74bc4b04131d7eb8acd722030c6d0700ff404

Updates

view details

push time in 8 days

issue commentGoogleChrome/lighthouse

Getting a ? on mobile PageSpeed score

perhaps related to https://github.com/GoogleChrome/lighthouse/issues/10876

rulloliver

comment created time in 8 days

pull request commentGoogleChrome/lighthouse

tests: hash more files for devtools test cache

i guess the retry_3 line in run-web-tests.sh is being a jerk

image

connorjclark

comment created time in 8 days

pull request commentGoogleChrome/lighthouse

core(full-page-screenshot): use layoutViewport width

And the failure mode for pages that are too tall is both significant and very common

there's two situations here and i'm not sure which you're talking about.. perhaps both?

  • page too tall: chrome can't take a fullpage screenshot that's taller than 16384px (actually 16383). it'll get one that's that tall, but it won't include any content past that line.
  • image too big: apparently there's a max size for data uris of 2MB.

If we convert to a test on the handling a tall page by cutting down to <5000px tall instead of the error to reduce flakes that seems fine to me but something in smokes to cover this too big handling case seems important.

are you saying our test would assert that we reattempted at 5000px? and that it worked, or didn't, or both?


On the data uri max size.. I see it in our code, the chromium bug comments, and the chromium source.

However... I'm not sure if this limit is a problem for us.

I logged out the size of the data uri that devtools is creating when it makes a fullsize screenshot of our test page..

image

and it hits 10MB and saves just fine.

perhaps that 2MB is for top-level URLs being navigated to?

connorjclark

comment created time in 8 days

PR opened GoogleChrome/lighthouse

deps: chrome-launcher to v0.13.4

3 commits, the 2nd of which is a nice quality of life upgrade for us. makes running LH ~5s faster on my machine.

  • 08406b28 fix: preserve existing getInstallations output
  • f3669f45 perf: check default paths first when running on Mac (https://github.com/GoogleChrome/chrome-launcher/pull/209)
  • aef94948 docs: update defaultFlags() example for new API (https://github.com/GoogleChrome/chrome-launcher/pull/205)
+5 -5

0 comment

2 changed files

pr created time in 8 days

create branchGoogleChrome/lighthouse

branch : clbump

created branch time in 8 days

push eventGoogleChrome/lighthouse

Paul Irish

commit sha b194ca74b1764da92d93815bf364fb61a5a03e21

Apply suggestions from code review Co-authored-by: Connor Clark <cjamcl@google.com>

view details

push time in 8 days

pull request commentGoogleChrome/lighthouse

core(full-page-screenshot): use layoutViewport width

It's passing locally for me. you?

yarn smoke screenshot fails locally for me on this branch.

(though since screenshot size is dependent on DPR, i wouldn't be surprised if i locally had diff results than the CI bot)

i still think my changes in #11428 are good.. wanna pull them in here?

your changes are just deleting the test?

  1. it adds an assertion in the already-existing too-large unit test. the test already asserts a null return value (which is colocated with populating a warning). The newly added assertion just doublechecks the warning is there.
  2. it deletes the smoketest. the smoketest only checks that the artifact is null and a runWarning is issued. nothing fancy going on here, and this behavior is already validated by the unit test. i don't see a reason to keep an entire smoketest to assert that an if (num >= maxnum) conditional works when that's already checked in the unit test. but maybe i'm missing something.
connorjclark

comment created time in 8 days

pull request commentGoogleChrome/lighthouse

tests(page-functions): add test for getNodePath

thanks git reset --soft master && git commit && git push --force

paulirish

comment created time in 8 days

push eventGoogleChrome/lighthouse

Paul Irish

commit sha 088067bad63654b4427e8a2201bd4b8e6e8c840a

tests(page-functions): add test for getNodePath

view details

push time in 8 days

PR opened GoogleChrome/lighthouse

tests(page-functions): add test for getNodePath

made this when i was noodling on #10956

the PR already had shared authorship, so i thought it best to just split this out separate.

+32 -1

0 comment

2 changed files

pr created time in 8 days

issue commentGoogleChrome/lighthouse

Does Lighthouse run the entire aXe core or just part of it?

table-fake-caption

the implementation from axe doesn't make a significant test here. it's fairly handwavy in determining pass/fail.

td-has-header

the implementation is far too costly. it could be sped up with a better algorithm, but until that's done, we can't incur the cost of it. see https://github.com/dequelabs/axe-core/issues/908. if users want this check, they can use the axe extension

aria-hidden-body form-field-multiple-labels

both of these rules are enabled and reported in lighthouse

susanlynnholland

comment created time in 8 days

Pull request review commentGoogleChrome/lighthouse

core: normalize node information in gathering

 declare global {       export interface IFrameElement {         /** The `id` attribute of the iframe. */         id: string,+        /** Details for node in DOM for the iframe element */

discussed in chat. we'll do the changes as described by patrick. using & is slightly more idiomatic TS for such a mixin but it has weaker IDE support, so we'll go with extends

let's put the type in artifacts.d.ts for now.. in the followup we'll properly deduplicate with audit-details
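to illustrate the tradeoff (type names here are hypothetical, not the actual Lighthouse types):

```typescript
// A shared mixin of node details, to be merged into several artifact types.
interface NodeDetails {
  devtoolsNodePath: string;
  snippet: string;
}

// Option 1: intersection — slightly more idiomatic TS for a mixin, but some
// editors surface the flattened type less helpfully on hover/completion.
type IFrameElementViaIntersection = {id: string} & NodeDetails;

// Option 2: extends — same resulting shape, better IDE support.
interface IFrameElementViaExtends extends NodeDetails {
  id: string;
}

// Both describe the same object shape, so values are interchangeable.
const el: IFrameElementViaExtends = {
  id: 'frame-1',
  devtoolsNodePath: '3,HTML,1,BODY,0,IFRAME',
  snippet: '<iframe id="frame-1">',
};
const el2: IFrameElementViaIntersection = el;
console.log(el2.id);
```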

adrianaixba

comment created time in 8 days

PullRequestReviewEvent

Pull request review commentGoogleChrome/lighthouse

core: normalize node information in gathering

 class AxeAudit extends Audit {         node: /** @type {LH.Audit.Details.NodeValue} */ ({           type: 'node',           selector: Array.isArray(node.target) ? node.target.join(' ') : '',-          path: node.path,+          path: node.devtoolsNodePath,

yah sg. renaming the audit details prop to devtoolsNodePath seems like a win, even if it's a bit more painful to do.

adrianaixba

comment created time in 8 days

PullRequestReviewEvent

Pull request review commentGoogleChrome/lighthouse

core: normalize node information in gathering

               "path": "3,HTML,1,BODY,36,A",               "selector": "body > a",               "nodeLabel": "external link",-              "snippet": "<a href=\"https://www.google.com/\" target=\"_blank\">"

also this is unexpected.

i think we just need to manually update the AnchorElements bit in lighthouse-core/test/results/artifacts/artifacts.json

adrianaixba

comment created time in 8 days

Pull request review commentGoogleChrome/lighthouse

core: normalize node information in gathering

             "node": {               "type": "node",               "selector": "h2",-              "path": "3,HTML,1,BODY,5,DIV,0,H2",

ah, need to rename path to devtoolsNodePath in the accessibility artifact in lighthouse-core/test/results/artifacts/artifacts.json

adrianaixba

comment created time in 8 days

Pull request review commentGoogleChrome/lighthouse

core: normalize node information in gathering

             "node": {               "type": "node",               "selector": "h2",-              "path": "3,HTML,1,BODY,5,DIV,0,H2",

something odd is going on here, these shouldn't be dropping.

adrianaixba

comment created time in 8 days

PullRequestReviewEvent
PullRequestReviewEvent

PR opened GoogleChrome/lighthouse

misc(axe): rename axe types

small tweak.

we use axeResult in two places for two different things. I've renamed both to differentiate more clearly.

  1. in the gatherer to refer to the return value of axe.run().
  2. the artifacts.d.ts type that refers to an individual axe rule's results.
    • I've renamed it to AxeRuleResult to clarify it's just a single axe rule being described. axe-core uses this term, too.
+13 -13

0 comment

2 changed files

pr created time in 8 days

create branchGoogleChrome/lighthouse

branch : axeruleresult

created branch time in 8 days

pull request commentGoogleChrome/lighthouse

tests: hash more files for devtools test cache

Current CI failure is cuz it wants #11418 landed

image

connorjclark

comment created time in 8 days

push eventChromeDevTools/devtools-protocol

devtools-bot

commit sha 970604e9d1381b9279d0b8728b411548e476a3ca

Updates

view details

push time in 8 days

pull request commentGoogleChrome/lighthouse

core(full-page-screenshot): use layoutViewport width

you got a smoke failure. at first i thought it was a flake.. https://github.com/GoogleChrome/lighthouse/pull/11428

but its not.

under emulation this page looks like this

image

funnily the "full-size screenshot" that devtools takes looks like this:

image

as we know.. devtools still uses content size.. and you can tell by the TINY lil text <p> that this screenshot is WAY too wide, given the original emulation.

so..

the smoke failure is actually indicative that the 5000px attempt to get a TOO HUGE data uri doesn't work because we're taking a much smaller screenshot:

image

1440px instead of 5000px.

and thus the datauri comes in under the threshold.

yay.

connorjclark

comment created time in 10 days

PR closed GoogleChrome/lighthouse

Reviewers
tests(smoke): drop screenshot-too-big case cla: yes waiting4reviewer

trying to fix another smoke flake. #11341

example happened here: https://github.com/GoogleChrome/lighthouse/pull/11402/checks?check_run_id=1098415903#step:8:622

the smoketest tries creating a page that'll end up with a TOO HUGE datauri, but sometimes it's not huge enough to hit our check.

IMO constructing smoketest for this case seems unnecessary. The existing unit test seems fine to confirm the if (screenshot.data.length > MAX_DATA_URL_SIZE) conditional works.
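For reference, that conditional can be sketched like this (the constant's value and the function shape are my approximation of the gatherer's behavior, not verbatim Lighthouse code):

```typescript
// Approximation of the too-big guard: a screenshot whose data URI exceeds
// the cap yields a null artifact plus a run warning, together.
const MAX_DATA_URL_SIZE = 2 * 1024 * 1024; // ~2MB data-URI limit

interface ScreenshotResult {
  data: string; // base64 data URI
  width: number;
  height: number;
}

function checkScreenshot(
  screenshot: ScreenshotResult,
  warnings: string[]
): ScreenshotResult | null {
  if (screenshot.data.length > MAX_DATA_URL_SIZE) {
    warnings.push('Full page screenshot is too big; skipping it.');
    return null;
  }
  return screenshot;
}

const warnings: string[] = [];
const tooBig = {data: 'x'.repeat(MAX_DATA_URL_SIZE + 1), width: 1440, height: 5000};
console.log(checkScreenshot(tooBig, warnings)); // null
console.log(warnings.length); // 1
```

A unit test asserting both the null return and the warning covers exactly what the smoketest was checking.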

+1 -11

1 comment

2 changed files

paulirish

pr closed time in 10 days

pull request commentGoogleChrome/lighthouse

tests(smoke): drop screenshot-too-big case

Onnnnnnnnn second thought.... perhaps the screenshot-too-big failure happening on the PR where we change how we get a screenshot is something worth investigating......... :D

/me starts looking at it.

paulirish

comment created time in 10 days

Pull request review commentGoogleChrome/lighthouse

core(gather-runner): warn when BenchmarkIndex is sufficiently low

 const UIStrings = {    */   warningTimeout: 'The page loaded too slowly to finish within the time limit. ' +   'Results may be incomplete.',+  /**+   * @description Warning that the host device where Lighthouse is running appears to have a slower+   * CPU than the expected Lighthouse baseline.+   */+  warningSlowHostCpu: 'The device that ran this test appears to have a slower CPU than  ' ++  'Lighthouse expects. This can negatively affect your performance score. Learn more about using ' +

What action do we expect people to take if they see this warning?

The docs don't really spell it out.

They can buy a better machine.. or they can specify a cpuSlowdownMultiplier.. but how would they calculate what number is good? Without guidance they can feel free to set cpuSlowdownMultiplier to 1 and enjoy some pretty great scores. 😛
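One hedged way the docs could spell out the calculation — scale the default 4x multiplier by how the host compares to a reference machine (the reference value below is hypothetical, not an official baseline):

```typescript
// Sketch of calibrating cpuSlowdownMultiplier from BenchmarkIndex.
// REFERENCE_BENCHMARK_INDEX is an assumed baseline for illustration only.
const DEFAULT_MULTIPLIER = 4;
const REFERENCE_BENCHMARK_INDEX = 1000;

function calibratedMultiplier(hostBenchmarkIndex: number): number {
  // A host twice as fast as the reference needs twice the slowdown to
  // approximate the same target device; a slower host needs less.
  const raw = DEFAULT_MULTIPLIER * (hostBenchmarkIndex / REFERENCE_BENCHMARK_INDEX);
  // Never speed the page up: clamp at 1 (no throttling).
  return Math.max(1, raw);
}

console.log(calibratedMultiplier(2000)); // 8
console.log(calibratedMultiplier(500)); // 2
console.log(calibratedMultiplier(100)); // 1
```

Something this concrete in the warning's linked doc would at least steer people away from blindly setting it to 1.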

patrickhulce

comment created time in 10 days

Pull request review commentGoogleChrome/lighthouse

core(gather-runner): warn when BenchmarkIndex is sufficiently low

 const UIStrings = {    */   warningTimeout: 'The page loaded too slowly to finish within the time limit. ' +   'Results may be incomplete.',+  /**+   * @description Warning that the host device where Lighthouse is running appears to have a slower+   * CPU than the expected Lighthouse baseline.+   */+  warningSlowHostCpu: 'The device that ran this test appears to have a slower CPU than  ' ++  'Lighthouse expects. This can negatively affect your performance score. Learn more about using ' ++  '[custom throttling settings](https://github.com/GoogleChrome/lighthouse/blob/ccbc8002fd058770d14e372a8301cc4f7d256414/docs/throttling.md#calibrating-multipliers) ' +
  '[custom throttling settings](https://github.com/GoogleChrome/lighthouse/blob/master/docs/throttling.md#calibrating-multipliers) ' +

while it adds a risk of this 404'ing, i think it's better that we can update the text people are reading.

patrickhulce

comment created time in 10 days

PullRequestReviewEvent
PullRequestReviewEvent
PullRequestReviewEvent

push eventGoogleChrome/lighthouse

andreizet

commit sha a58510583acd2f796557175ac949932618af49e7

tests: add markdown link checker (#11358)

view details

push time in 10 days

PR merged GoogleChrome/lighthouse

tests: add markdown link checker cla: yes waiting4reviewer

<!-- Thank you for submitting a pull request! See CONTRIBUTING.MD for help in getting a change landed. https://github.com/GoogleChrome/lighthouse/blob/master/CONTRIBUTING.md -->

Summary <!-- What kind of change does this PR introduce? --> <!-- Is this a bugfix, feature, refactoring, build related change, etc? --> This is a build related change, it adds a new action that checks for broken links inside markdown files when performing the CI actions.

<!-- Describe the need for this change --> It will prevent issues like #11089

<!-- Link any documentation or information that would help understand this change -->

Related Issues/PRs <!-- Provide any additional information we might need to understand the pull request -->

+38 -2

7 comments

4 changed files

andreizet

pr closed time in 10 days

PullRequestReviewEvent

PR opened GoogleChrome/lighthouse

Reviewers
tests(smoke): drop screenshot-too-big case

trying to fix another smoke flake. #11341

example happened here: https://github.com/GoogleChrome/lighthouse/pull/11402/checks?check_run_id=1098415903#step:8:622

the smoketest tries creating a page that'll end up with a TOO HUGE datauri, but sometimes it's not huge enough to hit our check.

IMO constructing smoketest for this case seems unnecessary. The existing unit test seems fine to confirm the if (screenshot.data.length > MAX_DATA_URL_SIZE) conditional works.

+1 -11

0 comment

2 changed files

pr created time in 10 days

create branchGoogleChrome/lighthouse

branch : screenshottoobig

created branch time in 10 days

more