GoogleChromeLabs/squoosh 11011

Make images smaller using best-in-class codecs, right in the browser.

GoogleChromeLabs/proxx 1039

A game of proximity

GoogleChrome/devsummit 708

Chrome Dev Summit Conference Site

dominiccooney/cache-polyfill 464

Service Worker Cache polyfill extracted from https://jakearchibald.github.io/trained-to-thrill/

GoogleChromeLabs/file-drop 175

A simple file drag and drop custom-element

push eventwhatwg/html

Jake Archibald

commit sha fd9899c3afb6f66444bbbf5864efa9219986b0e7

Create parent-child relationship in browsing session

view details

push time in a day

push eventGoogleChromeLabs/squoosh

Jake Archibald

commit sha f96ae9bdeedd1ca170ff049d081dd0d92514e95a

wip

view details

Jake Archibald

commit sha b47d6b46962fe51acab194269941f44e53a072e9

Moving worker and worker bridge

view details

push time in a day

Pull request review commentWICG/portals

Rough history traversal algorithm

+# History traversal++These rough algorithms describe how portal activation, adoption, reportaling, and deportaling should work.++Eventually, these steps will be moved into the portal spec, and the history traversal section of the HTML spec.++## Definitions++<dl>+  <dt>Navigable</dt>+  <dd>something that has session history, such as a top level page, an iframe, or a portal (although a portal can only have one item of session history).</dd>+</dl>++## Activate a portal in a 'push' style++This is a regular activation that clears any 'forward' items in join session history and adds a new top-level history entry for the portaled document.++1. Let _document_ be the portal's session history item's document.+1. Let _targetHistoryItem_ be a copy of the portal's session history item.+1. Remove _targetHistoryItem_'s document.+1. Give _targetHistoryItem_ a weak reference to _document_.

This isn't as easy as I thought.

Good bits:

  • Avoids moving the document from property to property
  • Avoids cases where the weak document and main document are set at the same time (and are different)

Less good bits:

  • Maybe the moving solidifies the 'documents move around' model that we agreed on?
  • It doesn't work with portals. With portals the history item is removed upon activation, so the weak reference is on the portal itself. We could keep the history item and set the weak flag there, but that means any portal rendering/interaction code would need to treat the history entry as absent if the weak flag is set. Although, this would ensure the weak reference is gone when the portal is navigated.

There may be other places where we'd need to avoid doing things with the session history item if the document reference is weak.

I'm undecided.

jakearchibald

comment created time in a day

push eventWICG/portals

Jake Archibald

commit sha 308dfc9db8095e45edb08cfc44f0cbe37118e052

Responding to feedback

view details

push time in a day

push eventGoogleChromeLabs/squoosh

Jake Archibald

commit sha 4d8efcea66c766bed60619ec182aa5388c9e24f3

SW bits that make it good enough for now

view details

push time in a day

push eventGoogleChromeLabs/squoosh

Jake Archibald

commit sha e11b4cf22c5f909e684d5e87a3554de480a3619d

Service worker building (but not quite right yet)

view details

push time in 2 days

push eventGoogleChromeLabs/squoosh

Jake Archibald

commit sha 3f2466f44dc7488fe7a498397baf271853052186

Re-enable terser

view details

push time in 2 days

push eventGoogleChromeLabs/squoosh

Jake Archibald

commit sha 778aa41f0dc2e58df461cfcaf4810f2e04886f5b

Prerender & client render of intro

view details

push time in 2 days

push eventGoogleChromeLabs/squoosh

Jake Archibald

commit sha 02e4eaf4b50771bac76f0f3b6e60548559c08a1f

CSS plugin

view details

Jake Archibald

commit sha 486957443dec25a70c35e830ae8ed99005d9e448

Downgrade postcss for modules support

view details

push time in 2 days

issue commentWICG/is-input-pending

Return a Promise<void> instead of a boolean

But isInputPending has granularity. You can optionally wait on continual events like mousemove. Why's it ok for the "you need to stop" part to have that granularity, but not the "now you can continue" part?

TimvdLippe

comment created time in 2 days

issue commentWICG/is-input-pending

Return a Promise<void> instead of a boolean

With isInputPending, you get to decide if you're interested in value-update events (like mousemove) or just one-time events like mousedown. scheduler.yield() is a good fit if it offers the same fidelity, but it doesn't look like it does.

The use case here is "I want to perform main thread work while there isn't an input event of a particular category pending". This proposal provides the "you need to stop" part of that, but not "now you can continue" - that requires polling by queuing tasks.

The solution seems incomplete if the "you need to stop" and "now you can continue" APIs are referring to different things, no?
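For reference, here's a rough TypeScript sketch of the polling pattern described above. It assumes Chromium's navigator.scheduling.isInputPending(); the task list and the setTimeout-based yield are illustrative only, not part of any proposal:

type Scheduling = { isInputPending(opts?: { includeContinuous?: boolean }): boolean };

// Process queued work, stopping whenever input (including continuous events
// like mousemove) is pending. The "now you can continue" part can only be
// approximated by polling: queue a task via setTimeout, then check again.
async function processTasks(tasks: Array<() => void>): Promise<void> {
  const scheduling = (navigator as Navigator & { scheduling?: Scheduling }).scheduling;
  while (tasks.length) {
    if (scheduling?.isInputPending({ includeContinuous: true })) {
      // "You need to stop": yield so the pending input can dispatch...
      await new Promise<void>((resolve) => setTimeout(resolve, 0));
      // ...then loop round and poll again ("now you can continue").
      continue;
    }
    tasks.shift()!();
  }
}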

TimvdLippe

comment created time in 2 days

Pull request review commentWICG/portals

Rough history traversal algorithm

+# History traversal++These rough algorithms describe how portal activation, adoption, reportaling, and deportaling should work.++Eventually, these steps will be moved into the portal spec, and the history traversal section of the HTML spec.++## Definitions++<dl>+  <dt>Navigable</dt>+  <dd>something that has session history, such as a top level page, an iframe, or a portal (although a portal can only have one item of session history).</dd>+</dl>++## Activate a portal in a 'push' style++This is a regular activation that clears any 'forward' items in join session history and adds a new top-level history entry for the portaled document.++1. Let _document_ be the portal's session history item's document.+1. Let _targetHistoryItem_ be a copy of the portal's session history item.+1. Remove _targetHistoryItem_'s document.+1. Give _targetHistoryItem_ a weak reference to _document_.+1. All 'forward' session history items are removed from the parent navigable.+1. Append _targetHistoryItem_ to the parent navigable's history items.+1. [Traverse to session history item](#traverse-to-session-history-item) with _targetHistoryItem_.++   Note: Actual activation is handled in history traversal.++## Activate a portal in a 'back' style++This is an activation that goes back through join session history. The API for this hasn't been designed, but if it exists it's likely to be an option to `activate()`.++1. Let _targetHistoryItem_ be the previous history item in the parent navigable, where the browsing context or URL differs from the current history item, ignoring the hash portion of the URLs.+1. If _targetHistoryItem_ does not have a weak reference to the portal's history item's document, reject and abort these steps.+1. [Traverse to session history item](#traverse-to-session-history-item) with _targetHistoryItem_.++   Note: Actual activation is handled in history traversal.++## Activate a portal in a 'forwards' style++This is an activation that go forward through join session history, without destroying existing items. The API for this hasn't been designed, but if it exists it's likely to be an option to `activate()`.++1. Let _targetHistoryItem_ be the next history item in the parent navigable, where the browsing context or URL differs from the current history item, ignoring the hash portion of the URLs.+1. If _targetHistoryItem_ does not have a weak reference to the portal's history item's document, reject and abort these steps.+1. [Traverse to session history item](#traverse-to-session-history-item) with _targetHistoryItem_.++   Note: Actual activation is handled in history traversal.++## Activate a portal in a 'replace' style++This is an activation that will replace the current top level page. The API for this hasn't been designed, but if it exists it's likely to be an option to `activate()`.++1. Let _document_ be the portal's session history item's document.+1. Let _targetHistoryItem_ be a copy of the portal's session history item.+1. Remove _targetHistoryItem_'s document.+1. Give _targetHistoryItem_ a weak reference to _document_.+1. [Traverse to session history item](#traverse-to-session-history-item) with _targetHistoryItem_ and isReplacement set to true.++   Note: Actual activation is handled in history traversal.++## Traverse to session history item++With _targetHistoryItem_ and _isReplacement_.++1. 
If the _targetHistoryItem_ and the current history item have the same document, and neither is null, traverse to _targetHistoryItem_ in the regular way and abort these steps.++   Note: This is an in-document navigation. Nothing portal-related happens in this case.++1. Let _documentChangesInDelta_ be 0.+1. Let _historyItems_ be null.+1. If _isReplacement_, then set _historyItems_ to [the current history item, _targetHistoryItem_].+1. Otherwise, set _historyItems_ to the history items of this navigable, between the current item, and target item, including the current item and _targetHistoryItem_, in order.+1. Remove the first item from _historyItems_.+1. For each _historyItem_ of _historyItems_, if any of the following is true, increment _documentChangesInDelta_:+   - _historyItem_ has a different browsing context to the previous item in the list.+   - _historyItem_'s URL is different to the previous item in the list, ignoring the hash portion of the URL in both.+1. If _documentChangesInDelta_ is greater than 1, then traverse to _targetHistoryItem_ in the regular way and abort these steps.++   Note: To avoid multiple levels of reportaling, reportaling and implicit activation is skipped if the navigation spans across multiple documents++1. Asset: _documentChangesInDelta_ is not 0.++   Note: This should have been catered for in step 1.++1. If _targetHistoryItem_ does not contain a document, and _targetHistoryItem_ has a weak reference to a document that's inside a portal in the current history item's document. then:++   1. Let _document_ be that portal's document.+   1. For each item in the navigable's session history that has a weak reference to _document_, set its document to _document_, and remove its weak reference to _document_.+   1. Give the portal a weak reference to the document inside the portal.+   1. Remove the portal's session history item.+   1. If adoption is permitted, offer _targetHistoryItem_'s document the opportunity to adopt the current history item. If this opportunity is taken:++      1. A new portal element is created in _targetHistoryItem_'s document.+      1. Give the portal a weak reference to the current history item's document.++         Note: The rest of the process is picked up in step 12.++1. If _targetHistoryItem_ does not contain a document, then traverse to _targetHistoryItem_ in the regular way and abort these steps.+1. If _targetHistoryItem_'s document contains a portal with a weak reference to the current history item's document, then:+   1. Let _documentToPortal_ be the current history item's document.+   1. Set the portal's history item to a copy of the current history item.+   1. Remove the portal's weak reference to _documentToPortal_.+   1. For each item in the navigable's session history, remove its document if the document is _documentToPortal_, and give it a weak reference to _documentToPortal_.+1. Otherwise, unload the current history item's document.++   Note: This may include excluding it from bfcache.

I'll improve the language here. During unloading, things can happen that exclude the document from bfcache, such as unload event handlers.

jakearchibald

comment created time in 2 days

Pull request review commentWICG/portals

Rough history traversal algorithm

+# History traversal++These rough algorithms describe how portal activation, adoption, reportaling, and deportaling should work.++Eventually, these steps will be moved into the portal spec, and the history traversal section of the HTML spec.++## Definitions++<dl>+  <dt>Navigable</dt>

Portals can navigate, and when they do so they change browsing context each time.

If you navigate and it requires isolation due to COOP+COEP, that also involves a change in browsing context.

This is one of the "boil the ocean" things that needs to happen, because the spec doesn't cater for this at all; it assumes that browsing contexts stay constant within a navigable. The spec also makes session history a property of the browsing context, which is similarly broken.

jakearchibald

comment created time in 2 days

issue commentrustwasm/wasm-pack

env dependency not provided

No idea, sorry 😢

PSeitz

comment created time in 3 days

Pull request review commentWICG/portals

Explain activation subtleties

 An additional reason for avoiding these mechanisms is that it makes writing port  To conclude, instead of giving embedders this control as iframes do, we believe that the browser can take the role of mitigating any problematic features. For example, instead of requiring embedders to use `sandbox=""` to turn off modal `alert()`/`confirm()`/`prompt()` dialogs, or permissions policy to turn off autoplaying media, those features are [always disabled](https://github.com/WICG/portals#other-restrictions-while-portaled) in pre-activation portals. And because portals are isolated from communicating with their embedder pre-activation, any problems which CSP Embedded Enforcement would attempt to protect against will instead be caught by this communications barrier and prevented from impacting the embedder. +### Activation++The basics of activation are explained [in the intro example](#example): calling `portalElement.activate()` causes the embedding window to navigate to the content which is already loaded into the portal. This section discusses some of the subtler details.++First, note that a portal may be in a "closed" state, when it is not displaying valid, activatable content. This could happen for several reasons:++- The host page author has incorrectly set the portal to a non-HTTP(S) URL, e.g. using `<portal src="data:text/html,hello"></portal>`. Portals can only display HTTP(S) URLs.+- The portaled page cannot be loaded, for reasons outside of the host page author's control. For example, if the portaled content does a HTTP redirect to a `data:` URL, or if the portaled content gives a network error.+- The user is offline, which also causes a network error.++(What, exactly, the `<portal>` element displays in this state is still under discussion: [#251](https://github.com/WICG/portals/issues/251).)++Attempting to activate a closed portal will fail. Activation can also fail if another navigation is in progress, as discussed [above](#session-history-navigation-and-bfcache). In all of these cases, the promise returned by the `activate()` method will be rejected, allowing page authors to gracefully handle the failure with a custom error experience.++Another consideration is how activation behaves when the portal is currently loading content. This breaks down into two cases:++- During the initial load of content into a portal, e.g. given++  ```js+  const portal = document.createElement("portal");+  portal.src = "https://slow.example.com/";+  document.body.append(portal);+  portal.activate();+  ```++  the promise returned by `activate()` will not settle until the navigation is far enough along to determine whether or not it will be successful. This requires waiting for the response to start arriving, to ensure there are no network errors and that the final response URL is a HTTP(S) URL. Once it reaches that point, then the promise will fulfill or reject appropriately. If the promise fulfills, then activation will have completed, and the content will be loading into the newly-activated browsing context. If it rejects, then no activation will have occurred.++- After the initial load of the portal, e.g. given++  ```js+  const portal = getSomeExistingFullyLoadedPortal();+  portal.src = "https://different-url.example.com/";+  portal.activate();+  ```++  activation of the already-loaded content will happen immediately, and the navigation to the new content will happen in the newly-activated browsing context. 
In these cases, the promise returned by `activate()` will generally fulfill, as it is almost always possible to activate the already-loaded content. (The exceptions are edge cases like if another user-initiated navigation, or another portal activation, is already ongoing.)

I still think this bypasses CSP rules, but I guess we can continue debating it in the issues.

domenic

comment created time in 3 days

Pull request review commentWICG/portals

Explain activation subtleties

 An additional reason for avoiding these mechanisms is that it makes writing port  To conclude, instead of giving embedders this control as iframes do, we believe that the browser can take the role of mitigating any problematic features. For example, instead of requiring embedders to use `sandbox=""` to turn off modal `alert()`/`confirm()`/`prompt()` dialogs, or permissions policy to turn off autoplaying media, those features are [always disabled](https://github.com/WICG/portals#other-restrictions-while-portaled) in pre-activation portals. And because portals are isolated from communicating with their embedder pre-activation, any problems which CSP Embedded Enforcement would attempt to protect against will instead be caught by this communications barrier and prevented from impacting the embedder. +### Activation++The basics of activation are explained [in the intro example](#example): calling `portalElement.activate()` causes the embedding window to navigate to the content which is already loaded into the portal. This section discusses some of the subtler details.++First, note that a portal may be in a "closed" state, when it is not displaying valid, activatable content. This could happen for several reasons:++- The host page author has incorrectly set the portal to a non-HTTP(S) URL, e.g. using `<portal src="data:text/html,hello"></portal>`. Portals can only display HTTP(S) URLs.+- The portaled page cannot be loaded, for reasons outside of the host page author's control. For example, if the portaled content does a HTTP redirect to a `data:` URL, or if the portaled content gives a network error.+- The user is offline, which also causes a network error.++(What, exactly, the `<portal>` element displays in this state is still under discussion: [#251](https://github.com/WICG/portals/issues/251).)++Attempting to activate a closed portal will fail. Activation can also fail if another navigation is in progress, as discussed [above](#session-history-navigation-and-bfcache). In all of these cases, the promise returned by the `activate()` method will be rejected, allowing page authors to gracefully handle the failure with a custom error experience.

I'm still a little worried that portal activation rejections can tell you something about a page that you didn't already know, but we can work on that in issues.

domenic

comment created time in 3 days

push eventGoogleChromeLabs/squoosh

Jake Archibald

commit sha f5d9023ff3c84e08869f275fde960b3c0a8e63ff

Update postcss

view details

push time in 3 days

push eventGoogleChromeLabs/squoosh

Jake Archibald

commit sha 455c868e554789aa5c68673a5ad75dc346a90c44

First bit of real UI code landed

view details

push time in 3 days

push eventGoogleChromeLabs/file-drop

Jake Archibald

commit sha cd00719e045e3ac12869e7ca1ba3004c8a8f3516

Removing type defs

They're out of date, and Preact-only. They shouldn't be part of this project.

view details

Jake Archibald

commit sha 25e8c2fd99746436d123e44582479d9e006fde99

1.0.0

view details

push time in 3 days

created tagGoogleChromeLabs/file-drop

tagv1.0.0

A simple file drag and drop custom-element

created time in 3 days

issue commentpreactjs/preact

New typings broke custom elements for me

ffs, it works if the .d.ts is a module. Adding export {} to the file makes it work. Ugh, I do not understand this part of TypeScript at all.
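For anyone hitting the same thing, here's a minimal sketch of the TypeScript behaviour being described; the file name and attribute typing are hypothetical placeholders, not Preact's actual typings:

// my-elements.d.ts (hypothetical file name)
// Typing a custom element for use in TSX; the attribute shape is a placeholder.
declare global {
  namespace JSX {
    interface IntrinsicElements {
      'file-drop': { accept?: string };
    }
  }
}

// With no import/export statements, TypeScript treats this .d.ts as a plain
// script rather than a module, and the augmentation above behaves differently.
// The empty export below turns the file into a module, which is the
// "export {}" fix described above.
export {};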

surma

comment created time in 3 days

issue commentpreactjs/preact

New typings broke custom elements for me

@andrewiggins that doesn't work for me. It means my preact imports stop working:

import { h, Component } from 'preact';

Module '"preact"' has no exported member 'h'.

surma

comment created time in 3 days

push eventGoogleChromeLabs/squoosh

Jake Archibald

commit sha 657310375593c5dc9099d8b2361b783d5d44ad10

Add manifest

view details

push time in 3 days

issue commentWICG/portals

Should you be able to tell if a portal is currently closed, without calling activate()?

In a doc, @kjmcnee suggested the following portal states:

  • Empty - Nothing has ever been loaded in the portal, or the previous contents were discarded due to failed readoption, the user agent reclaiming resources, etc.
  • Live - The portal is hosting a browsing context.
  • Activated - The portal has activated. The element’s context is adopted or in bfcache.
  • Frozen - The hosted context is alive, but frozen by the user agent.
  • Epitaph - Like empty, but there’s a screenshot or some other visual representation of the previous content.

Portaled content is broken, e.g. it 301s to a data: URL, or it network errors.

Can we detect this currently with iframes and such, or is it a new capability?

domenic

comment created time in 3 days

issue commentWICG/portals

What should closed portals display?

2 means that layout doesn't change between activating and reportaling

domenic

comment created time in 3 days

pull request commentjakearchibald/idb

Add mode to IDBPTransaction

I want to give it some detailed testing before I land it, but it looks good at a glance. I also need to figure out whether it requires a major version bump. Sorry for the delay; I've just not got a bunch of free time right now. It's still on my todo list, and I really appreciate the work you've put in.

zhouhanseng

comment created time in 3 days

push eventGoogleChromeLabs/squoosh

Jake Archibald

commit sha a30e38856e2e0c153ae925877616e9559f1533b2

Avoid infinite rebuilds

view details

push time in 4 days

push eventGoogleChromeLabs/squoosh

Jake Archibald

commit sha d2807ebb18fb372cad8ca6c25e7448f8f324c9e5

Integrate Oxi

view details

push time in 4 days

push eventGoogleChromeLabs/squoosh

Jake Archibald

commit sha fd151fc70d772642ef4eaa5d944d8a242fba0315

Add hqx

view details

push time in 4 days

push eventGoogleChromeLabs/squoosh

Jake Archibald

commit sha 2a6a83f56d7d008e4b61c9e1d49e36cc501b55f0

Resize working

view details

Jake Archibald

commit sha 812e727de0e282d7a329586c6ac07f869d3620a8

Integrate rotate

view details

push time in 4 days

Pull request review commentWICG/portals

Rough history traversal algorithm

 Navigations within portals are subject to certain security restrictions for load  Navigation errors within portals may cause portal activation to be rejected. Instead of, for example, the user agent showing an error page to the user as with a conventional navigation, the promise returned by the activate method allows a page to gracefully handle the rejection. Furthermore, user agents have existing limitations on navigations initiated by the page where they may be ignored if they are considered to conflict with a user's intent to perform a different navigation. Such cases are not described by the existing navigation spec (see [#218](https://github.com/WICG/portals/issues/218)), but portal activations are subject to these limitations. In the case where another navigation takes precedence over portal activation, the promise returned by the activate method rejects. +See the [rough algorithms](history-traversal.md) for cases where portals may be auto-activated, and pages may be reportaled, when traversing session history.

Is this the right way to link to another doc from a README.md?

jakearchibald

comment created time in 4 days

PR opened WICG/portals

Reviewers
Rough history traversal algorithm
+109 -0

0 comment

2 changed files

pr created time in 4 days

create branchWICG/portals

branch : history-traversal

created branch time in 4 days

issue commentwebscreens/window-placement

Why `scaleFactor` and not `devicePixelRatio`?

That isn't the pixel ratio of the device though, so it'd be right to use a different name. It's only impacted by zooming within the page viewport.

jakearchibald

comment created time in 4 days

issue commentWICG/is-input-pending

Return a Promise<void> instead of a boolean

Should waitOnPendingInput resolve once currently queued inputs have dispatched, or should that also include any additional inputs queued since? Should that be an option?

TimvdLippe

comment created time in 4 days

issue commentWICG/is-input-pending

Return a Promise<void> instead of a boolean

  • isInputPending will return true if (and only if) a task is enqueued for dispatching input.
  • setTimeout uses the queue a task algorithm, which appends a new task to the end of the task queue.

Therefore, a setTimeout call after isInputPending returns true will have the resulting task dispatch after any pending input tasks are dispatched.

There isn't a single task queue. Input and setTimeout add items to different queues. Typically browsers will empty input queues ahead of timer queues, but this isn't spec'd or guaranteed.

TimvdLippe

comment created time in 4 days

issue commentGoogleChrome/web.dev

content: WebCodecs [2020-10-14]

I don't know what the state of this proposal is anymore. It looked like progress was being made, and things were heading in the right direction, but I'm now running up against CDS deadlines, so I won't be taking on additional work.

kaycebasques

comment created time in 4 days

issue commentrustwasm/wasm-pack

env dependency not provided

This problem went away for me by updating to Rust 1.46

PSeitz

comment created time in 5 days

push eventGoogleChromeLabs/squoosh

Jake Archibald

commit sha 21fc70cbddca7f13f0fcaa7a840f03f3a67514a6

wip

view details

push time in 5 days

push eventGoogleChromeLabs/squoosh

Jake Archibald

commit sha d9e1177cd875908b21a0e9da5a96f5a8119b181e

wip on rust wasm integration

view details

push time in 5 days

push eventGoogleChromeLabs/squoosh

Jake Archibald

commit sha 300809fdcb0f27bd82f839e2c63b55bfa87f7a4a

Client/shared/worker split for resize

view details

push time in 5 days

issue closedjakearchibald/idb

Safari throws "TransactionInactiveError: Failed to store record in an IDBObjectStore: The transaction is inactive or finished."



import('https://cdn.skypack.dev/idb@5.0.6/with-async-ittr').then(async (idb) => {
    await idb.deleteDB('test')
    const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms))
    const openDB = (() => {
        /** @type {import('idb').IDBPDatabase<unknown>} */
        let db = undefined
        return async () => {
            if (db) return db
            if (!db)
                db = await idb.openDB('test', 1, {
                    upgrade(db, oldVersion, newVersion, transaction) {
                        db.createObjectStore('store')
                    },
                })
            db.addEventListener('close', () => (db = undefined))
            return db
        }
    })()
    /** @type {import('idb').IDBPTransaction<unknown, ["store"]>} */
    let transaction = undefined
    const beforeTx = async () => {
        try {
            await transaction.objectStore('store').openCursor(IDBKeyRange.only('a'))
            console.log('The transaction is alive!')
        } catch (e) {
            console.log('Transaction outdated', e, 'creating new one')
            transaction = (await openDB()).transaction('store', 'readwrite')
        }
    }
    await beforeTx()
    await transaction.store.add({ a: 1 }, 'a')
    console.log('added a')
    await beforeTx()
    await transaction.store.add({ a: 1 }, 'b')
    console.log('added b')

    await sleep(120)
    await beforeTx()
    await transaction.store.add({ a: 1 }, 'c')
    console.log('added c')

    await sleep(120)
    console.log('before call beforeTx')
    await beforeTx()
    console.log('after call beforeTx')
    const cursor = await transaction.store.openCursor(IDBKeyRange.only('a'))
    console.log(cursor.value)
    const next = await cursor.continue()
    console.log(next)
})


Chrome and Firefox no problem. But Safari throws.

(The last Unhandled Promise Rejection).

The expected result is (on Chrome):

closed time in 5 days

Jack-Works

issue commentjakearchibald/idb

Safari throws "TransactionInactiveError: Failed to store record in an IDBObjectStore: The transaction is inactive or finished."

Thanks for filing this on WebKit; I agree it looks like a browser bug. I recommend trying to reduce the demo to just the failing test, and even trying to recreate it without the library.

Jack-Works

comment created time in 5 days

push eventw3c/ServiceWorker

Deployment Bot (from Travis CI)

commit sha c176c1a85813178d7d35dfe2a8b5539cab0fc014

Deploy w3c/ServiceWorker to github.com/w3c/ServiceWorker.git:gh-pages

view details

push time in 5 days

pull request commentw3c/ServiceWorker

Editorial: Align with Web IDL specification

Ah, thanks for catching that. I didn't notice it targeted the wrong branch.

autokagami

comment created time in 5 days

push eventw3c/ServiceWorker

autokagami

commit sha 044cc99578d3510cdac200ca5a8b1311782d12f7

Editorial: Align with Web IDL specification (#1541)

view details

push time in 5 days

PR merged w3c/ServiceWorker

Editorial: Align with Web IDL specification

This is an automated pull request to align the spec with the latest Web IDL specification.

Currently the autofix might introduce some awkward code formatting, so please feel free to modify the formatting.

Please file an issue on https://github.com/saschanaz/webidl-updater/issues/new if you think this PR is invalid or should be enhanced.

The following is the validation messages from webidl2.js, which may help understanding this PR:

Validation error at line 5 in service-workers-1,0:
  void postMessage(any message
  ^ `void` is now replaced by `undefined`. Refer to the [relevant GitHub issue](https://github.com/heycam/webidl/issues/60) for more information.

Validation error at line 6 in service-workers-1,0:
  void postMessage(any message
  ^ `void` is now replaced by `undefined`. Refer to the [relevant GitHub issue](https://github.com/heycam/webidl/issues/60) for more information.

Validation error at line 11 in service-workers-1,1:
  [NewObject] Promise<void> update()
                      ^ `void` is now replaced by `undefined`. Refer to the [relevant GitHub issue](https://github.com/heycam/webidl/issues/60) for more information.

Validation error at line 11 in service-workers-1,3:
  void startMessages();
  ^ `void` is now replaced by `undefined`. Refer to the [relevant GitHub issue](https://github.com/heycam/webidl/issues/60) for more information.

Validation error at line 3 in service-workers-1,5:
  Promise<void> enable()
          ^ `void` is now replaced by `undefined`. Refer to the [relevant GitHub issue](https://github.com/heycam/webidl/issues/60) for more information.

Validation error at line 4 in service-workers-1,5:
  Promise<void> disable()
          ^ `void` is now replaced by `undefined`. Refer to the [relevant GitHub issue](https://github.com/heycam/webidl/issues/60) for more information.

Validation error at line 5 in service-workers-1,5:
  Promise<void> setHeaderValue(ByteString
          ^ `void` is now replaced by `undefined`. Refer to the [relevant GitHub issue](https://github.com/heycam/webidl/issues/60) for more information.

Validation error at line 7 in service-workers-1,6:
  [NewObject] Promise<void> skipWaiting()
                      ^ `void` is now replaced by `undefined`. Refer to the [relevant GitHub issue](https://github.com/heycam/webidl/issues/60) for more information.

Validation error at line 7 in service-workers-1,7:
  void postMessage(any message
  ^ `void` is now replaced by `undefined`. Refer to the [relevant GitHub issue](https://github.com/heycam/webidl/issues/60) for more information.

Validation error at line 8 in service-workers-1,7:
  void postMessage(any message
  ^ `void` is now replaced by `undefined`. Refer to the [relevant GitHub issue](https://github.com/heycam/webidl/issues/60) for more information.

Validation error at line 7 in service-workers-1,8:
  [NewObject] Promise<void> claim()
                      ^ `void` is now replaced by `undefined`. Refer to the [relevant GitHub issue](https://github.com/heycam/webidl/issues/60) for more information.

Validation error at line 4 in service-workers-1,11:
  void waitUntil(Promise<
  ^ `void` is now replaced by `undefined`. Refer to the [relevant GitHub issue](https://github.com/heycam/webidl/issues/60) for more information.

Validation error at line 9 in service-workers-1,13:
  readonly attribute Promise<void> handled;
                             ^ `void` is now replaced by `undefined`. Refer to the [relevant GitHub issue](https://github.com/heycam/webidl/issues/60) for more information.

Validation error at line 11 in service-workers-1,13:
  void respondWith(Promise<
  ^ `void` is now replaced by `undefined`. Refer to the [relevant GitHub issue](https://github.com/heycam/webidl/issues/60) for more information.

Validation error at line 7 in service-workers-1,14:
  Promise<void> handled;
          ^ `void` is now replaced by `undefined`. Refer to the [relevant GitHub issue](https://github.com/heycam/webidl/issues/60) for more information.

Validation error at line 5 in service-workers-1,18:
  [NewObject] Promise<void> add(RequestInfo
                      ^ `void` is now replaced by `undefined`. Refer to the [relevant GitHub issue](https://github.com/heycam/webidl/issues/60) for more information.

Validation error at line 6 in service-workers-1,18:
  [NewObject] Promise<void> addAll(sequence
                      ^ `void` is now replaced by `undefined`. Refer to the [relevant GitHub issue](https://github.com/heycam/webidl/issues/60) for more information.

Validation error at line 7 in service-workers-1,18:
  [NewObject] Promise<void> put(RequestInfo
                      ^ `void` is now replaced by `undefined`. Refer to the [relevant GitHub issue](https://github.com/heycam/webidl/issues/60) for more information.

<a href="https://pr-preview.s3.amazonaws.com/autokagami/ServiceWorker/pull/1541.html" title="Last updated on Sep 21, 2020, 1:28 AM UTC (787777f)">Preview</a> | <a href="https://pr-preview.s3.amazonaws.com/w3c/ServiceWorker/1541/ce67c04...autokagami:787777f.html" title="Last updated on Sep 21, 2020, 1:28 AM UTC (787777f)">Diff</a>

+18 -18

0 comment

1 changed file

autokagami

pr closed time in 5 days

issue commentjakearchibald/idb-keyval

Adding a store creating a different db

Does this plan work? https://github.com/jakearchibald/idb-keyval/issues/80

puneetkverma

comment created time in 6 days

push eventGoogleChromeLabs/squoosh

Jake Archibald

commit sha 7540a15f8d758f6264e29d7819df73ffc4b2885a

Features folder

view details

push time in 8 days

issue commentjakearchibald/idb

Failed to execute 'transaction' on 'IDBDatabase

If that isn't the issue, please create a minimal reproducible example of the issue, and I'll take a look.

pinal-r

comment created time in 8 days

issue closedjakearchibald/idb

Failed to execute 'transaction' on 'IDBDatabase

Unhandled Rejection (NotFoundError): Failed to execute 'transaction' on 'IDBDatabase': One of the specified object stores was not found.

Facing this error. My code is

const DB_NAME=`language_${window.localStorage.getItem('accountcode') || 0}`;
const langDB=openDB('abs2anguage', 1, {
    upgrade(db) {
        db.createObjectStore(DB_NAME);
    },
});

closed time in 8 days

pinal-r

issue commentjakearchibald/idb

Failed to execute 'transaction' on 'IDBDatabase

In the code example above, DB_NAME is used as the store name, which is weird. It means that the store name could change without changing the schema version of the database. If that happens, you'll try to get an object store that was never created.
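For illustration, a rough sketch of one way to avoid that, using idb's openDB (the names here are illustrative): keep the store name fixed, so it always exists for the declared schema version, and put the per-account value in the database name or the keys instead:

import { openDB } from 'idb';

const accountCode = window.localStorage.getItem('accountcode') || '0';

// The store name never changes, so a store created in the version-1 upgrade
// is always the store the rest of the code asks for.
const langDB = openDB(`language_${accountCode}`, 1, {
  upgrade(db) {
    db.createObjectStore('translations');
  },
});

async function getTranslation(key: string) {
  return (await langDB).get('translations', key);
}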

pinal-r

comment created time in 8 days

issue closedw3c/ServiceWorker

importScripts request does not include Service-Worker header

This GET request for the root service worker script does include "Service-Worker: script" header as required by the spec, while the GET request generated by importScripts() from the root script does not.

This is behavior of the recent (at the moment of the issue submitting) desktop versions of Chrome 85.0.4183.102 (Official Build) (64-bit), FireFox 80.0.1 (64-bit), and Edge 85.0.564.51 (Official build) (64-bit).

I did not test this behavior neither on the mobile devices, nor with other browsers.

closed time in 8 days

tms320c

issue commentw3c/ServiceWorker

importScripts request does not include Service-Worker header

The header isn't optional; it's always included on the request for the service worker script. It isn't included on additional requests made by that script, including fetch and importScripts.

It's definitely mentioned in the spec: it requires the header to be present on the root service worker script request, and requires that it isn't present on other requests.
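To illustrate, here's a rough Node sketch (entirely hypothetical server code, not from the spec): the header shows up on the request for the registered script itself, but not on the requests that script makes via importScripts() or fetch():

import { createServer } from 'http';

createServer((req, res) => {
  // 'Service-Worker: script' is only sent when the browser fetches the
  // top-level service worker script (the URL passed to register()).
  const isServiceWorkerFetch = req.headers['service-worker'] === 'script';

  if (req.url === '/sw.js') {
    console.log('sw.js requested, Service-Worker header present:', isServiceWorkerFetch);
    res.setHeader('Content-Type', 'text/javascript');
    res.end("importScripts('/sw-helpers.js');");
    return;
  }

  if (req.url === '/sw-helpers.js') {
    // This request comes from importScripts() inside sw.js, so the header
    // will not be present here.
    console.log('sw-helpers.js requested, Service-Worker header present:', isServiceWorkerFetch);
    res.setHeader('Content-Type', 'text/javascript');
    res.end('// helpers');
    return;
  }

  res.statusCode = 404;
  res.end();
}).listen(8080);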

tms320c

comment created time in 8 days

push eventw3c/ServiceWorker

Deployment Bot (from Travis CI)

commit sha 1efb50517a5c2bfd0fb1ff340c0b642b6069ca94

Deploy w3c/ServiceWorker to github.com/w3c/ServiceWorker.git:gh-pages

view details

push time in 8 days

issue closedw3c/ServiceWorker

export 'ServiceWorkerRegistration/service worker registration'

Spec's that are extending ServiceWorkerRegistration probably need to refer to the service worker registration associated with a ServiceWorkerRegistration, i.e. https://w3c.github.io/ServiceWorker/#serviceworkerregistration-service-worker-registration. That definition isn't currently exported though.

closed time in 8 days

mkruisselbrink

issue commentw3c/ServiceWorker

export 'ServiceWorkerRegistration/service worker registration'

Agreed. Fixed in ce67c0407def81312fcf51643f0b5d733b42ea31.

mkruisselbrink

comment created time in 8 days

push eventw3c/ServiceWorker

Jake Archibald

commit sha ce67c0407def81312fcf51643f0b5d733b42ea31

Export ServiceWorkerRegistration's service worker registration

view details

push time in 8 days

issue commentAOMediaCodec/av1-avif

Optional 'progressive' download frame

@leo-barnes

But is progressive quality refinement better than initially showing the thumbnail and then either waiting until you have the rest of the image or sequentially fill in more quality in raster scan order?

The proof of concept I hacked together was just a low-quality half-resolution version followed by a full version.

I guess that could be done with a thumbnail, but it's a bit of a hack. If browsers start displaying the thumbnail before they have the full image, I'd definitely optimise my thumbnails for early-render on a web page, and publish tooling to help others to do the same. This would make them crappy thumbnails, but if you're happy with that, so am I.

I don't think scan order is as good, as @kornelski says.

I'm neither a web developer or a browser developer, so I can't really give you very much detail unfortunately. I think what they were saying was that if a server has multiple versions of an asset with different sizes, there are ways of specifying this in HTML so that the browser knows that there are multiple choices and automatically downloads the best asset for the given device/monitor. They said this was preferable to having multiple assets in the same file since the browser would then have no other choice than to download everything.

Ah, they're talking about responsive images. Yes, this is ideal for cases where the browser wants to pick the right file for the number of pixels needed. However, I think this is orthogonal to progressive/preview rendering.

  • Responsive images: "Of these resources, I will pick the best one for me"
  • Progressive/preview rendering: "While downloading that resource, I can represent it in a useful way without downloading the whole resource"

Both are useful at the same time.

2. See it's an AVIF, start parsing.

Doesn't the browser know the MIME type of the asset up front?

Not before making the request. It knows the type if it's specified in the response Content-Type header, however all browsers ignore this and just sniff the start of the resource instead.

I definitely think you should lay out the layers and tiles in the file so that you get progressively better quality as you download sequentially. But exposing the layers in the container gives you the choice of easily stopping once you have good enough quality.

Ah, cool, I think I got confused when you said "Both items are placed in an 'altr' group with the (full quality) item placed first in the list", and assumed that meant the full quality data would appear first in the file.

jakearchibald

comment created time in 8 days

issue commentw3c/resource-timing

spec what requestStart, responseStart, and responseEnd should represent when service worker is involved

but that would make the SW and the non-SW case differ greatly

Yeah, I think we should avoid that, so I agree with keeping fetchStart as is, assuming browsers implement the current spec text.

It feels like the spec is trying to say workerStart is the time the browser decides to use a service worker for that request. So I agree with @mfalken that this should be [A]. I guess this is step 16 of handle fetch.

requestStart is a tricky one. Here's the spec text:

The time immediately before the user agent starts requesting the resource from the server, or from relevant application caches or from local resources

To me, it seems like it's saying the source of the data has been selected, but that source hasn't yet been queried. But the existence of connectStart means that connection work has already been done at this point. I guess that also means it's after any kind of queuing the request might get, due to priority or limits. With service worker, I guess the equivalent of connecting is starting the worker. That means, yep, I agree with @mfalken, requestStart is [F], just before dispatching the fetch event. I also agree with @mfalken that this happens just before the dispatch step, so it's part of the task queued in the service worker.

(requestStart - workerStart) gives you the delay between deciding to use a service worker, and running code to handle the event. This would be long if service worker startup is slow, or the fetch event takes a long time to dispatch due to main thread work in the service worker. Is that what folks want to know?

If folks just want to capture worker startup time, then yeah we'd need something like workerReady.
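As a rough sketch of the measurement being discussed, using standard PerformanceResourceTiming fields and the interpretation above (workerStart as "decided to use the service worker", requestStart as "just before the fetch event dispatch"):

// Log, per service-worker-handled resource, the delay between the browser
// deciding to use the service worker and the fetch event being dispatched.
const entries = performance.getEntriesByType('resource') as PerformanceResourceTiming[];

for (const entry of entries) {
  if (entry.workerStart === 0) continue; // not handled by a service worker
  const fetchEventDelay = entry.requestStart - entry.workerStart;
  console.log(entry.name, 'fetch event delay (ms):', fetchEventDelay.toFixed(1));
}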

wanderview

comment created time in 8 days

issue commentAOMediaCodec/av1-avif

Optional 'progressive' download frame

Thanks for the update!

S1 is only really an issue when you have very slow download speeds. This is still the case in some places in the world, but is becoming less and less of an issue.

I think there's still a benefit at 3g-like speeds. Back when commuting was a thing, I'd frequently end up on the train's wifi, which was terrible, or on mobile, which was spotty/slow during the commute. This was a train to London, so I don't think this is just an emerging markets issue.

The conclusion from those teams were that they in general preferred using separate files rather than have multiple independent assets in the same file.

I'm not sure how that works on the web. If I ask the server for two files, the 'preview' version and the full version, the server would need to be smart enough to know they're part of the same overall image and send them in sequence. Otherwise, you can end up receiving the preview after the full version. Or, the server might chunk them and send them in parallel, but bandwidth spent on the full version rather than the preview is kinda a waste. Servers are pretty bad at serving content in the most optimal order today.

The client could avoid this by requesting each in sequence, but that creates a delay as it has to create a separate request once it's received the preview.

With a single file, you don't have the sequencing issues, since the order of bytes in the file dictates that. It also means the browser can make decisions like "I don't need to decode the preview, because I already have the whole file".

The layer locations are specified in the container, so the parser knows exactly which parts of the file need to be downloaded.

If I've understood this correctly, I worry about the performance. It means the browser would have to:

  1. Start downloading the image.
  2. See it's an AVIF, start parsing.
  3. Realise the bit of the image it needs is elsewhere in the file.
  4. Terminate the stream.
  5. Make range request for the bit of the image it does need.

I think in many cases this would be slower than downloading the whole file.

For the web to benefit from it, I think the data needs to be delivered in a single response, where the data is ordered lowest-resolution/quality first.

jakearchibald

comment created time in 9 days

issue commentGoogleChromeLabs/tooling.report

Test: Converting non-esm modules to esm

Isn't this covered by being able to load particular module types https://bundlers.tooling.report/importing-modules/ and being able to output particular module types https://bundlers.tooling.report/output-module-formats/?

frank-dspeed

comment created time in 9 days

issue commentw3c/ServiceWorker

importScripts request does not include Service-Worker header

Why should it?

tms320c

comment created time in 9 days

push eventGoogleChromeLabs/squoosh

Jake Archibald

commit sha 25102095aa0f4d11bcb8a6cc9d997c651978860c

Update webp from main branch

view details

Jake Archibald

commit sha 2583d689b9bb23c08f7afa674e2df0f921e178c7

AVIF to module

view details

Jake Archibald

commit sha 7776134bc22197372bcee588a835416cc7f2f07b

AVIF in worker

view details

Jake Archibald

commit sha f92e3c2194791a8a2037d32de2eb4d7783b4400f

Move encode options definition to the wasm

view details

push time in 10 days

pull request commentGoogleChromeLabs/squoosh

Image height and width fix.

Just to make it easier to see the colour reduction

velusgautam

comment created time in 10 days

PR opened GoogleChromeLabs/squoosh

Reviewers
Ensure node_modules is created

Missed this from my PR

+2 -0

0 comment

1 changed file

pr created time in 10 days

create branchGoogleChromeLabs/squoosh

branch : create-dir

created branch time in 10 days

push eventGoogleChromeLabs/squoosh

Surma

commit sha 34cb55978fedaf0f2d786953024f53c9e1509c06

Add avif decoder binaries

view details

Surma

commit sha 02807aab32ab6a62556c88e09fa179455b35947f

Add AVIF decoder to squoosh

view details

Surma

commit sha 409df481db8f4d67ddbaeb8581df55bd2082a962

Fix HDR image support in AVIF decoder

view details

Surma

commit sha e1ab43b76fbd6bab0ff403cf58ac2aa56bc38354

Add AVIF encoder

view details

Surma

commit sha c29006d59336be5b038a519b70d596b2831b85d9

Add AVIF encoder without options

view details

Surma

commit sha ac9a7767d22ff58cfb839601fd6996eae76f6665

Expose some options for AVIF

view details

Surma

commit sha 17dcc9c7d4350ffb8f57c1cba9cf20ff73ba5fdb

Update AVIF encoder README

view details

Ingvar Stepanyan

commit sha 0c3ef3fdf59533a57a1180043fe00222efd47917

Migrate AVIF to Emscripten upstream

view details

Ingvar Stepanyan

commit sha 08c267a98b52f2976e2e8fef9319fe2f12c690f3

Add LTO to AVIF

view details

Ingvar Stepanyan

commit sha 368ad9505edcd38e7e23728784181a8e8b59acf2

Use make -j in AVIF

view details

Ingvar Stepanyan

commit sha 1baa823d7722a9122b94e001aa17c10345d76b31

Upgrade libavif

view details

Ingvar Stepanyan

commit sha 0ac3d179691b8f292d6895f4a80cf34b71cae425

Move AOM cloning to napa

view details

Ingvar Stepanyan

commit sha 2edb8cbd7e6c4dab0c038a1115a5cf2e2e467b29

Upgrade AVIF decoding code

  • Update to newer APIs.
  • Avoid manual pixel-by-pixel copy in favour of decoding directly to desired format & bit depth.
  • Avoid use-after-free by cloning the Uint8Array Wasm memory view into a JS-owned Uint8Array right away.

view details

Surma

commit sha ee99cf6e0b86717f2e3e9c75917a2dcb235d638f

Move to makefile for AVIF

view details

Surma

commit sha 7ffa45ba86d58d0a3b704316f2953309d8382996

Update libavif and libaom

view details

Surma

commit sha 53298a23ade0b4557da5329b3a82b2c78d6ed826

Remove package.json and move git to Makefile

view details

Surma

commit sha 15dac42a7f2f10b01c80fadbc0dcc9c4747c6844

Remove stray files

view details

Surma

commit sha 789366067986f66d790e5ec2192fb25f3d0e4b1f

Fix alpha channel in encoder

view details

Surma

commit sha f5ab9a9a59e31422d9e088a1e2fef27e98e1e32c

Remove CFLAGS and improve git folder targets

view details

Surma

commit sha 3c92f2d531f23723afc16299c53449e4cc4906af

Download sources as tar.gz

view details

push time in 10 days

push eventGoogleChromeLabs/squoosh

Jake Archibald

commit sha 19a67b3472c245a001732d0c8d7a6dd11bd8c4dc

wip

view details

push time in 10 days

push eventGoogleChromeLabs/squoosh

Jake Archibald

commit sha d7432199d829506d1df96619a2c53fe070aab70c

Updated webp enc & dec

view details

push time in 10 days

push eventGoogleChromeLabs/squoosh

Velu S Gautam

commit sha dfee848a397186d98747ee1dfc254b8d9126a7a9

Update example.html (#827)

rawImage is a Uint8ClampedArray and doesn't have width and height properties.

view details

push time in 10 days

PR merged GoogleChromeLabs/squoosh

Image height and width fix.

rawImage is a Uint8ClampedArray and doesn't have width and height properties.

+1 -1

1 comment

1 changed file

velusgautam

pr closed time in 10 days

issue commentwhatwg/infra

Named arguments and especially named optional arguments

function someFunction({ foo, bar }) {
  // I think it's fair to call `foo` and `bar` variables here.
  console.log(foo, bar);
}

someFunction({ foo: 'hello', bar: 'world' });
// I don't think it's right to call `foo` and `bar` variables here.

Within a function, I think it's fair to treat named arguments as variables. However, I don't think named parameters are variables. They aren't declared, they don't have a scope, and they can't be set after declaration.

domenic

comment created time in 11 days

push eventGoogleChromeLabs/squoosh

Jake Archibald

commit sha d3d3cfd482ce035c01198ef4ebedea474807ff34

oops

view details

Jake Archibald

commit sha 8a57b0edb7a7f8978be7250d5c1e5c7e51f29619

Move image quant

view details

push time in 11 days

issue commentwhatwg/infra

Named arguments and especially named optional arguments

Yeah, I was thinking more about the immediate problem if people start using this in today's specs without <call>.

What's the benefit of using <var> here at all? It isn't a variable, and it already links to the definition, so it seems correct semantically.

domenic

comment created time in 11 days

push eventw3c/ServiceWorker

Deployment Bot (from Travis CI)

commit sha 701bf705caa4e910beb3be39ad2336b98927b9a7

Deploy w3c/ServiceWorker to github.com/w3c/ServiceWorker.git:gh-pages

view details

push time in 11 days

push eventw3c/ServiceWorker

Domenic Denicola

commit sha eab774ec655c4292925f54df567335f41140274a

Use new Streams algorithms (#1533)

  • Use new Streams algorithms. Follows https://github.com/whatwg/streams/pull/1073.
  • Namespacing tweaks
  • No subsubsteps
    Co-authored-by: Jake Archibald <jaffathecake@gmail.com>
  • Missing ::
    Co-authored-by: Jake Archibald <jaffathecake@gmail.com>

view details

push time in 11 days

PR merged w3c/ServiceWorker

Use new Streams algorithms

Follows https://github.com/whatwg/streams/pull/1073.

Do not merge until that PR is merged. But, feel free to review.

See also https://github.com/tabatkins/bikeshed/issues/1758.

<a href="https://pr-preview.s3.amazonaws.com/domenic/ServiceWorker/pull/1533.html" title="Last updated on Sep 15, 2020, 2:16 PM UTC (a483802)">Preview</a> | <a href="https://pr-preview.s3.amazonaws.com/w3c/ServiceWorker/1533/e4c2953...domenic:a483802.html" title="Last updated on Sep 15, 2020, 2:16 PM UTC (a483802)">Diff</a>

+22 -15

1 comment

1 changed file

domenic

pr closed time in 11 days

push eventGoogleChromeLabs/squoosh

Jake Archibald

commit sha 3be85aaf649c768be9180b72a821b041267a7de5

Just one flatten

view details

push time in 11 days

push eventGoogleChromeLabs/squoosh

Jake Archibald

commit sha 5eb64aae9731bc963153920f4aee732e65600655

wip

view details

Jake Archibald

commit sha c1261229509c2379a790acc195f6abd55f250b23

Comlink all working now

view details

Jake Archibald

commit sha 5b63150e4d5977f71f98d397daa0a556654fe5e0

Deleting autogenerated file

view details

Jake Archibald

commit sha 98fb5a53b0941ad05c7101a2a38071b73a391cb5

Worker autogeneration

view details

push time in 11 days

delete branch GoogleChromeLabs/squoosh

delete branch : fix-avif-build

delete time in 11 days

push eventGoogleChromeLabs/squoosh

Jake Archibald

commit sha a437afdf2bb06ec8fc8f328d3a1f9fad7d9e61cd

Update AVIF build to produce shipped wasm (#823)

  • argh
  • It works!
  • Silly me
  • Changes following feedback

view details

push time in 11 days

PR merged GoogleChromeLabs/squoosh

Reviewers
Update AVIF build to produce shipped wasm

Fixes #805.

+111 -48

4 comments

2 changed files

jakearchibald

pr closed time in 11 days

issue closedGoogleChromeLabs/squoosh

Smaller AVIF wasms

Right now, both the encoder and the decoder contain both the encoder and decoder.

If I build the encoder with CONFIG_AV1_DECODER=0 and CONFIG_AV1_HIGHBITDEPTH=0 in the AOM build, it knocks around 200k off the wasm.

If I build the decoder with CONFIG_AV1_ENCODER=0, it knocks almost 1mb off the wasm.

It complains about missing symbols, so I included -s ERROR_ON_UNDEFINED_SYMBOLS=0 in the emscripten part, but it seems to function correctly.

I'm not familiar enough with MAKE etc to automate this. @surma @RReverser can you take a look? Best to do it on the avif-options branch, as I've made some changes to the cpp there.

closed time in 11 days

jakearchibald

Pull request review commentw3c/ServiceWorker

Use new Streams algorithms

 spec: webappsec-referrer-policy; urlPrefix: https://w3c.github.io/webappsec-refe                 1. Let |done| be false.                 1. Let |potentialResponse| be a copy of |response|'s associated [=Response/response=], except for its [=response/body=].                 1. If |response|'s [=response/body=] is non-null, run these substeps:-                    1. Let |reader| be the result of [=get a reader|getting a reader=] from |response|'s [=response/body=]'s [=stream=].+                    1. Let |reader| be the result of [=ReadableStream/getting a reader=] from |response|'s [=response/body=]'s [=stream=].+                    1. Let |pullAlgorithm| be an action that runs these subsubsteps:+                        1. Let |readRequest| be a new [=read request=] with the following [=struct/items=]:+                            : [=read request/chunk steps=], given |chunk|+                            ::+                                1. Assert: |chunk| is a {{Uint8Array}}.+                                1. Append the bytes represented by |chunk| to |bytes|.+                                1. Perform ! [=DetachArrayBuffer=](|chunk|.\[[ViewedArrayBuffer]]).+                            : [=read request/close steps=]+                            ::+                                1. Set |end-of-body| to true.+                            : [=read request/error steps=]+                                1. [=ReadableStream/error=] |newStream| with a {{TypeError}}.+                        1. [=ReadableStreamDefaultReader/Read a chunk=] from |reader| given |readRequest|.+                    1. Let |cancelAlgorithm| be an action that [=ReadableStreamDefaultReader/cancels=] |reader|.                     1. Let |highWaterMark| be a non-negative, non-NaN number, chosen by the user agent.                     1. Let |sizeAlgorithm| be an algorithm that accepts a [=chunk=] object and returns a non-negative, non-NaN, non-infinite number, chosen by the user agent.-                    1. Let |pull| be an action that runs these subsubsteps:-                        1. Let |promise| be the result of [=read a chunk|reading a chunk=] from |response|'s [=response/body=]'s [=stream=] with |reader|.-                        1. When |promise| is fulfilled with an object whose `done` property is false and whose `value` property is a `Uint8Array` object, append the bytes represented by the `value` property to |bytes| and perform ! [=DetachArrayBuffer=] with the `ArrayBuffer` object wrapped by the `value` property.-                        1. When |promise| is fulfilled with an object whose `done` property is true, set |end-of-body| to true.-                        1. When |promise| is fulfilled with a value that matches with neither of the above patterns, or |promise| is rejected, [=ReadableStream/error=] |newStream| with a `TypeError`.-                    1. Let |cancel| be an action that [=ReadableStream/cancels=] |response|'s [=response/body=]'s [=stream=] with |reader|.-                    1. Let |newStream| be the result of [=ReadableStream/construct a ReadableStream object=] with |highWaterMark|, |sizeAlgorithm|, |pull|, and |cancel| in |targetRealm|.+                    1. 
Let |newStream| be the result of [=ReadableStream/creating=] a {{ReadableStream}} with <a for=ReadableStream/create><var ignore>pullAlgorithm</var></a> set to |pullAlgorithm|, <a for=ReadableStream/create><var ignore>cancelAlgorithm</var></a> set to |cancelAlgorithm|, <a for=ReadableStream/create><var ignore>highWaterMark</var></a> set to |highWaterMark|, and <a for=ReadableStream/create><var ignore>sizeAlgorithm</var></a> set to |sizeAlgorithm|, in |targetRealm|.

I'm ok with this, but in general I don't see the need for the vars here.

When we "Set document's foo to bar", we don't wrap foo in a <var>, so the same should apply here.

domenic

comment created time in 11 days

issue commentwebscreens/window-placement

Putting the event target behind a permission

Nice! If general window placement is possible there, then it seems like this spec has it covered.

jakearchibald

comment created time in 11 days

Pull request review commentw3c/ServiceWorker

Use new Streams algorithms

 spec: webappsec-referrer-policy; urlPrefix: https://w3c.github.io/webappsec-refe                 1. Let |done| be false.                 1. Let |potentialResponse| be a copy of |response|'s associated [=Response/response=], except for its [=response/body=].                 1. If |response|'s [=response/body=] is non-null, run these substeps:-                    1. Let |reader| be the result of [=get a reader|getting a reader=] from |response|'s [=response/body=]'s [=stream=].+                    1. Let |reader| be the result of [=ReadableStream/getting a reader=] from |response|'s [=response/body=]'s [=stream=].+                    1. Let |pullAlgorithm| be an action that runs these subsubsteps:+                        1. Let |readRequest| be a new [=read request=] with the following [=struct/items=]:+                            : [=read request/chunk steps=], given |chunk|+                            ::+                                1. Assert: |chunk| is a {{Uint8Array}}.+                                1. Append the bytes represented by |chunk| to |bytes|.+                                1. Perform ! [=DetachArrayBuffer=](|chunk|.\[[ViewedArrayBuffer]]).+                            : [=read request/close steps=]+                            ::+                                1. Set |end-of-body| to true.+                            : [=read request/error steps=]+                                1. [=ReadableStream/error=] |newStream| with a {{TypeError}}.

Fair enough. Add in the :: then it's good to go.

domenic

comment created time in 11 days

issue commentwebscreens/window-placement

Putting the event target behind a permission

This spec is about aiding placement of browser windows on auxiliary screens.

jakearchibald

comment created time in 11 days

issue commentwebscreens/window-placement

Putting the event target behind a permission

Is window placement typically possible in those situations?

As you can attach external displays and projectors, yes it is still very relevant.

I meant, can you place a Chrome window on your trackpad?

jakearchibald

comment created time in 11 days

issue commentwebscreens/window-placement

Putting the event target behind a permission

There is a bit for that https://webscreens.github.io/window-placement/#dom-screeninfo-internal, but it doesn't seem to be documented.

jakearchibald

comment created time in 11 days
