Rod Vagg (rvagg) · require.io · NSW, Australia · http://r.va.gg · Awk Ninja; Yak Shaving Rock Star

chakra-core/ChakraCore 8593

ChakraCore is an open source JavaScript engine with a C API.

ded/reqwest 2919

browser asynchronous http requests

fat/bean 1388

an events api for javascript

ded/bonzo 1329

library agnostic, extensible DOM utility

ded/qwery 1108

a query selector engine

microsoft/node-v0.12 792

Enable Node.js to use Chakra as its JavaScript engine.

ded/morpheus 499

A Brilliant Animator

justmoon/node-bignum 416

Big integers for Node.js using OpenSSL

deoxxa/npmrc 406

Switch between different .npmrc files with ease and grace.

isaacs/st 373

A node module for serving static files. Does etags, caching, etc.

pull request comment ipfs/go-graphsync

feat: add basic tracing for responses (WIP)

The test failures are all from the original tracing tests not understanding these new traces, but they're instructive to look at to see what's been added, e.g.:

TestGraphsyncRoundTrip:

          	            	 (string) (len=38) "message(0)->request(0)->executeTask(0)",
          	            	 (string) (len=10) "message(1)",
          	            	 (string) (len=25) "request(0)->newRequest(0)",
          	            	 (string) (len=31) "request(0)->terminateRequest(0)",
          	            	 (string) (len=26) "request(0)->executeTask(0)"

TestGraphsyncRoundTripRequestBudgetRequestor:

          	            	 (string) (len=39) "message(0)->request(0)->abortRequest(0)",
          	            	 (string) (len=25) "request(0)->newRequest(0)",
          	            	 (string) (len=31) "request(0)->terminateRequest(0)",
          	            	 (string) (len=26) "request(0)->executeTask(0)",
          	            	 (string) (len=10) "message(1)"
          	            	 (string) (len=38) "message(0)->request(0)->executeTask(0)",
          	            	 (string) (len=39) "message(0)->request(0)->abortRequest(0)",
          	            	 (string) (len=38) "message(0)->request(0)->executeTask(1)",
          	            	 (string) (len=25) "request(0)->newRequest(0)",
          	            	 (string) (len=31) "request(0)->terminateRequest(0)",
          	            	 (string) (len=26) "request(0)->executeTask(0)"

And TestGraphsyncRoundTripIgnoreNBlocks has lots of message spans that go into the (so far) untraced requestManager.ProcessResponses() call.

rvagg

comment created time in 2 hours

PR opened ipfs/go-graphsync

feat: add basic tracing for responses (WIP)

This works so far for the responseManager.ProcessRequests() path, but it's pretty complicated. The two main problems, building on what I mentioned at https://github.com/ipfs/go-graphsync/pull/290/files#r760950285 about the branching inside GraphSync#ReceiveMessage() and wanting a base span that lives for the entirety of an incoming message, are:

  • Even taking the responseManager path in isolation, each "message" can have multiple requests, each of which becomes detached from the others for processing; so making a "message" span live across all of these is going to take some tricky tracking, or a special wrapper that can decrement calls to "End" and call a true "End" when it reaches zero (sketched below, after this list).
  • The fact that an incoming message can take two paths, one to responseManager and one to requestManager (the latter with a fairly complicated journey), doubles the pain of giving a parent "message" span a discrete end.
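
The wrapper idea is language agnostic (go-graphsync itself is Go); here's a minimal JavaScript sketch of the decrement-to-zero behaviour, where all names are hypothetical and span is anything with an end() method, such as an OpenTelemetry span:

    // wrap a span so the real end() fires only after every holder has called end()
    function refCountedSpan (span, refs = 1) {
      return {
        ref () { refs++ },                      // a detached request takes a reference
        end () { if (--refs === 0) span.end() } // the true "End" fires at zero
      }
    }

    // demo with a stand-in span
    const message = refCountedSpan({ end: () => console.log('true End()') })
    message.ref() // request 0 detaches for processing
    message.ref() // request 1 detaches for processing
    message.end() // message-level processing finished, two refs remain
    message.end() // request 0 done
    message.end() // request 1 done, the count hits zero, the real end() fires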

Perhaps a parent "message" span isn't what we want, and we should just have a parent "request" span for each of the requests that an individual message can trigger? That's a bit unfortunate because then we can't have a span that lives for the lifetime of a call through the top-level GraphSync entry point, but maybe that's OK?

Currently I have the parent "message" span live until all child "request" spans have been started, so there'll be a parent->child relationship, but the parent won't live much longer after the children are started. I bet different trace-viewing tools will deal with that in their own unique ways, but I reckon they'll still retain the hierarchy.

+154 -24

0 comments

8 changed files

pr created time in 2 hours

create branch ipfs/go-graphsync

branch: rvagg/tracing

created branch time in 2 hours

Pull request review comment ipfs/go-graphsync

chore: short-circuit unnecessary message processing

 func (gsr *graphSyncReceiver) ReceiveMessage(
 	ctx context.Context,
 	sender peer.ID,
 	incoming gsmsg.GraphSyncMessage) {
-	gsr.graphSync().responseManager.ProcessRequests(ctx, sender, incoming.Requests())
-	gsr.graphSync().requestManager.ProcessResponses(sender, incoming.Responses(), incoming.Blocks())
+
+	requests := incoming.Requests()
+	responses := incoming.Responses()
+	blocks := incoming.Blocks()
+
+	if len(requests) > 0 {
+		gsr.graphSync().responseManager.ProcessRequests(ctx, sender, incoming.Requests())
+	}
+	if len(responses) > 0 || len(blocks) > 0 {

@hannahhoward could this possibly be made exclusive, i.e. an else if, or an assertion that errors, on the basis that if there are requests then there shouldn't be responses or blocks?

It'd be nice to plumb a trace through this function, but the "end" gets even trickier to determine when it forks off in these two directions, potentially for the same trace.

rvagg

comment created time in 2 hours

PullRequestReviewEvent

issue closed rvagg/github-webhook-handler

123

closed time in 3 hours

poguyer

issue closed rvagg/github-webhook-handler

test2

closed time in 3 hours

poguyer

PR opened ipfs/go-graphsync

chore: short-circuit unnecessary message processing

@hannahhoward I noticed this when looking at some new traces I'm starting from ReceiveMessage. The majority of messages that come in through here don't have any requests, so going into responseManager.ProcessRequests puts them into the message handling loop, which eventually hits a for loop that iterates over an empty slice and does nothing else. On the other side, requestManager.ProcessResponses has the same thing, but with even more branches that hit various for loops to collect and process things over an empty slice when there are no responses or blocks.

So, a minor optimisation, but it seems worthwhile given how chatty this call is.

+11 -2

0 comments

1 changed file

pr created time in 6 hours

create branch ipfs/go-graphsync

branch: rvagg/message-shunting

created branch time in 7 hours

push event multiformats/js-multiformats

MF416

commit sha 496af8cba4545a5d128ea03baa0095d8f7825c6f

doc: extract runnable examples from README + update dependencies (#135)

view details

push time in 10 hours

PR merged multiformats/js-multiformats

doc: extract runnable examples from README + update dependencies

Extracted and updated 4 examples from the README file:

  • cid-interface.js: consolidates examples on basic CID creation, multibase interface
  • block-interface.js: block interface example from the README with additional outputs and checks
  • multicodec-interface.js: borrows README multicodec example, no changes
  • multihash-interface.js: expands on multihash examples

Additionally, updated the '@ipld/dag-cbor' dependency, which was throwing errors in block-interface.js

+164 -0

1 comment

4 changed files

MF416

pr closed time in 10 hours

PullRequestReviewEvent

created tag rvagg/list-stream

tag v2.1.0

Collect chunks / objects from a readable stream, write objects / chunks to a writable stream

created time in 10 hours

push event rvagg/list-stream

Rod Vagg

commit sha 51edc2aff648404538a82349b28767d7ee6904c3

2.1.0

view details

push time in 10 hours

push event rvagg/list-stream

Daniel Roe

commit sha 36261c77f18e5e5da0542430cf2149fbe6c66a82

fix: return `this` from `end()` (#3)

view details

push time in 10 hours

PR merged rvagg/list-stream

fix: return `this` from `end()`

For consistency with the Node Duplex implementation, end() should likely return `this`: https://nodejs.org/api/stream.html#writableendchunk-encoding-callback.

Context: https://github.com/DefinitelyTyped/DefinitelyTyped/pull/57473
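
A minimal sketch of the change (the real list-stream uses the older prototype style; this stand-in class is hypothetical): end() forwards to the stream's own end() and then returns `this` so calls can chain, matching stream.Writable.

    import { Duplex } from 'stream'

    class ListStream extends Duplex {
      _read () {}
      _write (chunk, encoding, callback) { callback() }
      end (...args) {
        super.end(...args)
        return this // match stream.Writable, whose end() returns the stream
      }
    }

    console.log(new ListStream().end() instanceof ListStream) //> true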

+2 -0

0 comments

1 changed file

danielroe

pr closed time in 10 hours

PullRequestReviewEvent

issue closed ipld/js-dag-cbor

Handling of `undefined` for old blocks

We at Ceramic used the legacy codec to store data, back when it was the only option. Apparently, some of the data contains undefined. The new js-ipfs release can retrieve such a block, but cannot decode it, as undefined is not allowed in the codec anymore. IMO, it is a bug to so drastically change the code and not make it visible to a general audience. You know, SemVer was created for a reason.

As ipfs is released with this dag-cbor codec, which is more strict than the legacy dag-cbor codec, I am wondering what the plan is to accommodate old blocks created by the legacy codec. Could you lift the restrictions on undefined, maybe, so that old IPFS records could still be properly decoded?

closed time in 10 hours

ukstv

issue comment ipld/js-dag-cbor

Handling of `undefined` for old blocks

Sorry, but undefined should never have been allowed, and was only possible because it used https://github.com/dignifiedquire/borc which would even let you encode Dates, and its own custom BigNumber objects as well. The IPLD data model has a strict set of kinds (https://ipld.io/docs/data-model/kinds/) that work throughout systems that support IPLD, and undefined will break in many places. Your existing data with undefined is going to break if you pass it through Go IPLD / IPFS code, for instance.

You know, SemVer was created for a reason.

Snark isn't particularly helpful. And while I agree, and would prefer that js-ipfs were past 1.0 by now so we could do proper signalling, that's a decision made by others, and they are currently using pre-1.0 semver semantics, which do allow for breaking changes in minor releases. So technically it is following SemVer.

Could you lift the restrictions for undefined, maybe, so that old IPFS records could be properly decoded still?

You're welcome to do this now by using the block API to fetch the raw bytes and using the older DAG-CBOR codec, or just use a CBOR decoder directly (borc) and deal with the CID tags yourself. If you really wanted to, you could even fork this repo and replace the CBOR decoder with borc to make it behave how you want, then force js-ipfs to use it instead of this codec whenever it encounters a DAG-CBOR block to decode (I believe you can override existing codecs when you supply new ones that conflict). In fact, it's probably easier than that, since you may be able to just undo https://github.com/ipld/js-dag-cbor/blob/master/index.js#L88 and the undefineds will flow freely.
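
A rough sketch of the raw-bytes route, assuming a js-ipfs instance whose ipfs.block.get(cid) resolves to the block's raw bytes (the helper name is hypothetical, and the CID tags, tag 42, are left for the caller to handle):

    import cbor from 'borc'

    // fetch the raw block bytes and decode leniently, undefineds included
    async function decodeLegacyBlock (ipfs, cid) {
      const bytes = await ipfs.block.get(cid)
      return cbor.decode(bytes)
    }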

It's really important that you ensure new data doesn't get encoded with undefined; it'll be treated as a bad block by the newer JS stack, and the Go stack is the same, and since Go runs most of the backend infra around the world for IPFS, that's bad news: https://github.com/polydawn/refmt/blob/30ac6d18308e584ca6a2e74ba81475559db94c5f/cbor/cborDecoder.go#L213-L218

Sorry for the annoyance on this; clarifying the boundaries of our codecs and ensuring full compatibility across the stack has been a long journey. Hopefully the current level of clarity helps, though.

ukstv

comment created time in 10 hours

issue comment nodejs/build

`release-nearform-macos10.15-x64-1` is offline

There's a chance this is blocked on needing direct access to the machine, some prompt wanting a real keyboard to acknowledge something. I've had this problem running VMs remotely, and it doesn't just do this for keychain access but at other points too. Can we get someone @ nearForm to look at the terminal for us? Are we set up for that, or are these fully remote?

richardlau

comment created time in 10 hours

issue comment nodejs/snap

Corepack not available

OK, so this is probably going to be complicated.

`PATH=$PATH:/snap/node/current/bin corepack enable` will work, because your error there is that it can't find corepack in the PATH, but you'll then get another error because it's going to want to set up binaries in the Snap's bin directory, which it's not allowed to touch. You could run `PATH=$PATH:/snap/node/current/bin corepack enable --install-directory=/usr/local/bin` to install them there instead.

I'm not sure the corepack model is going to be suitable for the Snap because of this need to install in a different path. It's just going to confuse users I suspect.

yarn and yarnpkg are already installed with the Snap, along with npm and npx. So I think only pnpm and pnpx are missing from what corepack offers? If these are genuinely worthwhile to you, and maybe to other users, then you could help get them installed straight into the Snap itself, making corepack unnecessary.

Unfortunately it's not a straightforward task. Have a look at https://github.com/nodejs/snap/blob/HEAD/snapcraft.yaml.sh, which generates https://github.com/nodejs/snap/blob/HEAD/snapcraft.yaml for each release line. If you get yourself set up to develop Snaps locally you can build them straight from these configs and try them out on your local machine. Have a look in the file for yarn and you'll see there's a couple of things that need doing: (1) download and install the package into the right place, and (2) set up aliases for /snap/bin; we would have to get official permission from Snapcraft to enable pnpm and pnpx as globals for the package, but that shouldn't be a problem I think.

hterik

comment created time in a day

issue comment filecoin-project/go-fil-markets

CBOR map order for named fields in protocol v1.1.0

See also https://github.com/whyrusleeping/cbor-gen/pull/42 (followed by https://github.com/whyrusleeping/cbor-gen/pull/56) where I've tried to push this before.

It's not the core CBOR standard we should be following, but the DAG-CBOR standard, where we're very explicit about this: https://ipld.io/specs/codecs/dag-cbor/spec/#strictness

Unfortunately this ship sailed at Filecoin launch and changing this now requires a bit more work. I'd love to see this done, however.
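
For reference, the key-ordering part of that strictness section as a minimal JavaScript sketch (string comparison stands in for bytewise comparison here, which matches for ASCII keys): map keys sort length-first, shorter keys first, with ties broken bytewise.

    // DAG-CBOR map key ordering: shorter keys first, equal lengths compared bytewise
    const sortDagCborKeys = (keys) =>
      keys.slice().sort((a, b) =>
        a.length - b.length || (a < b ? -1 : a > b ? 1 : 0))

    console.log(sortDagCborKeys(['Payload', 'ID', 'Type']))
    //> [ 'ID', 'Type', 'Payload' ]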

Alexey-N-Chernyshov

comment created time in a day

pull request comment multiformats/js-multiformats

doc: extract runnable examples from README + update dependencies

Also, check the results of the CI run I've just initiated; there are some linting issues to address. You can run the checks locally with `npm run lint` to get it happy.

MF416

comment created time in a day

PullRequestReviewEvent

Pull request review comment multiformats/js-multiformats

doc: extract runnable examples from README + update dependencies

         "types/*"       ]     }+  },+  "dependencies": {+    "@ipld/dag-cbor": "^6.0.14"

we'd best not add this at the top level or every user of multiformats will unnecessarily get it, and we're trying to keep the package as minimal as possible
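
Presumably (an assumption; the thread doesn't say where it should go instead) the dependency belongs under devDependencies, so example authors get it but consumers don't:

    {
      "devDependencies": {
        "@ipld/dag-cbor": "^6.0.14"
      }
    }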

MF416

comment created time in a day

Pull request review comment multiformats/js-multiformats

doc: extract runnable examples from README + update dependencies

+// Example of multicodec implementation for JSON (UTF-8-encoded)
+// Codec implementations should conform to the BlockCodec interface which implements both BlockEncoder and BlockDecoder
+
+/**
+ * @template T
+ * @type {BlockCodec<0x0200, T>}
+ */
+ export const { name, code, encode, decode } = {
+    name: 'json',
+    code: 0x0200,
+    encode: json => new TextEncoder().encode(JSON.stringify(json)),
+    decode: bytes => JSON.parse(new TextDecoder().decode(bytes))
+  }

newline needed here for consistency with other files

  }
MF416

comment created time in a day

Pull request review comment multiformats/js-multiformats

doc: extract runnable examples from README + update dependencies

+import assert from 'assert'
+import { CID } from 'multiformats/cid'
+import * as json from 'multiformats/codecs/json'
+import { sha256 } from 'multiformats/hashes/sha2'
+import { base64 } from "multiformats/bases/base64"
+
+// ** PART 1: CREATING A NEW CID **
+
+// Arbitrary input value
+const value = { hello: "world"}
+
+// Uint8array representation
+const bytes = json.encode(value)
+
+// Hash Uint8array representation
+const hash = await sha256.digest(bytes)
+
+// Create CID (default base32)
+const cid = CID.create(1, json.code, hash)
+
+cid.code // 512 (JSON codec)
cid.code // 512 (0x0200) JSON IPLD codec
MF416

comment created time in a day

Pull request review comment multiformats/js-multiformats

doc: extract runnable examples from README + update dependencies

+import * as Block from 'multiformats/block'
+import * as codec from '@ipld/dag-cbor'
+import { sha256 as hasher } from 'multiformats/hashes/sha2'
+
+const value = { hello: 'world' }
+
+// encode a block
+let block = await Block.encode({ value, codec, hasher })
+
+block.value // { hello: 'world' }
+block.bytes // Uint8Array
+block.cid   // CID() w/ sha2-256 hash address and dag-cbor codec
+
+console.log("Example block CID: " + block.cid.toString())
+
+// you can also decode blocks from their binary state
+let block2 = await Block.decode({ bytes: block.bytes, codec, hasher })
+
+// check for equivalency using cid interface
+console.log("Example block CID equal to decoded binary block: " + block.cid.equals(block2.cid))
+
+// if you have the cid you can also verify the hash on decode
+let block3 = await Block.create({ bytes: block.bytes, cid: block.cid, codec, hasher })
+
+
let block3 = await Block.create({ bytes: block.bytes, cid: block.cid, codec, hasher })

extraneous newlines at end of file

MF416

comment created time in a day

Pull request review comment multiformats/js-multiformats

doc: extract runnable examples from README + update dependencies

+import assert from 'assert'
+import { CID } from 'multiformats/cid'
+import * as json from 'multiformats/codecs/json'
+import { sha256 } from 'multiformats/hashes/sha2'
+import { base64 } from "multiformats/bases/base64"
+
+// ** PART 1: CREATING A NEW CID **
+
+// Arbitrary input value
+const value = { hello: "world"}
+
+// Uint8array representation
// Encoded Uint8array representation of `value` using the plain JSON IPLD codec
MF416

comment created time in a day

Pull request review comment multiformats/js-multiformats

doc: extract runnable examples from README + update dependencies

+import assert from 'assert'
+import { CID } from 'multiformats/cid'
+import * as json from 'multiformats/codecs/json'
+import { sha256 } from 'multiformats/hashes/sha2'
+import { base64 } from "multiformats/bases/base64"
+
+// ** PART 1: CREATING A NEW CID **
+
+// Arbitrary input value
+const value = { hello: "world"}
+
+// Uint8array representation
+const bytes = json.encode(value)
+
+// Hash Uint8array representation
+const hash = await sha256.digest(bytes)
+
+// Create CID (default base32)
+const cid = CID.create(1, json.code, hash)
+
+cid.code // 512 (JSON codec)
+cid.version // 1
+cid.multihash // digest, including code (18 for sha2-256), digest size (32 bytes)
+cid.bytes // byte representation
+
+console.log("Example CID: " + cid.toString())
+//> 'bagaaierasords4njcts6vs7qvdjfcvgnume4hqohf65zsfguprqphs3icwea'
+
+
+
+// ** PART 2: MULTIBASE ENCODERS / DECODERS / CODECS **
+
+// Encode CID from part 1 to base64, decode back to base32
+const cid_base64 = cid.toString(base64.encoder)
+console.log("base64 encoded CID: " + cid_base64)
+// 'mAYAEEiCTojlxqRTl6svwqNJRVM2jCcPBxy+7mRTUfGDzy2gViA'
+
+const cid_base32 = CID.parse(cid_base64, base64.decoder)
+//> 'bagaaierasords4njcts6vs7qvdjfcvgnume4hqohf65zsfguprqphs3icwea'
+
+// test decoded CID against original
+assert.strictEqual(cid_base32.toString(), cid.toString(), "Warning: decoded base32 CID does not match original")
+console.log("Decoded CID equal to original base32: " + cid_base32.equals(cid)) // alternatively, use more robust built-in function to test equivalence
+
+// Multibase codec exposes both encoder and decoder properties
+cid.toString(base64)
+CID.parse(cid.toString(base64), base64)
+
+
+
+// ** PART 3: CID BASE CONFIGURATIONS **
+
+// CID v1 default encoding is base32
+const v1 = CID.parse('bagaaierasords4njcts6vs7qvdjfcvgnume4hqohf65zsfguprqphs3icwea')
+v1.toString()
+//> 'bagaaierasords4njcts6vs7qvdjfcvgnume4hqohf65zsfguprqphs3icwea'
+
+// CID v0 default encoding is base58btc
+const v0 = CID.parse('QmdfTbBqBPQ7VNxZEYEj14VmRuZBkqFbiwReogJgS1zR1n')
+v0.toString()
+//> 'QmdfTbBqBPQ7VNxZEYEj14VmRuZBkqFbiwReogJgS1zR1n'
+v0.toV1().toString()
+//> 'bafybeihdwdcefgh4dqkjv67uzcmw7ojee6xedzdetojuzjevtenxquvyku'
+
+

extraneous newlines

//> 'bafybeihdwdcefgh4dqkjv67uzcmw7ojee6xedzdetojuzjevtenxquvyku'
MF416

comment created time in a day
