If you are wondering where this site's data comes from, please visit https://api.github.com/users/dleve123/events. GitMemory does not store any data; it only uses NGINX to cache data for a period of time. The idea behind GitMemory is simply to give users a better reading experience.
Daniel Levenson dleve123 New York, NY http://www.dlevenson.com Machine learning and web engineering in healthcare. Previously @the-commons-project, @artsy & co-founder @healthify. Currently studying ML @CornellTech

dleve123/.well-known 0

Specs and documentation for all DID-related /.well-known resources

dleve123/aptible-cli 0

Command line interface to Aptible

dleve123/artsy-passport 0

Wires up the common auth handlers for Artsy's http://ezeljs.com based apps using http://passportjs.org.

dleve123/bearden 0

A simple database of organizations

dleve123/book 0

The main docs for intermezzOS

dleve123/bootup-on-rails 0

A Demo of Bootup Baltimore's Online Inventory System

dleve123/code-corps-api 0

Elixir/Phoenix API for Code Corps.

dleve123/dleve123.github.io 0

Learn about me

issue comment smart-on-fhir/health-cards

Clarification re multiple QR codes

@jasonxylee It appears that you're missing a second / to denote N, the total number of numerically-encoded QR chunks that compose your SHC JWS.

Example from the docs:

in a longer JWS, the second chunk in a set of three might produce a QR code like shc:/2/3/56762909524320603460292437404460<snipped for brevity>

Note the /2/3/ component of this.
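
For illustration, here's a minimal sketch of how those chunked prefixes are assembled (the function names are made up, not from the health-cards codebase, and the chunk splitting is simplified relative to the spec's balanced-chunk guidance):

# Each JWS character c is numerically encoded as ord(c) - 45, zero-padded to two
# digits; each chunk is then prefixed with shc:/<chunk index>/<total chunks>/.
def numeric_encode(jws_fragment):
    return "".join("%02d" % (ord(c) - 45) for c in jws_fragment)

def to_qr_chunks(jws, total_chunks):
    size = -(-len(jws) // total_chunks)  # ceiling division for a simple even split
    pieces = [jws[i:i + size] for i in range(0, len(jws), size)]
    if total_chunks == 1:
        return ["shc:/" + numeric_encode(pieces[0])]
    return ["shc:/%d/%d/%s" % (i, total_chunks, numeric_encode(p))
            for i, p in enumerate(pieces, start=1)]

# e.g. the second entry of to_qr_chunks(jws, 3) starts with "shc:/2/3/"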

littleforest

comment created time in 6 days

PR opened minitorch/minitorch.github.io

Fix typo in module0

classfication -> classification

+1 -1

0 comment

1 changed file

pr created time in 9 days

push event dleve123/minitorch.github.io

Daniel Levenson

commit sha dbe078ff674a1c5e745373ca5444cf7df8a2c5fd

Fix typo in module0 classfication -> classification

push time in 9 days

PR closed minitorch/minitorch.github.io

Fix typo in modules.html

eaiser -> easier

+2 -2

2 comments

1 changed file

dleve123

pr closed time in 13 days

pull request comment minitorch/minitorch.github.io

Fix typo in modules.html

Closing in favor of #5

dleve123

comment created time in 13 days

PR opened minitorch/minitorch.github.io

Fix typo in modules.rst.txt

eaiser -> easier

+1 -1

0 comment

1 changed file

pr created time in 13 days

push event dleve123/minitorch.github.io

Daniel Levenson

commit sha 201cfd5e0629ade03479705bf998f96ad43e4bbe

Fix typo in modules.rst.txt eaiser -> easier

push time in 13 days

pull request comment minitorch/minitorch.github.io

Fix typo in modules.html

I see now that I might have changed the generated file; I will update the source as well.

dleve123

comment created time in 13 days

PR opened minitorch/minitorch.github.io

Fix typo in modules.html

eaiser -> easier

+2 -2

0 comment

1 changed file

pr created time in 13 days

push event dleve123/minitorch.github.io

Daniel Levenson

commit sha f7872bd3033d239c97a898fc48d43e47ab9050e0

Fix typo in modules.html eaiser -> easier

push time in 13 days

issue opened keras-team/keras-io

Clarify Keras Version in API Docs

Problem:

It seems like the API documentation at keras.io documents a version of the codebase that is more recent than the latest stable release of Keras installed by pip install tensorflow.

Let's take the documentation for StringLookup. One cool feature of StringLookup is the 'one_hot' output_mode. Here's an example from the documentation page:

One-hot output

Configure the layer with output_mode='one_hot'. Note that the first num_oov_indices dimensions in the one_hot encoding represent OOV values.


>>> vocab = ["a", "b", "c", "d"]
>>> data = tf.constant(["a", "b", "c", "d", "z"])
>>> layer = StringLookup(vocabulary=vocab, output_mode='one_hot')
>>> layer(data)
<tf.Tensor: shape=(5, 5), dtype=float32, numpy=
  array([[0., 1., 0., 0., 0.],
         [0., 0., 1., 0., 0.],
         [0., 0., 0., 1., 0.],
         [0., 0., 0., 0., 1.],
         [1., 0., 0., 0., 0.]], dtype=float32)>

Without being given any more information, I think it's reasonable to expect that the official Keras docs document the latest stable release of the package, but I don't think they do. Consider the following developer journey.

Developer Journey:

  • pip install tensorflow installs, at latest, TF v2.5.0 (https://pypi.org/project/tensorflow/#history)
  • Developer googles around for one-hot encoding layers in TF, ends up on the StringLookup docs, and learns about the one_hot output mode
  • TF 2.5.0's version of StringLookup does not include output_mode, as it was added in a commit after the 2.5 release was cut (see the sketch after this list)
  • Confusion/surprise!
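
A minimal sketch of how that surprise shows up locally (not taken from the issue; it assumes the experimental preprocessing path that StringLookup lived under around TF 2.5):

# Sketch: on a stock `pip install tensorflow` (TF 2.5.0 at the time), passing
# output_mode='one_hot' is expected to fail, either with a TypeError for an
# unrecognized keyword or a ValueError if the keyword exists but 'one_hot'
# is not yet an accepted value.
import tensorflow as tf

print(tf.__version__)  # e.g. 2.5.0

vocab = ["a", "b", "c", "d"]
try:
    layer = tf.keras.layers.experimental.preprocessing.StringLookup(
        vocabulary=vocab, output_mode="one_hot")
    print("one_hot output_mode is supported in this release")
except (TypeError, ValueError) as err:
    print("one_hot output_mode is not supported in this release:", err)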

Possible Solutions:

  • [probably hard] Document the latest stable release of Keras at keras.io -- I appreciate this might be tricky at the moment, considering the evolving code structure between TF and Keras.
  • [hopefully less hard] State the version of the dependency being documented.

I think Rails handles this hard problem particularly well. Their docs (https://www.google.com/search?q=rails+api+docs&safe=active&ssui=on):

  1. Include the version in the website header
  2. Persist older documentation releases which are accessible via a URL path (e.g. https://api.rubyonrails.org/v5.1/)

Thanks for the awesome library and let me know if/how I can be helpful here!

created time in a month

issue comment hanxiao/bert-as-service

TypeError: cannot unpack non-iterable NoneType object

After banging my head against the wall for a while on this issue, I was finally able to boot the server using:

  • Python 3.6.16
  • Tensorflow 1.15.0
  • bert-as-service 1.10.0
  • Linux 3.10 (RHEL)

$ uname -srv
Linux 3.10.0-862.2.3.el7.x86_64 #1 SMP Mon Apr 30 12:37:51 EDT 2018

From a new directory:

$ conda create -n foo python=3.6
$ conda activate foo
$ pip install tensorflow==1.15.0
$ pip install bert-serving-server==1.10.0   # provides bert-serving-start
$ mkdir bert-params
$ cd bert-params
$ wget https://storage.googleapis.com/bert_models/2018_10_18/cased_L-12_H-768_A-12.zip
$ unzip cased_L-12_H-768_A-12.zip
$ cd ..
$ bert-serving-start -model_dir bert-params/cased_L-12_H-768_A-12/

It's possible that one or both of these exact versions (Python 3.6.16, TF 1.15.0) are required, or that any 3.6.x with 1.15.x would work.

What I do know is that Python 3.7 with TF 1.15.X does not work.
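
For completeness, a quick client-side smoke test once the server reports it is ready (a sketch; it assumes the companion bert-serving-client package, which is not part of the commands above):

# Assumes: pip install bert-serving-client, and the server above running locally.
from bert_serving.client import BertClient

bc = BertClient()                  # connects to localhost:5555/5556 by default
vecs = bc.encode(['hello world'])
print(vecs.shape)                  # (1, 768) for a BERT-Base model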

abhishekkuber

comment created time in 2 months