If you are wondering where the data on this site comes from, please visit https://api.github.com/users/vikigenius/events. GitMemory does not store any data; it only uses NGINX to cache data for a period of time. The idea behind GitMemory is simply to give users a better reading experience.
Vikash vikigenius Waterloo, Ontario, CA CS Grad Student at University of Waterloo

vikigenius/conditional_text_generation 2

Adversarial Latent Space model for dialog generation

vikigenius/voidwoken 1

My dotfiles for void linux

vikigenius/allennlp 0

An open-source NLP research library, built on PyTorch.

vikigenius/allennlp-feedstock 0

A conda-smithy repository for allennlp.

vikigenius/allennlp-models 0

Officially supported AllenNLP models

vikigenius/BaySMM 0

Model for learning document embeddings along with their uncertainties

PR opened allenai/allennlp

Update transformers requirement from <4.7,>=4.1 to >=4.1,<4.8

Updates the requirements on transformers to permit the latest version.

Commits:

  • 7a6c9fa Release: v4.7.0
  • d6ea91c fix pt-1.9.0 `add_` deprecation (#12217)
  • 3a960c4 Support for torch 1.9.0 (#12224)
  • afdd9e3 Add link to the course (#12229)
  • 29b0aef Improve detr (#12147)
  • b56848c Pipeline update & tests (#12207)
  • 700cee3 [Docs] fixed broken link (#12205)
  • 255a17a Use yaml to create metadata (#12185)
  • 15ef0dc Enabling AutoTokenizer for HubertConfig. (#12198)
  • afa414d updated DLC images and sample notebooks (#12191)
  • Additional commits viewable at https://github.com/huggingface/transformers/compare/v4.1.0...v4.7.0
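The PR above widens the pip-style version constraint from `<4.7,>=4.1` to `>=4.1,<4.8` so the new 4.7.0 release is admitted. A minimal, dependency-free sketch of how such a range check works (the helper names here are hypothetical, not Dependabot's code; real tools use PEP 440 specifier parsing):

```python
# Hypothetical helpers illustrating a ">=lower,<upper" version-range check.

def parse_version(v):
    """Turn a dotted version string like '4.7.0' into a comparable int tuple."""
    return tuple(int(part) for part in v.split("."))

def in_range(version, lower, upper):
    """True if lower <= version < upper, mirroring '>=lower,<upper'."""
    return parse_version(lower) <= parse_version(version) < parse_version(upper)

# The widened bound ">=4.1,<4.8" now admits 4.7.0,
# while the old ">=4.1,<4.7" bound did not.
print(in_range("4.7.0", "4.1", "4.8"))  # True
print(in_range("4.7.0", "4.1", "4.7"))  # False
```

Tuple comparison gives the expected ordering here because `(4, 7, 0)` sorts after the shorter prefix `(4, 7)`, so `4.7.0` correctly fails a `<4.7` bound.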

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

+1 -1

0 comments

1 changed file

pr created time in 2 hours

pull request comment hlissner/doom-emacs

getting_started.org: Gentoo Linux install guide

You might want to mention that the xft USE flag must be enabled for app-editors/emacs in order for the fonts to render correctly.

See #4876

mjkalyan

comment created time in 10 hours

issue comment hlissner/doom-emacs

Broken icons on the dashboard

@savageCW

In order for the fonts to render correctly on Gentoo, you need to enable the xft USE flag for app-editors/emacs.

This can be done like so...

# echo "app-editors/emacs xft" >> /etc/portage/package.use/emacs
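The full workflow typically also requires rebuilding the package after changing its USE flags. A hedged sketch of the complete sequence, assuming standard Portage paths (`emerge` is Gentoo's package manager; run as root):

```shell
# Hypothetical end-to-end sketch; commands affect the real system.
# 1. Enable the xft USE flag for app-editors/emacs:
echo "app-editors/emacs xft" >> /etc/portage/package.use/emacs

# 2. Rebuild emacs so the new flag takes effect:
emerge --oneshot app-editors/emacs

# 3. Restart emacs; the dashboard icon fonts should now render.
```

Appending the flag alone changes only configuration; the rebuild in step 2 is what actually compiles emacs with Xft support.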
savageCW

comment created time in 10 hours

push event allenai/allennlp

epwalsh

commit sha a2e764348c73fea38fbe58dc18a6cdeed45c4b6b

change name of step

view details

push time in 14 hours

push event allenai/allennlp

Dirk Groeneveld

commit sha 462695791380da19fb7f812ecdfe31a418aec836

Fix duplicate line

view details

Dirk Groeneveld

commit sha 668022aa0553abd82bdba95e2d7eda825edd6b84

Easy access to the output dimension of an activation layer

view details

Dirk Groeneveld

commit sha 3194b2d409800f057fbb023e61a55685a59e9748

Take an ignore an attention mask in TransformerEmbeddings

view details

Dirk Groeneveld

commit sha ad41685fdd25cbd2730ee621c6ee7c5a2c6303c0

Make it so a pooler can be derived from a huggingface module

view details

Dirk Groeneveld

commit sha 7c457612afaaa4c4feb5e1a23b256433ce456711

Pooler that can load from a transformer module

view details

Dirk Groeneveld

commit sha 67041cbced6c8da2b3816fdc5c902bfbecd736cc

Changelog

view details

Akshita Bhagia

commit sha b684bf02e808e38147b469a4940ce4adf5c78e29

Update transformer_embeddings.py

view details

Dirk Groeneveld

commit sha 0df423ecda031dc1de21f2e1b6b240ecaaa884f3

Productivity through formatting

view details

Dirk Groeneveld

commit sha 77d189b96866c3903abf6a5b352795473081d1cf

Don't break positional arguments

view details

Dirk Groeneveld

commit sha 2aa1b721dad0fa92fc62b072680144d94e878806

Merge branch 'AkshitaB-patch-1' into TransformerToolkitUpdates

view details

Dirk Groeneveld

commit sha 453d6450e868e91c4ae748cebc84e866fff49869

Some mode module names

view details

Dirk Groeneveld

commit sha 32fda861feea4656518b610005c389aeacdbcdce

Remove _get_input_arguments()

view details

Dirk Groeneveld

commit sha 43a200b4c40eccb41cf37c2a4f42f8fc307843b2

Merge branch 'TransformerToolkitUpdates' into Tango

view details

Dirk Groeneveld

commit sha 76bc409c978a270145f0854d2866fb5562570be2

Fix previously broken merge

view details

Dirk Groeneveld

commit sha 8c5a8dabda7d66ce98a9330b7dae29aa79816dd4

Formatting

view details

push time in 14 hours

Pull request review comment allenai/allennlp

Transformer toolkit updates

+from typing import Dict, Optional, Any, Union, TYPE_CHECKING
+
+import torch
+
 from allennlp.common import FromParams
 from allennlp.modules.transformer.activation_layer import ActivationLayer

+if TYPE_CHECKING:
+    from transformers.configuration_utils import PretrainedConfig
+

 class TransformerPooler(ActivationLayer, FromParams):
+
+    _pretrained_relevant_module = ["pooler", "bert.pooler"]
+
     def __init__(
         self,
         hidden_size: int,
         intermediate_size: int,
+        activation: Union[str, torch.nn.Module] = "relu",
     ):
-        super().__init__(hidden_size, intermediate_size, "relu", pool=True)
+        super().__init__(hidden_size, intermediate_size, activation, pool=True)
+
+    @classmethod
+    def _get_input_arguments(

Removed!

dirkgr

comment created time in 14 hours
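The review above concerns making the pooler's activation configurable: the constructor accepts either a name like `"relu"` or an already-built module. A dependency-free sketch of that accept-name-or-callable pattern (the registry and helper below are hypothetical; AllenNLP's real code resolves names to torch.nn modules):

```python
import math

# Hypothetical registry: activation name -> plain Python callable.
_ACTIVATIONS = {
    "relu": lambda x: max(0.0, x),
    "tanh": math.tanh,
}

def resolve_activation(activation):
    """Accept either a registry name (str) or an already-constructed callable."""
    if callable(activation):
        return activation          # caller supplied the function directly
    return _ACTIVATIONS[activation]  # look the name up in the registry

print(resolve_activation("relu")(-3.0))    # 0.0
print(resolve_activation(math.tanh)(0.0))  # 0.0
```

Defaulting the parameter to `"relu"` keeps existing call sites working while letting new callers swap in any activation, which is exactly what the diff's change to `super().__init__` enables.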

push event allenai/allennlp

Dirk Groeneveld

commit sha 32fda861feea4656518b610005c389aeacdbcdce

Remove _get_input_arguments()

view details

push time in 14 hours

push event allenai/allennlp

Dirk Groeneveld

commit sha 453d6450e868e91c4ae748cebc84e866fff49869

Some mode module names

view details

push time in 14 hours

push event allenai/allennlp

Akshita Bhagia

commit sha b684bf02e808e38147b469a4940ce4adf5c78e29

Update transformer_embeddings.py

view details

Dirk Groeneveld

commit sha 2aa1b721dad0fa92fc62b072680144d94e878806

Merge branch 'AkshitaB-patch-1' into TransformerToolkitUpdates

view details

push time in 14 hours

pull request comment allenai/allennlp

Update transformer_embeddings.py

Cool, thanks. I will merge this into #5270 for one big transformer toolkit update.

AkshitaB

comment created time in 14 hours

push event allenai/allennlp

Dirk Groeneveld

commit sha 77d189b96866c3903abf6a5b352795473081d1cf

Don't break positional arguments

view details

push time in 14 hours

push event allenai/allennlp

epwalsh

commit sha b610953a443fbc2d5d2cb26a4c17bdee41691eba

fix

view details

push time in 14 hours

push event allenai/allennlp

epwalsh

commit sha 145175a8396dda198bf52c214be078e332d984fe

fix

view details

push time in 15 hours

push event allenai/allennlp

epwalsh

commit sha 6d7ab372f419544ae437ebaeedb3ead5a3e26fdb

skip install on cache hit

view details

push time in 15 hours

Pull request review comment allenai/allennlp

Transformer toolkit updates

+from typing import Dict, Optional, Any, Union, TYPE_CHECKING
+
+import torch
+
 from allennlp.common import FromParams
 from allennlp.modules.transformer.activation_layer import ActivationLayer

+if TYPE_CHECKING:
+    from transformers.configuration_utils import PretrainedConfig
+

 class TransformerPooler(ActivationLayer, FromParams):
+
+    _pretrained_relevant_module = ["pooler", "bert.pooler"]
+
     def __init__(
         self,
         hidden_size: int,
         intermediate_size: int,
+        activation: Union[str, torch.nn.Module] = "relu",
     ):
-        super().__init__(hidden_size, intermediate_size, "relu", pool=True)
+        super().__init__(hidden_size, intermediate_size, activation, pool=True)
+
+    @classmethod
+    def _get_input_arguments(

We don't require this method any longer. from_config takes care of what we need.

dirkgr

comment created time in 15 hours

push event allenai/allennlp

epwalsh

commit sha fffb4d731e11f8eee13345462ebf6773c5108865

more fixes

view details

push time in 15 hours

push event allenai/allennlp

epwalsh

commit sha 723f251630cddebf86050de0f3e931f1756c599b

fix

view details

push time in 15 hours

push event allenai/allennlp

epwalsh

commit sha 1bb5621370446802850dfd27f3329960a42d461c

fix again

view details

epwalsh

commit sha 903eeacc56318572112d3d6fbd74887f6f888ef0

fix again

view details

push time in 16 hours

push event allenai/allennlp

epwalsh

commit sha 5f791134f2654278ba9c248a20f818e97a498777

fix

view details

push time in 16 hours

push event allenai/allennlp

epwalsh

commit sha 361e510da68620b7e509c5e096cd11d480e42ba7

fix

view details

push time in 16 hours

PR opened allenai/allennlp

update Python environment setup
+161 -189

0 comments

2 changed files

pr created time in 16 hours

push event allenai/allennlp

epwalsh

commit sha c4487dc0478f5f589766806276d569d3d68cadc4

update Python environment setup

view details

push time in 16 hours

push event allenai/allennlp

Dirk Groeneveld

commit sha 0df423ecda031dc1de21f2e1b6b240ecaaa884f3

Productivity through formatting

view details

push time in 16 hours

PR opened allenai/allennlp

Update transformer_embeddings.py
+1 -1

0 comments

1 changed file

pr created time in 16 hours

create branch allenai/allennlp

branch : AkshitaB-patch-1

created branch time in 16 hours

push event allenai/allennlp

Dirk Groeneveld

commit sha 67041cbced6c8da2b3816fdc5c902bfbecd736cc

Changelog

view details

push time in 16 hours

PR opened allenai/allennlp

Transformer toolkit updates

This is spun out of some of the Tango stuff.

+44 -2

0 comments

3 changed files

pr created time in 16 hours

create branch allenai/allennlp

branch : TransformerToolkitUpdates

created branch time in 16 hours

push event allenai/allennlp

ArjunSubramonian

commit sha a6cfb1221520fca7a5cc55bef001c6a79a6a3e2f

added `on_backward` trainer callback (#5249)
  • added BackwardCallback
  • finished tests
  • fixed linting issue
  • revised design per Dirk's suggestion
  • added OnBackwardException, changed loss to batch_ouputs, etc.
Co-authored-by: Arjun Subramonian <arjuns@Arjuns-MacBook-Pro.local>

view details

Pete

commit sha 5da5b5ba35075d387aad9ca5242889957646a94d

Upload code coverage reports from different jobs, other CI improvements (#5257)
  • add new job to upload coverage
  • checkout code to get coverage config
  • fix
  • write coverage from GPU test
  • print some debug info
  • try debug
  • fix test image
  • fix
  • fix
  • debugging
  • update
  • fixes
  • upload GPU checks coverage
  • fix codecov action config
  • move linting and style checks to separate jobs
  • update CONTRIBUTING
  • update coverage artifact names and paths
  • remove bulldozer config
  • save coverage from model tests
  • upload coverage from model tests
  • fix
  • use Make command in models to run tests
  • fix
  • update 'multi_device' decorator
  • rename, clean up
  • fix
  • Update allennlp/common/testing/__init__.py (Co-authored-by: Akshita Bhagia <akshita23bhagia@gmail.com>)
  • address comments
Co-authored-by: Akshita Bhagia <akshita23bhagia@gmail.com>

view details

dependabot[bot]

commit sha b37686f663e675237a05c5c75c809d3555a38b26

Update torch requirement from <1.9.0,>=1.6.0 to >=1.6.0,<1.10.0 (#5267)

view details

dependabot[bot]

commit sha e5468d964eeec090762da7e6c55d1692fec65855

Bump black from 21.5b2 to 21.6b0 (#5255)

view details

dependabot[bot]

commit sha a1d36e67ab757b3b8db5eaac42efadc7b237cfdb

Update torchvision requirement from <0.10.0,>=0.8.1 to >=0.8.1,<0.11.0 (#5266)
  • Update torchvision requirement from <0.10.0,>=0.8.1 to >=0.8.1,<0.11.0
    Updates the requirements on torchvision (https://github.com/pytorch/vision) to permit the latest version.
    - Release notes: https://github.com/pytorch/vision/releases
    - Commits: https://github.com/pytorch/vision/compare/v0.8.1...v0.10.0
    updated-dependencies:
    - dependency-name: torchvision
      dependency-type: direct:production
    Signed-off-by: dependabot[bot] <support@github.com>
  • set timeout limits, clear transformer cache in some tests
  • update torch versions in tests
  • try skipping the test in question
  • fix
  • fixes
  • fix I think
  • run more with spawn
  • ugh
  • Apply suggestions from code review
  • fix
  • revert comment change
  • patch models version
  • Update .github/workflows/ci.yml
  • update Makefile
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pete <petew@allenai.org>
Co-authored-by: epwalsh <epwalsh10@gmail.com>

view details

Dirk Groeneveld

commit sha af101d67cdbfae80d2d232c5ed5162ff8b8ef123

Removes confusing zero mask from VilBERT (#5264)
  • Remove confusing zero mask
  • Changelog
  • More coattention mask cleanup

view details

ArjunSubramonian

commit sha f1f51fc9997fdca24763463db411d30cc8e88f09

Adversarial bias mitigation (#5269)
  • started adversarial bias mitigator wrapper
  • initial commit
  • finished adversarial bias mitigation; need to write tests
  • manually checked for bugs
  • debugged through testing
  • updated CHANGELOG
  • Update CHANGELOG.md
  • minor fixes to docstrings and addressed Dirk's feedback
Co-authored-by: Arjun Subramonian <arjuns@Arjuns-MacBook-Pro.local>
Co-authored-by: Arjun Subramonian <arjuns@arjuns-mbp.home>
Co-authored-by: Akshita Bhagia <akshita23bhagia@gmail.com>

view details

Dirk Groeneveld

commit sha e93ef1d9c880378c88e8bcae96bc8dd1d667a84a

Merge remote-tracking branch 'origin/main' into Tango

view details

push time in 17 hours