hardmaru (Tokyo) · https://otoro.net/ml/
I make simple things with neural networks.

hardmaru/estool · 811 stars: Evolution Strategies Tool

hardmaru/cppn-gan-vae-tensorflow · 336 stars: Train CPPNs as a Generative Model, using Generative Adversarial Networks and Variational Autoencoder techniques to produce high resolution images.

hardmaru/cppn-tensorflow · 304 stars: Very Simple and Basic Implementation of Compositional Pattern Producing Network in TensorFlow

AI-ON/ai-on.org · 216 stars: AI•ON projects repository and website source.

hardmaru/backprop-neat-js · 118 stars: Neural Network Evolution Playground with Backprop NEAT

hardmaru/astool · 86 stars: Augmented environments with RL

hardmaru/gecco-tutorial-2019 · 68 stars: 2019 talk at GECCO

hardmaru/cppn-gan-vae-cifar-tensorflow · 37 stars: First attempt to use the previous CPPN-GAN-VAE model to train on CIFAR-10 images.

hardmaru/kanji2kanji · 31 stars: Reproduce domain transfer results in Deep Learning for Classical Japanese Literature

Issue comment on worldmodels/worldmodels.github.io

Replacing RNN with Self-Attention Mechanism

Hi Raphaël,

In later work, I've generally kept the RNN, but replaced the latent space bottleneck with other types of bottlenecks related to self-attention.

For example:

  1. Inattentional Blindness bottleneck: https://attentionagent.github.io/

  2. Screen shuffling bottleneck: https://attentionneuron.github.io/

Cheers.

rabaur · comment created 8 days ago
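The two projects linked in the comment above keep a recurrent controller but route the observation through an attention-based bottleneck instead of a learned latent-space bottleneck. As a rough illustration of that general wiring only (not the actual AttentionAgent or AttentionNeuron code), here is a minimal PyTorch sketch; the class name, layer sizes, and the single-query attention pooling are all assumptions made for this example.

# Hypothetical sketch: a self-attention bottleneck feeding a recurrent controller.
# Names and dimensions are illustrative, not taken from the linked projects.

import torch
import torch.nn as nn


class AttentionBottleneckRNN(nn.Module):
    """Self-attention pooling over a set of observation features,
    followed by an LSTM controller. All sizes are illustrative."""

    def __init__(self, feat_dim=32, bottleneck_dim=16, hidden_dim=64, action_dim=4):
        super().__init__()
        # A single learned query attends over all input features and
        # pools them into one vector: the "bottleneck".
        self.query = nn.Parameter(torch.randn(1, 1, feat_dim))
        self.attn = nn.MultiheadAttention(embed_dim=feat_dim, num_heads=1, batch_first=True)
        self.to_bottleneck = nn.Linear(feat_dim, bottleneck_dim)
        # The recurrent part is kept, as described in the comment.
        self.rnn = nn.LSTMCell(bottleneck_dim, hidden_dim)
        self.policy = nn.Linear(hidden_dim, action_dim)

    def forward(self, features, state=None):
        # features: (batch, num_items, feat_dim), e.g. patch or sensor embeddings
        batch = features.size(0)
        query = self.query.expand(batch, -1, -1)
        pooled, attn_weights = self.attn(query, features, features)   # (batch, 1, feat_dim)
        z = torch.tanh(self.to_bottleneck(pooled.squeeze(1)))         # (batch, bottleneck_dim)
        if state is None:
            h = features.new_zeros(batch, self.rnn.hidden_size)
            c = features.new_zeros(batch, self.rnn.hidden_size)
        else:
            h, c = state
        h, c = self.rnn(z, (h, c))
        action = self.policy(h)
        return action, (h, c), attn_weights


# Usage: step through a rollout, carrying the recurrent state forward.
model = AttentionBottleneckRNN()
obs = torch.randn(1, 10, 32)   # 10 feature vectors per observation (illustrative)
state = None
action, state, weights = model(obs, state)

The point of the sketch is only the structure: an attention step compresses a variable-size set of input features into a small vector, and the LSTM carries the temporal memory that the RNN provides in the World Models setup.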
