csxeba/evolute 174

Evolutionary algorithm toolbox

csxeba/brainforge 113

A Neural Networking library based on NumPy only

csxeba/Awesome-Adaptive-Computation 3

Adaptive and Conditional Computation in Neural Networks

csxeba/ANN_IQ 1

Artificial Neural Networking - A basic tutorial

csxeba/Cicero 1

Toroidal Game of Life

csxeba/AdasCNN 0

Homework for AdasWorks

csxeba/AIM-SR 0

Road sign classification with the BrainForge library

csxeba/Artifactorium 0

Experiment artifactory

csxeba/bayesforge 0

Bayesian learning in Keras

create branch csxeba/Verres

branch : method/instance_seg

created branch time in 16 days

issue comment tensorflow/tensorflow

Usage and signature of Model.train_step() is unclear

Update: I solved it by issuing the following modifications:

  1. The data needs to be wrapped in a one-element tuple, e.g. yield ((tensor1, tensor2, ...),). This structure then needs to be unpacked in the body of the train_step() function.

  2. Added a line which builds the forward function. The line can either be a .predict() or a .build() call with the correct arguments.

It would be beneficial to somehow indicate in the documentation that explicit building of the forward and backward passes is required for train_step() to work.
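The two modifications above can be sketched as follows. This is my own toy reconstruction, not the original code: the model, tensor shapes, and the name ToyModel are all assumptions, and train_step() is invoked directly here for brevity (in practice the generator would be passed to fit()).

```python
import tensorflow as tf


class ToyModel(tf.keras.Model):
    """Hypothetical model illustrating the two modifications above."""

    def __init__(self):
        super().__init__()
        self.dense = tf.keras.layers.Dense(1)

    def call(self, inputs):
        return self.dense(inputs)

    def train_step(self, data):
        # Modification 1: the generator wraps its tensors in a
        # one-element tuple, so that structure is unpacked here.
        x, y = data[0]
        with tf.GradientTape() as tape:
            prediction = self(x, training=True)
            loss = tf.reduce_mean(tf.square(prediction - y))
        gradients = tape.gradient(loss, self.trainable_variables)
        self.optimizer.apply_gradients(zip(gradients, self.trainable_variables))
        return {"loss": loss}


def data_generator():
    while True:
        x = tf.random.normal((8, 3))
        y = tf.random.normal((8, 1))
        # All tensors wrapped in a one-element tuple.
        yield ((x, y),)


model = ToyModel()
model.compile(optimizer="adam")
# Modification 2: build the forward function explicitly, e.g. with a
# predict() (or build()) call, before training starts.
model.predict(tf.zeros((1, 3)), verbose=0)

logs = model.train_step(next(data_generator()))
print(float(logs["loss"]))
```

With the model built and compiled, model.fit(data_generator(), steps_per_epoch=...) should then route each yielded batch through this train_step().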


comment created time in a month

issue opened tensorflow/tensorflow

Usage and signature of Model.train_step() is unclear

URL(s) with the issue:

Colab link

Description of issue (what needs changing):

There seems to be some kind of restriction on the signature of Model.train_step(), which is undocumented.

Clear description

The issue comes up in TF 2.2.0, where the possibility to override Model.train_step() was introduced.

The training is executed with a generator as input. The generator yields 4 tensors, which are combined in train_step() to produce a loss and gradients.

The logic fails because tf.keras, in the fit() function, checks that the generator outputs at most 3 tensors (corresponding to x, y, and sample weights), so this mindset and the (x, y, w) signature are implicitly forced onto the train_step() function.

This is not clear from the documentation, which only states that train_step() has a single argument (data), which should be "A nested structure of Tensors." Link to documentation
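The implied contract can be illustrated with a plain-Python approximation of the unpacking that fit() applies to each batch. This is my own sketch, not the Keras source, and strings stand in for tensors:

```python
def unpack_x_y_sample_weight(data):
    """Approximation of the (x, y, sample_weight) convention that
    tf.keras imposes on each batch. Not the actual Keras code."""
    if not isinstance(data, tuple):
        return data, None, None
    if len(data) == 1:
        return data[0], None, None
    if len(data) == 2:
        return data[0], data[1], None
    if len(data) == 3:
        return data
    # A generator yielding four or more top-level tensors falls
    # through to an error, which is the failure described above.
    raise ValueError(f"Expected at most 3 elements, got {len(data)}")


# Four bare tensors: rejected under the implied (x, y, w) contract.
try:
    unpack_x_y_sample_weight(("t1", "t2", "t3", "t4"))
except ValueError as error:
    print(error)

# The same four tensors wrapped in a one-element tuple pass through
# untouched as x, which train_step() can then unpack itself.
x, y, w = unpack_x_y_sample_weight((("t1", "t2", "t3", "t4"),))
print(x, y, w)
```

This is why wrapping everything in a one-element tuple works: the whole structure rides through as x.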

There also seems to be a lack of official examples using this new API, which would help determine the sequence of methods one needs to call in order to use it (calling Model.compile(), for instance).

Expected behaviour

According to both the documentation and the release notes for TF 2.2.0, I was expecting to be able to pass an arbitrary list of tensors to Model.train_step().

I would expect any restriction on Model.train_step()'s sole data argument to be noted in the relevant documentation.


Please let me know if I am getting the usage of Model.train_step() wrong. Also, if there is an example out there using this new and very convenient API, feel free to direct me to it.

created time in a month

push event csxeba/trickster


commit sha d22072bebbebff319c724806d583bfa982d429be

Experimenting with ProcGen


push time in 2 months

issue comment tensorflow/tensorflow

Buggy behaviour of dataset API

Hi Audiber, thank you for the clarification!


comment created time in 2 months

issue comment tensorflow/tensorflow

Buggy behaviour of dataset API

It is also affecting TF 2.0 if that matters.


comment created time in 2 months


issue opened tensorflow/tensorflow

Buggy behaviour of dataset API


System information

  • Have I written custom code: yes, see
  • OS Platform and Distribution: Google Colab (Ubuntu 18.04.3 LTS)
  • TensorFlow installed from (source or binary): provided by Colab
  • TensorFlow version (use command below): 2.2.0-rc3
  • Python version: 3.6.9

Describe the current behavior

At Dataset graph branching points, the node at the root of the branching is re-executed for each branch during a single round of execution. With non-randomized inputs to the Dataset, this causes no problems. If the root node comes after a .shuffle() call, however, the branches receive different inputs in the same computation round.

Describe the expected behavior

Downstream branches should receive the same data even if shuffle() is applied.

Standalone code to reproduce the issue
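The standalone snippet from the original Colab is not captured here. The following is my own minimal sketch of the reported pattern; the dataset contents and sizes are assumptions:

```python
import tensorflow as tf

# A root node whose output is randomized by shuffle().
root = tf.data.Dataset.range(100).shuffle(100)

# Two downstream branches off the same root node.
branch_a = root.map(lambda x: x)
branch_b = root.map(lambda x: x)

# Expected: each zipped pair holds the same element twice, since both
# branches descend from the same root.
# Observed: the shuffle is re-executed per branch, so pairs diverge.
pairs = list(tf.data.Dataset.zip((branch_a, branch_b)).as_numpy_iterator())
mismatches = sum(1 for a, b in pairs if a != b)
print(f"{mismatches} of {len(pairs)} pairs differ")
```

With the shuffle in place, nearly all pairs differ; removing the .shuffle() call makes every pair match, which matches the behaviour described above.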

More info:

This behaviour is also present if the dataset is created from a generator that handles the shuffling implicitly.

created time in 2 months