Probabilistic reasoning and statistical analysis in TensorFlow


issue comment tensorflow/probability

Save the architecture of a Bayesian neural network

Hi @nbro. Did you have in mind something different than the current serialization capability?

comment created time in a month

issue comment conda-forge/tensorflow-probability-feedstock

Use TFP pip package instead of `pip install .`?

If there is conda support which does not repackage the wheels, then I agree with you. Failing that, I do not agree with you, for the reasons listed. Possibly relevant notes:

- TFP doesn't even have a dependency on TF currently
- TFP's pip deps can be ignored via `--no-deps`

FWIW, TFP has no dependencies which aren't otherwise satisfied by TF except `decorator` and `cloudpickle`. It is very unlikely that we will add additional dependencies in the future.
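For reference, the override would look something like this (a sketch; pip's flag for skipping dependency resolution is `--no-deps`, and `tensorflow-probability` is the package name on PyPI):

```
# Install the TFP wheel without pulling in its pip dependencies, so an
# existing TF installation (e.g. from conda) satisfies them instead.
pip install --no-deps tensorflow-probability
```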

comment created time in a month

issue comment conda-forge/tensorflow-probability-feedstock

Use TFP pip package instead of `pip install .`?

Before I ask a number of questions, are you saying there already is conda support for TF?

comment created time in a month

issue comment conda-forge/tensorflow-probability-feedstock

Use TFP pip package instead of `pip install .`?

Hi @msarahan. I wonder to what extent your concerns apply to TFP vs TF. Since TFP works for all versions of Python and has no source which isn't Python, it seems to me that the value-add of a dedicated conda package for TFP is much lower than the value-add of a dedicated conda package for TF. Thoughts?

comment created time in a month

issue comment tensorflow/probability

monte_carlo.expectation with samples from multiple multidimensional random variables

Sorry for not checking this in yet. I have demos done and just need to polish. I'll try to get this in by the end of next week.

On Thu, Dec 12, 2019, 1:48 PM DM notifications@github.com wrote:

> are there any updates on this issue?


comment created time in 2 months

issue comment tensorflow/tensorflow

(Not sure how I got assigned this; in fact I didn't even notice. Reassigned to alextp for further triage.)

comment created time in 2 months

issue comment tensorflow/probability

Thanks for raising this issue. I confirm that the docstring is broken but the code is not. The docstring should have said:

```
# ==> x.shape: [6, 1, 3, 5, 4, 2]
```

since:

```
sample_shape = [6, 1]
batch_shape = [3]
event_shape = [5, 4, 2]
```

comment created time in 3 months

issue comment tensorflow/tensorflow

We definitely used to leak memory but switched to a weakref dict. Given the intrinsic complexity here, my guess is we have a bug. I'll see if someone on the team is willing to dig in.
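The weakref-dict approach works roughly like this (a generic Python sketch, not TF's actual cache code):

```
import gc
import weakref


class Compiled:
    """Stand-in for an expensive cached artifact (e.g. a traced function)."""


# A WeakValueDictionary holds its values weakly: an entry disappears as soon
# as no one else holds a strong reference, so the cache itself cannot leak.
cache = weakref.WeakValueDictionary()

artifact = Compiled()
cache["graph-0"] = artifact
assert "graph-0" in cache       # entry lives while a strong reference exists

del artifact                    # drop the last strong reference
gc.collect()                    # make collection deterministic across impls
assert "graph-0" not in cache   # entry vanished along with the object
```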

comment created time in 3 months

issue comment tensorflow/probability

GLM: Negative Binomial Regression

Hi @janithwanni -- thanks for the offer! I recommend you start by looking at the other models in the glm dir and draw upon them as examples for negative binomial. I *think* there might even be a binomial there. We also have a NegativeBinomial distribution, so these two resources together ought to be a good starting point, at least in terms of examples.
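For orientation, the core quantity a negative binomial GLM needs is the log-pmf; a self-contained sketch follows (the helper name is mine, independent of TFP's actual API):

```
import math


def negative_binomial_log_pmf(k, total_count, prob):
    # log P(K = k): probability of k failures before the total_count-th
    # success, each trial succeeding with probability prob.
    log_binom = (math.lgamma(k + total_count)
                 - math.lgamma(total_count)
                 - math.lgamma(k + 1))
    return log_binom + total_count * math.log(prob) + k * math.log1p(-prob)


# Sanity check: with total_count=1 this reduces to the geometric pmf,
# P(K = k) = prob * (1 - prob)**k, so P(K = 2) = 0.5 * 0.25 = 0.125.
assert abs(math.exp(negative_binomial_log_pmf(2, 1, 0.5)) - 0.125) < 1e-12
```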

comment created time in 3 months

issue comment tensorflow/probability

GLM: Negative Binomial Regression

Adding NegativeBinomial to tfp.glm would be a very nice contribution.

comment created time in 3 months

issue comment tensorflow/probability

monte_carlo.expectation with samples from multiple multidimensional random variables

I'm in the process of reworking the Monte Carlo helpers. Stay tuned.

comment created time in 3 months

issue comment tensorflow/probability

Feature request: Conditional sampling

+1 to better docstrings.
I agree there's a learning curve here, but I feel this learning curve is "worth it" since the current approach ensures the unnormalized posterior is merely a thin accessor to the full joint (this being the inferential base). Furthermore, by not codifying this accessor we emphasize that all downstream inference logic is agnostic--any function will suffice.

As for the different call styles, I see this difference as one of the key points of having different JD* flavors. The reason for the current style is that we wanted to preserve the `d.log_prob(d.sample())` pattern yet also have `d.sample()` be interpretable wrt the `model` as supplied to `__init__`.

comment created time in 3 months

issue comment tensorflow/probability

Feature request: Conditional sampling

Fwiw, I strongly urge against this sugar. To me, the connection between closures and unnormalized densities is the second most elegant part of TFP.

I also disagree that it makes code more readable. While the lambda spells out clearly how the thing works, `cond_lp` seems to me only to obfuscate. Also, the created object is *not* a distribution, since the density is unnormalized. Finally, the simplified version isn't really a fair comparison since the cast and expand_dims would need to be done there. A better comparison would be:
`lp = lambda *x: model.log_prob(x + (df['y'],))`
and that reads purdy darn nicely to me!

I'm happy to go down a long list of other reasons, but the tl;dr is that I claim this sugar only feels like it solves a problem but actually adds cognitive burden (yet another thing to learn), runs the risk of making a user think it's *required*, and obfuscates what is otherwise a one-liner.
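The closure pattern under discussion can be sketched generically (the toy model and names below are mine, not TFP API):

```
import math


# Hypothetical two-variable joint log-density:
#   w ~ Normal(0, 1),  y | w ~ Normal(w, 1).
def joint_log_prob(w, y):
    log_normal = lambda x, loc: (-0.5 * math.log(2 * math.pi)
                                 - 0.5 * (x - loc) ** 2)
    return log_normal(w, 0.0) + log_normal(y, w)


y_observed = 1.3

# The closure pattern: pin the observed value. The result is an unnormalized
# posterior density in w -- a plain function, not a Distribution, precisely
# because it is unnormalized.
unnormalized_posterior = lambda w: joint_log_prob(w, y_observed)
```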

comment created time in 4 months

issue comment tensorflow/tensorflow

Sampling from a categorical distribution without replacement

FWIW, TFP would be happy to host a `tfp.math.random_choice`, though my gut feeling is that `tf.math.choice` is a more natural home, in keeping with other random samplers. However, a sampling-without-replacement distribution object (complete with `sample`, `log_prob`, `mean`, etc.) should live in TFP, since `tf.contrib.distributions` is now gone.
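Until such an op exists, one well-known way to sample from a categorical without replacement is the Gumbel top-k trick; a NumPy sketch (the function name is mine, not a proposed API):

```
import numpy as np


def sample_without_replacement(logits, k, rng):
    # Gumbel top-k trick: perturb each logit with i.i.d. Gumbel(0, 1) noise
    # and keep the k largest. The resulting index set is distributed as
    # sequential sampling without replacement from softmax(logits).
    gumbel = rng.gumbel(size=len(logits))
    perturbed = np.asarray(logits) + gumbel
    return np.argsort(perturbed)[::-1][:k]


rng = np.random.default_rng(0)
idx = sample_without_replacement([0.1, 2.0, -1.0, 0.5], k=2, rng=rng)
# idx holds 2 distinct category indices drawn from {0, 1, 2, 3}.
```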

comment created time in 4 months

issue comment google/jax

It's rather complex, but I'd like to create a JAX array which has list semantics, and this is how I'd have preferred to initialize the object.

comment created time in 4 months

issue opened google/jax

```
import jax.numpy as jnp
import numpy as np

jnp.array([None] * 3)
# ==> TypeError: Unexpected input type for array: <class 'NoneType'>

# Yet:
np.array([None] * 3)
# ==> array([None, None, None], dtype=object)
```

Is this expected?

created time in 4 months

issue opened google/jax

```
import numpy as np
import jax.numpy as jnp

jnp.array([4, 5], dtype=np.object)
```

```
RuntimeError: Invalid argument: Convert does not allow non-arrays, so cannot convert from s32[] to TOKEN.: This is a bug in JAX's shape-checking rules; please report it! https://github.com/google/jax/issues
```

created time in 4 months