As the title implies, is there any particular reason for not having `tensorflow-probability` (`tfp`) in core `tensorflow` (`tf`)?
The reason I'm asking is:

- I've run into compatibility issues between the two several times. For example, the latest version of `tfp` currently supports only `tf` 2.9+ (unlike the latest `tensorflow-addons` (`tfa`), which I can use even with `tf` 2.4).
- Emerging libraries like `KerasCV` (and probably `KerasNLP` too) don't want a `tfp` dependency. That means all the probability distributions and sampling functions need to be rewritten, e.g. `tfp.distributions.Dirichlet`, `tfp.distributions.MarkovChain`, etc.
- I don't know if it has been discussed before why the probability (`tfp`) package started maintaining its own repo (it grew out of `tf.compat.v1.distributions`!). It's a bit confusing, and the documentation in `tfp` is sometimes hard to follow. In contrast, torch serves both in a single API, `torch.distributions`, while following the design pattern of TensorFlow Distributions.
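To make the second point concrete: even a single distribution takes real work to reimplement correctly. Here is a minimal NumPy sketch of what a library avoiding the `tfp` dependency would have to write in place of `tfp.distributions.Dirichlet` (the function name `sample_dirichlet` is mine, not from any library):

```python
import numpy as np

def sample_dirichlet(concentration, num_samples, seed=None):
    """Draw Dirichlet samples via normalized Gamma draws — the standard
    construction a library would reimplement if it cannot depend on
    tfp.distributions.Dirichlet."""
    rng = np.random.default_rng(seed)
    concentration = np.asarray(concentration, dtype=np.float64)
    # Gamma(alpha_i, 1) draws, normalized across the last axis,
    # are exactly Dirichlet(alpha)-distributed.
    gamma = rng.gamma(concentration, size=(num_samples,) + concentration.shape)
    return gamma / gamma.sum(axis=-1, keepdims=True)

samples = sample_dirichlet([1.0, 2.0, 3.0], num_samples=4, seed=0)
print(samples.shape)  # (4, 3)
# Each row is a point on the probability simplex (non-negative, sums to 1).
```

And that is just sampling — `log_prob`, batching semantics, and shape handling (which `tfp` already standardizes) would all need to be duplicated as well.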