A `Blockwise` `Op` could presumably be implemented with `jax.vmap` for the JAX backend, `numpy.vectorize` or `for`-loops in the Python case, and plain `for`-loops elsewhere as well.
This relates to the `RandomVariable` work; see NumPy's gufuncs: https://numpy.org/doc/stable/reference/c-api/generalized-ufuncs.html
Since `Elemwise` is equivalent to a ufunc, we need an equivalent `Op` for gufuncs, and an `at.vectorize` would construct those.
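For reference, NumPy already exposes gufunc-style signatures through `numpy.vectorize`; here is a small sketch of the broadcasting semantics a gufunc-like `Op` would need to provide (the `matvec` name is just illustrative):

```python
import numpy as np

# numpy.vectorize accepts a gufunc-style signature and
# loops/broadcasts over any leading "batch" dimensions.
matvec = np.vectorize(np.matmul, signature="(m,n),(n)->(m)")

a = np.ones((3, 2, 4))  # a stack of three 2x4 matrices
b = np.ones((3, 4))     # a stack of three length-4 vectors

res = matvec(a, b)      # the core matvec is applied per batch entry
print(res.shape)        # (3, 2)
```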
We could replace the `size` parameter with this and generalize a lot of the `RandomVariable` logic (i.e. how the `size` parameter would be used when computed).
We need `signature`-like information per `Op`; when it isn't available at the `Op` level, it can always be provided to a `Blockwise` directly, e.g. through an `at.vectorize` helper/constructor function.
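As a rough illustration of how such a constructor might fit together (a hypothetical sketch, not Aesara's actual API), a `Blockwise`-like wrapper could pair a core function with a gufunc `signature` and defer the looping to `numpy.vectorize`:

```python
import numpy as np

class Blockwise:
    """Hypothetical sketch: wrap a "core" function plus a gufunc
    signature, and broadcast it over leading batch dimensions."""

    def __init__(self, core_fn, signature):
        self.vectorized = np.vectorize(core_fn, signature=signature)

    def __call__(self, *inputs):
        return self.vectorized(*inputs)

# det maps a single (m, m) matrix to a scalar...
blockwise_det = Blockwise(np.linalg.det, "(m,m)->()")

# ...so the blockwise version maps a (4, 3, 3) stack to shape (4,).
x = np.eye(3)[None].repeat(4, axis=0)
print(blockwise_det(x).shape)  # (4,)
```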
`Blockwise` and `Scan` are very related; could `Scan` itself be expressed as a `Blockwise` `Op`?
```python
import copy

import aesara.tensor as aet

x = aet.dscalar("x")
z = copy.deepcopy(x)

assert x == z  # Fails
```
Is this behavior of Aesara variables intended (does it serve some purpose), or is it simply a bug?
For `==` to work the way it generally should, we would need to implement consistent `__eq__` methods.
Such an `==` effectively does what `aesara.graph.basic.equal_computations` does, which makes `g_1 == g_2` a little expensive when both are large graphs that are very similar except in their "leaf" nodes/inputs.
The same goes for the `__hash__` implementations: since we use `set`s and `dict`s all the time, it would surely have an effect on performance.
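To make the trade-off concrete, here is a minimal sketch (illustrative toy classes, not Aesara's actual implementation) of what structurally consistent `__eq__`/`__hash__` implementations would look like, and why they make `deepcopy`'d variables compare equal:

```python
import copy

class Var:
    """Illustrative graph leaf with structural equality."""
    def __init__(self, name):
        self.name = name
    def __eq__(self, other):
        return isinstance(other, Var) and self.name == other.name
    def __hash__(self):
        return hash(("Var", self.name))
    def __add__(self, other):
        return Apply("add", (self, other))

class Apply:
    """Illustrative graph node: equal iff op and inputs are equal."""
    def __init__(self, op, inputs):
        self.op = op
        self.inputs = tuple(inputs)
    def __eq__(self, other):
        return (isinstance(other, Apply)
                and self.op == other.op
                and self.inputs == other.inputs)
    def __hash__(self):
        return hash((self.op, self.inputs))

x = Var("x")
assert x == copy.deepcopy(x)     # equality now survives a deepcopy
assert (x + x) == (x + x)        # equal computations compare equal
assert len({x + x, x + x}) == 1  # consistent hashing for sets/dicts
```

Note that the recursive `__eq__` here walks the whole graph, which is exactly the `equal_computations`-style cost on large graphs.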
symbolic-pymc's meta objects take this approach.
This seems like a whole other problem in itself.
As of now, I was looking for a way to copy a nested structure (for instance, a dict containing lists of `TensorVariable`s). I wanted to track changes within such structures by comparing the old structure to the newly modified one, but I ran into this issue when I deep-copied the nested structure.