`Blockwise` and `Scan` are very related. Could `Scan` be implemented as a `Blockwise` `Op`?
```python
import copy

import aesara.tensor as aet

x = aet.dscalar("x")
z = copy.deepcopy(x)
assert x == z  # Fails
```

Is this behavior of Aesara variables intended (does it serve some purpose), or is it simply a bug?
For `==` to work like it generally should, we would need to implement consistent `__eq__`s. Such an `==` would effectively do what `aesara.graph.basic.equal_computations` does, and `g_1 == g_2` could get a little expensive when both are large graphs that are very similar except in their "leaf" nodes/inputs. We would also need consistent `__hash__` implementations and, since graphs are put into `set`s and `dict`s all the time, that would surely have an effect. `symbolic-pymc` provides this kind of equality through its meta objects. This seems like a whole other problem in itself.
As of now, I was looking for a way to copy a nested structure (for instance, a `dict` containing lists of `TensorVariable`s). I wanted to track changes within such structures by comparing the old structure to the new, modified one, but I ran into this issue when I deep-copied the nested structure.
Would a dedicated `softmax.py` module make sense? `math.py` seems a bit cluttered. There is also `extra_ops.py`?
Should I be using a `RandomStream` here? Every step of the scan (and every call to `eval`) returns the same draw:

```python
import aesara
import aesara.tensor as at

rv, updates = aesara.scan(
    fn=lambda: at.random.normal(0, 1),
    n_steps=5,
)

print(rv.eval())
# [-1.5294442 -1.5294442 -1.5294442 -1.5294442 -1.5294442]
print(rv.eval())
# [-1.5294442 -1.5294442 -1.5294442 -1.5294442 -1.5294442]
```
I could compare the `aesara` and `numpy` namespaces and start working my way through the list; I could do that with `scipy` as well. We could keep putting everything in `math`, but this can get unwieldy. JAX organizes things this way (`jax.numpy` and `jax.scipy`), and it would make a lot of sense here too.
So it would be `from aesara.tensor.special import softmax`? Currently it is `from aesara.tensor.math import softmax`, `from aesara.tensor import softmax`, or, more typically, `import aesara.tensor as at; at.softmax`.