We can see the single node with [id A] has two outputs, which we named next_rng and x. By default, only the second output x is given to the user directly, and the other is "hidden".
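As a minimal sketch (the seed, the `normal` example, and the variable names are illustrative, not necessarily the graph built above), both outputs can be retrieved from the node that owns `x`:

.. code-block:: python

    import numpy as np
    import pytensor
    import pytensor.tensor.random as ptr

    # A shared RNG and a single normal draw; the RandomVariable node has
    # two outputs: the updated RNG and the draw itself.
    rng = pytensor.shared(np.random.default_rng(123), name="rng")
    x = ptr.normal(0, 1, rng=rng)        # only the draw is returned by default
    next_rng, x_again = x.owner.outputs  # the "hidden" RNG output is still there
    assert x_again is x

Giving the outputs names (e.g. setting `next_rng.name`) makes them easier to spot in debug-print output.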
This is exactly what RandomStream does behind the scenes.
The destroy map annotation tells us that the first output of x's node is allowed to modify the first input (the RNG).
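As a rough illustration (the shared RNG and the compiled function below are stand-ins, not the exact `inplace_f` used in this document), the annotation can be read directly off a compiled graph:

.. code-block:: python

    import numpy as np
    import pytensor
    import pytensor.tensor.random as ptr

    rng = pytensor.shared(np.random.default_rng(123), name="rng")
    next_rng, x = ptr.normal(0, 1, rng=rng).owner.outputs

    # Providing an update for the shared RNG allows the inplace rewrite to kick in
    f = pytensor.function([], x, updates={rng: next_rng})

    for node in f.maker.fgraph.apply_nodes:
        if getattr(node.op, "destroy_map", None):
            # e.g. {0: [0]}: output 0 (the next RNG) may destroy input 0 (the RNG)
            print(node.op, node.op.destroy_map)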
>>> %timeit inplace_f() # doctest: +SKIP
9.71 µs ± 2.06 µs per loop (mean ± std. dev. of 7 runs, 100000 loops each)
Performance is now much closer to calling NumPy directly, with a small overhead introduced by the PyTensor function.
The `random_make_inplace <https://github.com/pymc-devs/pytensor/blob/3fcf6369d013c597a9c964b2400a3c5e20aa8dce/pytensor/tensor/random/rewriting/basic.py#L42-L52>`_
rewrite automatically replaces RandomVariable Ops with their inplace counterparts when such an operation is deemed safe. This happens when:
#. An input RNG is flagged as `mutable` and is not used anywhere else.
#. An RNG is created as an intermediate variable and is not used anywhere else.
The first case is true when a user uses the `mutable` `kwarg` directly or, much more commonly,
when a shared RNG is used and a (default or manual) update expression is given.
In this case, a RandomVariable is allowed to modify the RNG because the shared variable holding it will be rewritten anyway.

The second case is not very common, because RNGs are not usually chained across multiple RandomVariable Ops.
See more details in the next section.
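To make the first case concrete, here is a hedged sketch (the helper `has_inplace_rv` and the expected printouts are assumptions, contingent on the default rewrites being enabled):

.. code-block:: python

    import numpy as np
    import pytensor
    import pytensor.tensor.random as ptr

    rng = pytensor.shared(np.random.default_rng(123), name="rng")
    next_rng, x = ptr.normal(0, 1, rng=rng).owner.outputs

    def has_inplace_rv(fn):
        # True if any node in the compiled graph is allowed to destroy an input
        return any(
            getattr(node.op, "destroy_map", None)
            for node in fn.maker.fgraph.apply_nodes
        )

    # Without an update expression the shared RNG must be preserved,
    # so the inplace rewrite should not be applied.
    f_no_update = pytensor.function([], x)
    print(has_inplace_rv(f_no_update))  # expected: False

    # With a manual update the RNG is considered mutable and the rewrite can apply.
    f_update = pytensor.function([], x, updates={rng: next_rng})
    print(has_inplace_rv(f_update))  # expected: True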
It works, but that graph is slightly unorthodox in PyTensor.
One practical reason is that it is more difficult to define the correct update expression for the shared RNG variable.
One technical reason is that it makes rewrites more challenging in cases where RandomVariables could otherwise be manipulated independently.
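For concreteness, here is a minimal sketch of the kind of chained graph being discussed (the variables and seed are illustrative; the document's own example above may differ). The correct update for the shared RNG is the last RNG output in the chain:

.. code-block:: python

    import numpy as np
    import pytensor
    import pytensor.tensor.random as ptr

    rng = pytensor.shared(np.random.default_rng(123), name="rng")

    # y consumes the RNG produced by x's node, chaining the two RandomVariables
    next_rng_x, x = ptr.normal(0, 1, rng=rng).owner.outputs
    next_rng_y, y = ptr.normal(0, 1, rng=next_rng_x).owner.outputs

    # The update must be the last RNG in the chain; using next_rng_x instead
    # would make the next call reuse random state that y already consumed.
    f = pytensor.function([], [x, y], updates={rng: next_rng_y})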
Creating multiple RNG variables
-------------------------------
RandomStreams generate high-quality seeds for multiple variables, following the `NumPy best practices on parallel random number generation <https://numpy.org/doc/stable/reference/random/parallel.html#parallel-random-number-generation>`_.
Users who sidestep RandomStreams, either by creating their own RNGs or relying on RandomVariable's default shared RNGs, should follow the same practice!
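A minimal sketch of that practice, using NumPy's `SeedSequence.spawn` to derive independent generators from one root seed (the root seed and names are arbitrary):

.. code-block:: python

    import numpy as np
    import pytensor

    root = np.random.SeedSequence(20231201)
    child_seeds = root.spawn(3)  # independent, high-quality child seeds

    # One shared RNG per random variable that should be statistically independent
    rngs = [
        pytensor.shared(np.random.default_rng(seed), name=f"rng_{i}")
        for i, seed in enumerate(child_seeds)
    ]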
Random variables in inner graphs
================================
RNGs in Scan are only supported via shared variables in non-sequences at the moment.
>>> print(err)
Tensor type field must be a TensorType; found <class 'pytensor.tensor.random.type.RandomGeneratorType'>.
In the future, RandomGenerator variables may be allowed as explicit recurring states, rendering the internal use of updates optional or unnecessary.
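In the meantime, here is a sketch of the currently supported pattern (the `step` function and the 5-step toy Scan are illustrative only): the shared RNG is passed as a non-sequence, the inner graph returns an update for it, and that update is forwarded to `pytensor.function`:

.. code-block:: python

    import numpy as np
    import pytensor
    import pytensor.tensor.random as ptr

    rng = pytensor.shared(np.random.default_rng(123), name="rng")

    def step(rng):
        next_rng, x = ptr.normal(0, 1, rng=rng).owner.outputs
        # Returning the update lets Scan propagate the RNG between iterations
        return x, {rng: next_rng}

    draws, updates = pytensor.scan(fn=step, non_sequences=[rng], n_steps=5)
    f = pytensor.function([], draws, updates=updates)
    print(f())  # five draws, each advancing the shared RNG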
OpFromGraph
-----------
Other backends (and their limitations)
======================================
Numba
-----
NumPy random generators can be natively used with the Numba backend.
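For instance, a graph with a shared NumPy Generator can be compiled with the predefined "NUMBA" mode (assuming Numba is installed; the graph itself is just a placeholder):

.. code-block:: python

    import numpy as np
    import pytensor
    import pytensor.tensor.random as ptr

    rng = pytensor.shared(np.random.default_rng(123), name="rng")
    next_rng, x = ptr.normal(0, 1, rng=rng).owner.outputs

    f_numba = pytensor.function([], x, updates={rng: next_rng}, mode="NUMBA")
    print(f_numba())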
JAX
---

JAX uses a different type of PRNG than those of NumPy. This means that the standard shared RNGs cannot be used directly in graphs transpiled to JAX.
Instead, a copy of the shared RNG variable is made, and its bit generator state is expanded with a `jax_state` entry. This is what is actually used by the JAX random variables.
In general, update rules are still respected, but they won't update or rely on the original shared variable, only the copy that is actually used in the transpiled function.
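A sketch of the behavior described above (assuming JAX is installed; the graph and seed are placeholders):

.. code-block:: python

    import numpy as np
    import pytensor
    import pytensor.tensor.random as ptr

    rng = pytensor.shared(np.random.default_rng(123), name="rng")
    next_rng, x = ptr.normal(0, 1, rng=rng).owner.outputs

    f_jax = pytensor.function([], x, updates={rng: next_rng}, mode="JAX")
    # Repeated calls still yield fresh draws, because the update is applied
    # to the internal copy of the RNG that the transpiled function uses...
    print(f_jax(), f_jax())
    # ...while the original shared variable's generator is not advanced.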