
Commit 2ac50d1

Replace 'an pytensor' -> 'a pytensor'
1 parent: c4b28df

File tree

12 files changed: +23 -23 lines changed


docs/source/contributing/developer_guide.md

Lines changed: 5 additions & 5 deletions
@@ -34,7 +34,7 @@ $$
 z \sim \text{Normal}(0, 5)
 $$

-A call to a {class}`~pymc.Distribution` constructor as shown above returns an PyTensor {class}`~pytensor.tensor.TensorVariable`, which is a symbolic representation of the model variable and the graph of inputs it depends on.
+A call to a {class}`~pymc.Distribution` constructor as shown above returns a PyTensor {class}`~pytensor.tensor.TensorVariable`, which is a symbolic representation of the model variable and the graph of inputs it depends on.
 Under the hood, the variables are created through the {meth}`~pymc.Distribution.dist` API, which calls the {class}`~pytensor.tensor.random.basic.RandomVariable` {class}`~pytensor.graph.op.Op` corresponding to the distribution.

 At a high level of abstraction, the idea behind ``RandomVariable`` ``Op``s is to create symbolic variables (``TensorVariable``s) that can be associated with the properties of a probability distribution.
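
For readers skimming this hunk, a minimal sketch (not part of the commit) of the behavior the quoted guide describes, using the public `pm.Normal.dist` and `pm.draw` APIs:

```python
import pymc as pm
from pytensor.tensor import TensorVariable

# The constructor returns a symbolic PyTensor variable, not a numeric draw.
z = pm.Normal.dist(mu=0, sigma=5)
assert isinstance(z, TensorVariable)

# Concrete values come from evaluating the underlying RandomVariable Op.
print(pm.draw(z))
```
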
@@ -134,7 +134,7 @@ model_logp # ==> -6.6973152

 ## Behind the scenes of the ``logp`` function

-The ``logp`` function is straightforward - it is an PyTensor function within each distribution.
+The ``logp`` function is straightforward - it is a PyTensor function within each distribution.
 It has the following signature:

 :::{warning}
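
As a side note for this hunk, a small sketch (assuming the current `pm.logp` helper; not part of the commit) of how a distribution's log-probability graph is built and evaluated:

```python
import pymc as pm

rv = pm.Normal.dist(mu=0, sigma=1)

# pm.logp builds a symbolic PyTensor graph for log p(value | params).
logp_graph = pm.logp(rv, 0.5)
print(logp_graph.eval())  # approx. -1.0439, the standard normal logpdf at 0.5
```
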
@@ -277,7 +277,7 @@ as for ``FreeRV`` and ``ObservedRV``, they are ``TensorVariable``\s with

 ``Factor`` basically `enable and assign the
 logp <https://github.com/pymc-devs/pymc/blob/6d07591962a6c135640a3c31903eba66b34e71d8/pymc/model.py#L195-L276>`__
-(represented as a tensor also) property to an PyTensor tensor (thus
+(represented as a tensor also) property to a PyTensor tensor (thus
 making it a random variable). For a ``TransformedRV``, it transforms the
 distribution into a ``TransformedDistribution``, and then ``model.Var`` is
 called again to added the RV associated with the
@@ -373,7 +373,7 @@ def logpt(self):
     return logp
 ```

-which returns an PyTensor tensor that its value depends on the free parameters in the model (i.e., its parent nodes from the PyTensor graph).
+which returns a PyTensor tensor that its value depends on the free parameters in the model (i.e., its parent nodes from the PyTensor graph).
 You can evaluate or compile into a python callable (that you can pass numpy as input args).
 Note that the logp tensor depends on its input in the PyTensor graph, thus you cannot pass new tensor to generate a logp function.
 For similar reason, in PyMC we do graph copying a lot using pytensor.clone_replace to replace the inputs to a tensor.
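
The graph-copying idiom mentioned in the last quoted line looks roughly like this (a sketch, not from the commit):

```python
import pytensor
import pytensor.tensor as pt

x = pt.scalar("x")
y = x**2 + 1  # an existing graph built on x

# Swap the graph's input without rebuilding it.
x_new = pt.scalar("x_new")
y_new = pytensor.clone_replace(y, replace={x: x_new})

f = pytensor.function([x_new], y_new)
print(f(3.0))  # 10.0
```
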
@@ -561,7 +561,7 @@ Moreover, transition kernels in TFP do not flatten the tensors, see eg docstring
 We love NUTS, or to be more precise Dynamic HMC with complex stopping rules.
 This part is actually all done outside of PyTensor, for NUTS, it includes:
 The leapfrog, dual averaging, tuning of mass matrix and step size, the tree building, sampler related statistics like divergence and energy checking.
-We actually have an PyTensor version of HMC, but it has never been used, and has been removed from the main repository.
+We actually have a PyTensor version of HMC, but it has never been used, and has been removed from the main repository.
 It can still be found in the [git history](https://github.com/pymc-devs/pymc/pull/3734/commits/0fdae8207fd14f66635f3673ef267b2b8817aa68), though.

 #### Variational Inference (VI)

docs/source/guides/Gaussian_Processes.rst

Lines changed: 1 addition & 1 deletion
@@ -158,7 +158,7 @@ other type of random variable. The first argument is the name of the random
 variable representing the function we are placing the prior over.
 The second argument is the inputs to the function that the prior is over,
 :code:`X`. The inputs are usually known and present in the data, but they can
-also be PyMC random variables. If the inputs are an PyTensor tensor or a
+also be PyMC random variables. If the inputs are a PyTensor tensor or a
 PyMC random variable, the :code:`shape` needs to be given.

 Usually at this point, inference is performed on the model. The
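
For orientation, a minimal GP-prior sketch matching the quoted docs (assumes the `pm.gp.Latent` API; not part of the commit):

```python
import numpy as np
import pymc as pm

X = np.linspace(0, 10, 50)[:, None]  # known inputs from the data

with pm.Model():
    cov = pm.gp.cov.ExpQuad(1, ls=1.0)
    gp = pm.gp.Latent(cov_func=cov)
    # First argument: name of the RV; second: the inputs the prior is over.
    f = gp.prior("f", X=X)
```
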

docs/source/learn/core_notebooks/Gaussian_Processes.rst

Lines changed: 1 addition & 1 deletion
@@ -155,7 +155,7 @@ other type of random variable. The first argument is the name of the random
 variable representing the function we are placing the prior over.
 The second argument is the inputs to the function that the prior is over,
 :code:`X`. The inputs are usually known and present in the data, but they can
-also be PyMC random variables. If the inputs are an PyTensor tensor or a
+also be PyMC random variables. If the inputs are a PyTensor tensor or a
 PyMC random variable, the :code:`shape` needs to be given.

 Usually at this point, inference is performed on the model. The

docs/source/learn/core_notebooks/pymc_pytensor.ipynb

Lines changed: 1 addition & 1 deletion
@@ -415,7 +415,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"### What is in an PyTensor graph?\n",
+"### What is in a PyTensor graph?\n",
 "\n",
 "The following diagram shows the basic structure of an `pytensor` graph.\n",
 "\n",

pymc/distributions/custom.py

Lines changed: 5 additions & 5 deletions
@@ -510,9 +510,9 @@ class CustomDist:
         A callable that calculates the log probability of some given ``value``
         conditioned on certain distribution parameter values. It must have the
         following signature: ``logp(value, *dist_params)``, where ``value`` is
-        an PyTensor tensor that represents the distribution value, and ``dist_params``
+        a PyTensor tensor that represents the distribution value, and ``dist_params``
         are the tensors that hold the values of the distribution parameters.
-        This function must return an PyTensor tensor.
+        This function must return a PyTensor tensor.

         When the `dist` function is specified, PyMC will try to automatically
         infer the `logp` when this is not provided.
@@ -523,9 +523,9 @@ class CustomDist:
         A callable that calculates the log cumulative log probability of some given
         ``value`` conditioned on certain distribution parameter values. It must have the
         following signature: ``logcdf(value, *dist_params)``, where ``value`` is
-        an PyTensor tensor that represents the distribution value, and ``dist_params``
+        a PyTensor tensor that represents the distribution value, and ``dist_params``
         are the tensors that hold the values of the distribution parameters.
-        This function must return an PyTensor tensor. If ``None``, a ``NotImplementedError``
+        This function must return a PyTensor tensor. If ``None``, a ``NotImplementedError``
         will be raised when trying to compute the distribution's logcdf.
     support_point : Optional[Callable]
         A callable that can be used to compute the finete logp point of the distribution.
@@ -550,7 +550,7 @@ class CustomDist:
         When specified, `ndim_supp` and `ndims_params` are not needed. See examples below.
     dtype : str
         The dtype of the distribution. All draws and observations passed into the
-        distribution will be cast onto this dtype. This is not needed if an PyTensor
+        distribution will be cast onto this dtype. This is not needed if a PyTensor
         dist function is provided, which should already return the right dtype!
     class_name : str
         Name for the class which will wrap the CustomDist methods. When not specified,
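
Putting the quoted signatures together, a minimal `CustomDist` sketch with a user-supplied `logp` (illustrative only; the quadratic logp is a toy choice, not from the commit):

```python
import pymc as pm

def logp(value, mu):
    # Takes PyTensor tensors, returns a PyTensor tensor.
    return -((value - mu) ** 2)

with pm.Model():
    mu = pm.Normal("mu", 0, 1)
    pm.CustomDist("x", mu, logp=logp, observed=[0.1, -0.3, 0.5])
```
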

pymc/distributions/dist_math.py

Lines changed: 1 addition & 1 deletion
@@ -236,7 +236,7 @@ def log_normal(x, mean, **kwargs):


 class SplineWrapper(Op):
-    """Creates an PyTensor operation from scipy.interpolate.UnivariateSpline."""
+    """Creates a PyTensor operation from scipy.interpolate.UnivariateSpline."""

     __props__ = ("spline",)

pymc/distributions/truncated.py

Lines changed: 1 addition & 1 deletion
@@ -51,7 +51,7 @@


 class TruncatedRV(SymbolicRandomVariable):
-    """An `Op` constructed from an PyTensor graph that represents a truncated univariate random variable."""
+    """An `Op` constructed from a PyTensor graph that represents a truncated univariate random variable."""

     default_output: int = 0
     base_rv_op: Op
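
For context, the public entry point to this Op is `pm.Truncated`; a minimal usage sketch (not part of the commit):

```python
import pymc as pm

with pm.Model():
    # Truncate a Normal base distribution to [-1, 1]; this builds
    # a TruncatedRV graph under the hood.
    base = pm.Normal.dist(mu=0, sigma=1)
    x = pm.Truncated("x", base, lower=-1, upper=1)
```
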

pymc/logprob/rewriting.py

Lines changed: 1 addition & 1 deletion
@@ -199,7 +199,7 @@ def construct_ir_fgraph(
     A custom IR rewriter can be specified. By default,
     `logprob_rewrites_db.query(RewriteDatabaseQuery(include=["basic"]))` is used.

-    Our measurable IR takes the form of an PyTensor graph that is more-or-less
+    Our measurable IR takes the form of a PyTensor graph that is more-or-less
     equivalent to a given PyTensor graph (i.e. the keys of `rv_values`) but
     contains `Op`s that are subclasses of the `MeasurableOp` type in
     place of ones that do not inherit from `MeasurableOp` in the original

pymc/model/core.py

Lines changed: 4 additions & 4 deletions
@@ -216,7 +216,7 @@ def modelcontext(model: Optional["Model"]) -> "Model":


 class ValueGradFunction:
-    """Create an PyTensor function that computes a value and its gradient.
+    """Create a PyTensor function that computes a value and its gradient.

     Parameters
     ----------
@@ -593,7 +593,7 @@ def isroot(self):
         return self.parent is None

     def logp_dlogp_function(self, grad_vars=None, tempered=False, **kwargs):
-        """Compile an PyTensor function that computes logp and gradient.
+        """Compile a PyTensor function that computes logp and gradient.

         Parameters
         ----------
@@ -1660,7 +1660,7 @@ def compile_fn(
     point_fn: bool = True,
     **kwargs,
 ) -> PointFunc | Function:
-    """Compiles an PyTensor function.
+    """Compiles a PyTensor function.

     Parameters
     ----------
@@ -2177,7 +2177,7 @@ def compile_fn(
     model: Model | None = None,
     **kwargs,
 ) -> PointFunc | Function:
-    """Compiles an PyTensor function.
+    """Compiles a PyTensor function.

     Parameters
     ----------
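
A minimal sketch of the `compile_fn` path quoted above (toy model; not part of the commit):

```python
import pymc as pm

with pm.Model() as model:
    mu = pm.Normal("mu", 0, 1)
    pm.Normal("y", mu, 1, observed=[0.2, -0.1])

# compile_fn returns a callable that accepts a point dict of raw values.
logp_fn = model.compile_fn(model.logp())
print(logp_fn({"mu": 0.0}))
```
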

pymc/pytensorf.py

Lines changed: 1 addition & 1 deletion
@@ -277,7 +277,7 @@ def cont_inputs(a):


 def floatX(X):
-    """Convert an PyTensor tensor or numpy array to pytensor.config.floatX type."""
+    """Convert a PyTensor tensor or numpy array to pytensor.config.floatX type."""
     try:
         return X.astype(pytensor.config.floatX)
     except AttributeError:
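
Usage is straightforward; a quick sketch (not part of the commit):

```python
import numpy as np
import pytensor
from pymc.pytensorf import floatX

# Casts numpy (or PyTensor) input to the configured float dtype.
x = floatX(np.array([1, 2, 3]))
print(x.dtype == pytensor.config.floatX)  # True
```
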

pymc/sampling/jax.py

Lines changed: 1 addition & 1 deletion
@@ -122,7 +122,7 @@ def get_jaxified_graph(
     inputs: list[TensorVariable] | None = None,
     outputs: list[TensorVariable] | None = None,
 ) -> list[TensorVariable]:
-    """Compile an PyTensor graph into an optimized JAX function."""
+    """Compile a PyTensor graph into an optimized JAX function."""
     graph = _replace_shared_variables(outputs) if outputs is not None else None

     fgraph = FunctionGraph(inputs=inputs, outputs=graph, clone=True)
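
As a rough public-API analogue of this internal helper, a sketch assuming `jax` is installed; it uses PyTensor's `mode="JAX"` backend rather than `get_jaxified_graph` itself:

```python
import pytensor
import pytensor.tensor as pt

x = pt.vector("x")
y = pt.exp(x).sum()

# Lower the PyTensor graph to JAX and compile it.
f = pytensor.function([x], y, mode="JAX")
print(f([0.0, 1.0]))  # 1 + e
```
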

tests/test_pytensorf.py

Lines changed: 1 addition & 1 deletion
@@ -277,7 +277,7 @@ def test_convert_generator_data(input_dtype):
     result = convert_generator_data(square_generator)
     apply = result.owner
     op = apply.op
-    # Make sure the returned object is an PyTensor TensorVariable
+    # Make sure the returned object is a PyTensor TensorVariable
     assert isinstance(result, TensorVariable)
     assert isinstance(op, GeneratorOp), f"It's a {type(apply)}"
     # There are no inputs - because it generates...
