
Commit 02f0b7f

rsdel2007 authored and eigenfoo committed

DOC: correct typos in docstrings (#3350)

* Update math.py
* Update sampling.py
* Update stats.py
* Update updates.py
* Update opvi.py
* Update updates.py
* Spelling fix
1 parent f245f11 commit 02f0b7f

File tree

5 files changed: +8 −8 lines changed


pymc3/math.py

Lines changed: 2 additions & 2 deletions
@@ -134,15 +134,15 @@ def logit(p):
 def log1pexp(x):
     """Return log(1 + exp(x)), also called softplus.
-    This function is numerically more stable than the naive approch.
+    This function is numerically more stable than the naive approach.
     """
     return tt.nnet.softplus(x)


 def log1mexp(x):
     """Return log(1 - exp(-x)).
-    This function is numerically more stable than the naive approch.
+    This function is numerically more stable than the naive approach.

     For details, see
     https://cran.r-project.org/web/packages/Rmpfr/vignettes/log1mexp-note.pdf
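The corrected docstrings both claim numerical stability over the naive formulas. A minimal NumPy sketch (hypothetical helpers for illustration, not the Theano-backed pymc3 implementations, which delegate to `tt.nnet.softplus`) shows what the naive approach gets wrong:

```python
import numpy as np

def log1pexp(x):
    # Stable log(1 + exp(x)) ("softplus"): np.logaddexp(0, x) computes
    # log(exp(0) + exp(x)) without overflowing exp(x) for large x.
    return np.logaddexp(0.0, x)

def log1mexp(x):
    # Stable log(1 - exp(-x)) for x > 0, switching at log(2) as in the
    # Rmpfr log1mexp note linked above: expm1 is accurate when exp(-x)
    # is close to 1 (small x), log1p when exp(-x) is close to 0 (large x).
    x = np.asarray(x, dtype=float)
    return np.where(x < np.log(2.0),
                    np.log(-np.expm1(-x)),
                    np.log1p(-np.exp(-x)))

with np.errstate(over="ignore"):
    naive = np.log(1.0 + np.exp(1000.0))  # exp(1000) overflows: result is inf
stable = log1pexp(1000.0)                 # finite, essentially 1000.0
```

For moderate arguments the naive and stable forms agree; the difference only appears in the extreme regimes the docstrings are warning about.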

pymc3/sampling.py

Lines changed: 2 additions & 2 deletions
@@ -230,7 +230,7 @@ def sample(draws=500, step=None, init='auto', n_init=200000, start=None, trace=N
         Starting point in parameter space (or partial point)
         Defaults to trace.point(-1)) if there is a trace provided and model.test_point if not
         (defaults to empty dict). Initialization methods for NUTS (see `init` keyword) can
-        overwrite the default. For 'SMC' if should be a list of dict with length `chains`.
+        overwrite the default. For 'SMC' it should be a list of dict with length `chains`.
     trace : backend, list, or MultiTrace
         This should be a backend instance, a list of variables to track, or a MultiTrace object
         with past values. If a MultiTrace object is given, it must contain samples for the chain
@@ -869,7 +869,7 @@ def _prepare_iter_population(draws, chains, step, start, parallelize, tune=None,
     # 5. configure the PopulationStepper (expensive call)
     popstep = PopulationStepper(steppers, parallelize)

-    # Because the preperations above are expensive, the actual iterator is
+    # Because the preparations above are expensive, the actual iterator is
     # in another method. This way the progbar will not be disturbed.
     return _iter_population(draws, tune, popstep, steppers, traces, population)

pymc3/stats.py

Lines changed: 1 addition & 1 deletion
@@ -199,7 +199,7 @@ def waic(trace, model=None, pointwise=False, progressbar=False):
         p_waic: effective number parameters
         var_warn: 1 if posterior variance of the log predictive
             densities exceeds 0.4
-        waic_i: and array of the pointwise predictive accuracy, only if pointwise True
+        waic_i: an array of the pointwise predictive accuracy, only if pointwise True
     """
     model = modelcontext(model)
205205

pymc3/variational/opvi.py

Lines changed: 2 additions & 2 deletions
@@ -6,10 +6,10 @@
 yield unreliable decisions.

 Recently on NIPS 2017 `OPVI <https://arxiv.org/abs/1610.09033/>`_ framework
-was presented. It generalizes variational inverence so that the problem is
+was presented. It generalizes variational inference so that the problem is
 build with blocks. The first and essential block is Model itself. Second is
 Approximation, in some cases :math:`log Q(D)` is not really needed. Necessity
-depends on the third and forth part of that black box, Operator and
+depends on the third and fourth part of that black box, Operator and
 Test Function respectively.

 Operator is like an approach we use, it constructs loss from given Model,

pymc3/variational/updates.py

Lines changed: 1 addition & 1 deletion
@@ -865,7 +865,7 @@ def adamax(loss_or_grads=None, params=None, learning_rate=0.002, beta1=0.9,
            beta2=0.999, epsilon=1e-8):
     """Adamax updates

-    Adamax updates implemented as in [1]_. This is a variant of of the Adam
+    Adamax updates implemented as in [1]_. This is a variant of the Adam
     algorithm based on the infinity norm.

     Parameters
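The adamax docstring describes the update only in words. A standalone NumPy sketch of the Adamax rule (a hypothetical `adamax_step` helper for illustration, not the Theano update dictionary that pymc3's `adamax` actually returns) makes the infinity-norm variant concrete: the first moment is the same exponential moving average as in Adam, but the second moment is replaced by an exponentially weighted maximum of gradient magnitudes.

```python
import numpy as np

def adamax_step(param, grad, m, u, t, lr=0.002, beta1=0.9, beta2=0.999, eps=1e-8):
    # One Adamax update, matching the default hyperparameters above.
    m = beta1 * m + (1.0 - beta1) * grad       # biased first moment (as in Adam)
    u = np.maximum(beta2 * u, np.abs(grad))    # infinity-norm second moment
    step = (lr / (1.0 - beta1 ** t)) * m / (u + eps)  # bias-corrected step
    return param - step, m, u

# Minimize f(x) = x^2 from x = 5 for a few hundred steps.
x, m, u = 5.0, 0.0, 0.0
for t in range(1, 201):
    x, m, u = adamax_step(x, 2.0 * x, m, u, t, lr=0.1)
```

Because `u` bounds `|m|` from above, each step magnitude is capped at roughly the (bias-corrected) learning rate, which is the property the infinity norm buys over Adam's squared-gradient average.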
