- Add data container class (`Data`) that wraps the theano `SharedVariable` class and lets the model be aware of its inputs and outputs (see the usage sketch after this list).
- Add function `set_data` to update variables defined as `Data`.
- `Mixture` now supports mixtures of multidimensional probability distributions, not just lists of 1D distributions.
- `GLM.from_formula` and `LinearComponent.from_formula` can extract variables from the calling scope. Customizable via the new `eval_env` argument. Fixes [#3382](https://github.com/pymc-devs/pymc3/issues/3382).
- Added the `distributions.shape_utils` module with functions used to help broadcast samples drawn from distributions using the `size` keyword argument.
- Used `numpy.vectorize` in `distributions.distribution._compile_theano_function`. This enables `sample_prior_predictive` and `sample_posterior_predictive` to ask for tuples of samples instead of just integers. This fixes issue [#3422](https://github.com/pymc-devs/pymc3/issues/3422).
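
A minimal sketch of how the new `Data` container and `set_data` are meant to work together; the toy data, variable names, and priors below are made up for illustration only.

```python
import numpy as np
import pymc3 as pm

# Made-up training data
x_train = np.random.randn(100)
y_train = 2.0 * x_train + np.random.randn(100)

with pm.Model() as model:
    x = pm.Data("x", x_train)          # wraps a theano shared variable
    y = pm.Data("y", y_train)
    beta = pm.Normal("beta", 0.0, 10.0)
    sigma = pm.HalfNormal("sigma", 1.0)
    pm.Normal("obs", mu=beta * x, sigma=sigma, observed=y)
    trace = pm.sample()

# Swap in new inputs without rebuilding the model, then predict
x_new = np.linspace(-3.0, 3.0, 50)
with model:
    pm.set_data({"x": x_new, "y": np.zeros_like(x_new)})
    post_pred = pm.sample_posterior_predictive(trace)
```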

### Fixes

- All occurrences of `sd` as a parameter name have been renamed to `sigma`. `sd` will continue to function for backwards compatibility (see the sketch after this list).
- Made `BrokenPipeError` for parallel sampling more verbose on Windows.
- Added the `broadcast_distribution_samples` function, which helps broadcast arrays of drawn samples while taking into account the requested `size` and the inferred distribution shape. This is sometimes needed by distributions that call several `rvs` separately within their `random` method, such as `ZeroInflatedPoisson` (fixes issue [#3310](https://github.com/pymc-devs/pymc3/issues/3310)).
- The `random` method of the `Wald`, `Kumaraswamy`, `LogNormal`, `Pareto`, `Cauchy`, `HalfCauchy`, `Weibull` and `ExGaussian` distributions used a hidden `_random` function that was written with scalars in mind. This could potentially lead to artificial correlations between random draws. Added shape guards and broadcasting of the distribution samples to prevent this (similar to issue [#3310](https://github.com/pymc-devs/pymc3/issues/3310)).
- Added a fix to allow the imputation of single missing values of observed data, which previously would fail (fixes issue [#3122](https://github.com/pymc-devs/pymc3/issues/3122)).
- The `draw_values` function was too permissive with what could be grabbed from inside `point`, which led to an error when sampling posterior predictives of variables that depended on shared variables that had changed their shape after `pm.sample()` had been called (fixes issue [#3346](https://github.com/pymc-devs/pymc3/issues/3346)).
- `draw_values` now adds the theano graph descendants of `TensorConstant` or `SharedVariable` nodes to the named relationship nodes stack only if these descendants are `ObservedRV` or `MultiObservedRV` instances (fixes issue [#3354](https://github.com/pymc-devs/pymc3/issues/3354)).
- Fixed bug in `broadcast_distribution_samples`, which did not correctly handle cases in which some samples did not have the size tuple prepended.
- Changed `MvNormal.random`'s usage of `tensordot` for Cholesky-encoded covariances. This led to wrong axis broadcasting and seemed to be the cause of issue [#3343](https://github.com/pymc-devs/pymc3/issues/3343).
- Fixed defect in `Mixture.random` when multidimensional mixtures were involved. The mixture component was not preserved across all the elements of the dimensions of the mixture. This meant that the correlations across elements within a given draw of the mixture were partly broken.
- Restructured `Mixture.random` to allow better use of vectorized calls to `comp_dists.random`.
- Added tests for mixtures of multidimensional distributions to the test suite.
- Fixed incorrect usage of `broadcast_distribution_samples` in `DiscreteWeibull`.
- `Mixture`'s default dtype is now determined by `theano.config.floatX`.
- `dist_math.random_choice` now handles nd-arrays of category probabilities, and also handles sizes that are not `None`. Also removed the unused `k` kwarg from `dist_math.random_choice`.
- Changed `Categorical.mode` to preserve all the dimensions of `p` except the last one, which encodes each category's probability.
- Changed initialization of `Categorical.p`. `p` is now normalized to sum to `1` inside `logp` and `random`, but not during initialization. Normalizing during initialization could hide negative values supplied to `p`, as mentioned in [#2082](https://github.com/pymc-devs/pymc3/issues/2082).
- `Categorical` now accepts elements of `p` equal to `0`. `logp` will return `-inf` if there are `values` that index to the zero-probability categories.
- Add `sigma`, `tau`, and `sd` to the signature of `NormalMixture`.
- Set default `lower` and `upper` values of `-inf` and `inf` for `pm.distributions.continuous.TruncatedNormal`. This avoids errors caused by their previous values of `None` (fixes issue [#3248](https://github.com/pymc-devs/pymc3/issues/3248)).
- Converted all calls to `pm.distributions.bound._ContinuousBounded` and `pm.distributions.bound._DiscreteBounded` to use only and all positional arguments (fixes issue [#3399](https://github.com/pymc-devs/pymc3/issues/3399)).
- Restructured `distributions.distribution.generate_samples` to use the `shape_utils` module. This solves issues [#3421](https://github.com/pymc-devs/pymc3/issues/3421) and [#3147](https://github.com/pymc-devs/pymc3/issues/3147) by using the `size`-aware broadcasting functions in `shape_utils`.
- Fixed the `Multinomial.random` and `Multinomial.random_` methods to make them compatible with the new `generate_samples` function. In the process, a bug in the `Multinomial.random_` shape handling was discovered and fixed.
- Fixed a defect found in `Bound.random` where the `point` dictionary was passed to `generate_samples` as an `arg` instead of in `not_broadcast_kwargs`.
- Fixed a defect found in `Bound.random_` where `total_size` could end up as a `float64` instead of an integer if given `size=tuple()`.
- Fixed an issue in `model_graph` that caused construction of the graph of the model for rendering to hang: replaced a search over the powerset of the nodes with a breadth-first search over the nodes (fixes [#3458](https://github.com/pymc-devs/pymc3/issues/3458)).
- Removed variable annotations from `model_graph` but left type hints (fixes [#3465](https://github.com/pymc-devs/pymc3/issues/3465)). This means that we support `python>=3.5.4`.
- Default `target_accept` for `HamiltonianMC` is now 0.65, as suggested in Beskos et al. 2010 and Neal 2001.
- Fixed bug in `draw_values` that led to intermittent errors in Python 3.5. This happened with some deterministic nodes that were drawn but not added to `givens`.
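
A minimal sketch of the `sd` to `sigma` rename noted at the top of this list. The model and values are made up for illustration; both spellings are assumed to map to the same scale parameter, with `sd` kept as a backwards-compatible alias as described above.

```python
import pymc3 as pm

with pm.Model():
    # New canonical spelling of the scale parameter
    a = pm.Normal("a", mu=0.0, sigma=1.0)
    # Old spelling; still accepted for backwards compatibility
    b = pm.HalfNormal("b", sd=1.0)
    trace = pm.sample(1000, tune=1000)
```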

### Deprecations

- References to `live_plot` and corresponding notebooks have been removed.
- Deprecated the `vars` parameters of `sample_posterior_predictive` and `sample_prior_predictive` in favor of `var_names`. At least for the latter, this is more accurate, since the `vars` parameter actually took names (see the sketch below).
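
A short sketch of the new `var_names` argument; the toy model and data below are invented for illustration, and the values passed to `var_names` are plain variable names rather than variable objects.

```python
import numpy as np
import pymc3 as pm

y_obs = np.random.randn(50)  # made-up observations

with pm.Model() as model:
    mu = pm.Normal("mu", 0.0, sigma=1.0)
    pm.Normal("y", mu=mu, sigma=1.0, observed=y_obs)
    trace = pm.sample()

with model:
    # `var_names` replaces the deprecated `vars` argument in both functions
    prior = pm.sample_prior_predictive(var_names=["mu", "y"])
    post = pm.sample_posterior_predictive(trace, var_names=["y"])
```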

### Contributors sorted by number of commits

38 Thomas Wiecki
26 Luciano Paz
20 Colin Carroll
19 Junpeng Lao
19 lucianopaz
15 Chris Fonnesbeck
13 Juan Martín Loyola
13 Ravin Kumar
8 Robert P. Goldman
5 Tim Blazina
4 chang111
4 adamboche
3 Eric Ma
3 Osvaldo Martin
3 Colin
3 Sanmitra Ghosh
3 Saurav Shekhar
3 chartl
3 fredcallaway
3 Demetri
2 Daisuke Kondo
2 David Brochart
2 George Ho
2 Vaibhav Sinha
1 rpgoldman
1 Adel Tomilova
1 Adriaan van der Graaf
1 Bas Nijholt
1 Benjamin Wild
1 Brigitta Sipocz
1 Daniel Emaasit
1 Hari
1 Jeroen
1 Joseph Willard
1 Juan Martin Loyola
1 Katrin Leinweber
1 Lisa Martin
1 M. Domenzain
1 Matt Pitkin
1 Peadar Coyle
1 Rupal Sharma
1 Tom Gilliss
1 changjiangeng
1 michaelosthege
1 monsta
1 579397

## PyMC3 3.6 (Dec 21 2018)