Arbitrary Deterministic without Theano function? #507
If this is possible, we should probably add an example. The current examples of deterministics are very minimal.
Another reason for supporting easy plug-in code is that hand-tuned cython / SSE will probably outperform Theano in many cases. This would give users a very clear path to optimizing their code: find the rate-limiting step and optimize it.
I suppose one way to do this is by extending Theano (via a custom theano.Op). I'm looking at this now.
I agree that we need to support this. I think the theano.Op way is the way to go.
I'd love to hear suggestions for how to make it easier, and/or syntax ideas.
Yeah, I think that is required, and we should just raise an error if a non-auto-diffable logp is used with a gradient-based sampler. Writing an Op for each operator seems a little cumbersome. Don't they support blackbox Ops that do arbitrary transforms on the inputs and get called like functions, without having to define the Op?
I think not really, but we can probably pressure Theano into doing so.
In the long term, it probably makes sense to refactor PyMC to have an interface that allows us to swap out sampling engines, so that we are not completely tied to Theano. That way, one could conceivably write step methods that use Theano, SymPy, or some other engine for doing sampling. In the shorter term, talking to the Theano folks to see what they might be able to do makes sense. I wonder if any of them will be at SciPy 2014.
I agree that being completely tied to Theano is undesirable. For example, many Python users have become quite good at the following optimization pipeline: prototype in numpy, profile to find the rate-limiting step, then rewrite that step in hand-tuned cython / SSE.
A first step towards this goal might be to create a PyMC base class that allows gradients to be calculated either manually (via a subclass) or automagically (via Theano). Then, Theano would just provide one way to get derivatives.
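A rough sketch of what such a base class might look like (all names here are hypothetical, and the finite-difference fallback merely stands in for whatever non-Theano gradient path would actually be used):

```python
import numpy as np

class LogP:
    # Hypothetical base class: subclasses must supply logp; grad has a
    # numerical fallback that a subclass (or a Theano-backed engine)
    # could override with an exact derivative.
    def logp(self, x):
        raise NotImplementedError

    def grad(self, x, eps=1e-6):
        # Fallback: central finite differences, one coordinate at a time.
        g = np.zeros_like(x, dtype=float)
        for i in range(x.size):
            d = np.zeros_like(x, dtype=float)
            d.flat[i] = eps
            g.flat[i] = (self.logp(x + d) - self.logp(x - d)) / (2 * eps)
        return g

class StdNormal(LogP):
    # Standard normal log-density, up to a constant: -0.5 * sum(x**2).
    def logp(self, x):
        return -0.5 * np.sum(x ** 2)

    def grad(self, x, eps=None):
        # Manual exact gradient, overriding the numerical fallback.
        return -x
```

A gradient-based sampler would call grad() without caring which path produced it; a model without an exact gradient (or a Theano graph) would either use the fallback or be restricted to non-gradient step methods.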
I opened a theano issue about this here:
Just to note: I have closed this issue, as I just merged a PR that should allow this.
@nouiz Can you link to it here? Thanks!
The merged PR is Theano/Theano#1785. The new decorator is documented at http://deeplearning.net/software/theano/library/compile/ops.html#theano.compile.ops.as_op
Kudos to the Theano guys for making this happen so quickly!
We should be able to make the
Yes, definitely.
Right, but we also want to support non-Theano likelihoods.
Thomas Wiecki
That should actually be as easy as adding an as_op decorator to the
Could you give a minimal example for that? Thanks!
Here is the doc we have about as_op: http://deeplearning.net/software/theano/library/compile/ops.html#theano.compile.ops.as_op
Thanks for the hint, I got the coalmine example working with the decorator:
However, the reason I'm interested in pymc3 is multiprocessing, but as soon as I add
It happens only if I use the as_op decorator, and it looks like argsample() in sampling.py is to blame, even though the function is defined at top level so that it can be pickled.
I know there is some problem with pickle and decorators. @abergeron, you probably know more about this.
Fred
For as_op it's a bit different, since it makes a class, not a function. But it's still not impossible.
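The underlying limitation is easy to reproduce without Theano: pickle stores classes and functions by qualified name, so an instance of a class defined inside a decorator body cannot be pickled. (as_op_like below is a toy imitation for illustration, not Theano's actual implementation.)

```python
import pickle

def as_op_like(fn):
    # Toy imitation of a decorator that, like as_op, returns an
    # instance of a class rather than a plain function.
    class _Op:
        def __call__(self, *args):
            return fn(*args)
    return _Op()

@as_op_like
def double(x):
    return 2 * x

# The wrapper works fine as a callable...
assert double(21) == 42

# ...but pickling fails, because _Op is a local class that pickle
# cannot look up by qualified name.
try:
    pickle.dumps(double)
    picklable = True
except (pickle.PicklingError, AttributeError):
    picklable = False
```

This is why a serial run works while multiprocessing (which pickles the model to send it to worker processes) breaks. A fix on the Theano side would have to give the generated Op class a pickling path of its own.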
If somebody wants to write tests for that code, we can include it in Theano. It should make Ops created from as_op pickleable.
There is a PR to Theano from @abergeron with a fix.
@maahnman The pickle PR has been merged. Can you check whether that fixes the problem?
It works, thanks for the quick fix!
Great, do you want to do a PR with an example (like the one you already have)?
I'm trying to build a model that performs a calculation that is unavailable in Theano. In pymc2, I could do this via a deterministic that acted on the numpy form of a random variable (e.g. x.value).

In pymc3, is there any way to do this? For example, I need to calculate a matrix exponential, but there is no Theano function for the matrix exponential. In pymc2, I could get around this by using a scipy.linalg function.

Am I right in concluding that pymc3 can only support operations that are combinations of Theano functions?
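For context, the scipy routine in question is scipy.linalg.expm. The snippet below just demonstrates it on a case with a known closed form (the matrix exponential of a diagonal matrix is the elementwise exponential of its diagonal), which is exactly the kind of numpy/scipy computation one would want to plug into a model:

```python
import numpy as np
from scipy.linalg import expm

# Matrix exponential of a diagonal matrix: expm(diag(d)) == diag(exp(d)).
A = np.diag([1.0, 2.0])
E = expm(A)
expected = np.diag(np.exp([1.0, 2.0]))
```

Since there is no Theano equivalent of expm, a call like this is what would need to be wrapped (for example with the as_op decorator discussed above) to appear inside a pymc3 model.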
To me, one of the key advantages of pymc2 was that I could plug in nearly arbitrary code for use in an MCMC framework. I realize that Theano / auto-gradients are required for using the HMC / NUTS samplers on arbitrary problems. However, it would be awesome if there were still an easy way to call arbitrary functions / libraries, even if it means "falling back" on old-school step methods.