Initial commit of pivoted cholesky algorithm from GPyTorch #5991

Closed
wants to merge 1 commit into from

Conversation

@kunalghosh (Contributor) commented Jul 20, 2022

What is this PR about?
Fast exact Gaussian processes (Gardner et al., 2018) rely on a modified batch conjugate gradients algorithm, which needs a function that returns a preconditioning matrix for a given kernel. This PR implements the function that computes that preconditioning matrix (a pivoted Cholesky factor, ported from GPyTorch).
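
For background, the preconditioner in Gardner et al. (2018) is built from a low-rank pivoted Cholesky factor of the kernel matrix. Below is a minimal NumPy sketch of the standard greedy pivoted Cholesky routine, included only as an illustration; it is not the code added in this PR, and the function name pivoted_cholesky_sketch and its exact stopping rule are my own.

import numpy as np

def pivoted_cholesky_sketch(A, max_rank, error_tol=1e-3):
    """Greedy pivoted Cholesky: returns L of shape (k, n) with A approx equal to L.T @ L."""
    n = A.shape[0]
    d = np.diag(A).astype(float)          # remaining diagonal of the residual
    perm = np.arange(n)                   # pivot order
    L = np.zeros((max_rank, n))
    for k in range(max_rank):
        # choose the column with the largest remaining diagonal entry as the pivot
        i = k + int(np.argmax(d[perm[k:]]))
        perm[[k, i]] = perm[[i, k]]
        pk = perm[k]
        if d[pk] <= 0.0:                  # numerically rank deficient: stop early
            return L[:k]
        L[k, pk] = np.sqrt(d[pk])
        rest = perm[k + 1:]
        # update the pivot row and the residual diagonal (Schur complement step)
        L[k, rest] = (A[pk, rest] - L[:k, pk] @ L[:k, rest]) / L[k, pk]
        d[rest] -= L[k, rest] ** 2
        if d[rest].sum() < error_tol:     # trace of the residual is small enough
            return L[:k + 1]
    return L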

Checklist

Major / Breaking Changes

  • None

Bugfixes / New features

  • New feature.

Docs / Maintenance

  • Documentation is yet to be added.

@kunalghosh (Contributor, Author) commented:

This is a draft pull request for my GSoC Project.

@kunalghosh (Contributor, Author) commented:

This produces the same results as gpytorch.pivoted_cholesky().

Test kernel matrices can be generated with the following code, which produces random positive semi-definite matrices:

import torch 
import gpytorch
import numpy as np

# Test PSD Matrix

N = 10
rank = 5
np.random.seed(1234) # nans with seed 1234
K = np.random.randn(N, N)
K = K @ K.T + N * np.eye(N)
K_torch = torch.from_numpy(K)

Passing the same kernel matrix to both functions then yields the same factor (up to a transpose):

L_gpt = gpytorch.pivoted_cholesky(K_torch, rank=rank, error_tol=1e-3)
L_np  = pivoted_cholesky_np_gpt(K, max_iter=rank, error_tol=1e-3)
assert np.allclose(L_gpt, L_np.T), "BUG: The two Cholesky decompositions are not the same!"
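
As a quick sanity check (not part of the PR; this assumes, consistently with the transpose in the assert above, that the returned N x rank factor satisfies K ≈ L @ L.T), the quality of the low-rank approximation can be inspected directly:

L = L_gpt.numpy()
approx_err = np.linalg.norm(K - L @ L.T)  # Frobenius norm of the residual
print(f"rank-{rank} pivoted Cholesky approximation error: {approx_err:.3e}")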

@codecov (bot) commented Jul 20, 2022

Codecov Report

Merging #5991 (416e5c2) into main (4e6527c) will increase coverage by 1.38%.
The diff coverage is 0.00%.

Impacted file tree graph

@@            Coverage Diff             @@
##             main    #5991      +/-   ##
==========================================
+ Coverage   87.68%   89.07%   +1.38%     
==========================================
  Files          73       74       +1     
  Lines       13225    13294      +69     
==========================================
+ Hits        11597    11842     +245     
+ Misses       1628     1452     -176     
Impacted Files Coverage Δ
pymc/gp/pivoted_cholesky.py 0.00% <0.00%> (ø)
pymc/__init__.py 68.42% <0.00%> (-31.58%) ⬇️
pymc/step_methods/compound.py 86.66% <0.00%> (-6.67%) ⬇️
pymc/data.py 80.08% <0.00%> (-1.56%) ⬇️
pymc/step_methods/hmc/base_hmc.py 89.76% <0.00%> (-0.72%) ⬇️
pymc/parallel_sampling.py 85.80% <0.00%> (-0.67%) ⬇️
pymc/distributions/discrete.py 99.21% <0.00%> (-0.52%) ⬇️
pymc/step_methods/hmc/nuts.py 97.40% <0.00%> (-0.10%) ⬇️
pymc/distributions/continuous.py 97.89% <0.00%> (-0.10%) ⬇️
pymc/ode/ode.py 85.85% <0.00%> (ø)
... and 30 more

@kunalghosh kunalghosh marked this pull request as draft July 20, 2022 16:24
@bwengals (Contributor) commented Jul 29, 2022

Hey @kunalghosh, so sorry again for missing this. Read through it, it looks good. Definitely with the torch dependency I think you should move it to pymc-experimental. Once there, tag me for a review and we can get it in ASAP

@kunalghosh (Contributor, Author) commented:

Closing, migrated to pymc_experimental

@kunalghosh kunalghosh closed this Aug 2, 2022