Hey, I worked briefly on this project a few years back, looking at some ABC methods, but essentially ran out of time: the code never reached a production standard and performed fairly poorly. I've got some free time now and, with a few years of professional experience behind me, would be interested in tidying up that work and contributing some (hopefully) higher-quality code for these methods, along with some other features, to the pints project.
In addition to tidying up the previous work on HABC, ABC-MCMC, and the synthetic likelihoods and gradients, I believe the poor performance of HABC in my previous investigation was likely due to gradient noise decimating the acceptance rate in the MH step. I would be interested in implementing the Stochastic Gradient Hamiltonian Monte Carlo (SGHMC) sampler, which is much more tolerant of gradient noise (https://arxiv.org/pdf/1402.4102.pdf) and should give much better performance when numerically estimated gradients/Hessians are used. This would also permit a nice interface for augmenting any existing LogPDF or ForwardModel with estimated sensitivities.
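To make the SGHMC suggestion concrete, here is a minimal sketch of the idea from the linked paper (Chen et al., 2014): a friction term counteracts the energy injected by noisy gradients, so no MH accept/reject step is needed and the acceptance-rate collapse seen with HABC doesn't arise. The `sghmc` function, the constant friction value, and the toy Gaussian target are all illustrative assumptions, not a proposed PINTS API:

```python
import numpy as np

def sghmc(noisy_grad, theta0, n_iters=20000, eps=0.1, friction=1.0, seed=1):
    """Naive 1-D SGHMC sketch with a constant friction term.

    noisy_grad(theta) returns a stochastic estimate of grad U(theta),
    where the target density is proportional to exp(-U(theta)).
    """
    rng = np.random.default_rng(seed)
    theta, v = float(theta0), 0.0
    samples = np.empty(n_iters)
    noise_std = np.sqrt(2.0 * friction * eps)
    for t in range(n_iters):
        g = noisy_grad(theta)
        # Friction dissipates the extra energy injected by gradient noise,
        # so the chain needs no Metropolis-Hastings correction step.
        v += -eps * friction * v - eps * g + noise_std * rng.standard_normal()
        theta += eps * v
        samples[t] = theta
    return samples

# Toy target: standard normal, U(theta) = theta^2 / 2, with noisy gradients.
noise_rng = np.random.default_rng(0)
grads = lambda th: th + 0.5 * noise_rng.standard_normal()
draws = sghmc(grads, theta0=2.0)[2000:]  # discard burn-in
```

Despite the gradient noise (std 0.5), the post-burn-in draws have mean near 0 and variance near 1, as expected for the standard normal target.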
Potential work items:

- Tidy up existing synthetic likelihood code
- Extend synthetic likelihoods to reuse existing LogPDFs over parameter space
- Decouple the finite-difference gradient estimation code (2-point, 3-point, SPSA) and implement a decorator for LogPDF and ForwardModel
- Evaluate ABC-SGHMC, ABC-SGLD, and ABC-MCMC (with an indicator kernel) on likelihood-free stochastic toy problems
- Evaluate exotic ABC variants using synthetic likelihoods with estimated gradients, e.g. ABC-NUTS and ABC-HBMCMC, to see whether any perform well
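As a sketch of the third item above, the finite-difference estimators could live as free functions, with a small decorator layering estimated sensitivities onto a plain log-pdf callable. The class and function names here are hypothetical; the `evaluateS1` method name just mimics the convention PINTS uses for sensitivity-returning evaluation:

```python
import numpy as np

def central_difference(f, x, h=1e-5):
    """3-point (central) finite-difference gradient of a scalar function f."""
    x = np.asarray(x, dtype=float)
    grad = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        grad[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return grad

def spsa_gradient(f, x, c=1e-3, rng=None):
    """SPSA: a single simultaneous random perturbation estimates the whole
    gradient from only two evaluations of f, regardless of dimension."""
    rng = rng or np.random.default_rng()
    x = np.asarray(x, dtype=float)
    delta = rng.choice([-1.0, 1.0], size=x.shape)  # Rademacher perturbation
    return (f(x + c * delta) - f(x - c * delta)) / (2.0 * c * delta)

class EstimatedSensitivities:
    """Hypothetical decorator: wraps a plain log-pdf callable so it also
    returns a finite-difference gradient estimate alongside its value."""

    def __init__(self, log_pdf, estimator=central_difference):
        self._f = log_pdf
        self._estimator = estimator

    def __call__(self, x):
        return self._f(x)

    def evaluateS1(self, x):
        return self._f(x), self._estimator(self._f, x)

# Usage on a standard-normal log-pdf, whose exact gradient is -x.
log_pdf = lambda x: -0.5 * np.sum(x ** 2)
wrapped = EstimatedSensitivities(log_pdf)
val, g = wrapped.evaluateS1(np.array([1.0, -2.0]))
```

Swapping `estimator=spsa_gradient` would trade accuracy for a fixed two-evaluation cost per gradient, which is what makes it attractive for expensive forward models.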
If there are currently students working on any of the above, let me know and I'll gladly leave those items out. Otherwise, let me know if you think any of the above would be useful and I'll have a crack at writing it up and seeing how it goes!
Thanks,
Jack