The breadth of theoretical results on efficient Markov Chain Monte Carlo (MCMC) sampling schemes on discrete spaces is slim compared to the theory available for MCMC sampling schemes on continuous spaces. Nonetheless, [Zan17] presents a simple framework for designing Metropolis-Hastings (MH) proposal kernels that incorporate local information about the target, and characterizes the class of functions for which the resulting MH kernels are Peskun optimal in high-dimensional regimes. We will refer to these functions as \textit{balancing functions} and to the resulting MH proposal kernels as \textit{pointwise informed proposals}. In [PG19], balancing functions are used to construct Markov Jump Processes (MJPs) on discrete state spaces, yielding the Zanella process. In the absence of a theoretical result identifying the optimal balancing function within this class, a heuristic selection approach based on the Zanella process is proposed. To further improve the mixing of the simulated chain, the algebraic structure of the state space is exploited to obtain Markov chains that are non-reversible on short to medium timescales. All the MCMC sampling schemes considered are compared in simulations on the Bayesian record linkage problem.
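As a brief orienting sketch (the notation below is illustrative and assumes the standard setup of [Zan17]; the symbols $\pi$, $K$, $Q_g$ are not fixed by this abstract): given a target $\pi$ on a discrete space and a symmetric base kernel $K$, a pointwise informed proposal weights each candidate move by a function $g$ of the ratio of target values,
\[
    Q_g(x, y) \;\propto\; g\!\left(\frac{\pi(y)}{\pi(x)}\right) K(x, y),
\]
and $g$ is a \textit{balancing function} when it satisfies
\[
    g(t) = t \, g(1/t), \qquad t > 0,
\]
as is the case, for instance, for $g(t) = \sqrt{t}$ and the Barker-type choice $g(t) = t/(1+t)$.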