Derivatives are a key nonparametric functional in a wide range of applications where the rate of change of an unknown function is of interest. In the Bayesian paradigm, Gaussian processes (GPs) are routinely used as flexible priors for unknown functions and are among the most popular tools in many fields. However, little is known about the optimal modeling strategy and theoretical properties when using GPs for derivatives. In this article, we study the plug-in strategy of differentiating the posterior distribution under GP priors to infer derivatives of any order. This practically appealing plug-in GP method has previously been perceived as suboptimal, but that is not necessarily the case. We derive posterior contraction rates for the plug-in GP and establish that it remarkably adapts to the order of the derivative: the posterior measures of the regression function and its derivatives, under the same choice of hyperparameters that does not depend on the derivative order, converge at the minimax optimal rate up to a logarithmic factor for functions in certain classes. To our knowledge, this provides the first positive result for plug-in GPs in the context of inferring derivative functionals, and it leads to a practically simple nonparametric Bayesian method with guided hyperparameter tuning for jointly estimating the regression function and its derivatives. Simulations demonstrate competitive finite sample performance of the plug-in GP method, and a climate change application analyzing global sea level rise is discussed.
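
The plug-in idea amounts to differentiating the usual GP posterior for the regression function: since the posterior mean is a linear combination of kernel functions, its derivative is obtained by differentiating the kernel. Below is a minimal sketch in Python, assuming a one-dimensional problem with a squared-exponential kernel and fixed hyperparameters; the toy data, function names, and parameter values are illustrative, not the authors' implementation.

```python
# Minimal sketch of plug-in GP derivative estimation (illustrative only).
import numpy as np

def rbf(x, xp, ell):
    """Squared-exponential kernel k(x, x') = exp(-(x - x')^2 / (2 ell^2))."""
    d = x[:, None] - xp[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def drbf_dx(x, xp, ell):
    """Derivative of the kernel in its first argument, d k / d x."""
    d = x[:, None] - xp[None, :]
    return -(d / ell**2) * np.exp(-0.5 * (d / ell) ** 2)

# Toy data: noisy observations of f(x) = sin(x), so f'(x) = cos(x).
rng = np.random.default_rng(0)
n, ell, sigma = 50, 0.5, 0.1          # hypothetical hyperparameter values
X = np.sort(rng.uniform(0.0, 2.0 * np.pi, n))
y = np.sin(X) + sigma * rng.normal(size=n)

# Standard GP regression: alpha = (K + sigma^2 I)^{-1} y.
K = rbf(X, X, ell) + sigma**2 * np.eye(n)
alpha = np.linalg.solve(K, y)

# Posterior mean of f at test points, and the plug-in posterior mean of
# f' obtained by replacing k with d k / d x -- the same alpha, hence the
# same hyperparameters, serve both estimates.
Xs = np.linspace(0.0, 2.0 * np.pi, 200)
mean_f = rbf(Xs, X, ell) @ alpha
mean_df = drbf_dx(Xs, X, ell) @ alpha

print("max abs error of f' estimate:", np.max(np.abs(mean_df - np.cos(Xs))))
```

Note that the derivative estimate reuses the weights `alpha` from the fit of the regression function itself, which is the sense in which a single choice of hyperparameters serves the function and its derivatives jointly.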


