Journal of Probability and Statistics
Volume 2010 (2010), Article ID 201489, 14 pages
doi:10.1155/2010/201489
Research Article

Spiked Dirichlet Process Priors for Gaussian Process Models

Terrance Savitsky2 and Marina Vannucci1

1Department of Statistics, Rice University, Houston, TX 77030, USA
2Statistics Group, RAND Corporation, Santa Monica, CA 90407, USA

Received 27 December 2009; Revised 19 August 2010; Accepted 5 October 2010

Academic Editor: Ishwar Basawa

Copyright © 2010 Terrance Savitsky and Marina Vannucci. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

We expand a framework for Bayesian variable selection for Gaussian process (GP) models by employing spiked Dirichlet process (DP) prior constructions over set partitions containing covariates. Our approach results in a nonparametric treatment of the distribution of the covariance parameters of the GP covariance matrix that, in turn, induces a clustering of the covariates. We evaluate two prior constructions: the first employs a mixture of a point mass and a continuous distribution as the centering distribution for the DP prior, thus clustering all covariates; the second employs a mixture of a spike and a DP prior with a continuous centering distribution, which induces clustering of the selected covariates only. DP models borrow information across covariates through model-based clustering. Our simulation results show, in particular, a reduction in posterior sampling variability and, in turn, enhanced prediction performance. In both model formulations, we accomplish posterior inference by employing novel combinations and extensions of existing algorithms for inference with DP prior models, and we compare performance under the two prior constructions.
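Schematically, the two constructions described above can be sketched as follows. The notation here is illustrative rather than the paper's own: $\theta_k$ denotes the GP covariance parameter associated with covariate $k$, $\delta_{\theta_0}$ a point mass at the value that effectively excludes that covariate, $F$ a continuous distribution, $\alpha$ the DP concentration parameter, $\pi$ a mixing weight, and $\gamma_k$ a binary selection indicator.

Construction 1 (spiked centering distribution, clustering all covariates):
\[
\theta_k \mid G \overset{\text{iid}}{\sim} G, \qquad G \sim \mathrm{DP}(\alpha, G_0), \qquad G_0 = \pi\,\delta_{\theta_0} + (1-\pi)\,F .
\]

Construction 2 (spike mixed with a DP having a continuous centering distribution, clustering the selected covariates only):
\[
\theta_k \sim \gamma_k\, G + (1-\gamma_k)\,\delta_{\theta_0}, \qquad G \sim \mathrm{DP}(\alpha, F), \qquad \gamma_k \in \{0,1\} .
\]

In the first construction every covariate is assigned to a cluster, with exclusion arising through the point-mass component of the base measure; in the second, only covariates with $\gamma_k = 1$ enter the DP and are therefore clustered.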