Bayesian Generalized Kernel Mixed Models Zhihua Zhang, Guang Dai and Michael I. Jordan JMLR 2011


Page 1:

Bayesian Generalized Kernel Mixed Models

Zhihua Zhang, Guang Dai and Michael I. Jordan

JMLR 2011

Page 2:

Summary of contributions

• Propose generalized kernel models (GKMs) as a framework in which sparsity can be given an explicit treatment and in which a fully Bayesian methodology can be carried out

• Data augmentation methodology used to develop an MCMC algorithm for inference

• The approach is shown to be related to Gaussian processes and to provide a flexible approximation method for GPs

Page 3:

Bayesian approach for kernel supervised learning

• The form of the regressor or classifier is given by $f(x) = b + \sum_{j=1}^{n} K(x, x_j)\,\beta_j$

• For a Mercer kernel $K$, there exists a corresponding mapping (say $\phi$) from the input space $\mathcal{X}$ to a feature space $\mathcal{F}$ such that $K(x_i, x_j) = \phi(x_i)^\top \phi(x_j)$

• This provides an equivalent representation in the feature space, where $f(x) = b + \phi(x)^\top w$ with $w = \sum_{j=1}^{n} \beta_j\, \phi(x_j)$
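As a quick check of the two equivalent representations above, here is a minimal Python sketch (not code from the paper; the degree-2 polynomial kernel, toy inputs, and coefficients are assumptions made for illustration) that evaluates the same regressor in both the kernel-expansion form and the feature-space form:

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 2))        # toy training inputs x_1, ..., x_4
beta = rng.normal(size=4)          # toy expansion coefficients beta_j
b = 0.5                            # toy intercept

def kernel(x, z):
    # inhomogeneous degree-2 polynomial kernel K(x, z) = (x.z + 1)^2
    return (x @ z + 1.0) ** 2

def phi(x):
    # explicit feature map for that kernel: K(x, z) = phi(x) . phi(z)
    x1, x2 = x
    return np.array([x1**2, x2**2, np.sqrt(2) * x1 * x2,
                     np.sqrt(2) * x1, np.sqrt(2) * x2, 1.0])

x_new = np.array([0.3, -1.2])

# kernel-expansion form: f(x) = b + sum_j K(x, x_j) beta_j
f_kernel = b + sum(kernel(x_new, xj) * bj for xj, bj in zip(X, beta))

# feature-space form: f(x) = b + phi(x)^T w, with w = sum_j beta_j phi(x_j)
w = sum(bj * phi(xj) for xj, bj in zip(X, beta))
f_feature = b + phi(x_new) @ w

print(f_kernel, f_feature)         # agree up to floating point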

Page 4:

Generalized Kernel Models
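As a minimal sketch of the definition (notation assumed, by analogy with generalized linear models): a GKM replaces the GLM linear predictor with the kernel expansion,

\[
g\big(\mathrm{E}[y_i \mid x_i]\big) = b + \sum_{j=1}^{n} K(x_i, x_j)\,\beta_j ,
\]

where $g$ is a link function (identity for regression, logit or probit for binary classification).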

Page 5:

Prior for regression coefficients

Page 6:

Sparse models

• Recall that the number of active vectors is the number of non-zero components of $\beta$
– We are thus interested in a prior for $\beta$ which allows some of its components to be exactly zero
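A minimal sketch of how an indicator vector makes this explicit (a generic spike-and-slab form, assumed here for illustration rather than taken from the paper):

\[
\beta_j \mid \gamma_j \;\sim\; \gamma_j\, \mathrm{N}(0, \sigma_\beta^2) + (1 - \gamma_j)\, \delta_0 , \qquad \gamma_j \in \{0, 1\},
\]

so the active vectors are exactly those $x_j$ with $\gamma_j = 1$.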

Page 7:

Methodology

For the indicator vector $\gamma = (\gamma_1, \ldots, \gamma_n)^\top$:
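A standard specification at this point, given only as a hedged sketch (the hyperparameters $a_0, b_0$ and the Beta hyperprior are assumptions), is independent Bernoulli components with a prior on the inclusion probability:

\[
p(\gamma \mid \pi) = \prod_{j=1}^{n} \pi^{\gamma_j} (1 - \pi)^{1 - \gamma_j}, \qquad \pi \sim \mathrm{Beta}(a_0, b_0).
\]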

Page 8:

Graphical model

Page 9:

Inference

• Gibbs sampling for most parameters
• Metropolis-Hastings for the kernel parameters
• Reversible jump Markov chain for $\gamma$
– $\gamma$ takes $2^n$ distinct values
– For small $n$, the posterior may be obtained by computing the normalizing constant, summing over all possible values of $\gamma$ (a toy version is sketched below)
– For large $n$, a reversible jump MCMC sampler may be employed to identify high posterior probability models
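As a toy illustration of the small-$n$ case, here is a minimal Python sketch that enumerates all $2^n$ indicator vectors. It uses a Gaussian surrogate model with a uniform prior over $\gamma$, not the paper's GKM likelihood; the kernel, noise level, and scale $c$ are made-up values.

import itertools
import numpy as np

rng = np.random.default_rng(0)
n, sigma2, c = 8, 0.1, 10.0

X = rng.normal(size=(n, 1))
K = np.exp(-0.5 * (X - X.T) ** 2)            # RBF kernel matrix on toy inputs

beta_true = np.zeros(n)
beta_true[[1, 5]] = [1.5, -2.0]              # two "active" vectors generate the data
y = K @ beta_true + rng.normal(scale=np.sqrt(sigma2), size=n)

def log_marginal(gamma):
    # log N(y; 0, sigma2 * (I + c * K_g K_g^T)), K_g = active columns of K
    K_g = K[:, np.array(gamma, dtype=bool)]
    cov = sigma2 * (np.eye(n) + c * K_g @ K_g.T)
    _, logdet = np.linalg.slogdet(2.0 * np.pi * cov)
    return -0.5 * (logdet + y @ np.linalg.solve(cov, y))

gammas = list(itertools.product([0, 1], repeat=n))   # all 2^n indicator vectors
logp = np.array([log_marginal(g) for g in gammas])
post = np.exp(logp - logp.max())
post /= post.sum()                                   # normalizing constant by summation
print("MAP indicator vector:", gammas[int(post.argmax())], "posterior prob:", post.max())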

Page 10:

Automatic choice of active vectors

• We generate a proposal from the current value of $\gamma$ by one of three possible moves (a sketch follows below):

Prediction:
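The sketch below assumes the three moves are the usual add / delete / swap moves for an indicator vector (an assumption, not necessarily the paper's exact construction). Prediction is then typically a Monte Carlo average of the predictive density over the retained posterior samples.

import numpy as np

def propose(gamma, rng):
    # Propose a new indicator vector by one of three moves: add, delete, swap
    # (assumed move set; see the note above).
    gamma = gamma.copy()
    active = np.flatnonzero(gamma == 1)
    inactive = np.flatnonzero(gamma == 0)
    moves = []
    if inactive.size:
        moves.append("add")
    if active.size:
        moves.append("delete")
    if active.size and inactive.size:
        moves.append("swap")
    move = rng.choice(moves)
    if move == "add":                        # activate one currently inactive vector
        gamma[rng.choice(inactive)] = 1
    elif move == "delete":                   # deactivate one currently active vector
        gamma[rng.choice(active)] = 0
    else:                                    # exchange an active and an inactive vector
        gamma[rng.choice(active)] = 0
        gamma[rng.choice(inactive)] = 1
    return gamma, move

rng = np.random.default_rng(1)
print(propose(np.array([1, 0, 0, 1, 0]), rng))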

Page 11:

Sparse Gaussian process for classification

Given a function $f(x) = \phi(x)^\top w$ with $w \sim \mathrm{N}(0, I)$, $f$ is a Gaussian process with zero mean and covariance function $K(x, x') = \phi(x)^\top \phi(x')$, and vice versa.
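To make the connection concrete at the training inputs, here is a minimal numerical sketch; the prior $\beta \sim \mathrm{N}(0, K^{-1})$ (up to a scale factor) is an assumption made for illustration. With $f = K\beta$, the covariance of $f$ is $K K^{-1} K = K$, i.e. the finite-dimensional marginal of a zero-mean GP with covariance $K$.

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 1))
K = np.exp(-0.5 * (X - X.T) ** 2) + 1e-8 * np.eye(5)   # RBF kernel (with jitter)

# Draws from the assumed prior beta ~ N(0, K^{-1})
L = np.linalg.cholesky(np.linalg.inv(K))
beta = (L @ rng.normal(size=(5, 100000))).T

# f = K beta for each draw; its empirical covariance should be close to K
F = beta @ K
print(np.max(np.abs(np.cov(F, rowvar=False) - K)))      # small Monte Carlo error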

Page 12:

Sparse GP classification

Page 13:

Results

Pages 14-19: experimental results (figures only)