Empirical Inference Talk 2006

Sampling for non-conjugate infinite latent feature models

Latent variable models are powerful tools for modelling the underlying structure in data. Infinite latent variable models can be defined using Bayesian nonparametrics. Dirichlet process (DP) models are an example of infinite latent class models, in which each object is assumed to belong to exactly one of infinitely many mutually exclusive classes. Recently, the Indian buffet process (IBP) has been defined as an extension of the DP. The IBP is a distribution over sparse binary matrices with infinitely many columns, which can be used as a distribution over non-exclusive features. Inference using Markov chain Monte Carlo (MCMC) in conjugate IBP models has been described previously; however, requiring conjugacy restricts the use of the IBP. We describe an MCMC algorithm for non-conjugate IBP models. Modelling choice behaviour is an important topic in psychology, economics and related fields. Elimination by Aspects (EBA) is a choice model that assumes each alternative has latent features, with associated weights, that lead to the observed choice outcomes. We formulate a nonparametric version of EBA by using the IBP as the prior over the latent binary features, and use our sampling scheme to infer the features of objects that give rise to the observed choice data.
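For concreteness, the two ingredients the abstract refers to can be sketched in a few lines: the IBP generative process for a sparse binary feature matrix (each object takes an existing feature with probability proportional to its popularity, then adds a Poisson number of new features), and the EBA choice probability in the simplest two-alternative case, where only the aspects that distinguish the pair matter. This is a minimal illustrative sketch, not the authors' code; the function names and set-based feature representation are our own.

```python
import numpy as np

def sample_ibp(alpha, n, rng=None):
    """Draw a binary feature matrix Z (n objects x K features) from the
    Indian buffet process prior with concentration parameter alpha.

    Object i takes each existing feature k with probability m_k / i, where
    m_k is the number of earlier objects with feature k, and then adds
    Poisson(alpha / i) brand-new features of its own.
    """
    rng = np.random.default_rng(rng)
    Z = np.zeros((n, 0), dtype=int)
    for i in range(1, n + 1):
        m = Z.sum(axis=0)                      # popularity of each existing feature
        Z[i - 1, :] = rng.random(Z.shape[1]) < m / i
        k_new = rng.poisson(alpha / i)         # number of new features for object i
        if k_new > 0:
            new_cols = np.zeros((n, k_new), dtype=int)
            new_cols[i - 1, :] = 1             # object i owns its new features
            Z = np.hstack([Z, new_cols])
    return Z

def eba_pairwise(features_x, features_y, weights):
    """P(choose x over y) under Elimination by Aspects for a binary choice.

    features_x, features_y are sets of aspect labels; weights maps each
    aspect to a positive weight. Shared aspects cancel, so only the
    distinguishing aspects contribute. (Assumes the two sets differ.)
    """
    u_x = sum(weights[a] for a in features_x - features_y)
    u_y = sum(weights[a] for a in features_y - features_x)
    return u_x / (u_x + u_y)
```

Note how conjugacy never enters the prior itself: the need for conjugate likelihoods arises only at inference time, which is why a sampler for non-conjugate IBP models widens the class of usable likelihoods such as EBA.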

Author(s): Görür, D. and Rasmussen, CE.
Year: 2006
Month: June
Day: 0
Editors: Bernardo, J. M.
Bibtex Type: Talk (talk)
Digital: 0
Electronic Archiving: grant_archive
Event Name: 8th Valencia International Meeting on Bayesian Statistics (ISBA 2006)
Event Place: Benidorm, Spain
Language: en
Organization: Max-Planck-Gesellschaft
School: Biologische Kybernetik

BibTeX

@talk{5364,
  title = {Sampling for non-conjugate infinite latent feature models},
  abstract = {Latent variable models are powerful tools for modelling the underlying
  structure in data. Infinite latent variable models can be defined using Bayesian
  nonparametrics. Dirichlet process (DP) models are an example of infinite latent
  class models, in which each object is assumed to belong to exactly one of infinitely
  many mutually exclusive classes. Recently, the Indian buffet process (IBP) has
  been defined as an extension of the DP. The IBP is a distribution over sparse
  binary matrices with infinitely many columns, which can be used as a distribution
  over non-exclusive features. Inference using Markov chain Monte Carlo (MCMC) in
  conjugate IBP models has been described previously; however, requiring conjugacy
  restricts the use of the IBP. We describe an MCMC algorithm for non-conjugate
  IBP models. Modelling choice behaviour is an important topic in psychology,
  economics and related fields. Elimination by Aspects (EBA) is a choice model
  that assumes each alternative has latent features, with associated weights, that
  lead to the observed choice outcomes. We formulate a nonparametric version of
  EBA by using the IBP as the prior over the latent binary features, and use our
  sampling scheme to infer the features of objects that give rise to the observed
  choice data.},
  editors = {Bernardo, J. M.},
  organization = {Max-Planck-Gesellschaft},
  school = {Biologische Kybernetik},
  month = jun,
  year = {2006},
  slug = {5364},
  author = {G{\"o}r{\"u}r, D. and Rasmussen, CE.},
  month_numeric = {6}
}