Unitary Approximate Message Passing for Sparse Bayesian Learning

Man Luo, Qinghua Guo, Ming Jin, Yonina C. Eldar, Defeng Huang, Xiangming Meng

Research output: Contribution to journal › Article › peer-review

Abstract

Sparse Bayesian learning (SBL) can be implemented with low complexity based on the approximate message passing (AMP) algorithm. However, AMP does not work well for a generic measurement matrix and may diverge. Damped AMP has been used for SBL to alleviate the divergence issue, at the cost of slower convergence. In this work, we propose a new SBL algorithm based on structured variational inference that leverages AMP with a unitary transformation (UAMP). Both single-measurement-vector and multiple-measurement-vector problems are investigated. It is shown that, compared to state-of-the-art AMP-based SBL algorithms, the proposed UAMP-SBL is more robust and efficient, leading to remarkably better performance.
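The unitary transformation the abstract refers to can be illustrated with a short sketch. The idea (as described in the UAMP literature) is to take the SVD of the measurement matrix and left-multiply the measurements by the unitary factor, yielding an equivalent model on which AMP behaves more robustly. The variable names below are illustrative, and this is a minimal sketch of the preprocessing step, not the authors' implementation:

```python
import numpy as np

# Sketch of the unitary-transform preprocessing (assumed setup, not the
# paper's code). Model: y = A x + w, with x sparse.
rng = np.random.default_rng(0)
M, N = 50, 100
A = rng.standard_normal((M, N))          # generic measurement matrix
x = np.zeros(N)
x[rng.choice(N, 10, replace=False)] = rng.standard_normal(10)
y = A @ x + 0.01 * rng.standard_normal(M)

# SVD: A = U diag(s) V^T. Left-multiplying by the unitary U^T gives the
# equivalent model r = Phi x + U^T w, where Phi = diag(s) V^T. Since U
# is unitary, the noise statistics are unchanged, while Phi has the
# structure that makes AMP-style iterations more stable.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = U.T @ y                  # transformed measurements
Phi = np.diag(s) @ Vt        # transformed measurement matrix

# Sanity check: the two models are equivalent.
assert np.allclose(U @ Phi, A)
```

The SBL iterations would then run on the transformed pair `(r, Phi)` instead of `(y, A)`.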

Original language: English
Article number: 2101.09954
Pages (from-to): 6023-6039
Number of pages: 17
Journal: IEEE Transactions on Signal Processing
Volume: 69
State: Published - 24 Sep 2021

Keywords

  • Sparse Bayesian learning
  • approximate message passing
  • structured variational inference

All Science Journal Classification (ASJC) codes

  • Signal Processing
  • Electrical and Electronic Engineering
