Hyper-Parameter Auto-Tuning for Sparse Bayesian Learning

Dawei Gao, Qinghua Guo, Ming Jin, Guisheng Liao, Yonina C. Eldar

Research output: Contribution to journal › Article

Abstract

The choice of hyper-parameter values in sparse Bayesian learning (SBL) can significantly impact performance. However, hyper-parameters are normally tuned manually, which is often difficult. Recently, effective automatic hyper-parameter tuning was achieved with an empirical auto-tuner. In this work, we address hyper-parameter auto-tuning using neural network (NN)-based learning. Inspired by the empirical auto-tuner, we design and learn an NN-based auto-tuner, and show that it yields considerable improvements in convergence rate and recovery performance.
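
As a rough illustration of the kind of hyper-parameter the abstract refers to, the sketch below runs a standard EM-style SBL recovery in which a Gamma-hyperprior parameter (called eps here) regularises the variance updates. The auto_tune branch applies a purely hypothetical per-iteration adjustment rule; it stands in for the empirical or NN-based auto-tuners mentioned in the abstract, neither of which is specified in this record.

```python
import numpy as np

def sbl_em(Phi, y, n_iter=100, noise_var=1e-4, eps=1e-3, auto_tune=False):
    """EM-style sparse Bayesian learning for y = Phi @ x + noise.

    `eps` acts as the Gamma-hyperprior parameter on the coefficient
    precisions; its value affects convergence speed and recovery quality.
    The `auto_tune` branch is a hypothetical heuristic used only to
    illustrate per-iteration hyper-parameter adjustment (it is not the
    auto-tuning rule from the paper).
    """
    M, N = Phi.shape
    gamma = np.ones(N)  # per-coefficient prior variances
    for _ in range(n_iter):
        # Posterior covariance and mean of x under the current hyper-parameters
        Sigma = np.linalg.inv(Phi.T @ Phi / noise_var + np.diag(1.0 / gamma))
        mu = (Sigma @ Phi.T @ y) / noise_var
        # EM update of the variances, regularised by the hyper-parameter eps
        gamma = (mu**2 + np.diag(Sigma) + 2.0 * eps) / (1.0 + 2.0 * eps)
        if auto_tune:
            # Hypothetical empirical rule: grow eps when the estimate looks
            # dense, shrink it when it looks sparse.
            active_frac = np.mean(np.abs(mu) > 1e-3 * np.max(np.abs(mu)))
            eps = np.clip(eps * (1.0 + 0.1 * (active_frac - 0.5)), 1e-6, 1.0)
    return mu, gamma


# Toy usage: recover a sparse vector from noisy compressed measurements.
rng = np.random.default_rng(0)
M, N, K = 60, 100, 8
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
x_true = np.zeros(N)
x_true[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
y = Phi @ x_true + 0.01 * rng.standard_normal(M)
x_hat, _ = sbl_em(Phi, y, auto_tune=True)
print("NMSE:", np.sum((x_hat - x_true) ** 2) / np.sum(x_true ** 2))
```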
Original language: English
Number of pages: 5
Journal: arxiv.org
State: In preparation - 9 Nov 2022

