Fast total variation image restoration with parameter estimation using Bayesian inference

Bruno Amizic*, S. Derin Babacan, Michael K. Ng, Rafael Molina, Aggelos K. Katsaggelos

*Corresponding author for this work

Research output: Chapter in book/report/conference proceeding › Conference proceeding › peer-review

3 Citations (Scopus)

Abstract

In this paper we propose two fast Total Variation (TV) based algorithms for image restoration by utilizing a variational approximation of the posterior distribution. The unknown image and the hyperparameters of the image and observation models are formulated and estimated simultaneously within a hierarchical Bayesian framework, rendering the algorithms fully automated with no free parameters. Experimental results demonstrate that the proposed algorithms provide restoration results competitive with existing methods in terms of image quality while achieving superior computational efficiency.
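
For readers unfamiliar with this class of methods, the sketch below illustrates the general flavour of TV-based restoration with automatic hyperparameter estimation: the image estimate and the two model hyperparameters (the noise precision and the TV prior weight) are updated in alternation. This is a minimal illustration only; the Gaussian blur operator, smoothed TV approximation, step size, and hyperparameter update rules are assumptions chosen for a self-contained example and do not reproduce the paper's variational Bayesian algorithm.

```python
# Illustrative sketch only: generic TV-regularized deblurring with crude
# alternating updates of the noise (beta) and TV (alpha) hyperparameters.
# NOT the paper's variational Bayesian algorithm; all modeling choices
# below are assumptions made to keep the example runnable.
import numpy as np
from scipy.ndimage import gaussian_filter

def tv_restore(y, blur_sigma=1.5, iters=200, eps=1e-3):
    """Restore a blurred, noisy 2-D image y; returns the estimate x."""
    H = lambda x: gaussian_filter(x, blur_sigma)    # assumed (self-adjoint) blur operator
    x = y.astype(float).copy()
    alpha, beta = 1.0, 1.0                          # TV-prior and noise precisions
    for _ in range(iters):
        # Forward differences and a smoothed TV magnitude.
        dx = np.diff(x, axis=1, append=x[:, -1:])
        dy = np.diff(x, axis=0, append=x[-1:, :])
        mag = np.sqrt(dx**2 + dy**2 + eps)
        # Divergence of the normalized gradient field (adjoint of the differences).
        div = (np.diff(dx / mag, axis=1, prepend=0.0)
               + np.diff(dy / mag, axis=0, prepend=0.0))
        # Gradient of the negative log-posterior: data term + TV prior.
        grad = beta * H(H(x) - y) - alpha * div
        x -= grad / (beta + 8.0 * alpha / np.sqrt(eps))   # conservative step size
        # Crude point estimates of the hyperparameters from the current x,
        # standing in for the variational-Bayes expectations used in the paper.
        beta = y.size / (np.sum((H(x) - y) ** 2) + 1e-12)
        alpha = y.size / (2.0 * np.sum(mag))
    return x

# Example usage with synthetic data.
rng = np.random.default_rng(0)
clean = np.zeros((64, 64)); clean[16:48, 16:48] = 1.0
observed = gaussian_filter(clean, 1.5) + 0.02 * rng.standard_normal(clean.shape)
restored = tv_restore(observed)
```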

Original language: English
Title of host publication: 2010 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2010 - Proceedings
Publisher: IEEE
Pages: 770-773
Number of pages: 4
ISBN (Print): 9781424442966
DOIs
Publication status: Published - 2010
Event: 2010 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2010 - Dallas, TX, United States
Duration: 14 Mar 2010 - 19 Mar 2010

Publication series

Name: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
ISSN (Print): 1520-6149

Conference

Conference: 2010 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2010
Country/Territory: United States
City: Dallas, TX
Period: 14/03/10 - 19/03/10

Scopus Subject Areas

  • Software
  • Signal Processing
  • Electrical and Electronic Engineering

User-Defined Keywords

  • Bayesian methods
  • Image restoration
  • Parameter estimation
  • Total variation
  • Variational methods
