Multiplicative Noise and Blur Removal by Framelet Decomposition and l1-Based L-Curve Method

Fan Wang, Xi-Le Zhao, Michael K. Ng*

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

25 Citations (Scopus)
37 Downloads (Pure)


This paper proposes a framelet-based convex optimization model for the multiplicative noise and blur removal problem. The main idea is to employ a framelet expansion to represent the original image and to use variable decomposition to solve the problem. Because of the nature of multiplicative noise, we decompose the observed data into an original image variable and a noise variable to obtain the resulting model. The original image variable is represented by framelets, and it is determined by using the $l_{1}$-norm in the selection and shrinkage of framelet coefficients. The noise variable is measured by the mean and the variance of the underlying probability distribution. This framelet setting can be applied to analysis, synthesis, and balanced approaches, and the resulting optimization models are convex, so that they can be solved very efficiently by the alternating direction method of multipliers (ADMM). Another contribution of this paper is to propose selecting the regularization parameter by an $l_{1}$-based L-curve method for these framelet-based models. Numerical examples are presented to illustrate the effectiveness of these models and to show that the performance of the proposed method is better than that of existing methods.
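The $l_{1}$-based selection and shrinkage of framelet coefficients described in the abstract is, inside an ADMM iteration, typically realized by the soft-thresholding (proximal) operator. The following is a minimal sketch of that step only, not the authors' full algorithm; the function name and threshold parameter `tau` are illustrative assumptions.

```python
import numpy as np

def soft_threshold(coeffs, tau):
    """Proximal operator of the l1 norm (illustrative sketch).

    Shrinks framelet coefficients toward zero by the threshold tau,
    which plays the role of the regularization parameter that the
    paper proposes to select via an l1-based L-curve method.
    """
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - tau, 0.0)

# Example: small coefficients are zeroed, large ones are shrunk by tau.
c = np.array([3.0, -0.5, 1.2, -2.0])
print(soft_threshold(c, 1.0))  # -> [ 2.  -0.   0.2 -1. ]
```

In an ADMM splitting scheme, this shrinkage would be applied to the framelet-coefficient subproblem at each iteration, while the data-fidelity and noise-variable subproblems are handled separately.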

Original language: English
Article number: 7497475
Pages (from-to): 4222-4232
Number of pages: 11
Journal: IEEE Transactions on Image Processing
Issue number: 9
Publication status: Published - Sept 2016

Scopus Subject Areas

  • Software
  • Computer Graphics and Computer-Aided Design

User-Defined Keywords

  • blur removal
  • convex optimization
  • framelet
  • multiplicative noise
  • sparsity
