FSIM: A Feature SIMilarity Index for Image Quality Assessment

Lin Zhang, Lei Zhang, Xuanqin Mou and David Zhang


Introduction

Image quality assessment (IQA) aims to use computational models to measure image quality consistently with subjective evaluations. The well-known structural-similarity (SSIM) index brings IQA from the pixel-based stage to the structure-based stage. In this work, a novel feature-similarity (FSIM) index for full-reference IQA is proposed, based on the fact that the human visual system (HVS) understands an image mainly according to its low-level features. Specifically, the phase congruency (PC), which is a dimensionless measure of the significance of a local structure, is used as the primary feature in FSIM. Considering that PC is contrast invariant while the contrast information does affect the HVS's perception of image quality, the image gradient magnitude (GM) is employed as the secondary feature in FSIM. PC and GM play complementary roles in characterizing the image local quality. After obtaining the local similarity map, we use PC again as a weighting function to derive a single quality score. Extensive experiments performed on six benchmark IQA databases demonstrate that FSIM can achieve much higher consistency with the subjective evaluations than all the state-of-the-art IQA metrics used in comparison. Although FSIM is designed for grayscale images (or the luminance components of color images), the chrominance information can be easily incorporated by means of a simple extension of FSIM, and we call this extension FSIMC.


Algorithm

The visual information in an image is often very redundant, while the HVS understands an image mainly based on its low-level features. In other words, the salient low-level features convey crucial information for the HVS to interpret the scene. Accordingly, perceptible image degradations will lead to perceptible changes in image low-level features, and hence a good IQA metric could be devised by comparing the low-level feature sets between the reference image and the distorted image.

Based on the physiological and psychophysical evidence, it is found that visually discernible features coincide with those points where the Fourier waves at different frequencies have congruent phases. That is, at points of high phase congruency (PC) we can extract highly informative features. Therefore, PC is used as the primary feature in computing FSIM. Meanwhile, considering that PC is contrast invariant but image local contrast does affect the HVS's perception of image quality, the image gradient magnitude (GM) is computed as the secondary feature to encode contrast information. PC and GM are complementary, and they reflect different aspects of the HVS in assessing the local quality of the input image. After computing the local similarity map, PC is utilized again as a weighting function to derive a single similarity score. Although FSIM is designed for grayscale images (or the luminance components of color images), the chrominance information can be easily incorporated by means of a simple extension of FSIM, and we call this extension FSIMC.

For the computation of phase congruency, readers can refer to the seminal work of Dr. Peter Kovesi. For the gradient magnitude, we use the Scharr operator. Suppose that we are going to calculate the similarity between images f1 and f2. Denote by PC1 and PC2 the PC maps extracted from f1 and f2, and by G1 and G2 the GM maps extracted from them. It should be noted that for color images, the PC and GM features are extracted from their luminance channels. FSIM will be defined and computed based on PC1, PC2, G1 and G2. Furthermore, by incorporating the image chrominance information into FSIM, an IQA index for color images, denoted by FSIMC, will be obtained.
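
As a rough illustration of the GM step, the sketch below convolves a luminance image with 3x3 Scharr kernels; the variable names and the 1/16 normalization are our own assumptions, and the released FeatureSIM.m remains the authoritative implementation.

% Sketch: gradient magnitude with 3x3 Scharr kernels (illustrative only).
% 'lum' is assumed to be a double-valued luminance image in the 0-255 range.
scharrX = [3 0 -3; 10 0 -10; 3 0 -3] / 16;   % horizontal derivative kernel
scharrY = scharrX';                          % vertical derivative kernel
Gx = conv2(lum, scharrX, 'same');            % horizontal gradient
Gy = conv2(lum, scharrY, 'same');            % vertical gradient
G  = sqrt(Gx.^2 + Gy.^2);                    % gradient magnitude map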

The computation of the FSIM index consists of two stages. In the first stage, the local similarity map is computed; in the second stage, the similarity map is pooled into a single similarity score.
We separate the feature similarity measurement between f1(x) and f2(x) into two components, one for PC and one for GM. First, the similarity measure for PC1(x) and PC2(x) is defined as

SPC(x) = [2·PC1(x)·PC2(x) + T1] / [PC1(x)^2 + PC2(x)^2 + T1]
where T1 is a positive constant introduced to increase the stability of SPC. Similarly, the GM values G1(x) and G2(x) are compared, and the similarity measure is defined as

SG(x) = [2·G1(x)·G2(x) + T2] / [G1(x)^2 + G2(x)^2 + T2]
where T2 is a positive constant depending on the dynamic range of GM values. In our experiments, both T1 and T2 are fixed for all databases so that the proposed FSIM can be conveniently used. Then, SPC(x) and SG(x) are combined to get the local similarity SL(x) of f1(x) and f2(x), defined as SL(x) = SPC(x)·SG(x).
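
In MATLAB, the per-pixel similarity maps take only a few lines. The sketch below assumes that the maps PC1, PC2, G1 and G2 have already been computed, and uses T1 = 0.85 and T2 = 160, the values reported in [1]; the variable names are our own.

% Sketch: per-pixel similarity maps (PC1, PC2, G1, G2 assumed precomputed).
T1 = 0.85;                                               % PC stability constant (value from [1])
T2 = 160;                                                % GM stability constant (value from [1])
S_PC = (2 * PC1 .* PC2 + T1) ./ (PC1.^2 + PC2.^2 + T1);  % phase congruency similarity
S_G  = (2 * G1 .* G2 + T2) ./ (G1.^2 + G2.^2 + T2);      % gradient magnitude similarity
S_L  = S_PC .* S_G;                                      % combined local similarity map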

Having obtained the similarity SL(x) at each location x, the overall similarity between f1 and f2 can be calculated. However, different locations contribute differently to the HVS's perception of the image. For example, edge locations convey more crucial visual information than locations within a smooth area. Since the human visual cortex is sensitive to phase-congruent structures, the PC value at a location reflects how likely it is to be a perceptibly significant structure point. Intuitively, for a given location x, if either f1(x) or f2(x) has a significant PC value, this position x will have a high impact on the HVS in evaluating the similarity between f1 and f2. Therefore, we use PCm(x) = max(PC1(x), PC2(x)) to weight the importance of SL(x) in the overall similarity between f1 and f2, and accordingly the FSIM index between f1 and f2 is defined as

FSIM = [Σx SL(x)·PCm(x)] / [Σx PCm(x)]

where the summations are taken over the whole image spatial domain.
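
A minimal sketch of this pooling step, reusing S_L, PC1 and PC2 from the previous snippet:

% Sketch: pool the local similarity map into a single FSIM score.
PCm  = max(PC1, PC2);                          % per-pixel weight: the larger of the two PC values
FSIM = sum(S_L(:) .* PCm(:)) / sum(PCm(:));    % PC-weighted average over all locations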

The FSIM index is designed for grayscale images or the luminance components of color images. Since the chrominance information also affects the HVS in understanding the images, better performance can be expected if the chrominance information is incorporated in FSIM for color IQA. Such a goal can be achieved by applying a straightforward extension to the FSIM framework. Please refer to our paper for more details. The procedures to calculate the FSIM/FSIMC indices are illustrated in the following figure. If the chromatic information is ignored, the FSIMC index reduces to the FSIM index.
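
To give a flavor of the extension: the color image is converted to the YIQ space and the chrominance channels are compared with the same type of similarity measure. The sketch below uses T3 = T4 = 200 and lambda = 0.03 as reported in [1], with I1, I2, Q1, Q2 denoting the I and Q channels of the two images, and reuses S_L and PCm from the earlier snippets; please refer to the paper for the exact formulation.

% Sketch of the chrominance extension (see [1] for the exact formulation).
T3 = 200;  T4 = 200;  lambda = 0.03;                  % constants reported in [1]
S_I = (2 * I1 .* I2 + T3) ./ (I1.^2 + I2.^2 + T3);    % I-channel similarity
S_Q = (2 * Q1 .* Q2 + T4) ./ (Q1.^2 + Q2.^2 + T4);    % Q-channel similarity
chromaTerm = real((S_I .* S_Q) .^ lambda);            % real() guards against complex values
FSIMc = sum(sum(S_L .* chromaTerm .* PCm)) / sum(sum(PCm));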


Source Code

The source code to compute the proposed FSIM/FSIMC can be downloaded here: FeatureSIM.m.

Usage:

% Given two test images img1 and img2. For gray-scale images, their dynamic range should be 0-255.
% For color images, the dynamic range of each color channel should be 0-255.
[FSIM, FSIMc] = FeatureSIM(img1, img2);

Note: FSIM compares two images based on their luminance components only, while FSIMC also considers the chromatic information in addition to the luminance.
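
A complete call might look like the following; the file names are placeholders for any pair of equally sized images.

% Example: compute FSIM/FSIMc for two hypothetical image files of identical size.
img1 = imread('reference.bmp');    % reference image
img2 = imread('distorted.bmp');    % distorted image
[fsim, fsimc] = FeatureSIM(img1, img2);
fprintf('FSIM = %.4f, FSIMc = %.4f\n', fsim, fsimc);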


Evaluation Results

The FSIM/FSIMC values are computed (using FeatureSIM.m) for eight publicly available IQA databases: TID2013, TID2008, CSIQ, LIVE, IVC, Toyama-MICT, Cornell A57, and WIQ. The results (in MATLAB .mat format) are provided here, together with performance evaluations based on the Spearman rank order correlation coefficient (SROCC) and the Kendall rank order correlation coefficient (KROCC), for future comparisons. Each result file contains an n-by-3 matrix, where n denotes the number of distorted images in the database. The first column contains the FSIM values, the second column the FSIMC values, and the third column the mos/dmos values provided by the database. For example, you can use the following MATLAB code to calculate the SROCC and KROCC values for the FSIM and FSIMC results obtained on the TID2008 database:

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

matData = load('FSIMOnTID2008.mat');      % load the provided result file
FSIMOnTID2008 = matData.FSIMOnTID2008;    % n-by-3 matrix: [FSIM, FSIMc, mos]
FSIM_TID_SROCC = corr(FSIMOnTID2008(:,1), FSIMOnTID2008(:,3), 'type', 'spearman');
FSIM_TID_KROCC = corr(FSIMOnTID2008(:,1), FSIMOnTID2008(:,3), 'type', 'kendall');
FSIMc_TID_SROCC = corr(FSIMOnTID2008(:,2), FSIMOnTID2008(:,3), 'type', 'spearman');
FSIMc_TID_KROCC = corr(FSIMOnTID2008(:,2), FSIMOnTID2008(:,3), 'type', 'kendall');

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

The source code to calculate the PLCC (Pearson linear correlation coefficient) and RMSE (root mean squared error) is also provided for each database. This requires a nonlinear regression procedure, which is dependent on the initialization of the parameters. We try to adjust the parameters to obtain a high PLCC value; for different databases, the parameter initialization may be different. The nonlinear fitting function takes the form described in [2].
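
For reference, the five-parameter logistic mapping of [2] can be fitted with MATLAB's nlinfit (Statistics Toolbox). The sketch below is only one plausible setup; the initialization beta0 is our own assumption and, as noted above, may need adjusting per database. Here obj and mos are column vectors of objective scores (e.g., FSIM values) and subjective scores.

% Sketch: 5-parameter logistic mapping from [2], fitted with nlinfit.
logistic = @(b, x) b(1) .* (0.5 - 1 ./ (1 + exp(b(2) .* (x - b(3))))) + b(4) .* x + b(5);
beta0 = [max(mos) - min(mos), 1, mean(obj), 1, mean(mos)];  % one plausible initialization
beta  = nlinfit(obj, mos, logistic, beta0);                 % nonlinear regression
pred  = logistic(beta, obj);                                % objective scores after mapping
PLCC  = corr(pred, mos, 'type', 'pearson');                 % Pearson linear correlation
RMSE  = sqrt(mean((pred - mos).^2));                        % root mean squared error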

Evaluation results of FSIM/FSIMC on the eight databases are given below. In addition, for each evaluation metric we present its weighted-average value over all the testing datasets, where the weight for each database is set to the number of distorted images in that dataset.
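
A minimal sketch of how such a weighted average is computed, assuming hypothetical vectors sroccPerDB and numImages with one entry per database:

% Sketch: weighted average of a metric over databases, weighted by database size.
weightedSROCC = sum(sroccPerDB .* numImages) / sum(numImages);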

Database          Results             FSIM                                    FSIMC
                                      SROCC    KROCC    PLCC     RMSE         SROCC    KROCC    PLCC     RMSE
TID2013           FSIMOnTID2013       0.8015   0.6289   0.8589   0.6349       0.8510   0.6665   0.8769   0.5959
TID2008           FSIMOnTID2008       0.8805   0.6946   0.8738   0.6525       0.8840   0.6991   0.8762   0.6468
CSIQ              FSIMOnCSIQ          0.9242   0.7567   0.9120   0.1077       0.9310   0.7690   0.9192   0.1034
LIVE              FSIMOnLIVE          0.9634   0.8337   0.9597   7.6780       0.9645   0.8363   0.9613   7.5296
IVC               FSIMOnIVC           0.9262   0.7564   0.9376   0.4236       0.9293   0.7636   0.9392   0.4183
Toyama-MICT       FSIMOnMICT          0.9059   0.7302   0.9078   0.5248       0.9067   0.7303   0.9075   0.5257
A57               FSIMOnA57           0.9181   0.7639   0.9393   0.0844       0.9181   0.7639   0.9393   0.0844
WIQ               FSIMOnWIQ           0.8006   0.6215   0.8546   11.8949      0.8006   0.6215   0.8546   11.8949
Weighted-Average

Note: since the images in A57 and WIQ are gray-scale, FSIMC produces exactly the same results as FSIM.


References

[1] Lin Zhang, Lei Zhang, Xuanqin Mou, and David Zhang, "FSIM: A feature similarity index for image quality assessment", IEEE Transactions on Image Processing, vol. 20, no. 8, pp. 2378-2386, 2011.

[2] H.R. Sheikh, M.F. Sabir, and A.C. Bovik, "A statistical evaluation of recent full reference image quality assessment algorithms", IEEE Transactions on Image Processing, vol. 15, no. 11, pp. 3440-3451, 2006.


Created on: Oct. 24, 2010

Last update: Dec. 10, 2013