Abstract:

In this paper, we propose a nonlocal low-rank regularization (NLR) approach toward exploiting structured sparsity and explore its application to the CS recovery of both photographic and MRI images.

We also propose the use of the nonconvex log det(X) as a smooth surrogate function for the rank, instead of the convex nuclear norm.

I. Introduction:

In this paper, we propose a unified variational framework for nonlocal low-rank regularization of CS recovery.

   To exploit the nonlocal sparsity of natural or medical images, we propose to regularize the CS recovery by patch grouping and low-rank approximation. 

Specifically, for each exemplar image patch we group a set of similar image patches to form a data matrix X. Since all patches in the group contain similar structures, this data matrix has low rank, which implies a useful image prior. To solve the resulting rank-minimization problem more efficiently, we propose to use log det(X) as a smooth surrogate function for the rank (instead of the convex nuclear norm), which lends itself to iterative singular-value thresholding.

II. BACKGROUND

III. NONLOCAL LOW-RANK REGULARIZATION FOR CS RECOVERY

The proposed regularization model consists of two components: patch grouping for characterizing the self-similarity of a signal, and low-rank approximation for sparsity enforcement.

The basic assumption of the proposed method is that self-similarity is abundant in the signals we work with. A large number of similar patches of size √n × √n can be found; the exemplar patch at position i is denoted x_i ∈ R^n. For each exemplar patch x_i, a K-nearest-neighbor search within a local window yields

          G_i = { i_j : ||x_i − x_{i_j}||_2^2 < T }

Here T is a preset threshold and G_i is the set of positions of the similar patches. For each exemplar patch x_i we can then form a data matrix X_i = [x_{i_0}, x_{i_1}, …, x_{i_{m−1}}] ∈ R^{n×m}.

Each column of X_i is a patch similar to x_i.
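A minimal NumPy sketch of this patch-grouping step (the brute-force window search, the function name, and the default parameter values are my own illustration, not the authors' code or settings):

```python
import numpy as np

def group_similar_patches(img, i_row, i_col, patch=6, window=20, m=45, T=None):
    """For the exemplar patch at (i_row, i_col), search a local window and return
    the n x m data matrix X_i whose columns are the m most similar patches."""
    H, W = img.shape
    x_i = img[i_row:i_row + patch, i_col:i_col + patch].ravel()

    candidates = []
    r0, r1 = max(0, i_row - window), min(H - patch, i_row + window)
    c0, c1 = max(0, i_col - window), min(W - patch, i_col + window)
    for r in range(r0, r1 + 1):
        for c in range(c0, c1 + 1):
            p = img[r:r + patch, c:c + patch].ravel()
            d = np.sum((x_i - p) ** 2)
            if T is None or d < T:            # G_i = { j : ||x_i - x_j||^2 < T }
                candidates.append((d, p))

    candidates.sort(key=lambda t: t[0])       # K-nearest-neighbor selection
    X_i = np.stack([p for _, p in candidates[:m]], axis=1)   # n x m data matrix
    return X_i
```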

Because noise is present, the data matrix is modeled as X_i = L_i + W_i, where L_i is the low-rank matrix and W_i is the Gaussian noise matrix.

L_i can then be recovered by solving the following optimization problem:

          L_i = argmin_{L_i} rank(L_i),   s.t.  ||X_i − L_i||_F^2 ≤ σ_w^2

||·||_F is the Frobenius norm, defined below, and σ_w^2 is the variance of the additive Gaussian noise.

          ||A||_F = sqrt( tr(A^* A) ) = sqrt( Σ_i σ_i^2 )

A^* denotes the conjugate transpose of A, σ_i are the singular values of A, and tr(·) is the trace function.
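A quick numeric check of this definition (illustrative only; any random complex matrix will do):

```python
import numpy as np

A = np.random.randn(5, 3) + 1j * np.random.randn(5, 3)
fro_direct = np.sqrt(np.sum(np.abs(A) ** 2))                          # sqrt of sum |a_ij|^2
fro_trace  = np.sqrt(np.trace(A.conj().T @ A).real)                   # sqrt(tr(A^* A))
fro_sv     = np.sqrt(np.sum(np.linalg.svd(A, compute_uv=False) ** 2)) # sqrt(sum sigma_i^2)
print(fro_direct, fro_trace, fro_sv)                                  # all three agree
```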

For tractability, the rank is usually replaced by the nuclear norm ||L_i||_*; with the nuclear norm, the rank-minimization problem can be solved by singular value thresholding (SVT). In this paper, a smooth but nonconvex surrogate of the rank is used instead of the nuclear norm. For a symmetric positive semidefinite matrix X ∈ R^{n×n} with eigenvalues λ_i, it is defined as

          E(X, ε) = Σ_{i=1}^{n} log(λ_i + ε) = log det(X + εI)          (6)

The function E(X, ε) approximates the sum of the logarithms of the singular values (up to the small offset ε). Fig. 1 compares, for the scalar case, the nonconvex surrogate, the rank, and the nuclear norm.

          [Fig. 1: comparison of the rank, the nuclear norm |x|, and the surrogate log(x + ε) for a scalar x]

It can be seen that log(x + ε) approximates the rank much better than the nuclear norm does.

For a general matrix L_i ∈ C^{n×m}, following Eq. (6) we obtain

          L(L_i, ε) = E( (L_i L_i^*)^{1/2}, ε ) = log det( (L_i L_i^*)^{1/2} + εI )          (7)

Here (L_i L_i^*)^{1/2} has eigenvalues equal to the singular values of L_i: writing the SVD L_i = U Σ V^*, Σ is a diagonal matrix whose diagonal entries are the singular values σ_{i,j} of L_i, and these are exactly the eigenvalues of (L_i L_i^*)^{1/2}. Setting X = (L_i L_i^*)^{1/2}, we therefore get L(L_i, ε) = Σ_j log(σ_{i,j} + ε), a log det surrogate for the rank of L_i.
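A small sketch of evaluating this surrogate through the SVD (my own illustration, not the authors' code):

```python
import numpy as np

def logdet_surrogate(L, eps=1e-1):
    """Smooth nonconvex surrogate of rank(L): sum_j log(sigma_j + eps),
    i.e. log det((L L^*)^{1/2} + eps * I)."""
    sigma = np.linalg.svd(L, compute_uv=False)
    return np.sum(np.log(sigma + eps))

# A rank-2 matrix typically scores much lower than a full-rank matrix of the same size.
L_lowrank = np.outer(np.arange(8.0), np.ones(6)) + np.outer(np.ones(8), np.arange(6.0))
L_full = np.random.randn(8, 6)
print(logdet_surrogate(L_lowrank), logdet_surrogate(L_full))
```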

To solve for L_i, the following low-rank approximation problem is proposed:

          L_i = argmin_{L_i} L(L_i, ε),   s.t.  ||X_i − L_i||_F^2 ≤ σ_w^2          (8)

In Lagrangian (unconstrained) form this becomes

          L_i = argmin_{L_i} ||X_i − L_i||_F^2 + λ L(L_i, ε)          (9)

For each exemplar image patch, we can thus approximate the data matrix X_i by a low-rank matrix L_i by solving Eq. (9).

How is this patch-based low-rank regularization model used for CS image recovery? The basic idea is to enforce the low-rank property on the group of nonlocal similar patches extracted for each exemplar patch, subject to the constraint imposed by the linear measurements. With the proposed low-rank regularization term, the following global objective function for CS recovery is proposed:

          (x, {L_i}) = argmin_{x, {L_i}} ||y − Φx||_2^2 + η Σ_i ( ||R̃_i x − L_i||_F^2 + λ L(L_i, ε) )          (10)

R̃_i x = [R_{i_0} x, R_{i_1} x, …, R_{i_{m−1}} x] denotes the matrix whose columns are the patches similar to the exemplar patch x_i, i.e. the operator R̃_i extracts the whole group of similar patches from the image x; the term ||R̃_i x − L_i||_F^2 measures how far this patch group is from its low-rank approximation L_i.

IV. OPTIMIZATION ALGORITHM FOR CS IMAGE RECOVERY

A. Low-Rank Matrix Optimization via Iterative Singular Value Thresholding

The low-rank matrix L_i is obtained by solving

          L_i = argmin_{L_i} ||X_i − L_i||_F^2 + λ L(L_i, ε),   with X_i = R̃_i x          (11)

In terms of the singular values σ_j of L_i, Eq. (11) can be written as

          L_i = argmin_{L_i} ||X_i − L_i||_F^2 + λ Σ_j log(σ_j + ε)          (12)

Because the log-sum term is nonconvex, only a local optimum can be obtained. Let σ_j^{(k)} denote the singular values of L_i^{(k)}, the solution at the k-th iteration; a first-order Taylor expansion of Σ_j log(σ_j + ε) around σ^{(k)} gives

          Σ_j log(σ_j + ε) ≈ Σ_j log(σ_j^{(k)} + ε) + Σ_j (σ_j − σ_j^{(k)}) / (σ_j^{(k)} + ε)          (13)

Plugging Eq. (13) into Eq. (12), the problem can be solved through the iteration

          L_i^{(k+1)} = argmin_{L_i} ||X_i − L_i||_F^2 + λ Σ_j σ_j / (σ_j^{(k)} + ε)          (14)

where σ_j is used directly and the constant terms of Eq. (13) have been dropped. This can also be written as

          L_i^{(k+1)} = argmin_{L_i} ||X_i − L_i||_F^2 + Σ_j w_j σ_j          (15)

Σ_j w_j σ_j = ||L_i||_{w,*} denotes the weighted nuclear norm with weights w_j = λ / (σ_j^{(k)} + ε); since the singular values σ_j are sorted in descending order, the weights are in ascending order.
It is known that, in the real-matrix case, the weighted nuclear norm is convex only when the weights are in descending order, in which case the optimal solution of (15) is given by the weighted singular value thresholding operator, i.e. its proximal operator. In our case the weights are ascending, so (15) is not convex and we cannot expect to find its global minimizer; in addition, we are dealing with complex matrices. Nevertheless, it can still be shown that weighted singular value thresholding yields a (possibly local) minimizer of (15) (this is the update referred to below as Eq. (18)):

          L_i^{(k+1)} = U S_w(Σ) V^*,   where X_i = U Σ V^* is the SVD of X_i and S_w(Σ)_{jj} = max(Σ_{jj} − w_j/2, 0)
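A compact sketch of this iterative, reweighted singular-value thresholding (the w_j/2 threshold follows from the objective exactly as written in (15); this is my reading, not the authors' released code):

```python
import numpy as np

def low_rank_approx(X, lam=0.5, eps=1e-1, n_iter=5):
    """Approximately solve  min_L ||X - L||_F^2 + lam * sum_j log(sigma_j(L) + eps)
    by the reweighted scheme of Eqs. (11)-(15): reweight, then soft-threshold."""
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    sigma = s.copy()                          # initialize sigma^{(0)} from X itself
    for _ in range(n_iter):
        w = lam / (sigma + eps)               # weights from the previous iterate
        sigma = np.maximum(s - w / 2.0, 0.0)  # weighted soft-thresholding of sigma(X)
    return (U * sigma) @ Vh                   # L = U diag(sigma) V^*
```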

B. Image Recovery via the Alternating Direction Method of Multipliers (ADMM)

After solving for each L_i, we can reconstruct the whole image by solving the following minimization problem:

          x = argmin_x ||y − Φx||_2^2 + η Σ_i ||R̃_i x − L_i||_F^2          (19)

Eq. (19) is a quadratic optimization problem with the closed-form solution

          x = ( Φ^H Φ + η Σ_i R̃_i^H R̃_i )^{-1} ( Φ^H y + η Σ_i R̃_i^H L_i )          (20)

where (·)^H denotes the conjugate transpose.

In Eq. (20), the matrix to be inverted is large: it is N × N, where N is the number of image pixels. Therefore, solving Eq. (20) directly is infeasible. In practice, Eq. (20) can be computed with the conjugate gradient (CG) algorithm.
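For a general measurement operator Φ this system can be solved matrix-free with a few CG iterations. A sketch using SciPy's cg with a LinearOperator (the callables Phi/PhiH and the pixel-wise patch_count vector are assumed inputs for illustration):

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def solve_eq20(Phi, PhiH, patch_count, rhs, eta, N, maxiter=100):
    """Solve (Phi^H Phi + eta * sum_i R_i^H R_i) x = rhs by conjugate gradients.
    `patch_count` is the diagonal of sum_i R_i^H R_i (number of patches covering
    each pixel); `Phi`/`PhiH` apply the measurement operator and its adjoint;
    `rhs` is the right-hand side of Eq. (20)."""
    def matvec(x):
        return PhiH(Phi(x)) + eta * patch_count * x
    A = LinearOperator((N, N), matvec=matvec, dtype=rhs.dtype)
    x, _info = cg(A, rhs, maxiter=maxiter)
    return x
```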

By applying the ADMM (alternating direction method of multipliers) to Eq. (19), the problem can be split into two subproblems:

          (x, z) = argmin_{x, z} ||y − Φx||_2^2 + η Σ_i ||R̃_i z − L_i||_F^2 + μ^T (x − z) + β ||x − z||_2^2          (21)

z is an auxiliary variable, μ is a Lagrange multiplier, and β is a positive scalar.

The optimization of Eq. (21) is carried out through the following iterative process:

          z^{(t+1)} = argmin_z η Σ_i ||R̃_i z − L_i||_F^2 + μ^{(t)T} (x^{(t)} − z) + β ||x^{(t)} − z||_2^2
          x^{(t+1)} = argmin_x ||y − Φx||_2^2 + μ^{(t)T} (x − z^{(t+1)}) + β ||x − z^{(t+1)}||_2^2
          μ^{(t+1)} = μ^{(t)} + β ( x^{(t+1)} − z^{(t+1)} )          (22)

With x fixed, the data term ||y − Φx||_2^2 is a constant, and the z-subproblem has a closed-form solution:

          z^{(t+1)} = ( η Σ_i R̃_i^T R̃_i + βI )^{-1} ( η Σ_i R̃_i^T L_i + β x^{(t)} + μ^{(t)}/2 )          (23)

Σ_i R̃_i^T R̃_i is a diagonal matrix: each diagonal element corresponds to an image pixel position, and its value is the number of overlapping patches covering that position. Correspondingly, Σ_i R̃_i^T L_i puts every column of every L_i back at its pixel positions and sums the contributions, so the z-update essentially averages the patch-wise low-rank estimates at each pixel (see the sketch below).
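Since both matrices in Eq. (23) are diagonal, the z-update reduces to element-wise operations: put the columns of every L_i back, accumulate, and divide. A sketch (the patch-position bookkeeping is an assumption of this illustration):

```python
import numpy as np

def z_update(L_list, positions_list, x, mu, eta, beta, patch=6):
    """Closed-form z-subproblem, Eq. (23):
    z = (eta * sum_i R_i^T R_i + beta*I)^{-1} (eta * sum_i R_i^T L_i + beta*x + mu/2)."""
    acc = np.zeros_like(x)          # sum of patch estimates put back at their pixels
    cnt = np.zeros_like(x)          # number of patches covering each pixel
    for L_i, positions in zip(L_list, positions_list):
        for col, (r, c) in enumerate(positions):
            acc[r:r + patch, c:c + patch] += L_i[:, col].reshape(patch, patch)
            cnt[r:r + patch, c:c + patch] += 1
    return (eta * acc + beta * x + mu / 2) / (eta * cnt + beta)
```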
With z fixed, the x-subproblem is solved by computing

          x^{(t+1)} = ( Φ^H Φ + βI )^{-1} ( Φ^H y + β z^{(t+1)} − μ^{(t)}/2 )          (24)

Here Φ = DF is a partial Fourier transform matrix, where D and F denote the downsampling matrix and the Fourier transform matrix, respectively. Eq. (24) is easy to solve by moving from image space to Fourier space: substituting Φ = DF into Eq. (24) and applying the Fourier transform to both sides of the equation gives

          F x^{(t+1)} = ( F Φ^H Φ F^H + βI )^{-1} ( F Φ^H y + F ( β z^{(t+1)} − μ^{(t)}/2 ) )

Using F F^H = I and F Φ^H Φ F^H = D^H D, the above can be simplified to

          F x^{(t+1)} = ( D^H D + βI )^{-1} ( D^H y + F ( β z^{(t+1)} − μ^{(t)}/2 ) )

The matrix to be inverted is now diagonal: D^H D is a 0/1 diagonal matrix with ones at the sampled frequencies, so (D^H D + βI) is diagonal and its inversion is just an element-wise division.

x^{(t+1)} is then obtained by applying the inverse Fourier transform to the right-hand side:

          x^{(t+1)} = F^{-1} [ ( D^H D + βI )^{-1} ( D^H y + F ( β z^{(t+1)} − μ^{(t)}/2 ) ) ]
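For partial Fourier sampling this whole x-update is a per-frequency division. A sketch, assuming an orthonormal 2-D FFT, a 0/1 sampling mask (the diagonal of D^H D), and y_zero_filled = D^H y already placed on the Fourier grid:

```python
import numpy as np

def x_update(z, mu, y_zero_filled, mask, beta):
    """x = F^{-1}[ (D^H y + F(beta*z - mu/2)) / (D^H D + beta*I) ], i.e. Eq. (24) in Fourier space."""
    numer = y_zero_filled + np.fft.fft2(beta * z - mu / 2, norm="ortho")
    return np.fft.ifft2(numer / (mask + beta), norm="ortho").real   # real-valued image assumed
```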

 With updated x and z, μ and β can be readily computed according to Eq.(22)

After obtaining an improved estimate of the unknown image, the low-rank matrices Li can be updated by Eq.(18).

The updated Li is then used to improve the estimate of x by solving Eq.(19). 

The iterations continue until convergence. The whole procedure is summarized below:

[Algorithm 1 of the paper: the complete NLR-CS recovery procedure, alternating between (i) patch grouping and weighted singular-value thresholding to update the low-rank matrices L_i, and (ii) the ADMM updates of x, z and μ.]
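As a high-level sketch, the loop can be organized as below, reusing the helper functions sketched earlier in these notes (here group_similar_patches_with_positions is an assumed variant that also returns the patch positions; all parameter values are placeholders, not the paper's settings):

```python
import numpy as np

def nlr_cs_recover(y_zero_filled, mask, exemplar_positions, x0,
                   eta=0.5, beta=0.01, n_outer=10, n_inner=5):
    """Illustrative NLR-CS recovery loop built from the earlier sketches
    (group_similar_patches_with_positions, low_rank_approx, z_update, x_update)."""
    x = x0.copy()                       # e.g. a zero-filled or DCT-based initial estimate
    z, mu = x.copy(), np.zeros_like(x)
    for _ in range(n_outer):
        # (i) patch grouping + weighted SVT: update every low-rank matrix L_i
        L_list, positions_list = [], []
        for (r, c) in exemplar_positions:
            X_i, pos = group_similar_patches_with_positions(x, r, c)
            L_list.append(low_rank_approx(X_i))
            positions_list.append(pos)
        # (ii) ADMM inner loop: z, x and multiplier updates, Eq. (22)
        for _ in range(n_inner):
            z = z_update(L_list, positions_list, x, mu, eta, beta)
            x = x_update(z, mu, y_zero_filled, mask, beta)
            mu = mu + beta * (x - z)
    return x
```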

Key concepts:

Frobenius norm: measures the size of a matrix

Nuclear norm of a matrix

Weighted nuclear norm

Weighted singular value thresholding

Conjugate gradient (CG) algorithm

Alternating direction method of multipliers (ADMM)

Questions:

(What is the relation between rank minimization and singular values?)

Take the SVD of a matrix and arrange all of its singular values into a vector; the sparsity of that vector corresponds to the low-rankness of the matrix, so low-rankness can be seen as the extension of sparsity from vectors to matrices. Matrix rank minimization refers to reconstructing a matrix by exploiting the low-rankness of the original data matrix, which involves minimizing the matrix rank function. Low-rank matrix recovery refers to recovering the data matrix by exploiting both the low-rankness of the original data matrix and the sparsity of the error matrix. When actually solving compressive sensing, matrix rank minimization, or low-rank matrix recovery problems, the original l0 norm and the matrix rank function are discontinuous and nonconvex, so they are usually replaced by the l1 norm and the matrix nuclear norm, respectively, which turns the original problems into convex optimization problems.
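A tiny numeric illustration of this correspondence (rank = number of nonzero singular values; nuclear norm = their sum):

```python
import numpy as np

A = np.outer([1., 2., 3.], [4., 5.]) + np.outer([0., 1., 1.], [1., -1.])   # a rank-2 matrix
s = np.linalg.svd(A, compute_uv=False)
print(np.linalg.matrix_rank(A))        # 2: the rank of A
print(np.count_nonzero(s > 1e-10))     # 2: sparsity of the singular-value vector
print(s.sum())                         # nuclear norm ||A||_*, the convex surrogate of rank
```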


References:

从压缩传感到低秩矩阵恢复: 理论与应用 (From Compressed Sensing to Low-Rank Matrix Recovery: Theory and Applications)


