Paper of the Week (Po'W): Tuning Iterative Reconstruction Algorithms Edition

Hello, and welcome to Paper of the Week (Po'W): Tuning Iterative Reconstruction Algorithms Edition. This week's paper is from my holiday reading list, and addresses the anonymous question on Nuit Blanche about the need for an approach to benchmarking: A. Maleki and D. L. Donoho, "Optimally tuned iterative reconstruction algorithms for compressed sensing," IEEE J. Selected Topics in Signal Process., vol. 4, pp. 330-341, Apr. 2010.

In the interest of extreme brevity, here is my one-line description of the work in this paper:
The authors produce complete recipes of tuned algorithms for out-of-the-box iterative signal reconstruction from compressive measurements.
The authors compare several iterative reconstruction algorithms (iterative hard and soft thresholding with relaxation, and two-stage thresholding, which includes CoSaMP and subspace pursuit) under various conditions of problem sparsity \(\rho := k/M\) (where \(k\) is the sparsity of the vector and \(M\) is the number of measurements), undersampling \(\delta := M/N\) (where \(N\) is the length of the sparse vector), and distributions of the sensing matrix and sparse vector amplitudes. A minimal sketch of this measurement setup appears just below, and the methods themselves are described after it.
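
To fix ideas, here is a small Python/NumPy sketch of the measurement setup. The dimensions, the iid Gaussian matrix, and the CARS coefficient distribution are illustrative choices of mine, not values prescribed by the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Problem dimensions (illustrative values, not from the paper):
N, M, k = 1000, 200, 40
delta = M / N   # undersampling: delta = M/N = 0.2
rho = k / M     # sparsity relative to measurements: rho = k/M = 0.2

# Gaussian sensing matrix with unit-norm columns.
Phi = rng.standard_normal((M, N))
Phi /= np.linalg.norm(Phi, axis=0)

# A "constant amplitude random sign" (CARS) sparse vector.
x = np.zeros(N)
support = rng.choice(N, size=k, replace=False)
x[support] = rng.choice([-1.0, 1.0], size=k)

# Noiseless compressive measurements.
y = Phi @ x
```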

Say we have sensed some vector \(\mathbf{x} \in \mathbb{R}^N\) by the sensing matrix \(\mathbf{\Phi} \in \mathbb{R}^{M\times N}\) with \(N > M\) and unit-norm columns. This produces the measurement \(\mathbf{y} = \mathbf{\Phi}\mathbf{x}\), from which we want to recover \(\mathbf{x}\).

Iterative thresholding with relaxation works as follows. At step \(n\) we have the \(n\)th-order solution \(\hat{\mathbf{x}}(n) \in \mathbb{R}^N\), producing the residual \(\mathbf{r}(n) = \mathbf{y} - \mathbf{\Phi}\hat{\mathbf{x}}(n)\). (Before the first step we set \(\hat{\mathbf{x}}(0) = \mathbf{0}\).) We update this solution by $$\hat{\mathbf{x}}(n+1) = T \left ( \hat{\mathbf{x}}(n) + \kappa \mathbf{\Phi}^T\mathbf{r}(n); t(n) \right )$$ where \(0 < \kappa < 1\) is a relaxation parameter, and \(T( \cdot; t)\) is a thresholding function with positive threshold \(t\). For "hard" thresholding, $$T(x; t) := \begin{cases} x, & |x| > t \\ 0, & \textrm{else} \end{cases}$$ and for "soft" thresholding $$T(x; t) := \begin{cases} \operatorname{sgn}(x)(|x| - t), & |x| > t \\ 0, & \textrm{else.} \end{cases}$$

The two-stage thresholding (TST) approach takes the iterative thresholding approach and after each step performs an orthogonal projection followed by a secondary thresholding step with a possibly different thresholding function \(G( \cdot; w)\). So, at step \(n\), given the matrix \(\mathbf{\Phi}(n)\) composed of those atoms of \(\mathbf{\Phi}\) that have non-zero weights in \(\hat{\mathbf{x}}(n)\) after the first thresholding step (as described above), we find the weights of the orthogonal projection $$\mathbf{s}(n) := \left [ \mathbf{\Phi}^T(n)\mathbf{\Phi}(n)\right ]^{-1} \mathbf{\Phi}^T(n)\mathbf{y}$$ and then update the solution by $$\hat{\mathbf{x}}(n+1) = G \left ( \mathbf{s}(n); w(n) \right )$$ where \(w(n)\) is a positive threshold, and the thresholded weights are placed back at the corresponding entries of the length-\(N\) solution.
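
To make both families concrete, here is a minimal Python/NumPy sketch (my own translation, not the authors' MATLAB code). The fixed threshold `t`, secondary threshold `w`, and relaxation `kappa` are placeholder scalars standing in for the iteration-dependent, tuned quantities in the paper:

```python
import numpy as np

def hard_threshold(x, t):
    # T(x; t): keep entries with |x_i| > t, zero the rest.
    return np.where(np.abs(x) > t, x, 0.0)

def soft_threshold(x, t):
    # T(x; t): shrink magnitudes toward zero by t.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def iterative_thresholding(Phi, y, threshold_fn, t, kappa=0.6, n_iters=300):
    """One-stage iterative thresholding with relaxation."""
    x_hat = np.zeros(Phi.shape[1])
    for _ in range(n_iters):
        r = y - Phi @ x_hat                              # residual
        x_hat = threshold_fn(x_hat + kappa * (Phi.T @ r), t)
    return x_hat

def two_stage_step(Phi, y, x_hat, threshold_fn, t, w, kappa=0.6):
    """One iteration of two-stage thresholding (TST).

    Stage one selects a support via a relaxed thresholded gradient step;
    stage two orthogonally projects y onto the selected atoms and applies
    a secondary threshold G(.; w) (here the same family as stage one).
    """
    r = y - Phi @ x_hat
    z = threshold_fn(x_hat + kappa * (Phi.T @ r), t)
    support = np.flatnonzero(z)
    # Least squares on the selected atoms (the orthogonal projection).
    s, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
    x_new = np.zeros_like(x_hat)
    x_new[support] = threshold_fn(s, w)
    return x_new
```

With the setup sketch above, `iterative_thresholding(Phi, y, soft_threshold, t=0.1)` runs plain iterative soft thresholding, and repeatedly applying `two_stage_step` from \(\hat{\mathbf{x}}(0) = \mathbf{0}\) gives a subspace-pursuit-flavored recursion.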

From their exhaustive experiments the authors develop "recommended versions" of these reconstructors for all sparse vector distributions by finding the greatest lower bound of the empirical phase transition curves, where a phase transition curve traces the tradeoff between sparsity and undersampling at which recovery typically succeeds. Their experiments also reveal that "constant amplitude random sign" (CARS) sparse signals have the worst recovery rates (a ha!); that different random sensing matrices behave nearly the same with respect to the phase transition; and that subspace pursuit is the winner. As a tip of their hat to reproducible research, the authors make their code available as MATLAB scripts at http://sparselab.stanford.edu/OptimalTuning/main.htm, which can be run out-of-the-box on a specific problem, e.g., with sensing matrices (iid Gaussian) or operators (partial Fourier ensembles), and various distributions of sparse vectors. A sketch of the kind of Monte Carlo measurement behind these empirical phase transitions follows.
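
For a sense of how such phase transitions are measured empirically, here is a hedged Monte Carlo sketch (my construction, not the authors' SparseLab code). It estimates the success rate at a single \((\delta, \rho)\) point; `solver` can be any reconstructor, e.g. the `iterative_thresholding` sketch above, and the trial count, tolerance, and CARS signal model are illustrative assumptions:

```python
import numpy as np

def empirical_success_rate(solver, N, delta, rho, n_trials=50,
                           tol=1e-2, seed=0):
    """Estimate the recovery rate of solver(Phi, y) -> x_hat at one
    (delta, rho) point; sweeping a grid of such points and tracing the
    50%-success contour yields an empirical phase transition curve."""
    rng = np.random.default_rng(seed)
    M = int(round(delta * N))
    k = int(round(rho * M))
    successes = 0
    for _ in range(n_trials):
        Phi = rng.standard_normal((M, N))
        Phi /= np.linalg.norm(Phi, axis=0)            # unit-norm columns
        x = np.zeros(N)
        support = rng.choice(N, size=k, replace=False)
        x[support] = rng.choice([-1.0, 1.0], size=k)  # CARS coefficients
        x_hat = solver(Phi, Phi @ x)
        # Count a trial as a success when the relative error is small.
        if np.linalg.norm(x_hat - x) <= tol * np.linalg.norm(x):
            successes += 1
    return successes / n_trials
```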
