This week I am in Australia, attending the ANZIAM annual meeting in Katoomba, New South Wales (in the picturesque Blue Mountains). I gave an overview talk on some recent developments in compressed sensing, particularly with regards to the basis pursuit approach to recovering sparse (or compressible) signals from incomplete measurements. The slides for my talk can be found here, with some accompanying pictures here. (There are of course by now many other presentations of compressed sensing on-line; see for instance this page at Rice.)

There was an interesting discussion after the talk. Some members of the audience asked the very good question of whether any *a priori* information about a signal (e.g. some restriction on its support) could be incorporated to improve the performance of compressed sensing; a related question was whether one could perform an adaptive sequence of measurements to similarly improve performance. I don’t have good answers to these questions. Another audience member pointed out that the restricted isometry property is like a local “well-conditioning” property for the matrix, which applies only when one views a few columns at a time.
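For readers who want to experiment, here is a minimal numerical sketch (my own illustration, not taken from the slides) of the basis pursuit approach: a sparse vector is recovered from a small number of random Gaussian measurements by recasting the l1-minimization problem min |x|_1 subject to Ax=b as a linear program, via the standard split x = u - v with u, v ≥ 0.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, k = 60, 25, 4  # signal length, number of measurements, sparsity

# Sparse ground truth and a random Gaussian measurement matrix.
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
b = A @ x_true

# Basis pursuit: min ||x||_1  s.t.  Ax = b, as an LP over (u, v) with x = u - v.
c = np.ones(2 * n)                 # objective: sum(u) + sum(v) = ||x||_1
A_eq = np.hstack([A, -A])          # A(u - v) = b
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n))
x_hat = res.x[:n] - res.x[n:]

print("recovery error:", np.linalg.norm(x_hat - x_true))
```

With these dimensions (m = 25 measurements of a 4-sparse signal of length 60), exact recovery is expected with overwhelming probability, so the reported error should be at the level of the LP solver's tolerance.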


## 7 comments


6 February, 2008 at 12:22 pm

Igor Carron

Terry,

Even though non-adaptivity was supposed to be a feature, there has been some headway in the direction of adaptive measurements in the form of a Bayesian approach:

Bayesian Compressive Sensing, Shihao Ji, Ya Xue, and Lawrence Carin,

http://www.ece.duke.edu/~shji/BCS.html

and more recent ones:

Colored Random Projections for Compressed Sensing, Zhongmin Wang, G.R. Arce, and J.L. Paredes, http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=4217849

Compressed sensing image reconstruction via recursive spatially adaptive filtering, Karen Egiazarian, Alessandro Foi, and Vladimir Katkovnik

Adaptive compressed image sensing based on wavelet-trees, S. Dekel

With regard to a priori information, it would seem that one way to do this is to build a basis that reflects the a priori information that is sought. Recently, several methods have been devised to find specific bases in which the training or most representative datasets end up being sparsely represented:

Multiscale Sparse Image Representation with Learned Dictionaries, Julien Mairal, Guillermo Sapiro, and Michael Elad

Efficient sparse coding algorithms, Honglak Lee, Alexis Battle, Rajat Raina, and Andrew Y. Ng

Once you have the “sparse” basis, you can use CS by obtaining measurements in a basis that is incoherent with the one you just built.
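Igor's point about incoherence can be made concrete numerically. Below is a small sketch (my own illustration, not part of the comment) computing the mutual coherence mu(Phi, Psi) = sqrt(n) * max_{i,j} |<phi_i, psi_j>| between a randomly drawn orthonormal measurement basis and the identity (spike) basis; values near the minimum of 1 (the maximum possible is sqrt(n)) indicate strong incoherence, which is what makes random measurement bases a good default choice.

```python
import numpy as np

def mutual_coherence(Phi, Psi):
    """mu(Phi, Psi) = sqrt(n) * max |<phi_i, psi_j>| for two n x n orthobases."""
    n = Phi.shape[0]
    return np.sqrt(n) * np.max(np.abs(Phi.T @ Psi))

n = 64
rng = np.random.default_rng(1)

# A random orthonormal basis, obtained from the QR factorization of a Gaussian matrix.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
I = np.eye(n)  # the spike (identity) basis

mu = mutual_coherence(Q, I)
print("mutual coherence:", mu)  # small (order sqrt(2 log n)), far below sqrt(n) = 8
```

A Fourier or noiselet measurement basis would play the same role in practice; the random orthobasis here is just the simplest self-contained stand-in.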

Igor.

http://nuit-blanche.blogspot.com/search/label/CS

7 February, 2008 at 3:41 am

Gabriel Peyré

Concerning the enhancement of CS by using more complex models than just sparsity, some pointers for the interested reader are gathered here:

http://www.dsp.ece.rice.edu/cs/#ext

This is important since, in practice, sparsity alone is not enough to perform efficient image compression (see e.g. the JPEG-2000 image coder, which uses advanced statistical modeling), so the same is likely to be the case for efficient CS recovery.

Some additional pointers:

* optimization of the wavelet coefficients in a scale-by-scale fashion

D. Donoho and Y. Tsaig, “Extensions of compressed sensing,”

* positivity or other constraints can be enforced by iterative projection onto additional convex sets, e.g. {x : |x|_1 < tau} and {x : Ax = b}, where A is the sensing matrix.

E. Candes and J. Romberg, “Practical signal recovery from random projections”

* low total variation is another constraint that can be enforced (the intuition is that the gradient is sparse).

E. Candes and J. Romberg, “Practical signal recovery from random projections”

* One can also use more advanced signal models not based on sparsity but rather on low-dimensional smooth manifolds (for the whole image or for small patches).

R. Baraniuk and M. Wakin, “Random projections of smooth manifolds”

G. Peyre, “Manifold Models for Signals and Images”

* Recovery of piecewise smooth signals (the wavelet coefficients in this case can be “chained” across the scales)

Duarte et al., “Fast reconstruction of piecewise smooth signals from random projections”

* Recovery of several signals having the same sparsity pattern (e.g. color images or multi-channel data), see for instance

Fornasier et al., “Recovery algorithms for vector valued data with joint sparsity constraints.”

Gribonval et al., “Atoms of all channels, unite! Average case analysis of multi-channel sparse recovery using greedy algorithms.”

* sparsity in a best orthogonal basis (e.g. wavelet packets) instead of a fixed basis (e.g. wavelets)

G. Peyre, “Best Basis Compressed Sensing”
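The projection-onto-convex-sets idea mentioned above can be sketched in a few lines. The following toy example (my own illustration, with the l1-ball radius tau = |x_true|_1 assumed known for simplicity) alternates Euclidean projections onto the affine set {x : Ax = b} and the l1 ball {x : |x|_1 <= tau}, using the standard sort-based l1-ball projection of Duchi et al.

```python
import numpy as np

def project_l1_ball(v, tau):
    """Euclidean projection of v onto {x : ||x||_1 <= tau} (sort-based method)."""
    if np.abs(v).sum() <= tau:
        return v
    u = np.sort(np.abs(v))[::-1]
    css = np.cumsum(u)
    idx = np.arange(1, len(u) + 1)
    rho = np.nonzero(u * idx > css - tau)[0][-1]
    theta = (css[rho] - tau) / (rho + 1)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

rng = np.random.default_rng(2)
n, m = 40, 20
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
tau = np.abs(x_true).sum()  # l1 ball just large enough to contain x_true

# Alternate projections onto {x : ||x||_1 <= tau} and the affine set {x : Ax = b}.
AAt_inv = np.linalg.inv(A @ A.T)
x = np.zeros(n)
for _ in range(1000):
    x = project_l1_ball(x, tau)
    x = x + A.T @ (AAt_inv @ (b - A @ x))  # exact projection onto {Ax = b}

print("residual ||Ax - b||:", np.linalg.norm(A @ x - b))
print("||x||_1 vs tau:", np.abs(x).sum(), tau)
```

Since the loop ends with the affine projection, the measurement constraint holds exactly at every pass, while the l1 norm of the iterates shrinks toward tau as the two sets' (here, singleton) intersection is approached.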

7 March, 2008 at 9:29 pm

Vivian

Prof. Tao,

Thanks very much for posting your ANZIAM slides. I realize that I’ve come upon this entry about a month late, but I was hoping that you could clarify one point for me. On slide #4, you write that the least-squares solution for Ax=b, with A an m×n matrix and m<n, is:

x = inv(A*A)A*b.

But is A*A even invertible under these conditions? The answer I get is:

x = A*inv(AA*)b.

Am I misinterpreting your notation somehow?

Regards,

Vivian

8 March, 2008 at 8:57 am

Terence Tao

Dear Vivian,

Oops, you are right, the matrices should be multiplied in the reverse order here. Thanks!
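For anyone following along, the corrected formula x = A*inv(AA*)b is the minimum-norm solution of the underdetermined system, and Vivian's observation that A*A is not invertible when m<n is easy to check numerically; a quick sketch (my own, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 3, 5  # underdetermined: m < n
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# Minimum-norm solution x = A* (A A*)^{-1} b  (A* = transpose for real A).
# A A* is m x m and invertible (A has full row rank almost surely),
# whereas A* A is n x n with rank at most m < n, hence singular.
x = A.T @ np.linalg.inv(A @ A.T) @ b

print(np.allclose(A @ x, b))                   # x solves the system
print(np.allclose(x, np.linalg.pinv(A) @ b))   # and agrees with the pseudoinverse
print(np.linalg.matrix_rank(A.T @ A))          # rank m, so A*A is singular
```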

31 August, 2009 at 12:29 pm

More Mahler lectures « What’s new

[…] Compressed sensing. This is an updated and reformatted version of my ANZIAM talk on this topic. […]

15 May, 2012 at 12:42 am

Compressive Sensing: The Big Picture | spider's space

[…] Non-Deterministic and Adaptive Measurement codes/matrices as mentioned here (as well as new […]

6 March, 2019 at 7:29 am

Yuping Zhang

My conclusion is that the blue-eyed islanders puzzle is incorrect.

Here’s David Lewis’s puzzle with 100 islanders and N=0 or N=100.

The single color puzzle.

In this N=0 or N=100 puzzle, the common knowledge is that they all know that they all know: “At least one of you has green eyes.”

Here, the crucial point is that all 100 of them see the same single color in the same number of people (99); the fact that no one leaves each day therefore tells them more and more, which works out here but not in Terence Tao’s blue-eyed islanders puzzle.

Let’s look at another example.

Consider the case of 10 islanders with N=1: one person is the only one whose eyes are not green, and the remaining 9 have green eyes.

Someone announced: ”At least one of the ten has green eyes.”

On the 9th day, can all 9 of the green-eyed islanders logically figure out their eye color?