The Mathematics of Compressive Sensing: An Introduction

Simon Foucart

Drexel University

Laboratoire Paul Painlevé, Université Lille 1
March 21-22, 2013



Abstract

This mini-course is an excursion into the elegant theory of Compressive Sensing. Its aim is to give a fairly complete overview of the fundamental mathematical aspects.

The content of this mini-course is partly based on the book

A Mathematical Introduction to Compressive Sensing

written in collaboration with Holger Rauhut.


Lesson 1: The Standard Problem and the First Algorithms

This lesson introduces the question of sparse reconstruction, establishes its theoretical limits, and presents an algorithm attaining these limits in an idealized situation. To handle a more realistic situation, other algorithms are needed. ℓ1-minimization and Orthogonal Matching Pursuit make their first appearance here. Their success is proved using the concept of the coherence of a matrix.


Keywords

▸ Sparsity: essential

▸ Randomness: nothing better so far (measurement process)

▸ Optimization: preferred, but competitive alternatives are available (reconstruction process)


The Standard Compressive Sensing Problem

x : unknown signal of interest in K^N

y : measurement vector in K^m with m ≪ N,

s : sparsity of x = card{j ∈ {1, . . . , N} : x_j ≠ 0}.

Find concrete sensing/recovery protocols, i.e., find

▸ measurement matrices A : x ∈ K^N ↦ y ∈ K^m

▸ reconstruction maps ∆ : y ∈ K^m ↦ x ∈ K^N

such that

∆(Ax) = x for any s-sparse vector x ∈ K^N.

In realistic situations, two issues to consider:

Stability: x not sparse but compressible,

Robustness: measurement error in y = Ax + e.
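As a toy illustration of this setup, a minimal numpy sketch generating an s-sparse x, a Gaussian A, and noisy measurements y; the sizes N = 100, m = 30, s = 5 are illustrative, not prescribed above.

```python
import numpy as np

# Toy instance of the standard problem: an s-sparse x in R^N observed through
# m << N Gaussian measurements y = A x + e (all sizes below are illustrative).
rng = np.random.default_rng(0)
N, m, s = 100, 30, 5

x = np.zeros(N)
support = rng.choice(N, size=s, replace=False)
x[support] = rng.standard_normal(s)              # s-sparse signal of interest

A = rng.standard_normal((m, N)) / np.sqrt(m)     # measurement matrix A : R^N -> R^m
e = 1e-3 * rng.standard_normal(m)                # small measurement error (robustness)
y = A @ x + e                                    # measurement vector

print(f"||x||_0 = {np.count_nonzero(x)}, m = {m}, N = {N}")
```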


A Selection of Applications

▸ Magnetic resonance imaging

Figure: Left: traditional MRI reconstruction; Right: compressive sensing reconstruction (courtesy of M. Lustig and S. Vasanawala)

▸ Sampling theory

Figure: Time-domain signal with 16 samples.

▸ Error correction

▸ and many more...


ℓ0-Minimization

Since

‖x‖_p^p := Σ_{j=1}^{N} |x_j|^p −→ Σ_{j=1}^{N} 1_{x_j ≠ 0} as p → 0,

the notation ‖x‖_0 (sic!) has become usual for

‖x‖_0 := card(supp(x)), where supp(x) := {j ∈ [N] : x_j ≠ 0}.

For an s-sparse x ∈ K^N, observe the equivalence of

▸ x is the unique s-sparse solution of Az = y with y = Ax,

▸ x can be reconstructed as the unique solution of

(P0)  minimize_{z ∈ K^N} ‖z‖_0 subject to Az = y.

This is a combinatorial problem, NP-hard in general.
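To make the combinatorial nature explicit, a brute-force sketch of (P0) that searches over all supports of size at most a given s; it is exponential in N and only usable for toy dimensions (the function name and tolerance are illustrative).

```python
import numpy as np
from itertools import combinations

def l0_minimize_bruteforce(A, y, max_s, tol=1e-10):
    """Solve (P0) by exhaustive search over supports of size 1, ..., max_s.

    Exponential in N -- exactly the combinatorial nature of the problem --
    so only usable for toy dimensions.
    """
    m, N = A.shape
    if np.linalg.norm(y) <= tol:                 # the zero vector already fits
        return np.zeros(N)
    for k in range(1, max_s + 1):
        for S in combinations(range(N), k):
            S = list(S)
            # least-squares fit of y restricted to the candidate support S
            zS, *_ = np.linalg.lstsq(A[:, S], y, rcond=None)
            if np.linalg.norm(A[:, S] @ zS - y) <= tol:
                z = np.zeros(N)
                z[S] = zS
                return z                         # sparsest consistent solution
    return None
```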


Minimal Number of Measurements

Given A ∈ K^{m×N}, the following are equivalent:

1. Every s-sparse x is the unique s-sparse solution of Az = Ax,

2. ker A ∩ {z ∈ K^N : ‖z‖_0 ≤ 2s} = {0},

3. For every S ⊂ [N] with card(S) ≤ 2s, the matrix A_S is injective,

4. Every set of 2s columns of A is linearly independent.

As a consequence, exact recovery of every s-sparse vector forces

m ≥ 2s.

This can be achieved using partial Vandermonde matrices.
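A small numpy sketch of such a construction, using distinct positive nodes (the particular choice of nodes is illustrative); with m = 2s rows, every set of 2s columns of this matrix is linearly independent, and the script spot-checks this numerically on a few random column subsets.

```python
import numpy as np

def partial_vandermonde(m, N, nodes=None):
    """m x N matrix with rows (t_j^0, t_j^1, ..., t_j^{N-1}) for distinct nodes t_j.

    For distinct positive nodes, every set of m columns is linearly independent,
    so taking m = 2s rows meets the minimal number of measurements.
    """
    if nodes is None:
        nodes = 1.0 + np.arange(m)               # distinct positive nodes (illustrative)
    return np.vander(nodes, N, increasing=True)

s, N = 3, 12
A = partial_vandermonde(2 * s, N)

# Numerical spot check: a few random sets of 2s columns should have full rank.
rng = np.random.default_rng(1)
for _ in range(5):
    cols = rng.choice(N, size=2 * s, replace=False)
    assert np.linalg.matrix_rank(A[:, cols]) == 2 * s
```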


Exact s-Sparse Recovery from 2s Fourier Measurements

Identify an s-sparse x ∈ C^N with a function x on {0, 1, . . . , N − 1} with support S, card(S) = s. Consider the 2s Fourier coefficients

x̂(j) = Σ_{k=0}^{N−1} x(k) e^{−i2πjk/N},  0 ≤ j ≤ 2s − 1.

Consider a trigonometric polynomial vanishing exactly on S, i.e.,

p(t) := Π_{k∈S} (1 − e^{−i2πk/N} e^{i2πt/N}).

Since p · x ≡ 0, discrete convolution gives

0 = (p̂ ∗ x̂)(j) = Σ_{k=0}^{N−1} p̂(k) x̂(j − k),  0 ≤ j ≤ N − 1.

Note that p̂(0) = 1 and that p̂(k) = 0 for k > s. The equations j = s, . . . , 2s − 1 translate into a Toeplitz system with unknowns p̂(1), . . . , p̂(s). This determines p̂, hence p, then S, and finally x.
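A numpy/scipy sketch of this recovery procedure, assuming x is exactly s-sparse so that the Toeplitz system is invertible; the function name and interface are illustrative.

```python
import numpy as np
from scipy.linalg import toeplitz

def recover_from_2s_fourier(xhat, s, N):
    """Recover an s-sparse x in C^N from xhat(j), j = 0, ..., 2s-1, following the
    slide: Toeplitz system for phat(1..s), support from the roots of p, then x.

    Assumes x is exactly s-sparse, so that the Toeplitz system is invertible.
    """
    # Equations j = s, ..., 2s-1:  xhat(j) + sum_{k=1}^{s} phat(k) xhat(j-k) = 0
    T = toeplitz(xhat[s - 1:2 * s - 1], xhat[s - 1::-1])
    phat = np.linalg.solve(T, -xhat[s:2 * s])

    # p(z) = 1 + phat(1) z + ... + phat(s) z^s vanishes at z = exp(2i*pi*k/N), k in S
    coeffs = np.concatenate(([1.0], phat))       # lowest to highest degree
    roots = np.roots(coeffs[::-1])
    S = np.unique(np.rint(np.angle(roots) * N / (2 * np.pi)).astype(int) % N)

    # Values of x on S from the 2s Fourier measurements (least squares)
    F = np.exp(-2j * np.pi * np.outer(np.arange(2 * s), S) / N)
    xS, *_ = np.linalg.lstsq(F, xhat, rcond=None)
    x = np.zeros(N, dtype=complex)
    x[S] = xS
    return x
```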


ℓ1-Minimization (Basis Pursuit)

Replace (P0) by

(P1)  minimize_{z ∈ K^N} ‖z‖_1 subject to Az = y.

▸ Geometric intuition

▸ Unique ℓ1-minimizers are unique

▸ Convex optimization program, hence solvable in practice

▸ In the real setting, recast as the linear optimization program

minimize_{c, z ∈ R^N} Σ_{j=1}^{N} c_j subject to Az = y and −c_j ≤ z_j ≤ c_j.

▸ In the complex setting, recast as a second-order cone program
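A sketch of the real-setting linear-program recast using scipy.optimize.linprog, with the variables stacked as [z; c] in R^{2N}.

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """Solve (P1) in the real setting via the LP recast of the slide:
    minimize sum_j c_j subject to Az = y and -c_j <= z_j <= c_j,
    with the variables stacked as [z; c] in R^{2N}."""
    m, N = A.shape
    cost = np.concatenate([np.zeros(N), np.ones(N)])   # objective: sum of c_j
    A_eq = np.hstack([A, np.zeros((m, N))])            # Az = y
    I = np.eye(N)
    A_ub = np.vstack([np.hstack([I, -I]),              #  z_j - c_j <= 0
                      np.hstack([-I, -I])])            # -z_j - c_j <= 0
    res = linprog(cost, A_ub=A_ub, b_ub=np.zeros(2 * N), A_eq=A_eq, b_eq=y,
                  bounds=[(None, None)] * N + [(0, None)] * N)
    if not res.success:
        raise RuntimeError(res.message)
    return res.x[:N]
```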


Basis Pursuit — Null Space Property

∆_1(Ax) = x for every vector x supported on S if and only if

(NSP)  ‖u_S‖_1 < ‖u_S̄‖_1, all u ∈ ker A \ {0}, where S̄ := [N] \ S.

For real measurement matrices, real and complex NSPs read

Σ_{j∈S} |u_j| < Σ_{ℓ∈S̄} |u_ℓ|, all u ∈ ker_R A \ {0},

Σ_{j∈S} √(v_j² + w_j²) < Σ_{ℓ∈S̄} √(v_ℓ² + w_ℓ²), all (v, w) ∈ (ker_R A)² \ {0}.

Real and complex NSPs are in fact equivalent.
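The NSP is a statement about the entire kernel, so certifying it is itself nontrivial; the sketch below only spot-checks the real NSP for a fixed S on randomly sampled kernel vectors, which can refute it but never prove it (the function name and trial count are illustrative).

```python
import numpy as np

def nsp_spot_check(A, S, n_trials=2000, seed=0):
    """Sample random vectors u in ker A and test ||u_S||_1 < ||u_{S^c}||_1.

    A failed trial refutes the NSP for this S; passing all trials is only
    evidence, not a certificate, since the NSP quantifies over the whole kernel.
    """
    rng = np.random.default_rng(seed)
    m, N = A.shape
    _, _, Vt = np.linalg.svd(A)
    kernel_basis = Vt[np.linalg.matrix_rank(A):]       # rows span ker A
    S = np.asarray(S)
    Sc = np.setdiff1d(np.arange(N), S)
    for _ in range(n_trials):
        u = rng.standard_normal(kernel_basis.shape[0]) @ kernel_basis
        if np.abs(u[S]).sum() >= np.abs(u[Sc]).sum():
            return False                               # NSP violated by this u
    return True                                        # no violation found
```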


Orthogonal Matching Pursuit

Starting with S^0 = ∅ and x^0 = 0, iterate

(OMP1)  S^{n+1} = S^n ∪ { j_{n+1} := argmax_{j ∈ [N]} |(A*(y − Ax^n))_j| },

(OMP2)  x^{n+1} = argmin { ‖y − Az‖_2 : z ∈ C^N, supp(z) ⊆ S^{n+1} }.

▸ The norm of the residual decreases according to

‖y − Ax^{n+1}‖_2^2 ≤ ‖y − Ax^n‖_2^2 − |(A*(y − Ax^n))_{j_{n+1}}|^2.

▸ Every vector x ≠ 0 supported on S, card(S) = s, is recovered from y = Ax after at most s iterations of OMP if and only if A_S is injective and

(ERC)  max_{j ∈ S} |(A*r)_j| > max_{ℓ ∈ S̄} |(A*r)_ℓ|

for all r ≠ 0 in {Az : supp(z) ⊆ S}.
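A direct numpy transcription of (OMP1)-(OMP2); the stopping rule (a fixed number of iterations) is illustrative.

```python
import numpy as np

def omp(A, y, n_iter):
    """Orthogonal Matching Pursuit: greedily enlarge the support by the index of
    the largest entry of |A*(y - A x^n)|, then re-fit by least squares (OMP2)."""
    m, N = A.shape
    S = []                                             # S^0 = empty set
    x = np.zeros(N, dtype=complex)                     # x^0 = 0 in C^N
    for _ in range(n_iter):
        corr = A.conj().T @ (y - A @ x)                # A*(y - A x^n)
        j = int(np.argmax(np.abs(corr)))               # (OMP1)
        if j not in S:
            S.append(j)
        xS, *_ = np.linalg.lstsq(A[:, S], y, rcond=None)   # (OMP2)
        x = np.zeros(N, dtype=complex)
        x[S] = xS
    return x, sorted(S)
```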


Coherence

For a matrix with ℓ2-normalized columns a_1, . . . , a_N,

µ := max_{i ≠ j} |⟨a_i, a_j⟩|.

As a rule, the smaller the coherence, the better.

However, the Welch bound reads

µ ≥ √( (N − m) / (m(N − 1)) ).

▸ Welch bound achieved at, and only at, equiangular tight frames

▸ Deterministic matrices with coherence µ ≤ c/√m exist
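A short numpy sketch computing the coherence of a column-normalized matrix and comparing it with the Welch bound; the Gaussian test matrix is illustrative and is not an equiangular tight frame, so its coherence sits strictly above the bound.

```python
import numpy as np

def coherence(A):
    """Coherence mu of a matrix, after l2-normalizing its columns."""
    B = A / np.linalg.norm(A, axis=0)                  # l2-normalized columns
    G = np.abs(B.conj().T @ B)                         # |<a_i, a_j>|
    np.fill_diagonal(G, 0.0)
    return G.max()

def welch_bound(m, N):
    """Lower bound sqrt((N - m) / (m (N - 1))) valid for any m x N matrix."""
    return np.sqrt((N - m) / (m * (N - 1)))

rng = np.random.default_rng(0)
m, N = 30, 100
A = rng.standard_normal((m, N))
print(f"mu = {coherence(A):.3f}, Welch bound = {welch_bound(m, N):.3f}")
```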


Recovery Conditions using Coherence

▸ Every s-sparse x ∈ C^N is recovered from y = Ax ∈ C^m via at most s iterations of (OMP) provided

µ < 1/(2s − 1).

▸ Every s-sparse x ∈ C^N is recovered from y = Ax ∈ C^m via Basis Pursuit provided

µ < 1/(2s − 1).

▸ In fact, the Exact Recovery Condition can be rephrased as

‖A_S^† A_S̄‖_{1→1} < 1,

and this implies the Null Space Property.
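Reading the condition µ < 1/(2s − 1) the other way gives the largest sparsity level covered by this guarantee, namely s < (1 + 1/µ)/2; a one-line helper (illustrative):

```python
import numpy as np

def guaranteed_sparsity(mu):
    """Largest s with mu < 1/(2s - 1), i.e. the sparsity level covered by the
    coherence-based guarantee for both OMP and Basis Pursuit.

    Since the Welch bound gives mu of order at least 1/sqrt(m), this guarantee
    cannot reach beyond s of order sqrt(m)."""
    return int(np.ceil((1.0 / mu + 1.0) / 2.0)) - 1

print(guaranteed_sparsity(0.05))    # -> 10, since 0.05 < 1/19 but not < 1/21
```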
