
Functional analysis

Distant Learning, Week 3. Lesson 9.

April 21, 2020

Review

In the WEIGHTED L2 SPACE we applied G-S (Gram-Schmidt) orthogonalization:

    {1, x, ..., x^n, ...}  −→  {ϕ_0, ϕ_1, ..., ϕ_n, ...}   (ON polynomials).

E.g. the Legendre, Chebyshev and Hermite polynomials. Do you know them?

Questions.

- Why are these systems of orthogonal polynomials important?
- What can we use the ON polynomials for?
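For illustration, a minimal Python sketch of this G-S step, assuming NumPy is available: it orthonormalizes 1, x, x^2, ... in L2[−1, 1] with weight w ≡ 1, and the first two results match the normalized Legendre polynomials P_0 = 1/√2 and P_1 = √(3/2)·x that appear later in this lesson.

    import numpy as np
    from numpy.polynomial import Polynomial as P

    def inner(p, q, a=-1.0, b=1.0):
        # L2 inner product <p, q> on [a, b] with weight w = 1, integrated exactly
        r = (p * q).integ()
        return r(b) - r(a)

    def gram_schmidt(n_max, a=-1.0, b=1.0):
        # orthonormalize the monomials 1, x, ..., x^n_max in L2[a, b]
        monomials = [P([0.0] * k + [1.0]) for k in range(n_max + 1)]
        on = []
        for v in monomials:
            for phi in on:
                v = v - inner(v, phi, a, b) * phi       # remove the projection onto phi
            on.append(v / np.sqrt(inner(v, v, a, b)))   # normalize
        return on

    for phi in gram_schmidt(2):
        print(phi)   # ~ 0.707,  then ~1.225 x,  then a multiple of 3x^2 - 1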


A detour

Theorem. (Classical Fourier theorem.)

Assume f : [−π, π] → ℝ satisfies the Dirichlet conditions. ??

Then ∀x ∈ [−π, π]:

    f(x) = a_0/2 + ∑_{k=1}^∞ (a_k cos(kx) + b_k sin(kx)),   with

    a_k = (1/π) ∫_{−π}^{π} f(x) cos(kx) dx,   b_k = (1/π) ∫_{−π}^{π} f(x) sin(kx) dx.

Corollary. The trig. system is complete in L2[−π, π].

−→ Moreover, the coefficients are known.
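A quick numerical check of these formulas, as a sketch assuming NumPy; f(x) = x is just an example function satisfying the Dirichlet conditions.

    import numpy as np

    # Fourier coefficients a_k, b_k of f on [-pi, pi], computed by quadrature
    x = np.linspace(-np.pi, np.pi, 20001)
    f = x                                   # example: f(x) = x

    def a(k): return np.trapz(f * np.cos(k * x), x) / np.pi
    def b(k): return np.trapz(f * np.sin(k * x), x) / np.pi

    def partial_sum(t, N):
        # N-th partial sum of the Fourier series, evaluated at the point t
        s = a(0) / 2.0
        for k in range(1, N + 1):
            s += a(k) * np.cos(k * t) + b(k) * np.sin(k * t)
        return s

    print(partial_sum(1.0, 100))   # close to f(1) = 1, as the theorem predicts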


General Fourier series


In a Hilbert space.

(H, 〈·, ·〉) is a Hilbert space. (Can you recall the definition?)

Let (ϕ_k) ⊂ H be an ON system.

Theorem. Assume that for some f ∈ H we have

    f = ∑_{k=1}^∞ c_k ϕ_k.

Then c_k = 〈f, ϕ_k〉. I.e. the coefficients can be recovered from f.

Remark. If (ϕ_n) ⊂ H is complete, then for every f ∈ H there exists (c_n) such that

    f = ∑_{n=1}^∞ c_n ϕ_n.
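A finite-dimensional sketch of this statement, assuming NumPy: in H = ℝ^5 with the ON system given by the columns of an orthogonal matrix, the coefficients of f = ∑ c_k ϕ_k are exactly 〈f, ϕ_k〉.

    import numpy as np

    rng = np.random.default_rng(0)
    Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))   # columns: ON system phi_1..phi_5
    c = rng.standard_normal(5)                          # arbitrary coefficients
    f = Q @ c                                           # f = sum_k c_k * phi_k

    recovered = Q.T @ f                                 # <f, phi_k> for each k
    print(np.allclose(recovered, c))                    # True: coefficients recovered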


Proof.

Let us define s_n := ∑_{k=1}^{n} c_k ϕ_k. Then by the Thm.'s assumption

    lim_{n→∞} ‖f − s_n‖ = 0.

It follows that for all ϕ_j, j ≤ n,

    lim_{n→∞} 〈f − s_n, ϕ_j〉 = 0.  (Why?)   ⟹   〈f, ϕ_j〉 = lim_{n→∞} 〈s_n, ϕ_j〉.

If n ≥ j, then

    〈s_n, ϕ_j〉 = 〈 ∑_{k=1}^{n} c_k ϕ_k , ϕ_j 〉 = ??? = ∑_{k=1}^{n} c_k 〈ϕ_k, ϕ_j〉 = c_j.


Fourier series expansion

Let (ϕ_n) ⊂ H be a complete ON system. For any f ∈ H we define

- the FOURIER COEFFICIENTS of f with respect to (ϕ_n) as 〈f, ϕ_n〉, n = 1, 2, ...

- the FOURIER SERIES EXPANSION of f with respect to (ϕ_n) as ∑_{n=1}^∞ 〈f, ϕ_n〉 ϕ_n.

Notation. f ∼ ∑_{n=1}^∞ c_n ϕ_n, with c_n = 〈f, ϕ_n〉.

For now this is only a formal definition. Why?


Sum of the Fourier series

Theorem. If (ϕ_n) is a complete ON system, then

    f = ∑_{n=1}^∞ 〈f, ϕ_n〉 ϕ_n.

I.e. the sum of the Fourier series gives back the original function.

Analogy. V is a finite dim. vector space. v_1, ..., v_n ∈ V is a basis if

- these vectors are linearly independent,

- every v ∈ V can be written as v = ∑_{k=1}^{n} c_k v_k (i.e. they form a generator system).

In an infinite dimensional Hilbert space: basis ≡ complete ON system.


Parseval equality

Try to recall "the original" one.

Theorem. Let f ∈ H.

1. (ϕ_n) ⊂ H is an ON system. Then

    ∑_{n=1}^∞ c_n² ≤ ‖f‖²,   where c_n = 〈f, ϕ_n〉.

2. (ϕ_n) is ON and complete  ⟺  ∑_{n=1}^∞ c_n² = ‖f‖².

The latter identity is called the PARSEVAL EQUALITY.
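A finite-dimensional sketch of both parts, assuming NumPy: an incomplete ON system gives the inequality, the complete one gives equality.

    import numpy as np

    rng = np.random.default_rng(1)
    Q, _ = np.linalg.qr(rng.standard_normal((6, 6)))   # complete ON system in R^6
    f = rng.standard_normal(6)

    c_partial = Q[:, :3].T @ f       # coefficients w.r.t. an incomplete ON system
    c_full = Q.T @ f                 # coefficients w.r.t. the complete system

    print(np.sum(c_partial**2) <= np.dot(f, f))          # True  (the inequality of part 1)
    print(np.isclose(np.sum(c_full**2), np.dot(f, f)))   # True  (Parseval equality)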


    ∑_{n=1}^∞ c_n² ≤ ‖f‖²,   c_n = 〈f, ϕ_n〉.

Proof. 1. Let us define s_n := ∑_{k=1}^{n} c_k ϕ_k. Geometrically, s_n is the projection of f onto span{ϕ_1, ..., ϕ_n}. (Try to verify this.) Thus (f − s_n) ⊥ s_n.

Then we can use the Pythagorean theorem:

    ‖f‖² = ‖f − s_n‖² + ‖s_n‖²   ⟹   ‖s_n‖² ≤ ‖f‖²  ∀n.

By orthogonality ‖s_n‖² = ∑_{k=1}^{n} c_k². Finally, let n → ∞. √


    ∑_{n=1}^∞ c_n² = ‖f‖²   ⟺   (ϕ_n) is complete.

2. A proposition containing ⟺ has two parts to verify.

Part A. Assume (ϕ_n) is ON and complete. From the previous slide:

    ‖f‖² = ‖f − s_n‖² + ‖s_n‖².   (1)

From the completeness of (ϕ_n) it follows that f = ∑_{n=1}^∞ c_n ϕ_n, thus lim_{n→∞} ‖f − s_n‖² = 0. From (1) we get

    ‖f‖² = lim_{n→∞} ‖s_n‖² = ∑_{k=1}^∞ c_k².

Part B. Assuming ‖f‖² = ∑_{k=1}^∞ c_k² for all f, prove that (ϕ_n) is COMPLETE. Do it yourself. HW.


Generalized Parseval equality.

Theorem. Let (ϕ_n) be a complete ON system in L2(ℝ), and let f, g ∈ L2(ℝ) be arbitrary functions. Then

    〈f, g〉 = ∑_{k=1}^∞ c_k d_k,

where c = (c_k) and d = (d_k) are the Fourier coefficients of f and g.

This relation can also be written as:

    〈f, g〉_{L2} = 〈c, d〉_{ℓ2}.
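The same finite-dimensional model as before illustrates the generalized equality 〈f, g〉 = 〈c, d〉 (sketch assuming NumPy).

    import numpy as np

    rng = np.random.default_rng(2)
    Q, _ = np.linalg.qr(rng.standard_normal((6, 6)))   # complete ON system in R^6
    f, g = rng.standard_normal(6), rng.standard_normal(6)

    c, d = Q.T @ f, Q.T @ g                 # Fourier coefficients of f and g
    print(np.isclose(np.dot(f, g), np.dot(c, d)))   # True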


Special case: H = L2(ℝ)

Corollary. For any f ∈ L2(ℝ) it is possible to assign a sequence (c_n) ∈ ℓ², using any complete ON system (ϕ_n).

The other direction is the following important theorem.

Theorem. (Riesz–Fischer thm.) Let (d_k) ∈ ℓ², i.e. ∑_{k=1}^∞ d_k² < ∞.

Then ∃ f ∈ L2(ℝ) such that ‖f‖² = ∑_{k=1}^∞ d_k², and its Fourier coefficients are the d_k.

Proof. (Hint) A "candidate" is f := ∑_{k=1}^∞ d_k ϕ_k. It is OK. Finish the proof.
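A truncated numerical sketch of this construction, assuming NumPy, using the ON system ϕ_k(x) = sin(kx)/√π on [−π, π] and the square-summable example sequence d_k = 1/k.

    import numpy as np

    # build f = sum_k d_k * phi_k from a given l2 sequence (truncated to 199 terms)
    x = np.linspace(-np.pi, np.pi, 20001)
    d = np.array([1.0 / k for k in range(1, 200)])

    f = sum(dk * np.sin(k * x) / np.sqrt(np.pi) for k, dk in enumerate(d, start=1))
    print(np.trapz(f * f, x), np.sum(d ** 2))   # ||f||^2 vs. sum d_k^2: nearly equal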


L2 and ℓ2

Corollary. L2(ℝ) and ℓ² are isometrically isomorphic.

The linear isometry is based on any complete ON system (ϕ_n), using the Fourier coefficients: f ⟷ (c_n).

PLEASE STOP FOR A WHILE, AND UNDERSTAND THIS POINT.

L2(ℝ) and ℓ² are the "same".


Example. H = L2[−1, 1]

In L2[−1, 1] the (normalized) Legendre polynomials form a complete ON system.

We have seen some elements of (P_n(x)):

    P_0(x) = 1/√2,   P_1(x) = √(3/2) x,   P_2(x) = it was a HW ...

Then every f ∈ L2[−1, 1] can be written as

    f(x) = ∑_{n=0}^∞ c_n P_n(x),   with c_n = ∫_{−1}^{1} f(x) P_n(x) dx.

Thus every f ∈ L2[−1, 1] can be approximated by a polynomial of degree n with KNOWN coefficients. Can you recall something similar?
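A numerical sketch of such an expansion, assuming NumPy, for the example function f(x) = e^x. NumPy's Legendre polynomials are the classical (unnormalized) ones, so they are divided by √(2/(2n+1)) below to obtain the orthonormal P_n used on this slide.

    import numpy as np
    from numpy.polynomial import legendre as L

    x = np.linspace(-1.0, 1.0, 4001)
    f = np.exp(x)                                    # example function

    N = 8
    phis = []
    for n in range(N + 1):
        p = L.legval(x, [0.0] * n + [1.0])           # classical Legendre P_n(x)
        phis.append(p / np.sqrt(2.0 / (2 * n + 1)))  # normalize so that ||phi_n|| = 1

    c = [np.trapz(f * phi, x) for phi in phis]       # c_n = <f, phi_n>
    approx = sum(cn * phi for cn, phi in zip(c, phis))
    print(np.max(np.abs(f - approx)))                # small; the expansion converges rapidly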


An example in H = L2[0, 1]

This example gives an ON system in L2[0, 1].

They are called the Haar functions.

They are not polynomials, but they form the simplest wavelet family.

(More details on that can be found in the book.)

They are defined in blocks: H_{n,k} with n = 0, 1, 2, ..., k = 1, ..., 2^n.

For all indices H_{n,k} : [0, 1] → ℝ.


Haar functions

For n = 0 there are two functions: H_{0,0} and H_{0,1}.

    H_{0,0}(x) = 1,   x ∈ [0, 1].

    H_{0,1}(x) = {  1  if 0 ≤ x < 1/2;   −1  if 1/2 ≤ x ≤ 1 },   x ∈ [0, 1].

This is the so-called mother wavelet.

Easy to check that ‖H_{0,0}‖ = ‖H_{0,1}‖ = 1 and H_{0,0} ⊥ H_{0,1}. DO IT.


Haar functions, nth block.

For n ≥ 1 divide [0, 1] into 2^n equal parts with the points k/2^n. Let's define:

    H_{n,k}(x) = {   √(2^n)   if (k − 1)/2^n ≤ x < (k − 1/2)/2^n;
                    −√(2^n)   if (k − 1/2)/2^n ≤ x < k/2^n;
                     0        otherwise },        n ≥ 1,  1 ≤ k ≤ 2^n.

The nonzero part is the "mother wavelet", squished and stretched.

Easy to check that ‖H_{n,k}‖ = 1 and H_{n,k} ⊥ H_{n,j} for j ≠ k. DO IT.
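A short sketch of these functions and of the orthonormality check, assuming NumPy; the grid below is dyadic, so the Riemann sums are exact for the block n = 2.

    import numpy as np

    def haar(n, k, x):
        # H_{n,k} on [0, 1] as defined above
        left, mid, right = (k - 1) / 2**n, (k - 0.5) / 2**n, k / 2**n
        return np.where((x >= left) & (x < mid), np.sqrt(2.0**n),
               np.where((x >= mid) & (x < right), -np.sqrt(2.0**n), 0.0))

    x = np.linspace(0.0, 1.0, 2**16, endpoint=False)
    dx = x[1] - x[0]
    H = [haar(2, k, x) for k in range(1, 5)]             # the block n = 2, k = 1..4

    gram = np.array([[np.sum(Hi * Hj) * dx for Hj in H] for Hi in H])
    print(np.round(gram, 3))                             # the 4x4 identity matrix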


E.g. Haar functions H_{2,k}

[Figure: graphs of the H_{2,k} Haar functions for k = 1, 2, 3, 4.]

Remark. This ON system is complete. (Not trivial to prove.)
