
Quantum One: Lecture 8

Continuously Indexed Basis Sets

In the last lecture, we began to describe a more general formulation of quantum mechanics, applicable to arbitrary quantum systems, which develops the basic postulates in a form that is designed to be representation independent.

We began by stating the first postulate, which associated the dynamical state of a quantum system with a state vector |ψ⟩ that is an element of a complex linear vector space S.

We then gave a definition of the term linear vector space, and saw that it defines a set of objects that we can multiply by scalars and add together, to obtain other elements of the set. That is, they obey a superposition principle.

We then introduced a series of additional definitions, including the ideas of spanning sets, linearly independent sets, and basis sets, and we defined what we mean by the dimension of a linear vector space.

In this lecture we continue our exploration of the mathematical properties of the state spaces of quantum mechanical systems.

To this end we note that our previous definitions were expressed using a notation that was strictly applicable only to countable sets of states labeled by a discrete index.

But it often arises that a set of vectors {|φ_α⟩} is labeled by a continuous index α.

Examples from functional linear vector spaces include the plane waves and the delta functions.

We therefore need to extend our definitions presented for discrete sets so that we can apply the same concepts to sets of vectors labeled by a continuous index.

This gives rise to the following set of definitions:

Span - A continuously indexed set of vectors {|φ_α⟩} is said to span a vector space S if every vector |ψ⟩ in S can be written as a continuous linear combination

|ψ⟩ = ∫ dα ψ(α) |φ_α⟩

of the elements of the set. In this expression the function ψ(α) gives the complex value of the expansion coefficient multiplying the state |φ_α⟩ of the spanning set.

Linear Independence - A continuously indexed set of vectors {|φ_α⟩} is linearly independent if the only solution to the equation

∫ dα c(α) |φ_α⟩ = 0

is c(α) = 0 for all α.

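As a concrete illustration (a numerical sketch of our own, not part of the lecture), a continuous linear combination can be approximated on a computer by a Riemann sum over a discretized index. Here the set {|φ_k⟩} is taken to be the plane waves φ_k(x) = e^{ikx}/√(2π), and the expansion coefficients ψ(k) are the Fourier transform of a Gaussian; the names psi_coeff and phi_k below are ours, chosen for the example.

import numpy as np

# Spatial grid on which the reconstructed function is evaluated.
x = np.linspace(-10, 10, 401)

# Discretized continuous index: wave numbers k with spacing dk.
k = np.linspace(-10, 10, 2001)
dk = k[1] - k[0]

# Target function: a unit-width Gaussian, normalized so that <psi|psi> = 1.
psi_exact = np.exp(-x**2 / 2) / np.pi**0.25

# Its expansion coefficients psi(k) in the plane-wave set
# phi_k(x) = exp(i k x) / sqrt(2 pi) are its Fourier transform,
# which for this Gaussian is again a Gaussian.
psi_coeff = np.exp(-k**2 / 2) / np.pi**0.25

# |psi> = ∫ dk psi(k) |phi_k>, approximated by the Riemann sum
# over k of dk * psi(k) * phi_k(x).
phi_k = np.exp(1j * np.outer(k, x)) / np.sqrt(2 * np.pi)
psi_approx = dk * (psi_coeff @ phi_k)

# The discretized continuous linear combination reproduces the function.
print(np.max(np.abs(psi_approx - psi_exact)))  # close to zero

As dk → 0 and the k-interval grows, the sum converges to the integral, which is the sense in which the plane waves span this functional space.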

Basis - A linearly independent set of continuously indexed vectors that spans S forms a basis for the space.

We note in passing that any space that contains a continuously indexed basis is necessarily infinite dimensional, since it must contain an infinite number of linearly independent vectors in any domain in which the index α takes on continuous values.

Inner Products: Towards a Notion of Length and Direction

Another important property associated with the linear vector spaces of quantum mechanics is that they are inner product spaces.

Definition: A linear vector space S is an inner product space if there exists an assignment, to each pair of vectors |φ⟩ and |ψ⟩ in S, of a scalar (an element of the field), denoted by the symbol ⟨φ|ψ⟩ and referred to as the inner product of |φ⟩ and |ψ⟩, obeying the following properties:

1. ⟨φ|φ⟩ is real and non-negative, i.e., ⟨φ|φ⟩ ≥ 0. Moreover, ⟨φ|φ⟩ = 0 if and only if |φ⟩ is the null vector.

2. ⟨φ|[|ψ₁⟩ + |ψ₂⟩] = ⟨φ|ψ₁⟩ + ⟨φ|ψ₂⟩. Thus, the inner product distributes itself over vector addition.

3. ⟨φ|[λ|ψ⟩] = λ⟨φ|ψ⟩. Thus, a scalar multiplying the ket comes out of the inner product unchanged.

4. ⟨φ|ψ⟩ = (⟨ψ|φ⟩)∗. Thus the order of the inner product is important for complex vector spaces.
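To make the four defining properties concrete, here is a small numerical check (our own illustration, not from the lecture) using the standard inner product on Cᴺ, for which NumPy's vdot conjugates its first argument:

import numpy as np

rng = np.random.default_rng(0)
phi  = rng.normal(size=4) + 1j * rng.normal(size=4)
psi1 = rng.normal(size=4) + 1j * rng.normal(size=4)
psi2 = rng.normal(size=4) + 1j * rng.normal(size=4)
lam = 2.0 - 3.0j

# (1) <phi|phi> is real and non-negative.
assert np.isclose(np.vdot(phi, phi).imag, 0.0) and np.vdot(phi, phi).real >= 0.0

# (2) The inner product distributes over vector addition.
assert np.isclose(np.vdot(phi, psi1 + psi2), np.vdot(phi, psi1) + np.vdot(phi, psi2))

# (3) A scalar multiplying the ket comes straight out.
assert np.isclose(np.vdot(phi, lam * psi1), lam * np.vdot(phi, psi1))

# (4) Reversing the order conjugates the result.
assert np.isclose(np.vdot(phi, psi1), np.conj(np.vdot(psi1, phi)))

print("all four properties hold")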


In complex vector spaces, the inner product ⟨φ|ψ⟩ is linear in |ψ⟩, but antilinear in |φ⟩.

The first half of this comment follows from the observation that if |ψ⟩ = λ₁|ψ₁⟩ + λ₂|ψ₂⟩, then

⟨φ|ψ⟩ = λ₁⟨φ|ψ₁⟩ + λ₂⟨φ|ψ₂⟩,

which follows from (2) and (3), while the second stems from the fact that if |φ⟩ = λ₁|φ₁⟩ + λ₂|φ₂⟩, then by (4), (2), and (3),

⟨φ|ψ⟩ = λ₁∗⟨φ₁|ψ⟩ + λ₂∗⟨φ₂|ψ⟩,

which defines the condition of antilinearity with respect to |φ⟩.
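Numerically, the same asymmetry is easy to see in Cᴺ (again a sketch of our own): a scalar pulled out of the bra slot picks up a complex conjugate.

import numpy as np

phi = np.array([1.0 + 2.0j, 3.0 - 1.0j])
psi = np.array([0.5 - 0.5j, 2.0 + 1.0j])
lam = 1.0 + 4.0j

# Linear in the ket:     <phi|(lam psi)> = lam <phi|psi>
print(np.isclose(np.vdot(phi, lam * psi), lam * np.vdot(phi, psi)))           # True

# Antilinear in the bra: <(lam phi)|psi> = lam* <phi|psi>
print(np.isclose(np.vdot(lam * phi, psi), np.conj(lam) * np.vdot(phi, psi)))  # True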

It is convenient to think of each vector |ψ⟩ as a column vector containing elements ψᵢ, and to think of ⟨φ| as a row vector whose elements are the complex conjugates of the components of the column vector representing |φ⟩. In this way the inner product can be viewed as the "dot product"

⟨φ|ψ⟩ = Σᵢ φᵢ∗ ψᵢ.

This is, of course, the inner product commonly associated with Cᴺ.

The complex conjugated row vectors associated with the symbols {⟨φ|} thus form a vector space of their own, which is isomorphic (or dual, or adjoint) to the original space S having elements {|φ⟩}. They are in one-to-one correspondence.

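The row-and-column picture can be spelled out explicitly (a minimal sketch of our own, with arbitrary example values): represent the ket as a column array, form the bra as its conjugate transpose, and the bracket is an ordinary matrix product.

import numpy as np

# Kets as column vectors in C^3.
psi = np.array([[1.0 + 1.0j], [2.0j], [3.0]])
phi = np.array([[0.5], [1.0 - 1.0j], [2.0j]])

# The bra <phi| is the conjugate transpose of the ket: a row vector.
bra_phi = phi.conj().T

# The bracket <phi|psi> is then a row-times-column product,
# i.e., the sum over i of conj(phi_i) * psi_i.
bracket = (bra_phi @ psi)[0, 0]
print(bracket)
print(np.vdot(phi, psi))  # NumPy's built-in conjugating dot agrees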

In the Dirac notation, an element |φ⟩ of S is referred to as a ket, while an element ⟨φ| of the dual space is referred to as a bra.

The combination ⟨φ|ψ⟩ forms a "bracket", which in the Dirac formalism is always a number, i.e., an element of C.

Examples:

1. In the space of displacement vectors in R³ the inner product is just the familiar "dot product".

2. As discussed above, the inner product in Cᴺ is obtained by "dotting" a complex conjugated row vector into an unconjugated column vector.


3. In functional spaces, the inner product involves the continuous analog of a summation over components, namely an integral.

Thus, e.g., in the space of Fourier transformable functions on R³ we "associate" with each function ψ(r) a vector |ψ⟩. The inner product of two functions then takes the form

⟨φ|ψ⟩ = ∫ d³r φ∗(r) ψ(r),

where the integral is over all space.

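For intuition, this functional inner product can be approximated numerically. Here is a one-dimensional sketch of our own, with an arbitrary pair of Gaussians, replacing the integral by a Riemann sum on a grid:

import numpy as np

# A grid wide enough that both functions vanish at its edges.
x = np.linspace(-20.0, 20.0, 4001)
dx = x[1] - x[0]

# Two square-integrable functions of x (the second carries a phase).
phi = np.exp(-(x - 1.0)**2 / 2)
psi = np.exp(-(x + 1.0)**2 / 2) * np.exp(0.5j * x)

# <phi|psi> = ∫ dx phi*(x) psi(x): the sum over components
# becomes an integral, approximated here by a Riemann sum.
inner = dx * np.sum(np.conj(phi) * psi)
print(inner)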

The concept of an inner product allows us to make several new definitions:

Norm - The positive real quantity ‖ψ‖ = √⟨ψ|ψ⟩ is referred to as the norm, or the length, of the vector |ψ⟩.

A vector is said to be square-normalized, to have unit norm, or to be a unit vector if ⟨ψ|ψ⟩ = 1.

Any vector having a finite norm can be square-normalized. That is, if ⟨ψ|ψ⟩ is not infinite, then the vector

|ψ′⟩ = |ψ⟩ / ‖ψ‖

is a unit vector along the same direction in the space as |ψ⟩.

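In Cᴺ the square-normalization step looks like this (our own minimal sketch); NumPy's linalg.norm computes √⟨ψ|ψ⟩ for a complex vector:

import numpy as np

psi = np.array([3.0 + 4.0j, 1.0 - 2.0j, 0.5j])

norm = np.linalg.norm(psi)        # sqrt(<psi|psi>)
psi_hat = psi / norm              # rescaled copy of psi

print(norm)                       # finite, so psi can be square-normalized
print(np.vdot(psi_hat, psi_hat))  # (1+0j): psi_hat is a unit vector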

Orthogonality - Two vectors |ψ⟩ and |φ⟩ are orthogonal if

⟨ψ|φ⟩ = ⟨φ|ψ⟩ = 0,

i.e., if their inner product vanishes.

We loosely say that the vectors have no overlap, or that |ψ⟩ has no component along |φ⟩, and vice versa.

Orthonormal Set of Vectors

1. A discrete set of vectors {|φᵢ⟩} forms an orthonormal set if

⟨φᵢ|φⱼ⟩ = δᵢⱼ,

that is, if they are a set of unit-normalized vectors which are mutually orthogonal.


2. A continuously indexed set of vectors {|φ_α⟩} forms an orthonormal set if

⟨φ_α|φ_α′⟩ = δ(α − α′).

The members of such a set have infinite norm, and are not square-normalizable.
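The delta-function normalization can be seen numerically in the plane-wave example: the overlap (1/2π) ∫ e^{i(k′−k)x} dx over a box of length L grows without bound as L increases when k = k′, but stays bounded when k ≠ k′. A rough sketch of our own (the helper name overlap is ours):

import numpy as np

def overlap(k1, k2, L, n=200001):
    """(1/2pi) * integral of exp(i(k2 - k1)x) over [-L/2, L/2], by Riemann sum."""
    x = np.linspace(-L / 2, L / 2, n)
    dx = x[1] - x[0]
    return dx * np.sum(np.exp(1j * (k2 - k1) * x)) / (2 * np.pi)

for L in (10.0, 100.0, 1000.0):
    same = overlap(1.0, 1.0, L)  # grows like L / (2 pi): infinite norm as L -> oo
    diff = overlap(1.0, 1.5, L)  # remains bounded for distinct wave numbers
    print(L, same.real, abs(diff))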

We can now combine the notion of a basis set with our two definitions of orthonormality and introduce the critical idea of an

Orthonormal Basis: An orthonormal set of vectors which spans a vector space S forms an orthonormal basis for the space.

As we will see, of all the different possible sets of basis vectors that one could construct, those in which the elements form an orthonormal set are the best kind.

Intuitively, we knew this already when, in an earlier lecture, we expressed the opinion that one pair of vectors formed a good basis and that another pair did not.


It is straightforward to show that any set of mutually orthogonal vectors not containing the null vector is linearly independent.

Thus, any orthonormal set of vectors which spans the space S forms an orthonormal basis for the space.

But what if we have a basis for the space that is not an orthonormal basis? It may be a perfectly good one, but does an orthonormal basis always exist?

Yes, it does. We show in the next lecture that from any set of N linearly independent vectors of finite length, it is always possible to construct a set of N orthonormal vectors.

The explicit algorithm for doing so is referred to as the Gram-Schmidt orthogonalization procedure, and it leads to the conclusion that from any basis, it is always possible to construct an orthonormal basis.

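As a preview of that procedure, here is a minimal sketch in our own code (not the lecture's): classical Gram-Schmidt subtracts from each vector its components along the previously constructed unit vectors and then normalizes what remains.

import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a sequence of linearly independent complex vectors."""
    basis = []
    for v in vectors:
        w = v.astype(complex)
        # Subtract the component of v along each unit vector found so far.
        for e in basis:
            w = w - np.vdot(e, w) * e
        norm = np.linalg.norm(w)
        if norm < 1e-12:
            raise ValueError("input vectors are not linearly independent")
        basis.append(w / norm)
    return basis

# Three linearly independent, but not orthogonal, vectors in C^3.
vs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0j]),
      np.array([0.0, 1.0, 1.0])]

es = gram_schmidt(vs)
for i, ei in enumerate(es):
    for j, ej in enumerate(es):
        print(i, j, np.round(np.vdot(ei, ej), 10))  # prints delta_ij

Each output is 1 when i = j and 0 otherwise (to rounding error), confirming that the construction turns any finite basis into an orthonormal one.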
