In Vector Spaces, Modules, and Linear Algebra, we defined vector spaces as sets closed under addition and scalar multiplication (in this case the scalars are the elements of a field; if they are elements of a ring which is not a field, we have not a vector space but a module). We have seen since then that the study of vector spaces, linear algebra, is very useful, interesting, and ubiquitous in mathematics.

In this post we discuss vector spaces with some additional structure which gives them a **topology** (see Basics of Topology and Continuous Functions), giving rise to **topological vector spaces**. This also leads to the branch of mathematics called **functional analysis**, which has applications to subjects such as **quantum mechanics**, aside from being an interesting subject in its own right. Two of the important objects of study in functional analysis that we will introduce by the end of this post are **Banach spaces** and **Hilbert spaces**.

##### I. Metric

We start with the concept of a **metric**. We have to get two things out of the way. First, this is *not* the same as the **metric tensor** in differential geometry, although it also gives us a notion of "distance". Second, the concept of a metric is not limited to vector spaces, unlike the other two concepts we will discuss in this post. It is actually something that we can put on any set to define a topology, called the **metric topology**.

As we discussed in Basics of Topology and Continuous Functions, we may think of a topology as an “arrangement”. The notion of “distance” provided by the metric gives us an intuitive such arrangement. We will make this concrete shortly, but first we give the technical definition of the metric. We quote from the book Topology by James R. Munkres:

*A metric on a set $X$ is a function*

$$d: X \times X \to \mathbb{R}$$

*having the following properties:*

*1) $d(x, y) \geq 0$ for all $x, y \in X$; equality holds if and only if $x = y$.*

*2) $d(x, y) = d(y, x)$ for all $x, y \in X$.*

*3) (Triangle inequality) $d(x, y) + d(y, z) \geq d(x, z)$, for all $x, y, z \in X$.*

We quote from the same book another important definition:

*Given a metric $d$ on $X$, the number $d(x, y)$ is often called the distance between $x$ and $y$ in the metric $d$. Given $\epsilon > 0$, consider the set*

$$B_d(x, \epsilon) = \{y \mid d(x, y) < \epsilon\}$$

*of all points $y$ whose distance from $x$ is less than $\epsilon$. It is called the $\epsilon$-ball centered at $x$. Sometimes we omit the metric $d$ from the notation and write this ball simply as $B(x, \epsilon)$ when no confusion will arise.*

Finally, once more from the same book, we have the definition of the metric topology:

*If $d$ is a metric on the set $X$, then the collection of all $\epsilon$-balls $B_d(x, \epsilon)$, for $x \in X$ and $\epsilon > 0$, is a basis for a topology on $X$, called the metric topology induced by $d$.*

We recall that a basis of a topology is a collection of open sets such that every open set can be described as a union of elements of this collection. A set with a specific metric that makes it into a topological space with the metric topology is called a **metric space**.
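As a quick illustration of the $\epsilon$-ball, here is a minimal Python sketch (the function name `in_ball` and the sample points are our own choices; we use the ordinary euclidean distance on the plane as the metric):

```python
import math

def in_ball(y, x, eps, d):
    """Is y in the eps-ball B_d(x, eps), i.e., is d(x, y) < eps?"""
    return d(x, y) < eps

d = math.dist  # the ordinary euclidean distance, a metric on the plane

print(in_ball((0.5, 0.5), (0.0, 0.0), 1.0, d))  # True: distance is about 0.707
print(in_ball((1.0, 1.0), (0.0, 0.0), 1.0, d))  # False: distance is about 1.414
```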

An example of a metric on the set $\mathbb{R}^n$ is given by the ordinary "distance formula":

$$d(x, y) = \sqrt{(x_1 - y_1)^2 + (x_2 - y_2)^2 + \cdots + (x_n - y_n)^2}$$

*Note: We have followed the notation of the book of Munkres, which may differ from the usual notation. Here $x$ and $y$ are two different points in $\mathbb{R}^n$, and $x_i$ and $y_i$ are their respective coordinates.*
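As a quick sanity check, we can compute this metric numerically and verify the triangle inequality on a few sample points (a minimal Python sketch; the sample points are arbitrary):

```python
import math

def euclidean(x, y):
    """The ordinary distance formula on R^n."""
    return math.sqrt(sum((xi - yi) ** 2 for xi, yi in zip(x, y)))

p, q, r = (0.0, 0.0), (3.0, 4.0), (6.0, 8.0)

print(euclidean(p, q))  # 5.0 (the 3-4-5 right triangle)

# Triangle inequality: d(p, r) <= d(p, q) + d(q, r)
print(euclidean(p, r) <= euclidean(p, q) + euclidean(q, r))  # True
```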

The above metric is not the only one possible, however. There are many others. For instance, we may simply put

$$d(x, y) = 0 \quad \text{if}\ x = y$$

$$d(x, y) = 1 \quad \text{if}\ x \neq y.$$

This is called the **discrete metric**, and one may check that it satisfies the definition of a metric. One may think of it as something that simply specifies the distance from a point to itself as "near", and the distance to any other point that is not itself as "far". There is also the **taxicab metric**, given by the following formula:

$$d(x, y) = |x_1 - y_1| + |x_2 - y_2| + \cdots + |x_n - y_n|$$

One way to think of the taxicab metric, which reflects the origin of its name, is that it is the "distance" important to taxi drivers (needed to calculate the fare) in a city with perpendicular roads. The ordinary distance formula is not very helpful here, since one needs to stay on the roads; for example, if one needs to go from a point $A$ to a point $B$ on opposite corners of a square, the distance traversed is not equal to the length of the diagonal, but is instead equal to the length of two sides. Again, one may check that the taxicab metric satisfies the definition of a metric.
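The contrast between the two metrics can be seen in a short computation (a sketch; the corner points of the unit square are our choice):

```python
import math

def taxicab(x, y):
    """Taxicab metric: sum of coordinatewise absolute differences."""
    return sum(abs(xi - yi) for xi, yi in zip(x, y))

# Opposite corners of a unit square: the euclidean distance is the
# diagonal, sqrt(2), but a taxi confined to the roads travels two sides.
a, b = (0.0, 0.0), (1.0, 1.0)

print(taxicab(a, b))    # 2.0
print(math.dist(a, b))  # 1.4142135623730951
```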

##### II. Norm

Now we move on to vector spaces (we will consider in this post only vector spaces over the real or complex numbers), and some mathematical concepts that we can associate with them, as suggested at the beginning of this post. Being a set closed under addition and scalar multiplication is already a useful concept, as we have seen, but we can still add on some ideas that make them even more interesting. The notion of a metric that we discussed earlier will show up repeatedly throughout this discussion.

We first discuss the notion of a **norm**, which gives us a notion of a “magnitude” of a vector. We quote from the book Introductory Functional Analysis with Applications by Erwin Kreyszig for the definition:

*A norm on a (real or complex) vector space $X$ is a real-valued function on $X$ whose value at an $x \in X$ is denoted by*

$$\|x\|$$

*(read "norm of $x$")*

*and which has the properties*

**(N1)** $\|x\| \geq 0$

**(N2)** $\|x\| = 0 \iff x = 0$

**(N3)** $\|\alpha x\| = |\alpha| \|x\|$

**(N4)** $\|x + y\| \leq \|x\| + \|y\|$ (triangle inequality)

*here $x$ and $y$ are arbitrary vectors in $X$ and $\alpha$ is any scalar.*

A vector space with a specified norm is called a **normed space**.

A norm automatically provides a vector space with a metric; in other words, a normed space is always a metric space. The metric is given in terms of the norm by the following equation:

$$d(x, y) = \|x - y\|$$

However, not all metrics come from a norm. An example is the discrete metric, which satisfies the defining properties of a metric but is not induced by any norm.
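One way to see this concretely: a metric induced by a norm inherits the scaling property of the norm, $d(\alpha x, \alpha y) = |\alpha| d(x, y)$, while the discrete metric does not scale at all. A small Python illustration (using the absolute value, which is a norm on $\mathbb{R}$):

```python
def discrete(x, y):
    """Discrete metric: 0 if the points coincide, 1 otherwise."""
    return 0 if x == y else 1

def norm_induced(norm):
    """Turn a norm into the metric d(x, y) = ||x - y||."""
    return lambda x, y: norm(x - y)

d = norm_induced(abs)  # absolute value is a norm on the real line

# A norm-induced metric scales: d(2x, 2y) = 2 d(x, y) ...
print(d(2 * 1.0, 2 * 3.0) == 2 * d(1.0, 3.0))                # True
# ... but the discrete metric does not, so it comes from no norm.
print(discrete(2 * 1.0, 2 * 3.0) == 2 * discrete(1.0, 3.0))  # False
```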

##### III. Inner Product

Next we discuss the **inner product**. The inner product gives us a notion of “orthogonality”, a concept which we already saw in action in Some Basics of Fourier Analysis. Intuitively, when two vectors are “orthogonal”, they are “perpendicular” in some sense. However, our geometric intuition may not be as useful when we are discussing, say, the infinite-dimensional vector space whose elements are functions. For this we need a more abstract notion of orthogonality, which is embodied by the inner product. Again, for the technical definition we quote from the book of Kreyszig:

*With every pair of vectors $x$ and $y$ there is associated a scalar which is written*

$$\langle x, y \rangle$$

*and is called the inner product of $x$ and $y$, such that for all vectors $x$, $y$, $z$ and scalars $\alpha$ we have*

**(IP1)** $\langle x + y, z \rangle = \langle x, z \rangle + \langle y, z \rangle$

**(IP2)** $\langle \alpha x, y \rangle = \alpha \langle x, y \rangle$

**(IP3)** $\langle x, y \rangle = \overline{\langle y, x \rangle}$

**(IP4)** $\langle x, x \rangle \geq 0$, with $\langle x, x \rangle = 0 \iff x = 0$

A vector space with a specified inner product is called an **inner product space**.

One of the most basic examples, in the case of a finite-dimensional vector space, is given by the following procedure. Let $a$ and $b$ be elements (vectors) of some $n$-dimensional real vector space $V$, with respective components $a_1, a_2, \ldots, a_n$ and $b_1, b_2, \ldots, b_n$ in some basis. Then we can set

$$\langle a, b \rangle = a_1 b_1 + a_2 b_2 + \cdots + a_n b_n$$

This is the familiar “**dot product**” taught in introductory university-level mathematics courses.
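A minimal implementation of the dot product (the sample vectors are arbitrary):

```python
def dot(a, b):
    """Dot product: sum of products of corresponding components."""
    return sum(ai * bi for ai, bi in zip(a, b))

print(dot((1, 2, 3), (4, 5, 6)))  # 1*4 + 2*5 + 3*6 = 32
print(dot((1, 0), (0, 1)))        # 0: these two vectors are orthogonal
```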

Let us now see how the inner product gives us a notion of "orthogonality". To make things even easier to visualize, let us set $n = 2$, so that we are dealing with vectors (which we can now think of as quantities with magnitude and direction) in the plane. A unit vector pointing "east" has components $(1, 0)$, while a unit vector pointing "north" has components $(0, 1)$. These two vectors are perpendicular, or orthogonal. Computing the inner product we discussed earlier, we have

$$\langle (1, 0), (0, 1) \rangle = 1 \cdot 0 + 0 \cdot 1 = 0.$$

We say, therefore, that two vectors are **orthogonal** when their inner product is zero. As we have mentioned earlier, we can extend this to cases where our geometric intuition may no longer be as useful to us. For example, consider the infinite-dimensional vector space of (real-valued) functions which are "square integrable" over some interval (if we square them and integrate over this interval, we have a finite answer), say from $-\pi$ to $\pi$. We set our inner product to be

$$\langle f, g \rangle = \int_{-\pi}^{\pi} f(x) g(x)\, dx.$$

As an example, let $f(x) = \sin(x)$ and $g(x) = \cos(x)$. We say that these functions are "orthogonal", but it is hard to imagine in what way. But if we take the inner product, we will see that

$$\langle f, g \rangle = \int_{-\pi}^{\pi} \sin(x) \cos(x)\, dx = 0.$$

Hence we see that $\sin(x)$ and $\cos(x)$ are orthogonal. Similarly, we have

$$\int_{-\pi}^{\pi} \sin(x) \sin(2x)\, dx = 0$$

and $\sin(x)$ and $\sin(2x)$ are also orthogonal. We have discussed this in more detail in Some Basics of Fourier Analysis. We have also seen in that post that orthogonality plays a big role in the subject of Fourier analysis.
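These orthogonality relations can also be checked numerically by approximating the integrals, here with a simple midpoint rule (a rough sketch; the interval $[-\pi, \pi]$ and the number of subintervals are our choices):

```python
import math

def inner(f, g, a=-math.pi, b=math.pi, n=10000):
    """Approximate <f, g> = integral of f(x)g(x) dx over [a, b] by the midpoint rule."""
    h = (b - a) / n
    mids = (a + (k + 0.5) * h for k in range(n))
    return h * sum(f(x) * g(x) for x in mids)

print(inner(math.sin, math.cos))                   # approximately 0
print(inner(math.sin, lambda x: math.sin(2 * x)))  # approximately 0
print(inner(math.sin, math.sin))                   # approximately pi: sin is not orthogonal to itself
```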

Just as a norm always induces a metric, an inner product also induces a norm, and by extension also a metric. In other words, an inner product space is also a normed space, and also a metric space. The norm is given in terms of the inner product by the following expression:

$$\|x\| = \sqrt{\langle x, x \rangle}$$
Just as with the norm and the metric, although an inner product always induces a norm, not every norm is induced by an inner product.
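One standard test for whether a norm is induced by an inner product is the parallelogram law, $\|x + y\|^2 + \|x - y\|^2 = 2\|x\|^2 + 2\|y\|^2$, which any inner-product norm must satisfy. The norm $\|x\| = |x_1| + |x_2|$ on the plane (the norm corresponding to the taxicab metric) fails it, so it comes from no inner product. A sketch:

```python
def taxicab_norm(v):
    """||v|| = |v_1| + |v_2|: a norm on the plane not induced by any inner product."""
    return abs(v[0]) + abs(v[1])

def euclid_norm(v):
    """||v|| = sqrt(<v, v>): the norm induced by the dot product."""
    return (v[0] ** 2 + v[1] ** 2) ** 0.5

def parallelogram_holds(norm, x, y, tol=1e-9):
    """Check ||x + y||^2 + ||x - y||^2 == 2||x||^2 + 2||y||^2."""
    s = (x[0] + y[0], x[1] + y[1])
    d = (x[0] - y[0], x[1] - y[1])
    lhs = norm(s) ** 2 + norm(d) ** 2
    rhs = 2 * norm(x) ** 2 + 2 * norm(y) ** 2
    return abs(lhs - rhs) < tol

x, y = (1.0, 0.0), (0.0, 1.0)
print(parallelogram_holds(euclid_norm, x, y))   # True
print(parallelogram_holds(taxicab_norm, x, y))  # False: 8 != 4 for these vectors
```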

##### IV. Banach Spaces and Hilbert Spaces

There is one more concept I want to discuss in this post. In Valuations and Completions, we discussed **Cauchy sequences** and **completions**. Those concepts carry over here, because they are actually part of the study of metric spaces (in fact, the valuations discussed in that post serve as metrics on the fields that were discussed, showing how the concepts of metric and metric spaces make an appearance even in number theory). If every Cauchy sequence in a metric space $X$ converges to an element of $X$, then we say that $X$ is a **complete metric space**.

Since normed spaces and inner product spaces are also metric spaces, the notion of a complete metric space still makes sense, and we have special names for them. A normed space which is also a complete metric space is called a **Banach space**, while an inner product space which is also a complete metric space is called a **Hilbert space**. Finite-dimensional vector spaces (over the real or complex numbers) are always complete, and therefore we only really need the distinction when we are dealing with infinite-dimensional vector spaces.
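To illustrate what can go wrong without completeness: in the rational numbers with the usual metric, Newton's iteration for $\sqrt{2}$ produces a Cauchy sequence with no rational limit (a sketch using Python's exact `Fraction` arithmetic):

```python
from fractions import Fraction

# Newton's iteration x -> (x + 2/x) / 2 produces rational numbers
# converging to sqrt(2).
x = Fraction(1)
for _ in range(5):
    x = (x + 2 / x) / 2

# The terms get arbitrarily close together (a Cauchy sequence), yet the
# limit sqrt(2) is irrational, so the rationals are not complete.
print(float(x))     # approximately 1.41421356...
print(x ** 2 == 2)  # False: no rational number squares to exactly 2
```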

Banach spaces and Hilbert spaces are important in quantum mechanics. We recall from Some Basics of Quantum Mechanics that the possible states of a system in quantum mechanics form a vector space. However, more is true: they actually form a Hilbert space, and the states that we can observe "classically" are orthogonal to each other. The Dirac "bra-ket" notation that we have discussed makes use of the inner product to express probabilities.

Meanwhile, Banach spaces often arise when studying **operators**, which correspond to **observables** such as position and momentum. Of course the states form Banach spaces too, since all Hilbert spaces are Banach spaces, but there is much motivation to study the Banach spaces formed by the operators as well, instead of just those formed by the states. This is an important aspect of the more mathematically involved treatments of quantum mechanics.

References:

Topological Vector Space on Wikipedia

Functional Analysis on Wikipedia

Inner Product Space on Wikipedia

Complete Metric Space on Wikipedia

A Functional Analysis Primer on Bahcemizi Yetistermeliyiz

Topology by James R. Munkres

Introductory Functional Analysis with Applications by Erwin Kreyszig

Real Analysis by Halsey Royden