Vectors: Dot or Inner Products

The purpose of this notebook is to help you understand the inner or dot product for vectors in $N$ dimensions and how the dot product can tell you how similar two vectors are. Two vectors define the plane in which they lie. We show a diagram of this plane containing the two vectors. The goal is to find out how much of the first vector, $\vec u$, is in the direction of the second vector, $\vec v$.

Vectors are directional entities with two main properties: a length, or potency, and a direction. The length of a vector is known as its magnitude. The magnitude of vector $\vec v$ is written as $|\vec v|$. The direction is generally given by a "unit vector" whose magnitude is one and whose direction is the same as the original vector. I will use $\vec e$ for unit vectors and will note their direction with a subscript. For example, a unit vector in the direction of $\vec v$ is written as $\vec e_v = \vec v / |\vec v|$. Dividing the vector by its magnitude makes the result have unit magnitude.
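As a quick illustration (a minimal sketch using NumPy; the vector values are made up for the example), the magnitude and unit vector can be computed directly:

In [ ]:
import numpy as np

v = np.array([3.0, 4.0])          # example vector (arbitrary values)
v_mag = np.linalg.norm(v)         # magnitude |v|
e_v = v / v_mag                   # unit vector in the direction of v

print(v_mag)                      # 5.0
print(e_v)                        # [0.6 0.8]
print(np.linalg.norm(e_v))        # 1.0, as expected for a unit vector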

To find how much of a vector $\vec u$ is in the direction of vector $\vec v$, we use the dot product. In general, the dot product of two vectors is $\vec u \cdot \vec v = |\vec u| |\vec v| \cos(\theta)$, where $\theta$ is the angle between $\vec u$ and $\vec v$. See the figure below. This definition makes it clear that vectors with an angle of $90^\circ$ or $\pi/2$ between them yield a zero dot product. Vectors $90^\circ$ apart in angle are known as orthogonal vectors. The dot product of vectors in the same direction is just the product of their magnitudes because $\cos(0^\circ) = 1$. The dot product is distributive, so that $$(\vec u_1 + \vec u_2)\cdot (\vec v_1 + \vec v_2) = \vec u_1 \cdot \vec v_1 + \vec u_1 \cdot \vec v_2 + \vec u_2 \cdot \vec v_1 + \vec u_2 \cdot \vec v_2$$
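As a sanity check (a sketch with made-up example vectors), the angle between two vectors can be recovered from this definition, and orthogonal vectors do give a zero dot product:

In [ ]:
import numpy as np

u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])

# cos(theta) = (u . v) / (|u| |v|)
cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
theta = np.degrees(np.arccos(cos_theta))
print(theta)                      # 45.0 degrees

# Orthogonal vectors yield a zero dot product
print(np.dot(np.array([1.0, 0.0]), np.array([0.0, 2.0])))   # 0.0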

If vectors with different subscripts are orthogonal to one another ($\vec u_1$ is orthogonal to $\vec v_2$, and $\vec u_2$ is orthogonal to $\vec v_1$), the above expression simplifies to $$\vec u \cdot \vec v = \vec u_1 \cdot \vec v_1 + \vec u_2 \cdot \vec v_2$$ where $\vec u = \vec u_1 + \vec u_2$ and $\vec v = \vec v_1 + \vec v_2$.

This property is often used to write a vector as $$\vec v = \sum_{i=0}^{N-1} v_i \vec e_i$$ or $$\vec u = \sum_{i=0}^{N-1} u_i \vec e_i$$ etc., where the $\vec e_i$ for $i \in \{0, 1, 2, \ldots, N-1\}$ are $N$ mutually orthonormal (orthogonal unit) vectors. This gives an easy method of computing a dot product in terms of the components of the vectors: $$\vec u \cdot \vec v = \sum_{i=0}^{N-1} u_i v_i$$
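The component formula is what NumPy's dot product computes. A short sketch (example values are arbitrary) comparing the explicit sum with np.dot:

In [ ]:
import numpy as np

u = np.array([1.0, 2.0, 3.0, 4.0])
v = np.array([4.0, 3.0, 2.0, 1.0])

explicit_sum = sum(u[i] * v[i] for i in range(len(u)))   # sum_i u_i v_i
print(explicit_sum)               # 20.0
print(np.dot(u, v))               # 20.0, the same value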

The projection of the vector $\vec u$ in the direction of vector $\vec v$ is $|\vec u| \cos(\theta) = \vec u \cdot \vec e_v$. The vector portion of $\vec u$ in the direction of $\vec v$ is $|\vec u|\cos(\theta)\,\vec e_v$. The rest of the vector is $\vec u - |\vec u|\cos(\theta)\,\vec e_v$. Its magnitude is $|\vec u|\sin(\theta)$, and it is orthogonal to the vector portion of $\vec u$ in the direction of $\vec v$, as you can see from the figure.
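The decomposition can be computed directly (a minimal sketch; the vectors are made-up examples). Note that the remainder is orthogonal to $\vec v$:

In [ ]:
import numpy as np

u = np.array([2.0, 3.0])
v = np.array([4.0, 0.0])

e_v = v / np.linalg.norm(v)       # unit vector in the direction of v
proj = np.dot(u, e_v)             # scalar projection |u| cos(theta)
u_parallel = proj * e_v           # vector portion of u along v
u_perp = u - u_parallel           # the rest of u

print(u_parallel)                 # [2. 0.]
print(u_perp)                     # [0. 3.]
print(np.dot(u_perp, v))          # 0.0, confirming orthogonality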

Application of Dot Products to a Communications System

Any ordered list of numbers may be viewed as a vector. The position in the list is known as the index number, and the number at that position is the vector component in the direction of the index number. This means you can look at functions of integers as vectors. Data coming from an analog-to-digital converter may be thought of in this way as a vector. Sometimes you need to differentiate between two signals, each with a different meaning. For example, when you ask a "yes, no" question and the answer comes, you need to determine whether it sounded more like a "yes" or a "no." A receiver in a digital communications system does this for every bit it receives. It computes the dot product of the received signal with a "yes" minus "no" vector. Then it compares that dot product to zero. If it is less than zero, the answer was "no"; if it is greater than zero, the answer is "yes." "Yes" could correspond to a one bit, and "no" to a zero bit. This way many bits of information can be sent sequentially. In this situation, the loudness, or magnitude, of the received vector is not expected to match what was sent, but the direction is.
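Below is a minimal sketch of that decision rule. The "yes" and "no" waveforms, the received amplitude, the noise level, and the number of bits are all assumptions made up for this illustration, not part of any particular standard:

In [ ]:
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "yes" and "no" signal vectors (8 samples per bit)
yes = np.array([1.0, 1.0, 1.0, 1.0, -1.0, -1.0, -1.0, -1.0])
no = -yes                                   # "no" points in the opposite direction
reference = yes - no                        # the "yes" minus "no" vector

bits_sent = rng.integers(0, 2, size=10)     # random bits: 1 = "yes", 0 = "no"
decoded = []
for bit in bits_sent:
    signal = yes if bit == 1 else no
    # Received vector: attenuated copy of the sent signal plus noise
    received = 0.5 * signal + 0.3 * rng.standard_normal(signal.size)
    # Decision: dot product with the reference, compared to zero
    decoded.append(1 if np.dot(received, reference) > 0 else 0)

print(bits_sent)
print(np.array(decoded))                    # should match bits_sent at this noise level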
