Linear combination


Understanding Linear Combination

Linear combination is a fundamental concept in linear algebra and vector spaces. It involves combining vectors using scalar multiplication and vector addition to form new vectors. This concept is crucial in many areas of mathematics, physics, engineering, and computer science.

Definition

A linear combination of a set of vectors $\{\vec{v}_1, \vec{v}_2, \ldots, \vec{v}_n\}$ in a vector space is an expression of the form:

$$ c_1\vec{v}_1 + c_2\vec{v}_2 + \ldots + c_n\vec{v}_n $$

where $c_1, c_2, \ldots, c_n$ are scalars (real or complex numbers, depending on the context).
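The definition above translates directly into code. The following is a minimal sketch in pure Python; the function name `linear_combination` and the list-of-lists representation of vectors are illustrative choices, not part of any standard library.

```python
def linear_combination(scalars, vectors):
    """Return c1*v1 + c2*v2 + ... + cn*vn for vectors given as plain lists."""
    if len(scalars) != len(vectors):
        raise ValueError("need exactly one scalar per vector")
    dim = len(vectors[0])
    result = [0.0] * dim
    # Accumulate each scaled vector into the result, component by component.
    for c, v in zip(scalars, vectors):
        for i in range(dim):
            result[i] += c * v[i]
    return result

# 3*[1, 2] + 0.5*[3, 4] = [3 + 1.5, 6 + 2]
print(linear_combination([3, 0.5], [[1, 2], [3, 4]]))  # [4.5, 8.0]
```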

Properties

  • Closure: The set of all linear combinations of a set of vectors forms a subspace of the vector space, known as the span of the vectors.
  • Associativity: Scalar multiplication and vector addition are associative.
  • Distributivity: Scalar multiplication distributes over vector addition.

Table of Differences and Important Points

| Property | Description |
| --- | --- |
| Closure | The set of all linear combinations is closed under addition and scalar multiplication. |
| Span | The collection of all possible linear combinations of a set of vectors is called the span of those vectors. |
| Linear Independence | A set of vectors is linearly independent if no vector in the set can be written as a linear combination of the others. |
| Basis | A basis of a vector space is a set of vectors that are linearly independent and span the entire space. |

Formulas

  • Linear Combination: $c_1\vec{v}_1 + c_2\vec{v}_2 + \ldots + c_n\vec{v}_n$
  • Span: $\text{Span}(\{\vec{v}_1, \vec{v}_2, \ldots, \vec{v}_n\}) = \{c_1\vec{v}_1 + c_2\vec{v}_2 + \ldots + c_n\vec{v}_n \mid c_i \in \mathbb{F}\}$
  • Linear Independence: $\alpha_1\vec{v}_1 + \alpha_2\vec{v}_2 + \ldots + \alpha_n\vec{v}_n = \vec{0} \Rightarrow \alpha_1 = \alpha_2 = \ldots = \alpha_n = 0$

Examples

Example 1: Linear Combination of Vectors

Consider two vectors in $\mathbb{R}^2$, $\vec{v}_1 = \begin{bmatrix} 1 \\ 2 \end{bmatrix}$ and $\vec{v}_2 = \begin{bmatrix} 3 \\ 4 \end{bmatrix}$. A linear combination of $\vec{v}_1$ and $\vec{v}_2$ with scalars $c_1$ and $c_2$ could be:

$$ c_1\begin{bmatrix} 1 \\ 2 \end{bmatrix} + c_2\begin{bmatrix} 3 \\ 4 \end{bmatrix} = \begin{bmatrix} c_1 + 3c_2 \\ 2c_1 + 4c_2 \end{bmatrix} $$

For example, if $c_1 = 2$ and $c_2 = -1$, the linear combination is:

$$ 2\begin{bmatrix} 1 \\ 2 \end{bmatrix} - 1\begin{bmatrix} 3 \\ 4 \end{bmatrix} = \begin{bmatrix} 2 - 3 \\ 4 - 4 \end{bmatrix} = \begin{bmatrix} -1 \\ 0 \end{bmatrix} $$
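This arithmetic can be checked with a short snippet. The sketch below is plain Python with no dependencies:

```python
v1, v2 = [1, 2], [3, 4]
c1, c2 = 2, -1

# Componentwise: c1*v1 + c2*v2
result = [c1 * a + c2 * b for a, b in zip(v1, v2)]
print(result)  # [-1, 0]
```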

Example 2: Span of Vectors

The span of vectors $\vec{v}_1$ and $\vec{v}_2$ from Example 1 is the set of all their linear combinations. Because $\vec{v}_1$ and $\vec{v}_2$ are linearly independent, this span is all of $\mathbb{R}^2$: any vector in $\mathbb{R}^2$ can be expressed as a linear combination of $\vec{v}_1$ and $\vec{v}_2$.

Example 3: Linear Independence

To check if vectors $\vec{v}_1$ and $\vec{v}_2$ are linearly independent, we set up the equation:

$$ c_1\begin{bmatrix} 1 \\ 2 \end{bmatrix} + c_2\begin{bmatrix} 3 \\ 4 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix} $$

This leads to a system of linear equations:

$$ \begin{align*} c_1 + 3c_2 &= 0 \\ 2c_1 + 4c_2 &= 0 \end{align*} $$

Subtracting twice the first equation from the second gives $-2c_2 = 0$, so $c_2 = 0$, and then the first equation forces $c_1 = 0$. Since $c_1 = c_2 = 0$ is the only solution, $\vec{v}_1$ and $\vec{v}_2$ are linearly independent.
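For two vectors in $\mathbb{R}^2$, independence is equivalent to the determinant of the matrix $[\vec{v}_1 \; \vec{v}_2]$ being nonzero. A minimal sketch (the function name `independent_2d` and the tolerance `eps` are illustrative assumptions):

```python
def independent_2d(v1, v2, eps=1e-9):
    """v1, v2 in R^2 are linearly independent iff det([v1 v2]) != 0,
    i.e. c1*v1 + c2*v2 = 0 only when c1 = c2 = 0."""
    det = v1[0] * v2[1] - v1[1] * v2[0]
    return abs(det) > eps

print(independent_2d([1, 2], [3, 4]))  # True  (det = -2)
print(independent_2d([1, 2], [2, 4]))  # False (det = 0: v2 = 2*v1)
```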

Example 4: Basis

The vectors $\vec{v}_1$ and $\vec{v}_2$ from the previous examples can serve as a basis for $\mathbb{R}^2$ since they are linearly independent and their span is the entire $\mathbb{R}^2$ space.
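Because $\{\vec{v}_1, \vec{v}_2\}$ is a basis, every vector in $\mathbb{R}^2$ has unique coordinates $(c_1, c_2)$ with respect to it. The sketch below recovers those coordinates for the $2\times 2$ case using Cramer's rule; the function name is a made-up illustration, and it assumes the two vectors really do form a basis (nonzero determinant):

```python
def coordinates_in_basis(v1, v2, target):
    """Solve c1*v1 + c2*v2 = target by Cramer's rule.
    Assumes v1, v2 form a basis of R^2 (det != 0)."""
    det = v1[0] * v2[1] - v1[1] * v2[0]
    c1 = (target[0] * v2[1] - target[1] * v2[0]) / det
    c2 = (v1[0] * target[1] - v1[1] * target[0]) / det
    return c1, c2

# [5, 6] = -1*[1, 2] + 2*[3, 4]
print(coordinates_in_basis([1, 2], [3, 4], [5, 6]))  # (-1.0, 2.0)
```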

Understanding linear combinations is essential for solving systems of linear equations, analyzing vector spaces, and studying the behavior of linear transformations. Mastery of this concept is crucial for success in examinations and practical applications in various scientific and engineering fields.