Using Determinant to Explain Some Properties of Adjoint of a Matrix

The adjoint of a matrix, often referred to as the adjugate, is a concept that plays a crucial role in linear algebra, particularly when dealing with the inverse of a matrix. To understand the properties of the adjoint of a matrix, it is essential to first define what an adjoint is and how it is related to the determinant of a matrix.

Definition of Adjoint

Given a square matrix $A$, the adjoint of $A$, denoted as $\text{adj}(A)$, is the transpose of the cofactor matrix of $A$. The cofactor matrix is constructed by replacing each element of $A$ with its cofactor.

The cofactor $C_{ij}$ of an element $a_{ij}$ in a matrix $A$ is given by:

$$ C_{ij} = (-1)^{i+j} \cdot M_{ij} $$

where $M_{ij}$ is the determinant of the submatrix formed by removing the $i$-th row and $j$-th column from $A$.
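The cofactor construction can be implemented directly. Below is a minimal sketch using NumPy; the function name `adjugate` is our own, since NumPy has no built-in adjugate routine:

```python
import numpy as np

def adjugate(A):
    """Return adj(A): the transpose of the cofactor matrix of A."""
    n = A.shape[0]
    C = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            # M_ij: determinant of A with row i and column j removed
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T  # adjugate = transpose of the cofactor matrix

A = np.array([[1.0, 2.0], [3.0, 4.0]])
print(adjugate(A))  # adj(A) = [[4, -2], [-3, 1]]
```

This brute-force approach is fine for illustration, but note it computes $n^2$ determinants and is far too slow for large matrices.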

Relationship Between Adjoint and Determinant

The adjoint of a matrix is closely related to its determinant. The following formula shows this relationship:

$$ A \cdot \text{adj}(A) = \text{adj}(A) \cdot A = (\det(A)) \cdot I $$

where $\det(A)$ is the determinant of $A$, and $I$ is the identity matrix of the same order as $A$.
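This identity is easy to check numerically. In the sketch below, the adjugate of an invertible $A$ is obtained as $\det(A)\,A^{-1}$, a rearrangement of the identity itself:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
det_A = np.linalg.det(A)            # -2
adj_A = det_A * np.linalg.inv(A)    # adj(A) = det(A) * A^{-1} for invertible A

# Both products reduce to det(A) times the identity matrix
print(np.allclose(A @ adj_A, det_A * np.eye(2)))  # True
print(np.allclose(adj_A @ A, det_A * np.eye(2)))  # True
```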

Properties of Adjoint

Here are some important properties of the adjoint of a matrix, explained using the determinant:

Non-singular matrix: If $A$ is non-singular (invertible), then $\text{adj}(A)$ is also non-singular, and the inverse of $A$ can be written as

$$ A^{-1} = \frac{1}{\det(A)} \cdot \text{adj}(A) $$

For example, if $A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$, then $\text{adj}(A) = \begin{pmatrix} 4 & -2 \\ -3 & 1 \end{pmatrix}$ and $A^{-1} = -\frac{1}{2} \cdot \text{adj}(A)$.

Singular matrix: If $A$ is singular (non-invertible), then $\det(A) = 0$, and the identity above reduces to $A \cdot \text{adj}(A) = 0$. The adjoint itself may or may not be the zero matrix. For example, for $A = \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}$, $\det(A) = 0$, yet $\text{adj}(A) = \begin{pmatrix} 4 & -2 \\ -2 & 1 \end{pmatrix}$ is non-zero.

Multiplicative property: The adjoint of a product is the product of the adjoints in reverse order:

$$ \text{adj}(AB) = \text{adj}(B) \cdot \text{adj}(A) $$

Determinant property: The determinant of the adjoint of $A$ equals the determinant of $A$ raised to the power $n-1$, where $n$ is the order of $A$:

$$ \det(\text{adj}(A)) = (\det(A))^{n-1} $$

For a $3 \times 3$ matrix $A$, this gives $\det(\text{adj}(A)) = (\det(A))^2$.

Transpose property: The adjoint of the transpose of $A$ is the transpose of the adjoint of $A$:

$$ \text{adj}(A^T) = (\text{adj}(A))^T $$
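The multiplicative property can be spot-checked numerically. The sketch below uses random $3 \times 3$ matrices (which are invertible with probability 1) and obtains the adjugate as $\det(M)\,M^{-1}$:

```python
import numpy as np

def adj(M):
    # adj(M) = det(M) * M^{-1}, valid when M is invertible
    return np.linalg.det(M) * np.linalg.inv(M)

rng = np.random.default_rng(42)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# adj(AB) equals adj(B) @ adj(A) -- note the reversed order
print(np.allclose(adj(A @ B), adj(B) @ adj(A)))  # True
```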

Examples

Let's illustrate some of these properties with examples.

Example 1: Non-singular Matrix

Consider the matrix $A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$. The determinant of $A$ is $\det(A) = 1 \cdot 4 - 2 \cdot 3 = -2$. Since $\det(A) \neq 0$, $A$ is non-singular. The adjoint of $A$ is $\text{adj}(A) = \begin{pmatrix} 4 & -2 \\ -3 & 1 \end{pmatrix}$.

Using the property of the non-singular matrix, we can find the inverse of $A$:

$$ A^{-1} = \frac{1}{\det(A)} \cdot \text{adj}(A) = \frac{1}{-2} \cdot \begin{pmatrix} 4 & -2 \\ -3 & 1 \end{pmatrix} = \begin{pmatrix} -2 & 1 \\ 1.5 & -0.5 \end{pmatrix} $$
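The same computation in code, as a sketch; the adjugate entries are taken from the worked example:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
adj_A = np.array([[4.0, -2.0], [-3.0, 1.0]])  # adjugate from the worked example
det_A = np.linalg.det(A)                      # -2

A_inv = adj_A / det_A                         # [[-2, 1], [1.5, -0.5]]
print(np.allclose(A_inv, np.linalg.inv(A)))   # True
```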

Example 2: Determinant Property

Let $A$ be a $3 \times 3$ matrix with a non-zero determinant. According to the determinant property, $\det(\text{adj}(A)) = (\det(A))^2$. If $\det(A) = 3$, then $\det(\text{adj}(A)) = 3^2 = 9$.
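A quick numerical check on a concrete $3 \times 3$ matrix; the matrix below is an arbitrary invertible example of our own choosing, and the adjugate is again obtained as $\det(A)\,A^{-1}$:

```python
import numpy as np

A = np.array([[2.0, 0.0, 1.0],
              [1.0, 3.0, 0.0],
              [0.0, 1.0, 1.0]])
det_A = np.linalg.det(A)            # 7
adj_A = det_A * np.linalg.inv(A)    # adj(A) = det(A) * A^{-1}

# For n = 3: det(adj(A)) = det(A)^(n-1) = det(A)^2
print(np.isclose(np.linalg.det(adj_A), det_A ** 2))  # True
```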

Example 3: Transpose Property

Consider the same matrix $A$ from Example 1. The transpose of $A$ is $A^T = \begin{pmatrix} 1 & 3 \\ 2 & 4 \end{pmatrix}$, and the adjoint of $A^T$ is $\text{adj}(A^T) = \begin{pmatrix} 4 & -3 \\ -2 & 1 \end{pmatrix}$. Notice that this is the transpose of $\text{adj}(A)$, which confirms the transpose property.
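This check can also be automated. A sketch, again computing the adjugate as $\det(M)\,M^{-1}$ for invertible $M$:

```python
import numpy as np

def adj(M):
    # adj(M) = det(M) * M^{-1}, valid when M is invertible
    return np.linalg.det(M) * np.linalg.inv(M)

A = np.array([[1.0, 2.0], [3.0, 4.0]])

# adj(A^T) coincides with adj(A)^T
print(np.allclose(adj(A.T), adj(A).T))  # True
```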

Understanding these properties of the adjoint of a matrix, especially in relation to the determinant, is crucial for solving problems in linear algebra, such as finding the inverse of a matrix and solving systems of linear equations.