April 3, 2018 By Dr. Aspriha Peters

Complex eigenvalues and eigenvectors of a matrix

Dear friends, today it’s all about the complex eigenvalues and eigenvectors of a matrix. Have a look!!


In my earlier posts, I have already shown how to find the eigenvalues and the corresponding eigenvectors of a matrix.

Now, with the eigenvalues of any matrix, three things can happen.

The first case is the simple one: all the eigenvalues are real and distinct.

The next case is that at least one eigenvalue is repeated, twice or even more often.

And finally, the last case is that the eigenvalues are complex conjugate numbers, not real anymore.

Today I’ll talk only about the complex eigenvalues of a matrix with real entries.

Then I’ll also try to figure out the corresponding eigenvectors.

So, I’ll start with some examples.

Some examples of Complex eigenvalues and eigenvectors of a matrix

Disclaimer: None of these examples is mine; I have chosen them from a book and given the due reference at the end of the post.

So here are the examples.




Example 1

According to Kreyszig (2005) “Find the eigenvalues and eigenvectors of the following matrices. 

    \[ \begin{pmatrix} 0.8&-0.6\\ 0.6 &0.8 \end{pmatrix}. \]

”

Solution

Here the given matrix is 

    \[ \begin{pmatrix} 0.8&-0.6\\ 0.6 &0.8 \end{pmatrix}. \]

First of all, I’ll give it a name, say, A.

Therefore it will be

    \[ A = \begin{pmatrix} 0.8&-0.6\\ 0.6 &0.8 \end{pmatrix}. \]

Now I’ll start with the eigenvalues of the matrix A.

Step 1

In the beginning comes the characteristic equation of the matrix A.

So it will be

    \[|A - \lambda I| = 0.\]

Here \lambda is the eigenvalue of the matrix A.

Thus |A - \lambda I| will be 

    \[ |A - \lambda I| = \left|\begin{array}{cc} 0.8 - \lambda &-0.6\\ 0.6 &0.8 - \lambda \end{array}\right|. \]

Next, I’ll evaluate the determinant.

So it will be

    \[|A - \lambda I| = (0.8- \lambda)(0.8- \lambda) - (-0.6)(0.6).\]

Now this gives

    \begin{eqnarray*} |A - \lambda I| &=& (0.8- \lambda)^2 +0.36\\&=& 0.64 + \lambda^2 - 1.6 \lambda + 0.36\\&=& \lambda^2 - 1.6 \lambda +1.\end{eqnarray*}

Since |A - \lambda I| = 0, this means \lambda^2 - 1.6 \lambda +1 = 0.

Now I’ll solve this equation to get the values of \lambda.

Thus it will be

    \[\lambda = \frac{1.6 \pm \sqrt{(1.6)^2 - 4(1)(1)}}{2}.\]

So I’ll simplify it to get the values of \lambda as

    \begin{eqnarray*} \lambda &=& \frac{1.6 \pm \sqrt{2.56 - 4}}{2}\\&=& \frac{1.6 \pm \sqrt{-1.44}}{2}\\&=& \frac{1.6 \pm 1.2 j}{2}\\&=& 0.8 \pm 0.6 j.\end{eqnarray*}

Now I have two values of \lambda – one is \lambda_1 = 0.8 + 0.6 j and the other one is \lambda_2 = 0.8 - 0.6 j.

So the two eigenvalues are complex conjugates of each other.

Hence these are the complex eigenvalues of a matrix with real entries.
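By the way, it’s easy to double-check these values on a computer. Here is a small Python (NumPy) sketch of my own; it is not part of the book’s solution, just a quick verification.

    import numpy as np

    # the matrix A from Example 1
    A = np.array([[0.8, -0.6],
                  [0.6,  0.8]])

    # roots of the characteristic polynomial lambda^2 - 1.6*lambda + 1 = 0
    print(np.roots([1.0, -1.6, 1.0]))   # expect 0.8+0.6j and 0.8-0.6j

    # the same eigenvalues straight from the matrix
    print(np.linalg.eigvals(A))         # expect 0.8+0.6j and 0.8-0.6j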

Now I’ll find out the eigenvectors corresponding to each eigenvalue.

Step 2

First of all, I’ll get the eigenvector corresponding to \lambda_1 = 0.8 + 0.6 j.

For \lambda_1 = 0.8 + 0.6 j, the matrix (A - \lambda I) is



    \[ A - \lambda I = \begin{pmatrix} 0.8 - 0.8 - 0.6 j&-0.6\\ 0.6 &0.8- 0.8 - 0.6 j \end{pmatrix}. \]

This means 

    \[ A - \lambda I = \begin{pmatrix} - 0.6 j&-0.6\\ 0.6 & - 0.6 j \end{pmatrix}. \]

Suppose [x_1, x_2]^T is the corresponding eigenvector for the eigenvalue \lambda = 0.8 + 0.6 j.

Thus I can say 

    \[ \begin{pmatrix} - 0.6 j&-0.6\\ 0.6 & - 0.6 j \end{pmatrix}\begin{pmatrix} x_1\\ x_2 \end{pmatrix} = \begin{pmatrix} 0\\ 0 \end{pmatrix}. \]

This means

    \[- 0.6j x_1 - 0.6 x_2 = 0\]

and

    \[0.6 x_1 - 0.6j x_2 = 0.\]

So I get two different equations as

(1)   \begin{equation*}j x_1 +  x_2 = 0\end{equation*}

and 

(2)   \begin{equation*}  x_1 -j x_2 = 0.\end{equation*}

Now equation (1) is just j times equation (2), so the two equations carry the same information. Thus, from equation (2), I can say

    \[x_1 = j x_2.\]

Thus the eigenvector will be  

    \[ \begin{pmatrix} x_1\\ x_2 \end{pmatrix} = \begin{pmatrix} j x_2\\ x_2 \end{pmatrix} = x_2 \begin{pmatrix} j\\ 1 \end{pmatrix} . \]

Therefore I can say for \lambda = 0.8 + 0.6 j, the corresponding eigenvector is [j, 1]^T.
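A quick NumPy check (again my own sketch, not from the book) confirms that [j, 1]^T really lies in the null space of (A - \lambda_1 I).

    import numpy as np

    A    = np.array([[0.8, -0.6],
                     [0.6,  0.8]])
    lam1 = 0.8 + 0.6j
    v1   = np.array([1j, 1.0])

    # (A - lambda_1 I) v1 should come out as the zero vector
    print((A - lam1*np.eye(2)) @ v1)    # expect [0.+0.j  0.+0.j]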

Now I’ll find out the eigenvector corresponding to \lambda_2 = 0.8 - 0.6 j.

Step 3

For \lambda_2 = 0.8 - 0.6 j, the matrix (A - \lambda I) is 

    \[ A - \lambda I = \begin{pmatrix} 0.8 - 0.8 + 0.6 j&-0.6\\ 0.6 &0.8- 0.8 + 0.6 j \end{pmatrix}. \]

This means 

    \[ A - \lambda I = \begin{pmatrix} 0.6 j&-0.6\\ 0.6 &  0.6 j \end{pmatrix}. \]

Suppose [x_1, x_2]^T is the corresponding eigenvector for the eigenvalue \lambda = 0.8 - 0.6 j.

Thus I can say 

    \[ \begin{pmatrix} 0.6 j&-0.6\\ 0.6 & 0.6 j \end{pmatrix}\begin{pmatrix} x_1\\ x_2 \end{pmatrix} = \begin{pmatrix} 0\\ 0 \end{pmatrix}. \]

This means

    \[0.6j x_1 - 0.6 x_2 = 0\]

and

    \[0.6 x_1 + 0.6j x_2 = 0.\]

So I get two different equations as

(3)   \begin{equation*} j x_1 -  x_2 = 0\end{equation*}

and 

(4)   \begin{equation*}  x_1 + j x_2 = 0.\end{equation*}

From equation (4), I can say

    \[x_1 = - j x_2.\]

Thus the eigenvector will be  

    \[ \begin{pmatrix} x_1\\ x_2 \end{pmatrix} = \begin{pmatrix} - j x_2\\ x_2 \end{pmatrix} = x_2 \begin{pmatrix} - j\\ 1 \end{pmatrix} . \]

Therefore I can say for \lambda = 0.8 - 0.6 j, the corresponding eigenvector is [-j, 1]^T.

Hence I can conclude that the eigenvalues of the matrix are 0.8 \pm 0.6 j and the corresponding eigenvectors are [j, 1]^T, [-j, 1]^T.

This is the answer to this example.
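Before moving on, here is one more NumPy sketch of mine: np.linalg.eig returns both the eigenvalues and the (normalised) eigenvectors, and after rescaling each eigenvector so that its second entry is 1, you should recover [j, 1]^T and [-j, 1]^T.

    import numpy as np

    A = np.array([[0.8, -0.6],
                  [0.6,  0.8]])

    w, V = np.linalg.eig(A)             # eigenvalues in w, eigenvectors in the columns of V
    for lam, v in zip(w, V.T):
        v = v / v[1]                    # rescale so that the second entry is 1
        print(lam, v, np.allclose(A @ v, lam*v))   # A v = lambda v should print True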

Now I’ll solve another example on complex eigenvalues and eigenvectors of a matrix.


Example 2

According to Kreyszig (2005) “Find the eigenvalues and eigenvectors of the following matrices. 

    \[ \begin{pmatrix} \cos \theta&-\sin \theta\\ \sin \theta &\cos \theta \end{pmatrix}. \]

”

Solution

Here the given matrix is 

    \[ \begin{pmatrix} \cos \theta&-\sin \theta\\ \sin \theta &\cos \theta \end{pmatrix}. \]

First of all, I’ll give it a name, say, A.

Therefore it will be

    \[ A = \begin{pmatrix} \cos \theta&-\sin \theta\\ \sin \theta &\cos \theta \end{pmatrix}. \]

Now I’ll start with the eigenvalues of the matrix A.



Step 1

In the beginning comes the characteristic equation of the matrix A.

So it will be

    \[|A - \lambda I| = 0.\]

Here \lambda is the eigenvalue of the matrix A.

Thus |A - \lambda I| will be 

    \[ |A - \lambda I| = \left|\begin{array}{cc} \cos \theta - \lambda&-\sin \theta\\ \sin \theta &\cos \theta - \lambda \end{array}\right|. \]

Next, I’ll evaluate the determinant.

So it will be

    \[|A - \lambda I| = (\cos \theta - \lambda)(\cos \theta - \lambda) - (-\sin \theta)(\sin \theta).\]

Now this gives

    \begin{eqnarray*} |A - \lambda I| &=& (\cos \theta - \lambda)^2 + \sin^2 \theta\\&=& \cos^2 \theta + \lambda^2 - 2 \cos \theta \lambda + \sin^2 \theta\\&=& \lambda^2 - 2 \cos \theta \lambda + (\cos^2 \theta + \sin^2 \theta)\\&=& \lambda^2 - 2 \cos \theta \lambda + 1.\end{eqnarray*}

Since |A - \lambda I| = 0, this means \lambda^2 - 2 \cos \theta \lambda +1 = 0.

Now I’ll solve this equation to get the values of \lambda.

Thus it will be

    \[\lambda = \frac{2 \cos \theta \pm \sqrt{(2 \cos \theta )^2 - 4(1)(1)}}{2}.\]

So I’ll simplify it to get the values of \lambda as

    \begin{eqnarray*} \lambda &=& \frac{2 \cos \theta \pm \sqrt{4\cos^2 \theta - 4}}{2}\\&=& \frac{2 \cos \theta \pm 2\sqrt{-\sin^2 \theta}}{2}\\&=&  \cos \theta \pm \sin \theta j.\end{eqnarray*}

Now I have two values of \lambda – one is \lambda_1 = \cos \theta + \sin \theta j and the other one is \lambda_2 = \cos \theta - \sin \theta j.

So, provided \sin \theta \neq 0, the two eigenvalues are complex conjugates of each other. (If \sin \theta = 0, then A = \pm I and the eigenvalues are real.)

Hence these are the complex eigenvalues of a matrix with real entries.
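If you want to double-check this result with a computer algebra system, here is a small SymPy sketch (my own check, not part of the book’s solution).

    import sympy as sp

    theta = sp.symbols('theta', real=True)
    A = sp.Matrix([[sp.cos(theta), -sp.sin(theta)],
                   [sp.sin(theta),  sp.cos(theta)]])

    # eigenvals() returns a dictionary {eigenvalue: multiplicity};
    # the two roots of lambda^2 - 2*cos(theta)*lambda + 1 = 0 are
    # cos(theta) +/- sqrt(cos(theta)^2 - 1), i.e. cos(theta) +/- j*sin(theta)
    for lam, mult in A.eigenvals().items():
        print(sp.simplify(lam), mult)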

Now I’ll find out the eigenvectors corresponding to each eigenvalue.

Step 2

First of all, I’ll get the eigenvector corresponding to \lambda_1 = \cos \theta + \sin \theta j.

For \lambda_1 = \cos \theta + \sin \theta j, the matrix (A - \lambda I) is 

    \[ A - \lambda I = \begin{pmatrix} \cos \theta - \cos \theta - \sin \theta j &-\sin \theta\\ \sin \theta &\cos \theta - \cos \theta - \sin \theta j\\ \end{pmatrix}. \]

This means 

    \[ A - \lambda I = \begin{pmatrix} - \sin \theta j &-\sin \theta\\ \sin \theta & - \sin \theta j \end{pmatrix}. \]

Suppose [x_1, x_2]^T is the corresponding eigenvector for the eigenvalue \lambda = \cos \theta + \sin \theta j.

Thus I can say 

    \[ \begin{pmatrix} - \sin \theta j &-\sin \theta\\ \sin \theta & - \sin \theta j \end{pmatrix}\begin{pmatrix} x_1\\ x_2 \end{pmatrix} = \begin{pmatrix} 0\\ 0 \end{pmatrix}. \]

This means

    \[- \sin \theta j x_1 - \sin \theta x_2 = 0\]

and

    \[\sin \theta x_1 - \sin \theta j x_2 = 0.\]

So I get two different equations as

(5)   \begin{equation*}j x_1 +  x_2 = 0\end{equation*}

and 

(6)   \begin{equation*}  x_1 -j x_2 = 0.\end{equation*}

From equation (6), I can say

    \[x_1 = j x_2.\]

Thus the eigenvector will be  

    \[ \begin{pmatrix} x_1\\ x_2 \end{pmatrix} = \begin{pmatrix} j x_2\\ x_2 \end{pmatrix} = x_2 \begin{pmatrix} j\\ 1 \end{pmatrix} . \]

Therefore I can say for \lambda = \cos \theta + \sin \theta j, the corresponding eigenvector is [j, 1]^T.
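You can also verify this eigenvector symbolically. The following SymPy sketch (again my own check) confirms that A [j, 1]^T equals \lambda_1 [j, 1]^T for every \theta.

    import sympy as sp

    theta = sp.symbols('theta', real=True)
    A = sp.Matrix([[sp.cos(theta), -sp.sin(theta)],
                   [sp.sin(theta),  sp.cos(theta)]])

    lam1 = sp.cos(theta) + sp.I*sp.sin(theta)
    v1   = sp.Matrix([sp.I, 1])

    # A*v1 - lam1*v1 should simplify to the zero vector
    print((A*v1 - lam1*v1).applyfunc(sp.simplify))   # expect Matrix([[0], [0]])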

Now I’ll find out the eigenvector corresponding to \lambda_2 = \cos \theta - \sin \theta j.

Step 3

For \lambda_2 = \cos \theta - \sin \theta j, the matrix (A - \lambda I) is 

    \[ A - \lambda I = \begin{pmatrix} \cos \theta - \cos \theta + \sin \theta j &-\sin \theta\\ \sin \theta &\cos \theta - \cos \theta + \sin \theta j\\ \end{pmatrix}. \]

This means 

    \[ A - \lambda I = \begin{pmatrix} \sin \theta j &-\sin \theta\\ \sin \theta & \sin \theta j \end{pmatrix}. \]

Suppose [x_1, x_2]^T is the corresponding eigenvector for the eigenvalue \lambda = \cos \theta - \sin \theta j.

Thus I can say 

    \[ \begin{pmatrix} \sin \theta j &-\sin \theta\\ \sin \theta & \sin \theta j \end{pmatrix}\begin{pmatrix} x_1\\ x_2 \end{pmatrix} = \begin{pmatrix} 0\\ 0 \end{pmatrix}. \]

This means

    \[\sin \theta j x_1 - \sin \theta x_2 = 0\]

and

    \[\sin \theta x_1 + \sin \theta j x_2 = 0.\]

So I get two different equations as

(7)   \begin{equation*}j x_1 -  x_2 = 0\end{equation*}

and 

(8)   \begin{equation*}  x_1 + j x_2 = 0.\end{equation*}

From equation (8), I can say

    \[x_1 = - j x_2.\]

Thus the eigenvector will be  

    \[ \begin{pmatrix} x_1\\ x_2 \end{pmatrix} = \begin{pmatrix} - j x_2\\ x_2 \end{pmatrix} = x_2 \begin{pmatrix} - j\\ 1 \end{pmatrix} . \]

Therefore I can say for \lambda = \cos \theta - \sin \theta j, the corresponding eigenvector is [- j, 1]^T.

Hence I can conclude that the eigenvalues of the matrix are \cos \theta \pm \sin \theta j and the corresponding eigenvectors are [j, 1]^T, [-j, 1]^T.

This is the answer to this example.
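Finally, here is a quick numerical check of this second example: a NumPy sketch of my own, using one arbitrary sample angle \theta = \pi/6.

    import numpy as np

    theta = np.pi / 6                   # an arbitrary sample angle
    A = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])

    w, V = np.linalg.eig(A)
    print(w)                            # expect cos(theta) +/- j*sin(theta), about 0.866 +/- 0.5j
    for lam, v in zip(w, V.T):
        v = v / v[1]                    # rescale so that the second entry is 1
        print(v, np.allclose(A @ v, lam*v))   # expect [j, 1] and [-j, 1]; the check prints True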


Dear friends, this is the end of today’s post on complex eigenvalues and eigenvectors of a matrix. Thank you very much for reading it. Please let me know how you feel about it. Soon I will be back again with a new post. Till then, bye, bye!!

Kreyszig, E. (2005): Advanced Engineering Mathematics: International Edition, John Wiley & Sons, 9th Edition, 29th December 2005, Chapter 2, Second-order linear ODEs, p. 338, Problem set 2.2, Q. 1 (Example 7), Q. 2 (Example 10).
