
Introduction to Differential Equations

Section 2.3 Higher order linear ODEs

We briefly study higher order equations. Equations appearing in applications tend to be second order. Higher order equations do appear from time to time, but generally the world around us is “second order.”
The basic results about linear ODEs of higher order are essentially the same as for second order equations, with 2 replaced by \(n\text{.}\) The important concept of linear independence is somewhat more complicated when more than two functions are involved. For higher order constant coefficient ODEs, the methods developed are also somewhat harder to apply, but we will not dwell on these complications. It is also possible to use the methods for systems of linear equations from Chapter 7 to solve higher order constant coefficient equations.
Let us start with a general homogeneous linear equation
\begin{equation} y^{(n)} + p_{n-1}(x)y^{(n-1)} + \cdots + p_1(x) y' + p_0(x) y = 0 .\tag{2.4} \end{equation}

Subsection 2.3.1 Linear independence

We briefly saw the idea of two functions being linearly independent when talking about the theory of second order ODEs in Section 2.1. Now that we are talking about \(n\)th order equations, we need a way to show that \(n\) functions are linearly independent. As you can imagine, there are strong parallels to linear algebra, but we also get something unique to ODEs called the Wronskian, which serves as a test for linear independence. Note: while the Wronskian was covered by Bazett in the videos of this section, the text of this section largely avoids this topic beyond a brief mention.
When we had two functions \(y_1\) and \(y_2\text{,}\) we said they were linearly independent if one was not a multiple of the other. The same idea holds for \(n\) functions. In this case, the condition is easier to state as follows. The functions \(y_1\text{,}\) \(y_2\text{,}\) ..., \(y_n\) are linearly independent if the equation
\begin{equation*} c_1 y_1 + c_2 y_2 + \cdots + c_n y_n = 0 \end{equation*}
has only the trivial solution \(c_1 = c_2 = \cdots = c_n = 0\text{,}\) where the equation must hold for all \(x\text{.}\) If the equation holds with some constants not all zero, say \(c_1 \not= 0\text{,}\) then we can solve for \(y_1\) as a linear combination of the others. If the functions are not linearly independent, they are linearly dependent.
There are several ways to show that a set of functions is linearly independent. The first method we’ll present uses a tool called the Wronskian, which is rather unintuitive but computationally simple.
\begin{equation} W(y_1,\ldots,y_n)= \begin{vmatrix} y_1 & y_2 & \cdots & y_n \\ y_1' & y_2' & \cdots & y_n' \\ \vdots & \vdots & \ddots & \vdots \\ y_1^{(n-1)} & y_2^{(n-1)} & \cdots & y_n^{(n-1)} \end{vmatrix} .\tag{2.5} \end{equation}
If the functions \(y_1,\ldots,y_n\) were linearly dependent (and \(n-1\) times differentiable), then the columns of the matrix in the Wronskian would be linearly dependent as well, and thus the Wronskian would be zero. Hence, to show that a set of functions is linearly independent on an interval \(I\text{,}\) it suffices to find a single point in \(I\) where the Wronskian is nonzero. Note that the converse is not true: \(W=0\) does not imply linear dependence.
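The Wronskian test is easy to automate with a computer algebra system. Here is a minimal sketch using Python's sympy library (one possible tool; any CAS would do):

```python
import sympy as sp

x = sp.symbols('x')

def wronskian_det(funcs):
    """Determinant of the matrix whose i-th row holds the i-th derivatives."""
    n = len(funcs)
    M = sp.Matrix(n, n, lambda i, j: sp.diff(funcs[j], x, i))
    return sp.simplify(M.det())

# An independent pair: the Wronskian is e^{3x}, never zero.
w1 = wronskian_det([sp.exp(x), sp.exp(2*x)])

# A dependent triple (cosh x is a combination of e^x and e^{-x}):
# the Wronskian vanishes identically.
w2 = wronskian_det([sp.exp(x), sp.exp(-x), sp.cosh(x)])
```

Remember that a nonzero Wronskian at a single point proves independence on an interval, while \(W=0\) everywhere proves nothing by itself.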

Example 2.3.1.

\(e^x\) and \(e^{2x}\) are linearly independent as
\begin{equation*} W(e^x, e^{2x})=\begin{vmatrix}e^x & e^{2x}\\ e^x & 2e^{2x}\end{vmatrix}=e^{3x}. \end{equation*}
As \(e^{3x}\neq 0\) for any \(x\text{,}\) these two functions are linearly independent everywhere.

Example 2.3.2.

Show that \(e^x, e^{2x}, e^{3x}\) are linearly independent.
Let us write down
\begin{equation*} c_1 e^x + c_2 e^{2x} + c_3 e^{3x} = 0. \end{equation*}
We use rules of exponentials and write \(z = e^x\text{.}\) Hence \(z^2 = e^{2x}\) and \(z^3 = e^{3x}\text{.}\) Then we have
\begin{equation*} c_1 z + c_2 z^2 + c_3 z^3 = 0. \end{equation*}
The left-hand side is a third degree polynomial in \(z\text{.}\) It is either identically zero, or it has at most 3 zeros. As \(z = e^x\) takes every positive value, the equation holds for infinitely many values of \(z\text{.}\) Therefore, the polynomial is identically zero, \(c_1 = c_2 = c_3 = 0\text{,}\) and the functions are linearly independent.
Let us try another way. As before we write
\begin{equation*} c_1 e^x + c_2 e^{2x} + c_3 e^{3x} = 0. \end{equation*}
This equation has to hold for all \(x\text{.}\) We divide through by \(e^{3x}\) to get
\begin{equation*} c_1 e^{-2x} + c_2 e^{-x} + c_3 = 0. \end{equation*}
As the equation is true for all \(x\text{,}\) let \(x \to \infty\text{.}\) After taking the limit we see that \(c_3 = 0\text{.}\) Hence our equation becomes
\begin{equation*} c_1 e^x + c_2 e^{2x} = 0. \end{equation*}
Rinse, repeat!
How about yet another way. We again write
\begin{equation*} c_1 e^x + c_2 e^{2x} + c_3 e^{3x} = 0. \end{equation*}
We can evaluate the equation and its derivatives at different values of \(x\) to obtain equations for \(c_1\text{,}\) \(c_2\text{,}\) and \(c_3\text{.}\) Let us first divide by \(e^{x}\) for simplicity.
\begin{equation*} c_1 + c_2 e^{x} + c_3 e^{2x} = 0. \end{equation*}
We set \(x=0\) to get the equation \(c_1 + c_2 + c_3 = 0\text{.}\) Now differentiate both sides
\begin{equation*} c_2 e^{x} + 2 c_3 e^{2x} = 0 . \end{equation*}
We set \(x=0\) to get \(c_2 + 2c_3 = 0\text{.}\) We divide by \(e^x\) again and differentiate to get \(2 c_3 e^{x} = 0\text{.}\) It is clear that \(c_3\) is zero. Then \(c_2\) must be zero as \(c_2 = -2c_3\text{,}\) and \(c_1\) must be zero because \(c_1 + c_2 + c_3 = 0\text{.}\)
There is no one best way to do it. All of these methods are perfectly valid. The important thing is to understand why the functions are linearly independent.
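The third method, evaluating the combination and its derivatives at a point, amounts to solving a linear system; a sketch in sympy (assuming that library is available):

```python
import sympy as sp

x, c1, c2, c3 = sp.symbols('x c1 c2 c3')
combo = c1*sp.exp(x) + c2*sp.exp(2*x) + c3*sp.exp(3*x)

# If the combination vanishes identically, so do all its derivatives;
# evaluating the first three (orders 0, 1, 2) at x = 0 gives three
# linear equations for c1, c2, c3.
eqs = [sp.diff(combo, x, k).subs(x, 0) for k in range(3)]
sol = sp.linsolve(eqs, [c1, c2, c3])
# Only the trivial solution c1 = c2 = c3 = 0 remains, so the
# functions are linearly independent.
```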

Example 2.3.3.

On the other hand, the functions \(e^x\text{,}\) \(e^{-x}\text{,}\) and \(\cosh x\) are linearly dependent. Simply apply the definition of the hyperbolic cosine:
\begin{equation*} \cosh x = \frac{e^x + e^{-x}}{2} \qquad \text{or} \qquad 2 \cosh x - e^x - e^{-x} = 0. \end{equation*}
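A computer algebra system confirms the dependence at once; in sympy, the nontrivial combination simplifies to zero:

```python
import sympy as sp

x = sp.symbols('x')
# The combination 2*cosh(x) - e^x - e^{-x} has nonzero coefficients
# yet is identically zero, so the three functions are linearly dependent.
combo = 2*sp.cosh(x) - sp.exp(x) - sp.exp(-x)
zero = sp.simplify(combo)
```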

Subsection 2.3.2 Theory of Higher Order ODEs

In this video we put the theory of higher order ODEs on firmer ground. This theory is what really makes the method of constant coefficients work in the higher order case, and these theorems also apply to more general linear ODEs.
In particular, superposition holds: a linear combination of solutions to (2.4) is also a solution to (2.4). We also have an existence and uniqueness theorem for nonhomogeneous linear equations.

Subsection 2.3.3 Constant coefficient higher order ODEs

Let’s consider the generalization of the constant coefficient method to higher order. A few things change. Our characteristic equation is now an \(n\)th degree polynomial, which we can still factor, although this can be harder than just using the quadratic formula, and we get a range of cases analogous to those for second order.
When we have a higher order constant coefficient homogeneous linear equation, the song and dance is exactly the same as it was for second order. We just need to find more solutions. If the equation is \(n^{\text{th}}\) order, we need to find \(n\) linearly independent solutions. It is best seen by example.

Example 2.3.4.

Find the general solution to
\begin{equation} y''' - 3 y'' - y' + 3y = 0 .\tag{2.6} \end{equation}
Try: \(y = e^{rx}\text{.}\) We plug in and get
\begin{equation*} \underbrace{r^3 e^{rx}}_{y'''} - 3 \underbrace{r^2 e^{rx}}_{y''} - \underbrace{r e^{rx}}_{y'} + 3 \underbrace{e^{rx}}_{y} = 0 . \end{equation*}
We divide through by \(e^{rx}\text{.}\) Then
\begin{equation*} r^3 - 3 r^2 - r + 3 = 0 . \end{equation*}
The trick now is to find the roots. There are formulas for the roots of degree 3 and 4 polynomials, but they are very complicated. There is no such formula in terms of radicals for polynomials of degree 5 and higher. That does not mean that the roots do not exist. There are always \(n\) roots for an \(n^{\text{th}}\) degree polynomial. They may be repeated and they may be complex. Computers are pretty good at finding roots approximately for reasonably sized polynomials.
A good place to start is to plot the polynomial and check where it is zero. We can also simply try plugging in. We just start plugging in numbers \(r=-2,-1,0,1,2,\ldots\) and see if we get a hit (we can also try complex numbers). Even if we do not get a hit, we may get an indication of where the root is. For example, we plug \(r=-2\) into our polynomial and get \(-15\text{;}\) we plug in \(r=0\) and get 3. That means there is a root between \(r=-2\) and \(r=0\text{,}\) because the sign changed. If we find one root, say \(r_1\text{,}\) then we know \((r-r_1)\) is a factor of our polynomial. Polynomial long division can then be used.
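This hunt for roots is easy to mirror in code; a sketch in sympy, trying small integers and then factoring once a root is found:

```python
import sympy as sp

r = sp.symbols('r')
p = r**3 - 3*r**2 - r + 3

# Plug in small integers and watch for hits and for sign changes
# (a sign change between two values brackets a root).
values = {k: p.subs(r, k) for k in range(-2, 4)}
# values[-2] = -15 and values[0] = 3, so a root lies in (-2, 0);
# in fact values[-1] = 0, and values[1] and values[3] are hits too.

# With a root in hand, (r - root) divides the polynomial;
# factor() carries out the polynomial division for us.
factored = sp.factor(p)   # (r - 3)*(r - 1)*(r + 1)
```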
A good strategy is to begin with \(r=0\text{,}\) \(1\text{,}\) or \(-1\text{.}\) These are easy to compute. Our polynomial has two such roots, \(r_1 = -1\) and \(r_2 = 1\text{.}\) There should be 3 roots and the last root is reasonably easy to find. The constant term in a monic polynomial (one whose leading coefficient, here of \(r^3\text{,}\) is 1) such as this is the product of the negations of all the roots, because \(r^3 - 3 r^2 - r + 3 = (r-r_1)(r-r_2)(r-r_3)\text{.}\) So
\begin{equation*} 3 = (-r_1)(-r_2)(-r_3) = (1)(-1)(-r_3) = r_3 . \end{equation*}
You should check that \(r_3 = 3\) really is a root. Hence \(e^{-x}\text{,}\) \(e^{x}\) and \(e^{3x}\) are solutions to (2.6). They are linearly independent as can easily be checked, and there are 3 of them, which happens to be exactly the number we need. So the general solution is
\begin{equation*} y = C_1 e^{-x} + C_2 e^{x} + C_3 e^{3x} . \end{equation*}
Suppose we were given some initial conditions \(y(0) = 1\text{,}\) \(y'(0) = 2\text{,}\) and \(y''(0) = 3\text{.}\) Then
\begin{equation*} \begin{aligned} 1 = y(0) & = C_1 + C_2 + C_3 , \\ 2 = y'(0) & = -C_1 + C_2 + 3C_3 , \\ 3 = y''(0) & = C_1 + C_2 + 9C_3 . \end{aligned} \end{equation*}
It is possible to find the solution by high school algebra, but it would be a pain. The sensible way to solve a system of equations such as this is to use matrix algebra, see Section 7.2 or Appendix A. For now we note that the solution is \(C_1 = -\nicefrac{1}{4}\text{,}\) \(C_2 = 1\text{,}\) and \(C_3 = \nicefrac{1}{4}\text{.}\) The specific solution to the ODE is
\begin{equation*} y = \frac{-1}{4}\, e^{-x} + e^x + \frac{1}{4}\, e^{3x} . \end{equation*}
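The linear system for the constants is exactly the sort of computation that matrix methods systematize; a sympy sketch of this particular system:

```python
import sympy as sp

C1, C2, C3 = sp.symbols('C1 C2 C3')

# y = C1 e^{-x} + C2 e^x + C3 e^{3x} with y(0)=1, y'(0)=2, y''(0)=3.
eqs = [
    sp.Eq(C1 + C2 + C3, 1),     # y(0)   = 1
    sp.Eq(-C1 + C2 + 3*C3, 2),  # y'(0)  = 2
    sp.Eq(C1 + C2 + 9*C3, 3),   # y''(0) = 3
]
sol = sp.solve(eqs, [C1, C2, C3])   # {C1: -1/4, C2: 1, C3: 1/4}
```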
Next, suppose that we have real roots, but they are repeated. Let us say we have a root \(r\) repeated \(k\) times. In the spirit of the second order solution, and for the same reasons, we have the solutions
\begin{equation*} e^{rx}, \quad xe^{rx}, \quad x^2 e^{rx}, \quad \ldots, \quad x^{k-1} e^{rx} . \end{equation*}
We take a linear combination of these solutions to find the general solution.

Example 2.3.5.

\begin{equation*} y^{(4)} - 3 y''' + 3 y'' - y' = 0 . \end{equation*}
We note that the characteristic equation is
\begin{equation*} r^4 - 3r^3 + 3r^2 -r = 0 . \end{equation*}
By inspection we note that \(r^4 - 3r^3 + 3r^2 -r = r{(r-1)}^3\text{.}\) Hence the roots given with multiplicity are \(r = 0, 1, 1, 1\text{.}\) Thus the general solution is
\begin{equation*} y = \underbrace{(C_1 + C_2 x + C_3 x^2)\, e^x}_{\text{terms coming from } r=1} + \underbrace{C_4}_{\text{from } r=0} . \end{equation*}
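A CAS reports roots together with their multiplicities, which is exactly the bookkeeping needed here; for instance in sympy:

```python
import sympy as sp

r = sp.symbols('r')
p = r**4 - 3*r**3 + 3*r**2 - r

# roots() returns each root with its multiplicity:
# r = 0 once and r = 1 three times, matching the factorization r(r-1)^3.
mults = sp.roots(p, r)   # {0: 1, 1: 3}
```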
The case of complex roots is similar to second order equations. Complex roots always come in pairs \(r = \alpha \pm i \beta\text{.}\) Suppose we have two such complex roots, each repeated \(k\) times. The corresponding solution is
\begin{equation*} ( C_0 + C_1 x + \cdots + C_{k-1} x^{k-1} ) \, e^{\alpha x} \cos (\beta x) + ( D_0 + D_1 x + \cdots + D_{k-1} x^{k-1} ) \, e^{\alpha x} \sin (\beta x) . \end{equation*}
Here \(C_0\text{,}\) ..., \(C_{k-1}\text{,}\) \(D_0\text{,}\) ..., \(D_{k-1}\) are arbitrary constants.

Example 2.3.6.

\begin{equation*} y^{(4)} - 4 y''' + 8 y'' - 8 y' + 4y = 0 . \end{equation*}
The characteristic equation is
\begin{equation*} \begin{aligned} r^4 - 4 r^3 + 8 r^2 - 8 r + 4 & = 0 , \\ {(r^2-2r+2)}^2 & = 0 , \\ {\bigl({(r-1)}^2+1\bigr)}^2 & = 0 . \end{aligned} \end{equation*}
Hence the roots are \(1 \pm i\text{,}\) both with multiplicity 2. Hence the general solution to the ODE is
\begin{equation*} y = ( C_1 + C_2 x ) \, e^{x} \cos x + ( C_3 + C_4 x ) \, e^{x} \sin x . \end{equation*}
The way we solved the characteristic equation above is really by guessing or by inspection. It is not so easy in general. We could also have asked a computer or an advanced calculator for the roots.
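Asking a computer for the roots, as suggested, might look like this in sympy:

```python
import sympy as sp

r = sp.symbols('r')
p = r**4 - 4*r**3 + 8*r**2 - 8*r + 4

# factor() recovers the square of a quadratic, and roots() then
# reports the conjugate pair 1 +/- i, each with multiplicity 2.
factored = sp.factor(p)   # (r**2 - 2*r + 2)**2
mults = sp.roots(p, r)    # {1 - I: 2, 1 + I: 2}
```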

Subsection 2.3.4 Exercises

Exercise 2.3.1.

Find the general solution for \(y''' - y'' + y' - y = 0\text{.}\)
The characteristic equation is
\begin{equation*} \begin{aligned} r^3-r^2+r-1 &=0 \\ (r^2+1)(r-1) &=0\quad \rightarrow r_1=1,\ r_2=i,\ r_3=-i \end{aligned} \end{equation*}
Therefore the general solution is
\begin{equation*} \begin{aligned} y=C_1e^x+C_2\cos x +C_3\sin x \end{aligned} \end{equation*}
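One way to double-check such an answer is to substitute it back into the equation; sympy's dsolve and checkodesol automate this:

```python
import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')

ode = sp.Eq(y(x).diff(x, 3) - y(x).diff(x, 2) + y(x).diff(x) - y(x), 0)
sol = sp.dsolve(ode, y(x))
# sol is y(x) = C1*e^x + C2*sin(x) + C3*cos(x), up to relabeling of
# the constants; checkodesol verifies that it satisfies the ODE.
ok, residual = sp.checkodesol(ode, sol)
```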

Exercise 2.3.2.

Find the general solution for \(y^{(4)} - 5 y''' + 6 y'' = 0\text{.}\)

Exercise 2.3.3.

Find the general solution for \(y''' + 2 y'' + 2 y' = 0\text{.}\)
\(y=C_1+e^{-x}\left( C_2\cos x+C_3 \sin x\right)\)

Exercise 2.3.4.

Find the general solution of \(y^{(5)}-y^{(4)}=0\text{.}\)
\(y=C_1 e^x +C_2 x^3 + C_3 x^2 +C_4 x + C_5\)

Exercise 2.3.5.

Solve \(1001y'''+3.2y''+\pi y'-\sqrt{4} y = 0\text{,}\) \(y(0)=0\text{,}\) \(y'(0) = 0\text{,}\) \(y''(0) = 0\text{.}\)

Exercise 2.3.6.

Suppose the characteristic equation for an ODE is \({(r-1)}^2{(r-2)}^2 = 0\text{.}\)
  1. Find such a differential equation.
  2. Find its general solution.
a) Expanding the characteristic equation
\begin{equation*} \begin{aligned} (r^2-2r+1)(r^2-4r+4) &=0 \\ r^4-6r^3+13r^2-12r+4 &= 0 \end{aligned} \end{equation*}
Therefore an ODE with such a characteristic equation is
\begin{equation*} \begin{aligned} y^{(4)}-6y'''+13y''-12y'+4y=0 \end{aligned} \end{equation*}

b) Since \(r_1=r_2=1\) and \(r_3=r_4=2\) the general solution is
\begin{equation*} \begin{aligned} y=(C_1+C_2x)e^x+(C_3+C_4x)e^{2x} \end{aligned} \end{equation*}

Exercise 2.3.7.

Suppose that the characteristic equation of a third order differential equation has roots \(\pm 2i\) and 3.
  1. What is the characteristic equation?
  2. Find the corresponding differential equation.
  3. Find the general solution.
a) \(r^3-3r^2+4r-12 = 0\)     b) \(y'''-3y''+4y'-12y = 0\)     c) \(y = C_1 e^{3x} + C_2 \sin(2x) + C_3 \cos(2x)\)

Exercise 2.3.8.

Suppose that a fourth order equation has a solution \(y = 2 e^{4x} x \cos x\text{.}\)
  1. Find such an equation.
  2. Find the general solution to the above equation.
  3. Find the initial conditions that the given solution satisfies at \(x=0\text{.}\) Note: You might like to use a computer algebra system like Wolframalpha to take the derivatives.
a) The appearance of \(\cos x\) together with \(e^{4x}\) means there is a pair of complex roots \(r=4\pm i\text{.}\) Furthermore, the appearance of the factor \(x\) means that these roots are repeated, so
\begin{equation*} \begin{aligned} r_1=r_2 &=4+i \\ r_3=r_4 &=4-i \\ \end{aligned} \end{equation*}
The resulting characteristic equation after expanding is
\begin{equation*} \begin{aligned} r^4-16r^3+98r^2-272r+289=0 \end{aligned} \end{equation*}
Giving an ODE
\begin{equation*} \begin{aligned} y^{(4)}-16y'''+98y''-272y'+289y=0 \end{aligned} \end{equation*}

b) \(y=(C_1+C_2x)e^{4x}\cos x+(C_3+C_4x)e^{4x}\sin x\)
c) Try \(y(0)=2e^0(0)\cos(0)=0\text{,}\) so the first initial condition is \(y(0)=0\text{.}\) Differentiating
\begin{equation*} \begin{aligned} y' &=2e^{4x}(-x\sin x+\cos x)+8e^{4x}x\cos x \\ y'(0) &= 2 \end{aligned} \end{equation*}
Differentiating again
\begin{equation*} \begin{aligned} y'' &= 2e^{4x}\left[(15x+8)\cos x-2(4x+1)\sin x\right] \\ y''(0) &= 16 \end{aligned} \end{equation*}
Differentiating yet again
\begin{equation*} \begin{aligned} y''' &= 2e^{4x}\left[(52x+45)\cos x-(47x+24)\sin x\right] \\ y'''(0) &= 90 \end{aligned} \end{equation*}
So the initial conditions are \(y(0)=0,\ y'(0)=2,\ y''(0)=16,\ y'''(0)=90\text{.}\)
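As the exercise suggests, a computer algebra system handles these derivatives painlessly; in sympy:

```python
import sympy as sp

x = sp.symbols('x')
y = 2*sp.exp(4*x)*x*sp.cos(x)

# Evaluate y and its first three derivatives at x = 0.
ics = [sp.diff(y, x, k).subs(x, 0) for k in range(4)]   # [0, 2, 16, 90]
```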

Exercise 2.3.9.

Find an equation such that \(y=\cos(x)\text{,}\) \(y=\sin(x)\text{,}\) \(y=e^x\) are solutions.

Exercise 2.3.10.

Find an equation such that \(y=xe^{-2x}\sin(3x)\) is a solution.

Exercise 2.3.11.

Let \(f(x) = e^x - \cos x\text{,}\) \(g(x) = e^x + \cos x\text{,}\) and \(h(x) = \cos x\text{.}\) Are \(f(x)\text{,}\) \(g(x)\text{,}\) and \(h(x)\) linearly independent? If so, show it, if not, find a linear combination that works.
We want to write
\begin{equation*} \begin{aligned} c_1f(x)+c_2g(x)+c_3h(x)=0 \end{aligned} \end{equation*}
and determine whether it is satisfied by some combination of coefficients \(c_i\) not all zero (linearly dependent), or whether \(c_i=0\) is the only solution (linearly independent). This equation must hold for every choice of \(x\text{,}\) so first we set \(x=0\)
\begin{equation*} \begin{aligned} 2c_2+c_3 &=0\quad \rightarrow c_3=-2c_2 \\ c_1f(x)+c_2(g(x)-2h(x))&=0 \end{aligned} \end{equation*}
This equation must again hold for every \(x\text{;}\) we choose \(x=\pi/2\)
\begin{equation*} \begin{aligned} c_1e^{\pi/2}+c_2(e^{\pi/2}-0) &=0\quad \rightarrow c_2=-c_1 \\ c_1(f(x)-g(x)+2h(x)) &=0 \\ c_1(e^x-\cos x-e^x-\cos x+2\cos x)&=0 \end{aligned} \end{equation*}
So \(c_1\) is a free parameter, therefore \(f(x), g(x)\) and \(h(x)\) are linearly dependent. Choosing \(c_1=1\quad \rightarrow c_2=-1,\ c_3=2\) we get the combination
\begin{equation*} \begin{aligned} (e^x-\cos x)-(e^x+\cos x)+2\cos x=0 \end{aligned} \end{equation*}

Exercise 2.3.12.

Let \(f(x) = 0\text{,}\) \(g(x) = \cos x\text{,}\) and \(h(x) = \sin x\text{.}\) Are \(f(x)\text{,}\) \(g(x)\text{,}\) and \(h(x)\) linearly independent? If so, show it, if not, find a linear combination that works.
No, choosing \(c_1=1,\ c_2=c_3=0\) we get \(1(0)+0(\cos x)+0(\sin x)=0\text{.}\)

Exercise 2.3.13.

Are \(x\text{,}\) \(x^2\text{,}\) and \(x^4\) linearly independent? If so, show it, if not, find a linear combination that works.
We write
\begin{equation*} \begin{aligned} c_1x+c_2x^2+c_3x^4=0 \end{aligned} \end{equation*}
We differentiate a few times
\begin{equation*} \begin{aligned} c_1+2c_2x+4c_3x^3 &=0 \\ 2c_2+12c_3x^2 &=0 \\ 24c_3x &=0 \\ 24c_3 &=0\quad \rightarrow c_3=0 \end{aligned} \end{equation*}
Plugging \(c_3=0\) back in and repeating the process, we find that \(c_1=c_2=0\text{.}\) So the functions are linearly independent.
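Alternatively, the Wronskian settles this exercise in one determinant; a sympy sketch:

```python
import sympy as sp

x = sp.symbols('x')
funcs = [x, x**2, x**4]

# Rows hold the functions and their first two derivatives.
M = sp.Matrix(3, 3, lambda i, j: sp.diff(funcs[j], x, i))
W = sp.expand(M.det())   # 6*x**4
```

Since \(6x^4\) is nonzero away from \(x = 0\text{,}\) the three functions are linearly independent.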

Exercise 2.3.14.

Are \(e^x\text{,}\) \(xe^x\text{,}\) and \(x^2e^x\) linearly independent? If so, show it, if not, find a linear combination that works.
Yes. Divide the equation by \(e^x\text{,}\) then differentiate twice to find \(c_3=0\) and consequently \(c_1=c_2=0\text{.}\)

Exercise 2.3.15.

Are \(e^{x}\text{,}\) \(e^{x+1}\text{,}\) \(e^{2x}\text{,}\) \(\sin(x)\) linearly independent? If so, show it, if not find a linear combination that works.
No. \(e^1 e^x - e^{x+1} = 0\text{.}\)

Exercise 2.3.16.

Are \(\sin(x)\text{,}\) \(x\text{,}\) \(x\sin(x)\) linearly independent? If so, show it, if not find a linear combination that works.
Yes. (Hint: First note that \(\sin(x)\) is bounded. Then note that \(x\) and \(x\sin(x)\) cannot be multiples of each other.)