Ordinary Differential Equations (8): Nonlinear Systems and Phase Portraits
Chen Kai

Why do populations of foxes and rabbits fluctuate periodically in ecosystems? How does the heart maintain a stable rhythmic beat? How do neurons switch between "resting" and "firing" states? These seemingly unrelated questions all hide the secrets of nonlinear dynamics. When we step from linear systems into the nonlinear world, the face of mathematics undergoes a fundamental change — the superposition principle fails, solutions may not be unique, and small perturbations can cause huge changes. But it is precisely these "complications" that allow nonlinear systems to describe nature's richest and most fascinating phenomena.

From Linear to Nonlinear: A Leap in Thinking

In previous chapters, we systematically studied linear differential equations. Linear systems have a beautiful property: the superposition principle. If $y_1$ and $y_2$ are both solutions, then any linear combination $c_1 y_1 + c_2 y_2$ is also a solution. This property makes linear systems "tameable": we can decompose complex problems into simple parts, solve them individually, and then combine them.

But the real world is rarely linear. Consider a simple example: population growth. If the growth rate $r$ is constant, we get exponential growth $\frac{dx}{dt} = rx$, with solution $x(t) = x_0 e^{rt}$. But this predicts unlimited population growth, which is obviously absurd! The real world has resource limitations, and growth rates decrease as population increases. This leads to the famous logistic equation:

$$\frac{dx}{dt} = rx\left(1 - \frac{x}{K}\right)$$

Here $K$ is the carrying capacity. When $x$ is very small, the equation approximates $\frac{dx}{dt} \approx rx$ (exponential growth); when $x$ approaches $K$, the growth rate approaches zero. This equation is nonlinear: there's an $x^2$ term on the right side.

Characteristics of nonlinear systems:

  1. Superposition principle fails: the sum of two solutions is no longer a solution
  2. Solution behavior can be extremely complex: periodic oscillations, quasi-periodicity, chaos...
  3. Sensitivity to initial conditions: tiny initial differences can lead to completely different results (the seed of chaos, detailed in the next chapter)
  4. Usually no analytical solutions: we must rely on qualitative analysis and numerical methods
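A quick numerical check makes the failure of superposition concrete. The snippet below (a minimal sketch in plain Python; the helper name `logistic_rhs` is mine) evaluates the logistic right-hand side at two states and at their sum:

```python
# Right-hand side of the logistic equation dx/dt = r*x*(1 - x/K)
def logistic_rhs(x, r=1.0, K=10.0):
    return r * x * (1 - x / K)

x1, x2 = 1.0, 2.0
lhs = logistic_rhs(x1) + logistic_rhs(x2)  # sum of the two rates: 0.9 + 1.6 = 2.5
rhs = logistic_rhs(x1 + x2)                # rate at the summed state: 2.1
print(lhs, rhs)  # not equal: superposition fails
```

For a linear right-hand side such as $f(x) = rx$ the two values would coincide; the $x^2$ term is exactly what breaks the equality.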

Phase Space: Visualizing System States

The Concept of State Space

Consider a second-order ordinary differential equation, like a damped spring:

$$m\ddot{x} + b\dot{x} + kx = 0$$

We can introduce the velocity $v = \dot{x}$ and rewrite it as a first-order system:

$$\frac{dx}{dt} = v, \qquad \frac{dv}{dt} = -\frac{k}{m}x - \frac{b}{m}v$$

This is a first-order system of equations for the state vector $(x, v)$. The space of all possible values of $(x, v)$ is called phase space or state space.

Key insight: The state of the system at any moment is completely determined by a single point in phase space. As time evolves, this point moves in phase space, tracing out a trajectory or orbit.

Phase Portrait: A Panoramic View of System Behavior

A phase portrait is a collection of trajectories in phase space. It shows how the system evolves from different initial conditions, and is one of the most powerful tools for understanding nonlinear systems.

Let's use Python to plot the phase portrait of a damped spring:

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.integrate import odeint

def damped_oscillator(state, t, m, b, k):
    """State equations for damped oscillator"""
    x, v = state
    dxdt = v
    dvdt = -(k/m) * x - (b/m) * v
    return [dxdt, dvdt]

# Parameters: mass, damping coefficient, spring constant
m, b, k = 1.0, 0.3, 1.0

# Create phase portraits
fig, axes = plt.subplots(1, 3, figsize=(15, 5))

# Different damping cases
damping_cases = [
    (0.0, 'Undamped (b=0)'),
    (0.3, 'Underdamped (b=0.3)'),
    (2.0, 'Overdamped (b=2.0)')
]

t = np.linspace(0, 30, 1000)

for ax, (b_val, title) in zip(axes, damping_cases):
    # Multiple initial conditions
    initial_conditions = [
        (2, 0), (0, 2), (-2, 0), (0, -2),
        (1, 1), (-1, -1), (1, -1), (-1, 1),
        (3, 0), (0, 3)
    ]

    for x0, v0 in initial_conditions:
        sol = odeint(damped_oscillator, [x0, v0], t, args=(m, b_val, k))
        ax.plot(sol[:, 0], sol[:, 1], 'b-', linewidth=0.8, alpha=0.7)
        ax.plot(x0, v0, 'go', markersize=4)  # Starting point

    # Plot vector field
    x_range = np.linspace(-3.5, 3.5, 15)
    v_range = np.linspace(-3.5, 3.5, 15)
    X, V = np.meshgrid(x_range, v_range)
    dX = V
    dV = -(k/m) * X - (b_val/m) * V

    # Normalize arrows
    M = np.sqrt(dX**2 + dV**2)
    M[M == 0] = 1
    ax.quiver(X, V, dX/M, dV/M, M, cmap='Reds', alpha=0.4)

    ax.set_xlabel('Position x', fontsize=11)
    ax.set_ylabel('Velocity v', fontsize=11)
    ax.set_title(title, fontsize=12, fontweight='bold')
    ax.set_xlim(-4, 4)
    ax.set_ylim(-4, 4)
    ax.grid(True, alpha=0.3)
    ax.set_aspect('equal')
    ax.plot(0, 0, 'ro', markersize=8)  # Origin/equilibrium

plt.tight_layout()
plt.savefig('phase_portraits_damping.png', dpi=150, bbox_inches='tight')
plt.close()
```

Observing the three cases:

  • Undamped ($b = 0$): trajectories are closed ellipses; the system oscillates forever
  • Underdamped (small $b$): trajectories spiral in toward the origin
  • Overdamped (large $b$): trajectories converge directly to the origin without oscillation

Equilibrium Points: The System's Stationary States

Definition of Equilibrium Points

For an autonomous system (a system not explicitly containing time):

$$\frac{d\mathbf{x}}{dt} = \mathbf{f}(\mathbf{x})$$

If $\mathbf{f}(\mathbf{x}^*) = \mathbf{0}$, then $\mathbf{x}^*$ is an equilibrium point or fixed point.

Physical meaning: If the system is at an equilibrium point, it will stay there forever — because all derivatives are zero, there's no driving force for change.

Classification of Equilibrium Points: Linearization Method

Near an equilibrium point $\mathbf{x}^*$, we can approximate the system's behavior using a Taylor expansion. Let $\mathbf{x} = \mathbf{x}^* + \boldsymbol{\xi}$, where $\boldsymbol{\xi}$ is a small deviation:

$$\frac{d\boldsymbol{\xi}}{dt} \approx J\boldsymbol{\xi}$$

Here $J$ is the Jacobian matrix:

$$J_{ij} = \left.\frac{\partial f_i}{\partial x_j}\right|_{\mathbf{x}^*}$$

The properties of equilibrium points are determined by the eigenvalues of the Jacobian matrix. For 2D systems, let the eigenvalues be $\lambda_1, \lambda_2$:

| Eigenvalue Type | Equilibrium Type | Stability |
| --- | --- | --- |
| $\lambda_1, \lambda_2 < 0$ (real) | Stable node | Asymptotically stable |
| $\lambda_1, \lambda_2 > 0$ (real) | Unstable node | Unstable |
| $\lambda_1 < 0 < \lambda_2$ | Saddle point | Unstable |
| $\mathrm{Re}\,\lambda < 0$ (complex) | Stable spiral | Asymptotically stable |
| $\mathrm{Re}\,\lambda > 0$ (complex) | Unstable spiral | Unstable |
| $\mathrm{Re}\,\lambda = 0$ (purely imaginary) | Center | Marginally stable |

Python Implementation: Automatic Classification of Equilibrium Points

```python
import numpy as np

def classify_equilibrium(J):
    """Classify equilibrium point based on Jacobian eigenvalues"""
    eigenvalues = np.linalg.eigvals(J)

    real_parts = np.real(eigenvalues)
    imag_parts = np.imag(eigenvalues)

    # Check for imaginary parts
    has_imaginary = np.any(np.abs(imag_parts) > 1e-10)

    if has_imaginary:
        # Complex eigenvalue case
        if real_parts[0] < -1e-10:
            return "Stable Spiral", eigenvalues
        elif real_parts[0] > 1e-10:
            return "Unstable Spiral", eigenvalues
        else:
            return "Center", eigenvalues
    else:
        # Real eigenvalue case
        if np.all(real_parts < -1e-10):
            return "Stable Node", eigenvalues
        elif np.all(real_parts > 1e-10):
            return "Unstable Node", eigenvalues
        elif real_parts[0] * real_parts[1] < 0:
            return "Saddle Point", eigenvalues
        else:
            return "Degenerate Case", eigenvalues

# Example: analyze different systems
def example_system1(state):
    """Linear system: dx/dt = -x + y, dy/dt = -x - y"""
    x, y = state
    return [-x + y, -x - y]

def jacobian_system1(state):
    """Jacobian matrix (independent of state for linear systems)"""
    return np.array([[-1, 1], [-1, -1]])

J = jacobian_system1([0, 0])
eq_type, eigs = classify_equilibrium(J)
print(f"System 1: {eq_type}")
print(f"Eigenvalues: {eigs}")
```

Classic Case: Lotka-Volterra Predator-Prey Model

Model Background

In 1925, Italian mathematician Vito Volterra studied fish populations in the Adriatic Sea and proposed the famous predator-prey model. Around the same time, American mathematician Alfred Lotka independently derived the same equations.

Imagine a simplified ecosystem with only rabbits (prey) and foxes (predators):

  • Rabbits: Without foxes, rabbits grow exponentially (abundant food)
  • Foxes: Without rabbits, foxes starve (exponential decay)
  • Encounters: Foxes eat rabbits → rabbits decrease, foxes increase

Mathematical Model

Let $x$ be the prey (rabbit) population and $y$ the predator (fox) population:

$$\frac{dx}{dt} = \alpha x - \beta xy, \qquad \frac{dy}{dt} = \delta xy - \gamma y$$

Parameter meanings:

  • $\alpha$: prey natural growth rate
  • $\beta$: predation rate (probability of prey being eaten per encounter)
  • $\gamma$: predator natural death rate
  • $\delta$: predation efficiency (efficiency of converting prey into predator reproduction)

Equilibrium Point Analysis

Setting $\frac{dx}{dt} = 0$ and $\frac{dy}{dt} = 0$:

Equilibrium 1: $(0, 0)$ (extinction of both species)

Equilibrium 2: $(x^*, y^*) = \left(\frac{\gamma}{\delta}, \frac{\alpha}{\beta}\right)$ (coexistence equilibrium)

Computing the Jacobian at the coexistence equilibrium:

$$J = \begin{pmatrix} 0 & -\beta\gamma/\delta \\ \delta\alpha/\beta & 0 \end{pmatrix}$$

Eigenvalues: $\lambda = \pm i\sqrt{\alpha\gamma}$

Purely imaginary eigenvalues! This means the coexistence equilibrium is a center, surrounded by closed curves: periodic solutions!
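We can verify this numerically. A small sketch (assuming NumPy; the parameter values match the simulation below) builds the Jacobian at the coexistence point and checks that its eigenvalues are purely imaginary:

```python
import numpy as np

alpha, beta, delta, gamma = 1.0, 0.1, 0.075, 0.5

# Jacobian at the coexistence equilibrium (gamma/delta, alpha/beta)
J = np.array([[0.0,                  -beta * gamma / delta],
              [delta * alpha / beta,  0.0]])
eigs = np.linalg.eigvals(J)
print(eigs)  # purely imaginary pair with magnitude sqrt(alpha*gamma)
```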

Python Simulation and Phase Portrait

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.integrate import odeint

def lotka_volterra(state, t, alpha, beta, delta, gamma):
    """Lotka-Volterra predator-prey model"""
    x, y = state  # x: prey, y: predator
    dxdt = alpha * x - beta * x * y
    dydt = delta * x * y - gamma * y
    return [dxdt, dydt]

# Parameter settings
alpha = 1.0    # Rabbit natural growth rate
beta = 0.1     # Predation rate
delta = 0.075  # Predation efficiency
gamma = 0.5    # Fox natural death rate

# Calculate equilibrium
x_eq = gamma / delta
y_eq = alpha / beta
print(f"Coexistence equilibrium: ({x_eq:.2f}, {y_eq:.2f})")

# Create figure
fig, axes = plt.subplots(1, 2, figsize=(14, 6))

# Left plot: time evolution
t = np.linspace(0, 100, 2000)
x0, y0 = 10, 5  # Initial conditions

sol = odeint(lotka_volterra, [x0, y0], t, args=(alpha, beta, delta, gamma))
x_sol, y_sol = sol[:, 0], sol[:, 1]

axes[0].plot(t, x_sol, 'b-', linewidth=2, label='Prey (Rabbits)')
axes[0].plot(t, y_sol, 'r-', linewidth=2, label='Predator (Foxes)')
axes[0].axhline(y=x_eq, color='b', linestyle='--', alpha=0.5)
axes[0].axhline(y=y_eq, color='r', linestyle='--', alpha=0.5)
axes[0].set_xlabel('Time', fontsize=12)
axes[0].set_ylabel('Population', fontsize=12)
axes[0].set_title('Population Dynamics Over Time', fontsize=13, fontweight='bold')
axes[0].legend(fontsize=11)
axes[0].grid(True, alpha=0.3)

# Right plot: phase portrait
ax = axes[1]

# Multiple trajectories
initial_conditions = [
    (5, 5), (10, 5), (15, 5), (20, 5),
    (10, 3), (10, 8), (10, 12)
]

for x0, y0 in initial_conditions:
    sol = odeint(lotka_volterra, [x0, y0], t, args=(alpha, beta, delta, gamma))
    ax.plot(sol[:, 0], sol[:, 1], linewidth=1.5, alpha=0.7)
    ax.plot(x0, y0, 'go', markersize=5)

# Vector field
x_range = np.linspace(0.5, 25, 15)
y_range = np.linspace(0.5, 18, 15)
X, Y = np.meshgrid(x_range, y_range)
dX = alpha * X - beta * X * Y
dY = delta * X * Y - gamma * Y
M = np.sqrt(dX**2 + dY**2)
M[M == 0] = 1
ax.quiver(X, Y, dX/M, dY/M, M, cmap='Greys', alpha=0.4)

# Mark equilibria
ax.plot(x_eq, y_eq, 'r*', markersize=15, label=f'Equilibrium ({x_eq:.1f}, {y_eq:.1f})')
ax.plot(0, 0, 'ko', markersize=8)

ax.set_xlabel('Prey Population x', fontsize=12)
ax.set_ylabel('Predator Population y', fontsize=12)
ax.set_title('Phase Portrait: Lotka-Volterra Model', fontsize=13, fontweight='bold')
ax.legend(fontsize=10, loc='upper right')
ax.grid(True, alpha=0.3)
ax.set_xlim(0, 28)
ax.set_ylim(0, 20)

plt.tight_layout()
plt.savefig('lotka_volterra.png', dpi=150, bbox_inches='tight')
plt.close()
```

Ecological Significance of the Model

From the phase portrait, we can clearly see the population oscillation:

  1. Many rabbits → foxes have abundant food → foxes increase
  2. Many foxes → rabbits heavily predated → rabbits decrease
  3. Few rabbits → foxes starve → foxes decrease
  4. Few foxes → rabbits less predated → rabbits increase
  5. Return to 1, and the cycle repeats

This explains the periodic fluctuations observed in natural populations!

Limitations of the Model

  1. Structurally unstable: Any small parameter change destroys closed orbits
  2. Ignores carrying capacity: Rabbits can grow infinitely (without foxes)
  3. Ignores spatial factors: All individuals uniformly mixed
  4. Ignores time delays: Reproduction takes time

Improved models are discussed in the next section.

Competition Model: Two Species Competing for Resources

Competitive Exclusion Principle

What happens when two species compete for the same resource? Soviet ecologist G.F. Gause discovered through paramecium experiments in the 1930s: two species with completely overlapping ecological niches cannot coexist long-term.

Mathematical Model

Let the populations of the two species be $x$ and $y$, each following logistic growth while competing with each other:

$$\frac{dx}{dt} = r_1 x\left(1 - \frac{x + a_{12}y}{K_1}\right), \qquad \frac{dy}{dt} = r_2 y\left(1 - \frac{y + a_{21}x}{K_2}\right)$$

Parameter meanings:

  • $r_1, r_2$: intrinsic growth rates
  • $K_1, K_2$: carrying capacities
  • $a_{12}$: competition coefficient of species 2 on species 1 (one $y$ has the same effect on $x$ as $a_{12}$ individuals of $x$)
  • $a_{21}$: competition coefficient of species 1 on species 2

Equilibria and Nullclines

Nullclines are curves in phase space where one of the derivatives is zero:

  • $x$-nullcline: $x = 0$ or $x + a_{12}y = K_1$
  • $y$-nullcline: $y = 0$ or $y + a_{21}x = K_2$

The intersection of the two non-trivial nullclines is the coexistence equilibrium.
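Since the two non-trivial nullclines are straight lines, the coexistence equilibrium can be found with a single linear solve. A sketch (assuming NumPy; the helper name is mine):

```python
import numpy as np

def coexistence_equilibrium(K1, K2, a12, a21):
    """Solve x + a12*y = K1 and a21*x + y = K2 simultaneously."""
    A = np.array([[1.0, a12],
                  [a21, 1.0]])
    return np.linalg.solve(A, np.array([K1, K2], dtype=float))

# Symmetric coexistence case used in the scenarios below
x_star, y_star = coexistence_equilibrium(100, 100, 0.5, 0.5)
print(x_star, y_star)  # both 200/3, about 66.67
```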

Python Analysis and Visualization

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.integrate import odeint

def competition_model(state, t, r1, r2, K1, K2, a12, a21):
    """Two-species competition model"""
    x, y = state
    dxdt = r1 * x * (1 - (x + a12 * y) / K1)
    dydt = r2 * y * (1 - (y + a21 * x) / K2)
    return [dxdt, dydt]

# Parameter settings for four competition outcomes
scenarios = {
    'Species 1 wins': {'r1': 1, 'r2': 1, 'K1': 100, 'K2': 80, 'a12': 0.5, 'a21': 1.5},
    'Species 2 wins': {'r1': 1, 'r2': 1, 'K1': 80, 'K2': 100, 'a12': 1.5, 'a21': 0.5},
    'Coexistence': {'r1': 1, 'r2': 1, 'K1': 100, 'K2': 100, 'a12': 0.5, 'a21': 0.5},
    'Bistability': {'r1': 1, 'r2': 1, 'K1': 100, 'K2': 100, 'a12': 1.5, 'a21': 1.5}
}

fig, axes = plt.subplots(2, 2, figsize=(12, 12))
axes = axes.flatten()

for ax, (scenario_name, params) in zip(axes, scenarios.items()):
    r1, r2 = params['r1'], params['r2']
    K1, K2 = params['K1'], params['K2']
    a12, a21 = params['a12'], params['a21']

    # Plot nullclines
    x_line = np.linspace(0, 150, 100)

    # x-nullcline: x + a12*y = K1 => y = (K1 - x) / a12
    y_nullcline_x = (K1 - x_line) / a12
    y_nullcline_x = np.clip(y_nullcline_x, 0, 150)

    # y-nullcline: y + a21*x = K2 => y = K2 - a21*x
    y_nullcline_y = K2 - a21 * x_line
    y_nullcline_y = np.clip(y_nullcline_y, 0, 150)

    ax.plot(x_line, y_nullcline_x, 'b-', linewidth=2, label='x-nullcline')
    ax.plot(x_line, y_nullcline_y, 'r-', linewidth=2, label='y-nullcline')

    # Plot trajectories
    t = np.linspace(0, 100, 1000)
    initial_conditions = [
        (10, 10), (80, 10), (10, 80), (50, 50),
        (20, 60), (60, 20), (90, 90)
    ]

    for x0, y0 in initial_conditions:
        sol = odeint(competition_model, [x0, y0], t,
                     args=(r1, r2, K1, K2, a12, a21))
        ax.plot(sol[:, 0], sol[:, 1], 'g-', linewidth=0.8, alpha=0.6)
        ax.plot(x0, y0, 'ko', markersize=4)
        ax.plot(sol[-1, 0], sol[-1, 1], 'r*', markersize=8)

    ax.set_xlabel('Species 1 (x)', fontsize=11)
    ax.set_ylabel('Species 2 (y)', fontsize=11)
    ax.set_title(scenario_name, fontsize=12, fontweight='bold')
    ax.legend(fontsize=9, loc='upper right')
    ax.set_xlim(0, 120)
    ax.set_ylim(0, 120)
    ax.grid(True, alpha=0.3)

plt.tight_layout()
plt.savefig('competition_model.png', dpi=150, bbox_inches='tight')
plt.close()
```

Four Competition Outcomes

Depending on parameters, competition can have four outcomes:

  1. Species 1 wins: Species 2 goes extinct
  2. Species 2 wins: Species 1 goes extinct
  3. Coexistence: Both species reach stable equilibrium
  4. Bistability: Whoever reaches sufficient numbers first wins (history-dependent)

Discrimination conditions:

  • If $K_1 > a_{12}K_2$ and $K_2 < a_{21}K_1$: species 1 wins
  • If $K_1 < a_{12}K_2$ and $K_2 > a_{21}K_1$: species 2 wins
  • If $K_1 > a_{12}K_2$ and $K_2 > a_{21}K_1$: stable coexistence
  • If $K_1 < a_{12}K_2$ and $K_2 < a_{21}K_1$: bistability
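These conditions amount to asking whether each species can invade the other's single-species equilibrium, and they can be packaged into a small classifier (a sketch; the function name and return labels are mine). Checking it against the four parameter sets used in the plotting code reproduces each outcome:

```python
def competition_outcome(K1, K2, a12, a21):
    """Classify the outcome of two-species competition from the nullclines."""
    s1_invades = K1 > a12 * K2  # species 1 grows when rare at species 2's equilibrium
    s2_invades = K2 > a21 * K1  # species 2 grows when rare at species 1's equilibrium
    if s1_invades and s2_invades:
        return "Coexistence"
    if s1_invades:
        return "Species 1 wins"
    if s2_invades:
        return "Species 2 wins"
    return "Bistability"

print(competition_outcome(100, 80, 0.5, 1.5))   # Species 1 wins
print(competition_outcome(100, 100, 1.5, 1.5))  # Bistability
```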

Van der Pol Oscillator: Self-Sustained Oscillations

From Radios to Hearts

In the 1920s, Dutch engineer Balthasar van der Pol discovered a peculiar oscillator while studying vacuum tube circuits in radios. Its uniqueness lies in: no matter what initial state it starts from, the system eventually converges to the same periodic oscillation.

This behavior is called a limit cycle: an isolated closed orbit that nearby trajectories either spiral into or spiral away from.

Van der Pol equation:

$$\ddot{x} - \mu(1 - x^2)\dot{x} + x = 0$$

Rewriting as a first-order system:

$$\frac{dx}{dt} = y, \qquad \frac{dy}{dt} = \mu(1 - x^2)y - x$$

The parameter $\mu$ controls the nonlinearity strength:

  • When $|x| < 1$: negative damping, the system gains energy
  • When $|x| > 1$: positive damping, the system loses energy

This "intelligent damping" keeps the system in stable oscillation — exactly the working principle of a pacemaker!
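A quick numerical check of this convergence (a sketch using SciPy's `odeint`, matching the plotting code below): trajectories started well inside and well outside the limit cycle settle to nearly the same amplitude, which for moderate $\mu$ is close to 2.

```python
import numpy as np
from scipy.integrate import odeint

def van_der_pol(state, t, mu):
    x, y = state
    return [y, mu * (1 - x**2) * y - x]

t = np.linspace(0, 100, 10000)
amplitudes = []
for x0 in (0.1, 4.0):  # one start inside, one outside the limit cycle
    sol = odeint(van_der_pol, [x0, 0.0], t, args=(1.0,))
    amplitudes.append(np.max(np.abs(sol[-2000:, 0])))  # late-time amplitude
print(amplitudes)  # both close to 2, independent of the starting point
```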

Python Plotting of Limit Cycles

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.integrate import odeint

def van_der_pol(state, t, mu):
    """Van der Pol oscillator"""
    x, y = state
    dxdt = y
    dydt = mu * (1 - x**2) * y - x
    return [dxdt, dydt]

fig, axes = plt.subplots(1, 3, figsize=(15, 5))
mu_values = [0.5, 1.0, 3.0]
t = np.linspace(0, 50, 5000)

for ax, mu in zip(axes, mu_values):
    # Trajectories from different initial conditions
    initial_conditions = [
        (0.1, 0), (0.5, 0), (3, 0), (0, 3),
        (-2, -2), (2, 2), (4, 0)
    ]

    for x0, y0 in initial_conditions:
        sol = odeint(van_der_pol, [x0, y0], t, args=(mu,))
        ax.plot(sol[:, 0], sol[:, 1], linewidth=0.8, alpha=0.7)
        ax.plot(x0, y0, 'go', markersize=5)

    # Mark origin (unstable equilibrium)
    ax.plot(0, 0, 'ro', markersize=8)

    ax.set_xlabel('x', fontsize=12)
    ax.set_ylabel('y = dx/dt', fontsize=12)
    ax.set_title(f'Van der Pol Oscillator (μ = {mu})', fontsize=12, fontweight='bold')
    ax.grid(True, alpha=0.3)
    ax.set_xlim(-5, 5)
    ax.set_ylim(-8, 8)
    ax.set_aspect('equal')

plt.tight_layout()
plt.savefig('van_der_pol.png', dpi=150, bbox_inches='tight')
plt.close()
```

Relaxation Oscillations

When $\mu$ is very large, the oscillations become relaxation oscillations:

  • Most of the time the system evolves slowly
  • Then it suddenly jumps rapidly to another state
  • Then it evolves slowly again...

This describes many "charge-discharge" type phenomena: - Neuron action potentials - Heart rhythm - Geyser eruptions

Lyapunov Stability Theory

Precise Definition of Stability

Intuitively, a stable equilibrium is "robust to disturbances": give it a slight push, and the system returns to equilibrium or stays nearby.

Lyapunov stable: For any $\varepsilon > 0$, there exists $\delta > 0$ such that if the initial value satisfies $\|\mathbf{x}(0) - \mathbf{x}^*\| < \delta$, then $\|\mathbf{x}(t) - \mathbf{x}^*\| < \varepsilon$ for all $t \geq 0$.

Asymptotically stable: Lyapunov stable + $\lim_{t \to \infty} \mathbf{x}(t) = \mathbf{x}^*$.

Lyapunov Function Method

Determine stability without solving the differential equation! Core idea: construct an energy function; if it monotonically decreases along trajectories, the system is stable.

Lyapunov's second method: Let $V(\mathbf{x})$ be a positive definite function ($V(\mathbf{x}^*) = 0$, $V(\mathbf{x}) > 0$ for $\mathbf{x} \neq \mathbf{x}^*$). Compute its derivative along trajectories:

  • If $\dot{V} \leq 0$ (negative semi-definite): $\mathbf{x}^*$ is Lyapunov stable
  • If $\dot{V} < 0$ (negative definite): $\mathbf{x}^*$ is asymptotically stable

Example: Damped Pendulum

Consider a pendulum with damping (in dimensionless form):

$$\ddot{\theta} + b\dot{\theta} + \sin\theta = 0$$

Rewrite as a system: $\frac{d\theta}{dt} = \omega$, $\frac{d\omega}{dt} = -b\omega - \sin\theta$. Equilibrium point: $(\theta, \omega) = (0, 0)$.

Construct a Lyapunov function (the total energy):

$$V(\theta, \omega) = \frac{1}{2}\omega^2 + (1 - \cos\theta)$$

This is positive definite (kinetic plus potential energy, with its minimum at the lowest point).

Derivative along trajectories:

$$\dot{V} = \omega\dot{\omega} + \sin\theta\,\dot{\theta} = \omega(-b\omega - \sin\theta) + \omega\sin\theta = -b\omega^2 \leq 0$$

Therefore the origin is stable! When $b > 0$ (with damping), asymptotic stability can be further proven.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.integrate import odeint

def damped_pendulum(state, t, b):
    """Damped pendulum"""
    theta, omega = state
    dthetadt = omega
    domegadt = -b * omega - np.sin(theta)
    return [dthetadt, domegadt]

# Plot Lyapunov function contours and trajectories
fig, ax = plt.subplots(figsize=(10, 8))

b = 0.3  # Damping coefficient

# Lyapunov function V = omega^2/2 + (1 - cos(theta))
theta_range = np.linspace(-np.pi, np.pi, 100)
omega_range = np.linspace(-3, 3, 100)
Theta, Omega = np.meshgrid(theta_range, omega_range)
V = 0.5 * Omega**2 + (1 - np.cos(Theta))

# Contours
contour = ax.contour(Theta, Omega, V, levels=15, cmap='coolwarm', alpha=0.6)
ax.clabel(contour, inline=True, fontsize=8)

# Trajectories
t = np.linspace(0, 30, 1000)
initial_conditions = [
    (2.5, 0), (2, 1), (1, 2), (-2, 1), (-1, -2), (0.5, 0.5)
]

for theta0, omega0 in initial_conditions:
    sol = odeint(damped_pendulum, [theta0, omega0], t, args=(b,))
    ax.plot(sol[:, 0], sol[:, 1], 'b-', linewidth=1.5, alpha=0.7)
    ax.plot(theta0, omega0, 'go', markersize=6)

ax.plot(0, 0, 'r*', markersize=15, label='Stable equilibrium')
ax.set_xlabel(r'$\theta$ (angle)', fontsize=12)
ax.set_ylabel(r'$\omega$ (angular velocity)', fontsize=12)
ax.set_title(f'Damped Pendulum Phase Portrait (b={b})\nContours: Lyapunov function V',
             fontsize=12, fontweight='bold')
ax.legend(fontsize=11)
ax.grid(True, alpha=0.3)
ax.set_xlim(-np.pi, np.pi)
ax.set_ylim(-3, 3)

plt.tight_layout()
plt.savefig('lyapunov_pendulum.png', dpi=150, bbox_inches='tight')
plt.close()
```
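Beyond the picture, we can check the Lyapunov argument directly: along a simulated trajectory, $V$ should never increase. A sketch (same model and parameters as above, assuming SciPy):

```python
import numpy as np
from scipy.integrate import odeint

def damped_pendulum(state, t, b):
    theta, omega = state
    return [omega, -b * omega - np.sin(theta)]

t = np.linspace(0, 30, 3000)
sol = odeint(damped_pendulum, [2.5, 0.0], t, args=(0.3,))

# Lyapunov function evaluated along the trajectory
V = 0.5 * sol[:, 1]**2 + (1 - np.cos(sol[:, 0]))
print(V[0], V[-1])  # the energy decays toward zero
```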

Gradient Systems and Hamiltonian Systems

Gradient Systems: Always Downhill

If a system can be written as:

$$\frac{d\mathbf{x}}{dt} = -\nabla V(\mathbf{x})$$

where $V$ is some potential function, this is a gradient system.

Properties:

  • Trajectories always follow the steepest descent direction of $V$
  • No periodic solutions (you can't go downhill in circles)
  • All trajectories eventually converge to local minima of $V$

Gradient descent in machine learning is a discrete version of a gradient system!
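The connection to gradient descent can be made explicit: integrating $\dot{\mathbf{x}} = -\nabla V$ with forward Euler steps is exactly gradient descent with a fixed learning rate. A sketch on a simple bowl-shaped potential (the choice of $V$ is mine):

```python
import numpy as np

def grad_V(p):
    """Gradient of the potential V(x, y) = x**2 + 2*y**2."""
    x, y = p
    return np.array([2 * x, 4 * y])

p = np.array([3.0, -2.0])
h = 0.05  # Euler step size = learning rate
V_hist = []
for _ in range(200):
    V_hist.append(p[0]**2 + 2 * p[1]**2)
    p = p - h * grad_V(p)  # one gradient-descent step
print(V_hist[0], V_hist[-1])  # V decreases monotonically toward the minimum at the origin
```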

Hamiltonian Systems: Energy Conservation

If a system can be written as:

$$\frac{dq}{dt} = \frac{\partial H}{\partial p}, \qquad \frac{dp}{dt} = -\frac{\partial H}{\partial q}$$

where $H(q, p)$ is the Hamiltonian (usually the total energy), this is a Hamiltonian system.

Properties:

  • $H$ is conserved along trajectories: $\frac{dH}{dt} = 0$
  • Trajectories are level curves of $H$
  • Phase space volume is conserved (Liouville's theorem)

Frictionless systems in classical mechanics are all Hamiltonian systems.
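A quick numerical check of energy conservation for the undamped pendulum (a sketch assuming SciPy; the tiny drift that remains is integration error, not physics):

```python
import numpy as np
from scipy.integrate import odeint

def pendulum(state, t):
    theta, p = state
    # Hamilton's equations for H = p^2/2 + (1 - cos(theta))
    return [p, -np.sin(theta)]

t = np.linspace(0, 50, 5000)
sol = odeint(pendulum, [1.0, 0.0], t)
H = 0.5 * sol[:, 1]**2 + (1 - np.cos(sol[:, 0]))
print(H.max() - H.min())  # tiny: H is conserved up to solver tolerance
```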

```python
import numpy as np
import matplotlib.pyplot as plt

def simple_pendulum(state, t):
    """Undamped pendulum (Hamiltonian system)"""
    theta, p = state  # p = d(theta)/dt
    # H = p^2/2 + (1 - cos(theta))
    dthetadt = p           # = dH/dp
    dpdt = -np.sin(theta)  # = -dH/dtheta
    return [dthetadt, dpdt]

fig, ax = plt.subplots(figsize=(10, 8))

# Hamiltonian contours
theta_range = np.linspace(-2*np.pi, 2*np.pi, 200)
p_range = np.linspace(-3, 3, 200)
Theta, P = np.meshgrid(theta_range, p_range)
H = 0.5 * P**2 + (1 - np.cos(Theta))

# Plot contours (which are also trajectories!)
contour = ax.contour(Theta, P, H, levels=20, cmap='viridis')
ax.clabel(contour, inline=True, fontsize=8)

# Mark separatrix H = 2
separatrix = ax.contour(Theta, P, H, levels=[2], colors='red', linewidths=2)

# Mark equilibria
ax.plot(0, 0, 'go', markersize=10, label='Stable (oscillation center)')
ax.plot(np.pi, 0, 'ro', markersize=10, label='Unstable (inverted)')
ax.plot(-np.pi, 0, 'ro', markersize=10)

ax.set_xlabel(r'$\theta$', fontsize=12)
ax.set_ylabel(r'$p = d\theta/dt$', fontsize=12)
ax.set_title('Phase Portrait of Simple Pendulum (Hamiltonian System)\nContours = Energy levels = Trajectories',
             fontsize=12, fontweight='bold')
ax.legend(fontsize=10, loc='upper right')
ax.grid(True, alpha=0.3)
ax.set_xlim(-2*np.pi, 2*np.pi)
ax.set_ylim(-3.5, 3.5)

plt.tight_layout()
plt.savefig('hamiltonian_pendulum.png', dpi=150, bbox_inches='tight')
plt.close()
```

Global Analysis: Poincaré-Bendixson Theorem

Limit Sets

As $t \to \infty$, what do trajectories approach? Possible limit sets include:

  • Equilibrium points
  • Limit cycles
  • Chaotic attractors (only possible in 3D or higher; see next chapter)

Poincaré-Bendixson Theorem

For two-dimensional continuous systems, there's an amazing result:

Theorem: Let $R$ be a bounded closed region in the plane, and let the trajectory $\mathbf{x}(t)$ stay inside $R$ without approaching any equilibrium. Then the limit set of $\mathbf{x}(t)$ is a closed orbit (a periodic solution).

Corollary: Two-dimensional continuous systems cannot exhibit chaos!

This is why studying chaos requires at least three dimensions (like the Lorenz system).

Determining Existence of Limit Cycles

Bendixson's criterion: If the divergence $\frac{\partial f}{\partial x} + \frac{\partial g}{\partial y}$ has constant sign (always positive or always negative) in a simply connected region, then that region contains no closed orbits.

Dulac's criterion: A generalization of Bendixson's criterion, obtained by multiplying the vector field by a suitable positive function before computing the divergence.
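Applying Bendixson's criterion to the Van der Pol system shows why it does not rule out the limit cycle there: the divergence changes sign. A symbolic sketch (assuming SymPy is available):

```python
import sympy as sp

x, y, mu = sp.symbols('x y mu')
f = y                          # dx/dt
g = mu * (1 - x**2) * y - x    # dy/dt (Van der Pol)

divergence = sp.diff(f, x) + sp.diff(g, y)
print(divergence)  # mu*(1 - x**2): changes sign at |x| = 1
```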

Nonlinear Systems in Daily Life

Case 1: Heart Rhythm

The sinoatrial node is the heart's natural pacemaker, and its electrical activity can be described by the FitzHugh-Nagumo model — a simplified Van der Pol-type equation.

Under normal conditions, the system has a stable limit cycle (normal heartbeat). Under certain pathological conditions, the limit cycle may disappear or deform, leading to arrhythmias.

Case 2: Disease Transmission

The SIR model (discussed in later chapters) is a nonlinear system. $R_0$ (the basic reproduction number) determines whether a disease can break out; this is a bifurcation phenomenon, detailed in a later chapter.

Case 3: Climate System

Global climate is a typical high-dimensional nonlinear system. Ice ages, greenhouse effects, and climate tipping points can all be understood through nonlinear dynamics.

Case 4: Neural Networks

The training process of artificial neural networks can be viewed as gradient system evolution in high-dimensional space. Local minima, saddle points, and flat regions all affect training efficiency.

Brief Overview of Numerical Methods

Common Methods

  1. Euler's method: Simplest, but low accuracy

  2. Improved Euler method (Heun): Second-order accuracy

  3. Fourth-order Runge-Kutta (RK4): Most commonly used, good balance of accuracy and efficiency

  4. Adaptive step-size methods: Such as RK45, DOPRI5

Python Implementation of RK4

```python
import numpy as np

def rk4_step(f, x, t, h):
    """Single RK4 step"""
    k1 = h * np.array(f(x, t))
    k2 = h * np.array(f(x + k1/2, t + h/2))
    k3 = h * np.array(f(x + k2/2, t + h/2))
    k4 = h * np.array(f(x + k3, t + h))
    return x + (k1 + 2*k2 + 2*k3 + k4) / 6

def rk4_solve(f, x0, t_span, h):
    """Solve ODE system using RK4"""
    t_start, t_end = t_span
    t_values = [t_start]
    x_values = [np.array(x0, dtype=float)]

    t = t_start
    x = np.array(x0, dtype=float)

    while t < t_end:
        if t + h > t_end:
            h = t_end - t
        x = rk4_step(f, x, t, h)
        t = t + h
        t_values.append(t)
        x_values.append(x.copy())

    return np.array(t_values), np.array(x_values)

# Test: Lotka-Volterra
def lotka_volterra_f(state, t):
    x, y = state
    alpha, beta, delta, gamma = 1.0, 0.1, 0.075, 0.5
    return [alpha*x - beta*x*y, delta*x*y - gamma*y]

t_vals, sol = rk4_solve(lotka_volterra_f, [10, 5], (0, 100), 0.01)
print(f"Final state: x = {sol[-1, 0]:.2f}, y = {sol[-1, 1]:.2f}")
```
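The "fourth-order" label can be verified empirically: halving the step size should shrink the global error by a factor of about $2^4 = 16$. A self-contained sketch on $\dot{x} = -x$, whose exact solution is $e^{-t}$ (helper name is mine):

```python
import numpy as np

def rk4(f, x, h, n):
    """Integrate dx/dt = f(x) for n steps of size h with classical RK4."""
    for _ in range(n):
        k1 = f(x)
        k2 = f(x + h * k1 / 2)
        k3 = f(x + h * k2 / 2)
        k4 = f(x + h * k3)
        x = x + h * (k1 + 2*k2 + 2*k3 + k4) / 6
    return x

f = lambda x: -x
exact = np.exp(-1.0)
err_h = abs(rk4(f, 1.0, 0.10, 10) - exact)   # step h
err_h2 = abs(rk4(f, 1.0, 0.05, 20) - exact)  # step h/2
print(err_h / err_h2)  # close to 16, confirming fourth-order accuracy
```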

Summary

In this chapter, we deeply explored the core concepts of nonlinear systems:

  1. Phase space and phase portraits: Understanding system behavior with geometric intuition
  2. Classification of equilibria: Determining stability through Jacobian eigenvalues
  3. Classic models:
    • Lotka-Volterra (periodic oscillations)
    • Competition model (four outcomes)
    • Van der Pol (limit cycles)
  4. Lyapunov method: Determining stability without solving equations
  5. Special structures: Gradient systems, Hamiltonian systems
  6. Global theory: Poincaré-Bendixson theorem

The nonlinear world is much richer than the linear world: periodicity, quasi-periodicity, limit cycles... and this is just the tip of the iceberg. In the next chapter, we'll witness the most amazing behavior of nonlinear systems: chaos.

Exercises

Conceptual Questions

  1. Explain why the superposition principle of linear systems doesn't apply to nonlinear systems. Give a specific counterexample.

  2. What is phase space? Why is it important for analyzing dynamical systems?

  3. Distinguish between Lyapunov stability and asymptotic stability. Give examples of each.

  4. Why can't two-dimensional continuous systems exhibit chaos?

Computational Problems

  1. For the given system:
     1. Find all equilibria
     2. Calculate the Jacobian at each equilibrium
     3. Classify each equilibrium type

  2. Prove that the given function $V$ is a Lyapunov function for the corresponding system at the origin, and determine the stability.

  3. For the Lotka-Volterra system, prove that the function $V(x, y) = \delta x - \gamma\ln x + \beta y - \alpha\ln y$ is constant along trajectories (a first integral).

  4. For the Van der Pol equation (with the given $\mu$), calculate the approximate period of the limit cycle using numerical methods.

Analysis Problems

  1. Consider a predator-prey model with density-dependent death. Analyze how the additional term affects the qualitative behavior of the system.

  2. In the competition model, if both species have completely symmetric parameters ($r_1 = r_2$, $K_1 = K_2$, $a_{12} = a_{21}$), how will the system evolve? What does this mean ecologically?

Programming Problems

  1. Write a Python program to plot the phase portrait of the given system, marking all equilibria and their types.

  2. Implement an interactive program that lets users adjust Lotka-Volterra model parameters and observe real-time changes in phase portraits and time series.

  3. Use numerical methods to illustrate the Poincaré-Bendixson theorem: for a two-dimensional system with bounded trajectories not approaching equilibria, verify that it indeed converges to a periodic orbit.

  4. Compare the accuracy and efficiency of Euler's method, the improved Euler method, and RK4 for solving the Van der Pol equation. Test with different step sizes and plot the error vs. step size relationship.



Previous Chapter: ← Ordinary Differential Equations (VII)

Next Chapter: → Ordinary Differential Equations (IX): Chaos Theory and the Lorenz System


This is Chapter 8 of the "World of Ordinary Differential Equations" series.

  • Post title: Ordinary Differential Equations (8): Nonlinear Systems and Phase Portraits
  • Post author: Chen Kai
  • Create time: 2019-05-13 10:30:00
  • Post link: https://www.chenk.top/ode-chapter-08-nonlinear-stability/
  • Copyright Notice: All articles in this blog are licensed under BY-NC-SA unless stated otherwise.