## 15.2.1 Hamilton-Jacobi-Bellman Equation

The HJB equation is a central result in optimal control theory. Many
other principles and design techniques follow from the HJB equation,
which itself is just a statement of the dynamic programming principle
in continuous time. A proper derivation of all forms of the HJB
equation would be beyond the scope of this book. Instead, a
time-invariant formulation that is most relevant to planning will be
given here. Also, an informal derivation will follow, based in part
on [95].
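For orientation before the derivation, the standard time-invariant HJB equation can be sketched as follows; the symbols here are assumptions chosen to match common usage ($\dot{x} = f(x,u)$ for the system, $l$ for the instantaneous cost, $G^*$ for the optimal cost-to-go), not necessarily the exact notation developed later in the text:

```latex
% Time-invariant system and cost functional (assumed notation):
%   \dot{x} = f(x, u),  with total cost  \int_0^\infty l(x(t), u(t)) \, dt.
%
% The optimal cost-to-go G^*(x) satisfies the stationary HJB equation:
\[
  0 \;=\; \min_{u \in U} \Bigl\{ \, l(x, u) \;+\; \nabla G^*(x) \cdot f(x, u) \, \Bigr\} ,
\]
% which expresses the dynamic programming principle in continuous time:
% along an optimal trajectory, the instantaneous cost exactly offsets
% the rate of decrease of the remaining cost-to-go.
```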

Steven M LaValle
2012-04-20