bn:03597900n
Noun Concept
Categories: Stochastic control, Optimal control, Dynamic programming, William Rowan Hamilton, Partial differential equations
EN: Hamilton–Jacobi–Bellman equation · Hamilton-Jacobi-Bellman equation · Hamiltonian-Jacobi-Bellman equation · Hamilton–Jacobi equation · HJB
The Hamilton–Jacobi–Bellman equation is a nonlinear partial differential equation that gives a necessary and sufficient condition for optimality of a control with respect to a loss function; its solution is the value function of the optimal control problem. (Wikipedia)
Definitions
A condition for optimality of a control with respect to a loss function. (Wikipedia Disambiguation)
An optimality condition in optimal control theory. (Wikidata)
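For concreteness, a common statement of the equation, sketched under illustrative assumptions (a deterministic, finite-horizon problem with dynamics \dot{x} = f(x, u), running cost C(x, u), and terminal cost D(x); these symbols are not taken from the sources above):

\[
\frac{\partial V}{\partial t}(x, t)
  + \min_{u}\left\{ C(x, u) + \nabla_x V(x, t) \cdot f(x, u) \right\} = 0,
\qquad V(x, T) = D(x).
\]

Here V(x, t) is the value function, the minimal cost-to-go from state x at time t; in the stochastic setting an additional second-order (diffusion) term appears inside the minimization.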