Abstract: We discuss the dynamic programming method for optimal control problems associated with random dynamical systems defined by Itô diffusions. We begin with a brief overview of stochastic differential equations, then derive Bellman's principle of optimality and the corresponding Hamilton-Jacobi-Bellman equation.