Abstract: We discuss the dynamic programming method for optimal control problems associated with dynamical systems governed by ordinary differential equations. For this class of problems, we derive Bellman's principle of optimality and the corresponding Hamilton-Jacobi-Bellman equation.