Optimal feedback control design provides a robust, real-time computable answer to fundamental challenges in modern engineering, such as active vibration control, fluid flow control, and multi-agent networks. For an optimality-based formulation of the feedback design problem, the Dynamic Programming Principle allows the associated value function to be characterized as the viscosity solution of a fully nonlinear Hamilton-Jacobi-Bellman (HJB) equation, defined over the state space of the controlled dynamical system. This talk focuses on the computation of optimal feedback controllers for multiscale nonlinear dynamics through the numerical approximation of HJB equations. We will review recent results concerning optimal feedback control of low- and high-dimensional dynamics arising in the optimal control of partial differential equations and agent-based models. We shall also address different control features such as robustness, sparsity, and multiscale control design.
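
For reference, a standard instance of this characterization, assuming an infinite-horizon discounted problem (the abstract itself does not fix a specific problem class): for dynamics $\dot{y}(t) = f(y(t), u(t))$, $y(0) = x$, and cost $J(u; x) = \int_0^{\infty} e^{-\lambda t}\, \ell(y(t), u(t))\, dt$, the value function $V(x) = \inf_{u} J(u; x)$ solves, in the viscosity sense, the HJB equation
\[
\lambda V(x) - \min_{u \in U} \bigl\{ \ell(x, u) + \nabla V(x) \cdot f(x, u) \bigr\} = 0, \qquad x \in \mathbb{R}^d,
\]
and an optimal feedback law is recovered pointwise as $u^{*}(x) \in \arg\min_{u \in U} \{ \ell(x, u) + \nabla V(x) \cdot f(x, u) \}$. Since $V$ must be approximated over the full state space, the dimension $d$ of the controlled dynamics is what makes the high-dimensional settings above challenging.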