Original manuscript: 2012/09/24
Controllability and stabilisability are two fundamental properties of control systems, and it is intuitively appealing to conjecture that the former should imply the latter; especially so when the state of a control system is assumed to be known at every time instant. Such an implication can, indeed, be proven for certain types of controllability and stabilisability, and for certain classes of control systems. In the present thesis, we consider real analytic control systems of the form Σ : ẋ = f(x, u), with x in a real analytic manifold and u in a separable metric space, and we show that, under mild technical assumptions, small-time local controllability from an equilibrium p of Σ implies the existence of a piecewise analytic feedback FΣ that asymptotically stabilises Σ at p. As a corollary to this result, we show that nonlinear control systems with controllable unstable dynamics and stable uncontrollable dynamics are feedback stabilisable, thus extending a classical result of linear control theory.
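The classical linear result alluded to in the corollary can be sketched as follows (the notation here is illustrative, not the thesis's own). In the Kalman controllability decomposition, a linear system takes the block-triangular form

```latex
\[
\dot{x} = Ax + Bu,
\qquad
\begin{pmatrix} \dot{x}_1 \\ \dot{x}_2 \end{pmatrix}
=
\begin{pmatrix} A_{11} & A_{12} \\ 0 & A_{22} \end{pmatrix}
\begin{pmatrix} x_1 \\ x_2 \end{pmatrix}
+
\begin{pmatrix} B_1 \\ 0 \end{pmatrix} u,
\]
```

with (A₁₁, B₁) controllable and A₂₂ governing the uncontrollable dynamics. If A₂₂ is Hurwitz (stable uncontrollable dynamics), then any K with A₁₁ + B₁K Hurwitz makes the closed-loop matrix block-upper-triangular with Hurwitz diagonal blocks, so u = Kx₁ stabilises the whole system.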
Next, we modify the proof of the existence of FΣ to show stabilisability of small-time locally controllable systems in finite time, at the expense of obtaining a closed-loop system that may not be Lyapunov stable. Having established stabilisability in finite time, we proceed to prove a converse-Lyapunov theorem. If FΣ is a piecewise analytic feedback that stabilises a small-time locally controllable system in finite time, then the Lyapunov function we construct has the interesting property of being differentiable along every trajectory of the closed-loop system obtained by "applying" FΣ to Σ.
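The differentiability property mentioned above can be stated more explicitly (again in illustrative notation, not the thesis's): for the constructed function V and every trajectory x(·) of the closed-loop system ẋ = f(x, FΣ(x)),

```latex
\[
t \;\longmapsto\; V\bigl(x(t)\bigr)
\quad \text{is differentiable, with} \quad
\frac{\mathrm{d}}{\mathrm{d}t}\, V\bigl(x(t)\bigr) < 0
\ \text{ away from the equilibrium } p,
\]
```

which is notable because V itself need not be differentiable as a function on the state manifold; only its composition with each closed-loop trajectory is.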
We conclude this thesis with a number of open problems related to the stabilisability of nonlinear control systems, together with examples from the literature that hint at potentially fruitful lines of future research in the area.
Last Updated: Thu Oct 11 08:59:48 2018