Evolution, whether in life or in software, is the gradual accretion of functional structure from the mix of entropy-increasing randomness (like mutation) and entropy-decreasing selection. In both domains structure changes gradually, so structural features from the earliest generations persist as the substrates of later innovations. Thus computers run operating systems and digital protocols dating back decades, and vertebrates have had spines and organs laid out the same way for 500 million years.
So we would expect that humans share most of our mechanical and computational hardware with earlier bipeds like Homo erectus, most of that with chimps and bonobos, most of that with monkeys, most of that with quadrupeds, most of that with fish, and most of that with sea snakes. As long as each evolved from its predecessor by incremental changes, the overall architectural structure must have remained in place for it to remain functional.
This means that the human brain must at its deepest level be structured to control a spine. Certainly, the layout of our internal organs has remained relatively unchanged over that time, with the highest-bandwidth tasks of controlling a mouth and eyes close to the brain, respiration and digestion midway along it, and the lowest-bandwidth tasks like reproduction near the tail. Even in humans, the major pain and pleasure centers still lie along the spine, along with most communication channels (both neural and vibratory). Certainly spinal alignment, sensation, and control are central to therapeutic practices like yoga, chiropractic, Pilates, and so on.
A spine exists in three dimensions, one longer than the others, so to control it a brain will need a representation with one long axis ("Z") and a few local radial and angular axes along it. This is now a physics problem, equivalent to controlling the eigenmodes of a long elastic cylinder. The problem can be posed, and solved, without even making reference to breathing or external sensation, so the ur-architecture of brain computation--the structure necessary to compute the fastest and most data-starved motions--must be one of continuous oscillation control. That task can be further subdivided by scale, so it might be computed by many similar modules operating in parallel: a few modules for each set of muscle fibers, a few for each vertebra, and a few for the whole spine.
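The scale-separation claim can be made concrete with a toy model. Below is a minimal sketch (my illustration, not anything from the literature) that treats the spine as a chain of point masses coupled by springs--a discrete elastic rod--and computes its transverse eigenmodes. All sizes and stiffnesses are arbitrary; the point is only that the low-frequency modes bend the whole rod while the high-frequency modes wiggle adjacent segments, which is exactly the whole-spine / per-vertebra split described above.

```python
import numpy as np

# Toy model: N segments ("vertebrae") coupled by unit springs, fixed at
# the head end and free at the tail. Eigenvectors of the stiffness
# matrix are the rod's transverse vibration modes.
N = 20
K = np.zeros((N, N))
for i in range(N):
    K[i, i] = 2.0
    if i > 0:
        K[i, i - 1] = K[i - 1, i] = -1.0
K[-1, -1] = 1.0  # free boundary at the tail

freqs_sq, modes = np.linalg.eigh(K)  # ascending eigenvalues = (frequency)^2

def sign_changes(v):
    """Count how many times a mode shape crosses zero along the rod."""
    return int(np.sum(np.diff(np.sign(v)) != 0))

lowest, highest = modes[:, 0], modes[:, -1]
# Lowest mode: the whole rod sways together (no zero crossings).
# Highest mode: neighboring segments alternate (a crossing at every joint).
print(sign_changes(lowest), sign_changes(highest))  # → 0 19
```

The separation of modes by spatial scale is what licenses the parallel-module picture: a controller for mode k only needs to sense and actuate structure at wavelength ~N/k.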
Once such modules evolved, Nature could have re-used them atop the basic spinal architecture for breathing, digestion, limbs, and sensation, all of which involve 3-D sensorimotor feedback control similar in topology to that used in joints. For example, both vision and tactile perception by fingertips involve "scanning" a 2-D sensor array back and forth autonomously, then assembling the sensory data tomographically into smoothly moving 3-D representations. So once Nature solved the problem of perceiving and stabilizing a single vertebra, it was most of the way to solving almost any other real-world 3-D problem.
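The "scanning" idea above can be sketched in a few lines. This is a deliberately simplified illustration (scene size, sensor width, and the perfect position registration are all my assumptions): a narrow 1-D sensor strip is swept across a 2-D scene, like a fingertip over a surface or a fovea across a visual field, and because each reading is tagged with the sensor's position, the samples reassemble into a stable map of the whole scene.

```python
import numpy as np

# A small sensor swept over a larger scene reconstructs the whole scene,
# provided each reading is registered at the sensor's known position.
rng = np.random.default_rng(0)
scene = rng.random((8, 8))   # the "world" being explored
strip = 3                    # the sensor only covers 3 rows at a time

recon = np.full_like(scene, np.nan)  # map assembled from the sweep
for top in range(0, scene.shape[0] - strip + 1):
    reading = scene[top:top + strip, :]   # one "touch" or fixation
    recon[top:top + strip, :] = reading   # register it by position

# The moving low-coverage sensor has recovered the full scene.
print(np.allclose(recon, scene))  # → True
```

In the biological case the registration signal is proprioception--the same spinal machinery that stabilizes the body also reports where the sensor is--which is why solving the one-vertebra problem goes most of the way toward solving perception.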