This paper examines the operational performance of a semi-autonomous wheelchair system named TIM (Thought-controlled Intelligent Machine), which uses cameras in a configuration modeled on the vision system of a horse. This configuration combines stereoscopic vision, for 3-Dimensional (3D) depth perception and mapping ahead of the wheelchair, with a spherical camera system providing 360-degree monocular coverage. The combination allows the static components of an unknown environment to be mapped and surrounding dynamic obstacles to be detected during real-time autonomous navigation, minimizing blind spots and preventing accidental collisions with people or obstacles. Pairing this vision system with a shared control strategy provides intelligent assistive guidance during wheelchair navigation and can accompany any hands-free wheelchair control technology for people with severe physical disability. Testing in crowded, dynamic environments has demonstrated the feasibility and real-time performance of the system when assisting hands-free control technologies, in this case a proof-of-concept brain-computer interface (BCI).
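To make the shared-control idea concrete, the following is a minimal sketch of how vision-derived collision risk might gate a hands-free velocity command. All names, thresholds, and the attenuation rule are illustrative assumptions, not the system's actual implementation.

```python
# Hypothetical shared-control gate: the vision system attenuates or stops
# the user's command when a collision appears imminent. Thresholds and the
# linear slow-down rule are assumptions for illustration only.

def shared_control(user_cmd, min_depth_ahead_m, dynamic_obstacle_near):
    """Gate a hands-free velocity command using vision inputs.

    user_cmd: (linear_mps, angular_radps) from the hands-free interface
    min_depth_ahead_m: closest forward obstacle distance, e.g. from the
        stereoscopic depth map
    dynamic_obstacle_near: True if the 360-degree monocular view reports a
        moving obstacle inside an assumed safety radius
    """
    linear, angular = user_cmd
    STOP_DIST = 0.5   # assumed hard-stop distance (metres)
    SLOW_DIST = 1.5   # assumed slow-down distance (metres)

    if dynamic_obstacle_near or min_depth_ahead_m < STOP_DIST:
        return (0.0, 0.0)                 # stop: collision imminent
    if min_depth_ahead_m < SLOW_DIST:
        scale = (min_depth_ahead_m - STOP_DIST) / (SLOW_DIST - STOP_DIST)
        return (linear * scale, angular)  # slow proportionally to clearance
    return (linear, angular)              # clear path: pass command through
```

In this sketch the user retains directional intent while the vision system only scales or vetoes the forward velocity, one common way of blending user and autonomous control in assistive navigation.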