Some context: there is an entire neuroscientific field of study devoted to substituting one sensory modality with another: <a href="http://en.wikipedia.org/wiki/Sensory_substitution" rel="nofollow">http://en.wikipedia.org/wiki/Sensory_substitution</a><p>The field was pioneered by Paul Bach-y-Rita (<a href="http://en.wikipedia.org/wiki/Paul_Bach-y-Rita" rel="nofollow">http://en.wikipedia.org/wiki/Paul_Bach-y-Rita</a>), who most notably invented a setup that allowed blind people to "see" via a camera connected to a vibrating grid attached to their backs, effectively substituting visual input with haptic input.<p>In a nutshell, there is nothing intrinsically "visual" about neurons in the visual cortex, nor are neurons in, e.g., the auditory cortex exclusively tuned to sound - the brain is plastic enough to "make sense" of a new type of input signal, a process that typically takes a couple of weeks.<p>My co-founder Peter König at EyeQuant.com - a neuroscience professor at the University of Osnabrueck - is working on similar projects with his feelspace group, where they created a compass belt that vibrates on the side facing north, taking sensory substitution a step further by effectively creating a <i>new</i> sensory modality of direction (Wired article: <a href="http://www.wired.com/wired/archive/15.04/esp.html" rel="nofollow">http://www.wired.com/wired/archive/15.04/esp.html</a>).<p>For an excellent philosophical take on this, I would recommend Alva Noe's "Action in Perception":
<a href="http://www.amazon.com/dp/0262140888/" rel="nofollow">http://www.amazon.com/dp/0262140888/</a>