"On the motor output side, different motor actions
similarly require the representation of target location in different
reference frames according to the natural coordinates
of the moving muscles. The surprising finding that several
movement planning areas in the PPC represent space in an
eye-centered coordinate frame independent of the sensory
input and motor output may provide a unifying scheme
for multimodal sensory integration and the development of
high-level movement plans.
In area LIP, it was recently established that cells were
involved in the planning of eye movements, regardless of
whether the eye movements were triggered by a visual or an
auditory stimulus (Grunewald et al., 1999; Linden et al.,
1999; Mazzoni et al., 1996b). In further experiments directly
addressing the question of the underlying reference frame,
it was found that the majority of neurons coded auditory
targets, similar to visual targets, in eye-centered coordinates,
and with many response fields of LIP neurons gainmodulated
by eye position (Stricanne et al., 1996). These
findings suggest that LIP is involved in the sensory integration
of visual and auditory signals and that many LIP
neurons are encoding these signals in eye-centered coordinates.
"

Extracted from "The Visual Neurosciences"

This means that "the field", as a perceptual concept and as a motor domain, is integrated by the visual system, not only because vision is the most generous source of information about the field, but also because the other senses deal with the field through the visual mapping.

Another point is that the "sensorimotor transformation" is performed in the posterior parietal association cortex, which receives its sensory input from the visual system.

These notions further support the perspective that everyone has his own individual vision whose information is private to him (through its relation to his motor output), in contrast to the auditory system, which functions primarily as an analyzer of a physical property (air vibrations) and is not primarily related to field perception; even the indirect spatial calculations it performs are fed into the visual map.
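As a purely illustrative sketch of this scheme (in Python, with a hypothetical eye_centered_response function, Gaussian tuning, and a planar gain field as simplifying assumptions, not a model taken from the quoted source): a head-centered auditory target is remapped into eye-centered coordinates by subtracting eye position, and the unit's response is then gain-modulated by eye position, as described for LIP above.

import math

def eye_centered_response(target_head_deg, eye_position_deg, preferred_deg,
                          sigma=10.0, gain_slope=0.01):
    # Hypothetical LIP-like unit (illustrative only): Gaussian response field
    # in eye-centered coordinates, with amplitude gain-modulated by eye position.
    target_eye = target_head_deg - eye_position_deg        # remap head- to eye-centered
    tuning = math.exp(-((target_eye - preferred_deg) ** 2) / (2 * sigma ** 2))
    gain = 1.0 + gain_slope * eye_position_deg             # planar "gain field"
    return gain * tuning

# Same eye-centered target (-10 deg) in both cases, so the tuning is identical,
# but the overall response is scaled by where the eyes are pointing.
print(eye_centered_response(target_head_deg=20, eye_position_deg=30, preferred_deg=-10))
print(eye_centered_response(target_head_deg=0,  eye_position_deg=10, preferred_deg=-10))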


Audition & the Where stream

Location cues are not contained in the receptor cells the way they are on the retina in vision; hence the location of sounds must be calculated.
The sense of location obtained from auditory sensation is therefore different from that obtained from visual sensation: in vision it is inherent in the receptor array, whereas in audition it is calculated.
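To illustrate what "calculated" means here, the following is a minimal sketch assuming the textbook interaural-time-difference cue and rough illustrative values for head width and the speed of sound (not any particular neural mechanism): the azimuth of a sound source is recovered from the timing difference between the two ears rather than read off a receptor surface.

import math

SPEED_OF_SOUND = 343.0   # m/s, in air at roughly 20 degrees C
HEAD_WIDTH = 0.175       # m, rough inter-ear distance (illustrative value)

def azimuth_from_itd(itd_seconds):
    # Far-field approximation: ITD is about (d / c) * sin(azimuth),
    # so azimuth = asin(ITD * c / d). Sign convention is arbitrary here.
    x = itd_seconds * SPEED_OF_SOUND / HEAD_WIDTH
    x = max(-1.0, min(1.0, x))          # clamp against measurement noise
    return math.degrees(math.asin(x))

# A 0.3 ms interaural delay corresponds to a source roughly 36 degrees off the midline.
print(round(azimuth_from_itd(0.0003), 1))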

This adds further support to the postulation that the multiplicity of vision arises from the sense of location.

Owl's Hearing

"The activity pattern of the neurons matched the location of the sound, the team found. Sounds from above, for example, cause neurons towards the top of the auditory centre to fire, whereas sounds from lower down trigger neurons towards the bottom. "The owls basically have a topographic map of space in their brain," says Bala."
Nature.com News:
Owls' ears map the world

The owl thus has a topographic map of the external environment built by means of auditory sensation. As the researchers found, the architecture of its auditory centre is arranged topographically, in contrast to our auditory cortex, which is arranged according to sound frequency (tonotopically).
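A toy contrast, purely as an illustration (the unit count and ranges below are invented, not anatomical): in a space-mapped (topographic) auditory centre like the owl's, which unit responds depends on where the sound comes from, whereas in a frequency-mapped (tonotopic) arrangement like ours, it depends only on the sound's frequency.

import math

N_UNITS = 20

def topographic_index(azimuth_deg):
    # Owl-like space map (illustrative): which unit fires depends on WHERE the
    # sound is; azimuths from -90 to +90 degrees are spread across the array.
    pos = (azimuth_deg + 90) / 180 * (N_UNITS - 1)
    return int(min(max(pos, 0), N_UNITS - 1))

def tonotopic_index(freq_hz):
    # Human-like tonotopic map (illustrative): which unit fires depends on the
    # sound's FREQUENCY (log-spaced 20 Hz to 20 kHz), not on its location.
    pos = (math.log10(freq_hz) - math.log10(20)) / (math.log10(20000) - math.log10(20))
    return int(min(max(pos * (N_UNITS - 1), 0), N_UNITS - 1))

# The same 1 kHz tone arriving from two different directions:
print(topographic_index(-60), topographic_index(+60))   # different units: location is on the map
print(tonotopic_index(1000), tonotopic_index(1000))     # same unit: location is not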


This piece of information favors the postulation that in humans hearing is single (as the auditory cortex is a representation of the audible frequencies) while vision is multiple (as the visual cortex is a topographic representation of the environment, so everyone sees differently from everyone else).
Just as the Qur'an verse says:
" It is he who brought you forth from the wombs of your mothers when ye knew nothing and he gave you hearing and sight and intelligence and affections: that ye may give thanks (to God)."
[16.78] Surat Al-Nahl "The Bees"
وَاللّهُ أَخْرَجَكُم مِّن بُطُونِ أُمَّهَاتِكُمْ لاَ تَعْلَمُونَ شَيْئاً وَجَعَلَ لَكُمُ الْسَّمْعَ وَالأَبْصَارَ وَالأَفْئِدَةَ لَعَلَّكُمْ تَشْكُرُونَ [النحل : 78]