Humans and other primates possess a remarkable ability to interact with objects in their surrounding space. Neuroscience research has shown that areas of the primate posterior parietal cortex represent the surrounding environment in a form suitable for coordinated movements of the eyes and arms. In this talk, I will present two techniques used to extract meaningful patterns from single-cell data recorded from area V6A while a macaque monkey performed gazing and reaching tasks, and I will explain the implications of this analysis for computational modelling of visuomotor transformations. I will then outline our computational model, which is based on radial basis functions and is configured and trained for coordinated control of eye/head and arm movements. I will present the results of our robot simulations and the implications these have for new neuroscience studies. Finally, I will argue that such a joint approach advances robotics and deepens our understanding of the brain mechanisms underlying visuomotor transformations.
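
As a toy illustration only (not the model described in the talk), a radial-basis-function network can learn a gaze-dependent coordinate transformation of the kind such models address. All names and the data-generating rule below are hypothetical: a retinal target position and an eye-position signal are mapped to a body-centered reach target by Gaussian basis functions with a least-squares linear readout.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n=500):
    """Hypothetical toy data: body-centered target = retinal + eye position."""
    retinal = rng.uniform(-1, 1, (n, 2))   # target location on the retina
    eye = rng.uniform(-1, 1, (n, 2))       # gaze direction
    body = retinal + eye                   # toy ground-truth transformation
    return np.hstack([retinal, eye]), body

class RBFNet:
    """Gaussian radial-basis-function network with a linear readout."""

    def __init__(self, n_centers=200, width=0.8):
        self.centers = rng.uniform(-1, 1, (n_centers, 4))
        self.width = width
        self.w = None

    def _phi(self, x):
        # Gaussian activation of each center for each input sample
        d2 = ((x[:, None, :] - self.centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * self.width ** 2))

    def fit(self, x, y):
        # least-squares fit of the readout weights on the basis activations
        self.w, *_ = np.linalg.lstsq(self._phi(x), y, rcond=None)

    def predict(self, x):
        return self._phi(x) @ self.w

x, y = make_data()
net = RBFNet()
net.fit(x, y)
err = np.abs(net.predict(x) - y).mean()  # mean absolute training error
```

The appeal of this architecture in the visuomotor setting is that the basis units combine retinal and posture signals multiplicatively, much like gain-modulated parietal neurons, while the output stage remains a simple trainable linear map.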