Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/19325
Full metadata record
DC Field | Value | Language
dc.contributor.author | van Polanen, V | -
dc.contributor.author | Tibold, R | -
dc.contributor.author | Nuruki, A | -
dc.contributor.author | Davare, M | -
dc.date.accessioned | 2019-10-17T10:43:51Z | -
dc.date.available | 2019-04-01 | -
dc.date.available | 2019-10-17T10:43:51Z | -
dc.date.issued | 2019-04-01 | -
dc.identifier.citation | van Polanen V, Tibold R, Nuruki A, Davare M. Visual delay affects force scaling and weight perception during object lifting in virtual reality. Journal of Neurophysiology. 2019 Apr 1;121(4):1398-1409. | en_US
dc.identifier.issn | 0022-3077 | -
dc.identifier.uri | https://bura.brunel.ac.uk/handle/2438/19325 | -
dc.description.abstract | Lifting an object requires precise scaling of fingertip forces based on a prediction of object weight. At object contact, a series of tactile and visual events arise that need to be rapidly processed online to fine-tune the planned motor commands for lifting the object. The brain mechanisms underlying multisensory integration serially at transient sensorimotor events, a general feature of actions requiring hand-object interactions, are not yet understood. In this study we tested the relative weighting between haptic and visual signals when they are integrated online into the motor command. We used a new virtual reality setup to desynchronize visual feedback from haptics, which allowed us to probe the relative contribution of haptics and vision in driving participants’ movements when they grasped virtual objects simulated by two force-feedback robots. We found that visual delay changed the profile of fingertip force generation and led participants to perceive objects as heavier than when lifts were performed without visual delay. We further modeled the effect of vision on motor output by manipulating the extent to which delayed visual events could bias the force profile, which allowed us to determine the specific weighting the brain assigns to haptics and vision. Our results show for the first time how visuo-haptic integration is processed at discrete sensorimotor events for controlling object-lifting dynamics and further highlight the organization of multisensory signals online for controlling action and perception. | en_US
dc.description.sponsorship | Belgium Fonds Wetenschappelijk Onderzoek (FWO) Post-doctoral Fellowship; Belgium FWO Odysseus Grant; Biotechnology and Biological Sciences Research Council David Phillips Fellowship (UK); Japan Society for the Promotion of Science | en_US
dc.format.extent | 1398-1409 | -
dc.language.iso | en | en_US
dc.publisher | American Physiological Society | en_US
dc.subject | Force | en_US
dc.subject | Grasping | en_US
dc.subject | Multisensory | en_US
dc.subject | Virtual reality | en_US
dc.subject | Weight perception | en_US
dc.title | Visual delay affects force scaling and weight perception during object lifting in virtual reality | en_US
dc.type | Article | en_US
dc.identifier.doi | https://doi.org/10.1152/jn.00396.2018 | -
dc.relation.isPartOf | Journal of Neurophysiology | -
pubs.issue | 4 | -
pubs.publication-status | Published | -
pubs.volume | 121 | -
dc.identifier.eissn | 1522-1598 | -
Appears in Collections: Dept of Health Sciences Research Papers

Files in This Item:
File | Description | Size | Format
FullText.pdf | - | 1.09 MB | Adobe PDF

