Author: McLachlan, G.; Lladó, P.; Peremans, H.
Title: Head rotations follow those of a truncated Fick gimbal during an auditory-guided visual search task
Type: A1 Journal article
Year: 2024
Publication: Journal of Neurophysiology
Volume: 132
Issue: 6
Pages: 1857-1866
Keywords: A1 Journal article; Engineering sciences. Technology; Engineering Management (ENM); Condensed Matter Theory (CMT)
Abstract: Recent interest in dynamic sound localization models has created a need to better understand the head movements made by humans. Previous studies have shown that static head positions and small oscillations of the head obey Donders' law: for each facing direction there is one unique three-dimensional orientation. It is unclear whether this same constraint applies to audiovisual localization, where head movement is unrestricted and subjects may rotate their heads depending on the available auditory information. In an auditory-guided visual search task, human subjects were instructed to localize an audiovisual target within a field of visual distractors in the frontal hemisphere. During this task, head and torso movements were monitored with a motion capture system. Head rotations were found to follow Donders' law during search tasks. Individual differences were present in the amount of roll that subjects deployed, though there was no statistically significant improvement in model performance when including these individual differences in a gimbal model. The roll component of head rotation could therefore be predicted with a truncated Fick gimbal, which consists of a pitch axis nested within a yaw axis. This led to a reduction from three to two degrees of freedom when modeling head movement during localization tasks.
NEW & NOTEWORTHY: Understanding how humans utilize head movements during sound localization is crucial for the advancement of auditory perception models and the improvement of practical applications like hearing aids and virtual reality systems. By analyzing head motion data from an auditory-guided visual search task, we concluded that findings from earlier studies on head movement can be generalized to audiovisual localization and, from this, proposed a simple model for head rotation that reduces the number of degrees of freedom.
ISSN: 0022-3077
Publication Date: 2024-10-30
Additional Links: UA library record
Approved: no
Call Number: UA @ admin @ c:irua:210733
Serial: 9378
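The "truncated Fick gimbal" named in the title and abstract (a pitch axis nested within a yaw axis, with no independent roll axis) can be illustrated as a composition of two single-axis rotations. The sketch below is not from the paper; the function names and the axis convention (z vertical, y interaural, x forward) are assumptions chosen for the example:

```python
import numpy as np

def yaw(theta):
    """Rotation about the vertical (z) axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def pitch(phi):
    """Rotation about the interaural (y) axis."""
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[ c,  0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s,  0.0, c]])

def fick_head_orientation(theta, phi):
    # Truncated Fick gimbal: the pitch axis is carried by the yaw axis and
    # there is no roll term, so two angles determine the full 3-D head
    # orientation -- consistent with Donders' law (one unique orientation
    # per facing direction).
    return yaw(theta) @ pitch(phi)
```

Because the orientation is the product of only two single-axis rotations, the model has two degrees of freedom instead of three: the roll component of any measured head rotation is then fully determined by the yaw and pitch angles.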