Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/15183
Full metadata record
DC Field | Value | Language
dc.contributor.author | Li, Y | -
dc.contributor.author | Roth, T | -
dc.contributor.author | Weier, M | -
dc.contributor.author | Hinkenjann, A | -
dc.contributor.author | Slusallek, P | -
dc.date.accessioned | 2017-09-22T10:05:53Z | -
dc.date.available | 2017-09-22T10:05:53Z | -
dc.date.issued | 2017 | -
dc.identifier.citation | Journal of Eye Movement Research | en_US
dc.identifier.issn | 1995-8692 | -
dc.identifier.uri | http://bura.brunel.ac.uk/handle/2438/15183 | -
dc.description.abstract | This work presents the analysis of data recorded by an eye tracking device in the course of evaluating a foveated rendering approach for head-mounted displays (HMDs). Foveated rendering methods adapt the image synthesis process to the user’s gaze, exploiting the limitations of the human visual system to increase rendering performance. Foveated rendering is especially promising when strict requirements must be met, such as low-latency rendering to cope with high display refresh rates. This is crucial for virtual reality (VR), where a high level of immersion is an important factor; such immersion can only be achieved with high rendering performance and also helps to reduce nausea. We put things in context by first providing basic information about our rendering system, followed by a description of the user study and the collected data. This data stems from fixation tasks that subjects had to perform while being shown fly-through sequences of virtual scenes on an HMD. These fixation tasks consisted of a combination of various scenes and fixation modes. Besides static fixation targets, moving targets on randomized paths as well as a free focus mode were tested. Using this data, we estimate the precision of the utilized eye tracker and analyze the participants’ accuracy in focusing on the displayed fixation targets. We also examine eccentricity-dependent quality ratings. Comparing this information with the users’ quality ratings for the displayed sequences then reveals an interesting connection between fixation modes, fixation accuracy, and quality ratings. | en_US
dc.description.sponsorship | We would like to thank NVIDIA for providing us with two Quadro K6000 graphics cards for the user study, the Intel Visual Computing Institute, the European Union (EU) for co-funding as part of the Dreamspace project, the German Federal Ministry for Economic Affairs and Energy (BMWi) for funding the MATEDIS ZIM project (grant no. KF2644109), and the Federal Ministry of Education and Research (BMBF) for funding the project OLIVE (grant no. 13N13161). | en_US
dc.language.iso | en | en_US
dc.publisher | Journal of Eye Movement Research | en_US
dc.subject | Rendering | en_US
dc.subject | Ray tracing | en_US
dc.subject | Data analysis | en_US
dc.subject | Perceived quality | en_US
dc.subject | Eye tracking | en_US
dc.subject | Foveated rendering | en_US
dc.subject | Eye movement | en_US
dc.subject | Region of interest | en_US
dc.subject | Gaze | en_US
dc.title | A Quality-Centered Analysis of Eye Tracking Data in Foveated Rendering | en_US
dc.type | Article | en_US
dc.relation.isPartOf | Journal of Eye Movement Research | -
pubs.publication-status | Accepted | -
Appears in Collections: Dept of Health Sciences Research Papers

Files in This Item:
File | Description | Size | Format
Fulltext.pdf | | 8.08 MB | Adobe PDF


Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.