[visionlist] Pre-VSS Interactive symposium on eye tracking in VR (Thursday, May 17th, 10am-3pm)

Gabriel Diaz Gabriel.Diaz at rit.edu
Tue Apr 10 11:45:00 -05 2018


This year at VSS (Thursday, May 17, 10:00 am - 3:00 pm, Jasmine/Palm), the Perform Lab at the Rochester Institute of Technology (http://www.cis.rit.edu/performlab/projects), of which I, Gabriel Diaz, am the director, will be offering a free interactive symposium on eye tracking in virtual reality. The aim of this symposium is to help interested researchers incorporate eye tracking technology into their laboratories and into their next funding proposals. The workshop will alternate periods of lecture with goal-directed programming assignments in Python or MATLAB. This symposium is sponsored by the OSA Vision technical group. Food may be provided; we are still working out the details.


We would appreciate it if you would send RSVPs to gabriel.diaz at rit.edu



Anticipated lecture topics include:

- An overview of the hardware (resolution, etc.).
- A quick overview of related research.
- An overview of the software (e.g. game engines, SDKs), and what it can and cannot do for you.
- Methods for assessing and reducing spatial and temporal inaccuracies.
- Algorithms for identifying what someone is looking at.
- Algorithms for identifying where someone is looking relative to a fixed point in space.
- Algorithms for the classification of gaze events (e.g. fixation, head tracking, VOR).
- Relevant metrics to include in your next publication involving eye tracking in VR.
- What types of students to hire, and what skills they should develop.
- Issues that shrewd reviewers of a related funding proposal may raise, and how to address them.
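As a small illustration of the accuracy-assessment topic above — a sketch of my own, not part of the symposium materials — the angular error between a measured gaze direction and a known calibration-target direction can be computed with numpy roughly as follows:

```python
import numpy as np

def angular_error_deg(gaze_dir, target_dir):
    """Angle in degrees between a measured gaze vector and a target vector."""
    g = np.asarray(gaze_dir, dtype=float)
    t = np.asarray(target_dir, dtype=float)
    g = g / np.linalg.norm(g)
    t = t / np.linalg.norm(t)
    # Clip guards against floating-point dot products just outside [-1, 1]
    return np.degrees(np.arccos(np.clip(np.dot(g, t), -1.0, 1.0)))

# Example: gaze one degree off a straight-ahead target
err = angular_error_deg([0.0, 0.0, 1.0],
                        [np.sin(np.radians(1)), 0.0, np.cos(np.radians(1))])
print(err)  # approximately 1.0 (degrees)
```

Averaging this error over a grid of calibration targets is one common way to report spatial accuracy.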



The lecture portion of the symposium will be delivered by me and by graduate students Kamran Binaee, Rakshit Kothari, and Catherine Fromm.


Anticipated programming assignments include:

- Data representation
- Filtering
- The measurement of spatial accuracy and temporal latency
- The transformation matrix, nested coordinate systems, and the gaze-in-world vector
- Event detection (e.g. saccade identification)
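To give a flavor of the transformation-matrix assignment, here is a minimal hypothetical sketch (the function name and coordinate conventions are my assumptions, not the symposium's code) of rotating a head-centered gaze direction into world coordinates using the rotation part of a 4x4 head-pose matrix:

```python
import numpy as np

def gaze_in_world(head_pose, gaze_dir_head):
    """Rotate a head-centered gaze direction into world coordinates.

    head_pose: 4x4 homogeneous transform (head -> world).
    gaze_dir_head: 3-vector gaze direction in head coordinates.
    Directions are rotated only; the translation part is not applied.
    """
    R = np.asarray(head_pose, dtype=float)[:3, :3]  # rotation block of the transform
    d = R @ np.asarray(gaze_dir_head, dtype=float)
    return d / np.linalg.norm(d)

# Example: head yawed 90 degrees about the vertical (y) axis
yaw90 = np.eye(4)
yaw90[:3, :3] = [[0, 0, 1], [0, 1, 0], [-1, 0, 0]]
print(gaze_in_world(yaw90, [0, 0, 1]))  # straight-ahead gaze now points along +x
```

In a real pipeline the same idea chains through nested coordinate systems (eye-in-head, head-in-world) by multiplying the corresponding transforms.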


Python users: Example data and code will be provided as Jupyter notebooks running Python 3, so attendees are encouraged to arrive with Continuum's Anaconda distribution installed on their machines (https://www.continuum.io/downloads), with the numpy, pandas, and plotly modules pre-installed.
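Assuming the Anaconda distribution mentioned above, the required modules can typically be installed from a terminal along these lines (exact channels may vary by Anaconda version):

```shell
# Install the modules used in the notebooks into the active conda environment
conda install numpy pandas
# plotly may need the pip fallback on some Anaconda versions
conda install plotly || pip install plotly
```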

This symposium is intended to educate and to facilitate adoption of an emerging technology. If a topic you are interested in is not listed, or if you are a researcher or manufacturer who feels you can contribute knowledge on one or more of these topics (in the form of literature), you are encouraged to contact the organizer at gabriel.diaz at rit.edu.


