[visionlist] About help on eye-tracking of head free non-human primates

Gabriel Diaz Gabriel.Diaz at rit.edu
Tue Jul 17 15:34:53 -05 2018


Diedrich, you're right! I've been spending too much time using head-mounted
trackers.  Thanks for the clarification.

----------------------

Gabriel J. Diaz, Ph.D.
Assistant Professor
Rochester Institute of Technology
Chester F. Carlson Center for Imaging Science

*Founder of PerForM Labs*
*Click for demos.* <https://www.cis.rit.edu/performlab/>

Office 2108, Building #76
Rochester, NY 14623
Office: (585) 475-6215
gabriel.diaz at rit.edu

On Tue, Jul 17, 2018 at 4:28 PM, Diederick C. Niehorster <dcnieho at gmail.com>
wrote:

> Hi Harish, Gabe, et al,
>
> Disclaimer: I have never built my own eye-tracker, so the below is
> theoretical.
>
> I do not see why a remote eye-tracker (which is what I read
> Harish's question as referring to; you may be talking about
> head-mounted ones) would need to know anything about head pose. All
> that is needed is a measurement of eyeball position and orientation in
> 3D space. That defines the origin and direction of the gaze vector,
> which you can then intersect with an arbitrary object, such as the
> stimulus plane.
> In fact, a remote eye-tracker gives you gaze-in-world data, and
> recovering eye-in-head signals is what you would need head pose
> information for.
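The ray–plane intersection described above is a few lines of linear algebra. Here is a minimal sketch, assuming the eyeball position and gaze direction are already expressed in the same world coordinate frame as the stimulus plane (the coordinate convention and the 60 cm example geometry are illustrative assumptions, not from the original message):

```python
import numpy as np

def intersect_gaze_with_plane(eye_pos, gaze_dir, plane_point, plane_normal):
    """Intersect a gaze ray (origin eye_pos, direction gaze_dir) with a
    plane defined by a point on it and its normal. All arguments are
    3-vectors in the same world coordinate frame. Returns the intersection
    point, or None if the ray is parallel to the plane or the plane lies
    behind the eye."""
    eye_pos = np.asarray(eye_pos, dtype=float)
    gaze_dir = np.asarray(gaze_dir, dtype=float)
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    plane_point = np.asarray(plane_point, dtype=float)
    plane_normal = np.asarray(plane_normal, dtype=float)

    denom = np.dot(plane_normal, gaze_dir)
    if abs(denom) < 1e-9:           # gaze ray parallel to the plane
        return None
    t = np.dot(plane_point - eye_pos, plane_normal) / denom
    if t < 0:                       # plane is behind the eye
        return None
    return eye_pos + t * gaze_dir

# Example: eye 60 cm in front of a screen lying in the z = 0 plane,
# looking straight at it.
point = intersect_gaze_with_plane(
    eye_pos=[0.0, 0.0, 0.6],
    gaze_dir=[0.0, 0.0, -1.0],
    plane_point=[0.0, 0.0, 0.0],
    plane_normal=[0.0, 0.0, 1.0],
)
```

With on-screen gaze positions recovered this way, no head pose information is needed, which is the point being made above.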
>
> Harish, three systems I would recommend you evaluate are:
> 1. the Tobii Spectrum. It has a monkey-tracking mode, though I have no
> idea how well it works. Performance appears solid and robust with
> humans.
> 2. the SmartEye system. It can be specced with up to 8 cameras to cover
> a large area in which gaze can be tracked.
> 3. the LC Technologies EyeFollower. Its cameras move (in both
> orientation and focus), which gives it a very large headbox.
>
> All the best,
> Dr. Diederick Niehorster
> Researcher and Research Engineer,
> The Lund University Humanities Lab, and
> Department of Psychology,
> Lund University,
> Sweden
> http://www.humlab.lu.se/en/person/DiederickCNiehorster/
>
> On Tue, Jul 17, 2018 at 1:45 PM, Gabriel Diaz <Gabriel.Diaz at rit.edu>
> wrote:
> > Harish, I've spent some time thinking about this problem in the past,
> > and it seems that we have some common ideas, and some differences in
> > opinion. Here are a few thoughts...
> >
> > 1. Is there a good eye tracking system that already has macaque face
> > appearance templates built in?
> >
> > Not that I know of!  ...but, yours is a good comment, because it
> > suggests that you're thinking of using face tracking to estimate head
> > pose, which must be combined with eye-in-head angles to recover the
> > gaze vector within a world-based reference frame.  That's the right
> > way to approach the problem. If you aren't familiar with the
> > technique, there are plenty of good tutorials for head pose
> > estimation online. For example, here is one.  I worry a bit that the
> > hairy macaque face may not provide enough stable features for
> > tracking, but I'm not very experienced with the algorithms, so don't
> > let my hesitations hold you back.
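The combination step mentioned above is straightforward once head pose is available (however it is estimated, e.g. from facial landmarks): rotate the eye-in-head gaze direction by the head rotation to obtain the world-frame gaze ray. A minimal sketch, where the azimuth/elevation convention (x right, y up, z out of the face, angles in radians) is an assumption for illustration:

```python
import numpy as np

def eye_in_head_vector(azimuth, elevation):
    """Unit gaze direction in head coordinates from azimuth/elevation
    angles in radians. Convention (an assumption): x right, y up,
    z out of the face."""
    return np.array([
        np.cos(elevation) * np.sin(azimuth),
        np.sin(elevation),
        np.cos(elevation) * np.cos(azimuth),
    ])

def gaze_in_world(head_rotation, head_position, azimuth, elevation):
    """Combine head pose with eye-in-head angles.
    head_rotation: 3x3 world-from-head rotation matrix (e.g. from a
    head pose estimator); head_position: eye origin in world coordinates.
    Returns (origin, direction) of the world-frame gaze ray."""
    direction = np.asarray(head_rotation) @ eye_in_head_vector(azimuth, elevation)
    return np.asarray(head_position, dtype=float), direction

# Example: head turned 90 degrees to the left about the vertical (y) axis,
# eyes looking straight ahead within the head.
theta = np.pi / 2
R = np.array([[np.cos(theta), 0.0, np.sin(theta)],
              [0.0, 1.0, 0.0],
              [-np.sin(theta), 0.0, np.cos(theta)]])
origin, direction = gaze_in_world(R, [0.0, 0.0, 0.0],
                                  azimuth=0.0, elevation=0.0)
```

The resulting world-frame ray can then be intersected with the screen or any other object of interest.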
> >
> >
> > 2. Are there any novel ways of placing the screen and tracker that
> > result in better eye-tracking? We have tried various ways of placing
> > trackers below the screen and at various distances from the animal.
> >
> > You're going through the trouble of facilitating recording from
> > freely moving macaques, and you still want to use a screen!?!
> > Doesn't that defeat the purpose of facilitating natural behavior?
> > In any case, answering this question requires a lot more knowledge
> > about what kind of eye tracker you are trying to use. My guess is
> > that you're using a remote tracker placed near the screen, and the
> > degradation is due to the small size of the pupil in the remote eye
> > camera when the head is further away.  That's pure speculation.
> >
> >
> > 3. Are there multi-camera eye-tracker systems that we can set up
> > from different viewpoints so that one or more can always have a
> > clear view of the animal?
> >
> > Not that I've seen. I have discussed building something like this
> > before with colleagues.  That would be a feat of engineering and
> > software development that requires a very firm grasp of multiview
> > geometry.  That is a multi-year project.
> >
> >
> > 4. Do these systems have hardware input for behavioral event markers and
> > analog/digital outputs of eye-gaze data so that we can sync it with our
> > neural data acquisition?
> >
> > N/A, because these systems don't yet exist.
> >
> > Hope that was somewhat informative.  Sorry if it is disappointing!
> >
> > ----------------------
> >
> > Gabriel J. Diaz, Ph.D.
> > Assistant Professor
> > Rochester Institute of Technology
> > Chester F. Carlson Center for Imaging Science
> >
> > Founder of PerForM Labs
> > Click for demos.
> >
> > Office 2108, Building #76
> > Rochester, NY 14623
> > Office: (585) 475-6215
> > gabriel.diaz at rit.edu
> >
> > On Mon, Jul 16, 2018 at 1:27 PM, Stefan Dowiasch
> > <stefan.dowiasch at physik.uni-marburg.de> wrote:
> >>
> >> Dear all,
> >>
> >> I am Stefan, a visiting assistant professor at the Department of
> >> Neurophysics at the University of Marburg and Chief Scientific
> >> Officer at Thomas RECORDING GmbH, Germany.
> >> We faced the same problems in the past and are currently working on
> >> a solution that allows freely moving primates to perform behavioral
> >> tasks in their home cages or in special arenas, in combination with
> >> eye tracking and single-cell recordings.
> >> Recently we finished the first step, i.e. a training and
> >> experimental unit for freely moving primates, which is commercially
> >> available from Thomas RECORDING (please see:
> >> https://www.thomasrecording.com/products/neuroscience-products/primate-training-systems/incage-training-system-icts.html).
> >> You can find a demo video of the system on YouTube:
> >> https://youtu.be/yDOZauNSwqs
> >> In short: the system consists of a ruggedized tablet computer, a
> >> flexible cage-mountable holding device and an integrated reward
> >> unit. Currently the built-in front-facing camera can be used to
> >> monitor the animal and its overall behavior. However, we are working
> >> on a software update to add basic eye tracking features (fixation
> >> control, saccade/antisaccade tasks, etc.) to the system.
> >> Furthermore, a trigger interface for synchronization with chronic
> >> recording devices (e.g. the wireless version of the AMEP system,
> >> https://www.physiology.org/doi/abs/10.1152/jn.00504.2017) is in
> >> development.
> >>
> >> Taken together, I think this system should meet most of your
> >> requirements regarding eye tracking and single-unit recordings in
> >> freely moving primates. At the moment, you can start training your
> >> animals with the system and get them used to the new environment.
> >> In the near future, you will be able to upgrade your existing device
> >> with a new software package, giving you the possibility to track the
> >> eyes of the primate and synchronize your behavioral and eye tracking
> >> data with your physiological recordings.
> >>
> >> If you have further questions or suggestions, please feel free to
> >> contact me anytime.
> >>
> >> Best regards,
> >>
> >> Dr. Stefan Dowiasch
> >>
> >>
> >> On 14.07.2018 at 21:09, visionlist-request at visionscience.com wrote:
> >>
> >> Date: Sat, 14 Jul 2018 12:00:45 +0530
> >> From: Harish Katti <harish2006 at gmail.com>
> >> To: visionlist at visionscience.com
> >> Subject: [visionlist] About help on eye-tracking of head free
> >>      non-human primates
> >> Message-ID:
> >>      <CAOei6hAoRnc=aApwyws4R2WiZ6EXd9K4q-JPSj+u+TwWHi9ALA at mail.gmail.com>
> >> Content-Type: text/plain; charset="utf-8"
> >>
> >>  Dear all,
> >>       I am Harish, a post-doctoral fellow in Dr SP Arun's
> >> experimental vision group at the Centre for Neuroscience, Indian
> >> Institute of Science. I'm posting this to get feedback from
> >> researchers who have tried automated eye-gaze/head-pose/body-pose
> >> tracking of freely moving non-human primates.
> >>
> >> In our lab we are trying to set up eye tracking in monkeys without
> >> any head restraints. Our plan is to have a behavioural arena where
> >> the animal is not head-fixed and can come up to a touch screen and
> >> perform simple tasks in return for juice rewards. Since the animals
> >> are not head-fixed, the eye tracking needs to be done in a manner
> >> that can handle changes in body and head pose. We have been
> >> evaluating a few commercial eye-tracking systems but find that the
> >> trackers have difficulty finding the face/eyes. It would be nice to
> >> have your input on the following issues:
> >>
> >> 1. Is there a good eye tracking system that already has macaque face
> >> appearance templates built in?
> >>
> >> 2. Are there any novel ways of placing the screen and tracker that
> >> result in better eye-tracking? We have tried various ways of placing
> >> trackers below the screen and at various distances from the animal.
> >>
> >> 3. Are there multi-camera eye-tracker systems that we can set up
> >> from different viewpoints so that one or more can always have a
> >> clear view of the animal?
> >>
> >> 4. Do these systems have hardware input for behavioral event markers
> >> and analog/digital outputs of eye-gaze data so that we can sync it
> >> with our neural data acquisition?
> >>
> >> best,
> >> Harish
> >>
> >>
> >> _______________________________________________
> >> visionlist mailing list
> >> visionlist at visionscience.com
> >> http://visionscience.com/mailman/listinfo/visionlist_visionscience.com
> >>
> >
>