<div dir="ltr"><div class="gmail_default" style="font-family:arial,helvetica,sans-serif;font-size:small"><span style="color:rgb(0,0,0);font-family:arial,sans-serif;background-color:rgb(255,255,255);text-decoration-style:initial;text-decoration-color:initial;float:none;display:inline">Harish, I've spent some time thinking about this problem in the past, and it seems that we share some ideas and differ on others.  Here are a few thoughts...</span></div><div class="gmail_default" style="font-family:arial,helvetica,sans-serif;font-size:small"><span style="color:rgb(0,0,0);font-family:arial,sans-serif;background-color:rgb(255,255,255);text-decoration-style:initial;text-decoration-color:initial;float:none;display:inline"><br></span></div><div class="gmail_default" style="font-family:arial,helvetica,sans-serif;font-size:small"><i><span style="color:rgb(0,0,0);font-family:arial,sans-serif;background-color:rgb(255,255,255);text-decoration-style:initial;text-decoration-color:initial;float:none;display:inline">1. Is there a good eye tracking system that already has macaque face<span> </span></span><span style="color:rgb(0,0,0);font-family:arial,sans-serif;background-color:rgb(255,255,255);text-decoration-style:initial;text-decoration-color:initial;float:none;display:inline">appearance templates built in?</span></i><br></div><div class="gmail_default" style="font-family:arial,helvetica,sans-serif;font-size:small"><span style="color:rgb(0,0,0);font-family:arial,sans-serif;background-color:rgb(255,255,255);text-decoration-style:initial;text-decoration-color:initial;float:none;display:inline"><br></span></div><blockquote style="margin:0 0 0 40px;border:none;padding:0px"><div class="gmail_default" style="font-family:arial,helvetica,sans-serif;font-size:small"><span style="color:rgb(0,0,0);font-family:arial,sans-serif;background-color:rgb(255,255,255);text-decoration-style:initial;text-decoration-color:initial;float:none;display:inline">Not that I know of!  
...but yours is a good question, because it suggests that you're thinking of using face tracking to estimate head pose, which must be combined with eye-in-head angles to recover the gaze vector within a world-based reference frame.  That's the right way to approach the problem. If you aren't familiar with the technique, there are plenty of good tutorials for head pose estimation online. <a href="https://www.learnopencv.com/head-pose-estimation-using-opencv-and-dlib/" target="_blank">For example, here is one.</a>  I worry a bit that the hairy macaque face may not provide enough stable features for tracking, but I'm not very experienced with the algorithms, so don't let my hesitations hold you back.</span></div></blockquote><div class="gmail_default" style="font-family:arial,helvetica,sans-serif;font-size:small"><span style="color:rgb(0,0,0);font-family:arial,sans-serif;background-color:rgb(255,255,255);text-decoration-style:initial;text-decoration-color:initial;float:none;display:inline"><br></span></div><div class="gmail_default" style="font-family:arial,helvetica,sans-serif;font-size:small"><span style="color:rgb(0,0,0);font-family:arial,sans-serif;background-color:rgb(255,255,255);text-decoration-style:initial;text-decoration-color:initial;float:none;display:inline"><i><span style="text-decoration-style:initial;text-decoration-color:initial;font-size:small;float:none;display:inline">2. Are there any novel ways of placing the screen and tracker that<span> </span></span><span style="text-decoration-style:initial;text-decoration-color:initial;font-size:small;float:none;display:inline">result in better eye-tracking? 
We have tried various ways of placing<span> </span></span></i><span class="m_-3789227149921482451gmail-m_-4754232528054036665gmail-im" style="text-decoration-style:initial;text-decoration-color:initial;font-size:small"><i>trackers below the screen and at various distances from the animal.</i><br></span></span></div><div class="gmail_default" style="font-family:arial,helvetica,sans-serif;font-size:small"><span style="color:rgb(0,0,0);font-family:arial,sans-serif;background-color:rgb(255,255,255);text-decoration-style:initial;text-decoration-color:initial;float:none;display:inline"><br></span></div><blockquote style="margin:0 0 0 40px;border:none;padding:0px"><div class="gmail_default" style="font-size:small"><font color="#000000">You're going through the trouble of facilitating recording from freely moving macaques, and <i>you still want to use a screen!?!  </i>Doesn't that defeat the purpose of enabling natural behavior?  In any case, answering this question requires a lot more knowledge about what kind of eye tracker you are trying to use. My guess is that you're using a remote tracker placed near the screen, and the degradation is due to the small apparent size of the pupil in the remote eye camera when the head is further away.  That's pure speculation.</font></div></blockquote><div class="gmail_default" style="font-size:small"><font color="#000000"><br></font></div><div class="gmail_default" style="font-size:small"><font color="#000000"><i><span style="background-color:rgb(255,255,255);text-decoration-style:initial;text-decoration-color:initial;font-size:small;float:none;display:inline">3. 
Are there multi-camera eye-tracker systems that we can set up from<span>  </span></span><span class="m_-3789227149921482451gmail-m_-4754232528054036665gmail-im" style="background-color:rgb(255,255,255);text-decoration-style:initial;text-decoration-color:initial;font-size:small">different viewpoints so that one or more can always have a clear view<span>  </span></span><span style="background-color:rgb(255,255,255);text-decoration-style:initial;text-decoration-color:initial;font-size:small;float:none;display:inline">of the animal?</span></i><br></font></div><div class="gmail_default" style="font-family:arial,helvetica,sans-serif;font-size:small"><span style="color:rgb(0,0,0);font-family:arial,sans-serif;background-color:rgb(255,255,255);text-decoration-style:initial;text-decoration-color:initial;float:none;display:inline"><br></span></div><blockquote style="margin:0 0 0 40px;border:none;padding:0px"><div class="gmail_default" style="font-family:arial,helvetica,sans-serif;font-size:small"><span style="color:rgb(0,0,0);font-family:arial,sans-serif;background-color:rgb(255,255,255);text-decoration-style:initial;text-decoration-color:initial;float:none;display:inline">Not that I've seen. I have discussed building something like this with colleagues before.  It would be a feat of engineering and software development, one that requires a very firm grasp of multiview geometry, and likely a multi-year project.  </span></div></blockquote><div class="gmail_default" style="font-family:arial,helvetica,sans-serif;font-size:small"><br></div><div class="gmail_default" style="font-family:arial,helvetica,sans-serif;font-size:small"><span style="color:rgb(0,0,0);font-family:arial,sans-serif;background-color:rgb(255,255,255);text-decoration-style:initial;text-decoration-color:initial;float:none;display:inline"><i><span style="text-decoration-style:initial;text-decoration-color:initial;font-size:small;float:none;display:inline">4. 
Do these systems have hardware input for behavioral event markers and<span> </span></span><span class="m_-3789227149921482451gmail-m_-4754232528054036665gmail-im" style="text-decoration-style:initial;text-decoration-color:initial;font-size:small">analog/digital outputs of eye-gaze data so that we can sync it with our<span> </span></span><span style="text-decoration-style:initial;text-decoration-color:initial;font-size:small;float:none;display:inline">neural data acquisition?</span></i></span></div><div class="gmail_default" style="font-family:arial,helvetica,sans-serif;font-size:small"><span style="color:rgb(0,0,0);font-family:arial,sans-serif;background-color:rgb(255,255,255);text-decoration-style:initial;text-decoration-color:initial;float:none;display:inline"><span style="text-decoration-style:initial;text-decoration-color:initial;font-size:small;float:none;display:inline"><br></span></span></div><blockquote style="margin:0 0 0 40px;border:none;padding:0px"><div class="gmail_default" style="font-family:arial,helvetica,sans-serif;font-size:small"><span style="color:rgb(0,0,0);font-family:arial,sans-serif;background-color:rgb(255,255,255);text-decoration-style:initial;text-decoration-color:initial;float:none;display:inline"><span style="text-decoration-style:initial;text-decoration-color:initial;font-size:small;float:none;display:inline">N</span></span><span style="color:rgb(0,0,0);font-family:arial,sans-serif">/A, because these systems don't yet exist.</span></div><div class="gmail_default" style="font-family:arial,helvetica,sans-serif;font-size:small"><span style="color:rgb(0,0,0);font-family:arial,sans-serif"><br></span></div></blockquote><div class="gmail_default"><font color="#000000">Hope that was somewhat informative.  
Sorry if it is disappointing!</font></div><blockquote style="margin:0 0 0 40px;border:none;padding:0px"><div class="gmail_default" style="font-family:arial,helvetica,sans-serif;font-size:small"><span style="color:rgb(0,0,0);font-family:arial,sans-serif"><br></span></div><div class="gmail_default" style="font-family:arial,helvetica,sans-serif;font-size:small"><span style="color:rgb(0,0,0);font-family:arial,sans-serif"><br></span></div></blockquote><div class="gmail_default" style="font-family:arial,helvetica,sans-serif;font-size:small"><span style="color:rgb(0,0,0);font-family:arial,sans-serif"><br></span></div><div class="gmail_default" style="font-family:arial,helvetica,sans-serif;font-size:small"><span style="color:rgb(0,0,0);font-family:arial,sans-serif"><br></span></div><div class="gmail_extra"><br clear="all"><div><div class="m_-3789227149921482451gmail_signature" data-smartmail="gmail_signature"><div dir="ltr"><div style="font-size:small"><div dir="ltr"><div dir="ltr"><div><span style="font-family:arial,helvetica,sans-serif">----------------------</span></div><div><font face="arial, helvetica, sans-serif"><br></font></div><div><font face="arial, helvetica, sans-serif">Gabriel J. Diaz, Ph.D.</font></div></div></div></div><div style="font-size:small"><div dir="ltr"><div dir="ltr"><div style="font-size:12.8px"><div dir="ltr"><div dir="ltr"><font face="arial, helvetica, sans-serif">Assistant Professor<br></font></div><div><font face="arial, helvetica, sans-serif"><span style="font-size:12.8px">Rochester Institute of Technology</span><br></font></div><div><div style="font-size:12.8px"><font face="arial, helvetica, sans-serif">Chester F. 
Carlson Center for Imaging Science</font></div><div style="font-size:12.8px"><div><font face="arial, helvetica, sans-serif"><br></font></div><div><div dir="ltr"><div dir="ltr"><font face="arial, helvetica, sans-serif"><b style="font-size:12.8px">Founder of PerForM Labs</b></font></div></div></div><div><div dir="ltr"><div dir="ltr"><div style="font-size:12.8px"><a href="https://www.cis.rit.edu/performlab/" style="color:rgb(17,85,204)" target="_blank"><b><font face="arial, helvetica, sans-serif">Click for demos.</font></b></a></div></div></div></div><div><font face="arial, helvetica, sans-serif"><br></font></div><div><font face="arial, helvetica, sans-serif">Office 2108, Building #76</font></div><div><font face="arial, helvetica, sans-serif">Rochester, NY 14623</font></div></div></div></div></div><div style="font-size:12.8px"><div dir="ltr"><div dir="ltr"><div style="font-size:12.8px"><font face="arial, helvetica, sans-serif">Office: <a href="tel:(585)%20475-6215" value="+15854756215" style="color:rgb(17,85,204)" target="_blank">(585) 475-6215</a></font></div><div style="font-size:12.8px"><a href="mailto:gabriel.diaz@rit.edu" style="color:rgb(17,85,204)" target="_blank"><font face="arial, helvetica, sans-serif">gabriel.diaz@rit.edu</font></a></div></div></div></div></div></div></div></div></div></div>
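<div class="gmail_default" style="font-family:arial,helvetica,sans-serif;font-size:small">P.S. To make the point in my first answer concrete: once face tracking gives you a head rotation (e.g. from OpenCV's solvePnP), recovering world-frame gaze is just a matter of composing that rotation with the eye-in-head direction. Here is a minimal numpy sketch; the axis conventions (+z out of the face, +y up) and the yaw/pitch parameterization are assumptions I've made for illustration, not something any particular tracker prescribes.</div>

```python
import numpy as np

def head_rotation(yaw_deg, pitch_deg):
    """Head pose as a rotation matrix from yaw (about world 'up', +y)
    and pitch (about +x). Roll is omitted to keep the sketch short."""
    y, p = np.radians([yaw_deg, pitch_deg])
    Ry = np.array([[ np.cos(y), 0.0, np.sin(y)],
                   [ 0.0,       1.0, 0.0      ],
                   [-np.sin(y), 0.0, np.cos(y)]])
    Rx = np.array([[1.0, 0.0,        0.0       ],
                   [0.0, np.cos(p), -np.sin(p)],
                   [0.0, np.sin(p),  np.cos(p)]])
    return Ry @ Rx

def gaze_in_world(R_head, eye_azimuth_deg, eye_elevation_deg):
    """Compose head pose with eye-in-head angles to get a unit gaze
    vector in the world frame. Convention: +z points straight out of
    the face, azimuth swings gaze toward +x, elevation toward +y."""
    az, el = np.radians([eye_azimuth_deg, eye_elevation_deg])
    g_head = np.array([np.cos(el) * np.sin(az),   # rightward component
                       np.sin(el),                # upward component
                       np.cos(el) * np.cos(az)])  # forward component
    return R_head @ g_head

# Head turned 30 deg, eyes rotated a further 10 deg in the same direction:
g = gaze_in_world(head_rotation(30.0, 0.0), 10.0, 0.0)
print(np.degrees(np.arctan2(g[0], g[2])))  # combined gaze azimuth: 40 deg
```

<div class="gmail_default" style="font-family:arial,helvetica,sans-serif;font-size:small">The same composition applies whatever supplies the head pose; the practical difficulty, as I said above, is getting a reliable head pose off a furry macaque face in the first place.</div>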
<br><div class="gmail_quote">On Mon, Jul 16, 2018 at 1:27 PM, Stefan Dowiasch <span dir="ltr"><<a href="mailto:stefan.dowiasch@physik.uni-marburg.de" target="_blank">stefan.dowiasch@physik.uni-<wbr>marburg.de</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
  
    
  
  <div text="#000000" bgcolor="#FFFFFF">
    <p>Dear all,<br>
      <br>
      I am Stefan, a visiting assistant professor at the Department of
      Neurophysics at the University of Marburg and Chief Scientific
      Officer at Thomas RECORDING GmbH, Germany. <br>
      We faced the same problems in the past and are currently working
      on a solution that allows freely moving primates to perform
      behavioral tasks in their home cages or special arenas in
      combination with eye tracking and single cell recordings.<br>
      Recently we finished the first step: a training and
      experimental unit for freely moving primates, which is
      commercially available at Thomas RECORDING (please see:
      <a class="m_-3789227149921482451m_-2675392039381232516moz-txt-link-freetext" href="https://www.thomasrecording.com/products/neuroscience-products/primate-training-systems/incage-training-system-icts.html" target="_blank">https://www.thomasrecording.co<wbr>m/products/neuroscience-produc<wbr>ts/primate-training-systems/<wbr>incage-training-system-icts.<wbr>html</a>).
      You can find a demo video of the system on Youtube: <a class="m_-3789227149921482451m_-2675392039381232516moz-txt-link-freetext" href="https://youtu.be/yDOZauNSwqs" target="_blank">https://youtu.be/yDOZauNSwqs</a>
      <br>
      In short: The system consists of a ruggedized tablet computer, a
      flexible cage-mountable holding device and an integrated reward
      unit. Currently the built-in front-facing camera can be used to
      monitor the animal and its overall behavior. However, we are
      currently working on a software update to add basic eye
      tracking features (fixation control, saccade and antisaccade tasks,
      etc.) to the system. <br>
      Furthermore, a trigger interface for synchronization with chronic
      recording devices (e.g. the wireless version of the AMEP system
      <a class="m_-3789227149921482451m_-2675392039381232516moz-txt-link-freetext" href="https://www.thomasrecording.com/products/neuroscience-products/chronic-recording-devices/small-animal/thomas-wireless-system-tws.html" target="_blank">https://www.thomasrecording.com/products/neuroscience-products/chronic-recording-devices/small-animal/thomas-wireless-system-tws.html</a>)
      is in development.<br>
      <br>
      Taken together, I think this system should meet most of your
      requirements regarding eye tracking and single unit recordings in
      freely moving primates. At the moment, you can start training your
      animals with the system and getting them used to the new
      environment. In the near future, you can upgrade your existing
      device with a new software package that will allow you to
      track the eyes of the primate and synchronize your behavioral and
      eye tracking data with your physiological recordings.<br>
      <br>
      If you have further questions or suggestions, please feel free to
      contact me anytime.<br>
      <br>
      Best regards,<br>
      <br>
      Dr. Stefan Dowiasch</p>
    <br>
    <div class="m_-3789227149921482451m_-2675392039381232516moz-cite-prefix">Am 14.07.2018 um 21:09 schrieb
      <a class="m_-3789227149921482451m_-2675392039381232516moz-txt-link-abbreviated" href="mailto:visionlist-request@visionscience.com" target="_blank">visionlist-request@visionscien<wbr>ce.com</a>:<br>
    </div>
    <blockquote type="cite">
      <pre>Date: Sat, 14 Jul 2018 12:00:45 +0530
From: Harish Katti <a class="m_-3789227149921482451m_-2675392039381232516moz-txt-link-rfc2396E" href="mailto:harish2006@gmail.com" target="_blank"><harish2006@gmail.com></a>
To: <a class="m_-3789227149921482451m_-2675392039381232516moz-txt-link-abbreviated" href="mailto:visionlist@visionscience.com" target="_blank">visionlist@visionscience.com</a>
Subject: [visionlist] About help on eye-tracking of head free
        non-human primates
Message-ID:
        <a class="m_-3789227149921482451m_-2675392039381232516moz-txt-link-rfc2396E" href="mailto:CAOei6hAoRnc=aApwyws4R2WiZ6EXd9K4q-JPSj+u+TwWHi9ALA@mail.gmail.com" target="_blank"><CAOei6hAoRnc=aApwyws4R2WiZ6EX<wbr>d9K4q-JPSj+u+TwWHi9ALA@mail.<wbr>gmail.com></a>
Content-Type: text/plain; charset="utf-8"

 Dear all
      I am Harish, a post-doctoral fellow in Dr SP Arun's experimental
vision group at the Centre for Neuroscience, Indian Institute of Science.
I'm posting this to get feedback from researchers who have tried automated
eye-gaze/head-pose/body-pose tracking of freely moving non-human primates.

In our lab we are trying to setup eye tracking in monkeys without any
head restraints. Our plan is to have a behavioural arena where the
animal is not head-fixed and can come up to a touch screen and perform
simple tasks in return for juice rewards. Since the animals are not
head-fixed, the eye-tracking needs to be done in a manner that can
handle change in body and head pose. We have been evaluating a few
commercial eye-tracking systems but find that the trackers have
difficulty in finding the face/eyes. It will be nice to have your inputs
on the following issues,

1. Is there a good eye tracking system that already has macaque face
appearance templates built in?

2. Are there any novel ways of placing the screen and tracker that
result in better eye-tracking? We have tried various ways of placing
trackers below the screen and at various distances from the animal.

3. Are there multi-camera eye-tracker systems that we can set up from
different viewpoints so that one or more can always have a clear view
of the animal?

4. Do these systems have hardware input for behavioral event markers and
analog/digital outputs of eye-gaze data so that we can sync it with our
neural data acquisition?

best,
Harish

</pre>
    </blockquote>
  </div>

<br>______________________________<wbr>_________________<br>
visionlist mailing list<br>
<a href="mailto:visionlist@visionscience.com" target="_blank">visionlist@visionscience.com</a><br>
<a href="http://visionscience.com/mailman/listinfo/visionlist_visionscience.com" rel="noreferrer" target="_blank">http://visionscience.com/mailm<wbr>an/listinfo/visionlist_visions<wbr>cience.com</a><br>
<br></blockquote></div><br></div></div>