<div dir="ltr"><div class="gmail_default" style="font-family:arial,helvetica,sans-serif;font-size:small">Diedrich, you're right! I've been spending too much time using head-mounted trackers. Thanks for the clarification.</div><div class="gmail_extra"><br clear="all"><div><div class="m_8634035219602642226gmail_signature" data-smartmail="gmail_signature"><div dir="ltr"><div style="font-size:small"><div dir="ltr"><div dir="ltr"><div><span style="font-family:arial,helvetica,sans-serif">----------------------</span></div><div><font face="arial, helvetica, sans-serif"><br></font></div><div><font face="arial, helvetica, sans-serif">Gabriel J. Diaz, Ph.D.</font></div></div></div></div><div style="font-size:small"><div dir="ltr"><div dir="ltr"><div style="font-size:12.8px"><div dir="ltr"><div dir="ltr"><font face="arial, helvetica, sans-serif">Assistant Professor<br></font></div><div><font face="arial, helvetica, sans-serif"><span style="font-size:12.8px">Rochester Institute of Technology</span><br></font></div><div><div style="font-size:12.8px"><font face="arial, helvetica, sans-serif">Chester F. Carlson Center for Imaging Science</font></div><div style="font-size:12.8px"><div><font face="arial, helvetica, sans-serif"><br></font></div><div><div dir="ltr"><div dir="ltr"><font face="arial, helvetica, sans-serif"><b style="font-size:12.8px">Founder of PerForM Labs</b></font></div></div></div><div><div dir="ltr"><div dir="ltr"><div style="font-size:12.8px"><a href="https://www.cis.rit.edu/performlab/" style="color:rgb(17,85,204)" target="_blank"><b><font face="arial, helvetica, sans-serif">Click for demos.</font></b></a></div></div></div></div><div><font face="arial, helvetica, sans-serif"><br></font></div><div><font face="arial, helvetica, sans-serif">Office 2108, Building #76</font></div><div><font face="arial, helvetica, sans-serif">Rochester, NY 14623</font></div></div></div></div></div><div style="font-size:12.8px"><div dir="ltr"><div dir="ltr"><div style="font-size:12.8px"><font face="arial, helvetica, sans-serif">Office: <a href="tel:(585)%20475-6215" value="+15854756215" style="color:rgb(17,85,204)" target="_blank">(585) 475-6215</a></font></div><div style="font-size:12.8px"><a href="mailto:gabriel.diaz@rit.edu" style="color:rgb(17,85,204)" target="_blank"><font face="arial, helvetica, sans-serif">gabriel.diaz@rit.edu</font></a></div></div></div></div></div></div></div></div></div></div>
<br><div class="gmail_quote">On Tue, Jul 17, 2018 at 4:28 PM, Diederick C. Niehorster <span dir="ltr"><<a href="mailto:dcnieho@gmail.com" target="_blank">dcnieho@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">Hi Harish, Gabe, et al,<br>
<br>
Disclaimer: I have never built my own eye-tracker, so what follows is theoretical.

I do not see why a remote eye-tracker (which is what I read Harish's
question as referring to; you may be talking about head-mounted ones)
would need to know anything about head pose. All that is needed is a
measurement of eyeball position and orientation in 3D space. That
defines the origin and direction of the gaze vector, which you can then
intersect with an arbitrary object, such as the stimulus plane.
In fact, a remote eye-tracker gives you gaze-in-world data, and
recovering eye-in-head signals is what you would need head pose
information for.
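
To make that intersection step concrete, here is a minimal sketch in
Python/NumPy, assuming the tracker reports eyeball position and gaze
direction in the same world frame as the screen (the function name and
the plane parameterization are mine, not any particular vendor's API):

    import numpy as np

    def gaze_on_plane(eye_pos, gaze_dir, plane_point, plane_normal):
        """Intersect a gaze ray with a plane such as the stimulus screen.

        eye_pos:      3D eyeball center in world coordinates
        gaze_dir:     3D gaze direction (need not be unit length)
        plane_point:  any point on the screen plane
        plane_normal: normal vector of the screen plane
        Returns the 3D intersection point, or None if the gaze is
        (near-)parallel to the plane or the plane lies behind the eye.
        """
        eye_pos = np.asarray(eye_pos, dtype=float)
        gaze_dir = np.asarray(gaze_dir, dtype=float)
        denom = np.dot(plane_normal, gaze_dir)
        if abs(denom) < 1e-9:   # gaze (near-)parallel to the screen
            return None
        t = np.dot(plane_normal, np.asarray(plane_point) - eye_pos) / denom
        if t < 0:               # screen is behind the eye
            return None
        return eye_pos + t * gaze_dir

    # Example: screen plane 0.6 m in front of the eye, facing the viewer.
    hit = gaze_on_plane([0.0, 0.0, 0.0], [0.05, -0.02, 1.0],
                        [0.0, 0.0, 0.6], [0.0, 0.0, -1.0])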

Harish, three systems I would recommend you evaluate are:
1. The Tobii Spectrum. It has a monkey-tracking mode, though I have no
idea how well it works. Performance appears solid and robust with
humans.
2. The SmartEye system. It can be specced with up to 8 cameras to cover
a large area in which gaze can be tracked.
3. The LC Technologies EyeFollower. Its cameras move (in both orientation
and focus), which gives them a very large headbox.

All the best,
Dr. Diederick Niehorster
Researcher and Research Engineer,
The Lund University Humanities Lab, and
Department of Psychology,
Lund University,
Sweden
http://www.humlab.lu.se/en/person/DiederickCNiehorster/

On Tue, Jul 17, 2018 at 1:45 PM, Gabriel Diaz <Gabriel.Diaz@rit.edu> wrote:
> Harish, I've spent some time thinking about this problem in the past, and it
> seems that we have some common ideas and some differences of opinion. Here
> are a few thoughts...
>
> 1. Is there a good eye tracking system that already has macaque face
> appearance templates built in?
>
</span><span class="m_8634035219602642226im m_8634035219602642226HOEnZb">> Not that I know of! ...but, yours is a good comment, because it suggests<br>
> that you're thinking of using face tracking to estimate head pose, which<br>
> must be combined with eye-in-head angles to recover the gaze vector within a<br>
> world based reference frame. That's the right way to approach the problem.<br>
> If you you aren't familiar with the technique, there are plenty of good<br>
> tutorials for head pose estimation online . For example, here is one. I<br>
> worry a bit that the hairy macaque face may not provide enough stable<br>
> features for tracking, but I'm not very experienced with the algorithms, so<br>
> don't let my hesitations hold you back.<br>
><br>
><br>
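> As a sketch of that head pose step: the usual recipe is to detect a few
> facial landmarks in the image and solve for head rotation and translation
> with a perspective-n-point solver such as OpenCV's solvePnP. The 3D model
> points below are the generic human-face values from the common tutorials,
> and the 2D pixel coordinates are placeholders; for a macaque you would
> substitute species-appropriate values from your own landmark tracker.
>
> import numpy as np
> import cv2
>
> # Generic 3D face model in a head-centered frame (arbitrary units).
> model_points = np.array([
>     (0.0, 0.0, 0.0),           # nose tip
>     (0.0, -330.0, -65.0),      # chin
>     (-225.0, 170.0, -135.0),   # left eye, left corner
>     (225.0, 170.0, -135.0),    # right eye, right corner
>     (-150.0, -150.0, -125.0),  # left mouth corner
>     (150.0, -150.0, -125.0),   # right mouth corner
> ])
>
> # Matching 2D landmarks detected in the camera image (pixels).
> image_points = np.array([
>     (359.0, 391.0), (399.0, 561.0), (337.0, 297.0),
>     (513.0, 301.0), (345.0, 465.0), (453.0, 469.0),
> ])
>
> # Approximate pinhole intrinsics; calibrate properly in practice.
> camera_matrix = np.array([[640.0, 0.0, 320.0],
>                           [0.0, 640.0, 240.0],
>                           [0.0, 0.0, 1.0]])
> dist_coeffs = np.zeros(5)  # assume an undistorted image
>
> ok, rvec, tvec = cv2.solvePnP(model_points, image_points,
>                               camera_matrix, dist_coeffs)
> R, _ = cv2.Rodrigues(rvec)  # 3x3 head rotation, camera coordinates
> # An eye-in-head gaze direction g then maps into camera/world
> # coordinates as R @ g, which is the combination step above.
>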
</span><span class="m_8634035219602642226im m_8634035219602642226HOEnZb">> 2. Are there any novel ways of placing the screen and tracker that result<br>
> in better eye-tracking? We have tried various ways of placing trackers below<br>
> the screen and at various distances from the animal.<br>
><br>
</span><span class="m_8634035219602642226im m_8634035219602642226HOEnZb">> You're going through the trouble of facilitating recording from free moving<br>
> macaques, and you still want to use a screen!?! Doesn't that defeat the<br>
> purpose of facilitating natural behavior? In any case, to answer this<br>
> question requires a lot more knowledge about what kind of eye tracker you<br>
> are trying to use. My guess is that you're using a remote tracker placed<br>
> near to the screen, and the degradation is due to the small size of the<br>
> pupil in the remote eye camera when the head is further away. That's pure<br>
> speculation.<br>
><br>
><br>
</span><span class="m_8634035219602642226im m_8634035219602642226HOEnZb">> 3. Are there multi-camera eye-tracker systems that we can set-up from<br>
> different view points so that one or more can always have a clear view of<br>
> the animal?<br>
><br>
</span><span class="m_8634035219602642226im m_8634035219602642226HOEnZb">> Not that I've seen. I have discussed building something like this before<br>
> with colleagues. That would be a feat of engineering and software<br>
> development that requires a very firm grasp of multiview geometry. That is<br>
> a multi-year project.<br>
><br>
><br>
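> To give a flavor of the multiview geometry involved, here is a minimal
> sketch of triangulating a 3D eye position seen by two calibrated,
> synchronized cameras, using OpenCV's triangulatePoints. The projection
> matrices and pixel coordinates are placeholders for what a real
> calibration and eye detector would provide:
>
> import numpy as np
> import cv2
>
> # 3x4 projection matrices P = K [R | t] from camera calibration.
> # Placeholders: camera 1 at the origin, camera 2 shifted 0.5 m
> # along x, both sharing the same intrinsics K.
> K = np.array([[800.0, 0.0, 320.0],
>               [0.0, 800.0, 240.0],
>               [0.0, 0.0, 1.0]])
> P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
> P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])
>
> # The same eye center detected in each image (pixels), shape 2xN.
> pts1 = np.array([[320.0], [240.0]])
> pts2 = np.array([[120.0], [240.0]])
>
> # Linear triangulation; returns homogeneous 4xN coordinates.
> X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)
> eye_xyz = (X_h[:3] / X_h[3]).ravel()  # ~[0, 0, 2] m in this example
>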
</span><span class="m_8634035219602642226im m_8634035219602642226HOEnZb">> 4. Do these systems have hardware input for behavioral event markers and<br>
> analog/digital outputs of eye-gaze data so that we can sync it with our<br>
</span><span class="m_8634035219602642226im m_8634035219602642226HOEnZb">> neural data acquisition?<br>
><br>
> N/A, because these systems don't yet exist.<br>
><br>
> Hope that was somewhat informative. Sorry if it is disappointing!
>
> ----------------------
>
> Gabriel J. Diaz, Ph.D.
> Assistant Professor
> Rochester Institute of Technology
> Chester F. Carlson Center for Imaging Science
>
> Founder of PerForM Labs
> Click for demos.
>
> Office 2108, Building #76
> Rochester, NY 14623
> Office: (585) 475-6215
> gabriel.diaz@rit.edu
>
> On Mon, Jul 16, 2018 at 1:27 PM, Stefan Dowiasch
> <stefan.dowiasch@physik.uni-marburg.de> wrote:
>>
</span><div class="m_8634035219602642226HOEnZb"><div class="m_8634035219602642226h5">>> Dear all,<br>
>><br>
>> I am Stefan, a visiting assistant professor at the Department of<br>
>> Neurophysics at the University of Marburg and Chief Scientific Officer at<br>
>> Thomas RECORDING GmbH, Germany.<br>
>> We faced the same problems in the past and are currently working on a<br>
>> solution, which allows freely moving primates to perform behavioral tasks in<br>
>> their home cages or special arenas in combination with eye tracking and<br>
>> single cell recordings.<br>
>> Recently we finished the first step, i.e. a training and experimental unit<br>
>> for freely moving primates, which is commercially available at Thomas<br>
>> RECORDING (please see:<br>
>> <a href="https://www.thomasrecording.com/products/neuroscience-products/primate-training-systems/incage-training-system-icts.html" rel="noreferrer" target="_blank">https://www.thomasrecording.co<wbr>m/products/neuroscience-produc<wbr>ts/primate-training-systems/<wbr>incage-training-system-icts.<wbr>html</a>).<br>
>> You can find a demo video of the system on Youtube:<br>
>> <a href="https://youtu.be/yDOZauNSwqs" rel="noreferrer" target="_blank">https://youtu.be/yDOZauNSwqs</a><br>
>> In short: the system consists of a ruggedized tablet computer, a flexible
>> cage-mountable holding device, and an integrated reward unit. Currently the
>> built-in front-facing camera can be used to monitor the animal and its
>> overall behavior. However, we are working on a software update to add
>> basic eye tracking features (fixation control,
>> saccade/antisaccade tasks, etc.) to the system.
>> Furthermore, a trigger interface for synchronization with chronic
>> recording devices (e.g., the wireless version of the AMEP system,
>> https://www.physiology.org/doi/abs/10.1152/jn.00504.2017) is in
>> development.
>>
>> Taken together, I think this system should meet most of your requirements
>> regarding eye tracking and single-unit recordings in freely moving primates.
>> At the moment, you can start training your animals with the system and
>> get them used to the new environment. In the near future, you will be able
>> to upgrade your existing device with a new software package, giving you the
>> ability to track the primate's eyes and synchronize your behavioral
>> and eye tracking data with your physiological recordings.
>>
>> If you have further questions or suggestions, please feel free to contact
>> me anytime.
>>
>> Best regards,
>>
>> Dr. Stefan Dowiasch
>>
>> On 14.07.2018 at 21:09, visionlist-request@visionscience.com wrote:
>>
>> Date: Sat, 14 Jul 2018 12:00:45 +0530
>> From: Harish Katti <harish2006@gmail.com>
>> To: visionlist@visionscience.com
>> Subject: [visionlist] About help on eye-tracking of head free
>> non-human primates
>> Message-ID:
>> <CAOei6hAoRnc=aApwyws4R2WiZ6EXd9K4q-JPSj+u+TwWHi9ALA@mail.gmail.com>
>> Content-Type: text/plain; charset="utf-8"
>>
>> Dear all,
>> I am Harish, a post-doctoral fellow in Dr. SP Arun's experimental
>> vision group at the Centre for Neuroscience, Indian Institute of Science.
>> I'm posting this to get feedback from researchers who have tried automated
>> eye-gaze/head-pose/body-pose tracking of freely moving non-human primates.
>>
>> In our lab we are trying to set up eye tracking in monkeys without any
>> head restraints. Our plan is to have a behavioural arena where the
>> animal is not head-fixed and can come up to a touch screen and perform
>> simple tasks in return for juice rewards. Since the animals are not
>> head-fixed, the eye tracking needs to be done in a manner that can
>> handle changes in body and head pose. We have been evaluating a few
>> commercial eye-tracking systems but find that the trackers have
>> difficulty finding the face/eyes. It would be nice to have your input
>> on the following issues:
>>
>> 1. Is there a good eye tracking system that already has macaque face
>> appearance templates built in?
>>
>> 2. Are there any novel ways of placing the screen and tracker that
>> result in better eye-tracking? We have tried various ways of placing
>> trackers below the screen and at various distances from the animal.
>>
>> 3. Are there multi-camera eye-tracker systems that we can set up from
>> different viewpoints so that one or more can always have a clear view
>> of the animal?
>>
>> 4. Do these systems have hardware input for behavioral event markers and
>> analog/digital outputs of eye-gaze data so that we can sync it with our
>> neural data acquisition?
>>
>> best,
>> Harish
>>
>> _______________________________________________
>> visionlist mailing list
>> visionlist@visionscience.com
>> http://visionscience.com/mailman/listinfo/visionlist_visionscience.com