[visionlist] About help on eye-tracking of head free non-human primates

Rakshit Kothari (RIT Student) rsk3900 at rit.edu
Sat Jul 14 13:59:37 -05 2018


Hi Harish,

I could offer my thoughts since I am working on head-free portable eye
tracking. (Note: I have never worked with monkeys, so I do not know whether
they are opposed to bodily attachments such as wristbands, headbands, etc.)

A potential (and cheapest, IMO) solution would be to retrofit a cricket
helmet with a ratchet add-on and place the IR-based cameras on the inside
of the helmet. This would ensure that the monkey can't scratch the
eye-tracking cameras and that the helmet remains snug. You could fit an IMU
on the inside of the helmet - something like this:

[image: image.png]
Please ignore the stereo cameras; this is a picture from my presentation
(no self-promotion intended).

Now, Pupil Labs has excellent open-source software that provides robust
pupil detection. You could either use their cameras or build your own:
https://pupil-labs.com/store/
The best part about the Pupil hardware is that the design is very easy to
work with or take apart, so the eye and scene cameras can be used
separately. (Note: not promoting Pupil Labs, this is simply a suggestion
for your project.) You can easily detach the scene and eye cameras, mount
them inside the cricket helmet, and their software will do the rest.
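
If it helps, Pupil Capture exposes its data over a ZMQ/msgpack network
interface. Here is a minimal sketch of how you could read gaze datums from
it in Python (port 50020 and the "gaze." topic are the defaults as of the
versions I've used - treat the details as assumptions to verify against
their docs):

    import zmq
    import msgpack

    # Connect to Pupil Remote, Pupil Capture's command socket.
    ctx = zmq.Context()
    pupil_remote = ctx.socket(zmq.REQ)
    pupil_remote.connect("tcp://127.0.0.1:50020")

    # Ask Pupil Remote where the data subscription socket lives.
    pupil_remote.send_string("SUB_PORT")
    sub_port = pupil_remote.recv_string()

    # Subscribe to all gaze topics.
    subscriber = ctx.socket(zmq.SUB)
    subscriber.connect("tcp://127.0.0.1:" + sub_port)
    subscriber.setsockopt_string(zmq.SUBSCRIBE, "gaze.")

    while True:
        topic, payload = subscriber.recv_multipart()
        gaze = msgpack.unpackb(payload, raw=False)
        # norm_pos is the point of regard in normalized scene-camera
        # coordinates (0..1); confidence is the tracker's 0..1 quality.
        print(gaze["norm_pos"], gaze["confidence"])

Each gaze datum also carries a timestamp, which you could log against your
behavioural event markers for syncing.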

By combining the head pose from the IMU with gaze from the eye tracker, you
could get a Gaze-in-World (GIW) vector. However, it is in the nature of
IMUs to drift, so the GIW vector would require occasional correction (every
10 mins or so) - which brings me to my last point.
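
The combination itself is just rotating the head-frame gaze vector by the
IMU's orientation. A minimal sketch, assuming the IMU reports a
head-to-world quaternion and the eye tracker gives a unit gaze direction in
the head frame (scipy is one of several ways to do this):

    import numpy as np
    from scipy.spatial.transform import Rotation

    def gaze_in_world(head_quat_wxyz, gaze_dir_head):
        # head_quat_wxyz: IMU orientation as (w, x, y, z), mapping the
        #                 head frame into the world frame.
        # gaze_dir_head:  unit gaze vector in the head frame.
        w, x, y, z = head_quat_wxyz
        r = Rotation.from_quat([x, y, z, w])  # scipy wants (x, y, z, w)
        return r.apply(np.asarray(gaze_dir_head, dtype=float))

With the identity quaternion (1, 0, 0, 0) the vector comes back unchanged,
which is a handy sanity check before you worry about drift.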

I'm assuming you'd want accurate gaze tracking on the screen (in pixel
coordinates). You could easily display fiducial markers at the corners of
the screens and find the mapping between Point of Regard values on the
scene camera and screen coordinates. This mapping (a 3x3 homography) can
also be used to estimate head position in 3D space relative to the screen
location. Every time a monkey arrives near a screen, a program can identify
these markers, estimate head position in 3D space, and automatically
realign the IMU - this last part can be a little difficult to implement
though! It took me ages to work with quaternions and poses.
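
OpenCV's ArUco module does most of the marker work for you. A rough sketch
of the scene-to-screen mapping, assuming one marker per screen corner and a
hypothetical 1920x1080 screen (the marker IDs and dictionary are made up -
use whatever you actually print):

    import cv2
    import numpy as np

    # Hypothetical IDs of the markers at the screen corners, in order:
    # top-left, top-right, bottom-right, bottom-left.
    MARKER_IDS = [0, 1, 2, 3]
    SCREEN_CORNERS = np.float32([[0, 0], [1919, 0],
                                 [1919, 1079], [0, 1079]])
    ARUCO_DICT = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

    def screen_homography(scene_frame):
        # Detect the corner markers in the scene-camera image and return
        # the 3x3 homography from scene pixels to screen pixels.
        gray = cv2.cvtColor(scene_frame, cv2.COLOR_BGR2GRAY)
        corners, ids, _ = cv2.aruco.detectMarkers(gray, ARUCO_DICT)
        if ids is None:
            return None
        centers = {int(i): c[0].mean(axis=0)
                   for i, c in zip(ids.flatten(), corners)}
        if not all(i in centers for i in MARKER_IDS):
            return None  # not all four corner markers visible
        src = np.float32([centers[i] for i in MARKER_IDS])
        H, _ = cv2.findHomography(src, SCREEN_CORNERS)
        return H

    def gaze_to_screen(H, gaze_px):
        # Map one gaze point from scene-camera pixels to screen pixels.
        pt = np.float32([[gaze_px]])  # shape (1, 1, 2)
        return cv2.perspectiveTransform(pt, H)[0, 0]

For the 3D head-position part you would move from the homography to
cv2.aruco.estimatePoseSingleMarkers with your scene-camera intrinsics, and
that marker pose is what you'd use to realign the IMU.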

If you want body tracking as well, then a 2nd IMU, placed a little below
the nape, would be a good idea.

If you have the ₹ for it, you could also fit the monkeys in a tracking suit
and use a motion capture system such as PhaseSpace (http://phasespace.com/)
- we have one in our lab and it works great!

I hope this helps.

Rakshit

P.S. - You could also build your own MoCap system, which would be a fun
project for an engineering graduate - use patterns of IR emitters and IR
cameras around the room to triangulate the 3D position of a marker! (A
quick sketch of the triangulation step is below.)
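
The triangulation step is standard once the cameras are calibrated;
something along these lines, where the projection matrices come from your
own calibration:

    import cv2
    import numpy as np

    def triangulate_marker(P1, P2, pt1, pt2):
        # P1, P2: 3x4 camera projection matrices from calibration.
        # pt1, pt2: (x, y) pixel positions of the same IR marker as
        #           seen by each camera.
        pts1 = np.float32(pt1).reshape(2, 1)
        pts2 = np.float32(pt2).reshape(2, 1)
        X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)  # 4x1 homogeneous
        return (X_h[:3] / X_h[3]).ravel()  # (x, y, z) in world units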
P.P.S. - There are many open-source implementations of skeleton fitting
using a 3D camera, but I don't know if they can be adapted for monkeys.

On Sat, Jul 14, 2018 at 12:14 PM Harish Katti <harish2006 at gmail.com> wrote:

> Dear all
>       I am Harish, a post-doctoral fellow in Dr SP Arun's experimental
> vision group at the Centre for Neuroscience, Indian Institute of Science.
> I'm posting this to get feedback from researchers who have tried
> automated eye-gaze/head-pose/body-pose tracking of freely moving non-human
> primates.
>
> In our lab we are trying to set up eye tracking in monkeys without any
> head restraints. Our plan is to have a behavioural arena where the
> animal is not head-fixed and can come up to a touch screen and perform
> simple tasks in return for juice rewards. Since the animals are not
> head-fixed, the eye-tracking needs to be done in a manner that can
> handle change in body and head pose. We have been evaluating a few
> commercial eye-tracking systems but find that the trackers have
> difficulty in finding the face/eyes. It will be nice to have your inputs
> on the following issues,
>
> 1. Is there a good eye tracking system that already has macaque face
> appearance templates built in?
>
> 2. Are there any novel ways of placing the screen and tracker that
> result in better eye-tracking? We have tried various ways of placing
> trackers below the screen and at various distances from the animal.
>
> 3. Are there multi-camera eye-tracker systems that we can set-up from
> different view points so that one or more can always have a clear view
> of the animal?
>
> 4. Do these systems have hardware inputs for behavioral event markers and
> analog/digital outputs of eye-gaze data, so that we can sync them with our
> neural data acquisition?
>
> best,
> Harish
> _______________________________________________
> visionlist mailing list
> visionlist at visionscience.com
> http://visionscience.com/mailman/listinfo/visionlist_visionscience.com
>


-- 

Rakshit Kothari

Research & Teaching Assistant

Perception for Action and Motion lab (PerForM)

Center for Imaging Science

Rochester Institute of Technology