<div dir="ltr"><span style="color:rgb(0,0,0);font-family:Calibri,Arial,Helvetica,sans-serif;font-size:16px"><p dir="ltr" style="margin-top:0pt;margin-bottom:0pt;line-height:1.38;text-align:center"><span style="font-size:11pt;font-family:Arial;font-weight:700"><span class="gmail-il">Call</span> for papers</span></p><p dir="ltr" style="margin-top:0pt;margin-bottom:0pt;line-height:1.38;text-align:center"><span style="font-size:14pt;font-family:Arial;font-weight:700">ActivEye Workshop 2021</span></p><p dir="ltr" style="margin-top:0pt;margin-bottom:0pt;line-height:1.38;text-align:center"><span style="font-size:14pt;font-family:Arial;font-weight:700">Challenges in large scale eye tracking for active participants </span></p><br><p dir="ltr" style="margin-top:0pt;margin-bottom:0pt;line-height:1.38;text-align:justify"><span style="font-size:10pt;font-family:Arial">Dear all,</span></p><br><p dir="ltr" style="margin-top:0pt;margin-bottom:0pt;line-height:1.38;text-align:justify"><span style="font-size:10pt;font-family:Arial">We invite you to submit your work to the </span><span style="font-size:10pt;font-family:Arial;font-weight:700">ActivEye Workshop 2021</span><span style="font-size:10pt;font-family:Arial">. As part of this year’s ACM ETRA 2021 conference (Eye Tracking Research and Application), we’re inviting two-page submissions in one of the areas below in order to bring together the vision, engineering, human factors, and computer science communities to share our solutions that address the following challenges in eye tracking:</span></p><p dir="ltr" style="margin-top:0pt;margin-bottom:0pt;line-height:1.38;text-align:justify"><span style="font-size:10pt;font-family:Arial"> </span></p><p dir="ltr" style="margin-top:0pt;margin-bottom:4pt;line-height:1.38;text-align:justify"><span style="font-size:10pt;font-family:Arial;font-weight:700">    Robust Gaze tracking in a challenging environment:</span><span style="font-size:10pt;font-family:Arial"> </span></p><p dir="ltr" style="margin-top:0pt;margin-bottom:4pt;line-height:1.38;text-align:justify"><span style="font-size:10pt;font-family:Arial">Topics include algorithms for 2D and 3D gaze tracking of active participants outside of the lab, slippage detection and correction, run-time and post-hoc re-calibration, and validation techniques. </span></p><p dir="ltr" style="margin-top:0pt;margin-bottom:4pt;line-height:1.38;text-align:justify"><span style="font-size:10pt;font-family:Arial;font-weight:700">    Head pose tracking:</span></p><p dir="ltr" style="margin-top:0pt;margin-bottom:4pt;line-height:1.38;text-align:justify"><span style="font-size:10pt;font-family:Arial">We invite studies that take into account the synergy between the eye and head movements,  especially encourage contributions to onboard head pose tracking systems, post-hoc head tracking techniques, and head + eye movement classification algorithms in order to better understand the underlying oculomotor mechanisms.</span></p><p dir="ltr" style="margin-top:0pt;margin-bottom:4pt;line-height:1.38;text-align:justify"><span style="font-size:10pt;font-family:Arial;font-weight:700">    World Camera Characteristics:</span></p><p dir="ltr" style="margin-top:0pt;margin-bottom:4pt;line-height:1.38;text-align:justify"><span style="font-size:10pt;font-family:Arial">Typically, portable eye-tracking devices have a limited field of view, poor optics, and low-quality world video. 
We encourage studies that address issues of image quality, dynamic range, wide-FOV gaze tracking, color consistency, frame rate, spatial resolution, quality vs. size trade-offs, and modern depth-sensing techniques.

Extending use cases to special populations:
The day-to-day challenges of mobile eye tracking are often exacerbated under certain experimental conditions and with specific participant populations. These include children and older adults, who may have additional ergonomic and physiological constraints, as well as patient populations (e.g., those with strabismus or visual field damage).

Gaze tracking data annotation:
Given the growing demands of state-of-the-art deep learning techniques, a key feature of a usable dataset is that it is accurately and reliably annotated. We welcome submissions on efficient annotation of different aspects of such datasets, including but not limited to types of eye + head movements, external events, eye regions in real or synthetic images, and scene objects.

Best practices and DIY:
We encourage submissions on best practices for running large-scale data collection, such as comparisons of different calibration routines, efforts to enhance participant comfort, experimental procedures, UI design for system error tolerance, error handling, and notifications.

Device ergonomics:
Participant discomfort with most head-mounted trackers is a key obstacle to collecting large-scale and outdoor data, particularly over extended time periods.
We welcome submissions in which researchers from academia and industry share their experience and potential solutions.

Workshop website:
https://sites.google.com/view/activeye

ETRA 2021 website:
https://etra.acm.org/2021/

Workshop email address:
activeye_etra2021@googlegroups.com

Submission Deadline:
March 8, 2021

Workshop Organizers:
Kamran Binaee (University of Nevada, Reno) (kbinaee@unr.edu)
Natela Shanidze (The Smith-Kettlewell Eye Research Institute) (natela@ski.org)
Agostino Gibaldi (University of California, Berkeley) (agostino.gibaldi@berkeley.edu)
Caroline Robertson (Dartmouth College) (caroline.e.robertson@dartmouth.edu)

Advisory Board:
Paul MacNeilage (University of Nevada, Reno) (pmacneilage@unr.edu)
Mark Lescroart (University of Nevada, Reno) (mlescroart@unr.edu)