Dear Colleagues,

The Perception and Activity Understanding group (Jean-Marc Odobez, http://www.idiap.ch/~odobez/) seeks one highly motivated PhD candidate to work within the AI4Autism project, which aims to improve the digital phenotyping of children with Autism Spectrum Disorders (ASD). The PhD candidate will work on the multimodal perception of small children engaged in free-play activities as well as in social interactions with adults. In particular, the candidate will investigate deep learning methods and models for the recognition of gestures and visual attention events, including the modeling of their coordination, from visual data and IoT sensors. Experiments will be conducted on project data (e.g., recordings from the standard ADOS evaluation protocol of more than 300 toddlers, with partial behavior annotations) as well as on standard datasets from the computer vision and multimodal communities (gesture recognition, attention).

The ideal PhD candidate holds an MS degree in computer science, engineering, physics, or applied mathematics, with a good background in statistics, linear algebra, signal processing, machine learning, and programming. Experience in computer vision and deep learning is a definite plus. The successful applicant will have strong analytical skills and good written and oral communication skills.

The position is for four years, subject to successful progress, and should lead to a dissertation. The selected candidate will become a doctoral student at EPFL, subject to acceptance by the EPFL Doctoral School (http://phd.epfl.ch/applicants). The starting date is ideally 1 November 2021. The salary follows EPFL standards (CHF 52,400 gross annual salary in the first year).

Interested candidates should submit a cover letter, a detailed CV, and the names of three references (or recommendation letters) through the Idiap online recruitment system: https://www.idiap.ch/en/join-us/job-opportunities.

Interviews will begin upon receipt of applications and continue until the position is filled.

---
About the AI4Autism project and the PhD position.
AI4Autism is a Sinergia project funded by the SNSF, involving the University of Geneva (Marie Schaer, https://www.unige.ch/ProjetAutismeOMP/fr/membres/schaer/, and Thomas Maillart, https://www.unige.ch/gsem/en/research/faculty/all/thomas-maillart/), the Human Behavioral Analysis research unit at the DTI department of the University of Applied Sciences and Arts of Southern Switzerland (SUPSI) (Michela Papandrea, https://www.supsi.ch/home_en/strumenti/rubrica/dettaglio.5674.backLink.79718207-24d5-44ab-8cec-be3898389703.html), and the Perception and Activity Understanding group (Jean-Marc Odobez, http://www.idiap.ch/~odobez/) at the Idiap Research Institute (http://www.idiap.ch).
Project description: Today, 1 in 59 children is diagnosed with autism spectrum disorders (ASD), making this condition one of the most prevalent neurodevelopmental disorders. The AI4Autism project is grounded in the recognition that early, large-scale diagnosis of autism in young children requires tools for digital phenotyping and automated screening based on computer vision and Internet of Things (IoT) sensing. At the same time, the project examines the potential of digital sensing to provide automated measures of the extended and more fine-grained autism phenotypes. To address these two questions, it combines the skills of experts in clinical research, engineering, and computational social sciences to develop precise and scalable approaches for autism screening and profiling, along the following three research directions, which are critical to move beyond the state of the art. (1) Clinical research in autism: we propose a comprehensive and reproducible research approach designed to deliver groundbreaking digital tools for the screening and automated profiling of the autism phenotype. It relies on the investigation of both a structured, well-established protocol and a less structured one (free play), which may scale better. (2) Internet of Things (IoT): building on the hypothesis that some ASD phenotypes might be related to the motor skills of very young children, we will explore the use of IoT sensors for ASD diagnosis. (3) Computational perception and machine learning:
the project will be rooted in modern AI, investigating novel machine learning and computer vision techniques that leverage the availability of large amounts of behavioral and clinical annotation data to build novel behavioral-cue extraction models that work in challenging sensing conditions.

PhD position: The PhD student will join a team at Idiap, consisting of one other PhD student and a postdoc working on the project. Together with them, the candidate will study methods and models (domain adaptation, unsupervised or weakly supervised learning, temporal graph neural networks, attention-based neural networks and transformers) for analyzing the motor (gesture recognition) and gaze coordination patterns that are at the core of ASD, relying on computer vision and IoT sensors, and will investigate multimodal deep-learning techniques for ASD diagnosis and profiling, with a focus on interpretable models.
--
Jean-Marc Odobez,
IDIAP Senior Researcher, Head of the Perception and Activity Understanding group
EPFL Senior Researcher (EPFL MER)
IDIAP Research Institute (http://www.idiap.ch)
Tel: +41 (0)27 721 77 26
Web: http://www.idiap.ch/~odobez