<html><head><meta http-equiv="Content-Type" content="text/html; charset=us-ascii"></head><body style="word-wrap: break-word; -webkit-nbsp-mode: space; line-break: after-white-space;" class=""><span id="docs-internal-guid-b8921a1d-7fff-3c00-8744-2e9fb2fc266b" class=""><div style="line-height: 1.38; margin-top: 0pt; margin-bottom: 0pt;" class=""><span style="font-family: Arial; font-size: 11pt; white-space: pre-wrap;" class="">Dear Colleagues,</span></div><br class=""><div style="line-height: 1.38; margin-top: 0pt; margin-bottom: 0pt; text-align: justify;" class=""><span style="font-size: 11pt; font-family: Arial; font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;" class="">The Perception and Activity Understanding group (Jean-Marc Odobez,</span><a href="http://www.idiap.ch/%7Eodobez/" style="text-decoration:none;" class=""><span style="font-size: 11pt; font-family: Arial; color: rgb(0, 0, 0); font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;" class=""> </span><span style="font-size: 11pt; font-family: Arial; color: rgb(17, 85, 204); font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; text-decoration: underline; -webkit-text-decoration-skip: none; vertical-align: baseline; white-space: pre-wrap;" class="">http://www.idiap.ch/~odobez/</span></a><span style="font-size: 11pt; font-family: Arial; font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;" class="">), along with the Robot Learning and Interaction group (Sylvain Calinon,</span><a href="http://calinon.ch" style="text-decoration:none;" class=""><span style="font-size: 11pt; font-family: Arial; color: rgb(0, 0, 0); font-variant-ligatures: normal; font-variant-east-asian: normal; 
font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;" class=""> </span><span style="font-size: 11pt; font-family: Arial; color: rgb(17, 85, 204); font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; text-decoration: underline; -webkit-text-decoration-skip: none; vertical-align: baseline; white-space: pre-wrap;" class="">http://calinon.ch</span></a><span style="font-size: 11pt; font-family: Arial; font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;" class="">) at Idiap (</span><a href="http://www.idiap.ch" style="text-decoration:none;" class=""><span style="font-size: 11pt; font-family: Arial; color: rgb(17, 85, 204); font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; text-decoration: underline; -webkit-text-decoration-skip: none; vertical-align: baseline; white-space: pre-wrap;" class="">www.idiap.ch</span></a><span style="font-size: 11pt; font-family: Arial; font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;" class="">), seeks a motivated postdoctoral fellow to work on interaction strategies and deep learning for object manipulation in cluttered scenes.</span></div><br class=""><div style="line-height: 1.38; margin-top: 0pt; margin-bottom: 0pt; text-align: justify;" class=""><span style="font-size: 11pt; font-family: Arial; font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;" class="">The position is part of a European CHIST-ERA project involving the University of Lincoln (UK), INRIA (France), TU Wien (Austria), and IIT (Italy). 
The project aims to benchmark object recognition, manipulation, and human-robot interaction for sorting a complex unstructured heap of unknown and irregular objects. The Idiap Research Institute will focus on pushing or pulling tasks and on combining different interaction strategies (self-learning, learning from demonstration, learning by correction), including the use of deep learning approaches to model multimodal action-perception interactions (proprioceptive vs. visual modalities).</span></div><br class=""><div style="line-height: 1.38; margin-top: 0pt; margin-bottom: 0pt; text-align: justify;" class=""><span style="font-size: 11pt; font-family: Arial; font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;" class="">The ideal postdoctoral candidate should hold a PhD degree in computer science, engineering, physics, or applied mathematics. S/he should have an excellent background in statistics, linear algebra, signal processing, programming, and machine learning. The successful applicant will have good analytical skills, strong written and oral communication skills, and the ability to work and collaborate with international project teams. The recruited person will complement and work with a number of other team members of the two research groups working on related topics (e.g. in the SNSF-funded ROSALIS project). 
</span></div><br class=""><div style="line-height: 1.38; margin-top: 0pt; margin-bottom: 0pt; text-align: justify;" class=""><span style="font-size: 11pt; font-family: Arial; font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;" class="">Research during the Phd or/and prior substantial experience in at least two of the three following domains is imperative, along with the willingness to learn in the third one:</span></div><div style="line-height: 1.38; margin-top: 0pt; margin-bottom: 0pt; text-align: justify;" class=""><span style="font-size: 11pt; font-family: Arial; font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;" class="">- robotics (related to gesture and/or manipulation), </span></div><div style="line-height: 1.38; margin-top: 0pt; margin-bottom: 0pt; text-align: justify;" class=""><span style="font-size: 11pt; font-family: Arial; font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;" class="">- machine learning, </span></div><div style="line-height: 1.38; margin-top: 0pt; margin-bottom: 0pt; text-align: justify;" class=""><span style="font-size: 11pt; font-family: Arial; font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;" class="">- computer vision.</span></div><br class=""><div style="line-height: 1.38; margin-top: 0pt; margin-bottom: 0pt; text-align: justify;" class=""><span style="font-size: 11pt; font-family: Arial; font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;" class="">More specific expertise, experience or technical skills sought include:</span></div><div style="line-height: 1.38; 
margin-top: 0pt; margin-bottom: 0pt; text-align: justify;" class=""><span style="font-size: 11pt; font-family: Arial; font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;" class="">- robot manipulation (gesture learning and synthesis, grasping) </span></div><div style="line-height: 1.38; margin-top: 0pt; margin-bottom: 0pt; text-align: justify;" class=""><span style="font-size: 11pt; font-family: Arial; font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;" class="">- visual servoing </span></div><div style="line-height: 1.38; margin-top: 0pt; margin-bottom: 0pt; text-align: justify;" class=""><span style="font-size: 11pt; font-family: Arial; font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;" class="">- active perception</span></div><div style="line-height: 1.38; margin-top: 0pt; margin-bottom: 0pt; text-align: justify;" class=""><span style="font-size: 11pt; font-family: Arial; font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;" class="">- deep learning</span></div><div style="line-height: 1.38; margin-top: 0pt; margin-bottom: 0pt; text-align: justify;" class=""><span style="font-size: 11pt; font-family: Arial; font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;" class="">- visual processing</span></div><div style="line-height: 1.38; margin-top: 0pt; margin-bottom: 0pt; text-align: justify;" class=""><span style="font-size: 11pt; font-family: Arial; font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;" class="">- 
software: Python and/or C++, ROS</span></div><br class=""><div style="line-height: 1.38; margin-top: 0pt; margin-bottom: 0pt; text-align: justify;" class=""><span style="font-size: 11pt; font-family: Arial; font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;" class="">The position is for one year, with the possibility of renewal until the end of the project. The annual gross salary is 80,000 CHF.</span></div><br class=""><div style="line-height: 1.38; margin-top: 0pt; margin-bottom: 0pt; text-align: justify;" class=""><span style="font-size: 11pt; font-family: Arial; font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;" class="">The project will start in March 2019, but the position can start earlier or later, depending on the candidate's profile.</span></div><br class=""><div style="line-height: 1.38; margin-top: 0pt; margin-bottom: 0pt;" class=""><span style="font-size: 11pt; font-family: Arial; font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;" class="">Interested candidates should submit a cover letter, a detailed CV, and the names of three references (or recommendation letters) through the Idiap online recruitment system:</span></div><div style="line-height: 1.38; margin-top: 0pt; margin-bottom: 0pt;" class=""><a href="http://www.idiap.ch/education-and-jobs/job-10256" style="text-decoration:none;" class=""><span style="font-size: 9pt; font-family: Arial; color: rgb(17, 85, 204); font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; text-decoration: underline; -webkit-text-decoration-skip: none; vertical-align: baseline; white-space: pre-wrap;" class="">http://www.idiap.ch/education-and-jobs/job-10256</span></a></div><br class=""><div 
style="line-height: 1.38; margin-top: 0pt; margin-bottom: 0pt;" class=""><span style="font-size: 11pt; font-family: Arial; font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;" class="">Interviews will start upon reception of applications until the position is filled.</span></div><div class=""><span style="font-size: 11pt; font-family: Arial; font-variant-ligatures: normal; font-variant-east-asian: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;" class=""><br class=""></span></div></span><div class="">
<div style="word-wrap: break-word; -webkit-nbsp-mode: space; line-break: after-white-space;" class=""><div style="color: rgb(0, 0, 0); font-family: Helvetica; font-size: 12px; font-style: normal; font-variant-caps: normal; font-weight: normal; letter-spacing: normal; text-align: start; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px; -webkit-text-stroke-width: 0px;">-- <br class="">Jean-Marc Odobez, <br class="">IDIAP Senior Researcher, Head of the Perception and Activity Understanding group</div><div style="color: rgb(0, 0, 0); font-family: Helvetica; font-size: 12px; font-style: normal; font-variant-caps: normal; font-weight: normal; letter-spacing: normal; text-align: start; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px; -webkit-text-stroke-width: 0px;">EPFL Senior Researcher (EPFL MER)<br class="">IDIAP Research Institute (<a href="http://www.idiap.ch" class="">http://www.idiap.ch</a>)<br class="">Tel: +41 (0)27 721 77 26<br class="">Web: <a href="http://www.idiap.ch/~odobez" class="">http://www.idiap.ch/~odobez</a></div><div style="color: rgb(0, 0, 0); font-family: Helvetica; font-size: 12px; font-style: normal; font-variant-caps: normal; font-weight: normal; letter-spacing: normal; text-align: start; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px; -webkit-text-stroke-width: 0px;" class=""><br class=""></div><br class="Apple-interchange-newline"></div><br class="Apple-interchange-newline">
</div>
<br class=""></body></html>