<div dir="ltr"><div dir="ltr"><div class="gmail_default" style="font-family:verdana,sans-serif;color:rgb(7,55,99)"><p dir="ltr" style="color:rgb(0,0,0);line-height:1.656;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;background-color:transparent;font-variant-ligatures:normal;font-variant-east-asian:normal;vertical-align:baseline;white-space:pre-wrap">Applications are invited for a fully funded PhD position at the ETS, Montreal, Canada. ETS is the fastest-growing and largest engineering school in Quebec, with an expanding team of highly qualified young researchers in image analysis, computer vision and deep learning, some of the priority areas of the school.</span></p><br style="color:rgb(0,0,0)"><p dir="ltr" style="color:rgb(0,0,0);line-height:1.656;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;background-color:transparent;font-variant-ligatures:normal;font-variant-east-asian:normal;vertical-align:baseline;white-space:pre-wrap">The position is available after the candidate passes ETS application requirements and the candidate will start at her/his convenience (latest at summer 2021). Financial support is available for 4 years. This project will explore learning strategies forĀ trainingĀ under a sequential stream of training data, to alleviate the potential problem of catastrophic forgetting in deep neural networks.</span><span style="font-family:Arial;font-size:12pt;white-space:pre-wrap;background-color:transparent"> The main application domain will be in computer vision. The successful candidate will work under the supervision of Prof. Jose Dolz and will potentially collaborate with other Professors and research groups. Furthermore, the selected student is expected to publish her/his research on top computer vision </span><span style="font-family:Arial;font-size:12pt;white-space:pre-wrap;background-color:transparent">journals (IJCV) and conferences (CVPR, ECCV, ICCV, etc).</span></p><br style="color:rgb(0,0,0)"><p dir="ltr" style="color:rgb(0,0,0);line-height:1.656;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;background-color:transparent;font-variant-ligatures:normal;font-variant-east-asian:normal;vertical-align:baseline;white-space:pre-wrap">Prospective applicants should have:</span></p><ul style="color:rgb(0,0,0);margin-top:0px;margin-bottom:0px"><li dir="ltr" style="margin-left:11pt;list-style-type:disc;font-size:12pt;font-family:Arial;background-color:transparent;font-variant-ligatures:normal;font-variant-east-asian:normal;vertical-align:baseline;white-space:pre-wrap"><p dir="ltr" style="line-height:1.656;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;background-color:transparent;font-variant-ligatures:normal;font-variant-east-asian:normal;vertical-align:baseline">Strong academic record with an excellent M.Sc. 
degree in computer science, applied mathematics, or electrical/biomedical engineering, preferably with expertise in more than one of the following areas: medical image analysis, machine learning, computer vision, pattern recognition, semi/weakly supervised learning and/or optimization;</span></p></li><li dir="ltr" style="margin-left:11pt;list-style-type:disc;font-size:12pt;font-family:Arial;background-color:transparent;font-variant-ligatures:normal;font-variant-east-asian:normal;vertical-align:baseline;white-space:pre-wrap"><p dir="ltr" style="line-height:1.656;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;background-color:transparent;font-variant-ligatures:normal;font-variant-east-asian:normal;vertical-align:baseline">Experience with a deep learning framework (preferrably PyTorch, or Tensorflow).</span></p></li><li dir="ltr" style="margin-left:11pt;list-style-type:disc;font-size:12pt;font-family:Verdana;color:rgb(7,55,99);background-color:transparent;font-variant-ligatures:normal;font-variant-east-asian:normal;vertical-align:baseline;white-space:pre-wrap"><p dir="ltr" style="line-height:1.656;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;color:rgb(38,50,56);background-color:transparent;font-variant-ligatures:normal;font-variant-east-asian:normal;vertical-align:baseline">Knowledge of weakly or semi-supervised learning strategies is an asset</span><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-variant-ligatures:normal;font-variant-east-asian:normal;vertical-align:baseline">.</span></p></li><li dir="ltr" style="margin-left:11pt;list-style-type:disc;font-size:12pt;font-family:Arial;background-color:transparent;font-variant-ligatures:normal;font-variant-east-asian:normal;vertical-align:baseline;white-space:pre-wrap"><p dir="ltr" style="line-height:1.656;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;background-color:transparent;font-variant-ligatures:normal;font-variant-east-asian:normal;vertical-align:baseline">Publications in a peer-reviewed journal or conference in a related topic are a bonus.</span></p></li></ul><br style="color:rgb(0,0,0)"><span style="color:rgb(0,0,0);font-family:Arial;font-size:12pt;white-space:pre-wrap;background-color:transparent;text-align:justify">For consideration, please send a CV, a cover letter, names and contact details of two references, transcripts for graduate studies, and a link to a M.Sc. thesis (as well as relevant publications if any) to:</span><br style="color:rgb(0,0,0)"><br style="color:rgb(0,0,0)"><p dir="ltr" style="color:rgb(0,0,0);line-height:1.656;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;color:rgb(17,85,204);background-color:transparent;font-variant-ligatures:normal;font-variant-east-asian:normal;vertical-align:baseline;white-space:pre-wrap"><a href="mailto:jose.dolz@etsmtl.ca" target="_blank">jose.dolz@etsmtl.ca</a></span></p></div><div><br></div>-- <br><div dir="ltr" class="gmail_signature"><span style="color:rgb(0,0,153)"><b>Jose Dolz</b></span><br></div></div></div>