<html>
  <head>
    <meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
  </head>
  <body style="word-wrap: break-word; -webkit-nbsp-mode: space;
    line-break: after-white-space;" class="" text="#000000"
    bgcolor="#FFFFFF">
    <div style="margin: 0px; font-stretch: normal; font-size: 14.7px;
      line-height: normal; font-family: Arial;
      -webkit-text-stroke-width: initial; -webkit-text-stroke-color:
      rgb(0, 0, 0);" class=""><span style="font-kerning: none" class="">***
        Please accept our apologies if you receive multiple copies of
        this CfP ***</span></div>
    <b style="font-weight:normal;"
      id="docs-internal-guid-2d5f23c3-7fff-0c0e-4ce7-2a6543e915b3"><br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">We invite you to participate in the open </span><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">competitions</span><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;"> at The 15th IEEE International Conference on Automatic Face and Gesture Recognition (</span><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">FG 2020</span><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">) that will be held in Buenos Aires, Argentina, 18-22 May 2020. </span></p>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Accepted Competitions:</span></p>
      <ul style="margin-top:0;margin-bottom:0;">
        <li dir="ltr" style="list-style-type:disc;font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;" role="presentation"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">2020 ChaLearn LAP Workshop FG: Identity-preserving human detection (IPHD)</span></p></li>
        <li dir="ltr" style="list-style-type:disc;font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;" role="presentation"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">EmoPain Face and Movement Behaviour Challenge</span></p></li>
        <li dir="ltr" style="list-style-type:disc;font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;" role="presentation"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">4th Recognizing Families In the Wild (RFIW)</span></p></li>
        <li dir="ltr" style="list-style-type:disc;font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;" role="presentation"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Joint Challenge on Compound Emotion Recognition and Multimodal (Audio, Facial and Gesture) based Emotion Recognition (CER\&MMER)</span></p></li>
        <li dir="ltr" style="list-style-type:disc;font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;" role="presentation"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Affective Behavior Analysis in-the-wild</span></p></li>
      </ul>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Check for more information at </span><a
          href="https://fg2020.org/competitions/"
          style="text-decoration:none;" moz-do-not-send="true"><span style="font-size:11pt;font-family:Arial;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">https://fg2020.org/competitions/</span></a></p>
      <br>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">** 2020 ChaLearn LAP Workshop FG: Identity-preserving human detection (IPHD) **</span></p>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Participants will be asked to automatically produce human detections on depth and/or radiometric thermal images ïn the form of 2D bounding box locations. These two visual modalities preserve subjects’ identity to a large extent while still providing rich human-related features: shape and absolute temperature. We provide a dataset consisting of more than 100K multimodal frames, which are a mix of close-range in-the-wild pedestrian scenes and indoor ones with people performing basic actions in scripted scenarios. The competition is divided into 3 tracks, so the participants can choose to participate in one or more and will be asked to exploit different input data for the task: (1) only depth, (2) only thermal, and (3) both depth and thermal (spatially registered to depth).</span></p>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Competition website:  <a class="moz-txt-link-freetext" href="http://chalearnlap.cvc.uab.es/challenge/34/description/" moz-do-not-send="true">http://chalearnlap.cvc.uab.es/challenge/34/description/</a></span></p>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Competition start-ending: November 26th, 2019 — February 4th, 2020</span></p>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Organizers:</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- Albert Clapés, Computer Vision Center (Universitat Autònoma de Barcelona), <a class="moz-txt-link-abbreviated" href="mailto:aclapes@cvc.uab.es" moz-do-not-send="true">aclapes@cvc.uab.es</a></span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- Carla Morral, Universitat de Barcelona, <a class="moz-txt-link-abbreviated" href="mailto:carla.morral@gmail.com" moz-do-not-send="true">carla.morral@gmail.com</a></span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- Julio C. S. Jacques Junior, Universitat Oberta de Catalunya & Computer Vision Center (Universitat Autònoma de Barcelona), <a class="moz-txt-link-abbreviated" href="mailto:juliojj@gmail.com" moz-do-not-send="true">juliojj@gmail.com</a></span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- Sergio Escalera,  Universitat de Barcelona & Computer Vision Center (Universitat Autònoma de Barcelona), <a class="moz-txt-link-abbreviated" href="mailto:sergio@maia.ub.es" moz-do-not-send="true">sergio@maia.ub.es</a></span></p>
      <br>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">** EmoPain Face and Movement Behaviour Challenge **</span></p>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">The EmoPain Face and Movement Behaviour Challenge is based on the EmoPain dataset captured from chronic pain patients and healthy participants performing movement exercises. The challenge consists of three tasks to choose from: ‘Pain Intensity Estimation from Facial Expressions’, ‘Pain Level Recognition from Multimodal Movement Data’, ‘Multimodal Movement Behaviour Classification’. The challenge is an opportunity to contribute to solving the challenging problem of automatic detection of pain behaviours and pain levels during movement performance. This is fundamental to the development of technology that improves the quality and quantity of engagement in valued everyday activities for people with chronic pain by providing tailored support to the specific barriers that arise.</span></p>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Competition website: <a class="moz-txt-link-freetext" href="https://mvrjustid.github.io/EmoPainChallenge2020/" moz-do-not-send="true">https://mvrjustid.github.io/EmoPainChallenge2020/</a></span></p>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Competition start-ending dates:  October 2019 – Jan 17, 2020</span></p>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">General Chairs: </span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- Prof Nadia Berthouze, UCL</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- Dr. Amanda Williams, UCL,</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- Dr Michel Valstar, University of Nottingham,</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- Dr Hongying Meng, Brunel University London,</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- Dr Min Aung, University of East Anglia,</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- Dr Nicholas Lane, University of Oxford.</span></p>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Data Chairs: </span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- Dr Joy Egede, University of Nottingham,</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- Dr Olugbade Temitayo, UCL,</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- Wang Chongyang , UCL,</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- Siyang Song, University of Nottingham.</span></p>
      <br>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">** 4th Recognizing Families In the Wild (RFIW) **</span></p>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">We are pleased to kick-off the 4th RFIW data challenge workshop in conjunction with 2020 IEEE AMFG conference.</span></p>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">New this year:</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Support 3 tasks (2 new)– upon successful Kaggle competition, where many obtained impressive verification results, we see it as time for newer, more practical challenges (i.e. large-scale tri-subject verification and search & retrieval for relatives of missing children).</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Call for novel work in automatic kinship recognition (i.e. general paper track).</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Brave New Ideas: new ways of viewing problem and its formulation. Special attention for inter-disciplinary work and innovative use-cases.</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Major release of FIW (v1.1.0)</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Submissions will be peer-reviewed for the proceedings of 4th RFIW workshop at 2020 AMFG– papers will be accepted as orals and posters </span></p>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Competition website: <a class="moz-txt-link-freetext" href="https://web.northeastern.edu/smilelab/rfiw2020/" moz-do-not-send="true">https://web.northeastern.edu/smilelab/rfiw2020/</a></span></p>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Competition start-ending dates: October 18, 2019</span></p>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Organizer:</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Joseph Robinson <a class="moz-txt-link-abbreviated" href="mailto:robinson.jo@husky.neu.edu" moz-do-not-send="true">robinson.jo@husky.neu.edu</a>. Department of Electrical and Computer Engineering, Northeastern University, Boston, MA, USA</span></p>
      <br>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">** Joint Challenge on Compound Emotion Recognition and Multimodal (Audio, Facial and Gesture) based Emotion Recognition (CER&MMER) **</span></p>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Emotion recognition has a key role in affective computing. People express emotions through different modalities. Expanding the  focus to several expression forms can facilitate research on emotion recognition as well as human-machine interaction. This competition focuses on two important problems which are: (1) recognition of compound emotions, that require, in addition to performing an effective visual analysis, to deal with recognition of micro emotions. The database includes 31250 facial faces with different emotions of 115 subjects; (2) recognition of multimodal emotions composed of three modalities, namely, facial expressions, body movement and gestures, and speech. </span></p>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Competitions dates:</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- 5th Nov, 2019: Beginning of the quantitative competition, release of development and data.</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- 10th Feb, 2020: Deadline for code submission.</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- 22nd Feb, 2020: Release of final evaluation data decryption key. Participants start predicting the results on the final evaluation data.</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- 23rd Feb, 2020: Contest paper submission deadline.</span></p>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Check for more deadlines at </span><a
          href="https://fg2020.org/competitions/competitions-cer-mmer/"
          style="text-decoration:none;" moz-do-not-send="true"><span style="font-size:11pt;font-family:Arial;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">https://fg2020.org/competitions/competitions-cer-mmer/</span></a></p>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Organizers</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- Gholamreza Anbarjafari </span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- Sergio Escalera </span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- Kamal Nasrollahi </span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- Thomas B. Moeslund </span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- Dorota Kiaminska </span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- Cagri Ozcinar </span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- Tomasz Spanski </span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- Juri Allik</span></p>
      <br>
      <br>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">** Affective Behavior Analysis in-the-wild **</span></p>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Representing human emotions has been a basic topic of research. The most usual emotion representation (ER) is the categorical one (7 basic categories). Discrete ER can also be described in terms of the FACS model, in which all possible facial actions are described in terms of Action Units (AUs). Finally, the dimensional model of affect -with valence and arousal (VA) being the most usual representations- has been proposed to distinguish between subtly different displays of affect and encode small changes in the intensity of each emotion on a continuous scale. This Competition is split into 3 Challenges: i) VA estimation, ii) 7 basic emotion classification, iii) AU detection. These Challenges use the Aff-Wild2, the first comprehensive benchmark for all 3 affect recognition tasks in-the-wild</span></p>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Competition website:</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;"><a class="moz-txt-link-freetext" href="https://ibug.doc.ic.ac.uk/resources/fg-2020-competition-affective-behavior-analysis/" moz-do-not-send="true">https://ibug.doc.ic.ac.uk/resources/fg-2020-competition-affective-behavior-analysis/</a> </span></p>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Competition start-ending dates:</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- 01 Nov 2019: Call for participation announced; start of Competition</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- 14 Feb 2020: Results and Paper submission deadline</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- 21 Feb 2020: End of review; Decisions sent to authors; Winner announcement</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- 28 Feb 2020: Camera ready versions deadline</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- 5 Mar 2020: End of Competition</span></p>
      <br>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">Organisers</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- Stefanos Zafeiriou, Imperial College London, UK (<a class="moz-txt-link-abbreviated" href="mailto:s.zafeiriou@imperial.ac.uk" moz-do-not-send="true">s.zafeiriou@imperial.ac.uk</a>)</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- Dimitrios Kollias, Imperial College London, UK (<a class="moz-txt-link-abbreviated" href="mailto:dimitrios.kollias15@imperial.ac.uk" moz-do-not-send="true">dimitrios.kollias15@imperial.ac.uk</a>)</span></p>
      <p dir="ltr"
        style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;">- Attila Schulc,  Realeyes (<a class="moz-txt-link-abbreviated" href="mailto:attila.schulc@realeyesit.com" moz-do-not-send="true">attila.schulc@realeyesit.com</a>)</span></p>
      <br>
      <br>
    </b>
  </body>
</html>