<div dir="ltr">
<p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:0pt" id="gmail-docs-internal-guid-7b985fe9-7fff-da66-4dd3-b693b44b8dfe"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">We cordially invite you to participate in our </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:700;font-style:italic;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">ICCV’2021 Understanding Social Behavior in Dyadic and Small Group Interactions Workshop & Challenge</span></p><br><br><p dir="ltr" style="line-height:1.38;text-align:center;margin-top:0pt;margin-bottom:6pt"><span style="font-size:14pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:700;font-style:italic;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Workshop </span><span style="font-size:14pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">description</span></p><p dir="ltr" style="line-height:1.38;text-align:center;margin-top:0pt;margin-bottom:6pt"><a href="http://chalearnlap.cvc.uab.es/workshop/44/description/" style="text-decoration:none"><span style="font-size:10pt;font-family:Arial;color:rgb(17,85,204);background-color:transparent;font-weight:400;font-style:italic;font-variant:normal;text-decoration:underline;vertical-align:baseline;white-space:pre-wrap">http://chalearnlap.cvc.uab.es/workshop/44/description/</span></a><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:400;font-style:italic;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap"> </span></p><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:6pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Human interaction has been a central topic in psychology and social sciences, aiming at explaining the complex underlying mechanisms of communication with respect to cognitive, affective and behavioral perspectives. From a computational point of view, research in dyadic and small group interactions enables the development of automatic approaches for detection, understanding, modeling and synthesis of individual and interpersonal social signals and dynamics. Many human-centered applications for good (e.g., early diagnosis and intervention, augmented telepresence and personalized agents) depend on devising solutions for such tasks. </span></p><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:6pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Verbal and nonverbal communication channels are used in dyadic and small group interactions to convey our goals and intentions while building a common ground. During interactions, people influence each other based on the cues they perceive. However, the way we perceive, interpret, react, and adapt to them depends on a myriad of factors (e.g., our personal characteristics, either stable or transient; the relationship and shared history between individuals; the characteristics of the situation and task at hand; societal norms; and environmental factors). To analyze individual behaviors during a conversation, the joint modeling of participants is required due to the existing dyadic or group interdependencies. While these aspects are usually contemplated in non-computational dyadic research, context- and interlocutor-aware computational approaches are still scarce, largely due to the lack of datasets providing contextual metadata in different situations and populations. </span></p><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:6pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;vertical-align:baseline;white-space:pre-wrap">Topics and Motivation:</span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap"> In line with these, we would like to bring together researchers in the field and from related disciplines to discuss the advances and new challenges on the topic of dyadic and small group interactions. We want to put a spotlight on the strengths and limitations of the existing approaches, and define the future directions of the field. In this context, we accept papers addressing the issues related to, but not limited to, these topics:</span></p><ul style="margin-top:0px;margin-bottom:0px"><li dir="ltr" style="list-style-type:disc;font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre"><p dir="ltr" style="line-height:1.38;margin-top:12pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Detection, understanding, modeling and synthesis of individual and interpersonal social signals and dynamics;</span></p></li><li dir="ltr" style="list-style-type:disc;font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Verbal / nonverbal communication analysis in dyadic and small groups;</span></p></li><li dir="ltr" style="list-style-type:disc;font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Contextual analysis in dyadic and small groups;</span></p></li><li dir="ltr" style="list-style-type:disc;font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Datasets, annotation protocols and bias discovering/mitigation methods in dyadic and small groups;</span></p></li><li dir="ltr" style="list-style-type:disc;font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:12pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Interpretability / Explainability in dyadic and small groups;</span></p></li></ul><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:12pt;margin-bottom:12pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;vertical-align:baseline;white-space:pre-wrap">Workshop papers will be published in two different venues</span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">, detailed next. </span></p><ol style="margin-top:0px;margin-bottom:0px"><li dir="ltr" style="list-style-type:decimal;font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre"><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:6pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Papers submitted following our “</span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">ICCV Workshop schedule</span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">” will use the ICCV format and will be published in the proceedings of ICCV’2021. </span></p></li></ol><ul style="margin-top:0px;margin-bottom:0px"><li dir="ltr" style="list-style-type:disc;font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;margin-left:36pt"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Paper submission (ICCV):</span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap"> July 25, 2021</span></p></li><li dir="ltr" style="list-style-type:disc;font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;margin-left:36pt"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Author notification (ICCV): </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">September 10th, 2021</span></p></li><li dir="ltr" style="list-style-type:disc;font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;margin-left:36pt"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Camera-ready (ICCV): </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">September 16th, 2021</span></p></li></ul><br><ol style="margin-top:0px;margin-bottom:0px" start="2"><li dir="ltr" style="list-style-type:decimal;font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre"><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:6pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Papers submitted following our “</span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">PMLR Workshop schedule</span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">” will use the PMLR format and will be published in Proceedings of Machine Learning Research (PMLR).</span></p></li></ol><ul style="margin-top:0px;margin-bottom:0px"><li dir="ltr" style="list-style-type:disc;font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;margin-left:36pt"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Paper submission (PMLR):</span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap"> October 31th, 2021</span></p></li><li dir="ltr" style="list-style-type:disc;font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;margin-left:36pt"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Author notification (PMLR): </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">November 30th, 2021</span></p></li><li dir="ltr" style="list-style-type:disc;font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;margin-left:36pt"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Camera-ready (PMLR): </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">December 20th, 2021</span></p></li></ul><br><br><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:underline;vertical-align:baseline;white-space:pre-wrap">INVITED SPEAKERS:</span></p><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Louis-Philippe Morency,</span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap"> Carnegie Mellon University, USA</span></p><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Alexander Todorov, </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Princeton University, USA</span></p><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Hatice Gunes, </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">University of Cambridge, UK</span></p><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Daniel Gatica-Perez, </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">IDIAP, Switzerland</span></p><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Qiang Ji, </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Rensselaer Polytechnic Institute, USA</span></p><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Yaser Sheikh, </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Carnegie Mellon University, USA</span></p><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Norah Dunbar, </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">UC Santa Barbara, USA</span></p><br><br><p dir="ltr" style="line-height:1.38;text-align:center;margin-top:0pt;margin-bottom:6pt"><span style="font-size:14pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:700;font-style:italic;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Challenge </span><span style="font-size:14pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">description</span></p><p dir="ltr" style="line-height:1.38;text-align:center;margin-top:0pt;margin-bottom:6pt"><a href="http://chalearnlap.cvc.uab.es/challenge/45/description/" style="text-decoration:none"><span style="font-size:10pt;font-family:Arial;color:rgb(17,85,204);background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;vertical-align:baseline;white-space:pre-wrap">http://chalearnlap.cvc.uab.es/challenge/45/description/</span></a><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap"> </span></p><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:6pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">To advance and motivate the research on visual human behavior analysis in dyadic and small group interactions, the challenge will use a large scale, multimodal, and multiview (</span><a href="https://openaccess.thecvf.com/content/WACV2021W/HBU/papers/Palmero_Context-Aware_Personality_Inference_in_Dyadic_Scenarios_Introducing_the_UDIVA_Dataset_WACVW_2021_paper.pdf" style="text-decoration:none"><span style="font-size:10pt;font-family:Arial;color:rgb(17,85,204);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;vertical-align:baseline;white-space:pre-wrap">UDIVA</span></a><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">) dataset recently collected by our group, which provides many related challenges. It will address two different problems, divided in two competition tracks:</span></p><ol style="margin-top:0px;margin-bottom:0px"><li dir="ltr" style="list-style-type:decimal;font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre"><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Automatic self-reported personality recognition </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">of single individuals (i.e., a target person) during a dyadic interaction, from two individual views. </span></p></li><li dir="ltr" style="list-style-type:decimal;font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre"><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Behavior forecasting:</span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap"> the focus of this track will be to estimate future (up to N frames) 2D facial landmarks, hand, and upper body pose of a target individual in a dyadic interaction.</span></p></li></ol><br><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">In both tasks, multiview and multimodal information (audio-visual, transcriptions, context and medatada) are expected to be exploited to solve the problem.</span></p><br><p dir="ltr" style="line-height:1.38;margin-left:36pt;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;vertical-align:baseline;white-space:pre-wrap">Important Dates:</span></p><br><ul style="margin-top:0px;margin-bottom:0px"><li dir="ltr" style="list-style-type:disc;font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;margin-left:36pt"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Dataset access request period open:</span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap"> 18th May, 2021</span></p></li><li dir="ltr" style="list-style-type:disc;font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;margin-left:36pt"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Start of the Challenge (development phase): </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">June 1st, 2021</span></p></li><li dir="ltr" style="list-style-type:disc;font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;margin-left:36pt"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Start of test phase: </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">September 1st, 2021</span></p></li><li dir="ltr" style="list-style-type:disc;font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;margin-left:36pt"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">End of the Challenge: </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">September 17th, 2021</span></p></li><li dir="ltr" style="list-style-type:disc;font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;margin-left:36pt"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Release of final results: </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">September 30th, 2021</span></p></li></ul><br><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Top winning solutions will be invited to give a talk to present their work at the associated ICCV 2021 ChaLearn workshop (</span><a href="http://chalearnlap.cvc.uab.es/workshop/44/description/" style="text-decoration:none"><span style="font-size:10pt;font-family:Arial;color:rgb(17,85,204);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;vertical-align:baseline;white-space:pre-wrap">http://chalearnlap.cvc.uab.es/workshop/44/description/</span></a><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">).</span></p><br><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:underline;vertical-align:baseline;white-space:pre-wrap">ORGANIZATION and CONTACT*</span></p><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Sergio Escalera*, </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Computer Vision Center (CVC) and University of Barcelona, Spain <</span><a href="mailto:sergio.escalera.guerrero@gmail.com" style="text-decoration:none"><span style="font-size:10pt;font-family:Arial;color:rgb(17,85,204);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;vertical-align:baseline;white-space:pre-wrap">sergio.escalera.guerrero@gmail.com</span></a><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">> </span></p><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Cristina Palmero*, </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Computer Vision Center (CVC) and University of Barcelona, Spain <</span><a href="mailto:c.palmero.cantarino@gmail.com" style="text-decoration:none"><span style="font-size:10pt;font-family:Arial;color:rgb(17,85,204);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;vertical-align:baseline;white-space:pre-wrap">c.palmero.cantarino@gmail.com</span></a><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">> </span></p><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Wei-Wei Tu, </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">4Paradigm Inc., China</span></p><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Albert Clapés, </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Computer Vision Center (CVC), Spain</span></p><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Julio C. S. Jacques Junior</span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">, Computer Vision Center (CVC/UAB), Spain</span></p><br><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:underline;vertical-align:baseline;white-space:pre-wrap">Sponsors:</span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap"> </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">This event is sponsored by ChaLearn, 4Paradigm Inc., and Facebook Reality Labs. University of Barcelona, Computer Vision Center at Autonomous University of Barcelona, and Human Pose Recovery and Behavior Analysis (HuPBA) group, are the co-sponsors of the Challenge.</span></p><br><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:underline;vertical-align:baseline;white-space:pre-wrap">Prizes:</span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap"> </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Top winning solutions will be invited to give a talk to present their work at the associated ICCV 2021 ChaLearn workshop, will receive a winning certificate and will have </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;vertical-align:baseline;white-space:pre-wrap">free ICCV registration</span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">. Our sponsors are also offering the following prizes:</span></p><ul style="margin-top:0px;margin-bottom:0px"><li dir="ltr" style="list-style-type:disc;font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre"><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Track 1: </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Top-1 solution: 1000$</span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap"> / </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Top-2 solution: 500$</span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap"> / </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Top-3 solution: 300$</span></p></li><li dir="ltr" style="list-style-type:disc;font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre"><p dir="ltr" style="line-height:1.38;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Track 2: </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Top-1 solution: 1000$</span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap"> / </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Top-2 solution: 500$</span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap"> / </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">Top-3 solution: 300$</span></p></li></ul><br><h1 dir="ltr" style="line-height:1.2;text-align:justify;margin-top:20pt;margin-bottom:6pt"><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;vertical-align:baseline;white-space:pre-wrap">Honorable mention:</span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap"> based on the significance of the result in a particular trait/s (track 1) or body part (track 2) and the level of novelty/originality of the solution, in addition to top-3 solutions, we may announce additional </span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:italic;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">honorable mentions</span><span style="font-size:10pt;font-family:Arial;color:rgb(0,0,0);background-color:rgb(255,255,255);font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap">, which will also receive a winning certificate and a free ICCV registration.</span></h1><br>

<br clear="all"><br>-- <br><div dir="ltr" class="gmail_signature" data-smartmail="gmail_signature"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div><div><font size="1"><span style="color:rgb(68,68,68)"><b><span style="color:rgb(102,102,102)">Dr. Sergio Escalera Guerrero</span></b></span><span style="color:rgb(153,153,153)"></span></font><div><span style="color:rgb(153,153,153)"><font size="1"><span><span><span style="color:rgb(153,153,153)"><font size="1">Full Professor</font></span></span></span> at Universitat de Barcelona</font></span></div><font size="1"><span style="color:rgb(153,153,153)"><font size="1"><span style="color:rgb(153,153,153)">ELLIS Fellow / </span></font>Head of Human Pose Recovery and Behavior Analysis group /</span></font><span style="color:rgb(153,153,153)"><font size="1"> ICREA Academia / Project Manager at the Computer Vision Center</font></span><span style="color:rgb(153,153,153)"></span><br><span style="color:rgb(153,153,153)"></span></div></div><font size="1"><span style="color:rgb(153,153,153)">Email: <a href="mailto:sergio.escalera.guerrero@gmail.com" target="_blank">sergio.escalera.guerrero@gmail.com</a> / Webpage: <a href="http://www.maia.ub.es/~sergio/" target="_blank">http://www.sergioescalera.com/</a></span></font><span><span style="color:rgb(153,153,153)"><font size="1">  / Phone:+34</font></span><font size="1"><span style="color:rgb(153,153,153)"><font size="1"><span dir="ltr"><span dir="ltr"><span><span>934020853<br></span></span></span></span></font></span></font></span><div><span></span></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div>