<html xmlns="http://www.w3.org/TR/REC-html40">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=us-ascii">
<meta name="Generator" content="Microsoft Word 15 (filtered medium)">
<style><!--
/* Style Definitions */
p.MsoNormal, li.MsoNormal, div.MsoNormal
{margin:0in;
font-size:11.0pt;
font-family:"Calibri",sans-serif;}
a:link, span.MsoHyperlink
{color:#0563C1;
text-decoration:underline;}
@page WordSection1
{size:8.5in 11.0in;
margin:1.0in 1.0in 1.0in 1.0in;}
div.WordSection1
{page:WordSection1;}
ol, ul
{margin-bottom:0in;}
--></style>
</head>
<body lang="EN-US" link="#0563C1" vlink="#954F72" style="word-wrap:break-word">
<div class="WordSection1">
<p style="mso-margin-top-alt:0in;margin-right:0in;margin-bottom:22.5pt;margin-left:0in;background:white">
<span style="font-size:12.0pt;font-family:"Arial",sans-serif;color:#2F3138">To investigate ethical, applied affect recognition, this workshop will leverage multimodal data that includes, but is not limited to, 2D, 3D, thermal, brain, physiological, and mobile
sensor signals. This workshop aims to expose current use cases for affective computing and emerging applications of affective computing to spark future work. Along with this, this workshop has a specific focus on the ethical considerations of such work, including
how to mitigate ethical concerns. Considering this, topics of the workshop will focus on questions including, but not limited to:<o:p></o:p></span></p>
<ul type="disc">
<li class="MsoNormal" style="color:#2F3138;mso-margin-top-alt:auto;mso-margin-bottom-alt:auto;mso-list:l1 level1 lfo1;background:white;box-sizing: border-box">
<span style="font-size:12.0pt;font-family:"Arial",sans-serif">What inter-correlations exist between facial affect (e.g. expression) and other modalities (e.g. EEG)?<o:p></o:p></span></li><li class="MsoNormal" style="color:#2F3138;mso-margin-top-alt:auto;mso-margin-bottom-alt:auto;mso-list:l1 level1 lfo1;background:white;box-sizing: border-box">
<span style="font-size:12.0pt;font-family:"Arial",sans-serif">How can multimodal data be leveraged to create real-world applications of affect recognition such as prediction of stress, real-time ubiquitous emotion recognition, and impact of mood on ubiquitous
subject identification?<o:p></o:p></span></li><li class="MsoNormal" style="color:#2F3138;mso-margin-top-alt:auto;mso-margin-bottom-alt:auto;mso-list:l1 level1 lfo1;background:white;box-sizing: border-box">
<span style="font-size:12.0pt;font-family:"Arial",sans-serif">How can we facilitate the collection of multimodal data for applied affect recognition?<o:p></o:p></span></li><li class="MsoNormal" style="color:#2F3138;mso-margin-top-alt:auto;mso-margin-bottom-alt:auto;mso-list:l1 level1 lfo1;background:white;box-sizing: border-box">
<span style="font-size:12.0pt;font-family:"Arial",sans-serif">What are the ethical implications of working on such questions?<o:p></o:p></span></li><li class="MsoNormal" style="color:#2F3138;mso-margin-top-alt:auto;mso-margin-bottom-alt:auto;mso-list:l1 level1 lfo1;background:white;box-sizing: border-box">
<span style="font-size:12.0pt;font-family:"Arial",sans-serif">How can we mitigate the ethical concerns that such work produces?<o:p></o:p></span></li><li class="MsoNormal" style="color:#2F3138;mso-margin-top-alt:auto;mso-margin-bottom-alt:auto;mso-list:l1 level1 lfo1;background:white;box-sizing: border-box">
<span style="font-size:12.0pt;font-family:"Arial",sans-serif">Can we positively address public fears and misconceptions regarding applied affective computing?<o:p></o:p></span></li></ul>
<p class="MsoNormal"><span style="font-size:12.0pt;font-family:"Arial",sans-serif;color:#2F3138;background:white">To address these questions, AMAR2021 targets researchers in BCI, affective computing, biometrics, computer vision, human-computer interaction,
behavioral sciences, social sciences, and policy makers who are interested in leveraging multimodal data for ethical, applied affect recognition.</span><o:p></o:p></p>
<p class="MsoNormal"><b><span style="font-size:12.0pt;font-family:"Arial",sans-serif;color:#2F3138;background:white">Topics of interest</span></b><span style="font-size:12.0pt;font-family:"Arial",sans-serif;color:#2F3138;background:white"> include, but are
not limited to, <b>ethical</b> applications of the following:</span><o:p></o:p></p>
<ul type="disc">
<li class="MsoNormal" style="color:#2F3138;mso-margin-top-alt:auto;mso-margin-bottom-alt:auto;mso-list:l0 level1 lfo2;background:white;box-sizing: border-box">
<span style="font-size:12.0pt;font-family:"Arial",sans-serif">Health applications with a focus on multimodal affect<o:p></o:p></span></li><li class="MsoNormal" style="color:#2F3138;mso-margin-top-alt:auto;mso-margin-bottom-alt:auto;mso-list:l0 level1 lfo2;background:white;box-sizing: border-box">
<span style="font-size:12.0pt;font-family:"Arial",sans-serif">Multimodal affective computing for cybersecurity applications (e.g., biometrics and IoT security)<o:p></o:p></span></li><li class="MsoNormal" style="color:#2F3138;mso-margin-top-alt:auto;mso-margin-bottom-alt:auto;mso-list:l0 level1 lfo2;background:white;box-sizing: border-box">
<span style="font-size:12.0pt;font-family:"Arial",sans-serif">Inter-correlations and fusion of ubiquitous multimodal data as it relates to applied emotion recognition (e.g. face and EEG data)<o:p></o:p></span></li><li class="MsoNormal" style="color:#2F3138;mso-margin-top-alt:auto;mso-margin-bottom-alt:auto;mso-list:l0 level1 lfo2;background:white;box-sizing: border-box">
<span style="font-size:12.0pt;font-family:"Arial",sans-serif">Leveraging ubiquitous devices to create reliable multimodal applications for emotion recognition<o:p></o:p></span></li><li class="MsoNormal" style="color:#2F3138;mso-margin-top-alt:auto;mso-margin-bottom-alt:auto;mso-list:l0 level1 lfo2;background:white;box-sizing: border-box">
<span style="font-size:12.0pt;font-family:"Arial",sans-serif">Applications of in-the-wild data vs. lab controlled<o:p></o:p></span></li><li class="MsoNormal" style="color:#2F3138;mso-margin-top-alt:auto;mso-margin-bottom-alt:auto;mso-list:l0 level1 lfo2;background:white;box-sizing: border-box">
<span style="font-size:12.0pt;font-family:"Arial",sans-serif">Facilitation and collection of multimodal data (e.g. ubiquitous data) for applied emotion recognition<o:p></o:p></span></li><li class="MsoNormal" style="color:#2F3138;mso-margin-top-alt:auto;mso-margin-bottom-alt:auto;mso-list:l0 level1 lfo2;background:white;box-sizing: border-box">
<span style="font-size:12.0pt;font-family:"Arial",sans-serif">Engineering applications of multimodal affect (e.g., robotics, social engineering, domain inspired hardware / sensing technologies, etc.)<o:p></o:p></span></li><li class="MsoNormal" style="color:#2F3138;mso-margin-top-alt:auto;mso-margin-bottom-alt:auto;mso-list:l0 level1 lfo2;background:white;box-sizing: border-box">
<span style="font-size:12.0pt;font-family:"Arial",sans-serif">Privacy and security<o:p></o:p></span></li><li class="MsoNormal" style="color:#2F3138;mso-margin-top-alt:auto;mso-margin-bottom-alt:auto;mso-list:l0 level1 lfo2;background:white;box-sizing: border-box">
<span style="font-size:12.0pt;font-family:"Arial",sans-serif">Institutionalized bias<o:p></o:p></span></li><li class="MsoNormal" style="color:#2F3138;mso-margin-top-alt:auto;mso-margin-bottom-alt:auto;mso-list:l0 level1 lfo2;background:white;box-sizing: border-box">
<span style="font-size:12.0pt;font-family:"Arial",sans-serif">Trustworthy applications of affective computing<o:p></o:p></span></li><li class="MsoNormal" style="color:#2F3138;mso-margin-top-alt:auto;mso-margin-bottom-alt:auto;mso-list:l0 level1 lfo2;background:white;box-sizing: border-box">
<span style="font-size:12.0pt;font-family:"Arial",sans-serif">Equal access to ethical applications of affective computing (e.g. medical applications inaccessible due to wealth inequality)<o:p></o:p></span></li></ul>
<p class="MsoNormal"><b><span style="font-size:12.0pt;font-family:"Arial",sans-serif;color:#2F3138;background:white">NOTE:</span></b><span style="font-size:12.0pt;font-family:"Arial",sans-serif;color:#2F3138;background:white"> Topics that do <i>not</i> demonstrate
an existing or potential application of affective computing / emotion recognition are not topics of interest for this workshop.</span><span style="font-size:12.0pt;font-family:"Arial",sans-serif;color:#2F3138"><br>
<br>
<span style="background:white">Workshop candidates are invited to submit papers up to 4 pages plus one for references in the ACII format. Submissions to AMAR 2021 should have no substantial overlap with any other paper submitted to ACII2021 or already published.
All persons who have made any substantial contribution to the work should be listed as authors, and all listed authors should have made some substantial contribution to the work. Papers presented at AMAR 2021 will appear in the IEEE Xplore digital library.
Papers should follow the </span></span><a href="https://www.acii-conf.net/2021/submission/" target="_blank"><span style="font-size:12.0pt;font-family:"Arial",sans-serif;color:#F82249;background:white">ACII conference format (anonymous)</span></a><span style="font-size:12.0pt;font-family:"Arial",sans-serif;color:#2F3138;background:white">.
Submission portal is up through EasyChair at </span><a href="https://easychair.org/my/conference?conf=acii2021">ACII 2021 (9th International Conference on Affective Computing & Intelligent Interaction (ACII 2021)) (easychair.org)</a>.<span style="font-size:12.0pt;font-family:"Arial",sans-serif;color:#2F3138;background:white"><o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:12.0pt;font-family:"Arial",sans-serif;color:#2F3138;background:white"><o:p> </o:p></span></p>
<p class="MsoNormal" style="margin-bottom:12.0pt"><b><span style="font-size:12.0pt;font-family:"Arial",sans-serif;color:#2F3138;background:white">Important dates:</span></b><span style="font-size:12.0pt;font-family:"Arial",sans-serif;color:#2F3138"><br>
<b><span style="background:white">Paper submission:</span></b><span style="background:white"> <s>May 28, 2021</s></span>
</span><span style="font-size:12.0pt;font-family:"Arial",sans-serif;color:red">June 15, 2021</span><span style="font-size:12.0pt;font-family:"Arial",sans-serif;color:#2F3138"><br>
<b><span style="background:white">Decision to Authors:</span></b><span style="background:white"> <s>June 25, 2020</s></span>
</span><span style="font-size:12.0pt;font-family:"Arial",sans-serif;color:red">June 30, 2021</span><span style="font-size:12.0pt;font-family:"Arial",sans-serif;color:#2F3138"><br>
<b><span style="background:white">Camera-ready papers due:</span></b><span style="background:white"> TBD<o:p></o:p></span></span></p>
<p class="MsoNormal"><b><span style="font-size:12.0pt;font-family:"Arial",sans-serif;color:#2F3138;background:white">Organizers:<o:p></o:p></span></b></p>
<p class="MsoNormal"><span style="font-size:12.0pt;font-family:"Arial",sans-serif;color:#2F3138;background:white">Shaun Canavan, University of South Florida<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:12.0pt;font-family:"Arial",sans-serif;color:#2F3138;background:white">Tempestt Neal, University of South Florida<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:12.0pt;font-family:"Arial",sans-serif;color:#2F3138;background:white">Marvin Andujar, University of South Florida<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:12.0pt;font-family:"Arial",sans-serif;color:#2F3138;background:white">Lijun Yin, Binghamton University<o:p></o:p></span></p>
<p class="MsoNormal"><o:p> </o:p></p>
</div>
</body>
</html>