<html xmlns:v="urn:schemas-microsoft-com:vml" xmlns:o="urn:schemas-microsoft-com:office:office" xmlns:w="urn:schemas-microsoft-com:office:word" xmlns:m="http://schemas.microsoft.com/office/2004/12/omml" xmlns="http://www.w3.org/TR/REC-html40">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=us-ascii">
<meta name="Generator" content="Microsoft Word 15 (filtered medium)">
<style><!--
/* Font Definitions */
@font-face
{font-family:"Cambria Math";
panose-1:2 4 5 3 5 4 6 3 2 4;}
@font-face
{font-family:Calibri;
panose-1:2 15 5 2 2 2 4 3 2 4;}
/* Style Definitions */
p.MsoNormal, li.MsoNormal, div.MsoNormal
{margin:0cm;
margin-bottom:.0001pt;
font-size:11.0pt;
font-family:"Calibri",sans-serif;
mso-fareast-language:EN-US;}
a:link, span.MsoHyperlink
{mso-style-priority:99;
color:#0563C1;
text-decoration:underline;}
a:visited, span.MsoHyperlinkFollowed
{mso-style-priority:99;
color:#954F72;
text-decoration:underline;}
span.EmailStyle17
{mso-style-type:personal-compose;
font-family:"Calibri",sans-serif;
color:windowtext;}
.MsoChpDefault
{mso-style-type:export-only;
font-family:"Calibri",sans-serif;
mso-fareast-language:EN-US;}
@page WordSection1
{size:612.0pt 792.0pt;
margin:72.0pt 72.0pt 72.0pt 72.0pt;}
div.WordSection1
{page:WordSection1;}
--></style><!--[if gte mso 9]><xml>
<o:shapedefaults v:ext="edit" spidmax="1026" />
</xml><![endif]--><!--[if gte mso 9]><xml>
<o:shapelayout v:ext="edit">
<o:idmap v:ext="edit" data="1" />
</o:shapelayout></xml><![endif]-->
</head>
<body lang="EN-GB" link="#0563C1" vlink="#954F72">
<div class="WordSection1">
<p class="MsoNormal">[Apologies if you receive multiple copies of this CFP]<o:p></o:p></p>
<p class="MsoNormal">Call for Papers: <o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p></p>
<p class="MsoNormal"><b><span style="font-size:12.0pt">Special Session on Deep and Generative Adversarial Learning<o:p></o:p></span></b></p>
<p class="MsoNormal">International Joint Conference on Neural Networks (IJCNN 2019)<o:p></o:p></p>
<p class="MsoNormal">July 14-19 2019, Budapest, Hungary <a href="https://www.ijcnn.org/">
https://www.ijcnn.org/</a> <o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p></p>
<p class="MsoNormal"><b>Submission Deadline:</b> <span style="color:red">15 December 2018</span><o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p></p>
<p class="MsoNormal"><b>Aims and Scope: <o:p></o:p></b></p>
<p class="MsoNormal">Deep Generative Adversarial Networks (GANs) are one of the most recent breakthroughs in deep learning (DL) and neural networks. One of the main advantages of GANs over other deep learning systems is their ability to learn from unlabelled
data, as well as their ability to generate new data from random distributions. However, generating realistic data using GANs remains a challenge, particularly when specific features are required; e.g., constraining the latent aggregate distribution space does
not guarantee that the generator will produce an image with a specific attribute. New advancements in deep representation learning (RL) can help improve the learning process in GANs. For instance, RL can help address issues such as dataset bias and network
co-adaptation, and help identify a set of features that are ideal for a given task.
<o:p></o:p></p>
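
For readers less familiar with the adversarial setup described above, the following minimal PyTorch sketch (an editorial illustration added to this posting, not part of the official call; the network sizes, optimizer settings and data dimensions are placeholder assumptions) shows one training step: the discriminator learns to separate unlabelled real data from generated data, while the generator learns to map random noise to samples the discriminator accepts as real.

    # Minimal GAN training step (illustrative sketch only; architectures,
    # optimizer settings and data shapes are placeholder assumptions).
    import torch
    import torch.nn as nn

    latent_dim, data_dim = 64, 784   # e.g. flattened 28x28 images

    generator = nn.Sequential(
        nn.Linear(latent_dim, 256), nn.ReLU(),
        nn.Linear(256, data_dim), nn.Tanh())

    discriminator = nn.Sequential(
        nn.Linear(data_dim, 256), nn.LeakyReLU(0.2),
        nn.Linear(256, 1))           # raw logit: real vs. generated

    opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
    opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
    bce = nn.BCEWithLogitsLoss()

    def training_step(real_batch):
        n = real_batch.size(0)
        ones, zeros = torch.ones(n, 1), torch.zeros(n, 1)

        # 1) Discriminator: separate unlabelled real data from generated data.
        z = torch.randn(n, latent_dim)            # sample from a random distribution
        fake = generator(z).detach()              # block gradients into the generator
        loss_d = bce(discriminator(real_batch), ones) + bce(discriminator(fake), zeros)
        opt_d.zero_grad(); loss_d.backward(); opt_d.step()

        # 2) Generator: produce samples the discriminator scores as real.
        z = torch.randn(n, latent_dim)
        loss_g = bce(discriminator(generator(z)), ones)   # non-saturating loss
        opt_g.zero_grad(); loss_g.backward(); opt_g.step()
        return loss_d.item(), loss_g.item()

The generator update here uses the non-saturating loss rather than the original minimax form, a common practical choice for keeping gradients informative early in training.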
<p class="MsoNormal">Practical applications of GANs include: realistic data synthesis, generation of speech or images from text, image denoising and completion, artificial environment generation for reinforcement learning problems, conversion of satellite images
into maps, class imbalance learning, or other unsupervised and supervised learning tasks. Nonetheless, GANs have yet to overcome several challenges. They often fail to converge and are very sensitive to parameter and hyperparameter initialization. Simultaneous
learning of a generator and a discriminator network also makes the learning process more difficult and often results in overfitting or vanishing gradients in the generator network. Moreover, the generator model is prone to mode collapse which results in failure
to generate data with several variations. New theoretical methods in deep learning and GANs are therefore required to improve the learning process and generalization performance of GANs. Topics of interest for this special session include, but are not limited
to:<o:p></o:p></p>
<p class="MsoNormal">• Generative adversarial learning methods and theory;<o:p></o:p></p>
<p class="MsoNormal">• Representation learning methods and theory; <o:p>
</o:p></p>
<p class="MsoNormal">• Adversarial representation learning for domain adaptation;
<o:p></o:p></p>
<p class="MsoNormal">• Interpretable representation adversarial learning;
<o:p></o:p></p>
<p class="MsoNormal">• Adversarial feature learning; <o:p></o:p></p>
<p class="MsoNormal">• RL and GANs for data augmentation and class imbalance;
<o:p></o:p></p>
<p class="MsoNormal">• New GAN models and learning criteria; <o:p></o:p></p>
<p class="MsoNormal">• RL and GANs in classification; <o:p></o:p></p>
<p class="MsoNormal">• Image completion and super-resolution; <o:p></o:p></p>
<p class="MsoNormal">• RL and GANs in Deep Reinforcement Learning;<o:p></o:p></p>
<p class="MsoNormal">• Deep learning and GANs for image and video synthesis;<o:p></o:p></p>
<p class="MsoNormal">• Deep Learning and GANs for speech and audio synthesis;<o:p></o:p></p>
<p class="MsoNormal">• RL and GANs and for In-painting and Sketch to image;
<o:p></o:p></p>
<p class="MsoNormal">• Representation and Adversarial Learning in Machine Translation;<o:p></o:p></p>
<p class="MsoNormal">• RL and GANs in other application domains. <o:p>
</o:p></p>
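
As a concrete illustration of the vanishing-gradient issue mentioned above (again an editorial sketch with made-up numbers, not part of the official call), the original minimax generator loss log(1 - D(G(z))) provides almost no gradient precisely when the discriminator confidently rejects generated samples, whereas the commonly used non-saturating loss -log D(G(z)) still does:

    import torch

    # Hypothetical discriminator logits on three generated samples
    # (strongly rejected, rejected, accepted); made-up values for illustration.
    d_logits_on_fake = torch.tensor([-8.0, -4.0, 2.0], requires_grad=True)
    p_real = torch.sigmoid(d_logits_on_fake)  # D's estimate that each sample is real

    # Original minimax generator loss: log(1 - D(G(z))).
    torch.log(1.0 - p_real).sum().backward(retain_graph=True)
    print(d_logits_on_fake.grad)   # ~[-0.0003, -0.018, -0.88]: near zero where D rejects fakes

    # Non-saturating alternative: -log D(G(z)).
    d_logits_on_fake.grad = None
    (-torch.log(p_real)).sum().backward()
    print(d_logits_on_fake.grad)   # ~[-1.00, -0.98, -0.12]: useful gradient in the same regime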
<p class="MsoNormal"><o:p> </o:p></p>
<p class="MsoNormal"><b>Submission:</b> For paper guidelines please visit https://www.ijcnn.org/paper-submission-guidelines and for submissions please select
<b>Special Session S06. Deep and Generative Adversarial Learning</b> as the main research topic at
<a href="https://ieee-cis.org/conferences/ijcnn2019/upload.php">https://ieee-cis.org/conferences/ijcnn2019/upload.php</a>
<o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p></p>
<p class="MsoNormal"><b>Organizers:<o:p></o:p></b></p>
<p class="MsoNormal">Ariel Ruiz-Garcia, Coventry University, UK (<a href="mailto:ariel.ruiz-garcia@coventry.ac.uk">ariel.ruiz-garcia@coventry.ac.uk</a>)<o:p></o:p></p>
<p class="MsoNormal">Vasile Palade, Coventry University, UK (<a href="mailto:vasile.palade@coventry.ac.uk">vasile.palade@coventry.ac.uk</a>)<o:p></o:p></p>
<p class="MsoNormal">Clive Cheong Took, Royal Holloway(University of London), UK (<a href="mailto:clive.cheongtook@rhul.ac.uk">clive.cheongtook@rhul.ac.uk</a>)<o:p></o:p></p>