[Apologies for multiple postings]

************************************
3rd International IEEE Workshop on Hyper-Realistic Multimedia for Enhanced Quality of Experience
https://www.realvision-itn.eu/Events/ICME-2022-Workshop
in conjunction with ICME 2022, Taipei, Taiwan, July 18-22, 2022, http://2022.ieeeicme.org/
************************************

The aim of hyper-realistic media is to faithfully represent the physical world, with the ultimate goal of creating an experience that is perceptually indistinguishable from the real scene. Traditional technologies capture only a fraction of the audio-visual information, limiting the realism of the experience. Recent innovations in computing and audio-visual technology have made it possible to circumvent these bottlenecks. As a result, new multimedia signal processing areas have emerged, such as light fields, point clouds, ultra-high definition, high frame rate, high dynamic range imaging, and novel 3D audio and sound field technologies. Novel combinations of these technologies can enable a hyper-realistic media experience, which will undoubtedly be the next frontier for multimedia systems. However, several technological barriers and challenges must still be overcome to develop perceptually optimal solutions.

This third ICME workshop on Hyper-Realistic Multimedia for Enhanced Quality of Experience aims to bring forward recent advances in capture, processing, and rendering technologies. Its goal is to gather researchers with diverse and interdisciplinary backgrounds, covering the full multimedia signal chain, in order to develop truly perceptually enhanced multimedia systems. We seek unpublished, high-quality papers within, but not limited to, the following topics:

- Light field, point cloud, and volumetric imaging
- High dynamic range imaging, wide color gamut, ultra-high definition
- Hyper-realistic display technologies
- Human perception modeling, perceptually inspired processing
- Processing and coding of hyper-realistic multimedia content
- 3D audio / spatial audio
- Subjective and objective quality assessment
- Quality of experience
- Hyper-realism and immersiveness
- Human vision, clinical and experimental psychology, and psychophysics

Important Dates
Paper submission due: March 12, 2022
Decision notification: April 4, 2022
Camera-ready submission: April 11, 2022

Paper Submission
Full-length papers of 6 pages reporting on original research are solicited.
Reviewing is double-blind, and the paper submission link is: https://cmt3.research.microsoft.com/ICMEW2022
For detailed instructions, see http://2022.ieeeicme.org/

Contacts
Giuseppe Valenzise (giuseppe.valenzise@centralesupelec.fr)
Federica Battisti (federica.battisti@dei.unipd.it)
Homer Chen (homer@ntu.edu.tw)
Søren Forchhammer (sofo@fotonik.dtu.dk)
Mylène Farias (mylene@ene.unb.br)

--
______________________________________
Giuseppe Valenzise
CNRS Research Scientist (Chargé de Recherche)
Laboratoire des Signaux et Systèmes (L2S, UMR 8506)
CNRS - CentraleSupelec - Université Paris-Saclay
3, rue Joliot Curie
91192 Gif-sur-Yvette Cedex, FRANCE
email: giuseppe.valenzise@l2s.centralesupelec.fr
tel: +33 1 69 85 17 72
web: https://l2s.centralesupelec.fr/u/valenzise-giuseppe/