<html xmlns:v="urn:schemas-microsoft-com:vml" xmlns:o="urn:schemas-microsoft-com:office:office" xmlns:w="urn:schemas-microsoft-com:office:word" xmlns:m="http://schemas.microsoft.com/office/2004/12/omml" xmlns="http://www.w3.org/TR/REC-html40">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
<meta name="Generator" content="Microsoft Word 15 (filtered medium)">
<style><!--
/* Font Definitions */
@font-face
{font-family:"Cambria Math";
panose-1:2 4 5 3 5 4 6 3 2 4;}
@font-face
{font-family:Calibri;
panose-1:2 15 5 2 2 2 4 3 2 4;}
@font-face
{font-family:"Roboto Mono";
panose-1:2 11 6 4 2 2 2 2 2 4;}
/* Style Definitions */
p.MsoNormal, li.MsoNormal, div.MsoNormal
{margin:0in;
margin-bottom:.0001pt;
font-size:12.0pt;
font-family:"Calibri",sans-serif;}
a:link, span.MsoHyperlink
{mso-style-priority:99;
color:#0563C1;
text-decoration:underline;}
a:visited, span.MsoHyperlinkFollowed
{mso-style-priority:99;
color:#954F72;
text-decoration:underline;}
p.msonormal0, li.msonormal0, div.msonormal0
{mso-style-name:msonormal;
mso-margin-top-alt:auto;
margin-right:0in;
mso-margin-bottom-alt:auto;
margin-left:0in;
font-size:11.0pt;
font-family:"Calibri",sans-serif;}
span.EmailStyle18
{mso-style-type:personal;
font-family:"Calibri",sans-serif;
color:windowtext;}
span.apple-tab-span
{mso-style-name:apple-tab-span;}
.MsoChpDefault
{mso-style-type:export-only;
font-size:10.0pt;}
@page WordSection1
{size:8.5in 11.0in;
margin:1.0in 1.0in 1.0in 1.0in;}
div.WordSection1
{page:WordSection1;}
--></style><!--[if gte mso 9]><xml>
<o:shapedefaults v:ext="edit" spidmax="1026" />
</xml><![endif]--><!--[if gte mso 9]><xml>
<o:shapelayout v:ext="edit">
<o:idmap v:ext="edit" data="1" />
</o:shapelayout></xml><![endif]-->
</head>
<body lang="EN-US" link="#0563C1" vlink="#954F72">
<div class="WordSection1">
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">Dear colleagues,<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">With CVPR on the horizon, we would like to remind you that the April 26th submission deadline for the Scene Understanding and Modeling (SUMO)
Workshop is fast approaching. Please see the CFP below; we hope you will help us spread the word about the workshop. Details can be found on the SUMO website: https://sumochallenge.org/2019-sumo-workshop<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">Thank you for your attention, and we look forward to a great workshop!<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">Ilke Demir, on behalf of the SUMO organizers<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">*********************************************************************<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222"> Call for Papers<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">The 2019 SUMO Workshop: 360° Indoor Scene Understanding and Modeling<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222"> in Association with CVPR 2019
<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">*********************************************************************<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222"><o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">Webpage: https://sumochallenge.org/2019-sumo-workshop.html<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">Submission site: https://cmt3.research.microsoft.com/SUMO2019<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">Date: June 17, 2019<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">Location: Long Beach, CA, USA<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222"><o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">In computer vision, scene understanding and modeling encapsulate a diverse set of research problems, ranging from low-level geometric modeling (e.g., SLAM algorithms)
to 3D room layout estimation. These tasks are often addressed separately, yielding only a constrained understanding and representation of the underlying scene. In parallel, the popularity of 360° cameras has encouraged the digitization of the real world into
augmented and virtual realities, enabling new applications such as virtual social interactions and semantically leveraged augmented reality. This workshop aims to promote comprehensive 3D scene understanding and modeling algorithms that create integrated scene
representations (with geometry, appearance, semantics, and perceptual qualities), while utilizing 360° imagery to encourage research on its unique challenges.
<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222"> <o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222"> <o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">*** Call for Papers:<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222"><o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">The workshop solicits papers covering a range of problems related to 3D and 360° scene understanding and modeling from RGB and RGB-D imagery. The topics focus
mainly on indoor scene modeling and include, but are not limited to:<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">* 360° data processing and scene understanding<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">* Object detection<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">* Object localization<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">* Layout estimation<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">* “Stuff” detection and modeling<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">* Instance segmentation<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">* Object completion and 3D reconstruction<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">* Object pose estimation<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">* Generative models
<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">* Articulated object modeling<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">* Texture and appearance modeling<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">* Material property estimation<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">* Lighting recognition<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222"><o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">Submissions must be written in English and submitted in PDF format. Each submitted paper must be no longer than four pages, excluding references. Please refer to
the CVPR author submission guidelines for instructions on formatting, templates, and policies. The CVPR submission guidelines can be found at http://cvpr2019.thecvf.com/submission/main_conference/author_guidelines, and paper submissions are accepted
through the CMT site at https://cmt3.research.microsoft.com/SUMO2019. The review process will be double-blind, and selected papers will be published in the IEEE CVPRW proceedings and made available on the CVF website.<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222"><o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">*** Important dates:<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222"><o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">April 26, 2019 Paper submission deadline<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">May 10, 2019 Notification to authors<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">May 17, 2019 Camera-ready deadline<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">June 17, 2019 SUMO Workshop<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222"><o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">*** Organizing Committee:<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222"><o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">Daniel Huber, Facebook<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">Lyne Tchapmi, Stanford University<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">Frank Dellaert, Georgia Tech<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">Ilke Demir, DeepScale<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">Shuran Song, Princeton University<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">Rachel Luo, Stanford University<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">*** Contact:<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">Daniel Huber - dhuber@fb.com<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222">Lyne Tchapmi - lynetcha@stanford.edu
<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Roboto Mono";color:#222222"><o:p> </o:p></span></p>
<p class="MsoNormal"><o:p> </o:p></p>
</div>
</body>
</html>